Finding the Right Healthcare Providers: From Clinical Trials to War Zones
In This Episode
In this episode of the Life Sciences DNA Podcast, host Amar Drawid chats with Ariel Katz, CEO of H1, about how the company leverages AI to connect life sciences organizations with the right healthcare professionals. Ariel explains H1’s approach to overcoming data challenges in clinical trials, the importance of understanding physicians' activities, and how their platform supports both clinical research and commercial engagement.
- AI in Healthcare: Ariel discusses how they use AI to analyze complex datasets and connect healthcare providers with pharmaceutical companies, helping to find the right doctors for clinical trials and research collaborations.
- Global Data Integration Challenges: Learn how H1 navigates diverse regulatory and cultural landscapes to aggregate healthcare data from over 100 countries, improving access to physician information worldwide.
- Finding the Right Physicians: Ariel explains how they identify key physicians for clinical trials by analyzing demographics, professional activities, and digital footprints, including social media presence.
- Real-World Use Cases: Discover how H1's platform has been used in humanitarian efforts, such as flying doctors to Israel after the October 7th attacks, and how the platform facilitated emergency healthcare responses in Gaza.
- The Future of Healthcare Data: Ariel shares his vision for the future, discussing how the use of AI and data in healthcare will continue to evolve, particularly in ensuring diversity in clinical trials and improving healthcare access in underserved regions.
Transcript
Daniel Levine (00:00)
The Life Sciences DNA podcast is sponsored by Agilisium Labs, a collaborative space where Agilisium works with its clients to co-develop and incubate POCs, products, and solutions. To learn how Agilisium Labs can use the power of its generative AI for life sciences analytics, visit them at labs.agilisium.com. Amar, we've got Ariel Katz on the show today. Who is Ariel?
Amar Drawid (00:29)
Ariel is the co-founder and CEO of H1, a global healthcare data and technology company. He founded the company in 2017 to address the inefficiencies in healthcare data and connectivity. H1 has become a key resource for life science companies, hospitals, and academic medical centers worldwide. Katz was named to the Forbes 30 Under 30 list in 2022 for his impact on healthcare innovation.
Daniel Levine (00:56)
And what is H1?
Amar Drawid (00:58)
H1 aggregates and analyzes information on millions of healthcare professionals and thousands of institutions worldwide, drawing from public, private, and proprietary data sources such as medical publications, clinical trials, claims data, and organizational affiliations. The platform allows users to access up-to-date profiles, identify key opinion leaders, and accelerate clinical research and drug development. The vision is to democratize access to critical healthcare expertise and data.
Daniel Levine (01:30)
Before we begin, I want to remind our audience that they can stay up on the latest episodes of Life Sciences DNA by hitting the Subscribe button. If you enjoy this content, be sure to hit the Like button, and let us know your thoughts in the comments section. With that, let's welcome Ariel to the show.
Amar Drawid (01:51)
Ariel, thanks for joining us. We're going to talk today about the interactions between drug companies and healthcare providers, the data challenges drug companies face when trying to understand which physicians they should be engaging with, and how your AI-based platform can help them work more effectively with providers. So let's start with the problem that H1 is seeking to address, whether it's the need to find the principal investigators for a clinical trial or key opinion leaders or to know what doctors commercial teams should target. What is the data landscape that the drug companies need to navigate today? And what are the challenges that they face?
Ariel Katz (02:27)
Yeah, it's good to see you, old friend, from the Novartis days. At H1, we set out with the mission to connect the world to the right doctor. So we follow every doctor and tell you everything we know about them. Just for context, today we work with about 200 companies, about 60 of the top 100 life sciences companies, mostly pharma, but we also work with med-device and diagnostics as well. We started in medical affairs, which is helping life science companies, pharma companies, find thought-leading doctors to engage with as they're launching their drugs - find those KOLs. Then we expanded into clinical, finding the right principal investigators to engage with and sites to work with. And then we eventually went into commercial: find the right doctors to target for your therapies. It's a beast, and it's not getting any easier. You would think with AI it would get easier, but it's actually creating somewhat more noise. The process of finding the right PI and the right site for a study - it's not like there are higher enrollment rates today than there were four years ago. With KOLs and top thought-leading experts, there's more noise to sift through, and it's getting harder and harder to commercialize drugs and forecast. So I see the problem not going away with AI. I think it's actually getting more complex in many ways, and that's why H1 exists.
Amar Drawid (03:40)
So what are the challenges? I mean, you'd think in this day and age the information about healthcare providers should be out there, right? People should know about that. So why is that not happening right now?
Ariel Katz (03:53)
Let's take two examples. Let's say you're launching a drug in cardiology for [undecipherable] or something, or launching a drug in multiple myeloma. It's not yet approved. You want to start to educate the market about the clinical benefits of your drug. The first question you're gonna ask is who - which doctors need to know? And to answer that question, you want to know: what do doctors currently believe is the best standard of care for patients? What does their hospital say they need to prescribe as the local protocol? Which doctors look up to which other doctors, and who do they influence? A lot of that information is just not available publicly. It's in people's heads. It's in PDF documents and printouts at the hospital. Incredibly difficult. On the clinical side, you're running a clinical trial in bladder cancer. You want to enroll patients quickly. How do you know which doctors will enroll patients, have eligible patients, like you or don't like you or like your competitor, have successfully conducted studies in the past? Incredibly difficult to figure that out. None of that data is available publicly. And so you have to find different ways to get it to solve that problem. So that's why it's just not that easy.
Amar Drawid (05:01)
So this is not just the master data about an HCP, it's much more information about their activities and how they're perceived. It's much richer information than what you find in typical master data for an HCP.
Ariel Katz (05:16)
Yeah, master data is solved, and it's easier and it's cheaper - it should be cheaper. Your vendors are still charging you a lot of money; we should make it cheaper because AI made it easy. Knowing where a doctor works? Not that hard. Predicting a physician's enrollment rate on an oncology clinical study? That's a beast of a problem.
Amar Drawid (05:35)
Let's work on that, right? So how do you do that? Let's take one of those examples - we're talking about clinical trials, right? The big problem in clinical development is that you have a drug, you want to run the clinical trials, and you need to find the right investigators who have the right patients, who can actually do the trial, so the trial can be done quickly. That's the big challenge. So how is your platform, and the AI - how are you solving this problem?
Ariel Katz (06:14)
Yeah, so first you need the ingredients, and then you can layer the AI on top of it. The ingredients to solve that problem are knowing which physicians have eligible patient populations. Nearly impossible to know that outside the US. So you've already eliminated half the world - or not half the world, more than half the world, but statistically a lot of sites are in the US. So you've eliminated the majority of the world. Okay, great. Maybe in the US you can say which physicians have eligible patients, because there are very complex inclusion and exclusion criteria. So you can somewhat solve that in the US.
Amar Drawid (06:43)
And for people who don't know, could you just elaborate a bit on the inclusion and exclusion criteria in clinical trials?
Ariel Katz (06:49)
I'm running a clinical trial. A clinical trial is for a new drug to cure a condition or help with a condition. If there's already a cure, no patient is saying, sign me up for a clinical trial. So if I'm working on a drug in bladder cancer, these are generally patients that have failed on current drug treatments, failed on chemotherapy, don't have other confounding conditions - because you want to test if the drug actually works on bladder cancer, not on bone cancer or multiple myeloma - or are between certain age groups. All these different criteria you need to hit. So you have to find the doctor that sees this perfect profile of a patient, which is very difficult to do. And the patient has to be willing to sign on to an experimental drug - these are human beings, these are their lives. So first you have to find that. Now, once you find that, by the way, you've got to answer the question: does the place where the physician works have the capabilities and the facilities to do a clinical trial? Do they have research staff? Do they have an industrial refrigerator to store CAR T cells or whatever the drug is? Then you need to ask, is the physician motivated? Do they like you or your competitor? If I work at Merck, do they like Pfizer? If I work at Pfizer, do they like Merck? Then you've got to ask: have they ever run a clinical trial? Have they enrolled patients on previous clinical trials? Are they working on current clinical trials? Those are the ingredients. Then, sure, you put it into OpenAI and ask, is this a good doctor? Good luck - it'll work well, but those ingredients are the hard part of this problem. That's something that H1 solves.
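To make the "ingredients" Ariel lists concrete, here is a minimal sketch of how eligible-patient supply, site capabilities, motivation, and trial track record might be folded into a single investigator-feasibility score. The field names, weights, and thresholds are hypothetical illustrations, not H1's actual model.

```python
from dataclasses import dataclass

@dataclass
class InvestigatorSignals:
    """Hypothetical feasibility signals for one physician/site pair."""
    eligible_patients_per_year: int   # patients matching inclusion/exclusion criteria
    has_research_staff: bool          # site capability: coordinators, pharmacy, etc.
    has_cold_chain_storage: bool      # e.g., freezers needed for cell/gene therapies
    past_trials_run: int              # clinical research track record
    past_enrollment_rate: float       # average patients enrolled per past trial
    sponsor_affinity: float           # 0-1 proxy for willingness to work with the sponsor

def feasibility_score(s: InvestigatorSignals) -> float:
    """Combine the signals into one 0-100 score (illustrative weights only)."""
    if not (s.has_research_staff and s.has_cold_chain_storage):
        return 0.0  # hard requirement: the site must be able to run the protocol
    score = 0.0
    score += min(s.eligible_patients_per_year, 50) / 50 * 40   # patient supply, capped
    score += min(s.past_trials_run, 10) / 10 * 20              # experience
    score += min(s.past_enrollment_rate, 10) / 10 * 25         # proven enrollment
    score += s.sponsor_affinity * 15                           # motivation
    return round(score, 1)

# Example: an experienced site with a healthy eligible patient population
print(feasibility_score(InvestigatorSignals(30, True, True, 6, 4.5, 0.8)))
```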
Amar Drawid (08:18)
So as you're solving this problem, there are two issues. One is data. The other is how you use AI to figure out, based on the data, who is good or not. So can you tell us about the data you're getting and how you're collecting it? That's the first part. We can talk about the AI aspect later.
Ariel Katz (08:38)
The AI is much easier, to be honest. For the AI part, you could go to Anthropic, Google, or OpenAI, put that data into any of those models, and it will give you the right answer. That was not the case three years ago, which is crazy. The analytics and predicting was the hard part, the impossible part. Now it's actually the easiest part. The data is now the incredibly valuable, hard part. For us, there's no easy answer. It's country specific. We get it from partnering with governments, from buying the data, from getting information directly from research sites and physicians, because it's a combination of all these different pieces that doesn't exist in one place. That's the value of bringing it all together, structuring it, and putting it in a way that it can feed an AI model, which will then predict correctly. So if you had all that data and said, looking at all this data, here's the clinical trial protocol I'm trying to run, can you pull out for me the physicians that will recruit patients based on all this information - it'll give you the answer. It's getting all that data, and getting it evenly across all the geographies and countries in the world, that is the challenge.
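A rough sketch of the pattern described here - already-structured physician data plus a trial protocol handed to an off-the-shelf model - might look like the following. The profile fields, prompt wording, and model choice are placeholder assumptions, not H1's pipeline.

```python
import json
from openai import OpenAI  # any chat-completions-style client would work similarly

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical, already-structured physician records (the hard-won "ingredients")
physicians = [
    {"name": "Dr. A", "specialty": "urologic oncology", "eligible_patients": 40,
     "past_trials": 7, "avg_enrollment_rate": 5.1, "site_has_research_staff": True},
    {"name": "Dr. B", "specialty": "medical oncology", "eligible_patients": 8,
     "past_trials": 1, "avg_enrollment_rate": 0.5, "site_has_research_staff": False},
]

protocol_summary = ("Phase II study in metastatic bladder cancer; "
                    "patients must have failed platinum-based chemotherapy.")

prompt = (
    "Given these physician records:\n"
    f"{json.dumps(physicians, indent=2)}\n\n"
    f"And this trial protocol: {protocol_summary}\n"
    "Rank the physicians by likelihood of recruiting patients and explain briefly."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```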
Amar Drawid (09:44)
So what is the data about physicians that you collect? What kind of information?
Ariel Katz (09:52)
Yeah, so take Dr. John. For Dr. John, we would have demographic information. Where do they work? What language do they speak? What's their phone number? What's their email? Easy stuff - that's more the reference master data. What we also have is all of their scholarly activity: what medical congresses they speak at, what clinical trials they work on, what they publish, what societies they're on the board of. We then have all their clinical activity: what patients they see, what drugs they prescribe, what referrals they make, what diagnoses they make, what procedures they're doing. We then have all their clinical research activity, which is which trials specifically they worked on, with which sponsors, at which phase, how much they got paid for it, and did they successfully enroll patients for that study. Then for other products, which are used by commercial teams for other use cases, we'd have what they tweeted two seconds ago, things like that, and what they prescribe in detail. And then for the product that's used by patients today, we have: is this doctor more expensive than a different doctor for a certain procedure? What is the clinical quality score of this doctor? If you go see this doctor, do you have lower hospital readmission rates and infection rates? So you have quality scores, clinical focus areas, and really every patient review. Anything that's possibly available about this physician, we would have that information.
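A simplified sketch of how one physician profile with those layers could be structured. Every field name and value below is an invented example, not H1's actual schema.

```python
# Illustrative shape of a single layered physician profile (all fields hypothetical)
profile = {
    "demographics": {
        "name": "Dr. John Doe",
        "workplace": "Example Medical Center",
        "languages": ["English", "Spanish"],
        "email": "jdoe@example.org",
    },
    "scholarly_activity": {
        "publications": 124,
        "congress_talks": ["ASCO 2023"],
        "society_boards": ["Example Oncology Society"],
    },
    "clinical_activity": {
        "diagnoses": ["bladder cancer"],
        "procedures": ["cystoscopy"],
        "prescriptions": ["example-drug"],
        "referral_volume": 310,
    },
    "clinical_research": {
        "trials": [{"trial_id": "TRIAL-0001", "phase": 2, "enrolled": 12,
                    "sponsor": "ExamplePharma"}],
    },
    "digital_and_quality": {
        "x_followers": 5400,
        "readmission_rate": 0.04,
        "patient_review_score": 4.6,
    },
}

# Example access: enrollment on the physician's most recent trial
print(profile["clinical_research"]["trials"][0]["enrolled"])
```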
Amar Drawid (11:08)
And I assume you're collecting that from a lot of different sources, right? Because this is a lot - you have to connect a lot of different datasets for this. And some of this information, I'm not even sure is easy to get, right? So how do you manage that?
Ariel Katz (11:24)
Beg, borrow, and steal. The easiest is public sources, and now it's actually easier with AI - depending on the country. In the US, take this as an example: there's an open scientific registry of publications with PubMed, an open registry of clinical trials at clinicaltrials.gov, the NPI database has all the physicians in the US, and you go scrape every hospital website, and boom, you have 80% of the demographic data on physicians in the US. But try doing that in Saudi Arabia or Taiwan - it's a bit harder. So it depends on the country. In some countries you have to partner with governments, in some countries you have to partner with local health insurers, in some countries you have to partner with local companies there. So that's the physician demographics. And for the more insightful pieces of information, a lot of it is via partnerships or proprietary ways that we get that information - people coming in and claiming their H1 profile, analyzing the data in certain ways - and that's our secret sauce.
Amar Drawid (12:26)
Okay. All right. And tell us about the coverage of these HCPs, right? You talked about the US, but also a lot of other countries. So can you give a sense of how many physicians you cover in the US? What percentage? And are these more the key opinion leaders, or are they also prescribing physicians? Can you talk a bit about that as well?
Ariel Katz (12:50)
We have every healthcare professional in the world, besides the ones in Iran, Russia, and parts of China - everyone. We have users in over a hundred countries around the world, in multiple languages. Yeah, we have data in every country in the world.
Amar Drawid (13:04)
So you have the data. Of course, we talked about the key opinion leaders and having data about their publications and so on. But you also have data on just regular doctors?
Ariel Katz (13:14)
We have the primary care physician with a poster in front of their basement house in Omaha - basically everyone you can imagine, from the chiropractor to the medical oncologist, except for those countries that I mentioned. But every physician.
Amar Drawid (13:27)
In terms of outside the US - you talk about so many different countries - did you have to do separate contracts or partnerships in each country? How did that work?
Ariel Katz (13:38)
It depends on the country. I mean, I had to learn a lot about history - we all did - by doing this. Take Germany. I go to Germany, it's one country, but it's actually sort of like the United States. It's like Prussia and Bavaria, and each one has its own registry of physicians and doctors, and you have to go to each regional registry. In Canada, there's Quebec and Ontario and British Columbia, and they all have their own NPI-like registry. In Japan, it's Ultmarc, which is part of M3, the biggest company there around healthcare professionals. So each country has its own unique texture to it. And you have to know the country and go country by country strategically to be able to get the right physician information based on what's going on in that country.
Amar Drawid (14:19)
Now, do you find, especially in the developing countries, that they even have registries of all these different HCPs?
Ariel Katz (14:27)
Short answer is no, but a lot of countries in Latin America do. Brazil has a more structured database than CMS in the US. It's incredible. Oh really? Yeah, really. DATASUS - check it out. They have deeper patient granularity than in the US. It's a smaller market for pharma companies, but it's amazing what they have: longitudinal patient data for 200 million people. It's crazy. Wow. Right? It's like Israel. Yeah, crypto - it's crypto-grade there. So Brazil is one example. But take certain markets in Africa and the Middle East region: the UAE, Israel, and Jordan are pretty organized with their physician information. Take a plane ride south, go to Zambia or Kenya - no, they don't have physician registries. But we've been in contact with the Ministry of Health, for example, in Zambia, where they sent out a link to H1 to all the physicians in Zambia. There's like one oncologist in Zambia, quite literally - I'm not exaggerating. They signed up for H1, and that was how we went viral in Zambia last year, which was an incredible experience. So no, that's why each country is unique. You learn a lot about healthcare policy in all these countries around the world.
Amar Drawid (15:38)
That's fascinating, actually. And so as you're bringing all of this together, how do you classify HCPs into KOLs - the opinion leaders - versus not? How are you organizing this information?
Ariel Katz (16:07)
Yeah, the KOL versus non-KOL question is actually very complicated. Specialty, you would think, is easy. A medical oncologist is a medical oncologist. Nope. There's no medical oncologist in Zambia - there's an oncologist. There's no rad onc or surgical onc. Or take another simple one: family medicine, the primary care physician. All right, pretty well understood by people - the primary care physician I go to for my checkup, when I have a cold, where I take my kids. The definition of a primary care physician in the US has nothing to do with a primary care physician in England. There it's a general practitioner, and they're sort of the entryway into the system. Yeah, it's different. Or take the example of Kenya: a nurse is sort of the same thing as a family medicine doctor - they're not credentialed as an MD, but they act like an MD. So specialty is actually a very hard one to normalize across geographies and the world. Then the next level you want to get to is - I'll say it in plain English terms - who's the doctor that people call when they don't know how to treat a patient well? That's the local leader, the doctor you call when you're like, man, I don't know what this rash is, who do I call? That takes place in America, and it takes place globally. Then the next level would be a national leader: who really influences protocol and guidelines in a given country? And then you think of global leaders, probably the classic KOLs, who drive the clinical agenda for diseases globally and help write guidelines for the top international societies. Those are actually pretty easy to identify, to be honest. It's the local ones, and then specialties and normalization globally, that are very difficult to do.
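A toy illustration of the specialty-normalization problem Ariel describes - mapping country-specific labels onto one shared taxonomy. The mapping table and labels are invented for illustration, not H1's taxonomy.

```python
# Hypothetical mapping from country-specific labels to a normalized specialty taxonomy
NORMALIZED_SPECIALTY = {
    ("US", "Medical Oncologist"): "oncology.medical",
    ("US", "Radiation Oncologist"): "oncology.radiation",
    ("ZM", "Oncologist"): "oncology.general",          # no med/rad/surgical split
    ("GB", "General Practitioner"): "primary_care",     # UK entry point to care
    ("US", "Family Medicine"): "primary_care",
    ("KE", "Clinical Officer"): "primary_care",          # acts like an MD, credentialed differently
}

def normalize(country: str, raw_label: str) -> str:
    """Return the shared taxonomy code, or flag the label for manual review."""
    return NORMALIZED_SPECIALTY.get((country, raw_label), "unmapped/needs_review")

print(normalize("GB", "General Practitioner"))  # -> primary_care
print(normalize("BR", "Oncologista"))           # -> unmapped/needs_review
```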
Amar Drawid (17:45)
Before you started H1, you had been the founder of Research Connection, right? You sold that business - how did that lead into the founding of H1? Is there a connection to that previous venture?
Ariel Katz (17:57)
With Research Connection, we profiled every professor in America - aerospace engineering professors, chemistry, biology, Russian literature - and companies were using it. J&J, Exxon, Microsoft - they were reaching out to these professors about grant funding. And I was pretty annoyed about it, because these professors didn't go to Research Connection for grant funding from all these companies. I tried to tell these companies to stop using it, and they wouldn't listen to me. I was not smart enough to think I should go and charge these companies; I was thinking, stop using this thing, I didn't build it for you. And when we sold Research Connection, I took some time off, did a long trip in India. Around that time I met my co-founder of H1, and I was working on a couple of ideas. H1 was one of the ideas I pitched him on: let's build LinkedIn meets ZoomInfo for doctors, the most powerful profile of every doctor in the world. I didn't know anything about pharma, by the way - not a single thing. We learned over time. And he was like, let's go do it. He's the data and tech guy - he built it all. I'm just the sales guy here. He's the data guy. He built it all.
Amar Drawid (19:01)
And what does H1 mean?
Ariel Katz (19:03)
There were just three of us in the room, and we decided it. There were three reasons why we called it H1. One, my wife's name is Helena - I was dating her at the time, so I got some great brownie points for doing that. We weren't married yet. Two, we thought H for healthcare - okay, that's nice. And then H1 - you have a PhD, you know - is the alternative hypothesis. It's when something new is observed in the world. We were the nerds thinking everyone knows what that means. Apparently nobody knows what that means. But that was the decision back then.
Amar Drawid (19:35)
Great, great. When I looked up H1, it's not that easy to find. If you do a search on H1, it does come up, but then there are a lot of other things that you also get, like the heading one in HTML, right?
Ariel Katz (19:47)
Yeah, from HTML you get the H1 tag, and you get H1N1. Here's a funny story. I tried to buy h1.com from GSK. They bought it in 1998 for the H1N1 vaccine. And I couldn't buy it. I tried to negotiate with the general counsel of GSK - if you're listening to this, we're still open for business to buy it from you. We almost struck a deal, but they refused to sell it to us during COVID, when they put a freeze on selling IP from their vaccine portfolio. That tanked the deal to buy h1.com.
Amar Drawid (20:17)
Okay. And so tell us how many physicians you have in your database.
Ariel Katz (20:22)
About 11 million.
Amar Drawid (20:24)
Eleven million. Wow.
Ariel Katz (20:26)
It's a lot of people.
Amar Drawid (20:29)
And so now as we're looking at this kind of data - one competitor I see is Veeva, which has this KOL profiling data. So can you talk about how your data is different and more enriched than Veeva's?
Ariel Katz (20:44)
Veeva is a good product. We have different philosophies. What is the definition of a KOL? Take the top five pharma companies - J&J, Merck, Pfizer, Novartis, Amgen. Let's say they all do work in oncology; I think they all do. Who is the medical oncology KOL in Montreal? Ask all five of those companies: give me your top 50 list. There's some overlap, but I guarantee you 50% of the list will be different between those companies. So the definition of a KOL is sort of in the eye of the beholder - it's how the pharma company thinks about it. Veeva does a cool thing where they designate who's a KOL, and they define it. We tell you: here are the 94,000 physicians in Canada. Here are the ones that publish the most, speak the most, are on the boards of all the societies there, do clinical trials, see the most patients. You can filter and sort for what you're interested in, and here are our pre-canned definitions of who's a KOL or not. But honestly, you know better than us, and we're going to give you all the ingredients to figure it out, and you can use our definitions. So it's a different philosophy. That's the difference between us and Veeva: we have every physician in the world, they have the KOLs. And that's because we don't believe a technology company can define a KOL for every company. Generally the company wants to define it themselves.
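A minimal sketch of the "ingredients plus your own definition" philosophy: the raw activity counts stay fixed, and each company supplies its own weights to filter and rank physicians. All records and weights below are made up for illustration.

```python
# Illustrative physician activity records (all counts invented)
physicians = [
    {"name": "Dr. A", "publications": 80, "congress_talks": 12, "board_seats": 2,
     "trials": 5, "patients_seen": 900},
    {"name": "Dr. B", "publications": 5, "congress_talks": 1, "board_seats": 0,
     "trials": 9, "patients_seen": 2500},
    {"name": "Dr. C", "publications": 40, "congress_talks": 30, "board_seats": 4,
     "trials": 0, "patients_seen": 300},
]

# One company's custom KOL definition, expressed as weights over the ingredients
weights = {"publications": 0.5, "congress_talks": 2.0, "board_seats": 10.0,
           "trials": 3.0, "patients_seen": 0.01}

def kol_score(p: dict) -> float:
    """Weighted sum of activity counts under this company's definition of a KOL."""
    return sum(weights[k] * p[k] for k in weights)

# Rank physicians under that definition; a different company would swap the weights
for p in sorted(physicians, key=kol_score, reverse=True):
    print(p["name"], round(kol_score(p), 1))
```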
Amar Drawid (21:58)
Okay. And so do you provide raw data as well as the platform? And what about consulting services? Can you tell us what products you provide?
Ariel Katz (22:10)
We have software solutions, which is a user interface to access the data. For life sciences companies, we have a prescriber universe for commercial teams, an HCP universe for medical teams, and then a site universe and a patient universe for clinical teams. Those are the UI interfaces - you can log in and see it. And then you can buy API or data feed access to the underlying data. Generally, the top 50 pharma companies do that - folks like yourself who can process the data, ingest it, map it to their reference data, and get it into their CRM. And then other folks that might not have as many resources buy access to the interface and maybe do a CRM integration and some other touch points.
Amar Drawid (22:49)
Okay. And in terms of consulting services, do you also do those around the data, or is it more providing the information?
Ariel Katz (22:57)
Providing information.
Amar Drawid (23:16)
Okay, gotcha. At the heart of the H1 platform is a massive knowledge graph, if I'm correct. So can you explain, first of all, to the audience what a knowledge graph is? And then can you talk about the significance of the knowledge graph that you've built?
Ariel Katz (23:17)
Yeah, so think of a knowledge graph that we can all understand. We all have families or friends. So put yourself in the middle, and then put a line and a circle with the face of every one of your friends and family members. Then think about, under the face of each of those people, their favorite pizza flavor, their favorite ice cream flavor, their favorite soda flavor. So now they all have attributes associated with them. Now also link together the names of all those ice creams. If someone likes vanilla, you can look at all the people that like vanilla - see all the nodes associated with vanilla ice cream. If everyone's friends with John, you can see all the people who are friends with John. So you have all this data stored in a relational manner, and there are nodes - circles - that can be connected to each other. We have a few different knowledge graphs here. We have one of scientific and medical concepts. It's actually a hard problem: mapping the keywords in scientific publications to medical diagnoses is hard to build. So we've mapped every single medical diagnosis - bladder cancer, breast cancer, prostate cancer, heart failure - to scientific publications, where they might say heart conditions. We've mapped together every single scientific keyword to categorize the world of medicine and science. On top of that, we have a knowledge graph of who doctors work with: who do they publish scientific papers with, who do they share patients with, who do they refer patients to, who did they go to medical school with, who do they speak on panels with, who are they on boards with, who do they tweet at, who do they follow, who are they connected to on LinkedIn? So we have all these layered relationships between these physicians. Those two elements of the knowledge graph - and we have way more nodes than that - help us understand who the physicians are, how they're related to each other, what they focus on, and what scientific or medical work they do.
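A tiny sketch of the kind of knowledge graph described here - physicians and medical concepts as nodes, typed relationships as edges - using networkx. The nodes, attributes, and relations are invented examples, not H1's graph.

```python
import networkx as nx

# MultiGraph allows several typed relationships between the same pair of nodes
G = nx.MultiGraph()

# Physician nodes with attributes, plus a normalized medical-concept node
G.add_node("dr_smith", kind="physician", specialty="cardiology")
G.add_node("dr_jones", kind="physician", specialty="cardiology")
G.add_node("heart_failure", kind="concept")

# Typed relationships (edges) between nodes
G.add_edge("dr_smith", "dr_jones", relation="co_authored")
G.add_edge("dr_smith", "dr_jones", relation="shares_patients")
G.add_edge("dr_smith", "heart_failure", relation="publishes_on")
G.add_edge("dr_jones", "heart_failure", relation="publishes_on")

# Query: which physicians are connected to the heart_failure concept?
experts = [n for n in G.neighbors("heart_failure")
           if G.nodes[n]["kind"] == "physician"]
print(experts)  # ['dr_smith', 'dr_jones']
```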
Amar Drawid (24:58)
So we talked about the one big use case, which is finding the right HCPs to be principal investigators for clinical trials. What are some of the other big use cases you see for these datasets?
Ariel Katz (25:12)
Yep, a big growing one now is medical affairs teams finding what they call digital opinion leaders. Who's the physician that has the most Twitter, or X, followers, and how many? That's actually a hard problem, because there could be someone that doesn't publish or isn't on the board of the American Heart Association who influences more physicians than the person on the board, by having a large social media following. So that's a fast-growing one. Another one is commercial teams finding the right physician that thinks positively about your company. You know what their digital preferences are - you know they like LinkedIn over X over reading news articles - and then you help those commercial teams share the right content with that physician after a certain behavior is taken. So for example, we know when a physician is presenting a poster at ASCO, and we know what that poster is about, and then we give that information to commercial teams so they can know what that physician is interested in and on which channel they should target that physician with that content. That's what they call omnichannel, and there are some other activities like that. So those are some other use cases we see growing with our data.
Amar Drawid (26:11)
So for those who don't know, ASCO is the largest oncology conference in the world. Okay. And as the landscape is evolving, do you see the use cases evolving as well? How do you see what the needs are going to be in the future?
Ariel Katz (26:27)
We've been talking a lot about the pharma side. On the patient side, tens of millions of patients use H1 every day to find a doctor. We sell our information to digital health companies and health insurance companies. We're getting asked now: tell me how good this physician is. Not what's on ZocDoc, not a patient review - literally, are they better at doing shoulder surgery than these other doctors, and how? So physician quality is being asked about a lot. Another one is around cultural competencies: do they see representative patients? Take sickle cell. Seventy percent of people in the world that have sickle cell are African American. I want to find the physicians that see the types of patients that are representative of the people that have the condition. HIV in the US predominantly affects Black men. All right, who are the physicians seeing these Black men? I want to educate them about HIV and some treatment options. So we're seeing more and different types of questions, and we're continuing to try to build the data to help solve them.
Amar Drawid (27:29)
Yeah. And so one of the requirements these days is to have diversity in the clinical trials, right? So how does your platform help in finding PIs that can help with this?
Ariel Katz (27:43)
Yeah, in the US we have the race and ethnicity of 300 million patients, and we link that to a physician. And about the physician, we have their race and ethnicity and the language they speak. There's no better place to go in the US to get that information. We do the same thing in England. We do the same thing in Brazil. We're getting similar information in Spain and Germany. Getting representative patients is more than just a US problem. It's more challenging in some other countries, but we're doing it globally. It's near and dear to our heart. It's very important.
Amar Drawid (28:14)
You spent time studying in Israel, and while we've been talking about the use of your platform in the context of biopharma, there's another real-world example I'd like you to touch on. Can you talk about how your platform was used following the October 7, 2023 Hamas attacks in Israel?
Ariel Katz (28:33)
So October 7th was a Saturday. That Monday was Columbus Day. On Tuesday, I pulled together a team and said, we're going to go help. And we reached out to every doctor in America and said, who wants to go to Israel? They need doctors. They need help. At the time, there was a shortage of doctors - they called up the reserves, so there were no doctors at hospitals anymore - and there was a need for all these different types of specialists. People talk about the 1,200 people that were killed; there were also 5,000 people that were injured - limbs, all these things. And so we ended up flying doctors to Israel. We reached out to every doctor in Israel and said, where do you need help? We reached out to the Ministry of Health and started flying doctors to Israel. Fast forward a few months, the war was in Gaza, and we did the same thing in Gaza. We started partnering with the world's largest NGOs - the World Health Organization, Save the Children, Project Hope - and we were in contact with the majority of physicians in Gaza. A lot of them were displaced from the north at the time. We got in contact with them and we connected them to the NGOs in the south of Gaza. And remember, we have every doctor in the world - that's all the doctors in Gaza and all the doctors in the West Bank. I don't know if there was another company that was able to get in touch with all the doctors at Al-Shifa Hospital in Gaza, which was no longer functioning, and those doctors still wanted to help. I'm actually still in contact over WhatsApp with a lot of these doctors in Gaza and Israel, but we were able to connect them to the NGOs. After that, we did the same thing in Ukraine. Somehow we built up a good brand within these NGOs of being able to get doctors, and be in touch with doctors, that nobody else knew about, because we have every doctor in the world. And that is how we went viral in Zambia. There was a cholera scare in Zambia, and the Ministry of Health told all their physicians to sign up for H1, and overnight we had thousands of people in Zambia sign up. I didn't know much about Zambia until that happened, and I was pretty excited about it. So we're continuing to help with humanitarian missions and sending doctors around the world to also support our mission.
Amar Drawid (30:20)
Fantastic. So does this suggest potentially broader applications in the future?
Ariel Katz (30:25)
I think we're just getting started. I mean, we've been at it for a while. I wish we'd done more, but we've done a lot, and I think there's a lot more to do. Who has the problem of knowing that they're seeing the right doctor, or working with the right doctor? I think everybody in the world, in every country. How many people know about H1? How many people use H1? 200 pharma companies, tens of millions of patients in the US, which is awesome, but it's far, far from the mission.
Amar Drawid (30:54)
Ariel Katz, co-founder and CEO of H1, thanks for your time today.
Ariel Katz (30:58)
Thank you.
Daniel Levine (31:01)
It was an interesting conversation, Amar. What did you think?
Amar Drawid (31:04)
It was a fascinating discussion. We started with helping pharma companies identify the physicians for clinical trials and went all the way to applications even in war zones. So it covered a lot of breadth, I thought. And it's the same data, right? But there are so many very different use cases you can have. It's also interesting that this kind of data doesn't exist elsewhere right now - data about the physicians - and there were a lot of hoops that Ariel and the company had to go through to collect that data in so many different countries.
Daniel Levine (31:46)
Were you surprised that he said the problem of getting that kind of data has gotten worse with AI rather than better - that there's more noise in the system today?
Amar Drawid (31:55)
I was asking him early on, is it about master data, or is it about the data even beyond that? Master data, of course, I do believe is getting more and more systematized as the different countries work on that. But what H1 is really collecting is a lot of information about the publications, the digital activities, how the physicians are perceived. That is information that is not straightforward to get, and you have to go through a lot of complexity to get it. And I'm not even sure what all the different types of sources are that H1 needs to use. With AI, I think there can now be a lot of information that's generated, but you don't know if it's true or not. That, I think, could be a very confounding factor. You do need to get to the real sources of data and information, and not something that may just be promoting some physicians over others. So you have to watch out for a lot of those things.
Daniel Levine (33:01)
The other thing that's interesting here is that in the case of H1, it's about the data, not the AI. I'd argue that's often the case, but what did you make of that?
Amar Drawid (33:12)
As Ariel said, three or four years ago, before GenAI, the AI wasn't that easy. And that's because you're asking complex questions, and you had to program all of that out - there was a lot of algorithm development needed. Now with GenAI, it's become easier because you can have GenAI do a lot of the analysis, a lot of the logical reasoning, around this. And for anything that's AI related - we've talked about this several times here - anything you do with AI is only as good as the data it was trained on, and you definitely need a lot of data to train any AI. As we say, garbage in, garbage out, right? So the data is the most important part. I think early on both the data and the AI problems were there. The AI problem has become easier for them, but the data problem is still there, and that's not going away anytime soon.
Daniel Levine (34:17)
You know, it's always interesting to me that when someone builds a system like this, there are unintended uses they didn't contemplate. And it seems that when you marry this type of data to AI, there are all kinds of opportunities for those other uses. I think about, you know, the various war-torn places he referenced. I imagine that there are going to be other people finding new ways to use this kind of system.
Amar Drawid (34:47)
Yeah, absolutely. I'm pretty sure they should do a partnership with the Red Cross. Anywhere there's a war zone, this is something that would be very helpful for finding doctors. And I can see that in any natural calamity - earthquakes or something like that - this would be very helpful information to really connect all these doctors and bring them to the right place. I mean, a doctor is someone everyone needs - that's a universal need for people, right? So having this universal database of doctors can have so many uses. You could even think about the pandemic, right? If there are epicenters in a pandemic, they could have used this database to identify which doctors can help. Anywhere there are issues with public health, this is something that's going to be very, very useful. And of course, you have the use cases with doctors and patients and pharmaceuticals in general day-to-day life anyway, but in a lot of these disasters we can definitely see the uses. And I'm pretty sure there are a lot of other use cases we can think about. Even governments could think about: what kinds of doctors do we have? Where do we not have enough doctors? So this could even be useful for governments, to make sure there are enough doctors, and the right types of doctors, to serve their populations, and to see where the gaps are. So there's a lot that can be done here from a public health point of view.
Daniel Levine (36:36)
It was a compelling story. Amar, thanks so much for your time as always.
Amar Drawid (36:41)
Thank you, Danny.
Daniel Levine (36:42)
Thanks again to our sponsor, Agilisium Labs. Life Sciences DNA is a bi-monthly podcast produced by the Levine Media Group with production support from Fullview Media. Be sure to follow us on your preferred podcast platform. Music for this podcast is provided courtesy of the Jonah Levine Collective. We'd love to hear from you. Pop us a note at danny at levinemediagroup.com.
For Life Sciences DNA, I'm Daniel Levine. Thanks for joining us.