Unlocking the Power of AI in Healthcare | Hamed Shahbazi (Part 2)
The James Altucher Show · October 15, 2024 · 01:05:09 · 59.66 MB


James is joined again by Hamed Shahbazi, founder of Well Health, for an in-depth look at AI’s role in transforming healthcare. They discuss how AI can process vast amounts of medical data to assist doctors in diagnosing and treating patients quickly and accurately.

A Note from James:

"I do think that the most important use case for AI is in extending our lifespans. And I think if this isn’t five or ten years in the future—this is right now. Things are happening in AI that are going to change healthcare overnight. Today, I’m continuing my discussion with Hamed Shahbazi, founder of Well Health. This is a publicly-traded company in Canada, and it’s already impacting millions of lives by applying AI to healthcare.

HealWell, one of Hamed’s companies, has processed over 18 million patient records. With this data, their AI can analyze symptoms and assist with diagnosis and treatment options faster than ever. Imagine calling your doctor, and before you even see them, AI has already pieced together potential diagnoses and suggested next steps. This is how we extend lives—prevention is the best cure, after all.

In this episode, Hamed and I talk about healthcare AI and how it’s poised to save millions of lives. I’ll be sharing updates on AI and healthcare companies doing interesting things in the next few months. So, listen in, subscribe, and let’s keep the conversation going. Here’s Hamed, talking about AI in healthcare."

Episode Description:

James is joined again by Hamed Shahbazi, founder of Well Health, for an in-depth look at AI’s role in transforming healthcare. They discuss how AI can process vast amounts of medical data to assist doctors in diagnosing and treating patients quickly and accurately. Hamed shares his insights on how HealWell's AI copilots help healthcare providers manage data and improve patient care. This conversation reveals how AI is poised to revolutionize healthcare and extend lifespans in previously unimaginable ways.

What You’ll Learn:

  1. The role of AI in accelerating diagnosis and treatment, potentially saving millions of lives.
  2. How HealWell's AI copilots enhance productivity and enable healthcare providers to focus more on patient care.
  3. The ethical and regulatory challenges in using AI in healthcare, especially concerning patient data privacy.
  4. The potential future of digital twins in healthcare, allowing doctors to simulate treatments on virtual models of patients.
  5. Insights into how healthcare AI could personalize supplements and treatment plans based on individual health data.

Timestamped Chapters:

  • [01:30] – James introduces Hamed Shahbazi and the current impact of AI on healthcare.
  • [03:25] – How AI can prevent illnesses by providing rapid diagnosis and treatment options.
  • [05:51] – Discussing the resistance from healthcare professionals toward AI.
  • [13:27] – The use of AI for scribing medical notes and its benefits for healthcare providers.
  • [19:05] – Ensuring patient data privacy and the role of consent in AI data processing.
  • [27:13] – The future of healthcare AI, digital twins, and personalized medicine.
  • [37:14] – Challenges in eliminating bias from healthcare AI.
  • [47:56] – Potential for AI to improve life expectancy by supporting preventive care.

Additional Resources:

------------

  • What do YOU think of the show? Head to JamesAltucherShow.com/listeners and fill out a short survey that will help us better tailor the podcast to our audience!
  • Are you interested in getting direct answers from James about your question on a podcast? Go to JamesAltucherShow.com/AskAltucher and send in your questions to be answered on the air!

------------


Thank you so much for listening! If you like this episode, please rate, review, and subscribe to “The James Altucher Show” wherever you get your podcasts.

Follow me on social media:

[00:00:06] [SPEAKER_02]: I do think that the most important use case for AI is in extending our lifespans. This is not five years in the future, ten years in the future. This is like right now. Already, things are happening in AI that are going to change the world overnight in terms of our health and healthcare.

[00:00:27] [SPEAKER_02]: And once again, I'm continuing the discussion with Hamed Shahbazi, the founder of Well Health, but also the founder of HealWell. And this is a publicly traded company in Canada, knock yourselves out if you want to invest in it, but listen to this podcast first, because HealWell basically took 18 million patient records.

[00:00:50] [SPEAKER_02]: A patient record is anything like when you call up your doctor and it's like, oh, I have a sore throat, but my leg hurts, and after various tests and patient visits and so on, there's a diagnosis, and then there's treatment.

[00:01:03] [SPEAKER_02]: And imagine if you put 18 million records like that into an AI. The result is when a new patient comes in, right from the phone call where the patient is complaining about their symptoms, the AI is already in motion figuring out a diagnosis that the doctor might not even have a clue on, and then figuring out what tests are needed and what possible treatments there are.

[00:01:25] [SPEAKER_02]: It's proven. This speeds up diagnosis and treatment so quickly. And we all know that prevention is the cure. This is how you expand lifespan.

[00:01:35] [SPEAKER_02]: So Hamed and I talk about this AI and other future use cases of AI and how he's using it to stay healthy, for instance, how I could, how you could, and just what the future of healthcare AI is.

[00:01:48] [SPEAKER_02]: It's so fascinating and these companies are so exciting. Like this is, this is the real, everybody's all afraid of AI, but AI is going to save millions and millions of lives.

[00:02:00] [SPEAKER_02]: It probably already has, and it's going to continue to do so. So such a great conversation with Hamed Shahbazi. Listen to this, subscribe to our podcast.

[00:02:08] [SPEAKER_02]: I'm going to be giving you a lot of up-to-date news about AI and companies doing AI and all the things I'm excited about with AI in the next months and years and so on. So let's keep this rolling.

[00:02:20] [SPEAKER_02]: Here's Hamed talking about healthcare AI.

[00:02:28] [SPEAKER_01]: This isn't your average business podcast and he's not your average host. This is the James Altucher Show.

[00:02:44] [SPEAKER_02]: Yeah. I mean, how are things going in general? Like, how's the company? How's HealWell? How's Well?

[00:02:50] [SPEAKER_00]: Well, everything's going really well. Like we're very focused right now on really developing these tools for healthcare providers.

[00:02:59] [SPEAKER_00]: We're also working very hard to collect the consents from the patient population that we have.

[00:03:06] [SPEAKER_00]: See a lot of people are always talking about how much data they sit on and how much data they have, but they can't do anything with the data.

[00:03:13] [SPEAKER_00]: And it's because it takes hard work to go through and actually collect the proper consents unambiguously, not trying to slide things by people.

[00:03:20] [SPEAKER_02]: Well, actually, this is a, this is an interesting topic. I want to, I want to talk about this because I think this is an interesting topic for AI and healthcare in general.

[00:03:26] [SPEAKER_02]: And I'm just curious, like what pushback have you seen from the healthcare industry, from doctors in particular, and then I'll get to patients, but from doctors in particular about using AI in their practice?

[00:03:39] [SPEAKER_00]: It's really been an interesting journey, actually with that.

[00:03:43] [SPEAKER_00]: I would tell you that pre-pandemic, even though AI and machines were around and the ability to kind of crunch data, doctors were very against it.

[00:03:53] [SPEAKER_00]: And the view was that machines would find out where they had done things that didn't contribute to the best care outcomes.

[00:04:02] [SPEAKER_00]: And they were worried that machines would sort of show them up and identify things that they may not have done perfectly.

[00:04:11] [SPEAKER_02]: Which, by the way, would be a good thing, but of course they feel threatened by it.

[00:04:15] [SPEAKER_02]: So, like, I always assume the best case scenario: doctors are in it to help people and cure people of illnesses and prevent people from getting illnesses.

[00:04:25] [SPEAKER_02]: So they have all the good intentions, and yes, you know, medicine is a well-paid profession.

[00:04:30] [SPEAKER_02]: So another reason is doctors want to make money, but at the end of the day, we're human.

[00:04:35] [SPEAKER_02]: And a lot of humans feel threatened by, is AI going to take my job or is AI going to make a fool out of me in some way?

[00:04:44] [SPEAKER_00]: Yeah.

[00:04:45] [SPEAKER_00]: Yeah.

[00:04:45] [SPEAKER_00]: And look, I mean, I think there's different subspecialties that have a lot of that fear of replacement right now.

[00:04:51] [SPEAKER_00]: I mean, there was a, there was, there was actually decreases in enrollment in radiology for a while because the view is like, hey, what's the point?

[00:04:59] [SPEAKER_00]: Right.

[00:04:59] [SPEAKER_00]: If a machine is going to do a better job.

[00:05:02] [SPEAKER_00]: But I think what I understand is enrollment is back up now, because most people understand that no matter what happens, in the foreseeable future no medical system is going to allow a machine to make a decision.

[00:05:16] [SPEAKER_00]: It'll allow a machine to provide decision support and help amplify and support a physician in making a call.

[00:05:23] [SPEAKER_00]: That was kind of a key difference that, that was, I think, feared for a while.

[00:05:28] [SPEAKER_02]: But the key word there is, and sorry for interrupting, but I have questions along the way.

[00:05:33] [SPEAKER_02]: The key word there is allow.

[00:05:35] [SPEAKER_02]: It's the law that a computer cannot, you know, talk to the patient after, you know, an x-ray.

[00:05:43] [SPEAKER_02]: It has to be a human licensed radiologist by law who interprets the x-ray for you.

[00:05:49] [SPEAKER_02]: Even if it's, even if there's a glaring tumor, right, on the x-ray, you cannot talk to anybody other than a licensed radiologist.

[00:05:57] [SPEAKER_02]: So, so they're protected now by the law, which is, which is a good thing.

[00:06:00] [SPEAKER_02]: I think there is a human side to, to doctor-patient relationships.

[00:06:05] [SPEAKER_02]: But it's interesting that enrollment was down.

[00:06:06] [SPEAKER_02]: I did not know that initially.

[00:06:09] [SPEAKER_00]: Yeah, it did.

[00:06:10] [SPEAKER_00]: And you're right.

[00:06:11] [SPEAKER_00]: I think there's positives and negatives, probably more positives at this stage for medical professionals to have to weigh in on a quote-unquote decision.

[00:06:21] [SPEAKER_00]: But I feel like over time, it'll become more nuanced where, like, we already know that machines and machine learning and AI can do just as good a job, if not better, than a human.

[00:06:33] [SPEAKER_00]: And things like, you know, examining a mole and determining whether it's cancerous or looking at an x-ray.

[00:06:39] [SPEAKER_00]: Like, the field of vision of a human is just not going to be as good as an incredibly well-trained machine, right?

[00:06:46] [SPEAKER_00]: But I think it's, like you said, it's all of the other aspects of care.

[00:06:51] [SPEAKER_00]: It's about sanity checking the situation.

[00:06:53] [SPEAKER_00]: It's about, you know, providing patient care.

[00:06:55] [SPEAKER_00]: It's about communication.

[00:06:56] [SPEAKER_00]: You know, over time, these things may allow certain machines to start to make a decision.

[00:07:01] [SPEAKER_00]: Or to have their decisions or their analysis be more relied upon.

[00:07:07] [SPEAKER_02]: Well, right now, like the, you know, the HealWell, so your company, HealWell, has these, what is it, like 18 million patient records of a patient essentially calling up and saying,

[00:07:19] [SPEAKER_02]: oh, my knee hurts, I have a sore throat, I have a headache, what's wrong with me?

[00:07:24] [SPEAKER_02]: And, you know, you used kind of supervised learning.

[00:07:29] [SPEAKER_02]: Each one of those patient records has a diagnosis that eventually they came to, whether it was that day or weeks or months later.

[00:07:36] [SPEAKER_02]: And now that's all been crunched into an AI.

[00:07:39] [SPEAKER_02]: And now you're able to take a new patient who's calling up complaining about something and do a diagnosis very quickly.

[00:07:47] [SPEAKER_02]: Are those AI-generated diagnoses better than, you know, have you been able to measure?

[00:07:54] [SPEAKER_02]: Are they better than the doctor diagnosis or faster?

[00:07:57] [SPEAKER_02]: Or in what ways is it better?

[00:08:00] [SPEAKER_00]: Yeah.

[00:08:00] [SPEAKER_00]: So, in general, I mean, what we found with our physician co-pilots, the AI physician co-pilots that we've deployed,

[00:08:10] [SPEAKER_00]: what we found is that they've done a pretty good job of identifying, you know, some gaps in care.

[00:08:16] [SPEAKER_00]: Doctors get most of the things that they're doing right.

[00:08:19] [SPEAKER_00]: But, you know, doctors also have really messy inboxes.

[00:08:22] [SPEAKER_00]: So, there's always stuff coming into a physician's inbox or a patient record.

[00:08:27] [SPEAKER_00]: You know, there's lab results coming.

[00:08:28] [SPEAKER_00]: There's specialist results coming.

[00:08:30] [SPEAKER_00]: There's even hospital stuff coming in.

[00:08:32] [SPEAKER_00]: And, you know, everything always goes back to your primary care provider, right?

[00:08:37] [SPEAKER_00]: And so, you know, there's this expectation for your primary care provider to integrate all this information all the time

[00:08:46] [SPEAKER_00]: and be able to provide the most up-to-date, you know, aware perspective about your health.

[00:08:52] [SPEAKER_00]: But that is simply an unreasonable expectation.

[00:08:56] [SPEAKER_00]: Up here in Canada, where I live, you know, doctors are getting paid $30 to $35 a patient visit to do that.

[00:09:01] [SPEAKER_00]: And there could be hundreds, if not thousands of patient records that they need to skim through and understand.

[00:09:10] [SPEAKER_00]: That's just not reasonable, right?

[00:09:11] [SPEAKER_00]: And so, I think we need, just like a plane that requires a pilot, you know, does so with the help of an onboard computer.

[00:09:22] [SPEAKER_00]: I think you need diagnostics and machines to be able to crunch all this complex data to provide insights to physicians.

[00:09:28] [SPEAKER_02]: So, it's viewing, and I think this is an important thing, like anybody threatened by AI should make this leap.

[00:09:36] [SPEAKER_02]: It's viewing the AI as like incredibly good and informed assistant, essentially.

[00:09:43] [SPEAKER_00]: Exactly, exactly.

[00:09:45] [SPEAKER_00]: AI is really good at taking complex data sets and really, really, really digesting them and providing simple insights and signal.

[00:09:53] [SPEAKER_00]: And it's not like a search engine.

[00:09:54] [SPEAKER_02]: It's not like, oh, tell me more about AI that I type into Google.

[00:09:59] [SPEAKER_02]: It's like, it's seeing all these patient records and it has in its memory, you know, natural, like plain English records of 18 million other patients who have been diagnosed or have received treatment in the past.

[00:10:15] [SPEAKER_02]: And so, it's able to say, huh, this is like those 127,000 other patient records that were similar to this.

[00:10:21] [SPEAKER_02]: I'm combining it all in my head right now and seeing what the most likely, you know, neural network produced outcome is of this particular patient.

[00:10:29] [SPEAKER_00]: Yeah, yeah.

[00:10:30] [SPEAKER_00]: I mean, that's where, with that kind of deep learning and the broader data sets, you know, the ability to provide those types of insights is incredible.

[00:10:40] [SPEAKER_00]: You know, but there's so many different types of tools emerging.

[00:10:44] [SPEAKER_00]: For example, you know, that example I gave about a patient who's, let's say, had a long medical history, could have thousands of pages of records.

[00:10:55] [SPEAKER_00]: Imagine if a new doctor comes on the scene because the other one's away or something happened to him or moved or whatever the case may be.

[00:11:02] [SPEAKER_00]: Imagine trying to synthesize and understand the state of the nation of that patient.

[00:11:08] [SPEAKER_00]: You know, that's a really hard thing to do.

[00:11:09] [SPEAKER_00]: And so, AI, we know, is very good at taking complex documents, you know, like large PDFs and allowing you to ask questions and things of that nature.

[00:11:17] [SPEAKER_00]: But doing this with a patient record is a lot more complex.

[00:11:21] [SPEAKER_00]: It's a lot more non-trivial because you have a lot of structured and unstructured information.

[00:11:25] [SPEAKER_00]: So, you have sentences written by doctors over time.

[00:11:28] [SPEAKER_00]: You've got, you know, diagrams.

[00:11:30] [SPEAKER_00]: You've got all this kind of stuff.

[00:11:31] [SPEAKER_00]: And AI is very good at parsing that information and understanding unstructured information and organizing it so it can be searched.

[00:11:39] [SPEAKER_00]: And summarized.

[00:11:40] [SPEAKER_00]: There's a variety of different tools that are coming of age.

[00:11:43] [SPEAKER_02]: Right now, who's happier?

[00:11:45] [SPEAKER_02]: Are the patients happier or are the doctors happier?

[00:11:48] [SPEAKER_02]: Like, are you seeing good results, basically?

[00:11:51] [SPEAKER_00]: We are.

[00:11:52] [SPEAKER_00]: And like one of the key tools that we're using where we're seeing incredible results is an ambient AI scribe.

[00:11:59] [SPEAKER_00]: So, basically, this is a product that's really a service.

[00:12:04] [SPEAKER_00]: When you come in, and again, we need to collect the right consents to make sure that you understand what's happening.

[00:12:09] [SPEAKER_00]: And basically, there's a machine involved.

[00:12:10] [SPEAKER_00]: But the doctor turns the service on.

[00:12:12] [SPEAKER_00]: It listens to a conversation between a physician and the patient.

[00:12:16] [SPEAKER_00]: And you don't have to tell it who's speaking.

[00:12:19] [SPEAKER_00]: You know, it'll always sort of figure that out.

[00:12:21] [SPEAKER_00]: It'll, you know, exclude anything.

[00:12:24] [SPEAKER_00]: Like, let's say, you know, you and I are patient and provider.

[00:12:28] [SPEAKER_00]: And we talked about, you know, our vacations and the cottage and the lake or whatever.

[00:12:31] [SPEAKER_00]: And the kids.

[00:12:32] [SPEAKER_00]: It'll kind of block that stuff out.

[00:12:35] [SPEAKER_00]: And then what it'll do is it'll take the total sum of our consultation and create a medically relevant note for regulatory purposes.

[00:12:43] [SPEAKER_00]: This is really important because a lot of times what will happen, a physician will come into a session and transcribe everything that happens.

[00:12:50] [SPEAKER_00]: You know, like, where does it hurt?

[00:12:52] [SPEAKER_00]: What happens when you do this?

[00:12:54] [SPEAKER_00]: And tell me, you know, what happens when you sleep?

[00:12:57] [SPEAKER_00]: And tell me what happens when you touch it this way.

[00:12:58] [SPEAKER_00]: And they're just collecting information.

[00:13:01] [SPEAKER_00]: They're collecting observations.

[00:13:02] [SPEAKER_00]: And that consultation gets to a place where they've assessed what they're going to do and what the care plan is going to be.

[00:13:10] [SPEAKER_00]: Maybe there's a therapeutic prescribed.

[00:13:11] [SPEAKER_00]: Now, later on, a physician has to take that entire transcript and create a medically relevant note.

[00:13:18] [SPEAKER_00]: So a note that's structured in a way that's very condensed.

[00:13:22] [SPEAKER_00]: And that by itself usually takes longer than the actual consultation.

[00:13:27] [SPEAKER_00]: It can anyway.

[00:13:28] [SPEAKER_00]: For some people, it takes a bit less.

[00:13:29] [SPEAKER_00]: But we've launched that for physicians and we found that it gives them back at least 20% of their day.

[00:13:37] [SPEAKER_00]: So you just think about that.

[00:13:39] [SPEAKER_00]: The productivity enhancement of getting 20% of your day back.

[00:13:42] [SPEAKER_00]: And also what it's done is it's allowed now physicians, instead of, you know, having their heads sort of buried in a laptop taking notes, to pay attention to the patient and not worry about the notes and just worry about the patient.

[00:13:56] [SPEAKER_00]: So they observe more.

[00:13:58] [SPEAKER_00]: And what we found is that they touch the patient more.

[00:14:03] [SPEAKER_00]: And they care for the patient more.

[00:14:05] [SPEAKER_00]: They have more of a caring energy towards the patient.

[00:14:08] [SPEAKER_00]: Interestingly, there's been studies that show your physician touching you and looking at you and caring for you actually results in better patient outcomes.

[00:14:16] [SPEAKER_02]: Really? Why is that, you think?

[00:14:19] [SPEAKER_00]: I think it's because there's something about having a care provider show interest in our care.

[00:14:29] [SPEAKER_00]: You know, we know that placebo effect, we know just the power of belief is powerful, right?

[00:14:35] [SPEAKER_00]: We know placebo is a very powerful thing.

[00:14:36] [SPEAKER_00]: Like, people in the pharmaceutical industry rave about this all the time.

[00:14:40] [SPEAKER_00]: Like, it is natural to have a patient see, you know, significant improvements when they're taking a sugar pill just because they believe they could be taking some form of therapeutic.

[00:14:52] [SPEAKER_00]: So our observations, what we experience, how people interact with us, the level of care they show us, the level of interest they show in our well-being impacts how we feel about our wellness and our healing.

[00:15:05] [SPEAKER_00]: Isn't that crazy?

[00:15:06] [SPEAKER_02]: I guess it reminds me of whenever I'm talking to my accountant about the IRS.

[00:15:11] [SPEAKER_02]: Like, whether he gives me good news or not, I always feel better after I speak to him.

[00:15:16] [SPEAKER_02]: Totally, right?

[00:15:17] [SPEAKER_02]: Because he is really focused.

[00:15:19] [SPEAKER_02]: Like, I really think he cares about whatever my situation is.

[00:15:24] [SPEAKER_00]: Any situation you're in, whether it's legal or accounting or whatever, like, you talk to someone that really knows what they're doing and they kind of level with you.

[00:15:34] [SPEAKER_00]: They kind of give you the perspective.

[00:15:35] [SPEAKER_00]: They say, hey, look, here are the pros and cons of this situation.

[00:15:38] [SPEAKER_00]: Don't worry.

[00:15:39] [SPEAKER_00]: We've got, like, a care plan for you.

[00:15:40] [SPEAKER_00]: We're going to take care of your situation.

[00:15:43] [SPEAKER_00]: You know, there's going to be some bumps in the road.

[00:15:45] [SPEAKER_00]: Like, that kind of care is, you know, we're all humans after all, right?

[00:15:51] [SPEAKER_02]: So how is the model now evolving?

[00:15:53] [SPEAKER_02]: Like, do you take every new patient and keep on learning?

[00:15:58] [SPEAKER_02]: Or like, how do you improve the AI?

[00:16:01] [SPEAKER_00]: It's a great question.

[00:16:03] [SPEAKER_00]: So first thing that we do is we have undertaken to collect these consents that I was talking about earlier.

[00:16:09] [SPEAKER_00]: This is really important because when people walk into the clinic, we want there to be a clear and unambiguous understanding that they're walking into a doctor's office that's leveraging technology to help the patient and to also help empower them.

[00:16:25] [SPEAKER_00]: And so we will ask them, do you want to be contacted for clinical trials purposes?

[00:16:30] [SPEAKER_00]: And this is really important because AI will now, if you say yes, AI will now be able to review your patient records for the purposes of connecting with different patient trials that are happening.

[00:16:46] [SPEAKER_00]: And that could be a great opportunity for you.

[00:16:49] [SPEAKER_00]: But yeah, generally speaking, if you've consented to your data to be sort of put into the AI pool, there's learning happening with it.

[00:16:58] [SPEAKER_00]: There could be kind of looking for patterns.

[00:17:03] [SPEAKER_00]: There could be, you know, obviously looking for diseases, looking for conditions that are preventable from a preventive health standpoint.

[00:17:10] [SPEAKER_00]: It's mostly to do with your benefit.

[00:17:12] [SPEAKER_00]: You're deriving the benefit, as opposed to the greater good.

[00:17:15] [SPEAKER_00]: But there is greater learning as well that's going on.

[00:17:18] [SPEAKER_02]: Well, it's interesting because both sides, right, are in some sense consenting.

[00:17:23] [SPEAKER_02]: Like the doctors are saying, okay, I want all my records to be put into the AI.

[00:17:26] [SPEAKER_02]: And all the patients are saying, yes, I want my personal data to be put into the AI.

[00:17:31] [SPEAKER_02]: I'm just curious.

[00:17:31] [SPEAKER_02]: This is like an offbeat thing.

[00:17:34] [SPEAKER_02]: But can you incentivize them to contribute to the AI?

[00:17:37] [SPEAKER_02]: Like, can you say, hey, we have a rewards program for doctors who contribute, you know, 10,000 records or more.

[00:17:43] [SPEAKER_02]: Maybe they get shares because they're building essentially the main assets of the company.

[00:17:49] [SPEAKER_00]: Yeah.

[00:17:50] [SPEAKER_00]: I mean, look, you definitely.

[00:17:52] [SPEAKER_02]: Is that legal to do rewards like that?

[00:17:54] [SPEAKER_00]: You can.

[00:17:54] [SPEAKER_00]: I think for physicians, you know, the idea here is physicians control your records.

[00:18:01] [SPEAKER_00]: Patients own their records.

[00:18:03] [SPEAKER_00]: And then there's custodians who happen to kind of manage all of this data.

[00:18:09] [SPEAKER_00]: You know, and in different countries, there's different access and ability to leverage that data.

[00:18:15] [SPEAKER_00]: Depending on which regulatory framework that you're in, U.S., Canada, Europe,

[00:18:20] [SPEAKER_00]: a physician may or may not be able to single-handedly lift and shift your data into a learning kind of algorithm.

[00:18:29] [SPEAKER_00]: Now, obviously, you know, what could be different is whether it's anonymized or not.

[00:18:35] [SPEAKER_00]: But even in some regulatory jurisdictions, even an anonymized data set cannot be kind of reviewed in that way.

[00:18:42] [SPEAKER_00]: So, you know, that's all pretty murky, you know, from country to country.

[00:18:47] [SPEAKER_00]: And it's an evolving art, if you will, to some degree with some of this stuff.

[00:18:51] [SPEAKER_00]: But I do think there's a lot of focus on it right now because I think what everyone understands now,

[00:18:56] [SPEAKER_00]: given what's happened with generative AI in the past couple of years,

[00:18:59] [SPEAKER_00]: is just how powerful this technology can be and the importance of feeding this technology.

[00:19:05] [SPEAKER_00]: I mean, if you think about it, healthcare is the best application you can think of for AI.

[00:19:09] [SPEAKER_00]: It truly will help advance, you know, identify disease, identify longevity opportunities.

[00:19:16] [SPEAKER_00]: It's pretty compelling stuff.

[00:19:20] [SPEAKER_02]: Take a quick break.

[00:19:21] [SPEAKER_02]: If you like this episode, I'd really, really appreciate it.

[00:19:24] [SPEAKER_02]: It means so much to me.

[00:19:26] [SPEAKER_02]: Please share it with your friends and subscribe to the podcast.

[00:19:29] [SPEAKER_02]: Email me at alcatra at gmail.com and tell me why you subscribed.

[00:19:33] [SPEAKER_02]: Thanks.

[00:19:43] [SPEAKER_02]: Like, it seems like insurance companies would want to make this mandatory that physicians and patients use this

[00:19:49] [SPEAKER_02]: because they're going to get the, you know, prevention is the cure.

[00:19:52] [SPEAKER_02]: So they're going to get diagnosis much earlier using the AI system you develop than any other solution.

[00:20:00] [SPEAKER_00]: Yeah, I think you're right.

[00:20:01] [SPEAKER_00]: Right now, what we understand is a lot of the U.S. insurers are just even just preparing their data

[00:20:07] [SPEAKER_00]: because they understand they have so much unstructured data,

[00:20:10] [SPEAKER_00]: meaning that it's not in perfect tables and arranged perfectly.

[00:20:15] [SPEAKER_00]: It's all this stuff that I was talking about, you know, and most of it, you know, can't be used effectively.

[00:20:20] [SPEAKER_00]: You know, we're talking about 30% of the world's data is healthcare data, growing at 36% per year.

[00:20:28] [SPEAKER_02]: So, and it's not usable because it's in English and it's, you have to read the handwriting of the doctor.

[00:20:33] [SPEAKER_02]: And like, why is it not all usable?

[00:20:36] [SPEAKER_00]: Yeah. If, let's say, you went to the hospital and got a bunch of tests done,

[00:20:43] [SPEAKER_00]: you know, all of that stuff gets, you know, kind of as a computer output into a report.

[00:20:50] [SPEAKER_00]: The report is really dense.

[00:20:52] [SPEAKER_00]: It's really hard to understand.

[00:20:54] [SPEAKER_00]: It's a bunch of data that just gets sort of thrown down into this report.

[00:20:59] [SPEAKER_00]: And, you know, what a lot of AI is doing now is going through that data, parsing it and organizing it

[00:21:06] [SPEAKER_00]: so that it's easier for humans to understand.

[00:21:09] [SPEAKER_00]: And obviously, you know, this is in human form as well, but it's just dense and it's not easy to kind of go through.

[00:21:17] [SPEAKER_00]: And they're integrating it with other forms of information.

[00:21:20] [SPEAKER_00]: So, for example, our AI capabilities are, you know, we have a product called Chart Distillery

[00:21:25] [SPEAKER_00]: through our Pentavere AI service, which is one of the companies owned by HealWell AI.

[00:21:30] [SPEAKER_00]: It's fascinating what they can do.

[00:21:32] [SPEAKER_00]: They can effectively, you know, take skyscrapers worth of data and they can crunch that data

[00:21:38] [SPEAKER_00]: and they can essentially find key attributes that they're looking for.

[00:21:42] [SPEAKER_00]: They were written up in prestigious journals for finding rare lung cancers.

[00:21:47] [SPEAKER_00]: They have been able to stage head and neck cancers, which are famously difficult to do

[00:21:53] [SPEAKER_00]: because of the complex anatomy in the head and neck, just by looking at the text in an EHR.

[00:21:59] [SPEAKER_02]: So, yeah.

[00:21:59] [SPEAKER_02]: So how do they do that?

[00:22:00] [SPEAKER_02]: So what text are they looking at?

[00:22:01] [SPEAKER_02]: They're looking at what the doctor wrote, what the patient said?

[00:22:04] [SPEAKER_00]: All that stuff.

[00:22:05] [SPEAKER_00]: They're looking at blood tests.

[00:22:07] [SPEAKER_00]: They're looking at doctor's notes.

[00:22:08] [SPEAKER_00]: Wow.

[00:22:09] [SPEAKER_00]: Hospital printouts.

[00:22:10] [SPEAKER_00]: Results of diagnostic tests.

[00:22:13] [SPEAKER_00]: They're reading all of that information and basically using large language models and transformers

[00:22:20] [SPEAKER_00]: to be able to effectively key in on whether or not there's a risk for a particular disease

[00:22:27] [SPEAKER_00]: associated with that.

[00:22:28] [SPEAKER_02]: And again, it's because their model learned from prior patients who had all these different

[00:22:34] [SPEAKER_02]: cancers and so on.

[00:22:35] [SPEAKER_01]: Absolutely.

[00:22:36] [SPEAKER_02]: Yeah.

[00:22:36] [SPEAKER_02]: So over the next five years, say, what technologies – how is this going to evolve?

[00:22:42] [SPEAKER_02]: What technologies are exciting you in healthcare AI?

[00:22:46] [SPEAKER_02]: Because I think for an entrepreneur – and you started off your journey as an entrepreneur.

[00:22:50] [SPEAKER_02]: You sold a company to PayPal, then you started Well and then HealWell.

[00:22:57] [SPEAKER_02]: Where do you think the opportunities are for entrepreneurs?

[00:22:59] [SPEAKER_02]: And what are you excited about in what technologies are coming up in healthcare AI?

[00:23:06] [SPEAKER_00]: You know, I think that all of this is going towards a situation where we're going to have a very

[00:23:13] [SPEAKER_00]: good digital model of each one of us.

[00:23:15] [SPEAKER_00]: And I think that's really exciting because, you know, right now we have pieces of the data.

[00:23:21] [SPEAKER_00]: We may have some information about the microbiome or the metabolome or maybe some full-body

[00:23:25] [SPEAKER_00]: imaging, you know, maybe some CT scans.

[00:23:29] [SPEAKER_00]: So you can sort of put together a general picture, right?

[00:23:34] [SPEAKER_00]: As the ability to kind of capture data and this continuous monitoring – you know, I'm wearing an

[00:23:40] [SPEAKER_00]: Oura ring.

[00:23:41] [SPEAKER_00]: This is capturing tons of user-generated data all the time from when I sleep, my heart rate

[00:23:46] [SPEAKER_00]: variability, my resting heart rate, even the temperature, even the moisture around my fingers,

[00:23:52] [SPEAKER_00]: capturing all this data all the time.

[00:23:53] [SPEAKER_00]: And so imagine, you know, this whole area of digital twins.

[00:23:58] [SPEAKER_00]: I don't know if you've heard much about that.

[00:24:00] [SPEAKER_00]: But, you know, the idea of creating a digital twin of me, Hamed.

[00:24:05] [SPEAKER_00]: So if we get to the point where we've got enough data, AI can start to cover the areas

[00:24:12] [SPEAKER_00]: that we don't have.

[00:24:13] [SPEAKER_00]: It can start to guess, or infer, what the missing data would look like.

[00:24:17] [SPEAKER_00]: Like, now we have a digital model of Hamed and we can start interacting with that digital

[00:24:21] [SPEAKER_00]: model with therapeutics and seeing what the digital model would do instead of trying it on

[00:24:26] [SPEAKER_00]: Hamed the human.

[00:24:28] [SPEAKER_02]: Oh, yeah.

[00:24:29] [SPEAKER_02]: I mean, you'd have to have a pretty comprehensive digital model.

[00:24:33] [SPEAKER_02]: Like, it has to basically model everything going on in your body.

[00:24:36] [SPEAKER_00]: Yeah.

[00:24:36] [SPEAKER_00]: But that's the future.

[00:24:37] [SPEAKER_00]: The future is more continuous monitoring, more data, more analytics coming from everywhere.

[00:24:43] [SPEAKER_00]: And this starts to create a framework where it gets pretty exciting because now, you know,

[00:24:51] [SPEAKER_00]: there's some kind of computational model of each one of us.

[00:24:54] [SPEAKER_00]: And we can start to get really precise in terms of, you know, what therapeutics would or

[00:25:00] [SPEAKER_00]: would not work or what combination of therapeutics.

[00:25:03] [SPEAKER_00]: Maybe the doctor will say, oh, gosh, you know, look, based on your computer model, based on what

[00:25:08] [SPEAKER_00]: we see, you know, we think you need to walk 10,000 steps a day.

[00:25:12] [SPEAKER_00]: You need to have this much olive oil.

[00:25:14] [SPEAKER_00]: You need to take this particular therapeutic.

[00:25:18] [SPEAKER_00]: And, you know, we think if you do this, then your metabolites and your biomarkers will

[00:25:23] [SPEAKER_00]: improve.

[00:25:23] [SPEAKER_00]: We'll review that again.

[00:25:25] [SPEAKER_00]: And, you know, because we tried all these different things.

[00:25:27] [SPEAKER_00]: And that's the other thing.

[00:25:27] [SPEAKER_00]: It's the ability to try a bunch of different things against that computer model.

[00:25:32] [SPEAKER_00]: You only get to try some things against the human model, right?

[00:25:36] [SPEAKER_00]: Because, you know, your body has to go through stuff.

[00:25:38] [SPEAKER_00]: But imagine all this ability to kind of, you know, try different scenarios like a machine

[00:25:43] [SPEAKER_00]: would one after another and find the optimal one.

[00:25:46] [SPEAKER_00]: This starts to get quite interesting.

[00:25:48] [SPEAKER_00]: From a longevity perspective, this is enormously important.

[00:25:51] [SPEAKER_02]: So, like, is this where it starts to intersect with genomics?

[00:25:56] [SPEAKER_02]: Like, would you take someone's, you know, I don't know what you call it.

[00:26:00] [SPEAKER_02]: Like, when they model all of your DNA and they know exactly your DNA versus another patient's

[00:26:05] [SPEAKER_02]: DNA.

[00:26:06] [SPEAKER_02]: And, you know, they use things like CRISPR to edit genes.

[00:26:09] [SPEAKER_02]: Would this start to intersect with that field?

[00:26:12] [SPEAKER_02]: So, oh, James's genome says he probably shouldn't take Tylenol.

[00:26:18] [SPEAKER_02]: He should take Advil instead or whatever.

[00:26:20] [SPEAKER_02]: Like, it'll know all my drug interactions and it'll figure that stuff out.

[00:26:24] [SPEAKER_00]: Yeah.

[00:26:25] [SPEAKER_00]: Yeah.

[00:26:25] [SPEAKER_00]: You're right.

[00:26:25] [SPEAKER_00]: It starts with your multi-omics, right?

[00:26:28] [SPEAKER_00]: Your genome, your metabolome, your microbiome, you know, all of these, all of these

[00:26:35] [SPEAKER_00]: omes get collected.

[00:26:36] [SPEAKER_00]: That's a lot of information.

[00:26:37] [SPEAKER_00]: Just what you said.

[00:26:38] [SPEAKER_00]: The genome and the microbiome, obviously the microbiome changes, the genome doesn't,

[00:26:43] [SPEAKER_00]: but you have epigenetics.

[00:26:45] [SPEAKER_00]: So, you have this notion where genes get turned on and off as a result of our lifestyle factors.

[00:26:50] [SPEAKER_00]: And this is what's interesting and what makes healthcare so difficult.

[00:26:54] [SPEAKER_00]: It's like, hey, why did all the people that had these genes not, you know, develop these

[00:26:58] [SPEAKER_00]: comorbidities or these diseases or these ailments?

[00:27:01] [SPEAKER_00]: Well, you know, lifestyle factors are turning those genes on and off.

[00:27:05] [SPEAKER_00]: And so, you know, it was previously thought that the human body was just

[00:27:10] [SPEAKER_00]: this simple system.

[00:27:12] [SPEAKER_00]: You would do certain things to it and it would react a certain way.

[00:27:14] [SPEAKER_00]: But they're finding out that it's a lot more sensitive.

[00:27:19] [SPEAKER_00]: There's a lot more variables.

[00:27:20] [SPEAKER_00]: And another part of the variable is actually our mind and our sentiments and how we feel,

[00:27:25] [SPEAKER_00]: right?

[00:27:26] [SPEAKER_00]: So, it was viewed that, you know, we always say stress causes disease or influences disease.

[00:27:32] [SPEAKER_00]: Now, there's a much more mature understanding of how that actually works.

[00:27:35] [SPEAKER_00]: You know, literally, again, that power of belief can significantly

[00:27:43] [SPEAKER_00]: influence your cellular health, which is fascinating.

[00:27:47] [SPEAKER_02]: What has the AI told us?

[00:27:49] [SPEAKER_02]: Has the AI confirmed that, yes, I've looked at these 18 million patients and it turns out

[00:27:54] [SPEAKER_02]: the ones who were stressed had more problems?

[00:27:58] [SPEAKER_00]: Yeah.

[00:27:59] [SPEAKER_00]: I mean, that's the kind of stuff that's quite interesting as you deploy this AI into clinical

[00:28:04] [SPEAKER_00]: settings and you start to, you know, annotate all of the patient records with clinical observations.

[00:28:10] [SPEAKER_00]: Say, hey, this person came in, they're quite stressed out.

[00:28:13] [SPEAKER_00]: Here's what's going on in their lives.

[00:28:15] [SPEAKER_00]: And when there's enough of those examples, of those annotations, those observations, and

[00:28:20] [SPEAKER_00]: they're sort of collated and integrated with all the other biomarker data, you can bet that

[00:28:25] [SPEAKER_00]: we're going to start to be able to demonstrate how stress impacts your biomarkers, impacts,

[00:28:30] [SPEAKER_00]: you know, the physiological body.

[00:28:33] [SPEAKER_00]: And, you know, to date, I think we have a learned, you know, kind of observational ability

[00:28:38] [SPEAKER_00]: to tell that, but not at a biomarker level.

[00:28:40] [SPEAKER_00]: You know, maybe some studies, you know, I'm sure they're out there that people have been

[00:28:45] [SPEAKER_00]: able to demonstrate what happens to all aspects of blood tests and whatnot.

[00:28:51] [SPEAKER_00]: But this is all coming.

[00:28:53] [SPEAKER_02]: Well, it seems like the integration is key because, like, for instance, I have one of

[00:28:58] [SPEAKER_02]: those Eight Sleep beds that keeps track of so much data, I couldn't figure it

[00:29:05] [SPEAKER_02]: out.

[00:29:05] [SPEAKER_02]: So I don't even really use it for what it's supposed to do.

[00:29:08] [SPEAKER_02]: But it keeps track of how many hours a night you had good sleep.

[00:29:11] [SPEAKER_02]: How was your breathing while you slept?

[00:29:12] [SPEAKER_02]: How was your heartbeat while you slept?

[00:29:13] [SPEAKER_02]: But my doctor, you know, has never asked for that.

[00:29:18] [SPEAKER_02]: Or I should say my wife's doctor has never asked for that.

[00:29:20] [SPEAKER_02]: I haven't gone to the doctor at all.

[00:29:22] [SPEAKER_02]: But nobody asks for this data.

[00:29:24] [SPEAKER_02]: It's just, like, personally useful if we use it.

[00:29:27] [SPEAKER_02]: But it's not really integrated with my healthcare records, for instance.

[00:29:31] [SPEAKER_02]: So do you think that's coming up?

[00:29:34] [SPEAKER_00]: 100%.

[00:29:34] [SPEAKER_00]: It has to come up.

[00:29:36] [SPEAKER_00]: Sleep is when we heal.

[00:29:38] [SPEAKER_00]: It's when we get restored.

[00:29:40] [SPEAKER_00]: You know, that kind of data is incredibly important for a healthcare provider, you know,

[00:29:47] [SPEAKER_00]: to have over time.

[00:29:48] [SPEAKER_00]: Now, most health systems just don't have the mechanism

[00:29:52] [SPEAKER_00]: to collect that data.

[00:29:53] [SPEAKER_00]: It's a lot of data.

[00:29:54] [SPEAKER_00]: There's permissions.

[00:29:55] [SPEAKER_00]: There's IT considerations.

[00:29:57] [SPEAKER_00]: There's privacy considerations.

[00:29:58] [SPEAKER_00]: But it'll happen.

[00:30:00] [SPEAKER_00]: We're seeing a lot of this testing happen right now with glucose monitors.

[00:30:04] [SPEAKER_00]: You know, there's guys on TikTok and Instagram, you know, testing themselves eating various

[00:30:08] [SPEAKER_00]: different foods.

[00:30:11] [SPEAKER_00]: Even, you know, going out for walks.

[00:30:12] [SPEAKER_02]: Like these biohackers, it's one guy trying something.

[00:30:16] [SPEAKER_02]: But he could be completely different from all the other people trying something.

[00:30:20] [SPEAKER_02]: So with AI, you have millions.

[00:30:22] [SPEAKER_02]: It's as if millions of people are trying something and privately sharing it into this big pool

[00:30:27] [SPEAKER_02]: of data.

[00:30:27] [SPEAKER_02]: I mean, are there any privacy issues from the patient side?

[00:30:31] [SPEAKER_02]: Like, is there a way, like, so I could have all my data in this AI thing and somehow the

[00:30:37] [SPEAKER_02]: data gets out and everybody knows all my diseases?

[00:30:41] [SPEAKER_00]: Well, yeah.

[00:30:42] [SPEAKER_00]: I mean, I think whoever's in this business has to take an enormous duty

[00:30:47] [SPEAKER_00]: of care to understand, you know, the importance of locking down this data and

[00:30:53] [SPEAKER_00]: de-identifying it when interacting with third parties.

[00:30:58] [SPEAKER_00]: You know, this is the most valuable data in the world.

[00:31:00] [SPEAKER_00]: I mean, there's nothing more valuable, because you can't change a health condition.

[00:31:03] [SPEAKER_00]: You can change a credit card number.

[00:31:05] [SPEAKER_00]: You know, your credit card number gets compromised.

[00:31:08] [SPEAKER_00]: It sucks.

[00:31:08] [SPEAKER_00]: You were inconvenienced, but someone sends you a new one and you're off.

[00:31:12] [SPEAKER_00]: You can't change a health ailment.

[00:31:13] [SPEAKER_00]: You know, it's, you know, and a lot of people want access to this information so that they

[00:31:17] [SPEAKER_00]: can use it to, you know, improve their ability to get into your email.

[00:31:23] [SPEAKER_00]: You know, just phishing success rates go up dramatically when, you know, the bad guys

[00:31:29] [SPEAKER_00]: know your medical ailments.

[00:31:31] [SPEAKER_00]: You know, it's unbelievable what people will use this information for.

[00:31:36] [SPEAKER_00]: So I think it's really important.

[00:31:37] [SPEAKER_00]: I do think that's one of the things that really hurt 23andMe.

[00:31:40] [SPEAKER_00]: You know, at the end of the day, you know, you're giving your, you know, your biological

[00:31:46] [SPEAKER_00]: sample to a company and they're going to take a whole bunch of information related to that.

[00:31:51] [SPEAKER_00]: It's incredibly important that that information be safeguarded.

[00:31:57] [SPEAKER_02]: And by the way, have you spoken to a company like 23andMe in order to get all of their data

[00:32:02] [SPEAKER_02]: and somehow add it to your learning models?

[00:32:06] [SPEAKER_02]: Or maybe you need it with the patients you currently have in the model.

[00:32:10] [SPEAKER_02]: I don't know.

[00:32:12] [SPEAKER_00]: No, we have not.

[00:32:13] [SPEAKER_00]: We have not added genomic data into our, you know, kind of core profile, meaning

[00:32:21] [SPEAKER_00]: as a core requirement for every patient, because most patients, you know, who are

[00:32:29] [SPEAKER_00]: just doing general primary care are not doing that.

[00:32:32] [SPEAKER_00]: At the private care level, like longevity health, we're, you know, collecting that kind of information,

[00:32:38] [SPEAKER_00]: but we're not storing it on our servers right now.

[00:32:41] [SPEAKER_00]: Because, you know, we don't want to store that information on

[00:32:47] [SPEAKER_00]: our servers.

[00:32:47] [SPEAKER_00]: We want to be able to access it when we need to access it.

[00:32:50] [SPEAKER_00]: So there's a whole kind of strategy that you use in order to reduce the effect of breaches

[00:32:56] [SPEAKER_00]: and whatnot.

[00:32:56] [SPEAKER_00]: But yeah, we have strategies of being able to consolidate data when absolutely needed.

[00:33:01] [SPEAKER_00]: And we can do that as well.

[00:33:18] [SPEAKER_02]: So the role of a doctor is going to evolve to basically being kind of this middleman

[00:33:23] [SPEAKER_02]: between your data and the AI.

[00:33:28] [SPEAKER_02]: This person, according to the AI, has an 86% chance of having this type of cancer.

[00:33:32] [SPEAKER_02]: So it needs to go to this specialist.

[00:33:33] [SPEAKER_02]: So he becomes sort of like the highway patrol as far as like, hey, you need to go here.

[00:33:38] [SPEAKER_02]: You need to go here, which is what they've always done.

[00:33:40] [SPEAKER_02]: But now they have a lot more data at their hands.

[00:33:43] [SPEAKER_02]: And maybe you don't even really need to know medicine that well.

[00:33:46] [SPEAKER_02]: I don't know.

[00:33:47] [SPEAKER_02]: Like, you think it'll be easier to become a doctor?

[00:33:50] [SPEAKER_02]: You know, it's easier to be a graphic designer now because I can use AI.

[00:33:54] [SPEAKER_02]: I might not know how to use Photoshop or any design tools, but I can use AI now to design

[00:33:59] [SPEAKER_02]: things.

[00:33:59] [SPEAKER_02]: So it's easier for me to raise my hand and say, I'm a designer.

[00:34:03] [SPEAKER_00]: Well, I think you certainly need to, as a doctor, you know, know your way around these

[00:34:08] [SPEAKER_00]: systems in the future, at least at a basic level.

[00:34:11] [SPEAKER_00]: But I think great pains are being taken in order to not require the doctor to become

[00:34:16] [SPEAKER_00]: an expert.

[00:34:17] [SPEAKER_00]: So how do we do that?

[00:34:18] [SPEAKER_00]: So, for example, with our technology, when we're distilling down a patient record or a

[00:34:24] [SPEAKER_00]: really dense document and we're organizing it, when the doctor is reviewing the organized

[00:34:28] [SPEAKER_00]: version of the output that we're showing them, we will actually be able to show them where

[00:34:34] [SPEAKER_00]: the source of that came from.

[00:34:36] [SPEAKER_00]: So if the doctor says, you know, this is strange, I'd like to better understand this.

[00:34:40] [SPEAKER_00]: There's literally a click they can make and it goes directly into some hospital

[00:34:45] [SPEAKER_00]: printout somewhere or some page in the patient record where that information came from.

[00:34:50] [SPEAKER_00]: I think that's really important that we build systems that are not just black box systems

[00:34:54] [SPEAKER_00]: that allow physicians to continue to dig when they need more information to be able to

[00:35:00] [SPEAKER_00]: sanity check it.

[00:35:01] [SPEAKER_00]: Because, you know, look, technology is not infallible.

[00:35:04] [SPEAKER_00]: It's pretty good.

[00:35:05] [SPEAKER_00]: It's getting better.

[00:35:07] [SPEAKER_00]: It's often extremely precise.

[00:35:08] [SPEAKER_00]: But we all know that it can make mistakes.

[00:35:11] [SPEAKER_00]: And use ChatGPT or any of these models.

[00:35:13] [SPEAKER_00]: It will just say right there at the bottom of the screen,

[00:35:16] [SPEAKER_00]: you know, this model can make mistakes.

[00:35:18] [SPEAKER_02]: So I guess like one problem could be you don't know what you don't know.

[00:35:22] [SPEAKER_02]: So for instance, let's say for whatever reason, you don't collect the ethnic identity of

[00:35:29] [SPEAKER_02]: all your patients.

[00:35:30] [SPEAKER_02]: Now that might be very relevant for a diagnosis.

[00:35:34] [SPEAKER_02]: So whether someone has lupus or not might be completely related to what their ethnic background

[00:35:38] [SPEAKER_02]: is.

[00:35:39] [SPEAKER_02]: But if you don't, if for some reason you forgot to collect that particular data or some other

[00:35:46] [SPEAKER_02]: data that's equally important, your system might get biased and diagnose, oh, this person has

[00:35:52] [SPEAKER_02]: lupus, more often than it should.

[00:35:58] [SPEAKER_00]: Yeah, yeah.

[00:35:59] [SPEAKER_00]: I mean, look, oftentimes there's different ways to get to the right answer.

[00:36:02] [SPEAKER_00]: So you would hope that that's the case.

[00:36:04] [SPEAKER_00]: And you would hope that if the machine doesn't have enough information to render a suggestion

[00:36:10] [SPEAKER_00]: or an insight, then it would just not do that and just say, hey, here's the view.

[00:36:16] [SPEAKER_00]: It could be any one of these things.

[00:36:17] [SPEAKER_00]: Or this person likely has this ailment as a result of these sort of biomarkers or these

[00:36:25] [SPEAKER_00]: sort of results in their blood tests or what have you.

[00:36:28] [SPEAKER_00]: But I think it's really important that we build models that can be explained, that can

[00:36:34] [SPEAKER_00]: demonstrate where they got their answers from.

[00:36:36] [SPEAKER_00]: Otherwise, I think the stuff becomes a lot less valuable to physicians.

[00:36:40] [SPEAKER_02]: You know, also, is there a bias in terms of prescription?

[00:36:45] [SPEAKER_02]: So for instance, doctors right now might in some cases, not always, of course, but might

[00:36:51] [SPEAKER_02]: in some cases be biased by which pharmaceutical companies they have the closest relationships

[00:36:55] [SPEAKER_02]: with.

[00:36:56] [SPEAKER_02]: And I'm not saying anything bad here.

[00:36:57] [SPEAKER_02]: It's just, oh, Pfizer regularly visits my office and I like the salesperson and I like

[00:37:03] [SPEAKER_02]: their drugs.

[00:37:04] [SPEAKER_02]: And oh, my patient is having trouble sleeping.

[00:37:06] [SPEAKER_02]: Here, take this drug from Pfizer that cures it.

[00:37:09] [SPEAKER_02]: So the AI won't be biased towards a particular pharmaceutical company the way a doctor might

[00:37:16] [SPEAKER_02]: be.

[00:37:16] [SPEAKER_02]: And I wonder if that becomes an issue or maybe they are biased in some ways towards certain

[00:37:21] [SPEAKER_02]: pharmaceutical companies or pharmaceutical solutions rather than natural solutions.

[00:37:26] [SPEAKER_02]: I don't know.

[00:37:27] [SPEAKER_02]: Do you see bias in that?

[00:37:29] [SPEAKER_00]: What you're talking about is a huge area of discussion and debate because that's what a

[00:37:35] [SPEAKER_00]: lot of physicians are worried about: that these algorithms will be written in such

[00:37:38] [SPEAKER_00]: a way that they bias and tip the scales towards certain pharmaceutical companies over the

[00:37:45] [SPEAKER_00]: others.

[00:37:45] [SPEAKER_00]: And this is why I think it's really important, whether or not it's common

[00:37:53] [SPEAKER_00]: practice yet, and maybe from a regulatory perspective it should be required, that any kind of tipping,

[00:37:59] [SPEAKER_00]: any kind of inclination, is disclosed.

[00:38:03] [SPEAKER_00]: So creating some kind of disclosure like, hey, look, this response that you got was funded in part by a group of

[00:38:10] [SPEAKER_00]: pharmaceutical companies that included Pfizer.

[00:38:12] [SPEAKER_00]: So, I mean, as a doctor, it's important that you know that.

[00:38:16] [SPEAKER_00]: If no disclosure is happening, I think you're probably assuming that there's no bias, right?

[00:38:22] [SPEAKER_00]: And this is where I think we do need to make sure that we have, you know, good, you know,

[00:38:27] [SPEAKER_00]: clear rules and regulations about how information is presented to care providers.

[00:38:31] [SPEAKER_02]: But then there's the flip side of that regulatory issue.

[00:38:35] [SPEAKER_02]: And this is what I really worry about with regulators because so a regulator could say,

[00:38:40] [SPEAKER_02]: like the government could say, listen, you have to use one of the AI recommendations in your

[00:38:48] [SPEAKER_02]: suggested cure to the patient.

[00:38:50] [SPEAKER_02]: Otherwise, that's malpractice.

[00:38:51] [SPEAKER_02]: Now, that's an extreme, but I could see regulation going in that direction if government trusts the

[00:38:58] [SPEAKER_02]: AI too much even.

[00:38:59] [SPEAKER_02]: And so then a doctor would get in trouble for not recommending, oh, the doctor could say,

[00:39:03] [SPEAKER_02]: I don't believe in this Pfizer drug and the AI keeps recommending it, but he could get in

[00:39:07] [SPEAKER_02]: trouble if he doesn't.

[00:39:08] [SPEAKER_02]: Once the AI recommends it, if the regulators say you have to use it, that could be an issue.

[00:39:14] [SPEAKER_00]: It could.

[00:39:14] [SPEAKER_00]: I think you're right.

[00:39:15] [SPEAKER_00]: I think the burden of proof and demonstrating certain evidence-based, you know, thresholds

[00:39:24] [SPEAKER_00]: of success has to be quite high, you know, in order to get to that point.

[00:39:29] [SPEAKER_00]: But I think you're right.

[00:39:30] [SPEAKER_00]: I think we can't really imagine a world right now where machines are trusted more than humans

[00:39:36] [SPEAKER_00]: as it relates to healthcare.

[00:39:38] [SPEAKER_00]: And I don't think it'll be in my lifetime.

[00:39:41] [SPEAKER_00]: I don't think it'll be anytime soon.

[00:39:43] [SPEAKER_00]: It could be 100 years.

[00:39:46] [SPEAKER_00]: But it'll happen.

[00:39:47] [SPEAKER_00]: I do think it'll happen.

[00:39:49] [SPEAKER_00]: I think we'll get to a point eventually where the AI just will get so good and there'll be

[00:39:55] [SPEAKER_00]: a belt-and-suspenders approach and people will get comfortable around that.

[00:39:58] [SPEAKER_00]: But it's fascinating to think, you know, how do we get there from here?

[00:40:03] [SPEAKER_00]: You know, what, you know, and I think it comes back to, you know, to some degree regulation.

[00:40:11] [SPEAKER_00]: It comes back to what our expectations of these AIs are.

[00:40:15] [SPEAKER_00]: But there's no doubt they're becoming more valuable.

[00:40:18] [SPEAKER_00]: And how can you disregard them?

[00:40:20] [SPEAKER_00]: You know, like, I think there's a certain point in time where physicians could be, you know,

[00:40:27] [SPEAKER_00]: admonished or even reprimanded if they don't use an AI to help them.

[00:40:31] [SPEAKER_00]: So let's say you were just like, hey, I'm an old school doctor.

[00:40:34] [SPEAKER_00]: I'm not saying today, but maybe even in five years that could happen.

[00:40:39] [SPEAKER_00]: Like, if you have a seriously good AI that's generically capable, available,

[00:40:44] [SPEAKER_00]: that can work with any patient record and can make you, you know, 80% less likely to make a mistake,

[00:40:55] [SPEAKER_00]: maybe you should use it.

[00:40:57] [SPEAKER_02]: You know what I mean?

[00:40:58] [SPEAKER_02]: Right now, do you see, like, for instance, thinking about pharmaceutical companies again,

[00:41:05] [SPEAKER_02]: like, does the AI get to that point where it starts prescribing specific medicines or not?

[00:41:11] [SPEAKER_02]: Like, I don't know where your AI stands.

[00:41:13] [SPEAKER_02]: Does it do that?

[00:41:15] [SPEAKER_00]: No, no.

[00:41:16] [SPEAKER_00]: Yeah, yeah.

[00:41:16] [SPEAKER_00]: No, I think what could happen in these situations when you've detected some kind of disease

[00:41:22] [SPEAKER_00]: or you've identified a risk factor.

[00:41:24] [SPEAKER_00]: In our software, we risk stratify a patient or a group of patients.

[00:41:28] [SPEAKER_00]: So we say, hey, look, Mr. Doctor, we reviewed, you know, your patient records.

[00:41:33] [SPEAKER_00]: And what we found is that there are certain patients that are higher risk for heart disease

[00:41:40] [SPEAKER_00]: or diabetes or chronic kidney disease.

[00:41:43] [SPEAKER_00]: And we literally kind of red, yellow, green, you know, kind of identify who those patients are.

[00:41:49] [SPEAKER_00]: And then when someone clicks on that, you know, it'll identify what the call to action could be.

[00:41:54] [SPEAKER_00]: Say, hey, you may want to prescribe something like this.

[00:41:56] [SPEAKER_00]: It typically will not talk about a brand name.

[00:41:58] [SPEAKER_00]: It'll talk about a molecule that could help with that particular, you know, ailment.

[00:42:03] [SPEAKER_00]: And it'll identify, you know, recommended dosages and things of that nature.

[00:42:07] [SPEAKER_00]: So that's, you know, eventually.

[00:42:10] [SPEAKER_00]: And so eventually all this stuff will become more automatic and more continuous.

[00:42:15] [SPEAKER_00]: But I think it's really, really important, you know, in our view that this continue to be decision support

[00:42:22] [SPEAKER_00]: and not allow machines to make decisions right now.

[00:42:25] [SPEAKER_00]: Because this still does make mistakes.

[00:42:27] [SPEAKER_00]: There are still rough spots.

[00:42:30] [SPEAKER_00]: I think it'll get better.

[00:42:31] [SPEAKER_00]: But I think to kind of understand the importance of trusting the care provider, you know,

[00:42:36] [SPEAKER_00]: and having them, you know, be the ultimate decision makers about our health is really important.

[00:42:40] [SPEAKER_02]: I mean, how much, like in the past 40 or 50 years, lifespan in the U.S. and Canada in particular

[00:42:48] [SPEAKER_02]: has increased by about 11 or 12 years.

[00:42:52] [SPEAKER_02]: So whereas people maybe on average used to live to the age of 68, now they're living to the age of 80.

[00:42:59] [SPEAKER_02]: And how much of that, how much increase do you think could happen now due to,

[00:43:06] [SPEAKER_02]: let's say if every doctor was using your system, for instance, and getting these, you know,

[00:43:11] [SPEAKER_02]: diagnosis much faster and getting so much more data about each patient so that,

[00:43:17] [SPEAKER_02]: you know, there's better prescriptions and so on.

[00:43:21] [SPEAKER_00]: Well, look, I think preventative care does elongate life, right?

[00:43:26] [SPEAKER_00]: I mean, speed is one of the most important things and time is one of the most important

[00:43:31] [SPEAKER_00]: things in healthcare.

[00:43:32] [SPEAKER_00]: You know, if you have a pain, if something's wrong with your body, if something's not right,

[00:43:38] [SPEAKER_00]: sometimes it comes down to time.

[00:43:40] [SPEAKER_00]: You know, think about cancer.

[00:43:41] [SPEAKER_00]: Cancer starts out with a number of cells in your body that, you know, something's not

[00:43:47] [SPEAKER_00]: right.

[00:43:47] [SPEAKER_00]: And they're, you know, most of us have cancer cells in our bodies and we're overcoming them,

[00:43:52] [SPEAKER_00]: you know, pretty easily.

[00:43:53] [SPEAKER_00]: Our immune system can just readily take care of them and overpower them.

[00:43:57] [SPEAKER_00]: But once in a while, our immune system is depressed for a long enough time.

[00:44:02] [SPEAKER_00]: Or there's some toxicity that we took into our body, and suddenly the immune system is overwhelmed

[00:44:09] [SPEAKER_00]: by this cancer.

[00:44:10] [SPEAKER_00]: And then the thing about cancer is that it effectively becomes like this growing organism

[00:44:17] [SPEAKER_00]: inside your body.

[00:44:18] [SPEAKER_00]: And it gets to a point where we need to attack it now with like, you know, chemotherapy or

[00:44:24] [SPEAKER_00]: radiation and try to quell it or kill it.

[00:44:27] [SPEAKER_00]: And it's all about time.

[00:44:30] [SPEAKER_00]: Like if you catch these things early enough, you can really address them.

[00:44:35] [SPEAKER_00]: And I would say that, you know, that's what real healthcare is.

[00:44:40] [SPEAKER_00]: You know, otherwise it's sick care.

[00:44:41] [SPEAKER_00]: Real healthcare is helping you avoid disease, giving you great diagnostics, showing you how to avoid

[00:44:49] [SPEAKER_00]: disease, how to avoid those pitfalls.

[00:44:50] [SPEAKER_00]: And a lot of it happens as a result of what you put in your mouth and how you move your

[00:44:57] [SPEAKER_00]: body, right?

[00:44:58] [SPEAKER_00]: And so I do think that pretty significant improvements in lifespan and healthspan will take place as

[00:45:07] [SPEAKER_00]: a result of, you know, getting this data in the right people's hands and nudging them to

[00:45:13] [SPEAKER_00]: better care so that they understand when they're taking some of these wrong steps that they're

[00:45:18] [SPEAKER_00]: hurting themselves, right?

[00:45:19] [SPEAKER_00]: And sometimes they're just not aware of that.

[00:45:22] [SPEAKER_00]: So, yeah.

[00:45:23] [SPEAKER_02]: I wonder, like, when do you think you'll be at the point where you have the data where

[00:45:27] [SPEAKER_02]: you can say, oh, the doctors who are using the HealWell system, you know, all of our software,

[00:45:33] [SPEAKER_02]: they're getting better results than the doctors who are not using it?

[00:45:37] [SPEAKER_00]: Oh, I think we're there.

[00:45:39] [SPEAKER_00]: I think if you combine all the different AI co-pilots that our physicians use, they're

[00:45:45] [SPEAKER_00]: calmer.

[00:45:46] [SPEAKER_00]: They can pay attention during a patient visit.

[00:45:49] [SPEAKER_00]: They don't have to spend time transcribing and pulling information into a patient record

[00:45:54] [SPEAKER_00]: so that they can understand it and study it later.

[00:45:56] [SPEAKER_00]: I think that they are already running our disease detection algorithms against their patient

[00:46:03] [SPEAKER_00]: records to the extent that those patients accept it.

[00:46:06] [SPEAKER_00]: You know, again, this comes with consent.

[00:46:09] [SPEAKER_00]: So to the extent that the consents are in place.

[00:46:12] [SPEAKER_00]: And I think already our physicians are seeing benefits that accrue as a result of using these

[00:46:18] [SPEAKER_00]: tools.

[00:46:19] [SPEAKER_00]: So it's pretty incredible.

[00:46:20] [SPEAKER_00]: Like, and when doctors free up time, you know, they provide better care.

[00:46:27] [SPEAKER_00]: They're also, you know, professionals that have significant time constraints and administrative

[00:46:35] [SPEAKER_00]: burdens and overheads.

[00:46:36] [SPEAKER_00]: You know, just the things that they need to do just to get paid are pretty incredible.

[00:46:42] [SPEAKER_02]: And, you know, one idea is you could maybe make some headlines.

[00:46:48] [SPEAKER_02]: How about you donate your software in special situations?

[00:46:52] [SPEAKER_02]: So for instance, Ukraine hospitals are destroyed.

[00:46:55] [SPEAKER_02]: Let's say their whole healthcare infrastructure is destroyed.

[00:46:58] [SPEAKER_02]: Can you kind of replace part of their healthcare system by essentially donating your software to

[00:47:03] [SPEAKER_02]: all their medical professionals?

[00:47:07] [SPEAKER_00]: Yeah, I don't see why not.

[00:47:08] [SPEAKER_00]: I think that's a great idea.

[00:47:10] [SPEAKER_00]: You know, I think especially physicians that are really sort of struggling to keep up with

[00:47:15] [SPEAKER_00]: the demand.

[00:47:16] [SPEAKER_00]: Imagine having some of these tools at their disposal.

[00:47:20] [SPEAKER_00]: I mean, it does exactly what you're saying.

[00:47:24] [SPEAKER_00]: It sort of helps amplify a physician.

[00:47:26] [SPEAKER_00]: In those situations, probably a small number of physicians is overwhelmed by the amount of

[00:47:30] [SPEAKER_00]: care that they need to provide.

[00:47:32] [SPEAKER_00]: So giving them tools and allowing them to take on more patients, but still putting the physician

[00:47:37] [SPEAKER_00]: in the driver's seat, I think is the key.

[00:47:39] [SPEAKER_00]: And that's what our tools are doing.

[00:47:42] [SPEAKER_02]: Well, and right now, Well Health and HealWell, they're two separate companies.

[00:47:49] [SPEAKER_02]: HealWell spun out of Well.

[00:47:51] [SPEAKER_02]: They're both public in Canada.

[00:47:55] [SPEAKER_02]: Are you looking to go public or to list shares in the U.S.?

[00:48:00] [SPEAKER_00]: At some point, yeah.

[00:48:01] [SPEAKER_00]: Yeah, I think what we've found with the U.S. is that, because of the regulatory framework

[00:48:07] [SPEAKER_00]: there, the average company that's public is now a bit larger,

[00:48:15] [SPEAKER_00]: and there are actually fewer public companies today than there were years ago.

[00:48:19] [SPEAKER_00]: You would think the opposite would be the case, but the demands and the costs to be public

[00:48:24] [SPEAKER_00]: in the U.S. are quite a bit greater now.

[00:48:27] [SPEAKER_00]: Here in Canada, the costs are still a bit lower.

[00:48:30] [SPEAKER_00]: So it makes sense, even though our shares are available for purchase in the U.S. through

[00:48:36] [SPEAKER_00]: different quote machines, because we trade on the OTC, the over-the-counter market.

[00:48:42] [SPEAKER_02]: But it's interesting because there's both good and bad to the regulatory framework.

[00:48:46] [SPEAKER_02]: Most people I know who are with public companies in the U.S., they're very upset at all the regulations.

[00:48:53] [SPEAKER_02]: And it costs them $1 to $2 million a year to stay public in the U.S. because of all the filings

[00:48:58] [SPEAKER_02]: and accounting and all this stuff that you have to do.

[00:49:01] [SPEAKER_02]: But the good side is that creates more volume in the shares because people feel more comfortable

[00:49:07] [SPEAKER_02]: buying shares in the U.S. than they do buying...

[00:49:09] [SPEAKER_02]: I mean, Canada is almost like the U.S., but I'd rather buy shares in the U.S. than shares

[00:49:14] [SPEAKER_02]: on some stock exchange in, I don't know, China or the Saudi Arabian stock exchange.

[00:49:23] [SPEAKER_02]: I don't know if they have one.

[00:49:26] [SPEAKER_02]: So this is kind of the flip side, which is that people feel...

[00:49:28] [SPEAKER_02]: It's like the U.S. dollar too.

[00:49:30] [SPEAKER_02]: Why is the dollar kind of still the standard currency?

[00:49:33] [SPEAKER_02]: And maybe it won't be for long, but it's because there's more transparency

[00:49:36] [SPEAKER_02]: in how many dollars are out there than any other country.

[00:49:39] [SPEAKER_02]: So it's that transparency and safety and disclosure that, I guess, being public here gives a company.

[00:49:48] [SPEAKER_00]: Yeah.

[00:49:49] [SPEAKER_00]: I mean, look, we have similar laws.

[00:49:50] [SPEAKER_00]: I think one big difference between the U.S. and anywhere else in the world, frankly,

[00:49:55] [SPEAKER_00]: is just all the litigation, right?

[00:49:57] [SPEAKER_00]: So you have a lot of what are called stock-drop suits, where the stock drops and a bunch of lawyers

[00:50:05] [SPEAKER_00]: start looking at the filings of the company and say, hey, why did the stock drop?

[00:50:10] [SPEAKER_00]: Was there any disclosure given?

[00:50:12] [SPEAKER_00]: And you could argue both sides of that.

[00:50:15] [SPEAKER_00]: You could say, hey, it's really good that they're doing that.

[00:50:16] [SPEAKER_00]: And you could say, wow, that's going to create a lot of overhead for the industry

[00:50:21] [SPEAKER_00]: because a lot of this sort of ambulance-chasing behavior, if you will,

[00:50:27] [SPEAKER_00]: is not focused on improving behavior.

[00:50:32] [SPEAKER_00]: And so what ends up happening is you just have costs that go up as a result of supporting this.

[00:50:38] [SPEAKER_00]: Professional fees go up, and then guess what?

[00:50:41] [SPEAKER_00]: Audit firms get sued once in a while.

[00:50:43] [SPEAKER_00]: So guess what?

[00:50:43] [SPEAKER_00]: They're going to charge more.

[00:50:44] [SPEAKER_00]: And that's what's really happened to some degree,

[00:50:47] [SPEAKER_00]: is all of this activity has caused costs to go up to be a public company.

[00:50:53] [SPEAKER_00]: But to your point, if that, whether directly or indirectly,

[00:50:57] [SPEAKER_00]: causes there to be more trust in the system, then that's not a bad thing.

[00:51:01] [SPEAKER_00]: And I think there's a tremendous amount of volume.

[00:51:07] [SPEAKER_00]: You can't argue with the fact that the U.S. capital markets are by far, by a country mile,

[00:51:13] [SPEAKER_00]: the most powerful and most liquid anywhere in the world.

[00:51:17] [SPEAKER_00]: And one has to think that that has to do with some level of ability for third parties

[00:51:24] [SPEAKER_00]: to intervene if they're not happy with the result.

[00:51:26] [SPEAKER_00]: Right?

[00:51:27] [SPEAKER_02]: Yeah, but you're right, though.

[00:51:28] [SPEAKER_02]: Something needs to be done about the litigation.

[00:51:30] [SPEAKER_02]: It's almost like the lawyers take advantage of freedom of speech

[00:51:35] [SPEAKER_02]: more than just average citizens do.

[00:51:37] [SPEAKER_02]: Like, oh, we're allowed to litigate.

[00:51:38] [SPEAKER_02]: We could do anything.

[00:51:40] [SPEAKER_00]: Well, anything can be abused, right?

[00:51:42] [SPEAKER_00]: And so these rules, I think, are all well-intentioned.

[00:51:46] [SPEAKER_00]: But sometimes they get abused as well,

[00:51:48] [SPEAKER_00]: by the very people who are supposed to enforce them.

[00:51:52] [SPEAKER_02]: Yeah.

[00:51:53] [SPEAKER_02]: So I wonder also, here's another idea.

[00:51:56] [SPEAKER_02]: I'm just going to keep throwing ideas at you.

[00:51:58] [SPEAKER_02]: What about a consumer version of your software so that I can just plug in my Oura ring,

[00:52:04] [SPEAKER_02]: plug in my bed, type like a little diary how many steps I took today and how I'm feeling

[00:52:09] [SPEAKER_02]: and what my stress levels are.

[00:52:12] [SPEAKER_02]: And the software has very generic kind of output, like not real professional doctor output.

[00:52:19] [SPEAKER_02]: But it basically says, look, we reached a threshold where you probably need to see a doctor.
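The consumer-app idea James describes here — aggregate a few self-tracked signals and flag when a simple score "reaches a threshold" — can be sketched in a few lines. This is only an illustration of the shape of that logic: every field name and every threshold below is hypothetical, not a real clinical rule and not anything Well Health or HealWell has shipped.

```python
# Hypothetical sketch of the "generic threshold" idea from the conversation:
# roll up a week of self-tracked data and flag when enough signals look off
# at once that it's probably worth seeing a doctor.
from dataclasses import dataclass

@dataclass
class DailyLog:
    resting_hr: int        # beats per minute, e.g. from a wearable ring
    sleep_hours: float     # total sleep for the day
    steps: int             # daily step count
    stress_level: int      # self-reported, 1 (calm) .. 5 (very stressed)

def needs_checkup(logs: list[DailyLog], window: int = 7) -> bool:
    """Flag if rolling averages over the last `window` days look concerning."""
    recent = logs[-window:]
    if not recent:
        return False
    avg = lambda xs: sum(xs) / len(recent)
    high_hr = avg([d.resting_hr for d in recent]) > 75       # illustrative cutoffs
    low_sleep = avg([d.sleep_hours for d in recent]) < 6.0
    sedentary = avg([d.steps for d in recent]) < 3000
    stressed = avg([d.stress_level for d in recent]) >= 4
    # "Reached a threshold": two or more signals off at the same time.
    return sum([high_hr, low_sleep, sedentary, stressed]) >= 2
```

A real product would need clinically validated thresholds and the disclaimers Hamed mentions later; this only shows how a deliberately generic "you should probably see a doctor" trigger could be structured.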

[00:52:25] [SPEAKER_00]: I love it.

[00:52:26] [SPEAKER_00]: You're probably a product manager in another life.

[00:52:29] [SPEAKER_00]: So I agree.

[00:52:32] [SPEAKER_00]: We are actually working on that.

[00:52:34] [SPEAKER_00]: It is not our main focus right now, but kind of one of the, I'll say, kind of earlier stage initiatives

[00:52:42] [SPEAKER_00]: that we're working on, and we've got a partner that we've just been kind of discussing this with,

[00:52:47] [SPEAKER_00]: is the idea of being able to pull patient records into a personal vault

[00:52:51] [SPEAKER_00]: and then be able to add other information to that and take control and initiative

[00:52:57] [SPEAKER_00]: and be able to invite a service to be able to scan that and provide data information.

[00:53:04] [SPEAKER_00]: To me, that's very important to be able to get to the patient and empower the patient.

[00:53:10] [SPEAKER_00]: In reality, that is the best way to achieve the best health outcomes,

[00:53:14] [SPEAKER_00]: is to get people to care about their own health and to take steps.

[00:53:17] [SPEAKER_00]: Because if there is a magic bullet that we have in healthcare,

[00:53:20] [SPEAKER_00]: there's only really one, and that's behavior, right?

[00:53:23] [SPEAKER_00]: And so how do you get someone to change their behavior?

[00:53:26] [SPEAKER_00]: Well, you get them to take an interest in their health.

[00:53:30] [SPEAKER_00]: And what do I have to do to experience better health?

[00:53:33] [SPEAKER_00]: Well, let me find out.

[00:53:36] [SPEAKER_00]: Let me plug all this stuff in and invite an AI to tell me a little bit more.

[00:53:41] [SPEAKER_00]: Now, there's some concern from physicians that, well,

[00:53:45] [SPEAKER_00]: all that's going to happen is that person's going to get this information

[00:53:48] [SPEAKER_00]: and they call their physician and say, can you help me decode this?

[00:53:50] [SPEAKER_00]: So it's got to be really simple.

[00:53:52] [SPEAKER_00]: It's got to come with the right disclaimers.

[00:53:54] [SPEAKER_00]: And it's got to be correctly packaged.

[00:53:57] [SPEAKER_00]: But I think it's a good thing.

[00:53:58] [SPEAKER_00]: I think it'll do more good than harm

[00:54:00] [SPEAKER_00]: to be able to empower people to start thinking about their patient data.

[00:54:05] [SPEAKER_00]: We own our patient data.

[00:54:07] [SPEAKER_00]: What benefit are we deriving from owning that patient data, right?

[00:54:11] [SPEAKER_00]: And that comes back to something you were saying earlier,

[00:54:13] [SPEAKER_00]: which is the incentive.

[00:54:14] [SPEAKER_00]: I could see employers start to incentivize you to care about your data.

[00:54:19] [SPEAKER_02]: Yeah, and definitely insurance companies.

[00:54:22] [SPEAKER_02]: But again, it all gets wrapped up into what's going to be required and mandated

[00:54:26] [SPEAKER_02]: as opposed to what's just a good idea.

[00:54:29] [SPEAKER_02]: So I think that there's an issue there.

[00:54:31] [SPEAKER_02]: But in general, I'm excited.

[00:54:34] [SPEAKER_02]: I do think when people say, oh, what good is all this AI stuff?

[00:54:38] [SPEAKER_02]: It's just helping kids cheat on their college tests or whatever.

[00:54:41] [SPEAKER_02]: I do see healthcare AI as basically the main use case right now for AI.

[00:54:48] [SPEAKER_02]: It's going to actually, like you said, increase lifespans.

[00:54:52] [SPEAKER_00]: I do too.

[00:54:53] [SPEAKER_00]: And look, I take a lot of supplements.

[00:54:54] [SPEAKER_00]: I do a lot of research around my own personal health.

[00:54:58] [SPEAKER_00]: And I take supplements.

[00:54:58] [SPEAKER_00]: And I think they're helping me.

[00:55:01] [SPEAKER_00]: I'm super excited about a day where I can actually get way more precise biomarker data

[00:55:09] [SPEAKER_00]: as a result of all of this information, all my sleep information, all my activity information,

[00:55:15] [SPEAKER_00]: all of my clinical notes, all of my blood tests, all going into some kind of model

[00:55:21] [SPEAKER_00]: that provides me with real-time information about how my body's changing as a result of

[00:55:26] [SPEAKER_00]: what I put into it.

[00:55:27] [SPEAKER_00]: We're not there yet, but you can imagine that's not far away, right?

[00:55:32] [SPEAKER_00]: And so, right?

[00:55:33] [SPEAKER_02]: I mean, that's interesting.

[00:55:34] [SPEAKER_02]: I never know which supplements are doing what.

[00:55:39] [SPEAKER_02]: It's hard to know.

[00:55:40] [SPEAKER_02]: What supplements do you take?

[00:55:42] [SPEAKER_00]: Me?

[00:55:44] [SPEAKER_00]: I take Bs because I did a pharmacogenetic test that showed that I don't digest my Bs

[00:55:51] [SPEAKER_00]: very well from food.

[00:55:52] [SPEAKER_00]: So that was important.

[00:55:53] [SPEAKER_00]: I did a genetic test that showed my strengths and weaknesses from a

[00:55:59] [SPEAKER_00]: food-digestion perspective.

[00:56:00] [SPEAKER_00]: And then I supplemented based on that awareness.

[00:56:03] [SPEAKER_00]: So Gary Brecka talks a lot about that.

[00:56:05] [SPEAKER_00]: Get your pharmacogenetic test done and then understand whether you need the methylated B12

[00:56:11] [SPEAKER_00]: or the non-methylated B12.

[00:56:12] [SPEAKER_00]: Otherwise, you could be wasting your money, right?

[00:56:14] [SPEAKER_00]: So I do that.

[00:56:15] [SPEAKER_00]: I take that with fish oil.

[00:56:18] [SPEAKER_00]: I take Tru Niagen, which is nicotinamide riboside.

[00:56:23] [SPEAKER_00]: It's an NAD booster.

[00:56:24] [SPEAKER_00]: I think it's phenomenal technology.

[00:56:26] [SPEAKER_00]: I've been taking it for years.

[00:56:27] [SPEAKER_00]: I really think that it helps.

[00:56:29] [SPEAKER_02]: Do you think that's better than NMN, which is the other alternative for NAD boosting?

[00:56:35] [SPEAKER_00]: I do.

[00:56:36] [SPEAKER_00]: I do.

[00:56:37] [SPEAKER_00]: And for full disclosure, I'm on the board of ChromaDex, the company behind Tru Niagen.

[00:56:44] [SPEAKER_00]: But the reason why I think it's a lot better is there isn't a pristine source of NMN.

[00:56:52] [SPEAKER_00]: First of all, there's so many providers and the level of quality that you could get is

[00:56:59] [SPEAKER_00]: fairly diverse.

[00:57:01] [SPEAKER_00]: And so you have to be really careful.

[00:57:03] [SPEAKER_00]: Whereas with something like a nicotinamide riboside, when you work with a company like

[00:57:08] [SPEAKER_00]: Chromadex that owns the patents and is very precise about quality, I mean, quality is a

[00:57:13] [SPEAKER_00]: really important thing in supplements.

[00:57:15] [SPEAKER_00]: Because you could be taking what you're supposed to be taking,

[00:57:20] [SPEAKER_00]: but you could be getting, you know, 70% filler, right?

[00:57:22] [SPEAKER_00]: So it's really important to consider that.

[00:57:24] [SPEAKER_00]: But I mean, if you look at the metabolic pathway that both NR and NMN take in order

[00:57:31] [SPEAKER_00]: to get to NAD, to boosting your NAD, NMN is a longer pathway.

[00:57:37] [SPEAKER_00]: And so, generally speaking, we think that NR is a much better booster.

[00:57:43] [SPEAKER_00]: And so, yeah, that's one that I take.

[00:57:45] [SPEAKER_00]: I take a curcumin every day, which is great for low-level inflammation in your body.

[00:57:51] [SPEAKER_00]: I take a lion's mane and, you know, I'll take a ginkgo.

[00:57:56] [SPEAKER_00]: I don't know if you've heard of Dr. Daniel Amen.

[00:58:00] [SPEAKER_00]: He talks about that.

[00:58:01] [SPEAKER_00]: So he's a really interesting guy.

[00:58:03] [SPEAKER_00]: Like, you know, he's a psychologist and a psychiatrist, and people come in and see

[00:58:11] [SPEAKER_00]: him with various different, you know, mental health issues.

[00:58:14] [SPEAKER_00]: What he does is he also scans your brain at the same time as talking to you and

[00:58:21] [SPEAKER_00]: understanding your clinical issues.

[00:58:23] [SPEAKER_00]: He's now scanned thousands of people, and he's started to develop some real

[00:58:28] [SPEAKER_00]: understanding.

[00:58:29] [SPEAKER_00]: You know, he kind of developed the kind of pattern recognition

[00:58:33] [SPEAKER_00]: that an AI would, but he just did it intuitively himself.

[00:58:36] [SPEAKER_00]: And he started to, you know, take someone that came in, look

[00:58:43] [SPEAKER_00]: at their brain, and say, well, I can tell why you're having these issues.

[00:58:46] [SPEAKER_00]: You know, our thoughts are the product of this brain. Think about

[00:58:51] [SPEAKER_00]: the thoughts being the software and the brain being the hardware. If

[00:58:55] [SPEAKER_00]: the hardware is not working, how do you expect the software to work properly?

[00:58:59] [SPEAKER_00]: Right.

[00:58:59] [SPEAKER_00]: And so he'll then take you on and, you know, he'll ask you to stop smoking or drink

[00:59:05] [SPEAKER_00]: less or, you know.

[00:59:06] [SPEAKER_02]: I want to be his patient.

[00:59:09] [SPEAKER_00]: He's pretty incredible.

[00:59:12] [SPEAKER_00]: But he says the most beautiful brains take ginkgo, you know. Ginkgo slightly

[00:59:19] [SPEAKER_00]: increases the circulation in your brain.

[00:59:20] [SPEAKER_00]: You've got to be careful

[00:59:21] [SPEAKER_00]: you don't take too much, because you can cause some

[00:59:24] [SPEAKER_00]: bleeding up there, some hemorrhages and whatnot.

[00:59:27] [SPEAKER_00]: But yeah, I take a ginkgo once in a while.

[00:59:31] [SPEAKER_00]: So, you know, over time I've sort of collected ones that I think are

[00:59:36] [SPEAKER_00]: really meaningful. And, you know, in the winter I'll take D, because D is incredibly

[00:59:42] [SPEAKER_00]: important. In the summer,

[00:59:43] [SPEAKER_00]: I won't, because I don't want too much D. I'm out in the sun a lot.

[00:59:46] [SPEAKER_00]: What's a good amount of vitamin D?

[00:59:49] [SPEAKER_02]: This is a really good question.

[00:59:51] [SPEAKER_02]: During the pandemic, I was taking like 10,000 IUs a day, but I think that's too much.

[00:59:57] [SPEAKER_00]: Yeah, I mean, look, I take 5,000 a day.

[01:00:00] [SPEAKER_00]: And if I'm not feeling well, I'll double up.

[01:00:03] [SPEAKER_00]: And D was incredibly important.

[01:00:06] [SPEAKER_00]: You know, during the pandemic, we heard a lot about that.

[01:00:12] [SPEAKER_00]: And I don't think I understand the exact reason for it,

[01:00:16] [SPEAKER_00]: in terms of whether or not it would, you know, kind of help you avoid

[01:00:20] [SPEAKER_00]: the spike protein or whatnot, but it certainly did improve the defenses.

[01:00:24] [SPEAKER_00]: And it's one of the ones that is famously measured by every care provider.

[01:00:29] [SPEAKER_00]: They want to understand your D levels to understand your immunity

[01:00:33] [SPEAKER_00]: overall.

[01:00:34] [SPEAKER_00]: Your immune system and whatnot. Those are sort of the key

[01:00:40] [SPEAKER_00]: ones that I take.

[01:00:41] [SPEAKER_00]: I will take an L-tyrosine once in a while, if I'm feeling

[01:00:47] [SPEAKER_00]: anxious. You know, that helps. That's the precursor to your dopamine.

[01:00:52] [SPEAKER_00]: So you will produce more dopamine as a result of having L-tyrosine.

[01:00:57] [SPEAKER_00]: It'll kind of calm you down

[01:00:58] [SPEAKER_00]: if, let's say, you're going through an anxious time. That's kind of

[01:01:02] [SPEAKER_00]: an interesting one.

[01:01:04] [SPEAKER_02]: Yeah.

[01:01:04] [SPEAKER_02]: I have all of these in my cabinet and I take none of them.

[01:01:07] [SPEAKER_02]: I was taking NMN for a while, but it's hard to know if it's actually doing anything.

[01:01:12] [SPEAKER_02]: And so you just drift away. Maybe I should start up again on

[01:01:17] [SPEAKER_02]: these supplements.

[01:01:18] [SPEAKER_00]: But this is why I think it'd be very interesting to know,

[01:01:24] [SPEAKER_00]: through hard data, how our bodies are responding to these things. Because you're right.

[01:01:29] [SPEAKER_00]: Unless you're really sensitive, unless you're really tuned in and paying attention and

[01:01:34] [SPEAKER_00]: measuring how you feel and almost being clinical about it, it may be difficult to discern

[01:01:40] [SPEAKER_00]: whether or not these things are working.

[01:01:41] [SPEAKER_02]: Here's another idea.

[01:01:44] [SPEAKER_02]: I see you guys as a platform for people to, um, give doctors, like this mental

[01:01:53] [SPEAKER_02]: health doctor you mentioned, an incentive to collect data and potentially, you

[01:02:00] [SPEAKER_02]: know, sell it to your platform.

[01:02:03] [SPEAKER_02]: Like, you could use that data now for all their patients in

[01:02:08] [SPEAKER_02]: your system.

[01:02:10] [SPEAKER_00]: Yeah.

[01:02:11] [SPEAKER_00]: No, if we have the right consents, and if we're providing value to our patients,

[01:02:15] [SPEAKER_00]: then, you know, what I find is really the key with patients is that

[01:02:21] [SPEAKER_00]: they will let you use their data as long as they get something back.

[01:02:24] [SPEAKER_00]: You know, they get some value out of that.

[01:02:26] [SPEAKER_00]: They say, hey, I'll let you use my data,

[01:02:29] [SPEAKER_00]: but you must anonymize it.

[01:02:32] [SPEAKER_00]: And I don't want you to, you know, be sending my name to anyone else.

[01:02:35] [SPEAKER_00]: But I also then want to see how I compare with the average person in, you know, my

[01:02:42] [SPEAKER_00]: town or my country, or, you know, with my cultural disposition.

[01:02:46] [SPEAKER_00]: This could be very valuable.

[01:02:48] [SPEAKER_02]: That's a great idea.

[01:02:49] [SPEAKER_00]: Right.

[01:02:50] [SPEAKER_02]: People are competitive.

[01:02:51] [SPEAKER_02]: I want to know.

[01:02:52] [SPEAKER_02]: I am the healthiest 56 year old in my neighborhood, in my town, in my state or whatever.

[01:03:00] [SPEAKER_00]: We have a longevity health business.

[01:03:04] [SPEAKER_00]: One of the things that we're working on now is to try to tell people, you know, look,

[01:03:09] [SPEAKER_00]: here's your chronological age,

[01:03:10] [SPEAKER_00]: and here's what we think is sort of your intrinsic health age,

[01:03:16] [SPEAKER_00]: as it relates to what the information is telling us.

[01:03:19] [SPEAKER_00]: And we'll tell you if we think you're aging faster, slower, or at about

[01:03:23] [SPEAKER_00]: the rate at which you should be aging.

[01:03:25] [SPEAKER_00]: Because some of us are aging faster than we should be,

[01:03:28] [SPEAKER_00]: and some of us are aging slower.

[01:03:31] [SPEAKER_00]: And this is just given to us through a model.

[01:03:34] [SPEAKER_00]: Right.

[01:03:34] [SPEAKER_00]: And I think, because there's got to be a way to get people's attention.

[01:03:38] [SPEAKER_00]: And for most people, you know, it's about the finality of life.

[01:03:43] [SPEAKER_00]: It's like, well, what do you mean

[01:03:44] [SPEAKER_00]: I'm aging faster?

[01:03:45] [SPEAKER_00]: What do you mean that I'm really five years older,

[01:03:48] [SPEAKER_00]: that my body's five years older than my chronological age?

[01:03:54] [SPEAKER_00]: I don't want that to be the case.

[01:03:55] [SPEAKER_00]: You know, tell me what I need to change.

[01:03:57] [SPEAKER_00]: Right.

[01:03:58] [SPEAKER_00]: So I think it's coming to terms with, you know, a simple and powerful

[01:04:02] [SPEAKER_00]: vision for yourself, and where you are today and where you want to go.

[01:04:06] [SPEAKER_00]: It's just like anything in life.
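The "intrinsic health age" idea Hamed describes — a model that compares your chronological age to what your data implies — can be illustrated with a toy linear adjustment. The features and coefficients below are invented for the sketch; real biological-age models, such as epigenetic clocks, are fit on large cohorts and bear no resemblance to these numbers.

```python
# Toy illustration of an "intrinsic health age": nudge chronological age
# up or down based on a few biomarkers. All coefficients are made up.
def intrinsic_age(chrono_age: float, resting_hr: float,
                  vo2_max: float, sleep_hours: float) -> float:
    delta = 0.0
    delta += 0.10 * (resting_hr - 60)   # higher resting heart rate -> "older"
    delta -= 0.15 * (vo2_max - 35)      # better aerobic fitness -> "younger"
    delta -= 0.50 * (sleep_hours - 7)   # chronic short sleep -> "older"
    return round(chrono_age + delta, 1)

def aging_verdict(chrono_age: float, intrinsic: float) -> str:
    """Translate the gap into the faster/slower/on-track message Hamed describes."""
    gap = intrinsic - chrono_age
    if gap > 1:
        return f"aging faster (body reads ~{gap:.1f} years older)"
    if gap < -1:
        return f"aging slower (body reads ~{-gap:.1f} years younger)"
    return "aging at about the expected rate"
```

The point of the sketch is the product mechanic, not the science: a single interpretable number ("your body is five years older") plus a verdict is what gets people's attention.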

[01:04:07] [SPEAKER_02]: Well, Hamed Shahbazi, the CEO and founder of Well Health Technologies and also HealWell, which

[01:04:15] [SPEAKER_02]: is kind of the AI spinoff of Well Health.

[01:04:19] [SPEAKER_02]: Thank you once again. Again, I think this is the most important area of AI.

[01:04:25] [SPEAKER_02]: I think this is the one that's going to change everyone's life within a very small amount

[01:04:29] [SPEAKER_02]: of time over the next few years.

[01:04:31] [SPEAKER_02]: I think your companies are at the center of this.

[01:04:35] [SPEAKER_02]: So I'm very excited to have you on the podcast, and thanks so much once again

[01:04:39] [SPEAKER_02]: for coming on the show.

[01:04:41] [SPEAKER_00]: Thank you.

[01:04:41] [SPEAKER_00]: It's been awesome.

[01:04:43] [SPEAKER_00]: Really fun to talk about this with you.

[01:04:45] [SPEAKER_00]: Thank you for your interest in all your great questions.

[01:04:47] [SPEAKER_02]: We look forward to, well, you have to come on again.

[01:04:51] [SPEAKER_02]: I'm going to have more questions.

[01:04:52] [SPEAKER_00]: You got it.

[01:04:54] [SPEAKER_00]: Thanks.
