5 votes

Medical selfies

2 comments

  1. geeklynad

    Ooo! Excited to see some Eric Topol in the wild. He's been an amazing source of information on covid for the past couple years.

    I'm also a big fan of how much more accessible at-home equipment is becoming. I would love to be able to play around with some ultrasound imaging.

    2 votes
  2. skybrian

    From the article:

    To date, smartphone ultrasound has not caught on much in the United States for several reasons. There’s no reimbursement code for physicians to acquire the images, nor an easy mechanism to place them in the electronic medical record. In primary care and many specialties, there is a lack of experience in acquiring images, with physicians dependent on referring patients to the ultrasound lab for formal studies by sonographers. Image quality in the early days (circa 2010) of handheld ultrasound was poor, but it has markedly improved and is now, in most patients (in skilled hands), comparable to expensive ultrasound machines. The ultrasound probe devices that attach to smartphones are much too expensive, ranging from $2000 to $8000, with many requiring monthly subscription services to use the app. But the amount of money saved by not referring patients to an ultrasound lab would easily pay for these devices in a system with universal health care. In the United States there is no incentive.

    But that has not deterred the adoption of smartphone ultrasound in the hinterlands.

    [...]

    The New York Times piece in 2019 above showed its use in remote Uganda villages for diverse applications such as diagnosing pneumonia, tuberculosis, fetal monitoring and more. I find this a particularly good example for how medical technology can help reduce inequities, here making high resolution ultrasound imaging possible in low income countries that would not have access. The images could potentially be obtained by untrained personnel and sent to experts for interpretation.

    But now with A.I. that has changed. For the heart, there are a number of AI tools that will guide a person with no echocardiographic knowledge in acquiring images or interpreting them. All that is needed is for a person to put the smartphone probe on the left side of the chest. Then the AI can tell the person to move it up or down, or clockwise or counterclockwise, until it “sees” the heart image and automatically captures it. It can then automatically label the structures and provide an interpretation. This sets up the potential for any person to perform a screening echocardiogram and get an initial reading of it. An example is shown below with automated labeling of the cavities (LV=left ventricle, LA=left atrium, MV=mitral valve) and with the ejection fraction (EF%) and other outputs provided. If this can be done with the heart, the most challenging organ to image because of its motion, you can imagine how much easier it would be for many other organs where just a static shot is needed (like the kidney photo above).

    1 vote