As someone who works in data analysis in health care and has read countless studies, I would not purchase this software either. There are plenty of reasons why it isn't particularly useful, and I'll summarize a few of them here:
Humans are diverse, and what shows up in aggregate is often not that important except to establish a rough hierarchy. If there's a huge difference in outcomes, then you should prescribe one treatment over another, but often the difference in efficacy is mostly negligible, and you can always switch what the patient is on if they want something else.
Studies vary a ton in quality, even when you're searching a database like PubMed. Any good meta-analysis will have a section discussing which studies were excluded and why. In many cases you simply need a human to abstract these, or some sort of AI. I see nothing of the sort in this proposed model (yes, one of his links on networked studies does use a form of AI to screen studies, but I'm not convinced this is actually accurate enough in aggregate except in certain areas of research where data is more than plentiful).
Figuring out the gold standard of practice for different diseases, presentations, procedures, etc. is already the job of many academic scientists, and this information is easily searchable when it comes to important decisions in patient care.
The way medical education is set up addresses many of the concerns this developer imagined there to be. Providers operate in specific spaces for more than one reason, and one of those reasons is that continuing medical education is important - conventions and associations of doctors provide exactly the kind of material he is offering, but through superior means (review studies, meta-analyses, standards of care, etc.). While it's great to see a chart showing that naproxen tends to work roughly 35% better than Advil, it does not address other issues such as comorbidities or the considerations a provider would be weighing when recommending an OTC painkiller to a patient. Also, a patient likely already knows which OTC painkiller works for them and will simply purchase that, regardless of what the provider says.
There are already apps that provide high-quality recommendations to doctors, which are likely already paid for or adopted by organizations and which provide additional utility beyond pooling studies together.
Perhaps the most important issue, which has already been brought up, is one of incentives. Yes, providing the best care for your patient is absolutely important, but there is a cost to this product, and the current system is not set up in a way where helping a patient find the right drug 25% faster (just throwing a random number out there) is visible or a real incentive for any healthcare provider. This might help organizations decide on best-practice workflows, or health insurance companies decide on the sequencing of specific events, but realistically the incentives to improve efficiency are fairly low when there is so much secrecy combined with a clear split between who pays for services and who provides them.
These are plausible reasons and I am sure you know more about it than me, but I’m wondering if this proves too much? We didn’t see the demo and just have the blogger’s word that some doctor thought it was good, and we don’t know what they wanted to charge.
It seems like they may have gotten over-enthusiastic about how much benefit there is from presenting the same research in a somewhat nicer way, but maybe it was better than the other apps, who knows.
I wouldn’t be surprised, though, if they hadn’t evaluated the competition from other apps.
I’m wondering about how doctors view expenses towards continuing education, which would seem to be about improving patient care in some way?
It seems like they may have gotten over-enthusiastic about how much benefit there is from presenting the same research in a somewhat nicer way, but maybe it was better than the other apps, who knows.
I think so too, haha. They meant well but in their excitement forgot due diligence, I suppose. If someone involved in the medical or healthcare field had brainstormed with them from the ground up, rather than only at the end when they needed a medical endorsement, they might have saved themselves a lot of time.
I can't speak for this doctor or any doctor in the United States (whose healthcare system this startup idea seems geared towards), but, as Gaywallet very neatly explained above, as a doctor I wouldn't use this software either, regardless of price. I'm not even sure who they want the end user to be. Patients? Doctors? Providers? Pharmacists?
The very idea of what treatment is "better" changes from city to city and day to day. Doctors treat patients, not symptoms. Patients, on the other hand, are good at knowing what's wrong but bad at knowing specifically how to fix it. Hundreds of studies and software could tell both patient and doctor that ibuprofen was more effective than paracetamol/acetaminophen, but I would never use it as the first-choice painkiller for a pregnant patient, nor would I prescribe naproxen as first-line pain relief for a headache. If anything, this app may be useful to drug companies, but they would have their own ways of getting information and comparing treatments with their competitors.
I’m wondering about how doctors view expenses towards continuing education, which would seem to be about improving patient care in some way?
I'm not sure I understood the question? We have to keep continually updated as best we can or risk providing sub-standard care. Most governments require proof of continuing education as a minimum baseline for medical license renewal. This is usually done through training and apprenticeship under a seasoned mentor or specialist group, through the clinic/hospital/institution/government you work at, on your own dime and time, or, if there's no other option, through third-party organizations and events.
I guess I'm wondering whether some part of that continuing education could be mediated via software of some sort? Since there isn't as much travel currently, is it more online now? How much is video chat versus reading papers? How do people in health care learn about new developments with COVID-19?
Aaah, okay, yes, most of it is done online now, and it goes about as well as the average online classroom lecture does. Software-wise, there is already a bunch of software available that consolidates such information and aids practice, like the one in this article and better - most health care workers have their corresponding one on their phones. However, to use them we often need an institutional subscription or an expensive paid account (that's a whole other issue altogether), and sometimes license verification.
Before COVID, as doctors we'd usually do maybe... hmm, ~33% in-person formal events like conferences/workshops/training, ~33% in-person hospital/institution-based case reports, audits, and rounds (things like going over the interesting patients from the past month and reviewing new standards for their diseases), and ~33% self-study with journals, textbooks, and research publications. Video/online conferencing was rare except for specialist needs.
In 2020 I'd say that shifted quite drastically, as doctors of all specialties were reassigned to the needs of the COVID wards and COVID care. Continuing education in the formal sense took a back seat to shifting and adjusting staff so we wouldn't cross-contaminate each other with COVID. Now I'd say all annual/biannual conferences either went to online video or got postponed. Most of them also changed their content to focus on COVID-19-relevant information. Self-study took up a huge chunk of 2020 learning as we all rushed to get whatever information on COVID we could. Most governments and international institutions mandated that COVID information be free, so that's a huge plus, but the vast majority of it is "in progress" or a "developing study", so it takes us longer to sift through the information based on whatever our current practice guidelines are.
How do people in health care learn about new developments with COVID-19?
Officially, via announcement/text/memo/email through the Department of Health, which mandates nationwide information, then adjusted per institution based on the cases we see in our specific hospitals. Most doctors are members of associations, so depending on your specialty or training post, you'd follow bulletins from your association. We don't just read the journals - for our guidance and protection, we also follow steps or algorithms of care recommended by institutions that did the bulk of the data-crunching for us; a common example is the NICE guidelines in the UK. Where we do not have local standards, we follow the standards of the WHO or of countries with a population makeup similar to our own.
Information during COVID is pretty much a flood, and it's hard to separate the trustworthy from the dubious, especially since everything is so new. I would not say last year and the current one are representative of how information is usually distributed to doctors. Whether they will be representative going forward, I don't know.
I'd be fascinated to know what his price point was - if the doctors really were seeing any value at all, rather than just being polite, it seems like there should have been some price at which he could sell it to them.
Obviously it's easy to be a back-seat driver here, especially on a pithy and deliberately self-deprecating blog post, so I'm sure the author already considered this - but that's kind of why I'd like to see a deeper dive on why it didn't work. The beauty of SaaS is that you can sell it dirt cheap, because you build once and then sell slices of the same thing many times.
Google says there are over 230,000 medical practices in the US alone. Charging $10/practice/month is, to them, so close to nothing that they wouldn't even be able to bring up the word "budget" with a straight face. Expenses seemed fairly low (a few servers, outsourced data entry), and again, if there was truth in the potential the doctors saw, he could be clearing seven figures within a couple of years. The story of what really went wrong would be a nice addendum to this abridged version!
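For a rough sense of scale, here's a quick back-of-envelope sketch of that math (the practice count and price are the ones mentioned above; the adoption rates are assumptions I'm inventing purely for illustration):

```python
# Back-of-envelope revenue for the ~230,000 US practices at $10/practice/month
# cited above. Adoption rates are hypothetical, just to show what it would
# take to "clear seven figures".
PRACTICES_US = 230_000
PRICE_PER_MONTH = 10  # dollars per practice

for adoption in (0.01, 0.04, 0.10, 1.00):
    subscribers = round(PRACTICES_US * adoption)
    annual_revenue = subscribers * PRICE_PER_MONTH * 12
    print(f"{adoption:.0%} adoption: {subscribers:,} practices -> ${annual_revenue:,}/year")
```

At those assumed rates this prints roughly $276k, $1.1M, $2.8M, and $27.6M per year respectively - so even a few percent adoption at that price would already be into seven figures annually.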
That's a product well designed for sale to larger organizations, like governments that provide health care to many millions of people, to help them make pricing and qualification decisions. It's not helpful to sell it to individual doctors, but large systems could benefit from the information significantly more.
I'm unsure why you would try to sell this to individual practices. This looks like something a large multi-state healthcare company or someone who provides software to said companies would love to throw money at.
Because he was a programmer, not a salesman, and didn't pay for a subject matter expert in the field he was developing for until he wanted endorsements.
From the article:
Something in her tone makes me pause. "Uh, yeah," I say. "So what would you imagine a product like this—one that could change the very practice of medicine—how much would you pay for such a service?"
"Oh, uh—hmmmm," she said. "I don't know if we can spare the budget here, to be honest. It's very fun...but I'm not sure if our practice can justify this cost."
If you read enough sales books most of them tell you that when people say your product is too expensive what they really mean is your product isn't valuable enough. Susan acted like I was offering her Nirvana as a Service so the conversation has taken quite a wild turn.
"So you don't think this product is useful?"
"Oh sure! I mean, I think in many cases I'll just prescribe what I normally do, since I'm comfortable with it. But you know it's possible that sometimes I'll prescribe something different, based on your metastudies."
"And that isn't worth something? Prescribing better treatments?"
"Hmmmm," she said, picking at her fingernails. "Not directly. Of course I always have the best interests of my patients in mind, but, you know, it's not like they'll pay more if I prescribe Lexapro instead of Zoloft. They won't come back more often or refer more friends. So I'd sorta just be, like, donating this money if I paid you for this thing, right?"
Ahh for-profit medicine. Patient's best interest? "That won't make me more money, who cares???"
Maybe it's too cynical or too conspiratorial, but it also seems logical to think and despair about all of the technological, and especially medical, advancements that have been made and now sit in a file cabinet or on an HDD somewhere because they weren't deemed profitable enough.
Well, on second thought, it might be that non-patient-specific medical advice is cheap because there is so much of it. And there is so much of it because it scales really well. That makes it hard to get started though.
It's kind of like trying to sell an encyclopedia when we have the Internet.
Yeah, but in this context we're talking about it being sold to doctors. I'd sure like my doctor to pull from an app dedicated to better meta-analysis of medicines or whatever instead of just googling (which your average doctor actually relies on quite a bit).
To be fair, googling's only effective when supplemented with background knowledge. For instance, I couldn't point to my liver if you asked me to, so having me google your symptoms would be about as effective as having you see a homeopath.
For that matter, while there are undoubtedly benefits to meta-analysis in principle, you still need to be able to trust whoever's actually compiling the meta-analysis (e.g., preferably a research MD). The only regression I saw in the link (about halfway down the page) looks questionable at best -- the lines of best fit poorly match the data, there are no error bands on the regression, and there's nothing to suggest that the model (parabolic?) is appropriate. A flimsy meta-analysis can be worse than no meta-analysis at all, since it can be used to justify inappropriate conclusions.
(Of course, I'm extrapolating from a single plot, so I'm being slightly unfair to the author; maybe the actual methodology they used was more robust. The more general point still stands, however.)
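For what it's worth, here's a minimal sketch of what "error bands on the regression" could look like for a parabolic fit, using synthetic data I made up (nothing here comes from the author's plot); the point is only that a trustworthy curve comes with some visualization of its uncertainty:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up "outcome vs. dose"-style data, purely illustrative.
x = np.linspace(0, 10, 40)
y = 2 + 0.8 * x - 0.05 * x**2 + rng.normal(0, 1.5, size=x.size)

grid = np.linspace(x.min(), x.max(), 200)

# Bootstrap the degree-2 (parabolic) fit to get a 95% confidence band.
n_boot = 2000
curves = np.empty((n_boot, grid.size))
for i in range(n_boot):
    idx = rng.integers(0, x.size, size=x.size)  # resample points with replacement
    coeffs = np.polyfit(x[idx], y[idx], deg=2)
    curves[i] = np.polyval(coeffs, grid)

fit = np.polyval(np.polyfit(x, y, deg=2), grid)       # fit on the full data
lo, hi = np.percentile(curves, [2.5, 97.5], axis=0)   # 95% band around the curve

# A plot worth trusting shows the points, the fit, AND the band, e.g. with matplotlib:
#   plt.scatter(x, y); plt.plot(grid, fit); plt.fill_between(grid, lo, hi, alpha=0.3)
print(f"95% band half-width near x=5: {(hi - lo)[100] / 2:.2f}")
```

If the band is wide relative to the differences the curve is supposed to demonstrate, the fit isn't telling you much; and if the chosen model can't track the scatter at all, that becomes visible too.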