US lawsuit on behalf of deceased patients alleges United Health denies care based on AI model with ninety percent error rate
Link information
(This data is scraped automatically and may be incorrect.)
- Title: UnitedHealth uses AI model with 90% error rate to deny care, lawsuit alleges
- Author: Beth Mole
- Published: Nov 16, 2023
- Word count: 684
Title is grossly inaccurate, but only because the article is grossly inaccurate.
This is a total misunderstanding of statistics, and it's questionable to me whether anything can be inferred from this number at all. It is absolutely not the case that the model has an error rate of 90%. All we know is that the cases that get appealed have a high success rate, and which doctors are going to appeal cases where there is a low chance of success?
Also, it’s unclear where they even got this number. I have read the lawsuit filing and they lean heavily on the legal phrase “on information and belief” (see page 38 of the filing). That said, I’m not a lawyer, so maybe that’s common, but it doesn’t strike me as highly credible.
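To make the selection-bias point concrete, here's a toy simulation with entirely made-up numbers (nothing here comes from the lawsuit): even if only a modest fraction of denials are actually wrong, the overturn rate *among appealed cases* can come out near 90% simply because doctors preferentially appeal the denials they expect to win.

```python
import random

random.seed(0)

# Hypothetical parameters, chosen for illustration only.
DENIALS = 100_000
TRUE_ERROR_RATE = 0.20     # fraction of denials that are actually wrong
P_APPEAL_IF_WRONG = 0.50   # doctors usually appeal clear-cut errors
P_APPEAL_IF_RIGHT = 0.01   # almost nobody appeals a hopeless case

appealed = 0
overturned = 0
for _ in range(DENIALS):
    wrongly_denied = random.random() < TRUE_ERROR_RATE
    p_appeal = P_APPEAL_IF_WRONG if wrongly_denied else P_APPEAL_IF_RIGHT
    if random.random() < p_appeal:
        appealed += 1
        if wrongly_denied:  # assume appeals of wrongful denials succeed
            overturned += 1

print(f"true error rate among all denials: {TRUE_ERROR_RATE:.0%}")
print(f"overturn rate among appeals:       {overturned / appealed:.0%}")
```

With these assumed numbers the overturn rate among appeals comes out above 90% even though only 20% of denials are wrong, which is exactly why an appeal success rate tells you almost nothing about the model's overall error rate.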
Thanks.
This part seems worth a lawsuit though.
Oh, I don’t disagree that there’s some bad behaviour happening here, just that the title is making it sound like things are insanely bad when actually lawyers are sensationalizing things using bad statistics to bolster their case.
"On information and belief" is a boilerplate legal phrase indicating that evidence exists to support the assertions of a legal case, before that evidence is introduced in court and its validity can be proven or disproved. Using the phrase doesn't undermine the case, it's utterly unremarkable.
I agree you can’t really tell the overall error rate from this, but the false positive rate for denials is pretty critical from a regulatory perspective.
There’s a pretty strong financial incentive for insurance companies to deny claims, and navigating the system to appeal them is a nightmare that many people won’t have the time or knowledge for. Knowingly increasing the rate of wrongful denials while hiding behind “the algorithm” sounds like it should be illegal (or at least regulated), but that means it’s probably chill in the US.
Good shout. I respect Ars but this is a bad editorial oversight. It doesn't mean nothing is there but it's a severe misinterpretation of the statistics.
Ars’ legal coverage has gotten worse over the years, or so it feels to me. It could just be me, though. It’s definitely not the worst out there, but it’s by no means great.
To give Ars the benefit of the doubt here, this looks like they're mostly reporting statistical claims made in the lawsuit itself. While unkz is right that the title is a grossly inaccurate misstatement of the statistics, it's the sort of mistake that's unsurprising from someone who isn't super familiar with statistics. Though I suppose you can argue that journalists should have a better understanding of statistics than the average person, especially when reporting on something like this.
It's blatantly unethical to base patient care decisions on an algorithm that hasn't been publicly validated. This would never be tolerated in a diagnostic process (e.g. AI mammography reading for tumor detection).
So why should it be permissible to use a privately developed, closed source algorithm for denial of care, when the owners' biases and incentives all favor that denial?
Regardless of the reported accuracy of the statistics that /u/unkz had problems with, let's have a public blind trial of unaffiliated physicians in applicable specialties reviewing a large random sample of the cases that the NaviHealth algorithm scored. If there's a significant difference in allowances/denials, that would imply that use of the algorithm must be subject to independent review.
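One way to put numbers on the blind trial I'm proposing: compare the algorithm's denial rate against the physicians' denial rate on the same random sample with a standard two-proportion z-test. The counts below are completely made up for illustration.

```python
import math

def two_proportion_z(denials_a, n_a, denials_b, n_b):
    """Normal-approximation z statistic for a difference in denial rates.

    a = cases denied by the algorithm, b = cases denied on independent
    physician review, each out of the same-size random sample.
    """
    p_a, p_b = denials_a / n_a, denials_b / n_b
    p_pool = (denials_a + denials_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical results: algorithm denies 300/1000, physicians deny 220/1000.
z = two_proportion_z(300, 1000, 220, 1000)
print(f"z = {z:.2f}")  # |z| > 1.96 would be significant at the 5% level
```

If the gap is significant at a sensible threshold, that's evidence the algorithm's denials diverge from independent clinical judgment and should trigger mandatory human review.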
In any case, the full STAT article reveals substantial evidence that the algorithm, accurate or not, was misused. Even according to UnitedHealthcare's original written policies, the algorithm's recommendations were supposed to be subject to human review. Instead, NaviHealth's case managers were forced to use the recommendations as a target for which they were held accountable.
If Amber Lynch and other NaviHealth case managers testify to the information they gave STAT News concerning the patently unjustified denial of medically necessary care, it's going to go very badly for UnitedHealthcare.