16 votes

NarxCare score may influence who can get or prescribe pain medication

11 comments

  1. JXM
    Link
    There is a massive opioid crisis in the U.S., but this isn’t the way to stop it. This will just prevent innocent people from getting the short term pain relief they need.

    9 votes
  2. [3]
    patience_limited
    Link
    A recent Axios article reports a pending petition to the FDA to regulate NarxCare as a medical device.

    There's a growing movement in U.S. medicine to regulate clinical decision systems under a "Software as a Medical Device" model, calling for validation of technical and clinical reliability/safety. More detail regarding NarxCare in particular here.

    8 votes
    1. [2]
      boxer_dogs_dance
      Link Parent
      Is this petition something that we can also sign? Thanks very much for the information

      2 votes
      1. patience_limited
        Link Parent
        The petition has already been filed and denied, apparently.

        I did a little digging, because "Center for U.S. Policy" has a very sus AstroTurf ring to it, and the website is about as anodyne as it could possibly be when it's fronting for pain med prescribers. Michael Barnes, the founder, has a history of opening and closing small non-profits associated with drug access and policy, so I'm taking the whole thing with a heftier grain of salt than when I first looked in.

        6 votes
  3. [7]
    Comment removed by site admin
    Link
    1. [6]
      unkz
      Link Parent
      I have to question some of these claims. Obviously computer programs are used all the time to make medical decisions. As the article itself says, the final decision is in the hands of the physician — how is this any different from AI augmented radiology image analysis, for example? If anything, this sounds to me like it has the potential to be even more transparent and unbiased than relying on humans.

      https://www.washingtonpost.com/wellness/interactive/2022/women-pain-gender-bias-doctors/

      From heart disease to IUDs: How doctors dismiss women’s pain

      https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4843483/

      Black Americans are systematically undertreated for pain relative to white Americans

      Now, I’ll be the first to admit that I myself am heavily biased on this subject, but I am generally in favour of data driven approaches to decision making.

      7 votes
      1. DefinitelyNotAFae
        Link Parent
        There's a garbage-in, garbage-out problem here with bias, though. In numerous other ways, we've found that computer systems, AI or otherwise labeled, have taken existing bias and replicated it, whether because they're not trained on representative data, or because the individuals documenting the information that's then analyzed are still documenting biased "factual" information.

        I'd hesitate to suggest that this will inherently bring less bias, when it's likely to perpetuate the same biases. I didn't dig into the factors it looks at but as someone that manages chronic pain meds for my partner, I've seen his care vary widely depending on how the doctors perceive him. I literally had to pull him out of a hospital that was giving him less meds than he was Rx'd at home and drive him to another one so he could get surgery. I've seen doctors actually manage his pain. The biggest help I've seen has been humanizing him to medical professionals, not reducing him to a risk score.

        13 votes
      2. sparksbet
        Link Parent
        AI systems like this are trained on existing data though, so it's highly likely that an AI is going to replicate the biases of human doctors -- in fact, it may base its decisions on those factors more consistently as it learns that, for instance, being black is a good predictor of getting denied pain medication. The AI itself is a black box so it's impossible to identify this in any individual case, only when we look at the statistics over many cases. There are already tons of AI bias incidents where the AI system was implemented in an attempt to avoid human bias but ended up being even more biased (Amazon's resume bot discriminating against women even without being explicitly given their gender is a good and famous example).
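
        To make the proxy-variable point concrete, here's a toy sketch (purely hypothetical numbers, nothing to do with NarxCare's actual model or training data): fit a simple model on biased historical denial decisions, withhold the protected attribute entirely, and the bias still comes through a correlated stand-in like zip code.

          import numpy as np
          from sklearn.linear_model import LogisticRegression

          rng = np.random.default_rng(0)
          n = 20_000

          # Hypothetical data: a protected attribute the model never sees,
          # plus a correlated proxy feature (think zip code).
          group = rng.integers(0, 2, n)
          proxy = (0.7 * group + rng.random(n) > 0.85).astype(int)
          severity = rng.random(n)  # the legitimate clinical signal

          # Biased historical labels: group 1 was denied more often
          # at the same severity level.
          p_denied = 0.2 + 0.5 * (severity < 0.5) + 0.25 * group
          denied = (rng.random(n) < p_denied).astype(int)

          # Train only on severity and the proxy -- the protected
          # attribute is withheld, as in the Amazon resume example.
          X = np.column_stack([severity, proxy])
          model = LogisticRegression().fit(X, denied)

          # The learned risk scores still differ by group, because the
          # proxy smuggles the historical bias back in.
          scores = model.predict_proba(X)[:, 1]
          for g in (0, 1):
              print(f"group {g}: mean predicted denial risk = {scores[group == g].mean():.2f}")

        The gap only shows up when you audit the outputs by group over many cases, which is exactly what a black-box score makes hard to do.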

      3. [4]
        Comment removed by site admin
        Link Parent
        1. [3]
          FluffyKittens
          (edited)
          Link Parent
          Respectfully, this is terribly wrong information. Per the allegations in the article, NarxCare has absolutely crossed some red lines that put them at legal risk, and on a personal level, I think their business model is an actively harmful menace to society - but your legal rationale here is nonsense.

          Providers don't have to document everything in a single silo. Radiology data specifically is near-always stored outside the EHR in a specialized database called a RIS/PACS (radiology information system, picture archival and communication system - typically bundled as one piece of software but not always).

          Radiologists can use CAD (="ai detection") for mammo and do so all the time - not all that data lives in the EHR. There's nothing to stop drug information from being pulled and tossed into ML-based treatment and prediction algorithms where relevant, e.g. for sepsis.

          There's no special consent or HIPAA release required as long as the data is used for treatment (see HIPAA's "TPO" exemptions).

          *For background, I make medical software for a living and I've read a whole lotta legislation.

          8 votes
          1. [3]
            Comment removed by site admin
            Link Parent
            1. [2]
              FluffyKittens
              (edited)
              Link Parent
              This is flat-out wrong. I agree with your argument as a common sense one, but courts don't see it that way.

              Re: your linked quote,

              Here's the most important part: "A health care provider or health plan may send copies of your records to another provider or health plan only as needed for treatment or payment or with your permission"

              ^See the emphasis: treatment and payment are themselves permitted bases for sharing; separate permission is only required outside of those.

              Borrowing a quote from this pharmacy law discussion written by a pharmacy professor with a JD:

              Oregon, like a majority of U.S. states, has a mandatory Prescription Drug Monitoring Program (PDMP) that requires pharmacies to electronically report the dispensing of controlled-substance prescriptions to a state agency that monitors this type of activity. Approximately 7 million prescription records are reported to the Oregon PDMP annually.

              The name and description of the drug and the quantity dispensed, plus identifying information about the patient and the practitioner who prescribed the drug, are all reported. According to the information provided by Oregon, “The primary purpose of the PDMP is to provide practitioners and pharmacists a tool to improve health care, by providing health care providers with a means to identify and address problems related to the side effects of drugs, risks associated with the combined effects of prescription drugs with alcohol or other prescribed drugs, and overdose.”

              And here's a history of US drug monitoring programs for you that discusses the legal precedent, viewable through the well-known hubs of science.

              The Harold Rogers Prescription Drug Monitoring Program, established by the Department of Justice in 2003, gives states federal grant funding to establish and operate PDMPs. President Trump’s Commission on Combating Drug Addiction and the Opioid Crisis focused many of its recommendations on funding, expanding, and improving PDMP systems. The commission dedicated significant portions of its final report to supply-side restriction policies, including PDMPs. The 2018 Substance Use-Disorder Prevention that Promotes Opioid Recovery and Treatment (SUPPORT) for Patients and Communities Act requires state Medicaid programs to order their health care providers to query state PDMPs prior to prescribing opioids.

              1 vote
              1. [2]
                Comment removed by site admin
                Link Parent
                1. FluffyKittens
                  Link Parent
                  If that's the understanding you came away with, you clearly did not read it.

                  Here's the opinion: https://law.justia.com/cases/federal/district-courts/oregon/ordce/3:2012cv02023/109761/60/

                  The monitoring program had sued the DEA, which wanted access to the program's data without a court order, and the DEA was rightfully denied. The judge was not ruling on the legality of the monitoring program itself.

                  4 votes