22 votes

Therapists are secretly using ChatGPT

31 comments

  1. [14]
    stu2b50
    Link

    Feels like an extension of what has always plagued the soft sciences: they're soft. It's really hard for a patient to objectively evaluate a therapist's work, in a way that isn't true of, say, a doctor. This, of course, opens the door to people half-assing things and getting away with it.

    You can legislate it, but in the end, it's really hard to detect when a psychiatrist is half-assing things with ChatGPT unless they monumentally fuck up on their end, as the examples in the article indicate.

    17 votes
    1. [5]
      Carrow
      Link Parent

      As to your last point, pardon me for being a pedant, but though there's some overlap in the patient's experience, psychiatrists are rather different from therapists. Psychiatrists diagnose medical conditions and prescribe medication; therapists provide talk therapy and emotional support but can only advise talking to a doctor about medication. Both treat mental health conditions and, compared to the typical doctor interaction, both require you to be vulnerable and review life events to give them context, so it is a reasonable mix-up, not to mention media conflating the two.

      13 votes
      1. [4]
        TaylorSwiftsPickles
        Link Parent

        Not 100% sure if you meant to imply the opposite, but in my case at least, all the psychiatrists I've been to also provided therapy, aside from just diagnosing and medicating.

        3 votes
        1. vord
          Link Parent

          My psychiatrist (who also does therapy) says that prior to the rise of the shitstorm that the USA calls health insurance, the two roles used to be fairly intertwined. It's only once they started separating out the billing codes that things started getting messy.

          6 votes
        2. redwall_hp
          (edited)
          Link Parent

          I've mostly seen the opposite. My girlfriend's psychiatrist is an attending at a hospital, besides running his own practice for various psychiatric treatments. He oversees in-clinic treatments and manages prescriptions for self-administered medications. Therapists aren't even offered; you have to go elsewhere if you want that.

          It's been a huge step up from the place she used to go, and the doctor has gone above and beyond, doing research into her esoteric illness to find new ways to help manage it that her neurologist was unaware of. (He also once formally reprimanded an ER doctor who was an ass to her.)

          6 votes
        3. Carrow
          Link Parent

          Thanks for clarifying, I didn't mean to imply psychiatrists can't provide talk therapy. In my experience and my peers', we haven't had talk therapy from psychs; it may be a regional thing. All the more reason why my pedantry was hardly worth mentioning, honestly. I mainly wanted to point out that the article was about therapists (though it does quote a psych) and that were a psychiatrist to use an LLM, you may very well find out quite a bit more quickly, given the medication aspect.

          2 votes
    2. [7]
      papasquat
      Link Parent

      I think this has less to do with the fact that psychology is a soft science and more to do with the fact that LLMs are extraordinarily good at doing a passable imitation of a therapist.

      LLMs are designed to imitate human language. A therapist's entire job is using human language to provide mental healthcare. It's a job that's particularly susceptible to "cheating" with LLMs in a way that other "soft sciences", like say, anthropology or archeology or economics aren't.

      After all, computer science is also particularly susceptible to outsourcing work to LLMs. That doesn't mean that computer science is a soft science, it just means that LLMs are good at imitating code too, since they've been trained on mountains of it.

      10 votes
      1. [6]
        stu2b50
        Link Parent

        A key difference there is that the output of a developer is much "harder", in the sense of the "hard-soft" continuum rather than of difficulty. A piece of software has a specification and a job, and how well it does that job sits on a spectrum that is considerably easier to measure than the mental state of a patient.

        Much of the FUD inherent in the examples in the article is the idea that the therapists who are using ChatGPT are not actually doing a good job for their clients, and that the clients can't tell either way. And that's scary.

        Meanwhile in the software world, it's more a case of wasted money: are we going to invest in LLMs just for the code to be useless?

        The ambiguity is the difference.

        8 votes
        1. [5]
          papasquat
          Link Parent

          I'd argue that only very simple software can be measured so easily. A complex piece of software can have hidden bugs, UX issues, security vulnerabilities and so on which only rear their ugly heads well down the line, and which may not be adequately caught by tests, and which will not be covered by the spec.

          A very large piece of software may have more in common with human behavior than an equation with a formal proof.

          We judge whether a person is mentally healthy or not based on their external indicators of happiness and contentment just like we judge whether a computer program is successful based on user feedback, instead of a formal study of the inner structure of either.

          The big difference for me is the consequences if you get it wrong. In therapy, it could be depression, psychosis or suicide. For software development, it usually (but not always) means someone can't order new socks, or an image filter doesn't work correctly on social media.

          The usage of LLMs to "cheat" in both should be mainly informed by those consequences.

          4 votes
          1. [4]
            Eji1700
            Link Parent

            > I'd argue that only very simple software can be measured so easily.

            Which is why almost every best practice involving AI/LLMs is "use it for bite sized chunks you can evaluate". You can't break up a therapy session or depression diagnosis into tiny parts to review for sanity checking.

            And hell, given some of the practice I've seen, it's already a field that has issues with legitimacy because of that, which gets back to the soft sciences thing. There is no out-and-out reliable way to "prove" someone is depressed. Two sincerely practicing and well-trained psychologists could come up with a yes or a no for the exact same set of data, and both could have good arguments for their conclusions.

            It's not like code, where you can objectively break parts of it apart and run them through unit tests, and it's not like engineering, where the math is the math, and it's not like some sections of normal medicine where "yes having X on a test result almost always means Y".
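
            For instance, here's a toy sketch of what "objectively testable" means in code (the scoring rule is the standard PHQ-9 severity banding; the function and test names are made up for illustration):

            ```python
            # Even a scoring rule borrowed from mental health screening becomes
            # objectively checkable once it's written as code -- the diagnosis
            # behind the score is a different matter entirely.

            def phq9_severity(total: int) -> str:
                """Map a PHQ-9 depression-screening total (0-27) to its severity band."""
                if not 0 <= total <= 27:
                    raise ValueError("PHQ-9 totals range from 0 to 27")
                if total <= 4:
                    return "minimal"
                if total <= 9:
                    return "mild"
                if total <= 14:
                    return "moderate"
                if total <= 19:
                    return "moderately severe"
                return "severe"

            def test_phq9_severity_bands():
                # Each assertion passes or fails with no room for interpretation.
                assert phq9_severity(0) == "minimal"
                assert phq9_severity(9) == "mild"
                assert phq9_severity(15) == "moderately severe"
                assert phq9_severity(27) == "severe"
            ```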

            5 votes
            1. [3]
              skybrian
              Link Parent

              There are a lot of things you can test, but there are still unknowns outside of that. You're testing the code, but often it's not the code, it's the environment it runs in. (This is particularly true of performance.)

              Even so, gathering information from machines is a lot easier than from patients.

              1 vote
              1. [2]
                Eji1700
                Link Parent

                I mean, this is a solved scenario as well, just limited by the ability of your team to mock environments and the heavily diminishing returns of doing so.

                In general you can make very very very robustly tested and reliable code. It's just that the return on doing so isn't worth the time spent on it. But if someone said tomorrow "prove that this does or doesn't do X" and puts a large enough bounty on it, you WILL get results.
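
                For example, a toy sketch of mocking out the environment, using Python's standard unittest.mock (fetch_status and its URL are made up for illustration):

                ```python
                # "Mocking the environment": replace the unreliable outside world
                # with a controlled stand-in, so the code under test can be checked
                # deterministically.
                import urllib.request
                from unittest.mock import MagicMock, patch

                def fetch_status(url: str) -> int:
                    """Return the HTTP status code for a URL (depends on the real network)."""
                    with urllib.request.urlopen(url) as response:
                        return response.status

                def test_fetch_status_without_a_network():
                    fake = MagicMock()
                    fake.__enter__.return_value.status = 503  # the "environment" we dictate
                    with patch("urllib.request.urlopen", return_value=fake):
                        # No network involved; the test is exact and repeatable.
                        assert fetch_status("https://example.invalid/health") == 503
                ```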

                You literally cannot objectively determine if a diagnosis is correct in a large majority of cases.

                2 votes
                1. skybrian
                  Link Parent

                  It depends on the domain and what you want to know about. It’s difficult to predict whether users will find a game fun. A simulated social network isn’t going to tell you much when trying to predict the patterns of interaction your users will fall into.

                  1 vote
    3. TonesTones
      Link Parent

      > It's really hard for a patient to objectively evaluate a therapist's work, in a way that isn't true of, say, a doctor.

      I think this is a bit more nuanced. In my experience as a patient, once I became willing to change, I could easily evaluate a therapist’s work based on personal progress that I was making. The objectivity of that evaluation depends on your definition; I wasn’t using external measurables, but I was able to report either clear decreases in symptoms or stagnation over time in a way that is similar to a doctor.

      I think many people are fairly resistant to meaningful change; ego gets in the way. Even if you were able to provide a perfect metric on how “good” a therapist was, patients would go to the “best” therapist and complain your metric is bad because of their own mental blocks.

      I think many people lean on a “current emotional state” metric to decide if they are happy with their therapist, which rarely translates cleanly to “quality of life improving”.

      6 votes
  2. [14]
    Habituallytired
    Link

    I have other providers who use "AI" transcription/translation apps to make sure they get all the info for their notes. I think that's the correct use of AI in medicine, but using GPT to ask how to comfort someone over the loss of a pet, just because you've never had one yourself, seems lazy. How can you not think to say, "I'm sorry for your loss. It must be difficult not having your buddy by your side," on your own?

    Therapists using GPT like Google to ask it how to handle patient situations is insane to me. They might as well just quit and set you up with GPT before they go.

    10 votes
    1. [8]
      papasquat
      Link Parent

      Personally I have no issues with my therapist using an LLM on their own time to do research into my problem and brainstorm ideas on how to solve them. I'd want them to look at the references that the tool is providing and use their professional experience to evaluate those references before developing a therapy strategy though. I'd similarly have no problem with them using Google to do that research, or talking to colleagues about it, as long as they're not identifying me.

      The issue for me is them furiously typing what I say into ChatGPT during my session, just like I'd have a problem with them wildly googling during my session or pausing every 30 seconds to call their mentor during my session.

      It would make me think that
      a. They have no idea wtf they're doing,
      b. They have no respect for me or my time, and
      c. They're completely exposing very confidential information about me without taking the time to anonymize it.

      My main issue isn't with the tool itself, it's with the usage pattern. I'm tired of people trying to outsource their brain to an LLM.

      15 votes
      1. Habituallytired
        Link Parent

        You know, I think I'm ok with therapists using any resource/research tools at their disposal on their own time, you're right.

        I'm just incensed that these therapists seem to be using these tools DURING the session without even letting their patients know, much less (the better option imo) asking permission to use them in the first place.

        I agree, I'm so tired of everyone outsourcing their brains to LLMs. It's something I talk about a lot at work, because I'm one of the few holdouts who refuses to use any of the AI tools available to me for my work.

        5 votes
      2. [4]
        IIIIIIIIIIII
        Link Parent

        I had a doctor literally google my symptoms while I was in their office once. I was sitting right next to them. This must have been in about 2012, and the doctor was in their 50s.

        They then asked me what kind of general antibiotics I'd like for what turned out to be a minor chest infection. How the fuck should I know?! It felt like someone on an IT helpdesk asking which drivers I'd like.

        It was the most confusing medical interaction I've ever had. And it did make me think the exact same things as your point A, B, and C.

        4 votes
        1. [3]
          JCPhoenix
          Link Parent

          Asking what medicine you want is definitely weird. I get, to an extent, if the doc explained each medication's possible side effects and was like "And they basically all do the same thing and take the same amount of time to act; have any input?" But to just straight up be like, "Idk, pick one," would make me question if this person was a real doctor or not!

          Though maybe you give off doctor vibes and they thought you were a doctor too!

          Anyway, the Googling of symptoms I don't really have a problem with. I'm assuming you didn't have exotic symptoms the doc had never heard of or seen before. Just run of the mill stuff since it was just a minor chest infection. But I don't see any issues using technology to help narrow down a potential diagnosis. "OK, patient has symptoms A, B, not C, but D. Google, what could this constellation of symptoms be?" At the end of the day, the doctor has to make the determination themselves, but I don't see how that would be any different from consulting a textbook or some other documentation. I don't expect a doctor to know every possible infection from every type of microbial invader, after all. Plus, I feel like I've seen modern electronic record/charting systems at the doctor's office that do some of that already.

          1 vote
          1. Omnicrola
            Link Parent

            > At the end of the day, the doctor has to make the determination themselves, but I don't see how that would be any different from consulting a textbook or some other documentation.

            Not the person you're replying to, but the difference to me is huge. A doctor going to a specific website for information is, to me, the same as pulling a reference book off a shelf. It shows that they know at least part of the answer, and where to find the information they need to confirm or expand on it. It's the same as what I do when I have questions about something I have expertise in: I go to a specific set of documentation, forums, or references. I may have even originally found them via Google, but I've read their material enough to trust that they are both useful and generally correct.

            Googling something is the beginning of learning. I don't want a doctor who is beginning, unless I have some weirdly unique thing happening to me.

            3 votes
          2. IIIIIIIIIIII
            Link Parent

            Yeah those are good points. I think the thing that threw me most was, as you say, asking what antibiotics I wanted.

            Another big thing was that this was 2012 internet, so the sources he consulted were (in my mind) authoritative. I remember Mayo Clinic being one of them.

            If a doctor used Google to search my symptoms now, even for a minor ailment, I would be horrified!

            1 vote
      3. DefinitelyNotAFae
        Link Parent

        I have an issue with it. They really shouldn't be googling that often for a particular client's issue either, unless it's a Google Scholar search. They have professional resources, textbooks, etc. You get (if licensed) a whole Master's degree in this. You take multiple licensure exams and get (and sometimes pay for) professional supervision.

        All of that should teach you a dozen ways to learn something about mental health that aren't Google, much less an LLM. Unless you know immediately what's false, an LLM will just risk planting false info, and if you knew, you probably wouldn't be asking an LLM.

        You're even taught how to support people with things you haven't experienced, specifically. (You're also taught how not to throw your own experiences in the way of supporting things you have dealt with).

        There's a distinct lack of boundaries in this situation that is particularly upsetting. I agree with the rest of your post, though.

        2 votes
      4. 286437714
        Link Parent

        P-Squat, I was thinking about your comment during a therapy session I had this morning. Nothing major, just routine maintenance.

        We're using a pretty structured format, and the psychologist was describing something to me that didn't quite make sense. I asked if they had any slides or literature they could send through and they said, "Hmm... not really, this is more theoretical and in textbooks."

        I hardly ever use LLMs, but when I put this term into one and asked for a lay-person's description, it did actually help. It really clarified for me what the psychologist was trying to convey.

        Because this system of therapy is so widely used, it also gave me more confidence that the psych knew what they were doing, and increased my trust in our therapeutic relationship.

        I kept thinking about your last point: the treatment plan was built using the brain of the educated specialist. If I had an experience with a mental health professional putting my therapy goals, or the things I was struggling with, into an LLM, I would be extremely pissed.

        Short anecdote, but afterwards I asked this LLM to explain some very well-known (publicly available) military ground doctrine to me. It confidently asserted an incorrect definition with a wildly wrong example (and used football as a metaphor instead of war).

        In the case of the therapy term, if I didn't know what to search for, it would have been useless. Being able to cross check it on Wikipedia gave me confidence.

        In the case of the military doctrine test, I'm guessing it scraped some kind of LinkedIn post where someone who got out was trying to start a consultancy and dazzle clients with military terminology.

        These things will almost certainly never get to a point where I will feel comfortable with anyone I depend on outsourcing their brain to them.

    2. [4]
      blivet
      Link Parent

      A friend of mine with a doctorate in psychology once told me that a lot of therapists aren't particularly well suited to their line of work, because they started studying psychology in the first place as a way of trying to learn how to deal with their own mental and emotional problems.

      6 votes
      1. [2]
        papasquat
        Link Parent

        If they've learned to deal with those problems, wouldn't that make them particularly well suited to it?

        3 votes
        1. blivet
          Link Parent

          Just because they got a degree doesn't mean they succeeded in addressing their own issues.

          6 votes
      2. EarlyWords
        Link Parent

        That was instantly clear in my introduction to psychology class many years ago in college. Whatever interest I had in the field was immediately lost when I discovered nearly everyone in the class only wanted to discuss their own problems.

        They were nearly as neurotic as the modern dance department!

        3 votes
    3. MimicSquid
      Link Parent

      At the very least, someone who's a trained therapist getting a small topic refresher from GPT to help with an unfamiliar situation is worlds better than a patient's unfiltered connection to the AI. The therapist is a mandatory reporter with regard to impending harm to the patient or others, and various types of abuse of vulnerable people, while these AI programs have egged people on.

      3 votes
  3. DefinitelyNotAFae
    Link

    I don't think this is particularly common, but I'm quite glad we've banned AI therapy in IL. All of this was incredibly unprofessional. This is why CEs and professional supervision are part of the requirements for licensure. This is up there with the nurse live-streaming handing meds out on her hospital floor. It's an extreme outlier but also a huge red flag for the training new professionals are getting.

    I swear to gods we learn what to do in the classroom when someone isn't making progress. There are plenty of hard things about therapy, but this is nowhere near the solution.

    You have professional resources for a reason. You should rarely be googling much less using chatGPT for answers.

    Ugh.

    4 votes
  4. lou
    Link

    It doesn't seem smart to make yourself redundant.

    1 vote