38 votes

Therapists are secretly using ChatGPT

46 comments

  1. [17]
    stu2b50
    Link
    Feels like an extension of what has always plagued the soft sciences: they're soft. It's really hard for a patient to objectively evaluate a therapist's work, in a way that isn't true of, say, a doctor's. This, of course, opens the door to people half-assing things and getting away with it.

    You can legislate it, but in the end, it's really hard to detect when a psychiatrist is half-assing things with ChatGPT unless they monumentally fuck up on their side, as the examples in the article indicate.

    21 votes
    1. [5]
      Carrow
      Link Parent
      As to your last point, pardon for being a pedant, but though there's some overlap in the patient's experience, psychiatrists are rather different from therapists. Psychiatrists diagnose medical conditions and prescribe medication; therapists provide talk therapy and emotional support but can only advise talking to a doctor about medication. Both treat mental health conditions and, compared to the typical doctor interaction, both require you to be vulnerable and review life events to give them context, so it is a reasonable mix-up, not to mention media conflating the two.

      19 votes
      1. [4]
        TaylorSwiftsPickles
        Link Parent
        Not 100% sure if you imply the opposite but, in my case at least, all psychiatrists I've been to also provided therapy aside from just diagnosing and medicating

        7 votes
        1. redwall_hp
          (edited)
          Link Parent
          I've mostly seen the opposite. My girlfriend's psychiatrist is an attending at a hospital, besides running his own practice for various psychiatric treatments. He oversees in-clinic treatments and manages prescriptions for self-administered medications. Therapy isn't even offered; you have to go elsewhere if you want that.

          It's been a huge step up from the place she used to go, and the doctor has gone above and beyond and done research into her esoteric illness to find new ways to help manage it that her neurologist was unaware of. (He also formally reprimanded an ER doctor, who was an ass to her, once.)

          12 votes
        2. vord
          Link Parent
          My psychiatrist (who also does therapy) says that prior to the rise of the shitstorm that the USA calls health insurance, the two used to be fairly intertwined. It's only once they started separating out the billing codes that things started getting messy.

          11 votes
        3. Carrow
          Link Parent
          Thanks for clarifying, I didn't mean to imply psychiatrists can't provide talk therapy. In my experience and my peers', we haven't had talk therapy from psychs; it may be a regional thing. All the more reason why my pedantry was hardly worth mentioning, honestly. I mainly wanted to point out that the article was about therapists (though it does quote a psych) and that were a psychiatrist to use an LLM, you might very well find out quite a bit more quickly, given the medication aspect.

          4 votes
    2. [2]
      TonesTones
      Link Parent
      • Exemplary

      It's really hard for a patient to objectively evaluate a therapist's work, in a way that isn't true of, say, a doctor's.

      I think this is a bit more nuanced. In my experience as a patient, once I became willing to change, I could easily evaluate a therapist’s work based on personal progress that I was making. The objectivity of that evaluation depends on your definition; I wasn’t using external measurables, but I was able to report either clear decreases in symptoms or stagnation over time in a way that is similar to a doctor.

      I think many people are fairly resistant to meaningful change; ego gets in the way. Even if you were able to provide a perfect metric on how “good” a therapist was, patients would go to the “best” therapist and complain your metric is bad because of their own mental blocks.

      I think many people lean on a “current emotional state” metric to decide if they are happy with their therapist, which rarely translates cleanly to “quality of life improving”.

      10 votes
      1. arch
        Link Parent
        I am not sure if you are familiar with the concept, but your comment made me think about the negative outcomes of talk therapy, as well as treatment-resistant anxiety. There seems to be a treatment-resistant variant of every psychological disorder. There are cases where therapy does make people feel worse off than they were before they started it. I personally think there needs to be significant psychological research into what psychological trauma is, how it is processed in the brain, different ways people can cope with and manage it, etc. I imagine there are occurrences of people who have repressed or dissociated from their traumatic memories in some way, only to have them dug back up by therapy when they weren't able to cope with them.

        Personally, I had a hugely positive experience with therapy; it has changed my daily life significantly. I wish I could have done it sooner. I also kind of doubt I would have been open to it sooner. I used to be so terrified of the idea that I could have had difficulties and shortcomings to overcome that I hid them ferociously.

        5 votes
    3. [9]
      papasquat
      Link Parent
      I think this has less to do with the fact that psychology is a soft science and more to do with the fact that LLMs are extraordinarily good at doing a passable imitation of a therapist.

      LLMs are designed to imitate human language. A therapist's entire job is using human language to provide mental healthcare. It's a job that's particularly susceptible to "cheating" with LLMs in a way that other "soft sciences", like, say, anthropology or archeology or economics, aren't.

      After all, computer science is also particularly susceptible to outsourcing work to LLMs. That doesn't mean computer science is a soft science; it just means that LLMs are good at imitating code too, since they've been trained on mountains of it.

      11 votes
      1. [8]
        stu2b50
        Link Parent
        A key difference there is that the output of a developer is much "harder" on the "hard-soft" continuum (not in difficulty). A piece of software has a specification and a job, and how well it does that job can be measured considerably more easily than the mental state of a patient.

        Much of the FUD inherent in the examples in the article is the idea that the therapists who are using ChatGPT are not actually doing a good job for their clients, and that the clients can't tell either way. And that's scary.

        Meanwhile, in the software world, it's more a case of wasted money - are we going to invest in LLMs just for the code to be useless?

        The ambiguity is the difference.

        10 votes
        1. [7]
          papasquat
          Link Parent
          I'd argue that only very simple software can be measured so easily. A complex piece of software can have hidden bugs, UX issues, security vulnerabilities, and so on that only rear their ugly heads well down the line, may not be adequately caught by tests, and won't be covered by the spec.

          A very large piece of software may have more in common with human behavior than an equation with a formal proof.

          We judge whether a person is mentally healthy based on their external indicators of happiness and contentment, just as we judge whether a computer program is successful based on user feedback, rather than a formal study of the inner structure of either.

          The big difference for me is the consequences if you get it wrong. In therapy, it could be depression, psychosis, or suicide. In software development, it usually (but not always) means someone can't order new socks, or an image filter doesn't work correctly on social media.

          The usage of LLMs to "cheat" in both should be mainly informed by those consequences.

          6 votes
          1. [6]
            Eji1700
            Link Parent
            I'd argue that only very simple software can be measured so easily.

            Which is why almost every best practice involving AI/LLMs is "use it for bite-sized chunks you can evaluate". You can't break up a therapy session or a depression diagnosis into tiny parts to review for sanity checking.

            And hell, given some of the practice I've seen, it's already a field that has issues with legitimacy because of that, which gets back to the soft sciences thing. There is no out-and-out reliable way to "prove" someone is depressed. Two sincerely practicing and well-trained psychologists could come up with a yes or a no for the exact same set of data, and both could have good arguments for their conclusion.

            It's not like code, where you can objectively break it into parts and run them through unit tests; it's not like engineering, where the math is the math; and it's not like some areas of conventional medicine, where having X on a test result almost always means Y.
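
            As a toy illustration (the function and tests below are hypothetical, made up purely to show the idea), a bite-sized chunk of code can be checked mechanically in a way a diagnosis can't:

                # Hypothetical example: a small, self-contained function with
                # objective pass/fail checks. Nothing equivalent exists for
                # judging whether a diagnosis of depression is correct.
                import unittest

                def median(values):
                    """Return the median of a non-empty list of numbers."""
                    ordered = sorted(values)
                    mid = len(ordered) // 2
                    if len(ordered) % 2:
                        return ordered[mid]
                    return (ordered[mid - 1] + ordered[mid]) / 2

                class MedianTest(unittest.TestCase):
                    def test_odd_length(self):
                        self.assertEqual(median([3, 1, 2]), 2)

                    def test_even_length(self):
                        self.assertEqual(median([4, 1, 3, 2]), 2.5)

                if __name__ == "__main__":
                    unittest.main()

            Either the assertions pass or they don't; there's no equivalent assertEqual for "this client is less depressed than last month".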

            7 votes
            1. [5]
              skybrian
              Link Parent
              There are a lot of things you can test, but there are still unknowns outside of that. You're testing the code, but often it's not the code, it's the environment it runs in. (This is particularly true of performance.)

              Even so, gathering information from machines is a lot easier than from patients.

              1 vote
              1. [4]
                Eji1700
                Link Parent
                I mean, this is a solved scenario as well, just limited by the ability of your team to mock environments and the heavily diminishing returns of doing so.

                In general you can make very, very, very robustly tested and reliable code. It's just that the return on doing so isn't worth the time spent on it. But if someone said tomorrow "prove that this does or doesn't do X" and put a large enough bounty on it, you WILL get results.
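
                As a sketch of what "mock the environment" can look like in practice (the function and environment variable here are hypothetical):

                    import os
                    from unittest.mock import patch

                    # Hypothetical function whose behavior depends on its environment.
                    def greeting():
                        return "hi" if os.environ.get("TONE") == "casual" else "hello"

                    # patch.dict swaps in a controlled environment for the duration
                    # of the block, so the check is deterministic on any machine.
                    with patch.dict(os.environ, {"TONE": "casual"}):
                        assert greeting() == "hi"
                    with patch.dict(os.environ, {}, clear=True):
                        assert greeting() == "hello"

                The diminishing returns show up as the real environment (databases, networks, other services) gets harder to fake faithfully.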

                You literally cannot objectively determine if a diagnosis is correct in a large majority of cases.

                4 votes
                1. skybrian
                  Link Parent
                  It depends on the domain and what you want to know about. It’s difficult to predict whether users will find a game fun. A simulated social network isn’t going to tell you much when trying to predict the patterns of interaction your users will fall into.

                  2 votes
                2. [2]
                  papasquat
                  Link Parent
                  I think there's a huge caveat on your last assertion. There's nothing logically stopping us from objectively measuring a person's brain state and formally determining whether someone is or is not depressed.

                  We just don't currently understand the human brain well enough or have the technology to accurately measure its state. It's not technically impossible, it's just completely infeasible with where we currently are in neuroscience.

                  For all intents and purposes, though, formally proving the code of Windows 11 is just as out of reach as formally proving a psychological diagnosis. Both are completely infeasible in the real world: the first because of the sheer, ridiculous number of man-hours involved, the second because of that and the fact that the technology doesn't exist to do it.

                  2 votes
                  1. sparksbet
                    (edited)
                    Link Parent
                    There's nothing logically stopping us from objectively measuring a person's brain state and formally determining whether someone is or is not depressed.

                    We just don't currently understand the human brain well enough or have the technology to accurately measure its state. It's not technically impossible, it's just completely infeasible with where we currently are in neuroscience.

                    We don't remotely have evidence that depression is something that can be measured in any objective way, including with future technologies, because there is nothing even close to sufficient evidence that depression would be directly observable from the state of one's brain. Currently, the evidence I'm aware of points to the symptoms we classify as depression not necessarily all having the same physiological sources -- different people with depression could have completely different underlying causes. Heck, that's already obviously true to some extent -- I had depression that was caused at least in part by my thyroid not working properly.

                    It's not impossible that eventually we'll develop the ability to directly test for and measure some mental health problems through neuroscientific methods. But it's also not remotely unlikely that for many mental health conditions, doing so is indeed not possible.

                    3 votes
  2. [24]
    Habituallytired
    Link
    I have other providers who use "AI" transcription/translation apps to make sure they get all the info for their notes. I think that's the correct use of AI in medicine, but using GPT to ask it how to comfort someone who has lost a pet, just because you've never had one yourself, seems lazy. How can you not think to say, "I'm sorry for your loss. It must be difficult not having your buddy by your side," on your own?

    Therapists using GPT like Google, asking it how to handle patient situations, is insane to me. They might as well just quit and set you up with GPT before they go.

    17 votes
    1. [15]
      papasquat
      Link Parent
      Personally I have no issues with my therapist using an LLM on their own time to do research into my problem and brainstorm ideas on how to solve them. I'd want them to look at the references that the tool is providing and use their professional experience to evaluate those references before developing a therapy strategy though. I'd similarly have no problem with them using Google to do that research, or talking to colleagues about it, as long as they're not identifying me.

      The issue for me is them furiously typing what I say into chatGPT during my session, just like I'd have a problem with them wildly googling during my session or pausing every 30 seconds to call their mentor during my session.

      It would make me think that
      a. They have no idea wtf they're doing,
      b. They have no respect for me or my time, and
      c. They're completely exposing very confidential information about me without taking the time to anonymize it.

      My main issue isn't with the tool itself, it's with the usage pattern. I'm tired of people trying to outsource their brain to an LLM.

      25 votes
      1. [7]
        IIIIIIIIIIII
        Link Parent
        I had a doctor literally google my symptoms while I was in their office once. I was sitting right next to them. This must have been in about 2012, and the doctor was in their 50s.

        They then asked me what kind of general antibiotics I'd like for what turned out to be a minor chest infection. How the fuck should I know?! It felt like someone on an IT helpdesk asking which drivers I'd like.

        It was the most confusing medical interaction I've ever had. And it did make me think the exact same things as your point A, B, and C.

        11 votes
        1. [6]
          JCPhoenix
          Link Parent
          Asking what medicine you want is definitely weird. I get, to an extent, if the doc explained each medication's possible side effects and was like "And they basically all do the same thing and take the same amount of time to act; have any input?" But to just straight up be like, "Idk, pick one," would make me question if this person was a real doctor or not!

          Though maybe you give off doctor vibes and they thought you were a doctor too!

          Anyway, the Googling of symptoms I don't really have a problem with. I'm assuming you didn't have exotic symptoms the doc had never heard of or seen before. Just run of the mill stuff since it was just a minor chest infection. But I don't see any issues using technology to help narrow down a potential diagnosis. "OK, patient has symptoms A, B, not C, but D. Google, what could this constellation of symptoms be?" At the end of the day, the doctor has to make the determination themselves, but I don't see how that would be any different from consulting a textbook or some other documentation. I don't expect a doctor to know every possible infection from every type of microbial invader, after all. Plus, I feel like I've seen modern electronic record/charting systems at the doctor's office that do some of that already.

          2 votes
          1. Omnicrola
            Link Parent
            At the end of the day, the doctor has to make the determination themselves, but I don't see how that would be any different from consulting a textbook or some other documentation.

            Not the person you're replying to, but the difference to me is huge. A doctor going to a specific website for information is to me the same as going and pulling a reference book off a shelf. It shows that they know at least part of the answer, and where to find the information they need to confirm or expand on it. The same as I do when I have questions about something I have expertise in, I go to a specific set of documentation, forums, or references. I may have even originally found them via Google, but I've read their material enough to trust that they are both useful and generally correct.

            Googling something is the beginning of learning. I don't want a doctor who is beginning, unless I have some weirdly unique thing happening to me.

            8 votes
          2. [4]
            IIIIIIIIIIII
            Link Parent
            Yeah those are good points. I think the thing that threw me most was, as you say, asking what antibiotics I wanted.

            Another big thing was that this was 2012 internet, so the sources he consulted were (in my mind) authoritative. I remember Mayo Clinic being one of them.

            If a doctor used Google to search my symptoms now, even for a minor ailment, I would be horrified!

            5 votes
            1. [2]
              JCPhoenix
              Link Parent
              I'm just imagining some doctor Googling, "patient coughing up red water-ish stuff. What is?"

              4 votes
              1. chocobean
                Link Parent
                Watch them Google "how human breathe" or "human how many hearts" lol

                3 votes
            2. DefinitelyNotAFae
              Link Parent
              I remember the era of using the symptom checkers on WebMD and similar for ... Um... Fun seems like not the word in retrospect. But that era is where the meme of everything being possibly cancer came from. Because that headache could be a symptom of migraine, a cold, or a tumor (and so could the nausea, or the foot pain, or...). I remember a resurgence in 2020 because everything was COVID and cancer. But even in the (seemingly) authoritative days I'd be worried if a doc used it on the diagnostic end barring maybe looking for zebras.

              I do think that since then, hospital systems and provider networks have built out their own summaries of diagnoses with simplified explanations, to circumvent some of the googling or at least give a starting printout/link that is something they have control over.

              2 votes
      2. Habituallytired
        Link Parent
        You know, I think I'm ok with therapists using any resource/research tools at their disposal on their own time, you're right.

        I'm just incensed that these therapists seem to be using these tools DURING the session without explicitly letting their patients know, or (the better option imo) asking permission to use them in the first place.

        I agree, I'm so tired of everyone outsourcing their brains to LLMs. It's something I talk about a lot at work, because I'm one of the few holdouts who refuses to use any of the AI tools available to me for my work.

        10 votes
      3. [5]
        DefinitelyNotAFae
        Link Parent
        I have an issue with it. They really shouldn't be googling that often for a particular client's issue either, unless it's a Google Scholar search. They have professional resources, textbooks, etc. You get (if licensed) a whole Master's degree in this. You take multiple licensure exams and get (and sometimes pay for) professional supervision.

        All of that should teach you a dozen ways to learn something about mental health that aren't Google, much less an LLM. Unless you immediately know what's false, an LLM will just risk planting false info; and if you knew, you probably wouldn't be asking an LLM.

        You're even taught how to support people with things you haven't experienced, specifically. (You're also taught how not to let your own experiences get in the way when supporting people with things you have dealt with.)

        There's a distinct lack of boundaries in this situation that is particularly upsetting. I agree with the rest of your post, though.

        7 votes
        1. [2]
          Raspcoffee
          Link Parent
          I'd also be concerned about culture, tbh. LLMs have already been shown to have cultural biases (defining cultures is difficult and I'm aware of the flaws in this kind of research, but still). With how LLMs are designed to (usually) give an answer that sounds satisfying, I could see them projecting issues in a culturally incompatible way. Feel free to correct me, but it sounds to me like a recipe for problems with ethnic minorities.

          I don't want to rule out LLMs ever being useful - but I wouldn't trust one with this anytime soon, given the issues around mental health that keep popping up.

          3 votes
          1. DefinitelyNotAFae
            (edited)
            Link Parent
            It's something I've harped on before. Cultural competency is an area I can definitely see people trying to shortcut - because there's no easy button for building trust and rapport with different communities - and it's an area LLMs are demonstrably poor in.

            So yeah, absolutely, that's yet another reason. In this article, the violation of confidentiality and the absolute lack of professional boundaries would have me firing a therapist before we even get into the specifics. In my current non-clinical roles, we'd be in the documentation process up to potential termination or non-reappointment.

            1 vote
        2. [2]
          papasquat
          Link Parent
          There is information out there that hasn't been extensively written about in research papers. If a client is part of a not very widely known culture with particular expectations and traditions, it may not be covered in psychological research. Same goes for esoteric hobbies or interests or social movements. It may make sense for a therapist to familiarize themselves with those things between sessions so that they have some common language to understand a client's particular situation. For instance, I've played video games my whole life. I had a therapist who knew that and kept making really terrible video game analogies. She told me she'd been looking up video game stuff between sessions and it helped her understanding of one of my main hobbies significantly.

          Did it tell her how to treat someone who played video games specifically? No, but it did help her connect with me a bit more which probably made her work more effective.

          I have no problem with her doing that, and I wouldn't have a problem with her doing the same thing with LLMs, as long as she's actually verifying the things it's telling her, and applying it through a lens of professional expertise.

          1. DefinitelyNotAFae
            Link Parent
            Sorry, I misunderstood "researching your problem and brainstorming ideas" as actually addressing a particular mental health or related issue, not something that's more rapport building or cultural competency.

            Which, yes, rapport is as important as what therapeutic model you use, so it's pretty important, as is understanding where a client is coming from. Blind googling wouldn't be my go-to; I generally ask my clients for a recommendation to help me understand "thing they like" or "piece of their culture" better, but it's not offensive as a strategy. Still better to start with more scholarly sources for cultural understanding, though.

            and I wouldn't have a problem with her doing the same thing with LLMs, as long as she's actually verifying the things it's telling her, and applying it through a lens of professional expertise.

            I still would. I'm not saying you have to, by any means. If she's only looking up things that are for broader cultural understanding, like hobbies and interests and such, professional expertise won't get her that far in judging the results of an LLM, and I'd rather not risk my therapeutic relationship on hallucinations. Regardless, this was a pivot from what I thought you initially meant; what I was saying is basically that a therapist shouldn't be researching therapy on an LLM at all, IMO, or even Google as a general rule.

      4. Lia
        Link Parent
        Personally I have no issues with my therapist using an LLM on their own time to do research into my problem and brainstorm ideas on how to solve them.

        While I respect your perspective and agree that it's valid, my personal opinion is different. I can use an LLM myself to brainstorm solutions to my problems if I happen to want LLM-driven solutions. That's not what I pay my therapist for.

        I pay him to take in and consider topics that I don't want anyone or anything outside of that office to ever find out about. Let alone some ethically dubious AI firm! I have not authorised him to share my ideas and information with such companies - not even with the healthcare companies that are operating in my country in line with our local legislation. Thankfully he is such a thoughtful person that he brought this up himself during the first session, by volunteering that he isn't keeping session logs and definitely not sharing any of his (handwritten) notes with anyone else. I am still going to explicitly ask him, come next session, that he not type my stuff into an LLM chatbox.

        The level of privacy I need is probably higher than it would be for most people, because my therapist is first and foremost a career coach, so I talk to him about my top-secret business stuff all the time. My professional field is highly competitive and the project I'm doing is on the more challenging end of things, so sometimes I also need emotional support and more traditional therapy techniques to make sure I'm not overstepping my abilities too much, or being driven by some unhealthy emotional patterning that could send me in the wrong direction. The last thing I need is to worry that some of my most sensitive business info could leak before I even have a chance to launch.

        When it comes to more mainstream uses of therapy, having very strong faith that your sessions are private is a key factor in some people's healing. When you feel certain that you are in an environment of absolute privacy - sometimes including feeling like you're the only person in the room, i.e. the therapist isn't even a person at all - you gain better visibility into your unconscious mind's operations. This can very largely determine the results we are able to achieve, depending on what sort of issues the person is trying to heal.

        In fact, I believe the notion that an LLM is a better therapist than a human is linked to the experience that there's no human observer present other than the user. The user is able to be more open and experimental in their role as a "patient" than they can be with a human therapist.

        However, the grim downsides are: i) the sessions are only seemingly private, not truly so (I don't believe for a second that the user's information won't be sold off for targeted advertising), and ii) the LLM has no idea how to provide an effective healing service congruently over multiple sessions and months or years, or how to stay within tolerable boundaries when these are different for each individual client and the methodology that helps redirect the session also isn't the same for everyone. Therapists receive actual training to learn all that, but the training itself isn't enough - you also have to be able to use all five senses to gauge the client's reactions at all times.

        4 votes
    2. [7]
      blivet
      Link Parent
      A friend of mine with a doctorate in psychology once told me that a lot of therapists aren't particularly well suited to their line of work, because they started studying psychology in the first place as a way of trying to learn how to deal with their own mental and emotional problems.

      18 votes
      1. [2]
        EarlyWords
        Link Parent
        That was instantly clear in my introduction to psychology class many years ago in college. Whatever interest I had in the field was immediately lost when I discovered nearly everyone in the class only wanted to discuss their own problems.

        They were nearly as neurotic as the modern dance department!

        7 votes
        1. chocobean
          Link Parent
          dance dept

          And they practice fewer healthy ways to express themselves too, I bet.

          1 vote
      2. [4]
        papasquat
        Link Parent
        If they've learned to deal with those problems, wouldn't that make them particularly well suited to it?

        6 votes
        1. [3]
          blivet
          Link Parent
          Just because they got a degree doesn't mean they succeeded in addressing their own issues.

          15 votes
          1. [2]
            papasquat
            Link Parent
            Yeah, that's fair.

            I guess it just rubs me the wrong way sometimes when people imply that because someone has a mental health problem, they can't be effective at their job (even if they're a therapist). I think a depressed therapist treating someone with depression is probably a horrible idea, but I don't think perfect mental health needs to be a prerequisite for providing mental health care in all cases. I actually think it's pretty common for people in the mental health field to have a therapist of their own to help them deal with the burden of handling everyone else's issues.

            I'm also totally talking out of my ass and have almost no experience in this area, so I could be wrong.

            2 votes
            1. DefinitelyNotAFae
              Link Parent
              It's basically an expectation that you see a therapist of your own; secondary trauma alone is a lot to deal with.

              But, I'd say at least for me, it's not a place of stigma. There are definitely people who go into the field for the wrong reasons - because of their own mental health struggles, or because they're always the helper for their friends, or any number of things they think prepare them for the work - but IME it actually requires you to unlearn bad habits, set even stronger boundaries, and deal with your own mental health in a way that not everyone is willing and able to do. You can't just shove your mental health in a box and take care of others without bad stuff eventually happening there. Having diagnoses, or struggles, or whatever is not disqualifying in and of itself.

              A lot of relationships broke up in my grad school cohort, for example, for two major reasons that rose above the static of general breakups. The first was that people were analyzing their relationship through this newly taught professional lens and becoming dissatisfied; the other was that they tried to misuse their new therapy skills on their partner and were a real toxic asshat all of a sudden.

              There are shitty therapists who make it through school and internship and licensure, like any profession, but there's a lot of weeding out on the way for that reason.

              2 votes
    3. MimicSquid
      Link Parent
      At the very least, someone who's a trained therapist getting a small topic refresher from GPT to help with an unfamiliar situation is worlds better than an unfiltered connection. The therapist is a mandatory reporter with regard to impending harm to the patient or others, or various types of abuse of vulnerable people, while these AI programs have egged people on.

      5 votes
  3. DefinitelyNotAFae
    Link
    I don't think this is particularly common, but I'm quite glad we've banned AI therapy in IL. All of this was incredibly unprofessional. This is why CEs and professional supervision are part of the requirements for licensure. This is up there with the nurse live-streaming handing meds out on her hospital floor. It's an extreme outlier but also a huge red flag for the training new professionals are getting.

    I swear to gods we learn what to do in the classroom when someone isn't making progress. There are plenty of hard things about therapy, but this is nowhere near the solution.

    You have professional resources for a reason. You should rarely be googling, much less using ChatGPT, for answers.

    Ugh.

    13 votes
  4. lou
    Link
    It doesn't seem smart to make yourself redundant.

    5 votes
  5. Lia
    (edited)
    Link
    I responded to some comments before even reading the article, and I still have only read the first two paragraphs, and I'm already noping out.

    The way a therapist is supposed to listen to the client requires extreme focus. You can't possibly direct your attention to chatting with an LLM simultaneously, not if you're being an actual therapist. Anyone doing this should lose their license, and to be fair, they probably would in my country where licenses are more strictly regulated and adherence to them actively monitored by the government.

    I can see value in a service where a mental health professional would summarise the client's speech into professionally valid questions to feed to an LLM, filter the answers using his professional judgment, and finally deliver the result to the client utilising his Human Expression Enhancement Suite™ - now with the updated Body Language Extension package! But under no circumstances should it be called psychotherapy.

    3 votes
  6. Parou
    Link
    I wouldn't be surprised if the therapists who somehow know nothing about their profession and diagnose people (or rather: withhold diagnoses) purely based on your vibes in the first two minutes of meeting them, instead of actual diagnostic criteria and tests, are especially overrepresented in the demographic of therapists depending on ChatGPT.

    3 votes