• WhatAmLemmy@lemmy.world · 1 day ago · +38/−4

    Well, AI therapy is more likely to harm their mental health, even to the point of encouraging suicide (as certain cases have already shown).

    • FosterMolasses@leminal.space · 1 day ago · +9/−1

      There’s evidence that a lot of suicide hotlines can be just as bad. You hear awful stories all the time of overwhelmed or fed up operators taking it out on the caller. There’s some real evil people out there. And not everyone has access to a dedicated therapist who wants to help.

    • Cybersteel@lemmy.world · 1 day ago · +6/−2

      Suicide is big business. There’s infrastructure readily available to reap financial rewards from the activity, at least in the US.

    • atmorous@lemmy.world · 1 day ago · +3/−1

      More so from the corporate proprietary ones, no? At least I hope those are the only cases. The open-source ones suggest genuinely useful approaches that the proprietary ones do not. I don’t rely on open-source AI myself, but they are definitely better.

      • SSUPII@sopuli.xyz · 22 hours ago · +3

        The corporate models are actually much better at this, due to the heavy filtering built in. The claim that these models generally encourage self-harm is simply false, which you can verify right now by pretending to be suicidal on ChatGPT: it will adamantly push you to seek help.

        The filters and safety nets can be bypassed no matter how robust you make them, which is why we have seen some unfortunate news.

    • whiwake@sh.itjust.works · 1 day ago · +9/−14

      Real therapy isn’t always better. At least there you can get medication. But neither is a guarantee of making life better, and for a lot of people, life isn’t going to get better anyway.

            • whiwake@sh.itjust.works · 18 hours ago · +1/−7

              Compare, as in equal? No. You can’t “game” a person (usually) like you can game an AI.

              Now, answer my question.

                • whiwake@sh.itjust.works · 17 hours ago · +2/−3

                  Answer my question, or just admit you refuse to engage in conversation and we can part ways.

                  • Kami@lemmy.dbzer0.com · 17 hours ago · +3

                    There is no conversation to be had here, because you have no point.

                    A text generator is not someone you talk to; it’s a thing that takes your input and outputs whatever text is most likely relevant to that input.

                    No reasoning. No knowledge.

                    Taking the example from one of the other comments here: you could paint a face on a ball and keep yourself mentally stable on a desert island, but the ball isn’t doing anything; you’re still alone on the island.

                    Apply that to getting mental help from an LLM and you’ll see how creepy it is.

      • CatsPajamas@lemmy.dbzer0.com · 24 hours ago · +7/−4

        Real therapy is definitely better than an AI. That said, an AI will never encourage self-harm without significant gaming.

        • whiwake@sh.itjust.works · 19 hours ago · +5/−2

          AI “therapy” can be very effective without the gaming, but the problem is most people want it to tell them what they want to hear. Real therapy is not “fun” because a therapist will challenge you on your bullshit and not let you shape the conversation.

          I find it does a pretty good job with pro-and-con lists, laying out several options, and reframing situations. I have found it very useful, but I have learned not to manipulate it, or its advice just becomes me convincing myself of something.

        • triptrapper@lemmy.world · 21 hours ago · +2/−2

          I agree, and to the comment above you: it’s not because it’s guaranteed to reduce symptoms. There are many ways in which talking with another person is good for us.