Fair enough. I wouldn’t even consider seeing a therapist who used an LLM in any capacity, let alone letting an LLM be the therapist. Sadly, I think the people who would make the mistake of doing just that probably won’t be swayed, but it’s fair enough to raise awareness.
Sadly, given how this tech is going, I don’t think it’s possible to stop the masses from using it like that.
I just hope that the people who do would at least be aware of its shortcomings.
I myself would never use it like that, but I understand the appeal. There’s no awkwardness because it isn’t a person, it tends to be extremely supportive and agreeable, and many people perceive it as intelligent. All of that combined makes it sound like a really good therapist, but that of course misses the core issues with this tech.
I agree. I guess my point was that people need to be aware of how erratic AI models can be, and to always be careful about sensitive topics with them.
If I were to use an LLM as a therapist, I would be extremely skeptical of anything it says, and doubly so when it confirms my own beliefs.