Yeah, ChatGPT is incredibly sycophantic. It's basically programmed to make you feel good and affirm you, even when that's actually counterproductive and damaging. If you talk to it enough, you end up seeing how much of a brown-nosing kiss-ass they've made it.
My friend with a mental illness wants to stop taking her medication? She explains this to ChatGPT. ChatGPT “sees” that she dislikes having to take meds, so it encourages her to stop to make her “feel better”.
A meth user is struggling to quit? They tell this to ChatGPT. ChatGPT "sees" how the user is suffering and encourages them to take meth to help ease their suffering.
Thing is, they have actually programmed some responses into it that are vehemently against self-harm. Suicide is one topic where, thankfully, even if you use flowery language to describe it, ChatGPT will vehemently oppose you.