A new study in Frontiers in Psychology finds that AI models like ChatGPT-4 can outperform human psychologists in understanding and responding to emotions during counseling.
I'll admit I have used it as a "therapist" of sorts, telling it my troubles and asking it to analyze a bit and offer possible points of action.
It's actually crazy good. I mean, I've been to plenty of real therapists, and honestly, GPT comes across as more compassionate than most of them, which is saying something. The advice it offers is also genuinely good and actionable, instead of useless psychobabble. It also doesn't try to rope you into MLMs or alternative shit that never works. Its answers are clearly drawn from CBT and other proven, evidence-based approaches.
I know it's become something of a joke that "AI" in this day and age isn't much more than predictive fill, but it's a lot better than that. At some things, at least.
None, actually; I just typed out my issues as if I was chatting with a friend online. As long as you're descriptive enough that it has something to work with, you should get decent answers.
The best therapy is objective. As such, a well-tuned "predictive fill" ought to be a solid fit for this purpose. But this is the exception that proves the rule; it doesn't mean an LLM is something other than what it is. So I wouldn't read too much into it...