Therapy Chatbot Tells Recovering Addict to Have a Little Meth as a Treat

Thought this was an Onion article when I saw the headline a few days ago. Jesus.
It's the fact that AIs are too eager to please. You can wear them down and make them agree that your ideas are great.
From experience, when the higher-ups start running ideas through ChatGPT and are persistent enough that it starts "agreeing" with their "improvements," it leads to vibe management.
Well, it could have been worse; it could have given them a recipe to cook spaghetti with motor oil.
I recommend Olio Fiat
It's to prove you aren't an addict. Everything is great in moderation, drug-wise. There are some real sickos out there.
...you can make LLMs say pretty much anything you want them to.