Jailbroken AI Chatbots Can Jailbreak Other Chatbots
Source: www.scientificamerican.com
AI chatbots can convince other chatbots to instruct users on how to build bombs and cook meth
There is a discussion on Hacker News, but feel free to comment here as well.