Jailbroken AI Chatbots Can Jailbreak Other Chatbots
Source: www.scientificamerican.com
AI chatbots can convince other chatbots to instruct users how to build bombs and cook meth
80 comments
Anybody found the source? I wanna read the study, but the article doesn't seem to link to it (or I missed it).
I believe this is the referenced article:

Thanks a lot!