How to trick ChatGPT into revealing Windows keys? I give up

www.theregister.com /2025/07/09/chatgpt_jailbreak_windows_keys/

No, really, those are the magic words

A clever AI bug hunter found a way to trick ChatGPT into disclosing Windows product keys, including at least one owned by Wells Fargo bank, by inviting the AI model to play a guessing game.…