
How to trick ChatGPT into revealing Windows keys? I give up

www.theregister.com/2025/07/09/chatgpt_jailbreak_windows_keys/

No, really, those are the magic words

A clever AI bug hunter found a way to trick ChatGPT into disclosing Windows product keys, including at least one owned by Wells Fargo bank, by inviting the AI model to play a guessing game.…

3 comments
  • Hacking in the future is going to be so stupid

  • Before I read this and realized "I give up" was the trigger for getting the key, I read the headline as "AI? I can't even…"

  • Why is the model trained on real Windows keys in the first place?