
"Google Gemini tried to kill me"

I followed these steps, but just so happened to check on my mason jar 3-4 days in and saw tiny carbonation bubbles rapidly rising throughout.

I thought that may just be part of the process but double checked with a Google search on day 7 (when there were no bubbles in the container at all).

Turns out I had just grown a botulism culture, and garlic in olive oil specifically is a fairly common way to grow this biotoxin.

Had I not checked on it 3-4 days in I'd have been none the wiser and would have Darwinned my entire family.

Prompt with care and never trust AI dear people...

23 comments
  • never trust AI

    Statements from LLMs are to be seen as hallucinations unless proven otherwise by classic research.

  • headline is inaccurate and downplays the incredible potential of ai. Google Gemini tried to kill this person AND their entire family

    • mods can you please ban "david gerard" or whatever his name really is. ai hate is already out of hand without people coming to push their agenda like this

      • unfortunately I am firmly in the pocket of the concept of fiat money, big small data, and whatever the opposite of a metaverse is

        but also,

        mods can you please ban “david gerard”

        if I ever release an experimental electronic album I’m calling dibs on this track name

  • It’s slowly refining its approach. No-one went for the pizza glue or eating rocks, so…

    Reddit still delivers sometimes.

  • Huh. I was making my own garlic oil this way (without advice from an LLM, mind you) and I was today years old when I learned this carries the risk of botulism (albeit small). So in a way, an LLM has potentially saved my life by causing the chain of events that taught me something new.