I guess they weren’t paying quite enough to have offline backups? I believe financial institutions can keep stuff stored in caves (think records of all the mortgages a bank wants to be repaid for - data loss isn’t an option).
Backups all tied to the same Google account that got mistakenly terminated, and automation did the rest?
It didn't matter that they might have had backups on different services; since it was all centralised through Google, it was all blown away simultaneously.
It's weird that the backups got deleted immediately. I would imagine they'd be marked for deletion but only actually deleted something like a month later, precisely to prevent this kind of issue.
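A minimal sketch of what such a delayed-purge ("soft delete") scheme might look like. The 30-day window, class, and method names here are all hypothetical, not how any particular provider actually implements it:

```python
import datetime as dt

GRACE_PERIOD = dt.timedelta(days=30)  # hypothetical retention window


class Store:
    """Toy object store where delete() only marks data; a later job purges it."""

    def __init__(self):
        self.objects = {}     # key -> data
        self.tombstones = {}  # key -> time it was marked for deletion

    def delete(self, key, now):
        # Mark for deletion instead of purging immediately.
        self.tombstones[key] = now

    def restore(self, key, now):
        # An undelete works only while the grace period is still open.
        if key in self.tombstones and now - self.tombstones[key] < GRACE_PERIOD:
            del self.tombstones[key]
            return True
        return False

    def purge_expired(self, now):
        # A background job removes data for real once the window has passed.
        for key, marked_at in list(self.tombstones.items()):
            if now - marked_at >= GRACE_PERIOD:
                del self.objects[key]
                del self.tombstones[key]
```

The point of the design is that an accidental (or automated) deletion is reversible for a month, so a mistakenly terminated account could still be recovered.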
My first job was in a Big Iron shop in the late 80s, where I was in charge of backups. We kept three sets of backups on two different media: one on hand, one in a different location in the main building in a water- and fireproof safe, and one offsite. We had a major failure one day and had to do a restore.
Both in-house copies failed to restore; we were in a panic. Thankfully the offsite copy worked. That taught me to keep all my important data in three sets.
As the old saying goes: Data loss is not an if question, but a when question. Also, remember that "the cloud" simply means someone else's remote servers over which you have no control.
“Unprecedented” is kinda hot right now. It tries to head off too much blame being heaped on: “obviously we prepare for the usual and even the unexpected, but this has literally never happened before (give us another shot pls)”.
So it’s interesting that the same word takes on a different context for the news when said breathlessly: “UNPRECEDENTED failure!”
Follow the 3-2-1 rule for your important data, ideally 4-3-2 or better. Remember, if you only have one copy of your data, you actually have zero copies of your data.
3 separate backups on
2 different media (e.g. two backups on two separate HDDs plus one on DVDs) with
At least 1 offsite (e.g. a satellite office, or your parents' house for personal stuff)