Did they mistake it for one of their own services people were using?
We need to have something like Reddit gold, but the money goes to a charity of your choice, and whoever you gilded gets a badge next to their comment. I’d do that to this if I could.
But how would a capitalist benefit from that?
Sigh. That is the world we live in.
Just give a Lemmy Lemon 🍋 and donate to the charity of your choice.
Or we could just add the Yeah button from the Miiverse
Does this mean we can put that account on https://killedbygoogle.com/ ?
yes
Man nice site
Fucking gold!
😂
deleted by creator
And it’s a sad, sad day when the situation in xkcd 908 looks like an improvement over even one of the commercial offerings.
Better article:
https://www.theregister.com/2024/05/09/unisuper_google_cloud_outage_caused/
They restored from another cloud service. Were I in charge, I’d still be leery of not having that data on my own drives. I have my Windows libraries mapped to my ghetto RAID 0, and those folders are in turn backed up to Google. If all else fails, I have a local backup. And this story reminds me, I haven’t installed VEEAM on this new PC…
Yeah, this has definitely happened before, we just don’t hear about it in the news. I am personally aware of a Canadian non-profit whose Google accounts were nuked with no notice or explanation last year, leading to massive disruptions for 150 staff and even more clients. They never found out why, and had to restore from backups onto a brand new Google business account
had to restore from backups onto a brand new Google business account
Thus proving that they learned nothing from the experience.
Waiting for the news: “Google deleted a user’s account; now they’ve lost access to their passkeys and, with that, to all other services.” It can only be a matter of days until it happens.
Happened all the time over on r/androiddev. A small company brings on the wrong person, uses the wrong SDK, or wrongfully fails a review, and their account is then banned via “association”, which then propagates down to countless other employees. The only way out is to hope and pray that a human sees the appeal, or try to blow up online.
It happened so often, in fact, that the subreddit even created several guides on how to avoid it. My favourite part is that even unpublished apps must be updated in perpetuity to abide by Google’s ever-changing requirements.
Or this other occasion where viewers of one of the most popular YouTubers in the world were banned for typing in chat
Meh. Not all Indians lol
Backup was on Azure. I get the sentiment on the cloud, but there is no excuse for this incompetence at Google.
“This should not have happened.”
Duh, ya think?
Google Sales Engineer: oh I see you didn’t purchase the “Do not randomly nuke my cloud” option… well there’s the problem.
But you can’t trust regular people to have open source ASI, but don’t worry, we won’t fuck it up.
deleted by creator
Oopsies lol
Whoopsie!
This is a “one of a kind” error.
OK, the fact that it can happen at all is a problem. And sorry, but the idiots who put their data in with Google should be fired.
I get offloading risk, but little good that will do when your company goes tits up.
Where would you put their data then? Self hosting is not exactly safe either.
At the end of the day, every approach has its tradeoffs. Using a reputable cloud provider is a very valid choice.
Thank you! Every time a story like this comes up, people seem to wanna pretend managing your own hardware is all sunshine and rainbows. Especially if you want global scale or as little downtime as possible, a cloud provider is your best bet, albeit one where you have less control than you would with your own servers.
Opinion: You should be building on top of open source platforms and tools (Docker, Kubernetes if you need it… granted I’m not an expert in this area) to mitigate some of the vendor lock-in, and take a multi-cloud approach. If you’re mainly hosting on GCP, for example, host smaller deployments on AWS, Azure, Cloudflare, or something else as a contingency… eventually you can also add, or just move to, your own servers relatively painlessly. Also AGGRESSIVELY back up your database in multiple places.
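The “back up in multiple places” part can be as simple as fanning one dump out to several independent targets and checksum-verifying each copy. A minimal Python sketch of the idea, using local directories to stand in for hypothetical cloud buckets (the names `gcp`, `aws`, `onprem` are just illustrations, not real provider APIs):

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    """Checksum used to verify each backup copy against the source."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def fan_out_backup(dump: Path, targets: list[Path]) -> list[Path]:
    """Copy one database dump to several independent targets,
    verifying every copy; raise if any copy is corrupt."""
    src_digest = sha256(dump)
    copies = []
    for target in targets:
        target.mkdir(parents=True, exist_ok=True)
        dest = target / dump.name
        shutil.copy2(dump, dest)  # preserves mtime alongside contents
        if sha256(dest) != src_digest:
            raise IOError(f"backup to {target} is corrupt")
        copies.append(dest)
    return copies

# Demo: local directories stand in for independent cloud/on-prem targets.
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    dump = root / "db.dump"
    dump.write_bytes(b"pretend pg_dump output")
    copies = fan_out_backup(dump, [root / "gcp", root / "aws", root / "onprem"])
    print(len(copies))  # 3 verified copies
```

In practice you’d swap the `shutil.copy2` step for uploads via each provider’s own tooling, but the point stands: the copies only protect you if they live under accounts no single vendor can delete.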
Whoopsie my bad guys
This is the best summary I could come up with:
The company accidentally erased the private Google Cloud account of a $125 billion Australian pension fund, UniSuper.
“This is an isolated, ‘one-of-a-kind occurrence’ that has never before occurred with any of Google Cloud’s clients globally,” Google Cloud CEO Thomas Kurian and UniSuper CEO Peter Chun said in a joint statement obtained by The Guardian May 8.
Google Cloud has identified the events that led to this disruption and taken measures to ensure this does not happen again.”
And nearly half a million companies across the globe use Google Cloud as a “platform-as-a-service,” or client-facing tool, including Volkswagen and Royal Bank of Canada.
The National Security Agency inked a $10 billion deal with Amazon to move its intelligence surveillance data onto the company’s cloud.
And the Pentagon has a $9 billion contract with Microsoft, Google, Oracle, and Amazon for cloud computing services.
The original article contains 272 words, the summary contains 141 words. Saved 48%. I’m a bot and I’m open source!
It has happened before. They just swept it under the rug and blamed the client.
A user was setting up a new laptop and synced an empty folder with Google Drive, intending to download the account’s data to their machine. A bug caused it to treat the empty folder as the master, and it began erasing the Drive contents.
After two weeks of pestering Google, they relented and pulled from the backups they had sworn didn’t exist.
19 billion dollars between them and they can’t do it themselves? They need Amazon and Microsoft?
Much cheaper to have an external company do these things sometimes
It also lets you pass the buck in case of issues from a manager’s perspective.
Good way to pay your informants.