More than half a million UniSuper fund members went a week with no access to their superannuation accounts after a “one-of-a-kind” Google Cloud “misconfiguration” led to the financial services provider’s private cloud account being deleted, Google and UniSuper have revealed.
The most surprising thing here is that they got in contact with a human at Google Cloud to resolve the issue.
It’s easier when you’ve got $146BN moving through you.
Imagine this happens to some random personal account… It’d probably be gone for good.
For several months people had been complaining that their data was getting deleted, and Google just ignored the whole thing until it blew up on Hacker News.
Clouds… clouds everywhere…
This is the best summary I could come up with:
More than half a million UniSuper fund members went a week with no access to their superannuation accounts after a “one-of-a-kind” Google Cloud “misconfiguration” led to the financial services provider’s private cloud account being deleted, Google and UniSuper have revealed.
Services began being restored for UniSuper customers on Thursday, more than a week after the system went offline.
Investment account balances would reflect last week’s figures and UniSuper said those would be updated as quickly as possible.
In an extraordinary joint statement from Chun and the global CEO for Google Cloud, Thomas Kurian, the pair apologised to members for the outage, and said it had been “extremely frustrating and disappointing”.
“These backups have minimised data loss, and significantly improved the ability of UniSuper and Google Cloud to complete the restoration,” the pair said.
“Restoring UniSuper’s Private Cloud instance has called for an incredible amount of focus, effort, and partnership between our teams to enable an extensive recovery of all the core systems.”
The original article contains 412 words, the summary contains 162 words. Saved 61%. I’m a bot and I’m open source!
And the crazy part is that it sounds like Google didn’t have backups of this data after the account was deleted. The only reason they were able to restore the data was because UniSuper had a backup on another provider.
This should make anyone really think hard about the situation before using Google’s cloud. Sure, it is good practice and frankly refreshing to hear that a company actually backed up away from their primary cloud infrastructure, but I’m surprised Google themselves do not keep backups for a while after an account is deleted.
Actually, it highlights the importance of a proper distributed backup strategy and disaster recovery plan.
The same can probably happen on AWS, Azure, any data center really.
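To make the “distributed backup” idea concrete, here is a minimal sketch of an off-provider copy job: it mirrors a Google Cloud Storage bucket into an S3-compatible bucket at a second provider. The bucket names, endpoint URL, and credential setup are made-up placeholders, and this is only an illustration of the pattern, not anything UniSuper actually runs; at real scale you would stream objects or use a managed transfer service rather than pulling each one into memory.

```python
# Minimal sketch: mirror objects from a Google Cloud Storage bucket to an
# S3-compatible bucket at a different provider. All names below are placeholders.
import boto3
from google.cloud import storage

SOURCE_BUCKET = "example-prod-data"        # hypothetical GCS bucket
DEST_BUCKET = "offsite-backup"             # hypothetical bucket at another provider
DEST_ENDPOINT = "https://s3.example-provider.com"  # placeholder S3-compatible endpoint

gcs = storage.Client()                               # auth via GOOGLE_APPLICATION_CREDENTIALS
s3 = boto3.client("s3", endpoint_url=DEST_ENDPOINT)  # auth via the usual AWS-style env vars

def mirror_bucket() -> None:
    """Copy every object in the GCS bucket to the off-provider bucket."""
    for blob in gcs.list_blobs(SOURCE_BUCKET):
        data = blob.download_as_bytes()
        s3.put_object(Bucket=DEST_BUCKET, Key=blob.name, Body=data)
        print(f"copied {blob.name} ({len(data)} bytes)")

if __name__ == "__main__":
    mirror_bucket()
```

The point of the pattern is simply that the copy lives under different credentials and a different provider, so one deleted account can’t take both out.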
Uh, yeah, that’s why I said
it is good practice and frankly refreshing to hear that a company actually backed up away from their primary cloud infrastructure
The same can probably happen on AWS, Azure, any data center really
Sure, if you colocate in another datacenter and it isn’t your own, they aren’t backing your data up without some sort of other agreement and configuration. I’m not sure about AWS, but Azure actually has offline, geographically separate backup options.
I use AWS to host a fair number of servers and some microservices, and with those, if you don’t build backups into your architecture design and the live data gets corrupted, you are screwed.
They give you the tools to build it all, but it is up to you as the sysadmin/engineer/dev to actually use those tools.
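For what it’s worth, “actually using those tools” on AWS usually looks something like an AWS Backup plan with a cross-region copy rule. This is just a hypothetical sketch, not the setup described above; the vault names, account ID, IAM role ARN, tag, and schedule are placeholders.

```python
# Minimal sketch: an AWS Backup plan that snapshots tagged resources daily
# and copies each recovery point to a vault in a second region.
# Vault names, account ID, IAM role ARN, and the tag are placeholders.
import boto3

backup = boto3.client("backup", region_name="ap-southeast-2")

plan = backup.create_backup_plan(
    BackupPlan={
        "BackupPlanName": "daily-with-offsite-copy",
        "Rules": [
            {
                "RuleName": "daily",
                "TargetBackupVaultName": "primary-vault",
                "ScheduleExpression": "cron(0 17 * * ? *)",  # once a day, UTC
                "Lifecycle": {"DeleteAfterDays": 35},
                "CopyActions": [
                    {
                        # Keep an independent copy in another region's vault.
                        "DestinationBackupVaultArn": (
                            "arn:aws:backup:us-west-2:123456789012:"
                            "backup-vault:secondary-vault"
                        ),
                        "Lifecycle": {"DeleteAfterDays": 90},
                    }
                ],
            }
        ],
    }
)

# Attach resources to the plan by tag, so anything tagged backup=true is covered.
backup.create_backup_selection(
    BackupPlanId=plan["BackupPlanId"],
    BackupSelection={
        "SelectionName": "tagged-resources",
        "IamRoleArn": "arn:aws:iam::123456789012:role/service-role/AWSBackupDefaultServiceRole",
        "ListOfTags": [
            {"ConditionType": "STRINGEQUALS", "ConditionKey": "backup", "ConditionValue": "true"}
        ],
    },
)
```

Even then, copies that live under the same provider and account share its blast radius, which is exactly why UniSuper’s backup at another provider is the part that saved them.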
The IT guy who set up that backup deserves a hell of a bonus.
A lot of people would have been happy with their multi region resiliency and stopped there.
No, they had backups. They deleted those, too.
Google Cloud definitely backs up data. Specifically, I said
after an account is deleted.
The surprise here is that those backups are gone (or unrecoverable) after the account is deleted.
A replica is not a backup.
You just know the IT guy who restored it was like, “Y’ALL REAL QUIET WITH THAT ‘WHAT DO YOU EVEN DO HERE’ SHIT.”