A Dutch cybersecurity researcher recently came across a 4-terabyte SQL Server backup file sitting wide open on the internet — and it belonged to EY, one of the world’s largest accounting and consulting firms.
According to a report from Neo Security, the exposed backup contained API keys, authentication tokens, passwords, and service accounts — essentially the company’s digital DNA. Even more worrying, the file was unencrypted.
“Finding a 4TB SQL backup exposed to the public internet is like finding the blueprint and the keys to a vault — with a note that says ‘free to a good home,’” wrote Neo Security in its analysis.
A Classic Mistake, Still Happening
The leak appears to have been caused by a misconfigured cloud storage bucket, one of the oldest and most avoidable errors in the modern IT playbook.
Neo Security described it as “a case of convenience gone wrong.” With most cloud platforms, exporting a full database is just a matter of a few clicks — select the database, choose a bucket, and let the system do the rest.
The problem? One wrong click or a missing access control setting can make the bucket public. And from there, automated scanners roaming the web pick it up within minutes.
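To make that concrete: on most platforms, the difference between a private export and a public one comes down to a single access setting on the destination bucket. The report does not name the cloud provider involved, so the sketch below assumes AWS S3 and the boto3 library purely for illustration, and the bucket name is hypothetical.

```python
# Illustrative only: the report does not say which cloud platform was involved.
# This sketch shows how one bucket-level setting decides whether an exported
# backup can ever become public on AWS S3.
import boto3

s3 = boto3.client("s3")
BACKUP_BUCKET = "example-db-export-bucket"  # hypothetical bucket name

# Turn on all four "Block Public Access" switches for the destination bucket
# before any database export lands in it. With these enforced, a stray public
# ACL or bucket policy cannot expose the backup.
s3.put_public_access_block(
    Bucket=BACKUP_BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```

Enforced at the bucket (or account) level, a setting like this means that a later wrong click on an ACL or policy cannot quietly make the backup public.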
This wasn’t the first time Neo Security had seen such a scenario. In a previous investigation, a ransomware case began with a simple eight-kilobyte web.config file that exposed a database connection string. Eight kilobytes — not terabytes.
The Dangerous Comfort of the Cloud
There’s a pattern here. The tools we use every day are built for speed and ease, not necessarily for caution. Engineers under pressure to deliver migrations or backups often focus on finishing the job, assuming that the default settings are safe.
But defaults don’t think for you. They don’t warn you that your entire customer database is now sitting in a public folder.
It’s easy to forget that the same simplicity that accelerates workflows also accelerates mistakes.
EY’s Response
By the time Neo Security confirmed what it had found, the weekend had already begun. The researcher spent hours sending urgent messages through LinkedIn to reach EY’s security team.
To EY’s credit, the response was described as “professional and effective.” The exposed backup was locked down within a week, and the company moved quickly to contain any potential damage.
Still, the question lingers: how long was it out there before someone noticed?
No one knows for sure — but in cybersecurity, if data was exposed, it’s safest to assume it was compromised.
What This Incident Tells Us
This isn’t just an EY story. It’s a reminder for every organization — large or small — that the simplest errors can have massive consequences.
A few key lessons stand out:
- Cloud convenience needs discipline. The easier it gets to store and back up data, the easier it becomes to overlook security settings.
- Encryption isn’t optional. Every backup file should be encrypted, no matter how temporary it seems.
- Automation helps, but validation matters. Continuous exposure scans and access-control audits should be routine, not reactive; a minimal sketch of what such a check might look like follows this list.
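As an illustration of those last two points, the sketch below walks every storage bucket in an account and flags any that is not locked down or has no default encryption configuration. Since the source does not say which cloud platform EY used, AWS S3 and boto3 are assumed here purely as an example environment.

```python
# A minimal audit sketch: flag S3 buckets that lack Block Public Access or a
# default server-side encryption configuration. Illustrative only.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")


def audit_buckets():
    """Warn about buckets that are not fully locked down or unencrypted by default."""
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]

        # 1. Is Block Public Access fully enforced?
        try:
            cfg = s3.get_public_access_block(Bucket=name)[
                "PublicAccessBlockConfiguration"
            ]
            fully_blocked = all(cfg.values())
        except ClientError:
            fully_blocked = False  # no configuration set at all
        if not fully_blocked:
            print(f"[WARN] {name}: public access is not fully blocked")

        # 2. Is a default server-side encryption configuration present?
        try:
            s3.get_bucket_encryption(Bucket=name)
        except ClientError:
            print(f"[WARN] {name}: no default encryption configuration")


if __name__ == "__main__":
    audit_buckets()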
Because let’s be honest — no one plans to misconfigure a bucket. It happens in seconds, and sometimes the fallout lasts for years.
A Question of Trust
For a firm like EY, whose business depends on confidentiality and trust, such an incident is particularly painful. Clients expect their auditors to protect sensitive data as tightly as financial assets.
When the guardians of governance slip, the message is clear: no one is immune from human error.
Cybersecurity today isn’t about building higher walls. It’s about creating habits, checks, and culture that make such missteps less likely in the first place.
Final Thought
In the end, this was a wake-up call — not just for EY, but for all of us who rely on cloud platforms every day.
Security isn’t something we add at the end of a process; it’s something we practice with every decision, every click, every upload.
Because sometimes, the costliest breaches begin with the smallest checkbox left unticked.
