Ransomware Has Pushed Backup to the Breaking Point

Increasingly, when ransomware successfully infiltrates a large company and encrypts its data, the company pays the ransom, which comes as a surprise to many. For instance, when Colonial Pipeline was hit by a ransomware attack in 2021 and shut down operations, the company paid a $4.4 million ransom to recover its business systems. It’s unthinkable that a company of that size, responsible for such critical infrastructure, wouldn’t have backups in place. Why not just recover the data from the backups the company almost certainly had?

According to a report in the Wall Street Journal, Colonial’s CEO said the company decided to pay the ransom because it was unsure how badly its systems had been breached and didn’t know how long it would take to bring them back. The issue, ultimately, was time to recovery. Colonial decided it would be less expensive to pay the ransom and obtain the decryption key than to wait until it could fully recover from its backup files—even though it knew it would take weeks to fully decrypt the affected data.

So, the answer to the question of why a large enterprise would choose to pay a ransom instead of relying on their backups to recover data is actually pretty simple: Backup is broken.

Ransomware Will Come for Us All

First, it’s important to understand that, eventually, every large business is going to fall victim to a successful ransomware attack. It’s inevitable. If a company has thousands or tens of thousands of employees, all a criminal organization has to do is trick a very small number of them into downloading its malware—and if the malware is sophisticated enough, one victim is all it takes.

Once the cybercriminals have made a successful breach, the software-based portion of the attack begins. Modern ransomware is patient, quietly worming its way through the network until it is everywhere. Once it starts encrypting files, it’s too late.

So while prevention is important, companies have to think about recovery. And traditional enterprise backup systems are failing on that front.

To illustrate, let’s say IT detects a ransomware attack one week after the malware began eating through company data. With a traditional backup solution, the entire file system has to be rolled back to a point prior to infiltration, which means the organization loses seven days’ worth of data. IT has to rehydrate data stored in backups, copy clean files over to a restored server and move them back into production. If we’re talking about virtual machines or databases, this process can be accomplished fairly quickly. But the vast majority of enterprise data is file data, and rebuilding a file server that houses terabytes of data takes weeks even with high-speed transfer.
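To put rough numbers on that recovery window, here is a minimal back-of-the-envelope sketch in Python. The server size and throughput figures are illustrative assumptions, not benchmarks; the point is that per-file overhead and backup rehydration push effective throughput far below raw line speed.

def restore_days(total_tb: float, effective_mb_per_s: float) -> float:
    """Days needed to move total_tb of file data at a sustained rate."""
    total_mb = total_tb * 1024 * 1024   # TB -> MB
    seconds = total_mb / effective_mb_per_s
    return seconds / 86_400             # seconds -> days

# A hypothetical 200 TB file server over a 10 Gb/s link (~1,250 MB/s raw):
print(f"{restore_days(200, 1250):.1f} days at line speed")   # ~1.9 days
# With small-file and rehydration overhead holding throughput to ~100 MB/s:
print(f"{restore_days(200, 100):.1f} days in practice")      # ~24 days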

File system versioning provides a better alternative to traditional file backup, but block-based versioned storage area networks (SANs) can only keep a limited number of versions or snapshots. Unless the breach is detected within a few days, IT will have to rely on their backups to restore file systems.

A Cloud-Based Alternative to Traditional Backup

Cloud-native file storage, however, provides an effective backstop against ransomware attacks. For starters, backups can be kept in object storage in read-only format so they’re impervious to encryption by malware. At first glance, object storage doesn’t lend itself to serving file data. But through the use of sophisticated software and storage snapshot technology, it’s possible to build an object-based cloud-native file system that has the look and feel of a traditional file share.
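As a concrete illustration of the read-only point, here is a minimal sketch in Python using Amazon S3 Object Lock, one widely available way to make backup objects immutable. It is an example under stated assumptions, not a description of any particular vendor’s implementation; the bucket and key names are hypothetical, and the bucket is assumed to have been created with Object Lock enabled.

from datetime import datetime, timedelta, timezone
import boto3

s3 = boto3.client("s3")

with open("finance-share-snapshot.tar", "rb") as snapshot:
    s3.put_object(
        Bucket="example-backup-bucket",                      # hypothetical bucket
        Key="snapshots/finance-share/2021-05-07.tar",        # hypothetical key
        Body=snapshot,
        # COMPLIANCE mode: the object cannot be overwritten or deleted by
        # anyone, including the account root, until the retention date passes.
        ObjectLockMode="COMPLIANCE",
        ObjectLockRetainUntilDate=datetime.now(timezone.utc) + timedelta(days=90),
    )

Because a COMPLIANCE-mode object cannot be overwritten or deleted until its retention date passes, ransomware that compromises the credentials used for backup still cannot encrypt or destroy the protected copies.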

Of course, a cloud-based system for production files will usually require a hybrid infrastructure to provide performance and minimize latency. But the protection against ransomware attacks remains strong. Local appliances only need to keep a copy of the working set, which is synchronized with the gold copy in the cloud. So, if the data on the local appliance is hit with ransomware, it’s not hard to fix. All IT needs to do is point the appliance back to the most recent unaffected version of the file system.
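In code terms, that recovery step amounts to selecting the newest file-system version that predates the infection and remounting it. The sketch below uses hypothetical names; any real appliance or cloud API will differ.

from datetime import datetime
from typing import List

def last_clean_version(versions: List[datetime], infection_start: datetime) -> datetime:
    """Return the newest snapshot taken strictly before the infection began."""
    clean = [v for v in versions if v < infection_start]
    if not clean:
        raise RuntimeError("no snapshot predates the infection")
    return max(clean)

# Hypothetical usage: repoint the appliance's working set at the clean version.
# appliance.mount("finance-share", version=last_clean_version(versions, infection_start))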

But IT doesn’t have to revert all files back to the point prior to the attack. A comprehensive audit trail combined with immutable versions of files in the cloud enables IT to unravel the jagged timeline that corresponds to the ransomware’s penetration profile. And it’s not a slow process: instead of weeks, recovery takes just hours, and IT can radically minimize the amount of data lost. Users whose files were encrypted immediately before the attack was discovered lose only a few hours of work, not, say, a full week.
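A hedged sketch of what that per-file rollback might look like, assuming an audit log that records the last write time of each file and an assumed restore_version call that recovers the newest immutable copy older than a given timestamp:

from datetime import datetime
from typing import Callable, Dict

def selective_restore(audit_log: Dict[str, datetime],
                      infection_start: datetime,
                      restore_version: Callable) -> int:
    """Roll back only the files written after the infection began."""
    restored = 0
    for path, last_write in audit_log.items():
        if last_write >= infection_start:
            # Untouched files keep their current (clean) contents; only
            # files the ransomware modified are reverted.
            restore_version(path, before=infection_start)  # assumed API
            restored += 1
    return restored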

Just as applications are moving to the cloud, files are heading there, too, as is nearly all enterprise data. That’s good news for the enterprise and bad news for cybercriminals, because when traditional backup is replaced by built-in backup within a cloud-based storage system, ransomware is no longer an effective predatory business model.

Andres Rodriguez

Andres Rodriguez is CTO at Nasuni.
