The nonprofit GDI Foundation has tracked close to 175,000 examples of misconfigured software and services on the cloud this year. As more and more organizations are moving to the cloud, the number of leaky servers is increasing. We have seen several AWS data leaks this year – from Booz Allen Hamilton to the WWE – that have left millions of private records exposed.
In this Q&A, we will discuss why there have been so many leaky servers, who is to blame and what can be done to protect against this issue.
Q: Why have there been so many leaky servers recently?
A: I saw one vendor claim to have detected that 7% of all S3 buckets are public, which is quite high. My sense is that data leaks appear to be growing exponentially due to the increasing rate of data being stored in the cloud. There is simply far more data to lose today, and thanks to data breach notification laws and good netizens, we now know about the household names that lose control of our personal information and government secrets. Compounding the problem, configuring Identity and Access Management (IAM) in the cloud can be difficult. Most organizations haven’t developed the expertise, and mistakes are commonly made.
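To make the IAM point concrete, here is a minimal sketch of how a single wildcard principal turns an S3 bucket policy into a public one, along with a small check that flags it. The policy documents, bucket name, account ID and function name are illustrative, not from any real deployment:

```python
def public_statements(policy: dict) -> list:
    """Return the statements in an S3 bucket policy that grant
    access to everyone (Principal "*" or {"AWS": "*"})."""
    flagged = []
    for stmt in policy.get("Statement", []):
        principal = stmt.get("Principal")
        is_wildcard = principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*"
        )
        if stmt.get("Effect") == "Allow" and is_wildcard:
            flagged.append(stmt)
    return flagged

# A policy that quietly makes every object world-readable.
leaky = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-bucket/*",
    }],
}

# The same intent, scoped to a single IAM role instead.
scoped = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::123456789012:role/app-reader"},
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-bucket/*",
    }],
}

print(len(public_statements(leaky)))   # 1 world-open statement
print(len(public_statements(scoped)))  # 0
```

The difference between the two policies is one `Principal` field, which is exactly why these mistakes are so easy to make and so hard to spot in a large account.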
Q: It appears that AWS is always the provider hosting the lost data. Is this problem unique to Amazon?
A: AWS is the 900-pound gorilla in this space, so, statistically speaking, lost data is bound to come from their service. However, other providers have had very concerning issues as well, and all are at risk of human error leading to data leaks and breaches. I was one of 68 million Dropbox users who received an email last year asking me to reset my password because they found out that in 2012 they had lost our User IDs and hashed passwords.
Then, earlier this year, Box had to change the way they handled publicly shared accounts and folders because search engines found these directories and some had confidential data in them. It is simply easier to make folders public than to set up proper IAM policies, and like S3 users, these Box customers hoped nobody would find their folder. Hope isn’t considered a best security practice.
And of course, we can’t forget about Uber, who was in the news twice for major data breaches. On both occasions Uber left its encryption keys on GitHub, which in part led to the breach. These problems happen far more frequently than are reported. Most companies don’t lose millions of records and don’t make the news – or worse, they don’t know that they are leaking data.
Q: Is AWS to blame for the rash of breaches?
A: AWS, the Cloud Security Alliance, every analyst and every other cloud provider talks about “shared responsibility” for data security. This means that the customer is ultimately responsible for data security. However, at the same time, every provider wants the customer to feel that their cloud is the safest, that customers should be confident moving their data there, and that it has all the services required to protect that data. We have to remember that, at the end of the day, the cloud providers aren’t taking responsibility for implementing the architecture and processes to protect your data; each organization owns this responsibility. In addition, they aren’t taking responsibility for failures in their solutions if they were to occur, such as User IDs and passwords being stolen. A major driver to move to the cloud is to reduce capital and operations costs, but customers have to remember that they own data security and typically can’t transfer that liability.
Unfortunately, it still isn’t as simple as acting responsibly.
Many of the misconfigurations of S3 were by contractors. Often, organizations are handing their precious data off to an “expert” third party. These contractors typically look for the least expensive solutions because they need a competitive bid to win the contract.
Data security means that you have to retain control of your data—wherever it resides. Before handing off resumes of people with top-secret clearance or military secrets to a vendor, the RFP and service-level agreements must spell out how to protect the data. I expect that this wasn’t defined in the contract with the vendors who lost control of their customers’ data.
Q: Have there been any vulnerabilities behind leaky servers?
A: The recent AWS leaks appear to be caused by human configuration errors. The Verizon leak was interesting, though, because Verizon said it wasn’t a configuration error on their part, but allegedly a Verizon Wireless employee who was hosting 100 MB of their data in his personal S3 bucket. I think this should have raised even more questions about Verizon’s ability to control data if their employees can copy 100 MB of customer data to personal storage without setting off notifications. But this wasn’t an AWS vulnerability; it was a Verizon data security architecture and process failure.
Q: Is leaked data falling into the wrong hands?
A: There are simple methods and many tools available to detect leaky buckets; you can find videos about them on YouTube. When we hear about leaks in the news, it’s because white-hat hackers are finding the problems, letting the affected organization know and then earning the much-deserved press for themselves. After the forensics come out, we often learn that the data has been leaking for many months or years; it is naive to think that those white-hat do-gooders were the first and only ones to find the data. I’m sure every hacker group has added data leak tools to their arsenal.
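The bucket-scanning tools alluded to here largely boil down to checking whether an ACL contains a grant to the AWS-defined “AllUsers” group. A hedged sketch of that check, assuming the ACL dictionaries below mirror the shape that boto3’s `get_bucket_acl` returns (the sample ACLs and helper name are illustrative):

```python
# The AWS-defined URI that marks a grant as "everyone on the internet".
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

def acl_is_public(acl: dict) -> bool:
    """True if any grant in an S3 ACL document goes to the AllUsers group."""
    for grant in acl.get("Grants", []):
        grantee = grant.get("Grantee", {})
        if grantee.get("Type") == "Group" and grantee.get("URI") == ALL_USERS:
            return True
    return False

# Sample ACL documents in the shape s3.get_bucket_acl(Bucket=...) returns.
sample_public = {"Grants": [
    {"Grantee": {"Type": "Group", "URI": ALL_USERS}, "Permission": "READ"},
]}
sample_private = {"Grants": [
    {"Grantee": {"Type": "CanonicalUser", "ID": "abc123"},
     "Permission": "FULL_CONTROL"},
]}

print(acl_is_public(sample_public))   # True
print(acl_is_public(sample_private))  # False
```

Against a live account, you would loop over the buckets from `boto3.client("s3").list_buckets()` and feed each `get_bucket_acl` result to the helper; attackers run essentially the same loop against bucket names they guess.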
Q: What do leaky breaches cost companies?
A: The costs really vary in accordance with the type of data lost, media coverage, how the company responds, fines and customer reaction. What I fear is that data leaks have become so common that they are not being reported the way they should be. This is a problem because the information that is lost could be my personal information or your personal information, and we will have to deal with the ramifications.
In some cases, Attorneys General are enforcing their state’s data breach notification laws and going after companies that have leaked unencrypted PII and didn’t properly inform their customers. Although it wasn’t from an S3 leak, Hilton Hotels recently paid $700K in fines to two states.
Q: Should organizations stop moving sensitive data to the cloud?
A: Absolutely not. Cloud solutions offer incredible value, new capabilities and speed of deployment. Every organization should have a cloud-first strategy and, for each project, weigh the best cloud architecture against on-premises alternatives. However, even in the cloud, data security best practices need to be followed:
- Encryption is a proven method to control data, but you have to have proper key management. All major providers offer Bring Your Own Key (BYOK) capabilities. Take advantage of this and use a FIPS 140-2 validated device or third-party service to manage your keys.
- Successful data security architectures have layers of defense. Your customer PII, Intellectual Property and other highly valuable information should use advanced encryption techniques.
- Encrypt object storage, like S3, before it reaches the bucket with a cloud encryption gateway running on-premises or at least independently in the cloud service. This ensures that S3 data is encrypted, so if a misconfiguration or breach occurs, only unusable data leaks. The only way to view this data again is to be authenticated by the cloud encryption gateway. The result is not only de-risked breaches and leaks, but also visibility into the data being read from and written to your buckets.
- Control valuable data being used or created in cloud workloads with advanced bring your own encryption, tokenization or application layer encryption. And be sure you control your keys with a FIPS 140-2 key manager that is on-premises or running as a separate instance managed by your SecOps team.
- When handing your data over to a contractor or partner, clearly define your encryption and key management requirements. Set the right expectations for data security from the start. This is a very common practice for our BFSI customers.
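The gateway and BYOK bullets above share one core idea: the object is encrypted before `put_object`, and the key that ultimately protects it never lives in AWS. A minimal envelope-encryption sketch of that flow, assuming nothing beyond the Python standard library; the toy SHA-256 keystream cipher below stands in for a real algorithm like AES-GCM purely to keep the sketch self-contained, and all names are illustrative, not a vendor API:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR data with a SHA-256 counter keystream.
    A stand-in for AES-GCM -- use a vetted crypto library in practice."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

def encrypt_for_bucket(plaintext: bytes, master_key: bytes):
    """Envelope encryption: a fresh data key encrypts the object, and the
    master key (held in your own key manager, never handed to the cloud
    provider) wraps the data key. Only these outputs would reach S3."""
    data_key = secrets.token_bytes(32)
    ciphertext = keystream_xor(data_key, plaintext)
    wrapped_key = keystream_xor(master_key, data_key)
    return ciphertext, wrapped_key

def decrypt_from_bucket(ciphertext: bytes, wrapped_key: bytes,
                        master_key: bytes) -> bytes:
    data_key = keystream_xor(master_key, wrapped_key)
    return keystream_xor(data_key, ciphertext)

master = secrets.token_bytes(32)   # lives in your FIPS 140-2 key manager
record = b"customer PII goes here"
blob, wrapped = encrypt_for_bucket(record, master)

print(blob != record)                                    # True
print(decrypt_from_bucket(blob, wrapped, master) == record)  # True
```

The point of the design is in the last two lines: a leaked bucket exposes only `blob` and `wrapped`, neither of which is usable without the master key that stayed under your control.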
In this Q&A, we’ve talked a lot about data leaks found in AWS. I just want to recognize that AWS is working hard on educating their customers and creating more tools to help customers securely use their cloud service. But the reality is that multi-cloud environments are now common. Data security is paramount for compliance and protection of the business. It is important to consider how to enable security teams to easily, consistently and repeatedly secure multi-cloud environments, not just AWS.
This is a Security Bloggers Network syndicated blog post authored by Charles Goldberg. Read the original post at: Data Security Blog | Thales e-Security