Amazon Web Services (AWS) S3 buckets have become a common source of data loss for public and private organizations alike. Here are five solutions you can use to evaluate the security of data stored in your S3 buckets.
For business professionals, the public cloud is a smorgasbord of microservice offerings that provide rapid delivery of hardware and software solutions. For security and IT professionals, though, public cloud adoption represents a constant struggle to secure data and prevent unexpected exposure of private and confidential information. Balancing these requirements can be tricky, especially when trying to adhere to your organization’s unique Corporate Information Security Policies and Standards.
Amazon Web Services (AWS) S3 buckets have become a common source of data loss for public and private organizations alike. Industry researchers and analysts most often attribute the root cause of these losses to misconfigured services, vulnerable applications and tools, wide-open permissions, and/or the use of default credentials.
Recent examples of data leaks from AWS storage buckets include:
- The discovery of billions of online posts and news commentary collected by the U.S. Department of Defense and stored in publicly accessible AWS S3 buckets.
- Sensitive personal data of thousands of insurance policyholders left exposed in a storage bucket that was not password protected.
- More than 60,000 sensitive files stored by defense contractor Booz Allen Hamilton on a publicly accessible AWS storage server.
- The personal details of hundreds of thousands of U.S. voters reportedly exposed by robocall firm RoboCent.
Data leakage is only one of the many risks presented by misuse of AWS S3 buckets. For example, attackers could potentially replace legitimate files with malicious ones for purposes of cryptocurrency mining or drive-by attacks.
To make matters worse for organizations (and simpler for hackers), automated tools are available to help find insecure S3 buckets.
How to protect data stored in AWS S3 buckets
Going back to the basics provides the most direct path to protecting your data. Recommended best practices for S3 buckets include always applying the principle of least privilege by using IAM policies and resource-based controls via bucket policies and bucket ACLs.
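As a minimal sketch of what least privilege can look like at the bucket-policy level, the snippet below builds a policy document that grants read-only access to a single IAM role and denies any request made without TLS. The bucket name, account ID and role name are placeholders, not values from this article; you would attach the resulting JSON with `put_bucket_policy` (boto3) or the AWS CLI.

```python
import json

def least_privilege_policy(bucket_name, allowed_role_arn):
    """Build a bucket policy granting read-only access to one IAM role
    and denying all requests made over plain (non-TLS) HTTP."""
    bucket_arn = f"arn:aws:s3:::{bucket_name}"
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                # Least privilege: only this role, only read actions.
                "Sid": "AllowReadForSingleRole",
                "Effect": "Allow",
                "Principal": {"AWS": allowed_role_arn},
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [bucket_arn, f"{bucket_arn}/*"],
            },
            {
                # Defense in depth: refuse any unencrypted transport.
                "Sid": "DenyInsecureTransport",
                "Effect": "Deny",
                "Principal": "*",
                "Action": "s3:*",
                "Resource": [bucket_arn, f"{bucket_arn}/*"],
                "Condition": {"Bool": {"aws:SecureTransport": "false"}},
            },
        ],
    }

# Hypothetical bucket and role, for illustration only:
policy = least_privilege_policy(
    "example-bucket", "arn:aws:iam::123456789012:role/app-reader")
print(json.dumps(policy, indent=2))
```

The explicit `Deny` on `aws:SecureTransport` always wins over any `Allow`, so even a later, overly broad grant cannot reopen plaintext access.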
Another best practice is to define a clear strategy for bucket content by taking the following steps:
- Automating the monitoring, auditing and remediation of S3 bucket security changes via CloudTrail, CloudWatch and Lambda.
- Creating a bucket lifecycle policy to transfer old data to an archive automatically based on usage patterns and age.
- Applying encryption by default when creating new buckets, via server-side encryption (SSE-S3, SSE-C or SSE-KMS) and/or client-side encryption.
- Creating an S3 inventory list to automatically report inventory, replication and encryption status in an easy-to-use CSV or ORC format.
- Testing, testing and testing some more to make sure the controls mentioned above have been implemented effectively and the data is secure.
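To make the first step above concrete, here is a sketch of the ACL check a remediation Lambda (triggered by a CloudTrail `PutBucketAcl` event routed through CloudWatch) might perform. The grant structure mirrors the shape of a boto3 `get_bucket_acl()` response; the actual remediation call (`put_bucket_acl`) needs AWS credentials and is omitted, and the sample grants below are invented for illustration.

```python
# Group URIs that make a grant world-readable or open to any AWS account.
PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_grants(grants):
    """Return the subset of ACL grants that expose the bucket publicly."""
    return [
        g for g in grants
        if g.get("Grantee", {}).get("Type") == "Group"
        and g.get("Grantee", {}).get("URI") in PUBLIC_GROUPS
    ]

# Example grant list shaped like boto3's get_bucket_acl() output:
grants = [
    {"Grantee": {"Type": "CanonicalUser", "ID": "abc123"},
     "Permission": "FULL_CONTROL"},
    {"Grantee": {"Type": "Group",
                 "URI": "http://acs.amazonaws.com/groups/global/AllUsers"},
     "Permission": "READ"},
]
# Only the AllUsers READ grant should be flagged for remediation:
print(public_grants(grants))
```

In a real Lambda, any non-empty result would trigger an alert and a `put_bucket_acl` call that strips the offending grants.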
Here at Tenable, I have researched five additional solutions you can use to evaluate the security of data stored in S3 buckets. These five solutions, when implemented correctly and incorporated into daily operational checklists, can help you quickly assess your organization’s cyber exposure in the public cloud and help you determine next steps for securing your business-critical data.
- Amazon Macie: Automates data discovery and classification. Uses machine learning to classify data files stored in S3, applying a rules engine that identifies application data and correlates file extensions with predictable data themes, along with regex matching to determine data types. It also surfaces CloudTrail events, errors and basic alerts.
- Security Monkey: An open source solution from Netflix, available on GitHub, that implements monitoring, alerting and an auditable history of cloud configurations across S3, IAM, Security Groups, Route 53, ELB and SQS services.
- AWS Trusted Advisor: Flags S3 buckets with open access permissions, in addition to its broader cost-optimization, performance, security and fault-tolerance checks.
- Amazon S3 Inventory: Produces a CSV or ORC report that aids in auditing the replication and encryption status of objects in S3.
- Custom S3 bucket scanning solutions: Scripts available on GitHub can be used to scan and check specific S3 buckets. These include kromtech’s S3-Inspector and sa7mon’s S3Scanner. In addition, avineshwar’s slurp clone monitors certstream and enumerates S3 buckets derived from each domain it observes.
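To illustrate the enumeration idea behind tools like slurp, the sketch below generates plausible bucket-name permutations from a domain. This is a simplified, assumed approach, not the actual logic of any tool named above; a real scanner would then probe each candidate name against the S3 endpoint, which is deliberately omitted here.

```python
def candidate_buckets(domain, keywords=("backup", "data", "files", "dev")):
    """Generate plausible S3 bucket names for a domain, similar in
    spirit to the permutations enumeration tools try against S3."""
    base = domain.split(".")[0]          # "example" from "example.com"
    names = {base, domain.replace(".", "-")}
    for kw in keywords:
        names.add(f"{base}-{kw}")        # example-backup, example-dev, ...
        names.add(f"{kw}-{base}")        # backup-example, dev-example, ...
    return sorted(names)

print(candidate_buckets("example.com"))
```

Running the same generator against your own domains, and then checking each candidate, is a cheap way to find forgotten buckets before someone else does.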
With the business demanding speed and ease of use, we expect to see the continued evolution of applications, systems and infrastructure away from on-premises data centers secured behind highly segregated networks to cloud-based “X-as-a-Service” architectures. The solutions and guidance highlighted above will help you identify security gaps in your environment and bootstrap solutions to automate resolution, alerting and auditing, thereby helping you meet your organization’s Corporate Information Security Policies and Standards.
*** This is a Security Bloggers Network syndicated blog from Tenable Blog authored by Tenable Research. Read the original post at: http://feedproxy.google.com/~r/tenable/qaXL/~3/GO1M8phI1Kc/leaky-amazon-s3-buckets-challenges-solutions-and-best-practices