7 Common Data Misconfigurations for Google Cloud Platform

Just about every process in the Google Cloud Platform (GCP) is regulated by identity access management (IAM). Yet, when it comes to IAM while using GCP, many users today are making major mistakes. Unfortunately, these missteps lead to critical security vulnerabilities that can compromise systems and applications.

So, what do misconfigurations look like in GCP? 

Let’s take a look. 

Common Misconfigurations in GCP

1. Confusing ‘allUsers’ group and ‘allAuthenticatedUsers’

There are two special principal types on GCP: allUsers, which includes both authenticated users and unauthenticated anonymous users, and allAuthenticatedUsers, which includes anyone with a verified Google account.

Granting allUsers access to one of your buckets is like leaving your front door unlocked. Basically, anyone who obtains the link will be able to access it freely — creating an opportunity for public data exposure. 

Granting allAuthenticatedUsers access is slightly more restricted than allUsers, but not by much. It limits access to anyone authenticated with a Google account or a service account, excluding only anonymous users. 

All team members should be cognizant of these principals and ideally avoid using them where possible. Alternatively, restrictions can be applied to prevent their use across GCP.
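To make this concrete, here is a minimal Python sketch that scans a bucket's IAM policy bindings (in the JSON shape returned by the Cloud Storage buckets.getIamPolicy API) for these public principals. The sample policy is hypothetical:

```python
# Flag IAM bindings that expose a bucket to public principals.
PUBLIC_PRINCIPALS = {"allUsers", "allAuthenticatedUsers"}

def find_public_bindings(bindings):
    """Return (role, principal) pairs that grant public access."""
    exposed = []
    for binding in bindings:
        for member in binding.get("members", []):
            if member in PUBLIC_PRINCIPALS:
                exposed.append((binding["role"], member))
    return exposed

# Hypothetical policy bindings pulled from a bucket:
policy = [
    {"role": "roles/storage.objectViewer", "members": ["allUsers"]},
    {"role": "roles/storage.admin", "members": ["user:alice@example.com"]},
]
print(find_public_bindings(policy))  # → [('roles/storage.objectViewer', 'allUsers')]
```

Run against every bucket in a project, this kind of check surfaces public exposure before an attacker does.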

2. Excessive permissions and user privilege escalation 

If an Identity is over-privileged, it can directly or indirectly promote itself to the ownership level of a bucket. With this level of privilege, it has the authority to make administrative decisions that could compromise your entire operation. 
Let’s look at an example of what an Identity with excessive permissions could achieve:

  • Copy data out of your bucket to an attacker-owned bucket.
  • Delete all of your data and then delete your bucket.
  • Create a new bucket with the same name in the attacker’s project, and give read/write permissions for files to everyone; the attacker could grant storage object creator and storage object viewer roles to the allUsers group to accomplish this.
  • Copy the data from the backup bucket to the new bucket in the attacker’s project.

Thus, it is very important to understand the Effective Permissions (also known as end-to-end permissions) of all your GCP Identities, whether they are human or non-human.
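One way to start putting this into practice is to sweep policy bindings for roles broad enough to enable the takeover scenario above. This sketch uses an illustrative, non-exhaustive set of risky roles; tune the list to your own environment:

```python
# Roles broad enough to let an Identity take over a bucket
# (an illustrative subset -- adjust to your environment).
RISKY_ROLES = {"roles/owner", "roles/editor", "roles/storage.admin"}

def over_privileged_identities(bindings):
    """Map each identity to the risky roles it holds directly."""
    risky = {}
    for binding in bindings:
        if binding["role"] in RISKY_ROLES:
            for member in binding.get("members", []):
                risky.setdefault(member, set()).add(binding["role"])
    return risky
```

Note that this inspects only direct grants; computing true Effective Permissions also means walking group memberships, folder and organization inheritance, and service-account impersonation chains.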

3. Storage access and constraints

The constraint that’s most relevant to this misconfiguration is called domain restricted sharing.

If you place your data storage buckets with sensitive data under a certain project or folder, you can then apply this constraint at the project or folder level to specify that no IAM permissions be granted to anyone outside of your organization. 

If you are a G Suite customer, you can grant access to the G Suite ID for your domain. This will prevent any user who has not authenticated to your G Suite domain from being granted IAM permissions to any resources in your project.

The issues with using this constraint are as follows:

  • It only blocks new IAM grants. Anything that’s already misconfigured when you apply this constraint will stay that way.
  • If your buckets are not already organized and segmented by projects (or folders) such that public and private buckets are clearly separated, you won’t be able to implement this constraint.

With this in mind, Cloud Ops teams need to think through the design of their data storage architecture with Identities and access in mind. This ensures that when the constraint is implemented, these issues don’t arise and then go unnoticed until it is too late.
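For reference, here is a sketch of the payload for this constraint (constraints/iam.allowedPolicyMemberDomains) in roughly the shape used by the Organization Policy v2 API. The folder and customer IDs are hypothetical placeholders; your Workspace customer ID comes from your Google admin console:

```python
def domain_restricted_sharing_policy(resource, customer_ids):
    """Build an org policy limiting IAM grants to the given
    Workspace customer IDs (domain restricted sharing)."""
    return {
        "name": f"{resource}/policies/iam.allowedPolicyMemberDomains",
        "spec": {
            "rules": [{"values": {"allowedValues": list(customer_ids)}}],
        },
    }

# Apply at the folder holding your sensitive buckets (hypothetical IDs):
policy = domain_restricted_sharing_policy("folders/123456789", ["C03xxxx99"])
```

Applying it at the folder level means every project under that folder inherits the restriction, which is why the bucket segmentation described above matters so much.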

4. Storage access and VPC constraints

Virtual Private Cloud (VPC) Service Controls can also help mitigate the misconfiguration of storage buckets. 

GCP allows you to make resources private, which means they can’t be accessed via the Internet even if the IAM policy allows it. This control allows you to set up a VPC service perimeter around projects and then control access to that perimeter based on things like your IP address, geographic location, and conditions on the device requesting the access, among other things.

The issues with using a VPC Service Control are as follows:

  • You could easily miss valid use cases and actually interrupt your business while implementing a service perimeter.
  • If your buckets are not already organized by project, this will not be a feasible solution for you.

While this is a great control, it comes back to the importance of fully understanding your data and identity access requirements. You need to know where your data is, which Identities require access, and from where they need it. Furthermore, you need to continuously monitor for changes to ensure that things stay locked down.
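As a sketch of what such a perimeter looks like, here is a helper that builds a service perimeter body in roughly the shape used by the Access Context Manager API, restricting storage.googleapis.com to a set of projects. The project numbers and access-level names are hypothetical:

```python
def storage_perimeter(title, project_numbers, access_levels=()):
    """Build a service perimeter restricting Cloud Storage access
    to the listed projects, optionally gated by access levels
    (IP ranges, geography, device posture, and so on)."""
    return {
        "title": title,
        "status": {
            "resources": [f"projects/{n}" for n in project_numbers],
            "restrictedServices": ["storage.googleapis.com"],
            "accessLevels": list(access_levels),
        },
    }

# Hypothetical perimeter around two data projects:
perimeter = storage_perimeter("prod-data", [1111, 2222])
```

Dry-run mode is worth using first, precisely because of the "valid use cases you could easily miss" issue noted above.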

5. Secrets management and encryption

Users often want to know whether encryption will prevent the exposure of their files. 

GCP encrypts stored objects by default with keys that Google manages. You might think that this approach would reduce data exposure, but it doesn’t. 

When you apply IAM permissions that allow the public to read objects in buckets, Google is obliged to decrypt the data — the same as it would for your internal users. This also applies to customer managed encryption keys (CMEKs) that you are able to provision and control in the key management service (KMS).

The one case in which encryption prevents data exposure is customer-supplied encryption keys (CSEKs), because Google never stores them. In this configuration, you have to store and manage your own keys, and you supply the key to Google with each storage request so it can encrypt or decrypt the objects in the bucket. Without that key, GCP has no way to decrypt the data for you. 

So, if an unauthorized party gains access to the encrypted objects, they would need to separately obtain access to your keys in order to decrypt the data.
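For the CSEK case, Cloud Storage expects a base64-encoded 256-bit AES key plus a base64-encoded SHA-256 hash of it, sent with each request in the x-goog-encryption-key and x-goog-encryption-key-sha256 headers. A minimal generation sketch:

```python
import base64
import hashlib
import os

def make_csek():
    """Generate a customer-supplied AES-256 key and the encoded
    values Cloud Storage expects with each request. You, not
    Google, must store the raw key; lose it and the data is gone."""
    raw = os.urandom(32)  # 256-bit key
    return {
        "key_b64": base64.b64encode(raw).decode(),
        "key_sha256_b64": base64.b64encode(hashlib.sha256(raw).digest()).decode(),
    }
```

The flip side of this protection is operational: key storage, rotation, and availability are entirely your problem.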

6. API keys

API keys in GCP are a form of authentication and authorization that can be used when calling specific API endpoints in the cloud. These keys are tied directly to GCP projects and are therefore considered less secure than OAuth 2.0 client credentials or service account user-managed keys. 

In a secure cloud environment, all assets and resources should be monitored for when they are created, updated, or deleted. This makes sensitive credentials like API keys especially important to track. 

Unfortunately, GCP does not currently support a native way to programmatically inventory API keys across an entire GCP organization.
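Until you have org-wide inventory, you can at least audit the keys you can export per project. This sketch flags keys that have no restrictions configured, using records shaped like the API Keys API's Key resource; the key names are hypothetical:

```python
def unrestricted_keys(keys):
    """Return the names of API keys that carry no API or
    application restrictions -- the riskiest kind to leak."""
    return [k["name"] for k in keys if not k.get("restrictions")]

# Hypothetical per-project key inventory:
inventory = [
    {"name": "projects/demo/locations/global/keys/ci-key"},
    {"name": "projects/demo/locations/global/keys/maps-key",
     "restrictions": {"apiTargets": [{"service": "maps-backend.googleapis.com"}]}},
]
print(unrestricted_keys(inventory))  # → ['projects/demo/locations/global/keys/ci-key']
```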

7. Disabled logging and monitoring

While this is listed as #7, it is one of the most important things you need to avoid.

It’s surprising how many organizations don’t enable, configure, or even review the logs and telemetry data that public clouds provide, which in many cases can be extremely sophisticated. Someone on your enterprise cloud team should have the responsibility for regularly reviewing this data and flagging security-related events. Be sure to enable logging and monitoring functionality to support these efforts.
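As one example of what that review can look for, here is a sketch that picks IAM policy changes out of exported Cloud Audit Log entries. It matches on the protoPayload.methodName field, which names the API method invoked (for example SetIamPolicy or storage.setIamPermissions); the sample entries are hypothetical:

```python
def iam_change_events(entries):
    """Filter exported audit-log entries down to IAM policy changes."""
    return [
        e for e in entries
        if "setiam" in e.get("protoPayload", {}).get("methodName", "").lower()
    ]

# Hypothetical exported log entries:
logs = [
    {"protoPayload": {"methodName": "storage.setIamPermissions"}},
    {"protoPayload": {"methodName": "storage.objects.get"}},
]
print(len(iam_change_events(logs)))  # → 1
```

Routing matches like these to an alerting channel turns passive logs into the active monitoring this section calls for.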

Tips for Addressing Cloud Security GCP Misconfigurations

Now that you have a better idea of some of the more common GCP misconfigurations, here are some tips that can help you avoid them:

  1. Know your cloud environments and define a security foundation.
  2. Review access controls to ensure only authorized users can take action on specified cloud resources. This includes ensuring IAM policies are properly implemented, such as bucket policies on storage buckets inside of GCP.
  3. Enforce the principle of least privilege by only giving your users the permissions they need to do their jobs. 
  4. Implement logging, which can identify changes to your cloud environments and help determine the extent of an incident. 
  5. Remember, much of this work can be automated to help you rapidly discover misconfigurations, data that lives where it shouldn’t, and identities that shouldn’t have access to it.
  6. Take advantage of the right tools, analyze your cloud environments, and perform best practice assessments and audits. 

Final Thoughts

The rate at which cloud providers are adding new features and functionality is exciting and promising. At the same time, however, it adds complexity to our cloud environments, making it harder to protect against misconfigurations and compliance risks while keeping your data secure.

This is why it is so important to consider an intelligent cloud security posture management (CSPM) solution. Intelligent CSPM helps with many important processes — including real-time misconfiguration monitoring, providing a consolidated view for multi-cloud environments, and creating standards-framework and compliance reports (for NIST, HIPAA, GDPR, PCI-DSS, and the AWS Well-Architected Framework, among others). On top of that, it uses advanced analytics for Identity and Data Governance to ensure that you have a full, continuous picture of the state of your GCP security. With that in place, an intelligent CSPM can route issues to the right teams for resolution and leverage automation to fix problems before they become incidents.

For more information on the easiest way to secure your GCP environment, check out Sonrai’s approach to CSPM.


*** This is a Security Bloggers Network syndicated blog from Blog - Sonrai Security authored by Eric Kedrosky. Read the original post at: