Cloud providers have done a good job of integrating default encryption services into their core infrastructure. However, as discussed in previous blogs, an encryption service is only as secure as the keys used to encrypt the data. Enterprises cannot delegate away the responsibility of implementing a strong key assurance service that keeps control of risk in their own hands. With full control of the encryption key, the enterprise controls who can access data stored in the cloud and when they can access it. To maintain full control of cloud data, consider the following:
Ensure visibility into how the key is being used
Full visibility into how an encryption key is used is a critical component of maintaining control in the cloud. There are valid reasons and conditions for using both third party encryption services and encryption services embedded in the cloud service provider's infrastructure. Regardless of which encryption service is used, it is critical to maintain complete visibility into how and when keys are accessed.
Key access information
Cloud service providers have done a great job of allowing application developers and devops teams to automate key management services. Via APIs, a developer can write scripts or apps that create key repositories, generate keys, and provision them to the encryption services that use the keys. Although this powerful tool reduces costs and ensures a default security layer is implemented for new applications and services, it creates a serious headache for governance teams that need to monitor the security framework to ensure threats are minimized. To date, cloud service providers do not do a great job of providing a security overview of:
- Which services are actually using a key to encrypt data
- Which services and/or people are creating, changing, or deploying keys
- Where key repositories are being used
- Which data within cloud workloads is encrypted
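Since provider consoles don't offer this overview, an enterprise can build one itself by pulling key metadata from each subscription and aggregating it. A minimal sketch in Python, where `KeyRecord` and its fields are hypothetical stand-ins for whatever metadata each provider's API actually exports:

```python
from collections import defaultdict
from dataclasses import dataclass, field

@dataclass
class KeyRecord:
    """Hypothetical normalized view of one key, as harvested from a provider API."""
    key_id: str
    subscription: str
    repository: str
    consuming_services: list = field(default_factory=list)

def governance_view(records):
    """Aggregate key metadata from many subscriptions into the overview the
    provider consoles don't give: which keys live in which repository, and
    which keys have no known consuming service (candidates for review)."""
    by_repo = defaultdict(list)
    unused = []
    for rec in records:
        by_repo[(rec.subscription, rec.repository)].append(rec.key_id)
        if not rec.consuming_services:
            unused.append(rec.key_id)
    return {"keys_per_repository": dict(by_repo), "unused_keys": unused}

records = [
    KeyRecord("k1", "prod", "vault-eu", ["db-encryption"]),
    KeyRecord("k2", "prod", "vault-eu"),
]
view = governance_view(records)
```

In a real deployment the harvesting step would call each provider's key management API per subscription; the aggregation layer is what turns thousands of scattered keys into something a governance team can monitor.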
This information is important even when an enterprise uses just one subscription account with one cloud provider. The need escalates when a security team must provide a governance framework for a large enterprise with tens, if not hundreds, of key repositories and tens of thousands of keys spread across many different cloud service provider subscriptions.
Some of the biggest public cloud providers have built security access controls into their IaaS, PaaS, and SaaS environments. Most of this is built on extensive authentication schemes and authorization policies. However, just because someone can access and manage the encryption environment doesn’t necessarily mean they should. The key authorization system should be able to set authorization policies that are independent of the native cloud provider’s policies. By doing so, a security administrator can map key activities to business processes.
A typical example is key deletion. If a key is accidentally or maliciously deleted, the data encrypted with that key may be unrecoverable. Even though an administrator has the ability to delete a key, it often makes sense to require additional authorization before the deletion takes effect. For instance, the key management system could trigger a multi-authorization workflow whenever a delete request is made. A good implementation is a multiple-approval chain: for example, a policy that 3 out of a pool of 10 administrators must approve the key deletion before it is executed.
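The quorum workflow described above can be sketched in a few lines. This is an illustrative model, not any vendor's API; the class and method names are invented for the example:

```python
class KeyDeleteRequest:
    """Multi-authorization workflow: a key-delete request only executes
    once a quorum of eligible administrators (e.g. 3 of 10) approves it."""

    def __init__(self, key_id, approver_pool, quorum=3):
        self.key_id = key_id
        self.approver_pool = set(approver_pool)  # eligible admins
        self.quorum = quorum
        self.approvals = set()
        self.executed = False

    def approve(self, admin):
        """Record one approval; return True once the delete has executed."""
        if admin not in self.approver_pool:
            raise PermissionError(f"{admin} is not an eligible approver")
        self.approvals.add(admin)
        if len(self.approvals) >= self.quorum and not self.executed:
            self.executed = True  # only now would the key actually be deleted
        return self.executed
```

A usage example: with a pool of 10 admins and a quorum of 3, the first two approvals leave the key intact, and the third triggers the deletion.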
Using a third party key management service when leveraging a data encryption service that is part of a cloud provider’s infrastructure allows enterprises to strengthen their disaster recovery strategies. A third party key management system can push a key to the cloud provider system for use with the embedded system. When a key delete event is triggered, the third party key management system can delete the key from the cloud provider environment, thus rendering the applied data unreadable. However, the third party key management system can maintain a copy of the key in its own repository. If needed for business or compliance reasons, the key can then be restored into the cloud provider environment to regain access to the encrypted data.
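The escrow pattern described above, push a key to the cloud, revoke it to cut off access, restore it later, can be sketched as follows. All names here are hypothetical; a real third party KMS would use the provider's bring-your-own-key (BYOK) import and deletion APIs:

```python
class ThirdPartyKMS:
    """Sketch of key escrow: the third party KMS keeps the authoritative
    copy of each key and merely lends a copy to the cloud provider."""

    def __init__(self):
        self._escrow = {}      # authoritative on-premises repository
        self._cloud_keys = {}  # copies pushed to the cloud provider

    def push(self, key_id, key_material):
        """Store the key locally and provision a copy to the cloud (BYOK)."""
        self._escrow[key_id] = key_material
        self._cloud_keys[key_id] = key_material

    def revoke(self, key_id):
        """Delete only the cloud copy: cloud data becomes unreadable,
        but the escrowed key survives locally."""
        self._cloud_keys.pop(key_id, None)

    def restore(self, key_id):
        """Re-provision the escrowed key to regain access to the data."""
        self._cloud_keys[key_id] = self._escrow[key_id]

    def cloud_can_decrypt(self, key_id):
        return key_id in self._cloud_keys
```

The design point is the asymmetry: the cloud provider only ever holds a revocable copy, while the authoritative key never leaves the enterprise's repository.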
In many instances, compliance regulations specify that data stored in the cloud must stay resident in the enterprise, country or region in which the data was generated. These regulations are sometimes hard to comply with because most cloud provider infrastructures replicate data to data centers across the globe for easy access and fault tolerance reasons.
But there is a way to satisfy this stipulation. It has been argued, and accepted in many instances, that data is not really data if it cannot be read. So if encrypted data is distributed globally but the encryption key is maintained locally, the data residency and control regulations are met.
Additionally, some countries and industries mandate that encryption keys be created and stored in a FIPS 140-2 Level 3 environment, which in practice means the keys must be created in a physical HSM appliance. In many cases, public cloud providers and SaaS providers cannot set up this type of infrastructure for the customer. By using a third party key management system that leverages HSM appliances, the enterprise can meet the requirement by managing the keys on-premises and only allowing the cloud provider to use a key on a temporary basis as needed.
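"Temporary basis" can be modeled as a time-limited lease: the HSM-held key is lent to the cloud service for a short TTL, after which the cloud side must re-request it. A minimal sketch, with invented names and an injectable clock so the expiry logic is easy to verify:

```python
import time

class KeyLeaseManager:
    """Sketch: an on-premises HSM lends keys to a cloud service only
    for a short time-to-live; after expiry the lease must be renewed."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._leases = {}  # key_id -> expiry timestamp

    def grant(self, key_id, now=None):
        """Lend key_id to the cloud service until now + TTL."""
        now = time.time() if now is None else now
        self._leases[key_id] = now + self.ttl

    def usable(self, key_id, now=None):
        """Can the cloud service still use this key right now?"""
        now = time.time() if now is None else now
        return self._leases.get(key_id, 0) > now
```

Each renewal is a fresh decision point the enterprise controls, so revoking access is as simple as declining to renew.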
Feel free to leave a comment below, check out Thales eSecurity’s cloud security key management page, and/or tweet me @rjkcasl
The post Why Enterprises Should Control Their Encryption Keys appeared first on Data Security Blog | Thales e-Security.
This is a Security Bloggers Network syndicated blog post authored by Rick Killpack. Read the original post at: Data Security Blog | Thales e-Security