
Webinar Recap: Using the Keyfactor Secrets Engine for HashiCorp Vault

The rapid pace of DevOps has greatly contributed to the growth in cloud adoption and deployments. Consequently, the role of public key infrastructure (PKI) has also expanded across a growing ecosystem of tools and applications that either issue digital certificates or consume them, like HashiCorp Vault.

The rapid adoption of these tool sets has led InfoSec teams to change the way they think about PKI and the management of keys and certificates. Now they’re asking questions like:

  • How do I know how many keys and certificates I have in inventory?
  • How do I track and manage these keys and certificates effectively?
  • How do I ensure that every certificate is issued from a trusted and authorized CA?
  • How do I manage the back-end PKI within this multi-cloud environment?

Before we address these questions though, let’s discuss how HashiCorp Vault fits into the picture, and some of the challenges of using PKI in a DevOps environment.

Vault’s Value for Developers

HashiCorp Vault has become a point of focus in enterprise DevOps infrastructure. It allows developers to centrally store and tightly control access to secrets, and provides access via a common API.
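
As a minimal sketch of that workflow, assuming a running Vault server, a valid token in VAULT_ADDR/VAULT_TOKEN, and the KV v2 secrets engine mounted at its default secret/ path (the secret names and values here are purely illustrative):

```
# Store a secret in the KV v2 engine (assumes it is mounted at secret/)
vault kv put secret/myapp/db username="app" password="s3cr3t"

# Read it back from the CLI...
vault kv get secret/myapp/db

# ...or over the same HTTP API that every Vault client library uses
curl --header "X-Vault-Token: $VAULT_TOKEN" \
     "$VAULT_ADDR/v1/secret/data/myapp/db"
```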

Vault can also act as a CA (certificate authority) by using its onboard PKI Secrets Engine to issue short-lived TLS certificates. These are used by developers and operations teams to secure machine-to-machine communications between servers, containers, service meshes, and so on.
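
As a rough sketch of that native workflow, the mount path, role name, domains, and TTLs below are illustrative defaults rather than values from the webinar:

```
# Enable the built-in PKI secrets engine and generate a root CA
vault secrets enable pki
vault secrets tune -max-lease-ttl=8760h pki
vault write pki/root/generate/internal \
    common_name="example.com" ttl=8760h

# Define a role that constrains what certificates can be issued
vault write pki/roles/web-servers \
    allowed_domains="example.com" allow_subdomains=true max_ttl="72h"

# Issue a short-lived TLS certificate for a service
vault write pki/issue/web-servers \
    common_name="app.example.com" ttl="24h"
```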

DevOps teams love Vault because it gives them fast, easy access to certificates without going through the typical manual processes involved in getting a security-approved certificate from the PKI or InfoSec team. Vault's flexibility also allows it to generate self-signed certificates, or to be configured as a subordinate CA that issues certificates under an existing enterprise CA, such as a Microsoft CA.
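
When Vault is set up as a subordinate CA, the configuration looks roughly like the following sketch; the mount path and file names are assumptions for illustration, and the actual signing of the CSR happens on the external enterprise CA:

```
# Mount a separate PKI engine to act as the subordinate (intermediate) CA
vault secrets enable -path=pki_int pki

# Generate a CSR for the intermediate; the private key never leaves Vault
vault write -field=csr pki_int/intermediate/generate/internal \
    common_name="example.com Intermediate CA" > pki_int.csr

# Have the external enterprise CA (e.g. a Microsoft CA) sign pki_int.csr,
# then import the signed certificate back into Vault
vault write pki_int/intermediate/set-signed certificate=@signed_cert.pem
```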

But like any tool in a DevOps environment, Vault needs to be implemented and managed in a way that complies with enterprise security requirements.

The Bigger Picture: DevOps vs InfoSec

Taking a step back to look at the bigger picture, the use of keys and certs has grown rapidly in most organizations. That’s added a lot more complexity to the way we think about PKI.

It’s the story you’ve heard before: DevOps wants to move fast, but InfoSec needs to enforce security controls. When it comes to PKI, that story is no different.

Application and operations teams are not all that concerned with where a certificate is issued from or what policies it complies with. Their job is to build, package, test, and deliver applications efficiently, not to manage certificates. That often means finding ways to avoid time-consuming PKI processes and using non-compliant workarounds instead.

With the number of self-signed certificates being issued, and the abundance of tools that have their own built-in CAs (e.g., Ansible, Kubernetes, AWS, Azure), most security teams lose sight of how many keys and certificates they actually have. Knowing where those keys and certificates live, when they expire, and what policies they comply with is difficult across such a distributed set of tools and applications. These unknowns lead to serious security risks and certificate-related outages.

When something does go wrong – for instance, if a certificate expires or a CA is compromised – it can take hours to find and remediate the issue, resulting in downtime and lost productivity. Without direct accountability, it really becomes a game of cat-and-mouse between DevOps and InfoSec teams.

That is, unless there’s a solution that provides the visibility and control InfoSec needs without compromising the automation and ease of access that DevOps requires.

Keyfactor Vault Integration

Using the Keyfactor Secrets Engine with HashiCorp Vault, organizations can deploy and use Vault in a way that makes both InfoSec and DevOps teams happy.


The integration starts with gaining visibility into the certificates across all of your Vault instances and pulling them into a single enterprise inventory. From there you can apply a consistent set of policies, compliance checks, and audits, while ensuring that every certificate is issued from an authorized and trusted public or private CA, or from Keyfactor’s cloud-hosted PKI platform.

The struggle between DevOps and InfoSec teams is real, but there is a way to achieve appropriate levels of security without disruption. With the Keyfactor Secrets Engine, the InfoSec team gets a standardized way to integrate with Vault, while DevOps teams can issue certs with the same native Vault APIs and commands they’re used to, and gain access to publicly trusted certificates for use in production.
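
As a hypothetical sketch of what that looks like day to day, assume the Keyfactor Secrets Engine plugin has already been registered with the Vault server, mounted at a path named keyfactor, and configured with a role named web-servers that maps to policy defined on the Keyfactor side; the mount path, role name, and parameters are illustrative rather than taken from the webinar. The point the webinar makes is that requests follow the same write/issue pattern as the native PKI engine shown above.

```
# Enable the Keyfactor secrets engine plugin (mount path is illustrative;
# assumes the plugin binary has already been registered with this server)
vault secrets enable -path=keyfactor keyfactor

# Request a certificate with the same issue-style workflow DevOps teams
# already use with the native PKI engine (role name is hypothetical and
# would map to certificate policy managed by the InfoSec team in Keyfactor)
vault write keyfactor/issue/web-servers \
    common_name="app.example.com"
```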

Get an inside look at how Keyfactor integrates with HashiCorp Vault in our on-demand webinar:

Watch Now


*** This is a Security Bloggers Network syndicated blog from PKI Blog authored by Ryan Sanders. Read the original post at: https://blog.keyfactor.com/keyfactor-secrets-integration-hashicorp


Ryan Sanders

Ryan Sanders is a Toronto-based product lead with Keyfactor, a leader in secure digital identity solutions for Global 2000 enterprises. Ryan has a passion for cybersecurity and actively analyzes the latest in compliance mandates, market trends, and industry best practices related to public key infrastructure (PKI) and digital certificates.
