Making a Case for Tokenization in the Enterprise

Tokenization can be used to protect any sensitive data within an organization, with little overhead

One of the most difficult tasks in information security is protecting sensitive data across complex, distributed enterprise environments: a mix of legacy systems and cloud-based applications, all intertwined with critical business requirements. Added to the ever-present risk of data breaches are privacy laws and regulations such as the GDPR and the California Consumer Privacy Act.

When implemented properly, encryption is one of the most effective security controls available to enterprises, but it can be challenging to deploy and maintain across a complex enterprise landscape. Worse, sensitive data can still be exploited should it be exposed in a breach. Fortunately, there are other data protection options that enterprises can implement with far less disruption—namely, tokenization.

Increasingly, enterprises are turning to tokenization because it offers a stateless, data-centric approach with fewer security gaps and risks. With tokenization, security travels with the data while it’s at rest, in use and in motion. As a result, no additional security methods are needed to provide protection when the data leaves the enterprise.

Tokenization accomplishes this by replacing the original sensitive data with randomly generated substitute characters. These placeholder characters, known as tokens, have no intrinsic value, but they allow authorized users to retrieve the sensitive data when needed. If tokenized data is lost or stolen, it is useless to cybercriminals. Tokens can also be stored in the same size and format as the original data. This is ideal for enterprise environments—especially those with legacy systems—since the tokenized data requires no changes to database schemas or processes.
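To make the idea concrete, here is a deliberately simplified, in-memory sketch of format-preserving tokenization using a simple lookup table (the "vaulted" model discussed below). The vault dictionary, the character-by-character substitution and the function names are all illustrative; production token services run on hardened, audited infrastructure.

```python
import secrets
import string

# Toy in-memory vault; a real deployment would use a hardened token service.
_vault = {}  # token -> original value

def tokenize(value: str) -> str:
    """Replace each character with a random one of the same class so the
    token keeps the original length and format."""
    chars = []
    for ch in value:
        if ch.isdigit():
            chars.append(secrets.choice(string.digits))
        elif ch.isalpha():
            chars.append(secrets.choice(string.ascii_letters))
        else:
            chars.append(ch)  # keep separators such as dashes and spaces
    token = "".join(chars)
    _vault[token] = value
    return token

def detokenize(token: str) -> str:
    """Return the original value; callers must be authorized."""
    return _vault[token]

token = tokenize("4111-1111-1111-1111")
# token might be "8302-9941-0867-5521": worthless to a thief, yet it fits
# any database column or process expecting a 19-character card format.
```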

Broad Applications

The use of tokenization also minimizes data exposure: applications generally work with tokens and access real values only when absolutely necessary. Although tokenization is most often associated with credit cards, it is applicable to virtually every industry, for data types such as Social Security numbers, birth dates, passport numbers and account numbers. Through network-level and REST APIs, tokenization can be integrated into a wide variety of enterprise environments.
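As an illustration of the REST integration point, a client call to a tokenization service might look like the following sketch. The base URL, endpoint path, field names and bearer-token authentication are all hypothetical, not any specific vendor's API.

```python
import requests

BASE_URL = "https://tokens.example.com/api/v1"  # hypothetical service
HEADERS = {"Authorization": "Bearer <access-token>"}

def tokenize_value(value: str, data_type: str) -> str:
    """Ask the tokenization service to replace a sensitive value with a token."""
    resp = requests.post(
        f"{BASE_URL}/tokenize",
        json={"data": value, "type": data_type},
        headers=HEADERS,
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["token"]

# e.g. store the returned token in place of the real SSN
ssn_token = tokenize_value("078-05-1120", "ssn")
```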

Tokenization is also a secret weapon for organizations with heavy compliance burdens. Financial institutions, for instance, are often responsible for securing millions of account holder credentials in data infrastructures that are subject to PCI DSS. Tokenizing as much data as possible eases their compliance burden, since tokenized data generally falls outside the scope of audits.

With the advent of vaultless tokenization, the implementation of tokenization in the enterprise is now a relatively straightforward affair. Legacy methods of “vaulted” tokenization require maintaining databases with tokens and their corresponding real data. These token vaults represent a high-risk target for theft. Furthermore, large token vaults often present complex implementation problems, particularly in distributed, worldwide deployments. One could argue that the implementation challenges surrounding vaulted tokenization are a primary reason why enterprises continue to leave sensitive data vulnerable to cyberattackers.

No Vault Database to Maintain

In contrast, vaultless tokenization is safer and more efficient while offering the choice of on-premises or cloud deployment. In this model, a hardware security module (HSM) is used to cryptographically tokenize data. Authorized parties and applications can then detokenize it, retrieving the appropriate portion of a record. Because tokens are derived cryptographically, there is no token vault or centralized token database to maintain.
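A minimal sketch of the underlying idea, assuming a keyed Feistel construction over digit strings (the same principle behind standardized format-preserving encryption schemes such as NIST FF1). The key handling, round count and round function here are simplified for illustration; in practice the key never leaves the HSM, which performs the transformation internally.

```python
import hmac
import hashlib

KEY = b"demo-key-held-by-the-hsm"  # stand-in for an HSM-held key
ROUNDS = 10

def _round_fn(key: bytes, rnd: int, half: str, width: int) -> int:
    """Pseudorandom function for one Feistel round, reduced to `width` digits."""
    digest = hmac.new(key, f"{rnd}:{half}".encode(), hashlib.sha256).digest()
    return int.from_bytes(digest[:8], "big") % (10 ** width)

def tokenize(digits: str, key: bytes = KEY) -> str:
    """Map a digit string to a same-length token, with no vault required."""
    lw, rw = len(digits) // 2, len(digits) - len(digits) // 2
    left, right = int(digits[:lw] or "0"), int(digits[lw:])
    for rnd in range(ROUNDS):
        left, right = right, (left + _round_fn(key, rnd, str(right).zfill(rw), lw)) % (10 ** lw)
        lw, rw = rw, lw
    return str(left).zfill(lw) + str(right).zfill(rw)

def detokenize(token: str, key: bytes = KEY) -> str:
    """Invert tokenize() for callers authorized to use the key."""
    lw, rw = len(token) // 2, len(token) - len(token) // 2
    left, right = int(token[:lw] or "0"), int(token[lw:])
    for rnd in reversed(range(ROUNDS)):
        left, right = (right - _round_fn(key, rnd, str(left).zfill(lw), rw)) % (10 ** rw), left
        lw, rw = rw, lw
    return str(left).zfill(lw) + str(right).zfill(rw)

pan = "4111111111111111"
token = tokenize(pan)
assert len(token) == len(pan) and detokenize(token) == pan
```

Because the token is derived deterministically from the key and the input, any node holding the key (in practice, any node with access to the HSM) can tokenize or detokenize without consulting a shared database, which is what makes distributed, worldwide deployments straightforward.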

The information security principle of least privilege dictates that organizations limit access to sensitive data to only what an individual needs to do their job; any additional access is an unnecessary exposure of sensitive data. Intelligence agencies have operated under this “need-to-know” principle for years. It reduces the risk of both accidental and intentional data breaches.

Customizing detokenization output based on user or application role is one way to accomplish this. For example, loyalty applications may find a partially detokenized account number, perhaps just the last four digits of a credit card number, sufficient for their purposes, while an e-commerce application would likely require a fully detokenized account number for repeat purchases. Other applications, such as business analytics, may be able to use the token itself as an identifier without any need to ever detokenize it.
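One way to express such a policy in code, as a rough sketch: the role names and masking rules below are invented for illustration, and detokenize() stands in for a call to the tokenization service.

```python
# Illustrative role-based detokenization policy; roles and rules are
# hypothetical, not taken from any particular product.
MASKING_POLICY = {
    "loyalty": lambda pan: "*" * (len(pan) - 4) + pan[-4:],  # last four digits
    "ecommerce": lambda pan: pan,                            # full account number
}

def detokenize_for_role(token: str, role: str) -> str:
    """Return only as much of the real value as the caller's role requires."""
    if role not in MASKING_POLICY:
        # e.g. analytics: the token itself is a stable identifier,
        # so there is no need to detokenize at all
        return token
    pan = detokenize(token)  # stand-in for a call to the token service
    return MASKING_POLICY[role](pan)
```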

Historically, the protection of credit and debit card numbers, for both payment and non-payment processes, has been the main application for tokenization, but the largest opportunity going forward is the general protection of sensitive data. With the costs of recovering from a data breach spiraling out of control, the case for tokenization in the enterprise is an easy one to make.

David Close

David Close is chief solutions architect at Futurex, a trusted provider of hardened enterprise data security solutions. Close heads up major projects involving the design, development, and deployment of mission-critical systems used by organizations for their cryptographic needs, including the secure encryption, storage, transmission, and certification of sensitive data. Close is a subject matter expert in enterprise key management best practices and systems architecture and infrastructure design. He holds a B.S. in Computer Engineering from St. Mary’s University.
