Tokenization: Your Secret Weapon for Data Security?

Author: David Close, CISSP, GCIH, GPEN, GWAPT, ISTQB
Date Published: 11 November 2019

Consider this: 4.1 billion records have been compromised in the first half of 2019 alone, a 54% increase over the same period in 2018.1 Organizations in nearly every industry are facing an ever-growing and out-of-control cybersecurity crisis.

Encryption is one of the most effective security controls available to enterprises, but it can be challenging to deploy and maintain across a complex enterprise landscape. Instead, organizations often invest in simpler security measures such as perimeter defenses, password-rule enforcement and timely patching. But when these traditional defenses fail to hold up—and clearly, they are insufficient—hackers make off with a treasure trove of your organization’s most sensitive data.

To fight back, a growing number of organizations are turning to tokenization as a cost-effective means to protect important data without impacting ongoing operations. With tokenization, sensitive data are masked by nonsensitive substitute values, making the data unidentifiable and useless to attackers.

Tokenization is an approach to protect data at rest while preserving data type and length. It replaces the original sensitive data with randomly generated, nonsensitive substitute characters as placeholder data. These random characters, known as tokens, have no intrinsic value, but they allow authorized users to retrieve the sensitive data when needed. If tokenized data are lost or stolen, they are useless to cybercriminals. The tokenized data can also be stored in the same size and format as the original data. This is ideal for enterprise environments—especially those with legacy systems—since the tokenized data require no changes in database schema or processes.
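To make this concrete, the short Python sketch below preserves the length and all-digit format of a card number while substituting its digits. It is a toy for illustration only: production systems perform this transformation with vetted format-preserving encryption (such as NIST FF1) inside a hardware security module, and every function name and key shown here is an assumption of this example, not any vendor's API.

```python
import hmac
import hashlib

def _digit_keystream(key: bytes, tweak: bytes, n: int) -> list[int]:
    """Derive n pseudorandom digits from the key and a per-field tweak.
    (b % 10 is slightly biased; acceptable for a demo, not for production.)"""
    stream, counter = [], 0
    while len(stream) < n:
        block = hmac.new(key, tweak + counter.to_bytes(4, "big"),
                         hashlib.sha256).digest()
        stream.extend(b % 10 for b in block)
        counter += 1
    return stream[:n]

def tokenize(pan: str, key: bytes, tweak: bytes = b"pan") -> str:
    """Replace each digit with (digit + keystream digit) mod 10.
    Length and all-digit format are preserved, so the token fits the
    same database column as the original value."""
    ks = _digit_keystream(key, tweak, len(pan))
    return "".join(str((int(d) + k) % 10) for d, k in zip(pan, ks))

def detokenize(token: str, key: bytes, tweak: bytes = b"pan") -> str:
    """Reverse the substitution for authorized callers holding the key."""
    ks = _digit_keystream(key, tweak, len(token))
    return "".join(str((int(d) - k) % 10) for d, k in zip(token, ks))

key = b"demo-key-kept-inside-an-HSM-in-practice"
token = tokenize("4111111111111111", key)
print(token)  # 16 digits, same shape as a card number, useless to a thief
assert detokenize(token, key) == "4111111111111111"
```

Note that a fixed keystream like this leaks relationships between values tokenized under the same tweak, which is one reason real deployments rely on standardized format-preserving encryption rather than ad hoc schemes.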

Minimizes Data Exposure

Applications will generally use tokens and only access real values when absolutely necessary. Tokenization was originally developed for the credit card industry to reduce the scope of audits. Now, with the advent of lightweight yet powerful tokenization solutions, any industry responsible for securing sensitive data—including data such as Social Security numbers, birth dates, passport numbers and account numbers—can implement tokenization and minimize data exposure.
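The sketch below illustrates that pattern, assuming the toy tokenize/detokenize functions from the previous example: the application stores only a token and a display-safe fragment, and recovers the real value in a single, narrow code path. The record layout and function names are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class CustomerRecord:
    name: str
    pan_token: str   # token stored in place of the real card number
    last4: str       # safe-to-display fragment kept alongside the token

def display_card(record: CustomerRecord) -> str:
    # The common path: no sensitive value is ever touched.
    return f"{record.name}: card ending in {record.last4}"

def charge_card(record: CustomerRecord,
                detokenize: Callable[[str], str],
                amount_cents: int) -> None:
    # The one narrow, audited path that recovers the real number.
    pan = detokenize(record.pan_token)
    # ...hand pan to the payment processor, then discard it immediately.
```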

What About Implementation?

The implementation of tokenization throughout the enterprise is now fairly straightforward, thanks to vaultless tokenization.2 Legacy methods of “vaulted” tokenization require maintaining databases with tokens and their corresponding real data. These token vaults represent a high-risk target for theft. In addition, large token vaults often present complex implementation problems, particularly in distributed, worldwide deployments. Implementation challenges surrounding vaulted tokenization are one of the reasons why enterprises continue to leave sensitive data vulnerable to cyberattackers.
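As a rough sketch of why the vaulted model is burdensome, consider the following toy implementation, in which a simple dictionary stands in for the token database. Everything here is illustrative, but it shows the structural issue: the mapping itself must be stored, secured, backed up and replicated, and whoever steals it recovers every protected value.

```python
import secrets

# A toy token vault: the mapping itself is the crown jewel an attacker
# wants, which is the core weakness of the vaulted model.
vault: dict[str, str] = {}

def vaulted_tokenize(pan: str) -> str:
    while True:
        token = "".join(secrets.choice("0123456789") for _ in pan)
        if token not in vault:       # retry on the rare collision
            vault[token] = pan       # vault must be stored, secured, replicated...
            return token

def vaulted_detokenize(token: str) -> str:
    return vault[token]              # every lookup depends on this database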

No Vault Database Required

In contrast, vaultless tokenization is safer and more efficient, while offering the advantage of either on-premises or cloud deployment. In this model, a hardware security module (HSM)3 is used to cryptographically tokenize data. These data can then be detokenized, returning the appropriate portion of a record for use by authorized parties or applications. In this model, there is no token vault or centralized token database to maintain. Using network-level and representational state transfer application program interfaces (REST APIs), tokenization can be efficiently integrated into nearly any enterprise environment.
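In practice, the integration is often a small HTTPS request from the application to the tokenization service. The endpoint, field names and authentication below are hypothetical placeholders rather than any particular vendor's API; they simply illustrate the shape of a REST-based tokenize/detokenize exchange.

```python
import requests

BASE_URL = "https://tokenization.example.com/api"   # hypothetical service
HEADERS = {"Authorization": "Bearer <access-token>"}

# Ask the service (backed by an HSM) to tokenize a value.
resp = requests.post(f"{BASE_URL}/tokenize",
                     json={"value": "4111111111111111", "format": "digits"},
                     headers=HEADERS, timeout=5)
resp.raise_for_status()
token = resp.json()["token"]

# Later, an authorized application detokenizes only when needed.
resp = requests.post(f"{BASE_URL}/detokenize",
                     json={"token": token},
                     headers=HEADERS, timeout=5)
resp.raise_for_status()
value = resp.json()["value"]
```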

Historically, the main application for tokenization has been the protection of credit and debit card numbers, both for payment and nonpayment processes. However, the largest opportunity going forward is the general protection of sensitive data. The likelihood that your organization will be breached is steadily increasing; coupled with the skyrocketing costs of data breach recovery, this makes the case for tokenization in the enterprise compelling.

David Close

Is chief solutions architect at Futurex, a trusted provider of hardened enterprise data security solutions. Close heads up major projects involving the design, development and deployment of mission-critical systems used by organizations for their cryptographic needs, including the secure encryption, storage, transmission and certification of sensitive data. Close is a subject matter expert in enterprise key management best practices and systems architecture and infrastructure design.

Endnotes

1 Risk Based Security, The 2019 MidYear Data Breach QuickView Report, USA, 2019
2 Futurex, Vaultless Tokenization
3 Futurex, Hardware Security Modules