
Data States Security Experts Unhappy With Traditional Tokenization

Published 12/08/2022


Originally published by Titaniam.

Titaniam’s 2022 State of Enterprise Tokenization Survey shows that the vast majority of cybersecurity experts are dissatisfied with their current tokenization tools. Despite many organizations spending over $1 million annually on tokenization, 99% of respondents indicated that they were unhappy with the tradeoffs traditional tokenization forces on them. Security tools such as traditional tokenization may soon take a backseat to newer, more innovative solutions designed to deliver the strong security users require without giving up the use of the underlying data.

The survey revealed:

  • Almost 40% of respondents spend over $1,000,000 on tokenization every year, yet 70% have experienced sensitive data theft by an external adversary in the last 12 months.
  • Of the 70% who experienced data theft over the last year, nearly all (98.63%) believe it could have been prevented with a more modern data security solution.

Tokenization Explained

Tokenization is a security method originally designed for payment card numbers: the original number is swapped for a token, simply a secondary string of characters. The token maps one-to-one to the original card number while having no monetary value by itself. This lets the token flow through transactions seamlessly and minimizes the potential for payment card information to be stolen. The method is still the best practice for payment card numbers, since they are used only for monetary transactions and for no other purpose. Today, many companies apply it to other types of sensitive data as well, with the idea of giving other types of PII the same level of protection available for payment card numbers.
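As a concrete illustration, here is a minimal sketch of vault-based tokenization in Python. The in-memory vault, the `tokenize`/`detokenize` names, and the random-hex token format are illustrative assumptions; production systems use hardened, access-controlled vaults and often format-preserving tokens.

```python
import secrets

# Minimal in-memory "token vault": maps each token back to the original
# card number. Real deployments back this with a hardened database.
_vault = {}

def tokenize(card_number: str) -> str:
    """Swap a card number for a random surrogate with no value by itself."""
    token = secrets.token_hex(8)  # random hex string, reveals nothing
    _vault[token] = card_number   # one-to-one mapping kept only in the vault
    return token

def detokenize(token: str) -> str:
    """Reverse the swap; only the vault can recover the original number."""
    return _vault[token]

token = tokenize("4111 1111 1111 1111")
print(token)              # e.g. 'f3a9c1d25e7b4086' -- safe to pass around
print(detokenize(token))  # original number, recoverable only via the vault
```

Because the token itself carries no information, stealing it is worthless without also compromising the vault, which is exactly why the scheme works so well for payment flows.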

Unfortunately, tokenization was never meant to secure data more complex than payment card numbers, and cybercriminals have begun seeking a wide variety of data that goes well beyond payment card numbers. These same attackers are now identifying and targeting Personally Identifiable Information (PII) and Protected Health Information (PHI), meaning they are stealing addresses, social security numbers, medical information and far more. In response, some organizations have chosen to invest in the previously tried and true security method of tokenization.

Companies quickly discovered tokenization’s limits. After spending millions of dollars integrating tokenization systems into their workflows, companies are now faced with the realization that it is only useful if they truly have no need to analyze the underlying data. The reality on the ground is that the same sensitive data they seek to protect is also required daily to understand customer behaviors, needs, and pain points. Information such as names and addresses must be indexed and readily available for rich searches that contribute valuable insight to customer relations.
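The sketch below shows why that breaks down: tokens are random surrogates that preserve none of the structure of the original values, so a search that works on clear text finds nothing in a tokenized column (the `tokenize` helper here is hypothetical, as above).

```python
import secrets

def tokenize(value: str) -> str:
    """Replace a sensitive value with a random, meaningless surrogate."""
    return secrets.token_hex(8)

customers = ["Alice Johnson", "Alan Turing", "Bob Smith"]
tokenized = [tokenize(name) for name in customers]

# A prefix search works on clear text...
print([n for n in customers if n.startswith("Al")])  # ['Alice Johnson', 'Alan Turing']

# ...but finds nothing in the tokenized column: the surrogates preserve no
# structure of the originals, so indexing and rich search simply break.
print([t for t in tokenized if t.startswith("Al")])  # []
```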

Some companies process this sensitive data in clear text format, accepting the risk of a data breach; others detokenize data in bulk, reversing the tokenization process so the data can be used, only to delete the clear text afterward.

Neither solution is secure, and the detokenization process is time-consuming.
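A rough sketch of that bulk detokenization workaround, again assuming a simple in-memory vault, shows where both the time cost and the exposure window come from.

```python
import secrets

# Hypothetical in-memory vault populated when the records were tokenized.
vault = {}

def tokenize(value: str) -> str:
    token = secrets.token_hex(8)
    vault[token] = value
    return token

# Stored records hold only tokens; the clear text lives in the vault.
records = [tokenize(ssn) for ssn in ("078-05-1120", "219-09-9999")]

def run_analytics(rows):
    return len(rows)  # stand-in for the real analysis step

# The workaround: detokenize everything, analyze, then delete the clear text.
clear_text = [vault[t] for t in records]  # one vault round-trip per record
result = run_analytics(clear_text)

# While clear_text exists, the data is as exposed as if it had never been
# tokenized; deleting it afterward does not undo that window of risk.
del clear_text
```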

The Current State

Considering tokenization’s original intent alongside the complexity of today’s data brings to light glaring issues in security plans that use tokenization as the main shield. The survey also shows that the top three types of data housed today are employee data, customer data, and payment card data. While theft of any of these could result in dire consequences, tokenization, again, was designed only to secure payment card data.

The same report concluded that nearly half (47%) of companies using tokenization can’t even tokenize all of their necessary data due to insufficient insight, and some are unable due to lack of performance (44%) or lack of context (41%). Not only are companies spending millions of dollars on a security tool intended for entirely different data, but the same tool cannot be relied upon to prevent data loss.

In fact, the survey disclosed that 99% of companies surveyed are unhappy with the invasive and disruptive nature of tokenization. While tokenization has its time and place, protecting PII and PHI, which make up the majority of compromised data, just isn’t it. Companies must begin looking ahead to the future and let go of traditional security tools that are no longer mitigating or preventing modern-day ransomware attacks.

The Next Generation

According to the survey, over 85% of companies using traditional tokenization methods bring the data back into clear text format, negating the security benefits. This puts companies at a disadvantage and at a higher risk of data loss during a breach.

The bottom line: traditional tokenization simply cannot provide the coverage required in today’s threat environment. Cybercriminals are already looking ahead, incorporating data exfiltration and extortion tactics into their breach attempts, and current tools aren’t keeping up.
