
It’s Not ‘See You Later.’ It’s ‘Goodbye’: Moving on from Tokenization in the Age of Ransomware

Published 10/12/2022


Written by Arti Raman, CEO, Titaniam.

Invented in 2001, tokenization addressed the risk of losing cardholder data from eCommerce platforms and remains the gold standard for protecting sensitive information.

The concept was simple (a minimal code sketch follows the list):

  • Swap payment card numbers for substitute numbers, i.e., tokens, with a 1-1 correlation between a token and its underlying card.
  • The token serves as a stand-in for the actual payment card number.
  • Transactions flow through entire financial workflows without risking payment card compromise.
  • Financial institutions “clear” these transactions by matching tokens with the original payment cards in secure back-end environments.
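
To make the mechanics concrete, here is a minimal token vault sketch in Python. The class and method names are illustrative, not any particular vendor’s API.

```python
import secrets

class TokenVault:
    """Toy token vault: swaps card numbers for random tokens (1:1 mapping)."""

    def __init__(self):
        self._token_to_pan = {}  # token -> real card number (held in a secure back end)
        self._pan_to_token = {}  # ensures the same card always maps to the same token

    def tokenize(self, pan: str) -> str:
        """Return the existing token for this card, or mint a new random one."""
        if pan in self._pan_to_token:
            return self._pan_to_token[pan]
        # Format-preserving stand-in: random digits, same length as the card number.
        token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
        while token in self._token_to_pan:  # regenerate on the (rare) collision
            token = "".join(secrets.choice("0123456789") for _ in range(len(pan)))
        self._token_to_pan[token] = pan
        self._pan_to_token[pan] = token
        return token

    def detokenize(self, token: str) -> str:
        """'Clear' a transaction by matching the token back to the original card."""
        return self._token_to_pan[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")
# Downstream systems see only the token; the real card number never leaves the vault.
print(token, vault.detokenize(token) == "4111111111111111")
```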

When cyberattacks started targeting data beyond payment cards, organizations began losing other types of Personally Identifiable Information (PII) and Protected Health Information (PHI). In response, the cybersecurity industry reached for tokenization as a vetted tactic. Organizations that wished to protect complex data spent hundreds of millions of dollars implementing and integrating tokenization into complex workflows.

However, these same enterprises continue to lose billions of PII records to cyberattacks. According to BigID, an alarming 80% of compromised data contains PII, which is often sensitive, vulnerable, and regulated.

Tokenization was invented to secure payment card data throughout a transaction without storing the actual data, nothing more, and it does not accommodate other types of sensitive data.

Tokenization’s Shortcomings

For instance, names and addresses are stored in enormous databases and indexed for search, with the expectation that they will eventually be retrieved. Applying tokenization to protect this data makes little sense, as the toy example below shows.
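
The underlying problem is that a token has no relationship to the value it replaces, so ordinary search predicates stop matching. The records and token strings below are illustrative.

```python
# Clear-text records, as an application would normally index them.
customers = [
    {"name": "Alice Johnson", "city": "Austin"},
    {"name": "Bob Johnson",   "city": "Boston"},
]

# Tokenized copies (tokens minted by a vault like the one sketched earlier).
tokenized = [
    {"name": "tkn_9f2c", "city": "tkn_51ab"},
    {"name": "tkn_77d0", "city": "tkn_08e4"},
]

# A substring search works on clear text...
print([c["name"] for c in customers if "Johnson" in c["name"]])   # both rows

# ...but finds nothing in the tokenized store, because "Johnson" never appears.
print([c["name"] for c in tokenized if "Johnson" in c["name"]])   # []
```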

But enterprises did it anyway and ended up with complex implementations in which tokens were swapped in and out of systems and reconciled in token vaults. Whenever an enterprise faced sensitive data that needed to be richly searched and analyzed, it did one of two things:

  1. They processed it in clear text and accepted the risk of losing it in a data breach.
  2. They implemented large-scale batch detokenization processes, placing enormous volumes of detokenized sensitive data in “walled gardens.”

Not only is this process slow, cumbersome, and insecure, but it also defeats the purpose of tokenization.

Today, clear text analytic stores that were either never tokenized or formed after detokenization represent the largest concentration of sensitive data risk.

As a result, ransomware attackers now steal privileged credentials, hunt for these large repositories, and walk away with millions of sensitive records in clear text for later extortion. Ultimately, the data makes its way to the dark web for sale, with a long tail of impact. In fact, the average cost of a ransomware attack in 2021 was $1.85 million, almost twice the previous year’s figure.

With data collection at an all-time high, it is unsurprising that the average organization using tokenization protects less than 10% of its sensitive data.

What About Data Encryption?

While many victim enterprises do employ encryption, not all encryption techniques are equal. Data exists in three states: at rest, in transit, and in use. Until recently, the only viable types of encryption were encryption for data at rest and encryption for data in transit. However, once hackers obtain highly privileged datastore credentials, these protections fall away: it is easy to query datastores or dump their contents en masse, as the sketch below illustrates.
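
Here is a rough illustration of why at-rest encryption does not stop a credentialed attacker: the datastore decrypts transparently for any authenticated query. The Datastore class and the credential string are hypothetical; Fernet comes from the real Python cryptography package.

```python
from cryptography.fernet import Fernet  # pip install cryptography

class Datastore:
    """Toy datastore with at-rest encryption: rows are ciphertext on 'disk',
    but any caller presenting valid credentials gets clear text back."""

    def __init__(self):
        self._key = Fernet(Fernet.generate_key())
        self._rows = []  # ciphertext only: safe if someone steals the disk

    def insert(self, record: str):
        self._rows.append(self._key.encrypt(record.encode()))

    def query(self, credentials: str):
        if credentials != "svc-account-password":  # hypothetical credential check
            raise PermissionError("bad credentials")
        # At-rest encryption is transparent to authenticated queries.
        return [self._key.decrypt(row).decode() for row in self._rows]

db = Datastore()
db.insert("Alice Johnson, SSN 000-00-0000")
# An attacker who steals the service account's credentials gets clear text:
print(db.query("svc-account-password"))
```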

What can enterprises do?

Encryption-in-use is a new type of data protection in which encryption stays on throughout active use, including while running rich searches. Queries and analytics can be supported without decrypting the data, and result sets come back in encrypted form even under the highest-privileged credentials. Hackers who access memory encounter only encrypted data.
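
Commercial encryption-in-use schemes are proprietary, but a blind index gives a feel for querying without decryption: equality matches run against a keyed hash while stored values stay encrypted. This is a minimal sketch of the general blind-indexing technique, not any vendor’s implementation; the key and field names are made up.

```python
import hmac, hashlib
from cryptography.fernet import Fernet  # pip install cryptography

INDEX_KEY = b"hypothetical-index-key"
cipher = Fernet(Fernet.generate_key())

def blind_index(value: str) -> str:
    """Keyed hash used for equality search; reveals nothing without INDEX_KEY."""
    return hmac.new(INDEX_KEY, value.lower().encode(), hashlib.sha256).hexdigest()

# Store only ciphertext plus a blind index per searchable field.
store = []
for name in ["Alice Johnson", "Bob Smith"]:
    store.append({"name_ct": cipher.encrypt(name.encode()),
                  "name_ix": blind_index(name)})

# Query: hash the search term and compare indexes. Nothing is decrypted,
# and the matching result comes back still encrypted.
needle = blind_index("alice johnson")
matches = [row["name_ct"] for row in store if row["name_ix"] == needle]
print(matches)  # encrypted bytes; reading them requires the data key
```

Note the trade-off: a deterministic index supports only equality lookups, which is why full encryption-in-use systems that also handle rich search and analytics are a meaningful advance.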

Encryption-in-use has advanced to a stage where performance and scale are no longer a concern, and organizations have the means to expand sensitive data protection from 10% to 100%, closing the large extortion gap that exists today.

In the last two years, ransomware has risen to become the dominant cybersecurity threat for enterprises and governments, and tokenization is simply not enough. Encryption-in-use providers are seeing a strong influx of customers, keen attention from analysts, and a free flow of investment.

Will encryption-in-use make traditional tokenization obsolete? We will see.
