Cloud Data Security Requires 20/20 Vision
Published 08/08/2022
Originally published by Laminar here.
Written by Oran Avraham, Laminar.
No reasonable business leader would ever dream of leaving their logistics software unmanaged or their sales departments to their own devices. Visibility into every aspect of a business—every crevice, no matter how large or small—is critical to the success of any operation. Lack of visibility leaves businesses open to risks in the form of theft, inefficiencies, customer dissatisfaction and so much more.
As the business world continues to thrive on big data—and with more of that data stored in the cloud—visibility into a company’s data is undeniably important.
Data visibility into a self-contained, on-prem system is one thing, but that structure is hard to come by these days. Most modern businesses rely on the cloud to improve flexibility, to increase scalability, and to execute tasks quickly and effortlessly.
As the cloud allows businesses to work efficiently from anywhere at any time, that greater access also drives higher levels of risk. With the increased pace of change and the sprawl of new cloud technologies, an organization’s data ends up spread across many places, leaving some of it more or less invisible in a “dark corner.”
Many large brands have already come face to face with this reality. Earlier this year, SEGA Europe sustained a massive data breach after someone inadvertently stored sensitive files in a publicly accessible AWS S3 bucket. Similarly, a “glitch” caused some Twitter users’ personal information and passwords to be stored in readable plain text on the company’s internal systems rather than being masked by its hashing process. The breaches of these two shadow environments show how a small mistake can lead to public scrutiny and damage a brand.
Ignorance is (Not) Bliss
Some may argue that data visibility before the cloud was mediocre at best, often undermined by poor employee security awareness and vague data protection policies. The move to the cloud magnified that weakness and contributed to the ever-increasing number of data breaches experienced today.
One of the biggest factors contributing to this data breach culture is the sheer absence of comprehensive data visibility. It’s almost become an inevitable outcome—the price of admission, so to speak—that an organization can’t know what’s going on with every piece of data. Many professionals have accepted that conclusion as fact.
Often referred to as “shadow data,” hidden sensitive files and programs occur when data is copied, backed up or housed in a data store that is neither governed under the same security structure nor kept up to date. What some have simply accepted as the cost of doing business is turning out to be one of the largest threats to data security.
Shadow data is primarily the result of four main changes to data culture: the proliferation of technology and its associated complexity, the limited bandwidth of data protection teams struggling to keep up, the democratization of data and the removal of the on-prem perimeter.
What Lurks in the Shadows?
While hidden data can be a result of several different situations, it typically occurs when sensitive data—customer information, employee information, financial data, applications, intellectual property, etc.—is copied in an unsanctioned way. When data is copied and stored in a way that makes the files or programs invisible to a data protection team, those assets are unsecured and unmanageable using most modern security tools. Below are a few examples of how shadow data comes about:
- S3 Backups: Almost every modern business has at least one backup data store that it uses as a contingency plan in case of a breach or damage to its production environment. The backup data store is meant to keep exact copies of production data in case of an emergency. However, these backups are often left unmonitored and can mistakenly expose large amounts of data to the public, as in the SEGA Europe example (a basic spot check for public exposure is sketched after this list).
- Leftover Data from Cloud Migration: As organizations move to the cloud, they often run “lift and shift” data migration projects, but too often the original data never gets deleted. This lingering data remains unmanaged, unmaintained and often forgotten, which can lead to vulnerabilities down the line.
- Test Environment: Most organizations keep a partial copy of their production database (an RDS instance, for example) in a development or test environment where developers build applications and test programs. Developers often need to move quickly and may take a snapshot of some data but fail to properly remove or secure the copy—or simply forget about it (a simple check for stale snapshots is sketched after this list).
- Toxic Data Logs: When developers and logging frameworks mistakenly copy actual sensitive data into log files, the result is a “toxic” data log. For example, naming log files after a user’s email address exposes PII in violation of policy (a minimal redaction filter is sketched after this list).
- Analytics Pipeline: Many companies store data in some type of analytics pipeline, using tools like Snowflake, because it speeds up data retrieval and makes the data easier to manipulate and analyze. However, analytics pipelines are typically unmonitored by most security solutions today.
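For the S3 backup scenario, a lightweight spot check is to ask AWS whether any bucket is publicly reachable. The sketch below is a minimal illustration, assuming boto3 and read-only credentials that can call GetBucketPolicyStatus and GetPublicAccessBlock; it is not a substitute for a full audit.

```python
# Minimal sketch: flag S3 buckets that may be publicly reachable.
# Assumes boto3 is installed and AWS credentials with read-only S3
# permissions (GetBucketPolicyStatus, GetPublicAccessBlock) are configured.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    public = False

    # Does an attached bucket policy make the bucket public?
    try:
        status = s3.get_bucket_policy_status(Bucket=name)
        public = status["PolicyStatus"]["IsPublic"]
    except ClientError:
        pass  # no bucket policy attached

    # Is "Block Public Access" fully enabled?
    try:
        block = s3.get_public_access_block(Bucket=name)
        fully_blocked = all(block["PublicAccessBlockConfiguration"].values())
    except ClientError:
        fully_blocked = False  # no block-public-access configuration at all

    if public or not fully_blocked:
        print(f"Review: {name} (public policy={public}, "
              f"block-public-access={fully_blocked})")
```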
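For the test environment scenario, a similar spot check is to list manual database snapshots that have outlived their purpose. The sketch below again assumes boto3; the 30-day cutoff is an arbitrary value chosen only for illustration.

```python
# Minimal sketch: list manual RDS snapshots older than a cutoff so they
# can be reviewed and, if unneeded, deleted. The 30-day threshold is an
# arbitrary example, not a recommendation.
from datetime import datetime, timedelta, timezone
import boto3

rds = boto3.client("rds")
cutoff = datetime.now(timezone.utc) - timedelta(days=30)

paginator = rds.get_paginator("describe_db_snapshots")
for page in paginator.paginate(SnapshotType="manual"):
    for snap in page["DBSnapshots"]:
        created = snap.get("SnapshotCreateTime")
        if created and created < cutoff:
            print(f"Stale snapshot: {snap['DBSnapshotIdentifier']} "
                  f"(created {created:%Y-%m-%d})")
```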
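Toxic data logs can often be reduced at the source by scrubbing obvious PII before a record is ever written. The sketch below uses Python’s standard logging module; the email regex is deliberately simplistic, and real classifiers cover far more patterns.

```python
# Minimal sketch: a logging filter that masks email addresses before
# log records are written, so PII does not end up in "toxic" log files.
# The regex is intentionally simplistic and only illustrative.
import logging
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

class RedactEmails(logging.Filter):
    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = EMAIL_RE.sub("[REDACTED_EMAIL]", str(record.msg))
        return True  # keep the record, just with PII masked

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("app")
logger.addFilter(RedactEmails())

logger.info("Password reset requested for jane.doe@example.com")
# Logged as: Password reset requested for [REDACTED_EMAIL]
```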
Turning the Lights On
Shining a light into these “dark corners” of a business’ data stores can help thwart data breaches and other inadvertent vulnerabilities. Yes, it’s necessary for modern organizations to enable their employees to move at the speed of the cloud, but that doesn’t mean security has to play second fiddle. Shadow data will occur, but the beauty of modern technology is that new solutions and approaches to decades-old challenges emerge every day.
These solutions continuously discover and classify data, automatically detecting data stores and assets by scanning the entire cloud environment and revealing what sits in the shadows. Once everything is scanned, they categorize and classify files and programs and apply sanctioned data security policies, giving security teams the complete visibility and automated monitoring needed to manage all of a company’s assets effectively.
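As a rough illustration of what that discovery-and-classification step looks like in practice, the toy sketch below enumerates objects in a single hypothetical S3 bucket, samples their contents and flags matches against a couple of simple patterns. It is an assumption-laden simplification, not a description of how any particular vendor’s product works.

```python
# Toy sketch of discovery + classification: enumerate objects in an S3
# bucket, sample their contents, and flag likely-sensitive matches.
# The bucket name and patterns are illustrative assumptions only.
import re
import boto3

PATTERNS = {
    "email": re.compile(rb"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "us_ssn": re.compile(rb"\b\d{3}-\d{2}-\d{4}\b"),
}

s3 = boto3.client("s3")
bucket = "example-backup-bucket"  # hypothetical bucket name

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket):
    for obj in page.get("Contents", []):
        # Sample only the first 64 KB of each object to keep the scan cheap.
        body = s3.get_object(
            Bucket=bucket, Key=obj["Key"], Range="bytes=0-65535"
        )["Body"].read()
        hits = [label for label, rx in PATTERNS.items() if rx.search(body)]
        if hits:
            print(f"{obj['Key']}: possible {', '.join(hits)}")
```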
The number of breaches occurring “in the shadows” today should be enough for any business leader to reevaluate their approach to cloud security. Do they know where their sensitive data lives, and do they have the tools and resources to manage it? Full data observability lets businesses understand where their shadow data stores are, what their security posture is and who owns them. That visibility keeps data flowing smoothly and safely and lets the business thrive in a fast-moving, cloud-first world.