3 Data Strategies that will be Critical for Surviving 2021
Blog Article Published: 02/05/2021
By: Jim McGann, Vice President Marketing and Business Development, Index Engines
Users create content on a daily basis. Much of this content has no long-term value and is not business critical; however, a small percentage is key to running operations. Some of it contains sensitive client information, and some contains intellectual property. If this data goes missing or falls into the wrong hands through a ransomware attack, an organization would be severely handicapped and could be at risk of extinction.
In the past, most of the leading security applications could keep this data safe and secure inside the data center, but 2020 has shown us that data is now under massive attack by cyber criminals and is being used to cause great harm and expense to large and small organizations alike.
Over the past year we have seen the following examples of companies that did not secure and manage their data effectively.
In May, the New York-based law firm Grubman Shire Meiselas & Sacks was hit with REvil ransomware. The firm represents high-profile clients and celebrities who trust it with very sensitive information. 756 gigabytes of data were stolen during the attack, including contracts, nondisclosure agreements, phone numbers, email addresses and personal correspondence. The cyber criminals responsible demanded $42M to refrain from publishing this data on the internet and causing long-term damage to the firm’s reputation.
In June, the University of California said it paid over $1M to cyber criminals (who had asked for $3M) to unlock encrypted data that was part of its COVID-19 research. This intellectual property was critical to a possible treatment for the coronavirus, and the criminals threatened to publish it online and share it with the world.
In August, Travelex, the British foreign exchange company, was forced into bankruptcy following a ransomware attack. Travelex is a very data-driven business that relies on its reputation for secure and trusted banking. The attack not only shut down its business in the middle of a pandemic, but also created global embarrassment that could not be repaired.
Going forward, data will matter more than ever before, and protecting it will go to the next level, beyond existing data protection initiatives. In 2021, the following strategies will be critical for organizations that want to avoid the fates described above.
Continue to focus on keeping cyber criminals out of the data center, but also check the integrity of data using analytics to know when they have circumvented existing security measures. Analytics will examine how data changes over time and detect signs of corruption, such as encryption and unusual modifications that are attributable not to normal user activity but to suspicious cyber threats.
So where do you deploy analytics, given that the production network is already overloaded? The right place is your backup data. Backups will be used to recover from a ransomware attack, so ensuring that your backups have integrity, and that the data in the latest backup is good, is critical to minimizing disruption. Check backups daily; when corrupted data is detected in a backup, you will have the confidence of knowing you have a clean previous backup to restore, and you will avoid being held hostage by cyber criminals.
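One common way to spot the kind of corruption described above is an entropy check: ransomware-encrypted files look like random bytes, so their Shannon entropy approaches 8 bits per byte, while ordinary documents score much lower. Here is a minimal sketch of that idea in Python; it illustrates the general technique only, not any specific product, and the threshold value is an assumption chosen for illustration.

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte; values near 8.0 suggest
    encrypted or heavily compressed content."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def looks_encrypted(sample: bytes, threshold: float = 7.5) -> bool:
    """Flag a file sample whose entropy exceeds the threshold.
    The 7.5 cutoff is an illustrative assumption, not a standard."""
    return shannon_entropy(sample) >= threshold
```

A backup-scanning job could sample each file in the latest backup and alert when the proportion of high-entropy files jumps relative to the previous run, which is a sign of mass encryption rather than normal user edits.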
Understand your data environment to know where sensitive data is stored, and make sure it is managed based on its value. As data ages, it gets lost and goes unmanaged within the infrastructure. Indexing data in order to understand its content will be a key initiative in 2021. Knowing where sensitive files exist, including intellectual property, contracts and client information, will enable this content to be secured and made less available to inside and outside threats.
Reports will expose servers that hold thousands of Excel spreadsheets containing client addresses and bank account information, folders that store legal contracts, PSTs of key executives’ emails, and other data that, if found by cyber criminals, would quickly be exposed on the dark web. Find this information before they do; secure and protect it to avoid any public embarrassment.
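At its simplest, the kind of indexing and reporting described here is a pattern scan across a file tree. The sketch below is a toy illustration only: the two regular expressions are simplistic stand-ins for the validated detectors a real classification product would use, and the `.txt` filter is an assumption to keep the example small.

```python
import re
from pathlib import Path

# Illustrative patterns; real scanners use validated, tested detectors.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def scan_text(text: str) -> dict:
    """Return a count of matches for each sensitive pattern."""
    return {name: len(rx.findall(text)) for name, rx in PATTERNS.items()}

def scan_tree(root: str) -> dict:
    """Walk a directory and report files containing sensitive matches."""
    report = {}
    for path in Path(root).rglob("*.txt"):
        hits = scan_text(path.read_text(errors="ignore"))
        if any(hits.values()):
            report[str(path)] = hits
    return report
```

A report built from `scan_tree` output is exactly the kind of inventory that lets you lock down or relocate sensitive content before an attacker finds it.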
Without knowledge of data, it is almost impossible to manage. Profiling data to understand its value allows it to be stored more cost-effectively and securely. Leaving important data on unmanaged servers not only costs money but also makes the content vulnerable to inside and outside threat actors. Old project data, ex-employee data and old research studies should be archived and secured so they can be leveraged in the future, yet made less accessible to those who would use them for harm.
Running reports that uncover this content, and migrating it to less expensive storage, a searchable archive, or the cloud, would control costs and secure it at the same time. Those who need the data in the future can easily find and access it, while it remains less available to cyber criminals.
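The age-based tiering described above can be sketched in a few lines: find files untouched for a given period and move them to an archive location, preserving their relative paths so they stay searchable. This is a minimal illustration under simple assumptions (last-modified time as the staleness signal, a local directory standing in for cheaper storage), not a production migration tool.

```python
import shutil
import time
from pathlib import Path

def archive_stale_files(source: str, archive: str, days: int = 365) -> list:
    """Move files not modified within `days` into an archive tree,
    preserving each file's path relative to the source root."""
    cutoff = time.time() - days * 86400
    src, dst = Path(source), Path(archive)
    moved = []
    for path in src.rglob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            target = dst / path.relative_to(src)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), str(target))
            moved.append(str(target))
    return moved
```

In practice the archive target would be an object store or managed archive rather than a local folder, and a real job would log every move for the audit trail.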
We have seen much pain in 2020 from cyberattacks and the resulting data breaches, and we will see organizations respond with smarter, more intelligent capabilities. Adding analytics, indexing, reporting and archiving to the data environment will enable intelligent decisions: decisions that not only protect organizations from threats to their data, but also streamline their environments so they can manage their data assets more effectively.
About the Author
Jim McGann is the Vice President of Marketing & Business Development for Index Engines. He has extensive experience with eDiscovery and information management in the Fortune 2000 sector. Before joining Index Engines in 2004, he worked for leading software firms, including Information Builders and the French engineering software provider Dassault Systemes. Jim graduated from Villanova University with a degree in Mechanical Engineering.