
Insider Threat Detection: What You Need To Know

Blog Article Published: 05/25/2023

Originally published by Code42.

Written by Aimee Simpson.

The modern hybrid and remote workplace relies more than ever on cloud-based applications and data sharing. In this evolving cybersecurity landscape, insider threats are some of the most difficult attacks to detect and resolve, so security professionals need a comprehensive insider threat detection strategy to keep company data secure.

Our 2023 Data Exposure Report reveals the average cost of an insider incident is $16 million – that doesn’t even include the cost of reputational damage among customers, shareholders and even employees.

In this article, we’ll cover the types of insider threats present in your organization, as well as methods to detect and respond to them.

What is an insider threat?

An insider threat is someone with legitimate access to company systems and data who, knowingly or unknowingly, creates the risk of a data breach. Because multiple people with varying levels of access to your data – including employees, consultants and vendors – may be active at any one time, detecting insider threats is challenging. In fact, 27% of CISOs say that insider threat is the most difficult type of risk to detect, given the sheer number of potential threats and the people who could be responsible for them.

Types of insider threats

There are three main types of insider threats – knowing the difference between each category can help you recognize the signs of risky behavior and better protect your organization from a dangerous and costly breach.

Malicious insiders

Malicious insiders are what many people think of when they hear the term “insider threats” – but they comprise only 20% of all insider incidents. These employees deliberately abuse their access to company data to bring harm to their employer, either for personal or professional gain. These actions can include stealing intellectual property or spying on company work on behalf of a competitor.

There have been numerous cases of malicious insiders, especially among employees departing a company. Data loss security company Proofpoint experienced this firsthand in 2021, filing a lawsuit against a former employee who loaded confidential sales enablement data onto a USB drive before heading to Abnormal Security – one of the company’s major competitors.

However, not all departing employees have hostile intentions. There’s a common assumption among departing employees that if they work on a project, then they “own” it, so naturally they want to take that data with them. These actions are not intentionally malicious – just uninformed.

Compromised insiders

Compromised insiders haven’t deliberately done anything wrong – except, perhaps, consistently failing their company phishing tests. When an employee’s credentials are compromised via an attack like a phishing email, an external actor can easily gain access to the corporate network or accounts. Compromised insiders can appear malicious because the external actor’s actions make them look that way.

Negligent insiders

It may surprise you to learn that 80% of insider incidents are not malicious but negligent. Negligent employees are usually unaware of the risk their careless online behavior creates. These insiders accidentally expose company data by disregarding company policies or through simple human error – for example, downloading a harmful attachment or moving data to an unsanctioned app to save time. Negligent insiders can also include employees who evade security controls to speed up their workflow, or third-party partners who aren’t fully versed in your company’s security policies.

Methods to detect insider threats

Any employee, contractor or partner who has access to sensitive data could be a potential risk, which makes insider threats notoriously difficult to detect. Monitoring your company’s data movement, instituting clear security policies and implementing employee training are smart strategies to put in place to prevent potential attacks before they happen.

Monitor all data and its movement

Unusual file movement is a common red flag that might indicate an insider threat. By constantly scanning your systems, you can establish a baseline pattern of file movement and get the context needed to know if it’s risky. Activities outside of that normal pattern of behavior might indicate an insider threat and should be investigated in order of priority:

  • File exfiltration: Removing a file from its original location via a zip file, USB drive or even AirDrop could mean the data ends up in the wrong hands.
  • File destination: Ensure that company files are moved to destinations you trust rather than personal or unsanctioned cloud applications.
  • File source: Looking into the source can clue you into its potential danger. Suspicious file sources, such as attachments sent via ProtonMail, could be malware or ransomware in disguise.
  • User characteristics and behaviors: Investigate all potential signs of suspicious insider activities. Monitor excessive spikes in data downloading, moving data at unusual times of day or acquiring privileged access to high-value data.
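
The signals above can be combined programmatically. As a rough sketch – not Code42’s product logic; the event shape, sanctioned-destination list and deviation threshold are all illustrative assumptions – a baseline-and-deviation check might look like:

```python
from collections import defaultdict
from statistics import mean, stdev

# Hypothetical list of destinations your organization trusts.
SANCTIONED_DESTINATIONS = {"corporate-onedrive", "corporate-gdrive"}

def flag_anomalies(history, events, threshold=3.0):
    """Flag file-movement events that deviate from a user's baseline.

    history: {user: [daily file-move counts]}  -- the established baseline
    events:  [(user, destination, count)]      -- today's activity to score
    """
    alerts = []
    for user, destination, count in events:
        baseline = history.get(user, [])
        # An unsanctioned destination is a red flag regardless of volume.
        if destination not in SANCTIONED_DESTINATIONS:
            alerts.append((user, "unsanctioned destination", destination))
        # A volume spike: more than `threshold` standard deviations
        # above the user's own historical mean.
        if len(baseline) >= 2:
            mu, sigma = mean(baseline), stdev(baseline)
            if sigma and (count - mu) / sigma > threshold:
                alerts.append((user, "volume spike", count))
    return alerts
```

For instance, a user who normally moves about a dozen files a day but suddenly pushes hundreds to a USB drive would trigger both checks, while routine movement to a sanctioned destination raises nothing.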

Investigate unusual data behavior

It isn’t enough to simply detect signs of a potential insider attack. It is important to follow up with robust investigation. Not all unusual behaviors will be problematic, but they should be investigated regardless.

In order to make this tactic effective, Code42 believes it’s important to add contextual indicators that can be prioritized to effectively protect data from employees who are most likely to leak or steal files. Then, when there is unusual data behavior from an employee, security teams can intercept and investigate. Some behaviors that may require investigation include:

  • Creating new user accounts
  • Copying data that’s not related to their work
  • Using unauthorized applications
  • Renaming files for concealed exfiltration
  • Increasing access permissions
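
One way to prioritize these behaviors is a simple weighted risk score so that security teams investigate the highest-priority cases first. The sketch below is illustrative only – the indicator weights and the doubled score for departing employees are assumptions for the example, not a published scoring model:

```python
# Hypothetical weights for the contextual indicators above; tune per organization.
RISK_WEIGHTS = {
    "new_user_account": 2,
    "unrelated_data_copy": 3,
    "unauthorized_app": 2,
    "file_renamed_before_transfer": 4,
    "permission_escalation": 5,
}

def risk_score(observed_indicators, departing=False):
    """Sum the weights of observed indicators; double the score for a
    departing employee, reflecting the elevated-risk window described above."""
    score = sum(RISK_WEIGHTS.get(i, 0) for i in observed_indicators)
    return score * 2 if departing else score

def triage(users):
    """Return users sorted by descending risk score for investigation."""
    return sorted(
        users,
        key=lambda u: risk_score(u["indicators"], u.get("departing", False)),
        reverse=True,
    )
```

A departing employee who escalates permissions and renames files before a transfer would surface well above someone who merely opened an unauthorized app.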

How to respond to insider threats

Arguably the most important step following insider threat detection is the response strategy that IT and security have in place. While blocking data exfiltration upfront can be a “quick fix” to a data breach in progress, reducing insider threat incidents over time requires developing and executing a comprehensive response plan.

  • Set expectations: Clearly communicate security policies with your users. By aligning on what is and what’s not acceptable when sharing data, you can hold employees accountable when these established rules are breached.
  • Change behavior: Real-time feedback and just-in-time training videos are crucial when working to improve a user’s security habits. These practices hold employees accountable and help them follow best practices, which ultimately changes behavior over time.
  • Contain threats: Even with training and holding employees accountable, insider threat data risks are inevitable. When they happen, the key is to minimize the damage by revoking or reducing access on a user level if necessary. Then you can investigate and determine the best course of action to remediate.
  • Block activity for your highest risk users: Preventing your riskiest users from sharing data to unsanctioned destinations is a crucial step in your response plan. Blocking certain activities from those users allows the rest of your organization to work collaboratively without hindering productivity, all while knowing your data is safe from those most likely to cause harm.
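
The response steps above can be tied together as a tiered playbook that routes each risk level to an appropriate action. This is a hedged sketch – the tiers, thresholds and actions are hypothetical examples, not a prescribed workflow:

```python
from enum import Enum

class Tier(Enum):
    LOW = "low"
    ELEVATED = "elevated"
    HIGH = "high"

def classify(score):
    """Bucket a user's risk score into a response tier (thresholds illustrative)."""
    if score >= 10:
        return Tier.HIGH
    if score >= 4:
        return Tier.ELEVATED
    return Tier.LOW

# Hypothetical playbook mapping each tier to response actions like those above.
PLAYBOOK = {
    Tier.LOW: ["send just-in-time training video"],
    Tier.ELEVATED: ["alert security team", "send just-in-time training video"],
    Tier.HIGH: ["block unsanctioned destinations", "reduce user access", "open investigation"],
}

def respond(score):
    """Look up the response actions for a given risk score."""
    return PLAYBOOK[classify(score)]
```

In this design, low-risk negligence gets corrective training while only the highest-risk users face blocking and reduced access, matching the goal of containing threats without hindering everyone else’s productivity.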

Responding to insider threats is no easy task. Staying vigilant with the right tools, processes and programs can keep your company ready when insider threats occur.


About the Author

Aimee is the Director of Product Marketing at Code42. She and her team perform market research and launch new product features for customers.
