The Data Security Risks of Adopting Copilot for Microsoft 365

Published 04/16/2024

Originally published by Cyera.

Written by Leo Reznik.

Microsoft is taking the lead in AI-powered ecosystems. The company's newly introduced Copilot AI assistant for Microsoft 365 surfaces organizational data to give users a seamless workflow experience. But with access to so much data comes great responsibility: AI-powered tools boost productivity while generating substantial new data that must be secured, and they simultaneously raise the risk of inadvertently exposing sensitive information.


What is Copilot for Microsoft 365?

Copilot is Microsoft's AI work assistant that integrates with Microsoft 365 (M365) apps such as Outlook, Word, Excel, and Teams to create a seamless, holistic productivity tool. First rolled out on November 1, 2023, the AI-powered assistant takes a user's prompt and generates content, helping users schedule meetings, send emails, create presentations, and perform other tasks more productively. The main benefits of Copilot for Microsoft 365 include:

  • Offering real-time suggestions when drafting emails, creating documents, writing code, and doing other work-related tasks. Copilot can anticipate your needs and offer custom suggestions, saving time and increasing productivity.
  • Automating routine tasks such as organizing files, arranging calendar meetings, and creating reports.
  • Summarizing team meetings and conversations, which can be shared with the rest of the organization so everyone is up-to-date on important discussions and decisions.
  • Learning from your actions and preferences to customize its suggestions and intuitively adapt to your own work style.

Microsoft’s main goal with Copilot for M365 is to help users and organizations work faster, more creatively, and more efficiently.


What makes Copilot unique?

Unlike other AI-powered software, Copilot for Microsoft 365 combines the context and intelligence of the internet with your work data and synchronizes with ongoing tasks across devices, bringing enhanced AI capabilities to work-related tasks. And here’s the best part: Copilot is not limited to Microsoft 365.

Copilot encompasses several Microsoft Copilots that leverage specialized AI engines to address different use cases, so Copilot is better conceptualized as a new technology stack than as a simple productivity tool. For example, the suite includes Microsoft Copilot for Sales and Microsoft Copilot for Service, both of which include and extend Copilot for Microsoft 365 and will be updated and rolled out throughout 2024.

Since Copilot forms a comprehensive suite of AI technologies, its users need to be aware of the inherent risks of using this technology. According to Gartner’s report Assessing the Impact of Microsoft’s Generative AI Copilots on Enterprise Application Strategy, “Microsoft Copilots are built on a complex interdependence of new and existing Microsoft technologies that, if not evaluated properly, can lead to risks such as poor compliance and data governance.”


What are the data security risks of adopting Copilot for Microsoft 365?

Many organizations are still not fully prepared to handle the risks of implementing Copilot for Microsoft 365 at scale. Microsoft states that the access control system embedded in your Microsoft 365 tenant is designed to prevent accidental data sharing among users, groups, and tenants, and that Copilot for Microsoft 365 only surfaces information a given user can already access, using the same data access mechanisms implemented across Microsoft 365 services. In practice, however, many organizations struggle to gain visibility into where sensitive data is stored, and the same is true of managing data access controls and Microsoft sensitivity labels (Microsoft Information Protection, or “MIP”) within Microsoft 365. Here are some of the main risks that Copilot for Microsoft 365 introduces:

  • Exposure of sensitive data: Lack of visibility into sensitive files and datastores, combined with excessive access permissions, may expose sensitive data to unauthorized users. Exposure of sensitive data to employees presents its own set of risks, and those risks are multiplied in the hands of malicious insiders and threat actors should they exploit lax access levels; the sketch after this list shows one way to audit for overly broad sharing.
  • Exposure of Copilot-generated sensitive data: Because Copilot draws on diverse data sources, users may not realize that generated content contains sensitive information such as confidential business data or employee data. This can lead to unintentional sharing of sensitive data with third parties and unauthorized users.
  • Improper use of sensitivity labels: Newly generated content inherits the sensitivity labels of the files Copilot referenced to create it. This compounds the existing problem of inconsistent labeling of sensitive files, which requires active enforcement, and raises the risk of exposing sensitive data to unauthorized parties.
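To make the oversharing risk concrete, here is a minimal sketch that uses the Microsoft Graph API to walk a OneDrive or SharePoint document library and flag items shared through organization-wide or anonymous links, the kind of broad permissions that can let Copilot surface data to unintended users. It is an illustration rather than a hardened tool: the access token acquisition, the DRIVE_ID placeholder, and the overall audit flow are assumptions made for the example, and the permission scopes checked should be validated against the current Microsoft Graph documentation for your tenant.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
TOKEN = "<access-token-with-Files.Read.All>"   # assumed: obtained via MSAL or an app registration
DRIVE_ID = "<drive-id>"                        # assumed: the document library to audit
HEADERS = {"Authorization": f"Bearer {TOKEN}"}


def list_items(drive_id, item_id="root"):
    """Recursively yield driveItems under the given folder, following paging links."""
    url = f"{GRAPH}/drives/{drive_id}/items/{item_id}/children"
    while url:
        resp = requests.get(url, headers=HEADERS)
        resp.raise_for_status()
        data = resp.json()
        for item in data.get("value", []):
            yield item
            if "folder" in item:
                yield from list_items(drive_id, item["id"])
        url = data.get("@odata.nextLink")  # next page, if any


def broad_link_scopes(drive_id, item_id):
    """Return sharing-link scopes that expose an item beyond directly named users."""
    url = f"{GRAPH}/drives/{drive_id}/items/{item_id}/permissions"
    resp = requests.get(url, headers=HEADERS)
    resp.raise_for_status()
    flagged = []
    for perm in resp.json().get("value", []):
        link = perm.get("link", {})
        # "anonymous" = anyone with the link; "organization" = everyone in the tenant
        if link.get("scope") in ("anonymous", "organization"):
            flagged.append(link["scope"])
    return flagged


for item in list_items(DRIVE_ID):
    scopes = broad_link_scopes(DRIVE_ID, item["id"])
    if scopes:
        print(f"{item.get('name')}: shared via {', '.join(scopes)} link")
```

Running a review like this before enabling Copilot helps surface files whose effective permissions are broader than intended; at enterprise scale, the same idea is typically implemented with delta queries or a dedicated data security platform rather than a full recursive crawl.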
