Are You Ready for Microsoft Copilot?
Published 04/19/2024
Originally published by Reco.
Written by Gal Nakash.
On March 14, Microsoft made Copilot available to customers across its Microsoft 365 environment. Designed to boost productivity, Copilot is an AI chatbot that lets any user conduct research or create content. It can generate slide decks, draft text in Word files, analyze spreadsheets, and more. It’s powerful.
Shared Organizational Data Is Now at Risk of Data Leakage
It also opens new doors for threat actors: by acting as an employee, they can gain access to critical financial, customer, IP, and employee data across the organization.
[Image: Visualization of the possible volume of data shared with all employees after adoption of Microsoft Copilot.]
Can you guess what happens when a user attempts to access a restricted file? The attempt is blocked and audited. However, when something with higher privileges acts on behalf of the organization (such as Copilot) and accesses documents with limited access, there is little to no logging of the action. And here lies the risk.
Unless configured otherwise, potentially every file within the organization's scope can be queried and retrieved through a user's interaction, which is as simple as typing a ChatGPT-style prompt.
Think about it. Documents in SharePoint that may contain Personally Identifiable Information (PII) are now available to be queried and retrieved as part of the generative AI revolution we’re witnessing. Copilot can reveal secrets, file names (yes, even file names might contain PII), pay raises, employee terminations, metadata (the users who created the files), and, if the threat actor is lucky, the content itself.
Since identities, rather than traditional organizational barriers, now form the perimeter, why would threat actors waste valuable time meticulously hunting for sensitive files when they can simply ask Copilot? The data already resides in the Microsoft 365 suite. And when we refer to the data Copilot can access, it means that full SharePoint sites, contacts, calendars, chats, and emails are available to be indexed, queried, and analyzed to retrieve the relevant results.
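To make this concrete, here is a minimal sketch of how much a single identity's token can reach through that same index. It assumes delegated Microsoft Graph access and omits token acquisition (typically done with MSAL); the access token and query string are placeholders, and the endpoint shown (POST /v1.0/search/query) is Graph's documented search API, not Copilot itself.

```python
# A minimal sketch, assuming a valid delegated Microsoft Graph access token.
# It queries the same Microsoft 365 search index that Copilot draws on and
# lists every matching file the identity can reach. The query string below
# is purely illustrative.
import requests

GRAPH_SEARCH_URL = "https://graph.microsoft.com/v1.0/search/query"
ACCESS_TOKEN = "<delegated-access-token>"  # placeholder; obtain via MSAL

payload = {
    "requests": [
        {
            # driveItem covers SharePoint and OneDrive files; adding
            # "message" or "event" would extend this to mail and calendars.
            "entityTypes": ["driveItem"],
            "query": {"queryString": "termination OR salary OR compensation"},
            "from": 0,
            "size": 25,
        }
    ]
}

resp = requests.post(
    GRAPH_SEARCH_URL,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()

# Print the name and web URL of each file the identity can reach.
for container in resp.json().get("value", []):
    for hits_container in container.get("hitsContainers", []):
        for hit in hits_container.get("hits", []):
            resource = hit.get("resource", {})
            print(resource.get("name"), "->", resource.get("webUrl"))
```

A Copilot prompt effectively performs this kind of retrieval for the user, which is why the scope of what an identity can touch matters so much.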
[Image: Example of a prompt in Microsoft Copilot, an AI chatbot.]
In the example above, we were able to search for and retrieve the names of files that were not shared across the organization, simply by asking Copilot about their author. As you can see, these files were not accessible to the interacting user.
[Image: Example revealing the volume and variety of files Microsoft Copilot gains access to once utilized.]
Can you tell how many interactions per day the users in your organization are having with Copilot? Or what they are attempting to access?
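If you want to start answering that question, one place to look is the unified audit log. The sketch below is assumption-laden: it supposes an app registration with ActivityFeed.Read permission, an existing Audit.General subscription in the Office 365 Management Activity API, and that your tenant surfaces Copilot interactions as audit operations whose name contains "Copilot" (coverage varies by tenant and license).

```python
# A minimal sketch, assuming an app registration with ActivityFeed.Read and
# an existing Audit.General subscription in the Office 365 Management
# Activity API. It counts audit records from the last 24 hours whose
# operation name mentions Copilot. Whether (and how richly) Copilot
# interactions appear here depends on your tenant's audit configuration.
from datetime import datetime, timedelta, timezone
import requests

TENANT_ID = "<tenant-id>"            # placeholder
ACCESS_TOKEN = "<app-access-token>"  # placeholder; manage.office.com scope
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

end = datetime.now(timezone.utc)
start = end - timedelta(hours=24)
list_url = (
    f"https://manage.office.com/api/v1.0/{TENANT_ID}"
    "/activity/feed/subscriptions/content?contentType=Audit.General"
    f"&startTime={start:%Y-%m-%dT%H:%M:%S}&endTime={end:%Y-%m-%dT%H:%M:%S}"
)

copilot_events = 0
blobs = requests.get(list_url, headers=HEADERS, timeout=30)
blobs.raise_for_status()
for blob in blobs.json():
    # Each listing entry points at a JSON array of audit records.
    records = requests.get(blob["contentUri"], headers=HEADERS, timeout=30).json()
    for record in records:
        if "copilot" in record.get("Operation", "").lower():
            copilot_events += 1
            print(record.get("UserId"), record.get("CreationTime"))

print(f"Copilot-related audit events in the last 24h: {copilot_events}")
```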
A Microsoft MVP published an article on the safeguards that need to be put in place before and after onboarding Copilot. The process is a 20-step manual for minimizing permissions and adding sensitivity labels just to enable safe usage of Copilot, forcing IT teams to invest significant time and effort while the business rushes forward. Much of that work amounts to finding and tightening over-shared content before Copilot indexes it; a sketch of one such step follows.
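As one illustrative step (not the MVP's actual procedure), the following sketch flags files in a single SharePoint document library that carry organization-wide or anonymous sharing links. The access token and drive ID are placeholders, and application Graph permissions such as Files.Read.All are assumed.

```python
# A minimal sketch of one pre-onboarding cleanup step, assuming application
# Microsoft Graph permissions (e.g. Files.Read.All) and a known drive ID.
# It flags files in a drive's root folder shared via "anyone" or
# "whole organization" links: exactly the over-sharing that widens what
# Copilot can surface.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<app-access-token>"  # placeholder; obtain via MSAL
DRIVE_ID = "<drive-id>"              # placeholder SharePoint document library
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

items = requests.get(
    f"{GRAPH}/drives/{DRIVE_ID}/root/children", headers=HEADERS, timeout=30
)
items.raise_for_status()

for item in items.json().get("value", []):
    perms = requests.get(
        f"{GRAPH}/drives/{DRIVE_ID}/items/{item['id']}/permissions",
        headers=HEADERS,
        timeout=30,
    ).json()
    for perm in perms.get("value", []):
        link = perm.get("link") or {}
        # "anonymous" = anyone with the link; "organization" = every employee.
        if link.get("scope") in ("anonymous", "organization"):
            print(f"{item['name']}: {link['scope']} link ({link.get('type')})")
```

A real cleanup would walk every site and folder recursively and feed the findings into a remediation workflow, but even this single-folder pass shows why the manual effort is substantial.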
Conclusion
The vulnerabilities discovered within Microsoft Copilot serve as a stark reminder of the constant threat to the critical data that resides in SaaS applications. In this ever-evolving landscape, SaaS Security Posture Management (SSPM) solutions are vital for proactively securing your Microsoft environment, detecting suspicious activities, and preventing unauthorized access and data breaches. The monitoring and alerting capabilities SSPM provides enable organizations to protect their identities and data against emerging threats.