Top Security Concerns CISOs Have About Microsoft 365 Copilot in 2025

The benefits and risks of deploying Microsoft 365 Copilot within complex organizations have become very apparent to leadership teams trying to wade into the era of AI with all of their organization’s data in tow. In 2024, security teams at enterprise-licensed ‘Microsoft shops’ began to report security risks discovered during initial pilots. My conversations across multiple sectors provided keen insights into just how common (and relatable) many of these risks were. Many of them persist today in many organizations. I wanted to dedicate this post to exploring some top security concerns CISOs have about Microsoft 365 Copilot in 2025.

“This tells me we’re not ready for Copilot like we had hoped. That much we’re clear on. What’s less clear is our readiness pathway…”

Real talk: if this sounds like someone you may know, chances are it’s not just them. If it resonates, it’s probably because you’re not alone.

Yes, Microsoft 365 Copilot offers powerful AI capabilities that enhance productivity, and those productivity benefits are what most organizations are rightly pursuing.

And yes, despite the obvious upside, it also introduces new security risks that organizations must carefully consider and address.

Here are the top three security risks for Microsoft 365 Copilot in 2025 that most likely affect your organization too. I’ll explain why I believe each one matters and share quick tips on how to address it.

Risk #1: Oversharing

What is it? Oversharing is the act of granting access to information (whether that’s content, files, or even entire Teams) beyond what the recipients actually need.

Why does it matter? It has probably happened to all of us at some point: we were granted access to an entire shared folder when all we actually needed was one specific file. Imagine that folder contains a file with my personally identifiable or confidential information, collected for a specific purpose, and a similar file for each of my colleagues. That I could theoretically open any file in that shared folder (even without malicious intent) was already problematic. Now imagine the risks in the era of AI, when employees can surface that confidential information with a simple Microsoft 365 Copilot prompt. This risk is most prevalent in organizations whose default sharing setting is “Anyone with a Link,” but it also exists with “People in your Organization” links and in organizations that do not regularly perform access reviews.

Quick Tip on how to address this risk: consider performing an oversharing assessment of your Microsoft 365 environment. This will not mitigate the risk, but it will at least show you how significant it is in your own organization.
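To make the assessment idea concrete, here is a minimal sketch of a triage helper. It assumes you have already exported file permission records shaped like Microsoft Graph “permission” resources, where sharing links carry a `link.scope` of `anonymous` (“Anyone with a Link”), `organization` (“People in your Organization”), or `users`; the function name and sample data are illustrative, not an official tool.

```python
# Hedged sketch: flag permission records whose sharing-link scope is broader
# than direct, named-user access. Record shapes mimic Microsoft Graph
# "permission" resources; all names here are illustrative.

RISKY_SCOPES = {"anonymous", "organization"}

def flag_overshared(permissions):
    """Return the permission records that represent broad sharing links."""
    flagged = []
    for perm in permissions:
        link = perm.get("link") or {}          # direct grants have no "link"
        if link.get("scope") in RISKY_SCOPES:
            flagged.append(perm)
    return flagged

sample = [
    {"id": "1", "link": {"scope": "anonymous", "type": "view"}},      # "Anyone with a Link"
    {"id": "2", "link": {"scope": "organization", "type": "edit"}},   # "People in your Organization"
    {"id": "3", "grantedTo": {"user": {"displayName": "A. Colleague"}}},  # direct grant: fine
]

print([p["id"] for p in flag_overshared(sample)])  # → ['1', '2']
```

A real assessment would feed this kind of check with an export of permissions across your SharePoint and OneDrive estate; the point is simply to quantify how many broad links exist before Copilot can traverse them.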

Risk #2: Compromised Copilot-Enabled Accounts

What is it? While it is true that threat actors and cybercriminals typically start attacks with email, Microsoft 365 provides multiple infiltration points that must be managed. A compromised user account that has been assigned a Microsoft 365 Copilot license is yet another vector.

Why does it matter? A threat actor or cybercriminal who has compromised a Copilot-licensed user account can find valuable or sensitive information much faster, helping them plan the next stage of their attack. If your organization has discovered a large oversharing footprint, this scenario is even more concerning.

Quick Tip on how to address this risk: consider reducing your attack surface by decommissioning unused Teams or SharePoint Online sites, and implementing sensitivity labels in Microsoft Purview Information Protection so you can restrict what Microsoft 365 Copilot can access.
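The decommissioning step can start as a simple inventory triage. The sketch below assumes you have exported a site inventory (for example, from the SharePoint usage reports) into records with a last-activity date; the field names, threshold, and URLs are illustrative assumptions, not an official schema.

```python
# Hedged sketch: list candidate sites for decommissioning based on how long
# they have been idle. Field names ("url", "lastActivityDate") are illustrative.
from datetime import date, timedelta

def stale_sites(sites, today, max_idle_days=365):
    """Return URLs of sites with no recorded activity within max_idle_days."""
    cutoff = today - timedelta(days=max_idle_days)
    stale = []
    for site in sites:
        last = site.get("lastActivityDate")   # None means no activity recorded
        if last is None or last < cutoff:
            stale.append(site["url"])
    return stale

inventory = [
    {"url": "https://contoso.sharepoint.com/sites/ProjectX",
     "lastActivityDate": date(2023, 1, 15)},
    {"url": "https://contoso.sharepoint.com/sites/HR",
     "lastActivityDate": date(2025, 3, 1)},
    {"url": "https://contoso.sharepoint.com/sites/OldTeam",
     "lastActivityDate": None},
]

print(stale_sites(inventory, today=date(2025, 6, 1)))
# → ['https://contoso.sharepoint.com/sites/ProjectX',
#    'https://contoso.sharepoint.com/sites/OldTeam']
```

Every site removed from the list is one less place a compromised, Copilot-licensed account can go looking for sensitive content.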

Risk #3: Underestimating Organizational Change Management (OCM)

What is it? Underestimating the organizational change management (OCM) required for Microsoft 365 Copilot adoption can delay effective use. Many security leaders overlook the significant awareness, training, and reinforcement needed to ensure responsible use of Microsoft 365 Copilot and other GenAI tools that handle organizational data.

Why does it matter? “Security is a team sport,” and implementing every technical security guardrail you realistically can is only half the battle. System-wide security guardrails do not address all the risks in how employees use Microsoft 365 Copilot (or other GenAI tools) in their daily work.

Quick Tip on how to address this risk: during implementation planning, design your OCM activities so that employees understand their personal responsibilities and obligations when using these tools, and are trained to use them responsibly. Your OCM communications and training should clarify what is and is not permitted (including any data restrictions they must abide by), and who is there to support them no matter what their questions, concerns, or technical issues may be.

Thanks for reading, and please reach out if you have a question or just want to chat more!