The Compliance Gap Nobody Anticipated
When Microsoft rolled out Copilot across M365, IT teams got a powerful productivity tool. Compliance teams got a migraine. Copilot can access, summarize, and surface content across SharePoint, Teams, Exchange, and OneDrive — which means every compliance gap in your tenant just became an AI-powered compliance gap.
Microsoft Purview is the compliance suite designed to address exactly this problem. But most organizations are using maybe 10% of what Purview offers for Copilot governance. The rest sits untouched because nobody told them it existed or how to configure it.
This guide covers the Purview tools that matter most for Copilot deployments: audit logs, eDiscovery, information barriers, communication compliance, and data lifecycle management. No fluff. Just the configurations that keep your legal and compliance teams from losing sleep.
Audit Logs: Seeing What Copilot Actually Does
The first rule of Copilot compliance: you can't govern what you can't see. Microsoft Purview's unified audit log now captures Copilot interactions, and this is where your compliance story starts.
What Gets Logged
Every Copilot interaction in M365 generates audit events. According to Microsoft's documentation on audit logs for Copilot and AI applications, these events include:
- CopilotInteraction events — who used Copilot, when, and in which app context (Word, Teams, Excel, etc.); the full prompt and response text is preserved in the user's mailbox and retrieved through eDiscovery rather than stored in the audit record itself
- AI app interaction events — which files and data sources Copilot accessed to generate its response
- Sensitivity label context — whether the accessed content carried sensitivity labels
To access these logs, navigate to the Microsoft Purview portal, select Audit, and filter for Copilot-specific activities. You can filter by user, date range, and application.
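Once you export an audit search to CSV or JSON, the same filtering can be scripted for recurring reviews. A minimal Python sketch, assuming an invented record shape (real export schemas carry many more fields):

```python
import json
from datetime import datetime

# Illustrative record shape only; fields in a real Purview audit export differ.
SAMPLE_EXPORT = """[
  {"CreationDate": "2024-05-01T09:12:00", "UserId": "alice@contoso.com",
   "Operation": "CopilotInteraction", "AppHost": "Word"},
  {"CreationDate": "2024-05-01T10:03:00", "UserId": "bob@contoso.com",
   "Operation": "FileAccessed", "AppHost": "SharePoint"}
]"""

def copilot_events(raw_json, user=None, start=None, end=None):
    """Filter an audit export down to Copilot interactions,
    optionally scoped by user and date range."""
    events = []
    for rec in json.loads(raw_json):
        if rec["Operation"] != "CopilotInteraction":
            continue
        if user and rec["UserId"] != user:
            continue
        ts = datetime.fromisoformat(rec["CreationDate"])
        if start and ts < start:
            continue
        if end and ts > end:
            continue
        events.append(rec)
    return events

print(len(copilot_events(SAMPLE_EXPORT)))  # -> 1
```

The same scoping (operation, user, date range) maps directly onto the filters the Audit search UI exposes.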
Setting Up Audit Log Retention
By default, Audit (Standard), which comes with E3, retains logs for 180 days; Audit (Premium), which comes with E5, retains them for one year and supports retention policies of up to 10 years with add-on licensing. For Copilot compliance, the defaults are often not enough: regulated industries typically need 1-7 years of retention.
Configure extended retention through Audit log retention policies in the Purview portal:
- Navigate to Audit → Audit retention policies
- Create a new policy targeting CopilotInteraction record types
- Set your retention period based on regulatory requirements
- Apply to specific users or groups if needed
The critical point: if you're in healthcare, financial services, or government, your regulators will eventually ask what Copilot accessed. Having 90 days of logs when they ask for 3 years of history is not a conversation you want to have.
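One way to make that gap concrete is to compare configured retention against your regulator's minimum. The figures below are illustrative placeholders, not legal guidance:

```python
# Quick self-check: does audit retention cover what regulators can ask for?
# These minimums are examples only; confirm yours with legal/compliance.
REGULATORY_MINIMUM_DAYS = {
    "healthcare": 6 * 365,
    "financial": 7 * 365,
    "government": 3 * 365,
}

def retention_gap(industry, configured_days):
    """Return how many days short the configured retention is (0 = covered)."""
    required = REGULATORY_MINIMUM_DAYS.get(industry, 365)
    return max(0, required - configured_days)

# The 180-day Audit (Standard) default falls far short for financial services.
print(retention_gap("financial", 180))
```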
eDiscovery: When Copilot Conversations Become Evidence
Here's a scenario compliance teams are already facing: a former employee claims they were terminated based on information Copilot surfaced from private Teams messages. Legal needs to reconstruct exactly what Copilot showed, when, and to whom.
Microsoft Purview eDiscovery (Premium) can search and collect Copilot interaction data. This capability, documented in Microsoft's data protection architecture guide, means Copilot prompts and responses are discoverable content.
Running a Copilot eDiscovery Search
To search for Copilot interactions in eDiscovery:
- Create a new eDiscovery (Premium) case in the Purview portal
- Add custodians whose Copilot usage you need to investigate
- Create a search with the Copilot interactions content type selected
- Define date ranges and keyword filters
- Review results in the eDiscovery review set
The search captures both the user's prompt and Copilot's response, along with metadata about which documents were referenced. This is powerful for investigations but also means your legal hold procedures need updating.
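Conceptually, the search above reduces to filtering interaction records by custodian, date range, and keywords. A sketch against an invented review-set record shape (real eDiscovery exports differ):

```python
from datetime import date

# Hypothetical review-set records; field names are invented for illustration.
interactions = [
    {"custodian": "alice@contoso.com", "date": date(2024, 4, 2),
     "prompt": "summarize the Q3 restructuring memo",
     "response": "here is a summary ...", "referenced_docs": ["memo.docx"]},
    {"custodian": "bob@contoso.com", "date": date(2024, 4, 5),
     "prompt": "draft a status update", "response": "draft ...",
     "referenced_docs": []},
]

def search(records, custodians, start, end, keywords):
    """Filter interactions by custodian, date range, and keyword hits
    in either the prompt or the response."""
    hits = []
    for r in records:
        if r["custodian"] not in custodians:
            continue
        if not (start <= r["date"] <= end):
            continue
        text = (r["prompt"] + " " + r["response"]).lower()
        if any(k.lower() in text for k in keywords):
            hits.append(r)
    return hits

hits = search(interactions, {"alice@contoso.com"},
              date(2024, 4, 1), date(2024, 4, 30), ["restructuring"])
```

Note that the referenced-document metadata rides along with each hit, which is exactly what makes these records useful in an investigation.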
Updating Legal Hold Procedures
If your organization has existing legal hold workflows, they need to explicitly include Copilot interaction data. A legal hold that preserves emails and documents but ignores Copilot conversations has a gap that opposing counsel will absolutely exploit.
Update your hold procedures to include:
- Copilot interaction logs for custodians under hold
- The source documents Copilot referenced in its responses
- Teams meeting transcripts that Copilot summarized
- Any Copilot-generated content saved to SharePoint or OneDrive
Information Barriers: Drawing Lines Copilot Can't Cross
Information barriers are Purview's answer to the "Chinese wall" problem. In financial services, the investment banking team can't share information with the trading team. In law firms, teams working for opposing clients must be isolated.
Without information barriers, Copilot ignores these walls entirely. A trader could ask Copilot to "summarize recent M&A activity" and get responses built from investment banking documents they should never see. The permissions aren't misconfigured in the narrow sense; years of access sprawl that predates AI left the content technically reachable, and Copilot makes it effortlessly discoverable.
Configuring Information Barriers for Copilot
Information barriers in Purview work by defining segments (groups of users) and policies (which segments can or cannot communicate):
- Define segments in the Purview portal based on department, role, or custom attributes from Entra ID
- Create barrier policies that block communication and content discovery between segments
- Apply the policies — this triggers a background process that enforces barriers across Teams, SharePoint, and OneDrive
When information barriers are active, Copilot respects them. A user in the "Trading" segment won't get Copilot results from content owned by the "Investment Banking" segment, even if the underlying SharePoint permissions would technically allow access.
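The segment-and-policy model can be sketched in a few lines. Segment names and the blocked pair below are invented for illustration, not Purview's actual evaluation logic:

```python
# Toy model: any pair of segments appearing in BLOCKED cannot discover each
# other's content, regardless of underlying SharePoint permissions.
SEGMENTS = {
    "alice@contoso.com": "Trading",
    "carol@contoso.com": "InvestmentBanking",
    "dave@contoso.com": "HR",
}
BLOCKED = {frozenset({"Trading", "InvestmentBanking"})}

def copilot_may_surface(requesting_user, content_owner):
    """True if no barrier policy separates the two users' segments."""
    pair = frozenset({SEGMENTS[requesting_user], SEGMENTS[content_owner]})
    return pair not in BLOCKED

print(copilot_may_surface("alice@contoso.com", "carol@contoso.com"))  # False
print(copilot_may_surface("alice@contoso.com", "dave@contoso.com"))   # True
```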
The catch: information barriers affect collaboration broadly, not just Copilot. Users in blocked segments can't chat in Teams, share files, or see each other's profiles. Plan carefully before deploying: overly broad barriers create productivity problems that outweigh the compliance risk they mitigate.
Communication Compliance: Monitoring What Users Ask Copilot
Communication compliance in Purview monitors messages for policy violations — think harassment, insider trading language, or unauthorized disclosures. With Copilot integration, this monitoring extends to what users prompt Copilot to do.
Why This Matters
Consider these realistic scenarios:
- An employee asks Copilot to "find salary information for everyone in my department"
- A user prompts Copilot to "summarize the confidential restructuring plan"
- Someone asks Copilot to "draft an email using the competitor's pricing data from that leaked document"
Communication compliance can flag these interactions for review, creating an early warning system for data misuse that happens through Copilot rather than traditional channels.
Setting Up Copilot Monitoring Policies
To create a communication compliance policy targeting Copilot:
- Navigate to Communication compliance in the Purview portal
- Create a new policy
- Under Locations, include Microsoft 365 Copilot interactions
- Define conditions using keywords, sensitive information types, or trainable classifiers
- Set up reviewers who will evaluate flagged interactions
Effective keyword sets for Copilot monitoring include terms like "confidential," "salary," "restructuring," "merger," combined with action verbs like "find," "summarize," "extract." Trainable classifiers for regulatory content (financial data, healthcare records) add another detection layer.
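That sensitive-term-plus-action-verb pattern is easy to prototype against sample prompts before committing it to a policy. A rough sketch (the word lists are examples, not a vetted rule set):

```python
SENSITIVE_TERMS = {"confidential", "salary", "restructuring", "merger"}
ACTION_VERBS = {"find", "summarize", "extract", "list"}

def flag_prompt(prompt):
    """Flag a prompt that combines an action verb with a sensitive term,
    mirroring a narrow keyword-based communication compliance condition."""
    words = set(prompt.lower().split())
    return bool(words & ACTION_VERBS) and bool(words & SENSITIVE_TERMS)

print(flag_prompt("find salary information for my department"))  # True
print(flag_prompt("summarize my meeting notes"))                 # False
```

Requiring both halves of the pattern is what keeps the policy narrow; flagging on sensitive terms alone is how you end up with 500 hits a day.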
The volume consideration: Copilot generates far more interactions than traditional email or chat. Start with narrow policies targeting high-risk terms and expand gradually. A policy that flags 500 interactions per day becomes noise, not governance.
Data Lifecycle Management: Controlling What Copilot Can Still Find
Deleted files aren't necessarily gone. Retention policies, litigation holds, and backup systems can keep content accessible long after users think it's been removed. Copilot can surface this retained content, which creates a unique compliance problem.
As Microsoft notes in their Purview data security documentation, Copilot's access follows the same permissions model as other M365 services, including retained content.
Retention Labels and Copilot
Retention labels control how long content is kept and what happens when the retention period expires. For Copilot governance:
- Apply retention labels to sensitive content categories so they're automatically deleted when no longer needed
- Use disposition reviews for high-value content — require human approval before deletion
- Configure auto-apply policies using sensitive information types or trainable classifiers to catch unlabeled content
The goal: reduce the surface area of content Copilot can access by ensuring old, unnecessary data doesn't linger indefinitely.
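The label mechanics boil down to a retention period plus an optional disposition review. A toy sketch with an invented label catalog:

```python
from datetime import date, timedelta

# Hypothetical label catalog: retention period in days and whether a human
# disposition review is required before deletion.
LABELS = {
    "Finance-7yr": {"days": 7 * 365, "disposition_review": True},
    "General-1yr": {"days": 365, "disposition_review": False},
}

def disposition(label, created):
    """Return when content becomes eligible for deletion, and whether a
    reviewer must approve first."""
    cfg = LABELS[label]
    return created + timedelta(days=cfg["days"]), cfg["disposition_review"]

due, needs_review = disposition("General-1yr", date(2024, 1, 1))
print(due, needs_review)  # 2024-12-31 False
```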
Records Management
For content that must be retained but shouldn't be surfaced by Copilot, records management provides the controls:
- Declare items as records — this locks them from modification but doesn't remove Copilot access by default
- Use regulatory records for content under strict compliance requirements
- Combine with sensitivity labels — a "Highly Confidential" sensitivity label with DLP policies can prevent Copilot from surfacing record content
The interplay between retention, records, and Copilot access is complex. The short version: retention keeps content alive, sensitivity labels control who (and what AI) can access it, and DLP policies enforce the rules. All three need to work together.
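That three-layer rule can be expressed as a single decision function. Field names and the clearance model below are illustrative, not Purview's actual evaluation logic:

```python
def copilot_can_surface(item, user_clearance, dlp_blocked_labels):
    """Combine the three layers: retention decides whether content still
    exists, the sensitivity label decides who may access it, and DLP decides
    whether Copilot specifically may surface it."""
    if not item["retained"]:                   # retention: gone is gone
        return False
    if item["label"] not in user_clearance:    # label: user cleared at all?
        return False
    if item["label"] in dlp_blocked_labels:    # DLP: Copilot blocked anyway?
        return False
    return True

item = {"retained": True, "label": "Highly Confidential"}
# Even a cleared user gets nothing from Copilot when DLP blocks the label.
print(copilot_can_surface(item, {"Highly Confidential"}, {"Highly Confidential"}))  # False
```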
Sensitivity Labels: The Foundation Everything Else Builds On
If you've read our sensitivity labels guide, you know these are the cornerstone of Copilot data protection. But in the Purview context, sensitivity labels connect to every other compliance tool:
- Audit logs record which sensitivity labels Copilot encountered
- eDiscovery can filter by sensitivity label to scope investigations
- Information barriers work alongside labels for defense-in-depth
- Communication compliance can flag interactions involving highly labeled content
- DLP policies use labels as conditions to block Copilot from surfacing content
If you haven't deployed sensitivity labels yet, stop reading this article and go deploy them. Everything else in this guide works better — or only works at all — when sensitivity labels are in place.
The Purview Compliance Dashboard: Your Single Pane of Glass
Microsoft has been consolidating Copilot governance into the Purview portal. The compliance dashboard gives you:
- Copilot usage analytics — who's using it, how often, which apps
- Policy violation trends — are communication compliance hits increasing?
- Data exposure metrics — how much sensitive content is Copilot accessing?
- Audit log summaries — quick views without diving into raw logs
For organizations with DLP policies configured for Copilot, the dashboard also shows DLP policy matches specific to Copilot interactions.
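The rollups behind those dashboard tiles are straightforward aggregations. A sketch over an invented interaction log:

```python
from collections import Counter

# Illustrative interaction log; the dashboard computes similar rollups
# from real audit and policy-match data.
log = [
    {"user": "alice", "app": "Word", "violation": False, "sensitive": True},
    {"user": "alice", "app": "Teams", "violation": True, "sensitive": True},
    {"user": "bob", "app": "Word", "violation": False, "sensitive": False},
]

usage_by_app = Counter(e["app"] for e in log)            # who's using which apps
violations = sum(e["violation"] for e in log)            # policy-hit trend input
sensitive_share = sum(e["sensitive"] for e in log) / len(log)  # exposure metric

print(usage_by_app["Word"], violations, round(sensitive_share, 2))  # 2 1 0.67
```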
Implementation Priority: What to Deploy First
Not every organization needs every Purview tool on day one. Here's the priority order:
Week 1: Audit logging and sensitivity labels. You need visibility before you need control. Turn on Copilot audit logging and ensure sensitivity labels are deployed to at least your most sensitive content.
Week 2-3: DLP policies for Copilot. Configure DLP to prevent Copilot from surfacing content marked with your highest sensitivity labels. This is your biggest risk reduction per hour of effort.
Month 1: Communication compliance. Start with narrow policies targeting obviously high-risk prompts. Expand based on what you find.
Month 2: Information barriers (if required by your industry). These are complex to deploy and affect collaboration broadly. Get everything else right first.
Month 3: eDiscovery procedures update. Update your legal hold and investigation procedures to include Copilot data. Train your legal team on searching Copilot interactions.
Common Mistakes and How to Avoid Them
Mistake 1: Deploying Copilot before Purview. The compliance tools should be in place before Copilot goes live, not after. Retroactive compliance is always harder.
Mistake 2: Ignoring Copilot in eDiscovery planning. Copilot interactions are discoverable. If your eDiscovery procedures don't account for them, you have a gap.
Mistake 3: Setting communication compliance policies too broadly. Start narrow. A policy that flags everything is worse than no policy because it creates alert fatigue.
Mistake 4: Not extending audit log retention. Default retention periods are rarely sufficient for regulated industries. Extend them before you need the data, not after.
Mistake 5: Treating Purview as set-and-forget. Copilot capabilities evolve quarterly. New features mean new data access patterns, which means your Purview configurations need regular review.
The Bottom Line
Microsoft Purview gives you the tools to govern Copilot properly. The challenge isn't capability — it's configuration. Most organizations have the licenses but haven't activated the features that matter.
The combination of audit logging, sensitivity labels, DLP, communication compliance, and eDiscovery creates a comprehensive governance framework. But only if you actually deploy them.
Your Copilot deployment without Purview governance is a compliance incident waiting to happen. The tools exist. Use them.
Take Action Now
Not sure where your Copilot compliance gaps are? Run a free scan to assess your M365 Copilot readiness and get a prioritized remediation plan. It takes 5 minutes and shows you exactly what Purview configurations you're missing.