
Microsoft Copilot for M365 inherits the same data access permissions as the user. If an employee has access to a SharePoint site containing salary data, Copilot can surface that salary data in response to a prompt. If a shared drive gives "Everyone" access to board meeting minutes, any Copilot user can ask about board discussions.
This is not a bug — it is how Copilot is designed. It respects M365 permissions exactly. The problem is that most organisations have accumulated years of permission sprawl, overshared sites, and stale access grants.
Without a governance model, Copilot becomes a data leakage risk. With the right governance, Copilot is secure by design.
A comprehensive Copilot governance framework has five layers: data classification, permission alignment, sensitivity labels, a usage policy, and activity monitoring with regional compliance.

Layer 1: Data Classification
Before deploying Copilot, classify your organisational data into tiers:
| Classification | Examples | Copilot Access |
|---|---|---|
| Public | Marketing materials, public website content | Open to all Copilot users |
| Internal | Internal memos, team wikis, project documents | Open to relevant team members |
| Confidential | Financial reports, client data, contracts | Restricted to authorised roles |
| Highly Confidential | Salary data, board papers, M&A documents, legal matters | Excluded from Copilot or heavily restricted |
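It can help to encode these tiers once so that scripts and label automation agree on the names. Below is a minimal Python sketch: the tier names mirror the table above, but the reachability flag is an illustrative convention of this sketch, not any actual Copilot setting.

```python
from enum import Enum

class Classification(Enum):
    """Data tiers from the classification table above."""
    PUBLIC = "Public"
    INTERNAL = "Internal"
    CONFIDENTIAL = "Confidential"
    HIGHLY_CONFIDENTIAL = "Highly Confidential"

# Whether content in each tier should be reachable by Copilot at all.
# "Restricted" tiers still rely on per-user, label-based access checks.
COPILOT_REACHABLE = {
    Classification.PUBLIC: True,
    Classification.INTERNAL: True,
    Classification.CONFIDENTIAL: True,          # restricted to authorised roles
    Classification.HIGHLY_CONFIDENTIAL: False,  # excluded from Copilot
}
```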
Layer 2: Permission Alignment
Align M365 permissions with your data classification, workload by workload (an audit sketch follows this list):
SharePoint Sites: Remove broad grants such as "Everyone" and "Everyone except external users" from sites holding Confidential or Highly Confidential content, and review site membership for stale access.
OneDrive: Audit sharing links, particularly "Anyone" and organisation-wide links on sensitive files, and expire links that are no longer needed.
Teams: Review which teams are public; files in a public team sit on a SharePoint site readable by every internal user, and therefore by every Copilot user.
Exchange: Review shared mailbox membership and delegate access, since Copilot can draw on any mailbox content the user can read.
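Parts of this review can be automated. The sketch below is illustrative only, not official tooling: it walks the top level of a document library via the Microsoft Graph API and flags items shared through organisation-wide or anonymous links. The access token and `DRIVE_ID` are placeholders you would supply; recursion into subfolders and most error handling are omitted for brevity.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<token with Files.Read.All or Sites.Read.All>"  # placeholder
DRIVE_ID = "<drive-id of the document library to audit>"        # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

def list_children(drive_id: str, item_id: str = "root"):
    """Yield items in a folder, following Graph's paging links."""
    url = f"{GRAPH}/drives/{drive_id}/items/{item_id}/children"
    while url:
        resp = requests.get(url, headers=HEADERS)
        resp.raise_for_status()
        data = resp.json()
        yield from data.get("value", [])
        url = data.get("@odata.nextLink")  # present only if more pages exist

def broad_link_scopes(drive_id: str, item_id: str):
    """Yield sharing-link scopes on an item that are broader than named users."""
    resp = requests.get(
        f"{GRAPH}/drives/{drive_id}/items/{item_id}/permissions",
        headers=HEADERS,
    )
    resp.raise_for_status()
    for perm in resp.json().get("value", []):
        link = perm.get("link")
        # 'organization' links are visible to every internal user -- and
        # therefore to every Copilot user; 'anonymous' links are worse.
        if link and link.get("scope") in ("organization", "anonymous"):
            yield link["scope"]

for item in list_children(DRIVE_ID):
    for scope in broad_link_scopes(DRIVE_ID, item["id"]):
        print(f"REVIEW: '{item['name']}' is shared via a {scope} link")
```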
Layer 3: Sensitivity Labels
Sensitivity labels are the most powerful governance tool for Copilot. They classify and protect documents regardless of where they are stored.
Recommended label structure:
| Label | Visual Marking | Encryption | Copilot Behaviour |
|---|---|---|---|
| Public | "Public" watermark | None | Full Copilot access |
| Internal | "Internal" header/footer | None | Copilot access for internal users |
| Confidential | "Confidential" header/footer | AES-256 encryption | Copilot access restricted to label-authorised users |
| Highly Confidential | "Highly Confidential" watermark + header | AES-256 + restricted access | Copilot cannot surface this content |
Auto-labelling rules: Configure Microsoft Purview auto-labelling so that documents matching sensitive information types receive labels without user action, for example applying Confidential when Singapore NRIC or Malaysia identification card numbers are detected, and Highly Confidential for salary or board-paper keywords. Auto-labelling catches the documents users forget to label manually; an illustrative detection pattern follows.
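In production you would rely on Purview's built-in sensitive information types, which include checksum validation and confidence levels. The sketch below is only a rough illustration of the kind of pattern such a rule matches, using the Singapore NRIC format (a prefix letter, seven digits, a checksum letter; the checksum algorithm itself is omitted):

```python
import re

# Singapore NRIC/FIN shape: prefix letter, seven digits, checksum letter.
# The real checksum validation is omitted here; Purview's built-in
# sensitive information type handles that (and confidence levels) for you.
NRIC_PATTERN = re.compile(r"\b[STFGM]\d{7}[A-Z]\b")

def suggest_label(text: str) -> str | None:
    """Return a suggested sensitivity label if the text appears to
    contain a personal identifier; None means no auto-label fires."""
    if NRIC_PATTERN.search(text):
        return "Confidential"
    return None

print(suggest_label("Employee S1234567D requested leave"))  # Confidential
print(suggest_label("Quarterly marketing update"))          # None
```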
Layer 4: Usage Policy
Every organisation deploying Copilot needs a clear usage policy. The policy should cover (a machine-readable sketch follows this list):
Approved Use Cases: for example drafting and summarising internal documents, preparing meeting recaps, and searching organisational knowledge.
Prohibited Use Cases: for example producing client-facing deliverables without human review, or prompting Copilot for data above the user's clearance, such as salary or board material.
Quality Assurance Requirements: a named human reviews, and is accountable for, any Copilot output before it leaves the organisation.
Disclosure Requirements: when to disclose that AI was used, for example in client deliverables, published content, or regulatory submissions.
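One way to keep the policy auditable is to hold it as data alongside your deployment scripts. This is purely an illustrative convention of this sketch, not a Microsoft feature; the field contents are the examples from the list above.

```python
from dataclasses import dataclass, field

@dataclass
class CopilotUsagePolicy:
    """Illustrative machine-readable form of the usage policy above.
    Field contents are examples, not a prescribed standard."""
    approved: list[str] = field(default_factory=lambda: [
        "Drafting and summarising internal documents",
        "Preparing meeting recaps",
        "Searching organisational knowledge",
    ])
    prohibited: list[str] = field(default_factory=lambda: [
        "Client deliverables without human review",
        "Prompting for data above the user's clearance",
    ])
    qa_rule: str = "A named human reviews output before it leaves the organisation"
    disclosure_required_for: list[str] = field(default_factory=lambda: [
        "Client deliverables", "Published content", "Regulatory submissions",
    ])

policy = CopilotUsagePolicy()
print(f"{len(policy.approved)} approved, {len(policy.prohibited)} prohibited use cases")
```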
Layer 5: Monitoring and Compliance
Copilot Activity Monitoring: Copilot prompts and responses generate audit records in Microsoft Purview. Review them regularly for attempts to reach sensitive content, and feed the findings back into your permission and labelling reviews (a query sketch follows below).
Compliance with Regional Regulations:
For Singapore: the Personal Data Protection Act (PDPA) applies. Ensure personal data that Copilot can surface is handled under your consent, purpose-limitation, and protection obligations.
For Malaysia: the Personal Data Protection Act 2010 applies to personal data processed in commercial transactions. The same classification and labelling controls underpin compliance.
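As an illustration of the monitoring loop, the sketch below submits an asynchronous audit query via the Microsoft Graph Audit Log Query API (beta at the time of writing), scoped to Copilot interaction records. The token, the date range, and the assumption that `copilotInteraction` is the relevant record type should all be verified against current Microsoft documentation; real code would poll until the query finishes rather than checking its status once.

```python
import requests

GRAPH = "https://graph.microsoft.com/beta"  # Audit Log Query API; verify current status
ACCESS_TOKEN = "<token with AuditLogsQuery.Read.All>"  # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}",
           "Content-Type": "application/json"}

# Submit an asynchronous audit query scoped to Copilot interaction records.
query = {
    "displayName": "Copilot interactions - weekly review",
    "filterStartDateTime": "2024-06-01T00:00:00Z",
    "filterEndDateTime": "2024-06-08T00:00:00Z",
    "recordTypeFilters": ["copilotInteraction"],  # assumption: verify enum value
}
resp = requests.post(f"{GRAPH}/security/auditLog/queries",
                     headers=HEADERS, json=query)
resp.raise_for_status()
query_id = resp.json()["id"]

# The query runs asynchronously; in practice, poll until status is 'succeeded'.
status = requests.get(f"{GRAPH}/security/auditLog/queries/{query_id}",
                      headers=HEADERS).json()["status"]
if status == "succeeded":
    records = requests.get(
        f"{GRAPH}/security/auditLog/queries/{query_id}/records",
        headers=HEADERS,
    ).json().get("value", [])
    for rec in records:
        print(rec.get("userPrincipalName"), rec.get("operation"))
```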
Building a Copilot governance framework requires expertise in M365 security, data classification, and regional data protection regulations. Training providers in the region offer governance workshops and implementation support.
Frequently Asked Questions:
What data can Copilot access?
Copilot can access any data that the individual user has permission to see in M365, including SharePoint, OneDrive, Teams, and Exchange. It does not bypass permissions. The risk comes from overly broad permissions that already exist; Copilot just makes it easier for users to find data they technically already have access to.
How do we stop Copilot from surfacing confidential documents?
Use Microsoft Purview sensitivity labels to classify confidential documents, restrict permissions to only authorised users, remove broad sharing (Everyone, All Company) from sensitive sites, and enable auto-labelling for documents containing personal data. Properly configured sensitivity labels prevent Copilot from surfacing protected content.
Do we need a written Copilot usage policy?
Yes. Every organisation deploying Copilot should have a written usage policy covering approved and prohibited use cases, data handling rules, quality assurance requirements, and disclosure requirements. Without a policy, employees will make their own decisions about what is appropriate, leading to inconsistent and potentially risky usage.