
Copilot Governance & Access Model — Secure and Responsible Copilot Deployment

Pertama Partners · February 11, 2026 · 11 min read
🇲🇾 Malaysia · 🇸🇬 Singapore

Why Copilot Governance Cannot Be an Afterthought

Microsoft 365 Copilot inherits the same data access permissions as the signed-in user. If an employee has access to a SharePoint site containing salary data, Copilot can surface that salary data in response to a prompt. If a shared drive grants "Everyone" access to board meeting minutes, any Copilot user can ask about board discussions.

This is not a bug — it is how Copilot is designed. It respects M365 permissions exactly. The problem is that most organisations have accumulated years of permission sprawl, overshared sites, and stale access grants.

Without a governance model, Copilot becomes a data leakage risk. With the right governance, Copilot is secure by design.

The Copilot Governance Framework

A comprehensive Copilot governance framework has five layers:

Layer 1: Data Classification

Before deploying Copilot, classify your organisational data into tiers:

| Classification | Examples | Copilot Access |
| --- | --- | --- |
| Public | Marketing materials, public website content | Open to all Copilot users |
| Internal | Internal memos, team wikis, project documents | Open to relevant team members |
| Confidential | Financial reports, client data, contracts | Restricted to authorised roles |
| Highly Confidential | Salary data, board papers, M&A documents, legal matters | Excluded from Copilot or heavily restricted |

Layer 2: Access Controls

Align M365 permissions with your data classification:

SharePoint Sites:

  • Review every site's sharing settings
  • Remove "Everyone" and "Everyone except external users" groups from confidential sites
  • Replace broad access with specific security groups
  • Implement site-level access reviews on a quarterly basis
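
A minimal sketch of the oversharing review, assuming the SharePoint Online Management Shell module and a tenant admin URL (contoso-admin is a placeholder):

```powershell
# Connect to the tenant admin endpoint (placeholder URL).
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

# Flag sites where "Everyone" or "Everyone except external users" holds access.
foreach ($site in Get-SPOSite -Limit All) {
    $broadGrants = Get-SPOUser -Site $site.Url -Limit All | Where-Object {
        # "spo-grid-all-users" is the Everyone-except-external-users claim;
        # "c:0(.s|true" is the Everyone claim.
        $_.LoginName -like "*spo-grid-all-users*" -or $_.LoginName -like "c:0(.s|true*"
    }
    if ($broadGrants) {
        Write-Output "Overshared: $($site.Url)"
        # Review each hit before revoking; uncomment to remove the broad grant:
        # $broadGrants | ForEach-Object { Remove-SPOUser -Site $site.Url -LoginName $_.LoginName }
    }
}
```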

OneDrive:

  • Audit sharing links (especially "Anyone with the link" shares)
  • Set default sharing to "People in your organisation" at minimum (see the tenant-setting sketch after this list)
  • Block sharing of files classified as Highly Confidential
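
The tenant-wide OneDrive defaults can be set with the same SharePoint Online Management Shell session; a sketch, assuming anonymous links should be disabled entirely:

```powershell
# Default new sharing links to "People in your organisation" rather than "Anyone".
Set-SPOTenant -DefaultSharingLinkType Internal

# Disable anonymous "Anyone with the link" shares tenant-wide.
Set-SPOTenant -SharingCapability ExternalUserSharingOnly
```

Blocking Highly Confidential files from being shared at all is typically enforced through the sensitivity labels described in Layer 3 (or a DLP policy), not through tenant sharing settings.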

Teams:

  • Review team membership and remove inactive members
  • Set up private channels for confidential discussions
  • Audit guest access and remove unnecessary external accounts (a guest-audit sketch follows this list)
  • Enable meeting transcription controls (who can transcribe, retention period)
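
A sketch of the guest audit, assuming the MicrosoftTeams PowerShell module:

```powershell
# Connect with a Teams admin account.
Connect-MicrosoftTeams

# List every guest across all teams so stale external accounts can be removed.
foreach ($team in Get-Team) {
    foreach ($guest in Get-TeamUser -GroupId $team.GroupId -Role Guest) {
        Write-Output "$($team.DisplayName): $($guest.User)"
    }
}
```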

Exchange:

  • Review shared mailbox permissions
  • Audit delegate access to executive and HR mailboxes (see the delegate-audit sketch after this list)
  • Implement email encryption for Highly Confidential communications
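
A sketch of the delegate audit, assuming the ExchangeOnlineManagement module:

```powershell
# Connect with an Exchange admin account.
Connect-ExchangeOnline

# List non-default FullAccess delegates on shared mailboxes for review.
Get-Mailbox -RecipientTypeDetails SharedMailbox -ResultSize Unlimited |
    Get-MailboxPermission |
    Where-Object { ($_.User -notlike "NT AUTHORITY\SELF") -and ($_.AccessRights -contains "FullAccess") } |
    Select-Object Identity, User, AccessRights

# The same pattern applies to executive and HR mailboxes: pass their
# addresses to Get-Mailbox instead of filtering on SharedMailbox.
```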

Layer 3: Sensitivity Labels (Microsoft Purview)

Sensitivity labels are the most powerful governance tool for Copilot. They classify and protect documents regardless of where they are stored.

Recommended label structure:

| Label | Visual Marking | Encryption | Copilot Behaviour |
| --- | --- | --- | --- |
| Public | "Public" watermark | None | Full Copilot access |
| Internal | "Internal" header/footer | None | Copilot access for internal users |
| Confidential | "Confidential" header/footer | AES-256 encryption | Copilot access restricted to label-authorised users |
| Highly Confidential | "Highly Confidential" watermark + header | AES-256 + restricted access | Copilot cannot surface this content |

Auto-labelling rules (a sketch of one such policy follows the list):

  • Documents containing NRIC/IC numbers → Confidential
  • Documents containing credit card numbers → Confidential
  • Documents in HR salary folders → Highly Confidential
  • Documents in legal/M&A folders → Highly Confidential
  • Board meeting documents → Highly Confidential
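
A sketch of one auto-labelling policy, again in Security & Compliance PowerShell; "Confidential" is assumed to be an existing label, and the rule uses the built-in credit card sensitive information type (start in simulation mode before enforcing):

```powershell
# Auto-apply "Confidential" to SharePoint and OneDrive documents, in test mode first.
New-AutoSensitivityLabelPolicy -Name "Auto-Confidential-PaymentData" `
    -ApplySensitivityLabel "Confidential" `
    -SharePointLocation All `
    -OneDriveLocation All `
    -Mode TestWithoutNotifications

# Trigger on documents containing at least one credit card number.
New-AutoSensitivityLabelRule -Policy "Auto-Confidential-PaymentData" `
    -Name "CreditCardRule" `
    -ContentContainsSensitiveInformation @{ Name = "Credit Card Number"; minCount = "1" }
```

The NRIC/IC rules follow the same pattern using the built-in Singapore NRIC and Malaysia MyKad sensitive information types; the folder-based rules (HR salary, legal/M&A, board papers) are typically handled as default labels on those document libraries instead.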

Layer 4: Usage Policy

Every organisation deploying Copilot needs a clear usage policy. The policy should cover:

Approved Use Cases:

  • Drafting emails, reports, and presentations
  • Summarising meetings and documents
  • Analysing non-confidential data
  • Generating ideas and brainstorming
  • Research and information synthesis

Prohibited Use Cases:

  • Inputting personal data (NRIC, passport numbers, medical records) into Copilot prompts
  • Using Copilot for legal decisions without legal review
  • Using Copilot for financial decisions that affect external stakeholders without verification
  • Sharing Copilot outputs externally without human review
  • Using Copilot to generate content that impersonates specific individuals

Quality Assurance Requirements:

  • All Copilot-generated content must be reviewed by a human before sharing externally
  • Financial figures and data points must be verified against source data
  • Legal and compliance content must be reviewed by the relevant department
  • Customer-facing communications must meet brand and tone guidelines

Disclosure Requirements (when to disclose that AI was used):

  • Regulatory filings and compliance documents: always disclose
  • Client deliverables: follow client's AI usage policy
  • Internal documents: disclosure not required but encouraged
  • Public-facing content: follow company communication guidelines

Layer 5: Monitoring and Compliance

Copilot Activity Monitoring:

  • Enable audit logging for Copilot interactions in the Microsoft Purview compliance portal (see the audit-log sketch after this list)
  • Review usage patterns monthly (who is using Copilot, for what, how often)
  • Set up alerts for unusual activity (e.g., high-volume queries on confidential data)
  • Retain audit logs for 12 months minimum (or as required by your industry regulations)
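
A sketch of the monthly review query, assuming the ExchangeOnlineManagement module; the CopilotInteraction record type covers Copilot prompt-and-response events in the unified audit log:

```powershell
# Connect with an account holding audit log permissions.
Connect-ExchangeOnline

# Pull the last 30 days of Copilot interaction events (5000 is the per-call cap).
$events = Search-UnifiedAuditLog -StartDate (Get-Date).AddDays(-30) `
    -EndDate (Get-Date) `
    -RecordType CopilotInteraction `
    -ResultSize 5000

# Summarise by user to spot unusually high query volumes.
$events | Group-Object UserIds | Sort-Object Count -Descending |
    Select-Object Name, Count
```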

Compliance with Regional Regulations:

For Singapore:

  • PDPA (Personal Data Protection Act): Ensure Copilot usage complies with data protection obligations, particularly consent and purpose limitation
  • MAS Guidelines: Financial institutions must follow MAS technology risk management guidelines for AI usage

For Malaysia:

  • PDPA 2010: Ensure personal data processing through Copilot complies with the seven data protection principles
  • Bank Negara Malaysia: Financial institutions must follow BNM risk management guidelines

Implementation Roadmap

Phase 1: Assessment (Weeks 1-2)

  • Audit current M365 permissions
  • Identify overshared sites and files
  • Classify data into the four-tier model
  • Document current access control gaps

Phase 2: Remediation (Weeks 3-6)

  • Clean up SharePoint permissions
  • Implement sensitivity labels
  • Configure auto-labelling policies
  • Set up conditional access policies for Copilot

Phase 3: Policy and Training (Weeks 5-7)

  • Draft and approve the Copilot usage policy
  • Train all Copilot users on the governance framework
  • Train IT/security team on monitoring and compliance
  • Brief leadership on governance structure and residual risks

Phase 4: Deploy and Monitor (Week 8+)

  • Deploy Copilot to pilot group with governance in place
  • Monitor audit logs weekly during the first month
  • Review and adjust policies based on real usage patterns
  • Expand deployment as governance model proves effective

Getting Help

Building a Copilot governance framework requires expertise in M365 security, data classification, and regional data protection regulations. Training providers in the region offer governance workshops and implementation support.

  • Malaysia: Governance training and assessment is HRDF claimable
  • Singapore: SkillsFuture subsidies cover 70-90% of governance training and assessment costs

Frequently Asked Questions

What data can Copilot access?

Copilot can access any data that the individual user has permission to see in M365, including SharePoint, OneDrive, Teams, and Exchange. It does not bypass permissions. The risk comes from overly broad permissions that already exist — Copilot just makes it easier for users to find data they technically already had access to.

How do we prevent Copilot from surfacing confidential documents?

Use Microsoft Purview sensitivity labels to classify confidential documents, restrict permissions to only authorised users, remove broad sharing (Everyone, All Company) from sensitive sites, and enable auto-labelling for documents containing personal data. Properly configured sensitivity labels prevent Copilot from surfacing protected content.

Do we need a written Copilot usage policy?

Yes. Every organisation deploying Copilot should have a written usage policy covering approved and prohibited use cases, data handling rules, quality assurance requirements, and disclosure requirements. Without a policy, employees will make their own decisions about what is appropriate, leading to inconsistent and potentially risky usage.
