
Copilot Governance & Access Model — Secure and Responsible Copilot Deployment

Pertama Partners · February 11, 2026 · 11 min read
🇲🇾 Malaysia · 🇸🇬 Singapore

Why Copilot Governance Cannot Be an Afterthought

Microsoft Copilot for M365 inherits the same data access permissions as the user. If an employee has access to a SharePoint site containing salary data, Copilot can surface that salary data in response to a prompt. If a shared drive gives "Everyone" access to board meeting minutes, any Copilot user can ask about board discussions.

This is not a bug — it is how Copilot is designed. It respects M365 permissions exactly. The problem is that most organisations have accumulated years of permission sprawl, overshared sites, and stale access grants.

Without a governance model, Copilot becomes a data leakage risk. With the right governance, Copilot is secure by design.

The Copilot Governance Framework

A comprehensive Copilot governance framework has five layers:

Layer 1: Data Classification

Before deploying Copilot, classify your organisational data into tiers:

| Classification | Examples | Copilot Access |
| --- | --- | --- |
| Public | Marketing materials, public website content | Open to all Copilot users |
| Internal | Internal memos, team wikis, project documents | Open to relevant team members |
| Confidential | Financial reports, client data, contracts | Restricted to authorised roles |
| Highly Confidential | Salary data, board papers, M&A documents, legal matters | Excluded from Copilot or heavily restricted |
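The four-tier model can be expressed as a simple policy lookup. The sketch below is illustrative only: the tier names come from the table above, but the `copilot_access` values and the `copilot_may_surface` helper are hypothetical — in practice enforcement happens through Purview labels and M365 permissions, not application code.

```python
# Illustrative policy lookup for the four-tier classification model.
# Tier names mirror the table above; this is a sketch, not a Microsoft
# API -- real enforcement happens via Purview labels and permissions.

CLASSIFICATION_POLICY = {
    "Public":              {"copilot_access": "all_users"},
    "Internal":            {"copilot_access": "team_members"},
    "Confidential":        {"copilot_access": "authorised_roles"},
    "Highly Confidential": {"copilot_access": "excluded"},
}

def copilot_may_surface(doc_label: str, user_clearance: set) -> bool:
    """Return True if a document with this label may appear in Copilot
    results for a user holding the given clearances (hypothetical model)."""
    # Unknown labels fail closed: treat them as excluded.
    rule = CLASSIFICATION_POLICY.get(doc_label, {"copilot_access": "excluded"})
    access = rule["copilot_access"]
    if access == "excluded":
        return False
    if access == "all_users":
        return True
    # "team_members" / "authorised_roles": user must hold the clearance
    return doc_label in user_clearance

print(copilot_may_surface("Public", set()))                                # True
print(copilot_may_surface("Highly Confidential", {"Highly Confidential"})) # False
```

Note that "Highly Confidential" returns False even for users holding the clearance: under this model, that tier is excluded from Copilot entirely rather than gated.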

Layer 2: Access Controls

Align M365 permissions with your data classification:

SharePoint Sites:

  • Review every site's sharing settings
  • Remove "Everyone" and "Everyone except external users" groups from confidential sites
  • Replace broad access with specific security groups
  • Implement site-level access reviews on a quarterly basis

OneDrive:

  • Audit sharing links (especially "Anyone with the link" shares)
  • Set default sharing to "People in your organisation" at minimum
  • Block sharing of files classified as Highly Confidential
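A sharing-link audit can start from an exported sharing report. The sketch below assumes a CSV export with `Url` and `LinkType` columns — the column names and link-type values are illustrative and should be matched against whatever your tenant's report actually contains.

```python
import csv
import io

# Sketch: flag risky sharing links in an exported OneDrive/SharePoint
# sharing report. Column names ("Url", "LinkType") and link-type values
# are illustrative -- check them against your actual export format.

RISKY_LINK_TYPES = {"Anyone", "AnonymousAccess"}  # "Anyone with the link"

def find_risky_links(report_csv: str) -> list:
    """Return the URLs of items shared via anonymous/"Anyone" links."""
    reader = csv.DictReader(io.StringIO(report_csv))
    return [row["Url"] for row in reader if row["LinkType"] in RISKY_LINK_TYPES]

sample = (
    "Url,LinkType\n"
    "https://contoso-my.sharepoint.com/doc1,Anyone\n"
    "https://contoso-my.sharepoint.com/doc2,Organization\n"
)
print(find_risky_links(sample))  # ['https://contoso-my.sharepoint.com/doc1']
```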

Teams:

  • Review team membership and remove inactive members
  • Set up private channels for confidential discussions
  • Audit guest access and remove unnecessary external accounts
  • Enable meeting transcription controls (who can transcribe, retention period)

Exchange:

  • Review shared mailbox permissions
  • Audit delegate access to executive and HR mailboxes
  • Implement email encryption for Highly Confidential communications

Layer 3: Sensitivity Labels (Microsoft Purview)

Sensitivity labels are the most powerful governance tool for Copilot. They classify and protect documents regardless of where they are stored.

Recommended label structure:

| Label | Visual Marking | Encryption | Copilot Behaviour |
| --- | --- | --- | --- |
| Public | "Public" watermark | None | Full Copilot access |
| Internal | "Internal" header/footer | None | Copilot access for internal users |
| Confidential | "Confidential" header/footer | AES-256 encryption | Copilot access restricted to label-authorised users |
| Highly Confidential | "Highly Confidential" watermark + header | AES-256 + restricted access | Copilot cannot surface this content |

Auto-labelling rules:

  • Documents containing NRIC/IC numbers → Confidential
  • Documents containing credit card numbers → Confidential
  • Documents in HR salary folders → Highly Confidential
  • Documents in legal/M&A folders → Highly Confidential
  • Board meeting documents → Highly Confidential
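The auto-labelling rules above reduce to content patterns and folder rules. The sketch below illustrates the logic with simplified regexes (the NRIC, IC, and card patterns are rough approximations, not validated formats); in production, these rules live in Microsoft Purview using its built-in sensitive information types, not in your own code.

```python
import re

# Sketch of the content-based auto-labelling rules listed above.
# Patterns are simplified illustrations; production auto-labelling is
# configured in Microsoft Purview with built-in sensitive info types.

CONTENT_RULES = [
    (re.compile(r"\b[STFG]\d{7}[A-Z]\b"), "Confidential"),    # SG NRIC/FIN (approx.)
    (re.compile(r"\b\d{6}-\d{2}-\d{4}\b"), "Confidential"),   # MY IC number (approx.)
    (re.compile(r"\b(?:\d[ -]?){13,16}\b"), "Confidential"),  # card-like digit runs
]

# Folder paths are hypothetical examples of the HR/legal rules above.
FOLDER_RULES = {
    "HR/Salary": "Highly Confidential",
    "Legal/MA": "Highly Confidential",
}

def suggest_label(text: str, folder: str = ""):
    """Return a suggested sensitivity label, or None if no rule matches."""
    if folder in FOLDER_RULES:          # location rules take precedence
        return FOLDER_RULES[folder]
    for pattern, label in CONTENT_RULES:
        if pattern.search(text):
            return label
    return None

print(suggest_label("Applicant NRIC: S1234567A"))        # Confidential
print(suggest_label("Q3 numbers", folder="HR/Salary"))   # Highly Confidential
```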

Layer 4: Usage Policy

Every organisation deploying Copilot needs a clear usage policy. The policy should cover:

Approved Use Cases:

  • Drafting emails, reports, and presentations
  • Summarising meetings and documents
  • Analysing non-confidential data
  • Generating ideas and brainstorming
  • Research and information synthesis

Prohibited Use Cases:

  • Inputting personal data (NRIC, passport numbers, medical records) into Copilot prompts
  • Using Copilot for legal decisions without legal review
  • Using Copilot for financial decisions that affect external stakeholders without verification
  • Sharing Copilot outputs externally without human review
  • Using Copilot to generate content that impersonates specific individuals
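The first prohibited use case (personal data in prompts) is the one most amenable to automated checking. The sketch below shows a pre-submission prompt check; the rule names and patterns are illustrative, and a real control would typically be a DLP policy on the tenant rather than client-side code.

```python
import re

# Sketch of a pre-submission check against the prohibited-use rules
# above. Patterns and rule names are illustrative; a real control
# would be a tenant-level DLP policy, not client-side code.

PROHIBITED = [
    ("personal data (NRIC/FIN)", re.compile(r"\b[STFG]\d{7}[A-Z]\b")),
    ("personal data (passport)", re.compile(r"\bpassport\s+(?:no\.?|number)\b", re.I)),
    ("medical records", re.compile(r"\bmedical record\b", re.I)),
]

def check_prompt(prompt: str) -> list:
    """Return the names of any prohibited-use rules the prompt trips."""
    return [name for name, pattern in PROHIBITED if pattern.search(prompt)]

print(check_prompt("Summarise the medical record for S1234567A"))
# -> ['personal data (NRIC/FIN)', 'medical records']
print(check_prompt("Draft a project status update"))  # -> []
```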

Quality Assurance Requirements:

  • All Copilot-generated content must be reviewed by a human before sharing externally
  • Financial figures and data points must be verified against source data
  • Legal and compliance content must be reviewed by the relevant department
  • Customer-facing communications must meet brand and tone guidelines

Disclosure Requirements (when to disclose that AI was used):

  • Regulatory filings and compliance documents: always disclose
  • Client deliverables: follow client's AI usage policy
  • Internal documents: disclosure not required but encouraged
  • Public-facing content: follow company communication guidelines

Layer 5: Monitoring and Compliance

Copilot Activity Monitoring:

  • Enable audit logging for Copilot interactions in Microsoft 365 compliance centre
  • Review usage patterns monthly (who is using Copilot, for what, how often)
  • Set up alerts for unusual activity (e.g., high-volume queries on confidential data)
  • Retain audit logs for 12 months minimum (or as required by your industry regulations)
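The high-volume alert can be built on an export of the unified audit log. The sketch below assumes JSON records with `UserId` and `Operation` fields and a `CopilotInteraction` operation name — these follow the audit log's general shape but should be verified against your actual export, and the threshold is a placeholder to tune against your baseline.

```python
import json
from collections import Counter

# Sketch: flag high-volume Copilot activity from an exported audit log.
# Field names ("UserId", "Operation") and the "CopilotInteraction"
# operation are assumptions -- verify against your tenant's export.

ALERT_THRESHOLD = 500  # queries per review period; tune to your baseline

def flag_heavy_users(audit_json: str) -> list:
    """Return users whose Copilot query count meets the alert threshold."""
    records = json.loads(audit_json)
    counts = Counter(
        r["UserId"] for r in records
        if r.get("Operation") == "CopilotInteraction"
    )
    return [user for user, n in counts.items() if n >= ALERT_THRESHOLD]

sample = json.dumps(
    [{"UserId": "a@contoso.com", "Operation": "CopilotInteraction"}] * 600
    + [{"UserId": "b@contoso.com", "Operation": "CopilotInteraction"}] * 10
)
print(flag_heavy_users(sample))  # ['a@contoso.com']
```

A production version would scope the count to queries touching Confidential or Highly Confidential content, per the alert example above, rather than raw volume alone.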

Compliance with Regional Regulations:

For Singapore:

  • PDPA (Personal Data Protection Act): Ensure Copilot usage complies with data protection obligations, particularly consent and purpose limitation
  • MAS Guidelines: Financial institutions must follow MAS technology risk management guidelines for AI usage

For Malaysia:

  • PDPA 2010: Ensure personal data processing through Copilot complies with the seven data protection principles
  • Bank Negara Malaysia: Financial institutions must follow BNM risk management guidelines

Implementation Roadmap

Phase 1: Assessment (Weeks 1-2)

  • Audit current M365 permissions
  • Identify overshared sites and files
  • Classify data into the four-tier model
  • Document current access control gaps

Phase 2: Remediation (Weeks 3-6)

  • Clean up SharePoint permissions
  • Implement sensitivity labels
  • Configure auto-labelling policies
  • Set up conditional access policies for Copilot

Phase 3: Policy and Training (Weeks 5-7)

  • Draft and approve the Copilot usage policy
  • Train all Copilot users on the governance framework
  • Train IT/security team on monitoring and compliance
  • Brief leadership on governance structure and residual risks

Phase 4: Deploy and Monitor (Week 8+)

  • Deploy Copilot to pilot group with governance in place
  • Monitor audit logs weekly during the first month
  • Review and adjust policies based on real usage patterns
  • Expand deployment as governance model proves effective

Getting Help

Building a Copilot governance framework requires expertise in M365 security, data classification, and regional data protection regulations. Training providers in the region offer governance workshops and implementation support.

  • Malaysia: Governance training and assessment is HRDF claimable
  • Singapore: SkillsFuture subsidies cover 70-90% of governance training and assessment costs

Related Reading

Implementing Role-Based Copilot Access Controls

Effective Copilot governance requires granular access controls that balance security requirements with productivity objectives. Organizations should define role-based access policies that specify which Copilot features are available to different employee categories based on their data access privileges and job requirements. Employees handling sensitive financial, legal, or HR data may require additional controls such as Copilot output review requirements or restrictions on specific features that could inadvertently expose confidential information in shared contexts. Microsoft Purview integration enables sensitivity label inheritance for Copilot-generated content, ensuring that AI-created documents automatically receive appropriate classification and access restrictions based on the source data used in their generation.

Data Security Considerations for Copilot Deployments

Copilot's access to organizational data through Microsoft Graph creates security considerations that governance frameworks must address proactively. Copilot respects existing Microsoft 365 permissions, which means that overly permissive file sharing and access controls become a security risk when Copilot can surface sensitive documents in response to user queries. Organizations should conduct a permissions audit before Copilot deployment, identifying and remediating excessive access grants that could expose sensitive information through Copilot interactions. Implement information barriers where required to prevent Copilot from surfacing confidential information across organizational boundaries, such as preventing deal team members from accessing material non-public information from other client engagements through Copilot searches.

Monitoring and Auditing Copilot Usage

Governance frameworks must include monitoring and auditing capabilities that provide visibility into how Copilot is being used across the organization without creating surveillance concerns that undermine employee trust. Microsoft 365 usage analytics and Copilot-specific reporting tools provide aggregate usage statistics that reveal adoption patterns, feature utilization rates, and potential governance concerns at the organizational and departmental levels without exposing individual conversation content. Quarterly governance reviews should analyze usage data to identify departments exceeding expected usage patterns that may indicate productive adoption or potential data handling concerns, and departments with declining usage that may need additional training or workflow integration support.

Organizations should document their Copilot governance policies in accessible formats that all employees can understand, avoiding technical jargon that may confuse non-technical staff. Clear governance documentation should explain what data Copilot can access on behalf of each user role, what types of queries or tasks are prohibited, how governance violations are detected and reported, and what consequences apply for intentional policy circumvention. Regular governance awareness training ensures that all Copilot users understand their responsibilities within the governance framework.

Governance frameworks should also address Copilot usage in regulated communications scenarios. Financial services firms, healthcare organizations, and legal practices face specific requirements regarding the retention, supervision, and auditability of client communications. When Copilot assists in drafting regulated communications, organizations must ensure that appropriate supervision and archival processes capture both the AI-assisted drafting process and the final approved communication to satisfy regulatory recordkeeping obligations.

Common Questions

What data can Copilot access?

Copilot can access any data that the individual user has permission to see in M365, including SharePoint, OneDrive, Teams, and Exchange. It does not bypass permissions. The risk comes from overly broad permissions that already exist — Copilot just makes it easier for users to find data they technically already had access to.

How do we prevent Copilot from surfacing confidential documents?

Use Microsoft Purview sensitivity labels to classify confidential documents, restrict permissions to only authorised users, remove broad sharing (Everyone, All Company) from sensitive sites, and enable auto-labelling for documents containing personal data. Properly configured sensitivity labels prevent Copilot from surfacing protected content.

Do we need a written Copilot usage policy?

Yes. Every organisation deploying Copilot should have a written usage policy covering approved and prohibited use cases, data handling rules, quality assurance requirements, and disclosure requirements. Without a policy, employees will make their own decisions about what is appropriate, leading to inconsistent and potentially risky usage.

How long does it take to build a governance framework?

A basic governance framework (data audit, sensitivity labels, usage policy) takes 6-8 weeks. Weeks 1-2: permissions audit. Weeks 3-6: remediation and label configuration. Weeks 5-7: policy drafting and training. Week 8+: pilot deployment with monitoring. Larger organisations with complex permissions may need 10-12 weeks.

What are the risks of deploying Copilot without governance?

Risks include: data leakage (Copilot surfaces salary data, confidential contracts, or board papers to unauthorised users), compliance violations (PDPA breaches from improper personal data handling), productivity loss (users get low-quality outputs due to poor data quality), and security incidents (credentials or sensitive information exposed through oversharing).

Can regulated industries deploy Copilot?

Yes, with proper governance. Financial services must comply with MAS TRM (Singapore) or BNM RMiT (Malaysia) guidelines. Healthcare must follow PDPA plus sector-specific data protection. Key controls: sensitivity labels for regulated data, restricted access, audit logging, data residency verification, and documented governance frameworks.

How do sensitivity labels work with Copilot?

Sensitivity labels classify documents (Public, Internal, Confidential, Highly Confidential) and control access through encryption. Copilot respects these labels: if a user lacks permission to decrypt a labelled document, Copilot cannot surface its content. Auto-labelling rules can automatically classify documents based on content (e.g., NRIC numbers → Confidential).
