AI Governance & Risk Management · Playbook · Practitioner

How to Report AI Risks to Your Board: Templates and Best Practices

October 11, 2025 · 9 min read · Michael Lansdowne Hauge
For: Risk Managers, Board Directors, Compliance Officers, CXOs

Complete guide to board-level AI risk reporting with downloadable template, best practices, and common mistakes to avoid.


Key Takeaways

  1. Frame AI risks in business terms boards understand: financial impact, reputation, and competitive position
  2. Use a tiered reporting structure distinguishing strategic risks from operational ones
  3. Include risk velocity metrics showing how quickly AI threats can materialize
  4. Provide clear mitigation options with cost-benefit analysis for each risk
  5. Establish regular reporting cadence with escalation triggers for emerging threats

Executive Summary

  • Boards need AI risk information that's concise, meaningful, and actionable
  • Effective reports balance comprehensiveness with board-level accessibility
  • Standard formats enable comparison over time and across organizations
  • Reports should answer: What's the risk? What are we doing? Is it working?
  • Frequency depends on AI maturity and risk profile—quarterly is typical
  • Avoid technical jargon; focus on business implications
  • Include recommendations, not just information

Why Board AI Risk Reporting Matters

Boards have fiduciary responsibility for organizational risk. As AI becomes embedded in operations, boards need visibility into AI risk—not the technical details, but the business implications.

Good reporting enables boards to:

  • Exercise appropriate oversight
  • Ask informed questions
  • Support or challenge AI investment decisions
  • Fulfill governance responsibilities
  • Demonstrate due diligence to regulators and stakeholders

Poor reporting—or no reporting—leaves boards unable to govern AI effectively.


What Boards Need to Know

The Four Key Questions

Every board AI risk report should answer the following four questions (a data-structure sketch follows the list):

  1. What AI are we using?

    • Inventory of AI systems
    • Changes since last report
    • Planned additions
  2. What are the risks?

    • Current risk posture
    • Key risks and their ratings
    • Changes in risk profile
  3. What are we doing about it?

    • Mitigation activities and progress
    • Governance activities
    • Incidents and responses
  4. Is it working?

    • Effectiveness of controls
    • Trends over time
    • Outstanding issues
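
For teams that assemble the report from structured data, the four questions map naturally onto a simple data structure. The sketch below is a minimal illustration in Python; the class and field names are assumptions made for this article, not a standard schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class InventoryUpdate:               # Q1: What AI are we using?
    current_systems: int
    new_this_quarter: int
    retired_this_quarter: int
    planned_next_quarter: int

@dataclass
class RiskItem:                      # Q2: What are the risks?
    name: str
    rating: str                      # "Critical" / "High" / "Medium" / "Low"
    trend: str                       # "up" / "down" / "stable"
    mitigation_status: str           # Q3: What are we doing about it?

@dataclass
class BoardAIRiskReport:
    quarter: str
    overall_status: str              # "GREEN" / "YELLOW" / "RED"
    inventory: InventoryUpdate
    top_risks: List[RiskItem]
    incidents: List[str] = field(default_factory=list)
    controls_effective: bool = True  # Q4: Is it working?
    recommendations: List[str] = field(default_factory=list)
```

Keeping the underlying data in a structure like this makes it easier to produce the same report format every quarter and to track trends between reports.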

What Boards DON'T Need

  • Technical details of AI models
  • Exhaustive risk registers (summary is sufficient)
  • Jargon-heavy explanations
  • Information that can't lead to decisions

Board Report Template

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
        AI RISK REPORT TO THE BOARD
        [Organization Name]
        [Quarter/Year]
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

EXECUTIVE SUMMARY

Overall AI Risk Status: [GREEN / YELLOW / RED]

Key Points:
• [Summary bullet 1]
• [Summary bullet 2]
• [Summary bullet 3]

Recommendations for Board:
1. [Recommendation requiring board action]
2. [Recommendation for board awareness]

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
1. AI INVENTORY UPDATE
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Current AI Systems: [Number]
New This Quarter: [Number]
Retired This Quarter: [Number]
Planned Next Quarter: [Number]

KEY CHANGES:
• [Notable new AI deployment]
• [Significant AI expansion]
• [Retirement or replacement]

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
2. RISK POSTURE
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

RISK SUMMARY

| Risk Level | Count | Trend |
|------------|-------|-------|
| Critical   | X     | ↑/↓/→ |
| High       | X     | ↑/↓/→ |
| Medium     | X     | ↑/↓/→ |
| Low        | X     | ↑/↓/→ |
| TOTAL      | X     |       |

TOP RISKS FOR BOARD ATTENTION:

1. [Risk Name]
   Status: [Rating] - [Trend]
   Description: [One sentence]
   Mitigation: [Status]
   Outlook: [Improving/Stable/Concerning]

2. [Risk Name]
   Status: [Rating] - [Trend]
   Description: [One sentence]
   Mitigation: [Status]
   Outlook: [Improving/Stable/Concerning]

3. [Risk Name]
   Status: [Rating] - [Trend]
   Description: [One sentence]
   Mitigation: [Status]
   Outlook: [Improving/Stable/Concerning]

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
3. GOVERNANCE ACTIVITIES
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

AI Governance Committee:
• Meetings held: [X]
• Key decisions: [List]
• Issues escalated: [List or "None"]

Policy Updates:
• [Any policy changes]

Training:
• Completion rate: [X%]
• Notable activities: [Summary]

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
4. INCIDENTS
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

AI Incidents This Quarter: [X]

INCIDENT SUMMARY (if any):

[Incident 1]
• Date: [X]
• System: [X]
• Impact: [Brief description]
• Status: [Resolved/Ongoing]
• Lessons: [Brief summary]

[If no incidents: "No AI incidents this quarter."]

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
5. COMPLIANCE STATUS
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Regulatory Developments:
• [Any relevant regulatory changes]

Compliance Status:
• PDPA: [Compliant/In Progress/Concerns]
• Sector requirements: [Status]
• Internal policy: [Compliance rate]

Upcoming:
• [Any compliance deadlines or changes]

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
6. OUTLOOK AND RECOMMENDATIONS
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

NEXT QUARTER OUTLOOK:
• [Key expected changes]
• [Emerging risks to monitor]

RECOMMENDATIONS:

For Board Decision:
1. [Item requiring board approval]

For Board Awareness:
2. [Item for information]
3. [Item for information]

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━

Report Prepared By: [Name/Role]
Date: [Date]
Next Report: [Quarter]

━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
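
If the report is produced each quarter from the same underlying data, generating it programmatically helps keep the format consistent. The sketch below is illustrative only and assumes the hypothetical BoardAIRiskReport structure outlined earlier; the layout choices are not prescriptive.

```python
def render_report(report: "BoardAIRiskReport") -> str:
    """Render a quarterly board AI risk report as plain text (illustrative only)."""
    divider = "━" * 54
    lines = [
        divider,
        "AI RISK REPORT TO THE BOARD",
        report.quarter,
        divider,
        f"Overall AI Risk Status: {report.overall_status}",
        "",
        "TOP RISKS FOR BOARD ATTENTION:",
    ]
    for i, risk in enumerate(report.top_risks, start=1):
        lines.append(
            f"{i}. {risk.name} - {risk.rating} ({risk.trend}); "
            f"mitigation: {risk.mitigation_status}"
        )
    lines += ["", "RECOMMENDATIONS:"]
    lines += [f"- {rec}" for rec in report.recommendations]
    return "\n".join(lines)
```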

Best Practices

1. Keep It Concise

Board time is limited. One to three pages is ideal. Use appendices for detail that members may want but most won't read.

2. Use Consistent Format

Same structure each quarter enables comparison over time. Board members learn where to find information.

3. Lead with Status

Start with the overall status (Red/Yellow/Green) and key points. Board members who read only the first paragraph should get the essential message.

4. Show Trends and Context

Numbers without context mean little. Show direction: increasing, decreasing, stable. Compare to the prior quarter.
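
A small helper can turn current and prior-quarter counts into the trend arrows used in the Risk Summary table above. The example numbers below are made up for illustration.

```python
def trend_arrow(current: int, prior: int) -> str:
    """Compare this quarter to the prior quarter and return a trend symbol."""
    if current > prior:
        return "↑"
    if current < prior:
        return "↓"
    return "→"

# Illustrative counts of risks by level: prior quarter vs. this quarter.
prior = {"Critical": 1, "High": 4, "Medium": 7, "Low": 12}
current = {"Critical": 0, "High": 5, "Medium": 7, "Low": 11}

for level in ("Critical", "High", "Medium", "Low"):
    print(f"{level:<8} {current[level]:>3}  {trend_arrow(current[level], prior[level])}")
```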

5. Be Honest About Problems

Boards appreciate candor. Hiding problems undermines trust. Present issues with proposed responses.

6. Make Recommendations

Reports that require decisions should include recommendations. "We recommend the board approve..." is more useful than "The board may wish to consider..."

7. Anticipate Questions

Before presenting, ask: "What questions will board members ask?" Address likely questions proactively.


Reporting Frequency

| AI Maturity | Recommended Frequency |
|-------------|-----------------------|
| Early (few AI systems) | Semi-annually |
| Developing (growing AI) | Quarterly |
| Mature (significant AI) | Quarterly with monthly monitoring |
| High-risk industries | Quarterly minimum; may need monthly |

Trigger-Based Reporting

In addition to scheduled reports, report immediately when any of the following occurs (a simple trigger check is sketched after the list):

  • Significant AI incidents
  • Material regulatory changes
  • Major AI deployment decisions
  • Emerging risks requiring board attention
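
Organizations that log these events in a risk system can encode the escalation rule directly, so out-of-cycle reporting is triggered consistently rather than left to judgment alone. The sketch below is illustrative; the event categories and names are assumptions that mirror the list above.

```python
# Hypothetical event categories mirroring the trigger list above.
IMMEDIATE_TRIGGERS = {
    "significant_ai_incident",
    "material_regulatory_change",
    "major_ai_deployment_decision",
    "emerging_board_level_risk",
}

def requires_immediate_report(events: list[str]) -> bool:
    """Return True if any logged event warrants an out-of-cycle board report."""
    return any(event in IMMEDIATE_TRIGGERS for event in events)

# Example usage with hypothetical events from a risk log.
events_this_week = ["minor_model_drift", "material_regulatory_change"]
if requires_immediate_report(events_this_week):
    print("Escalate: prepare an out-of-cycle board briefing.")
```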

Common Mistakes

1. Too Much Technical Detail

Boards don't need to understand how the AI works. They need to understand business risk and implications.

Fix: Describe risks in business terms. Save technical details for appendices or follow-up questions.

2. No Actionable Recommendations

Reports that inform but don't recommend leave boards unsure what to do.

Fix: Include specific recommendations. Be clear whether you need approval, support, or simply awareness.

3. Inconsistent Format

Different format each time makes comparison difficult and frustrates board members.

Fix: Use a standard template. Consistent structure becomes expected.

4. Burying Bad News

Hiding problems or putting them deep in the report undermines trust.

Fix: Lead with status. If there's a problem, address it upfront with your response plan.

5. No Comparative Data

Current risk levels without context are hard to evaluate.

Fix: Show trends and comparisons to prior periods.


Checklist: Board AI Risk Report

Preparation

  • Data gathered from AI governance committee
  • Risk register reviewed for changes
  • Incidents documented
  • Compliance status confirmed
  • Trends analyzed

Content

  • Executive summary with status
  • AI inventory update
  • Risk summary with top risks
  • Governance activities
  • Incidents (if any)
  • Compliance status
  • Recommendations

Review

  • Language appropriate for board level
  • Technical jargon minimized
  • Recommendations clear and actionable
  • Questions anticipated and addressed
  • Format consistent with prior reports

Frequently Asked Questions

How detailed should the risk register section be?

Provide a summary: total risks by level and the top risks needing attention. The full register is typically too detailed for the board; keep it in an appendix or make it available on request.
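
If the register is maintained in a tool or spreadsheet export, the board summary can be derived mechanically. A minimal sketch, assuming the register is a list of (risk name, rating) pairs; the names and ratings below are hypothetical.

```python
from collections import Counter

LEVEL_ORDER = ["Critical", "High", "Medium", "Low"]

# Hypothetical register extract: (risk name, rating).
register = [
    ("Vendor LLM data leakage", "Critical"),
    ("Model bias in credit decisions", "High"),
    ("Shadow AI usage by staff", "Medium"),
    ("Chatbot answer quality drift", "Medium"),
]

# Counts by level for the Risk Summary table.
counts = Counter(rating for _, rating in register)
print({level: counts.get(level, 0) for level in LEVEL_ORDER})

# Top risks for board attention: highest-rated first, capped at three.
top_risks = sorted(register, key=lambda item: LEVEL_ORDER.index(item[1]))[:3]
print([name for name, _ in top_risks])
```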

What if there's nothing significant to report?

Brief updates are still valuable. Confirm governance is operating, note stable risk posture, and highlight any emerging items. No news is also news.

Should we include AI investment alongside risk?

Some organizations combine AI risk reporting with AI investment updates. This can work well as long as the combined report stays concise. Alternatively, send separate reports to different committees (e.g., Risk and Technology).


Next Steps

Establish regular AI risk reporting if you haven't already. Use the template as a starting point and adapt to your board's needs and preferences.

Book an AI Readiness Audit with Pertama Partners for help establishing governance and reporting frameworks.


Frequently Asked Questions

How often should we report AI risk to the board?

Establish quarterly reporting as a baseline, with immediate escalation for critical incidents. Include AI risk updates in existing risk committee meetings and provide annual comprehensive reviews.

What metrics should a board AI risk report include?

Include risk velocity metrics, incident counts and severity, compliance status, mitigation progress, emerging threat analysis, and financial impact assessments for current and potential AI risks.

How do we make AI risks understandable to non-technical board members?

Frame AI risks in familiar business terms like financial impact, competitive position, regulatory exposure, and reputational risk. Use analogies to traditional risks and provide clear cost-benefit analysis for mitigation options.

Michael Lansdowne Hauge

Founder & Managing Partner

Founder & Managing Partner at Pertama Partners. Founder of Pertama Group.

AI Risk · Board Reporting · Governance · Templates · Risk Management

Ready to Apply These Insights to Your Organization?

Book a complimentary AI Readiness Audit to identify opportunities specific to your context.
