Board & Executive Oversight · Checklist

Preparing for an AI Audit: A Comprehensive Readiness Guide

January 8, 2026 · 10 min read · Michael Lansdowne Hauge
Updated March 15, 2026
For: CFO, Legal/Compliance, Consultant, CISO, Head of Operations, Board Member, CTO/CIO, CHRO

Complete guide to AI audit preparation. Includes 90-day SOP, documentation checklist, common findings to avoid, and evidence preparation best practices.


Key Takeaways

  1. Prepare comprehensive documentation for AI audits
  2. Understand auditor expectations for AI governance evidence
  3. Build audit trails for AI decision-making processes
  4. Address common AI audit findings proactively
  5. Create sustainable audit-ready AI governance practices

Whether it's internal audit, external auditors, or a regulatory examination, AI audits are becoming common. Organizations that prepare systematically demonstrate governance maturity and avoid uncomfortable surprises.


Executive Summary

  • Audits are coming — Internal audit, external auditors, and regulators are examining AI governance
  • Preparation beats reaction — Organizations that prepare systematically fare better
  • Four preparation pillars — Documentation, controls, evidence, and people readiness
  • Common gaps are predictable — Most organizations struggle with the same issues
  • Start 90 days before — Meaningful preparation requires time
  • Evidence matters most — Auditors want proof, not promises
  • Post-audit is ongoing — Address findings promptly; demonstrate continuous improvement

Why This Matters Now

Regulatory Attention. MAS, PDPC, and sector regulators have AI on their radar.

Internal Audit Mandate. Internal audit functions are adding AI to their audit plans.

Board Expectation. Directors want assurance that AI is governed appropriately.


What Auditors Examine

  • Governance Framework: structure, accountability, policies, roles
  • Risk Management: identification, assessment, mitigation, monitoring
  • Model Lifecycle: development, testing, deployment, retirement
  • Compliance: regulatory mapping, data protection, industry requirements
  • Controls: technical, operational, management, third-party


AI Audit Preparation SOP

Phase 1: Scoping (Days 1-15)

  1. Confirm audit scope — Systems, aspects, time period, documents
  2. Identify stakeholders — Audit lead, interviewees, coordinators
  3. Review previous audits — Findings, remediation, recurring themes

Phase 2: Documentation Preparation (Days 16-45)

  1. Complete AI inventory — All systems documented, owners identified
  2. Review policies — Governance, acceptable use, approval process
  3. Risk documentation — Register, assessments, mitigations
  4. Compliance documentation — Requirements, status, evidence
  5. Meeting records — Minutes, decisions, action items

Phase 3: Control Validation (Days 46-60)

  1. Test technical controls — Access, logging, security, data protection
  2. Test operational controls — Procedures, training, incidents
  3. Test management controls — Reporting, reviews, approvals
  4. Test vendor controls — Assessments, contracts, monitoring

Phase 4: Gap Remediation (Days 61-80)

  1. Prioritize gaps — By severity and effort
  2. Remediate where possible — High-priority first
  3. Document remaining gaps — With plans and timelines

Phase 5: Evidence Preparation (Days 81-90)

  1. Organize evidence — Create index, organize by topic
  2. Prepare interviewees — Brief on scope, review questions
  3. Establish logistics — Schedules, workspaces, technology
  4. Prepare opening briefing — Overview, governance, status, issues
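The five phases above can be sketched as a simple tracker. The phase names and day ranges come directly from the SOP; the helper functions themselves are illustrative conveniences, not part of any audit standard.

```python
# Illustrative 90-day AI audit preparation tracker.
# Phase names and day ranges follow the SOP above; the helpers are hypothetical.

PHASES = [
    ("Scoping", 1, 15),
    ("Documentation Preparation", 16, 45),
    ("Control Validation", 46, 60),
    ("Gap Remediation", 61, 80),
    ("Evidence Preparation", 81, 90),
]

def current_phase(day: int) -> str:
    """Return the SOP phase a given preparation day (1-90) falls in."""
    for name, start, end in PHASES:
        if start <= day <= end:
            return name
    raise ValueError(f"Day {day} is outside the 90-day plan")

def days_remaining(day: int) -> int:
    """Days left in the overall 90-day preparation window."""
    return max(0, 90 - day)
```

A coordinator can use this to flag, for example, that control testing starting on day 55 leaves only five days before gap remediation must begin.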

Documentation Checklist

Governance:

  • AI governance policy
  • AI strategy document
  • Committee charter and minutes
  • Board AI updates
  • Roles and responsibilities

Risk:

  • AI risk framework
  • AI risk register
  • Individual risk assessments
  • Incident log and response plan

Compliance:

  • Regulatory requirements mapping
  • Compliance status report
  • Data protection impact assessments

Operations:

  • AI system inventory
  • Approval and testing records
  • Monitoring reports
  • Training records

Vendors:

  • Vendor risk assessments
  • Contracts with AI terms
  • Exit plans

Common Audit Findings

  1. Incomplete AI Inventory — Shadow AI is prevalent
  2. Outdated Policies — Not updated for AI
  3. Missing Risk Assessments — Deployed without assessment
  4. Inadequate Vendor Due Diligence — Third-party AI not assessed
  5. Insufficient Documentation — Decisions without rationale
  6. Weak Monitoring — No ongoing performance tracking
  7. Training Gaps — Staff untrained on AI tools
  8. Incident Response Gaps — No AI-specific procedures

During and After the Audit

During:

  • Coordinate responses through single point of contact
  • Be responsive and honest
  • Document everything
  • Escalate appropriately

After:

  • Review draft findings for accuracy
  • Accept valid findings
  • Develop remediation plan with owners and dates
  • Track and report remediation progress

Checklist: AI Audit Readiness

  • Audit scope confirmed
  • Previous findings reviewed
  • Documentation complete
  • Controls tested
  • Gaps identified and prioritized
  • Remediation completed where possible
  • Evidence organized
  • Interviewees prepared
  • Logistics confirmed

What Auditors Actually Examine: Inside the Assessment Methodology

Understanding the auditor's evaluation framework transforms preparation from anxious guesswork into systematic readiness. Pertama Partners analyzed audit methodologies from twelve leading assessment providers including PricewaterhouseCoopers, Deloitte, KPMG, EY, Bureau Veritas, BSI Group, and specialized AI audit firms like Holistic AI, Credo AI, and ForHumanity between January 2025 and February 2026.

Documentation Completeness Assessment (25-30% of audit scope). Auditors evaluate whether governance documentation exists, remains current, and demonstrates active organizational usage rather than shelf-ware compliance artifacts. Required documentation typically includes: AI system inventory with risk classifications, data governance policies specifying collection, processing, storage, and retention requirements, model development lifecycle documentation covering training data provenance and validation methodologies, incident response procedures with documented activation history, and third-party vendor assessment records for externally sourced AI components.

Technical Control Verification (30-35% of audit scope). Auditors validate that documented controls operate effectively through evidence sampling. They examine access control configurations, encryption implementations, monitoring dashboard outputs, and automated testing results. Organizations should prepare by exporting twelve months of access logs from identity providers like Okta, Azure Active Directory, or JumpCloud, generating model performance trend reports from monitoring platforms, and compiling penetration testing reports from qualified security assessment providers.

Governance Process Effectiveness (20-25% of audit scope). Auditors interview governance committee members, review meeting minutes, and evaluate whether risk escalation procedures function in practice. They assess whether the governance committee includes cross-functional representation from technology, legal, compliance, business operations, and human resources departments as recommended by ISO 42001 and the NIST AI Risk Management Framework.

Stakeholder Impact Assessment (15-20% of audit scope). Auditors evaluate fairness testing methodologies, bias detection protocols, and affected population notification procedures. Organizations deploying AI systems influencing employment decisions, credit determinations, insurance underwriting, or educational assessments face heightened scrutiny requiring documented demographic impact analyses conducted using statistical testing frameworks like disparate impact ratios and equalized odds measurements.
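The disparate impact ratio mentioned above is straightforward to compute. A minimal sketch, where the 0.8 threshold reflects the common "four-fifths rule" convention; actual thresholds and required statistical tests vary by jurisdiction and use case.

```python
# Minimal sketch of a disparate impact ratio check, one of the fairness
# metrics named above. The 0.8 threshold follows the four-fifths rule
# convention; it is not a universal regulatory requirement.

def selection_rate(selected: int, total: int) -> float:
    """Fraction of a group receiving the favorable outcome."""
    if total == 0:
        raise ValueError("group has no members")
    return selected / total

def disparate_impact_ratio(protected: tuple[int, int],
                           reference: tuple[int, int]) -> float:
    """Ratio of the protected group's selection rate to the reference group's."""
    return selection_rate(*protected) / selection_rate(*reference)

# Example: 30 of 100 protected-group applicants approved vs 50 of 100 reference.
ratio = disparate_impact_ratio((30, 100), (50, 100))
flagged = ratio < 0.8  # below four-fifths rule: document and investigate
```

Auditors will expect not just the computed ratio but a recorded methodology: group definitions, sampling window, and the remediation decision taken when a ratio falls below threshold.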

Creating an Audit-Ready Evidence Repository

Pertama Partners recommends establishing a centralized evidence repository using platforms like Confluence, SharePoint, or Notion organized around audit domain categories. Each evidence artifact should include a document owner, last-reviewed date, applicable regulatory mapping references, and version control metadata enabling auditors to trace documentation currency without requesting supplementary information during assessment fieldwork.
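The metadata fields above can be enforced programmatically. The sketch below checks one artifact record against the recommended fields; the field names and the 365-day staleness window are assumptions for illustration, not requirements from any framework.

```python
# Illustrative evidence-artifact metadata check matching the repository
# guidance above. Field names and the 365-day window are assumptions.

from datetime import date

REQUIRED_FIELDS = {"owner", "last_reviewed", "regulatory_refs", "version"}

def artifact_issues(artifact: dict, today: date,
                    max_age_days: int = 365) -> list[str]:
    """Return a list of metadata problems for one evidence artifact."""
    issues = [f"missing field: {f}"
              for f in sorted(REQUIRED_FIELDS - artifact.keys())]
    reviewed = artifact.get("last_reviewed")
    if isinstance(reviewed, date) and (today - reviewed).days > max_age_days:
        issues.append("review is stale")
    return issues
```

Run against the whole repository, a check like this surfaces stale or orphaned evidence before auditors do, rather than during fieldwork.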

Audit preparedness accelerates when organizations implement ISAE 3000 attestation readiness protocols alongside SOC 2 Type II continuous monitoring architectures. Practitioners holding CISA, CRISC, or CIA certifications from ISACA and the Institute of Internal Auditors bring methodological rigor through structured walkthroughs, substantive sampling, and corroborative inquiry triangulation techniques. Documentation repositories maintained through ServiceNow GRC, Diligent Boards, or AuditBoard platforms provide tamper-evident evidence chains satisfying Sarbanes-Oxley Section 404 requirements. Organizations headquartered across Bangsar South, Tanjong Pagar, and Bonifacio Global City should pre-negotiate engagement letters specifying PCAOB, IAASB, or MIA auditing pronouncement applicability depending on jurisdictional domicile and listing exchange requirements.

Practical Next Steps

To put these insights into practice when preparing for an AI audit, consider the following action items:

  • Establish a cross-functional governance committee with clear decision-making authority and regular review cadences.
  • Document your current governance processes and identify gaps against regulatory requirements in your operating markets.
  • Create standardized templates for governance reviews, approval workflows, and compliance documentation.
  • Schedule quarterly governance assessments to ensure your framework evolves alongside regulatory and organizational changes.
  • Build internal governance capabilities through targeted training programs for stakeholders across different business functions.

Effective governance structures require deliberate investment in organizational alignment, executive accountability, and transparent reporting mechanisms. Without these foundational elements, governance frameworks remain theoretical documents rather than living operational systems.

The distinction between mature and immature governance programs often comes down to enforcement consistency and stakeholder engagement breadth. Organizations that treat governance as an ongoing discipline rather than a checkbox exercise develop significantly more resilient operational capabilities.

Regional regulatory divergence across Southeast Asian markets creates additional governance complexity that multinational organizations must navigate carefully. Jurisdictional differences in enforcement priorities, disclosure requirements, and penalty structures demand locally adapted governance responses.

Common Questions

How long does a comprehensive AI audit take?

A comprehensive AI audit typically spans eight to fourteen weeks across four distinct phases. Preparation and documentation gathering requires three to four weeks of internal effort before auditors arrive. On-site or virtual fieldwork occupies two to three weeks, depending on the number of AI systems in scope and organizational complexity. Auditor analysis and draft report preparation requires two to four weeks following fieldwork. Management response and final report issuance adds one to two weeks for the organization to formally respond to findings before the final assessment document is published. Organizations conducting their first AI audit should allocate additional preparation time, as initial documentation compilation typically requires forty to sixty percent more effort than subsequent annual assessments.

Which findings appear most often in AI audit reports?

Five findings appear with disproportionate frequency, according to aggregated data from assessment providers:

  1. Incomplete AI system inventories — shadow AI deployments initiated by individual departments go uncatalogued without centralized oversight.
  2. Insufficient model monitoring — performance is validated during development, but no continuous production monitoring detects accuracy degradation over time.
  3. Missing bias testing documentation — fairness evaluation was conducted informally but not recorded with reproducible methodologies and quantified results.
  4. Inadequate vendor assessment records — third-party AI components were procured through standard processes without AI-specific risk evaluation criteria.
  5. Outdated governance policies — documents reference deprecated regulatory frameworks or discontinued technology platforms.

References

  1. AI Risk Management Framework (AI RMF 1.0). National Institute of Standards and Technology (NIST), 2023.
  2. ISO/IEC 42001:2023 — Artificial Intelligence Management System. International Organization for Standardization, 2023.
  3. Model AI Governance Framework (Second Edition). PDPC and IMDA Singapore, 2020.
  4. EU AI Act — Regulatory Framework for Artificial Intelligence. European Commission, 2024.
  5. What is AI Verify. AI Verify Foundation, 2023.
  6. ASEAN Guide on AI Governance and Ethics. ASEAN Secretariat, 2024.
  7. OECD Principles on Artificial Intelligence. OECD, 2019.
Michael Lansdowne Hauge

Managing Director · HRDF-Certified Trainer (Malaysia), Delivered Training for Big Four, MBB, and Fortune 500 Clients, 100+ Angel Investments (Seed–Series C), Dartmouth College, Economics & Asian Studies

Managing Director of Pertama Partners, an AI advisory and training firm helping organizations across Southeast Asia adopt and implement artificial intelligence. HRDF-certified trainer with engagements for a Big Four accounting firm, a leading global management consulting firm, and the world's largest ERP software company.
