Ask most organizations "What AI are you running?" and you'll get uncertain answers. Marketing uses a chatbot. Finance has some predictive tools. IT doesn't know what teams have signed up for. A department head bought something last month. This shadow AI problem is growing—and it creates real governance and risk management challenges.
An AI model inventory solves this by systematically documenting what AI systems exist, what they do, and who's responsible for them. This guide shows you how to build and maintain one.
Executive Summary
- AI model inventory is a comprehensive register of all AI/ML systems in use across an organization, both built and bought
- Key fields: model purpose, data inputs, risk level, owner, status, review dates
- Regulatory drivers: frameworks such as the EU AI Act and the NIST AI Risk Management Framework expect organizations to know what AI they're using
- Benefits: visibility, risk management, audit readiness, compliance, better governance
- Implementation: can start simple (spreadsheet) and mature into dedicated platforms
- Scope should include third-party AI embedded in tools—often the largest category
Why This Matters Now
Shadow AI is proliferating. Teams sign up for AI tools without IT or governance awareness. ChatGPT, embedded AI features in SaaS products, and department-level purchases create sprawl.
Regulators are asking questions. "What AI do you use?" is becoming a standard regulatory inquiry. Organizations that can't answer clearly face increased scrutiny.
You can't manage risks you don't know about. Each AI system carries risks—data privacy, bias, accuracy, security. Without an inventory, you're blind to your exposure.
Third-party AI is everywhere. That CRM? It has AI features. That analytics platform? Machine learning inside. Your AI footprint is larger than you think.
Definitions and Scope
AI Model vs. AI System
AI Model: The core algorithm or machine learning component; this is the term most often used in technical contexts.
AI System: The broader application that includes the model plus data, interfaces, integrations, and business processes. More useful for governance.
For inventory purposes, focus on AI systems—what people actually use and interact with.
What Counts as "AI" for Inventory Purposes?
This is harder than it seems. Define explicitly:
Typically in scope:
- Custom-built machine learning models
- Purchased AI/ML software
- SaaS products with significant AI features
- Generative AI tools (ChatGPT, image generators, etc.)
- AI-powered automation
- Predictive analytics that uses ML
Gray area (decide for your organization):
- Simple rule-based automation (no ML)
- Statistical models without learning component
- AI features embedded in larger products (e.g., grammar check in Office)
- Personal AI use by individuals
Recommendation: Start broad, then adjust. It's easier to exclude an item later than to discover you missed one.
First-Party vs. Third-Party AI
First-party: AI you build or have built for you. You control the model, data, and deployment.
Third-party: AI built into products you purchase. You use it but don't control how it works internally.
Both need to be in your inventory. Third-party AI is often the larger category and frequently overlooked.
Step-by-Step Implementation Guide
Phase 1: Define Scope and Criteria (Week 1)
Before discovering AI, decide what you're looking for.
Scope decisions:
- What definition of "AI" will you use?
- Will you include embedded AI features?
- Minimum significance threshold? (e.g., only AI used by >5 people)
- Will you track personal AI tool use?
Document your definition so discovery is consistent.
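Some teams encode the documented definition as a simple screening rule so every reviewer applies the same criteria. A minimal sketch in Python, where the criteria and the five-user threshold are illustrative assumptions, not fixed requirements:

```python
# Hypothetical scope screen: encodes the inventory definition in one place
# so discovery decisions are consistent. All thresholds are illustrative.
from dataclasses import dataclass

@dataclass
class CandidateSystem:
    name: str
    uses_ml: bool       # ML/NLP/generative component, not just fixed rules
    business_use: bool  # used for work, not purely personal experimentation
    user_count: int

def is_in_scope(s: CandidateSystem, min_users: int = 5) -> bool:
    """Return True if the candidate should enter the inventory."""
    # Rule-based automation and personal tools are excluded here only
    # because this example scope says so; your definition may differ.
    return s.uses_ml and s.business_use and s.user_count >= min_users

tool = CandidateSystem("Meeting summarizer", uses_ml=True,
                       business_use=True, user_count=12)
print(is_in_scope(tool))  # True under these example criteria
```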
Phase 2: Discovery—Find Existing AI Systems (Weeks 2-4)
Systematically identify AI already in use.
Discovery methods:
IT/Procurement records:
- Software purchase records
- Cloud service subscriptions
- Security assessments of vendors
- Shadow IT discovery tools
Surveys and interviews:
- Department heads: "What AI tools do your teams use?"
- IT: "What tools have AI/ML features?"
- Power users: "What do you use for analysis/automation?"
Technical discovery:
- Cloud resource audits (AI service usage)
- API logs (calls to AI services)
- Network traffic analysis
Vendor review:
- Review feature lists of existing software
- Ask vendors directly: "Do you use AI/ML in your product?"
Discovery checklist:
- IT/Security for known applications
- Procurement for purchases
- Finance for expense reports (tool subscriptions; see the scanning sketch after this checklist)
- Survey to department heads
- Survey to technical teams
- Review major vendor feature sets
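Part of this work can be automated by scanning expense exports or proxy logs for known AI vendor names. A minimal sketch, assuming a hypothetical expenses.csv with a description column; the keyword list is illustrative and should be extended for your estate:

```python
# Sketch: flag expense rows that mention known AI vendors or services.
# File name, column layout, and keyword list are assumptions for this example.
import csv

AI_VENDOR_HINTS = [
    "openai", "anthropic", "chatgpt", "copilot",
    "midjourney", "hugging face", "vertex ai", "sagemaker",
]

def scan_expenses(path: str) -> list[dict]:
    """Return expense rows whose description matches an AI vendor hint."""
    hits = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):  # assumes a 'description' column
            desc = row.get("description", "").lower()
            if any(hint in desc for hint in AI_VENDOR_HINTS):
                hits.append(row)
    return hits

for hit in scan_expenses("expenses.csv"):
    print(f"Possible shadow AI: {hit['description']}")
```

The same keyword-matching approach works on proxy logs or SaaS usage exports; only the parsing step changes.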
Phase 3: Design Inventory Schema (Weeks 2-3)
What information will you capture for each AI system?
Essential fields:
| Field | Description | Example |
|---|---|---|
| System Name | Common name | "Customer Churn Predictor" |
| System ID | Unique identifier | AI-2024-001 |
| Description | What it does, in business terms | "Predicts which customers are likely to cancel" |
| Type | Category of AI | Predictive model |
| First/Third Party | Built or bought | Third-party (SaaS) |
| Vendor | If third-party | Acme Analytics Inc. |
| Owner | Accountable person | Jane Doe, VP Customer Success |
| Technical Contact | For operational issues | John Smith, IT |
| Business Unit | Department using it | Customer Success |
| Status | Operational state | Production |
| Go-Live Date | When deployed | 2024-03-15 |
| Data Inputs | What data it uses | Customer transactions, behavior data |
| Data Classification | Sensitivity level | Confidential |
| Risk Level | Overall risk assessment | Medium |
| Last Review Date | When last assessed | 2024-06-01 |
| Next Review Date | When next due | 2024-12-01 |
Optional but valuable fields:
- Integration points
- User count
- Decision authority (advisory vs. automated)
- Compliance status
- Related policies
- Incident history
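Whatever tool holds the inventory, the schema above translates directly into a structured record. A minimal sketch as a Python dataclass; field names mirror the table, and the enum-like fields are kept as plain strings for simplicity:

```python
# One possible encoding of the inventory schema. Types are deliberately
# simple; a real implementation might use enums and stricter validation.
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class AISystemRecord:
    system_id: str                  # e.g., "AI-2024-001"
    name: str                       # e.g., "Customer Churn Predictor"
    description: str                # what it does, in business terms
    ai_type: str                    # e.g., "Predictive model"
    third_party: bool               # built (False) or bought (True)
    vendor: Optional[str]           # only for third-party systems
    owner: str                      # accountable individual, not a team
    technical_contact: str
    business_unit: str
    status: str                     # e.g., "Production"
    go_live: date
    data_inputs: list[str] = field(default_factory=list)
    data_classification: str = "Unclassified"
    risk_level: str = "Unassessed"
    last_review: Optional[date] = None
    next_review: Optional[date] = None
```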
Phase 4: Populate Initial Inventory (Weeks 4-5)
Enter discovered AI systems into your inventory.
For each system:
- Complete all required fields
- Assign owner (may require escalation)
- Determine initial risk level
- Set review schedule
Common challenges:
- "No one owns this" → assign interim owner and escalate
- "We don't know what data it uses" → investigate or flag as unknown
- "Is this really AI?" → apply your definition; when in doubt, include
Phase 5: Establish Registration Process (Weeks 5-6)
Create process for adding new AI to the inventory.
Registration triggers:
- New AI tool procurement
- Development of new AI capability
- Discovery of previously unknown AI
- Significant change to existing AI system
Registration workflow:
- Requester completes registration form
- Initial review by governance team
- Risk assessment if warranted
- Addition to inventory
- Notification to relevant stakeholders
Make it easy: If registration is burdensome, people won't do it. Streamline the process.
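The intake step itself can be very small. A sketch of one possible registration function, assuming the AISystemRecord dataclass above; the ID format and the "Pending review" status are illustrative conventions:

```python
# Sketch: minimal intake step. Assigns a sequential ID and queues the
# entry for governance review before it counts as approved.
from datetime import date
import itertools

_seq = itertools.count(1)

def register(rec: AISystemRecord,
             inventory: list[AISystemRecord]) -> AISystemRecord:
    rec.system_id = f"AI-{date.today().year}-{next(_seq):03d}"
    rec.status = "Pending review"   # governance team approves separately
    inventory.append(rec)
    return rec
```

Keeping the happy path this small is part of "make it easy"; heavier checks can run after submission rather than blocking it.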
Phase 6: Connect to Risk Assessment Workflow (Week 6+)
The inventory enables risk management.
Integration points:
- New registrations trigger risk assessment
- Risk-based review schedules
- Inventory feeds compliance reporting
- Incident response uses inventory for impact assessment
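Risk-based review schedules, in particular, fall straight out of the inventory data. A sketch, again assuming the AISystemRecord dataclass; the 90/180/365-day intervals are illustrative defaults, not a standard:

```python
# Sketch: derive the next review date from the assessed risk level,
# so higher-risk systems are reviewed more often. Intervals are examples.
from datetime import date, timedelta

REVIEW_INTERVAL_DAYS = {"High": 90, "Medium": 180, "Low": 365}

def schedule_review(rec: AISystemRecord, assessed_on: date) -> date:
    rec.last_review = assessed_on
    interval = REVIEW_INTERVAL_DAYS.get(rec.risk_level, 90)  # unknown -> strictest
    rec.next_review = assessed_on + timedelta(days=interval)
    return rec.next_review
```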
Policy Template: AI System Registration Requirements
AI SYSTEM REGISTRATION POLICY
1. PURPOSE
This policy establishes requirements for registering AI systems in the
organizational AI inventory to enable appropriate oversight and governance.
2. SCOPE
This policy applies to all AI systems used by [Organization] employees
and contractors, including:
- Custom-built AI/ML models
- Purchased AI software
- SaaS products with significant AI capabilities
- Generative AI tools used for business purposes
- Third-party AI embedded in business tools
3. DEFINITION OF AI SYSTEMS
For purposes of this policy, an AI system is defined as any software
that uses machine learning, neural networks, natural language processing,
or similar techniques to make predictions, generate content, or automate
decisions that would otherwise require human judgment.
4. REGISTRATION REQUIREMENTS
4.1 All AI systems must be registered in the AI inventory before
production deployment.
4.2 Registration must include all required fields as defined in
the inventory schema.
4.3 Each AI system must have an assigned owner accountable for
its responsible use.
5. TIMELINE
5.1 New AI systems: Register before deployment
5.2 Existing AI systems: Register within [30 days] of policy effective date
5.3 Changed AI systems: Update registration within [7 days] of
material change
6. EXEMPTIONS
[Organization] may exempt certain low-risk AI uses from registration
requirements. Exemptions must be approved by [Governance Committee]
and documented.
7. NON-COMPLIANCE
Failure to register AI systems may result in:
- Required cessation of AI system use
- Disciplinary action per applicable policies
- Required remediation and review
8. REVIEW
This policy will be reviewed annually and updated as needed.
Common Failure Modes
Failure 1: Definition Too Narrow
Symptom: Inventory shows 5 AI systems; organization actually uses 50
Cause: Only counting custom-built ML, missing third-party and embedded AI
Prevention: Use broad definition; explicitly include third-party AI
Failure 2: Definition Too Broad
Symptom: Inventory has 500 items; team can't keep up
Cause: Including trivial features like spell-check as "AI"
Prevention: Set reasonable thresholds; focus on systems with governance implications
Failure 3: Inventory Becomes Stale
Symptom: Inventory last updated 18 months ago; new AI not captured
Cause: No ongoing registration process; no update triggers
Prevention: Required registration for new AI; periodic refresh; connect to procurement
Failure 4: No Link to Action
Symptom: Inventory exists but doesn't drive governance
Cause: Documentation exercise without connection to risk management
Prevention: Connect inventory to risk assessment; use it for compliance; report on it
Failure 5: Owner Unclear or Unaccountable
Symptom: Issues arise but no one takes responsibility
Cause: Inventory has "owner" field but no real accountability
Prevention: Owner must be individual (not team); owner has actual authority; escalation for ownership gaps
Implementation Checklist
Planning
- AI definition scoped and documented
- Inventory fields designed
- Discovery methods identified
- Responsibility assigned for inventory management
Discovery
- IT/Procurement records reviewed
- Department surveys completed
- Vendor features reviewed
- Shadow AI discovery conducted
Inventory Build
- Initial systems entered
- Owners assigned
- Risk levels determined
- Review schedules set
Process Establishment
- Registration policy documented
- Registration workflow created
- Update triggers defined
- Compliance monitoring planned
Integration
- Connected to risk assessment
- Feeds compliance reporting
- Included in incident response
- Reported to governance
Metrics to Track
Inventory Completeness
- Number of AI systems in inventory
- Estimated vs. discovered systems (gap analysis)
- Systems without assigned owners
- Systems past due for review (see the computation sketch after this section)
Registration Compliance
- New AI systems registered before deployment
- Time from discovery to registration
- Registration rejection rate (and reasons)
Governance Effectiveness
- Systems with completed risk assessments
- High-risk systems with mitigation plans
- Audit findings related to undocumented AI
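Most of the completeness metrics can be computed directly from the inventory records. A sketch, assuming the AISystemRecord dataclass from the schema section:

```python
# Sketch: roll-up of the completeness metrics listed above.
from datetime import date

def inventory_metrics(inventory: list[AISystemRecord]) -> dict:
    today = date.today()
    return {
        "total_systems": len(inventory),
        "missing_owner": sum(1 for r in inventory if r.owner in ("", "TBD")),
        "past_due_review": sum(1 for r in inventory
                               if r.next_review and r.next_review < today),
        "unassessed_risk": sum(1 for r in inventory
                               if r.risk_level == "Unassessed"),
    }
```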
Tooling Suggestions
Spreadsheet (starting point): Simple, accessible, familiar. Good for organizations starting out or with <50 AI systems. Limited for scale and workflow.
GRC platforms: Many governance, risk, and compliance platforms now have AI/model inventory modules. Good integration with risk assessment and compliance workflows.
Dedicated AI governance platforms: Purpose-built for AI inventory and governance. More sophisticated features but additional cost and complexity.
IT asset management extensions: Some ITAM tools are adding AI tracking capabilities. Good if you want to integrate with existing asset management.
Frequently Asked Questions
Do we need to inventory AI in third-party tools?
Yes. Third-party AI often processes your data, affects your customers, and creates compliance obligations. The inventory should distinguish first-party vs. third-party but include both.
What's the difference between model registry and model inventory?
Model registry is technical—tracking versions, parameters, and artifacts of ML models in development. Used by data science teams.
Model inventory is governance—tracking what AI systems exist, who owns them, and what risks they carry. Used by governance and risk functions.
Both are valuable; they serve different purposes.
How granular should the inventory be?
Balance comprehensiveness with manageability. Generally, create one entry per "AI system" as users experience it and as it operates in your environment. Don't inventory individual features within a product; do inventory distinct products/tools.
Who owns the inventory?
Typically the AI governance function, risk management, or IT. Ownership should include authority to require registration and follow up on gaps.
How often should we update it?
Continuous registration of new AI. Periodic review of existing entries (annually at minimum). Triggered updates when material changes occur.
What about AI used by individuals (personal ChatGPT accounts)?
Policy decision. Some organizations inventory and govern; others accept risk with guidelines. At minimum, have a policy on acceptable use of personal AI tools for work.
Conclusion
An AI model inventory is foundational to AI governance. You cannot assess risks, ensure compliance, or manage AI responsibly if you don't know what AI you're using.
Start simple—a spreadsheet with essential fields is better than nothing. Discover what AI already exists (likely more than you think). Establish a registration process for new AI. Connect the inventory to risk assessment and compliance workflows.
The inventory enables everything else in AI governance. Without it, you're governing blind.
Book an AI Readiness Audit
Not sure what AI your organization is using? Our AI Readiness Audit includes a comprehensive discovery of AI systems and provides a foundation for governance.