
AI Use-Case Intake Process — From Idea to Implementation

February 11, 2026 · 9 min read · Pertama Partners

A structured process for evaluating, prioritising, and approving AI use cases in your company. Includes intake form template, scoring criteria, and governance workflow.


Why You Need an AI Use-Case Intake Process

As AI adoption grows in your organisation, the number of AI use-case ideas will multiply rapidly. Without a structured intake process, your company faces two problems:

  1. Good ideas get lost — Employees suggest AI applications informally, but there is no system to capture, evaluate, and act on them
  2. Bad ideas consume resources — Without evaluation criteria, the loudest voice or most senior requester wins, rather than the highest-value use case

An AI use-case intake process creates a fair, transparent system for capturing AI ideas from anywhere in the organisation and routing them through evaluation, prioritisation, and (if approved) implementation.

The Intake Process Overview

Stage 1: Submission

Any employee can submit an AI use-case idea through a standard intake form.

Stage 2: Initial Triage

The AI governance committee or designated reviewer conducts a quick assessment to filter out duplicates, out-of-scope requests, and clearly infeasible ideas.

Stage 3: Detailed Evaluation

Promising use cases are scored against standardised criteria covering business value, feasibility, risk, and alignment.

Stage 4: Prioritisation

Scored use cases are ranked and placed on the AI project backlog.

Stage 5: Approval and Assignment

The top-priority use cases are approved for implementation and assigned to an AI champion or project team.

Stage 6: Implementation and Review

The use case is implemented, measured, and reviewed. Learnings feed back into the process.
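
Taken together, the six stages form a linear pipeline: a use case should never skip a stage. As a minimal sketch in Python (the stage names mirror the headings above; the helper is illustrative, not a prescribed schema), the ordering can be encoded so tooling can validate transitions:

    # Ordered stages of the intake pipeline, mirroring the headings above.
    STAGES = [
        "Submission",
        "Initial Triage",
        "Detailed Evaluation",
        "Prioritisation",
        "Approval and Assignment",
        "Implementation and Review",
    ]

    def next_stage(current: str) -> str | None:
        """Return the stage that follows `current`, or None after the final stage."""
        i = STAGES.index(current)
        return STAGES[i + 1] if i + 1 < len(STAGES) else None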

AI Use-Case Intake Form Template

Section 1: Submitter Information

  • Name:
  • Department:
  • Role:
  • Date:
  • Email:

Section 2: Use Case Description

What is the current process or task? [Describe the current way this work is done, without AI]

What problem does this solve or what opportunity does it create? [Describe the pain point, inefficiency, or missed opportunity]

How would AI improve this process? [Describe specifically how AI would be used — which tool, what inputs, what outputs]

Who would benefit? [List the team, department, or stakeholders who would benefit]

How often is this task performed?

  • Daily
  • Weekly
  • Monthly
  • Ad hoc / As needed

Estimated time currently spent on this task: [Hours per week or per occurrence]

Section 3: Data and Risk

What data would be used as input to the AI tool? [Describe the data types involved]

Does this data include personal data?

  • Yes
  • No
  • Unsure

Does this data include confidential or client data?

  • Yes
  • No
  • Unsure

What is the impact if the AI output is incorrect?

  • Low — minor inconvenience, easily corrected
  • Medium — requires rework, could cause delays
  • High — could cause financial loss, reputational damage, or compliance issues
  • Critical — could cause harm to individuals or severe business impact

Section 4: Expected Benefits

Estimated time saved per week/month: [Hours]

Other expected benefits: [e.g. improved quality, faster turnaround, better customer experience, reduced errors]

Is this a quick win (can be implemented in 1-2 weeks) or a strategic initiative (requires 1-3 months)?

  • Quick win
  • Strategic initiative
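
If submissions are captured in a tracking system rather than a document, the form above maps directly onto a structured record. A minimal sketch in Python (the field names are illustrative; adapt them to whatever form tool you use):

    from dataclasses import dataclass

    @dataclass
    class UseCaseSubmission:
        """One intake-form submission; fields mirror Sections 1-4 of the template."""
        # Section 1: Submitter information
        name: str
        department: str
        role: str
        date: str
        email: str
        # Section 2: Use case description
        current_process: str
        problem_or_opportunity: str
        ai_improvement: str
        beneficiaries: str
        frequency: str            # "Daily" / "Weekly" / "Monthly" / "Ad hoc"
        hours_spent: float        # per week or per occurrence
        # Section 3: Data and risk
        input_data: str
        personal_data: str        # "Yes" / "No" / "Unsure"
        confidential_data: str    # "Yes" / "No" / "Unsure"
        error_impact: str         # "Low" / "Medium" / "High" / "Critical"
        # Section 4: Expected benefits
        hours_saved: float
        other_benefits: str
        quick_win: bool           # True = quick win, False = strategic initiative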

Evaluation Scoring Criteria

Use these criteria to score each submitted use case on a 1-5 scale:

Business Value (Weight: 30%)

  • 5: Major time/cost savings, directly impacts revenue or customer satisfaction
  • 4: Significant productivity improvement for a large team
  • 3: Moderate improvement for a department
  • 2: Minor convenience improvement
  • 1: Nice to have, minimal measurable impact

Feasibility (Weight: 25%)

  • 5: Can be done immediately with existing approved tools, no custom development
  • 4: Requires minor configuration or workflow adjustment
  • 3: Requires new tool approval or moderate setup effort
  • 2: Requires custom development or significant integration work
  • 1: Technically very challenging, uncertain feasibility

Risk Level (Weight: 25%) — Inverse scoring: a higher score means lower risk

  • 5: No personal data, low impact if output is incorrect, no regulatory concern
  • 4: Minimal personal data, low to medium impact, standard compliance
  • 3: Some personal data or medium impact, requires human review
  • 2: Significant personal data or high impact, requires careful governance
  • 1: Critical data or impact, major regulatory considerations

Strategic Alignment (Weight: 20%)

  • 5: Directly supports company strategic priorities and AI roadmap
  • 4: Supports departmental goals and demonstrates AI value
  • 3: Moderately aligned with company direction
  • 2: Tangentially related
  • 1: Does not align with current priorities

Composite Score Calculation

Composite Score = (Business Value × 0.30) + (Feasibility × 0.25) + (Risk Level × 0.25) + (Alignment × 0.20)

Score ranges:

  • 4.0 - 5.0: High priority — fast-track for implementation
  • 3.0 - 3.9: Medium priority — add to backlog, implement when capacity allows
  • 2.0 - 2.9: Low priority — reconsider in 3-6 months or when conditions change
  • Below 2.0: Not recommended — provide feedback to submitter
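
The weighted sum and the priority bands translate directly into code, which is useful if scores are tracked in a spreadsheet export or a small script. A minimal sketch (the function names are illustrative; the weights and thresholds are the ones defined above):

    def composite_score(value: float, feasibility: float,
                        risk: float, alignment: float) -> float:
        """Weighted sum of the four 1-5 criterion scores, using the weights above."""
        return value * 0.30 + feasibility * 0.25 + risk * 0.25 + alignment * 0.20

    def priority_band(score: float) -> str:
        """Map a composite score to the priority bands defined above."""
        if score >= 4.0:
            return "High priority"
        if score >= 3.0:
            return "Medium priority"
        if score >= 2.0:
            return "Low priority"
        return "Not recommended"

    # Example: value 4, feasibility 5, risk 3 (inverse-scored), alignment 4
    # gives 1.2 + 1.25 + 0.75 + 0.8 = 4.0, which lands in the high-priority band.
    print(priority_band(composite_score(4, 5, 3, 4)))  # High priority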

Governance Workflow

Triage (Within 5 business days of submission)

The AI governance committee or designated reviewer:

  1. Checks for duplicate or similar submissions
  2. Confirms the use case is within scope (not already addressed by an existing tool)
  3. Assigns an initial priority estimate
  4. Communicates receipt to the submitter

Evaluation (Within 10 business days)

For use cases that pass triage:

  1. Score against the evaluation criteria
  2. Identify any governance or compliance concerns
  3. Estimate implementation effort and timeline
  4. Prepare recommendation for the governance committee

Decision (Monthly governance meeting)

The AI governance committee:

  1. Reviews all scored use cases
  2. Decides: Approve, Defer, or Reject
  3. Assigns approved use cases to an AI champion or project team
  4. Communicates decisions to all submitters
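
The triage and evaluation windows are easiest to enforce as computed due dates. A minimal sketch, assuming the 10-business-day evaluation window starts when triage closes (the workflow above leaves the starting point implicit):

    from datetime import date, timedelta

    def add_business_days(start: date, days: int) -> date:
        """Advance a date by a number of Monday-Friday business days."""
        current = start
        remaining = days
        while remaining > 0:
            current += timedelta(days=1)
            if current.weekday() < 5:  # Mon=0 .. Fri=4
                remaining -= 1
        return current

    submitted = date(2026, 3, 2)                        # hypothetical submission date
    triage_due = add_business_days(submitted, 5)        # triage SLA
    evaluation_due = add_business_days(triage_due, 10)  # evaluation SLA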

Implementation Tracking

  • Use case ID: [AUTO-GENERATED]
  • Status: Submitted / In Triage / In Evaluation / Approved / In Progress / Completed / Deferred / Rejected
  • Assigned to: [CHAMPION OR TEAM]
  • Start date: [DATE]
  • Target completion: [DATE]
  • Actual completion: [DATE]
  • Results: [MEASURED OUTCOMES]
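
For teams tracking this in code rather than a spreadsheet, the fields above map onto a simple record. A minimal sketch (the status values are taken verbatim from the list above; the class and field names are illustrative):

    from dataclasses import dataclass
    from datetime import date
    from enum import Enum

    class Status(Enum):
        """Status values from the tracking fields above."""
        SUBMITTED = "Submitted"
        IN_TRIAGE = "In Triage"
        IN_EVALUATION = "In Evaluation"
        APPROVED = "Approved"
        IN_PROGRESS = "In Progress"
        COMPLETED = "Completed"
        DEFERRED = "Deferred"
        REJECTED = "Rejected"

    @dataclass
    class TrackingRecord:
        """One row of the implementation-tracking log."""
        use_case_id: str
        status: Status = Status.SUBMITTED
        assigned_to: str | None = None
        start_date: date | None = None
        target_completion: date | None = None
        actual_completion: date | None = None
        results: str | None = None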

Encouraging Submissions

The intake process only works if employees actually use it. To encourage submissions:

  1. Make it easy — Use a simple form (the template above), not a bureaucratic process
  2. Respond quickly — Acknowledge every submission within 2 business days
  3. Celebrate successes — Share implemented use cases and their results publicly
  4. Provide feedback — Even rejected ideas deserve an explanation
  5. Remove barriers — Employees should not need manager approval to submit an idea

Frequently Asked Questions

Who can submit AI use-case ideas?

Any employee should be able to submit an AI use-case idea. The best AI applications often come from frontline staff who understand daily pain points intimately. The intake form is designed to be simple enough that anyone can complete it in 10-15 minutes.

How long does the process take from submission to decision?

Initial triage should happen within 5 business days of submission. Detailed evaluation takes another 5-10 business days. Final decisions are made at the monthly governance meeting. From submission to decision, expect 2-4 weeks. Quick wins can be fast-tracked.

What proportion of submitted use cases are approved?

Typically 30-50% of submitted use cases are approved for implementation. Another 20-30% are deferred (good ideas but not the right time). The remainder are rejected on feasibility, risk, or priority grounds. Providing clear feedback on rejections encourages continued submissions.

Ready to Apply These Insights to Your Organisation?

Book a complimentary AI Readiness Audit to identify opportunities specific to your context.
