
Designing AI-Proof Assessments: Strategies for Authentic Evaluation

December 5, 2025 · 7 min read · Michael Lansdowne Hauge
Updated March 15, 2026
For: CMO, CHRO

Practical strategies for creating assessments that promote genuine learning regardless of AI availability. Focus on process, personalization, and verification.


Key Takeaways

  1. Design assessments that demonstrate genuine student learning
  2. Create project-based and process-oriented evaluation methods
  3. Incorporate oral components and presentations into assessment
  4. Build assignments that leverage rather than compete with AI
  5. Develop rubrics that evaluate thinking process, not just output

The arms race between AI capabilities and detection tools is unwinnable. A better approach: design assessments that promote genuine learning whether or not students use AI.

This guide provides practical strategies for creating assessments that are meaningful, hard to outsource to AI, and focused on what actually matters—student learning.


Executive Summary

  • "AI-proof" doesn't mean AI can't help—it means AI can't replace the learning
  • The best assessments require students to demonstrate understanding, not just produce content
  • Process-based assessment (drafts, reflection, defense) is more robust than product-only assessment
  • Personalization and context-specificity make AI assistance less useful
  • In-class components verify that students can demonstrate learning in real-time
  • Good assessment design serves learning goals first, integrity goals second
  • These strategies benefit all students, not just in response to AI concerns

Principles of AI-Resistant Assessment

Principle 1: Assess Understanding, Not Just Output

AI can produce polished output without understanding. Assessments should require students to demonstrate they understand what they've produced.

Strategies:

  • Oral defense of written work
  • Follow-up questions that probe understanding
  • Application to new scenarios
  • Explanation of reasoning and choices

Principle 2: Value Process Over Product

AI can generate final products instantly. Requiring evidence of process makes AI less helpful.

Strategies:

  • Submit drafts showing development
  • Annotated bibliographies showing research journey
  • Reflection on learning and revision
  • Documented decision-making

Principle 3: Connect to Specific Context

AI struggles with highly specific, current, or local contexts. Generic prompts invite AI completion.

Strategies:

  • Reference specific class discussions
  • Apply to current events or local situations
  • Connect to personal experience
  • Build on previous student work

Principle 4: Include Real-Time Components

AI assistance is harder (though not impossible) during supervised, time-limited conditions.

Strategies:

  • In-class writing portions
  • Timed components
  • Live presentations
  • Real-time problem-solving

Assessment Design SOP

Step 1: Clarify Learning Objectives

What should students know or be able to do after completing this assessment?

| Objective Type | Example | AI Vulnerability |
| --- | --- | --- |
| Recall facts | Name the causes of WWI | High (AI knows facts) |
| Apply concept | Apply supply/demand to a local business | Medium (requires context) |
| Analyze | Compare two literary interpretations | Medium (AI can analyze) |
| Create | Design a solution to a school problem | Low if personalized |
| Evaluate | Defend a position with evidence | Low if it requires oral defense |
| Reflect | Describe how your thinking changed | Low (requires authentic experience) |
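The mapping above can double as a quick audit aid when reviewing a course plan. A minimal sketch, assuming an illustrative risk taxonomy and hypothetical assessment names (none of this is a standard rubric):

```python
# Illustrative audit: flag assessments whose primary learning objective
# is highly AI-vulnerable. The risk mapping mirrors the table above.

AI_VULNERABILITY = {
    "recall": "high",
    "apply": "medium",
    "analyze": "medium",
    "create": "low",    # low if personalized
    "evaluate": "low",  # low if it requires oral defense
    "reflect": "low",
}

def flag_high_risk(assessments):
    """Return names of assessments whose objective is highly AI-vulnerable."""
    return [name for name, objective in assessments
            if AI_VULNERABILITY.get(objective) == "high"]

# Hypothetical course plan: (assessment name, primary objective type)
course_plan = [
    ("Unit 1 quiz", "recall"),
    ("Gatsby essay", "analyze"),
    ("Capstone project", "create"),
]

print(flag_high_risk(course_plan))  # prints ['Unit 1 quiz']
```

Assessments that surface here are candidates for the redesign strategies in the steps that follow.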

Step 2: Choose Assessment Format

Select formats that align with objectives and resist AI completion:

| Format | AI Resistance | Best For |
| --- | --- | --- |
| Traditional essay (take-home) | Low | Avoid for high stakes |
| Essay with oral defense | High | Analysis, argument |
| In-class writing | Medium-High | Demonstrating understanding |
| Process portfolio | High | Development over time |
| Presentation with Q&A | High | Demonstrating mastery |
| Project with documentation | Medium-High | Applied learning |
| Exam (proctored) | High | Recall and application |

Step 3: Add AI-Resistant Elements

Modify the assessment to include:

Personalization:

  • "Based on our class discussion on [specific date]..."
  • "Using the example we analyzed together..."
  • "Applying this to [local context students know]..."

Process evidence:

  • "Submit three drafts showing revision"
  • "Include annotated notes from your research"
  • "Document how your thesis evolved"

Verification component:

  • "Be prepared to explain any part of your essay in class"
  • "Present your findings and answer questions"
  • "Complete related in-class writing"

Step 4: Communicate Expectations

Make AI policy explicit for each assignment:

  • What AI use (if any) is permitted?
  • What disclosure is required?
  • What verification will occur?

Step 5: Design Rubric Accordingly

Rubrics should reward:

  • Evidence of personal engagement
  • Connection to specific contexts
  • Ability to explain and defend
  • Development over time
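One way to make these priorities concrete is to encode the rubric as weighted criteria, so process and verification carry real marks rather than being optional extras. A minimal sketch; the criteria names and weights below are illustrative assumptions, not a recommended standard:

```python
# Illustrative weighted rubric: process and verification criteria carry
# explicit weight alongside the final product. Weights sum to 1.0.

RUBRIC = {
    "personal_engagement":   0.20,  # evidence of personal engagement
    "context_connection":    0.20,  # connection to specific contexts
    "explain_and_defend":    0.30,  # ability to explain and defend (oral)
    "development_over_time": 0.30,  # drafts, reflection, revision
}

def score(marks):
    """Combine per-criterion marks (0-4 scale) into a weighted total."""
    assert set(marks) == set(RUBRIC), "every criterion must be marked"
    return sum(RUBRIC[c] * m for c, m in marks.items())

# Hypothetical student: strong engagement and defense, weaker revision trail.
student = {
    "personal_engagement": 4,
    "context_connection": 3,
    "explain_and_defend": 4,
    "development_over_time": 2,
}
print(round(score(student), 2))  # prints 3.2
```

Putting 60 percent of the weight on defense and development means a polished product alone cannot earn a top mark, which is the point.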

Assessment Redesign Examples

Example 1: Traditional Essay → Process Portfolio

Before: "Write a 1500-word essay analyzing symbolism in The Great Gatsby."

After: "Over three weeks, develop an analysis of symbolism in The Great Gatsby:

  • Week 1: Submit initial thesis and three pieces of textual evidence with annotations explaining why you chose them
  • Week 2: Submit first draft with reflection on what's working and what isn't
  • Week 3: Submit final draft with 200-word reflection on how your analysis evolved
  • Be prepared for 5-minute oral discussion of your analysis"

Example 2: Research Paper → Authentic Investigation

Before: "Write a research paper on climate change solutions."

After: "Investigate climate adaptation in [our city]:

  • Interview a local official or business owner about their climate preparations
  • Analyze one local policy or initiative
  • Propose a realistic improvement based on your research
  • Present findings to class with documentation of your research process"

Example 3: Problem Set → Explained Solutions

Before: "Solve these 10 calculus problems."

After: "Solve these 10 calculus problems:

  • For 5 problems of your choice, write a brief explanation of your approach
  • In class, you'll be asked to solve a similar problem and explain your reasoning
  • AI calculators may be used for computation, but you must explain why each step is valid"
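As a sketch of what "explain why each step is valid" might look like in practice, here is a worked derivative where every step names the rule that licenses it (the specific problem is an illustrative choice, not part of the original assignment):

```latex
\begin{aligned}
\frac{d}{dx}\left[x^{2}\sin x\right]
  &= \frac{d}{dx}\left[x^{2}\right]\sin x + x^{2}\,\frac{d}{dx}\left[\sin x\right]
  && \text{product rule}\\
  &= 2x\sin x + x^{2}\,\frac{d}{dx}\left[\sin x\right]
  && \text{power rule on } x^{2}\\
  &= 2x\sin x + x^{2}\cos x
  && \text{derivative of } \sin x
\end{aligned}
```

An AI calculator can produce the last line instantly; the per-step justifications are what the in-class follow-up verifies.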

Quick Wins: Immediate Improvements

If you can't redesign completely, add these elements:

  1. Add oral component: "Be prepared to discuss your work in class"
  2. Require process evidence: "Submit your notes/drafts with final version"
  3. Personalize the prompt: Reference specific class discussions or local contexts
  4. Include reflection: "Describe what you learned and how your thinking changed"
  5. Verify understanding: Plan follow-up questions for submitted work

Common Failure Modes

Failure 1: Making assessments harder but not better

Adding obstacles that don't serve learning.

Prevention: Every design choice should serve a learning objective, not just an integrity objective.

Failure 2: Assuming in-class = AI-free

Students can still access AI on phones or smartwatches, or memorize AI-generated content in advance.

Prevention: In-class components add resistance but aren't foolproof. Combine with other strategies.

Failure 3: Over-complicating assessment

So many requirements that the assessment becomes a compliance exercise.

Prevention: Keep it simple. Focus on 2-3 AI-resistant elements, not 10.


Implementation Checklist

  • Reviewed current assessments for AI vulnerability
  • Identified highest-risk assessments (high-stakes, easily AI-completed)
  • Selected redesign strategies for priority assessments
  • Updated rubrics to value process and understanding
  • Communicated AI expectations for each assessment
  • Planned verification components (oral, in-class)
  • Scheduled assessment review with department

Next Steps

Start with your highest-stakes, most AI-vulnerable assessments. Apply 2-3 redesign strategies. Evaluate and iterate.

Need help redesigning your school's assessment approach?

Book an AI Readiness Audit with Pertama Partners. We'll help you create assessments that promote authentic learning.


Practical Assessment Redesign Strategies

Faculty can redesign assessments to emphasize authentic learning without completely abandoning traditional formats. Process portfolios that require students to document their research journey through annotated bibliographies, draft iterations, and reflective journals provide evidence of genuine engagement with course material that AI cannot fabricate. Oral defenses or viva voce examinations where students explain and defend their written work test understanding in ways that written submissions alone cannot verify. Collaborative assessments requiring real-time teamwork with documented individual contributions leverage the social nature of learning that AI tools cannot replicate. These approaches shift assessment focus from product to process, making the learning journey itself a central component of evaluation.

Supporting Faculty in Assessment Redesign

Institutions should provide structured support for faculty undertaking assessment redesign rather than expecting individual instructors to navigate this transition independently. Faculty development centers can offer workshop series covering assessment design principles for the AI era, providing hands-on practice with redesigning common assignment types across different disciplines. Peer mentoring networks connecting instructors who have successfully redesigned assessments with colleagues beginning the process accelerate adoption and build collective expertise. Assessment design rubrics and templates tailored to different course types, from large lecture courses to small seminars, give faculty practical starting points that they can customize rather than requiring everyone to design solutions from scratch.

Assessment Design Resources and Templates

Faculty can accelerate assessment redesign by adapting proven templates rather than starting from blank pages. Discussion-based assessments that require students to respond to classmates' arguments in real time test critical thinking that AI cannot replicate in interactive settings. Portfolio assessments collecting evidence of learning progression across an entire course demonstrate sustained engagement that single-submission assignments cannot verify. Capstone projects requiring integration of concepts from multiple course modules with application to novel real-world scenarios test the depth of understanding and creative transfer abilities that current AI tools struggle to simulate convincingly.

Building Assessment Integrity Without Surveillance

The most effective approach to maintaining assessment integrity in the AI era focuses on creating genuinely engaging assessments rather than expanding surveillance measures. When students find assignments intellectually stimulating and personally relevant, the motivation to use AI as a shortcut diminishes because the learning process itself becomes valuable. Connect assessments to real-world problems that students care about, provide meaningful choices within assignment parameters, and create opportunities for students to demonstrate expertise in areas that align with their emerging professional identities.


Common Questions

Is any assessment truly AI-proof?

No assessment is truly AI-proof. Focus on designing assessments that evaluate thinking processes, incorporate oral components, require personal reflection, and have value even with AI assistance.

What makes an assessment AI-resistant?

Focus on process documentation, personal connection to content, iterative work with feedback, oral defense components, and authentic tasks that require human judgment and experience.

Should assignments compete with AI or incorporate it?

The best approach often leverages AI as a tool in the assignment while evaluating higher-order skills. Competing with AI in tasks it does well is increasingly futile.

Michael Lansdowne Hauge

Managing Director · HRDF-Certified Trainer (Malaysia), Delivered Training for Big Four, MBB, and Fortune 500 Clients, 100+ Angel Investments (Seed–Series C), Dartmouth College, Economics & Asian Studies

Managing Director of Pertama Partners, an AI advisory and training firm helping organizations across Southeast Asia adopt and implement artificial intelligence. HRDF-certified trainer with engagements for a Big Four accounting firm, a leading global management consulting firm, and the world's largest ERP software company.

