The arms race between AI capabilities and detection tools is unwinnable. A better approach: design assessments that promote genuine learning whether or not students use AI.
This guide provides practical strategies for creating assessments that are meaningful, hard to outsource to AI, and focused on what actually matters—student learning.
Executive Summary
- "AI-proof" doesn't mean AI can't help—it means AI can't replace the learning
- The best assessments require students to demonstrate understanding, not just produce content
- Process-based assessment (drafts, reflection, defense) is more robust than product-only assessment
- Personalization and context-specificity make AI assistance less useful
- In-class components verify that students can demonstrate learning in real-time
- Good assessment design serves learning goals first, integrity goals second
- These strategies benefit all students and are worth adopting regardless of AI concerns
Principles of AI-Resistant Assessment
Principle 1: Assess Understanding, Not Just Output
AI can produce polished output without understanding. Assessments should require students to demonstrate they understand what they've produced.
Strategies:
- Oral defense of written work
- Follow-up questions that probe understanding
- Application to new scenarios
- Explanation of reasoning and choices
Principle 2: Value Process Over Product
AI can generate final products instantly. Requiring evidence of process makes AI less helpful.
Strategies:
- Submit drafts showing development
- Annotated bibliographies showing research journey
- Reflection on learning and revision
- Documented decision-making
Principle 3: Connect to Specific Context
AI struggles with highly specific, current, or local contexts. Generic prompts invite AI completion.
Strategies:
- Reference specific class discussions
- Apply to current events or local situations
- Connect to personal experience
- Build on previous student work
Principle 4: Include Real-Time Components
AI assistance is harder (though not impossible) to use under supervised, time-limited conditions.
Strategies:
- In-class writing portions
- Timed components
- Live presentations
- Real-time problem-solving
Assessment Design SOP
Step 1: Clarify Learning Objectives
What should students know or be able to do after completing this assessment?
| Objective Type | Example | AI Vulnerability |
|---|---|---|
| Recall facts | Name the causes of WWI | High—AI knows facts |
| Apply concept | Apply supply/demand to local business | Medium—requires context |
| Analyze | Compare two literary interpretations | Medium—AI can analyze |
| Create | Design solution to school problem | Low if personalized |
| Evaluate | Defend a position with evidence | Low if requires oral defense |
| Reflect | Describe how your thinking changed | Low—requires authentic experience |
Step 2: Choose Assessment Format
Select formats that align with objectives and resist AI completion:
| Format | AI Resistance | Best For |
|---|---|---|
| Traditional essay (take-home) | Low | Low-stakes practice; avoid for high-stakes work |
| Essay with oral defense | High | Analysis, argument |
| In-class writing | Medium-High | Demonstrating understanding |
| Process portfolio | High | Development over time |
| Presentation with Q&A | High | Demonstrating mastery |
| Project with documentation | Medium-High | Applied learning |
| Exam (proctored) | High | Recall and application |
Step 3: Add AI-Resistant Elements
Modify the assessment to include:
Personalization:
- "Based on our class discussion on [specific date]..."
- "Using the example we analyzed together..."
- "Applying this to [local context students know]..."
Process evidence:
- "Submit three drafts showing revision"
- "Include annotated notes from your research"
- "Document how your thesis evolved"
Verification component:
- "Be prepared to explain any part of your essay in class"
- "Present your findings and answer questions"
- "Complete related in-class writing"
Step 4: Communicate Expectations
Make AI policy explicit for each assignment:
- What AI use (if any) is permitted?
- What disclosure is required?
- What verification will occur?
Step 5: Design Rubric Accordingly
Rubrics should reward:
- Evidence of personal engagement
- Connection to specific contexts
- Ability to explain and defend
- Development over time
Assessment Redesign Examples
Example 1: Traditional Essay → Process Portfolio
Before: "Write a 1500-word essay analyzing symbolism in The Great Gatsby."
After: "Over three weeks, develop an analysis of symbolism in The Great Gatsby:
- Week 1: Submit initial thesis and three pieces of textual evidence with annotations explaining why you chose them
- Week 2: Submit first draft with reflection on what's working and what isn't
- Week 3: Submit final draft with 200-word reflection on how your analysis evolved
- Be prepared for a 5-minute oral discussion of your analysis"
Example 2: Research Paper → Authentic Investigation
Before: "Write a research paper on climate change solutions."
After: "Investigate climate adaptation in [our city]:
- Interview a local official or business owner about their climate preparations
- Analyze one local policy or initiative
- Propose a realistic improvement based on your research
- Present findings to class with documentation of your research process"
Example 3: Problem Set → Explained Solutions
Before: "Solve these 10 calculus problems."
After: "Solve these 10 calculus problems:
- For 5 problems of your choice, write a brief explanation of your approach
- In class, you'll be asked to solve a similar problem and explain your reasoning
- AI calculators may be used for computation, but you must explain why each step is valid"
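To make "explain why each step is valid" concrete, here is a sketch of the kind of worked justification a student might submit for one hypothetical problem (the function and wording are illustrative, not part of the assignment above; assumes amsmath):

```latex
% Illustrative sketch: differentiate f(x) = x^2 sin(x)
% and justify each step, as the assignment asks.
\begin{align*}
\frac{d}{dx}\bigl[x^{2}\sin x\bigr]
  &= \frac{d}{dx}\bigl[x^{2}\bigr]\,\sin x + x^{2}\,\frac{d}{dx}\bigl[\sin x\bigr]
     && \text{product rule: both factors are differentiable} \\
  &= 2x\sin x + x^{2}\cos x
     && \text{power rule for } x^{2}\text{; standard derivative of } \sin x
\end{align*}
```

Grading the justification rather than the final answer is what gives the in-class follow-up its teeth: a student who outsourced the computation still has to supply the reasoning in person.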
Quick Wins: Immediate Improvements
If you can't redesign completely, add these elements:
- Add oral component: "Be prepared to discuss your work in class"
- Require process evidence: "Submit your notes/drafts with final version"
- Personalize the prompt: Reference specific class discussions or local contexts
- Include reflection: "Describe what you learned and how your thinking changed"
- Verify understanding: Plan follow-up questions for submitted work
Common Failure Modes
Failure 1: Making assessments harder but not better
Adding obstacles that don't serve learning.
Prevention: Every design choice should serve a learning objective, not just an integrity objective.
Failure 2: Assuming in-class = AI-free
Students can still access AI on phones or smartwatches, or memorize AI-generated content in advance.
Prevention: In-class components add resistance but aren't foolproof. Combine with other strategies.
Failure 3: Over-complicating assessment
So many requirements that the assessment becomes a compliance exercise.
Prevention: Keep it simple. Focus on 2-3 AI-resistant elements, not 10.
Implementation Checklist
- Reviewed current assessments for AI vulnerability
- Identified highest-risk assessments (high-stakes, easily AI-completed)
- Selected redesign strategies for priority assessments
- Updated rubrics to value process and understanding
- Communicated AI expectations for each assessment
- Planned verification components (oral, in-class)
- Scheduled assessment review with department
Frequently Asked Questions
Q1: Doesn't this make assessment much more work?
It can at first, but process-based assessment often provides richer feedback opportunities and may reduce time spent grading polished but shallow work.
Q2: What about standardized tests?
These are typically AI-resistant due to proctoring. Focus redesign efforts on classroom assessments.
Q3: How do we handle students who need AI as an accommodation?
Accommodations take precedence. Work with special education staff to distinguish accommodation use from integrity concerns.
Q4: Won't students just have AI help them prepare for oral defense?
AI can help students understand content—that's fine. If they can explain and defend their work, they've learned. That's the goal.
Next Steps
Start with your highest-stakes, most AI-vulnerable assessments. Apply 2-3 redesign strategies. Evaluate and iterate.
Need help redesigning your school's assessment approach?
→ Book an AI Readiness Audit with Pertama Partners. We'll help you create assessments that promote authentic learning.