When a school selects an AI-powered EdTech tool, it is not simply choosing a product. It is choosing a data partner, one that will gain access to some of the most sensitive information any institution holds: student records, behavioral patterns, and learning profiles. The consequences of choosing poorly extend well beyond reputational damage. Under the personal data protection laws of Singapore (PDPA 2012), Malaysia (PDPA 2010), and Thailand (PDPA 2019), schools remain legally accountable for data protection even when a third-party vendor is the one processing the data.
Not all vendors treat student data with the rigor it demands. This guide provides a practical framework for evaluating AI vendors through a data protection lens, helping school leaders move from reactive trust to structured due diligence.
For foundational context, see our guide on student data protection in the age of AI.
Why This Matters Now
The urgency of vendor evaluation has intensified across several dimensions simultaneously. EdTech vendors are embedding AI capabilities into their platforms at speed, sometimes without clearly communicating the new data processing those features require. AI tools, by their nature, tend to require more data than traditional software. An adaptive learning engine, for example, may need granular behavioral signals that a static quiz platform never touched.
Regulatory scrutiny is also accelerating. Privacy regulators across Southeast Asia are paying closer attention to how children's data flows through educational technology stacks. Singapore's PDPC, Malaysia's Department of Personal Data Protection, and Thailand's PDPC have all signaled heightened enforcement postures around minors' data.
Vendor maturity, meanwhile, varies enormously. Some providers maintain robust data protection programs backed by ISO 27001 certification and documented incident response plans. Others are early-stage startups still building those capabilities. The distinction matters because schools bear the accountability regardless of which vendor they select.
Vendor Evaluation Framework
A rigorous evaluation should examine six categories. Each represents a distinct surface area of risk, and weakness in any one of them can undermine the protections established in the others.
Category 1: Data Collection and Access
The first question is scope. Schools should ask vendors to specify exactly which student data elements the tool will collect or access, and why each element is necessary for the stated functionality. It is worth testing whether the data shared can be limited without sacrificing core features. Behavioral data and biometric data deserve particular scrutiny, as both carry elevated privacy risk.
Warning signs in this category include vague descriptions of data collection practices, data requirements that appear excessive relative to the tool's functionality, and any collection of biometric information without a clearly articulated necessity.
Category 2: Data Processing and AI Use
Understanding how a vendor's AI actually processes student data is essential. Schools should ask whether student data is used to train machine learning models, whether those models are shared across customers, and whether the AI creates persistent profiles of individual students. The ability to opt out of specific AI features provides an important safety valve.
Vendors that refuse to explain how their AI processes data, that use one school's student data to train models sold to other institutions, or that make consequential decisions about students without any human oversight should be treated with significant caution.
Category 3: Data Storage and Security
Data residency and security posture form the third pillar of evaluation. Schools should confirm where student data is physically stored, what security certifications the vendor holds, whether the vendor has experienced prior security incidents, and what its incident response process looks like in practice.
The absence of any recognized security certification is a meaningful red flag. So is storing data in jurisdictions with weak or poorly enforced privacy laws, or operating without a documented incident response plan.
Category 4: Data Sharing and Sub-processors
Few vendors operate in isolation. Most rely on sub-processors for infrastructure, analytics, or specialized services. Schools should ask for a complete list of sub-processors, understand the nature of data each sub-processor can access, and negotiate the right to approve or veto changes to that list.
A vendor's refusal to disclose its sub-processors, the presence of sub-processors in high-risk jurisdictions, or the absence of contractual data protection requirements flowing down to sub-processors all represent material risks.
Category 5: Data Retention and Deletion
Data that persists beyond its useful life becomes a liability rather than an asset. Schools should confirm how long student data is retained, what happens to data upon contract termination, and whether data deletion can be requested and verified.
Undefined retention periods, indefinite data retention, and the continued holding of data after a contract ends are each problematic. The right to request deletion, accompanied by a vendor certification that deletion has been completed, should be a baseline expectation.
Category 6: Contract and Compliance
The contract is where promises become enforceable commitments. Schools should ask whether the vendor will sign a Data Processing Agreement, whether the contract provides audit rights over the vendor's data practices, and whether the vendor can demonstrate PDPA compliance.
A vendor's refusal to sign a DPA, contractual language that completely excludes liability for data issues, or a take-it-or-leave-it posture with no willingness to negotiate are each significant warning signals.
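Teams that want to compare vendors consistently sometimes encode a framework like this as a weighted scoring rubric. The sketch below is a hypothetical illustration only: the category names mirror the six above, but the ratings, weights, floor, and threshold are assumptions each school would set for itself.

```python
from dataclasses import dataclass

# The six evaluation categories from the framework above.
CATEGORIES = [
    "data_collection_and_access",
    "data_processing_and_ai_use",
    "data_storage_and_security",
    "data_sharing_and_subprocessors",
    "data_retention_and_deletion",
    "contract_and_compliance",
]

@dataclass
class CategoryScore:
    category: str
    rating: int          # 0 (red flags present) .. 5 (strong, documented practice)
    weight: float = 1.0  # a school might weight security or compliance higher

def overall_score(scores: list[CategoryScore]) -> float:
    """Weighted average on a 0-5 scale."""
    total_weight = sum(s.weight for s in scores)
    return sum(s.rating * s.weight for s in scores) / total_weight

def verdict(scores: list[CategoryScore], floor: int = 2, threshold: float = 3.5) -> str:
    """A single very weak category fails the vendor even if the average is high,
    reflecting the point that weakness in one category undermines the others."""
    if any(s.rating < floor for s in scores):
        return "do not proceed: red flags in at least one category"
    return "proceed to deep dive" if overall_score(scores) >= threshold else "needs remediation"
```

Under this rubric, a vendor rated 4 in five categories but 1 on sub-processor controls fails outright rather than averaging its way to a pass.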
Vendor Due Diligence Process
Effective due diligence proceeds through five phases, each building on the previous one.
Before First Contact
Before engaging any vendor, schools should define their own data protection requirements, identify the specific data elements they are willing to share, understand their regulatory obligations under the applicable PDPA framework, and prepare targeted questions drawn from the evaluation framework above.
Initial Evaluation
The first round of assessment involves reviewing the vendor's privacy policy and security documentation, identifying the full scope of data collection, confirming which AI features exist and how they use data, and verifying any claimed security certifications.
Deep Dive
A deeper investigation should yield a complete sub-processor list, sample DPA terms for review, confirmation of data storage locations and jurisdictions, a clear picture of retention and deletion practices, and references from other schools already using the tool.
Contract Negotiation
The negotiation phase is where schools formalize protections: agreeing on DPA terms, securing audit rights, confirming how liability is allocated, specifying data deletion requirements and timelines, and establishing a sub-processor approval process.
Ongoing Governance
Due diligence does not end at contract signing. Schools should schedule periodic vendor reviews, create a process for assessing new features as they are released, establish incident notification procedures, and maintain documentation of all due diligence activities for accountability purposes.
Sample DPA Requirements
A well-structured Data Processing Agreement should address six areas. The SDPC National Data Privacy Agreement, which serves as a model template used across more than 275,000 school-vendor agreements, provides a useful starting reference.
Scope and Purpose
The DPA should enumerate the specific data elements being processed, define permitted purposes alongside explicitly prohibited uses, and establish the duration of processing.
Security Requirements
Minimum security standards should be specified in concrete terms, including any required certifications and a defined timeline for incident notification.
Sub-processor Controls
The agreement should include a list of approved sub-processors, require notification before any changes to that list, and grant the school the right to object to new sub-processors.
Audit Rights
The school should retain the right to audit the vendor's data practices directly or to receive independent audit reports at defined intervals.
Data Subject Rights
The DPA should clarify the vendor's obligation to assist the school in responding to data access requests from parents or students.
Termination Provisions
Upon contract termination, the agreement should specify whether data will be returned or deleted, and require the vendor to certify that deletion has been completed.
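For teams reviewing draft agreements, the six DPA areas above can double as a completeness checklist. The following is a minimal sketch under the assumption that a reviewer records which areas a draft addresses; the identifiers simply paraphrase the headings in this section.

```python
# The six areas a well-structured DPA should address, per the section above.
REQUIRED_DPA_AREAS = {
    "scope_and_purpose",
    "security_requirements",
    "subprocessor_controls",
    "audit_rights",
    "data_subject_rights",
    "termination_provisions",
}

def missing_dpa_areas(areas_covered: set[str]) -> set[str]:
    """Return the required areas a draft DPA does not yet address."""
    return REQUIRED_DPA_AREAS - areas_covered

# Example: a draft covering only three of the six areas.
draft = {"scope_and_purpose", "security_requirements", "audit_rights"}
gaps = missing_dpa_areas(draft)
```

Any non-empty result goes back to the negotiation phase before signing.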
Common Failure Modes
Five recurring mistakes undermine even well-intentioned vendor evaluation efforts.
The first is accepting boilerplate contract terms without negotiation. Even smaller vendors will often modify their standard terms for school customers when asked. The second is trusting marketing claims at face value. Specifics should be requested in writing, and certification evidence should be independently verified.
The third failure mode is ignoring sub-processors. A vendor's own practices may be sound, but a poorly governed sub-processor can introduce risk that the primary agreement never anticipated. The fourth is treating evaluation as a one-time event rather than an ongoing responsibility. Annual vendor reviews and monitoring for feature changes should be standard practice.
The fifth, and perhaps most consequential, is inadequate documentation. Schools may need to demonstrate accountability to regulators, parents, or governing boards. A thorough paper trail of due diligence decisions, vendor responses, and ongoing monitoring activities provides that evidence.
Metrics to Track
Effective vendor governance requires measurement. Schools should track the number of vendors that have completed due diligence assessments, the percentage of vendors operating under signed DPAs, the current security certification status of each vendor, the average time required to complete a vendor assessment, and the number of vendors requiring remediation actions.
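The metrics above can be computed directly from a simple vendor register. This is an illustrative sketch, assuming one in-memory record per vendor; the field names are our own, not drawn from any standard.

```python
from dataclasses import dataclass
from statistics import mean
from typing import Optional

@dataclass
class VendorRecord:
    name: str
    assessment_days: Optional[float]  # None if due diligence not yet completed
    dpa_signed: bool
    certifications: tuple[str, ...]   # e.g. ("ISO 27001",)
    remediation_open: bool

def governance_metrics(vendors: list[VendorRecord]) -> dict[str, float]:
    """Compute the tracking metrics listed above from a vendor register."""
    assessed = [v for v in vendors if v.assessment_days is not None]
    return {
        "vendors_assessed": len(assessed),
        "pct_with_signed_dpa": 100 * sum(v.dpa_signed for v in vendors) / len(vendors),
        "pct_certified": 100 * sum(bool(v.certifications) for v in vendors) / len(vendors),
        "avg_assessment_days": mean(v.assessment_days for v in assessed) if assessed else 0.0,
        "vendors_needing_remediation": sum(v.remediation_open for v in vendors),
    }
```

Reporting these figures to a governing board each term turns the checklist above into an auditable trend line rather than a one-off exercise.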
Next Steps
Vendor evaluation is not a one-time event. It is an ongoing relationship that requires sustained attention. Schools should begin with their highest-risk vendors and work through the framework systematically, building institutional muscle for data protection governance over time.
Need help establishing vendor evaluation processes?
Book an AI Readiness Audit with Pertama Partners. We will assess your vendor landscape, identify gaps in current agreements, and help you build robust due diligence procedures.
Disclaimer
This article provides general guidance on vendor evaluation for student data protection. It does not constitute legal advice. Consult qualified legal counsel for specific contractual and regulatory guidance.
Related Articles
- Student Data Protection in the Age of AI
- Parental Consent for AI in Schools
- Data Minimization in School AI
Common Questions
What contractual provisions should schools require from AI vendors that process student data?
Schools should include several critical contractual provisions when engaging AI vendors that process student data:
- A data processing addendum that explicitly prohibits the vendor from using student data for any purpose beyond providing the contracted service, including model training, product improvement, or sale to third parties.
- Data deletion obligations specifying that all student data must be permanently deleted within 30 days of contract termination, with written certification of deletion provided.
- Breach notification requirements mandating that the vendor notify the school within 24 to 48 hours of discovering any unauthorized access to student data, with the notification covering the scope of affected data and the remediation steps taken.
- Audit rights allowing the school or its designated third party to inspect the vendor's data handling practices annually.
- Sub-processor restrictions requiring prior written approval before the vendor engages any additional processors who will access student data.
How can schools verify a vendor's data protection claims?
Schools should verify vendor data protection claims through multiple validation methods rather than accepting marketing statements at face value:
- Request and review the vendor's most recent SOC 2 Type II audit report, which provides independent verification of security controls over a sustained period rather than at a single point in time.
- Ask for references from other educational institutions of similar size and type, and contact them specifically about data protection practices, incident history, and the vendor's responsiveness to security concerns.
- Review the vendor's public privacy policy and terms of service for consistency with its sales representations; discrepancies between marketing claims and legal documents often indicate weaker actual protections than advertised.
- Conduct a technical assessment of the vendor's platform during a trial period to verify that access controls, encryption, and data handling match documented capabilities.
References
- National Data Privacy Agreement (NDPA). Student Data Privacy Consortium (SDPC) (2024). View source
- Student Privacy Pledge. Future of Privacy Forum (2014). View source
- Personal Data Protection Act (PDPA) — Overview. PDPC Singapore (2012). View source
- Advisory Guidelines on Use of Personal Data in AI Recommendation and Decision Systems. PDPC Singapore (2024). View source
- Guidance for Generative AI in Education and Research. UNESCO (2023). View source
- The First National Model Student Data Privacy Agreement Launches. Future of Privacy Forum (2018). View source
- Student Data Privacy Consortium — Resources. SDPC / A4L Community (2025). View source

