# How the PDPA Applies to AI
Singapore's Personal Data Protection Act (PDPA), enacted in 2012, is the primary law governing how private sector organizations collect, use, and disclose personal data. While it was written before the current AI era, the PDPC (Personal Data Protection Commission) has made it clear that the PDPA fully applies to AI systems that process personal data.
In March 2024, the PDPC issued dedicated Advisory Guidelines on the Use of Personal Data in AI Recommendation and Decision Systems, providing specific guidance for AI developers and deployers.
## Core PDPA Obligations for AI
### Consent
The default requirement under the PDPA is that organizations must obtain consent before collecting, using, or disclosing personal data. For AI systems, this means:
- Training data: If you use personal data to train AI models, you generally need consent from the individuals whose data is used
- Input data: If users provide personal data that is processed by an AI system, you need consent for that processing
- Output data: If an AI system generates personal data about individuals (profiles, predictions, scores), that output is also governed by the PDPA
### Exceptions to Consent
The PDPA provides several exceptions relevant to AI development:
- Business improvement exception: Organizations can use personal data without consent for improving or developing products and services, provided the data is not used for decisions about the individual. This is highly relevant for AI model training and testing
- Research exception: Personal data may be used for research purposes under certain conditions
- Legitimate interests exception: Organizations may process data without consent if it is in their legitimate interests, subject to a balancing test against the individual's interests
The PDPC Advisory Guidelines specifically note that organizations are "encouraged to use anonymized data where possible for AI training and testing."
### Purpose Limitation
Personal data collected for one purpose generally cannot be repurposed for AI training without additional consent, unless an exception applies. This has significant implications for companies looking to use existing customer databases to train AI models.
### Data Protection Obligations
Organizations must protect personal data in AI systems with reasonable security arrangements. This includes:
- Encryption of training data at rest and in transit
- Access controls for datasets used in AI development
- Regular security assessments of AI systems
- Monitoring for data breaches or unauthorized access
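As one concrete illustration of the access-control point above, dataset access can be gated by a role-based allow-list. This is a minimal sketch; the dataset and role names are hypothetical, not terms from the PDPA or the PDPC guidelines.

```python
# Minimal role-based access control for AI training datasets.
# Dataset and role names below are illustrative examples only.
DATASET_ACL = {
    "customer_training_set_v3": {"ml-engineer", "privacy-officer"},
}

def can_access(role: str, dataset: str) -> bool:
    """Return True only if the role is on the dataset's allow-list."""
    return role in DATASET_ACL.get(dataset, set())
```

In practice this logic would sit behind whatever identity and storage systems the organization already uses; the point is that every dataset of personal data has an explicit, auditable allow-list.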
### Data Breach Notification
If an AI system suffers a data breach involving personal data, organizations must notify the PDPC, and in some cases the affected individuals, if the breach is likely to result in significant harm to individuals or is of significant scale (affecting 500 or more individuals).
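The notification trigger can be expressed as a simple rule. The sketch below assumes the two criteria described above (likely significant harm, or a breach of significant scale, with the scale threshold at 500 affected individuals); the function name is illustrative.

```python
SIGNIFICANT_SCALE = 500  # threshold for a breach of "significant scale"

def must_notify(likely_significant_harm: bool, individuals_affected: int) -> bool:
    """True when a data breach must be reported under the criteria above."""
    return likely_significant_harm or individuals_affected >= SIGNIFICANT_SCALE
```

Note that the two conditions are independent: a breach affecting a single individual still triggers notification if significant harm is likely.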
## PDPC Advisory Guidelines on AI (March 2024)
The 2024 Advisory Guidelines provide practical, AI-specific guidance:
### Scope
The guidelines cover AI recommendation and decision systems — but explicitly do NOT cover Generative AI (which is addressed separately under the Model AI Governance Framework for GenAI).
### Key Guidance
On consent for AI training:
- Consent is generally needed when using personal data for AI training
- The business improvement exception may apply if the training is to improve products/services and the data is not used for individual-level decisions
- Organizations should clearly inform individuals about AI-related data processing in their privacy policies
On anonymization:
- Organizations are strongly encouraged to anonymize data before using it for AI development
- Properly anonymized data is no longer considered personal data under the PDPA
- The PDPC provides guidance on what constitutes adequate anonymization
On AI decision-making:
- If an AI system makes or significantly influences decisions about individuals, organizations should implement additional safeguards
- Individuals should be informed when AI plays a significant role in decisions affecting them
- Organizations should maintain the ability to explain AI decisions in general terms
On data intermediaries:
- Companies that process data on behalf of others (including AI service providers) have specific obligations as data intermediaries under the PDPA
- The organization that engages the intermediary remains responsible for PDPA compliance as if it were processing the data itself
## Penalties
| Violation | Maximum Penalty |
|---|---|
| Administrative fine (per breach) | SGD 1,000,000 (approximately USD 740,000), or 10% of annual Singapore turnover for organizations with local turnover above SGD 10 million, whichever is higher |
| Data breach notification failure | Additional financial penalties |
| Serious or repeated violations | Higher penalties may apply |
The PDPC has actively enforced the PDPA, issuing decisions and fines against organizations for various data protection failures.
## How to Comply
### Step 1: Data Audit for AI Systems
Map all personal data flows in your AI systems:
- What personal data is used for training? Where did it come from?
- What personal data is processed as input during operation?
- Does the AI system generate personal data as output?
- Are any PDPA exceptions applicable?
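The audit questions above boil down to an inventory of data flows with a documented legal basis for each. A minimal sketch, with hypothetical field values:

```python
from dataclasses import dataclass

@dataclass
class DataFlow:
    """One personal-data flow through an AI system (illustrative record)."""
    stage: str             # "training", "input", or "output"
    data_categories: tuple # e.g. ("email", "purchase_history")
    source: str            # where the data came from
    pdpa_basis: str        # "consent", "business_improvement", "research", or "" if undocumented

inventory = [
    DataFlow("training", ("purchase_history",), "CRM export", "business_improvement"),
    DataFlow("input", ("email",), "support chatbot", "consent"),
    DataFlow("output", ("credit_score",), "model prediction", ""),
]

# Flows without a documented legal basis need follow-up before the audit closes.
unresolved = [f.stage for f in inventory if not f.pdpa_basis]
```

The useful output of the audit is exactly the `unresolved` list: every flow that cannot yet point to consent or an applicable exception.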
### Step 2: Consent Review
For each data flow involving personal data:
- Do you have valid consent for this use?
- If relying on an exception, document why the exception applies
- Update privacy policies to describe AI-related data processing
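The consent check for each flow follows the same two-branch logic described above: valid consent, or a documented exception. A toy sketch, with a made-up consent register and exception labels drawn from the exceptions discussed earlier:

```python
# Hypothetical consent register keyed by (individual, purpose).
consent_register = {
    ("user-001", "ai_training"): True,
    ("user-002", "ai_training"): False,
}

# Labels for the PDPA exceptions discussed above (illustrative naming).
RECOGNISED_EXCEPTIONS = {"business_improvement", "research", "legitimate_interests"}

def processing_permitted(individual: str, purpose: str, exception: str = "") -> bool:
    """Permit processing on valid consent or on a documented PDPA exception."""
    if consent_register.get((individual, purpose), False):
        return True
    return exception in RECOGNISED_EXCEPTIONS
```

Whether a given exception actually applies is a legal judgment that must be documented; the code only enforces that *some* recorded basis exists.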
### Step 3: Anonymization Strategy
Where possible, anonymize data before using it for AI:
- Apply techniques like differential privacy, k-anonymity, or data masking
- Test anonymized datasets to ensure they cannot be re-identified
- Document your anonymization methodology
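One of the techniques named above, k-anonymity, is easy to measure: a dataset is k-anonymous if every record is indistinguishable from at least k-1 others on its quasi-identifiers. A minimal sketch with hypothetical records:

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Smallest group size sharing the same quasi-identifier values."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

# Hypothetical generalized records: a unique (age_band, district) pair
# makes the third record re-identifiable, so k drops to 1.
records = [
    {"age_band": "30-39", "district": "D05", "label": "A"},
    {"age_band": "30-39", "district": "D05", "label": "B"},
    {"age_band": "40-49", "district": "D12", "label": "C"},
]
```

A low k signals that further generalization or suppression is needed before the dataset can be treated as anonymized; k-anonymity alone does not guarantee robustness against all re-identification attacks.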
### Step 4: AI-Specific Privacy Impact Assessment
Conduct a privacy impact assessment for each AI system that processes personal data:
- Identify risks to individuals
- Assess the likelihood and severity of those risks
- Document mitigation measures
- Review and update periodically
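A common way to operationalize "likelihood and severity" is a simple scoring matrix. The risk names, ratings, and mitigation threshold below are illustrative assumptions, not prescribed by the PDPC:

```python
def risk_score(likelihood: int, severity: int) -> int:
    """Likelihood x severity, each rated 1 (low) to 5 (high)."""
    return likelihood * severity

# Hypothetical risks for one AI system; scores of 10 or more are flagged.
risks = {
    "re-identification of training data": risk_score(2, 5),
    "inaccurate automated decision": risk_score(3, 4),
    "excessive data retention": risk_score(2, 2),
}
needs_mitigation = {name for name, score in risks.items() if score >= 10}
```

The flagged set, together with the documented mitigations, forms the core of the assessment record that should be reviewed and updated periodically.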
### Step 5: Transparency Measures
Inform users and affected individuals about AI data processing:
- Update privacy policies to describe AI uses
- Provide accessible explanations of AI decision-making processes
- Implement channels for individuals to ask questions about AI decisions
## Related Regulations
- Singapore MAS AI Risk Management Guidelines: Sector-specific requirements for financial institutions
- Singapore Model AI Governance Framework: Voluntary best practices that complement PDPA obligations
- ASEAN Guide on AI Governance and Ethics: Regional framework aligned with PDPA principles
- EU GDPR: Comparable data protection requirements, including provisions on automated individual decision-making under Article 22
## Frequently Asked Questions
**Do I need consent to use personal data to train AI models?**

Generally yes, unless an exception applies. The most relevant exception is the business improvement exception, which allows organizations to use personal data for improving products and services without consent, provided the data is not used for decisions about specific individuals. The PDPC encourages anonymizing data where possible for AI training.
**What are the penalties for violating the PDPA?**

The PDPC can impose administrative fines of up to SGD 1,000,000 (approximately USD 740,000) per breach, or up to 10% of annual Singapore turnover for organizations with local turnover above SGD 10 million, whichever is higher. The PDPC actively enforces the PDPA and has issued numerous decisions involving data protection failures. There is no separate AI-specific penalty; violations are handled under the general PDPA enforcement framework.
**Does the PDPA apply to anonymized data?**

No. Properly anonymized data is not considered personal data under the PDPA and is therefore not subject to its requirements. However, the anonymization must be robust: if the data can be re-identified, it is still personal data. The PDPC provides guidance on adequate anonymization standards.
**Are the PDPC Advisory Guidelines legally binding?**

The guidelines themselves are advisory, but the underlying PDPA obligations they clarify are mandatory. The guidelines explain how existing PDPA requirements apply to AI systems. Failing to follow the guidelines does not create a separate violation, but failing to comply with the PDPA obligations they describe does.
**Do the 2024 Advisory Guidelines cover generative AI?**

No. The March 2024 PDPC Advisory Guidelines explicitly cover AI recommendation and decision systems, not generative AI. GenAI governance is addressed separately through Singapore's Model AI Governance Framework for Generative AI (published in 2024) and the new Agentic AI framework (published January 2026).
## References
- Personal Data Protection Act 2012. Singapore PDPC (2012).
- Advisory Guidelines on Use of Personal Data in AI Recommendation and Decision Systems. Singapore PDPC (2024).
