Thailand's AI Regulatory Landscape
Thailand has a maturing approach to AI governance built on three pillars:
- Thailand PDPA (enacted 2019, enforced since June 2022): Mandatory data protection law with significant penalties
- Draft AI Law (principles issued 2025): Risk-based AI framework expected to be formalized in 2026
- BOT AI Risk Management Guidelines (September 2025): Mandatory for financial institutions
Additionally, Thailand has an active AI Governance Center (AIGC) within ETDA, a regulatory sandbox for AI, and a National AI Strategy (2022-2027).
Thailand PDPA and AI
Overview
The Personal Data Protection Act B.E. 2562 (2019) is Thailand's comprehensive data protection law. It applies to all processing of personal data, including AI-related processing.
Key Requirements for AI Systems
Lawful basis: Personal data processing requires a lawful basis — consent, contractual necessity, legal obligation, vital interests, public interest, or legitimate interests.
Consent: Must be freely given, specific, informed, and unambiguous. For AI systems (a minimal record-keeping sketch follows this list):
- Clearly explain how personal data will be used in AI systems
- Separate consent for AI-specific processing from general service consent
- Consent can be withdrawn at any time
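To illustrate the points above, the sketch below models per-purpose consent records that keep AI-specific consent separate from general service consent and honour withdrawal. All names (ConsentRecord, Purpose, may_process) are hypothetical and not taken from the PDPA or any Thai regulator's guidance; treat it as one possible bookkeeping approach, not a compliance template.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from enum import Enum
from typing import Optional


class Purpose(Enum):
    """Hypothetical processing purposes; AI uses are kept distinct from service delivery."""
    SERVICE_DELIVERY = "service_delivery"
    AI_MODEL_TRAINING = "ai_model_training"
    AI_AUTOMATED_DECISION = "ai_automated_decision"


@dataclass
class ConsentRecord:
    subject_id: str
    purpose: Purpose
    granted_at: datetime
    notice_version: str                      # which privacy notice the subject actually saw
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        """Consent can be withdrawn at any time; record when that happened."""
        self.withdrawn_at = datetime.now(timezone.utc)

    @property
    def is_valid(self) -> bool:
        return self.withdrawn_at is None


def may_process(records: list[ConsentRecord], subject_id: str, purpose: Purpose) -> bool:
    """Check that a live, purpose-specific consent exists before AI processing runs."""
    return any(
        r.subject_id == subject_id and r.purpose == purpose and r.is_valid
        for r in records
    )
```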
Sensitive data: Explicit consent is required for processing sensitive data, which includes:
- Racial or ethnic origin
- Political opinions
- Religious or philosophical beliefs
- Trade union membership
- Genetic data
- Biometric data
- Health data
- Sexual orientation
- Criminal records
This is particularly important for AI systems using biometric data (facial recognition, voice analysis) or health data.
Data protection impact assessment: Required for processing that is likely to result in high risks to individuals' rights and freedoms. Most AI systems that process personal data for automated decision-making should undergo an assessment.
Cross-border transfers: Personal data can only be transferred outside Thailand if the destination country has adequate data protection standards or other safeguards are in place.
Penalties
| Violation Type | Maximum Penalty |
|---|---|
| Administrative fine | Up to THB 5 million (~USD 140,000) |
| Civil liability | Actual damages + punitive damages (up to 2x actual damages) |
| Criminal penalties | Up to 1 year imprisonment and/or THB 1 million fine |
The Personal Data Protection Committee and the Office of the Personal Data Protection Committee enforce the law.
Draft AI Law (2025-2026)
Thailand's Ministry of Digital Economy and Society (MDES) and the Electronic Transactions Development Agency (ETDA) have been developing a dedicated AI law:
Current Status
- Draft principles issued in 2025
- Currently non-binding
- Formalization of the framework targeted for 2026
- Risk-based approach similar to the EU AI Act
Expected Framework
Based on published principles:
Risk classification: AI systems will be classified by risk level (an illustrative tiering sketch follows these lists):
- High-risk AI: Healthcare, law enforcement, critical infrastructure, financial services
- Medium-risk: Customer-facing AI with moderate impact
- Low-risk: Administrative and operational AI
Requirements for high-risk AI:
- Risk assessments before deployment
- Ongoing monitoring
- Transparency to affected individuals
- Human oversight for critical decisions
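Because the draft principles have not yet fixed binding classification criteria, the sketch below is only a rough illustration of a risk-tiering step: it maps an AI system's domain to one of the three draft tiers and surfaces the high-risk obligations listed above. The domain list, the customer-facing rule, and the tier boundaries are assumptions, not text from the draft law.

```python
from enum import Enum


class RiskTier(Enum):
    HIGH = "high"
    MEDIUM = "medium"
    LOW = "low"


# Assumed mapping based on the draft principles' examples; the final law may differ.
HIGH_RISK_DOMAINS = {"healthcare", "law_enforcement", "critical_infrastructure", "financial_services"}

HIGH_RISK_OBLIGATIONS = [
    "pre-deployment risk assessment",
    "ongoing monitoring",
    "transparency to affected individuals",
    "human oversight for critical decisions",
]


def classify(domain: str, customer_facing: bool) -> RiskTier:
    """Rough tiering: high-risk domains first, then customer-facing systems, else low."""
    if domain in HIGH_RISK_DOMAINS:
        return RiskTier.HIGH
    if customer_facing:
        return RiskTier.MEDIUM
    return RiskTier.LOW


def obligations(tier: RiskTier) -> list[str]:
    """Only the high-risk tier carries the draft's explicit obligations."""
    return HIGH_RISK_OBLIGATIONS if tier is RiskTier.HIGH else []
```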
AI Governance Center (AIGC)
ETDA has established the AI Governance Center, which provides:
- Technical standards for AI governance
- Readiness assessments for organizations
- Support for implementing the AI ethics guidelines
- Coordination with the regulatory sandbox
AI Regulatory Sandbox
ETDA's AI Innovation Testing Center provides:
- Controlled environment for testing AI applications
- Regulatory guidance during testing
- Reduced regulatory requirements in exchange for transparency and data sharing
- Pathway from sandbox to full deployment
How to Comply
Step 1: PDPA Compliance
- Map all personal data processing in your AI systems (an inventory sketch follows this list)
- Ensure valid lawful basis for each processing activity
- Implement consent mechanisms specific to AI data use
- Conduct data protection impact assessments for high-risk AI
- Implement cross-border transfer safeguards
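A processing inventory underpins most of the items in Step 1. The sketch below shows one possible record format for such an inventory; the field names are hypothetical, and the DPIA trigger is a simplified assumption (automated decision-making or sensitive data), not an official PDPA threshold.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ProcessingActivity:
    """One row in a record of processing activities for an AI system."""
    system_name: str
    data_categories: list[str]        # e.g. ["contact details", "transaction history"]
    lawful_basis: str                 # consent, contract, legal obligation, ...
    involves_sensitive_data: bool     # biometric, health, etc. -> explicit consent needed
    automated_decision_making: bool
    transferred_outside_thailand: bool
    transfer_safeguard: Optional[str] = None  # adequacy, contractual clauses, consent, ...

    def needs_dpia(self) -> bool:
        # Simplified assumption: treat automated decisions or sensitive data as high risk.
        return self.automated_decision_making or self.involves_sensitive_data

    def transfer_gap(self) -> bool:
        # A cross-border transfer without a documented safeguard is a compliance gap.
        return self.transferred_outside_thailand and not self.transfer_safeguard
```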
Step 2: AI Ethics Alignment
- Review your AI systems against Thailand's published AI ethics principles
- Implement fairness monitoring and bias mitigation (a minimal monitoring sketch follows this list)
- Establish transparency mechanisms for AI decisions
- Create accountability structures for AI governance
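Fairness monitoring can start with something as simple as tracking outcome-rate gaps across groups. The sketch below computes a demographic parity gap over logged decisions; the 0.1 threshold and the group labels are illustrative assumptions, not values taken from Thailand's AI ethics principles.

```python
from collections import defaultdict


def demographic_parity_gap(decisions: list[tuple[str, bool]]) -> float:
    """decisions: (group_label, positive_outcome) pairs from an AI system's logs.

    Returns the largest difference in positive-outcome rates between any two groups.
    """
    totals: dict[str, int] = defaultdict(int)
    positives: dict[str, int] = defaultdict(int)
    for group, outcome in decisions:
        totals[group] += 1
        if outcome:
            positives[group] += 1
    rates = [positives[g] / totals[g] for g in totals]
    return max(rates) - min(rates) if rates else 0.0


# Illustrative check: flag for review if the gap exceeds an internally chosen threshold.
log = [("group_a", True), ("group_a", False), ("group_b", False), ("group_b", False)]
if demographic_parity_gap(log) > 0.1:
    print("Approval-rate gap exceeds internal threshold; trigger bias review.")
```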
Step 3: Sector-Specific Compliance
- Financial services: Implement BOT AI Risk Management Guidelines
- Healthcare: Follow existing health data regulations for AI applications
- Telecom: Comply with NBTC requirements for AI in communications
Step 4: Prepare for AI Legislation
- Monitor MDES and ETDA announcements
- Conduct AI system inventory and risk classification
- Build governance infrastructure that can adapt to binding requirements
- Consider engaging with ETDA's regulatory sandbox for innovative AI applications
Related Regulations
- Singapore PDPA & Model Framework: A more mature governance framework in a neighboring market
- Indonesia PDP Law: Comparable GDPR-style data protection
- Malaysia PDPA 2025: Similar evolving data protection landscape
- ASEAN AI Governance Guide: Regional framework Thailand's approach aligns with
Frequently Asked Questions
Does the PDPA apply to AI systems?
Yes. The PDPA applies to all processing of personal data regardless of the technology used. AI systems that collect, process, store, or generate personal data must comply. This includes AI training data, real-time processing, and AI-generated outputs containing personal data.
When will Thailand's AI law take effect?
Thailand has targeted 2026 for formalizing its AI regulatory framework. Draft principles were issued in 2025 by MDES and ETDA. The final form, whether standalone legislation, regulation, or enforceable guidelines, has not been confirmed.
What are the penalties for violating the PDPA?
Administrative fines up to THB 5 million (~USD 140,000), civil liability including actual damages plus punitive damages up to twice the actual amount, and criminal penalties of up to 1 year imprisonment and/or a THB 1 million fine for serious violations.
Is there a regulatory sandbox for AI in Thailand?
Yes. ETDA's AI Innovation Testing Center provides a regulatory sandbox for AI applications. Companies can test AI systems under relaxed regulatory conditions in exchange for transparency and data sharing. This is particularly useful for innovative AI applications where regulatory requirements are unclear.
Do the BOT AI Risk Management Guidelines apply to fintech companies?
Yes. The BOT AI Risk Management Guidelines (September 2025) apply to all financial institutions, specialized financial institutions, and payment service providers. Licensed fintech companies fall under BOT supervision and must comply. Implementation can be proportionate to the firm's size and AI usage.
References
- Personal Data Protection Act B.E. 2562. Government of Thailand (2019)
- AI Risk Management Guidelines for Financial Service Providers. Bank of Thailand (BOT) (2025)
- National AI Strategy and Action Plan 2022-2027. Thailand Ministry of Digital Economy and Society (2022)
