Navigate BNM's RMiT requirements, the Cyber Security Act 2024, and PDPA amendments simultaneously — build AI capabilities your compliance team will champion.
Malaysia's financial sector operates under one of ASEAN's most rigorous regulatory frameworks. BNM's Risk Management in Technology (RMiT) policy requires financial institutions to strengthen cybersecurity and cloud risk governance, while the Cyber Security Act 2024 mandates 6-hour incident notification for NCII entities, including banks. The PDPA amendments raise maximum fines to RM1 million and require mandatory DPO appointments and 72-hour breach notification from June 2025, making compliance-aware AI deployment a business imperative. BNM's Financial Technology Regulatory Sandbox — with its new 'Green Lane' accelerated track — creates opportunities for AI innovation within a controlled environment. This programme is structured to qualify for HRD Corp SBL-Khas claims, with training costs covered directly from employer levy contributions — no upfront payment required.
LOCAL CONTEXT
Malaysia is rapidly positioning itself as a regional AI hub through the Malaysia Digital initiative. Strong government incentives, including HRDF and MDEC grants, combined with a growing pool of digital talent, create fertile ground for AI transformation across industries.
$2.1 billion AI market by 2030
THE CHALLENGE
PDPA Amendment Compliance Gap
HRD Corp Funding Underutilisation
AI Talent Shortage Blocking Implementation
Cyber Security Act 2024 Compliance Burden
Our team has trained executives at globally recognised brands.
OUTCOMES
FUNDING & SUBSIDIES
Up to RM1,000 per participant
Covers training costs for employees of registered employers (mandatory for 10+ staff). Direct provider payment — no upfront cost to employer.
Up to RM5,000 per company
50% matching grant for digital service subscriptions adopted as part of this programme's implementation phase.
Varies by partner institution
Part of RM1.5 billion public-private initiative supporting MSME business digitalisation through financial institutions and digital service providers.
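Because HRD Corp's SBL-Khas scheme requires applications to be submitted before training begins, with processing cited at 5-10 days, scheduling needs a built-in buffer. The sketch below is illustrative only: it assumes the 10-day worst case refers to working days, and the function names are our own, not HRD Corp terminology.

```python
from datetime import date, timedelta

# Worst-case processing window cited for SBL-Khas applications.
# Assumption: the 5-10 days are working days (Mon-Fri).
PROCESSING_DAYS = 10

def add_working_days(start: date, days: int) -> date:
    """Advance `days` working days from `start`, skipping weekends."""
    current = start
    while days > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # weekday() 0-4 are Mon-Fri
            days -= 1
    return current

def earliest_training_date(application_date: date) -> date:
    """Earliest safe training start if the claim is filed on this date."""
    return add_working_days(application_date, PROCESSING_DAYS)

# Example: an application filed on Monday 3 March 2025
print(earliest_training_date(date(2025, 3, 3)))
```

In practice this means booking programme dates at least two to three calendar weeks after the grant application is lodged, which is the lead time our scheduling builds in.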
REGULATORY LANDSCAPE
The PDPA 2010 amendments (effective January–June 2025) are directly relevant: maximum fines increased to RM1 million, mandatory DPO appointments, 72-hour breach notification, expanded sensitive data definitions including biometrics, and new data portability rights. BNM's Risk Management in Technology (RMiT) policy imposes additional technology governance requirements on financial institutions, while the Financial Technology Regulatory Sandbox provides a controlled environment for AI innovation. The Cyber Security Act 2024 requires NCII entities to conduct annual cybersecurity risk assessments, biennial audits, and notify authorities of incidents within 6 hours of discovery. MOSTI's National Guidelines on AI Governance and Ethics (AIGE) outline seven core principles for responsible AI deployment, and the National AI Office (NAIO) is developing the AI Technology Action Plan 2026–2030 as a risk-based regulatory framework.
CHALLENGES IN MALAYSIA
The 2024 PDPA amendments require mandatory DPO appointments, 72-hour breach notification, and expanded sensitive data definitions including biometrics — effective June 2025. Many Malaysian organisations lack the AI governance frameworks needed to ensure automated systems meet these heightened requirements, risking fines up to RM1 million.
Malaysian employers with 10+ staff pay a mandatory 1% levy to HRD Corp, yet many fail to fully claim these funds for AI training. The SBL-Khas scheme covers up to RM1,000 per participant with direct provider payment, but the 'apply before training' requirement and 5-10 day processing time catch unprepared organisations off-guard.
Malaysia has only 3,000 AI professionals against a projected demand of 30,000 by 2030. With 81% of employers struggling to hire AI talent and a 34% salary premium required for AI-skilled candidates, building internal capability through training is significantly more cost-effective than competing in the talent market.
The Cyber Security Act 2024 requires NCII entities to conduct annual cybersecurity risk assessments, biennial audits, and report incidents within 6 hours. AI systems that process sensitive data must be designed with these requirements embedded from the start — retrofitting compliance is far more expensive.
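Embedding these statutory windows into incident-response tooling is one example of designing compliance in from the start. The sketch below is illustrative only, not a legal tool: it hard-codes the two notification windows cited above (6 hours under the Cyber Security Act 2024 for NCII incidents, 72 hours for PDPA breach notification), and all function and key names are assumptions.

```python
from datetime import datetime, timedelta

# Statutory notification windows cited in this programme.
# Regime keys are our own labels, not official identifiers.
NOTIFICATION_WINDOWS = {
    "cyber_security_act_2024": timedelta(hours=6),
    "pdpa_breach": timedelta(hours=72),
}

def notification_deadlines(discovered_at: datetime) -> dict:
    """Map each applicable regime to its notify-by deadline,
    counted from the moment the incident is discovered."""
    return {regime: discovered_at + window
            for regime, window in NOTIFICATION_WINDOWS.items()}

# Example: an incident discovered at 09:00 on 1 June 2025
deadlines = notification_deadlines(datetime(2025, 6, 1, 9, 0))
for regime, due in sorted(deadlines.items(), key=lambda kv: kv[1]):
    print(f"{regime}: notify by {due:%Y-%m-%d %H:%M}")
```

Wiring a check like this into alerting pipelines gives response teams a countdown from discovery rather than from triage, which is where retrofitted processes typically lose hours.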
OUR PROCESS
We assess your current AI maturity, regulatory environment, technology stack, and strategic priorities. This includes interviews with risk, compliance, operations, and business leaders to map your highest-impact use cases.
We tailor modules to your specific sub-sector (banking, insurance, asset management, fintech), regulatory jurisdiction, and team composition. All examples, case studies, and exercises use financial services scenarios your teams will recognise.
Interactive workshops using real-world financial services data sets (anonymised), regulatory scenarios, and industry tools. Each module combines concept explanation with immediate practice on tasks your teams perform daily.
Participants develop 2-3 AI use case proposals specific to their departments, with business cases, risk assessments, and implementation roadmaps — ready for leadership review.
30-day post-programme support includes office hours, Slack access, implementation coaching, and a follow-up session to review progress on use case pilots and address emerging challenges.
IS THIS RIGHT FOR YOU?
Banks, insurers, and asset managers with 50+ employees seeking structured AI adoption
Financial institutions facing regulatory pressure to modernise risk and compliance operations
Regional financial groups wanting consistent AI capability across multiple markets
Fintech companies scaling operations and needing AI-literate teams
Financial services firms that tried generic AI training and found it too disconnected from their reality
Individual learners (this is a team programme — try our AI Readiness Fundamentals instead)
Organisations looking to build custom AI models from scratch (try our Engineering tier)
Financial services firms already running AI at scale across departments (try our Enterprise Transformation offering)
See yourself above? Let's talk about AI for Banking & Financial Services Teams in Malaysia.
Let's Talk
COMMON QUESTIONS
MORE TRAINING
WHY PERTAMA PARTNERS
Pertama's advisors understand the specific intersection of BNM's RMiT requirements, the Cyber Security Act 2024, and PDPA amendments that Malaysian financial institutions must navigate simultaneously. Most local training providers address these regulations in isolation; we train teams to build AI systems that satisfy all three frameworks from the start.
Training is delivered in English as the primary working language, with Bahasa Malaysia terminology integrated where relevant. Facilitators are comfortable with the code-switching between English, Bahasa Malaysia, and Mandarin that is common in Malaysian professional settings. All materials reference Malaysian regulations, funding mechanisms, and market examples. On-premise delivery is available for organisations with strict information security requirements. Programme structure is designed to meet HRD Corp's 'apply before training' process requirements, with adequate lead time built into scheduling.
Let's discuss how AI for Banking & Financial Services Teams can help your organisation in Malaysia.
Start a Conversation