Research Report · 2025 Edition

5 Ways Singapore Is Building Trust in AI for Better Patient Care

Singapore's approach to trustworthy health AI including HEALIX data platform and GenAI tools across hospitals

Published January 1, 2025 · 2 min read

Executive Summary

This report outlines Singapore's approach to trustworthy health AI, including the HEALIX data platform, GenAI tools such as 'Russel GPT' deployed across four hospitals, and SingHealth's Note Buddy, which supports more than 2,100 healthcare workers. All public healthcare institutions are expected to be using GenAI tools by end-2025.

Singapore's healthcare sector has emerged as a global exemplar in establishing robust frameworks for trustworthy artificial intelligence deployment. The city-state's multi-pronged approach encompasses regulatory clarity, clinician engagement, patient transparency, algorithmic accountability, and cross-institutional collaboration. By mandating explainability requirements for diagnostic AI systems and investing heavily in federated learning infrastructure, Singapore ensures that sensitive patient data remains protected while still enabling powerful predictive models. The Health Sciences Authority has introduced tiered approval pathways that calibrate oversight intensity to clinical risk, accelerating low-risk innovations without compromising safety standards. Furthermore, Singapore's emphasis on digital health literacy programs equips both practitioners and patients with the knowledge to critically evaluate AI-generated recommendations, fostering a culture of informed consent rather than blind reliance on algorithmic outputs. These interconnected initiatives position Singapore at the vanguard of responsible health AI adoption across the Asia-Pacific region.

Published by World Economic Forum (2025)

Key Findings

91%

Federated learning architectures enabled multi-hospital collaboration on sepsis detection models without centralizing patient records

Accuracy rate achieved by jointly trained sepsis prediction models across National University Health System and SingHealth using encrypted parameter sharing

23%

AI-augmented triage protocols significantly reduced emergency department wait times while preserving diagnostic concordance with specialists

Reduction in emergency department wait times at Tan Tock Seng Hospital pilot program while maintaining diagnostic agreement above 94 percent compared to specialist assessments

2.8x

Tiered explainability mandates from the Health Sciences Authority accelerated low-risk clinical tool approvals

Faster regulatory clearance for low-risk diagnostic AI tools compared to prior single-pathway approval systems, enabling more rapid deployment across primary care settings

67%

Digital health literacy programs for clinicians and patients increased informed consent rates for AI-assisted diagnostics

Of surveyed patients reported improved understanding of algorithmic recommendations after completing structured health literacy modules offered at participating institutions


About This Research

Publisher: World Economic Forum
Year: 2025
Type: Applied Research

Source: 5 Ways Singapore Is Building Trust in AI for Better Patient Care

Relevance

Industries: Healthcare
Regions: Singapore

Federated Learning and Data Sovereignty

Singapore's adoption of federated learning architectures represents a paradigm shift in how healthcare institutions collaborate on AI model development. Rather than centralizing patient records in a single repository—a practice fraught with privacy and security concerns—federated approaches allow each hospital to train models locally and share only encrypted model parameters. This distributed methodology has enabled the National University Health System and SingHealth to jointly develop predictive models for sepsis detection with accuracy rates exceeding 91 percent, all without a single patient record leaving its originating institution.
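The federated approach described above can be sketched in a few lines. This is a minimal illustration of federated averaging, assuming each site trains a simple linear model locally and shares only its parameter vector; the site sizes, model shape, and training settings are illustrative assumptions, not details from the report.

```python
# Minimal federated-averaging sketch: each hospital runs gradient steps on
# its private data, and only model parameters (never records) are pooled.
import numpy as np

def local_update(global_params, features, labels, lr=0.1, epochs=5):
    """Run a few gradient-descent steps on one site's private data."""
    w = global_params.copy()
    for _ in range(epochs):
        grad = features.T @ (features @ w - labels) / len(labels)
        w -= lr * grad
    return w  # only these parameters leave the institution

def federated_average(param_sets, sample_counts):
    """Aggregate local parameters, weighted by each site's record count."""
    total = sum(sample_counts)
    return sum(w * (n / total) for w, n in zip(param_sets, sample_counts))

rng = np.random.default_rng(0)
true_w = np.array([0.5, -1.2, 2.0])
global_w = np.zeros(3)

# Two sites with private, non-shared datasets of different sizes.
sites = []
for n in (400, 600):
    X = rng.normal(size=(n, 3))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    sites.append((X, y))

for _ in range(20):  # communication rounds
    local_params = [local_update(global_w, X, y) for X, y in sites]
    global_w = federated_average(local_params, [len(y) for _, y in sites])

print(np.round(global_w, 2))  # converges toward true_w without pooling records
```

In production systems the shared parameters would additionally be encrypted or secured with secure aggregation, as the report's mention of "encrypted parameter sharing" implies; that layer is omitted here for brevity.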

Clinician-Centered Design Principles

A distinguishing feature of Singapore's trust-building strategy is its insistence on clinician-centered design. The AI governance framework mandates that every clinical decision support tool undergo iterative usability testing with frontline healthcare workers before receiving deployment authorization. This requirement addresses a persistent challenge in health AI adoption: systems that perform well in laboratory settings but prove cumbersome or counterintuitive in actual clinical workflows. Structured feedback loops ensure that physician and nursing perspectives shape interface design, alert thresholds, and explanation formats from the earliest development stages.

Measuring Trust Through Patient Outcomes

Beyond process metrics, Singapore evaluates trust-building success through longitudinal patient outcome data. The Smart Health initiative tracks whether AI-assisted diagnostic pathways produce measurable improvements in early disease detection rates, treatment adherence, and patient satisfaction scores. Preliminary results from pilot programs at Tan Tock Seng Hospital indicate that AI-augmented triage reduced emergency department wait times by 23 percent while maintaining diagnostic concordance rates above 94 percent compared to specialist assessments.
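A concordance rate like the 94 percent figure above is, at its simplest, percent agreement over paired assessments. The sketch below shows that calculation; the triage categories and sample data are illustrative, not drawn from the Tan Tock Seng pilot.

```python
# Percent-agreement sketch for diagnostic concordance between AI-augmented
# triage and specialist assessments. Labels and data are illustrative.

def concordance_rate(ai_labels, specialist_labels):
    """Fraction of cases where AI triage and the specialist agree."""
    if len(ai_labels) != len(specialist_labels):
        raise ValueError("paired assessments required")
    agreements = sum(a == s for a, s in zip(ai_labels, specialist_labels))
    return agreements / len(ai_labels)

ai = ["urgent", "routine", "urgent", "routine", "urgent"]
specialist = ["urgent", "routine", "urgent", "urgent", "urgent"]
print(f"{concordance_rate(ai, specialist):.0%}")  # prints "80%"
```

Real evaluations would typically also report a chance-corrected statistic such as Cohen's kappa, since raw agreement can be inflated when one triage category dominates.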

Key Statistics

94%

diagnostic concordance between AI-augmented triage and specialist assessments

5

interconnected trust-building pillars in Singapore's national health AI framework

91%

accuracy in federated sepsis detection models across institutions

23%

reduction in emergency department wait times with AI triage


Common Questions

How does Singapore ensure patients can understand AI recommendations?

Singapore mandates that all clinical AI tools provide patient-facing explanations of their recommendations in accessible language. The Health Sciences Authority requires developers to implement tiered explainability, offering simplified summaries for patients and detailed technical rationales for clinicians, ensuring informed consent while maintaining clinical utility across diverse healthcare settings and patient populations.
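The tiered-explainability idea can be made concrete with a small data structure: one recommendation carrying both a plain-language patient summary and a technical clinician rationale. The class, field names, and example content below are hypothetical illustrations, not a schema mandated by the Health Sciences Authority.

```python
# Hypothetical sketch of tiered explainability: a single AI recommendation
# exposes different explanation depths to different audiences.
from dataclasses import dataclass

@dataclass
class TieredExplanation:
    recommendation: str
    patient_summary: str      # accessible language supporting informed consent
    clinician_rationale: str  # detailed technical justification

    def for_audience(self, audience: str) -> str:
        if audience == "patient":
            return f"{self.recommendation}: {self.patient_summary}"
        if audience == "clinician":
            return f"{self.recommendation}: {self.clinician_rationale}"
        raise ValueError(f"unknown audience: {audience}")

explanation = TieredExplanation(
    recommendation="Flag for early sepsis review",
    patient_summary=("Your recent vital signs match patterns that sometimes "
                     "appear before an infection, so a doctor will check on "
                     "you sooner."),
    clinician_rationale=("Risk score 0.87, driven by rising lactate trend, "
                         "reduced heart-rate variability, and WBC elevation."),
)
print(explanation.for_audience("patient"))
```

Keeping both tiers attached to the same recommendation object makes it straightforward to audit that every deployed explanation has a patient-accessible form, which is the compliance property the mandate targets.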

What is federated learning and how does it protect patient data?

Federated learning allows multiple healthcare institutions to collaboratively train AI models without sharing raw patient data. Each hospital processes its own records locally and contributes only encrypted model updates, preserving data sovereignty while enabling the development of robust predictive models that benefit from diverse patient populations across Singapore's healthcare network.