Bias and Oversight in Clinical AI: A Review of Decision Support Tools and Equity Frameworks
Author Department
Cardiology; Medicine
Document Type
Article, Peer-reviewed
Publication Date
2-2026
Abstract
Artificial intelligence (AI) decision support tools (DSTs) are increasingly used across clinical settings to improve efficiency and support decision-making. However, these tools risk perpetuating existing healthcare disparities if they are not designed and implemented with transparency, equity, and cultural sensitivity. This review explores how racial and ethnic biases manifest within AI-driven DSTs and evaluates the role of governance frameworks in mitigating such harms. It examines the implications of biased algorithms, presents case examples highlighting disparities in tool performance, and critically assesses the adequacy of current national and international regulatory guidance. The review finds that bias can stem from unrepresentative training datasets, the omission of equity auditing from the design process, and the absence of mandated transparency in reporting. Although several frameworks exist to guide development and reporting, few are mandatory, and most do not include equity as a core criterion. The current UK and US regulatory models are decentralized and lack mechanisms to systematically detect or prevent bias. To prevent biased tools from entering clinical practice, equity must be structurally embedded across the AI lifecycle. This requires standardized subgroup performance reporting, mandatory fairness assessments, and national and global governance standards to ensure that AI tools serve all populations equitably.
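To illustrate what "standardized subgroup performance reporting" could look like in practice, the sketch below computes discrimination and error-rate metrics separately for each demographic group in a validation cohort. This is a minimal, hypothetical example, not the article's method: the column names (`race_ethnicity`, `outcome`, `risk_score`) and the 0.5 decision threshold are assumptions for illustration only.

```python
# Minimal sketch of subgroup performance reporting for a binary
# clinical risk model. All column names and the threshold are
# hypothetical; a real audit would use a validated cohort and
# prespecified metrics.
import pandas as pd
from sklearn.metrics import roc_auc_score

def subgroup_report(df: pd.DataFrame,
                    group_col: str = "race_ethnicity",
                    label_col: str = "outcome",
                    score_col: str = "risk_score",
                    threshold: float = 0.5) -> pd.DataFrame:
    """Report AUROC, sensitivity, and specificity per subgroup."""
    rows = []
    for group, g in df.groupby(group_col):
        y = g[label_col]
        s = g[score_col]
        pred = s >= threshold
        tp = ((pred) & (y == 1)).sum()
        fn = ((~pred) & (y == 1)).sum()
        tn = ((~pred) & (y == 0)).sum()
        fp = ((pred) & (y == 0)).sum()
        rows.append({
            "group": group,
            "n": len(g),
            # AUROC is undefined if a subgroup has only one class.
            "auroc": roc_auc_score(y, s) if y.nunique() == 2 else float("nan"),
            "sensitivity": tp / (tp + fn) if (tp + fn) else float("nan"),
            "specificity": tn / (tn + fp) if (tn + fp) else float("nan"),
        })
    return pd.DataFrame(rows)
```

Applied to held-out data, a gap in sensitivity or AUROC between groups would flag exactly the kind of performance disparity the review describes, before a tool enters clinical use.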
Keywords: algorithmic bias; artificial intelligence; clinical decision support tools; digital health; ethnic disparities; healthcare inequity; racial bias.
Recommended Citation
Adegunle F, Chhatwal K, Arab S, Alabdaljabar MS, Raslan MA, Sayed O, Goldsweig AM. Bias and Oversight in Clinical AI: A Review of Decision Support Tools and Equity Frameworks. J Gen Intern Med. 2026 Feb 2. doi: 10.1007/s11606-026-10229-5. Epub ahead of print.
PMID
41627658