When Bias Becomes Knowledge: How Sociodemographic Inequities Shape Medical AI

Auteurs-es

  • Om M. Patel McMaster University

DOI :

https://doi.org/10.18192/uojm.v16iS1.7834

Mots-clés :

Artificial intelligence, Equity, Diversity, Representation, Equity, Residency, Systemic Barriers

Résumé

Artificial intelligence, particularly large-language models, increasingly informs clinical decision-making, from triage to treatment recommendations. While promising efficiency and objectivity, these systems can encode and amplify historical biases present in clinical practice and documentation. Names, language, and other implicit social signals can trigger inequitable recommendations, formalizing discrimination at scale. Mitigating this requires technical, data-centric, and institutional interventions, including counterfactual testing, equitable dataset design, and fairness audits. With deliberate oversight, AI can not only improve care but also reveal and address persistent healthcare disparities.

Références

1. Maity S, Saikia MJ. Large Language Models in Healthcare and Medical Applications: A Review. Bioengineering (Basel) 2025; 12: 631.

2. Straw I. The automation of bias in medical Artificial Intelligence (Ai): Decoding the past to create a better future. Artificial Intelligence in Medicine 2020; 110: 101965.

3. Omar M, Soffer S, Agbareia R, et al. Sociodemographic biases in medical decision making by large language models. Nat Med 2025; 31: 1873–1881.

4. Hicks SA, Strümke I, Thambawita V, et al. On evaluation metrics for medical applications of artificial intelligence. Sci Rep 2022; 12: 5979.

5. Cross JL, Choma MA, Onofrey JA. Bias in medical AI: Implications for clinical decision-making. PLOS Digit Health 2024; 3: e0000651.

6. Jarman AF, Hwang AC, Schleimer JP, et al. Racial Disparities in Opioid Analgesia Administration Among Adult Emergency Department Patients with Abdominal Pain. West J Emerg Med 2022; 23: 826–831.

7. Lunova T, Komorovsky R, Klishch I. Gender Differences in Treatment Delays, Management and Mortality among Patients with Acute Coronary Syndrome: A Systematic Review and Meta-analysis. Curr Cardiol Rev 2023; 19: e300622206530.

8. Merritt CC, Halverson TF, Elliott T, et al. Racial Disparities and Predictors of Functioning in Schizophrenia. Am J Orthopsychiatry 2023; 93: 177–187.

9. Ivy ZK, Hwee S, Kimball BC, et al. Disparities in Documentation: Evidence of Race-Based Biases in the Electronic Medical Record. Racial Ethn Health Disparities 2025; 12: 3294–3300.

10. Rajkomar A, Hardt M, Howell MD, et al. Ensuring Fairness in Machine Learning to Advance Health Equity. Ann Intern Med 2018; 169: 866–872.

11. Kücking F, Hübner U, Przysucha M, et al. Automation Bias in AI-Decision Support: Results from an Empirical Study. Stud Health Technol Inform 2024; 317: 298–304.

12. Char DS, Shah NH, Magnus D. Implementing Machine Learning in Health Care — Addressing Ethical Challenges. N Engl J Med 2018; 378: 981–983.

13. Chakradeo K, Huynh I, Balaganeshan SB, et al. Navigating fairness aspects of clinical prediction models. BMC Med 2025; 23: 567.

14. Chen RJ, Wang JJ, Williamson DFK, et al. Algorithm fairness in artificial intelligence for medicine and healthcare. Nat Biomed Eng 2023; 7: 719–742.

15. Obermeyer Z, Powers B, Vogeli C, et al. Dissecting racial bias in an algorithm used to manage the health of populations. Science 2019; 366: 447–453.

16. Mehrabi N, Morstatter F, Saxena N, et al. A Survey on Bias and Fairness in Machine Learning. ACM Comput Surv 2022; 54: 1–35.

17. FDA Issues Comprehensive Draft Guidance for Developers of Artificial Intelligence-Enabled Medical Devices. FDA, https://www.fda.gov/news-events/press-announcements/fda-issues-comprehensive-draft-g uidance-developers-artificial-intelligence-enabled-medical-devices (2025, accessed 30 January 2026).

Publié-e

2026-04-30

Comment citer

Patel, O. M. (2026). When Bias Becomes Knowledge: How Sociodemographic Inequities Shape Medical AI. Journal médical De l’Université d’Ottawa, 16(S1), 8–10. https://doi.org/10.18192/uojm.v16iS1.7834

Numéro

Rubrique

Commentaire (concours)