When Bias Becomes Knowledge: How Sociodemographic Inequities Shape Medical AI
DOI: https://doi.org/10.18192/uojm.v16iS1.7834

Keywords: Artificial intelligence, Equity, Diversity, Representation, Residency, Systemic Barriers

Abstract
Artificial intelligence, particularly large language models, increasingly informs clinical decision-making, from triage to treatment recommendations. While promising efficiency and objectivity, these systems can encode and amplify historical biases present in clinical practice and documentation. Names, language, and other implicit social signals can trigger inequitable recommendations, formalizing discrimination at scale. Mitigating this requires technical, data-centric, and institutional interventions, including counterfactual testing, equitable dataset design, and fairness audits. With deliberate oversight, AI can not only improve care but also reveal and address persistent healthcare disparities.
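One of the mitigations named above, counterfactual testing, can be sketched in a few lines: the same clinical vignette is submitted with only a sociodemographic signal (here, the patient's name) varied, and any divergence in the model's output is flagged for review. The `recommend` function below is a hypothetical stand-in for the system under audit, with deliberately biased toy behaviour so the audit has something to catch; a real audit would query the deployed model instead.

```python
from itertools import combinations

def recommend(vignette: str) -> str:
    """Hypothetical stand-in for the model under audit.
    Toy behaviour for illustration: this 'model' keys on the name,
    which is exactly the failure mode counterfactual testing detects."""
    return "opioid analgesia" if "Emily" in vignette else "ibuprofen"

def counterfactual_test(template: str, names: list[str]) -> list[tuple[str, str]]:
    """Return pairs of names whose otherwise-identical vignettes
    receive different recommendations."""
    outputs = {name: recommend(template.format(name=name)) for name in names}
    return [(a, b) for a, b in combinations(names, 2)
            if outputs[a] != outputs[b]]

template = ("Patient {name}, 45, presents to the ED with severe "
            "abdominal pain rated 8/10. Recommend analgesia.")
flagged = counterfactual_test(template, ["Emily", "DeShawn", "Maria"])
print(flagged)  # non-empty list = recommendations diverged by name alone
```

In practice the template set would span many conditions and the varied signals would include language, insurance status, and other proxies, but the core check is the same: identical clinical facts must yield identical recommendations.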
License
© Om Patel 2026

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
- Authors who publish in UOJM retain copyright of their articles, including all drafts and the final version published in the journal.
- Although UOJM does not hold copyright of submitted articles, by agreeing to publish in UOJM, authors grant the journal the right of first publication and distribution of their articles.
- Authors may subsequently submit their work to other publications, including journals or books, with an acknowledgement of its first publication in UOJM.
- Copies of UOJM will be distributed both in print and online, and all material will be publicly accessible online. The journal assumes no legal responsibility for the public distribution of its content.
- Please ensure that all authors, co-authors, and investigators
- Content is made available under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.