Race, Gender, and Age Biases in Biomedical Masked Language Models
Michelle YoungJin Kim, Junghwan Kim, Kristen Johnson
Findings Paper: Computational Social Science and Cultural Analytics
Session 4: Computational Social Science and Cultural Analytics (Virtual Poster)
Conference Room: Pier 7&8
Conference Time: July 11, 11:00-12:30 (EDT) (America/Toronto)
Global Time: July 11, Session 4 (15:00-16:30 UTC)
Keywords:
language/cultural bias analysis
Abstract:
Biases cause discrepancies in healthcare services.
A patient's race, gender, and age affect their interactions with physicians and the medical treatment they receive.
These biases in clinical practices can be amplified following the release of pre-trained language models trained on biomedical corpora.
To bring awareness to such repercussions, we examine the social biases present in biomedical masked language models.
We curate prompts grounded in evidence-based practice and compare the diagnoses the models generate across race, gender, and age groups.
As a case study, we measure bias in the diagnosis of coronary artery disease and in the use of cardiovascular procedures.
Our study demonstrates that biomedical models are less biased than BERT with respect to gender, while the opposite holds for race and age.
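The prompt-based probing described above can be sketched in a few lines of Python. This is a minimal illustration, not the authors' code: the prompt template, the probability values, and the log-probability-gap score are all hypothetical stand-ins (in practice the probabilities would come from a biomedical masked LM's fill-mask head).

```python
# Sketch of prompt-based bias probing for a masked language model.
# The numbers below are made up for illustration; a real study would
# obtain P([MASK] = diagnosis) from a model via its fill-mask head.
import math

# Probability the model assigns to a diagnosis token under prompts
# that differ only in the demographic attribute, e.g. a hypothetical
# template "The [ATTR] patient has [MASK]."
diagnosis_prob = {
    "male": 0.031,
    "female": 0.018,
}

def log_prob_gap(p_a: float, p_b: float) -> float:
    """Log-probability gap between two demographic groups.

    A value of 0 means the model treats both prompts identically
    for this diagnosis; larger magnitudes indicate larger bias.
    """
    return math.log(p_a) - math.log(p_b)

gap = log_prob_gap(diagnosis_prob["male"], diagnosis_prob["female"])
print(f"log-prob gap (male vs. female): {gap:.3f}")
```

The same comparison can be repeated over race and age prompt variants, which is the kind of per-attribute measurement the abstract's gender-versus-race/age finding relies on.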