Bias Correct
This program identifies bias related to gender, race, and ethnicity in writing samples such as evaluations and letters of recommendation. The program is currently rule-based and highlights common forms of bias described in the scientific literature.
Our goal is to help you write strong letters, statements, or evaluations by giving you a way to identify the implicit bias that shows up in everyone’s writing.
The program highlights potential bias and describes the associated rules, so you can understand the best practices and make changes to reduce bias. The rules are designed to be over-inclusive, which means they may highlight words or phrases that do not represent bias (e.g., “She is patient” may represent bias related to gender, but “Her patient notes were clear and well-organized” does not represent bias).
Gender, racial, and ethnic bias in evaluations and letters of recommendation often shows up as coded language (e.g., “articulate,” “competent”) or a failure to mention concrete accomplishments. Addressing bias may mean adding examples of concrete accomplishments or removing coded language.
We are not trying to make all writing the same. For example, describing emotional intelligence often includes communal language (“patient,” “compassionate”) and is important, even if it is associated with gender bias. To address this type of bias, you can check that you are including accomplishments, superlatives, and agentic language (“independent,” “assertive,” “decisive”) as well. Ask yourself if you use the same words when writing for all genders. You might need to add communal language for some genders and not others.
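One way to apply this advice in practice is to check whether a letter leans heavily on communal language without any agentic language to balance it. The sketch below illustrates the idea with two small, hypothetical word lists (the tool's actual rules are more extensive and different):

```python
import re

# Hypothetical word lists for illustration only; the project's real rule
# set is larger and curated from the research literature.
COMMUNAL = {"patient", "compassionate", "warm", "helpful", "kind"}
AGENTIC = {"independent", "assertive", "decisive", "confident", "ambitious"}

def language_balance(text):
    """Count communal vs. agentic words to flag a possible imbalance."""
    words = re.findall(r"[a-z]+", text.lower())
    communal = sum(w in COMMUNAL for w in words)
    agentic = sum(w in AGENTIC for w in words)
    return communal, agentic

communal, agentic = language_balance(
    "She is patient and compassionate, and always helpful."
)
if communal > agentic:
    print(f"Consider adding agentic language "
          f"({communal} communal vs. {agentic} agentic words).")
```

A count like this cannot decide whether a sentence is biased; it only prompts the writer to ask whether they would use the same mix of words for all genders.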
Areas of potential bias
This will be a summary of the rules identified in the text you submitted.
-
Disparities based on gender, race, and ethnicity exist in medicine, science, business, and many other professions
Letters of recommendation and evaluations differ in key ways depending on the subject's gender, race, and ethnicity
The differences impact everything from how individuals are graded in a class to whether they are hired or promoted
Most writers are unaware of bias in their writing (it is implicit)
Even if someone wants to write a strong letter, they will likely include language that reflects implicit bias, which weakens the letter.
-
Studies on bias related to gender, race, and ethnicity show that letters and evaluations written for women and persons excluded because of their ethnicity or race (PEERs) are:
Less likely to mention publications, projects, and research
Less likely to include superlatives ('They were the best, the top, the greatest')
Less likely to use nouns ('He was a researcher' vs. 'she taught')
More likely to include minimal assurance ('They can do the job') rather than a strong endorsement
More likely to highlight effort ('They are hard-working') instead of highlighting accomplishments ('they are an exceptional researcher')
More likely to discuss personal life and fail to use formal titles
More likely to include stereotypes ('They are warm'), coded language (“articulate”), and emotion-focused words (“patient”)
More likely to raise doubt
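Patterns like these lend themselves to the rule-based highlighting the program performs. As a rough sketch, and assuming a handful of made-up patterns (the project's actual rules differ and are more nuanced), a checker can match known phrases and report what it found and where:

```python
import re

# Illustrative rules only; phrases are examples drawn from the list above,
# not the project's real rule definitions.
RULES = {
    "minimal assurance": [r"\bcan do the job\b", r"\bis capable of\b"],
    "effort over accomplishment": [r"\bhard[- ]working\b", r"\bworks hard\b"],
    "coded language": [r"\barticulate\b", r"\bcompetent\b"],
}

def highlight(text):
    """Return (rule name, matched phrase, offset) for each rule match."""
    findings = []
    for rule, patterns in RULES.items():
        for pattern in patterns:
            for m in re.finditer(pattern, text, re.IGNORECASE):
                findings.append((rule, m.group(0), m.start()))
    return findings

for rule, phrase, start in highlight(
    "She is articulate and hard-working; she can do the job."
):
    print(f"{rule}: '{phrase}' at offset {start}")
```

Because rules like these are deliberately over-inclusive, every match is a prompt to reconsider a phrase, not a verdict that the phrase is biased.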
-
This project was part of the Mozilla Open Leaders program and started in 2018 under the mentorship of Jason Clark.
Since 2018, we have had contributors from around the world supporting this project.
The project is supported by a 'Fixing The Broken Rung' Grant from the Gender Equity in Medicine Research Foundation.
-
https://github.com/gender-bias/gender-bias