Reducing blind spots takes attention.

We have to catch ourselves in the act of drawing a conclusion based on incomplete information, assumptions, or proxy information.

Proxy information means substituting one set of information for another: we infer quality, potential, or trustworthiness from information that does not actually measure those things. The association may exist only as a pattern in our minds, or we were taught to make it and then stopped questioning it. These associations become our personal shortcuts for decision-making.

These little things matter. Whether it’s grading a paper, evaluating an oral exam, or deciding about a new team member or colleague, the small shortcuts we’ve accepted as normal and “unbiased” permeate many interactions and evaluative situations. We are all affected; no one is immune.

Which ideas do we take seriously in a meeting or brainstorming session?

Whose hands do we call on first in a classroom or seminar? Who feels comfortable and welcome to voice ideas first?

Which people do we allow to interrupt which other people?

Which leaders do we casually call by first name rather than last name (Hillary, Angie, Madeline, or Ursula instead of Clinton, Merkel, Albright, or von der Leyen, while Obama, Scholz, Schröder, and nearly all male leaders are called by their last names)?

And how do these patterns shape outcomes: who speaks, who is heard, and who has influence?

Examples of patterns that we often substitute in for quality:

  • “think leader, think male”: a common mental pattern about who is likely to be the leader and who has our trust until they break it, rather than who must earn our trust because we have seen few examples of that kind of person doing that kind of work.
  • a long publication list is taken as a signal of high quality.
  • coming from the right school or lab (and, conversely, not coming from one is read as lower quality).
  • coauthoring with famous people.
  • an uninterrupted career is assumed to mean someone is better at their job.
  • being from the Global North.
  • a name that is familiar to us culturally.
  • a male name.
  • someone we already know.

Some resources that can help us learn more about, and change, our unconscious patterns of shortcutting:

Learning more and doing better:

Bohnet, I., Chilazi, S., & Asundi, A. Ten Evidence-Based Practices for De-biasing the Workplace. Cambridge, MA.

Gino, F., & Coffman, K. (2021). Unconscious Bias Training that Works. Harvard Business Review, September–October.

Harvard University: extensive collection of PDF handouts and links for further reading.

Texas A&M University 2023 Handbook

Moody, J. (n.d.). Rising above Cognitive Errors: Guidelines for Search, Tenure Review, and Other Evaluation Committees.

Implicit Association Tests from Harvard

The Royal Society YouTube video: Understanding Unconscious Bias

11 Harmful Types of Unconscious Bias and How to Interrupt Them (2020), from Catalyst


Hofmeister, H. (2016). Gender and Science: A Trial. Women versus 16 other suspects guilty for causing women’s underrepresentation in science careers. In N. Baur, C. Besio, M. Norkus, & G. Petschik (Eds.), Wissen – Organisation – Forschungspraxis. Der Makro-Meso-Mikro-Link in der Wissenschaft (pp. 626–670). Weinheim: Beltz-Juventa.

Gvozdanović, J., & Maes, K. (2018). Implicit Bias in Academia: A Challenge to the Meritocratic Principle and to Women’s Careers – and What to Do about It. LERU.

Writing and reading letters of recommendation can be biased

Dutt, K., Pfaff, D. L., Bernstein, A. F., Dillard, J. S., & Block, C. J. (2016). Gender differences in recommendation letters for postdoctoral fellowships in geoscience. Nature Geoscience, 9(11), 805–808. doi:10.1038/ngeo2819

Madera, J. M., Hebl, M. R., Dial, H., Martin, R., & Valian, V. (2019). Raising Doubt in Letters of Recommendation for Academia: Gender Differences and Their Impact. Journal of Business and Psychology, 34(3), 287–303. doi:10.1007/s10869-018-9541-1

Research quality assessments can be biased; the famous Nature study from 1997:

Wennerås, C., & Wold, A. (1997). Nepotism and sexism in peer-review. Nature, 387, 341–343.

Evaluating job candidates can be biased

Moss-Racusin, C. A., Dovidio, J. F., Brescoll, V. L., Graham, M. J., & Handelsman, J. (2012). Science faculty’s subtle gender biases favor male students. Proceedings of the National Academy of Sciences of the United States of America, 109(41), 16474–16479. doi:10.1073/pnas.1211286109

Prize nominations and winning can be biased

Lincoln, A. E., Pincus, S., Koster, J. B., & Leboy, P. S. (2012). The Matilda Effect in science: Awards and prizes in the US, 1990s and 2000s. Social Studies of Science, 42(2), 307-320. doi:10.1177/0306312711435830

Research performance evaluations can be biased

Paludi, M. A., & Strayer, L. A. (1985). What’s in an Author’s Name? Differential Evaluations of Performance as a Function of Author’s Name. Sex Roles, 12(3/4), 353–361.

How music sounds can be biased

Goldin, C., & Rouse, C. (2000). Orchestrating Impartiality: The Impact of “Blind” Auditions on Female Musicians. The American Economic Review, 90(4), 715–741.

Physical space matters

Cheryan, S., Plaut, V. C., Davies, P. G., & Steele, C. M. (2009). Ambient belonging: How stereotypical cues impact gender participation in computer science. Journal of Personality and Social Psychology, 97(6), 1045-1060. doi:10.1037/a0016239

Master, A., Cheryan, S., & Meltzoff, A. N. (2016). Computing whether she belongs: Stereotypes undermine girls’ interest and sense of belonging in computer science. Journal of Educational Psychology, 108(3), 424-437.

Feature-length documentary: Picture a Scientist (2020)


Greenwald, A. G., & Banaji, M. R. (1995). Implicit social cognition: Attitudes, self-esteem, and stereotypes. Psychological Review, 102(1), 4–27.

Greenwald, A. G., & Lai, C. K. (2020). Implicit social cognition. Annual Review of Psychology, 71, 419–445.

Greenwald, A. G., & Krieger, L. H. (2006). Implicit Bias: Scientific Foundations. California Law Review, 94(4), 945–968.