People Use Algorithmic Advice To License The Expression of Prejudice
Much research and public discussion have focused on how the biased design of algorithms can disadvantage underrepresented social groups, particularly racial minorities and women. In contrast, the present research examines how the biased use of algorithms may disadvantage these groups. Results from two studies, with 750 participants and more than 6,000 observations, suggest that people use algorithmic advice to license expressing prejudice when evaluating Black job applicants. We find that participants with more-prejudiced racial attitudes place significantly greater weight on a hiring algorithm’s advice when it suggests lowering, rather than raising, their evaluation of a Black job applicant. This effect is attenuated, and even reversed, among participants with less-prejudiced racial attitudes. We argue that this pattern of results reflects evaluators using algorithmic advice not only to assess applicants’ quality accurately, but also to license evaluations that might otherwise appear prejudiced. These findings highlight the importance of structuring joint human–algorithm decision making to constrain, rather than license, human biases.