Discrimination Isn’t Just Unethical — It’s Inefficient
New research undermines the premise of “statistical discrimination theory.”
Based on research by Diana Jue-Rajasingh (Rice Business), Felipe A. Csaszar (Michigan) and Michael Jensen (Michigan)
“For statistical discrimination theory to work,” Jue-Rajasingh says, “it must assume that managers are infallible and decision-making conditions are optimal.”
Key findings:
- “Statistical discrimination theory” argues that companies should consider group characteristics like race or gender when hiring, on the assumption that this information helps predict productivity.
- But new research challenges this school of thought, showing that relying on discriminatory cues often leads to less accurate hiring decisions.
- In fact, focusing on fewer, more relevant factors can improve the accuracy of hiring predictions.
The Latin phrase scientia potentia est translates to “knowledge is power.” (A related phrase, sapientia potentia est, means “wisdom is power,” which might make a nice tagline for Rice Business Wisdom.)
In the world of business, there’s a school of thought that takes “knowledge is power” to an extreme. It’s called statistical discrimination theory. This framework suggests that companies should use all available information to make decisions and maximize profits, including the group characteristics of potential hires — such as race and gender — that correlate with (but do not cause) productivity.
Statistical discrimination theory suggests that if there’s a choice between equally qualified candidates — let’s say, a man and a woman — the hiring manager should use gender-based statistics to the company’s benefit. If there’s data showing that male employees typically have larger networks and more access to professional development opportunities, the hiring manager should select the male candidate, believing such information points to a more productive employee.
Recent research reveals the fault in this logic.
A peer-reviewed study out of Rice Business and Michigan Ross undercuts the premise of statistical discrimination theory. According to researchers Diana Jue-Rajasingh (Rice Business), Felipe A. Csaszar (Michigan) and Michael Jensen (Michigan), hiring outcomes actually improve when decision-makers ignore statistics that correlate employee productivity with characteristics like race and gender.
Here’s Why “Less Is More”
Statistical discrimination theory assumes a correlation between individual productivity and group characteristics (e.g., race and gender). But Jue-Rajasingh and her colleagues highlight three factors that undercut that assumption:
- Environmental uncertainty
- Biased interpretations of productivity
- Decision-maker inconsistency
This third factor plays the biggest role in the researchers’ model. “For statistical discrimination theory to work,” Jue-Rajasingh says, “it must assume that managers are infallible and decision-making conditions are optimal.”
Indeed, when accounting for uncertainty, inconsistency and interpretive bias, the researchers found that using information about group characteristics actually reduces the accuracy of job performance predictions.
That’s because the more information you include in the decision-making process, the more complex that process becomes. Complex processes make it more difficult to navigate uncertain environments and create more space for managers to make mistakes. It seems counterintuitive, but when firms use less information and keep their processes simple, they are more accurate in predicting the productivity of their hires.
The less-is-more strategy is known as a “heuristic.” Heuristics are simple, efficient rules or mental shortcuts that help decision-makers navigate complex environments and make judgments more quickly and with less information. In the context of this study, published in Organization Science, the heuristic approach suggests that by focusing on fewer, more relevant cues, managers can make better hiring decisions.
Two Types of Information “Cues”
The “less is more” heuristic works better than statistical discrimination theory largely because decision-makers are inconsistent in how they weight the available information. To account for this inconsistency, Jue-Rajasingh and her colleagues created a model that reflects the “noise” introduced by factors such as a decision-maker’s mood or the ambiguity of certain information.
The model breaks the decision-making process into two main components: the environment and the decision-maker.
In the environment component, there are two types of information, or “cues,” about job candidates. First, there’s the unobservable, causal cue (e.g., programming ability), which directly drives job performance. Second, there’s the observable, discriminatory cue (e.g., race or gender), which doesn’t affect how well someone can do the job but, because of longstanding social inequities, may be statistically correlated with it.
Even if decision-makers know they shouldn’t rely too heavily on information like race or gender, they might still use it to predict productivity. But job descriptions change, contexts are unstable, and people don’t weigh all the variables consistently. Between decision-maker inconsistency and environmental noise, relying on discriminatory cues ultimately does more harm than good.
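To make that intuition concrete, here is a minimal simulation sketch, not the authors’ model: it assumes a made-up, lens-model-style setup in which a hypothetical manager predicts productivity from a noisy read of a candidate’s skill and can optionally add a group cue that correlates weakly with that skill. “Inconsistency” is modeled as random wobble in the weights the manager applies from one candidate to the next; all numbers (group gap, noise levels, base weights) are illustrative assumptions.

```python
# A hedged sketch of the "less is more" effect, with made-up parameters;
# it is not the published model, just an illustration of the mechanism.
import numpy as np

rng = np.random.default_rng(0)
N = 200_000  # simulated job candidates


def hiring_accuracy(inconsistency_sd, use_group_cue):
    """Return the correlation between predicted and true productivity."""
    group = rng.choice([-0.5, 0.5], size=N)        # observable group cue (e.g., gender)
    skill = 0.6 * group + rng.normal(0, 1, N)      # true skill; the group gap is correlational
    productivity = skill + rng.normal(0, 0.6, N)   # environmental uncertainty
    skill_signal = skill + rng.normal(0, 1.0, N)   # the manager's noisy read of skill

    # Standardize both cues so their weights are on a comparable scale.
    x = (skill_signal - skill_signal.mean()) / skill_signal.std()
    d = (group - group.mean()) / group.std()

    # Base weights (roughly what a perfectly calibrated manager would use in
    # this toy setup), plus per-candidate "inconsistency" noise on each weight used.
    w_x = 0.7 + rng.normal(0, inconsistency_sd, N)
    prediction = w_x * x
    if use_group_cue:
        w_d = 0.15 + rng.normal(0, inconsistency_sd, N)
        prediction = prediction + w_d * d

    return np.corrcoef(prediction, productivity)[0, 1]


for sd in (0.0, 0.6):
    only_skill = hiring_accuracy(sd, use_group_cue=False)
    with_group = hiring_accuracy(sd, use_group_cue=True)
    print(f"inconsistency={sd}: skill cue only {only_skill:.3f}, "
          f"skill + group cue {with_group:.3f}")

# Approximate expected output under these assumptions:
#   inconsistency=0.0: skill cue only ~0.63, skill + group cue ~0.64
#   inconsistency=0.6: skill cue only ~0.47, skill + group cue ~0.42
```

Under these assumed numbers, a perfectly consistent decision-maker squeezes a sliver of extra accuracy out of the group cue, but as soon as the weights wobble, ignoring it produces clearly better predictions, the same reversal the researchers describe.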
The Bottom Line
Jue-Rajasingh and her colleagues find that avoiding gender- and race-based statistics improves the accuracy of job performance predictions. The fewer discriminatory cues decision-makers rely on, the less likely their process is to produce errors.
That said, the advent of AI could make statistical discrimination theory easier to justify, since automation would remove human inconsistency from the equation. But because AI is often trained on biased data, its use in hiring must be carefully examined to avoid worsening inequity.
For more, see Csaszar et al., “When Less Is More: How Statistical Discrimination Can Decrease Predictive Accuracy,” Organization Science 34.4 (2023): 1383–1399. https://doi.org/10.1287/orsc.2022.1626.