
Discrimination Isn’t Just Unethical — It’s Inefficient

In many cases, hiring decisions that consider race and gender lead to worse outcomes.

Based on research by Diana Jue-Rajasingh (Rice Business), Felipe A. Csaszar (Michigan) and Michael Jensen (Michigan)

Using biased criteria in hiring makes decisions more complex and predictions of job performance less accurate.

  • "Statistical discrimination theory" argues that companies should consider group characteristics like race or gender when hiring, assuming this information to be helpful in predicting productivity.
  • But new research challenges this school of thought, showing that discriminatory cues result in less accurate hiring decisions in many cases.
  • In fact, focusing on fewer, more relevant factors can improve the accuracy of hiring predictions.

The Latin phrase scientia potentia est translates to “knowledge is power.” (A related phrase, sapientia potentia est, means “wisdom is power,” which might make a nice tagline for Rice Business Wisdom.)

In the world of business, there’s a school of thought that takes “knowledge is power” to an extreme. It’s called statistical discrimination theory. This framework suggests that companies should use all available information to make decisions and maximize profits, including the group characteristics of potential hires — such as race and gender — that correlate with (but do not cause) productivity.

According to statistical discrimination theory, if a hiring manager is deciding between equally qualified candidates — let’s say one white person and one Black person — they’re obligated to use racial statistics to the company’s advantage. For instance, if data show that white employees tend to have larger networks and more professional development opportunities, the hiring manager should choose the white candidate. The group characteristics point to a potentially more productive employee.

For adherents of this theory, the morality of such discrimination is beside the point. What counts is the ability to more accurately and objectively predict company outcomes. Since managers often cannot directly observe a job candidate’s productivity, data about group characteristics provide a supposedly empirical basis for the hiring decision. In fact, the thinking goes, such an approach eliminates the potential for bias in favor of impartial information. The more managers know about a candidate’s group characteristics, the better equipped they are to hire the candidate that best serves company interests.

A recent study challenges the premise of statistical discrimination theory. Researchers Diana Jue-Rajasingh (Rice Business), Felipe A. Csaszar (Michigan) and Michael Jensen (Michigan) find that hiring outcomes actually improve when decision-makers ignore information about group characteristics like race and gender.

Here's Why “Less is More”

Statistical discrimination theory assumes a correlation between individual productivity and group characteristics (e.g., race and gender). But Jue-Rajasingh and her colleagues highlight three factors that undercut that assumption:

  • Environmental uncertainty
  • Biased interpretations of productivity
  • Decision-maker inconsistency

This third factor plays the biggest role in the researchers' model. “For statistical discrimination theory to work,” Jue-Rajasingh says, “it must assume that managers are infallible and decision-making conditions are optimal.”

Indeed, when accounting for uncertainty, inconsistency and interpretive bias, the researchers found that using information about group characteristics actually reduces the accuracy of job performance predictions.

That’s because the more information you include in the decision-making process, the more complex that process becomes. Complex processes make it more difficult to navigate uncertain environments and create more space for managers to make mistakes. It seems counterintuitive, but when firms use less information and keep their processes simple, they are more accurate in predicting the productivity of their hires.

The less-is-more strategy is known as a “heuristic.” Heuristics are simple, efficient rules or mental shortcuts that help decision-makers navigate complex environments and make judgments more quickly and with less information. In the context of this study, published in Organization Science, the heuristic approach suggests that by focusing on fewer, more relevant cues, managers can make better hiring decisions.

Two Types of Information "Cues"

The “less is more” heuristic works better than statistical discrimination theory largely because decision-makers are inconsistent in how they weight the available information. To account for this inconsistency, Jue-Rajasingh and her colleagues created a model that reflects the “noise” of external factors, such as a decision-maker’s mood or the ambiguity of certain information.

The model breaks the decision-making process into two main components: the environment and the decision maker.

In the environment component, there are two types of information, or “cues,” about job candidates. First, there’s the unobservable, causal cue (e.g., programming ability), which directly relates to job performance. Second, there's the observable, discriminatory cue (e.g., race or gender), which doesn't affect how well someone can do the job but, because of historical and structural inequities, can appear statistically correlated with it.

Even if the decision-maker knows they shouldn't rely too much on information like race or gender, they might still use it to predict productivity. But job descriptions change, contexts are unstable, and people don’t consistently consider all variables. Given the inconsistency of decision-makers and the environmental noise surrounding discriminatory cues, considering this information is ultimately counterproductive.
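
To make this mechanism concrete, here is a minimal Monte Carlo sketch in Python. It is not the researchers’ formal model; the correlation between the group cue and ability, the noise levels, and the weights are all illustrative assumptions. The simulation generates an unobservable causal cue (ability), a noisy observable signal of it, and a weakly correlated discriminatory cue, then compares predictive accuracy when an inconsistent decision-maker uses both cues versus only the signal of the causal cue.

```python
# A minimal sketch (not the authors' formal model) of how an inconsistently
# weighted, weakly informative group cue can hurt predictive accuracy.
# All parameter values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # number of simulated candidates

# Environment: unobservable causal ability drives performance,
# plus environmental uncertainty.
ability = rng.normal(size=N)
performance = ability + rng.normal(scale=1.0, size=N)

# What the decision-maker observes:
#  - a noisy signal of the causal cue (e.g., an imperfect skills assessment)
#  - a group cue that is only weakly correlated with ability and has no
#    causal effect on performance
signal = ability + rng.normal(scale=0.5, size=N)
group_cue = 0.2 * ability + np.sqrt(1 - 0.2**2) * rng.normal(size=N)

# Decision-maker inconsistency: the weight placed on each cue jitters from
# one decision to the next ("noise" in how information is combined).
w_signal = 1.0 + rng.normal(scale=0.5, size=N)
w_group = 0.5 + rng.normal(scale=0.5, size=N)

# Two judgment policies: use both cues vs. ignore the group cue entirely.
judgment_both = w_signal * signal + w_group * group_cue
judgment_less = w_signal * signal

def accuracy(judgment):
    """Predictive accuracy: correlation between judgment and realized performance."""
    return np.corrcoef(judgment, performance)[0, 1]

print(f"accuracy using both cues:    {accuracy(judgment_both):.3f}")
print(f"accuracy ignoring group cue: {accuracy(judgment_less):.3f}")
```

Under these assumed parameters, the judgment that ignores the group cue correlates more strongly with realized performance: the weak predictive value of the group cue is outweighed by the extra noise that inconsistent weighting introduces. Different assumptions would change the exact numbers, but this is the “less is more” effect the study describes.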

Ultimately, Jue-Rajasingh and her colleagues find that avoiding information like race and gender leads to more accurate predictions of job performance. The fewer discriminatory cues decision-makers rely on, the less likely their process will lead to errors.

With the advent of AI, it could become easier to justify statistical discrimination theory, since human inconsistency would be removed from the equation. Still, AI systems are often trained on biased data, and their use in hiring must be carefully examined to prevent worsening inequity.

 

For more, see Csaszar et al., “When Less Is More: How Statistical Discrimination Can Decrease Predictive Accuracy,” Organization Science 34, no. 4 (2023): 1383–1399. https://doi.org/10.1287/orsc.2022.1626.


 
