Using biased statistics in hiring makes it more difficult to predict job performance. Photo via Getty Images

The Latin phrase scientia potentia est translates to “knowledge is power.”

In the world of business, there’s a school of thought that takes “knowledge is power” to an extreme. It’s called statistical discrimination theory. This framework suggests that companies should use all available information to make decisions and maximize profits, including the group characteristics of potential hires — such as race and gender — that correlate with (but do not cause) productivity.

Statistical discrimination theory suggests that if there's a choice between equally qualified candidates — let's say, a man and a woman — the hiring manager should use gender-based statistics to the company's benefit. If there's data showing that male employees typically have larger networks and more access to professional development opportunities, the hiring manager should select the male candidate, believing such information points to a more productive employee.

Recent research suggests otherwise.

A peer-reviewed study out of Rice Business and Michigan Ross undercuts the premise of statistical discrimination theory. According to researchers Diana Jue-Rajasingh (Rice Business), Felipe A. Csaszar (Michigan) and Michael Jensen (Michigan), hiring outcomes actually improve when decision-makers ignore statistics that correlate employee productivity with characteristics like race and gender.

Here's Why “Less is More”

Statistical discrimination theory assumes a correlation between individual productivity and group characteristics (e.g., race and gender). But Jue-Rajasingh and her colleagues highlight three factors that undercut that assumption:

  • Environmental uncertainty
  • Biased interpretations of productivity
  • Decision-maker inconsistency

This third factor plays the biggest role in the researchers' model. “For statistical discrimination theory to work,” Jue-Rajasingh says, “it must assume that managers are infallible and decision-making conditions are optimal.”

Indeed, when accounting for uncertainty, inconsistency and interpretive bias, the researchers found that using information about group characteristics actually reduces the accuracy of job performance predictions.

That’s because the more information you include in the decision-making process, the more complex that process becomes. Complex processes make it more difficult to navigate uncertain environments and create more space for managers to make mistakes. It seems counterintuitive, but when firms use less information and keep their processes simple, they are more accurate in predicting the productivity of their hires.

The less-is-more strategy is known as a “heuristic.” Heuristics are simple, efficient rules or mental shortcuts that help decision-makers navigate complex environments and make judgments more quickly and with less information. In the context of this study, published by Organization Science, the heuristic approach suggests that by focusing on fewer, more relevant cues, managers can make better hiring decisions.

Two Types of Information “Cues”

The “less is more” heuristic works better than statistical discrimination theory largely because decision makers are inconsistent in how they weight the available information. To account for this inconsistency, Jue-Rajasingh and her colleagues created a model that reflects the “noise” of external factors, such as a decision maker’s mood or the ambiguity of certain information.

The model breaks the decision-making process into two main components: the environment and the decision maker.

In the environment component, there are two types of information, or “cues,” about job candidates. First, there’s the unobservable, causal cue (e.g., programming ability), which directly relates to job performance. Second, there's the observable, discriminatory cue (e.g., race or gender), which doesn't affect how well someone can do the job but, because of how society has historically worked, might statistically seem connected to job skills.

Even if the decision maker knows they shouldn't rely too much on information like race or gender, they might still use it to predict productivity. But job descriptions change, contexts are unstable, and people don’t consistently consider all variables. Between the inconsistency of decision-makers and the environmental noise created by discriminatory cues, it’s ultimately counterproductive to consider this information.
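The intuition can be illustrated with a toy simulation. This is not the authors’ actual model, and every cue strength and noise level below is an illustrative assumption. It sketches the core idea: once decision-maker inconsistency is modeled as noise on the cue weights, a judgment built from a noisy causal proxy alone predicts productivity better than one that also folds in a weakly correlated group cue.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Unobservable causal cue (e.g., true ability) drives productivity.
ability = rng.normal(size=n)
productivity = ability + rng.normal(scale=0.5, size=n)

# Observable causal proxy (e.g., a work-sample score): noisy, but
# directly tied to ability.
proxy = ability + rng.normal(scale=0.7, size=n)

# Discriminatory cue: correlated with ability only through historical
# circumstance, with substantial noise of its own.
group_cue = 0.3 * ability + rng.normal(scale=1.0, size=n)

def judge(cues, weights, inconsistency=0.5):
    """Judgment = weighted sum of cues, with the weights redrawn
    noisily for every decision (decision-maker inconsistency)."""
    noisy_w = weights + rng.normal(scale=inconsistency, size=cues.shape)
    return (cues * noisy_w).sum(axis=1)

simple = judge(np.column_stack([proxy]), np.array([1.0]))
complex_ = judge(np.column_stack([proxy, group_cue]), np.array([1.0, 0.5]))

# Predictive accuracy = correlation between judgment and productivity.
acc_simple = np.corrcoef(simple, productivity)[0, 1]
acc_complex = np.corrcoef(complex_, productivity)[0, 1]
print(f"accuracy, causal proxy only:       {acc_simple:.3f}")
print(f"accuracy, adding group statistic:  {acc_complex:.3f}")
```

Under these assumed parameters, the extra cue adds more weighting noise than signal, so the simpler judgment correlates more strongly with actual productivity.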

The Bottom Line

Jue-Rajasingh and her colleagues find that avoiding gender- and race-based statistics improves the accuracy of job performance predictions. The fewer discriminatory cues decision-makers rely on, the less likely their process will lead to errors.

That said, the advent of AI could make statistical discrimination theory easier to justify, since the element of human inconsistency would be removed from the equation. But because AI is often trained on biased data, its use in hiring must be carefully examined to prevent worsening inequity.

------

This article originally ran on Rice Business Wisdom based on research by Rice University's Diana Jue-Rajasingh, Felipe A. Csaszar (Michigan) and Michael Jensen (Michigan). For more, see Csaszar, et al. “When Less is More: How Statistical Discrimination Can Decrease Predictive Accuracy.”


Houston unicorn closes $421M to fuel first phase of flagship energy project

Heating Up

Houston geothermal unicorn Fervo Energy has closed $421 million in non-recourse debt financing for the first phase of its flagship Cape Station project in Beaver County, Utah.

Fervo believes Cape Station can meet surging power demand from data centers, domestic manufacturing and an energy market seeking clean, reliable power. According to the company, Cape Station will begin delivering its first power to the grid this year and is expected to reach approximately 100 megawatts of operating capacity by early 2027. Fervo added that it plans to scale the project to 500 megawatts.

The $421 million financing package includes a $309 million construction-to-term loan, a $61 million tax credit bridge loan, and a $51 million letter of credit facility. The facilities will fund the remaining construction costs for the first phase of Cape Station, and will also support the project’s counterparty credit support requirements.
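As a quick sanity check, the three facilities described above do sum to the headline figure:

```python
# Components of Fervo's financing package, in millions of dollars,
# as stated in the announcement.
construction_to_term_loan = 309
tax_credit_bridge_loan = 61
letter_of_credit_facility = 51

total = construction_to_term_loan + tax_credit_bridge_loan + letter_of_credit_facility
print(f"${total} million")  # → $421 million
```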

Coordinating lead arrangers include Barclays, BBVA, HSBC, MUFG, RBC and Société Générale, with additional participation from Bank of America, J.P. Morgan and Sumitomo Mitsui Trust Bank, Limited, New York Branch.

“As demand for firm, clean, affordable power accelerates, EGS (Enhanced Geothermal Systems) is set to become a core energy asset class for infrastructure lenders,” Sean Pollock, managing director of project finance at RBC Capital Markets, said in a news release. “Fervo is pioneering this step change with Cape Station, a vital contribution to American energy security that RBC is proud to support.”

The oversubscribed financing marks Cape Station’s shift from early-stage and bridge funding to a long-term, non-recourse capital structure, according to the news release.

“Non-recourse financing has historically been considered out of reach for first-of-a-kind projects,” David Ulrey, CFO of Fervo Energy, said in a news release. “Cape Station disrupts that narrative. With proven oil and gas technology paired with AI-enabled drilling and exploration, robust commercial offtake, operational consistency, and an unrelenting focus on health and safety, we have shown that EGS is a highly bankable asset class.”

Fervo continues to be one of the top-funded startups in the Houston area. Prior to the latest $421 million, the company had raised about $1.5 billion. It also closed a $462 million Series E in December.

According to Axios Pro, Fervo filed for an IPO in January that would value the company between $2 billion and $3 billion.

---

This article first appeared on EnergyCapitalHTX.com.

Houston food giant Sysco to acquire competitor in $29 billion deal

Mergers & Acquisitions

Sysco, the nation's largest food distributor, will acquire supplier Restaurant Depot in a deal worth more than $29 billion.

The acquisition would create a closer link between Sysco and the customers that currently turn to Restaurant Depot for supplies needed quickly, a segment of the industry known as “cash-and-carry” wholesale.

Sysco, based in Houston, serves more than 700,000 restaurants, hospitals, schools, and hotels, supplying them with everything from butter and eggs to napkins. Those goods are typically ordered ahead of time based on how much traffic a restaurant expects.

Restaurant Depot offers memberships to mom-and-pop restaurants and other businesses, giving them access to warehouses stocked with supplies for when they run short of what they've purchased from suppliers like Sysco.

It is a fast-growing, high-margin segment, and the deal will likely mean thousands of restaurants increasingly rely on Sysco for day-to-day needs.

Restaurant Depot shareholders will receive $21.6 billion in cash and 91.5 million Sysco shares. Based on Sysco’s closing share price of $81.80 as of March 27, 2026, the deal has an enterprise value of about $29.1 billion.
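The enterprise-value figure can be roughly reproduced from the numbers above. This is a back-of-envelope check that ignores debt and other adjustments a full enterprise-value calculation would include:

```python
# Deal terms as stated: $21.6 billion in cash plus 91.5 million
# Sysco shares at the March 27, 2026 closing price of $81.80.
cash_component = 21.6e9
stock_component = 91.5e6 * 81.80   # ≈ $7.5 billion

total = cash_component + stock_component
print(f"${total / 1e9:.1f} billion")  # → $29.1 billion
```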

Restaurant Depot was founded in Brooklyn in 1976. The family-run business, then known as Jetro Restaurant Depot, has become the nation's largest cash-and-carry wholesaler.

The boards of both companies have approved the acquisition, but it would still need regulatory approval.

Shares of Sysco Corp. tumbled 13% Monday to $71.26, an initial decline some industry analysts expected given the cost of the deal.

Houston researcher builds radar to make self-driving cars safer

Eyes on the Road

A Rice University researcher is giving autonomous vehicles an “extra set of eyes.”

Current autonomous vehicles (AVs) can have an incomplete view of their surroundings, and challenges like pedestrian movement, low-light conditions and adverse weather only compound these visibility limitations.

Kun Woo Cho, a postdoctoral researcher in the lab of Rice professor of electrical and computer engineering Ashutosh Sabharwal, has developed EyeDAR to help address such issues and enhance the vehicles’ sensing accuracy. Her research was supported in part by the National Science Foundation.

EyeDAR is an orange-sized, low-power, millimeter-wave radar that could be placed at streetlights and intersections. Its design was inspired by that of the human eye. Researchers envision that the low-cost sensors could help ensure that AVs always pick up on emergent obstacles, even when the vehicles are not within proper range for their onboard sensors and when visibility is limited.

“Current automotive sensor systems like cameras and lidar struggle with poor visibility such as you would encounter due to rain or fog or in low-lighting conditions,” Cho said in a news release. “Radar, on the other hand, operates reliably in all weather and lighting conditions and can even see through obstacles.”

Signals from a typical radar system scatter when they encounter an obstacle. Some of the signal is reflected back to the source, but most of it is often lost. In the case of AVs, this means that "pedestrians emerging from behind large vehicles, cars creeping forward at intersections or cyclists approaching at odd angles can easily go unnoticed," according to Rice.

EyeDAR, however, works to capture lost radar reflections, determine their direction and report them back to the AV in a sequence of 0s and 1s.

“Like blinking Morse code,” Cho added. “EyeDAR is a talking sensor: it is a first instance of integrating radar sensing and communication functionality in a single design.”

In testing, EyeDAR resolved target directions 200 times faster than conventional radar designs.

While EyeDAR currently targets risks associated with AVs, particularly in high-traffic urban areas, researchers also believe the technology behind it could complement artificial intelligence efforts and be integrated into robots, drones and wearable platforms.

“EyeDAR is an example of what I like to call ‘analog computing,’” Cho added in the release. “Over the past two decades, people have been focusing on the digital and software side of computation, and the analog, hardware side has been lagging behind. I want to explore this overlooked analog design space.”