Everyday data like grocery store receipts can help expand access to credit and support upward mobility. Photo by Boxed Water Is Better on Unsplash

More than a billion people worldwide can’t access credit cards or loans because they lack a traditional credit score. Without a formal borrowing history, banks often view them as unreliable and risky. To reach these borrowers, lenders have begun experimenting with alternative signals of financial reliability, such as consistent utility or mobile phone payments.

New research from Rice Business builds on that approach. Previous work by assistant professor of marketing Jung Youn Lee showed that everyday data like grocery store receipts can help expand access to credit and support upward mobility. Her latest study extends this insight, using broader consumer spending patterns to explore how alternative credit scores could be created for people with no credit history.

Forthcoming in the Journal of Marketing Research, the study finds that when lenders use data from daily purchases — at grocery, pharmacy, and home improvement stores — credit card approval rates rise. The findings give lenders a powerful new tool to connect the unbanked to credit, laying the foundation for long-term financial security and stronger local economies.

Turning Shopping Habits into Credit Data

To test the impact of retail transaction data on credit card approval rates, the researchers partnered with a Peruvian company that owns both retail businesses and a credit card issuer. In Peru, only 22% of people report borrowing money from a formal financial institution or using a mobile money account.

The team combined three sets of data: credit card applications from the company, loyalty card transactions, and individuals’ credit histories from Peru’s financial regulatory authority. The company’s point-of-sale data included the types of items purchased, how customers paid, and whether they bought sale items.

The final sample included 46,039 credit card applicants who had received a single credit decision, had no delinquent loans, and made at least one purchase between January 2021 and May 2022. Of these, 62% had a credit history and 38% did not.

Using this data, the researchers built an algorithm that generated credit scores based on retail purchases and predicted repayment behavior in the six months following the application. They then simulated credit card approval decisions.
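The scoring step described above can be sketched as a logistic model trained on retail features. Everything below is illustrative: the feature names, effect directions, and synthetic data are assumptions for the sketch, not the study's actual variables or model.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Hypothetical retail features per applicant (illustrative names, not
# the study's actual variables): share of purchases that were sale
# items, month-to-month spend volatility, and share of spend in staple
# categories such as grocery and pharmacy.
X = np.column_stack([
    rng.uniform(0, 1, n),  # sale_item_share
    rng.uniform(0, 1, n),  # spend_volatility
    rng.uniform(0, 1, n),  # staple_share
])

# Synthetic repayment outcomes: sale-seeking, staple-heavy shoppers
# repay more often (directions loosely mirror the reported pattern;
# the magnitudes are invented).
true_logit = 1.5 * X[:, 0] - 2.0 * X[:, 1] + 1.0 * X[:, 2]
y = (rng.uniform(0, 1, n) < 1 / (1 + np.exp(-true_logit))).astype(float)

# Fit logistic regression by plain gradient descent (a stand-in for
# whatever model the researchers actually used).
w, b = np.zeros(3), 0.0
for _ in range(3000):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - y)) / n
    b -= 0.5 * (p - y).mean()

# The fitted probabilities act as a retail-based credit score.
retail_score = 1 / (1 + np.exp(-(X @ w + b)))
print(f"score range: {retail_score.min():.2f} to {retail_score.max():.2f}")
```

A lender would then approve applicants whose score clears a chosen cutoff, which is what the simulated approval decisions amount to.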

Retail Scores Boost Approvals, Reduce Defaults

The researchers found that using retail purchase data to build credit scores for people without traditional credit histories significantly increased their chances of approval. Certain shopping behaviors — such as seeking out sale items — were linked to greater reliability as borrowers.

For lenders using a fixed credit score threshold, approval rates rose from 15.5% to 47.8%. Lenders basing decisions on a target loan default rate also saw approvals rise, from 15.6% to 31.3%.
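Those two decision rules can be mimicked in a toy simulation: approve everyone above a fixed score cutoff, or approve the best-scored applicants until a target default rate would be breached. The score distribution, cutoff, and target below are invented, not the study's figures.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical predicted repayment probabilities for 10,000 applicants
# (e.g., from a retail-based score); the beta distribution is invented.
scores = rng.beta(5, 2, 10_000)

# Policy 1: fixed score threshold.
threshold = 0.7
approve_fixed = scores >= threshold

# Policy 2: target default rate. Approve applicants best-first until
# the expected default rate of the approved pool would exceed the target.
target_default = 0.05
order = np.argsort(scores)[::-1]  # best applicants first
pool_rate = np.cumsum(1 - scores[order]) / np.arange(1, scores.size + 1)
n_approved = int(np.searchsorted(pool_rate, target_default, side="right"))
approve_target = np.zeros(scores.size, dtype=bool)
approve_target[order[:n_approved]] = True

print(f"fixed-threshold approvals: {approve_fixed.mean():.1%}")
print(f"target-default approvals:  {approve_target.mean():.1%}")
```

Because applicants are added worst-last, the pool's expected default rate only rises, so the binary search finds the largest pool that still meets the target.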

“The key takeaway is that we can create a new kind of credit score for people who lack traditional credit histories, using their retail shopping behavior to expand access to credit,” Lee says. “This approach benefits unbanked applicants regardless of a lender’s specific goals — though the size of the benefit may vary.”

Applicants without credit histories who were approved using the retail-based credit score were also more likely to repay their loans, indicating genuine creditworthiness. Among first-time borrowers, the default rate dropped from 4.74% to 3.31% when lenders incorporated retail data into their decisions and kept approval rates constant.

For applicants with existing credit histories, the opposite was true: approval rates fell slightly, from 87.5% to 84.5%, as the new model more effectively screened out high-risk applicants.

Expanding Access, Managing Risk

The study offers clear takeaways for banks and credit card companies. Lenders who want to approve more applications without taking on too much risk can use parts of the researchers’ model to design their own credit scoring tools based on customers’ shopping habits.

Still, Lee says, the process must be transparent. Consumers should know how their spending data might be used and decide for themselves whether the potential benefits outweigh privacy concerns. That means lenders must clearly communicate how data is collected, stored, and protected—and ensure customers can opt in with informed consent.

Banks should also keep a close eye on first-time borrowers to make sure they’re using credit responsibly. “Proactive customer management is crucial,” Lee says. That might mean starting people off with lower credit limits and raising them gradually as they demonstrate good repayment behavior.

This approach can also discourage people from trying to “game the system” by changing their spending patterns temporarily to boost their retail-based credit score. Lenders can design their models to detect that kind of behavior, too.

The Future of Credit

One risk of using retail data is that lenders might unintentionally reject applicants who would have qualified under traditional criteria — say, because of one unusual purchase. Lee says banks can fine-tune their models to minimize those errors.

She also notes that the same approach could eventually be used for other types of loans, such as mortgages or auto loans. Combined with her earlier research showing that grocery purchase data can predict defaults, the findings strengthen the case that shopping behavior can reliably signal creditworthiness.

“If you tend to buy sale items, you’re more likely to be a good borrower. Or if you often buy healthy food, you’re probably more creditworthy,” Lee explains. “This idea can be applied broadly, but models should still be customized for different situations.”

---

This article originally appeared on Rice Business Wisdom. Written by Deborah Lynn Blumberg.

Anderson, Lee, and Yang (2025). “Who Benefits from Alternative Data for Credit Scoring? Evidence from Peru,” Journal of Marketing Research.

A map of U.S. data centers. Courtesy of Rice Business Wisdom

Your data center is either closer than you think or much farther away

houston voices

A new study shows why some facilities cluster in cities for speed and access, while others move to rural regions in search of scale and lower costs. Based on research by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard).

Key findings:

  • Third-party colocation centers are physical facilities in close proximity to firms that use them, while cloud providers operate large data centers from a distance and sell access to virtualized computing resources as on‑demand services over the internet.
  • Hospitals and financial firms often require urban third-party centers for low latency and regulatory compliance, while batch processing and many AI workloads can operate more efficiently from lower-cost cloud hubs.
  • For policymakers trying to attract data centers, access to reliable power, water and high-capacity internet matters more than tax incentives.

Recent outages and the surge in AI-driven computing have made data center siting decisions more consequential than ever, especially as energy and water constraints tighten. Communities invest public dollars on the promise of jobs and growth, while firms weigh long-term commitments to land, power and connectivity.

Against that backdrop, a critical question comes into focus: Where do data centers get built — and what actually drives those decisions?

A new study by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard Business School) provides the first large-scale statistical analysis of data center location strategies across the United States. It offers policymakers and firms a clearer starting point for understanding how different types of data centers respond to economic and strategic incentives.

Forthcoming in the journal Strategy Science, the study examines two major types of infrastructure: third-party colocation centers that lease server space to multiple firms, and hyperscale cloud centers owned by providers like Amazon, Google and Microsoft.

Two Models, Two Location Strategies

The study draws on pre-pandemic data from 2018 and 2019, a period of relative geographic stability in supply and demand. This window gives researchers a clean baseline before remote work, AI demand and new infrastructure pressures began reshaping internet traffic patterns.

The findings show that data centers follow a bifurcated geography. Third-party centers cluster in dense urban markets, where buyers prioritize proximity to customers despite higher land and operating costs. Cloud providers, by contrast, concentrate massive sites in a small number of lower-density regions, where electricity, land and construction are cheaper and economies of scale are easier to achieve.

Third-party data centers, in other words, follow demand. They locate in urban markets where firms in finance, healthcare and IT value low latency, secure storage, and compliance with regulatory standards.

Using county-level data, the researchers modeled how population density, industry mix and operating costs predict where new centers enter. Every U.S. metro with more than 700,000 residents had at least one third-party provider, while many mid-sized cities had none.

This pattern challenges common assumptions. Third-party facilities are more distributed across urban America than prevailing narratives suggest.

Customer proximity matters because some sectors cannot absorb delay. In critical operations, even slight pauses can have real consequences. For hospital systems, lag can affect performance and risk exposure. And in high-frequency trading, milliseconds can determine whether value is captured or lost in a transaction.

“For industries where speed is everything, being too far from the physical infrastructure can meaningfully affect performance and risk,” Pan Fang says. “Proximity isn’t optional for sectors that can’t absorb delay.”

The Economics of Distance

For cloud providers, the picture looks very different. Their decisions follow a logic shaped primarily by cost and scale. Because cloud services can be delivered from afar, firms tend to build enormous sites in low-density regions where power is cheap and land is abundant.

These facilities can draw hundreds of megawatts of electricity and operate with far fewer employees than urban centers. “The cloud can serve almost anywhere,” Pan Fang says, “so location is a question of cost before geography.”
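The trade-off behind both location strategies can be sketched as picking the cheapest facility type that meets a workload's latency requirement. The facility types, latency figures, and costs below are hypothetical round numbers, not data from the study.

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    round_trip_ms: float      # network latency to end users
    cost_per_kw_month: float  # all-in power and space cost

# Illustrative options, roughly ordered from demand-driven to cost-driven.
SITES = [
    Site("urban colocation", round_trip_ms=2.0, cost_per_kw_month=250.0),
    Site("regional cloud hub", round_trip_ms=15.0, cost_per_kw_month=120.0),
    Site("rural hyperscale", round_trip_ms=40.0, cost_per_kw_month=70.0),
]

def choose_site(max_latency_ms: float) -> Site:
    """Cheapest site that satisfies the workload's latency bound."""
    feasible = [s for s in SITES if s.round_trip_ms <= max_latency_ms]
    if not feasible:
        raise ValueError("no site meets the latency requirement")
    return min(feasible, key=lambda s: s.cost_per_kw_month)

# A trading workload needs single-digit milliseconds; a batch AI
# training job tolerates a much looser bound.
print(choose_site(5.0).name)
print(choose_site(100.0).name)
```

Latency-bound sectors such as trading and hospital systems land in the urban column; cost-bound batch workloads land in the rural one.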

The study finds that cloud infrastructure clusters around network backbones and energy economics, not talent pools. Well-known hubs like Ashburn, Virginia — often called “Data Center Alley” — reflect this logic, having benefited from early network infrastructure that made them natural convergence points for digital traffic.

Local governments often try to lure data centers with tax incentives, betting they will create high-tech jobs. But the study suggests other factors matter more to cloud providers, including construction costs, network connectivity and access to reliable, affordable electricity.

When cloud centers need a local presence, distance can sometimes become a constraint. Providers often address this by working alongside third-party operators. “Third-party centers can complement cloud firms when they need a foothold closer to customers,” Pan Fang says.

That hybrid pattern — massive regional hubs complementing strategic colocation — may define the next phase of data center growth.

Looking ahead, shifts in remote work, climate resilience, energy prices and AI-driven computing may reshape where new facilities go. Some workloads may move closer to users, while others may consolidate into large rural hubs. Emerging data-sovereignty rules could also redirect investment beyond the United States.

“The cloud feels weightless,” Pan Fang says, “but it rests on real choices about land, power and proximity.”

---

This article originally appeared on Rice Business Wisdom. Written by Scott Pett.

Pan Fang and Greenstein (2025). “Where the Cloud Rests: The Economic Geography of Data Centers,” forthcoming in Strategy Science.

Legislators carved out $715 million for nuclear, semiconductor, and other economic development projects, and a potential $1 billion pool of tax incentives to support research-and-development projects. Photo via Getty Images

How Houston's innovation sector fared in 2025 Texas legislative session

That's a Wrap

The Greater Houston Partnership is touting a number of victories during the recently concluded Texas legislative session that will or could benefit the Houston area. They range from billions of dollars for dementia research to millions of dollars for energy projects.

“These wins were only possible through deep collaboration, among our coalition partners, elected officials, business and community leaders, and the engaged members of the Partnership,” according to a partnership blog post. “Together, we’ve demonstrated how a united voice for Houston helps drive results that benefit all Texans.”

In terms of business innovation, legislators carved out $715 million for nuclear, semiconductor, and other economic development projects, and a potential $1 billion pool of tax incentives through 2029 to support research-and-development projects. The partnership said these investments “position Houston and Texas for long-term growth.”

Dementia institute

One of the biggest legislative wins cited by the Greater Houston Partnership was passage of legislation sponsored by Sen. Joan Huffman, a Houston Republican, to provide $3 billion in funding over 10 years for the Dementia Prevention and Research Institute of Texas. Voters will decide in November on a ballot initiative that would set aside the $3 billion for the new institute.

The dementia institute would be structured much like the Cancer Prevention and Research Institute of Texas (CPRIT), a state agency that provides funding for cancer research in the Lone Star State. Since its founding in 2008, CPRIT has awarded nearly $3.9 billion in research grants.

“By establishing the Dementia Prevention and Research Institute of Texas, we are positioning our state to lead the charge against one of the most devastating health challenges of our time,” Huffman said. “With $3 billion in funding over the next decade, we will drive critical research, develop new strategies for prevention and treatment, and support our healthcare community. Now, it’s up to voters to ensure this initiative moves forward.”

More than 500,000 Texans suffer from some form of dementia, including Alzheimer’s disease, according to Lt. Gov. Dan Patrick.

“With a steadfast commitment, Texas has the potential to become a world leader in combating [dementia] through the search for effective treatments and, ultimately, a cure,” Patrick said.

Funding for education

In the K-12 sector, lawmakers earmarked an extra $195 million for Houston ISD, $126.7 million for Cypress-Fairbanks ISD, $103.1 million for Katy ISD, $80.6 million for Fort Bend ISD, and $61 million for Aldine ISD, the partnership said.

In higher education, legislators allocated:

  • $1.17 billion for the University of Houston College of Medicine, University of Texas Health Science Center at Houston, UT MD Anderson Cancer Center, and Baylor College of Medicine
  • $922 million for the University of Houston System
  • $167 million for Texas Southern University
  • $10 million for the Center for Biotechnology at San Jacinto College

Infrastructure

In the infrastructure arena, state lawmakers:

  • Approved $265 million for Houston-area water and flood mitigation projects, including $100 million for the Lynchburg Pump Station
  • Created the Lake Houston Dredging and Maintenance District
  • Established a fund for the Gulf Coast Protection District to supply $550 million for projects to make the coastline and ship channel more resilient

"Nuclear power renaissance"

House Bill 14 (HB 14) aims to lead a “nuclear power renaissance in the United States,” according to Texas Gov. Greg Abbott’s office. HB 14 establishes the Texas Advanced Nuclear Energy Office and allocates $350 million for nuclear development and deployment. Two nuclear power plants currently operate in Texas, generating 10 percent of the energy that feeds the Electric Reliability Council of Texas (ERCOT) power grid.

“This initiative will also strengthen Texas’ nuclear manufacturing capacity, rebuild a domestic fuel cycle supply chain, and train the future nuclear workforce,” Abbott said in a news release earlier this year.

One of the beneficiaries of Texas’ nuclear push could be Washington, D.C.-based Last Energy, which plans to build 30 micro-nuclear reactors near Abilene to serve power-gobbling data centers across the state. Houston-based Pelican Energy Partners also might be able to take advantage of the legislation after raising a $450 million fund to invest in companies that supply nuclear energy services and equipment.

Reed Clay, president of the Texas Nuclear Alliance, called this legislation “the most important nuclear development program of any state.”

“It is a giant leap forward for Texas and the United States, whose nuclear program was all but dead for decades,” said Clay. “With the passage of HB 14 and associated legislation, Texas is now positioned to lead a nuclear renaissance that is rightly seen as imperative for the energy security and national security of the United States.”

---

A version of this article first appeared on EnergyCapitalHTX.com.

Houston geothermal unicorn Fervo officially files for IPO

going public

Fervo Energy has officially filed for an IPO.

The Houston-based geothermal unicorn filed a registration statement on Form S-1 with the U.S. Securities and Exchange Commission on April 17 to list its Class A common stock on the Nasdaq exchange. Fervo intends to be listed under the ticker symbol "FRVO."

The number and price of the shares have not yet been determined, according to a news release from Fervo. J.P. Morgan, BofA Securities, RBC Capital Markets and Barclays are leading the offering.

The highly anticipated filing comes as Fervo readies its flagship Cape Station geothermal project to deliver its first power later this year.

"Today, miles-long lines for gasoline have been replaced by lines for electricity. Tech companies compete for megawatts to claim AI market share. Manufacturers jockey for power to strengthen American industry. Utilities demand clean, firm electricity to stabilize the grid," Fervo CEO Tim Latimer shared in the filing. "Fervo is prepared to serve all of these customers. Not with complex, idiosyncratic projects but with a simplified, standardized product capable of delivering around-the-clock, carbon-free power using proven oil and gas technology."

Fervo has been preparing to file for an IPO for months. Axios Pro first reported that the company "quietly" filed for an IPO in January and estimated it would be valued between $2 billion and $3 billion.

Fervo also closed $421 million in non-recourse debt financing for the first phase of Cape Station last month and raised a $462 million Series E in December. The company also announced the addition of four heavyweights to its board of directors last week, including Meg Whitman, former CEO of eBay, Hewlett-Packard, and Spring-based HPE.

Fervo reported a net loss of $70.5 million for the 2025 fiscal year in the S-1 filing and a loss of $41.1 million in 2024.

Tracxn.com estimates that Fervo has raised $1.12 billion over 12 funding rounds. The company was founded in 2017 by Latimer and CTO Jack Norbeck.

---

This article originally appeared on our sister site, EnergyCapitalHTX.com.

New UT Austin med center, anchored by MD Anderson, gets $1 billion gift

Future of Health

A donation announced Tuesday, April 21, breaks a major record at the University of Texas at Austin. Michael and Susan Dell are now UT Austin's first supporters to give $1 billion. In response, the university will create the UT Dell Campus for Advanced Research and the UT Dell Medical Center to "advance human health," per a press release.

The release also records "significant support" for undergraduate scholarships, student housing, and the Texas Advanced Computing Center for supercomputing research.

Both the new research campus and the UT Dell Medical Center will integrate advanced computing into their research and practices. At the medical center, the university hopes that will lead to "earlier detection, more precise and personalized care, and better health outcomes." The University of Texas MD Anderson Cancer Center will also be integrated into the new medical center.

That comes with goals measured in tens: raise $10 billion and rank among the top 10 medical centers in the U.S., both within the next decade.

In the shorter term, the university will break ground on the medical center with architecture firm Skidmore, Owings & Merrill (SOM) "later this year."

“UT Austin, where Dell Technologies was founded from a dorm room, has always been a place where bold ideas become real-world impact,” said Michael and Susan Dell in a joint statement.

They continued, “What makes this moment so meaningful is the opportunity to build something that brings every part of the journey together — from how students learn, to how discoveries are made, to how care reaches families. By bringing together medicine, science and computing in one campus designed for the AI era, UT can create more opportunity, deliver better outcomes, and build a stronger future for communities across Texas and beyond.”

This is the second major gift this year for the planned multibillion-dollar medical center. In January, Tench Coxe, a former venture capitalist who’s a major shareholder in chipmaking giant Nvidia, and Simone Coxe, co-founder and former CEO of the Blanc & Otus PR firm, contributed $100 million.

Baylor scientist lands $2M grant to explore links between viruses and Alzheimer’s

Alzheimer’s research

A Baylor College of Medicine scientist will begin exploring the possible link between Alzheimer’s disease and viral infections thanks to a $2 million grant awarded in March.

Dr. Ryan S. Dhindsa is an assistant professor of pathology & immunology at Baylor and a principal investigator at Texas Children’s Duncan Neurological Research Institute (Duncan NRI). He hypothesizes that Alzheimer’s may have some link to previous viral infections contracted by the patient. To study this intriguing possibility, the American Brain Foundation has awarded him the Cure One, Cure Many Award in neuroinflammation.

“It is an honor to receive this support from the Cure One, Cure Many Award. Viral infections are emerging as a major, underappreciated driver of Alzheimer's disease, and this award will allow our team to conduct the most comprehensive screen of viral exposures and host genetics in Alzheimer's to date, spanning over a million individuals,” Dhindsa said in a news release. “Our goal is to identify which viruses matter most, why some people are more vulnerable than others, and ultimately move the field closer to new therapeutic strategies for patients.”

Roughly 150 million people worldwide will suffer from Alzheimer’s by 2050, making it the most common cause of dementia in the world. Despite this, scientists are still at a loss as to what exactly causes it.

Dhindsa’s research is part of a new range of theories that certain viral infections may trigger Alzheimer’s. His team will take a two-fold approach. First, they will analyze the medical records of more than a million individuals looking for patterns. Second, they will analyze viral DNA in stem cell-derived brain cells to see how the infections could contribute to neurological decay. The scale of the genomic data gathering is unprecedented and may highlight a link that traditional studies have missed.

Also joining the project are Dr. Caleb Lareau of Memorial Sloan Kettering Cancer Center and Dr. Artem Babaian of the University of Toronto. Should a link be found, it would open the door to using anti-virals to prevent or treat Alzheimer’s.