Everyday data like grocery store receipts can help expand access to credit and support upward mobility. Photo by Boxed Water Is Better on Unsplash

More than a billion people worldwide can’t access credit cards or loans because they lack a traditional credit score. Without a formal borrowing history, banks often view them as unreliable and risky. To reach these borrowers, lenders have begun experimenting with alternative signals of financial reliability, such as consistent utility or mobile phone payments.

New research from Rice Business builds on that approach. Previous work by assistant professor of marketing Jung Youn Lee showed that everyday data like grocery store receipts can help expand access to credit and support upward mobility. Her latest study extends this insight, using broader consumer spending patterns to explore how alternative credit scores could be created for people with no credit history.

Forthcoming in the Journal of Marketing Research, the study finds that when lenders use data from daily purchases — at grocery, pharmacy, and home improvement stores — credit card approval rates rise. The findings give lenders a powerful new tool to connect the unbanked to credit, laying the foundation for long-term financial security and stronger local economies.

Turning Shopping Habits into Credit Data

To test the impact of retail transaction data on credit card approval rates, the researchers partnered with a Peruvian company that owns both retail businesses and a credit card issuer. In Peru, only 22% of people report borrowing money from a formal financial institution or using a mobile money account.

The team combined three sets of data: credit card applications from the company, loyalty card transactions, and individuals’ credit histories from Peru’s financial regulatory authority. The company’s point-of-sale data included the types of items purchased, how customers paid, and whether they bought sale items.

The final sample included 46,039 credit card applicants who had received a single credit decision, had no delinquent loans, and made at least one purchase between January 2021 and May 2022. Of these, 62% had a credit history and 38% did not.

Using this data, the researchers built an algorithm that generated credit scores based on retail purchases and predicted repayment behavior in the six months following the application. They then simulated credit card approval decisions.
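The paper does not publish its scoring model, but the pipeline it describes can be sketched in miniature: score each applicant from retail shopping features, then approve anyone who clears a fixed score threshold. The feature names and weights below are illustrative assumptions for exposition, not the study's actual model.

```python
# Minimal sketch of a retail-based approval simulation.
# Feature names and weights are illustrative, not the study's model.

def retail_score(sale_item_share, grocery_spend_ratio, card_payment_share):
    """Toy linear score on [0, 1] built from retail shopping features."""
    raw = (0.5 * sale_item_share
           + 0.3 * grocery_spend_ratio
           + 0.2 * card_payment_share)
    return max(0.0, min(1.0, raw))  # clamp to [0, 1]

def simulate_approvals(applicants, threshold=0.5):
    """Approve applicants whose retail-based score meets a fixed threshold."""
    approved = []
    for a in applicants:
        score = retail_score(a["sale_item_share"],
                             a["grocery_spend_ratio"],
                             a["card_payment_share"])
        if score >= threshold:
            approved.append(a["id"])
    return approved

applicants = [
    {"id": 1, "sale_item_share": 0.8, "grocery_spend_ratio": 0.6, "card_payment_share": 0.7},
    {"id": 2, "sale_item_share": 0.1, "grocery_spend_ratio": 0.2, "card_payment_share": 0.1},
]
```

In the study, the analogous model was trained against actual repayment behavior in the six months after application; a lender could instead tune the threshold to hit a target default rate, the second decision rule the researchers simulated.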

Retail Scores Boost Approvals, Reduce Defaults

The researchers found that using retail purchase data to build credit scores for people without traditional credit histories significantly increased their chances of approval. Certain shopping behaviors — such as seeking out sale items — were linked to greater reliability as borrowers.

For lenders using a fixed credit score threshold, approval rates rose from 15.5% to 47.8%. Lenders basing decisions on a target loan default rate also saw approvals rise, from 15.6% to 31.3%.

“The key takeaway is that we can create a new kind of credit score for people who lack traditional credit histories, using their retail shopping behavior to expand access to credit,” Lee says. “This approach benefits unbanked applicants regardless of a lender’s specific goals — though the size of the benefit may vary.”

Applicants without credit histories who were approved using the retail-based credit score were also more likely to repay their loans, indicating genuine creditworthiness. Among first-time borrowers, the default rate dropped from 4.74% to 3.31% when lenders incorporated retail data into their decisions and kept approval rates constant.

For applicants with existing credit histories, the opposite was true: approval rates fell slightly, from 87.5% to 84.5%, as the new model more effectively screened out high-risk applicants.

Expanding Access, Managing Risk

The study offers clear takeaways for banks and credit card companies. Lenders who want to approve more applications without taking on too much risk can use parts of the researchers’ model to design their own credit scoring tools based on customers’ shopping habits.

Still, Lee says, the process must be transparent. Consumers should know how their spending data might be used and decide for themselves whether the potential benefits outweigh privacy concerns. That means lenders must clearly communicate how data is collected, stored, and protected, and ensure customers can opt in with informed consent.

Banks should also keep a close eye on first-time borrowers to make sure they’re using credit responsibly. “Proactive customer management is crucial,” Lee says. That might mean starting people off with lower credit limits and raising them gradually as they demonstrate good repayment behavior.

This approach can also discourage people from trying to “game the system” by changing their spending patterns temporarily to boost their retail-based credit score. Lenders can design their models to detect that kind of behavior, too.

The Future of Credit

One risk of using retail data is that lenders might unintentionally reject applicants who would have qualified under traditional criteria — say, because of one unusual purchase. Lee says banks can fine-tune their models to minimize those errors.

She also notes that the same approach could eventually be used for other types of loans, such as mortgages or auto loans. Combined with her earlier research showing that grocery purchase data can predict defaults, the findings strengthen the case that shopping behavior can reliably signal creditworthiness.

“If you tend to buy sale items, you’re more likely to be a good borrower. Or if you often buy healthy food, you’re probably more creditworthy,” Lee explains. “This idea can be applied broadly, but models should still be customized for different situations.”

---

This article originally appeared on Rice Business Wisdom. Written by Deborah Lynn Blumberg.

Anderson, Lee, and Yang (2025). “Who Benefits from Alternative Data for Credit Scoring? Evidence from Peru,” Journal of Marketing Research.

A map of U.S. data centers. Courtesy of Rice Business Wisdom

Your data center is either closer than you think or much farther away

houston voices

A new study shows why some facilities cluster in cities for speed and access, while others move to rural regions in search of scale and lower costs. Based on research by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard).

Key findings:

  • Third-party colocation centers are physical facilities in close proximity to firms that use them, while cloud providers operate large data centers from a distance and sell access to virtualized computing resources as on‑demand services over the internet.
  • Hospitals and financial firms often require urban third-party centers for low latency and regulatory compliance, while batch processing and many AI workloads can operate more efficiently from lower-cost cloud hubs.
  • For policymakers trying to attract data centers, access to reliable power, water and high-capacity internet matters more than tax incentives.

Recent outages and the surge in AI-driven computing have made data center siting decisions more consequential than ever, especially as energy and water constraints tighten. Communities invest public dollars on the promise of jobs and growth, while firms weigh long-term commitments to land, power and connectivity.

Against that backdrop, a critical question comes into focus: Where do data centers get built — and what actually drives those decisions?

A new study by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard Business School) provides the first large-scale statistical analysis of data center location strategies across the United States. It offers policymakers and firms a clearer starting point for understanding how different types of data centers respond to economic and strategic incentives.

Forthcoming in the journal Strategy Science, the study examines two major types of infrastructure: third-party colocation centers that lease server space to multiple firms, and hyperscale cloud centers owned by providers like Amazon, Google and Microsoft.

Two Models, Two Location Strategies

The study draws on pre-pandemic data from 2018 and 2019, a period of relative geographic stability in supply and demand. This window gives researchers a clean baseline before remote work, AI demand and new infrastructure pressures began reshaping internet traffic patterns.

The findings show that data centers follow a bifurcated geography. Third-party centers cluster in dense urban markets, where buyers prioritize proximity to customers despite higher land and operating costs. Cloud providers, by contrast, concentrate massive sites in a small number of lower-density regions, where electricity, land and construction are cheaper and economies of scale are easier to achieve.

Third-party data centers, in other words, follow demand. They locate in urban markets where firms in finance, healthcare and IT value low latency, secure storage, and compliance with regulatory standards.

Using county-level data, the researchers modeled how population density, industry mix and operating costs predict where new centers enter. Every U.S. metro with more than 700,000 residents had at least one third-party provider, while many mid-sized cities had none.

This pattern challenges common assumptions. Third-party facilities are more distributed across urban America than prevailing narratives suggest.
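The county-level entry analysis can be caricatured as a logistic model: the probability that a third-party center enters a county rises with population density and an IT-heavy industry mix, and falls with operating costs. The coefficients below are invented for illustration, not estimates from the paper.

```python
import math

def entry_probability(log_pop_density, it_industry_share, relative_op_cost):
    """Toy logistic model of third-party data center entry in a county.

    Coefficients are illustrative only; the study estimates its own
    from county-level data on density, industry mix and costs.
    """
    z = (-3.0
         + 1.2 * log_pop_density     # denser counties attract entry
         + 2.0 * it_industry_share   # finance/healthcare/IT demand
         - 1.5 * relative_op_cost)   # land, power, construction costs
    return 1.0 / (1.0 + math.exp(-z))

# A dense, IT-heavy, higher-cost urban county vs. a sparse rural one
urban = entry_probability(math.log(3000), 0.15, 1.2)
rural = entry_probability(math.log(50), 0.05, 0.8)
```

Even when urban counties are penalized for higher operating costs, density dominates in this toy model, consistent with the finding that third-party centers follow demand into metro markets.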

Customer proximity matters because some sectors cannot absorb delay. In critical operations, even slight pauses can have real consequences. For hospital systems, lag can affect performance and risk exposure. And in high-frequency trading, milliseconds can determine whether value is captured or lost in a transaction.

“For industries where speed is everything, being too far from the physical infrastructure can meaningfully affect performance and risk,” Pan Fang says. “Proximity isn’t optional for sectors that can’t absorb delay.”

The Economics of Distance

For cloud providers, the picture looks very different. Their decisions follow a logic shaped primarily by cost and scale. Because cloud services can be delivered from afar, firms tend to build enormous sites in low-density regions where power is cheap and land is abundant.

These facilities can draw hundreds of megawatts of electricity and operate with far fewer employees than urban centers. “The cloud can serve almost anywhere,” Pan Fang says, “so location is a question of cost before geography.”

The study finds that cloud infrastructure clusters around network backbones and energy economics, not talent pools. Well-known hubs like Ashburn, Virginia — often called “Data Center Alley” — reflect this logic, having benefited from early network infrastructure that made them natural convergence points for digital traffic.

Local governments often try to lure data centers with tax incentives, betting they will create high-tech jobs. But the study suggests other factors matter more to cloud providers, including construction costs, network connectivity and access to reliable, affordable electricity.

When cloud centers need a local presence, distance can sometimes become a constraint. Providers often address this by working alongside third-party operators. “Third-party centers can complement cloud firms when they need a foothold closer to customers,” Pan Fang says.

That hybrid pattern — massive regional hubs complementing strategic colocation — may define the next phase of data center growth.

Looking ahead, shifts in remote work, climate resilience, energy prices and AI-driven computing may reshape where new facilities go. Some workloads may move closer to users, while others may consolidate into large rural hubs. Emerging data-sovereignty rules could also redirect investment beyond the United States.

“The cloud feels weightless,” Pan Fang says, “but it rests on real choices about land, power and proximity.”

---

This article originally appeared on Rice Business Wisdom. Written by Scott Pett.

Pan Fang and Greenstein (2025). “Where the Cloud Rests: The Economic Geography of Data Centers,” forthcoming in Strategy Science.

Legislators carved out $715 million for nuclear, semiconductor, and other economic development projects, and a potential $1 billion pool of tax incentives to support research-and-development projects. Photo via Getty Images

How Houston's innovation sector fared in 2025 Texas legislative session

That's a Wrap

The Greater Houston Partnership is touting a number of victories during the recently concluded Texas legislative session that will or could benefit the Houston area. They range from billions of dollars for dementia research to millions of dollars for energy projects.

“These wins were only possible through deep collaboration, among our coalition partners, elected officials, business and community leaders, and the engaged members of the Partnership,” according to a partnership blog post. “Together, we’ve demonstrated how a united voice for Houston helps drive results that benefit all Texans.”

In terms of business innovation, legislators carved out $715 million for nuclear, semiconductor, and other economic development projects, and a potential $1 billion pool of tax incentives through 2029 to support research-and-development projects. The partnership said these investments “position Houston and Texas for long-term growth.”

Dementia institute

One of the biggest legislative wins cited by the Greater Houston Partnership was passage of legislation sponsored by Sen. Joan Huffman, a Houston Republican, to provide $3 billion in funding over 10 years for the Dementia Prevention and Research Institute of Texas. In November, voters will decide on a ballot initiative that would set aside the $3 billion for the new institute.

The dementia institute would be structured much like the Cancer Prevention and Research Institute of Texas (CPRIT), a state agency that provides funding for cancer research in the Lone Star State. Since its founding in 2008, CPRIT has awarded nearly $3.9 billion in research grants.

“By establishing the Dementia Prevention and Research Institute of Texas, we are positioning our state to lead the charge against one of the most devastating health challenges of our time,” Huffman said. “With $3 billion in funding over the next decade, we will drive critical research, develop new strategies for prevention and treatment, and support our healthcare community. Now, it’s up to voters to ensure this initiative moves forward.”

More than 500,000 Texans suffer from some form of dementia, including Alzheimer’s disease, according to Lt. Gov. Dan Patrick.

“With a steadfast commitment, Texas has the potential to become a world leader in combating [dementia] through the search for effective treatments and, ultimately, a cure,” Patrick said.

Funding for education

In the K-12 sector, lawmakers earmarked an extra $195 million for Houston ISD, $126.7 million for Cypress-Fairbanks ISD, $103.1 million for Katy ISD, $80.6 million for Fort Bend ISD, and $61 million for Aldine ISD, the partnership said.

In higher education, legislators allocated:

  • $1.17 billion for the University of Houston College of Medicine, University of Texas Health Science Center at Houston, UT MD Anderson Cancer Center, and Baylor College of Medicine
  • $922 million for the University of Houston System
  • $167 million for Texas Southern University
  • $10 million for the Center for Biotechnology at San Jacinto College

Infrastructure

In the infrastructure arena, state lawmakers:

  • Approved $265 million for Houston-area water and flood mitigation projects, including $100 million for the Lynchburg Pump Station
  • Created the Lake Houston Dredging and Maintenance District
  • Established a fund for the Gulf Coast Protection District to supply $550 million for projects to make the coastline and ship channel more resilient

"Nuclear power renaissance"

House Bill 14 (HB 14) aims to lead a “nuclear power renaissance in the United States,” according to Texas Gov. Greg Abbott’s office. HB 14 establishes the Texas Advanced Nuclear Energy Office and allocates $350 million for nuclear development and deployment. Two nuclear power plants currently operate in Texas, generating 10 percent of the energy that feeds the Electric Reliability Council of Texas (ERCOT) power grid.

“This initiative will also strengthen Texas’ nuclear manufacturing capacity, rebuild a domestic fuel cycle supply chain, and train the future nuclear workforce,” Abbott said in a news release earlier this year.

One of the beneficiaries of Texas’ nuclear push could be Washington, D.C.-based Last Energy, which plans to build 30 micro-nuclear reactors near Abilene to serve power-gobbling data centers across the state. Houston-based Pelican Energy Partners also might be able to take advantage of the legislation after raising a $450 million fund to invest in companies that supply nuclear energy services and equipment.

Reed Clay, president of the Texas Nuclear Alliance, called this legislation “the most important nuclear development program of any state.”

“It is a giant leap forward for Texas and the United States, whose nuclear program was all but dead for decades,” said Clay. “With the passage of HB 14 and associated legislation, Texas is now positioned to lead a nuclear renaissance that is rightly seen as imperative for the energy security and national security of the United States.”

---

A version of this article first appeared on EnergyCapitalHTX.com.


Houston quantum energy chip startup emerges from stealth with $12M round

seed funding

Houston-based Casimir has emerged from stealth with a $12 million seed round to commercialize its quantum energy chip.

The round was led by Austin-based Scout Ventures. Lavrock Ventures, Cottonwood Technology, Capital Factory, American Deep Tech, and Tim Draper of Draper Associates also participated in the round. The oversubscribed round exceeded the company’s original $8 million target, according to a news release.

Casimir’s semiconductor chips can generate power from quantum vacuum fields without the need for batteries or charging. The company plans to commercialize its first-generation MicroSparc chip by 2028.

The MicroSparc chip measures 5 millimeters by 5 millimeters and is designed to produce 1.5 volts at 25 microamps, comparable to a small rechargeable battery, with no degradation or replacement cycle.

“Casimir represents exactly the kind of breakthrough dual-use technology Scout Ventures was built to back,” Brad Harrison, founder and managing partner at Scout Ventures, said in the release. “This is based on 100 years of science and we’re finally approaching a commercial product … We’re proud to lead this round and support Casimir’s journey from applied science to deployed technology.”

Casimir says it aims to scale its technology across the “full power spectrum,” including large-scale energy systems that can power homes, commercial infrastructure and electric vehicles.

Casimir's scientific work has been supported by DARPA-funded nanofabrication research, and its technology was incubated at the Limitless Space Institute (LSI), a nonprofit founded by Kam Ghaffarian that promotes research into interstellar travel. Ghaffarian, a technology investor and serial entrepreneur, has been behind companies like X-energy, Intuitive Machines, Axiom Space and Quantum Space.

Harold “Sonny” White, founder and CEO of Casimir, believes the technology can power devices for years without replacements.

“Millions of devices will operate for years without a battery ever needing to be replaced or recharged because we have engineered a customized Casimir cavity into hardware capable of producing persistent electrical power,” White added in the release. “I spent nearly two decades at NASA studying how we power humanity’s future. That work led me to the Casimir effect and the quantum vacuum, where new tools have allowed us to build on a century of scientific knowledge and bring abundant power to the world.”

Houston-based Fervo Energy bumps up IPO target to $1.82 billion

IPO update

Houston-based geothermal power company Fervo Energy is now eyeing an IPO that would raise $1.75 billion to $1.82 billion, up from the previous target of $1.33 billion.

In paperwork filed Monday, May 11 with the U.S. Securities and Exchange Commission, Fervo says it plans to sell 70 million shares of Class A common stock at $25 to $26 per share.

In addition, Fervo expects to grant underwriters 30-day options to buy up to 8.33 million additional shares of Class A common stock. This could raise nearly $200 million.

When it announced the IPO on May 4, Fervo aimed to sell 55.56 million shares at $21 to $24 per share, which would have raised $1.17 billion to $1.33 billion. The initial valuation target was $6.5 billion.
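The headline ranges follow directly from shares times price. As a quick sanity check on the figures reported above (nothing here comes from Fervo's filing beyond the share counts and prices already cited):

```python
def gross_proceeds_billions(shares_millions, price_low, price_high):
    """Gross IPO proceeds range, in billions of dollars."""
    return (shares_millions * price_low / 1000.0,
            shares_millions * price_high / 1000.0)

# Revised terms: 70 million shares at $25-$26 per share
revised = gross_proceeds_billions(70, 25, 26)      # (1.75, 1.82)

# Original terms: 55.56 million shares at $21-$24 per share
original = gross_proceeds_billions(55.56, 21, 24)  # ~ (1.17, 1.33)
```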

A date for the IPO hasn’t been scheduled. Fervo’s stock will be listed on Nasdaq under the ticker symbol FRVO.

Fervo, founded in 2017, has attracted about $1.5 billion in funding from investors such as Bill Gates-founded Breakthrough Energy Ventures, Google, Mitsubishi Heavy Industries, Devon Energy (which is moving its headquarters to Houston), Tesla co-founder JB Straubel, CalSTRS, Liberty Mutual Investments, AllianceBernstein, JPMorgan, Bank of America and Sumitomo Mitsui Trust Bank.

Fervo’s marquee project is Cape Station in Beaver County, Utah, the world’s largest EGS (enhanced geothermal system) project. The first phase will deliver 100 megawatts of baseload clean power, with the second phase adding another 400 megawatts. The site can accommodate 2 gigawatts of geothermal energy. Fervo holds more than 595,000 leased acres for potential expansion.

Cape Station has secured power purchase agreements for the entire 500-megawatt capacity. Customers include Houston-based Shell Energy North America and Southern California Edison.

---

This article originally appeared on our sister site, EnergyCapitalHTX.com.