A map of U.S. data centers. Courtesy of Rice Business Wisdom

A new study shows why some facilities cluster in cities for speed and access, while others move to rural regions in search of scale and lower costs. Based on research by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard).

Key findings:

  • Third-party colocation centers are physical facilities in close proximity to firms that use them, while cloud providers operate large data centers from a distance and sell access to virtualized computing resources as on‑demand services over the internet.
  • Hospitals and financial firms often require urban third-party centers for low latency and regulatory compliance, while batch processing and many AI workloads can operate more efficiently from lower-cost cloud hubs.
  • For policymakers trying to attract data centers, access to reliable power, water and high-capacity internet matters more than tax incentives.

Recent outages and the surge in AI-driven computing have made data center siting decisions more consequential than ever, especially as energy and water constraints tighten. Communities invest public dollars on the promise of jobs and growth, while firms weigh long-term commitments to land, power and connectivity.

Against that backdrop, a critical question comes into focus: Where do data centers get built — and what actually drives those decisions?

A new study by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard Business School) provides the first large-scale statistical analysis of data center location strategies across the United States. It offers policymakers and firms a clearer starting point for understanding how different types of data centers respond to economic and strategic incentives.

Forthcoming in the journal Strategy Science, the study examines two major types of infrastructure: third-party colocation centers that lease server space to multiple firms, and hyperscale cloud centers owned by providers like Amazon, Google and Microsoft.

Two Models, Two Location Strategies

The study draws on pre-pandemic data from 2018 and 2019, a period of relative geographic stability in supply and demand. This window gives researchers a clean baseline before remote work, AI demand and new infrastructure pressures began reshaping internet traffic patterns.

The findings show that data centers follow a bifurcated geography. Third-party centers cluster in dense urban markets, where buyers prioritize proximity to customers despite higher land and operating costs. Cloud providers, by contrast, concentrate massive sites in a small number of lower-density regions, where electricity, land and construction are cheaper and economies of scale are easier to achieve.

Third-party data centers, in other words, follow demand. They locate in urban markets where firms in finance, healthcare and IT value low latency, secure storage, and compliance with regulatory standards.

Using county-level data, the researchers modeled how population density, industry mix and operating costs predict where new centers enter. Every U.S. metro with more than 700,000 residents had at least one third-party provider, while many mid-sized cities had none.
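
As a rough illustration (not the authors' actual code or data), a county-level entry model of the kind the study describes might look like the following sketch, where the file name and predictor variables are hypothetical placeholders:

```python
# Illustrative sketch of a county-level entry model; the data file and
# variable names are hypothetical, not from the study.
import pandas as pd
import statsmodels.formula.api as smf

counties = pd.read_csv("county_panel.csv")  # hypothetical county-level data

# Model whether a new data center entered a county as a function of
# population density, industry mix and operating costs.
model = smf.logit(
    "new_entry ~ log_pop_density + finance_share + healthcare_share"
    " + it_share + electricity_price + land_cost_index",
    data=counties,
).fit()

print(model.summary())
```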

This pattern challenges common assumptions. Third-party facilities are more distributed across urban America than prevailing narratives suggest.

Customer proximity matters because some sectors cannot absorb delay. In critical operations, even slight pauses can have real consequences. For hospital systems, lag can affect performance and risk exposure. And in high-frequency trading, milliseconds can determine whether value is captured or lost in a transaction.
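
A back-of-the-envelope calculation, not drawn from the study itself, shows why distance translates into delay: light in optical fiber travels at roughly two-thirds of its vacuum speed, or about 200 kilometers per millisecond, so a server 1,000 kilometers away imposes at least 10 milliseconds of round-trip latency before any computation even begins.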

“For industries where speed is everything, being too far from the physical infrastructure can meaningfully affect performance and risk,” Pan Fang says. “Proximity isn’t optional for sectors that can’t absorb delay.”

The Economics of Distance

For cloud providers, the picture looks very different. Their decisions follow a logic shaped primarily by cost and scale. Because cloud services can be delivered from afar, firms tend to build enormous sites in low-density regions where power is cheap and land is abundant.

These facilities can draw hundreds of megawatts of electricity and operate with far fewer employees than urban centers. “The cloud can serve almost anywhere,” Pan Fang says, “so location is a question of cost before geography.”

The study finds that cloud infrastructure clusters around network backbones and energy economics, not talent pools. Well-known hubs like Ashburn, Virginia — often called “Data Center Alley” — reflect this logic, having benefited from early network infrastructure that made them natural convergence points for digital traffic.

Local governments often try to lure data centers with tax incentives, betting they will create high-tech jobs. But the study suggests other factors matter more to cloud providers, including construction costs, network connectivity and access to reliable, affordable electricity.

When cloud centers need a local presence, distance can sometimes become a constraint. Providers often address this by working alongside third-party operators. “Third-party centers can complement cloud firms when they need a foothold closer to customers,” Pan Fang says.

That hybrid pattern — massive regional hubs complementing strategic colocation — may define the next phase of data center growth.

Looking ahead, shifts in remote work, climate resilience, energy prices and AI-driven computing may reshape where new facilities go. Some workloads may move closer to users, while others may consolidate into large rural hubs. Emerging data-sovereignty rules could also redirect investment beyond the United States.

“The cloud feels weightless,” Pan Fang says, “but it rests on real choices about land, power and proximity.”

---

This article originally appeared on Rice Business Wisdom. Written by Scott Pett.

Pan Fang and Greenstein (2025). “Where the Cloud Rests: The Economic Geography of Data Centers,” forthcoming in Strategy Science.

There's no crystal ball, but this Rice University researcher is testing whether certain metrics can improve economic forecasting. Photo via Getty Images

Houston researcher tries to crack the code on the Fed's data to determine economic outlook

houston voices

Research by Rice Business Professor K. Ramesh shows that the Fed appears to harvest qualitative information from the accounting disclosures that all public companies must file with the Securities and Exchange Commission.

These SEC filings are typically used by creditors, investors and others to make firm-level investing and financing decisions, and while they capture business leaders’ sense of economic trends, they were never intended to guide macro-level policy decisions. But in a recent paper (“Externalities of Accounting Disclosures: Evidence from the Federal Reserve”), Ramesh and his colleagues provide persuasive evidence that the Fed nonetheless uses the qualitative information in SEC filings to help forecast macroeconomic variables like GDP growth and unemployment.

According to Ramesh, the study was made possible thanks to a decision the SEC made several years ago. The commission stores the reports submitted by public companies in an online database called EDGAR and records the IP address of any party that accesses them. More than a decade ago, the SEC began making partially anonymized forms of those IP addresses available to the public. But researchers eventually figured out how to deanonymize the addresses, which is precisely what Ramesh and his colleagues did in this study.

"We were able to reverse engineer and identify those IP addresses that belonged to Federal Reserve staff," Ramesh says.

The team ultimately assembled a data set containing more than 169,000 filings accessed by Fed staff between 2005 and 2015. They quickly realized that the Fed was interested only in filings submitted by a select group of industry leaders and financial institutions.

Ramesh and his colleagues now had a better idea of precisely which bellwether firms the Fed focused on, but they still had no way of knowing exactly what Fed staffers had gleaned from the material they accessed. So the team decided to employ a measure called "tone," which captures the overall sentiment of a piece of text: positive, negative or neutral.

Building on previous research that had identified a set of words associated with negative tone in financial reports, Ramesh and his colleagues examined the tone of all the SEC filings accessed by Fed staff between one meeting of the Federal Open Market Committee (FOMC) and the next. The FOMC sets interest rates and guides monetary policy, and its meetings provide an opportunity for Fed officials to discuss growth forecasts and announce policy decisions.
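
For intuition, a word-list tone measure can be sketched in a few lines; this is a simplified illustration, with a placeholder word list far smaller than the dictionaries of negative financial terms used in the research:

```python
# Simplified sketch of a word-list tone measure (illustrative only).
import re

# Placeholder list; the underlying research draws on dictionaries containing
# thousands of words associated with negative tone in financial reports.
NEGATIVE_WORDS = {"loss", "decline", "impairment", "adverse", "litigation"}

def negative_tone(filing_text: str) -> float:
    """Return the fraction of a filing's words that appear on the negative list."""
    words = re.findall(r"[a-z]+", filing_text.lower())
    return sum(w in NEGATIVE_WORDS for w in words) / len(words) if words else 0.0

# Averaging this score over every filing accessed between two FOMC meetings
# yields a single tone figure to compare against the Fed's next forecast.
```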

The researchers then examined the Fed's growth forecasts to see if there was a relationship between the tone of the documents that Fed staff examined in the period between FOMC meetings and the forecasts they produced in advance of those meetings.

The team found close correlations between the tone of the reports accessed by the Fed and the agency’s forecasts of GDP, unemployment, housing starts and industrial production. The more negative the filings accessed prior to an FOMC meeting, for example, the gloomier the GDP forecast; the more positive the filings, the brighter the unemployment forecast.

Ramesh and his colleagues also compared the Fed's forecasts with those of the Survey of Professional Forecasters (SPF), whose panel spans academia and industry. Intriguingly, the researchers found that while the errors in the SPF's forecasts could be attributed to the absence of the tonal information culled from the SEC filings, the errors in the Fed’s forecasts could not. This suggests both that the Fed was collecting qualitative information that the SPF was not, and that the agency was making remarkably efficient use of it.

"They weren’t leaving anything on the table," Ramesh says.

Having solved one mystery, Ramesh would like to focus on another: How does the Fed identify bellwether firms in the first place?

Unfortunately, the SEC no longer makes IP address data publicly available, which means that Ramesh and his colleagues can no longer study which companies the Fed is most interested in. Nonetheless, Ramesh hopes to use the data they have already collected to build a model that can accurately predict which firms the Fed is most likely to follow. That would allow the team to continue studying the same companies that the Fed does, and, he says, “maybe come up with a way to track those firms in order to understand how the economy is going to move.”

------

This article originally ran on Rice Business Wisdom and was based on research by K. Ramesh, the Herbert S. Autrey Professor of Accounting at the Jones Graduate School of Business at Rice University.


Houston geothermal unicorn Fervo officially files for IPO

going public

Fervo Energy has officially filed for an initial public offering.

The Houston-based geothermal unicorn filed a registration statement on Form S-1 with the U.S. Securities and Exchange Commission on April 17 to list its Class A common stock on the Nasdaq exchange. Fervo intends to be listed under the ticker symbol "FRVO."

The number and price of the shares have not yet been determined, according to a news release from Fervo. J.P. Morgan, BofA Securities, RBC Capital Markets and Barclays are leading the offering.

The highly anticipated filing comes as Fervo readies its flagship Cape Station geothermal project to deliver its first power later this year.

"Today, miles-long lines for gasoline have been replaced by lines for electricity. Tech companies compete for megawatts to claim AI market share. Manufacturers jockey for power to strengthen American industry. Utilities demand clean, firm electricity to stabilize the grid," Fervo CEO Tim Latimer shared in the filing. "Fervo is prepared to serve all of these customers. Not with complex, idiosyncratic projects but with a simplified, standardized product capable of delivering around-the-clock, carbon-free power using proven oil and gas technology."

Fervo has been preparing for an IPO for months. Axios Pro first reported that the company "quietly" filed for an IPO in January and estimated it would be valued between $2 billion and $3 billion.

Fervo also closed $421 million in non-recourse debt financing for the first phase of Cape Station last month and raised a $462 million Series E in December. The company also announced the addition of four heavyweights to its board of directors last week, including Meg Whitman, former CEO of eBay, Hewlett-Packard and Spring-based Hewlett Packard Enterprise (HPE).

Fervo reported a net loss of $70.5 million for the 2025 fiscal year in the S-1 filing and a loss of $41.1 million in 2024.

Tracxn.com estimates that Fervo has raised $1.12 billion over 12 funding rounds. The company was founded in 2017 by Latimer and CTO Jack Norbeck.

---

This article originally appeared on our sister site, EnergyCapitalHTX.com.

New UT Austin med center, anchored by MD Anderson, gets $1 billion gift

Future of Health

A donation announced Tuesday, April 21, breaks a major record at the University of Texas at Austin. Michael and Susan Dell are now UT Austin's first supporters to give $1 billion. In response, the university will create the UT Dell Campus for Advanced Research and the UT Dell Medical Center to "advance human health," per a press release.

The release also notes "significant support" for undergraduate scholarships, student housing, and the Texas Advanced Computing Center for supercomputing research.

Both the new research campus and the UT Dell Medical Center will integrate advanced computing into their research and practices. At the medical center, the university hopes that will lead to "earlier detection, more precise and personalized care, and better health outcomes." The University of Texas MD Anderson Cancer Center will also be integrated into the new medical center.

That comes with goals measured in tens: raise $10 billion and rank among the top 10 medical centers in the U.S., both within the next decade.

In the shorter term, the university will break ground on the medical center with architecture firm Skidmore, Owings & Merrill (SOM) "later this year."

“UT Austin, where Dell Technologies was founded from a dorm room, has always been a place where bold ideas become real-world impact,” said Michael and Susan Dell in a joint statement.

They continued, “What makes this moment so meaningful is the opportunity to build something that brings every part of the journey together — from how students learn, to how discoveries are made, to how care reaches families. By bringing together medicine, science and computing in one campus designed for the AI era, UT can create more opportunity, deliver better outcomes, and build a stronger future for communities across Texas and beyond.”

This is the second major gift this year for the planned multibillion-dollar medical center. In January, Tench Coxe, a former venture capitalist who’s a major shareholder in chipmaking giant Nvidia, and Simone Coxe, co-founder and former CEO of the Blanc & Otus PR firm, contributed $100 million.

Baylor scientist lands $2M grant to explore links between viruses and Alzheimer’s

Alzheimer’s research

A Baylor College of Medicine scientist will begin exploring the possible link between Alzheimer’s disease and viral infections thanks to a $2 million grant awarded in March.

Dr. Ryan S. Dhindsa is an assistant professor of pathology & immunology at Baylor and a principal investigator at Texas Children’s Duncan Neurological Research Institute (Duncan NRI). He hypothesizes that Alzheimer’s may be linked to viral infections contracted earlier in a patient’s life. To study this intriguing possibility, the American Brain Foundation has awarded him the Cure One, Cure Many Award in neuroinflammation.

“It is an honor to receive this support from the Cure One, Cure Many Award. Viral infections are emerging as a major, underappreciated driver of Alzheimer's disease, and this award will allow our team to conduct the most comprehensive screen of viral exposures and host genetics in Alzheimer's to date, spanning over a million individuals,” Dhindsa said in a news release. “Our goal is to identify which viruses matter most, why some people are more vulnerable than others, and ultimately move the field closer to new therapeutic strategies for patients.”

Alzheimer’s is the most common cause of dementia in the world, and roughly 150 million people worldwide are projected to suffer from it by 2050. Despite this, scientists are still at a loss as to what exactly causes it.

Dhindsa’s research is part of a new range of theories that certain viral infections may trigger Alzheimer’s. His team will take a two-fold approach. First, they will analyze the medical records of more than a million individuals looking for patterns. Second, they will analyze viral DNA in stem cell-derived brain cells to see how the infections could contribute to neurological decay. The scale of the genomic data gathering is unprecedented and may highlight a link that traditional studies have missed.

Also joining the project are Dr. Caleb Lareau of Memorial Sloan Kettering Cancer Center and Dr. Artem Babaian of the University of Toronto. Should a link be found, it would open the door to using antivirals to prevent or treat Alzheimer’s.