A map of U.S. data centers. Courtesy of Rice Business Wisdom

A new study shows why some facilities cluster in cities for speed and access, while others move to rural regions in search of scale and lower costs. Based on research by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard).

Key findings:

  • Third-party colocation centers are physical facilities in close proximity to firms that use them, while cloud providers operate large data centers from a distance and sell access to virtualized computing resources as on‑demand services over the internet.
  • Hospitals and financial firms often require urban third-party centers for low latency and regulatory compliance, while batch processing and many AI workloads can operate more efficiently from lower-cost cloud hubs.
  • For policymakers trying to attract data centers, access to reliable power, water and high-capacity internet matters more than tax incentives.

Recent outages and the surge in AI-driven computing have made data center siting decisions more consequential than ever, especially as energy and water constraints tighten. Communities stake public dollars on the promise of jobs and growth, while firms weigh long-term commitments to land, power and connectivity.

Against that backdrop, a critical question comes into focus: Where do data centers get built — and what actually drives those decisions?

A new study by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard Business School) provides the first large-scale statistical analysis of data center location strategies across the United States. It offers policymakers and firms a clearer starting point for understanding how different types of data centers respond to economic and strategic incentives.

Forthcoming in the journal Strategy Science, the study examines two major types of infrastructure: third-party colocation centers that lease server space to multiple firms, and hyperscale cloud centers owned by providers like Amazon, Google and Microsoft.

Two Models, Two Location Strategies

The study draws on pre-pandemic data from 2018 and 2019, a period of relative geographic stability in supply and demand. This window gives researchers a clean baseline before remote work, AI demand and new infrastructure pressures began reshaping internet traffic patterns.

The findings show that data centers follow a bifurcated geography. Third-party centers cluster in dense urban markets, where buyers prioritize proximity to customers despite higher land and operating costs. Cloud providers, by contrast, concentrate massive sites in a small number of lower-density regions, where electricity, land and construction are cheaper and economies of scale are easier to achieve.

Third-party data centers, in other words, follow demand. They locate in urban markets where firms in finance, healthcare and IT value low latency, secure storage, and compliance with regulatory standards.

Using county-level data, the researchers modeled how population density, industry mix and operating costs predict where new centers enter. Every U.S. metro with more than 700,000 residents had at least one third-party provider, while many mid-sized cities had none.
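The kind of county-level entry model described above can be sketched, purely as a hypothetical illustration, as a logistic regression of entry on county features. Everything here — the features, weights and data — is synthetic and invented for the example; it is not the authors' actual model or data.

```python
import math
import random

# Hypothetical sketch: model whether a data center "enters" a county as a
# function of population density, industry mix and operating costs.
# All data below is simulated; none of it comes from the study.

random.seed(0)

def simulate_county():
    density = random.uniform(0, 1)    # normalized population density
    it_share = random.uniform(0, 1)   # share of latency-sensitive industries
    cost = random.uniform(0, 1)       # normalized operating cost
    # Assume entry is likelier in dense, industry-rich, lower-cost counties.
    logit = 3 * density + 2 * it_share - 1.5 * cost - 1.0
    entered = 1 if random.random() < 1 / (1 + math.exp(-logit)) else 0
    return [density, it_share, cost], entered

data = [simulate_county() for _ in range(2000)]

# Fit logistic regression by plain gradient descent on the average loss.
w, b, lr = [0.0, 0.0, 0.0], 0.0, 0.5
for _ in range(300):
    gw, gb = [0.0, 0.0, 0.0], 0.0
    for x, y in data:
        p = 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))
        for i in range(3):
            gw[i] += (p - y) * x[i]
        gb += p - y
    n = len(data)
    for i in range(3):
        w[i] -= lr * gw[i] / n
    b -= lr * gb / n

def predict(x):
    """Predicted probability that a county with features x sees entry."""
    return 1 / (1 + math.exp(-(sum(wi * xi for wi, xi in zip(w, x)) + b)))

# A dense urban county should score higher than a sparse rural one,
# mirroring the finding that third-party centers follow demand.
print(predict([0.9, 0.8, 0.7]) > predict([0.1, 0.2, 0.3]))
```

The fitted density coefficient comes out positive, echoing the pattern that third-party providers concentrate where demand is thickest even when costs are higher.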

This pattern challenges common assumptions. Third-party facilities are more distributed across urban America than prevailing narratives suggest.

Customer proximity matters because some sectors cannot absorb delay. In critical operations, even slight pauses can have real consequences. For hospital systems, lag can affect performance and risk exposure. And in high-frequency trading, milliseconds can determine whether value is captured or lost in a transaction.

“For industries where speed is everything, being too far from the physical infrastructure can meaningfully affect performance and risk,” Pan Fang says. “Proximity isn’t optional for sectors that can’t absorb delay.”

The Economics of Distance

For cloud providers, the picture looks very different. Their decisions follow a logic shaped primarily by cost and scale. Because cloud services can be delivered from afar, firms tend to build enormous sites in low-density regions where power is cheap and land is abundant.

These facilities can draw hundreds of megawatts of electricity and operate with far fewer employees than urban centers. “The cloud can serve almost anywhere,” Pan Fang says, “so location is a question of cost before geography.”

The study finds that cloud infrastructure clusters around network backbones and energy economics, not talent pools. Well-known hubs like Ashburn, Virginia — often called “Data Center Alley” — reflect this logic, having benefited from early network infrastructure that made them natural convergence points for digital traffic.

Local governments often try to lure data centers with tax incentives, betting they will create high-tech jobs. But the study suggests other factors matter more to cloud providers, including construction costs, network connectivity and access to reliable, affordable electricity.

When cloud centers need a local presence, distance can sometimes become a constraint. Providers often address this by working alongside third-party operators. “Third-party centers can complement cloud firms when they need a foothold closer to customers,” Pan Fang says.

That hybrid pattern — massive regional hubs complementing strategic colocation — may define the next phase of data center growth.

Looking ahead, shifts in remote work, climate resilience, energy prices and AI-driven computing may reshape where new facilities go. Some workloads may move closer to users, while others may consolidate into large rural hubs. Emerging data-sovereignty rules could also redirect investment beyond the United States.

“The cloud feels weightless,” Pan Fang says, “but it rests on real choices about land, power and proximity.”

---

This article originally appeared on Rice Business Wisdom. Written by Scott Pett.

Pan Fang and Greenstein (2025). “Where the Cloud Rests: The Economic Geography of Data Centers,” forthcoming in Strategy Science.

There's no crystal ball, but this Rice University researcher is testing whether certain metrics can predict the economic outlook. Photo via Getty Images

Houston researcher tries to crack the code on the Fed's data to determine economic outlook

houston voices

Research by Rice Business Professor K. Ramesh shows that the Fed appears to harvest qualitative information from the accounting disclosures that all public companies must file with the Securities and Exchange Commission.

These SEC filings are typically used by creditors, investors and others to make firm-level investing and financing decisions, and while they include business leaders’ sense of economic trends, they were never intended to guide macro-level policy decisions. But in a recent paper (“Externalities of Accounting Disclosures: Evidence from the Federal Reserve”), Ramesh and his colleagues provide persuasive evidence that the Fed nonetheless uses the qualitative information in SEC filings to help forecast macroeconomic variables like GDP growth and unemployment.

According to Ramesh, the study was made possible thanks to a decision the SEC made several years ago. The commission stores the reports submitted by public companies in an online database called EDGAR and records the IP address of any party that accesses them. More than a decade ago, the SEC began making partially anonymized forms of those IP addresses available to the public. But researchers eventually figured out how to deanonymize the addresses, which is precisely what Ramesh and his colleagues did in this study.

"We were able to reverse engineer and identify those IP addresses that belonged to Federal Reserve staff," Ramesh says.

The team ultimately assembled a data set containing more than 169,000 filings accessed by Fed staff between 2005 and 2015. They quickly realized that the Fed was interested only in filings submitted by a select group of industry leaders and financial institutions.

But if Ramesh and his colleagues now had a better idea of precisely which bellwether firms the Fed focused on, they still had no way of knowing exactly what Fed staffers had gleaned from the material they accessed. So the team decided to employ a measure called "tone" that captures the overall sentiment of a piece of text – whether positive, negative, or neutral.

Building on previous research that had identified a set of words associated with negative tone in financial reports, Ramesh and his colleagues examined the tone of all the SEC filings accessed by Fed staff between one meeting of the Federal Open Market Committee (FOMC) and the next. The FOMC sets interest rates and guides monetary policy, and its meetings provide an opportunity for Fed officials to discuss growth forecasts and announce policy decisions.
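A word-list tone measure of this kind can be sketched in a few lines. This is a minimal hypothetical illustration: the word list below is a tiny stand-in invented for the example, not the dictionary the researchers actually used, and the scoring is the simplest possible version (share of negative words in the text).

```python
# Hypothetical stand-in word list; real studies use much larger
# finance-specific dictionaries of negatively toned words.
NEGATIVE_WORDS = {"loss", "decline", "impairment", "adverse", "litigation", "weak"}

def tone(text: str) -> float:
    """Return the share of negative words; higher means gloomier tone."""
    words = [w.strip(".,;:()").lower() for w in text.split()]
    if not words:
        return 0.0
    negatives = sum(1 for w in words if w in NEGATIVE_WORDS)
    return negatives / len(words)

gloomy = "Revenue decline and impairment charges signal a weak outlook."
upbeat = "Revenue growth and strong demand support a favorable outlook."
print(tone(gloomy) > tone(upbeat))  # the gloomier filing scores higher
```

Aggregating such scores over every filing accessed between two FOMC meetings yields the kind of period-level tone signal the study relates to the Fed's forecasts.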

The researchers then examined the Fed's growth forecasts to see if there was a relationship between the tone of the documents that Fed staff examined in the period between FOMC meetings and the forecasts they produced in advance of those meetings.

The team found close correlations between the tone of the reports accessed by the Fed and the agency’s forecasts of GDP, unemployment, housing starts and industrial production. The more negative the filings accessed prior to an FOMC meeting, for example, the gloomier the GDP forecast; the more positive the filings, the brighter the unemployment forecast.

Ramesh and his colleagues also compared the Fed's forecasts with those of the Survey of Professional Forecasters (SPF), whose members span academia and industry. Intriguingly, the researchers found that while the errors in the SPF's forecasts could be attributed to the absence of the tonal information culled from the SEC filings, the errors in the Fed’s forecasts could not. This suggests both that the Fed was collecting qualitative information that the SPF was not—and that the agency was making remarkably efficient use of it.

"They weren’t leaving anything on the table," Ramesh says.

Having solved one mystery, Ramesh would like to focus on another; namely, how does the Fed identify bellwether firms in the first place?

Unfortunately, the SEC no longer makes IP address data publicly available, which means that Ramesh and his colleagues can no longer study which companies the Fed is most interested in. Nonetheless, Ramesh hopes to use the data they have already collected to build a model that can accurately predict which firms the Fed is most likely to follow. That would allow the team to continue studying the same companies that the Fed does, and, he says, “maybe come up with a way to track those firms in order to understand how the economy is going to move.”

------

This article originally ran on Rice Business Wisdom and is based on research from K. Ramesh, the Herbert S. Autrey Professor of Accounting at the Jones Graduate School of Business at Rice University.


Tech giant Apple doubles down on Houston with new production facility

coming soon

Tech giant Apple announced that it will double the size of its Houston manufacturing footprint as it brings production of its Mac mini to the U.S. for the first time.

The company plans to begin production of its compact desktop computer at a new factory at Apple’s Houston manufacturing site later this year. The move is expected to create thousands of jobs in the Houston area, according to Apple.

Last year, the Cupertino, California-based company announced it would open a 250,000-square-foot factory to produce servers for its data centers in the Houston area. The facility was originally slated to open in 2026, but Apple reports it began production ahead of schedule in 2025.

The addition of the Mac mini operations at the site will bring the footprint to about 500,000 square feet, the Houston Chronicle reports. The New York Times previously reported that Taiwanese electronics manufacturer Foxconn would be involved in the Houston factory.

Apple also announced plans to open a 20,000-square-foot Advanced Manufacturing Center in Houston later this year. The project is currently under construction and will "provide hands-on training in advanced manufacturing techniques to students, supplier employees, and American businesses of all sizes," according to the announcement. Apple opened a similar Apple Manufacturing Academy in Detroit last year.

Apple doubles down on Houston with new production facility, training center. Photo courtesy of Apple.

“Apple is deeply committed to the future of American manufacturing, and we’re proud to significantly expand our footprint in Houston with the production of Mac mini starting later this year,” Tim Cook, Apple’s CEO, said in the news release. “We began shipping advanced AI servers from Houston ahead of schedule, and we’re excited to accelerate that work even further.”

Apple's Houston expansion is part of a $600 billion commitment the company made to the U.S. in 2025.

Houston energy trailblazer Fervo taps into hottest reservoir to date

Heating Up

Things are heating up at Houston-based geothermal power company Fervo Energy.

Fervo recently drilled its hottest well so far at a new geothermal site in western Utah. In fewer than 11 days, the company drilled more than 11,000 feet deep at Project Blanford and recorded temperatures above 555 degrees Fahrenheit, exceeding the threshold for commercial viability. Fervo used proprietary AI-driven analytics for the test.

Hotter geothermal reservoirs produce more energy and improve what’s known as energy conversion efficiency, which is the ratio of useful energy output to total energy input.
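The efficiency ratio defined above is straightforward to compute. As a back-of-envelope illustration with made-up numbers (these figures are not Fervo's):

```python
def conversion_efficiency(output_mwh: float, input_mwh: float) -> float:
    """Energy conversion efficiency: useful energy output / total energy input."""
    return output_mwh / input_mwh

# Hypothetical example: 18 MWh of electricity from 100 MWh of geothermal heat
# would give an efficiency of 0.18, i.e. 18 percent.
print(conversion_efficiency(18.0, 100.0))
```

Because hotter reservoirs deliver more useful output per unit of heat drawn from the ground, this ratio improves as well temperatures climb.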

“Fervo’s exploration strategy has always been underpinned by the seamless integration of cutting-edge data acquisition and advanced analytics,” Jack Norbeck, Fervo’s co-founder and chief technology officer, said in a news release. “This latest ultra-high temperature discovery highlights our team’s ability to detect and develop EGS sweet spots using AI-enhanced geophysical techniques.”

Fervo says an independent review confirms the site’s multigigawatt potential.

The company has increasingly tapped into hotter and hotter geothermal reservoirs, going from 365 degrees at Project Red to 400 degrees at Cape Station and now more than 555 degrees at Blanford.

The new site expands Fervo’s geologic footprint. The Blanford reservoir consists of sedimentary formations such as sandstones, claystones and carbonates, which can be drilled more easily and cost-effectively than more commonly targeted granite formations.

Fervo ranks among the top-funded startups in the Houston area. Since its founding in 2017, the company has raised about $1.5 billion. In January, Fervo filed for an IPO that would value the company at $2 billion to $3 billion, according to Axios Pro.

---

This article originally appeared on EnergyCapitalHTX.com.

11 Houston researchers named to Rice innovation cohort

top of class

The Liu Idea Lab for Innovation and Entrepreneurship (Lilie) has named 11 students and researchers with breakthrough ideas to its 2026 Rice Innovation Fellows cohort.

The program, first launched in 2022, aims to support Rice Ph.D. students and postdocs in turning their research into real-world ventures. Participants receive $10,000 in translational research funding, co-working space and personalized mentorship.

The eleven 2026 Innovation Fellows are:

Ehsan Aalaei, Bioengineering, Ph.D. 2027

Professor Michael King Laboratory

Aalaei is developing new therapies to prevent the spread of cancer.

Matt Lee, Bioengineering, Ph.D. 2027

Professor Caleb Bashor Laboratory

Lee’s work uses AI to design the genetic instructions for more effective therapies.

Thomas Howlett, Bioengineering, Postdoctoral 2028

Professor Kelsey Swingle Laboratory

Howlett is developing a self-administered, nonhormonal treatment for heavy menstrual bleeding.

Jonathan Montes, Bioengineering, Ph.D. 2025

Professor Jessica Butts Laboratory

Montes and his team are developing a fast-acting, long-lasting nasal spray to relieve chronic and acute anxiety.

Siliang Li, BioSciences, Postdoctoral 2025

Professor Caroline Ajo-Franklin Laboratory

Li is developing noninvasive devices that can quickly monitor gut health signals.

Gina Pizzo, Statistics, Lecturer

Pizzo’s research uses data modeling to forecast crop performance and soil health.

Alex Sadamune, Bioengineering, Ph.D. 2027

Professor Chong Xie Laboratory

Sadamune is working to scale the production of high-precision neural implants.

Jaeho Shin, Chemistry, Postdoctoral 2027

Professor James M. Tour Laboratory

Shin is developing next-generation semiconductor and memory technologies to advance computing and AI.

Will Schmid, Electrical and Computer Engineering, Postdoctoral 2025

Professor Alessandro Alabastri Laboratory

Schmid is developing scalable technologies to recover critical minerals from high-salinity resources.

Khadija Zanna, Electrical and Computer Engineering, Ph.D. 2026

Professor Akane Sano Laboratory

Zanna is building machine learning tools to help companies deploy advanced AI in compliance with complex global regulations.

Ava Zoba, Materials Science and Nano Engineering, Ph.D. 2029

Professor Christina Tringides Laboratory

Zoba is designing implantable devices to improve the monitoring of brain function following tumor-removal surgery.

According to Rice, its Innovation Fellows have gone on to raise over $30 million and join top programs, including The Activate Fellowship, Chain Reaction Innovations Fellowship, the Texas Medical Center’s Cancer Therapeutics Accelerator and the Rice Biotech Launch Pad. Past participants include ventures like Helix Earth Technologies and HEXASpec.

“These fellows aren’t just advancing science — they’re building the future of industry here at Rice,” Kyle Judah, Lilie’s executive director, said in a news release. “Alongside their faculty members, they’re stepping into the uncertainty of turning research into real-world solutions. That commitment is rare, and it’s exactly why Lilie and Rice are proud to stand shoulder-to-shoulder with them and nurture their ambition to take on civilization-scale problems that truly matter.”