
Texas stands out among other states when it comes to energy production.

Even after mass rolling blackouts during Winter Storm Uri in 2021, the Lone Star State produced more electricity than any other state in 2022. Yet Texas also exemplifies how challenging it can be to ensure grid reliability. The following summer, the state’s grid manager, the Electric Reliability Council of Texas (ERCOT), recorded ten instances of record-breaking demand.

Despite its high energy production, Texas has had more outages than any other state over the past five years due to the increasing frequency and severity of extreme weather events and rapidly growing demand, as the outages caused by Hurricane Beryl demonstrated.

A bigger storm is brewing

Electric demand is poised to rise sharply over the next few years. Grid planners nationwide are doubling their five-year load forecasts, and Texas predicts it will need to provide nearly double the amount of power within six years. These projections anticipate growing demand from buildings, transportation, manufacturing, data centers, AI and electrification, underscoring the daunting challenge utilities face in maintaining grid reliability as demand rises.

However, Texas can accelerate its journey to becoming a grid reliability success story by taking two impactful steps. First, it could do more to encourage the adoption of distributed energy resources (DERs) like residential solar and battery storage to better balance the prodigious amounts of remote grid-scale renewables that have been deployed over the past decade. More DERs mean more local energy resources that can support the grid, especially local distribution circuits that are prone to storm-related outages. Second, by combining DERs with modern demand-side management programs and technology, utilities can access and leverage these additional resources to help them manage peak demand in real time and avoid blackout scenarios.

Near-term strategies and long-term priorities

Increasing electrical capacity with utility-scale renewable energy and storage projects and making necessary electrical infrastructure updates are critical to meet projected demand. However, these projects are complex, resource-intensive and take years to complete. The need for robust demand-side management is more urgent than ever.

Texas needs rapidly deployable solutions now. That’s where demand-side management comes in. This strategy enables grid operators to keep the lights on by lowering peak demand rather than burning more fossil fuels to meet it or, worse, shutting everything off.

Demand response, a type of demand-side management program, helps balance the grid by lowering electricity demand through load control devices. Programs typically involve residential energy consumers volunteering to let the grid operator reduce their energy consumption at a planned time or when the grid is under peak load, usually in exchange for a credit on their energy bill. ERCOT, for example, implements demand response and rate-structure programs to reduce strain on the grid, and it plans to expand these strategies in the future, especially during the months when extreme weather events are more likely and demand is highest.

Today, the primary solution for meeting peak demand and preventing blackouts is for the utility to turn on expensive, highly polluting, gas-powered “peaker” plants. Unfortunately, there’s a push to add more of these plants to the grid in anticipation of increasing demand. Instead of desperately burning fossil fuels, we should get more out of our existing infrastructure through demand-side management.
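The arithmetic behind peak shaving is straightforward. As a rough illustration (all figures below are hypothetical, not ERCOT data), shedding a modest amount of load across many enrolled homes can offset the output of an entire peaker plant:

```python
# Toy illustration of demand response peak shaving.
# All numbers are hypothetical and chosen only for scale.

def peak_after_dr(peak_mw: float, enrolled_homes: int,
                  kw_curtailed_per_home: float,
                  participation_rate: float) -> float:
    """Peak demand (MW) remaining after a demand response event."""
    shed_mw = enrolled_homes * participation_rate * kw_curtailed_per_home / 1000
    return peak_mw - shed_mw

# 500,000 enrolled homes, 70% respond, each sheds ~1.5 kW of AC load:
reduced = peak_after_dr(peak_mw=85_000, enrolled_homes=500_000,
                        kw_curtailed_per_home=1.5, participation_rate=0.7)
print(round(reduced))  # 84475 MW: 525 MW shed, roughly one peaker plant's output
```

The point of the sketch is that demand response scales with enrollment and participation, which is why customer-friendly program design matters as much as the hardware.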

Optimizing existing infrastructure

The effectiveness of demand response programs depends in part on energy customers' participation. Despite the financial incentive, customers may be reluctant to participate because they don’t want to relinquish control over their AC. Grid operators also need timely energy usage data from responsive load control technology to plan and react to demand fluctuations. Traditional load control switches don’t provide these benefits.

However, intelligent residential load management technology like smart panels can modernize demand response programs and maximize their effectiveness with real-time data and unprecedented responsiveness. Smart panels encourage customer participation with a less intrusive approach, letting customers choose which appliances to enroll. They can also send notifications ahead of demand response events, allowing customers to plan for the event or even opt out appliance by appliance. Beyond demand response, smart panels empower homeowners to optimize their home energy use and extend the runtime of home batteries during a blackout.

Utilities and government should also encourage the adoption of distributed energy resources like rooftop solar and home batteries. These resources can be combined with residential load management technology to drastically increase the effectiveness of demand response programs, granting utilities more grid-stabilizing resources to prevent blackouts.

Solar and storage play a key role

During the summer of 2023, when ERCOT set ten demand records, batteries discharging in the evening helped avoid blackouts, while solar and wind generation covered more than a third of ERCOT’s daytime load, preventing power price spikes.

Rooftop solar panels generate electricity that can be stored in battery backup systems, providing reliable energy during outages or peak demand. Smart panels extend the runtime of these batteries through automated energy optimization, ensuring critical loads are prioritized and managed efficiently.

Load management technology like smart panels enhances the effectiveness of DERs. In rolling blackouts, homeowners with battery storage can rely on smart panels to manage energy use, keeping essential appliances operational and stretching stored energy further. Smart panels also let utilities manage peak demand more effectively, enabling load flexibility and preventing the grid from being overburdened. These technologies, paired with an effective demand response strategy, can help Texans get more out of existing energy capacity and infrastructure.
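To see why prioritizing critical loads extends runtime so dramatically, consider a toy calculation (the loads and battery capacity below are made-up illustrative figures, not any particular product's specs):

```python
# Hypothetical sketch: prioritizing critical loads during an outage
# extends battery runtime. Load sizes and capacity are illustrative.

def runtime_hours(battery_kwh: float, loads_kw: dict,
                  keep: set) -> float:
    """Hours of runtime if only the loads in `keep` stay energized."""
    draw_kw = sum(kw for name, kw in loads_kw.items() if name in keep)
    return battery_kwh / draw_kw

loads = {"fridge": 0.2, "lights": 0.3, "ac": 3.0, "ev_charger": 7.0}

# Whole-home backup vs. critical-loads-only on a 13.5 kWh battery:
whole_home = runtime_hours(13.5, loads, set(loads))       # ~1.3 hours
critical = runtime_hours(13.5, loads, {"fridge", "lights"})  # 27.0 hours
```

Shedding the two largest loads turns an hour or so of backup into more than a day, which is the capability smart panels automate.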

A more resilient energy future

Texas can turn its energy challenges into opportunities by embracing advanced energy management technologies and robust demand-side strategies. Smart panels and distributed energy resources like solar and battery storage offer a promising path to a resilient and efficient grid. As Texans navigate increasing electricity demands and extreme weather events, these innovations provide hope for a future where reliable energy is accessible to all, ensuring grid stability and enhancing the quality of life across the state.

------

Kelly Warner is the CEO of Lumin, a responsive energy management solutions company.

This article originally ran on EnergyCapital.

New Rice Brain Institute partners with TMC to award inaugural grants

brain trust

The recently founded Rice Brain Institute has named the first four projects to receive research awards through the Rice and TMC Neuro Collaboration Seed Grant Program.

The new grant program brings together Rice faculty with clinicians and scientists at The University of Texas Medical Branch (UTMB), Baylor College of Medicine, UTHealth Houston and The University of Texas MD Anderson Cancer Center. The program will support pilot projects that address neurological disease, mental health and brain injury.

The first round of awards was selected from a competitive pool of 40 proposals and will support projects that reflect the Rice Brain Institute’s research agenda.

“These awards are meant to help teams test bold ideas and build the collaborations needed to sustain long-term research programs in brain health,” Behnaam Aazhang, Rice Brain Institute director and co-director of the Rice Neuroengineering Initiative, said in a news release.

The seed funding has been awarded to the following principal investigators:

  • Kevin McHugh, associate professor of bioengineering and chemistry at Rice, and Peter Kan, professor and chair of neurosurgery at UTMB. McHugh and Kan are developing an injectable material designed to seal off fragile, abnormal blood vessels that can cause life-threatening bleeding in the brain.
  • Jerzy Szablowski, assistant professor of bioengineering at Rice, and Jochen Meyer, assistant professor of neurology at Baylor. Szablowski and Meyer are leading a nonsurgical, ultrasound approach to deliver gene-based therapies to deep brain regions involved in seizures to control epilepsy without implanted electrodes or invasive procedures.
  • Juliane Sempionatto, assistant professor of electrical and computer engineering at Rice, and Aaron Gusdon, associate professor of neurosurgery at UTHealth Houston. Sempionatto and Gusdon are leading efforts to create a blood test that can identify patients at high risk for delayed brain injury following aneurysm-related hemorrhage, which could lead to earlier intervention and improved outcomes.
  • Christina Tringides, assistant professor of materials science and nanoengineering at Rice, and Sujit Prabhu, professor of neurosurgery at MD Anderson, who are working to reduce the risk of long-term speech and language impairment during brain tumor removal by combining advanced brain recordings, imaging and noninvasive stimulation.

The grants were facilitated by Rice’s Educational and Research Initiatives for Collaborative Health (ENRICH) Office. Rice says that the unique split-funding model of these grants could help structure future collaborations between the university and the TMC.

The Rice Brain Institute launched this fall and aims to use engineering, natural sciences and social sciences to research the brain and reduce the burden of neurodegenerative, neurodevelopmental and mental health disorders. Last month, the university's Shepherd School of Music also launched the Music, Mind and Body Lab, an interdisciplinary hub that brings artists and scientists together to study the "intersection of the arts, neuroscience and the medical humanities." Read more here.

Your data center is either closer than you think or much farther away

houston voices

A new study shows why some facilities cluster in cities for speed and access, while others move to rural regions in search of scale and lower costs. Based on research by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard).

Key findings:

  • Third-party colocation centers are physical facilities in close proximity to firms that use them, while cloud providers operate large data centers from a distance and sell access to virtualized computing resources as on‑demand services over the internet.
  • Hospitals and financial firms often require urban third-party centers for low latency and regulatory compliance, while batch processing and many AI workloads can operate more efficiently from lower-cost cloud hubs.
  • For policymakers trying to attract data centers, access to reliable power, water and high-capacity internet matters more than tax incentives.

Recent outages and the surge in AI-driven computing have made data center siting decisions more consequential than ever, especially as energy and water constraints tighten. Communities invest public dollars on the promise of jobs and growth, while firms weigh long-term commitments to land, power and connectivity.

Against that backdrop, a critical question comes into focus: Where do data centers get built — and what actually drives those decisions?

A new study by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard Business School) provides the first large-scale statistical analysis of data center location strategies across the United States. It offers policymakers and firms a clearer starting point for understanding how different types of data centers respond to economic and strategic incentives.

Forthcoming in the journal Strategy Science, the study examines two major types of infrastructure: third-party colocation centers that lease server space to multiple firms, and hyperscale cloud centers owned by providers like Amazon, Google and Microsoft.

Two Models, Two Location Strategies

The study draws on pre-pandemic data from 2018 and 2019, a period of relative geographic stability in supply and demand. This window gives researchers a clean baseline before remote work, AI demand and new infrastructure pressures began reshaping internet traffic patterns.

The findings show that data centers follow a bifurcated geography. Third-party centers cluster in dense urban markets, where buyers prioritize proximity to customers despite higher land and operating costs. Cloud providers, by contrast, concentrate massive sites in a small number of lower-density regions, where electricity, land and construction are cheaper and economies of scale are easier to achieve.

Third-party data centers, in other words, follow demand. They locate in urban markets where firms in finance, healthcare and IT value low latency, secure storage, and compliance with regulatory standards.

Using county-level data, the researchers modeled how population density, industry mix and operating costs predict where new centers enter. Every U.S. metro with more than 700,000 residents had at least one third-party provider, while many mid-sized cities had none.

This pattern challenges common assumptions. Third-party facilities are more distributed across urban America than prevailing narratives suggest.

Customer proximity matters because some sectors cannot absorb delay. In critical operations, even slight pauses can have real consequences. For hospital systems, lag can affect performance and risk exposure. And in high-frequency trading, milliseconds can determine whether value is captured or lost in a transaction.

“For industries where speed is everything, being too far from the physical infrastructure can meaningfully affect performance and risk,” Pan Fang says. “Proximity isn’t optional for sectors that can’t absorb delay.”
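The physics behind that proximity argument can be sketched with a back-of-the-envelope calculation. Light travels through optical fiber at roughly two-thirds of its vacuum speed, so distance translates directly into a latency floor (the distances below are illustrative, not from the study):

```python
# Back-of-the-envelope fiber latency vs. distance.
# Signals in fiber propagate at roughly 2/3 the speed of light.

def round_trip_ms(distance_km: float) -> float:
    """Approximate best-case round-trip time over fiber, in milliseconds."""
    fiber_speed_km_per_ms = 299_792.458 * (2 / 3) / 1000  # ~200 km per ms
    return 2 * distance_km / fiber_speed_km_per_ms

# A colocation center 10 km away vs. a remote hub 1,500 km away:
print(round(round_trip_ms(10), 2))    # ~0.1 ms
print(round(round_trip_ms(1500), 1))  # ~15 ms
```

Real-world latency is higher once routing and switching are added, but the gap of an order of magnitude or more is why latency-sensitive firms pay urban prices for nearby facilities.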

The Economics of Distance

For cloud providers, the picture looks very different. Their decisions follow a logic shaped primarily by cost and scale. Because cloud services can be delivered from afar, firms tend to build enormous sites in low-density regions where power is cheap and land is abundant.

These facilities can draw hundreds of megawatts of electricity and operate with far fewer employees than urban centers. “The cloud can serve almost anywhere,” Pan Fang says, “so location is a question of cost before geography.”

The study finds that cloud infrastructure clusters around network backbones and energy economics, not talent pools. Well-known hubs like Ashburn, Virginia — often called “Data Center Alley” — reflect this logic, having benefited from early network infrastructure that made them natural convergence points for digital traffic.

Local governments often try to lure data centers with tax incentives, betting they will create high-tech jobs. But the study suggests other factors matter more to cloud providers, including construction costs, network connectivity and access to reliable, affordable electricity.

When cloud centers need a local presence, distance can sometimes become a constraint. Providers often address this by working alongside third-party operators. “Third-party centers can complement cloud firms when they need a foothold closer to customers,” Pan Fang says.

That hybrid pattern — massive regional hubs complementing strategic colocation — may define the next phase of data center growth.

Looking ahead, shifts in remote work, climate resilience, energy prices and AI-driven computing may reshape where new facilities go. Some workloads may move closer to users, while others may consolidate into large rural hubs. Emerging data-sovereignty rules could also redirect investment beyond the United States.

“The cloud feels weightless,” Pan Fang says, “but it rests on real choices about land, power and proximity.”

---

This article originally appeared on Rice Business Wisdom. Written by Scott Pett.

Pan Fang and Greenstein (2025). “Where the Cloud Rests: The Economic Geography of Data Centers,” forthcoming in Strategy Science.

Houston climbs to top 10 spot on North American tech hubs index

tech report

Houston already is the Energy Capital of the World, and now it’s gaining ground as a tech hub.

On Site Selection magazine’s 2026 North American Tech Hub Index, Houston jumped to No. 10 from No. 16 last year. The index relies on data from Site Selection as well as data from CBRE, CompTIA and TeleGeography to rank the continent’s tech hotspots. The index incorporates factors such as internet connectivity, tech talent and facility projects for tech companies.

In 2023, the Greater Houston Partnership noted the region had “begun to receive its due as a prominent emerging tech hub, joining the likes of San Francisco and Austin as a major player in the sector, and as a center of activity for the next generation of innovators and entrepreneurs.”

The Houston-area tech sector employs more than 230,000 people, according to the partnership, and generates an economic impact of $21.2 billion.

Elsewhere in Texas, two other metros fared well on the Site Selection index:

  • Dallas-Fort Worth nabbed the No. 1 spot, up from No. 2 last year.
  • Austin rose from No. 8 last year to No. 7 this year.

San Antonio slid from No. 18 in 2025 to No. 22 in 2026, however.

Two economic development officials in DFW chimed in about the region’s No. 1 ranking on the index:

  • “This ranking affirms what we’ve long seen on the ground — Dallas-Fort Worth is a top-tier technology and innovation center,” said Duane Dankesreiter, senior vice president of research and innovation at the Dallas Regional Chamber. “Our region’s scale, talent base, and diverse strengths … continue to set DFW apart as a national leader.”
  • “Being recognized as the top North American tech hub underscores the strength of the entire Dallas-Fort Worth region as a center of innovation and next-generation technology,” said Robert Allen, president and CEO of the Fort Worth Economic Development Partnership.

While not directly addressing Austin’s Site Selection ranking, Thom Singer, CEO of the Austin Technology Council, recently pondered whether Silicon Hills will grow “into the kind of community that other cities study for the right reasons.”

“Austin tech is not a club. It is not a scene. It is not a hashtag, a happy hour, or any one place or person,” Singer wrote on the council’s blog. “Austin tech is an economic engine and a global brand, built by thousands of people who decided to take a risk, build something, hire others, and be part of a community that is still young enough to reinvent itself.”

South of Austin, Port San Antonio is driving much of that region’s tech activity. Occupied by more than 80 employers, the 1,900-acre tech and innovation campus was home to 18,400 workers in 2024 and created a local economic impact of $7.9 billion, according to a study by Zenith Economics.

“Port San Antonio is a prime example of how innovation and infrastructure come together to strengthen [Texas’] economy, support thousands of good jobs, and keep Texas competitive on the global stage,” said Kelly Hancock, the acting state comptroller.