CenterPoint, NVIDIA and Palantir have formed Chain Reaction. Photo via Getty Images

Houston-based utility company CenterPoint Energy is one of the founding partners of a new AI infrastructure initiative called Chain Reaction.

Technology companies NVIDIA and Palantir have joined CenterPoint in forming Chain Reaction, which is aimed at speeding up AI buildouts for energy producers and distributors, data centers and infrastructure builders. Among the initiative’s goals are to stabilize and expand the power grid to meet growing demand from data centers, and to design and develop large data centers that can support AI activity.

“The energy infrastructure buildout is the industrial challenge of our generation,” Tristan Gruska, Palantir’s head of energy and infrastructure, says in a news release. “But the software that the sector relies on was not built for this moment. We have spent years quietly deploying systems that keep power plants running and grids reliable. Chain Reaction is the result of building from the ground up for the demands of AI.”

CenterPoint serves about 7 million customers in Texas, Indiana, Minnesota and Ohio. After Hurricane Beryl struck Houston in July 2024, CenterPoint committed to building a resilient power grid for the region and chose Palantir as its “software backbone.”

“Never before have technology and energy been so intertwined in determining the future course of American innovation, commercial growth, and economic security,” Jason Wells, chairman, president and CEO of CenterPoint, added in the release.

In November, the utility company got the go-ahead from the Public Utility Commission of Texas for a $2.9 billion upgrade of its Houston-area power grid. CenterPoint serves 2.9 million customers in a 12-county territory anchored by Houston.

A month earlier, CenterPoint launched a $65 billion, 10-year capital improvement plan to support rising demand for power across all of its service territories.

---

This article originally appeared on our sister site, EnergyCapitalHTX.com.


Texas vs the nation: Comparing energy grid resilience across America

guest column

The 2024 Atlantic hurricane season has proven disastrous for the United States. On July 8, Hurricane Beryl barreled into Texas as a Category 1 storm, knocking out power for nearly 3 million customers, causing over $2.5 billion in damages, and resulting in the deaths of at least 42 people.

More recently, Hurricanes Helene and Milton tore through the Southeast, dropping trillions of gallons of rain on Florida, Georgia, South Carolina, North Carolina, Virginia, and Tennessee — collapsing dams, triggering flash floods, toppling trees, knocking out power for millions, completely destroying homes and businesses, and killing hundreds.

Amid the horror and rescue efforts, the increasing strength of natural disasters and the repeated failures of energy grids around the nation beg a few questions.

  1. Is there a version of a power grid that can better endure hurricanes, heat waves, and freezes?
  2. How does the Texas grid compare to other regional grids in the United States?
  3. What can we do to solve our power grid problems and who is responsible for implementing these solutions?

Hurricane-proof grids do not exist

There is no version of a grid anywhere in the United States that can withstand the brunt of a massive hurricane without experiencing outages.

The wind, rain, and flooding are simply too much to handle.

Some might wonder, “What if we buried the power lines?” Surely, removing the power lines from the harsh winds, rain, flying debris, and falling tree branches would be enough to keep the lights on, right?

Well, not necessarily. Putting aside the fact that burying power lines is incredibly expensive – estimates range from thousands to millions of dollars per mile buried – extended exposure to water from flood surges can still damage buried lines. Compounding the problem, flood surges are likely to seriously damage substations and transformers. When those components fail, there’s no power to run through the lines, buried or otherwise.

Heat waves and winter freezes are a different story

During extreme weather events like heat waves or winter freezes, the strain on the grid goes beyond simple issues of generation and distribution—it’s also a matter of human behavior and grid limitations.

Building and maintaining a power grid is extremely expensive, and storing electricity is not only costly but technically challenging. Most grids are designed with little "buffer" capacity to handle peak demand moments, because much of the infrastructure sits idle during normal conditions. Imagine investing billions of dollars in a power plant or wind farm that only operates at full capacity a fraction of the time. It’s difficult to recoup that investment.

When extreme weather hits, demand spikes significantly while supply remains relatively static, pushing the grid to its limits. This imbalance makes it hard to keep up with the surge in energy usage.

At the same time, our relationship with electricity has changed: our need for it has only grown. We’ve developed habits—like setting thermostats to 70 degrees or lower during summer heat waves or keeping homes balmy in winter—that, while comfortable, place additional strain on the system.

Behavioral changes, alongside investments in infrastructure, are crucial to ensuring we avoid blackouts as energy demand continues to rise in the coming years.

How the Texas grid compares to other regional grids

Is the Texas grid really in worse shape compared to other regional grids around the U.S.?

In some ways, Texas lags behind; in others, it leads.

One thing you might have heard about the Texas grid is that it is isolated, which restricts its ability to import power from neighboring regions during emergencies. Unfortunately, further connecting the Texas grid would not be a one-size-fits-all solution to its problems. The neighboring grids would need excess supply at the exact moment of need, plus the capacity to transmit that power to the right areas. Situations often arise where the Texas grid needs more power, but New Mexico, Oklahoma, Arkansas, and Louisiana have none to spare because they are experiencing similar supply and demand pressures at the same time. And even when our neighbors do have power to share, the infrastructure may not be sufficient to deliver it where it’s needed within the state.

On the other hand, Texas is leading the nation in terms of renewable development. The Lone Star State is #1 in wind power and #2 in solar power, only behind California. There are, of course, valid concerns about heavy reliance on renewables when the wind isn’t blowing or the sun isn’t shining, compounded by a lack of large-scale battery storage. Then, there’s the underlying cost and ecological footprint associated with the manufacturing of those batteries.

Yet, the only state with more utility-scale storage than Texas is California.

In recent years, ERCOT has pushed generators and utility companies to increase their winterization efforts and has incentivized the buildout of renewables and electricity storage. You might have also heard about the Texas Energy Fund, which represents the state’s latest effort to further incentivize grid stability. Improvements are underway, but they may not be enough if homeowners and renters across the state are unwilling to set their thermostats a bit higher during extended heat waves.

How can we fix the Texas grid?

Here’s the reality we must face – a disaster-proof, on-demand, renewable-powered grid is extremely expensive and cannot be implemented quickly. We must come to terms with the fact that the impact of natural disasters is unavoidable, no matter how much we “upgrade” the infrastructure.

Ironically, the most impactful solution out there is free and requires only a few seconds to implement. Simple changes to human behavior are the strongest tool we have at our disposal to prevent blackouts in Texas. By decreasing our collective demand for electricity at the right times, we can all help keep the lights on and prices low.

During peak hours, the cumulative effort is as simple as turning off the lights, turning the thermostat up a few degrees, and running appliances like dishwashers and laundry machines overnight.

Another important element we cannot avoid addressing is global warming. As surface temperatures rise, weather patterns change and, in many cases, become more volatile.

The more fossil fuels we burn, the more greenhouse gases are released into the atmosphere. More greenhouse gases in the atmosphere lead to more volatile weather. Volatile weather, in turn, contributes to extreme grid strain in the form of heat waves, winter freezes, and hurricanes. This is no simple matter to solve, because the energy needs and capabilities of different countries differ. That is why some countries around the globe continue to expand their investments in coal, the fossil fuel that burns the dirtiest and releases the most greenhouse gases per unit of energy produced.

While governments and private organizations continue to improve carbon capture, renewable, and energy storage technologies, individuals can aid these efforts by changing their behavior. There are many impactful things we can do to reduce our carbon footprint: adjusting our thermostats a few degrees, eating less red meat, driving less often, and purchasing fewer single-use plastics, to name a few.

If we want to see real change, we need action by all parties. The complex system of generation, transmission, and consumption all need to experience radical change, or the vicious cycle will only continue.

———

Sam Luna is director at BKV Energy, where he oversees brand and go-to-market strategy, customer experience, marketing execution, and more.

This article originally ran on EnergyCapital.


Houston VC funding surged nearly 50% in Q1 2026, report says

VC victories

First-quarter venture capital funding for Houston-area startups climbed nearly 50 percent compared to the same time last year, according to the PitchBook-NVCA Venture Monitor.

In Q1 2026, Houston-area startups raised $532.3 million, a 49 percent jump from the $320.2 million raised in Q1 2025.

However, the Q1 total fell 23 percent from the $671.05 million raised in Q4 2025.

Among the first-quarter funding highlights in Houston were:

  • Utility Global, which focuses on industrial decarbonization, announced a first close of $100 million for its Series D round.
  • Sage Geosystems raised a $97 million Series B round to support its geothermal energy storage technology.

Those funding rounds underscore Houston’s evolution as a magnet for VC in the energy sector.

“Today, the energy sector is increasingly extending into the startup economy as venture capital flows into companies developing the technologies that will shape the future of global energy,” the Greater Houston Partnership says.

The energy industry accounted for nearly 40 percent of Houston-area VC funding last year, according to market research and lead generation service Growth List.

Adding to Houston’s stature in VC for energy startups are investors like Chevron Technology Ventures, the investment arm of Houston-based oil and gas giant Chevron; Goose Capital; Mercury Fund; and Quantum Energy Partners.

How Houston innovators played a role in the historic Artemis II splashdown

safe landing

Research from Rice University played a critical role in the safe return of U.S. astronauts aboard NASA’s Artemis II mission this month.

Rice mechanical engineer Tayfun E. Tezduyar and longtime collaborator Kenji Takizawa developed a key computational parachute fluid-structure interaction (FSI) analysis system that proved vital to the descent of NASA’s Orion capsule into the Pacific Ocean. The FSI system, originally developed in 2013 alongside NASA Johnson Space Center, was critical to Orion’s three-parachute design, which slowed the capsule as it returned to Earth, according to Rice.

The model helped ensure that the parachute design was large enough to slow the capsule for a safe landing while also being stable enough to prevent the capsule from oscillating as it descended.

“You cannot separate the aerodynamics from the structural dynamics,” Tezduyar said in a news release. “They influence each other continuously and even more so for large spacecraft parachutes, so the analysis must capture that interaction in a robustly coupled way.”

The end result was a final parachute system, refined through NASA drop tests and Rice’s computational FSI analysis, that eliminated fluctuations and produced a stable descent profile.

Apart from the dynamic challenges in design, modeling Orion’s parachutes also required solving complex equations that considered airflow and fabric deformation and accounted for features like ringsail canopy construction and aerodynamic interactions among multiple parachutes in a cluster.

“Essentially, my entire group was dedicated to that work, because I considered it a national priority,” Tezduyar added in the release. “Kenji and I were personally involved in every computer simulation. Some of the best graduate students and research associates I met in my career worked on the project, creating unique, first-of-its-kind parachute computer simulations, one after the other.”

Current Intuitive Machines engineer Mario Romero also worked on Orion during his time at NASA. From 2018 to 2021, Romero was a member of the Orion Crew Capsule Recovery Team, which focused on creating likely scenarios that crewmembers could encounter in Orion.

The team trained in NASA’s 6.2-million-gallon pool, using wave machines to replicate a range of sea conditions. They also simulated worst-case scenarios by cutting the lights, blasting high-powered fans and tipping a mock capsule to mimic distress situations. In some drills, mock crew members were treated as “injured,” requiring the team to practice safe, controlled egress procedures.

“It’s hard to find the appropriate descriptors that can fully encapsulate the feeling of getting to witness all the work we, and everyone else, did being put into action,” Romero tells InnovationMap. “I loved seeing the reactions of everyone, but especially of the Houston communities—that brought me a real sense of gratitude and joy.”

Intuitive Machines was also selected to support the Artemis II mission using its Space Data Network and ground station infrastructure. The company monitored radio signals sent from the Orion spacecraft and used Doppler measurements to help determine the spacecraft's precise position and speed.

Tim Crain, chief technology officer at Intuitive Machines, wrote about the experience last week.

“I specialized in orbital mechanics and deep space navigation in graduate school,” Crain shared. “But seeing the theory behind tracking spacecraft come to life as they thread through planetary gravity fields on ultra-precise trajectories still seems like magic.”