Elon Musk announced that both SpaceX and X will relocate headquarters to two Texas cities. Photo via Getty Images

Elon Musk vowed this week to upend another industry just as he did with cars and rockets — and once again he's taking on long odds.

The world's richest man said he wants to put as many as a million satellites into orbit to form vast, solar-powered data centers in space — a move to allow expanded use of artificial intelligence and chatbots without triggering blackouts and sending utility bills soaring.

To finance that effort, Musk combined SpaceX with his AI business on Monday, February 2, and plans a big initial public offering of the combined company.

“Space-based AI is obviously the only way to scale,” Musk wrote on SpaceX’s website, adding about his solar ambitions, “It’s always sunny in space!”

But scientists and industry experts say even Musk — who outsmarted Detroit to turn Tesla into the world’s most valuable automaker — faces formidable technical, financial and environmental obstacles.

Feeling the heat

Capturing the sun’s energy from space to run chatbots and other AI tools would ease pressure on power grids and cut demand for sprawling computing warehouses, which are consuming farmland, forests and vast amounts of cooling water on Earth.

But space presents its own set of problems.

Data centers generate enormous heat. Space seems to offer a solution because it is cold. But it is also a vacuum, which traps heat inside objects much as a Thermos keeps coffee hot: with no air between its double walls, there is no way for heat to escape by conduction or convection.

“An uncooled computer chip in space would overheat and melt much faster than one on Earth,” said Josep Jornet, a computer and electrical engineering professor at Northeastern University.

One fix is to build giant radiator panels that glow in infrared light to push the heat “out into the dark void,” said Jornet, noting that the technology has worked on a small scale, including on the International Space Station. But for Musk's data centers, he said, it would require an array of “massive, fragile structures that have never been built before.”
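To get a rough sense of the scale involved, the physics of those radiator panels can be sketched with the Stefan-Boltzmann law, which governs how much heat a surface in vacuum can shed by radiation alone. The emissivity, panel temperature and heat load below are illustrative assumptions, not figures from any actual satellite design:

```python
# Back-of-envelope: radiator area needed to reject waste heat in vacuum.
# Radiation is the only escape path: P = epsilon * sigma * A * T^4.
# All numbers here are illustrative assumptions, not design figures.

SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W / (m^2 K^4)
EMISSIVITY = 0.9       # assumed high-emissivity radiator coating
PANEL_TEMP_K = 300.0   # assumed radiator surface temperature (~27 C)

def radiator_area_m2(heat_load_w: float) -> float:
    """Area needed to radiate heat_load_w watts to deep space,
    ignoring sunlight absorbed by the panel (a simplification)."""
    flux = EMISSIVITY * SIGMA * PANEL_TEMP_K ** 4  # W per m^2
    return heat_load_w / flux

# A 1 MW compute payload -- modest by data-center standards:
area = radiator_area_m2(1_000_000)
print(f"{area:.0f} m^2 of radiator per megawatt")  # roughly 2,400 m^2
```

Under these assumptions, every megawatt of computing needs on the order of a couple of thousand square meters of radiator, which is why Jornet describes the required structures as massive.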

Floating debris

Then there is space junk.

A single satellite breaking down or losing orbit could trigger a cascade of collisions, potentially disrupting emergency communications, weather forecasting and other services.

Musk noted in a recent regulatory filing that he has had only one “low-velocity debris generating event” in seven years of running Starlink, his satellite communications network. Starlink has operated about 10,000 satellites — but that's a fraction of the million or so he now plans to put in space.

“We could reach a tipping point where the chance of collision is going to be too great,” said the University at Buffalo’s John Crassidis, a former NASA engineer. “And these objects are going fast — 17,500 miles per hour. There could be very violent collisions.”
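Crassidis’ speed figure can be put in perspective with a quick kinetic-energy calculation. The 1 kg fragment mass below is an illustrative choice, and real closing speeds depend on collision geometry, but even this simple estimate shows why the collisions would be violent:

```python
# Why orbital collisions are "very violent": kinetic energy scales with v^2.
# 17,500 mph is the low-Earth-orbit speed quoted in the article; the 1 kg
# fragment mass is an illustrative assumption.

MPH_TO_MS = 0.44704  # miles per hour -> meters per second

def kinetic_energy_joules(mass_kg: float, speed_mph: float) -> float:
    """Kinetic energy E = (1/2) m v^2 for an object at the given speed."""
    v = speed_mph * MPH_TO_MS
    return 0.5 * mass_kg * v ** 2

e = kinetic_energy_joules(1.0, 17_500)
print(f"{e / 1e6:.0f} MJ")          # energy of a 1 kg fragment at orbital speed
print(f"{e / 4.184e6:.1f} kg TNT")  # TNT equivalent (4.184 MJ per kg of TNT)
```

A single kilogram of debris at orbital speed carries roughly the energy of several kilograms of TNT, which is what makes even small collision fragments dangerous.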

No repair crews

Even without collisions, satellites fail, chips degrade, parts break.

The specialized graphics processing units, or GPUs, used by AI companies, for instance, can become damaged and need to be replaced.

“On Earth, what you would do is send someone down to the data center,” said Baiju Bhatt, CEO of Aetherflux, a space-based solar energy company. “You replace the server, you replace the GPU, you’d do some surgery on that thing and you’d slide it back in.”

But no such repair crew exists in orbit, and GPUs in space are vulnerable to damage from exposure to high-energy particles from the sun.

Bhatt says one workaround is to overprovision the satellite with extra chips to replace the ones that fail. But that is an expensive proposition: the chips are likely to cost tens of thousands of dollars each, and current Starlink satellites have a lifespan of only about five years.
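The trade-off Bhatt describes can be sketched with a simple reliability model, assuming chips fail independently. The GPU count and per-chip failure probability below are invented for illustration, not Starlink or Aetherflux data:

```python
# Sketch of the overprovisioning trade-off: how many spare GPUs to launch
# so the satellite likely keeps its required capacity through its life.
# Fleet size and failure rate are illustrative assumptions.

from math import comb

def prob_enough_survive(total: int, needed: int, p_fail: float) -> float:
    """P(at least `needed` of `total` units survive the mission), with
    independent per-unit failure probability p_fail (binomial model)."""
    p_live = 1.0 - p_fail
    return sum(
        comb(total, k) * p_live**k * p_fail**(total - k)
        for k in range(needed, total + 1)
    )

NEEDED = 8      # GPUs required for the workload (assumed)
P_FAIL = 0.15   # assumed per-GPU failure probability over five years

for spares in range(0, 6):
    p = prob_enough_survive(NEEDED + spares, NEEDED, P_FAIL)
    print(f"{spares} spares -> {p:.1%} chance of full capacity")
```

Under these toy numbers, launching no spares leaves a satellite with well under a 50% chance of keeping full capacity, while each additional spare buys a diminishing increment of reliability at tens of thousands of dollars per chip.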

Competition — and leverage

Musk is not alone in trying to solve these problems.

Starcloud, a company in Redmond, Washington, launched a satellite in November carrying a single Nvidia-made AI chip to test how it fares in space. Google is exploring orbital data centers in a venture it calls Project Suncatcher. And Jeff Bezos’ Blue Origin announced plans in January for a constellation of more than 5,000 satellites to start launching late next year, though its focus has been more on communications than AI.

Still, Musk has an edge: He's got rockets.

Starcloud had to use one of his Falcon rockets to put its chip in space last year. Aetherflux plans to send a set of chips it calls a Galactic Brain to space on a SpaceX rocket later this year. And Google may also need to turn to Musk to get its first two planned prototype satellites off the ground by early next year.

Pierre Lionnet, a research director at the trade association Eurospace, says Musk routinely charges rivals far more than he charges himself — as much as $20,000 per kilogram of payload versus $2,000 internally.

He said Musk’s announcements this week signal that he plans to use that advantage to win this new space race.

“When he says we are going to put these data centers in space, it’s a way of telling the others we will keep these low launch costs for myself,” said Lionnet. “It’s a kind of power play.”

A map of U.S. data centers. Courtesy of Rice Business Wisdom

Your data center is either closer than you think or much farther away

houston voices

A new study shows why some facilities cluster in cities for speed and access, while others move to rural regions in search of scale and lower costs. Based on research by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard).

Key findings:

  • Third-party colocation centers are physical facilities in close proximity to firms that use them, while cloud providers operate large data centers from a distance and sell access to virtualized computing resources as on‑demand services over the internet.
  • Hospitals and financial firms often require urban third-party centers for low latency and regulatory compliance, while batch processing and many AI workloads can operate more efficiently from lower-cost cloud hubs.
  • For policymakers trying to attract data centers, access to reliable power, water and high-capacity internet matters more than tax incentives.

Recent outages and the surge in AI-driven computing have made data center siting decisions more consequential than ever, especially as energy and water constraints tighten. Communities commit public dollars on the promise of jobs and growth, while firms weigh long-term commitments to land, power and connectivity.

Against that backdrop, a critical question comes into focus: Where do data centers get built — and what actually drives those decisions?

A new study by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard Business School) provides the first large-scale statistical analysis of data center location strategies across the United States. It offers policymakers and firms a clearer starting point for understanding how different types of data centers respond to economic and strategic incentives.

Forthcoming in the journal Strategy Science, the study examines two major types of infrastructure: third-party colocation centers that lease server space to multiple firms, and hyperscale cloud centers owned by providers like Amazon, Google and Microsoft.

Two Models, Two Location Strategies

The study draws on pre-pandemic data from 2018 and 2019, a period of relative geographic stability in supply and demand. This window gives researchers a clean baseline before remote work, AI demand and new infrastructure pressures began reshaping internet traffic patterns.

The findings show that data centers follow a bifurcated geography. Third-party centers cluster in dense urban markets, where buyers prioritize proximity to customers despite higher land and operating costs. Cloud providers, by contrast, concentrate massive sites in a small number of lower-density regions, where electricity, land and construction are cheaper and economies of scale are easier to achieve.

Third-party data centers, in other words, follow demand. They locate in urban markets where firms in finance, healthcare and IT value low latency, secure storage, and compliance with regulatory standards.

Using county-level data, the researchers modeled how population density, industry mix and operating costs predict where new centers enter. Every U.S. metro with more than 700,000 residents had at least one third-party provider, while many mid-sized cities had none.

This pattern challenges common assumptions. Third-party facilities are more distributed across urban America than prevailing narratives suggest.

Customer proximity matters because some sectors cannot absorb delay. In critical operations, even slight pauses can have real consequences. For hospital systems, lag can affect performance and risk exposure. And in high-frequency trading, milliseconds can determine whether value is captured or lost in a transaction.

“For industries where speed is everything, being too far from the physical infrastructure can meaningfully affect performance and risk,” Pan Fang says. “Proximity isn’t optional for sectors that can’t absorb delay.”

The Economics of Distance

For cloud providers, the picture looks very different. Their decisions follow a logic shaped primarily by cost and scale. Because cloud services can be delivered from afar, firms tend to build enormous sites in low-density regions where power is cheap and land is abundant.

These facilities can draw hundreds of megawatts of electricity and operate with far fewer employees than urban centers. “The cloud can serve almost anywhere,” Pan Fang says, “so location is a question of cost before geography.”

The study finds that cloud infrastructure clusters around network backbones and energy economics, not talent pools. Well-known hubs like Ashburn, Virginia — often called “Data Center Alley” — reflect this logic, having benefited from early network infrastructure that made them natural convergence points for digital traffic.

Local governments often try to lure data centers with tax incentives, betting they will create high-tech jobs. But the study suggests other factors matter more to cloud providers, including construction costs, network connectivity and access to reliable, affordable electricity.

When cloud centers need a local presence, distance can sometimes become a constraint. Providers often address this by working alongside third-party operators. “Third-party centers can complement cloud firms when they need a foothold closer to customers,” Pan Fang says.

That hybrid pattern — massive regional hubs complementing strategic colocation — may define the next phase of data center growth.

Looking ahead, shifts in remote work, climate resilience, energy prices and AI-driven computing may reshape where new facilities go. Some workloads may move closer to users, while others may consolidate into large rural hubs. Emerging data-sovereignty rules could also redirect investment beyond the United States.

“The cloud feels weightless,” Pan Fang says, “but it rests on real choices about land, power and proximity.”

---

This article originally appeared on Rice Business Wisdom. Written by Scott Pett.

Pan Fang and Greenstein (2025). “Where the Cloud Rests: The Economic Geography of Data Centers,” forthcoming in Strategy Science.

CenterPoint, NVIDIA and Palantir have formed Chain Reaction. Photo via Getty Images

CenterPoint and partners launch AI initiative to stabilize the power grid

AI infrastructure

Houston-based utility company CenterPoint Energy is one of the founding partners of a new AI infrastructure initiative called Chain Reaction.

Chipmaker NVIDIA and software company Palantir have joined CenterPoint in forming Chain Reaction, which is aimed at speeding up AI buildouts for energy producers and distributors, data centers and infrastructure builders. Among the initiative’s goals are to stabilize and expand the power grid to meet growing demand from data centers, and to design and develop large data centers that can support AI activity.

“The energy infrastructure buildout is the industrial challenge of our generation,” Tristan Gruska, Palantir’s head of energy and infrastructure, says in a news release. “But the software that the sector relies on was not built for this moment. We have spent years quietly deploying systems that keep power plants running and grids reliable. Chain Reaction is the result of building from the ground up for the demands of AI.”

CenterPoint serves about 7 million customers in Texas, Indiana, Minnesota and Ohio. After Hurricane Beryl struck Houston in July 2024, CenterPoint committed to building a resilient power grid for the region and chose Palantir as its “software backbone.”

“Never before have technology and energy been so intertwined in determining the future course of American innovation, commercial growth, and economic security,” Jason Wells, chairman, president and CEO of CenterPoint, added in the release.

In November, the utility company got the go-ahead from the Public Utility Commission of Texas for a $2.9 billion upgrade of its Houston-area power grid. CenterPoint serves 2.9 million customers in a 12-county territory anchored by Houston.

A month earlier, CenterPoint launched a $65 billion, 10-year capital improvement plan to support rising demand for power across all of its service territories.

---

This article originally appeared on our sister site, EnergyCapitalHTX.com.

HPE will supply distributed hybrid multicloud technology to the DIS Agency. Photo courtesy of HPE

Houston-based HPE wins $931M contract to upgrade military data centers

defense data centers

Hewlett Packard Enterprise (HPE), based in Spring, Texas, which provides AI, cloud, and networking products and services, has received a $931 million contract to modernize data centers run by the federal Defense Information Systems Agency.

HPE says it will supply distributed hybrid multicloud technology to the federal agency, which provides combat support for U.S. troops. The project will feature HPE’s Private Cloud Enterprise and GreenLake offerings. It will allow DISA to scale and accelerate communications, improve AI and data analytics, boost IT efficiencies, reduce costs and more, according to a news release from HPE.

The contract comes after the completion of HPE’s test of distributed hybrid multicloud technology at Defense Information Systems Agency (DISA) data centers in Mechanicsburg, Pennsylvania, and Ogden, Utah. This technology is aimed at managing DISA’s IT infrastructure and resources across public and private clouds through one hybrid multicloud platform, according to Data Center Dynamics.

Fidelma Russo, executive vice president and general manager of hybrid cloud at HPE, said in a news release that the project will enable DISA to “deliver innovative, future-ready managed services to the agencies it supports that are operating across the globe.”

The platform being developed for DISA “is designed to mirror the look and feel of a public cloud, replicating many of the key features” offered by cloud computing businesses such as Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform, according to The Register.

In the 1990s, DISA consolidated 194 data centers into 16. According to The Register, these are the U.S. military’s most sensitive data centers.

More recently, in 2024, the Fort Meade, Maryland-based agency laid out a five-year strategy to “simplify the network globally with large-scale adoption of command IT environments,” according to Data Center Dynamics.

Google is investing in Texas. Photo courtesy of Google

Google to invest $40 billion in AI data centers in Texas

Google is investing a huge chunk of money in Texas: According to a release, the company will invest $40 billion in cloud and artificial intelligence (AI) infrastructure, including the development of new data centers in Armstrong and Haskell counties.

The company announced its intentions at a meeting on November 14 attended by federal, state, and local leaders, including Gov. Greg Abbott, who called it "a Texas-sized investment."

Google will open two new data center campuses in Haskell County and a data center campus in Armstrong County.

Additionally, the first building at the company’s Red Oak campus in Ellis County is now operational. Google is continuing to invest in its existing Midlothian campus and Dallas cloud region, which belong to the company’s global network of 42 cloud regions delivering the high-performance, low-latency services that businesses and organizations use to build and scale their own AI-powered solutions.

Energy demands

Google is committed to responsibly growing its infrastructure by bringing new energy resources onto the grid, paying for costs associated with its operations, and supporting community energy efficiency initiatives.

One of the new Haskell data centers will be co-located with — or built directly alongside — a new solar and battery energy storage plant, creating the first industrial park to be developed through Google’s partnership with Intersect and TPG Rise Climate announced last year.

Google has contracted to add more than 6,200 megawatts (MW) of net new energy generation and capacity to the Texas electricity grid through power purchase agreements (PPAs) with energy developers such as AES Corporation, Enel North America, Intersect, Clearway, ENGIE, SB Energy, Ørsted, and X-Elio.

Water demands

Google’s three new facilities in Armstrong and Haskell counties will use air-cooling technology, limiting water use to site operations like kitchens. The company is also contributing $2.6 million to help Texas Water Trade create and enhance up to 1,000 acres of wetlands along the Trinity-San Jacinto Estuary. Google is also sponsoring a regenerative agriculture program with Indigo Ag in the Dallas-Fort Worth area and an irrigation efficiency project with N-Drip in the Texas High Plains.

In addition to the data centers, Google is committing $7 million in grants to support AI-related initiatives in healthcare, energy, and education across the state. This includes helping CareMessage enhance rural healthcare access; enabling the University of Texas at Austin and Texas Tech University to address energy challenges that will arise with AI, and expanding AI training for Texas educators and students through support to Houston City College.

---

This article originally appeared on CultureMap.com.

An aerial view of Stargate’s AI data center in Abilene. Photo courtesy OpenAI.

Abbott highlights Texas AI boom, with Houston projects on the horizon

AI investments are booming in Texas, Gov. Greg Abbott says. And Houston is poised to benefit from this surge.

At a recent Texas Economic Development Corp. gathering in the Dallas-Fort Worth area, Abbott said AI projects on the horizon in the Lone Star State would be bigger than the $500 billion multistate Project Stargate, according to the Dallas Business Journal. So far, Stargate includes three AI data centers in Texas.

Stargate, a new partnership among OpenAI, Oracle, SoftBank, and the federal government, is building AI infrastructure around the country. The project’s first data center is in Abilene, and the center’s second phase is underway. Once the second phase is finished, the 875-acre site will host eight buildings totaling about 4 million square feet with a power capacity of 1.2 gigawatts. An additional 600 megawatts of capacity might be added later.

On Sept. 23, Stargate announced the development of another five AI data centers in the U.S., including a new facility in Shackelford County, Texas, near Abilene. That facility is likely a roughly $25 billion, 1.4-gigawatt AI data center that Vantage Data Centers is building on a 1,200-acre site in Shackelford County.

Another will be in Milam County, between Waco and Austin. In conjunction with Stargate, OpenAI plans to occupy the more than $3 billion center, which will be situated on a nearly 600-acre site, the Austin Business Journal reported. OpenAI has teamed up with SoftBank-backed SB Energy Global to build the facility.

Abbott said several unannounced AI projects in Texas — namely, data centers — will be larger than Stargate.

“Bottom line is ... when you look at diversification, the hottest thing going on right now is artificial intelligence,” Abbott said.

The Houston area almost certainly stands to attract some of the projects teased by the governor.

In Houston, Taiwanese tech manufacturer Foxconn is already investing $450 million to make AI servers at the 100-acre Fairbanks Logistics Park, which it recently purchased for a reported $142 million. The park features four industrial buildings totaling 1 million square feet. Foxconn appears set to manufacture the servers for Apple and Nvidia, both of which have announced they’ll open server factories in Houston.

The Foxconn, Apple, and Nvidia initiatives are high-profile examples of Houston’s ascent in the AI economy. A report released in July by the Brookings Institution identified Houston as one of the country’s 28 “star” hubs for AI.

The Greater Houston Partnership says the Houston area is undergoing an "AI-driven data revolution."

“As Houston rapidly evolves into a hub for AI, cloud computing, and data infrastructure, the city is experiencing a surge in data center investments driven by its unique position at the intersection of energy, technology, and innovation,” the partnership says.


Houston startup debuts new drone for first responders

taking flight

Houston-based Paladin Drones has debuted Knighthawk 2.0, its new autonomous, first-responder drone.

The drone aims to strengthen emergency response and protect first responders, the company said in a news release.

“We’re excited to launch Knighthawk 2.0 to help build safer cities and give any city across the world less than a 70-second response time for any emergency,” said Divyaditya Shrivastava, CEO of Paladin.

The Knighthawk 2.0 is built on Paladin’s Drone as a First Responder (DFR) technology. It is equipped with an advanced thermal camera and long-range 5G/LTE connectivity, providing first responders with live, critical aerial awareness before crews reach the ground. The new drone is National Defense Authorization Act-compliant and integrates with Paladin's existing products, Watchtower and Paladin EXT.

Knighthawk 2.0 can log more than 40 minutes of flight time and is faster than its predecessor, reaching a reported cruising speed of more than 70 kilometers per hour. It also features more advanced sensors, precision GPS and obstacle avoidance technology, allowing it to operate across a variety of terrains and emergency conditions.

Paladin also announced a partnership with Portuguese drone manufacturer Beyond Vision to integrate its Drone as a First Responder (DFR) technology with Beyond Vision’s NATO-compliant, fully autonomous unmanned aerial systems. Paladin has begun to deploy the Knighthawk 2.0 internationally, including in India and Portugal.

The company raised a $5.2 million seed round in 2024 and another round for an undisclosed amount earlier this year. In 2019, Houston’s Memorial Villages Police Department piloted Paladin’s technology.

Paladin says it wants autonomous drones responding to every 911 call in the U.S. by 2027.

Rice research explores how shopping data could reshape credit scores

houston voices

More than a billion people worldwide can’t access credit cards or loans because they lack a traditional credit score. Without a formal borrowing history, banks often view them as unreliable and risky. To reach these borrowers, lenders have begun experimenting with alternative signals of financial reliability, such as consistent utility or mobile phone payments.

New research from Rice Business builds on that approach. Previous work by assistant professor of marketing Jung Youn Lee showed that everyday data like grocery store receipts can help expand access to credit and support upward mobility. Her latest study extends this insight, using broader consumer spending patterns to explore how alternative credit scores could be created for people with no credit history.

Forthcoming in the Journal of Marketing Research, the study finds that when lenders use data from daily purchases — at grocery, pharmacy, and home improvement stores — credit card approval rates rise. The findings give lenders a powerful new tool to connect the unbanked to credit, laying the foundation for long-term financial security and stronger local economies.

Turning Shopping Habits into Credit Data

To test the impact of retail transaction data on credit card approval rates, the researchers partnered with a Peruvian company that owns both retail businesses and a credit card issuer. In Peru, only 22% of people report borrowing money from a formal financial institution or using a mobile money account.

The team combined three sets of data: credit card applications from the company, loyalty card transactions, and individuals’ credit histories from Peru’s financial regulatory authority. The company’s point-of-sale data included the types of items purchased, how customers paid, and whether they bought sale items.

“The key takeaway is that we can create a new kind of credit score for people who lack traditional credit histories, using their retail shopping behavior to expand access to credit,” Lee says.

The final sample included 46,039 credit card applicants who had received a single credit decision, had no delinquent loans, and made at least one purchase between January 2021 and May 2022. Of these, 62% had a credit history and 38% did not.

Using this data, the researchers built an algorithm that generated credit scores based on retail purchases and predicted repayment behavior in the six months following the application. They then simulated credit card approval decisions.
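The article does not publish the model itself, so as a purely hypothetical illustration of the shape such a retail-based score could take, here is a toy logistic scoring rule. The features, weights and approval threshold are all invented for this sketch and are not from the study:

```python
# Hypothetical sketch of a retail-based credit score. The study's actual
# model and features are not disclosed in this article; everything below
# (features, weights, threshold) is invented to show the general shape.

from math import exp

# Toy features per applicant: fraction of items bought on sale, share of
# spend on essentials (grocery/pharmacy), and payment regularity, each 0-1.
WEIGHTS = {"sale_item_share": 1.2, "essentials_share": 0.8, "regularity": 1.5}
BIAS = -1.0

def repay_probability(features: dict) -> float:
    """Logistic score: higher means more likely to repay (toy model)."""
    z = BIAS + sum(WEIGHTS[k] * features[k] for k in WEIGHTS)
    return 1.0 / (1.0 + exp(-z))

def approve(features: dict, threshold: float = 0.6) -> bool:
    """Fixed-threshold approval rule, mirroring the study's first
    simulated lending policy (threshold value is an assumption)."""
    return repay_probability(features) >= threshold

# A hypothetical bargain-hunting, regular-paying shopper:
bargain_hunter = {"sale_item_share": 0.6, "essentials_share": 0.7, "regularity": 0.9}
print(approve(bargain_hunter))
```

The study's second simulated policy, which targets a fixed default rate rather than a fixed score threshold, would correspond to choosing the threshold so that expected defaults among approved applicants stay below a cap.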

Retail Scores Boost Approvals, Reduce Defaults

The researchers found that using retail purchase data to build credit scores for people without traditional credit histories significantly increased their chances of approval. Certain shopping behaviors — such as seeking out sale items — were linked to greater reliability as borrowers.

For lenders using a fixed credit score threshold, approval rates rose from 15.5% to 47.8%. Lenders basing decisions on a target loan default rate also saw approvals rise, from 15.6% to 31.3%.

“This approach benefits unbanked applicants regardless of a lender’s specific goals — though the size of the benefit may vary,” Lee says.

Applicants without credit histories who were approved using the retail-based credit score were also more likely to repay their loans, indicating genuine creditworthiness. Among first-time borrowers, the default rate dropped from 4.74% to 3.31% when lenders incorporated retail data into their decisions and kept approval rates constant.

For applicants with existing credit histories, the opposite was true: approval rates fell slightly, from 87.5% to 84.5%, as the new model more effectively screened out high-risk applicants.

Expanding Access, Managing Risk

The study offers clear takeaways for banks and credit card companies. Lenders who want to approve more applications without taking on too much risk can use parts of the researchers’ model to design their own credit scoring tools based on customers’ shopping habits.

Still, Lee says, the process must be transparent. Consumers should know how their spending data might be used and decide for themselves whether the potential benefits outweigh privacy concerns. That means lenders must clearly communicate how data is collected, stored, and protected — and ensure customers can opt in with informed consent.

Banks should also keep a close eye on first-time borrowers to make sure they’re using credit responsibly. “Proactive customer management is crucial,” Lee says. That might mean starting people off with lower credit limits and raising them gradually as they demonstrate good repayment behavior.

This approach can also discourage people from trying to “game the system” by changing their spending patterns temporarily to boost their retail-based credit score. Lenders can design their models to detect that kind of behavior, too.

The Future of Credit

One risk of using retail data is that lenders might unintentionally reject applicants who would have qualified under traditional criteria — say, because of one unusual purchase. Lee says banks can fine-tune their models to minimize those errors.

She also notes that the same approach could eventually be used for other types of loans, such as mortgages or auto loans. Combined with her earlier research showing that grocery purchase data can predict defaults, the findings strengthen the case that shopping behavior can reliably signal creditworthiness.

“If you tend to buy sale items, you’re more likely to be a good borrower. Or if you often buy healthy food, you’re probably more creditworthy,” Lee explains. “This idea can be applied broadly, but models should still be customized for different situations.”

---

This article originally appeared on Rice Business Wisdom. Written by Deborah Lynn Blumberg.

Anderson, Lee, and Yang (2025). “Who Benefits from Alternative Data for Credit Scoring? Evidence from Peru,” Journal of Marketing Research.

XSpace adds 3 Houston partners to fuel national expansion

growth mode

Texas-based XSpace Group has brought onboard three partners from the Houston area to ramp up the company’s national expansion.

The new partners of XSpace, which sells high-end multi-use commercial condos, are KDW, Pyek Financial and Welcome Wilson Jr. Houston-based KDW is a design-build real estate developer, Katy-based Pyek offers fractional CFO services, and Wilson is president and CEO of Welcome Group, a Houston real estate development firm.

“KDW has been shaping the commercial [real estate] landscape in Texas for years, and Pyek Financial brings deep expertise in scaling businesses and creating long‑term value,” says Byron Smith, founder of XSpace. “Their commitment to XSpace is a powerful endorsement of our model and momentum. With their resources, we’re accelerating our growth and building the foundation for nationwide expansion.”

The expansion effort will target high-growth markets, potentially including Nashville, Tennessee; Orlando, Florida; and Charlotte and Raleigh, North Carolina.

XSpace launched in Austin with a $20 million, 90,000-square-foot project featuring 106 condos. The company later added locations on Old Katy Road in Houston and at The Woodlands Town Center. A third Houston-area location is coming to the Design District.

XSpace condos range in size from 300 to 3,000 square feet. They can accommodate a variety of uses, such as a luxury-car storage space, a satellite office, or a podcasting studio.

“XSpace has tapped into a fundamental shift in how entrepreneurs and professionals want to use space,” Wilson says. “Houston is one of the best places in the country to innovate and build, and XSpace’s model is perfectly aligned with the needs of this fast‑growing, opportunity‑driven market.”