"To solve the climate crisis, confidence in emissions data is crucial." Photo via Getty Images

Sustainability has been top of mind across industries as the push to reduce carbon emissions gains momentum. The Securities and Exchange Commission (SEC), for instance, has proposed a new rule that would require companies to disclose certain climate-related information in their reporting at the federal level. Now, industries and cities are scrambling to make sure they have the right strategies in place.

While sustainability data poses challenges across industries, the problem is especially evident in oil and gas, whose role in the energy transition is of the utmost importance, particularly in Texas. We saw this at the COP26 summit in Glasgow last November, for example, in the push to reduce carbon emissions on both a national and international scale and keep global warming within 1.5 degrees Celsius.

The event also made it clear that staying within that limit and reaching carbon neutrality by 2030 won’t be possible if organizations rely on current methods and siloed data. In short, there is a data problem behind recent climate goals. So, what does that mean for Houston’s oil and gas industry?

Climate is a critical conversation – and tech can help

Houston has long been considered the oil and gas capital of the world, and it is now the epicenter of the energy transition. The industry’s commitment shows in the focus of its conferences as well as in its investment in innovation centers.

In terms of the companies themselves, over the past two years each of the major oil and gas players has organized and grown its low-carbon business unit. These units are focused on bringing new ideas to the energy ecosystem. The best part is that they are not working alone but joining forces to find solutions. One of the highest-profile examples is ExxonMobil’s carbon capture and underground storage (CCUS) project, which directly supports the Paris Agreement.

Blockchain technology is needed to improve transparency and traceability in the energy sector, and baking blockchain into day-to-day business is key to identifying patterns and making decisions from the data.

The recent Blockchain for Oil and Gas conference, for instance, focused on how blockchain can help curate emissions data across the ecosystem. This year has also seen several additional symposiums and meetings – such as those hosted at the Ion and Greentown Houston – that focus on helping companies understand their carbon footprint.

How do we prove the data?

The importance of harmonizing data will only grow as the SEC looks to bring structure to sustainability reporting. Blockchain is a decentralized, immutable ledger where data can be entered and shared at every point of action: information is stored in interconnected blocks, each linked to the one before it, which also provides a value-add for verifying carbon offsets. The result is a chain of information that cannot be altered retroactively and can be transmitted between all relevant parties throughout the supply chain. Key players can enter, view, and analyze the same data points securely and with assurance of the data’s accuracy.
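To make that chaining concrete, here is a minimal sketch, in Python and with hypothetical site names and figures, of how each block can embed the hash of the block before it, so that retroactively editing an earlier emissions record invalidates everything that follows. It illustrates the general technique, not any specific energy-sector platform.

```python
import hashlib
import json
import time


def block_hash(block: dict) -> str:
    """Hash a block's contents (excluding its own hash field)."""
    payload = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()


def append_block(chain: list, record: dict) -> None:
    """Append an emissions record, linking it to the previous block's hash."""
    block = {
        "index": len(chain),
        "timestamp": time.time(),
        "record": record,  # hypothetical fields, e.g. {"site": ..., "tCO2e": ...}
        "prev_hash": chain[-1]["hash"] if chain else "0" * 64,
    }
    block["hash"] = block_hash(block)
    chain.append(block)


def verify(chain: list) -> bool:
    """Recompute every hash; any retroactive edit breaks the chain."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True


ledger: list = []
append_block(ledger, {"site": "Site A", "tCO2e": 4.2})
append_block(ledger, {"site": "Site B", "tCO2e": 2.7})
print(verify(ledger))                    # True
ledger[0]["record"]["tCO2e"] = 1.0       # attempt to rewrite history
print(verify(ledger))                    # False: the tampered block no longer matches its hash
```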

Data needs to move with products throughout the supply chain to produce an overall carbon emissions number. Because blockchain is decentralized, larger quantities of reliable data can be shared among all parties, shining a light on the areas organizations need to work on, such as manufacturing operations and building offsets. Baking blockchain into day-to-day business practice is key to identifying patterns over time and making data-backed decisions.
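As a simple illustration of that aggregation, with hypothetical stages and figures rather than any company’s actual data, per-stage emissions records attached to a product can be summed into a single overall number:

```python
from dataclasses import dataclass


@dataclass
class StageRecord:
    """An emissions record attached to a product at one supply chain stage (hypothetical fields)."""
    stage: str
    owner: str
    tco2e: float  # tonnes of CO2-equivalent attributed to this stage


def total_footprint(records: list[StageRecord]) -> float:
    """Overall carbon number for the product: the sum of every stage shared on the ledger."""
    return sum(r.tco2e for r in records)


journey = [
    StageRecord("extraction", "Producer A", 3.1),
    StageRecord("transport", "Pipeline B", 0.6),
    StageRecord("refining", "Refiner C", 2.4),
    StageRecord("distribution", "Retailer D", 0.3),
]
print(f"Product footprint: {total_footprint(journey):.1f} tCO2e")  # 6.4 tCO2e
```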

Oil and gas are key players

Cutting emissions is not a new practice for the oil and gas industry. In fact, companies have been cutting their emission reduction estimates by as much as 50 percent to avoid over-reporting.

The traditional process of reporting data has also been time-consuming and prone to human error. Manually gathering data across multiple sources offers no real way to trace information across supply chains and back to its source. And human errors, even unintentional ones, carry the risk of hefty fines from regulatory agencies.

It’s a now-or-never situation. The industry will need to pivot its approach to data gathering, sharing, and reporting to make good on its emissions reduction commitments. That need will accelerate the adoption of technologies like blockchain as part of the energy transition. While the climate challenges we face are alarming, they also provide the impetus for technological innovation and for the accurate emissions reporting needed to stay in compliance.

The Energy Capital of the World, for good

To solve the climate crisis, confidence in emissions data is crucial. Blockchain provides that as well as transparency and reliability, all while maintaining the highest levels of security. The technology provides assurance that the data from other smart technologies, like connected sensors and the Internet of Things (IoT), is trustworthy and accurate.

Good data, new technology, and corporate commitment are all key to Houston keeping its title as the energy capital of the world – in traditional fossil fuels as well as in the transition to clean energy.

------

John Chappell is the director of energy business development at BlockApps.

Siloed data, lack of consistency, and confusing regulations are all challenges blockchain can address. Photo via Getty Images

Houston expert: Blockchain is the key to unlocking transparency in the energy industry

guest column

Houston has earned its title as the Energy Transition Capital of the World, and now it has an opportunity to be a global leader in technology innovation when it comes to carbon emissions reporting. The oil and gas industry has set ambitious goals to reduce its carbon footprint, but the need for trustworthy emissions data to demonstrate progress is growing more apparent — and blockchain may hold the keys to enhanced transparency.

Despite oil and gas companies' eagerness to lower carbon dioxide emissions, current means of recording emissions cannot keep pace with goals for the future. Right now, the methods of tracking carbon emissions are inefficient, hugely expensive, and inaccurate. There is a critical need for oil and gas companies to understand and report their emission data, but the complexity of this endeavor presents a huge challenge, driven by several important factors.

Firstly, the supply chain is congested with many different data sources. This puts tracking initiatives into many different silos, making it a challenge for businesses to effectively organize their data. Secondly, the means of calculating, modeling, and measuring carbon emissions varies across the industry. This lack of consistency leaves companies struggling to standardize their outputs, complicating the record-keeping process. Finally, the regional patchwork of regulations and compliance standards is confusing and hard to manage, resulting in potential fines and the headaches associated with being found noncompliant.

Better tracking through blockchain

When it comes to tracking carbon emissions, the potential for blockchain is unmatched. Blockchain is an immutable ledger that allows multiple parties to securely and transparently share data in near real time across the supply chain. Blockchain solutions could sit at every step of operations, helping businesses report their true emissions numbers in an accurate, secure way.
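A rough sketch of why that matters, using hypothetical parties and figures rather than any real platform: because every participant holds the same append-only history, a simple fingerprint comparison is enough to flag a copy that has been edited outside the agreed process.

```python
import hashlib
import json


def fingerprint(ledger: list[dict]) -> str:
    """Hash of a party's entire copy of the shared emissions history."""
    return hashlib.sha256(json.dumps(ledger, sort_keys=True).encode()).hexdigest()


# Hypothetical copies of the same shared history held by three parties.
shared_history = [{"site": "Terminal 1", "tCO2e": 5.0}, {"site": "Refinery 7", "tCO2e": 3.2}]
copies = {
    "producer": list(shared_history),
    "shipper": list(shared_history),
    "regulator": list(shared_history),
}

# The shipper edits its local copy outside the agreed process.
copies["shipper"][1] = {"site": "Refinery 7", "tCO2e": 1.0}

prints = {party: fingerprint(ledger) for party, ledger in copies.items()}
consensus = max(set(prints.values()), key=list(prints.values()).count)
for party, fp in prints.items():
    print(party, "matches consensus" if fp == consensus else "has a divergent copy")
```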

Oil and gas companies are ready to make these changes. Up to now, they've relied on outdated practices, including manually entering data into spreadsheets. With operations spread across the world, there is simply no way to ensure that numbers have been accurately recorded at each and every point of action if everything is done manually. Any errors, even accidental ones, can trigger pricey fines from regulatory agencies. This forces businesses into the costly position of overestimating their carbon emissions. Instead of risking fines, energy companies choose to deflate their carbon accomplishments, missing out on valuable remediation credits in the process. Executives are also forced to make decisions based on this distorted data, which leaves projects with great potential to cut carbon emissions either underfunded or abandoned entirely.

In conversations with the supermajors, they've reported cutting emission reduction estimates by as much as 50% to avoid over-reporting. This is anecdotal, but it demonstrates a real problem that results in slower progress toward targets, missed opportunities, and unnecessary expenditures.

There are many opportunities to integrate blockchain into the energy industry, but tackling the carbon output data crisis should come first. Emissions data is becoming more and more important, and oil and gas companies need effective ways to track their progress. It's essential to start at the bottom and manage this dilemma at the source. Blockchain solutions would streamline this process, making data collection more reliable and efficient than ever before.

Houston is on the right track to lead the world in energy innovation — local businesses have made impressive, action-driven efforts to make sure that our community can rightfully be called the Energy Capital of the World. The city is in a great position to drive net-zero carbon initiatives worldwide, especially as sustainability becomes more and more important to our bottom lines. Still, to maintain this lead, we need to continue to look forward. Making sure we have the best data is critical as the energy world transitions into the future. If Houston wants to continue to be a leader in energy innovation, we need to look at blockchain solutions to tackle the data problem head on.

------

John Chappell is the director of energy business development at BlockApps.


Planned UT Austin med center, anchored by MD Anderson, gets $100M gift

med funding

The University of Texas at Austin’s planned multibillion-dollar medical center, which will include a hospital run by Houston’s University of Texas MD Anderson Cancer Center, just received a $100 million boost from a billionaire husband-and-wife duo.

Tench Coxe, a former venture capitalist who’s a major shareholder in chipmaking giant Nvidia, and Simone Coxe, co-founder and former CEO of the Blanc & Otus PR firm, contributed the $100 million—one of the largest gifts in UT history. The Coxes live in Austin.

“Great medical care changes lives,” says Simone Coxe, “and we want more people to have access to it.”

The University of Texas System announced the medical center project in 2023 and cited an estimated price tag of $2.5 billion. UT initially said the medical center would be built on the site of the Frank Erwin Center, a sports and entertainment venue on the UT Austin campus that was demolished in 2024. The 20-acre site, north of downtown and the state Capitol, is near Dell Seton Medical Center, UT Dell Medical School and UT Health Austin.

Now, UT officials are considering a bigger, still-unidentified site near the Domain mixed-use district in North Austin, although they haven’t ruled out the Erwin Center site. The Domain development is near St. David’s North Medical Center.

As originally planned, the medical center would house a cancer center built and operated by MD Anderson and a specialty hospital built and operated by UT Austin. Construction on the two hospitals is scheduled to start this year and be completed in 2030. According to a 2025 bid notice for contractors, each hospital is expected to encompass about 1.5 million square feet, meaning the medical center would span about 3 million square feet.

Features of the MD Anderson hospital will include:

  • Inpatient care
  • Outpatient clinics
  • Surgery suites
  • Radiation, chemotherapy, cell, and proton treatments
  • Diagnostic imaging
  • Clinical drug trials

UT says the new medical center will fuse the university’s academic and research capabilities with the medical and research capabilities of MD Anderson and Dell Medical School.

UT officials say priorities for spending the Coxes’ gift include:

  • Recruiting world-class medical professionals and scientists
  • Supporting construction
  • Investing in technology
  • Expanding community programs that promote healthy living and access to care

Tench says the opportunity to contribute to building an institution from the ground up helped prompt the donation. He and others say that thanks to MD Anderson’s participation, the medical center will bring world-renowned cancer care to the Austin area.

“We have a close friend who had to travel to Houston for care she should have been able to get here at home. … Supporting the vision for the UT medical center is exactly the opportunity Austin needed,” he says.

The rate of patients who leave the Austin area to seek care for serious medical issues runs as high as 25 percent, according to UT.

New Rice Brain Institute partners with TMC to award inaugural grants

brain trust

The recently founded Rice Brain Institute has named the first four projects to receive research awards through the Rice and TMC Neuro Collaboration Seed Grant Program.

The new grant program brings together Rice faculty with clinicians and scientists at The University of Texas Medical Branch, Baylor College of Medicine, UTHealth Houston and The University of Texas MD Anderson Cancer Center. The program will support pilot projects that address neurological disease, mental health and brain injury.

The first round of awards was selected from a competitive pool of 40 proposals, and will support projects that reflect Rice Brain Institute’s research agenda.

“These awards are meant to help teams test bold ideas and build the collaborations needed to sustain long-term research programs in brain health,” Behnaam Aazhang, Rice Brain Institute director and co-director of the Rice Neuroengineering Initiative, said in a news release.

The seed funding has been awarded to the following principal investigators:

  • Kevin McHugh, associate professor of bioengineering and chemistry at Rice, and Peter Kan, professor and chair of neurosurgery at UTMB. McHugh and Kan are developing an injectable material designed to seal off fragile, abnormal blood vessels that can cause life-threatening bleeding in the brain.
  • Jerzy Szablowski, assistant professor of bioengineering at Rice, and Jochen Meyer, assistant professor of neurology at Baylor. Szablowski and Meyer are leading a nonsurgical, ultrasound approach to deliver gene-based therapies to deep brain regions involved in seizures to control epilepsy without implanted electrodes or invasive procedures.
  • Juliane Sempionatto, assistant professor of electrical and computer engineering at Rice, and Aaron Gusdon, associate professor of neurosurgery at UTHealth Houston. Sempionatto and Gusdon are leading efforts to create a blood test that can identify patients at high risk for delayed brain injury following aneurysm-related hemorrhage, which could lead to earlier intervention and improved outcomes.
  • Christina Tringides, assistant professor of materials science and nanoengineering at Rice, and Sujit Prabhu, professor of neurosurgery at MD Anderson, who are working to reduce the risk of long-term speech and language impairment during brain tumor removal by combining advanced brain recordings, imaging and noninvasive stimulation.

The grants were facilitated by Rice’s Educational and Research Initiatives for Collaborative Health (ENRICH) Office. Rice says that the unique split-funding model of these grants could help structure future collaborations between the university and the TMC.

The Rice Brain Institute launched this fall and aims to use engineering, natural sciences and social sciences to research the brain and reduce the burden of neurodegenerative, neurodevelopmental and mental health disorders. Last month, the university's Shepherd School of Music also launched the Music, Mind and Body Lab, an interdisciplinary hub that brings artists and scientists together to study the "intersection of the arts, neuroscience and the medical humanities." Read more here.

Your data center is either closer than you think or much farther away

houston voices

A new study shows why some facilities cluster in cities for speed and access, while others move to rural regions in search of scale and lower costs. Based on research by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard).

Key findings:

  • Third-party colocation centers are physical facilities in close proximity to firms that use them, while cloud providers operate large data centers from a distance and sell access to virtualized computing resources as on‑demand services over the internet.
  • Hospitals and financial firms often require urban third-party centers for low latency and regulatory compliance, while batch processing and many AI workloads can operate more efficiently from lower-cost cloud hubs.
  • For policymakers trying to attract data centers, access to reliable power, water and high-capacity internet matters more than tax incentives.

Recent outages and the surge in AI-driven computing have made data center siting decisions more consequential than ever, especially as energy and water constraints tighten. Communities invest public dollars on the promise of jobs and growth, while firms weigh long-term commitments to land, power and connectivity.

Against that backdrop, a critical question comes into focus: Where do data centers get built — and what actually drives those decisions?

A new study by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard Business School) provides the first large-scale statistical analysis of data center location strategies across the United States. It offers policymakers and firms a clearer starting point for understanding how different types of data centers respond to economic and strategic incentives.

Forthcoming in the journal Strategy Science, the study examines two major types of infrastructure: third-party colocation centers that lease server space to multiple firms, and hyperscale cloud centers owned by providers like Amazon, Google and Microsoft.

Two Models, Two Location Strategies

The study draws on pre-pandemic data from 2018 and 2019, a period of relative geographic stability in supply and demand. This window gives researchers a clean baseline before remote work, AI demand and new infrastructure pressures began reshaping internet traffic patterns.

The findings show that data centers follow a bifurcated geography. Third-party centers cluster in dense urban markets, where buyers prioritize proximity to customers despite higher land and operating costs. Cloud providers, by contrast, concentrate massive sites in a small number of lower-density regions, where electricity, land and construction are cheaper and economies of scale are easier to achieve.

Third-party data centers, in other words, follow demand. They locate in urban markets where firms in finance, healthcare and IT value low latency, secure storage, and compliance with regulatory standards.

Using county-level data, the researchers modeled how population density, industry mix and operating costs predict where new centers enter. Every U.S. metro with more than 700,000 residents had at least one third-party provider, while many mid-sized cities had none.
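The study’s exact specification isn’t reproduced here, but a toy version of that kind of county-level entry model, fit on synthetic data with a plain logistic regression over density, industry mix, and operating cost, might look like this:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_counties = 500

# Hypothetical, standardized county covariates: population density, share of
# latency-sensitive industries (finance, healthcare, IT), and an operating-cost index.
density = rng.normal(size=n_counties)
industry_mix = rng.normal(size=n_counties)
op_cost = rng.normal(size=n_counties)

# Synthetic outcome: did a third-party colocation center enter the county?
logit = 0.9 * density + 0.6 * industry_mix - 0.5 * op_cost - 1.0
entry = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = np.column_stack([density, industry_mix, op_cost])
model = LogisticRegression().fit(X, entry)

names = ["density", "industry_mix", "op_cost"]
print(dict(zip(names, model.coef_[0].round(2).tolist())))
# Denser, industry-rich, lower-cost counties come out more likely to see entry,
# mirroring the direction (not the magnitude) of the study's reported pattern.
```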

This pattern challenges common assumptions. Third-party facilities are more distributed across urban America than prevailing narratives suggest.

Customer proximity matters because some sectors cannot absorb delay. In critical operations, even slight pauses can have real consequences. For hospital systems, lag can affect performance and risk exposure. And in high-frequency trading, milliseconds can determine whether value is captured or lost in a transaction.
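As a rough back-of-the-envelope on why distance turns into delay: signals in optical fiber travel at roughly two-thirds the speed of light, or about 200 kilometers per millisecond one way, before any routing or processing overhead. A quick sketch of the resulting round-trip times:

```python
# Rough propagation-delay estimate: light in optical fiber covers ~200 km per
# millisecond one way (about 2/3 of the vacuum speed of light), ignoring
# switching, routing, and server processing time.
FIBER_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    return 2 * distance_km / FIBER_KM_PER_MS

for km in (10, 100, 500, 1500):
    print(f"{km:>5} km away -> ~{round_trip_ms(km):.1f} ms round trip")
# A facility 1,500 km away adds ~15 ms per round trip before any other overhead,
# a meaningful delay for hospital systems and an eternity for high-frequency trading.
```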

“For industries where speed is everything, being too far from the physical infrastructure can meaningfully affect performance and risk,” Pan Fang says. “Proximity isn’t optional for sectors that can’t absorb delay.”

The Economics of Distance

For cloud providers, the picture looks very different. Their decisions follow a logic shaped primarily by cost and scale. Because cloud services can be delivered from afar, firms tend to build enormous sites in low-density regions where power is cheap and land is abundant.

These facilities can draw hundreds of megawatts of electricity and operate with far fewer employees than urban centers. “The cloud can serve almost anywhere,” Pan Fang says, “so location is a question of cost before geography.”

The study finds that cloud infrastructure clusters around network backbones and energy economics, not talent pools. Well-known hubs like Ashburn, Virginia — often called “Data Center Alley” — reflect this logic, having benefited from early network infrastructure that made them natural convergence points for digital traffic.

Local governments often try to lure data centers with tax incentives, betting they will create high-tech jobs. But the study suggests other factors matter more to cloud providers, including construction costs, network connectivity and access to reliable, affordable electricity.

When cloud centers need a local presence, distance can sometimes become a constraint. Providers often address this by working alongside third-party operators. “Third-party centers can complement cloud firms when they need a foothold closer to customers,” Pan Fang says.

That hybrid pattern — massive regional hubs complementing strategic colocation — may define the next phase of data center growth.

Looking ahead, shifts in remote work, climate resilience, energy prices and AI-driven computing may reshape where new facilities go. Some workloads may move closer to users, while others may consolidate into large rural hubs. Emerging data-sovereignty rules could also redirect investment beyond the United States.

“The cloud feels weightless,” Pan Fang says, “but it rests on real choices about land, power and proximity.”

---

This article originally appeared on Rice Business Wisdom. Written by Scott Pett.

Pan Fang and Greenstein (2025). “Where the Cloud Rests: The Economic Geography of Data Centers,” forthcoming in Strategy Science.