"To solve the climate crisis, confidence in emissions data is crucial." Photo via Getty Images

Sustainability has been top of mind across industries amid the movement to reduce carbon emissions. The Securities and Exchange Commission (SEC), for instance, has proposed a new rule that would require companies to disclose certain climate-related activities in their federal reporting. Now, industries and cities are scrambling to ensure they have the right strategies in place.

While the data behind sustainability poses challenges across industries, the problem is particularly evident in oil and gas, whose role in the energy transition is of the utmost importance, especially in Texas. We saw this at the COP26 summit in Glasgow last November, for example, in the effort to reduce carbon emissions on both a national and international scale and keep global warming within 1.5 degrees Celsius.

The event also made it clear that staying within this limit and reaching carbon neutrality by 2030 won’t be possible if organizations rely on current methods and siloed data. In short, recent climate goals come with a data problem. So, what does that mean for Houston’s oil and gas industry?

Climate is a critical conversation – and tech can help

Houston has long been considered the oil and gas capital of the world, and it is now the epicenter of the energy transition. The industry’s commitment shows in the focus of its conferences as well as its investment in innovation centers.

In terms of the companies themselves, over the past two years each of the major oil and gas players has organized and grown its low-carbon business unit. These units are focused on bringing new ideas to the energy ecosystem. The best part is that they are not working alone but joining forces to find solutions. One of the highest-profile examples is ExxonMobil’s carbon capture, utilization, and storage (CCUS) project, which directly supports the Paris Agreement.

Blockchain technology is needed to improve transparency and traceability in the energy sector, and baking blockchain into day-to-day business is key to identifying patterns and making decisions from the data.

The recent Blockchain for Oil and Gas conference, for instance, focused on how blockchain can help track emissions data across the ecosystem. This year has also seen several additional symposiums and meetings – such as those at the Ion and Greentown Houston – that focus on helping companies understand their carbon footprint.

How do we prove the data?

Harmonizing data will become even more important as the SEC looks to bring structure to sustainability reporting. Blockchain is a decentralized, immutable ledger: information is stored in interconnected blocks, each referencing the one before it, so data can be entered and shared at every point of action. This creates a chain of information that is tamper-evident and can be transmitted between all relevant parties throughout the supply chain, adding value for verifying carbon offsets. Key players can enter, view, and analyze the same data points securely and with assurance of the data’s accuracy.
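The hash-linking idea behind such a ledger can be sketched in a few lines of Python. This is an illustrative toy, not a production blockchain (a real system adds consensus, signatures, and distributed copies); the block fields and emissions records below are invented for the example.

```python
import hashlib
import json

def make_block(record, prev_hash):
    """Create a block whose hash covers both its payload and the
    hash of the previous block, chaining the two together."""
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return {"record": record, "prev": prev_hash,
            "hash": hashlib.sha256(payload.encode()).hexdigest()}

def verify(chain):
    """Recompute every hash; editing any earlier block breaks the chain."""
    for i, block in enumerate(chain):
        payload = json.dumps({"record": block["record"], "prev": block["prev"]},
                             sort_keys=True)
        if hashlib.sha256(payload.encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

# Each supply-chain participant appends its own emissions reading.
chain = [make_block({"site": "well-07", "tCO2e": 12.4}, prev_hash="genesis")]
chain.append(make_block({"site": "pipeline-3", "tCO2e": 3.1}, chain[-1]["hash"]))
print(verify(chain))                 # True
chain[0]["record"]["tCO2e"] = 1.0    # tamper with an earlier entry
print(verify(chain))                 # False
```

Because every block's hash depends on the previous block's hash, altering one record invalidates everything recorded after it, which is what makes shared emissions data tamper-evident.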

Data needs to move with products throughout the supply chain to create an overall number for carbon emissions. Blockchain’s decentralization lets organizations and their respective industries share larger quantities of reliable data between all parties, shining a light on the areas that need work, such as manufacturing operations and building offsets. Baking blockchain into day-to-day business practice is key to identifying patterns over time and making data-backed decisions.

Oil and gas are key players

Cutting emissions is not a new practice for the oil and gas industry. In fact, companies have been cutting their emission reduction estimates by as much as 50 percent to avoid over-reporting.

The traditional process of reporting data has also been time-consuming and prone to human error. Manually gathering data across multiple sources of information leaves no real way to trace it across supply chains and back to the source. And human errors, even accidental ones, expose companies to hefty fines from regulatory agencies.

It’s a now-or-never situation. The industry will need to pivot its approaches to data gathering, sharing, and reporting to commit to emissions reduction. This need will surely accelerate the adoption of technologies like blockchain as part of the energy transition. While the climate challenges we face are alarming, they provide the impetus for technological innovation and the ability to accurately report emissions and stay in compliance.

The Energy Capital of the World, for good

To solve the climate crisis, confidence in emissions data is crucial. Blockchain provides that as well as transparency and reliability, all while maintaining the highest levels of security. The technology provides assurance that the data from other smart technologies, like connected sensors and the Internet of Things (IoT), is trustworthy and accurate.

Good data, new technology, and corporate commitment are all key to Houston keeping its title as the energy capital of the world – in traditional fossil fuels as well as in the transition to clean energy.

------

John Chappell is the director of energy business development at BlockApps.

Siloed data, lack of consistency, and confusing regulations are all challenges blockchain can address. Photo via Getty Images

Houston expert: Blockchain is the key to unlocking transparency in the energy industry

guest column

Houston has earned its title as the Energy Transition Capital of the World, and now it has an opportunity to be a global leader in technology innovation when it comes to carbon emissions reporting. The oil and gas industry has set ambitious goals to reduce its carbon footprint, but the need for trustworthy emissions data to demonstrate progress is growing more apparent — and blockchain may hold the keys to enhanced transparency.

Despite oil and gas companies' eagerness to lower carbon dioxide emissions, current means of recording emissions cannot keep pace with goals for the future. Right now, the methods of tracking carbon emissions are inefficient, hugely expensive, and inaccurate. There is a critical need for oil and gas companies to understand and report their emission data, but the complexity of this endeavor presents a huge challenge, driven by several important factors.

First, the supply chain is congested with many different data sources. This splinters tracking initiatives into silos, making it a challenge for businesses to effectively organize their data. Second, the means of calculating, modeling, and measuring carbon emissions vary across the industry. This lack of consistency leaves companies struggling to standardize their outputs, complicating record-keeping. Finally, the regional patchwork of regulations and compliance standards is confusing and hard to manage, resulting in potential fines and the headaches associated with being found noncompliant.

Better tracking through blockchain

When it comes to tracking carbon emissions, the potential for blockchain is unmatched. Blockchain is an immutable ledger that allows multiple parties to securely and transparently share data in near real time across the supply chain. Blockchain solutions could be present at every step of operations, helping businesses report their true emissions numbers in an accurate, secure way.

Oil and gas companies are ready to make these changes. Up to now, they've relied on outdated practices, including manually entering data into spreadsheets. With operations spread across the world, there is simply no way to ensure that numbers have been accurately recorded at each and every point of action if everything is done manually. Any errors, even accidental ones, can draw pricey fines from regulatory agencies. This forces businesses into the costly position of overestimating their carbon emissions: instead of risking fines, energy companies deflate their carbon accomplishments, missing out on valuable remediation credits in the process. In addition, executives are forced to make decisions based on this distorted data, which leaves projects with great potential to cut carbon emissions either underfunded or abandoned entirely.

In conversations, super majors have reported cutting their emission reduction estimates by as much as 50% to avoid over-reporting. This is anecdotal, but it points to a real problem that results in slower progress toward targets, missed opportunities, and unnecessary expenditures.

There are many opportunities to integrate blockchain into the energy industry, but tackling the carbon output data crisis should come first. Emissions data is becoming more and more important, and oil and gas companies need effective ways to track their progress to drive success. It's essential to manage this dilemma at the source. Blockchain solutions would streamline the process, making data collection more reliable and efficient than ever before.

Houston is on the right track to lead the world in energy innovation — local businesses have made impressive, action-driven efforts to make sure that our community can rightfully be called the Energy Capital of the World. The city is in a great position to drive net-zero carbon initiatives worldwide, especially as sustainability becomes more and more important to our bottom lines. Still, to maintain this position, we need to continue to look forward. Making sure we have the best data is critical as the energy world transitions into the future. If Houston wants to continue to be a leader in energy innovation, we need to look at blockchain solutions to tackle the data problem head on.

------

John Chappell is the director of energy business development at BlockApps.


How Houston innovators played a role in the historic Artemis II splashdown

safe landing

Research from Rice University played a critical role in the safe return of U.S. astronauts aboard NASA’s Artemis II mission this month.

Rice mechanical engineer Tayfun E. Tezduyar and longtime collaborator Kenji Takizawa developed a key computational parachute fluid-structure interaction (FSI) analysis system that proved vital to the descent of NASA’s Orion capsule into the Pacific Ocean. The FSI system, originally developed in 2013 alongside NASA Johnson Space Center, was critical to Orion’s three-parachute design, which slowed the capsule as it returned to Earth, according to Rice.

The model helped ensure that the parachute design was large enough to slow the capsule for a safe landing while also being stable enough to prevent the capsule from oscillating as it descended.

“You cannot separate the aerodynamics from the structural dynamics,” Tezduyar said in a news release. “They influence each other continuously and even more so for large spacecraft parachutes, so the analysis must capture that interaction in a robustly coupled way.”
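Tezduyar's point about coupling can be illustrated with a toy partitioned simulation: at each time step the aerodynamic load is computed from the current structural state, and the structure is then updated from that load. Every number below (mass, canopy area, compliance, response time) is invented for illustration; this is not the Rice/NASA FSI formulation, which solves full three-dimensional fluid and structural equations.

```python
# Toy partitioned fluid-structure coupling: a capsule descending under a
# parachute whose effective canopy area responds to the aerodynamic load.
RHO, G, CD = 1.2, 9.81, 1.75   # air density (kg/m^3), gravity (m/s^2), drag coeff.
MASS = 9000.0                  # capsule mass in kg (illustrative)
A_REST = 380.0                 # unloaded canopy area in m^2 (illustrative)
K = 1e-6                       # canopy compliance: area shrinks under load (assumed)
TAU = 0.5                      # structural response time in s (assumed)

def descend(v0, dt=0.01, t_end=60.0):
    """March the coupled system in time; v is downward velocity in m/s."""
    v, area = v0, A_REST
    for _ in range(int(t_end / dt)):
        drag = 0.5 * RHO * CD * area * v * v                   # fluid step: load from structure
        area += dt / TAU * (A_REST * (1.0 - K * drag) - area)  # structure step: deform under load
        v += dt * (G - drag / MASS)                            # capsule settles toward terminal speed
    return v, area

v_term, a_term = descend(v0=20.0)
print(round(v_term, 1), round(a_term, 1))
```

The point is the feedback loop: the drag deforms the canopy, the deformed canopy changes the drag, and the descent speed settles only where the two agree, so neither field can be solved in isolation.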

The end result was a final parachute system, refined through NASA drop tests and Rice’s computational FSI analysis, that eliminated fluctuations and produced a stable descent profile.

Apart from the dynamic challenges in design, modeling Orion’s parachutes also required solving complex equations that considered airflow and fabric deformation and accounted for features like ringsail canopy construction and aerodynamic interactions among multiple parachutes in a cluster.

“Essentially, my entire group was dedicated to that work, because I considered it a national priority,” Tezduyar added in the release. “Kenji and I were personally involved in every computer simulation. Some of the best graduate students and research associates I met in my career worked on the project, creating unique, first-of-its-kind parachute computer simulations, one after the other.”

Current Intuitive Machines engineer Mario Romero also worked on Orion during his time at NASA. From 2018 to 2021, Romero was a member of the Orion Crew Capsule Recovery Team, which focused on creating likely scenarios that crewmembers could encounter in Orion.

The team trained in NASA’s 6.2-million-gallon pool, using wave machines to replicate a range of sea conditions. They also simulated worst-case scenarios by cutting the lights, blasting high-powered fans and tipping a mock capsule to mimic distress situations. In some drills, mock crew members were treated as “injured,” requiring the team to practice safe, controlled egress procedures.

“It’s hard to find the appropriate descriptors that can fully encapsulate the feeling of getting to witness all the work we, and everyone else, did being put into action,” Romero tells InnovationMap. “I loved seeing the reactions of everyone, but especially of the Houston communities—that brought me a real sense of gratitude and joy.”

Intuitive Machines was also selected to support the Artemis II mission using its Space Data Network and ground station infrastructure. The company monitored radio signals sent from the Orion spacecraft and used Doppler measurements to help determine the spacecraft's precise position and speed.
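The Doppler step can be sketched simply: the fractional shift of a received radio carrier gives the line-of-sight velocity between station and spacecraft. The carrier frequency and shift below are illustrative, not Intuitive Machines' actual link parameters, and the formula is the non-relativistic one-way approximation.

```python
C = 299_792_458.0  # speed of light, m/s

def radial_velocity(f_transmit_hz, f_receive_hz):
    """One-way Doppler: the fractional frequency shift of a carrier gives
    the radial (line-of-sight) velocity. Positive means receding."""
    return C * (f_transmit_hz - f_receive_hz) / f_transmit_hz

# A carrier near S-band (~2.2 GHz, illustrative) received 11 Hz low
# implies the spacecraft is receding along the line of sight.
print(round(radial_velocity(2.2e9, 2.2e9 - 11), 3))  # → 1.499 m/s
```

Repeated measurements like this from multiple ground stations, combined over time, constrain both the spacecraft's position and its velocity.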

Tim Crain, Chief Technology Officer at Intuitive Machines, wrote about the experience last week.

"I specialized in orbital mechanics and deep space navigation in graduate school,” Crain shared. “But seeing the theory behind tracking spacecraft come to life as they thread through planetary gravity fields on ultra-precise trajectories still seems like magic."

UH breakthrough moves superconductivity closer to real-world use

Energy Breakthrough

University of Houston researchers have set a new benchmark in the field of superconductivity.

Researchers from the UH physics department and the Texas Center for Superconductivity (TcSUH) have broken the transition temperature record for superconductivity at ambient pressure. The accomplishment could lead to more efficient ways to generate, transmit and store energy, which researchers believe could improve power grids, medical technologies and energy systems by enabling electricity to flow without resistance, according to a release from UH.

To break the record, UH researchers achieved a transition temperature of 151 Kelvin, the highest ever recorded at ambient pressure since superconductivity was discovered in 1911.

The transition temperature is the point below which a material becomes superconducting, allowing electricity to flow through it without resistance. Scientists have worked for decades to push transition temperatures closer to room temperature, which would make superconducting technologies more practical and affordable.

Currently, most superconductors must be cooled to extremely low temperatures, making them more expensive and difficult to operate.

UH physicists Ching-Wu Chu and Liangzi Deng published the research in the Proceedings of the National Academy of Sciences earlier this month. It was funded by Intellectual Ventures and the state of Texas via TcSUH and other foundations. Chu, founding director and chief scientist at TcSUH, previously made the breakthrough discovery in 1987 that the material YBCO reaches superconductivity at 93 K (about minus 180 degrees Celsius). This helped begin a global competition to develop high-temperature superconductors.

“Transmitting electricity in the grid loses about 8% of the electricity,” Chu, who’s also a professor of physics at UH and the paper’s senior author, said in a news release. “If we conserve that energy, that’s billions of dollars of savings and it also saves us lots of effort and reduces environmental impacts.”

Chu and his team used a technique known as pressure quenching, adapted from methods used to create diamonds. In pressure quenching, researchers apply intense pressure to the material to enhance its superconducting properties and raise its transition temperature, and the enhanced state is then retained even after the pressure is released.

Next, researchers are targeting ambient-pressure, room-temperature superconductivity of around 300 K. In a companion PNAS paper, Chu and Deng point to pressure quenching as a promising approach to help bridge the gap between current results and that goal.

“Room-temperature superconductivity has been seen as a ‘holy grail’ by scientists for over a century,” Rohit Prasankumar, director of superconductivity research at Intellectual Ventures, said in the release. “The UH team’s result shows that this goal is closer than ever before. However, the distance between the new record set in this study and room temperature is still about 140 C. Closing this gap will require concerted, intentional efforts by the broader scientific community, including materials scientists, chemists, and engineers, as well as physicists.”

---

This article originally appeared on EnergyCapitalHTX.com.