Houston, home to one of Cognite's U.S. headquarters, is the energy capital of the world. But while many oil and gas industry players and partners come together here, much of the data they use — or want to employ — remains siloed.

There's no lack of data. Connected devices are a wellspring of enterprise resource planning data, depth-based trajectories, piping and instrumentation diagrams, and sensor values. But incompatible operational data systems, poor data infrastructure, and restricted data access prevent organizations from easily combining data to solve problems and create solutions.

We understand these challenges because we work alongside some of the biggest operators, OEMs and engineering companies in the oil and gas business. Lundin Petroleum, Aker Energy, OMV, and Aker BP are among our customers, for example.

Flexible, open application programming interfaces (APIs) can address the challenges noted above. APIs enable users to search, filter and run computations on data without downloading full data sets, and they abstract away the complexity of the underlying storage formats.

As a result, data scientists and process engineers can access data efficiently, spending more time on their use cases and less time contending with technical details. Using APIs, organizations can more easily combine their own internal data. APIs also simplify the process of using data from industry partners and other sources.
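
As a rough illustration, a call to such an API might look like the Python sketch below. The host, endpoint and parameter names are hypothetical rather than any particular vendor's product; the point is that filtering and aggregation happen server-side, so only the relevant slice of data ever crosses the wire.

    import requests

    # Hypothetical endpoint and parameters, for illustration only.
    # The server filters and aggregates, so the client never has to
    # download or parse the full raw data set.
    response = requests.get(
        "https://api.example.com/v1/timeseries/sensor-4711/data",
        params={
            "start": "2020-01-01T00:00:00Z",  # restrict to a time window
            "end": "2020-01-31T23:59:59Z",
            "aggregate": "average",           # computed server-side
            "granularity": "1h",              # one value per hour
        },
        headers={"Authorization": "Bearer <api-key>"},
        timeout=30,
    )
    response.raise_for_status()
    for point in response.json()["datapoints"]:
        print(point["timestamp"], point["value"])

Because the storage format stays hidden behind the endpoint, the same call keeps working even if the provider later swaps out its underlying database.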

Most companies have slightly different work processes. But common API standards can help a company combine software services and platforms from others in a way that matches its own business logic and internal processes. That can allow the company to differentiate itself from competitors by employing services from the best suppliers to create innovative solutions.

Standardizing APIs across the oil and gas industry would open the door to a community of developers who could create custom applications and connect existing market solutions. New and exciting applications and services would then reach the market faster.

To ensure the adoption and success of such a standardization effort, the APIs would need to be well crafted and intuitive to use. They would have to encapsulate the business logic required to perform the operations users need. In addition, the APIs would need to define the desired information objects and allow them to be shared in a consistent way.

Best practices in defining common APIs for sharing data within the industry include:

  • Introducing APIs iteratively, driven by concrete use cases with business value
  • Ensuring every service that uses the API returns output and insights in a structured, machine-readable format that can be ingested back through the API, continuously enriching the data set
  • Making all data searchable
  • Preventing underlying technology from being exposed through the APIs to ensure continuous optimization and allow companies to implement their technology of choice
  • Supporting all external data sharing through an open, well-documented and well-versioned API, using the OpenAPI standard (a minimal sketch follows this list)
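
Sketching that last point, the fragment below shows what a minimal OpenAPI 3.0 description of a single versioned endpoint could look like, written here as a Python dictionary (OpenAPI documents are ordinarily authored in YAML or JSON). All names are illustrative, not taken from any existing specification.

    # A minimal, hypothetical OpenAPI 3.0 document describing one
    # versioned, documented endpoint; all names are illustrative.
    openapi_spec = {
        "openapi": "3.0.3",
        "info": {
            "title": "Shared Equipment Data API",
            "version": "1.0.0",  # explicit versioning for consumers
            "description": "Read access to shared equipment records.",
        },
        "paths": {
            "/equipment/{equipmentId}": {
                "get": {
                    "summary": "Fetch one equipment record by ID",
                    "parameters": [{
                        "name": "equipmentId",
                        "in": "path",
                        "required": True,
                        "schema": {"type": "string"},
                    }],
                    "responses": {
                        "200": {"description": "The equipment record"},
                    },
                },
            },
        },
    }

Publishing a description like this lets partners generate client code and validate payloads automatically, which is what makes the sharing well documented and well versioned in practice.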

If oil and gas industry operators define APIs, suppliers will embrace them. That will "grease" the value chain, allowing it to move with less friction and waste.

Operations and maintenance are a natural place for API harmonization to start. Standardized APIs can also enable operators to aggregate and use environmental, equipment and systems, health and safety, and other data. That will accelerate digital transformation in oil and gas, allowing companies to adopt innovative solutions from the ecosystem, reduce waste, and improve operations, all of which makes production more sustainable.

------

Francois Laborie is the general manager of Cognite North Americas.

European software company plans first U.S. office in Houston

New to town

When considering entering the United States market, Francois Laborie, general manager of Cognite North Americas, of course considered some of the obvious cities for a regional headquarters.

"Initially, when we talked about the US, people assumed Silicon Valley or Boston, because we are a traditional software company," Laborie says. "But we really didn't consider too long because the customers we work with require a pretty deep understanding of industry."

The Norway-based company decided to bet on the energy capital of the world and has announced future offices in Houston as well as Austin — both to open by this summer. This will be Cognite's first expansion outside of Northern Europe. The company makes data software for industrial businesses — oil and gas is a huge focus, as are engineering, equipment manufacturing, shipping, and more.

"The industrial world is very siloed and closed, and we are changing a lot of things in that world," Laborie says. "In the digital world, data and information only becomes valuable as you share it. We are all about liberating data, contextualizing it, and then drawing value out of it."

Laborie says the Houston office will be the company's energy hub — both current and prospective clients of Cognite have presences in town. Meanwhile, Austin will be the tech hub, since the city has a large tech talent pool. Currently, Austin is on track to become the U.S. headquarters, but nothing is set in stone at the moment, Laborie says.

Cognite, which expects around 50 employees (both new hires and relocations) split between the two locations, already has strategic Houston partnerships in place. Cognite will operate out of Station Houston and even has an internship program and partnership with Rice University. Overall, Laborie says the reception from the city has been positive.

"Houston went above and beyond," Laborie says. "The relationship with Rice has been very interesting because they are working closely with the Houston municipality to transform this image of Houston to get a stronger driver on innovation with the Innovation District, which spoke very loudly to us."

These partnerships are crucial for the company, Laborie says, and Cognite plans to work within Houston's innovation ecosystem to continue to push the envelope on innovative technologies.

"We have partnerships with large corporations, but we also see the importance to work with smaller companies to drive innovation — even if they aren't directly related," Laborie says.


Houston wearable biosensing company closes $13M pre-IPO round

fresh funding

Wellysis, a Seoul, South Korea-headquartered wearable biosensing company with its U.S. subsidiary based in Houston, has closed a $13.5 million pre-IPO funding round and plans to expand its Texas operations.

The round was led by Korea Investment Partners, Kyobo Life Insurance, Kyobo Securities, Kolon Investment and a co-general partner fund backed by SBI Investment and Samsung Securities, according to a news release.

Wellysis reports that the latest round brings its total capital raised to about $30 million. The company is working toward a listing on the Korea Securities Dealers Automated Quotations (KOSDAQ) exchange in Q4 2026 or Q1 2027.

Wellysis is known for its continuous ECG/EKG monitor with AI reporting. Its lightweight and waterproof S-Patch cardiac monitor is designed for extended testing periods of up to 14 days on a single battery charge.

The company says that the funding will go toward commercializing the next generation of the S-Patch, known as the S-Patch MX, which will be able to capture more than 30 biometric signals, including ECG, temperature and body composition.

Wellysis also reports that it will use the funding to expand its Houston-based operations, specifically in its commercial, clinical and customer success teams.

Additionally, the company plans to accelerate the product development of two other biometric products:

  • CardioAI, an AI-powered diagnostic software platform designed to support clinical interpretation, workflow efficiency and scalable cardiac analysis
  • BioArmour, a non-medical biometric monitoring solution for the sports, public safety and defense sectors

“This pre-IPO round validates both our technology and our readiness to scale globally,” Young Juhn, CEO of Wellysis, said in the release. “With FDA-cleared solutions, expanding U.S. operations, and a strong AI roadmap, Wellysis is positioned to redefine how cardiac data is captured, interpreted, and acted upon across healthcare systems worldwide.”

Wellysis was founded in 2019 as a spinoff of Samsung. Its S-Patch runs on a Samsung Smart Health Processor. The company's U.S. subsidiary, Wellysis USA Inc., was established in Houston in 2023 and was a resident of JLABS@TMC.

Elon Musk vows to launch solar-powered data centers in space

To Outer Space

Elon Musk vowed this week to upend another industry just as he did with cars and rockets — and once again he's taking on long odds.

The world's richest man said he wants to put as many as a million satellites into orbit to form vast, solar-powered data centers in space — a move to allow expanded use of artificial intelligence and chatbots without triggering blackouts and sending utility bills soaring.

To finance that effort, Musk combined SpaceX with his AI business on Monday, February 2, and plans a big initial public offering of the combined company.

“Space-based AI is obviously the only way to scale,” Musk wrote on SpaceX’s website, adding about his solar ambitions, “It’s always sunny in space!”

But scientists and industry experts say even Musk — who outsmarted Detroit to turn Tesla into the world’s most valuable automaker — faces formidable technical, financial and environmental obstacles.

Feeling the heat

Capturing the sun’s energy from space to run chatbots and other AI tools would ease pressure on power grids and cut demand for sprawling computing warehouses, which are consuming farmland and forests along with vast amounts of water for cooling.

But space presents its own set of problems.

Data centers generate enormous heat. Space seems to offer a solution because it is cold. But it is also a vacuum, trapping heat inside objects in the same way that a Thermos keeps coffee hot using double walls with no air between them.

“An uncooled computer chip in space would overheat and melt much faster than one on Earth,” said Josep Jornet, a computer and electrical engineering professor at Northeastern University.

One fix is to build giant radiator panels that glow in infrared light to push the heat “out into the dark void,” says Jornet, noting that the technology has worked on a small scale, including on the International Space Station. But for Musk's data centers, he says, it would require an array of “massive, fragile structures that have never been built before.”
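
A back-of-the-envelope estimate shows why. Assuming a flat, two-sided panel near room temperature with an emissivity of 0.9 (both assumed values), the Stefan-Boltzmann law gives the radiator area needed to shed each megawatt of waste heat:

    # Radiative cooling estimate using the Stefan-Boltzmann law:
    # power radiated per side = emissivity * sigma * area * T^4.
    # Panel temperature and emissivity are assumed values.
    SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W / (m^2 * K^4)
    EMISSIVITY = 0.9   # assumed, typical of radiator coatings
    TEMP_K = 300.0     # assumed panel temperature, near room temperature
    SIDES = 2          # a flat panel radiates from both faces

    watts_per_m2 = SIDES * EMISSIVITY * SIGMA * TEMP_K**4
    area_per_mw = 1_000_000 / watts_per_m2
    print(f"{watts_per_m2:.0f} W/m2, so {area_per_mw:.0f} m2 per megawatt")

Under those assumptions a panel sheds roughly 830 watts per square meter, so each megawatt of computing needs on the order of 1,200 square meters of radiator, before accounting for the sunlight the panels themselves absorb.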

Floating debris

Then there is space junk.

A single satellite malfunctioning or losing orbit could trigger a cascade of collisions, potentially disrupting emergency communications, weather forecasting and other services.

Musk noted in a recent regulatory filing that he has had only one “low-velocity debris generating event” in seven years running Starlink, his satellite communications network. Starlink has operated about 10,000 satellites — but that's a fraction of the million or so he now plans to put in space.

“We could reach a tipping point where the chance of collision is going to be too great,” said University at Buffalo's John Crassidis, a former NASA engineer. “And these objects are going fast, 17,500 miles per hour. There could be very violent collisions.”
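
That speed is easy to sanity-check. For a circular orbit, velocity is the square root of Earth's gravitational parameter divided by the orbital radius; the sketch below evaluates it at a Starlink-like altitude of 550 kilometers (an assumed value):

    import math

    # Circular orbital velocity: v = sqrt(GM / r).
    GM_EARTH = 3.986e14        # Earth's gravitational parameter, m^3/s^2
    EARTH_RADIUS_M = 6.371e6   # mean Earth radius, m
    ALTITUDE_M = 550e3         # assumed Starlink-like altitude, m

    v_mps = math.sqrt(GM_EARTH / (EARTH_RADIUS_M + ALTITUDE_M))
    print(f"{v_mps:.0f} m/s, or {v_mps * 2.23694:.0f} mph")

That works out to roughly 7,600 meters per second, about 17,000 miles per hour, in line with the figure Crassidis cites; two such objects meeting head-on would close at double that speed.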

No repair crews

Even without collisions, satellites fail, chips degrade, parts break.

Specialized graphics chips (GPUs) used by AI companies, for instance, can become damaged and need to be replaced.

“On Earth, what you would do is send someone down to the data center,” said Baiju Bhatt, CEO of Aetherflux, a space-based solar energy company. “You replace the server, you replace the GPU, you’d do some surgery on that thing and you’d slide it back in.”

But no such repair crew exists in orbit, and those GPUs in space could get damaged due to their exposure to high-energy particles from the sun.

Bhatt says one workaround is to overprovision the satellite with extra chips to replace the ones that fail. But that’s an expensive proposition given that the chips are likely to cost tens of thousands of dollars each and current Starlink satellites have a lifespan of only about five years.

Competition — and leverage

Musk is not alone in trying to solve these problems.

Starcloud, a company in Redmond, Washington, launched a satellite in November carrying a single Nvidia-made AI computer chip to test how it would fare in space. Google is exploring orbital data centers in a venture it calls Project Suncatcher. And Jeff Bezos’ Blue Origin announced plans in January for a constellation of more than 5,000 satellites to start launching late next year, though its focus has been more on communications than AI.

Still, Musk has an edge: He's got rockets.

Starcloud had to use one of his Falcon rockets to put its chip in space last year. Aetherflux plans to send a set of chips it calls a Galactic Brain to space on a SpaceX rocket later this year. And Google may also need to turn to Musk to get its first two planned prototype satellites off the ground by early next year.

Pierre Lionnet, a research director at the trade association Eurospace, says Musk routinely charges rivals far more than he charges himself: as much as $20,000 per kilo of payload versus $2,000 internally.

He said Musk’s announcements this week signal that he plans to use that advantage to win this new space race.

“When he says we are going to put these data centers in space, it’s a way of telling the others we will keep these low launch costs for myself,” said Lionnet. “It’s a kind of power play.”

Johnson Space Center and UT partner to expand research, workforce development

onward and upward

NASA’s Johnson Space Center in Houston has forged a partnership with the University of Texas System to expand collaboration on research, workforce development and education that supports space exploration and national security.

“It’s an exciting time for the UT System and NASA to come together in new ways because Texas is at the epicenter of America’s space future. It’s an area where America is dominant, and we are committed as a university system to maintaining and growing that dominance,” Dr. John Zerwas, chancellor of the UT System, said in a news release.

Vanessa Wyche, director of Johnson Space Center, added that the partnership with the UT System “will enable us to meet our nation’s exploration goals and advance the future of space exploration.”

The news release noted that UT Health Houston and the UT Medical Branch in Galveston already collaborate with NASA. The UT Medical Branch’s aerospace medicine residency program and UT Health Houston’s space medicine program train NASA astronauts.

“We’re living through a unique moment where aerospace innovation, national security, economic transformation, and scientific discovery are converging like never before in Texas," Zerwas said. “UT institutions are uniquely positioned to partner with NASA in building a stronger and safer Texas.”

Zerwas became chancellor of the UT System in 2025. He joined the system in 2019 as executive vice chancellor for health affairs. Zerwas represented northwestern Fort Bend County in the Texas House from 2007 to 2019.

In 1996, he co-founded a Houston-area medical practice that became part of US Anesthesia Partners in 2012. He remained active in the practice until joining the UT System. Zerwas was chief medical officer of the Memorial Hermann Hospital System from 2003 to 2008 and was its chief physician integration officer until 2009.

Zerwas, a 1973 graduate of the Houston area’s Bellaire High School, is an alumnus of the University of Houston and Baylor College of Medicine.