Houston, home to one of Cognite's U.S. headquarters, is the energy capital of the world. But while many oil and gas industry players and partners come together here, much of the data they use — or want to employ — remains siloed.

There's no lack of data. Connected devices are a wellspring of enterprise resource planning data, depth-based trajectories, piping and instrumentation diagrams, and sensor values. But incompatible operational data systems, poor data infrastructure, and restricted data access prevent organizations from easily combining data to solve problems and create solutions.

We understand these challenges because we work alongside some of the biggest operators, OEMs and engineering companies in the oil and gas business. Lundin Petroleum, Aker Energy, OMV, and Aker BP, for example, are among our customers.

Flexible, open application programming interfaces (APIs) can address these challenges. APIs enable users to search, filter and perform computations on data without downloading full data sets, and they abstract away the complexity of the underlying storage formats.

As a result, data scientists and process engineers can access data in an efficient manner, spending more time on their use cases and less effort contending with technical details. Using APIs, organizations can more easily combine their own internal data. APIs also simplify the process of using data from industry partners and other sources.

Most companies have slightly different work processes. But common API standards can help a company combine software services and platforms from others in a way that matches its own business logic and internal processes. That can allow the company to differentiate itself from competitors by employing services from the best suppliers to create innovative solutions.

Standardizing APIs across the oil and gas industry would open the door to a community of developers, which could create custom applications and connect existing market solutions. Then more new and exciting applications and services would reach the market faster.

To ensure adoption and success of such a standardization effort, the APIs would need to be well crafted and intuitive to use. These APIs would have to include the business logic required to perform the operations to empower users. In addition, APIs would need to define and allow for the sharing of desired information objects in a consistent way.

Best practices in defining common APIs for sharing data within the industry include:

  • Introducing APIs iteratively, driven by concrete use cases with business value
  • Ensuring all services using the API return their output and insights in a structured, machine-readable format that can be ingested back through the API, continuously enriching the data set
  • Making all data searchable
  • Preventing underlying technology from being exposed through the APIs to ensure continuous optimization and allow companies to implement their technology of choice
  • Supporting all external data sharing through an open, well-documented and well-versioned API, using the OpenAPI standard
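As a sketch of how such an API lets a client push filtering and aggregation to the server rather than download a full data set, the snippet below builds a request URL for a hypothetical time-series endpoint. The base URL, path, and parameter names here are purely illustrative and not taken from any vendor's actual API.

```python
# Hypothetical sketch: expressing a filtered, aggregated query as URL
# parameters so the server returns only the needed slice of data.
# Endpoint and parameter names are illustrative, not a real vendor API.
from urllib.parse import urlencode

def build_query(base_url, asset, start, end, aggregate=None):
    """Build a request URL that pushes filtering (and optionally
    aggregation) to the server instead of downloading raw history."""
    params = {"asset": asset, "start": start, "end": end}
    if aggregate:
        params["aggregate"] = aggregate  # e.g. hourly averages
    return f"{base_url}/timeseries?{urlencode(params)}"

url = build_query(
    "https://api.example.com/v1",
    asset="pump-101",
    start="2019-01-01",
    end="2019-01-31",
    aggregate="avg,1h",
)
```

Because the slicing and aggregation happen server-side, a data scientist receives only, say, hourly averages for one asset over one month rather than the full sensor history, which is what lets them spend more time on use cases and less on data wrangling.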

If oil and gas industry operators define APIs, suppliers will embrace them. That will "grease" the value chain, allowing it to move with less friction and waste.

Operations and maintenance are a natural place for API harmonization to start. Standardized APIs also can enable operators to aggregate and use environmental, equipment and systems, health and safety, and other data. That will accelerate digital transformation in oil and gas and enable companies to leverage innovative solutions coming from the ecosystem, reduce waste, and improve operations, making production more sustainable.

------

Francois Laborie is the general manager of Cognite North Americas.

Houston entrepreneur is using his analytics company to change the oil and gas industry

Featured Innovator

Luther Birdzell has been on a mission to democratize data for the upstream oil and gas industry since he started his company, OAG Analytics, in 2013.

For him, there just aren't enough data scientists for hire to do the same work internally at different companies. He thought of a way to give clients an easy-to-use platform with access to data that could save oil and gas companies millions of dollars. So, that's exactly what he did.

"Over the past five and a half years, we've built that platform," Birdzell says. "We are currently helping to optimize over $1 billion in capital deployment around drilling and completions."

The company has grown to 25 employees and tripled its revenue last year. The team is forecasting another year of high growth for 2019.

Birdzell spoke with InnovationMap to talk about his start in software, the company's growth, and why nonprofit work has been important to him as a business leader.

InnovationMap: Did you always know you wanted to be an entrepreneur?

Luther Birdzell: When I was about two years old, my grandfather ran a meat business in New York City — in the meatpacking district, back when that area actually had meat packers. It just was in my bones from a really young age that I wanted to start a business.

IM: How did you get into software development?

LB: I studied electrical engineering in college. For my first seven years, I worked in consulting, implementing systems that made data more valuable to subject matter experts. I was primarily supporting management teams and tech teams.

Then, I met the founders of iTKO, who were doing software testing for clients, and I helped them figure out a complementary offering. We took a capability that could help companies dramatically reduce their data center costs but had been restricted to specialized programmers, and together we figured out how to make it something anyone in an IT organization could use. That resulted in companies being able to hire fewer people to maintain servers, as well as reduce other costs. Companies were saving millions of dollars per year per project.

IM: When did the idea for OAG come to you?

LB: Computer Associates bought iTKO from us in 2011. When I resigned from CA in 2013, it was very clear to me that artificial intelligence, big data, machine learning, and the cloud were all tech ingredients for adding more value to data. Then the oil and gas business came into focus.

When I founded OAG Analytics, our mission was — and still is today — to build a platform for the upstream oil and gas industry that enables companies to manage their data, apply world-class machine learning in minutes without writing a single line of code, and run simulations on the resulting analysis.

IM: What makes OAG successful?

LB: My vision was to create a platform that could be trusted to support billions of dollars of capital optimization through transparency and control. A black box doesn't work for the kind of problems we're helping our customers optimize. They need something that's easy to use, simple, powerful, and also gives them complete control.

IM: What does success look like for your clients?

LB: We have customers who have increased their capital efficiency on drilling programs that are about $500 million by over 25 percent, while still getting the same amount of oil out of the ground.

IM: What was the early reception like?

LB: We found a lot of interest in talking about how it works. In 2013, 2014, 2015, well over half the industry knew enough about this technology from other industries to have high confidence that it would affect the oil and gas industry one day. They were willing to spend an hour or two on what it is and how it works. But the number of companies who were really willing to invest in a meaningful way was really small.

Some companies, like EOG Resources, started spending millions of dollars developing this technology in house. Other companies, seeing EOG's and Anadarko's success, raised the bar on the level of proof.

There's an increasing number of companies in the industry who realize that AI isn't a futuristic thing anymore. There are companies using it today, and the companies using it right are making more money. But they're learning it's hard to do right. It could take years and millions of dollars to develop this yourself, but we're helping companies get up to speed in a matter of months, and our total cost for the first year is well under a million bucks. They want us to train them how to use it, then act as support, rather than run it all for them.

IM: Do you plan to stay in just upstream oil and gas?

LB: We're 100 percent focused on upstream oil and gas, and always have been, but as we continue to grow, we're going to follow the market and what customers want. That could mean repurposing our platform for other applications in oil and gas, energy, and even beyond; we're evaluating. The vision has always been to democratize AI, and oil and gas is where we started.

IM: Do you have an exit strategy?

LB: As far as exits, I get asked this a lot. I don't believe in exit strategies. I believe in building a great company. I've seen a lot of founders make a lot of mistakes trying to cut corners to get to early exits. Our goal is to be a great company, and that starts with the right vision and then getting the right people and hires.

IM: How has Houston been as a place to have a startup in energy?

LB: Houston is unparalleled in the oil patch for the ability to support day trips. There are two airports and tons of direct flights to other cities in the oil patch. It's the only city from which you can cover all the others with day trips. The efficiency of being able to be on site with customers is such an advantage.

There are a lot of industry experts in and around Houston, but a startup software company works very differently from an oil company. I think we have a long road ahead of us before we have an ecosystem in place to support startups and give them the best chance of success. Some of that comes from advisers, some from the ecosystem, and some part of it just takes time. But once those pieces come into play, talent follows. I think Houston is a very natural hub for energy tech.

IM: Volunteering is an important part of your business. Why is that something you've focused on?

LB: Something in the DNA of our business is giving back. We do that through direct community action. We've volunteered as a company, and we're always on the lookout for ways we can engage with and make the greatest contribution to the community. We do this primarily for personal reasons, but the universe has been very generous over my career in reciprocating with professional upside.

You volunteer in high school to get into college, then maybe some in college. And you might think, "Oh, that's for philanthropists or retired people, and I'll get back to that later." But the reality is that it feels good doing some of that now, so we do.

------

Portions of this interview have been edited.


Houston geothermal unicorn Fervo officially files for IPO

going public

Fervo Energy has officially filed for an initial public offering.

The Houston-based geothermal unicorn filed a registration statement on Form S-1 with the U.S. Securities and Exchange Commission on April 17 to list its Class A common stock on the Nasdaq exchange. Fervo intends to be listed under the ticker symbol "FRVO."

The number and price of the shares have not yet been determined, according to a news release from Fervo. J.P. Morgan, BofA Securities, RBC Capital Markets and Barclays are leading the offering.

The highly anticipated filing comes as Fervo readies its flagship Cape Station geothermal project to deliver its first power later this year.

"Today, miles-long lines for gasoline have been replaced by lines for electricity. Tech companies compete for megawatts to claim AI market share. Manufacturers jockey for power to strengthen American industry. Utilities demand clean, firm electricity to stabilize the grid," Fervo CEO Tim Latimer shared in the filing. "Fervo is prepared to serve all of these customers. Not with complex, idiosyncratic projects but with a simplified, standardized product capable of delivering around-the-clock, carbon-free power using proven oil and gas technology."

Fervo has been preparing to file for an IPO for months. Axios Pro first reported that the company "quietly" filed for an IPO in January and estimated it would be valued between $2 billion and $3 billion.

Fervo also closed $421 million in non-recourse debt financing for the first phase of Cape Station last month and raised a $462 million Series E in December. The company also announced the addition of four heavyweights to its board of directors last week, including Meg Whitman, former CEO of eBay, Hewlett-Packard, and Spring-based HPE.

Fervo reported a net loss of $70.5 million for the 2025 fiscal year in the S-1 filing and a loss of $41.1 million in 2024.

Tracxn.com estimates that Fervo has raised $1.12 billion over 12 funding rounds. The company was founded in 2017 by Latimer and CTO Jack Norbeck.

---

This article originally appeared on our sister site, EnergyCapitalHTX.com.

New UT Austin med center, anchored by MD Anderson, gets $1 billion gift

Future of Health

A donation announced Tuesday, April 21, breaks a major record at the University of Texas at Austin. Michael and Susan Dell are now UT Austin's first supporters to give $1 billion. In response, the university will create the UT Dell Campus for Advanced Research and the UT Dell Medical Center to "advance human health," per a press release.

The release also records "significant support" for undergraduate scholarships, student housing, and the Texas Advanced Computing Center for supercomputing research.

Both the new research campus and the UT Dell Medical Center will integrate advanced computing into their research and practices. At the medical center, the university hopes that will lead to "earlier detection, more precise and personalized care, and better health outcomes." The University of Texas MD Anderson Cancer Center will also be integrated into the new medical center.

That comes with a numeric goal measured in 10s: raise $10 billion and rank among the top 10 medical centers in the U.S., both in the next decade.

In the shorter term, the university will break ground on the medical center with architecture firm Skidmore, Owings & Merrill (SOM) "later this year."

“UT Austin, where Dell Technologies was founded from a dorm room, has always been a place where bold ideas become real-world impact,” said Michael and Susan Dell in a joint statement.

They continued, “What makes this moment so meaningful is the opportunity to build something that brings every part of the journey together — from how students learn, to how discoveries are made, to how care reaches families. By bringing together medicine, science and computing in one campus designed for the AI era, UT can create more opportunity, deliver better outcomes, and build a stronger future for communities across Texas and beyond.”

This is the second major gift this year for the planned multibillion-dollar medical center. In January, Tench Coxe, a former venture capitalist who’s a major shareholder in chipmaking giant Nvidia, and Simone Coxe, co-founder and former CEO of the Blanc & Otus PR firm, contributed $100 million.

Baylor scientist lands $2M grant to explore links between viruses and Alzheimer’s

Alzheimer’s research

A Baylor College of Medicine scientist will begin exploring the possible link between Alzheimer’s disease and viral infections thanks to a $2 million grant awarded in March.

Dr. Ryan S. Dhindsa is an assistant professor of pathology & immunology at Baylor and a principal investigator at Texas Children’s Duncan Neurological Research Institute (Duncan NRI). He hypothesizes that Alzheimer’s may be linked to viral infections previously contracted by the patient. To study this intriguing possibility, the American Brain Foundation has awarded him the Cure One, Cure Many Award in neuroinflammation.

“It is an honor to receive this support from the Cure One, Cure Many Award. Viral infections are emerging as a major, underappreciated driver of Alzheimer's disease, and this award will allow our team to conduct the most comprehensive screen of viral exposures and host genetics in Alzheimer's to date, spanning over a million individuals,” Dhindsa said in a news release. “Our goal is to identify which viruses matter most, why some people are more vulnerable than others, and ultimately move the field closer to new therapeutic strategies for patients.”

Roughly 150 million people worldwide will suffer from Alzheimer’s by 2050, making it the most common cause of dementia in the world. Despite this, scientists are still at a loss as to what exactly causes it.

Dhindsa’s research is part of a new range of theories that certain viral infections may trigger Alzheimer’s. His team will take a two-fold approach. First, they will analyze the medical records of more than a million individuals looking for patterns. Second, they will analyze viral DNA in stem cell-derived brain cells to see how the infections could contribute to neurological decay. The scale of the genomic data gathering is unprecedented and may highlight a link that traditional studies have missed.

Also joining the project are Dr. Caleb Lareau of Memorial Sloan Kettering Cancer Center and Dr. Artem Babaian of the University of Toronto. Should a link be found, it would open the door to using anti-virals to prevent or treat Alzheimer’s.