
Houston, home to one of Cognite's U.S. headquarters, is the energy capital of the world. But while many oil and gas industry players and partners come together here, much of the data they use — or want to employ — remains siloed.

There's no lack of data. Connected devices are a wellspring of enterprise resource planning data, depth-based trajectories, piping and instrumentation diagrams, and sensor values. But incompatible operational data systems, poor data infrastructure, and restricted data access prevent organizations from easily combining data to solve problems and create solutions.

We understand these challenges because we work alongside some of the biggest operators, OEMs and engineering companies in the oil and gas business. Lundin Petroleum, Aker Energy, OMV, and Aker BP are among our customers, for example.

Flexible, open application programming interfaces (APIs) can address the challenges noted above. APIs enable users to search, filter and run computations on data without downloading full data sets. And they abstract away the complexity of the underlying storage formats.

As a result, data scientists and process engineers can access data in an efficient manner, spending more time on their use cases and less effort contending with technical details. Using APIs, organizations can more easily combine their own internal data. APIs also simplify the process of using data from industry partners and other sources.
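In practice, that pattern means querying a service that filters and aggregates where the data lives, so the client never pulls the full data set. The sketch below is purely illustrative: `SensorAPI`, its methods, and the sample readings are hypothetical stand-ins for a remote data platform, not any vendor's actual SDK.

```python
# Illustrative sketch: filtering and aggregating "server-side" instead of
# downloading a full data set. All names and data here are hypothetical.
from dataclasses import dataclass
from statistics import mean


@dataclass
class Reading:
    sensor: str      # e.g. a pump pressure tag
    timestamp: int   # epoch seconds
    value: float


class SensorAPI:
    """Stand-in for a remote API: queries run where the data lives."""

    def __init__(self, readings):
        self._readings = readings

    def search(self, sensor=None, start=None, end=None):
        """Return only the readings matching the filter, not the whole set."""
        return [
            r for r in self._readings
            if (sensor is None or r.sensor == sensor)
            and (start is None or r.timestamp >= start)
            and (end is None or r.timestamp < end)
        ]

    def aggregate(self, sensor, start, end):
        """Server-side computation: the client receives one number."""
        values = [r.value for r in self.search(sensor, start, end)]
        return mean(values) if values else None


api = SensorAPI([
    Reading("pump-7/pressure", 100, 41.2),
    Reading("pump-7/pressure", 200, 43.8),
    Reading("pump-9/pressure", 150, 39.5),
])

# The client asks for the average over a time window and gets back a single
# value, never the underlying rows or their storage format.
print(api.aggregate("pump-7/pressure", start=0, end=300))
```

Because the storage format sits behind the interface, the backing store can change without breaking any client that codes against `search` and `aggregate`.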

Most companies have slightly different work processes. But common API standards can help a company combine software services and platforms from others in a way that matches its own business logic and internal processes. That can allow the company to differentiate itself from competitors by employing services from the best suppliers to create innovative solutions.

Standardizing APIs across the oil and gas industry would open the door to a community of developers, which could create custom applications and connect existing market solutions. Then more new and exciting applications and services would reach the market faster.

To ensure adoption and success of such a standardization effort, the APIs would need to be well crafted and intuitive to use. These APIs would have to include the business logic required to perform the operations to empower users. In addition, APIs would need to define and allow for the sharing of desired information objects in a consistent way.

Best practices in defining common APIs for sharing data within the industry include:

  • Introducing APIs iteratively, driven by concrete use cases with business value
  • Ensuring all services using the API provide relevant output and insights in a structured machine-readable format, enabling ingestion into the API to ensure continuous enrichment of the data set
  • Making all data searchable
  • Preventing underlying technology from being exposed through the APIs to ensure continuous optimization and allow companies to implement their technology of choice
  • Supporting all external data sharing through an open, well-documented and well-versioned API, using the OpenAPI standard
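The last point can be made concrete with a minimal OpenAPI fragment. The service name, path, and fields below are hypothetical, sketching only the shape such a shared, versioned contract might take:

```yaml
openapi: 3.0.3
info:
  title: Example Well Data API   # hypothetical service name
  version: 1.2.0                 # explicit versioning keeps sharing stable
paths:
  /timeseries:
    get:
      summary: Search time series by asset metadata
      parameters:
        - name: asset
          in: query
          schema:
            type: string
      responses:
        "200":
          description: Matching time series, independent of storage backend
```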

If oil and gas industry operators define APIs, suppliers will embrace them. That will "grease" the value chain, allowing it to move with less friction and waste.

Operations and maintenance are a natural place for API harmonization to start. Standardized APIs also can enable operators to aggregate and use environmental, equipment and systems, health and safety, and other data. That will accelerate digital transformation in oil and gas and enable companies to leverage innovative solutions coming from the ecosystem, reduce waste, and improve operations, making production more sustainable.

------

Francois Laborie is the general manager of Cognite North Americas.


Houston entrepreneur is using his analytics company to change the oil and gas industry

Featured Innovator

Luther Birdzell has been on a mission to democratize data for the upstream oil and gas industry since he started his company, OAG Analytics, in 2013.

For him, there just aren't enough data scientists for hire for every company to do the same work internally. He saw a way to give clients an easy-to-use platform with access to data that could save oil and gas companies millions of dollars. So, that's exactly what he did.

"Over the past five and a half years, we've built that platform," Birdzell says. "We are currently helping to optimize over $1 billion in capital deployment around drilling and completions."

The company has grown to 25 employees and tripled its revenue last year. The team is forecasting another year of high growth for 2019.

Birdzell spoke with InnovationMap to talk about his start in software, the company's growth, and why nonprofit work has been important to him as a business leader.

InnovationMap: Did you always know you wanted to be an entrepreneur?

Luther Birdzell: When I was about two years old, my grandfather ran a meat business in New York City — in the meatpacking district, back when that area actually had meat packers. It just was in my bones from a really young age that I wanted to start a business.

IM: How did you get into software development?

LB: I studied electrical engineering in college. For my first seven years, I worked in consulting, implementing systems that made data more valuable to subject matter experts. I was primarily supporting management and technology teams.

Then, I met the founders of iTKO, who were doing software testing for clients, and I helped them figure out an offering that was complementary to what they were doing. We took a capability that could help companies significantly reduce their data center costs, but that had been restricted to specialized programmers. Together we figured out how to make it a capability that anyone in an IT organization could use. That resulted in companies being able to hire fewer people to maintain servers, as well as reduce other costs. Companies were saving millions of dollars per year per project.

IM: When did the idea for OAG come to you?

LB: Computer Associates bought iTKO from us in 2011. When I resigned from CA in 2013, it was very clear to me that artificial intelligence, big data, machine learning, and the cloud, were all tech ingredients for adding more value to data. Then the oil and gas business came into focus.

When I founded OAG Analytics, our mission, then as now, was to build a platform for the upstream oil and gas industry that enables operators to manage their data, apply world-class machine learning in minutes without writing a single line of code, and run simulations on the resulting analysis.

IM: What makes OAG successful?

LB: My vision was to create a platform that could be trusted to support billions of dollars of capital optimization through transparency and control. A black box doesn't work for the kind of problems we're helping our customers optimize. They need something that's easy to use, simple, powerful, and also gives them complete control.

IM: What does success look like for your clients?

LB: We have customers who have increased their capital efficiency on drilling programs that are about $500 million by over 25 percent, while still getting the same amount of oil out of the ground.

IM: What was the early reception like?

LB: We found a lot of interest in talking about how it works. In 2013, 2014, 2015, well over half the industry knew enough about this technology from other industries to have high confidence that it would affect the oil and gas industry one day. They were willing to spend an hour or two on what it is and how it works. But the number of companies who were really willing to invest in a meaningful way was really small.

Some companies, like EOG Resources, started spending millions of dollars developing this technology in house. Other companies, seeing EOG's and Anadarko's success, raised the bar on the level of proof they required.

There's an increasing number of companies in the industry who realize that AI isn't a futuristic thing anymore. There are companies using it today, and the companies using it right are making more money. But, they're learning it's hard to do right. It could take years and millions of dollars to develop this yourself, but we're helping companies get up to speed in a matter of months, and our total cost for the first year is well under a million bucks to do this. They want us to train them how to use it, then act as support, rather than run it all for them.

IM: Do you plan to stay in just upstream oil and gas?

LB: We're 100 percent focused on upstream oil and gas, and always have been, but as we continue to grow, we're going to follow the market and what customers want. Repurposing our platform for other applications in oil and gas, energy, and even beyond that. We're evaluating. The vision has always been to democratize AI, and oil and gas is where we started.

IM: Do you have an exit strategy?

LB: As far as exits, I get asked this a lot. I don't believe in exit strategies. I believe in building a great company. I've seen a lot of founders make a lot of mistakes trying to cut corners to get to early exits. Our goal is to be a great company, and that starts with the right vision and then getting the right people and hires.

IM: How has Houston been as a place to have a startup in energy?

LB: Houston is unparalleled in the oil patch for the ability to support day trips. There are two airports and tons of direct flights to other cities in the oil patch. It's the only city from which you can cover all the other cities with day trips. The efficiency of being able to be on site with customers is such an advantage.

There are a lot of industry experts in and around Houston, but a startup software company works very differently from an oil company. I think we have a long road ahead of us before we have an ecosystem in place to support startups and give them the best chance of success. Some of that comes from advisers, some from the ecosystem, and some part of it just takes time. But once those pieces come into play, talent follows. I think Houston is a very natural hub for energy tech.

IM: Volunteering is an important part of your business. Why is that something you've focused on?

LB: Something in the DNA of our business is giving back. We do that through direct community action. We've volunteered as a company, and we're always on the lookout for ways we can engage with and make the greatest contribution to the community. We do this primarily for personal reasons, but the universe has been very generous over my career in reciprocating a professional upside.

You volunteer in high school to get into college, then maybe some in college. And you might think, "oh that's for philanthropists or retired people and I'll get back to that later." But the reality of that is it feels better doing some of that now, so we do.

------

Portions of this interview have been edited.


TMC, Memorial Hermann launch partnership to spur new patient care technologies

medtech partnership

Texas Medical Center and Memorial Hermann Health System have launched a new collaboration for developing patient care technology.

Through the partnership, Memorial Hermann employees and physicians will now be able to participate in the TMC Center for Device Innovation (CDI), which will assist them in translating product innovation ideas into working prototypes. The first group of entrepreneurs will pitch their innovations in early 2026, according to a release from TMC.

“Memorial Hermann is excited to launch this new partnership with the TMC CDI,” Ini Ekiko Thomas, vice president of information technology at Memorial Hermann, said in the news release. “As we continue to grow (a) culture of innovation, we look forward to supporting our employees, affiliated physicians and providers in new ways.”

Mentors from Memorial Hermann, TMC Innovation and industry experts with specialties in medicine, regulatory strategy, reimbursement planning and investor readiness will assist with the program. The innovators will also gain access to support in areas like product innovation and translation strategy, as well as dedicated engineering and machinist resources and personal workbench space at the CDI.

“The prototyping facilities and opportunities at TMC are world-class and globally recognized, attracting innovators from around the world to advance their technologies,” Tom Luby, chief innovation officer at TMC Innovation Factory, said in the release.

Memorial Hermann says the partnership will support its innovation hub’s “pilot and scale approach” and hopes that it will extend the hub’s impact in “supporting researchers, clinicians and staff in developing patentable, commercially viable products.”

“We are excited to expand our partnership with Memorial Hermann and open the doors of our Center for Device Innovation to their employees and physicians—already among the best in medical care,” Luby added in the release. “We look forward to seeing what they accomplish next, utilizing our labs and gaining insights from top leaders across our campus.”

Google to invest $40 billion in AI data centers in Texas

Google is investing a huge chunk of money in Texas: According to a release, the company will invest $40 billion in cloud and artificial intelligence (AI) infrastructure, including the development of new data centers in Armstrong and Haskell counties.

The company announced its intentions at a meeting on November 14 attended by federal, state, and local leaders including Gov. Greg Abbott, who called it "a Texas-sized investment."

Google will open two new data center campuses in Haskell County and a data center campus in Armstrong County.

Additionally, the first building at the company’s Red Oak campus in Ellis County is now operational. Google is continuing to invest in its existing Midlothian campus and Dallas cloud region, which are part of the company’s global network of 42 cloud regions that deliver high-performance, low-latency services that businesses and organizations use to build and scale their own AI-powered solutions.

Energy demands

Google is committed to responsibly growing its infrastructure by bringing new energy resources onto the grid, paying for costs associated with its operations, and supporting community energy efficiency initiatives.

One of the new Haskell data centers will be co-located with — or built directly alongside — a new solar and battery energy storage plant, creating the first industrial park to be developed through Google’s partnership with Intersect and TPG Rise Climate announced last year.

Google has contracted to add more than 6,200 megawatts (MW) of net new energy generation and capacity to the Texas electricity grid through power purchase agreements (PPAs) with energy developers such as AES Corporation, Enel North America, Intersect, Clearway, ENGIE, SB Energy, Ørsted, and X-Elio.

Water demands

Google’s three new facilities in Armstrong and Haskell counties will use air-cooling technology, limiting water use to site operations like kitchens. The company is also contributing $2.6 million to help Texas Water Trade create and enhance up to 1,000 acres of wetlands along the Trinity-San Jacinto Estuary. Google is also sponsoring a regenerative agriculture program with Indigo Ag in the Dallas-Fort Worth area and an irrigation efficiency project with N-Drip in the Texas High Plains.

In addition to the data centers, Google is committing $7 million in grants to support AI-related initiatives in healthcare, energy, and education across the state. This includes helping CareMessage enhance rural healthcare access; enabling the University of Texas at Austin and Texas Tech University to address energy challenges that will arise with AI; and expanding AI training for Texas educators and students through support to Houston City College.

---

This article originally appeared on CultureMap.com.

TMCi names 11 global startups to latest HealthTech Accelerator cohort

new class

Texas Medical Center Innovation has named 11 medtech startups from around the world to its latest HealthTech Accelerator cohort.

Members of the accelerator's 19th cohort will participate in the six-month program, which kicked off this month. They range from startups developing on-the-go pelvic floor monitoring to others making 3D-printed craniofacial and orthopedic implants. Each previously participated in TMCi's bootcamp before being selected to join the accelerator. Through the HealthTech Accelerator, founders will work closely with TMC specialists, researchers, top-tier hospital experts and seasoned advisors to help grow their companies and hone their clinical trials, intellectual property, fundraising and more.

“This cohort of startups is tackling some of today’s most pressing clinical challenges, from surgery and respiratory care to diagnostics and women’s health," Tom Luby, chief innovation officer at Texas Medical Center, said in a news release. "At TMC, we bring together the minds behind innovation—entrepreneurs, technology leaders, and strategic partners—to help emerging companies validate, scale, and deliver solutions that make a real difference for patients here and around the world. We look forward to seeing their progress and global impact through the HealthTech Accelerator and the support of our broader ecosystem.”

The 2025 HealthTech Accelerator cohort includes:

  • Houston-based Respiree, which has created an all-in-one cardiopulmonary platform with wearable sensors for respiratory monitoring that uses AI to track breathing patterns and detect early signs of distress
  • College Station-based SageSpectra, which designs an innovative patch system for real-time, remote monitoring of temperature and StO2 for assessing vascular occlusion, infection, and other surgical flap complications
  • Austin-based Dynamic Light, which has developed a non-invasive imaging technology that enables surgeons to visualize blood flow in real-time without the need for traditional dyes
  • Bangkok, Thailand-based OsseoLabs, which develops AI-assisted, 3D-printed patient-specific implants for craniofacial and orthopedic surgeries
  • Sydney, Australia-based Roam Technologies, which has developed a portable oxygen therapy system (JUNO) that provides real-time oxygen delivery optimization for patients with chronic conditions
  • OptiLung, which develops 3D-printed extracorporeal blood oxygenation devices designed to optimize blood flow and reduce complications
  • Bengaluru, India-based Dozee, which has created a smart remote patient monitor platform that uses under-the-mattress bed sensors to capture vital signs through continuous monitoring
  • Montclair, New Jersey-based Endomedix, which has developed a biosurgical fast-acting absorbable hemostat designed to eliminate the risk of paralysis and reoperation due to device swelling
  • Williston, Vermont-based Xander Medical, which has designed a biomechanical innovation that addresses the complications and cost burdens associated with the current methods of removing stripped and broken surgical screws
  • Salt Lake City, Utah-based Freyya, which has developed an on-the-go pelvic floor monitoring and feedback device for people with pelvic floor dysfunction
  • The Netherlands-based Scinvivo, which has developed optical imaging catheters for bladder cancer diagnostics