AI's true potential lies in its ability to enhance human capabilities, not replace them. Photo via Getty Images

The rapid advancement of artificial intelligence is forcing businesses to evaluate how they will manage the inevitable changes this technology will bring. With its ability to automate tasks, analyze large amounts of data, and provide detailed insights, AI offers an enormous opportunity for businesses of all sizes. However, realizing this potential requires a strategic approach that positions AI as a powerful partner, rather than a replacement for human ingenuity.

The British Council reports that an estimated 65 percent of today's students will eventually work in professions that have yet to be conceived. With the emergence of new AI tools, this projection emphasizes the importance of cultivating a versatile skill set that allows us to adapt to an ever-changing landscape. It also underscores the need for a strategy that embraces the division of labor between humans and machines.

What this means is that an AI strategy shouldn't just be about automation – it should also incorporate an understanding of the human-AI partnership that will be necessary for future success. By using the concepts of automation, augmentation, and autonomy, businesses can unlock the full potential of AI to boost efficiency, enhance decision-making, and ultimately drive continued success.

Automation: Delegating to the AI

We know AI can automate many tasks in a business. However, we should also look at automation from a strategy standpoint by asking, "What tasks can be fully delegated to the AI?" Good candidates are routine, repetitive, and time-consuming tasks that shouldn't require human intervention, as well as tasks that are especially susceptible to human error. The goal is to identify work that doesn't benefit from human nuance, so questions about time, precision, and compliance can offer even more value.

  • Time. What tasks are time-consuming and could be completed quickly with well-written instructions?
  • Precision. What tasks require precision that is difficult for humans to achieve?
  • Compliance. What tasks involve critical safety procedures or adherence to strict compliance that humans might overlook?

Augmentation: Using AI to boost your potential

Beyond automation, AI's true power lies in its ability to boost human capabilities. Through this lens, you should ask, "How can AI boost my output potential?" Think of AI as a skilled assistant that can analyze vast datasets, identify complex patterns, and present insights that aren't readily apparent to humans alone. The focus here is on tasks that still require a human touch but can benefit from a computer's speed and data-processing power. When exploring this further, consider asking questions about skill boosts, assistance, and focus.

  • Skill boosts. What tasks am I doing that I understand but am not an expert at?
  • Assistance. What tasks still require a human's touch but could use processing or speed boosts?
  • Focus. What tasks are causing employees to spend more time on tools and less on goals?

Autonomy: The importance of humans in the loop

One question that comes up frequently when discussing AI is whether it will replace a particular set of jobs. While AI is remarkably powerful, the key to making all this work is understanding that not every task requires automation; in fact, some tasks would suffer from it. This step requires you to ask, "Where are human emotion, creativity, intuition, and oversight essential?" Autonomy, in this sense, means digging into creativity, intuition, and uniqueness (a simple triage sketch follows the list below).

  • Creativity. Does this task require a level of creativity that a machine can't replicate?
  • Intuition. Does this task require emotional awareness that a machine can't discern?
  • Brand Uniqueness. Does this task represent a part of my brand that shouldn't be automated or machine-driven?
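To make the framework concrete, here is a minimal sketch of how the three A's might be operationalized as a task-triage checklist. Everything in it (the Task attributes, the rules, and the example tasks) is a hypothetical illustration, not the author's formal framework.

```python
# A hypothetical triage checklist for the three A's. The attributes and
# rules below are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    repetitive: bool           # routine, time-consuming, rule-based?
    needs_precision: bool      # precision difficult for humans?
    compliance_critical: bool  # strict safety/compliance steps?
    needs_judgment: bool       # creativity, emotion, brand voice?

def triage(task: Task) -> str:
    # Autonomy first: if human judgment is essential, keep a human in the loop.
    if task.needs_judgment:
        return f"{task.name}: autonomy (human in the loop)"
    # Automation: tasks that don't benefit from human nuance.
    if task.repetitive or task.needs_precision or task.compliance_critical:
        return f"{task.name}: automation (delegate to AI)"
    # Everything else: augmentation, pairing human work with AI assistance.
    return f"{task.name}: augmentation (AI-assisted human)"

print(triage(Task("invoice data entry", True, True, False, False)))
print(triage(Task("brand voice copywriting", False, False, False, True)))
print(triage(Task("quarterly market analysis", False, False, False, False)))
```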

AI brings a lot to look forward to. It's fair to say it's on its way to transforming the world, but it's important to remember that the businesses that strategically embrace a human-centered approach to integrating AI into everyday activities are the ones that will thrive. The three A's (automation, augmentation, and autonomy) provide an essential foundation for beginning this journey. By understanding the best applications of each, businesses of all sizes can discover areas for increased efficiency, more thoughtful decision-making, and a competitive edge that drives long-term success. AI's true potential lies in its ability to enhance human capabilities, not replace them.

------

Kelsey Ruger is the chief technology and product officer for Hello Alice.

Researchers at the new SynthX Center will aim to turn fundamental research into clinical applications and make precision adjustments to drug properties and molecules. Photo via Rice University

Houston organizations launch collaborative center to boost cancer outcomes

new to HOU

Rice University's new Synthesis X Center officially launched last month to bring together experts in cancer care and chemistry.

The center was born out of what started about seven years ago as informal meetings between Rice chemist Han Xiao's research group and others from the Dan L Duncan Comprehensive Cancer Center at Baylor College of Medicine. The level of collaboration between the two teams has grown significantly over the years, and monthly meetings now draw about 100 participants from across disciplines, fields and Houston-based organizations, according to a statement from Rice.

Researchers at the new SynthX Center will aim to turn fundamental research into clinical applications and make precision adjustments to drug properties and molecules. It will focus on improving cancer outcomes by looking at an array of factors, including prevention and detection, immunotherapies, the use of artificial intelligence to speed drug discovery and development, and several other topics.

"At Rice, we are strong on the fundamental side of research in organic chemistry, chemical biology, bioengineering and nanomaterials,” Xiao says in the statement. “Starting at the laboratory bench, we can synthesize therapeutic molecules and proteins with atom-level precision, offering immense potential for real-world applications at the bedside ... But the clinicians and fundamental researchers don’t have a lot of time to talk and to exchange ideas, so SynthX wants to serve as the bridge and help make these connections.”

SynthX plans to issue its first merit-based seed grants to teams with representatives from Baylor and Rice this month.

With this recognition from Rice, the teams from Xiao's lab and the Texas Medical Center will also be able to expand and formalize their programs. They will build upon annual retreats, in which investigators can share unpublished findings, and also plan to host a national conference, the first of which is slated for this fall and titled "Synthetic Innovations Towards a Cure for Cancer."

“I am confident that the SynthX Center will be a great resource for both students and faculty who seek to translate discoveries from fundamental chemical research into medical applications that improve people’s lives,” Thomas Killian, dean of the Wiess School of Natural Sciences, says in the release.

Rice announced that it had invested in four other research centers along with SynthX last month. The other centers include the Center for Coastal Futures and Adaptive Resilience, the Center for Environmental Studies, the Center for Latin American and Latinx Studies and the Rice Center for Nanoscale Imaging Sciences.

Earlier this year, Rice also announced its first-ever recipients of its One Small Step Grant program, funded by its Office of Innovation. The program will provide funding to faculty working on "promising projects with commercial potential," according to the website.

Researchers at Baylor College of Medicine’s Human Genome Sequencing Center have trained an AI assistant to explain genetic test results to patients. Photo via Getty Images

Houston researchers tap into GenAI for communicating genetic test results

hi, tech

Artificial intelligence in the health care setting has a lot of potential, and one Houston institution is looking into one particular use.

Researchers at Baylor College of Medicine’s Human Genome Sequencing Center have trained an AI assistant to explain genetic test results to patients. According to findings published in the Journal of the American Medical Informatics Association (JAMIA), the team has developed generative AI to understand and interpret genetic tests. They have also tested its accuracy against OpenAI’s ChatGPT 3.5.

“We created a chatbot that can provide guidance on general pharmacogenomic testing, dosage implications, and the side effects of therapeutics, and address patient concerns,” explains first author Mullai Murugan in a press release. Murugan is director of software engineering and programming at the Human Genome Sequencing Center. “We see this tool as a superpowered assistant that can increase accessibility and help both physicians and patients answer questions about genetic test results.”

The initial chatbot training specifically targeted pharmacogenomic testing for statins, meaning a patient’s potential response to cholesterol-lowering drugs, as dictated by genetics.

Murugan explains why the team decided to create its own chatbot: the key publication on statin pharmacogenomics appeared in May 2022, four months after ChatGPT 3.5's training cutoff of January 2022. Her team's technology, by contrast, uses Retrieval Augmented Generation (RAG), which allows it to draw on the most recent guidelines.
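For readers unfamiliar with the pattern, here is a minimal sketch of how RAG grounds a chatbot's answer in guideline text retrieved at query time rather than in frozen training data. It is illustrative only: embed() and generate() are hypothetical stand-ins for an embedding model and a language model, not the Baylor team's implementation.

```python
# Minimal RAG sketch: retrieve the most relevant guideline passages,
# then let the language model answer from that retrieved context.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Hypothetical embedding call; assumed to return a unit-length vector."""
    raise NotImplementedError("plug in an embedding model here")

def generate(prompt: str) -> str:
    """Hypothetical LLM completion call."""
    raise NotImplementedError("plug in a language model here")

def answer(question: str, passages: list[str], k: int = 3) -> str:
    # Embed the guideline corpus and the question.
    corpus = np.stack([embed(p) for p in passages])
    query = embed(question)
    # Cosine similarity reduces to a dot product for unit vectors;
    # take the k passages most similar to the question.
    top_k = np.argsort(corpus @ query)[::-1][:k]
    context = "\n\n".join(passages[i] for i in top_k)
    # Because context is retrieved at query time, the model can cite
    # guidance published after its training cutoff.
    return generate(
        f"Answer using only this guideline excerpt:\n{context}\n\n"
        f"Question: {question}"
    )
```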

How did the two AI assistants compare? Four experts in cardiology and pharmacogenomics rated both chatbots on accuracy, relevancy, risk management, and language clarity, among other factors. Though the two scored similarly on language clarity, Baylor’s chatbot scored 85 percent in accuracy and 81 percent in relevancy, compared with ChatGPT’s 58 percent in accuracy and 62 percent in relevancy, when asked questions from health care providers.

“We are working to fine-tune the chatbot to better respond to certain questions, and we want to get feedback from real patients,” Murugan says. “Based on this study, it is very clear that there is a lot of potential here.” Nonetheless, Murugan emphasized that there is much work still to be done before the program is ready for clinical applications. That includes training the chatbot to explain results in the language used by genetic counselors. Funds from the NIH’s All of Us Research Program helped to make the research possible.

Hear from guest columnist Onega Ulanova on AI and quality management systems in manufacturing. Photo via Getty Images

Expert: How AI is disrupting manufacturing and the future of quality management systems

guest column

The concept of quality management is intrinsic to modern manufacturing, yet little understood by the general public, and it has revolutionized our world over the past hundred years.

Yet, in the present day, quality management and the related systems that guide its implementation are far from static. They are continuously evolving, adapting to ever-changing global conditions and to new means of application unleashed by technological innovation.

Now, more than ever, they are essential for addressing and eliminating not only traditional sources of waste in business, such as lost time and money, but also the physical and pollutant waste that threatens the world we all inhabit.

But what are quality management systems, or QMS, exactly? Who created them, and how have they evolved over time? Perhaps most pressingly, where can they be of greatest help in the present world, and when can they be implemented by businesses in need of change and improvement?

In this article, we will explore the history of QMS, explain their essential role in today’s manufacturing practices, and examine how these systems will take us into the future of productivity.

Quality Management Systems: A Definition

In the United States and globally, the gold standard of quality management standards and practices is the American Society for Quality. This preeminent organization, with over 4,000 members in 130 countries, was established in 1946 and has guided practices and implementation of quality management systems worldwide.

The Society defines a quality management system as “a formalized system that documents processes, procedures, and responsibilities for achieving quality policies and objectives,” and further states that “a QMS helps coordinate and direct an organization’s activities to meet customer and regulatory requirements and improve its effectiveness and efficiency on a continuous basis.”

From this definition, it can be understood that a good quality management system’s purpose is to establish the conditions for consistent and ever-increasing improvement through the use of standardized business culture practices.

Which QMS Standards are Most Widely Used?

Quality management's remarkable growth since the 1940s has led to the rise of a number of widely used standards, which serve as the basis for companies and organizations to design and implement their own practices. Most of these modern quality management standards are globally recognized and are specifically tailored to ensure that a company's newly developed practices include the essential elements that increase the likelihood of success.

The most widely known entity offering such guidance is the International Organization for Standardization (ISO), a global organization that develops and publishes technical standards. Since the 1980s, the ISO has provided the 9000 series of standards (the most famous of which is 9001:2015), which outline how organizations can meet quality management requirements and create their own best practices.

In 2020, over 1.2 million organizations worldwide were officially certified by the ISO for their quality management implementation practices.

However, it should be understood that the ISO 9000 standards are merely guidelines for the design and implementation of a quality management system; they are not systems in and of themselves.

Furthermore, the ISO is far from the only relevant player in this field. Many industry-specific standards, such as the American Petroleum Institute’s API Q1 standard, have been developed to target the highly specialized needs of particular business practices, in this case those of the oil and gas industry. These industry-specific standards are generally aligned with the ISO 9000 standards and serve as complementary additional guidance rather than a replacement. It is entirely possible, and in many cases desirable, for a company to receive both ISO certification and certification from an industry-specific standards body, as doing so can help ensure the company’s newly developed QMS procedures are consistent with both broad and specialized best practices.

A History of Quality Management

The concept of quality management is intrinsically tied to the development of industrial production. Prior to the Industrial Revolution, the concept of ‘quality’ was inherently linked to the skill and effort of craftspeople: individual laborers trained in specialized fields who, either individually or in small groups, produced goods for use in society.

Whether they were weaving baskets or building castles, these craftspeople were primarily defined by a skill that centered them in a specific production methodology, and it was mastery of this skill that determined quality. Guilds of craftspeople would sign their works, placing a personal or group seal on the resulting product and thereby accepting accountability for its quality.

Such signatures and marks are found dating back at least 4,500 years to the construction of Egypt’s Great Pyramid of Giza, and came into widespread practice in medieval Europe with the rise of craft guilds.

In these early confederations of workers, a person’s mastery of a skill or craft could become a defining part of their identity and life, to the extent that many craftspeople of 13th Century Europe lived together in communal settings, while the Egyptian pyramid workers may have belonged to life-long ‘fraternities’ who returned, year after year, to fulfill their roles in ‘work gangs’.

However, in the Industrial Revolution, craft and guild organizations were supplanted by factories. Though ancient and medieval projects at times reached monumental scale, the rise of thousands of factories, each requiring human and machine contributions to generate masses of identical products, required a completely different scale of quality management.

The emphasis on mass production necessitated the use of workers who were no longer crafts masters, and thus resulted in a decrease in the quality of products. This in turn necessitated the rise of the product inspection system, which was steadily refined from the start of the Industrial Revolution in 1760 into the early 20th century.

However, inspection was merely a system of quality control, rather than quality management; in other words, simply discarding defective products did not in and of itself increase total product quality or reduce waste.

As influential American engineer Joseph M. Juran explained, in 1920s-era America, it was common to throw away substantial portions of produced inventory due to defects, and when Juran prompted inspectors at his employer’s company to do something, they refused, saying it was the responsibility of the production line to improve. Quality control, in and of itself, would not yield quality management.

As is often the case in human history, war was the driver of change. In World War II, the mobilization of millions of American workers into wartime roles coincided with the need to produce greater quantities of high-quality products than ever before.

To counteract the loss of skilled factory labor, the United States government implemented the Training Within Industry program, which utilized 10-hour courses to educate newly-recruited workers in how to conduct their work, evaluate their efficiency, and suggest improvements. Similar training programs for the trainers themselves were also developed. By the end of the war, more than 1.6 million workers had been certified under the Training Within Industry program.

Training Within Industry represented one of the first successful implementations of quality management systems, and its impact was widely felt after the end of the war. In the ashes of conflict, the United States and the other Allied Powers were tasked with helping to rebuild the economies of the other wartime combatants. Nowhere was this a more pressing matter than Japan, which had seen widespread economic devastation and had lost 40 percent of all its factories. Further complicating the situation was the reality that, then as now, Japan lacked sufficient natural resources to serve its economic scale.

And yet, within just 10 years of the war’s end, Japan’s economy was growing twice as fast per year as it had been before the fighting started. The driver of this miraculous turnaround was American-derived quality management practices, reinterpreted and implemented with Japanese ingenuity.

In modern business management, few concepts are as renowned, and oft-cited for success, as kaizen. This Japanese word, which simply means “improvement,” is the essential lesson and driver of Japan’s postwar economic success.

Numerous books written outside Japan have attempted to explain kaizen’s quality management principles, often by citing them as being ‘distinctly Japanese.’ Yet, the basis for kaizen is actually universal and applicable in any culture or context; it is, simply put, an emphasis on remaining quality-focused and open to evolution. The development of kaizen began in the post-war period when American statistician William Edwards Deming was brought to Japan as part of the US government’s rebuilding efforts.

A student of earlier quality management thought leaders, Deming instructed hundreds of Japanese engineers, executives, and scholars, urging them to place statistical analysis and human relationships at the center of their management practices. Deming used statistics to track the number and origin of product defects, as well as to analyze the effectiveness of remedies. He also reinstated a key idea of the craftsperson's creed: that the individual worker is not just a set of hands performing a task, but a person who can, with time, improve both the self and the whole of the company.
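As a small illustration of the statistical approach Deming taught, the sketch below implements a p-chart, a classic statistical process control tool: it computes 3-sigma control limits on per-batch defect rates so that ordinary variation can be distinguished from a batch signaling a real problem. The batch counts are invented for demonstration.

```python
# p-chart sketch: flag batches whose defect rate falls outside
# 3-sigma control limits. The defect counts below are made up.
import numpy as np

defects = np.array([4, 6, 5, 3, 7, 5, 18, 4, 6, 5])  # defects per batch
batch_size = 200                                      # units per batch

p = defects / batch_size           # per-batch defect proportion
p_bar = p.mean()                   # center line of the chart
sigma = np.sqrt(p_bar * (1 - p_bar) / batch_size)
ucl = p_bar + 3 * sigma            # upper control limit
lcl = max(p_bar - 3 * sigma, 0.0)  # lower limit, floored at zero

for i, rate in enumerate(p, start=1):
    status = "investigate" if not (lcl <= rate <= ucl) else "in control"
    print(f"batch {i:2d}: defect rate {rate:.3f} ({status})")
```

Run on this data, only batch 7 (a 9 percent defect rate against an upper limit near 6.9 percent) is flagged; the rest is ordinary variation that inspection alone would not distinguish.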

Deming was not alone in these efforts; the aforementioned Joseph M. Juran, who came to Japan as part of the rebuilding program several years later, also gave numerous lectures expounding similar principles.

Like Deming, Juran had previously tried to impart these approaches to American industry, but the lessons often fell on deaf ears. Japanese managers, however, took the lessons to heart and soon began crafting their own quality management systems.

Kaoru Ishikawa, who began by translating the works of Deming and Juran into Japanese, was one of the crucial players who helped to create the ideas now known as kaizen. He introduced a bottom-up approach where workers from every part of the product life cycle could initiate change, and popularized Deming’s concept of quality circles, where small groups of workers would meet regularly to analyze results and discuss improvements.

By 1975, Japanese product quality, which had once been regarded as poor, had become world-class thanks to the teachings of Deming, Juran, and kaizen.

By the 1980s, American industry had lost market share and quality prestige to Japan. It was now time for US businesses to learn from Deming and Juran, both of whom at last found a receptive audience in their home country. Deming in particular achieved recognition for his role in the influential 1980 television documentary If Japan Can, Why Can’t We?, in which he emphasized the universal applicability of quality management.

So too did kaizen, which influenced a new generation of global thought leaders. Arising out of this rapid expansion of QMS were new systems in the 1970s and ‘80s, including the Six Sigma approach pioneered by Bill Smith and Motorola in 1987. Ishikawa, who saw his reputation and life transformed as his ideas spread worldwide, eventually summed up the explanation as the universality of human nature and its desire to improve. As Ishikawa said, “wherever they are, human beings are human beings”.

In no small part due to the influence of the thought leaders mentioned, quality management systems are today a cornerstone of global business practice. So influential are the innovators of these systems that they are often called ‘gurus.’ But what are the specific benefits of these systems, and how best can they be implemented?

How QMS Benefits Organizations, and the World

The oft-cited benefits of quality management systems are operational efficiency, employee retention, and reduction of waste. From all of these come improvements to the company’s bottom line and reputation. But far from being dry talking points, each benefit not only serves its obvious purpose, but also can dramatically help benefit the planet itself.

Operational efficiency is the measurement, analysis, and improvement of processes within an organization, with the purpose of using data and careful consideration to eliminate or mitigate areas where current practices are not effective.

Quality management systems can increase operational efficiency by utilizing employee analysis and feedback to quickly identify areas where improvements are possible, and then to guide their implementation.

In a joint study conducted in 2017 by Forbes and the American Society for Quality, 56 percent of companies stated that improving operational efficiency was a top concern; in the same survey, 59 percent of companies received direct benefit to operations by utilizing quality management system practices, making it the single largest area of improvement across all business types.

Because operational improvements inherently reduce both waste and cost, conducting business in a fully optimized manner can simultaneously save unnecessary resource expenditure, decrease pollutants and discarded materials, and retain more money which the company can invest in further sustainable practices. Efficiency is itself a kind of ‘stealth sustainability’ that turns a profit-focused mindset into a generator of greater good. It is this very point that the United States government’s Environmental Protection Agency (EPA) has emphasized in its guidance for Environmental Management Systems (EMS). These quality management system guidelines, tailored specifically to benefit operational efficiency in a business setting, are also designed to benefit the global environment.

Examples in the EPA’s studies in preparing these guidelines showcased areas where small companies could reduce environmental waste while simultaneously cutting costs. These added up to substantial reductions and savings, such as a 15 percent wastewater reduction that saved a small metal finishing company $15,000 per year.

Similarly, a 2020 study by McKinsey & Company identified ways that optimizing operations could dramatically aid a company’s sustainability with only small outlays of capital, thereby making environmental benefit a by-product of improved profitability.

Employee retention, and more broadly the satisfaction of employees, is another major consideration of QMS. Defined simply, retention is not only the maintenance of a stable workforce without turnover, but the improvement of that workforce over time as workers gain skill, confidence, and ability for continued self- and organizational improvement. We may be long past the Industrial Revolution, but thanks to the ideas of QMS, some of the concept of the craftsperson has returned to modern thinking; the individual, once more, has great value.

Quality management systems aid employee retention by allowing the people of an organization to have a direct hand in its improvement. In a study published in 2023 by the journal Quality Innovation Prosperity, 40 percent of organizations which implemented ISO 9001 guidance for the creation of a QMS reported that the process yielded greater employee retention.

A crucial success factor for employee satisfaction is how empowered the employee feels to apply judgment. According to a 2014 study by the Harvard Business Review, companies which set clear guidelines, protect and celebrate employee proposals for quality improvement, and clearly communicate the organization’s quality message while allowing the employees to help shape and implement it, have by far the highest engagement and retention rates. The greatest successes come from cultures where peer-driven approaches increase employee engagement, thereby eliminating preventable employee mistakes. Yet the same study also pointed out that nearly half of all employees feel their company’s leadership lacks a clear emphasis on quality, and only 10 percent felt their company’s existing quality statements were truthful and viable.

Then as now, the need to establish a clear quality culture, to manage and nurture that culture, and to empower the participants is critical to earning the trust of the employee participants and thereby retaining workers who in time can become the invaluable craftspeople of today.

Finally, there is the reduction of waste. Waste can be defined in many ways: waste of time, waste of money, waste of resources. The unifying factor in all definitions is the loss of something valuable, and irretrievable. All inevitably also lead to the increase of another kind of waste: pollution and discarded detritus which steadily ruin our shared planet.

Reducing waste with quality management can take many forms, but ultimately, all center on strategies that use only what is truly needed. This can mean both operational efficiencies and employee quality, as noted above. The Harvard Business Review survey found that, as of 2014, the average large company (26,000 employees or more) loses a staggering $350 million each year due to preventable employee errors, many of which could be reduced, mitigated, or eliminated entirely with better implementation of quality management.

This is waste on an almost unimaginable financial scale. Waste eliminated through practices which emphasize efficiency and sustainability, as noted in the McKinsey & Company study, can also yield tremendous savings. In one example, a company which purchased asphalt and previously prioritized only the per-ton price found that, when examining the logistical costs of transporting the asphalt from distant suppliers, they were actually paying more than if they purchased it locally. The quality management analysis they performed yielded them a cost savings, and eliminated 40 percent of the carbon emissions associated with the asphalt’s procurement. In this case, not only was wasteful spending eliminated, but literal waste (pollution) was prevented.

In taking these steps, companies can meaningfully improve their bottom lines while at the same time doing something worthwhile and beneficial for the planet. That, in turn, helps burnish their reputations. A remarkable 88 percent of Americans surveyed in a 2017 study said they would be more loyal to a company that supports social or environmental issues.

It is therefore clear that any steps a company can take which save money, improve worker satisfaction, and yield increased positivity in the marketplace are well worth pursuing.

What is the Future of QMS?

Until the 2000s, quality management systems were just that: systems of desirable practices, outlined by individuals and implemented individually. That was the age of the gurus: the visionaries who outlined the systems. But what that age lacked was a practical and easy means for companies, sometimes located far away from direct guidance by the gurus, to implement their teachings.

In the intervening years, technology has radically changed that dynamic. Today, QMS software fills the marketplace, allowing businesses small and large to design and guide their quality management plans. But even these software solutions have not yet solved the last great challenge: personalized assistance in putting standards into practice.

That is why the latest innovations, particularly in artificial intelligence, have the potential to upend the equation. Already, major companies have started to use artificial intelligence in connection with QMS datasets managed by software, utilizing the programs for statistical analysis, suggested improvements, and even prediction of potential faults before they occur.
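To make that idea concrete, here is a minimal sketch of fault prediction over process measurements using an off-the-shelf anomaly detector. The sensor values, set points, and contamination rate are assumptions invented for demonstration; a real deployment would train on a company's historical QMS records.

```python
# Anomaly-detection sketch for QMS data: learn the normal operating
# envelope from historical readings, then flag new readings that
# deviate before they become defects. All numbers are synthetic.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=0)
# Historical normal operation: temperature (C) and vibration (mm/s).
normal = rng.normal(loc=[70.0, 2.0], scale=[1.5, 0.3], size=(500, 2))
model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

new_readings = np.array([
    [70.4, 2.1],   # typical reading
    [69.1, 1.8],   # typical reading
    [78.5, 4.9],   # far outside the learned envelope
])
for reading, flag in zip(new_readings, model.predict(new_readings)):
    label = "potential fault" if flag == -1 else "normal"
    print(reading, label)
```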

These are immensely valuable opportunities, which is why huge players such as Honeywell are spending billions of dollars to bring innovative AI technology companies into their platforms to refine existing quality management systems.

But while AI has already begun to significantly affect the biggest players, small and mid-sized companies remain eager, but not yet able, to take full advantage. The next great revolution, then, is a new evolution of QMS, one which will bring these emerging technologies to all companies, regardless of size or scale. The future of QMS, and therefore the future of efficiency in business, rests upon this shift: from companies being the recipients of ‘guru knowledge’ to being the designers of their own quality-minded futures.

------

Onega Ulanova is the CEO of QMS2GO, a provider of quality management systems leveraging AI in manufacturing.

The research outfit says North America leads global AI growth in oil and gas, with Houston playing a pivotal role. Photo via Getty Images

Report: Houston rises as emerging hub for $6B global AI in oil and gas industry

eyes on ai

Houston is emerging as a hub for the development of artificial intelligence in the oil and gas industry — a global market projected to be worth nearly $6 billion by 2028.

This fresh insight comes from a report recently published by ResearchAndMarkets.com. The research outfit says North America leads global AI growth in oil and gas, with Houston playing a pivotal role.

“With AI-driven innovation at its core, the oil and gas industry is set to undergo a profound transformation, impacting everything from reservoir optimization to asset management and energy consumption strategies — setting a new standard for the future of the sector,” says ResearchAndMarkets.com.

The research company predicts the value of the AI sector in oil and gas will rise from an estimated $3.2 billion in 2023 and $3.62 billion in 2024 to $5.8 billion by 2028. The report divides AI into three categories: software, hardware, and hybrids.
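For context, those projections imply a compound annual growth rate of roughly 12.5 percent between 2024 and 2028, a back-of-the-envelope figure derived from the report's numbers rather than one it states:

$$\text{CAGR} = \left(\frac{5.8}{3.62}\right)^{1/4} - 1 \approx 0.125$$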

As cited in the report, trends that are sparking the explosion of AI in oil and gas include:

  • Stepped-up use of data
  • Higher demand for energy efficiency and sustainability
  • Automation of repetitive tasks
  • Optimization of exploration and drilling
  • Enhancement of safety

“The oil and gas industry’s ongoing digitization is a significant driver behind … AI in the oil and gas market. Rapid adoption of AI technology among oilfield operators and service providers serves as a catalyst, fostering market growth,” says ResearchAndMarkets.com.

The report mentions the Open AI Energy Initiative as one of the drivers of increased adoption of AI in oil and gas. Baker Hughes, C3 AI, Microsoft, and Shell introduced the initiative in February 2021. The initiative enables energy operators, service providers, and vendors to create sharable AI technology for the oil and gas industry.

Baker Hughes and C3 AI jointly market AI offerings for the oil and gas industry.

Aside from Baker Hughes, Microsoft, and Shell, other companies with a significant Houston presence that are cited in the AI report include:

  • Accenture
  • BP
  • Emerson Electric
  • Google
  • Halliburton
  • Honeywell
  • Saudi Aramco
  • Schlumberger
  • TechnipFMC
  • Weatherford International
  • Wood

Major AI-related trends that the report envisions in the oil and gas sector include:

  • Digital twins for asset modeling
  • Autonomous robotics
  • Advanced analytics for reservoir management
  • Cognitive computing for decision-making
  • Remote monitoring and control systems

“The digitization trend within the oil and gas sector significantly propels the AI in oil and gas market,” says the report.


Rice University's edtech company receives $90M to lead NSF research hub

major collaboration

An educational technology company based out of Rice University has received $90 million to create and lead a research and development hub for inclusive learning and education research. It's the largest research award in the history of the university.

OpenStax received the grant funding from the U.S. National Science Foundation for a five-year project to create the R&D hub called SafeInsights, which "will enable extensive, long-term research on the predictors of effective learning while protecting student privacy," reads a news release from Rice. It's the NSF's largest single investment commitment to national-scale education R&D infrastructure.

“We are thrilled to announce an investment of $90 million in SafeInsights, marking a significant step forward in our commitment to advancing scientific research in STEM education,” NSF Director Sethuraman Panchanathan says in the release. “There is an urgent need for research-informed strategies capable of transforming educational systems, empowering our nation’s workforce and propelling discoveries in the science of learning.

"By investing in cutting-edge infrastructure and fostering collaboration among researchers and educators, we are paving the way for transformative discoveries and equitable opportunities for learners across the nation.”

SafeInsights is funded through NSF’s Mid-scale Research Infrastructure-2 (Mid-scale RI-2) program and will act as a central hub for 80 partners and collaborating institutions.

“SafeInsights represents a pivotal moment for Rice University and a testament to our nation’s commitment to educational research,” Rice President Reginald DesRoches adds. “It will accelerate student learning through studies that result in more innovative, evidence-based tools and practices.”

Richard Baraniuk, who founded OpenStax and is a Rice professor, will lead SafeInsights. He says he hopes the initiative will allow progress to be made for students learning in various contexts.

“Learning is complex," Baraniuk says in the release. "Research can tackle this complexity and help get the right tools into the hands of educators and students, but to do so, we need reliable information on how students learn. Just as progress in health care research sparked stunning advances in personalized medicine, we need similar precision in education to support all students, particularly those from underrepresented and low-income backgrounds.”


2 Houston startups selected by US military for geothermal projects

hot new recruits

Two clean energy companies in Houston have been recruited for geothermal projects at U.S. military installations.

Fervo Energy is exploring the potential for a geothermal energy system at Naval Air Station Fallon in Nevada.

Meanwhile, Sage Geosystems is working on an exploratory geothermal project for the Army’s Fort Bliss post in Texas. The Bliss project is the third U.S. Department of Defense geothermal initiative in the Lone Star State.

“Energy resilience for the U.S. military is essential in an increasingly digital and electric world, and we are pleased to help the U.S. Army and [the Defense Innovation Unit] to support energy resilience at Fort Bliss,” Cindy Taff, CEO of Sage, says in a news release.

A spokeswoman for Fervo declined to comment.

Andy Sabin, director of the Navy’s Geothermal Program Office, says in a military news release that previous geothermal exploration efforts indicate the Fallon facility “is ideally suited for enhanced geothermal systems to be deployed onsite.”

As for the Fort Bliss project, Michael Jones, a project director in the Army Office of Energy Initiatives, says it’ll combine geothermal technology with innovations from the oil and gas sector.

“This initiative adds to the momentum of Texas as a leader in the ‘geothermal anywhere’ revolution, leveraging the robust oil and gas industry profile in the state,” says Ken Wisian, associate director of the Environmental Division at the U.S. Bureau of Economic Geology.

The Department of Defense kicked off its geothermal initiative in September 2023. Specifically, the Army, Navy, and Defense Innovation Unit launched four exploratory geothermal projects at three U.S. military installations.

One of the three installations is the Air Force’s Joint Base San Antonio. Canada-based geothermal company Eavor is leading the San Antonio project.

Another geothermal company, Atlanta-based Teverra, was tapped for an exploratory geothermal project at the Army’s Fort Wainwright in Alaska. Teverra maintains an office in Houston.

------

This article originally ran on EnergyCapital.