The two entities will collaborate on work focused on "fields of energy and climate; quantum computing and artificial intelligence; global health and medicine; and urban futures." Photo via Rice University

Rice University and Université Paris Sciences & Lettres signed a strategic partnership agreement last week under which the two institutions will work together on research into some of today's most pressing issues.

According to an announcement made on May 13 in Paris, the two schools and research hubs will collaborate on work focused on "fields of energy and climate; quantum computing and artificial intelligence; global health and medicine; and urban futures."

The partnership allows Rice to expand its presence in France, after launching its Rice Global Paris Center about two years ago.

Université PSL consists of 11 top research institutes in France, with 2,900 world-class researchers and 140 research laboratories.

“We are honored and excited to partner with Paris Sciences and Lettres University and join forces to advance bold innovation and find solutions to the biggest global challenges of our time,” Rice President Reginald DesRoches said in a statement. “The unique strengths and ambitions of our faculty, students, scholarship and research are what brings us together, and our passion and hope to build a better future for all is what will drive our partnership agenda. Representing two distinct geographic, economic and cultural regions known for ingenuity and excellence, Rice and PSL’s efforts will know no bounds.”

Rice and Université PSL plan to host conferences around the partnership's four research priorities. The first took place last week at the Rice Global Paris Center. Twice a year, the universities will also select joint research projects to support financially.

“This is a global and cross-disciplinary partnership that will benefit from both a bottom-up, research-driven dynamic and a top-down commitment at the highest level,” PSL President Alain Fuchs said in a statement. “The quality and complementarity of the researchers from PSL and Rice who mobilized for this event give us reason to believe that this partnership will get off to a rapid and productive start. It will offer a strong framework to all the PSL schools for developing collaborations within their areas of strength and their natural partners at Rice.”

Rice launched its Rice Global Paris Center in June 2022 in a historic 16th-century building in Le Marais. At the time, the university shared that the center was intended to support Rice-organized student programs, independent researchers, and international conferences, and to serve as a satellite hub for other European research activity.

"Rice University's new home in the Marais has gone from an idea to a mature relative with a robust program of faculty research summits, student opportunities, cultural events and community engagement activities," Caroline Levander, Rice's global vice president, said at the announcement of the partnership last week.


Last month, University of Houston also signed a memorandum of understanding with Heriot-Watt University in Scotland to focus on hydrogen energy solutions.


This article originally ran on EnergyCapital.

AI's true potential lies in its ability to enhance human capabilities, not replace them. Photo via Getty Images

Houston expert shares 3 strategies for integrating AI into the workforce

guest column

The rapid advancement of artificial intelligence is forcing businesses to evaluate how they will manage the inevitable changes this technology will bring. With its ability to automate tasks, analyze large amounts of data, and provide detailed insights, AI offers an enormous opportunity for businesses of all sizes. However, realizing this potential requires a strategic approach that positions AI as a powerful partner, rather than a replacement for human ingenuity.

The British Council reports that an estimated 65 percent of today's students will eventually work in professions that have yet to be conceived. With the emergence of new AI tools, this projection emphasizes the importance of cultivating a versatile skill set that allows us to adapt to an ever-changing landscape. It also underscores the need for a strategy that embraces the division of labor between humans and machines.

An AI strategy, in other words, shouldn't just be about automation; it should also incorporate an understanding of the human-AI partnership that will be necessary for future success. By applying the concepts of automation, augmentation, and autonomy, businesses can unlock AI's full potential to boost efficiency, enhance decision-making, and ultimately drive continued success.

Automation: Delegating to the AI

We know AI can automate many tasks in a business. But we should also look at automation from a strategic standpoint by asking, "What tasks can be fully delegated to the AI?" Answering this question means considering routine, repetitive, and time-consuming tasks that shouldn't require human intervention, or those that are more susceptible to human error. The goal is to identify tasks that don't benefit from human nuance, so asking questions about time, precision, and compliance can offer even more value.

  • Time. What tasks are time-consuming and could be completed quickly with well-written instructions?
  • Precision. What tasks require precision that is difficult for humans to achieve?
  • Compliance. What tasks involve critical safety procedures or adherence to strict compliance that humans might overlook?

Augmentation: Using AI to boost your potential

Beyond automation, AI's true power lies in its ability to boost human capabilities. Through this lens, you should ask, "How can the AI boost my output potential?" Think of AI as a skilled assistant that can analyze vast datasets, identify complex patterns, and present insights that aren't readily apparent to humans alone. The focus here is on tasks that still require a human touch but can benefit from a computer's speed and data processing power. When exploring this further, consider asking questions about skill boosts, assistance, and focus.

  • Skill boosts. What tasks am I doing that I understand but am not an expert at?
  • Assistance. What tasks still require a human's touch but could use processing or speed boosts?
  • Focus. What tasks are causing employees to spend more time on tools and less on goals?

Autonomy: The importance of humans in the loop

One question that comes up frequently when discussing AI is whether it will replace a particular set of jobs. My view, however, is that while AI is remarkably powerful, the key to making all this work is understanding that not every task requires automation. In fact, some tasks would suffer from it. This step requires you to ask, "Where are human emotion, creativity, intuition, and oversight essential?" Autonomy, in this sense, means digging into creativity, intuition, and uniqueness.

  • Creativity. Does this task require a level of creativity that a machine can't replicate?
  • Intuition. Does this task require emotional awareness that a machine can't discern?
  • Brand Uniqueness. Does this task represent a part of my brand that shouldn't be automated or machine-driven?

AI brings a lot to look forward to. It’s fair to say the technology is on its way to transforming the world, but the businesses that will thrive are the ones that strategically embrace a human-centered approach to integrating AI into everyday business activities. The three A’s (automation, augmentation, and autonomy) provide an essential foundation for beginning this journey. By understanding the best applications of each, businesses of all sizes can discover areas for increased efficiency, more thoughtful decision-making, and a competitive edge that drives long-term success. AI's true potential lies in its ability to enhance human capabilities, not replace them.


Kelsey Ruger is the chief technology and product officer for Hello Alice.

Researchers at the new SynthX Center will aim to turn fundamental research into clinical applications and make precision adjustments to drug properties and molecules. Photo via Rice University

Houston organizations launch collaborative center to boost cancer outcomes

new to HOU

Rice University's new Synthesis X Center officially launched last month to bring together experts in cancer care and chemistry.

The center was born out of what started about seven years ago as informal meetings between Rice chemist Han Xiao's research group and others from the Dan L Duncan Comprehensive Cancer Center at Baylor College of Medicine. The level of collaboration between the two teams has grown significantly over the years, and monthly meetings now draw about 100 participants from across disciplines, fields and Houston-based organizations, according to a statement from Rice.

Researchers at the new SynthX Center will aim to turn fundamental research into clinical applications and make precision adjustments to drug properties and molecules. It will focus on improving cancer outcomes by looking at an array of factors, including prevention and detection, immunotherapies, the use of artificial intelligence to speed drug discovery and development, and several other topics.

"At Rice, we are strong on the fundamental side of research in organic chemistry, chemical biology, bioengineering and nanomaterials,” Xiao says in the statement. “Starting at the laboratory bench, we can synthesize therapeutic molecules and proteins with atom-level precision, offering immense potential for real-world applications at the bedside ... But the clinicians and fundamental researchers don’t have a lot of time to talk and to exchange ideas, so SynthX wants to serve as the bridge and help make these connections.”

SynthX plans to issue its first merit-based seed grants to teams with representatives from Baylor and Rice this month.

With this recognition from Rice, the teams from Xiao's lab and the TMC will also be able to expand and formalize their programs. They will build upon annual retreats, in which investigators can share unpublished findings, and also plan to host a national conference, the first slated for this fall and titled "Synthetic Innovations Towards a Cure for Cancer.”

“I am confident that the SynthX Center will be a great resource for both students and faculty who seek to translate discoveries from fundamental chemical research into medical applications that improve people’s lives,” Thomas Killian, dean of the Wiess School of Natural Sciences, says in the release.

Rice announced that it had invested in four other research centers along with SynthX last month. The other centers include the Center for Coastal Futures and Adaptive Resilience, the Center for Environmental Studies, the Center for Latin American and Latinx Studies and the Rice Center for Nanoscale Imaging Sciences.

Earlier this year, Rice also announced its first-ever recipients of its One Small Step Grant program, funded by its Office of Innovation. The program will provide funding to faculty working on "promising projects with commercial potential," according to the website.

Researchers at Baylor College of Medicine’s Human Genome Sequencing Center have trained an AI assistant to explain genetic test results to patients. Photo via Getty Images

Houston researchers tap into GenAI for communicating genetic test results

hi, tech

Artificial intelligence in the health care setting has a lot of potential, and one Houston institution is looking into one particular use.

Researchers at Baylor College of Medicine’s Human Genome Sequencing Center have trained an AI assistant to explain genetic test results to patients. According to findings published in the Journal of the American Medical Informatics Association (JAMIA), the team has developed generative AI to understand and interpret genetic tests. They have also tested its accuracy against OpenAI’s ChatGPT 3.5.

“We created a chatbot that can provide guidance on general pharmacogenomic testing, dosage implications, and the side effects of therapeutics, and address patient concerns,” explains first author Mullai Murugan in a press release. Murugan is director of software engineering and programming at the Human Genome Sequencing Center. “We see this tool as a superpowered assistant that can increase accessibility and help both physicians and patients answer questions about genetic test results.”

The initial chatbot training specifically targeted pharmacogenomic testing for statins, meaning a patient’s potential response to cholesterol-lowering drugs, as dictated by genetics.

Murugan explains why the team decided to create its own chatbot: the key publication on statin pharmacogenomics came out in May 2022, four months after ChatGPT 3.5's training cutoff of January 2022. Her team's technology, by contrast, uses retrieval-augmented generation (RAG) and was trained on the most recent guidelines.
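The appeal of the RAG approach is that the model's answers are grounded in a document store that can be updated the day a new guideline is published, with no retraining. As a rough illustration of the pattern (the guideline snippets, function names, and bag-of-words similarity below are hypothetical stand-ins, not Baylor's actual system or real guideline text):

```python
# Sketch of retrieval-augmented generation (RAG): retrieve the guideline
# passages most relevant to a question, then build a prompt that grounds
# the model's answer in them. All corpus text here is illustrative.
import math
from collections import Counter

GUIDELINE_SNIPPETS = [
    "SLCO1B1 decreased function genotypes raise the risk of simvastatin myopathy.",
    "For SLCO1B1 poor function, prescribe a lower simvastatin dose or an alternative statin.",
    "CYP2C9 variants primarily affect fluvastatin exposure.",
]

def _bow(text: str) -> Counter:
    """Lowercased bag-of-words term counts."""
    return Counter(text.lower().split())

def _cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(question: str, k: int = 2) -> list:
    """Return the k snippets most similar to the question."""
    q = _bow(question)
    ranked = sorted(GUIDELINE_SNIPPETS, key=lambda s: _cosine(q, _bow(s)), reverse=True)
    return ranked[:k]

def build_prompt(question: str) -> str:
    """Assemble a grounded prompt; a real system would send this to an LLM."""
    context = "\n".join(f"- {s}" for s in retrieve(question))
    return f"Answer using only these guideline excerpts:\n{context}\n\nQuestion: {question}"

print(build_prompt("What simvastatin dose is safe with SLCO1B1 poor function?"))
```

A production system would replace the word-overlap scoring with embedding search and pass the assembled prompt to a language model, but the key design point survives the simplification: the guideline corpus, not the model's frozen training data, supplies the facts.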

How did the two AI assistants compare? Four experts in cardiology and pharmacogenomics rated both chatbots on accuracy, relevancy, risk management, and language clarity, among other factors. Though the two scored similarly on language clarity, Baylor's chatbot reached 85 percent in accuracy and 81 percent in relevancy, compared to ChatGPT's 58 percent in accuracy and 62 percent in relevancy, when asked questions from health care providers.

“We are working to fine-tune the chatbot to better respond to certain questions, and we want to get feedback from real patients,” Murugan says. “Based on this study, it is very clear that there is a lot of potential here.” Nonetheless, Murugan emphasized that there is much work still to be done before the program is ready for clinical applications. That includes training the chatbot to explain results in the language used by genetic counselors. Funds from the NIH’s All of Us Research Program helped to make the research possible.

Hear from guest columnist Onega Ulanova on AI and quality management systems in manufacturing. Photo via Getty Images

Expert: How AI is disrupting manufacturing and the future of quality management systems

guest column

The concept of quality management is intrinsic to modern manufacturing, yet little understood by the general public, and it has revolutionized our world over the past hundred years.

Yet, in the present day, quality management and the related systems that guide its implementation are far from static. They are continuously evolving, adapting to ever-changing global conditions and to new means of application unleashed by technological innovation.

Now, more than ever, they are essential for addressing and eliminating not only traditional sources of waste in business, such as lost time and money, but also the physical and pollutant waste that threatens the world we all inhabit.

But what are quality management systems, or QMS, exactly? Who created them, and how have they evolved over time? Perhaps most pressingly, where can they be of greatest help in the present world, and when can they be implemented by businesses in need of change and improvement?

In this article, we will explore the history of QMS, explain their essential role in today’s manufacturing practices, and examine how these systems will take us into the future of productivity.

Quality Management Systems: A Definition

In the United States and globally, the gold standard for quality management practice is the American Society for Quality. This preeminent organization, with over 4,000 members in 130 countries, was established in 1946 and has guided the practice and implementation of quality management systems worldwide.

The Society defines a quality management system as “a formalized system that documents processes, procedures, and responsibilities for achieving quality policies and objectives,” and further states that “a QMS helps coordinate and direct an organization’s activities to meet customer and regulatory requirements and improve its effectiveness and efficiency on a continuous basis.”

From this definition, it can be understood that a good quality management system’s purpose is to establish the conditions for consistent and ever-increasing improvement through the use of standardized business culture practices.

Which QMS Standards are Most Widely Used?

Quality management’s remarkable growth since the 1940s has led to the rise of a number of widely used standards, which serve as the basis for companies and organizations to design and implement their own practices. Most of these modern quality management standards are globally recognized and are specifically tailored to ensure that a company’s newly developed practices include the essential elements that increase the likelihood of success.

The most widely known entity to have designed such guidance is the International Organization for Standardization (ISO), a global organization that develops and publishes technical standards. Since the 1980s, the ISO has provided the 9000 series of standards (the most famous of which is 9001:2015), which outline how organizations can meet quality management requirements and create their own best practices.

In 2020, over 1.2 million organizations worldwide were officially certified by the ISO for their quality management implementation practices.

However, it should be understood that the ISO 9000 standards are merely guidelines for the design and implementation of a quality management system; they are not systems in and of themselves.

Furthermore, the ISO is far from the only relevant player in this field. Many industry-specific standards, such as the American Petroleum Institute’s API Q1 standard, have been developed to target the highly specialized needs of particular industries, in this case oil and gas. These industry-specific standards are generally aligned with the ISO 9000 standards and serve as complementary guidance rather than a replacement. It is entirely possible, and in many cases desirable, for a company to receive both ISO certification and certification from an industry-specific standards body, as doing so helps ensure the company’s newly developed QMS procedures are consistent with both broad and specialized best practices.

A History of Quality Management

The concept of quality management is intrinsically tied to the development of industrial production. Prior to the Industrial Revolution, ‘quality’ was inherently linked to the skill and effort of craftspeople: individual laborers trained in specialized fields who, either individually or in small groups, produced goods for use in society.

Whether they were weaving baskets or building castles, these craftspeople were primarily defined by a skill that centered them in a specific production methodology, and it was the mastery of this skill which determined the quality. Guilds of craftspeople would sign their works, placing a personal or group seal on the resulting product and thereby accepting accountability for its quality.

Such signatures and marks are found dating back at least 4,500 years to the construction of Egypt’s Great Pyramid of Giza, and came into widespread practice in medieval Europe with the rise of craft guilds.

In these early confederations of workers, a person’s mastery of a skill or craft could become a defining part of their identity and life, to the extent that many craftspeople of 13th Century Europe lived together in communal settings, while the Egyptian pyramid workers may have belonged to life-long ‘fraternities’ who returned, year after year, to fulfill their roles in ‘work gangs’.

However, in the Industrial Revolution, craft and guild organizations were supplanted by factories. Though ancient and medieval projects at times reached monumental scale, the rise of thousands of factories, each requiring human and machine contributions to generate masses of identical products, required a completely different scale of quality management.

The emphasis on mass production necessitated the use of workers who were no longer crafts masters, and thus resulted in a decrease in the quality of products. This in turn necessitated the rise of the product inspection system, which was steadily refined from the start of the Industrial Revolution in 1760 into the early 20th century.

However, inspection was merely a system of quality control, rather than quality management; in other words, simply discarding defective products did not in and of itself increase total product quality or reduce waste.

As influential American engineer Joseph M. Juran explained, in 1920s-era America, it was common to throw away substantial portions of produced inventory due to defects, and when Juran prompted inspectors at his employer’s company to do something, they refused, saying it was the responsibility of the production line to improve. Quality control, in and of itself, would not yield quality management.

As is often the case in human history, war was the driver of change. In World War II, the mobilization of millions of American workers into wartime roles coincided with the need to produce greater quantities of high-quality products than ever before.

To counteract the loss of skilled factory labor, the United States government implemented the Training Within Industry program, which utilized 10-hour courses to educate newly-recruited workers in how to conduct their work, evaluate their efficiency, and suggest improvements. Similar training programs for the trainers themselves were also developed. By the end of the war, more than 1.6 million workers had been certified under the Training Within Industry program.

Training Within Industry represented one of the first successful implementations of quality management systems, and its impact was widely felt after the end of the war. In the ashes of conflict, the United States and the other Allied Powers were tasked with helping to rebuild the economies of the other wartime combatants. Nowhere was this a more pressing matter than Japan, which had seen widespread economic devastation and had lost 40 percent of all its factories. Further complicating the situation was the reality that, then as now, Japan lacked sufficient natural resources to serve its economic scale.

And yet, within just 10 years of the war’s end, Japan’s economy was growing twice as fast per year as it had before the fighting started. The driver of this miraculous turnaround was American-derived quality management practices, reinterpreted and implemented with Japanese ingenuity.

In modern business management, few concepts are as renowned, and oft-cited for success, as kaizen. This Japanese word, which simply means “improvement,” is the essential lesson and driver of Japan’s postwar economic success.

Numerous books written outside Japan have attempted to explain kaizen’s quality management principles, often by citing them as being ‘distinctly Japanese.’ Yet, the basis for kaizen is actually universal and applicable in any culture or context; it is, simply put, an emphasis on remaining quality-focused and open to evolution. The development of kaizen began in the post-war period when American statistician William Edwards Deming was brought to Japan as part of the US government’s rebuilding efforts.

A student of earlier quality management thought leaders, Deming instructed hundreds of Japanese engineers, executives, and scholars, urging them to place statistical analysis and human relationships at the center of their management practices. Deming used statistics to track the number and origin of product defects, as well as to analyze the effectiveness of remedies. He also reinstated a key idea of the craftsperson creed: that the individual worker is not just a set of hands performing a task, but a person who can, with time, improve both the self and the whole of the company.

Deming was not alone in these efforts; the aforementioned Joseph M. Juran, who came to Japan as part of the rebuilding program several years later, also gave numerous lectures expounding similar principles.

Like Deming, Juran had previously tried to impart these approaches to American industry, but the lessons often fell on deaf ears. Japanese managers, however, took the lessons to heart and soon began crafting their own quality management systems.

Kaoru Ishikawa, who began by translating the works of Deming and Juran into Japanese, was one of the crucial players who helped to create the ideas now known as kaizen. He introduced a bottom-up approach where workers from every part of the product life cycle could initiate change, and popularized Deming’s concept of quality circles, where small groups of workers would meet regularly to analyze results and discuss improvements.

By 1975, Japanese product quality, which had once been regarded as poor, had transformed into world-class thanks to the teachings of Deming, Juran, and kaizen.

By the 1980s, American industry had lost market share and quality prestige to Japan. It was now time for US businesses to learn from Deming and Juran, both of whom at last found a receptive audience in their home country. Deming in particular achieved recognition for his role in the influential 1980 television documentary If Japan Can, Why Can’t We?, in which he emphasized the universal applicability of quality management.

So too did kaizen, which influenced a new generation of global thought leaders. Arising out of this rapid expansion of QMS were new systems in the 1970s and ‘80s, including the Six Sigma approach pioneered by Bill Smith and Motorola in 1987. Ishikawa, who saw his reputation and life transformed as his ideas spread worldwide, eventually summed up the explanation as the universality of human nature and its desire to improve. As Ishikawa said, “wherever they are, human beings are human beings”.

In no small part due to the influence of the thought leaders mentioned, quality management systems are today a cornerstone of global business practice. So influential are the innovators of these systems that they are often called ‘gurus.’ But what are the specific benefits of these systems, and how best can they be implemented?

How QMS Benefits Organizations, and the World

The oft-cited benefits of quality management systems are operational efficiency, employee retention, and reduction of waste. From all of these come improvements to the company’s bottom line and reputation. But far from being dry talking points, each benefit not only serves its obvious purpose, but also can dramatically help benefit the planet itself.

Operational efficiency is the measurement, analysis, and improvement of the processes occurring within an organization, with the purpose of using data and careful consideration to eliminate or mitigate any areas where current practices are ineffective.

Quality management systems can increase operational efficiency by utilizing employee analysis and feedback to quickly identify areas where improvements are possible, and then to guide their implementation.

In a joint study conducted in 2017 by Forbes and the American Society for Quality, 56 percent of companies stated that improving operational efficiency was a top concern; in the same survey, 59 percent of companies received direct benefit to operations by utilizing quality management system practices, making it the single largest area of improvement across all business types.

Because operational improvements inherently reduce both waste and cost, conducting business in a fully optimized manner can simultaneously save unnecessary resource expenditure, decrease pollutants and discarded materials, and retain more money which the company can invest into further sustainable practices. Efficiency is itself a kind of ‘stealth sustainability’ that turns a profit-focused mindset into a generator of greater good. It is this very point that the United States government’s Environmental Protection Agency (EPA) has emphasized in its guidance for Environmental Management Systems (EMS). These quality management system guidelines, tailored specifically to benefit operational efficiency in a business setting, are also designed to benefit the global environment by utilizing quality management practices.

The EPA’s studies in preparing these guidelines showcased numerous areas where small companies could reduce environmental waste while simultaneously cutting costs. These added up to substantial reductions and savings, such as a 15 percent wastewater reduction that saved a small metal finishing company $15,000 per year.

Similarly, a 2020 study by McKinsey & Company identified ways that optimizing operations could dramatically aid a company’s sustainability with only small outlays of capital, thereby making environmental benefit a by-product of improved profitability.

Employee retention, and more broadly the satisfaction of employees, is another major consideration of QMS. Defined simply, retention is not only the maintenance of a stable workforce without turnover, but the improvement of that workforce over time as its members gain skill, confidence, and capacity for continued self- and organizational improvement. We may live in a post-industrial age, but thanks to the ideas of QMS, some of the craftsperson's ethos has returned to modern thinking: the individual, once more, has great value.

Quality management systems aid employee retention by allowing the people of an organization to have a direct hand in its improvement. In a study published in 2023 by the journal Quality Innovation Prosperity, 40 percent of organizations which implemented ISO 9001 guidance for the creation of a QMS reported that the process yielded greater employee retention.

A crucial success factor for employee satisfaction is how empowered the employee feels to apply judgment. According to a 2014 study by the Harvard Business Review, companies which set clear guidelines, protect and celebrate employee proposals for quality improvement, and clearly communicate the organization’s quality message while allowing the employees to help shape and implement it, have by far the highest engagement and retention rates. The greatest successes come from cultures where peer-driven approaches increase employee engagement, thereby eliminating preventable employee mistakes. Yet the same study also pointed out that nearly half of all employees feel their company’s leadership lacks a clear emphasis on quality, and only 10 percent felt their company’s existing quality statements were truthful and viable.

Then as now, the need to establish a clear quality culture, to manage and nurture that culture, and to empower the participants is critical to earning the trust of the employee participants and thereby retaining workers who in time can become the invaluable craftspeople of today.

Finally, there is the reduction of waste. Waste can be defined in many ways: waste of time, waste of money, waste of resources. The unifying factor in all definitions is the loss of something valuable, and irretrievable. All inevitably also lead to the increase of another kind of waste: pollution and discarded detritus which steadily ruin our shared planet.

Reducing waste with quality management can take many forms, but ultimately, all center on the realization of strategies which use only what is truly needed. This can mean both operational efficiencies and employee quality, as noted above. The Harvard Business Review survey identified that in 2014, the average large company (having 26,000 employees or more) loses a staggering $350 million each year due to preventable employee errors, many of which could be reduced, mitigated, or eliminated entirely with better implementation of quality management.

This is waste on an almost unimaginable financial scale. Waste eliminated through practices that emphasize efficiency and sustainability, as noted in the McKinsey & Company study, can also yield tremendous savings. In one example, a company that purchased asphalt had previously prioritized only the per-ton price; when it examined the logistical costs of transporting the asphalt from distant suppliers, it found it was actually paying more than if it purchased locally. The quality management analysis yielded significant cost savings and eliminated 40 percent of the carbon emissions associated with the asphalt’s procurement. In this case, not only was wasteful spending eliminated, but literal waste (pollution) was prevented.
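The landed-cost comparison at the heart of that analysis can be sketched in a few lines. The figures below are purely illustrative (the study did not publish its numbers); the point is that freight can erase a sticker-price advantage:

```python
# Hypothetical landed-cost comparison for a bulk commodity such as asphalt.
# All figures are illustrative, not taken from the McKinsey example.

def landed_cost(price_per_ton, freight_per_ton_mile, miles):
    """Total cost per ton once transport from the supplier is included."""
    return price_per_ton + freight_per_ton_mile * miles

# A distant supplier with a lower sticker price...
distant = landed_cost(price_per_ton=80.0, freight_per_ton_mile=0.25, miles=300)
# ...versus a local supplier with a higher sticker price but minimal freight.
local = landed_cost(price_per_ton=95.0, freight_per_ton_mile=0.25, miles=20)

print(f"Distant supplier: ${distant:.2f}/ton")  # $155.00/ton
print(f"Local supplier:   ${local:.2f}/ton")    # $100.00/ton
```

Once freight is counted, the "cheaper" distant supplier costs half again as much per ton, which is the kind of hidden waste a quality management analysis is designed to surface.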

In taking these steps, companies can meaningfully improve their bottom lines while doing something worthwhile and beneficial for the planet. That, in turn, helps burnish their reputations. A remarkable majority of consumers, 88 percent of Americans surveyed in a 2017 study to be exact, said they would be more loyal to a company that supports social or environmental issues.

It is therefore clear that any steps a company can take which save money, improve worker satisfaction, and yield increased positivity in the marketplace are well worth pursuing.

What is the Future of QMS?

Until the 2000s, quality management systems were just that: systems of desirable practices, outlined by individuals and implemented individually. That was the age of the gurus: the visionaries who outlined the systems. But what that age lacked was a practical and easy means for companies, sometimes located far away from direct guidance by the gurus, to implement their teachings.

In the intervening years, technology has radically changed that dynamic. Today, QMS software fills the marketplace, allowing businesses small and large to design and guide their quality management plans. But even these software solutions have not yet solved the last great challenge: personalized assistance in putting standards into practice.

That is why the latest innovations, particularly in artificial intelligence, have the potential to upend the equation. Already, major companies have started to use artificial intelligence in connection with QMS datasets managed by software, utilizing the programs for statistical analysis, suggested improvements, and even prediction of potential faults before they occur.
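As a minimal sketch of the kind of statistical check such tools automate (not any specific vendor's method), a simple outlier flag over process measurements might look like this, using a two-sigma threshold suited to a small sample:

```python
# Minimal statistical fault flagging over QMS process measurements.
# Readings more than `sigmas` standard deviations from the mean are flagged;
# commercial QMS/AI products use far richer models than this sketch.
from statistics import mean, stdev

def flag_outliers(readings, sigmas=2.0):
    """Return indices of readings outside mean +/- sigmas * stdev."""
    mu, sd = mean(readings), stdev(readings)
    return [i for i, r in enumerate(readings) if abs(r - mu) > sigmas * sd]

# Seven in-control readings and one anomalous measurement at index 6.
readings = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 14.7, 10.0]
print(flag_outliers(readings))  # [6]
```

Predictive QMS tools extend this idea: instead of flagging a fault after the reading, they model trends in the data to warn before a process drifts out of tolerance.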

These are immensely valuable opportunities, which is why major players such as Honeywell are spending billions of dollars to bring innovative AI technology companies into their platforms and refine existing QMS systems.

But while AI has already begun to significantly affect the biggest players, small and mid-sized companies remain eager, but not yet able, to take full advantage. The next great evolution of QMS will be the one that brings these emerging technologies to all companies, regardless of size or scale. The future of QMS, and therefore the future of efficiency in business, rests on this shift: from companies being the recipients of ‘guru knowledge’ to being the designers of their own quality-minded futures.


Onega Ulanova is the CEO of QMS2GO, a provider of quality management systems leveraging AI in manufacturing.

The research outfit says North America leads global AI growth in oil and gas, with Houston playing a pivotal role. Photo via Getty Images

Report: Houston rises as emerging hub for $6B global AI in oil and gas industry

eyes on ai

Houston is emerging as a hub for the development of artificial intelligence in the oil and gas industry — a global market projected to be worth nearly $6 billion by 2028.

This fresh insight comes from a recently published report. The research outfit behind it says North America leads global AI growth in oil and gas, with Houston playing a pivotal role.

“With AI-driven innovation at its core, the oil and gas industry is set to undergo a profound transformation, impacting everything from reservoir optimization to asset management and energy consumption strategies — setting a new standard for the future of the sector,” the report says.

The research company predicts the value of the AI sector in oil and gas will rise from an estimated $3.2 billion in 2023 and $3.62 billion in 2024 to $5.8 billion by 2028. The report divides AI into three categories: software, hardware, and hybrids.
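The implied growth rate behind those figures can be checked with a quick compound-annual-growth-rate calculation (the dollar amounts are the report's; the arithmetic is ours):

```python
# Implied compound annual growth rate (CAGR) from the report's projections.
start, end, years = 3.2, 5.8, 5  # $3.2B in 2023 -> $5.8B in 2028

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 12.6% per year
```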

As cited in the report, trends that are sparking the explosion of AI in oil and gas include:

  • Stepped-up use of data
  • Higher demand for energy efficiency and sustainability
  • Automation of repetitive tasks
  • Optimization of exploration and drilling
  • Enhancement of safety

“The oil and gas industry’s ongoing digitization is a significant driver behind … AI in the oil and gas market. Rapid adoption of AI technology among oilfield operators and service providers serves as a catalyst, fostering market growth,” the report says.

The report mentions the Open AI Energy Initiative as one of the drivers of increased adoption of AI in oil and gas. Baker Hughes, C3 AI, Microsoft, and Shell introduced the initiative in February 2021. The initiative enables energy operators, service providers, and vendors to create sharable AI technology for the oil and gas industry.

Baker Hughes and C3 AI jointly market AI offerings for the oil and gas industry.

Aside from Baker Hughes, Microsoft, and Shell, other companies with a significant Houston presence that are cited in the AI report include:

  • Accenture
  • BP
  • Emerson Electric
  • Google
  • Halliburton
  • Honeywell
  • Saudi Aramco
  • Schlumberger
  • TechnipFMC
  • Weatherford International
  • Wood

Major AI-related trends that the report envisions in the oil and gas sector include the:

  • Digital twins for asset modeling
  • Autonomous robotics
  • Advanced analytics for reservoir management
  • Cognitive computing for decision-making
  • Remote monitoring and control systems

“The digitization trend within the oil and gas sector significantly propels the AI in oil and gas market,” says the report.


Houston jumps significantly on annual list of best places to live in 2024

by the numbers

Things are looking a little brighter for Houston as the city was recently named among the top 100 best places to live in U.S. News and World Report's "Best Places to Live" list for 2024-2025.

Previously, H-Town had shockingly plummeted to No. 140 in the 2023-2024 rankings. But the latest report places Houston at No. 97, a substantial improvement over the last year.

U.S. News annually measures 150 top American cities for their livability and ranks them based on four major indexes: quality of life, value, desirability, and job market.

New for the 2024-2025 report, U.S. News updated its methodology in two ways. First, it now analyzes city-based data rather than metropolitan area data. Second, the annual survey places greater weight on a city's "value and job market," while "weights for desirability and quality of life took a slight dip" on the grading scale.

"Rising concerns about career prospects, housing affordability and increased cost of goods and services are reflected in this year’s rankings," said U.S. News loans expert and reporter Erika Giovanetti in a press release. "While quality of life remains the top priority for many Americans, a city’s value and job market are becoming increasingly important for those looking for a place to live."

There are many factors that draw folks to Houston, among them the city's diversity, highly esteemed schools, top universities, and much more. Houston is also a great place for retirees looking to settle down without compromising on the big city lifestyle. The city truly has something for everyone.

The good news continues: Houston additionally moved up two spots to take No. 8 on the report's Best Place to Live in Texas list for 2024. The Bayou City ranked No. 10 last year.

Elsewhere in Texas
The recent focus on city-based data was likely a major factor that fueled Houston's improvement in the statewide and national rankings, but it also favorably shifted nine other Texas cities.

Austin – which previously ranked No. 40 in last year's rankings – became the only city to represent the Lone Star State among the top 10 best places to live in 2024. The Texas Capital jumped up 31 spots to claim No. 9 nationally, due to its "high desirability and job market scores," the report said.

Three cities in the Rio Grande Valley also ranked higher than Houston, suggesting that South Texas may be a better place to live than East Texas. The border towns of McAllen (No. 48) and Brownsville (No. 87) climbed into the overall top 100 this year after formerly ranking No. 137 and No. 134 last year. Meanwhile, Corpus Christi moved up from No. 132 last year to No. 77 in 2024.

Naples, Florida won the gold medal as the No. 1 best place to live in the U.S. in 2024. Rounding out the top five are Boise, Idaho (No. 2); Colorado Springs, Colorado (No. 3); Greenville, South Carolina (No. 4); and Charlotte, North Carolina (No. 5).

Here's how other Texas cities fared in 2024's Best Places to Live report:

  • No. 62 – El Paso (up from No. 128 last year)
  • No. 89 – San Antonio (up from No. 103 last year)
  • No. 95 – Dallas (up from No. 113 last year)
  • No. 99 – Beaumont (up from No. 131 last year)
  • No. 107 – Killeen (up from No. 122 last year)

The full report and its methodology can be found on


This article originally ran on CultureMap.

Houston organizations launch study to explore hydrogen-powered travel

sustainability takes flight

A few major players have teamed up to look into making air travel more sustainable — and it's all happening in Houston.

The Center for Houston’s Future, Airbus, and Houston Airports have signed a memorandum of understanding intended to study the “feasibility of a hydrogen hub at George Bush Intercontinental Airport." The study, which will conclude in March 2025, will see the participants collaborate on ways to rethink how their infrastructure could be designed and operated to reduce their overall environmental footprint, paving the way for hydrogen-powered aircraft like the ones Airbus plans to bring to fruition by 2035.

In 2020, Airbus debuted its ZEROe hydrogen-powered aircraft project. The “Hydrogen Hub at Airports'' concept by Airbus unites key airport ecosystem players to develop ways to decarbonize all airport-associated infrastructure with hydrogen. The study will include airport ground transportation, airport heating, end-use in aviation, and possibly ways to supply adjacent customers in transport and local industries.

The use of hydrogen to power future aircraft aims to eliminate aircraft CO2 emissions in the air, and it can also help decarbonize air transport on the ground. With Houston being such a large city and a destination for so many business travelers, its airports were a natural choice for the study.

"Houston’s airports are experiencing tremendous growth, connecting our city to the world like never before,” Jim Szczesniak, the aviation director for the city of Houston, says in a news release. “As we continue to expand and modernize our facilities, participating in this sustainability study is crucial. Continuing to build a sustainable airport system will ensure a healthy future for Houston, attract top talent and businesses, and demonstrate our commitment to being a responsible global citizen.

"This study will provide us with valuable insights to guide our development and position Houston as a global leader in sustainable aviation innovation for generations to come.”

The CHF was a founding organizer of the HyVelocity Hydrogen Hub, which was selected by the U.S. Department of Energy as one of seven hydrogen hubs in the nation and will serve the Houston area and the Gulf Coast. The HyVelocity Hydrogen Hub is eligible to receive up to $1.2 billion in Bipartisan Infrastructure Law funding to advance domestic hydrogen production.

“The Center for Houston’s Future is pleased to have played a crucial role in bringing together the partners for this study,” Brett Perlman, the center's outgoing CEO and president, adds. “With Houston’s role as the world’s energy capital, our record of energy innovation and desire to lead in the business of low-carbon energy, Houston is the perfect place to develop our airports as North American clean hydrogen pioneers.”

3 Houston innovators to know this week

who's who

Editor's note: Every week, I introduce you to a handful of Houston innovators recently making headlines with news of innovative technology, investment activity, and more. This week's batch includes a podcast with the founder of a fast-growing geothermal company, a human resources expert, and an outgoing climatetech CEO.

Tim Latimer, co-founder and CEO of Fervo Energy

Tim Latimer, CEO and co-founder of Fervo Energy, joins the Houston Innovators Podcast. Photo courtesy of Fervo Energy

Geothermal energy has been growing in recognition as a major player in the clean energy mix, and while many might think of it as a new climatetech solution, Tim Latimer, co-founder and CEO of Fervo Energy, knows better.

"Every overnight success is a decade in the making, and I think Fervo, fortunately — and geothermal as a whole — has become much more high profile recently as people realize that it can be a tremendous solution to the challenges that our energy sector and climate are facing," he says on the Houston Innovators Podcast.

In fact, Latimer has been bullish on geothermal as a clean energy source since he quit his job as a drilling engineer in oil and gas to pursue a dual degree program — MBA and master's in earth sciences — at Stanford University. He had decided that, with the reluctance of incumbent energy companies to try new technologies, he was going to figure out how to start his own company. Through the Stanford program and Activate, a nonprofit hardtech program that funded two years of Fervo's research and development, Latimer did just that. Read more.

Karen Leal, performance specialist at Insperity

Time to think ahead, business owners. Here's what this expert thinks you need to prioritize. Photo courtesy

Not only is upskilling your workforce on a regular basis good for performance purposes, it also contributes to a positive company culture, writes Karen Leal, performance specialist with Houston-based Insperity, in a guest column.

"Learning and development (L&D) programs give employees the resources to grow within their current role and ready them for their possible advancement into new positions and/or another role or function," she writes. "This development should be a collaborative effort with the employee to support the employee’s growth goals. L&D programs build and strengthen your organization’s learning culture, which encourages employees to lean into the overall corporate culture and promotes employee engagement."

She goes on to outline the major benefits when developing L&D programs that impact business success. Read more.

Kevin Knobloch, CEO of Greentown Labs

Kevin Knobloch is stepping down as Greentown Labs CEO, effective on July 31. Photo via LinkedIn

While not based full time in Houston, Kevin Knobloch has led Greentown Labs, which is co-located in the Boston and Houston areas, as president and CEO for the past several months. Last week, he announced he's stepping down.

Knobloch will continue in his role until the end of July 2024.

“It has been an honor to lead this incredible team and organization, and a true privilege to get to know many of our brilliant startup founders," Knobloch says in the news release. “Greentown is a proven leader in supporting early-stage climatetech companies and I can’t wait to see all that it will accomplish in the coming years.” Read more.