Activate's application is live from now through October 23, and all founders of early-stage, research-backed hardtech companies in Houston are encouraged to apply. Photo via Getty Images

Applications are officially open for Activate's second Houston cohort.

Activate's application is live from now through October 23, and all founders of early-stage, research-backed hardtech companies in Houston are encouraged to apply. The Berkeley, California-based program launched in Houston last year and recently named its inaugural Houston cohort.

“The Activate Fellowship provides an opportunity for approximately 50 scientists and engineers annually to transform into entrepreneurial leaders, derisk their technologies, define first markets, build teams, and secure follow-on funding,” says Activate’s executive managing director, Aimee Rose, in a news release. “With an average 30 percent annual growth in applications since 2015, we know there is high demand for what we do, and we’re excited to see the talent and impactful ideas that come through the pipeline this year.”

The program, led locally by Houston Managing Director Jeremy Pitts, has 249 current Activate fellows and alumni who have collectively raised over $2.4 billion in public and private funding since the organization was founded in 2015.

“The success of Activate Fellows is ample evidence that scientists and engineers have the talent and drive to face global challenges head-on,” adds Activate chief fellowship officer, Brenna Teigler. “Our diverse fellows are transforming technical breakthroughs into businesses across the United States in 26 states across a range of sectors spanning carbon management, semiconductors, manufacturing, energy, chemicals, ocean tech, and more.”

The application is available online, and fellows will be selected in April of next year. The 2025 program will begin in June.

Activate is looking for local and regional early-stage founders — who have raised less than $2 million in funding — who are working on high-impact technology. Each cohort consists of 10 fellows who join the program for two years. The fellows receive a living stipend, connections from Activate's robust network of mentors, and access to a curriculum specific to the program.

------

This article originally ran on EnergyCapital.

Hear from guest columnist Onega Ulanova on AI and quality management systems in manufacturing. Photo via Getty Images

Expert: How AI is disrupting manufacturing and the future of quality management systems

guest column

The concept of quality management is intrinsic to modern manufacturing — yet little understood by the general public — and it has revolutionized our world over the past hundred years.

Yet, in the present day, quality management and the related systems that guide its implementation are far from static. They are continuously evolving, adapting to ever-changing global conditions and to new means of application unleashed by technological innovation.

Now, more than ever, they are essential for addressing and eliminating not only traditional sources of waste in business, such as lost time and money, but also the physical and pollutant waste that threatens the world we all inhabit.

But what are quality management systems, or QMS, exactly? Who created them, and how have they evolved over time? Perhaps most pressingly, where can they be of greatest help in the present world, and when can they be implemented by businesses in need of change and improvement?

In this article, we will explore the history of QMS, explain their essential role in today’s manufacturing practices, and examine how these systems will take us into the future of productivity.

Quality Management Systems: A Definition

In the United States and globally, the gold standard of quality management standards and practices is the American Society for Quality. This preeminent organization, with over 4,000 members in 130 countries, was established in 1946 and has guided practices and implementation of quality management systems worldwide.

The Society defines a quality management system as “a formalized system that documents processes, procedures, and responsibilities for achieving quality policies and objectives,” and further states that “a QMS helps coordinate and direct an organization’s activities to meet customer and regulatory requirements and improve its effectiveness and efficiency on a continuous basis.”

From this definition, it can be understood that a good quality management system’s purpose is to establish the conditions for consistent and ever-increasing improvement through the use of standardized business culture practices.
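
To make the definition above concrete, here is a minimal sketch, in Python, of the kind of record a QMS might keep for a single documented process. The structure and field names are illustrative assumptions, not terminology drawn from ASQ or any ISO standard.

```python
# A purely illustrative sketch of the kind of record a QMS keeps for one
# process. The field names are assumptions, not ASQ or ISO terminology.
from dataclasses import dataclass, field
from typing import List

@dataclass
class QualityProcedure:
    process_name: str                                       # the activity being controlled
    objective: str                                          # the quality objective it serves
    responsible_role: str                                   # who is accountable
    steps: List[str] = field(default_factory=list)          # the documented procedure
    records_kept: List[str] = field(default_factory=list)   # evidence that it was followed

purchasing = QualityProcedure(
    process_name="Supplier purchasing",
    objective="Only approved suppliers provide production materials",
    responsible_role="Procurement manager",
    steps=["Evaluate supplier", "Approve supplier", "Issue purchase order"],
    records_kept=["Supplier evaluation form", "Approved supplier list"],
)
print(f"{purchasing.process_name}: owned by {purchasing.responsible_role}")
```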

Which QMS Standards are Most Widely Used?

Quality management’s remarkable growth since the 1940s has led to the rise of a number of widely used standards, which can serve as the basis for companies and organizations to design and implement their own practices. Most of these modern quality management standards are globally recognized and are specifically tailored to ensure that a company’s newly developed practices include essential elements that increase the likelihood of success.

The most widely-known entity which has designed such guidance is the International Organization for Standardization (ISO), a global organization which develops and publishes technical standards. Since the 1980s, the ISO has provided the 9000 series of standards (the most famous of which is 9001:2015) which outline how organizations can satisfy the checklists of quality management requirements and create their own best practices.

In 2020, over 1.2 million organizations worldwide were officially certified by the ISO for their quality management implementation practices.

However, it should be understood that the ISO 9000 standards are merely guidelines for the design and implementation of a quality management system; they are not systems in and of themselves.

Furthermore, the ISO is far from the only relevant player in this field. Many industry-specific standards, such as the American Petroleum Institute’s API Q1 standard, have been developed to target the highly specialized needs and business practices of the oil and gas industry. These industry-specific standards are generally aligned with the ISO 9000 standards and serve as complementary guidance rather than a replacement. It is entirely possible, and in many cases desirable, for a company to receive both ISO certification and certification from an industry-specific standards body, as doing so can help ensure the company’s newly developed QMS procedures are consistent with both broad and specialized best practices.

A History of Quality Management

The concept of quality management is intrinsically tied to the development of industrial production. Prior to the Industrial Revolution, the concept of ‘quality’ was inherently linked to the skill and effort of craftspeople, or in other words, individual laborers trained in specialized fields who, either individually or in small groups, produced goods for use in society.

Whether they were weaving baskets or building castles, these craftspeople were primarily defined by a skill that centered them in a specific production methodology, and it was the mastery of this skill which determined the quality. Guilds of craftspeople would sign their works, placing a personal or group seal on the resulting product and thereby accepting accountability for its quality.

Such signatures and marks are found dating back at least 4,500 years to the construction of Egypt’s Great Pyramid of Giza, and came into widespread practice in medieval Europe with the rise of craft guilds.

In these early confederations of workers, a person’s mastery of a skill or craft could become a defining part of their identity and life, to the extent that many craftspeople of 13th Century Europe lived together in communal settings, while the Egyptian pyramid workers may have belonged to life-long ‘fraternities’ who returned, year after year, to fulfill their roles in ‘work gangs’.

However, in the Industrial Revolution, craft and guild organizations were supplanted by factories. Though ancient and medieval projects at times reached monumental scale, the rise of thousands of factories, each requiring human and machine contributions to generate masses of identical products, required a completely different scale of quality management.

The emphasis on mass production necessitated the use of workers who were no longer crafts masters, and thus resulted in a decrease in the quality of products. This in turn necessitated the rise of the product inspection system, which was steadily refined from the start of the Industrial Revolution in 1760 into the early 20th century.

However, inspection was merely a system of quality control, rather than quality management; in other words, simply discarding defective products did not in and of itself increase total product quality or reduce waste.

As influential American engineer Joseph M. Juran explained, in 1920s-era America, it was common to throw away substantial portions of produced inventory due to defects, and when Juran prompted inspectors at his employer’s company to do something, they refused, saying it was the responsibility of the production line to improve. Quality control, in and of itself, would not yield quality management.

As is often the case in human history, war was the driver of change. In World War II, the mobilization of millions of American workers into wartime roles coincided with the need to produce greater quantities of high-quality products than ever before.

To counteract the loss of skilled factory labor, the United States government implemented the Training Within Industry program, which utilized 10-hour courses to educate newly-recruited workers in how to conduct their work, evaluate their efficiency, and suggest improvements. Similar training programs for the trainers themselves were also developed. By the end of the war, more than 1.6 million workers had been certified under the Training Within Industry program.

Training Within Industry represented one of the first successful implementations of quality management systems, and its impact was widely felt after the end of the war. In the ashes of conflict, the United States and the other Allied Powers were tasked with helping to rebuild the economies of the other wartime combatants. Nowhere was this a more pressing matter than Japan, which had seen widespread economic devastation and had lost 40 percent of all its factories. Further complicating the situation was the reality that, then as now, Japan lacked sufficient natural resources to serve its economic scale.

And yet, within just 10 years of the war’s end, Japan’s economy was growing twice as fast each year as it had been before the fighting started. The driver of this miraculous turnaround was American-derived quality management practices, reinterpreted and implemented with Japanese ingenuity.

In modern business management, few concepts are as renowned, and oft-cited for success, as kaizen. This Japanese word, which simply means “improvement,” is the essential lesson and driver of Japan’s postwar economic success.

Numerous books written outside Japan have attempted to explain kaizen’s quality management principles, often by citing them as being ‘distinctly Japanese.’ Yet, the basis for kaizen is actually universal and applicable in any culture or context; it is, simply put, an emphasis on remaining quality-focused and open to evolution. The development of kaizen began in the post-war period when American statistician William Edwards Deming was brought to Japan as part of the US government’s rebuilding efforts.

A student of earlier quality management thought leaders, Deming instructed hundreds of Japanese engineers, executives, and scholars, urging them to place statistical analysis and human relationships at the center of their management practices. Deming used statistics to track the number and origin of product defects, as well as to analyze the effectiveness of remedies. He also reinstated a key idea of the craftsperson’s creed: that the individual worker is not just a set of hands performing a task, but a person who can, with time, improve both the self and the whole of the company.
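
As a rough illustration of that kind of statistical bookkeeping — counting defects by origin and checking each batch’s defect rate against simple control limits — consider the sketch below. The data, defect categories, and 3-sigma limits are hypothetical; this is not Deming’s own method or figures, only the general idea.

```python
# A hedged sketch: tally defects by origin, then flag batches whose defect
# rate falls outside simple 3-sigma control limits. All numbers are made up.
from collections import Counter
import math

defects = ["solder", "paint", "solder", "fit", "solder", "paint"]  # observed causes
print(Counter(defects).most_common())  # which origins dominate

batch_sizes = [200, 200, 200, 200]
defect_counts = [6, 4, 18, 5]          # defects found per batch (hypothetical)
p_bar = sum(defect_counts) / sum(batch_sizes)   # overall defect proportion
for n, d in zip(batch_sizes, defect_counts):
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    p = d / n
    status = "OUT OF CONTROL" if abs(p - p_bar) > 3 * sigma else "in control"
    print(f"batch rate {p:.3f} ({status})")
```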

Deming was not alone in these efforts; the aforementioned Joseph M. Juran, who came to Japan as part of the rebuilding program several years later, also gave numerous lectures expounding similar principles.

Like Deming, Juran had previously tried to impart these approaches to American industry, but the lessons often fell on deaf ears. Japanese managers, however, took the lessons to heart and soon began crafting their own quality management systems.

Kaoru Ishikawa, who began by translating the works of Deming and Juran into Japanese, was one of the crucial players who helped to create the ideas now known as kaizen. He introduced a bottom-up approach where workers from every part of the product life cycle could initiate change, and popularized Deming’s concept of quality circles, where small groups of workers would meet regularly to analyze results and discuss improvements.

By 1975, Japanese product quality, which had once been regarded as poor, had transformed into world-class thanks to the teachings of Deming, Juran, and kaizen.

By the 1980s, American industry had lost market share and quality prestige to Japan. It was now time for US businesses to learn from Deming and Juran, both of whom at last found a receptive audience in their home country. Deming in particular achieved recognition for his role in the influential 1980 television documentary If Japan Can, Why Can’t We?, in which he emphasized the universal applicability of quality management.

So too did kaizen, which influenced a new generation of global thought leaders. Arising out of this rapid expansion of QMS were new systems in the 1970s and ‘80s, including the Six Sigma approach pioneered by Bill Smith and Motorola in 1987. Ishikawa, who saw his reputation and life transformed as his ideas spread worldwide, eventually summed up the explanation as the universality of human nature and its desire to improve. As Ishikawa said, “wherever they are, human beings are human beings”.

In no small part due to the influence of the thought leaders mentioned, quality management systems are today a cornerstone of global business practice. So influential are the innovators of these systems that they are often called ‘gurus.’ But what are the specific benefits of these systems, and how best can they be implemented?

How QMS Benefits Organizations, and the World

The oft-cited benefits of quality management systems are operational efficiency, employee retention, and reduction of waste. From all of these come improvements to the company’s bottom line and reputation. But far from being dry talking points, each benefit not only serves its obvious purpose, but also can dramatically help benefit the planet itself.

Operational efficiency is the measurement, analysis, and improvement of processes which occur within an organization, with the purpose of using data and careful consideration to eliminate or mitigate any areas where current practices are not effective.

Quality management systems can increase operational efficiency by utilizing employee analysis and feedback to quickly identify areas where improvements are possible, and then to guide their implementation.

In a joint study conducted in 2017 by Forbes and the American Society for Quality, 56 percent of companies stated that improving operational efficiency was a top concern; in the same survey, 59 percent of companies received direct benefit to operations by utilizing quality management system practices, making it the single largest area of improvement across all business types.

Because operational improvements inherently reduce both waste and cost, conducting business in a fully-optimized manner can simultaneously save unnecessary resource expenditure, decrease pollutants and discarded materials, and retain more money which the company can invest into further sustainable practices. Efficiency is itself a kind of ‘stealth sustainability’ that turns a profit-focused mindset into a generator of greater good.

It is this very point that the United States government’s Environmental Protection Agency (EPA) has emphasized in its guidance for Environmental Management Systems (EMS). These quality management system guidelines, tailored specifically to benefit operational efficiency in a business setting, are also designed to benefit the global environment by applying quality management practices.

Examples from the EPA’s studies in preparing these guidelines showcased numerous areas where small companies could reduce environmental waste while simultaneously reducing cost. These added up to substantial reductions and savings, such as a 15 percent wastewater reduction that saved a small metal-finishing company $15,000 per year.

Similarly, a 2020 study by McKinsey & Company identified ways that optimizing operations could dramatically aid a company’s sustainability with only small outlays of capital, thereby making environmental benefit a by-product of improved profitability.

Employee retention, and more broadly the satisfaction of employees, is another major consideration of QMS. Defined simply, retention is not only the maintenance of a stable workforce without turnover, but the improvement of that workforce over time as employees gain skill, confidence, and ability for continued self and organizational improvement. The Industrial Revolution may be long behind us, but thanks to the ideas of QMS, some of the concept of the craftsperson has returned to modern thinking; the individual, once more, has great value.

Quality management systems aid employee retention by allowing the people of an organization to have a direct hand in its improvement. In a study published in 2023 by the journal Quality Innovation Prosperity, 40 percent of organizations which implemented ISO 9001 guidance for the creation of a QMS reported that the process yielded greater employee retention.

A crucial success factor for employee satisfaction is how empowered the employee feels to apply judgment. According to a 2014 study by the Harvard Business Review, companies which set clear guidelines, protect and celebrate employee proposals for quality improvement, and clearly communicate the organization’s quality message while allowing the employees to help shape and implement it, have by far the highest engagement and retention rates. The greatest successes come from cultures where peer-driven approaches increase employee engagement, thereby eliminating preventable employee mistakes. Yet the same study also pointed out that nearly half of all employees feel their company’s leadership lacks a clear emphasis on quality, and only 10 percent felt their company’s existing quality statements were truthful and viable.

Then as now, the need to establish a clear quality culture, to manage and nurture that culture, and to empower the participants is critical to earning the trust of the employee participants and thereby retaining workers who in time can become the invaluable craftspeople of today.

Finally, there is the reduction of waste. Waste can be defined in many ways: waste of time, waste of money, waste of resources. The unifying factor in all definitions is the loss of something valuable, and irretrievable. All inevitably also lead to the increase of another kind of waste: pollution and discarded detritus which steadily ruin our shared planet.

Reducing waste with quality management can take many forms, but ultimately, all center on the realization of strategies which use only what is truly needed. This can mean both operational efficiencies and employee quality, as noted above. The Harvard Business Review survey identified that in 2014, the average large company (having 26,000 employees or more) loses a staggering $350 million each year due to preventable employee errors, many of which could be reduced, mitigated, or eliminated entirely with better implementation of quality management.

This is waste on an almost unimaginable financial scale. Waste eliminated through practices which emphasize efficiency and sustainability, as noted in the McKinsey & Company study, can also yield tremendous savings. In one example, a company which purchased asphalt and previously prioritized only the per-ton price found that, when examining the logistical costs of transporting the asphalt from distant suppliers, they were actually paying more than if they purchased it locally. The quality management analysis they performed yielded them a cost savings, and eliminated 40 percent of the carbon emissions associated with the asphalt’s procurement. In this case, not only was wasteful spending eliminated, but literal waste (pollution) was prevented.
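
The asphalt example can be made concrete with a small landed-cost comparison. The prices and freight rates below are hypothetical stand-ins, not figures from the McKinsey study; they only show why a lower per-ton price can still cost more once logistics are counted.

```python
# Illustrative only: hypothetical prices and freight rates showing the
# landed-cost comparison the asphalt example describes.
def landed_cost(price_per_ton, haul_miles, freight_per_ton_mile):
    return price_per_ton + haul_miles * freight_per_ton_mile

distant = landed_cost(price_per_ton=80.0, haul_miles=300, freight_per_ton_mile=0.12)
local   = landed_cost(price_per_ton=88.0, haul_miles=40,  freight_per_ton_mile=0.12)
print(f"distant supplier: ${distant:.2f}/ton, local supplier: ${local:.2f}/ton")
# distant: 80 + 36.00 = $116.00/ton; local: 88 + 4.80 = $92.80/ton —
# the 'cheaper' per-ton price is more expensive once logistics are included.
```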

In taking these steps, companies can meaningfully improve their bottom lines, while at the same time doing something worthwhile and beneficial for the planet. That, in turn, helps burnish their reputations. A remarkable majority of consumers, 88 percent of Americans surveyed in a 2017 study to be exact, said they would be more loyal to a company that supports social or environmental issues.

It is therefore clear that any steps a company can take which save money, improve worker satisfaction, and yield increased positivity in the marketplace are well worth pursuing.

What is the Future of QMS?

Until the 2000s, quality management systems were just that: systems of desirable practices, outlined by individuals and implemented individually. That was the age of the gurus: the visionaries who outlined the systems. But what that age lacked was a practical and easy means for companies, sometimes located far away from direct guidance by the gurus, to implement their teachings.

In the intervening years, technology has radically changed that dynamic. Today, QMS software fills the marketplace, allowing businesses small and large to design and guide their quality management plans. But even these software solutions have not yet solved the last great challenge: personalized assistance in putting standards into practice.

That is why the latest innovations, particularly in artificial intelligence, have the potential to upend the equation. Already, major companies have started to use artificial intelligence in connection with QMS datasets managed by software, utilizing the programs for statistical analysis, suggested improvements, and even prediction of potential faults before they occur.
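
As a hedged sketch of what such fault prediction can look like in practice — not any particular company’s system — the example below fits a simple classifier on synthetic QMS-style data and scores upcoming batches. The features, labels, and model choice are assumptions made purely for illustration.

```python
# A minimal sketch of the fault-prediction idea: fit a simple classifier on
# historical QMS records (here, synthetic numbers) and score the probability
# that an upcoming batch will contain a defect.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# hypothetical features: machine temperature and hours since last maintenance
X = rng.normal(loc=[70.0, 40.0], scale=[5.0, 20.0], size=(500, 2))
# synthetic labels: defects become likelier when both features run high
y = ((X[:, 0] - 70) / 5 + (X[:, 1] - 40) / 20 + rng.normal(size=500) > 1.5).astype(int)

model = LogisticRegression().fit(X, y)
upcoming = np.array([[78.0, 90.0], [68.0, 10.0]])
print(model.predict_proba(upcoming)[:, 1])  # predicted defect probability per batch
```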

These are immensely valuable opportunities, hence why huge players such as Honeywell are spending billions of dollars to bring innovative AI technology companies into their platforms to refine existing QMS systems.

But while AI has already begun to significantly affect the biggest players, small and mid-sized companies remain eager, but not yet able, to take full advantage. The next great revolution, then, will be a new evolution of QMS, one which brings these emerging technologies to all companies, regardless of size or scale. The future of QMS, and therefore the future of efficiency in business, rests upon this shift from companies being the recipients of ‘guru knowledge’ to being the designers of their own quality-minded futures.

------

Onega Ulanova is the CEO of QMS2GO, a provider of quality management systems leveraging AI in manufacturing.

Meet the six startups that will be working with Shell and Greentown Labs for the next six months. Photo via Greentown

6 energy tech startups named to corporate-backed manufacturing accelerator

go make

Greentown Labs has named the six participating climatetech startups for an accelerator for a global energy leader.

Shell and Greentown Labs announced the cohort for Greentown Go Make 2023 — a program designed to accelerate partnerships between startups and corporates to advance carbon utilization, storage, and traceability solutions with manufacturing in mind. Shell, which invests in net-zero and carbon-removal technologies, is hoping to strategically align with startups within carbon utilization, storage, and traceability across the energy transition spectrum.

“At Greentown Labs we recognize and appreciate the role energy incumbents must play in the energy transition, and we’re eager to facilitate meaningful partnerships between these impressive startups and Shell—not only to advance these technologies but also to help Shell achieve its sustainability goals,” Kevin Knobloch, CEO and President of Greentown Labs, says in a news release. “We know carbon utilization, storage, and traceability will play a critical role in our collective efforts to reach net-zero, and we’re enthusiastic about the potential impact these companies can have in that work.”

The cohort, selected from 110 applications, is co-located at Greentown's Houston and Somerville, Massachusetts, locations and includes:

  • Portland-based Caravel Bio is developing a novel synthetic biology platform that uses microbial spores and enzymes to create catalysts that are long-lasting and can withstand extreme conditions and environments.
  • Circularise, which is based in the Netherlands, is developing a blockchain platform that provides digital product passports for end-to-end traceability and secure data exchange for industrial supply chains.
  • Corumat, based in Washington, converts organic waste into high-performance, insulating, greaseproof, and biodegradable packaging materials.
  • Cambridge, Massachusetts-headquartered Lydian develops a fully electrified reactor that can convert a variety of gaseous, non-fossil feedstocks into pure syngas with high efficiency.
  • Maple Materials, from Richmond, California, is developing a low-cost electrolysis process to split carbon dioxide into graphite and oxygen.
  • Ontario, Canada-founded Universal Matter develops a proprietary Flash Joule Heating process that converts carbon waste into high-value and high-performance graphene materials to efficiently create sustainable circular economies.

The program, which includes $15,000 in non-dilutive stipend funding for each company, will work closely with Shell and Greentown over six months via mentorship, networking opportunities, educational workshops, and partnership-focused programming to support collaboration. Go Make 2023 concludes with a showcase event on March 27 at Greentown Labs’ Houston location.

This week, Shell announced another accelerator cohort it's participating in. The Shell GameChanger Accelerator, a partnership with the U.S. Department of Energy’s National Renewable Energy Laboratory (NREL), named four West Coast climatetech companies: DTE Materials, Hexas Biomass, Invizyne Technologies, and ZILA BioWorks. The program provides early-stage cleantech startups with access to experts and facilities to reduce technology development risk and accelerate commercialization of new cleaner technologies.

“Tackling the climate challenge requires multifaceted solutions. At Shell, we believe technology that removes carbon dioxide from the atmosphere will be essential for lowering emissions from energy and chemical products,” Yesim Jonsson, Shell’s GCxN program manager, says in a statement. “The companies in GCxN's sixth cohort embody these objectives and have the potential to usher in a more sustainable future.”

------

This article originally ran on EnergyCapital.

The 130,000-square-foot Resilience Manufacturing Hub is coming to the Second Ward. Photo via houston.org

$32M resilience-focused hub to rise in Houston's East End

coming soon

A first-of-its-kind manufacturing hub designed to “future proof” residential, commercial, industrial, and public sector infrastructure is coming to Houston.

The 130,000-square-foot Resilience Manufacturing Hub will house functions such as R&D, manufacturing, and assembly for products aimed at improving the resilience of homes, office buildings, warehouses, and other components of the “built environment.”

“We are looking for any product or technology solution that can reduce the impact from the next generation of disasters … by helping people thrive, not just survive, in their own community,” says Richard Seline, co-founder and managing director of the Houston-based Resilience Innovation Hub. The innovation hub is a partner in the manufacturing hub.

Seline says the manufacturing hub, with an estimated price tag of $32 million, will directly employ about 60 people. He expects the facility to either generate or “upskill” about 240 off-site jobs.

The manufacturing hub will be built adjacent to the 300,000-square-foot East End Maker Hub, which opened in Houston’s Second Ward neighborhood two years ago. Seline says five companies already have expressed interest in being tenants at the manufacturing hub, which is set to open by next summer.

The East End Maker Hub, a public-private endeavor, opened in the summer of 2021. Photo by Natalie Harms/InnovationMap

“We know that the supply chains keep failing over and over again in regard to responding to and rebuilding after disasters. This is a way to address that,” Seline says of the manufacturing hub.

Aside from the innovation hub and East End Maker Hub, partners in the manufacturing venture are the nonprofit Urban Partnerships Community Development Corp. (UPC) and modular construction company VEMAS. UPC is based in Houston, and VEMAS has a Houston office.

“The Resilience Manufacturing Hub is one of four pillars in UPC’s vision for an Invest Houston strategy to grow our economy from within by directly impacting middle-income employment — vital for the 1 million jobs projected as a gap in greater Houston’s long-term competitiveness,” says Patrick Ezzell, president and chairman of UPC and founder of the East End Maker Hub.

The manufacturing hub will work hand in hand with the innovation hub. The innovation hub assesses and addresses risks triggered by climate-produced, manmade, pandemic-related and cybersecurity threats. Hub participants work on innovations aimed at alleviating these risks.

In 2012, the National Academy of Sciences defined resilience as “the ability to prepare and plan for, absorb, recover from, and more successfully adapt to adverse events.” Those events include hurricanes and floods.

The resilience movement got a substantial boost last year thanks to passage of the federal Community Disaster Resilience Zones Act. The law allows for designation of resilience zones in communities that are at high risk of natural disasters and have limited resources. These zones will qualify for federal funding earmarked for resilience efforts.

Harris County scores nearly 98 out of 100 on the National Risk Index, generated by the Federal Emergency Management Agency (FEMA), putting it into the “very high” risk category for natural hazards.

Yet Harris County ekes out a score of 12.73 out of 100 for community resilience, landing it in the “very low” category. This means the county has a poor ability to prepare for natural hazards, adapt to changing conditions, and withstand and recover from disruptions.

Richard Seline is the co-founder and managing director of the Houston-based Resilience Innovation Hub. Photo courtesy

Friday, October 1, is Manufacturing Day Houston at East End Maker Hub. Image courtesy of EEMH

Houston has all the ingredients to thrive as a manufacturing hub, says expert

guest column

Manufacturing is critical to building the economy on both local and national levels.

According to Deloitte and The Manufacturing Institute, 4.6 million U.S. manufacturing jobs will be needed by 2030. The National Association of Manufacturing estimates that each $1 spent in manufacturing adds $2.79 to the economy and each $1 earned in direct manufacturing labor income yields $3.14 in labor income elsewhere. Failing to fill these jobs could cost the U.S. $1 trillion and thwart economic growth.

Manufacturing is a win-win for Houston. With Houston's manufacturing sectors tied to the overall U.S. economy, the Greater Houston area has the opportunity to thrive as a manufacturing powerhouse as production returns to the U.S.

"Houston is an amazing city with a wide variety of entrepreneurs, inventors and industry specialties. To support these firms, we need tens of thousands of skilled employees in a plethora of manufacturing jobs. On the product side, they include Space, Medical Devices, Robotics, Additive Manufacturing, BioEngineering, and next generation energy devices. From the process side - refined products, chemicals, beverages and plastics," said Michael Holthouse, CEO and founder of Holthouse Foundation For Kids.

In an effort to increase awareness of these advanced manufacturing careers, TXRX East End Maker Hub is hosting Manufacturing Day Houston on Friday, October 1. The event is attracting hundreds of middle- and high-school youth along with their teachers from the Greater Houston area.

EEMH is opening its doors to allow students to engage in hands-on experiences and demonstrations and to interact with subject matter experts to learn the latest technologies in Process Manufacturing, Product Manufacturing, Bioengineering, Virtual Reality, Robotics, 3D printing and more. The keynote speaker, Jim "Mattress Mack" McIngvale of Gallery Furniture, will open the event.

Manufacturing Day Houston is a local effort to join National Manufacturing Day and Creators Wanted, both industry initiatives supported by the National Association of Manufacturers and the Manufacturing Institute. Manufacturing Day Houston has been created to reshape the perception of the advanced manufacturing industry and help today's youth understand how they can match their talents with in-demand product and process manufacturing careers that average $87,185 annually.

While attractive, many of these skilled manufacturing jobs go unfilled due to misperceptions about the industry and the educational opportunities available. Houston's community colleges and technical programs offer affordable training for these opportunities, which can be completed in two years or less.

------

Michelle Wicmandy serves as a marketing consultant for Imagina Communications.

Rice Business Professor Amit Pazgal found that in certain situations, gray markets can actually help manufacturers and retailers. Photo by Science in HD on Unsplash

Rice University researcher reveals the benefits to unauthorized manufacturing markets

Houston voices

A camera store in Taiwan buys Nikon cameras from an electronics shop in the Philippines, where photo equipment is cheaper. Then the store sells them to consumers in Taiwan at a lower price. The camera comes without a warranty and instructions are in Filipino – the buyers in Taiwan are happy to have a real Nikon for a lower cost.

The sellers and customers are operating in the so-called gray market – where genuine products are sold through unauthorized channels. Gray marketers buy goods in markets with lower prices, then ship them to a market with higher prices, where they will likely sell for a profit. Though the products are identical, consumers typically see gray market goods as inferior since they often lack benefits like after-sale services or warranty coverage.
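
The basic arbitrage can be illustrated with a small sketch. The prices, shipping cost, and discount below are hypothetical and are not taken from the research; they only show how a margin can arise even when buyers demand a discount for the gray market version.

```python
# Hypothetical numbers, only to illustrate the gray market arbitrage:
# buy in the lower-priced market, ship, and undercut the authorized price
# by enough of a discount to satisfy buyers who view the goods as inferior.
authorized_price_high_market = 1000.0   # local authorized retail price
purchase_price_low_market    = 700.0    # same genuine product bought abroad
shipping_and_handling        = 40.0
required_discount            = 0.20     # buyers accept gray goods ~20% cheaper

gray_price = authorized_price_high_market * (1 - required_discount)
profit = gray_price - purchase_price_low_market - shipping_and_handling
print(f"gray market price: {gray_price:.2f}, margin per unit: {profit:.2f}")
# 800.00 - 700.00 - 40.00 = 60.00 per unit, if these assumptions hold
```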

For years, gray markets have posed a significant threat to both manufacturers and retailers, depriving both of customers and profits. It's estimated that around $7 billion to $10 billion in goods enter the U.S. market through gray market channels every year. The IT industry, for one, loses approximately $5 billion a year due to gray market activities.

No specific laws in the U.S. ban this practice outright, however. As a result, in recent years, retailers have increasingly taken advantage of potentially cheaper prices abroad, personally importing or using third parties to buy original goods not meant for direct sale in the United States – and then selling them here for less. Alibaba, China's largest online shopping site, offers its hundreds of millions of shoppers a large array of gray market goods to peruse.

Manufacturers usually respond to gray markets with knee-jerk hostility, urging customers to avoid gray market goods and even filing lawsuits against gray market peddlers. Nikon, for example, maintains a section of its website educating consumers on how to identify gray market products and urging them to shun the gray market.

But is gray market commerce always destructive? Rice Business Professor Amit Pazgal joined then-Rice Business Ph.D. student Xueying Liu (now an assistant professor at Nankai University) to explore scenarios in which gray markets could be good for both manufacturers and retailers. Testing the theory in recent research, Pazgal and Liu found that there are indeed situations in which both manufacturers and retailers can profit thanks to gray markets, while the associated product also improves in quality.

To reach these conclusions, the researchers started by recruiting 118 participants between the ages of 25 and 45 to complete a gray market product survey. They found the majority had no problem buying gray market goods. Only 3 percent of consumers wouldn't consider buying cosmetics from a gray marketer, while 6 to 7 percent wouldn't buy electronics. Despite this, more than 90 percent of participants who were willing to buy required a price discount of 20 to 30 percent, showing the goods were seen as slightly inferior.

The researchers then tested responses to a model of a manufacturer selling a single product to two markets – or countries – that differed in size and in customer willingness to pay for the product. Consumers in one market would pay more, on average, for quality. For example, the Nikon D500 camera is sold for a 7.5 percent premium in Taiwan versus Thailand and a 10 percent price premium in Taiwan versus the Philippines.

Pazgal and Liu found that when the manufacturer sells their product directly to consumers in both markets when there is also a gray market, both the manufacturer's profit and product quality decrease. But when the same manufacturer sells their product indirectly to a retailer in at least one of these markets, both the manufacturer's and the retailer's profits can increase. So can the product's quality.

This occurs for several reasons. First, gray marketers increase total demand and profit for the retailer in the lower-priced market, or in the market where the gray marketer buys their goods. The manufacturer can set a higher wholesale price for the better quality product in a market where consumers pay more, and increase sales in both markets as consumers compare the regular, high-quality product to the gray market one. In fact, by offering a lower-priced, lower quality (that is, gray market) alternative to its own high-quality product, the manufacturer can better segment consumers in the higher-priced market.

Finally, the retailer in the higher-priced market becomes more profitable even though they lose some customers to the gray market. This is because increased product quality and price more than make up for lost sales. Researchers found that the results hold regardless of whether the gray marketer buys from the manufacturer or a retailer.

The bottom line: in certain situations, gray markets can improve profitability for both manufacturers and retailers (and, of course, the gray marketers). Counterintuitive though it is, manufacturers that sell through retailers shouldn't automatically see gray markets as an obstacle to their profits, rushing to demand that governments and courts shut them down. Instead, in some cases, companies could do well to embrace these gray markets, because they lead to overall improved profits.

Manufacturers can use this information to their advantage, Pazgal noted. Nikon, for example, could introduce a higher quality camera to the market, allowing it to set even higher wholesale prices and increase sales in both markets, far exceeding the cost of the higher quality product.

For consumers, meanwhile, gray markets are always beneficial because of lower prices. If companies heed Pazgal's findings, however, customers could also benefit from more innovative and higher quality cameras and other merchandise, as manufacturers hurry to create better products to bump up their profits.

------

This article originally ran on Rice Business Wisdom and is based on research from Amit Pazgal, the Friedkin Professor of Management – Marketing at the Jones Graduate School of Business.


Houston's Texas Medical Center wins prestigious global award recognizing leaders in life science innovation

new bling

Last month, a global organization honored innovation leaders in life sciences, and the Texas Medical Center was among the recipients of the prestigious awards program.

The 18th annual Prix Galien Awards Gala awarded TMC Innovation with the win in the "Incubators, Accelerators and Equity" category. The Galien Foundation created the awards program in 1970 in honor of Galien, the father of medical science and modern pharmacology. Alongside TMC, the other winners represented biotech, digital health, startups, and more.

"We are super proud of this distinction," Tom Luby, director of TMC Innovation says at Envision 2024 last month, crediting the TMCi team and TMC leadership for the award. "We lean on a lot of advisers and experts — people who volunteer their time to work with startups. Without (them), we would not have been successful."

Luby explains that a Prix Galien Award holds a Nobel Prize level of significance for the community.

TMCi was named a finalist in August, and competed against programs from Cedars-Sinai, Mayo Foundation for Medical Education and Research, TechConnect, and more.

"The Awards Committee is honored to witness the exceptional dedication and creativity of our nominees as they turn visionary ideas into transformative solutions for patients worldwide," says Michael Rosenblatt, chair of the Prix Galien USA Awards Committee, in a news release. "Their unwavering commitment to advancing patient care is truly commendable, and we are honored to celebrate their outstanding contributions to global health."

The award is displayed at TMC Innovation's office, located in the medical center at 2450 Holcombe Blvd.

Houston energy transition tech SPAC goes public through IPO

BLANK CHECK

Houston-based CO2 Energy Transition Corp. — a “blank check” company initially targeting the carbon capture, utilization, and storage (CCUS) sector — closed November 22 on its IPO, selling 6 million units at $10 apiece.

“Blank check” companies are formally known as special purpose acquisition companies (SPACs). A SPAC aims to complete a merger, acquisition, share exchange, share purchase, reorganization or similar business combination in certain business sectors. CO2 Energy Transition will target companies valued at $150 million to $250 million.

Each CO2 Energy Transition unit consists of one share of common stock, one warrant to purchase one share of common stock at a per-share price of $11.50, and the right to receive one-eighth of a share of common stock based on certain business conditions being met.

The IPO also included the full exercise of the underwriter’s option to buy 900,000 units to cover over-allotments. Kingswood Capital Partners LLC was the sole underwriter.

Gross proceeds from the IPO totaled $69 million. The money will enable the company to pursue CCUS opportunities.
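
A quick arithmetic check of that figure, using only the unit counts and price reported above, is sketched below.

```python
# Gross proceeds follow directly from the unit count and unit price
# stated in the IPO coverage: 6,000,000 base units plus the 900,000-unit
# over-allotment, all sold at $10 apiece.
base_units = 6_000_000
over_allotment_units = 900_000
price_per_unit = 10
gross_proceeds = (base_units + over_allotment_units) * price_per_unit
print(f"${gross_proceeds:,}")  # $69,000,000
```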

“Recent bipartisan support for carbon capture legislation heavily emphasized the government’s willingness to advance and support technologies for carbon capture, utilization, storage, and other purposes as efforts to reduce greenhouse gas emissions [continue],” CO2 Energy Transition says in an October 2024 filing with the U.S. Securities and Exchange Commission (SEC).

Brady Rogers is president and CEO of CO2 Energy Transition. He also is CEO of Carbon Capture Development Co., a Los Angeles-based developer of direct air capture (DAC) technology, and president of Houston-based Antelope Energy Partners LLC, a provider of oil and gas services.

------

This article originally ran on EnergyCapital.

Mastering control room management for smoother critical infrastructure operations

Up to Date

Control room management (CRM) systems play an integral role in ensuring the safe and efficient remote operations of automated processes for the world's most critical infrastructures (CI). If anything goes wrong with these CIs, the risks are major: loss of life or catastrophic environmental disasters. For this reason, rigorous regulatory requirements are crucial.

CRM systems give operators the ability to automate and take control of CI processes, providing situational awareness and real-time visibility of remote assets. This minimizes the need for manual work and inspection, and scales a company's ability to safely manage many assets over a large geographical area from one control room.

Most CI have to handle hazardous material in some, if not all, of their operational areas. Though the specifics differ by industry, regulation and oversight are essential.

ICS (Industrial Control Systems) and CRM tools are key components of real-time monitoring for advanced warning and emergency alarming. A “green, amber, red” alert on the screen of an operator's control console prompts them to respond, and can potentially lead to emergency shutdown response procedures. Training and testing of the control systems and their related standards, procedures, and activities are all recorded in a system of record in compliance with regulatory requirements.
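
As a minimal sketch of how such a "green, amber, red" status might be derived, consider the function below. The tag, thresholds, and readings are hypothetical and do not reflect any specific vendor's alarm logic.

```python
# A minimal, hypothetical "green, amber, red" classification for one
# sensor reading against illustrative warning and emergency thresholds.
def alarm_status(value, warn_low, warn_high, trip_low, trip_high):
    if value <= trip_low or value >= trip_high:
        return "red"    # emergency: operator follows shutdown procedures
    if value <= warn_low or value >= warn_high:
        return "amber"  # advanced warning: operator investigates
    return "green"      # normal operation

# e.g., pipeline pressure in psi with illustrative limits
for reading in (640.0, 715.0, 810.0):
    print(reading, "->", alarm_status(reading, warn_low=650, warn_high=750,
                                      trip_low=600, trip_high=800))
```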

Current challenges
One of the biggest challenges is the ability to easily aggregate the data from the many different systems and integrate it with the operator's daily activity and responses to the many notifications they receive. This makes handover difficult, when a new control room operator comes in fresh to take over from the operator coming off duty. Ensuring a clean and clear handover that encompasses all the pertinent information, so that the new operator can take over the console with ease and clarity, is much more difficult than some would imagine.

Another issue is the sheer volume of data. When you have thousands of sensors streaming data, it is not unrealistic for a console to receive a few thousand data points per second. Performance and continuity are priorities on CI control room consoles, so there is no room for error — meaning there is, quite literally, no room for big data.

All of this means that real-time data must be pushed off the operational and process control network and moved into an environment outside those controls, where big data can be stored and analyzed, enabling AI, machine learning, and other data science.
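
A simplified sketch of that one-way hand-off follows. The buffer, file target, and tag names are illustrative assumptions standing in for whatever historian or data lake an operator actually uses.

```python
# A hedged sketch: batch readings on the control side and hand them to an
# analytics store outside the process control network. The file target is a
# stand-in for a real historian or data lake.
import json, time
from collections import deque

buffer = deque()

def ingest(tag, value):
    """Called by the control-side collector for each data point."""
    buffer.append({"tag": tag, "value": value, "ts": time.time()})

def flush_to_analytics(path="analytics_batch.jsonl", batch_size=1000):
    """Move a batch out of the real-time path; analytics (AI/ML) runs elsewhere."""
    batch = [buffer.popleft() for _ in range(min(batch_size, len(buffer)))]
    with open(path, "a") as f:
        for record in batch:
            f.write(json.dumps(record) + "\n")
    return len(batch)

ingest("PT-101.pressure", 702.4)
ingest("FT-202.flow", 118.9)
print(flush_to_analytics(), "records exported")
```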

Controller/operator fatigue is also an issue. Manual tracking, documenting, and record-keeping increases fatigue, leading to more mistakes and omissions.

Opportunities for improvement
Houston-based Tory Technologies, Inc. is a corporation specializing in advanced software applications, creating and integrating various innovative technologies, and providing solutions for control room management and electronic flow measurement data management.

Tory Technologies, Inc. can help with the auto population of forms, inclusion of historical alarms and responses, and easy handover of control with active/open issues highlighted, making for an easier transition from one operator to the next.
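
As a minimal illustration of the handover idea — a sketch of the concept, not Tory Technologies' product — the example below rolls up a shift's alarm activity and highlights what remains open for the incoming operator. The alarm records and field names are hypothetical.

```python
# A minimal, hypothetical shift-handover summary: list the shift's alarms
# and flag anything still open so the incoming operator starts informed.
from datetime import datetime

shift_alarms = [
    {"tag": "PT-101", "severity": "amber", "acknowledged": True,  "resolved": True},
    {"tag": "FT-202", "severity": "red",   "acknowledged": True,  "resolved": False},
    {"tag": "LT-305", "severity": "amber", "acknowledged": False, "resolved": False},
]

def handover_report(alarms, outgoing, incoming):
    open_items = [a for a in alarms if not a["resolved"]]
    lines = [f"Shift handover {datetime.now():%Y-%m-%d %H:%M}  {outgoing} -> {incoming}",
             f"Alarms this shift: {len(alarms)}  (still open: {len(open_items)})"]
    for a in open_items:
        ack = "ACK" if a["acknowledged"] else "UNACKNOWLEDGED"
        lines.append(f"  OPEN  {a['tag']}  {a['severity'].upper()}  {ack}")
    return "\n".join(lines)

print(handover_report(shift_alarms, outgoing="Operator A", incoming="Operator B"))
```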

"CRM is essential for keeping operations safe and efficient in industries where mistakes can lead to serious problems," says Juan Torres, director of operations - MaCRoM at Tory Technologies, Inc. "While many control rooms have worked hard to meet compliance standards, challenges remain that can affect performance and safety. It's not enough to just meet the basic rules; we need to go further by using smarter tools and strategies that make CRM more than just compliant, but truly effective."

Shaun Six, president of UTSI International, notes that, "CRM solutions are scalable. A smart integration with relevant systems and related data will reduce 'white noise' and increase relevance of data being displayed at the right time, or recalled when most helpful."

The future state
Offering CRM as a service for non-regulated control rooms will give economies of scale to critical infrastructure operators, which will allow dispatching, troubleshooting, and network monitoring so operators can focus on more value-add activities.

It can also virtualize network monitoring, ensuring that field machines and edge computers are compliant with industry and company standards and are not exposed to external threats.

Even better: Much of this can be automated. Smart tools can look through each device and test that passwords are changed, configurations are secure, and firmware/software has been properly patched or safeguarded against known exploits.
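
A hedged sketch of such automated checks is shown below, run against a hypothetical device inventory. The device fields, baselines, and thresholds are assumptions; a real deployment would pull this information from asset-management and patch-management systems.

```python
# A hypothetical compliance sweep: check password rotation, configuration
# baseline, and firmware level for each field device.
devices = [
    {"name": "edge-gw-01", "password_age_days": 45,  "firmware": "2.4.1", "config_hash_ok": True},
    {"name": "rtu-17",     "password_age_days": 400, "firmware": "1.9.0", "config_hash_ok": False},
]
MIN_FIRMWARE = "2.0.0"      # illustrative patched baseline
MAX_PASSWORD_AGE = 180      # days

def version_tuple(v):
    return tuple(int(part) for part in v.split("."))

def compliance_findings(device):
    findings = []
    if device["password_age_days"] > MAX_PASSWORD_AGE:
        findings.append("password not rotated")
    if version_tuple(device["firmware"]) < version_tuple(MIN_FIRMWARE):
        findings.append("firmware below patched baseline")
    if not device["config_hash_ok"]:
        findings.append("configuration drifted from approved baseline")
    return findings

for d in devices:
    issues = compliance_findings(d)
    print(d["name"], "->", "compliant" if not issues else "; ".join(issues))
```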

The sheer volume of data from these exercises can be overwhelming to operators. But a trained professional can easily filter and curate this data, cutting through the noise and helping asset owners address high-risk/high-probability exploits and plan/manage them.

Ultimately, the goal is to make control rooms efficient, getting the right information to the right people at the right time, while also retaining and maintaining required documents and data, ensuring an operator's “license to operate” is uninterrupted and easily accessible to external parties when requested or needed.

Integrating smart CRM systems, network monitoring tools, and processes for testing and validation is readily achievable with current technological capabilities, letting operators focus on the task at hand with ease and peace of mind.