Texas is listed as the third-most vulnerable state when it comes to robots replacing the workforce in manufacturing. Houston houses a third of the manufacturing jobs in the state. Thossaphol Somsri/Getty Images

If a new forecast comes true, Houston's manufacturing sector could take an especially hard hit from the upturn in the use of robots.

In a new report, Oxford Economics, a forecasting and analysis firm based in the United Kingdom, ranks Texas as the third most vulnerable state when it comes to human workers in manufacturing being replaced by robotic labor. The report gives no estimate of how many manufacturing jobs Texas might lose to robots, but it projects that robots could displace 20 million manufacturing jobs worldwide by 2030.

About one-third of Texas' manufacturers operate in the Houston metro area, meaning the robot revolution carries significant weight for the regional economy.

In 2017, manufacturing accounted for $82.6 billion, or nearly 17 percent, of the Houston area's economic output, the U.S. Bureau of Economic Analysis says. Manufacturing employment in the region averaged 219,160 jobs in 2017, with total wages of nearly $4.8 billion.

Among the top manufacturing segments in the region are fabricated metals (22 percent of all manufacturing jobs), machinery (19 percent) and chemicals (17.5 percent), according to the Greater Houston Partnership. Between 2012 and 2017, manufacturing employment in the Houston area slipped by 9.8 percent, going from 243,011 workers to 219,160 workers.

However, a recent report from the Economic Innovation Group shows Harris County netted more manufacturing jobs (11,592) from December 2016 to December 2018 than any other county in the U.S.

According to the National Association of Manufacturers, the manufacturing sector in Texas created more than $226 billion in economic output in 2017. Last year, about 880,900 people held manufacturing jobs in Texas; that's more than 7 percent of the statewide workforce.

In declaring that Texas sits among the states most susceptible to job losses due to robotics, Oxford Economics took into account factors such as:

  • Dependence on manufacturing jobs.
  • Current use of robots in manufacturing.
  • Productivity of the manufacturing workforce.

Based on those criteria, Texas received a robot vulnerability score of 0.50. The top two states, Oregon and Louisiana, each got a score of 0.58, with the higher number meaning greater vulnerability.

The report cites three reasons for the ascent of robots in manufacturing:

  • Robots are becoming cheaper than humans.
  • Robots are becoming more sophisticated.
  • Demand for manufactured goods is rising.

"The rise of the robots will boost productivity and economic growth. It will lead, too, to the creation of new jobs in yet-to-exist industries, in a process of 'creative destruction,'" according to the Oxford Economics report. "But existing business models across many sectors will be seriously disrupted. And tens of millions of existing jobs will be lost, with human workers displaced by robots at an increasing rate as robots become steadily more sophisticated."

Tony Bennett, president and CEO of the Texas Association of Manufacturers, says the Oxford Economics report isn't all gloom and doom.

"Robotics and mechanization in our advanced manufacturing industries will continue to displace some general-labor jobs. However, this change is also ushering in a new set of higher-skilled jobs that are being created to engineer, build, and service these sophisticated machines," Bennett says. "The state of Texas must continue striving to increase educational opportunities in engineering, math, science, and career and technical programs to meet the complex manufacturing processes of the future."

Houston Community College's Advanced Manufacturing Center for Excellence is among the organizations in the Houston area that are preparing workers for jobs in robotics and other high-demand, tech-driven aspects of manufacturing.

"Innovation is Houston's bedrock," Houston Mayor Sylvester Turner said in 2017. "The city would have never thrived without the innovations it took to build the Ship Channel and the innovating that goes on every day in the energy industry, at the Texas Medical Center, at the Johnson Space Center and in the manufacturing sector. Now, Houston is poised to take its place at the forefront of the American future in technology."

Earlier this year, another study found a similarly daunting result. Almost half of Houston's workplace tasks are susceptible to automation, according to a report from the Brookings Institution's Metropolitan Policy Program. Of the country's 100 biggest metros analyzed, Houston ranks 31st, with 46.3 percent of work tasks susceptible to automation.



How Houston innovators played a role in the historic Artemis II splashdown

safe landing

Research from Rice University played a critical role in the safe return of U.S. astronauts aboard NASA’s Artemis II mission this month.

Rice mechanical engineer Tayfun E. Tezduyar and longtime collaborator Kenji Takizawa developed a key computational parachute fluid-structure interaction (FSI) analysis system that proved vital to the descent of NASA’s Orion capsule into the Pacific Ocean. The FSI system, originally developed in 2013 alongside NASA Johnson Space Center, was critical to Orion’s three-parachute design, which slowed the capsule as it returned to Earth, according to Rice.

The model helped ensure that the parachute design was large enough to slow the capsule for a safe landing while also being stable enough to prevent the capsule from oscillating as it descended.

“You cannot separate the aerodynamics from the structural dynamics,” Tezduyar said in a news release. “They influence each other continuously and even more so for large spacecraft parachutes, so the analysis must capture that interaction in a robustly coupled way.”

The result was a parachute system, refined through NASA drop tests and Rice’s computational FSI analysis, that eliminated fluctuations and produced a stable descent profile.

Apart from the dynamic challenges in design, modeling Orion’s parachutes also required solving complex equations that considered airflow and fabric deformation and accounted for features like ringsail canopy construction and aerodynamic interactions among multiple parachutes in a cluster.

“Essentially, my entire group was dedicated to that work, because I considered it a national priority,” Tezduyar added in the release. “Kenji and I were personally involved in every computer simulation. Some of the best graduate students and research associates I met in my career worked on the project, creating unique, first-of-its-kind parachute computer simulations, one after the other.”

Current Intuitive Machines engineer Mario Romero also worked on Orion during his time at NASA. From 2018 to 2021, Romero was a member of the Orion Crew Capsule Recovery Team, which focused on creating likely scenarios that crewmembers could encounter in Orion.

The team trained in NASA’s 6.2-million-gallon pool, using wave machines to replicate a range of sea conditions. They also simulated worst-case scenarios by cutting the lights, blasting high-powered fans and tipping a mock capsule to mimic distress situations. In some drills, mock crew members were treated as “injured,” requiring the team to practice safe, controlled egress procedures.

“It’s hard to find the appropriate descriptors that can fully encapsulate the feeling of getting to witness all the work we, and everyone else, did being put into action,” Romero tells InnovationMap. “I loved seeing the reactions of everyone, but especially of the Houston communities—that brought me a real sense of gratitude and joy.”

Intuitive Machines was also selected to support the Artemis II mission using its Space Data Network and ground station infrastructure. The company monitored radio signals sent from the Orion spacecraft and used Doppler measurements to help determine the spacecraft's precise position and speed.
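The physics behind those Doppler measurements is straightforward: motion along the line of sight shifts the received carrier frequency in proportion to the spacecraft's radial velocity. The sketch below illustrates the idea; the 2.2 GHz S-band carrier and 22 kHz shift are illustrative assumptions, not actual Artemis II telemetry, and a real navigation solution combines many such measurements with orbit models.

```python
# Illustrative sketch: mapping a one-way Doppler frequency shift to
# radial velocity. The frequencies below are assumed example values,
# not actual Artemis II or Intuitive Machines data.

C = 299_792_458.0  # speed of light, m/s

def radial_velocity(f_transmit_hz: float, f_received_hz: float) -> float:
    """Non-relativistic one-way Doppler: positive result = receding."""
    return C * (f_transmit_hz - f_received_hz) / f_transmit_hz

# Example: an S-band carrier near 2.2 GHz received 22 kHz low
v = radial_velocity(2.2e9, 2.2e9 - 22_000)
print(f"radial velocity ~ {v:.0f} m/s")  # roughly 3 km/s, receding
```

Repeating this measurement over time, from ground stations at known locations, constrains both the spacecraft's speed and its position along its trajectory.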

Tim Crain, chief technology officer at Intuitive Machines, wrote about the experience last week.

"I specialized in orbital mechanics and deep space navigation in graduate school,” Crain shared. “But seeing the theory behind tracking spacecraft come to life as they thread through planetary gravity fields on ultra-precise trajectories still seems like magic."

UH breakthrough moves superconductivity closer to real-world use

Energy Breakthrough

University of Houston researchers have set a new benchmark in the field of superconductivity.

Researchers from the UH physics department and the Texas Center for Superconductivity (TcSUH) have broken the transition temperature record for superconductivity at ambient pressure. The accomplishment could lead to more efficient ways to generate, transmit and store energy, which researchers believe could improve power grids, medical technologies and energy systems by enabling electricity to flow without resistance, according to a release from UH.

To break the record, UH researchers achieved a transition temperature of 151 Kelvin, the highest ever recorded at ambient pressure since the discovery of superconductivity in 1911.

The transition temperature is the point at which a material becomes superconducting, allowing electricity to flow through it without resistance. Scientists have worked for decades to push transition temperatures closer to room temperature, which would make superconducting technologies more practical and affordable.

Currently, most superconductors must be cooled to extremely low temperatures, making them more expensive and difficult to operate.

UH physicists Ching-Wu Chu and Liangzi Deng published the research in the Proceedings of the National Academy of Sciences earlier this month. It was funded by Intellectual Ventures and the state of Texas via TcSUH and other foundations. In 1987, Chu, founding director and chief scientist at TcSUH, made the breakthrough discovery that the material YBCO becomes superconducting at 93 Kelvin, which helped launch a global race to develop high-temperature superconductors.

“Transmitting electricity in the grid loses about 8% of the electricity,” Chu, who’s also a professor of physics at UH and the paper’s senior author, said in a news release. “If we conserve that energy, that’s billions of dollars of savings and it also saves us lots of effort and reduces environmental impacts.”
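Chu’s savings estimate is easy to sanity-check with rough numbers. The back-of-envelope below assumes about 4,000 TWh of annual U.S. electricity consumption and an average price of $0.10 per kWh; only the 8 percent loss figure comes from the article.

```python
# Back-of-envelope check on the "8% transmission loss" quote.
# The consumption and price figures are round assumptions,
# not numbers from the article.

us_annual_twh = 4_000          # assumed US annual electricity use, TWh
loss_fraction = 0.08           # loss figure cited in the article
price_per_kwh = 0.10           # assumed average retail price, USD

lost_twh = us_annual_twh * loss_fraction         # energy lost in transit
lost_dollars = lost_twh * 1e9 * price_per_kwh    # TWh -> kWh -> USD
print(f"~{lost_twh:.0f} TWh lost, worth about ${lost_dollars / 1e9:.0f} billion")
```

Under those assumptions the loss is on the order of tens of billions of dollars a year, consistent with the "billions of dollars of savings" Chu describes.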

Chu and his team used a technique known as pressure quenching, adapted from methods used to create diamonds. Researchers first apply intense pressure to a material to enhance its superconducting properties and raise its transition temperature, then cool it before releasing the pressure so the material retains those enhanced properties at ambient pressure.

Next, researchers are targeting ambient-pressure, room-temperature superconductivity of around 300 K. In a companion PNAS paper, Chu and Deng point to pressure quenching as a promising approach to help bridge the gap between current results and that goal.

“Room-temperature superconductivity has been seen as a ‘holy grail’ by scientists for over a century,” Rohit Prasankumar, director of superconductivity research at Intellectual Ventures, said in the release. “The UH team’s result shows that this goal is closer than ever before. However, the distance between the new record set in this study and room temperature is still about 140 C. Closing this gap will require concerted, intentional efforts by the broader scientific community, including materials scientists, chemists, and engineers, as well as physicists.”

---

This article originally appeared on EnergyCapitalHTX.com.