Texas A&M will work with DARPA to test fully automated and semi-automated helicopters to combat wildfires in the state. Image by Colie Wertz. Courtesy DARPA.

Texas A&M University's George H.W. Bush Combat Development Complex will receive $59.8 million to develop a way for autonomous helicopters to fight wildfires in the state.

The funds, appropriated by the Texas Legislature, will go toward acquiring up to four UH-60 Blackhawk helicopters and developing their autonomous configuration, as well as toward facilities, tools and equipment for research, testing and integration of firefighting capabilities over the next two years, according to a release from Texas A&M.

The BCDC was also selected to work with the Defense Advanced Research Projects Agency (DARPA) on its Aircrew Labor In-cockpit Automation System (ALIAS), which works to reduce risks for pilots and aircraft in high-risk missions.

"Working together with Texas, we have an opportunity to use autonomous helicopters to completely change the conversation around wildfires from containing them to extinguishing them,” Stuart Young, DARPA program manager for ALIAS, said in a release from DARPA.

The BCDC program will incorporate DARPA's automation toolkit, known as MATRIX, which has already demonstrated fully autonomous flight capabilities on approximately 20 aircraft platforms. MATRIX, which was developed by Connecticut-based Sikorsky Aircraft, was tested in proof-of-concept demonstrations of autonomous fire suppression in California and Connecticut earlier this year, according to DARPA.

“I am proud we are working with DARPA in a manner that will benefit Texas, the Department of Defense, and commercial industry,” retired Maj. Gen. Tim Green, director of the BCDC, said in the release. “Wildland firefighting will be the first mission application fully developed to take advantage of over a decade of work by DARPA on its Aircrew Labor In-cockpit Automation System (ALIAS).”

The BCDC will test fully automated and semi-automated ALIAS-equipped aircraft on highly complex firefighting tasks. The complex will also work with Texas A&M University–Corpus Christi’s Autonomy Research Institute, the Texas Division of Emergency Management, the Texas A&M Engineering Extension Service, the Texas A&M Forest Service and the Texas A&M Engineering Experiment Station on the project.

John Diem, director of the innovation proving grounds at BCDC, will serve as principal investigator for the project.

“Advancing system capabilities through the last stages of technology maturation, operational testing, and concept development is always hugely exciting and rewarding,” Diem added in the release. “The best part of my career has been seeing systems I tested move into the hands of warfighters. Now, I’m proud to help ensure ALIAS is safe and effective in protecting life and property – and we will do that through realistic and challenging testing.”

Two researchers at Texas A&M University have developed diagnostic software for monitoring electrical equipment to prevent outages and even wildfires. Getty Images

Texas A&M University technology used to prevent outages and wildfires

hot tech

The threat of wildfires is on most people's minds as Australia suffers from devastating, uncontrollable fires in its southeastern region. While Australia's fires have largely been attributed to natural causes, others, like the California wildfires of late 2019, are caused by electrical malfunctions and sparks.

Engineers at Texas A&M University have developed a way to prevent these electricity-caused wildfires, and the electrical outages that often accompany them, with their diagnostic software called Distribution Fault Anticipation, or DFA. The software interprets variations in the electrical current on utility circuits, usually caused by failing equipment, that can lead to outages or spark fires.
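The release doesn't describe DFA's algorithms in detail, but the basic idea, flagging current variations that deviate from a circuit's normal behavior, can be sketched in a few lines. Everything below (the function names, the sliding-window RMS approach, the threshold) is a hypothetical illustration, not Texas A&M's implementation:

```python
# Hypothetical sketch of current-waveform anomaly detection; NOT the actual
# DFA code. Idea: compare short-window RMS of the line current against a
# baseline and flag windows that deviate enough to suggest failing equipment.
import numpy as np

def window_rms(current: np.ndarray, window: int) -> np.ndarray:
    """RMS of the current signal over consecutive non-overlapping windows."""
    n = len(current) // window
    chunks = current[: n * window].reshape(n, window)
    return np.sqrt((chunks ** 2).mean(axis=1))

def flag_anomalies(current: np.ndarray, window: int = 1024, k: float = 4.0) -> np.ndarray:
    """Return indices of windows whose RMS deviates more than k sigma from baseline."""
    rms = window_rms(current, window)
    baseline, spread = np.median(rms), rms.std()
    return np.where(np.abs(rms - baseline) > k * spread)[0]

# Example: a clean 60 Hz current with an injected arcing-like disturbance.
rate = 15_360                                     # assumed sample rate, Hz
t = np.arange(10 * rate) / rate                   # 10 seconds of samples
i = 100 * np.sin(2 * np.pi * 60 * t)
i[80_000:82_000] += 30 * np.random.randn(2_000)   # simulated fault signature
print(flag_anomalies(i))                          # windows covering the fault
```

A real system would work with far richer signatures than RMS, but the pattern (continuous monitoring, a learned baseline, deviation alerts) matches the role the release describes for DFA.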

A Texas A&M research team — spearheaded by B. Don Russell, professor of electrical and computer engineering, and research professor Carl L. Benner — is behind the DFA software.

The technology has been tested at over a dozen utilities in Texas over the past six years, according to a news release, and now two California utility companies — Pacific Gas & Electric and Southern California Edison — will be testing DFA. In 2018, a state law began requiring utilities to submit Wildfire Mitigation Plans to the California Public Utilities Commission, per the release.

Up next: The researchers are preparing to test the software in Australia and New Zealand.

DFA's algorithms were developed and refined through 15 years of research. Russell and Benner liken DFA to the diagnostic computers in modern cars, technology that, by comparison, the utility industry has long lacked.

"Utility systems operate today like my 1950s Chevy," Russell says in the release. "They have some fuses and breakers and things, but they really don't have anything diagnostic. They don't have that computer under the hood telling them what's about to go wrong."

B. Don Russell, professor of electrical and computer engineering, led the research at A&M. Photo via A&M

Normal wear and tear on electrical equipment is inevitable, but the damage is difficult for inspectors to spot visually. Before DFA, utilities had little choice but to react to failures or outages rather than invest in prevention. The software gives companies earlier visibility into what could cause problems. And by factoring in dry conditions and weather, it can even flag circuits at risk of starting wildfires.
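Neither the release nor the researchers spell out how weather is folded in, but one plausible shape for such a model, offered purely as a hypothetical sketch, is to weight an equipment-health signal by local fire-weather conditions:

```python
# Hypothetical illustration of weighting an equipment-health signal by
# weather; NOT the actual DFA model. Assumed inputs: anomaly_rate from
# waveform monitoring (events/day), a 0-1 dryness index, and wind speed.
def fire_risk_score(anomaly_rate: float, dryness: float, wind_mph: float) -> float:
    """Toy risk score: unhealthy circuits matter most in dry, windy weather."""
    equipment_risk = min(anomaly_rate / 10.0, 1.0)        # saturate at 10 events/day
    weather_factor = dryness * min(wind_mph / 40.0, 1.0)
    return equipment_risk * (0.5 + 0.5 * weather_factor)

# A moderately unhealthy circuit on a dry, windy day scores ~0.54.
print(fire_risk_score(anomaly_rate=6, dryness=0.9, wind_mph=35))
```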

"Power is being turned off with nothing known to be wrong with a given circuit," Russell says in the release. "Utilities need a crystal ball, something telling them which circuit is going to start a fire tomorrow because it is already unhealthy. We are kind of that crystal ball."

DFA has the potential to prevent outages and the devastation caused by wildfires, and it also offers a major economic benefit for utility companies, especially the ones reeling from the recent fires in California.

Pacific Gas & Electric, which is testing nine DFA devices, is the state's largest utility company and recently filed for bankruptcy in the face of nearly $100 billion sought in settlements following recent fires. By comparison, a DFA device costs only $15,000, according to the release.

"DFA is a new tool, allowing utilities to transform their operating procedures to find and fix problems before catastrophic failures." Russell says in the release. "Utilities operators need real time situational awareness of the health of their circuits…..DFA does that."


How Houston innovators played a role in the historic Artemis II splashdown

safe landing

Research from Rice University played a critical role in the safe return of U.S. astronauts aboard NASA’s Artemis II mission this month.

Rice mechanical engineer Tayfun E. Tezduyar and longtime collaborator Kenji Takizawa developed a key computational parachute fluid-structure interaction (FSI) analysis system that proved vital to NASA's Orion capsule's descent into the Pacific Ocean. The FSI system, originally developed in 2013 alongside NASA's Johnson Space Center, was critical to Orion's three-parachute design, which slowed the capsule as it returned to Earth, according to Rice.

The model helped ensure that the parachute design was large enough to slow the capsule for a safe landing while also being stable enough to prevent the capsule from oscillating as it descended.
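Rice's release doesn't give the sizing math, but the trade-off above follows from the standard drag balance: at steady descent, drag equals weight, so descent speed falls as total canopy area grows. In schematic form (with air density ρ, drag coefficient C_D, canopy area A, and capsule mass m):

```latex
% Steady descent: drag balances weight, so larger canopies mean slower descent.
\frac{1}{2}\rho v^{2} C_{D} A = mg
\quad\Longrightarrow\quad
v_{\text{descent}} = \sqrt{\frac{2mg}{\rho\, C_{D} A}}
```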

“You cannot separate the aerodynamics from the structural dynamics,” Tezduyar said in a news release. “They influence each other continuously and even more so for large spacecraft parachutes, so the analysis must capture that interaction in a robustly coupled way.”
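The actual Rice/NASA analysis uses sophisticated finite element methods far beyond a sketch, but the "robustly coupled" alternation Tezduyar describes can be illustrated with a toy, runnable stand-in: the fluid side sets the descent speed for a given canopy area, the structure side lets the canopy respond to the aerodynamic load, and the two are sub-iterated until they agree. Every number below is assumed for illustration only:

```python
# Toy, runnable stand-in for coupled fluid-structure interaction; NOT the
# Rice/NASA FSI system (which resolves full airflow and fabric mechanics).
# Fluid side: drag depends on canopy area. Structure side: canopy area
# responds to dynamic pressure. Sub-iterate until the two fields agree.
import math

M, G, RHO, CD = 9_300.0, 9.81, 1.225, 0.8  # assumed capsule mass (kg), gravity, air density, drag coeff
A_REF, COMPLIANCE = 700.0, 5e-5            # assumed relaxed canopy area (m^2) and fabric compliance

def canopy_area(q):            # "structure": canopy deforms slightly under load q
    return A_REF / (1.0 + COMPLIANCE * q)

def descent_velocity(area):    # "fluid": steady descent speed for a given area
    return math.sqrt(2 * M * G / (RHO * CD * area))

v = 100.0                                  # initial guess, m/s
for iteration in range(100):               # coupled sub-iterations
    q = 0.5 * RHO * v * v                  # dynamic pressure at current speed
    area = canopy_area(q)                  # fabric responds to the load
    new_v = descent_velocity(area)         # airflow responds to the new shape
    if abs(new_v - v) < 1e-9:              # converged: fields agree
        break
    v = new_v

print(f"converged after {iteration} iterations: v = {v:.1f} m/s, area = {area:.0f} m^2")
```

Solving the fluid or the structure in isolation would miss this feedback loop; that interdependence is why the analysis must capture the interaction in a coupled way.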

The result was a final parachute system, refined through NASA drop tests and Rice’s computational FSI analysis, that eliminated fluctuations and produced a stable descent profile.

Apart from the dynamic challenges in design, modeling Orion’s parachutes also required solving complex equations that considered airflow and fabric deformation and accounted for features like ringsail canopy construction and aerodynamic interactions among multiple parachutes in a cluster.
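In general terms, and as a schematic statement rather than the specific formulation used at Rice, such a system couples the incompressible Navier-Stokes equations for the air with the structural dynamics of the fabric, with velocities and tractions matched at the canopy surface:

```latex
% Fluid (air): incompressible Navier-Stokes
\rho_f \left( \frac{\partial \mathbf{u}}{\partial t}
  + \mathbf{u} \cdot \nabla \mathbf{u} \right)
  = -\nabla p + \mu \nabla^{2} \mathbf{u},
\qquad \nabla \cdot \mathbf{u} = 0

% Structure (fabric membrane): driven by internal stress and fluid traction
\rho_s h \, \frac{\partial^{2} \mathbf{d}}{\partial t^{2}}
  = \nabla \cdot \boldsymbol{\sigma}_s(\mathbf{d}) + \mathbf{t}_{\text{fluid}}

% Interface condition on the canopy surface: kinematic compatibility
\mathbf{u} = \frac{\partial \mathbf{d}}{\partial t}
  \quad \text{on } \Gamma_{\text{canopy}}
```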

“Essentially, my entire group was dedicated to that work, because I considered it a national priority,” Tezduyar added in the release. “Kenji and I were personally involved in every computer simulation. Some of the best graduate students and research associates I met in my career worked on the project, creating unique, first-of-its-kind parachute computer simulations, one after the other.”

Current Intuitive Machines engineer Mario Romero also worked on Orion during his time at NASA. From 2018 to 2021, Romero was a member of the Orion Crew Capsule Recovery Team, which focused on creating likely scenarios that crewmembers could encounter in Orion.

The team trained in NASA’s 6.2-million-gallon pool, using wave machines to replicate a range of sea conditions. They also simulated worst-case scenarios by cutting the lights, blasting high-powered fans and tipping a mock capsule to mimic distress situations. In some drills, mock crew members were treated as “injured,” requiring the team to practice safe, controlled egress procedures.

“It’s hard to find the appropriate descriptors that can fully encapsulate the feeling of getting to witness all the work we, and everyone else, did being put into action,” Romero tells InnovationMap. “I loved seeing the reactions of everyone, but especially of the Houston communities—that brought me a real sense of gratitude and joy.”

Intuitive Machines was also selected to support the Artemis II mission using its Space Data Network and ground station infrastructure. The company monitored radio signals sent from the Orion spacecraft and used Doppler measurements to help determine the spacecraft's precise position and speed.
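The company hasn't published its tracking software, but the underlying physics is compact: motion along the line of sight shifts the received frequency in proportion to radial velocity. A minimal sketch, assuming a one-way, non-relativistic measurement on a notional 2.2 GHz S-band downlink (a real system would use coherent two-way Doppler and far more careful modeling):

```python
# Hypothetical illustration of Doppler range-rate estimation; NOT Intuitive
# Machines' navigation software. One-way, non-relativistic approximation:
# the received frequency shifts in proportion to radial velocity.
C = 299_792_458.0  # speed of light, m/s

def radial_velocity(f_transmit_hz: float, f_receive_hz: float) -> float:
    """Range-rate along the line of sight, m/s (positive = receding)."""
    return -C * (f_receive_hz - f_transmit_hz) / f_transmit_hz

# Example: an assumed 2.2 GHz downlink received 22 kHz low implies the
# spacecraft is receding at roughly 3 km/s.
print(radial_velocity(2.2e9, 2.2e9 - 22_000))  # ~2998 m/s
```

Many such measurements from ground stations at known locations, accumulated over time, are what let navigators pin down a spacecraft's position and speed.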

Tim Crain, Chief Technology Officer at Intuitive Machines, wrote about the experience last week.

"I specialized in orbital mechanics and deep space navigation in graduate school,” Crain shared. “But seeing the theory behind tracking spacecraft come to life as they thread through planetary gravity fields on ultra-precise trajectories still seems like magic."

UH breakthrough moves superconductivity closer to real-world use

Energy Breakthrough

University of Houston researchers have set a new benchmark in the field of superconductivity.

Researchers from the UH physics department and the Texas Center for Superconductivity (TcSUH) have broken the transition temperature record for superconductivity at ambient pressure. The accomplishment could lead to more efficient ways to generate, transmit and store energy, which researchers believe could improve power grids, medical technologies and energy systems by enabling electricity to flow without resistance, according to a release from UH.

To break the record, UH researchers achieved a transition temperature of 151 Kelvin, the highest ever recorded at ambient pressure since the discovery of superconductivity in 1911.

The transition temperature is the temperature below which a material becomes superconducting, allowing electricity to flow through it without resistance. Scientists have been working for decades to push transition temperatures closer to room temperature, which would make superconducting technologies more practical and affordable.

Currently, most superconductors must be cooled to extremely low temperatures, making them expensive and difficult to operate.

UH physicists Ching-Wu Chu and Liangzi Deng published the research in the Proceedings of the National Academy of Sciences earlier this month. It was funded by Intellectual Ventures and the state of Texas via TcSUH, as well as by other foundations. In 1987, Chu, founding director and chief scientist at TcSUH, made the breakthrough discovery that the material YBCO becomes superconducting at 93 Kelvin, which helped set off a global race to develop high-temperature superconductors.

“Transmitting electricity in the grid loses about 8% of the electricity,” Chu, who’s also a professor of physics at UH and the paper’s senior author, said in a news release. “If we conserve that energy, that’s billions of dollars of savings and it also saves us lots of effort and reduces environmental impacts.”
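As a back-of-the-envelope check on the 8 percent figure, using assumed round numbers rather than figures from UH or the release:

```python
# Rough illustration of what an 8% transmission loss could be worth.
# Both inputs are assumed round numbers, not from UH or the release.
annual_us_generation_twh = 4_000    # assumed: rough U.S. annual generation
price_per_kwh = 0.10                # assumed: rough average price, dollars
lost_twh = 0.08 * annual_us_generation_twh
lost_dollars = lost_twh * 1e9 * price_per_kwh   # 1 TWh = 1e9 kWh
print(f"{lost_twh:.0f} TWh lost, roughly ${lost_dollars / 1e9:.0f} billion per year")
```

With those assumptions, the losses land in the tens of billions of dollars per year, consistent with the "billions of dollars of savings" Chu describes.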

Chu and his team used a technique known as pressure quenching, adapted from techniques used to create diamonds. With pressure quenching, researchers first apply intense pressure to the material to enhance its superconducting properties and raise its transition temperature; the sample is then cooled before the pressure is released, locking those pressure-enhanced properties in at ambient pressure.

Next, researchers are targeting ambient-pressure, room-temperature superconductivity of around 300 K. In a companion PNAS paper, Chu and Deng point to pressure quenching as a promising approach to help bridge the gap between current results and that goal.

“Room-temperature superconductivity has been seen as a ‘holy grail’ by scientists for over a century,” Rohit Prasankumar, director of superconductivity research at Intellectual Ventures, said in the release. “The UH team’s result shows that this goal is closer than ever before. However, the distance between the new record set in this study and room temperature is still about 140 C. Closing this gap will require concerted, intentional efforts by the broader scientific community, including materials scientists, chemists, and engineers, as well as physicists.”
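The "about 140 C" in that quote follows directly from the numbers, since a one-kelvin step is the same size as a one-degree-Celsius step. Taking room temperature as roughly 293 K:

```latex
T_{\text{room}} - T_{c} \approx 293\,\mathrm{K} - 151\,\mathrm{K}
  = 142\,\mathrm{K} \approx 140\ \text{Celsius degrees of remaining gap}
```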

---

This article originally appeared on EnergyCapitalHTX.com.