Rice University's Lei Li has been awarded a $550,000 NSF CAREER Award to develop wearable, hospital-grade medical imaging technology. Photo by Jeff Fitlow/Courtesy Rice University

Another Houston scientist has won one of the highly competitive National Science Foundation (NSF) CAREER Awards.

Lei Li, an assistant professor of electrical and computer engineering at Rice University, has received a $550,000, five-year grant to develop wearable, hospital-grade medical imaging technology capable of visualizing deep tissue function in real-time, according to the NSF. The CAREER grants are given to "early career faculty members who demonstrate the potential to serve as academic models and leaders in research and education."

“This is about giving people access to powerful diagnostic tools that were once confined to hospitals,” Li said in a news release from Rice. “If we can make imaging affordable, wearable and continuous, we can catch disease earlier and treat it more effectively.”

Li’s research focuses on photoacoustic imaging, which merges light and sound to produce high-resolution images of structures deep inside the body. Pulses of laser light are absorbed by tissue, causing a rapid, slight temperature rise. The heated tissue expands minutely, generating ultrasound waves that travel back to the surface, where they are detected and converted into an image. The technique yields detailed images without the dyes or contrast agents used in some traditional ultrasound imaging.
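The light-to-sound conversion described above can be sketched with the textbook photoacoustic relation p0 = Γ · μa · F (initial pressure rise = Grüneisen parameter × optical absorption coefficient × laser fluence). The values below are generic soft-tissue figures for illustration, not measurements from Li’s lab:

```python
def initial_pressure_kpa(grueneisen: float, mu_a_per_cm: float,
                         fluence_mj_per_cm2: float) -> float:
    """Initial acoustic pressure rise p0 = Gamma * mu_a * F.

    With mu_a in 1/cm and fluence in mJ/cm^2, the product comes out
    in mJ/cm^3, which is numerically equal to kPa.
    """
    return grueneisen * mu_a_per_cm * fluence_mj_per_cm2

# Illustrative values: Grüneisen parameter ~0.2 for soft tissue,
# absorption ~2/cm, and a 20 mJ/cm^2 laser pulse (a common
# skin-exposure safety-limit figure).
print(initial_pressure_kpa(0.2, 2.0, 20.0))  # → 8.0 (kPa)
```

A pressure rise of a few kilopascals is what typical photoacoustic detectors are built to pick up, which is why even a faint laser pulse can produce a usable ultrasound signal.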

However, current photoacoustic systems tend to rely on arrays of hundreds of sensors, making them bulky, expensive and impractical for everyday use. Li and his team are taking a different approach.

Instead of using hundreds of separate sensors, Li and his researchers are developing a method that allows a single sensor to capture the same information via a specially designed encoder. The encoder assigns a unique spatiotemporal signature to each incoming sound wave. A reconstruction algorithm then interprets and decodes the signals.
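A toy version of that single-sensor idea: an encoder mixes the signals from many source locations into one measurement channel, with each location stamped by a distinct signature, and a decoder recovers the scene from the known signatures. The encoder design, dimensions and least-squares decoding below are illustrative stand-ins, not Li’s actual system:

```python
import numpy as np

rng = np.random.default_rng(0)

# Each of n_sources locations gets a unique "spatiotemporal signature"
# (a column of A); the single sensor records only their mixture y.
n_sources, n_samples = 8, 64
A = rng.standard_normal((n_samples, n_sources))  # signature matrix (toy)

x_true = np.zeros(n_sources)
x_true[[2, 5]] = [1.0, -0.5]       # two active sources in the scene

y = A @ x_true                      # one-channel recording of the mixture

# Reconstruction: decode the single channel using the known signatures.
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)

print(np.allclose(x_hat, x_true))  # → True (the toy scene is recovered)
```

The recovery works here because the random signatures are linearly independent; the real engineering challenge Li’s team faces is building a physical encoder whose signatures stay distinguishable for acoustic waves arriving from deep tissue.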

These advances have the potential to lower the size, cost and power consumption of imaging systems. The researchers believe the device could be used in telemedicine, remote diagnostics and real-time disease monitoring. Li’s lab will also collaborate with clinicians to explore how the miniaturized technology could help monitor cancer treatment and other conditions.

“Reducing the number of detection channels from hundreds to one could shrink these devices from bench-top systems into compact, energy-efficient wearables,” Li said in the release. “That opens the door to continuous health monitoring in daily life—not just in hospitals.”

Amanda Marciel, the William Marsh Rice Trustee Chair of chemical and biomolecular engineering and an assistant professor at Rice, received an NSF CAREER Award last year.

A team at Rice University is designing wearable technology that can be used for navigation for users with visual and auditory impairments. Photo by Brandon Martin/Rice University

Rice team develops complex wearables that can navigate users through Houston

hi, tech

A group of Rice researchers has tapped into the sense of touch to improve how wearable technology communicates with its user.

Barclay Jumet, a mechanical engineering PhD student at Rice working in the labs of Daniel Preston and Marcia O’Malley, published the findings in the August issue of “Device.” The study outlines the group’s new system of haptic accessories, which relies on fluidic control rather than electrical inputs to signal or simulate touch to the wearer. The research was supported by the National Science Foundation, the Rice University Academy of Fellows, and the Gates Millennium Scholars Program.

The accessories include a belt and textile sleeves, which deliver haptic cues like vibration, tapping and squeezing through pressure generated by a lightweight carbon dioxide tank attached to the belt. The sleeve contains up to six quarter-sized pouches that inflate with varying force and frequency, depending on what is being communicated to the wearer.

Marcia O'Malley (from left), Barclay Jumet and Daniel Preston developed a wearable textile device that can deliver complex haptic cues in real time to users on the go. Photo by Brandon Martin/Rice University

The team says the wearables could serve users with visual and auditory impairments while offering a slimmed-down design compared to bulkier haptic devices. The wearables are also washable and repairable, making them practical for everyday use.

To test the system's usability, the team guided a user on a mile-long route through Houston, signaling haptic cues for forward, backward, left or right through the devices.

“In the future, this technology could be directly integrated with navigational systems, so that the very textiles making up one’s clothing can tell users which way to go without taxing their already overloaded visual and auditory senses—for instance by needing to consult a map or listen to a virtual assistant,” Jumet said in a release from Rice.

O’Malley, chair of the Department of Mechanical Engineering, said the system could also work in tandem with cochlear implants and make lip reading easier in noisy environments by directing users toward sources of sound.

Jumet also sees uses outside of the medical space.

“Instead of a smart watch with simple vibrational cues, we can now envision a ‘smart shirt’ that gives the sensation of a stroking hand or a soft tap on the torso or arm,” he said in the release. “Movies, games and other forms of entertainment could now incorporate the sense of touch, and virtual reality can be more comfortable for longer periods of time.”




Houston VC funding surged nearly 50% in Q1 2026, report says

VC victories

First-quarter venture capital funding for Houston-area startups climbed nearly 50 percent compared to the same time last year, according to the PitchBook-NVCA Venture Monitor.

In Q1 2026, Houston-area startups raised $532.3 million, a 49 percent jump from the $320.2 million raised in Q1 2025.

However, the Q1 total fell 23 percent from the $671.05 million raised in Q4 2025.

Among the first-quarter funding highlights in Houston were:

  • Utility Global, which focuses on industrial decarbonization, announced a first close of $100 million for its Series D round.
  • Sage Geosystems raised a $97 million Series B round to support its geothermal energy storage technology.

Those funding rounds underscore Houston’s evolution as a magnet for VC in the energy sector.

“Today, the energy sector is increasingly extending into the startup economy as venture capital flows into companies developing the technologies that will shape the future of global energy,” the Greater Houston Partnership says.

The energy industry accounted for nearly 40 percent of Houston-area VC funding last year, according to market research and lead generation service Growth List.

Adding to Houston’s stature in VC for energy startups are investors like Chevron Technology Ventures, the investment arm of Houston-based oil and gas giant Chevron; Goose Capital; Mercury Fund; and Quantum Energy Partners.

How Houston innovators played a role in the historic Artemis II splashdown

safe landing

Research from Rice University played a critical role in the safe return of U.S. astronauts aboard NASA’s Artemis II mission this month.

Rice mechanical engineer Tayfun E. Tezduyar and longtime collaborator Kenji Takizawa developed a computational fluid-structure interaction (FSI) analysis system for parachutes that proved vital to the safe descent of NASA’s Orion capsule into the Pacific Ocean. The FSI system, originally developed in 2013 in collaboration with NASA’s Johnson Space Center, was critical to Orion’s three-parachute design, which slowed the capsule as it returned to Earth, according to Rice.

The model helped ensure that the parachute design was large enough to slow the capsule for a safe landing while also being stable enough to prevent the capsule from oscillating as it descended.
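For a rough sense of the sizing problem described above, the steady descent rate follows from balancing parachute drag against the capsule’s weight, v = sqrt(2mg / (ρ · Cd · A)). The mass, drag coefficient and canopy area below are round illustrative numbers for a capsule of roughly Orion’s class, not NASA’s actual design parameters:

```python
import math

def steady_descent_speed(mass_kg: float, drag_coeff: float,
                         canopy_area_m2: float,
                         g: float = 9.81, rho: float = 1.225) -> float:
    """Descent speed where drag equals weight: 0.5*rho*Cd*A*v^2 = m*g."""
    return math.sqrt(2 * mass_kg * g / (rho * drag_coeff * canopy_area_m2))

# Round numbers: ~9,300 kg capsule, Cd ~0.8, and ~2,950 m^2 of total
# canopy area across three main parachutes.
v = steady_descent_speed(9300, 0.8, 2950)
print(round(v, 1))  # → 7.9 (m/s), i.e. under 20 mph at splashdown
```

The back-of-the-envelope result shows why canopy area matters so much: halving the area raises the splashdown speed by about 40 percent, while the oscillation behavior the FSI model captures does not appear in a static balance like this at all.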

“You cannot separate the aerodynamics from the structural dynamics,” Tezduyar said in a news release. “They influence each other continuously and even more so for large spacecraft parachutes, so the analysis must capture that interaction in a robustly coupled way.”

The result was a parachute system, refined through NASA drop tests and Rice’s computational FSI analysis, that eliminated fluctuations and produced a stable descent profile.

Beyond those dynamic design challenges, modeling Orion’s parachutes also required solving complex equations governing airflow and fabric deformation, while accounting for features like the ringsail canopy construction and the aerodynamic interactions among multiple parachutes in a cluster.

“Essentially, my entire group was dedicated to that work, because I considered it a national priority,” Tezduyar added in the release. “Kenji and I were personally involved in every computer simulation. Some of the best graduate students and research associates I met in my career worked on the project, creating unique, first-of-its-kind parachute computer simulations, one after the other.”

Current Intuitive Machines engineer Mario Romero also worked on Orion during his time at NASA. From 2018 to 2021, Romero was a member of the Orion Crew Capsule Recovery Team, which focused on simulating scenarios that crew members could encounter in Orion after splashdown.

The team trained in NASA’s 6.2-million-gallon pool, using wave machines to replicate a range of sea conditions. They also simulated worst-case scenarios by cutting the lights, blasting high-powered fans and tipping a mock capsule to mimic distress situations. In some drills, mock crew members were treated as “injured,” requiring the team to practice safe, controlled egress procedures.

“It’s hard to find the appropriate descriptors that can fully encapsulate the feeling of getting to witness all the work we, and everyone else, did being put into action,” Romero tells InnovationMap. “I loved seeing the reactions of everyone, but especially of the Houston communities—that brought me a real sense of gratitude and joy.”

Intuitive Machines was also selected to support the Artemis II mission using its Space Data Network and ground station infrastructure. The company monitored radio signals sent from the Orion spacecraft and used Doppler measurements to help determine the spacecraft's precise position and speed.
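The Doppler technique mentioned above rests on a simple relationship: a spacecraft receding from a ground station shifts its downlink frequency down by Δf = f0 · v/c (non-relativistic, one-way). The carrier frequency and velocity below are illustrative, not Intuitive Machines’ actual link parameters:

```python
C = 299_792_458.0  # speed of light, m/s

def radial_velocity(f_observed: float, f_transmitted: float) -> float:
    """Recover line-of-sight velocity (m/s) from the Doppler shift.

    Positive result = receding (observed frequency below transmitted).
    """
    return C * (f_transmitted - f_observed) / f_transmitted

f0 = 2.2e9                      # S-band carrier, Hz (illustrative)
v_true = 3_000.0                # spacecraft receding at 3 km/s
f_rx = f0 * (1 - v_true / C)    # what the ground station would measure

print(radial_velocity(f_rx, f0))  # ≈ 3000.0, recovering v_true
```

A single station only sees the line-of-sight component, which is why measurements from multiple ground stations, combined over time, are needed to pin down the full position and velocity of the spacecraft.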

Tim Crain, chief technology officer at Intuitive Machines, wrote about the experience last week.

“I specialized in orbital mechanics and deep space navigation in graduate school,” Crain shared. “But seeing the theory behind tracking spacecraft come to life as they thread through planetary gravity fields on ultra-precise trajectories still seems like magic.”