Research from Baylor College of Medicine and the Jan and Dan Duncan Neurological Research Institute at Texas Children’s Hospital will help develop targeted treatments for individuals with auditory disorders. Photo via Getty Images.

Researchers at Baylor College of Medicine and the Jan and Dan Duncan Neurological Research Institute at Texas Children’s Hospital have successfully mapped which cell populations are responsible for processing different types of sounds.

Working with a team at Oregon Health & Science University, the Houston scientists have classified where in the cochlear nucleus the brain processes various sounds, including speech and music. The research was published in Nature Communications.

“Understanding these cell types and how they function is essential in advancing treatments for auditory disorders,” Matthew McGinley, assistant professor of neuroscience at Baylor, said in a release. “Think of how muscle cells in the heart are responsible for contraction, while valve cells control blood flow. The auditory brainstem operates in a similar fashion — different cell types respond to distinct aspects of sound.”

Though scientists have long thought that there are distinct types of cells in the cochlear nucleus, they didn’t have tools to distinguish them until now.

Lead author on the study, Xiaolong Jiang, associate professor of neuroscience at Baylor, added: “This study not only confirms many of the cell types we anticipated, but it also unveils entirely new ones, challenging long-standing principles of hearing processing in the brain and offering fresh avenues for therapeutic exploration.”

Jiang and his team have built a comprehensive cellular and molecular atlas of the cochlear nucleus, which will help create more targeted and more effective treatments for patients with hearing loss.

To build the atlas, the team used single-nucleus RNA sequencing, which made it possible to define neuronal populations at the molecular level, and patch sequencing, which enabled phenotypic categorization of the cells.

This is a watershed moment for the development of targeted treatments for individuals with auditory disorders, including those with impaired function in the auditory nerve, for whom cochlear implants don’t work.

“If we can understand what each cell type is responsible for, and with the identification of new subtypes of cells, doctors can potentially develop treatments that target specific cells with greater accuracy,” McGinley explains. “These findings, thanks to the work of our collaborative team, make a significant step forward in the field of auditory research and get us closer to a more personalized treatment for each patient.”



Houston researcher builds radar to make self-driving cars safer

eyes on the road

A Rice University researcher is giving autonomous vehicles an “extra set of eyes.”

Current autonomous vehicles (AVs) can have an incomplete view of their surroundings, and challenges like pedestrian movement, low-light conditions and adverse weather only compound these visibility limitations.

Kun Woo Cho, a postdoctoral researcher in the lab of Rice professor of electrical and computer engineering Ashutosh Sabharwal, has developed EyeDAR to help address such issues and enhance the vehicles’ sensing accuracy. Her research was supported in part by the National Science Foundation.

EyeDAR is an orange-sized, low-power, millimeter-wave radar that could be placed at streetlights and intersections. Its design was inspired by that of the human eye. Researchers envision that the low-cost sensors could help ensure that AVs always pick up on emerging obstacles, even when the vehicles are not within proper range for their onboard sensors and when visibility is limited.

“Current automotive sensor systems like cameras and lidar struggle with poor visibility such as you would encounter due to rain or fog or in low-lighting conditions,” Cho said in a news release. “Radar, on the other hand, operates reliably in all weather and lighting conditions and can even see through obstacles.”

Signals from a typical radar system scatter when they encounter an obstacle. Some of the signal is reflected back to the source, but most of it is often lost. In the case of AVs, this means that "pedestrians emerging from behind large vehicles, cars creeping forward at intersections or cyclists approaching at odd angles can easily go unnoticed," according to Rice.

EyeDAR, however, works to capture lost radar reflections, determine their direction and report them back to the AV in a sequence of 0s and 1s.

“Like blinking Morse code,” Cho added. “EyeDAR is a talking sensor: it is a first instance of integrating radar sensing and communication functionality in a single design.”
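To make the "blinking Morse code" idea concrete, here is a minimal sketch of how a direction could, in principle, be reported as a fixed-width sequence of 0s and 1s. This is purely illustrative: the quantization scheme, bit width, and function names below are assumptions for demonstration, not EyeDAR's actual signaling protocol.

```python
# Hypothetical illustration of reporting a detected obstacle's direction
# as a bit string, in the spirit of the article's description. This is
# NOT EyeDAR's real encoding, which is not described in the article.

def encode_direction(azimuth_deg: float, bits: int = 8) -> str:
    """Quantize an azimuth in [0, 360) degrees into a fixed-width bit string."""
    if not 0 <= azimuth_deg < 360:
        raise ValueError("azimuth must be in [0, 360)")
    levels = 2 ** bits                       # number of angular bins
    code = int(azimuth_deg / 360 * levels)   # bin index for this direction
    return format(code, f"0{bits}b")         # zero-padded binary string

def decode_direction(bit_string: str) -> float:
    """Recover the center of the angular bin encoded by the bit string."""
    levels = 2 ** len(bit_string)
    bin_width = 360 / levels
    return (int(bit_string, 2) + 0.5) * bin_width

# A target due right of the sensor (90 degrees) becomes "01000000"
# with 8 bits; decoding recovers the bin center near 90 degrees.
message = encode_direction(90.0)
recovered = decode_direction(message)
```

With 8 bits the receiver recovers the direction to within about 1.4 degrees (360/256), which shows the basic trade-off in any such scheme: more bits per message buy finer angular resolution.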

In testing, EyeDAR resolved target directions 200 times faster than conventional radar designs.

While EyeDAR currently targets risks associated with AVs, particularly in high-traffic urban areas, researchers also believe the technology behind it could complement artificial intelligence efforts and be integrated into robots, drones and wearable platforms.

“EyeDAR is an example of what I like to call ‘analog computing,’” Cho added in the release. “Over the past two decades, people have been focusing on the digital and software side of computation, and the analog, hardware side has been lagging behind. I want to explore this overlooked analog design space.”

12 winners named at CERAWeek clean tech pitch competition in Houston

top teams

Twelve teams from around the country, including several from Houston, took home top honors at this year's Energy Venture Day and Pitch Competition at CERAWeek.

The fast-paced event, held March 25 and put on by the Rice Alliance, the Houston Energy Transition Initiative and TEX-E, invited 36 industry startups and five Texas-based student teams focused on driving efficiency and advancements in the energy transition to present 3.5-minute pitches before investors and industry partners during CERAWeek's Agora program.

The competition is a qualifying event for the Startup World Cup, where teams compete for a $1 million investment prize.

PolyJoule won in the Track C competition and was named the overall winner of the pitch event. The Boston-based company will go on to compete in the Startup World Cup held this fall in San Francisco.

PolyJoule was spun out of MIT and is developing conductive polymer battery technology for energy storage.

Rice University's Resonant Thermal Systems won the second-place prize and $15,000 in the student track, known as TEX-E. The team's STREED solution converts high-salinity water into fresh water while recovering valuable minerals.

Teams from the University of Texas won first and second place in the TEX-E competition, bringing home $25,000 and $10,000, respectively. The student winners were:

Companies that pitched in the three industry tracks competed for non-monetary awards. Here are the companies named "most promising" by the judges:

Track A | Industrial Efficiency & Decarbonization

Track B | Advanced Manufacturing, Materials, & Other Advanced Technologies

  • First: Licube, based in Houston
  • Second: ZettaJoule, based in Houston and Maryland
  • Third: Oleo

Track C | Innovations for Traditional Energy, Electricity, & the Grid

The teams at this year's Energy Venture Day have collectively raised $707 million in funding, according to Rice. They represent six countries and 12 states. See the full list of companies and investor groups that participated here.

---

This article originally appeared on our sister site, EnergyCapitalHTX.com.