Kirsten Adam, a Rice psychologist, is studying how the brain refocuses in the age of screens, instant gratification and other lingering distractions. Photo via Pexels.

Rice University psychologist Kirsten Adam has received a $600,000 National Science Foundation CAREER Award to research how visual distractions like phone notifications, flashing alerts, crowded screens and busy workspaces can negatively impact focus—and how the brain works to try to regain it.

The highly competitive five-year NSF grants are given to early-career faculty members with the potential to serve as academic models and leaders in research and education. Adam’s work aims to clarify how the brain refocuses in the age of screens, instant gratification and lingering distractions. The funding will also be used to train graduate students in advanced cognitive neuroscience methods, expand access to electroencephalography (EEG) and support public data sharing.

“Kirsten is a valued member of the School of Social Sciences, and we are thrilled that she has been awarded the prestigious NSF CAREER,” Rachel Kimbro, dean of social sciences, said in a news release. “Because distractions continue to increase all around us, her research is timely and imperative to understanding their widespread impacts on the human brain.”

In Adam’s lab, participants complete simplified visual search tasks while their brain activity is recorded using EEG, allowing researchers to measure attention shifts in real time. The recordings capture the moment attention is pulled away from a goal and how much effort it takes to refocus.

According to Rice, Adam’s work will test long-standing theories about distraction. The research is meant to have real-world implications for jobs and aspects of everyday life where attention to detail is key, including medical imaging, airport security screening and even driving.

“At any given moment, there’s far more information in the world than our brains can process,” Adam added in the release. “Attention is what determines what reaches our awareness and what doesn’t.”

Additionally, the research could inform the design of new technologies that would support focus and decision-making, according to Rice.

“We’re not trying to make attention limitless,” Adam added. “We’re trying to understand how it actually works, so we can stop designing environments and expectations that fight against it.”

Rice University's Lei Li has been awarded a $550,000 NSF CAREER Award to develop wearable, hospital-grade medical imaging technology. Photo by Jeff Fitlow/Courtesy Rice University

Rice University professor earns $550k NSF award for wearable imaging tech

science supported

Another Houston scientist has won one of the highly competitive National Science Foundation (NSF) CAREER Awards.

Lei Li, an assistant professor of electrical and computer engineering at Rice University, has received a $550,000, five-year grant to develop wearable, hospital-grade medical imaging technology capable of visualizing deep tissue function in real time, according to the NSF. The CAREER grants are given to "early career faculty members who demonstrate the potential to serve as academic models and leaders in research and education."

“This is about giving people access to powerful diagnostic tools that were once confined to hospitals,” Li said in a news release from Rice. “If we can make imaging affordable, wearable and continuous, we can catch disease earlier and treat it more effectively.”

Li’s research focuses on photoacoustic imaging, which merges light and sound to produce high-resolution images of structures deep inside the body. It relies on pulses of laser light that are absorbed by tissue, causing a rapid temperature rise. The heat makes the tissue expand slightly, generating ultrasound waves that travel back to the surface, where they are detected and converted into an image. The process can yield more detailed images without the dyes or contrast agents used in some traditional ultrasound imaging.

However, current photoacoustic systems tend to rely on hundreds of individual sensors, making them bulky, expensive and impractical for everyday use. Li and his team are taking a different approach.

Instead of using hundreds of separate sensors, Li and his researchers are developing a method that allows a single sensor to capture the same information via a specially designed encoder. The encoder assigns a unique spatiotemporal signature to each incoming sound wave. A reconstruction algorithm then interprets and decodes the signals.
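The idea of one sensor standing in for many can be illustrated with a toy simulation. This is only a sketch of the general principle, not Li's actual method: a hypothetical encoder tags each source location with a unique signature waveform, a single detector records their sum, and a least-squares solve recovers the per-source amplitudes from that one mixed trace.

```python
# Toy sketch of single-sensor encoded sensing (illustrative only, not the
# actual Rice design): each of several sources is tagged with a distinct
# signature waveform, one detector records the mixture, and least squares
# decodes the individual amplitudes.
import numpy as np

rng = np.random.default_rng(0)
n_sources, n_samples = 4, 256

# Hypothetical encoder: a distinct random signature per source location.
signatures = rng.standard_normal((n_samples, n_sources))

true_amplitudes = np.array([0.8, 0.0, 1.5, 0.3])  # what we want to image
single_sensor_trace = signatures @ true_amplitudes  # one mixed recording

# "Reconstruction algorithm": invert the known encoding.
recovered, *_ = np.linalg.lstsq(signatures, single_sensor_trace, rcond=None)

print(np.allclose(recovered, true_amplitudes))  # True (noiseless case)
```

In this noiseless toy case the decoding is exact; a real system must also contend with noise, hardware constraints and far larger channel counts.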

These advances have the potential to lower the size, cost and power consumption of imaging systems. The researchers believe the device could be used in telemedicine, remote diagnostics and real-time disease monitoring. Li’s lab will also collaborate with clinicians to explore how the miniaturized technology could help monitor cancer treatment and other conditions.

“Reducing the number of detection channels from hundreds to one could shrink these devices from bench-top systems into compact, energy-efficient wearables,” Li said in the release. “That opens the door to continuous health monitoring in daily life—not just in hospitals.”

Amanda Marciel, the William Marsh Rice Trustee Chair of chemical and biomolecular engineering and an assistant professor at Rice, received an NSF CAREER Award last year. Read more here.



Houston researcher builds radar to make self-driving cars safer

eyes on the road

A Rice University researcher is giving autonomous vehicles an “extra set of eyes.”

Current autonomous vehicles (AVs) can have an incomplete view of their surroundings, and challenges like pedestrian movement, low-light conditions and adverse weather only compound these visibility limitations.

Kun Woo Cho, a postdoctoral researcher in the lab of Rice professor of electrical and computer engineering Ashutosh Sabharwal, has developed EyeDAR to help address such issues and enhance the vehicles’ sensing accuracy. Her research was supported in part by the National Science Foundation.

EyeDAR is an orange-sized, low-power, millimeter-wave radar that could be mounted at streetlights and intersections. Its design was inspired by the human eye. Researchers envision that the low-cost sensors could help AVs reliably pick up on emerging obstacles, even when those obstacles are outside the range of the vehicles' onboard sensors or when visibility is limited.

“Current automotive sensor systems like cameras and lidar struggle with poor visibility such as you would encounter due to rain or fog or in low-lighting conditions,” Cho said in a news release. “Radar, on the other hand, operates reliably in all weather and lighting conditions and can even see through obstacles.”

Signals from a typical radar system scatter when they encounter an obstacle. Some of the signal is reflected back to the source, but most of it is often lost. In the case of AVs, this means that "pedestrians emerging from behind large vehicles, cars creeping forward at intersections or cyclists approaching at odd angles can easily go unnoticed," according to Rice.

EyeDAR, however, works to capture lost radar reflections, determine their direction and report them back to the AV in a sequence of 0s and 1s.

“Like blinking Morse code,” Cho added. “EyeDAR is a talking sensor; it is a first instance of integrating radar sensing and communication functionality in a single design.”
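The "blinking Morse code" description can be made concrete with a toy example. This is a hypothetical sketch of the general idea, not the actual EyeDAR design: the sensor quantizes an obstacle's direction into a short binary codeword and "blinks" it back, and the receiving vehicle decodes the bits into an angle estimate.

```python
# Toy sketch (illustrative assumption, not the real EyeDAR protocol):
# encode an obstacle's direction as a short bit sequence, then decode it.
N_BITS = 8  # hypothetical resolution: 360 degrees / 256 levels ~ 1.4 deg/step

def encode_direction(angle_deg: float) -> list[int]:
    """Quantize an angle (0-360 degrees) into an N_BITS binary codeword."""
    level = int(angle_deg % 360 / 360 * 2**N_BITS) % 2**N_BITS
    return [(level >> b) & 1 for b in reversed(range(N_BITS))]

def decode_direction(bits: list[int]) -> float:
    """Recover the angle (bin center) from the received bit sequence."""
    level = int("".join(map(str, bits)), 2)
    return (level + 0.5) * 360 / 2**N_BITS

bits = encode_direction(137.0)
print(bits)                              # [0, 1, 1, 0, 0, 0, 0, 1]
print(round(decode_direction(bits), 1))  # 137.1
```

The decoded angle is off by less than one quantization step; a real system would trade bit count against update rate and robustness to channel noise.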

In testing, EyeDAR resolved target directions 200 times faster than conventional radar designs.

While EyeDAR currently targets risks associated with AVs, particularly in high-traffic urban areas, researchers also believe the technology behind it could complement artificial intelligence efforts and be integrated into robots, drones and wearable platforms.

“EyeDAR is an example of what I like to call ‘analog computing,’” Cho added in the release. “Over the past two decades, people have been focusing on the digital and software side of computation, and the analog, hardware side has been lagging behind. I want to explore this overlooked analog design space.”

12 winners named at CERAWeek clean tech pitch competition in Houston

top teams

Twelve teams from around the country, including several from Houston, took home top honors at this year's Energy Venture Day and Pitch Competition at CERAWeek.

The fast-paced event, held March 25 and put on by the Rice Alliance, the Houston Energy Transition Initiative and TEX-E, invited 36 industry startups and five Texas-based student teams focused on driving efficiency and advancements in the energy transition to present 3.5-minute pitches to investors and industry partners during CERAWeek's Agora program.

The competition is a qualifying event for the Startup World Cup, where teams compete for a $1 million investment prize.

PolyJoule won in the Track C competition and was named the overall winner of the pitch event. The Boston-based company will go on to compete in the Startup World Cup held this fall in San Francisco.

PolyJoule was spun out of MIT and is developing conductive polymer battery technology for energy storage.

Rice University's Resonant Thermal Systems won the second-place prize and $15,000 in the student track, known as TEX-E. The team's STREED solution converts high-salinity water into fresh water while recovering valuable minerals.

Teams from the University of Texas won first and second place in the TEX-E competition, bringing home $25,000 and $10,000, respectively. The student winners were:

Companies that pitched in the three industry tracks competed for non-monetary awards. Here are the companies the judges named "most promising":

Track A | Industrial Efficiency & Decarbonization

Track B | Advanced Manufacturing, Materials, & Other Advanced Technologies

  • First: Licube, based in Houston
  • Second: ZettaJoule, based in Houston and Maryland
  • Third: Oleo

Track C | Innovations for Traditional Energy, Electricity, & the Grid

The teams at this year's Energy Venture Day have collectively raised $707 million in funding, according to Rice. They represent six countries and 12 states. See the full list of companies and investor groups that participated here.

---

This article originally appeared on our sister site, EnergyCapitalHTX.com.