A Houston expert shares reasons to swap screen time for extended reality. Photo via Getty Images

What does your reality look like? Look around you. What do you see? It is safe to say, almost a guarantee, that you are looking at a screen right now. We are consumers of information, and we use screens to access, view, and create it.

But why are we spending so much of our time looking at screens?

One poll found that the average adult will spend 34 years of their life looking at screens. Screens (TVs, laptops, phones) have become so ubiquitous in everyday life that they have blended into our reality and are simply 'there'. Did the inventor of the television, John Logie Baird, ever fully grasp how much the fabric of society would come to revolve around his invention? Incredible disruptions have always come from breaking the 'norm', and given how deeply screens are integrated into our everyday reality, this 'norm' feels long overdue for innovation. This is where augmented reality and spatial computing come into play.

The COVID-19 pandemic saw an unprecedented shift to even more screen time and to interactions over remote video platforms. It was also around this time that wireless virtual reality headsets became, for the first time ever, economically accessible to consumers, thanks to the push of one multinational corporation. Fast forward to 2023, and more companies are entering the market with extended reality (XR) headsets (virtual, mixed, and augmented reality) that offer spatial computing: the ability for computers to blend into the physical world, among other things.

Some of our innovation engineering activities at the Houston Methodist Institute for Technology, Innovation, and Education (MITIE) have focused on specific use cases of XR in surgical education and training. One of our projects, the MITIEverse, is a VR-based platform focused on creating the first metaverse for medical innovation. It is a fully immersive VR environment that lets users view 3D-rendered patient anatomies whilst watching the actual procedure, and even meet the surgeon who performed the operation. It also supports 'Grand Rounds'-style presentations to audiences of 50 participants.

We have also explored using augmented reality to control robotic-assisted surgery platforms. In a proof-of-concept prototype, we successfully demonstrated the manipulation of guide wires and catheters using nothing more than an augmented reality headset, illustrating the possibility of surgeons operating at a distance. Houston Methodist is dedicated to transforming healthcare with the latest innovative technology, including XR. The question we now need to ask is: is society ready and willing to replace screens with XR headsets?

To learn more about our XR initiatives and other cross-industry innovation collaborations in Houston, attend the Pumps & Pipes Annual Event 2023, Problem Xchange: Where Solutions Converge, next month at The Ion.

------

Stuart Corr is the director of Innovation Systems Engineering at Houston Methodist and executive director of Pumps & Pipes.

Houston Methodist's new MITIEverse app takes users into the metaverse to learn from professionals across the globe. Image courtesy of Houston Methodist

Houston hospital joins the metaverse with new platform

now online

Houston Methodist has launched a platform that is taking medical and scientific experts and students into the metaverse.

The MITIEverse, a new app focused on health care education and training, provides hands-on practice, remote assistance from experienced clinicians, and more. The app — named for the Houston Methodist Institute for Technology, Innovation and Education, aka MITIE — was created in partnership with FundamentalVR and takes users into virtual showcase rooms, surgical simulations, and lectures from Houston Methodist faculty, as well as collaborators from across the world.

“This new app brings the hands-on education and training MITIE is known for to a new virtual audience. It could be a first step toward building out a medical metaverse,” says Stuart Corr, inventor of the MITIEverse and director of innovation systems engineering at Houston Methodist, in a news release.


The hospital system's DeBakey Heart and Vascular Center has created a virtual showcase room on the app, and users can view Houston Methodist faculty performing real surgeries and then interact with 3D human models.

"We view the MITIEverse as a paradigm-shifting platform that will offer new experiences in how we educate, train, and interact with the health community,” says Alan Lumsden, M.D., medical director of Houston Methodist DeBakey Heart and Vascular Center, in the release.

“It essentially democratizes access to health care educators and innovators by breaking down physical barriers. There’s no need to travel thousands of miles to attend a conference when you can patch into the MITIEverse," he continues.



Houston researcher builds radar to make self-driving cars safer

eyes on the road

A Rice University researcher is giving autonomous vehicles an “extra set of eyes.”

Current autonomous vehicles (AVs) can have an incomplete view of their surroundings, and challenges like pedestrian movement, low-light conditions and adverse weather only compound these visibility limitations.

Kun Woo Cho, a postdoctoral researcher in the lab of Rice professor of electrical and computer engineering Ashutosh Sabharwal, has developed EyeDAR to help address such issues and enhance the vehicles’ sensing accuracy. Her research was supported in part by the National Science Foundation.

EyeDAR is a low-power, millimeter-wave radar about the size of an orange that could be placed at streetlights and intersections. Its design was inspired by the human eye. Researchers envision that the low-cost sensors could help ensure that AVs always pick up on emerging obstacles, even when the vehicles' onboard sensors are out of range or visibility is limited.

“Current automotive sensor systems like cameras and lidar struggle with poor visibility such as you would encounter due to rain or fog or in low-lighting conditions,” Cho said in a news release. “Radar, on the other hand, operates reliably in all weather and lighting conditions and can even see through obstacles.”

Signals from a typical radar system scatter when they encounter an obstacle. Some of the signal is reflected back to the source, but most of it is often lost. In the case of AVs, this means that "pedestrians emerging from behind large vehicles, cars creeping forward at intersections or cyclists approaching at odd angles can easily go unnoticed," according to Rice.

EyeDAR, however, works to capture lost radar reflections, determine their direction and report them back to the AV in a sequence of 0s and 1s.

“Like blinking Morse code,” Cho added. “EyeDAR is a talking sensor: it is a first instance of integrating radar sensing and communication functionality in a single design.”

In testing, EyeDAR resolved target directions 200 times faster than conventional radar designs.

While EyeDAR currently targets risks associated with AVs, particularly in high-traffic urban areas, researchers also believe the technology behind it could complement artificial intelligence efforts and be integrated into robots, drones and wearable platforms.

“EyeDAR is an example of what I like to call ‘analog computing,’” Cho added in the release. “Over the past two decades, people have been focusing on the digital and software side of computation, and the analog, hardware side has been lagging behind. I want to explore this overlooked analog design space.”

12 winners named at CERAWeek clean tech pitch competition in Houston

top teams

Twelve teams from around the country, including several from Houston, took home top honors at this year's Energy Venture Day and Pitch Competition at CERAWeek.

The fast-paced event, held March 25 and put on by the Rice Alliance, the Houston Energy Transition Initiative, and TEX-E, invited 36 industry startups and five Texas-based student teams focused on driving efficiency and advancements in the energy transition to deliver 3.5-minute pitches before investors and industry partners during CERAWeek's Agora program.

The competition is a qualifying event for the Startup World Cup, where teams compete for a $1 million investment prize.

PolyJoule won in the Track C competition and was named the overall winner of the pitch event. The Boston-based company will go on to compete in the Startup World Cup held this fall in San Francisco.

PolyJoule was spun out of MIT and is developing conductive polymer battery technology for energy storage.

Rice University's Resonant Thermal Systems won the second-place prize and $15,000 in the student track, known as TEX-E. The team's STREED solution converts high-salinity water into fresh water while recovering valuable minerals.

Teams from the University of Texas won first and third place in the TEX-E competition, bringing home $25,000 and $10,000, respectively. The student winners were:

Companies that pitched in the three industry tracks competed for non-monetary awards. Here are the companies the judges named most promising:

Track A | Industrial Efficiency & Decarbonization

Track B | Advanced Manufacturing, Materials, & Other Advanced Technologies

  • First: Licube, based in Houston
  • Second: ZettaJoule, based in Houston and Maryland
  • Third: Oleo

Track C | Innovations for Traditional Energy, Electricity, & the Grid

The teams at this year's Energy Venture Day have collectively raised $707 million in funding, according to Rice. They represent six countries and 12 states. See the full list of companies and investor groups that participated here.

---

This article originally appeared on our sister site, EnergyCapitalHTX.com.