A Houston expert shares reasons to swap screen time for extended reality. Photo via Getty Images

What does your reality look like? Look around you. What do you see? It would be safe to say (almost a guarantee) that you are looking at a screen right now, correct? We are consumers of information, and we use screens to access, view, and create it.

But why are we spending so much of our time looking at screens?

One poll found that the average adult will spend 34 years of their life looking at screens. Screens (TV, laptop, or phone) have become so ubiquitous in everyday life that they have blended into our reality and are simply ‘there’. Do you think the inventor of the TV, John Logie Baird, ever fully grasped how much the fabric of society would revolve around his invention? Time and time again, incredible disruptions have come from breaking the ‘norm’, and given how deeply screens are integrated into our everyday reality, this ‘norm’ feels long overdue for innovation. This is where the world of augmented reality and spatial computing comes into play.

The COVID-19 pandemic saw an unprecedented shift to even more screen time and interactions over remote video communication platforms. It was also around this time that wireless virtual reality headsets became, for the first time ever, economically accessible to consumers, thanks to the large push of one multinational corporation. Fast forward to 2023, and even more companies are entering the market with new extended reality (XR) headsets (i.e. virtual, mixed, and augmented reality) that offer spatial computing: the ability for computers to blend into the physical world (amongst other things).

Some of our innovation engineering activities at the Houston Methodist Institute for Technology, Innovation, and Education (MITIE) have focused on specific use cases of XR in surgical education and training. One of our projects, the MITIEverse, is a VR-based platform focused on creating the first-ever metaverse for medical innovation. It is a fully immersive VR environment that allows the user to view 3D-rendered patient anatomies whilst watching the actual patient procedure, even offering the ability to meet the surgeon who performed the operation. It also affords the ability to give a ‘Grand Rounds’ style presentation to an audience of 50 participants.

We have looked at using augmented reality to control robotic-assisted surgery platforms. In our proof-of-concept prototype, we successfully demonstrated the manipulation of guide wires and catheters using nothing more than an augmented reality headset, illustrating the possibility of surgeons performing surgery at a distance. Houston Methodist is dedicated to transforming healthcare using the latest innovative technology including XR. The question we now need to ask – is society ready and willing to replace screens with XR headsets?

To learn more about our XR initiatives and other Houston cross-industry innovation collaborations, attend the Pumps & Pipes Annual Event 2023, Problem Xchange: Where Solutions Converge, next month at The Ion.

------

Stuart Corr is the director of Innovation Systems Engineering at Houston Methodist and executive director of Pumps & Pipes.

A mixed reality lab at the University of Houston is merging the physical and digital worlds. Photo via UH.edu

UH lab using mixed reality to optimize designs for the Moon and Mars

hi, tech

University of Houston researchers and students are bringing multiple realities together to help improve the design process for crewed space missions.

Helmed by Vittorio Netti, a researcher for UH and a space architect, the university has launched an XR Lab within the University of Houston architecture building. The lab allows researchers to combine mixed reality (MR), virtual reality (VR), augmented reality (AR) and extended reality (XR) to "blend the physical and digital worlds" to give designers a better understanding of life in space, according to a release from UH.

In the lab, researchers can wear MR space suits and goggles, take a VR space walk, or feel what it's like to float to the International Space Station with the help of XR and a crane.

The area in which the researchers conduct this work is known as the "cage" and was developed during a six-month research and design study of lunar surface architecture sponsored by Boeing, which aimed to learn more about the design of a lunar terrain vehicle and a small lunar habitat.

The work is part of UH's Sasakawa International Center of Space Architecture (SICSA), which is led by Olga Bannova, a research associate professor and director of the space architecture graduate program at UH.

She says work like this will drastically cut down research and development time when designing space structures.

“These technologies should be harnessed to mitigate the dependency on physical prototyping of assets and help optimize the design process, drastically reducing research-and-development time and providing a higher level of immersion,” Bannova said in a statement.

Today the research team is shifting its focus to designing for a Mars landing. In the future, the researchers aim to demonstrate and test the system for habitats designed for both lunar and Martian surfaces. They are also working with Boeing to test designs in microgravity, or zero gravity, such as exists inside the International Space Station.



Houston unicorn closes $421M to fuel first phase of flagship energy project

Heating Up

Houston geothermal unicorn Fervo Energy has closed $421 million in non-recourse debt financing for the first phase of its flagship Cape Station project in Beaver County, Utah.

Fervo believes Cape Station can meet the needs of surging power demand from data centers, domestic manufacturing and an energy market aiming to use clean and reliable power. According to the company, Cape Station will begin delivering its first power to the grid this year and is expected to reach approximately 100 megawatts of operating capacity by early 2027. Fervo added that it plans to scale to 500 megawatts.

The $421 million financing package includes a $309 million construction-to-term loan, a $61 million tax credit bridge loan, and a $51 million letter of credit facility. The facilities will fund the remaining construction costs for the first phase of Cape Station, and will also support the project’s counterparty credit support requirements.
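As a quick sanity check, the three facilities described above add up exactly to the announced total (figures in millions of US dollars, taken from the article):

```python
# Components of the Cape Station phase-one financing package, in $ millions
construction_to_term_loan = 309
tax_credit_bridge_loan = 61
letter_of_credit_facility = 51

total = construction_to_term_loan + tax_credit_bridge_loan + letter_of_credit_facility
print(f"${total} million")  # → $421 million
```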

Coordinating lead arrangers include Barclays, BBVA, HSBC, MUFG, RBC and Société Générale, with additional participation from Bank of America, J.P. Morgan and Sumitomo Mitsui Trust Bank, Limited, New York Branch.

“As demand for firm, clean, affordable power accelerates, EGS (Enhanced Geothermal Systems) is set to become a core energy asset class for infrastructure lenders,” Sean Pollock, managing director, Project Finance at RBC Capital Markets, said in a news release. “Fervo is pioneering this step change with Cape Station, a vital contribution to American energy security that RBC is proud to support.”

The oversubscribed financing marks Cape Station’s shift from early-stage and bridge funding to a long-term, non-recourse capital structure, according to the news release.

“Non-recourse financing has historically been considered out of reach for first-of-a-kind projects,” David Ulrey, CFO of Fervo Energy, said in a news release. “Cape Station disrupts that narrative. With proven oil and gas technology paired with AI-enabled drilling and exploration, robust commercial offtake, operational consistency, and an unrelenting focus on health and safety, we have shown that EGS is a highly bankable asset class.”

Fervo continues to be one of the top-funded startups in the Houston area. The company has raised about $1.5 billion prior to the latest $421 million. It also closed a $462 million Series E in December.

According to Axios Pro, Fervo filed in January for an IPO that would value the company between $2 billion and $3 billion.

---

This article first appeared on EnergyCapitalHTX.com.

Houston food giant Sysco to acquire competitor in $29 billion deal

Mergers & Acquisitions

Sysco, the nation's largest food distributor, will acquire supplier Restaurant Depot in a deal worth more than $29 billion.

The acquisition would create a closer link between Sysco and the customers who currently turn to Restaurant Depot for supplies needed quickly, in an industry segment known as “cash-and-carry wholesale.”

Sysco, based in Houston, serves more than 700,000 restaurants, hospitals, schools, and hotels, supplying them with everything from butter and eggs to napkins. Those goods are typically acquired ahead of time based on how much traffic restaurants expect to see.

Restaurant Depot offers memberships to mom-and-pop restaurants and other businesses, giving them access to warehouses stocked with supplies for when they run short of what they've purchased from suppliers like Sysco.

It is a fast-growing, high-margin segment, and the deal will likely mean thousands of restaurants rely increasingly on Sysco for day-to-day needs.

Restaurant Depot shareholders will receive $21.6 billion in cash and 91.5 million Sysco shares. Based on Sysco’s closing share price of $81.80 as of March 27, 2026, the deal has an enterprise value of about $29.1 billion.
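The quoted deal value can be reproduced from the terms above; the back-of-the-envelope check below simply values the 91.5 million Sysco shares at the March 27, 2026 closing price and adds the cash consideration:

```python
# Deal terms from the article
cash_consideration = 21.6e9     # $21.6 billion in cash
sysco_shares_issued = 91.5e6    # 91.5 million Sysco shares
share_price = 81.80             # Sysco close as of March 27, 2026

stock_component = sysco_shares_issued * share_price  # ≈ $7.5 billion
deal_value = cash_consideration + stock_component
print(f"${deal_value / 1e9:.1f} billion")  # → $29.1 billion
```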

Restaurant Depot was founded in Brooklyn in 1976. The family-run business, then known as Jetro Restaurant Depot, has become the nation's largest cash-and-carry wholesaler.

The boards of both companies have approved the acquisition, but it would still need regulatory approval.

Shares of Sysco Corp. tumbled 13% Monday to $71.26, an initial decline some industry analysts expected given the cost of the deal.

Houston researcher builds radar to make self-driving cars safer

eyes on the road

A Rice University researcher is giving autonomous vehicles an “extra set of eyes.”

Current autonomous vehicles (AVs) can have an incomplete view of their surroundings, and challenges like pedestrian movement, low-light conditions and adverse weather only compound these visibility limitations.

Kun Woo Cho, a postdoctoral researcher in the lab of Rice professor of electrical and computer engineering Ashutosh Sabharwal, has developed EyeDAR to help address such issues and enhance the vehicles’ sensing accuracy. Her research was supported in part by the National Science Foundation.

EyeDAR is an orange-sized, low-power, millimeter-wave radar that could be placed at streetlights and intersections. Its design was inspired by that of the human eye. Researchers envision that the low-cost sensors could help ensure that AVs always pick up on emerging obstacles, even when the vehicles are not within proper range of their onboard sensors and when visibility is limited.

“Current automotive sensor systems like cameras and lidar struggle with poor visibility such as you would encounter due to rain or fog or in low-lighting conditions,” Cho said in a news release. “Radar, on the other hand, operates reliably in all weather and lighting conditions and can even see through obstacles.”

Signals from a typical radar system scatter when they encounter an obstacle. Some of the signal is reflected back to the source, but most of it is often lost. In the case of AVs, this means that "pedestrians emerging from behind large vehicles, cars creeping forward at intersections or cyclists approaching at odd angles can easily go unnoticed," according to Rice.

EyeDAR, however, works to capture lost radar reflections, determine their direction and report them back to the AV in a sequence of 0s and 1s.

“Like blinking Morse code,” Cho added. “EyeDAR is a talking sensor: it is a first instance of integrating radar sensing and communication functionality in a single design.”

In testing, EyeDAR resolved target directions 200 times faster than conventional radar designs.

While EyeDAR currently targets risks associated with AVs, particularly in high-traffic urban areas, researchers also believe the technology behind it could complement artificial intelligence efforts and be integrated into robots, drones and wearable platforms.

“EyeDAR is an example of what I like to call ‘analog computing,’” Cho added in the release. “Over the past two decades, people have been focusing on the digital and software side of computation, and the analog, hardware side has been lagging behind. I want to explore this overlooked analog design space.”