If you've logged onto a government website recently, you know that dealing with creaking, outdated government technology is about as much fun as a trip to the DMV. Held back by byzantine procurement rules, management-by-committee, and an aggressive commitment to decades-old UX principles, government websites and other tech tools are routinely confusing, horrible to use, and deeply inefficient.

Now, though, that could finally be changing. The COVID-19 pandemic has forced us all to rethink our relationships with the technologies we use, from Zoom calls to e-commerce services. Increasingly, government bodies are finding themselves forced to move faster, adopt more up-to-date technologies, and work with private-sector partners to meet new challenges and quickly bring their services into the 21st century.

Getting an education

One of the most dramatic examples comes in the realm of education. According to the U.S. Census Bureau, about 93 percent of school-age children have engaged in distance learning since the pandemic began, and four fifths of them relied on digital tech to take the place of classroom resources. But with access to digital tech at home strongly correlated to household income, governments and education departments have had to move quickly to ensure every child has access to laptops and web connections.

Not everyone is a fan of remote learning, and as a parent myself, I know how hard it can be to have kids at home. But one thing we should all be able to agree on is that if we're going to rely on digital learning, then we need to make sure it's available to everyone, including those families that don't have access to reliable computers and WiFi connections at home.

Achieving that rapidly and at scale has required remarkable flexibility and creativity from policymakers at all levels. Those that have succeeded have done so by brushing aside the red tape that has ensnared previous government tech initiatives, and instead working with private-sector partners to rapidly implement the solutions that are needed.

Lessons from Texas

Here in Texas, for instance, one in six public school students lacked access to high-speed internet at the start of the pandemic, and 30 percent lacked laptops or other learning devices. To speed the transition to remote learning, Gov. Greg Abbott and the Texas Education Agency (TEA) launched Operation Connectivity, a $400 million campaign to connect the state's 5.5 million public school students with computer devices and reliable internet. To date, 4 million devices have been purchased and are being distributed to students, opening doors to greater educational and economic opportunity. Further work is underway to remove other connectivity barriers, such as slow connection speeds in rural areas, for students and all Texans.

Rolling out such an ambitious project to our state's 1,200 or so school districts could have been a disaster. After all, many government IT projects grind along for months or years without delivering the desired results — often at huge cost to taxpayers. But Operation Connectivity has been different because it's grounded in a true partnership between the government and private-sector players.

Facing urgent deadlines, government leaders turned to Gaby Rowe, former CEO of the Ion tech hub, to spearhead the project. A tech innovator, Rowe brought entrepreneurial energy and a deep understanding of public-private partnerships, driving Operation Connectivity from blueprint to execution in a matter of weeks. Tech giants including Microsoft, SAP, and HubSpot quickly joined the effort, delivering cost-effective connectivity and hardware solutions to ensure that every kid in our state could get the education they deserve. Since then, Operation Connectivity has distributed over a million devices, including laptops and wireless hotspots, to families in need, with costs split between the state and individual districts.

Private sector edge

To get a sense of how private-sector knowhow can spur government tech transformation, consider my own company, Digital Glyde. As part of the Operation Connectivity effort, we were asked to help design and build the back-end software and planning infrastructure needed to coordinate effectively with hundreds of school district officials scattered all across our state.

Ordinarily, that kind of effort would require a drawn-out process of consultation, committee-work, and red tape. But facing an urgent need to help our state's children, we were given the freedom to move quickly, and were able to implement a viable system within just a few days.

By leveraging cutting-edge data-extraction and image-processing tools, we helped Operation Connectivity to automatically process invoices and match tech costs to available COVID relief funding in record time. We achieved 95% accuracy within three weeks of deployment to ensure school districts quickly received reimbursements for the hardware they were purchasing on behalf of their schoolchildren.

Building on success

Operation Connectivity is just one example of the ways in which government actors have embraced tech and leveraged private-sector assistance to chart their way through the COVID crisis. From contact-tracing programs to vaccine distribution programs, we're seeing governments taking a far more pragmatic and partnership-driven approach to technology.

Of course, not every experiment goes to plan. In Florida, government agencies decided to use web tools to manage vaccination appointments — but implemented that idea using a commercial website built to handle birthday party e-vites. Unsurprisingly, the results were chaotic, with users having to scramble to grab appointments as they were posted to the site, and seniors struggling to wrap their heads around a website designed for young parents.

Such stories are a reminder that governments can't solve big problems simply by grabbing at whatever tech tools are nearest to hand. It's vital to find the right solutions, and to work with partners who understand the complexity and constraints that come with delivering public-sector services at scale.

As we overcome the COVID crisis, and look to rebuild our economy and overcome future challenges, we need to learn from this experience and refuse to go back to the bad old days of red tape and stale technology. In recent months, we've shown what can be done when we pull together, and combine real governmental leadership with private-sector innovation and efficiency. We'll need much more of this kind of teamwork and tech-enabled creativity in the months and years to come.

------

Varun Garg is the founder and CEO of Houston-based Digital Glyde


How Houston innovators played a role in the historic Artemis II splashdown


Research from Rice University played a critical role in the safe return of U.S. astronauts aboard NASA’s Artemis II mission this month.

Rice mechanical engineer Tayfun E. Tezduyar and longtime collaborator Kenji Takizawa developed a computational parachute fluid-structure interaction (FSI) analysis system that proved vital to the descent of NASA's Orion capsule into the Pacific Ocean. The FSI system, originally developed in 2013 alongside NASA Johnson Space Center, was critical to Orion's three-parachute design, which slowed the capsule as it returned to Earth, according to Rice.

The model helped ensure that the parachute design was large enough to slow the capsule for a safe landing while also being stable enough to prevent the capsule from oscillating as it descended.

“You cannot separate the aerodynamics from the structural dynamics,” Tezduyar said in a news release. “They influence each other continuously and even more so for large spacecraft parachutes, so the analysis must capture that interaction in a robustly coupled way.”
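Tezduyar's point about coupling can be illustrated with a toy model: even a single falling mass with velocity-dependent drag requires the "fluid" force and the "structural" state to be updated together at every step, because each depends on the other. This is a one-degree-of-freedom sketch with invented parameters, nothing like the scale or fidelity of the actual FSI simulations.

```python
# Toy coupled simulation: a parachute-borne mass whose aerodynamic drag
# depends on its instantaneous velocity, so the fluid force and the
# structural motion must be advanced together. All parameters invented.

G = 9.81        # gravity, m/s^2
MASS = 9000.0   # suspended mass, kg (illustrative)
DRAG_K = 400.0  # lumped drag factor (0.5 * air density * drag coeff * area), illustrative
DT = 0.01       # time step, s

def descend(v0: float, steps: int) -> float:
    """Integrate downward velocity; drag opposes motion at every step."""
    v = v0
    for _ in range(steps):
        drag = DRAG_K * v * abs(v) / MASS  # "fluid" force depends on current state
        v += (G - drag) * DT               # "structural" state depends on fluid force
    return v

# The coupled system settles toward terminal velocity sqrt(G * MASS / DRAG_K).
print(round(descend(0.0, 20000), 1))
```

A solver that updated the fluid and the structure independently and only exchanged data occasionally could drift or oscillate; advancing both together each step is a crude stand-in for the "robustly coupled" analysis the quote describes.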

The end result was a final parachute system, refined through NASA drop tests and Rice’s computational FSI analysis, that eliminated fluctuations and produced a stable descent profile.

Apart from the dynamic challenges in design, modeling Orion’s parachutes also required solving complex equations that considered airflow and fabric deformation and accounted for features like ringsail canopy construction and aerodynamic interactions among multiple parachutes in a cluster.

“Essentially, my entire group was dedicated to that work, because I considered it a national priority,” Tezduyar added in the release. “Kenji and I were personally involved in every computer simulation. Some of the best graduate students and research associates I met in my career worked on the project, creating unique, first-of-its-kind parachute computer simulations, one after the other.”

Current Intuitive Machines engineer Mario Romero also worked on Orion during his time at NASA. From 2018 to 2021, Romero was a member of the Orion Crew Capsule Recovery Team, which focused on recreating scenarios crew members could encounter in Orion.

The team trained in NASA’s 6.2-million-gallon pool, using wave machines to replicate a range of sea conditions. They also simulated worst-case scenarios by cutting the lights, blasting high-powered fans and tipping a mock capsule to mimic distress situations. In some drills, mock crew members were treated as “injured,” requiring the team to practice safe, controlled egress procedures.

“It’s hard to find the appropriate descriptors that can fully encapsulate the feeling of getting to witness all the work we, and everyone else, did being put into action,” Romero tells InnovationMap. “I loved seeing the reactions of everyone, but especially of the Houston communities—that brought me a real sense of gratitude and joy.”

Intuitive Machines was also selected to support the Artemis II mission using its Space Data Network and ground station infrastructure. The company monitored radio signals sent from the Orion spacecraft and used Doppler measurements to help determine the spacecraft's precise position and speed.
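The Doppler technique works because a spacecraft's line-of-sight motion shifts the frequency of its radio carrier, so comparing the received frequency to the transmitted one yields the radial velocity. A minimal sketch of that relationship follows; the carrier frequency and shift are illustrative placeholders, not actual Artemis II values.

```python
# Radial velocity from a one-way Doppler shift:
#   v = c * (f_transmitted - f_received) / f_transmitted
# Frequencies below are illustrative placeholders, not mission values.

C = 299_792_458.0  # speed of light, m/s

def radial_velocity(f_transmitted_hz: float, f_received_hz: float) -> float:
    """Approximate line-of-sight velocity in m/s; positive means receding."""
    return C * (f_transmitted_hz - f_received_hz) / f_transmitted_hz

# A receding spacecraft red-shifts its carrier slightly: a 22 kHz drop
# on a 2.2 GHz carrier corresponds to roughly 3 km/s away from the station.
v = radial_velocity(2.2e9, 2.2e9 - 22_000)
print(round(v))  # 2998
```

Real deep-space tracking layers many refinements on top of this (two-way links, relativistic corrections, station motion), but the core observable is this fractional frequency shift.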

Tim Crain, Chief Technology Officer at Intuitive Machines, wrote about the experience last week.

"I specialized in orbital mechanics and deep space navigation in graduate school,” Crain shared. “But seeing the theory behind tracking spacecraft come to life as they thread through planetary gravity fields on ultra-precise trajectories still seems like magic."

UH breakthrough moves superconductivity closer to real-world use


University of Houston researchers have set a new benchmark in the field of superconductivity.

Researchers from the UH physics department and the Texas Center for Superconductivity (TcSUH) have broken the transition temperature record for superconductivity at ambient pressure. The accomplishment could lead to more efficient ways to generate, transmit and store energy, which researchers believe could improve power grids, medical technologies and energy systems by enabling electricity to flow without resistance, according to a release from UH.

To break the record, UH researchers achieved a transition temperature of 151 Kelvin, the highest ever recorded at ambient pressure since the discovery of superconductivity in 1911.

The transition temperature is the temperature below which a material becomes superconducting, allowing electricity to flow through it without resistance. Scientists have been working for decades to push transition temperatures closer to room temperature, which would make superconducting technologies more practical and affordable.

Currently, most superconductors must be cooled to extremely low temperatures, making them expensive and difficult to operate.

UH physicists Ching-Wu Chu and Liangzi Deng published the research in the Proceedings of the National Academy of Sciences earlier this month. The work was funded by Intellectual Ventures and the state of Texas through TcSUH, along with other foundations. Chu, founding director and chief scientist at TcSUH, made the breakthrough discovery in 1987 that the material YBCO becomes superconducting at 93 Kelvin, a finding that helped launch a global race to develop high-temperature superconductors.

“Transmitting electricity in the grid loses about 8% of the electricity,” Chu, who’s also a professor of physics at UH and the paper’s senior author, said in a news release. “If we conserve that energy, that’s billions of dollars of savings and it also saves us lots of effort and reduces environmental impacts.”

Chu and his team used a technique known as pressure quenching, adapted from methods used to create diamonds. Researchers first apply intense pressure to enhance a material's superconducting properties and raise its transition temperature, then release the pressure at low temperature so the material retains those enhanced properties at ambient pressure.

Next, researchers are targeting ambient-pressure, room-temperature superconductivity of around 300 K. In a companion PNAS paper, Chu and Deng point to pressure quenching as a promising approach to help bridge the gap between current results and that goal.

“Room-temperature superconductivity has been seen as a ‘holy grail’ by scientists for over a century,” Rohit Prasankumar, director of superconductivity research at Intellectual Ventures, said in the release. “The UH team’s result shows that this goal is closer than ever before. However, the distance between the new record set in this study and room temperature is still about 140 C. Closing this gap will require concerted, intentional efforts by the broader scientific community, including materials scientists, chemists, and engineers, as well as physicists.”

---

This article originally appeared on EnergyCapitalHTX.com.