
If you've logged onto a government website recently, you know that dealing with creaking, outdated government technology is about as much fun as a trip to the DMV. Held back by byzantine procurement rules, management-by-committee, and an aggressive commitment to decades-old UX principles, government websites and other tech tools are routinely confusing, horrible to use, and deeply inefficient.

Now, though, that could finally be changing. The COVID-19 pandemic has forced us all to rethink our relationships with the technologies we use, from Zoom calls to e-commerce services. Increasingly, government bodies are finding themselves forced to move faster, adopt more up-to-date technologies, and work with private-sector partners to meet new challenges and quickly bring their services into the 21st century.

Getting an education

One of the most dramatic examples comes in the realm of education. According to the U.S. Census Bureau, about 93 percent of school-age children have engaged in distance learning since the pandemic began, and four fifths of them relied on digital tech to take the place of classroom resources. But with access to digital tech at home strongly correlated to household income, governments and education departments have had to move quickly to ensure every child has access to laptops and web connections.

Not everyone is a fan of remote learning, and as a parent myself, I know how hard it can be to have kids at home. But one thing we should all be able to agree on is that if we're going to rely on digital learning, then we need to make sure it's available to everyone, including those families that don't have access to reliable computers and WiFi connections at home.

Achieving that rapidly and at scale has required remarkable flexibility and creativity from policymakers at all levels. Those that have succeeded have done so by brushing aside the red tape that has ensnared previous government tech initiatives, and instead working with private-sector partners to rapidly implement the solutions that are needed.

Lessons from Texas

Here in Texas, for instance, one in six public school students lacked access to a high-speed internet connection at the start of the pandemic, and 30 percent lacked access to laptops or other learning devices. To speed the transition to remote learning, Gov. Greg Abbott and the Texas Education Agency (TEA) launched Operation Connectivity — a $400 million campaign to connect 5.5 million Texas public school students with computing devices and reliable internet connections. To date, 4 million devices have been purchased and are being distributed to students, opening doors to greater educational and economic opportunities. Further work is underway to remove remaining connectivity barriers, such as slow connection speeds in rural areas, for students and all Texans.

Rolling out such an ambitious project to our state's 1,200 or so school districts could have been a disaster. After all, many government IT projects grind along for months or years without delivering the desired results — often at huge cost to taxpayers. But Operation Connectivity has been different because it's grounded in a true partnership between the government and private-sector players.

Facing urgent deadlines, government leaders turned to Gaby Rowe, former CEO of the Ion tech hub, to spearhead the project. A tech innovator, Rowe brought entrepreneurial energy and a deep understanding of the power of public-private partnerships, driving Operation Connectivity from blueprint to execution in a matter of weeks. Tech giants including Microsoft, SAP, and HubSpot quickly joined the effort, helping deliver cost-effective connectivity and hardware solutions to ensure that every kid in our state could get the education they deserve. Since then, Operation Connectivity has distributed over a million devices, including laptops and wireless hotspots, to families in need, with costs split between the state and individual districts.

Private sector edge

To get a sense of how private-sector knowhow can spur government tech transformation, consider my own company, Digital Glyde. As part of the Operation Connectivity effort, we were asked to help design and build the back-end software and planning infrastructure needed to coordinate effectively with hundreds of school district officials scattered all across our state.

Ordinarily, that kind of effort would require a drawn-out process of consultation, committee-work, and red tape. But facing an urgent need to help our state's children, we were given the freedom to move quickly, and were able to implement a viable system within just a few days.

By leveraging cutting-edge data-extraction and image-processing tools, we helped Operation Connectivity to automatically process invoices and match tech costs to available COVID relief funding in record time. We achieved 95% accuracy within three weeks of deployment to ensure school districts quickly received reimbursements for the hardware they were purchasing on behalf of their schoolchildren.
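To make the idea concrete, here is a minimal sketch of the kind of invoice-matching pipeline described above: extract line items from invoice text, then match their costs against an available relief-fund balance. All names, the regex, and the matching rule are illustrative assumptions, not Digital Glyde's actual system.

```python
# Hypothetical sketch: parse invoice line items and match them to funding.
import re
from dataclasses import dataclass

@dataclass
class LineItem:
    description: str
    amount: float

def extract_line_items(invoice_text: str) -> list[LineItem]:
    """Pull 'description  $amount' pairs from raw invoice text."""
    items = []
    for match in re.finditer(r"(.+?)\s+\$([\d,]+\.\d{2})", invoice_text):
        desc = match.group(1).strip()
        amount = float(match.group(2).replace(",", ""))
        items.append(LineItem(desc, amount))
    return items

def match_to_funding(items: list[LineItem], fund_balance: float):
    """Approve items in order until the fund balance is exhausted."""
    approved, remaining = [], fund_balance
    for item in items:
        if item.amount <= remaining:
            approved.append(item)
            remaining -= item.amount
    return approved, remaining

invoice = "Chromebook bundle  $25,000.00\nWireless hotspots  $8,500.00"
items = extract_line_items(invoice)
approved, left = match_to_funding(items, fund_balance=30_000.00)
```

A production system would of course layer OCR, validation, and human review on top of anything like this; the sketch only shows the core extract-and-reconcile step.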

Building on success

Operation Connectivity is just one example of the ways in which government actors have embraced tech and leveraged private-sector assistance to chart their way through the COVID crisis. From contact-tracing systems to vaccine distribution programs, we're seeing governments take a far more pragmatic and partnership-driven approach to technology.

Of course, not every experiment goes to plan. In Florida, government agencies decided to use web tools to manage vaccination appointments — but implemented that idea using a commercial website built to handle birthday-party e-vites. Unsurprisingly, the results were chaotic, with users scrambling to grab appointments as they were posted to the site, and seniors struggling to wrap their heads around a website designed for young parents.

Such stories are a reminder that governments can't solve big problems simply by grabbing at whatever tech tools are nearest to hand. It's vital to find the right solutions, and to work with partners who understand the complexity and constraints that come with delivering public-sector services at scale.

As we overcome the COVID crisis, and look to rebuild our economy and overcome future challenges, we need to learn from this experience and refuse to go back to the bad old days of red tape and stale technology. In recent months, we've shown what can be done when we pull together, and combine real governmental leadership with private-sector innovation and efficiency. We'll need much more of this kind of teamwork and tech-enabled creativity in the months and years to come.

------

Varun Garg is the founder and CEO of Houston-based Digital Glyde


Planned UT Austin med center, anchored by MD Anderson, gets $100M gift

med funding

The University of Texas at Austin’s planned multibillion-dollar medical center, which will include a hospital run by Houston’s University of Texas MD Anderson Cancer Center, just received a $100 million boost from a billionaire husband-and-wife duo.

Tench Coxe, a former venture capitalist who’s a major shareholder in chipmaking giant Nvidia, and Simone Coxe, co-founder and former CEO of the Blanc & Otus PR firm, contributed the $100 million—one of the largest gifts in UT history. The Coxes live in Austin.

“Great medical care changes lives,” says Simone Coxe, “and we want more people to have access to it.”

The University of Texas System announced the medical center project in 2023 and cited an estimated price tag of $2.5 billion. UT initially said the medical center would be built on the site of the Frank Erwin Center, a sports and entertainment venue on the UT Austin campus that was demolished in 2024. The 20-acre site, north of downtown and the state Capitol, is near Dell Seton Medical Center, UT Dell Medical School and UT Health Austin.

Now, UT officials are considering a bigger, still-unidentified site near the Domain mixed-use district in North Austin, although they haven’t ruled out the Erwin Center site. The Domain development is near St. David’s North Medical Center.

As originally planned, the medical center would house a cancer center built and operated by MD Anderson and a specialty hospital built and operated by UT Austin. Construction on the two hospitals is scheduled to start this year and be completed in 2030. According to a 2025 bid notice for contractors, each hospital is expected to encompass about 1.5 million square feet, meaning the medical center would span about 3 million square feet.

Features of the MD Anderson hospital will include:

  • Inpatient care
  • Outpatient clinics
  • Surgery suites
  • Radiation, chemotherapy, cell, and proton treatments
  • Diagnostic imaging
  • Clinical drug trials

UT says the new medical center will fuse the university’s academic and research capabilities with the medical and research capabilities of MD Anderson and Dell Medical School.

UT officials say priorities for spending the Coxes’ gift include:

  • Recruiting world-class medical professionals and scientists
  • Supporting construction
  • Investing in technology
  • Expanding community programs that promote healthy living and access to care

Tench says the opportunity to contribute to building an institution from the ground up helped prompt the donation. He and others say that thanks to MD Anderson’s participation, the medical center will bring world-renowned cancer care to the Austin area.

“We have a close friend who had to travel to Houston for care she should have been able to get here at home. … Supporting the vision for the UT medical center is exactly the opportunity Austin needed,” he says.

The rate of patients who leave the Austin area to seek care for serious medical issues runs as high as 25 percent, according to UT.

New Rice Brain Institute partners with TMC to award inaugural grants

brain trust

The recently founded Rice Brain Institute has named the first four projects to receive research awards through the Rice and TMC Neuro Collaboration Seed Grant Program.

The new grant program brings together Rice faculty with clinicians and scientists at The University of Texas Medical Branch, Baylor College of Medicine, UTHealth Houston and The University of Texas MD Anderson Cancer Center. The program will support pilot projects that address neurological disease, mental health and brain injury.

The first round of awards, selected from a competitive pool of 40 proposals, will support projects that reflect the Rice Brain Institute's research agenda.

“These awards are meant to help teams test bold ideas and build the collaborations needed to sustain long-term research programs in brain health,” Behnaam Aazhang, Rice Brain Institute director and co-director of the Rice Neuroengineering Initiative, said in a news release.

The seed funding has been awarded to the following principal investigators:

  • Kevin McHugh, associate professor of bioengineering and chemistry at Rice, and Peter Kan, professor and chair of neurosurgery at UTMB. McHugh and Kan are developing an injectable material designed to seal off fragile, abnormal blood vessels that can cause life-threatening bleeding in the brain.
  • Jerzy Szablowski, assistant professor of bioengineering at Rice, and Jochen Meyer, assistant professor of neurology at Baylor. Szablowski and Meyer are leading a nonsurgical, ultrasound approach to deliver gene-based therapies to deep brain regions involved in seizures to control epilepsy without implanted electrodes or invasive procedures.
  • Juliane Sempionatto, assistant professor of electrical and computer engineering at Rice, and Aaron Gusdon, associate professor of neurosurgery at UTHealth Houston. Sempionatto and Gusdon are leading efforts to create a blood test that can identify patients at high risk for delayed brain injury following aneurysm-related hemorrhage, which could lead to earlier intervention and improved outcomes.
  • Christina Tringides, assistant professor of materials science and nanoengineering at Rice, and Sujit Prabhu, professor of neurosurgery at MD Anderson, who are working to reduce the risk of long-term speech and language impairment during brain tumor removal by combining advanced brain recordings, imaging and noninvasive stimulation.

The grants were facilitated by Rice’s Educational and Research Initiatives for Collaborative Health (ENRICH) Office. Rice says that the unique split-funding model of these grants could help structure future collaborations between the university and the TMC.

The Rice Brain Institute launched this fall and aims to use engineering, natural sciences and social sciences to research the brain and reduce the burden of neurodegenerative, neurodevelopmental and mental health disorders. Last month, the university's Shepherd School of Music also launched the Music, Mind and Body Lab, an interdisciplinary hub that brings artists and scientists together to study the "intersection of the arts, neuroscience and the medical humanities." Read more here.

Your data center is either closer than you think or much farther away

houston voices

A new study shows why some facilities cluster in cities for speed and access, while others move to rural regions in search of scale and lower costs. Based on research by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard).

Key findings:

  • Third-party colocation centers are physical facilities in close proximity to firms that use them, while cloud providers operate large data centers from a distance and sell access to virtualized computing resources as on‑demand services over the internet.
  • Hospitals and financial firms often require urban third-party centers for low latency and regulatory compliance, while batch processing and many AI workloads can operate more efficiently from lower-cost cloud hubs.
  • For policymakers trying to attract data centers, access to reliable power, water and high-capacity internet matters more than tax incentives.

Recent outages and the surge in AI-driven computing have made data center siting decisions more consequential than ever, especially as energy and water constraints tighten. Communities invest public dollars on the promise of jobs and growth, while firms weigh long-term commitments to land, power and connectivity.

Against that backdrop, a critical question comes into focus: Where do data centers get built — and what actually drives those decisions?

A new study by Tommy Pan Fang (Rice Business) and Shane Greenstein (Harvard Business School) provides the first large-scale statistical analysis of data center location strategies across the United States. It offers policymakers and firms a clearer starting point for understanding how different types of data centers respond to economic and strategic incentives.

Forthcoming in the journal Strategy Science, the study examines two major types of infrastructure: third-party colocation centers that lease server space to multiple firms, and hyperscale cloud centers owned by providers like Amazon, Google and Microsoft.

Two Models, Two Location Strategies

The study draws on pre-pandemic data from 2018 and 2019, a period of relative geographic stability in supply and demand. This window gives researchers a clean baseline before remote work, AI demand and new infrastructure pressures began reshaping internet traffic patterns.

The findings show that data centers follow a bifurcated geography. Third-party centers cluster in dense urban markets, where buyers prioritize proximity to customers despite higher land and operating costs. Cloud providers, by contrast, concentrate massive sites in a small number of lower-density regions, where electricity, land and construction are cheaper and economies of scale are easier to achieve.

Third-party data centers, in other words, follow demand. They locate in urban markets where firms in finance, healthcare and IT value low latency, secure storage, and compliance with regulatory standards.

Using county-level data, the researchers modeled how population density, industry mix and operating costs predict where new centers enter. Every U.S. metro with more than 700,000 residents had at least one third-party provider, while many mid-sized cities had none.

This pattern challenges common assumptions. Third-party facilities are more distributed across urban America than prevailing narratives suggest.
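A county-level entry model of the sort described above can be sketched as a simple logistic function of local characteristics. This toy version, with made-up coefficients and inputs, is purely illustrative and is not the paper's actual specification:

```python
# Toy logistic entry model: probability that a third-party data center
# enters a county, as a function of population density and operating cost.
import math

def entry_probability(pop_density, operating_cost,
                      b0=-4.0, b_density=0.008, b_cost=-0.5):
    """Denser, cheaper counties get a higher entry probability.
    Coefficients are invented for illustration only."""
    z = b0 + b_density * pop_density + b_cost * operating_cost
    return 1.0 / (1.0 + math.exp(-z))

# A dense urban county vs. a sparse rural one (illustrative inputs):
urban = entry_probability(pop_density=1200, operating_cost=1.2)
rural = entry_probability(pop_density=15, operating_cost=0.8)
```

The real study estimates effects like these from observed county data rather than assuming them, but the functional form conveys the intuition: demand-side density pulls third-party entry into cities even when costs there are higher.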

Customer proximity matters because some sectors cannot absorb delay. In critical operations, even slight pauses can have real consequences. For hospital systems, lag can affect performance and risk exposure. And in high-frequency trading, milliseconds can determine whether value is captured or lost in a transaction.

“For industries where speed is everything, being too far from the physical infrastructure can meaningfully affect performance and risk,” Pan Fang says. “Proximity isn’t optional for sectors that can’t absorb delay.”
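The physics behind that point is easy to check with back-of-the-envelope arithmetic: light in optical fiber travels at roughly 200,000 km/s (about two-thirds the speed of light in vacuum), so distance translates directly into irreducible delay. The figures below are rough assumptions for illustration:

```python
# Rough propagation-delay arithmetic over optical fiber.
FIBER_SPEED_KM_PER_MS = 200.0  # ~200,000 km/s expressed as km per millisecond

def one_way_delay_ms(distance_km: float) -> float:
    return distance_km / FIBER_SPEED_KM_PER_MS

# A facility 20 km across town vs. one 2,000 km away:
local_rtt = 2 * one_way_delay_ms(20)      # ~0.2 ms round trip
distant_rtt = 2 * one_way_delay_ms(2000)  # ~20 ms round trip
```

Real-world latency is higher still once routing, queuing, and processing are added, which is why latency-sensitive sectors pay urban prices for nearby colocation.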

The Economics of Distance

For cloud providers, the picture looks very different. Their decisions follow a logic shaped primarily by cost and scale. Because cloud services can be delivered from afar, firms tend to build enormous sites in low-density regions where power is cheap and land is abundant.

These facilities can draw hundreds of megawatts of electricity and operate with far fewer employees than urban centers. “The cloud can serve almost anywhere,” Pan Fang says, “so location is a question of cost before geography.”

The study finds that cloud infrastructure clusters around network backbones and energy economics, not talent pools. Well-known hubs like Ashburn, Virginia — often called “Data Center Alley” — reflect this logic, having benefited from early network infrastructure that made them natural convergence points for digital traffic.

Local governments often try to lure data centers with tax incentives, betting they will create high-tech jobs. But the study suggests other factors matter more to cloud providers, including construction costs, network connectivity and access to reliable, affordable electricity.

When cloud centers need a local presence, distance can sometimes become a constraint. Providers often address this by working alongside third-party operators. “Third-party centers can complement cloud firms when they need a foothold closer to customers,” Pan Fang says.

That hybrid pattern — massive regional hubs complementing strategic colocation — may define the next phase of data center growth.

Looking ahead, shifts in remote work, climate resilience, energy prices and AI-driven computing may reshape where new facilities go. Some workloads may move closer to users, while others may consolidate into large rural hubs. Emerging data-sovereignty rules could also redirect investment beyond the United States.

“The cloud feels weightless,” Pan Fang says, “but it rests on real choices about land, power and proximity.”

---

This article originally appeared on Rice Business Wisdom. Written by Scott Pett.

Pan Fang and Greenstein (2025). “Where the Cloud Rests: The Economic Geography of Data Centers,” forthcoming in Strategy Science.