Rice University's new Bachelor of Science in AI will be one of only a few in the country. Photo via Getty Images.

Rice University announced this month that it plans to introduce a Bachelor of Science in AI in the fall 2025 semester.

The new degree program will be part of the university's department of computer science in the George R. Brown School of Engineering and Computing and is one of only a few like it in the country. It aims to focus on "responsible and interdisciplinary approaches to AI," according to a news release from the university.

“We are in a moment of rapid transformation driven by AI, and Rice is committed to preparing students not just to participate in that future but to shape it responsibly,” Amy Dittmar, the Howard R. Hughes Provost and executive vice president for academic affairs, said in the release. “This new major builds on our strengths in computing and education and is a vital part of our broader vision to lead in ethical AI and deliver real-world solutions across health, sustainability and resilient communities.”

John Greiner, an assistant teaching professor of computer science in Rice's online Master of Computer Science program, will serve as the new program's director. Vicente Ordóñez-Román, an associate professor of computer science, was also instrumental in developing and approving the new major.

Until now, Rice students could study AI through elective courses and an advanced degree. The new bachelor's degree program opens up deeper learning opportunities to undergrads by blending traditional engineering and math requirements with other courses on ethics and philosophy as they relate to AI.

“With the major, we’re really setting out a curriculum that makes sense as a whole,” Greiner said in the release. “We are not simply taking a collection of courses that have been created already and putting a new wrapper around them. We’re actually creating a brand new curriculum. Most of the required courses are brand new courses designed for this major.”

Students in the program will also benefit from resources through Rice’s growing AI ecosystem, like the Ken Kennedy Institute, which focuses on AI solutions and ethical AI. The university also opened its new AI-focused "innovation factory," Rice Nexus, earlier this year.

“We have been building expertise in artificial intelligence,” Ordóñez-Román added in the release. “There are people working here on natural language processing, information retrieval systems for machine learning, more theoretical machine learning, quantum machine learning. We have a lot of expertise in these areas, and I think we’re trying to leverage that strength we’re building.”

Photo by Jasmin Merdan/Getty

Mastering control room management for smoother critical infrastructure operations

Up to Date

Control room management (CRM) systems play an integral role in ensuring the safe and efficient remote operations of automated processes for the world's most critical infrastructures (CI). If anything goes wrong with these CIs, the risks are major: loss of life or catastrophic environmental disasters. For this reason, rigorous regulatory requirements are crucial.

CRM systems give operators the ability to automate and take control of CI processes, giving operators situational awareness and real-time visibility of remote assets. This minimizes the need for manual work and inspection, and scales a company's ability to safely manage many assets over a large geographical area from one control room.

Most CIs handle hazardous materials in some, if not all, of their operational areas. Though regulations differ by industry, rigorous oversight is essential in every case.

ICS (industrial control systems) and CRM tools are key components of real-time monitoring, providing advance warning and emergency alarming. A “green, amber, red” alert on the screen of an operator's control console prompts a response and, in the worst case, leads to emergency shutdown procedures. Training and testing of the control systems and their related standards, procedures, and activities are all recorded in a system of record in compliance with regulatory requirements.
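The green/amber/red escalation described above can be sketched as a simple threshold check. The sensor name, units, and limits below are hypothetical placeholders for illustration, not values from any specific CRM product:

```python
# Illustrative sketch of a green/amber/red alarm classifier for a control
# console. Thresholds and the pressure sensor are hypothetical examples.

def classify_reading(value, amber_limit, red_limit):
    """Map a sensor reading to an alert level based on two thresholds."""
    if value >= red_limit:
        return "red"    # emergency: may trigger shutdown procedures
    if value >= amber_limit:
        return "amber"  # advance warning: operator should investigate
    return "green"      # normal operation

# Example: pipeline pressure in psi (assumed limits)
AMBER_PSI, RED_PSI = 900, 1100

for pressure in (750, 950, 1150):
    level = classify_reading(pressure, AMBER_PSI, RED_PSI)
    print(f"pressure={pressure} psi -> {level}")
```

A real CRM system layers deadbands, time delays, and alarm-shelving rules on top of raw thresholds to avoid nuisance alarms, but the core mapping from reading to alert level looks like this.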

Current challenges
One of the biggest challenges is the ability to easily aggregate the data from the many different systems and integrate them with the operator's daily activity and responses to the many notifications they receive. This makes it difficult for handover, when a new control room operator comes in fresh to take over from the operator coming off duty. Ensuring a clean and clear handover that encompasses all the pertinent information, so that the new operator can take over the console with ease and clarity, is much more difficult than some would imagine.

Another issue is the sheer volume of data. With thousands of sensors streaming data, it is not unrealistic for a console to receive a few thousand data points per second. Performance and continuity are priorities on a CI control room console, so there is no room for error, and, quite literally, no room for big data.

All of this means that real-time data must be pushed off the operational and process control network into an environment outside those controls, where big data can be stored to support big-data analytics, enabling AI, machine learning, and other data science.

Controller/operator fatigue is also an issue. Manual tracking, documenting, and record-keeping increases fatigue, leading to more mistakes and omissions.

Opportunities for improvement
Houston-based Tory Technologies, Inc. is a corporation specializing in advanced software applications, creating and integrating innovative technologies, and providing solutions for control room management and electronic flow measurement data management.

Tory Technologies, Inc. can help with the auto population of forms, inclusion of historical alarms and responses, and easy handover of control with active/open issues highlighted, making for an easier transition from one operator to the next.

"CRM is essential for keeping operations safe and efficient in industries where mistakes can lead to serious problems," says Juan Torres, director of operations - MaCRoM at Tory Technologies, Inc. "While many control rooms have worked hard to meet compliance standards, challenges remain that can affect performance and safety. It's not enough to just meet the basic rules; we need to go further by using smarter tools and strategies that make CRM more than just compliant, but truly effective."

Shaun Six, president of UTSI International, notes that, "CRM solutions are scalable. A smart integration with relevant systems and related data will reduce 'white noise' and increase relevance of data being displayed at the right time, or recalled when most helpful."

The future state
Offering CRM as a service for non-regulated control rooms will give economies of scale to critical infrastructure operators, which will allow dispatching, troubleshooting, and network monitoring so operators can focus on more value-add activities.

It can also virtualize network monitoring, ensuring that field machines and edge computers are compliant with industry and company standards and are not exposed to external threats.

Even better: Much of this can be automated. Smart tools can look through each device and test that passwords are changed, configurations are secure, and firmware/software has been properly patched or safeguarded against known exploits.
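The automated checks described above can be sketched as a simple compliance sweep over device records. The record fields, policy values, and version scheme below are hypothetical placeholders, not any vendor's actual schema:

```python
# Hedged sketch of an automated device-compliance sweep: for each field
# device, verify password age, configuration hash, and firmware version.
# All device records and policy values here are invented for illustration.
from datetime import date

POLICY = {
    "max_password_age_days": 90,
    "approved_config_hashes": {"a1b2c3"},
    "min_firmware": (2, 4, 1),  # lowest version with known exploits patched
}

def check_device(device, today):
    """Return a list of compliance findings for one device record."""
    findings = []
    if (today - device["password_changed"]).days > POLICY["max_password_age_days"]:
        findings.append("stale password")
    if device["config_hash"] not in POLICY["approved_config_hashes"]:
        findings.append("unapproved configuration")
    if device["firmware"] < POLICY["min_firmware"]:
        findings.append("unpatched firmware")
    return findings

device = {
    "password_changed": date(2025, 1, 5),
    "config_hash": "a1b2c3",
    "firmware": (2, 3, 9),
}
print(check_device(device, date(2025, 6, 1)))  # -> ['stale password', 'unpatched firmware']
```

In practice the device inventory would come from the network-monitoring tool itself, and findings would feed the curated, risk-ranked report described in the next paragraph.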

The sheer volume of data from these exercises can be overwhelming to operators. But a trained professional can easily filter and curate this data, cutting through the noise and helping asset owners address high-risk/high-probability exploits and plan/manage them.

Ultimately, the goal is to make control rooms efficient, getting the right information to the right people at the right time, while also retaining and maintaining required documents and data, ensuring an operator's “license to operate” is uninterrupted and easily accessible to external parties when requested or needed.

Integrating smart CRM systems, network monitoring tools, and processes for testing and validation is well within reach of current technology, letting operators focus on the task at hand with ease and peace of mind.

BrainLM is now well-trained enough to use to fine-tune a specific task and to ask questions in other studies. Photo via Getty Images

Houston researchers create AI model to tap into how brain activity relates to illness

brainiac

Houston researchers are part of a team that has created an AI model intended to understand how brain activity relates to behavior and illness.

Scientists from Baylor College of Medicine worked with peers from Yale University, University of Southern California and Idaho State University to make Brain Language Model, or BrainLM. Their research was published as a conference paper at ICLR 2024, a meeting of some of deep learning’s greatest minds.

“For a long time we’ve known that brain activity is related to a person’s behavior and to a lot of illnesses like seizures or Parkinson’s,” Dr. Chadi Abdallah, associate professor in the Menninger Department of Psychiatry and Behavioral Sciences at Baylor and co-corresponding author of the paper, says in a press release. “Functional brain imaging or functional MRIs allow us to look at brain activity throughout the brain, but we previously couldn’t fully capture the dynamic of these activities in time and space using traditional data analytical tools.

"More recently, people started using machine learning to capture the brain's complexity and how it relates to specific illnesses, but that turned out to require enrolling and fully examining thousands of patients with a particular behavior or illness, a very expensive process,” Abdallah continues.

Using 80,000 brain scans, the team was able to train their model to figure out how brain activities related to one another. Over time, this created the BrainLM brain activity foundational model. BrainLM is now well-trained enough to use to fine-tune a specific task and to ask questions in other studies.

Abdallah said that using BrainLM will cut costs significantly for scientists developing treatments for brain disorders. In clinical trials, it can cost “hundreds of millions of dollars,” he said, to enroll numerous patients and treat them over a significant time period. By using BrainLM, researchers can enroll half the subjects because the AI can select the individuals most likely to benefit.

The team found that BrainLM performed successfully in many different samples. That included predicting depression, anxiety and PTSD severity better than other machine learning tools that do not use generative AI.

“We found that BrainLM is performing very well. It is predicting brain activity in a new sample that was hidden from it during the training as well as doing well with data from new scanners and new population,” Abdallah says. “These impressive results were achieved with scans from 40,000 subjects. We are now working on considerably increasing the training dataset. The stronger the model we can build, the more we can do to assist with patient care, such as developing new treatment for mental illnesses or guiding neurosurgery for seizures or DBS.”

For those suffering from neurological and mental health disorders, BrainLM could be a key to unlocking treatments that will make a life-changing difference.

The UH team is developing ways to use machine learning to ensure that power systems can continue to run efficiently when pulling their energy from wind and solar sources. Photo via Getty Images

Houston researcher scores prestigious NSF award for machine learning, power grid tech

grant funding

An assistant professor at the University of Houston received the highly competitive National Science Foundation CAREER Award earlier this month for a proposal focused on integrating renewable resources to improve power grids.

The award grants more than $500,000 to Xingpeng Li, assistant professor of electrical and computer engineering and leader of the Renewable Power Grid Lab at UH, to continue his work on developing ways to use machine learning to ensure that power systems can continue to run efficiently when pulling their energy from wind and solar sources, according to a statement from UH. This work has applications in the events of large disturbances to the grid.

Li explains that currently, power grids run off of converted, stored kinetic energy during grid disturbances.

"For example, when the grid experiences sudden large generation losses or increased electrical loads, the stored kinetic energy immediately converted to electrical energy and addressed the temporary shortfall in generation,” Li said in a statement. “However, as the proportion of wind and solar power increases in the grid, we want to maximize their use since their marginal costs are zero and they provide clean energy. Since we reduce the use of those traditional generators, we also reduce the power system inertia (or stored kinetic energy) substantially.”
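Li's point about falling inertia can be made concrete with a standard back-of-the-envelope formula: after a sudden generation loss, the initial rate of change of frequency (RoCoF) is inversely proportional to the system's inertia constant. The system size, loss, and inertia values below are illustrative assumptions, not ERCOT or UH figures:

```python
# Minimal sketch of why lower inertia makes grid disturbances harsher.
# After a sudden generation loss, the initial rate of change of frequency
# is approximately: RoCoF = (dP / S) * f0 / (2 * H)
# where dP is the lost power (MW), S the system base (MVA), f0 the nominal
# frequency (Hz), and H the system inertia constant (s).
# All numbers below are illustrative assumptions.

def rocof(delta_p_mw, f0_hz, inertia_h_s, base_mva):
    """Initial frequency decline in Hz/s after losing delta_p_mw of generation."""
    return (delta_p_mw / base_mva) * f0_hz / (2 * inertia_h_s)

BASE_MVA = 80_000   # assumed system size
LOSS_MW = 1_000     # sudden loss of a large plant
F0 = 60.0           # nominal frequency in North America

for h in (5.0, 2.5):  # conventional grid vs. renewable-heavy, low-inertia grid
    print(f"H={h} s -> RoCoF = {rocof(LOSS_MW, F0, h, BASE_MVA):.3f} Hz/s")
```

Halving the inertia constant doubles the initial frequency decline for the same disturbance, which is why scheduling tools must account for inertia as wind and solar displace synchronous generators.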

Li plans to use machine learning to create more streamlined models that can be implemented into day-ahead scheduling applications that grid operators currently use.

“With the proposed new modeling and computational approaches, we can better manage grids and ensure it can supply continuous quality power to all the consumers," he said.

In addition to supporting Li's research and model creations, the funds will also go toward Li and his team's creation of a free, open-source tool for students from kindergarten up through their graduate studies. They are also developing an “Applied Machine Learning in Power Systems” course. Li says the course will help meet workforce needs.

The CAREER Award recognizes early-career faculty members who “have the potential to serve as academic role models in research and education and to lead advances in the mission of their department or organization,” according to the NSF. It's given to about 500 researchers each year.

Earlier this year, Rice assistant professor Amanda Marciel was also granted an NSF CAREER Award to continue her research in designing branched elastomers that return to their original shape after being stretched. The research has applications in stretchable electronics and biomimetic tissues.

------

This article originally ran on EnergyCapital.

The NIH grant goes toward TransplantAI's work developing more precise models for heart and lung transplantation. Photo via Getty Images

Houston health tech company scores $2.2M grant to use AI to make organ transplants smarter, more successful

future of medicine

The National Institutes of Health has bestowed a Houston medtech company with a $2.2 million Fast-Track to Phase 2 award. InformAI will use the money for the product development and commercialization of its AI-enabled organ transplant informatics platform.

Last year, InformAI CEO Jim Havelka told InnovationMap, “A lot of organs are harvested and discarded.”

TransplantAI solves that problem, as well as organ scarcity and inefficiency in allocation of the precious resource.

How does it work? Machine learning and deep learning from a million donor transplants inform the AI, which determines the best recipient for each available organ using more than 500 clinical parameters. Organ transplant centers and organ procurement organizations (OPOs) will be able to use the product to make a decision on how to allocate each organ in real time. Ultimately, the tool will serve 250 transplant centers and 56 OPOs around the United States.
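The matching step described here, scoring each waitlisted candidate for a given organ and ranking them, can be sketched generically. The features, weights, and linear scoring function below are invented for illustration and are emphatically not TransplantAI's actual model, which uses trained deep learning over 500-plus clinical parameters:

```python
# Generic, illustrative sketch of organ-recipient ranking. The score,
# features, and weights are placeholders, not TransplantAI's method.

def score_candidate(donor, candidate, weights):
    """Toy compatibility score: higher means a better predicted outcome."""
    s = 0.0
    s += weights["blood_match"] * (donor["blood_type"] == candidate["blood_type"])
    s += weights["age_gap"] * -abs(donor["age"] - candidate["age"])
    s += weights["urgency"] * candidate["urgency"]
    return s

weights = {"blood_match": 10.0, "age_gap": 0.1, "urgency": 2.0}
donor = {"blood_type": "O", "age": 34}
candidates = [
    {"id": "A", "blood_type": "O", "age": 40, "urgency": 3},
    {"id": "B", "blood_type": "B", "age": 35, "urgency": 5},
]

# Rank all waitlisted candidates for this organ, best match first.
ranked = sorted(candidates, key=lambda c: score_candidate(donor, c, weights),
                reverse=True)
print([c["id"] for c in ranked])  # -> ['A', 'B']
```

The real system's value lies in replacing hand-tuned weights like these with outcome predictions learned from a million historical transplants.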

The NIH grant goes toward developing more precise models for heart and lung transplantation (kidney and liver algorithms are further along in development thanks to a previous award from the National Science Foundation), as well as Phase 2 efforts to fully commercialize TransplantAI.

"There is an urgent need for improved and integrated predictive clinical insights in solid organ transplantation, such as for real-time assessment of waitlist mortality and the likelihood of successful post-transplantation outcomes," according to the grant’s lead clinical investigator, Abbas Rana, associate professor of surgery at Baylor College of Medicine.

“This information is essential for healthcare teams and patients to make informed decisions, particularly in complex cases where expanded criteria allocation decisions are being considered," Rana continues. "Currently, the separation of donor and recipient data into different systems requires clinical teams to conduct manual, parallel reviews for pairing assessments. Our team, along with those at other leading transplant centers nationwide, receives hundreds of organ-recipient match offers weekly.”

Organ transplantation is moving into the future, and TransplantAI is at the forefront.

Houston can learn a lot from the decades of success from Silicon Valley, according to this Houston founder, who outlines just what all the city needs to do to become the startup city it has the potential to be. Photo via Getty Images

Houston expert: Can Houston replicate and surpass the success of Silicon Valley?

guest column

Anyone who knows me knows that, as a Houston startup founder, I often muse about the still-developing potential for startups in Houston, especially considering the city's industry, subject matter expertise, capital, and size.

For example, Houston is No. 2 in the country for Fortune 500 Companies — with 26 Bayou City companies on the list — behind only NYC, which has 47 ranked corporations, according to Fortune.

Considering layoffs, fund closings, and down rounds, things aren’t all that peachy in San Francisco for the first time in a long time, and despite being a Berkeley native, I’m rooting for Houston now that I’m a transplant.

Let’s start by looking at some stats.

While we’re not No. 1 in all areas, I believe we have the building blocks to be a major player in startups and in tech (and not just energy and space tech). How? If the best predictor of future success is history, why not use the template of the GOAT of all startup cities: San Francisco and YCombinator? Sorry, fellow founders – you’ve heard me talk about this repeatedly.

YCombinator is considered the GOAT of Startup Accelerators/Incubators based on:

  1. The startup success rate: I’ve heard it’s as high as 75 percent (vs. the national average of 5 to 10 percent). Arc Search says 50 percent of YC companies fail within 12 years – not shabby.
  2. Their startup-to-unicorn ratio: 5 to 7 percent of YC startups become unicorns depending on the source — according to an Arc Search search (if you haven’t tried Arc Search do – super cool).
  3. Their network.

YC also parlayed that success into a "YC Startup School" offering:

  1. Free weekly lessons by YC partners — sometimes featuring unicorn alumni
  2. A document and video Library (YC SAFE, etc)
  3. Startup perks for students (AWS cloud credits, etc.)
  4. YC co-founder matching to help founders meet co-founders

Finally, there’s the over $80 billion in returns they’ve generated since their 2005 inception, according to Arc Search, with a total of 4,000 companies in their portfolio at over $600 billion in value. So GOAT? Well, just for perspective, there were a jaw-dropping 18,000 startups in Startup School the year I participated – so GOAT indeed.

So how do they do it? Based on anecdotal evidence, their winning formula is said to be the following well-oiled process:

  1. Bring over 282 startups (the number in the last cohort) to San Francisco for 90 days to prototype, refine the product, and land on the go-to-market strategy. This includes a pre-seed YC SAFE investment: a phased $500,000 commitment for a fixed minimum of 7 percent of equity, plus more equity at the next round’s valuation, according to YC.
  2. Over 50 percent of the latest cohort were idea stage and heavily AI focused.
  3. Traction day: generate traction within the portfolio. YC has over 4,000 portfolio companies who can and do sign up for each other’s products because “they’re told to."
  4. Get beta testers and test from YC portfolio companies and YC network.
  5. If they see the traction scales to a massively scalable business, they lead the seed round and get this: schedule and attend the VC meetings with the founders.
  6. They create a "fear of missing out" mentality on Sand Hill Road as they casually mention who they’re meeting with next.
  7. They block competitors in the sector by getting the top VCs to co-invest with them in the seed round, so competitors are locked out of the A-list VC funding market and left up against the most well-funded and buzzed-about players in the space.

If what I've seen is true, within a six-month period a startup idea is prototyped, tested, pivoted, launched, tractioned, seeded, and juiced for scale with people who can ‘make’ the company all in their corner, if not already on their board.

So how on earth can Houston best this?

  1. We have a massive number of businesses — around 200,000 — and people — an estimated 7.3 million and growing.
  2. We have capital in search of an identity beyond oil.
  3. We have Fortune 500 companies hiring consultants for work that startups here could do quicker and for a fraction of the cost.
  4. We have a growing base of potential machine learning and artificial intelligence talent.
  5. We have a sudden shot at big tech engineers who are increasingly being laid off.
  6. We have more accelerators and incubators.

What do we need to pull it off?

  1. An organized well-oiled YC-like process
  2. An inter-Houston traction process
  3. An "Adopt a Startup" program where local companies are willing to beta test and iterate with emerging startup products
  4. Larger accelerator cohorts: we have more accelerators, but cohorts average only five to 10 startups.
  5. Strategic pre-seed funding, possibly with corporate partners (who can make the company by being a client) and who de-risk the investment.
  6. Companies here to use Houston startup’s products first when they’re launched.
  7. A forum to match companies’ projects or labs groups etc., to startups who can solve them.
  8. A process in place to pull all these pieces together in an organized, structured sequence.

There is one thing missing in the list: there has to be an entity or a person who wants to make this happen. Someone who sees all the pieces, and has the desire, energy and clout to make it happen; and we all know this is the hardest part. And so for now, our hopes of besting YC may be up in the air as well.

------

Jo Clark is the founder of Circle.ooo, a Houston-based tech startup that's streamlining events management.


Texas plugs in among states at highest risk for summer power outages in 2025

hot, hot, hot

Warning: Houston could be in for an especially uncomfortable summer.

A new study from solar energy company Wolf River Electric puts Texas at No. 2 among the states most at risk for power outages this summer. Michigan tops the list.

Wolf River Electric analyzed the number of large-scale outages that left more than 5,000 utility customers, including homes, stores and schools, without summertime electricity from 2019 to 2023. During that period, Texas experienced 7,164 summertime power outages.

Despite Michigan being hit with more summertime outages, Texas led the list of states with the most hours of summertime power outages — an annual average of 35,440. That works out to 1,477 days. “This means power cuts in Texas tend to last longer, making summer especially tough for residents and businesses,” the study says.
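The study's conversion checks out: dividing the average outage-hours by 24 hours per day gives roughly 1,477 days (summed outage durations across overlapping events can exceed the calendar year, which is how the total climbs so high):

```python
# Quick check of the study's arithmetic: convert outage-hours to days.
outage_hours = 35_440
outage_days = round(outage_hours / 24)
print(outage_days)  # -> 1477
```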

The Electric Reliability Council of Texas (ERCOT), which operates the electric grid serving 90 percent of the state, predicts its system will set a monthly record for peak demand this August — 85,759 megawatts. That would exceed the current record of 85,508 megawatts, dating back to August 2023.

In 2025, natural gas will account for 37.7 percent of ERCOT’s summertime power-generating capacity, followed by wind (22.9 percent) and solar (19 percent), according to an ERCOT fact sheet.

This year, ERCOT expects four months to surpass peak demand of 80,000 megawatts:

  • June 2025 — 82,243 megawatts
  • July 2025 — 84,103 megawatts
  • August 2025 — 85,759 megawatts
  • September 2025 — 80,773 megawatts

One megawatt is enough power to serve about 250 residential customers amid peak demand, according to ERCOT. Using that figure, the projected peak of 85,759 megawatts in August would supply enough power to serve more than 21.4 million residential customers in Texas.
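ERCOT's rule of thumb, one megawatt for about 250 residential customers at peak, makes the monthly projections easy to translate into households served. A quick sketch:

```python
# Translate ERCOT's projected monthly peaks into households served,
# using ERCOT's rule of thumb of ~250 residential customers per MW at peak.
CUSTOMERS_PER_MW = 250

peaks_mw = {
    "June 2025": 82_243,
    "July 2025": 84_103,
    "August 2025": 85_759,
    "September 2025": 80_773,
}

for month, mw in peaks_mw.items():
    print(f"{month}: {mw:,} MW ~= {mw * CUSTOMERS_PER_MW:,} customers")
```

The August figure works out to 21,439,750 customers, the "more than 21.4 million" cited in the article.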

Data centers, artificial intelligence and population growth are driving up power demand in Texas, straining the ERCOT grid. In January, ERCOT laid out a nearly $33 billion plan to boost power transmission capabilities in its service area.

Houston ranks among top 5 cities for corporate HQ relocations in new report

h-town HQ

The Houston area already holds the title as the country’s third biggest metro hub for Fortune 500 headquarters, behind the New York City and Chicago areas. Now, Houston can tout another HQ accolade: It’s in a fourth-place tie with the Phoenix area for the most corporate headquarters relocations from 2018 to 2024.

During that period, the Houston and Phoenix areas each attracted 31 corporate headquarters, according to new research from commercial real estate services company CBRE. CBRE’s list encompasses public announcements from companies across various sizes and industries about relocating their corporate headquarters within the U.S.

Of the markets included in CBRE’s study, Dallas ranked first for corporate relocations (100) from 2018 to 2024. It’s followed by Austin (81), Nashville (35), Houston and Phoenix (31 each), and Denver (23).

According to CBRE, reasons cited by companies for moving their headquarters include:

  • Access to lower taxes
  • Availability of tax incentives
  • Proximity to key markets
  • Ability to support hybrid work

“Corporations now view headquarters locations as strategic assets, allowing for adaptability and faster reaction to market changes,” said CBRE.

Among the high-profile companies that moved their headquarters to the Houston area from 2018 to 2024 are:

  • Chevron
  • ExxonMobil
  • Hewlett-Packard Enterprise
  • Murphy Oil

Many companies that have shifted their headquarters to the Houston area, such as Chevron, are in the energy sector.

“Chevron’s decision to relocate its headquarters underscores the compelling advantages that position Houston as the prime destination for leading energy companies today and for the future,” Steve Kean, president and CEO of the Greater Houston Partnership, said in 2024. “With deep roots in our region, Chevron is a key player in establishing Houston as a global energy leader. This move will further enhance those efforts.”

According to CBRE, California (particularly the San Francisco Bay and Los Angeles areas) lost the most corporate HQs in 2024, with 17 companies announcing relocations—12 of them to Texas. Also last year, Texas gained nearly half of all state-to-state relocations.

In March, Site Selection magazine awarded Texas its 2024 Governor’s Cup, marking 13 consecutive wins for the state with the most corporate relocations and expansions.

In a news release promoting the latest Governor’s Cup victory, Gov. Greg Abbott hailed Texas as “the headquarters of headquarters.”

“Texas partners with the businesses that come to our great state to grow,” Abbott said. “When businesses succeed, Texas succeeds.”

CBRE explained that the trend of corporate HQ relocations reflects the desire of companies to seek new environments to support their goals and workforce needs.

“Ultimately, companies are seeking to establish themselves in locations with potential for long-term success and profitability,” CBRE said.

SpaceX test rocket explodes in Texas, but no injuries reported

SpaceX Update

A SpaceX rocket being tested in Texas exploded Wednesday night, sending a dramatic fireball high into the sky.

The company said the Starship “experienced a major anomaly” at about 11 pm while on the test stand preparing for the 10th flight test at Starbase, SpaceX’s launch site at the southern tip of Texas.

“A safety clear area around the site was maintained throughout the operation and all personnel are safe and accounted for,” SpaceX said in a statement on the social platform X.

CEO Elon Musk's SpaceX said there were no hazards to nearby communities. It asked people not to try to approach the site.

The company said it is working with local officials to respond to the explosion.

The explosion comes on the heels of a Starship test flight in late May that tumbled out of control. The FAA demanded an investigation into that accident.