Photo by Jasmin Merdan/Getty

Control room management (CRM) systems play an integral role in ensuring the safe and efficient remote operations of automated processes for the world's most critical infrastructures (CI). If anything goes wrong with these CIs, the risks are major: loss of life or catastrophic environmental disasters. For this reason, rigorous regulatory requirements are crucial.

CRM systems let operators automate and take control of CI processes, providing situational awareness and real-time visibility of remote assets. This minimizes the need for manual work and inspection, and it scales a company's ability to safely manage many assets across a large geographical area from a single control room.

Most CI operators handle hazardous materials in some, if not all, of their operational areas. Though the specifics differ by industry, regulation and oversight are essential.

Industrial control systems (ICS) and CRM tools are key components of real-time monitoring for advance warning and emergency alarming. A “green, amber, red” alert on the screen of an operator's control console prompts them to respond and, when warranted, to follow emergency shutdown procedures. Training and testing of the control systems and their related standards, procedures, and activities are all recorded in a system of record in compliance with regulatory requirements.
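
As a rough illustration of the alert pattern described above, and not any vendor's actual implementation, a console application might map each incoming sensor reading to a green, amber, or red state against configured thresholds and log the operator-facing events. The tag names and threshold values below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Threshold:
    amber: float  # warning level
    red: float    # emergency level

# Hypothetical thresholds for a single pipeline pressure sensor (psi).
THRESHOLDS = {"pipeline_pressure_psi": Threshold(amber=900.0, red=1100.0)}

def classify(tag: str, value: float) -> str:
    """Return 'green', 'amber', or 'red' for one sensor reading."""
    t = THRESHOLDS[tag]
    if value >= t.red:
        return "red"
    if value >= t.amber:
        return "amber"
    return "green"

def handle_reading(tag: str, value: float) -> None:
    state = classify(tag, value)
    if state == "red":
        # A real CRM would raise an alarm and walk the operator through the
        # documented emergency shutdown procedure, logging every step.
        print(f"ALARM {tag}={value}: start emergency shutdown checklist")
    elif state == "amber":
        print(f"WARNING {tag}={value}: operator acknowledgement required")

handle_reading("pipeline_pressure_psi", 1150.0)
```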

Current challenges
One of the biggest challenges is the ability to easily aggregate the data from the many different systems and integrate them with the operator's daily activity and responses to the many notifications they receive. This makes it difficult for handover, when a new control room operator comes in fresh to take over from the operator coming off duty. Ensuring a clean and clear handover that encompasses all the pertinent information, so that the new operator can take over the console with ease and clarity, is much more difficult than some would imagine.

Another issue is the sheer volume of data. With thousands of sensors streaming data, it is not unrealistic for a console to receive a few thousand data points per second. Performance and continuity are priorities for CI control room consoles, so there is no room for error, and no room, quite literally, for big data.

All of this means that real-time data must be pushed off the operational and process control network and into an environment outside those controls, where big data can be stored and analyzed, enabling AI, machine learning, and other data science.
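
As a minimal sketch of that one-way movement of data, with hypothetical tags and paths rather than any particular historian's API, readings can be batched on the process-control side and dropped into an analytics-side store where heavier big-data tooling is allowed.

```python
import json
import time
from pathlib import Path

# Hypothetical landing zone on the analytics side of the one-way transfer.
ANALYTICS_DROP = Path("analytics_incoming")

def forward_batch(readings: list[dict]) -> None:
    """Write a batch of sensor readings as a JSON-lines file.

    In practice the transfer would cross a DMZ or data diode so that
    nothing can flow back toward the process control network.
    """
    ANALYTICS_DROP.mkdir(parents=True, exist_ok=True)
    out = ANALYTICS_DROP / f"readings_{int(time.time())}.jsonl"
    with out.open("w") as f:
        for r in readings:
            f.write(json.dumps(r) + "\n")

forward_batch([
    {"tag": "pipeline_pressure_psi", "value": 912.4, "ts": time.time()},
    {"tag": "pump_vibration_mm_s", "value": 3.1, "ts": time.time()},
])
```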

Controller/operator fatigue is also an issue. Manual tracking, documenting, and record-keeping increase fatigue, leading to more mistakes and omissions.

Opportunities for improvement
Houston-based Tory Technologies, Inc. is a corporation specializing in advanced software applications, creating and integrating various innovative technologies, and providing solutions for control room management and electronic flow measurement data management.

Tory Technologies, Inc. can help with auto-population of forms, inclusion of historical alarms and responses, and easy handover of control with active/open issues highlighted, making for an easier transition from one operator to the next.

"CRM is essential for keeping operations safe and efficient in industries where mistakes can lead to serious problems," says Juan Torres, director of operations - MaCRoM at Tory Technologies, Inc. "While many control rooms have worked hard to meet compliance standards, challenges remain that can affect performance and safety. It's not enough to just meet the basic rules; we need to go further by using smarter tools and strategies that make CRM more than just compliant, but truly effective."

Shaun Six, president of UTSI International, notes that, "CRM solutions are scalable. A smart integration with relevant systems and related data will reduce 'white noise' and increase relevance of data being displayed at the right time, or recalled when most helpful."

The future state
Offering CRM as a service for non-regulated control rooms will give critical infrastructure operators economies of scale, handling dispatching, troubleshooting, and network monitoring so operators can focus on higher-value activities.

It can also virtualize network monitoring, ensuring that field machines and edge computers are compliant with industry and company standards and are not exposed to external threats.

Even better: Much of this can be automated. Smart tools can look through each device and test that passwords are changed, configurations are secure, and firmware/software has been properly patched or safeguarded against known exploits.
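
A minimal sketch of that kind of automated check follows; the device records, version numbers, and policy are hypothetical and generic, not any particular vendor's tooling.

```python
from dataclasses import dataclass

@dataclass
class EdgeDevice:
    name: str
    default_password: bool
    firmware_version: str
    config_hardened: bool

# Hypothetical minimum firmware version with known exploits patched.
MIN_FIRMWARE = (2, 4, 1)

def parse_version(v: str) -> tuple[int, ...]:
    return tuple(int(x) for x in v.split("."))

def audit(devices: list[EdgeDevice]) -> list[str]:
    """Return human-readable findings for non-compliant devices."""
    findings = []
    for d in devices:
        if d.default_password:
            findings.append(f"{d.name}: default password still in use")
        if parse_version(d.firmware_version) < MIN_FIRMWARE:
            findings.append(f"{d.name}: firmware {d.firmware_version} is below the patched baseline")
        if not d.config_hardened:
            findings.append(f"{d.name}: configuration not hardened")
    return findings

print(audit([EdgeDevice("rtu-07", True, "2.3.9", False)]))
```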

The sheer volume of data from these exercises can be overwhelming to operators. But a trained professional can easily filter and curate this data, cutting through the noise and helping asset owners prioritize, plan for, and manage high-risk, high-probability exploits.

Ultimately, the goal is to make control rooms efficient, getting the right information to the right people at the right time, while also retaining and maintaining required documents and data, ensuring an operator's “license to operate” is uninterrupted and easily accessible to external parties when requested or needed.

Smart CRM systems, network monitoring tools, and processes for testing and validating procedures are all readily available with current technology, letting operators focus on the task at hand with ease and peace of mind.

BrainLM is now well-trained enough to be fine-tuned for specific tasks and used to ask questions in other studies. Photo via Getty Images

Houston researchers create AI model to tap into how brain activity relates to illness

brainiac

Houston researchers are part of a team that has created an AI model intended to understand how brain activity relates to behavior and illness.

Scientists from Baylor College of Medicine worked with peers from Yale University, University of Southern California and Idaho State University to make Brain Language Model, or BrainLM. Their research was published as a conference paper at ICLR 2024, a meeting of some of deep learning’s greatest minds.

“For a long time we’ve known that brain activity is related to a person’s behavior and to a lot of illnesses like seizures or Parkinson’s,” Dr. Chadi Abdallah, associate professor in the Menninger Department of Psychiatry and Behavioral Sciences at Baylor and co-corresponding author of the paper, says in a press release. “Functional brain imaging or functional MRIs allow us to look at brain activity throughout the brain, but we previously couldn’t fully capture the dynamic of these activities in time and space using traditional data analytical tools.

"More recently, people started using machine learning to capture the brain complexity and how it relates it to specific illnesses, but that turned out to require enrolling and fully examining thousands of patients with a particular behavior or illness, a very expensive process,” Abdallah continues.

Using 80,000 brain scans, the team trained the model to learn how brain activities relate to one another. Over time, this produced BrainLM, a foundation model of brain activity. BrainLM is now well-trained enough to be fine-tuned for specific tasks and used to ask questions in other studies.
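
As a generic illustration of what fine-tuning a pretrained foundation model looks like, and not BrainLM's actual code or API, the pretrained encoder is typically frozen while a small task head is trained on labeled scans, for example to predict a symptom-severity score. The layer sizes here are made up.

```python
import torch
import torch.nn as nn

# Stand-in for a pretrained foundation-model encoder (hypothetical sizes).
encoder = nn.Sequential(nn.Linear(424, 512), nn.ReLU(), nn.Linear(512, 256))
encoder.requires_grad_(False)  # freeze the pretrained weights

# Small task-specific head, e.g. predicting a severity score per scan.
head = nn.Linear(256, 1)
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def finetune_step(scan_features: torch.Tensor, severity: torch.Tensor) -> float:
    """One gradient step on the head; the encoder stays fixed."""
    with torch.no_grad():
        embeddings = encoder(scan_features)
    pred = head(embeddings).squeeze(-1)
    loss = loss_fn(pred, severity)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy batch: 8 scans, each summarized as 424 regional activity features.
print(finetune_step(torch.randn(8, 424), torch.rand(8)))
```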

Abdallah said that using BrainLM will cut costs significantly for scientists developing treatments for brain disorders. In clinical trials, it can cost “hundreds of millions of dollars,” he said, to enroll numerous patients and treat them over a significant time period. By using BrainLM, researchers can enroll half the subjects because the AI can select the individuals most likely to benefit.

The team found that BrainLM performed successfully in many different samples. That included predicting depression, anxiety and PTSD severity better than other machine learning tools that do not use generative AI.

“We found that BrainLM is performing very well. It is predicting brain activity in a new sample that was hidden from it during the training as well as doing well with data from new scanners and new population,” Abdallah says. “These impressive results were achieved with scans from 40,000 subjects. We are now working on considerably increasing the training dataset. The stronger the model we can build, the more we can do to assist with patient care, such as developing new treatment for mental illnesses or guiding neurosurgery for seizures or DBS.”

For those suffering from neurological and mental health disorders, BrainLM could be a key to unlocking treatments that will make a life-changing difference.

The UH team is developing ways to use machine learning to ensure that power systems can continue to run efficiently when pulling their energy from wind and solar sources. Photo via Getty Images

Houston researcher scores prestigious NSF award for machine learning, power grid tech

grant funding

An assistant professor at the University of Houston received the highly competitive National Science Foundation CAREER Award earlier this month for a proposal focused on integrating renewable resources to improve power grids.

The award grants more than $500,000 to Xingpeng Li, assistant professor of electrical and computer engineering and leader of the Renewable Power Grid Lab at UH, to continue his work on developing ways to use machine learning to ensure that power systems can continue to run efficiently when pulling their energy from wind and solar sources, according to a statement from UH. This work has applications in the event of large disturbances to the grid.

Li explains that power grids currently rely on stored kinetic energy, converted to electrical energy, to ride through grid disturbances.

"For example, when the grid experiences sudden large generation losses or increased electrical loads, the stored kinetic energy immediately converted to electrical energy and addressed the temporary shortfall in generation,” Li said in a statement. “However, as the proportion of wind and solar power increases in the grid, we want to maximize their use since their marginal costs are zero and they provide clean energy. Since we reduce the use of those traditional generators, we also reduce the power system inertia (or stored kinetic energy) substantially.”

Li plans to use machine learning to create more streamlined models that can be implemented into day-ahead scheduling applications that grid operators currently use.

“With the proposed new modeling and computational approaches, we can better manage grids and ensure it can supply continuous quality power to all the consumers," he said.

In addition to supporting Li's research and modeling work, the funds will go toward his team's creation of a free, open-source tool for students from kindergarten through graduate school. They are also developing an “Applied Machine Learning in Power Systems” course, which Li says will help meet workforce needs.

The CAREER Award recognizes early-career faculty members who “have the potential to serve as academic role models in research and education and to lead advances in the mission of their department or organization,” according to the NSF. It's given to about 500 researchers each year.

Earlier this year, Rice assistant professor Amanda Marciel was also granted an NSF CAREER Award to continue her research in designing branch elastomers that return to their original shape after being stretched. The research has applications in stretchable electronics and biomimetic tissues.

------

This article originally ran on EnergyCapital.

The NIH grant goes toward TransplantAI's work developing more precise models for heart and lung transplantation. Photo via Getty Images

Houston health tech company scores $2.2M grant to use AI to make organ transplants smarter, more successful

future of medicine

The National Institutes of Health has awarded a Houston medtech company a $2.2 million Fast-Track to Phase 2 grant. InformAI will use the money for product development and commercialization of its AI-enabled organ transplant informatics platform.

Last year, InformAI CEO Jim Havelka told InnovationMap, “A lot of organs are harvested and discarded.”

TransplantAI addresses that problem, as well as organ scarcity and inefficiency in the allocation of this precious resource.

How does it work? Machine learning and deep learning from a million donor transplants informs the AI, which determines who is the best recipient for each available organ using more than 500 clinical parameters. Organ transplant centers and organ procurement organizations (OPOs) will be able to use the product to make a decision on how to allocate each organ in real time. Ultimately, the tool will service 250 transplant centers and 56 OPOs around the United States.
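
As a purely illustrative sketch of this kind of donor-recipient matching, and not TransplantAI's actual model or parameters, a trained model scores each candidate pair and the highest-scoring viable recipients are surfaced to the clinical team. The fields and numbers below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    recipient_id: str
    blood_type_match: bool
    predicted_5yr_survival: float  # hypothetical model output, 0 to 1
    waitlist_days: int

def rank_candidates(candidates: list[Candidate]) -> list[Candidate]:
    """Rank viable candidates for a single donor organ.

    Real allocation weighs hundreds of clinical parameters and policy
    rules; this toy score combines only two illustrative factors.
    """
    viable = [c for c in candidates if c.blood_type_match]
    return sorted(
        viable,
        key=lambda c: (c.predicted_5yr_survival, c.waitlist_days),
        reverse=True,
    )

offers = rank_candidates([
    Candidate("R-101", True, 0.82, 340),
    Candidate("R-102", True, 0.77, 900),
    Candidate("R-103", False, 0.91, 120),
])
print([c.recipient_id for c in offers])  # ['R-101', 'R-102']
```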

The NIH grant goes toward developing more precise models for heart and lung transplantation (kidney and liver algorithms are further along in development thanks to a previous award from the National Science Foundation), as well as Phase 2 efforts to fully commercialize TransplantAI.

"There is an urgent need for improved and integrated predictive clinical insights in solid organ transplantation, such as for real-time assessment of waitlist mortality and the likelihood of successful post-transplantation outcomes," according to the grant’s lead clinical investigator, Abbas Rana, associate professor of surgery at Baylor College of Medicine.

“This information is essential for healthcare teams and patients to make informed decisions, particularly in complex cases where expanded criteria allocation decisions are being considered," Rana continues. "Currently, the separation of donor and recipient data into different systems requires clinical teams to conduct manual, parallel reviews for pairing assessments. Our team, along with those at other leading transplant centers nationwide, receives hundreds of organ-recipient match offers weekly.”

Organ transplantation is moving into the future, and TransplantAI is at the forefront.

Houston can learn a lot from the decades of success of Silicon Valley, according to this Houston founder, who outlines just what the city needs to do to become the startup city it has the potential to be. Photo via Getty Images

Houston expert: Can Houston replicate and surpass the success of Silicon Valley?

guest column

Anyone who knows me knows that, as a Houston startup founder, I often muse about the still-developing potential for startups in Houston, especially considering the amount of industry here, the subject matter expertise, the capital, and the city's size.

For example, Houston is No. 2 in the country for Fortune 500 Companies — with 26 Bayou City companies on the list — behind only NYC, which has 47 ranked corporations, according to Fortune.

Considering layoffs, fund closings, and down rounds, things aren’t all that peachy in San Francisco for the first time in a long time, and despite being a Berkeley native, I’m rooting for Houston now that I’m a transplant.

Let’s start by looking at some stats.

While we’re not No. 1 in all areas, I believe we have the building blocks to be a major player in startups, and in tech (and not just energy and space tech). How? If the best predictor of future success is history, why not use the template of the GOAT of all startup cities: San Francisco and YCombinator. Sorry fellow founders – you’ve heard me talk about this repeatedly.

YCombinator is considered the GOAT of Startup Accelerators/Incubators based on:

  1. The startup success rate: I've heard it's as high as 75 percent (vs. the national average of 5 to 10 percent). Arc Search says 50 percent of YC companies fail within 12 years – not shabby.
  2. Their startup-to-unicorn ratio: 5 to 7 percent of YC startups become unicorns depending on the source — according to an Arc Search search (if you haven’t tried Arc Search do – super cool).
  3. Their network.

YC also parlayed that success into a "YC Startup School" offering:

  1. Free weekly lessons by YC partners — sometimes featuring unicorn alumni
  2. A document and video Library (YC SAFE, etc)
  3. Startup perks for students (AWS cloud credits, etc.)
  4. YC co-founder matching to help founders meet co-founders

Finally, there’s the over $80 billion in returns, according to Arc search, they’ve generated since their 2005 inception with a total of 4,000 companies in their portfolio at over $600 billion in value. So GOAT? Well just for perspective there were a jaw-dropping 18,000 startups in startup school the year I participated – so GOAT indeed.

So how do they do it? Based on anecdotal evidence, their winning formula is said to be the following well-oiled process:

  1. Bring over 282 startups (the number in the last cohort) to San Francisco for 90 days to prototype, refine the product, and land on the go-to-market strategy. This includes a pre-seed YC SAFE investment: a phased $500,000 commitment for a fixed minimum of 7 percent of equity, plus more equity at the next round's valuation, according to YC (see the rough dilution math after this list).
  2. Over 50 percent of the latest cohort were idea stage and heavily AI focused.
  3. Traction day: drive inter-portfolio traction for the company. YC has over 4,000 portfolio companies that can and do sign up for each other's products because “they're told to.”
  4. Get beta testers and testing from YC portfolio companies and the YC network.
  5. If they see that the traction points to a massively scalable business, they lead the seed round and, get this, schedule and attend the VC meetings with the founders.
  6. They create a "fear of missing out" mentality on Sand Hill Road as they casually mention who they’re meeting with next.
  7. They block competitors in the sector by getting the top VCs to co-invest with them in the seed, so competitors are locked out of the A-list VC funding market and are then up against the most well-funded and buzzed-about players in the space.
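
A rough dilution calculation for the SAFE structure described in step 1; the split of the $500,000 and the next-round valuation below are made-up numbers for illustration, not YC's actual terms.

```python
def total_equity_pct(fixed_equity_pct: float,
                     deferred_investment: float,
                     next_round_post_money: float) -> float:
    """Fixed percentage now, plus the portion that converts at the
    next round's post-money valuation."""
    converted_pct = deferred_investment / next_round_post_money * 100
    return fixed_equity_pct + converted_pct

# Hypothetical: 7% fixed, with $375,000 of the commitment converting
# later at a hypothetical $20M post-money round.
print(round(total_equity_pct(7.0, 375_000, 20_000_000), 2))  # 8.88
```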

If what I've seen is true, within a six-month period a startup idea is prototyped, tested, pivoted, launched, tractioned, seeded, and juiced for scale with people who can ‘make’ the company all in their corner, if not already on their board.

So how on earth can Houston best this?

  1. We have a massive number of businesses — around 200,000 — and people — an estimated 7.3 million and growing.
  2. We have capital in search of an identity beyond oil.
  3. Our Fortune 500 companies are hiring consultants for things that startups here can do for free, quicker, and at a fraction of the extended cost.
  4. We have a growing base of tech talent for machine learning and artificial intelligence.
  5. A sudden shot at big tech engineers who are increasingly being laid off.
  6. We have more accelerators and incubators.

What do we need to pull it off?

  1. An organized well-oiled YC-like process
  2. An inter-Houston traction process
  3. An "Adopt a Startup" program where local companies are willing to beta test and iterate with emerging startup products
  4. Larger cohorts: we have more accelerators, but the cohorts are small, averaging five to 10 startups each.
  5. Strategic pre-seed funding, possibly with corporate partners who can make the company by becoming a client and who de-risk the investment.
  6. Companies here willing to use Houston startups' products first when they launch.
  7. A forum to match companies' projects, lab groups, etc., to startups that can solve them.
  8. A process in place to pull all these pieces together in an organized, structured sequence.

There is one thing missing in the list: there has to be an entity or a person who wants to make this happen. Someone who sees all the pieces, and has the desire, energy and clout to make it happen; and we all know this is the hardest part. And so for now, our hopes of besting YC may be up in the air as well.

------

Jo Clark is the founder of Circle.ooo, a Houston-based tech startup that's streamlining events management.

The $63.5 million contract aims to support UH in developing analytical modeling and simulation platforms that help the U.S. Army make timely and effective decisions. Photo via uhsystem.edu

University of Houston lands $63.5M contract with DOD to develop tech for the 'future battlefield'

ready to innovate

The University of Houston was recently awarded its largest contract in history, this time from the U.S. Department of Defense.

The $63.5 million contract aims to support UH in developing analytical modeling and simulation platforms that help the U.S. Army make timely and effective decisions, according to a release from UH.

Craig Glennie, professor of civil and environmental engineering and director of engineering defense research initiatives at the UH Cullen College of Engineering, who is leading the project, says the team's work will focus on creating tools for the time period before conflict begins.

“We are not looking at what happens once bullets start flying. We are looking at what happens during the competition and crisis phases, the buildup and the posturing and the projection of forces before you actually get to the point of armed conflict,” he says in a statement. “The Army needs tools to understand how they can effectively position themselves and project their force towards the adversary in such a manner that they can avoid armed conflict, or if that is not possible, be prepared for the onset of armed conflict.”

The team, which also includes members from the University of Massachusetts Amherst, New Mexico State University and other organizations, will work closely with the U.S. Army Combat Capabilities Development Command Analysis Center, known as DAC. They've been commissioned to help build realistic modeling, analysis and simulation tools that the Army can use in the "future battlefield."

DAC has named several high-priority issues for the team, including quantum technology, artificial intelligence, and machine learning.

“For example, we will look at the electromagnetic spectrum, at owning the airspace, and projecting that we have the radio frequency technology that is capable of jamming a neighbor’s signals," Glennie adds.

UH president Renu Khator says the university is honored to receive the contract.

“We understand the significance of this project in enhancing the Army’s decision-making capabilities, and we are proud to contribute to our nation’s security and strategic competitiveness," she said in a statement. "We look forward to the remarkable contributions that will emerge from this collaboration, strengthening the University of Houston’s commitment to driving innovation that matters.”

UH has inked a number of grants and contracts in recent months that are pushing innovative initiatives forward at the university.

Last month, UH received a $100,000 grant from the Baker Hughes Foundation to go toward workforce development programs, and environmental justice research at its Energy Transition Institute. The ETI was launched last year through a $10 million grant from Shell USA Inc. and Shell Global Solutions (US) Inc.

And earlier this month, Houston-based The Welch Foundation awarded its inaugural $5 million Catalyst for Discovery Program Grant to a new initiative led by Jeffrey Rimer, UH's Abraham E. Dukler Professor of Chemical Engineering. The grant launched the Welch Center for Advanced Bioactive Materials Crystallization, which will build upon Rimer's work relating to the use of crystals to help treat malaria and kidney stones.

Craig Glennie, professor of civil and environmental engineering and director of engineering defense research initiatives at the UH Cullen College of Engineering, is leading the project. Photo via uh.edu

Texas universities develop innovative open-source platform for cell analysis

picture this

What do labs do when faced with large amounts of imaging data? Powerful cloud computing systems have long been the answer to that question, but a new riposte comes from SPACe.

That’s the name of a new open-source image analysis platform designed by researchers at Baylor College of Medicine, Texas A&M University and the University of Houston.

SPACe, or Swift Phenotypic Analysis of Cells, was created to be used on standard computers that even small labs can access, meaning cellular analysis using images produced through cell painting has a lower barrier to entry than ever before.

“The pharmaceutical industry has been accustomed to simplifying complex data into single metrics. This platform allows us to shift away from that approach and instead capture the full diversity of cellular responses, providing richer, more informative data that can reveal new avenues for drug development,” says Michael Mancini, professor of molecular and cellular biology and director of the Gulf Coast Consortium Center for Advanced Microscopy and Image Informatics, co-located at Baylor College of Medicine and the TAMU Institute for Bioscience and Technology.

SPACe is not only accessible because of its less substantial computational needs. Because the platform is open-source, it’s available to anyone who needs it. And it can be used by academic and pharmaceutical researchers alike.

“The platform allows for the identification of non-toxic effects of drugs, such as alterations in cell shape or effects on specific organelles, which are often overlooked by traditional assays that focus largely on cell viability,” says Fabio Stossi, currently a senior scientist with St. Jude Children’s Research Hospital, the lead author who was at Baylor during the development of SPACe.

The platform analyzes thousands of individual cells from automated imaging platforms, better capturing the variability of biological processes. That gives scientists an enhanced understanding of the interactions between drugs and cells, and it does so on standard computers, letting researchers perform large-scale drug screenings with greater ease.
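
As a generic sketch of the per-cell measurement idea behind image-based profiling, and not SPACe's actual pipeline, individual objects are segmented from a (here synthetic) image and simple shape and intensity features are extracted for each one.

```python
import numpy as np
from skimage import measure

# Synthetic "nucleus channel": two bright blobs on a dark background.
img = np.zeros((64, 64))
img[10:22, 10:22] = 1.0
img[35:50, 30:48] = 0.8

# Segment objects with a simple threshold, then measure each one.
labels = measure.label(img > 0.5)
for region in measure.regionprops(labels, intensity_image=img):
    # Per-cell features of the kind aggregated in image-based profiling.
    print(region.label, region.area, round(region.eccentricity, 3),
          round(region.mean_intensity, 3))
```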

"This tool could be a game-changer in how we understand cellular biology and discover new drugs. By capturing the full complexity of cellular responses, we are opening new doors for drug discovery that go beyond toxicity,” says Stossi.

And the fact that it’s open-source allows scientists to access SPACe for free right now. Researchers interested in using the platform can access it through Github at github.com/dlabate/SPACe. This early version could already make waves in research, but the team also plans to continually improve their product with the help of collaborations with other institutions.

The Ion names new coworking partner for Houston innovation hub

Where to Work

Rice University subsidiary Rice Real Estate Co. has tapped coworking company Industrious as the new operator of the Ion’s 86,000-square-foot coworking space in Midtown. Industrious replaces WeWork-owned Common Desk in that role.

The Ion, owned by Rice Real Estate and located at 4201 Main St., is a 266,000-square-foot office building and innovation hub in the 16-acre Ion District.

Features of the coworking space include private suites and offices, dedicated desks, phone booths and conference rooms. In 2022, Common Desk said it was expanding the space by 28,000 square feet, bringing it to the current size.

“(Industrious’) unparalleled expertise in delivering quality, hospitality-driven workspaces complements our vision of creating a world-class ecosystem where entrepreneurs, corporations, and academia converge to drive innovation forward,” Ken Jett, president of Rice Real Estate, said in a statement.

Natalie Levine, senior manager of real estate at Industrious, says her company will work with Rice Real Estate “to continue to position the Ion as an invaluable contributor to the growth of Houston’s innovation community.”

Dallas-based commercial real estate services company CBRE said Jan. 14 that it had agreed to acquire Industrious in a deal valued at $400 million.

The Ion is Industrious’ second location in Houston. The company’s other local coworking space is at 1301 McKinney St.

Office tenants at the Ion include Occidental Petroleum, Fathom Fund, Activate, Carbon Clean, Microsoft and Chevron Technology Ventures.