
As news comes out every week about new technologies, from new crypto wallets to generative AI to self-driving taxis, it can be overwhelming for most of us to keep up or to understand the intricacies of the technology, and it can be easy to say, “The IT department has it covered.” Well, do they have it covered?

Far too often, companies fail to protect their data with the same rigor as their financial security until it is too late. Just as a healthy business will regularly conduct audits of its accounting processes to detect potential fraud, ensure regulatory compliance, and locate areas of improvement for the organization, the same should be done for a business’s data security practices. Key components of any organization are its people and its information, and the IT department is in charge of protecting that information.

We as business people need to ensure that the company’s technology personnel are indeed securing one of the company’s most valuable assets: information.

Big picture: Your business needs to follow an audit process

  1. Confirm the scope of your data
  2. Conduct an internal review of all security practices
  3. Conduct a review of all vendor practices that have access to your data
  4. Confirm compliance with regulations and contractual obligations
  5. Prepare a report with detailed findings and recommendations to improve on year-over-year
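There is no single required format for this process, but even a lightweight internal tracker makes it easier to see what has and has not been done each cycle. As a minimal sketch only (the step wording, owners, and structure below are illustrative choices for this article, not mandated by any law or standard), here is one way a team might track the five steps in code:

```python
# Minimal, illustrative tracker for the five audit steps listed above.
# The class name, step wording, and owners are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class AuditItem:
    step: str                  # one of the five audit steps
    owner: str                 # who is responsible (IT, legal, leadership, etc.)
    completed: bool = False
    findings: list[str] = field(default_factory=list)

AUDIT_STEPS = [
    AuditItem("Confirm the scope of your data", owner="IT"),
    AuditItem("Internal review of all security practices", owner="IT"),
    AuditItem("Review of vendors with access to your data", owner="Legal"),
    AuditItem("Confirm regulatory and contractual compliance", owner="Legal"),
    AuditItem("Report with findings and year-over-year recommendations", owner="Leadership"),
]

def open_items(items: list[AuditItem]) -> list[AuditItem]:
    """Return the audit steps that still need attention this cycle."""
    return [item for item in items if not item.completed]

for item in open_items(AUDIT_STEPS):
    print(f"TODO ({item.owner}): {item.step}")
```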

Data: What do you have and what duties does it require?

Personal information, particularly when it belongs to customers, is the most frequently compromised type of data. Under laws like the newly passed Texas Data Privacy and Security Act (TDPSA), businesses can have additional obligations to keep this information protected. Personal information can include any information “that is linked or reasonably linkable to an identified or identifiable individual.”

Sensitive data also requires extra precautions. This category includes (1) personal data that reveals racial or ethnic origin, religious beliefs, mental or physical health diagnosis, sexuality, or citizenship or immigration status; (2) genetic or biometric data that is processed for the purpose of uniquely identifying an individual; (3) personal data collected from a known child; or (4) precise geolocation data.

Other types of data to watch out for include the business’s intellectual property, anonymized customer data, employee personal information, and any other type of proprietary business data. Depending on the industry, the cost of a breach of any of these types of data could be incredibly high, particularly for healthcare and finance.

Ultimately, Texas businesses are required to maintain reasonable procedures to protect personal information, and there may be other laws implicated such as HIPAA, GLBA, CCPA/CPRA, BIPA, GDPR, PIPEDA, and many more, depending on where business is done, the industry implicated, and, in some cases, where customers are located.

"But I think the vendor is responsible."

Check your contracts, and check whether the law imposes a duty on you to protect the information at issue, as many laws do. Involve your IT department in the review of technical compliance whenever you are sharing data with a third party. Further, make sure the vendor is actually processing data the way the Data Processing Addendum says it is. To that point, if you are processing someone else’s data, your business also needs to be doing what it says it is doing, both in contracts with third parties and in your Privacy Policy.

Software-as-a-service arrangements, end-user license agreements, and other internet- and software-based services may require you to hand over data without giving you the opportunity to customize terms or shift risk. This is why it is important to thoroughly evaluate what technical protections are in place: the risk and the duty regarding your customers’ and employees’ data may still fall on your business. Ask yourself (or your IT professionals) whether the vendor actually needs the data it receives in order to provide services to you.
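One concrete way to act on that last question is data minimization: send a vendor only the fields it genuinely needs to deliver the service. The sketch below is purely illustrative; the field names and the assumption about which fields the vendor needs are hypothetical, but the habit of filtering records before they leave your systems is the point.

```python
# Illustrative data-minimization filter applied before sharing records with a vendor.
# The field names and the "what the vendor needs" set are hypothetical assumptions.
VENDOR_NEEDS = {"customer_id", "email"}

def minimize(record: dict) -> dict:
    """Strip a record down to only the fields the vendor actually needs."""
    return {key: value for key, value in record.items() if key in VENDOR_NEEDS}

record = {
    "customer_id": "C-1042",
    "email": "jane@example.com",
    "ssn": "XXX-XX-XXXX",                # sensitive and unnecessary for this vendor
    "geolocation": (29.76, -95.36),      # precise geolocation is sensitive data
}

print(minimize(record))  # {'customer_id': 'C-1042', 'email': 'jane@example.com'}
```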

Key takeaway: Stay informed

Your business needs checks and balances in place with the IT department to ensure you know what they are (or are not) doing and what they are supposed to do. You need policies and procedures, and they need to regularly be tested.

Do you know where your data is stored, both internally and with third parties? Who controls it? How is it being processed, and is anything being shared? Are encryption procedures in place? Firewalls, Intrusion Prevention Systems, and Endpoint Detection and Response? Do you and your vendors have Incident Response Plans? Stay informed and regularly check your security procedures to protect yourself, your business, and your customers.
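If “encryption procedures” feel abstract, the sketch below shows the general shape of encrypting data at rest using the widely used Python cryptography package. It is only an illustration; in real deployments the hard part is key management (keeping keys in a secrets manager, rotating them, limiting access), not the two lines of encryption themselves.

```python
# Minimal illustration of symmetric encryption at rest with the third-party
# "cryptography" package (pip install cryptography). Storing the key next to
# the data, as done here for brevity, is only acceptable in a toy example.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in production, keep this in a secrets manager
fernet = Fernet(key)

ciphertext = fernet.encrypt(b"customer record goes here")
assert fernet.decrypt(ciphertext) == b"customer record goes here"
```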

------

Courtney Gahm-Oldham is a partner at Frost Brown Todd. Lauren Cole is an associate at Frost Brown Todd.



How Houston innovators played a role in the historic Artemis II splashdown

safe landing

Research from Rice University played a critical role in the safe return of U.S. astronauts aboard NASA’s Artemis II mission this month.

Rice mechanical engineer Tayfun E. Tezduyar and longtime collaborator Kenji Takizawa developed a key computational parachute fluid-structure interaction (FSI) analysis system that proved vital to the descent of NASA’s Orion capsule into the Pacific Ocean. The FSI system, originally developed in 2013 alongside NASA Johnson Space Center, was critical to Orion’s three-parachute design, which slowed the capsule as it returned to Earth, according to Rice.

The model helped ensure that the parachute design was large enough to slow the capsule for a safe landing while also being stable enough to prevent the capsule from oscillating as it descended.

“You cannot separate the aerodynamics from the structural dynamics,” Tezduyar said in a news release. “They influence each other continuously and even more so for large spacecraft parachutes, so the analysis must capture that interaction in a robustly coupled way.”
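For readers unfamiliar with the term, “coupled” here means the fluid and structural solutions are advanced together rather than one after the other in isolation. The toy loop below is a deliberately oversimplified, one-degree-of-freedom illustration of that feedback; the numbers are invented and it bears no resemblance to the actual Rice/NASA solver, which solves full three-dimensional flow and fabric-deformation equations. The point is simply that the drag depends on the canopy’s state, and the canopy’s state depends on the drag.

```python
# Toy illustration of two-way fluid-structure coupling; NOT the Rice/NASA FSI
# solver. One "canopy" degree of freedom (its effective area) responds to the
# aerodynamic load, while the load depends on that same area, so the two are
# advanced together in every time step. All numbers are invented.
rho, g = 1.2, 9.81            # air density (kg/m^3), gravity (m/s^2)
mass, cd = 9000.0, 0.8        # illustrative capsule mass (kg) and drag coefficient
base_area, k = 800.0, 5e3     # undeformed canopy area (m^2), crude "stiffness"
v, dt = 80.0, 0.01            # initial descent speed (m/s), time step (s)

area = base_area
for _ in range(20000):
    drag = 0.5 * rho * cd * area * v**2   # fluid side: load depends on structure
    area = base_area + drag / k           # structure side: shape responds to load
    v += (g - drag / mass) * dt           # capsule dynamics close the loop
print(f"approximate steady descent speed: {v:.1f} m/s")
```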

The end result was a final parachute system, refined through NASA drop tests and Rice’s computational FSI analysis, that eliminated fluctuations and produced a stable descent profile.

Apart from the dynamic challenges in design, modeling Orion’s parachutes also required solving complex equations that considered airflow and fabric deformation and accounted for features like ringsail canopy construction and aerodynamic interactions among multiple parachutes in a cluster.

“Essentially, my entire group was dedicated to that work, because I considered it a national priority,” Tezduyar added in the release. “Kenji and I were personally involved in every computer simulation. Some of the best graduate students and research associates I met in my career worked on the project, creating unique, first-of-its-kind parachute computer simulations, one after the other.”

Current Intuitive Machines engineer Mario Romero also worked on Orion during his time at NASA. From 2018 to 2021, Romero was a member of the Orion Crew Capsule Recovery Team, which focused on simulating scenarios that crew members were likely to encounter in Orion.

The team trained in NASA’s 6.2-million-gallon pool, using wave machines to replicate a range of sea conditions. They also simulated worst-case scenarios by cutting the lights, blasting high-powered fans and tipping a mock capsule to mimic distress situations. In some drills, mock crew members were treated as “injured,” requiring the team to practice safe, controlled egress procedures.

“It’s hard to find the appropriate descriptors that can fully encapsulate the feeling of getting to witness all the work we, and everyone else, did being put into action,” Romero tells InnovationMap. “I loved seeing the reactions of everyone, but especially of the Houston communities—that brought me a real sense of gratitude and joy.”

Intuitive Machines was also selected to support the Artemis II mission using its Space Data Network and ground station infrastructure. The company monitored radio signals sent from the Orion spacecraft and used Doppler measurements to help determine the spacecraft's precise position and speed.
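The underlying physics is the classical Doppler shift: motion along the line of sight changes the received carrier frequency, and the size of that change reveals the radial velocity. The snippet below is a toy version of that relationship with invented frequencies, not a description of Intuitive Machines’ actual processing chain.

```python
# Toy Doppler calculation: radial velocity from a shift in the received carrier
# frequency (non-relativistic approximation). The 2.2 GHz carrier and 22 kHz
# shift are illustrative numbers, not mission data.
C = 299_792_458.0  # speed of light, m/s

def radial_velocity(f_transmitted_hz: float, f_received_hz: float) -> float:
    """Line-of-sight velocity; positive means the spacecraft is receding."""
    return C * (f_transmitted_hz - f_received_hz) / f_transmitted_hz

print(radial_velocity(2.2e9, 2.2e9 - 22_000))  # ~2998 m/s away from the station
```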

Tim Crain, Chief Technology Officer at Intuitive Machines, wrote about the experience last week.

"I specialized in orbital mechanics and deep space navigation in graduate school,” Crain shared. “But seeing the theory behind tracking spacecraft come to life as they thread through planetary gravity fields on ultra-precise trajectories still seems like magic."

UH breakthrough moves superconductivity closer to real-world use

Energy Breakthrough

University of Houston researchers have set a new benchmark in the field of superconductivity.

Researchers from the UH physics department and the Texas Center for Superconductivity (TcSUH) have broken the transition temperature record for superconductivity at ambient pressure. The accomplishment could lead to more efficient ways to generate, transmit and store energy, which researchers believe could improve power grids, medical technologies and energy systems by enabling electricity to flow without resistance, according to a release from UH.

To break the record, UH researchers achieved a transition temperature of 151 Kelvin (K), the highest ever recorded at ambient pressure since the discovery of superconductivity in 1911.

The transition temperature is the temperature below which a material becomes superconducting, allowing electricity to flow through it without resistance. Scientists have been working for decades to push transition temperatures closer to room temperature, which would make superconducting technologies more practical and affordable.

Currently, most superconductors must be cooled to extremely low temperatures, making them more expensive and difficult to operate.

UH physicists Ching-Wu Chu and Liangzi Deng published the research in the Proceedings of the National Academy of Sciences earlier this month. It was funded by Intellectual Ventures and the state of Texas via TcSUH and other foundations. Chu, founding director and chief scientist at TcSUH, previously made the breakthrough discovery in 1987 that the material YBCO becomes superconducting at 93 K. That discovery helped launch a global competition to develop high-temperature superconductors.

“Transmitting electricity in the grid loses about 8% of the electricity,” Chu, who’s also a professor of physics at UH and the paper’s senior author, said in a news release. “If we conserve that energy, that’s billions of dollars of savings and it also saves us lots of effort and reduces environmental impacts.”
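A rough back-of-envelope check (using illustrative figures that are not from the article) shows how an 8 percent loss translates into billions of dollars:

```python
# Back-of-envelope estimate of the cost of transmission losses. Generation and
# price figures are rough, illustrative assumptions, not data from the study.
us_generation_twh = 4_000          # approximate annual U.S. generation, TWh
loss_fraction = 0.08               # the ~8% loss cited by Chu
price_per_mwh = 40                 # illustrative wholesale price, $/MWh

lost_twh = us_generation_twh * loss_fraction            # ~320 TWh per year
lost_dollars = lost_twh * 1_000_000 * price_per_mwh     # convert TWh to MWh
print(f"~{lost_twh:.0f} TWh lost, roughly ${lost_dollars / 1e9:.0f} billion per year")
```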

Chu and his team used a technique known as pressure quenching, adapted from techniques used to create diamonds. With pressure quenching, researchers first apply intense pressure to the material to enhance its superconducting properties and raise its transition temperature, then “quench” the sample so that those pressure-enhanced properties are retained even after the pressure is released.

Next, researchers are targeting ambient-pressure, room-temperature superconductivity of around 300 K. In a companion PNAS paper, Chu and Deng point to pressure quenching as a promising approach to help bridge the gap between current results and that goal.

“Room-temperature superconductivity has been seen as a ‘holy grail’ by scientists for over a century,” Rohit Prasankumar, director of superconductivity research at Intellectual Ventures, said in the release. “The UH team’s result shows that this goal is closer than ever before. However, the distance between the new record set in this study and room temperature is still about 140 C. Closing this gap will require concerted, intentional efforts by the broader scientific community, including materials scientists, chemists, and engineers, as well as physicists.”

---

This article originally appeared on EnergyCapitalHTX.com.