A Rice University student decided to use his data science skills for good. Photo courtesy of Biokind Analytics

For Alex Han, it all started with peanut butter.

Han was a high school student in Korea when he learned that the spread is a pure odorant that could be used to test smell in each hemisphere of the brain; issues on the left side were thought to be a predictor of Alzheimer’s disease. He later learned that the method wasn’t as airtight as previously thought, but Han was hooked. Alzheimer’s research became the teenager’s passion. While still in high school, Han began volunteering for Alzheimer’s Los Angeles, translating their brochures into Korean.

When it came time to choose a college, Han says Rice University appealed to him for many reasons.

“I loved the atmosphere. I loved the campus—it’s so beautiful. The diverse food, the people, I even liked the highway,” he says of Houston. “In Korea, everything is so close and compact. I loved the whole scenario of the city.”

A scholarship was also part of the appeal, as well as the pull of the world’s largest medical center. Han’s instincts were correct. Now, a junior at Rice, he has been working at renowned geneticist Huda Zoghbi’s Baylor College of Medicine lab for almost two years.

But dividing his obligations between full-time studies and his wet lab position wasn’t enough to keep Han’s active mind occupied. Last May, the statistics and biochemistry major founded Biokind Analytics, a nonprofit designed to explore how data science can support health care nonprofits.

Han reached out to Alzheimer’s Los Angeles to offer his data analysis services on a volunteer basis and was shocked that the association had never considered it before.

“I was really surprised—even small stores and restaurants use statistics to boost their profits. [Alzheimer’s Los Angeles] receives a couple million dollars every year in donations. They have data stores but hadn’t really capitalized yet in the area of analytics.”

Han, along with a small team of Rice students, including vice president Zac Andrews and development director Masha Zaitsev, made Alzheimer’s Los Angeles a pet project, analyzing geospatial trends in its donorship and interpreting the past year’s donation trends. “We wanted to see if the demand was the same in Houston, and we found that the pattern was consistent. A lot of nonprofits are willing to have us analyze the data sets they’ve already been tracking.”

Less than a year after Han established Biokind Analytics, the 501(c)(3) already has seven chapters on college campuses around the country. From UC Davis and UC San Diego in the West to Brown University and the University of Virginia on the East Coast, the data science students have helped a diverse range of medical nonprofits, mostly based in the Houston area. They run the gamut from the ALS Association of Texas to Nora’s Home, which serves organ failure and transplant patients.

Biokind Analytics has now completed seven projects and analyzed $100 million in funds. Each student group includes four to six members, mostly statistics, data science, and biochemistry majors, all working with the help of faculty advisors. With a total of about 35 students nationwide, Han says he’s dedicated to growing at a steady pace to avoid expanding too fast, too soon.

Another question for the future is what will happen to Biokind Analytics when Han completes his undergraduate studies in 2024. He plans to continue his medical studies with the goal of one day becoming a physician specializing in Alzheimer’s who uses data analytics to aid in patient care. But however active Han remains in the nonprofit he started, his attachment to the cause and a growing group of student leaders and healthcare associations eager for its services are sure to keep Biokind Analytics active long after graduation.

Let's talk about dark data — what it means and how to navigate it. Graphic by Miguel Tovar/University of Houston

Houston expert: Navigating dark data within research and innovation

houston voices

Is it necessary to share ALL your data? Is transparency a good thing, or does it make researchers “vulnerable,” as author Nathan Schneider suggests in the Chronicle of Higher Education article “Why Researchers Shouldn’t Share All Their Data”?

Dark Data Defined

Dark data is defined as the universe of information an organization collects, processes, and stores, oftentimes for compliance reasons. Dark data never makes it into the official publication of a project. According to the Gartner Glossary, “storing and securing data typically incurs more expense (and sometimes greater risk) than value.”

This topic is reminiscent of the file drawer effect, the phenomenon in which a study’s results influence whether or not it is published. Negative results can be just as important as those that confirm a hypothesis.

Publication bias, and the pressure to publish only positive research that supports the PI’s hypothesis, is arguably not good science. In an article in the Indian Journal of Anaesthesia, Priscilla Joys Nagarajan et al. wrote: “It is speculated that every significant result in the published world has 19 non-significant counterparts in file drawers.” That’s one definition of dark data.

Total Transparency

But what should be done with all the excess information that did not make it to publication, most likely because of various constraints? Should everything, meaning every little tidbit, be readily available to the research community?

Schneider doesn’t think it should be. In his article, he writes that he hides some findings in a paper notebook or behind a password, and he keeps interviews and transcripts offline altogether to protect his sources.

Open-source

Open-source software communities tend to regard total transparency as inherently good. What are the advantages of total transparency? You may make connections between projects that you wouldn’t have otherwise. You can easily reproduce a peer’s experiment. You can even become more meticulous in your note-taking and experimental methods since you know it’s not private information. Similarly, journalists will recognize this thought pattern as the recent, popular call to engage in “open journalism.” Essentially, an author’s entire writing and editing process can be recorded, step by step.

TMI

This trend has led researchers to tools like Jupyter and platforms like GitHub, which record every change that occurs along a project’s timeline. But are unorganized, excessive amounts of unpublishable data really what transparency means? Or do they confuse those looking for meaningful research that is meticulously curated?

The Big Idea

And what about the “vulnerability” claim? Sharing every edit and every new direction taken opens a scientist up to scoffing and even harassment. Dark data in industry can even involve publishing salaries, which can feel unfair to underrepresented, marginalized populations.

In Model View Culture, Ellen Marie Dash wrote: “Let’s give safety and consent the absolute highest priority, with openness and transparency prioritized explicitly below those. This means digging deep, properly articulating in detail what problems you are trying to solve with openness and transparency, and handling them individually or in smaller groups.”

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

A new UH-led program will work with energy corporations to prepare the sector's future workforce. Photo via Getty Images

University of Houston leads data science collaboration to propel energy transition

seeing green

Five Texas schools have teamed up with energy industry partners to create a program to train the sector's future workforce. At the helm of the initiative is the University of Houston.

The Data Science for Energy Transition project, which is funded through 2024 by a $1.49 million grant from the National Science Foundation, includes participation from UH, the University of Houston-Downtown, the University of Houston-Victoria, the University of Houston-Clear Lake, and Sam Houston State University.

The project will begin by introducing a five-week data science camp next summer, where undergraduate and master’s-level students will examine data science skills already in demand — as well as the skills that will be needed in the future as the sector navigates a shift to new technologies.

The camp will encompass computer science and programming, statistics, machine learning, geophysics and earth science, public policy, and engineering, according to a news release from UH. The project’s principal investigator is Mikyoung Jun, ConocoPhillips professor of data science at the UH College of Natural Sciences and Mathematics.

The new program's principal investigator is Mikyoung Jun. Photo via UH.edu

“It’s obvious that the Houston area is the capital for the energy field. We are supporting our local industries by presenting talented students from the five sponsoring universities and other Texas state universities with the essential skills to match the growing needs within those data science workforces,” Jun says in the release. “We’re planning all functions in a hybrid format so students located outside of Houston, too, can join in.”

Jun describes the camp as having a dual focus: both the transition to renewable energy sources and traditional energy, which isn't being eradicated any time soon, she explains.

Also setting the program apart is the camp's prerequisites — or lack thereof. The program is open to majors in energy-related fields, such as data science or petroleum engineering, as well as wide-ranging fields of study, such as business, art, history, law, and more.

“The camp is not part of a degree program and its classes do not offer credits toward graduation, so students will continue to follow their own degree plan,” Jun says in the release. “Our goal with the summer camp is to give students a solid footing in data science and energy-related fields to help them focus on skills needed in data science workforces in energy-related companies in Houston and elsewhere. Although that may be their first career move, they may settle in other industries later. Good skills in data processing can make them wise hires for many technology-oriented organizations.”

Jun's four co-principal investigators include Pablo Pinto, professor at UH’s Hobby School of Public Affairs and director of the Center for Public Policy; Jiajia Sun, UH assistant professor of geophysics; Dvijesh Shastri, associate professor of computer science, UH-Downtown; and Yun Wan, professor of computer information systems and chair of the Computer Science Division, UH-Victoria. Eleven other faculty members from the five schools will serve as senior personnel. The initiative's energy industry partners include ConocoPhillips, Schlumberger, Fugro, Quantico Energy Solutions, Shell, and Xecta Web Technologies.

The program's first iteration will select 40 students to participate in the camp this summer. Applications, which have not opened yet, will be made available online.

The Data Science for Energy Transition project is a collaboration between five schools. Image via UH.edu

Houston companies need cybersecurity professionals — and universities can help. Photo via Getty Images

How universities can help equip Houston with a skilled cybersecurity workforce

guest column

With an increasing number of data breaches, a high job growth rate, and a persistent skills gap, cybersecurity professionals will be some of the most in-demand workers in 2022. It’s more important than ever to have people that are properly trained to protect individuals, corporations, and communities.

Demand for cybersecurity talent in Texas is high. According to Burning Glass Labor Insights, employers in the Houston metro area have posted over 24,000 cybersecurity jobs since the beginning of 2021. But the pipeline of cybersecurity workers is very low, which means many local and national companies don’t have enough people on the front lines defending against these attacks.

Unfortunately, it looks like the cybersecurity skills gap is far from over. An annual industry report from the Information Systems Security Association shows that the global demand for cybersecurity skills still far exceeds the current supply of traditionally qualified individuals, with 38 percent of cybersecurity roles currently unfilled. This shortage has real-life, real-world consequences that can result in misconfigured systems and improper risk assessment and management.

How can companies help close the cybersecurity skills gap within their own organizations? We believe it will become increasingly important to look beyond “traditionally qualified” candidates and view hands-on experience as just as important as, or even more important than, the certifications or bachelor’s degree requirements often found in cybersecurity job descriptions.

The top open cybersecurity roles in the Houston area include analysts, managers, engineers, and developers. Employees in these positions are essential to the everyday monitoring, troubleshooting, testing and analyzing that helps companies protect data and stay one step ahead of hackers. When looking to fill these roles, hiring managers should be looking for candidates with both the knowledge and experience to take on these critical positions.

Fortunately, Houston-based companies looking to establish, grow, or upskill their cybersecurity teams don’t have to go far to find top-tier talent and training programs. More local colleges and universities are offering alternative credential programs, like boot camps, that provide students with the deep understanding and hands-on learning they need to excel in the roles that companies need to fill.

2U, Inc. and Rice University have partnered to power a data-driven, market-responsive cybersecurity boot camp that provides students with hands-on training in networking, systems, web technologies, databases, and defensive and offensive cybersecurity. Over 40 percent of the students didn’t have bachelor’s degrees prior to enrolling in the program. Since launching in 2019, the program has produced more than 140 graduates, some of whom have gone on to work in cybersecurity roles at local companies such as CenterPoint Energy, Fulcrum Technology Solutions, and Hewlett Packard.

Recognizing programs like university boot camps as local workforce generators not only gives companies a larger talent pool to recruit from, but also increases the opportunity for cybersecurity teams to diversify and include professionals with different experiences and backgrounds. We’re living in a security-first world, and the right mix of cybersecurity talent is essential to keeping us protected wherever we are.

------

David Vassar is the assistant dean of the Susanne M. Glasscock School of Continuing Studies at Rice University. Bret Fund is the vice president overseeing cybersecurity programs at 2U.

"Better and personalized healthcare through AI is still a hugely challenging problem that will take an army of scientists and engineers." Photo via UH.edu

Houston expert explains health care's inequity problem

guest column

We are currently in the midst of what some have called the "wild west" of AI. Though healthcare is one of the most heavily regulated sectors, the regulation of AI in this space is still in its infancy. The rules are being written as we speak. We are playing catch-up, learning how to reap the benefits these technologies offer while minimizing potential harms after they've already been deployed.

AI systems in healthcare exacerbate existing inequities. We've seen this play out in real-world consequences, from racial bias in the American justice system and credit scoring to gender bias in resume screening applications. Programs that are designed to bring machine "objectivity" and ease to our systems end up reproducing and upholding biases with no means of accountability.

The algorithm itself is seldom the problem. It is often the data used to program the technology that merits concern. But this is about far more than ethics and fairness. Building AI tools that take account of the whole picture of healthcare is fundamental to creating solutions that work.

The Algorithm is Only as Good as the Data

By nature of our own human systems, datasets are almost always partial and rarely ever fair. As Linda Nordling comments in the Nature article "A fairer way forward for AI in healthcare," "this revolution hinges on the data that are available for these tools to learn from, and those data mirror the unequal health system we see today."

Take, for example, the finding that Black people in US emergency rooms are 40 percent less likely to receive pain medication than are white people, and Hispanic patients are 25 percent less likely. Now, imagine the dataset these findings are based on is used to train an algorithm for an AI tool that would be used to help nurses determine if they should administer pain relief medication. These racial disparities would be reproduced and the implicit biases that uphold them would remain unquestioned, and worse, become automated.

We can attempt to mitigate these biases by removing the data we believe causes them from the training set, but hidden patterns that correlate with demographic data will remain. An algorithm cannot take in the nuances of the full picture; it can only learn from patterns in the data it is presented with.
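To make this concrete, here is a minimal, entirely hypothetical sketch (the numbers and features are invented, not from the article): even after the protected attribute is dropped from training, a correlated proxy feature, such as geography, lets a model reproduce much of the historical disparity.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, n)                       # protected attribute (0/1)
proxy = group + rng.normal(0, 0.3, n)               # e.g. a geographic feature correlated with group
outcome = 1.0 - 0.4 * group + rng.normal(0, 0.1, n)  # historically biased treatment rate

# "Debias" by dropping `group` and fitting a model on the proxy feature alone
X = np.column_stack([np.ones(n), proxy])
coef, *_ = np.linalg.lstsq(X, outcome, rcond=None)
pred = X @ coef

# Predictions still differ by group, even though `group` was never shown to the model
gap = pred[group == 0].mean() - pred[group == 1].mean()
print(f"predicted disparity between groups: {gap:.2f}")
```

In this toy setup, a large share of the original 0.4 disparity survives the "debiasing" step, because the proxy carries most of the same signal. Real clinical data has many such proxies, which is why dropping demographic columns is not a fix.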

Bias Creep

Data bias creeps into healthcare in unexpected ways. Consider the fact that animal models used in laboratories across the world to discover and test new pain medications are almost entirely male. As a result, many medications, including pain medication, are not optimized for females. So, it makes sense that even common pain medications like ibuprofen and naproxen have been proven to be more effective in men than women and that women tend to experience worse side effects from pain medication than men do.

In reality, male rodents aren't perfect test subjects either. Studies have also shown that both female and male rodents' responses to pain levels differ depending on the sex of the human researcher present. The stress response elicited in rodents to the olfactory presence of a sole male researcher is enough to alter their responses to pain.

While this example may seem to be a departure from AI, it is in fact deeply connected — the current treatment choices we have access to were implicitly biased before the treatments ever made it to clinical trials. The challenge of AI equity is not a purely technical problem, but a very human one that begins with the choices that we make as scientists.

Unequal Data Leads to Unequal Benefits

In order for all of society to enjoy the many benefits that AI systems can bring to healthcare, all of society must be equally represented in the data used to train these systems. While this may sound straightforward, it's a tall order to fill.

Data from some populations don't always make it into training datasets. This can happen for a number of reasons. Some data may not be as accessible or it may not even be collected at all due to existing systemic challenges, such as a lack of access to digital technology or simply being deemed unimportant. Predictive models are created by categorizing data in a meaningful way. But because there's generally less of it, "minority" data tends to be an outlier in datasets and is often wiped out as spurious in order to create a cleaner model.
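A small, hypothetical illustration of that cleaning effect (the values are invented for the sketch): when a subgroup is rare and its typical measurements differ from the majority's, a routine outlier filter discards it disproportionately.

```python
import numpy as np

rng = np.random.default_rng(1)
majority = rng.normal(120, 10, 950)   # e.g. a vital-sign reading in the majority group
minority = rng.normal(160, 10, 50)    # a rarer subgroup with a different typical value
values = np.concatenate([majority, minority])
is_minority = np.concatenate([np.zeros(950, bool), np.ones(50, bool)])

# A standard "cleaning" step: drop anything more than 2 standard deviations from the mean
z = (values - values.mean()) / values.std()
kept = np.abs(z) <= 2

maj_kept = kept[~is_minority].mean()  # nearly all majority rows survive
min_kept = kept[is_minority].mean()   # most of the subgroup is discarded as "outliers"
print(f"majority kept: {maj_kept:.0%}, minority kept: {min_kept:.0%}")
```

Because the mean and standard deviation are dominated by the majority group, the subgroup's perfectly valid readings land outside the cutoff, exactly the "wiped out as spurious" dynamic described above.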

Data source matters because this detail unquestionably affects the outcome and interpretation of healthcare models. In sub-Saharan Africa, young women are diagnosed with breast cancer at a significantly higher rate. This reveals the need for AI tools and healthcare models tailored to this demographic group, as opposed to AI tools used to detect breast cancer that are only trained on mammograms from the Global North. Likewise, a growing body of work suggests that algorithms used to detect skin cancer tend to be less accurate for Black patients because they are trained mostly on images of light-skinned patients. The list goes on.

We are creating tools and systems that have the potential to revolutionize the healthcare sector, but the benefits of these developments will only reach those represented in the data.

So, what can be done?

Part of the challenge in getting bias out of data is that high volume, diverse and representative datasets are not easy to access. Training datasets that are publicly available tend to be extremely narrow, low-volume, and homogenous—they only capture a partial picture of society. At the same time, a wealth of diverse health data is captured every day in many healthcare settings, but data privacy laws make accessing these more voluminous and diverse datasets difficult.

Data protection is of course vital. Big Tech and governments do not have the best track record when it comes to the responsible use of data. However, if transparency, education, and consent for the sharing of medical data were more purposefully regulated, far more diverse and high-volume datasets could contribute to fairer representation across AI systems and result in better, more accurate AI-driven healthcare tools.

But data sharing and access is not a complete fix to healthcare's AI problem. Better and personalized healthcare through AI is still a hugely challenging problem that will take an army of scientists and engineers. At the end of the day, we want to teach our algorithms to make good choices but we are still figuring out what good choices should look like for ourselves.

AI presents the opportunity to bring greater personalization to healthcare, but it equally presents the risk of entrenching existing inequalities. We have the opportunity in front of us to take a considered approach to data collection, regulation, and use that will provide a fuller and fairer picture and enable the next steps for AI in healthcare.

------

Angela Wilkins is the executive director of the Ken Kennedy Institute at Rice University.

This health tech company has made some significant changes in order to keep up with its growth. Photo via Getty Images

Houston data solutions startup rebrands, expands to support neuroscience research

startup soars

With a new CEO and chief operating officer aboard, Houston-based DataJoint is thinking small in order to go big.

Looking ahead to 2022, DataJoint aims to enable hundreds of smaller projects rather than a handful of mega-projects, CEO Dimitri Yatsenko says. DataJoint develops data management software that empowers collaboration in the neuroscience and artificial intelligence sectors.

"Our strategy is to take the lessons that we have learned over the past four years working with major projects with multi-institutional consortia," Yatsenko says, "and translate them into a platform that thousands of labs can use efficiently to accelerate their research and make it more open and rigorous."

Ahead of that shift, the startup has undergone some significant changes, including two moves in the C-suite.

Yatsenko became CEO in February after stints as vice president of R&D and as president. He co-founded the company as Vathes LLC in 2016. Yatsenko succeeded co-founder Edgar Walker, who had been CEO since May 2020 and was vice president of engineering before that.

In tandem with Yatsenko's ascent to CEO, the company brought aboard Jason Kirkpatrick as COO. Kirkpatrick previously was chief financial officer of Houston-based Darcy Partners, an energy industry advisory firm; chief operating officer and chief financial officer of Houston-based Solid Systems CAD Services (SSCS), an IT services company; and senior vice president of finance and general manager of operations at Houston-based SmartVault Corp., a cloud-based document management company.

"Most of our team are scientists and engineers. Recruiting an experienced business leader was a timely step for us, and Jason's vast leadership experience in the software industry and recurring revenue models added a new dimension to our team," Yatsenko says.

Other recent changes include:

  • Converting from an LLC structure to a C corporation structure to enable founders, employees, and future investors to be granted shares of the company's stock.
  • Shortening the business' name to DataJoint from DataJoint Neuro and recently launching its rebranded website.
  • Moving the company's office from the Texas Medical Center Innovation Institute (TMCx) to the Galleria area. The new space will make room for more employees. Yatsenko says the 12-employee startup plans to increase its headcount to 15 to 20 by the end of this year.

Over the past five years, the company's customer base has expanded to include neuroscience institutions such as Princeton University's Princeton Neuroscience Institute and Columbia University's Zuckerman Institute for Brain Science, as well as University College London and the Norwegian University of Science and Technology. DataJoint's growth has been fueled in large part by grants from the U.S. Defense Advanced Research Projects Agency (DARPA) and the Brain Research Through Advancing Innovative Neurotechnologies (BRAIN) Initiative at the National Institutes of Health (NIH).

"The work we are tackling has our team truly excited about the future, particularly the capabilities being offered to the neuroscience community to understand how the brain forms perceptions and generates behavior," Yatsenko says.


New Houston venture studio emerges to target early-stage hardtech, energy transition startups

funding the future

The way Doug Lee looks at it, there are two areas within the energy transition attracting capital. With his new venture studio, he hopes to target an often overlooked area that's critical for driving forward net-zero goals.

Lee describes investment activity taking place in the digital and software world — early stage technology that's looking to make the industry smarter. But, on the other end of the spectrum, investment activity can be found on massive infrastructure projects.

While both areas need funding, Lee has started his new venture studio, Flathead Forge, to target early-stage hardtech technologies.

“We are really getting at the early stage companies that are trying to develop technologies at the intersection of legacy industries that we believe can become more sustainable and the energy transition — where we are going. It’s not an ‘if’ or ‘or’ — we believe these things intersect,” he tells EnergyCapital.

Specifically, Lee's expertise is within the water and industrial gas space. For around 15 years, he's made investments in this area, which he describes as crucial to the energy transition.

“Almost every energy transition technology that you can point to has some critical dependency on water or gas,” he says. “We believe that if we don’t solve for those things, the other projects won’t survive.”

Lee and his brother, Dave, are evolving their family office to adopt a venture studio model. They also sold off Azoto Energy, a Canadian oilfield nitrogen cryogenic services business, in December.

“We ourselves are going through a transition like our energy is going through a transition,” he says. “We are transitioning from a single family office into a venture studio. By doing so, we want to focus all of our access and resources on this work.”

At this point, Flathead Forge has seven portfolio companies and around 15 corporations they are working with to identify their needs and potential opportunities. Lee says he's gearing up to secure a $100 million fund.

Flathead also has 40 advisers and mentors, which Lee calls sherpas — a nod to the Flathead Valley region in Montana, which inspired the firm's name.

“We’re going to help you carry up, we’re going to tie ourselves to the same rope as you, and if you fall off the mountain, we’re falling off with you,” Lee says of his hands-on approach, which he says sets Flathead apart from other studios.

Another thing differentiating Flathead Forge from its competition is its dedication to giving back.

“We’ve set aside a quarter of our carried interest for scholarships and grants,” Lee says.

The funds will go to scholarships for future engineers interested in the energy transition, as well as grants for researchers studying high-potential technologies.

“We’re putting our own money where our mouth is,” Lee says of his thesis for Flathead Forge.

------

This article originally ran on EnergyCapital.

Houston-based lunar mission's rocky landing and what it means for America's return to the moon

houston, we have a problem

A private U.S. lunar lander tipped over at touchdown and ended up on its side near the moon’s south pole, hampering communications, company officials said Friday.

Intuitive Machines initially believed its six-footed lander, Odysseus, was upright after Thursday's touchdown. But CEO Steve Altemus said Friday the craft “caught a foot in the surface," falling onto its side and, quite possibly, leaning against a rock. He said it was coming in too fast and may have snapped a leg.

“So far, we have quite a bit of operational capability even though we’re tipped over," he told reporters.

But some antennas were pointed toward the surface, limiting flight controllers' ability to get data down, Altemus said. The antennas were stationed high on the 14-foot (4.3-meter) lander to facilitate communications at the hilly, cratered and shadowed south polar region.

Odysseus — the first U.S. lander in more than 50 years — is thought to be within a few miles (kilometers) of its intended landing site near the Malapert A crater, less than 200 miles (300 kilometers) from the south pole. NASA, the main customer, wanted to get as close as possible to the pole to scout out the area before astronauts show up later this decade.

NASA's Lunar Reconnaissance Orbiter will attempt to pinpoint the lander's location as it flies overhead this weekend.

With Thursday’s touchdown, Intuitive Machines became the first private business to pull off a moon landing, a feat previously achieved by only five countries. Japan was the latest country to score a landing, but its lander also ended up on its side last month.

Odysseus' mission was sponsored in large part by NASA, whose experiments were on board. NASA paid $118 million for the delivery under a program meant to jump-start the lunar economy.

One of the NASA experiments was pressed into service when the lander's navigation system did not kick in. Intuitive Machines caught the problem in advance when it tried to use its lasers to improve the lander's orbit. Otherwise, flight controllers would not have discovered the failure until it was too late, just five minutes before touchdown.

“Serendipity is absolutely the right word,” mission director Tim Crain said.

It turns out that a switch was not flipped before flight, preventing the system's activation in space.

Launched last week from Florida, Odysseus took an extra lap around the moon Thursday to allow time for the last-minute switch to NASA's laser system, which saved the day, officials noted.

Another experiment, a cube with four cameras, was supposed to pop off 30 seconds before touchdown to capture pictures of Odysseus’ landing. But Embry-Riddle Aeronautical University’s EagleCam was deliberately powered off during the final descent because of the navigation switch and stayed attached to the lander.

Embry-Riddle's Troy Henderson said his team will try to release EagleCam in the coming days, so it can photograph the lander from roughly 26 feet (8 meters) away.

"Getting that final picture of the lander on the surface is still an incredibly important task for us,” Henderson told The Associated Press.

Intuitive Machines anticipates just another week of operations on the moon for the solar-powered lander — nine or 10 days at most — before lunar nightfall hits.

The company was the second business to aim for the moon under NASA's commercial lunar services program. Last month, Pittsburgh's Astrobotic Technology gave it a shot, but a fuel leak on the lander cut the mission short and the craft ended up crashing back to Earth.

Until Thursday, the U.S. had not landed on the moon since Apollo 17's Gene Cernan and Harrison Schmitt closed out NASA's famed moon-landing program in December 1972. NASA's new effort to return astronauts to the moon is named Artemis after Apollo's mythological twin sister. The first Artemis crew landing is planned for 2026 at the earliest.

3 female Houston innovators to know this week

who's who

Editor's note: Welcome to another Monday edition of Innovators to Know. Today I'm introducing you to three Houstonians to read up about — three individuals behind recent innovation and startup news stories in Houston as reported by InnovationMap. Learn more about them and their recent news below by clicking on each article.

Emma Konet, co-founder and CTO of Tierra Climate

Emma Konet, co-founder and CTO of Tierra Climate, joins the Houston Innovators Podcast. Photo via LinkedIn

If the energy transition is going to be successful, the energy storage space needs to be equipped to support both the increased volume of energy needed and new energies. And Emma Konet and her software company, Tierra Climate, are targeting one part of the equation: the market.

"To me, it's very clear that we need to build a lot of energy storage in order to transition the grid," Konet says on the Houston Innovators Podcast. "The problems that I saw were really on the market side of things." Read more.

Cindy Taff, CEO of Sage Geosystems

Houston-based Sage Geosystems announced the first close of a $17 million round led by Chesapeake Energy Corp. Photo courtesy of Sage

A Houston geothermal startup has announced the close of its series A round of funding.

Houston-based Sage Geosystems announced the first close of a $17 million round led by Chesapeake Energy Corp. The proceeds will fund its first commercial geopressured geothermal system facility, which will be built in Texas in Q4 of 2024. According to the company, the facility will be the first of its kind.

“The first close of our Series A funding and our commercial facility are significant milestones in our mission to make geopressured geothermal system technologies a reality,” Cindy Taff, CEO of Sage Geosystems, says. Read more.

Clemmie Martin, chief of staff at The Cannon

With seven locations across the Houston area, The Cannon's digital technology allows its members a streamlined connection. Photo courtesy of The Cannon

After collaborating over the years, The Cannon has acquired a Houston startup's digital platform technology to become a "physical-digital hybrid" community.

Village Insights, a Houston startup, worked with The Cannon to create and launch its digital community platform Cannon Connect. Now, The Cannon has officially acquired the business. The terms of the deal were not disclosed.

“The integration of a world-class onsite member experience and Cannon Connect’s superior virtual resource network creates a seamless, streamlined environment for member organizations,” Clemmie Martin, The Cannon’s newly appointed chief of staff, says in the release. “Cannon Connect and this acquisition have paved new pathways to access and success for all.” Read more.