A Rice University student decided to use his data science skills for good. Photo courtesy of Biokind Analytics

For Alex Han, it all started with peanut butter.

Han was a high school student in Korea when he learned that the spread is a pure odorant that can be used to test smell in each hemisphere of the brain, and that a weaker sense of smell on the left side was thought to be a predictor of Alzheimer's disease. He later learned that the method wasn't as airtight as previously thought, but Han was hooked. Alzheimer's research became the teenager's passion. While still in high school, Han began volunteering for Alzheimer's Los Angeles, translating their brochures into Korean.

When it came time to choose a college, Han says Rice University appealed to him for many reasons.

“I loved the atmosphere. I loved the campus—it’s so beautiful. The diverse food, the people, I even liked the highway,” he says of Houston. “In Korea, everything is so close and compact. I loved the whole scenario of the city.”

A scholarship was also part of the appeal, as was the pull of the world's largest medical center. Han's instincts were correct. Now a junior at Rice, he has been working in renowned geneticist Huda Zoghbi's lab at Baylor College of Medicine for almost two years.

But dividing his time between full-time studies and his wet lab position wasn't enough to keep Han's active mind occupied. Last May, the statistics and biochemistry student began another endeavor, one that uses both of his specialties: he founded Biokind Analytics, a nonprofit designed to explore how data science can support health care nonprofits.

Han reached out to Alzheimer’s Los Angeles to offer his data analysis services on a volunteer basis and was shocked that the association had never considered it before.

“I was really surprised—even small stores and restaurants use statistics to boost their profits. [Alzheimer's Los Angeles] receives a couple million dollars every year in donations. They have data stores but hadn't really capitalized yet in the area of analytics.”

Han, along with a small team of Rice students, including vice president Zac Andrews and development director Masha Zaitsev, made Alzheimer's Los Angeles a pet project, analyzing geospatial trends in its donorship and interpreting the past year's donation trends. “We wanted to see if the demand was the same in Houston. We found that this pattern was consistent. A lot of nonprofits are willing to have us analyze the data sets they've already been tracking.”

Less than a year after Han established Biokind Analytics, the 501(c)(3) already has seven chapters on college campuses around the country. From UC Davis and UC San Diego on the West Coast to Brown University and the University of Virginia on the East Coast, the data science students have helped a diverse range of medical nonprofits, mostly based in the Houston area. They run the gamut from the ALS Association of Texas to Nora's Home, which serves organ failure and transplant patients.

Biokind Analytics has now completed seven projects and analyzed $100 million in funds. Each student group includes four to six members, mostly statistics, data science, and biochemistry majors, all working with the help of faculty advisors. With about 35 students nationwide, Han says he's dedicated to growing at a steady pace to avoid expanding too fast, too soon.

Another question for the future is what will happen to Biokind Analytics when Han completes his undergraduate studies in 2024. He plans to continue his medical studies with the goal of one day becoming a physician who specializes in Alzheimer's and uses data analytics to aid in patient care. But no matter how active Han remains in the nonprofit he started, his attachment to the cause and a growing group of student leaders and healthcare associations eager for its services are sure to keep Biokind Analytics going long after graduation.

Let's talk about dark data — what it means and how to navigate it. Graphic by Miguel Tovar/University of Houston

Houston expert: Navigating dark data within research and innovation

houston voices

Is it necessary to share ALL your data? Is transparency a good thing, or does it make researchers “vulnerable,” as author Nathan Schneider suggests in the Chronicle of Higher Education article “Why Researchers Shouldn't Share All Their Data”?

Dark Data Defined

Dark data is defined as the universe of information an organization collects, processes, and stores, oftentimes for compliance reasons alone. Dark data never makes it into the official publication of a project. According to the Gartner Glossary, “storing and securing data typically incurs more expense (and sometimes greater risk) than value.”

This topic is reminiscent of the file drawer effect, the phenomenon in which a study's results influence whether or not the study is published. Negative results can be just as important as those that confirm a hypothesis.

Publication bias, the pressure to publish only positive research that supports the PI's hypothesis, is arguably not good science. In an article in the Indian Journal of Anaesthesia, Priscilla Joys Nagarajan et al. write: “It is speculated that every significant result in the published world has 19 non-significant counterparts in file drawers.” The arithmetic echoes the conventional p < 0.05 threshold: if a chance finding clears significance about one time in 20, each published positive can sit atop 19 unpublished nulls. That's one definition of dark data.

Total Transparency

But what should you do with all the excess information that did not make it to publication, most likely because of various constraints? Should everything, meaning every little tidbit, be readily available to the research community?

Schneider doesn’t think it should be. In his article, he writes that he hides some findings in a paper notebook or behind a password, and he keeps interviews and transcripts offline altogether to protect his sources.

Open-source

Open-source software communities tend to regard total transparency as inherently good. What are the advantages of total transparency? You may make connections between projects that you wouldn’t have otherwise. You can easily reproduce a peer’s experiment. You can even become more meticulous in your note-taking and experimental methods since you know it’s not private information. Similarly, journalists will recognize this thought pattern as the recent, popular call to engage in “open journalism.” Essentially, an author’s entire writing and editing process can be recorded, step by step.

TMI

This trend has led researchers to open-source platforms like Jupyter and GitHub, which can record every change that occurs along a project's timeline. But is an unorganized, excessive pile of unpublishable data really what transparency means? Or does it confuse those looking for meaningful research that is meticulously curated?

The Big Idea

And what about the “vulnerability” claim? Sharing every edit and every new direction taken opens a scientist up to scoffers, and even harassment. In industry, radical transparency can extend to publishing salaries, which can feel unfair to underrepresented, marginalized populations.

In Model View Culture, Ellen Marie Dash wrote: “Let’s give safety and consent the absolute highest priority, with openness and transparency prioritized explicitly below those. This means digging deep, properly articulating in detail what problems you are trying to solve with openness and transparency, and handling them individually or in smaller groups.”

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

A new UH-led program will work with energy corporations to prepare the sector's future workforce. Photo via Getty Images

University of Houston leads data science collaboration to propel energy transition

seeing green

Five Texas schools have teamed up with energy industry partners to create a program to train the sector's future workforce. At the helm of the initiative is the University of Houston.

The Data Science for Energy Transition project, which is funded through 2024 by a $1.49 million grant from the National Science Foundation, includes participation from UH, the University of Houston-Downtown, the University of Houston-Victoria, the University of Houston-Clear Lake, and Sam Houston State University.

The project will begin by introducing a five-week data science camp next summer, where undergraduate and master's level students will examine data science skills already in demand, as well as the skills that will be needed in the future as the sector navigates a shift to new technologies.

The camp will encompass computer science and programming, statistics, machine learning, geophysics and earth science, public policy, and engineering, according to a news release from UH. The project's principal investigator is Mikyoung Jun, ConocoPhillips professor of data science at the UH College of Natural Sciences and Mathematics.

The new program's principal investigator is Mikyoung Jun. Photo via UH.edu

“It’s obvious that the Houston area is the capital for the energy field. We are supporting our local industries by presenting talented students from the five sponsoring universities and other Texas state universities with the essential skills to match the growing needs within those data science workforces,” Jun says in the release. “We’re planning all functions in a hybrid format so students located outside of Houston, too, can join in.”

Jun describes the camp as having a dual focus: both on the transition to renewable energy sources and on traditional energy, which isn't being eradicated any time soon, she explains.

Also setting the program apart is the camp's prerequisites — or lack thereof. The program is open to majors in energy-related fields, such as data science or petroleum engineering, as well as wide-ranging fields of study, such as business, art, history, law, and more.

“The camp is not part of a degree program and its classes do not offer credits toward graduation, so students will continue to follow their own degree plan,” Jun says in the release. “Our goal with the summer camp is to give students a solid footing in data science and energy-related fields to help them focus on skills needed in data science workforces in energy-related companies in Houston and elsewhere. Although that may be their first career move, they may settle in other industries later. Good skills in data processing can make them wise hires for many technology-oriented organizations.”

Jun's four co-principal investigators include Pablo Pinto, professor at UH's Hobby School of Public Affairs and director of the Center for Public Policy; Jiajia Sun, UH assistant professor of geophysics; Dvijesh Shastri, associate professor of computer science, UH-Downtown; and Yun Wan, professor of computer information systems and chair of the Computer Science Division, UH-Victoria. Eleven other faculty members from the five schools will serve as senior personnel. The initiative's energy industry partners include ConocoPhillips, Schlumberger, Fugro, Quantico Energy Solutions, Shell, and Xecta Web Technologies.

The program's first iteration will select 40 students to participate in the camp this summer. Applications, which have not opened yet, will be made available online.

The Data Science for Energy Transition project is a collaboration between five schools. Image via UH.edu

Houston companies need cybersecurity professionals — and universities can help. Photo via Getty Images

How universities can help equip Houston with a skilled cybersecurity workforce

guest column

With an increasing number of data breaches, a high job growth rate, and a persistent skills gap, cybersecurity professionals will be some of the most in-demand workers in 2022. It's more important than ever to have people who are properly trained to protect individuals, corporations, and communities.

Demand for cybersecurity talent in Texas is high. According to Burning Glass Labor Insights, employers in the Houston metro area have posted over 24,000 cybersecurity jobs since the beginning of 2021. But the pipeline of cybersecurity workers is thin, which means many local and national companies don't have enough people on the front lines defending against these attacks.

Unfortunately, it looks like the cybersecurity skills gap is far from closed. An annual industry report from the Information Systems Security Association shows that the global demand for cybersecurity skills still far exceeds the current supply of traditionally qualified individuals, with 38 percent of cybersecurity roles currently unfilled. This shortage has real-world consequences that can result in misconfigured systems and improper risk assessment and management.

How can companies help close the cybersecurity skills gap within their own organizations? We believe it will become increasingly important to look beyond “traditionally qualified” candidates and view hands-on experience as just as important as, or even more important than, the certifications or bachelor's degree requirements often found in cybersecurity job descriptions.

The top open cybersecurity roles in the Houston area include analysts, managers, engineers, and developers. Employees in these positions are essential to the everyday monitoring, troubleshooting, testing and analyzing that helps companies protect data and stay one step ahead of hackers. When looking to fill these roles, hiring managers should be looking for candidates with both the knowledge and experience to take on these critical positions.

Fortunately, Houston-based companies looking to establish, grow, or upskill their cybersecurity teams don’t have to go far to find top-tier talent and training programs. More local colleges and universities are offering alternative credential programs, like boot camps, that provide students with the deep understanding and hands-on learning they need to excel in the roles that companies need to fill.

2U, Inc. and Rice University have partnered to power a data-driven, market-responsive cybersecurity boot camp that provides students with hands-on training in networking, systems, web technologies, databases, and defensive and offensive cybersecurity. Over 40 percent of the students didn't have bachelor's degrees prior to enrolling in the program. Since launching in 2019, the program has produced more than 140 graduates, some of whom have gone on to work in cybersecurity roles at local companies such as CenterPoint Energy, Fulcrum Technology Solutions, and Hewlett Packard.

Recognizing programs like university boot camps as local workforce generators not only gives companies a larger talent pool to recruit from, but also increases the opportunity for cybersecurity teams to diversify and include professionals with different experiences and backgrounds. We’re living in a security-first world, and the right mix of cybersecurity talent is essential to keeping us protected wherever we are.

------

David Vassar is the assistant dean of the Susanne M. Glasscock School of Continuing Studies at Rice University. Bret Fund is vice president overseeing cybersecurity programs at 2U.

"Better and personalized healthcare through AI is still a hugely challenging problem that will take an army of scientists and engineers." Photo via UH.edu

Houston expert explains health care's inequity problem

guest column

We are currently in the midst of what some have called the "wild west" of AI. Though healthcare is one of the most heavily regulated sectors, the regulation of AI in this space is still in its infancy. The rules are being written as we speak. We are playing catch-up by learning how to reap the benefits these technologies offer while minimizing any potential harms once they've already been deployed.

AI systems in healthcare exacerbate existing inequities. We've seen this play out in real-world consequences, from racial bias in the American justice system and credit scoring to gender bias in resume screening applications. Programs that are designed to bring machine "objectivity" and ease to our systems end up reproducing and upholding biases with no means of accountability.

The algorithm itself is seldom the problem. It is often the data used to program the technology that merits concern. But this is about far more than ethics and fairness. Building AI tools that take account of the whole picture of healthcare is fundamental to creating solutions that work.

The Algorithm is Only as Good as the Data

By nature of our own human systems, datasets are almost always partial and rarely ever fair. As Linda Nordling comments in the Nature article "A fairer way forward for AI in healthcare," "this revolution hinges on the data that are available for these tools to learn from, and those data mirror the unequal health system we see today."

Take, for example, the finding that Black people in US emergency rooms are 40 percent less likely to receive pain medication than are white people, and Hispanic patients are 25 percent less likely. Now, imagine the dataset these findings are based on is used to train an algorithm for an AI tool that would be used to help nurses determine if they should administer pain relief medication. These racial disparities would be reproduced and the implicit biases that uphold them would remain unquestioned, and worse, become automated.

We can attempt to reduce these biases by removing the data we believe causes them from training, but there will still be hidden patterns that correlate with demographic data. An algorithm cannot take in the nuances of the full picture; it can only learn from patterns in the data it is presented with.
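To make that concrete, here is a minimal toy sketch (all data, column names, and effect sizes are fabricated for illustration, not drawn from any real health system) of how a model trained on biased treatment records keeps encoding the disparity even after the sensitive attribute is removed, because a correlated proxy such as a ZIP code remains:

```python
# Toy sketch: bias survives removal of the sensitive attribute when a
# correlated proxy (here, a binary "zip_code") stays in the features.
# Every number below is fabricated for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

group = rng.integers(0, 2, n)                                # sensitive attribute
zip_code = np.where(rng.random(n) < 0.9, group, 1 - group)   # 90%-correlated proxy
pain_score = rng.normal(5, 2, n)                             # clinical need, equal across groups

# Historical treatment labels encode a disparity: at the same pain level,
# group 1 is less likely to have received pain medication.
p_treat = 1 / (1 + np.exp(-((pain_score - 5) + (group == 0))))
treated = rng.random(n) < p_treat

# Train WITHOUT the sensitive attribute: only pain score and the proxy.
X = np.column_stack([pain_score, zip_code])
model = LogisticRegression().fit(X, treated)

# Two identical patients who differ only by ZIP code get different predictions.
same_pain = np.array([[5.0, 0.0], [5.0, 1.0]])
print(model.predict_proba(same_pain)[:, 1])
```

Run as written, the two predicted treatment probabilities differ substantially, even though the model was never shown either patient's group membership.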

Bias Creep

Data bias creeps into healthcare in unexpected ways. Consider the fact that animal models used in laboratories across the world to discover and test new pain medications are almost entirely male. As a result, many medications, including pain medication, are not optimized for females. So, it makes sense that even common pain medications like ibuprofen and naproxen have been proven to be more effective in men than women and that women tend to experience worse side effects from pain medication than men do.

In reality, male rodents aren't perfect test subjects either. Studies have also shown that both female and male rodents' responses to pain levels differ depending on the sex of the human researcher present. The stress response elicited in rodents to the olfactory presence of a sole male researcher is enough to alter their responses to pain.

While this example may seem to be a departure from AI, it is in fact deeply connected — the current treatment choices we have access to were implicitly biased before the treatments ever made it to clinical trials. The challenge of AI equity is not a purely technical problem, but a very human one that begins with the choices that we make as scientists.

Unequal Data Leads to Unequal Benefits

In order for all of society to enjoy the many benefits that AI systems can bring to healthcare, all of society must be equally represented in the data used to train these systems. While this may sound straightforward, it's a tall order to fill.

Data from some populations don't always make it into training datasets. This can happen for a number of reasons. Some data may not be as accessible or it may not even be collected at all due to existing systemic challenges, such as a lack of access to digital technology or simply being deemed unimportant. Predictive models are created by categorizing data in a meaningful way. But because there's generally less of it, "minority" data tends to be an outlier in datasets and is often wiped out as spurious in order to create a cleaner model.
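As a toy illustration of that cleaning step (every number below is fabricated), a routine outlier-trimming rule applied to a skewed dataset can silently discard most of an underrepresented group while leaving the majority nearly untouched:

```python
# Toy sketch: naive outlier trimming disproportionately erases a minority group.
# All numbers are fabricated for illustration.
import numpy as np

rng = np.random.default_rng(1)

# 9,500 majority samples centered at 100; 500 minority samples centered at 130.
majority = rng.normal(100, 10, 9_500)
minority = rng.normal(130, 10, 500)
values = np.concatenate([majority, minority])
is_minority = np.concatenate([np.zeros(9_500, bool), np.ones(500, bool)])

# A common "cleaning" rule: keep only points within 2 standard deviations of the mean.
mean, std = values.mean(), values.std()
kept = np.abs(values - mean) < 2 * std

print(f"minority kept: {(is_minority & kept).sum()} of 500")
print(f"majority kept: {(~is_minority & kept).sum()} of 9,500")
```

Because the minority group sits far from the pooled mean, the rule drops most of its samples; the resulting "cleaner" model has almost nothing left to learn from that group.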

Data source matters because this detail unquestionably affects the outcome and interpretation of healthcare models. In sub-Saharan Africa, young women are diagnosed with breast cancer at a significantly higher rate. This reveals the need for AI tools and healthcare models tailored to this demographic group, as opposed to AI tools used to detect breast cancer that are only trained on mammograms from the Global North. Likewise, a growing body of work suggests that algorithms used to detect skin cancer tend to be less accurate for Black patients because they are trained mostly on images of light-skinned patients. The list goes on.

We are creating tools and systems that have the potential to revolutionize the healthcare sector, but the benefits of these developments will only reach those represented in the data.

So, what can be done?

Part of the challenge in getting bias out of data is that high volume, diverse and representative datasets are not easy to access. Training datasets that are publicly available tend to be extremely narrow, low-volume, and homogenous—they only capture a partial picture of society. At the same time, a wealth of diverse health data is captured every day in many healthcare settings, but data privacy laws make accessing these more voluminous and diverse datasets difficult.

Data protection is of course vital. Big Tech and governments do not have the best track record when it comes to the responsible use of data. However, if transparency, education, and consent for the sharing of medical data were more purposefully regulated, far more diverse and high-volume datasets could contribute to fairer representation across AI systems and result in better, more accurate results for AI-driven healthcare tools.

But data sharing and access is not a complete fix to healthcare's AI problem. Better and personalized healthcare through AI is still a hugely challenging problem that will take an army of scientists and engineers. At the end of the day, we want to teach our algorithms to make good choices but we are still figuring out what good choices should look like for ourselves.

AI presents the opportunity to bring greater personalization to healthcare, but it equally presents the risk of entrenching existing inequalities. We have the opportunity in front of us to take a considered approach to data collection, regulation, and use that will provide a fuller and fairer picture and enable the next steps for AI in healthcare.

------

Angela Wilkins is the executive director of the Ken Kennedy Institute at Rice University.

This health tech company has made some significant changes in order to keep up with its growth. Photo via Getty Images

Houston data solutions startup rebrands, expands to support neuroscience research

startup soars

With a new CEO and chief operating officer aboard, Houston-based DataJoint is thinking small in order to go big.

Looking ahead to 2022, DataJoint aims to enable hundreds of smaller projects rather than a handful of mega-projects, CEO Dimitri Yatsenko says. DataJoint develops data management software that empowers collaboration in the neuroscience and artificial intelligence sectors.

"Our strategy is to take the lessons that we have learned over the past four years working with major projects with multi-institutional consortia," Yatsenko says, "and translate them into a platform that thousands of labs can use efficiently to accelerate their research and make it more open and rigorous."

Ahead of that shift, the startup has undergone some significant changes, including two moves in the C-suite.

Yatsenko became CEO in February after stints as vice president of R&D and as president. He co-founded the company as Vathes LLC in 2016. Yatsenko succeeded co-founder Edgar Walker, who had been CEO since May 2020 and was vice president of engineering before that.

In tandem with Yatsenko's ascent to CEO, the company brought aboard Jason Kirkpatrick as COO. Kirkpatrick previously was chief financial officer of Houston-based Darcy Partners, an energy industry advisory firm; chief operating officer and chief financial officer of Houston-based Solid Systems CAD Services (SSCS), an IT services company; and senior vice president of finance and general manager of operations at Houston-based SmartVault Corp., a cloud-based document management company.

"Most of our team are scientists and engineers. Recruiting an experienced business leader was a timely step for us, and Jason's vast leadership experience in the software industry and recurring revenue models added a new dimension to our team," Yatsenko says.

Other recent changes include:

  • Converting from an LLC structure to a C corporation structure to enable founders, employees, and future investors to be granted shares of the company's stock.
  • Shortening the business' name to DataJoint from DataJoint Neuro and recently launching its rebranded website.
  • Moving the company's office from the Texas Medical Center Innovation Institute (TMCx) to the Galleria area. The new space will make room for more employees. Yatsenko says the 12-employee startup plans to increase its headcount to 15 to 20 by the end of this year.

Over the past five years, the company's customer base has expanded to include neuroscience institutions such as Princeton University's Princeton Neuroscience Institute and Columbia University's Zuckerman Institute for Brain Science, as well as University College London and the Norwegian University of Science and Technology. DataJoint's growth has been fueled in large part by grants from the U.S. Defense Advanced Research Projects Agency (DARPA) and the Brain Research Through Advancing Innovative Neurotechnologies (BRAIN) Initiative at the National Institutes of Health (NIH).

"The work we are tackling has our team truly excited about the future, particularly the capabilities being offered to the neuroscience community to understand how the brain forms perceptions and generates behavior," Yatsenko says.

Houston startup teams up with nonprofit research institute for decarbonization pilot

seeing green

A Houston tech company has joined forces with a nonprofit to test a new sustainable fuel production process.

The project is a joint effort from Houston-based Syzygy Plasmonics and nonprofit research institute RTI International and sponsored by Equinor Ventures and Sumitomo Corporation of Americas. Based in the RTI facility in Research Triangle Park, North Carolina, the six-month pilot is testing a way to convert two potent greenhouse gases — carbon dioxide (CO2) and methane (CH4) — into low-carbon-intensity fuels, which have the potential to replace petroleum-based jet fuel, diesel, and gasoline.

"This demonstration will be the first of its kind and represents a disruptive step in carbon utilization. The sustainable fuels produced are expected to quickly achieve cost parity with today's fossil fuels," says Syzygy CEO Trevor Best in a news release. "Integrating our technology with RTI's Fischer-Tropsch synthesis system has the potential to significantly reduce the carbon intensity of shipping, trucking, and aviation without requiring major fleet modifications."

According to Syzygy, the pilot is a step toward being able to scale the process to a commercial-ready Syzygy e-fuels plant.

"By making minor adjustments in the process, we also expect to produce sustainable methanol using the same technology," Best continues.

An independent research institute, RTI International's focus is on improving the human condition. The multidisciplinary nonprofit seeks to support science-based solutions like Syzygy's technology, which has already proven its scale-up capabilities in earlier testing.

Through the partnership, RTI will assist Syzygy with process design and systems integration for the pilot-scale demonstration. Once it reaches commercial scale, the technology is expected to convert millions of tons of CO2 per year into sustainable fuels.

"We are excited about the opportunity to collaborate with Syzygy to test and assist in the scale-up of this promising technology," says Sameer Parvathikar, Ph.D., the director of the Renewable Energy and Energy Storage program in RTI's Technology Advancement and Commercialization business unit. "This work aligns with our capabilities, our goals of helping de-risk and commercialize novel technologies, and our vision to address the world's most critical problems with science-based solutions."

Houston researcher tapped for prestigious fellowship for offshore safety innovation

big win

A University of Houston professor has been selected by a national organization to “contribute to the understanding, management and reduction of systemic risk in offshore energy activities.”

The Gulf Research Program of the National Academies of Sciences, Engineering, and Medicine announced that Harish Krishnamoorthy, assistant professor of electrical and computer engineering at the University of Houston, is one of four early-career research fellows selected in the Offshore Energy Safety track. Krishnamoorthy is the first researcher from UH selected for the recognition.

“I am happy and honored to be the first one, but hopefully there will be a lot more in the coming years,” Krishnamoorthy says in a UH news release.

The award, which isn't granted based on a specific project, includes a $76,000 grant, mentor support, and access to a network of current and past cohorts.

Created in 2013, the Gulf Research Program is an independent, science-based program founded as part of legal settlements with the companies involved in the 2010 Deepwater Horizon disaster. Its goal is "to enhance offshore energy system safety and protect human health and the environment by catalyzing advances in science, practice and capacity, generating long-term benefits for the Gulf of Mexico region and the nation," the release reads.

“These exceptional individuals are working hard to pursue new research, technical capabilities, and approaches that address some of the greatest challenges facing the Gulf and Alaska regions today,” says Karena Mary Mothershed, senior program manager for the Gulf Research Program’s Board on Gulf Education and Engagement. “We are incredibly excited to announce these new Early-Career Research Fellows, and to continue supporting them as they make lasting impacts.”

Krishnamoorthy, who also serves as associate director of the Power Electronics, Microgrids and Subsea Electric Systems Center at UH, has expertise in power electronics, power converters, and offshore technologies. His research interests include high-density power conversion for grid interface of energy systems, machine learning-based methods for improving the quality and reliability of power electronics, and advanced electronics and control for mission-critical applications.

According to Krishnamoorthy, there are around 1,500 offshore rigs, with a large share located in the North Sea and the Gulf of Mexico. There's a need to improve existing systems, he says, and this process of evolving the grid comes with safety risks and challenges.

“When there are so many electronics involved, safety and reliability are going to be very critical,” Krishnamoorthy says in the release. “I have been looking at safety aspects a lot in my research as well as how to connect subsea oil and gas systems with offshore renewable systems.”

In 2022, Krishnamoorthy was recognized as an OTC Emerging Leader at the Offshore Technology Conference for his contributions to offshore safety and workforce development, as well as to reducing carbon emissions.

Pitch perfect: What investors are looking for, according to Houston research

houston voices

Pitching to a venture capitalist is not only the most challenging part of building a startup, it’s also the most important. You can have the next pet rock idea, but nobody will ever experience it and you’ll never make a dime if the genius of this product cannot be expressed in an investor pitch. Okay, so pet rock isn’t the best example.

Let’s say you have a product that gets rid of stretch marks overnight. Great idea, right? Of course. But if you’re in front of an investor and they ask you how your product works, and you can’t answer them, your idea will forever remain just that: an idea. It’ll never manifest itself materially, which is your goal.

Did you know that the average venture capitalist holds around 500 in-person meetings per year? Further, did you know that only one in every 10 startups will make it past the first meeting?

With so many meetings with startup founders, you better believe that investors are practically looking for reasons to pass on you and your cordless extension cord. Or whatever fakakta contraption you've developed in your garage.

Well, with so much importance placed on first impressions, here are some of the most important things investors look for and notice when you pitch to them:

Value proposition

This is what separates you from the pack. This is what makes your startup a standout. A value proposition shows an investor your company’s competitive advantage. If you can explain to your potential investor why it would be their folly if they invested in a competitor over your startup, then you’ll be that much closer to rolling out your product to market. Investors want to see a product or service that is unique because that means less competition, and less risk involved.

Entrepreneurship

Sure, you might be a brilliant scientist. You may have developed nanotechnology that eviscerates dirt and bacteria so you don’t have to shower anymore. But have you put together a team that can make your company a successful business? Do you have team members with experience in whatever it is your startup does? Do you have people with credibility congruent with your startup? Your pitch is a way for investors to find these things out. If you can show them that your team has experience, passion, insightfulness, and expertise, investors will feel much better about taking a chance on you.

Confidence is key

Investors can tell if a founder is confident, but not overconfident, about how far they've come and how far they know they can go. During a pitch, investors can also tell whether your team is a cohesive unit or parts of a fractured whole.

Anatomy of an investor pitch

Your potential investor will notice if your pitch is structured well. They'll take note of whether it's designed well and ask themselves if it's authentic. Does it cover business metrics? Is it concise and to the point? Is the founder communicating something complex in a simple way? Doing so shows a total grasp of your product: the science behind it, plus the business aspect of it.


------

This article originally appeared on the University of Houston's The Big Idea. Rene Cantu was a writer and editor at the UH Division of Research.