Absolutism has no bearing on the scientific process. Graphic by Miguel Tovar/University of Houston

Science, like politics, can elicit polarizing opinions. But with an ever-expanding body of knowledge — and the especially dizzying flurry of findings during the pandemic — is it fair to say that views on science are becoming more extreme?

Measuring the polarization

“A standard way of measuring polarization in the U.S. is asking Democrats and Republicans how warmly they feel toward members of their own group and members of their outgroup on a feeling thermometer from 0 to 100,” said Jessica Gottlieb, professor at the UH Hobby School of Public Affairs. “The difference in ingroup-outgroup warmth is then considered a measure of polarization. This has been measured by the American National Elections Studies systematically over the past several decades, and indeed the level of affective polarization has been increasing in the U.S.”
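
As a rough illustration of the measure Gottlieb describes, here is a minimal sketch in Python. The respondent numbers are invented and this is a simplification, not the ANES instrument itself: each respondent's gap is their ingroup warmth minus their outgroup warmth, and the sample average of those gaps serves as the polarization figure.

```python
# Minimal sketch (invented data): affective polarization as the average gap
# between ingroup and outgroup feeling-thermometer scores on a 0-100 scale.
respondents = [
    {"party": "D", "warmth_own": 85, "warmth_other": 20},
    {"party": "R", "warmth_own": 78, "warmth_other": 25},
    {"party": "D", "warmth_own": 70, "warmth_other": 40},
]

# Each respondent's gap: warmth toward own group minus warmth toward the outgroup.
gaps = [r["warmth_own"] - r["warmth_other"] for r in respondents]

average_polarization = sum(gaps) / len(gaps)
print(f"Average ingroup-outgroup gap: {average_polarization:.1f} points")
```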


In an article in Foreign Affairs entitled, “How Extremism Went Mainstream,” the author notes that “the tools that authorities use to combat extremists become less useful when the line between the fringe and the center starts to blur.”

Science has traditionally been one such tool. However, this extremism in politics, where everything is black and white, has made its unfortunate way into academia. John Lienhard is a professor at the University of Houston and host of "Engines of Our Ingenuity," a national radio program that has been telling stories of how creativity has shaped our culture since 1988. According to Lienhard, extremism as it appears within the scientific enterprise goes by a different name.

“Absolutism is the culprit – the need on the part of so many of us to know The Right Answer. The absolutists in the world will glom onto whatever vehicle suits them – religion, politics, education, and ultimately, science itself,” said Lienhard. In other words, good scientists amend and revise, while “the absolutist finds the honest practice of science hateful,” he says, “because science is a way of life where everything lies open to question.”

A series of approximations

In an article entitled, “If You Say Science Is Right You’re Wrong,” professor Naomi Oreskes introduces this quote by Nobel Prize–winning physicist Steven Weinberg:

“Even though a scientific theory is in a sense a social consensus, it is unlike any other sort of consensus in that it is culture-free and permanent.”

Well, no. Even a modest familiarity with the history of science offers many examples of matters that scientists thought they had resolved, only to discover that they needed to be reconsidered.

Some familiar examples are Earth as the center of the universe, the absolute nature of time and space, the stability of continents and the cause of infectious disease.

Absolutism in science is dangerous. Good scientists know how important it is to ask probing questions. In his book Science versus Absolutism: Science Approaches Truth by a Series of Approximations, the chemist T. Swann Harding asks: "What are scientific laws?" He goes on to answer:

“Most people appear to regard them as singularly exact and unalterable things … to violate them brings swift retribution. They are unchanging and eternal in character. Yet the so-called laws of science are really rules pieced together by man on a basis of much observation and experiment.”

In the past, so much of science was just plain wrong – until another researcher came along and amended the original belief (think Galileo). Our modern times are no different: there are still many situations where scientific thought needs to be revised. As recently as the COVID-19 crisis, researchers were updating their understanding of how the disease spreads and how contagious it is.

Allowing for dissent

In a Scientific American blog post, Matt Nolan writes that "Dissent in Science Is Essential–up to a Point." In it, he says, "It is the public who pay the price when marginalized science informs policy. History reminds us this is unsafe territory." Lienhard, however, notes that Einstein set limits on the validity of Newton's laws, just as nuclear fission amended the law of conservation of energy. Wherever experimentation is being conducted, there is always a new question to formalize.

Another predicament, referred to as the "file drawer effect," occurs when a researcher does not get the answer they were expecting and decides not to publish the negative findings. Every answer is meaningful. And sometimes a negative answer — or no answer — is an answer.

Dissent, and perhaps a certain measure of disappointment, is a critical part of scientific inquiry.

The Big Idea

Science can be thought of as the best we know, to the degree we understand a given problem at a given place and time. Absolutism has no bearing on the scientific process, and in some cases it actively obscures and colors that understanding. And that's not black and white at all; that's about as gray as it gets.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

If there are fewer grant proposals, does that mean innovation has slowed? UH gets to the bottom of the question. Graphic by Miguel Tovar/University of Houston

University of Houston: What a drop in NSF proposals means for the country's rate of innovation

Houston Voices

A 17 percent drop in proposals to the National Science Foundation over the past decade may be a mixed blessing.

A consistently rising budget, measured in billions of dollars, is the preferred way to keep the number of funded proposals growing. But a dip in the number of proposals submitted in the first place has a similar effect on the success rate: with a smaller pool of submissions, a larger share of proposals gets funded.

In an article for Science Magazine, author Jeffrey Mervis poses the question: Has there been a decline in grant-worthy ideas? In NSF’s biology sector, Mervis notes that “demand has tumbled by 50 percent over the decade and the chances of winning a grant have doubled, from 18 percent in 2011 to 36 percent in 2020.” NSF’s leadership suggests two possible reasons for this phenomenon.
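
To make the arithmetic behind those figures concrete, here is a back-of-the-envelope sketch in Python. The proposal and award counts below are hypothetical, chosen only to reproduce the 18-to-36-percent shift Mervis cites; the point is that with a roughly flat number of awards, halving the proposal pool doubles the success rate.

```python
# Hypothetical numbers for illustration: a roughly constant number of awards
# combined with a 50 percent drop in proposals doubles the success rate.
def success_rate(awards: int, proposals: int) -> float:
    return awards / proposals

awards_per_year = 1_800   # hypothetical, roughly flat award count
proposals_2011 = 10_000   # hypothetical baseline demand
proposals_2020 = 5_000    # hypothetical 50 percent drop in demand

print(f"2011 success rate: {success_rate(awards_per_year, proposals_2011):.0%}")  # 18%
print(f"2020 success rate: {success_rate(awards_per_year, proposals_2020):.0%}")  # 36%
```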

Eliminating fixed deadlines

"Dear Colleague" letters from numerous NSF directorates notified principal investigators that fixed deadlines for small projects ($500,000 and less) would be taken out of the equation. For instance, the Directorate for Computer and Information Science and Engineering's letter said deadlines would be eliminated "in order to allow principal investigators (PIs) more flexibility and to better facilitate interdisciplinary research across disciplines." The letter goes on to state that, freed from fixed deadlines, PIs will be able to think more creatively and collaboratively – without the added stress of a deadline.

Wouldn't less stress mean more applications? That doesn't seem to be the case. In one instance, according to another article in Science, proposal numbers dropped when a program ended its annual deadlines and replaced them with rolling submissions.

Reducing stress for grant reviewers

That article goes on to say that these changes alleviate the strain on grant reviewers without lowering standards. James Olds, assistant director for the Directorate for Biological Sciences, anticipated that NSF program managers would get somewhat of a break, and that the new policy would keep the university administrators who process the applications from being overwhelmed.

Other factors at play

“It is highly unlikely there was one specific reason for the decrease,” said David Schultz, assistant vice president for Sponsored Projects in the Office of Contracts and Grants at the University of Houston, “but rather multiple factors contributing over time. One potential cause is that many major research institutions are diversifying their funding sources away from NSF and into other federal agencies more aligned with their strategic areas of research interest, such as NIH, DOD, and DOE. The NIH has seen an 11 percent increase in proposals over the same period, from 49,592 in 2011 to 55,038 in 2020.”

Tenure

“Another component is the documented decrease in the number of tenured faculty across the nation. Generally tenured faculty are more research-focused, as their ability to obtain externally funded research is a major criterion for promotion and tenure,” said Schultz. “While this may lead to fewer proposals, it does encourage new tenure track faculty to focus more efforts on the higher likelihood of being awarded an NSF grant.”

The Big Idea

Some people work better and more efficiently when presented with a deadline. Could that be the reason fewer proposals are being turned in? In his article, Mervis deliberates over whether the drop in proposals means that the nation is innovating more slowly than before. But how could that be?

The National Science Board, NSF's presidentially appointed oversight body, is trying to get to the bottom of the issue so it can be mitigated. Olds stands by the decision to remove deadlines, pointing out that it should be the strength of the proposal, not the threat of a deadline, that motivates the research project.

Schultz sees a silver lining. “With fewer proposals being submitted to the NSF, the shift creates an opportunity for smaller, emerging universities to increase their proposal submission and success rates.”

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

Every situation is unique and deserves a one-of-a-kind data management plan, not a one-size-fits-all solution. Graphic by Miguel Tovar/University of Houston

Houston research: Why you need a data management plan

Houston Voices

Why do you need a data management plan? It mitigates error, increases research integrity and allows your research to be replicated – despite the “replication crisis” that the research enterprise has been wrestling with for some time.

Error

There are many horror stories of researchers losing their data. You can just plain lose your laptop or an external hard drive. Sometimes they are confiscated if you are traveling to another country — and you may not get them back. Some errors are more nuanced. For instance, a COVID-19 repository of contact-traced individuals was missing nearly 16,000 results because an Excel worksheet cannot hold more than roughly one million rows.
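
As a small illustration of how that kind of silent loss can be avoided up front, here is a minimal sketch in Python, assuming a pandas DataFrame named `results` (a hypothetical name): check the row count against the .xlsx worksheet limit before exporting, and fall back to a format without that ceiling.

```python
# Minimal sketch: avoid silently truncating a dataset that exceeds Excel's
# worksheet row limit (1,048,576 rows in the .xlsx format) by falling back to CSV.
import pandas as pd

EXCEL_ROW_LIMIT = 1_048_576

def export_results(results: pd.DataFrame, basename: str) -> str:
    """Write results to .xlsx if they fit in one worksheet, otherwise to .csv."""
    if len(results) < EXCEL_ROW_LIMIT:
        path = f"{basename}.xlsx"
        results.to_excel(path, index=False)
    else:
        path = f"{basename}.csv"
        results.to_csv(path, index=False)
    return path
```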

Do you think a hard drive is the best repository? Keep in mind that 20 percent of hard drives fail within the first four years. Some researchers merely email their data back and forth and feel like it is “secure” in their inbox.

The human and machine error margins are wide. Continually backing up your results, while good practice, can’t ensure that you won’t lose invaluable research material.

Repositories

According to Reid Boehm, Ph.D., Research Data Management Librarian at the University of Houston Libraries, your best bet is to utilize research data repositories. “The systems and the administrators are focused on file integrity and preservation actions to mitigate loss and they often employ specific metadata fields and documentation with the content,” Boehm says of the repositories. “They usually provide a digital object identifier or other unique ID for a persistent record and access point to these data. It’s just so much less time and worry.”

Integrity

Losing data or being hacked can challenge data integrity. Data breaches not only compromise research integrity; they can also be extremely expensive. According to Security Intelligence, the global average cost of a data breach in a 2019 study was $3.92 million, a 1.5 percent increase from the previous year's study.

Sample size — how large or small a study was — is another factor that affects data integrity. Retraction Watch tracks approximately 1,500 articles retracted annually from prestigious journals for "sloppy science." One of the main reasons papers end up being retracted is that the sample size was too small to be a representative group.

Replication

Another metric for measuring data integrity is whether or not the experiment can be replicated. The ability to recreate an experiment is paramount to the scientific enterprise. A Nature survey article titled "1,500 scientists lift the lid on reproducibility" reported that "73 percent said that they think that at least half of the papers can be trusted, with physicists and chemists generally showing the most confidence."

However, according to Kelsey Piper at Vox, “an attempt to replicate studies from top journals Nature and Science found that 13 of the 21 results looked at could be reproduced.”

That's so meta

The archivist Jason Scott said, “Metadata is a love note to the future.” Learning how to keep data about data is a critical part of reproducing an experiment.

"While this will always be determined by a combination of project specifics and disciplinary considerations, descriptive metadata should include as much information about the process as possible," said Boehm. Details of workflows, standard operating procedures, parameters of measurement, clear definitions of variables, code and software specifications and versions, and many other signifiers ensure the data will be of use to colleagues in the future.

In other words, making data accessible, usable and reproducible is of the utmost importance. You make reproducing experiments that much easier if you do a good job of capturing metadata in a consistent way.
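
What might that look like in practice? Here is a minimal sketch of a metadata "sidecar" file written alongside a dataset. The field names and values are purely illustrative, not a repository standard; in practice, follow the schema your repository or discipline prescribes.

```python
# Minimal sketch: write an illustrative metadata sidecar file next to a dataset.
# Field names and values below are hypothetical examples, not a formal standard.
import json

metadata = {
    "title": "Example survey responses, wave 1",
    "creator": "Jane Researcher",
    "date_collected": "2021-03-15",
    "variables": {
        "resp_id": "Unique respondent identifier (integer)",
        "score": "Response on a 0-100 scale",
    },
    "software": {"python": "3.11", "pandas": "2.2"},            # versions used in the analysis
    "workflow": "cleaned with clean_survey.py, commit abc1234",  # hypothetical processing note
    "license": "CC-BY-4.0",
}

with open("survey_wave1.metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```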

The Big Idea

A data management plan covers the storage, curation, archiving and dissemination of research data. Your university's digital librarian is an invaluable resource. They can answer other tricky questions as well, such as: who does the data belong to? And when a postdoctoral researcher in your lab leaves the institution, can they take their data with them? Every situation is unique and deserves a one-of-a-kind data management plan, not a one-size-fits-all solution.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

Let's talk about dark data — what it means and how to navigate it. Graphic by Miguel Tovar/University of Houston

Houston expert: Navigating dark data within research and innovation

Houston Voices

Is it necessary to share ALL your data? Is transparency a good thing, or does it make researchers "vulnerable," as author Nathan Schneider suggests in the Chronicle of Higher Education article "Why Researchers Shouldn't Share All Their Data"?

Dark Data Defined

Dark data is defined as the universe of information an organization collects, processes and stores – oftentimes for compliance reasons. In research, dark data never makes it into the official publication part of the project. According to the Gartner Glossary, "storing and securing data typically incurs more expense (and sometimes greater risk) than value."

This topic is reminiscent of the file drawer effect: the tendency for a study's results to influence whether or not the study is published. Negative results can be just as important as hypotheses that are proven.

Publication bias, and the pressure to publish only positive research that supports the PI's hypothesis, is arguably not good science. In an article in the Indian Journal of Anaesthesia, Priscilla Joys Nagarajan et al. write: "It is speculated that every significant result in the published world has 19 non-significant counterparts in file drawers." That's one definition of dark data.

Total Transparency

But what to do with all your excess information that did not make it to publication, most likely because of various constraints? Should everything, meaning every little tidbit, be readily available to the research community?

Schneider doesn’t think it should be. In his article, he writes that he hides some findings in a paper notebook or behind a password, and he keeps interviews and transcripts offline altogether to protect his sources.

Open-source

Open-source software communities tend to regard total transparency as inherently good. What are the advantages of total transparency? You may make connections between projects that you wouldn’t have otherwise. You can easily reproduce a peer’s experiment. You can even become more meticulous in your note-taking and experimental methods since you know it’s not private information. Similarly, journalists will recognize this thought pattern as the recent, popular call to engage in “open journalism.” Essentially, an author’s entire writing and editing process can be recorded, step by step.

TMI

This trend has led researchers to tools like Jupyter and GitHub, which record every change that occurs along a project's timeline. But are unorganized, excessive amounts of unpublishable data really what transparency means? Or do they confuse those looking for meaningful research that is meticulously curated?

The Big Idea

And what about the "vulnerability" claim? Sharing every edit and every new direction taken can open a scientist up to scoffers and even harassment. In industry, radical transparency can extend to publishing salaries, which can feel unfair to underrepresented, marginalized populations.

In Model View Culture, Ellen Marie Dash wrote: “Let’s give safety and consent the absolute highest priority, with openness and transparency prioritized explicitly below those. This means digging deep, properly articulating in detail what problems you are trying to solve with openness and transparency, and handling them individually or in smaller groups.”

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

The University of Houston has tips for doing your due diligence when it comes to avoiding unintentional plagiarism. Graphic by Miguel Tovar/University of Houston

Houston expert: How to avoid unintentional plagiarism in your research work

Houston Voices

Plagiarism is the use of someone else’s words, ideas, or visuals as if they were your original work. Unintentional plagiarism is plagiarism that results from the disregard for proper scholarly procedures. It’s much easier to commit than one would think, and it has toppled giants in the research enterprise.

From 2007 to 2020, the National Science Foundation made 200 research misconduct findings, 78 percent of which were related to plagiarism. Here are some dos and don'ts that will help you avoid unintended plagiarism, a potentially career-killing misstep.

The dos and don'ts

Don’t paraphrase without citing

In a study of 63,700 students, Rutgers University Business School found that 36 percent of undergraduates admit to "paraphrasing/copying few sentences from Internet source without footnoting it."

Don’t forget to add the quotation marks

And don't forget to properly list your sources at the end of the paper, even if you already used in-text or footnote citations to give credit to the primary author.

Don’t copy and paste placeholders

You may mean to go back and rewrite the pasted text in your own words, but you are liable to forget or run out of time. (More on this later.) If you copy and paste from a previously published paper of your own, it's not research misconduct, but it is considered bad practice if you don't cite it. This is called self-plagiarism.

Do make sure your hypothesis or subject is your own

Plagiarism of ideas occurs when a researcher appropriates an idea, such as a theory or conclusion — whole or in part — without giving credit to its originator. Acknowledge all sources!

Peer review is supposed to be confidential, and colleagues put their trust in each other during this process, assuming there will be no theft of ideas. Once the paper is published in a peer-reviewed journal, it should be cited.

Do use direct quotes

But quoted material should not make up more than 10 percent of the entire article.

Failing to use your own "voice" or "tone" can also be construed as plagiarism, depending on how distinctive the original author's voice is. When in doubt about an especially distinctive turn of phrase, use quotation marks and cite.

When paraphrasing, the syntax should be different enough to be considered your own words. This is tricky because you need to understand the primary work in its original language in order to reword it without just moving words around. In other words, no shuffling words!

Do cite facts widely acknowledged to be true (just in case!)

If it’s something that is generally held within your discipline to be true, or it’s a fact that can be easily looked up – like the year a state passed a certain law – there’s no need to cite “Google” or any generic platform, but it’s better to be safe than sorry. Someone reading your work might not have a background in your discipline.

Do run your paper through a plagiarism-detecting tool

Some options are Turnitin (www.turnitin.com) and iThenticate (http://www.ithenticate.com).

Sanctions

There are consequences for plagiarizing another's work. If you're a faculty member, the sanctions could affect your career. For instance, according to retractionwatch.com, a prominent researcher and university leader was recently found to have engaged in misconduct: Terry Magnuson was accused of, and later admitted to, plagiarizing unintentionally.

In an open letter to his university colleagues, Magnuson wrote a startlingly candid statement: “You cannot write a grant spending 30 minutes writing and then shifting to deal with the daily crises and responsibilities of a senior leadership position in the university, only to get back to the grant when you find another 30 minutes free.”

He goes on to say: “I made a mistake in the course of fleshing out some technical details of the proposed methodology. I used pieces of text from two equipment vendor websites and a publicly available online article. I inserted them into my document as placeholders with the intention of reworking the two areas where the techniques —which are routine work in our lab — were discussed. While switching between tasks and coming back to the proposal, I lost track of my editing and failed to rework the text or cite the sources.” Taking responsibility for this oversight, he resigned.

And that brings us to the Big Idea…

The Big Idea

The one thing that trips up even the most seasoned writers is not leaving enough time to properly cite all of one's sources. Give yourself a few extra days (weeks?) to finish your paper, and have a peer read it over to flag any questionable facts or quotes that might need to be cited more appropriately.

Funding agencies take plagiarism very seriously. For instance, the NSF provides prevention strategies by implementing a pre-submission process, and is also attempting to make plagiarism detection software available.

You also may want to take advantage of resources in your university’s library or writing center. There are also several tools to help you organize your citations; one called RefWorks will keep track of your sources as you write in-text citations or footnotes.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research. It's based on a workshop given by Penny Maher and Laura Gutierrez, senior research compliance specialists at the University of Houston.

There are a few things to remember about altmetrics when tapping into non-traditional methods of metrics reporting. Graphic by Miguel Tovar/University of Houston

University of Houston: How to navigate 'altmetrics' in your innovative research project

Houston Voices

Alternative metrics, or “altmetrics,” refers to the use of non-traditional methods for judging a researcher’s reach and impact.

Being published in a peer-reviewed journal is surely a great feat. It's the typical way professors get their research out there. But the tools established to measure this output might end up giving a skewed impression of an author's impact in both academic and social spheres.

Traditional metrics

Web of Science and Scopus are the main platforms that researchers rely on for collecting article citations. Web of Science's indexing goes back to 1900, and Scopus boasts the largest abstract and citation database. The caveat with these repositories is that each resource only counts citations from the range and breadth of journals it indexes. Different journals are covered by different tools, so you may not be getting a comprehensive metric from either.

Let’s talk about h index

The h index is probably never going away, although it is always being improved upon.

The h index is a single number that tells the story of how often a researcher is cited: it is the largest number h such that the researcher has published h papers that have each been cited at least h times. For instance, if a scholar published six papers, and all six papers were each cited at least six times, they would have an h index of 6.

The metric doesn't work out too well for an academic who, say, had one paper that was continuously cited – they would still have an h index of 1. Brené Brown, Ph.D., even with her veritable empire of vulnerability- and shame-related self-help, has an h index of 7 according to Semantic Scholar.
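
For readers who want the metric spelled out, here is a minimal sketch in Python that reproduces both examples above; the citation counts are illustrative.

```python
# Minimal sketch: compute an h index from a list of per-paper citation counts.
# The h index is the largest h such that h papers each have at least h citations.
def h_index(citations: list[int]) -> int:
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank      # this paper still clears the bar
        else:
            break         # all later papers have even fewer citations
    return h

print(h_index([6, 6, 6, 6, 6, 6]))  # 6 -- six papers, each cited at least six times
print(h_index([1000]))              # 1 -- one heavily cited paper still yields h = 1
```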

On to altmetrics

When a psychology professor goes on a morning show to discuss the self-esteem of young Black women, for instance, she is not helping her h index. Her societal impact is huge, however.

“When I use altmetrics to deliver a professor his or her impact report, I seek out nontraditional sources like social media. For instance, I check how many shares, comments or likes they received for their research. Or maybe their work was reported in the news,” said Andrea Malone, Research Visibility and Impact Coordinator at the University of Houston Libraries.

Altmetrics aim to answer the question of how academia accounts for the numerous other ways scholarly work impacts our society. What about performances done in the humanities, exhibitions, gallery shows or novels published by creative writers?

Alternative metrics are especially important for research done in the humanities and arts but can offer social science and hard science practitioners a better sense of their scope as well. With the constant connections we foster in our lives, the bubble of social media and such, there is a niche for everyone.

The equalizer

For some, Twitter or Facebook is where they like to publish or advertise their data or results.

“When altmetrics are employed, the general public finds out about research, and is able to comment, share and like. They can talk about it on Twitter. The impact of the work is outside of academia,” said Malone. She even checks a database to see if any of the professor’s works have been included in syllabi around the country.

Academia.edu is another social network offering a platform for publishing and searching scholarly content. It has a fee for premium access, whereas Google Scholar is free. Its profile numbers are usually high because it can pick up any public data – even a PowerPoint slide.

The Big Idea

At the University of Houston, altmetrics are categorized thusly: articles, books and book chapters, data, posters, slides and videos. While one would think there’s no downside to recording all of the many places academic work ends up, there are a few things to remember about altmetrics:

  1. They lack a standard definition, though the NISO Alternative Assessment Metrics Initiative is currently working on one.
  2. Altmetrics data are not normalized. Tell a story with your metrics, but don't compare between two unlike sources. YouTube and Twitter will deliver different insights about your research, but they can't be compared as though they measure the same exact thing.
  3. They are time-dependent. Don’t be discouraged if an older paper doesn’t have much to show as far as altmetrics. The newer the research, the more likely it will have a social media footprint, for example.
  4. They have known tracking issues. Altmetrics work best with items that have a Digital Object Identifier (DOI); a small lookup sketch follows this list.
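
Because these tools key off DOIs, a quick way to see what has been tracked for a given paper is to query by DOI. Here is a minimal sketch, assuming the public Altmetric details endpoint (https://api.altmetric.com/v1/doi/<doi>), which returns JSON for tracked items and an error otherwise; the DOI and the fields printed are examples, and heavier use requires an API key under Altmetric's terms.

```python
# Minimal sketch: look up what Altmetric has tracked for a paper by its DOI.
# Assumes the free public endpoint; the DOI below is only an example.
import json
import urllib.request

def altmetric_summary(doi: str) -> dict:
    url = f"https://api.altmetric.com/v1/doi/{doi}"
    with urllib.request.urlopen(url) as response:  # raises HTTPError (404) if untracked
        return json.load(response)

data = altmetric_summary("10.1038/nature12373")  # example DOI
print(data.get("score"), data.get("cited_by_tweeters_count"))
```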

So have an untraditional go of it and enlist help from a librarian or researcher to determine where your research is making the biggest societal impact.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.


7 lessons from a Houston-based unicorn startup founder

taking notes

At a fireside chat at SXSW, a Houston founder pulled back the curtain on his entrepreneurial journey that's taken him from an idea of how to make the chemicals industry more sustainable to a company valued at over $2 billion.

Gaurab Chakrabarti, the CEO and co-founder of Solugen, joined the Greater Houston Partnership's Houston House at SXSW on Monday, March 13, for a discussion entitled, "Building a Tech Unicorn." In the conversation with Payal Patel, principal of Softeq Ventures, he shared the trials and tribulations of the early days of founding Solugen. The company, which has raised over $600 million since its founding in 2016, has an innovative and carbon-negative process for creating plant-derived substitutes for petroleum-based products.

The event, which quickly reached capacity with eager SXSW attendees, allowed Chakrabarti to impart advice on several topics — from early customer acquisition and navigating VC investing to finding the right city to grow in and setting up a strong company culture.

Here are seven pieces of startup advice from Chakrabarti's talk.

1. Don’t be near a black hole.

Chakrabarti began his discussion by addressing the good luck he's had standing up Solugen. He's the first to admit that luck is an important element of his success, but he says that, as a founder, you can set yourself up for luck in a handful of ways.

“You do make your own luck, but you have to be putting in the work to do it," Chakrabarti says, adding that it's not an easy thing to accomplish. “There are things you can be doing to increase your luck surface area."

One of the principles he notes is not surrounding yourself with black holes. These are people who don't believe in your idea or your ability to succeed, Chakrabarti explains, referencing a former dean who said he was wasting his talent on his idea for Solugen.

2. The co-founder dynamic is the most important thing.

Early on, Chakrabarti emphasizes how important a strong co-founder relationship is, crediting Solugen's co-founder and CTO Sean Hunt for being his "intellectual ping-pong partner."

“If you have a co-founder, that is the thing that’s going to make or break your company,” he says. “It’s not your idea, and it’s not your execution — it’s your relationship with your co-founder.”

Hunt and Chakrabarti have been friends for 12 years, Chakrabarti says, and that foundation, along with the fact that they've been passionate about their product since day one, has been integral to Solugen's success.

"We had a conviction that we were building something that could be impactful to the rest of the world," he says.

3. Confirm a market of customers early on.

Chakrabarti says that in the early days of starting his company, he didn't have a concept of startup accelerators or other ways to access funding — he just knew he had to get customers to create revenue as soon as possible.

He learned about the growing float spa industry, and how a huge cost for these businesses was the peroxide used to sanitize the water in the floating pods. Chakrabarti and Hunt had created a small amount of what they were calling bioperoxide that they could sell at a lower cost to these spas and still pocket a profit.

"We ended up owning 80 percent of the float spa market," Chakrabarti says. "That taught us that, 'wow, there's something here.'"

While it was unglamorous work calling float spas across Texas, his efforts secured Solugen's first 100 or so customers and identified a path to profitability early on.

"Find your niche market that allows you to justify that your technology or product has a customer basis," Chakrabarti says of the lesson he learned through this process.

4. Find city-company fit.

While Chakrabarti has lived in Houston most of his life, the reason Solugen is headquartered in Houston is not loyalty to his hometown.

In fact, Chakrabarti shared a story of how a potential seed investor asked him and Hunt to move their company to the Bay Area, and the co-founders refused the offer and the investment.

“There’s no way our business could succeed in the Bay Area," Chakrabarti says. He and Hunt firmly believed this at the time — and still do.

“For our business, if you look at the density of chemical engineers, the density of our potential customers, and the density of people who know how to do enzyme engineering, Houston happened to be that perfect trifecta for us," he explains.

He argues that every company — software, hardware, etc. — has an opportunity to find its ideal city-company fit, something that's important to its success.

5. Prove your ability to execute.

When asked about pivots, Chakrabarti told a little-known story of how Solugen started a commercial cleaning brand. The product line was called Ode to Clean, and it was marketed as eco-friendly peroxide wipes. At the time, Solugen had just three employees, and the scrappy team was fulfilling orders and figuring out consumer marketing for the first time.

He says his network was laughing at the idea of him creating this direct-to-consumer cleaning product, and it was funny to him too, but the sales told another story.

At launch, they sold out of $1 million of inventory in one week. But that wasn't all.

“Within three months, we got three acquisition offers," Chakrabarti says.

The move led to an acquisition of the product line by the nation's largest cleaning wipe provider. It meant three years of predictable revenue that de-risked the business for new investors — who were now knocking on Solugen's door with their own investment term sheets.

“It told the market more about us as a company,” he says. “It taught the market that Solugen is a company that is going to survive no matter what. … And we’re a team that can execute.”

What started as a silly idea led to Solugen being one step closer to accomplishing its long-term goals.

“That pivot was one of the most important pivots in the company’s history that accelerated our company’s trajectory by four or five years," Chakrabarti says.

6. Adopt and maintain a meso-management style.

There's one lesson Chakrabarti says he learned the hard way, and that was how to manage his company's growing team. He shares that he "let go of the reins a bit" at the company's $400-$500 million point. He says that, while there's this idea that successful business leaders can hire the best talent that allows them to step back from the day-to-day responsibilities, that was not the right move for him.

“Only founders really understand the pain points of the business," Chakrabarti says. "Because it’s emotionally tied to you, you actually feel it."

Rather than a micro- or macro-management style, Chakrabarti describes his leadership as meso-management — something in between.

The only difference, Chakrabarti says, is how he manages his board. For that group, he micromanages to ensure that they are doing what's best for his vision for Solugen.

7. Your culture should be polarizing.

Chakrabarti wrapped up his talk by discussing hiring and setting up a company culture at Solugen. The company's atmosphere is not for everyone, he explains.

“If you’re not polarizing some people, it’s not a culture,” Chakrabarti says, encouraging founders to create a culture that's not one size fits all.

He says he was attracted to early employees who got mad at the same things he did — that passion is what makes his team different from others.

Houston tech company to acquire IT infrastructure startup

M&A moves

Hewlett Packard Enterprise has announced its plans to acquire a San Jose, California-based startup.

HPE, which relocated its headquarters to Houston from the Bay Area a couple of years ago, has agreed to acquire OpsRamp, a software-as-a-service company with an IT operations management, or ITOM, platform that can monitor, automate, and manage IT infrastructure, cloud resources, and more.

According to a news release from HPE, the OpsRamp platform will be merged with the HPE GreenLake edge-to-cloud platform, which supports more than 65,000 customers, powers over two million connected devices, and manages more than one exabyte of data with customers worldwide.

The new integrated system "will reduce the operational complexity of multi-vendor and multi-cloud IT environments that are in the public cloud, colocations, and on-premises," per the statement.

“Customers today are managing several different cloud environments, with different IT operational models and tools, which dramatically increases the cost and complexity of digital operations management,” says HPE's CTO Fidelma Russo in the release. “The combination of OpsRamp and HPE will remove these barriers by providing customers with an integrated edge-to-cloud platform that can more effectively manage and transform multi-vendor and multi-cloud IT estates.

"This acquisition advances HPE hybrid cloud leadership and expands the reach of the HPE GreenLake platform into IT Operations Management,” she continues.

HPE's corporate venture arm, Pathfinder, invested in OpsRamp in 2020. The company raised $57.5 million prior to the acquisition. Other investors included Morgan Stanley Expansion Capital and Sapphire Ventures, per TechCrunch.

“The integration of OpsRamp’s hybrid digital operations management solution with the HPE GreenLake platform will provide an unmatched offering for organizations seeking to innovate and thrive in a complex, multi-cloud world. Partners and the channel will also play a pivotal role to advance their as-a-service offerings, as enterprises look for a unified approach to better manage their operations from the edge to the cloud,” says Varma Kunaparaju, CEO of OpsRamp, in the release.

“We look forward to leveraging the scale and reach of HPE’s global go-to-market engine to deliver our unique offering and are excited for this journey ahead as part of HPE.”

3 Houston innovators to know this week

Editor's note: In this week's roundup of Houston innovators to know, I'm introducing you to three local innovators across industries — from space tech to software development — recently making headlines in Houston innovation.


Michael Suffredini, CEO and president of Axiom Space

Axiom's CEO announced a new mission and space suit design. Photo courtesy of Axiom Space

It was a big news week for Axiom Space. The Houston company announced its next commercial space mission with NASA to the International Space Station a day before it unveiled its newly designed spacesuit that will be donned by the astronauts headed to the moon.

"We're carrying on NASA's legacy by designing an advanced spacesuit that will allow astronauts to operate safely and effectively on the Moon," says Michael Suffredini, CEO of Axiom, in a statement. "Axiom Space's Artemis III spacesuit will be ready to meet the complex challenges of the lunar south pole and help grow our understanding of the Moon in order to enable a long-term presence there."

Called the Axiom Extravehicular Mobility Unit, or AxEMU, the prototype was revealed at Space Center Houston’s Moon 2 Mars Festival on March 15. According to Axiom, a full fleet of training spacesuits will be delivered to NASA by late this summer. Read more.

Julie King, president of NB Realty Partners

Houston's access to lab space continues to be a challenge for biotech companies. Photo via Getty Images

In terms of Houston developing as an attractive hub for biotech companies, Julie King says the city still has one major obstacle: Available lab space.

She writes in a guest column for InnovationMap that biotech startups need specialized space that can hold the right equipment. That's not cheap, and it's usually a challenge for newer companies to incur that cost.

"However, with realistic expectations about these challenges, the good news is that once settled into a facility that is a fit, Houston’s emerging biotech companies can thrive and grow," she writes. Read more.

Owen Goode, executive vice president at Zaelot

Houston software development firm Axon is planning its Texas expansion thanks to its recent acquisition. Photo via LinkedIn

Owen Goode is a huge fan of Houston. That's why, when his software design firm, Axon, was acquired in January by Zaelot, led by CEO Jeff Lombard, he made sure the deal would mean growth in the region.

Zaelot is a global software firm with a presence in 14 countries, mostly focused on the United States, Uruguay, and Iceland. With the acquisition of Axon, the combined company is poised to expand in Texas, beginning in Houston, Goode says.

“Together we have a strong suite of offerings across a wide variety of domains including full-stack development, cloud/data engineering, design, staff augmentation, project management, and software architecture. We also have experience in multiple domains, including health care, aviation, defense, finance, and startups,” says Goode. Read more.