Absolutism has no bearing on the scientific process. Graphic by Miguel Tovar/University of Houston

Science, like politics, can elicit polarizing opinions. But with an ever-expanding body of knowledge — and the especially dizzying flurry of findings during the pandemic — is it fair to say that views on science are becoming more extreme?

Measuring the polarization

“A standard way of measuring polarization in the U.S. is asking Democrats and Republicans how warmly they feel toward members of their own group and members of their outgroup on a feeling thermometer from 0 to 100,” said Jessica Gottlieb, professor at the UH Hobby School of Public Affairs. “The difference in ingroup-outgroup warmth is then considered a measure of polarization. This has been measured by the American National Elections Studies systematically over the past several decades, and indeed the level of affective polarization has been increasing in the U.S.”
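As a sketch, the ingroup-minus-outgroup difference Gottlieb describes can be computed directly from thermometer responses. The field names and sample scores below are hypothetical, not taken from the American National Election Studies data:

```python
def affective_polarization(responses):
    """Mean ingroup warmth minus mean outgroup warmth on a 0-100 feeling thermometer."""
    ingroup_mean = sum(r["ingroup"] for r in responses) / len(responses)
    outgroup_mean = sum(r["outgroup"] for r in responses) / len(responses)
    return ingroup_mean - outgroup_mean

# Two hypothetical respondents: warm toward their own party, cooler toward the other.
sample = [{"ingroup": 80, "outgroup": 30}, {"ingroup": 60, "outgroup": 50}]
print(affective_polarization(sample))  # 30.0
```

Rising polarization, in these terms, simply means that gap has widened over successive surveys.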

“Absolutism is the culprit.”

In an article in Foreign Affairs entitled, “How Extremism Went Mainstream,” the author notes that “the tools that authorities use to combat extremists become less useful when the line between the fringe and the center starts to blur.”

Science has traditionally been one such tool. However, this extremism in politics – where everything is black and white – has made its unfortunate way into academia. John Lienhard is a professor at the University of Houston and host of “Engines of Our Ingenuity,” a national radio program which has been telling stories of how creativity has shaped our culture since 1988. According to Lienhard, extremism — as seen within the scientific enterprise — goes by a different name.

“Absolutism is the culprit – the need on the part of so many of us to know The Right Answer. The absolutists in the world will glom onto whatever vehicle suits them – religion, politics, education, and ultimately, science itself,” said Lienhard. In other words, good scientists amend and revise, while “the absolutist finds the honest practice of science hateful,” he says, “because science is a way of life where everything lies open to question.”

A series of approximations

In an article entitled, “If You Say Science Is Right You’re Wrong,” professor Naomi Oreskes introduces this quote by Nobel Prize–winning physicist Steven Weinberg:

“Even though a scientific theory is in a sense a social consensus, it is unlike any other sort of consensus in that it is culture-free and permanent.”

Well, no. Even a modest familiarity with the history of science offers many examples of matters that scientists thought they had resolved, only to discover that they needed to be reconsidered.

Some familiar examples are Earth as the center of the universe, the absolute nature of time and space, the stability of continents and the cause of infectious disease.

Absolutism in science is dangerous. Good scientists know how important it is to ask probing questions. In his book entitled, Science versus Absolutism: Science Approaches Truth by a Series of Approximations, the chemist T. Swann Harding asks the question: “What are scientific laws?” He goes on to answer:

“Most people appear to regard them as singularly exact and unalterable things … to violate them brings swift retribution. They are unchanging and eternal in character. Yet the so-called laws of science are really rules pieced together by man on a basis of much observation and experiment.”

In the past, so much of science was just plain wrong – until another researcher came around and amended the original belief (think Galileo). How are our modern times any different? There are still many situations where scientific thought has needed to be amended. Even as recently as the COVID crisis, researchers were revising their thoughts about the spread and contagiousness of the disease.

Allowing for dissent

In a Scientific American blog, Matt Nolan writes that “Dissent in Science Is Essential–up to a Point.” In it, he said, “It is the public who pay the price when marginalized science informs policy. History reminds us this is unsafe territory.” However, Lienhard adds that Einstein set limits on the validity of Newton’s laws just as nuclear fission provided an amendment to the conservation of energy law. There is always a new question to formalize where experimentation is being conducted.

Another predicament, referred to as the “file drawer effect,” occurs when a researcher does not get the answer they were expecting and therefore decides not to publish the negative findings. Every answer is meaningful. And sometimes a negative answer — or no answer — is an answer.

Dissent, and perhaps a certain measure of disappointment, is a critical part of scientific inquiry.

The Big Idea

Science can be thought of as the best we know to the degree we understand a given problem at a given place and time. Absolutism has no bearing on the scientific process and in some cases actively obscures and colors that understanding. And that’s not black and white at all; that’s about as gray as it gets.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

If there are fewer grant proposals, does that mean innovation has slowed? UH gets to the bottom of the question. Graphic by Miguel Tovar/University of Houston

University of Houston: What a drop in NSF proposals means for the country's rate of innovation

houston voices

A 17 percent drop over the past decade in proposals to the National Science Foundation may be a mixed blessing.

A consistently rising budget – measured in billions of dollars – is the preferred way to keep the number of funded proposals growing. But a dip in submissions can have a similar effect on the success rate: with a smaller pool of proposals competing for the same money, a larger share gets funded.

In an article for Science Magazine, author Jeffrey Mervis poses the question: Has there been a decline in grant-worthy ideas? In NSF’s biology sector, Mervis notes that “demand has tumbled by 50 percent over the decade and the chances of winning a grant have doubled, from 18 percent in 2011 to 36 percent in 2020.” NSF’s leadership suggests two possible reasons for this phenomenon.
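The arithmetic behind that doubling is simple: the success rate is awards divided by proposals, so if the number of awards stays roughly constant while submissions are cut in half, the rate doubles. A minimal sketch, using a hypothetical award count rather than actual NSF figures:

```python
def success_rate(awards, proposals):
    """Fraction of submitted proposals that get funded."""
    return awards / proposals

# Hypothetical: 900 awards per year, while the proposal pool halves.
rate_2011 = success_rate(900, 5000)  # 0.18, i.e. 18 percent
rate_2020 = success_rate(900, 2500)  # 0.36, i.e. 36 percent
print(rate_2011, rate_2020)
```

The point is that a higher funding rate alone doesn't tell you whether more good science is being funded; it may just mean fewer people asked.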

Eliminating fixed deadlines

“Dear Colleague” letters went out to numerous directorates within the NSF notifying PIs that fixed deadlines for small projects ($500,000 and less) would be taken out of the equation. For instance, the Directorate for Computer and Information Science and Engineering’s letter read: “in order to allow principal investigators (PIs) more flexibility and to better facilitate interdisciplinary research across disciplines” deadlines would be eliminated. The letter goes on to state that by eliminating fixed deadlines, PIs will be free to think more creatively and collaboratively – without the added stress of a deadline.

Wouldn’t less stress mean more applications? This doesn’t seem to be the case. In one instance, according to another article in Science, proposals dropped when the program ceased annual deadlines and replaced them with rolling deadlines.

Reducing stress for grant reviewers

That article goes on to say that these changes alleviate the strain on the grant reviewers without lowering standards. James Olds, assistant director of the Directorate for Biological Sciences, anticipated that the NSF program managers would get somewhat of a break, and that the new policy would relieve university administrators who process the applications from being overwhelmed.

Other factors at play

“It is highly unlikely there was one specific reason for the decrease,” said David Schultz, assistant vice president for Sponsored Projects in the Office of Contracts and Grants at the University of Houston, “but rather multiple factors contributing over time. One potential cause is that many major research institutions are diversifying their funding sources away from NSF and into other federal agencies more aligned with their strategic areas of research interest, such as NIH, DOD, and DOE. The NIH has seen an 11 percent increase in proposals over the same period, from 49,592 in 2011 to 55,038 in 2020.”

Tenure

“Another component is the documented decrease in the number of tenured faculty across the nation. Generally tenured faculty are more research-focused, as their ability to obtain externally funded research is a major criterion for promotion and tenure,” said Schultz. “While this may lead to fewer proposals, it does encourage new tenure track faculty to focus more efforts on the higher likelihood of being awarded an NSF grant.”

The Big Idea

Some people work better and more efficiently when presented with a deadline. Could that be the reason fewer proposals are being turned in? In his article, Mervis deliberates over whether the declining number of proposals means that the nation is innovating more slowly than before. But how could that be?

The National Science Board, NSF’s presidentially appointed oversight committee, is trying to get to the bottom of the issue so as to mitigate it. Olds stands by the decision to remove deadlines, pointing out that it should be the strength of the proposal not the threat of a deadline which motivates the research project.

Schultz sees a silver lining. “With fewer proposals being submitted to the NSF, the shift creates an opportunity for smaller, emerging universities to increase their proposal submission and success rates.”

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

Every situation is unique and deserves a one-of-a-kind data management plan, not a one-size-fits-all solution. Graphic by Miguel Tovar/University of Houston

Houston research: Why you need a data management plan

Houston voices

Why do you need a data management plan? It mitigates error, increases research integrity and allows your research to be replicated – despite the “replication crisis” that the research enterprise has been wrestling with for some time.

Error

There are many horror stories of researchers losing their data. You can just plain lose your laptop or an external hard drive. Sometimes they are confiscated if you are traveling to another country — and you may not get them back. Some errors are more nuanced. For instance, a COVID-19 repository of contact-traced individuals was missing nearly 16,000 results because the Excel file format in use capped the number of rows per worksheet.

Do you think a hard drive is the best repository? Keep in mind that 20 percent of hard drives fail within the first four years. Some researchers merely email their data back and forth and feel like it is “secure” in their inbox.

The human and machine error margins are wide. Continually backing up your results, while good practice, can’t ensure that you won’t lose invaluable research material.

Repositories

According to Reid Boehm, Ph.D., Research Data Management Librarian at the University of Houston Libraries, your best bet is to utilize research data repositories. “The systems and the administrators are focused on file integrity and preservation actions to mitigate loss and they often employ specific metadata fields and documentation with the content,” Boehm says of the repositories. “They usually provide a digital object identifier or other unique ID for a persistent record and access point to these data. It’s just so much less time and worry.”

Integrity

Losing data or being hacked can challenge data integrity. Data breaches do not only compromise research integrity, they can also be extremely expensive! According to Security Intelligence, the global average cost of a data breach in a 2019 study was $3.92 million. That is a 1.5 percent increase from the previous year’s study.

Sample size — how large or small a study was — is another example of how data integrity can affect a study. Approximately 1,500 articles are retracted annually from journals, including prestigious ones, for “sloppy science,” according to Retraction Watch, which tracks retractions. One of the main reasons papers end up being retracted is that the sample size was too small to be a representative group.

Replication

Another metric for measuring data integrity is whether or not the experiment can be replicated. The ability to recreate an experiment is paramount to the scientific enterprise. In a Nature article entitled “1,500 scientists lift the lid on reproducibility,” the survey found that “73 percent said that they think that at least half of the papers can be trusted, with physicists and chemists generally showing the most confidence.”

However, according to Kelsey Piper at Vox, “an attempt to replicate studies from top journals Nature and Science found that 13 of the 21 results looked at could be reproduced.”

That's so meta

The archivist Jason Scott said, “Metadata is a love note to the future.” Learning how to keep data about data is a critical part of reproducing an experiment.

“While this will always be determined by a combination of project specifics and disciplinary considerations, descriptive metadata should include as much information about the process as possible,” said Boehm. Details of workflows, any standard operating procedures and parameters of measurement, clear definitions of variables, code and software specifications and versions, and many other signifiers ensure the data will be of use to colleagues in the future.

In other words, making data accessible, useable and reproducible is of the utmost importance. You make reproducing experiments that much easier if you are doing a good job of capturing metadata in a consistent way.
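As an illustration of what capturing metadata in a consistent way can look like, here is a minimal descriptive record for a dataset. The field names and values are hypothetical, not drawn from a formal standard such as DataCite or Dublin Core:

```python
import json

# Illustrative descriptive-metadata record for a dataset; everything here
# is a made-up example of the kinds of fields Boehm describes.
record = {
    "title": "Contact-tracing survey, wave 2",
    "variables": {
        "age": "respondent age in whole years",
        "contact_count": "close contacts reported in the prior 7 days",
    },
    "workflow": "cleaned with clean.py v1.2; exclusion criteria in SOP-004",
    "software": {"python": "3.11", "pandas": "2.1"},
    "identifier": "doi:10.0000/example",  # persistent ID assigned by a repository
}

print(json.dumps(record, indent=2))
```

A record like this, stored alongside the data in a repository, is exactly what lets a colleague rerun the analysis years later.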

The Big Idea

A data management plan includes storage, curation, archiving and dissemination of research data. Your university’s digital librarian is an invaluable resource. They can answer other tricky questions as well: such as, who does data belong to? And, when a postdoctoral researcher in your lab leaves the institution, can they take their data with them? Every situation is unique and deserves a one-of-a-kind data management plan, not a one-size-fits-all solution.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

Let's talk about dark data — what it means and how to navigate it. Graphic by Miguel Tovar/University of Houston

Houston expert: Navigating dark data within research and innovation

houston voices

Is it necessary to share ALL your data? Is transparency a good thing, or does it make researchers “vulnerable,” as author Nathan Schneider suggests in the Chronicle of Higher Education article, “Why Researchers Shouldn’t Share All Their Data”?

Dark Data Defined

Dark data is defined as the universe of information an organization collects, processes and stores – oftentimes for compliance reasons. Dark data never makes it to the official publication part of the project. According to the Gartner Glossary, “storing and securing data typically incurs more expense (and sometimes greater risk) than value.”

This topic is reminiscent of the file drawer effect, a phenomenon in which a study’s results influence whether or not it gets published. Negative results can be just as important as positive ones.

Publication bias – the pressure to publish only positive research that supports the PI’s hypothesis – is, it can be argued, not good science. In an article in the Indian Journal of Anaesthesia, Priscilla Joys Nagarajan et al. write: “It is speculated that every significant result in the published world has 19 non-significant counterparts in file drawers.” That’s one definition of dark data.

Total Transparency

But what to do with all your excess information that did not make it to publication, most likely because of various constraints? Should everything, meaning every little tidbit, be readily available to the research community?

Schneider doesn’t think it should be. In his article, he writes that he hides some findings in a paper notebook or behind a password, and he keeps interviews and transcripts offline altogether to protect his sources.

Open-source

Open-source software communities tend to regard total transparency as inherently good. What are the advantages of total transparency? You may make connections between projects that you wouldn’t have otherwise. You can easily reproduce a peer’s experiment. You can even become more meticulous in your note-taking and experimental methods since you know it’s not private information. Similarly, journalists will recognize this thought pattern as the recent, popular call to engage in “open journalism.” Essentially, an author’s entire writing and editing process can be recorded, step by step.

TMI

This trend has led researchers to open-source programs like Jupyter and GitHub. Open-source programs detail every change that occurs along a project’s timeline. Is unorganized, excessive amounts of unpublishable data really what transparency means? Or does it confuse those looking for meaningful research that is meticulously curated?

The Big Idea

And what about the “vulnerability” claim? Sharing every edit and every new direction taken can open a scientist up to scoffers and even harassment. Radical transparency in industry can even involve publishing salaries, which can feel unfair to underrepresented, marginalized populations.

In Model View Culture, Ellen Marie Dash wrote: “Let’s give safety and consent the absolute highest priority, with openness and transparency prioritized explicitly below those. This means digging deep, properly articulating in detail what problems you are trying to solve with openness and transparency, and handling them individually or in smaller groups.”

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

The University of Houston has tips for doing your due diligence when it comes to avoiding unintentional plagiarism. Graphic by Miguel Tovar/University of Houston

Houston expert: How to avoid unintentional plagiarism in your research work

houston voices

Plagiarism is the use of someone else’s words, ideas, or visuals as if they were your original work. Unintentional plagiarism is plagiarism that results from the disregard for proper scholarly procedures. It’s much easier to commit than one would think, and it has toppled giants in the research enterprise.

From 2007 to 2020, the National Science Foundation made 200 research misconduct findings, of which 78 percent were related to plagiarism. Here are some dos and don’ts that will help you avoid unintentional plagiarism, a potentially career-killing misstep.

The dos and don'ts

Don’t paraphrase without citing

In a study of 63,700 students, Rutgers University Business School found that 36 percent of undergraduates admit to “paraphrasing/copying few sentences from Internet source without footnoting it.”

Don’t forget to add the quotation marks

And don’t forget to properly cite your sources at the end of the paper, even if you have already given the primary author credit with in-text or footnote citations.

Don’t copy and paste placeholders

When you copy and paste someone else’s text as a placeholder, you mean to go back and rewrite it in your own words, but you are liable to forget or run out of time. (More on this later.) If you copy and paste from a previously published paper of your own, it’s not research misconduct, but it is considered bad practice if you don’t cite it. This is called self-plagiarism.

Do make sure your hypothesis or subject is your own

Plagiarism of ideas occurs when a researcher appropriates an idea, such as a theory or conclusion — whole or in part — without giving credit to its originator. Acknowledge all sources!

Peer review is supposed to be confidential, and colleagues put their trust in each other during this process, assuming there will be no theft of ideas. Once the paper is published in a peer-reviewed journal, it should be cited.

Do use direct quotes

But quoted material should not make up more than 10 percent of the entire article.

Failure to use your own “voice” or “tone” is also considered plagiarism, or could be construed as plagiarizing, depending on how distinctive the author’s voice is. When in doubt about an especially distinctive turn of phrase, use quotation marks and cite.

When paraphrasing, the syntax should be different enough to be considered your own words. This is tricky because you need to understand the primary work in its original language in order to reword it without just moving words around. In other words, no shuffling words!

Do cite facts widely acknowledged to be true (just in case!)

If it’s something that is generally held within your discipline to be true, or it’s a fact that can be easily looked up – like the year a state passed a certain law – there’s no need to cite “Google” or any generic platform, but it’s better to be safe than sorry. Someone reading your work might not have a background in your discipline.

Do run your paper through a plagiarism-detecting tool

Some options are www.turnitin.com or http://www.ithenticate.com.

Sanctions

There are consequences for plagiarizing another’s work. If you’re a faculty member, the sanctions could affect your career. For instance, according to retractionwatch.com, a prominent researcher and university leader was recently found to have engaged in misconduct: Terry Magnuson was accused of, and later admitted to, unintentional plagiarism.

In an open letter to his university colleagues, Magnuson wrote a startlingly candid statement: “You cannot write a grant spending 30 minutes writing and then shifting to deal with the daily crises and responsibilities of a senior leadership position in the university, only to get back to the grant when you find another 30 minutes free.”

He goes on to say: “I made a mistake in the course of fleshing out some technical details of the proposed methodology. I used pieces of text from two equipment vendor websites and a publicly available online article. I inserted them into my document as placeholders with the intention of reworking the two areas where the techniques —which are routine work in our lab — were discussed. While switching between tasks and coming back to the proposal, I lost track of my editing and failed to rework the text or cite the sources.” Taking responsibility for this oversight, he resigned.

And that brings us to the Big Idea…

The Big Idea

The one thing that trips up even the most seasoned writers is having enough time to properly cite all one’s sources. Give yourself a few extra days (weeks?) to finish your paper, and have a peer read it over to flag any questionable facts or quotes that might need to be cited more appropriately.

Funding agencies take plagiarism very seriously. For instance, the NSF provides prevention strategies by implementing a pre-submission process, and is also attempting to make plagiarism detection software available.

You also may want to take advantage of resources in your university’s library or writing center. There are also several tools to help you organize your citations; one called RefWorks will keep track of your sources as you write in-text citations or footnotes.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research. It's based on a workshop given by Penny Maher and Laura Gutierrez, senior research compliance specialists at the University of Houston.

There are a few things to remember about altmetrics when tapping into non-traditional methods of metrics reporting. Graphic by Miguel Tovar/University of Houston

University of Houston: How to navigate 'altmetrics' in your innovative research project

Houston voices

Alternative metrics, or “altmetrics,” refers to the use of non-traditional methods for judging a researcher’s reach and impact.

Being published in a peer-reviewed journal is surely a great feat. It’s the typical way professors get their research out there. But the tools established to measure this output might end up giving a skewed impression of an author’s impact in spheres both academic and social.

Traditional metrics

Web of Science and Scopus are the main platforms that researchers rely on for collecting article citations. Web of Science’s indexing goes back to 1900, and Scopus boasts the largest abstract and citation database. The caveat with these repositories is that each resource only gives you a rating based on the range and breadth of journals it indexes. Different journals are recorded in different tools, so you may not be getting a comprehensive metric from either.

Let’s talk about h index

The h index is probably never going away, although it is always being improved upon.

The h index tells the story of how often a researcher is cited: a scholar has an h index of h when h of their papers have each been cited at least h times. For instance, if a scholar published six papers, and all six were each cited by at least six other authors, they would have an h index of 6.

This metric doesn’t work out too well for an academic who, say, had one paper that was continuously cited – they would still have an h index of 1. Brené Brown, Ph.D., even with her veritable empire of vulnerability- and shame-related self-help, has an h index of 7 according to Semantic Scholar.
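As a sketch, the h index is straightforward to compute from a list of per-paper citation counts:

```python
def h_index(citations):
    """Largest h such that at least h papers have h or more citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still clears the threshold
        else:
            break
    return h

print(h_index([6, 6, 6, 6, 6, 6]))  # 6: six papers, each cited at least six times
print(h_index([1000]))              # 1: one heavily cited paper still yields h = 1
```

This is why a single blockbuster paper does little for the index: no matter how often it is cited, only one paper clears the threshold.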

On to altmetrics

When a psychology professor goes on a morning show to discuss the self-esteem of young Black women, for instance, she is not helping her h index. Her societal impact is huge, however.

“When I use altmetrics to deliver a professor his or her impact report, I seek out nontraditional sources like social media. For instance, I check how many shares, comments or likes they received for their research. Or maybe their work was reported in the news,” said Andrea Malone, Research Visibility and Impact Coordinator at the University of Houston Libraries.

Altmetrics aim to answer the question of how academia accounts for the numerous other ways scholarly work impacts our society. What about performances done in the humanities, exhibitions, gallery shows or novels published by creative writers?

Alternative metrics are especially important for research done in the humanities and arts but can offer social science and hard science practitioners a better sense of their scope as well. With the constant connections we foster in our lives, the bubble of social media and such, there is a niche for everyone.

The equalizer

For some, Twitter or Facebook is where they like to publish or advertise their data or results.

“When altmetrics are employed, the general public finds out about research, and is able to comment, share and like. They can talk about it on Twitter. The impact of the work is outside of academia,” said Malone. She even checks a database to see if any of the professor’s works have been included in syllabi around the country.

Academia.edu is another social network offering a platform for publishing and searching scholarly content. It has a fee for premium access, whereas Google Scholar is free. Google Scholar’s profile numbers are usually high because it can pick up any public data – even a slide from a PowerPoint.

The Big Idea

At the University of Houston, altmetrics are categorized thusly: articles, books and book chapters, data, posters, slides and videos. While one would think there’s no downside to recording all of the many places academic work ends up, there are a few things to remember about altmetrics:

  1. They lack a standard definition. But this is being worked on currently by the NISO Alternative Assessment Metrics Initiative.
  2. Altmetrics data are not normalized. Tell a story with your metrics, but don’t compare between two unlike sources. YouTube and Twitter will deliver different insights about your research, but they can’t be compared as though they measure the same exact thing.
  3. They are time-dependent. Don’t be discouraged if an older paper doesn’t have much to show as far as altmetrics. The newer the research, the more likely it will have a social media footprint, for example.
  4. They have known tracking issues. Altmetrics work best with items that have a Digital Object Identifier (DOI).

So have an untraditional go of it and enlist help from a librarian or researcher to determine where your research is making the biggest societal impact.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.


Houston innovator receives $5M to establish new center that explores crystallization process

crystal clear initiative

A new hub at the University of Houston is being established with a crystal-clear mission — and fresh funding.

Thanks to funding from Houston-based organization The Welch Foundation, the University of Houston will be home to the Welch Center for Advanced Bioactive Materials Crystallization. The nonprofit doled out its inaugural $5 million Catalyst for Discovery Program Grant to the new initiative led by Jeffrey Rimer, Abraham E. Dukler Professor of Chemical Engineering, who is known internationally for his work with crystals that help treat malaria and kidney stones.

“Knowledge gaps in the nascent and rapidly developing field of nonclassical crystallization present a wide range of obstacles to design crystalline materials for applications that benefit humankind, spanning from medicine to energy and the environment,” says Rimer in a news release. “Success calls for a paradigm shift in the understanding of crystal nucleation mechanisms and structure selection that will be addressed in this center.”

The Welch Foundation, which was founded in 1954, has granted over $1.1 billion to scientists in Texas. This new grant program targets researchers focused on fundamental chemical solutions. Earlier this year, the organization announced nearly $28 million in grants to Texas institutions.

"Support from the Welch Foundation has led to important advances in the field of chemistry, not only within Texas, but also throughout the United States and the world as a whole,” says Randall Lee, Cullen Distinguished University Chair and professor of chemistry, in the release. “These advances extend beyond scientific discoveries and into the realm of education, where support from the Welch Foundation has played a significant role in building the technological workforce needed to solve ongoing and emerging problems in energy and health care.”

Rimer and Lee are joined by the following researchers on the newly announced center's team:

  • Peter Vekilov, Moores Professor, chemical and biomolecular engineering
  • Alamgir Karim, Dow Chair and Welch Foundation Professor, chemical and biomolecular engineering
  • Jeremy Palmer, Ernest J. and Barbara M. Henley Associate Professor, chemical and biomolecular engineering
  • Gül Zerze, chemical and biomolecular engineering
  • Francisco Robles Hernandez, professor of engineering technology

The University of Houston also received another grant from the Welch Foundation. Megan Robertson, UH professor of chemical engineering, received $4 million for her work developing chemical processes to transform plastic waste into useful materials.

“For the University of Houston to be recognized with two highly-competitive Welch Foundation Catalyst Grants underscores the exceptional talent and dedication of our researchers and their commitment to making meaningful contributions to society through discovery,” Diane Chase, UH senior vice president for academic affairs and provost, says in the release.

University opens its newest, largest campus research facility in Houston

research @ rice

As the academic year officially kicks off, professors have started moving in and Rice University has opened its largest core campus research facility, The Ralph S. O’Connor Building for Engineering and Science.

The 250,000-square-foot building is the new home for four key research areas at Rice: advanced materials, quantum science and computing, urban research and innovation, and the energy transition. The university aims for the space to foster collaboration and innovation between the disciplines.

"To me it really speaks to where Rice wants to go as we grow our research endeavors on campus," Michael Wong, Chair of the Department of Chemical and Biomolecular Engineering, whose lab is located in the new facility, said in a video from Rice. "It has to be a mix of engineering and science to do great things. We don’t want to do good things, we want to do great things. And this building will allow us to do that."

At $152 million, the state-of-the-art facility features five floors of labs, classrooms and seminar rooms. Common spaces and a cafe encourage communication between departments, and the top level is home to a reception suite and outdoor terrace with views of the Houston skyline.

It replaces the 1940s-era Abercrombie Engineering Laboratory on campus, which was demolished in 2021 to make way for the new facility. The iconic sculpture "Energy" by Rice alumnus William McVey, which was part of the original building, was preserved with plans to incorporate it into the new space.

The new building will be dedicated to its namesake, Ralph O'Connor, on Sept. 14 at 3 p.m. in Rice's engineering quad. O'Connor, a Johns Hopkins University graduate, became a fan of Rice when he moved to Houston to work in the energy industry in the 1950s.

The former president and CEO of the Highland Oil Company and founder of Ralph S. O’Connor & Associates left the university $57 million from his estate after he died in 2018. The gift was the largest donation from an estate in Rice's history and brought his donations to the university, including those to many buildings on campus and endowments and scholarships, to a total of $85 million.

“How fitting that this building will be named after Ralph O’Connor,” Rice President Reginald DesRoches said in a statement last summer. “He was a man who always looked to the future, and the future is what this new engineering and science building is all about. Discoveries made within those walls could transform the world. Anybody who knew Ralph O’Connor knows he would have loved that.”

The dedication event will be open to the public. It will feature remarks from DesRoches, as well as Rice Provost Amy Dittmar, Dean of the Wiess School of Natural Sciences Thomas Killian, Chair of the Rice Board of Trustees Robert Ladd and Dean of the George R. Brown School of Engineering Luay Nakhleh. A reception and tours of the new building will follow.

New certificate course trains a ready workforce as biotech companies in Pearland take off

Top of the Class

Biotech companies in Pearland are thriving, with big names such as Lonza, Millar Inc., and Abbott all experiencing tremendous growth in recent years.

The only challenge to this success is the increased demand for workers, which calls for a faster workforce pipeline. Fortunately, the Pearland Economic Development Corporation (PEDC) has a solution.

PEDC has partnered with Alvin Community College (ACC) and Lonza to create a two-level Biotechnology Certificate Course designed to address the need for a better-equipped entry-level workforce.

This initiative offers two options to quickly train individuals for employment in the biotech field: Level 1, a six-week commitment for Biotech: Material Handler; and Level 2, a twelve-week commitment for Biotech: Lab Technician. Each level consists of 64 contact hours, with lectures delivered online and labs and assessments conducted on-site.

Alvin Community College is offering the course, which commenced on August 21, under its Continuing Education and Workforce Development (CEWD) department. This department provides programs that incorporate current and new technical courses, training partnerships with businesses and industries, and other opportunities for individuals to acquire and upgrade skills or pursue personal enrichment.

Before this initiative, the region's two- or four-year programs were only graduating a dozen or so individuals. Early discussions focused on how to expedite workforce development through a local community college's certificate program. Alvin Community College was prepared to respond to the local workforce's needs.

PEDC played a pivotal role in establishing an advisory committee of industry partners responsible for vetting the Biotechnology Certificate Course curriculum. Those partners included the University of Houston Clear Lake (UHCL) at Pearland, Lonza, Millar Inc., Merit Medical, and the nonprofit organization BioHouston.

These partners are invaluable as plans continue to expand these certification programs.

Given the ever-increasing demand for a biotechnology workforce in the Pearland area, the future wish list includes expanding the certification program to other education partners.

For more information about the Biotechnology Certificate Program at Alvin Community College, visit this link.