Every situation is unique and deserves a one-of-a-kind data management plan, not a one-size-fits-all solution. Graphic by Miguel Tovar/University of Houston

Why do you need a data management plan? It mitigates error, strengthens research integrity and allows your research to be replicated, no small thing given the “replication crisis” the research enterprise has been wrestling with for some time.

Error

There are many horror stories of researchers losing their data. You can just plain lose your laptop or an external hard drive. Sometimes they are confiscated if you are traveling to another country — and you may not get them back. Some errors are more nuanced. For instance, a COVID-19 repository of contact-traced individuals was missing nearly 16,000 results because an Excel spreadsheet cannot exceed roughly 1 million rows.

Do you think a hard drive is the best repository? Keep in mind that 20 percent of hard drives fail within the first four years. Some researchers merely email their data back and forth and feel like it is “secure” in their inbox.

The human and machine error margins are wide. Continually backing up your results, while good practice, can’t ensure that you won’t lose invaluable research material.

Repositories

According to Reid Boehm, Ph.D., Research Data Management Librarian at the University of Houston Libraries, your best bet is to utilize research data repositories. “The systems and the administrators are focused on file integrity and preservation actions to mitigate loss and they often employ specific metadata fields and documentation with the content,” Boehm says of the repositories. “They usually provide a digital object identifier or other unique ID for a persistent record and access point to these data. It’s just so much less time and worry.”

Integrity

Losing data or being hacked can challenge data integrity. Data breaches not only compromise research integrity, they can also be extremely expensive! According to Security Intelligence, the global average cost of a data breach in a 2019 study was $3.92 million, a 1.5 percent increase from the previous year’s study.

Sample size — how large or small a study was — is another example of how data integrity can affect a study. According to Retraction Watch, prestigious journals retract approximately 1,500 articles annually for “sloppy science.” One of the main reasons papers end up being retracted is that the sample size was too small to be a representative group.

Replication

Another metric for measuring data integrity is whether or not the experiment can be replicated. The ability to recreate an experiment is paramount to the scientific enterprise. In a Nature article entitled “1,500 scientists lift the lid on reproducibility,” “73 percent said that they think that at least half of the papers can be trusted, with physicists and chemists generally showing the most confidence.”

However, according to Kelsey Piper at Vox, “an attempt to replicate studies from top journals Nature and Science found that 13 of the 21 results looked at could be reproduced.”

That's so meta

The archivist Jason Scott said, “Metadata is a love note to the future.” Learning how to keep data about data is a critical part of reproducing an experiment.

“While this will always be determined by a combination of project specifics and disciplinary considerations, descriptive metadata should include as much information about the process as possible,” said Boehm. Details of workflows, any standard operating procedures, parameters of measurement, clear definitions of variables, code and software specifications and versions, and many other signifiers ensure the data will be of use to colleagues in the future.

In other words, making data accessible, usable and reproducible is of the utmost importance. You make reproducing experiments that much easier if you are doing a good job of capturing metadata in a consistent way.
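As a minimal sketch, the kinds of details Boehm describes might be captured in a simple record like the one below. The field names and values here are illustrative, not drawn from any particular repository schema:

```python
# Illustrative descriptive-metadata record; the field names are hypothetical,
# not a standard schema such as a specific repository's required fields.
metadata = {
    "title": "Example assay results",
    "creator": "J. Doe",
    "date_collected": "2022-03-15",
    "workflow": "Samples processed per SOP-12; see attached protocol document",
    "variables": {
        "concentration": "analyte concentration in mol/L",
        "temperature": "incubation temperature in degrees C",
    },
    "software": {"name": "AnalysisTool", "version": "2.1.0"},
    "identifier": "doi:10.0000/example",  # persistent ID assigned by the repository
}

# Even a simple completeness check helps keep records consistent across a lab.
required = {"title", "creator", "workflow", "variables", "software", "identifier"}
missing = required - metadata.keys()
print(sorted(missing))  # an empty list means every required field is present
```

Keeping such records alongside each dataset, in a consistent format, is what makes the data legible to a colleague (or to your future self) years later.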

The Big Idea

A data management plan includes storage, curation, archiving and dissemination of research data. Your university’s digital librarian is an invaluable resource. They can answer other tricky questions as well, such as: who does data belong to? And when a postdoctoral researcher in your lab leaves the institution, can they take their data with them? Every situation is unique and deserves a one-of-a-kind data management plan, not a one-size-fits-all solution.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

Let's talk about dark data — what it means and how to navigate it. Graphic by Miguel Tovar/University of Houston

Houston expert: Navigating dark data within research and innovation

houston voices

Is it necessary to share ALL your data? Is transparency a good thing, or does it make researchers “vulnerable,” as author Nathan Schneider suggests in the Chronicle of Higher Education article “Why Researchers Shouldn’t Share All Their Data”?

Dark Data Defined

Dark data is defined as the universe of information an organization collects, processes and stores – oftentimes for compliance reasons. Dark data never makes it to the official publication part of the project. According to the Gartner Glossary, “storing and securing data typically incurs more expense (and sometimes greater risk) than value.”

This topic is reminiscent of the file drawer effect, a phenomenon in which a study’s results influence whether or not it is published. Negative results can be just as important as confirmed hypotheses.

It can be argued that publication bias, the pressure to publish only positive research that supports the PI’s hypothesis, is not good science. In an article in the Indian Journal of Anaesthesia, authors Priscilla Joys Nagarajan et al. wrote: “It is speculated that every significant result in the published world has 19 non-significant counterparts in file drawers.” That’s one definition of dark data.

Total Transparency

But what to do with all your excess information that did not make it to publication, most likely because of various constraints? Should everything, meaning every little tidbit, be readily available to the research community?

Schneider doesn’t think it should be. In his article, he writes that he hides some findings in a paper notebook or behind a password, and he keeps interviews and transcripts offline altogether to protect his sources.

Open-source

Open-source software communities tend to regard total transparency as inherently good. What are the advantages of total transparency? You may make connections between projects that you wouldn’t have otherwise. You can easily reproduce a peer’s experiment. You can even become more meticulous in your note-taking and experimental methods since you know it’s not private information. Similarly, journalists will recognize this thought pattern as the recent, popular call to engage in “open journalism.” Essentially, an author’s entire writing and editing process can be recorded, step by step.

TMI

This trend has led researchers to open-source programs like Jupyter and GitHub. Open-source programs detail every change that occurs along a project’s timeline. Are unorganized, excessive amounts of unpublishable data really what transparency means? Or do they confuse those looking for meaningful research that is meticulously curated?

The Big Idea

And what about the “vulnerability” claim? Sharing every edit and every new direction taken opens a scientist up to scoffers, and even harassment. In industry, total transparency can extend to publishing salaries, which can feel unfair to underrepresented, marginalized populations.

In Model View Culture, Ellen Marie Dash wrote: “Let’s give safety and consent the absolute highest priority, with openness and transparency prioritized explicitly below those. This means digging deep, properly articulating in detail what problems you are trying to solve with openness and transparency, and handling them individually or in smaller groups.”

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

The University of Houston has tips for doing your due diligence when it comes to avoiding unintentional plagiarism. Graphic by Miguel Tovar/University of Houston

Houston expert: How to avoid unintentional plagiarism in your research work

houston voices

Plagiarism is the use of someone else’s words, ideas or visuals as if they were your original work. Unintentional plagiarism results from a disregard for proper scholarly procedure. It’s much easier to commit than one would think, and it has toppled giants in the research enterprise.

From 2007 to 2020, the National Science Foundation made 200 research misconduct findings, 78 percent of which were related to plagiarism. Here are some do’s and don’ts that will help you avoid unintentional plagiarism, a potentially career-killing misstep.

The dos and don'ts

Don’t paraphrase without citing

In a study of 63,700 students, Rutgers University Business School found that 36 percent of undergraduates admitted to “paraphrasing/copying few sentences from Internet source without footnoting it.”

Don’t forget to add the quotation marks

And don’t forget to properly cite your sources at the end of the paper, even if you used in-text or footnote citations to give proper credit to the primary author.

Don’t copy and paste placeholders

If you paste in someone else’s text as a placeholder, you mean to go back and rewrite it in your own words, but you are liable to forget or run out of time. (More on this later.) If you copy and paste from a previously published paper of your own, it’s not research misconduct, but it is considered bad practice if you don’t cite it. This is called self-plagiarism.

Do make sure your hypothesis or subject is your own

Plagiarism of ideas occurs when a researcher appropriates an idea, such as a theory or conclusion — whole or in part — without giving credit to its originator. Acknowledge all sources!

Peer review is supposed to be confidential, and colleagues put their trust in each other during this process, assuming there will be no theft of ideas. Once the paper is published in a peer-reviewed journal, it should be cited.

Do use direct quotes

But quoted material should not make up more than 10 percent of the entire article.

Failure to use your own “voice” or “tone” can also be construed as plagiarism, depending on how unique the author’s voice is. When there is an especially distinctive turn of phrase, use quotation marks and cite (if in doubt).

When paraphrasing, the syntax should be different enough to be considered your own words. This is tricky because you need to understand the primary work in its original language in order to reword it without just moving words around. In other words, no shuffling words!

Do cite facts widely acknowledged to be true (just in case!)

If it’s something that is generally held within your discipline to be true, or it’s a fact that can be easily looked up – like the year a state passed a certain law – there’s no need to cite “Google” or any generic platform, but it’s better to be safe than sorry. Someone reading your work might not have a background in your discipline.

Do run your paper through a plagiarism-detecting tool

Some options are www.turnitin.com or http://www.ithenticate.com.

Sanctions

There are consequences for plagiarizing another’s work. If you’re a faculty member, the sanctions could affect your career. For instance, according to retractionwatch.com, Terry Magnuson, a prominent researcher and university leader, was accused of, and later admitted to, plagiarizing unintentionally.

In an open letter to his university colleagues, Magnuson wrote a startlingly candid statement: “You cannot write a grant spending 30 minutes writing and then shifting to deal with the daily crises and responsibilities of a senior leadership position in the university, only to get back to the grant when you find another 30 minutes free.”

He goes on to say: “I made a mistake in the course of fleshing out some technical details of the proposed methodology. I used pieces of text from two equipment vendor websites and a publicly available online article. I inserted them into my document as placeholders with the intention of reworking the two areas where the techniques —which are routine work in our lab — were discussed. While switching between tasks and coming back to the proposal, I lost track of my editing and failed to rework the text or cite the sources.” Taking responsibility for this oversight, he resigned.

And that brings us to the Big Idea…

The Big Idea

The one thing that trips up even the most seasoned writers is having enough time to properly cite all of one’s sources. Give yourself a few extra days (weeks?) to finish your paper, and have a peer read it over to flag any questionable facts or quotes that might need to be cited more appropriately.

Funding agencies take plagiarism very seriously. For instance, the NSF provides prevention strategies by implementing a pre-submission process, and is also attempting to make plagiarism detection software available.

You also may want to take advantage of resources in your university’s library or writing center. There are also several tools to help you organize your citations; one called RefWorks will keep track of your sources as you write in-text citations or footnotes.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research. It's based on a workshop given by Penny Maher and Laura Gutierrez, senior research compliance specialists at the University of Houston.

There are a few things to remember about altmetrics when tapping into non-traditional methods of metrics reporting. Graphic by Miguel Tovar/University of Houston

University of Houston: How to navigate 'altmetrics' in your innovative research project

Houston voices

Alternative metrics, or “altmetrics,” refers to the use of non-traditional methods for judging a researcher’s reach and impact.

Being published in a peer-reviewed journal is surely a great feat. It’s the typical way professors get their research out there. But the tools established to measure this output might end up giving a skewed impression of an author’s impact in spheres both academic and social.

Traditional metrics

Web of Science and Scopus are the main platforms that researchers rely on for collecting article citations. Web of Science’s indexing goes back to 1900, and Scopus boasts the largest database of abstracts and citations. The caveat with these repositories is that each resource only gives you a rating based on the range and breadth of the journals it indexes. Different journals are recorded in different tools, so you may not be getting a comprehensive metric from either.

Let’s talk about h index

The h index is probably never going away, although it is always being improved upon.

The h index is a single number that tells the story of how often a researcher is cited: a scholar has an h index of h if h of their papers have each been cited at least h times. For instance, if a scholar published six papers, and all six were each cited by at least six other authors, they would have an h index of 6.

This equation doesn’t work out too well for an academic who, say, had one paper that was continuously cited; they would still have an h index of 1. Brené Brown, Ph.D., even with her veritable empire of vulnerability- and shame-related self-help, has an h index of 7, according to Semantic Scholar.
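The definition above can be sketched in a few lines of Python. The citation counts here are invented for illustration:

```python
def h_index(citations):
    """Return the h index: the largest h such that h papers
    each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    # Walk down the ranked list; paper at rank r must have >= r citations.
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Six papers, each cited at least six times, yields an h index of 6.
print(h_index([9, 8, 7, 6, 6, 6]))  # 6
# One heavily cited paper still yields an h index of 1.
print(h_index([1000]))  # 1
```

The second call shows exactly the failure mode described above: a single blockbuster paper, no matter how often cited, can never lift the index above 1.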

On to altmetrics

When a psychology professor goes on a morning show to discuss self-esteem of young Black women, for instance, she is not helping her h index. Her societal impact is huge, however.

“When I use altmetrics to deliver a professor his or her impact report, I seek out nontraditional sources like social media. For instance, I check how many shares, comments or likes they received for their research. Or maybe their work was reported in the news,” said Andrea Malone, Research Visibility and Impact Coordinator at the University of Houston Libraries.

Altmetrics aim to answer the question of how academia accounts for the numerous other ways scholarly work impacts our society. What about performances done in the humanities, exhibitions, gallery shows or novels published by creative writers?

Alternative metrics are especially important for research done in the humanities and arts but can offer social science and hard science practitioners a better sense of their scope as well. With the constant connections we foster in our lives, the bubble of social media and such, there is a niche for everyone.

The equalizer

For some, Twitter or Facebook is where they like to publish or advertise their data or results.

“When altmetrics are employed, the general public finds out about research, and is able to comment, share and like. They can talk about it on Twitter. The impact of the work is outside of academia,” said Malone. She even checks a database to see if any of the professor’s works have been included in syllabi around the country.

Academia.edu is another social network offering a platform for publishing and searching scholarly content. It has a fee for premium access, whereas Google Scholar is free. Its profile numbers are usually high because it can pick up any public data, even a slide from a PowerPoint.

The Big Idea

At the University of Houston, altmetrics are categorized thusly: articles, books and book chapters, data, posters, slides and videos. While one would think there’s no downside to recording all of the many places academic work ends up, there are a few things to remember about altmetrics:

  1. They lack a standard definition. But this is being worked on currently by the NISO Alternative Assessment Metrics Initiative.
  2. Altmetrics data are not normalized. Tell a story with your metrics, but don’t compare between two unlike sources. YouTube and Twitter will deliver different insights about your research, but they can’t be compared as though they measure the same exact thing.
  3. They are time-dependent. Don’t be discouraged if an older paper doesn’t have much to show as far as altmetrics. The newer the research, the more likely it will have a social media footprint, for example.
  4. They have known tracking issues. Altmetrics work best with items that have a Digital Object Identifier (DOI).

So have an untraditional go of it and enlist help from a librarian or researcher to determine where your research is making the biggest societal impact.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

Innovation isn't always the safest field. Here's what to consider within incident reporting. Graphic by Miguel Tovar/University of Houston

University of Houston: Navigating incident reporting in the lab

houston voices

Exploding refrigerator? Chemical splash on the face? These are not just personally devastating lab incidents, they are also expensive.

For instance, a while back, the University of Hawaii faced $115,500 in fines for 15 workplace safety violations after a laboratory explosion in which a postdoctoral researcher lost one of her arms. Beryl Lieff Benderly wrote in Science that the accident “resulted from a static electricity charge that ignited a tank containing a highly flammable, pressurized mixture of hydrogen, oxygen and carbon dioxide.”

Such events are referred to as “incidents,” which the University of California Santa Barbara (UCSB) defines this way: “An incident is an event that results in or causes injury or damage to someone or something, or an event that has the potential to result in or cause injury or damage.”

But when asked which incidents are reportable, the answer is uniform across all research universities: all incidents must be reported.

Incidentally...

There are websites dedicated to laboratory accidents, like this one at UCSB. It lists the two accidents mentioned in this blog’s first sentence. University of Michigan Environment, Health and Safety’s website said, “Being safe at the University of Michigan requires a positive safety culture where we learn from mistakes and near-misses in order to improve and prevent future occurrences. It is vital that you report all ‘incidents’ including near-misses, injuries resulting from your activities, non-compliance with safety and environmental rules, and general unsafe work conditions so that we can learn and grow.” Northwestern University’s website on Research Health and Safety said, “Always report ‘near-misses’ just as you would an incident that causes injury or harm to property.”

Near-missing

You may be asking: what constitutes a “near-miss”? Western Kentucky University, for example, defines it this way: “A laboratory ‘near-miss’ is an unplanned situation which, with minor changes to time or setting, could have easily resulted in damage or injury to person or property. A near-miss is characterized as having little, if any, immediate impact on individuals, processes, or the environment, but provides insight into accidents that could happen.” Laboratory near-misses may involve chemical spills, explosions and bodily injuries that can be treated with first aid.

Form finding

Most universities have a form to fill out if there is an incident that could have led to a severe injury or death. The form asks for a description of the incident and, in some instances, even asks, “Why did it happen?” These forms should be filled out comprehensively and quickly.

OSHA

The Occupational Safety and Health Administration (OSHA) has its own reporting process, apart from what each university requires. OSHA needs specific information when you call. The OSHA website states: “Be prepared to supply: Business name; names of employees affected; location and time of the incident, brief description of the incident; contact person and phone number.”

There are even time limits for reporting: a severe injury that requires in-patient hospitalization, amputation or loss of an eye must be reported within 24 hours, and a fatality must be reported within eight hours.

Given that “losing an eye” is one of just four reasons to contact OSHA, you may wonder: “Are a lot of people blinded in the lab, often?” Also, “Where can I buy safety goggles?”


The big idea

There are many websites which detail lab disasters. Some are cautionary tales, some are avoidable situations. Just be sure to wear your Personal Protective Equipment (PPE) and be safe out there. Or rather, in there.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

Just like with any other career, a work/life balance is critical to excel in either category. Graphic by Miguel Tovar/University of Houston

Houston researchers: Avoid becoming a lab rat with these work-life balance tips

houston voices

You just missed your niece’s birthday, misplaced your debit card and forgot to eat dinner last night after working late in the lab. These are relatively benign examples of collateral damage for a researcher who is overworked. But what about the female researcher who puts off having a family because she is working 80 hours a week? What about the scientist who is injured in an experiment because he worked late alone at the lab and made an error?

One is the loneliest number

Safety is a concern for those who work alone in a lab. Working evenings and weekends is par for the course for most researchers. In a 2013 study in Biological Conservation, the authors analyzed the timing of submissions to the journal from 2004 to 2012. More than one-quarter occurred either at weekends or on weekdays between 7 p.m. and 7 a.m. The weekend submission rate increased 5–6 percent every subsequent year. Work/life balance is difficult to achieve in any profession, but researchers seem to put their lives on hold, more often than not, when that next discovery is just an experiment away.

Some say they prefer to work when they are alone and can concentrate on their findings. Most likely an introvert to begin with, this type of scientist may cite the stillness and quiet of the lab as a peaceful retreat. “The laboratory can be comforting in its isolation and can act as a shelter away from the pressures of life and conflict with friends and family,” writes chemjobber on the Chemical & Engineering News Blog. For a researcher, social distancing may be heaven.

But more mistakes happen when one is working alone. There is the infamous incident of the graduate student who died after working with tert-butyl lithium, which ignites spontaneously in air, in a UCLA lab years ago. Her PI was charged with violating state labor laws for not requiring another person in the lab, protective gear or proper chemical safety training.

On the Oxford University Press’ blog, there is a long list of imperatives for working in a lab, which include: “Never work alone or unsupervised, and never work when you are exhausted or emotionally upset.” Errors can be deadly, so check with your lab safety guidelines and make sure someone is at least “checking in” with you if you must work alone.

Working 9 to 5?

How many hours do you spend in the lab? How many are healthy? According to a 2016 Nature poll of early-career researchers worldwide, 38 percent of respondents reported working more than 60 hours each week — 9 percent of whom claimed more than 80 hours.

Obviously, it is difficult to maintain a work/life balance – healthy relationships, free time for hobbies – if one works 80 hours a week. Some researchers liken getting results in a lab to a gambler’s hot streak. It would seem insane to walk away when one finally, after painstaking labor, long hours and meticulous experimentation, experiences a positive result. But long hours can dull your senses and make having a breakthrough even more difficult.

Meet the new boss

Principal Investigators (PIs) may be to blame, at times, for unrealistic expectations. In his article in Nature Magazine, Chris Woolston says, “A toxic relationship between junior scientist and adviser can quickly turn career prospects sour.” Adds Karen Kelsky, career advisor in Eugene, Oregon, “Many junior researchers who find themselves at odds with their advisers could have avoided trouble with a little preliminary research. For Ph.D. students, it is helpful to find someone who has a history of turning trainees into scientists.”

According to UnDark.org, a non-profit, editorially independent digital magazine exploring the intersection of science and society, “Between September 2016 and May 2017, graduate student organizing committees at six private universities successfully negotiated contracts with their universities. These contract negotiations delivered, among other things, standardized pay rates; annual cost-of-living raises; improved health care, childcare, and dependent-care benefits; and arbitration support in contract disputes.” Organizing with others and stating your concerns may work, if you feel your PI is taking advantage of your work ethic, thereby compromising a healthy work/life balance.

Baby, baby

And one last important issue is that of the female researcher who, like women in other demanding careers, puts having children on hold. “A major issue for female scientists wanting to start a family is the career break–and the gap in their track record–that usually comes with having children,” states Elisabeth Pain in Science Magazine. What’s the solution? If the researcher is planning to return to work after the birth of her child, Pain states: “The impact of a career break will be smallest if women manage to get that paper published before they leave, arrange to attend a conference while on maternity leave and organize their research projects so that it is easy to get back in the swing of things when they return.” That’s a lot of pressure for a new mother. Social media and message boards abound with women commiserating about these stressors. There is no easy answer.

So the question remains: are you a lab rat? Do you hunch over your research statistics, experiments or lab equipment in a constant struggle to get ahead, publish your findings first and “win” at science? You may need to take a breath, relax and re-evaluate. Just like with any other career, a work/life balance is critical to excel in either category.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

Houston doctors recognized among top creative leaders in business

winners

This week, Fast Company announced its 14th annual list of Most Creative People in Business — and two notable Houstonians made the cut.

Dr. Peter Hotez and his fellow dean of the National School of Tropical Medicine at Baylor College of Medicine, Dr. Maria Elena Bottazzi, were named to the list for “open sourcing a COVID-19 Vaccine for the rest of the world.” The list, which recognizes individuals making a cultural impact via bold achievements in their field, is made up of influential leaders in business.

Hotez and Bottazzi are also co-directors of the Texas Children’s Hospital Center for Vaccine Development, one of the most cutting-edge vaccine development centers in the world. Over the past two decades it has acquired an international reputation as a non-profit Product Development Partnership (PDP), advancing vaccines for poverty-related neglected tropical diseases (NTDs) and emerging infectious diseases of pandemic importance. One of their most notable achievements is the development of a vaccine technology leading to CORBEVAX, a traditional, recombinant protein-based COVID-19 vaccine.

"It's an honor to be recognized not only for our team's scientific efforts to develop and test low-cost, effective vaccines for global health, but also for innovation in sustainable financing that goes beyond the traditional pharma business model," says Hotez in a statement.

The technology was created and engineered by Texas Children's Center for Vaccine Development specifically to combat the worldwide problem of vaccine access and availability. Biological E Limited (BE) developed, produced and tested CORBEVAX in India where over 60 million children have been vaccinated so far.

Earlier this year, the doctors were nominated for the 2022 Nobel Peace Prize for their research and development of the vaccine. Its low cost, ease of production and distribution, safety, and acceptance make it well suited for addressing global vaccine inequity.

"We appreciate the recognition of our efforts to begin the long road to 'decolonize' the vaccine development ecosystem and make it more equitable. We hope that CORBEVAX becomes one of a pipeline of new vaccines developed against many neglected and emerging infections that adversely affect global public health," says Bottazzi in the news release from Texas Children's.

Fast Company editors and writers research candidates for the list throughout the year, scouting every business sector, including technology, medicine, engineering, marketing, entertainment, design, and social good. You can see the complete list here.

Samsung sets sights on nearly $200 billion expansion in Texas

chipping in

As it builds a $17 billion chipmaking factory in Taylor, tech giant Samsung is eyeing a long-term strategy in Texas that could lead to a potential investment of close to $200 billion.

Samsung’s plans, first reported by the Austin Business Journal, call for an additional $192.1 billion investment in the Austin area over several decades that would create at least 10,000 new jobs at 11 new chipmaking plants. These facilities would be at the new Taylor site and the company’s existing site in Northeast Austin.

The first of the 11 new plants wouldn’t be completed until 2034, according to the Business Journal.

“Samsung has a history already in the Austin market as an employer of choice, providing high wages, great benefits, and a great working environment. All of this will be on steroids in the not-too-distant future, creating a historic boost to the already booming Austin economy,” John Boyd Jr., a corporate site selection consultant, tells CultureMap.

Samsung’s preliminary plans were revealed in filings with the State of Texas seeking possible financial incentives for the more than $190 billion expansion. The South Korean conglomerate says the filings are part of the company’s long-range planning for U.S. chipmaking facilities.

Given that Samsung’s 11 new plants would be decades in the making, there’s no certainty at this point that any part of the potential $192.1 billion expansion will ever be built.

Last November, Samsung announced it would build a $17 billion chipmaking factory in Taylor to complement its semiconductor operations in Northeast Austin. Construction is underway, with completion set for 2024. Boyd proclaimed last year that the Taylor project will trigger an “economic tsunami” in the quiet Williamson County suburb.

The Taylor facility, which is expected to employ more than 2,000 people, ranks among the largest foreign economic development projects in U.S. history. The impact of a nearly $200 billion cluster of 11 new chipmaking plants would far eclipse the Taylor project.

The Taylor factory will produce advanced chips that power mobile and 5G capabilities, high-performance computing, and artificial intelligence.

------

This article originally ran on CultureMap.