The Texas bureau of the NYSE will open in 2026. Photo by Tomas Eidsvold on Unsplash

A location has been chosen for NYSE Texas, the new Dallas-based offshoot of the New York Stock Exchange.

According to a release, the NYSE Texas has leased 28,000 square feet of space at Old Parkland, the hospital-turned-office campus at 3819 Maple Ave. in Oak Lawn, where it will operate as a fully electronic equities exchange headquartered in Dallas. The property is owned by Dallas billionaire Harlan Crow, who acquired it in 2006.

The NYSE is part of Intercontinental Exchange, Inc., a global provider of technology and data. NYSE Texas was previously NYSE Chicago, which will close once the Texas bureau debuts.

The NYSE has also named a president for the Texas branch: Bryan Daniel, former chairman of the Texas Workforce Commission. In his new role, Daniel will report to NYSE Group President Lynn Martin.

Relocating from Chicago to Texas was a response to Texas' pro-business profile, Martin says in a statement.

“As the state with the largest number of NYSE listings, representing over $3.7 trillion in market value for our community, Texas is a market leader in fostering a pro-business atmosphere,” Martin says. “We are delighted to expand our presence in the Lone Star State, which plays a key role in driving our U.S. economy forward.”

The move comes five months after the Texas Stock Exchange — AKA TXSE — announced plans to launch in Dallas and begin trading in 2026, pending approval from the U.S. Securities and Exchange Commission. The Texas Stock Exchange is backed by financial giants such as BlackRock, Citadel Securities, and Charles Schwab.

The NYSE expects the Texas location to open in 2026. It will operate electronically, with stocks trading across multiple venues regardless of where they are first listed, according to the release.

---

This article originally appeared on CultureMap.com.

There's no crystal ball, but this researcher from Rice University is trying to see if some metrics work for economic forecasting. Photo via Getty Images

Houston researcher tries to crack the code on the Fed's data to determine economic outlook

houston voices

Research by Rice Business Professor K. Ramesh shows that the Fed appears to harvest qualitative information from the accounting disclosures that all public companies must file with the Securities and Exchange Commission.

These SEC filings are typically used by creditors, investors and others to make firm-level investing and financing decisions, and while they include business leaders’ sense of economic trends, they were never intended to guide macro-level policy decisions. But in a recent paper (“Externalities of Accounting Disclosures: Evidence from the Federal Reserve”), Ramesh and his colleagues provide persuasive evidence that the Fed nonetheless uses the qualitative information in SEC filings to help forecast the growth of macroeconomic variables like GDP and unemployment.

According to Ramesh, the study was made possible thanks to a decision the SEC made several years ago. The commission stores the reports submitted by public companies in an online database called EDGAR and records the IP address of any party that accesses them. More than a decade ago, the SEC began making partially anonymized forms of those IP addresses available to the public. But researchers eventually figured out how to deanonymize the addresses, which is precisely what Ramesh and his colleagues did in this study.

"We were able to reverse engineer and identify those IP addresses that belonged to Federal Reserve staff," Ramesh says.

The team ultimately assembled a data set containing more than 169,000 filings accessed by Fed staff between 2005 and 2015. They quickly realized that the Fed was interested only in filings submitted by a select group of industry leaders and financial institutions.

But if Ramesh and his colleagues now had a better idea of precisely which bellwether firms the Fed focused on, they still had no way of knowing exactly what Fed staffers had gleaned from the material they accessed. So the team decided to employ a measure called "tone" that captures the overall sentiment of a piece of text – whether positive, negative, or neutral.

Building on previous research that had identified a set of words associated with negatively toned financial reports, Ramesh and his colleagues examined the tone of all the SEC filings accessed by Fed staff between one meeting of the Federal Open Market Committee (FOMC) and the next. The FOMC sets interest rates and guides monetary policy, and its meetings provide an opportunity for Fed officials to discuss growth forecasts and announce policy decisions.
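The tone measure described above can be sketched in a few lines: count how many of a filing's words appear on a negative-word list and normalize by document length. The three-word list below is purely illustrative; the study relied on word lists established by prior research, not these tokens.

```python
import re

NEGATIVE_WORDS = {"loss", "decline", "impairment"}  # illustrative only

def tone_score(text: str) -> float:
    """Fraction of words that are negatively toned (0.0 = neutral)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    negative = sum(1 for w in words if w in NEGATIVE_WORDS)
    return negative / len(words)

filing = "Revenue decline and goodwill impairment drove a quarterly loss."
print(round(tone_score(filing), 2))  # 3 of 9 words are negative -> 0.33
```

Aggregating such scores across all filings a Fed staffer accessed between two FOMC meetings yields the kind of sentiment signal the study compares against the Fed's forecasts.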

The researchers then examined the Fed's growth forecasts to see if there was a relationship between the tone of the documents that Fed staff examined in the period between FOMC meetings and the forecasts they produced in advance of those meetings.

The team found close correlations between the tone of the reports accessed by the Fed and the agency’s forecasts of GDP, unemployment, housing starts and industrial production. The more negative the filings accessed prior to an FOMC meeting, for example, the gloomier the GDP forecast; the more positive the filings, the brighter the unemployment forecast.

Ramesh and his colleagues also compared the Fed's forecasts with those of the Survey of Professional Forecasters (SPF), whose respondents span academia and industry. Intriguingly, the researchers found that while the errors in the SPF's forecasts could be attributed to the absence of the tonal information culled from the SEC filings, the errors in the Fed’s forecasts could not. This suggests both that the Fed was collecting qualitative information that the SPF was not — and that the agency was making remarkably efficient use of it.

"They weren’t leaving anything on the table," Ramesh says.

Having solved one mystery, Ramesh would like to focus on another: How does the Fed identify bellwether firms in the first place?

Unfortunately, the SEC no longer makes IP address data publicly available, which means that Ramesh and his colleagues can no longer study which companies the Fed is most interested in. Nonetheless, Ramesh hopes to use the data they have already collected to build a model that can accurately predict which firms the Fed is most likely to follow. That would allow the team to continue studying the same companies that the Fed does, and, he says, “maybe come up with a way to track those firms in order to understand how the economy is going to move.”

------

This article originally ran on Rice Business Wisdom and was based on research from K. Ramesh, the Herbert S. Autrey Professor of Accounting at the Jones Graduate School of Business at Rice University.

Equity options can act as an alternative to credit default swaps for detecting a company’s credit risk. Photo via Getty Images

Rice research explains a new way to measure default risk for investors

houston voices

Up until the 2007-2009 financial crisis, credit default swaps (CDS) were a predominant method for predicting the probability of corporate default. CDS function like insurance for loan assets — if an asset defaults, the bank that purchased the CDS recoups its loss. Higher-risk assets usually carry higher premiums, and in this way the price of a CDS indicates the probability of default.
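The link between a CDS premium and default probability can be made concrete with the simple one-period "credit triangle" approximation, spread ≈ default probability × (1 − recovery rate). This is a textbook simplification, not a formula from the research, and the numbers below are illustrative.

```python
# Back-of-the-envelope CDS math: invert spread = p * (1 - R)
# to recover an implied default probability p.

def implied_default_probability(spread_bps: float, recovery_rate: float) -> float:
    """Implied annual default probability from a CDS spread in basis points."""
    spread = spread_bps / 10_000  # basis points -> decimal
    return spread / (1.0 - recovery_rate)

# A 200 bps annual premium with 60% expected recovery implies ~5% default risk.
p = implied_default_probability(200, 0.60)
print(f"{p:.1%}")  # 5.0%
```

The same inversion is why a collapsing CDS market leaves a void: without traded spreads, there is no market price to plug into the left-hand side.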

When the housing market crashed in 2007, the CDS market crashed along with it when banks had to pay out more than they had expected. The CDS market is not expected to ever return to its previous high, leaving a void in market-driven estimates for determining an asset’s default probability.

To fill that void, a team of researchers including Rice Business Professor Robert Dittmar created an alternative method for measuring default risk: equity options data. The team found that equity options not only correlate with CDS data in terms of accurate prediction of default but also provide additional insights on what types of assets are more likely to default, and when they will default.

There are two types of options: a call option, which is essentially a bet that a stock’s price will be higher than a contracted value (the strike price), and a put option, which is a bet that a stock’s price will be less than a contracted value.

A put is often viewed as an insurance contract — if you hold a stock, but also a put option on it, you limit your loss on the stock if the stock price falls.
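The payoff logic in the two paragraphs above is simple enough to state directly. The prices and strike below are made-up illustrations:

```python
# Option payoffs at expiration, as described above.

def call_payoff(stock_price: float, strike: float) -> float:
    """A call pays off when the stock ends above the strike."""
    return max(stock_price - strike, 0.0)

def put_payoff(stock_price: float, strike: float) -> float:
    """A put pays off when the stock ends below the strike."""
    return max(strike - stock_price, 0.0)

# Holding the stock plus a put caps the downside: if a $50 stock collapses
# toward zero on a default, a $50-strike put recovers most of the loss.
print(put_payoff(stock_price=5.0, strike=50.0))   # 45.0
print(call_payoff(stock_price=5.0, strike=50.0))  # 0.0
```

This is why put prices carry default information: the closer the market thinks a stock is to collapsing, the more that downside insurance costs.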

“What we are looking at is essentially how expensive put options get,” says Dittmar. “If the market thinks a company is likely to default, it expects that its stock value will fall (almost to zero). As a result, put options, which represent insurance against this loss become more expensive. We are looking at how these option prices change to see if they inform us about the probabilities of default.”

According to Dittmar and his team, this approach has several advantages. First, more stocks have options than CDS. Second, the CDS market is drying up, whereas the options market remains liquid. Third, because of the nature of an option contract, and the fact that equity holders in principle have the lowest claim on a company’s assets, this approach may allow investors to predict losses in case of default.

The team looked at CDS quotes on 276 firms between 2002 and 2017, focusing attention on entities that had quote data available on one-year credit default swaps. The 15-year sample enabled the researchers to analyze the money lost through defaults over a longer period of time, including the 2007-2009 financial crisis.

Using equity options data as a predictor of default led to some interesting insights. First, there are two components that investors in corporate bonds think about when weighing default risk — the probability of default and (should there be a default) how much of the bond’s principal they will get back (i.e., recovery rate). “What we see is that credit ratings imply different levels of default thresholds, which may mean that investors believe that there are differences in the amount that debt holders will lose in the case of default,” says Dittmar.

Second, option-implied default probabilities correlate to historical changes in the economy. Default probabilities are higher in bad economic times and for firms with poorer credit ratings and financial positions. Default spikes are more likely during times of economic turbulence, such as the financial crisis of 2007-2009, which correlated with the decline of the CDS market after an onslaught of debt defaults during the recession. Assets are less likely to default during times of economic expansion. Over the period of 2013-2017, forecasted losses through defaults hovered around 15%.

The research sample ends in 2017, and the paper was published in 2020, about a month after the start of the coronavirus pandemic. Since then, there have been unprecedented changes in the economy, and some economists are anticipating another recession in 2023. With such instability in the market, multiple methods of predicting losses should be especially relevant. This research suggests that the equity options market may provide additional ways of finding the probability of these losses.

------

This article originally ran on Rice Business Wisdom and was based on research from Robert Dittmar, professor of finance at the Jones Graduate School of Business at Rice University.

Earnings report delays generally lead to drops in stock prices. Disclosure can soften this market reaction. Photo via Getty Images

Houston research: Is no news always bad news in market reports?

houston voices

Investors eagerly await the news in companies' earnings reports. When these reports don't appear on the expected date, investors worry — and stock prices often fall as a result. But what if managers could present late reports in a way that spared their companies?

Research by K. Ramesh, a professor at Rice Business, shows that managers' approach to late earnings reports can profoundly affect market reaction. When firms put off filing a report, it's up to managers to decide whether to speak up or stay quiet. Those who choose to talk about a postponement then must decide how, what and how much to say.

All earnings delays, whether they're attended by a statement or not, prompt negative market reaction, prior research suggests. But in his research, Ramesh, Herbert S. Autrey Professor of Accounting, wanted to learn more about the exact consequences of these late reports, and how managers can lessen the blowback.

To do this, Ramesh and a team of coauthors first looked at the incidence, timing and contents of a comprehensive sample of press releases announcing an earnings delay. Then they studied what those delays did to market value.

Conventional wisdom in the business press already suggested that investors viewed any announcement of a delayed earnings report as bad news. But finance theorists tell a more complicated story, one in which the market response might be partially shaped by managerial behavior. Subtle factors, they found, such as whether the impending delay is discussed or treated with silence, really can make a difference.

In the view of some theorists, merely announcing a delay can sometimes avert a drop in stock prices. Others argue that this isn’t necessarily the case, especially if the company discloses that the delay stemmed from legal concerns. The better approach: making it clear up front that reports aren't being postponed to hide disastrous information. But what if the information is indeed disastrous?

That may be the one case where disclosure won’t change much, Ramesh and his team found.

“Those companies that are in fact concealing disastrous results will experience no benefits (in the form of higher stock price) from revealing their true situation,” the research team wrote, “because the market will infer the worst from the manager’s decision not to announce the delay.” For this reason, they added, delayed earnings without a stated explanation prompt the most negative market reaction. As in so many areas of public relations, without a narrative, investors will infer a negative one of their own.

To better understand the impact of late reports, Ramesh and his coauthors built a comprehensive sample of 545 delay announcements by using a keyword search of the Dow Jones Factiva database between January 1, 1995, and December 31, 2009.

As conventional wisdom suggested, the study showed that announcements of late earnings reports led to negative market reactions. (Earlier studies have shown that smaller firms are hit hardest by this dynamic, perhaps because investors assume large companies have more finely tuned financial reporting systems and so are less worried by their earnings delays.)

Consistent with the anecdotal evidence, the average one-day abnormal stock return for the sample was -6.29 percent, while the median return was -2.27 percent. Both figures are economically and statistically significant.

The researchers next classified the announcements according to stated reason, dividing the delays into “Accounting” and “Non-Accounting” categories. “Accounting” explanations were subdivided into “Accounting Issue,” “Accounting Process” and “Rule Change.”

Meanwhile, “Non-Accounting” explanations were divided into “Business,” which linked the delay to some event such as divestitures or regulatory proceedings, and “Other,” which ranged from earthquakes to power outages. Finally, there were delays for no stated reason at all.

About two-thirds of the late announcements, the team found, were linked to accounting. When firms named a specific accounting issue as the cause for delay, the average abnormal return reached a statistically significant -8.15 percent. When managers explained that the accounting process was not complete, the average abnormal return was somewhat less severe, at -7.04 percent.

After accounting issues, business events drove most earnings delays. In theory, these events could have been either good or bad news. But the average abnormal return for the subsample was a statistically significant -3.74 percent — a reflection of the fact that most business events linked to late earnings reports tend to be negative.

Curiously, the average abnormal return for the grouping classified as “Other” was almost nil — at 0.53 percent. This suggests that the market does not penalize managers for events outside of their control that have little, if any, relevance to firm performance.

“No Reason,” the researchers found, was the most damaging explanation of all. Seven percent of the sample, or 37 out of 545 delays, came without a stated reason. The average abnormal return for these was a significant -10.41 percent, a greater negative number than the returns for any of the other reasons.

So what should managers do when a deadline is going to be busted? Bite the bullet and disclose the reasons, Ramesh suggests. For one thing, it helps limit legal exposure and preserve credibility. When the reason for the late report is innocuous, explaining to investors can also mitigate the market's displeasure. A caveat: While informing investors that a power outage caused earnings delay will calm jitters, disclosure may not make a difference if the company just can’t balance its books.

It's human nature, apparently, to read no news as bad news. Relaying something—anything—about the cause of a late report seems to soothe investors' nerves by preventing them from filling the silence themselves.

------

This article originally ran on Rice Business Wisdom and was based on research from K. Ramesh, Tiago Duarte-Silva, Huijing Fu and Christopher F. Noe. K. Ramesh is the Herbert S. Autrey Professor of Accounting at the Jones Graduate School of Business at Rice University.

Research shows that some corporate executives skew earnings to influence the market and inflate share price. Photo via Pexels

Rice University research finds market outliers at risk of misreporting

houston voices

Say a company called CoolConsumerGoodsCo has just released its quarterly earnings report, revealing significantly higher profits than its consumer goods industry counterparts.

That result might spur analysts to slap a buy rating on the stock and investors to snap up shares. In an ideal world, the market wouldn't have to consider the possibility that the numbers aren't legit — but then again, it's not an ideal world. (Enron, anyone?)

Rice Business professors Brian R. Rountree and Shiva Sivaramakrishnan, along with Andrew B. Jackson at UNSW in Australia, studied what makes business leaders more likely to engage in fraudulent earnings reporting. Specifically, they focused on the relationship between this kind of misrepresentation and the degree to which a company's earnings are in line with the rest of its industry — a variable the researchers term "co-movements."

Many people are familiar with a similar variable calculated using stock returns, often referred to as a company's beta. The authors adapted the stock-return beta to corporate earnings to see how a company's earnings move with earnings at the industry level.
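The earnings-beta idea can be sketched as an ordinary least-squares slope: regress a firm's earnings series on its industry's, and a slope near 1 means the firm co-moves with its peers while a slope near 0 means it is out of sync. The data below are made up for illustration; this is the general beta construction, not the paper's exact specification.

```python
from statistics import mean

def earnings_beta(firm: list[float], industry: list[float]) -> float:
    """OLS slope of firm earnings on industry earnings (the 'co-movement')."""
    f_bar, i_bar = mean(firm), mean(industry)
    cov = sum((f - f_bar) * (i - i_bar) for f, i in zip(firm, industry))
    var = sum((i - i_bar) ** 2 for i in industry)
    return cov / var

industry_earnings = [1.0, 2.0, 3.0, 4.0]
in_sync_firm      = [1.1, 2.0, 3.2, 3.9]   # tracks the industry closely
out_of_sync_firm  = [2.5, 1.0, 3.0, 1.5]   # bounces around independently

print(round(earnings_beta(in_sync_firm, industry_earnings), 2))   # ~0.96
print(round(earnings_beta(out_of_sync_firm, industry_earnings), 2))  # ~-0.1
```

Under the study's hypothesis, a firm like the second one — with earnings far out of step with its industry — would be the one to watch for manipulation risk.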

The researchers hypothesized that the less in sync a company's earnings are with its industry, the higher the chance a company's leaders will manipulate earnings reports. They started with the well-accepted premise that corporations try to skew earnings reports to influence the market. The primary motive is typically to raise the company's stock price, as when an executive tries to "choose a level of bias" that balances potential fallout of getting caught against the benefits of a higher stock price.

To test their prediction, the professors analyzed a sample of enforcement actions taken by the U.S. Securities and Exchange Commission against companies for problematic financial reporting from 1970 to 2011 — although they noted that given the SEC's limited resources, the number of enforcement actions probably underestimates the actual amount of earnings manipulation in the market.

Their analysis revealed that firms with low earnings co-movements (meaning their earnings were out of sync with industry peers) were more likely to be accused by the SEC of reporting misdeeds. They concluded that the degree of earnings co-movement determines the probability of earnings manipulation. Put another way, earnings co-movements are a "causal factor" in the chances of earnings manipulations — and to a significant degree. The researchers found that firms that don't co-move with the market are more than 50 percent more likely to face an SEC enforcement action, compared with firms that are perfectly aligned with the market.

The researchers drilled deeper into the data to study whether the odds changed depending on the industry, since past research has indicated that the amount of competition in an industry works to constrain misreporting. That premise seems to hold true, the researchers concluded. In industries with more competitive markets, the impact of low co-movement on earnings manipulation is moderated.

They also studied whether the age of a firm played a part in the likelihood of earnings manipulation. Newer firms often rely more on stock compensation, which could be a motive for manipulating earnings reporting to drive up share price. Indeed, younger firms were more susceptible to misreporting when their earnings were out of whack with the rest of the marketplace.

Every firm faces some risk of misreporting, however. Even for public companies under analyst scrutiny, low co-movement proved to be a driver of earnings manipulation. But companies known for conservative reporting tend to be less likely to exaggerate their earnings, in general; these firms typically recognize losses in a more timely manner, the professors found.

These findings suggest a number of future lines of research. For example: When do executives underreport earnings? And can analyzing patterns related to cash flow reporting help better isolate earnings manipulation?

In the meantime, if you come across a company like CoolConsumerGoodsCo with an earnings report that's widely out of sync with the rest of its industry, you might think twice before rushing to buy in.

------

This article originally ran on Rice Business Wisdom and is based on research from Brian R. Rountree, an associate professor of accounting at the Jones Graduate School of Business at Rice University, and Shiva Sivaramakrishnan, the Henry Gardiner Symonds Professor of Accounting at Rice Business.

John Berger, CEO of Houston-based Sunnova, is this week's Houston Innovators Podcast guest. Courtesy of Sunnova

Houston solar energy exec shines light on company growth and IPO

HOUSTON INNOVATORS PODCAST EPISODE 15

It was all about the timing for John Berger, founder and CEO of Sunnova, a Houston-based residential solar energy company.

When he founded his company in 2012 in Houston, solar energy wasn't the trendy sustainability option it is today, but Berger saw the potential for technology within the industry. So, with a lot of perseverance and the right team behind him, he scaled Sunnova through nationwide expansion, billions of dollars raised, and a debut on the stock market last July — one that also came with great timing.

About 72 hours after Sunnova went public last July, the Federal Reserve System announced it was going to cut rates. Additionally, Sunnova's IPO occurred ahead of WeWork's failed IPO.

"We went public in a market that still isn't back open again, I think, for IPOs," Berger says on this week's episode of the Houston Innovators Podcast. "We had pretty good timing when we went out the door."

However great the timing was, Sunnova's success is built on the hard work and skills of the company's employees, Berger explains on the podcast, and now running a public company requires a dynamic leader.

"I really look at myself and how I can change myself," Berger says. "I'm a different CEO today than I was 12 months ago, and hopefully I'll be a different CEO in 12 months, because the company demands it."

In the episode, Berger lifts the curtain on Sunnova's IPO, explains where he sees the solar energy industry headed, how battery storage technology has evolved, and why he's not worried about who ends up in the White House. Listen to the full episode below — or wherever you get your podcasts — and subscribe for weekly episodes.



Rice Brain Institute awards seed grants for dementia, Alzheimer’s research

brain trust

The recently established Rice Brain Institute awarded 12 seed grants last month to support research on dementia, Alzheimer’s disease, Parkinson’s disease and other neurological disorders.

The grants are part of the Rice DPRIT Seed Grant Program, which aims to help faculty members generate preliminary data and build teams that could compete for support under the Dementia Prevention and Research Institute of Texas (DPRIT).

DPRIT was approved last year to provide $3 billion in state funding over a 10-year span for research on dementia prevention and other neurological conditions. It is modeled after the Cancer Prevention and Research Institute of Texas (CPRIT), which has awarded nearly $4 billion in grants since 2008.

“DPRIT is a historic initiative with transformative impact potential and at Rice we are very well equipped to contribute to its mission and help make Texas a leader in brain health and innovation,” Behnaam Aazhang, a Rice professor of electrical and computer engineering and director of the Neuroengineering Initiative and the RBI, said in a news release.

The Rice DPRIT Seed Grant Program is supported by the RBI and the Educational and Research Initiative for Collaborative Health (ENRICH) office at Rice. Most of the funding came from Rice's Office of Research, with a contribution from Rice's Amyloid Mechanism and Disease Center, which also launched last year.

A number of the teams include collaborators from Houston's Texas Medical Center, including Baylor College of Medicine, University of Texas Medical Branch and the McGovern Medical School at UTHealth Houston.

The 12 teams are:

  • Keya Ghonasgi, assistant professor of mechanical engineering at Rice. Ghonasgi's research addresses the high risk of falls among people with different types of dementia and aims to develop a personalized, home-based fall-prevention approach using textile-integrated wearable sensors.
  • Luz Garcini, associate professor of psychological sciences at Rice, and Hannah Ballard, associate director of community and public health at the Kinder Institute for Urban Research at Rice. Garcini and Ballard's research looks at barriers and facilitators to early detection of Alzheimer’s disease in diverse, medically underserved urban communities and focuses on populations that experience late diagnosis, including Hispanic/Latino groups.
  • Lei Li, assistant professor of electrical and computer engineering at Rice, and Pablo Valdes, assistant professor of neurosurgery at UTMB. Li and Valdes' project develops a noninvasive, bedside imaging approach to monitor brain blood flow and oxygenation in patients recovering from stroke or brain surgery using photoacoustic imaging through a specialized transparent skull implant.
  • Cameron Glasscock, assistant professor of biosciences at Rice. Glasscock's project addresses repeat expansion disorders, such as Huntington’s disease and myotonic dystrophy, and focuses on stopping DNA instability before repeats reach a disease-causing threshold.
  • Raudel Avila, assistant professor of mechanical engineering at Rice. Avila's project focuses on everyday health factors such as nutrition, hydration and brain blood flow and how they influence brain aging long before symptoms of dementia appear.
  • Isaac Hilton, associate professor of bioengineering at Rice, and Laura Lavery, assistant professor of biosciences at Rice. Hilton and Lavery's project uses precise CRISPR-based gene regulation to target multiple genetic drivers of neuronal damage in Alzheimer’s.
  • Quanbing Mou, assistant professor of chemistry at Rice, and Qing-Long Miao, assistant professor of neurology at Baylor College of Medicine. Mou and Miao's project aims to develop a gene-regulation therapy for childhood absence epilepsy by restoring activity of the CACNA1A gene.
  • Momona Yamagami, assistant professor of electrical and computer engineering at Rice, and Christopher Fagundes, professor of psychological sciences at Rice. Yamagami and Fagundes' project addresses the physical and mental health challenges faced by spouses caring for partners with Alzheimer’s disease and related dementias and aims to develop algorithms to determine the optimal timing and frequency of supportive text messages.
  • Han Xiao, professor of chemistry at Rice. Xiao's project aims to improve the delivery of antibody therapies to the brain using a noninvasive, light-based approach that temporarily opens the blood–brain barrier.
  • Lan Luan, associate professor of electrical and computer engineering at Rice. Luan's project investigates how tiny blood-vessel injuries in the brain, known as microinfarcts, contribute to dementia.
  • Natasha Kirienko, associate professor of biosciences at Rice. Kirienko's project targets a shared cause of neurodegeneration, impaired mitochondrial cleanup, and aims to identify an existing antidepressant that could be repurposed to protect neurons in diseases like Alzheimer’s and Parkinson’s.
  • Harini Iyer, assistant professor of biosciences at Rice. Iyer's project will observe zebrafish to investigate how the brain’s primary immune cells become improperly activated in neurological disorders, leading to the loss of healthy neurons and cognitive impairment.

The RBI also named the first four projects to receive research awards through the Rice and TMC Neuro Collaboration Seed Grant Program in January. Read more about those projects here.

Report: These 10 jobs earn the biggest salary premiums in Texas

A move to Texas bolsters earnings for some, and a new SmartAsset study has revealed the top professions where the median annual earnings in the Lone Star State exceed the national median.

The report, "When it Pays to Work in Texas — and When It Doesn’t," published in April, analyzed over 700 occupations to determine which have the biggest "Texas premium" — meaning jobs where the price-adjusted median annual pay in Texas most exceeds the national median for the same occupation — and which jobs have the biggest “Texas penalty,” where the statewide median annual pay falls furthest below the national median. Salaries were sourced from the U.S. Bureau of Labor Statistics (BLS) and adjusted for regional price parity.

According to the report's findings, geoscientists have the biggest "Texas premium" and make a $159,903 median annual salary. Texas' salary for geoscientists is 61 percent higher than the national median for the same position (after adjusting for regional price parity).

"Texas’s large petroleum industry helps explain why employers in the state retain so many geoscientists," the report's author wrote. "In fact, the Lone Star State is home to more geoscientists than any other state except California."

There are more than 3,600 geoscientists working in Texas, SmartAsset said.

These are the remaining top 10 occupations with the biggest "Texas premiums" (salaries are price-adjusted):

  • No. 2 – Commercial pilots: $167,727 median Texas earnings; 37 percent higher than the national median
  • No. 3 – Sailors: $67,614 median Texas earnings; 36 percent higher than the national median
  • No. 4 – Aircraft structure assemblers: $83,519 median Texas earnings; 35 percent higher than the national median
  • No. 5 – Ship captains: $108,905 median Texas earnings; 27 percent higher than the national median
  • No. 6 – Nursing instructors (postsecondary): $100,484 median Texas earnings; 26 percent higher than the national median
  • No. 7 – Tax preparers: $63,321 median Texas earnings; 25 percent higher than the national median
  • No. 8 – Chemists: $104,241 median Texas earnings; 24 percent higher than the national median
  • No. 9 – Health instructors (postsecondary): $128,680 median Texas earnings; 22 percent higher than the national median
  • No. 10 – Engineering instructors (postsecondary): $129,030 median Texas earnings; 22 percent higher than the national median

The careers where Texas workers earn less

SmartAsset said editing is the Texas profession where workers earn furthest below the national median for the same occupation. Not to be confused with film and video editors, the BLS defines editors as those who "plan, coordinate, revise, or edit written material" and "may review proposals and drafts for possible publication."

The study found that editors make a price-adjusted median wage of $29,710, which is 61 percent lower than the national median for the same position. Nearly 8,200 editors work in Texas.

It's worth noting that the salary figures for editors may be skewed by the scarcity of major publications in rural areas of Texas, and other professions may show similar deviations for comparable reasons.

Several healthcare jobs also carry some of the worst penalties in Texas compared to elsewhere in the country. Home health aides have the second-largest penalty in the state, earning a median wage of $24,161.

"More home health aides work in Texas than in nearly any other state, with only California and New York employing more," the report said. "However, the more than 300,000 Texans in this occupation earn median annual pay that is about 31 percent below the national median, after adjusting for regional price parity."

SmartAsset clarified that pay penalties are not consistent "across the board" for other healthcare occupations in Texas.

"For physical therapy assistants, occupational therapy assistants, and postsecondary nursing instructors, Texas may be an especially strong place to work, with these occupations offering 'Texas premiums' of between 17 percent and 26 percent," the study said.

These are the remaining top 10 occupations where median annual earnings in Texas fall furthest below the national median for the same occupation:

  • No. 3 – Cardiovascular technicians: $49,382 median Texas earnings; 27 percent lower than the national median
  • No. 4 – Semiconductor processing technicians: $38,295 median Texas earnings; 25 percent lower than the national median
  • No. 5 – Tutors: $30,060 median Texas earnings; 25 percent lower than the national median
  • No. 6 – Control and valve installers: $56,496 median Texas earnings; 24 percent lower than the national median
  • No. 7 – Mental health social workers: $46,109 median Texas earnings; 23 percent lower than the national median
  • No. 8 – Clinical psychologists: $74,449 median Texas earnings; 22 percent lower than the national median
  • No. 9 – Producers/directors: $65,267 median Texas earnings; 22 percent lower than the national median
  • No. 10 – Interpreters/translators: $46,953 median Texas earnings; 21 percent lower than the national median

---

This article originally appeared on CultureMap.com.

Houston rises in 2026 ranking of best U.S. cities to start a business

Best for Biz

Houston has reaffirmed its reputation for a business-friendly environment, ranking as the 26th best large U.S. city for starting a business in 2026. The city jumped eight places after ranking 34th last year.

WalletHub's annual report compared 100 U.S. cities based on 19 relevant metrics across three key dimensions: business environment, access to resources, and costs. Factors analyzed include five-year business survival rates, job growth between 2020 and 2024, population growth among working-age individuals aged 16-64, office space affordability, and more.

Florida cities swept the top five best places in America for starting a new business: Tampa, Orlando, Jacksonville, Hialeah, and St. Petersburg.

Houston's business environment ranked as the 19th best in the country, and the city ranked 51st in the "business costs" category. However, the city lagged behind in the "access to resources" ranking, coming in at No. 72 overall. This category examined metrics such as Houston's working-age population growth, the share of college-educated individuals, financing accessibility, the prevalence of investors, venture investment amounts per capita, and more.

"From the Gold Rush and the Industrial Revolution to the Internet Age, periods of innovation have shaped our economy and driven major societal progress," the report's author wrote. "However, the past few years have been particularly challenging for business owners in the U.S., due to factors such as the COVID-19 pandemic, the Great Resignation and high inflation."

Earlier this year, WalletHub declared Texas the third-best state for starting a business in 2026, and several Houston-area cities have seen robust growth after being recognized among the best career hotspots in the U.S. Entrepreneurial praise has also been extended to five local companies named among the most innovative companies in the world, and six powerhouse female innovators who made Inc. Magazine's 2026 Female Founders 500 list.

Texas cities with strong environments for new businesses
Multiple cities in the Dallas-Fort Worth Metroplex can claim bragging rights as the best Texas locales for starting a new business. Dallas ranked highest overall, appearing 11th nationally, and Irving landed a few spots behind at No. 16. Arlington (No. 23), Fort Worth (No. 30), Plano (No. 35), and Garland (No. 65) followed behind.

Only six other Texas cities earned spots in the report: Austin (No. 24), Lubbock (No. 36), Corpus Christi (No. 39), San Antonio (No. 64), El Paso (No. 67), and Laredo (No. 76).

Austin tied with Boise, Idaho, and Fresno, California, for the highest average growth in the number of small businesses nationally, while Corpus Christi and Laredo topped a separate list of the U.S. cities with the most accessible financing.

---

This article originally appeared on CultureMap.com.