The stock market has always been hard, if not impossible, to forecast.

What do you think the Standard & Poor’s 500 index will do over the next year?

When Rice Business finance professor Kevin Crotty asks his MBA students this question, the answers are all over the map. Some students expect the overall return on the stock market to be 10 percent, while others predict a loss of 20 percent.

This guessing game is closer to real life than many people realize. Experienced investors, people who have watched the stock market ebb and flow for many years, know that making predictions is a risky business. “Many money managers are more confident choosing individual stocks than trying to time the market,” Crotty says.

For most of the past century, academics have applied their powers of analysis to understanding and predicting the stock market. Recently, some finance researchers have taken a closer look at option prices: the price paid for the right to buy or sell a security (like a stock or bond) at a specified price in the future. Combining economic theory with high-frequency options price data, they argued that they could estimate the expected return on the market in real time, which would be a tremendous development for finance practitioners and academics alike.

Crotty teamed up with Kerry Back, a fellow Rice Business professor, and Seyed Mohammad Kazempour, a finance Ph.D. student at the Jones Graduate School of Business, to evaluate whether the new predictors based on option prices really are a valuable forecasting tool. “Options are essentially a forward-looking contract, so it’s possible that they could be used to create a forward-looking measure of expected returns,” says Kazempour.

Economic theory suggests that the new predictors might systematically underestimate expected returns. The team set out to test whether that is the case and, if so, whether the predictors are still useful as a forecasting tool. In their paper, “Validity, Tightness, and Forecasting Power of Risk Premium Bounds,” the Rice Business researchers ran the predictors through a more rigorous set of statistical tests, ones with more power to detect systematic underestimation. Previous research on the topic relied on less stringent tests and concluded that the predictors do not underestimate expected returns.
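To make the idea concrete, here is a minimal sketch of how an option-implied predictor of this kind can be constructed. It is written in the spirit of the bounds literature rather than as the paper’s exact procedure, and every input is made up for illustration: under certain theoretical assumptions, the risk-neutral variance of the market return, recovered by integrating put prices below the forward price and call prices above it, provides a lower bound on the expected excess return.

```python
import numpy as np

# A minimal sketch of an option-implied lower bound on the market risk
# premium, in the spirit of this literature (e.g., Martin's SVIX bound).
# Every number is made up for illustration; real inputs would be option quotes.
S0 = 4000.0    # current index level (assumed)
F = 4040.0     # forward price for the option expiry (assumed)

# Hypothetical option price curves: puts below the forward, calls above it.
put_strikes = np.linspace(2800.0, F, 60)
call_strikes = np.linspace(F, 5200.0, 60)
put_prices = 2e-5 * (put_strikes - 2800.0) ** 2
call_prices = 2e-5 * (5200.0 - call_strikes) ** 2

def trapezoid(y, x):
    """Trapezoidal rule, kept explicit so the sketch is version-independent."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

# Under the theory's assumptions, the lower bound on the expected excess
# return over the options' horizon is proportional to risk-neutral variance:
#   bound = (2 / S0^2) * (integral of put prices + integral of call prices)
bound = (2.0 / S0**2) * (trapezoid(put_prices, put_strikes)
                         + trapezoid(call_prices, call_strikes))
print(f"option-implied lower bound on the market risk premium: {bound:.2%}")
```

The question the paper asks is whether bounds like this sit close enough to true expected returns to serve as forecasts.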

In short, the new predictors didn’t pass the more stringent tests. The researchers found that forecasts built on stock options consistently underestimated market returns. Moreover, the underestimation is large enough that the predictors are not very useful as forecasts of market returns.

The results were somewhat anticlimactic, the researchers admit. If the option-based predictors had panned out, they could have offered an innovative new tool for thinking about market timing for asset managers as well as investment decision-making for corporate finance projects. “Trying to estimate expected market returns is closely related to whether corporations decide to invest in projects,” notes Crotty. “The expected market return is an input in estimating the cost of capital when evaluating projects, and I explain in my MBA courses that we don’t have very precise estimates for this input. During this research project, I kept thinking about how cool it would be if we really had a better estimate,” he says.

Their research doesn’t end here. Crotty and Back have already begun brainstorming ways to potentially improve the option-based forecasting tool so that it can become more accurate.

At best, though, using option prices as a forecasting tool will only be one ingredient out of many that investors use to make decisions. “This tool may inform money management, but it will never drive it,” says Back.

For now, at least, the Rice researchers believe that trying to predict the stock market is still a very risky game.

------

This article originally ran on Rice Business Wisdom and was based on research from Rice Professors Kerry Back and Kevin Crotty.

Investors might be drawn to active fund investing, but index funds might be less risky, according to Rice University researchers.

Rice University research finds index funds can be a good investment opportunity for the risk averse

Houston Voices

It's easy to assume that investing, like cooking, requires skill to get the right mix of ingredients. But that's not the case with index funds. Effort goes into building them, but these ready-made investments need minimal intervention. Yet the outcomes are appetizing indeed.

In the past few decades, use of index funds has exploded. So have media coverage and advertisements questioning whether they can truly compete with active funds. A recent study by Alan Crane and Kevin Crotty, professors at the business school, provides a resounding "yes." These humble investment recipes, it turns out, are richer than they might seem.

Index funds track benchmark stock indexes, from the familiar Dow Jones Industrial Average to the widely followed Standard & Poor's 500. Like viewers following a cooking show, index fund managers buy stocks in the same companies and same proportions as those listed in a stock index. The best-known indices are traditionally based on the size of the companies.

The idea is that the index fund's returns will match those of its model. An S&P 500 index fund, for example, holds stock in the same 500 major companies that make up the Standard & Poor's 500, ranging from Apple to Whole Foods.

Index funds are part of the broad range of investment products called mutual funds. Like cooks making a stew, mutual fund managers add shares of various stocks into one single concoction, inviting investors to buy portions of the whole mixture.

While some mutual funds are active, meaning professional managers regularly buy and sell their assets, index funds are passive. Their managers theoretically just need to keep an eye on any changes in the index they're copying. Not surprisingly, active funds tend to charge more than passive ones.

Curiously, not all index funds perform at the same level. So what should that mean for investors? To study these variations and their implications, Crane and Crotty expanded on past research about skill and index fund management, analyzing the full cross section of funds.

This wasn't possible to do until fairly recently: there simply weren't enough index funds to study. The first index fund, which tracked the S&P 500, was developed by Vanguard in the 1970s. To do their research, the Rice Business scholars looked at performance information for both index and active funds, starting their sample in 1995 with 29 index funds. The sample expanded to include a total of 240 index funds, all at least two years old with at least $5 million in assets, mostly invested in common stocks. They also analyzed 1,913 actively managed funds.

Using several statistical models, Crane and Crotty found that outperformance in index-fund returns was greater than it would be by chance. The discovery suggests that passive funds, although they require little skill to run, have almost as much upside as active funds.
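As a rough illustration of what one such statistical model looks like (a minimal sketch with simulated returns, not the authors' actual methodology), a fund's outperformance, or "alpha," can be estimated by regressing its excess returns on the market's:

```python
import numpy as np

# A minimal sketch of estimating one fund's "alpha" from simulated monthly
# data; an illustration only, not the Crane-Crotty methodology.
rng = np.random.default_rng(0)
n_months = 120
market_excess = rng.normal(0.006, 0.04, n_months)  # market return minus T-bill
fund_excess = 0.001 + 0.98 * market_excess + rng.normal(0.0, 0.005, n_months)

# OLS regression: fund_excess = alpha + beta * market_excess + noise
X = np.column_stack([np.ones(n_months), market_excess])
(alpha, beta), *_ = np.linalg.lstsq(X, fund_excess, rcond=None)
print(f"alpha = {alpha:.4f} per month, beta = {beta:.2f}")
```

Judging skill then amounts to asking whether the alphas of the best funds are larger than luck alone would produce across thousands of such regressions.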

In fact, the professors found, the best index funds perform surprisingly closely to the best active funds, but at a lower cost to the investor. The worst active funds perform far worse than the worst index funds, even before management fees.

The findings topple the conventional wisdom that only actively managed funds stand a chance of beating the market. While active-fund managers often measure their success against that of passive funds, the data show investors who are risk averse would do better to choose passive funds over more expensive active ones.

More adventurous investors, of course, will always be tempted by what's cooking in actively managed funds. But overall, plain index funds serve up just as good a meal at a lower price.

------

This story originally ran on Rice Business Wisdom.

Alan D. Crane and Kevin Crotty are associate professors of finance at the Jones Graduate School of Business at Rice University.


How Houston innovators played a role in the historic Artemis II splashdown

safe landing

Research from Rice University played a critical role in the safe return of U.S. astronauts aboard NASA’s Artemis II mission this month.

Rice mechanical engineer Tayfun E. Tezduyar and longtime collaborator Kenji Takizawa developed a key computational parachute fluid-structure interaction (FSI) analysis system that proved vital to the descent of NASA’s Orion capsule into the Pacific Ocean. The FSI system, originally developed in 2013 alongside NASA Johnson Space Center, was critical to Orion’s three-parachute design, which slowed the capsule as it returned to Earth, according to Rice.

The model helped ensure that the parachute design was large enough to slow the capsule for a safe landing while also being stable enough to prevent the capsule from oscillating as it descended.

“You cannot separate the aerodynamics from the structural dynamics,” Tezduyar said in a news release. “They influence each other continuously and even more so for large spacecraft parachutes, so the analysis must capture that interaction in a robustly coupled way.”

The result, refined through NASA drop tests and Rice’s computational FSI analysis, was a parachute system that eliminated fluctuations and produced a stable descent profile.

Beyond those dynamic design challenges, modeling Orion’s parachutes required solving complex equations governing airflow and fabric deformation, while accounting for features like ringsail canopy construction and aerodynamic interactions among multiple parachutes in a cluster.
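A deliberately tiny example can show why the two physics must be solved together. In the toy model below (made-up constants, nothing like the real Rice-NASA system), the canopy’s effective drag area depends on the aerodynamic load while the load depends on the area, so the two are iterated to agreement before each time step advances:

```python
# Toy fluid-structure coupling: a capsule descending under a canopy whose
# effective drag area shrinks under aerodynamic load. All constants are
# made up; this illustrates only the coupled-solve idea, not the real system.
RHO = 1.2        # air density, kg/m^3
G = 9.81         # gravity, m/s^2
MASS = 9000.0    # capsule mass, kg (assumed)
CD = 0.8         # drag coefficient (assumed)
A0 = 900.0       # undeformed canopy drag area, m^2 (assumed)
K = 2e-4         # canopy area lost per newton of drag load (assumed)

v, area, dt = 80.0, A0, 0.05
for _ in range(int(20.0 / dt)):          # 20 seconds of descent
    # Subiterate the "fluid" solve (drag from area) against the
    # "structure" solve (area from drag) until they are consistent.
    for _ in range(25):
        drag = 0.5 * RHO * CD * area * v * v
        area = max(0.3 * A0, A0 - K * drag)
    # Advance the capsule using the converged, mutually consistent force.
    v += (G - drag / MASS) * dt
print(f"descent speed ~ {v:.1f} m/s, deformed canopy area ~ {area:.0f} m^2")
```

Computing the drag and the deformation sequentially, without the inner loop, would let the two fall out of sync, which is the kind of error a robustly coupled method avoids.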

“Essentially, my entire group was dedicated to that work, because I considered it a national priority,” Tezduyar added in the release. “Kenji and I were personally involved in every computer simulation. Some of the best graduate students and research associates I met in my career worked on the project, creating unique, first-of-its-kind parachute computer simulations, one after the other.”

Current Intuitive Machines engineer Mario Romero also worked on Orion during his time at NASA. From 2018 to 2021, Romero was a member of the Orion Crew Capsule Recovery Team, which simulated likely scenarios that crew members could encounter in Orion.

The team trained in NASA’s 6.2-million-gallon pool, using wave machines to replicate a range of sea conditions. They also simulated worst-case scenarios by cutting the lights, blasting high-powered fans and tipping a mock capsule to mimic distress situations. In some drills, mock crew members were treated as “injured,” requiring the team to practice safe, controlled egress procedures.

“It’s hard to find the appropriate descriptors that can fully encapsulate the feeling of getting to witness all the work we, and everyone else, did being put into action,” Romero tells InnovationMap. “I loved seeing the reactions of everyone, but especially of the Houston communities—that brought me a real sense of gratitude and joy.”

Intuitive Machines was also selected to support the Artemis II mission using its Space Data Network and ground station infrastructure. The company monitored radio signals sent from the Orion spacecraft and used Doppler measurements to help determine the spacecraft's precise position and speed.
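The principle behind Doppler tracking is compact, even if the engineering is not: the spacecraft’s motion shifts the frequency of its radio carrier, and the size of the shift gives its velocity along the line of sight. The numbers below are illustrative, not actual Artemis II telemetry:

```python
# Radial velocity from a Doppler-shifted radio carrier. The frequencies are
# made-up illustrations, not actual Artemis II telemetry.
C = 299_792_458.0             # speed of light, m/s

f_transmit = 2.2e9            # carrier frequency, Hz (assumed S-band)
f_received = 2.199_920_000e9  # frequency at the ground station (made up)

# For speeds far below c, the one-way Doppler relation is approximately
#   delta_f / f = -v_radial / c   (positive v_radial means receding)
v_radial = -(f_received - f_transmit) / f_transmit * C
print(f"line-of-sight velocity ~ {v_radial:.0f} m/s (receding)")
```

Combining many such line-of-sight measurements from different ground stations over time is what pins down a spacecraft’s position and speed.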

Tim Crain, Chief Technology Officer at Intuitive Machines, wrote about the experience last week.

"I specialized in orbital mechanics and deep space navigation in graduate school,” Crain shared. “But seeing the theory behind tracking spacecraft come to life as they thread through planetary gravity fields on ultra-precise trajectories still seems like magic."

UH breakthrough moves superconductivity closer to real-world use

Energy Breakthrough

University of Houston researchers have set a new benchmark in the field of superconductivity.

Researchers from the UH physics department and the Texas Center for Superconductivity (TcSUH) have broken the transition temperature record for superconductivity at ambient pressure. The accomplishment could lead to more efficient ways to generate, transmit and store energy, which researchers believe could improve power grids, medical technologies and energy systems by enabling electricity to flow without resistance, according to a release from UH.

To break the record, UH researchers achieved a transition temperature of 151 Kelvin (K), the highest ever recorded at ambient pressure since the discovery of superconductivity in 1911.

The transition temperature is the temperature below which a material becomes superconducting, meaning electricity can flow through it without resistance. Scientists have been working for decades to push transition temperatures closer to room temperature, which would make superconducting technologies more practical and affordable.
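In the lab, that transition shows up as an abrupt drop in a sample’s electrical resistance during cooling. Here is a minimal sketch of how a transition temperature might be read off resistance-versus-temperature data; the sweep below is simulated, not the UH team’s measurements:

```python
import numpy as np

# Simulated resistance-vs-temperature sweep for a material with an assumed
# transition near 151 K; these are not the UH team's measurements.
temps = np.linspace(300.0, 100.0, 401)      # cooling sweep, K
resistance = np.where(temps > 151.0,
                      0.002 * temps,        # normal state: roughly linear in T
                      0.0)                  # superconducting state: zero
resistance += np.abs(np.random.default_rng(1).normal(0.0, 1e-4, temps.size))

# A common practical criterion: Tc is where resistance falls below a small
# fraction of its normal-state value.
threshold = 0.01 * resistance[0]
tc_estimate = temps[np.argmax(resistance < threshold)]
print(f"estimated transition temperature ~ {tc_estimate:.0f} K")
```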

Currently, most superconductors must be cooled to extremely low temperatures, making them more expensive and difficult to operate.

UH physicists Ching-Wu Chu and Liangzi Deng published the research in the Proceedings of the National Academy of Sciences earlier this month. It was funded by Intellectual Ventures and the state of Texas via TcSUH and other foundations. Chu, founding director and chief scientist at TcSUH, made the breakthrough discovery in 1987 that the material YBCO becomes superconducting at 93 K, which helped kick off a global competition to develop high-temperature superconductors.

“Transmitting electricity in the grid loses about 8% of the electricity,” Chu, who’s also a professor of physics at UH and the paper’s senior author, said in a news release. “If we conserve that energy, that’s billions of dollars of savings and it also saves us lots of effort and reduces environmental impacts.”

Chu and his team used a technique known as pressure quenching, adapted from methods used to create diamonds. With pressure quenching, researchers first apply intense pressure to the material to enhance its superconducting properties and raise its transition temperature, then rapidly cool it so that those pressure-enhanced properties are retained even after the pressure is released.

Next, researchers are targeting ambient-pressure, room-temperature superconductivity of around 300 K. In a companion PNAS paper, Chu and Deng point to pressure quenching as a promising approach to help bridge the gap between current results and that goal.

“Room-temperature superconductivity has been seen as a ‘holy grail’ by scientists for over a century,” Rohit Prasankumar, director of superconductivity research at Intellectual Ventures, said in the release. “The UH team’s result shows that this goal is closer than ever before. However, the distance between the new record set in this study and room temperature is still about 140 C. Closing this gap will require concerted, intentional efforts by the broader scientific community, including materials scientists, chemists, and engineers, as well as physicists.”

------

This article originally appeared on EnergyCapitalHTX.com.