Customer satisfaction directly influences a company's sales, margins, and earnings, and companies that track and measure it have a leg up on the competition.

Back when people flew nonchalantly for business, an unabashed fan of Great Reputation Airline took a flight where almost everything went wrong. First there was a weather delay, then a mechanical issue. The crew was surly, the pretzels stale. And when she finally made it to baggage claim after landing, her suitcase was MIA.

But instead of complaining on social media, Great Reputation's passenger chalked the problems up to a rare bad day for the airline – which showered her with drink coupons and later delivered her luggage to her hotel.

GRA's response exemplifies customer satisfaction principles outlined in a paper by Rice Business professor Vikas Mittal and former Rice Business doctoral student Carly Frennea. Summarizing the major research about customer satisfaction, the coauthors codified their findings into a checklist for managers.

While most people understand the general concept of customer satisfaction, in business it's a specific term summarizing a consumer's post-use evaluation of the extent to which a product or service met their expectations. Satisfied customers are more likely to buy again, buy more, recommend a business to others and cost less to serve in the future. A satisfied customer doesn't just cut customer-acquisition costs. She can also help a business attract the right customers through online recommendations.

But the most compelling reason to chase customer satisfaction, say Mittal and Frennea, comes from the University of Michigan's American Customer Satisfaction Index, which tracks customer satisfaction ratings of public companies. Decades of studies based on this data show that customer satisfaction and financial performance go hand-in-hand. While the strength of this association can vary, the link is indisputable. "Nowhere else in marketing has the impact of a customer-based metric on a firm's financial performance been so clearly and consistently established," Mittal and Frennea write.

To help make that satisfaction-revenue link a felicitous one, the researchers identify five kinds of data managers should collect:

  • Overall customer satisfaction: A summary evaluation of an overall experience.
  • Behavioral intentions: "Loyalty metrics" that measure the likelihood of buying again, recommending to others and intent to complain.
  • Attribute-level perceptions: Evaluating specific product or service features. For a doctor, this may include time spent waiting in the office, quality of care and explanation of diagnosis. For an oilfield services company, this may include product quality, safety, ongoing service and support, billing and pricing.
  • Contextual information: Comparisons to earlier experiences with a firm and against those with competitors.
  • Customer background variables: Includes gender, age and use of competitors' products and services.

Once these data are collected, the researchers say, managers should use statistical analysis that includes all relevant variables (a method known as multiple regression). This allows companies to figure out which variables are most strongly associated with overall satisfaction, and which have no association at all. For example, a multiple regression might show that the negative effect of falling short of customer expectations is stronger than the positive effect of exceeding them. The analysis may also reveal that this effect is stronger for ongoing service and support, say, than for pricing and billing. Conclusion: The company should fix problems with ongoing service and support before tinkering with its pricing and billing strategy.
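The driver analysis described above can be sketched with ordinary least squares on synthetic data. Everything below is invented for illustration – the attribute names and weights are assumptions, not the researchers' data:

```python
import numpy as np

# Hypothetical illustration of driver analysis via multiple regression.
# Attribute names and weights are invented for this sketch.
rng = np.random.default_rng(0)
n = 500

# Synthetic attribute-level perceptions on a 1-10 scale
service_support = rng.uniform(1, 10, n)
pricing_billing = rng.uniform(1, 10, n)
product_quality = rng.uniform(1, 10, n)

# Assumed "true" relationship: service and support matter most
satisfaction = (0.6 * service_support
                + 0.1 * pricing_billing
                + 0.3 * product_quality
                + rng.normal(0, 0.5, n))

# Ordinary least squares: solve X @ beta ~ y for beta
X = np.column_stack([np.ones(n), service_support,
                     pricing_billing, product_quality])
beta, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)
drivers = dict(zip(
    ["intercept", "service_support", "pricing_billing", "product_quality"],
    beta))

# The estimated coefficients recover each attribute's relative
# importance, pointing to service and support as the first thing to fix.
print({name: round(coef, 2) for name, coef in drivers.items()})
```

Ranking the estimated coefficients is what turns a pile of survey responses into a priority list: the attribute with the largest coefficient is the one where improvement moves overall satisfaction most.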

Companies should also share such customer satisfaction insights with employees and incentivize them to make customer satisfaction a top priority, the researchers write.

To achieve this, executives need to see customer satisfaction as a strategic tool, not just a "good-to-have" afterthought. For this:

  • Treat customer satisfaction as a strategic investment and integrate it into the strategic planning process.
  • Don't skimp on the science. Use rigorous multiple regression models – and, increasingly, machine-learning methods – to distinguish the important from the unimportant, and prioritize the important.
  • Using statistical methods, link stated customer-loyalty intentions to actual behaviors such as repurchasing.
  • Remember that your front-line employees are vital and motivate them by linking their performance to the right customer satisfaction metrics.
  • Don't just maximize customer satisfaction. Balance decreasing and increasing returns on satisfaction initiatives. For this, don't rely on "voice-of-customer" based on casual interviews and discussions. Use rigorously designed customer studies that can be statistically linked to financial results.
  • Share! Summarize satisfaction findings in understandable terms and train employees to act on them. Smart companies use this approach to derive their customer-value proposition and focus the company's strategy.

The formula, after all, is a simple one. If customers are a primary source of your company's cash flow, the first variable in your strategy needs to be making them happy.

------

This story originally ran on Rice Business Wisdom. It's based on research from Vikas Mittal, the J. Hugh Liedtke Professor of Marketing at Jones Graduate School of Business at Rice University, and Carly Frennea, now an executive at Nike, who received her M.B.A. and Ph.D. at Jones Graduate School of Business.

How Houston innovators played a role in the historic Artemis II splashdown

Safe Landing

Research from Rice University played a critical role in the safe return of U.S. astronauts aboard NASA’s Artemis II mission this month.

Rice mechanical engineer Tayfun E. Tezduyar and longtime collaborator Kenji Takizawa developed a key computational parachute fluid-structure interaction (FSI) analysis system that proved vital to the descent of NASA’s Orion capsule into the Pacific Ocean. The FSI system, originally developed in 2013 alongside NASA Johnson Space Center, was critical to Orion’s three-parachute design, which slowed the capsule as it returned to Earth, according to Rice.

The model helped ensure that the parachute design was large enough to slow the capsule for a safe landing while also being stable enough to prevent the capsule from oscillating as it descended.

“You cannot separate the aerodynamics from the structural dynamics,” Tezduyar said in a news release. “They influence each other continuously and even more so for large spacecraft parachutes, so the analysis must capture that interaction in a robustly coupled way.”
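The coupling Tezduyar describes can be conveyed with a deliberately simple toy model – a one-degree-of-freedom descent in which the canopy's effective drag area shrinks under aerodynamic load, so the airflow and the structure influence each other at every time step. This is not the Rice/NASA FSI system; every number here is invented:

```python
# Toy illustration of fluid-structure coupling (not the Rice/NASA FSI
# system): the canopy's drag area responds to dynamic pressure, and the
# deformed shape feeds back into the aerodynamic force.
# All parameter values are invented for illustration.

g = 9.81        # gravity, m/s^2
rho = 1.2       # air density, kg/m^3
m = 9000.0      # capsule mass, kg
A0 = 1000.0     # undeformed drag area of the parachute cluster, m^2
k = 1e-4        # canopy "softness": area lost per Pa of dynamic pressure
Cd = 0.8        # drag coefficient

v = 100.0       # initial descent speed, m/s
dt = 0.01       # time step, s
for _ in range(int(60 / dt)):     # simulate 60 seconds of descent
    q = 0.5 * rho * v ** 2        # dynamic pressure (the "fluid" side)
    A = A0 / (1.0 + k * q)        # canopy deforms under load (the "structure" side)
    drag = Cd * q * A             # the deformed shape changes the drag
    v += (g - drag / m) * dt      # update descent speed

# v settles where drag on the deformed canopy balances the weight
print(f"terminal descent speed: {v:.1f} m/s")
```

Because the drag area depends on the dynamic pressure, which depends on the speed, which depends on the drag, neither side can be solved alone – the same circularity, vastly magnified, is what the real coupled FSI analysis resolves.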

The result was a parachute system, refined through NASA drop tests and Rice’s computational FSI analysis, that eliminated fluctuations and produced a stable descent profile.

Beyond those dynamic challenges, modeling Orion’s parachutes required solving complex equations governing airflow and fabric deformation, while accounting for features like the ringsail canopy construction and aerodynamic interactions among the multiple parachutes in the cluster.

“Essentially, my entire group was dedicated to that work, because I considered it a national priority,” Tezduyar added in the release. “Kenji and I were personally involved in every computer simulation. Some of the best graduate students and research associates I met in my career worked on the project, creating unique, first-of-its-kind parachute computer simulations, one after the other.”

Current Intuitive Machines engineer Mario Romero also worked on Orion during his time at NASA. From 2018 to 2021, Romero was a member of the Orion Crew Capsule Recovery Team, which focused on simulating scenarios that crew members could encounter in Orion.

The team trained in NASA’s 6.2-million-gallon pool, using wave machines to replicate a range of sea conditions. They also simulated worst-case scenarios by cutting the lights, blasting high-powered fans and tipping a mock capsule to mimic distress situations. In some drills, mock crew members were treated as “injured,” requiring the team to practice safe, controlled egress procedures.

“It’s hard to find the appropriate descriptors that can fully encapsulate the feeling of getting to witness all the work we, and everyone else, did being put into action,” Romero tells InnovationMap. “I loved seeing the reactions of everyone, but especially of the Houston communities—that brought me a real sense of gratitude and joy.”

Intuitive Machines was also selected to support the Artemis II mission using its Space Data Network and ground station infrastructure. The company monitored radio signals sent from the Orion spacecraft and used Doppler measurements to help determine the spacecraft's precise position and speed.
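The idea behind Doppler-based tracking can be sketched in a few lines. This is a hypothetical illustration of the physics, not Intuitive Machines' actual pipeline, and the frequencies below are assumptions:

```python
# Hypothetical sketch of the physics behind Doppler tracking, not
# Intuitive Machines' actual pipeline; the example frequencies are
# assumptions.
C = 299_792_458.0   # speed of light, m/s

def radial_velocity(f_transmit_hz: float, f_received_hz: float) -> float:
    """Line-of-sight speed from a one-way Doppler shift.
    Positive means the spacecraft is receding from the ground station."""
    return C * (f_transmit_hz - f_received_hz) / f_transmit_hz

# Example: a 2.2 GHz downlink carrier received 22 kHz low implies the
# spacecraft is receding at roughly 3 km/s along the line of sight.
v = radial_velocity(2.2e9, 2.2e9 - 22_000.0)
print(round(v))  # 2998
```

A single station only sees the velocity component along its line of sight; combining measurements over time (and from multiple stations) is what pins down the full position and speed.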

Tim Crain, Chief Technology Officer at Intuitive Machines, wrote about the experience last week.

"I specialized in orbital mechanics and deep space navigation in graduate school,” Crain shared. “But seeing the theory behind tracking spacecraft come to life as they thread through planetary gravity fields on ultra-precise trajectories still seems like magic."

UH breakthrough moves superconductivity closer to real-world use

Energy Breakthrough

University of Houston researchers have set a new benchmark in the field of superconductivity.

Researchers from the UH physics department and the Texas Center for Superconductivity (TcSUH) have broken the transition temperature record for superconductivity at ambient pressure. The accomplishment could lead to more efficient ways to generate, transmit and store energy, which researchers believe could improve power grids, medical technologies and energy systems by enabling electricity to flow without resistance, according to a release from UH.

To break the record, UH researchers achieved a transition temperature of 151 Kelvin, the highest ever recorded at ambient pressure since superconductivity was discovered in 1911.

The transition temperature is the temperature below which a material becomes superconducting, allowing electricity to flow through it without resistance. Scientists have worked for decades to push transition temperatures closer to room temperature, which would make superconducting technologies more practical and affordable.

Currently, most superconductors must be cooled to extremely low temperatures, making them more expensive and difficult to operate.

UH physicists Ching-Wu Chu and Liangzi Deng published the research in the Proceedings of the National Academy of Sciences earlier this month. It was funded by Intellectual Ventures and the state of Texas via TcSUH, along with other foundations. Chu, founding director and chief scientist at TcSUH, made the breakthrough discovery in 1987 that the material YBCO becomes superconducting at 93 Kelvin, which helped launch a global competition to develop high-temperature superconductors.

“Transmitting electricity in the grid loses about 8% of the electricity,” Chu, who’s also a professor of physics at UH and the paper’s senior author, said in a news release. “If we conserve that energy, that’s billions of dollars of savings and it also saves us lots of effort and reduces environmental impacts.”

Chu and his team used a technique known as pressure quenching, adapted from techniques used to create diamonds. Researchers first apply intense pressure to enhance a material's superconducting properties and raise its transition temperature, then cool the material before releasing the pressure, locking in the enhanced state at ambient pressure.

Next, researchers are targeting ambient-pressure, room-temperature superconductivity of around 300 K. In a companion PNAS paper, Chu and Deng point to pressure quenching as a promising approach to help bridge the gap between current results and that goal.

“Room-temperature superconductivity has been seen as a ‘holy grail’ by scientists for over a century,” Rohit Prasankumar, director of superconductivity research at Intellectual Ventures, said in the release. “The UH team’s result shows that this goal is closer than ever before. However, the distance between the new record set in this study and room temperature is still about 140 C. Closing this gap will require concerted, intentional efforts by the broader scientific community, including materials scientists, chemists, and engineers, as well as physicists.”

---

This article originally appeared on EnergyCapitalHTX.com.