By accounting for both known and unknown factors, managers can identify salespeople with traits that work best in different types of sales. Getty Images

When you're a manager, decisions barrage you each day. What product works? Which store layout entices? How will you balance the budget? Many of these decisions ultimately hinge on one factor: the skills of your sales force.

Often, when managers evaluate their salespeople, they contend with factors that may not show up in commissions or name-tagged sales rosters — intangibles such as product placement, season or simply a store's surrounding population. This makes it hard to fully evaluate a salesperson, or to spot which workers can teach valuable skills to their peers and improve the whole team.

But what if you could plug a few variables into a statistical model to spot your best sellers? You could then ask the star salespeople to teach coworkers some of their secrets. New research by Rice Business professor Wagner A. Kamakura and colleague Danny P. Claro of Brazil's Insper Education and Research Institute offers a technique for doing this. Blending statistical methods that incorporate both known and unknown factors, Kamakura and Claro developed a practical tool that, for the first time, allows managers to identify staffers with key hidden skills.

To test their model, the researchers analyzed store data from 35 cosmetic and healthcare retail franchises in four South American markets. These particular stores were ideal to test the model because their salespeople were individually responsible for each transaction from the moment a customer entered a store to the time of purchase. The salespeople were also required to have detailed knowledge of products throughout each store.

Breaking down the product lines into 11 specific categories, and accounting for predictors such as commission, product display, time of year and market potential, Kamakura and Claro documented and compared each salesperson's performance across products and over time.

They then organized members of the salesforce by strengths and weaknesses, spotlighting those workers who used best practices in a certain area and those who might benefit from that savvy. The resulting insight allowed managers to name team members as either growth advisors or learners. Thanks to the model's detail, Kamakura and Claro note, managers can spot a salesperson who excels in one category but has room to learn, rather than seeing that worker averaged into a single, middle-of-the-pack ranking.
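
To make the idea concrete, here is a minimal sketch of this kind of analysis, assuming monthly, category-level sales records. A standard mixed-effects regression with a random intercept per salesperson stands in for Kamakura and Claro's actual model; every column name and number below is an illustrative placeholder, not data from the study.

```python
# Minimal sketch: a per-category mixed-effects regression in which each
# salesperson's random intercept estimates otherwise hidden skill.
# All data and names below are synthetic placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2200  # salesperson x category x month observations
sales = pd.DataFrame({
    "salesperson": rng.integers(0, 20, n).astype(str),
    "category": rng.integers(0, 11, n).astype(str),
    "commission_rate": rng.uniform(0.01, 0.10, n),
    "display_share": rng.uniform(0.0, 1.0, n),
    "market_potential": rng.uniform(0.5, 2.0, n),
})
skill = rng.normal(0, 0.3, 20)  # latent per-person skill, unknown to the model
sales["log_revenue"] = (
    1.0 + 2.0 * sales["commission_rate"] + 0.5 * sales["display_share"]
    + 0.3 * sales["market_potential"]
    + skill[sales["salesperson"].astype(int)] + rng.normal(0, 0.2, n)
)

for category, df in sales.groupby("category"):
    fit = smf.mixedlm(
        "log_revenue ~ commission_rate + display_share + market_potential",
        data=df, groups=df["salesperson"],
    ).fit()
    # Random intercepts: each salesperson's deviation from the category
    # norm after the observable predictors are controlled for.
    effects = {sp: float(re.iloc[0]) for sp, re in fit.random_effects.items()}
    advisor = max(effects, key=effects.get)   # candidate coach
    learner = min(effects, key=effects.get)   # candidate trainee
    print(f"category {category}: advisor={advisor}, learner={learner}")
```

A salesperson with a strongly positive intercept in one category but a negative one in another maps directly onto the pattern the researchers describe: an advisor in one area, a learner in another.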

If a salesperson is, for example, a sales savant but lags in customer service, managers can use that insight to help the worker improve individually, while at the same time strategizing for the store's overall success. Put into practice, the model also allows managers to identify team members who excel at selling one specific product category — and encourage them to share their secrets and methods with coworkers.

It might seem that teaching one employee to sell one more set of earbuds or one more lawn chair makes little difference. But applied consistently over time, such personalized product-specific improvement can change the face of a salesforce — and in the end, a whole business. A good manager uses all the tools available. Kamakura and Claro's model makes it possible for every employee on a sales team to be a potential coach for the rest.

------

This story originally ran on Rice Business Wisdom.

Based on research from Wagner A. Kamakura, the Jesse H. Jones Professor of Marketing at Jones Graduate School of Business at Rice University.

Keeping track of trends is crucial to growing and developing a relationship with your customers, researchers found. Getty Images

Rice researcher delves into the importance of trendspotting in consumer behavior

Houston voices

Every business wants to read consumers' minds: what they love, what they hate. Even more, businesses want to know about mass trends before they're visible to the naked eye.

In the past, analysts searching for trends needed to pore over a vast range of sources for marketplace indicators. The internet and social media have changed that: marketers now have access to an avalanche of real-time indicators, laden with details about the wishes hidden within customers' hearts and minds. With services such as Trendistic (which tracks individual Twitter terms), Google Insights for Search and BlogPulse, modern marketers are even privy to the real-time conversations surrounding consumers' desires.

Now, imagine being able to analyze all this data across large panels of time-series data – then distilling it so well that you could identify marketing trends quickly, accurately and quantitatively.

Rice Business professor Wagner A. Kamakura and Rex Y. Du of the University of Houston set out to create a model that makes this possible. Because both quantitative and qualitative trendspotting are exploratory endeavors, Kamakura notes, both types of research can yield results that are broad but also inaccurate. To remedy this, Kamakura and Du devised a new model for quickly and accurately refining market data into trend patterns.

Kamakura and Du's model entails five simple steps for analyzing the gathered data quantitatively. By refining tens or hundreds of tracking series, then separating the information into specific seasonal, non-seasonal and dynamic trends, researchers can generate stable trend patterns over time.

Here's the process:

  • First, gather individual indicators by assembling data from different sources, with the understanding that the information is interconnected. It's crucial to select the data methodically, rather than making random choices, in order to avoid subjectively preselecting irrelevant indicators and blocking out relevant ones. Done sloppily, this first step can generate misleading information.
  • Distill the data into a few common factors. The raw data might include inaccuracies, which must be filtered out to lower the risk of overreacting or noting erroneous indicators.
  • Interpret and identify common trends by understanding the causes of spikes or dips in consumer behavior. It's key to separate non-cyclical and cyclical changes, because external events such as holidays or weather can alter behavior.
  • Compare your analysis with previously identified trends and other variables to establish their validity and generate insights. Looking at past performance through the filter of new insights can offer managers important guidance.
  • Project the trend lines you've identified using historical tracking data and the modeling framework. These trend lines can then be extrapolated into near-future projections, allowing managers to better position themselves, proactively reversing unfavorable trends and leveraging positive ones (a minimal code sketch of these steps follows this list).
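
For readers who want to see the shape of such a pipeline, here is a minimal sketch of the five steps on weekly tracking data. Plain PCA and classical seasonal decomposition stand in for Kamakura and Du's factor-analytic model, and every series below is synthetic.

```python
# Minimal sketch of the five steps on synthetic weekly tracking data.
# PCA and classical seasonal decomposition are stand-ins for the
# authors' factor-analytic model; all series here are illustrative.
import numpy as np
import pandas as pd
from sklearn.decomposition import PCA
from statsmodels.tsa.seasonal import seasonal_decompose

# Step 1: assemble indicators from different sources (synthetic here).
rng = np.random.default_rng(0)
weeks = pd.date_range("2020-01-05", periods=156, freq="W")
indicators = pd.DataFrame(
    rng.normal(size=(156, 20)).cumsum(axis=0), index=weeks,
    columns=[f"series_{i}" for i in range(20)],
)

# Step 2: distill the standardized series into a few common factors.
z = (indicators - indicators.mean()) / indicators.std()
factors = pd.DataFrame(PCA(n_components=3).fit_transform(z), index=weeks)

# Step 3: separate each factor into trend, seasonal and noise parts.
decomp = seasonal_decompose(factors[0], period=52, model="additive")
trend = decomp.trend  # smoothed, non-cyclical movement

# Step 4: validate against known events and variables (analyst judgment).

# Step 5: extrapolate the smoothed trend a few periods ahead.
recent = trend.dropna().iloc[-8:]
slope = np.polyfit(np.arange(len(recent)), recent.to_numpy(), 1)[0]
print(f"trend slope over the last 8 weeks: {slope:+.3f} per week")
```

Note that step four remains a qualitative check: the numbers only become insight once they are compared against what the analyst already knows about the market.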

It's important to bear in mind that the indicators used for quantitative trendspotting are prone to random and systematic errors, Kamakura writes. The model he devised, however, can filter out these errors because they do not persist consistently across different indicator series and time periods. The result: a better ability to identify genuine movements and general trends, free from the influence of seasonal events and random error.

It goes without saying that the information and pervasiveness of the internet are inevitably attended by noise. For marketers, this means that without filtering, some trends show spikes for temporary items – mere viral jolts that can skew market research.

Kamakura and Du's model helps sidestep this problem by analyzing historical data across large panels of tracking series to isolate genuine movements, while avoiding errors common to more traditional methods. For managers longing to glimpse the next big thing, this analytical model can reveal emerging consumer movements with clarity – just as they're becoming the future.

(For the mathematically inclined, and those comfortable with Excel macros and Add-Ins, who want to try trendspotting on their own tracking data, Kamakura's Analytical Tools for Excel (KATE) can be downloaded for free at http://wak2.web.rice.edu/bio/Kamakura_Analytic_Tools.html.)

------

This article originally appeared on Rice Business Wisdom.

Wagner A. Kamakura is Jesse H. Jones Professor of Marketing at Jones Graduate School of Business at Rice University.


Houston researchers develop material to boost AI speed and cut energy use

ai research

A team of researchers at the University of Houston has developed an innovative thin-film material that they believe will make AI devices faster and more energy efficient.

AI data centers consume massive amounts of electricity and rely on large cooling systems to operate, adding strain to overall energy supplies.

“AI has made our energy needs explode,” Alamgir Karim, Dow Chair and Welch Foundation Professor at the William A. Brookshire Department of Chemical and Biomolecular Engineering at UH, explained in a news release. “Many AI data centers employ vast cooling systems that consume large amounts of electricity to keep the thousands of servers with integrated circuit chips running optimally at low temperatures to maintain high data processing speed, have shorter response time and extend chip lifetime.”

In a report recently published in ACS Nano, Karim and a team of researchers introduced a specialized two-dimensional thin-film dielectric, or electric insulator. The film, which does not conduct electricity, could be used to replace traditional, heat-generating components in integrated circuit chips, the essential hardware powering AI.

The thin-film material aims to reduce the significant energy cost and heat produced by the high-performance computing necessary for AI.

Karim and his former doctoral student, Maninderjeet Singh, used Nobel Prize-winning organic framework materials to develop the film. Singh, now a postdoctoral researcher at Columbia University, developed the materials during his doctoral training at UH, along with Devin Shaffer, a UH professor of civil engineering, and doctoral student Erin Schroeder.

Their study shows that dielectrics with high permittivity (high-k) store more electrical energy and dissipate more of it as heat than low-k materials do. Karim therefore focused on low-k materials made from light elements, like carbon, that would allow chips to run cooler and faster.
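
As rough background (these are textbook dielectric relations, not formulas from the ACS Nano paper), both the energy a dielectric stores and the heat it dissipates under an alternating field scale with the relative permittivity k:

$$u = \tfrac{1}{2}\,\varepsilon_0\, k\, E^2, \qquad p_{\text{loss}} = \omega\, \varepsilon_0\, k \tan\delta\; E_{\text{rms}}^2,$$

where ε₀ is the vacuum permittivity, ω the angular frequency of the field and tan δ the loss tangent. Lowering k at a comparable loss tangent therefore cuts resistive heating, which is the intuition behind the team's focus on low-k films.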

The team then created new materials with carbon and other light elements, forming covalently bonded, sheetlike films with highly porous crystalline structures using a process known as synthetic interfacial polymerization. They then studied the films' electronic properties and applications in devices.

According to the report, the film was suitable for high-voltage, high-power devices while maintaining thermal stability at elevated operating temperatures.

“These next-generation materials are expected to boost the performance of AI and conventional electronics devices significantly,” Singh added in the release.

Houston to become 'global leader in brain health' and more innovation news

Top Topics

Editor's note: The most-read Houston innovation news this month is centered around brain health, from the launch of Project Metis to Rice's new Amyloid Mechanism and Disease Center. Here are the five most popular InnovationMap stories from December 1-15, 2025:

1. Houston institutions launch Project Metis to position region as global leader in brain health

The Rice Brain Institute, UTMB's Moody Brain Health Institute and Memorial Hermann’s comprehensive neurology care department will lead Project Metis. Photo via Unsplash.

Leaders in Houston's health care and innovation sectors have joined the Center for Houston’s Future to launch an initiative that aims to make the Greater Houston Area "the global leader of brain health." The multi-year Project Metis, named after the Greek goddess of wisdom and deep thought, will be led by the newly formed Rice Brain Institute, The University of Texas Medical Branch's Moody Brain Health Institute and Memorial Hermann’s comprehensive neurology care department. The initiative comes on the heels of Texas voters overwhelmingly approving a ballot measure to launch the $3 billion, state-funded Dementia Prevention and Research Institute of Texas (DPRIT). Continue reading.

2. Rice University researchers unveil new model that could sharpen MRI scans

New findings from a team of Rice University researchers could enhance MRI clarity. Photo via Unsplash.

Researchers at Rice University, in collaboration with Oak Ridge National Laboratory, have developed a new model that could lead to sharper imaging and safer diagnostics using magnetic resonance imaging, or MRI. In a study published in The Journal of Chemical Physics, the team of researchers showed how they used the Fokker-Planck equation to better understand how water molecules respond to contrast agents in a process known as “relaxation.” Continue reading.
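
For reference, the Fokker-Planck equation describes how a probability distribution drifts and diffuses over time. In its generic one-dimensional textbook form (not necessarily the exact formulation used in the study),

$$\frac{\partial p(x,t)}{\partial t} = -\frac{\partial}{\partial x}\big[\mu(x)\,p(x,t)\big] + \frac{\partial^2}{\partial x^2}\big[D(x)\,p(x,t)\big],$$

where μ is the drift term and D the diffusion coefficient; here the distribution of interest concerns the motion of water molecules near a contrast agent.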

3. Rice University launches new center to study roots of Alzheimer’s and Parkinson’s

The new Amyloid Mechanism and Disease Center will serve as the neuroscience branch of Rice’s Brain Institute. Photo via Unsplash.

Rice University has launched its new Amyloid Mechanism and Disease Center, which aims to uncover the molecular origins of Alzheimer’s, Parkinson’s and other amyloid-related diseases. The center will bring together Rice faculty in chemistry, biophysics, cell biology and biochemistry to study how protein aggregates called amyloids form, spread and harm brain cells. It will serve as the neuroscience branch of the Rice Brain Institute, which was also recently established. Continue reading.

4. Baylor center receives $10M NIH grant to continue rare disease research

BCM's Center for Precision Medicine Models has received funding that will allow it to study more complex diseases. Photo via Getty Images

Baylor College of Medicine’s Center for Precision Medicine Models has received a $10 million, five-year grant from the National Institutes of Health that will allow it to continue its work studying rare genetic diseases. The Center for Precision Medicine Models creates customized cell, fly and mouse models that mimic specific genetic variations found in patients, helping scientists to better understand how genetic changes cause disease and explore potential treatments. Continue reading.

5. Luxury transportation startup connects Houston with Austin and San Antonio

Shutto is a new option for Houston commuters. Photo courtesy of Shutto

Houston business and leisure travelers have a luxe new way to hop between Texas cities. Transportation startup Shutto has launched luxury van service connecting San Antonio, Austin, and Houston, offering travelers a comfortable alternative to flying or long-haul rideshare. Continue reading.

Texas falls to bottom of national list for AI-related job openings

jobs report

For all the hoopla over AI in the American workforce, Texas has a smaller share of AI-related job openings than every state except Pennsylvania and Florida.

A study by Unit4, a provider of cloud-based enterprise resource planning (ERP) software for businesses, puts Texas at No. 48 among the states with the highest share of AI-focused jobs. Just 9.39 percent of Texas job postings examined by Unit4 mentioned AI.

Behind Texas are No. 49 Pennsylvania (9.24 percent of jobs related to AI) and No. 50 Florida (9.04 percent). One spot ahead of Texas, at No. 47, is California (9.56 percent).

Unit4 notes that Texas’ and Florida’s low rankings show “AI hiring concentration isn’t necessarily tied to population size or GDP.”

“For years, California, Texas, and New York dominated tech hiring, but that’s changing fast. High living costs, remote work culture, and the democratization of AI tools mean smaller states can now compete,” Unit4 spokesperson Mark Baars said in a release.

The No. 1 state is Wyoming, where 20.38 percent of job openings were related to AI. The Cowboy State was followed by Vermont at No. 2 (20.34 percent) and Rhode Island at No. 3 (19.74 percent).

“A company in Wyoming can hire an AI engineer from anywhere, and startups in Vermont can build powerful AI systems without being based in Silicon Valley,” Baars added.

The study analyzed LinkedIn job postings across all 50 states to determine which ones were leading in AI employment. Unit4 came up with the percentages by dividing the number of AI-related job postings in each state by the state's total job postings.
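
In other words, each state's figure is a simple share, as in this sketch (the numbers below are illustrative placeholders, not Unit4's actual counts):

```python
# AI share of a state's job postings; illustrative placeholder numbers.
ai_postings = 9_390        # hypothetical AI-related postings in a state
total_postings = 100_000   # hypothetical total postings in that state
ai_share = 100 * ai_postings / total_postings
print(f"AI share of postings: {ai_share:.2f}%")  # AI share of postings: 9.39%
```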

Experts suggest that while states like Texas, California and Florida “have a vast number of total job postings, the sheer volume of non-AI jobs dilutes their AI concentration ratio,” according to Unit4. “Moreover, many major tech firms headquartered in California are outsourcing AI roles to smaller, more affordable markets, creating a redistribution of AI employment opportunities.”