By accounting for both known and unknown factors, managers can identify salespeople with traits that work best in different types of sales. Getty Images

When you're a manager, decisions barrage you each day. What product works? Which store layout entices? How will you balance the budget? Many of these decisions ultimately hinge on one factor: the skills of your sales force.

Often, when managers evaluate their salespeople, they contend with invisible factors that may not show up in commissions or name-tagged sales rosters — intangibles such as product placement, season or simply a store's surrounding population. This makes it hard to fully evaluate a salesperson, or to spot which workers can teach valuable skills to their peers and improve the whole team.

But what if you could plug a few variables into a statistical model to spot your best sellers? You could then ask the star salespeople to teach coworkers some of their secrets. New research by Rice Business professor Wagner A. Kamakura and colleague Danny P. Claro of Brazil's Insper Education and Research Institute offers a technique for doing this. Blending statistical methods that incorporate both known and unknown factors, Kamakura and Claro developed a practical tool that, for the first time, allows managers to identify staffers with key hidden skills.

To test their model, the researchers analyzed store data from 35 cosmetic and healthcare retail franchises in four South American markets. These stores were ideal for testing the model because their salespeople were individually responsible for each transaction, from the moment a customer entered the store to the time of purchase. The salespeople were also required to have detailed knowledge of products throughout each store.

Breaking down the product lines into 11 specific categories, and accounting for predictors such as commission, product display, time of year and market potential, Kamakura and Claro documented and compared each salesperson's performance across products and over time.
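To make the approach concrete, here is a minimal sketch of the general technique: a mixed-effects regression that separates known predictors from hidden individual skill. It is illustrative only, not Kamakura and Claro's exact estimator, and the file and column names (sales_transactions.csv, units_sold, commission_rate, display_score, quarter, market_potential, salesperson, category) are hypothetical.

```python
# Illustrative sketch only: a mixed-effects regression in the spirit of
# the approach described above, not Kamakura and Claro's exact model.
# All file and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

sales = pd.read_csv("sales_transactions.csv")  # hypothetical transaction table

# A random intercept for each salesperson-category pair captures hidden,
# individual skill; the fixed effects capture the known predictors.
sales["rep_cat"] = sales["salesperson"] + ":" + sales["category"]
model = smf.mixedlm(
    "units_sold ~ commission_rate + display_score + C(quarter) + market_potential",
    data=sales,
    groups=sales["rep_cat"],
)
result = model.fit()

# The estimated random effects act as 'hidden skill' scores: how much each
# salesperson over- or under-performs in each category once the known
# factors are accounted for.
skill = pd.Series({k: v.iloc[0] for k, v in result.random_effects.items()})
print(skill.sort_values(ascending=False).head(10))  # candidate coaches
```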

They then organized members of the salesforce by strengths and weaknesses, spotlighting those workers who used best practices in a certain area and those who might benefit from that savvy. The resulting insight allowed managers to name team members as either growth advisors or learners. Thanks to the model's detail, Kamakura and Claro note, managers can spot a salesperson who excels in one category but has room to learn, rather than seeing that worker averaged into a single, middle-of-the-pack ranking.

If a salesperson is, for example, a sales savant but lags in customer service, managers can use that insight to help the worker improve individually, while at the same time strategizing for the store's overall success. Put into practice, the model also allows managers to identify team members who excel at selling one specific product category — and encourage them to share their secrets and methods with coworkers.

It might seem that teaching one employee to sell one more set of earbuds or one more lawn chair makes little difference. But applied consistently over time, such personalized product-specific improvement can change the face of a salesforce — and in the end, a whole business. A good manager uses all the tools available. Kamakura and Claro's model makes it possible for every employee on a sales team to be a potential coach for the rest.

------

This story originally ran on Rice Business Wisdom.

Based on research from Wagner A. Kamakura, the Jesse H. Jones Professor of Marketing at Jones Graduate School of Business at Rice University.

Keeping track of trends is crucial to growing and developing a relationship with your customers, these Rice University researchers found. Getty Images

Rice researcher delves into the importance of trendspotting in consumer behavior

Houston voices

Every business wants to read consumers' minds: what they love, what they hate. Even more, businesses long to know about mass trends before they're visible to the naked eye.

In the past, analysts searching for trends needed to pore over a vast range of sources for marketplace indicators. The internet and social media have changed that: marketers now have access to an avalanche of real-time indicators, laden with details about the wishes hidden within customers' hearts and minds. With services such as Trendistic (which tracks individual Twitter terms), Google Insights for Search and BlogPulse, modern marketers are even privy to the real-time conversations surrounding consumers' desires.

Now, imagine being able to analyze all this data across long stretches of time – then distilling it so well that you could identify marketing trends quickly, accurately and quantitatively.

Rice Business professor Wagner A. Kamakura and Rex Y. Du of the University of Houston set out to create a model that makes this possible. Because trendspotting, whether quantitative or qualitative, is an exploratory endeavor, Kamakura notes, it can yield results that are broad but inaccurate. To remedy this, Kamakura and Du devised a new model for quickly and accurately refining market data into trend patterns.

Kamakura and Du's model entails five simple steps for analyzing gathered data with a quantitative method. By repeating this refining process tens or hundreds of times, then isolating the information into specific seasonal, non-seasonal or dynamic trends, researchers can generate steady trend patterns across time panels.

Here's the process:

  • First, gather individual indicators by assembling data from different sources, with the understanding that the information is interconnected. It's crucial to select the data methodically, rather than making random choices, in order to avoid subjectively preselecting irrelevant indicators and blocking out relevant ones. Done sloppily, this first step can generate misleading information.
  • Distill the data into a few common factors. The raw data might include inaccuracies, which must be filtered out to lower the risk of overreacting to noise or acting on erroneous indicators.
  • Interpret and identify common trends by understanding the causes of spikes or dips in consumer behavior. It's key to separate non-cyclical and cyclical changes, because exterior events such as holidays or weather can alter behavior.
  • Compare your analysis with previously identified trends and other variables to establish their validity and generate insights. Looking at past performance through the filter of new insights can offer managers important guidance.
  • Project the trend lines you've identified using historical tracking data and the modeling framework. These trend lines can then be extrapolated into near-future projections, allowing managers to position themselves proactively, working to reverse unfavorable trends and leverage favorable ones (a code sketch of the full process follows this list).
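For readers who want to see the steps in code, here is a minimal sketch using a standard dynamic factor model from Python's statsmodels library. It is a generic stand-in, not Kamakura and Du's own estimator or the KATE add-in mentioned below, and the input file indicators.csv and its layout are hypothetical.

```python
# Illustrative sketch of the five steps using a standard dynamic factor
# model from statsmodels, not Kamakura and Du's own estimator.
import pandas as pd
from statsmodels.tsa.statespace.dynamic_factor import DynamicFactor

# Step 1: gather related indicators into one panel
# (rows = periods, columns = indicator series); file name is hypothetical.
panel = pd.read_csv("indicators.csv", index_col=0, parse_dates=True)
panel = (panel - panel.mean()) / panel.std()  # standardize each series

# Step 2: distill the many noisy series into a few common factors.
mod = DynamicFactor(panel, k_factors=2, factor_order=2)
res = mod.fit(disp=False)

# Steps 3 and 4: inspect the estimated common factors against known events
# and previously identified trends to interpret and validate them.
common_trends = res.factors.filtered

# Step 5: extrapolate the trend lines into near-future projections.
forecast = res.forecast(steps=6)
print(forecast)
```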

It's important to bear in mind that the indicators used for quantitative trendspotting are prone to random and systematic errors, Kamakura writes. The model he devised, however, can filter out these errors, because they do not persist consistently across different time series. The result: a better ability to identify genuine movements and general trends, free from the influence of seasonal events and from random error.
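As a toy illustration of that filtering idea, a standard seasonal-trend decomposition splits a tracking series into the three pieces described above: underlying trend, seasonal swings and random error. This uses generic STL decomposition rather than the paper's own filter, and search_volume.csv is a hypothetical monthly series.

```python
# Toy illustration: separate genuine trend from seasonal swings and random
# noise with standard STL decomposition (not the paper's specific filter).
import pandas as pd
from statsmodels.tsa.seasonal import STL

# Hypothetical monthly tracking series, e.g. search volume for a keyword.
series = pd.read_csv("search_volume.csv", index_col=0, parse_dates=True).iloc[:, 0]

result = STL(series, period=12).fit()
trend = result.trend        # the genuine movement worth acting on
seasonal = result.seasonal  # recurring calendar effects (holidays, weather)
noise = result.resid        # random error to ignore, not chase
```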

It goes without saying that the information the internet offers in such abundance is inevitably attended by noise. For marketers, this means that without filtering, some trends show spikes for temporary items – mere viral jolts that can skew market research.

Kamakura and Du's model helps sidestep this problem by blending analysis of available historical data across large time panels, while avoiding errors common to more traditional methods. For managers longing to glimpse the next big thing, this analytical model can reveal emerging consumer movements with clarity – just as they're becoming the future.

(For the mathematically inclined, and those comfortable with Excel macros and Add-Ins, who want to try trendspotting on their own tracking data, Kamakura's Analytical Tools for Excel (KATE) can be downloaded for free at http://wak2.web.rice.edu/bio/Kamakura_Analytic_Tools.html.)

------

This article originally appeared on Rice Business Wisdom.

Wagner A. Kamakura is Jesse H. Jones Professor of Marketing at Jones Graduate School of Business at Rice University.


Texas university to lead new FAA tech center focused on drones

taking flight

The Texas A&M University System will run the Federal Aviation Administration’s new Center for Advanced Aviation Technologies, which will focus on innovations like commercial drones.

“Texas is the perfect place for our new Center for Advanced Aviation Technologies,” U.S. Transportation Secretary Sean Duffy said in a release. “From drones delivering your packages to powered lift technologies like air taxis, we are at the cusp of an aviation revolution. The [center] will ensure we make that dream a reality and unleash American innovation safely.”

U.S. Sen. Ted Cruz, a Texas Republican, included creation of the center in the FAA Reauthorization Act of 2024. The center will consist of an airspace laboratory, flight demonstration zones, and testing corridors.

Texas A&M University-Corpus Christi will lead the initiative, testing uncrewed aircraft systems and other advanced technologies. The Corpus Christi campus houses the Autonomy Research Institute, an FAA-designated test site. The new center itself will be located at Texas A&M University-Fort Worth.

The College Station-based Texas A&M system says the center will “bring together” its 19 institutions, along with partners such as the University of North Texas in Denton and Southern Methodist University in University Park.

According to a Department of Transportation news release, the center will play “a pivotal role” in ensuring the safe operation of advanced aviation technologies in public airspace.

The Department of Transportation says it chose the Texas A&M system to manage the new center because of its:

  • Proximity to major international airports and the FAA’s regional headquarters in Fort Worth
  • Existing infrastructure for testing of advanced aviation technologies
  • Strong academic programs and industry partnerships

“I’m confident this new research and testing center will help the private sector create thousands of high-paying jobs and grow the Texas economy through billions in new investments,” Cruz said.

“This is a significant win for Texas that will impact communities across our state,” the senator added, “and I will continue to pursue policies that create new jobs, and ensure the Lone Star State continues to lead the way in innovation and the manufacturing of emerging aviation technologies.”

Texas Republicans are pushing to move NASA headquarters to Houston

space city

Two federal lawmakers from Texas are spearheading a campaign to relocate NASA’s headquarters from Washington, D.C., to the Johnson Space Center in Houston’s Clear Lake area. Houston faces competition on this front, though, as lawmakers from two other states are also vying for this NASA prize.

With NASA’s headquarters lease in D.C. set to end in 2028, U.S. Sen. Ted Cruz, a Texas Republican, and U.S. Rep. Brian Babin, a Republican whose congressional district includes the Johnson Space Center, recently wrote a letter to President Trump touting the Houston area as a prime location for NASA’s headquarters.

“A central location among NASA’s centers and the geographical center of the United States, Houston offers the ideal location for NASA to return to its core mission of space exploration and to do so at a substantially lower operating cost than in Washington, D.C.,” the letter states.

Cruz is chairman of the Senate Committee on Commerce, Science, and Transportation; and Babin is chairman of the House Committee on Science, Space, and Technology. Both committees deal with NASA matters. Twenty-five other federal lawmakers from Texas, all Republicans, signed the letter.

In the letter, legislators maintain that shifting NASA’s headquarters to the Houston area makes sense because “a seismic disconnect between NASA’s headquarters and its missions has opened the door to bureaucratic micromanagement and an erosion of [NASA] centers’ interdependence.”

Founded in 1961, the $1.5 billion, 1,620-acre Johnson Space Center hosts NASA’s mission control and astronaut training operations. More than 12,000 employees work at the 100-building complex.

According to the state comptroller, the center generates an annual economic impact of $4.7 billion for Texas, and directly and indirectly supports more than 52,000 public and private jobs.

In pitching the Johnson Space Center for NASA’s HQ, the letter points out that Texas is home to more than 2,000 aerospace, aviation, and defense-related companies. Among them are Elon Musk’s SpaceX, based in the newly established South Texas town of Starbase; Axiom Space and Intuitive Machines, both based in Houston; and Firefly Aerospace, based in the Austin suburb of Cedar Park.

The letter also notes the recent creation of the Texas Space Commission, which promotes innovation in the space and commercial aerospace sectors.

Furthermore, the letter cites Houston-area assets for NASA such as:

  • A strong business environment.
  • A low level of state government regulation.
  • A cost of living that’s half of what it is in the D.C. area.

“Moving the NASA headquarters to Texas will create more jobs, save taxpayer dollars, and reinvigorate America’s space agency,” the letter says.

Last November, NASA said it was hunting for about 375,000 to 525,000 square feet of office space in the D.C. area to house the agency’s headquarters workforce. About 2,500 people work at the agency’s main offices. NASA’s announcement set off a scramble among three states to lure the agency’s headquarters.

Aside from officials in Texas, politicians in Florida and Ohio are pressing NASA to move its headquarters to their states. Florida and Ohio both host major NASA facilities.

NASA might take a different approach, however. “NASA is weighing closing its headquarters and scattering responsibilities among the states, a move that has the potential to dilute its coordination and influence in Washington,” Politico reported in March.

Meanwhile, Congressional Delegate Eleanor Holmes Norton, a Democrat who represents D.C., introduced legislation in March that would prohibit relocating a federal agency’s headquarters (including NASA’s) away from the D.C. area without permission from Congress.

“Moving federal agencies is not about saving taxpayer money and will degrade the vital services provided to all Americans across the country,” Norton said in a news release. “In the 1990s, the Bureau of Land Management moved its wildfire staff out West, only to move them back when Congress demanded briefings on new wildfires.”

Houston research breakthrough could pave way for next-gen superconductors

Quantum Breakthrough

A study from researchers at Rice University, published in Nature Communications, could lead to future advances in superconductors with the potential to transform energy use.

The study revealed that electrons in strange metals, materials that show unusual electrical resistance and anomalous behavior at low temperatures, become more entangled at a specific tipping point, shedding new light on these materials.

A team led by Rice’s Qimiao Si, the Harry C. and Olga K. Wiess Professor of Physics and Astronomy, used quantum Fisher information (QFI), a concept from quantum metrology, to measure how electron interactions evolve under extreme conditions. The research team also included Rice’s Yuan Fang, Yiming Wang, Mounica Mahankali and Lei Chen along with Haoyu Hu of the Donostia International Physics Center and Silke Paschen of the Vienna University of Technology. Their work showed that the quantum phenomenon of electron entanglement peaks at a quantum critical point, which is the transition between two states of matter.

“Our findings reveal that strange metals exhibit a unique entanglement pattern, which offers a new lens to understand their exotic behavior,” Si said in a news release. “By leveraging quantum information theory, we are uncovering deep quantum correlations that were previously inaccessible.”

The researchers examined a theoretical framework known as the Kondo lattice, which explains how magnetic moments interact with surrounding electrons. At a critical transition point, these interactions intensify to the extent that the quasiparticles—key to understanding electrical behavior—disappear. Using QFI, the team traced this loss of quasiparticles to the growing entanglement of electron spins, which peaks precisely at the quantum critical point.
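To give a flavor of what QFI measures, here is a toy calculation for the simplest case: two qubits in a maximally entangled Bell state. It uses the textbook pure-state formula F_Q = 4(⟨A²⟩ − ⟨A⟩²) and the standard witness that F_Q exceeding the number of qubits certifies entanglement; it is a generic illustration, not the team's Kondo-lattice calculation.

```python
# Toy illustration of quantum Fisher information (QFI) as an entanglement
# witness; not the Kondo-lattice calculation from the paper.
# For a pure state |psi> and generator A: QFI = 4 * (<A^2> - <A>^2).
# For N qubits and a collective spin generator, QFI > N certifies entanglement.
import numpy as np

sz = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli-z
I2 = np.eye(2, dtype=complex)

# Collective spin operator J_z for two qubits.
Jz = (np.kron(sz, I2) + np.kron(I2, sz)) / 2

# Bell state (|00> + |11>) / sqrt(2): maximally entangled.
psi = np.zeros(4, dtype=complex)
psi[0] = psi[3] = 1 / np.sqrt(2)

mean_A = np.real(psi.conj() @ Jz @ psi)
mean_A2 = np.real(psi.conj() @ (Jz @ Jz) @ psi)
qfi = 4 * (mean_A2 - mean_A**2)

print(qfi)  # 4.0 > N = 2 qubits, so the state is entangled
```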

In terms of future use, these materials share a close connection with high-temperature superconductors, which have the potential to transmit electricity without energy loss, according to the researchers. Unlocking their properties, the researchers believe, could revolutionize power grids and make energy transmission more efficient.

The team also found that quantum information tools can be applied to other “exotic materials” and quantum technologies.

“By integrating quantum information science with condensed matter physics, we are pivoting in a new direction in materials research,” Si said in the release.

---

This article originally appeared on our sister site, EnergyCapitalHTX.com.