How you can use your data to improve your marketing efforts. Photo via Getty Images

For business-to-business (B2B) companies focused on revenue growth, analyzing data to develop and optimize strategies is one of the biggest factors in sales and marketing success. However, the process of evaluating B2B data differs significantly from that of B2C, or business-to-consumer, data. B2C analysis is often straightforward, focusing on consumer behavior and e-commerce transactions.

Unlike B2C, where customers can make a quick purchase decision with a simple click, the B2B customer journey involves multiple touchpoints and extensive research. B2B buyers typically discover a company through an ad or a referral, then navigate websites, interact with salespeople, and explore different resources before finally making a purchasing decision, often with a committee giving input.

Because a B2B customer journey through the sales pipeline is more indirect, these businesses need to take a more nuanced approach to acquiring and making sense of data.

The expectations of B2B vs. B2C

It can be tempting to apply the same methods of analysis to B2C and B2B data. However, B2B decision-making requires more consideration. Decisions involving enterprise software or other significant investments in business products or services are very different from a typical consumer purchase.

B2C marketing emphasizes metrics like conversion rates, click-through rates, and immediate sales. In contrast, B2B marketing success also includes metrics like lead quality, customer lifetime value, and ROI. Understanding the differences helps prevent unrealistic expectations and misinterpretations of data.

Data differences with B2B

While B2C data analysis often revolves around website analytics and foot traffic in brick-and-mortar stores, B2B data analysis involves multiple sources. Referrals play a vital role in B2B, as buyers often seek recommendations from industry peers or companies similar to theirs.

Data segmentation in B2B focuses more on job title and job function rather than demographic data. Targeting different audiences within the same company based on their roles — and highlighting specific aspects of products or services that resonate with those different decision-makers — can significantly impact a purchase decision.

The B2B sales cycle is longer because purchases typically involve the input of a salesperson to help buyers with education and comparison. This allows teams to implement account-based marketing and creates more engagement, which increases the chances of moving prospects down the sales funnel.

Enhancing data capture in B2B analysis

Many middle-market companies rely heavily on individual knowledge and experience rather than formal data management systems. As the sales and marketing landscape has become more digital, businesses must evolve with it. Sales professionals can leave, but a company must retain its knowledge of buyers and potential buyers. CRM systems not only collect data; they also preserve the history of customer relationships.

Businesses need to capture data at all the various touchpoints, including lead generation, prospect qualification, customer interactions, and order fulfillment. Regular analysis helps keep that data accurate. The key is to derive actionable insights from it.

B2B data integration challenges

Integrating various data sources in B2B data analysis used to be much more difficult. With the advent of business intelligence software such as Tableau and Power BI, data analysis is far more accessible and requires a less significant investment. Businesses still need resources and expertise, however, to use these tools effectively.

CRM and ERP systems store a wealth of data, including contact details, interactions, and purchase history. Marketing automation platforms capture additional information from website forms, social media, and email campaigns. Because of these multiple sources, connecting data points and cleansing the data is a necessary step in the process.
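To make that cleansing step concrete, here is a minimal sketch, assuming hypothetical CSV exports from a CRM and a marketing automation platform, of how records from the two systems might be connected and deduplicated. The file and column names are illustrative placeholders, not any particular vendor's format.

```python
# Minimal sketch: connecting and cleansing contact data from two sources.
# File names and column names are hypothetical placeholders.
import pandas as pd

crm = pd.read_csv("crm_contacts.csv")           # e.g., name, email, company
marketing = pd.read_csv("marketing_leads.csv")  # e.g., email, campaign, form_submitted_at

# Normalize the join key so "Jane@Acme.com " and "jane@acme.com" match.
for df in (crm, marketing):
    df["email"] = df["email"].str.strip().str.lower()

# Keep one row per lead, preferring the most recent form submission.
marketing = (marketing
             .sort_values("form_submitted_at")
             .drop_duplicates(subset="email", keep="last"))

# Connect the data points: one unified row per contact across both systems.
unified = crm.merge(marketing, on="email", how="outer", indicator=True)

# Records that appear in only one system are candidates for follow-up cleansing.
gaps = unified[unified["_merge"] != "both"]
print(f"{len(gaps)} contacts appear in only one system")
```

Normalizing the join key before merging is what keeps the same buyer from showing up as two different contacts.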

When analyzing B2B data for account-based marketing (ABM) purposes, there are some unique considerations to keep in mind. Industries like healthcare and financial services, for instance, have specific regulations that dictate how a business can use customer data.

Leveraging B2B data analysis for growth

B2B data analysis is the foundation for any sales and marketing strategy. Collecting and using data from multiple sources allows revenue teams to uncover gaps, trends, and opportunities for continued growth.

Acknowledging what’s different about B2B data and tracking all of the customer journey touchpoints is important as a business identifies a target market, develops an ideal customer profile, and monitors its competitors. Insights from data can also reveal gaps in the sales pipeline, feed predictive analytics for demand forecasting, and inform pricing strategies.

This comprehensive approach gives B2B companies the tools they need to make informed decisions, accelerate their sales and marketing efforts, and achieve long-term growth in a competitive market.

------

Libby Covington is a Partner with Craig Group, a technology-enabled sales and marketing advisory firm specializing in revenue growth for middle-market, private-equity-backed portfolio companies.

Every situation is unique and deserves a one-of-a-kind data management plan, not a one-size-fits-all solution. Graphic by Miguel Tovar/University of Houston

Houston research: Why you need a data management plan

Houston voices

Why do you need a data management plan? It mitigates error, increases research integrity and allows your research to be replicated – despite the “replication crisis” that the research enterprise has been wrestling with for some time.

Error

There are many horror stories of researchers losing their data. You can just plain lose your laptop or an external hard drive. Sometimes they are confiscated if you are traveling to another country — and you may not get them back. Some errors are more nuanced. For instance, a COVID-19 repository of contact-traced individuals was missing 16,000 results because Excel caps how many rows a single spreadsheet can hold.

Do you think a hard drive is the best repository? Keep in mind that 20 percent of hard drives fail within the first four years. Some researchers merely email their data back and forth and feel like it is “secure” in their inbox.

The margins for human and machine error are wide. Continually backing up your results, while good practice, can’t ensure that you won’t lose invaluable research material.

Repositories

According to Reid Boehm, Ph.D., Research Data Management Librarian at the University of Houston Libraries, your best bet is to utilize research data repositories. “The systems and the administrators are focused on file integrity and preservation actions to mitigate loss and they often employ specific metadata fields and documentation with the content,” Boehm says of the repositories. “They usually provide a digital object identifier or other unique ID for a persistent record and access point to these data. It’s just so much less time and worry.”

Integrity

Losing data or being hacked can challenge data integrity. Data breaches not only compromise research integrity; they can also be extremely expensive. According to Security Intelligence, the global average cost of a data breach in a 2019 study was $3.92 million, a 1.5 percent increase from the previous year’s study.

Sample size — how large or small a study was — is another example of how data integrity can affect a study. Retraction Watch reports that approximately 1,500 articles are retracted annually from prestigious journals for “sloppy science.” One of the main reasons papers end up being retracted is that the sample size was too small to be a representative group.

Replication

Another metric for measuring data integrity is whether or not the experiment can be replicated. The ability to recreate an experiment is paramount to the scientific enterprise. In a Nature article entitled “1,500 scientists lift the lid on reproducibility,” 73 percent of researchers surveyed said they think at least half of published papers can be trusted, with physicists and chemists generally showing the most confidence.

However, according to Kelsey Piper at Vox, “an attempt to replicate studies from top journals Nature and Science found that 13 of the 21 results looked at could be reproduced.”

That's so meta

The archivist Jason Scott said, “Metadata is a love note to the future.” Learning how to keep data about data is a critical part of reproducing an experiment.

“While this will always be determined by a combination of project specifics and disciplinary considerations, descriptive metadata should include as much information about the process as possible,” said Boehm. Details of workflows, any standard operating procedures, parameters of measurement, clear definitions of variables, code and software specifications and versions, and many other signifiers ensure the data will be of use to colleagues in the future.

In other words, making data accessible, usable, and reproducible is of the utmost importance. You make reproducing experiments that much easier if you do a good job of capturing metadata in a consistent way.
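As a rough illustration of what consistent metadata capture can look like in practice, the sketch below writes a small machine-readable record beside a dataset. The field names are illustrative assumptions drawn from Boehm's list, not a formal metadata standard.

```python
# Minimal sketch: a metadata "sidecar" file saved alongside a dataset.
# Field names are illustrative, not a formal standard such as Dublin Core.
import json

metadata = {
    "title": "Example assay results, batch 12",
    "variables": {
        "od600": "optical density at 600 nm, unitless",
        "temp_c": "incubation temperature, degrees Celsius",
    },
    "workflow": "samples processed per standard operating procedure, rev 3",
    "software": {"python": "3.11", "pandas": "2.2"},
    "collected": "2024-05-14",
}

# Write the record next to the data file so the two travel together.
with open("assay_batch12.metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```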

The Big Idea

A data management plan includes storage, curation, archiving, and dissemination of research data. Your university’s digital librarian is an invaluable resource. They can answer other tricky questions as well, such as: Who does data belong to? And when a post-doctoral student in your lab leaves the institution, can they take their data with them? Every situation is unique and deserves a one-of-a-kind data management plan, not a one-size-fits-all solution.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

Here's your university research data management checklist. Graphic by Miguel Tovar/University of Houston

Tips for optimizing data management in research, from a UH expert

Houston voices

A data management plan is invaluable to researchers and to their universities. "You should plan at the outset for managing output long-term," said Reid Boehm, research data management librarian at University of Houston Libraries.

At the University of Houston, research data generated by individuals pursuing research studies as faculty, staff, or students are to be retained by the institution for three years after submission of the final report. That means there is a lot of data to be managed. But researchers are in luck – there are many resources to help navigate these issues.

Take inventory

Is your data:

  • Active (constantly changing) or Inactive (static)
  • Open (public) or Proprietary (for monetary gain)
  • Non-identifiable (no human subjects) or Sensitive (containing personal information)
  • Preservable (to save long term) or To discard in 3 years (not for keeping)
  • Shareable (ready for reuse) or Private (not able to be shared)

The more you understand the kind of data you are generating, the easier this step, and the next steps, will be.

Check first

When you are ready to write your plan, the first thing to determine is whether your funders or your university have data management plan policies and guidelines. The University of Houston, for instance, does.

It is also important to distinguish between types of planning documents. For example:

A Data Management Plan (DMP) is a comprehensive, formal document that describes how you will handle your data during the course of your research and at the conclusion of your study or project.

In some instances, funders or institutions may instead require a more targeted plan, such as a Data Sharing Plan (DSP), which describes how you plan to disseminate your data at the conclusion of a research project.

Consistent questions that DMPs ask include:

  • What is generated?
  • How is it securely handled?
  • How is it maintained and accessed long-term?

However it's worded, data is critical to every scientific study.

Pre-proposal

Pre-proposal planning resources and support at UH Libraries include a consultation with Boehm. "Each situation is unique and in my role I function as an advocate for researchers to talk through the contextual details, in connection with funder and institutional requirements," stated Boehm. "There are a lot of aspects of data management and dissemination that can be made less complex and more functional long term with a bit of focused planning at the beginning."

When you get started writing, visit the Data Management Plan Tool. This platform helps by providing agency-specific templates and guidance, working with your institutional login and allowing you to submit plans for feedback.

Post-project

Post-project resources and support involve the archiving, curation, and sharing of information. The UH Data Repository archives, preserves, and helps to disseminate your data. The repository, the data portion of the institutional repository Cougar ROAR, is open access, free to all UH researchers, provides data sets with a digital object identifier, and allows up to 10 GB per project. Most federal funding agencies (NSF, NASA, USGS, and EPA among them) already require this type of documentation, and the NIH will require DMPs by 2023.

Start out strong

Remember, although documentation is due at the beginning of a project/grant proposal, sustained adherence to the plan and related policies is a necessity. We may be distanced socially, but our need to come together around research integrity remains constant. Starting early, getting connected to resources, and sharing as you can through avenues like the data repository are ways to strengthen ourselves and our work.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.


Houston manufacturer names location of its $193.7 million facility

Coming soon

MetOx International Inc., a Houston-based manufacturer of high-temperature superconducting wire, will build a major production facility in Chatham County, North Carolina. The facility is expected to create 333 jobs and bring a $193.7 million investment to the state.

MetOx is a leader in high-temperature superconducting (HTS) technology, an advanced power delivery technology capable of transmitting extremely high power at low voltage with zero heat generation or energy loss. The technology supports energy sector applications such as power transmission, distribution, and grid expansion.

“Establishing our new large-scale manufacturing facility in Chatham County is a pivotal step toward securing a reliable, domestic supply of HTS wire for the development of critical infrastructure in the United States,” Bud Vos, CEO of MetOx, says in a news release. “This facility will not only deliver transformative energy technologies that strengthen our grid and reduce carbon emissions but also create high-paying manufacturing jobs in a community eager to lead in innovation. We are proud to partner with North Carolina to drive forward a resilient energy future built on cutting-edge science and strong local collaboration.”

The new facility is funded in part by an $80 million investment from the United States Department of Energy, which the company announced in October. In September, the company closed a $25 million Series B extension round.

In late 2024, MetOx also announced that it received an undisclosed investment from Hawaii-based Elemental Impact, a leading climate-focused investment platform. As a national implementation partner for the EPA's $27 billion Greenhouse Gas Reduction Fund, Elemental Impact has received $100 million to deploy later-stage commercialized technologies, according to the company.

The funding is expected to advance the expansion of MetOx’s Houston production line and the deployment of its HTS wire, which can make transmission cables up to ten times more efficient than traditional copper cables and will be used at the North Carolina facility.

“Building domestic manufacturing capacity for critical grid technologies is essential for America’s energy future,” Danya Hakeem, vice president of Portfolio at Elemental Impact, says in a news release. “MetOx’s expansion in Houston demonstrates how we can simultaneously advance grid modernization and create quality manufacturing jobs. Their technology represents exactly the kind of innovation needed to unlock the next wave of clean energy deployment.”

The project in North Carolina will be facilitated with a Job Development Investment Grant formally awarded to a new company being created by MetOx. Over the 12-year term of the grant, economists in the Department of Commerce estimate that the project will grow North Carolina’s economy by $987.8 million.

------

This article originally was published on our sister site, EnergyCapital.

Houston Nobel Prize nominee earns latest award for public health research

Prized Research

Houston vaccine scientist Dr. Peter Hotez can add one more prize to his shelf.

Hotez — dean of the National School of Tropical Medicine and professor of Pediatrics and Molecular Virology & Microbiology at Baylor College of Medicine, co-director of the Texas Children’s Center for Vaccine Development (CVD) and Texas Children’s Hospital Endowed Chair of Tropical Pediatrics — is no stranger to impressive laurels. In 2022, he was even nominated for a Nobel Peace Prize for his low-cost COVID vaccine.

His first big win of 2025 is this year’s Hill Prize, awarded by the Texas Academy of Medicine, Engineering, Science and Technology (TAMEST).

Hotez and his team were selected to receive $500,000 from Lyda Hill Philanthropies to help fund The Texas Virosphere Project, an effort to create a predictive disease atlas for climate disasters. Because the climate crisis has ushered in changes to the distribution of diseases, including dengue, chikungunya, Zika, Chagas disease, typhus, and tick-borne relapsing fever, it’s important to predict outbreaks before they become a menace.

Rice University researchers are collaborating with Hotez and his team on a project that combines climate science and metagenomics to access 3,000 insect genomes. The goal is to aid health departments in controlling disease and informing policy.

The Hill Prize, awarded to six innovators for the first time thanks to a $10 million commitment from the philanthropic organization, is intended to back high-risk, high-reward ideas. Each of the projects was chosen for its potential real-life impact on some of Texas's — and the world’s — most challenging situations. Hotez’s prize is the first Hill Prize given in the realm of public health. The additional winners are:

  • Hill Prize in Medicine: Kenneth M. Hargreaves, D.D.S., Ph.D., The University of Texas Health Science Center at San Antonio
  • Hill Prize in Engineering: Joan Frances Brennecke, Ph.D. (NAE), The University of Texas at Austin
  • Hill Prize in Biological Sciences: David J. Mangelsdorf, Ph.D. (NAM, NAS), UT Southwestern Medical Center
  • Hill Prize in Physical Sciences: James Chelikowsky, Ph.D., The University of Texas at Austin
  • Hill Prize in Technology: Robert De Lorenzo, M.D., EmergenceMed, LLC

How Houston's cost of living compares to other major Texas cities in 2025

Calculating Costs

A new cost-of-living index yields a result that many Houstonians will find surprising: Houston is not the most expensive place to live in Texas. Dallas and Austin are costlier.

Numbeo’s cost-of-living index for 2025 shows Dallas ranks first in Texas and 24th in North America, landing at 65.8. The cost-of-living index compares the cost of living in New York City (which sits at 100) with the cost of living in another city. Austin is at 61.7, Houston at 60.6, and San Antonio at 58.8.

Houston ranks 40th overall in North America, out of 52 cities in the index.

Numbeo’s cost-of-living index takes into account the cost of items like groceries, restaurant meals, transportation, and utilities. The index excludes rent.

When rent is added to the cost-of-living index, Houston is still third among Texas cities. Dallas grabs the No. 21 spot in North America (57.1), one notch above Austin (56.6). Houston ranks 35th (51.4), and San Antonio ranks 42nd (34.6).

Rent index
While Dallas holds the top Texas spot on Numbeo’s overall cost-of-living index, Austin faces the highest rent prices. Numbeo's rent index for Austin sits at 50.1, putting it in 12th place among major cities in North America and highest in Texas, above the indexes for Dallas, Houston, and San Antonio. Houston lands at 27th.

The rent index in New York City, which tops the list, is 100. As Numbeo explains, the rent index estimates the cost of renting an apartment in a city compared with New York City. If the rent index is 50, for example, this suggests the average rent in that city is 50 percent below the average rent in New York City.

Around Texas, the rent index is:

  • 46.2 in Dallas
  • 39.8 in Houston
  • 34.6 in San Antonio
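As a quick illustration of how these index values translate into plain language, the short sketch below uses the figures from this article; an index of 39.8, for instance, implies average rent roughly 60 percent below New York City.

```python
# Quick illustration: converting Numbeo-style rent indexes (NYC = 100)
# into plain-language comparisons. Figures are taken from this article.
rent_index = {"Austin": 50.1, "Dallas": 46.2, "Houston": 39.8, "San Antonio": 34.6}

for city, idx in rent_index.items():
    print(f"{city}: average rent roughly {100 - idx:.1f}% below New York City")
```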

Restaurant index
In contrast to its showing on the rent and cost-of-living indexes, Houston outranks Dallas, Austin, and San Antonio on Numbeo’s restaurant index. This index compares the prices of meals and drinks at restaurants and bars to those in New York City.

Houston sits at No. 25 on the restaurant index, at 68.9. Dallas comes in at No. 32 (67.1), Austin at No. 34 (66.6), and San Antonio at No. 36 (65.2).

The National Restaurant Association reported in December that menu prices in the U.S. had risen 3.6 percent in the past 12 months, outpacing gains in grocery prices and the federal government’s overall Consumer Price Index. Fortunately for diners, that was the smallest 12-month increase in menu prices since August 2020, according to the association.

Toast, which provides a cloud-based restaurant management system, says the higher menu prices reflect higher food prices.

“Food prices have been increasing due to inflation, labor expenses, fuel costs, and supply chain disruptions, all of which impact restaurant profitability,” Toast says. “While raising menu prices is one option to combat rising food costs, some restaurants have introduced service charges and simplified menus to avoid passing all costs onto customers.”

---

This story originally appeared on our sister site, CultureMap.com.