How you can use your data to improve your marketing efforts. Photo via Getty Images

When focusing on revenue growth in business-to-business companies, analyzing data to develop and optimize strategies is one of the biggest factors in sales and marketing success. However, the process of evaluating B2B data differs significantly from that of B2C, or business-to-consumer. B2C analysis is often straightforward, focusing on consumer behavior and e-commerce transactions.

Unlike B2C, where customers can make a quick purchase decision with a simple click, the B2B customer journey involves multiple touchpoints and extensive research. B2B buyers will most likely discover a company through an ad or a referral, then navigate through websites, interact with salespeople, and explore different resources before finally making a purchasing decision, often with a committee giving input.

Because a B2B customer journey through the sales pipeline is more indirect, these businesses need to take a more nuanced approach to acquiring and making sense of data.

The expectations of B2B vs. B2C

It can be tempting to use the same methods of analysis for B2C and B2B data. However, B2B decision-making requires more consideration. Decisions involving enterprise software or other significant investments in business products or services are very different from a typical consumer purchase.

B2C marketing emphasizes metrics like conversion rates, click-through rates, and immediate sales. In contrast, B2B marketing success also includes metrics like lead quality, customer lifetime value, and ROI. Understanding the differences helps prevent unrealistic expectations and misinterpretations of data.

Data differences with B2B

While B2C data analysis often revolves around website analytics and foot traffic in brick-and-mortar stores, B2B data analysis involves multiple sources. Referrals play a vital role in B2B, as buyers often seek recommendations from industry peers or companies similar to theirs.

Data segmentation in B2B focuses more on job title and job function rather than demographic data. Targeting different audiences within the same company based on their roles — and highlighting specific aspects of products or services that resonate with those different decision-makers — can significantly impact a purchase decision.
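As a rough illustration of role-based segmentation, the sketch below groups a contact list by job function rather than demographics. The contact records, keyword map, and segment names are all hypothetical, and the substring matching is deliberately naive; a real implementation would use whatever title taxonomy the CRM provides.

```python
from collections import defaultdict

# Hypothetical contact records exported from a CRM
contacts = [
    {"name": "A. Chen", "title": "VP of Engineering", "company": "Acme Co"},
    {"name": "B. Patel", "title": "Chief Financial Officer", "company": "Acme Co"},
    {"name": "C. Jones", "title": "IT Manager", "company": "Beta Inc"},
]

# Map keywords in job titles to the message each role tends to care about
ROLE_KEYWORDS = {
    "engineering": "technical",   # product capabilities, integrations
    "it": "technical",
    "financial": "economic",      # ROI, total cost of ownership
    "cfo": "economic",
}

def classify(title: str) -> str:
    """Assign a messaging segment based on keywords in the job title."""
    lowered = title.lower()
    for keyword, segment in ROLE_KEYWORDS.items():
        if keyword in lowered:
            return segment
    return "general"

# Group contacts so each segment can receive role-specific messaging
segments = defaultdict(list)
for contact in contacts:
    segments[classify(contact["title"])].append(contact["name"])

print(dict(segments))
```

The point of the sketch is the shape of the data, not the matching logic: once contacts are bucketed by function, each bucket can be targeted with the aspect of the product that resonates with that decision-maker.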

The B2B sales cycle is longer because purchases typically involve the input of a salesperson to help buyers with education and comparison. This allows teams to implement account-based marketing and creates more engagement, which increases the chances of moving prospects down the sales funnel.

Enhancing data capture in B2B analysis

Many middle-market companies rely heavily on individual knowledge and experience rather than formal data management systems. As the sales and marketing landscape has evolved to be more digital, so must businesses. Sales professionals may leave, but a company must retain its knowledge of buyers and potential buyers. CRM systems not only collect data; they also preserve the history of customer relationships.

Businesses need to capture data at all the various touchpoints, including lead generation, prospect qualification, customer interactions, and order fulfillment. Regular analysis will help with accuracy. The key is to derive actionable insights from the data.

B2B data integration challenges

Integrating various data sources in B2B data analysis used to be much more difficult. With the advent of business intelligence software such as Tableau and Power BI, data analysis is far more accessible and requires a less significant investment. Businesses still need the right resources, however, to use these tools effectively.

CRM and ERP systems store a wealth of data, including contact details, interactions, and purchase history. Marketing automation platforms capture additional information from website forms, social media, and email campaigns. Because of these multiple sources, connecting data points and cleansing the data is a necessary step in the process.
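A minimal sketch of that connect-and-cleanse step might look like the following. The record layouts are hypothetical stand-ins for a CRM export and a marketing-automation export; the only technique shown is normalizing email addresses so the same person matches across systems, then merging fields into one profile per contact.

```python
# Hypothetical exports: a CRM list and a marketing-automation list
crm_records = [
    {"email": "Jane.Doe@Example.com ", "company": "Acme Co", "stage": "qualified"},
    {"email": "sam@beta.io", "company": "Beta Inc", "stage": "prospect"},
]
marketing_records = [
    {"email": "jane.doe@example.com", "last_campaign": "webinar-q3"},
    {"email": "lee@gamma.dev", "last_campaign": "newsletter"},
]

def normalize(email: str) -> str:
    """Lowercase and trim so the same person matches across systems."""
    return email.strip().lower()

# Build one profile per unique email, merging fields from both sources
profiles: dict[str, dict] = {}
for record in crm_records + marketing_records:
    key = normalize(record["email"])
    merged = profiles.setdefault(key, {"email": key})
    for field, value in record.items():
        if field != "email":
            merged[field] = value

print(len(profiles))  # 3 unique contacts after deduplication
```

Real cleansing pipelines go much further (fuzzy company matching, conflict resolution between sources), but even this simple keying step is what turns two disconnected exports into one view of a contact.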

When analyzing B2B data for account-based marketing (ABM) purposes, there are some unique considerations to keep in mind. Industries like healthcare and financial services, for instance, have specific regulations that dictate how a business can use customer data.

Leveraging B2B data analysis for growth

B2B data analysis is the foundation for any sales and marketing strategy. Collecting and using data from multiple sources allows revenue teams to uncover gaps, trends, and opportunities for continued growth.

Acknowledging what’s different about B2B data and tracking all of the customer journey touchpoints is important as a business identifies a target market, develops an ideal customer profile, and monitors its competitors. Data insights can also pinpoint gaps in the sales pipeline, power predictive analytics for demand forecasting, and inform pricing strategies.

This comprehensive approach gives B2B companies the tools they need to make informed decisions, accelerate their sales and marketing efforts, and achieve long-term growth in a competitive market.

------

Libby Covington is a Partner with Craig Group, a technology-enabled sales and marketing advisory firm specializing in revenue growth for middle-market, private-equity-backed portfolio companies.

Every situation is unique and deserves a one-of-a-kind data management plan, not a one-size-fits-all solution. Graphic by Miguel Tovar/University of Houston

Houston research: Why you need a data management plan

Houston voices

Why do you need a data management plan? It mitigates error, increases research integrity and allows your research to be replicated – despite the “replication crisis” that the research enterprise has been wrestling with for some time.

Error

There are many horror stories of researchers losing their data. You can just plain lose your laptop or an external hard drive. Sometimes they are confiscated if you are traveling to another country, and you may not get them back. Some errors are more nuanced. For instance, a COVID-19 repository of contact-traced individuals was missing 16,000 results because Excel worksheets cannot exceed roughly one million rows.

Do you think a hard drive is the best repository? Keep in mind that 20 percent of hard drives fail within the first four years. Some researchers merely email their data back and forth and feel like it is “secure” in their inbox.

The human and machine error margins are wide. Continually backing up your results, while good practice, can’t ensure that you won’t lose invaluable research material.
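One low-effort safeguard against silent corruption, sketched below with Python's standard library, is to record a checksum for each data file so a backup copy can be verified against the original later. The file names here are illustrative; the technique is just chunked SHA-256 hashing.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large datasets don't need to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Write one file, then verify a backup copy against the original's checksum
original = Path("results.csv")
original.write_text("sample,value\na,1\n")
checksum = sha256_of(original)

backup = Path("results_backup.csv")
backup.write_bytes(original.read_bytes())

# A mismatch here would signal corruption in the backup
print(sha256_of(backup) == checksum)
```

Storing the checksum alongside each backup does not prevent loss, but it does let you detect which copy is still trustworthy, which is part of what formal repositories automate for you.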

Repositories

According to Reid Boehm, Ph.D., Research Data Management Librarian at the University of Houston Libraries, your best bet is to utilize research data repositories. “The systems and the administrators are focused on file integrity and preservation actions to mitigate loss and they often employ specific metadata fields and documentation with the content,” Boehm says of the repositories. “They usually provide a digital object identifier or other unique ID for a persistent record and access point to these data. It’s just so much less time and worry.”

Integrity

Losing data or being hacked can challenge data integrity. Data breaches do not only compromise research integrity, they can also be extremely expensive! According to Security Intelligence, the global average cost of a data breach in a 2019 study was $3.92 million. That is a 1.5 percent increase from the previous year’s study.

Sample size, or how large or small a study was, is another example of how data integrity can affect a study. Retraction Watch reports that approximately 1,500 articles are retracted annually from prestigious journals for “sloppy science.” One of the main reasons papers end up being retracted is that the sample size was too small to be a representative group.

Replication

Another metric for measuring data integrity is whether or not the experiment can be replicated. The ability to recreate an experiment is paramount to the scientific enterprise. In a Nature article entitled “1,500 scientists lift the lid on reproducibility,” 73 percent of surveyed researchers “said that they think that at least half of the papers can be trusted, with physicists and chemists generally showing the most confidence.”

However, according to Kelsey Piper at Vox, “an attempt to replicate studies from top journals Nature and Science found that 13 of the 21 results” examined could be reproduced.

That's so meta

The archivist Jason Scott said, “Metadata is a love note to the future.” Learning how to keep data about data is a critical part of reproducing an experiment.

“While this will always be determined by a combination of project specifics and disciplinary considerations, descriptive metadata should include as much information about the process as possible,” said Boehm. Details of workflows, any standard operating procedures and parameters of measurement, clear definitions of variables, code and software specifications and versions, and many other signifiers ensure the data will be of use to colleagues in the future.

In other words, making data accessible, usable and reproducible is of the utmost importance. You make reproducing experiments that much easier if you capture metadata consistently.
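As a rough sketch of "data about data," a sidecar metadata file can travel alongside a dataset. The fields below mirror the kinds of details Boehm describes (workflow, variable definitions, software versions), but every name and value is illustrative, not a standard.

```python
import json
from pathlib import Path

# Illustrative metadata record for a hypothetical dataset
metadata = {
    "dataset": "trial_measurements.csv",
    "collected": "2021-03-15",
    "workflow": "samples processed per lab SOP, then filtered for outliers",
    "variables": {
        "temp_c": "water temperature in degrees Celsius",
        "ph": "acidity on the standard pH scale",
    },
    "software": {"python": "3.11", "pandas": "2.1"},
    "contact": "lab-manager@example.edu",
}

# Store it next to the data file so the two stay together
sidecar = Path("trial_measurements.metadata.json")
sidecar.write_text(json.dumps(metadata, indent=2))

print(json.loads(sidecar.read_text())["variables"]["ph"])
```

Disciplinary metadata standards and repository submission forms formalize this idea; the sketch simply shows that even a plain JSON sidecar captures enough context for a future colleague to interpret the columns.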

The Big Idea

A data management plan includes storage, curation, archiving and dissemination of research data. Your university’s digital librarian is an invaluable resource. They can answer other tricky questions as well, such as: who does data belong to? And when a postdoctoral researcher in your lab leaves the institution, can they take their data with them? Every situation is unique and deserves a one-of-a-kind data management plan, not a one-size-fits-all solution.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

Here's your university research data management checklist. Graphic by Miguel Tovar/University of Houston

Tips for optimizing data management in research, from a UH expert

Houston voices

A data management plan is invaluable to researchers and to their universities. "You should plan at the outset for managing output long-term," said Reid Boehm, research data management librarian at University of Houston Libraries.

At the University of Houston, research data generated while individuals are pursuing research studies as faculty, staff or students of the University of Houston are to be retained by the institution for a period of three years after submission of the final report. That means there is a lot of data to be managed. But researchers are in luck – there are many resources to help navigate these issues.

Take inventory

Is your data:

  • Active (constantly changing) or Inactive (static)
  • Open (public) or Proprietary (for monetary gain)
  • Non-identifiable (no human subjects) or Sensitive (containing personal information)
  • Preservable (to save long term) or To discard in 3 years (not for keeping)
  • Shareable (ready for reuse) or Private (not able to be shared)

The more you understand the kind of data you are generating, the easier this step, and the next steps, will be.

Check first

When you are ready to write your plan, the first thing to determine is whether your funders or the university have data management plan policies and guidelines. For instance, the University of Houston does.

It is also important to distinguish between types of planning documents. For example:

A Data Management Plan (DMP) is a comprehensive, formal document that describes how you will handle your data during the course of your research and at the conclusion of your study or project.

In some instances, funders or institutions may require a more targeted plan, such as a Data Sharing Plan (DSP), which describes how you plan to disseminate your data at the conclusion of a research project.

Consistent questions that DMPs ask include:

  • What is generated?
  • How is it securely handled? and
  • How is it maintained and accessed long-term?

However it's worded, data is critical to every scientific study.

Pre-proposal

Pre-proposal planning resources and support at UH Libraries include a consultation with Boehm. "Each situation is unique and in my role I function as an advocate for researchers to talk through the contextual details, in connection with funder and institutional requirements," stated Boehm. "There are a lot of aspects of data management and dissemination that can be made less complex and more functional long term with a bit of focused planning at the beginning."

When you get started writing, visit the Data Management Plan Tool. This platform helps by providing agency-specific templates and guidance, working with your institutional login and allowing you to submit plans for feedback.

Post-project

Post-project resources and support involve the archiving, curation and sharing of information. The UH Data Repository archives, preserves and helps to disseminate your data. The repository, the data portion of the institutional repository Cougar ROAR, is open access and free to all UH researchers; it provides data sets with a digital object identifier and allows up to 10 GB per project. Most federal funding agencies already require this type of documentation (NSF, NASA, USGS and EPA among them). The NIH will require DMPs by 2023.

Start out strong

Remember, although documentation is due at the beginning of a project/grant proposal, sustained adherence to the plan and related policies is a necessity. We may be distanced socially, but our need to come together around research integrity remains constant. Starting early, getting connected to resources, and sharing as you can through avenues like the data repository are ways to strengthen ourselves and our work.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.


Expert on Houston’s energy advantage: Building affordability, reliability for all

Guest Column

As the energy capital of the world, Houston has been at the forefront of innovation, powering industries and communities for generations. Many Houston families, however, are facing a reality that undermines our leadership: high energy bills and ongoing concerns about grid reliability.

Affordability and reliability are not just technical issues; they’re equity issues. To remain the world leader in energy, we must ensure that every household has access to affordable and dependable power.

Affordability: The First Step Toward Equity

According to a 2025 study by the Texas Energy Poverty Research Institute, nearly 80% of low- to moderate-income Houstonians scaled back on basic needs to cover electric bills. Rising costs mean some Houstonians are forced to choose between paying their utility bill or paying for groceries.

Additionally, Houston now has the highest poverty rate among America’s most populous cities. Energy should not be a privilege for only half of our city’s population. That’s why affordability needs to be at the center of Houston’s energy conversation.

Several practical solutions exist to help address this inequity:

  • We can increase transparency in electricity pricing and help families better understand their electricity facts labels to make smarter choices.
  • We can expand energy efficiency programs, like weatherizing homes and apartments, swapping out old light bulbs for LEDs, and adopting smart thermostats.
  • Incentives to help families invest in these changes can deliver long-term benefits for both them and apartment complex owners.

Many small changes, when combined, can add up to significant savings for families while reducing overall demand on the grid.

Reliability: A Shared Community Priority

The memories of Hurricane Beryl, the 2024 derecho, and Winter Storm Uri are still fresh in the minds of Texans. We saw firsthand the fragility of our grid and how devastating outages are to families, especially those without resources to handle extreme weather. Grid reliability is an issue of public health, economic stability, and community safety.

Houston has an opportunity to lead by embracing innovation. Grid modernization, from deploying microgrids to expanding battery storage, can provide stability when the system is under stress. Partnerships between utilities, businesses, and community organizations are key to building resilience. With Houston’s innovation ecosystem, we can pilot solutions here that other regions will look to replicate.

Energy Equity in Action

Reliable, affordable energy strengthens equity in tangible ways. When households spend less on utilities, they have more to invest in their children’s education or save for the future. When power is stable, schools remain open, businesses continue to operate, and communities thrive. Extending energy efficiency programs across all neighborhoods creates a fairer, more balanced system, breaking down inequities tied to income and geography.

Studies show that expanding urban green spaces such as community gardens and tree-planting programs can lower neighborhood temperatures, reduce energy use for cooling, and improve air quality in disadvantaged areas, directly reducing household utility burdens.

In Houston, for example, the median energy burden for low-income households is 7.1% of income, more than twice that of the general population, with over 20% of households having energy burdens above 6%.

Research also demonstrates that community solar programs and urban cooling investments deliver clean, affordable power, helping to mitigate heat stress and making them high-impact strategies for energy equity and climate resilience in vulnerable neighborhoods.

Public-Private Partnerships Make the Difference

The solutions to affordability and reliability challenges must come from cross-sector collaboration. For example, CenterPoint Energy offers incentives through its Residential and Hard-to-Reach Programs, which support contractors and community agencies in delivering energy efficiency upgrades, including weatherization, to low-income households in the greater Houston area.

Nonprofits like the Houston Advanced Research Center (HARC) received a $1.9 million Department of Energy grant to lead a weatherization program tailored for underserved communities in Harris County, helping to lower bills and improve housing safety.

Meanwhile, the City of Houston’s Green Office Challenge and Better Buildings Initiative bring private-sector sponsors, nonprofits, and city leadership together to drive energy reductions across millions of square feet of commercial buildings, backed by training and financial incentives. Together, these partnerships can result in real impact that brings more equity and access to affordable energy.

BKV Energy is committed to being part of the solution by promoting practical, consumer-focused strategies that help families save money and use energy more efficiently. We offer a suite of programs designed to provide customers with financial benefits and alleviate the burden of rising electricity bills. Programs like BKV Energy’s demonstrate how utilities can ease financial strain for families while building stronger customer loyalty and trust. Expanding similar initiatives across Houston would not only lower household energy burdens but also set a new standard for how energy companies can invest directly in their communities.

By proactively addressing affordability, energy companies can help ensure that rising costs don’t disproportionately impact vulnerable households. These efforts also contribute to a more resilient and equitable energy future for Houston, where all residents can access reliable power without sacrificing financial stability.

Houston as a Blueprint

Houston has always been a city of leadership and innovation, whether pioneering the space race, driving advancements in medical research at the Texas Medical Center, or anchoring the global energy industry. Today, our challenge is just as urgent: affordability and reliability must become the cornerstones of our energy future. Houston has the expertise and the collaborative spirit to show how it can be done.

By scaling innovative solutions, Houston can make energy more equitable, strengthening our own community while setting a blueprint for the nation. As the energy capital of the world, it is both our responsibility and our opportunity to lead the way to a more equitable future for all.

---

Sam Luna is director at BKV Energy, where he oversees brand and go-to-market strategy, customer experience, marketing execution, and more.

Texas-based energy startup raises $1 billion on heels of Houston expansion

Powering Up

Austin-based startup Base Power, which offers battery-supported energy in the Houston area and other regions, has raised $1 billion in Series C funding, making it one of the largest venture capital deals this year in the U.S.

VC firm Addition led the $1 billion round. All of Base Power’s existing major investors also participated, including Trust Ventures, Valor Equity Partners, Thrive Capital, Lightspeed Venture Partners, Andreessen Horowitz (a16z), Altimeter, StepStone Group, 137 Ventures, Terrain, Waybury Capital, and entrepreneur Elad Gil. New investors include Ribbit Capital, Google-backed CapitalG, Spark Capital, Bond, Lowercarbon Capital, Avenir Growth Capital, Glade Brook Capital Partners, Positive Sum and 1789 Capital Management.

With the new $1 billion round, Base Power has hauled in more than $1.27 billion in funding since it was founded in 2023.

Base Power supplies power to homeowners and the electric grid through a distributed storage network.

“The chance to reinvent our power system comes once in a generation,” Zach Dell, co-founder and CEO of Base Power, said in a news release. “The challenge ahead requires the best engineers and operators to solve it, and we’re scaling the team to make our abundant energy future a reality.”

Zach Dell is the son of Austin billionaire and Houston native Michael Dell, chairman and CEO of Round Rock-based Dell Technologies.

In less than two years, Base Power has developed more than 100 megawatt-hours of battery-enabled storage capacity. One megawatt-hour represents one hour of energy use at a rate of one million watts.

Base Power recently expanded its service to the city of Houston. It was already delivering energy to several other communities in the Houston area. To serve the Houston region, the startup has opened an office in Katy.

The startup also serves the Dallas-Fort Worth and Austin markets. At some point, Base Power plans to launch a nationwide expansion.

To meet current and future demand, Base Power is building its first energy storage and power electronics factory at the former downtown Austin site of the Austin American-Statesman’s printing presses.

“We’re building domestic manufacturing capacity for fixing the grid,” Justin Lopas, co-founder and chief operating officer of Base Power, added in the release. “The only way to add capacity to the grid is [by] physically deploying hardware, and we need to make that here in the U.S. ... This factory in Austin is our first, and we’re already planning for our second.”

---

This article originally appeared on EnergyCapitalHTX.com.

Houston professor awarded $2.6M grant for retina, neurological research

seeing green

University of Houston College of Optometry Professor John O’Brien has received a $2.6 million grant from the National Eye Institute to continue his research on the retina and neurological functions.

O’Brien is considered a leading expert in retinal neuroscience with more than 20 years of research in the field. The new funding will allow O’Brien and his team to continue to study the dense assembly of proteins associated with electrical synapses, or gap junctions, in the retina.

Gap junctions transfer electrical signals between neurons. And the plasticity of gap junctions changes the strength of a synapse, in turn changing how visual information is processed. Previous research has shown that reduced functions of electrical synapses could be linked to autism, while their hyperfunction may lead to seizures.

“The research we propose will significantly advance our understanding of the molecular complexes that control the function of electrical synapses,” O’Brien said in a news release.

The team at UH will work to identify the proteins and examine how they impact electrical synapses. It is particularly interested in the Connexin 36, or Cx36, protein. According to O’Brien, phosphorylation of Cx36, a short-term chemical modification of the protein, serves as a key driver of plasticity. And the protein has been linked to refractive error development, which is one of the largest vision problems in the world today.

Additionally, O’Brien’s research has shown that plasticity is essential for all-day vision, allowing the retina to adjust sensitivity and sharpen images. He has also built a catalog of the core set of proteins surrounding electrical synapses that are conserved across species. His research has been funded by the NEI since 2000.