How you can use your data to improve your marketing efforts. Photo via Getty Images

For business-to-business (B2B) companies focused on revenue growth, analyzing data to develop and optimize strategies is one of the biggest factors in sales and marketing success. However, the process of evaluating B2B data differs significantly from that of business-to-consumer (B2C) data. B2C analysis is often straightforward, focusing on consumer behavior and e-commerce transactions.

Unlike B2C, where customers can make a quick purchase decision with a simple click, the B2B customer journey involves multiple touchpoints and extensive research. B2B buyers typically discover a company through an ad or a referral, then navigate websites, interact with salespeople, and explore resources before finally making a purchase decision, often with input from a buying committee.

Because a B2B customer journey through the sales pipeline is more indirect, these businesses need to take a more nuanced approach to acquiring and making sense of data.

The expectations of B2B vs. B2C

It can be tempting to apply the same methods of analysis to B2C and B2B data. However, B2B decision-making requires more consideration: decisions involving enterprise software or other significant investments in business products or services are very different from a typical consumer purchase.

B2C marketing emphasizes metrics like conversion rates, click-through rates, and immediate sales. In contrast, B2B marketing success is also measured by lead quality, customer lifetime value, and return on investment (ROI). Understanding these differences helps prevent unrealistic expectations and misinterpretations of data.
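To make those metrics concrete, here is a minimal sketch in Python of how a revenue team might estimate customer lifetime value and campaign ROI. The formulas are common simplifications, not a method from the article, and every figure is invented for illustration:

```python
# Minimal sketch of two common B2B marketing metrics; all figures are invented.

def customer_lifetime_value(avg_annual_revenue: float,
                            gross_margin: float,
                            avg_retention_years: float) -> float:
    """Margin-adjusted revenue over the expected retention period."""
    return avg_annual_revenue * gross_margin * avg_retention_years

def marketing_roi(revenue_attributed: float, campaign_cost: float) -> float:
    """Net gain on a campaign, expressed as a ratio of its cost."""
    return (revenue_attributed - campaign_cost) / campaign_cost

clv = customer_lifetime_value(120_000, gross_margin=0.6, avg_retention_years=4)
roi = marketing_roi(revenue_attributed=450_000, campaign_cost=150_000)
print(f"Estimated CLV: ${clv:,.0f}")  # Estimated CLV: $288,000
print(f"Campaign ROI: {roi:.0%}")     # Campaign ROI: 200%
```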

Data differences with B2B

While B2C data analysis often revolves around website analytics and foot traffic in brick-and-mortar stores, B2B data analysis involves multiple sources. Referrals play a vital role in B2B, as buyers often seek recommendations from industry peers or companies similar to theirs.

Data segmentation in B2B focuses more on job title and job function rather than demographic data. Targeting different audiences within the same company based on their roles — and highlighting specific aspects of products or services that resonate with those different decision-makers — can significantly impact a purchase decision.
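As an illustration of that kind of role-based segmentation, here is a minimal sketch in Python with pandas; the contact records and the title-to-function mapping are hypothetical:

```python
import pandas as pd

# Hypothetical CRM export; companies, names, and titles are invented.
contacts = pd.DataFrame([
    {"company": "Acme Corp", "name": "A. Rivera", "job_title": "VP of Engineering"},
    {"company": "Acme Corp", "name": "B. Chen", "job_title": "Chief Financial Officer"},
    {"company": "Initech", "name": "C. Patel", "job_title": "IT Director"},
])

def job_function(title: str) -> str:
    """Map a raw job title to a coarse function used for message targeting."""
    words = set(title.lower().split())
    if words & {"engineering", "engineer", "it", "cto"}:
        return "technical evaluator"
    if words & {"financial", "finance", "cfo"}:
        return "economic buyer"
    return "other"

contacts["function"] = contacts["job_title"].map(job_function)

# Different decision-makers at the same account get different messaging.
for function, group in contacts.groupby("function"):
    print(function, "->", ", ".join(group["name"]))
```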

The B2B sales cycle is longer because purchases typically involve a salesperson who helps buyers with education and comparison. This gives teams room to implement account-based marketing and creates more engagement, which increases the chances of moving prospects down the sales funnel.

Enhancing data capture in B2B analysis

Many middle-market companies rely heavily on individual knowledge and experience rather than formal data management systems. As the sales and marketing landscape has become more digital, businesses must evolve with it. Sales professionals can leave, and the company must retain their knowledge of buyers and potential buyers. CRM systems not only collect data; they also preserve the history of customer relationships.

Businesses need to capture data at all the various touchpoints, including lead generation, prospect qualification, customer interactions, and order fulfillment. Regular analysis helps keep that data accurate. The key is to derive actionable insights from it.
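One way to capture those touchpoints consistently is to record every interaction in a single event shape, whatever the stage. A minimal sketch, with a schema that is purely hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class TouchpointEvent:
    """One interaction, captured the same way at every stage of the pipeline."""
    account: str       # company the contact belongs to
    contact: str       # individual decision-maker
    stage: str         # e.g. "lead_generation", "qualification", "fulfillment"
    channel: str       # e.g. "website_form", "sales_call", "webinar"
    occurred_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

events: list[TouchpointEvent] = []

def record(event: TouchpointEvent) -> None:
    """Append an event; a production system would write to a CRM or warehouse."""
    events.append(event)

record(TouchpointEvent("Acme Corp", "A. Rivera", "qualification", "sales_call"))
print(len(events), "touchpoint(s) captured")
```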

B2B data integration challenges

Integrating various data sources used to be one of the hardest parts of B2B data analysis. With the advent of business intelligence software such as Tableau and Power BI, analysis has become far more accessible with a less significant investment, though businesses still need the resources to use these tools effectively.

CRM and ERP systems store a wealth of data, including contact details, interactions, and purchase history. Marketing automation platforms capture additional information from website forms, social media, and email campaigns. Because of these multiple sources, connecting data points and cleansing the data is a necessary step in the process.
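Here is a minimal sketch of that connect-and-cleanse step, assuming small extracts from a CRM and a marketing automation platform; the field names and records are hypothetical:

```python
import pandas as pd

# Hypothetical extracts: two systems describing the same person differently.
crm = pd.DataFrame([
    {"email": " A.Rivera@Acme.com", "account": "Acme Corp", "last_purchase": "2023-11-02"},
])
marketing = pd.DataFrame([
    {"email": "a.rivera@acme.com", "campaign": "Q4 webinar", "form_fills": 2},
])

# Cleansing: normalize the join key before connecting data points.
for df in (crm, marketing):
    df["email"] = df["email"].str.strip().str.lower()

unified = (
    crm.merge(marketing, on="email", how="outer")
       .drop_duplicates(subset="email")  # keep one row per person
)
print(unified)
```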

When analyzing B2B data for account-based marketing (ABM) purposes, there are some unique considerations to keep in mind. Industries like healthcare and financial services, for instance, have specific regulations that dictate how a business can use customer data.

Leveraging B2B data analysis for growth

B2B data analysis is the foundation for any sales and marketing strategy. Collecting and using data from multiple sources allows revenue teams to uncover gaps, trends, and opportunities for continued growth.

Acknowledging what’s different about B2B data and tracking all of the customer journey touchpoints is important as a business identifies a target market, develops an ideal customer profile, and monitors its competitors. Insights from data can also expose gaps in the sales pipeline, feed predictive analytics for demand forecasting, and inform pricing strategies.

This comprehensive approach gives B2B companies the tools they need to make informed decisions, accelerate their sales and marketing efforts, and achieve long-term growth in a competitive market.

------

Libby Covington is a Partner with Craig Group, a technology-enabled sales and marketing advisory firm specializing in revenue growth for middle-market, private-equity-backed portfolio companies.

Every situation is unique and deserves a one-of-a-kind data management plan, not a one-size-fits-all solution. Graphic by Miguel Tovar/University of Houston

Houston research: Why you need a data management plan

Houston voices

Why do you need a data management plan? It mitigates error, increases research integrity and allows your research to be replicated – despite the “replication crisis” that the research enterprise has been wrestling with for some time.

Error

There are many horror stories of researchers losing their data. You can just plain lose your laptop or an external hard drive. Sometimes they are confiscated if you are traveling to another country — and you may not get them back. Some errors are more nuanced. For instance, a COVID-19 repository of contact-traced individuals was missing 16,000 results because an Excel worksheet cannot hold more than roughly 1 million rows.

Do you think a hard drive is the best repository? Keep in mind that 20 percent of hard drives fail within the first four years. Some researchers merely email their data back and forth and feel like it is “secure” in their inbox.

The margins for human and machine error are wide. Continually backing up your results, while good practice, can’t ensure that you won’t lose invaluable research material.
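The Excel incident above suggests one cheap safeguard: count rows before trusting a spreadsheet with your data. A minimal sketch in Python, with a hypothetical file name:

```python
import csv

EXCEL_MAX_ROWS = 1_048_576  # hard per-worksheet row limit in modern Excel

def fits_in_excel(path: str) -> bool:
    """Count rows in a CSV and flag files Excel would silently truncate."""
    with open(path, newline="") as f:
        n_rows = sum(1 for _ in csv.reader(f))
    if n_rows > EXCEL_MAX_ROWS:
        print(f"{path}: {n_rows:,} rows exceeds Excel's {EXCEL_MAX_ROWS:,}-row limit")
        return False
    return True

# Hypothetical usage: fits_in_excel("contact_traces.csv")
```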

Repositories

According to Reid Boehm, Ph.D., Research Data Management Librarian at the University of Houston Libraries, your best bet is to utilize research data repositories. “The systems and the administrators are focused on file integrity and preservation actions to mitigate loss and they often employ specific metadata fields and documentation with the content,” Boehm says of the repositories. “They usually provide a digital object identifier or other unique ID for a persistent record and access point to these data. It’s just so much less time and worry.”

Integrity

Losing data or being hacked can compromise data integrity. Data breaches not only undermine research integrity; they can also be extremely expensive. According to Security Intelligence, the global average cost of a data breach in a 2019 study was $3.92 million, a 1.5 percent increase from the previous year’s study.

Sample size — how large or small a study was — is another example of how data integrity can affect a study. Retraction Watch logs approximately 1,500 articles retracted annually from prestigious journals for “sloppy science.” One of the main reasons papers end up being retracted is a sample size too small to be representative.

Replication

Another metric for measuring data integrity is whether or not an experiment can be replicated. The ability to recreate an experiment is paramount to the scientific enterprise. In a Nature article entitled “1,500 scientists lift the lid on reproducibility,” 73 percent of respondents said that they think at least half of the papers in their field can be trusted, with physicists and chemists generally showing the most confidence.

However, according to Kelsey Piper at Vox, “an attempt to replicate studies from top journals Nature and Science found that 13 of the 21 results looked at could be reproduced.”

That's so meta

The archivist Jason Scott said, “Metadata is a love note to the future.” Learning how to keep data about data is a critical part of reproducing an experiment.

“While this will always be determined by a combination of project specifics and disciplinary considerations, descriptive metadata should include as much information about the process as possible,” said Boehm. Details of workflows, any standard operating procedures, parameters of measurement, clear definitions of variables, code and software specifications and versions, and many other signifiers ensure the data will be of use to colleagues in the future.

In other words, making data accessible, usable, and reproducible is of the utmost importance. You make reproducing experiments that much easier if you do a good job of capturing metadata in a consistent way.
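As a sketch of what consistent, machine-readable metadata capture might look like, here is a small Python example; every field name is illustrative rather than a community standard, and your discipline may define a schema for you:

```python
import json
import sys

# Illustrative metadata record with the kinds of fields Boehm describes;
# the exact schema here is hypothetical.
metadata = {
    "title": "Example assay results",
    "workflow": "Samples processed per SOP-12; measurements taken in triplicate",
    "variables": {"od600": "optical density at 600 nm (unitless)"},
    "software": {"python": sys.version.split()[0], "analysis_script": "v1.3.0"},
    "parameters": {"temperature_c": 37, "replicates": 3},
}

# Store the "love note to the future" next to the data it describes.
with open("dataset_metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```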

The Big Idea

A data management plan includes storage, curation, archiving and dissemination of research data. Your university’s digital librarian is an invaluable resource. They can answer other tricky questions as well, such as: who does the data belong to? And when a postdoc in your lab leaves the institution, can they take their data with them? Every situation is unique and deserves a one-of-a-kind data management plan, not a one-size-fits-all solution.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.

Here's your university research data management checklist. Graphic by Miguel Tovar/University of Houston

Tips for optimizing data management in research, from a UH expert

Houston voices

A data management plan is invaluable to researchers and to their universities. "You should plan at the outset for managing output long-term," said Reid Boehm, research data management librarian at University of Houston Libraries.

At the University of Houston, research data generated by faculty, staff, or students in the course of their research must be retained by the institution for three years after submission of the final report. That means there is a lot of data to be managed. But researchers are in luck – there are many resources to help navigate these issues.

Take inventory

Is your data:

  • Active (constantly changing) or Inactive (static)
  • Open (public) or Proprietary (for monetary gain)
  • Non-identifiable (no human subjects) or Sensitive (containing personal information)
  • Preservable (to save long term) or To discard in 3 years (not for keeping)
  • Shareable (ready for reuse) or Private (not able to be shared)

The more you understand the kind of data you are generating, the easier this step, and the next steps, will be.
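One way to make the inventory concrete is to record each dataset's position on the axes above in a small structured record; a minimal sketch with hypothetical fields:

```python
from dataclasses import dataclass

@dataclass
class DatasetInventory:
    """One dataset's position on the inventory axes above."""
    name: str
    active: bool         # constantly changing vs. static
    open_access: bool    # public vs. proprietary
    identifiable: bool   # contains personal information vs. non-identifiable
    preserve: bool       # keep long-term vs. discard after 3 years
    shareable: bool      # ready for reuse vs. private

survey = DatasetInventory(
    name="2024 field measurements",
    active=False,
    open_access=True,
    identifiable=False,
    preserve=True,
    shareable=True,
)
print(survey)
```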

Check first

When you are ready to write your plan, the first thing to determine is whether your funders or your university have a data management plan policy and guidelines. The University of Houston, for instance, does.

It is also important to distinguish between types of planning documents. For example:

A Data Management Plan (DMP) is a comprehensive, formal document that describes how you will handle your data during the course of your research and at the conclusion of your study or project.

In some instances, funders or institutions may require a more targeted document, such as a Data Sharing Plan (DSP), which describes how you plan to disseminate your data at the conclusion of a research project.

Consistent questions that DMPs ask include:

  • What is generated?
  • How is it securely handled?
  • How is it maintained and accessed long-term?

However it's worded, data is critical to every scientific study.

Pre-proposal

Pre-proposal planning resources and support at UH Libraries include a consultation with Boehm. "Each situation is unique and in my role I function as an advocate for researchers to talk through the contextual details, in connection with funder and institutional requirements," stated Boehm. "There are a lot of aspects of data management and dissemination that can be made less complex and more functional long term with a bit of focused planning at the beginning."

When you get started writing, visit the Data Management Plan Tool. This platform helps by providing agency-specific templates and guidance, working with your institutional login and allowing you to submit plans for feedback.

Post-project

Post-project resources and support involve the archiving, curation, and sharing of information. The UH Data Repository archives, preserves, and helps to disseminate your data. The repository, the data portion of the institutional repository Cougar ROAR, is open access, free to all UH researchers, provides data sets with a digital object identifier, and allows up to 10 GB per project. Most federal funding agencies already require this type of documentation (NSF, NASA, USGS, and EPA among them), and the NIH will require DMPs by 2023.

Start out strong

Remember, although documentation is due at the beginning of a project/grant proposal, sustained adherence to the plan and related policies is a necessity. We may be distanced socially, but our need to come together around research integrity remains constant. Starting early, getting connected to resources, and sharing as you can through avenues like the data repository are ways to strengthen ourselves and our work.

------

This article originally appeared on the University of Houston's The Big Idea. Sarah Hill, the author of this piece, is the communications manager for the UH Division of Research.


New Houston venture studio emerges to target early-stage hardtech, energy transition startups

funding the future

The way Doug Lee looks at it, there are two areas within the energy transition attracting capital. With his new venture studio, he hopes to target an often overlooked area that's critical for driving forward net-zero goals.

Lee describes investment activity taking place in the digital and software world — early-stage technology that's looking to make the industry smarter. But on the other end of the spectrum, investment activity can be found in massive infrastructure projects.

While both areas need funding, Lee has started his new venture studio, Flathead Forge, to target early-stage hardtech technologies.

“We are really getting at the early stage companies that are trying to develop technologies at the intersection of legacy industries that we believe can become more sustainable and the energy transition — where we are going. It’s not an ‘if’ or ‘or’ — we believe these things intersect,” he tells EnergyCapital.

Specifically, Lee's expertise is within the water and industrial gas space. For around 15 years, he's made investments in this area, which he describes as crucial to the energy transition.

“Almost every energy transition technology that you can point to has some critical dependency on water or gas,” he says. “We believe that if we don’t solve for those things, the other projects won’t survive.”

Lee and his brother, Dave, are evolving their family office to adopt a venture studio model. They also sold off Azoto Energy, a Canadian oilfield nitrogen cryogenic services business, in December.

“We ourselves are going through a transition like our energy is going through a transition,” he says. “We are transitioning from a single family office into a venture studio. By doing so, we want to focus all of our access and resources on this.”

At this point, Flathead Forge has seven portfolio companies and is working with around 15 corporations to identify their needs and potential opportunities. Lee says he's gearing up to secure a $100 million fund.

Flathead also has 40 advisers and mentors, which Lee calls sherpas — a nod to the Flathead Valley region in Montana, which inspired the firm's name.

“We’re going to help you carry up, we’re going to tie ourselves to the same rope as you, and if you fall off the mountain, we’re falling off with you,” Lee says of his hands-on approach, which he says sets Flathead apart from other studios.

Another thing differentiates Flathead Forge from its competition: its dedication to giving back.

“We’ve set aside a quarter of our carried interest for scholarships and grants,” Lee says.

The funds will go to scholarships for future engineers interested in the energy transition, as well as grants for researchers studying high-potential technologies.

“We’re putting our own money where our mouth is,” Lee says of his thesis for Flathead Forge.

------

This article originally ran on EnergyCapital.

Houston-based lunar mission's rocky landing and what it means for America's return to the moon

houston, we have a problem

A private U.S. lunar lander tipped over at touchdown and ended up on its side near the moon’s south pole, hampering communications, company officials said Friday.

Intuitive Machines initially believed its six-footed lander, Odysseus, was upright after Thursday's touchdown. But CEO Steve Altemus said Friday the craft “caught a foot in the surface," falling onto its side and, quite possibly, leaning against a rock. He said it was coming in too fast and may have snapped a leg.

“So far, we have quite a bit of operational capability even though we’re tipped over," he told reporters.

But some antennas were pointed toward the surface, limiting flight controllers' ability to get data down, Altemus said. The antennas were stationed high on the 14-foot (4.3-meter) lander to facilitate communications at the hilly, cratered and shadowed south polar region.

Odysseus — the first U.S. lander in more than 50 years — is thought to be within a few miles (kilometers) of its intended landing site near the Malapert A crater, less than 200 miles (300 kilometers) from the south pole. NASA, the main customer, wanted to get as close as possible to the pole to scout out the area before astronauts show up later this decade.

NASA's Lunar Reconnaissance Orbiter will attempt to pinpoint the lander's location as it flies overhead this weekend.

With Thursday’s touchdown, Intuitive Machines became the first private business to pull off a moon landing, a feat previously achieved by only five countries. Japan was the latest country to score a landing, but its lander also ended up on its side last month.

Odysseus' mission was sponsored in large part by NASA, whose experiments were on board. NASA paid $118 million for the delivery under a program meant to jump-start the lunar economy.

One of the NASA experiments was pressed into service when the lander's navigation system did not kick in. Intuitive Machines caught the problem in advance when it tried to use its lasers to improve the lander's orbit. Otherwise, flight controllers would not have discovered the failure until it was too late, just five minutes before touchdown.

“Serendipity is absolutely the right word,” mission director Tim Crain said.

It turns out that a switch was not flipped before flight, preventing the system's activation in space.

Launched last week from Florida, Odysseus took an extra lap around the moon Thursday to allow time for the last-minute switch to NASA's laser system, which saved the day, officials noted.

Another experiment, a cube with four cameras, was supposed to pop off 30 seconds before touchdown to capture pictures of Odysseus’ landing. But Embry-Riddle Aeronautical University’s EagleCam was deliberately powered off during the final descent because of the navigation switch and stayed attached to the lander.

Embry-Riddle's Troy Henderson said his team will try to release EagleCam in the coming days, so it can photograph the lander from roughly 26 feet (8 meters) away.

"Getting that final picture of the lander on the surface is still an incredibly important task for us,” Henderson told The Associated Press.

Intuitive Machines anticipates just another week of operations on the moon for the solar-powered lander — nine or 10 days at most — before lunar nightfall hits.

The company was the second business to aim for the moon under NASA's commercial lunar services program. Last month, Pittsburgh's Astrobotic Technology gave it a shot, but a fuel leak on the lander cut the mission short and the craft ended up crashing back to Earth.

Until Thursday, the U.S. had not landed on the moon since Apollo 17's Gene Cernan and Harrison Schmitt closed out NASA's famed moon-landing program in December 1972. NASA's new effort to return astronauts to the moon is named Artemis after Apollo's mythological twin sister. The first Artemis crew landing is planned for 2026 at the earliest.

3 female Houston innovators to know this week

who's who

Editor's note: Welcome to another Monday edition of Innovators to Know. Today I'm introducing you to three Houstonians to read up about — three individuals behind recent innovation and startup news stories in Houston as reported by InnovationMap. Learn more about them and their recent news below by clicking on each article.

Emma Konet, co-founder and CTO of Tierra Climate

Emma Konet, co-founder and CTO of Tierra Climate, joins the Houston Innovators Podcast. Photo via LinkedIn

If the energy transition is going to be successful, the energy storage space needs to be equipped to support both the increased volume of energy needed and new energies. And Emma Konet and her software company, Tierra Climate, are targeting one part of the equation: the market.

"To me, it's very clear that we need to build a lot of energy storage in order to transition the grid," Konet says on the Houston Innovators Podcast. "The problems that I saw were really on the market side of things." Read more.

Cindy Taff, CEO of Sage Geosystems

Houston-based Sage Geosystems announced the first close of its $17 million series A round, led by Chesapeake Energy Corp. Photo courtesy of Sage

A Houston geothermal startup has announced the close of its series A round of funding.

Houston-based Sage Geosystems announced the first close of its $17 million round, led by Chesapeake Energy Corp. The proceeds will fund its first commercial geopressured geothermal system facility, which will be built in Texas in Q4 2024. According to the company, the facility will be the first of its kind.

“The first close of our Series A funding and our commercial facility are significant milestones in our mission to make geopressured geothermal system technologies a reality,” Cindy Taff, CEO of Sage Geosystems, says. Read more.

Clemmie Martin, chief of staff at The Cannon

With seven locations across the Houston area, The Cannon's digital technology allows its members a streamlined connection. Photo courtesy of The Cannon

After collaborating over the years, The Cannon has acquired a Houston startup's digital platform technology to become a "physical-digital hybrid" community.

Village Insights, a Houston startup, worked with The Cannon to create and launch its digital community platform Cannon Connect. Now, The Cannon has officially acquired the business. The terms of the deal were not disclosed.

“The integration of a world-class onsite member experience and Cannon Connect’s superior virtual resource network creates a seamless, streamlined environment for member organizations,” Clemmie Martin, The Cannon’s newly appointed chief of staff, says in the release. “Cannon Connect and this acquisition have paved new pathways to access and success for all.” Read more.