The Difference Between Being Data-Driven and BIG-Data-Driven

I had the pleasure of speaking at 6sense’s inaugural INmarket Conference on Wednesday. The theme of the event was that the world has become, and is becoming, more and more data-driven, and that marketing and sales need to adapt. I think marketing has gotten better at using data to make decisions over the past 10 years, and we were part of that at Eloqua. However, I think there is a movement to take it one step further and become “bigger”-data-driven.

I was on a panel with Jon Miller, founder of Marketo and now CEO of Engagio, and Russ Glass, founder of Bizo and now head of product for LinkedIn’s marketing cloud. Our topic was the past and future of B2B marketing technology. We had a lively discussion, and I realized there is both a massive opportunity and a massive challenge in becoming big-data-driven.

 

“Moore’s Law for Data” Creates Masses of Implicit Information

In the opening kick-off, Amanda Kahlow said they had targeted 100+ attendees, and the final number came in at 425. Wow, that is awesome for a first conference. Amanda then posted the IBM stat that “90% of the world’s data has been generated in the last two years (2013)”. What is interesting about that stat is that IBM believes that by now (2015) that number has already tripled, with 80% of that data being “uncertain”. This seems to indicate there is a Moore’s-like law for implicit data.

Source: IBM.com

As I walked around and talked with attendees and other speakers, I realized that the best B2B marketing leaders were there. And they all came because they wanted to start using data for more than just proving the value of something. They wanted to use data to do more. For these marketers, I think:

Show me the money has become show me mo’ money.

The challenge with big data is that most of it is unstructured and implicit. If you think of how marketing has used data in the past, most of it has been explicit data about the customer. Demographic info like age, household income, and job title has been the bread n’ butter of segmentation and personalization for many years. But if the majority of new data is transient and inexact, marketers have to learn new skills, or hire for them, to leverage this data in their efforts.

Being data-driven requires analysts, but being big-data-driven requires scientists.

 

Implicit Data Is More Useful Than Explicit Data, But Decays Faster

After spending 14 years in marketing automation, we learned one key thing: a buyer’s implicit data (digital body language) is a better indicator of purchase intent than their explicit demographic data. So in general, it was better to have a prospect with high online activity enter your lead process than someone with low activity but a lot of profile information. This was proven time and time again as we worked with customers.
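As a rough illustration of that idea, here is a minimal lead-scoring sketch in Python. The field names, weights, and threshold are hypothetical, purely to show the shape of the logic, not any vendor’s actual model.

```python
# Minimal lead-scoring sketch: implicit (behavioral) signals carry more
# weight than explicit (profile) fit. All weights are illustrative only.

EXPLICIT_WEIGHTS = {"title_match": 10, "industry_match": 5, "company_size_match": 5}
IMPLICIT_WEIGHTS = {"pricing_page_view": 25, "webinar_attended": 20,
                    "whitepaper_download": 15, "email_click": 10}

def score_lead(profile: dict, activities: list) -> int:
    """Combine explicit fit and implicit activity into one score."""
    explicit = sum(w for field, w in EXPLICIT_WEIGHTS.items() if profile.get(field))
    implicit = sum(IMPLICIT_WEIGHTS.get(a, 0) for a in activities)
    return explicit + implicit

# A quiet prospect with a perfect profile...
quiet = score_lead({"title_match": True, "industry_match": True,
                    "company_size_match": True}, [])
# ...scores well below an active prospect with a thin profile.
active = score_lead({"title_match": True},
                    ["pricing_page_view", "webinar_attended", "email_click"])
print(quiet, active)  # 20 vs. 65
```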

But it gets more complicated. Implicit data decays much faster than older (less useful) explicit data. Online activity, social activity, location-based activity: all of these data types lose their value for marketers quickly. The comparison graph would look something like this:

So if you are going to leverage the huge amounts of new implicit data in your marketing, you need to act in near-real-time. That is tough, because there is more of it, and it isn’t always clear what it all means.
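One simple way to model that decay, purely as a sketch: weight each implicit signal by an exponential decay on its age, so a pricing-page visit from this morning counts for far more than one from last quarter. The half-life values below are invented for illustration.

```python
# Sketch of recency decay: a signal's weight halves every half-life.
# Half-life values are invented for illustration only.

HALF_LIFE_DAYS = {"implicit": 7.0, "explicit": 365.0}

def decayed_weight(base_weight: float, age_days: float, kind: str) -> float:
    """Halve a signal's weight for every half-life that has passed since it fired."""
    return base_weight * 0.5 ** (age_days / HALF_LIFE_DAYS[kind])

# A week-old pricing-page visit (implicit) has already lost half its value...
print(decayed_weight(25, age_days=7, kind="implicit"))    # 12.5
# ...while an explicit attribute like job title only gets there after a year.
print(decayed_weight(10, age_days=365, kind="explicit"))  # 5.0
```

With a half-life measured in days, the behavioral side of a score melts away within a few weeks of inactivity, which is exactly why acting in near-real-time matters.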

Being data-driven requires an analytics platform, but being big-data-driven requires a predictive intelligence engine.

So if you are a modern B2B marketer and want to be more data-driven, it really comes down to:

  1. Having the people and skills to understand and analyze the large new volumes of implicit data.
  2. Having a system or technology that can deal with all of this big data in near-real-time (a rough sketch of what that means follows this list).
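To make the second point concrete, here is a hedged sketch of what near-real-time could look like in practice: scores are updated as each activity event arrives, rather than recomputed in a nightly batch. The event shapes, weights, and alert threshold are all made up for illustration.

```python
from collections import defaultdict

# Sketch: update scores as events arrive (streaming) instead of in a nightly
# batch. Event shapes, weights, and the threshold are illustrative only.
WEIGHTS = {"pricing_page_view": 25, "webinar_attended": 20, "email_click": 10}

scores = defaultdict(float)

def on_event(event: dict) -> None:
    """Apply a single incoming activity event to the prospect's running score."""
    scores[event["prospect_id"]] += WEIGHTS.get(event["type"], 0)
    if scores[event["prospect_id"]] >= 50:          # hypothetical "surging" threshold
        print(f"alert sales: {event['prospect_id']} is surging")

# Simulated event stream
for event in [
    {"prospect_id": "acme-0042", "type": "email_click"},
    {"prospect_id": "acme-0042", "type": "pricing_page_view"},
    {"prospect_id": "acme-0042", "type": "webinar_attended"},
]:
    on_event(event)
```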

Both are challenges that can be overcome, and certainly with a partner like 6sense you are already ahead of the game. Looking forward to attending INmarket next year to see how much more big-data-driven B2B marketers have become.

 
