Some time ago, I was working with an SEO professional who was helping my client with some A/B testing of a website (testing two different versions to see which performs better).
After just a week, they came back with some good news. They had tested a new version they developed, and indeed there was a small, positive bump in sales. I asked what I thought was a logical question: “How do we know with statistical certainty that your change made the difference, and that this didn’t happen by chance alone or through some confounding factor like advertising or seasonality?”
They looked at me with a blank stare.
“You don’t understand,” the SEO professional said. “We’ve been doing this a long time. We KNOW there was a change.”
Real marketing data analysis
I hired a statistician to put the data through the appropriate test and learned that there was an 80 percent chance they were wrong: the small increase was accounted for by normal variability in the data. I told the SEO professional I wouldn’t pay them until they started providing statistically valid analysis.
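For the curious, here is a minimal sketch (in Python, with invented visitor and sales numbers, not the client’s actual data) of the kind of test a statistician might run here: a two-proportion z-test that asks whether the bump was bigger than normal variability.

```python
# A two-proportion z-test: is the "bump" bigger than normal variability?
from math import sqrt
from scipy.stats import norm

# Hypothetical numbers for illustration only.
visitors_a, sales_a = 5000, 150   # original page
visitors_b, sales_b = 5000, 165   # new version

p_a = sales_a / visitors_a
p_b = sales_b / visitors_b

# Pooled rate under the null hypothesis that both pages perform the same.
p_pool = (sales_a + sales_b) / (visitors_a + visitors_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))

z = (p_b - p_a) / se
p_value = 2 * norm.sf(abs(z))  # two-sided p-value

print(f"lift = {p_b - p_a:.4f}, z = {z:.2f}, p = {p_value:.2f}")
# p is about 0.39 here: a bump this size happens by chance all the time.
```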
The sad thing is, this is the way it is in most places.
I spent most of my career in manufacturing, where rigorous statistical analysis was at the heart of every process improvement and quality effort. It has been a great shock to my system to enter a marketing world where data analysis is rarely used properly, if it is used at all.
The field of marketing seems to be at least a decade out of step with tried-and-true best practices from elsewhere. It has been a puzzlement to me. Why are we managing our marketing practices based on guesses, hunches, or suspicious data we picked up from a spurious blog post somewhere?
You don’t have to be a math geek or a statistician who has all the right answers. But you do need to know enough about data analysis to ask the right questions.
Here are five questions every marketer should ask about their research data.
1. Did we calculate the correct sample size?
I recently read a research report by one of the best-known marketing consulting firms and saw that they actually based one of their conclusions on the responses of just two people. In other words, they were trying to sell an agenda without really doing the work. And everybody was tweeting this result like crazy without realizing how ridiculous it was.
When you see research, ask whether the sample size was calculated and whether it is large enough to support the conclusions.
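If you’re wondering what “calculating a sample size” even looks like, here is a quick sketch of the textbook calculation for estimating a proportion from a survey. The 95 percent confidence level and plus-or-minus-5-percent margin of error are common defaults I’ve assumed for illustration:

```python
# Standard sample size for estimating a proportion within a margin of error.
from math import ceil
from scipy.stats import norm

confidence = 0.95        # assumed confidence level
margin_of_error = 0.05   # assumed margin of error (plus or minus 5 points)
p = 0.5                  # worst-case proportion; maximizes the required n

z = norm.ppf(1 - (1 - confidence) / 2)       # about 1.96 for 95 percent
n = ceil(z**2 * p * (1 - p) / margin_of_error**2)
print(n)  # 385 respondents -- a far cry from two
```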
2. What is the probability that the finding was due to random chance?
If your researcher cannot state a confidence level (90 percent, for example) and a margin of error (plus or minus 5 percent) for the findings in the research, you may not be able to believe the data. It might be tempting to “eyeball” the data, but this can be a recipe for disaster. There is nothing more dangerous to a business than executing a perfect plan against a flawed strategy!
If you don’t have somebody in your company with the skills to do this kind of analysis, hire temporary expertise to get you started.
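To make that “plus or minus” figure concrete, here is a minimal sketch of a normal-approximation confidence interval around a survey result. The respondent numbers are invented:

```python
# Normal-approximation confidence interval for a survey proportion.
from math import sqrt
from scipy.stats import norm

n = 400          # hypothetical number of respondents
yes = 248        # hypothetical "yes" answers
p_hat = yes / n

z = norm.ppf(0.95)  # quantile for a two-sided 90 percent interval
half_width = z * sqrt(p_hat * (1 - p_hat) / n)

print(f"{p_hat:.2f} plus or minus {half_width:.2f} (90 percent confidence)")
# 0.62 plus or minus 0.04 -- the kind of statement to demand from research.
```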
3. Are we seducing ourselves with good news?
It’s comfortable and easy to accept an analysis that supports our view of the world. But if something looks too good to be true, it probably is. So be skeptical — very skeptical. Always make sure that important results hold up to a deeper look and, if they don’t, get the full explanation.
Similarly, if we expected to see something in the data and it’s not there, don’t ignore it. Dig deep — there’s probably a market insight there.
4. Are we displaying and analyzing the data in the appropriate way?
Don’t be satisfied with “analyses of the average.” If all you do is look at pie charts and bar charts of averages, are you really going to learn anything?
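Probably not. To see why, consider a toy example (the scores below are invented): two customer segments can post identical average satisfaction scores while telling opposite stories.

```python
# Two segments with the same average satisfaction but opposite stories.
scores_a = [7, 7, 7, 7, 7]        # uniformly lukewarm customers
scores_b = [10, 10, 10, 1, 4]     # delighted majority, furious minority

for name, scores in [("A", scores_a), ("B", scores_b)]:
    print(name, "mean:", sum(scores) / len(scores), "min:", min(scores))
# Both segments average 7.0; only looking past the average reveals
# the unhappy customers hiding in segment B.
```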
One of my most profound business lessons came from a customer satisfaction report my company produced every quarter. Instead of simply scanning the bar charts and average customer satisfaction levels, I dug deep and read every comment, slicing the data different ways in an effort to find gaps and opportunities. One stray comment on one survey turned out to be an early warning sign of a severe quality issue. It was so serious, in fact, that this single comment led to an investigation and a multi-million-dollar capital project.
The real insight and innovation come from the small data, not the Big Data.
5. Are we being paralyzed by analysis?
I read a story one time about a car company that spent millions of dollars on research because their sunroof wouldn’t open properly at temperatures below zero. Perhaps it didn’t occur to them that nobody would be opening a sunroof at that temperature.
While I strongly support the use of rigorous data analysis, you also have to use logic, experience-based instinct, and common sense to run a business.
See? That set of questions is not so hard, is it? We hear so much about Big Data, and now that you’re armed with this set of questions, you can begin to make some sense of it. What lessons can you share?
This post was written as part of the Dell Insight Partners program, which provides news and analysis about the evolving world of tech. To learn more about tech news and analysis visit TechPageOne. Dell sponsored this article, but the opinions are my own and don’t necessarily represent Dell’s positions or strategies.
Illustration courtesy of Flickr CC and Andy McGuire.