In yesterday’s MediaPost Marketing Daily, Douglas Brooks touched on a subject I’ve been yarping about for some time – measurement. He offers some prudent advice, but sidesteps the issue that’s been bothering me: to wit, our rage to quantify is driven by fear, not expertise, and it often leads us to ignore a whole suite of important decision-making tools.
I would never suggest that ROI doesn’t matter – quite the opposite – and I also wouldn’t argue that quantitative methods can’t provide us with useful data – of course they can. The problem is that American culture has an odd relationship with knowledge and evidence – in any kind of professional enterprise, statistics and numerical metrics are increasingly treated as the only kind of evidence that counts. If we want to say something about our customer base, we feel obligated to quantify whatever we’re trying to say.
Correct me if I’m wrong, but my sense is that we’ve seen a massive increase in our rage to quantify over the past seven or eight years. Specifically, the trend seems to have paralleled all the belt-tightening that ensued with the crash in 2001. All of a sudden a lot of people were legitimately terrified about their job security, with quarterly bad news being met, like clockwork, with another round of layoffs, which in turn made it even harder for the company to succeed in the ensuing quarter. Lather, rinse, repeat.
Employees were being asked to make decisions in this environment, and when 25% of your group is going to be turfed in two months, the last thing you can afford is to get blamed for a tangible failure. So you insist on quant research because “numbers don’t lie.” If the initiative fails, you can point to hard, fast, incontrovertible statistics that prove it wasn’t your fault. You might get fired anyway, but at least the numbers give you a fighting chance.
Of course, numbers do lie, especially if you trust them too much. A little tale from my own history illustrates the point. I used to work in radio, and I found myself pursuing a copy job with a company that had just bought the station I used to work for. The whole market was curious – the new owners were from Florida and nobody knew what format they were going to operate. In my interview we talked about this, and they let me know that they were going to do something dramatically new. It was going to be a “mix” format that would play just about everything – rock, adult contemporary, country, MOR, you name it. Already I’m giggling on the inside, and then they say “we’re also going to play Beach Music.” (If you aren’t familiar with what that term means in the Carolinas, it’s worth looking up.)
They were dead in the water already and didn’t know it. But they had done extensive research, and that research proved conclusively that their target audience wanted to hear Beach Music on the radio. The problem: radio research in that market always showed this, but when stations gave listeners what they said they wanted, those listeners always switched to something else. Yes, people there wanted to hear Beach Music – in clubs. Not on the radio.
The research always lied, and if you knew the terrain you knew when to ignore the numbers. The new owners trusted the numbers and their grand experiment (which was a bad idea for other reasons, as well – we knew they were doomed when we heard Kenny Rogers, The Stones and Melissa Manchester back-to-back-to-back) was done within a matter of weeks.
These new entrants could have succeeded right away had they done no research at all – I could have pointed them to several people around the region who carried all the insight you’d ever have needed in their heads. But they chose the numbers over the smart. Experience counts, and if you can’t look around your organization and tell where the genius lies, you’re already in trouble.
The lessons here are straightforward and critically important:
- Quant marketing research is often a crutch for people who are afraid and looking to cover their butts;
- Research often lies; I’m pretty sure there were solid numbers behind New Coke, too, and given how the business world operates today, you can bet your bottom dollar that every bad marketing idea you’ve seen this century was supported by strong quantitative research results;
- If you don’t have smart, experienced people you trust to tell you that the numbers are wrong, it’s only a matter of time before you become case study material yourself.
Maybe I’ve done nothing more here than start an argument, and that’s fine – at the very least it’s an argument that most organizations can benefit from, no matter what conclusions they reach.
Meanwhile, do read the Brooks editorial – he has some important things to say about long-term thinking.