Occasionally, you’ll hear about an advertising agency that doesn’t participate in the idea-killing charade known as testing, but here in the U.S., testing is a standard part of the process at the vast majority of shops.
In case you’re unfamiliar with it, the kind of testing I’m talking about usually involves taking either a storyboard or a “boardomatic” – a rudimentary cartoon of a proposed commercial in which still figures slide around from place to place – and showing it to consumers to get their reaction. The creatives who have invested so much time and energy into the idea are assured that the only thing the testing will “probe” is the message a consumer takes away from the ad (as if, being reasonably intelligent people, we couldn’t determine that just by, you know, reading the script), but it never works out that way. One way or another, the questions that always end up getting answered are, “Do you like it? How could we make it better?”
Depending on the nature of the test, a lot of things can happen as a result of this process, but they all have one thing in common: the work that emerges is always – literally always, in my experience – undermined or weakened in some way…if, that is, it isn’t killed outright.
It’s uncanny how consistently and predictably this happens.
So why do clients insist on it? Why do agencies go along with it? For the answer to the second question, please refer back to the first. And the answer to that question is the reason I called the whole enterprise a charade earlier: everyone involved pretends it’s all about making the work better or seeing which campaign customers will respond to best or just conducting a last-minute disaster check. But it’s really not about any of those things.
It’s about someone making sure his or her ass is covered. It’s so no one has to go out on a limb (to use a less vulgar metaphor). It’s so that when someone above them asks, “Who thought this was the way to go?” they can say, “No one – it’s the one that tested best.”
At this point, you may be asking, “So what? Why shouldn’t someone seek out some objective reinforcement? Are we so arrogant that we dismiss the value of a second opinion?”
If that were what we were getting, I’d agree. But that’s not what we get. That’s not how testing works. And I’ve come to believe it has a lot to do with human nature and the simple fact that people love to help. If you ask people’s opinion about a creative concept, they immediately start to think to themselves, “This guy’s giving me fifty bucks; I’ve got to think of some way to help him out.” Next thing you know, you’ve got a robot breakdancing through your spot. And the process that was supposed to prevent the client from wasting money on the wrong idea has delivered exactly that.
Now, I’m not suggesting the thought process of the people hired by the testing company is deliberate or conscious. But the bottom line is, as soon as they realize they’re being asked their opinion of something that’s a work in progress, they jump right to, “What would make this better?” And, great ideas being even more fragile than they are powerful, before you know it, you no longer recognize your own idea.
A lot of my creative colleagues have unreserved contempt for the people making all these “helpful” suggestions, but I don’t blame those people at all.
They’re just doing what we’re asking of them. I blame our industry for asking in the first place – for pretending there’s any creative value to this at all. There’s not. If we’re any good at our jobs, we should have the courage to say to our clients, “Don’t test this idea. You pay us for our creative judgment, and this is a great idea. We can’t guarantee it will work, but if it doesn’t, it won’t be because it’s not a great idea. This is what we do. This is our expertise, it’s all we have, and we know what we’re talking about.”
There is a place for testing: before the creative process starts. I believe in testing the relevance and persuasiveness of a marketing message. But once the strongest message emerges, testing has done its job.
At that point, it’s time for us to do ours.