Wednesday, June 17, 2009

Bagging the Conversion Elephant

In reading an Occam’s Razor blog post from earlier this year, “Aggregation of Marginal Gains: Recession Busting Analytics!”, I noted that Avinash was describing exactly the same experience we at FutureNow underwent. Ten years ago, low-hanging fruit in the conversion improvement space meant high-traffic, high-impact changes from fixing obviously wrong calls to action: bad linking, non-obvious next steps, poor UI (user interface).

In fact, for several years conversion improvement was all about UI. Remember that? The theory (mostly from the UI people, of course) was that if you just improved the UI, then conversion would follow. What we found out was that if you had a weak UI, then of course fixing it helped... but only so far. When it worked you got big, fireworks-worthy improvements, but only at a single point in the sales process.

A few years after that it was analytics. If you simply measured everything, then (wave hands here, sprinkle magical pixie dust) you’d “just know” what to do to improve. Want to improve more? Well then, just “measure more”: get more analytics apps installed. Or buy more expensive ones.

In the last year or so, the buzzword technique seems to be testing. If you just test everything, then somewhere, somehow, you’ll know what helps conversion. And when I say “everything,” I mean “EVERYTHING.” Testing is great; in fact, I wrote a best-selling book on this very topic, "Always Be Testing", centered specifically on Google Website Optimizer. But if there’s a bee on your grandmother’s nose, do you honestly have to run a test to determine whether to swat it away? Yet some companies out there imply that clients should freeze like a deer caught in the headlights unless and until they have a test to back up every decision. And if you don’t have enough traffic for the test to be valid? Just drive more traffic, any traffic, even if precision comes at the cost of accuracy. It's lunacy. No thanks, ma’am, that bee on Granny gets swatted, and I don’t need 95% statistical confidence to make that decision.
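For the cases where a test genuinely is warranted, it helps to see what that “95% statistical confidence” bar actually involves. Here is a minimal sketch of the standard two-proportion z-test behind most A/B conversion comparisons; the visitor and conversion counts are made up purely for illustration.

```python
# A sketch of the check a "95% statistical confidence" claim rests on:
# a two-proportion z-test comparing conversion rates between a control
# page (A) and a variation (B). All counts below are hypothetical.

from math import sqrt, erf

def z_test_conversions(conv_a, n_a, conv_b, n_b):
    """Return rates, z statistic, and two-sided p-value for A vs. B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical numbers: 10,000 visitors per variation.
p_a, p_b, z, p = z_test_conversions(conv_a=200, n_a=10_000,
                                    conv_b=240, n_b=10_000)
print(f"A: {p_a:.2%}  B: {p_b:.2%}  z={z:.2f}  p={p:.3f}")
```

With these numbers, a relative lift of 20% (2.0% versus 2.4%) still lands just shy of p < 0.05. That borderline zone is exactly where the temptation kicks in to pad the sample with “any traffic”, even traffic that looks nothing like your actual buyers.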

Each time the industry thinks it’s got the elephant in its sights, that five-ton peanut-eater slips away. I think it’s because everyone keeps chasing technology as the solution to pachyderm-sized conversion improvement: install the right mix of digital toys, and whammo, you’re sure to be the next market leader in your space. Again with the pixie dust.

But it just doesn’t work that way. What we’ve learned is that the big wins come from a long series of small wins, accumulated over time. And small wins come from experienced insight and hard work, the type of hard work a company is willing and able to perform, not pie-in-the-sky goals with no mechanism for implementing them.

I'd recommend focusing less on "big projects" and more on iteration. At FutureNow we did just that, re-tooling our entire business model around this "OnTarget" concept: clients decide how many resources they can devote to improvement in the current cycle, we deliver experienced recommendations specifically for that cycle, and then we use analytics tools to measure the improvement and testing tools to back up any conclusions. Rinse and repeat. How do you eat the conversion elephant? One small bite at a time. Exactly what Avinash was talking about.
