You probably know by now that conversion rate optimization can improve your site’s sales. Using data-driven methods like A/B testing, you can find out which versions of your pages actually sell your products better. But are you focusing on tactical changes, or are you improving your customer acquisition strategy?
Conversion rate optimization fundamentals
There’s a temptation when optimizing your site to focus on what I’ll call tactics—micro-changes you can make to individual pages—at the expense of strategy—your overarching ideas about what drives conversions on your site.
The most (in)famous example of this kind of confusion is testing your button color. There are plenty of people willing to tell you that if you’d just make your buttons red instead of blue (or blue instead of red!) you’d drive 100% more sales on your site. (Hint: Not likely!)
Don’t test ideas about pages, test ideas about customers
In Patrick McKenzie’s book Sell More Software, he suggests that you test hypotheses about your customers, not pages. This is the best way I know of to explain the difference between strategy and tactics. “If my customers just saw a red button instead of a blue one, they’d be persuaded to buy!” is probably not a reasonable hypothesis about your customers. However, “if my customers just saw an explanation of my product’s benefits, they’d buy more products” is.
If you can’t justify your A/B test in terms of why customers might be more inclined to buy, don’t bother. Instead, consider each step in your purchasing funnel (and the pages that correspond to it) and ask yourself this: “What could I be showing my customers here that would move them to buy?”
Another perspective you can take is this: “If someone were ready to buy, what might get in their way?” If your site is hard to use, that’s an obvious place to start—if someone has trouble figuring out how to buy your product, you’ll be dissuading them from ordering!
“I have an idea, now what do I do with it?”
The worst thing you can do is have (what seems to be) a great idea for your site, then just implement it. It’s incredibly hard to accurately predict how a page variant will perform, even if you’ve been doing this for years. Your accuracy might improve with time, but you’ll never be right 100% of the time.
Instead, you should be A/B testing your changes. This means that for each visitor that hits the page you’re testing, you (metaphorically) flip a coin; if it comes up heads, you show them the original; otherwise, you show them the new version. By tracking how many sales (or other major goals for your business) are associated with each page, you can get statistical confidence regarding which one is better in the long term.
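If you’re curious what that coin flip looks like in practice (most testing tools handle it for you), here’s a minimal Python sketch. The function name, the `visitor_id` parameter, and the test name are hypothetical; the key idea is that assignment is random across visitors but sticky for any individual visitor:

```python
import hashlib

def assign_variant(visitor_id: str, test_name: str = "homepage-test") -> str:
    """Hypothetical 50/50 assignment for an A/B test.

    Hashing the visitor ID (e.g., from a cookie) instead of flipping a real
    coin means a returning visitor always sees the same version of the page.
    """
    digest = hashlib.sha256(f"{test_name}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # map the hash to a number from 0 to 99
    return "original" if bucket < 50 else "variant"

# When a sale (or other goal) happens, log it alongside the visitor's variant
# so you can compare conversion rates between the two groups later.
print(assign_variant("visitor-123"))  # same visitor always gets the same answer
```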
Let me reiterate: If you aren’t A/B testing, you can’t be sure you’re improving—you may even be hurting your business!
To do A/B testing, you’ll need an analytics suite like Google Analytics or KISSmetrics. These tools have built-in A/B testing capabilities, but depending on your needs, you might want a dedicated A/B testing tool like Visual Website Optimizer or Optimizely. (Personally, I’m 100% happy using Google’s “Content Experiments” for A/B testing, but I’ve never wanted to customize a test based on things like a visitor’s physical location.)
Examples from a case study
A win: stopping objections by showing X-Plane’s endorsements
As part of my case study on improving X-Plane.com, I did some A/B testing.
One of the first ideas we had was to reassure visitors that X-Plane isn’t some fly-by-night software that no one actually uses. After all, many people have heard of X-Plane’s biggest competitor, Microsoft Flight Simulator, while for many years X-Plane held only something like a 10–15% market share. This type of reassurance falls into the category of “stopping objections”: heading off the doubts that might otherwise keep people from engaging with your product and eventually buying.
Our hypothesis was that we could stop those objections by showing people a list of big-name companies who endorsed X-Plane. (Note that this is a hypothesis about what our visitors need to see, not about how our pages should look.)
To that end, we created a variant of the home page to show off X-Plane’s endorsements. We added a couple of lines: “As seen in <list of media sources>” and “Used by <list of prominent companies>.” This seemed like a sure win, and the A/B test bore that out: we saw an 18% increase in site sales resulting from that change alone! That result was statistically significant—we can say with 95% confidence that the new page is better for sales than the old one.
In one A/B test we performed, we saw an 18% increase in site sales by adding just 2 lines of copy!
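If you’d like to see roughly what “statistically significant at 95% confidence” means in practice, here’s a simplified Python sketch of a two-proportion z-test, the kind of check most A/B testing tools run for you. The visitor and sale counts below are invented for illustration and are not the real X-Plane numbers:

```python
from math import sqrt, erf

# Invented numbers for illustration only (not the real X-Plane data):
# 10,000 visitors saw each version; the original converted 200 (2.0%)
# and the endorsements variant converted 236 (2.36%), roughly an 18% lift.
control_sales, control_visitors = 200, 10_000
variant_sales, variant_visitors = 236, 10_000

p_control = control_sales / control_visitors
p_variant = variant_sales / variant_visitors
pooled = (control_sales + variant_sales) / (control_visitors + variant_visitors)
std_err = sqrt(pooled * (1 - pooled) * (1 / control_visitors + 1 / variant_visitors))
z = (p_variant - p_control) / std_err

# One-sided p-value: the chance of seeing a lift this big if the variant
# were actually no better than the original.
p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))

lift = (p_variant - p_control) / p_control
print(f"lift: {lift:.0%}, z = {z:.2f}, p = {p_value:.3f}")
# A p-value below 0.05 corresponds to the "95% confidence" threshold above.
```

Real tools use more careful methods than this sketch (two-sided tests, corrections for peeking at results early), but the idea is the same: decide in advance how much evidence you need before declaring a winner.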
A fizzle: making it obvious that X-Plane is computer software
Now let’s talk about a test that didn’t pan out. Usability experts like Steve Krug tell us that a new visitor needs to be able to “get” what your site and your product are all about with a single glance. A person who asks “What the hell am I looking at?” should be able to answer that question in under 5 seconds. If they can’t, you’re losing customers.
To that end, we hypothesized that visitors might not immediately “get” that X-Plane is a piece of software that runs on your computer. Sure, we show a video of what things look like inside X-Plane, but it’s not immediately obvious from that video that this is something you can use on your home computer.
Thus, we created a variant of our landing page for pilots (our least tech-savvy user group) that played the intro video in a monitor frame, making it totally obvious that, hey, this works on your laptop!
In our A/B test, we found a 6% decrease in conversion rate for the variant that played the video in the monitor frame.
The result: we saw a 6% drop in conversions! This wasn’t a statistically significant decrease (we can’t say with 95% confidence that it’s worse), but it certainly suggests that we’re on the wrong track.
That’s okay though. Even when your A/B test fails to validate your idea, you can still profit from it. Ask yourself the following questions:
- How good a test of my hypothesis was this? Could other (unrelated) factors have caused the test to fail? If there are plausible reasons the test may have failed that are unrelated to your hypothesis, create a better test.
  - Ideally, you’ll check this before you start your test, but hindsight is 20/20. For instance, in the example test above, I realized after the fact that having text above the fold may have been a confounding factor.
- If you failed to confirm your hypothesis, why could that be? Figure out what your visitors have “told” you here.
  - For instance, in the test above, I suspect the reason we saw a decrease was that people do, in fact, “get” that X-Plane is software for your computer. The monitor frame and supporting text wound up distracting from the purpose of the video, which was to wow visitors with what X-Plane can do. For this market segment, it seems we don’t need to harp on the fact that the software runs on your home computer.
Closing thoughts
Conversion rate optimization is one of the most powerful tools available to a business online—if you go about it in the right way.
By focusing on your site’s strategy (the messages you show your customers at each stage of the buying process) instead of your on-page tactics (micro-optimizations like tweaking a headline or the appearance of particular elements), you’ll see far bigger improvements to your pages. Get creative—and daring—by testing big changes that are capable of producing big wins.