April 8, 2016
Landing Page A/B Testing
Let’s face it, no matter how good your marketing instincts are or how well you understand your buyer personas, no marketer bats a thousand. As long as we’re people marketing to people, there will be an element of unpredictability to marketing. Maybe your hero shot doesn’t sit right with your audience…maybe your call-to-action isn’t eye-catching enough…maybe your copy is too long…or too short…maybe you simply have the wrong traffic. The possible explanations for why your landing page isn’t converting effectively go on and on. However, your landing page conversion rate doesn’t have to drive you crazy. With a smart A/B testing strategy, you can very quickly learn what works for your audience and use that to increase your conversion rate and your sales.
Psyching out your audience
For example, we recently ran a landing page A/B test that increased a client's conversion rate by 59%. We had run several tests on this page without any truly breakthrough results. We had optimized the page layout, hero shot and copy without moving the conversion rate by any significant amount. The page was performing better, but still not great.

At this point, we knew our target audience fairly well. We were marketing to college-age adults. They were tech-savvy and looking for a seamless online experience, but they were also a bit skeptical of overly pushy sales tactics. In other words, they wanted to be in control of their online experience. So, we decided to try a "push-pull" strategy. Instead of hitting visitors with a form right off, we switched to a lightbox form that only appeared when the user clicked a call-to-action button.

Our coy little lightbox put the user in the driver's seat: they had to click the button before they were rewarded with our form. It doesn't sound like a big deal, but this psychological play had a huge impact on the page's conversion rate. Instead of forcing our form down their throats, we were letting our audience sell themselves and ask for the form. As a result, our conversion rate jumped from 17.65% to 28.13%!
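If you want to sanity-check a result like this yourself, the math is simple. Here's a minimal sketch that computes the relative lift and runs a two-proportion z-test to confirm the difference isn't just noise. The per-variant visitor counts below are hypothetical (we haven't published them); they're chosen to roughly match the rates above.

```python
import math

def ab_test_summary(conv_a, n_a, conv_b, n_b):
    """Relative lift of B over A, plus a two-sided p-value
    from a pooled two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    lift = (p_b - p_a) / p_a                      # relative improvement over the control
    p_pool = (conv_a + conv_b) / (n_a + n_b)      # pooled conversion rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 1 - math.erf(abs(z) / math.sqrt(2))  # two-sided tail probability
    return lift, p_value

# Hypothetical traffic: 1,000 visitors per variant, conversions
# approximating the 17.65% and 28.13% rates from the test.
lift, p = ab_test_summary(conv_a=176, n_a=1000, conv_b=281, n_b=1000)
print(f"lift: {lift:.1%}, p-value: {p:.2g}")  # lift comes out near 59.7% with these counts
```

The pooled z-test is the standard first-pass check for conversion-rate experiments; with a p-value this small, the improvement is very unlikely to be chance.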
Cashing in on A/B testing
In just 3 months, that 59% increase in conversion rate produced 100 additional sales and $43,017 in revenue. Even more importantly, it increased the client's return on ad spend by 88%. That meant a lot more profit in their pockets without any increase in marketing spend. Here's a side-by-side comparison of the pages: As you can see, the only real difference between the two pages was the lightbox. Refining the page layout in our earlier tests taught us what we needed to know about our audience to truly optimize the experience for them.
There’s no shortcut around A/B testing
Seeing these results, you might think that lightboxes are the holy grail of conversion rate optimization. At first, we thought so too. So, we tried lightboxes out with a variety of clients, and the results were mixed. One client's conversion rate improved by 15%. For two other clients, conversion rates stayed about the same or even dropped a little. One client's conversion rate dropped by 18%!

What happened? Unfortunately, in our excitement about lightboxes, we had overlooked what made them successful in the first place: they were an ideal fit for our audience. Our other clients had different audiences with different buyer personas, and lightboxes weren't necessarily the best choice for those personas. Is it any wonder that our results were hit and miss?

It's easy to look at A/B testing case studies and assume that you can use data from someone else's tests to shortcut your way to conversion rate nirvana, but that simply isn't the case. Case studies can be a great way to come up with testing ideas, but you can't assume that what worked for someone else's audience will be a perfect fit for your market. How does the old saying go? It's as true for A/B testing as it is for anything else: no one else knows your audience the way you do, especially once you've run a few tests to see what really makes your audience tick. Maybe lightboxes are your holy grail, maybe they aren't. It all depends on your audience.
How to get killer A/B testing results
There are no shortcuts to A/B testing success, but you can dramatically speed up the process by systematizing your testing approach. If you set things up right, you should learn something from every test. That takes planning and good documentation, but it will save you a lot of time and greatly increase your profitability in the long run.

To really get the most out of your tests, it's best to write out your strategy in advance. For example, to test whether a new call-to-action improves your conversion rate, you might put together a spreadsheet like this: See how each test sets up the next one? You learn something from each iteration and then use that to guide your next test. Plus, everything is thoroughly documented, so if anyone ever wonders why you made a certain choice, you've always got a handy reference!

A lot of testing tools will document your results, which is helpful, but if you don't document the thinking behind each test, the results won't do you much good. Here's a free template you can use to track your A/B tests.
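If you'd rather keep the log in code than in a spreadsheet, the same idea is easy to sketch. The structure below is purely illustrative (the fields and the example entries are hypothetical, not from a real client log); the point is that each test records its hypothesis, its result, and the follow-up test that result suggests.

```python
from dataclasses import dataclass

@dataclass
class ABTest:
    """One row of the testing log: what we tried, why, and what we learned."""
    hypothesis: str          # what we believe about the audience
    change: str              # the single variable being tested
    result: str = ""         # observed outcome, filled in after the test
    next_step: str = ""      # the follow-up test this result suggests

log = [
    ABTest(
        hypothesis="Visitors skim, so a shorter CTA will get more clicks",
        change="CTA copy: 'Get My Free Quote' -> 'Get Quote'",
        result="Clicks up, but form completions flat",
        next_step="Test the form itself, not the CTA copy",
    ),
    ABTest(
        hypothesis="The form feels pushy; let visitors ask for it",
        change="Inline form -> lightbox form behind the CTA button",
    ),
]

# Each completed test points at the next one, so no test is a dead end.
for i, test in enumerate(log, 1):
    status = "done" if test.result else "running"
    print(f"Test {i} ({status}): {test.change}")
```

The empty `result` field doubles as a status flag, so the same log tracks both finished experiments and the one currently running.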
Most successful landing page A/B tests are not the product of a lucky break; they're the result of methodical testing and a deep understanding of the target audience. In this client's case, diligent testing increased their conversion rate by 59% and produced $43,017 in additional sales in just 3 months. That's a win in my book! You've heard my two cents, now I want to hear yours. How have you seen A/B testing improve conversion rates? Have you tried a lightbox before? How did it affect your page performance? How do you use case study data to fuel your landing page tests? Let us know in the comments.