Here Are 10 Fascinating A/B Testing Examples That Will Blow Your Mind


If there’s one thing you need to know about good design practice, it’s that good designers test their work. As a business owner, you should be testing everything that goes in front of a user – websites, landing pages, emails, etc. – if you want the best results.

You wouldn’t just invent a product and send it out into the world without asking a few people if it was a good idea, right?

You shouldn’t assume that you know what will catch someone’s attention or make the most people fill out your sign-up form. The things that get the most conversions are often counterintuitive! That’s why data-driven design can be the most effective approach.

Plus, let’s be honest, you want those conversions.

Here’s the skinny on 10 surprising split-testing results you’ll be able to cite at your next networking event.

1. Groove’s Landing Page Redesign

Groove’s conversion rate increased from 2.3% to 4.3%.

This first case study is a great example of how testing and optimization should be done. The team at Groove started with an extensive qualitative study: they spoke to their customers on the phone to figure out what words they used, and they set up an after-signup autoresponder asking new users why they signed up.

They used the results to write the landing page copy in their customers’ own words, and they designed the page only after the copy was finalized.
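If you want to run this kind of test yourself, it’s worth checking that a lift is more than noise before celebrating. Here’s a minimal sketch of a two-proportion z-test in Python; the visitor and conversion counts below are purely hypothetical stand-ins (Groove’s actual sample sizes aren’t published here), chosen only to mirror a lift from 2.3% to 4.3%.

```python
import math

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Two-sided z-test for the difference between two conversion rates."""
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    # Pool both groups to estimate the shared conversion rate under the null hypothesis.
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    std_err = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / std_err
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Hypothetical traffic split: 10,000 visitors per variation.
z, p = two_proportion_z_test(conversions_a=230, visitors_a=10_000,
                             conversions_b=430, visitors_b=10_000)
print(f"z = {z:.2f}, p = {p:.5f}")
```

A p-value well below 0.05 suggests the lift probably isn’t a fluke; with small traffic numbers, the very same percentages can easily fail this check.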

2. Highrise’s Headline & Subheadline Test


Highrise tested different headline and subheadline combinations to see how they affected sign-ups. The test, run with Google Analytics Experiments, showed that the variation telling visitors that sign-up is quick produced a 30% increase in clicks. It was also the only variation with an exclamation mark.

3. Humana’s Banner Test

A simpler design plus a stronger CTA led to 433% more clickthroughs.

Humana tested two different banners on their homepage.

The control was quite cluttered, with a full paragraph of copy and a less noticeable call-to-action. The variation was cleaner, with a strong, obvious CTA button. The two banners also featured different photos.

The cleaner variation achieved a staggering 433% increase in clickthroughs.

But they didn’t stop there! By playing with the microcopy on the button (changing “Get started now” to “Shop Medicare Plans”), they managed to achieve a further 192% increase in clicks.

4. The Olympic Store Checkout

Removing a barrier increased completed purchases by 21.8%.

Typically, when shopping online, we’re met with a sign-up or log-in form as we try to check out. This causes unnecessary friction in the purchase process and can cost revenue.

The Olympic Store decided to test a different approach: they let customers check out without signing up, moving the extra step of creating an account to the end of the process, after checkout.

5. HubSpot’s Lead Conversion

Using an in-line CTA led to a 71% increase in conversions.

This test is very relevant to all of us who are building an email list (which you should be, too!). What’s better: a sign-up form inside a blog post or a separate sign-up page?

HubSpot decided to test this. They offered the lead magnet The Beginner’s Guide to Inbound Marketing at the end of their posts: in one variation, the form was embedded in the post (an in-line CTA); in the other, there was just a link to a landing page with the form.

The in-line CTA performed 71% better.

6. RummyCircle’s Mobile Facebook Ad

On mobile, ad version B had a dramatically lower Cost Per Action: the other version’s CPA was 224.7% higher. And the lower the CPA, the better.

India’s leading rummy site decided to test how differently written Facebook ads affect sign-ups. Here’s why they tested it: in previous testing with desktop users, the team found that engaged users – those who commented on the RummyCircle Facebook ad – were more likely to click through to the gaming site. Commenting, therefore, facilitated lead generation for the site. And engaged leads converted.

But their mobile users turned out to behave quite differently. On mobile, the test showed that asking for comments actually decreased conversions to the email list.

7. Google+ Tests a Promo Banner on Mobile

Mobile website users increased by 17%.

Say what you will about Google+, but apparently the social network isn’t going anywhere. In this interesting case study, the Google+ team put the interstitial ads to the test.

Interstitials are the obtrusive full-screen ads that many websites (Pinterest, looking at you!) use to try to convert their mobile website visitors into app downloads. Supposedly, the app should improve the user’s experience of the content, but in practice it’s often the opposite.

With the interstitial in place, 69% of people left the mobile website right away, while only 9% of visitors clicked the “Install” button. After the team implemented a nicer, less obtrusive app ad, 1-day active users on the mobile site increased by 17%, while the percentage of installs stayed almost unchanged.

8. Yuppiechef’s Navigation Test

Just removing the navigation led to a 100% increase in conversions.

Removing navigation is one of the tips we recommend for quickly increasing conversions, and Yuppiechef’s A/B testing results prove it. The tiny change produced a 100% increase in conversion rate, probably because the stripped-down page offers fewer distractions to users.

9. Centraal Beheer Achmea CTA Test

Adding a link resulted in 244.7% more clicks.

Achmea’s case study produced a very unlikely result. I normally recommend adding an additional link underneath call-to-action buttons for visitors who suffer from banner blindness.

However, this particular website decided to test a call-to-action with a secondary link that didn’t direct people to completing the offer; instead, it let them share the page on LinkedIn!

Surprisingly, the additional link that some would deem a distraction actually produced 244.7% more clicks on the main call-to-action button!

WhichTestWon explains that the reason for this result likely lies in the “Hobson’s Choice” effect: the difficult decision of whether or not to click becomes a decision about which button to click instead. It’s a bit like when you don’t feel like going to the gym: if you frame it differently – “Will I go to the gym or go for a run today?” – you’re more likely to exercise.

10. Server Density Changes Their Pricing Model

Packaging services increased total revenue by 114%.

Server Density is a SaaS company providing hosted server and website monitoring. Their initial pricing model was based largely on their own costs. However, when they tested it against a packaged, value-based pricing model, they found that not only did overall revenue increase, but the number of free-trial sign-ups also decreased, effectively lowering the cost of serving the “tire-kickers.”

As you can see, sometimes even the tiniest tweak to your design can have huge results. You never know how much business you can drum up if you don’t try… and test. And test again.
