We’re always encouraging our clients to A/B test their creatives. Being a company that practices what we preach, we A/B test our own ads as well. Below are three tests we’ve done over the past few months, with insight into what we learned from each of them. It’s important to note that different ads appeal to different kinds of audiences: just because certain ads work for us doesn’t mean they’ll work for you. With that said, here’s the first of many sets of A/B test results we’ll share with you.
Objective: Retargeting Taglines
We tested our original retargeting ad against some new taglines. The idea was to figure out a good tagline for ReTargeter. After testing a bunch of different taglines via SEM, we narrowed it down to two: “Advertise Everywhere” and “Bring Your Users Back.” Here are the results:
What we learned:
This was officially our first A/B test. The taglines we narrowed down from our SEM campaigns didn’t work so well with our retargeted audience. We soon found they were better suited to top-of-funnel campaigns than to retargeting campaigns.
Objective: Audience Education On New Products/Ideas
Here at ReTargeter, we’re always working to drive innovation, and we frequently come up with cool ideas built around display ads and audience targeting. Some ideas are marketable and some aren’t, but for this test we wanted to put our tried-and-true ReTargeter banner up against two new banner sets that highlight some of our latest ideas to our audience. We also accentuated specific parts of the copy and threw in a new CTA for the new product banners.
What we learned:
Our second A/B test was a bit too ambitious: there were too many variations between the three creative sets. We should have first tested the CTA change on the original banner alone. That said, the third creative set’s performance can be heavily attributed to its QR code. For this round, we did not track the number of QR scans, which is possible through link-shortening tools and link-tracking macros; those scans would only add to the banner’s engagement rate.
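To make that last point concrete, here’s a minimal sketch of how tracked QR scans would fold into a banner’s engagement rate. The function and all of the counts below are hypothetical illustrations, not our actual campaign numbers:

```python
# Hypothetical example: counting QR scans as engaged actions
# alongside clicks. All numbers here are invented for illustration.

def engagement_rate(impressions, clicks, qr_scans=0):
    """Engaged actions (clicks + QR scans) per impression."""
    if impressions <= 0:
        raise ValueError("impressions must be positive")
    return (clicks + qr_scans) / impressions

# The same banner looks noticeably more engaging once scans are counted.
clicks_only = engagement_rate(impressions=100_000, clicks=50)
with_scans = engagement_rate(impressions=100_000, clicks=50, qr_scans=20)

print(f"clicks only: {clicks_only:.4%}")   # 0.0500%
print(f"with scans:  {with_scans:.4%}")    # 0.0700%
```

In other words, ignoring scans understates a QR banner’s true engagement, which is why we’d track them via shortened, tagged links next time.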
Objective: Tagline copy
For this test, we wanted to assess additional taglines on a whole new banner design (thanks to the expert website user experience design consultants at Digital Telepathy). In a different A/B test (which we’ll post at a future date), we saw a massive improvement with the new design over our original tried-and-true banner. This could be due to what we call banner fatigue, in which an audience gets bored with the same old creative and needs to see something new. The main focus here, however, was pitting “Market to your visitors after they leave your site” against “Stay in front of your audience after they leave your website!”
What we learned:
This was a simple test that allowed us to compare the performance of specific ad copy. The copy was the only variation between the ads, and “Stay in front of your audience after they leave your website!” proved to be the more engaging tagline during the test.
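When only one element varies, you can also sanity-check that a CTR gap isn’t just noise. Here’s a hedged sketch of a standard two-proportion z-test one could run on such a head-to-head; the click and impression counts are made up for illustration and are not our real test results:

```python
# Illustrative two-proportion z-test for comparing the CTRs of two ad
# variants. All counts below are hypothetical, not our actual data.
import math

def two_proportion_z(clicks_a, imps_a, clicks_b, imps_b):
    """Return (z statistic, two-sided p-value) for CTR_b vs CTR_a."""
    p_a = clicks_a / imps_a
    p_b = clicks_b / imps_b
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_proportion_z(clicks_a=60, imps_a=120_000,
                        clicks_b=95, imps_b=120_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value (conventionally under 0.05) suggests the winning tagline’s edge is unlikely to be chance, which is exactly the kind of confidence a clean one-variable test makes possible.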
These three tests taught us a lot about the proper way to A/B test. Don’t test too many variations across only a handful of creatives, as this makes it harder to figure out which changes are responsible for the performance difference (see Test 2). Testing one variation at a time, such as copy, CTA, or image, and improving your banners accordingly is a foolproof way to get specific, actionable data (see Test 3).
These are just three of the countless A/B tests we’ve done over the past year. More test results are coming soon; stay tuned!