A/B Testing Creatives for Retargeting: Round 1
We’re always encouraging our clients to A/B test their creatives, and as a company that practices what we preach, we A/B test our own ads as well. Below are three tests we’ve done over the past few months, with insight into what we learned from each. It’s important to note that different ads appeal to different kinds of audiences: just because certain ads work for us doesn’t mean they’ll work for you. With that said, here’s the first set of many A/B tests we’ll be sharing with you.
Test 1:
Objective: Retargeting Taglines
We tested our original retargeting ad against some new taglines. The idea was to figure out a good tagline for ReTargeter. After testing a bunch of different taglines via SEM, we narrowed it down to two: “Advertise Everywhere” and “Bring Your Users Back.”
What we learned:
This was officially our first A/B test. The taglines we narrowed down from our SEM campaigns didn’t work well with our retargeted audience. We soon found they were better suited to top-of-funnel campaigns than to retargeting campaigns.
Test 2:
Objective: Audience Education On New Products/Ideas
Here at ReTargeter, we’re always driving innovation, and we frequently come up with cool ideas around display ads and audience targeting. Some ideas are marketable and some aren’t, but for this test we wanted to put our tried-and-true ReTargeter banner up against two new banner sets that highlight some of our latest ideas to our audience. We also emphasized specific parts of the copy and threw in a new CTA for the new product banners.
What we learned:
Our second A/B test was a bit too ambitious: there were too many variations across the three creative sets. We should have started by testing just the CTA change on the original banner. With that said, the third creative set’s performance can be heavily attributed to the QR code. For this round, though, we did not track the number of QR scans, which is possible through link-shortening tools and link-tracking macros (a sketch of one approach follows). Those scans would only add to the banner’s engagement rate.
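We didn’t cover the exact tooling in this post, but the general pattern is simple: encode a tagged (and ideally shortened) link in the QR image so every scan registers as a trackable click in your analytics. Here’s a minimal sketch in Python using the third-party qrcode package; the landing URL and UTM values are illustrative placeholders, not our actual campaign setup.

```python
# Minimal sketch: encode a trackable link in a QR code so scans show up
# as clicks in analytics. Requires the third-party `qrcode` package
# (pip install "qrcode[pil]"). The URL and UTM values are placeholders.
from urllib.parse import urlencode

import qrcode

BASE_URL = "https://example.com/landing"  # hypothetical landing page

# Tag the link so a QR scan is distinguishable from a normal banner click.
params = {
    "utm_source": "display",
    "utm_medium": "qr",
    "utm_campaign": "retargeting_round1",
}
tracking_url = f"{BASE_URL}?{urlencode(params)}"

# In practice, shorten tracking_url with a link-shortening service first,
# then encode the short link; the shortener gives you a scan count for free.
img = qrcode.make(tracking_url)
img.save("banner_qr.png")
print(tracking_url)
```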
Test 3:
Objective: Tagline Copy
For this test, we wanted to assess additional taglines on a whole new banner design (thanks to the expert website user experience design consultants at Digital Telepathy). In a different A/B test (which we’ll share in a future post), we saw a massive improvement from our new design over our original tried-and-true banner. This could be due to what we call banner fatigue, in which an audience gets bored with the same old creative and needs to see something new. The main focus here, however, was pitting “Market to your visitors after they leave your site” against “Stay in front of your audience after they leave your website!”
What we learned:
This was a simple test that let us compare the performance of specific ad copy. Since the copy was the only variation between the ads, we could cleanly attribute the result: “Stay in front of your audience after they leave your website!” seemed to be the more engaging tagline during the test.
Overall Lessons:
These three tests taught us a lot about the proper way to A/B test. Don’t test too many variations across only a handful of creatives, as this makes it hard to figure out which changes are responsible for the performance difference (see Test 2). Testing one variation at a time, such as copy, CTA, or image, and improving your banners accordingly is a foolproof way to get specific and actionable data (see Test 3).
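To make that data actionable, it also helps to check whether a click-through-rate gap between two variants is statistically meaningful before crowning a winner. Below is a minimal sketch of a standard two-proportion z-test in Python; the impression and click counts are made-up placeholders, not numbers from our campaigns.

```python
# Minimal sketch: two-proportion z-test on click-through rates (CTRs).
# All counts below are hypothetical placeholders.
from math import erf, sqrt

def ctr_z_test(clicks_a, imps_a, clicks_b, imps_b):
    """Return (z, two-sided p-value) comparing CTRs of variants A and B."""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    # Pooled CTR under the null hypothesis that both variants perform equally.
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Variant B's tagline looks better at first glance (0.055% vs 0.040% CTR).
z, p = ctr_z_test(clicks_a=40, imps_a=100_000, clicks_b=55, imps_b=100_000)
print(f"z = {z:.2f}, p-value = {p:.3f}")  # here p is about 0.12
# A p-value this far above 0.05 means the gap could easily be noise;
# keep the test running before declaring a winner.
```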
These are just three of the countless A/B tests we’ve done over the past year. More test results are coming soon; stay tuned!