While talking with a client recently about email marketing, it became clear that they were interested in the idea of A/B testing, but not sure how to go about it.
Their open rate and click-through rate (CTR) were low, but within industry standards — so they didn’t have much perspective on how much better it could be. Figuring out all the nitty-gritty details and logistics was just too much of a hurdle — especially considering their cumbersome internal review process just to get ONE version of their newsletter out the door! It was just too much to consider throwing another ball in the air.
While it’s true A/B testing requires some foresight and planning, it can actually help simplify email marketing efforts. Yep, you read that right. A/B testing is awesome because it takes the guesswork, politics, and “groupthink” OUT of decision making. Instead of routing email content through a committee or trusting your gut, A/B testing enables you to make informed decisions, based on user data.
Here’s how it works:
- Create two versions of your email with only one difference. It could be the subject line, the time of day you send it, or even the call to action.
- Segment your list into three groups: two small test groups of equal size, and everyone else. Depending on your list size, each test group should make up anywhere from 2–20 percent of your total list. For lists of a few hundred to several thousand, a test group of 10–20 percent is ideal, while for lists of 400K or more, 2–5 percent will do.
- Send the tests. You’ll want to watch both immediate opens/clicks (a good indicator of mobile engagement) and opens/clicks over time. For the best results, allow 1–7 days between your test and your final send.
- Analyze the test results to determine which email performed better (this can be trickier than you may think).
- Send the final blast.
- Analyze the final results and look for ways to improve next time.
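The segmentation and analysis steps above can be sketched in code. This is a minimal illustration, not tied to any particular email platform: it assumes your list is simply a Python list of subscriber IDs, uses a hypothetical 10 percent per test group, and uses a standard two-proportion z-test as one reasonable way to decide whether the difference in open rates is real rather than noise (the part that’s “trickier than you may think”).

```python
import math
import random

def split_for_ab_test(subscribers, test_fraction=0.10, seed=42):
    """Split a subscriber list into two equal test groups plus everyone else.

    test_fraction is the share of the whole list that EACH test group gets,
    so 0.10 means two 10% groups, with 80% held back for the final send.
    """
    shuffled = subscribers[:]
    random.Random(seed).shuffle(shuffled)  # fixed seed keeps the split reproducible
    n = int(len(shuffled) * test_fraction)
    return shuffled[:n], shuffled[n:2 * n], shuffled[2 * n:]

def two_proportion_z_test(opens_a, size_a, opens_b, size_b):
    """Return (z, p_value) for a two-sided test of whether two open rates differ."""
    rate_a = opens_a / size_a
    rate_b = opens_b / size_b
    pooled = (opens_a + opens_b) / (size_a + size_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / size_a + 1 / size_b))
    z = (rate_a - rate_b) / se
    # Two-sided p-value from the normal CDF, via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Example: a 23% vs 18% open rate across two 1,000-person test groups
group_a, group_b, everyone_else = split_for_ab_test(list(range(10_000)))
z, p = two_proportion_z_test(230, 1000, 180, 1000)
```

A small p-value (conventionally below 0.05) suggests the winning version really did perform better; if the p-value is large, the "winner" may just be chance, and a bigger test group next time would help.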
Subject line A/B
Subject line testing is a great place to start if you’ve never used A/B testing before. The premise is simple: you send two identical emails at the exact same time, but with different subject lines.
Testing the subject line is great when your email contains multiple messages (such as a newsletter) or when you’re debating which angle to take on a call to action.
One example we’ve seen a lot with regular newsletters is the compulsion to include the newsletter name and issue information in the subject, like this:
Awesome Co. Newsletter - Spring 2012
If your business uses a line like that, consider testing it against a line that teases one of your newsletter articles. For example, let’s say your company sells thingamajigs and one of your newsletter articles this month is about your spring sales drive. Here are three possible headlines to test against the one above:
1/2 off Thingamajigs right now at Awesome Co.
Our best Thingamajig deals, just for you
Thingamajigs on Sale This Week
You might be surprised at the effect lines like these will have not only on opens, but click-through rates too. Often in newsletters we see a correlation between the subject line and the most clicked link. It’s like a one-two punch for conversions.
Timing A/B
Telemarketers are notorious for interrupting dinner because they know dinnertime is their best bet for reaching people at home. A timing A/B test will help you find the e-marketing equivalent of dinnertime for your audience. As the name implies, in this test the content and subject line stay the same for both emails, but you send them at different times.
There’s a lot of conflicting information out there about the best day or time of day to send email. The truth is it depends on your company, your message, and your audience.
You might find Sunday evenings are great for reaching busy CEOs as they prepare for the new week. Or that Thursday evening is prime time to engage millennials vegging out with the iPad close at hand.
Ask yourself a few questions to figure out a few day parts to test. Are you trying to reach your audience at work or at play? Are your customers scattered across multiple time zones? What sort of time investment does your message ask of users?
With this test, be sure to compare results after a consistent amount of time, and wait 24 hours at the very least. Ideally, you’ll want to send your final email on the same day and at the same time as the test — meaning you’d need to plan for a week of wait time in between. Because of that, this testing methodology is best for monthly newsletters and sporadic announcements, but can be impractical for things like flash sales or time-sensitive product launches.
These two testing strategies are a great place to start — but they’re just the tip of the iceberg when it comes to harnessing the potential of email marketing. There’s some really powerful stuff possible with customization, social integration, analytics, and more. At the end of the day, though, successful email marketing comes down to knowing your users. The best way to get to know them?
Test, analyze, refine, repeat.
I plan to share more about these topics in future blog posts. What email marketing challenges do you have? What questions can we answer about email marketing testing? Let me know in the comments.