A/B testing your newsletters
Earlier this year, I wrote a few posts on email marketing. In those posts I also mentioned that A/B testing your newsletters (or other forms of email marketing) is a must. However, there are a lot of things you can test, so what should you be focusing on?
In this post I’ll try to answer that question by explaining what you can test. I won’t go into detailed testing examples, but I will tell you what to pay attention to when testing.
With most email campaign tools, you’ll have the possibility to test the subject line. This means you’ll be able to give your newsletter a number of different subject lines. If you have two different subject lines, ordinarily 50% of your newsletter list gets the first variation and the other 50% gets the second.
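That 50/50 split is essentially a random partition of your list. As a minimal sketch of what your tool does behind the scenes (the function name and addresses here are made up for illustration):

```python
import random

def split_list(subscribers, n_variations=2, seed=42):
    """Shuffle the mailing list and deal it out into equal groups,
    one group per subject-line variation."""
    shuffled = subscribers[:]
    random.Random(seed).shuffle(shuffled)
    return [shuffled[i::n_variations] for i in range(n_variations)]

subscribers = [f"user{i}@example.com" for i in range(10)]
groups = split_list(subscribers)
print(len(groups[0]), len(groups[1]))  # 5 5
```

The fixed seed just makes the example reproducible; a real campaign tool would shuffle differently on every send.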
Testing your subject lines is really only good for testing your open rate, not your click rate. The subject line won’t affect your click rate, since it doesn’t change anything within the body of the email you’re sending. That being said, testing your subject lines is still very important, as you want as many people as possible to read what you’ve sent them, right?
One set of rules that our friend Jordie van Rijn (a great email marketer) taught us, and that has helped us ever since, is C.U.R.V.E.:
- Curiosity: try to pique the readers’ interest by asking them a question.
- Urgency: create urgency by having limited time offers or offering things that need to be done now.
- Relevance: make sure you’re putting the content that’s most relevant to your audience in your subject.
- Value: convey the value of the newsletter by offering something exclusive (this can be an exclusive product offer, but also exclusive content).
- Emotion: use punctuation, such as exclamation marks, to elicit emotional responses from your readers.
Another thing you can almost always test is your from name. This is exactly what it says: the name that shows whom the emails are coming from.

This, again, only affects your open rate. It’s also something people tend to forget about, because it’s such a small thing to change. Yet the from name can actually be pretty important: it’s the first thing people see when your email arrives, so it had better be good. Testing it will make sure it is.
I’m not sure whether all email campaign tools offer this A/B testing option, but MailChimp does. You can test what send time (MailChimp calls this “delivery time”) works best for your audience. You need to do some work here beforehand though, because you’ll be setting the time the variations go out yourself.
So try to find out when most of your emails are opened, or at least when most of your audience is awake. Especially if your emails go to an international group of people, like ours, this can be a good thing to test. Sending your emails at the right time can make sure more people see them and pay attention to them.
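If your campaign tool exports open timestamps, finding a candidate send time can be as simple as counting which hour your past opens cluster in. A rough sketch (the helper name and sample timestamps are mine, not any tool’s export format):

```python
from collections import Counter
from datetime import datetime

def best_send_hour(open_timestamps):
    """Given ISO-format timestamps of past opens, return the hour
    of day (0-23) at which opens most often occurred."""
    hours = Counter(datetime.fromisoformat(ts).hour for ts in open_timestamps)
    return hours.most_common(1)[0][0]

opens = ["2015-03-02T09:15:00", "2015-03-02T09:40:00", "2015-03-02T14:05:00"]
print(best_send_hour(opens))  # 9
```

With an international audience you’d want to do this per time zone, since one global “best hour” can hide very different local patterns.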
This is the big one. This is where you can go all-out and test basically anything you like. Everything within the content section of your email can be tested, and that’s a lot. You have to really think about what you want to test and treat these A/B tests as you would any other. I’ve written a post which will explain this: Hypothesize first, then test.
I always prefer to begin with this one, because it comes as late in the reader’s process as possible. This is a personal preference: I just don’t like the idea of optimizing an earlier part of the process (say, the subject) when what readers see next (such as your email’s content) could undo all the optimization you did before.
Just a few ideas of what you could think about when wanting to test your email’s content:
- Your email’s header;
- An index summarizing your email;
- More (or less) images;
- Different tone of voice;
- More buttons instead of text links;
- More ideas on Jordie’s blog.
When you start testing, most email campaign tools will offer you two options:
- send your variations to your complete list, or
- send your variations to a percentage of that list, declare a winner and then send the winner to the remaining people who haven’t received a newsletter yet.
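The second option boils down to sending each variation to a sample, then picking the variation with the best metric. A minimal sketch of that winner-picking step (the function name and numbers are hypothetical, not any tool’s API):

```python
def pick_winner(results):
    """results maps each variation to (opens, sends);
    return the variation with the highest open rate."""
    return max(results, key=lambda v: results[v][0] / results[v][1])

# e.g. each variation went to a 25% sample of 2,000 subscribers
sample_results = {"A": (120, 500), "B": (150, 500)}
print(pick_winner(sample_results))  # B
```

The remaining 50% of the list would then receive variation B, hours or days after the samples went out.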
I’d strongly urge you to use the first option, and let me tell you why. First of all, sending multiple variations to just a sample of your list means you’re cutting down on “respondents”: you’ll have less data than when you send to the complete list.
However, if your list is big enough, this probably won’t matter much. The reason I’d still choose the first option is that the winning variation gets sent out hours (or days) later. Especially for newsletters this can be quite crucial, because, well, then it’s not really “news” anymore. It also means you have less control over when the mail gets sent out. And as I’ve already said: send time can be quite important.
If timing is of less importance to the emails you’re sending out, then you could probably go for the second option, because then the remaining people in your list will always get the winner.
So you’ve thought up some brilliant variations of your newsletter, its subject, from name or send time. Time to send out that newsletter. Once you’ve sent it, there’s nothing more you can do; you just have to wait until the first results come trickling (or flooding) in. Make sure you take note of the differences in results. Which version got the highest open rate? Which version had the highest click rate?
Of these, click rate always has my preference, because people who click will probably end up on your site, where you have a lot more opportunities for selling, for example. However, we also use custom campaigns on all the links in our newsletter, and since we’ve set up eCommerce tracking in Google Analytics, we can see which version of our newsletter actually generated the most revenue. And if you have a business to run, that’s probably the metric you want to see increasing.
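Those custom campaigns are just extra query parameters on every newsletter link, which Google Analytics picks up as `utm_*` campaign tags. A small sketch using Python’s standard library (the campaign name is a made-up example):

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def add_utm(url, campaign, source="newsletter", medium="email"):
    """Append Google Analytics campaign parameters to a link,
    preserving any query parameters already on the URL."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({"utm_source": source, "utm_medium": medium,
                  "utm_campaign": campaign})
    return urlunparse(parts._replace(query=urlencode(query)))

print(add_utm("https://example.com/shop", "newsletter-2015-03-a"))
```

Give each newsletter variation its own campaign name and the revenue per variation shows up separately in your Analytics reports.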
And unless you’ve set up some kind of eCommerce tracking within your email campaign tool, this metric won’t be available in its results. So don’t rely too heavily on the results these tools show you. Make sure you focus on what’s important for your business and check those metrics.
Also: don’t be too quick to judge. I usually wait a few days up to a week before I draw my conclusions, because a lot of people will still be opening and engaging with your email after a few days.
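If you want to check whether the difference you’re seeing is more than chance before declaring a winner, a two-proportion z-test on the open rates is one common approach. This is my own sketch, not something your email tool provides; the numbers are illustrative:

```python
from math import erf, sqrt

def open_rate_p_value(opens_a, sends_a, opens_b, sends_b):
    """Two-proportion z-test: two-sided p-value for the difference
    in open rates between variations A and B."""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    # two-sided p-value via the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

p = open_rate_p_value(220, 1000, 260, 1000)
print(p < 0.05)  # True: the difference is unlikely to be chance
```

A p-value below 0.05 is the conventional cut-off, but with small lists even real differences often won’t reach it, which is another reason to let results accumulate for a few days first.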
What do you think of the steps and rules we’ve set for ourselves? Do you have similar ideas that you follow? Or maybe something completely different? Let us know in the comments!
This post first appeared as A/B testing your newsletters on Yoast.