
A/B Testing for Better Email Marketing

If you have an email list (and you’re actually using it), you’re probably aware that email marketing has a great ROI. Every time you send an email to your list, you’re either nurturing those leads or trying to make a sale.

But if you’re paying attention to the numbers (which we’ll discuss more below), you probably find that some emails do their job better than others. An email you thought was going to be a big money-maker turned out to be a dud. Or one that you dashed off in the five minutes between a site visit and a health inspection brought in beaucoup bucks—and you have no idea why.

Wouldn’t you really like to know why, so you can recreate the results?

I know I would. Fortunately, there is a handy tool that we can use to figure out what your audience responds to, and what they ignore like a speed limit sign in Texas.

It’s called A/B testing.

What is A/B Testing?

Simple! 

A/B testing (also called split testing) is a way of evaluating two variations of an email, website, sales page, or other customer-facing content.

In email marketing, A/B testing means sending two versions of an email that differ in just one element to small segments of your subscriber list to see which gets better results. When the test is complete and one email is crowned the winner, your email service provider will automatically send the victorious version to the rest of your list.

What Can We Test?

More like what can’t we test! 

Here are some options: 

  • Subject lines
  • Calls to action (CTAs)
  • Photos and graphics
  • Button color or button size
  • Number of links in an email
  • Copy (content and length)
  • Personalization (to use the recipient’s name, or not?)
  • Send time/day of the week

However, if you want to get actionable results that you can build on, you should only test one thing at a time. 

Here’s why:

If you send two completely distinct emails, with different subject lines, CTAs, visuals, buttons, and copy, you’ll have no idea why one performed better than the other. You’ll just know that something (or some combination of things) worked. 

So next time you go to create an email, you won’t know where to start. 

When you test a single variable and assess the results, you can reasonably infer that the better result was due to that variable. Then you can build on that knowledge next time.

Here’s an example. 

Let’s say you send an email to your list promoting a discount on a new product. Here were your two subject lines: 

  • A: Save 15% on Fancy New Thing!
  • B: Save $8 on Fancy New Thing!

You’re finding out whether your audience responds better to a percentage or a dollar value for a discount of this size. When the results come in, you find that they preferred the $8 subject line in a big way: email B’s open rate far outperformed email A’s.

So next time you’re offering a similar discount, you can test another variable. Maybe next time, your subject lines will look like this: 

  • A: Save $8 on Even Fancier New Thing!
  • B: Even Fancier New Thing $8 off!

Here, we’re testing whether the dollar savings perform better at the beginning of the subject line vs. the end.

There is a way of testing multiple variables at once. It’s called “multivariate testing.” This is good for testing which combinations of variables work best together. But you need a very large list to make this method effective. 

Can You Only Test Two Options?

That’s a negative. 

While you should only test one variable at a time, that doesn’t mean you can’t test more than two versions of that variable at once. 

Just keep in mind that the larger the group you have to check these variables, the better. If you have a small list, you should probably stick to two options until your list is more robust.


How Should We Decide What to Test?

Great question! 

If you’re brand new to email marketing, you’re starting without much to go on. As you send more emails, you’ll start to compile data about what emails are more successful than others. And you can analyze that data to come up with some new theories. 

Maybe you notice that your shorter emails are getting more clicks than your long ones. So you can start writing both short and long versions, and see if those findings are consistent. 

Check out this blog post for some more insight into how you can analyze your data and look for patterns and clues.

How Does A/B Testing Work?

So now that you understand the why, how do you actually do this?

Email service providers make this easy. The platform lets you choose the percentage of subscribers you want to include in each test, then automatically sends the winning version to the rest of your list.

Each provider should have a walkthrough that helps you figure out the logistics. 

Just do a Google search for “A/B test Mailchimp” (or whichever platform you use for email marketing) to find it.
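If you’re curious what your provider is doing behind the scenes, here’s a rough Python sketch of the split itself (the function name and email addresses are made up for illustration, not any platform’s actual code): shuffle the list so the groups are comparable, carve off two equal test groups, and hold the rest back for the winner.

```python
import random

def split_for_ab_test(subscribers, test_fraction=0.2, seed=42):
    """Split a list into two equal test groups (A and B) plus a holdout
    that will later receive the winning version."""
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)      # randomize so the groups are comparable
    test_size = int(len(pool) * test_fraction)
    half = test_size // 2
    group_a = pool[:half]
    group_b = pool[half:2 * half]
    holdout = pool[2 * half:]              # these folks get the winning email later
    return group_a, group_b, holdout

subscribers = [f"reader{i}@example.com" for i in range(1000)]
a, b, rest = split_for_ab_test(subscribers)
print(len(a), len(b), len(rest))  # 100 100 800
```

With a 1,000-person list and a 20% test group, 100 people see version A, 100 see version B, and the remaining 800 get whichever one wins.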

What Does “Winning” Look Like?

There are a number of metrics you can look at, and they can start to get pretty complicated. Since I promised an uncomplicated guide, we’re going to keep this simple.

Your open rate is the percentage of people who received the email and then opened it. It’s a good metric to watch if you’re testing subject lines, since the subject line is what convinces people to open in the first place.

Your click-through rate is the percentage of people who received the email and then clicked a link inside it. In other words: they completed the next step. This is a much more valuable metric than open rate!

Your conversion rate is the percentage of people who received the email and went on to convert. That could be making a purchase, scheduling a sales call, or signing up for a webinar. Just depends on what the purpose of the email was. This is the most important metric of all!

Not all tests will return statistically significant results. You may test two CTAs and get similar click-through rates for each. It happens! Whether a result is significant depends on your sample size, which depends on the size of your list. Here’s a handy tool you can use to check whether your results are statistically significant.
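If you’d rather check significance yourself than use an online tool, the standard way to compare two rates is a two-proportion z-test. Here’s a rough sketch in plain Python (no libraries needed; the function name is mine, and this is one common approach rather than the only one):

```python
from math import sqrt, erf

def ab_significant(clicks_a, n_a, clicks_b, n_b, alpha=0.05):
    """Two-proportion z-test: is the gap between the two rates
    bigger than we'd expect from random chance?"""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)   # combined rate, assuming no real difference
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-tailed p-value from the normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_value < alpha, p_value

# Hypothetical test: 40/500 clicks (8%) vs. 70/500 clicks (14%)
significant, p = ab_significant(clicks_a=40, n_a=500, clicks_b=70, n_b=500)
print(significant)  # True
```

With 500 sends per group, 8% vs. 14% is a real difference; something like 8.2% vs. 8.6% on the same sample size would not be.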

Also, it is absolutely possible to have a low open rate and a high conversion rate. That’s okay! Conversions are what we want. Much better to have fewer opens and more sales than more opens and fewer sales.

Start Small with A/B Testing

A/B testing your emails can get a lot more complicated than what we’ve covered here. But I’m a big fan of keeping things simple—especially for my hospitality peeps who don’t have full-time marketing departments.

Don’t stress about this. If you’re building a list and reaching out to them semi-regularly, you’re already ahead of the game. But split tests are a handy tool to start dialing up the effectiveness of your email campaigns.

Go get ’em.
