5 Simple Steps to A/B Test Your Email Campaign Sending Times

As we saw in our previous post on the best email sending times, there is no silver bullet on this topic. Yes, you can score a few quick wins by looking at stats and trends alone. But to find out when you should really be sending your emails, you’ll need to do some basic A/B testing yourself. I hope this post makes that process as easy as possible.

Some general A/B testing tips

Once I realized how important sending time is to email campaign performance, I was disappointed to discover that only a handful of ESPs have this A/B testing functionality built in natively. The table below gives a quick rundown of various ESPs; I’ve included most of the major ESPs that are integrated with our product, Mojn. While built-in testing is great, all is not lost if you are stuck with an ESP that doesn’t support it: you can always divide your email list yourself and send the parts out at different times. Tedious, but possible.

ESP               | Supports A/B testing of sending time
Apsis             | No
Aweber            | No
Campaign Monitor  | No
Compost Carma     | No
Constant Contact  | No
ExactTarget       | No
GetResponse       | Yes
Hubspot           | No
Instiller         | No
MailChimp         | Yes
MailerLite        | No
Pure360           | No
Responsys         | No
SilverPop         | No
SimplyCast        | No
Sitecore          | No
SmartFocus        | No
Ubivox            | Yes
Vertical Response | No
XCampaign         | No

Focus on the bottom line, not just the open rates

When comparing results from different testing samples, try not to focus only on open and click rates. Instead, get as close as you can to measuring how different sending times affect your revenue. MailChimp, for example, offers direct revenue tracking with its Ecommerce360 product, which you should set up so you can easily compare sales results between tests within the ESP. Also: while you’re testing sending times, keep the email content and subject lines identical for all testing samples. All other variables need to stay the same across samples so you can be confident in your end results.
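To make that comparison concrete, here is a minimal sketch of ranking samples by revenue per recipient rather than open rate alone. The send times and figures below are made-up placeholders, not real campaign data:

```python
# Compare test samples by revenue per recipient, not just open rate.
# All numbers here are hypothetical placeholders for illustration.
samples = {
    "Tue 14:00": {"sent": 2000, "opens": 460, "revenue": 310.0},
    "Sat 10:00": {"sent": 2000, "opens": 520, "revenue": 415.0},
}

for name, s in samples.items():
    open_rate = s["opens"] / s["sent"]
    rev_per_recipient = s["revenue"] / s["sent"]
    print(f"{name}: open rate {open_rate:.1%}, "
          f"revenue per recipient ${rev_per_recipient:.3f}")
```

A sample with a lower open rate can still win on revenue per recipient, which is exactly why open rate alone can mislead you.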

Simple built-in A/B testing functionality (MailChimp)

Prepare big enough testing samples

I would recommend keeping the majority of the list on your regular sending schedule, just so you don’t risk losing a lot of revenue if some of the testing samples perform badly. Make sure that each sample contains at least 1,000 emails and that recipients are assigned to samples randomly.
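Where does a figure like 1,000 come from? The standard two-proportion sample-size formula gives a rough answer. The sketch below uses the usual z-values for 95% confidence and roughly 80% power; the 20% base open rate and 25% relative lift are assumptions for illustration, not recommendations:

```python
import math

def sample_size(p_base, rel_lift, z_alpha=1.96, z_power=0.84):
    """Approximate per-sample size needed to detect a relative lift in a
    base rate (standard two-proportion formula; z_alpha for 95% confidence,
    z_power for ~80% power)."""
    p1 = p_base
    p2 = p_base * (1 + rel_lift)
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_power * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p1 - p2) ** 2)
    return math.ceil(n)

# e.g. detecting a 25% relative lift on a 20% base open rate
print(sample_size(0.20, 0.25))  # a bit under 1,100 per sample
```

Smaller expected lifts need much larger samples, which is why "at least 1,000" is a floor, not a target.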

Let’s say you have an email marketing list of 100,000 addresses that you mail every Tuesday afternoon. I would keep 90,000 of those on the regular sending schedule and create 5 different testing samples with a sample size of 2,000 emails each. Note: I increased the sample size here a bit in order to be even more confident in the test results. If you would like some help calculating appropriate test sample sizes, you can use this handy sample size calculator.
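The 90,000 / 5 × 2,000 split above can be sketched in a few lines. The addresses and seed here are hypothetical; the point is that samples are carved out at random and the remainder stays on the regular schedule:

```python
import random

def split_list(addresses, n_samples=5, sample_size=2000, seed=42):
    """Randomly carve out n test samples; everyone else stays on the
    regular sending schedule."""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = addresses[:]
    rng.shuffle(shuffled)
    samples = [shuffled[i * sample_size:(i + 1) * sample_size]
               for i in range(n_samples)]
    control = shuffled[n_samples * sample_size:]
    return control, samples

# With 100,000 addresses: 90,000 stay on the Tuesday schedule,
# 5 samples of 2,000 each get the alternative sending times.
addresses = [f"user{i}@example.com" for i in range(100_000)]
control, samples = split_list(addresses)
print(len(control), [len(s) for s in samples])  # 90000 [2000, 2000, 2000, 2000, 2000]
```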

5 steps to reach your optimal sending time

I wanted to put the above example into an actionable list of steps you can take to optimise your email sending times. Here is the list:

  1. Keep sending most of the emails (90,000 in our example) at your normal sending time.
  2. Create 5 test samples from the rest of the email addresses and schedule them at different sending times. (Looking at some of the stats from last week’s post, I would focus on weekends, as I think that’s where the biggest opportunity currently lies in the consumer space.)
  3. Repeat the test a week later to see if there were external factors affecting the first test.
  4. If you have a clear winner, switch the main portion of your list (90,000 in our example) to be sent out at the winning sample’s time. (Make sure the winning time actually outperformed all the others on your most important metric – revenue, for example.)
  5. Rinse and repeat the steps above in the following weeks and keep fine-tuning your sending time and day.
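One way to sanity-check step 4 – that the "clear winner" genuinely beat your regular time rather than winning by chance – is a simple two-proportion z-test on conversion counts. This is a minimal standard-library sketch; the conversion counts below are hypothetical:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-score for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: best test sample (Saturday) vs a control slice (Tuesday).
z = two_proportion_z(conv_a=70, n_a=2000, conv_b=40, n_b=2000)
# |z| > 1.96 roughly corresponds to p < 0.05 (two-tailed)
print(f"z = {z:.2f}, significant: {abs(z) > 1.96}")
```

If |z| stays below 1.96, treat the result as noise and keep testing before moving the whole list.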

Testing never stops

Why not just stop testing after 2 weeks? Well, you can, and you will still probably end up with a better sending time than you started with. That said, I don’t believe you can ever pin down an all-time best delivery time; for this reason, you should always be running a few tests. External factors, like your target audience’s email reading behaviour or your competitors’ mailing schedules, vary over time, and you need to adapt to them.

Takeaway

As we’ve seen above, A/B testing takes some time. But if you organize it in a simple way and use an ESP that supports it natively, it shouldn’t take too much time out of your busy schedule. The important bit is not to settle too early: keep pushing for at least a few months. Every improvement counts, especially when it comes to something as tightly connected to every e-commerce bottom line as direct email campaigns.

Got some other tips that might be useful for our readers? Leave a comment below; we are always on the lookout for more ideas and tips!

 

  • Ariel Roberge

    There is an error in your list. Compost Carma has supported AB, ABC and ABCD testing on Subject Lines and Content at send since early 2013.

    The winner can be manually selected or automatically sent based on Open Rates, Click Rates or Conversion Rates (revenue generated or number of conversions).