Let me paint you a picture…
You send out a weekly email blast featuring your best products or deals.
Suddenly, your two copywriters have opposing theories on what subject line to use.
You’ve been using the same old one for the past month, and it seems to be converting at around 5%.
But, one of your copywriters believes that a new subject line will convert better.
Who do you believe?
Well, like a good marketer you won’t assume.
You test.
You A/B test the two subject lines with equal sample sizes.
And behold…
You find that the new subject line is converting 2 percentage points higher: around 7% instead of 5%.
Depending on the size of your email list and how frequently you send this promotional email, you’ve just received a significant bump in revenue.
That’s why A/B testing is important.
Let’s break it down…
Stage 1. What Metric?
Now, one thing to note: before you even consider the metrics below, you NEED to improve your deliverability. There is no point testing if all your emails are landing in spam, or just plain not being received. Read this guide if you want a step-by-step on that.

Now… before you run a test, start with the end in mind. What metric are you trying to improve? Here is a reference table with the different metrics in order of testing priority (based on the goal of maximum profit) and what you could test to improve each one:
| Metric | Priority | Email A/B Testing Factors |
|---|---|---|
| Open Rate | 1 | A: Short subject line vs. B: Long subject line<br>A: Personalised subject line vs. B: Generic subject line<br>A: Morning send time vs. B: Evening send time |
| Click-Through Rate (CTR) | 2 | A: Image-heavy design vs. B: Text-heavy design<br>A: Single CTA button vs. B: Multiple CTA buttons<br>A: Personalised content vs. B: Non-personalised content |
| Conversion Rate | 3 | A: Percentage discount vs. B: Fixed amount discount<br>A: Product-focused landing page vs. B: Offer-focused landing page<br>A: Standard abandoned cart email vs. B: Incentivised abandoned cart email |
| Email Revenue per Subscriber | 4 | A: Weekly email frequency vs. B: Bi-weekly email frequency<br>A: Direct upselling vs. B: Bundled upselling<br>A: Standard re-engagement email vs. B: Incentivised re-engagement email |
| List Growth Rate | 5 | A: Pop-up sign-up form vs. B: Embedded sign-up form<br>A: Short lead magnet vs. B: Long lead magnet<br>A: Organic social media promotion vs. B: Paid social media promotion |
Stage 2. Understanding the Best Practices for Testing
Before we get into the nitty-gritty of actually conducting the test, I just want to cover a few recommendations that will make your tests a lot smoother. Since our agency, Strikecopy, operates mainly with Klaviyo, I will make some references to it in this section.
Develop a hypothesis
To start A/B testing, you should create a hypothesis based on what you want to achieve. For example, you might think that a subject line with an emoji will perform better than one without. Or, you might believe that an email from a person at your company will perform better than one from your company name. Once you have a hypothesis, you can choose the email elements to test. You can start testing micro-elements like CTA button color, preview text, and subject line length. Or, you can test more significant elements like email layout, images, and brand voice to make a major difference. But personally, I recommend…
Testing high-impact, low-effort elements first
You can’t test everything at once. It’s best to start with high-impact elements that are easy to test, like subject lines, CTA copy, and email layout. These tests are quicker and much easier to analyze the data for. Once you’ve tested those, you can move on to the more significant elements we mentioned before. When running these tests, make sure you…
Use a Large Sample
If you can, it’s best to send each version of your email to at least 1,000 recipients. Sending to a smaller group may result in insignificant or inconclusive results. And typically, when you have a smaller email list, your primary goal should be to grow it anyway.
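If you want to sanity-check that 1,000-recipient rule of thumb against your own numbers, here is a minimal Python sketch using the standard two-proportion sample-size formula (normal approximation). The 20% baseline open rate and 3-point target lift in the example are made-up assumptions; plug in your own.

```python
import math

def required_sample_size(p1: float, p2: float) -> int:
    """Recipients needed per variant to detect a lift from rate p1 to rate p2.

    Standard two-proportion formula (normal approximation), with z-values
    hardcoded for a two-sided alpha of 0.05 and 80% power.
    """
    z_alpha = 1.96    # two-sided 95% confidence
    z_beta = 0.8416   # 80% power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p1 - p2) ** 2)

# Made-up example: detect an open-rate lift from 20% to 23%
print(required_sample_size(0.20, 0.23))  # about 2,900 per variant
```

The takeaway: the smaller the lift you want to detect, the more recipients you need, which is why tiny lists rarely produce conclusive tests.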
Test at the Same Time
Unless the variable you’re testing is send time, it’s best to send both the control and variation simultaneously. You’ll eliminate time-based factors. Just like in a school experiment, you ideally want to change only one factor at a time.
Prioritise the Emails You Send the Most
Invest your A/B testing efforts into the activities that yield the most profit, like:
- Weekly newsletters
- Welcome emails
- Promotional or offer emails
Wait enough time before evaluating performance
Even if you send an email to a lot of people, don’t expect immediate engagement. Give your recipients enough time to engage with the email. With Klaviyo, you can set a test duration, and the winner will be declared after that.
Stage 3. How to Set up and Execute the Test
1. Figure out what you want to test: Before you get your hands dirty with A/B testing, you gotta know what you’re testing! It could be anything from the subject line to the look of the email.
2. Create two versions of your email: Once you know your what, you gotta create two versions of your email. They should be pretty much the same except for the thing you’re testing.
3. Split your email list: Divide your email list into two halves and send one version of the email to each half (see the sketch after this list for one simple way to do the split).
a. In Klaviyo this is as simple as creating a split. Not affiliated with these guys, but I truly recommend it. If you’re serious about your email revenue, pick a tool that can at least run these tests.
4. Check the results: After a certain amount of time (usually around 24-48 hours), look at the results of your test. Check out metrics like open rates, click-through rates, and conversions to see which version of the email did better.
5. Send the winning version to all your subscribers: Once you’ve identified the winner, send that version to the rest of your email list.
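For step 3, if your tool doesn’t handle splits for you (or you’re working from an exported list), here is a minimal Python sketch of a deterministic 50/50 split. The addresses are made up; hashing each address means a subscriber always lands in the same group, even if you re-run the script.

```python
import hashlib

def assign_variant(email: str) -> str:
    """Deterministically assign an email address to variant A or B."""
    digest = hashlib.sha256(email.strip().lower().encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Made-up subscriber list
subscribers = ["ada@example.com", "grace@example.com",
               "alan@example.com", "edsger@example.com"]

groups = {"A": [], "B": []}
for address in subscribers:
    groups[assign_variant(address)].append(address)

print(f"Variant A: {len(groups['A'])}, Variant B: {len(groups['B'])}")
```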
How Often Should I Test?
When it comes to testing in email marketing, the right frequency depends on factors like the size of your email list, your available resources, and the current performance of your campaigns. But it’s generally a good idea to test regularly so you can keep improving. Here are some guidelines I follow:
1. Continuous testing: Always have at least one A/B test running. This way, you can collect data consistently and make data-driven decisions to improve your email marketing campaigns.
2. Test with every campaign: If you have the resources, try testing one thing with every email campaign you send. This can help you understand what your audience likes and make your overall email marketing better.
3. Monthly or quarterly tests: If you have a smaller email list or limited resources, consider testing every month or every three months. This will still give you valuable insights and help you make your campaigns better over time.
4. Seasonal testing: During peak seasons or promotional periods, like holidays or sales events, you might want to test more often. This way, you can take advantage of the extra interest and engagement from customers.
Remember that your test results might not always be conclusive, so it’s important to keep re-testing or try different variations to get a clearer understanding of what works for your audience.
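On that note, “conclusive” has a concrete meaning: the gap between your two variants should be bigger than what random chance would produce. Here is a minimal sketch of a standard two-proportion z-test you can run on your results; the open counts in the example are invented.

```python
import math

def two_proportion_p_value(opens_a: int, sends_a: int,
                           opens_b: int, sends_b: int) -> float:
    """Two-sided p-value for the difference between two open rates."""
    p_a, p_b = opens_a / sends_a, opens_b / sends_b
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    z = (p_a - p_b) / se
    # Convert the z-score to a two-sided p-value via the normal CDF
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))

# Invented example: 230/1,000 opens vs. 200/1,000 opens
print(round(two_proportion_p_value(230, 1000, 200, 1000), 3))  # ~0.103
```

A common convention is to call the winner real only when the p-value lands below 0.05. The invented example above comes out around 0.10, which is exactly the “looks like a winner but isn’t conclusive yet” situation worth re-testing.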