Yes, yes, and yes.
All three methods are opportunities to reach and connect with prospects. However, according to 94% of marketing leaders, email marketing remains one of the top three most effective marketing channels.
But this is only true if you can get folks to actually open and click through to your website or landing page. The best way to do that?
Run experiments and A/B test your way to success.
In this article, we'll cover what email A/B testing is, discuss its importance, and break down a list of eight ways to get started.
What is email A/B testing?
Email A/B testing, or split testing, is an email marketing strategy marketers use to experiment with different versions of emails to determine which performs best. You test two versions of your email, with slight variances to them, to determine which is the winning email that gets you better results.
For example, you might send the same email to two different groups of subscribers, but each with unique subject lines.
The goal? To see which subject line gets the most email opens.
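Mechanically, the split is simple. Here's a minimal sketch in Python of randomly dividing a subscriber list into two equal groups and assigning each its own subject line (the list, subject lines, and variable names are illustrative, not from any particular email platform):

```python
import random

# Hypothetical subscriber list -- swap in your own data source.
subscribers = [f"user{i}@example.com" for i in range(1000)]

subject_a = "Last chance: 20% off ends tonight"
subject_b = "Your 20% discount expires at midnight"

# Shuffle so each group is a random sample, then split down the middle.
random.shuffle(subscribers)
midpoint = len(subscribers) // 2
group_a, group_b = subscribers[:midpoint], subscribers[midpoint:]

# Each group gets the same email body but a different subject line.
campaign = {subject_a: group_a, subject_b: group_b}
```

Because the shuffle is random, any difference in open rates between the two groups can be credited to the subject line rather than to who happened to land in each group.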
Once you learn what makes your audience click, you can better optimize campaign performance for even more wins.
Why is email A/B testing important?
Running email marketing campaigns without split testing leaves money on the table. Without it, there's no way to know whether a specific subject line, offer, design, or copy choice affects campaign outcomes. You have to test your way there.
Overall, email A/B testing helps you achieve:
- higher email open rates
- higher click-through rates
- more traffic to your website
- increased conversions
- decreased unsubscribe rates
But improving these metrics is only one part of it. A/B testing also enhances the technical side of email marketing. If you don't test email deliverability, you risk your messages not hitting recipients' mailboxes at all and hurting campaign metrics. Even if your emails do get delivered, you need to ensure they appear correctly.
Does the email look just as good on mobile as on desktop? If the readability is poor, your email deletion and unsubscribes may increase.
These are all things your email marketing stands to gain from A/B testing, or to lose by neglecting it.
8 variables to consider when A/B testing email campaigns
We already know that A/B testing email campaigns is important to the overall success of your email marketing strategy.
But what exactly should you test in each email?
This will ultimately depend on your goals and audience. For example, if you're running a promotion, the right CTA copy and button color is critical to increasing conversions. But if you're building your email list with a newsletter, testing the length and design is the better way to improve readability.
There are several variables you can test to boost campaign performance.
Let's take a look.
1. Subject line
It's the first thing people see and the determining factor of whether or not they click. Think of the inbox like a social media feed—if your subject line isn't scroll-stopping, then the odds are higher for deletion—or worse—being marked as spam.
So some marketers find it's the primary variable to A/B test in every email. Get this right, and you can win clicks from your target customers without having to pore over every other excruciating detail.
But what exactly do you test?
Some try different lengths (6–7 words is often cited as optimal). Some try personalization, adding the recipient's name. Others add emojis to stand out.
2. Offers and CTAs
Nothing screams “open me” like an email with a special offer. But don't just add a discount and call it a day. There are various ways to make an offer sound (or look) better.
You might test including the special offer in the subject line, presenting a discount as a dollar amount or percentage, the amount of the discount itself, etc.
When testing offers and CTAs, consider marketing’s “Rule of 100,” which states products under $100 look better with a percentage discount. If the product's over $100, then dollar amount discounts are more appealing.
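If you want to apply the Rule of 100 consistently across a catalog, the logic is easy to encode. Here's a small sketch; the function name and formatting choices are our own illustration, not an official formula:

```python
def format_discount(price, discount):
    """Frame a discount per the "Rule of 100": a percentage reads as
    bigger on items under $100, a dollar amount on items over $100."""
    if price < 100:
        pct = discount / price * 100
        return f"{pct:.0f}% off"
    return f"${discount:.0f} off"
```

A $10 discount on a $40 item comes out as "25% off", while a $100 discount on a $400 item reads as "$100 off". Same math, but each framing makes the number look larger.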
In the same vein, you can also test your call-to-action that pushes your offer. For instance, you can
- test different CTA copy
- try different placements for the CTA button
- change the color of the CTA button
- see if a CTA link or button works better
Here's an example from Vitacost, which has not one, but two CTA buttons in different areas of the email:
3. Design and format
Plain-text vs. HTML? Images or no images? It's not your decision to make. At least, not until the people have spoken.
That's another reason to A/B test your emails and see what works. Because, trust us, these choices will impact your email marketing success.
Some emails (like newsletters) perform better with text and simple visuals sprinkled throughout. Others (like promotional emails) fare better with an interactive HTML email design.
Here's an example from Loom, which formats their email using a mix of plain-text and HTML design elements:
ClickUp, on the other hand, goes all out with HTML designs and even includes GIFs.
Once you determine what works and resonates with your audience, you can build email templates based on your findings to speed up the process.
4. Email length
What’s going to work better for you? Short emails that are sweet and to the point? Or longer emails with in-depth details, complete with a FAQ section?
Test your email’s length to identify the perfect sweet spot. Again, the length will depend on the type of marketing email you send and the goals you set.
A newsletter is obviously going to require some more real estate, while a flash sale promo email might only need a single headline.
Here's a great example from Wayfair:
Just one sentence and attractive imagery with a prominent CTA button front and center. (This typically works well in eCommerce email marketing because consumers are quicker to buy or at least window-shop.)
5. Time of day and frequency
The time of day you send emails matters because it can determine whether they're opened or overlooked. Some folks are early birds and like to start the day in their inbox. Others prefer to wait until late in the morning or early afternoon to read messages.
A report from Litmus shows that in America, the best send time is 10 a.m., and the best times overall are between 9 a.m. and 2 p.m.
As for the best days of the week to send, Campaign Monitor's data points to Mondays for open rates and Tuesdays for click-through rates.
Of course, you should test every time and day of the week to find what works best for you. You might find that your audience prefers going through promotional and branded emails at night or over the weekend.
You'll also want to test how frequently you send your marketing emails.
Every day might be laying it on too heavy and have your list running for the hills, while once a month will have your audience scratching their heads trying to remember who you are.
Find the frequency that works best for both you and your audience. Then stay consistent with it.
6. Personalization
There’s a lot to test here, and we recommend you don’t skimp on it. For starters, 80% of consumers are more likely to purchase from brands that provide a personalized experience.
This extends to your marketing emails.
Personalization can come in many forms, including
- using customer data to recommend products just for them (it’s said that a whopping 91% of consumers are more likely to buy from brands that remember and recommend relevant offers)
- using the subscriber’s name in the subject line
- sending birthday or anniversary offers
- and so much more
Take this Credit Karma example below, where they personalize their email by calling out the recipient’s credit score.
This works if you have user accounts to gather information from (e.g., search or purchase history data). Test to see if personalizing works or if your audience cares more about the offer than what you know about them.
But it’s pretty safe to say that personalization will likely go over well.
7. Social proof
Would adding some social proof result in higher open rates?
This is something you could determine in another email A/B test. Try adding social proof to your emails to see if that results in a lift in opens and clicks.
You might include social proof in the subject line or give it its own section in the email content. Not only can you test where to put your social proof, but you can also experiment with different types. For example, you might test the effectiveness of
- star ratings
- linking to your case studies
- positive press or PR
- including your client list
The possibilities are endless with social proof, and the only way to really find out what works for you is to test your way there.
8. Preview text
Don't sleep on the power of the preview text. It's the second thing subscribers look at before clicking an email (if the subject line wasn't enough). Use it to reinforce your message and drive home the click.
Here's an example from Wayfair, which promotes a two-day clearance sale in its subject.
Then in the preview text, it follows up by using FOMO (fear of missing out), a high numerical discount, and free shipping. And before it clips off, you see financing is an option—great news for those who like buy-now-pay-later deals.
I’d open this email.
10 best practices for split-testing emails
Split-testing emails is about more than randomly selecting areas of an email to change. It requires a calculated and analytical approach to prevent wasting time and money on fruitless efforts.
So we put together a quick list of best practices to follow when planning and executing your email A/B tests:
- Create a hypothesis: Don't randomly select a component to test in your emails. Hypothesize why you think this area can improve results for the goal you want to achieve.
- Focus on high-impact, low-effort variables: Don't waste time on variants that don't impact KPIs. Instead, focus on areas like subject lines, CTAs, offers, and other changes that provoke action.
- Get the timing right: Avoid sending test emails during seasonal changes that can taint your results. For instance, if everyone's on spring break, email opens will be unusually low.
- Test one variable at a time: Focus on one component to change in each test to know exactly what's improving your results.
- Wait a few weeks for the final results: Check A/B test results a couple of weeks after the campaign to allow for statistical significance. Your data after one day will look different from your data after two weeks.
- Analyze and test again: Look at the results, analyze what you see and why, then run more tests to ensure accuracy.
- Run a test before you start: Yep, test your test. Perform a test send to ensure there are no errors in the copy, design, or deliverability.
- Determine the test sample size: Choose sample sizes large enough that your test groups can produce statistically significant results.
- Keep a control version: Always have a control version that doesn't change to test the variations against (e.g., 60% receive the control version, 20% get Version A, and 20% get Version B).
- Use email automation: An email automation provider (e.g., Mailchimp) ensures you never forget to send an email, or send it to the wrong segment, and flub your test results.
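The sample-size and significance points above lend themselves to a quick sketch. Assuming a two-variant test on open rates, the standard two-proportion formulas give a rough per-group size and a winner check (the function names and defaults here are illustrative, not from any particular testing tool):

```python
import math

def sample_size_per_group(baseline_rate, min_detectable_lift,
                          z_alpha=1.96, z_beta=0.84):
    """Rough per-group sample size for comparing two open rates
    (defaults approximate 95% confidence and 80% power)."""
    p1 = baseline_rate
    p2 = baseline_rate + min_detectable_lift
    p_bar = (p1 + p2) / 2
    return math.ceil(
        2 * p_bar * (1 - p_bar) * (z_alpha + z_beta) ** 2
        / (p1 - p2) ** 2
    )

def z_score(opens_a, sent_a, opens_b, sent_b):
    """Two-proportion z-test: how far apart are the two open rates?"""
    p_a, p_b = opens_a / sent_a, opens_b / sent_b
    p_pool = (opens_a + opens_b) / (sent_a + sent_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    return (p_b - p_a) / se
```

For example, detecting a 3-point lift over a 20% baseline open rate needs roughly 2,900 recipients per group at those defaults; once the results are in, a z-score beyond ±1.96 suggests the difference between variants is more than noise.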
Improve email performance with A/B testing
Email marketing has the potential to increase conversion rates and revenue for your business. But only if you know how to trigger your audience to act.
Since there's no way to read minds or guess your way to success, you need to A/B test emails to achieve those results.
Use this guide to start split-testing email campaigns like a pro. And check out this list of 50 email marketing examples for inspiration.