If you’re tired of making email marketing decisions based on guesses about what your audience wants, then you need to start A/B testing your emails. With a bit of foresight and planning, you can turn your hypotheses and ideas into insights on how to boost website views, open rates, and more.
To get the most out of your email list, you’ll want to start A/B testing with a marketing email. This can be a cold email, holiday sales announcement, or any marketing message you choose. And with just a few tweaks to your subject lines, layouts, and copy, your emails will start generating results you never thought possible.
These are the key talking points you’ll hear from email service providers and marketing automation vendors as they subtly push you towards buying their add-on modules for email A/B testing.
But allow us to explain why an email platform shouldn't need add-ons, and how you can view your email open rate and boost your conversion rate with the services already provided to you. Read on for steps on how to do proper A/B testing, or simply learn how to run tests on your marketing emails through TruVISIBILITY's Messaging app.
Table of Contents
- What Is A/B Testing and Why Should Marketers Do it for Every Email Campaign?
- What to Test in Your Campaign
- How to Run Successful Split Testing
- Email Testing Tools
- The Best Email Testing Platforms
- How to Not Test Your Email Campaign Marketing
- What to Do Next in Email Marketing
Email A/B testing, or split testing, is the process of creating two versions of the same email with one variable changed, then sending each version to a subset of your audience to see which performs best.
Consider it this way: Email A/B testing pits two emails against each other to see which is superior. You can test elements that are big or small to gain insights that help you do things like update your email design, learn about audience preference, and improve email performance.
Email A/B testing is simply sending two different versions of your email to two different sample groups of your email list. The email which receives the most opens and clicks will be sent out to the rest of your subscribers.
Most people skip email A/B testing in marketing because they don’t know how or what to test. If this is you, read on. It is easier than you think and you’ll discover a huge opportunity to improve your campaigns and email marketing skills in general.
A/B split testing is just a way of evaluating and comparing two things.
You can find out specific data when doing split testing, such as:
- Which subject line has the best open rate
- Whether your target audience is more drawn to emojis or not
- Which button text makes people most eager to click
- What imagery in your email drives better conversions
- What preheader text generates the best open rate
With email marketing A/B tests you can improve your metrics, increase conversions, get to know your audience and find out what’s generating sales.
And the testing part itself is a breeze.
In your email marketing tool, you simply set up two emails that are exactly the same except for one variable, such as a different subject line. You then send the two emails to a small sample of your subscribers to see which email is more effective.
Half of your test group will get Email A and the other half gets Email B. The winning email is determined by what you are trying to measure.
For instance, if you want to know which version attracts more people to open your emails, you use the open rate as your success metric. Let's say Version B gets higher open rates. Then it will be sent automatically to the rest of your subscribers, because it statistically performs better. It could even become an email template for future marketing campaigns!
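The mechanics of that split are simple enough to sketch in a few lines. The sketch below (in Python, with made-up subscriber addresses and a hypothetical `ab_split` helper, not any platform's actual implementation) shows the basic idea: shuffle the list, carve off a test sample, and split that sample in half.

```python
import random

def ab_split(subscribers, sample_pct=0.2, seed=42):
    """Split a list into an A sample, a B sample, and the remainder.

    sample_pct is the total share of the list used for testing
    (half of it gets version A, the other half gets version B).
    """
    pool = list(subscribers)
    random.Random(seed).shuffle(pool)  # randomize to avoid ordering bias
    sample_size = int(len(pool) * sample_pct)
    half = sample_size // 2
    group_a = pool[:half]
    group_b = pool[half:sample_size]
    remainder = pool[sample_size:]  # later receives the winning version
    return group_a, group_b, remainder

# hypothetical list of 1,000 subscribers
subscribers = [f"user{i}@example.com" for i in range(1000)]
a, b, rest = ab_split(subscribers)
print(len(a), len(b), len(rest))  # 100 100 800
```

The random shuffle matters: if you split the list in file order, one group might contain only your oldest (or newest) subscribers and skew the result.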
We've got more email testing tips for you, such as all the things you can test in your emails. Email campaigns that include strong subject lines and a shorter word count generally do well, especially when testing personalization (people love a personalized email that also offers something valuable to them).
Remember: Recipients don't want to read a lot, and they care about the privacy and safety of the emails you send. In other words, your message shouldn't come across as spam; it should read as genuinely valuable to them.
Email Subject Lines
If you want to increase open rates, the subject line is the most common place to start. You can experiment with different styles, lengths, tones, and positioning. For example, Emerson A/B tested two subject lines for a free trial email with a white paper:
- Control: Free Trial & Installation: Capture Energy Savings with Automated Steam Trap Monitoring
- Variable: The Impact of Failed Steam Traps on Process Plants
That particular test revealed a 23% higher open rate for the subject line referencing the white paper. Testing transactional email has shown that the subject line should convey immediate value to the subscriber, which, of course, leads to more opens.
Email Copy
The tone and positioning of your email copy impact whether the message catches a reader’s interest or not. A/B testing in the copy category covers a ton of elements in your email, including:
- Body copy
- Button copy
Along with word choices, you will want to test the word count. Do people respond better to similar copy that is about 500 words shorter? Or do you get more interaction by placing a CTA button between two paragraphs?
Testing the content of your newsletter can be tricky because if you’re simply changing the text, it’s hard to identify the one variable that causes a conversion. One aspect of your content that can be reliably tested is the call to action.
The CTA is the most important part of securing clicks. It’s the final gateway before a reader converts. Here are some aspects of CTAs you can change and test.
Including too many links is overwhelming – but having just two or even three links pointing to the same ultimate goal generally leads to a lift in conversions rather than a drop-off. CTAs are best placed on a clickable button. Try to repeat your CTA in your signature or postscript (P.S.) – you’ll be surprised by the results.
Another piece of content could be the valuable assets in the email. Are you sharing any facts that will help your lead or customer and make them want to either continue receiving your emails or click on a link to your page?
Some content to gauge could be downloadable assets, such as a guide or document. People love receiving this kind of content - then again, it all depends on your audience!
Think of other content that seems to get engagement from recipients. Does your target audience like free video lessons on how to use your software? Will those lessons get them to buy add-on products or services from you?
Sender Name
If your campaign comes from a company, experiment with using your company name versus inserting the name of one of your employees (like the marketing manager or CEO).
This tactic can result in higher open rates, though it depends on your target audience. Testing different variables is the key to discovering what advice you should be following.
It’s important here that after you’ve tested your favorite sender name, you stick with it. This way your email subscribers can easily start recognizing your emails.
Images and Other Visuals
If you use stylized emails, try A/B testing your visuals, like hero images, graphics, or short video clips. Do different hero images change engagement? Can you use animated GIFs in longer emails to increase read time? Does including an infographic make people more likely to forward your email?
Images are a powerful tool to convince your customers to act. Experimenting with images is a fun way to take the pulse of your readers, see what types of images they respond to, and learn how your images can drive engagement.
Here are some ideas of what you can test with images:
- Images of people versus a product
- One image versus multiple images
- Text on an image versus no text
- Screenshot of a video
- Animated GIF versus a static image
- A serious, straightforward image versus a goofy one
- Colorful visuals versus black-and-white visuals
- A stock image versus an image of your employees or customers in action
Style or Design
How does the layout of your email perform compared to a different version? Perhaps test A is mostly one chunk of text, while test B uses short sentences with a GIF before each one; if version B performs well when preceding a CTA button, it can result in a higher email click-through rate.
Length of Emails
In addition to the design of an email, you can play around with the length of the message. Here are a few questions to ask yourself:
- Do subscribers want more content and context in the message, or just enough to pique their interest?
- What length is ideal for different types of email? Or different devices?
- Do all segments prefer the same length of email?
Calls to Action
We touched on CTA buttons a little earlier, but beyond testing the copy on your button, you can also test the design of the button and perhaps even the link itself.
When someone hovers over a button, their cursor may show text describing where the button will take them. You can test including this hover text or leaving it out.
Another example is a dull-colored button that blends into the background of the email body. You won't know whether your CTA button is more attractive in your business colors until you test it.
We generally suggest having a clear, slightly more colorful, button with an enticing message on it with an action statement, like "Download our free guide" instead of "Here's a free guide".
Best Time to Send
When do you normally open an email? Your answer is probably "it depends".
You might be online, see the email come in, and click within 5 minutes. Or you might first see the newsletter 2 hours after it was delivered to your mailbox. Or perhaps the subject line doesn't grab you and you leave the email unopened.
These are all real scenarios, which is why you should have an adequate time window when running an A/B test.
While with variables like subject lines and opens you can send the winner as early as 2 hours after sending, you might want to wait for a longer time if you’re measuring click-throughs. When you’re testing your newsletter on active subscribers, you can shorten the waiting time.
Research has shown that when you wait 2 hours, the accuracy of the test will be around 80%. The longer you wait, the more accurate your results will be. To hit an accuracy of 99%, it’s best to wait an entire day.
Be aware that a longer waiting time is not always better. Some newsletters are time-sensitive and should be sent as soon as possible. In other situations, waiting too long will result in the winning email being sent on the weekend. A weekday versus a Saturday or Sunday can make a lot of difference in your email stats.
The main rule when it comes to send-time optimization is: every business is different, so it's essential to monitor your metrics and continue to test.
Tools like TruVISIBILITY, with their drag-and-drop email builders, make it really quick and easy to run A/B tests on your email campaigns, making it unnecessary for you to code multiple versions of the email and test them across different devices and email clients. You simply make the changes you want and click send.
However, before you dive in and start setting up A/B tests, there are a few strategic tips you can use to help increase the chances of getting success from your A/B testing.
To have the highest chance of getting a positive increase in conversions from your A/B test, you need to have a strategic hypothesis about why a particular variation might perform better than the other.
The best way to do this is to come up with a basic hypothesis for the test before you begin. Here are some examples to help illustrate what a basic hypothesis might look like:
- "We believe personalizing the subject line with the subscriber’s first name will help make our campaign stand out in the inbox and increase the chance it will get opened."
- "We believe using a button instead of just a text link will make the call to action stand out in the email, drawing the reader’s attention and getting more people to click-through."
These statements help you define what you are going to test and what you hope to achieve from it, and keep your A/B tests focused on things that are going to get results and are easy to read and understand.
Not every A/B test you run is going to result in a positive increase in conversions. Some of your variations will decrease conversions, and many won’t have any noticeable effect at all.
The key to this is to make sure you learn from each A/B test you run and use that knowledge to create better campaigns next time.
In one case we came across, a team ran different A/B tests on their newsletter for an entire week, changing the template, testing images, testing fonts, and so on.
Not every test they ran resulted in a positive increase in conversions (for instance, adding images to the email actually decreased conversions) but, with every test, they learned more about what works and what doesn’t work for their audience. This helped them to come up with a new email design that generated a 32% increase in conversions.
Now that you know why testing your emails is important and which elements to test, allow us to go over a couple of tools that can help you create the perfect email that performs well and gives accurate measurement information.
Litmus
Litmus prides itself on being an all-in-one email tool, and its email campaign testing is one of the better email testing tools out there. It lets you preview your test emails, shows you first impressions of your email and any image-blocking issues, and runs spam testing on more than just your subject line. Like other email marketing tools, it will also show the open rate and click-throughs that take your audience to a page on your site or even a social media page.
CoSchedule Headline Analyzer
Although CoSchedule built the Headline Analyzer tool to help analyze blog post headlines, this nifty tool also works well as a subject line tester for coming up with catchy email subject lines. It helps you balance the words in your headline so you can create one that really grabs attention and drives readers to open your email.
Personalization, the rate at which traffic is going to your site, and bounce rate are all important measures to view after testing the emails you send to leads. And the more eyes you get on your pages, the higher your conversion rate will likely be. But what are the easiest platforms for split testing?
We considered what marketers need in an email app, such as what information they can gather from test results, how simple it is to connect website results to email results, and the overall experience of creating an email and changing each part to test separately.
TruVISIBILITY
TruVISIBILITY is no stranger to email split testing, and the platform offers the most straightforward navigation for naming each email you test.
The resources and tools TruVISIBILITY offers rival other email platforms, but its customer service is more personalized, with a team determined to help customers reach their goals. The best part is that it's still free to start with your email marketing.
On top of that, its campaign testing gives results showing how well your campaigns are doing compared to each other, in a clear list that compares open rate, clicks, and more. That means you can test more than just versions of the same email message: you can test which campaign is more successful than another to improve them in the future.
When you consider trying TruVISIBILITY, you have the option to integrate other platforms, such as Yoast for SEO purposes, into your online store website or landing page, without having to worry about the privacy or security of your site. This integration can boost conversion rate optimization, which is reflected in your email responses or clicked links within your email marketing campaigns.
You can compare a campaign to one you've sent in the past with this platform as well. They focus on transactional emails, which are triggered by the recipient and often include order confirmations, shipping information, etc.
MailerLite
MailerLite specifically advertises subject line testing, sender split testing, and content split testing. They also tout their free email service (for up to 1,000 subscribers) to growing businesses.
Mailchimp
You likely already know that Mailchimp is best known for its email platform (versus its landing page or site-building capabilities). Because email is its core app, Mailchimp offers A/B testing results.
Mailchimp's marketing email templates are also great to start with if a business needs to start at the basics and test each element separately, such as images vs no images, links to an ecommerce website, personalization in the subject line, and more.
How to Not Test Your Email Campaign Marketing
Most marketers are impressed by (and sold on) a feature where you can run a test using a small sample of your full email list. It runs the test until one of the two variants hits a majority in terms of, say, open rate, and then it declares that email to be the winner of the A/B test. At this point, it auto-deploys the winning email to the rest of your list. Sounds simple enough, but from a statistical point of view, this is a terrible idea.
The problem is a lack of statistical significance.
Statistical significance is the likelihood that an experiment's result reflects a real effect rather than random chance, meaning it isn't meaningless or just flat-out wrong.
To achieve an acceptable level of statistical significance, a data set needs to be large enough, and generally the smaller the effect being measured, the larger the data set needs to be.
Statistical significance is measured in terms of % confidence or what’s called a p-value. So, let’s say you want to be 95% sure that the change you’re making to your emails will have a positive effect. Then, you’d want to see test results that show a 95% confidence level.
Which part of your email are you going to test? The subject line and image content? Or even more? Segment your tests and label them clearly in your email platform, such as "Test A Subject Line 1 Newsletter" - or however you'd like to name your email blasts - just as long as you know which tests are which, so your best email outcome can be easily identified.
When you test 2 subject lines, the open rate will show which one appealed most to your subscribers. When you test 2 different product images in your email layout, you’ll want to look at both the click-through rate and conversions.
It can happen that two emails show different results depending on what you look for. In one example, the plain-text version had a better open rate, but when it came to people clicking, the designed template was more successful.
Imagine you’re sending two emails at the same time. The content and sender’s name are identical. The only thing that differs is the subject line. After a few hours, you see that version A has a much better open rate.
When you only test 1 thing at a time and you see a clear difference in the metric you’re analyzing, you can draw an accurate conclusion. However, if you had also changed the sender’s name, it would be impossible to conclude that the subject line made all the difference.
When you have a big email list with over 1000 subscribers, we recommend sticking to the 80/20 rule.
Meaning you should focus on the 20% that will bring you 80% of the results. When it comes to A/B tests, this means sending variant A to 10% of your subscribers and variant B to another 10%. The winning variant is then sent to the remaining 80% of your list.
The reason why we recommend this principle for bigger lists is because you want more statistically significant and accurate results. The 10% sample size for each variant needs to contain enough subscribers to show which version had more impact.
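If you want a rough sense of whether your 10% sample is big enough, the standard two-proportion sample-size formula gives a ballpark figure. The sketch below is a simplified version of that formula; the function name, and the fixed z-values for 95% confidence and 80% power, are our own choices for illustration, not from any email platform:

```python
import math

def sample_size_per_group(p_base, min_lift, alpha_z=1.96, power_z=0.84):
    """Approximate subscribers needed per variant to detect `min_lift`
    (absolute, e.g. 0.05 for 5 percentage points) over a baseline rate
    `p_base`, at roughly 95% confidence and 80% power."""
    p_avg = p_base + min_lift / 2          # average rate across both variants
    variance = 2 * p_avg * (1 - p_avg)     # pooled variance of the two groups
    n = ((alpha_z + power_z) ** 2) * variance / (min_lift ** 2)
    return math.ceil(n)

# e.g. a 20% baseline open rate, hoping to detect a 5-point lift to 25%
print(sample_size_per_group(0.20, 0.05))  # 1094 per variant
```

Under these assumptions you'd need roughly 1,100 subscribers in each test group, which is why a 10% sample only works well on lists of 20,000 or more; smaller lifts require dramatically larger samples.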
When you’re working with a smaller list of subscribers, the percentage of subscribers that you want to A/B test will get increasingly larger in order for you to get statistically significant results. If you have less than 1,000 subscribers, you probably want to test 80-95% and send the highest performing email version to only the small remaining percentage.
After all, if 12 people click on a button in email A and 16 people do so in option B, you can’t really tell which button performs better. Make your sample size large enough to get statistically significant results.
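You can check this kind of result yourself with a standard two-proportion z-test. The sketch below (plain Python, with hypothetical sample sizes of 500 recipients per variant) shows why 12 versus 16 clicks isn't conclusive: the z-score comes out far below 1.96, the usual cutoff for 95% confidence.

```python
import math

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Two-proportion z-test: is the difference between two click rates
    larger than random chance alone would explain?"""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)   # pooled rate under "no difference"
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 12 vs 16 clicks, assuming 500 recipients per variant (hypothetical numbers)
z = two_proportion_z(12, 500, 16, 500)
print(round(z, 2))  # 0.77, well below the 1.96 needed for 95% confidence
```

A z-score of about 0.77 corresponds to a result you'd see from pure chance quite often, so you cannot call email B the winner on this evidence.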
If your email marketing platform doesn’t provide a measure of statistical significance, you can plug your results into a third-party statistical significance calculator.
Next, never set your email marketing software to auto-deploy the so-called "best" email. This should be done manually.
And the truth is that, realistically, you may have a hard time getting significance out of any test with a list of fewer than 50,000 people. The open rates will only be a fraction of that total — 10% is typical — giving you a total of roughly 5000 responses. If it’s a true 50/50 test, any difference between the two variants is likely to be well within the margin of error at that scale.
It’s not that you can’t test 50,000 email addresses, it’s just that the test itself might not deliver statistically significant results. To see meaningful results, you may need to run a given test multiple times. This approach has its own problems — most statisticians will frown upon the idea — but there’s a way to do this that works.
Repeated A/B email tests are most effective when they are used to explore thematic concepts or broad categories. You wouldn’t want to test the same two subject lines over and over, but you could test similar kinds of subject lines. For instance:
- Do emojis in the subject line increase open rates?
- Do brackets like [Sale!] increase or decrease open rates?
- Will open rates increase if certain words are EMPHASIZED in all caps?
- Does using the recipient’s first name increase open rates?
These are general concepts that are easy to apply to multiple tests, and the results should be fairly consistent across those tests. By aggregating the results, it’s possible to determine winning and losing approaches even with a relatively small email list.
While it is possible to A/B test every aspect of your emails, I also suggest focusing most of your efforts on the subject line. If your list is small, it’s your best bet for generating meaningful results quickly. Whereas an open rate could be 20%, the typical click-through rate is more like 2.5%. The smaller sample size of the CTR makes it much more difficult to achieve statistical significance.
Whether you are one of many marketers on a team or you are running the marketing for your own business, we hope you see that the best email campaigns create opportunities to gain prospective customers. Whether you list email marketing as a skill or not, following the steps of a successful split test will give you the testing analytics you need to send out the best version of your email next time.
Have an online or brick-and-mortar store? Want to ensure the privacy of your leads' and customers' information? Check out the landing pages or chatbot app from TruVISIBILITY, which can be integrated with the emails you regularly send to recipients. There's so much you can do to improve conversions for your business. Changes to your email campaigns, such as adding image content or personalization, are just some of the things you can do.
Read our other articles about email marketing for ecommerce, contract services businesses, and more. Or check out our app pages to begin the easiest experience creating marketing campaigns you've ever had.
Want to receive more articles?
Sign up for our weekly newsletter to receive info that will help your business grow.