A Deep Dive into A/B Testing

By Kayla Waukau
Marketing Coordinator, Lessiter Media
kwaukau@lessitermedia.com

A/B testing is a critical aspect of marketing, yet it often gets overlooked in the midst of our daily tasks and responsibilities. But fear not, because over the next four weeks I will be delving into the world of A/B testing in our weekly Marketing Minute. Each week I will explain what we are testing, and then I will share the results of the previous week's test.

In previous newsletters, I've touched upon Best Practices and The Significance of A/B Testing. Luke has written on "A/B Testing is More Than Just Subject Lines," while Dallas shared valuable insights on A/B testing with Facebook. Now I'm thrilled to take you on a deeper dive into the realm of A/B testing with our 4-part series this month. Join me as I uncover the realities of A/B testing and its role in taking your business higher. Who knows, your own predictions might just be proven wrong!

Email A/B Testing Part 1: HTML vs. Plain Text

Let's kick off this series with an A/B test using email marketing. Our objective is to drive registrations for an upcoming conference. To run the test, we randomly select 5,000 subscribers from our email list and split them evenly between the two versions. After a span of three hours, the version with the most clicks is declared the winner and is sent to the remaining 25,000 addresses on the list.
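To make the mechanics concrete, here is a minimal Python sketch of that workflow. The subscriber list, seed, and click counts below are hypothetical placeholders for illustration only, not our actual data or tooling.

```python
import random

# Hypothetical subscriber list; in practice this comes from the email platform.
subscribers = [f"user{i}@example.com" for i in range(30_000)]

random.seed(42)  # fixed seed so the split is reproducible
test_group = random.sample(subscribers, 5_000)   # random 5,000-person test sample
variant_a = test_group[:2_500]                   # Email 1: HTML
variant_b = test_group[2_500:]                   # Email 2: plain text

# After the 3-hour window, compare clicks (placeholder counts shown here).
clicks = {"Email 1 (HTML)": 132, "Email 2 (plain text)": 118}
winner = max(clicks, key=clicks.get)

# The winning version then goes to everyone who was not in the test group.
test_set = set(test_group)
remaining = [s for s in subscribers if s not in test_set]
print(f"{winner} wins; sending it to {len(remaining):,} remaining subscribers")
```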

Now, let's dive into what I'm testing this week:

  • Email 1: An HTML-style email: HTML emails are visually captivating, showcasing various colors, styles, images, and even multimedia elements. 
  • Email 2: A plain text email: The term "plain text" refers to a basic, unadorned text-driven email without any embellishments. It involves using simple fonts, devoid of extravagant designs or colors. Plain text emails do not include any additional graphics or multimedia elements.

Personally, I suspect that Email 2 will emerge as the victor. The specific audience we are targeting tends to respond better to personal emails rather than overtly promotional ones. What are your thoughts? Share your prediction on which test you think will come out on top and why! 


Email A/B Testing Part 2: HTML vs. Plain Text Results

The results are in! If you read last week's Marketing Minute, you'll know that I am conducting a month-long A/B testing journey. Each week I choose a new email element to test and provide you with the results the following week. The goal is to give marketers a better understanding of A/B testing and why we do it.

Last week I conducted an A/B test on HTML-style emails (visually rich, showcasing colors, styles, images, and even multimedia elements) versus plain text emails (simple fonts with no extravagant designs, colors, graphics, or multimedia elements).

I predicted that the text-only email would emerge as the winner; however, this was not the case. After 3 hours of testing, the HTML-style email's click-through rate was 22.4% higher than the text-only email's. Click-through rate is the ratio of clicks on a link in the email to the number of emails delivered. It is a better measure of an email's success than open rate because it reflects how much the recipient actually engaged with the email.
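For anyone who wants to see the arithmetic behind a relative lift like that, here is a quick sketch. The click and delivery counts are hypothetical numbers chosen only to illustrate the calculation, not our real results.

```python
def click_through_rate(clicks: int, delivered: int) -> float:
    """Clicks on a link in the email divided by emails delivered."""
    return clicks / delivered

# Hypothetical counts for two 2,500-recipient test groups.
ctr_html = click_through_rate(120, 2_500)   # 4.80%
ctr_text = click_through_rate(98, 2_500)    # 3.92%

# Relative lift of the winner over the loser, expressed as a percentage.
lift = (ctr_html - ctr_text) / ctr_text * 100
print(f"HTML CTR: {ctr_html:.2%}  plain-text CTR: {ctr_text:.2%}  lift: {lift:.1f}%")
```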

Several factors can skew the results of a test. One of them is time of day. With last week's test, we sent the A/B test at 10 a.m., let the results accumulate for 3 hours, and then deployed the winning email to the remainder of our audience at 1 p.m. Would the results be different if we sent the email at a different time? What about a different day of the week? This leads me to the next A/B test I will conduct.

I will send the same email to a randomly split 50/50 audience, varying only the time of day (a sketch of one way to do the split follows the list below).

  • Email 1: Deployed at 1 p.m. on Wednesday
  • Email 2: Deployed at 6 p.m. on Wednesday
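Unlike last week, where a 5,000-person sample decided the winner, this test splits the entire audience down the middle. Here is a minimal sketch of one way to make such a 50/50 split reproducible; the audience list and seed are hypothetical stand-ins.

```python
import random

# Hypothetical full audience exported from the email platform.
audience = [f"user{i}@example.com" for i in range(30_000)]

random.seed(7)            # fixed seed keeps the split reproducible
random.shuffle(audience)  # shuffle in place, then cut the list in half

midpoint = len(audience) // 2
group_1pm = audience[:midpoint]   # Email 1: deployed at 1 p.m. Wednesday
group_6pm = audience[midpoint:]   # Email 2: deployed at 6 p.m. Wednesday

print(len(group_1pm), len(group_6pm))  # 15000 15000
```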

I predict that Email 2 will have a higher click-through rate, as most people check their email in the evening rather than in the middle of the day.


Email A/B Testing Part 3: Time of Day

The results are in! This week I conducted an A/B test on email deployment times. The process involved taking our audience list and randomly splitting it in half. Audience 1 received the email at 1 p.m., while Audience 2 received the same email at 6 p.m.

I predicted that the 6 p.m. audience would have a higher click-through rate. However, the two emails were tied with the same number of clicks! To determine a true "winner," I decided that unsubscribes would be the best alternative measure of the email's success. It turns out that even though they had the same number of clicks, the email deployed at 6 p.m. had a 40% higher unsubscribe rate, which makes Email 1 (the 1 p.m. send) the winner of this test.
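Here is a rough sketch of that tie-break logic. The delivery, click, and unsubscribe counts are hypothetical, chosen only so the gap works out to roughly the 40% difference described above.

```python
# Hypothetical results for the two equal-sized audiences.
results = {
    "Email 1 (1 p.m.)": {"delivered": 15_000, "clicks": 310, "unsubscribes": 25},
    "Email 2 (6 p.m.)": {"delivered": 15_000, "clicks": 310, "unsubscribes": 35},
}

clicks = {name: r["clicks"] for name, r in results.items()}

if len(set(clicks.values())) == 1:
    # Clicks are tied, so fall back to the lower unsubscribe rate.
    unsub_rate = {name: r["unsubscribes"] / r["delivered"] for name, r in results.items()}
    winner = min(unsub_rate, key=unsub_rate.get)
else:
    winner = max(clicks, key=clicks.get)

print(f"Winner: {winner}")  # Email 1 wins: 35 unsubscribes is 40% more than 25
```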

On to what I'll test this upcoming week... emojis in subject lines! Emojis in subject lines are still gaining popularity, and more companies are using them to try to increase open rates. How will our audience respond to them in our subject lines?

  • Email 1: Subject line using emojis
  • Email 2: Same subject line without the use of emojis

I predict that Email 1 will have a higher open rate, as I believe emojis help capture the reader's attention. We'll see if I'm wrong AGAIN next week in the final report of my month-long testing series.


Email A/B Testing Part 4: Emojis in Subject Line

This week I conducted an A/B test on using emojis within subject lines. I used the exact same email, and the audience was split randomly.

  • Email 1 Subject Line: "🚨New Speaker Just Added!🚨"
  • Email 2 Subject Line: "New Speaker Just Added!"

I predicted that Email 1, which contained emojis in the subject line, would win the test. However, this was not the case! Email 2's open rate was 5.9% higher, meaning the email without emojis won the A/B test!

Throughout this month-long testing series, I conducted three different A/B tests, and my prediction of the winner was wrong every single time! This is exactly why we test in marketing. We may think we know our audience, or we may be following the current trends, but we won't know until we test! Several different factors can skew a test, which is why we should never STOP testing! What works one week may not work the next.

I hope you've enjoyed following along this month and that this series has sparked your interest in what you can test in your own marketing campaigns. I would love to hear about what you've tested and the results!