Email Testing!

Here's what 7 email experts have to say

The best email marketers are testers at heart. And what’s great about email marketing is that you can test just about anything–subject lines, CTAs, content, personalization, product recommendations, and more.

We asked a few of our favorite email marketers two big questions around email testing:

  • What is your favorite email test to run?
  • What is a common mistake you see marketers make when they’re running/setting up their tests?

Let’s see what they had to say.

Lisa Jones

Chief EyeMail Officer
EyeMail, Inc.

What is your favorite email test to run?

My favorite email test involves integrating video content into email campaigns.

This testing method includes A/B testing, where one static variation (A) of an email campaign is sent to a subset of subscribers, and a dynamic video email variation (B) is sent to another set of subscribers to measure engagement. The goal is to test and validate how each variation impacts email metrics and analytics, determining which yields the best results.

Video has proven to be highly engaging, builds trust, and can significantly increase click-through rates and conversions. Studies have shown viewers retain 95% of a message when they watch it in video, compared to 10% when reading it in text (Insivia). Including video directly in the email empowers marketers to provide compelling content that immediately captures subscribers' attention, delivers storytelling moments, and inspires them to take action.

This advanced level of video email testing not only boosts engagement but also offers valuable insights into subscriber preferences and behavior.
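
Here's a minimal sketch of the random split behind a test like this, in Python, assuming nothing more than a plain list of subscriber addresses (the function name and example addresses are illustrative, not tied to any particular ESP):

    import random

    def split_ab(subscribers, seed=42):
        """Randomly split a subscriber list into two equal test groups."""
        rng = random.Random(seed)  # fixed seed keeps the split reproducible
        shuffled = list(subscribers)
        rng.shuffle(shuffled)
        mid = len(shuffled) // 2
        # Group A receives the static email; group B receives the video version.
        return shuffled[:mid], shuffled[mid:]

    subscribers = [f"user{i}@example.com" for i in range(1000)]
    group_a, group_b = split_ab(subscribers)
    print(len(group_a), len(group_b))  # 500 500

Random assignment (rather than, say, splitting the list alphabetically) is what makes the comparison fair: any difference in engagement can be attributed to the variation rather than to who landed in which group.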

What is a common mistake you see marketers make when they’re running/setting up their tests?

One common mistake encountered when conducting testing is a lack of clear definition regarding objectives for measuring impact and success. Without well-defined goals and metrics, it becomes difficult to accurately evaluate the effectiveness of the test and make informed data-driven decisions for future campaigns.

There's also a misconception that all video-in-email test variables produce comparable results. It's crucial to establish consistent standards and ensure metrics are aligned and comparable across different video-in-email engagement testing scenarios. For instance, an A/B test where the email shows a static image with a play button is not the same test as one where the video plays automatically within the email. Testing efforts require standardized measures for accurate assessment.

Proper audience segmentation may be overlooked, leading to skewed results or missed optimization opportunities. Marketers should meticulously plan and document their testing strategy, considering objectives, metrics, audience segments, and testing frequency, to derive meaningful and actionable insights from each test.

In addition to traditional dashboard metrics like opens and clicks, analyzing the reduction in unsubscribes from A/B test results can provide valuable insights into performance improvements. Investing in experimental efforts to explore new technologies and testing can also yield deeper insights and foster long-term success. It's important to recognize that one email testing exercise will not set the benchmark but serves as a starting point for ongoing refinement and enhancement of email marketing growth strategies.

Garin Hobbs

VP, Customer Success and Strategy
InboxArmy

What is your favorite email test to run?

I know it sounds vanilla, but Subject Lines are my absolute favorite test to run. Why? Because every new subscriber experience begins with an "open", and without that, the rest of the message hardly matters.

Creating concise subject lines that inspire users to open (and keep opening) is an evergreen challenge. You only have 3-7 words to tell a story, and it has to be fresh, relevant, and compelling every time. As French mathematician and philosopher Blaise Pascal famously said, "I would have written you a shorter letter, but I did not have the time."

Brevity is incredibly difficult. The best email marketers live that reality every day.

What is a common mistake you see marketers make when they’re running/setting up their tests?

A common mistake I see marketers make when testing is testing without clear intent or a meaningful strategy. For example, they may take a high-performing email from a previous campaign and test it against a new message without identifying which elements drove that performance.

Email testing isn't very different from the Scientific Method: Question, Research, Hypothesis, Experiment, Data Analysis, Conclusion, and Communication/Application. Testing should be iterative and focus on a single element each time: a word or two in a subject line, a CTA button shape or color, end price vs. discount percentage, etc. This allows a marketer to build a book of learnings to better understand what works, what works best, what works best consistently, and why.

Without that, marketers will have a challenging time understanding what actually moves the needle.

Kara Trivunovic

SVP, CX Studio
Zeta Global

What is your favorite email test to run?

Personalized vs. Business-as-Usual (BAU).

What is a common mistake you see marketers make when they’re running/setting up their tests?

So much testing in email was born from direct mail, and too many marketers try to measure the incremental impact of a single element being tested - when really the test is the uniqueness of the experience delivered to the customer.

And too frequently, incorrect assumptions are made with a limited view of the test. For example, many assume that if your subject line is more relevant, your open rate will increase. But that is not always the case. Depending on your business, and how relevant the subject line is, it may not drive increased open activity, but may drive an increase in site visits.

Too many tests assume the customer path is linear and it is definitely not.

Aubrey Miller-Schmidt

Senior Email & Website Programs Manager
Main Street America Insurance

What is your favorite email test to run?

One of my favorite tests to run is Call-to-Action (CTA) variations. Just about every email needs an effective CTA to meet its goals.

There are numerous tests that can be run on CTAs, including layout placement, language, and format. Different audiences may respond in unique ways to each of these elements, and it can change over time.

One type of CTA test I personally enjoy conducting is language nuance. For example, I recently tested several webinar CTAs, comparing the first-person language of "Save my seat!" against the second-person version "Save your seat!" In my case, the first-person version performed better, likely due to creating more personal relevance to the content.

Another way to test language nuances in CTAs is to examine the effect of adding emotional words. For example, comparing "Empower your team" with "Help your team" can reveal how much more effectively emotive language engages your audience.

Another approach is to test phrases that promote risk aversion against those that don't, such as "Don't miss this sale" versus "See the sale." Additionally, evaluating urgency versus invitation, like "Get tickets before they are gone!" versus "Select the best tickets for you!" can help determine which motivates your audience to act more decisively.

By continually refining our approach based on testing insights, we enhance our ability to understand and connect effectively with our email audiences and drive meaningful engagement in our campaigns.

What is a common mistake you see marketers make when they’re running/setting up their tests?

When testing, there are many ways to accidentally introduce bias or generate statistically insignificant data. One prevalent mistake I see is attempting to test too many variables simultaneously, often without realizing it.

A typical example of this is a test comparing the CTAs "Save my seat!" versus "Come join us at this Wednesday's webinar!" At first glance, it might seem like a straightforward A/B test, but it’s not. This comparison inadvertently introduces multiple variables: the urgency and directness of "Save my seat!" against the more casual and informative tone of "Come join us at this Wednesday's webinar!" In addition, you are testing two different lengths and perspectives (first-person vs. second-person).

To conduct more effective A/B testing, it's crucial to isolate one variable at a time. For example, if you want to test the effectiveness of first-person versus second-person language, both CTAs should be similar in length and tone, differing only in the point of view. This approach ensures that any difference in performance can be accurately attributed to the variable in question, thereby providing clearer, more actionable insights.

By simplifying your tests and focusing on one change at a time, you can gain a much deeper understanding of what resonates with your audience and why, ultimately leading to more effective and successful email campaigns.
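
A related trap is statistical: declaring a winner before the difference is significant. As a rough sketch (the send and click counts below are hypothetical), a two-proportion z-test on each CTA's click results can tell you whether an observed gap is likely real or just noise:

    from math import sqrt
    from statistics import NormalDist

    def two_proportion_z_test(clicks_a, sends_a, clicks_b, sends_b):
        """Two-sided z-test: is the difference in click rates significant?"""
        rate_a, rate_b = clicks_a / sends_a, clicks_b / sends_b
        # Pooled click rate under the null hypothesis of no real difference
        pooled = (clicks_a + clicks_b) / (sends_a + sends_b)
        se = sqrt(pooled * (1 - pooled) * (1 / sends_a + 1 / sends_b))
        z = (rate_a - rate_b) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return z, p_value

    # Hypothetical: CTA A got 120 clicks on 5,000 sends; CTA B got 155.
    z, p = two_proportion_z_test(120, 5000, 155, 5000)
    print(f"z = {z:.2f}, p = {p:.3f}")

A p-value under the conventional 0.05 threshold suggests the gap is unlikely to be noise; above it, let the test keep running before crowning a winner.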

Spencer Kollas

Managing Director, Strategy & Analytics
Oracle

What is your favorite email test to run?

Yes–I like every email test, because hopefully it will tell you something new that you can apply in future campaigns.

What is a common mistake you see marketers make when they’re running/setting up their tests?

Trying to test too many things at the same time–limit the number of variables you are testing in order to get a clear test.

Kelly Haggard

VP, Marketing Innovation
Synchrony

What is your favorite email test to run?

Taking personal data on a customer and showing that you know them in an email vs. a batch-and-blast. Seeing the lift when you give someone something more relevant is always satisfying. Personalizing by activating the first-party data brands own is something I wish all brands did better. If I bought your shirts, I want stuff to wear with them or more shirts of that style.

What is a common mistake you see marketers make when they’re running/setting up their tests?

Trying to test too many things and not being able to identify what drove the results. If you completely redesign an email and get a lift but don’t know what drove it, you don’t have actionable, repeatable results. Don’t get me wrong, sometimes a complete redesign is needed but it won’t be clear what drove the win.

Michael Pattison

Lead Digital Strategist
Klaviyo

What is your favorite email test to run?

I like to test sign up forms and welcome programs. The impact of gaining and retaining new customers is unlike any other program CRM pros run. If you nail this step, all things downstream are much easier.

What is a common mistake you see marketers make when they’re running/setting up their tests?

Thinking you're done, or assuming that because something didn't work once it will never work. You have to constantly try things, in different ways and through varying communications. Pro tip: Keep a catalog of test results so employees can easily reference them and use them as idea starters.

Scott Cohen

CEO
InboxArmy

The CEO Recap:

What I love about asking our panel of experts these questions is that you’ll see trends emerge among these smart folks, who bring lots of “reps” in email marketing to the table. In this panel, you’ll see, for example:

Tests lacking focus - a mistake.

Too many variables, no real hypothesis. This puts you in a position where you’re testing for the sake of testing, not testing to learn–which is crucial.

Personalization as the test - an opportunity.

Marketers often think of personalization as something to turn on rather than something to test. What you’re trying to figure out, in a test like Kara’s example of “Personalization vs. BAU,” is the incremental value of adding personalization to your emails. If personalization drives significantly more engagement and conversions, the effort to bring personalization is “worth the squeeze.”
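
As a back-of-napkin illustration (the conversion rates here are hypothetical), that incremental value is simply the relative lift of the personalized cell over BAU:

    def relative_lift(conv_test, conv_control):
        """Relative lift of the test cell over the control (BAU) cell."""
        return (conv_test - conv_control) / conv_control

    # Hypothetical: 2.6% conversion for personalized vs. 2.0% for BAU
    print(f"{relative_lift(0.026, 0.020):.0%} lift")  # -> 30% lift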

Lack of specific goals/KPIs - a mistake.

Measurement is key in any testing environment, and deciding on true KPIs for winners is important. Your conversion metrics are ultimately what matter–and I hope you’re not relying on open rate as a conversion metric, because it’s directional at best. Even subject line tests should be measured down to conversions like sales or possibly clicks. Opens alone are not the way to the promised land.

My thanks to this panel’s respondents.
I look forward to the next edition!

Need Expert Help for Your Email Marketing Program?

InboxArmy is a full-service email marketing and lifecycle marketing agency specializing in email marketing, SMS, push, and in-app messaging execution, strategy, and design. We work with global brands, agencies, and businesses of all sizes and industries.

Get a free consultation today.