A/B Testing Lifecycle Emails to Skyrocket Value


So you’ve set up a lifecycle email campaign (also known as a drip email campaign), and you’d like to improve it. If you’re as passionate as I am about using real data to make decisions (as opposed to, say, arguing with your coworkers based on intuition), you probably want to use A/B testing to improve the emails you send. You’re in luck!

Read on to learn how to A/B test your lifecycle email campaign to improve your open, click, and conversion rates!

What is a lifecycle email campaign?

Lifecycle emails are emails your company sends to users based on their (indirectly) expressed interest in your product. For instance, if you sell chainsaws and someone visits your page titled “Features to look for when buying a chainsaw,” you can be reasonably sure that this person is comparison shopping. By offering to send them a series of emails about things they need to know before buying a chainsaw, you’ll make them happy (because you’re saving them research) and you’ll make your company money (because they’ll be more likely to trust you and your products).

But this post isn’t about the basics of lifecycle emails. If you’d like help setting one up, you can contact me. From this point on, I’ll assume you already have a lifecycle email campaign set up, and you’re looking to optimize it in a few different areas.


Metrics we want to improve


There are a few key metrics in any successful lifecycle email campaign. Let’s talk about them one by one.


  • The open rate is the percentage of people who open your email after they receive it (emails opened divided by emails sent). Opening your email is, of course, the first step toward you or your customers getting any value out of it! The industry-cited “average” open rate is about 20%, but if your emails are well targeted, with good subject lines, you should be able to do much better than this (I’ve averaged over 70% on a few emails).
  • The click rate (or click-through rate, CTR) is the rate at which people click the links in your email (links clicked divided by emails sent). Note that a 10% click rate is not quite the same as saying that 10% of recipients clicked, since a single recipient can click more than once. Note also that this is measured against emails sent, not emails opened—assuming most users click only once, your click rate will never be higher than your open rate. The industry-average click rate is about 5%, but your own rate may vary widely depending on your content.
  • The conversion rate on your email campaign is the percentage of customers who, after receiving your emails, take some action that’s valuable to your business. Generally, that means purchasing some product, but it might be as simple as filling out your contact form for lead generation. Of the three, this is the hardest metric to integrate into your email reporting—you’ll probably have to use your Web analytics tools to see this number. This metric, too, will vary widely depending on your business. (All three calculations are sketched in the short code example after this list.)
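
If you’d like to compute these rates yourself (say, from an export of your provider’s raw counts), the arithmetic is just division. Here’s a minimal Python sketch; the counts are made up for illustration, not data from any real campaign:

```python
# Illustrative counts only; substitute your own campaign's numbers.
emails_sent = 1000      # emails delivered in the campaign
emails_opened = 316     # unique opens
links_clicked = 64      # total link clicks (one person may click twice)
conversions = 12        # purchases (or leads) attributed to the campaign

open_rate = emails_opened / emails_sent        # 0.316 -> 31.6%
click_rate = links_clicked / emails_sent       # 0.064 -> 6.4%
conversion_rate = conversions / emails_sent    # 0.012 -> 1.2%

print(f"Open rate:       {open_rate:.1%}")
print(f"Click rate:      {click_rate:.1%}")
print(f"Conversion rate: {conversion_rate:.1%}")
```

Because links_clicked counts total clicks, one enthusiastic clicker can inflate the click rate, which is exactly the caveat noted above.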


Collectively, these three metrics are stand-ins for measuring customer engagement with your emails. Increase engagement and you’ll increase customer happiness—and your sales.


[Image: My Customer.io dashboard, showing an open rate of 31.6% and a click rate of 6.4% for a lifecycle email campaign I implemented for X-Plane]


The image above is taken from my favorite service for sending lifecycle emails, Customer.io. However, your own email provider should offer similar reporting capabilities.




[Aside: If you’re looking for a new service provider for sending your emails (because you certainly don’t want to be sending emails from your own server!), I can’t recommend Customer.io highly enough. Their customer service is absolutely over-the-top-amazing. I, as a run-of-the-mill user, have chatted with the CEO on a number of occasions, and he’s asked for my feedback on new features! (As usual, I’m in no way compensated for recommending their service. My only affiliation is that I’m a very happy, paying customer.)]


Ideas for improving your customer engagement


Having defined the metrics we’re interested in, let’s talk about how you can improve them in your own lifecycle email campaigns.


Improve the open rate on your lifecycle email campaign


Improving the open rate is almost entirely a function of improving your subject lines and the first line in the body of your email. A good subject line should:


  • Promise almost-but-not-quite unbelievable value—then deliver on it. Your readers should be intrigued at the possibility of getting so much value out of your email, but they shouldn’t write your promises off as absurd. For instance, if your business sells life insurance, you might use “Lower your insurance premiums month after month,” not “Cut your premiums by 90%, guaranteed!” The reasons for this are twofold: First, you want to make sure you can deliver on your subject line to all your readers (assuming they have the minimum level of qualification to be on your email list), not just some. Second, unless you have a truly outstanding track record with your readers, you’ll risk being ignored entirely—or worse, marked as spam!
  • Include your branding and some indication of continuity with the other emails you’ve sent. This might be as simple as adding “Part x of y, <name of your email course>” to the end of the subject line—just make sure your product or Web site name is in there somewhere. This helps the reader remember that they signed up for this course—if they remember “oh, I signed up for this,” they’re much more likely to read than if you seem to be emailing out of the blue.


The other major factor affecting your emails’ open rate is the first line of the body, which is shown as a preview in many email clients. The biggest mistake you can make here is having the first line of your email read “View this online” or something else irrelevant to your subject. The first line should support the subject; if your subject line promised nearly unbelievable value, your first line should tell the reader you’re serious about providing that.


Improve the click rate on your emails


To improve the click rate in your emails, focus on writing better calls to action (CTAs) in the body. These come in a variety of forms, such as the classic, gigantic button or in-text callouts like “click here” or “learn more about <some topic that you may be interested in>.”


As a rule, I’m not a fan of call-to-action buttons in emails—they violate readers’ expectations about what kind of content is included in email, placing your email closer to the realm of super-pretty, sales-heavy newsletters and farther from a candid conversation between trusted peers. With that said, there are plenty of knowledgeable email copywriters who do use big call-to-action buttons successfully.


I’m also not a fan of “click here”—and you shouldn’t be either. It tells the reader nothing about what they might find. As Steve Krug says, “don’t make me think!” If your readers have to sit and ponder what the page you link to might contain, you’ve probably lost your chance to engage them.


As you might have guessed by now, I prefer links to be descriptive—the reader should have a good idea of what they’ll find “on the other side” of the link. Consider: Are you more likely to click a link to the W3 tips on writing good link text, or click here?


Now, the best links—the links that get the most engagement from your users—combine the traits of all 3 of these link styles. Put the link in its own paragraph, ask the user to click on it, and be descriptive. This has the advantage of still being conversational (your email doesn’t look like a sales flyer), directly asking the reader to take action, and telling them exactly what they’ll see when they click. Your link might look like this:

Click here for the 11 Commandments of Linking!


Some very knowledgeable people (like Patrick McKenzie, author of the Kalzumeus blog) recommend improving your engagement metrics by asking for other means of engagement, such as responding to an email.


While this works for them, I prefer to aim for engagement that is directly in line with my business goals for a particular product. Where hitting the Reply button is in line with those goals, though, I think this is great!


As an aside, beware of “cooking the stats.” For instance, if you were to make every word in your email into a link, then use CSS to make the words appear as normal text, you’d likely skyrocket your click rate, but at the expense of pissing off every person you email. Likewise, you might be able to improve open rates by promising something in the subject line beyond what you can deliver, but again, this kind of trickery does not result in happy customers who want to pay you money. Tricking customers does not improve engagement!


Improve your campaign’s conversion rate


Improving the conversion rate for your lifecycle email campaign is largely a function of improving customer engagement (open rate and click rate), then asking for the conversion. It sounds simple, but you’d be surprised how many people miss this one!


If your campaign is aimed at educating users who might be interested in purchasing your product, you need to make it absolutely clear what kind of benefit they’ll derive from your product. Then ask them to consider purchasing. If they still haven’t bought it a few days later, ask again!


I’m not suggesting you turn your emails into billboards advertising your product, but you should certainly talk about the problems you can solve for the reader whenever it’s appropriate. My guideline is to make my emails about 75% educational and 25% sales. Some emails will be pure education, and a small number will be pure sales.


The key here is deciding when to ask for the sale. If you ask for it too early—before the user has gotten serious value out of your emails—you’ll probably blow it. But, after they’ve read 3 or 4 emails and you’ve shown them how you can solve their problems, your readers will be very open to your request to consider purchasing.


Again, just keep it conversational—this should read like a request from a peer, not a sales pitch on late-night TV.


Implementing A/B testing in your lifecycle email campaign


By now, you should have quite a few ideas for how to improve the emails you’re sending. Awesome! Let’s talk about implementing the actual tests that will tell you whether your ideas beat what you already had.


Most email service providers don’t offer built-in A/B testing tools (at least, none that I know of do… if you know of one that does, let me know about it in the comments!). Thus, we’re forced to implement our own guerrilla A/B tests.


Here’s how I do it:


  1. Choose the campaign you want to improve, and create a duplicate of it. For instance, during my work with X-Plane, I created a campaign to educate users of their demo, so I had 2 campaigns titled “Educate demo users, Group A” and “Educate demo users, Group B.” Initially, these will have identical content.
  2. Modify the emails in the B group with your improvements.
  3. Assign users to the two groups at random. Perhaps the easiest way to do this is to look at the system time mod 2 when the user gets signed up: if time % 2 is zero, add the user to Group A; otherwise, add them to Group B (see the sketch just after this list). Customer.io provides a nice guide for setting up A/B testing in their documentation.
  4. Now that you’re sending emails… wait! (Below we’ll talk about how long, exactly, you should wait.)
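
To make step 3 concrete, here’s a minimal Python sketch of the even/odd-second assignment. The function name and campaign titles are my own illustration, not a Customer.io API; any mechanism that splits signups roughly 50/50 will do:

```python
import time

def assign_ab_group() -> str:
    """Assign a new signup to Group A or Group B.

    Uses the parity of the current Unix time, as described in step 3:
    even seconds go to Group A, odd seconds to Group B. Over many
    signups, this approximates a 50/50 random split.
    """
    if int(time.time()) % 2 == 0:
        return "Educate demo users, Group A"
    return "Educate demo users, Group B"

# Tag the user with the returned group name at signup, then have each
# campaign in your email provider trigger only on its own tag.
print(assign_ab_group())
```

Hashing the user’s ID modulo 2 works just as well and has the advantage of being reproducible, but the clock trick is the easiest to wire up.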


Unless you’re sending a great deal of email, it will take a long time to reach statistical significance, so I recommend testing improvements to all your subject lines, first lines, and body content at once. That way, for a given A/B test on a given email campaign, you’ll be measuring lots of variables. Note that this assumes you have access to per-email statistics. For instance, using Customer.io, I can see the open rate and click rate for each email individually in any given lifecycle campaign.


[Image: Open rate and click rate for each email in a lifecycle email campaign. This per-email data is critical for A/B testing.]


If, for some reason, you do not have access to this detailed breakdown, you’ll want to change far fewer aspects of your lifecycle campaign in a single A/B test—after all, if you change every subject line in your campaign at once, but you only have engagement metrics at the whole-campaign level, you’ll never know what was responsible for the change.


This will make your A/B tests much less effective—in fact, I’d recommend switching email service providers, since the time required to effectively test changes to a full campaign will be so much greater without this data.


The waiting game: Ensuring statistical significance in your A/B test


You’ve put so much work into setting up your A/B test. It would be a shame to not get the most benefit possible out of it.


So, before you start getting results, you need to determine how long you’re going to run the test. Visual Website Optimizer has a lovely A/B test duration calculator that you should use. If you’re testing changes to many emails (and you have stats broken down by email, as discussed above), you’ll need to use this tool for each email in the campaign. That is, some of your emails may reach statistical significance (allowing you to make a decision on them) long before others do.


[Image: Visual Website Optimizer’s A/B test duration calculator]

Let’s go through what each field in the VWO tool means for your lifecycle emails.


  1. Existing conversion rate. Depending on which engagement metric you want to improve, this will probably be either the open rate or the click rate for your A Group (the original campaign). (Note that in this case, we’re counting an open or a click as a “conversion,” since that’s the thing we’re trying to improve.) For instance, when A/B testing my X-Plane lifecycle email campaign, I started with an open rate of about 30% on one particular email.
  2. Expected improvement in conversion rate. This is the “sensitivity” of the test: the smallest improvement the test is designed to reliably detect. Assuming you’re testing big changes (rather than micro-optimizations), a 15% improvement is a good baseline. The larger the expected improvement, the faster you’ll get significant results; the faster you get results, the faster you can try more new things!
  3. Number of combinations. Unless you’re running many tests at once (which I don’t really recommend), this should be 2—you’re only testing the A Group against the B Group.
  4. Percent visitors included in test. Unless your platform for setting up the A/B tests is quite sophisticated, you’ll be including 100% of your “visitors” (i.e., people who receive your emails) in your test.


Clicking the Calculate Test Duration button will tell you how long to run the tests. If, after this many days, you see that your B Group has indeed improved by the “expected improvement in conversion rate,” you have a winner! You can say with confidence that the new version of the email is indeed better than the old one, and you should adopt it immediately.
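
If you’re curious what the calculator does under the hood (or want to sanity-check its output), here’s a back-of-the-envelope version using the standard two-proportion sample-size formula. This is my own rough reconstruction, not VWO’s exact math, and the example numbers are invented:

```python
import math

from scipy.stats import norm  # normal quantiles for significance and power

def test_duration_days(p1: float, rel_improvement: float,
                       emails_per_day: int,
                       alpha: float = 0.05, power: float = 0.80) -> int:
    """Estimate how many days a two-group test must run to detect
    a relative lift of rel_improvement over baseline rate p1."""
    p2 = p1 * (1 + rel_improvement)   # rate we hope Group B achieves
    p_bar = (p1 + p2) / 2             # pooled rate under the null
    z_a = norm.ppf(1 - alpha / 2)     # two-sided significance threshold
    z_b = norm.ppf(power)             # power requirement
    n_per_group = ((z_a * math.sqrt(2 * p_bar * (1 - p_bar))
                    + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
                   / (p2 - p1) ** 2)
    return math.ceil(2 * n_per_group / emails_per_day)

# Example: 30% baseline open rate, hoping for a 15% relative lift,
# sending 100 emails a day across both groups -> roughly a month.
print(test_duration_days(0.30, 0.15, emails_per_day=100))
```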


Once again, let me reiterate that each of your emails may need a different amount of time to reach statistical significance. Be patient and wait for your results—as Evan Miller writes, “peeking” (i.e., making a decision before you have the right number of visitors) will completely undermine the validity of your A/B test!
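
When the calculated duration is up, you can verify the result yourself with a simple two-proportion z-test, run exactly once at the end. Here’s a sketch using statsmodels; the counts below are invented end-of-test numbers:

```python
from statsmodels.stats.proportion import proportions_ztest

# Invented end-of-test counts: opens out of emails sent, per group.
opens = [508, 590]    # Group A, Group B
sent = [1700, 1700]

z_stat, p_value = proportions_ztest(count=opens, nobs=sent)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")

# A p-value below 0.05 means the open-rate difference is statistically
# significant. Resist the urge to run this check early; that is exactly
# the "peeking" Evan Miller warns about.
```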


This should be more than enough to get you started A/B testing your lifecycle emails. Drop me a comment if you have questions. I’d also love to hear about any big wins you get for your company using this method!


More Resources


  • “The Two Most Important Lines in Your Email,” from the Customer.io blog. Includes tips on improving the one-two punch that is your subject line + first body line.
  • “You Should Probably Send More Email Than You Do,” by Patrick McKenzie, on the Kalzumeus blog. The most valuable sections here, at least if you already have a lifecycle email campaign, are “Emails Get Opened And Change Behavior” and “Permission And Trust Are Worth Money.”
  • “Don’t Click Here: The Art of Hyperlinking,” by Jeff Atwood, on the Coding Horror blog. Jeff gives 11 best practices for usability in your hyperlinks. A lot of it is common sense, but some are more subtle—for instance, don’t hyperlink things that users might want to highlight. Ignore these at your own peril!
  • “3 Email Remarketing Ideas You Are Missing Out On,” by Chris Hexton, on the Unbounce blog. Chris talks about the winner of one A/B test being the email that was personal (it came from one of the founders), educational, and direct in asking for a response.
  • “A/B Split and Multivariate Test Duration Calculator,” from Visual Website Optimizer. If you’re running A/B tests by hand, this is an invaluable tool for calculating how many visits you’ll need for statistically significant results.
  • “How Not to Run an A/B Test,” by Evan Miller. Miller analyzes how much less confidence you can have in your A/B testing results if you “peek” at your data (by testing for statistical significance multiple times during the allotted time).
