At eduWeb I chimed in about A/B testing on emails during the Q&A session of Kyle James' presentation (that's me around 27:40). I just wanted to share an example of a recent test that I did.
The email announced that our online app was available. I knew from last year's send that the subject line was fairly effective ("Butler's Online App Now Available!"), so I focused on the content to see whether I could push more students to click through and take action.
For the 2007 send, it went to 14,650 students, and the results are below.
For 2008, we have 17,566 students to email. Rather than simply repeat last year's send, I first ran an A/B test: the first email was identical to last year's, and the second added a button graphic to see whether it would increase clickthrough rates.
Test A: Same email as last year.
Test B: Same email, with a clickthrough button graphic added.
Results: Each test was sent to 3,500 random students on 8/7/08. After 36 hours, the winning version would be sent to the remaining 10,566 students.
Test A: 3,282 delivered. 339 opens (10.3%), 72 clickthroughs (21.2% of opens) as of 8/11
Test B: 3,292 delivered. 719 opens (21.8%), 273 clickthroughs (38.0% of opens) as of 8/11
2008 Send: 9,454 delivered. 1,378 opens (14.6%), 449 clickthroughs (32.6% of opens) as of 8/11
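If you want to sanity-check the percentages above, note that the open rate is opens divided by delivered emails, while the clickthrough rate is clicks divided by opens. A quick sketch (the helper name is mine, not part of any email tool) reproduces the numbers from the post:

```python
# Reproduce the reported rates: open rate = opens / delivered,
# clickthrough rate = clicks / opens. Counts are from the post;
# the `rates` helper is just for illustration.

def rates(delivered, opens, clicks):
    """Return (open rate %, clickthrough rate %) rounded to one decimal."""
    return round(100 * opens / delivered, 1), round(100 * clicks / opens, 1)

print(rates(3282, 339, 72))    # Test A  -> (10.3, 21.2)
print(rates(3292, 719, 273))   # Test B  -> (21.8, 38.0)
print(rates(9454, 1378, 449))  # 2008 winner send -> (14.6, 32.6)
```

Measuring clickthroughs against opens (rather than against delivered emails) is why Test A can show a respectable 21.2% clickthrough rate despite far fewer people acting overall.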
And for comparison,
2007 (1): 14,650 delivered. 5,137 opens (35%), 853 clickthroughs (16% of opens) after 1 month
2007 (2): 9,513 delivered. 1,232 opens (13%), 270 clickthroughs (22% of opens) after 1 month
So in the first four days, we have had 47% as many opens as last year's initial send (2,436) and 93% as many clickthroughs (794). These numbers will continue to rise over the days and weeks ahead. Based on the early numbers, I can call Test B a success: a 34.4% clickthrough rate to date, compared to a 17.6% clickthrough rate over the course of last year's entire campaign. Those are results.
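For readers who want more than eyeballing, a two-proportion z-test (my addition, not part of the original campaign analysis) confirms that Test B's edge is not a fluke of the 3,500-student samples. Comparing clicks per delivered email, Test B's advantage is far beyond what chance would produce:

```python
# Two-proportion z-test on clicks per delivered email,
# Test A (72 of 3,282) vs. Test B (273 of 3,292). Counts from the post.
from math import sqrt

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """z statistic for the difference between two proportions (pooled SE)."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

z = two_proportion_z(72, 3282, 273, 3292)
print(round(z, 1))  # roughly 11, far past the usual 1.96 cutoff at 95% confidence
```

With a z statistic around 11, sending the Test B version to the remaining students was clearly the right call, even after only 36 hours.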
Even if we don't do a second send this year, I think we'll come close to last year's cumulative opens. I might do a second send with a different subject line to see how that affects open rates, keeping the Test B content to continue pushing clickthroughs higher.
I started thinking about the button after reading Designing The Obvious on a flight last week. It's a really good book, and it got me thinking about how to incorporate more design-friendly elements into my emails.
I’d encourage you to consider an A/B test in the future and see how you can make the most of your email campaigns. This isn’t a new technique, but it’s usually overlooked.