A/B Testing Emails: The Complete Ecommerce Guide

A/B testing is where email marketing stops being guesswork and starts being a data-driven growth engine. Yet most ecommerce brands either don't test at all, or test randomly without a framework — changing too many variables at once and drawing conclusions from insignificant sample sizes. Done right, systematic A/B testing compounds over time: a 5% improvement here and a 10% improvement there can double your email revenue within a year.
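The compounding claim above is simple arithmetic: winning tests multiply rather than add. A quick sketch, where the individual lift values are illustrative assumptions, not benchmarks:

```python
# Illustrative sketch: modest winning tests compound multiplicatively.
# The lift values and baseline below are hypothetical, not benchmarks.
baseline_revenue = 100_000  # annual email revenue before testing

# Ten winning tests over a year, each a 5-10% lift:
lifts = [0.05, 0.10, 0.05, 0.08, 0.06, 0.10, 0.07, 0.09, 0.05, 0.08]

revenue = baseline_revenue
for lift in lifts:
    revenue *= 1 + lift  # each win multiplies the new baseline, not the old one

print(f"After {len(lifts)} winning tests: ${revenue:,.0f}")  # roughly double
```

Ten wins averaging about 7% each roughly double revenue, which is why the cadence of testing matters as much as the size of any single win.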

Here's the complete guide to A/B testing emails for ecommerce, from what to test to how to scale your winners.

What to Test (In Priority Order)

Not all tests are created equal. Focus on the variables that have the biggest impact on revenue first, then work your way down to refinements.

1. Subject Lines

Subject lines determine whether your email gets opened at all, which makes them the highest-leverage test you can run. Test length (short and punchy vs. descriptive), personalization (first names or browsed products), urgency and scarcity framing, emoji vs. none, and questions vs. statements.

2. Send Times

When you send matters more than most brands realize. Test morning (8-10 AM) vs. afternoon (1-3 PM) vs. evening (7-9 PM). Also test days of the week — Tuesday and Thursday are conventional wisdom, but your audience may be different. The key insight: optimal send time varies by segment. Your VIP customers may engage at different times than your broader list.

3. Offers and Incentives

Offers directly impact revenue. Test percentage discounts vs. dollar amounts, free shipping vs. a discount, gift-with-purchase, and offers with vs. without a deadline.

4. Email Design and Layout

Test image-heavy designs against simpler, text-forward layouts, and single-column against multi-column structures.

5. Calls to Action (CTAs)

Test CTA copy ("Shop Now" vs. "See the Collection"), placement (above vs. below the fold), and buttons vs. text links.

6. Content and Copy

Test long-form vs. short copy, and a conversational tone vs. a direct, offer-led one.

Statistical Significance Explained Simply

Here's the part most brands get wrong. Statistical significance means your test result is likely real and not just random chance. In practical terms, you need enough people in each test group for the results to be trustworthy.

The benchmark most statisticians use is 95% confidence, meaning there's only a 5% chance the winning variant won by luck. Reaching 95% confidence typically takes thousands of recipients per variant; the exact number depends on your baseline rate and on how small a lift you want to detect.

If your list is smaller than these numbers, focus on testing high-impact variables (subject lines) where differences are large enough to detect with smaller samples. Don't test button colors on a 2,000-person list — you'll never get meaningful data.
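To see why small lists can't support small-effect tests, here is a sketch of the standard normal-approximation sample-size formula for comparing two proportions. The function name, the example rates, and the 80% power assumption are mine, not from any email platform:

```python
import math

def sample_size_per_variant(p1, p2, z_alpha=1.96, z_beta=0.84):
    """Approximate recipients needed per variant to detect a change
    from rate p1 to rate p2 at 95% confidence and 80% power,
    using the normal approximation for two proportions."""
    effect = abs(p2 - p1)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / effect ** 2)

# A 10% relative lift in open rate (20% -> 22%) needs a few thousand per variant:
print(sample_size_per_variant(0.20, 0.22))

# The same relative lift in click rate (2.0% -> 2.2%), e.g. a button tweak,
# needs tens of thousands per variant:
print(sample_size_per_variant(0.020, 0.022))
```

The second call is the math behind the button-color warning: subtle changes to low-baseline metrics need sample sizes far beyond a small list.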

The A/B Testing Framework

Follow this five-step process for every test:

1. Form a hypothesis and isolate a single variable to change.
2. Split your audience randomly into equal groups.
3. Run the test until you reach 95% statistical significance.
4. Judge the result by revenue, your primary success metric.
5. Document the outcome and implement the winner immediately.

Common A/B Testing Mistakes

The failure modes from the intro are worth spelling out, because each one invalidates an otherwise well-run test:

- Changing several variables at once, so you can't tell which change drove the result.
- Declaring a winner before the test reaches statistical significance.
- Testing low-impact variables on a list too small to ever produce meaningful data.
- Not documenting results, so winners never compound and the same tests get rerun.

Setting Up A/B Tests in Klaviyo

Klaviyo's built-in A/B testing is solid for campaigns. Go to Campaigns, create a new campaign, and toggle on "A/B test this campaign." You can test subject lines, content, or send times. Set your sample size (we recommend 20-25% of your list per variant), define the winning metric and wait time, and the winning variant automatically sends to the remainder.

For flows, Klaviyo supports conditional splits that function as A/B tests. Add a conditional split, set it to a random sample, and route 50% of recipients to each branch. Each branch can have different emails, timing, or content. Monitor performance in the flow analytics and manually route 100% to the winner once you reach significance.
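Since flow splits require you to call the winner manually, here is a minimal sketch of a two-proportion z-test you could run on the conversion counts from flow analytics. The function name and the example numbers are hypothetical, not Klaviyo output:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's at 95% confidence
    (i.e., |z| > 1.96)?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)      # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return z, abs(z) > 1.96

# Hypothetical flow-split results: 5,000 recipients per branch
z, significant = two_proportion_z(conv_a=150, n_a=5000, conv_b=205, n_b=5000)
print(f"z = {z:.2f}, significant: {significant}")
```

If `significant` is False, keep the split running rather than routing 100% to the better-looking branch.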

Key Takeaway

A/B test in priority order: subject lines first, then offers, then design and CTAs. Always isolate one variable, wait for statistical significance (95% confidence), and track revenue as your primary success metric. Document every test result and implement winners immediately. The brands that test systematically don't just improve — they compound their advantages over time.

Ready to Scale Your Email Revenue?

Get a free audit of your current email program and see exactly where the opportunities are.

Get Your Free Audit