## What You Can Test
| Element | Example |
|---|---|
| Subject lines | "Your exclusive offer inside" vs. "We picked these just for you" |
| Email content | Different copy, layout, or messaging approach |
| Send times | Morning vs. afternoon delivery |
| Product recommendations | Different product selection strategies |
| Call-to-action | "Shop Now" vs. "See What's New" |
## How A/B Testing Works
### Create variants
When building a campaign, click Add A/B Test. Create two or more variants — you can let AI generate them or write them manually.
### Set traffic split
Choose how to divide your audience between variants. Common splits:
- 50/50 — equal distribution for clear comparison
- 70/30 — test a variation against your control
- 33/33/33 — test three variants simultaneously
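Under the hood, a traffic split boils down to mapping each recipient to a variant in proportion to the chosen weights. A minimal sketch of one common approach (hashing the recipient ID so the same person always lands in the same variant; the function name and weights here are illustrative, not the product's API):

```python
import hashlib

def assign_variant(recipient_id: str, weights: dict[str, float]) -> str:
    """Deterministically map a recipient to a variant by traffic weight."""
    # Hash the recipient ID to a stable point in [0, 1).
    h = int(hashlib.sha256(recipient_id.encode()).hexdigest(), 16)
    point = (h % 10_000) / 10_000
    # Walk the cumulative weights until the point falls inside a bucket.
    cumulative = 0.0
    for variant, weight in weights.items():
        cumulative += weight
        if point < cumulative:
            return variant
    return variant  # fallback for floating-point rounding at the boundary

# A 70/30 split: variant A is the control, B is the challenger.
print(assign_variant("user-123@example.com", {"A": 0.7, "B": 0.3}))
```

Hashing (rather than random assignment) keeps the split stable: re-running the campaign logic never moves a recipient between variants.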
### Define success metric
Choose what you’re optimizing for:
- Open rate — best for subject line tests
- Click rate — best for content and CTA tests
- Conversion rate — best for overall effectiveness
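Once a success metric is chosen, declaring a winner is just a comparison on that metric across variants. A minimal sketch (the data shape and function name are illustrative assumptions, not the product's API):

```python
def pick_winner(results: dict[str, dict[str, float]], metric: str) -> str:
    """Return the variant with the highest value for the chosen metric."""
    return max(results, key=lambda name: results[name][metric])

# Example: two variants with their measured rates.
results = {
    "A": {"open_rate": 0.243, "click_rate": 0.031},
    "B": {"open_rate": 0.317, "click_rate": 0.048},
}
print(pick_winner(results, "open_rate"))   # B
print(pick_winner(results, "click_rate"))  # B
```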
## Best Practices
- Give tests enough time — allow at least 3-5 days for meaningful data
- Use a large enough audience — tests work best with 1,000+ recipients per variant
- Test subject lines first — they have the biggest impact on overall performance
- Iterate — use winning variants as the starting point for your next test
## Viewing Results
A/B test results are available in your campaign analytics:

| Metric | Variant A | Variant B |
|---|---|---|
| Sent | 5,000 | 5,000 |
| Open Rate | 24.3% | 31.7% |
| Click Rate | 3.1% | 4.8% |
| Conversions | 42 | 67 |
| Revenue | $2,340 | $3,890 |
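Before acting on a result like the one above, it is worth checking that the gap is larger than random noise. One standard way to do this, a two-proportion z-test on the open counts implied by the table (a sketch, not a feature of the analytics dashboard):

```python
from math import sqrt

def two_proportion_z(successes_a: int, n_a: int,
                     successes_b: int, n_b: int) -> float:
    """Z statistic for the difference between two proportions."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    # Standard error of the difference under the pooled proportion.
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Opens from the table: 24.3% and 31.7% of 5,000 sends each.
z = two_proportion_z(1215, 5000, 1585, 5000)
print(round(z, 2))  # |z| > 1.96 means significant at the 95% level
```

Here the z statistic comes out far above 1.96, so the open-rate difference between variants A and B would not plausibly be due to chance.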