The case against testing headlines

Recently I shared the results of an interesting study.

A group of researchers looked at hundreds of A/B tests for ecommerce stores, and basically validated what experienced copywriters have said for generations:

Don’t waste your time fiddling with cosmetics.

Test big things like your guarantee or adding customer testimonials.

In response, reader Karl points out that the study has a glaring omission:

They forgot to test headlines.

Now I can forgive them on this one.

These tests focused on ecommerce stores, which usually have product names rather than real headlines.

(By the way, testing product names is also a good idea.)

That raises a broader question, though, one that I want to tackle today:

Is it even worth testing headlines in the first place?

I’ll go out on a limb here and say that, the way many people test headlines, they’re wasting their time.

What?

Heresy!

Allow me to explain:

When I started working with John at Simple Programmer, his main lead magnet was an email course about how to start a programming blog.

Most of the signups for this lead magnet came from a popup on his site.

The blogging lead magnet converted at just under 1%.

I got excited when I saw that, because I knew a decent popup could do 2-3X better.

I figured I’d run a few A/B tests and score a quick win.

Nothing doing.

I must’ve run a dozen tests with different headlines, and no matter what I tried, I couldn’t seem to move the needle on that turkey.

A headline has two basic purposes:

1. To “flag down” the people who are most likely to be interested in your offer.

2. To get them to read whatever comes next.

The problem with most headline tests is that you’re usually just expressing the same idea with slightly different words.

In my case, the idea boiled down to:

“Blogging will help you get a better software development job.”

If you’re just shuffling words around and the core idea stays the same, that means the group of people that you’re “flagging down” will stay pretty constant—and so will your results.

Instead of testing headlines, I aim to test different “appeals” or benefits.

For example, I could have pitched the same blogging lead magnet by saying:

“Blogging will help you learn new technologies and programming languages faster and more efficiently.”

Or:

“Blogging will help you earn more respect at work.”

In other words, test concepts, not phrasing.

Ask yourself:

What is a radically different way of looking at my offer that might reach a broader segment of my audience?

That’s where you’re likely to find those elusive 200-500% response bumps.