This post originally appeared in the December 19, 2019 issue with the email subject line "Traffic metrics hurt your business. Measure this instead." and a review of publisher analytics tool Parse.ly.

I've gathered several blog posts from various years that talk about how pageviews are straight-up not a good measurement of content success. Yet I still see pageviews as core metrics on marketing agency, publisher and multibillion-dollar business content dashboards. Businesses make terrible decisions based on pageviews and basic traffic metrics. All the time. In 2019!

Because pageviews are the easiest metric to pull from Google Analytics. No single metric is ever going to fully describe content’s impact on business performance, but whatever that metric turns out to be, it’s certainly not pageviews. The number of times I’ve used the word “pageviews” in this paragraph? That's how tired I am of seeing pageviews as a metric in reports.

I mean, at least use unique pageviews if you’re gonna be that basic. Like, even 2009 Gawker figured that shit out.

Pageviews are popular because you can game them easily. Because you don’t need to think to understand “more.” Pageviews inspire business leaders to throw babies out with the bathwater, to think that content is interchangeable, to think that they can game the system.

You can rejigger content design so pageview numbers go up. And up and up. They go up with advertising and email blasts. They go up when you badly reorganize your content and your users need to click through twenty pages to find what they’re looking for.

Our culture rewards only numbers that go up and to the right. Our culture rewards the simplicity of pageviews. It’s easy to sell the 20th-century scourge that is impression-based advertising against pageviews.

Bruce Campbell says, "Well, hello Mr. Fancypants. I got news for you, pal."

Your content deserves better than pageviews.

The Google Analytics traffic metric Sessions improves on pageviews. A session measures a single visit to a website, so one user’s multiple actions during one visit are counted as just one session.

But even sessions are kinda garbage, mostly because they only answer the question of “how much?” and the only answer is “more.” Sessions are a traffic metric that serves a mass communications model: reach the most people, as many as possible. More more more.

Now, you need a significant amount of traffic — let’s say 100 daily sessions — to make the next model work. But once you hit that threshold, stop focusing on growing raw traffic, because otherwise you’ll forget about the actual people reading your content. Focus instead on growing engagement.

I bet you think I’m going to tell you that you need conversion actions! Newsletter signups and assisted conversions and product purchases and paying subscriptions! Heck yeah, you need those! But if you go after those too aggressively you’ll alienate your users before you’ve even made a connection. (What’s up, every Conde Nast publication that serves me an email signup popup when I’ve literally clicked to your website directly from your email? Yeah I don’t visit your websites as frequently anymore. That’s right.)

We’re going to look at what you should measure in the stretch between when a user arrives at your site and when they’re ready to convert. Measuring the slow burn of engagement matters for content marketers and publishers alike. So let’s get engaged!

A tiny mouse puts an engagement ring on a person

How do you measure content based on how people read?

Any content creator worth their salt wants to know how many readers give a shit. How many readers said “Awww yeah, I'll read something like this again”? Or “I don't wanna close this tab.” Or “I actually learned something from that.”

Many have tried to measure this before. Scroll depth was one of those failed metrics everyone implemented in the mid-2010s, on the theory that if you scrolled to the bottom of the page, you were engaged. Now it just means you’ve scrolled past all the excessive ad units. Tracking scroll depth most often means that you fucked up your bounce rate benchmark for years, because it counted any scrolled content as not-a-bounce. Bounce rate is a useful metric if you don’t fuck it up! (If your bounce rate is 8%, please change your settings so scrolling to the bottom of a page doesn't count as a second pageview. Love, your content analyst.)
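
If your scroll tracking fires Google Analytics events through gtag.js (an assumption on my part; plenty of setups fire virtual pageviews instead), the fix is a single parameter. A minimal sketch, with a made-up event name and a 75% threshold:

```javascript
// Hedged sketch: scroll-depth tracking that doesn't wreck bounce rate.
// Assumes gtag.js is already loaded on the page; the event name,
// category and 75% threshold are illustrative, not a standard.
let scrollDepthSent = false;

window.addEventListener('scroll', function () {
  if (scrollDepthSent) return;
  const scrolled = window.scrollY + window.innerHeight;
  const pageHeight = document.documentElement.scrollHeight;

  if (scrolled / pageHeight >= 0.75) {
    scrollDepthSent = true;
    gtag('event', 'scroll_depth', {
      event_category: 'engagement',
      event_label: '75%',
      non_interaction: true // the key bit: this hit no longer cancels a bounce
    });
  }
});
```

Whatever your setup looks like, the principle is the same: a scroll on its own shouldn't count as an interaction, or you end up with that 8% bounce rate.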

You can try surveying your users to understand their engagement, but only a very small percentage of very engaged users will complete your survey. Looking at you, everyone who read the email but did not complete the most recent one-question Content Technologist survey.

You certainly don’t want to creep your users out by using personal data to track their every move. That's gauche; I’m not even going to talk about everything wrong with that.

You want to know: “Is this content viable?” So.

Indiana Jones waits for just the right time to replace the gold weight with his decoy.

I wish I could tell you it’s just a 1:1 replacement for sessions and that you could easily find this measurement in Google Analytics. It’s not. (And Google Analytics is designed to help marketers sell widgets, not brain space, but we'll get to that in another issue.)

Remember that goldfish attention-span metric of 8 seconds that everyone quotes as a measure of our “distraction”? Digital readers are distracted, sure, but we're people, not goldfish. As much as you might say “people are stupid” if you get a result that you dislike, people are discerning.

We have filters. Most of us who read online have good digital filters, read in F-patterns, decide quickly whether your content is worth our time. We’re not going to read past the first subheading if we know the content is basic or offers zero original insight. Digital readers are going to pause on any kind of movement or complex visuals like gifs, data visualization or video.

We want to ascertain how many people skim your page but then decide to read at least some portion of that content. Did your content survive their filter?

Surviving the filter

The average reading speed for an English-speaking US adult is 265 words per minute (wpm), with college students averaging 300 wpm. Speed readers hit 700 wpm. So what's an average skim speed? I bet I can find it somewhere in mass comm research, but this newsletter needs to go out tomorrow, so I’m going to say: 600 words per minute.

Based on your word count, how long should a reader spend skimming a page? I’d max out at 90 seconds for really long articles on this one, but for all intents and purposes let’s say: anything above one minute counts as an engagement. If you’re on the page after one minute, you give a shit.*
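
To make the arithmetic concrete, here's a napkin-level sketch using the 600 wpm skim speed I just invented and the one-minute-to-90-second range above:

```javascript
// Napkin math: how long should a skimmer plausibly stay on a page?
// 600 wpm is the skim speed guessed above; the 60-second floor and
// 90-second ceiling come from this section, not from research.
const SKIM_WPM = 600;

function engagementThresholdSeconds(wordCount) {
  const skimSeconds = (wordCount / SKIM_WPM) * 60;
  // For all intents and purposes: one minute, maxing out at 90 seconds
  // for really long articles.
  return Math.min(Math.max(skimSeconds, 60), 90);
}

console.log(engagementThresholdSeconds(800));  // 80 seconds
console.log(engagementThresholdSeconds(3000)); // 90 seconds (capped)
console.log(engagementThresholdSeconds(400));  // 60 seconds (floor)
```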

But that metric alone doesn’t count for anything. So I’m only going to look at the number of users who have spent more than one minute on any single text-only page.* Pages with complex visuals like infographics, gifs and videos will likely hold users a little longer, but not much.

So. Let’s calculate the number of users who deemed our content worthy of passing their filter. They’re the filter survivors.

Gloria Gaynor sings, "I will survive!"

But I’d go one further on the filter survival metric: look at returning users who spend more than one minute on any content-driven page (i.e., not your homepage unless you put significant content on your homepage). Those are people who have been to your publication before and still find your content valuable.* Likely they’ll be here again: once a month, once a week...

Even better than that, try calculating your filter survival rate: the number of filter survivors divided by your total number of on-page users.
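
If you want to run the numbers yourself, here's roughly how I'd do it against per-user time-on-page data exported from your analytics tool. The record shape is invented for illustration; your export won't look exactly like this:

```javascript
// Hypothetical sketch: count filter survivors and compute the filter
// survival rate from exported page-view records. The fields userId,
// isContentPage and timeOnPageSeconds are made up for this example.
const ENGAGEMENT_THRESHOLD_SECONDS = 60;

function filterSurvival(pageViews) {
  const onPageUsers = new Set(); // everyone who landed on a content page
  const survivors = new Set();   // users who stuck around past the threshold

  for (const view of pageViews) {
    if (!view.isContentPage) continue; // skip the homepage, unless it's content-heavy
    onPageUsers.add(view.userId);
    if (view.timeOnPageSeconds > ENGAGEMENT_THRESHOLD_SECONDS) {
      survivors.add(view.userId);
    }
  }

  return {
    survivors: survivors.size,
    survivalRate: onPageUsers.size > 0 ? survivors.size / onPageUsers.size : 0,
  };
}
```

Filter the same records down to returning users and you get the stricter version from the previous paragraph.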

Bonus points if they don’t bounce. Bonus points for returning users whose session duration is above five minutes. Bonus points if you've figured out a better metric and want to tell me about it.

Mostly the above has been an exercise. I’m sure far more fluent data scientists than I — not an actual data scientist, just a trained editor who looks at a lot of charts — have better thoughts on how to measure engaged content. But once you’ve spent time seriously considering how an engaged user might behave, you’ll have a better understanding of what to measure.

Look at growing your filter survivors. They give a shit. Look at maintaining your survival rates while growing your base number of users through organic search, referral traffic, word of mouth, etc.

Users who have survived the filter — win them over! Don’t resort to dark patterns, unnecessary pop-ups or spam. Win them over, gradually, over time, with delicious and unstoppable content.

Freddie Quell played by Joaquin Phoenix reads a letter on a ship.

*This discussion is Google Analytics-specific. GA is an industry standard, and it’s free. However, you can also invest in measuring content with a JavaScript heartbeat like the one Parse.ly offers.
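
For the curious, the heartbeat idea boils down to something like the sketch below. This is not Parse.ly's actual implementation; the endpoint and 15-second interval are placeholders, and a real tool handles idle detection, tab switching and page unloads far more carefully.

```javascript
// Bare-bones engaged-time heartbeat: ping an analytics endpoint every
// 15 seconds, but only while the tab is actually visible. The endpoint
// URL and the interval are placeholders, not a real API.
const HEARTBEAT_MS = 15000;
let engagedSeconds = 0;

setInterval(function () {
  if (document.visibilityState !== 'visible') return; // reader isn't looking
  engagedSeconds += HEARTBEAT_MS / 1000;
  navigator.sendBeacon('/analytics/heartbeat', JSON.stringify({
    path: window.location.pathname,
    engagedSeconds: engagedSeconds,
  }));
}, HEARTBEAT_MS);
```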

