How to Measure Success for an Informational Website

Data Science
Joe Izenman, SiteCrafting Systems Admin

Understanding Metrics: Information

In the previous posts of this series, we’ve focused on understanding metrics and how to measure success on your website. This is the final installment of the series. For a high-level overview of the topic, check out the introduction post.

The Goal: Communicate Information

We started this series with a technologically complex site type whose metrics are intuitively easy: online stores. It’s only right that we close on something extremely common, simple to implement, and deceptively difficult to measure.

Strip away the sales channels, the methods of engagement, and we have a website with a goal of communicating information. This could be a simple marketing site, a product FAQ, or a massive, detailed encyclopedia of services, locations, events, etc.

While it’s naive to suggest that a site like this isn’t still selling something, the process is roundabout. The working assumption is that if a user can easily find information about your product or service, they’re more likely to use it. Conversely, we can guess that users who fail to find the information they need are on their way out as customers.

The Metrics

Informational sites live and die by two elements: information architecture (how intuitively and efficiently the site’s natural structure leads users to their goals) and search (how effectively the site can present what the user wants based on a text query). There are a few related metrics that seem like they might do the job:

  • Time on task: How long did it take the user to find what they were looking for?
  • Task success: What percentage of users actually found what they were looking for?
  • Search result efficiency: How far down the search results did the user need to go to find what they were looking for?
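To make these three metrics concrete, here is a minimal sketch of how they might be computed from task-attempt records. The record structure and all numbers are hypothetical; a real study or analytics pipeline would supply its own fields.

```python
from statistics import mean

# Hypothetical lab-study records: each dict is one assigned task attempt.
# "seconds" is time spent, "found" is whether the user succeeded, and
# "result_rank" is the position of the search result the user clicked
# (None if they never clicked a result).
attempts = [
    {"seconds": 42,  "found": True,  "result_rank": 1},
    {"seconds": 95,  "found": True,  "result_rank": 4},
    {"seconds": 180, "found": False, "result_rank": None},
    {"seconds": 30,  "found": True,  "result_rank": 2},
]

# Time on task: average duration, here over successful attempts only,
# since failed attempts are open-ended.
time_on_task = mean(a["seconds"] for a in attempts if a["found"])

# Task success: share of attempts that ended in success.
task_success = sum(a["found"] for a in attempts) / len(attempts)

# Search result efficiency: average rank of the clicked result,
# over attempts where a result was actually clicked.
search_efficiency = mean(
    a["result_rank"] for a in attempts if a["result_rank"] is not None
)

print(time_on_task, task_success, search_efficiency)
```

Note that every one of these computations leans on the "found" flag, which is exactly the piece of intent data an anonymous web visit doesn’t give you.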

I say seem because there’s one giant hurdle blocking the effectiveness of all of these metrics: what were they looking for? To gauge task success or time on task, we need a window into intent, which is notoriously hard to extract from anonymous site behavior. Did the user spend a long time on the site because they couldn’t find what they wanted, but kept trying? Or because they succeeded quickly, then dove down a rabbit hole of interesting links? Was that quick bounce a user who got what they needed and ran, or one who got quickly frustrated and gave up?

Without a feedback step such as product purchase, event registration, or article share, we’re extremely limited. Time on task and task success are staples of user testing at GearLab, our team of UX experts. The key to their efficacy is right in their name—lab. In a lab study, the tasks are pre-specified and assigned. We know exactly what the user is trying to find, because we told them to find it.

On the web, there are workarounds such as a poll checking whether the user found what they wanted, but the value of this method is questionable. Results are heavily skewed toward failure—users are more likely to spend extra time to complain than praise—and often users find these pop-up questions even more irritating than other problems with the site.

So, what can we do? Mostly, we have to lean on educated guesswork and conventional analytics. Repeat visits are probably good. Low bounce rates and more time spent on a given page are probably good. A low number of returns to the search results page is probably good. Nothing is certain, but uncertainty is the name of the game in analytics. It’s just a question of how much we’re willing to accept and how much it degrades our overall predictability.
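As a rough sketch, proxy signals like these can be pulled out of an ordinary pageview log. The visitor IDs, page paths, and the `/search` URL below are all invented for illustration.

```python
from collections import defaultdict

# Hypothetical pageview log: (visitor_id, page) pairs in time order.
pageviews = [
    ("u1", "/search"), ("u1", "/pricing"), ("u1", "/faq"),
    ("u2", "/faq"),
    ("u3", "/search"), ("u3", "/docs/install"),
    ("u3", "/search"), ("u3", "/docs/setup"),
]

# Group each visitor's pages into an ordered trail.
trails = defaultdict(list)
for visitor, page in pageviews:
    trails[visitor].append(page)

# Bounce rate: share of visitors who viewed exactly one page.
bounce_rate = sum(len(t) == 1 for t in trails.values()) / len(trails)

# Search returns: how often a visitor came back to the results page
# after leaving it -- a rough proxy for "the first result didn't help."
returns = sum(
    trail[i] == "/search" and "/search" in trail[:i]
    for trail in trails.values()
    for i in range(1, len(trail))
)

print(bounce_rate, returns)
```

None of this tells us whether any individual user succeeded; it only gives aggregate signals we can watch move over time.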

Next Up

It would be easy to say that we’re all done here. But, of course, that’s nonsense. Formulating your site’s goals and metrics is foundational work, which means it’s just the beginning.

As a data scientist, I’m interested in these concepts because they make my work possible. To use machine learning, you need something to predict. To use experimentation and A/B testing, you need something to measure. And to do either effectively, that thing you’re measuring and predicting needs to be genuinely meaningful to business outcomes.
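As one sketch of what "something to measure" buys you: once a meaningful metric exists, an A/B test reduces to comparing it across two groups. The counts below are made up, and this uses a standard pooled two-proportion z-test rather than any particular product’s methodology.

```python
from math import sqrt, erfc

# Hypothetical A/B test: did a redesigned FAQ page change the rate at
# which visitors go on to contact us? All counts are invented.
visitors_a, contacts_a = 5000, 150   # control
visitors_b, contacts_b = 5000, 195   # variant

p_a = contacts_a / visitors_a
p_b = contacts_b / visitors_b

# Pooled two-proportion z-test: is the difference larger than chance?
p_pool = (contacts_a + contacts_b) / (visitors_a + visitors_b)
se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se
p_value = erfc(abs(z) / sqrt(2))  # two-sided

print(f"lift: {p_b - p_a:+.4f}, z = {z:.2f}, p = {p_value:.3f}")
```

The statistics here are routine; the hard part, as this whole series argues, is choosing a metric (contacts, in this made-up case) that genuinely reflects a business outcome.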

We’ll continue focusing on this topic here on the blog and out in the world. Want to know more about understanding metrics or want to work with us to apply these ideas to your business? Visit GearLab’s website to view our offerings and send us a message.

More from the Understanding Metrics Series