When you’re having a bad experience with a website, it’s not hard to describe why. The page takes forever to load. Or half the page loads while key parts lag behind. Or a text input field jumps down the page just as you’re trying to click it. No, for end-users, there’s no mystery when a website makes you want to tear your hair out. For the people building and maintaining that site, however, things get more complicated.

Every major company continuously monitors its website. But translating technical performance metrics into a meaningful picture of the user experience—much less identifying what’s causing issues—can be enormously complex. Compounding the challenge, there’s been no universally agreed-upon set of metrics to measure. At least, until now.

Last year, Google introduced Core Web Vitals—three key metrics for evaluating every site. We finally have a consistent yardstick for measuring performance. Just as important, Core Web Vitals expresses that performance in clear, unambiguous terms that anyone can understand. For the first time, business stakeholders and technologists have a common language to evaluate and improve the digital experience for their users.

The Performance Paradox

I’ve been in the website monitoring business for almost 20 years, so believe me when I say: finding the right metrics to measure user experience is much harder than you’d think. It’s not that we don’t have data; there’s a vast range of metrics to choose from. It’s just hard to know which ones are relevant for a given case. Since the earliest days of the Internet, we’ve been searching for a way to consistently capture the highs and the lows, the good and the bad, in some sort of simple star-rating system. But answers have been elusive.

The first problem is just that a web page is a very complex beast, and many hands go into making it. You might have developers writing JavaScript, graphic artists, and CSS specialists. User experience can also be affected by any number of third-party services and components. So even if you identify a problem, isolating the source and tasking the right people to fix it is not simple.

An even bigger barrier has been the lack of a common language to describe performance. IT measures performance with one set of metrics, business stakeholders another, vendors think about it one way, customers another, and everyone talks past each other. I say 2+2 = 4, you say red + blue = purple, and even if we’re both right, we’re no closer to fixing the problem.

Enter Core Web Vitals

Core Web Vitals represents exactly what the name suggests: a small set of measurements that captures the essential information needed to gauge user experience. The acronyms can sound like a Moderna vaccine formula, but they’re really quite simple:

  • Largest Contentful Paint (LCP) measures how long it takes for the largest above-the-fold content to render when the user first hits the page.
  • First Input Delay (FID) measures the time it takes the browser to respond to a user’s first interaction with the page. This is important since few things are as frustrating as clicking on a site that remains unresponsive.
  • Cumulative Layout Shift (CLS) captures the stability of a site’s layout during the time a user interacts with it. Users shouldn’t have to chase links or input fields around the page because some parts are rendering more slowly than others.

Notice a common theme? Despite complicated-sounding names, these metrics aren’t arcane technical concepts. They’re straightforward reflections of user experience that anyone can understand and relate to. When the user first hits your site, do they sit there frozen wondering if their Internet connection was lost? Can they interact with the site? Does the layout hold still long enough for them to click what they came for? These are the most important questions for understanding what users experience. Now, we can get clear, meaningful—and most importantly, actionable—answers.
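
For readers who want to see what these signals look like in practice, here is a minimal sketch of how all three can be observed directly in the browser with the standard PerformanceObserver API, which is what Google’s open-source web-vitals library builds on. Treat it as an illustration under simplifying assumptions, not production monitoring code: a real implementation also has to handle tab visibility, back/forward cache restores, and similar edge cases.

    // Sketch only: observing the three Core Web Vitals with the browser's
    // standard PerformanceObserver API. Real-world measurement (e.g. Google's
    // web-vitals library) adds edge-case handling omitted here.

    // Largest Contentful Paint: render time of the largest element seen so far.
    new PerformanceObserver((list) => {
      const entries = list.getEntries();
      const lastEntry = entries[entries.length - 1];
      console.log('LCP (ms):', lastEntry.startTime);
    }).observe({ type: 'largest-contentful-paint', buffered: true });

    // First Input Delay: gap between the user's first interaction and the
    // moment the browser can start running its event handler.
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries() as PerformanceEventTiming[]) {
        console.log('FID (ms):', entry.processingStart - entry.startTime);
      }
    }).observe({ type: 'first-input', buffered: true });

    // Cumulative Layout Shift: running sum of layout-shift scores that were
    // not triggered by user input. (The LayoutShift interface is not in the
    // default TypeScript DOM typings, hence the cast.)
    let clsScore = 0;
    new PerformanceObserver((list) => {
      for (const entry of list.getEntries() as any[]) {
        if (!entry.hadRecentInput) clsScore += entry.value;
      }
      console.log('CLS (score):', clsScore);
    }).observe({ type: 'layout-shift', buffered: true });

The CLS figure above is a simple running total; Google’s current definition groups shifts into session windows, so numbers from a sketch like this should be treated as approximations.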

For CLS, a score of 0.1 or lower is considered “good.” LCP and FID are measured in seconds and milliseconds, respectively (Google counts an LCP below 2.5 seconds as “good” and targets an FID of less than 100 milliseconds). Unlike some performance metrics, normal human beings can grasp these concepts immediately. No one has to go look up what it means if it takes 5 seconds for the browser to respond when a user clicks a link.
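
To show just how unambiguous those thresholds are, here is a tiny, hypothetical helper that buckets a measurement the way a monitoring dashboard might. The “good” cut-offs (2.5 seconds, 100 milliseconds, 0.1) and the “poor” boundaries (4 seconds, 300 milliseconds, 0.25) are Google’s published values; the function names are illustrative and not part of any library.

    // Hypothetical helper: classify a Core Web Vitals measurement against
    // Google's published "good" and "poor" thresholds.
    type Rating = 'good' | 'needs improvement' | 'poor';

    function rate(value: number, good: number, poor: number): Rating {
      if (value <= good) return 'good';
      return value <= poor ? 'needs improvement' : 'poor';
    }

    const rateLCP = (seconds: number) => rate(seconds, 2.5, 4.0); // LCP in seconds
    const rateFID = (ms: number) => rate(ms, 100, 300);           // FID in milliseconds
    const rateCLS = (score: number) => rate(score, 0.1, 0.25);    // CLS is unitless

    console.log(rateLCP(2.1));  // "good"
    console.log(rateFID(500));  // "poor" (the 500 ms example that comes up later)
    console.log(rateCLS(0.12)); // "needs improvement"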

A Lingua Franca for Site Performance

The biggest advantage of Core Web Vitals is just the ability to get everyone with a stake in the user experience on the same page. Whether that’s the different teams involved in site design, the marketing and engineering groups inside an organization, or conversations between an organization and its vendors, everyone can speak the same language. We can finally all agree on what should be measured and what those measurements mean.

Crucially, these metrics are also actionable. Website performance monitoring can sometimes feel like trying to find a needle in a haystack. Something is causing that spike you’re seeing in users abandoning your site, but where do you begin to look? Now, if you see your FID is 500 milliseconds, you know right away: that response time needs to get faster. You’ve shrunk the unknowns in your performance evaluation, and everyone involved in fixing the problem knows the target they need to hit.

Unlike some previous attempts, Google also provides excellent documentation for Core Web Vitals. There’s no alchemy here; developers can see exactly how these metrics are derived and how they affect their site’s rankings.

Nobody has to feel like performance is a voodoo science that only the experts can understand. This also means that there’s no place to hide: If there’s a problem with some part of your site, or with a component or service a vendor is providing, there’s no ambiguity or finger-pointing.

As someone who’s been trying to help customers answer these questions for decades, this is the first time I can say we’re all speaking the same language, moving in the same direction. It’s a powerful development for the industry—and great news for our users.

By Mehdi Daoudi, CEO, Catchpoint
