Google's New Page Experience Algorithm - What It Is, and What You Need to Know About It
It's happened again! Google's released a new algorithm update that will influence SERPs, SEO and website development -- perhaps quite dramatically.
Detailed documentation is here.
To summarize, the new update is all about "Page Experience," and the idea is simple: Users want their information, they want it now, and they want it easy. It's a trend Google's been following lately, one that broadens the scope of SEO beyond marketing, content and keyphrases, and into the technical world of development and network management.
Of course, a professionally and properly developed website has always been important, but this new update marks a major shift in that direction. Web pages need to serve content quickly and efficiently. That means no annoying tactics that interrupt browsing (like intrusive interstitials), no poor development decisions that frustrate the user experience (like serving pages with precarious or bloated code just waiting to break), and no cheaping out on the hosting environment such that data transfer is cramped and content trickles to a crawl.
From a user standpoint, this actually sounds like a great idea.
From a developer standpoint, it sure sounds like a great idea, but as with all algorithm updates, it's also a little frightening, because the truth of the matter is, staying on top here isn't about providing a good user experience. It's about proving to Google that you can provide what they claim to be a good user experience, and catering to their criteria. If the algorithm has flaws, misconceptions, or outright bad ideas -- oh well. The hierarchy is clear: We must first appease the algorithm in order to appease our visitors.
Is that a bad thing? You decide. Standards are important; someone has to set them, and Google's in the most powerful position to do so. The good news is, Google is pretty clear about what they want us to do. So, let's consider what's going on here, what the new update implies, and discuss what we can do to make Page Experience excellent -- for Google, and for our visitors.
The Future of SEO - Page Experience
Ever since the Almighty G established itself as the authoritative search engine, it also established a cycle:
- Release a new algorithm that will change the way searches work
- Website owners freak out and scramble to comply
- Professionals like us dig in, find out what’s going on, and rocket our clients toward the top of SERPs
There’s no reason to panic here, but there’s definite reason to take action. If Google is going to weigh their Core Web Vitals as heavily as they claim, the industry is going to have to shift in that direction or simply get left behind. Don’t be in that crowd!
We certainly won’t be.
If you’re worried, concerned, or just not sure what to do, feel free to reach out. We’d love to have a chat with you, and we’ll even throw in a free copy of our book on Search Engine Optimization. If you have a website you think could be affected by the new algorithm, or if you want to build a new one and want to get things right from the start, send us an email or give us a call.
We look forward to hearing from you!
Measuring Page Experience
So, how do you measure the quality of your page experience?
With human beings, of course. As with every website, it’s crucial to have real people click through the content, test functionality, and double-check each page in different browsers on different devices. But remember – the arbiter here is an algorithm, not a human being, so we must appease the algorithm before it will make our content conveniently accessible to actual people.
Google offers two main ways to see your web pages the way its algorithm does.
- The new Core Web Vitals section in Google Search Console
- The newly updated PageSpeed Insights
Core Web Vitals (in Google Search Console)
Google Search Console is a very handy tool. You can use it to check the indexability of your website, review the performance of pages in search results, investigate issues with individual URLs – and much, much more.
If you’re familiar with the application, you may have noticed a change in the Enhancements section in the sidebar. What used to be PageSpeed has been replaced with Core Web Vitals. In this new section, you’ll find an overview report on the three Core Web Vitals (each described in detail later in this article):
- Largest Contentful Paint (LCP)
- First Input Delay (FID)
- Cumulative Layout Shift (CLS)
As of this writing, the Page Experience Algorithm and Core Web Vitals project are very new. So if you see a “Not enough data” message in Google Search Console, there’s nothing to worry about. That just means the algorithm hasn’t had a chance to examine your website yet.
PageSpeed Insights
If your site hasn’t been examined for Core Web Vitals yet – or if you’re tweaking a page and want immediate feedback – Google recommends you use their PageSpeed Insights tool.
This tool has recently been updated to account for the new Core Web Vitals metrics, but it’s been around for quite a while. Like many developers, we’ve used PageSpeed Insights in the past… but have been frustrated with the scoring metrics for many reasons.
For a lot of webpages, you would get a significantly different score every time you ran the tool, even if you didn’t make any changes.
Many scores were evaluated stupidly. For example: say you have an image on your site that’s 10 KB but could be optimized down to 5 KB. The tool would dock you for being 50% inefficient, which is ridiculous, as the difference in load time between 10 KB and 5 KB is unnoticeable.
The tool offered to “fix” problems for you by providing a download of resources to replace the existing “offending” resources on your site. These fixes included:
- minified css and javascript files (many reduced by such negligible amounts that there was no point in swapping them in)
- images “optimized” with so much lossy compression that they ended up pixelated and looked terrible
Recent updates do seem to improve some of these issues, so it will be interesting to see just how important its suggestions will be. As a general rule, Google says that if you score around 75, you’re on the right track.
We did run some initial tests, with very interesting results. In almost all cases, our scores were docked pretty badly due to:
- 3rd party javascript
- javascript code that is loaded, but not used
Would you like to guess the source for all that “slow, problematic” code?
According to the tool, these are our top 7 resource hogs:
- Google Maps
- YouTube
- Google Analytics
- Google Fonts
- Google Tag Manager
- Google/Doubleclick Ads
- Other Google APIs/SDKs
Silly Google.
Page Experience and Core Web Vitals
Google says:
“The page experience signal measures aspects of how users perceive the experience of interacting with a web page. Optimizing for these factors makes the web more delightful for users across all web browsers and surfaces, and helps sites evolve towards user expectations on mobile. We believe this will contribute to business success on the web as users grow more engaged and can transact with less friction.”
Page Experience is scored based on the following criteria:
- Mobile-friendliness
- Safe browsing
- HTTPS security
- Avoiding intrusive interstitials (annoying popups)
- Core Web Vitals (new)
  - Largest Contentful Paint (measures Load Performance)
  - First Input Delay (measures Page Interactivity)
  - Cumulative Layout Shift (measures Visual Stability)
Let’s talk about these new Core Web Vitals in a bit more detail.
Largest Contentful Paint
“Painting” in this context means rendering a resource on a screen. Largest Contentful Paint measures how long the browser takes to render the largest piece of content visible in the viewport – usually a hero image, a video, or a large block of text. The bigger that resource, the longer the browser takes to download it and “paint” it for the visitor.
Long waits are bad for business. They’re frustrating, and they encourage the visitor to bounce – to abandon your page and seek the information elsewhere. Google defines a “long wait” as anything beyond 2.5 seconds after the page first starts loading, which is not very long at all, so we must be very considerate of how we present information on a page.
In all cases, the developer’s goal is to reduce the amount of data a page needs to load, and increase the speed at which that data can be transferred. How well one can do that depends on:
- what resources need to load, and
- the quality of the hosting environment
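Want to see exactly which element the browser treats as the “largest contentful paint” on one of your pages? A few lines of vanilla javascript will tell you. Here’s a minimal sketch using the PerformanceObserver API (note that 'largest-contentful-paint' entries are currently a Chromium-only feature):

```html
<script>
  // Log each LCP candidate as the page loads; the last entry logged
  // before the user interacts is the one that counts.
  new PerformanceObserver((entryList) => {
    for (const entry of entryList.getEntries()) {
      // startTime is the render time in milliseconds; aim for under 2500.
      console.log('LCP candidate:', entry.startTime, entry.element);
    }
  }).observe({ type: 'largest-contentful-paint', buffered: true });
</script>
```

If the element that shows up is a huge hero image or an autoplaying video, that’s your first optimization target.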
Resources
The largest resources are inevitably going to be video and images. If you’re going to use them, always, always, always make an effort to optimize the media for the web. That means:
Video
- Compress the video file (ffmpeg, HandBrake and many other programs are useful for this)
- Scale the resolution down to the size the page will actually display
- Does it need an audio track? If not, strip the audio.
Images
- How large does the image really need to be? Most phones take very high-quality photos – far larger than any web page needs.
- When presenting on a mobile device, most images won’t need to be more than 800px wide.
- When presenting on a desktop computer, most images won’t need to be more than 1600px wide. (There’s a markup sketch for serving both sizes after this list.)
- Always compress using image optimization software
- Consider lossless compression first
- If lossless compression leaves the file still too large, use lossy compression
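As promised, here’s what serving those two image sizes can look like in practice – a minimal sketch with hypothetical file names, using the srcset and sizes attributes so the browser picks the smallest adequate version:

```html
<img
  src="/images/product-800.jpg"
  srcset="/images/product-800.jpg 800w,
          /images/product-1600.jpg 1600w"
  sizes="(max-width: 600px) 100vw, 1600px"
  alt="A short description of the image">
<!-- Phones fetch the ~800px file; desktops fetch the ~1600px file.
     The plain src is a fallback for very old browsers. -->
```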
Are you stuck with a large resource on your page? A question to ask yourself is, "Do I really need this?" Or, "Is there another way to present this information?" Consider this example: You want to post an informational page with instructions on granting access to a Google Analytics account.
Option 1: You shoot a video and edit it to be as short as possible while still looking professional. Then you run it through compression software to reduce the file size to 10 MB.
Option 2: You present the instructions as an ordered list of text, with a few cropped and optimized images included for elucidation.
Clearly, Option 2 will load a lot faster!
Hosting Environment
A critical aspect of load performance – one that many developers overlook – is the website’s hosting environment.
Here’s an analogy: You’re a race car driver. You know the course you’re going to race inside and out. You’ve practiced every turn a million times, you know exactly what to expect as you round every corner, and you’ve even discovered a few shortcuts to minimize your time. That is to say, the course is optimized. Nice. Now in this analogy, web hosting is the car you drive. No matter how well optimized the course, if you have a $@%#! car, you’ll have a $@%#! time. Other drivers will burn you off the road even if they haven’t optimized like you have.
Developers typically run with cheap ass web hosting because:
- they don’t want to spend the money on a real hosting environment
- they don’t know how to set up the environment themselves, so they use canned options for convenience
If you’re looking for Business Class Web Hosting, you’ll have to do your homework. If you need any help, give us a call or contact us. We’ll point you in the right direction.
First Input Delay
First Input Delay is the measurement of how long it takes before someone can meaningfully interact with your web page. The stopwatch starts the moment the visitor first clicks, taps, or presses a key, and stops once the browser is actually able to begin processing that input.
Here’s why Google thinks FID is important. Say you’re shopping for information. You pull up a web page, and right in the middle of the content, there’s a button you can click to learn more. You click the button… and nothing happens.
Well, that’s confusing. Why didn’t it work? You’re sure you didn’t miss and accidentally click off target. So… what? Is the button just broken or something? Is it even a button at all? Maybe it’s just some graphic that kind of looks like a button. You click it again – and weird, it works this time!
That’s input delay. You saw the button, and you tried to interact with the button before the web page was able to capture your input. Clearly, no one wants that, and it certainly does dock the user experience. Google believes this input delay should be very, very short. According to Google, any input delay greater than 100ms is too long.
That means your web pages need to be ready to interact within the blink of an eye.
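Curious what your actual First Input Delay looks like? Here’s a minimal sketch using the Event Timing API ('first-input' entries, currently a Chromium-only feature):

```html
<script>
  // Report the delay between the visitor's first interaction and the
  // moment the browser could begin handling it. Google wants this
  // number under 100ms.
  new PerformanceObserver((entryList) => {
    const entry = entryList.getEntries()[0];
    console.log('FID:', entry.processingStart - entry.startTime, 'ms');
  }).observe({ type: 'first-input', buffered: true });
</script>
```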
What causes input delay? Large media or other heavy resources (such as the video and images described in the previous section) are one possibility. Another, very likely possibility is javascript tying up the browser’s main thread while it loads and runs.
We won’t go into too many details here, as most people who work with websites have at least a rudimentary understanding of what javascript is. For context, we’ll just point out that javascript is client-side code that is downloaded, parsed, and then interpreted by your browser (as opposed to server-side code – which is executed on the server before the web page even loads). A typical use for javascript is to add functionality to a webpage, which can range anywhere from eye candy, to calculations, to offering ways for the user to interact with a web page that simply aren’t possible with html alone.
What causes javascript to be slow?
- Very large scripts
- Poorly written scripts
- Scripts that rely on or require other scripts to load before they can function properly
- Scripts that are loaded from busy, slow, or cheap ass hosting environments
Typically: all of the above.
In order to optimize for an FID of 100ms or less, you do have to be pretty clever with your code, and choosy with your resources. Here are some tips.
- Ideally, you’ll want to async/defer as much javascript as possible, so the code can finish loading in the background without interfering with your visitor or the content they’re trying to access.
- Consider loading as few external resources as possible. Need to use a javascript library like jQuery? Don’t download it from a CDN. Instead, save a minified copy of the library local to the website, where every page can access it from the same server.
- Ask yourself if you even need a javascript library like jQuery to begin with. Why do you? If all you’re doing is validating an email address, focusing an element, or scrolling to some position on the page, there’s no point in loading an entire suite of functions you’ll never use, just so you can use one of them. If your needs are truly simple, address them by writing a simple function in vanilla javascript (there’s a sketch after this list).
- Do you use Google Tag Manager and Google Analytics [if you don’t, you damn well should]? Did you copy/paste both the GTM and GA scripts into the head of your html? Well, that’s two external resources to load! Instead, incorporate Google Analytics via Google Tag Manager. The internet is rife with instructions for doing just that.
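To make the first and third tips concrete, here’s a minimal sketch (the file name is hypothetical). A deferred script downloads in the background and executes only after the document is parsed, and a trivial job like email validation takes just a few lines of vanilla javascript rather than a whole library:

```html
<!-- Downloads in parallel, executes after parsing: it won't block
     rendering or the visitor's first input. -->
<script defer src="/js/site.min.js"></script>

<script>
  // A loose sanity check on an email address - no library required.
  // Deliberately permissive; tighten it to suit your needs.
  function looksLikeEmail(value) {
    return /^[^\s@]+@[^\s@]+\.[^\s@]+$/.test(value.trim());
  }
</script>
```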
Cumulative Layout Shift
Here, a layout shift refers to elements moving around on a page without input from the user. Cumulative Layout Shift is the sum of all the individual shifts that occur while a page is open.
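If you want to watch that score accumulate on one of your own pages, here’s a minimal sketch using the Layout Instability API ('layout-shift' entries, currently a Chromium-only feature):

```html
<script>
  // Sum layout-shift scores as they happen. Shifts that occur shortly
  // after user input are excluded - those don't count toward CLS.
  let cumulativeScore = 0;
  new PerformanceObserver((entryList) => {
    for (const entry of entryList.getEntries()) {
      if (!entry.hadRecentInput) cumulativeScore += entry.value;
    }
    console.log('CLS so far:', cumulativeScore);
  }).observe({ type: 'layout-shift', buffered: true });
</script>
```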
Some layout shifts are appropriate, necessary, and in fact good practice. For example, a page might have a toggle button that, when clicked, shows/hides additional information about a product you’re interested in. The showing and hiding of that information causes a layout shift, but it’s a good layout shift, because it behaves as the user expects it to, and the user is in direct control of it.
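A sketch of such a “good” shift might look like this (element ids are hypothetical):

```html
<button id="toggle-details">More details</button>
<div id="details" hidden>Additional product information goes here.</div>

<script>
  // The visitor triggers this shift directly, so it's expected behavior -
  // and shifts caused by recent user input don't count against CLS anyway.
  document.getElementById('toggle-details').addEventListener('click', () => {
    const details = document.getElementById('details');
    details.hidden = !details.hidden;
  });
</script>
```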
A layout shift is not good when the user doesn’t expect it to happen. Say you visit a website and want to click a button – then suddenly the button jumps somewhere else on the page. It’s really annoying if you have to chase the button down… and way more annoying if you end up clicking something else because of the unexpected shift.
Typically, bad layout shifts occur when resources load asynchronously, or when resources are very slow to load. In most cases these shifts are unintentional side effects of poor coding practices, careless interface design, or slow hosting environments, but some sites throw them in purposely to bamboozle a visitor into clicking an ad or some other element they don’t actually intend to interact with. In all cases, they are annoying and befuddle the user experience.
It’s usually better to avoid automatic layout shifts whenever possible.
If you do need to use them, take some time to consider how the shifts will affect the user experience. Definitely disallow relocation of buttons and other clickable elements whenever possible, and subject the page to thorough testing – not just by yourself. Include other team members, and if you’re a developer, have your client test and test again. It will probably take some tweaking to stay out of the user’s way.
If layout shifts are occurring unintentionally, that’s a problem. Ensuring page resources load quickly will solve most of these issues, but if fast loading isn’t possible, you can reserve space on the page by establishing the offending element’s size up front with inline styles, height and width html attributes, or CSS. That way, even if the resource takes a while to load, the space it’s going to occupy is predefined, and it won’t jumble the rest of the page.
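For example, here’s a minimal sketch (dimensions and file names hypothetical) of reserving space both ways:

```html
<!-- Declaring width and height lets the browser reserve the image's
     space before the file arrives, so nothing below it jumps. -->
<img src="/images/banner.jpg" alt="Promotional banner"
     width="800" height="450">

<style>
  /* The same idea in CSS, for a slot whose content loads late
     (an embedded map, an ad, a lazy widget, and so on). */
  .late-slot {
    width: 300px;
    height: 250px;
  }
</style>
<div class="late-slot"></div>
```

Either way, the page’s geometry is settled before the slow resource shows up.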