On the surface this signal seems to make sense. In the hypothetical case where two sites are of equal quality on all points but one is slower, then it’s better for the user if they’re served the faster site at the top of the rankings. It will save searchers time and avoid frustration. The real question is how strongly this signal should be factored into the overall ranking mix... because in the opposite hypothetical case, where one site is faster but delivers slightly worse quality, and another is three seconds slower but delivers great quality and perhaps makes its point ten times as quickly, users wouldn’t fare well being served the faster site at the top. Google suggests that the speed signal is not a very strong one. It’s one of the “200 signals” affecting “fewer than 1% of search queries”, Google’s Matt Cutts suggests, adding “you don’t need to panic”.
Another question is whether Google will really understand perceived speed, which is not the same as plain page loading speed. As an example, I have one site where I preload a lot of images so that the page can keep loading in the background while the user starts looking at the first couple of images... I did this purposefully so that by the time the user scrolls down, the other pictures are already loaded and no further waiting is needed. On Google’s Webmaster Tools speed chart, however, that approach is reported as slow. Whether or not the approach I’ve taken with that site indeed increases perceived speed I don’t know, but I’d argue it’s at least an example of a page where that is debatable, which leads me to wonder whether Google really understands when perceived speed differs from plain page load speed.
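The preloading approach described above can be sketched roughly like this (a hypothetical minimal version; the site’s actual code is not shown here). Assigning a URL to an `Image` element’s `src` makes the browser fetch it in the background, so the pictures sit in the cache by the time the user scrolls to them:

```typescript
// Hypothetical sketch of background image preloading. `ImageLike` and the
// injectable `makeImage` parameter exist only so the sketch can run outside
// a browser; in a real page you would simply call `new Image()`.
type ImageLike = { src: string };

function preloadImages(
  urls: string[],
  makeImage: () => ImageLike = () => new Image()
): ImageLike[] {
  return urls.map((url) => {
    const img = makeImage();
    img.src = url; // in a browser, this assignment starts the download
    return img;
  });
}

// Usage, e.g. once the first screenful is visible:
//   window.addEventListener("load", () =>
//     preloadImages(["/photos/2.jpg", "/photos/3.jpg", "/photos/4.jpg"]));
```

The point of the trade-off is visible here: these extra requests lengthen the page’s total load time as a measurement tool sees it, while (arguably) improving the speed the user actually perceives while scrolling.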
As a side effect, Google’s announcement adds a more constructive field to SEO: optimizing a client’s site loading speed. SEO in a way was always about usability, accessibility and site quality... because unless you resort to spamming, improving your site’s quality should be one of the most important parts of getting more links to it, which increases your ranking. And part of a site’s quality was always its serving speed. Google’s new statements just make this issue within the quality mix more explicit. One issue I wonder about, though: if a site’s slowness were already causing it to get fewer backlinks due to its resulting lower quality, wouldn’t it now be penalized twice by Google – once by the backlinks count signal, and once by the site speed signal? If we again take the hypothetical case of two otherwise identical sites where one is slower – wouldn’t the faster of the two already have a higher PageRank because people are more likely to link to it?