"Speeding up websites is important — not just to site owners, but to all Internet users. Faster sites create happy users and we've seen in our internal studies that when a site responds slowly, visitors spend less time there. But faster sites don't just improve user experience; recent data shows that improving site speed also reduces operating costs. Like us, our users place a lot of value in speed — that's why we've decided to take site speed into account in our search rankings. We use a variety of sources to determine the speed of a site relative to other sites."

If the ROI of page performance wasn't clear enough, we now have a big new reason to focus on optimizing it. The big question is what Google considers "slow", and how search rankings are affected (e.g. are you boosted up if you are really fast, pushed down if you are really slow, or both?). When are you done optimizing? Google has a big opportunity to set the bar and give sites a clear target. Without that, the impact of this move may not be as beneficial to the speed of the web as they hope.
What we know
- Site speed is taken "into account" in search rankings.
- "While site speed is a new signal, it doesn't carry as much weight as the relevance of a page".
- "Signal for site speed only applies for visitors searching in English on Google.com at this point".
- Google is tracking site speed using both the Googlebot crawler and the Google Toolbar passive performance stats.
- You can see what performance Google is recording for your site (from Google Toolbar data only) in Webmaster Tools, under "Labs".
- In the "Performance overview" graph, Google considers a load time over 1.5 seconds "slow".
- Google is taking speed very seriously. The faster the web gets, the better for them.
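The 1.5-second cutoff in Webmaster Tools at least gives us a concrete number to test against. A minimal sketch of checking your own pages against it (Python; the function names and the approach are mine, not Google's, and note this only times the raw HTML download, while the toolbar data reflects full page loads with images, scripts, etc.):

```python
import time
import urllib.request

SLOW_THRESHOLD_S = 1.5  # the "Performance overview" cutoff in Webmaster Tools


def time_html_download(url, timeout=10):
    """Time fetching the raw HTML of a page, in seconds.

    This is only a rough proxy: it ignores rendering, images, and
    scripts, all of which count in real-user (toolbar) load times.
    """
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()
    return time.monotonic() - start


def is_slow(load_time_s, threshold=SLOW_THRESHOLD_S):
    """True if a load time falls in the range Webmaster Tools calls slow."""
    return load_time_s > threshold
```

If your HTML alone takes longer than 1.5 seconds, the full page load certainly does; the converse, of course, does not hold.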
What we don't know
- What "slow" means, and at what point you are penalized (or rewarded).
- How much weight is given to the Googlebot stats versus the Google Toolbar stats.
- What Google considers "done" when a page loads (e.g. Load event, DOMComplete event, HTML download, above-the-fold load, etc.). Does Googlebot load images/objects, and if so does it use a realistic browser engine?
- How much historical data it looks at to determine your site speed, and how often it updates that data.
- Whether there will be any transparency into the penalties/rewards.
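On the history question, Google hasn't said how much data it aggregates or how often the figure refreshes. One plausible scheme, purely as a sketch (the 30-day window is an invented parameter, not anything Google has disclosed):

```python
def trailing_average(daily_load_times_s, window=30):
    """Average load time over the last `window` daily samples.

    A short window reacts quickly to optimizations (or regressions);
    a long one smooths out noise. We don't know which trade-off
    Google has picked, or whether it uses an average at all.
    """
    recent = list(daily_load_times_s)[-window:]
    if not recent:
        return None
    return sum(recent) / len(recent)
```

Until Google says otherwise, the practical takeaway is just that a one-day speedup probably won't move the number immediately.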
My guesses
- Site performance is only going to be a factor when your site is extremely slow.
- Extremely slow sites will be pushed down in the rankings, but fast sites probably won't see a rise in the rankings.
- "Slow" is probably a high number, something like 10-20 seconds, and plays a bigger role in the final rankings as the speed gets slower. Regular sites won't be affected, even if they are subjectively slow.
- This is probably just the beginning, and we should expect tweaking of these metrics as we become more comfortable with them. We'll probably be seeing new metrics along the same lines in the coming years (e.g. geographical performance, Time-to-Interact versus onLoad, consistency versus average, reliability, etc.).
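The "consistency versus average" point is worth making concrete: two sites with the same average load time can feel very different if one of them is erratic, which percentiles expose and the mean hides. A small illustrative sketch (my own summary function, not a metric Google has described):

```python
import statistics


def load_time_summary(samples_s):
    """Summarize load-time samples three ways.

    The mean can look healthy even when a meaningful fraction of
    visits are painfully slow; the median and 95th percentile make
    that inconsistency visible.
    """
    ordered = sorted(samples_s)
    p95_index = max(0, round(0.95 * (len(ordered) - 1)))
    return {
        "mean": statistics.fmean(ordered),
        "median": statistics.median(ordered),
        "p95": ordered[p95_index],
    }
```

For example, five loads of 1 second and one of 10 seconds average out to a respectable number, but the 95th percentile tells the real story for the unlucky visitor.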