Following a question-and-answer video created by GoogleWebmasterHelp on YouTube, in which Matt Cutts covers the Google site speed issue, it has quickly become clear that reducing the time it takes to load your site involves more than just the things you can control, even with careful planning and optimised development.
The video features a question from ‘DisgruntledGoat' in the UK, who took the opportunity to ask Matt Cutts the following in his quest to combat the page load issues reported for his site within Google Webmaster Tools (GWT):
"How does Google determine page speed? In GWT some pages are listed as very slow (8+ seconds). But I have tested on older computers/browsers and they do not take anywhere near that long to load. Why might Google show such high numbers?"
Cutts explains that the search engine giant draws on toolbar data from users who have opted into its research to gather information on how long a particular site takes to load, revealing that there are factors outside of the website itself that can skew your statistics.
According to Cutts, the data is taken from the user and used in its raw form, so if a visitor to your site has a weak internet signal, is on a dial-up connection or simply has issues at their own end, the results that Google obtains reflect that raw data without such factors being filtered out.
Although there has been a huge push towards reducing page speed, Cutts revealed that only around "one in a thousand" sites have page speed issues big enough to affect their position within the search engine results, taking away some of the urgency to have the most efficient site on the internet.
While keeping your website optimised is still good practice, a highly optimised site is no longer a major factor in taking the top spot in Google, which should cool the battle cry to condense page after page of your website.
Although I strongly believe that Google is taking large steps in the right direction with its latest alterations targeting low-‘value' sites, I think that using page speed as a ranking factor when the testing grounds aren't level is a negative move from the internet giant. Hopefully it will either stop treating the metric as a negative signal against a site, or work out a new methodology that creates a fairer testing ground for site owners.
See the video below:
Does anyone know the code to detect the internet connection speed of visitors so that I can block anything below 1Meg?
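For what it's worth, one rough way you might estimate a visitor's connection speed is to time the download of an asset of known size and convert that to megabits per second. The sketch below is purely illustrative: the function names and the 1 Mbps cut-off are my own, and in a real page you would time a `fetch()` of a known-size file rather than hard-code the numbers.

```typescript
// Rough connection-speed estimate: given how many bytes arrived and how long
// they took, work out megabits per second and compare against a threshold.
// (In a browser you would time a fetch() of a known-size asset; the maths is
// factored out here so it can run anywhere.)

function estimateMbps(bytesTransferred: number, elapsedMs: number): number {
  const bits = bytesTransferred * 8;
  const seconds = elapsedMs / 1000;
  return bits / seconds / 1_000_000; // megabits per second
}

function isBelowThreshold(mbps: number, thresholdMbps = 1): boolean {
  return mbps < thresholdMbps;
}

// Example: a 1 MB test file that took 8 seconds to arrive works out
// at exactly 1 Mbps, so it would not fall below the 1Meg cut-off.
const speed = estimateMbps(1_000_000, 8000);
console.log(speed, isBelowThreshold(speed)); // 1 false
```

Whether actually blocking slow visitors is wise is another matter entirely; the point of the post is that those visitors are part of the data Google collects either way.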
Clarification Edit –
I thought that I should make this small addition to the above post as it seems that I have been misunderstood by a few readers.
The point that I am trying to make about the page speed metric in the eyes of Google is that, regardless of the work and optimisation you put into the site, the likes of YSlow will never show you the true speed your site offers users, because the data used to calculate the metric is affected by the internet connection speed available to each visitor.
Optimising the speed of your site is still important, as you want to offer a good user experience to every visitor, but my point is that investing huge amounts of time and money into reducing the metric results seen by Google is time taken away from the main objective of your site: informative and unique content.
When I say that optimising page speed is a waste of time, I mean that dedicating huge amounts of time to battling the statistics shown to you within the likes of YSlow and GWT is time that could be better spent elsewhere in your development.