
Should I disallow Googlebot from crawling slower pages?

Is it a good idea to prevent Googlebot from crawling slow-loading webpages?

Considering that site speed, as detected by Googlebot, is one of the signals search engines use to determine rankings, many webmasters are understandably concerned, especially those whose websites handle sophisticated queries and other heavy functions that make them significantly slower than most. Should webmasters consider preventing Googlebot from crawling such webpages?

Check out the video above to see what Google webmaster Matt Cutts has to say on the matter!

Googlebot crawling slow webpages – generally not an issue for search rankings

As Matt mentions in the webmaster help video, website speed as a ranking signal isn't really that critical in determining search rankings. According to Google, only about 1 in 1,000 websites is afflicted with lower search rankings due to slow site speed. If you suspect your website is unfortunate enough to belong to this category, you can certainly opt to block Googlebot from crawling your slow-loading webpages.
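If you do decide to go down this route, blocking is typically done through robots.txt. Below is a minimal sketch, assuming a hypothetical /slow-reports/ section is the slow part of the site (that path is an illustration, not something from the article): the rules disallow Googlebot from that section, and Python's standard urllib.robotparser is used to sanity-check the rules before deploying them.

```python
from urllib import robotparser

# Hypothetical robots.txt rules: block Googlebot from a slow section of
# the site. "/slow-reports/" is an assumed example path.
rules = """\
User-agent: Googlebot
Disallow: /slow-reports/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot is kept out of the slow section...
print(parser.can_fetch("Googlebot", "https://example.com/slow-reports/report?id=1"))  # False
# ...but can still crawl everything else.
print(parser.can_fetch("Googlebot", "https://example.com/fast-page"))  # True
```

Keep in mind that Disallow only stops crawling; URLs that are linked from elsewhere may still show up in the index, so this is a blunt instrument rather than a clean removal.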

Matt goes on to say that although webmasters can indeed prevent Googlebot from crawling their slow-loading webpages, doing so does not address the underlying user experience problem. If your webpages take too long to load, most users will simply leave rather than put up with the delay. Webmasters are therefore still better off focusing their efforts on optimizing website speed by any means possible.

What are your own thoughts about stopping Googlebot from crawling slow webpages?
