If I don't need to block crawlers, should I create a robots.txt file? - SEO Packages | Professional SEO Services Sydney

If I don’t need to block crawlers, should I create a robots.txt file?

What exactly is a robots.txt file?

Wondering what exactly a robots.txt file is? If you've been involved in search engine optimisation (SEO) for some time, you'll probably know that it implements the Robots Exclusion Protocol, a standard webmasters use to block bots or search engine crawlers, like the ones Google uses, from crawling part of a website or all of it. But for webmasters who never need to block anything in the first place, is it necessary at all?

Few webmasters ever need to block crawlers from finding and accessing their site. If you aren't one of them, do you really need a robots.txt file at all? Check out the Google webmaster help video featured above and see what Matt Cutts has to say about the matter.

Do you still need to have a robots.txt file?

As Matt Cutts explains in the video, it is best for webmasters to have a robots.txt file in place even if they have no plans to stop crawlers from accessing any part of their website. Without one, your web host may serve its own content, such as a custom 404 page, when crawlers request robots.txt, which can lead to unwanted crawling behaviour. Google typically tries to keep such web host behaviour in check, but why risk it?

Google recommends that webmasters create a robots.txt file for their websites regardless. You can leave it blank or explicitly allow everything; either works, although a file containing "User-agent: *" followed by an empty "Disallow:" line is preferable, as it states explicitly that crawlers may access everything on your website.
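As a quick sanity check on the allow-everything file described above, here is a short sketch using Python's standard `urllib.robotparser` module. The rules string mirrors the "User-agent: *" / "Disallow:" form, and the parser confirms that it leaves every path crawlable (the user agent and URL below are just illustrative values):

```python
from urllib.robotparser import RobotFileParser

# The allow-everything robots.txt: an empty Disallow blocks nothing.
rules = "User-agent: *\nDisallow:"

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Any crawler may fetch any path under these rules.
print(parser.can_fetch("Googlebot", "/any/page.html"))  # prints True
```

An empty file would behave the same way, but spelling the rules out makes your intent obvious to anyone (or any crawler) reading the file.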

What are your own thoughts about having a robots.txt file on your website?
