Hi guys, I am sure most of the SEO folks out here have great knowledge of robots.txt files. Most of the time when we create a robots.txt file, the format is as follows:

User-agent: *
Disallow: /abc.html
Disallow: /xyz.aspx

The robots.txt file above is used to keep those pages from being crawled and indexed by the search engines. Now I've come across a new kind of robots.txt file:

User-agent: *
Sitemap: http://www.xyz.com/sitemap.xml

What I want to know is: what are the main benefits of including a sitemap in the robots.txt file? Please let me know your views. Looking forward to hearing from you. Thank you.
As far as I remember, Yahoo was the first to recognize that syntax. With the Sitemap field in robots.txt you can tell search engines where your sitemap(s) live, which is very useful when you have more than one sitemap.
robots.txt is a means to point crawlers at your sitemap, and a sitemap is a means to inform engines of new pages. Sitemaps can also be submitted through the webmaster tools GUIs provided by Google and Yahoo.
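To tie the thread together, here is a minimal sketch of a robots.txt that combines the Disallow rules from the question with sitemap autodiscovery. The domain and paths are the placeholders from the original post, and the second sitemap line is a hypothetical example just to show that multiple sitemaps can be listed:

```
User-agent: *
Disallow: /abc.html
Disallow: /xyz.aspx

# Sitemap lines are independent of any User-agent group and
# can appear anywhere in the file; list one per sitemap.
Sitemap: http://www.xyz.com/sitemap.xml
# hypothetical second sitemap, to illustrate multiple entries
Sitemap: http://www.xyz.com/sitemap-news.xml
```

The main benefit is autodiscovery: any crawler that fetches robots.txt can find your sitemap(s) without you having to submit them manually in each engine's webmaster tools.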