Hi, what is the benefit of listing a sitemap in robots.txt? Please give me a favorable reply. Ex:
User-agent: *
Disallow:
Sitemap: sitemap.xml
That's a good question. One benefit is that every time you update your site, you don't have to inform Google about it. When a crawler visits your site, it checks robots.txt first and finds the reference to sitemap.xml there.
The syntax is "Sitemap: http://www.yourdomain.com/sitemap.xml" in the robots.txt file. It is essentially a pointer that tells Googlebot (and other crawlers) where to find your sitemap when they visit the site.
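To see how crawlers pick this up, here is a minimal sketch using Python's standard `urllib.robotparser`, which understands the Sitemap directive (the `site_maps()` method needs Python 3.8+; the example.com URL is just a placeholder):

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body directly; a crawler would fetch it
# from http://www.example.com/robots.txt instead.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow:",
    "Sitemap: http://www.example.com/sitemap.xml",
])

# The parser exposes any Sitemap lines it found.
print(rp.site_maps())   # list of sitemap URLs, or None if absent

# An empty Disallow means everything is crawlable.
print(rp.can_fetch("*", "/some-page.html"))
```

This mirrors what a well-behaved bot does: read robots.txt once, collect the sitemap URL(s), then fetch the sitemap to discover pages.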
I had never heard of a sitemap being referenced in the robots.txt file. I thought you had to add it manually in Google Webmaster Tools, and that doing so helps indexing by telling search-engine crawlers about URLs of the site that they are less likely to find on their own. Great share.