Dear all respected members, I want to know: what is a robots.txt file? How does it work? And what is its importance for SEO and search engines? Thank you.
Robots.txt is a file that tells crawlers which parts of your website they may crawl. You can also restrict pages from being crawled by listing them in robots.txt.
Robots.txt is a file you put on your site to tell search robots which pages you would like them not to visit.
Hi Roarsin, it's not robot.txt but the robots.txt file. This file controls how your site is crawled. For more information about the robots.txt file, visit: http://www.robotstxt.org/robotstxt.html
Web Robots (also known as Web Wanderers, Crawlers, or Spiders) are programs that traverse the Web automatically. Search engines such as Google use them to index web content, spammers use them to scan for email addresses, and they have many other uses.
Robots.txt is used to let search engine crawlers know whether a particular page, link, or image should be crawled and indexed or not.
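To make that concrete, here is a minimal sketch of how a well-behaved crawler checks robots.txt before fetching a URL, using Python's standard-library parser. The rules and the example.com URLs are made-up placeholders for illustration:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt that blocks one directory and allows the rest.
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A polite crawler checks each URL against the rules before requesting it.
print(rp.can_fetch("*", "https://example.com/blog/post"))     # True (allowed)
print(rp.can_fetch("*", "https://example.com/private/page"))  # False (disallowed)
```

Note that robots.txt is advisory: well-behaved crawlers like Googlebot respect it, but nothing technically forces a robot to obey.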
The robots.txt file can be used either to block crawlers from accessing a site or certain pages within it, or simply to inform crawlers that they have full access to the website. I always think it is good practice to include the address of your sitemap in the robots.txt file to further enhance the indexing of your site.
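Putting those pieces together, a minimal robots.txt combining a few Disallow rules with a Sitemap line might look like this (the paths and sitemap URL are placeholders, not recommendations for any specific site):

```
User-agent: *
Disallow: /admin/
Disallow: /tmp/

Sitemap: https://www.example.com/sitemap.xml
```

The file must live at the root of the domain (e.g. https://www.example.com/robots.txt) for crawlers to find it.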