It gives you some control over how search engines crawl and index your site. It isn't of huge importance, but it's a good idea to have one in place so search engines don't get a 404 error when they request it. If you don't want to give search engines any specific directions for your site, you can just upload a blank text file (create it in Notepad) named robots.txt and they won't get a 404 error.
Robots.txt is a text (not HTML) file you put in your root directory to tell search robots which pages you would like them not to visit. The location of robots.txt is very important: it must be in the root directory, because otherwise user agents (search engine crawlers) will not be able to find it. It can also help search engines discover your Sitemap automatically.
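To illustrate the point above, here is a minimal sketch of a robots.txt served from the site root (the sitemap URL is a placeholder; an empty Disallow line means nothing is blocked):

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

The Sitemap line is how crawlers can discover your sitemap automatically without you submitting it anywhere.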
The robots.txt file is used to tell search engines not to crawl particular pages of your website.
In my opinion, the robots.txt file is a very important file if you want to have a good ranking on search engines. When a search engine crawler comes to your site, it looks for a special file called robots.txt, which tells the spider which pages of your site it may crawl and which pages it should ignore.
Hi. Robots.txt is not an HTML file. It is a plain text file that informs search engines which pages to visit and which pages to ignore, and it prevents crawlers from fetching content you don't want them to access.
Hi, it is very important; it is the one file that tells search engines what to crawl and what not to crawl.
To be clear, robots.txt is absolutely not necessary. It never tells a search engine to crawl a page, only what pages/directories you don't want to have crawled. Aside from that, you can use it to point a search engine directly to the sitemap, though you can just as easily let them know where it is by placing a link to it in your site footer. In cases where there isn't any part of your site that doesn't get linked to from another part of your site (every page can be reached through a link from another page), you don't even need to do that. Since there is a lot of misinformation here from other posters and you have no reason to believe one poster over another, here is Google's information on the robots.txt file.
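Following the semantics in Google's documentation linked above, a sketch of a robots.txt that blocks all crawlers from two directories while leaving the rest of the site crawlable (the directory names here are made up for illustration):

```
User-agent: *
Disallow: /private/
Disallow: /tmp/
```

Note that these rules only ask crawlers not to fetch those paths; they are advisory, and a blocked URL can still appear in search results if other sites link to it.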
Robots.txt is very important for telling search engines which pages and documents to crawl and index. It is a .txt file, not an HTML page. If you want every page of your site crawled and indexed and want to manage your rankings, you should create a robots.txt file and keep it in your root directory.
Robots.txt is very important because search engine crawlers check for it first. When crawlers come to your site, they look for your robots.txt file and then crawl accordingly. Not having one can work against you: crawlers will visit everything on your site, including pages with errors, and that can hurt you. So check your site's robots.txt file.
Robots.txt is a file placed in the root directory that gives directions to search engines about what to crawl and what not to crawl.
If duplicate content is found on a page and you don't want that page indexed, use the robots meta tag. If you want to keep crawlers away from certain pages of the site, use robots.txt.
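The robots meta tag mentioned above goes in the page's `<head>`, not in robots.txt; a minimal sketch for keeping a duplicate page out of the index:

```html
<head>
  <meta name="robots" content="noindex, follow">
</head>
```

One design point worth knowing: if you block the same page in robots.txt, the crawler never fetches it and so never sees the noindex tag, so for de-indexing a page the meta tag should be used on a crawlable page.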
Although the robots.txt file is a very important file if you want to have a good ranking on search engines, many websites don't offer this file. If your website doesn't have a robots.txt file yet, read on to learn how to create one. If you already have a robots.txt file, read our tips to make sure that it doesn't contain errors.
robots.txt is a file that directs the Google crawler as to which pages to read and which pages not to read. When we don't want specific pages crawled, we use robots.txt.
Google has clearly stated on their support website that "A robots.txt file restricts access to your site by search engine robots that crawl the web". The following information comes from the official Google source: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449&from=35237&rd=1
The robots.txt file is very important. It tells search engines which pages of the site should be crawled and which pages should not be indexed.
This file is just like a lock: we use it to block particular pages so that crawlers cannot visit or index them.