Hello everyone! I want to ask a question: why do we use a robots.txt file? Please share your thoughts, ideas, and experience. Thanks and regards, Jack
It is used to tell search robots which files to ignore (or, alternatively, which files to crawl). It also helps search engines locate the sitemap of the website and hence crawl the entire site in depth, which helps your rankings and traffic.
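For example, you can point crawlers to the sitemap with a single line (assuming the sitemap lives at the site root; adjust the URL to your own site):
Sitemap: http://www.example.com/sitemap.xml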
"Robots.txt" is a regular text file that through its name, has special meaning to the majority of "honorable" robots on the web. By defining a few rules in this text file, you can instruct robots to not crawl and index certain files, directories within your site, or at all.
Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages to crawl and which not to crawl.
Robots.txt is used to tell crawlers which directories can or cannot be crawled. The robots.txt file improves site indexation by telling search engine crawlers to index only your content pages and to ignore other pages (e.g. monthly archives, category folders, or your admin files) that you do not want to appear in the search index.
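For instance, a blog might hide its admin area, category folders, and archives like this (the directory names here are only placeholders; use whatever paths your own site has):
User-agent: *
Disallow: /wp-admin/
Disallow: /category/
Disallow: /archives/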
Hi. Robots.txt informs the search engine which parts of a site it should crawl and which it should not, and thus it saves the search engine a little time.
Hello Jack, as all my friends have been suggesting, the robots.txt file is nothing but a protocol that tells search engine robots about crawling limitations for a particular website. Fact: crawlers/spiders understand only Allow or Disallow (see the sketch below). For further help, contact me via PM. Thank you for reading.
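A small sketch of those two rules (the /private/ paths are made up for this example): Disallow blocks a whole directory, while Allow can re-open a single page inside it for crawlers that support the Allow directive:
User-agent: *
Disallow: /private/
Allow: /private/public-page.html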
We use robots.txt to keep search engines from crawling a website's unnecessary pages. Like this:
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /~joe/
The robots.txt file simply informs all search engine crawlers which pages of the site they may crawl. It is very important for site traffic.
Robots.txt is meant for search engine crawlers, to instruct them how to crawl the website and which parts or files should not be crawled. Apart from that, some bad crawlers can be stopped from crawling the website by disallowing them in the robots file.
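As a sketch, a misbehaving crawler (here given the made-up name BadBot) can be disallowed completely while every other bot keeps full access; keep in mind that only well-behaved robots actually obey these rules:
User-agent: BadBot
Disallow: /

User-agent: *
Disallow: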
Robots.txt is nothing but a file used for managing search engine bots/spiders. You decide which files/folders they will crawl and which they won't.
Basically, the robots.txt file is used for telling search engine spiders and bots which content, directories, or pages should be indexed or not. In simple words, robots.txt is a file used to allow or disallow content from the crawling process.
It tells crawlers such as Googlebot, Slurp, and MSNBot to crawl only our necessary files and folders, so it is one way of controlling what bots access on the website.
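A per-crawler sketch (the directory names are illustrative only): you can address Googlebot, Slurp, or MSNBot by name and give each its own rules:
User-agent: Googlebot
Disallow: /drafts/

User-agent: Slurp
Disallow: /private/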
Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called The Robots Exclusion Protocol.
It tells search engine bots which files are allowed for indexing and which are not. In the robots.txt file you can also give other instructions, such as the sitemap location.
The robots.txt file helps us set permissions for bots (Google's search bot, spamming bots, etc.). You can create a robots.txt file very easily using Google Webmaster Tools.