Hello Friends, what is robots.txt? What are the uses of robots.txt? I have made one for my site. Please check it and tell me whether it is correct or not: http://www.desss.com/robots.txt Please advise us asap... Thanks Adelicia
The sitemap line should use the full URL, e.g. Sitemap: http://www.desss.com/sitemap.xml. Basically, just disallow the places that have unimportant content. For example, I run an online community and I don't want bots indexing the member profiles, so I disallow them from seeing those pages. The threads and posts are the most important things, so you disallow the low-value pages and Google indexes the important stuff (see the sketch below). Sites get indexed over time anyway, though, so it's not crucial.
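A rough sketch of what I mean, assuming the profiles live under a /members/ path (that path is just an example, use whatever your forum software actually uses):

User-agent: *
Disallow: /members/
Sitemap: http://www.desss.com/sitemap.xml

The threads and posts stay crawlable, because anything not listed under a Disallow line is allowed by default.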
The basic purpose of robots.txt is to stop search engine bots or crawlers from visiting certain pages or areas of your website. In other words, you can instruct the SE bots on how to crawl your website.
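The instructions can be set per bot as well as for all bots. A small illustration (the /private/ path and the choice of Bingbot are just placeholders, not from the original post):

User-agent: *
Disallow: /private/

User-agent: Bingbot
Disallow: /

Here every crawler is asked to stay out of /private/, while the bot named in the second group is asked not to crawl the site at all, since a more specific User-agent group overrides the * group for that bot.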
Web site owners use the /robots.txt file to give instructions about their site to web robots, telling them what to crawl and what not to crawl.
Robots.txt is very useful for stopping the Google crawlers where you don't want them to crawl your pages or links, so please use it very wisely. It will look like this:

User-agent: *
Disallow:
Sitemap: http://www.desss.com/sitemap.xml

(An empty Disallow line blocks nothing; to block an area you put its path after Disallow:.)
Robots.txt is a text (not HTML) file you put on your site to tell search robots which pages you would like them not to visit. Robots.txt is by no means mandatory for search engines, but generally search engines obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall or a kind of password protection); putting up a robots.txt file is something like putting a note "Please, do not enter" on an unlocked door: you cannot prevent thieves from coming in, but the good guys will not open the door and enter. That is why we say that if you have really sensitive data, it is too naive to rely on robots.txt to protect it from being indexed and displayed in search results.
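One side effect worth knowing: robots.txt is publicly readable, so listing a sensitive area actually advertises it. A line like the following (the path is a made-up example) tells every human visitor exactly where to look, while only polite bots will stay away:

User-agent: *
Disallow: /admin-backups/

For anything truly private, use real access control such as password protection instead of robots.txt.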
The robots.txt file is an essential file to upload to a website; by default it lets search engines open your whole website to the world, meaning none of your content is hidden. If you want to hide a page or some content from crawlers, then set up the robots.txt file like this:

User-agent: *
Disallow: /path-of-the-page-you-want-to-stop-crawling
Hello Everyone.. A robots.txt file is used to prevent the indexing of duplicate content on a website, or to keep crawlers out of an area of a website. The basic syntax of a robots.txt file is:

User-agent: *
Disallow:

You can also use it to block images from crawling. Thanks.. marie
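For blocking images specifically, one common approach is to address Google's image crawler by name; a sketch, assuming the images sit in an /images/ folder (that path is a placeholder for wherever your image files actually live):

User-agent: Googlebot-Image
Disallow: /images/

Googlebot-Image is the user agent Google uses when crawling images, so this keeps image files out of Google Images while leaving normal page crawling alone.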