Hi, my question is short and simple: do I have to include a robots.txt file if I don't have any pages restricted from search engines? All search engines are free to come and index all of my pages, so should I still put a robots.txt in? Thanks, Laszlo.
You don't HAVE to, but if you don't, you could get a slew of 404 errors from spiders looking for that file.
robots.txt is used to disallow bots. If it is not present, then by default bots will index all pages.
I see in my stats that I get a bunch of spiders 404ing while looking for robots.txt. Does this matter at all? I'm not disallowing anything either. Will SEs frown on not finding this file, or just go about their business the same as if the file were there with this broad welcome mat:

User-agent: *
Disallow:
Disallow: /cgi-bin/

I'm going to add the file just to avoid all the 404s in my stats, but I want to hear opinions on whether or not this affects anything.
There is no need to disallow the 404 page via robots.txt. What you need is to redirect your 404 page to some sales page or sitemap page using a 301 redirect.
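For what it's worth, a hypothetical sketch of such a permanent redirect in an Apache .htaccess file using mod_alias might look like this (the page names here are placeholders, not from the original post):

```apache
# Hypothetical .htaccess rule: permanently (301) redirect a
# removed page to the sitemap page instead of serving a 404.
Redirect 301 /old-page.html /sitemap.html
```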
Having no robots.txt file is perfectly fine for the robots. When the file is not present, they receive the 404 code and know that they are welcome everywhere! If you prefer to avoid these 404s in your stats, you can:
- create an empty file and call it robots.txt, or
- create the following robots.txt file:

User-agent: *
Disallow:

There is no need to add anything else to it, unless you do not want the robots in some parts of your site. Feel free to use a customized 404 error page, but NEVER use a 302 redirect for that. Alternatively, use a 301 redirect when you decide to replace an old page with a new one (in that case, there will be NO 404 error at all).

Jean-Luc
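As a quick sanity check (this uses Python's standard-library urllib.robotparser; the URL is just an example), you can confirm that the two-line "allow everything" robots.txt above really does let any crawler fetch any page:

```python
from urllib import robotparser

# Parse the two-line "welcome mat" robots.txt shown above.
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow:"])

# An empty Disallow rule blocks nothing, so any user agent
# may fetch any path on the site.
print(rp.can_fetch("Googlebot", "http://example.com/any/page.html"))  # True
```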
Just create a file with the code below and upload it to your site's root (www) folder:

User-agent: *
Disallow:

Dhruv