I've created my robots.txt file and I need to stop some bad URLs from getting indexed by Google. I can't find where Google is discovering those URLs. The pages are on a subdomain, and the URLs look like 'http://xxx.mysite.com/1/2/3/4.php'. Suppose I want to block the /1/ directory and all of its contents: how should I block them through robots.txt? Thanks
I have a Google Sitemaps account for the main site. Do I need to add another sitemap for my subdomain? From what I can see, I don't get the URL removal option (a new feature in Google Sitemaps) for the subdomain.
Just add the '1' directory to the disallow list in your robots.txt file; that will stop the majority of bots from indexing it.
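For example, assuming /1/ sits at the root of the subdomain, a minimal robots.txt could look like this (the wildcard user-agent applies the rule to all compliant crawlers, including Googlebot):

```
User-agent: *
Disallow: /1/
```

Note that Disallow matches by URL prefix, so this covers /1/2/3/4.php and everything else under /1/.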
MD.45 is right, but the robots.txt file must be placed in the root directory of the subdomain, not in the root directory of the main domain. Jean-Luc
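If you want to sanity-check the rule before deploying it, Python's standard urllib.robotparser can evaluate it locally. This is just a sketch: the hostname comes from the question above, and the two-line robots.txt is an assumed version of what you'd serve at the subdomain's root.

```python
from urllib import robotparser

# Assumed robots.txt content, as it would be served from
# http://xxx.mysite.com/robots.txt (the subdomain's root).
rules = [
    "User-agent: *",
    "Disallow: /1/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# URLs under /1/ are blocked; everything else stays crawlable.
print(rp.can_fetch("*", "http://xxx.mysite.com/1/2/3/4.php"))    # False
print(rp.can_fetch("*", "http://xxx.mysite.com/other/page.php"))  # True
```

This only checks the syntax of the rule; it can't tell you whether the file is actually reachable at the subdomain's root, which is what matters to Googlebot.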