Hi friends, I want to know how to block search engines from crawling our subdomain. I've read it has something to do with robots.txt. So what I want to know is: what exactly should I write, and where should I put the file? For example, my domain is www.example.com and my subdomain is very.example.com. Thanks!
Write and save this as robots.txt, then upload it to the root folder of the subdomain:

User-agent: *
Disallow: /
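As a minimal sketch, assuming your host maps very.example.com to its own document root (the exact path varies by hosting setup), the uploaded file would look like this:

# https://very.example.com/robots.txt
# Asks all compliant crawlers to stay out of every path on this host
User-agent: *
Disallow: /

Note that crawlers read robots.txt per host, so this file only affects very.example.com; the robots.txt on www.example.com is untouched.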
Hi, thanks! So if I do that, then there's no way for any SE to crawl my site, right? =) Is there anything else we should do to completely stop SEs from crawling our site? Thanks =)
For more "security" you can put these tags in the code of the site, inside the <head> section:

<meta name="robots" content="noindex, nofollow" />
<meta name="robots" content="noarchive" />
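To show where those tags sit, here is a minimal sketch of a page (the title and body are placeholders; the three directives can also be combined into a single tag):

<!DOCTYPE html>
<html>
<head>
  <title>Example page</title>
  <!-- Tells compliant crawlers: don't index this page, don't follow its links, don't cache it -->
  <meta name="robots" content="noindex, nofollow, noarchive" />
</head>
<body>
  ...
</body>
</html>

One caveat: if robots.txt already blocks crawling, search engines never fetch the page and so never see these meta tags. They matter most when you allow crawling but want pages kept out of the index.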
It should be noted that the SEs will crawl (read) everything they can unless told not to, and the robots.txt file is how you tell them what not to crawl. Path-based rules look like this:

User-agent: *
Disallow: /sub-folder/
Disallow: /sub-folder/index.htm

Keep in mind that these path rules only cover folders under the same host; a subdomain like very.example.com is a separate host and needs its own robots.txt, as described above. You should go to www.robotstxt.org/ for the full syntax that is used.
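To make the per-host distinction concrete, here is a sketch of the two separate files (the blocked folder name is just a placeholder):

# https://www.example.com/robots.txt -- main site stays crawlable
User-agent: *
Disallow: /private-folder/

# https://very.example.com/robots.txt -- subdomain fully blocked
User-agent: *
Disallow: /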
If you put Disallow: /, there is no need to list the subfolders, because you are already blocking everything from the root folder down.
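For example, in the file below the first rule already covers the other two, so they are redundant (the folder names are hypothetical):

User-agent: *
Disallow: /
Disallow: /images/      # redundant: covered by Disallow: /
Disallow: /old-pages/   # redundant: covered by Disallow: /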
If you want to block your subdomain very.example.com from being crawled, then you need to put these lines in your robots.txt:

User-agent: *
Disallow: /

and then put that robots.txt file into your subdomain's root folder. I hope it'll help you!
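Once it is uploaded, a quick way to verify (assuming the subdomain is live and you have curl installed; adjust the scheme if your site is http-only) is to fetch the file yourself:

curl https://very.example.com/robots.txt

If the output shows the User-agent: * and Disallow: / lines, compliant crawlers will see the same thing and stop crawling the subdomain.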