Would it be OK to test an up-and-coming website, which will eventually be hosted somewhere else, in a directory of an existing, unrelated website? I mean, suppose Google crawls the page before I move it to the new host — would it be considered duplicate content even after I move it? And if so, how do you make a page un-crawlable?
Alternative 1: To make a page or directory uncrawlable by Google, use a robots.txt file. Do an online search for a robots.txt tutorial. Alternative 2: Do a 301 redirect for the pages once you put them online elsewhere. I think robots.txt would be easier. Shannon
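For anyone searching for the syntax, here is a minimal sketch of both alternatives. The directory name /testsite/ and the destination host are hypothetical placeholders, not from the original post. Alternative 1, a robots.txt file placed in the site root:

```
# Block all compliant crawlers from the test directory (hypothetical path)
User-agent: *
Disallow: /testsite/
```

Alternative 2, a 301 redirect — on an Apache server this can go in an .htaccess file (mod_alias must be enabled):

```
# Permanently redirect the old test location to the new host (hypothetical URLs)
Redirect 301 /testsite/ http://www.example-newhost.com/
```

Note that robots.txt only asks crawlers not to fetch the pages; well-behaved bots like Googlebot honor it, but it is a request, not an access control.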
Hi Folks, The only way I got Google to stop crawling two websites with the same content was to use 'noindex' in the meta tags. I had tried stopping their bot with a robots.txt file, but it didn't work for some reason. All the best for 2005 RVB Pix www.bytephoto.com
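For reference, the noindex approach means putting a robots meta tag in the head of each page you want kept out of the index — a minimal sketch:

```
<html>
<head>
<title>Test page</title>
<!-- Tell robots not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
</head>
<body>
...
</body>
</html>
```

One reason robots.txt sometimes seems not to "work": if a page is blocked in robots.txt, Google can't fetch it to see the noindex tag, yet the URL can still appear in results if other sites link to it. The meta tag works only on pages the bot is allowed to crawl.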