Questions about blocking a site from spiders for a beta run of a huge site. (The two sites will run side by side, and I want the newer, beta version to be inaccessible to spiders.) The old site is PR6, the domain is over 8 years old, and it has 300k+ indexed pages, all indicative of an established site. Other than 1. robots.txt and 2. a meta noindex tag, I feel that if anyone out there links to the beta site, it's going to get indexed. Should I have the beta site password protected?
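For reference, the two options named above would look something like this (a sketch; the paths and filenames are the conventional ones, not taken from the site in question). Note that robots.txt only asks well-behaved crawlers to stay out, and noindex asks engines not to index what they crawl; neither actually blocks access the way a password would:

```
# robots.txt, served at the beta site's root
User-agent: *
Disallow: /

<!-- meta noindex, placed in the <head> of every beta page -->
<meta name="robots" content="noindex, nofollow">
```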
This may be a ridiculous idea: every user invited to the beta clicks a link, which sets a cookie, and they can only access the beta site with that cookie present. The URLs and everything else are exactly identical to the old site. Thoughts?
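The cookie-gate idea could be sketched as a small piece of middleware. This is a minimal illustration assuming a Python/WSGI stack; the cookie name `beta_access` and the `BetaGate` class are hypothetical, not anything from the actual site. Requests without the invite cookie get a 403, and every response also carries an `X-Robots-Tag: noindex` header as a belt-and-braces signal for any spider that slips through:

```python
from http.cookies import SimpleCookie

COOKIE_NAME = "beta_access"  # hypothetical cookie set by the invite link


class BetaGate:
    """WSGI middleware: only requests carrying the invite cookie get through."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        cookies = SimpleCookie(environ.get("HTTP_COOKIE", ""))
        if COOKIE_NAME not in cookies:
            # No invite cookie: refuse, and tell any spider not to index.
            start_response("403 Forbidden",
                           [("Content-Type", "text/plain"),
                            ("X-Robots-Tag", "noindex, nofollow")])
            return [b"Beta access only."]

        def sr(status, headers, exc_info=None):
            # Even for invited users, keep the beta out of search indexes.
            headers = list(headers) + [("X-Robots-Tag", "noindex, nofollow")]
            return start_response(status, headers, exc_info)

        return self.app(environ, sr)
```

One caveat with this approach: a spider that somehow receives (or replays) the invite link would also get the cookie, so it's weaker than real HTTP authentication, which never serves the content without credentials.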