I'm running a couple of recruitment websites. I have a main hub site and several skinned websites that use the same content. I want to stop the skinned sites from getting indexed, as they are showing above the 'new' main hub site. I've changed the robots.txt file to disallow spiders, but results from these skinned sites are still showing up in search engine results pages. Any ideas on how to stop this? I fear I'm getting hit for duplicate content.
Go to Google Webmaster Tools, open the URL removal option in the tools section, and request that those URLs be removed from the index there.
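If I remember right, the removal tool wants the URLs already blocked before it will act on them. Since you want the whole of each skinned site out of the index, a blanket robots.txt on each skinned domain should cover it. A minimal sketch, assuming you can serve a separate robots.txt per skinned domain:

User-agent: *
Disallow: /

That blocks everything on whichever domain serves it, so keep the hub site's robots.txt separate.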
Just add them to your site list in GWT. If they're on the same server and host you won't even have to upload a new verification file, since Google assigns the file name per GWT account. They'll verify right away if the file is already there for another domain.
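Roughly, it ends up looking like this (the file name below is made up; yours will be whatever your account was assigned):

http://www.mainhubsite.example/google0123456789abcdef.html
http://www.skinnedsite1.example/google0123456789abcdef.html
http://www.skinnedsite2.example/google0123456789abcdef.html

It's the same file in every site's root, so if the skinned sites share the hub's web root it's probably already reachable and each domain will verify straight away.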
# robots.txt for index.asp
User-agent: *
Disallow: /redirect.asp
Disallow: /banner_refer.asp

Verify your robots.txt file against the example above. Once that's in place, submit a removal request for the restricted pages in Google Webmaster Tools.
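One more thing worth checking: as far as I know, a robots.txt Disallow only stops crawling; it doesn't force URLs that are already indexed out of the results, which may be why you're still seeing them. If you can edit the skinned sites' templates, a robots meta tag is another route. A minimal sketch (placement and wording are just illustrative):

<!-- goes in the <head> of every page on the skinned sites, not the hub -->
<meta name="robots" content="noindex">

Bear in mind the crawler has to be able to fetch a page to see that tag, so you'd want to lift the robots.txt block on those pages (or wait until the removal requests go through) if you take this route.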