My primary domain has more than 20,000 pages; it's an article directory. Unfortunately, I want to scrap all of those pages, because there are a lot of spam pages and some of them redirect users maliciously, so I decided to block every cached page in robots.txt. Now, I have some subdomains running on that same domain. What will happen to those subdomains if I block each and every page of the primary domain? I have heard that Google treats subdomains as separate sites. Tell me what will happen to my subdomains: will they be affected by this action taken on the primary domain? Please advise.
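For reference, the robots.txt I plan to put at the root of the primary domain would be roughly this (just a sketch, with example.com standing in for my real domain):

    # Block every compliant crawler from the whole primary domain
    User-agent: *
    Disallow: /

As far as I understand, that tells every crawler to stay out of the entire primary domain.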
Blocking pages in the primary domain's robots.txt won't touch your subdomains: Google treats each subdomain as its own site, and each one is governed only by the robots.txt served from that subdomain's own root (e.g. sub.example.com/robots.txt). So if you also want to block pages on the subdomains, each subdomain needs its own robots.txt with its own Disallow entries. But if you never promoted any pages under those subdomains, you probably don't even need that; pages that were never promoted carry little value. That said, I'd suggest 301'ing the pages to your root before blocking them with the robots file.
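To make the 301 suggestion concrete, here is a rough sketch assuming an Apache server with mod_alias; the thread doesn't say what server is in use, and /articles/ is a made-up directory, so adjust the pattern to the real URL structure:

    # Hypothetical rule: permanently redirect everything under /articles/ to the homepage
    RedirectMatch 301 ^/articles/.* https://www.example.com/

Let Google recrawl and pick up those redirects first, and only then add the Disallow rules, because once a URL is blocked in robots.txt the crawler can no longer fetch it and will never see the 301.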