This company wants all the content on their static pages to be exactly the same EXCEPT the homepage, and also to have a blog with unique content. The content will go on up to 100 different domains, all owned by the SAME company. Would it still be possible to rank each domain highly for its own local area if the front page and blog have unique content but the rest is duplicate, standard information about the company across around 10 pages?
Google won't like it. Try to make each page sufficiently unique that the search engines think it's different (but related) content.
I think that would trigger the duplicate content alarm for the big G. Try making the pages as unique as possible!
It's spamming in its purest form, but I don't think anything will happen, as there is no penalty for duplicate content.
If all the content is on the same IP block they are toast. Google will link them together and they will be de-indexed. So if all the domains are with the same host, they will probably not even get indexed, never mind de-indexed. Google doesn't give a damn about dupe content, but if it's all on the same IP block it will. The only way this will rank is if the domains are on different C-blocks.
In one sentence, you have managed to be correct and incorrect at the same time. Well done. If you need to ask why search engines don't like duplicate content, you probably won't understand the answer, but here it is anyway: duplicate content myth explained
Well, it's the company's standard content; they state that it has to be on certain pages about their company and so on. Is there a way to stop Google from crawling those particular pages? Or any other ideas?
Yes, you can specify this in robots.txt, or use your meta tags, to ask spiders not to crawl or index those pages. Note that word: 'ask'.
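To make that concrete, here's a minimal sketch (the paths are hypothetical examples, not the actual URLs from this site). In robots.txt you ask crawlers not to fetch the duplicate pages at all:

```text
User-agent: *
Disallow: /about-us/
Disallow: /terms/
```

Or, to let a page be crawled but ask that it not be indexed, put a robots meta tag in that page's head:

```html
<meta name="robots" content="noindex, follow">
```

Note the difference: robots.txt blocks crawling, not indexing, so a blocked URL can still show up in results if other sites link to it. For keeping duplicate pages out of the index, the noindex meta tag (or the equivalent X-Robots-Tag HTTP header) is usually the more reliable choice, though as said above, these are requests that well-behaved spiders honor, not guarantees.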