I have a site with a knowledgebase for my customers. The entire knowledgebase has been excluded via robots.txt since I posted it, because most of the content is duplicated from another of my sites, where it is already spidered and gets search engine traffic. Is there any benefit to removing the robots exclusion? On one hand, bigger is better, and the extra pages might help the search engines determine what the site is about. On the other hand, the pages will likely be filtered as duplicate content, because they are. Bad idea? Good idea?
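For reference, the exclusion is just a blanket Disallow rule along these lines (the /kb/ path is hypothetical; mine points at the knowledgebase directory):

```
# robots.txt — block all crawlers from the knowledgebase section
User-agent: *
Disallow: /kb/
```

Removing it would mean deleting that Disallow line so crawlers can reach those pages again.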