I am currently practicing with robots.txt. I have found that a freshly installed WordPress blog contains a lot of duplicate content, and one major source of it is the archives section, since there are multiple archives of the same posts. So I decided to block them from being indexed by search engine spiders. What other sections of a new WordPress blog should be blocked using robots.txt? A rough example of what I am experimenting with is below.
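This is just a sketch, assuming the default permalink bases (category, tag, author); if you have changed those bases, the paths would be different:

User-agent: *
# archive listings that duplicate the single-post content
Disallow: /category/
Disallow: /tag/
Disallow: /author/
# core WordPress directories that never need to be indexed
Disallow: /wp-admin/
Disallow: /wp-includes/
# internal search result pages
Disallow: /?s=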
This has been discussed over and over since it was first "found out". That's why plugins like All in One SEO give you a checkbox to choose whether to block those sections or not. In reality it's not much of an issue, because Google is already well aware of how WordPress structures its archives. To date it has had no effect on the rankings of my blogs; I don't create robots.txt files for them and they rank just fine.
Install the All in One SEO Pack plugin and set categories, date archives, and tag archives to noindex; this will fix your problem.
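To be clear about what that setting does: the noindex option works by adding a robots meta tag to the head of those archive pages rather than by editing robots.txt. The output looks roughly like this (illustrative only, the exact attributes can vary by plugin version):

<meta name="robots" content="noindex,follow" />

The advantage over a robots.txt Disallow is that spiders can still crawl those pages and follow the links on them; the pages just won't be added to the index, which is what removes the duplicate content problem.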