Hi there. I've been reading that paginated entry pages on blogs can cause internal duplicate-content problems. Some people suggest using robots.txt to tell the bots not to index the paginated pages, but still follow the links through to the actual entries. From an SEO point of view, is this correct? Will it reduce duplicate content? Does it improve your search ranking? Thanks.
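For reference, the suggestion I've seen looks something like this in robots.txt (I'm assuming here that the paginated URLs live under /page/, like /page/2/):

    # Block crawlers from the paginated archive URLs
    User-agent: *
    Disallow: /page/

Not sure if that's actually the right tool for "don't index, but follow", which is partly why I'm asking.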
Do you mean a static "page" as opposed to a "post" on the blog? Why would you have duplicate content on both the page and the post? It's redundant. And yes, having duplicate content can harm your ranking.
Duplicate content can destroy your ranking. In WordPress, search for the "All in One SEO" plugin; it will help with your meta data. Next, add the robots meta tag and set it to "noindex, follow" on your homepage, because you want the bots to index only the individual post pages (see the sketch below). Also beware of your calendar and archive pages, since they will often show the bots duplicate content. If in doubt, search Google for site:yoururl.com; if you see any supplemental results, you know you have duplicate-content issues. Good luck.
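As a rough sketch (assuming a standard theme with a header.php), you could add this inside the <head> of your template so the tag only appears on the homepage, archive and calendar views:

    <?php if ( is_home() || is_archive() || is_date() ) : ?>
    <!-- Keep these pages out of the index, but let bots follow the links -->
    <meta name="robots" content="noindex, follow" />
    <?php endif; ?>

That way the duplicate listing pages stay out of the index while the bots can still crawl through to your individual posts.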