If I add an automated blog to a section of my website, will I risk losing my keyword rankings for material in my forum because of the duplicate content? I want to add a small automated blog, but I don't want to risk losing my main keyword rankings once Google sees that I'm using duplicated content. Thanks!
The question you should be asking is "Why should I provide useless duplicate content to my visitors?" That is precisely the question Google will ask itself when crawling your website.
I know it's not going to improve it; my question is whether it will axe unrelated rankings for different areas of the domain. I'd appreciate it if you'd actually help answer the question rather than posting a useless remark taking a dig at my intentions. In case you'd forgotten, my question is: "Will I lose my rankings for areas of the website not related to the automated blog section?" Please do not launch into an ethics debate about scraping. It's just a simple question about Google's algorithms. Thanks.
Mmmm, the guys above have valid arguments against it, so I'll give you an argument for doing it: you could run this as a "what's new" or news-type blog/feed, which can help visitors navigate your site. Depending on the blog software, you may also benefit from the RSS feed (if the blog supports one), for example by submitting it to directories and notifying news-related sites of your new material. You could even put the RSS feed on your website's template, which will keep all your pages updated with fresh content (see the sketch below).
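For example, here is a minimal sketch in Python of how you might pull the latest items from the blog's feed to show in a "what's new" box. It's purely illustrative: it assumes the blog exposes a standard RSS 2.0 feed, and the feed URL is a placeholder.

```python
# Minimal sketch: fetch the newest entries from an RSS 2.0 feed so they can be
# rendered in a "what's new" sidebar. The feed URL below is hypothetical.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://www.example.com/blog/rss.xml"  # placeholder feed location

def latest_items(url: str, limit: int = 5):
    """Return (title, link) pairs for the newest entries in the feed."""
    with urllib.request.urlopen(url) as resp:
        tree = ET.parse(resp)
    items = []
    # In RSS 2.0 the entries live under <rss><channel><item>.
    for item in tree.findall("./channel/item")[:limit]:
        title = item.findtext("title", default="(untitled)")
        link = item.findtext("link", default="#")
        items.append((title, link))
    return items

if __name__ == "__main__":
    for title, link in latest_items(FEED_URL):
        print(f"- {title}: {link}")
```

Your template code (whatever generates your pages) could then loop over those pairs to print a short list of links, so every page picks up the new material automatically.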
Ethics? I'm actually referring to the Google guidelines. Technically, I don't know the exact ratio of duplicate content that will trigger an "alarm"; however, I am certain that the root domain is not immune to penalties earned by your sub-directories, which means you might suffer a setback in the SERPs.
Well, I can't answer your question, but I hope this might be useful to you. I don't know about blogs, but my highest-PageRank page is copied from Wikipedia (I know PR doesn't directly affect SERPs), and I have backlinks to that page, yet it is still my highest-PR page. Go figure! I didn't copy the content out of sheer laziness or for AdSense revenue; I simply did it because it is useful to gather as much info as you can in one place for the visitor (for certain types of websites, anyway). I include all references and links to the info I've copied, and I do not (hopefully) break copyright laws. I don't think it has affected my SERPs, as it is a relatively young and very small site that is slowly rising in the SERPs for my chosen keywords. PS: All my highest-PR pages are copied content with barely any backlinks, while all the others are completely unique and updated regularly, so who knows, eh? All I can say is, use reputable sources and, of course, reference and link back to them.
This is why Google should start supporting the robots-nocontent attribute; it would be perfect for honest webmasters who are citing external content for the benefit of the end user.
In my experience, no, it will not hurt your pages with original content. I'm pretty confident of that as well.
Search engines do not like duplicate content. Because of this, using an automated blog is a big risk for your SERPs.