I was listening to Net Income on WebMaster Radio and shoemoney said he uses (or has used) the text from Wikipedia pages to make pages to put AdSense on. Can anyone point me to one of his pages, or tell me how this is done? Surely this just gets penalized as duplicate content? How much do you need to change content so that it is no longer duplicate? Has anyone got any experience with this?
Yes, the Wikipedia articles are never really going to rank well. But if you have a steady reader base, Wikipedia articles are likely to be interesting to them, and they will read them and possibly click on ads. So it is really about building an interesting website. I do the same: when I have a subject I really should have an article on, but don't want to write something about myself, I will use a Wikipedia article, perhaps doing a rewrite where I think it's wrong. For the reader it makes the website more interesting. As I build my website and learn more about the subject, that article will change, grow, etc., and move away from the original Wikipedia article. But in the meantime it gives valid info to my visitors as well as the opportunity to click on ads.
kh7 is spot-on. If you want to see an example, take a look at my page http://www.francethisway.com/places/bordeauxwine.php - a subject about which I know nothing, so I have used Wikipedia info, because it makes the site more useful to certain visitors. It is linked from a page http://www.francethisway.com/places/bordeaux.php which gets real visitors. I just checked in Google: the page I wrote is there, but the wine page is not even in supplemental. Interested people will still find it from the town page, though. A site made just from Wikipedia articles is doomed to failure from the start.
There are successful sites which use Wikipedia content (think answers.com, etc.) but they're very much the minority. There are really only two ways to get it to work: 1) use some sort of algorithmic rephrasing or another method to "trick" Google into letting the content in - for example, translating it back and forth between languages and then automatically fixing the grammar problems. Not useful to users, but it can end up in Google. 2) Actually add something useful to the content. This is the best route, but it's by no means easy.
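For what it's worth, here is a minimal sketch of what that round-trip translation trick looks like mechanically. The translate() function is a hypothetical placeholder, not any specific service's API:

    # Sketch of the "round-trip translation" rephrasing trick described above.
    # translate() is a hypothetical placeholder for whatever translation
    # backend you have access to; no real API is assumed here.
    def translate(text, source_lang, target_lang):
        raise NotImplementedError("plug in a real translation service")

    def round_trip_rephrase(text, pivot_lang="de"):
        # English -> pivot language -> English tends to reword sentences,
        # which is what makes the copy look "different" to a machine.
        intermediate = translate(text, "en", pivot_lang)
        return translate(intermediate, pivot_lang, "en")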
Exactly. Don't even bother putting in the effort. Wikipedia can act as a supplement, as others have stated.
Everyone around here is so worried about duplicate content. Google looks at your page as a WHOLE. If the article is only a small part of your page content, the duplicate problem is gone. To those saying it's "too much work": I can republish one page or all of them with about the same effort. How is that too much work?
Because you can't just republish one or all of them on pages mixed with other articles (that come from several different sources). Even if you have three articles from three very different sources, you can still go supplemental. I did.
So if I make a page that contains 3 Wiki articles, rewrite a few of the sentences, and put in my own intro paragraph, menus, navigation etc., Google still knows that it is duplicate content? Presumably they take a sample of sentences from the page and search for exact matches to those sentences on other sites. It must be a clever algo though, to not penalize legitimate identical sentences.
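Nobody outside Google knows exactly how they do it, but the general idea you describe can be sketched with "shingles": overlapping word sequences sampled from each page and compared across pages. This is purely an illustration of the concept, not Google's actual method:

    # Illustration of shingle-based near-duplicate detection (not Google's
    # actual algorithm): break each page's text into overlapping word n-grams
    # and measure how much of that phrasing two pages share.
    import re

    def shingles(text, n=8):
        words = re.findall(r"\w+", text.lower())
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

    def overlap_ratio(text_a, text_b, n=8):
        a, b = shingles(text_a, n), shingles(text_b, n)
        if not a or not b:
            return 0.0
        # Jaccard similarity: close to 1.0 means the pages share most phrasing.
        return len(a & b) / len(a | b)

The point being: rewriting a few sentences and adding an intro only removes a small fraction of the shingles, so a mostly copied article still scores as a near-duplicate even with unique menus and navigation around it.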
What I guess I'm asking is: does anyone here use this technique to successfully generate a few $ per day? And does the income last over time, or do you get found out after a while?
I think everyone is copying content to one degree or another, but the question is: what added value are you bringing to the mix? If you copy someone's article or refer to a part of it, it is very important to also generate some unique content of your own.
I'd never thought of it that way -- taking Wiki as a basis to augment other content...certainly an idea I'll work with