Hi all, I'm new to this whole SEO thing. For the last few years, instead of being a webmaster, I was just a contract programmer for companies. Now that I want to venture out and become a webmaster, I'm trying to gain as much knowledge of SEO as I can. The main search engine I'd like my sites to rank well in is Google; there's no doubt about that. I'm starting with Google and sticking with optimizing for Google.

Now, what drives me nuts is how the "duplicate content filters" work. First, what is duplicate content? Let's say Site A has a link and description:

-asitelink.com- a made up description to see how well duplicate content filters work

Site B, Site C, Site D, and Site E (all free directory sites) list:

-asitelink.com- a made up description to see how well duplicate content filters work

So all the sites have the same link title, address target, and description. Now let's say Google "detects" duplicate content on Sites B through E. Does this mean that when you submit to free directories (Sites B through E), Google will give no juice to those links?

If that is how it works, why submit to many directories with the same title and description? Why pay for a manual submission service to have a submitter send the same URL, title, and description to 1,000-plus directories (unless you think thousands of directories will provide you valuable traffic)?

I was recently told by a directory submission service that Google will indeed count the extra duplicate links (Sites B through E). He told me that 300 or so links with the same title and description are fine; going beyond 300 may make Google think it's spam and devalue the links. He also said it's good to use the same title and description for 300 directories, and when you submit to another 300, to switch to a different title and description to avoid the duplicate content filters.

If this directory submission service is right, then a site that gets 300 or so inlinks from DMOZ's ODP and its clones would have them all counted, making DMOZ still valuable to be in. So the whole "DMOZ (and its clones) only provides you one link" argument should no longer be used?
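Coming from a programming background, I keep picturing how a crude version of such a filter might work. This is purely my own sketch (in Python), not anything Google has published: normalize each listing's title and description, hash the result, and flag any fingerprint that shows up on more than one site.

    import hashlib
    from collections import defaultdict

    # Hypothetical listing data: (directory site, link title, description).
    listings = [
        ("site-b.example", "asitelink.com", "a made up description to see how well duplicate content filters work"),
        ("site-c.example", "asitelink.com", "a made up description to see how well duplicate content filters work"),
        ("site-d.example", "asitelink.com", "a made up description to see how well duplicate content filters work"),
        ("site-e.example", "asitelink.com", "a made up description to see how well duplicate content filters work"),
    ]

    def fingerprint(title, description):
        # Lowercase and collapse whitespace so trivial edits don't dodge the check.
        text = " ".join((title + " " + description).lower().split())
        return hashlib.md5(text.encode()).hexdigest()

    sites_by_fp = defaultdict(set)
    for site, title, desc in listings:
        sites_by_fp[fingerprint(title, desc)].add(site)

    for fp, sites in sites_by_fp.items():
        if len(sites) > 1:
            print("identical listing on %d sites: %s" % (len(sites), sorted(sites)))

If anything even vaguely like that is running, Sites B through E would be trivial to group together, which is exactly what worries me.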
You know, I personally feel that Googlebot is not as smart as people hype it up to be. It's a program, and as such it can only be as smart as it has been programmed to be. Nothing can ever replace the human factor in decision making.
Yup! They certainly put out plenty of misinformation to convince the masses that their algo is all that. Some guy named Matt dishes a lot of it out, and it spreads like wildfire across the blogosphere and forums.

That said, yes, you want to vary the link titles and descriptions you submit to directories, but nowhere near as much as the OP fears. Four different anchor text titles and a couple of different descriptions are fine in most cases when submitting to hundreds of directories. Submitting one exact phrase to hundreds of directories still helps rankings significantly for that term. Varying your anchors helps with targeting multiple terms and helps a bit with semantic relevance. I don't think Google is anywhere near the point where semantics, of the latent sort, are a significant part of the current algo, though everyone is talking as if it is. It's certainly coming down the pipe eventually.

Generally, the duplicate content thing does not apply to most directories. On any given page in a directory there will be a different mix of sites submitted, so even if those small description blurbs can be found on hundreds of other directories, they are mixed with other blurbs on the page to create unique mixes of content (toy illustration below).

I don't know where the ODP clones fit into this, though. Most of them will have identical pages with the same links and descriptions in the same order. Regardless of the clones, the original ODP listing at dmoz.org would be passing the trust-factor juice because of its authority status. The clones may or may not pass anything. Either way, the clones don't matter; the link in dmoz itself does.
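To put a rough number on that "unique mixes" point, here's a toy sketch in Python (my own illustration, not a claim about Google's actual method). Two directory pages share one identical blurb, but because the surrounding blurbs differ, the word-level overlap of the full pages sits well below 100%:

    def jaccard(a, b):
        # Page similarity measured as the overlap of the two pages' word sets.
        words_a, words_b = set(a.lower().split()), set(b.lower().split())
        return len(words_a & words_b) / len(words_a | words_b)

    shared = "asitelink.com a made up description to see how well duplicate content filters work"

    page_one = shared + (" budget travel deals in oregon"
                         " handmade pottery supplies and kiln reviews")
    page_two = shared + (" slow cooker recipes for busy families"
                         " classic car restoration tips from hobbyists")

    print(round(jaccard(page_one, page_two), 2))  # ~0.36: far from identical pages

The blurb is duplicated, but the pages are not, and that is the effect that keeps most directory pages out of trouble.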
Only someone with deep inside knowledge of the Google algorithm could say something like "300 identical anchor texts is OK"; the rest is speculation. Some say Google is smart, others say Google is stupid; you choose your own view. One thing is sure: Google is getting smarter with every day that passes. The Google team works in small iterations: if they try devaluing duplicated content and, after testing, feel that it improves search quality, they will do it. Duplicated content is very easy to spot, even mixed duplication like what happens in most directories (see the shingling sketch below), so the only safe route is 100% original content.
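For what it's worth, the published technique for spotting that kind of mixed duplication is w-shingling (Broder's near-duplicate detection work). A minimal sketch in Python, assuming nothing about Google's internals: even when two pages differ overall, a copied blurb still yields a run of identical word 5-grams.

    def shingles(text, w=5):
        # Every run of w consecutive words in the text.
        words = text.lower().split()
        return {" ".join(words[i:i + w]) for i in range(len(words) - w + 1)}

    blurb = "a made up description to see how well duplicate content filters work"
    page_a = "education links roundup for students " + blurb + " plus local news and weather"
    page_b = "webmaster resources and tutorials " + blurb + " with some programming articles"

    common = shingles(page_a) & shingles(page_b)
    print(len(common), "identical 5-word shingles")  # the copied blurb lights up

Whether the engine then devalues the link inside that blurb, or merely discounts the repeated text, is exactly the part nobody outside can confirm.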
I agree 100%. It's only as good as the people who program it, and we hype it up so much that we fear it in some way.

Five months ago I went on a campaign to rank for "Popular Internet Directory" for NWPARKS.net, and Google awarded me first place out of about 75,400,000 results. Then I figured I might try another key phrase, and in no time I went from first to second for "Popular Internet Directory". I was also first at Yahoo for that phrase and have since moved down to 5th, but I moved up to number 1 at MSN for the same keyword. Your keyword or phrase is your brand; changing it to rank well for another might be a little harmful if your competitor is after the same keywords as you. So I guess stick to that particular keyword.

Another was the WWWS keyword, for my site wwws.org. Three years ago I ranked 16th; then the site went offline for several months and the ranking was nowhere to be found. Last year it was at 30th, and since I've been campaigning for that keyword I kept moving up month by month until it was back at 16th. And now take a look at it: I'm third for the keyword wwws, I've knocked down all the wwws TLDs, knocked down AM 1400 Solid Gold Soul, and I'm glad I've pushed past wwws.warnerbros.com. lol, they were next to Solaris (sun.com).
So let's say I have an RSS feed in my directory, in Education for example: http://www.free-website-directory.com/Education/. At the bottom I pull in RSS news from Yahoo. The consensus is not to really worry about duplicate content filters? I was running a bit nervous, but I thought it was a cool feature.
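For reference, the pull itself is only a few lines. Here's roughly how that kind of footer feed works, sketched in Python with the feedparser library; the feed URL below is a stand-in, not Yahoo's actual endpoint:

    import feedparser  # pip install feedparser

    # Stand-in URL; in practice this would point at Yahoo's education news feed.
    FEED_URL = "http://news.example.com/rss/education"

    feed = feedparser.parse(FEED_URL)
    for entry in feed.entries[:5]:
        # Keeping it to a handful of headlines means the syndicated block
        # stays a small slice of the page's overall content.
        print('<a href="%s">%s</a>' % (entry.link, entry.title))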
It's Really Simple Syndication, which is exactly what it's intended to be used for. They wouldn't offer syndication if they didn't want you to use it. /me shakes head.
Sorry, my mistake; I was just about to edit my post. Now I do realize what you're trying to say, and it's fair, yes. My apologies.