Hi, I was thinking about Google and duplicate content. When you have a PHP-based site at www.domain.com, and your main index ignores its parameters, you can simply type http://www.domain.com/?id=1, http://www.domain.com/?id=2, http://www.domain.com/?id=3, or http://www.domain.com/?id=4, and they all return the same page. Google, however, could interpret these as different pages. What if someone with bad intentions wanted to make your site look bad to the SEs and created thousands of links like this so that Google indexed them? It would look to Google as if there were thousands of URLs, all with the same content. This may be a bit paranoid, but it would be easy for someone malicious to pull off. How does Google handle this?
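One simple way a search engine could spot this, sketched here in Python with made-up crawl data: index by content hash rather than by URL alone. The URLs and page body below are hypothetical stand-ins for the scenario described above.

```python
import hashlib

# Hypothetical crawl results: four distinct URLs that all returned the same HTML,
# exactly as in the ?id=1 ... ?id=4 example above.
pages = {
    "http://www.domain.com/?id=%d" % i: "<html>same page body</html>"
    for i in range(1, 5)
}

# A crawler that indexed by URL alone would see four "pages"...
print(len(pages))  # 4 distinct URLs

# ...but hashing the response bodies shows there is really only one document.
unique_bodies = {hashlib.sha256(body.encode()).hexdigest() for body in pages.values()}
print(len(unique_bodies))  # 1 unique body
```

Whether Google actually works this way is anyone's guess, but content-based deduplication like this would make the thousand-fake-links attack mostly harmless.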
Nobody really knows how Google deals with duplicate content. Everything I've read is speculation, and experiences vary from one person to another.
Well, if some idiot did that, the only person they'd be hurting is themselves. Google looks at the outbound links on a page and would nail them for it. You don't have control over inbound links to your site; even when "bad sites" link to you, the only thing you can do is ask them not to. And I'm not sure Google would treat your example as different pages. I would hope it's smart enough to realize it's the same page being called with different parameters. That's how you pass your tracking IDs and the like, so Google probably just strips off everything past the ? in the URL.
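The "strip everything past the ?" idea can be sketched, here in Python rather than PHP, as a normalizer that keeps only the parameters a site actually uses. The `ALLOWED_PARAMS` whitelist is a hypothetical example; nobody outside Google knows what it really does.

```python
from urllib.parse import urlsplit, parse_qs, urlunsplit

# Hypothetical whitelist: query parameters this site actually uses.
# Anything else (tracking IDs, junk like ?id=1) gets dropped.
ALLOWED_PARAMS = {"page"}

def canonical_url(url):
    """Collapse URLs that differ only in unrecognized parameters,
    so ?id=1, ?id=2, ?id=3 all normalize to the same canonical URL."""
    scheme, netloc, path, query, _ = urlsplit(url)
    kept = [(k, v) for k, vals in parse_qs(query).items()
            if k in ALLOWED_PARAMS for v in vals]
    new_query = "&".join("%s=%s" % (k, v) for k, v in kept)
    return urlunsplit((scheme, netloc, path, new_query, ""))

print(canonical_url("http://www.domain.com/?id=1"))   # http://www.domain.com/
print(canonical_url("http://www.domain.com/?id=2"))   # http://www.domain.com/
print(canonical_url("http://www.domain.com/?page=2")) # http://www.domain.com/?page=2
```

A whitelist is a bit smarter than blindly cutting at the ?, since it preserves parameters that genuinely change the page while still collapsing the duplicates from the example above.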
I agree. Google would only index one of the pages and disregard the ones with duplicate content, so it would be a waste of time to create a lot of sites and fill them with duplicate content.
If your site was the first to be indexed (relative to the duplicate sites), I believe the other content will be excluded from Google's results and you will not be penalized. But as zeekstern said, it's all speculation and the facts are not openly known.
Don't use duplicate content; it will just hurt your sites. Try to create some unique content instead.