Dear Digitalpoint members,

I have a question and I can't seem to find a satisfying answer through Google or the search function on this forum. I apologise if it's too obvious, but I really need some confirmation.

I run about 15 adult-related websites where I offer visitors a list of galleries. All these galleries have descriptions that are unique because we write them ourselves. Via a cronjob (which runs every 24 hours, so the visitors see "fresh" galleries from the same database every day) the server selects a random list of galleries (say, 100 galleries on the frontpage with their descriptions) and connects them to each site. Because the query is random, none of the sites show the galleries in the exact same order for 24 hours (we have around 2,500 galleries in our database, by the way, and it's growing every day).

I was hoping to avoid the duplicate content penalty problem this way, because none of the sites ever have the same structure, but they do use the same gallery descriptions over and over again. The main point (Google-wise) is that these descriptions are unique, and we are hoping to get some good SE traffic because of that.

What is wise?
- Let only one website make use of the galleries?
- Can we get in trouble because of the way we show the content?
- Will one site "penalize" the other this way?
- Maybe a rel="nofollow" so Google sees it only as content and not as links (maybe that can trigger some sort of penalty)? (And smack me in the face if I'm talking gibberish here.)

By the way, we use mod_rewrite on those galleries, so we have URLs like wow-this-is-a-great-forum.html instead of id/gallery/3.php; this won't do any harm now, does it?

Thanks in advance, and I hope I can get some rest now, because this issue is actually keeping me quite busy!

Kind regards,
Raymond
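P.S. To make the setup a bit more concrete, the nightly cron boils down to roughly this (a simplified sketch; the table and column names are made up for illustration, not our real schema):

<?php
// rotate.php - run from cron once every 24 hours, e.g.
//   0 4 * * * /usr/bin/php /var/www/scripts/rotate.php
$db = new PDO('mysql:host=localhost;dbname=galleries', 'user', 'pass');

// every site gets its own fresh random set of 100 galleries,
// so no two sites show the same list in the same order
$sites = $db->query('SELECT id FROM sites')->fetchAll(PDO::FETCH_COLUMN);

foreach ($sites as $siteId) {
    // throw away yesterday's selection for this site
    $db->prepare('DELETE FROM site_galleries WHERE site_id = ?')
       ->execute(array($siteId));

    // pick 100 galleries at random for today
    $ids = $db->query('SELECT id FROM galleries ORDER BY RAND() LIMIT 100')
              ->fetchAll(PDO::FETCH_COLUMN);

    $ins = $db->prepare('INSERT INTO site_galleries (site_id, gallery_id, position)
                         VALUES (?, ?, ?)');
    foreach ($ids as $pos => $galleryId) {
        $ins->execute(array($siteId, $galleryId, $pos + 1));
    }
}
?>

The mod_rewrite side is just a single rule along the lines of RewriteRule ^([a-z0-9-]+)\.html$ /gallery.php?slug=$1 [L] in .htaccess ("slug" is a made-up parameter name here).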
Hi!

1 - Are you using a unique title on each page?
2 - Are you using a unique meta description for each page?
3 - How many words do you have in your body area?

If you are using the same gallery descriptions on all your pages, this is the problem. If you have a short description and your text link menu is long, this could be the problem.

Best,
Jakomo
Duplicate content issues are related to a filter for the most part. What I would be worried about is the sites being boilerplate and considered part of a network. Are the rest of the sites (outside of the gallery) unique in content?
If I understood your setup correctly, it can spit out two pages with the same metas and title but different content. That, in my opinion (and who knows what that's worth), will result in a dupe trigger at the meta and title level at some point.
First of all, thanks for the replies!

@jakomo:
1) The <title> is the description of the gallery. Within each particular site it's unique, but because every site uses the same galleries, it's not THAT unique.
2) Same as the <title>: we use the description of the gallery as the meta description.
3) The amount of words varies from site to site. Most sites only use the galleries and some small texts, but the new site we are focusing on (and which is the most important) does have more text (reviews, small weblogs, etc.) on the frontpage.

@thegypsy: It could be that the sites are considered a network; all of them share the same IP address, and I'm afraid there isn't much more content than those galleries on most of the sites. The main site we are focusing on has some more content and is refreshed almost every day.

@noppid: I think you are quite right. I have 15 sites and they all use those galleries, including the descriptions. Someone clicks on a gallery, a new window opens, and the meta description and <title> are exactly the same as on all the other sites, except the layout is different and in one case there is more content.

-------------

Overall, what is the best way to continue? Maybe an iframe so Google doesn't see the galleries anymore? That would be a pity, because we put a lot of effort into the descriptions. Maybe only use one website with these descriptions (and have the other ones use an iframe or something so Google no longer sees the descriptions)?

Thanks in advance, guys.
I would certainly advise reading up on duplicate content issues... and reading some recent Matt Cutts site reviews... that will probably open your eyes somewhat and give you some more perspective on issues related to your situation. Check back after that.
Hi!

Yes, on your new project make it:
- Unique title
- Unique meta tag description
- Unique content (maybe between 300 and 500 words)

Remember about the menu; try to keep it short on each page. For example, I have pages with photos and a short description (around 15 words), and all these pages are supplemental.

Best regards,
Jakomo
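P.S. Even with shared gallery data you can at least make the head section differ per site, something like this (only a sketch; getGallery() and all the wording are hypothetical):

<?php
// gallery.php (sketch) - mix site-specific wording into the <title>
// and meta description so the 15 sites don't emit identical head
// sections for the same gallery; getGallery() is an assumed lookup helper
$siteName   = 'Example Site One';        // differs per site
$sitePrefix = 'Hand-picked daily: ';     // differs per site
$gallery    = getGallery($_GET['slug']);

$title = $sitePrefix . $gallery['title'] . ' - ' . $siteName;
$meta  = substr($siteName . ' review: ' . $gallery['description'], 0, 155);
?>
<html>
<head>
<title><?php echo htmlspecialchars($title); ?></title>
<meta name="description" content="<?php echo htmlspecialchars($meta); ?>">
</head>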
Thank you very much, thegypsy and jakomo. I will certainly take the time to read those articles and to pay some more attention to the content on the websites.

The thing here is that the visitors are more than pleased with the websites; they all have a different design and we actually do offer them videos every day (in a random order, but the database is big enough that we can do this).

I think I will try the following: I will make a different database with fresh new galleries and descriptions that are only used by the new website. On every site I will write an article or two with useful information and text that is unique. The menu is a little bit of a problem, I guess, but maybe I can fix this with a sitemap for Google and by putting the actual menu in an iframe so it won't be at the top of the source code.

Do you think it is wise to put the galleries in an iframe for now, so Google won't link to and index them anymore and the new website won't get harmed (if it isn't already harmed) by a duplicate content penalty, and then "release" the new database on the website when it is ready?
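For the sitemap idea, I was thinking of something as simple as this (a sketch; the table, column and domain names are placeholders):

<?php
// sitemap.php (sketch) - spit out a Google XML sitemap straight from
// the gallery table so every gallery page gets found even if the menu
// sits in an iframe; names below are placeholders
header('Content-Type: application/xml; charset=utf-8');

$db = new PDO('mysql:host=localhost;dbname=galleries', 'user', 'pass');

echo '<?xml version="1.0" encoding="UTF-8"?>' . "\n";
echo '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">' . "\n";

foreach ($db->query('SELECT slug FROM galleries') as $row) {
    echo '<url><loc>http://www.example.com/'
       . htmlspecialchars($row['slug']) . '.html</loc></url>' . "\n";
}

echo '</urlset>';
?>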
The menu is too long? You can use this tool http://www.webconfs.com/similar-page-checker.php to check the % of duplicate content between your pages. In my personal experience, if you have around 70% duplicate content, you can become supplemental.

Best regards,
Jakomo
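If you want a rough idea without the tool, a simple word-overlap check comes close to what these checkers measure (just an illustration of the idea, not the webconfs algorithm; the URLs are placeholders):

<?php
// similarity.php (sketch) - rough word-overlap percentage between the
// visible text of two pages
function pageWords($url) {
    $text = strip_tags(file_get_contents($url));
    preg_match_all('/[a-z0-9]+/i', strtolower($text), $m);
    return array_unique($m[0]);
}

$a = pageWords('http://www.example.com/page-one.html');
$b = pageWords('http://www.example.com/page-two.html');

$common  = count(array_intersect($a, $b));
$percent = 100 * $common / max(1, max(count($a), count($b)));

printf("Roughly %.0f%% of the words overlap.\n", $percent);
?>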
A little off topic, but one thing caught my eye in your post. I made a site a few months back that has something similar to your auto-updating front page gallery index. Basically the site is a datafeed sales site with lots of pages and very little depth to the linking structure. That is, there are maybe 3000 unique pages, but only one level of categorization. So like:

index
index/category1/product1
index/category1/product2
index/category1/product3
...
index/category1/product297
index/category2/product298
index/category2/product299
index/category2/product300
...
index/category2/product632
...

On each page (and the index) I have a randomly selected set of categories, ~20 out of maybe 300 or so. Maybe this explanation isn't making sense... too much blood in my coffee stream... but anyway, the site structure sounds very similar to yours.

Yahoo has indexed the site, ~1000 out of the 3000 or so pages, and I see Yahoo traffic for the index and sub-pages. Google, on the other hand, has completely ignored it. The index page is indexed, but I have never seen any Google traffic (other than Googlebot, which eats a regular share of bandwidth). And yes, I do have backlinks to the index page as well as to some interior pages.

Had I to do it over again, I would have created a better-defined linking structure. That is, added some static links on the pages and fewer random ones. Maybe adding a "most popular" set of links that changes infrequently might help both of us? Just a thought.
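For what it's worth, the "changes infrequently" part could be as simple as seeding the random pick with the week number, so the same ~20 category links stay put for a whole week instead of reshuffling on every page load (a sketch; the category list is hypothetical):

<?php
// popular.php (sketch) - keep the "random" category links stable for a
// whole week so spiders see a consistent linking structure
$categories = range(1, 300);    // pretend these are 300 category IDs

mt_srand((int) date('oW'));     // seed = ISO year + week, e.g. 200615,
                                // so the pick only changes once a week

$picked = array();
while (count($picked) < 20) {   // draw 20 distinct categories
    $picked[mt_rand(0, count($categories) - 1)] = true;
}

foreach (array_keys($picked) as $i) {
    echo '<a href="/category' . $categories[$i] . '/">Category '
       . $categories[$i] . "</a>\n";
}
?>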