Hello... Not sure if this has been mentioned here before, but (concerning the phpLD script) when we were adding a sitemap at Google Webmaster Tools for adult-url.com, we noticed that after the sitemap was accepted we were somehow tripping duplicate content filters for all category pages, which I assume would hurt, if not penalize, a site. It seems the Google bots spidering the category pages are also spidering the "Links Sort by: Hits | Alphabetical" links, which are 1. adult-url.com/Blogs/?s=A&p=1 and 2. adult-url.com/Blogs/?s=H&p=1. These are duplicates of the main category page, and that is what is causing the issue at Google Webmaster Tools in the first place. Just wanted to point out this issue and share a fix for the problem with others. Does anyone else who has sitemaps at Google get these errors? thx malcolm
I also noticed that the other day and was looking for a fix. So did Anon fix the problem? If yes, I have to talk to him as well. Thanks for sharing this.
I got the same problem, but I just deleted the original one, generated a new sitemap, and uploaded it without any issues.
I would like the fix for this as well, actually. If the fix is applied, though, how do you get the main category page indexed instead of the already indexed /?s=A&p=1 version?
Can't you just put a nofollow on those links so Google won't crawl them? That's what I do with links like that on my sites and it works.
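For example, the sort links in the category template could carry rel="nofollow". This is only a sketch of how phpLD might print them; the actual template file and markup in your version may differ:

<!-- hypothetical sort links in the phpLD category template -->
<a href="?s=A&amp;p=1" rel="nofollow">Alphabetical</a> |
<a href="?s=H&amp;p=1" rel="nofollow">Hits</a>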
Hello... I received a PM from "hyper" and he mentioned a possible solution for this. Here are some minor tweaks, but highly useful. Open robots.txt (which is blank by default) in your phpLD folder and add the following lines:

Disallow: /*?s=P&
Disallow: /*?s=H&
Disallow: /*?s=A&

Save and upload the changed robots.txt and you are done! many thx malcolm
Is there no way to stop this happening from the outset, though? Or will adding the robots.txt lines and then uploading a new sitemap accomplish the same thing?
Thanks for the fix, but I have a question. Shouldn't this be a non-issue? Since changing the sort order to Alphabetical, Hits, etc. yields differently ordered pages, I thought it would not trigger a flag in Google Webmaster Tools. At least it does not in my case.
Well, it did in our case. I would rather fix the issue ASAP so that Google can't say the pages are duplicates. Regardless of what Webmaster Tools says about the content, the site itself is being indexed very well (after the sitemap was installed) and we are receiving tons of traffic not only from Google but from all the major SEs, and we would like to keep it that way. Better safe than sorry. thx malcolm
I figured it was; I just couldn't find the proper thread. I have adrian/Silkysmooth working on the fix now, as well as a few others. laterz malcolm
Thanks malcolm, hyper & snowbird for sharing the solution to this. I have been having a big problem with this issue and first noticed it in October/November last year. (I opened a thread, but I am not supersonic like mikey at searching and bringing it back up.) In fact, on one of my directories these duplicate category pages have PR while the original categories do not. I'll fix all of them now.
I just decided that nobody ever bothered to re-sort the links. My choice of sorting is final; I don't want anyone sorting by some gimmick like PR. You could always noindex those sorted pages, or just remove the sort links altogether.
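If you go the noindex route, something like this in the page header would do it. It's only a sketch, since I'm guessing at how the sort parameter reaches the template (I believe the real phpLD script uses Smarty templates, so the check may look different in practice):

<?php
// Sketch: mark sorted category views (?s=A, ?s=H, ?s=P) as noindex
// so Google drops them from the index but still follows the links on the page.
if (isset($_GET['s'])) {
    echo '<meta name="robots" content="noindex, follow">';
}
?>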
This is true, as I never have, but there might be a few who do... We just did the robots.txt fix instead. It was an easy enough fix, so silky did the entire network; that issue is resolved... As for PR... Take a look at adult-url.com. Anything associated with PR (Google's metric) has been removed entirely from that directory, and soon from the network. I don't think surfers care about PR (only webmasters do); they care about the exact listings they are searching for, which is what we are providing. thx malcolm
Great steps, Malcolm. The same goes for other stuff like indexed-page counts, etc. That seems relevant only in a webmaster-niche directory.
Meant to ask before, but what do I use in the robots.txt to make this apply to all user agents? This? User-agent: *
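I'm thinking the full file would then look like this. (User-agent: * applies the rules to every crawler; the * wildcard in the Disallow paths is an extension that Googlebot understands, though some smaller bots may ignore it.)

User-agent: *
Disallow: /*?s=P&
Disallow: /*?s=H&
Disallow: /*?s=A&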