
Duplicate Content Issues for Directories

Discussion in 'Directories' started by malcolm1, Sep 14, 2008.

  1. #1
    Hello...

    Not sure if this has been mentioned here before but... :eek:
    (concerning the phpLD script)

    When we were adding a sitemap in Google Webmaster Tools for adult-url.com, we noticed that after it (the sitemap) was accepted
    we were somehow tripping "duplicate content" filters for all category pages, which I assume would hurt, if not penalize, a site.

    It would seem that the Google bots spidering the category pages are also spidering the...

    "Sort by: Hits | Alphabetical" links, which are

    1. adult-url.com/Blogs/?s=A&p=1
    2. adult-url.com/Blogs/?s=H&p=1

    These are duplicates of the main category page, and they are what is causing the issue at Google Webmaster Tools in the first place.

    Just wanted to point out this issue and share a fix for the problem with others.

    Does anyone else with sitemaps at Google get these errors?

    thx
    malcolm
     
    malcolm1, Sep 14, 2008 IP
    hyper and swedal like this.
  2. swedal

    swedal Notable Member

    #2
    I noticed the same thing a while back and had an0n code a fix for it.
     
    swedal, Sep 14, 2008 IP
  3. scoobby

    scoobby Active Member

    #3
    I also noticed that the other day and was looking for a fix. So an0n fixed the problem? If so, I have to talk to him as well. Thanks for sharing this.
     
    scoobby, Sep 15, 2008 IP
    swedal likes this.
  4. amenda

    amenda Banned

    #4
    I had the same problem, but I just deleted the original one, generated a new sitemap, and uploaded it without a problem.
     
    amenda, Sep 15, 2008 IP
  5. discover

    discover Notable Member

    #5
    I would like the fix for this as well, actually.

    If the fix is applied, though, how do you get the main category indexed instead of the already indexed /?s=A&p=1 version?
     
    discover, Sep 15, 2008 IP
    hyper and swedal like this.
  6. MeetHere

    MeetHere Prominent Member

    #6
    I noticed it earlier...
    Hope some geek finds the solution...
    Thanks for pointing it out, malcolm...
     
    MeetHere, Sep 15, 2008 IP
    swedal likes this.
  7. shacow

    shacow Active Member

    #7
    Can't you just put a nofollow on those links so Google won't crawl them? That's what I do with links like that on my sites and it works.
     
    shacow, Sep 15, 2008 IP
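
    For anyone wondering what that suggestion looks like in markup, here is a minimal sketch of the two sort links from the first post with rel="nofollow" added (where exactly these links get built depends on your phpLD template, so treat the markup as illustrative rather than the script's actual output):

        <a href="/Blogs/?s=A&amp;p=1" rel="nofollow">Alphabetical</a>
        <a href="/Blogs/?s=H&amp;p=1" rel="nofollow">Hits</a>

    Keep in mind that nofollow is only a hint: it discourages crawling of those URLs but does not guarantee they stay out of the index, which is part of why the robots.txt approach below comes up as well.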
  8. malcolm1

    malcolm1 Prominent Member

    #8
    Hello...

    I received a PM from "hyper" and he mentioned that a possible solution for this is...

    Here are some minor tweaks, but they're highly useful.

    Open robots.txt (which is blank by default) in your phpLD folder and add the following lines:

    Disallow: /*?s=P&
    Disallow: /*?s=H&
    Disallow: /*?s=A&

    Save and upload the changed robots.txt and you are done!

    many thx
    malcolm
     
    malcolm1, Sep 15, 2008 IP
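
    For reference, Disallow rules only take effect under a User-agent line, so a minimal sketch of the complete robots.txt would look like this, assuming you want the rules to apply to all crawlers (the * wildcard inside the path is a Googlebot extension and may be ignored by other bots):

        User-agent: *
        Disallow: /*?s=P&
        Disallow: /*?s=H&
        Disallow: /*?s=A&

    Also note that robots.txt only stops further crawling; sorted URLs that are already indexed (like the /?s=A&p=1 example asked about above) may linger for a while before the warnings clear.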
  9. hyper

    hyper Peon

    #9
    And full credit for the tweak goes to snowbird.

    Thanks,
    hyper :)
     
    hyper, Sep 15, 2008 IP
  10. discover

    discover Notable Member

    #10
    Is there no way to stop this from happening from the outset, though?
    Or will adding the robots.txt info and then uploading a new sitemap do the same thing?
     
    discover, Sep 15, 2008 IP
  11. falsealarm

    falsealarm Peon

    #11
    Thanks for the fix, but I have a question. Shouldn't this be a non-issue? Since changing the sort order to Alphabetical, Hits, etc. yields different pages, I thought it would not trigger a flag in Google Webmaster Tools. At least it does not in my case.
     
    falsealarm, Sep 15, 2008 IP
  12. malcolm1

    malcolm1 Prominent Member

    #12
    Well, it did in our case :eek:

    I would rather fix the issue ASAP so that Google can't say the pages are duplicates.
    Regardless of what Webmaster Tools says about the content, the site itself is
    being indexed very well (after the sitemap was installed) and we are receiving tons of traffic
    from not only Google but all the major SEs, and we would like to keep it that way :)

    Better safe than sorry ;)

    thx
    malcolm
     
    malcolm1, Sep 15, 2008 IP
  13. mikey1090

    mikey1090 Moderator Staff

    #13
    mikey1090, Sep 15, 2008 IP
  14. malcolm1

    malcolm1 Prominent Member

    #14
    malcolm1, Sep 15, 2008 IP
  15. jhnrang

    jhnrang Notable Member

    #15
    Thanks malcolm, hyper & snowbird for sharing the solution to this.

    I have been having a big problem with this issue and first noticed it in October/November last year. (I opened a thread, but I am not supersonic like mikey when it comes to searching and bringing it up. :D)

    In fact, on one of my directories these duplicate category pages have PR and the original categories do not. :eek:

    I'll fix all of them now. :D


     
    jhnrang, Sep 15, 2008 IP
    swedal and hyper like this.
  16. mikey1090

    mikey1090 Moderator Staff

    #16
    I just decided that nobody ever bothered to re-arrange the links. My choice of sorting is final; I don't want anyone sorting by some gimmick like PR.

    You could always noindex your links to those pages, or just remove them altogether.
     
    mikey1090, Sep 15, 2008 IP
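
    One way to read the noindex suggestion is to emit a robots meta tag only on the sorted views, so just the main category page stays eligible for indexing. A minimal sketch, assuming the sort parameter arrives as s in the query string and that you have a spot in the page head template to print it (the hook is an assumption, not phpLD's actual code):

        <?php
        // Hypothetical snippet for the <head> of category pages: mark the
        // sorted views (?s=A, ?s=H, ?s=P) as noindex while still letting
        // the crawler follow the listing links on them.
        if (isset($_GET['s'])) {
            echo '<meta name="robots" content="noindex,follow">';
        }
        ?>

    Unlike the robots.txt route, this lets Google keep crawling the sorted pages but tells it not to index them.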
  17. malcolm1

    malcolm1 Prominent Member

    #17
    This is true, as I never have, but there might be a few who do...
    We just did the robots.txt fix instead. It was easy enough, so
    silky did the entire network :) and that issue is resolved...

    As for PR... take a look at adult-url.com :D

    Anything associated with "PR" (the Google metric) has been
    removed entirely from that directory, and soon from the network.

    I don't see that surfers care about PR (only webmasters do); they care about
    the exact listings they are searching for, which is what we are doing ;)

    thx
    malcolm
     
    malcolm1, Sep 15, 2008 IP
  18. mikey1090

    mikey1090 Moderator Staff

    #18
    Great steps, Malcolm. The same goes for other stuff like indexed page counts, etc. It seems only relevant in a webmaster-niche directory.
     
    mikey1090, Sep 15, 2008 IP
    swedal and hyper like this.
  19. discover

    discover Notable Member

    #19
    Meant to ask before, but what do I use in the robots.txt to make this applicable to all user agents?
    This?
    User-agent: *
     
    discover, Sep 21, 2008 IP