Help with blocking sorted-by pages from being indexed

Discussion in 'robots.txt' started by samehzone, Aug 21, 2008.

  1. #1
    How do I block these sorted-by URLs?

    I need to keep the sorting for visitors, but I don't want search engines to index the sorted URLs, since the same category can end up indexed three times:

    The first is the default sort, and that's the one I want indexed.
    The second is the alphabetical sort.
    The third is the sort by hits.


    http://middleastpost.org/Palestine/
    http://middleastpost.org/Palestine/?s=A&p=1
    http://middleastpost.org/Palestine/?s=H&p=1

    Any help?
     
    samehzone, Aug 21, 2008
  2. catanich

    #2
    Try this

    # All robots: skip the sorted views but spider the rest of the domain
    User-agent: *
    Disallow: /Palestine/?s=A
    Disallow: /Palestine/?s=H
    Allow: /Palestine/
     
    catanich, Aug 21, 2008
  3. samehzone

    #3
    Yeah, but that's a lot of work for me. I've got a lot of categories and I don't want to do them one by one!
     
    samehzone, Aug 22, 2008
  4. samehzone

    #4
    Any help, guys?
     
    samehzone, Aug 24, 2008
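
For many categories, a wildcard rule can cover all of them at once: Googlebot, Bingbot, and most other major crawlers honor the * wildcard in robots.txt paths, although it is not part of the original robots.txt standard, so older crawlers may ignore it. A minimal sketch, assuming every sorted view uses the ?s= query parameter as in the URLs above:

    # Block every sorted view (any URL whose query string starts with s=) for all crawlers
    User-agent: *
    Disallow: /*?s=

The default URLs such as http://middleastpost.org/Palestine/ carry no ?s= parameter, so they remain crawlable. If the sort parameter can also appear after another parameter (e.g. ...&s=A), add a second rule, Disallow: /*&s=, to catch that form as well.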