How to remove this dynamic URL from search engines using robots.txt

Discussion in 'robots.txt' started by sarathy, Oct 13, 2005.

  1. #1
    Hi guys, how do I remove this URL:

    mydomain.com/viewquestionsanswers.php?START=1&END=10 through robots.txt?

    All the URLs on this page are generated like:
    mydomain.com/viewquestionsanswers.php?START=1&END=10
    mydomain.com/viewquestionsanswers.php?START=11&END=20

    Is there any way to remove all the dynamically generated URLs from the search engines through robots.txt?

    Regards,
    sarathy.s :)
     
    sarathy, Oct 13, 2005 IP
  2. nohaber

    #2
    If you disallow the viewquestionsanswers.php file, it should disallow all the dynamic URLs involving that file.
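    For the URLs in this thread that would just be (a minimal sketch, assuming robots.txt sits at the site root):

    User-Agent: *
    Disallow: /viewquestionsanswers.php

    Disallow matching works by prefix, so the rule above covers the bare script and every query-string variation of it. A rough way to sanity-check that is Python's urllib.robotparser (note it implements plain prefix matching only, no wildcards):

    import urllib.robotparser

    parser = urllib.robotparser.RobotFileParser()
    parser.parse([
        "User-agent: *",
        "Disallow: /viewquestionsanswers.php",
    ])

    # The script and all its query-string variations are blocked, because
    # Disallow matches any URL that starts with the given path prefix.
    print(parser.can_fetch("*", "http://mydomain.com/viewquestionsanswers.php"))                 # False
    print(parser.can_fetch("*", "http://mydomain.com/viewquestionsanswers.php?START=1&END=10"))  # False
    # Other pages on the site (index.php here is just a stand-in) stay crawlable.
    print(parser.can_fetch("*", "http://mydomain.com/index.php"))                                # True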
     
    nohaber, Oct 13, 2005 IP
  3. Epica

    #3
    If you want to keep the main page indexed, you could do a user-agent detect and supply Google, MSN, and Yahoo each a fixed variable. They would be reassigned that same variable each time they entered the dynamic page, so they would never find and index multiple URLs for a single page.
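    A rough sketch of that idea (purely illustrative and untested; the crawler strings, parameter names, and fixed page values are assumptions, not anything from this thread):

    # Pin known crawlers to one fixed query value so each of them only ever
    # sees a single URL for the dynamic page; normal visitors are unaffected.
    CRAWLER_TOKENS = ("googlebot", "msnbot", "slurp")  # Google, MSN, Yahoo

    def paging_values_for(user_agent, start, end):
        """Return the (START, END) pair this visitor should actually be served."""
        ua = user_agent.lower()
        if any(token in ua for token in CRAWLER_TOKENS):
            return 1, 10       # crawlers always get the same canonical page
        return start, end      # everyone else keeps what they asked for

    print(paging_values_for("Mozilla/5.0 (compatible; Googlebot/2.1)", 11, 20))  # (1, 10)
    print(paging_values_for("Mozilla/5.0 (Windows NT 5.1)", 11, 20))             # (11, 20)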
     
    Epica, Oct 13, 2005 IP
  4. alexo

    #4
    Hello,
    a question: I know it's possible via robots.txt to disallow indexing of all dynamic files, but unfortunately I don't remember the command.

    Can anybody remind me of it?

    Thank you.
     
    alexo, Nov 30, 2005 IP
  5. alexo

    #5
    I found this one:
    User-Agent: *
    Disallow: /*?

    Is it the right command?

    Thank you.
     
    alexo, Nov 30, 2005 IP
  6. idolw

    #6
    idolw, Dec 3, 2005 IP
  7. flyguy

    #7
    So, just to be sure, would:
    User-Agent: *
    Disallow: somepage.php

    exclude all of these?
    somepage.php?id=1234
    somepage.php?id=12345
    somepage.php?id=12345&id2=12345
    etc.
     
    flyguy, Dec 29, 2005 IP
  8. Alfahane

    #8
    Are you sure? (I'm not saying that you are wrong :) )
     
    Alfahane, Jul 20, 2006 IP
  9. UmbrellaTechnologies

    #9
    I have the same problem with some pages I don't want listed in Google. Do I use the same method above to accomplish this?
     
    UmbrellaTechnologies, Sep 21, 2006 IP
  10. internetmarketing

    #10
    Hey guys, if you solve that problem then please PM me... I'm facing the same problem. :)
     
    internetmarketing, Sep 27, 2006 IP
  11. netprophet

    #11
    Me too. :)
     
    netprophet, Sep 27, 2006 IP
  12. adacprogramming

    #12
    This does not work: it will disallow somepage.php but allow all the variable versions. I'm trying to figure out the same thing.

    I just found something else I am going to try:

    Disallow: somepage.php*

    The * is supposed to match any number of characters, so it seems like that should work.
    If I can't find a better way, I'll have to move the file into a folder and disallow the folder. What a pain! :(
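    If it comes to that, the folder version of the rule is at least simple (a sketch; the folder name /questions/ is only an example):

    User-Agent: *
    Disallow: /questions/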
     
    adacprogramming, Oct 3, 2006 IP
  13. sarathy

    #13
    That's a good idea, but even that would not help in removing a single dynamic URL :(
     
    sarathy, Oct 3, 2006 IP
  14. adacprogramming

    #14
    Looks like this can help you with the dynamic part:
    Disallow: /viewquestionsanswers.php?START=1&END=*
    That should fix your problem.

    If you only want the one page ignored, you can state the exact page:
    Disallow: /viewquestionsanswers.php?START=1&END=10
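    A quick check of the exact-page rule (a sketch using Python's urllib.robotparser, which only understands plain prefix matching and so ignores the wildcard form above):

    import urllib.robotparser

    parser = urllib.robotparser.RobotFileParser()
    parser.parse([
        "User-agent: *",
        "Disallow: /viewquestionsanswers.php?START=1&END=10",
    ])

    # Only the exact page is blocked; the neighbouring pages stay crawlable.
    print(parser.can_fetch("*", "http://mydomain.com/viewquestionsanswers.php?START=1&END=10"))   # False
    print(parser.can_fetch("*", "http://mydomain.com/viewquestionsanswers.php?START=11&END=20"))  # True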
     
    adacprogramming, Oct 4, 2006 IP
    sarathy likes this.
  15. manish.chauhan

    #15
    This code will help you...
    User-Agent: *
    Disallow: /*?
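    One caveat: the * wildcard is an extension honoured by the major crawlers (Google documents it), not part of the original robots.txt standard, so simple parsers ignore it. Roughly, a wildcard-aware bot treats the pattern like this (a sketch of the matching idea, not any engine's actual code):

    import re

    def robots_pattern_to_regex(pattern):
        """Model of wildcard matching: * matches any run of characters,
        and a trailing $ anchors the pattern to the end of the URL."""
        anchored = pattern.endswith("$")
        if anchored:
            pattern = pattern[:-1]
        body = ".*".join(re.escape(part) for part in pattern.split("*"))
        return re.compile("^" + body + ("$" if anchored else ""))

    rule = robots_pattern_to_regex("/*?")
    print(bool(rule.match("/viewquestionsanswers.php?START=1&END=10")))  # True: blocked
    print(bool(rule.match("/viewquestionsanswers.php")))                 # False: still crawlable

    Note that this blocks every URL on the site that has a query string, so only use it if you really want all dynamic URLs kept out.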
     
    manish.chauhan, Jul 14, 2008 IP
  16. C. Berntsen

    #16
    I'm not sure if you noticed, but the last post was made almost two years ago.
     
    C. Berntsen, Jul 17, 2008 IP