Password protected directory question

Discussion in 'Site & Server Administration' started by Smyrl, Sep 28, 2005.

  1. #1
    If no directive is given in the robots.txt file, do robots index password protected directories?

    Thanks for your input.

    Shannon
     
    Smyrl, Sep 28, 2005 IP
  2. Crazy_Rob

    Crazy_Rob I seen't it!

    #2
    Robots won't know the password.
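To make this concrete: a password protected directory normally means HTTP Basic Auth, so the server answers every request without valid credentials (including a robot's) with a 401 before any page content is sent. A minimal sketch, assuming Apache; the AuthUserFile path and realm name are hypothetical:

```apache
# .htaccess placed inside the protected directory (Apache sketch)
AuthType Basic
AuthName "Members Only"
AuthUserFile /home/user/.htpasswd   # hypothetical path; keep it outside the web root
Require valid-user
```

A spider requesting any URL under that directory receives "401 Authorization Required" and never sees the content, which is why it cannot index what's inside.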
     
    Crazy_Rob, Sep 28, 2005 IP
  3. Smyrl

    Smyrl Tomato Republic Staff

    #3
    Thanks. That is good news indeed and just what I hoped. It will give a second layer of protection to keep content from being indexed.

    Shannon
     
    Smyrl, Sep 28, 2005 IP
  4. Bernard

    Bernard Well-Known Member

    #4
Some search engines, notably Google, may still include pages from password protected directories in the SERPs, but without titles, descriptions or cached pages, if they find enough inbound links (IBLs) to determine what the protected page is about. They appear in the SERPs as the bare URL of the page.
     
    Bernard, Sep 28, 2005 IP
  5. Smyrl

    Smyrl Tomato Republic Staff

    #5
Thanks, Bernard. I have disallowed spidering of the directory in the robots.txt file, but I read somewhere that information disallowed in this manner may sometimes still be indexed.

    Shannon
     
    Smyrl, Sep 28, 2005 IP
  6. Bernard

    Bernard Well-Known Member

    #6
Yes, but indexed <> cached — getting indexed is not the same as being crawled and cached.

    Pages or directories disallowed by robots.txt will not be read by robots.txt compliant spiders (such as Googlebot, Slurp, MSN Bot, etc.), so the respective search engines will not process or cache the disallowed content. However, the search engines may still include the URLs in SERPs if enough IBLs use meaningful anchor text.
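That first step — a compliant spider checking robots.txt before fetching anything — can be sketched with Python's standard-library robots.txt parser. The directory name /private/ and the domain are hypothetical, stand-ins for whatever you disallowed:

```python
# Sketch: how a robots.txt-compliant spider decides whether to fetch a URL.
# The "/private/" directory and example.com domain are hypothetical.
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
# parse() accepts the robots.txt lines directly, so no network access is needed
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A compliant bot skips the disallowed directory but may crawl everything else
print(rp.can_fetch("Googlebot", "http://example.com/private/page.html"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/public/page.html"))   # True
```

Note what the parser never tells the bot: anything about the page content. That is why the URL itself can still end up in the index via anchor text from inbound links.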

To ensure zero visibility of a page in the SERPs, you need to allow the page/directory in robots.txt and use a meta robots "noindex" tag on the page itself. The meta robots directive does no good if the robots can't process it (because they were forbidden from reading the page).
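In other words, the tag that keeps a page out of the index has to be readable by the robot. A minimal sketch of what goes in the &lt;head&gt; of each page you want fully hidden (with no matching Disallow line in robots.txt, so the spider can actually reach it):

```html
<!-- Let the robot crawl the page, but tell it not to index it -->
<meta name="robots" content="noindex">
```

Once the spider reads that tag, the page is dropped from the index entirely — no URL-only listing, unlike the robots.txt Disallow case.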
     
    Bernard, Sep 28, 2005 IP
  7. aqi32

    aqi32 Active Member

    #7
Eh? If the directory cannot be accessed without a password, then how will a robot gain access? It can't! :D
     
    aqi32, Sep 28, 2005 IP