I disallowed a directory, but Google disallows my files... what?

Discussion in 'robots.txt' started by shacow, Sep 1, 2008.

  1. #1
    Hi, I have my robots.txt set up so that it won't spider a folder on my server called dev (Disallow: /dev/). I have a page called dev.html in my root folder which I want to be spidered by Google etc., but Google Webmaster Tools reports that it is restricted by robots.txt.

    Does anyone understand this, or know how to fix it?


    many thanks. Shaun
     
    shacow, Sep 1, 2008 IP
  2. #2
    It's at http://shacow.com/robots.txt

    But yes, it looks like I have done it wrong.

    User-Agent: *
    Disallow: /dev
    Disallow: /whois
    Allow: /


    What's the right way to write it so Googlebot will not crawl anything contained in the dev folder, but will still crawl the dev.html file?

    thanks, Shaun
     
    shacow, Sep 3, 2008 IP
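For anyone hitting the same problem: the likely culprit is prefix matching. A rule like `Disallow: /dev` (no trailing slash) blocks every URL whose path starts with `/dev`, which includes `/dev.html` and `/development/` as well as the `/dev/` folder. Adding the trailing slash limits the rule to the directory. A sketch of a corrected file, assuming `/whois` is also meant to be a directory:

```
User-Agent: *
Disallow: /dev/
Disallow: /whois/
Allow: /
```

Note that `Allow: /` is harmless but unnecessary (anything not disallowed is crawlable by default), and `Allow` itself is an extension supported by Googlebot and most major crawlers rather than part of the original robots.txt standard, so very old or minimal bots may ignore it.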
  3. #3
     powerdot, Sep 5, 2008 IP