Can I limit a crawler to the root folder with a single command?

Discussion in 'robots.txt' started by sorcha, Nov 1, 2010.

  1. #1
    Hi,
Is there a single command to prevent a crawler from going into any subfolder? I'm asking because Google seems to be picking up subfolders that don't even exist on my site, so excluding each folder separately isn't enough; I need a general rule.

    Thanks
     
    sorcha, Nov 1, 2010 IP
  2. AtomicPages

    #2
    
    User-agent: *
    Disallow: /
    
    Code (markup):
This will block every bot that follows robots.txt rules from crawling anything on the site: every folder and file under your public_html folder, including pages in the root.
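If the goal is to keep files in the root crawlable and only keep bots out of subfolders, a sketch like the following should work, assuming the crawler honors wildcard patterns (Google and Bing do, although wildcards are not part of the original robots.txt standard):

User-agent: *
# Block any URL with a directory segment after the root,
# e.g. /blog/post.html, while /index.html stays crawlable.
Disallow: /*/

Code (markup):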
     
    AtomicPages, Nov 3, 2010 IP