what does this mean?

Discussion in 'Google' started by oleander, Jan 14, 2010.

  1. #1
    User-agent: *
    Disallow: /search

    this is what shows up in my webmaster tools for my site. Does this mean the robots can't search? What exactly is this? lol
     
    oleander, Jan 14, 2010 IP
  2. webcosmo

    webcosmo Notable Member

    #2
    that means your robots.txt file is asking robots not to crawl any URL on your site that starts with /search.
     
    webcosmo, Jan 14, 2010 IP
  3. Revelations-Decoder

    Revelations-Decoder Well-Known Member

    #3
    Sounds about right!

    Though not exactly.

    You most likely have a file on your server called robots.txt (it sits in the root of the site, not in a folder as such) that contains a rule telling spiders not to crawl part of your site. Well-behaved spiders do actually honour it.

    So what you need to do is this >

    Fetch that file onto your local machine (That's your PC, Mac or whatever) and edit it to say something like >

    User-agent: *
    Disallow:
    
    # too many repeated hits, too quick
    User-agent: litefinder
    Disallow: /
    
    # Yahoo. too many repeated hits, too quick
    User-agent: Slurp
    Disallow: /
    
    # too many repeated hits, too quick
    User-agent: Baidu
    Disallow: /
    
    Then upload the edited file back to your server and you should be good to go.
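    If you want to sanity-check a robots.txt before uploading it, you can evaluate it locally with Python's standard urllib.robotparser. A minimal sketch along the lines of the rules above (example.com is just a placeholder domain, and the rules string is an illustrative sample, not your actual file):

```python
from urllib import robotparser

# Sample rules: allow everyone, but block the Slurp crawler entirely.
rules = """\
User-agent: *
Disallow:

User-agent: Slurp
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# An empty Disallow means "nothing is disallowed" for the wildcard group.
print(rp.can_fetch("Googlebot", "https://example.com/any/page"))  # True

# Slurp has its own group with Disallow: /, so it is blocked everywhere.
print(rp.can_fetch("Slurp", "https://example.com/any/page"))      # False
```

    This only tells you how standards-following crawlers should interpret the file; badly behaved bots ignore robots.txt entirely.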

    There are other directives that may be of use - perhaps others will chime in with more?
     
    Last edited: Jan 14, 2010
    Revelations-Decoder, Jan 14, 2010 IP
  4. rvnhanh

    rvnhanh Guest

    #4
    It does not allow search engines to crawl your /search URLs.
     
    rvnhanh, Jan 14, 2010 IP
  5. AirForce1

    AirForce1 Peon

    #5
    Hi, oleander

    I think this just means crawlers will not crawl your pages under the /search path, if they are decent enough to obey the robots rules. :)

    Have a nice day,
     
    AirForce1, Jan 14, 2010 IP
  6. abhijit

    abhijit Notable Member

    #6
    It is a Disallow rule telling spiders not to crawl your /search URLs.
     
    abhijit, Jan 14, 2010 IP
  7. vagrant

    vagrant Peon

    #7
    Disallow: /search

    blocks search engines from crawling any URL whose path starts with /search - the rule is a simple prefix match, so it covers a file called search, the /search/ directory and everything in it, and URLs like /search?q=test.

    If you only wanted to block the contents of the search directory itself you would put
    Disallow: /search/ ... with a / at the end !!

    in your case, using one of your blogs as an example it would block
    thecanadarealestatenews.blogspot.com/search?q=test
    ie the blog's built-in search results pages
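    The prefix-matching behaviour described above can be checked with Python's standard urllib.robotparser (example.com and the sample URLs here are placeholders for illustration):

```python
from urllib import robotparser

def allowed(rules: str, url: str) -> bool:
    """Return True if the wildcard user-agent may fetch url under these rules."""
    rp = robotparser.RobotFileParser()
    rp.parse(rules.splitlines())
    return rp.can_fetch("*", url)

no_slash = "User-agent: *\nDisallow: /search\n"
with_slash = "User-agent: *\nDisallow: /search/\n"

# /search is a prefix match: it blocks the bare path, the directory, and query URLs.
print(allowed(no_slash, "https://example.com/search"))           # False
print(allowed(no_slash, "https://example.com/search?q=test"))    # False
print(allowed(no_slash, "https://example.com/search/labels"))    # False

# /search/ with a trailing slash only matches inside the directory.
print(allowed(with_slash, "https://example.com/search"))         # True
print(allowed(with_slash, "https://example.com/search/labels"))  # False

# Unrelated paths are never affected.
print(allowed(no_slash, "https://example.com/about"))            # True
```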
     
    Last edited: Jan 15, 2010
    vagrant, Jan 15, 2010 IP
  8. hem1234

    hem1234 Peon

    #8
    Search engines will not crawl any URL starting with /search on the site. That is what this line in the robots.txt file says.
     
    hem1234, Jan 15, 2010 IP