PR 6 but none of the engines is caching the site

Discussion in 'Google' started by tradealoan, Oct 13, 2006.

Thread Status:
Not open for further replies.
  1. #1
    tradealoan, Oct 13, 2006 IP
  2. vagrant

    vagrant Peon

    #2
    vagrant, Oct 13, 2006 IP
  3. seodelhi

    seodelhi Active Member

    #3
    Just replace 'Disallow' with 'Allow' and see the magic. ;)
     
    seodelhi, Oct 13, 2006 IP
  4. maldives

    maldives Prominent Member

    #4
    You have disallowed robots/spiders from indexing your pages. You must allow the bots/spiders in order to get noticed.
     
    maldives, Oct 13, 2006 IP
  5. maya786

    maya786 Peon

    #5
    This is a very basic error.

    Only use that directive if you do not want your website to be crawled by the engines.

    It appears that at one time crawling was allowed, but later you disallowed it.
     
    maya786, Oct 13, 2006 IP
  6. jaguar-archie2006

    jaguar-archie2006 Banned

    #6
    Cool, it's alright. You can still update your robots.txt to allow all robots to spider your pages. Good luck.
     
    jaguar-archie2006, Oct 13, 2006 IP
  7. jaguar-archie2006

    jaguar-archie2006 Banned

    #7
    1. How to allow all search engine spiders to index all files

    Use the following content for your robots.txt file if you want to allow all search engine spiders to index all files of your Web site:

    User-agent: *
    Disallow:


    2. Other uses of robots.txt

    If you want to exclude all the search engine spiders from your entire domain, you would write just the following into the robots.txt file:

    User-agent: *
    Disallow: /

    If you want to exclude all the spiders from a certain directory within your site, you would write the following:

    User-agent: *
    Disallow: /aboutme/

    If you want to do this for multiple directories, you add on more Disallow lines:

    User-agent: *
    Disallow: /aboutme/
    Disallow: /stats/

    If you want to exclude certain files, then type in the rest of the path to the files you want to exclude:

    User-agent: *
    Disallow: /aboutme/album.html
    Disallow: /stats/refer.htm

    If you are curious, here is what I used to keep an article from getting indexed:

    User-agent: *
    Disallow: /zine/article002.htm

    If you want to keep a specific search engine spider from indexing your site, do this:

    User-agent: Robot_Name
    Disallow: /
     
    jaguar-archie2006, Oct 13, 2006 IP
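If you want to double-check what a given robots.txt actually blocks before uploading it, you can parse the rules locally with Python's standard urllib.robotparser. This is just a minimal sketch; the sample rules and the example.com URLs are illustrations based on the examples in the post above, not any real site's file.

```python
# Check which URLs a robots.txt file blocks, using Python's
# standard-library urllib.robotparser (no network access needed).
import urllib.robotparser

# Sample rules modeled on the examples in this thread.
rules = """\
User-agent: *
Disallow: /aboutme/
Disallow: /stats/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# "*" stands in for any crawler's user-agent string.
print(rp.can_fetch("*", "http://example.com/index.html"))         # True: not disallowed
print(rp.can_fetch("*", "http://example.com/aboutme/album.html")) # False: under /aboutme/
```

If the second call printed True for a page you meant to block (or False for a page you meant to allow, as in this thread), the rules need fixing before you upload the file.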