Am I blocking the search engine crawlers?

Discussion in 'Search Engine Optimization' started by carboncat, Aug 18, 2007.

  1. #1
    Googlebot has visited my site many times, but the number of files retrieved is 'zero'. Same for Yahoo Slurp. They keep coming, but they get no files.
    I am not blocking IP addresses.
    There is no .htaccess file in the directory.
    This is a fairly new site (about 1 month old), but I am redirecting visitors there from an older domain.
    Why don't the spiders retrieve any files? Am I doing something wrong?
     
    carboncat, Aug 18, 2007 IP
  2. pining_garcia

    pining_garcia Banned

    Messages:
    38
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #2
    Have you tried using the site: command in the search engines?
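    For example, searching Google for site:yourdomain.com (with your own domain in place of the placeholder) lists the pages it currently has in its index for that domain; if the search returns nothing, the site really isn't indexed yet.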
     
    pining_garcia, Aug 18, 2007 IP
  3. Kuldeep1952

    Kuldeep1952 Active Member

    Messages:
    290
    Likes Received:
    18
    Best Answers:
    0
    Trophy Points:
    60
    #3
    You should register with Google Webmaster Tools. You will then be able to see if there is any problem with your robots.txt that prevents Googlebot from spidering your website.
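    If you want a quick local check as well, Python's standard library can read a robots.txt the same way a crawler does. This is only a rough sketch, and example.com stands in for your own domain:

    Code:
    import urllib.robotparser

    # Sketch: parse the site's robots.txt and ask whether the major bots may fetch the homepage.
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://example.com/robots.txt")
    rp.read()

    print("Googlebot allowed:  ", rp.can_fetch("Googlebot", "http://example.com/"))
    print("Yahoo Slurp allowed:", rp.can_fetch("Slurp", "http://example.com/"))

    If either line prints False, the robots.txt is what's stopping the spiders; if there is no robots.txt at all, crawlers treat everything as allowed.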
     
    Kuldeep1952, Aug 18, 2007 IP
  4. VandJ

    VandJ Guest

    Messages:
    13
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #4
    I would also advise you to check in Google Webmaster Tools to see what the problem is. Did you buy your domain from someone else?
     
    VandJ, Aug 18, 2007 IP
  5. chickboi

    chickboi Banned

    Messages:
    13
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #5
    What is the URL of your site? Maybe it has no inbound links yet...
     
    chickboi, Aug 18, 2007 IP
  6. trichnosis

    trichnosis Prominent Member

    Messages:
    13,785
    Likes Received:
    333
    Best Answers:
    0
    Trophy Points:
    300
    #6
    How do you know that your site is not indexed by the search engines?

    And what's your URL?
     
    trichnosis, Aug 18, 2007 IP
  7. Gnet

    Gnet Peon

    Messages:
    5,340
    Likes Received:
    529
    Best Answers:
    0
    Trophy Points:
    0
    #7
    Yeah, this happens a lot when the files either return a 404 or the connection times out between the spiders' servers and your site, so no bandwidth is consumed.

    Try submitting a sitemap and monitoring your site from Webmaster Central.
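    If you want to rule out the 404/timeout side yourself, a small script can hit a few of your pages and report what they return. Just a sketch; the example.com URLs are placeholders for pages on your own site:

    Code:
    import urllib.request, urllib.error

    # Sketch: fetch a few pages and report status codes, 404s and connection problems.
    urls = ["http://example.com/", "http://example.com/about.html"]

    for url in urls:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                print(url, "->", resp.status)                # anything other than 200 deserves a look
        except urllib.error.HTTPError as e:
            print(url, "-> HTTP error", e.code)              # e.g. 404: the spider gets nothing to index
        except urllib.error.URLError as e:
            print(url, "-> connection problem:", e.reason)   # timeouts and DNS failures end up here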
     
    Gnet, Aug 18, 2007 IP
  8. TroyM

    TroyM Well-Known Member

    Messages:
    520
    Likes Received:
    3
    Best Answers:
    0
    Trophy Points:
    108
    #8
    Use Google Webmaster Tools, plus a search engine crawler simulator.
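    A crawler simulator is basically just a request sent with a bot's User-Agent so you can see what the spider sees. A minimal sketch, with example.com standing in for your own site:

    Code:
    import urllib.request

    # Sketch: fetch the homepage the way Googlebot would and show what comes back.
    req = urllib.request.Request(
        "http://example.com/",
        headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        print("Status:", resp.status)
        print("Content-Type:", resp.headers.get("Content-Type"))
        print(resp.read(300))   # the first few hundred bytes the bot would see

    If this request comes back with an error or times out while a normal browser visit works, the server is treating bots differently, which is worth raising with your host.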

    Thanks
     
    TroyM, Aug 18, 2007 IP
  9. carboncat

    carboncat Peon

    Messages:
    149
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    0
    #9
    Thanks for your advice, everyone. After some investigating, it seems the host is the problem. I have contacted them with a support request.
     
    carboncat, Aug 18, 2007 IP