Google couldn't crawl

Discussion in 'Google' started by sahirfarid, Oct 30, 2012.

  1. #1
    Dear all!

    Last week, my website was down for about 2 days due to a problem with my web hosting server. After I transferred to another server, it has been up for the last 3 days, but in Webmaster Tools I received the following error:

    Google couldn't crawl your site because we were unable to access the robots.txt file.

    Now, what should I do? Any ideas or suggestions would be welcome.
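    Before waiting or resubmitting, it can help to confirm that the robots.txt rules themselves are well-formed and don't block Googlebot. A minimal sketch using Python's standard-library parser (the sample rules and example.com URL are assumptions for illustration, not the actual site's file):

    ```python
    # Sanity-check a robots.txt body with the stdlib parser.
    from urllib.robotparser import RobotFileParser

    # Assumed sample rules -- replace with the contents of your own robots.txt.
    rules = """\
    User-agent: *
    Disallow: /private/
    Allow: /
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    # True: Googlebot may crawl the homepage.
    print(rp.can_fetch("Googlebot", "https://example.com/"))
    # False: paths under /private/ are disallowed.
    print(rp.can_fetch("Googlebot", "https://example.com/private/page.html"))
    ```

    If the file parses and allows Googlebot, the "unable to access" error is usually a server-side issue (the file returning a 5xx error or timing out on the new host) rather than a problem with the rules.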

    Regards
     
    sahirfarid, Oct 30, 2012 IP
  2. ProSence

    ProSence Greenhorn

    Messages:
    235
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    18
    #2
    It can take time; wait at least 15 days...
     
    ProSence, Oct 31, 2012 IP
  3. Philvault

    Philvault Active Member

    Messages:
    1,284
    Likes Received:
    13
    Best Answers:
    0
    Trophy Points:
    80
    #3
    Delete and resubmit your robots.txt
    And yeah, it may take time for Google to recrawl your site, but eventually it will.
     
    Philvault, Oct 31, 2012 IP
  4. roxenwebs

    roxenwebs Member

    Messages:
    32
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    36
    #4
    You don't need to modify your code, and you don't need to resubmit.
    Just keep updating your site and ping all the search engines, and if you have a feed, submit it everywhere you can.
    If you want to see what a correct robots.txt file looks like, check this one:
    http://seriousdatings.com/robots.txt
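    For reference, a minimal robots.txt that allows all crawlers full access looks like this (a generic sketch, not a copy of the file linked above):

    ```
    User-agent: *
    Disallow:
    ```

    An empty Disallow line means nothing is blocked. The file must live at the site root (e.g. /robots.txt) and return an HTTP 200 for Google to read it; a timeout or server error on that URL is what triggers the "unable to access" message.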
     
    roxenwebs, Oct 31, 2012 IP