Google Webmaster Tools crawl error

Discussion in 'Google' started by lovetospooge, Sep 8, 2011.

  1. #1
    Hi,

In my Webmaster Tools, under the crawl errors section, there are 972 Unreachable URLs. I've removed all of those files from my server. How can I stop Google from trying to crawl those pages?

All the pages look like this:

    /artists/view/lost_in_meditation-298239
    /artists/view/luke_vibert_and_jeanjacques_perrey-399025
    /artists/view/luke_terry_presents_skycatcher-686508
    /artists/view/lukas_vs_alex_tb-765405

    I've tried:

    User-agent: *
    Disallow: /artists/

and it doesn't work. What should I do now?
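
    One way to sanity-check whether that Disallow rule actually matches the unreachable URLs is Python's standard-library robots.txt parser; this is just a local sketch (the domain is a placeholder, not the poster's site):

    ```python
    from urllib.robotparser import RobotFileParser

    # The exact rules from the post; parse() accepts the robots.txt body as lines.
    rules = [
        "User-agent: *",
        "Disallow: /artists/",
    ]
    rp = RobotFileParser()
    rp.parse(rules)

    # Any URL under /artists/ should now be disallowed for every crawler.
    url = "http://example.com/artists/view/lost_in_meditation-298239"
    blocked = not rp.can_fetch("Googlebot", url)
    print(blocked)  # True -> the rule matches, so the robots.txt syntax is fine
    ```

    If this prints True, the rule itself is correct; robots.txt only stops future crawling, though, so already-reported errors can linger in the report until Google recrawls.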

    lovetospooge out
     
    lovetospooge, Sep 8, 2011 IP
  2. katty.will (Well-Known Member)
    #2
    Webmaster Tools has an option where you can tell Google to de-index the URLs.
     
    katty.will, Sep 8, 2011 IP
  3. Jesse12 (Member)
    #3
    Hello

    You can block these URLs with the help of Webmaster Tools.
     
    Jesse12, Sep 8, 2011 IP
  4. lovetospooge (Greenhorn)
    #4
    I know you can use Webmaster Tools, but how do I do it? Please explain what I have to do.
     
    lovetospooge, Sep 8, 2011 IP
  5. fortishospitals (Peon)
    #5
    Please use a robots.txt file on your site so that no search engine crawls those pages for the next few weeks, then verify again; the errors should no longer be there.
     
    fortishospitals, Sep 8, 2011 IP
  6. WishBone (Peon)
    #6
    Are you sure you blocked indexing of these pages in your robots.txt file? Check out this Google support page on disallowing subfolders: Google.com/support/webmasters/bin/answer.py?hl=en&answer=156449&from=40364&rd=1
     
    WishBone, Sep 9, 2011 IP
  7. aileenwuornos (Peon)
    #7
    Block the deleted URLs through Google Webmaster Tools.
     
    aileenwuornos, Sep 9, 2011 IP
  8. simonmathew (Peon)
    #8
    I have done both: removed them through Google Webmaster Tools and blocked them in robots.txt. But the crawl errors section still shows all of the Not Found errors. When will Google remove them? I did all of this a week ago, and Google has updated both since then, but nothing has changed.
     
    simonmathew, Sep 9, 2011 IP
  9. g36 (Peon)
    #9
    Just wait a couple more days. I had a problem like that too; a couple of days later, the error was gone for no apparent reason.
     
    g36, Sep 9, 2011 IP
  10. SEOExpertSydney (Member)
    #10
    Well, one of our webmasters said it right: use robots.txt. Also mark those pages noindex, nofollow, remove them from the Google index with the help of Webmaster Tools, and if possible remove the pages via FTP. :) Hope you understand. Netprro Australia
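
    Since the pages are already deleted, another server-side option is answering those old /artists/ URLs with 410 Gone, which Google generally treats as a stronger removal signal than a plain 404. A minimal sketch using only Python's standard library (the handler name and port are made up for illustration, not part of anyone's actual setup):

    ```python
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class GoneHandler(BaseHTTPRequestHandler):
        """Answer removed /artists/ pages with 410 Gone plus a noindex header."""

        def do_GET(self):
            if self.path.startswith("/artists/"):
                # 410 tells crawlers the page was removed on purpose, for good.
                self.send_response(410)
                self.send_header("X-Robots-Tag", "noindex")
                self.end_headers()
                self.wfile.write(b"410 Gone")
            else:
                self.send_response(200)
                self.end_headers()
                self.wfile.write(b"OK")

    # To run locally (hypothetical port):
    #   HTTPServer(("", 8000), GoneHandler).serve_forever()
    ```

    In practice this logic would live in the real site's routing or .htaccess rather than a standalone server, but the status code and header are the part that matters.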
     
    SEOExpertSydney, Sep 9, 2011 IP
  11. unknownpray (Active Member)
    #11
    If the pages are no longer on your site, you can request their removal through Webmaster Tools. BTW, including the pages in robots.txt can also prevent the crawler from crawling them.
     
    unknownpray, Sep 13, 2011 IP