Hi, in my Webmaster Tools account, under the crawl errors section, it lists 972 Unreachable URLs. I've removed all those files from my server. How can I stop Google from trying to crawl those pages? All the pages look like this:

/artists/view/lost_in_meditation-298239
/artists/view/luke_vibert_and_jeanjacques_perrey-399025
/artists/view/luke_terry_presents_skycatcher-686508
/artists/view/lukas_vs_alex_tb-765405

I've tried:

User-agent: *
Disallow: /artists/

and it doesn't work. What should I do now?

lovetospooge out
Use a robots.txt file on your site so that no search engine crawls those pages for the next few weeks, then verify in Webmaster Tools; the errors shouldn't stay there much longer.
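If you really do want to block all crawling for a while, the file at the site root would look like this (though blocking only the /artists/ folder, as the original post already tried, is usually enough):

User-agent: *
Disallow: /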
Are you sure you've actually blocked these pages in your robots.txt file? Check out this Google support page on disallowing subfolders: Google.com/support/webmasters/bin/answer.py?hl=en&answer=156449&from=40364&rd=1
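One way to sanity-check the rule offline is Python's standard urllib.robotparser. This is just a sketch; example.com stands in for your real domain:

from urllib import robotparser

# The exact rules from the original post.
rules = """\
User-agent: *
Disallow: /artists/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# One of the URLs reported under crawl errors.
url = "http://example.com/artists/view/lost_in_meditation-298239"
print(rp.can_fetch("*", url))  # False -> the rule does block this path

If this prints False, the rule itself is fine, and the errors are lingering for another reason.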
I have done both: removed the URLs from Google through Webmaster Tools and blocked them in robots.txt. But the crawl errors section still shows all the Not Found errors. When will Google remove them? I did all of this a week ago, and Google has updated both since then, but nothing has changed.
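One thing worth checking: once robots.txt blocks /artists/, Googlebot can no longer fetch those URLs to confirm they are gone, so old errors can linger. It's also worth confirming the deleted pages really return 404 (or 410). A small sketch, assuming Python 3, with example.com again as a placeholder for your domain:

from urllib.request import urlopen
from urllib.error import HTTPError

# A few of the removed URLs from the original post.
paths = [
    "/artists/view/lost_in_meditation-298239",
    "/artists/view/luke_vibert_and_jeanjacques_perrey-399025",
]

for path in paths:
    url = "http://example.com" + path
    try:
        status = urlopen(url).getcode()
    except HTTPError as e:
        status = e.code  # 404 or 410 is what you want to see here
    print(status, path)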
Just wait a couple more days. I had a problem like that too; a couple of days later, the errors were gone for no apparent reason.
Well... one of our webmasters said it right: use robots.txt. Also mark those pages noindex, nofollow (for example, <meta name="robots" content="noindex, nofollow"> in the page head), remove them from Google's index with the Webmaster Tools removal tool, and if possible delete the pages themselves over FTP. That should clean things up. Hope you understand... Netprro Australia
If the pages are no longer on your site, you can always request their removal from Google's index. BTW, listing the pages in robots.txt can also prevent the crawler from crawling them.