My site shows 100+ "not found" crawl errors in Google Webmaster Tools. When I checked, all of them were from the site's previous content, which was deleted. I think Google is crawling a cached copy of the site. I tried blocking the old folder that Google was crawling via robots.txt, but it is still crawling it when I check. Please help. Thanks.
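For reference, this is the kind of robots.txt rule I mean; the directory name `/old-content/` here is just a placeholder for the deleted folder:

```
# Block all crawlers from the deleted directory
User-agent: *
Disallow: /old-content/
```

Note that robots.txt only stops future crawling; it does not remove URLs that are already in the index.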
You need to remove the old pages from Google's index; since they were indexed before your robots.txt changes, Googlebot will still attempt to crawl them. Just use the "Remove URLs" tool in the tools section of Google Webmaster Central.
Thanks. The removal request is still pending. I chose directory removal because that is where most of the errors are, and I will remove the others next if the results are good. Will keep you guys updated. Thank you once again.
I removed the whole directory, but so far it's still the same: the crawler is still finding the errors. The robots.txt only cleared three of the errors in that directory, even though it disallows the whole directory. Is there another option? Do I have to remove and resubmit the whole site? It's ranking quite well, so I'm not confident about doing something that drastic. Thanks.