Hello friends, I need help with a crawl issue. I've tried almost everything I know: I submitted removal requests through Webmaster Tools and used robots.txt to block the URLs from Google's crawl. When I search "site:domain.com" to check my indexed pages, I still find around 900+ pages; previously it was around 1500+. I'm fed up with this problem now. Can anyone suggest another technique for getting these URLs removed from the search engine? One more thing: I've never seen a robots.txt file get indexed before, but in my project it has been, and it shows up as a link in the search above. Any idea or suggestion would be appreciated. Thanks in advance.
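One thing worth checking: robots.txt only blocks crawling, it does not remove pages that are already indexed (Google can keep showing a URL it already knows about even if it can no longer crawl it). To get pages dropped from the index, they generally need a noindex directive that Google is still allowed to crawl and see. A minimal sketch, assuming the pages live under a hypothetical /old-section/ path on your site:

```
# robots.txt — blocks future crawling, but does NOT
# remove already-indexed URLs by itself
User-agent: *
Disallow: /old-section/
```

```
<!-- On each page you want removed from the index,
     add this in <head> and do NOT block it in robots.txt,
     so Googlebot can crawl the page and see the directive -->
<meta name="robots" content="noindex">
```

Note the catch: if robots.txt blocks a page, Googlebot can't crawl it to read the noindex tag. A common approach is to allow crawling with the noindex tag in place until the pages drop out of the index, and only add the robots.txt Disallow afterwards.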
Hope this helps: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=164734 — it's from the Google Help Center.