I have deleted the robots.txt file from the WP directory (I have a backup of it), but I'm still getting crawl errors under "Restricted by robots.txt", and the number of errors keeps increasing. Why is that?
The "Restricted by Robots.txt" error in Google Webmaster Tools has nothing to do with your robots.txt file. If you have checked and found no error then I would recommend you put the old robots.txt back to your root WP folder. It does happen to every "nofollow" links on your website. You have nothing to worry about anyway.