Please tell me how I can fix these crawl errors. Reason: URL restricted by robots.txt (514). How do I fix this?
You have accidentally or deliberately disallowed bots in your robots.txt file. Allow the bots in robots.txt for the URLs showing crawl errors.
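For example, if your robots.txt currently contains a blanket block like this (a minimal sketch; your real file may disallow specific paths instead):

    User-agent: *
    Disallow: /

changing it to the following allows all bots to crawl everything:

    User-agent: *
    Disallow:

An empty Disallow line means nothing is blocked. If you only need to unblock certain URLs, remove just the Disallow lines that match those paths.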
Isn't it better to delete the robots.txt file to get rid of all such errors, if you don't want to block search engine robots from visiting any URL on your site?
You can fix this error through Google Webmaster Tools. If you can't, use the Help option and post in the Google Webmaster forum.
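After editing the file, it can help to confirm what crawlers actually see by fetching the live file yourself (example.com here is a placeholder for your own domain):

    curl -s https://www.example.com/robots.txt

Note that Google caches robots.txt, so it may take up to a day or so for the crawl errors to clear after you publish the change.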