Hi, I blocked a page with robots.txt on my WordPress blog, but I have an automatic sitemap generator and it lists that page in the .xml sitemap file along with my posts. According to Google Webmaster Tools this is a crawl error. How can this affect SEO or PageRank? Any advice? Should I ignore this message?
If the page is blocked by your robots.txt file, then it isn't going to be indexed, period. The crawl error shows up because your sitemap points Google at a URL that robots.txt tells it not to crawl, so the two files contradict each other. If you want the page to be indexed, remove the block from your robots.txt file; otherwise, keep the page out of the generated sitemap so they agree. Also make sure you're using the sitemap autodiscovery directive (the "Sitemap:" line) in your robots.txt file as well.
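As a rough sketch of what that looks like, assuming the blocked page lives at /private-page/ and your generated sitemap is at https://example.com/sitemap.xml (both hypothetical, substitute your own paths):

User-agent: *
Disallow: /private-page/

Sitemap: https://example.com/sitemap.xml

The Disallow line is what keeps the page from being crawled, and the Sitemap line is the autodiscovery directive mentioned above. If you drop the Disallow line, the sitemap entry stops producing the crawl error.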