I'm new to Google Webmaster Tools, so I hope someone can help me here. My homepage was last crawled yesterday, and Google reports 46 web crawl errors in total to date. 24 of those are URLs restricted by my robots.txt file, which is fine, but the rest confuse me.

For example, I have 10 unreachable URLs (500 error code), and every single one of them refers to an image in my gallery (http://www.hannahmontanazone.com/gallery), but in the form of an RSS feed URL, e.g. http://www.hannahmontanazone.com/gallery?g2_view=rss.SimpleRender&g2_itemId=119. When I click one of those links, it simply opens in my feed reader (Bloglines, in my case), so the feed itself works. How can I stop these from showing up? Surely that isn't a real error. (See the robots.txt sketch at the end of this post for what I was thinking of trying.)

Secondly, http://www.hannahmontanazone.com/feed is reported as a 400 (HTTP error), but it's the same situation: the feed works fine when I open it.

And lastly, I have 10 Not Found URLs, and most of these date back to 24th December! Since then, I have corrected the pages, and they all work fine now. I resubmit my sitemap every time I update the blog, but it hasn't caught up with these Not Found URLs. What should I do?
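On the first issue: is blocking those feed URLs in robots.txt the right fix? Here's a sketch of what I was thinking of adding, assuming Googlebot honours * wildcards in Disallow rules (I believe it does, but please correct me if not):

User-agent: Googlebot
# keep Googlebot away from the Gallery2 RSS render URLs
# (anything with g2_view=rss in the query string)
Disallow: /*g2_view=rss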
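And for the Not Found URLs: if Google keeps requesting the old addresses even though the pages are fixed, would 301 redirects in my .htaccess be the way to tell it where they went? Something like this, where both paths are made-up placeholders rather than my real URLs:

# permanently redirect an old, now-missing address to its corrected replacement
Redirect 301 /old-broken-page http://www.hannahmontanazone.com/corrected-page

Or will the errors just clear on their own the next time Google recrawls?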