I have an account in Google Webmaster Tools and I use it to track my blog. In the errors section, there are 3 web pages that Googlebot did not find (a 404 error for Googlebot), according to the account. But if I check those web pages, they very much exist, and the URLs are the same. Why is this happening?
Maybe it has a robots.txt exception? Have you used that to prevent robots from crawling certain pages?
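For context, a rule like this in robots.txt is the kind of thing that would keep Googlebot away from certain pages (the path here is just a made-up example, not taken from the asker's site):

```
User-agent: *
Disallow: /private/
```

Note, though, that a robots.txt block usually shows up differently in Webmaster Tools than a plain 404 does, so it's worth checking what the error report actually says.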
Sometimes pages just time out when being crawled; this can happen with second-rate shared hosting, and you always get what you pay for. Once Google encounters a 404, especially on the first crawl attempt, it can take a while before it tries again. Just give it time.
Agreed with ssandecki, it can take time for the crawls to come around. But make sure everything in the robots.txt is correct for a crawl. And try submitting your URL to freewebsubmission.com. In my experience the bots come out within 15 minutes to 3 hours after submission.
ssandecki is right. I suppose it can happen due to that. There has been no change in the permalink structure. If it keeps increasing, I will change hosts.