I have checked the crawl statistics in Google Sitemaps and found that a few pages showing "Directory Listing Denied" have been crawled by Google, and according to the statistics those pages are appearing for different keywords. How do I prevent these pages from being crawled? Is it necessary to prevent this? If not, will it hamper my page indexing? Help needed.
Use robots.txt to exclude Googlebot from the pages you don't want crawled. http://www.mattcutts.com/blog/new-robotstxt-tool/
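A minimal sketch of what that robots.txt could look like, assuming (just as an example, since you haven't said) that the "Directory Listing Denied" pages live under an /images/ directory:

User-agent: Googlebot
Disallow: /images/

Use "User-agent: *" instead if you want to block all crawlers, and swap in whatever directories are actually showing the denied listing on your site.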
http://www.google.com/search?hl=en&q=http+header+check&btnG=Google+Search Try some of those to make sure. As thomasschulz said, if it's an error page that doesn't send the right headers, Google doesn't have any real way to know what is and isn't an error page. If those pages aren't sending the right headers and you don't know how, or don't care, to modify them, use robots.txt. And keep in mind that Google doesn't know how a URL will respond (i.e. whether it's an error page or not) until it actually tries to crawl it. It shouldn't add it to the index, though, if the headers are correct.
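If you'd rather check a URL yourself than use one of those header-check sites, here's a rough sketch in Python (the URL below is a hypothetical placeholder, put in one of your own "Directory Listing Denied" pages) that prints the status code the server actually sends:

import urllib.request
import urllib.error

url = "http://www.example.com/images/"  # hypothetical; use one of your denied-listing URLs

try:
    response = urllib.request.urlopen(url)
    # A 200 here means the server is telling crawlers the page is fine,
    # so the "Directory Listing Denied" text gets indexed like a normal page.
    print(response.status, response.reason)
except urllib.error.HTTPError as e:
    # A 403/404/410 here tells Google not to keep the URL in the index.
    print(e.code, e.reason)

If those pages come back as 200, that's why Google is indexing them.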
If the page is not there anymore, I just use a 404 error. Google understands that. A 301 should only be used if that page MOVED PERMANENTLY somewhere. For example, my page www.site.com/games.html moves to www.site.com/games/crazy/action/games.html
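For that moved-page case, a quick sketch of the 301 in an Apache .htaccess file (assuming you're on Apache, and reusing the example URLs above):

Redirect 301 /games.html http://www.site.com/games/crazy/action/games.html

For pages that are simply gone, skip the redirect and let the server return a plain 404.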