I recently submitted an XML sitemap through my Google Webmaster Tools account. The format I selected was the code search sitemap. It took almost 7 days to index just 30 of the 100 pages in one of my directories. I want to know which format helps get pages indexed faster, a general sitemap or a code search sitemap. Can anyone spare a little time to explain this to me in detail? Thanks.
A general sitemap is best. Use the gSiteCrawler tool to create it; it's faster, easier, and less hassle. Place the XML sitemaps in the root of your site and resubmit them via WMT as you are doing. That's the best you can do.
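For reference, a general sitemap is just a plain list of URLs in the standard sitemaps.org format, one <url> entry per page; the URL, date and values below are only placeholders:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry for each page you want crawled -->
  <url>
    <loc>http://www.example.com/some-page.html</loc>
    <lastmod>2009-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

Tools like gSiteCrawler generate exactly this kind of file for you, so you rarely need to write it by hand.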
Well, Google has indexed 30 pages in 1 week! That's not bad! I don't believe the sitemap format has any influence on the rate at which Google's spiders crawl. I use a general XML sitemap and I get the same kind of results. You can change the crawl rate at "Dashboard - Tools - Set crawl rate" in Webmaster Tools, but in practice the only thing I can do there is lower the crawl rate, not raise it. I believe it's the same for you.
Unless, of course, your website only has pages with source code in C++, Delphi, Java or whatever. In that case you will want to use a code sitemap, which Google made for websites/pages that list or show source code files for various programming languages. Imagine you are trying to search for a specific PHP/Java/whatever source code file... Code sitemaps are for websites that list and show complete source code files, as sketched below.
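From memory, a code search sitemap is a normal sitemap entry with an extra codesearch block declaring the file type (and optionally the license); check Google's own documentation for the exact schema, as the namespace, URL and values here are just an illustration:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:codesearch="http://www.google.com/codesearch/schemas/sitemap/1.0">
  <url>
    <loc>http://www.example.com/src/parser.cpp</loc>
    <codesearch:codesearch>
      <!-- tells Google Code Search which language the file is in -->
      <codesearch:filetype>c++</codesearch:filetype>
      <codesearch:license>gpl</codesearch:license>
    </codesearch:codesearch>
  </url>
</urlset>

So unless your pages really are raw source files like that, stick with the general sitemap.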