I have a dynamic site of used and new cars in Australia (eurodb) with thousands of URLs. I submitted six XML sitemap files with 10,000 URLs each, 60,000 URLs in total. The problem is that the Google crawler has not crawled most of my site's URLs yet: Webmaster Tools shows only 4,413 URLs indexed by Google, a figure that has stayed static for a long time, and Google hasn't crawled the rest. Can anybody tell me why this happens and how I can get more of my sitemap URLs indexed?
You can use an XML dynamic sitemap generator; there are PHP-based generators that produce Google XML, RSS, HTML, and plain-text sitemaps.
Pinging the XML sitemap is not only for blogs; it is for anybody who needs to notify Google that the sitemap has changed and should be scanned again. I think pinging is a good place to start. Another thing to note is that the indexed-URL count in Webmaster Tools is per sitemap: it only means Google selected that number of URLs to index from that particular sitemap file. It may also be a good idea to break the URLs down by last-changed time and add the sitemap files to a sitemap index with the proper change time on each entry. That way Google will know what changed when it scans your sitemap index.
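To illustrate the sitemap-index idea: a minimal sketch of an index that groups the six sitemap files by last-modified date, per the sitemaps.org protocol. The domain, filenames, and dates here are hypothetical placeholders, not your actual URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical sitemap index: each <sitemap> entry points to one of the
     10,000-URL sitemap files, with <lastmod> set to when that file last changed,
     so Google can skip files that have not changed since the previous crawl. -->
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.example.com/sitemap-cars-1.xml</loc>
    <lastmod>2013-05-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://www.example.com/sitemap-cars-2.xml</loc>
    <lastmod>2013-05-14</lastmod>
  </sitemap>
  <!-- ...remaining sitemap files... -->
</sitemapindex>
```

You then submit only the index file in Webmaster Tools, and after updating it you can ping Google by fetching `http://www.google.com/ping?sitemap=` followed by the index URL (the standard sitemap ping endpoint).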