I have a site with more than 500,000 pages. I created a sitemap for every 10,000 pages, then created a main sitemap (a sitemap index) that lists all of the sitemaps I created, and submitted that main sitemap to Google. Is that right?
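For context, this is roughly how I'm building the main sitemap (index) file. It's just a minimal Python sketch; example.com and the filenames are placeholders for my real ones:

# Minimal sketch: build a sitemap index that lists the per-chunk sitemaps.
# "example.com" and the sitemap filenames below are placeholders.
from datetime import date

sitemap_files = ["sitemap-1.xml", "sitemap-2.xml", "sitemap-3.xml"]  # one per chunk

entries = "\n".join(
    "  <sitemap>\n"
    f"    <loc>http://www.example.com/{name}</loc>\n"
    f"    <lastmod>{date.today().isoformat()}</lastmod>\n"
    "  </sitemap>"
    for name in sitemap_files
)

index_xml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</sitemapindex>\n"
)

with open("sitemap_index.xml", "w", encoding="utf-8") as f:
    f.write(index_xml)

Then I just submit sitemap_index.xml and let the individual sitemaps get picked up from it.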
Yes, that's one good way to do it. I would also rotate the sitemaps so you get more of your pages indexed.
Hi pinoy - you may want to check this - it's a detailed how-to on building sitemaps for large sites. Hope it helps - http://www.dailyseoblog.com/2007/06/how-to-build-a-sitemap-for-blogger-and-large-websites/
The max per XML sitemap file is 50,000 URLs / 10 MB, see: http://www.sitemaps.org/protocol.php
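If you generate the files yourself, the splitting is straightforward. Here's a rough Python sketch (get_all_urls() is hypothetical, standing in for however you pull your URLs) that keeps each file at or under 50,000 URLs and warns if one gets over the 10 MB limit:

# Rough sketch: split a URL list into sitemap files of at most 50,000 URLs each
# and warn if any file exceeds the 10 MB limit from sitemaps.org.
import os
from xml.sax.saxutils import escape

MAX_URLS_PER_SITEMAP = 50000
MAX_BYTES_PER_SITEMAP = 10 * 1024 * 1024

def write_sitemaps(urls):
    filenames = []
    for start in range(0, len(urls), MAX_URLS_PER_SITEMAP):
        chunk = urls[start:start + MAX_URLS_PER_SITEMAP]
        name = f"sitemap-{len(filenames) + 1}.xml"
        with open(name, "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write("</urlset>\n")
        if os.path.getsize(name) > MAX_BYTES_PER_SITEMAP:
            print(f"warning: {name} is over 10 MB, use smaller chunks or gzip it")
        filenames.append(name)
    return filenames

# get_all_urls() is hypothetical; swap in your own database query or crawl output.
# sitemap_files = write_sitemaps(get_all_urls())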
I have a site with around 750,000 pages. I used gSiteCrawler on it, and the software automatically split the sitemap up into Google-sized chunks. I'm sure there are better ways than crawling your own site externally (at least for a site with that many pages), but it worked for me.
YoungPatriot is correct: gSiteCrawler at http://gsitecrawler.com/ will easily create sitemaps for a 750k-page site and check that none of them goes over the 10 MB size limit. You just need to configure gSiteCrawler before you run it. Very easy.
Hmm, I need to try that gSiteCrawler. The PHP script I was using completely bonked out after 100k pages.
You can go up to 50,000 URLs per sitemap (smaller chunks like 10k are fine too), then use a sitemap index to list each sitemap in. There is also a max number of sitemaps per sitemap index file: 50,000, per the protocol.
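Either way, the index limit won't bite at the sizes in this thread; a quick check in Python using the page counts mentioned above:

# Quick check: how many sitemap files do the sites in this thread actually need?
import math

MAX_URLS_PER_SITEMAP = 50000
MAX_SITEMAPS_PER_INDEX = 50000  # per the sitemaps.org protocol

for pages in (500000, 750000):  # page counts mentioned in this thread
    files = math.ceil(pages / MAX_URLS_PER_SITEMAP)
    print(f"{pages} pages -> {files} sitemap files, "
          f"{files}/{MAX_SITEMAPS_PER_INDEX} of the index limit used")

# 500,000 pages -> 10 files and 750,000 -> 15 files, so a single sitemap index
# covers either site with plenty of room to spare.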