We have a fairly large site with over 100,000 pages (mostly products). The problem I'm having is Google's 50,000 URL limit per sitemap file. I can generate a sitemap for the whole site, but is there a way to separate it into maybe 2-3 smaller files with fewer than 50,000 URLs each?
A1 Sitemap Generator does this automatically. Check the article about XML sitemap splitting.
That link doesn't work; could you post it again? I also tried A1 Sitemap Generator, but it runs very slowly and takes forever. I'm not sure if the paid version runs faster; I'm still on the trial version.
What's your website? It may just be a configuration issue. Here's the link to the article about XML sitemap splitting again.
As far as your indexing is concerned, I believe you should use the sitemap index format. I have a website that currently has 10,900,000 records indexed in Google. It's a dynamic site where users can create a new page as a profile, so each new page is added to my sitemaps by a cron job, which also pings Google and the other search engines almost daily. If you need further help, do write me a message.
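Roughly, the generation step looks like the sketch below (a minimal Python example, not my exact code; the domain, file names, and the list of URLs are placeholders you'd replace with your own). It splits the URL list into chunks of at most 50,000, writes each chunk to its own sitemap file, and writes a sitemap index that lists them all:

```
# Minimal sketch: split URLs into sitemap files of at most 50,000 entries
# each, then write a sitemap index that points at them. BASE and the
# output file names are placeholders; swap in your own domain and paths.
from datetime import date
from xml.sax.saxutils import escape

MAX_URLS = 50000                      # Google's per-file limit
BASE = "https://www.example.com"      # placeholder domain

def write_sitemaps(urls):
    """Write sitemap-1.xml, sitemap-2.xml, ... plus sitemap-index.xml."""
    chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]
    for n, chunk in enumerate(chunks, start=1):
        with open(f"sitemap-{n}.xml", "w", encoding="utf-8") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for url in chunk:
                f.write(f"  <url><loc>{escape(url)}</loc></url>\n")
            f.write('</urlset>\n')
    # The index file is what you submit to Google; it lists the child sitemaps.
    with open("sitemap-index.xml", "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for n in range(1, len(chunks) + 1):
            f.write(f"  <sitemap><loc>{BASE}/sitemap-{n}.xml</loc>"
                    f"<lastmod>{date.today().isoformat()}</lastmod></sitemap>\n")
        f.write('</sitemapindex>\n')
```

Run something like that from a daily cron job against a fresh dump of product URLs from your database. Generating straight from the database is much faster than crawler-based tools like A1, which have to fetch every page, and you only ever submit sitemap-index.xml to Google.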
If you can share how it's done, that would be great! We also add products to our database constantly, about 200-300 per day, which means 200-300 new links, but there's no way we can keep regenerating sitemaps that frequently, as it took over an hour to generate just 2,000 links with A1.