I can't find a solution for indexing large sites. Are there any industrial-strength XML sitemap generators out there?
You can generate one yourself if you know PHP. Read up on the sitemap file structure, or search Google for a PHP sitemap generator.
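The sitemap structure really is simple enough to emit by hand. As a rough sketch (in Python rather than PHP, and with a placeholder URL list standing in for whatever your site actually serves), a minimal generator is just string output:

```python
# Minimal sitemap writer -- a sketch, not tied to any real site.
# The URL list below is a stand-in for your own page list.
urls = [
    "http://www.example.com/",
    "http://www.example.com/page1.html",
]

lines = ['<?xml version="1.0" encoding="UTF-8"?>']
lines.append('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
for url in urls:
    lines.append("  <url>")
    lines.append("    <loc>%s</loc>" % url)
    lines.append("  </url>")
lines.append("</urlset>")

sitemap = "\n".join(lines)
with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```

The same loop translates almost line-for-line into PHP with `fwrite()`.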
Hmm... I have had the same problem with my hotappz.com, so I wrote a PHP script that simply writes out a sitemap.xml file. Let me know if I can be of any help.
Hi there, most of my sites are 500,000+ pages, so I have had similar problems. Two tools stand out as being pretty decent: GSiteCrawler at http://gsitecrawler.com/, although it's a little slow, and Brian Pautsch's crawler at http://www.brianpautsch.com/, which is damned fast. Worth noting that in a week or so Brian will have a new version out based on some of my requirements, but on the whole it's probably the fastest crawler and sitemap maker; it converts a 500,000-URL Google sitemap to Yahoo format in under a minute. Hope this is of help. Dave
Google's own generator can probably handle that many URLs: http://www.google.com/webmasters/sitemaps/docs/en/sitemap-generator.html It requires Python. Boby
If you have a site with 100,000+ pages, can I presume that most of them are database-generated? If so, why not write a short app that creates the sitemap straight from the database?
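That approach can be very little code. A minimal sketch in Python, using an in-memory SQLite table as a stand-in for the real database (the table name, columns, and URL pattern are all assumptions to adapt to your own schema):

```python
import sqlite3

# Stand-in for the site's real database: a table of page slugs
# and last-modified dates.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE pages (slug TEXT, updated TEXT)")
conn.executemany("INSERT INTO pages VALUES (?, ?)",
                 [("about", "2007-01-01"), ("contact", "2007-01-02")])

# One <url> entry per database row.
entries = []
for slug, updated in conn.execute("SELECT slug, updated FROM pages"):
    entries.append(
        "  <url><loc>http://www.example.com/%s</loc>"
        "<lastmod>%s</lastmod></url>" % (slug, updated))

sitemap = ('<?xml version="1.0" encoding="UTF-8"?>\n'
           '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
           + "\n".join(entries) + "\n</urlset>")
with open("sitemap.xml", "w") as f:
    f.write(sitemap)
```

Since it reads straight from the database, there is no crawl at all, which is why this scales to hundreds of thousands of pages where a crawler struggles.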
Well, I just released a new version of my sitemap generator software. 100,000 URLs should be no problem at all (for the newest version, anyway). The only thing that could jinx it is that I decided to build and upload the software past midnight.
I also use the http://gsitecrawler.com/ generator. It is slow, but it does a very good job. I haven't tested its upper limits, but I do 22k URLs all the time without any problems. It supports most of the output formats as well. Jim Catanich
I use sitemapGenerator.jnlp; it works for any site, but if you have more than 50,000 URLs you'd better break the sitemap into pieces for indexing purposes, since the sitemap protocol caps each file at 50,000 URLs.
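Splitting past that cap is mechanical: write the URLs out in chunks of at most 50,000 per file, then tie the files together with a sitemap index. A sketch (URL list and filenames are placeholders):

```python
# Split a large URL list into sitemap files of at most 50,000 URLs each
# (the per-file limit in the sitemaps.org protocol) and write a sitemap
# index that points at them. All URLs here are placeholders.
MAX_URLS = 50000
urls = ["http://www.example.com/page%d.html" % i for i in range(120000)]

chunks = [urls[i:i + MAX_URLS] for i in range(0, len(urls), MAX_URLS)]

index_entries = []
for n, chunk in enumerate(chunks, 1):
    name = "sitemap%d.xml" % n
    body = "\n".join("  <url><loc>%s</loc></url>" % u for u in chunk)
    with open(name, "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
                + body + "\n</urlset>")
    index_entries.append(
        "  <sitemap><loc>http://www.example.com/%s</loc></sitemap>" % name)

# The index file is what you actually submit to the search engines.
index = ('<?xml version="1.0" encoding="UTF-8"?>\n'
         '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
         + "\n".join(index_entries) + "\n</sitemapindex>")
with open("sitemap_index.xml", "w") as f:
    f.write(index)
```

You then submit only sitemap_index.xml; the engines fetch the individual pieces themselves.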