I have just created a set of sitemaps for my new website: 20 files, each holding 25,000 records, for a total of 500,000. I uploaded them to my server; in total it's about 110 MB. I tried to open one of the sitemap files (XML) in my browser and my PC basically locked up.

I wanted to know if Google will have this problem opening the files. They say the files can't be more than 10 MB or hold more than 50,000 records, which mine don't, but if I can't open them, it doesn't seem logical that they'll be able to open them with ease. And is there any chance it will slow down my server each time Google starts to download them? Has anyone with 500K+ pages had issues with sitemaps?

Also, I wrote a script to build the sitemaps dynamically, and Google has ticked them off as OK. Cheers
Any file that large will take time to load on a PC. Google doesn't render the file in a browser, so it won't hit the overhead Microsoft's IE has. If Google likes them, great.
It's true that loading such a large file in Firefox/IE may cause problems (really, I'd say it's a browser problem... crappy if it locks your PC), but Google won't have that issue. After all, they're the ones who originally specified the protocol and allowed XML sitemaps to be that large.
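One thing worth adding: the sitemap protocol explicitly allows gzipped sitemap files, which would shrink that 110 MB considerably and ease the server-load worry. Since the original poster is already generating sitemaps with a script, here's a minimal sketch of splitting a URL list into gzipped sitemap files plus an index (the `example.com` URLs, file names, and the 25,000-per-file chunk size are just placeholders matching the setup described above):

```python
import gzip
from xml.sax.saxutils import escape

MAX_URLS_PER_FILE = 25_000  # well under the protocol's 50,000-record limit

def write_sitemaps(urls, prefix="sitemap"):
    """Split a URL list into gzipped sitemap files; return the file names."""
    names = []
    for i in range(0, len(urls), MAX_URLS_PER_FILE):
        chunk = urls[i:i + MAX_URLS_PER_FILE]
        name = f"{prefix}-{i // MAX_URLS_PER_FILE + 1}.xml.gz"
        body = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            + "".join(f"  <url><loc>{escape(u)}</loc></url>\n" for u in chunk)
            + "</urlset>\n"
        )
        # gzip.open in text mode compresses as it writes
        with gzip.open(name, "wt", encoding="utf-8") as f:
            f.write(body)
        names.append(name)
    return names

def write_index(names, base="http://www.example.com/", out="sitemap-index.xml"):
    """Write a sitemap index pointing at the individual sitemap files."""
    with open(out, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for n in names:
            f.write(f"  <sitemap><loc>{escape(base + n)}</loc></sitemap>\n")
        f.write("</sitemapindex>\n")
```

You then submit only the index to Google and let it fetch the individual gzipped files itself.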