Hello all, I have something of a problem I was hoping the gods of the web that reside here could help me with. I'm making my sitemap for Google, and my site is rather large (2 million+ pages). When I run my Python script it starts off without a hitch and works beautifully, that is, until it hits sitemap54.xml.gz... then without fail it crashes. Below is the message I get. (I cut the file path down to save space, as you don't need to see the huge file path it goes through.) Anyone have any insight or workarounds for how I can free up the memory that is apparently getting gummed up by this process? Any help is gratefully appreciated! THANK YOU!
Well, at this point I'm certainly open to alternative suggestions, haha, since I'm not finding much info on how to combat this problem.
I have fixed sitemap_gen.py so it no longer consumes an enormous amount of memory and Python does not die with a MemoryError. Look here: http://www.bashkirtsev.com/2009/05/14/sitemap/