I am trying to generate a sitemap for my site. I have a query similar to:

```php
$result = @mysql_query("SELECT DISTINCT song_name FROM songs ORDER BY song_name");
```

However, it gives me a 500 Server Error when I try to open the page. If I limit the result set to, say, 600 results, everything works fine. Is there any way around this? How do you handle really large result sets? Thanks.
You might try removing your ORDER BY clause. You can always re-sort the data once you've grabbed the recordset. I use ASP, but there should be a similar function available in whatever language you're working in. If I put in a line like recordsetname.Sort = "field1 ASC, field2 ASC" before I go into a loop, I can define the order of the recordset on the fly, outside of my DB connection. I don't know for sure whether this will help here, but I do know that sorting the records can add a lot of time to a DB operation (or it can). I've seen ORDER BY clauses add many seconds to a query.
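Something like this would be the PHP version of that idea; just a sketch, reusing the query from the first post with the ORDER BY dropped:

```php
// Rough PHP equivalent of the ASP recordset.Sort approach:
// fetch the rows unsorted, then sort them in the script.
$result = mysql_query("SELECT DISTINCT song_name FROM songs"); // no ORDER BY
$names = array();
while (($row = mysql_fetch_row($result)) !== false) {
    $names[] = $row[0];
}
sort($names); // the sorting now happens in PHP instead of in MySQL
```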
Most likely it is not the query that fails, but the code following the query (BTW, outputting the actual error would help you figure out what's wrong). Don't pull the result into one big array (it may be quite large); instead, loop through the result set row by row and build the page as you go:

```php
while (($row = mysql_fetch_row($result)) !== false) {
    for ($index = 0; $index < count($row); $index++) {
        // mysql_field_name() takes the field offset, not the field value
        printf("%s: %s; ", mysql_field_name($result, $index), $row[$index]);
    }
}
```

As for re-sorting on the client side: always a bad idea. The DB server has all the right things in place to sort the data efficiently (sometimes even without pulling the entire result set into memory, if the indexes are set up properly), while on the client side you will have to hold the entire array in memory and waste your CPU cycles on what could have been avoided in the first place.

J.D.
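As an aside, if the ORDER BY on song_name is genuinely slow, an index may help; a one-time sketch (idx_song_name is a name I made up):

```php
// Assumption: with an index on song_name, MySQL can often satisfy
// ORDER BY song_name by reading the index in order, avoiding a
// separate filesort pass over the whole result set.
mysql_query("ALTER TABLE songs ADD INDEX idx_song_name (song_name)");
```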
That might well be the case. However, I cannot output the actual error, since nothing gets printed to the screen at all (hence the 500 Server Error). What I am doing is looping through the data row by row (as your code suggests), but appending each row to the end of a string, something like:

```php
$sitemap = '';
while (($row = mysql_fetch_row($result)) !== false) {
    $sitemap = $sitemap . $row[0];
}
```

Then, at the end, I write this string to a file (it ends up being about 1 MB). Is there maybe a better way to do this?
Why not something like:

```php
echo '<table>';
while (($row = mysql_fetch_row($result)) !== false) {
    echo '<tr><td>' . $row[0] . '</td></tr>';
}
echo '</table>';
```

That way you're not building up a huge variable; it gets echoed right away.
Well, since I am generating a sitemap, that will not really work. I want to write the data to a file with a .xml extension so that I can submit it to Google, and therefore I must have the data in a variable for fwrite($file, $sitemap). I know this can be done using mod_rewrite, but that is also not what I am looking for: Google might download this file many times a day, and I don't want to have to regenerate it every time Google requests it.
But surely you can write the XML in chunks, rather than as one big bulk variable? Just point the file pointer at the end of the file and append to what's already there. To avoid opening the file too often, you could build up a variable over, say, 100 loop iterations and then write it out. That should cut the size down sufficiently.
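Roughly like this (a sketch only; the file name and the per-row content are placeholders):

```php
// Open the file once in append mode, buffer rows in a string,
// and flush the buffer to disk every 100 rows.
$handle = fopen("sitemap.xml", "a"); // "a" = append to the end
$buffer = '';
$count  = 0;
while (($row = mysql_fetch_row($result)) !== false) {
    $buffer .= $row[0] . "\n";
    $count++;
    if ($count % 100 == 0) {
        fwrite($handle, $buffer);
        $buffer = '';
    }
}
fwrite($handle, $buffer); // flush whatever is left over
fclose($handle);
```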
Can you please post an example of how to do that? What I have currently:

```php
$file = fopen($filename, "w");
// write string to file
fwrite($file, $sitemap);
// close file
fclose($file);
```

This rewrites the file each time, but what I seem to need is something that appends the data. But are you sure storing the data in a string is the problem here? I'm not an expert with PHP, but in other languages (e.g. Java, Delphi) you can store 2 GB in a string with very little performance impact, and this string is only 1 MB in size.
Change the mode to "a" or "a+", depending on what you need. See more here: http://ca3.php.net/manual/en/function.fopen.php
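Applied to the snippet above, that is just the following ($sitemap_chunk is a placeholder for whatever you buffered up):

```php
// "a" opens the file for writing with the pointer at the end,
// so each fwrite() appends instead of overwriting.
$file = fopen($filename, "a");
fwrite($file, $sitemap_chunk);
fclose($file);
```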
Well, without the exact error it's hard to say what's wrong. You say that you can write smaller amounts of data to the file, which rules out permissions. I will put together a small example after lunch if you haven't resolved the problem by then.
Well, after making the changes you suggested, it now works. I now loop through the items and write 50 items at a time to the file. The problem was probably not storing all the data in the string, but writing all of that data to the file at once. Thanks for all the help.
DP to the rescue! Glad you sussed it. That's exactly why I nominated J.D. for Sandwalker's free PR7 link.
Glad it worked. I put together a small sample in almost pseudo-code (the actual code will depend on the structure of your result set). The get_url_priority function will be useful if you'd like to give different weight to different pages of your site; the same goes for the other variable parts, changefreq and lastmod.

```php
// $urls: an array of page URLs pulled from your result set
$handle = fopen("/tmp/sitemap/sitemap.xml", "w");
fwrite($handle, "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n");
fwrite($handle, "<urlset xmlns=\"http://www.sitemaps.org/schemas/sitemap/0.9\">\n");
for ($index = 0; $index < count($urls); $index++) {
    fwrite($handle, "<url>\n");
    fwrite($handle, sprintf("<loc>%s</loc>\n", $urls[$index]));
    fwrite($handle, sprintf("<priority>%0.1f</priority>\n", get_url_priority($urls[$index])));
    // etc, etc, etc - changefreq, lastmod
    fwrite($handle, "</url>\n");
}
fwrite($handle, "</urlset>\n");
fclose($handle);
print("Done");
```

Each <url> element wraps a single page's <loc> and <priority>, and the whole list sits inside one <urlset>. This code writes to a temporary directory, /tmp/sitemap/, so yours will be different, depending on the permissions you have been assigned. You shouldn't be able to write directly into your web root (if you can, your security may be compromised).

J.D.
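For reference, the finished file should come out looking roughly like this (the URL and priority here are made-up examples):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/songs/some-song.html</loc>
    <priority>0.5</priority>
  </url>
</urlset>
```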