Hi, I am trying to read the contents of a MySQL table into an array. The problem is that the table is very large and it takes a long time. Does anybody have any suggestions for reducing the time? The table has 200K rows and is 42MB. Here is the code I use; it reads the table into a multidimensional array:

```php
$sql = "SELECT * FROM myTable";
$result = mysql_query($sql) or die(mysql_error());
while ($result_array = mysql_fetch_assoc($result)) {
    while (list($field, $value) = each($result_array)) {
        $value = stripslashes($value);
        $value = str_replace("\n", "<br/>", $value);
        $type[$field][] = $value;
    }
}
```
How long does the actual query take? I would guess that populating the array takes a lot longer than the query itself. Just curious, why are you trying to build a PHP array with 200k rows? There might be an easier way to do whatever it is you are trying to do.
You can speed up the query by adding SQL_BIG_RESULT and SQL_CACHE to it:

```sql
SELECT SQL_BIG_RESULT SQL_CACHE * FROM myTable
```

SQL_BIG_RESULT tells the optimizer to expect a large result set, and SQL_CACHE stores the result set in the query cache (this helps subsequent identical queries, as long as the table hasn't been modified in between). BUT the query is the quick part of your script; I'd suspect the PHP side is what's making it slow. You can give PHP more memory (use ini_set() to raise the memory_limit setting), and you can optimize your while loop: drop the call to list()/each() and iterate the row array directly, and try nl2br() instead of str_replace() and see if it's faster.
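To make that concrete, here is a minimal sketch of the tightened loop, assuming the same myTable and the same $type result array as in the original post. A foreach avoids the list()/each() call overhead, and nl2br() replaces the manual str_replace():

```php
<?php
// Sketch only: assumes an open mysql_* connection and the OP's myTable.
$sql    = "SELECT SQL_BIG_RESULT SQL_CACHE * FROM myTable";
$result = mysql_query($sql) or die(mysql_error());

$type = array();
while ($row = mysql_fetch_assoc($result)) {
    // foreach iterates the fetched row directly, no list()/each() needed
    foreach ($row as $field => $value) {
        // nl2br() inserts <br /> before each newline; note it keeps the
        // newline character, unlike the original str_replace() which
        // removed it -- harmless for HTML output
        $type[$field][] = nl2br(stripslashes($value));
    }
}
?>
```

One caveat on the hedge above: nl2br() leaves the "\n" in place after the inserted tag, so the strings differ slightly from the str_replace() version, though they render identically in a browser.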
Hello. Why do you need to select 200k rows? Surely you can't be using all this on one page. If you do then you probably need to think about adjusting your application to only process a few rows at a time. If you can give more detail about what you're doing then we may be able to help more.
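As an illustration of processing a few rows at a time, here is a hypothetical paging sketch using LIMIT/OFFSET; the $page parameter and the 1000-row chunk size are assumptions, not from the original post:

```php
<?php
// Hypothetical paging sketch: fetch the table in chunks of 1000 rows
// instead of loading all 200k at once. Assumes an open mysql_* connection.
$page    = isset($_GET['page']) ? (int) $_GET['page'] : 0; // cast guards against injection
$perPage = 1000;
$offset  = $page * $perPage;

$sql    = sprintf("SELECT * FROM myTable LIMIT %d, %d", $offset, $perPage);
$result = mysql_query($sql) or die(mysql_error());

while ($row = mysql_fetch_assoc($result)) {
    // handle one row at a time here
}
?>
```

With an indexed ORDER BY column, a WHERE-based cursor (WHERE id > $lastId) scales better than large OFFSETs, but for a 200k-row table LIMIT/OFFSET is usually fine.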
Yes, if you're just outputting to the browser (i.e. there's no need to manipulate the array later), then it would be better to just dump the rows out as you fetch them instead of saving them into an array.
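A minimal sketch of that streaming approach, again assuming the OP's myTable and an open mysql_* connection; the HTML table layout is just an example:

```php
<?php
// Sketch: echo each row as it is fetched instead of buffering 200k rows
// in a PHP array -- memory use stays flat regardless of table size.
$result = mysql_query("SELECT * FROM myTable") or die(mysql_error());

echo "<table>\n";
while ($row = mysql_fetch_assoc($result)) {
    echo "<tr>";
    foreach ($row as $value) {
        echo "<td>" . nl2br(stripslashes($value)) . "</td>";
    }
    echo "</tr>\n";
}
echo "</table>\n";
?>
```

If memory is still tight, mysql_unbuffered_query() fetches rows from the server as you read them rather than buffering the whole result set client-side, at the cost of keeping the connection busy until the result is fully read.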