Hi, if I have a simple two-column table with only about 30 rows, in the form A=1, B=2 ... Z=26. The data changes about once a month (via the admin), but apart from that it is fairly static. What I am trying to ask is: would reading 30 items off a flat file be as fast as reading 30 items off a database? I prefer the db as it is easier to build a control panel for, but it might not be worth it if it is going to use a lot of resources. The data is needed on every single page load on my site, regardless of what the user is doing. FFMG
I would say that, if it is possible, the quickest option would be to generate a static HTML file, so you don't have to rebuild the output every time the page loads (if this is possible, that is). For example, run "php Myscript.php > result.html", then point all links at result.html, and you have the quickest solution.
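A rough sketch of what that generator script could look like, assuming a MySQL table called lookup with columns letter and value (all of those names, and the connection details, are made up for illustration):

    <?php
    // build_result.php - regenerates the static page from the database.
    // Table/column names ("lookup", "letter", "value") and the connection
    // details are placeholder assumptions, not from the original post.
    $pdo = new PDO('mysql:host=localhost;dbname=mysite', 'user', 'pass');

    $html = "<ul>\n";
    foreach ($pdo->query('SELECT letter, value FROM lookup') as $row) {
        // Escape the text column just in case; the value is cast to int.
        $html .= '<li>' . htmlspecialchars($row['letter']) . '='
               . (int)$row['value'] . "</li>\n";
    }
    $html .= "</ul>\n";

    file_put_contents('result.html', $html);

You would run it once after each admin change (or from cron), e.g. "php build_result.php", and the rest of the time the server just serves the static result.html.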
I would probably use a file for 30 items. Grab the file with file_get_contents(), explode it into an array, and echo $variable[0] through $variable[29]. You would have a few lines of code at the beginning of the file and the echos embedded in the HTML. Use .htaccess to make your server parse .html as PHP, and just change the text file each month. But databases and control panels are cool.
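For example, something like this (a minimal sketch; the file name values.txt and the one-entry-per-line layout are assumptions):

    <?php
    // Read the whole file once and split it into an array, one entry per line.
    // Assumes a flat file values.txt containing lines like "A=1" ... "Z=26".
    $variable = explode("\n", trim(file_get_contents('values.txt')));
    ?>
    <html><body>
    <p><?php echo $variable[0]; ?></p>  <!-- prints the first entry, "A=1" -->
    <p><?php echo $variable[29]; ?></p> <!-- prints the 30th entry -->
    </body></html>

As for the .htaccess part: on Apache with mod_php a line like "AddType application/x-httpd-php .html" usually does it, though the exact handler name depends on how the host has PHP set up.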
Maybe I could use the best of both: I could 'cache' the data from the database in a file. A global function would first try to read the file; if it is not present, it would read from the db (and create the file for the next time around). Every time the admin updates the data, the file would be recreated. I guess that might work, but I still wonder if it is worth all that work. If the call to the db is almost as fast as reading a file, then I might as well stay with the db. FFMG
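Something along these lines, perhaps (a sketch only; the table name lookup, its columns, and the cache file name are placeholders):

    <?php
    // Sketch of the cache-or-db idea: serve from the file when it exists,
    // otherwise fall back to the database and rebuild the file.
    // "lookup", "letter", "value" and "values.cache" are assumed names.
    function get_values(PDO $pdo, $cacheFile = 'values.cache') {
        if (is_readable($cacheFile)) {
            // Fast path: the cached copy is there, no db call at all.
            return unserialize(file_get_contents($cacheFile));
        }
        // Slow path: query the db, then write the cache for next time.
        $values = $pdo->query('SELECT letter, value FROM lookup')
                      ->fetchAll(PDO::FETCH_KEY_PAIR);
        file_put_contents($cacheFile, serialize($values));
        return $values;
    }

The admin's save handler would then just unlink() values.cache after writing to the db, so the next page load rebuilds it automatically.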