I am designing a website that lets customers get an RSS feed of their search results. To avoid running a large query on every request, the best way may be to create static .xml files and then refresh them automatically every 24 hours, based on each file's creation date. I think the better approach is to use PHP to make the decision: if `time() - filectime > 24*60*60`, regenerate the .xml file; otherwise just read the old .xml file. But how do I make this check? Customers read the .xml file directly, and a static XML file cannot run `time() - filectime` by itself. Or can you suggest a better way? Many thanks.
I think the best way to approach this would be to use a dynamic file (i.e. .php) to generate the content on the fly, but make sure you use something like APC or XCache to cache the bytecode, which dramatically improves load times. The problem with large static XML files is that they have to be read into memory, which takes time. If you generate the feed dynamically with a PHP script, it only has to do the "read" when the RSS feed is actually updated; every other time it simply dumps what's already in system memory. That is by far the quickest way to do it.
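To make that concrete, here is a minimal sketch using APC's user cache (`apc_fetch()`/`apc_store()`), assuming APC is installed with the user cache enabled; `build_rss_from_mysql()` is a hypothetical helper that runs the search query and returns the finished RSS string:

```php
<?php
// rss.php - serves the feed from APC's in-memory user cache,
// regenerating it at most once every 24 hours.
header('Content-Type: application/rss+xml; charset=utf-8');

$xml = apc_fetch('search_rss'); // returns false on a cache miss
if ($xml === false) {
    $xml = build_rss_from_mysql();               // hypothetical: the expensive query
    apc_store('search_rss', $xml, 24 * 60 * 60); // keep it for 24 hours
}
echo $xml;
```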
Thanks, that is a good suggestion. But if 10,000 people fetch the RSS from me and one RSS file is 100 KB, I would need to keep 1 GB of RAM free, which is not a very large amount. And what about this method?

```php
$filename = 'path/and_name_of_the_file.xml';
if (file_exists($filename) && filectime($filename) + (24 * 60 * 60) > time()) {
    // file is less than a day old: file_get_contents($filename) or include($filename)
} else {
    // query MySQL, create a new xml file, then read the new xml
}
```
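For what it's worth, on Unix-like systems `filectime()` reports the inode change time rather than the creation time, so `filemtime()` (the last modification time) is usually the safer freshness test. A fuller sketch of that check-and-regenerate flow, again assuming a hypothetical `build_rss_from_mysql()` helper:

```php
<?php
$filename = 'path/and_name_of_the_file.xml';
$maxAge   = 24 * 60 * 60; // one day in seconds

clearstatcache(); // make sure filemtime() is not served from a stale stat cache
if (!file_exists($filename) || time() - filemtime($filename) > $maxAge) {
    $xml = build_rss_from_mysql();               // hypothetical: runs the search query
    file_put_contents($filename, $xml, LOCK_EX); // rewrite the cache file under a lock
}

header('Content-Type: application/rss+xml; charset=utf-8');
readfile($filename); // stream the cached file to the client
```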
No, you'd only need 100 KB of RAM, not 1 GB: the cache server caches the content once, not one copy per session. In the method you have shown you are doing one of two things. The first is reading in a file, which has a non-trivial delay. The second is running a MySQL query on what I assume is a rather large dataset, then saving the result and reading it back, which is going to cause a huge delay. The best thing you can do is cache that file and run a 24-hour cronjob to update it from MySQL. This is by far the most performance-friendly way of running this. Thanks, Andrew
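As a rough illustration of that setup, here is a sketch of a regeneration script meant to be run from cron; the table and column names, paths, and credentials are all hypothetical:

```php
<?php
// regenerate_rss.php - rebuild the cached feed once a day, e.g. via the crontab entry:
//   0 3 * * * /usr/bin/php /var/www/scripts/regenerate_rss.php

$db   = new mysqli('localhost', 'user', 'pass', 'site');  // hypothetical credentials
$rows = $db->query('SELECT title, url, created_at FROM search_results
                    ORDER BY created_at DESC LIMIT 50');  // hypothetical table

$rss     = new SimpleXMLElement('<rss version="2.0"><channel/></rss>');
$channel = $rss->channel;
$channel->addChild('title', 'Search results feed');
$channel->addChild('link', 'http://example.com/');
$channel->addChild('description', 'Latest search results');

while ($row = $rows->fetch_assoc()) {
    $item = $channel->addChild('item');
    $item->addChild('title', htmlspecialchars($row['title']));
    $item->addChild('link', htmlspecialchars($row['url']));
    $item->addChild('pubDate', date(DATE_RSS, strtotime($row['created_at'])));
}

// Write to a temp file, then rename, so readers never see a half-written feed.
file_put_contents('/var/www/feeds/search.xml.tmp', $rss->asXML(), LOCK_EX);
rename('/var/www/feeds/search.xml.tmp', '/var/www/feeds/search.xml');
```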