Hey all, I have a website with many files for public sharing. It's a very simple setup: we have around 20 sub-directories (divided into file categories, but all the same types of files) under the main file directory. Right now we are using the AutoIndex script for uploading new files from the frontend. The script is great, except that when too many users execute a search from the main file directory, it overloads our server. Is it possible, with this script or even some other script, to have a cached php/etc page for each file in the subdirectories, and then have a search function in the main directory that searches the cached pages of all the subdirectory files? Hope that makes sense. The simplest solution would be preferable. I know there are many Perl search engines out there, but most seem too complex for our needs...? Thanks guys.
Here is the idea: a user searches for "xx" on your site. Then do the following:
1) See if xx.html exists in the cache folder.
2) If it exists, check how old it is.
3) If it doesn't exist, or it is older than X days, run the search, store the results in a new xx.html page (or overwrite the existing one) in the cache folder, and serve that.
Peace,
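To make that concrete, here is a minimal sketch of that flow in plain PHP. The folder names, the cache lifetime, and the filename-only "search" are just assumptions to illustrate the idea, not your actual AutoIndex setup, so adjust them to match what you already have.

<?php
// search.php -- serve cached search result pages, rebuilding them after X days.
// Assumptions (adjust to your setup): files live under ./files/<subdir>/,
// cache/ is writable, and "search" means matching against file names only.

$filesDir   = __DIR__ . '/files';   // main file directory with the ~20 sub-directories
$cacheDir   = __DIR__ . '/cache';   // where the cached result pages are stored
$maxAgeDays = 1;                    // rebuild a cached page once it is older than this

$term = isset($_GET['q']) ? trim($_GET['q']) : '';
if ($term === '') {
    exit('No search term given.');
}

// 1) Work out the cache file name for this term, e.g. cache/<hash>.html
$cacheFile = $cacheDir . '/' . md5(strtolower($term)) . '.html';

// 2) If a fresh cached page exists, serve it and stop -- no directory scan at all.
if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAgeDays * 86400) {
    readfile($cacheFile);
    exit;
}

// 3) Otherwise run the (expensive) search -- here, a simple filename match
//    across every sub-directory -- then store and serve the result.
$matches = array();
foreach (glob($filesDir . '/*/*', GLOB_NOSORT) as $path) {
    if (is_file($path) && stripos(basename($path), $term) !== false) {
        $matches[] = substr($path, strlen($filesDir) + 1); // e.g. "subdir/file.zip"
    }
}

$html = '<h1>Results for ' . htmlspecialchars($term) . '</h1><ul>';
foreach ($matches as $relative) {
    $html .= '<li>' . htmlspecialchars($relative) . '</li>';
}
$html .= '</ul>';

if (!is_dir($cacheDir)) {
    mkdir($cacheDir, 0755, true);
}
file_put_contents($cacheFile, $html);
echo $html;

That way the directory tree is only scanned when a term has never been searched before (or its cached page has expired); every repeat search just sends back a static HTML file.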
Thanks for your advice mate. I think it's over our skill level haha. We are looking for something that can auto-generate cached pages for each file in the subdirectories, since our files are updated every day. We don't currently have a cached-page-generating system.
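For what it's worth, the generating part does not have to be complicated. Here is a minimal sketch (again with assumed paths, not your real setup): a small script you run from cron every night or every hour that walks the sub-directories and writes one flat index file; the search page then only has to read that single index instead of scanning all ~20 sub-directories on every request.

<?php
// build_index.php -- regenerate a flat index of every file in the sub-directories.
// Run it from cron so the index keeps up with the daily file updates.
// Assumptions: files live under ./files/<subdir>/ and the index is a plain
// text file with one "subdir/filename" entry per line.

$filesDir  = __DIR__ . '/files';
$indexFile = __DIR__ . '/cache/file_index.txt';

$lines = array();
foreach (glob($filesDir . '/*/*', GLOB_NOSORT) as $path) {
    if (is_file($path)) {
        // store the path relative to the main file directory
        $lines[] = substr($path, strlen($filesDir) + 1);
    }
}

if (!is_dir(dirname($indexFile))) {
    mkdir(dirname($indexFile), 0755, true);
}
file_put_contents($indexFile, implode("\n", $lines));
echo 'Indexed ' . count($lines) . " files\n";

A search then becomes a loop over the lines of file_index.txt with stripos(), which is far cheaper than hitting the directory tree itself on every request.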