Hi, I am writing a piece of software, and I was thinking that, to enable me to minify files (I am talking about XML, HTML, CSS and JS files) and to help me password protect any documents including images, PDFs etc., I would run everything through the index.php file and then do some custom file handling. I know that once the documents are displayed they can be copied, but it's about stopping people from messing around and trying to see things they shouldn't see. Is this good or bad?
If it's a small software project you can use .htaccess to redirect everything to index.php, then handle all requests with a switch on the GET or POST request. If it's a big piece of software it would be better to use CakePHP or another enterprise framework.
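Something like this, roughly (the route parameter name and the handleGet()/handlePost() functions are just placeholders, not part of any framework):

# .htaccess - send every request that isn't a real file to index.php
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*)$ index.php?route=$1 [QSA,L]

<?php
// index.php - single entry point, dispatching on the request method
$route = isset($_GET['route']) ? $_GET['route'] : '';

switch ($_SERVER['REQUEST_METHOD']) {
    case 'POST':
        handlePost($route); // hypothetical handler for POST requests
        break;
    case 'GET':
    default:
        handleGet($route);  // hypothetical handler for GET requests
}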
The WordPress blog front end just uses an index.php for everything. I guess it's not a bad idea. Easy to use and maintain.
While some may really use it like that, most of us split everything into parts (index.php, header.php, sidebar.php, footer.php, etc.), so I guess it's a matter of preference - it will always be easier to manage a system which is modular.
Yes, of course there would be more files and classes. I wouldn't put everything into one file. What I am talking about is that every request (be it for an HTML page, PHP page, image, CSS file, etc.) would be routed via the index.php.
True, but they only handle the page requests. I would also want the index.php to handle any requests for images, JS files etc. Would this cause some sort of overload?
Unfortunately not, as I am at the very, very, very early stages of writing it. (Oops, sorry, I forgot a "very" there.) Basically my system will be set up in a way that templates, style sheets and JavaScript are all stored in a DB. So what I want is that if a certain file is requested, it is pulled from the database. But I will want to perform certain actions on them, like minifying and merging stylesheets and also JavaScript files (not with each other - just all JS files into one and all CSS files into one). I also want to be able to password protect images, so that nobody can just try changing the URL and get to an image that they shouldn't be able to see. So I would have the main index.php file, and in there I would test what type of file is requested (I would use the file extension as the test). Then, depending on what file type it is, I would do certain actions. Something like this:

switch ($file_ext) {
    case 'js':
        $system->loadJS(array('minify', 'combine', 'secure'));
        break;
    case 'css':
        $system->loadCSS(array('minify', 'combine', 'secure'));
        break;
    case 'png':
    case 'jpg':
    case 'gif':
    case 'bmp':
        $system->loadIMG(array('secure'));
        break;
    case 'pdf':
        $system->loadPDF(array('secure'));
        break;
    default:
        $system->renderPage();
}
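For the $file_ext test itself, I'd do something like this (assuming a route query parameter supplied by a rewrite rule; my $system loaders stay as they are):

<?php
// Pull the requested path from the rewrite rule
$route = isset($_GET['route']) ? $_GET['route'] : '';
// Lower-case extension, e.g. 'js', 'css', 'png'; empty string if none
$file_ext = strtolower(pathinfo($route, PATHINFO_EXTENSION));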
It depends how you structure it. I don't think there's anything intrinsically wrong with this approach, but there might be problems if, every time index.php gets called, you run through ALL your include files and classes, initialise everything, use up memory, etc. E.g. if you just send out an image, you don't want to load all of header.php, dosomethingA.php, classC.php etc. Apache (or any webserver) is REALLY good at serving static content fast. That's its job. PHP will just slow you down if you're planning on eventually scaling your application up to any reasonable size. This is the reason people sometimes even put their static content, such as images, on separate servers. If you're still set on going via index.php then look into some sort of caching option so that if index.php?file=1.jpg gets called it can just return a cached copy immediately. Something like memcached.
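Roughly like this, assuming the pecl Memcached extension (buildFile() is a placeholder for whatever expensive work you do - DB fetch, minification etc.):

<?php
$memcached = new Memcached();
$memcached->addServer('127.0.0.1', 11211);

$name = basename($_GET['file']);        // e.g. '1.jpg'
$key  = 'file:' . $name;
$body = $memcached->get($key);

if ($body === false) {                  // cache miss: build and store
    $body = buildFile($name);           // hypothetical: pull from DB, minify, etc.
    $memcached->set($key, $body, 3600); // keep for an hour
}

header('Content-Type: image/jpeg');     // set per file type in practice
echo $body;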
I would of course run tests and load other files only as needed, so only the bare minimum, like access control, would be loaded when serving static files. How can you password protect static content otherwise? One reason why I want to route everything via the index.php is so that I can choose to password protect static files like images, PDFs etc.
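Something like this is what I have in mind (the logged_in session flag and the file paths are just placeholders; the real check would be whatever my access-control layer decides):

<?php
session_start();

// Only logged-in users may fetch protected files
if (empty($_SESSION['logged_in'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Access denied');
}

$file = basename($_GET['file']);          // strip any directory traversal
$path = '/var/private/uploads/' . $file;  // stored outside the web root

if (!is_file($path)) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: application/pdf');  // set per file type in practice
header('Content-Length: ' . filesize($path));
readfile($path);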