I'm using PHP + MySQL to build a CMS, and it's going pretty well, but I have one problem.

The CMS uses mod_rewrite to turn site.com/section/page into GET vars. Sometimes I have a folder like site.com/xml - it contains an index.php, and different pages get included depending on the URL, e.g. site.com/xml/newsfeed includes the newsfeed.php file in the xml folder. This is fine.

But I have a Flash file that simultaneously requests 4 files from the xml folder, which basically calls index.php 4 times with different included pages, all using the CMS to generate the content. I sometimes get really bad performance. Sometimes it loads straight away, but then if I reload the page it can take 30+ seconds. It's weird: in Firebug the net monitoring shows the requests, and it usually loads a couple of the xml/filename files, but I don't see the other 2 even though the first couple are done. I also put some speed-tester code in the PHP files and they all say they take something like 0.02 seconds to generate; it just seems to take ages to get back to the browser or something.

Each call to xml/file opens the index.php in /xml, which then looks in the cache folder to see if xml/file has been cached yet. If it has, it echoes its contents; otherwise it connects to the database, generates the content, creates the cached file, then echoes it out.

The site is mamajazz.com.au.

It's been suggested this could be to do with mod_rewrite. The rules I'm using are:

    RewriteEngine on
    RewriteRule ^([^/\.]+)/?$ index.php?page=$1 [L]
    RewriteRule ^([^/\.]+)/([^/\.]+)/?$ index.php?section=$1&page=$2 [L]
    RewriteRule ^([^/\.]+)/([^/\.]+)/([^/\.]+)/?$ index.php?section=$1&section2=$2&page=$3 [L]
    RewriteRule ^([^/\.]+)/([^/\.]+)/([^/\.]+)/([^/\.]+)/?$ index.php?section=$1&section2=$2&section3=$3&page=$4 [L]

My other thought is: could it be some unclosed connection to a database, or an unclosed handle to a directory/file, something like that? Can anyone offer tips on common things to check to make sure a script terminates properly? I'm using MySQL connections, reading the contents of files, writing files, using output buffering, and using sessions. I'm also using register_shutdown_function to do things like close sessions and close output buffers.

Any ideas why the bottlenecks are occurring?
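For reference, the cache-then-generate flow in /xml/index.php is roughly like this. This is just a simplified sketch - the cache path and the generateContent() call are stand-ins, not my exact code:

    <?php
    // Simplified sketch of /xml/index.php: serve from the cache folder if a
    // cached copy exists, otherwise build the content, cache it, and echo it.
    // The cache path and generateContent() are placeholders, not the real CMS code.
    $page      = isset($_GET['page']) ? basename($_GET['page']) : 'index';
    $cacheFile = dirname(__FILE__) . '/cache/' . $page . '.xml';

    if (is_file($cacheFile)) {
        // Cached copy exists - no database work needed.
        readfile($cacheFile);
    } else {
        // No cache yet: connect, generate, store, then output.
        $db      = mysqli_connect('localhost', 'user', 'pass', 'cms');
        $content = generateContent($db, $page);              // hypothetical CMS call
        file_put_contents($cacheFile, $content, LOCK_EX);
        mysqli_close($db);
        echo $content;
    }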
Hmmm, someone mentioned it could be to do with PHP sessions. This is what I'm doing: using sessions for lots of PHP-generated pages that all go through the same parent PHP page. Any ideas how to get around this? I need the session open on each one so I can write to it throughout the script execution, so what I was doing was:

    session_start();

    function sessionCloseFunc() {
        session_write_close();
    }
    register_shutdown_function('sessionCloseFunc');

Is there a way to have 'shared' sessions for a parent page, or something like that?
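Thinking about it, the code above holds the session lock for the whole request, because session_write_close() only runs at shutdown. PHP's default file-based session handler locks the session file from session_start() until it's closed, so the 4 simultaneous requests from the same browser would queue behind each other. If that's the problem, I guess a minimal change would be to do the session writes first and release the lock before the heavy work - rough sketch, with a made-up key name:

    <?php
    // Do any session writes up front, then release the lock straight away so
    // the other simultaneous requests aren't queued behind this one.
    session_start();
    $_SESSION['last_xml_page'] = isset($_GET['page']) ? $_GET['page'] : '';  // made-up key
    session_write_close();   // lock released here, not at shutdown

    // ...database work, cache generation and echoing happen after this point,
    // without holding the session lock...

The trade-off is that $_SESSION can't be written to later in the script without re-opening the session, which is exactly the bit I'm not sure how to handle.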
Well, I'm pretty sure it's sessions - I removed the session handling from my PHP and it seems to be fine. Problem is, I need it. As I asked above: how can I get sessions to work in a PHP file that may be called several times, simultaneously, from the same client?
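The pattern I'm thinking of trying is to read the session once, close it immediately, queue any writes locally, and only re-open the session briefly at shutdown to flush them. Rough sketch - the queue/flush function names are mine, and it assumes output buffering is still active so the second session_start() doesn't hit "headers already sent" warnings:

    <?php
    // Sketch: hold the session lock only for two short moments - once to read,
    // once at shutdown to flush queued writes - so parallel requests from the
    // same client don't block each other. Function names are placeholders.
    session_start();
    $sessionCopy = $_SESSION;     // read-only copy for use during the request
    session_write_close();        // release the lock immediately

    $GLOBALS['pendingSessionWrites'] = array();

    function queueSessionWrite($key, $value) {
        $GLOBALS['pendingSessionWrites'][$key] = $value;
    }

    function flushSessionWrites() {
        if (empty($GLOBALS['pendingSessionWrites'])) {
            return;
        }
        // Re-open the session just long enough to apply the queued writes.
        // (Assumes output buffering is still active, so the headers that
        // session_start() wants to send haven't been flushed yet.)
        session_start();
        foreach ($GLOBALS['pendingSessionWrites'] as $key => $value) {
            $_SESSION[$key] = $value;
        }
        session_write_close();
    }
    register_shutdown_function('flushSessionWrites');

    // During the script: queueSessionWrite('lastFeed', 'newsfeed'); etc.

Does that sound sane, or is there a better way to share a session across several simultaneous requests?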