I am having problems with my database occasionally saying "Too many connections". I have designed my script to be quite easy on the database: it caches results whenever possible and closes all database connections. My problem is that I want to log details of each request and get those details into a MySQL database, and it seems that logging directly to MySQL is what is causing the overload. I cannot use the Apache log files instead, as some of the data I wish to log is only available while PHP is running. I cannot / will not change the max number of connections for MySQL. Does anyone have any ideas on how I can write the logging script so that it does not need to connect to the database for every request?
I'm not sure how your log is querying the database, but you might try using a persistent connection just for the logging script. I'm not sure I understand why Apache's log won't work, though. If Apache isn't running, PHP isn't running.
Forget MySQL for something like this; write locks are much slower than read locks and will eat up too many resources (completely unneeded). Write to a file instead: file_put_contents('/path/to/logfile.log', $line, FILE_APPEND);
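For example, a minimal sketch along those lines (the log path and the logged fields are just placeholders; LOCK_EX keeps two requests from interleaving their writes):

<?php
// Build one tab-separated record for this request.
$line = implode("\t", array(
    date('c'),                  // timestamp
    $_SERVER['REQUEST_URI'],    // requested page
    $_SERVER['REMOTE_ADDR'],    // client IP
)) . "\n";

// Append to the log file; no database connection needed.
file_put_contents('/path/to/logfile.log', $line, FILE_APPEND | LOCK_EX);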
The log script is just inserting data into the database. I do not think persistent connections can be used, as PHP is set up in CGI mode.
Is there any quick way to read data from a file and put it into a MySQL database when the script's execution time is limited to 30 seconds?
How often is the data being inserted? Are records being inserted before the previous one has finished? It doesn't seem feasible that you would be getting too many connections unless the script is connecting independently each time it inserts a record and there are a lot of inserts being made continuously. Have you tried a singleton database class for the log inserts, something like the sketch below? That may eliminate the problem with too many connections, although if it isn't set up properly it could make it worse.
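By a singleton database class I mean something roughly like this (class, table, column, and credential names are placeholders, not your actual code):

<?php
class DbLogger
{
    private static $instance = null;
    private $conn;

    private function __construct()
    {
        // Opened lazily, once per request, on first use.
        $this->conn = new mysqli('localhost', 'user', 'pass', 'mydb');
    }

    public static function get()
    {
        if (self::$instance === null) {
            self::$instance = new self();
        }
        return self::$instance;
    }

    public function logRequest($uri, $ip)
    {
        $stmt = $this->conn->prepare(
            'INSERT INTO request_log (logged_at, uri, ip) VALUES (NOW(), ?, ?)'
        );
        $stmt->bind_param('ss', $uri, $ip);
        $stmt->execute();
        $stmt->close();
    }
}

// Every caller in the same request shares the one connection.
DbLogger::get()->logRequest($_SERVER['REQUEST_URI'], $_SERVER['REMOTE_ADDR']);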
On a typical webpage there will only be one record inserted into the database, and this occurs at the end of the script. The database class is a singleton class, which prevents more than one connection being made per webpage request. This means that for each webpage request there will be one database connection made, and it will be dropped after the page has been served. The database table does have some indexes to speed up data retrieval later on, and this makes me think they might be slowing down the inserts. I will look into recording it to a file first. Thanks for the help.
Why not use memcached? It is very fast and lives in memory. After buffering x records in the cache, flush them into the database in one go. Fast and efficient. Of course, if your machine or the memcached daemon dies, you will lose the data that has not yet been flushed. Oh, before I forget: file access is way slower than memory access.
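A rough sketch of that buffering idea, using the PECL Memcached extension (the key name, the flush threshold of 100, and the table/column names are all made up, and the get/set pair is not atomic, so a busy site would need something more careful):

<?php
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

// Add this request's log entry to an in-memory buffer instead of MySQL.
$buffer   = $mc->get('log_buffer');
$buffer   = is_array($buffer) ? $buffer : array();
$buffer[] = array(date('c'), $_SERVER['REQUEST_URI'], $_SERVER['REMOTE_ADDR']);

if (count($buffer) < 100) {
    // Not full yet: store the buffer back and skip the database entirely.
    $mc->set('log_buffer', $buffer);
} else {
    // Flush every buffered row to MySQL in a single multi-row INSERT.
    $db   = new mysqli('localhost', 'user', 'pass', 'mydb');
    $rows = array();
    foreach ($buffer as $entry) {
        $rows[] = sprintf(
            "('%s', '%s', '%s')",
            $db->real_escape_string($entry[0]),
            $db->real_escape_string($entry[1]),
            $db->real_escape_string($entry[2])
        );
    }
    $db->query('INSERT INTO request_log (logged_at, uri, ip) VALUES ' . implode(', ', $rows));
    $db->close();
    $mc->set('log_buffer', array());
}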
OK, so if I log to a file first and then later put it into a database, what is the best format to write the file?
- SQL, so that it is ready to go straight into the database (though I am not sure about newlines, or whether this is secure)
- Serialized, so that each line can be deserialized as an object that adds itself to the database (one per line)
- CSV (comma separated), so that it takes up less room in the file, but I need to process it more.
I agree with blacknet; I'd actually prefer TSV, since I'm more likely to have commas in my data than tabs. Using the mysql command-line client and "LOAD DATA INFILE ..." is much quicker than parsing SQL queries; see the sketch below.
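For example, a sketch of the import step run from PHP rather than the command-line client, assuming a tab-separated log file and a request_log table with matching columns (all names are placeholders), and that local_infile is enabled on the server:

<?php
// mysqli needs LOCAL INFILE switched on before connecting.
$db = mysqli_init();
$db->options(MYSQLI_OPT_LOCAL_INFILE, true);
$db->real_connect('localhost', 'user', 'pass', 'mydb');

// One statement imports the whole file, which easily fits in a 30-second limit.
$db->query(
    "LOAD DATA LOCAL INFILE '/path/to/logfile.log'
     INTO TABLE request_log
     FIELDS TERMINATED BY '\\t'
     LINES TERMINATED BY '\\n'
     (logged_at, uri, ip)"
);

// In practice, rotate (rename) the file before importing so rows appended
// during the import are not lost when the file is emptied afterwards.
file_put_contents('/path/to/logfile.log', '');
$db->close();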