Dear friends, can you please answer the following two questions? What do you think?

1) Imagine a MySQL database table that receives 500,000 new inserts every night. Each inserted row needs to be processed by connecting to a remote API, and each connection takes about one second, so a single PHP script running around the clock can only process roughly 64,800 rows in 24 hours (see the quick arithmetic sketch after question 2). You cannot use a different API for the processing, and a single call cannot process more than one row, but the API has no problem accepting more than one connection at a time. You already have a script that does all this processing, but again, it can only handle a fraction of the new inserts. How would you process all of the data without a growing backlog? I am looking for the simplest solution in terms of time and code needed, just something that works without too much programming.

2) Imagine you are creating a MySQL table that will hold logs of webpage visits: every time somebody visits the website, we log the user agent, IP, URL and a timestamp. Which MySQL engine would you use for it, keeping in mind that the data will NEVER be updated and would only very rarely be selected?
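For scale, here is the arithmetic behind question 1 as a runnable sketch. The variable names are mine, and I am assuming throughput scales linearly when several scripts run in parallel, which should hold since the API accepts concurrent connections.

<?php
// Capacity arithmetic using the figures given above.
$rowsPerNight  = 500000; // new inserts arriving each night
$rowsPerScript = 64800;  // what one sequential script manages in 24 hours

// Parallel scripts needed so combined throughput keeps up with the inserts.
echo ceil($rowsPerNight / $rowsPerScript); // prints 8
?>

So roughly eight parallel copies of the script should be able to keep up.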
It may cost some money, but I would first get a powerful VPS and upgrade the script; that might be enough. I've also heard that phpMyAdmin is not built for huge sites, so you could try a paid MySQL backend.
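If new hardware is not an option, the cheapest route may be to lean on the fact that the API accepts concurrent connections and simply run several copies of the existing script side by side. Below is a minimal sketch of that idea, not production code: the queue table, its columns (id, payload, status, worker), the connection credentials and the callRemoteApi() helper are all hypothetical placeholders standing in for the real schema and API call.

<?php
// worker.php - run several copies in parallel (about 8, per the arithmetic
// above), e.g. from a shell:  php worker.php & php worker.php & ...

$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);
$workerId = uniqid('w', true); // unique per running process

// Stand-in for the real remote API call (~1 second each).
function callRemoteApi(string $payload): void
{
    sleep(1);
}

while (true) {
    // Atomically claim one unprocessed row; because the claim is a single
    // UPDATE, two workers can never grab the same row.
    $claim = $pdo->prepare(
        "UPDATE queue SET status = 'working', worker = :w
         WHERE status = 'new' LIMIT 1"
    );
    $claim->execute(['w' => $workerId]);

    if ($claim->rowCount() === 0) {
        sleep(5); // queue is empty right now; wait for new inserts
        continue;
    }

    // Fetch and process the row this worker has claimed.
    $stmt = $pdo->prepare(
        "SELECT id, payload FROM queue WHERE status = 'working' AND worker = :w"
    );
    $stmt->execute(['w' => $workerId]);

    foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
        callRemoteApi($row['payload']); // the slow remote call
        $pdo->prepare("UPDATE queue SET status = 'done' WHERE id = :id")
            ->execute(['id' => $row['id']]);
    }
}

One caveat: if a worker dies mid-call, its claimed row stays stuck in 'working', so a real version would want a timeout or retry pass. But as the simplest thing that works, N copies of the existing script plus an atomic claim column is hard to beat, and it needs no new hardware.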