Hello, I am using this to back up a 150GB MariaDB database: `mysqldump --opt --single-transaction` but the server stops responding for a few minutes while the backup runs, so I am looking for an alternative to mysqldump. Any other solutions for huge databases? Thank you
You're looking at hours to dump a database that size. I'd probably start with this one: https://www.percona.com/doc/percona-xtrabackup/LATEST/index.html
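If you go the XtraBackup route, the basic invocation is short. A minimal sketch (the user, password, and paths are placeholders; on recent MariaDB versions you would typically use the bundled `mariabackup` fork, which accepts the same flags):

```shell
# Take a hot physical backup; InnoDB tables stay readable and writable
xtrabackup --backup --user=backup_user --password=secret \
    --target-dir=/backups/full-$(date +%F)

# "Prepare" the backup afterwards so it is crash-consistent and restorable
xtrabackup --prepare --target-dir=/backups/full-2024-01-01
```

Because it copies data files rather than replaying SQL, both the backup and the restore are far faster than mysqldump on a database this size.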
It won't take long with mysqldumper; I've used it for a very long time and still do. https://sourceforge.net/projects/mysqldumper/files/
With a database that size, this is probably your best bet. Xtrabackup is much, much faster than performing a standard mysqldump. As for why your site is locking up: your database tables are being locked during the backup, which prevents the site from responding. Your best option overall would be to set up a cluster, make one of the servers in the cluster a read-only replica, and then perform the dump via xtrabackup on that server. That would prevent the primary from going down during the dump; the replica performs the backup and re-syncs via replication once it's complete.
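A rough sketch of that replica-backup setup, assuming a replica host named `db-replica` that is already receiving replication from the primary (hostnames, user, and password are made-up placeholders):

```shell
# Make the replica read-only so stray application writes can't land on it
mysql -h db-replica -u admin -p -e "SET GLOBAL read_only = ON;"

# Run the physical backup on the replica; the primary is never touched
ssh db-replica "xtrabackup --backup --user=backup_user --password=secret \
    --target-dir=/backups/replica-full"

# Once the backup finishes, replication catches the replica back up on its own
```

The primary never sees the backup load, so the site stays responsive throughout.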
I've seen some Magento MySQL databases get huge like that. +1 for Percona Xtrabackup. Going forward, if you don't need all that data right away, it might be worth creating an archive backup with Percona, then truncating data older than a certain point. Running queries against a huge database like that carries a performance tax, depending on how much of the database a query touches. I could imagine that if you were running something like WordPress on top of a database that big, the main query WP runs would add a huge amount of time to page loads. I concur with @Komputerking on setting up a cluster with a read-slave. Depending on the application you are running, you can set it up so all your backups happen on the slave. That's the best-practices way to do it, and how we do it in enterprise hosting setups.