Backup solution for huge 150GB database

Discussion in 'Databases' started by asteraki1976, Nov 28, 2017.

  1. #1
    Hello

    I am using the following to back up a (MariaDB) database of size 150GB:

    mysqldump --opt --single-transaction
    Code (markup):
    but the server stops responding for a few minutes while the backup runs, so I am looking for another solution to replace mysqldump.

    Any other solutions for huge databases?

    Thank you
     
    asteraki1976, Nov 28, 2017 IP
  2. jestep

    jestep Prominent Member

    #2
    jestep, Nov 29, 2017 IP
  3. scylla

    scylla Notable Member

    #3
    scylla, Apr 15, 2018 IP
  4. Komputerking

    Komputerking Greenhorn

    #4
    With a database that size, Xtrabackup is probably your best bet; it is much, much faster than a standard mysqldump. As for why your site is locking up: your database tables are being locked during the backup, which is what prevents the site from responding.
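
    For anyone who hasn't used it, a basic XtraBackup run looks roughly like this (just a sketch, untested; the user, password and target directory are placeholders, and on MariaDB the bundled mariabackup tool takes the same options):

    # hot backup: InnoDB tables stay readable and writable while it runs
    xtrabackup --backup --user=backupuser --password=secret --target-dir=/data/backups/full/
    # "prepare" the copied files so they are consistent and restorable
    xtrabackup --prepare --target-dir=/data/backups/full/
    Code (markup):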

    Your best option overall would be to set up a cluster, make one of the servers in the cluster a read-only node, and perform the dump via xtrabackup on that server. That would keep the main server from going down during the dump, and would let the read-only node perform the dump and then re-sync once the dump is complete. A rough sketch of that is below.
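
    Rough sketch, assuming you already have a replica replicating from the main server (hostnames and credentials here are made up):

    # on the replica: keep the application from writing to it
    mysql -e "SET GLOBAL read_only = ON;"
    # run the backup on the replica so the primary keeps serving traffic;
    # --slave-info records the master binlog position, --safe-slave-backup pauses the SQL thread while copying
    xtrabackup --backup --slave-info --safe-slave-backup --target-dir=/data/backups/full/
    Code (markup):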
     
    Komputerking, Jul 31, 2018 IP
  5. actionspec

    actionspec Greenhorn

    #5
    I've seen some Magento MySQL databases get huge like that. +1 for Percona Xtrabackup.

    Going forward, if you don't need all that data right away, it might be worth creating an archive backup with Percona and then truncating data older than a certain point. Queries against huge databases like that carry a performance tax, depending on how much of the database gets touched by each query. I could imagine that if you were running something like WordPress on top of a database that big, the main query that WP runs would add a huge amount of time to page loads.
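
    Just as an example of that kind of archiving, Percona Toolkit's pt-archiver can copy old rows out to a file and remove them from the live table (the table, column and cutoff date below are invented, adjust to your data):

    pt-archiver --source h=localhost,D=mydb,t=orders --file '/backups/orders-archive.txt' \
      --where "created_at < '2017-01-01'" --limit 1000 --commit-each
    Code (markup):
    Keep in mind pt-archiver deletes the rows it archives by default, so test it against a copy of the table first.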

    I concur with @Komputerking on setting up a cluster with a read slave. Depending on the application you are running, you can arrange it so all your backups happen on the slave. That's the best-practice way to do it, and how we do it in enterprise hosting setups.
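
    If it helps, a typical way to set that up is just a nightly cron job on the slave (the path and schedule below are arbitrary):

    # /etc/cron.d/db-backup on the replica -- full backup at 02:30 every night
    30 2 * * * root xtrabackup --backup --user=backupuser --password=secret --target-dir=/data/backups/$(date +\%F)
    Code (markup):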
     
    actionspec, May 12, 2020 IP