I'm on a shared host and rely on phpMyAdmin to manage MySQL databases. I have a CSV table that I need to upload that is about 77 MB zipped. I can't upload it in phpMyAdmin because it says the max file size is around 54 MB. I know I could split it into separate text files, but the file takes forever to open on my local machine and I need to upload it daily. How can I upload the CSV file into an existing table without running into the file upload limit? Thanks
That looks like it would work as a last resort, but isn't there an easier way? If I had a dedicated server rather than a shared host, would that solve the problem? I know nothing about server admin, so I rely on cPanel, and going to a dedicated host scares me. But if it would solve this problem, let me know.
On a dedicated server you would be able to set the max upload size, but if you're having trouble getting around an upload limit, you might be in over your head managing your own server. BigDump is the easiest way: just edit its config file, upload it to your server via FTP, and run the import from there.

Or, for a slightly less easy way, if your host gives you shell access to MySQL, you can import the entire file from the shell (you would need to unzip it first):

# mysql -u username -p database < sqlfile.sql

Is there a reason this needs to be done daily? If it's all new data, you are going to run into storage problems soon, because ~77 MB a day zipped adds up quickly, especially considering how well text compresses. If it's not new data, an incremental import might be faster and easier: just import the new data each day and you won't have to bother with BigDump or getting a shell from your host.
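To make the shell route concrete, a daily run might look roughly like this. The filenames and database name here are placeholders, and it assumes you have SSH access and the zip contains a single SQL dump:

    # unzip the nightly dump (adjust names to match your actual file)
    unzip -o products.zip
    # load it into the existing database; you'll be prompted for the password
    mysql -u username -p your_database < products.sql

You could put those two lines in a small script and run it each day after downloading the file.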
Thanks. It's a huge product file from a merchant, and it changes daily, yet they don't offer a file with only the updates. I can just download the entire file every day, so on a daily basis I would need to delete the old table and replace it with the new one. BigDump looks great, but I have one more problem: the file comes to me zipped as a CSV, and it looks like BigDump only works with SQL files. I know I could get a conversion program, but this is a really big file to convert. Any thoughts? Can I use BigDump with a CSV?
You could create your own update file on your local PC by comparing the old and new versions of the data, and then use that file to update the server. For simplicity, you could create three separate files: one each for new additions, deletions, and modifications.
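One rough way to build the additions and deletions files on your PC is to diff the sorted old and new CSVs. The filenames below are made up, and it assumes plain one-row-per-line CSV data:

    # sort both versions so comm can compare them line by line
    sort old_products.csv > old_sorted.csv
    sort new_products.csv > new_sorted.csv
    # lines only in the new file = additions, lines only in the old file = deletions
    comm -13 old_sorted.csv new_sorted.csv > additions.csv
    comm -23 old_sorted.csv new_sorted.csv > deletions.csv

A modified row will show up as one deletion plus one addition, which is usually fine for loading purposes.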
BigDump can handle CSVs. I've never tested it, but you can read this on their page: 'Handle CSV files (you have to specify $csv_insert_table)'. If you have shell access, you can also load CSV files directly from the command line with LOAD DATA INFILE. Otherwise, import the CSV into your local database and export the data as SQL in chunks (phpMyAdmin can help with that); with your host's 54 MB limit you would only need to create two files.
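If you do end up with shell access, the LOAD DATA route might look roughly like this. The table name, path, and delimiters are guesses; adjust them to match how the merchant formats the CSV:

    -- run from the mysql prompt after unzipping the CSV on the server
    TRUNCATE TABLE products;   -- the feed replaces the whole table daily
    LOAD DATA LOCAL INFILE '/path/to/products.csv'
    INTO TABLE products
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    IGNORE 1 LINES;            -- skip a header row, if the feed has one

This avoids the upload limit entirely because the file never goes through phpMyAdmin.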