I'm new to databases and I've run into a problem. I have the complete IP2Location DB CSV (450), and when I try to import the CSV in phpMyAdmin it fails with an error. Somebody told me that phpMyAdmin restricts imports of large files and can only import up to 2 MB (yes, I can also see this limit below the import option). I also messed around with php.ini, but nothing changed. I hope somebody can suggest something.
Try compressing the file first; it should compress a lot, though probably not to under 2 MB. Upload the file via FTP, then write a small PHP script that reads one line at a time and inserts it into MySQL. Since the file is very large, the script may time out, so be sure to build in a way to tell the script which line to continue the import from in case that happens.
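A minimal sketch of such a resumable importer, under some assumptions: the dump is a file of already-quoted CSV lines (as in the IP2Location download), the target table already exists, and the line to resume from is passed in the URL. All names (file, table, credentials) are placeholders, not values from this thread:

```php
<?php
// Hypothetical resumable importer: reads the CSV one line at a time and
// inserts each line into MySQL, so the whole file never passes through
// an upload form. Table name, file name and credentials are assumptions.

// Build an INSERT statement from one CSV line whose values are already
// quoted, e.g. "0","33996343","-",... as in the IP2Location dump.
function csv_line_to_insert(string $table, string $line): string {
    return 'INSERT INTO ' . $table . ' VALUES (' . trim($line) . ');';
}

// Import the file, skipping the first $start lines; prints progress so
// that, if the script times out, you know which line to resume from.
function import_csv(mysqli $db, string $path, string $table, int $start): void {
    $fh = fopen($path, 'r');
    $lineNo = 0;
    while (($line = fgets($fh)) !== false) {
        $lineNo++;
        if ($lineNo <= $start) {
            continue; // already imported in a previous run
        }
        $db->query(csv_line_to_insert($table, $line));
        if ($lineNo % 1000 === 0) {
            echo "Imported up to line $lineNo\n"; // note this down
            flush();
        }
    }
    fclose($fh);
}

// Usage (uncomment and fill in your real credentials):
// $start = isset($_GET['start']) ? (int)$_GET['start'] : 0;
// $db = new mysqli('localhost', 'db_user', 'db_pass', 'database_name');
// import_csv($db, 'IP2LOCATION.CSV', 'ipcitylatlong', $start);
```

If the script dies mid-run, you would reload it with ?start=N, where N is the last progress number it printed.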
Thanks to everyone who posted in this topic. The only thing I know how to use is a little phpMyAdmin, so what is this? Thanks for your suggestion, but the file is about 50 MB. Thanks for this too, but please give me an idea of how I can actually use it. I'm not completely clueless, but an experienced person like you could show me how to do this easily.
OK, let me give you a little bit of help.

PHP has a setting that limits the amount of data that can be uploaded to the server, and the default value of that limit is 2 MB. This is a sensible setting, because it stops malicious people from uploading huge files to your server and filling up its hard drive. You can change the limit by editing the php.ini file, or by adding a line to a file called .htaccess, which you must upload to the same folder where phpMyAdmin sits on your server. Some hosting companies don't allow you to edit php.ini, and editing .htaccess may be too complicated for you right now, so you may be out of luck there.

You don't want to upload the file through the standard upload form anyway, because all kinds of things can go wrong while transferring that much data. The better option is to upload it via FTP to one of the folders in your hosting account.

Even if you had managed to upload the whole file, chances are high that you still wouldn't have been able to insert all that data into the database. phpMyAdmin, the script that inserts all those rows, needs a lot of time to read and insert the records, and PHP has another default setting that stops any script after about 30 seconds of processor time; it protects the web server from scripts that go wrong and never stop. So phpMyAdmin would have started inserting rows and, after roughly 30 seconds, PHP would have killed it.

One workaround is to use another script, such as bigdump.php. I'm not sure exactly how it works because I've never used it, but I believe it inserts a batch of records, pauses for a few milliseconds, starts again, and so on. If you don't own the web server this script may still not work; in the end it depends on the hosting company.
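For reference, the php.ini directives involved look like this. The values are only examples, and whether your host honours them at all is an assumption:

```ini
; php.ini - example values, not recommendations
upload_max_filesize = 64M
post_max_size = 64M
max_execution_time = 300
```

The .htaccess equivalent (only on Apache with mod_php, and only if the host permits these overrides):

```apacheconf
# .htaccess - works only where php_value overrides are allowed
php_value upload_max_filesize 64M
php_value post_max_size 64M
php_value max_execution_time 300
```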
Find the bigdump.php script, copy it somewhere on your web server, upload the large CSV file into the same folder, and open bigdump.php in your browser. The script should guide you through the process.

Some hosting companies offer more advanced hosting accounts. On some accounts you can enable SSH, which means you can connect to the computer that hosts your website and start programs directly on it, using a command prompt similar to the one you get with Start > Run > cmd.exe (try it out). Once you can run programs on the web server, you can upload that large file there and use a program that already exists on the server to insert the records for you. For example, the mysql command-line client ships with MySQL by default, so it should exist on all web servers. You would start it on the server and type something like LOAD DATA INFILE '/folder/filename.csv' INTO TABLE your_table; I'm not sure the command is exactly correct, but broadly everything I've said should be accurate. It's 4 am and I'm very tired, which is why I'm not so sure of myself.

The hosting companies that allow SSH also configure their servers to run your programs at low priority (whatever you run will not affect the other websites hosted on that server, unlike an import through phpMyAdmin). So you could start the import and it would run uninterrupted, take a bit longer, and insert every record from the file into the database.
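If you do get SSH access, the statement inside the mysql client would look roughly like this. LOAD DATA INFILE is MySQL's bulk loader for CSV-style files; the path and table name here are made-up examples:

```sql
-- Run inside the mysql command-line client
-- (started with something like: mysql -u db_user -p database_name)
LOAD DATA LOCAL INFILE '/folder/filename.csv'
INTO TABLE ipcitylatlong
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';
```

The FIELDS and LINES clauses need to match how the CSV is actually quoted and terminated, so check a few lines of the file first.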
Tauren, thank you very much for this step-by-step detail. It is very informative for me; in my whole life nobody has explained the solution to a problem like you just did. Once again, thanks for this.
Thanks for the reply. The problem is bigger than I thought: I can't even open the CSV file in Notepad (it is much too big). I'm using bigdump.php but I'm stuck at this step. I also created a database and put its name into bigdump.php, and created a sample table to go with my copy of the IP2Location CSV file. What is the solution to this problem?
Create a database on your server and then create the table (you probably already did that). Now open the bigdump.php file with Notepad or some other plain-text editor (I use UltraEdit) and edit the following lines:

// Database configuration
$db_server   = 'localhost';
$db_name     = 'database_name';
$db_username = 'phpmyadmin';
$db_password = 'phpmyadmin';

// Other settings
$filename = '';          // Specify the dump filename to suppress the file selection dialog
$linespersession = 3000; // Lines to be executed per one import session
$delaypersession = 0;    // Sleep time in milliseconds after each session
                         // Works only if JavaScript is activated; use it to reduce server load

Here, $db_server is 'localhost' or whatever your host is, $db_name is the name of the database, $db_username is the username used to log in to the database, and $db_password is the password for that user. Try setting $delaypersession to about 100. Then run the script and pay careful attention to what it asks of you. Also try using an empty database, or a database without that table already created; the script may stop if the table already exists.
Thanks mariush. After some changes ($csv_insert_table = 'ipcitylatlong';) and applying your solution, I get this problem. I'm no master of MySQL; can you tell me what this is and what the solution is?
That's because the table you created has only 8 columns:

1. ipFROM INT(10) UNSIGNED ZEROFILL NOT NULL DEFAULT '0000000000',
2. ipTO INT(10) UNSIGNED ZEROFILL NOT NULL DEFAULT '0000000000',
3. countrySHORT CHAR(2) NOT NULL,
4. countryLONG VARCHAR(64) NOT NULL,
5. ipREGION VARCHAR(128) NOT NULL,
6. ipCITY VARCHAR(128) NOT NULL,
7. ipLATITUDE DOUBLE,
8. ipLONGITUDE DOUBLE,

plus an index:

1. PRIMARY KEY (ipFROM, ipTO)

while your insert lines have 10 columns:

Query: INSERT INTO ipcitylatlong VALUES ("0","33996343","-","-","-","-","0","0","-","-");

The harder solution is to remove the two extra columns from each insert row in the file: replace every occurrence of ,"-","-"); with );. The problem is that you'd then have to upload the file again. The easier solution is to add two columns at the end of the table definition, which you can remove later since they're probably not used: add two VARCHAR(8) columns after ipLONGITUDE. If you don't know how to add two columns to an existing table, delete the table and create it again with two extra columns (call them temp1 and temp2) included; you can remove temp1 and temp2 after the import.
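The CREATE TABLE statement appears to have been dropped from the post above. Reconstructed from the column list given there, with the two temp columns added as suggested, it would look something like this (temp1/temp2 are the names proposed in the post, everything else follows the listed definitions):

```sql
CREATE TABLE ipcitylatlong (
  ipFROM INT(10) UNSIGNED ZEROFILL NOT NULL DEFAULT '0000000000',
  ipTO INT(10) UNSIGNED ZEROFILL NOT NULL DEFAULT '0000000000',
  countrySHORT CHAR(2) NOT NULL,
  countryLONG VARCHAR(64) NOT NULL,
  ipREGION VARCHAR(128) NOT NULL,
  ipCITY VARCHAR(128) NOT NULL,
  ipLATITUDE DOUBLE,
  ipLONGITUDE DOUBLE,
  temp1 VARCHAR(8),  -- absorbs the extra 9th value; drop after import
  temp2 VARCHAR(8),  -- absorbs the extra 10th value; drop after import
  PRIMARY KEY (ipFROM, ipTO)
);

-- After the import finishes:
-- ALTER TABLE ipcitylatlong DROP COLUMN temp1, DROP COLUMN temp2;
```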
INSERT INTO ipcitylatlong VALUES ("0","33996343","-","-","-","-","0","0","-","-");

1: "0"
2: "33996343"
3: "-"
4: "-"
5: "-"
6: "-"
7: "0"
8: "0"
9: "-"
10: "-"

You are trying to insert 10 values (columns), but your table is probably defined with a different number of columns. Read again what I explained above; the solution still stands. Either change the table layout so it contains 10 columns, or change each insert so that it contains exactly the number of columns the table has. In the future, please try searching Google for solutions before asking, because you may get the answer faster: http://www.google.com/search?q=Column+count+doesn't+match+value+count+at+row+1 The first search results are pretty good.
So did you get it to work? BigDump (it sounds gross!) worked for me, after a fair share of errors, but it worked.