PHP Timing Out

Discussion in 'PHP' started by bulldogjjb4, Mar 17, 2012.

  1. #1
    Hi,

    Looking for some advice. I have a script which converts a CSV file on the server into a MySQL database. However, the CSV file is 20MB and the script is just timing out. I have modified my php.ini so it shouldn't time out, but instead it causes a 500 internal server error. Is there a way I can keep the page live and not time out (would jQuery do something like this)? If not, I know I have used shell execution for large SQL files in the past; is there an alternative like that I could use? Advice would really be appreciated. My programming knowledge isn't really too good, so please be patient :)

    Here is the script I am using:


    <?php 
    $databasehost = "localhost";
    $databasename = "test";
    $databasetable = "sample";
    $databaseusername ="test";
    $databasepassword = "";
    $fieldseparator = ",";
    $lineseparator = "\n";
    $csvfile = "filename.csv";
    /********************************/
    /* Would you like to add an empty field at the beginning of these records?
    /* This is useful if you have a table with the first field being an auto_increment integer
    /* and the csv file does not have such an empty field before the records.
    /* Set 1 for yes and 0 for no. ATTENTION: don't set to 1 if you are not sure.
    /* This can dump data in the wrong fields if this extra field does not exist in the table
    /********************************/
    $addauto = 0;
    /********************************/
    /* Would you like to save the mysql queries in a file? If yes set $save to 1.
    /* Permission on the file should be set to 777. Either upload a sample file through ftp and
    /* change the permissions, or execute at the prompt: touch output.sql && chmod 777 output.sql
    /********************************/
    $save = 1;
    $outputfile = "output.sql";
    /********************************/
     
     
    if(!file_exists($csvfile)) {
          echo "File not found. Make sure you specified the correct path.\n";
          exit;
    }
     
    $file = fopen($csvfile,"r");
     
    if(!$file) {
          echo "Error opening data file.\n";
          exit;
    }
     
    $size = filesize($csvfile);
     
    if(!$size) {
          echo "File is empty.\n";
          exit;
    }
     
    $csvcontent = fread($file,$size);
     
    fclose($file);
     
    $con = @mysql_connect($databasehost,$databaseusername,$databasepassword) or die(mysql_error());
    @mysql_select_db($databasename) or die(mysql_error());
     
    $lines = 0;
    $queries = "";
    $linearray = array();
     
    foreach(explode($lineseparator,$csvcontent) as $line) {
     
          $lines++;
     
          $line = trim($line," \t");
         
          $line = str_replace("\r","",$line);
         
          /************************************
          This line escapes special characters. Remove it if entries are already escaped in the csv file
          ************************************/
          $line = str_replace("'","\'",$line);
          /*************************************/
         
          $linearray = explode($fieldseparator,$line);
         
          $linemysql = implode("','",$linearray);
         
          if($addauto)
                $query = "insert into $databasetable values('','$linemysql');";
          else
                $query = "insert into $databasetable values('$linemysql');";
         
          $queries .= $query . "\n";
     
          @mysql_query($query);
    }
     
    @mysql_close($con);
     
    if($save) {
         
          if(!is_writable($outputfile)) {
                echo "File is not writable, check permissions.\n";
          }
         
          else {
                $file2 = fopen($outputfile,"w");
               
                if(!$file2) {
                      echo "Error writing to the output file.\n";
                }
                else {
                      fwrite($file2,$queries);
                      fclose($file2);
                }
          }
         
    }
     
    echo "Found a total of $lines records in this csv file.\n";
     
     
    ?>
    PHP:
     
    bulldogjjb4, Mar 17, 2012 IP
  2. stephan2307

    stephan2307 Well-Known Member

    #2
    You could use
    set_time_limit(60000);
    Code (markup):
    Give it a go.
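
    For example, right at the top of the import script (just a sketch; note set_time_limit() has no effect while safe_mode is on):

    <?php
    // 0 means no execution time limit for this run
    set_time_limit(0);
    PHP: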
     
    stephan2307, Mar 19, 2012 IP
  3. jestep

    jestep Prominent Member

    #3
    I would increase the time limit as well. Also, how much memory does the server have? If you're looping through a large text file like that, you can quickly exhaust significantly more memory than the file itself.

    Lines like this one:

    $queries .= $query . "\n";

    are going to create a memory monster.
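
    If you still want the output.sql copy, one option is to write each query straight to the file inside the loop instead of appending to $queries, roughly like this (untested sketch reusing the variables from the script above):

    <?php
    // open the output file once, before the loop
    $file2 = fopen($outputfile, "w");

    foreach(explode($lineseparator, $csvcontent) as $line) {
          $line = str_replace("\r", "", trim($line, " \t"));
          $line = str_replace("'", "\'", $line);
          $linemysql = implode("','", explode($fieldseparator, $line));
          $query = "insert into $databasetable values('$linemysql');";

          // write the query out immediately so memory use stays flat
          fwrite($file2, $query . "\n");
          @mysql_query($query);
    }

    fclose($file2);
    PHP: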
     
    jestep, Mar 20, 2012 IP
  4. stephan2307

    stephan2307 Well-Known Member

    #4

    Good point I missed that one.

    Loading a 20MB file and looping through the whole thing will certainly take a long time.

    I would suggest reading it line by line ( php.net reference ) and maybe doing 20-30 SQL inserts at a time and then clearing the variable.

    It might take longer but it's more memory friendly, I believe. But I might be wrong.
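
    Something along these lines, maybe (untested sketch; host, database, table and batch size are just placeholders, swap in your own):

    <?php
    set_time_limit(0);

    $handle = fopen("filename.csv", "r");
    if(!$handle) die("Error opening data file.\n");

    $con = mysql_connect("localhost", "test", "") or die(mysql_error());
    mysql_select_db("test") or die(mysql_error());

    $batch = array();
    $batchsize = 30;   // rows per insert statement
    $lines = 0;

    // read one CSV line at a time so the whole 20MB file is never held in memory
    while(($row = fgetcsv($handle, 0, ",")) !== false) {
          $lines++;
          $row = array_map("mysql_real_escape_string", $row);
          $batch[] = "('" . implode("','", $row) . "')";

          // flush a multi-row insert every $batchsize rows, then clear the variable
          if(count($batch) >= $batchsize) {
                mysql_query("insert into sample values " . implode(",", $batch)) or die(mysql_error());
                $batch = array();
          }
    }

    // insert whatever is left in the last partial batch
    if(count($batch) > 0) {
          mysql_query("insert into sample values " . implode(",", $batch)) or die(mysql_error());
    }

    fclose($handle);
    mysql_close($con);

    echo "Imported $lines rows.\n";
    PHP: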
     
    stephan2307, Mar 20, 2012 IP
  5. mfscripts

    mfscripts Banned

    #5
    What does your error log say as to the reason for the 500 error?
     
    mfscripts, Mar 20, 2012 IP