Working with Csv File data - Help appreciated

Discussion in 'PHP' started by joe5000, Sep 1, 2010.

  1. #1
    Hi,

I am updating a few parts of my website.
At the moment, the website updates user accounts by uploading CSV data reports.

I currently use fgetcsv() to process each row, but it is taking forever to get through the data. This is the code that runs for each row:


    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // $data[0] holds the transaction id for this CSV row
        $grabinfo = "SELECT * FROM transactions WHERE id = '$data[0]'";
        $grabinforesult = mysql_query($grabinfo) or die(mysql_error());

        while ($row = mysql_fetch_array($grabinforesult)) {
            $merchant = $row['merchant'];
            $username = $row['username'];
            $datetime = $row['datetime'];
            $letthem  = $row['letthem'];

            // count how many matching transactions are already in an allowed status
            $validate = "SELECT * FROM transactions WHERE merchant = '$merchant' AND username = '$username' AND (status = 'pending' OR status = 'success' OR status = 'confirmed') AND datetime = '$datetime'";
            $validator = mysql_query($validate) or die(mysql_error());

            if (mysql_num_rows($validator) >= $letthem) {
                $import = "UPDATE transactions SET amount = '0.00' WHERE id = '$data[0]' AND status = 'awaiting'";
                mysql_query($import) or die(mysql_error());
            }
        }
    }
    fclose($handle);


    This performs several checks on each CSV row against the data already in the system.
    Is there a quicker way of processing the CSV data? There are lots of checks and nested while loops inside the fgetcsv loop, and it can take up to half an hour to process a file with 1000 CSV rows.
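    One direction I was wondering about is cutting down the number of queries, since at the moment every CSV row fires at least one SELECT. Would something like collecting all the ids first and fetching them in one query be a reasonable approach? A rough, untested sketch of what I mean (same table and column names as above):

    ```php
    <?php
    // Rough sketch (untested): read all the ids out of the CSV first,
    // then fetch the matching transactions with a single IN() query
    // instead of one SELECT per CSV row.
    $ids = array();
    while (($data = fgetcsv($handle, 1000, ",")) !== FALSE) {
        // escape each id before building it into the SQL string
        $ids[] = mysql_real_escape_string($data[0]);
    }
    fclose($handle);

    if (count($ids) > 0) {
        $idList   = "'" . implode("','", $ids) . "'";
        $grabinfo = "SELECT * FROM transactions WHERE id IN ($idList)";
        $result   = mysql_query($grabinfo) or die(mysql_error());

        while ($row = mysql_fetch_array($result)) {
            // run the merchant/username/datetime validation here,
            // without the extra per-id SELECT
        }
    }
    ?>
    ```

    I have not benchmarked this, so I am not sure how much it would actually save, but it would at least replace 1000 SELECTs with one.
    
    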

    Hope someone can offer some advice.

    Thanks
     
    joe5000, Sep 1, 2010 IP