
This is a common issue, but I have no choice other than to code it like this in order to get the appropriate header and body into the Excel file.

Here is how it starts.

When a request to print is made, I first run a query to fetch the headers from the database:

SELECT instruments.in_id, instrument_parameters.ip_id,
CASE WHEN gv_x_ipid = -1 THEN 'datetime' ELSE '' END xlabel,
CASE WHEN ip_label LIKE '%Reservoir%' THEN 0 ELSE in_order END legendIndex,
CASE WHEN in_name = 'General' THEN ip_label ELSE in_name END ylabel            
FROM graph_plot
LEFT JOIN attributes gptype ON gp_type = gptype.at_id
LEFT JOIN graph_value ON gp_id = gv_gpid
LEFT JOIN instrument_parameters ON gv_y_ipid = ip_id
LEFT JOIN attributes pmunit ON ip_unit = pmunit.at_id
LEFT JOIN instrument_reading yvalue ON gv_y_ipid = iv_ipid
LEFT JOIN instruments ON iv_inid = in_id
WHERE gp_diid = :di_id AND 
      gp_type = :rpt_type AND 
      iv_status = 'Y' AND
      iv_inid in (".implode(",", $coll->inid).") AND
      gv_y_ipid in (".implode(",", $coll->ipid).")
GROUP BY ylabel
ORDER BY legendIndex

This produces a number of headers, which I then arrange like this:

DATE | Instrument1 | Instrument2 | Instrument3

The Instrument? columns are dynamic, based on the query above. I store these headers in a new variable; the original variable that holds the database results remains intact.
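
Roughly, the header row is built like this ($headers is just a name for this sketch; $resp is the variable holding the header-query results, and it is reused in the loops further down):

$headers = array('DATE');            // first column is always the date
foreach ($resp as $InstRow) {
    $headers[] = $InstRow->ylabel;   // Instrument1, Instrument2, ... from the query above
}
// $resp itself stays untouched so the same rows can drive the value loops below.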

Later, using the same parameters :di_id and :rpt_type, plus two additional parameters, startDt and endDt, I make another query that returns a long list of the dates available in the database between startDt and endDt.
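
I haven't pasted that date query here; it is essentially a SELECT DISTINCT over the reading dates in that range, roughly like this sketch (the real one also uses :di_id and :rpt_type):

// Sketch only; the actual query also filters on :di_id and :rpt_type.
$sqlDates = "SELECT DISTINCT iv_date
             FROM instrument_reading
             WHERE iv_date BETWEEN :startDt AND :endDt
               AND iv_status = 'Y'
             ORDER BY iv_date";

The value for every cell is then fetched one at a time with this prepared statement: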

$sql2 = "SELECT iv_reading FROM instrument_reading WHERE iv_inid = :inid AND iv_ipid = :ipid AND iv_date = :dt AND iv_status = 'Y'";    

When it has finished getting the dates, I run two nested loops like this:

foreach ($dates as $key => $dt) {               // one row per date
    foreach ($resp as $InstNo => $InstRow) {    // one column per header/instrument
        try {
            $stmt2->execute(array(':dt' => $dt, ':inid' => $InstRow->in_id, ':ipid' => $InstRow->ip_id));
            $rowDb = $stmt2->fetch(PDO::FETCH_NUM, PDO::FETCH_ORI_NEXT);
        } catch (PDOException $e) {
            echo '{"error":{"text":"'. $e->getMessage() .'"}}';
        }
    }
}

First it loops over the dates, then it loops over the headers (from the query made right before fetching the dates). My problem is that it always gets stuck here:

$stmt2->execute(array(':dt' => $dt, ':inid' => $InstRow->in_id, ':ipid' => $InstRow->ip_id));

What do you think? Is there a better way to handle this?

For your information, I use Slim and PHPExcel. PHPExcel may have memory issues, and I'm thinking of switching to Spout, but its documentation still only covers the basics.

Muhaimin
  • Please put the table name (or alias) in front of each column in the SELECT; we can't see how the query works. – Rick James Jun 10 '15 at 21:39
  • Why do you have nested foreach's? Can't the looping be done in the `SELECT`, and you simply get back one set of rows. – Rick James Jun 10 '15 at 21:40
  • How many rows are you expecting from each `execute`? – Rick James Jun 10 '15 at 21:40
  • have you tried anything like `ini_set('memory_limit', '750M');` ? – Dipesh Parmar Jun 11 '15 at 06:11
  • @DipeshParmar now I set it to `1024M`, but still no luck – Muhaimin Jun 11 '15 at 07:48
  • @MuhaiminAbdul try setting `ini_set('memory_limit', '-1');` Remember this is not good, but for a test set it to -1 and see – Dipesh Parmar Jun 11 '15 at 07:52
  • Your question is not clear about what error message you're getting. When you say "I always stuck here", do you mean it throws an exception on this line? If so, what type and what is the message? Have you tried to run the query in another context, like PHPMyAdmin or the MySQL console? – Kryten Jun 11 '15 at 20:01
  • Regardless of the error you're describing here, I'd try to avoid PHPExcel. It's a beautiful library, but it is a major resource hog. It basically attempts to model an Excel worksheet (complete with formats, function evaluation, **everything**) in memory using PHP. For all but the smallest jobs, it will give you problems. – Kryten Jun 11 '15 at 20:03
  • @Kryten, what I meant was this error `Fatal error: Allowed memory size of 134217728 bytes exhausted (tried to allocate 54 bytes)` and it points to the pdo->execute(); – Muhaimin Jun 12 '15 at 01:53

3 Answers


In your SQL, you may consider a LIMIT clause to ease the memory load, as follows:

$handle = fopen("file.csv", "wb");
$statement = "
SELECT  instruments.in_id, instrument_parameters.ip_id,
       CASE WHEN gv_x_ipid = -1 THEN 'datetime' ELSE '' END xlabel,
       CASE WHEN ip_label LIKE '%Reservoir%' THEN 0 ELSE in_order END legendIndex,
       CASE WHEN in_name = 'General' THEN ip_label ELSE in_name END ylabel
    FROM  graph_plot
    LEFT JOIN  attributes gptype ON gp_type = gptype.at_id
    LEFT JOIN  graph_value ON gp_id = gv_gpid
    LEFT JOIN  instrument_parameters ON gv_y_ipid = ip_id
    LEFT JOIN  attributes pmunit ON ip_unit = pmunit.at_id
    LEFT JOIN  instrument_reading yvalue ON gv_y_ipid = iv_ipid
    LEFT JOIN  instruments ON iv_inid = in_id
    WHERE  gp_diid = :di_id
      AND  gp_type = :rpt_type
      AND  iv_status = 'Y'
      AND  iv_inid in (".implode(",", $coll->inid).")
      AND  gv_y_ipid in (".implode(",", $coll->ipid).")
    GROUP BY  ylabel
    ORDER BY  legendIndex
    LIMIT  250
";
$params = array(':di_id' => $di_id, ':rpt_type' => $rpt_type);  // values for the named placeholders
$prep = $dbh->prepare($statement);
for ($i = 0; ; $i += 250) {
    $prep = $dbh->prepare($statement . ' OFFSET ' . $i);  // next chunk of 250 rows
    $prep->execute($params);
    $rows = $prep->fetchAll(PDO::FETCH_NUM);
    if (count($rows) === 0) {
        break;                                            // no rows left
    }
    foreach ($rows as $row) {
        fputcsv($handle, $row);                           // write this chunk to the CSV
    }
}
fclose($handle);

Alternatively, you could use `SELECT ... INTO OUTFILE` (or shell out with `system()` to the `mysql` client), set the permissions if necessary, and Bob's your uncle.
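
A rough sketch of the `INTO OUTFILE` route (the file path is made up, and it produces a flat dump of readings rather than the pivoted date/instrument layout):

// Sketch only: MySQL (not PHP) writes the file, so the target directory must be
// writable by the server and allowed by secure_file_priv.
$dbh->exec("
    SELECT in_name, ip_label, iv_date, iv_reading
    INTO OUTFILE '/tmp/readings.csv'
    FIELDS TERMINATED BY ','
    FROM instrument_reading
    JOIN instruments ON iv_inid = in_id
    JOIN instrument_parameters ON iv_ipid = ip_id
    WHERE iv_status = 'Y'
");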

hd1
  • I cannot put a limit because I will dump all of it in an Excel file – Muhaimin Jun 11 '15 at 07:48
  • Sure you can... open the file outside the loop and just dump every 250 lines to it till you're done. – hd1 Jun 11 '15 at 17:15
  • How does that move on to the _next_ 250 rows of input? Perhaps you need an `OFFSET`? But then you would have to hope that the table is not being added to or deleted from between chunks. – Rick James Jun 11 '15 at 21:38
  • Thanks, @RickJames... to ensure the table remains, one would wrap the select statements in a transaction – hd1 Jun 11 '15 at 21:47
  • I'm not sure whether a transaction is sufficient. Furthermore, `ORDER BY legendIndex` seems to be non-deterministic -- it seems to be 0 a bunch of the time, else `in_order`, which may not be distinct after all the `JOINs`. – Rick James Jun 11 '15 at 22:13
  • (Need to toss the extra `prepare`) – Rick James Jun 11 '15 at 22:15
  • @RickJames, I just copied the SQL from OP – hd1 Jun 11 '15 at 22:25
  • The _new_ `prepare` has `OFFSET`; the one before the loop is no longer necessary. – Rick James Jun 11 '15 at 22:28
  • I have no problem getting the above SQL statement to run. My problem is when I run the two loops and the execute method is called. – Muhaimin Jun 22 '15 at 07:38

You have not terminated the fetch loop.

$rowDb = $stmt2->fetch(PDO::FETCH_NUM, PDO::FETCH_ORI_NEXT);

gets the "next" row or closes the 'cursor' and terminates.

Are you expecting to get exactly one row? If so, consider doing fetchAll. (Caution: the result set may be an extra level deep in arrays.)
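
For example, something like this inside the inner loop would make sure each result set is released before the next execute() (a sketch, assuming there is at most one reading per date/parameter pair; closeCursor() is an alternative to fetching until exhaustion):

$stmt2->execute(array(':dt' => $dt, ':inid' => $InstRow->in_id, ':ipid' => $InstRow->ip_id));
$rowDb = $stmt2->fetch(PDO::FETCH_NUM);   // the single expected row, or false if none
$stmt2->closeCursor();                    // free the pending result before re-executing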

Rick James

The PDO MySQL driver will do some buffering, which causes memory exhaustion when looping over large datasets. You can turn this off using `$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);`, which should solve the problem.

$pdo = new PDO('mysql:host=localhost', $username, $password);
$pdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $pdo->prepare('SELECT * FROM instrument...');
$stmt->execute($parameters);

while($row = $stmt->fetch()) {
    // Insert logic to write the row to the destination
}

If you'd rather set the attribute for that query only, you can do that as well:

$stmt = $pdo->prepare('SELECT * FROM instrument...', [
    PDO::MYSQL_ATTR_USE_BUFFERED_QUERY => false
]);

Keep in mind that you won't be able to run other queries until you are done with your unbuffered one. You can close the old cursor prematurely with `$stmt->closeCursor()` if you don't need the remaining results. I also cannot speak to the performance of this, but it solved my issue while writing a one-off script.

The setting is mentioned briefly in MySQL's documentation: https://dev.mysql.com/doc/connectors/en/apis-php-pdo-mysql.html

Steen Schütt