
I've written a script that runs a WP_Query() fetching maybe 2,000 records (and growing). I want to be able to write these results to a CSV file. I have that working when posts_per_page is set to 300 records or fewer.

If I change posts_per_page to -1 then I get this error:

Fatal error: Allowed memory size of 134217728 bytes exhausted

I'm wondering if there is a way I can send 300 records at a time to the CSV file until there are no more, and then trigger the file for download?

Or maybe stream each line to the CSV file? What would be the best way to manage memory when writing the CSV with fputcsv()?
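(For reference, the row-at-a-time idea can be sketched roughly like this: write each row straight to php://output with fputcsv() while paging through the query in batches of 300, so the full result set never sits in memory at once. This is only a sketch, assuming the same post type and meta keys as the code below.)

```php
// Sketch: stream CSV rows to the response in batches of 300.
header( 'Content-Type: text/csv' );
header( 'Content-Disposition: attachment; filename=export.csv' );

$out = fopen( 'php://output', 'w' );
fputcsv( $out, array( 'Email', 'Photo1', 'Photo2', 'Photo3', 'VideoURL' ) );

$paged = 1;
do {
    $q = new WP_Query( array(
        'post_type'      => 'videorepot',
        'post_status'    => 'publish',
        'posts_per_page' => 300,
        'paged'          => $paged,
        'fields'         => 'ids', // fetch IDs only, not full post objects
    ) );

    foreach ( $q->posts as $post_id ) {
        fputcsv( $out, array(
            get_post_meta( $post_id, 'user_email', true ),
            get_post_meta( $post_id, 'photo1', true ),
            get_post_meta( $post_id, 'photo2', true ),
            get_post_meta( $post_id, 'photo3', true ),
            get_post_meta( $post_id, 'video_file', true ),
        ) );
    }
    $paged++;
} while ( $paged <= $q->max_num_pages );

fclose( $out );
exit;
```

Using 'fields' => 'ids' keeps each batch small, since only post IDs are hydrated and the meta values are fetched per row.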

Here is what I have now:

$export_query = array(
    'post_type' => 'videorepot',
    'posts_per_page' => -1,
    'post_status' => 'publish',
    'meta_query' => array(
        array(
            'key'     => 'agree',
            'value'   => 1,
            'compare' => '=',
        ),
        array(
            'key'     => 'opt-in',
            'value'   => 1,
            'compare' => '=',
        ),
        array(
            'key'     => 'status',
            'value'   => 'Video Ready',
            'compare' => '=',
        ),
    )
);

// Posts query
$wp_query = new WP_Query( $export_query );
if ( $wp_query->have_posts() ) :
    $list = array( 'Email,Photo1,Photo2,Photo3,VideoURL' );
    while ( $wp_query->have_posts() ) : $wp_query->the_post();
    
        $postID = get_the_ID();
        
        $user_email = get_post_meta($postID, 'user_email', true);
        $photo1 = get_post_meta($postID, 'photo1', true);
        $photo2 = get_post_meta($postID, 'photo2', true);
        $photo3 = get_post_meta($postID, 'photo3', true);
        $videourl = get_post_meta($postID, 'video_file', true);
        
        $list[] = $user_email.','.$photo1.','.$photo2.','.$photo3.','.$videourl;
    
    endwhile;
endif;


// Output file stream
$output_filename = 'export_' . strftime( '%Y-%m-%d' )  . '.csv';
$output_handle = @fopen( $output_filename, 'w' );

header( 'Cache-Control: must-revalidate, post-check=0, pre-check=0' );
header( 'Content-Description: File Transfer' );
header( 'Content-type: text/csv' );
header( 'Content-Disposition: attachment; filename=' . $output_filename );
header( 'Expires: 0' );
header( 'Pragma: public' );

foreach ( $list as $line ) {
    
    // Add row to file
    fputcsv( $output_handle, explode(',', $line), ',', '"');

}

// Close output file stream
fclose( $output_handle );

// We're done!
exit;

For some testing I removed the part of the script that writes the items to a CSV file, and found out that it's really something in my query that makes it go bonkers. There are maybe 3,000 records in my custom post type. Why would this make my query run out of memory?

I decided to go a different route because I was still experiencing memory issues when using WP_Query. So I wrote my own MySQL query, hacked together from other examples. This query runs really fast, so I think I'll have better luck with it.

I have a new question as it relates to meta_key/meta_value pairs. Currently I am using SELECT * in my query, but I don't want to select everything; I only need about 5 fields from the database. A few of the fields I want are meta_key/meta_value pairs. Is there a way I can specify those meta_keys so that when I export the table I get just 5 columns in the results?

My query looks like this now:

$values = mysql_query("SELECT * FROM wp_posts 
INNER JOIN wp_postmeta ON (wp_posts.ID = wp_postmeta.post_id) 
INNER JOIN wp_postmeta AS mt1 ON (wp_posts.ID = mt1.post_id) 
INNER JOIN wp_postmeta AS mt2 ON (wp_posts.ID = mt2.post_id) WHERE 1=1 
AND ( ( post_date >= '2014-11-03 00:00:00' AND post_date <= '2014-12-31 23:59:59' ) ) 
AND wp_posts.post_type = 'videorepot' 
AND (wp_posts.post_status = 'publish') 
AND ( (wp_postmeta.meta_key = 'agree' AND CAST(wp_postmeta.meta_value AS CHAR) = '1') 
AND (mt1.meta_key = 'opt-in' AND CAST(mt1.meta_value AS CHAR) = '1') 
AND (mt2.meta_key = 'status' AND CAST(mt2.meta_value AS CHAR) = 'Video Ready') ) 
GROUP BY wp_posts.ID 
ORDER BY wp_posts.post_date DESC 
LIMIT 0,20");

For now I am setting the limit to 20, but I've tested it with the limit set to 0,9999 and it runs very fast in phpMyAdmin. The query above gives me around 1282 results when I don't set the limit.
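One common pattern for getting specific meta values back as named columns is to alias each wp_postmeta join on the key you want and select its meta_value under a column alias. A sketch only; the meta key names here are the ones from the question, so adjust them to match the actual keys:

```php
// Sketch: one self-join per meta key, each aliased, so the values come
// back as named columns instead of SELECT * rows.
$sql = "
    SELECT p.ID,
           email.meta_value  AS Email,
           photo1.meta_value AS Photo1,
           photo2.meta_value AS Photo2,
           photo3.meta_value AS Photo3,
           video.meta_value  AS VideoURL
    FROM wp_posts p
    INNER JOIN wp_postmeta email  ON p.ID = email.post_id  AND email.meta_key  = 'user_email'
    INNER JOIN wp_postmeta photo1 ON p.ID = photo1.post_id AND photo1.meta_key = 'photo1'
    INNER JOIN wp_postmeta photo2 ON p.ID = photo2.post_id AND photo2.meta_key = 'photo2'
    INNER JOIN wp_postmeta photo3 ON p.ID = photo3.post_id AND photo3.meta_key = 'photo3'
    INNER JOIN wp_postmeta video  ON p.ID = video.post_id  AND video.meta_key  = 'video_file'
    WHERE p.post_type   = 'videorepot'
      AND p.post_status = 'publish'
    ORDER BY p.post_date DESC";
```

The existing conditions (date range, the agree/opt-in/status keys) can be added the same way, and with MySQL's index on (post_id, meta_key) the extra joins stay fast.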

Robbiegod
  • increase php memory... alternatively, add post meta to signify a record has been downloaded and set max results to 200 or so (excluding downloaded) – David Dec 11 '14 at 00:59
  • I'd rather not simply increase the PHP memory; I'm interested in learning proper PHP technique for managing memory in this situation. I'll try this for the time being, though. Also, according to the error I'm allowed 134 MB of memory, which I think is pretty high. The thing is, I want to export the full database each time a link is clicked, so there's no need to mark items as downloaded. Thank you for your comment nonetheless. – Robbiegod Dec 11 '14 at 02:34
  • true, but you don't have 134 MB to work with... unset the WP query after your loop, use while instead of foreach (foreach creates another copy, so you actually have 3 copies on the go), and unset the $list array as you loop over it. – David Dec 11 '14 at 09:26
  • ok. I added wp_reset_query() just after the loop to reset wp_query. I'm a little confused about using "while" instead of "foreach", because I load the results of the query into an array and then use foreach to add each item to the CSV. I may try removing the foreach and moving fputcsv into the "while" loop; maybe that's what you mean? I'm sure there is probably an unset($list) function somewhere; I will look that up. Thanks. – Robbiegod Dec 11 '14 at 14:26
  • foreach creates a copy of the data, while loops along the data without a copy. You can actually use the WP while loop anyway, which saves looping over the data twice. Hopefully you won't need to unset values on each iteration then; just reset post data at the end of the loop. – David Dec 11 '14 at 14:54
  • hmm, I think it's a problem with the query now, or with building the initial array, because I commented out the bits that write the data to a CSV file, tried just outputting the array, and still got the out-of-memory error. I'm going to look at that first. – Robbiegod Dec 11 '14 at 14:56

2 Answers


I think I've arrived at a good solution. This works for me, and I did not have to increase the PHP memory or anything like that.

Here is the final script. Keep in mind that the MySQL query uses a date range and 3 meta_keys to get a subset of data from our database. I'm getting about 1,290 results, all written to the CSV file. (Thanks, David, for the assist.)

$path = $_SERVER['DOCUMENT_ROOT'];

include_once $path . '/wp-load.php';

// SET UP DB VARS
$host = DB_HOST;
$user = DB_USER;
$pass = DB_PASSWORD;
$db = DB_NAME;


// CONNECT TO DB
// (note: the old mysql_* API, since removed from PHP; today this would be $wpdb or mysqli)
$link = mysql_connect($host, $user, $pass) or die("Cannot connect. " . mysql_error());
mysql_select_db($db) or die("Cannot select database.");

// Header row
$mycolumns = array( 'Email', 'Photo1', 'Photo2', 'Photo3', 'VideoURL' );
$csv_output = implode( ',', $mycolumns ) . "\n";

$values = mysql_query("SELECT * FROM wp_posts 
INNER JOIN wp_postmeta ON (wp_posts.ID = wp_postmeta.post_id) 
INNER JOIN wp_postmeta AS mt1 ON (wp_posts.ID = mt1.post_id) 
INNER JOIN wp_postmeta AS mt2 ON (wp_posts.ID = mt2.post_id) WHERE 1=1 
AND ( ( post_date >= '2014-11-03 00:00:00' AND post_date <= '2014-12-31 23:59:59' ) ) 
AND wp_posts.post_type = 'videorepot' 
AND (wp_posts.post_status = 'publish') 
AND ( (wp_postmeta.meta_key = 'agree' AND CAST(wp_postmeta.meta_value AS CHAR) = '1') 
AND (mt1.meta_key = 'opt-in' AND CAST(mt1.meta_value AS CHAR) = '1') 
AND (mt2.meta_key = 'status' AND CAST(mt2.meta_value AS CHAR) = 'Video Ready') ) 
GROUP BY wp_posts.ID 
ORDER BY wp_posts.post_date DESC 
LIMIT 0,9999");
while ($rowr = mysql_fetch_row($values)) {

        $postID = $rowr[0];         
        $user_email = get_post_meta($postID, 'email_address', true);
        $photo1 = get_post_meta($postID, 'photo_1', true);
        $photo2 = get_post_meta($postID, 'photo_2', true);
        $photo3 = get_post_meta($postID, 'photo_3', true);
        $videourl = get_post_meta($postID, 'the_video', true);

        $csv_output .= $user_email . ',' . $photo1 . ',' . $photo2 . ',' . $photo3 . ',' . $videourl;
        $csv_output .= "\n";

}

$output_filename = "export_" . date("Y-m-d_H-i", time()) . ".csv";
header("Content-Type: text/csv");
header("Content-Disposition: attachment; filename=" . $output_filename);
echo $csv_output;
exit;

I hope this helps someone else who is dealing with memory issues.

Robbiegod

I was also getting a "Tried to allocate more memory" fatal error, but in my case the cause was that I had missed this line inside the while loop:

$wp_query->the_post();

It might be helpful if somebody else is facing the same issue.
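For context: the_post() is what advances WP_Query's internal post pointer. Without it, have_posts() keeps returning true, so the loop never terminates and eventually exhausts memory. The standard loop shape looks like this (a generic sketch, not tied to any particular query):

```php
// The standard WordPress loop: the_post() advances the internal pointer.
// Without it, have_posts() never returns false and the loop runs forever.
$query = new WP_Query( array( 'post_type' => 'post' ) );
if ( $query->have_posts() ) {
    while ( $query->have_posts() ) {
        $query->the_post(); // <-- the line that must not be missed
        echo get_the_ID(), "\n";
    }
    wp_reset_postdata(); // restore the global $post afterwards
}
```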

vats