I have a function that fetches the HTML of a list of pages. After it has been running for about two hours, the script aborts with a "memory limit exceeded" error. I've tried unsetting / setting some variables to null in the hope of freeing up memory, but the problem stays the same. Could you please take a look at the following piece of code?
{
    $ch = curl_init();
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
    if ($proxystatus == 'on') {
        curl_setopt($ch, CURLOPT_SSL_VERIFYHOST, FALSE);
        curl_setopt($ch, CURLOPT_HTTPPROXYTUNNEL, TRUE);
        curl_setopt($ch, CURLOPT_PROXY, $proxy);
    }
    curl_setopt($ch, CURLOPT_COOKIEFILE, "cookie.txt");
    curl_setopt($ch, CURLOPT_URL, $site);

    ob_start();
    return curl_exec($ch); // the line where the script aborts because of memory
    ob_end_clean();
    curl_close($ch);
    ob_flush();
    $site = null;
    $ch = null;
}
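For context, the function is called in a loop over the list of pages, roughly like this (simplified; getPage and the file name are just placeholders, not the exact code):

// simplified calling loop -- the real variable/function names differ
$pages = file('pages.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);

foreach ($pages as $site) {
    $html = getPage($site);   // the function shown above
    // ... parse and store $html ...
    unset($html);             // one of my attempts to free memory
}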
Any suggestion is highly appreciated. I've set the memory limit to 128M, but before increasing it further (which doesn't seem like the best option to me), I'd like to know whether there is anything I can do to use less memory or free memory up while the script is running.
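In case it matters, the 128M limit is the only memory-related setting I touch, set at the top of the script roughly like this (it may equally well live in php.ini):

ini_set('memory_limit', '128M'); // the limit the script currently hits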
Thank you.