
I am trying to run a PHP script via a cron job, and sometimes (about half the time) I get the following warning:

PHP Warning: file_get_contents(http://url.com): failed to open stream: HTTP request failed! in /path/myfile.php on line 285

The program continues to run after that, which makes me think it is not a timeout or memory issue (the timeout is set to 10 minutes and memory to 128M), but the variable I store the result of that function call in is empty. The weird part is that I make several other calls to the same website with other URL parameters and they never have a problem. The only difference with this call is that the file it downloads is about 70 MB, while the others are all around 300 KB.
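From what I understand, the HTTP read timeout for file_get_contents is governed by default_socket_timeout rather than the script's execution timeout, so passing an explicit timeout through a stream context might be worth a try. A rough sketch (the URL and the timeout value are placeholders, not my real code):

// Rough sketch: give file_get_contents an explicit HTTP timeout via a
// stream context. The URL and the 600-second value are placeholders.
$context = stream_context_create(array(
    'http' => array('timeout' => 600), // seconds; overrides default_socket_timeout
));
$data = file_get_contents('http://url.com/whatever', false, $context);
if ($data === false) {
    // The request failed; handle it here instead of continuing with an empty variable.
}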

Also, I never get this warning if I SSH into the web server and run the PHP script manually, only when it is run from cron.

I have also tried using cURL instead of file_get_contents, but then I run out of memory.
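(For context, fetching with cURL into a variable, e.g. via CURLOPT_RETURNTRANSFER, buffers the whole response in memory, so a roughly 70 MB download plus the parsing work can run past a 128M limit quickly. An illustrative sketch of that pattern, not my exact code:)

// Illustrative only: with CURLOPT_RETURNTRANSFER the entire response body
// is returned as a PHP string, so the full ~70 MB sits in memory at once.
$c = curl_init('http://url.com/whatever'); // placeholder URL
curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($c);
curl_close($c);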

Thanks, any help here would be appreciated.

Casey

2 Answers


Perhaps the remote server on URL.com is sometimes timing out or returning an error for that particular (large) request?

I don't think you should be trying to store 70 MB in a variable.

You can configure cURL to download directly to a file. Something like:

// Open a local file and have cURL write the response body straight into it,
// so the 70 MB download never has to sit in a PHP variable.
$file = fopen('my.file', 'w');
$c = curl_init('http://url.com/whatever');
curl_setopt($c, CURLOPT_FILE, $file); // stream the body to $file instead of returning it
curl_exec($c);
curl_close($c);
fclose($file);

If nothing else, cURL should give you much better errors about what's going wrong.
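For example, a variant of the snippet above that logs why the transfer failed (a sketch; the URL and filename are placeholders):

// Sketch: same download-to-file approach, but check curl_exec() and log the error.
$file = fopen('my.file', 'w');
$c = curl_init('http://url.com/whatever');
curl_setopt($c, CURLOPT_FILE, $file);
curl_setopt($c, CURLOPT_FAILONERROR, true); // treat HTTP responses >= 400 as failures
if (curl_exec($c) === false) {
    error_log('cURL error ' . curl_errno($c) . ': ' . curl_error($c));
}
curl_close($c);
fclose($file);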

Eli
  • I'll give cURL another try and see what kind of output I get. I know I probably shouldn't be keeping 70 MB in memory, but it was working during testing so I just went with it. The problem with writing to a file is that I don't have permissions to edit files via SSH, but I imagine the owner of the cron job does; I'll give it a try. – Casey May 03 '11 at 17:09
  • 1
  • @Casey - Presumably the tmp directory is writable. Check out the 4th comment on http://www.php.net/manual/en/function.tmpfile.php (NOTE: they're talking about uploading a file via cURL, but it's the same basic idea) – Eli May 03 '11 at 17:25
  • You're right, temp is writable. I'm now saving the file to php://temp and parsing it from there. I'll try running the cron and see what happens. Thanks. – Casey May 03 '11 at 17:52
  • Using cURL and writing to the temp directory did it. Thanks – Casey May 05 '11 at 16:05

From another answer: double-check that this issue isn't being caused, some of the time, by the URL parameters you're using:

Note: If you're opening a URI with special characters, such as spaces, you need to encode the URI with urlencode() - http://docs.php.net/file%5Fget%5Fcontents
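A hypothetical example (the endpoint and parameter are made up), using http_build_query() to encode each value:

// Hypothetical endpoint and parameter, for illustration only.
// http_build_query() URL-encodes each value, so spaces and other
// special characters in the parameters are handled.
$url  = 'http://url.com/export?' . http_build_query(array('report' => 'monthly sales 2011'));
$data = file_get_contents($url);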

stevecomrie
  • The URL parameter is hard-coded so it's the same every time, and there are no special characters. I don't think the problem is with the remote server or the URL call; otherwise it wouldn't work every time via SSH. – Casey May 03 '11 at 17:12
  • I've got a gut feeling that it's related to a cron memory limit, especially if you're on a shared server; otherwise there's no real reason it shouldn't work if it works through SSH. Could you consider rewriting part of it as a shell script with a wget call? – stevecomrie May 03 '11 at 17:16
  • Would writing it as a shell script get around the cron memory limit? I would think that everything that is executed as a result of the cron would be under the same restrictions. – Casey May 03 '11 at 17:59
  • I would wager a guess that wget is better suited to pulling down large files; I believe it writes them to disk as it downloads rather than keeping everything in one variable in memory. – stevecomrie May 03 '11 at 20:30