
I'm working with PHP and need to parse a number of fairly large XML files (50-75MB uncompressed). The issue, however, is that these XML files are stored remotely and will need to be downloaded before I can parse them.

Having thought about the issue, I think using a system() call in PHP in order to initiate a cURL transfer is probably the best way to avoid timeouts and PHP memory limits.

Has anyone done anything like this before? Specifically, what should I pass to cURL to download the remote file and ensure it's saved to a local folder of my choice?
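
For what it's worth, here is a minimal sketch of that system()-plus-cURL idea; the URL and target path are placeholders, and the flags are standard curl options (-s silent, -L follow redirects, --fail non-zero exit on HTTP errors, -o write to the given local file):

$url    = 'http://example.com/feeds/large-feed.xml';   // placeholder remote file
$target = '/var/data/xml/large-feed.xml';              // placeholder local path

// Build the command with escaped arguments, then run curl in a separate process.
$cmd = 'curl -sL --fail -o ' . escapeshellarg($target) . ' ' . escapeshellarg($url);
system($cmd, $exitCode);

if ($exitCode === 0 && file_exists($target)) {
    // The file is on disk; parse it here (e.g. with XMLReader for large documents).
}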

ndg

1 Answer


You can try this (it streams the remote file to disk in small chunks, so it is never held in memory all at once):

function download($src, $dst) {
    // Open the remote source and the local destination as streams.
    // $src may be a URL if allow_url_fopen is enabled.
    $f = fopen($src, 'rb');
    $o = fopen($dst, 'wb');
    if ($f === false || $o === false) {
        return 1;
    }
    // Copy in small chunks so the whole file is never held in memory.
    while (!feof($f)) {
        if (fwrite($o, fread($f, 2048)) === false) {
            fclose($f);
            fclose($o);
            return 1;
        }
    }
    fclose($f);
    fclose($o);
    return 0;
}
download($url, $target);
if (file_exists($target)) {
    # do your stuff
}
ghostdog74
  • This works, but is obviously subject to PHP timeouts - which is no good in this situation. – ndg Feb 20 '10 at 18:05
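
A possible workaround for the timeout concern raised in that comment, sketched with PHP's cURL extension (the helper name curl_download and the call to set_time_limit(0) are illustrative additions, not part of the original answer): CURLOPT_FILE streams the response body straight to a file handle, so memory usage stays flat, and lifting the time limit keeps the script from being killed mid-transfer.

function curl_download($url, $target) {
    $fp = fopen($target, 'wb');
    if ($fp === false) {
        return false;
    }
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_FILE, $fp);            // write the body straight to the file handle
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    curl_setopt($ch, CURLOPT_FAILONERROR, true);    // treat HTTP errors as failures
    $ok = curl_exec($ch);
    curl_close($ch);
    fclose($fp);
    return $ok !== false;
}

set_time_limit(0);   // avoid the script timing out on a slow transfer
if (curl_download($url, $target)) {
    // parse the downloaded XML
}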