
I have the following code that makes a Guzzle 4.1 request:

    $client = new \GuzzleHttp\Client(['defaults/headers/User-Agent' => $userAgentString]);

    $retry = 0;

do {
    try {
        return $client->post($url, $options);
    } catch (\Exception $e) {
        echo $e->getMessage();
        $retry++;
        continue;
    }
} while ($retry < 3);

It runs happily for quite a while, but at random intervals it will have an issue with the cURL CA file, which causes a fatal error due to an uncaught exception. I'm not sure what I can do about this, because I already have the request wrapped in a try/catch block.

Here is the error that takes down my Laravel console command:

    cURL error 77: error setting certificate verify locations:
      CAfile: /home/vagrant/Projects/test.dev/laravel/vendor/guzzlehttp/guzzle/src/cacert.pem
      CApath: /etc/ssl/certs (0)
    PHP Fatal error:  Uncaught exception 'ErrorException' with message 'include(/home/vagrant/Projects/test.dev/laravel/vendor/filp/whoops/src/Whoops/Exception/Inspector.php): failed to open stream: Too many open files' in /home/vagrant/Projects/test.dev/laravel/vendor/composer/ClassLoader.php:382

What I would like to do is not only figure out why Guzzle is getting this cURL error, but also learn how to catch it if it crops up on other systems with this cURL issue, so it doesn't just crash the process.
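For reference, a minimal sketch of the direction I've been considering (this is illustrative, not my production code): Guzzle's cURL failures surface as `GuzzleHttp\Exception\RequestException`, so catching that class explicitly should stop the process from dying, and the `verify` request option can point cURL at the system CA bundle instead of the `cacert.pem` shipped inside the vendor directory. The CA path below is the Debian/Ubuntu location and is an assumption; adjust for your distro.

```php
<?php
// Sketch only: assumes Guzzle 4.x installed via Composer, and that
// /etc/ssl/certs/ca-certificates.crt exists (Debian/Ubuntu path).
require 'vendor/autoload.php';

use GuzzleHttp\Client;
use GuzzleHttp\Exception\RequestException;

$client = new Client();

$retry = 0;
do {
    try {
        // Pointing 'verify' at the system CA bundle avoids depending on the
        // cacert.pem file inside the guzzlehttp package directory.
        return $client->post($url, [
            'verify' => '/etc/ssl/certs/ca-certificates.crt',
        ]);
    } catch (RequestException $e) {
        // cURL-level errors (including error 77) arrive here as
        // RequestException, so the loop can retry instead of crashing.
        echo $e->getMessage(), PHP_EOL;
        $retry++;
    }
} while ($retry < 3);
```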

eComEvo
  • `failed to open stream: Too many open files` doesn't look very good either. Just how many files are you trying to use in this script? – Marc B Oct 21 '14 at 21:40
  • Just one. This script outputs to the console using Symfony's Console component. I have noticed it is quite bad at closing process handles, and despite implementing code to clean up after it, I still get this issue if there is a long string of progressive outputs to the console. – eComEvo Oct 21 '14 at 21:52
  • @eComEvo I experience the same issue; were you able to find a solution? – Vlad Vinnikov Jan 21 '15 at 15:15
  • @VladVinnikov Unfortunately, not for the cURL error catching. This only seems to happen on my Homestead vagrant setup, so it was something I had to just ignore given it wasn't present on production. Will post if I locate a workaround. The only workaround I could find for the `too many open files` issue was to replace the Symfony Console methods with simply using `echo` to show output on the console. – eComEvo Jan 21 '15 at 18:51
  • @eComEvo I figured out that my issue was with the artisan queue. I switched it to Redis and all works now. Each job probably has a limit on cURL handles. – Vlad Vinnikov Jan 21 '15 at 21:31
  • @VladVinnikov That may actually be the cause! I just realized the production server uses Redis and my Homestead is set to `file`. Will test again later next time I have to run this command. – eComEvo Jan 22 '15 at 01:15
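The "too many open files" symptom the comments discuss can be sanity-checked against the per-process descriptor limit. A quick sketch using standard Linux tools (assumes a `/proc` filesystem, so Linux only):

```shell
# Show the current soft limit on open file descriptors for this shell.
ulimit -n

# Count how many descriptors the current shell process is holding open
# ($$ is the shell's own PID; substitute the PHP worker's PID to inspect it).
ls /proc/$$/fd | wc -l
```

If the second number approaches the first while the command runs, the process is leaking handles rather than hitting an unreasonably low limit.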

0 Answers