I am using https://github.com/JosephSilber/page-cache to cache pages. To prepare the pages beforehand (about 100,000 of them), I used to run 8 HTTP requests in parallel via GuzzleHttp. That worked, but was pretty slow because of the HTTP overhead.
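For reference, the old Guzzle-based warm-up looked roughly like this (a simplified sketch; error handling and the URL source are omitted):

```php
use GuzzleHttp\Client;
use GuzzleHttp\Pool;
use GuzzleHttp\Psr7\Request;

$client = new Client();

// Lazily yield one GET request per URL so the pool can stream them.
$requests = function (iterable $urls) {
    foreach ($urls as $url) {
        yield new Request('GET', $url);
    }
};

$pool = new Pool($client, $requests($urls), [
    'concurrency' => 8, // 8 requests in flight at the same time
    'fulfilled'   => function ($response, $index) {
        // page-cache stores the rendered page server-side,
        // so the response itself can be discarded here
    },
    'rejected'    => function ($reason, $index) {
        // log or retry failed URLs
    },
]);

$pool->promise()->wait();
```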
I am looking for a way to process an instance of `Illuminate\Http\Request` directly via the app instance, avoiding a real HTTP request. I noticed that this is much faster. However, parallelizing this with https://github.com/amphp/parallel-functions poses some problems.
The basic code is this:
```php
use function Amp\ParallelFunctions\parallelMap;
use function Amp\Promise\wait;

wait(parallelMap($urlChunks->all(), function ($urls) {
    foreach ($urls as $url) {
        // handle the request – see the variants below
    }
}, $pool));
```
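`$urlChunks` and `$pool` are set up roughly like this (chunk size and pool size are illustrative):

```php
use Amp\Parallel\Worker\DefaultPool;

// Split the ~100,000 URLs into chunks so each worker task handles many URLs,
// which keeps the per-task overhead low.
$urlChunks = collect($urls)->chunk(1000);

// Worker pool with 8 parallel worker processes.
$pool = new DefaultPool(8);
```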
I tried several variants for handling the request.
1.

```php
$request = \Illuminate\Http\Request::create($url, 'GET');
$response = app()->handle($request);
```
In this case `app()` returns an instance of `Illuminate\Container\Container`, not the application instance (`Illuminate\Foundation\Application`), so it does not have a `handle()` method.
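Presumably that is because amphp/parallel-functions runs the closure in a separate worker process where Laravel was never bootstrapped, so the `app()` helper only hands back a bare container. This could be checked inside the closure (purely diagnostic):

```php
// Inside the parallelMap() closure, i.e. inside the worker process:
var_dump(get_class(app()));               // "Illuminate\Container\Container"
var_dump(method_exists(app(), 'handle')); // bool(false)
```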
2.

```php
$request = \Illuminate\Http\Request::create($url, 'GET');
$response = $app->handle($request);
```
The only difference here: the variable `$app` was injected into the closure. Its value is the correct return value of `app()` called outside the closure, so it really is the application instance. But Amp fails, because the PDO connections held by the Application instance cannot be serialized.
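That makes sense, since `parallelMap()` has to serialize everything the closure captures in order to send it to the worker process, and PHP refuses to serialize PDO instances at all. A minimal reproduction, independent of Laravel:

```php
$pdo = new PDO('sqlite::memory:'); // any PDO connection

try {
    serialize($pdo);
} catch (Exception $e) {
    echo $e->getMessage(); // "You cannot serialize or unserialize PDO instances"
}
```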
3.

```php
$request = \Illuminate\Http\Request::create($url, 'GET');
$app = require __DIR__.'/../../../bootstrap/app.php';
$app->handle($request);
```
This works for a short while. But with each instantiation of the app, one or two MySQL connections start to linger around in the "Sleep" state. They only get closed when the script ends. Important: this is not related to the parallelization; I tried the same thing with a sequential loop and noticed the same effect. This looks like a framework bug to me, because one would expect the Application instance to close all of its connections when it is destroyed. Or can I do this manually? That would be one way to get this thing to work.
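If manual cleanup is the way to go, my idea would be something along these lines; this is only a sketch, and I have not verified that it actually gets rid of the sleeping MySQL connections. It boots the app once, handles each request through the HTTP kernel (the same way `public/index.php` does), and then purges the database connections via Laravel's database manager:

```php
use Illuminate\Contracts\Http\Kernel;
use Illuminate\Http\Request;

$app    = require __DIR__.'/../../../bootstrap/app.php';
$kernel = $app->make(Kernel::class);

foreach ($urls as $url) {
    $request  = Request::create($url, 'GET');
    $response = $kernel->handle($request);
    $kernel->terminate($request, $response);

    // Explicitly drop every DB connection that was opened for this request.
    foreach ($app['db']->getConnections() as $name => $connection) {
        $app['db']->purge($name);
    }
}
```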
Any ideas?