
I have a form where I can upload multiple files to my Laravel backend, and I want to send all of those files with Guzzle to an external API.

I'm having an issue where my script runs out of memory if I upload more MB than the available memory limit. The error message is:

Allowed memory size of ... bytes exhausted (tried to allocate ... bytes)

Unfortunately, I cannot change the memory limit dynamically.

Here is the code that I use:

// in laravel controller method

/** @var \Illuminate\Http\Request $request */
$files = $request->allFiles();

$filesPayload = [];

foreach ($files as $key => $file) {
    $filesPayload[] = [
        'name'     => $key,
        'contents' => file_get_contents($file->path()),
        // 'contents' => fopen($file->path(), 'r'), // memory issue as well
        'filename' => $file->getClientOriginalName(),
    ];
}

$client = new \GuzzleHttp\Client([
    'base_uri' => '...',
]);

$response = $client->post('...', [
    'headers' => [
        'Accept'         => 'application/json',
        'Content-Length' => ''
    ],
    'multipart' =>  $filesPayload,
]);

I'm using Guzzle 6. In the docs I found an example using fopen, but that also threw a memory error.

Is there a way to send multiple files with Guzzle without loading them into memory?

ljubadr
    Does this help? https://stackoverflow.com/a/50273458/1255289 – miken32 May 27 '21 at 21:01
  • Thanks for the link - I'm going to test now – ljubadr May 27 '21 at 21:32
  • I've tried the same approach from the linked solution and I keep getting the same `Allowed memory size of ... bytes exhausted` error message – ljubadr May 27 '21 at 22:25
  • it looks like guzzle doesn't support chunk or streaming upload, maybe use raw socket instead? – Chinh Nguyen May 31 '21 at 03:26
  • Does this help? https://laracasts.com/discuss/channels/laravel/laravel-solution-for-allowed-memory-size-of-134217728-bytes-exhausted-error-in-storageappend – MHIdea May 31 '21 at 05:05
  • Check this https://github.com/guzzle/guzzle/issues/543. This might help you – Haridarshan May 31 '21 at 12:29
  • @MHIdea unfortunately it doesn't work in my case - it's the guzzle that it's trying to load the files into memory before sending them to external API. I went over the docs and multiple different examples to try to find some config option that would stream the files but without any luck.. – ljubadr May 31 '21 at 16:35
  • @Haridarshan thanks for the link. I didn't change the `body_as_string` value – ljubadr May 31 '21 at 16:49
  • That's what guys on the link I sent say. It's because of guzzle and use php directly. Did you try their code ? – MHIdea May 31 '21 at 19:17
  • They had issue appending the data to an existing file which was too big to load in memory. From what I can see they have local file on server. Unfortunately I have to upload multiple files to external API and it's guzzle that's failing – ljubadr Jun 01 '21 at 01:43

4 Answers


I finally managed to make this work by changing

'contents' => file_get_contents($file->path()),

to

'contents' => \GuzzleHttp\Psr7\stream_for(fopen($file->path(), 'r'))

With this change, files were not loaded into memory and I was able to send bigger files.
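For completeness, here is how the whole payload loop looks with streams instead of file_get_contents(). This is a sketch, not the exact controller code: the temp files, field names, and endpoint are placeholders, and the Guzzle call itself is shown commented since it needs the guzzlehttp/guzzle package and a live API.

```php
<?php
// Sketch of the full payload loop using stream resources, so file
// contents are never loaded into memory as strings. The temp files
// here stand in for Laravel's $request->allFiles().
$files = [
    'document' => tempnam(sys_get_temp_dir(), 'up'),
    'image'    => tempnam(sys_get_temp_dir(), 'up'),
];

$filesPayload = [];
foreach ($files as $key => $path) {
    $filesPayload[] = [
        'name'     => $key,
        // A stream resource: Guzzle reads it in chunks while writing
        // the request body (stream_for(fopen(...)) wraps the same thing).
        'contents' => fopen($path, 'r'),
        'filename' => basename($path),
    ];
}

// Requires guzzlehttp/guzzle, so shown commented here:
// $client = new \GuzzleHttp\Client(['base_uri' => 'https://api.example.com']);
// $response = $client->post('/upload', [
//     'headers'   => ['Accept' => 'application/json'],
//     // No manual Content-Length: Guzzle derives it from the streams.
//     'multipart' => $filesPayload,
// ]);
```

Note that the empty `'Content-Length' => ''` header from the question is dropped here; Guzzle sets that header itself when given streams.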

ljubadr

In order to get around this, I've added a curl option that you can specify on a request that will send the request body as a string rather than stream it from the entity body of the request. You can enable this behavior like so:

$options = $client->getConfig()->get('curl.options');
$options['body_as_string'] = TRUE;
$client->getConfig()->set('curl.options', $options);
Asfandyar Khan

I was also struggling with Guzzle to send large files (2 GB to 5 GB). Finally I used PHP's curl directly, and it works like a charm:

<?php
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, '/url');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$headers = [];
$headers[] = 'Content-Type: multipart/form-data';
$headers[] = 'Cookie: something...';
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);
curl_setopt($ch, CURLOPT_POST, 1);
$path = '/full/path/to/file';
$file = curl_file_create(realpath($path));
$post = [   
    'file' => $file,
    'other_field' => 'value',
];
curl_setopt($ch, CURLOPT_POSTFIELDS, $post);
$result = curl_exec($ch);
$httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
var_dump($result,$httpcode);
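Since the question is about multiple files, the single-file example above extends naturally to a loop. This is a sketch with placeholder paths, field names, and URL; the actual curl_exec() is left commented because there is no live endpoint here.

```php
<?php
// Sketch: send several files in one multipart POST with plain curl.
// CURLFile makes curl stream each file from disk, so nothing is read
// into a PHP string. Temp files stand in for the real uploads.
$paths = [
    'file1' => tempnam(sys_get_temp_dir(), 'up'),
    'file2' => tempnam(sys_get_temp_dir(), 'up'),
];

$post = ['other_field' => 'value'];
foreach ($paths as $field => $path) {
    $post[$field] = curl_file_create(realpath($path), 'application/octet-stream', basename($path));
}

$ch = curl_init('https://api.example.com/upload');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_POST, 1);
// Passing an array to CURLOPT_POSTFIELDS makes curl build the
// multipart/form-data body (and boundary header) itself.
curl_setopt($ch, CURLOPT_POSTFIELDS, $post);
// $result = curl_exec($ch); // needs a real endpoint
curl_close($ch);
```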
cogis

I've tried the accepted answer, but there is no stream_for function in the \GuzzleHttp\Psr7 namespace (it was removed in newer versions of guzzlehttp/psr7). Instead I found the \GuzzleHttp\Psr7\Utils::streamFor() method, and it works. You can try it:

'contents' => \GuzzleHttp\Psr7\Utils::streamFor(fopen($file->path(), 'r')),
Invis1ble