
I want to upload files in chunks to a URL endpoint using guzzle.

I should be able to provide the Content-Range and Content-Length headers.

Using PHP, I know I can read a file in chunks with:

define('CHUNK_SIZE', 1024*1024); // Size (in bytes) of chunk

function readfile_chunked($filename, $retbytes = TRUE) {
    $buffer = '';
    $cnt    = 0;
    $handle = fopen($filename, 'rb');

    if ($handle === false) {
        return false;
    }

    while (!feof($handle)) {
        $buffer = fread($handle, CHUNK_SIZE);
        echo $buffer;
        ob_flush();
        flush();

        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }

    $status = fclose($handle);

    if ($retbytes && $status) {
        return $cnt; // return num. bytes delivered like readfile() does.
    }

    return $status;
}

How do I send the files in chunks using Guzzle, ideally with Guzzle streams?

Timothy Radier
  • Possible duplicate of [Guzzle 6 Large file uploads / Chunking](https://stackoverflow.com/questions/45641050/guzzle-6-large-file-uploads-chunking) – tlorens Jul 09 '18 at 17:18

2 Answers


This method allows you to transfer large files using Guzzle streams:

use GuzzleHttp\Psr7;
use GuzzleHttp\Client;
use GuzzleHttp\Psr7\Request;

$resource = fopen($pathname, 'r');
$stream = Psr7\stream_for($resource); // Psr7\Utils::streamFor($resource) in guzzlehttp/psr7 v2+

$client = new Client();
$request = new Request(
    'POST',
    $api,
    [],
    new Psr7\MultipartStream([
        [
            'name'     => 'bigfile',
            'contents' => $stream,
        ],
    ])
);
$response = $client->send($request);
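If you specifically need the Content-Range/Content-Length headers the question asks about (rather than one streamed multipart body), a minimal sketch is below. The endpoint semantics are an assumption — your API must actually accept ranged uploads — and the Guzzle call is guarded so the chunking logic itself runs standalone:

```php
<?php
// Sketch: split a file into chunks and build the Content-Range header
// for each piece. $pathname and $api are placeholders, and the server
// accepting ranged POSTs is an assumption -- adjust to your endpoint.

const CHUNK_SIZE = 1024 * 1024; // 1 MiB per chunk

/**
 * Yields [$chunk, $contentRange] pairs for the given file.
 */
function file_chunks(string $pathname, int $chunkSize = CHUNK_SIZE): \Generator
{
    $total  = filesize($pathname);
    $handle = fopen($pathname, 'rb');
    $offset = 0;

    while (!feof($handle)) {
        $chunk = fread($handle, $chunkSize);
        if ($chunk === '' || $chunk === false) {
            break;
        }
        $end = $offset + strlen($chunk) - 1;
        // e.g. "bytes 0-1048575/2097162"
        yield [$chunk, sprintf('bytes %d-%d/%d', $offset, $end, $total)];
        $offset = $end + 1;
    }

    fclose($handle);
}

// Hypothetical usage with Guzzle (guarded so the sketch runs without it):
if (class_exists(\GuzzleHttp\Client::class)) {
    $client = new \GuzzleHttp\Client();
    foreach (file_chunks($pathname) as [$chunk, $range]) {
        $client->request('POST', $api, [
            'headers' => [
                'Content-Range'  => $range,
                'Content-Length' => strlen($chunk),
            ],
            'body' => $chunk,
        ]);
    }
}
```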

Just use the multipart body type, as described in the documentation. cURL then handles the file reading internally, so you don't need to implement chunked reads yourself. All the required headers will also be set by Guzzle.
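A short sketch of that `multipart` request option follows. The endpoint URL and the field name `bigfile` are placeholders, and the script uses itself (`__FILE__`) as a stand-in for the large file so the snippet is self-contained:

```php
<?php
// Sketch of the 'multipart' request option from the Guzzle docs.
// 'https://example.com/upload' and 'bigfile' are placeholders.

$options = [
    'multipart' => [
        [
            'name'     => 'bigfile',
            // A resource (rather than a string) lets Guzzle stream the
            // file from disk instead of loading it all into memory.
            'contents' => fopen(__FILE__, 'rb'),
            'filename' => 'large-file.bin',
        ],
    ],
];

// Guzzle sets Content-Type (with the multipart boundary) and
// Content-Length for you. Guarded so the sketch runs without Guzzle:
if (class_exists(\GuzzleHttp\Client::class)) {
    $client = new \GuzzleHttp\Client();
    $response = $client->request('POST', 'https://example.com/upload', $options);
}
```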

Alexey Shokov