
Due to Cloudflare's 250MB upload limit, I am trying to work around it with chunked transfer encoding, which is supposed to upload the file in chunks so that I don't get a 413 Request Entity Too Large error. I followed the general idea of this:

https://github.com/php-curl-class/php-curl-class/issues/369

It is still returning that error, but I don't know how to properly verify from the headers that the transfer is actually chunked, so maybe I'm just messing something up?

$stream = fopen($getFile, 'r');

// Create a curl handle to upload to the file server
$ch = curl_init($getServer . '/Upload?server=' . $getOldest['vt_server'] . '&video=' . $getOldest['v_key'] . '&type=' . $getOldest['vt_filetype']);
// Send a PUT request
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PUT');
// Let curl know that we are sending an entity body
curl_setopt($ch, CURLOPT_UPLOAD, true);
// Let curl know that we are using a chunked transfer encoding
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Transfer-Encoding: chunked'));
// Use a callback to provide curl with data to transmit from the stream
curl_setopt($ch, CURLOPT_READFUNCTION, function ($ch, $fd, $length) use ($stream) {
    // Return up to $length bytes per call; an empty string signals EOF
    return fread($stream, $length);
});
curl_exec($ch);
curl_close($ch);
fclose($stream);
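For anyone wondering how I could check what curl actually sends: here is a small debugging sketch using `CURLINFO_HEADER_OUT`, which records the outgoing request headers so I can confirm `Transfer-Encoding: chunked` is really on the wire. The URL and payload here are placeholders, not my real endpoint:

```php
<?php
// Debug sketch: capture the request headers curl actually sends.
// The URL and the in-memory stream are placeholders for the real upload.
$stream = fopen('php://temp', 'r+');
fwrite($stream, str_repeat('x', 4096));
rewind($stream);

$ch = curl_init('https://example.com/Upload');
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PUT');
curl_setopt($ch, CURLOPT_UPLOAD, true);
curl_setopt($ch, CURLOPT_HTTPHEADER, ['Transfer-Encoding: chunked']);
curl_setopt($ch, CURLOPT_READFUNCTION, function ($ch, $fd, $length) use ($stream) {
    return fread($stream, $length);
});
// Ask curl to record the outgoing request headers instead of guessing:
curl_setopt($ch, CURLINFO_HEADER_OUT, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);

// Contains the exact request line and headers that were sent
echo curl_getinfo($ch, CURLINFO_HEADER_OUT);
curl_close($ch);
fclose($stream);
```

If `Transfer-Encoding: chunked` shows up in that output, the request itself is correct and the problem is on Cloudflare's side.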

To add a bit more information: there is another part of the code where the user uploads the file through a form, and JS / PHP then does the chunked uploading, which lets me bypass Cloudflare's limit. It's the other direction (sending from my webserver onward) that doesn't work the way it should.

$putdata = fopen("php://input", "r");

$fp = fopen($path['videos'] . '/' . $_GET['video'] . '.' . $_GET['type'], "w");

while ($data = fread($putdata, 1024 * 1024)) {
    fwrite($fp, $data);
}

fclose($fp);
fclose($putdata);

Added code for how the data is read / written.
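For completeness, if the pieces ever arrive as separate PUT requests rather than one chunked stream, the receiving side above would overwrite the file on every request because of the `"w"` mode. Here is a sketch of an append-based receiver; the `chunk` query parameter is hypothetical, not something my code currently sends:

```php
<?php
// Sketch: receive one piece per PUT request and append it to the target
// file. The "chunk" query parameter is hypothetical; piece 0 truncates
// the file, later pieces append in order.
$putdata = fopen("php://input", "r");

$mode = (isset($_GET['chunk']) && (int)$_GET['chunk'] > 0) ? "ab" : "wb";
$fp = fopen($path['videos'] . '/' . $_GET['video'] . '.' . $_GET['type'], $mode);

while ($data = fread($putdata, 1024 * 1024)) {
    fwrite($fp, $data);
}

fclose($fp);
fclose($putdata);
```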

UPDATE: I tried contacting Cloudflare about HTTP chunked transfers and whether they are possible, and they didn't give me a specific answer beyond advertising their "Cloudflare Streaming" platform. I'm still struggling with this issue. Does anyone know much about Cloudflare's upload limit and chunked uploading with it?
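Given the suggestion in the comments that Cloudflare may mean "smaller pieces" rather than HTTP chunked encoding, here is a sketch of what I'm now considering: splitting the file into separate PUT requests, each well under the 250MB limit. The `chunk` parameter and the piece size are my own assumptions, and the receiving side would need to reassemble the pieces in order:

```php
<?php
// Sketch: send the file as several independent PUT requests, each under
// Cloudflare's 250MB limit, instead of one chunked-encoded request.
// The endpoint and the "chunk" query parameter are hypothetical.
function uploadInPieces(string $file, string $baseUrl, int $pieceSize = 100 * 1024 * 1024): void
{
    $stream = fopen($file, 'rb');
    $index = 0;
    while (!feof($stream)) {
        $piece = fread($stream, $pieceSize);
        if ($piece === '' || $piece === false) {
            break;
        }
        $ch = curl_init($baseUrl . '&chunk=' . $index);
        curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PUT');
        curl_setopt($ch, CURLOPT_POSTFIELDS, $piece); // the body is one piece
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_exec($ch);
        curl_close($ch);
        $index++;
    }
    fclose($stream);
}
```

Since each request carries a complete, sub-limit body, Cloudflare can see the Content-Length up front and none of them should trip the 413.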

Cameron
  • They can still check the size of the upload (once it arrives at their server), so I don't find it strange that they then return an error, since you go over their stated limit. – Daniel Stenberg Jun 19 '18 at 07:55
  • While that is true, it even says in their own documentation to upload it in chunks. https://support.cloudflare.com/hc/en-us/articles/201303340-How-can-I-change-the-client-maximum-upload-size- – Cameron Jun 19 '18 at 14:03
  • I think they mean chunks as in "smaller pieces", not HTTP chunked encoding... – Daniel Stenberg Jun 19 '18 at 14:11
  • Hmm, if it didn't support HTTP chunked encoding, then technically I shouldn't have been able to upload a 700MB file in the first place. I should be a bit clearer: the first part of this all works. A user uploads the files and the chunks get sent to the main webserver; it's the part of sending from the webserver to somewhere else where the issue occurs. – Cameron Jun 19 '18 at 23:17

0 Answers