
I'm getting a random crash while uploading a file to S3 using Laravel's file storage system. The crash is not reproducible in the local/dev environment, and in production it is also very random. All the files still get uploaded to S3 successfully. The issue occurs randomly for any file type (pdf, png, jpg). File sizes are usually 1 MB to 3 MB.

Aws\Exception\CouldNotCreateChecksumException A sha256 checksum could not be calculated for the provided upload body, because it was not seekable. To prevent this error you can either 1) include the ContentMD5 or ContentSHA256 parameters with your request, 2) use a seekable stream for the body, or 3) wrap the non-seekable stream in a GuzzleHttp\Psr7\CachingStream object. You should be careful though and remember that the CachingStream utilizes PHP temp streams. This means that the stream will be temporarily stored on the local disk.
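
For context, option 3 from that message would look roughly like this when calling the SDK directly (a minimal sketch, not my actual code; the client options, bucket, key, and source stream are placeholder assumptions):

use Aws\S3\S3Client;
use GuzzleHttp\Psr7\CachingStream;
use GuzzleHttp\Psr7\Utils;

$s3 = new S3Client([
    'region'  => 'us-east-1',  // placeholder region
    'version' => 'latest',
]);

// Wrapping a non-seekable stream in CachingStream lets SignatureV4 rewind
// the body to compute the sha256 payload hash. As the message warns, the
// cache is a php://temp stream, so large bodies can spill to local disk.
$body = new CachingStream(Utils::streamFor(fopen('php://stdin', 'r')));

$s3->putObject([
    'Bucket' => 'my-bucket',      // placeholder
    'Key'    => 'fileo/my-file',  // placeholder
    'Body'   => $body,
]);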

Crashed in non-app: /vendor/aws/aws-sdk-php/src/Signature/SignatureV4.php in Aws\Signature\SignatureV4::getPayload
/app/Http/Controllers/ApiController.php in App\Http\Controllers\ApiController::__invoke at line 432

use Illuminate\Support\Facades\Storage;

// Store the upload under "fileo/" using the caller-supplied file_id as the object name
$filename = $request->file('file')->getClientOriginalName();
$user_file_id = $request->input('file_id');
$path = Storage::putFileAs(
    'fileo',
    $request->file('file'),
    $user_file_id
);
return $path;

2 Answers


I had the same error message, but files were not being saved to S3, so my case may be different. I followed the answer StackOverflow - update php.ini to increase upload limits, and the error stopped.


I had the same issue with Laravel and MinIO object storage. The problem was my /etc/php.ini configuration: I had messed up some values. Make sure you did not change these, or if you did, make sure they are correct.

upload_max_filesize = 1024M
max_file_uploads = 25    ; for example
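
To confirm which values are actually in effect (php.ini edits can be overridden per FPM pool or per directory), here is a quick sketch; post_max_size is included because it must be at least as large as upload_max_filesize:

// Print the effective upload-related limits for the running PHP process.
// Run this under the same SAPI that serves the app (e.g. php-fpm);
// the CLI may read a different php.ini.
foreach (['upload_max_filesize', 'post_max_size', 'max_file_uploads'] as $key) {
    echo $key, ' = ', ini_get($key), PHP_EOL;
}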