
I'm attempting to upload a file from the local hard disk to Amazon S3 using the League Flysystem AWS S3 v3 adapter. Here is my function:

/**
 * Uploads a file from the local disk to S3.
 *
 * @param string $filename
 * @param Illuminate\Filesystem\FilesystemAdapter $s3
 * @param Illuminate\Filesystem\FilesystemAdapter $local
 * @param string $path
 * @return string|false The remote path on success, or false on failure.
 */
protected function uploadFile($filename, $s3, $local, $path)
{
    // Strip the directory portion, keeping only the base file name.
    $file = substr($filename, strrpos($filename, '/') + 1);

    if ($local->has($filename)) {
        try {
            Log::info("Uploading file", ['filename' => $filename, 'host' => gethostname()]);

            // Read the whole file from the local disk and write it to S3.
            $s3->write($path . $file, $local->read($filename));

            return $path . $file;
        } catch (\Exception $e) {
            Log::error('Error uploading file', ['filename' => $filename, 'message' => $e->getMessage(), 'host' => gethostname()]);
            return false;
        }
    }

    Log::notice('File does not exist on local disk, cannot upload', ['filename' => $filename, 'host' => gethostname()]);
    return false;
}

I'm not 100% sure what has changed, but now we're receiving this error from AWS when attempting to upload:

AWS HTTP error: Client error response [url] https://s3.amazonaws.com/bucket/directory/file.ext?partNumber=1&uploadId=BigLoNGsTring_oF_CHaracTers.moREstuFF-- [status code] 400 [reason phrase] Bad Request XAmzContentSHA256Mismatch (client): The provided 'x-amz-content-sha256' header does not match what was computed.

The string above is part of a very large XML string returned as the message of the Exception caught in the catch block of the code above.

From the sparse documentation I can find about this error, it seems to indicate that S3 thinks the content I'm sending is corrupted: the hash placed in the x-amz-content-sha256 header for each part of the upload doesn't match what Amazon computes on its end.
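
For what it's worth, that header is supposed to carry the hex-encoded SHA-256 of the request payload, so one sanity check would be hashing the file locally. This is only a diagnostic sketch, and the path is a placeholder:

// Hex-encoded SHA-256 of the whole file; for a single-part PutObject this is
// what would go into the x-amz-content-sha256 header. Multipart uploads hash
// each part separately, so this only confirms the file itself is stable.
$localPath = '/path/to/file.ext'; // placeholder path
echo hash_file('sha256', $localPath), PHP_EOL;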

I'm confident that at some point this was working without issue; I'm just not quite sure where to start debugging it. The permissions on the local file allow anyone to read it, and I can successfully create directories in the S3 bucket I'm attempting to upload to, so I don't think it's a permissions problem on either side.

Am I correct in my understanding of what this error means?

If it makes any difference, the files I'm attempting to upload to S3 are large video files (generally >250MB, <2GB).
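
For files this size, a streaming variant of the upload would at least avoid holding the whole video in memory. This is just a sketch, assuming the adapters expose Flysystem's readStream/writeStream; I haven't verified that it changes the hashing behaviour:

// Sketch only: stream the file to S3 instead of reading it fully into memory.
// readStream()/writeStream() are standard Flysystem methods, but whether this
// affects the x-amz-content-sha256 mismatch is an assumption.
$stream = $local->readStream($filename);

if ($stream !== false) {
    $s3->writeStream($path . $file, $stream);

    if (is_resource($stream)) {
        fclose($stream);
    }
}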

Update

I am experiencing this issue locally as well, so I tried one of the suggestions and restarted my local machine, but the issue remains. I also ran something else that uses this same function (it uploads smaller JPG screenshots of the video files in question), and it was able to upload them without a problem. For more context, below is the calling code:

// Encode the video at the requested bitrate unless it already exists locally.
$format->setKiloBitrate($kiloBitrate);

if (!$local->has($newFilename)) {
    $video->save($format, $newFilename);
    Log::debug($extension . '@' . $kiloBitrate . 'kbps Encoding Finished');
} else {
    Log::debug("File already encoded!");
}

// Upload the encoded file to S3 and delete the local copy on success.
if (!$this->uploadFile($newFilename, $s3, $local, $remotePath . $kiloBitrate . '/')) {
    Log::error("Failed to upload file", ['filename' => $newFilename, 'host' => gethostname()]);
} else {
    $local->delete($newFilename);
}

Above, $video is an instance of FFMpeg\Media\Video and $format is an instance of FFMpeg\Format\Video\DefaultVideo. As far as I can tell, no other changes are made to the file after the $video->save(...) line.
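
To rule out the possibility that the file is still being written while the upload runs (as one of the comments below suggests), a diagnostic I could wrap around the call might look like this. It's only a sketch, not part of the original code, and $localBasePath is a placeholder for the local disk's root:

// Diagnostic sketch: hash the file before and after the upload attempt.
// If the two hashes differ, the contents changed mid-upload, which would
// explain the x-amz-content-sha256 mismatch.
$fullPath = $localBasePath . '/' . $newFilename; // $localBasePath is a placeholder

$hashBefore = hash_file('sha256', $fullPath);
$result = $this->uploadFile($newFilename, $s3, $local, $remotePath . $kiloBitrate . '/');
$hashAfter = hash_file('sha256', $fullPath);

if ($hashBefore !== $hashAfter) {
    Log::warning('File changed during upload', ['filename' => $newFilename, 'host' => gethostname()]);
}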

Update #2

The issue is gone. I'm not posting this as an answer, because it isn't one. This is what I did:

$ composer update
Loading composer repositories with package information
Updating dependencies (including require-dev)                             
  - Removing aws/aws-sdk-php (3.1.0)
  - Installing aws/aws-sdk-php (3.2.0)
    Downloading: 100%

Looking at the commits on GitHub for the AWS SDK, I think my issue may have been fixed by this submitted PR, but I'm really not sure.

  • Have you tried restarting the server? I had asked this question: http://stackoverflow.com/questions/30857543/laravel-5-1-hhvm-s3exception-in-wrappedhttphandler-php-line-152 earlier which seemed to resolve itself after I restarted the server. – Raphael Rafatpanah Jul 14 '15 at 22:12
  • No, have not tried that yet, and will have to wait until the morning. Thanks for the link + idea – Jeff Lambert Jul 14 '15 at 22:48
  • This could occur if the file's contents changed while you were uploading it. Is that a possibility? – Michael - sqlbot Jul 15 '15 at 02:16
