So I was able to set up the direct upload on the view and send a request to the s3_presigned_url that I've created. The problem is: I'm getting a 403 Forbidden.
My main guess is that the issue is the request method being used, which is a POST but should in fact be a PUT. I also believe some headers are missing or wrong, like Content-Type and Content-MD5.
I've followed the Rails Tutorial and this answer; here's my code
presigned_url
s3.bucket(ENV['S3_BUCKET']).object(upload_key).presigned_url(:put, acl: 'public-read')
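For completeness, the s3 resource and upload_key come from something like the sketch below (the region setup and the content_type value are placeholders; signing content_type is just my guess at how the Content-Type header would have to be covered):
require 'aws-sdk-s3'
require 'securerandom'

# Credentials come from the usual AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY env vars.
s3 = Aws::S3::Resource.new(region: ENV['AWS_REGION'])

upload_key = "uploads/#{SecureRandom.uuid}.jpg" # the key/path I want inside the bucket

# If content_type is part of the signature, the actual upload request has to
# send the exact same Content-Type header, otherwise S3 returns 403.
@presigned_url = s3.bucket(ENV['S3_BUCKET'])
                   .object(upload_key)
                   .presigned_url(:put, acl: 'public-read', content_type: 'image/jpeg')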
form.html.erb
<%= form.file_field :image, direct_upload: true, data: { upload_target: 'input', action: "change->upload#uploadFile", direct_upload_url: @presigned_url } %>
Note that I've changed the direct-upload-url from /rails/active_storage/direct_uploads to my presigned URL. Not sure if this was right, but it was the only way I found to send the upload to the presigned URL; I'm not sure if there is another way to specify the bucket and file path.
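As far as I can tell the bucket and key are already baked into the presigned URL itself, so passing the whole URL through the data attribute is the only handle the front-end gets. A quick rails console check of what I mean (the key here is throwaway):
require 'aws-sdk-s3'
require 'uri'

s3  = Aws::S3::Resource.new(region: ENV['AWS_REGION'])
url = s3.bucket(ENV['S3_BUCKET']).object('uploads/test.jpg').presigned_url(:put)

URI(url).host # => "<bucket>.s3.<region>.amazonaws.com"  (bucket lives in the host)
URI(url).path # => "/uploads/test.jpg"                   (key is the path)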
upload_controller.js
// app/javascripts/controllers/upload_controller.js
import { Controller } from "@hotwired/stimulus";
import { DirectUpload } from "@rails/activestorage";

export default class extends Controller {
  static targets = ["input", "progress"];

  uploadFile() {
    Array.from(this.inputTarget.files).forEach((file) => {
      const upload = new DirectUpload(
        file,
        this.inputTarget.dataset.directUploadUrl,
        this // callback delegate for directUploadWillStoreFileWithXHR(request)
      );

      upload.create((error, blob) => {
        if (error) {
          console.log(error);
        } else {
          this.createHiddenBlobInput(blob);
          // If you're not submitting a form after the upload, attach the
          // uploaded blob to some model here and skip the hidden input.
        }
      });
    });
  }

  // Add the blob's signed id so it is submitted with the form.
  createHiddenBlobInput(blob) {
    const hiddenField = document.createElement("input");
    hiddenField.setAttribute("type", "hidden");
    hiddenField.setAttribute("value", blob.signed_id);
    hiddenField.name = this.inputTarget.name;
    this.element.appendChild(hiddenField);
  }

  directUploadWillStoreFileWithXHR(request) {
    request.upload.addEventListener("progress", (event) => {
      this.progressUpdate(event);
    });
  }

  progressUpdate(event) {
    const progress = (event.loaded / event.total) * 100;
    this.progressTarget.innerHTML = progress;
    // If you navigate away from the form, progress can still be displayed
    // with something like this:
    // document.querySelector("#global-progress").innerHTML = progress;
  }
}
Request Headers
Accept: application/json
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.9,pt;q=0.8
Connection: keep-alive
Content-Length: 118
Content-Type: application/json
Host: *my_bucket_url*
Origin: http://localhost:3000
Referer: http://localhost:3000/
sec-ch-ua: "Google Chrome";v="87", " Not;A Brand";v="99", "Chromium";v="87"
sec-ch-ua-mobile: ?0
Sec-Fetch-Dest: empty
Sec-Fetch-Mode: cors
Sec-Fetch-Site: cross-site
User-Agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/87.0.4280.88 Safari/537.36
X-CSRF-Token: uYpbxLFUaApHXl00j_qWXbQOgjTUTMgx33nmBr2Xh4NYKC5yD6t1F6bIAN-lX5FpKd-Yx2hBDD64DtFmf7UI3A
X-Requested-With: XMLHttpRequest
And the error: The request signature we calculated does not match the signature you provided. Check your key and signing method.
UNSIGNED-PAYLOAD
Not sure how to set the correct headers in this upload to S3.
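One thing I might try, to rule out the JavaScript side, is PUT-ing a file at the presigned URL directly from a Rails console, roughly like this (just a sketch; the Content-Type only matters if it was included when signing):
require 'net/http'

uri = URI(presigned_url) # the same URL generated above

response = Net::HTTP.start(uri.host, uri.port, use_ssl: true) do |http|
  request = Net::HTTP::Put.new(uri)
  request['Content-Type'] = 'image/jpeg' # must match whatever was signed, if anything
  request.body = File.binread('test.jpg')
  http.request(request)
end

puts response.code # "200" means the URL and headers are acceptable to S3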
UPDATE:
After correctly configuring storage.yml with my AWS credentials and using the default direct upload URL, Active Storage was able to upload the file, but it does so under a random key at the root of the bucket. From what I've read there's no way to set the key of the upload, so this won't work; I'll look for another solution.
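For reference, the storage.yml entry I ended up with looks roughly like this (the region is an example), with config.active_storage.service = :amazon set in the environment config:
# config/storage.yml
amazon:
  service: S3
  access_key_id: <%= ENV['AWS_ACCESS_KEY_ID'] %>
  secret_access_key: <%= ENV['AWS_SECRET_ACCESS_KEY'] %>
  region: us-east-1
  bucket: <%= ENV['S3_BUCKET'] %>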