22

Is it possible to upload a file to S3 from a remote server?

The remote server is basically a URL-based file server. For example, requesting http://example.com/1.jpg serves the image. The server does nothing else, and I can't run code on it.

Is it possible to have another server tell S3 to fetch a file from http://example.com/1.jpg?

           upload from http://example.com/1.jpg 
server -------------------------------------------> S3 <-----> example.com
Cory

4 Answers

24

If you can't run code on the server or execute requests, then no, you can't do this. You will have to download the file to a server or computer that you own and upload it from there.

You can see the operations you can perform on amazon S3 at http://docs.amazonwebservices.com/AmazonS3/latest/API/APIRest.html

Checking the operations for both the REST and SOAP APIs, you'll see there's no way to give Amazon S3 a remote URL and have it grab the object for you. All of the PUT requests require the object's data to be provided as part of the request. This means the server or computer initiating the request needs to have the data.

I had a similar problem in the past, where I wanted to download my users' Facebook thumbnails and upload them to S3 for use on my site. The way I did it was to download the image from Facebook into memory on my server, then upload it to Amazon S3; the whole thing took under 2 seconds. After the upload to S3 completed, I wrote the bucket/key to a database.
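That download-into-memory, upload-from-memory flow can be sketched in Python with boto3 (the AWS SDK for Python). The function name and the injectable client parameter are my own; boto3 is imported lazily so the function can also be exercised with a stub client:

```python
import urllib.request

def mirror_to_s3(source_url, bucket, key, s3_client=None):
    """Download a remote file into memory, then upload those bytes to S3.
    Nothing touches disk, which is fine for small objects like thumbnails."""
    if s3_client is None:
        import boto3  # imported lazily so a stub client can be injected for testing
        s3_client = boto3.client("s3")
    with urllib.request.urlopen(source_url) as resp:
        data = resp.read()  # entire object held in memory
    s3_client.put_object(Bucket=bucket, Key=key, Body=data)
    return len(data)
```

For large files you would stream with a multipart upload instead of holding everything in one buffer, but for thumbnails this keeps the hop as short as possible.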

Unfortunately there's no other way to do it.

reach4thelasers
  • How did you download the images to the memory and uploaded them? I am trying to handle a similar situation right now but I could not find a way to download a file and upload it to S3 without saving it to the server disk. – ahmetcetin Feb 28 '19 at 09:58
  • @ahmetcetin just write some code in any language that supports the AWS S3 API (say, Go or Python); in your code, download the file into a variable, then use that same variable to populate the upload in the S3 API call. – Scott Stensland May 03 '21 at 22:37
1

I think the suggestion provided is quite good: you can SCP the file to an EC2 instance and move it into the S3 bucket from there. Supplying the .pem file gives you passwordless authentication, and a PHP script can validate the file extensions and pass the file as an argument to the SCP command.

The only problem with this solution is that you must have an instance in AWS. You can't use it if your website is hosted with another provider and you are trying to upload files straight to an S3 bucket.

1

Technically it's possible using AWS Signature Version 4. Assuming your remote server plays the role of the customer in the linked example, you could prepare a form on the main server and send the form fields to the remote server, which then submits the file with curl. Detailed example here:

Amazon S3 upload using a POST request
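The signing step the main server would perform can be sketched with the Python standard library alone. The field names follow the documented S3 POST policy format, but the one-hour expiry, the conditions list, and the function name are illustrative:

```python
import base64
import datetime
import hashlib
import hmac
import json

def build_post_form(bucket, key, access_key, secret_key, region="us-east-1"):
    """Build signed form fields for a POST upload to S3 (Signature Version 4).
    The main server runs this and hands the fields to the remote machine,
    which POSTs the file to the bucket's endpoint along with them."""
    now = datetime.datetime.utcnow()
    amz_date = now.strftime("%Y%m%dT%H%M%SZ")
    date_stamp = now.strftime("%Y%m%d")
    credential = f"{access_key}/{date_stamp}/{region}/s3/aws4_request"
    policy = {
        "expiration": (now + datetime.timedelta(hours=1)).strftime("%Y-%m-%dT%H:%M:%SZ"),
        "conditions": [
            {"bucket": bucket},
            {"key": key},
            {"x-amz-algorithm": "AWS4-HMAC-SHA256"},
            {"x-amz-credential": credential},
            {"x-amz-date": amz_date},
        ],
    }
    policy_b64 = base64.b64encode(json.dumps(policy).encode()).decode()

    # Signature Version 4 key derivation: date -> region -> service -> request
    def hmac_sha256(key_bytes, msg):
        return hmac.new(key_bytes, msg.encode(), hashlib.sha256).digest()

    signing_key = hmac_sha256(("AWS4" + secret_key).encode(), date_stamp)
    signing_key = hmac_sha256(signing_key, region)
    signing_key = hmac_sha256(signing_key, "s3")
    signing_key = hmac_sha256(signing_key, "aws4_request")
    signature = hmac.new(signing_key, policy_b64.encode(), hashlib.sha256).hexdigest()

    return {
        "key": key,
        "policy": policy_b64,
        "x-amz-algorithm": "AWS4-HMAC-SHA256",
        "x-amz-credential": credential,
        "x-amz-date": amz_date,
        "x-amz-signature": signature,
    }
```

The remote server never needs AWS credentials: it just POSTs the file plus these fields (for example with `curl -F` options) before the policy expires.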

Ayoub Kaanich
-4

You can use the scp command from the terminal.

1) In the terminal, go to the directory containing the file you want to transfer to the server

2) Type this:

scp -i yourAmazonKeypairPath.pem  fileNameThatYouWantToTransfer.php ec2-user@ec2-00-000-000-15.us-west-2.compute.amazonaws.com:

N.B. Add "ec2-user@" before the hostname you got from the EC2 console. Forgetting it is an easy mistake to make!

3) Your file will be uploaded and the progress shown. When it reaches 100%, you are done!

coolcool1994
  • What why would someone down vote this? Can you explain? This works for me! – coolcool1994 May 13 '14 at 05:55
  • 6
    Voted down because this post specifically asks about S3, and S3 is not a normal EC2 instance, so it is not running an SSH server. You need to talk to it over HTTP. – mianos Jul 14 '14 at 13:18
  • I suppose you could fire up an EC2 instance, use it to do the remote copy (server -> EC2 -> S3) then destroy the EC2 instance again. – Malvineous Apr 11 '17 at 00:03