154

When I try to upload a folder with subfolders to S3 through the AWS console, only the files are uploaded, not the subfolders.

You also can't select a folder. It always requires opening the folder first before you can select anything.

Is this even possible?

random
  • 9,774
  • 10
  • 66
  • 83
chrismarx
  • 11,488
  • 9
  • 84
  • 97

15 Answers

239

I suggest you use the AWS CLI, as this is very easy from the command line:

    aws s3 cp SOURCE_DIR s3://DEST_BUCKET/ --recursive

or you can use `sync`:

    aws s3 sync SOURCE_DIR s3://DEST_BUCKET/

Remember that you have to install the AWS CLI and configure it with your Access Key ID and Secret Access Key:

    pip install --upgrade --user awscli
    aws configure
Asad
  • 2,782
  • 2
  • 16
  • 17
  • 7
    The question specifically asks how to do this with the console. – Tim Gautier Aug 13 '17 at 01:49
  • 21
    This is misleading based on whether the user wants to copy the local folder itself to s3, or only the contents of that folder. `aws s3 cp SOURCE_DIR s3://DEST_BUCKET/ --recursive` will not result in the creation of `s3://DEST_BUCKET/SOURCE_DIR`, but having this remote folder automatically created is the intended behavior most of the time. – ely Aug 19 '17 at 21:54
I agree with @ely, but I'm not sure what the correct way is to create `s3://DEST_BUCKET/SOURCE_DIR` as part of the aws s3 cp command. I just manually made a folder on the S3 bucket with the same name and recursively copied into that. – piedpiper Sep 09 '20 at 23:49
  • 1
    If you use `s3://DEST_BUCKET/SOURCE_DIR` as the destination in this command, it will automatically create `SOURCE_DIR` without having to create it manually. – Ryan Heise May 09 '23 at 01:58
Quick question: if a new object gets created in SOURCE_DIR and you run the sync command, will it automatically push that new object to S3? – Wolfy Jul 18 '23 at 20:40
64

You don't need the Enhanced Uploader (which I believe does not exist anymore) or any third-party software (which always carries the risk that someone will steal your private data or access keys from the S3 bucket, or even from all your AWS resources).

Since the new AWS S3 web upload manager supports drag'n'drop for files and folders, just log in to https://console.aws.amazon.com/s3/home and start the uploading process as usual, then drag the folder from your desktop directly onto the S3 page.

Nae
  • 14,209
  • 7
  • 52
  • 79
Kainax
  • 1,431
  • 19
  • 29
  • 5
    Drag 'n' drop does not work for me - Win8.1/FireFox 41.0.2. Not sure whether "..start the uploading process as usual" has some meaning that is not obvious to me.. :-/ – ScottWelker Oct 23 '15 at 17:18
  • 2
    Oh! Wait! Works with Chrome 46.n - after I "start the uploading process". Thanks for the Tip! – ScottWelker Oct 23 '15 at 17:43
  • 1
    I have 20000 files in my folder, and the upload doesn't work when I select all the files, both via drag-and-drop as well as "click to upload". It works only with a smaller subset of images (like, a few hundred or thousand at best). – Kristada673 Jun 07 '18 at 10:13
  • 1
This works for small files & folders, but after a certain size the browser/interface will crash, so you're better off using @Asad's solution below – contool Sep 13 '19 at 16:45
@contool Lol, why would the browser crash? It splits large files into packets and sends them in small portions. It never loads the whole file into memory. I've been uploading several gigabytes weekly for 6 years now, and it has never crashed, not even once. And the question was about the AWS console in a browser, not the CLI as in Asad's answer. – Kainax Jan 08 '20 at 22:01
  • @Kainax I think it's a matter of the number of files, not the individual sizes of the files. I've never had an issue with a few large files, but at the time of this comment, 1000's or 10000's of small files will crash Chrome on a memory usage exception (~8GB or so). I imagine that there's success/error/progress feedback getting stored in JS for each file that's not deleted or garbage collected until the operation is finished or the page reloads. CLI is a valid recommendation to make in spite of the question being about the console - or rather, the console as it existed 9 years ago. – Supra621 Aug 18 '20 at 09:39
  • This does not work. Dragging a folder onto the area explodes all directories and lists files in the same directory on S3. – Cybernetic Oct 28 '21 at 20:01
It works for me as expected (using Chrome). Maybe you are dragging your folder to the wrong frame? There is a new "ADD FOLDER" button too, next to the old "ADD FILES" button; you can try it. – Kainax Nov 11 '21 at 15:01
40

Execute something similar to the following command:

aws s3 cp local_folder_name s3://s3_bucket_name/local_folder_name/ --recursive
fcdt
  • 2,371
  • 5
  • 14
  • 26
jafig
  • 401
  • 4
  • 2
39

The Amazon S3 Console now supports uploading entire folder hierarchies. Enable the Enhanced Uploader in the Upload dialog and then add one or more folders to the upload queue.

http://console.aws.amazon.com/s3

Dan Winn
  • 423
  • 4
  • 3
32

Normally I use the Enhanced Uploader available via the AWS management console. However, since that requires Java, it can cause problems. I found s3cmd to be a great command-line replacement. Here's how I used it:

s3cmd --configure   # enter access keys, enable HTTPS, etc.
s3cmd sync <path-to-folder> s3://<path-to-s3-bucket>/
gabrtv
  • 3,558
  • 2
  • 23
  • 28
5

I had trouble finding the Enhanced Uploader tool for uploading a folder and the subfolders inside it to S3. But rather than finding a tool, I discovered I could upload a folder along with all its subfolders by simply dragging and dropping it into the S3 bucket.

Note: This drag and drop feature doesn't work in Safari. I've tested it in Chrome and it works just fine.

Drag and drop

After you drag and drop the files and folders, this screen finally opens up to upload the content.


Reaz Murshed
  • 23,691
  • 13
  • 78
  • 98
  • 1
    I gave this answer in June 2015 :) – Kainax Feb 12 '18 at 10:01
  • 1
Yes, your answer is fine. However, it did not work for me in Safari; I had to use Google Chrome. Hence, I thought it would be easier for other developers to get immediate help if I put up the overall process with images. Hope you understand. – Reaz Murshed Feb 12 '18 at 18:37
4

Solution 1:

var AWS = require('aws-sdk');
var path = require("path");
var fs = require('fs');

const uploadDir = function(s3Path, bucketName) {

    let s3 = new AWS.S3({
        accessKeyId: process.env.S3_ACCESS_KEY,
        secretAccessKey: process.env.S3_SECRET_KEY
    });

    function walkSync(currentDirPath, callback) {
        fs.readdirSync(currentDirPath).forEach(function (name) {
            var filePath = path.join(currentDirPath, name);
            var stat = fs.statSync(filePath);
            if (stat.isFile()) {
                callback(filePath, stat);
            } else if (stat.isDirectory()) {
                walkSync(filePath, callback);
            }
        });
    }

    walkSync(s3Path, function(filePath, stat) {
        let bucketPath = filePath.substring(s3Path.length+1);
        let params = {Bucket: bucketName, Key: bucketPath, Body: fs.readFileSync(filePath) };
        s3.putObject(params, function(err, data) {
            if (err) {
                console.log(err)
            } else {
                console.log('Successfully uploaded '+ bucketPath +' to ' + bucketName);
            }
        });

    });
};
uploadDir("path to your folder", "your bucket name");

Solution 2:

aws s3 cp SOURCE_DIR s3://DEST_BUCKET/ --recursive

Muhammad Numan
  • 23,222
  • 6
  • 63
  • 80
1

Custom endpoint

If you have a custom endpoint implemented by your IT department, try this:

aws s3 cp <local-dir> s3://bucket-name/<destination-folder>/ --recursive --endpoint-url https://<s3-custom-endpoint.lan>
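
If you are scripting this instead of using the CLI, boto3 accepts the same kind of custom endpoint. A minimal sketch, reusing the placeholder endpoint and names from the command above:

import os
import boto3

# boto3 takes the custom endpoint the same way the CLI's --endpoint-url does
s3 = boto3.client("s3", endpoint_url="https://s3-custom-endpoint.lan")

def upload_dir(local_dir, bucket, prefix):
    """Walk local_dir and upload every file under the given key prefix."""
    for dirpath, _dirnames, filenames in os.walk(local_dir):
        for name in filenames:
            path = os.path.join(dirpath, name)
            # each key mirrors the file's path relative to local_dir
            key = prefix + "/" + os.path.relpath(path, local_dir).replace(os.sep, "/")
            s3.upload_file(path, bucket, key)

upload_dir("local-dir", "bucket-name", "destination-folder")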
s510
  • 2,271
  • 11
  • 18
0

It's worth mentioning that if you are simply using S3 for backups, you should just zip the folder and then upload that. This will save you upload time and costs.

If you are not sure how to do efficient zipping from the terminal, have a look here for OSX.

The command `zip -r archive_name.zip folder_to_compress` will do it. Alternatively, a client such as 7-Zip would be sufficient for Windows users.
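
If you want to script that backup flow end to end, here is a minimal Python sketch using shutil.make_archive and boto3; the bucket name and paths are placeholders:

import shutil
import boto3

# zip the folder first; make_archive appends the .zip extension itself
archive = shutil.make_archive("archive_name", "zip", "folder_to_compress")

# upload the single archive instead of many small files
s3 = boto3.client("s3")
s3.upload_file(archive, "your-bucket-name", "backups/archive_name.zip")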

SGouws
  • 321
  • 1
  • 12
0

I do not see Python answers here. You can script the folder upload using Python/boto3. Here's how to recursively get all file names from a directory tree:

import os

def recursive_glob(treeroot, extension):
    results = [os.path.join(dirpath, f)
               for dirpath, dirnames, files in os.walk(treeroot)
               for f in files if f.endswith(extension)]
    return results

Here's how to upload a file to S3 using Python with the legacy boto library:

from boto.s3.key import Key  # legacy boto; `bucket`, `progress`, and `use_rr` are assumed defined elsewhere

k = Key(bucket)
k.key = s3_key_name
k.set_contents_from_file(file_handle, cb=progress, num_cb=20, reduced_redundancy=use_rr)

I used these ideas to write Directory-Uploader-For-S3
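
If you are on boto3 (the current SDK) rather than legacy boto, the two snippets above combine into something like this sketch; the bucket name is a placeholder, and recursive_glob is the function defined above:

import os
import boto3

s3 = boto3.client("s3")

# upload every file found by recursive_glob, mirroring the local layout
for path in recursive_glob("my-folder", ""):  # "" matches every file
    key = path.replace(os.sep, "/")  # S3 keys use forward slashes
    s3.upload_file(path, "my-bucket", key)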

olekb
  • 638
  • 1
  • 9
  • 28
0

I ended up here when trying to figure this out. With the version that's up there right now, you can drag and drop a folder into it and it works, even though it doesn't allow you to select a folder when you open the upload dialogue.

ConorLuddy
  • 2,217
  • 2
  • 19
  • 18
0

You can drag and drop those folders. Drag and drop functionality is supported only for the Chrome and Firefox browsers. Please refer to this link: https://docs.aws.amazon.com/AmazonS3/latest/user-guide/upload-objects.html

vendeeshwaran Chandran
  • 1,383
  • 1
  • 13
  • 13
0

You can use the Transfer Manager to upload multiple files, directories, etc. More info:

https://docs.aws.amazon.com/sdk-for-java/v1/developer-guide/examples-s3-transfermanager.html

srj
  • 9,591
  • 2
  • 23
  • 27
Mukesh Kumar
  • 333
  • 1
  • 5
  • 20
0

You can upload files by dragging and dropping or by pointing and clicking. To upload folders, you must drag and drop them. Drag and drop functionality is supported only for the Chrome and Firefox browsers.

ABHAY JOHRI
  • 1,997
  • 15
  • 19
-2

Drag and drop is only usable for a relatively small set of files. If you need to upload thousands of them in one go, then the CLI is the way to go. I managed to upload 200,000+ files using one command...