
I upload files to my S3 bucket in many different ways.

In Python I can use boto like this:

from boto.s3.connection import S3Connection

# connect with credentials and print every key in the bucket
conn = S3Connection('access-key', 'secret-access-key')
bucket = conn.get_bucket('bucket')
for key in bucket.list():
    print key.name

In Node I have used knox to connect to buckets and get URLs, but how can I iterate through the keys in Node to see all the files in my bucket?

kkaehler
  • Don't think you can with just knox. http://stackoverflow.com/questions/7459122/updating-headers-of-every-file-in-an-amazon-s3-bucket/7480490#7480490 – Brian D Sep 29 '11 at 17:07
  • Is my answer below what you need? AwsSum can iterate through objects in buckets just fine. – chilts Jun 01 '12 at 03:10
  • If the answer I gave below is fine, please mark it as correct. – chilts Jul 25 '12 at 03:36

2 Answers


If your buckets get big, it's best to stream those keys! Check out knox-copy:

var knoxCopy = require('knox-copy');

var client = knoxCopy.createClient({
  key: '<api-key-here>',
  secret: '<secret-here>',
  bucket: 'mrbucket'
});

client.streamKeys({
  // omit the prefix to list the whole bucket
  prefix: 'buckets/of/fun'
}).on('data', function(key) {
  // 'data' fires once for each key in the listing
  console.log(key);
});
hurrymaplelad

You can do it with AwsSum. It is actively maintained and can perform ALL the S3 operations provided by Amazon.

There is a fully featured example of exactly what you're looking for in the node-awssum-scripts repo. It fetches the first 1000 keys, then keeps making new requests using the 'marker' parameter until there are no more keys, so you may want to look at that.
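For illustration only, here is a minimal sketch of that same marker-based loop, written against the official aws-sdk package rather than AwsSum (whose exact call signatures I won't reproduce from memory). The bucket name is a placeholder, and it assumes credentials are picked up from the environment or ~/.aws/credentials:

var AWS = require('aws-sdk');

var s3 = new AWS.S3(); // credentials assumed to come from the environment

// List every key in the bucket, up to 1000 per request, using the last key
// of each page as the Marker for the next request until IsTruncated is false.
function listAllKeys(bucket, marker, done) {
  var params = { Bucket: bucket };
  if (marker) params.Marker = marker;

  s3.listObjects(params, function (err, data) {
    if (err) return done(err);

    data.Contents.forEach(function (obj) {
      console.log(obj.Key);
    });

    if (data.IsTruncated) {
      // keep paging: the last key returned becomes the next marker
      var lastKey = data.Contents[data.Contents.length - 1].Key;
      listAllKeys(bucket, lastKey, done);
    } else {
      done(null);
    }
  });
}

listAllKeys('my-bucket', null, function (err) {
  if (err) console.error(err);
});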

If you need any help, give me a shout on GitHub. Disclaimer: I'm chilts, author of AwsSum. :)

chilts