I have a Python iterator backed by a DB query (a MongoDB cursor in this case). I'm trying to write its contents as a text file on S3, using boto.
The simplest way to do this is to concatenate everything into a string and call key.set_contents_from_string. However, this won't work well for large amounts of data (possibly 1GB+).
s = ""
for entry in entries:
    s += entry
k.set_contents_from_string(s)
Ideally, I'd use key.open_write() so I can write each entry to S3 as I iterate... but that function isn't yet implemented by boto.
k.open_write()
for entry in entries:
    k.write(entry)
How can I work around this? Is there perhaps a way to wrap an iterator to behave like a file object, so that I could use key.send_file?
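For what it's worth, here is a minimal sketch of the kind of wrapper I have in mind: a class that exposes only a `read(size)` method and pulls chunks from the iterator on demand. The class name is made up, and I'm not certain `read()` alone is enough for boto's upload path, so treat this as an assumption, not a tested solution.

```python
class IterStream:
    """Hypothetical file-like adapter over an iterator of strings.

    Implements only read(); whether that satisfies boto's
    send_file / set_contents_from_file is an open question.
    """

    def __init__(self, iterable):
        self._iter = iter(iterable)
        self._buffer = ""

    def read(self, size=-1):
        # Pull chunks from the iterator until we have `size`
        # characters buffered (or the iterator is exhausted).
        # size < 0 means read everything remaining.
        while size < 0 or len(self._buffer) < size:
            try:
                self._buffer += next(self._iter)
            except StopIteration:
                break
        if size < 0:
            data, self._buffer = self._buffer, ""
        else:
            data, self._buffer = self._buffer[:size], self._buffer[size:]
        return data
```

The idea would then be something like `k.set_contents_from_file(IterStream(entries))`, though I don't know whether boto also requires seekability or an up-front size for a non-file stream like this.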