
I'm attempting to stream several large, symmetrically encrypted .csv.gpg files (40 GB+ each) from S3 through gnupg to an output stream.

I'd like to process the files in chunks using streams so that we never need to hold an entire encrypted or decrypted file on disk or in memory.

Here's an example that uses the AWS Ruby S3 SDK to download chunks of the object and pass them to gnupg for decryption, on Ruby 2.5.0 with the ruby-gpgme gem.

crypto = GPGME::Crypto.new

# With a block, the SDK streams the object body and yields it in chunks
s3_client.get_object(bucket: BUCKET, key: KEY) do |chunk|
  # Decrypt each chunk and write the plaintext to STDOUT
  crypto.decrypt(chunk, password: PASSWORD, output: $stdout)
end
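
One variant I've been considering (untested) is to stop calling decrypt once per chunk and instead hand GPGME a single continuous stream, by pumping the S3 chunks through an IO.pipe from a separate thread. This is only a sketch, reusing the same s3_client, BUCKET, KEY, and PASSWORD placeholders as above; I'm not certain GPGME handles a pipe reader this way in practice.

require 'aws-sdk-s3'
require 'gpgme'

crypto = GPGME::Crypto.new
s3_client = Aws::S3::Client.new

# Pipe the download into gnupg so it sees one continuous ciphertext stream
# instead of a separate message per chunk.
reader, writer = IO.pipe

producer = Thread.new do
  begin
    s3_client.get_object(bucket: BUCKET, key: KEY) do |chunk|
      writer.write(chunk)
    end
  ensure
    writer.close # EOF tells gnupg the ciphertext is complete
  end
end

crypto.decrypt(reader, password: PASSWORD, output: $stdout)
producer.join
reader.close

The idea is that the pipe gives gnupg ordinary EOF semantics, so it can decide for itself how much ciphertext to buffer before emitting plaintext, while the pipe's backpressure keeps memory use bounded.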

When running this, I see valid decrypted CSV data in STDOUT (good!) up until it fails at the end of the first chunk:

~/.rvm/gems/ruby-2.5.0/gems/gpgme-2.0.14/lib/gpgme/ctx.rb:435:in `decrypt_verify': Decryption failed (GPGME::Error::DecryptFailed)

This is where I'm stuck.

  • Can gnupg decrypt chunks at a time, or must it read the entire file before writing the output? (There's a sketch of what I mean just after this list.)
  • Do the chunks need to be of a certain size and/or delimited in some way?
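
To make the first bullet concrete, here's the kind of thing I imagine would work if gnupg can consume the ciphertext as a stream: spawn a single gpg process and feed it the chunks over stdin. Again only an untested sketch with the same BUCKET, KEY, and PASSWORD placeholders; the exact flags (in particular --pinentry-mode loopback) may vary with the gpg version.

require 'aws-sdk-s3'

s3_client = Aws::S3::Client.new

# One long-lived gpg process; its stdout is inherited, so decrypted CSV
# goes straight to our STDOUT as gpg produces it.
cmd = ['gpg', '--batch', '--decrypt',
       '--pinentry-mode', 'loopback', # gpg >= 2.1 needs this for --passphrase
       '--passphrase', PASSWORD]      # note: visible in the process list

IO.popen(cmd, 'wb') do |gpg_stdin|
  s3_client.get_object(bucket: BUCKET, key: KEY) do |chunk|
    gpg_stdin.write(chunk) # gpg reads its stdin as one continuous stream
  end
end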

Any feedback would be greatly appreciated.

  • I guess the same goes for decryption as for [encryption](https://stackoverflow.com/q/34926578/589259); could you indicate if this is the case? – Maarten Bodewes Feb 18 '18 at 15:02
  • If the entire file is encrypted as one thing then you'll need the entire ciphertext, but if each chunk is encrypted separately then it should be doable. We'd really need more detail on the data structure (we don't need to know what the decrypted content is, though). – Ben Mar 31 '18 at 12:34

0 Answers