I'm attempting to stream several large, symmetrically encrypted .csv.gpg files (40GB+ each) from S3 through GnuPG to an output stream.
I'd like to process the files in chunks using streams, so that we never need to hold an entire encrypted or decrypted file on disk or in memory.
Here's an example using the AWS Ruby S3 SDK to download chunks of the object and pass them to GnuPG for decryption, on Ruby 2.5.0 with the ruby-gpgme gem:
crypto = GPGME::Crypto.new

# Fetch the object in chunks; each yielded chunk is a String of ciphertext
s3_client.get_object(bucket: BUCKET, key: KEY) do |chunk|
  # Decrypt each chunk independently, streaming plaintext to STDOUT
  crypto.decrypt(chunk, password: PASSWORD, output: $stdout)
end
When I run this, I see valid decrypted CSV data on STDOUT (good!) until it fails at the end of the first chunk:
~/.rvm/gems/ruby-2.5.0/gems/gpgme-2.0.14/lib/gpgme/ctx.rb:435:in `decrypt_verify': Decryption failed (GPGME::Error::DecryptFailed)
This is where I'm stuck. My guess is that each call treats its chunk as a complete OpenPGP message, so gpg runs out of input before the message is finished, but I'm not certain.
- Can GnuPG decrypt a chunk at a time, or must it read the entire file before writing any output?
- Do the chunks need to be a certain size and/or delimited in some way?
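In case it helps frame the question, here's the sort of thing I'm imagining as a fix: piping the chunks into a single IO so GPGME sees one continuous ciphertext stream rather than a series of independent messages. This is a rough, untested sketch (it assumes GPGME::Data will wrap the read end of an IO.pipe, and it reuses the BUCKET/KEY/PASSWORD placeholders from above):

require 'aws-sdk-s3'
require 'gpgme'

s3_client = Aws::S3::Client.new
crypto    = GPGME::Crypto.new
reader, writer = IO.pipe

# Producer thread: write encrypted chunks into the pipe as S3 yields them
producer = Thread.new do
  begin
    s3_client.get_object(bucket: BUCKET, key: KEY) do |chunk|
      writer.write(chunk)
    end
  ensure
    writer.close # signals EOF so the decrypt call below can finish
  end
end

# Consumer: GPGME reads the pipe as one continuous OpenPGP message
crypto.decrypt(reader, password: PASSWORD, output: $stdout)
producer.join

The hope is that memory use stays bounded by the pipe's buffer, but I don't know whether this is the right approach, hence the questions above.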
Any feedback would be greatly appreciated.