In normal operation one can supply a customer-supplied encryption key to the Google Cloud Storage API so that a given bucket/blob is encrypted with it: https://cloud.google.com/compute/docs/disks/customer-supplied-encryption
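For a single object this works by passing the key alongside the write. Here is a minimal Java sketch of that "normal operation" case, assuming the google-cloud-storage client library; the bucket, object name, and key value are placeholders:

    import com.google.cloud.storage.BlobId;
    import com.google.cloud.storage.BlobInfo;
    import com.google.cloud.storage.Storage;
    import com.google.cloud.storage.StorageOptions;

    import java.nio.charset.StandardCharsets;

    public class CsekWriteExample {
      public static void main(String[] args) {
        // Placeholder: base64-encoded AES-256 key supplied by the customer.
        String base64Key = "<base64-encoded AES-256 key>";

        Storage storage = StorageOptions.getDefaultInstance().getService();
        BlobInfo blobInfo =
            BlobInfo.newBuilder(BlobId.of("somebucket", "output/part-0000.json")).build();

        // The client sends the key as x-goog-encryption-* headers, so GCS encrypts
        // the object with the supplied key rather than a Google-managed one.
        storage.create(
            blobInfo,
            "{\"example\":true}".getBytes(StandardCharsets.UTF_8),
            Storage.BlobTargetOption.encryptionKey(base64Key));
      }
    }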
Is this possible "on the fly" for the output of Spark/Hadoop jobs?
Say we want to encrypt the output of a Spark write:

    df.write().format("json").save("gs://somebucket/output");
In https://storage.googleapis.com/hadoop-conf/gcs-core-default.xml there is no way to specify an encryption key.
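Ideally something along these lines would work before the write, but the property names below are purely illustrative; nothing like them appears in that file:

    // Illustrative only: these fs.gs.encryption.* property names are hypothetical;
    // gcs-core-default.xml documents no key-related setting for the GCS connector.
    spark.sparkContext().hadoopConfiguration().set("fs.gs.encryption.algorithm", "AES256");
    spark.sparkContext().hadoopConfiguration().set("fs.gs.encryption.key", "<base64-encoded AES-256 key>");

    df.write().format("json").save("gs://somebucket/output");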
Is this possible to do?