
We have populated the S3 credentials in the jobProperties of our Druid ingestion spec, as shown below.

"jobProperties" : {
  "fs.s3a.impl" : "org.apache.hadoop.fs.s3a.S3AFileSystem",
  "fs.AbstractFileSystem.s3a.impl" : "org.apache.hadoop.fs.s3a.S3A",
  "fs.s3a.access.key" : "YOUR_ACCESS_KEY",
  "fs.s3a.secret.key" : "YOUR_SECRET_KEY"
}

However, the secret and access keys are printed in the task logs, which is a security concern. Could you please let me know how to hide/mask the credentials?

I have tried the below as well, but it didn't work.

export AWS_ACCESS_KEY_ID=YOUR_ACCESS_KEY
export AWS_SECRET_ACCESS_KEY=YOUR_SECRET_KEY
export AWS_DEFAULT_REGION=
  "jobProperties" : {
  "fs.s3a.impl" : "org.apache.hadoop.fs.s3a.S3AFileSystem",
  "fs.AbstractFileSystem.s3a.impl" : "org.apache.hadoop.fs.s3a.S3A",
  "fs.s3.awsAccessKeyId": {
               "type": "environment",
               "variable": "AWS_ACCESS_KEY_ID"
              },
  "fs.s3.awsSecretAccessKey": {
              "type": "environment",
              "variable": "AWS_SECRET_ACCESS_KEY"
             },
   "fs.s3.impl": "org.apache.hadoop.fs.s3native.NativeS3FileSystem",
        "fs.s3n.awsAccessKeyId":{
               "type": "environment",
               "variable": "AWS_ACCESS_KEY_ID"
              },
   "fs.s3n.awsSecretAccessKey": {
              "type": "environment",
              "variable": "AWS_SECRET_ACCESS_KEY"
             }
}
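
For reference, and not something I can confirm works with Druid's Hadoop indexing, Hadoop itself can read the S3A keys from a credential store referenced by hadoop.security.credential.provider.path, which keeps the literal keys out of the spec. A minimal sketch, assuming an HDFS namenode and keystore path that are placeholders:

# Store the S3A keys in a Hadoop credential store (JCEKS) instead of plain text.
# The host "namenode:8020" and the path /user/druid/s3.jceks are placeholders.
hadoop credential create fs.s3a.access.key -value YOUR_ACCESS_KEY \
  -provider jceks://hdfs@namenode:8020/user/druid/s3.jceks
hadoop credential create fs.s3a.secret.key -value YOUR_SECRET_KEY \
  -provider jceks://hdfs@namenode:8020/user/druid/s3.jceks

The jobProperties would then only reference the store, not the keys themselves:

"jobProperties" : {
  "fs.s3a.impl" : "org.apache.hadoop.fs.s3a.S3AFileSystem",
  "fs.AbstractFileSystem.s3a.impl" : "org.apache.hadoop.fs.s3a.S3A",
  "hadoop.security.credential.provider.path" : "jceks://hdfs@namenode:8020/user/druid/s3.jceks"
}

With this arrangement the actual keys live only in the keystore file, so they would not appear as literal strings in the ingestion spec itself.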

Thanks

Kiran
  • Were you able to get an answer? I want to do the same thing for Kafka credentials. – mjennet Aug 24 '20 at 11:11
  • No, I still have the issue. You could try the options mentioned in the URL below and see if they work for you: https://groups.google.com/g/druid-user/c/FydcpFrA688 – Kiran Aug 24 '20 at 22:33

0 Answers