I am trying to use Cloudera's QuickStart Docker container to test simple Hadoop/Hive jobs. I want to be able to run jobs on data stored in S3, but so far I am running into problems.
I have added the properties below to core-site.xml, hive-site.xml, and hdfs-site.xml:
<property>
  <name>fs.s3.awsAccessKeyId</name>
  <value>XXXXXX</value>
</property>
<property>
  <name>fs.s3.awsSecretAccessKey</name>
  <value>XXXXXX</value>
</property>
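In case it is relevant: these property names apply to the legacy `s3://` filesystem. The newer s3a connector (available in Hadoop 2.6+ / CDH 5) uses differently named credential properties, so if the table location were an `s3a://` URL, the equivalent block would look something like this:

```xml
<property>
  <name>fs.s3a.access.key</name>
  <value>XXXXXX</value>
</property>
<property>
  <name>fs.s3a.secret.key</name>
  <value>XXXXXX</value>
</property>
```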
Regardless, when I try to create an external table in Hive that points to an S3 location, I get the following error:
FAILED: SemanticException java.lang.IllegalArgumentException: AWS Access Key ID and Secret Access Key must be specified as the username or password (respectively) of a s3 URL, or by setting the fs.s3.awsAccessKeyId or fs.s3.awsSecretAccessKey properties (respectively).
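For reference, a minimal statement along these lines reproduces the error (the table name, columns, and bucket are placeholders, not my real values):

```sql
CREATE EXTERNAL TABLE example_table (
  id   INT,
  name STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION 's3://my-bucket/some/path/';
```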