I am running an Apache Beam Java application on Spark in client mode on YARN. On spark-submit, the jks file is copied to the working directory of the Spark executors, but the reference to that path in the Apache Beam KafkaIO consumer config does not resolve.
The jks file is copied to this Spark executor staging directory: /users/<user-id>/.staging/application-<id>/myapp1.jks (this is a Hadoop FS path).
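For context, the keystore is shipped to the executors via the --files option of spark-submit. The command below is a sketch rather than my exact invocation; the main class and jar names are placeholders:

# sketch of the submit command (class and jar names are placeholders)
spark-submit \
  --master yarn \
  --deploy-mode client \
  --files /local/path/to/myapp1.jks \
  --class com.example.MyBeamApp \
  my-beam-app-bundled.jar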
In the Java Apache Beam KafkaIO setup:

import java.util.HashMap;
import java.util.Map;

import org.apache.beam.sdk.io.kafka.KafkaIO;
import org.apache.beam.sdk.transforms.Values;
import org.apache.beam.sdk.values.PCollection;
import org.apache.kafka.common.serialization.StringDeserializer;

Map<String, Object> props = new HashMap<>();
props.put("auto.offset.reset","earliest");
props.put("enable.auto.commit",true);
props.put("session.timeout.ms","60000");
props.put("max.poll.records","100");
props.put("ssl.key.passwords","mypassword");
props.put("security.protocol","SSL");
props.put("ssl.keystore.location","/users/<user-id>/.staging/application-<id>/myapp1.jks");
//even tried "file:/users/<user-id>/.staging/application-<id>/myapp1.jks",
//even tried "maprfs:/users/<user-id>/.staging/application-<id>/myapp1.jks",
//even tried directly myapp1.jks,
//even tried ./myapp1.jks
PCollection<String> pCollection1 = pipeline.apply(KafkaIO.<String, String>read().withBootstrapServers(bootstrapServer)
.withTopic(topicName)
.withConsumerConfigUpdates(props)
.withKeyDeserializer(StringDeserializer.class)
.withValueDeserializer(StringDeserializer.class)
Every time KafkaIO tries to open the connection, it fails with the error below:
org.apache.kafka.common.KafkaException: Failed to load SSL keystore myapp1.jks of type JKS
...
...
Caused by: java.io.FileNotFoundException: myapp1.jks (No such file or directory)
at java.io.FileInputStream.open0(Native Method)
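To narrow it down, I'm thinking of adding a throwaway probe step like the sketch below, which lists the executor's working directory from inside a DoFn. Create and ParDo are standard Beam transforms; the probe itself is purely illustrative:

import java.io.File;
import org.apache.beam.sdk.transforms.Create;
import org.apache.beam.sdk.transforms.DoFn;
import org.apache.beam.sdk.transforms.ParDo;

// Illustrative probe: runs on the executors and prints what the
// working directory actually contains, to check whether myapp1.jks
// is visible there and under which absolute path.
pipeline
    .apply(Create.of("probe"))
    .apply(ParDo.of(new DoFn<String, String>() {
        @ProcessElement
        public void processElement(ProcessContext c) {
            File cwd = new File(".");
            System.out.println("executor cwd: " + cwd.getAbsolutePath());
            File[] entries = cwd.listFiles();
            if (entries != null) {
                for (File f : entries) {
                    System.out.println("  " + f.getName());
                }
            }
            c.output(c.element());
        }
    }));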
Any pointers would be appreciated.