How do I connect to a remote BigInsights HDFS cluster (with Kerberos authentication enabled) from a local PySpark program and load files for processing? This is what I have tried:
df = sqlContext.read.parquet("hdfs://<<remote_hdfs_host>>:8020/testDirectory")
Any help would be much appreciated.
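For reference, a minimal sketch of one common approach: authenticate to Kerberos first (e.g. via `kinit` or a keytab), then pass the security settings to Spark before reading. The principal, keytab path, and host below are placeholder assumptions, not values from the question, and the `spark.kerberos.*` property names are the Spark 3.x forms (Spark 2.x used `spark.yarn.principal` / `spark.yarn.keytab`). This is a configuration sketch and needs a reachable Kerberized cluster to actually run.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("remote-kerberized-hdfs-read")
    # Tell the Hadoop client libraries to use Kerberos instead of simple auth.
    .config("spark.hadoop.hadoop.security.authentication", "kerberos")
    # Kerberos identity to use; assumes a valid keytab for this principal.
    .config("spark.kerberos.principal", "user@EXAMPLE.COM")   # assumed principal
    .config("spark.kerberos.keytab", "/path/to/user.keytab")  # assumed keytab path
    .getOrCreate()
)

# Read Parquet from the remote NameNode (host placeholder kept from the question).
df = spark.read.parquet("hdfs://<<remote_hdfs_host>>:8020/testDirectory")
```

The local machine also needs a `krb5.conf` that can reach the cluster's KDC, and ideally the cluster's `core-site.xml` / `hdfs-site.xml` on the classpath (e.g. via `HADOOP_CONF_DIR`), so the Hadoop client resolves the NameNode and realm correctly.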