The other answers didn't work for me, so I am writing another one here.
Try the following Scala code:
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}
// Load the Hadoop configuration (picked up from HADOOP_CONF_DIR / the classpath)
val hadoopConf = new Configuration()
val hdfs = FileSystem.get(hadoopConf)
// srcFilePath is the local file to upload; destFilePath is the target location on HDFS
val srcPath = new Path(srcFilePath)
val destPath = new Path(destFilePath)
// Copy the local file to HDFS (the local copy is kept)
hdfs.copyFromLocalFile(srcPath, destPath)
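If you are running this inside a Spark application, you can also reuse the Hadoop configuration that Spark has already loaded instead of creating a new one. A minimal sketch, assuming an existing SparkSession named spark and placeholder paths:
// Reuse the configuration Spark already loaded (spark is an assumed existing SparkSession)
val hadoopConf = spark.sparkContext.hadoopConfiguration
val hdfs = FileSystem.get(hadoopConf)
// Placeholder paths: a local file and an HDFS target directory
hdfs.copyFromLocalFile(new Path("/tmp/data.csv"), new Path("/user/me/data/"))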
You should also check that the HADOOP_CONF_DIR variable is set in the conf/spark-env.sh file, so that Spark can find the Hadoop configuration settings.
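For example, in conf/spark-env.sh (the /etc/hadoop/conf path below is only an assumption; point it at wherever your core-site.xml and hdfs-site.xml live):
export HADOOP_CONF_DIR=/etc/hadoop/conf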
The dependencies for the build.sbt file:
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0"
libraryDependencies += "org.apache.commons" % "commons-io" % "1.3.2"
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.6.0"