
I am trying to execute the spark-shell command below in a Linux terminal through Java code.

echo spark.sparkContext.parallelize\(1 to 3,3\).map\(x => \
(x,\"city_\"+x\)\).toDF\(\"num",\"city\"\).write.partitionBy\(\"num\"\).mode\
(SaveMode.Overwrite\).parquet\(\"/tmp/abinash\"\) | /opt/ab/cd/de/spark-shell

But I am getting a "No such file or directory" error for /tmp/abinash even though the file exists.

I have tried many ways to solve this, without success. I assume there is an issue with the escape characters.

Can anyone tell me what I am doing wrong here?

khaled_bhar
Abinash Dash
  • Remove the pipe and examine the string written by `echo`. See if it matches your expectation. Consider using a heredoc to generate the input instead of `echo`. – William Pursell Oct 08 '18 at 15:36
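The heredoc approach the comment suggests can be sketched as follows. With a quoted delimiter (`<<'EOF'`) the shell performs no expansion at all, so the Scala snippet needs no backslash escaping; the spark-shell path is the one from the question.

```shell
# Quoted heredoc delimiter: the shell passes the body through verbatim,
# so quotes and parentheses in the Scala code need no escaping.
/opt/ab/cd/de/spark-shell <<'EOF'
spark.sparkContext.parallelize(1 to 3,3).map(x => (x,"city_"+x)).toDF("num","city").write.partitionBy("num").mode(SaveMode.Overwrite).parquet("/tmp/abinash")
EOF
```

To preview exactly what spark-shell will receive, replace `/opt/ab/cd/de/spark-shell` with `cat` and inspect the output, as the comment advises.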

1 Answer


Try this. Wrap the whole Scala expression in double quotes and escape only the inner double quotes; unescaped parentheses and quotes are what the shell was mangling in your original command.

> echo "spark.sparkContext.parallelize(1 to 3,3).map(x => (x,\"city_\"+x)).toDF(\"num\",\"city\").write.partitionBy(\"num\").mode(SaveMode.Overwrite).parquet(\"/tmp/abinash\")"
spark.sparkContext.parallelize(1 to 3,3).map(x => (x,"city_"+x)).toDF("num","city").write.partitionBy("num").mode(SaveMode.Overwrite).parquet("/tmp/abinash")
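Once `echo` prints the expected string, the same quoting works with the pipe restored. A sketch, using the spark-shell path from the question:

```shell
# The whole expression is one double-quoted argument; only the inner
# double quotes are escaped, so the shell leaves everything else intact.
echo "spark.sparkContext.parallelize(1 to 3,3).map(x => (x,\"city_\"+x)).toDF(\"num\",\"city\").write.partitionBy(\"num\").mode(SaveMode.Overwrite).parquet(\"/tmp/abinash\")" | /opt/ab/cd/de/spark-shell
```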
stack0114106