
I have a Java project that runs a Spark local cluster in Eclipse. I want to use the Spark performance test (spark-sql-perf) when I run the project. How do I define the parameters that I want to pass, for example the `--benchmark` parameter?

For example, when I run via the command line:

~/spark-1.6.1/bin/spark-submit --conf "some confs" --jars spark-sql-perf.jar --benchmark DatabasePerformance
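For context, application arguments such as `--benchmark DatabasePerformance` are not spark-submit options; they are delivered to the application's `main` method as the plain `args` array, which the application parses itself. A minimal sketch of that parsing (the class name `ArgsDemo` is a hypothetical stand-in, not part of spark-sql-perf):

```java
import java.util.HashMap;
import java.util.Map;

public class ArgsDemo {

    // Collect "--key value" pairs from the args array into a map.
    static Map<String, String> parse(String[] args) {
        Map<String, String> opts = new HashMap<>();
        for (int i = 0; i + 1 < args.length; i += 2) {
            if (args[i].startsWith("--")) {
                opts.put(args[i].substring(2), args[i + 1]);
            }
        }
        return opts;
    }

    public static void main(String[] args) {
        // When launched with: --benchmark DatabasePerformance
        Map<String, String> opts = parse(args);
        System.out.println("benchmark = " + opts.get("benchmark"));
    }
}
```

So the question reduces to: how do I supply that `args` array when launching from Eclipse instead of from spark-submit?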
YAKOVM
  • I don't really understand what you mean by passing parameters to Jars. What is it you want to do exactly? – greg-449 Oct 23 '16 at 14:59
  • Those external jars that I use require some parameters; how do I pass them? – YAKOVM Oct 23 '16 at 15:04
  • Sorry, but this doesn't make much sense, jars can't have parameters as such. What does the documentation for this jar say, exactly? – greg-449 Oct 23 '16 at 15:11
  • You showed us how parameters are submitted from the command line. Is your question how to set these parameters in Eclipse? Do you start your Spark application from Java code? Or do you want to set these parameters as defaults to avoid submitting via `spark-submit`? Please provide more detail about your problem, because it is really unclear to me. – VladoDemcak Oct 23 '16 at 16:09
  • These aren't parameters to the jars, they are all options for the spark-submit command line. – greg-449 Oct 23 '16 at 16:16
  • I have updated the question – YAKOVM Oct 23 '16 at 16:25
  • @VladoDemcak Yes, exactly. I don't want to submit it via spark-submit but rather run it in Eclipse – YAKOVM Oct 23 '16 at 16:33
  • Not sure how you run `spark-sql-perf.jar`, but you can set arguments in the [run configuration](http://stackoverflow.com/questions/12222153/eclipse-how-we-take-arguments-for-main-when-run) and put `--benchmark DatabasePerformance` there – VladoDemcak Oct 23 '16 at 16:58
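Following the last comment's suggestion: in Eclipse, under Run > Run Configurations… > Arguments tab, the "Program arguments" field would hold what normally follows the application jar on the spark-submit line, for example:

```
--benchmark DatabasePerformance
```

Note that `--conf` and `--jars` are spark-submit options rather than program arguments; when running locally in Eclipse, their equivalents would presumably be the project's build path (for the jars) and `SparkConf` settings made in code (for the confs). This is a sketch of the mapping, not something confirmed by the thread.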

0 Answers