
I'm using Spark job-server for job management. Say I need to create 10 jobs. I can create 10 separate jars and call them like this:

 curl -d "" 'job-server-host:8090/jobs?appName=my_job_number_1&classPath=com.spark.jobs.MainClass'

 curl -d "" 'job-server-host:8090/jobs?appName=my_job_number_2&classPath=com.spark.jobs.MainClass'

...

Or I can create only one jar with 10 job classes:

 curl -d "" 'job-server-host:8090/jobs?appName=my_alone_job&classPath=com.spark.jobs.Job1'

 curl -d "" 'job-server-host:8090/jobs?appName=my_alone_job&classPath=com.spark.jobs.Job2'
...

Which variant is preferable, and why?

Cortwave
  • I don't think there are any advantages. I think the first one has more logical separation by having a different app name for each job. I think you should group the classes that are related to a single application into one jar. – noorul Jun 03 '16 at 02:10

1 Answer


The main motive for using spark-jobserver is Spark job management and context management.
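
For context management specifically, you can create a long-lived context once and reuse it across many job submissions. A minimal sketch using the context endpoints; the context name and resource settings below are placeholders:

 # create a long-lived context (name and resources are placeholders)
 curl -d "" 'job-server-host:8090/contexts/shared-context?num-cpu-cores=2&memory-per-node=512m'

 # list running contexts
 curl 'job-server-host:8090/contexts'

 # stop the context when it is no longer needed
 curl -X DELETE 'job-server-host:8090/contexts/shared-context'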

It all depends on your requirements. If those jobs are related and can be grouped, put them all in a single jar (organizing related jobs into separate packages if you like) rather than creating separate jars, and use the same app and context for those jobs.
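
For example, with the single-jar approach you upload the jar once under one app name and then submit each job class against that same app and, optionally, the same pre-created context. A rough sketch; the jar file name and context name are placeholders:

 # upload the single jar once under one app name
 curl --data-binary @my-alone-job.jar 'job-server-host:8090/jars/my_alone_job'

 # run different job classes from that jar in the shared context
 curl -d "" 'job-server-host:8090/jobs?appName=my_alone_job&classPath=com.spark.jobs.Job1&context=shared-context&sync=true'

 curl -d "" 'job-server-host:8090/jobs?appName=my_alone_job&classPath=com.spark.jobs.Job2&context=shared-context&sync=true'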

Nishu Tayal