I have a cluster with one Spark master and three Spark workers, which are used to query Cassandra. I also have a Java UI application where users can enter query fields through UI widgets.

Since several users use my system at once, multiple Spark queries run against Cassandra at the same time.

I know that SparkListener exposes callbacks for events such as a job starting, but I don't see any method in SparkListener that returns the jobs currently running on Spark; there are only handlers for individual events like a job or task being started.
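For example, the best I can see is bookkeeping on my own side by pairing the start and end callbacks. Here is a minimal sketch of what I mean (assuming Spark 2.x; the class name `RunningJobCounter` is just my own placeholder), registered on the driver with `sc.addSparkListener(new RunningJobCounter())`:

```java
import java.util.concurrent.atomic.AtomicInteger;

import org.apache.spark.scheduler.SparkListener;
import org.apache.spark.scheduler.SparkListenerJobEnd;
import org.apache.spark.scheduler.SparkListenerJobStart;

// Hypothetical helper: counts jobs that have started but not yet ended.
public class RunningJobCounter extends SparkListener {
    private final AtomicInteger running = new AtomicInteger(0);

    @Override
    public void onJobStart(SparkListenerJobStart jobStart) {
        running.incrementAndGet(); // a job was submitted to the scheduler
    }

    @Override
    public void onJobEnd(SparkListenerJobEnd jobEnd) {
        running.decrementAndGet(); // the job finished (success or failure)
    }

    public int runningJobs() {
        return running.get();
    }
}
```

This only counts events as they happen; it still doesn't let me ask Spark directly for the currently running jobs.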

Now, how can I get the number of queries that users are currently running on my Spark cluster, polled for example every 5 seconds (in Java or Scala)?
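One direction I considered (an assumption on my part, not something I have verified) is polling `SparkStatusTracker.getActiveJobIds()` from the driver on a schedule. Note that this counts active Spark jobs, which is only an approximation of user queries, since one query can spawn several jobs:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

import org.apache.spark.SparkContext;

// Hypothetical helper: logs the number of active Spark jobs every 5 seconds.
public class ActiveJobPoller {
    public static void start(final SparkContext sc) {
        ScheduledExecutorService scheduler =
                Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(new Runnable() {
            @Override
            public void run() {
                // getActiveJobIds() returns the ids of all jobs that have
                // started but not yet completed on this SparkContext.
                int active = sc.statusTracker().getActiveJobIds().length;
                System.out.println("Active Spark jobs: " + active);
            }
        }, 0, 5, TimeUnit.SECONDS);
    }
}
```

Is something like this the right approach, or is there a better way to map running jobs back to user queries?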

I saw the following questions, but they do not solve my problem:

Get current number of running containers in Spark on YARN

Spark - How many Executors and Cores are allocated to my spark job
