
Suppose a Spark job running in cluster mode launches 3 executors. How can I fetch the process ID (PID) of each executor process in the Spark cluster? Is there any API for this in PySpark?

EDIT: The question is about the executor JVM process ID (PID), not the executor ID. So how can the executor process ID be fetched using PySpark APIs?
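As far as I know, there is no public PySpark API that exposes the executor JVM PID directly. One possible workaround is to run a small task on every executor and walk the process tree from the Python worker up to the executor JVM, since the workers are forked (via `pyspark.daemon`) from the executor process. Below is a minimal sketch, assuming a Linux cluster (it reads `/proc`); `jvm_pid_of_worker` is my own helper name, not a Spark API:

```python
import os
import socket

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("executor-pids").getOrCreate()
sc = spark.sparkContext

def jvm_pid_of_worker(_):
    """Walk up the process tree from this Python worker until we
    hit a 'java' process, which should be the executor JVM
    (CoarseGrainedExecutorBackend). Linux-only: relies on /proc."""
    pid = os.getpid()
    while pid > 1:
        with open(f"/proc/{pid}/stat") as f:
            fields = f.read().split()
        comm, ppid = fields[1], int(fields[3])
        if "java" in comm:
            break
        pid = ppid
    yield (socket.gethostname(), pid)

# Spread many small tasks so that (hopefully) every executor runs
# at least one; there is no guarantee all executors are hit.
num_tasks = sc.defaultParallelism * 4
pids = set(
    sc.range(num_tasks, numSlices=num_tasks)
      .mapPartitions(jvm_pid_of_worker)
      .collect()
)
for host, pid in sorted(pids):
    print(host, pid)
```

This is best-effort: task scheduling does not promise coverage of all executors, and the process-tree layout can differ across deploy modes, so treat the result as informational rather than authoritative.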

TheCodeCache
  • The link given above (https://stackoverflow.com/questions/7250126/getting-processid-within-python-code) did not solve my problem, yet the question has been marked as a duplicate even though I supplied the EDIT. – TheCodeCache Jun 07 '18 at 10:58
  • [What if I disagree with the closure of a question? How can I reopen it?](https://stackoverflow.com/help/reopen-questions). – zero323 Jun 07 '18 at 15:39

0 Answers