I am running CatBoost with PySpark on Dataproc Serverless. Everything works correctly, except that the batch job keeps running indefinitely even after all tasks have completed. I have tried calling os._exit(0) and spark.stop() to shut Spark down manually, but neither worked. I have also noticed that the Spark driver has no CPU cores allocated to it.
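A common cause of a driver process hanging after all tasks finish is a leftover non-daemon thread (for example, one started by a library) that keeps the Python interpreter alive. As a diagnostic sketch, the helper below (a hypothetical name, not part of any library) lists such threads; you could call it after spark.stop() to see what is blocking exit before resorting to os._exit(0):

```python
import threading


def report_nondaemon_threads():
    """Return the names of live non-daemon threads other than the main thread.

    Any thread listed here can keep the Python driver process alive
    even after Spark has finished all of its tasks.
    """
    return [
        t.name
        for t in threading.enumerate()
        if t.is_alive() and not t.daemon and t is not threading.main_thread()
    ]


# Hypothetical usage at the end of a Dataproc Serverless batch job:
#   spark.stop()
#   leftover = report_nondaemon_threads()
#   if leftover:
#       print("Threads blocking exit:", leftover)
#       os._exit(0)  # last resort: skips interpreter shutdown entirely
```

Note that os._exit(0) terminates the Python process without running cleanup handlers, so if the batch job still hangs after calling it, the thing staying alive is likely the JVM side of the driver rather than the Python interpreter.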

ellos98