How can I tune memory and core consumption during a Spark Structured Streaming job in PySpark?
1 Answer
You can only tune a Spark job at submit time, before it starts running.
You can tune a Spark job like this:
./bin/spark-submit \
--class org.apache.spark.examples.SparkPi \
--master spark://207.184.161.138:7077 \
--executor-memory 20G \
--total-executor-cores 100 \
/path/to/examples.jar \
1000
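
Since the question is about PySpark, here is a minimal sketch of the same resource settings applied from inside a PySpark script, assuming a standalone master; the master URL, memory size, and core count are placeholders, and the configuration must be set before the SparkSession is created:

# Sketch: setting executor resources at startup from PySpark
# (master URL and values are placeholders)
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("structured-streaming-job")
    .master("spark://207.184.161.138:7077")   # standalone master (placeholder)
    .config("spark.executor.memory", "20g")   # same knob as --executor-memory
    .config("spark.cores.max", "100")         # same knob as --total-executor-cores
    .getOrCreate()
)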

diogoramos
No, I mean when the job is idle waiting for new tasks, or when I want to reduce the executor memory of a running Spark job so that memory can be used by another job: how can I decrease executor-memory and total-executor-cores? – Ozan Dikerler Aug 07 '20 at 07:27
@OzanDikerler That's exactly what I'm trying to say: tuning in Spark can only be done at the submit phase. But you can use dynamic allocation (recommended mainly for streaming jobs, like yours). Dynamic allocation allows Spark to dynamically scale the cluster resources allocated to your application based on the workload. See this thread on Stack Overflow: https://stackoverflow.com/a/40200504/10900556 – diogoramos Aug 07 '20 at 07:56
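
For reference, a minimal sketch of enabling dynamic allocation for such a streaming job follows; the executor bounds, idle timeout, and shuffle-service setting are illustrative assumptions, not values from the linked answer:

# Sketch: enabling dynamic allocation so Spark can grow and shrink
# the executor pool with the workload (values are placeholders)
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("streaming-with-dynamic-allocation")
    .config("spark.dynamicAllocation.enabled", "true")
    .config("spark.dynamicAllocation.minExecutors", "1")
    .config("spark.dynamicAllocation.maxExecutors", "10")
    .config("spark.dynamicAllocation.executorIdleTimeout", "60s")
    # the external shuffle service lets executors be removed
    # without losing their shuffle files
    .config("spark.shuffle.service.enabled", "true")
    .getOrCreate()
)

With these settings, executors that sit idle past the timeout are released back to the cluster, and new ones are requested when the streaming workload picks up again, which addresses the "give memory back while the job is idle" part of the question.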