I am working on changing the Spark configuration to limit the logs for my Spark Structured Streaming log files. I have figured out the properties to do so, but it is not working right now. Do I need to restart all nodes (name and worker nodes), or is restarting the jobs enough? We are using Google Dataproc clusters and running Spark on YARN.
You need to restart your running application in order to pick up the updated values; a normal application restart is enough. Please provide more details: what changes did you make, and how did you submit the application? – kavetiraviteja Sep 01 '20 at 09:53
1 Answer
The simplest approach is to set these properties at cluster creation time using Dataproc Cluster Properties:
gcloud dataproc clusters create $CLUSTER_NAME \
--properties spark:<key>=<value>,yarn:<key>=<value>
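For example, to limit how long YARN keeps aggregated container logs (a sketch: the cluster name is a placeholder, and yarn.log-aggregation.retain-seconds is just one illustrative log-related key, not necessarily the property you identified):

# Hypothetical cluster name; yarn.log-aggregation.retain-seconds tells YARN
# to delete aggregated container logs older than the given age (here, one day).
gcloud dataproc clusters create my-cluster \
    --properties 'yarn:yarn.log-aggregation.retain-seconds=86400'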
Or set them when submitting your Spark application.
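For example, with gcloud (a sketch: the class, jar, and property values are placeholders; at job-submission time --properties takes plain Spark keys without the spark: prefix, and --driver-log-levels sets log4j levels for the driver):

# Placeholder class and jar names; spark.executor.memory is only an
# illustrative key. --driver-log-levels root=WARN raises the driver's log
# threshold so only WARN and above are written.
gcloud dataproc jobs submit spark \
    --cluster $CLUSTER_NAME \
    --class com.example.MyStreamingApp \
    --jars my-streaming-app.jar \
    --properties spark.executor.memory=4g \
    --driver-log-levels root=WARN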

– Igor Dvorzhak