I have a Spark cluster set up with 1 master node and 2 worker nodes. I am running a PySpark application on this Spark standalone cluster, and one of its jobs writes the transformed data into a MySQL database.
My question: is the write to the database performed by the driver or by the executors? I ask because when writing to a text file, it appears to be done by the driver, since the output file gets created on the driver node.
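For context, a DataFrame-based MySQL write typically goes through the JDBC data source along these lines. This is only a sketch of an assumed shape, since my actual write code isn't shown here; the URL, table name, and credentials are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("mysql_write_sketch").getOrCreate()

# Placeholder data standing in for the transformed output.
df = spark.createDataFrame([("word", 1)], ["word", "count"])

(df.write
   .format("jdbc")
   .option("url", "jdbc:mysql://HOST:3306/DBNAME")  # placeholder connection URL
   .option("dbtable", "word_counts")                # placeholder table name
   .option("user", "USER")                          # placeholder credentials
   .option("password", "PASSWORD")
   .mode("append")
   .save())
```

Running this requires the MySQL JDBC driver on the classpath and a reachable database, so it is illustrative only.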
Update
Adding below the code I used to write to a text file:
from pyspark import SparkConf, SparkContext

if __name__ == "__main__":
    sc = SparkContext(master="spark://IP:PORT", appName="word_count_application")
    words = sc.textFile("book_2.txt")
    word_count = (words.flatMap(lambda a: a.split(" "))
                       .map(lambda a: (a, 1))
                       .reduceByKey(lambda a, b: a + b))
    word_count.saveAsTextFile("book2_output.txt")
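For reference, the flatMap / map / reduceByKey chain above computes the same result as this plain-Python sketch (the sample lines here are made up for illustration):

```python
# Sample input standing in for the lines of book_2.txt.
lines = ["the quick brown fox", "the lazy dog"]

# flatMap + map: one (word, 1) pair per word across all lines.
pairs = [(word, 1) for line in lines for word in line.split(" ")]

# reduceByKey: sum the 1s per word.
counts = {}
for word, n in pairs:
    counts[word] = counts.get(word, 0) + n

print(counts["the"])  # → 2
```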