I am trying to overwrite a particular partition of a Hive table using PySpark, but every time I try, all the other partitions get wiped out. I went through a couple of posts here on this topic and implemented the suggested steps, but it seems I am still running into the same problem. The code I am using is:
spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")
spark.conf.set("hive.exec.dynamic.partition", "true")
spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")
df.write.format('parquet').mode('overwrite').partitionBy('col1').option("partitionOverwriteMode", "dynamic").saveAsTable(op_dbname+'.'+op_tblname)
Initially the partitions are col1=m and col1=n, and when I try to overwrite only the partition col1=m, it wipes out col1=n as well.
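For reference, here is a minimal sketch that reproduces the behavior end to end; the test_tbl table name and the sample data are placeholders, not my actual schema:

from pyspark.sql import SparkSession

# Hive support is required for saveAsTable to go through the metastore
spark = (SparkSession.builder
         .appName("partition-overwrite-repro")
         .enableHiveSupport()
         .getOrCreate())

spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")

# Seed the table with two partitions: col1=m and col1=n
seed = spark.createDataFrame([("m", 1), ("n", 2)], ["col1", "val"])
seed.write.format("parquet").mode("overwrite") \
    .partitionBy("col1").saveAsTable("test_tbl")

# Overwrite with data for col1=m only -- after this write,
# the col1=n partition is gone as well
update = spark.createDataFrame([("m", 10)], ["col1", "val"])
update.write.format("parquet").mode("overwrite") \
    .partitionBy("col1").saveAsTable("test_tbl")

# Expected: partitions col1=m and col1=n; actual: only col1=m remains
spark.sql("SHOW PARTITIONS test_tbl").show()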
Spark version is 2.4.4
I'd appreciate any help.