I am attempting to insert data into a PostgreSQL database using PySpark over JDBC. However, during the insert it unexpectedly tries to recreate the table and fails with the following error:
org.postgresql.util.PSQLException: ERROR: relation "account" already exists
I am using the code snippet below to write the data to Postgres:
def postgres_writes(url, driver, username, password, table_name, df):
    # Append rows to an existing table over JDBC
    df.write \
        .format("jdbc") \
        .option("url", url) \
        .option("dbtable", table_name) \
        .option("user", username) \
        .option("password", password) \
        .option("driver", driver) \
        .mode("append") \
        .save()
I want to append the data to an existing table, not create a new one.
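For context, here is a minimal sketch of how I call the function. The Spark session setup, JDBC URL, credentials, driver jar path, and the sample DataFrame are placeholders for illustration only, not my real values:

from pyspark.sql import SparkSession

# Placeholder session; in practice the PostgreSQL JDBC jar is supplied via spark.jars
spark = SparkSession.builder \
    .appName("postgres-append-example") \
    .config("spark.jars", "/path/to/postgresql-driver.jar") \
    .getOrCreate()

# Small sample DataFrame whose columns match the existing "account" table
df = spark.createDataFrame(
    [(1, "alice"), (2, "bob")],
    ["id", "name"],
)

postgres_writes(
    url="jdbc:postgresql://localhost:5432/mydb",
    driver="org.postgresql.Driver",
    username="postgres",
    password="secret",
    table_name="account",
    df=df,
)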