Hi, how are you?
I'm trying to run a PostgreSQL stored function inside Databricks using the following (with my real credentials and function name, obviously), but it won't work:
jdbcDF = (spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://host:5432/db")
    .option("driver", "org.postgresql.Driver")
    .option("query", "SELECT * FROM function()")
    .option("user", "user")
    .option("password", "password")
    .load())
This function deletes some rows under certain conditions. The call completes successfully in Databricks, but when I then look at the table in PostgreSQL, nothing has changed, as if the function had never run. Called directly inside PostgreSQL, the function works as expected. Are there any other options I can try?
P.S. I can't connect to the database using psycopg, so I need a JDBC-based approach like this.
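For reference, would something along these lines work instead? This is only a sketch: it assumes Databricks exposes the JVM through the py4j gateway on `spark._sc._gateway`, and `my_function` is a placeholder for my actual function. The idea is to bypass `spark.read` (which is meant for reads) and execute the statement directly through `java.sql.DriverManager`, where the default autocommit should persist the deletes:

```python
def run_via_jdbc(spark, url, user, password, sql="SELECT my_function()"):
    """Execute a side-effecting SQL statement over plain JDBC.

    `spark.read.format("jdbc")` is a read path, so this uses the JVM's
    java.sql.DriverManager directly instead. With JDBC's default
    autocommit, the statement is committed as soon as it executes.
    """
    jvm = spark._sc._gateway.jvm  # assumption: py4j gateway is available
    conn = jvm.java.sql.DriverManager.getConnection(url, user, password)
    try:
        stmt = conn.createStatement()
        stmt.execute(sql)  # runs the function, including its DELETEs
    finally:
        conn.close()
```

Usage would be something like `run_via_jdbc(spark, "jdbc:postgresql://host:5432/db", "user", "password")` from a notebook cell.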