

I'm trying to run a PostgreSQL stored procedure inside Databricks using the following (with my credentials and my function substituted, obviously), but it won't work:

jdbcDF = (spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://host:5432/db")
    .option("driver", "org.postgresql.Driver")
    .option("query", "SELECT * FROM function()")
    .option("user", "user")
    .option("password", "password")
    .load())

This procedure deletes some rows under certain conditions. It executes successfully inside Databricks, but when I go to look at the table in PostgreSQL, there is no change, as if it had never been executed. Within PostgreSQL the function works normally. Are there any other options I can try?

PS: I can't connect to the database using psycopg, so I need to go with something like this.
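Since psycopg isn't an option, one alternative (an assumption, not something the question confirms will work in this environment) is to bypass `spark.read` entirely and issue the call over a raw JDBC connection, using the PostgreSQL driver jar that Databricks clusters already carry. The JVM is reachable from PySpark as `spark._sc._gateway.jvm` (an internal but commonly used handle); `execute_via_jdbc` and the URL/user/password below are placeholder names:

```python
def execute_via_jdbc(driver_manager, url, user, password, sql):
    """Open a raw JDBC connection, run one statement, and close the connection.

    `driver_manager` is java.sql.DriverManager obtained via py4j, e.g.:

        jvm = spark._sc._gateway.jvm
        execute_via_jdbc(jvm.java.sql.DriverManager,
                         "jdbc:postgresql://host:5432/db",
                         "user", "password",
                         "SELECT * FROM function()")

    A raw JDBC connection has autocommit enabled by default, so any DELETEs
    the function performs are committed as the statement completes.
    """
    conn = driver_manager.getConnection(url, user, password)
    try:
        stmt = conn.createStatement()
        try:
            stmt.execute(sql)  # runs with autocommit on
        finally:
            stmt.close()
    finally:
        conn.close()
```

The function takes the `DriverManager` object as a parameter so it can be exercised outside a cluster; on Databricks you would pass the real `jvm.java.sql.DriverManager` as shown in the docstring.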

  • Are you sure you're connected to the same database? – Frank Heikens Jan 10 '23 at 15:47
  • Yeah, I'm sure, because if I use something like SELECT * FROM table instead of the procedure name, the query correctly returns the values from the SELECT statement. I can't just use DELETE FROM table, but if you know of something like that, it works fine for me too – Fabio dos Santos Jan 10 '23 at 15:51

0 Answers