
I am trying to connect to a Postgres database from a Databricks SQL warehouse, but it reports that the data source is not supported.

Please see the screenshot below. [screenshot of the error]

1 Answer


There is no source type `postgresql` in Databricks. The only supported types are TEXT, AVRO, CSV, JSON, JDBC, PARQUET, ORC, DELTA, and LIBSVM (https://docs.databricks.com/spark/latest/spark-sql/language-manual/sql-ref-syntax-ddl-create-table-using.html).

Your case is a typical example of sourcing external data from a remote server or data location. For Postgres, you can use the JDBC type; the implementation is described in the documentation here: https://docs.databricks.com/external-data/jdbc.html#language-sql
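For example, a minimal sketch of such a JDBC table definition, assuming placeholder hostname, database, table, and credentials (substitute your own connection details):

```sql
-- Sketch only: hostname, port, database, table, and credentials are placeholders
CREATE TABLE my_postgres_table
USING JDBC
OPTIONS (
  url 'jdbc:postgresql://db.example.com:5432/mydb',
  dbtable 'public.my_table',
  user 'my_user',
  password 'my_password'
);
```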

Fnaxiom
  • I've updated the question with a screenshot of the JDBC query, but I'm still getting the error. – user8536590 Oct 18 '22 at 15:19
  • The url parameter should be a JDBC URL; "test" is not a valid URL. More can be found here: https://jdbc.postgresql.org/documentation/use/#connecting-to-the-database – Fnaxiom Oct 18 '22 at 15:22
  • By the way, you shouldn't alter the original problem on Stack Overflow; if you do, people who run into your original issue won't find the answer. Instead, update your question with the sequence of problems. – Fnaxiom Oct 18 '22 at 15:27
  • "Thanks for your response, i edited the properties with 'test' for confidentiality. The properties im using work perfectly when using pyspark. Sure ill be mindful of updating the question properly next time." – user8536590 Oct 18 '22 at 15:47
  • I am getting this error: "The input query contains unsupported data source(s). Only csv, json, avro, delta, parquet, orc, text, unity catalog data sources are supported on databricks sql" – user8536590 Oct 18 '22 at 16:07
  • Ah, I think you need to create the view first with `CREATE TEMPORARY VIEW` for JDBC, then insert the view's contents into a physical table; see the sketch below. – Fnaxiom Oct 18 '22 at 21:08
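A minimal sketch of that last suggestion, assuming placeholder connection details (the hostname, database, table names, and credentials below are not from the question):

```sql
-- Temporary view backed by the remote Postgres table over JDBC;
-- note the url must be a full JDBC URL, not just a name like "test"
CREATE TEMPORARY VIEW postgres_view
USING JDBC
OPTIONS (
  url 'jdbc:postgresql://db.example.com:5432/mydb',
  dbtable 'public.my_table',
  user 'my_user',
  password 'my_password'
);

-- Materialize the view's contents into a physical table
CREATE TABLE my_physical_table AS
SELECT * FROM postgres_view;
```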