
Is it possible to use the JDBC connector (https://docs.databricks.com/data/data-sources/sql-databases.html) to get data from a local SQL Server and export it to Delta Lake?

Using:

jdbcUrl = "jdbc:mysql://{0}:{1}/{2}".format(jdbcHostname, jdbcPort, jdbcDatabase)
connectionProperties = {
  "user" : jdbcUsername,
  "password" : jdbcPassword,
  "driver" : "com.mysql.jdbc.Driver"
}
CHEEKATLAPRADEEP
Jb11

1 Answer


Irrespective of whether you have MySQL or SQL Server, the Databricks JDBC connector supports both, as outlined in the article you linked. From the perspective of access to on-prem resources, the answer is yes, but Databricks must be able to connect to the server. Usually this means deploying your Databricks clusters into your VNet that has access to your on-prem resources, e.g. following the guidance here. A sketch of that pattern is shown below.
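A minimal PySpark sketch of the direct-JDBC approach, assuming the cluster can reach the on-prem host over the VNet. The hostname, database, table name, secret scope/keys, and target path below are placeholders, not values from the question:

    # Read an on-prem SQL Server table over JDBC and persist it as Delta.
    jdbcHostname = "onprem-sql.corp.local"   # must be reachable from the cluster's VNet
    jdbcPort = 1433
    jdbcDatabase = "SalesDB"

    jdbcUrl = "jdbc:sqlserver://{0}:{1};database={2}".format(jdbcHostname, jdbcPort, jdbcDatabase)
    connectionProperties = {
        "user": dbutils.secrets.get(scope="onprem", key="sql-user"),      # hypothetical secret scope
        "password": dbutils.secrets.get(scope="onprem", key="sql-password"),
        "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",
    }

    # Load the source table into a DataFrame, then write it to Delta Lake.
    df = spark.read.jdbc(url=jdbcUrl, table="dbo.Orders", properties=connectionProperties)
    df.write.format("delta").mode("overwrite").save("/mnt/deltalake/bronze/orders")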

Alternatively, you could use an Azure Data Factory self-hosted integration runtime to move the data to staging/"Bronze" storage in the cloud and pick it up with a Databricks task that moves it into a Delta table.
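A sketch of the second pattern, assuming ADF has already landed the extract as Parquet files in a "Bronze" container; the paths and table name are placeholders:

    # Databricks task: promote files staged by ADF into a Delta table.
    bronze_path = "abfss://bronze@mystorageaccount.dfs.core.windows.net/sqlserver/orders/"

    staged = spark.read.format("parquet").load(bronze_path)

    (staged.write
        .format("delta")
        .mode("append")                 # or "overwrite" for full reloads
        .saveAsTable("bronze.orders"))  # registers the Delta table in the metastore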

Daniel