I am suddenly getting the error message java.sql.SQLException: No suitable driver when I try to connect to my Azure SQL Server from my Databricks notebook. I have been connecting successfully for as long as I can remember, so my first guess is a platform-wide issue with Apache Spark/Databricks, but I'm not sure.
jdbcUrl = f"jdbc:sqlserver://{DBServer}.database.windows.net:1433;database={DBDatabase};user={DBUser};password={DBPword};encrypt=true;trustServerCertificate=false;hostNameInCertificate=*.database.windows.net;loginTimeout=30;"
df = spark.read.csv("/mnt/lake/RAW/cashsales_1.csv")
df.write.mode("overwrite") \
    .format("jdbc") \
    .option("url", jdbcUrl) \
    .option("dbtable", "UpdatedProducts") \
    .save()
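In case it helps diagnose, here is a sketch of the same write with the JDBC driver class pinned explicitly via the driver option, which I have seen suggested as a workaround when the driver cannot be resolved from the URL alone. The class name assumes the Microsoft SQL Server JDBC driver is installed on the cluster, and the connection values below are placeholders, not my real ones:

```python
# Sketch: the same JDBC write, but with the driver class pinned explicitly.
# Placeholder URL pieces; the driver class name assumes the Microsoft
# SQL Server JDBC driver is available on the cluster.
jdbc_options = {
    "url": "jdbc:sqlserver://myserver.database.windows.net:1433;database=mydb;encrypt=true;",
    "dbtable": "UpdatedProducts",
    "driver": "com.microsoft.sqlserver.jdbc.SQLServerDriver",  # pin driver class
}

# On a live cluster the write itself would look like:
# (df.write.mode("overwrite")
#    .format("jdbc")
#    .options(**jdbc_options)
#    .save())

print(jdbc_options["driver"])
```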
If that is not the case, can someone let me know how to resolve this issue?
Thanks