
I am writing a Databricks Scala/Python notebook which connects to a SQL Server database, and I want to execute a SQL Server function from the notebook with custom parameters.

import com.microsoft.azure.sqldb.spark.config.Config
import com.microsoft.azure.sqldb.spark.connect._

val ID = "1"
val name = "A"

val config = Config(Map(
  "url"            -> "sample-p-vm.all.test.azure.com",
  "databaseName"   -> "DBsample",
  "dbTable"        -> "dbo.FN_cal_udf",
  "user"           -> "useer567",
  "password"       -> "pppp@345%",
  "connectTimeout" -> "5", //seconds
  "queryTimeout"   -> "5"  //seconds
))

val collection = sqlContext.read.sqlDB(config)
collection.show()

Here the function is FN_cal_udf, which is stored in the SQL Server database 'DBsample'.

I got this error: jdbc.SQLServerException: Parameters were not supplied for the function

How can I pass parameters and call the SQL function inside the notebook in Scala or PySpark?

rohit patil

1 Answer

Here you can first build a query string that holds the function call with the dynamic parameters, and then pass it to the config through the queryCustom key instead of dbTable. Reading via dbTable makes the connector issue a plain SELECT against the object with no arguments, which is why SQL Server complains that no parameters were supplied.

    import com.microsoft.azure.sqldb.spark.config.Config
    import com.microsoft.azure.sqldb.spark.connect._

    val ID = "1"
    val name = "A"

    // Build the SELECT that calls the function with the dynamic parameters.
    // This assumes FN_cal_udf is a table-valued function; for a scalar
    // function use "SELECT dbo.FN_cal_udf(...) AS result" instead.
    val query = "SELECT * FROM [dbo].[FN_cal_udf]('" + ID + "','" + name + "')"

    val config = Config(Map(
      "url"            -> "sample-p-vm.all.test.azure.com",
      "databaseName"   -> "DBsample",
      "queryCustom"    -> query, // run the query instead of reading a whole table
      "user"           -> "useer567",
      "password"       -> "pppp@345%",
      "connectTimeout" -> "5", //seconds
      "queryTimeout"   -> "5"  //seconds
    ))

    val collection = sqlContext.read.sqlDB(config)
    collection.show()
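
Since the question also asks about PySpark: here is a minimal sketch of the same call using Spark's generic JDBC reader instead of the azure-sqldb-spark connector. It assumes the SQL Server JDBC driver is available on the cluster (it normally is on Databricks); the host, database, and credentials are the placeholders from the question.

    # PySpark sketch: wrap the function call in a derived-table subquery,
    # which the JDBC reader pushes down to SQL Server.
    ID = "1"
    name = "A"
    query = "(SELECT * FROM [dbo].[FN_cal_udf]('{0}','{1}')) AS result".format(ID, name)

    df = (spark.read
          .format("jdbc")
          .option("url", "jdbc:sqlserver://sample-p-vm.all.test.azure.com;databaseName=DBsample")
          .option("dbtable", query)
          .option("user", "useer567")
          .option("password", "pppp@345%")
          .load())

    df.show()

In both versions, note that building SQL by string concatenation is fine for a notebook experiment but is open to SQL injection; validate or sanitize the parameter values if they come from user input.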
Mr. Bhosale