
I am working with Databricks. I created a function that uses a try/except block to catch error messages. Unfortunately, when an error message is longer than 256 characters, I cannot write it to my target table.

def writeIntoSynapseLogTable(df, mode):

  df.write \
    .format("com.databricks.spark.sqldw") \
    .option("url", jdbc_string) \
    .option("tempDir", "mytempDir") \
    .option("useAzureMSI", "true") \
    .option("dbTable", "mytable") \
    .options(nullValue="null") \
    .mode(mode).save()

I assume that this limit is imposed by PolyBase, but I would like to know whether there is an .option() that solves this write problem.

An example of my error:

An error occurred while calling z:com.databricks.backend.daemon.dbutils.FSUtils.cp. : java.io.FileNotFoundException: Operation failed: "The specified filesystem does not exist.", 404, HEAD, https://xxxxxx.xxx.core.windows.net/xxxxxxxxxxxxx/myfile.csv?upn=false&action=getStatus&timeout=90 at shaded.databricks.azurebfs.org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.checkException(AzureBlobFileSystem.java:1179) at shaded.databricks.azurebfs.org.apach...
HABLOH

1 Answer


The solution is to add this option:

 .option("maxStrLength", "4000")
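For clarity, the writer options from the question can be collected in one place with the new setting added. This is only a sketch: synapse_write_options is a hypothetical helper, and every value except maxStrLength is a placeholder from the original post. The Synapse connector creates string columns as NVARCHAR(256) by default, which is why messages over 256 characters failed; maxStrLength widens that type, up to 4000.

```python
def synapse_write_options(jdbc_string, temp_dir, table):
    """Build the option map for the com.databricks.spark.sqldw writer.

    All arguments are placeholders standing in for the real connection
    string, staging directory, and target table from the question.
    """
    return {
        "url": jdbc_string,
        "tempDir": temp_dir,
        "useAzureMSI": "true",
        "dbTable": table,
        "nullValue": "null",
        # Widen the default NVARCHAR(256) string columns so error
        # messages longer than 256 characters can be written.
        "maxStrLength": "4000",
    }
```

With a helper like this, the original function body would become something along the lines of `df.write.format("com.databricks.spark.sqldw").options(**synapse_write_options(jdbc_string, "mytempDir", "mytable")).mode(mode).save()`.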

I hope this helps someone.
