I am very new to Databricks Auto Loader. I am trying to ingest a simple CSV file with 3 records in the format [Fname, Lname, age]. The following code runs successfully in Databricks, but no data gets saved. I'm sure I'm missing something very basic; can anyone point out where I'm going wrong?
df = (spark.readStream.format("cloudFiles")
    # Auto Loader source: incrementally pick up new CSV files
    .option("cloudFiles.format", "csv")
    .option("header", "true")
    .option("cloudFiles.schemaEvolutionMode", "failOnNewColumns")
    .option("cloudFiles.schemaLocation", "/dbfs/FileStore/temp/schema/")
    .load("/dbfs/FileStore/inbound/dsi/data/")
    # Process everything available once, then stop the stream
    .writeStream.trigger(once=True)
    .option("checkpointLocation", "/dbfs/FileStore/temp/_checkpoint")
    .outputMode("append")
    .start("/dbfs/FileStore/outbound/dsi/output/")
    .awaitTermination())
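For context, the input file I'm trying to ingest looks roughly like the sketch below (the names and ages are hypothetical placeholders, and `sample.csv` is just an illustrative local filename, not the actual path in DBFS):

```python
import csv

# Hypothetical sample of the input: a header row plus 3 records
# in the [Fname, Lname, age] format described above.
rows = [
    ["Fname", "Lname", "age"],
    ["Alice", "Smith", "34"],
    ["Bob", "Jones", "28"],
    ["Carol", "Lee", "41"],
]

with open("sample.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)

with open("sample.csv") as f:
    print(f.read())
```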
Thanks in advance for any help.