I'm receiving events from Event Hub and continuously streaming the data with PySpark using the code below:
from pyspark.sql.functions import col

events = (
    spark
    .readStream
    .format("eventhubs")
    .options(**ehConf)
    .load()
    .withColumn("body", col("body").cast("string"))
)
Here it fetches all the event bodies and partitions and loads them into the events DataFrame. How can I fetch the events from Event Hub and write them directly into Azure Data Lake?
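For context, what I've been trying is roughly the following sketch: attach a writeStream sink to the same events DataFrame and point it at an ADLS Gen2 path. The abfss:// URIs, container, and storage account names below are placeholders, not real values, and the format/trigger choices are just assumptions:

```python
# Sketch only: assumes `events` is the streaming DataFrame built above,
# and that the cluster is already configured with credentials for ADLS Gen2.
# <container> and <account> are placeholders to be replaced with real names.
query = (
    events
    .writeStream
    .format("parquet")  # assumption: plain Parquet files; Delta is another option on Databricks
    .option("path", "abfss://<container>@<account>.dfs.core.windows.net/eventhub/events/")
    .option("checkpointLocation", "abfss://<container>@<account>.dfs.core.windows.net/eventhub/checkpoints/")
    .outputMode("append")
    .start()
)
```

Is this the right approach for landing the data directly in the lake, or is there a better pattern?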