I have a streaming query with trigger once. As per the Spark documentation, once all records have been read from Event Hub, the Spark job should stop, but this is not happening.
dfout = (
    df_events.writeStream
    .format("json")
    .trigger(once=True)
    .option("path", filepath)
    .outputMode("append")
    .option("checkpointLocation", checkpointLocation)
    .start()
)
Please guide me on how to stop the Spark job, as I want to shut down the cluster. My cluster runs once a day: I read all the data and then shut down the cluster.
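For completeness, this is what I am considering adding after start(), so the driver blocks until the single trigger-once batch finishes and the application can exit (a minimal sketch; using awaitTermination() and spark.stop() here is my assumption, not something I have verified fixes the issue):

# dfout is the StreamingQuery handle returned by start() above.
# With trigger(once=True), the query processes everything available at start
# time as one batch; awaitTermination() blocks until that batch completes.
dfout.awaitTermination()

# Stop the SparkSession so the application ends and the cluster can be shut down.
spark.stop()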