The simplest way to achieve this is to use Spark Structured Streaming with Trigger.Once. On each invocation it uses the checkpoint to track which files have already been processed, and handles only the new ones. (Note that the file source only detects newly added files, so this works for append-style updates; overwrites of existing files are not picked up.)
In the simplest case it could look like this (in Python; the Scala version is almost identical; for the Event Hubs connection parameters, see the connector docs):
# Your Event Hubs connection string (placeholder)
writeConnectionString = "YOUR.EVENTHUB.NAME"
# Note: on recent versions of the azure-event-hubs-spark connector the
# connection string must be encrypted with EventHubsUtils.encrypt
ehWriteConf = {
    'eventhubs.connectionString': writeConnectionString
}
# Streaming file sources require an explicit schema;
# here it is inferred from a one-off batch read
schema = spark.read.parquet("/path/to/data").schema

(spark.readStream
    .format("parquet")
    .schema(schema)
    .load("/path/to/data")
    # The Event Hubs sink expects the payload in a 'body' column
    .selectExpr("to_json(struct(*)) AS body")
    .writeStream
    .format("eventhubs")
    .options(**ehWriteConf)
    .option("checkpointLocation", "/path/to_checkpoint")
    .trigger(once=True)
    .start())
The actual solution could be more complex, depending on additional requirements that weren't described.
P.S. I would really recommend using Delta instead of Parquet as the file format; it gives ACID guarantees and is natively supported as a streaming source.
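With Delta the same incremental pipeline becomes simpler, since a Delta table can be read as a stream directly and no explicit schema is needed. A minimal sketch, assuming the same hypothetical paths and the ehWriteConf defined above:

# Delta tracks table changes itself, so no schema is required
# for the streaming source
(spark.readStream
    .format("delta")
    .load("/path/to/delta-table")
    # The Event Hubs sink expects the payload in a 'body' column
    .selectExpr("to_json(struct(*)) AS body")
    .writeStream
    .format("eventhubs")
    .options(**ehWriteConf)
    .option("checkpointLocation", "/path/to_checkpoint")
    .trigger(once=True)
    .start())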