
I have a Databricks job that writes to a Delta table. After the write completes, the job calls another function that reads the same Delta table and calculates some metrics (counts, to be specific) on it. However, the counts/metrics being calculated are lower than the actual counts for the given partition_dates. To verify whether the metrics function itself works correctly, I called it manually after the Databricks job had completed, and the counts were correct in that run.
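
For context, the flow looks roughly like this (the table and function names below are placeholders, not the actual job code):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

def write_partition(df, partition_date):
    # Step 1: the job overwrites the partition in the Delta table
    (df.write
       .format("delta")
       .mode("overwrite")
       .option("replaceWhere", f"partition_date = '{partition_date}'")
       .saveAsTable("my_db.my_delta_table"))

def compute_metrics(partition_date):
    # Step 2: immediately afterwards, another function counts rows
    # for the same partition_date
    return (spark.read.table("my_db.my_delta_table")
                 .filter(F.col("partition_date") == partition_date)
                 .count())
```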

I suspect the Delta table is not fully updated by the time I try to read from it, even though the write operation completed successfully.
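
Would forcing a metadata refresh before the read make a difference? Something along these lines (again just a sketch with placeholder names):

```python
# Invalidate any cached metadata/data for the table before counting
spark.catalog.refreshTable("my_db.my_delta_table")
# or equivalently in SQL:
spark.sql("REFRESH TABLE my_db.my_delta_table")

count = (spark.read.table("my_db.my_delta_table")
              .filter(F.col("partition_date") == partition_date)
              .count())
```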

Any help would be appreciated.

