I want to implement an incremental load pattern for a source system that has no audit fields indicating when a record was last modified, e.g. a "Last Modified On" (datetime) column.
The tables do have primary keys and unique keys, which the application uses to update a record whenever any of its attributes change.
My question is: how can I determine the deltas every day and load them into Azure Data Lake using Azure Data Factory / Databricks?
Should I stage the full data set for the current day and the previous day (current day - 1) and determine the deltas by comparing hash values?
Or is there a better way?
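
To make it concrete, this is roughly the hash-comparison approach I'm considering, as a PySpark sketch on Databricks. The paths, table name, key columns, and dates below are just placeholders for illustration:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

key_cols = ["customer_id"]  # primary / unique key columns (placeholder)

# Full extracts staged for the current day and the previous day (placeholder paths)
today = spark.read.parquet("/mnt/lake/staging/customers/2024-01-02")
yesterday = spark.read.parquet("/mnt/lake/staging/customers/2024-01-01")

def with_row_hash(df):
    """Add a SHA-256 hash over all non-key columns to detect attribute changes."""
    non_key_cols = [c for c in df.columns if c not in key_cols]
    return df.withColumn(
        "row_hash",
        F.sha2(
            F.concat_ws(
                "||",
                *[F.coalesce(F.col(c).cast("string"), F.lit("")) for c in non_key_cols]
            ),
            256,
        ),
    )

t = with_row_hash(today).alias("t")
y = with_row_hash(yesterday).alias("y")

# Delta = new rows (key not present yesterday) plus changed rows (same key, different hash)
delta = (
    t.join(y, on=key_cols, how="left")
     .where(F.col("y.row_hash").isNull() | (F.col("t.row_hash") != F.col("y.row_hash")))
     .select("t.*")
     .drop("row_hash")
)

delta.write.mode("overwrite").parquet("/mnt/lake/delta/customers/2024-01-02")
```

This would be orchestrated daily from Azure Data Factory (copy the full extract into the staging path, then run the notebook), but it still means pulling the full table from the source every day, which is what I'd like to avoid if possible.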