I have a Databricks environment, and I need to create a real-time log table that records every instance where any Delta table in my Hive metastore changes: CREATE, ALTER, INSERT, DELETE, any operation that modifies a table.
I want this log to act as a trigger for updating/refreshing downstream reporting, so the data stays connected and consistent from start to finish.
Is this data captured centrally in Databricks? How would you go about setting this up?
I've tried to find a clear answer, but there are too many suggested options, and none of them have worked for me.
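To be concrete, the closest I've gotten is per-table polling of each Delta table's history. The sketch below is plain Python (no Spark) just to show the bookkeeping: the rows are made up, but shaped like what `DESCRIBE HISTORY` returns (`version`, `timestamp`, `operation`). This works table by table, which is exactly what I'm hoping to avoid by finding one central change log:

```python
# Per-table polling bookkeeping, sketched in plain Python.
# Assumption: each table's DESCRIBE HISTORY output is a list of rows with
# "version", "timestamp", and "operation" fields (field names hypothetical).

def new_changes(history_rows, last_seen_version):
    """Return history rows newer than the last version already logged,
    oldest first, so they can be appended to a central change-log table."""
    fresh = [r for r in history_rows if r["version"] > last_seen_version]
    return sorted(fresh, key=lambda r: r["version"])

# Example: versions 0 and 1 already logged, one write since then.
history = [
    {"version": 0, "timestamp": "2024-01-01T00:00:00", "operation": "CREATE TABLE"},
    {"version": 1, "timestamp": "2024-01-02T00:00:00", "operation": "ALTER TABLE"},
    {"version": 2, "timestamp": "2024-01-03T00:00:00", "operation": "WRITE"},
]
print(new_changes(history, last_seen_version=1))
# -> [{'version': 2, 'timestamp': '2024-01-03T00:00:00', 'operation': 'WRITE'}]
```

The problem is that this scales as one history query per table per poll interval, so I'd much rather subscribe to something Databricks already captures centrally than build this loop myself.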