I have an authoritative data source, and a service pushes the data to a secondary data source across the network every time the original is updated, similar to an offsite backup. I'm looking for a way to ensure that the secondary data source stays consistent with the original.
I think there are at least two different things I need to check.
Data integrity: I believe I can verify this with a checksum-style error check, comparing a digest of the data on each side.
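To make the idea concrete, here's a rough sketch of what I had in mind in Python. The `checksum` helper and the hard-coded record lists are just illustrative; the real data would come from each data source:

```python
import hashlib

def checksum(rows):
    """Compute a single digest over an ordered collection of records."""
    h = hashlib.sha256()
    for row in rows:
        h.update(repr(row).encode("utf-8"))
    return h.hexdigest()

# Periodically compare a digest of the primary's records with the secondary's.
primary = [("id1", "alice"), ("id2", "bob")]
secondary = [("id1", "alice"), ("id2", "bob")]
print(checksum(primary) == checksum(secondary))
```

One thing I'm unsure about is that this only detects a mismatch; it doesn't tell me which records differ, so I'd presumably need to chunk the data and checksum each chunk to narrow it down.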
Data reaching its destination: I'm not sure how I can ensure that the data is actually getting recorded in the secondary data source. There is a possibility that the data never reaches the destination because of a network issue or something similar.
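The only approach I've thought of so far is to treat a record as delivered only after the secondary positively acknowledges it, retrying otherwise. A hypothetical sketch (the `transmit` callback stands in for my real network call and is not an actual API):

```python
import time

def send_with_ack(record, transmit, max_retries=3, backoff=0.1):
    """Send a record; count it as delivered only after a positive ack.

    `transmit` is a placeholder for the real network call; it should
    return True only when the secondary confirms it recorded the data.
    Returns False if no acknowledgment arrives after all retries.
    """
    for attempt in range(max_retries):
        if transmit(record):
            return True
        time.sleep(backoff * (attempt + 1))  # simple linear backoff
    return False  # unconfirmed; would need to be queued for reconciliation
```

Records that return `False` would go into some kind of retry queue, but I'm not sure whether rolling my own like this is sensible versus using an existing reliable-delivery mechanism.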
Are there any best practices that I can use to ensure that the secondary data source has data consistency with the original data source?