First post, so please forgive me if I get it wrong.
What I am looking for is a methodology or standard for comparing data stored in a database against the original field data. We have a database that has been populated from field data, and I want to check that the field data made it into the database accurately without having to compare every single record.
I can't rely on the business rules or on checking the integrity of the links within the database, because the process by which the data went from the field into the database was outside our control (and I think a lot of it was entered manually or subsequently changed in some way), which is why we need to check that link. So what I want to do is (probably) sample the database and compare the sampled records against the original field data, to get some statistical measure of the quality of the data in the database. Is there a standard for this, or a common methodology? Or do I have to audit it record by record, in which case I might as well rebuild the database from scratch!
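To make it concrete, here is roughly what I imagine the audit looking like, as a minimal Python sketch. Everything in it is an assumption on my part: db_records and field_records are placeholder names for the two data sets loaded as dictionaries keyed by record ID, and the 95% confidence / 5% margin figures are just example targets, not requirements.

```python
import math
import random

def sample_size(margin=0.05, z=1.96, p=0.5):
    """Number of records to audit so the estimated error rate is
    within +/- margin at ~95% confidence (worst case p = 0.5)."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

def audit(db_records, field_records):
    """Randomly sample record IDs and count mismatches between the
    database copy and the original field data."""
    n = min(sample_size(), len(db_records))
    ids = random.sample(list(db_records), n)
    errors = sum(db_records[i] != field_records[i] for i in ids)
    rate = errors / n
    # Normal-approximation 95% confidence interval for the error rate
    half = 1.96 * math.sqrt(rate * (1 - rate) / n)
    return rate, max(0.0, rate - half), min(1.0, rate + half)
```

If sampling like this is the right idea, what I'd still like to know is whether there is a recognised standard that specifies how to choose the sample size and the acceptance criteria, rather than me picking numbers out of the air.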
Thanks for looking