Denormalization is the process of moving from higher to lower normal forms in database modeling. It attempts to optimize the read performance of a database by adding redundant data or by grouping data, or it is used to maintain a history.
A fully normalized schema shows only the current state. For example, if a customer in the Customer table moves to another address, the old address is overwritten by the new one and lost. This can be solved by maintaining copy columns in the table that hold the old address.
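As a minimal sketch of this idea (using Python's built-in sqlite3 module; the Customer table and its previous_address column are hypothetical names chosen for illustration), one extra column is enough to keep a single generation of history:

```python
import sqlite3

# Hypothetical Customer table that keeps one generation of history by
# copying the old address into a separate column on every change.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Customer (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        address TEXT NOT NULL,
        previous_address TEXT  -- denormalized copy of the last address
    )
""")
conn.execute("INSERT INTO Customer (id, name, address) VALUES (1, 'Alice', '1 Old Street')")

# On a move, copy the current address before overwriting it, so the old
# value is not lost. All SET expressions see the pre-update row values.
conn.execute(
    "UPDATE Customer SET previous_address = address, address = ? WHERE id = ?",
    ("2 New Avenue", 1),
)
print(conn.execute("SELECT address, previous_address FROM Customer").fetchone())
# -> ('2 New Avenue', '1 Old Street')
```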
A normalized database usually contains more tables. This means queries need to perform more joins, with a possible negative impact on performance. Pre-calculated quantities, stored redundantly, can avoid repeatedly joining and aggregating often large sets of data.
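A small illustration of that trade-off, again with sqlite3 and a hypothetical Customer/Orders schema: the normalized read joins and aggregates on every query, while the denormalized read returns a pre-calculated total_spent column directly:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Customer (
        id INTEGER PRIMARY KEY,
        name TEXT NOT NULL,
        total_spent NUMERIC NOT NULL DEFAULT 0  -- pre-calculated aggregate
    );
    CREATE TABLE Orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES Customer(id),
        amount NUMERIC NOT NULL
    );
""")

# Normalized read: join and aggregate the Orders table on every query.
normalized = conn.execute("""
    SELECT c.name, COALESCE(SUM(o.amount), 0)
    FROM Customer c LEFT JOIN Orders o ON o.customer_id = c.id
    GROUP BY c.id
""").fetchall()

# Denormalized read: the total is already stored on the Customer row.
denormalized = conn.execute("SELECT name, total_spent FROM Customer").fetchall()
```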
However, denormalization always brings the danger of update anomalies to the database. It must be done deliberately and be well documented. To make sure an application maintains the denormalized data, transactions are needed. A transaction is the smallest unit of work that must either complete entirely or not at all.
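Continuing the hypothetical schema above, a sketch of such a transaction: sqlite3 commits both statements together or rolls both back, so the Orders rows and the redundant total never diverge on failure:

```python
def add_order(conn, customer_id, amount):
    # Keep the Orders row and the denormalized total in step: using the
    # connection as a context manager, sqlite3 commits both statements
    # on success and rolls both back on any exception.
    with conn:
        conn.execute(
            "INSERT INTO Orders (customer_id, amount) VALUES (?, ?)",
            (customer_id, amount),
        )
        conn.execute(
            "UPDATE Customer SET total_spent = total_spent + ? WHERE id = ?",
            (amount, customer_id),
        )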
If a transaction fails, none of its changes are applied, which keeps the data consistent. Should the denormalized data nevertheless drift out of sync, procedures for rebuilding it from scratch should be in place.
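A sketch of such a rebuild procedure for the hypothetical total_spent column, recomputing every denormalized value from the normalized Orders data:

```python
def rebuild_totals(conn):
    # Throw away the denormalized values and recompute them from the
    # normalized Orders table, the single source of truth.
    with conn:
        conn.execute("""
            UPDATE Customer
            SET total_spent = COALESCE(
                (SELECT SUM(amount) FROM Orders
                 WHERE Orders.customer_id = Customer.id), 0)
        """)
```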
Denormalizing to maintain history is used in data warehouses. Implementing Slowly Changing Dimensions (SCD) requires extra columns in the table, for example validity dates that delimit each version of a row and an indicator marking the current one.
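A minimal sketch of a Type 2 SCD table (the DimCustomer name and its columns are illustrative): each change closes the current row and inserts a new one, so the full address history is preserved:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE DimCustomer (
        surrogate_key INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL,
        address TEXT NOT NULL,
        valid_from TEXT NOT NULL,
        valid_to TEXT,                        -- NULL while the row is current
        is_current INTEGER NOT NULL DEFAULT 1
    )
""")
conn.execute("""
    INSERT INTO DimCustomer (customer_id, address, valid_from)
    VALUES (1, '1 Old Street', '2020-01-01')
""")

def change_address(conn, customer_id, new_address, changed_on):
    with conn:
        # Close the current row instead of overwriting it ...
        conn.execute("""
            UPDATE DimCustomer SET valid_to = ?, is_current = 0
            WHERE customer_id = ? AND is_current = 1
        """, (changed_on, customer_id))
        # ... and insert a new current row, preserving the full history.
        conn.execute("""
            INSERT INTO DimCustomer (customer_id, address, valid_from)
            VALUES (?, ?, ?)
        """, (customer_id, new_address, changed_on))

change_address(conn, 1, '2 New Avenue', '2024-06-01')
```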