
I'm going to need to import some data that has several layers of hierarchy, approximately 5-7. So, while I could create staging tables with FK constraints to mimic the final destination tables, I'd rather keep it simple with a single "staging" table. The complicated bit is that an application will be built to let users make "adjustments" to this staging data before it goes the rest of the way into the system. That means the application would have to select and update data in a single flat table that is really better represented by the same, or a very similar, structure as its final destination.

So I guess my question boils down to this regarding staging data:
Given that some amount of processing must be done on any "staged" data before it's imported into its destination, is it bad practice to de-normalize staging tables?
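
To illustrate what I mean (the table and column names below are made up, not my actual schema), the single flat staging table versus the normalized destination might look roughly like this:

```sql
-- Hypothetical 3-level hierarchy (company > department > employee),
-- collapsed into a single flat staging table: parent values are repeated
-- on every row, and there are no FK constraints to maintain.
CREATE TABLE staging_import (
    staging_id      INT PRIMARY KEY,
    company_name    VARCHAR(100),
    department_name VARCHAR(100),
    employee_name   VARCHAR(100),
    employee_email  VARCHAR(255),
    is_adjusted     BOOLEAN DEFAULT FALSE   -- tracks user "adjustments"
);

-- The normalized destination tables the data eventually has to reach.
CREATE TABLE company (
    company_id   INT PRIMARY KEY,
    company_name VARCHAR(100) NOT NULL
);

CREATE TABLE department (
    department_id   INT PRIMARY KEY,
    company_id      INT NOT NULL REFERENCES company(company_id),
    department_name VARCHAR(100) NOT NULL
);

CREATE TABLE employee (
    employee_id    INT PRIMARY KEY,
    department_id  INT NOT NULL REFERENCES department(department_id),
    employee_name  VARCHAR(100) NOT NULL,
    employee_email VARCHAR(255)
);
```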

bitshift
  • [(Re)organizing tables for hierarchies/subtyping](https://stackoverflow.com/q/4048151/3404097) is not normalization. Why do you use the term "de-normalize"? – philipxy Jan 22 '19 at 01:03
  • The staging table should have the exact same structure as your production table – otherwise you will need a special ETL process to **convert** the data from staging into production. If your production table is de-normalized, so should your staging table be, and vice versa. – IVO GELOV Jan 27 '19 at 18:35

1 Answer


Credit to IVO GELOV - thanks, that's what I was thinking and what I've typically done so far. Yet, with 6-10 new destination tables, I was thinking it would be simpler to keep things "flat" at the point of staging. However, given that certain functions will need to operate on the staged data, it makes sense to give these staging tables structures that mimic their destinations.
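
In other words, the staging tables end up mirroring the destination tables, just kept separate (and possibly with looser constraints while users are still making adjustments). A rough sketch of what I mean, using made-up names:

```sql
-- Illustrative only: staging tables kept in their own schema, with the same
-- shape as the destination tables but relaxed NOT NULL rules so partially
-- adjusted data can still be saved.
CREATE SCHEMA staging;

CREATE TABLE staging.company (
    company_id   INT PRIMARY KEY,
    company_name VARCHAR(100)   -- NOT NULL enforced only at final import
);

CREATE TABLE staging.department (
    department_id   INT PRIMARY KEY,
    company_id      INT REFERENCES staging.company(company_id),
    department_name VARCHAR(100)
);

-- Because the shapes match, the final load is just an INSERT ... SELECT per table:
-- INSERT INTO company (company_id, company_name)
-- SELECT company_id, company_name FROM staging.company;
```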

bitshift