Often, when you handle a "back office" and a "front office", you need two different database structures, the front office's being a transformation of the back office's.
My predecessors have always used a "batch processing" approach (we have thousands of batches running at all times, most of them taking more than an hour, and many of them crashing because of the complexity of the operations).
Right now, I have to do the same, but I'd rather do the transformation in real time. It's not a simple task, because so much data is involved in each write operation.
I have a solution in mind: using triggers (which would call stored procedures), roughly along the lines of the sketch below. But I'm wondering how bad this solution is from a performance point of view: the trigger fires on every write, and calling a complex stored procedure for each row seems like a lot of overhead...
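To make the idea concrete, here is roughly what I have in mind. This is only a minimal sketch in PL/pgSQL (the syntax would differ on another DBMS); the tables back_office_orders / front_office_orders, the function sync_front_office and the trivial transformation are all invented for illustration, the real logic would be much heavier:

```sql
-- Invented example schema: a back-office table and its front-office projection.
CREATE TABLE back_office_orders (
    id              bigint PRIMARY KEY,
    customer_name   text,
    amount_excl_tax numeric
);

CREATE TABLE front_office_orders (
    order_id      bigint PRIMARY KEY,
    customer_name text,
    total_amount  numeric
);

-- Row-level trigger function: transforms the back-office row and upserts it
-- into the front-office table. A placeholder for the real, complex transformation.
CREATE OR REPLACE FUNCTION sync_front_office() RETURNS trigger AS $$
BEGIN
    INSERT INTO front_office_orders (order_id, customer_name, total_amount)
    VALUES (NEW.id, NEW.customer_name, NEW.amount_excl_tax * 1.2)
    ON CONFLICT (order_id)
    DO UPDATE SET customer_name = EXCLUDED.customer_name,
                  total_amount  = EXCLUDED.total_amount;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

-- Fires once per inserted/updated row, which is exactly the overhead that worries me
-- when a single statement touches ~10,000 rows.
CREATE TRIGGER back_office_orders_sync
AFTER INSERT OR UPDATE ON back_office_orders
FOR EACH ROW
EXECUTE FUNCTION sync_front_office();
```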
So here is my question: have you tried this approach on databases whose biggest tables have ~10 million rows (where a single write operation would sometimes need to modify ~10,000 of them)?