
Often, when you handle a "back office" and a "front office", you need two different database structures, the front office's being a transformation of the back office's.

My predecessors have always used a "batch processing" approach (we have thousands of batches running at all times; most of them run for more than an hour, and most of them crash regularly because of the complexity of the operations).

Right now, I've got to do the same, but I'd rather do the transformation in real time. It's not a simple task, because so much data is involved in each write operation.

I've got a solution in mind: using triggers (which will call a stored procedure). But I'm wondering how bad this solution is from a performance point of view: the trigger is called for each write, and calling a complex stored procedure for each row seems like a lot of overhead...
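
To make it concrete, here is a rough sketch of what I have in mind, in MySQL-flavored syntax (our actual engine may differ); bo_orders, bo_order_lines and fo_orders are placeholder names standing in for a back-office source and its front-office projection:

    -- Hypothetical schema: bo_orders / bo_order_lines are the back office,
    -- fo_orders (keyed on order_id) is the front-office projection.
    DELIMITER //

    CREATE PROCEDURE sync_fo_order(IN p_order_id INT)
    BEGIN
        -- Recompute one front-office row from the back-office data.
        REPLACE INTO fo_orders (order_id, status, total)
        SELECT o.order_id, o.status, SUM(l.qty * l.unit_price)
        FROM   bo_orders o
        JOIN   bo_order_lines l ON l.order_id = o.order_id
        WHERE  o.order_id = p_order_id
        GROUP  BY o.order_id, o.status;
    END//

    CREATE TRIGGER bo_orders_after_update
    AFTER UPDATE ON bo_orders
    FOR EACH ROW
    BEGIN
        -- Fires once per modified row: an UPDATE touching 10,000 rows
        -- means 10,000 procedure calls inside the same transaction.
        CALL sync_fo_order(NEW.order_id);
    END//

    DELIMITER ;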

So here is my question: have you tried this approach on databases whose biggest tables have ~10 million rows (where the write process would sometimes need to modify ~10,000 of them)?

Bruno
  • I've progressed in my analysis: I think that doing everything in real time is dangerous because of the peaks of activity, so I'll store the work in a queue that is processed very often; that way it will be near real time and will smooth out the peaks of activity. – Bruno Aug 10 '15 at 15:14
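
Roughly, that queue idea could look like the following (same placeholder tables as above, MySQL-flavored; the one-minute interval is arbitrary, and the event scheduler has to be enabled):

    CREATE TABLE sync_queue (
        id       BIGINT AUTO_INCREMENT PRIMARY KEY,
        order_id INT NOT NULL
    );

    DELIMITER //

    -- Replaces the per-row procedure call: the write path only enqueues keys.
    CREATE TRIGGER bo_orders_enqueue
    AFTER UPDATE ON bo_orders
    FOR EACH ROW
    BEGIN
        INSERT INTO sync_queue (order_id) VALUES (NEW.order_id);
    END//

    -- The heavy transformation runs here, as one set-based statement,
    -- instead of inside every write.
    CREATE EVENT drain_sync_queue
    ON SCHEDULE EVERY 1 MINUTE
    DO
    BEGIN
        DECLARE max_id BIGINT;
        SELECT MAX(id) INTO max_id FROM sync_queue;

        REPLACE INTO fo_orders (order_id, status, total)
        SELECT o.order_id, o.status, SUM(l.qty * l.unit_price)
        FROM   bo_orders o
        JOIN   bo_order_lines l ON l.order_id = o.order_id
        WHERE  o.order_id IN (SELECT order_id FROM sync_queue WHERE id <= max_id)
        GROUP  BY o.order_id, o.status;

        -- Only delete what was processed, so keys queued meanwhile survive.
        DELETE FROM sync_queue WHERE id <= max_id;
    END//

    DELIMITER ;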

1 Answer


we have thousands of batches running at all times

Do not run more than N heavy SQL statements at a time, where N is the number of CPU cores in the machine.

Otherwise, they will stumble over each other and each will run much slower.
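
One way to check this before launching another heavy job, sketched against MySQL's information_schema.PROCESSLIST (the 60-second cutoff is only an illustrative definition of "heavy"):

    -- Count statements that have already been running for a while; if this
    -- reaches the core count, delay the next batch instead of starting it now.
    SELECT COUNT(*) AS heavy_running
    FROM   information_schema.PROCESSLIST
    WHERE  command = 'Query'
      AND  time > 60;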

Rick James