I have an app that loads data from a CSV file into a pre-staging table and then runs a few stored procedures (SPs) on it before moving it to staging:
- The table is truncated before the inserts are done
- The database uses the simple recovery model, since data recovery does not matter at this stage
- The entire process is done daily
- Two SPs run before the move to staging: one that bulk inserts the file into the table, and another that removes the quote characters (a simplified sketch follows this list).
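
For concreteness, the daily flow looks roughly like this. The table name, columns, and file path are placeholders, and the real SPs differ in detail:

```sql
-- Simplified sketch of the daily flow; dbo.PreStaging, its columns,
-- and the file path are placeholders, not the real schema.
TRUNCATE TABLE dbo.PreStaging;

-- SP 1: bulk insert the daily file
BULK INSERT dbo.PreStaging
FROM 'D:\feeds\daily.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);

-- SP 2: quote removal, currently done row by row
DECLARE @id INT;
DECLARE cur CURSOR LOCAL FAST_FORWARD FOR
    SELECT Id FROM dbo.PreStaging;
OPEN cur;
FETCH NEXT FROM cur INTO @id;
WHILE @@FETCH_STATUS = 0
BEGIN
    -- each single-row UPDATE is individually logged
    UPDATE dbo.PreStaging
       SET Col1 = REPLACE(Col1, '"', ''),
           Col2 = REPLACE(Col2, '"', '')
     WHERE Id = @id;
    FETCH NEXT FROM cur INTO @id;
END;
CLOSE cur;
DEALLOCATE cur;
```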
The problem is that the CSV file typically has around 1.5 million rows. So every day the process truncates a 1.5-million-row table with TRUNCATE TABLE, and then does the quote removal row by row. I strongly suspect these two steps are what is driving the transaction log growth.
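
One variant I have sketched out, using the same placeholder schema as above and an arbitrary batch size, replaces the per-row updates with batched set-based updates, so that under the simple recovery model the log only ever has to hold about one batch at a time:

```sql
-- Batched, set-based quote removal (placeholder schema, example batch size).
DECLARE @batch INT = 50000;

WHILE 1 = 1
BEGIN
    UPDATE TOP (@batch) dbo.PreStaging
       SET Col1 = REPLACE(Col1, '"', ''),
           Col2 = REPLACE(Col2, '"', '')
     WHERE Col1 LIKE '%"%' OR Col2 LIKE '%"%';

    IF @@ROWCOUNT = 0 BREAK;  -- nothing left to clean

    CHECKPOINT;  -- under simple recovery, lets the inactive log be reused
END;
```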
I have been researching ways to do the quote removal in our back-end instead of in an SP, which should hopefully help limit the growth. Beyond that, what could I do on the DB side that I haven't done already, so that the log doesn't grow so dramatically over time?
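
For reference, one DB-side option I have already come across, assuming SQL Server 2017 or later (the FORMAT and FIELDQUOTE options do not exist on older versions), is letting BULK INSERT strip the quotes at load time, which would make the second SP unnecessary:

```sql
-- Assumes SQL Server 2017+; path and options are placeholders.
BULK INSERT dbo.PreStaging
FROM 'D:\feeds\daily.csv'
WITH (
    FORMAT = 'CSV',    -- RFC 4180-style parsing, new in SQL Server 2017
    FIELDQUOTE = '"',  -- quoted fields are unquoted on load
    FIRSTROW = 2,
    TABLOCK            -- can enable minimal logging under simple recovery
);
```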