0

I have a web application for a retail company. Many calculations and inserts are done in stored procedures, which puts the application under heavy load. Some stored procedures run for 5 hours.

I came up with a caching solution: I created SQL tables where I insert the results after a stored procedure finishes. The system first checks whether there is data in these new tables; if so, it fetches from there, otherwise it runs the stored procedure.
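A minimal T-SQL sketch of that check-cache-else-compute pattern; the table and procedure names (`SalesSummaryCache`, `usp_CalcSalesSummary`) are hypothetical placeholders, not from the original post:

```sql
-- Serve from the cache table if it is populated for this date,
-- otherwise run the expensive procedure and then read the result.
IF EXISTS (SELECT 1 FROM dbo.SalesSummaryCache WHERE ReportDate = @ReportDate)
BEGIN
    SELECT * FROM dbo.SalesSummaryCache WHERE ReportDate = @ReportDate;
END
ELSE
BEGIN
    EXEC dbo.usp_CalcSalesSummary @ReportDate;  -- long-running calculation, inserts into the cache table
    SELECT * FROM dbo.SalesSummaryCache WHERE ReportDate = @ReportDate;
END
```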

To automate it, we made a Windows service that runs 11 stored procedures in parallel. But somehow the process gets stuck partway: only 6 of the stored procedures run while the other 5 don't.
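Before changing the design, it's worth checking whether the five stalled procedures are blocked on locks held by the six that are running; parallel inserts into shared tables commonly block or deadlock. A DMV query along these lines, run while the service appears stuck, shows what each session is waiting on:

```sql
-- Snapshot of active requests: which sessions are blocked, by whom,
-- and what SQL they are executing. Run this while the service is stuck.
SELECT r.session_id,
       r.blocking_session_id,   -- non-zero means this session is blocked
       r.wait_type,
       r.wait_time,
       r.command,
       t.text AS running_sql
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.session_id > 50;        -- skip system sessions
```

If `blocking_session_id` is non-zero for the stalled sessions, the problem is contention rather than the caching approach itself.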

Am I trying the right solution? Can anybody propose a better solution or tweak my flow?

Shalin Jirawla
  • If you have a database performance problem, I suggest you tune that rather than building your own cache solution. SQL Server already has a cache, but if it doesn't have enough memory it can't use it. Unfortunately there isn't actually enough information in your post. You need to check logs and add some more details – Nick.Mc Oct 25 '16 at 12:08
  • For example you can use this to find the slowest query, then tune that - check its query plan and add indexes etc. http://stackoverflow.com/questions/820219/how-to-find-slowest-queries – Nick.Mc Oct 25 '16 at 12:10

1 Answer

0

Here are my suggestions:

  1. Concentrate on optimizing the stored procedures
  2. Check the database indexes
  3. SSIS is one more option (you can use it instead of a Windows service)

If you can run all the stored procedures in parallel, you can schedule them in an SSIS package. Maybe it will help.
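For suggestion 1, a common starting point is the plan-cache statistics DMV, which ranks cached statements by cost so you know which procedure body to tune first. This is a generic sketch, not tied to the poster's schema:

```sql
-- Top 10 cached statements by average CPU per execution.
-- Offsets are in bytes of nvarchar text, hence the divide-by-2.
SELECT TOP (10)
       qs.total_worker_time / qs.execution_count AS avg_cpu,
       qs.execution_count,
       SUBSTRING(t.text, qs.statement_start_offset / 2 + 1,
                 (CASE WHEN qs.statement_end_offset = -1
                       THEN DATALENGTH(t.text)
                       ELSE qs.statement_end_offset END
                  - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS t
ORDER BY avg_cpu DESC;
```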

marc_s
Ramakrishnan