
I need some help working out the best approach so that our web page containing a report loads fast.

We use ASP.NET 3.5, IIS and SQL Server 2008. Also we use LLBLGen Pro 2.6.

The source data in the database is spread across several tables (t1, t2, ...). t1 and t2 are the biggest tables, with around 300,000 entries. We also created one super table with proper indexes, gathering only the data we need from all the tables, ending up with around 350,000 entries.

So far, we have a stored procedure that works on t1, t2, ... and returns a table. This procedure is called 12 times via the LLBLGen library as a retrieval procedure. Each returned table is looped over to build the data source.
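To make the current pattern concrete, here is a minimal sketch of the 12-call approach, assuming the LLBLGen Pro SelfServicing templates, which expose mapped stored procedures as static methods on a generated RetrievalProcedures class. The procedure name GetReportData and its parameters are placeholders, not the real procedure.

    using System;
    using System.Collections.Generic;
    using System.Data;

    public static class ReportLoader
    {
        // One retrieval-procedure call per time frame: 12 round trips to SQL Server,
        // 12 DataTables to loop over when building the report data source.
        public static List<DataRow> LoadViaProcedure(IList<KeyValuePair<DateTime, DateTime>> timeFrames)
        {
            var rows = new List<DataRow>();
            foreach (KeyValuePair<DateTime, DateTime> frame in timeFrames)
            {
                // LLBLGen Pro (SelfServicing) exposes mapped stored procedures as
                // static methods on the generated RetrievalProcedures class.
                // "GetReportData" and its parameters are placeholder names.
                DataTable table = RetrievalProcedures.GetReportData(frame.Key, frame.Value);
                foreach (DataRow row in table.Rows)
                {
                    rows.Add(row); // build the data source from each returned table
                }
            }
            return rows;
        }
    }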

In the case of the super table, we query the database via LLBLGen, which returns a list. Then we query that list to get a sublist equivalent to each call of the retrieval procedure. The data source is built by going through every sublist.
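And here is a rough sketch of the super-table variant: fetch the rows once, then carve the in-memory list into one sublist per time frame with LINQ to Objects (available in .NET 3.5). SuperEntity and its Date property, as well as the TimeFrame class, are placeholders for whatever LLBLGen generates for the super table.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Placeholder for one of the 12 reporting periods.
    public class TimeFrame
    {
        public string Name;
        public DateTime Start;
        public DateTime End;
    }

    public static class SuperTableLoader
    {
        // Split the in-memory list from the super table into one sublist per time frame.
        public static Dictionary<string, List<SuperEntity>> Split(
            List<SuperEntity> allRows, IEnumerable<TimeFrame> timeFrames)
        {
            var result = new Dictionary<string, List<SuperEntity>>();
            foreach (TimeFrame frame in timeFrames)
            {
                // One full in-memory pass per frame; if allRows holds far more rows
                // than the report needs, filtering in the WHERE clause is usually cheaper.
                result[frame.Name] = allRows
                    .Where(r => r.Date >= frame.Start && r.Date < frame.End)
                    .ToList();
            }
            return result;
        }
    }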

The final data source usually contains between 500 and 2,500 entries, depending on the parameters set by the users.

My question is: what would be the best approach to optimize the speed?

Is using LLBLGen a good idea in this scenario? Does it impact the speed? Would plain ADO.NET be better? Is querying a 2,500-element list faster than calling a stored procedure 12 times, or than querying the super table 12 times?
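For comparison with the LLBLGen path, a plain ADO.NET version of one procedure call would look roughly like the sketch below. The connection string, procedure name and parameter names are placeholders; the point is that raw ADO.NET removes the ORM layer, but the number of round trips stays the same.

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    public static class AdoNetLoader
    {
        // One stored-procedure call for one time frame, read with a SqlDataReader.
        public static List<object[]> GetReportData(string connectionString, DateTime start, DateTime end)
        {
            var rows = new List<object[]>();
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("dbo.GetReportData", connection))
            {
                command.CommandType = CommandType.StoredProcedure;
                command.Parameters.Add("@StartDate", SqlDbType.DateTime).Value = start;
                command.Parameters.Add("@EndDate", SqlDbType.DateTime).Value = end;

                connection.Open();
                using (SqlDataReader reader = command.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        var values = new object[reader.FieldCount];
                        reader.GetValues(values); // copy the row's columns
                        rows.Add(values);
                    }
                }
            }
            return rows;
        }
    }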

Thanks all

Greg
  • One question I have, is the stored procedure that's looping -- what does it do? Does it do the same thing every time it runs 12x? Can you schedule it to only incur that load one time, then query the resulting super table on-demand via the web UI? Or does it do something different for every user based on input variables? – Dave C Jun 11 '14 at 21:00
  • @JiggsJedi The stored procedure queries t1, t2, ... for time frame 1, sorts, and does basic calculations like count, sum and avg. The second call is for a second time frame. We have 12 different time frames we need to get. I could have the procedure called just once to return everything and then do the job for each time frame by working on the returned table set. At this point we are very flexible; we are looking to know what would give the best performance. – Greg Jun 12 '14 at 13:20
  • If each run of the set of 12 loops yields the same results then I would definitely schedule it overnight. You can then approach this from two avenues. 1) post the code and see if it can be tweaked if it runs long... analyze execution plans etc. 2) analyze and index as needed for user queries against resulting tables. Maybe the proc should be in control of index generation too... as they slow insert/update operations. You can easily drop the indexes that support the users... crunch your numbers... then rebuild them. – Dave C Jun 12 '14 at 13:47
  • Coming back a bit late. We finally finished the project and pushed it to the live website. Here is what we did, and everything is running much better this way. We created a few tables for each time frame: one table is for requests regarding a 2-week time frame, another one for up to two, or one year... Every night we do all the work to insert the data ready to be used by the page. So no more loops, stored procedures or subqueries. Just a simple select (a sketch of that page-time query follows these comments). Thanks for the help :) – Greg Jul 02 '14 at 19:03
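For reference, the page-time query described in the last comment could look roughly like the sketch below once a nightly job has pre-aggregated the data into per-time-frame tables. The table and column names are assumptions, not the real schema; the point is that the page does a single parameterized SELECT and nothing else.

    using System.Data;
    using System.Data.SqlClient;

    public static class PreAggregatedReport
    {
        // One SELECT against a nightly-built, pre-aggregated table: no loops,
        // no stored-procedure calls, no in-memory sublists at page time.
        public static DataTable Load(string connectionString, string userFilter)
        {
            var table = new DataTable();
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(
                "SELECT TimeFrame, ItemCount, ItemSum, ItemAvg " +
                "FROM dbo.ReportTwoWeeks WHERE Category = @Category", connection))
            {
                command.Parameters.Add("@Category", SqlDbType.NVarChar, 50).Value = userFilter;
                using (var adapter = new SqlDataAdapter(command))
                {
                    adapter.Fill(table); // one round trip; indexes on the nightly table keep it fast
                }
            }
            return table;
        }
    }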

0 Answers