I need some help working out the best approach so that our web page, which contains a report, loads as fast as possible.
We use ASP.NET 3.5, IIS, and SQL Server 2008. We also use LLBLGen Pro 2.6.
The source data in the database is spread across several tables (t1, t2, ...); t1 and t2 are the biggest tables, with around 300,000 rows each. We also created one "super table" with proper indexes, gathering only the data we need from all the tables, which ends up with around 350,000 rows.
So far, we have a stored procedure that returns a table after working with t1, t2, ... This procedure is called 12 times via the LLBLGen library as a retrieval procedure, and each returned table is looped over to build the data source.
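Roughly, the 12-call path looks like this (a sketch only; the method, type, and parameter names are hypothetical stand-ins for our generated LLBLGen code, not the actual API):

```csharp
// Sketch of the current approach: 12 stored-procedure calls,
// i.e. 12 database round trips. All names below are placeholders.
var dataSource = new List<ReportRow>();

foreach (int sectionId in sectionIds)          // 12 iterations
{
    // One round trip to SQL Server per call.
    DataTable result = RetrievalProcedures.GetReportSection(sectionId);

    foreach (DataRow row in result.Rows)
    {
        dataSource.Add(MapToReportRow(row));   // hypothetical row mapper
    }
}
```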
In the case of the super table, we query the database once via LLBLGen, which returns a list. We then filter that list into the sublists equivalent to each call of the retrieval procedure, and the data source is built by going through every sublist.
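The super-table path is roughly the following (again a sketch with hypothetical names; .NET 3.5's LINQ to Objects is available for the in-memory split):

```csharp
// Sketch of the super-table approach: one database query, then
// 12 in-memory filters instead of 12 round trips. Names are placeholders.
var dataSource = new List<ReportRow>();

// Single LLBLGen query against the super table (hypothetical helper).
List<SuperTableRow> all = FetchSuperTableRows(userParameters);

foreach (int sectionId in sectionIds)          // same 12 sections as before
{
    // In-memory filter replaces one stored-procedure call.
    var sublist = all.Where(r => r.SectionId == sectionId).ToList();

    foreach (var row in sublist)
    {
        dataSource.Add(MapToReportRow(row));   // hypothetical row mapper
    }
}
```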
The final data source usually contains between 500 and 2,500 entries, depending on the parameters set by the users.
My question is: what would be the best approach to optimize the speed?
Is using LLBLGen a good idea in this scenario? Does it impact speed? Would using the ADO.NET library directly be faster? And is querying a 2,500-element in-memory list faster than calling the stored procedure 12 times, or than querying the super table 12 times?
Thanks all