
Bit of a theoretical question here.

I have made a database search interface for an ASP.NET website. I am using LINQ to SQL for the BL operations. I have been looking at different ways to implement efficient paging, and I think I have the best method now, but I am not sure how big the difference in performance really is and was wondering if any experts have explanations/advice to give?

METHOD 1: The traditional method I have seen in a lot of tutorials uses pure LINQ to SQL and seems to create one method to get the data, and then another method which returns the count for the pager. I guess these could be grouped into a single method, but essentially the result seems to be that an IQueryable is created which represents the whole query, and IQueryable.Count() and IQueryable.Skip().Take() are then performed on it.
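A minimal sketch of that pattern, assuming a hypothetical DataContext with a Products table (the context, entity, and filter names here are illustrative, not from the question):

```csharp
// Hypothetical LINQ to SQL paging: one IQueryable, evaluated twice.
// MyDataContext, Product and the Name filter are illustrative only.
public static List<Product> GetPage(
    MyDataContext db, string term, int pageIndex, int pageSize, out int total)
{
    IQueryable<Product> query =
        db.Products.Where(p => p.Name.Contains(term));

    total = query.Count();              // round trip 1: SELECT COUNT(*)

    return query
        .OrderBy(p => p.ProductID)      // paging needs a stable order
        .Skip(pageIndex * pageSize)     // translated to SQL, not paged in memory
        .Take(pageSize)
        .ToList();                      // round trip 2: the page itself
}
```

Note that because Skip/Take are applied to the IQueryable before ToList(), both operations are translated into the generated SQL; the two database round trips are what the criticism below is about.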

I saw some websites criticising this method because it causes two queries to be evaluated, which apparently is not as efficient as using a stored procedure... Since I am using fulltext search anyway, I needed to write an SP for my search, so in light of those comments I decided to do the paging and counting in the SP as well. So I got:

METHOD 2: A call to the stored procedure from the BL. In the SP the WHERE clause is assembled from the fields specified by the user and a dynamic query is created. The results of the dynamic query are inserted into a table variable with a temporary identity key, on which I perform a COUNT(*) and a SELECT ... WHERE (temp_ID >= x AND temp_ID < y).
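A sketch of that shape of procedure body (the dynamic WHERE assembly is elided; the table, columns, and parameters are hypothetical stand-ins):

```sql
-- Hypothetical paging SP body: materialise the matches once,
-- then count and slice by the temporary identity key.
DECLARE @Results TABLE (
    Temp_ID   INT IDENTITY(1,1) PRIMARY KEY,
    ProductID INT,
    Name      NVARCHAR(200)
);

INSERT INTO @Results (ProductID, Name)
SELECT p.ProductID, p.Name
FROM Products p
WHERE CONTAINS(p.Name, @SearchTerm);   -- stands in for the dynamic WHERE clause

DECLARE @From INT = @PageSize * @PageIndex + 1;
DECLARE @To   INT = @PageSize * (@PageIndex + 1) + 1;

SELECT COUNT(*) AS Total               -- result set 1: count for the pager
FROM @Results;

SELECT ProductID, Name                 -- result set 2: the requested page
FROM @Results
WHERE Temp_ID >= @From AND Temp_ID < @To;
```

The filter is still evaluated only once here; the COUNT(*) and the page SELECT both run against the already-materialised table variable.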

It looks to me like those two methods are in principle performing the same operations...

I was wondering whether method 2 actually is more efficient than method 1 (leaving aside the fact that fulltext search is not available in LINQ to SQL...). And why? And by how much? In my understanding, the SP requires the query to be generated only once, so it should be more efficient. But what other benefits are there?

Are there any other ways to perform efficient paging?

yu_ominae
  • have you measured performance? – Mitch Wheat Jul 27 '11 at 00:35
  • No I haven't, it is probably something I should do. I am a bit wary of performance measurements though, because they depend a lot on what exactly you do and I am looking for general information as to why one approach should be faster or generally better than the other. – yu_ominae Jul 27 '11 at 01:33

1 Answer


I finally got around to doing some limited benchmarks on this. I've only tested this on a database with 500 entries, because that is what I have to hand so far.

In one case I used a dynamic SQL query with

SELECT *, ROW_NUMBER() OVER (...) AS RN FROM ... WHERE RN BETWEEN @PageSize * @PageCount AND @PageSize * (@PageCount + 1)

in the other I use the exact same query, but without the ROW_NUMBER() and WHERE ... clauses, and do a

db.StoredProcedure.ToList().Skip(PageSize * PageCount).Take(PageSize);

in the method.
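For reference, the SQL-side version above needs the row number computed in a CTE or derived table first, since ROW_NUMBER() cannot be referenced directly in the WHERE clause of the same SELECT (table and column names here are hypothetical):

```sql
-- Hypothetical rewrite of the SQL-side paging query.
;WITH Numbered AS (
    SELECT p.ProductID, p.Name,
           ROW_NUMBER() OVER (ORDER BY p.ProductID) AS RN
    FROM Products p
)
SELECT ProductID, Name
FROM Numbered
WHERE RN > @PageSize * @PageCount
  AND RN <= @PageSize * (@PageCount + 1);
```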

I tried returning datasets of 10 and 100 items, and as far as I can tell the difference in time taken is negligible: 0.90s when paging in the stored procedure, 0.89s when paging with Skip/Take.

I also tried adding count methods as you would do if you wanted to make a pager. In the stored procedure this seems to add a very slight overhead (going from 0.89s to 0.92s) from performing a second select on the full set of results. That would probably increase with the size of the dataset.

I also added a second call to the LINQ to SQL query with a .Count() on it, as you would if you had the two methods required for ASP.NET paging, and that didn't seem to affect execution speed at all.

These tests probably aren't very meaningful given the small amount of data, but that's the kind of dataset I work with at the moment. You'd probably expect a performance hit with LINQ to SQL as the datasets to evaluate become larger...

yu_ominae