3

Are there any obvious reasons why calling a stored procedure via my entity model would result in far slower performance than calling it directly?

Firstly, I don't expect the SP to run at exactly the same speed, and I know there are various things EF has to do that wouldn't happen when accessing the SP directly.

That aside, I have a query that returns three columns of strings. It runs pretty much instantly when I execute it via Enterprise Manager. If I run it via EF, it takes about six seconds. Granted, the results are being mapped into a complex type, but when I've run the query through SQL Server Profiler it's clear to see that the delay happens on the SQL server:

[Image: SQL Server Profiler trace showing the performance of the query executed from two sources]

On the diagram, 1 indicates the SQL being called from Enterprise Manager, 2 indicates it being called via my app using EF.

Is there anything obvious I'm doing wrong here? I'd expect a delay of maybe a second or two, but the difference seems too great.

EDIT:

It seems that the stored procedure also runs slowly when called via ADO.Net. My colleague seems to think it's something to do with a bad execution plan that .Net is caching. By editing the stored procedure and saving it again it seemed to clear whatever was in the cache and both the ADO.Net and EF call to the stored procedure work well.
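If anyone wants to reproduce the "edit and re-save the procedure" fix without actually touching the procedure's definition, the same effect can be had by invalidating its cached plan. A minimal sketch, assuming a hypothetical procedure name `dbo.GetReportData` (substitute your own):

```
-- Mark the procedure for recompilation; its cached plans are discarded
-- and a fresh plan is built on the next execution:
EXEC sp_recompile N'dbo.GetReportData';

-- Alternatively (SQL Server 2008+), find and evict just this procedure's
-- plan from the cache rather than flushing everything:
SELECT plan_handle
FROM sys.dm_exec_procedure_stats
WHERE object_id = OBJECT_ID(N'dbo.GetReportData');
-- then: DBCC FREEPROCCACHE(<plan_handle from the query above>);
```

This only clears the symptom, of course; if the root cause is a SET-option mismatch (see the accepted answer), the bad plan will eventually come back.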

Has anyone else encountered anything like this before?

GrandMasterFlush
    AFAIK, ADO.NET does **not** cache execution plans. It's SQL Server that does. The EF also has a concept of plan caching, but it refers to query compilation (.NET expressions -> T-SQL) rather than DB operators. – Pavel Gatilov Apr 18 '12 at 16:39
  • Can you confirm that repeated (in a loop) run of the procedure through the EF always takes a lot more time compared to a repeated run of the procedure through a T-SQL batch in SSMS? Do you observe this discrepancy in the SQL Profiler? – Pavel Gatilov Apr 18 '12 at 16:41

3 Answers

Take a look at this thread on SQL Server forum. It's a bit similar and might give some clues. In short, you may have different SQL Server execution environment options in SSMS and ADO.NET leading to different execution plans. Clearing the SQL Server plan cache should help.
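To check whether the two environments really do differ, you can compare the SET options in play. A sketch of how to inspect them (column names are from `sys.dm_exec_sessions`; run the second query while the application is connected):

```
-- In SSMS, list the SET options for the current session:
DBCC USEROPTIONS;

-- Compare against the options on the application's connections
-- (filter on program_name or login to find your app's sessions):
SELECT session_id, program_name, arithabort, ansi_nulls, quoted_identifier
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;
```

SSMS typically runs with `ARITHABORT ON` while ADO.NET connections default to `OFF`, which is enough to make SQL Server compile and cache separate execution plans for the same procedure.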

Pavel Gatilov
  • This is the solution. Besides arithabort, there are other options as well like ansi_nulls. I recommend you check all of them. – usr Apr 18 '12 at 17:32
  • @Pavel - Thanks for your answer, this does seem to be the case. I've added a fuller explanation of my findings below, hopefully this will help someone else. – GrandMasterFlush Apr 20 '12 at 08:48

Pavel Gatilov seems to have hit the nail on the head with the ARITHABORT setting, but I thought I'd post a bit more about my findings.

This post on SO covers a similar problem and discusses a workaround: it's possible to write a wrapper class between the EF connection and System.Data.SqlClient that prefixes any calls to the DB with "SET ARITHABORT ON". This article on MSDN explains in more detail.
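For anyone who doesn't need the full wrapper-class approach, the core of the workaround can be sketched in plain ADO.NET: issue `SET ARITHABORT ON` on the connection before executing the procedure, so the app gets the same execution environment as SSMS. The connection string and procedure name (`dbo.GetReportData`) below are placeholders, not the actual code from this question:

```
using System.Data;
using System.Data.SqlClient;

public static class SprocRunner
{
    public static DataTable RunReport(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Align this connection's SET options with SSMS so the same
            // cached plan (or an equivalent one) is used:
            using (var set = new SqlCommand("SET ARITHABORT ON;", conn))
                set.ExecuteNonQuery();

            using (var cmd = new SqlCommand("dbo.GetReportData", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;
                var table = new DataTable();
                new SqlDataAdapter(cmd).Fill(table);
                return table;
            }
        }
    }
}
```

Note the SET statement must run on the same connection as the procedure call; with connection pooling it would need to be reissued per pooled connection, which is exactly why the wrapper-class approach exists.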

Having looked at the complexity of the changes and considering that we're going to move the application from using stored procedures to fully using EF we're going to bite the bullet and move the SP functionality into our EF data model instead.

GrandMasterFlush
    Beware that the EF is not good at generating optimal queries, so you might have to keep some of the SPs for performance reasons. – Pavel Gatilov Apr 21 '12 at 03:54

Calling this in a single transaction:

INSERT INTO foo (col1, col2) SELECT col1, col2 (with all 100 rows of the changes)

is not the same as calling this 100 times:

EXEC SP_foo_INSERT param1, param2

Take a peek at the query generated in both cases, test each directly in the database, and compare the execution plans.
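To make the contrast concrete, here is a hedged sketch of the two shapes being compared (table, staging table, and procedure names are all hypothetical):

```
-- Set-based: one statement, one plan, one round trip for all 100 rows
INSERT INTO foo (col1, col2)
SELECT col1, col2
FROM #staging;

-- Row-by-row: 100 separate round trips, each call planned,
-- executed, and logged individually
-- EXEC SP_foo_INSERT @param1 = ..., @param2 = ...;   -- repeated per row
```

The set-based form usually wins by a wide margin, which is worth keeping in mind regardless of whether EF or raw ADO.NET issues the calls.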

percebus