We have a database where roughly 50-60% of batch requests result in a compile. That figure comes from comparing [SQL Compilations/sec] with [Batch Requests/sec].
We think that value is a bit high.
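For reference, this is a minimal sketch of how that ratio can be measured straight from sys.dm_os_performance_counters. The counter values are cumulative since the last restart, so they are sampled twice over a short window; the variable names and the 10-second delay are arbitrary choices of mine.

-- Sample the cumulative counters twice and compute compilations per batch request.
DECLARE @comp1 BIGINT, @batch1 BIGINT, @comp2 BIGINT, @batch2 BIGINT;

SELECT @comp1  = cntr_value FROM sys.dm_os_performance_counters WHERE counter_name = 'SQL Compilations/sec';
SELECT @batch1 = cntr_value FROM sys.dm_os_performance_counters WHERE counter_name = 'Batch Requests/sec';

WAITFOR DELAY '00:00:10';

SELECT @comp2  = cntr_value FROM sys.dm_os_performance_counters WHERE counter_name = 'SQL Compilations/sec';
SELECT @batch2 = cntr_value FROM sys.dm_os_performance_counters WHERE counter_name = 'Batch Requests/sec';

SELECT compilations     = @comp2 - @comp1,
       batch_requests   = @batch2 - @batch1,
       pct_compilations = 100.0 * (@comp2 - @comp1) / NULLIF(@batch2 - @batch1, 0);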
If we look at this query:
SELECT TOP 150
qs.plan_generation_num,
qs.execution_count,
qs.statement_start_offset,
qs.statement_end_offset,
st.text
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
WHERE qs.plan_generation_num > 1
ORDER BY plan_generation_num DESC
We don't see high plan_generation_num values compared to the execution counts. What we do have is a lot of single-use plans, and I am trying to figure out why.
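For context, a rough breakdown of the plan cache by object type can be pulled like this (a sketch over sys.dm_exec_cached_plans; the column aliases are mine):

-- Count plans per object type and how many of them were only ever used once.
SELECT objtype,
       total_plans      = COUNT(*),
       single_use_plans = SUM(CASE WHEN usecounts = 1 THEN 1 ELSE 0 END),
       size_mb          = SUM(CAST(size_in_bytes AS BIGINT)) / 1048576
FROM sys.dm_exec_cached_plans
GROUP BY objtype
ORDER BY single_use_plans DESC;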
Our application is built in ASP.NET and we always use parameterized queries. We use both stored procedures and plain SQL statements in the application, but always parameterized.
The website that runs against this database is fairly large: about 500,000 page views per day and about 10,000 requests per minute, if that information helps.
We have no long-running queries, and indexes and statistics are in order. This is one of the last things left to optimize.
CPU averages around 15%, and RAM is about 100 GB, which is of course used up by SQL Server. We run SQL Server 2014 Enterprise.
One thing I started wondering about: if I have a SQL statement like this
SELECT doors, windows, seats FROM cars WHERE Wheels = @Wheels AND Active = 1
will this plan not be reused because we don't set a parameter on this part: **AND Active = 1**?
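One check I was thinking of is to look that statement up in the plan cache and see how many entries it has and how often each one is reused. A sketch, assuming the cars/Wheels example above is representative (the LIKE pattern is only illustrative):

SELECT cp.usecounts, cp.objtype, cp.cacheobjtype, st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE st.text LIKE '%cars WHERE Wheels = @Wheels%'
  AND st.text NOT LIKE '%dm_exec_cached_plans%';   -- exclude this probe itself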
Any ideas on how to figure out why we have so many single-use plans? The count of cached plans is about 20,000. In comparison, we have about 700 stored procedures and many more queries in the application.
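If it helps the diagnosis, the text of the single-use plans can be pulled for inspection with something like this sketch (TOP 150 and ordering by size are arbitrary choices on my part):

SELECT TOP 150
       cp.objtype,
       cp.cacheobjtype,
       cp.size_in_bytes,
       st.text
FROM sys.dm_exec_cached_plans AS cp
CROSS APPLY sys.dm_exec_sql_text(cp.plan_handle) AS st
WHERE cp.usecounts = 1
ORDER BY cp.size_in_bytes DESC;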