
I am trying to write a PowerShell script that finds the top 2 running queries on an Azure SQL database, captures their query plans, clears the plan cache with DBCC FREEPROCCACHE, captures the plans again after the cache clear, and emails the plans using SendGrid.

Issue 1: the plan handles returned by the query in $delta2 are incorrect and don't work when passed to sys.dm_exec_query_plan().

Issue 2: even if I take the correct handles from SQL Server Management Studio and hard-code them into $delta3, sys.dm_exec_query_plan() returns nothing in PowerShell.

Issue 3: Export-Clixml corrupts the XML file, and the result won't open in SQL Server Management Studio.

Any help is much appreciated. Thanks in advance.

$params = @{
    'Database' = 'xx'
    'ServerInstance' =  'xx'
    'Username' = 'xx'
    'Password' = 'xxx'
    'OutputSqlErrors' = $true
}
$sqlcmd = @"
SELECT TOP 2
    GETDATE() AS runtime, *
FROM (
    SELECT CONVERT(varchar(50), query_stats.query_hash, 1) AS query_hash,
           SUM(query_stats.cpu_time) AS Total_Request_Cpu_Time_Ms,
           SUM(logical_reads) AS Total_Request_Logical_Reads,
           MIN(start_time) AS Earliest_Request_Start_Time,
           COUNT(*) AS Number_Of_Requests,
           SUBSTRING(REPLACE(REPLACE(MIN(query_stats.statement_text), CHAR(10), ' '), CHAR(13), ' '), 1, 256) AS Statement_Text
    FROM (
        SELECT req.*,
               SUBSTRING(ST.text, (req.statement_start_offset / 2) + 1,
                   ((CASE req.statement_end_offset
                         WHEN -1 THEN DATALENGTH(ST.text)
                         ELSE req.statement_end_offset
                     END - req.statement_start_offset) / 2) + 1) AS statement_text
        FROM sys.dm_exec_requests AS req
        CROSS APPLY sys.dm_exec_sql_text(req.sql_handle) AS ST
    ) AS query_stats
    GROUP BY query_hash
) AS t
ORDER BY Total_Request_Cpu_Time_Ms DESC;
"@
$delta1 = Invoke-Sqlcmd -query $sqlcmd @params -MaxCharLength 999999 
$qhash = $delta1.query_hash
$delta2 =  "select convert(VARCHAR(100), plan_handle, 1) as plan_handle  from sys.dm_exec_query_stats where query_hash = $qhash"
$getplanhandle = Invoke-Sqlcmd -query $delta2  @params -MaxCharLength 999999 
$getplanhandle
foreach ($plan in $getplanhandle)
{
    $handle = $plan.plan_handle
    $delta3 = "select * from sys.dm_exec_query_plan($plan.plan_handle)"
    $saveplan = Invoke-Sqlcmd -Query $delta3 @params -MaxCharLength 999999
    $saveplan.query_plan | Export-Clixml -Path ./query_plan.sqlplan
}
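For reference, all three issues look like quoting and serialization problems rather than DMV problems. A sketch of a corrected loop, assuming the same $params splat as above and that the hash and handle round-trip through CONVERT with style 1 (hex-literal style):

```powershell
# Sketch only: process each row returned by the TOP 2 query.
foreach ($row in $delta1) {
    # Issue 1: query_hash came back as a varchar hex literal, so quote it
    # and convert it back to binary(8) for the comparison.
    $qhash = $row.query_hash
    $delta2 = "SELECT CONVERT(varchar(100), plan_handle, 1) AS plan_handle
               FROM sys.dm_exec_query_stats
               WHERE query_hash = CONVERT(binary(8), '$qhash', 1);"
    $getplanhandle = Invoke-Sqlcmd -Query $delta2 @params -MaxCharLength 999999

    foreach ($plan in $getplanhandle) {
        # Issue 2: property access like $plan.plan_handle does not expand inside
        # a double-quoted string; assign it to a variable (or use $(...)), and
        # convert the varchar handle back to varbinary(64) for the DMF.
        $handle = $plan.plan_handle
        $delta3 = "SELECT query_plan
                   FROM sys.dm_exec_query_plan(CONVERT(varbinary(64), '$handle', 1));"
        $saveplan = Invoke-Sqlcmd -Query $delta3 @params -MaxCharLength 999999

        # Issue 3: Export-Clixml wraps the plan in PowerShell's CLIXML
        # serialization envelope, which SSMS cannot open. Write the raw XML.
        $saveplan.query_plan | Set-Content -Path "./query_plan_$qhash.sqlplan"
    }
}
```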
danblack

1 Answer


You very likely don't need to do most/any of this. It's not obvious from your post as to "why" you are trying to do this, but I'll assume you just want to record the query plan for later performance analysis since that's usually why one would want to look at the plan.

The good news is that Azure SQL DB already has a feature that automatically captures most query plans for you: Query Store. It stores the plans in the database, and you can look at the full history of which query plans have been used for a given query over time and how they have performed. You can read more about it here: Query Store overview. Please note that by default Query Store may not capture all queries (capturing everything can overwhelm a database for some kinds of workloads). If you are doing this on a test system, there is a capture mode ALL which you can use to record all plans.
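For example, switching the capture mode on a test database can be done with something like the following (a sketch; run against the target database, where Query Store is typically already enabled on Azure SQL DB):

```sql
-- Ensure Query Store is on, then record every plan (test systems only).
ALTER DATABASE CURRENT SET QUERY_STORE = ON;
ALTER DATABASE CURRENT SET QUERY_STORE (QUERY_CAPTURE_MODE = ALL);
```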

If you happen to be trying to determine if you have parameter sensitivity issues for a given query (where the runtime variance can be high if the selectivity of a predicate with a parameter changes significantly and thus causes a given query plan to process few/many rows on different parameter values), you can also see this property in the query store - it will record sum-of-squares variance for you for each query plan it tracks. SQL Server Management Studio has a nice UI to help you navigate the query store data. You can read more about that here.
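The variance data is also queryable directly from the Query Store catalog views. A sketch, assuming the standard sys.query_store_* views (adjust columns and filters to taste):

```sql
-- Plans with the highest CPU-time variance across executions.
SELECT TOP 10
       q.query_id,
       p.plan_id,
       rs.count_executions,
       rs.avg_cpu_time,
       rs.stdev_cpu_time
FROM sys.query_store_query AS q
JOIN sys.query_store_plan AS p
    ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs
    ON rs.plan_id = p.plan_id
ORDER BY rs.stdev_cpu_time DESC;
```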

Conor Cunningham MSFT
  • Hi, thanks for the information. We use the secondary databases for reads and the primary only for writes, and MSFT said Query Store doesn't work on the secondaries. So I thought maybe I'd spin up an Azure Automation job to do this and capture the plans. "Why?" We have some queries causing high DTU, and it's happening more often nowadays. Until the devs optimize those queries, I thought we could run a script that captures the plans for analysis whenever DTU stays above 80% for 10 minutes, clears the plan cache, and emails the plans out. – hashik katikireddy Jan 24 '21 at 19:48
  • That makes sense. Hopefully we can give you more options in the future for this scenario. – Conor Cunningham MSFT Jan 25 '21 at 14:59