
I'm trying to optimize a report that uses multiple stored procedures on the same table. Unfortunately, each procedure is reading millions of records and aggregating the results. It's a very intense read for a report, but each stored procedure is optimized to run pretty fast within SSMS.

I can run each stored procedure and get a result set within 10 to 20 seconds. When I put them all into one report within SSRS, the report times out.

There are a total of 4 parameters per stored procedure. All of the procedures target the same table, just aggregating the data in different ways. Indexes on the table are in line with the queries, which are based on time, user, and the one dimension I'm using to COUNT() both DISTINCT and non-DISTINCT.

I'm thinking the issue is that SSRS is running the 4 procedures at the same time on the same table, as opposed to one after the other. Is this true? If so, is there any way to ensure SSRS does not run them in parallel?

My only option is to create a summary table that is already pre-aggregated, then just run the report off that table. Otherwise, I guess parameter sniffing is possible too.
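For illustration, here is roughly what I have in mind for the summary table. This is only a sketch: the table and column names (dbo.ActivityDetail, dbo.ActivitySummary, EventTime, UserId, DimensionValue) are simplified placeholders rather than my real schema, and the OPTION (RECOMPILE) hint at the end is just one standard way to take parameter sniffing out of the equation while testing.

```sql
-- Hypothetical pre-aggregated summary table (placeholder names).
CREATE TABLE dbo.ActivitySummary
(
    ActivityDate  date NOT NULL,
    UserId        int  NOT NULL,
    TotalCount    int  NOT NULL,
    DistinctCount int  NOT NULL,
    CONSTRAINT PK_ActivitySummary PRIMARY KEY (ActivityDate, UserId)
);

-- Populate it on a schedule (e.g. a nightly job) so the report only
-- reads a handful of aggregated rows instead of millions of detail rows.
INSERT INTO dbo.ActivitySummary (ActivityDate, UserId, TotalCount, DistinctCount)
SELECT  CAST(EventTime AS date),
        UserId,
        COUNT(DimensionValue),
        COUNT(DISTINCT DimensionValue)
FROM    dbo.ActivityDetail
GROUP BY CAST(EventTime AS date), UserId;

-- To rule out parameter sniffing inside the existing procedures, a
-- per-execution recompile of the aggregating query is one option
-- (it trades some compile time for a plan built for the actual values).
DECLARE @StartDate date = '20151201', @EndDate date = '20151218';

SELECT  UserId,
        COUNT(DimensionValue)          AS TotalCount,
        COUNT(DISTINCT DimensionValue) AS DistinctCount
FROM    dbo.ActivityDetail
WHERE   EventTime >= @StartDate
  AND   EventTime <  @EndDate
GROUP BY UserId
OPTION (RECOMPILE);
```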

Fastidious
  • Can you mimic the activity in a non-prod environment and run a profile trace? That would tell you what is running concurrently. Another option is to have it refresh on a regular schedule so the report is ready to go. – Steve Mangiameli Dec 17 '15 at 20:42
  • I have seen situations where the Rendering step is the piece of the pipeline that takes so very long and is responsible for the timeout. I'm not saying that's definitely occurring in your case, I'm just saying it is a possible explanation for fast-execution in SSMS and very-slow in SSRS. – David Tansey Dec 17 '15 at 20:44
  • How about increasing the timeout? http://geekswithblogs.net/ssrs/archive/2009/10/30/steps-to-resolve-ssrs-timeout-issues.aspx – niketshah90 Dec 17 '15 at 20:46
  • If I'm understanding correctly, you have 4 datasets on one data source. Select your data source and check the "Use Single Transaction" option. This will force Reporting Services to run each dataset individually rather than in parallel. – Mike Zalansky Dec 17 '15 at 20:59
  • Yeah, the rendering step is not the issue here. It's anywhere between 4 and 20 rows of data. All the heavy lifting is in SQL Server. It's really fast in SSMS; it's just slow in SSRS. Maybe there's parameter sniffing going on? I'll try running it through a trace. – Fastidious Dec 17 '15 at 22:16
  • The execution plan shows all seeks on all the indexes. No scans; everything looks good in terms of performance. SSRS is just being difficult. I hate this thing. – Fastidious Dec 17 '15 at 22:17

1 Answer


By default, datasets in SSRS are executed in parallel.

If all of your datasets refer to the same data source, then you can configure serialized execution of the datasets on a single connection this way:

  • open the data source dialog in report designer
  • ensure that the Use Single Transaction checkbox is checked

Once that checkbox is selected, datasets that use the same data source are no longer executed in parallel.
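If you want to confirm what is actually happening, one quick check (just a sketch; on a busy server you may want to filter further, e.g. by login or database) is to watch the active requests on the SQL Server instance while the report runs. With parallel execution you should see several of your procedures active at once; with the single-transaction setting you should see them one at a time.

```sql
-- Run this while the report is executing to see which queries are
-- active at the same time on the instance.
SELECT  r.session_id,
        r.status,
        r.start_time,
        r.wait_type,
        t.text AS running_sql
FROM    sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE   r.session_id <> @@SPID;  -- exclude this monitoring query itself
```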

I hope that solves your problem.

David Tansey
  • Thanks. I did not see that. Unfortunately, this made it slower. I just ran all 4 procedures in SSMS for one day for one of the params and it returned in 7 seconds. It should only be returning 4 rows to SSRS. When I use this change, it times out. I have no idea why... SSMS is 7 seconds. But this is the right answer regardless of my performance issue, which is likely not because of this option. – Fastidious Dec 17 '15 at 22:09
  • The fact that only 4 rows are returned negates the suggestion I made in an earlier comment regarding rendering-time. I'd be grateful for the Accept but if not I understand. – David Tansey Dec 17 '15 at 22:13