
I have ETL and cube solutions, which I process one after another in a SQL Agent job.

In the ETL, I run one parent package that in turn runs all the other packages one by one.

The whole process takes 10 hours.

For the ETL: How can I find out how long each package within that one parent package takes to run, other than opening the solution and recording times manually?

For the cube: The dimensions process fast. What should I measure here to find out which part takes so long? Maybe the measures? How can I track the processing time of a particular measure?

Maybe SQL Profiler will help? If so, is there a good article describing which metrics I should pay attention to?

rakamakafo
  • You can find the duration of the SQL Agent Jobs by going to the job itself in SSMS>Right Click>View History and scroll to Duration column. Not sure about the cube though. – Jacob H Jul 21 '17 at 18:52
  • @JacobH, I don't want whole duration. I know how to see processing time of whole process. I want to learn processing time of each "batch" within this whole process – rakamakafo Jul 21 '17 at 19:03
  • You can break it down by the step level too. Or are they not individual steps either? – Jacob H Jul 21 '17 at 19:36
  • @JacobH, there are dozens of processes. If I start doing that, that will take whole day or more – rakamakafo Jul 21 '17 at 19:40
  • Query it in msdb.dbo.sysjobhistory then. – Jacob H Jul 21 '17 at 19:54
  • @JacobH, as stated in the post description, and as I explained to you in the comments, I DON'T want to learn the execution time of the whole ETL. Inside that one package that I run from the job, there are many other packages that are run within a container; I want to learn the execution time of each of those packages. Please stop recommending the same thing again and again. – rakamakafo Jul 21 '17 at 20:01
  • The information you are requesting can be found through google.com I am not sure why you are on SO as you don't appear to have done any research. Good luck. – Jacob H Jul 21 '17 at 20:31
  • @JacobH, any information can be found on google.com, but when you don't know for what to look google.com will not work. Look at answer below, logging, I didn't know at all that there is such thing, and probably that will help. – rakamakafo Jul 21 '17 at 21:10
  • It will only help if you are deploying and running your integrations from the SSISDB. Which you would have to configure (set up the catalog, deploy from VS, etc.). It's not for running jobs from the agent. – Jacob H Jul 21 '17 at 23:26

1 Answer


To gather statistics about SSIS execution times, you can enable logging:

  • For the package deployment model, you'll have to turn on logging in each package: go to SSIS > Logging. In the dialog, choose the OnPreExecute and OnPostExecute events. Use a SQL Server logging provider, which will log to a system table called dbo.sysssislog. You'll need to join the pre- and post-execute events on the execution id.

  • For the project deployment model, it's probably already on. This can be configured in SSMS: under Integration Services Catalogs > SSISDB, right-click and choose Properties. Once you've executed the package, you can see the results in the standard reports: right-click the master package and choose Reports > Standard Reports > All Executions.
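For the package deployment model, a query along these lines pairs up the pre- and post-execute events in dbo.sysssislog to get per-task durations (a sketch; it assumes the SQL Server log provider was configured as described above, and that both OnPreExecute and OnPostExecute were selected):

```sql
-- Duration of each task/package, derived from paired log events.
-- Assumes logging was written to dbo.sysssislog by the SQL Server log provider.
SELECT pre.source                                    AS task_name,
       pre.starttime,
       post.endtime,
       DATEDIFF(SECOND, pre.starttime, post.endtime) AS duration_sec
FROM dbo.sysssislog AS pre
JOIN dbo.sysssislog AS post
  ON post.executionid = pre.executionid   -- same package execution
 AND post.sourceid    = pre.sourceid      -- same task
 AND post.event       = 'OnPostExecute'
WHERE pre.event = 'OnPreExecute'
ORDER BY duration_sec DESC;
```

The slowest child packages and tasks will sort to the top.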

Lots more details on SSIS logging here: https://learn.microsoft.com/en-us/sql/integration-services/performance/integration-services-ssis-logging
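For the project deployment model, the same breakdown is available by querying the SSISDB catalog views directly instead of using the built-in reports. A sketch, assuming the default retention settings (execution_duration is in milliseconds):

```sql
-- Per-executable durations for the most recent SSISDB execution.
SELECT e.package_name,
       ex.executable_name,
       es.execution_path,
       es.execution_duration / 1000.0 AS duration_sec
FROM catalog.executions AS e
JOIN catalog.executable_statistics AS es
  ON es.execution_id = e.execution_id
JOIN catalog.executables AS ex
  ON ex.executable_id = es.executable_id
 AND ex.execution_id  = es.execution_id
WHERE e.execution_id = (SELECT MAX(execution_id) FROM catalog.executions)
ORDER BY es.execution_duration DESC;
```

Because child packages executed via the Execute Package task appear in the same execution, this shows every package and task in the parent run in one result set.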

For SSAS, I always tested this manually: connect in SSMS, right-click each measure group, and do a Process Full (this assumes the dimensions have just been freshly processed). The measure groups are more likely to be the cause of an issue because of the amount of data.
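Rather than clicking through each measure group, you can script that same Process Full as XMLA (in the process dialog, use Script > Script Action to New Query Window) and run each one from an XMLA query window, noting the elapsed time SSMS reports. A sketch with placeholder IDs (MyOlapDb, MyCube, MyMeasureGroup are hypothetical; substitute your own object IDs):

```xml
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>MyOlapDb</DatabaseID>
      <CubeID>MyCube</CubeID>
      <MeasureGroupID>MyMeasureGroup</MeasureGroupID>
    </Object>
    <Type>ProcessFull</Type>
  </Process>
</Batch>
```

Running one such batch per measure group gives you a timing for each, so you can rank them by cost.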

Once you know which measure group is slow, you can look at tuning the source query, if it has any complexity to it, or at partitioning the measure group and loading it incrementally. Full processing could then be scheduled periodically.


Mark Wojciechowicz