1

I have a daily scheduled Cognos report that was created by someone else a few months ago. The developer who built it left the company over a month ago. From what I'm told, the report ran fine until about a week before the developer left, when it began to take longer and longer to run. It now takes 11-12 hours. There is no documentation for the report that would indicate any changes the developer might have made to cause this, and as a very recent new hire I am, so far, unaware of any database changes that might have caused the issue. I was hoping that someone more experienced could point me in the right direction.

Do I need to investigate changes to the database? Could this be a cache issue? Maybe a permissions problem, or just an issue with the schedule itself?

Let me know what additional info would be helpful.

woods
  • Can you describe what you have tried so far? What layer/area could be causing the issue (network, model, sql, etc)? What database? What optimization has been done (access paths, EVIs, binary radix, etc)? Can we see the SQL statement? – VAI Jason Sep 22 '21 at 20:03
  • I haven't tried anything noteworthy yet. I'm very unfamiliar with what I'm doing. I haven't seen the database yet or even been introduced to the package creation process. I used Framework Manager for that years ago, but it all seems very different now. So I've mostly been trying to do research. I'm not expecting to get a flat-out solution to my issue right now; I don't think I have enough info to provide for that. I was just hoping to get some advice from an expert that could at least help guide my research in the right direction. Sorry I don't have the info you requested. – woods Sep 22 '21 at 20:31
  • Np, I gave some suggestions below. Give it a try and see if it helps. Let me know and we will figure it out – VAI Jason Sep 22 '21 at 21:03

2 Answers

0
  1. Check whether the slow performance is tied to a specific data item

Start with a simplified version of the report.

For example, if you have a report with 10 columns, reduce it down to 3 columns and compare the performance.

Then add back one data item/column at a time and re-test. If you notice a difference linked to a specific data item, examine the changes in the generated SQL statement. Look for outer joins or complex expressions with functions, which can hurt performance; a sketch of the kind of expression to watch for is below.
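
A minimal sketch of that pattern, assuming made-up table and column names (ORDERS, ORDER_DATE, ORDER_TOTAL); the exact date-literal syntax varies by database:

    -- Hypothetical table/columns, purely for illustration.
    -- Slower: the function has to be evaluated for every row, which usually
    -- prevents the optimizer from using an index on ORDER_DATE.
    SELECT ORDER_ID, ORDER_TOTAL
    FROM ORDERS
    WHERE YEAR(ORDER_DATE) = 2021;

    -- Usually faster: a plain range predicate on the raw column is index-friendly.
    SELECT ORDER_ID, ORDER_TOTAL
    FROM ORDERS
    WHERE ORDER_DATE >= DATE '2021-01-01'
      AND ORDER_DATE <  DATE '2022-01-01';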

  2. Look for a logic trap

You may be hitting a logic trap such as a chasm trap, where the result set is a cartesian product because the request crosses a many-to-many relationship; a hypothetical example is sketched below.
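
A hedged sketch with made-up tables (CUSTOMERS, ORDERS, PAYMENTS): joining two independent child tables to the same parent in one statement multiplies the rows, while aggregating each child first avoids the blow-up:

    -- Hypothetical tables: one customer can have many orders and many payments.
    -- Problematic: the row count explodes (orders x payments per customer)
    -- and the SUMs are inflated - a classic chasm trap.
    SELECT c.CUSTOMER_ID,
           SUM(o.ORDER_TOTAL) AS ORDER_TOTAL,
           SUM(p.PAYMENT_AMT) AS PAYMENT_TOTAL
    FROM CUSTOMERS c
    JOIN ORDERS    o ON o.CUSTOMER_ID = c.CUSTOMER_ID
    JOIN PAYMENTS  p ON p.CUSTOMER_ID = c.CUSTOMER_ID
    GROUP BY c.CUSTOMER_ID;

    -- Safer: aggregate each child to one row per customer first, then join.
    SELECT c.CUSTOMER_ID, o.ORDER_TOTAL, p.PAYMENT_TOTAL
    FROM CUSTOMERS c
    LEFT JOIN (SELECT CUSTOMER_ID, SUM(ORDER_TOTAL) AS ORDER_TOTAL
               FROM ORDERS GROUP BY CUSTOMER_ID) o
           ON o.CUSTOMER_ID = c.CUSTOMER_ID
    LEFT JOIN (SELECT CUSTOMER_ID, SUM(PAYMENT_AMT) AS PAYMENT_TOTAL
               FROM PAYMENTS GROUP BY CUSTOMER_ID) p
           ON p.CUSTOMER_ID = c.CUSTOMER_ID;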

  3. Check the data source system

The time of day the process runs may now conflict with a competing process that is taking priority or consuming most of the resources. To help test this, try running the analysis during a non-peak time, and check what else is active on the source system while the report runs (see the example below).
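
If the source happens to be Db2 for i (an assumption - the comment above mentions EVIs and binary radix indexes, which are IBM i terms), the QSYS2.ACTIVE_JOB_INFO table function lists the active work; other databases have their own monitoring views (e.g. sys.dm_exec_requests on SQL Server, V$SESSION on Oracle):

    -- Assumption: Db2 for i, IBM i 7.2 or later. Run this while the report is
    -- executing (or at its scheduled time) to see which other jobs are
    -- competing for CPU, memory, or locks on the source system.
    SELECT *
    FROM TABLE(QSYS2.ACTIVE_JOB_INFO()) AS X;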

  4. Cursor isolation / lock escalation

Check your settings to see what your cursor isolation level is. Be careful: don't change this value casually, as it will impact access to the data source. If you have a test system, try uncommitted read (aka dirty read) to avoid locking issues. If another user is maintaining or updating data/operational tables, this could impact the SQL response time. A sketch is below.
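
A minimal sketch, again with made-up names: Db2 accepts an isolation clause on the statement itself, and SQL Server has a session-level equivalent. In Cognos, the isolation level can also be set on the data source connection rather than in the report SQL.

    -- Hypothetical query. WITH UR (Db2) requests uncommitted read for this
    -- statement only: no read locks are taken, but uncommitted rows may be seen.
    SELECT ORDER_ID, ORDER_TOTAL
    FROM ORDERS
    WHERE ORDER_DATE >= DATE '2021-01-01'
    WITH UR;

    -- SQL Server equivalent for the whole session:
    -- SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
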
VAI Jason
  • Thank you for the response. I will look into the suggested actions and let you know what I find. I can say that I did try looking for specific items that were slowing performance. I checked the tabular data for each of the 18 queries, just to see how long they run for. Of the 18 queries, 6 finished in a few seconds; the other 12 took too long to finish. So I created a blank report and started recreating one of those queries. Even with just one data item, the query took too long to finish, which leads me to believe the issue is likely outside the report. – woods Sep 23 '21 at 14:21
  • If you can share the SQL, or even the pseudo-code of the statement, I might be able to spot an issue or offer an alternative to the SQL for you to try – VAI Jason Sep 23 '21 at 19:28
0

Which type of data source has been used for this report - multidimensional or relational? Which query mode has been used - classic or dynamic? Is this report sending information out to different users via email? Maybe bursting or a master-detail relationship has been used, which is often a performance killer.