Author Topic: Report Performance Improvement  (Read 213 times)

Offline pakhi

  • Full Member
  • ***
  • Join Date: Apr 2017
  • Posts: 7
  • Forum Citizenship: +0/-0
Report Performance Improvement
« on: 12 Oct 2017 03:48:09 am »
Hi Team - I have a list report with 56 columns, and it takes around 13 minutes to extract 64,000 records to Excel, while the underlying report query completes in just 1 minute. I tried some options to improve the report's performance in Excel, but they did not work.

Please suggest something which I can implement to improve the performance.

Cognos Version: 10.2.2
Database: SAP HANA

Package is published in DQM mode.

Offline dougp

  • Community Leader
  • *****
  • Join Date: Jul 2014
  • Posts: 176
  • Forum Citizenship: +14/-1
Re: Report Performance Improvement
« Reply #1 on: 12 Oct 2017 12:02:37 pm »
I have some users who routinely run a report that produces 185,000 rows of 208 columns and output it to Excel or CSV format. The query runs on the database for about 30-45 seconds. The report will often fail at 30 minutes because I have set a maximum execution time. During that time, the report will consume up to 30 GB of drive space in the Temp folder. I think the problem is in the way Cognos compiles the output for those formats.

I have tried, to no avail, to encourage my users to use a different method to produce this type of output. I have explained that Cognos is not the only reporting product and is certainly not the best one for this use case. They fear learning to use additional tools (like Excel).

Good luck!

Offline BigChris

  • Statesman
  • ******
  • Join Date: Apr 2013
  • Posts: 1,056
  • Forum Citizenship: +81/-0
Re: Report Performance Improvement
« Reply #2 on: 13 Oct 2017 01:48:10 am »
I'm curious...what are they doing with 185,000 rows of data? I'm guessing they're filtering it to find the subset of the data that they're actually interested in.

Offline hespora

  • Statesman
  • ******
  • Join Date: Nov 2015
  • Posts: 291
  • Forum Citizenship: +19/-0
Re: Report Performance Improvement
« Reply #3 on: 13 Oct 2017 02:29:48 am »
185k rows is not even that large to work with... I routinely use similarly sized datasets (albeit with far fewer columns); pricing simulations are a common application. As in: a customer or customer group enters pricing negotiations, and we need to analyze how whatever my sales guys want to do in terms of new pricing, discounts, or rebate schemes will affect the existing business. For that to be accurate, I run past sales for said customer or group at the lowest possible granularity.

Offline Invisi

  • Community Leader
  • *****
  • Join Date: Sep 2016
  • Posts: 198
  • Forum Citizenship: +4/-2
    • Invisi - Vision on Information
Re: Report Performance Improvement
« Reply #4 on: Today at 06:52:00 am »
The data set in itself is rather small, but dumping this many rows with Cognos is a lot. This sounds like an intermediate result. For me, either you produce the end result in Cognos, or you don't bother with Cognos and point your analysis tool directly at your data mart, data lake, or whatever data structure you have connected to Cognos.
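To illustrate the idea of bypassing the report engine entirely: the sketch below streams a query result straight to CSV with plain Python. This is only a minimal illustration, not anything Cognos-specific; sqlite3 stands in for the real warehouse here so the example is self-contained, and against SAP HANA you would connect with SAP's hdbcli driver (or ODBC) instead. All table and column names are made up for the demo.

```python
# Sketch: skip the BI rendering layer and write query results directly to CSV.
# sqlite3 is a stand-in for the real database; swap the connection for your
# warehouse driver (e.g. hdbcli for SAP HANA) in practice.
import csv
import sqlite3

def export_query_to_csv(conn, sql, path):
    cur = conn.execute(sql)
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        # Header row from the cursor metadata
        writer.writerow([col[0] for col in cur.description])
        # Stream rows as they arrive; no report formatting step in between
        writer.writerows(cur)

# Demo with a small in-memory table (hypothetical schema)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("ACME", 100.0), ("Globex", 250.5)])
export_query_to_csv(conn, "SELECT * FROM sales", "sales.csv")
```

Since the rows go from the cursor to disk with no intermediate rendering, the export time tracks the query time rather than the report engine's output compilation.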
Few can be done on Cognos | RTFM for those who ask basic questions...

Offline hespora

  • Statesman
  • ******
  • Join Date: Nov 2015
  • Posts: 291
  • Forum Citizenship: +19/-0
Re: Report Performance Improvement
« Reply #5 on: Today at 09:40:16 am »
Oh hell yes, but try to even find someone who will listen to that idea in a large concern where you, the guy needing the data, are so far removed from the guys managing the system, both geographically and organizationally, that you've never even met. Reality, unfortunately, is that it's just not worth the hassle.
