Author Topic: Transformer Cube with 13GB of data  (Read 66 times)

Offline msudheen

  • Full Member
  • ***
  • Join Date: Dec 2008
  • Posts: 7
  • Forum Citizenship: +0/-0
Transformer Cube with 13GB of data
« on: 09 May 2018 10:02:05 am »
Hi,
We are in the middle of a database migration, and one of the challenges is handling the huge data volume for the cube.
Our cube is sourced from a few Report Studio reports, and most of the queries run and return more than 1 million records. They all come back quickly; I can see that in the Transformer log and the database monitoring tool.
However, there are 2 fact tables which fetch around 17 million records.

In our current production environment we have Oracle and a CQM package. The fact queries run and return data in around 30 minutes. The only setting I have is Fetch number of rows: 1000 in the cogdmor.ini file under configuration.
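For reference, the CQM-side setting mentioned above is a plain key in cogdmor.ini. A minimal fragment is shown below; the section header is illustrative only (exact placement may differ in your installation), but the key and value match what is described above:

```ini
; cogdmor.ini fragment (Oracle / CQM native gateway)
; Section name below is an assumption -- check your own file for exact placement
[Databases]
Fetch Number of Rows=1000
```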

In the new system, I am trying to build the same cube on a Redshift database with a DQM package. For DQM, I followed the IBM technote below and set fetchbuffersize to 10485760 (10 × 1024 × 1024 bytes, i.e. 10 MB). The fact queries now take around 1 hour and eventually fail with an error message saying the data transmission timed out.

Any advice on how to handle this situation? Is there a Fetch number of rows setting for DQM? I am not sure whether I have the correct value for fetchbuffersize.

https://www.ibm.com/support/knowledgecenter/en/SSEP7J_11.0.0/com.ibm.swg.ba.cognos.ag_manage.doc/c_connection_properties.html
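As a rough back-of-the-envelope check, the 10 MB fetch buffer can be translated into rows per fetch and round trips for one of the 17-million-row fact queries. The average row width used here is purely an assumption for illustration, not a measured value:

```python
# Hypothetical sketch: estimate round trips for a buffered fetch.
# AVG_ROW_BYTES is an assumed average row width, not a measured value.
FETCH_BUFFER_BYTES = 10 * 1024 * 1024  # fetchbuffersize = 10485760 (10 MB)
AVG_ROW_BYTES = 500                    # assumption: ~500 bytes per row
TOTAL_ROWS = 17_000_000                # rows in the larger fact tables

rows_per_fetch = FETCH_BUFFER_BYTES // AVG_ROW_BYTES
round_trips = -(-TOTAL_ROWS // rows_per_fetch)  # ceiling division

print(rows_per_fetch)  # 20971 rows fit in one 10 MB buffer
print(round_trips)     # 811 fetches to stream all 17 M rows
```

If the real rows are much wider than the assumed 500 bytes, each buffer holds far fewer rows and the number of round trips grows accordingly, which may be a factor in the timeout.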

