Author Topic: How does the Power Cube file size affect PowerPlay performance?  (Read 1852 times)

Offline Ucat

  • Full Member
  • ***
  • Join Date: Feb 2009
  • Posts: 10
  • Forum Citizenship: +1/-0

I need some advice on this issue, based on your experience: how does the Power Cube file size affect PowerPlay performance?

I have a very large Power Cube model whose dimensions have many entries at each branch of every category level.
The PowerCube currently contains four years of monthly data, built from roughly 33,352,198 input rows and 509,018 categories.

The finished Power Cube is 4.15 GB, split across four files: an index file (128.14 MB) and three data files (1.59 GB, 1.25 GB, and 1.18 GB).
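As a quick sanity check on the figures above, the partition sizes should add up to the reported total cube size, and the row-to-category ratio gives a rough sense of cube density. This is just a back-of-the-envelope sketch; the numbers are the ones quoted in this post, nothing cube-specific is assumed.

```python
# Sanity-check the reported cube figures (values taken from the post above).
index_mb = 128.14            # index file, MB
data_gb = [1.59, 1.25, 1.18]  # three data files, GB

total_gb = index_mb / 1024 + sum(data_gb)
print(f"total cube size ~ {total_gb:.2f} GB")  # ~ 4.15 GB, matching the reported total

rows = 33_352_198
categories = 509_018
ratio = rows / categories
print(f"input rows per category ~ {ratio:.1f}")  # ~ 65.5 rows per category
```

A low rows-per-category ratio like this usually indicates a sparse cube with many rarely-used categories, which is one reason trimming unused category entries (as asked below) can help.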

The PowerPlay users run a report against this Power Cube asking for a year-over-year comparison that involves most of the defined metrics and a subset of one category with many entries to choose from. The PowerPlay server usually runs the report and returns the data in a reasonable time, but sometimes it exceeds the 900-second connection timeout and the request is cancelled.

I know that reducing the number of records used to build the Power Cube would produce smaller files, but for now I can't do that because all of the data is relevant to the users.

How can I improve the performance of the PowerPlay Server?

Is it better for PowerPlay server performance to have more data files for the Power Cube (lowering the threshold limit will generate more data files)?

Should I try to reduce the entries in each category to only the values that are actually used?