Hi Experts,
I am trying to build a cube in Cognos 10.2 using Transformer. The cube used to build successfully with less data, but now it fails with the error (TR1901) PDS-PCT-0010 A data cache table or index is full. I believe this is because the cube size crosses 2GB.
I cannot do time partitioning as I have 3 time dimensions. Please advise.
Thanks for your help in advance.
Have you tried the MultiFileCubeThreshold preference?
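If you prefer to set it per build rather than in the UI, it can be passed on the cogtr command line. This is a minimal sketch only, assuming your install supports cogtr's -n (build the cubes without opening the UI) and -d (set a preference for that run) options; the value and model path are placeholders:

    # Build the cubes with a one-off MultiFileCubeThreshold value.
    # -n builds without opening the UI; -d sets a preference for this run.
    # /data/models/mymodel.mdl is a placeholder path.
    cogtr -n -dMultiFileCubeThreshold=500000 /data/models/mymodel.mdl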
Thanks. I tried cube groups on a dimension, but it failed because one of the child cubes exceeded 2GB.
Now trying MultiFileCubeThreshold = 10000000. Fingers crossed while the cube is building.
It failed again with the same error. During the build, it created seven .mdp files, one of which crossed 2GB.
Please advise. Thanks
Try the same thing, but drop the MultiFileCubeThreshold by half.
Cube building/sizing is an art, not a science, as there are too many variables to factor in.
Thanks. What should the MultiFileCubeThreshold be for approximately 68,275,534 rows and 657,565 categories? I have set auto-partitioning. Please advise.
Quote from: Venkataramanan on 20 Feb 2014 01:06:07 PM
Thanks. What should the MultiFileCubeThreshold be for approximately 68,275,534 rows and 657,565 categories? I have set auto-partitioning. Please advise.
Like I said, it's an art, not a science. I couldn't tell you what it "should" be. It's trial and error. You can't base it on rows alone, because a row can have one column or any number of columns. Then you also have to consider that, depending on your model, you could have any number of dimensions and also alternate drill paths.
PS. Alt-drill paths add a LOT of size to the cube.
So at this point, whatever you had before, cut it in half and build again. If you still go over 2GB, cut the new number in half and try again.
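If you want to take the manual steps out of it, the halving can be scripted. This is a rough sketch only, assuming cogtr's -n and -d options and that a failed build returns a non-zero exit code (verify both on your install); the model path is a placeholder:

    # Halve MultiFileCubeThreshold after each failed build until one succeeds.
    THRESHOLD=10000000
    until cogtr -n -dMultiFileCubeThreshold=$THRESHOLD /data/models/mymodel.mdl
    do
        THRESHOLD=$((THRESHOLD / 2))
        echo "Build failed; retrying with MultiFileCubeThreshold=$THRESHOLD"
        if [ $THRESHOLD -lt 10000 ]; then echo "Giving up"; exit 1; fi
    done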
Thanks Grim. I appreciate your help.
I am not using alternate drill-down paths. My cube has 20 dimensions.
As advised, I have reduced the MultiFileCubeThreshold to 5000000 and am building the cube again. Will wait and see.
Thanks again.
In the meantime, I do not really understand why a single .mdp file crosses 2GB. Can I also consider reducing the number of categories? If so, please advise on manual partitioning for my case.
I have set MultiFileCubeThreshold to split the cube into multiple files. While building, Transformer creates many .mdp files but writes to only one of them, and the error occurs when that one file exceeds the 2GB limit.
I have also noticed that the cube does not share data equally among the .mdp files. Please advise.
Thanks in advance for your timely help.
Thanks Grim and Nybble.
I am new to the Cognoise community and I am really impressed with the response. You guys are simply great.
I will restate the problem statement and workaround.
Actual Issue: The cube has 20 dimensions. There are 3 time dimensions, with 15 special categories in each. Most of these dimensions have multiple levels and very granular data (say, item and customer details). The cube fetches data from 3 facts and has 80+ million records for each year. The cube processes 2+ years of data.
As expected, the cube size crossed 2GB and build failed.
Solution:
I tried setting MultiFileCubeThreshold to different values (going as low as 5000). The cube didn't allow Time Partitioning because it has 3 time dimensions. I tried different settings for the desired partition size, and tried excluding a few dimensions from auto-partitioning to reduce the number of categories generated. Unfortunately, it was all in vain.
Finally, I deleted 2 of the 3 time dimensions from the cube and enabled Time Partitioning on Month. After that, I added the 2 deleted time dimensions back into the remaining time dimension as alternate drill-downs. Now the Time Partitioning option is greyed out, but the value is still checked. When I build the cube, it generates individual cubes for each month, and the file sizes are less than a GB each. YES... the cube is being time partitioned (despite having 3 time dimensions, 2 of them as alternate drill-downs). That's it.
Thanks for the update! We really appreciate it when people take the time to provide a working solution like this!!
Have an applause from me! :)
MF.
Dear All,
Same situation here. I am able to create the cube using a UNIX script and the MultiFileCubeThreshold value.
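For reference, my script is roughly along these lines (the paths, model name, and preference-file format are placeholders/assumptions; I pass the threshold via cogtr's -f preference-file option, which on my version takes Name=Value entries):

    #!/bin/sh
    # Sketch of the scheduled cube build (placeholder paths and model name).
    # Assumes cogtr's -n (build and exit) and -f (preference file) options,
    # with the preference file holding one Name=Value entry per line.
    echo "MultiFileCubeThreshold=5000000" > /tmp/cube_prefs.txt
    /opt/cognos/c10/bin/cogtr -n -f/tmp/cube_prefs.txt /data/models/sales_cube.mdl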
But my real problem is that I need to create the cube once manually; only then will the cube display all the calculated time categories (such as 2 weeks, current week last year, etc.).
Do you have any suggestions for handling this one-time manual creation?
Many thanks.
Ramesh E
Additional info: I have 15 dimensions. Cube size is approximately 8GB.
Hi All,
My cube is more than 2GB.
I am getting the below error while creating the cube:
I have set the MultiFileCubeThreshold value to 5000000 (also checked with 10000000 up to 35000000) and get the same error. Could you please help with this? Below is the log file:
Thu 10 Aug 2017 8:19:55 PM 4 000002D9 Processing cube 'MENAP_SecondarySales_Cube' at location \\itsbebevcorp01.jnj.com\its_cognos_prd\consumer\cognosdata\project_data\menap\menap\transformer_models\menap_secondarysales_cube\mdc\menap_secondarysales_cube.mdc
Thu 10 Aug 2017 8:19:55 PM 4 0000021F Timing, UPDATE CATEGORY AND PROCESS WORK FILE,00:01:38
Thu 10 Aug 2017 8:19:55 PM 4 000002D9 Start metadata update of cube 'MENAP_SecondarySales_Cube'.
Thu 10 Aug 2017 8:19:56 PM 4 0000021F Marking categories needed.
Thu 10 Aug 2017 8:19:56 PM 4 0000021F Updating the PowerCube metadata.
Thu 10 Aug 2017 8:19:57 PM 4 0000021F Updating the PowerCube with currency data.
Thu 10 Aug 2017 8:20:04 PM 4 000002D9 End metadata update of cube 'MENAP_SecondarySales_Cube'. 123182 categories were added to the cube.
Thu 10 Aug 2017 8:20:04 PM 4 0000021F Timing, METADATA,00:00:09
Thu 10 Aug 2017 8:20:04 PM 4 000002D9 Start update of cube 'MENAP_SecondarySales_Cube'.
Thu 10 Aug 2017 8:20:04 PM 4 000002D9 --- Performing Pass 0 with 93224043 rows and 190608 categories remaining.
Thu 10 Aug 2017 8:20:04 PM 4 000002D9 Start Write leaving 190608 categories remaining.
Thu 10 Aug 2017 8:20:04 PM 4 0000021F Updating the PowerCube data.
Thu 10 Aug 2017 8:20:04 PM 4 0000021F Updating the PowerCube data.
Thu 10 Aug 2017 8:20:10 PM 4 0000021F Performing DataBase Commit at record number 500000.
Thu 10 Aug 2017 8:20:12 PM 4 0000021F Performing DataBase Commit at record number 1000000.
----------
----
Thu 10 Aug 2017 8:24:45 PM 4 0000021F Committing PowerCube(s).
Thu 10 Aug 2017 8:24:49 PM 4 0000021F Timing, CUBE COMMIT,00:00:04
Thu 10 Aug 2017 8:24:50 PM 4 0000021F End cube update.
Thu 10 Aug 2017 8:24:50 PM 4 0000021F Timing, TOTAL TIME (CREATE CUBE),03:10:55
Thu 10 Aug 2017 8:24:50 PM 2 0000021F (TR1901) PDS-PPE-0197 The local data cache size exceeded the allowed maximum of 2147418112 bytes.