Data Write to ASO cube

Hi,
Please help. The case is this:
The base member of Period in Planning is the calendar month, but the base member of Period in the ASO cube is the week, like this:
Jan
    JanWeek1
    JanWeek2
    JanWeek3
    JanWeek4
    JanWeek5
Now I need to load the Planning data into the ASO cube for reporting.
But it seems I cannot load data directly from Jan in Planning to Jan in the ASO cube, since Jan in the ASO cube is a parent member. Am I correct? That is what my test in Smart View showed.
So is there a better way to do this?
Thank you.

You can go ahead with the approach VBajaj suggested.
We have a similar process where our Plan data is one level higher than the ASO cube (i.e., level 1 of ASO is level 0 of Plan for most of the dimensions). So we created those level-1 members as extra level-0 members in ASO, suffixed with _Plan, and loaded the data to those members.
From a data-loading standpoint, you can just add the suffix in the rule file itself.
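For example, a minimal MaxL sketch of that flow, assuming a BSO source PlanApp.Plan and an ASO target RptApp.Rpt (all names here are hypothetical), with the _Plan suffix applied to the Period field inside the 'ldplan' rules file:

    /* export level-0 (monthly) data from the Planning BSO cube */
    export database PlanApp.Plan level0 data to data_file 'plan_lev0.txt';

    /* load into the ASO reporting cube; the rules file appends the
       _Plan suffix to the Period field (a field-properties setting in EAS) */
    import database RptApp.Rpt data from server data_file 'plan_lev0.txt'
        using server rules_file 'ldplan' on error write to 'ldplan.err';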
Regards
Amarnath
ORACLE | Essbase

Similar Messages

  • Performance Tuning Data Load for ASO cube

    Hi,
    Can anyone help with how to fine-tune data loads on an ASO cube?
    We have an ASO cube which loads around 110 million records from a total of 20 data files.
    18 of the data files have 4 million records each, and the last two have around 18 million records each.
    On average, loading 4 million records takes 130 seconds.
    Each data file has 157 data columns representing the Period dimension.
    With a BSO cube, sorting the data file normally helps, but with ASO it does not seem to have
    any impact. Any suggestions on how to improve data load performance for an ASO cube?
    Thanks,
    Lian

    Yes, TimG, it sure looks identical - except for the last BSO reference.
    Well, never mind, as long as those that count remember where the words come from.
    To the Original Poster and to 960127 (come on, create a profile already, will you?):
    The sort order WILL matter IF you are using a compression dimension. In this case the compression dimension acts just like a BSO dense dimension: if you load part of it in one record, then when the next record comes along it has to be added to the already existing part. The ASO "load buffer" is really a file named <dbname>.dat that is built in your temp tablespace.
    The most recent records that fit in the ASO cache are retained there, so if a record is still in the cache it does not have to be reread from the disk drive. So you could (instead of sorting) create an ASO cache as large as your final .dat file; then every record would still be at hand.
    BUT WAIT BEFORE YOU GO RAISING YOUR ASO CACHE. All operating systems use memory-mapped I/O, so even if a page is not in the ASO cache it will likely still be in "Standby" memory (the dark blue memory as seen in Resource Monitor). This holds until the system runs out of "Free" memory (light blue in Resource Monitor).
    So in conclusion: if your system still has Free memory, there is no need (for a data load) to increase your ASO cache. And if you are out of Free memory, then all you will do by increasing the ASO cache during a data load is slow down the other applications running on your system - so don't do it.
    Finally, if you have enough memory that the entire data file fits in Standby + Free memory, then don't bother to sort it first. If you do not, then sort it.
    Of course, you have 20 data files, so I hope you do not have compression members spread out amongst these files!
    Also, you did not say whether you are using parallel load threads. If you really need 20 files, read up on parallel load buffers and parallel load scripts; that will make it faster (see the sketch below).
    But if you do not really need 20 files and only split them up to load in parallel, then create one single file and raise your DLTHREADSPREPARE and DLTHREADSWRITE settings in essbase.cfg. These will help even if you do go parallel, and really help if you don't but still keep 20 separate files.
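    A minimal sketch of the parallel-buffer approach in MaxL (application, database, and file names are placeholders; for true parallelism the two staging imports would run from separate MaxL sessions, and the multi-buffer commit requires a release that supports it):

        /* stage each file into its own load buffer */
        alter database MyApp.MyDb initialize load_buffer with buffer_id 1 resource_usage 0.5;
        alter database MyApp.MyDb initialize load_buffer with buffer_id 2 resource_usage 0.5;
        import database MyApp.MyDb data from data_file 'file1.txt' to load_buffer with buffer_id 1;
        import database MyApp.MyDb data from data_file 'file2.txt' to load_buffer with buffer_id 2;

        /* then commit both buffers to the cube in a single pass */
        import database MyApp.MyDb data from load_buffer with buffer_id 1, 2;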

  • FDM to load data in Essbase ASO cube

    Has anybody used FDM to load data into an Essbase ASO cube? How do you clear data and run a calc on an ASO cube?
    Thanks
    Thanks

    Does the Essbase Adapter for FDM Support ASO Cubes? [ID 1168153.1]
    Modified 17-AUG-2010 Type HOWTO Status PUBLISHED
    Applies to:
    Hyperion Financial Data Quality Management - Version: 11.1.1.3.00 and later [Release: 11.1 and later ]
    Information in this document applies to any platform.
    Goal:
    Does the Essbase adapter for FDQM support ASO cubes?
    Solution:
    ASO cubes are not currently supported in FDQM.
    Unpublished Enhancement 6568323 has been created and it is currently under consideration for a future release.
    References
    BUG:6568323 - 8-529236080 - CUSTOMER WANTS TO TAKE ADVANTAGE OF THE ASO FUNCTIONS IN ESSBASE.

  • Data Copy in ASO Cube

    I have created an ASO cube (which is new to me; I have only worked with BSO before) and loaded data to Scenario1. I know you cannot use calc scripts in ASO cubes, so how do I copy data from Scenario1 to Scenario2?
    Any help is greatly appreciated!
    -RB

    You don't mention what version you are on. If it is pre-11.1.2, you would have to extract the data using something like a report script, then change the scenario and reload it. If you are on 11.1.2.x, you can use user-defined calculations to copy the data. Look in the Tech Reference under MaxL Calculate and make sure you switch to the ASO version.
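    A rough MaxL sketch of the pre-11.1.2 approach (all names are placeholders; the 'copyscen' rules file would remap Scenario1 to Scenario2):

        /* extract Scenario1 data with a report script */
        export database MyApp.MyDb using server report_file 'scen1' to data_file 'scen1.txt';

        /* reload it through a rules file that maps Scenario1 -> Scenario2 */
        import database MyApp.MyDb data from server data_file 'scen1.txt'
            using server rules_file 'copyscen' on error write to 'copyscen.err';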

  • Jobs and tasks for automating data loads into an Essbase ASO cube

    Hi all,
    My question is about creating jobs or tasks to automate loading data into Essbase cubes.
    I use a .bat file which runs a MaxL script that loads data from an Oracle database into an Essbase cube. The .bat file is run by a Windows Task.
    Are there any other ways to run the .bat file besides Windows Tasks? Maybe there are special utilities in Oracle EPM System, or something else...
    I am using Essbase version 11.1.1.2.0.
    Thanks for any reply :)

    There is no internal scheduler. You either have to use Windows Scheduler, Unix cron, or a third-party tool. Take a look at Star Analytics Command Center; it is designed specifically for Hyperion applications: http://www.staranalytics.com/products/command_center.htm
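    Whichever scheduler kicks it off, the MaxL script itself stays the same; a minimal sketch (credentials, names, and paths here are placeholders):

        /* load.msh - run as: essmsh load.msh
           Windows:   schtasks /create /tn EssLoad /tr "essmsh C:\scripts\load.msh" /sc daily /st 02:00
           Unix cron: 0 2 * * * /opt/essbase/bin/essmsh /opt/scripts/load.msh */
        login admin password on localhost;
        import database MyApp.MyDb data from server data_file 'plan.txt'
            using server rules_file 'ldrule' on error write to 'load.err';
        logout;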

  • Issue with Logical data clear in ASO

    Since we upgraded to version 11 we have done away with clearing data from the ASO cube using #MI files, and have started using the Clear Data In Region MDX functionality. From my understanding, a logical clear should load offsetting values to a new slice, so that a value of 0 is retrieved from the DB.
    The problem is that there is a 2-3 minute window between the clear script and the new data load during which users are pulling not 0s but incomplete data. Again, from my understanding, this is not how a logical delete should act. This process runs every 2 hours, so there is a 2-3 minute window every 2 hours where the data users see may be incorrect. If we can't resolve this issue we will have to go back to loading #MI to clear data from the ASO DB, which we are hoping to avoid. Also, we can't do a physical delete because it takes too long.
    Any ideas? Am I misinterpreting the Logical Delete functionality?
    Thanks in advance.

    Just to confuse matters, this problem is intermittent and I haven't been able to successfully replicate it in our Test environment.
    That would seem to indicate something else was going on in the DB that was interfering with the clear, but the logs aren't showing any errors, locks, etc that could have caused the problem.
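    For reference, the two clear variants in MaxL look like this (the region expression is a placeholder):

        /* logical clear: writes offsetting cells to a new slice; fast */
        alter database MyApp.MyDb clear data in region '{[Scenario1]}';

        /* physical clear: actually removes the cells; much slower */
        alter database MyApp.MyDb clear data in region '{[Scenario1]}' physical;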

  • Can we upload data in ASO Cube version 11.1.2

    Hi,
    I have a requirement where users need the ability to enter their plan data into the ASO application. Do we need to create a transparent partition for that? I have an ASO cube and a Hyperion Planning cube, and my Planning cube has more dimensions than the ASO cube.
    Please advise.
    Thanks in advance.

    If you are using the Excel add-in you do NOT do a lock, just a send. The lock is a BSO concept to put a hold on blocks; ASO has no blocks to lock. From Smart View you do a submit and it handles it all for you. The beauty of using ASO is exactly what you mention: results are immediate, without a calc.

  • Can we load data for all levels in ASO cube

    Hi All,
    Can we load data for all levels of members in an ASO cube in 9.3.1?
    Regards

    Yes, you can load data for all levels in an ASO cube in any version. HOWEVER, none of the upper-level data in the cube will be there when you look for it. You will get a warning message in the load because ASO cubes don't store data at upper levels. It is the same as loading data into Dynamic Calc members in a BSO cube: it will do the load without complaints, but there will be no data there (at least with ASO you get the warning).

  • How to add a new dimension in ASO cube without losing all data

    Hi,
    I have an ASO cube with 18 dimensions and I want to add a new one (Regular). When I add this new dimension, it warns me that all data has to be cleared before the restructure can take place. If I click Yes, I lose all data regardless of whether I have a level-0 export, because when I try to reload the export back into the cube it gives an error since not all the dimensions are present in the export. Does anyone know how I can accomplish this task?
    Thanks
    fikes

    You can try this workaround:
    1) Export the level-0 data.
    2) Clear the cube.
    3) Add the new dimension, Regular.
    4) Add a default member Reg00 in the new dimension and save.
    5) Open the level-0 data file in a text editor, insert "Reg00" at the beginning of the file, and press Enter.
    6) The level-0 file will now have "Reg00" as its first record, with the original export starting from the second record.
    7) Load the modified file without a rules file.
    You can verify that all your historical data is loaded with respect to the Reg00 member of the new dimension Regular.
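    A rough MaxL sketch of steps 1, 2, and 7 (names are placeholders; step 5's edit is still done by hand, and the new dimension is added in between):

        /* 1) export level-0 data */
        export database MyApp.MyDb level0 data to data_file 'lev0.txt';

        /* 2) clear the cube */
        alter database MyApp.MyDb reset;

        /* 7) reload the hand-edited file without a rules file */
        import database MyApp.MyDb data from server data_file 'lev0_mod.txt'
            on error write to 'reload.err';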

  • How to copy data in an ASO cube across two scenarios?

    How do I copy data in an ASO cube across two scenarios?

    There are multiple ways to do it.
    How is the source scenario populated? If it is populated using load rules, then why not change the scenario in the rule itself?
    Regards
    Celvin

  • Unable to export the ASO cube data

    Guys,
    I am unable to take an export of the ASO cube data. I am getting the following error message:
    Error(1270042)
    Aggregate storage data export failed
    Can anyone assist me at the earliest?
    P.S.: The data file size is around 30 GB.
    Cheers
    Cnee:)

    Hi Cnee,
    Can you post more entries from the application log around the failure? That would be helpful for further analysis.
    Also confirm whether the database is in use while you are exporting the data.
    If so, log the users out, disable connects, and then perform the export.
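    Something along these lines in MaxL (application and file names are placeholders):

        alter application MyApp disable connects;
        alter system logout session on database MyApp.MyDb;
        export database MyApp.MyDb data to data_file 'full_export.txt';
        alter application MyApp enable connects;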
    - Krish

  • ASO Cube Does not overwrite the data

    Hi,
    I have an ASO cube and I am loading data using the text file below. In the rule file I set it to overwrite existing values, but it is not overwriting them. I am using 7.1.5.
    I will appreciate your suggestions.
    Thanks
    Anky
    Text file:
        Jan  test  Pr1    1000
        Jan  test  Pr1    2000
        Jan  test  Pr1  100000

    Hi Ankyjay,
    I found this in the DBAG, page 1087; it might help:
    "When you take advantage of the aggregate storage data load buffer, Analytic Services sorts and works with the values after all data sources have been read. If multiple records are encountered for any specific data cell, the values are accumulated. Analytic Services then stores the accumulated values."
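    In later releases (not in 7.1.5, which this poster is on), the load buffer can be told to keep the last value for duplicate cells instead of accumulating them; a hedged MaxL sketch with placeholder names:

        /* 'aggregate_use_last' keeps the most recently loaded value
           for duplicate cells instead of summing them */
        alter database MyApp.MyDb initialize load_buffer with buffer_id 1
            property aggregate_use_last;
        import database MyApp.MyDb data from data_file 'data.txt' to load_buffer with buffer_id 1;
        import database MyApp.MyDb data from load_buffer with buffer_id 1;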

  • Data load in Essbase ASO cube

    Hi,
    I have not used ASO cubes before and have worked only on BSO cubes. Now I have a requirement to create a rule file to load data into an ASO Essbase cube. I created the data load rule file the way I would for a BSO cube, and it validates correctly. However, when I run the data load I get the following warning:
    "Aggregate storage applications ignore update to derived cells. [480] cells skipped"
    I investigated further and found that an ASO cube does not allow data loading at upper levels or on members calculated through formulas. I have since ensured that I am loading data only into level-0 members that are not calculated through a formula, but I am still not able to complete the data load and keep getting the same warning.
    Could you please help me and let me know if there is anything else I am missing here?
    Thanks in advance...
    AKW

    Hi AKW,
    "Aggregate storage applications ignore update to derived cells. [480] cells skipped"This is only a warning message that means only those many cells were skipped might be for some reasons like any member pointing to those cells will be missing.
    If you want to copy the Data of your BSO cube to an ASO Application why dont you use an PARTIONING it will copy your whole data from BSO to ASO (If Outline is common in both then copy any member of Sparse dimension like "Scenario 1" from Source i.e. BSO, to same member like "Scenario 1" in Target i.e ASO ),
    This is only an alternate wayThanks
    Avneet Singh Bhatia

  • Getting an error while loading data into an ASO cube from a flat file

    Hi All,
    I am getting the error "Essbase error 1270040: Data load buffer [1] does not exist" while loading data into an ASO cube.
    Does anyone have a solution?
    Regards,
    VM

    Are you using ODI to load the data, or MaxL? If you are using an ODI interface, are you also using a load rule? And which versions of Essbase and ODI are you using?
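    For what it's worth, a common cause of error 1270040 is initializing and using the buffer in different sessions: an ASO load buffer exists only within the session that created it, so the whole sequence has to run in one MaxL session (names are placeholders):

        /* all three statements in the same MaxL session */
        alter database MyApp.MyDb initialize load_buffer with buffer_id 1;
        import database MyApp.MyDb data from data_file 'data.txt' to load_buffer with buffer_id 1;
        import database MyApp.MyDb data from load_buffer with buffer_id 1;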
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • DATA LOAD WARNINGS IN ASO CUBES

    Hi everyone,
    While loading data into ASO cubes in Essbase we are getting warnings like "Data load stream contains [1.25797e+08] and [0] #missing cells". My data file has #missing values, 0s, and special characters like E. I want to load the complete data without warnings. Kindly let me know if anyone knows the solution: do I need to change any settings in the rule file, or how do I ignore those cells?
    Thanks,
    Vikram

    The warnings are really informational messages to let you know what it loaded and did not load. Which is fine, as those values (the zeros) tend to bloat a cube. #missing is not going to load anyway, and E is just the exponential format of numbers, which should not be a problem; Excel will display it this way, but you can format it without the E. You don't mention whether you are doing this from EAS or MaxL, or what version you are on. In version 11, the EAS load dialog has options across the top to turn the loading of zeros and missing values on or off. In MaxL I don't see the syntax in the Tech Reference, but I thought it was there in 9.
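    For ASO load buffers the MaxL equivalent does exist in recent versions; a hedged sketch with placeholder names:

        /* skip #Missing and zero cells while staging the load */
        alter database MyApp.MyDb initialize load_buffer with buffer_id 1
            property ignore_missing_values, ignore_zero_values;
        import database MyApp.MyDb data from data_file 'data.txt' to load_buffer with buffer_id 1;
        import database MyApp.MyDb data from load_buffer with buffer_id 1;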
