Loading data to Parent in ASO cube
I created an Excel data file to test different load scenarios. One scenario loads to a parent-level member in one of the dimensions. I expected the record to show up in the dropped-records file, but it doesn't. Does anyone know why? Is there any way to tell that the record wasn't loaded? We are on v11.
You get a warning message 1003055: "Aggregate storage applications ignore update to derived cells. [X] cells skipped".
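Since ASO reports skipped parent-level (derived) cells only through this warning and never writes them to the dropped-records file, a pre-validation step outside Essbase can catch them before the load. A minimal Python sketch, assuming the member name is in the first field of each record and that you can export the list of level-0 members (the member names and file layout here are made up):

```python
# Hypothetical pre-check: flag load records that target non-level-0
# (parent/derived) members, which ASO would silently skip with
# warning 1003055 instead of writing to the dropped-records file.

def find_derived_records(records, level0_members):
    """Return (row_index, member) pairs whose member is not level 0."""
    rejects = []
    for i, row in enumerate(records):
        member = row[0]  # assume the first field holds the member name
        if member not in level0_members:
            rejects.append((i, member))
    return rejects

level0 = {"Jan", "Feb", "Mar"}                      # assumed level-0 members
data = [("Jan", 100), ("Qtr1", 999), ("Feb", 50)]   # "Qtr1" is a parent
print(find_derived_records(data, level0))           # -> [(1, 'Qtr1')]
```

Running this over the data file before the load tells you exactly which rows Essbase will skip.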
Similar Messages
-
How to load data from a ODS to CUBE Request ID - by - Request ID?
The problem is that some requests had been deleted from the cube, and the delta control between the ODS and the cube was lost. The "data mart status of request" flag for all of the ODS requests had been cleared.
Now it is necessary to load some requests from the ODS into the cube.
Notes:
- a full load with selection of the data to be loaded is not possible;
- the PSA is not being used;
- given the data volume, it is impractical to reload the cube completely.
Thanks in advance,
Wesley.
Dear R B,
Considering the following:
-> the delta control was lost;
-> the data already are active in the ODS;
-> part of the data of the ODS already is in the cube.
The indicated procedure only guarantees the load of the data that is in the ODS and not yet in the cube.
Tks,
Wesley. -
Error While loading data from 8 DSOname to Cube.
Hi All,
While loading data from 8<DSONAME> to the cube, the data is successfully loaded into the cube, but the request status is shown in yellow. Why is it yellow, and how can I make it successful?
Thanks and Regards,
santosh.
Hi All,
In ST22 I'm getting a short dump; the error message is:
MESSAGE_TYPE_X
What happened?
The current application program detected a situation which really
should not occur. Therefore, a termination with a short dump was
triggered on purpose by the key word MESSAGE (type X).
Error analysis:
Short text of error message:
Generic process chain
Long text of error message:
Technical information about the message:
Message class....... "RSMPC"
Number.............. 000
Variable 1.......... " "
Variable 2.......... " "
Variable 3.......... " "
Variable 4.......... " "
"MESSAGE_TYPE_X" " "
"CL_RSSM_LOADING===============CP" or "CL_RSSM_LOADING===============CM005"
"IF_RSPC_EXECUTE~EXECUTE"
Is there any patch problem?
Please let me know how to resolve this issue.
thanks and regards,
santosh. -
Getting error while loading data from sql to ASO
Hi There,
We are working on an ASO cube (Hyperion 9.3.0.1) and my data source is Oracle 10g R2. While trying to load data or build a dimension from SQL, we get an error, although it previously worked properly. We get a different error each time. I have updated the essbase.cfg file with NETDELAY 800 and NETRETRYCOUNT 1000, and I have also changed the pending cache size limit in my application from 48 to 96, but I still get the same error. The errors follow:
ERROR
Database IBasic loaded
Application IdeaBas2 loaded - connection established
Application [IdeaBas2] started with process id [10920]
Object [IBasic] is locked by user [dhanjit]
Cannot read SQL driver name for [Hyperion Client Sample] from [ODBC.INI]
Cannot read SQL driver name for [Hyperion BIplus Client Sample1] from [ODBC.INI]
Cannot read SQL driver name for [Hyperion BIplus Client Sample2] from [ODBC.INI]
Cannot read SQL driver name for [tmw2k_1] from [ODBC.INI]
Connection String is generated
Connection With SQL Database Server is Established
SQL Connection is Freed
Building Dimensions Elapsed Time : [10.641] seconds
Reading Parameters For Database [Drxxxxxx]
Declared Dimension Sizes = [109 165 15 80 1785 1938 1242 93 25 14 10 6 20 1502 21 5 211 2 ]
Actual Dimension Sizes = [108 158 15 78 1785 1938 1241 93 25 14 10 6 20 1502 21 5 211 1 ]
Network error [10054]: Cannot Send Data
Network error [10054]: Cannot Send Data
Unexpected Essbase error 1042012
Object [IBasic] unlocked by user [dhanjit]
regards,
Dhanjit -
Wrong date value in Essbase ASO cube
Hi All,
I'm trying to load a date value in mm-dd-yy format into an Essbase ASO cube. I'm using a tab-delimited txt file. The load rule is working fine, and the outline property is set to the proper format, "mm-dd-yy". After loading the data, when I retrieve it using Smart View, all the dates are decreased by one day in my Smart View report.
Would you have any ideas why that is happening?
Thanks
This is a bug; it is fixed in 11.1.2.
-
I have actuals data available at the day level, but plan is available only at the month level. Month is level 1, and I'm thinking about loading plan to this level. Are there any problems with doing this? What do I need to be careful about? I will be doing a nightly level-0 export of the data, clearing the database, reloading the export, then loading new data and calculating the cube. Thanks for your help.
Well, there are a few problems to worry about. First, you must ensure that your database is not set to Aggregate Missing Values (a database setting); otherwise your upper-level values will be set to #Missing if all of the children are #Missing. Second, if you want to write back to level 1 of Time, you cannot make those members Dynamic Calc, which takes away a big optimization tool. It is probably easier (from a cube-admin perspective) either to create dummy members to load the plan to, or to choose one month in each quarter in which to input the plan. At the upper levels of Time, the data will look the same as if you had input it at that level.
Regards,
Jade
Jade Cole
Senior Business Intelligence Consultant
Clarity Systems
[email protected]
-
Loading data into 0FIAR_C03 A/R CUBE.
We are on BW 3.5, and I'm getting an error loading data from the ODS 0FIAR_003 LINE ITEMS into the cube 0FIAR_C03.
Data was successfully loaded into the ods.
The ods data is activated.
I get the error when updating the data target.
I can see in the monitor that a number of IDocs were sent, but no IDocs were received.
Any suggestions on what I can check? This is a new BW installation, so there may be a connection missing.
Thanks for your assistance.
Hi Dennis,
Please check the source system connection, and use ST22 to check for short dumps. The source system should be able to process IDocs when the message is sent from the BW system; system errors can cause this behavior.
You can delete the request in the InfoCube, reset the data mart status in the ODS object, and repeat the load. Hope this helps.
Cheers,
Ganni -
Integrated Planning - trying to load data from basic to realtime cube
I have created a multiprovider over my basic and realtime cubes, and I am trying to copy data from basic to realtime using the standard copy function in IP. It is not erroring out, but the data is not getting copied either. Both cubes have exactly the same structure. I can do an export DataSource or a BPS multi-planning area, but I need a solution in IP. Any help would be appreciated. Thanks.
Infoprovider --> Multiprovider
Agg Level --> Selected fields need to be copied
Filter --> None selected, wanted to copy as is
Pl. Fnc --> Standard Copy.
char usage: Selected infoprovider for 'changed'
to parameters: selected all key figs ( percent and amount)
from and to values: Basic to realtime
Let me know. -
Long time to load data from PSA to DSO / CUBE. Sequential read RSBKDATA_V
Hi Gurus
The process stays for several hours on a sequential read from RSBKDATA_V, the temporary storage for DTPs.
After several hours the processing finally starts, but even when I assign several background processes to the DTP, it takes hours to load the data.
We have BI 7.0 with SP22, so all relevant notes are already implemented.
Has anyone had a similar problem?
Thanks in advance
Martin
Hi Martin,
this issue has cropped up a few times. Please check and implement the following notes:
1338465 P22:DTP:LOG: Performance problem when messages are added +
1331544 P22:HINT:Slow performance when accessing RSMONFACT +
1312701 70SP21: Performance on view RSBKDATA_V selects
1304234 70SP21: Performance on the hashed table p_th_rsbkdata_v **
1168098 70SP19: Performance during DataStore object extraction
1080027 70SP16: Performance during parallel processing
These notes should improve the performance of the DTPs in your system.
After you've implemented the notes, please check whether tables RSBKDATA and RSBKDATAINFO contain any data. If the tables are empty, please reorganise them and restart the DTP.
Hope this helps you.
Rgds,
Colum -
Error while loading data from DSO to InfoCube
Hi All,
When I run the DTP from the DSO to the cube, I get these errors:
1. Data package processing terminated
Message no. RSBK229
2. 'Processed with Errors'
Message no. RSBK257
3. Error while updating to target 0FIGL_C10 (type INFOCUBE)
Message no. RSBK241.
These are the errors I am getting. I am new to BI, so how can I solve them?
All the transformations are active, with no errors.
Thanks in Advance
Regards,
Mrigesh.
Edited by: montz2006 on Dec 7, 2009 8:45 AM
Hi All,
Now I have deleted the data from the DSO and the DataSource and run the DTP again. This time I got data from the DataSource to the DSO, and when I run the DTP between the DSO and the cube I do not get an error.
But when I go to the cube and look at the data, the request has arrived but the data has not been transferred.
The request status is green in the cube, but no data is coming.
Regards,
Mrigesh. -
Hi,
I have not used ASO cubes before and have worked only on BSO cubes. Now I have a requirement to create a rule file to load data into an ASO Essbase cube. I created the data load rule file as I would for a BSO cube, and it validates correctly. However, when I do the data load, I get the following warning:
"Aggregate storage applications ignore update to derived cells. [480] cells skipped"
I investigated further and found that an ASO cube does not allow data loading at upper levels or to members calculated through formulas. I have since ensured that I am loading the data only into level-0 members that are not calculated through a formula, but I am still unable to load the data and get the same warning.
Could you please help me and let me know if there is anything else which I am missing here?
Thanks in advance...
AKW
Hi AKW,
"Aggregate storage applications ignore update to derived cells. [480] cells skipped" — this is only a warning message. It means that many cells were skipped, for example because a member addressed by those cells is missing or derived.
If you want to copy the data of your BSO cube to an ASO application, why don't you use partitioning? It will copy your whole data set from BSO to ASO (if the outline is common to both, copy any member of a sparse dimension, such as "Scenario 1", from the source BSO to the same member in the target ASO).
This is only an alternate way.
Thanks
Avneet Singh Bhatia
Problem in loading 0calday infoobject date format through Flatfile in cube
Hi All,
I am facing a problem loading 0CALDAY InfoObject data from a flat file (test.csv format) into an InfoCube.
Suppose we have two flat files (test1.csv, test2.csv):
1. The first file (test1.csv) has the proper date format (YYYYMMDD), so it loads successfully.
2. The second file (test2.csv) has an improper date format (DDMMYYYY), so the load fails because of the format.
Is it possible to write a routine (start routine) in the InfoPackage (External Data tab) such that if the flat file is test1.csv, with the proper date format, the 0CALDAY data is loaded without any conversion, and if the file is test2.csv, the date field is converted from DDMMYYYY to YYYYMMDD and then loaded into the cube?
With regards,
Hari.
+91 9323839017
Hello Dinesh, Anil,
There is no distinguishing field between the two flat-file loads.
We are using only one InfoObject (0CALDAY) for both loads.
We are using two external source systems: one generates the file with a YYYYMMDD date format, the other with DDMMYYYY; the two file names are unique.
My requirement is to compare the two file names in the start routine of the package (External Data tab):
if (test1.csv)
load the 0CALDAY data as-is (since it is already in the proper format, YYYYMMDD)
else if (test2.csv)
convert from DDMMYYYY to YYYYMMDD and load the data into the 0CALDAY InfoObject in the cube.
Is it possible to compare the two file names in a start routine?
with regards,
Hari -
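The conversion described above would be written as an ABAP start routine in the InfoPackage. Purely to illustrate the logic, here is a hedged Python sketch; the file names and date formats are taken from the question, and the assumption that the routine can see the file name is carried over from it:

```python
# Illustrative sketch (the real routine would be ABAP): convert the
# date field to YYYYMMDD only for the file known to use DDMMYYYY.

from datetime import datetime

def normalize_calday(value, filename):
    """Return the date as YYYYMMDD, converting only for test2.csv."""
    if filename == "test2.csv":
        # incoming format DDMMYYYY -> internal format YYYYMMDD
        return datetime.strptime(value, "%d%m%Y").strftime("%Y%m%d")
    return value  # test1.csv is already YYYYMMDD

print(normalize_calday("25122009", "test2.csv"))  # -> 20091225
print(normalize_calday("20091225", "test1.csv"))  # -> 20091225
```

Parsing with strptime (rather than slicing the string) also rejects impossible dates, which mirrors the conversion-exit checks a 0CALDAY load would perform.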
Look up two ODS and load data to the cube
Hi ,
I am trying to load data into the Billing Item cube. The cube contains some fields which are loaded from the Billing Item ODS, which is loaded from 2LIS_13_VDITM directly from the DataSource, and some fields which need to be looked up in the Service Order ODS and the Service Order Operations ODS. I have written a start routine in the cube update rules: using SELECT statements on both ODS objects, I fetch the required fields into internal tables, and in the update rules I write update routines for the fields using READ statements.
I am getting an error when the second SELECT statement (from the second ODS) is executed.
The error message is
You wanted to add an entry to table
"\PROGRAM=GPAZ1GI2DIUZLBD1DKBSTKG94I3\DATA=V_ZCSOD0100[]", which you declared
with a UNIQUE KEY. However, there was already an entry with the
same key.
The error message says that the internal table was declared with a UNIQUE KEY, and an entry with the same key already exists.
Can anyone help me with a solution for this requirement? I would appreciate it if anyone who has written such code could share it.
Thanks in Advance.
Bobby
Hi,
Can you post the SELECT statements you have written in the start routine?
regards,
raju -
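For what it's worth, the dump above usually means the second SELECT returns more than one row per key while the internal table is declared with a UNIQUE KEY; in ABAP the common fix is SORT followed by DELETE ADJACENT DUPLICATES COMPARING the key fields before filling the uniquely keyed table. A hedged Python sketch of that dedup step (the field names are invented for illustration):

```python
# Illustrative dedup step: keep one row per lookup key, mimicking ABAP's
# SORT itab BY key / DELETE ADJACENT DUPLICATES FROM itab COMPARING key.

def dedupe_by_key(rows, key_fields):
    """Keep the first row for each key after sorting by that key."""
    seen = set()
    unique = []
    # sorted() is stable, so rows sharing a key keep their original order
    for row in sorted(rows, key=lambda r: tuple(r[f] for f in key_fields)):
        key = tuple(row[f] for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

rows = [
    {"order": "A1", "item": 10, "status": "OPEN"},
    {"order": "A1", "item": 10, "status": "CLSD"},  # duplicate key
    {"order": "B2", "item": 20, "status": "OPEN"},
]
print(len(dedupe_by_key(rows, ["order", "item"])))  # -> 2
```

If the duplicates carry different data (as "status" does here), deduplicating silently drops rows, so it is worth checking whether the lookup key should actually include more fields.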
Process Chain taking long time in loading data in infocube
Dear Expert,
We are loading data through a process chain into the A/R cube; it takes data from
PSA-> DSO->Activation->Index Deletion->DTP(load infocube)->IndexCreation->Create Aggregates.
Index creation is taking a long time every day, around 9 to 10 hours.
When we go into RSRV and repair the InfoCube, the loading of data happens fast. We are doing this (RSRV) every day. In DB02 we have seen that 96% of the tablespace is used.
Please suggest a permanent solution, and whether this is a BI issue or a Basis issue.
Regards,
Ankit
Hi,
We are loading data through a process chain into the A/R cube; it takes data from
PSA -> DSO -> Activation -> Index Deletion -> DTP (load InfoCube) -> Index Creation -> Create Aggregates.
In the above steps, instead of Create Aggregates it should be the Roll Up process for aggregates.
You can ask the Basis team to check the tablespace in transaction DB02OLD/DB02.
Check if there is a long-running job in SM66/SM50 and kill that job.
Check that there are enough batch processes to perform the steps.
Hope this helps.
"Assigning points is the ways to say thanks on SDN".
Br
Alok -
Loading data from excel to BPS layout
Hi,
I have BPS layouts in place and also excel sheets which are exact replica of those layouts.
I know two options to load Excel data into BPS:
1) Copy and paste the data from excel to BPS layout.
2) Create a flat file in line format from excel and then use method explained in document "How to Load a Flat File into BW-BPS Using SAPGUI.pdf".
Is there any other way I can upload the same?
Any help in this regard will be highly appreciated.
thanks in advance.
regards,
Pankaj
Thanks to Vlad, Laurent & Srini for the prompt replies.
But here the user needs to load data directly into the transactional cube.
So instead of keying values into the BPS planning layout, the user wants to upload the data from Excel into the BPS layout.
Planning is done in SAP GUI, not a web browser.
I am able to do this using the document "How to Load a Flat File into BW-BPS Using SAPGUI".
But in this case the data needs to be converted into a simple CSV file in line-item format before it can be uploaded.
Is there any way to upload from the Excel file without having to change it to the CSV format required in "How to Load a Flat File into BW-BPS Using SAPGUI"?
thanks in advance.
regards,
Pankaj
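As a stopgap, the wide-layout-to-line-item conversion that the How-To document requires can be scripted rather than done by hand. A hedged Python sketch, assuming the Excel layout has been saved as CSV with one column per period (the column names and layout are made up):

```python
# Illustrative unpivot: turn a layout with one column per period
# (Account, Jan, Feb, ...) into line-item rows (Account, Period, Value),
# the shape the BW-BPS flat-file upload expects.

import csv
import io

def to_line_items(wide_csv_text):
    """Unpivot a wide CSV layout into line-item rows."""
    reader = csv.reader(io.StringIO(wide_csv_text))
    header = next(reader)        # e.g. ['Account', 'Jan', 'Feb']
    periods = header[1:]
    lines = []
    for row in reader:
        account, values = row[0], row[1:]
        for period, value in zip(periods, values):
            lines.append([account, period, value])
    return lines

wide = "Account,Jan,Feb\nSales,100,110\nCosts,40,45\n"
for line in to_line_items(wide):
    print(",".join(line))
```

The resulting rows can be written back out with csv.writer and fed to the SAPGUI upload described in the How-To document; this does not remove the CSV step, it only automates it.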