Archiving BI 7.0: Must data in an InfoCube be compressed before an archiving run?
Hi Expert
I have created a Data Archiving Process (DAP). It is executed by an archiving process variant in a process chain.
Archiving an InfoCube works fine when all requests are compressed.
However, when I have some requests in the InfoCube that aren't compressed, I get the following error:
07.11.2008 14:47:07 Data area to be archived is not completely compressed yet I RSDA 148
07.11.2008 14:47:07 Exception condition in row 105 of include CL_RSDA_ARCHIVING_REQUEST=====CM015 (program CL_RSDA_ARCHIVING_REQUEST=====CP) I RSDA 140
I cannot find it stated anywhere that all requests have to be compressed, and I cannot find any OSS notes on this subject either.
Have you got any experience in this matter?
Thanks in advance and kind regards,
Torben
Hi,
First of all, I want to thank you for your help!
But I have a question about a case:
Loading Date   Loading N°   FISCYEAR   Order N°   Turnover
31/12/2007     1            2007       Order 1    100
01/01/2008     2            2008       Order 1    -50
02/01/2008     3            2007       Order 1    60
The loading date corresponds to the Request Date.
In our case, we have compressed the data whose request date is < 2008, and we want to archive the data with FISCYEAR = 2007.
In this case, loading 1 is in the E-table, and loadings 2 & 3 are in the F-table.
The goal of the 3rd loading is to recover data that belongs to the year 2007 but had not yet reached the SAP system.
For example, in my case, the data (which come directly from the stores) are sent to SAP every day, but due to a problem we did not receive some of them.
So the 3rd loading allows us to recover those data.
How can I archive all the data for FISCYEAR = 2007, knowing that not all of it sits in requests whose loading date is < 2008?
Thanks for your help.
Salah Lamaamla
Similar Messages
ORA-00258: manual archiving in NOARCHIVELOG mode must identify log
Hi, I am new to Oracle Streams. I am trying to set up one-way replication from one database to another using Oracle 10g (10.2.0.1.0) on Windows XP SP3 (32-bit).
I ran the following procedure as the Streams admin schema:
begin
  dbms_streams_adm.maintain_schemas(
    schema_names                 => 'XXCOW',
    source_directory_object      => 'repl_exp_dir',
    destination_directory_object => 'repl_imp_dir',
    source_database              => 'PWBSD',
    destination_database         => 'PDVSD',
    perform_actions              => true,
    dump_file_name               => 'exp_app23.dmp',
    capture_queue_table          => 'rep_capt_table',
    capture_queue_name           => 'rep_capt_queue',
    capture_queue_user           => NULL,
    apply_queue_table            => 'rep_dest_table',
    apply_queue_name             => 'rep_dest_queue',
    apply_queue_user             => NULL,
    capture_name                 => 'capture_pubs',
    propagation_name             => 'prop_pubs',
    apply_name                   => 'apply_pubs',
    log_file                     => 'exp_app23.log',
    bi_directional               => false,
    include_ddl                  => true,
    instantiation                => dbms_streams_adm.instantiation_schema);
end;
/
The script failed the first time because I forgot to configure the source database in archivelog mode.
The steps I followed to change to archivelog mode:
SQL> select name from v$database;
NAME
PWBSD
SQL> alter system set LOG_ARCHIVE_DEST = 'D:\data\oracle\oradata\PWBSD\archive' scope=both;
System altered.
SQL> conn sys/sys@pwbsd as sysdba
Connected.
SQL> shutdown immediate
Database closed.
Database dismounted.
ORACLE instance shut down.
SQL> startup mount
ORACLE instance started.
Total System Global Area 612368384 bytes
Fixed Size 1250428 bytes
Variable Size 197135236 bytes
Database Buffers 406847488 bytes
Redo Buffers 7135232 bytes
Database mounted.
SQL> alter database archivelog;
Database altered.
SQL> alter database open;
Database altered.
SQL>
I configured it in archivelog mode and ran the procedure above again.
This time I got the following output:
job finished
begin
ERROR at line 1:
ORA-23616: Failure in executing block 90 for script
959ECF1D1159402A8C16687AE5E3B5CD
ORA-06512: at "SYS.DBMS_RECOVERABLE_SCRIPT", line 457
ORA-06512: at "SYS.DBMS_STREAMS_MT", line 2201
ORA-06512: at "SYS.DBMS_STREAMS_MT", line 7486
ORA-06512: at "SYS.DBMS_STREAMS_ADM", line 2624
ORA-06512: at "SYS.DBMS_STREAMS_ADM", line 2685
ORA-06512: at line 2
I ran the following to check the error:
select * from dba_recoverable_script_errors;
The output is:
SCRIPT ID: 959ECF1D1159402A8C16687AE5E3B5CD
BLOCK NUM: 90
ERROR_NUMBER: -258
ERROR_MESSAGE: ORA-00258: manual archiving in NOARCHIVELOG mode must identify log
ORA-06512: at "SYS.DBMS_RECO_SCRIPT_INVOK", line 129
ORA-06512: at "SYS.DBMS_STREAMS_RPC", line 358
It seemed like it was still complaining about archivelog mode, so I verified that the PWBSD database is in archivelog mode by running the following:
select name, log_mode from v$database;
NAME: PWBSD
LOG_MODE: ARCHIVELOG
What could be the problem, and how do I proceed to fix it?

Hi Parthiv,
The steps you have given are not clear.
Please try to follow the steps in the link below; it may help you set up schema-level Streams:
http://gssdba.wordpress.com/2011/04/20/steps-to-implement-schema-level-oracle-streams/
Thanks and Regards,
Satish.G.S
gssdba.wordpress.com
How to restrict access of data in Infocube
Hi BW Experts,
We have an HR project, and we loaded the data from a flat file. We developed reports as per the requirements.
As per some conditions, the HR data should not be visible in the InfoCube or in the report.
We have only one role, which has all authorizations (SAP_ALL and BI_ALL).
I know one solution, i.e. we restrict all InfoCubes except the HR cube, and only then can the HR data be hidden. But we don't want to disturb our existing role.
Please, can anyone tell me another solution?
Regards,
Anjali

Hi Anjali,
Please check here......
Authorization issue to view cube contents
Re: Authorization Object issue
Thanks,
Vijay.
How can I read and delete data in an InfoCube
Can someone help me with how to read and delete data in an InfoCube before loading data from a direct-update DSO?
I should be able to read and delete on a condition over the combination (0FISCPER, 0COMP_CODE).
Let's say we load JAN, comp_code 0001 (10 records), JAN, comp_code 003 (30 records) and JAN, comp_code 004 (20 records) into the InfoCube.
If the DSO then delivers JAN, comp_code 0001 again (40 records), the result must not simply be the old records plus the new ones; some of the old records may no longer be valid. That is why I want to delete all existing JAN, comp_code 0001 records before loading.
Can someone help me?

Hello,
You could use the selective deletion available in the cube to delete exactly the data for the 0FISCPER / 0COMP_CODE combination.
If you go to the Manage screen of the cube, Contents tab, you will find the Selective Deletion option at the bottom of the screen.
There you can enter your selections for the records to be deleted and run the job.
Please let me know if you need any more help.
Regards,
Pankaj
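For completeness: the same selective deletion can also be scripted instead of being run from the Manage screen. The sketch below uses function module RSDRD_SEL_DELETION; the cube name ZMYCUBE is hypothetical, and the type and parameter names are quoted from memory, so verify the interface in SE37 before relying on it.

```abap
* Hedged sketch: selective deletion for one 0FISCPER / 0COMP_CODE combination.
* Verify the exact FM interface in SE37 - names here are from memory.
DATA: l_thx_sel TYPE rsdrd_thx_sel,   " hashed table of selections per InfoObject
      l_sx_sel  TYPE rsdrd_sx_sel,
      l_s_range TYPE rsdrd_s_range,
      l_t_msg   TYPE rs_t_msg.

* delete only company code 0001 ...
l_s_range-sign   = 'I'.
l_s_range-option = 'EQ'.
l_s_range-low    = '0001'.
APPEND l_s_range TO l_sx_sel-t_sel.
l_sx_sel-iobjnm  = '0COMP_CODE'.
INSERT l_sx_sel INTO TABLE l_thx_sel.

* ... in fiscal period 001.2009 (internal format YYYYPPP)
CLEAR: l_sx_sel, l_s_range.
l_s_range-sign   = 'I'.
l_s_range-option = 'EQ'.
l_s_range-low    = '2009001'.
APPEND l_s_range TO l_sx_sel-t_sel.
l_sx_sel-iobjnm  = '0FISCPER'.
INSERT l_sx_sel INTO TABLE l_thx_sel.

CALL FUNCTION 'RSDRD_SEL_DELETION'
  EXPORTING
    i_datatarget      = 'ZMYCUBE'      " hypothetical cube name
    i_thx_sel         = l_thx_sel
    i_authority_check = 'X'
  CHANGING
    c_t_msg           = l_t_msg.
```

This is essentially the same operation the Manage screen schedules, so test it in a sandbox first.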
How to Read data from Infocube
Dear Friends,
I have a requirement to read data from an InfoCube.
I know the function module (FM) that SAP provides. The FM name is RSDRI_INFOPROV_READ.
But when I try to execute this FM in SE37, it gives an "Error generating the test frame" message.
I called this FM in a start routine, and it gives an error while loading.
If anyone is using this FM, please let me know how to use it. If anyone has sample code, please share.
Thx
Raj

Hi,
I usually check the documentation every time, but this time I forgot to check.
It is well documented. Thanks for igniting my mind.
Bye
Raj
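For reference, a typical call of RSDRI_INFOPROV_READ looks roughly like the sketch below. The SE37 message "Error generating the test frame" most likely appears because the FM has generically typed parameters (E_T_DATA), so it cannot be run from the test frame and has to be called from a program. The InfoProvider, characteristic and key-figure names here are placeholders; verify the exact interface in SE37 for your release.

```abap
* Hedged sketch of reading a cube with RSDRI_INFOPROV_READ.
TYPE-POOLS: rsdri.

TYPES: BEGIN OF ty_result,           " flat structure matching the aliases below
         comp_code TYPE c LENGTH 4,
         amount    TYPE p LENGTH 8 DECIMALS 2,
       END OF ty_result.

DATA: l_th_sfc  TYPE rsdri_th_sfc,   " requested characteristics
      l_s_sfc   TYPE rsdri_s_sfc,
      l_th_sfk  TYPE rsdri_th_sfk,   " requested key figures
      l_s_sfk   TYPE rsdri_s_sfk,
      l_t_range TYPE rsdri_t_range,  " selections
      l_s_range TYPE rsdri_s_range,
      lt_data   TYPE STANDARD TABLE OF ty_result,
      l_first   TYPE rs_bool VALUE 'X',
      l_end     TYPE rs_bool.

l_s_sfc-chanm    = '0COMP_CODE'.
l_s_sfc-chaalias = 'COMP_CODE'.
INSERT l_s_sfc INTO TABLE l_th_sfc.

l_s_sfk-kyfnm    = '0AMOUNT'.        " placeholder key figure
l_s_sfk-kyfalias = 'AMOUNT'.
l_s_sfk-aggr     = 'SUM'.
INSERT l_s_sfk INTO TABLE l_th_sfk.

l_s_range-chanm  = '0FISCYEAR'.
l_s_range-sign   = 'I'.
l_s_range-compop = 'EQ'.
l_s_range-low    = '2007'.
APPEND l_s_range TO l_t_range.

DO.
  CALL FUNCTION 'RSDRI_INFOPROV_READ'
    EXPORTING
      i_infoprov    = 'ZMYCUBE'      " hypothetical InfoProvider
      i_th_sfc      = l_th_sfc
      i_th_sfk      = l_th_sfk
      i_t_range     = l_t_range
      i_packagesize = 100000
    IMPORTING
      e_t_data      = lt_data
      e_end_of_data = l_end
    CHANGING
      c_first_call  = l_first.
  " ... process lt_data here ...
  IF l_end = 'X'.
    EXIT.
  ENDIF.
ENDDO.
```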
Standard method to archive/delete all vendor master data
Hi every abap expert
Does anyone know how to archive all vendor master data
in a client?
If you use transaction SARA, the vendor master data must be flagged for deletion before archiving.
Is there a standard transaction to set the deletion flag on all vendor master records?

Hi Boris!
Give XK99 a try. With XK06 a single change is possible, and in FI FK06 is also available.
If XK99 has problems with field LFA1-LOEVM, you can easily create a CATT for XK06.
Regards,
Christian
The tables that store data for an InfoCube and ODS
Hi all:
Could you please tell me how to find the tables that store the data of an InfoCube or ODS?
Thank you very much!

Hi Jingying Sony,
To find the tables for any InfoProvider, go to SE11.
In the database table field, enter the following.
A cube has a fact table and dimension tables.
For a customized cube (i.e. cube names not starting with '0'):
Uncompressed fact table - /BIC/F<infocubename>
Compressed fact table - /BIC/E<infocubename>
Dimension table - /BIC/D<infocubename>
For a standard cube (i.e. cube names starting with '0'):
Uncompressed fact table - /BI0/F<infocubename>
Compressed fact table - /BI0/E<infocubename>
Dimension table - /BI0/D<infocubename>
Click on Display.
For a DSO:
Standard DSO active table - /BI0/A<DSO name>00
Customized DSO active table - /BIC/A<DSO name>00
Use 40 instead of 00 for the new-data table.
Click on Display.
An easier way: in the database table field, write the name of the cube/DSO preceded and followed by '*', then press F4. It will list the available tables for that InfoProvider.
Double-click on a name and choose Display.
Hope this helps,
Best regards,
Sunmit.
ABAP short dump when loading data into Infocube
Hello All,
I am getting an ABAP short dump error when loading data into an InfoCube. I tried to load data from PSA and from an ODS.
I ran the compression of the InfoCube; no improvement, I still get the same error.
Error Analysis:
A RAISE statement in the program "SAPLRSDU_PART" raised the exception.
Internal Notes:
The termination occurred in the function "ab_ifune" of the SAP Basis system, specifically in line 2316 of the module "//bas/620/src/krn/runt/abfunc.c#18".
The internal operation just processed is "FUNE".
Thanks in advance!
Siv

Hello Siv,
Try to run program SAP_DROP_EMPTY_FPARTITIONS to display the existing partitions, then compare them with the number of uncompressed requests; maybe there's a mismatch.
SAP Notes that might help: 385163, 590370.
Regards,
Marc
SAP NetWeaver RIG, US BI
Entry in RSMDATASTATE is initial although there is data in InfoCube
Dear SDN,
When we check two custom InfoCubes with transaction RSRV, the log shows the following description:
Status of data in InfoCubes (ZDP_PTA)
Entry in RSMDATASTATE is initial although there is data in InfoCube.
I checked the table RSMDATASTATE for this InfoCube, and the entries are:
INFOCUBE ZDP_PTA
TECHOK 0
QUALOK 0
ROLLUP RSVD DUAL 0
ROLLUP 0
COMPR AGGR 0
DMEXIST 0
DMALL 0
TMSTMP ROLLUP 0
TMSTMP QUALOK 0
LAST CHANGE 20090105193710
COMPR 0
COMPR DUAL 0
AGGREXIST
DMDELTAACTIVE
LOADALWD X
TIMDIM INCON
ODS TIME CONS
ARCH DONE 0
ARCH TODO 0
Any idea?
Regards
Alf

Hi,
I guess that without compression this table does not have data.
Please check the link below.
https://forums.sdn.sap.com/click.jspa?searchID=21007587&messageID=6576166
Hope it helps.
Thanks & Regards,
Ramnaresh.P.
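As a side note, the RSMDATASTATE entry can also be read programmatically rather than via SE16; a minimal sketch using the table and key field shown in the entries above:

```abap
* Read the RSMDATASTATE entry for one InfoCube (sketch).
DATA: ls_state TYPE rsmdatastate.

SELECT SINGLE * FROM rsmdatastate INTO ls_state
  WHERE infocube = 'ZDP_PTA'.
IF sy-subrc <> 0.
  " no entry exists at all for this InfoCube
ENDIF.
```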
Date Variant for Shipment Cost document archiving
Hi,
I want to archive the shipment cost documents in my system for a particular date range.
When I go to transaction SARA and enter the object name SD_VFKK, it doesn't show me the date variant.
Whereas when I archive shipments (object SD_VTTK), a date variant is available.
Can anyone suggest a setting that needs to be made in order to use the date variant for the object SD_VFKK?
Thanks
Gaurav Manocha

Hi Gaurav,
Actually, for object SD_VFKK, when you go to transaction SARA and click on the write variant, you don't have a date option there. You can archive only on the basis of the shipment cost document number.
If you give me the details, I can see what you are actually looking for.
Regards
Sirfraz
How to load data into an InfoCube in BW 7.0
Hi,
I am loading data into an InfoCube from R/3 (ECC 6.0) to BW (7.0). The details are as follows.
Steps in R/3:
1) I have created a customized table in R/3 and loaded data into it.
2) Using transaction RSO2, I have created a data source.
3) I am able to extract data using RSA3 in R/3.
Steps in BW (7.0):
1) I have replicated the data source in BW (7.0).
2) Created an InfoCube in BW (7.0).
Please let me know the steps required to load the data from the data source into the InfoCube.
Thank you,
-Near

There are a few ways to do this. Let me explain the quickest and supposedly most efficient way.
1. Right-click the cube and choose Create Transformation.
2. You will be prompted to choose the source object type and object; in your case, the data source.
3. The system will try to map the objects (similar to update rules / transfer rules) and give a message "so many rules have been created". Check and activate.
4. Right-click the cube and choose Create DTP (a DTP is similar to an InfoPackage but has different functions) and choose your source and target.
5. Choose the update mode (delta, full, etc.).
6. Activate the DTP and execute it.
Ravi Thothadri
Request (PSA) was NOT posted successfully to data target (InfoCube)
In our daily load process chain, I am getting an error message for "PSA to InfoCube":
Request was NOT posted successfully to data target (InfoCube)
Message no. RSMPC163
Please, what exactly is the problem and what is the solution?
Rajesh Giribuwa

Hello Rajesh,
This request fails when updating from PSA to the InfoCube.
Please check the update rules for errors.
Try to find the exact error message and explore the tabs in the monitor for this job.
Hope this helps.
Thanks,
BIA test: suggestion for creating enormous test data for InfoCubes
Hi,
I need to create an enormous amount of data for an InfoCube in order to test the BIA.
I don't think my Basis team will allow me to load data from production to the sandbox, and moreover production has fewer than 250 million records.
Any suggestion for 250 million records?

Hello,
Just an idea: if you can load a minimum of data into your cube, then it is easy.
You just have to generate a datasource from your cube and then create some update rules that loop over your data.
a. Generate the export datasource.
b. Map it to your InfoSource.
c. Define some update rules to ensure your data will be different, for example in the start routine.
Hope it helps.
Patrice
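Step c above could look like the following 3.x start-routine sketch. DATA_PACKAGE comes from the generated routine frame; the characteristic overwritten to keep the copies distinct (DOC_NUMBER) is only an assumption - pick any field of your InfoSource that is safe to change:

```abap
* Blow up each incoming record c_copies times to generate test volume.
CONSTANTS: c_copies TYPE i VALUE 1000.

DATA: ls_data   LIKE LINE OF DATA_PACKAGE,
      lt_blowup LIKE TABLE OF ls_data,
      l_counter TYPE i.

LOOP AT DATA_PACKAGE INTO ls_data.
  DO c_copies TIMES.
    ADD 1 TO l_counter.
    " overwrite one characteristic so the copies stay distinct rows
    ls_data-doc_number = l_counter.
    APPEND ls_data TO lt_blowup.
  ENDDO.
ENDLOOP.

" replace the incoming package with the multiplied data
DATA_PACKAGE[] = lt_blowup.
```

With a 250,000-record seed load and c_copies = 1000, this already yields the 250 million records mentioned above.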
How to view data into Infocube
Good Morning everybody.
I would like to know if somebody has any document with information about RSA1, for example "How to create a backup InfoCube?" or "How to view data in an InfoCube?"
If you can send me
dacampos at br.ibm.com
Thanks a Lot
Daniel Campos - IBM Brazil.

Hi,
A backup cube is used for saving the data of a planning area.
Steps for creating a backup InfoCube:
1. First create a data source from the planning area: double-click on the PA, then go to Extras, where you find the option for creating a data source; then replicate the created data source.
2. Go to RSA1, activate the created data source and create an InfoPackage under the data source.
3. Create an InfoCube under an InfoArea.
4. Create a transformation between the data source and the InfoCube.
5. Activate the data transfer process (DTP).
To view the data in an InfoCube:
1. Right-click the selected InfoCube.
2. Go to Manage.
3. Select the request and the InfoCube type (in "Selectable Data Targets for Administration") and go to Contents.
4. Select the required fields in the field selection for output.
5. Go back and increase the number of hits if required.
6. Execute it. You will see the data in the selected fields.
I hope this will be helpful for you.
Regards
Sujay
What are the risks when doing a Delete / Re-load of Data in InfoCube?
Hi,
What are the risks/issues when doing a complete delete / reload of data in an InfoCube?
Rgds,
Mark.

We are introducing a new key figure to the InfoCube, and this KF needs historical data.
The only way I thought this could be achieved was by deleting the data from the InfoCube and reloading it (it gets loaded from R/3 with no ODS in between).
Is this the only way, or are there other options in a BW 3.5 environment?