InfoCube archiving

Hi everyone,
we have a field in our cube that indicates whether a sales order position is closed in the ERP system and will not be changed anymore (KPOSSTAT = 'X').
In the cube this might look like:

DOC_NUMBER   S_ORD_ITEM   CALYEAR   KPOSSTAT
9970001741   100          2000
9970001741   100          2012
9970001741   100          2013      X
9970001741   200          2000
9970001741   200          2012
9970001741   200          2013
What I want to do now is archive all sales order position records that are older than 01.01.2014 and where the position was closed at some point in time before 01.01.2014 - so basically all records for position 100. If I use KPOSSTAT = 'X' and CALYEAR <= 2014 in my archiving process selection profile, I will archive only the third record for position 100. If I select only on CALYEAR <= 2014, I will also archive position 200, which is not wanted.
Is this way of archiving possible at all?
Help would be appreciated.
Alexander Oertel

Hi Alexander,
Assuming that the filter in the DTP is < 2014:
You need to create a start routine that first collects, over your SOURCE_PACKAGE, all entries where KPOSSTAT EQ 'X'. Then, in a LOOP over the package, execute a READ TABLE on DOC_NUMBER and S_ORD_ITEM and pass only the matching records on to your archiving.
I hope this helps.
Regards.
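To make the idea concrete, the two-pass logic could look roughly like this in a start routine. This is only a sketch: the structure of SOURCE_PACKAGE, the field names (DOC_NUMBER, S_ORD_ITEM, CALYEAR, KPOSSTAT) and the type lengths are assumptions based on the example above and must be adapted to the actual transformation.

* Sketch only - adapt field names and types to your transformation.
* Pass 1: collect every position that was closed before 2014.
* Pass 2: keep only the records that belong to a closed position.
TYPES: BEGIN OF ty_pos,
         doc_number TYPE c LENGTH 10,
         s_ord_item TYPE n LENGTH 6,
       END OF ty_pos.

DATA: lt_closed TYPE SORTED TABLE OF ty_pos
                     WITH UNIQUE KEY doc_number s_ord_item,
      ls_pos    TYPE ty_pos.

FIELD-SYMBOLS: <ls_src> LIKE LINE OF SOURCE_PACKAGE.

* Pass 1: remember all positions closed before 2014
LOOP AT SOURCE_PACKAGE ASSIGNING <ls_src>
     WHERE kposstat = 'X' AND calyear < 2014.
  ls_pos-doc_number = <ls_src>-doc_number.
  ls_pos-s_ord_item = <ls_src>-s_ord_item.
  INSERT ls_pos INTO TABLE lt_closed.
ENDLOOP.

* Pass 2: drop every record whose position was never closed
LOOP AT SOURCE_PACKAGE ASSIGNING <ls_src>.
  READ TABLE lt_closed TRANSPORTING NO FIELDS
       WITH TABLE KEY doc_number = <ls_src>-doc_number
                      s_ord_item = <ls_src>-s_ord_item.
  IF sy-subrc <> 0.
    DELETE SOURCE_PACKAGE.
  ENDIF.
ENDLOOP.

With the example data, all records of position 100 would stay in the package and go to the archive, while all records of position 200 would be dropped.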

Similar Messages

  • Infocube archived records still show up on query

    Hello everyone!
    I have archived an Infocube. To test this, I've created a test query upon it, to see if the old (archived) records would show up.
    And they do!!
Before you state the obvious: I have successfully run the deletion job (at least according to SM37). Besides, when I visualize the data via LISTCUBE, the old data does not show up.
    On the other hand, when I fetch the number of records of the fact table (via SE16, /BIC/F...), I get the very same value as I did before the archiving session!
    Any thoughts?
    Best Regards,
    Luís Andrade.

    Hi,
    can you try the following in sequence:
    1. try re-deletion from tcode SARA.
    2. drop & reload indexes.
    3. reconstruct DB stats.
    4. run re-org jobs (imp if you are on Oracle DB).
    5. re-generate the queries.
    --Akashdeep
    Edited by: AKASHDEEP BANERJEE on Oct 14, 2010 11:00 PM

  • Error when write data object in archiving (BW)

    Hi:
    I am archiving an InfoCube, and the process is cancelled due to an error.
    The error is:
    Error E/S en fichero de archivo (I/O error in archive file)
    BW02sapmntBWPArchivingBWP_BW_0_BWCZRT_C05_20             BA        024
    Error al grabar un objeto de datos (Error while writing a data object)                                                    RSARCH      215
    El job ha sido cancelado tras excepción de sistema ERROR_MESSAGE. (The job was cancelled after system exception ERROR_MESSAGE.)                                     00        564
    Please help me in this issue.
    Regards,
    Ernesto.

    Hi Ernesto:
    First you should try to write the message in English, mi amigo! That way you will get answers; en español muy difícil que alguien te responda (in Spanish it's very unlikely anyone will answer you)! ;-D
    Can you post the SM21 error message, and is there any short dump (ST22)?
    Regards,
    Federico

  • How to track changes on Infoprovider?

    Hi Gurus,
    Could you please provide the steps for tracking changes on an InfoProvider? I had 4 InfoCubes (belonging to the same InfoArea) that contained compressed data - but now when I try to display the data, all the cubes are empty, although no data were archived.
    I would like to track any actions that were made on those cubes.
    Thanks

    thanks Kumar for helping me out here
    I tried the RSD_CUBE_LOG_DELE function module - it came back with "no logs found in the database".
    I tried tables RSDRDLOGHEADER and RSDDSTATDELE, which provide deletion information such as user name. But since we partially archived those specific InfoProviders in the same period in which I think everything disappeared from the cubes, I am not able to tell when, and by whom, the data that were not archived were deleted from the InfoCubes.
    Under Manage InfoCube --> Archiving tab, all the archiving requests that were created cover only certain dates (below 2006). I know that we had data above 2006.
    Kindly provide additional inputs.
    Thanks

  • Unable to create Archiving for non-cumulative InfoCube

    Dear Gurus,
    I'm working with BI in SAP NW 2004s with SAPKW70016 (Support Package 16) and need to set up archiving on a non-cumulative InfoCube (a copy of 0RT_C36), but I can't use transaction RSDAP. Although I found note 1056294, the following message is sent:
    Cannot create data archiving process for ZRT_C36
    Message no. RSDA109
    Diagnosis
    You can only create a data archiving process for standard DataStore objects and standard InfoCubes. You cannot create a data archiving process for other InfoProviders such as VirtualProviders.
    In particular, non-cumulative InfoCubes and write-optimized DataStore objects are not currently supported.
    Please let me know what is wrong, or how I can do this archiving in 2004s. I also read some documentation about transaction RSDV but couldn't find how to implement it using transaction RSDAP.
    Thanks in advance.
    Best regards,
    Pilar Infantas.

    Please, any feedback about this issue?

  • Archiving and reload possible for non-cumulative infocube 0IC_C03 ?

    Hello,
    Can the cube 0IC_C03 be archived and reloaded into another cube so that the data can be accessed via a MultiProvider? I read in help (http://help.sap.com/saphelp_nw04/helpdata/en/8f/da1640dc88e769e10000000a155106/content.htm) that 'You cannot use the InfoCube together with another InfoCube with non-cumulative key figures in a MultiProvider'.
    While archiving seems doable, I have doubts about the reload. How do we report on the data once it is archived from this cube? Any ideas?
    Thanks

    hi
    please check the below link.
    http://help.sap.com/saphelp_nw04s/helpdata/en/42/ead8d7b55e1bd2e10000000a11466f/frameset.htm
    Regards,
    madhu

  • Archiving BI.7.0: Must data in InfoCube be compressed before archiving run?

    Hi Expert
    I have created a Data Archiving Process (DAP). It is executed by an archiving process variant in a process chain.
    Archiving an InfoCube works fine when all requests are compressed.
    However, when there are requests in the InfoCube that aren't compressed, I get the following error:
    07.11.2008 14:47:07 Data area to be archived is not completely compressed yet I RSDA 148
    07.11.2008 14:47:07 Exception condition  in row 105 of include CL_RSDA_ARCHIVING_REQUEST=====CM015 (program CL_RSDA_ARCHIVING_REQUEST=====CP) I RSDA 140
    I cannot find it stated anywhere that all requests have to be compressed, and I cannot find any OSS notes about this subject either.
    Have you got any experience in this matter?
    Thanks in advance and kind regards,
    Torben

    Hi,
    First of all, i want to thank you for your help!
    but i have a question about a case:
    Loading Date   Loading N°   FISCYEAR   Order N°   Turnover
    31/12/2007     1            2007       Order 1     100
    01/01/2008     2            2008       Order 1     -50
    02/01/2008     3            2007       Order 1      60
    The loading date corresponds to the Request Date.
    In our case, we have compressed the data whose request date is < 2008, and we want to archive the data with FISCYEAR = 2007.
    In this case, we will find loading 1 in the E-table, and loadings 2 & 3 in the F-table.
    The goal of the 3rd loading is to recover data which corresponds to the year 2007 and had not been received by the SAP system.
    For example, in my case, data (which come directly from the stores) are sent to SAP every day, but due to a problem, we didn't receive the data.
    So, the 3rd loading allows us to recover those data.
    How do I archive all the data corresponding to FISCYEAR = 2007, knowing that I do not have all of it in requests whose loading date is < 2008?
    Thanks for your help.
    Salah Lamaamla

  • Archiving Infocube through Process Chain...

    Hi All,
    I need help in creating a process chain for archiving an InfoCube. I am able to archive the InfoCube manually, but not through a process chain.
    Is it possible to archive an InfoCube through a process chain? If yes, please give the steps to create a process chain for archiving.
    Thanks in advance.
    Bandana.

    Hi,
    It is possible to archive data from an InfoCube via a process chain.
    Have a start process followed by 'Archive Data from an InfoProvider'. The trick lies in the variants used for the archiving steps of the chain.
    Create a process by dragging in the 'Archive Data...' process, give the variant a name, and use this variant for writing the archive file. Choose your archiving process (the same archiving process you created to archive data from the InfoCube). As this is the write phase, do not check the 'Continue Open Archiving Requests' checkbox, and choose option '40 Write Phase Completed Successfully' under 'Continue Process Until Target Status'. Now enter the required selection conditions. In case you want to reuse this chain, give a relative value under the 'Primary Time Restriction' tab and save this variant. This is your variant 1.
    Now drag in the same archiving process and create another variant. In this one you need to select 'Continue Open Archiving Requests' and choose option '70 Deletion Phase Confirmed and Request Completed' from the dropdown list; this variant deletes the data from the InfoCube. This is your variant 2.
    So now you have a process chain: Start > Archiving Process with Variant 1 (to write the archive file) > Archiving Process with Variant 2 (to delete the data from the cube).
    That's it!
    Regards
    Edited by: Ellora Dobbala on Apr 8, 2009 5:28 PM

  • Archiving only compressed InfoCubes?

    Hello Colleagues,
    We have an InfoCube with 5 million records. If I try to select a date range for archiving, I always get the error that the request is not compressed. I could not find in any document a record size above which compression is required. Is it a general restriction which is never mentioned in any document? Does anybody have experience with archiving and has also had this problem?
    Thanks for the help
    Henning

    Hi,
    It is recommended to archive only fully compressed InfoCubes, as you want to ensure that you do not archive data from requests while they are being loaded.
    In addition, if your InfoCube has a non-cumulative key figure, there is a hard check to ensure all requests are compressed before an archiving job can execute.
    I hope this helps,
    Mike

  • How to backup an infocube with archiving

    hi gurus
    can anyone explain how we can back up an InfoCube without archiving.
    thanks in advance

    Hi Madhusudan,
    Do you need to backup the cube with or without archiving?? the subject line and post conver different messages
    <u>Without archiving:</u>
    Create a new cube, while creating copy it from the existing cube structure. Lets say this is CUBE2 and the first one is CUBE1.
    Then you need to do a export generate datasource on the cube. In the Infosource tree under Datamart you will find one infosource 8ODS1.
    Create update rules from CUBE2 to CUBE1. Activate the update rule and transfer rules.
    Do a full load from CUBE1 to CUBE2.
    Regards.

  • How can I retrieve data into the InfoCube from archived files

    hi,
    I have archived cube data, and I have to load data into the cube from the archived files.
    Now I want to find the archived files and learn how to load the data back into the cube.
    thanks

    Hi,
    Reloading archived data should be the exception rather than the general case, since data should be archived only if it is not needed in the database anymore. When the archived data target also serves as a datamart to populate other data targets, it is recommended that you load the data into a copy of the original (archived) data target and combine the two resulting data targets with a MultiProvider.
    In order to reload the data into a data target, you have to use the export DataSource of the archived data target. You therefore create an update rule based on the respective InfoSource (technical name 8<data target name>). You then trigger the upload either by using 'Update ODS data in data target' or by replicating the DataSources of the MYSELF source system and subsequently scheduling an InfoPackage for the respective InfoSource.
    If you want to read the data for reporting or control purposes, you have to write a report which reads data from the archive files sequentially.
    Alternatively, you can use the Archiving Information System (AS). This tool enables you to define InfoStructures and create reports based on them. The InfoStructures define an index for the archive file data. At the moment, the archiving process in the BW system does not fill the InfoStructures automatically during the archiving session; this has to be done manually when needed.
    Another way of displaying data from the archive file is the Extractor Checker (transaction RSA3). Enter the name of the export DataSource of the respective data target (name of the data target preceded by '8') and choose the archive files that are to be read. The Extractor Checker reads the selected archive files sequentially. Selection conditions can be entered for filtering but have to be entered in internal format.
    Check this link:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/b32837f2-0c01-0010-68a3-c45f8443f01d
    Hope this helps you.
    Regards,
    Debjani

  • Loading Archived Data into Copied Infocube

    Hi all,
    We archived the old non-cumulative data in the cube 0RT_C03 (Stock Cube). Now we need to load the archived data into a copy of the cube 0RT_C03. The archived data belongs to the year 2007, and there is no marker in the archive; our marker is now at year 2008. As we understood from the "How to Archive in Business Intelligence" document, we also need a marker in the copy cube into which we will load the 2007 data. Since we don't have marker data in the archive file, and it is not obvious from the document, we are not clear about loading the marker data for year 2007. We are now planning to load the marker data from the R/3 system with the extractor 2LIS_03_BX. Could you please comment on whether this is a good idea?
    Thanks in advance
    Mustafa

    hi,
    My suggestion is to load the new cube with a BX init load and compress it as mentioned in the document, then load the archived data, and then move the 0RT_C03 cube data to the new cube.
    Check the links below; hope they help:
    Non Cumulative key figures
    http://help.sap.com/saphelp_bw32/helpdata/en/80/1a62dee07211d2acb80000e829fbfe/frameset.htm
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/93ed1695-0501-0010-b7a9-d4cc4ef26d31
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328
    inventory management
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328
    Ramesh

  • Are data in aggregates deleted when data in a Cube is archived (BI 7.0)?

    Hi all
    Do you know if the deletion run, which is part of the archiving process, will delete data in aggregates as well when data in a given InfoCube is archived?
    Kind regards,
    Torben

    Hi Torben,
    We are following the normal procedure through SARA, and we are not using ADK, so I don't have any idea about ADK.
    Moreover, while scheduling the archiving job, keep in mind that it must not conflict with the loading procedure (which includes rollup, compression, recreating the indexes, etc.); then it won't affect normal processing.
    The only thing is: if you are using ADK, things remain the same, but everything should be taken care of by ADK, I think.
    Thanks,
    Ashok

  • 0FI_GL_4 - How to prevent open items from being archived (SARA)

    Hello,
    We would like to archive data from our 0FI_GL_4 InfoCube using the SARA transaction. We can only specify a date as the criterion for archiving (e.g. the posting date). However, we may still have open items (candidates for archiving based on the posting date) that shouldn't actually be archived because they may be cleared at a later date. I am surprised that SAP doesn't provide a standard mechanism to handle open items in archiving. Has anybody faced this situation? Any solution?
    Many thanks.
    François.

    Hello François,
    In BW archiving there are no business-completeness rules, so GL items that are "open" in R/3 would get archived in BW. Another thing to consider in your BW archiving strategy: when a fiscal year of data is archived, a write protection is placed on that time period. So, if you need to load data into that object, the archived data would need to be reloaded before the load job can run. Then the data would need to be re-archived.
    Hope this helps.
    Best Regards,
    Karin Tillotson
    Edited by: Karin Tillotson on Oct 13, 2008 2:11 PM

  • Nearline storage (read 'archive' in lookups)

    Hi all,
    we need the functionality to read nearline objects in queries, data loading processes, and lookups. Is there any other customer who requests this functionality?
    Regards,
    Adrian

    Hi Adrian,
    At the moment there is no API available for unified access to archived and non-archived data of an InfoProvider (InfoCube or DataStore Object). But there is an API available to access the nearline part of an InfoProvider. The following code example could be used to achieve a unified access to data from the active table of a DataStore Object and its associated nearline part. This example will always return consistent results with only one exception: during the selective deletion phase of an archiving run. In this phase it could happen that a record is retrieved from nearline and from the database table as well.
    If you have further questions, please contact us (the back office) directly.
        Cheers
          SAP NetWeaver 2004s Ramp-Up BI Back Office Team    
    TYPE-POOLS:
      rs,
      rsd.
    DATA:
      adjprov     TYPE rsadjprov,
      infoprov    TYPE rsinfoprov,
      r_cursor    TYPE REF TO if_rsda_cursor,
      r_dta       TYPE REF TO cl_rsd_dta,
      r_nlprov    TYPE REF TO cl_rsda_nearline_provider,
      r_selset    TYPE REF TO cl_rsmds_set,
      r_universe  TYPE REF TO cl_rsmds_universe,
      t_selfields TYPE if_rsdai_nearline_connection=>t_field_selections,
      t_data      TYPE STANDARD TABLE OF /bic/asbook_d00 WITH DEFAULT KEY,
      wherecond   TYPE string.
    infoprov  = 'SBOOK_D'.
    wherecond = '/BIC/CARRID = ''LH'' AND /BIC/FLDATE IN (''19940617'',''19960617'',''19980617'',''20000617'')'.
    r_dta = cl_rsd_dta=>factory( infoprov ). "create DTA object for InfoProvider
    * Check if the DS Object has a nearline archive
    IF r_dta->has_adjoint_provider( rsd_c_adjprovtype-nearline ) EQ rs_c_true.
    * Create an instance of the VirtualProvider associated with the nearline archive of the DS Object
      CALL METHOD cl_rsd_dta=>build_adjprov_from_infoprov
        EXPORTING
          i_infoprov    = infoprov
          i_adjprovtype = rsd_c_adjprovtype-nearline
        IMPORTING
          e_adjprov     = adjprov.
      r_nlprov ?= cl_rsda_nearline_provider=>factory( adjprov ).
    * Prepare where condition and selection clause (field list) for the nearline access
      r_universe = r_nlprov->get_universe( ). "this method is not available until SP 8
      r_selset   = r_universe->create_set_from_string( wherecond ).
      CLEAR t_selfields. "request all fields
    * Create and open a cursor for nearline access
      r_cursor = r_nlprov->open_cursor(
                   i_t_selfields = t_selfields
                   i_r_selset    = r_selset ).
      TRY.
    *     Fetch a data package from the nearline table
          CALL METHOD r_cursor->fetch_next_package
            EXPORTING
              i_package_size = 0                        "fetch all data at once
            IMPORTING
              e_t_data       = t_data.
          r_cursor->close( ).
        CATCH cx_rsda_no_more_data.
      ENDTRY.
    ENDIF.
    * Now select data from the active table and
    * append the selected records (UNION ALL) to the result set table
    SELECT * APPENDING TABLE t_data
           FROM /bic/asbook_d00
      WHERE (wherecond).
