Data archiving: BC_ARCHIVE, FI_SL_DATA, IDOC, COPA1_OCGM

Hi Experts,
Could you please guide me in finding:
1) the archive index for the objects BC_ARCHIVE, FI_SL_DATA, IDOC, and COPA1_OCGM?
2) which of the info structures should be used for the above archiving objects?
BC_ARCHIVE  ->  SAP_BC_ARCHIVE1, SAP_BC_ARCHIVE2
COPA1_OCGM  ->  SAP_DRB_PA1OCGM, SAP_DRB_PA2OCGM, Z_COPA_OCGM_01
FI_DOCUMNT  ->  SAP_FI_DOC_002
FI_SL_DATA  ->  SAP_FI_SL_001, SAP_FI_SL_002
IDOC        ->  SAP_DRB_IDOC001, SAP_IDOC_001

Back to your archive object list:
BC_ARCHIVE
Typically this object only helps to 'clean up' the archiving history. It is very unusual to use it, because afterwards auditors (including tax auditors) cannot be sure what has been archived. So I suggest not archiving BC_ARCHIVE. If you do archive it, just create your SAP AS (Archive Information System) structure on demand for a specific question (the volume will be low anyway).
IDOC
Usually there is no need for data access after archiving, so I suggest on-demand SAP AS usage upon a specific and unusual request.
For FI_SL_DATA and CO-PA archiving, check the OSS notes, where you'll find some predefined SAP AS structures that may help you access the data in a convenient way (not only with SARI, but also via the archive 'data source' integrated into some transactions). You'll have to test the usage depending on your data volume. Starting from this, you may create additional SAP AS structures or look at a third-party solution such as PBS Software.
Depending on your SAP release, check the COPAx_yyyy archiving objects (x = A, B, or C; yyyy = OCGM in your case), which may be the replacements for COPAz_xxxx (z = 1 or 2).
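If you do need occasional access to such archived data without a permanent info structure, the Archive Development Kit (ADK) read function modules can be used to read the files sequentially. The following is only a minimal sketch, assuming the standard ADK calls ARCHIVE_OPEN_FOR_READ, ARCHIVE_GET_NEXT_OBJECT, ARCHIVE_GET_TABLE and ARCHIVE_CLOSE_FILE; it uses the IDOC object and its EDIDC control records as an example, and file selection, authorizations and error handling are left out.

REPORT zread_archived_idocs.

DATA: lv_handle TYPE i,                          " ADK archive handle
      lt_edidc  TYPE STANDARD TABLE OF edidc,
      ls_edidc  TYPE edidc.

" Open the archive files of the archiving object for sequential reading
CALL FUNCTION 'ARCHIVE_OPEN_FOR_READ'
  EXPORTING
    object         = 'IDOC'
  IMPORTING
    archive_handle = lv_handle
  EXCEPTIONS
    OTHERS         = 1.
CHECK sy-subrc = 0.

DO.
  " Position on the next archived data object; leave when none is left
  CALL FUNCTION 'ARCHIVE_GET_NEXT_OBJECT'
    EXPORTING
      archive_handle = lv_handle
    EXCEPTIONS
      end_of_file    = 1
      OTHERS         = 2.
  IF sy-subrc <> 0.
    EXIT.
  ENDIF.

  " Extract the IDoc control records (EDIDC) of this data object
  CALL FUNCTION 'ARCHIVE_GET_TABLE'
    EXPORTING
      archive_handle   = lv_handle
      record_structure = 'EDIDC'
    TABLES
      table            = lt_edidc
    EXCEPTIONS
      end_of_object    = 1
      OTHERS           = 2.

  LOOP AT lt_edidc INTO ls_edidc.
    WRITE: / ls_edidc-docnum, ls_edidc-mestyp, ls_edidc-credat.
  ENDLOOP.
ENDDO.

CALL FUNCTION 'ARCHIVE_CLOSE_FILE'
  EXPORTING
    archive_handle = lv_handle.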
T.Julien.

Similar Messages

  • Data Archiving - System Prerequisites

    Hi,
    We are planning to archive some of the tables to reduce the TCO (Total Cost of Ownership).
    In this regard, I would like to know the following:
    On the Basis side, I want to check for any prerequisites (such as minimum SP level, kernel version, notes to be applied, etc.).
    Is there any document that clearly describes these prerequisites for preparing the system so that the archiving work can be carried out without any issues?
    (Note: We are not using the ILM solution for archiving.)
    I am mostly concerned with the SAP Notes that are considered to be prerequisites.
    Best Regards
    Raghunahth L
    Important System information :
    Our system version is as follows :
    System  -> ERP 2005  (Production Server)
    OS      -> Windows 2003
    DB      -> Oracle 10.2.0.2.0
    SPAM    -> 7.00 / 0023
    Kernel  -> 7.00 (133)
    Unicode -> Yes
    SP Info:
    SAP_BASIS 700(0011)
    SAP_ABA 700(0011)
    PI_BASIS 2005_1_700(0003)
    ST-PI 2005_1_700(0003)
    SAP_BW 700(0012)
    SAP_AP 700(0008)
    SAP_HR 600(0003)
    SAP_AP 700(0008)
    SAP_HR 600(0013)
    SAP_APPL 600(0008)
    EA-IPPE 400(0008)
    EA-DFPS 600(0008)
    EA-HR 600(0013)
    EA-FINSERV 600(0008)
    FINBASIS 600(0008)
    EA-PS 600(0008)
    EA-RETAIL 600(0008)
    EA-GLTRADE 600(0008)
    ECC-DIMP 600(0008)
    ERECRUIT 600(0008)
    FI-CA 600(0008)
    FI-CAX 600(0008)
    INSURANCE 600(0008)
    IS-CWM 600(0008)
    IS-H 600(0008)
    IS-M 600(0008)
    IS-OIL 600(0008)
    IS-PS-CA 600(0008)
    IS-UT 600(0008)
    SEM-BW 600(0008)
    LSOFE 600(0008)
    ST-A/PI 01J_ECC600(0000)
    Tables we are planning to archive
    AGKO, BFIT_A, BFIT_A0, BFO_A_RA, BFOD_A, BFOD_AB, BFOK_A, BFOK_AB, BKPF, BSAD, BSAK, BSAS, BSBW, BSE_CLR, BSE_CLR_ASGMT, BSEG_ADD, BSEGC, BSIM, BSIP, BSIS, BVOR, CDCLS, CDHDR, CDPOS_STR, CDPOS_UID, ETXDCH, ETXDCI, ETXDCJ, FAGL_BSBW_HISTRY, FAGL_SPLINFO, FAGL_SPLINFO_VAL, FAGLFLEXA, FIGLDOC, RF048, RFBLG, STXB, STXH, STXL, TOA01, TOA02, TOA03, TOAHR, TTXI, TTXY, WITH_ITEM, COFIP, COFIS, COFIT, ECMCA, ECMCT, EPIDXB, EPIDXC, FBICRC001A, FBICRC001P, FBICRC001T, FBICRC002A, FBICRC002P, FBICRC002T, FILCA, FILCP, FILCT, GLFLEXA, GLFLEXP, GLFLEXT, GLFUNCA, GLFUNCP, GLFUNCT, GLFUNCU, GLFUNCV, GLIDXA, GLP0, GLPCA, GLPCP, GLPCT, GLSP, GLT0, GLT3, GLTP, GLTPC, GMAVCA, GMAVCP, GMAVCT, JVPO1, JVPSC01A, JVPSC01P, JVPSC01T, JVSO1, JVSO2, JVTO1, JVTO2, STXB, STXH, STXL, TRACTSLA, TRACTSLP, TRACTSLT
    in addition we have some Z Tables to be archived.

    Hi,
    Whether you need OSS notes or BC sets as prerequisites depends on the programs used for the write, delete, and read steps. We search for OSS notes in scenarios such as:
    - there are no proper selection criteria in the write variant;
    - the program terminates due to long processing times;
    - the percentage of data archived for your selection is low even though the data meets the minimum criteria;
    - the system allows users to change archived data.
    If SAP has released BC sets, then we implement those. If you have a problem while archiving and, based on your archiving experience, you think that some OSS notes will help, then take a call on implementing them.
    From the tables you have mentioned, I can say that archiving objects such as FI_DOCUMNT, FI_SL_DATA, EC_PCA_ITM, EC_PCA_DATA, CHANGEDOCU, and ARCHIVELINK will be relevant.
    You have to search for the latest released BC sets or OSS notes for your system and application.
    -Thanks,
    Ajay

  • What is data archiving and DMS (Data Management System) in SAP

    What is data archiving and DMS (Data Management System) in SAP?
    Welcome to SCN. Before posting questions please search for available information here and in the web. Please also read the Rules of Engagement before further posting.
    Edited by: kishan P on Aug 31, 2010 1:06 PM

    Hi,
    Filtering at the IDoc Level
    Identify the filter object (BD59)
    Modify the distribution model
    Segment Filtering
    specify the segments to be filtered (BD56)
    The Reduced IDoc Type
    Analyze the data.
    Reduce the IDoc type (BD53)
    Thanks and regards.

  • Archive or Delete Idocs

    Hi gurus,
    We are trying to come up with a plan for BW IDocs. I have been asked to analyze:
    (1) Should we delete or archive (pros and cons)?
    (2) If we delete, how many months of data should we delete, and at what frequency?
    (3) If we archive, how many months should we archive, and at what frequency?
    Could you give some insight into this, or tell me where I should begin my analysis?
    Ankit

    Hi,
    Please check these links for your perusal.
    Performance Aspects of Data Archiving:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/91573601-0b01-0010-08bb-f2b38db0c52c
    On tape archiving frequency:
    http://help.sap.com/saphelp_nw04/helpdata/en/d1/a9f87f4b8a11d189510000e829fbbd/frameset.htm
    Thanks

  • HR PA Data Archiving

    Hi,
    We are undertaking an archiving project for the HR module. For PD data we can use object BC_HROBJ, and for PCL4 data we can use PA_LDOC. What about the 2000-series infotype data, such as PA2001, PA2002, PA2003, etc.? Because all changes to these infotypes are stored in PCL4 cluster LA, we only need to purge the data from these tables. What is the best way to purge PA2xxx data? We cannot use transaction PA30/PA61 to delete records, because user-exit edits prevent any action against old data beyond a certain time frame.
    Thanks very much,
    Li

    This is not directly related to SAP NetWeaver MDM. You may find more information about data archiving on SAP Service Marketplace at http://www.service.sap.com/data-archiving or http://www.service.sap.com/slo.
    Regards,
    Markus

  • Business Partner Data Archiving - Help Required

    Hi,
    I am new to data archiving and need help archiving Business Partners in CRM. I tried to archive some BPs, but nothing was archived. Kindly throw some light on it.
    The problems we face are:
    1) When we try to write a Business Partner to an archive file, the job log shows 'finished', but no files are created in the system.
    2) When we try to delete the BPs from the database, it doesn't show the archived BPs that could be deleted (I guess this is because there are no archive files).
    The archiving object used is CA_BUPA. We have created a variant and set the time as Immediate and the spool to LOCL.
    Kindly let us know if there is any step we are missing here and whether we are on the wrong track.
    All answers are appreciated.
    Thanks,
    Prashanth
    Message was edited by:
            Prashanth KR

    Hi,
    To archive a BP, the following steps are to be followed.
    A. Mark the BP to be archived in transaction BP > Status tab.
    B. Run DACHECK.
    FYI: steps on how to perform DACHECK:
    1. In transaction DACONTROL, change the following parameters to these values for CA_BUPA:
       No. Calls/Packages    1
       Number of Objects    50
       Package Size RFC     20
       Number of Days        1 (if you use the mobile client it should be more)
    2. If a business partner should be archived directly after the archiving flag was set, this can be done by resetting the check
       date with report CRM_BUPA_ARCHIVING_FLAGS.
       Here, only check (set) the options:
       - Archiving Flag
       - Reset Check Date
       All other reset options should be unchecked.
    3. Go to DACHECK and run the process.
    4. This will change the status of the BPs to 'Archivable'.
       *Only those BPs which are not involved in any business transactions, installed base, or Product Partner Range (PPR) will be set to archivable.
       The BPs with status 'Archivable' will be used by the archiving run.
    *** Kindly refer to note 646425 before going to step C. ***
    C. Then run transaction CRM_BUPA_ARC.
    - Make sure that the "selection program" in transaction DACONTROL is maintained as "CRM_BUPA_DA_SELECTIONS".
      Also create a variant, which can be done by executing CRM_BUPA_DA_SELECTION, and enter the variant's name in the Variant field. This variant is the business partner range.
    - Also please check note 554460.
    Regards, Gerhard

  • How to extract data from custom made Idoc that is not sent

    Hi experts,
    Could you please advise whether there is a way to extract data from a custom-made IDoc (it collects a lot of data from different SAP tables)? Please note that this IDoc is not sent, as the target system is not fully maintained.
    As of now, we would like to verify what data is being extracted.
    Any help would be appreciated!

    Hi,
    The fields that belong to each segment have their lengths given in table EDSAPPL. How you have to map them is explained in the example below.
    Suppose that for SEGMENT1, EDSAPPL has 3 fields, so the entries are:
    SEGMENT          FIELDNAME           LENGTH
    SEGMENT1         FIELD1                   4
    SEGMENT1         FIELD2                   2
    SEGMENT1         FIELD3                   2
    Data in EDID4 would be as follows
    IDOC           SEGMENT                          APPLICATION DATA
    12345         SEGMENT1                        XYZ R Y
    When you extract data from these tables into your internal table, the mapping has to be as follows:
    FIELD1 = APPLICATIONDATA+0(4)        to read first 4 characters of this field, because the first 4 characters in this field would belong to FIELD1
    Similarly,
    FIELD2 = APPLICATIONDATA+4(2).
    FIELD3 = APPLICATIONDATA+6(2).  
    FIELD1 would have XYZ, FIELD2 = R, FIELD3 = Y
    This would remain true in all cases. So all you need to do is identify which fields you want to extract, and simply code as above to extract the data from this table.
    Hope this was helpful in explaining how to derive the data.
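    To make the offset logic above concrete, here is a minimal ABAP sketch. The segment name Z1SEG1 and the 4/2/2 field lengths are hypothetical; in a real program you would read the lengths from EDSAPPL for the segment in question. The application data field of EDID4 is SDATA.

    REPORT zread_custom_idoc_data.

    TYPES: BEGIN OF ty_seg1,
             field1 TYPE c LENGTH 4,
             field2 TYPE c LENGTH 2,
             field3 TYPE c LENGTH 2,
           END OF ty_seg1.

    DATA: lt_edid4 TYPE STANDARD TABLE OF edid4,
          ls_edid4 TYPE edid4,
          ls_seg1  TYPE ty_seg1.

    PARAMETERS p_docnum TYPE edid4-docnum OBLIGATORY.

    " Read the data records of the IDoc that belong to the (hypothetical) segment
    SELECT * FROM edid4
      INTO TABLE lt_edid4
      WHERE docnum = p_docnum
        AND segnam = 'Z1SEG1'.

    LOOP AT lt_edid4 INTO ls_edid4.
      " Split the flat application data (SDATA) by the field lengths
      " maintained in EDSAPPL: 4 + 2 + 2 characters in this example
      ls_seg1-field1 = ls_edid4-sdata+0(4).
      ls_seg1-field2 = ls_edid4-sdata+4(2).
      ls_seg1-field3 = ls_edid4-sdata+6(2).
      WRITE: / ls_seg1-field1, ls_seg1-field2, ls_seg1-field3.
    ENDLOOP.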

  • Data archiving for Write Optimized DSO

    Hi Gurus,
    I am trying to archive data in a write-optimized DSO.
    It allows me to archive on a request basis, but it archives entire requests in the DSO (meaning all the data).
    However, I want to select a range of requests to archive (my own selection of requests).
    Please guide me.
    I got the details below from SDN. Kindly check.
    Archiving for Write optimized DSOs follow request based archiving as opposed to time slice archiving in standard DSO. This means that partial request activation is not possible; only complete requests have to be archived.
    The characteristic for the time slice can be a time characteristic present in the WDSO, or the request creation date/request loading date. You are not allowed to add additional InfoObjects for semantic groups; the default is 0REQUEST & 0DATAPAKID.
    The actual process of archiving remains the same i.e
    Create a Data Archiving Process
    Create and schedule archiving requests
    Restore archiving requests (optional)
    Regards,
    kiruthika

    Hi,
    Please check the OSS note below:
    http://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/oss_notes/sdn_oss_bw_whm/~form/handler%7b5f4150503d3030323030363832353030303030303031393732265f4556454e543d444953504c4159265f4e4e554d3d31313338303830%7d
    -Vikram

  • Invoice List data(VF21/VF22) as IDoc

    Hi,
    We have a typical scenario of converting invoice list data in SAP into an IDoc and transmitting it as XML to a third-party system. In the data to be transmitted, sub-totalling at the material level (in the invoice items) and at the material + invoice date level is required, and the sub-totals are to be populated in two kinds of segments (one for each sub-total type) in the IDoc.
    That is, a new IDoc structure will be defined with the segments/fields that the customer wants, viz. a header segment with only a few fields from the invoice list header, an item segment with only a few fields from the invoice list item, and a sub-total segment with two fields, each holding the sub-total at the material and material + date level respectively.
    We have output types which, when assigned in the invoice list, should create the required IDocs.
    Invoice lists will be created for several partners, but each list will have only one partner.
    What are the steps to be carried out in order to achieve this objective? Please elaborate on the FM parameters to be provided and the NACE options to be selected.
    Vish.

    Hello,
    Your requirement is quite simple. However, before posting on SDN, please do a proper search on IDocs.
    Now, coming to your requirement: as of now you have the invoice data available.
    1. In transaction NACE, create an output type linked to the respective partner data.
    2. Create segments with the required fields and set their status to released.
    3. Create the IDoc type and the message type, and link the message type and the segments.
    4. Now link the message type and the IDoc type.
    5.
    6. Now assign a process code to the partner function by defining a logical system, and in the process code enter your custom function module, in which you put the custom code that filters and cumulates the data coming from the invoice into the different fields using normal ABAP code.
    Once you are done with the ABAP coding: the function module has a few parameters which you can simply pass through as they are; control_in needs to be passed to control_out unchanged.
    Now fill the segments with the required data and append the segment data to the EDI_DATA structure, which is an exporting parameter of the function module (see the sketch below).
    Once you execute the invoice output for this partner function and message type, the system will internally call the custom function module that you have assigned to the process code and will create an IDoc.
    Also go through the saptechnical.com site for additional details on IDoc creation.
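    As a rough illustration of such a custom function module, here is a hedged sketch. The interface follows the usual message-control pattern for outbound IDoc function modules (the post refers to the parameters as control_in, control_out and EDI_DATA); the segment names Z1INVH/Z1INVS, their structures and the hard-coded values are hypothetical placeholders for your own cumulation logic.

    FUNCTION z_idoc_output_invlist.
    " Interface (defined in SE37), following the standard outbound pattern:
    "   IMPORTING  object             LIKE nast
    "              control_record_in  LIKE edidc
    "   EXPORTING  control_record_out LIKE edidc
    "   TABLES     int_edidd          STRUCTURE edidd

      DATA: ls_edidd TYPE edidd,
            ls_invh  TYPE z1invh,    " hypothetical flat header segment structure
            ls_invs  TYPE z1invs.    " hypothetical flat subtotal segment structure

      " Pass the control record through unchanged, as described above
      control_record_out = control_record_in.

      " Header segment: a few fields from the invoice list header
      ls_invh-vbeln   = object-objky.   " invoice list number from the NAST key
      ls_edidd-segnam = 'Z1INVH'.
      ls_edidd-sdata  = ls_invh.
      APPEND ls_edidd TO int_edidd.

      " Subtotal segment: one record per material (your cumulation logic goes here)
      ls_invs-matnr   = 'MAT-0001'.     " placeholder for the cumulated values
      ls_invs-sumval  = '100.00'.
      ls_edidd-segnam = 'Z1INVS'.
      ls_edidd-sdata  = ls_invs.
      APPEND ls_edidd TO int_edidd.
    ENDFUNCTION.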

  • SAP Data Archiving in R/3 4.6C and ECC 6.0.

    Hi Guys,
        Need some suggestions,
    We are currently working on SAP R/3 4.6 and have plans to upgrade it to ECC 6.0.
    In the meantime, there is a requirement for SAP data archiving to reduce the database size and increase system performance. So I wanted to know whether it is better to do data archiving before the upgrade or after, and which is technically more comfortable. I also wanted to know whether any more advanced methods are available in ECC 6.0 compared to SAP R/3 4.6.
    Please provide your valuable suggestion.
    Thanks and Regards
    Deepu

    Hi Deepu,
    With respect to the archiving process, there will be no major difference between a 4.6 and an ECC 6.0 system. However, you may get more advantages in an ECC 6.0 system because of archive routing and upgraded write and delete programs (whether a program is upgraded depends on your current program in the 4.6 system). For example, in a 4.6 system the MM_EKKO write program RM06EW30 archives purchasing documents based on the company code in the selection criteria, and there is no preprocessing functionality. In ECC 6.0 you can archive by purchasing organization in the selection criteria, and the preprocessing functionality additionally helps with archiving POs.
    If you archive documents in 4.6 and later upgrade to ECC 6.0, the SAP system will still allow you to retrieve the archived data.
    With this, I can say that archiving after the upgrade to the ECC 6.0 system will be better with respect to the archiving process.
    -Thanks,
    Ajay

  • Vendor Master Data Archiving

    Hi,
    I want to archive vendor master data. I found program SAPF058V, which produces the 'FI, Vendor Master Data Archiving: Proposal List'.
    This program checks whether, and which, vendor master data can be archived. After running this program for one of the vendors, I am getting the error message "Links stored incompletely". Can someone please help with this?
    Thanks...Narendra

    Hi,
    Check whether you have an entry in table FRUN for PRGID 'SAPF047'. Another option is to set the parameter 'FI link validation off' with the flag on (i.e. value 'X'). Check the help for this flag; most likely this vendor is linked to a customer number. Perhaps you should try to delete this link in the customer and vendor master data first.
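    A minimal check for that FRUN entry could look like the sketch below (assuming, as stated above, that FRUN is keyed by PRGID):

    REPORT zcheck_frun.

    DATA lv_prgid TYPE frun-prgid.

    " Does an entry for program SAPF047 exist in table FRUN?
    SELECT SINGLE prgid FROM frun
      INTO lv_prgid
      WHERE prgid = 'SAPF047'.

    IF sy-subrc = 0.
      WRITE / 'FRUN entry for SAPF047 found - the link program has already run.'.
    ELSE.
      WRITE / 'No FRUN entry for SAPF047 - run SAPF047 first to build the links.'.
    ENDIF.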
    I hope this helps you,
    Regards,
    Eduardo

  • Can I add multiple tables to a single Flashback Data Archive

    Hi ,
    I want to add multiple tables of a schema to a single Flashback Data Archive. Is this possible?
    I don't want to create multiple flashback archives, one for each table.
    Also, can I add an entire schema or a tablespace to a flashback archive that has already been created?
    Thanks,
    Sachin K

    Is adding multiple tables to a single flashback archive feasible in terms of space? Yes, it is. Multiple tables can share the same policies for data retention and purging. Moreover, since an FBDA consists of one or more tablespaces (or subparts thereof), multiple FBDAs can be constructed, each with a different but specific retention period. This means it's possible to construct FBDAs that support different retention needs. Here are a few common examples:
    90 days for common short-term historical inquiries
    One full year for common longer-term historical inquiries
    Seven years for U.S. Federal tax purposes
    20 years for legal purposes
    How is the space for archiving data to a flashback archive generally estimated? See "Flashback Data Archiver Process (FBDA)": http://docs.oracle.com/cd/E18283_01/server.112/e16508/process.htm#BABGGHEI
    FBDA automatically manages the flashback data archive for space, organization, and retention. Additionally, the process keeps track of how far the archiving of tracked transactions has occurred.
    It (FBDA) wakes up every 5 minutes by default (*1), checks tablespace quotas every hour, creates history tables only when it has to, and rests otherwise if not called to work. The process adjusts the wakeup interval dynamically based on the workload.
    (*1) Observations contradicting MOS Note 1302393.1 were made: the note says that FBDA wakes up every 6 seconds as of Oracle 11.2.0.2, but a trace on 11.2.0.3 showed:
    WAIT #140555800502008: nam='fbar timer' ela= 300004936 sleep time=300 failed=0 p3=0 obj#=-1 tim=1334308846055308
    5.1 Space management
    To ease the space management for FDA we recommend following these rules:
    - Create an archive for different retention periods,
      to optimally use space and take advantage of automatic purging when retention expires.
    - Group tables with the same retention policy (share FDAs),
      to reduce the maintenance effort of multiple FDAs with the same characteristics.
    - Dedicate tablespaces to one archive and do not set quotas,
      to reduce the maintenance/monitoring effort; quotas are only checked every hour by FBDA.
    http://www.trivadis.com/uploads/tx_cabagdownloadarea/Flashback-Data-Archives-Rechecked-v1.4_final.pdf
    Regards
    Girish Sharma

  • A question on different options for data archival and compression

    Hi Experts,
    I have a production database of about 5 terabytes in size and about 50 GB in development/QA. I am on Oracle 11.2.0.3 on Linux. We have RMAN backups configured. I have a question about data archival strategy. To keep the OLTP database size optimal, what options can be suggested for data archival?
    1) Should each table have an archival strategy?
    2) What is the best way to archive data - should it be sent to a separate archival database?
    In our environment, an archival strategy is defined for only about 15 tables. For these tables, we copy their data every night to a separate schema meant to store this archived data and then transfer it eventually to a different archival database. For all other tables, there is no data archival strategy.
    What are the different options and best practices that can be reviewed to put a practical data archival strategy in place? I will be most thankful for your inputs.
    Also, what are the different data compression strategies? For example, we have about 25 tables that are read-only. Should they be compressed using the default Oracle 9i basic compression (alter table compress)?
    Thanks,
    OrauserN

    You are using 11g, and in 11g you can compress read-only as well as read-write tables. Both read-only and read-write tables are always candidates for compression. This will save space and could increase performance, but always test it first. This was not an option in 10g. Read the following docs.
    http://www.oracle.com/technetwork/database/storage/advanced-compression-whitepaper-130502.pdf
    http://www.oracle.com/technetwork/database/options/compression/faq-092157.html

  • SAP Data Archiving in ECC 6.0, MS SQL Server 2008

    We are using an MS SQL Server 2008 database, and these days our production database is growing quite fast. I want to set up data archiving using transaction DB15. Can anyone give me the steps to set up data archiving using DB15?
    Edited by: Ahsan's on Jul 19, 2010 11:44 AM

    Archiving data is bound to legal requirements depending on the country and the company you work for. Setting up archiving is not just a matter of following a few steps; you'd also need a system where you can put the archived data.
    Markus

  • What is the major use of TAANA in data archiving?

    People say that TAANA can give the distribution profile of the table entries over different table fields.
    I do not see how this can help us with data archiving.
    Please help.
    Thanks a lot!

    Hi Jennifer,
    I use TAANA all the time to help determine what data is in a particular table.  For example, I use it to try and determine how many records there are for a certain document type, per fiscal year, etc.  This information can then be used to determine how many archive runs you will need.  If there are millions of rows for a fiscal year, then you will more than likely want to break that up into months, or quarters (depending on volume).
    Hope this helps.
    Best Regards,
    Karin Tillotson
