How to archive data from COPA tables in ECC 6.0

Hi Experts,
We are using ECC 6.0 and our client wants to archive data from the COPA tables CE1XXXX, CE3XXXX and CE4XXXX. We are not using COPA planning.
I need your suggestions and a step-by-step process to follow to archive the data from the above-mentioned COPA tables. Also, please advise whether any SAP Note needs to be applied for this activity.
Regards,
Chandra

Hi Vijay Chandra
Please check the links below; they give a detailed explanation.
http://help.sap.com/saphelp_afs64/helpdata/en/8d/3e58e2462a11d189000000e8323d3a/content.htm
How to Use SARA Tcode
I would also suggest running the archiving through a batch job if it is to be done on a regular basis (say, monthly or quarterly).
Regards
Minesh
Edited by: mineshshah on Nov 4, 2011 9:02 AM

Similar Messages

  • In PL/SQL, archive data from a table to a file

    I am currently developing a vb app where I need to archive data from a table to a file. I was hoping to do this with a stored procedure. I will also need to be able to retrieve the data from the file for future use if necessary. What file types are available? Thanks in advance for any suggestions.

    What about exporting in an Oracle binary format? The export files cannot be modified. Is there a way to use the export and import utility in PL/SQL?
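    If the database is 10g or newer, the DBMS_DATAPUMP package can drive the export utility directly from PL/SQL. A minimal table-export sketch, assuming a DBA-created directory object DATA_PUMP_DIR and a placeholder table name MY_ARCHIVE_TABLE:
    DECLARE
      h       NUMBER;
      v_state VARCHAR2(30);
    BEGIN
      -- Open a table-mode Data Pump export job
      h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'TABLE');
      -- The dump file goes to a directory object on the server
      DBMS_DATAPUMP.ADD_FILE(handle    => h,
                             filename  => 'my_archive.dmp',
                             directory => 'DATA_PUMP_DIR');
      -- Export only the table being archived
      DBMS_DATAPUMP.METADATA_FILTER(handle => h,
                                    name   => 'NAME_EXPR',
                                    value  => 'IN (''MY_ARCHIVE_TABLE'')');
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.WAIT_FOR_JOB(h, v_state);  -- block until the job finishes
    END;
    /
    The dump file is Oracle's binary format and cannot be edited by hand; the same package, opened with operation => 'IMPORT', reads it back in later.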

  • Moving time-dependent data from one table to another (archiving)

    Hello all
    I would like to know if there's an easier solution or a "best practice" to move data from one table to another. The context of this issue is "archiving".
    More concretely: we have an application that logs information to several tables.
    These tables are growing like crazy, and we would like to keep only "relevant" data in them, so I was thinking about moving data that has been in there for, say, 2 months to "archiving" tables.
    I figured there must be some kind of "best practice" to get this done.
    I have already written a procedure that loops over the table that has the time indicator and inserts the records from the normal tables into the archive tables (deleting the originals afterwards), but it seems to take ages.
    Thanks in advance!
    Message was edited by:
    timschraepen

    This has nothing to do with PL/SQL.
    You can refer to the links below:
    http://www.lc.leidenuniv.nl/awcourse/oracle/server.920/a96524/c12parti.htm
    http://www.stanford.edu/dept/itss/docs/oracle/10g/server.101/b10739/partiti.htm#i1006727
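    If partitioning is not available, a minimal set-based sketch of the same archiving step (table and column names are hypothetical) usually beats a row-by-row PL/SQL loop:
    -- Pin the cutoff to midnight so both statements agree on the same row set
    INSERT INTO app_log_archive
    SELECT * FROM app_log
    WHERE  log_time < ADD_MONTHS(TRUNC(SYSDATE), -2);

    DELETE FROM app_log
    WHERE  log_time < ADD_MONTHS(TRUNC(SYSDATE), -2);

    COMMIT;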

  • How to retrieve archived data from ECC tables

    Hi,
    I need to extract the data from archived ECC tables to BW.
    How can I find out in which tables the archived data has been stored in ECC?
    Thanks for your help in advance.
    Raj.

    Hi Sreenivas,
    I know we can pull the data using the t-code SARA. Please let me know the steps to pull the archived data into the data source.
    Do I need to create a generic data source, or can we pull the data into an existing data source?
    Please let me know the steps, or share a document link if you have one.
    Thanks,
    Raj.

  • FM to get archived data from table BSEG

    Hi,
    What is the FM to get archived data from table BSEG?

    Hi Friend,
    Here is the FM you need,
    FAGL_GET_ARCH_ITEMS_BSEG.
    Regards,
    Lakshmanan

  • Copying a large amount of data from one table to another is getting slower

    I have a process that copies data from one table (big_tbl) into a very big archive table (vb_archive_tbl, 30 million records, partitioned). If there are fewer than 1 million records in big_tbl, the copy to vb_archive_tbl is fast (~10 min) and, more importantly, consistent. However, if there are more than 1 million records in big_tbl, copying the data into vb_archive_tbl is very slow (30 min to 4 hours) and very inconsistent. Every few days the time it takes to copy the same amount of data grows significantly.
    Here's an example of the code I'm using, which uses BULK COLLECT and FORALL INSERT to copy the data.
    I occasionally change 'LIMIT 5000' to see performance differences.
    DECLARE
      TYPE t_rec_type IS RECORD (fact_id    NUMBER(12,0),
                                 store_id   VARCHAR2(10),
                                 product_id VARCHAR2(20));
      TYPE cff_type IS TABLE OF t_rec_type
        INDEX BY BINARY_INTEGER;
      t_cff cff_type;
      CURSOR c_cff IS
        SELECT * FROM big_tbl;
    BEGIN
      OPEN c_cff;
      LOOP
        FETCH c_cff BULK COLLECT INTO t_cff LIMIT 5000;
        EXIT WHEN t_cff.COUNT = 0;  -- guard: FORALL raises an error on an empty batch
        FORALL i IN 1 .. t_cff.COUNT
          INSERT INTO vb_archive_tbl
          VALUES t_cff(i);
        COMMIT;  -- commits once per 5000-row batch
      END LOOP;
      CLOSE c_cff;
    END;
    Thank you very much for any advice
    Edited by: reid on Sep 11, 2008 5:23 PM

    Assuming that there is nothing else in the code that forces you to use PL/SQL for processing, I'll second Tubby's comment that this would be better done in SQL. Depending on the logic and partitioning approach for the archive table, you may be better off doing a direct-path load into a staging table and then doing a partition exchange to load the staging table into the partitioned table. Ideally, you could just move big_tbl into vb_archive_tbl with a single partition exchange operation (see the sketch below).
    That said, if there is a need for PL/SQL, have you traced the session to see what is causing the slowness? Is the query plan different? If the number of rows in the table is really a trigger, I would tend to suspect that the number of rows is causing the optimizer to choose a different plan (with your sample code, the plan is obvious, but perhaps you omitted some where clauses to simplify things down) which may be rather poor.
    Justin
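    A rough sketch of the staging-plus-exchange route described above, assuming vb_archive_tbl is range-partitioned, the structures match, and p_new and vb_stage are hypothetical names:
    -- Build an empty staging table with the source's shape
    CREATE TABLE vb_stage AS
    SELECT * FROM big_tbl WHERE 1 = 0;

    -- Direct-path load into the staging table
    INSERT /*+ APPEND */ INTO vb_stage
    SELECT * FROM big_tbl;
    COMMIT;

    -- Swap the loaded segment into the empty target partition:
    -- a data-dictionary operation, no rows are moved
    ALTER TABLE vb_archive_tbl EXCHANGE PARTITION p_new WITH TABLE vb_stage;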

  • Adding Data From One Table to Another

    Now, this doesn't strike me as a particularly complex problem, but I've either strayed outside the domain of Numbers or I'm just not looking at the problem from the right angle. In any case, I'm sure you guys can offer some insight.
    What I'm trying to do is, essentially, move data from one table to another. One table is a calendar, a simple two-column 'date/task to be completed' affair; the other is a schedule of jogging workouts, i.e., times and distances. Basically, I'm trying to create a formula that copies data from the second table onto the first, but only for odd days of the week, excepting Sundays (and assuming Monday as the start of the week). Now, this isn't the hard part; I can do that. The problem comes when I replicate the formula down the calendar. Even on the days the 'if' statement identifies as an 'even day', the cell reference to the appropriate workout on the second table is incremented, so when it comes to the next 'odd day', it has skipped a workout.
    I can't seem to see any way of getting it to specifically copy the NEXT line in the second table, rather than the corresponding line.
    This began as a distraction to try and organise my running so I could see at a glance what I had to do that day and track my progress, but now it's turned into an obsession. SURELY there's a solution?
    Cheers.

    Hi Sealatron,
    Welcome to Apple Discussions and the Numbers '09 forum.
    Several possible ways to move the data occur to me, but the devil's in the details of how the data is currently arranged.
    Is it
    • a list of three workouts, one for each of Monday, Wednesday and Friday, then the same three repeated the following week?
    • an open-ended list that does not repeat?
    • something else?
    Regards,
    Barry

  • Copying data from one table to another without duplicates

    Good afternoon!
    I am new to Oracle SQL and I have a difficulty.
    I have a script that copies data from one table into another.
    If the target table already contains "Registry 01", the copy must not insert "Registry 01" again.
    Since the table has existed since the beginning of the year before last and already contains duplicated information, I cannot apply a UNIQUE constraint because of the error it raises. I have to enforce this only from now on.
    How can I perform this validation so that data is not duplicated?
    DECLARE
      w_cont NUMBER;
      CURSOR c_simpro IS
        SELECT sc.cd_simpro,
               sc.ds_produto,
               sp.qt_embalagem,
               MAX(sp.dt_vigencia)
        FROM   simpro_cadastro sc,
               simpro_preco sp
        WHERE  sc.cd_simpro = sp.cd_simpro
        GROUP BY sc.cd_simpro,
                 sc.ds_produto,
                 sp.qt_embalagem;
    BEGIN
      FOR r_simpro IN c_simpro LOOP
        w_cont := 0;
        SELECT COUNT(1)
        INTO   w_cont
        FROM   pls_material pm
        WHERE  pm.cd_material_ops = r_simpro.cd_simpro;
        IF w_cont = 0 THEN
          INSERT INTO pls_material
            (nr_sequencia,
             dt_atualizacao,
             nm_usuario,
             dt_atualizacao_nrec,
             nm_usuario_nrec,
             ie_tipo_despesa,
             cd_estabelecimento,
             nr_seq_estrut_mat,
             cd_simpro,
             ds_material,
             ie_situacao,
             ds_material_sem_acento,
             dt_inclusao,
             cd_material_ops_orig,
             cd_unidade_medida,
             cd_material_ops,
             qt_conversao_simpro)
          VALUES
            (pls_material_seq.NEXTVAL,
             SYSDATE,
             'ES-SIMPRO',
             SYSDATE,
             'ES-SIMPRO',
             3,
             1,
             3,
             r_simpro.cd_simpro,
             r_simpro.ds_produto,
             'A',
             r_simpro.ds_produto,
             SYSDATE,
             r_simpro.cd_simpro,
             'un',
             TRIM(TO_CHAR(r_simpro.cd_simpro, '0000099999')),
             r_simpro.qt_embalagem);
          COMMIT;
        END IF;
        IF w_cont > 0 THEN
          UPDATE pls_material p
          SET    p.qt_conversao_simpro = r_simpro.qt_embalagem,
                 p.dt_atualizacao      = SYSDATE
          WHERE  p.cd_simpro = r_simpro.cd_simpro;
          COMMIT;
        END IF;
      END LOOP;
    END;
    Edited by: 983464 on 22/01/2013 10:30

    Hi,
    in addition to what Marwin has already said, I suggest you post CREATE TABLE and INSERT statements (as mentioned in the FAQ).
    The error you are getting from the MERGE command is because you need a way to uniquely identify rows within the table. So it is important to know whether your table has a primary key or unique index whose keys could be used in the MERGE command.
    Additionally, when you post code or output, please enclose it between two lines containing {noformat}{noformat}, i.e.:
    {noformat}{noformat}
    SELECT ...
    {noformat}{noformat}
    Regards.
    Al
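    For reference, a hedged sketch of the MERGE being discussed, assuming cd_simpro uniquely identifies a row in pls_material; the remaining NOT NULL columns from the original INSERT would need to be added to the insert branch:
    MERGE INTO pls_material p
    USING (SELECT sc.cd_simpro,
                  sc.ds_produto,
                  sp.qt_embalagem
           FROM   simpro_cadastro sc,
                  simpro_preco sp
           WHERE  sc.cd_simpro = sp.cd_simpro
           GROUP BY sc.cd_simpro, sc.ds_produto, sp.qt_embalagem) s
    ON (p.cd_simpro = s.cd_simpro)
    WHEN MATCHED THEN UPDATE
      SET p.qt_conversao_simpro = s.qt_embalagem,
          p.dt_atualizacao      = SYSDATE
    WHEN NOT MATCHED THEN INSERT
      (nr_sequencia, dt_atualizacao, nm_usuario, cd_simpro,
       ds_material, ie_situacao, dt_inclusao, cd_material_ops,
       qt_conversao_simpro)
      VALUES (pls_material_seq.NEXTVAL, SYSDATE, 'ES-SIMPRO', s.cd_simpro,
              s.ds_produto, 'A', SYSDATE,
              TRIM(TO_CHAR(s.cd_simpro, '0000099999')), s.qt_embalagem);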

  • Relocation of data from SOFFCONT1 table

    Hi,
    My issue concerns executing the RSIRPIRL program to relocate data from the SOFFCONT1 table to an external system.
    I have relocated the data successfully from the development system to the storage system.
    In the quality system, it asks for change authorization for the table structure while relocating the data.
    Obtaining that authorization in quality, and later in production, will be difficult.
    Questions:
    Have you faced the same type of issue while executing the program RSIRPIRL?
    Did you have change-mode authorization for the table structure while executing it?
    Is there any other alternative to executing this program?
    Thanks in Advance!
    -Thanks,
    Ajay
    Edited by: Ajay Kumar on Feb 2, 2009 4:19 AM

    Hi Ajay,
    I'm really interested in knowing how you accomplished the transfer from SOFFCONT1 to external storage. We're facing a major issue with the size of this table too. We already have an archiving solution for storing SD order output PDFs in the archive storage. What are the config steps required?
    Also, are the attachments still viewable after the SAP document (a sales order, for example) has been archived?
    Looking forward to your reply.
    Thanks.
    Soyab

  • Best tool to archive data from DW to external storage.

    Hi,
    I want to archive the data of a couple of tables in the DW at regular intervals to some external storage. The archived file will be loaded into a relational table again for read-only processing later, when required. These tables' data has a retention period, so data gets deleted once the period expires. The tables are partitioned on the ACTION_DATE column.
    I don't have access to the filesystem on the DW servers, so I would prefer to avoid storing any file on the filesystem.
    We are using Oracle 10.2.
    Following are the three options I can think of:
    1) Use the exp/imp tools to get data from the required tables. I will be running this script daily, so I can take the data of one partition only (a feature supported by exp/imp). I will want to import the data into some other table in the DW (a table with a different name from the original but the same schema) when required sometime later for processing, because of the retention-period issue. But imp does not support this, or at least I do not know of a way (see the sketch after this post for one workaround).
    2) Use the expdp/impdp tools. This helps because it is faster than exp/imp and also has the REMAP_TABLE feature, which can load data into a table with a different name. But the problem with this approach is that expdp/impdp requires/stores the file on the server (in this case the DW servers). As mentioned, I would prefer to avoid this solution, unless there is a convenient way to get the data to the client side and put it back on the server when impdp has to be executed.
    3) Use spool/SQL*Loader. I haven't explored this option a lot yet, but I am concerned about the efficiency of this solution and the complexity of implementing it. Spool would produce the required data in CSV format, to be loaded later using SQL*Loader.
    Please help me with selecting an approach here. If anyone has a better solution to the above problem, please mention it as well.
    Thanking in advance.
    -Akshay.
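    On option 1's reimport restriction: one workaround sketch (all names hypothetical) is to stage a single partition into its own table with partition-extended syntax, then export the staging table, which can be imported under any name later:
    -- Copy one partition's rows into a standalone table for export
    CREATE TABLE dw_stage_2011_10 AS
    SELECT * FROM dw_fact PARTITION (p_2011_10);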


  • Can we move archived data from FileNet to another tool

    Hi Gurus,
    We are currently archiving data from SAP to an archive location in FileNet. Now we are planning to move the archived data to another tool. Please advise whether the following are possible:
    1) Keep FileNet and add another tool as well
    2) Move all data from FileNet to the new tool and then direct all further archiving to the new tool.
    Regards,
    Jeetendra

    Hello,
    I have done this process when we moved our archived data from IBM CommonStore to IXOS/OpenText, as well as when we moved all of our archived images in OpenText from a jukebox to WORM/hard-disk storage (still OpenText). But in both cases we were provided the necessary conversion programs by the relevant vendor (in this case OpenText).
    This process is complex and involves correctly updating multiple internal SAP tables as well as the database entries in the new storage provider.
    My recommendation is to work with the third-party solution partner that the data is going to be moved to and get the appropriate conversion programs from them.
    Best Regards,
    Karin Tillotson

  • How to archive data from BSAK - vendor cleared items

    We are using ECC 6 and I want to archive vendor transaction data from the BSAK table using the transaction SARA.
    What archiving object should I choose?
    After doing the archiving, how can I retrieve the data?
    Best regards,
    Odelia.

    Dear Odelia,
    This is a Basis procedure. Please ask for proper support before executing it.
    Follow the link below to clarify your doubts:
    http://help.sap.com/saphelp_srm30/helpdata/en/15/c9df3b6ac34b44e10000000a114084/content.htm
    Regards

  • Import data from a few tables and export into the same tables on a different db

    I want to import data from a few tables and export it into the same tables on a different database. But on the target database, additional columns have been added to those tables. How can I do the import?
    It's urgent; can anyone please help me with this?
    Thanks.

    Hello Junior DBA,
    maybe try it with the "copy command".
    http://download.oracle.com/docs/cd/B14117_01/server.101/b12170/apb.htm
    Have a look at the section "Understanding COPY Command Syntax".
    Here is an example of a COPY command that copies only two columns from the source table, and copies only those rows in which the value of DEPARTMENT_ID is 30 (see the sketch below).
    Regards
    Stefan
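    The example elided above looks roughly like this (a hedged reconstruction based on the linked doc section; the connect string is a placeholder, and the trailing hyphen is SQL*Plus line continuation):
    COPY FROM hr@remotedb -
    CREATE empcopy2 -
    USING SELECT last_name, salary -
    FROM employees -
    WHERE department_id = 30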

  • MySQL: move data from one table to another

    I was wondering if there is a MySQL command that will let me move selected rows of data from one table to another. Both tables have the same columns and declaration types (one table is actually an archive table for old data).
    Example:
    I want to move all data in Table1 where the date is more than 30 days old to Table2,
    so the result should be: import all rows into Table2 where the date is more than 30 days old, and delete all data from Table1 that is more than 30 days old.
    Currently I'm doing it as three processes:
    1) get all rows that are more than 30 days old
        "SELECT * FROM Table1 WHERE TO_DAYS(NOW()) - TO_DAYS(dateField) > 30"
    2) insert the data into Table2
        while (res.hasNext()) {
            TableData data = ...; // ... get row
            dataList.add(data);
        }
        for (int i = 0; i < dataList.size(); i++) {
            pstm.setString(...); // bind each column (elided in the original)
            pstm.addBatch();
        }
        pstm.executeBatch();
    3) delete data from Table1
        "DELETE FROM Table 1 WHERE  TO_DAYS(NOW()) - TO_DAYS(dateField) > 30"

    "for this app... losing a few rows does not impact how we analyze the data."
    That's what everyone always tells me too. But 99% of the time they come back and want to know why they cannot balance and/or validate the data between two runs taken only minutes apart.
    I've seen people puzzle over data for days, swearing they ran the exact same utility for their tests, when they were in fact using live data and additional data had accrued; since all they had to do was execute the script without parameters (they didn't put in a stop time), they got two different answers, and it always, and I mean always, confuses people. Be safe and put in the option for an end date/time. Then, when they waste days trying to figure out why two different observations gave them different numbers, they cannot blame you (because you gave them the option)!
    My 2 cents for the day...
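    Following that advice, a minimal MySQL sketch that pins the cutoff once so the INSERT and DELETE move exactly the same rows (assumes a transactional engine such as InnoDB):
    START TRANSACTION;
    SET @cutoff = NOW() - INTERVAL 30 DAY;

    -- Archive first, then delete the identical row set
    INSERT INTO Table2
    SELECT * FROM Table1 WHERE dateField < @cutoff;

    DELETE FROM Table1 WHERE dateField < @cutoff;
    COMMIT;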

  • Error while selecting data from external table

    Hello all,
    I am getting the following error while selecting data from an external table. Any idea why?
    SQL> CREATE TABLE SE2_EXT (SE_REF_NO VARCHAR2(255),
           SE_CUST_ID NUMBER(38),
           SE_TRAN_AMT_LCY FLOAT(126),
           SE_REVERSAL_MARKER VARCHAR2(255))
         ORGANIZATION EXTERNAL (
           TYPE ORACLE_LOADER
           DEFAULT DIRECTORY ext_tables
           ACCESS PARAMETERS (
             RECORDS DELIMITED BY NEWLINE
             FIELDS TERMINATED BY ','
             MISSING FIELD VALUES ARE NULL
             (
               country_code      CHAR(5),
               country_name      CHAR(50),
               country_language  CHAR(50)
             )
           )
           LOCATION ('SE2.csv')
         )
         PARALLEL 5
         REJECT LIMIT UNLIMITED;
    Table created.
    SQL> select * from se2_ext;
    SQL> select count(*) from se2_ext;
    select count(*) from se2_ext
    ERROR at line 1:
    ORA-29913: error in executing ODCIEXTTABLEOPEN callout
    ORA-29400: data cartridge error
    KUP-04043: table column not found in external source: SE_REF_NO
    ORA-06512: at "SYS.ORACLE_LOADER", line 19

    It would appear that your external table definition and the external data file do not match up. Post a few input records so someone can duplicate the problem and determine the fix; a sketch of a matching definition follows below.
    HTH -- Mark D Powell --
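    A sketch of a matching definition, assuming the CSV really does hold the four SE_* fields; the field list inside ACCESS PARAMETERS must name the external table's own columns:
    CREATE TABLE se2_ext (
      se_ref_no          VARCHAR2(255),
      se_cust_id         NUMBER(38),
      se_tran_amt_lcy    FLOAT(126),
      se_reversal_marker VARCHAR2(255))
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY ext_tables
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
        (se_ref_no          CHAR(255),
         se_cust_id         CHAR(40),
         se_tran_amt_lcy    CHAR(50),
         se_reversal_marker CHAR(255)))
      LOCATION ('SE2.csv'))
    PARALLEL 5
    REJECT LIMIT UNLIMITED;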
