Regarding workflow archiving

Hi all,
We are in the process of starting workflow archiving. The scenario is as follows: we are adding a dedicated application server to the existing setup due to the huge data volume, and we plan to run the archiving jobs on that particular server only.
We ran a couple of test jobs. The initial job triggers two jobs: one SUB job (ARV_IDOC_SUB, program RSARDISP) and one Write job (ARV_IDOC_WRI, program RSEXARCA).
We were able to move the SUB job to the desired server, but not the Write job, which gets triggered after the SUB job. This Write job does the actual archiving and consumes the most time and space.
Is there any option to move only this Write job to the desired server? We also need the SUB job, which in turn updates the infostructure.
Please suggest.
Thanks
Thirumalai
Thanks
Thirumalai

Hi Thirumalai,
You cannot assign archiving jobs directly to a particular server, but you can assign them to a server group. Create a server group containing the application servers on which you would like to run the archiving jobs, and then in transaction SARA->Customizing->Cross-Archiving Object Customizing->Technical Settings, provide the server group name. I believe this technique only works for jobs released through SARA.
As for updating infostructures: why do you need to run a job to update the infostructure repeatedly? Once the infostructure is activated, it is updated automatically during the delete job.
Hope this helps,
Naveen

Similar Messages

  • Appropriate forum to put queries regarding data archiving

    Hi All,
    Which one is the appropriate forum to put queries regarding data archiving?
    Thanks in advance.
    Vithalprasad

    Yes, you can use this forum; there is also a separate forum for Data Transfers.
    Regards,
    Altaf Shaikh

  • Regarding data Archiving

    Hi All,
    My client needs to archive data in SAP 8.8 each year, having only 1 year completed. Can we archive data less than 3 years old?
    Thanks in advance

    There will not be any technical problem if the archiving is done as per the instructions set by SAP (I suggest you go through the Data Archiving session from SAP). What I meant was that any data less than 3 years old may still be considered new data, and once it is archived you have no way to modify anything in it, as it becomes read-only. If the client is looking for extra space on the server through data archiving, you can suggest other steps as well. Again, as I said, it is the client's call after all.
    regards
    johnson

  • Regarding force archiving

    Dear all,
    When I need to take a backup of the latest archive log file, which command of the two below is better?
    1. ALTER SYSTEM ARCHIVE LOG CURRENT;
    2. ALTER SYSTEM SWITCH LOGFILE;
    Do both commands give the same result?
    Regards,
    Charan

    Hi Charan;
    ALTER SYSTEM SWITCH LOGFILE switches to the next logfile, irrespective of whether the database is in ARCHIVELOG or NOARCHIVELOG mode. If the database is in ARCHIVELOG mode, the switch will also generate an archive of the redo log that was switched.
    ALTER SYSTEM ARCHIVE LOG CURRENT switches the current log and archives it, as well as all other unarchived logs. It can be issued only against a database that is in ARCHIVELOG mode.
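    For a backup script, the practical difference (not spelled out above) is when the statement returns. A minimal sketch of the two commands side by side:

    ```sql
    -- ALTER SYSTEM ARCHIVE LOG CURRENT waits until the switched log (and any
    -- other unarchived logs) have been fully archived before returning, so
    -- the newest archive log file is guaranteed to exist on disk afterwards.
    ALTER SYSTEM ARCHIVE LOG CURRENT;

    -- ALTER SYSTEM SWITCH LOGFILE returns as soon as the log switch happens;
    -- the archiver process copies the old log in the background, so a backup
    -- started immediately afterwards may miss the newest archive log.
    ALTER SYSTEM SWITCH LOGFILE;
    ```

    This is why ARCHIVE LOG CURRENT is generally the safer choice immediately before backing up the latest archive log.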
    Source: Diff between switch logfile and archive log current -- http://download.oracle.com/docs/cd/B10501_01/server.920/a96519/backup.htm
    Regards,
    Helios

  • Regarding data archiving for SD objects.

    Hi All,
    Can you please help me to learn data archiving of SD objects.
    I am very new to SDN and this is my first thread.

    Hi Hariprasad,
    You can find a lot of information relating to archiving SD data in SAP Help.
    Here is the link you need
    Archiving SD data: http://help.sap.com/saphelp_erp2005vp/helpdata/en/d1/90963466880c30e10000009b38f83b/frameset.htm
    Hope this helps
    Cheers!
    Samanjay

  • Problem regarding SAP Archives

    Hi,
    I've a problem on R/3 4.6c production server.
    Sequence of archive is 100014 and name of file is PRDARCHARC00014_0544984730.001.
    The parameter log_archive_format is ARC%S_%R.%T.
    I think Oracle doesn't manage to build the %S with 6 digits and drops the first digit of the sequence.
    Is there a way to solve that ?
    Thank you,
    Alexandre

    Hi!
    Directly taken from the Oracle 10g online documentation. Take care of case sensitivity:
    LOG_ARCHIVE_FORMAT is applicable only if you are using the redo log in ARCHIVELOG mode. Use a text string and variables to specify the default filename format when archiving redo log files. The string generated from this format is appended to the string specified in the LOG_ARCHIVE_DEST parameter.
    The following variables can be used in the format:
    %s log sequence number
    %S log sequence number, zero filled
    %t thread number
    %T thread number, zero filled
    %a activation ID
    %d database ID
    %r resetlogs ID that ensures unique names are constructed for the archived log files across multiple incarnations of the database
    Using uppercase letters for the variables (for example, %S) causes the value to be fixed length and padded to the left with zeros. An example of specifying the archive redo log filename format follows:
    LOG_ARCHIVE_FORMAT = 'log%t_%s_%r.arc'
    Neither LOG_ARCHIVE_DEST nor LOG_ARCHIVE_FORMAT have to be complete file or directory specifiers themselves; they only need to form a valid file path after the variables are substituted into LOG_ARCHIVE_FORMAT and the two parameters are concatenated together.
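    Applied to Alexandre's case, one possible fix (a sketch only, assuming the instance uses an SPFILE) is to switch the zero-filled %S to the variable-length %s, so sequence numbers above 99999 are no longer truncated to the fixed pad width. LOG_ARCHIVE_FORMAT is a static parameter, so the change only takes effect after an instance restart:

    ```sql
    -- Sketch, not a verified fix: use variable-length %s (with lowercase
    -- %r/%t) so the full sequence number 100014 appears in the file name
    -- instead of being cut to the fixed zero-filled width.
    ALTER SYSTEM SET log_archive_format = 'ARC%s_%r.%t' SCOPE = SPFILE;
    -- Takes effect after the next instance restart.
    ```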

  • Question regarding flash archives

    Quick question to all
    Is there a maximum size a flash archive can be when being used by jumpstart over NFS?

    For Solaris10 11/06 Release:
    "The default copy method that is used when you create a Solaris Flash archive is the cpio utility. Individual file sizes cannot be over 4 Gbytes. If you have large individual files, you can create an archive with the pax copy method. The flarcreate command with the -L pax option uses the pax utility to create an archive without limitations on individual file sizes. Individual file sizes can be greater than 4 Gbytes."
    http://docs.sun.com/app/docs/doc/819-6398
    This info for other versions/releases will be in the install guide.
    John

  • To Karl Petersen: Regarding Another Archived Question

    Hello Mr. Petersen:
    I have been reading your posts on this topic, and I have a question regarding combining many frames into a QT movie. I do not have QT Pro, and am trying to do this through iMovie 6. Here is the deal:
    I have a series of about 2800 frames stored in a folder on my hard drive, which I would like to make into a .mov file. They are currently BNP files. I imported them in order into iMovie in the timeline, and now they are showing up individually as five-second clips. I will attach a screenshot of the timeline. When double-clicking on each clip, I can't adjust the time. I am trying to make each clip into one frame of a movie.
    Thank you so much.
    C h r i s t o p h
    Here is the photo of the timeline:

    Duration Slider? When I double click on the image I can't change the time. Is there a different way?
    C h r i s t o p h

  • PI 7.11  AAE- Archiving and deletion

    Dear All,
    I just want to get confirmation of my understanding regarding AAE archiving and deletion.
    We have 98% of interfaces using only the AAE in our landscape.
    I have set up XMLDAS archiving for the AAE (after fixing many issues), which archives the AAE messages into the local file system. So no issues there.
    Now my requirements are,
    1) I would like to keep only 7 days of data in the database and 100 days of data in the file system.
    2) I do not want to set up any rules for archiving/deletion; that is, I would like to archive all messages older than 7 days to the file system.
    The steps are ( My understanding )
      1) Set the persistence duration in NWA to 7 days, so only 7 days of data will be stored in table BC_MSG.
      2) Set up the archiving job in background processing through RWB without any rules, so all processed messages older than 7 days will be archived to the file system. The archiving job comes with a default delete procedure as well, hence archived messages will be deleted from table BC_MSG.
      3) Deactivate the delete job. If I have an active delete job in background processing, it will delete all messages older than 7 days, which I don't want. The deletion of messages from BC_MSG will be taken care of as part of the archiving job.
    Is my understanding correct?
    Thanks
    Rajesh

    Hello, does anyone know what they did to delete the files from the file system after 100 days?
    We have the same problem: we deleted some messages manually from the AIX file system, but they still appear in the RWB, and when we click on them we get an error: ...... java.lang.Exception: XML DAS GET command failed with ErrorCode: 598; ErrorText: 598 GET: Error while accessing resource; response from archive store: 598 I/O Error java.io.FileNotFoundException: /usr/sap/PID/javaarch/archive/usr/sap/pid/x
    Is there any way to delete messages without generating errors/inconsistencies?

  • How to use the FOR ALL ENTRIES clause while fetching data from archived tables

    How can I use the FOR ALL ENTRIES clause while fetching data from archived tables using the FM /PBS/SELECT_INTO_TABLE?
    I need to fetch data from an Archived table for all the entries in an internal table.
    Kindly provide some inputs for the same.
    thanks n Regards
    Ramesh

    Hi Ramesh,
    I have a query regarding accessing archived data through PBS.
    I have archived SAP FI data (object FI_DOCUMNT) using the SAP standard process through transaction SARA.
    Now please tell me whether I can access this archived data through the PBS add-on FM '/PBS/SELECT_INTO_TABLE'.
    Do I need to do something else to access data archived through the SAP standard process or not? If yes, then please tell me, as I am not able to get the data using the above FM.
    The call to the above FM is as follows :
    CALL FUNCTION '/PBS/SELECT_INTO_TABLE'
      EXPORTING
        archiv     = 'CFI'
        option     = ''
        tabname    = 'BKPF'
        schl1_name = 'BELNR'
        schl1_von  = belnr-low
        schl1_bis  = belnr-low
        schl2_name = 'GJAHR'
        schl2_von  = gjahr-low
        schl2_bis  = gjahr-low
        schl3_name = 'BUKRS'
        schl3_von  = bukrs-low
        schl3_bis  = bukrs-low
        clr_itab   = 'X'
      TABLES
        i_tabelle  = t_bkpf
      EXCEPTIONS
        eof        = 1
        OTHERS     = 2.
    " The unused optional parameters (schl4_*, max_zahl, schl*_in) are
    " omitted here; passing them empty is a syntax error, and the original
    " call also listed OTHERS twice in EXCEPTIONS.
    It gives me the following error :
    Index for table not supported ! BKPF BELNR.
    Please help ASAP.
    Thanks and Regards
    Gurpreet Singh

  • Accessing a file in Imported Archive from Adapter module

    Hi,
    I am designing a module for File/FTP and Mail adapters. Is it possible to retrieve data from a CSV or TXT file uploaded in the Imported Archive from the Java code in the module? If this is possible, do I use the same approach as accessing the CSV data from the mapping using a UDF? If this is not possible, can you suggest other ways were I can access a CSV or TXT configuration file from an adapter module?
    I would like to avoid using Module key, Parameter name and Parameter value as I would like to make the adapter module generic and the data I will be reading might be too much to be specified in this location. However, I would use the Module key, Parameter name and Parameter value to specify the CSV or TXT filename which the adapter module will be using.
    Thanks in advance for any help that you can provide.
    Regards,
    Elbert

    Imported archives are part of the mapping flow, and adapter modules are more a part of routing. Therefore I don't think an imported archive can be made accessible anywhere outside mapping.
    "but my CSV or TXT file would be updated regularly by the developer."
    So were you planning to import this file again and again under the imported archive? This doesn't seem to be a good solution when you think about changing the Design part in a Production environment. It would be better to give the developer access to a certain folder to put the file in, and to access it there from your code. You may refer to this:
    /people/sundararamaprasad.subbaraman/blog/2005/12/09/making-csv-file-lookup-possible-in-sap-xi
    Regards,
    Prateek

  • Re: Archiving of purchase orders

    Hi all,
    Can I delete purchase orders by means of archiving?
    Please provide details regarding the archiving object.
    Thanks in advance
    Sowmya

    You can archive purchase orders using the archiving object MM_EKKO. Refer to SAP Help for more details.
    http://help.sap.com/saphelp_47x200/helpdata/en/8d/3e5c48462a11d189000000e8323d3a/frameset.htm
    Hope this helps.
    Cheers,
    Samanjay

  • COPA archiving

    Hi Experts
    I am having few questions regarding COPA archiving objects:
    1. What are the differences between archiving objects COPA1_xxxx and COPAA_xxxx/COPAB_xxxx? I have found the SAP recommendation that using archiving object COPA1_xxxx is not recommended, but there is no other explanation of why to use COPAA_xxxx and COPAB_xxxx instead.
    2. Is it possible to archive with the COPAC_xxxx archiving object (archiving of segments) if I use COPA1_xxxx?
    Thanks
    Barbora

    Dear Barbora,
    When an operating concern (xxxx) is generated in CO-PA, the following archiving objects are generated:
    • COPA1_xxxx for the accrued operating concern
    • COPAA_xxxx
    • COPAB_xxxx
    • COPA2_xxxx for account-based Profitability Analysis
    • COPAC_xxxx for profitability segments
    Archiving objects COPAA_xxxx and COPAB_xxxx have replaced archiving object COPA1_xxxx.
    Although it is still possible to use the archiving object COPA1_xxxx, SAP recommends that you only use the new archiving objects, as they are the standard archiving objects used now. For example, Customizing activity is available only for the new archiving objects in IMG in newer versions.
    Also, if you implement SAP Note 383728 you can use the generated archiving objects COPA1_xxxx and COPA2_xxxx to archive Profitability Analysis objects from tables CE4xxxx.
    When a line item in Profitability Analysis is updated:
    • an entry is inserted in table CE1xxxx,
    • a newly formed results object is entered in table CE4xxxx, and
    • the related totals record is updated in table CE3xxxx.
    As of SAP R/3 4.5 you have an additional table called CE4xxxx_ACCT. It contains the detailed account assignment information and can grow a lot faster than the actual database table CE4xxxx. For more information see SAP Note 199467.
    Best Regards,
    Kaushik

  • Data archival and purging for OLTP database

    Hi All,
    Need your suggestion regarding a data archival and purging solution for an OLTP database.
    Currently, we are planning to generate flat files from the tables before purging the inactive data, move them to tapes/disks for archiving, and then purge the data from the system. We have many retention requirements and conditions governing the archival of data, so partitioning alone is not sufficient.
    Is there any better approach to archival and purging than this flat-file approach?
    thank you.
    regards,
    vara

    user11261773 wrote:
    Hi All,
    Need your suggestion regarding data archival and purging solution for OLTP db.
    currently, we are planning to generate flat files from table..before purging the inactive data and move them to tapes/disks for archiving then purge the data from system. we have many retention requirements and conditions before archival of data. so partition alone is not sufficient.
    Is there any better approach for archival and purging other than this flat file approach..
    thank you.
    regards,
    vara
    FBDA (Flashback Data Archive) is the better option. Check the below link:
    http://www.oracle.com/pls/db111/search?remark=quick_search&word=flashback+data+archive
    Good luck
    -- neeraj
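    Neeraj's FBDA suggestion can be sketched as follows. This is only an illustrative outline, assuming Oracle 11g or later; the archive name orders_fda, tablespace fda_ts, and table orders are hypothetical placeholders, not names from the thread:

    ```sql
    -- Create a Flashback Data Archive with a one-year retention policy
    -- (names are placeholders).
    CREATE FLASHBACK ARCHIVE orders_fda
      TABLESPACE fda_ts
      RETENTION 1 YEAR;

    -- Attach it to a table: old row versions are then captured automatically
    -- and purged once they age past the retention period.
    ALTER TABLE orders FLASHBACK ARCHIVE orders_fda;

    -- Historical rows stay queryable without any flat-file handling:
    SELECT *
      FROM orders
      AS OF TIMESTAMP (SYSTIMESTAMP - INTERVAL '90' DAY);
    ```

    Whether this can replace the flat-file approach depends on the retention rules: FBDA keeps history inside the database, so it avoids the export/reload effort but does not move data off to tape.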

  • Archive Purchase Order in SRM 5.0 @ extended classic scenario

    Dear SRM Guru,
    I use SRM 5.0 in the extended classic scenario. How can I archive purchase orders with limit orders using standard SAP?
    Thanks.
    Regards,
    Kim

    Hi Kim,
    Your question is quite an open one, so I will try to answer with different flavours - hopefully there is a right one for you.
    In case you'd like to archive across system boundaries: since the ECC PO is a copy of a leading PO in SRM, you might expect that you can somehow archive both at once. This does not happen - you have to archive both separately.
    If you look up notes 726509 (SRM archiving: Archiving of R/3 documents with SRM) and 723685 (SRM: History after the archiving of R/3 documents) you may figure out more.
    There was an issue in archiving where the work items were not archived with reference to the SRM document, but this is resolved via a note as well. Search for related notes with 'Archiving' + 'SRM' in the service portal.
    If you want to figure out more about the archiving dependencies, you can look at the archiving transaction SARA resp. the archiving programs in SE38: BBP_DP_ARCH_* (CHECK, DELETE, WRITE, SET_STATUS).
    Regards,
    Richard
