Archiving: Regarding Archive File in SAP.

Hi,
Description: Once an Archive Write Program is run, the data from the DB tables is written to an archive file.
Problem: Could anyone please explain to me where exactly the archive file is stored, and where exactly archive files can be created by SAP?
Analysis: From my analysis it seems that the archive file is stored on the application server. Are there any other places an archive file can be created?
Thanks and Regards,
Raghavendra Goutham P.

Hi,
Thanks Gopi for your reply.
I have done some more work on this. The archive file creation depends on the customizing in transaction AOBJ, where a logical file name is specified. Based on the customizing of this logical file name in transaction FILE, the archive file creation location is determined.
Can anyone provide their valuable inputs on this topic?
With warm Regards,
Raghavendra Goutham P.

Similar Messages

  • Archiving ZIP files in SAP

    Hi all,
    I have the following requirements :
    1. Archive the ZIP file (containing a couple of files) into the IXOS content server, using an ABAP function call.
    2. Attach the zipped file to an invoice document in SAP.
    Any ideas on how to proceed ?

    Hi Aravind,
    Of course you can create workstation applications in transaction DC30 for .rar or .zip files. Normally you can use any file format in this customizing, and then you can use the workstation application to add these files. For displaying or editing these kinds of files I would recommend using the parameter %AUTO% in DC30 under "Define workstation application in network".
    Regarding the usage of .exe files I'm not sure if this really makes sense in a DMS system.
    Best regards,
    Christoph

  • Regarding Archiving files

    Hi All,
    Consider three folders in the sender application,
    let's say f1, f2, f3.
    XI has to pick the files from folder f1 and archive them in f2 if processing is successful.
    If any errors occur in the files, then those files have to be placed in f3.
    If I select the archive option, it asks for only one directory, where I have given folder f2 for successful messages. But for error messages, where should I specify the f3 folder?
    Do we need to go for adapter module development for this?
    Any suggestions??
    Kalyan.

    Hi,
    In the adapter settings, select "Archive Faulty Source Files" and set the processing mode to "Archive". Then all the messages end up in the archive location; both error and success messages are archived.
    Reward points if it is helpful
    Thanks,
    Madhu

  • Error with Archiving Faulty Source File in Sender FTP adapter

    Hi All,
    I have configured a Sender FTP adapter with Processing Mode as Archive.
    I am able to read the files from the FTP server and archive all the successfully processed files in the PI application server directory XXX/success.
    We need all the error files to be archived on the PI application server as well, in directory XXXXX/Fail, so I have checked the option Archive Faulty Source Files and provided the application directory XXXXX/Fail.
    I have unchecked the option Archive Files on FTP Server.
    But the problem is that when the FTP adapter gets any error files, it is not able to archive them to the application directory XXXXX/Fail.
    In RWB it is showing the error
    Unable to archive faulty input file /data/abc.txt to /XXXX/fail/abc.txt
    Cause: com.sap.aii.adapter.file.ftp.FTPEx: 550 rename: No such file or directory
    This directory exists and works fine when I give this directory name in the Archive directory of the processing mode.
    The sender FTP adapter is trying to archive the file on the FTP server, which I don't want, and that is giving us this error.
    If I give any directory that is present on the FTP server, my error files get archived on the FTP server, but not in the intended archive folder.
    I have already unchecked the option Archive Files on FTP server.
    Please help......
    I am on PI 7.11
    Regards
    Henery H

    Hi Henery,
    Check the below thread, this should help you to fix the issue.
    FTP adapter: exc. 550 : No such file or directory
    Thanks,

  • Archiving faulty Source file not working in Sender Adapter FCC

    Hi Experts,
    I have enabled "Archiving Faulty Source File" in Sender Adapter FCC and pen down the directory path accordingly.
    Likewise I also enable the processed mode as "archive" and give it the direcotory path.
    However when there is a error flagged in sxmb_moni for this interface, I unable to see any file created in the error folder but I can see a file with timestamp being created in the archive folder.
    I have checked the access right to the directory, so this is not an issue. I ran through the forum on this subject and come across the help.sap note on the following
    " To archive source files where a permanent error occurred during processing, set the indicator.
    A permanent error occurs either during the conversion of the file content, or in a module in the module processor.
    More information: MessageTransformBean, Migrating Dispatcher Classes
    ○       Specify the Directory for Error Archiving.
    ○       To add a time stamp to the archived file, select Add Time Stamp. "
    What is the definition of a "permanent error"? The error I got in sxmb_moni is a mapping conversion error, so it should archive the file to the error folder, right?
    Anyone have any such setting enabled and working ?
    Regards
    FNG

    The error I got in sxmb_moni is a mapping conversion error, so it should archive the file to the error folder, right?
    No, that is not the case. As mentioned on the SAP Help site, for the faulty file to be archived, the error has to occur in content conversion or in the module processor.
    If the error you are getting is in MONI, it means the file is syntactically correct and hence the adapter engine has picked it up and sent it to the integration engine (SXMB_MONI).
    -Supriya.

  • Issue while archiving the processed file in sender communication channel using SFTP adapter

    Hi All,
    In one of my scenarios (File to IDoc), we are using an SFTP sender communication channel.
    We are facing an issue while archiving the processed file. Sometimes PI processes the file successfully but is unable to archive it, and in the next poll PI processes and archives the same file again, which creates duplicate orders in ECC.
    Please let us know how to resolve this issue.

    Hi Anil,
    Refer to the archiving concepts in the links below.
    http://help.sap.com/saphelp_nw73/helpdata/en/44/682bcd7f2a6d12e10000000a1553f6/content.htm?frameset=/en/44/6830e67f2a6d12e10000000a1553f6/frameset.htm
    http://scn.sap.com/docs/DOC-35572
    Warm Regards,
    DNK Siddhardha.

  • How to enable the Archive faulty source file in File adapter

    Dear XI expert,
    I want to enable Archive Faulty Source Files, so that when the exchange step detects any error, the source file is moved to the ERROR folder.
    I've already applied patch XI ADAPTER FRAMEWORK CORE 7.00 SP10 but haven't seen anything about this function.
    Best regards,
    Kobsak

    Hi,
    Check out this link for info on how to archive the file.
    http://help.sap.com/saphelp_nw04/helpdata/en/14/80243b4a66ae0ce10000000a11402f/frameset.htm
    Regards,
    Vikram

  • Would you like to help me regarding archiving, please?

    Dear All,
    We're in an archiving project.
    We have a scenario where we need to reload/restore archived data from the BW production server to the BW development server.
    I did try it: I moved the archive file to the known archive storage location in BW development, but the system didn't recognize it.
    My questions:
    1. Is it possible to get this scenario done? How can I do it?
    2. I ran transaction AOBJ and found that there is a reload program (e.g. program SBOOKL). What is it for? Could this program solve my case above?
    Really need your guidance.
    Best regards,
    Niel.

    Data Archiving
    Data Archiving, a service provided by SAP, removes mass data that the system no longer needs online, but which must still be accessible at a later date if required, from the database.
    Data Archiving removes from the database application data from closed business transactions that are no longer relevant for the operational business. The archived data is stored in archive files that can be accessed by the system in read-only mode.
    Reasons for Archiving
    There are both technical and legal reasons for archiving application data. Data Archiving:
    Resolves memory space and performance problems caused by large volumes of transaction data
    Ensures that data growth remains moderate so that the database remains manageable in the long term
    Ensures that companies can meet the legal requirements for data storage in a cost-efficient manner
    Ensures that data can be reused at a later date, for example, in new product development
    Data Archiving Requirements
    Data archiving is intended to do more than simply save the contents of database tables. Data archiving must also take the following requirements into consideration:
    Hardware independence
    Release dependence
    Data Dependencies
    Enterprise and business structure
    Optical Archiving
    The term "optical archiving" generally describes the electronic storage and management of documents in storage systems outside of the SAP Business environment. Examples of documents that can be stored in this way include:
    Scanned-in original documents, such as incoming invoices
    Outgoing documents, such as invoices created in mySAP Financials that are created electronically, then sent in printed form
    Print lists created in mySAP Business Suite
    Residence Time and Retention Periods
    The residence time is the minimum length of time that data must spend in the database before it meets the archivability criteria. Residence times can be set in application-specific Customizing.
    The retention period is the entire time that data spends in the database before it is archived. The retention period cannot be set.
    Example: If the residence time is one month, data that has been in the system for two months will be archived, while data that is only three weeks old remains in the database.
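    As an illustration only, the following minimal ABAP sketch derives an archivability key date from an assumed residence time of 30 days; the residence time itself, and any table or field it would be applied to, are hypothetical and would normally come from application-specific Customizing.
    REPORT z_residence_time_demo.
    " Illustrative sketch: a 30-day residence time (assumed value) is
    " subtracted from today's date; only records created on or before
    " the resulting key date meet the residence-time criterion.
    CONSTANTS gc_residence_days TYPE i VALUE 30.
    DATA gv_key_date TYPE sy-datum.
    gv_key_date = sy-datum - gc_residence_days.
    WRITE: / 'Records created on or before', gv_key_date,
             'meet the residence-time criterion.'.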
    Backup & Restore
    Backup is a copy of the database contents that can be used in the case of a system breakdown. The aim is that as much of the database as possible can be restored to its state before the system breakdown. Backups are usually made at regular intervals according to a standard procedure (complete or incremental backup).
    Reloading the saved data into the file system is called restoring the data.
    Archiving Features
    Data Security
    Data archiving is carried out in two steps (a third step, the storage of archive files, is optional): In the first step, the data to be archived is copied to archive files. In the second step, the data is deleted from the database. This two-step process guarantees data security if problems occur during the archiving process.
    For example, the procedure identifies network data transfer errors between the database and the archive file. If an error occurs, you can restart the archiving process at any time because the data is still either in the database or in an archive file. This means that you can usually archive in parallel with the online application, that is, during normal system operation, without having to back up the database first.
    You can further increase data security if you store the archive files in an external storage system before you delete the data from the database. This guarantees that the data from the database will only be deleted after it has been securely stored in an external storage system.
    Data Compression
    During archiving, data is automatically compressed by up to a factor of 5. However, if the data to be archived is stored in cluster tables, no additional compression takes place.
    Storage Space Gained
    Increased storage space in the database and the resulting performance gains in the application programs are the most important benefits of data archiving. Therefore it is useful to know how much space the data to be archived takes up in the database. It may also help to know in advance how much space the archive files that you create will need.
    Note: - Data is compressed before it is written to the archive file. The extent of the compression depends on how much text (character fields) the object contains.
    Archiving without Backup
    With SAP Data Archiving, data can be archived independently of general backup operations on the database. However, SAP recommends that you back up archive files before storing them.
    Accessing Archived Data
    Because archived data has only been removed from the database and not from the application component itself, the data is always available. Archive management allows three types of access:
    1.     (Read) access to a single data object, such as an accounting document
    2.     Analysis of an archive file (sequential read)
    3.     Reload into the database (not possible for all archiving objects)
    Converting Old Archive Files
    When archived data is read, the system automatically makes the conversions required by hardware and software changes.
    When old archive files are accessed, the Archive Development Kit (ADK) can make allowances for changes to database structures (field types, field lengths, new fields, and deleted fields) after the data was archived and for changes to hardware-dependent storage formats. This is only done on a temporary basis during read access. The data in the archive file is not changed. The following items are changed (if necessary) during automatic conversion:
    Database table schema (new and deleted columns)
    Data type of a column
    Column length
    Code page (ASCII, EBCDIC)
    Number format (such as the use of the integer format on various hardware platforms)
    If database structures in an application have undergone more changes than the ADK can handle (for example, if fields have been moved from one table to another or if one table has been divided into several separate tables), then a program is usually provided by the relevant mySAP Business Suite solution for the permanent conversion of existing archive files.
    Link to External Storage System
    Archive files created by Data Archiving can be stored on tertiary storage media, such as WORMs, magnetic-optical disks (MO), and tapes using the SAP Content Management Infrastructure (which also contains the ArchiveLink/CMS interface). This can be done manually or automatically.
    You can also store archive files in the file system of an HSM system. The HSM system manages the archive files automatically. For storage, the HSM system can also use tertiary storage media, such as MO-disks.
    CMI/R - Content Management Infrastructure / Repository
    HSM - Hierarchical Storage Management Systems
    Archiving Procedure
    The basic archiving procedure is carried out in three steps:
    Creating the Archive Files
    Storing Archive Files
    Executing the Delete Programs 
    Security Vs Performance
    Optionally, you can store archive files after the delete phase. To do this, you must mark Delete Phase Before Storage in archiving object-specific Customizing.
    If security is your main concern, then you should not schedule the delete phase until after the archive files have been stored. In this way you know that the data will only be deleted from the database after the archive files have successfully been moved to the external storage system. In addition, you can set the system to read the data from the storage system and not from the file system.
    However, if your main concern is the performance of the archiving programs, then you should schedule the delete program first and then store the files.
    Creating Archive Files (WRITE)
    In step one, the write program creates an archive file. The data to be archived is then read from the database and written to the archive file in the background. This process continues until one of the following three events occurs:
    All the data is written to an archive file
    Archiving is not complete, but the archive file reaches the maximum size specified in archiving object-specific Customizing
    The archiving is not yet finished, but the archive file contains the maximum number of data objects specified in Customizing.
    If in cases 2 and 3 there is still data to be archived, the system creates another archive file.
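    To make the write phase more concrete, here is a minimal ABAP sketch of the typical call sequence of an ADK-based write program. The archiving object name Z_DEMO and the use of table SFLIGHT are purely illustrative; productive write programs belong to their archiving object and apply its archivability checks, and the exact ADK parameter interface should be verified in SE37 for your release.
    REPORT z_adk_write_sketch.
    DATA: gv_handle TYPE sy-tabix,                    " ADK archive handle
          gt_data   TYPE STANDARD TABLE OF sflight,
          gs_data   TYPE sflight.
    " Open a new archive file for the (hypothetical) archiving object
    CALL FUNCTION 'ARCHIVE_OPEN_FOR_WRITE'
      EXPORTING
        object         = 'Z_DEMO'
      IMPORTING
        archive_handle = gv_handle.
    SELECT * FROM sflight INTO TABLE gt_data UP TO 100 ROWS.
    LOOP AT gt_data INTO gs_data.
      " Start a new data object (one business object per loop pass)
      CALL FUNCTION 'ARCHIVE_NEW_OBJECT'
        EXPORTING
          archive_handle = gv_handle.
      " Write the database record(s) belonging to this data object
      CALL FUNCTION 'ARCHIVE_PUT_RECORD'
        EXPORTING
          archive_handle   = gv_handle
          record_structure = 'SFLIGHT'
          record           = gs_data.
      " Close the data object; the ADK adds it to the archive file
      CALL FUNCTION 'ARCHIVE_SAVE_OBJECT'
        EXPORTING
          archive_handle = gv_handle.
    ENDLOOP.
    " Close the archive file; deleting from the database is a separate step
    CALL FUNCTION 'ARCHIVE_CLOSE_FILE'
      EXPORTING
        archive_handle = gv_handle.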
    Storing Archive Files (STORE)
    Once the write program has finished creating archive files, these can be stored. There are several ways of storing archive files:
    Storage Systems:
    If a storage system is connected to mySAP Business Suite: At the end of a successful write job, a request is sent to this system to store the new archive files (provided the appropriate settings were made in archiving object-specific Customizing). You can also store archive files manually at a later point if you do not want them to be stored automatically. Storage is carried out by the SAP Content Management Infrastructure (which contains the ArchiveLink/CMS interface).
    HSM Systems:
    If you use an HSM system, it is sufficient to maintain the file name in Customizing (Transaction FILE). You do not then need to communicate with the storage system using the SAP Content Management Infrastructure, because the HSM system stores the files on suitable storage media according to access frequency and storage space.
    Existing Storage Media:
    Once the delete program has processed the relevant archive file, you can manually copy archive files to tape.
    Running Delete Programs
    After closing the first archive file, the archive management system creates a new archive file and continues with the archiving process. While this happens, another program reads the archived data from the completed archive file and deletes it from the database. This procedure guarantees that only data that has been correctly saved in the archive file is deleted from the database.
    If you do not carry out deletion until after the data has been stored, you can make a setting in archiving object-specific Customizing so that the system reads the archive files from the storage system during deletion. In this way, you can detect errors in good time that might arise when transferring or saving the archive files in the storage system.
    When the last archive file is closed, a delete program starts to run for this file. Typically, several delete programs run simultaneously for previously created archive files. Because, unlike the delete program, the write program does not generally carry out any data-changing database transactions, the write program creates new archive files faster than the delete programs can process them. Running write and delete in parallel in this way decreases the total archiving runtime because the database is used more efficiently.
    Note:
    Scheduling the archive jobs outside SARA
    WRITE:
    Use an external job scheduler (SM36, SM62).
    The WRITE run is followed by the event SAP_ARCHIVING_WRITE_FINISHED; the event parameter is the session number.
    To analyze the archiving information of a particular session, use function module ARCHIVE_GET_FILES_OF_SESSION (input: session number).
    DELETE:
    Use an external job scheduler (SM36, SM62).
    Use program RSARCHD; input: object name, max. no. of files, max. no. of sessions, max. no. of jobs, background user.
    The DELETE run is followed by the event SAP_ARCHIVING_DELETE_FINISHED; the event parameter is the session number.
    To analyze the archiving information of a particular session, use function module ARCHIVE_GET_FILES_OF_SESSION (input: session number).
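    As a sketch of the external scheduling option, the snippet below creates a background job for RSARCHD that is released with an event-based start condition, so it runs whenever a write session raises SAP_ARCHIVING_WRITE_FINISHED. The job name and the RSARCHD variant ZDELETE_FI are hypothetical; the variant would hold the object name, the maximum numbers of files, sessions and jobs, and the background user.
    DATA: gv_jobname  TYPE tbtcjob-jobname VALUE 'Z_ARCH_DELETE_TRIGGER',
          gv_jobcount TYPE tbtcjob-jobcount.
    " Create the job container
    CALL FUNCTION 'JOB_OPEN'
      EXPORTING
        jobname  = gv_jobname
      IMPORTING
        jobcount = gv_jobcount
      EXCEPTIONS
        OTHERS   = 1.
    " Put RSARCHD into the job using a pre-maintained variant (hypothetical name)
    SUBMIT rsarchd USING SELECTION-SET 'ZDELETE_FI'
           VIA JOB gv_jobname NUMBER gv_jobcount
           AND RETURN.
    " Release the job so that it starts on the archiving event
    CALL FUNCTION 'JOB_CLOSE'
      EXPORTING
        jobcount = gv_jobcount
        jobname  = gv_jobname
        event_id = 'SAP_ARCHIVING_WRITE_FINISHED'
      EXCEPTIONS
        OTHERS   = 1.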
    Archiving Object
    The archiving object is a central component of SAP Data Archiving. The archiving object specifies precisely which data is archived and how. It describes which database objects must be handled together as a single business object and interprets the data irrespective of the technical specifications at the time of archiving (such as release and hardware).
    Note:-
         An archiving object has a name of up to ten characters in length.
         Transaction code to maintain the Archiving Object is AOBJ.
    The following programs must (or can) be assigned to an archiving object. The SAP System contains programs (some of which are optional) for the following actions:
    Preprocessing (Optional)
    Some archiving objects require a preprocessing program that prepares the data for archiving. This preprocessing program marks data to be archived, but it does not delete any data from the database. Preprocessing programs must always be scheduled manually and are run from Archive Administration.
    Write
    This program creates archive files and writes data to them. At this point, however, no data is being deleted from the database.
    You can specify in archiving object-specific Customizing whether the next phase (delete) is to take place automatically after the archive files have been created. Delete jobs can also be event-triggered. To do this, you set up the trigger event in archiving object-specific Customizing.
    Delete
    This function can entail several activities. The activities are always dependent on the existing archive files. Normally, the data is deleted from the database. However, in some cases, the archived data may only be given a deletion indicator in the database.
    In archiving object-specific Customizing, you can specify that archive files, after successful processing, are to be transferred to an external storage system using the SAP Content Management Infrastructure (which contains the ArchiveLink/CMS interface).
    Postprocessing (Optional)
    This function is usually carried out after deletion has taken place. It is not available for all archiving objects. If the data has not yet been deleted from the database by the delete program, it is deleted by the postprocessing program.
    Reload Archive (Optional)
    You can reload archived data from the archive files into the database using this function. It is not available for all archiving objects. To access this function, choose Goto → Reload.
    Index (Optional)
    This function builds (or deletes) an index that allows individual access. It is not included in every archiving object.
    Data Object
    A data object is the application-specific instance of an archiving object, that is, an archiving object filled with concrete application data. The Archive Development Kit (ADK) ensures that data objects are written sequentially to an archive file. All data objects in an archive file have the same structure, which is described in the archiving object.
    Archive Administration (SARA)
    All interaction relating to data archiving takes place in Archive Administration (transaction SARA). Features of Archive Administration:
    Preprocessing
    Write
    Delete
    Postprocessing
    Read - Enables you to schedule and run a program that reads and analyzes archived data.
    Index
    Storage System - Enables archive files to be transferred to a connected storage system and enables stored archive files to be retrieved from a storage system.
    Management - Offers an overview of archiving sessions for one archiving object.
    Depending on the action you have selected, you can use Goto on the menu to access the following menu options:
         Network Graphic
         Reload
         Customizing
         Job Overview
         Management
         Stored Files
         Database Tables
         Infosystems
         Statistics
         Interrupting and Continuing
    Archive Development Kit
    The Archive Development Kit (ADK) is a tool for developing archiving solutions. It also prepares the runtime environment for archiving. From a technical viewpoint, it is an intermediate layer between the application program and the archive that provides all the functions required for archiving data.
    The ADK functions are required for archiving and for subsequent access to archived data. The ADK automatically performs the hardware-dependent adjustments (such as code page and number format) and structural changes that are required when archive files are created. When the archive files are accessed later, the ADK temporarily converts data that was archived using earlier SAP releases.
    Note:-
    S_ARCHIVE is the SAP-delivered authorization object for checking user authorizations on archiving objects. The Archive Development Kit (ADK) performs the check when an archive file is opened for one of the following actions:
    Write
    Delete
    Read
    Reload
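    As a minimal sketch, a custom report that opens archives could replicate this check with an AUTHORITY-CHECK. The field values used here (application area FI, archiving object FI_DOCUMNT, activity '03' for read access) are assumptions for illustration and should be confirmed against the S_ARCHIVE field documentation in your system.
    AUTHORITY-CHECK OBJECT 'S_ARCHIVE'
      ID 'APPLIC'   FIELD 'FI'            " application area (assumed value)
      ID 'ARCH_OBJ' FIELD 'FI_DOCUMNT'    " archiving object (example)
      ID 'ACTVT'    FIELD '03'.           " activity; '03' assumed here for read
    IF sy-subrc <> 0.
      MESSAGE 'No archiving authorization for FI_DOCUMNT' TYPE 'E'.
    ENDIF.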
    Database Tables in Archive Administration (DB15)
    This enables you to display all of the tables for a specific archiving object, as well as the list of archiving objects that archive from a particular database table. It also displays storage and space statistics and provides further information, such as the time and number of the last archiving session and various details on the client used.
    Network Graphic
    You can use the network graphic to show any existing dependencies between archiving objects. It shows business process flows and contexts that can influence the archiving sequence. In particular, when you start working with an archiving object, the network graphic gives you a good overview of related documents.
    In an archiving session, you must take into account any dependencies between archiving objects that require a specific archiving sequence. In general, you cannot archive data for an archiving object that has preceding objects until these preceding objects have been archived.
    You can use the network graphic to determine whether the archiving object that you want to use has preceding objects. If so, the preceding objects should be implemented before the current archiving object. The nodes in the network graphic represent the archiving objects. A node displays the following information:
    Archive Object Name
    Application Component Name
    Short Description
    Date of last archiving
    Status of the session
        Green: archiving and deletion successful
        Yellow: successfully archived but not yet deleted, or archiving still running, or delete in progress, or delete cancelled
        Red: not yet archived, or archiving cancelled
    Standard Log (Spool List)
    During archiving, a log is usually generated. This can be done during the write, delete, read, or reload phases. This is usually in the form of a standard log. In some cases, an application-specific log may be generated.  Depending on the archiving action that was carried out, the standard log contains statistical information per archiving session or archive file according to the following categories:
    Archiving session number
    Number of data objects for processing
    Archive session size in megabytes
    Total header data in %
    Table space in MB occupied for:
              Tables
              Indexes
    Number of table entries processed
    You can call the standard log from the screen Archive Administration: Overview of Archiving Sessions. Choose Spool List.
    Accessing Archived Data
    Data that was archived using SAP Data Archiving has been removed from the database, but it has not been placed beyond the reach of the applications. The data is still available for read access and analysis. In some cases, archived data can even be reloaded into the database.
    Note:-
    A prerequisite for read access and reload access is that the archive file can be found in the file system.
    Three types of access are possible:
    (Read) access to a single data object, such as an accounting document
    Direct access or single document access requires an index that can be built either during archiving or at a later point. A complex search of the documents stored in the archive files, in which all orders of an article in a particular batch are required for a product recall action, is not possible.
    The Archive Information System (AS) supports direct access using archive information structures that can be generated automatically either when the archive files are being written, or at a later point.
    Analysis of an archive file (sequential read)
    It is possible to run an analysis for one or several archiving sessions. The results of the analyzed data objects are displayed in a list. Furthermore, some archiving objects offer the option of a combined analysis. With this option, you can link current data in the database and archived data.
    Reloading into the database
    Archived data does not usually need to be reloaded because it remains accessible by the applications. There is also a lot of data that cannot be reloaded or for which reloading is problematic. For this reason, reload programs do not exist for all archiving objects.
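    The sequential read option listed above can be sketched with the ADK read calls as follows. The archiving object Z_DEMO and the SFLIGHT record structure are again illustrative; real analysis programs are delivered with each archiving object and know all of its record structures, and the selection of sessions and files is normally supplied by Archive Administration.
    REPORT z_adk_read_sketch.
    DATA: gv_handle TYPE sy-tabix,      " ADK archive handle
          gs_data   TYPE sflight.
    " Open the archive files of the selected sessions for sequential reading
    CALL FUNCTION 'ARCHIVE_OPEN_FOR_READ'
      EXPORTING
        object         = 'Z_DEMO'
      IMPORTING
        archive_handle = gv_handle.
    DO.
      " Position on the next data object; stop at the end of the archive
      CALL FUNCTION 'ARCHIVE_GET_NEXT_OBJECT'
        EXPORTING
          archive_handle = gv_handle
        EXCEPTIONS
          end_of_file    = 1
          OTHERS         = 2.
      IF sy-subrc <> 0.
        EXIT.
      ENDIF.
      DO.
        " Read the records of the current data object one by one
        CALL FUNCTION 'ARCHIVE_GET_NEXT_RECORD'
          EXPORTING
            archive_handle = gv_handle
          IMPORTING
            record         = gs_data
          EXCEPTIONS
            end_of_object  = 1
            OTHERS         = 2.
        IF sy-subrc <> 0.
          EXIT.
        ENDIF.
        WRITE: / gs_data-carrid, gs_data-connid, gs_data-fldate.
      ENDDO.
    ENDDO.
    CALL FUNCTION 'ARCHIVE_CLOSE_FILE'
      EXPORTING
        archive_handle = gv_handle.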
    Archiving Session Overview
    On this screen, you can display and edit management information on archiving sessions. One archiving session corresponds to one write run and its associated delete jobs. Within a status area, archiving sessions are, by default, organized in groups of 20. The sessions are ordered according to their status.
    Interrupting and Continuing
    So that Data Archiving can be seamlessly integrated into the production system, you can interrupt an archiving session during the write phase and continue it at a later time. This enables you to react during archiving to specific time constraints or hard-disk space shortages. You can continue and complete interrupted archiving sessions when you have more time or more storage space.
    To interrupt an archiving session:
    The archiving object must be registered in transaction AOBJ as interruptible, otherwise the Archive Development Kit (ADK) is unable to inform the write program of the interruption request.
    The write program must be able to process the interruption request.
    The archiving session must be run in production mode (not test mode) and be in process.
    The delete phase must be able to start before the write phase has finished (setting in transaction AOBJ).
    To continue an archiving session:
    The session must have been interrupted within the context of the interruption concept. Archiving sessions that were interrupted for other reasons or that were terminated by archive management cannot be continued.
    The delete phase must have completed for the data that was archived up to the point of interruption, that is, the archiving session must have the status Completed.
    Database Action Before and After Archiving
    Archiving uses application software that depends on and affects the organization of the database data. You should therefore organize the database before and after archiving.
    Before Archiving
    Archiving application data helps to prevent storage and performance bottlenecks. Since relocating data can, in some circumstances, itself impair performance (this is the case if you need to access archived data), you need to consider carefully which data to archive. To determine whether or not you should archive data, consider the following questions:
    If there are memory problems, can more memory be assigned to the table?
    How likely is it that you will need to access the archived data again? How often?
    Is the data accessed using an optimal index?
    Does the application perform a full table scan on the tables that contain the data to be archived?
    After Archiving
    Reorganize index: If data has been archived or simply deleted and the associated tables were accessed via an index, the index should be reorganized. Deleting table entries leaves holes in the table which are still indexed. Reorganization can shorten the access paths, reducing response times.
    Update the database statistics: If your database uses a cost-based optimizer, you must choose Update Statistics to recalculate the access paths.
    Reorganize tablespace or database space: Whether you should reorganize the tablespace depends on the reason for archiving.
    Do you expect a lot of new data for the archived tables?
    Do you want to make space for other tables?
    Note:
    Reorganization takes a long time and may need to be repeated after archiving. Throughput during a reorganization:
    With export/import: approximately 60-100 MB/hour
    With unload/load: approximately 250-300 MB/hour
    Perform an SQL trace after reorganization.
    Statistics
    When writing, deleting, reading, or reloading, statistical data on each archiving run is automatically generated and persistently stored in the database. Data archiving administrators can analyze these figures so that they can better plan future archiving projects and request the necessary resources. Statistics also provide pertinent information on the role of data archiving in reducing the data volume in the database.
    You can call this screen directly from the Archive Administration (SARA), or using the transaction SAR_DA_STAT_ANALYSIS. It displays the following information:
    Archiving session number
    Archiving object name
    Client on which the archiving session was carried out
    Date on which the archiving session was carried out
    Status of the session
    Portion of header data in the archiving session
    DB Space (WRITE) - virtual storage space in MB occupied by an incomplete archiving session in the database
    DB Storage Space (DELETE) - virtual storage space in MB occupied by an incomplete archiving session in the database
    DB Space (RELOAD) - virtual storage space in MB
    Written data objects in an incomplete archiving session
    Deleted data objects for an incomplete archiving session in the database
    Reloaded data objects
    Number of delete jobs
    Write job duration
    Delete job duration
    Reload job duration
    Logical Path and File
    Archive files are stored in the file system under a physical path and file name that is derived from a user-definable logical path or file name. The definition can be divided into the following steps:
    Definition of the logical path name
    Definition of the logical file name
    Assignment of the logical file name to the archiving object
    By default, the system uses the logical file name ARCHIVE_DATA_FILE and the logical path name ARCHIVE_GLOBAL_PATH. Consequently, the names only need to be changed if they have to be adjusted to meet special requirements.
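    For reference, the following minimal sketch resolves the default logical file name into the physical path and file name maintained in transaction FILE. Whether additional parameters (for example the archiving object) are required depends on how the logical file name is defined, so the call shown is only the simplest case.
    DATA gv_physical_name(255) TYPE c.
    " Resolve the logical file name used by data archiving into a physical path
    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'ARCHIVE_DATA_FILE'
      IMPORTING
        file_name        = gv_physical_name
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.
    IF sy-subrc = 0.
      WRITE: / 'Archive files are created under:', gv_physical_name.
    ELSE.
      WRITE: / 'Logical file name is not maintained in transaction FILE.'.
    ENDIF.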
    Data Archiving Monitor
    Use this indicator to activate or deactivate the data archiving monitor (transaction SAR_SHOW_MONITOR). If you mark this checkbox before data archiving, archiving-relevant information on the write and delete jobs is updated. This information can be analyzed using the data archiving monitor. If there are errors, alerts are issued.
    The data archiving monitor offers the following information:
    Overview of all the archiving objects that have been run
    Detailed information on the individual archiving sessions
    Processing status display
    Help on analyzing open alerts

  • How do we archive the system logs in SAP BW?

    Hi All
    How do we archive the system logs in SAP BW?
    Can anyone let me know regarding this?
    If you have any docs, please also forward them to my ID: [email protected]
    Thanks & Regards
    Balji

    Balaji,
    What is the landscape you have ?
    Windows / Unix / Solaris etc ?
    Also there are a lot of logs that are generated:
    Archive log ,
    System Log,
    Transaction log etc...
    The usual procedure is to copy them to another location and then delete the original file; the system will usually recreate the log and start logging new entries.
    Arun
    Hope it helps...

  • Direct Archival of Document Files in Public/Private Folder

    Dear Experts,
    If I create documents using transaction CV01N or with the customized upload program, I can see the files in SAP Easy Document Management only in the search results.
    I have to copy the files from the search result and place the link in my folder, which I have created under the Public/Private Folder. If I copy and paste the document file into the private/public folder from the search result, the system internally creates one more document number for the same document file.
    To avoid this, how can I automatically archive the document files directly in my folder created under the Public/Private Folder?
    Thanks for your inputs
    Regards
    Damodar Pai

    Hi,
    When you create files using CV01N or any other method in SAP, the DIR is created, but it is not part of any document structure.
    But when you create it using EDMS, you are creating the document in the public/private folder under some specific folder of your choice. These folders are nothing but document structures.
    If you want the files created using CV01N to appear in a specific folder, then you have to assign them to that specific structure using CV12.
    Regards

  • Hoping for a quick response : EXP and Archived REDO log files

    I apologize in advance if this question has been asked and answered 100 times. I admit I didn't search, I don't have time. I'm leaving on vacation tomorrow, and I need to know if I'm correct about something to do with backup / restore.
    we have 10g R2 running a single instance on a single server. The application vendor has "embedded" oracle with their application. The vendor's backup is a batch file using EXP - thus:
    exp system/xpwdxx@db full=y file=D:\Orant\admin\db\EXP\db_full.dmp log=D:\Orant\admin\db\EXP\db_full.txt direct=y compress=y
    This command is executed nightly at midnight. The files are then backed up by our nightly backup to offsite storage media.
    The database is running in autoarchive mode. The problem is that the archived redo files filled the drive they were being stored on, and it is the drive the database is on. I used OS commands to move 136G of archived redo logs onto other storage media to free the drive.
    My question: Since the EXP runs at midnight, when there is likely NO activity, do I need to run in AutoArchive Mode? From what I have read, you cannot even apply archived redo log files to this type of backup strategy (IMP) Is that true? We are ok losing changes since our last EXP. I have read a lot of stuff about restoring consistent vs. inconsistent, and just need to know: If my disk fails, and I have to start with a clean install of Oracle and nothing else, can I IMP this EXP and get back up and running as of the last EXP? Or do I need the autoarchived redo log files back to July 2009 (136G of them).
    Hoping for a quick response
    Best Regards, and thanks in advance
    Bruce Davis

    Bruce Davis wrote:
    Amardeep Sidhu
    Thank you for your quick reply. I am reading in the other responses that since I am using EXP without consistent=y, I might not even have a backup. The application vendor said that with this dmp file they can restore us to the most recent backup. I don't really care for this strategy as it is untested. I asked them to verify that they could restore us and they said they tested the dmp file and it was OK.
    Thank you for taking the time to reply.
    Best Regards
    Bruce
    The dump file is probably ok in the sense that it is not corrupted and can be used in an imp operation. That doesn't mean the data in it is transactionally consistent. And to use it at all, you have to have a database up and running. If the database is physically corrupted, you'll have to rebuild a new database from scratch before you can even think about using your dmp file.
    Vendors never understand databases. I once had a vendor tell me that Oracle's performance would be intolerable if there were more than 5 concurrent connections. Well, maybe in HIS product ..... Discussions terminated quickly after he made that statement.

  • Archive Faulty Source File

    We have an issue in moving the faulty source files to a particular directory.
    We could move all successfully processed files to an archive directory, but we are not able to move the error files (those with data issues) to another archive directory so that we can continue with processing the other files.
    Module  used:
    AF_Modules/MessageTransformBean  --> 1
    Parameter:
    Transform.PermanentErrors --> true
    But this is not working. 
    I see a similar thread which is not answered:
    Re: Archive Faulty Source File
    Your inputs are appreciated.

    I actually have two scenarios:
    1. Asynchronous
    2. Synchronous
    In the asynchronous scenario, the best part is that all files are getting processed; even if there are erroneous files in between, it at least continues with the next file. My problem here is that it archives all successful and error files to the same archive directory, even though I have specified a different directory for 'Faulty Source File'. And there are no errors when I look in the CC monitoring. But there is an error in sxmb_moni (I have actually made it fail intentionally):
    Runtime exception occurred during application mapping com/sap/xi/tf/_CreateProductMaster_to_CreateProdu~; com.sap.aii.utilxi.misc.api.BaseRuntimeException: The element type "Products" must be terminated by the matching end-tag "</Products>".
    In the synchronous scenario, if there are any error files in between, it never goes to the next file; it is stuck on the same file and retries it. The error in CC monitoring is:
    Error: com.sap.engine.interfaces.messaging.api.exception.MessagingException: Application:EXCEPTION_DURING_EXECUTE:
    In sxmb_moni
    Runtime exception occurred during application mapping com/sap/xi/tf/_CreateProductMaster_to_CreateProdu~; com.sap.aii.utilxi.misc.api.BaseRuntimeException: The element type "Products" must be terminated by the matching end-tag "</Products>".

  • File Adapter - Error creating archive directory adapter file

    Hello,
    I have a File-to-RFC interface. In the sender CC I have an Archive Directory.
    The following error occurs:
    Error creating archive directory adapter file
    The archive directory exists.
    any idea?
    thanks very much

    Hi Silvia,
    Check whether the user you are using for FTP has the proper authorizations.
    Also, verify that the archive directory is in place.
    Regards,
    Neetesh

  • Unzip the file archived by the File adapter

    Hi Experts,
    We have a scenario where we are archiving an XML file using the file adapter, and it is also being zipped by an OS command.
    The script not only zips the file, it also renames the file from *.zip to *.xml.
    Hence, after archiving, when I try to open it, the content appears to be junk.
    Please suggest how to unzip the file, as it has the extension *.xml.
    I also tried renaming the file to *.zip and then uncompressing it, but it shows the error below:
    Cannot open : it does not appear to be a valid archive
    Thanks in advance,
    MK

    Hi MK,
    Instead of using an OS command, try using the existing module provided by SAP; that way your job will be easier.
    Follow these steps in the module processing to unzip:
    Processing sequence:
    Module name: AF_Modules/PayloadZipBean
    Type: Local Enterprise Bean
    Module key: 3
    Module configuration:
    Module key: 3
    Parameter name: zip.mode
    Parameter value: unzip
    In the communication channel, specify filename.xml
    so you can open the file in XML format.
    Thanks,
    Madhav
    Note:points if useful
    Edited by: madhav poosarla on Aug 14, 2008 8:31 AM

  • Archiving Processed or Errored Files on FTP Folder - Receiver File Adapter

    Hello,
    A quick question: we have a scenario where we are picking a file from an FTP folder, and we need to archive it as follows:
    1) Processed files in a different folder on the same FTP server
    2) Errored files in a different FTP folder
    How could we achieve this using the file adapter?
    As far as I can see, the file adapter does not give you the option to select the FTP site for this...
    Help is appreciated.
    Regards

    To archive source files where a permanent error occurred during processing, select Archive Faulty Source Files.
    A permanent error occurs either during the conversion of the file content, or in a module in the module processor.
    More information: Adding MessageTransformBean in the Module Processor
    ○       Specify Error Archive Directory.
    ○       To add a time stamp to the archived file, select Add Time Stamp.
    ref: http://help.sap.com/saphelp_nwpi71/helpdata/EN/44/655453b48a4ddfe10000000a1553f7/content.htm
