Status of PGI is Archived

Hi All,
I am doing PGI using FM WS_DELIVERY_UPDATE. It works, but the status of the goods issue is shown as Archived, and when I click on Display Document I get the message that the document does not exist.
How can I resolve this issue? (A sample call pattern for the FM is sketched below for reference.)
Thanks
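
For reference, a typical call pattern for posting goods issue with WS_DELIVERY_UPDATE looks roughly like the following. This is a minimal sketch only: the delivery number is an example value, and the flag WABUC (post goods issue automatically) is an assumption that should be verified in your release.

    DATA: ls_vbkok TYPE vbkok,                        " delivery header control structure
          lt_prot  TYPE STANDARD TABLE OF prott.      " message log returned by the FM

    ls_vbkok-vbeln_vl = '0080001234'.                 " delivery number (example value)
    ls_vbkok-wabuc    = 'X'.                          " post goods issue automatically (assumed flag)

    CALL FUNCTION 'WS_DELIVERY_UPDATE'
      EXPORTING
        vbkok_wa = ls_vbkok
        synchron = 'X'                                " run synchronously
        commit   = 'X'                                " commit inside the FM
        delivery = ls_vbkok-vbeln_vl
      TABLES
        prot     = lt_prot.

    IF lt_prot IS NOT INITIAL.
      " inspect the returned log entries for error messages before assuming success
    ENDIF.

If the document flow afterwards shows an unexpected status, checking the returned PROT log and the commit handling is usually the first step.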

Hi there,
I am also currently facing the same problem.
However, I am using SD_DELIVERY_UPDATE_PICKING to update my delivery with the picking quantity and perform PGI.
I would appreciate it if you could explain how you managed to overcome this!
Many Thanks,
Alysia

Similar Messages

  • Billing status after PGI

    Hi All,
    When we create a PO using ME27 and a delivery document with VL10B and then do PGI, the billing status is not getting updated to 'C'; it still shows 'A'.
    Any suggestions are welcome.
    Thanks & Regards
    Jyo

    Hi Jyo,
    I think you are working with the STO (stock transport order) scenario; an STO can be processed with or without billing.
    I don't know which type of STO you are processing. The billing status is controlled by the item category: check the item category in the outbound delivery and its billing relevance in VOV7 (SD item category). If you don't want billing, remove the billing relevance in VOV7; these documents will then no longer appear in the billing due list.
    Regards,
    Murali

  • WM status blocking PGI

    Hi Experts,
    We have two outbound deliveries that are failing on PGI because the WM status is set to 'B' (partially processed). Some materials in these deliveries have open picking requests that we cannot close, since there are no transfer orders (TOs) available.
    The background is that the delivery was goods-issued before WM was implemented and was recently reversed, which left the stock stuck in storage type 916.
    We need to find out how we can transfer the stock from 916 to another bin so we can process it using the correct WM flow.
    Please help.
    Thanks in advance

    We need to find out how we can transfer the stock from 916 to another bin so we can process it using the correct WM flow.
    Don't you think this is one of the most basic things you do?
    As usual, a movement starts in MM, so just do a MIGO transfer posting from your warehouse-managed storage location to a non-warehouse-managed storage location using movement type 311.
    This creates a negative quant in storage type 921, and you get a transfer order to move the stock from the source bin to storage type 921. (A BAPI-based sketch of such a 311 transfer posting follows below.)
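    If you prefer to do the transfer posting programmatically rather than in MIGO, a minimal sketch with BAPI_GOODSMVT_CREATE could look like the following. Material, plant, storage locations, and quantity are example values only.

        DATA: ls_header TYPE bapi2017_gm_head_01,
              lv_code   TYPE bapi2017_gm_code,
              lt_items  TYPE STANDARD TABLE OF bapi2017_gm_item_create,
              ls_item   TYPE bapi2017_gm_item_create,
              lt_return TYPE STANDARD TABLE OF bapiret2.

        ls_header-pstng_date = sy-datum.
        ls_header-doc_date   = sy-datum.
        lv_code-gm_code      = '04'.              " transfer posting (MB1B/MIGO)

        ls_item-material   = 'MATERIAL_X'.        " example values
        ls_item-plant      = '1000'.
        ls_item-stge_loc   = '0001'.              " WM-managed source storage location
        ls_item-move_type  = '311'.
        ls_item-entry_qnt  = 10.
        ls_item-entry_uom  = 'PC'.
        ls_item-move_stloc = '0002'.              " non-WM-managed receiving storage location
        APPEND ls_item TO lt_items.

        CALL FUNCTION 'BAPI_GOODSMVT_CREATE'
          EXPORTING
            goodsmvt_header = ls_header
            goodsmvt_code   = lv_code
          TABLES
            goodsmvt_item   = lt_items
            return          = lt_return.

        READ TABLE lt_return TRANSPORTING NO FIELDS WITH KEY type = 'E'.
        IF sy-subrc = 0.
          CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
        ELSE.
          CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
            EXPORTING
              wait = 'X'.
        ENDIF.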

  • Purchase order in 'Archived' status

    Hi All,
    We are using SRM5.0 ECS.
    After an SC is created for a free-text requirement, a PO is created. When looking at the purchase order in 'Process Purchase Orders', it gives the message 'Error while reading the PO in backend system, inform system administration'. When looking into the item details, the PO status is shown as 'Archived'.
    This is happening only for random purchase orders. Please let me know what the issue could be.
    Regards
    Krishna

    Hello,
    Run FM META_PO_GETDETAIL in SRM, specifying the ECC system ID and the PO number that was created there.
    Check whether there is any authorization message.
    I have seen a similar issue that was caused by the authorization of the RFC user, who was not able to read PO details in the backend.
    Kind regards,
    Ricardo

  • Status/Delivery Documents not being updated in C4C from ECC on PGI

    Hi,
    We are integrating SAP C4C with ERP via HCI. We have created a follow-up sales order from an opportunity in C4C.
    On creating the follow-up sales order it gets created in SAP ERP, and we get back the sales order/inquiry number in C4C, but we do not get any status.
    Also, when we create a delivery for the sales order and post goods issue in ERP, the document flow in ERP shows the delivery number etc., but these statuses are not updated in C4C.
    We have already configured the 'Communication Arrangements' in C4C with this scenario:
    Opportunity with follow-up business transaction in ERP -> Update opportunity from follow-up business transaction document.
    In HCI we have configured the following template in Eclipse:
    ERP to COD Opportunity Replicate
    So C4C still has just two entries, one for the inquiry and one for the sales order, with just the document number and no other follow-up document details or status (for example the PGI or delivery document number). Can you kindly help?
    Thanks
    Indrasish

    Hello Indrasish,
    If I am not wrong, you might be using the URL for the outbound channel that is mentioned in the catalog (standard scenarios for HCI on the HCI tenant). Note that the outbound URLs for the Pricing, Query Sales Quote, and Query Sales Order scenarios should be picked up from the corresponding services, which you can get from SICF on ECC. These URL structures are also maintained in the 1402 standard document for ERP integration using HCI.
    These URL paths are to be maintained in the outbound channel of the artifacts related to Request Pricing, Query Sales Quote, and Query Sales Order.
    Let me know if this is what you were expecting and if this solves your issue.
    Regards,
    Chandan

  • COMMPR01: how to update product status (Locked, Archived, etc.) via a program

    Hi all,
    Transaction COMMPR01: how can I update a product's status (Locked or Archived) via a report program? Can anyone suggest which FMs
    I have to use and what inputs I have to pass? I have already tried COM_PRODUCT_MAINTAIN_STATUS, but it is not working properly;
    maybe I passed the wrong input to that function module.
    Thanks,
    Anbusundaram A

    Hi,
    It's good that you pasted the complete log file. In your environment you have to run this upgrade tool only once, from any of the middle tiers.
    As for the error you got in the precheck, it is quite simple: all you have to do is run the following script while connected to the portal schema using SQL*Plus.
    Run dropupg.sql
    Location: /raid/product/OraHome_1/upgrade/temp/portal/prechktmp/dropupg.sql
    Then re-run the upgrade tool and let me know the status.
    Good luck
    Tanmai

  • Archiving flag for BP based on activity status and delete archvied BP

    Hi All,
    Can we set the archiving flag for a BP based on a specific activity status, and then delete these archived BPs using BUPA_DEL?
    Thanks

    Use function module BAPI_BUPA_CENTRAL_CHANGE and set the flag CENTRALARCHIVINGFLAG to 'X' in the structure
    BAPIBUS1006_CENTRAL (see the sketch below).
    Thanks,
    Thirumala.
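    A minimal sketch of such a call might look like the following. The BP number is an example value, and the name of the change-indicator structure (BAPIBUS1006_CENTRAL_X) and its flag field are assumptions to verify in SE37/SE11.

        DATA: ls_central   TYPE bapibus1006_central,
              ls_central_x TYPE bapibus1006_central_x,   " change-indicator structure (name assumed)
              lt_return    TYPE STANDARD TABLE OF bapiret2.

        ls_central-centralarchivingflag   = 'X'.
        ls_central_x-centralarchivingflag = 'X'.          " mark the field as changed

        CALL FUNCTION 'BAPI_BUPA_CENTRAL_CHANGE'
          EXPORTING
            businesspartner = '0000100001'                " example BP number
            centraldata     = ls_central
            centraldata_x   = ls_central_x
          TABLES
            return          = lt_return.

        " check lt_return for errors, then commit
        CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
          EXPORTING
            wait = 'X'.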

  • Archived deliveries

    Hi,
    We have an issue where a sales order is still open in the system while its delivery document has been archived. The document flow of the items shows the order status as "Being Processed" and the delivery status as "Archived". We have checked that the delivery no longer exists in LIKP but still exists in VBFA.
    When we try to display the delivery document we get the error message "Archive file 000985-001RV_LIKP does not exist".
    Please advise what could be causing this.
    Thanks!
    Laxmikanth

    Archive file 000985-001RV_LIKP
    This is probably the file (or storage location, possibly a tape) where the archived LIKP information was stored. You need to have the associated storage medium connected and accessible to the system to view these details.
    Thanks & Regards
    Ilango

  • Question on the status record of Idocs.

    Hi all,
    Where is the status of the IDoc updated:
    1) In the IDoc in the SAP system / IDoc archive,
    OR
    2) In the one we transfer to the OS as a flat file (for the translators / EDI subsystem to work on),
    OR
    both of the above?
    Thanks,
    Charles.

    Hi,
    Whenever an IDoc is generated, one or more status records are added to it, depending on whether it is inbound or outbound and on the processing steps and systems it goes through.
    Check transaction <b>WE47</b> to see all IDOC related status.
    This is the range of status for IDOC.
    Outbound IDocs: statuses 01 to 49
    Inbound IDocs: statuses 50 to 74
    You can also check the view <b>V_STACUST</b>. If you want to check the status records of a particular IDoc, look at table EDIDS (a small example follows at the end of this reply).
    Regarding two scenario you are talking about.
    1. In the IDoc in the SAP system / IDoc archive:
    - When an IDoc is generated, it gets status records.
    - When you archive an already created IDoc, the existing status records are neither updated nor changed, and no new status records are created.
    2. The one we transfer to the OS as a flat file (for the translators / EDI subsystem to work on):
    - If you mean that you are sending the IDoc to a file port, then yes, in that case the IDoc also has status records.
    Can you explain your second question in more detail?
    Let me know if you have any other question.
    Regards,
    RS
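    To illustrate the last point, here is a small sketch that reads the status records of one IDoc from table EDIDS; the IDoc number is an example value.

        DATA: lt_status TYPE STANDARD TABLE OF edids,
              ls_status TYPE edids.

        " read all status records of one IDoc, ordered by the status counter
        SELECT * FROM edids
          INTO TABLE lt_status
          WHERE docnum = '0000000000123456'          " example IDoc number
          ORDER BY countr.

        LOOP AT lt_status INTO ls_status.
          WRITE: / ls_status-countr, ls_status-status, ls_status-logdat.
        ENDLOOP.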

  • Archiving Purchase Info Records

    Hi,
    Can you please let me know what prerequisites need to be completed before archiving a purchase info record?
    Do we need to check any other dependency or flag besides the deletion indicator (table field LOEKZ = 'X')?
    Please suggest a solution for the above questions.
    Thanks,
    Jeevan.

    Hi Jeevan,
    In order to be able to delete an info record, you have to mark the info records for deletion.
    To do so, use transaction ME15 and flag the records for deletion. After you have flagged all info records, use archiving program RM06IW30 or follow the menu path Logistics - Materials Management - Purchasing - Master Data - Info Record - Follow-on Functions - Archive.
    Make sure that you only archive records marked for deletion, as all selected records may be deleted!
    Create Archive File: Info Record
    a) Select Action: Archive and enter a new variant, for example Z_EINA_ARCH_01, then press 'Maintain'.
    b) On the selection screen, enter the data range (vendor, material, etc.) you want to archive.
    c) Deselect the 'Test' flag if you don't want to test first.
    d) Press the green back arrow and enter the description of this new variant on the screen that follows.
    e) Save the variant, which brings you back to the selection screen. Press the green back arrow again.
    f) To start the archiving process (batch job), press the 'Start Date' button and select the time when you want to start. Select 'Immediate' for instant processing and press the 'Save' button at the bottom of the 'Start Time' window.
    g) Select the 'Spool Parameter' button and save the entries. Optionally enter a valid printer to have the result printed.
    h) You are now ready to start the process. Press the 'Start' button and monitor the progress with the 'Job Overview' button. You can also take the 'fast path' by using transaction SE38, program RM06IW30, to archive info records. For large data volumes, use background jobs and run them during off-peak times. If you run the program online, you will see a confirmation in the status bar saying 'New archive file created: ...'. (A sketch of scheduling the write report as a background job follows at the end of this reply.)
    Delete Archived Records: Info Record
    a) Follow the menu path: Tools - Administration - Administration - Archiving
    b) Select the Object Name MM_EINA for info records
    c) Select the menu button 'Delete'
    d) Select the menu button: 'Archive Selection'
    e) Click the archive created in previous step
    f) Select Start Date for process and Spool Parameters for output
    g) Submit selection.
    h) Check status by pressing the Job Overview button
    Reg,
    Ashok
    Assign points if useful.
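    If you want to schedule the write report from your own program instead of SARA, a minimal background-job sketch could look like this. The job and variant names are examples; the variant must already exist for RM06IW30.

        DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_ARCH_EINA_WRITE',
              lv_jobcount TYPE tbtcjob-jobcount.

        " open a background job container
        CALL FUNCTION 'JOB_OPEN'
          EXPORTING
            jobname  = lv_jobname
          IMPORTING
            jobcount = lv_jobcount.

        " put the archive write report with its saved variant into the job
        SUBMIT rm06iw30
          USING SELECTION-SET 'Z_EINA_ARCH_01'       " example variant name
          VIA JOB lv_jobname NUMBER lv_jobcount
          AND RETURN.

        " release the job for immediate execution
        CALL FUNCTION 'JOB_CLOSE'
          EXPORTING
            jobname   = lv_jobname
            jobcount  = lv_jobcount
            strtimmed = 'X'.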

  • Trapping KeyEvents anywhere in a JFrame for a status bar

    Hi all,
    There are a few examples of how to implement a status bar in the archives, but I haven't seen anything that explains how to trap KeyEvents (e.g. pertaining to CapsLock etc.) which occur anywhere in a JFrame: once you start adding focusable components (particularly but not exclusively JTextComponents) to a JFrame, the JFrame itself is often not sent the KeyEvents, so a rudimentary addKeyListener() is not the answer.
    Could this involve the use of KeyboardFocusManager or something? NB to experts: I have read the tutorial on the focus subsystem to no avail. The key seems to be to intercept the events in some way, perhaps?
    All help greatly appreciated...
    Mike

    Add an AWTEventListener on Toolkit.getDefaultToolkit() with the appropriate event mask. See the Javadoc of the addAWTEventListener method.

  • Would you like help me regarding archiving please .. ?

    Dear All,
    We are in an archiving project.
    Our scenario is to reload/restore archived data from the BW production system to the BW development server.
    I did try it: I moved the archive file to the known archive storage location in BW development, but the system did not recognize it.
    My questions:
    1. Is it possible to do this at all? How can I make it work?
    2. In transaction AOBJ I found that there is a reload program (for example, SBOOKL). What is it for? Could this program solve my case above?
    I really need your guidance.
    Best regards,
    Niel.

    Data Archiving
    Data Archiving, a service provided by SAP, removes mass data that the system no longer needs online, but which must still be accessible at a later date if required, from the database.
    Data Archiving removes from the database application data from closed business transactions that are no longer relevant for the operational business. The archived data is stored in archive files that can be accessed by the system in read-only mode.
    Reasons for Archiving
    There are both technical and legal reasons for archiving application data. Data Archiving:
    Resolves memory space and performance problems caused by large volumes of transaction data
    Ensures that data growth remains moderate so that the database remains manageable in the long term
    Ensures that companies can meet the legal requirements for data storage in a cost-efficient manner
    Ensures that data can be reused at a later date, for example, in new product development
    Data Archiving Requirements
    Data archiving is intended to do more than simply save the contents of database tables. Data archiving must also take the following requirements into consideration:
    Hardware independence
    Release dependence
    Data Dependencies
    Enterprise and business structure
    Optical Archiving
    The term "optical archiving" generally describes the electronic storage and management of documents in storage systems outside of the SAP business environment. Examples of documents that can be stored in this way include:
    Scanned-in original documents, such as incoming invoices
    Outgoing documents, such as invoices created in mySAP Financials that are created electronically, then sent in printed form
    Print lists created in mySAP Business Suite
    Residence Time and Retention Periods
    The residence time is the minimum length of time that data must spend in the database before it meets the archivability criteria. Residence times can be set in application-specific Customizing.
    The retention period is the entire time that data spends in the database before it is archived. The retention period cannot be set.
    Ex: If the residence time is a month, data that has been in the system for two months will be archived. Data that is only three weeks old remains in the database.
    Backup & Restore
    Backup is a copy of the database contents that can be used in the case of a system breakdown. The aim is that as much of the database as possible can be restored to its state before the system breakdown. Backups are usually made at regular intervals according to a standard procedure (complete or incremental backup).
    Reloading the saved data into the file system is called restoring the data.
    Archiving Features
    Data Security
    Data archiving is carried out in two steps (a third step, the storage of archive files, is optional): In the first step, the data to be archived is copied to archive files. In the second step, the data is deleted from the database. This two-step process guarantees data security if problems occur during the archiving process.
    For example, the procedure identifies network data transfer errors between the database and the archive file. If an error occurs, you can restart the archiving process at any time because the data is still either in the database or in an archive file. This means that you can usually archive parallel to the online application, that is, during normal system operation, without having to back up the database first.
    You can further increase data security if you store the archive files in an external storage system before you delete the data from the database. This guarantees that the data from the database will only be deleted after it has been securely stored in an external storage system.
    Data Compression
    During archiving, data is automatically compressed by up to a factor of 5. However, if the data to be archived is stored in cluster tables, no additional compression takes place.
    Storage Space Gained
    Increased storage space in the database and the resulting performance gains in the application programs are the most important benefits of data archiving. Therefore it is useful to know how much space the data to be archived takes up in the database. It may also help to know in advance how much space the archive files that you create will need.
    Note: - Data is compressed before it is written to the archive file. The extent of the compression depends on how much text (character fields) the object contains.
    Archiving without Backup
    With SAP Data Archiving, data can be archived independently of general backup operations on the database. However, SAP recommends that you back up archive files before storing them.
    Accessing Archived Data
    Because archived data has only been removed from the database and not from the application component itself, the data is always available. Archive management allows three types of access:
    1.     (Read) access to a single data object, such as an accounting document
    2.     Analysis of an archive file (sequential read)
    3.     Reload into the database (not possible for all archiving objects)
    Converting Old Archive Files
    When archived data is read, the system automatically makes the conversions required by hardware and software changes.
    When old archive files are accessed, the Archive Development Kit (ADK) can make allowances for changes to database structures (field types, field lengths, new fields, and deleted fields) after the data was archived and for changes to hardware-dependent storage formats. This is only done on a temporary basis during read access. The data in the archive file is not changed. The following items are changed (if necessary) during automatic conversion:
    Database table schema (new and deleted columns)
    Data type of a column
    Column length
    Code page (ASCII, EBCDIC)
    Number format (such as the use of the integer format on various hardware platforms)
    If database structures in an application have undergone more changes than the ADK can handle (for example, if fields have been moved from one table to another or if one table has been divided into several separate tables), then a program is usually provided by the relevant mySAP Business Suite solution for the permanent conversion of existing archive files.
    Link to External Storage System
    Archive files created by Data Archiving can be stored on tertiary storage media, such as WORMs, magnetic-optical disks (MO), and tapes using the SAP Content Management Infrastructure (which also contains the ArchiveLink/CMS interface). This can be done manually or automatically.
    You can also store archive files in the file system of an HSM system. The HSM system manages the archive files automatically. For storage, the HSM system can also use tertiary storage media, such as MO-disks.
    CMI/R u2013 Content Management Infrastructure / Repository
    HSM u2013 Hierarchical Storage Management Systems
    Archiving Procedure
    The basic archiving procedure is carried out in three steps:
    Creating the Archive Files
    Storing Archive Files
    Executing the Delete Programs 
    Security Vs Performance
    Optionally, you can store archive files after the delete phase. To do this, you must mark Delete Phase Before Storage in archiving object-specific Customizing.
    If security is your main concern, then you should not schedule the delete phase until after the archive files have been stored. In this way you know that the data will only be deleted from the database after the archive files have successfully been moved to the external storage system. In addition, you can set the system to read the data from the storage system and not from the file system.
    However, if your main concern is the performance of the archiving programs, then you should schedule the delete program first and then store the files.
    Creating Archive Files (WRITE)
    In step one, the write program creates an archive file. The data to be archived is then read from the database and written to the archive file in the background. This process continues until one of the following three events occurs:
    All the data is written to an archive file
    Archiving is not complete, but the archive file reaches the maximum size specified in archiving object-specific Customizing
    The archiving is not yet finished, but the archive file contains the maximum number of data objects specified in Customizing.
    If in cases 2 and 3 there is still data to be archived, the system will create another archiving file.
    Storing Archive Files (STORE)
    Once the write program has finished creating archive files, these can be stored. There are several ways of storing archive files:
    Storage Systems:
    If a storage system is connected to mySAP Business Suite: at the end of a successful write job, a request is sent to this system to store the new archive files (provided the appropriate settings were made in archiving object-specific Customizing). You can also store archive files manually at a later point if you do not want them to be stored automatically. Storage is carried out by the SAP Content Management Infrastructure (which contains the ArchiveLink/CMS interface).
    HSM Systems:
    If you use an HSM system, it is sufficient to maintain the file name in Customizing (Transaction FILE). You do not then need to communicate with the storage system using the SAP Content Management Infrastructure, because the HSM system stores the files on suitable storage media according to access frequency and storage space.
    Existing Storage Media:
    Once the delete program has processed the relevant archive file, you can manually copy archive files to tape.
    Running Delete Programs
    After closing the first archive file, the archive management system creates a new archive file and continues with the archiving process. While this happens, another program reads the archived data from the completed archive file and deletes it from the database. This procedure guarantees that only data that has been correctly saved in the archive file is deleted from the database.
    If you do not carry out deletion until after the data has been stored, you can make a setting in archiving object-specific Customizing so that the system reads the archive files from the storage system during deletion. In this way, you can detect in good time any errors that arise when transferring or saving the archive files in the storage system.
    When the last archive file is closed, a delete program starts to run for this file. Several delete programs can run simultaneously for previously created archive files. Because, unlike the delete program, the write program does not generally carry out any database-changing operations, it creates new archive files faster than the delete programs can process them. This decreases the total archiving runtime because the database is used more efficiently.
    Note:-
    Scheduling the archive jobs outside SARA
    WRITE:
    Use an external job scheduler (SM36, SM62).
    The WRITE run is followed by the event SAP_ARCHIVING_WRITE_FINISHED; the event parameter is the session number.
    To analyze the archiving information of a particular session, use FM ARCHIVE_GET_FILES_OF_SESSION; its input is the session number.
    DELETE:
    Use an external job scheduler (SM36, SM62).
    Use program RSARCHD; inputs are the object name, the maximum number of files, sessions, and jobs, and the background user.
    The DELETE run is followed by the event SAP_ARCHIVING_DELETE_FINISHED; the event parameter is the session number.
    To analyze the archiving information of a particular session, use FM ARCHIVE_GET_FILES_OF_SESSION; its input is the session number.
    (A sketch of scheduling an event-triggered delete job follows below.)
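    As an illustration, scheduling RSARCHD so that it starts when the write-finished event is raised could look roughly like this. The job and variant names are examples, and the EVENT_ID/EVENT_PARAM handling should be verified for your release.

        DATA: lv_jobname  TYPE tbtcjob-jobname VALUE 'Z_ARCH_DELETE_FOLLOWUP',
              lv_jobcount TYPE tbtcjob-jobcount.

        CALL FUNCTION 'JOB_OPEN'
          EXPORTING
            jobname  = lv_jobname
          IMPORTING
            jobcount = lv_jobcount.

        " RSARCHD schedules the delete jobs for finished archive files of one object
        SUBMIT rsarchd
          USING SELECTION-SET 'Z_MM_EINA_DEL'          " example variant (object name, limits, user)
          VIA JOB lv_jobname NUMBER lv_jobcount
          AND RETURN.

        " release the job so that it waits for the event raised after the write run
        CALL FUNCTION 'JOB_CLOSE'
          EXPORTING
            jobname     = lv_jobname
            jobcount    = lv_jobcount
            event_id    = 'SAP_ARCHIVING_WRITE_FINISHED'
            event_param = space.                        " optionally restrict to one session number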
    Archiving Object
    The archiving object is a central component of SAP Data Archiving. The archiving object specifies precisely which data is archived and how. It describes which database objects must be handled together as a single business object and interprets the data irrespective of the technical specifications at the time of archiving (such as release and hardware).
    Note:-
         An archiving object has a name of up to ten characters in length.
         Transaction code to maintain the Archiving Object is AOBJ.
    The following programs must (or can) be assigned to an archiving object. The SAP System contains programs (some of which are optional) for the following actions:
    Preprocessing (Optional)
    Some archiving objects require a preprocessing program that prepares the data for archiving. This preprocessing program marks data to be archived, but it does not delete any data from the database. Preprocessing programs must always be scheduled manually and are run from Archive Administration.
    Write
    This program creates archive files and writes data to them. At this point, however, no data is being deleted from the database.
    You can specify in archiving object-specific Customizing whether the next phase (delete) is to take place automatically after the archive files have been created. Delete jobs can also be event-triggered. To do this, you set up the trigger event in archiving object-specific Customizing.
    Delete
    This function can entail several activities. The activities always depend on the existing archive files. Normally, the data is deleted from the database. However, in some cases, the archived data in the database may only be given a deletion indicator.
    In archiving object-specific Customizing, you can specify that archive files, after successful processing, are to be transferred to an external storage system using the SAP Content Management Infrastructure (which contains the ArchiveLink/CMS interface).
    Postprocessing (Optional)
    This function is usually carried out after deletion has taken place. It is not available for all archiving objects. If the data has not yet been deleted from the database by the delete program, it is deleted by the postprocessing program.
    Reload Archive (Optional)
    You can reload archived data from the archive files into the database using this function. It is not available for all archiving objects. To access this function, choose Goto -> Reload.
    Index (Optional)
    This function builds (or deletes) an index that allows individual access. It is not included in every archiving object.
    Data Object
    A data object is the application-specific instance of an archiving object, that is, an archiving object filled with concrete application data. The Archive Development Kit (ADK) ensures that data objects are written sequentially to an archive file. All data objects in an archive file have the same structure, which is described in the archiving object.
    Archive Administration (SARA)
    All interaction relating to data archiving takes place in the Archive Administration (transaction SARA). Features of Archive Administration:
    Preprocessing
    Write
    Delete
    Postprocessing
    Read - enables you to schedule and run a program that reads and analyzes archived data.
    Index
    Storage System - enables archive files to be transferred to a connected storage system and enables stored archive files to be retrieved from the storage system.
    Management - offers an overview of the archiving sessions for one archiving object.
    Depending on the action you have selected, you can use Goto on the menu to access the following menu options:
         Network Graphic
         Reload
         Customizing
         Job Overview
         Management
         Stored Files
         Database Tables
         Infosystems
         Statistics
         Interrupting and Continuing
    Archive Development Kit
    The Archive Development Kit (ADK) is a tool for developing archiving solutions. It also prepares the runtime environment for archiving. From a technical viewpoint, it is an intermediate layer between the application program and the archive that provides all the functions required for archiving data.
    The ADK functions are required for archiving and for subsequent access to archived data. The ADK automatically performs the hardware-dependent adjustments (such as code page and number format) and structural changes that are required when archive files are created. When the archive files are accessed later, the ADK temporarily converts data that was archived using earlier SAP releases.
    Note:-
    S_ARCHIVE is the SAP-delivered authorization object for data archiving. The Archive Development Kit (ADK) performs the check when an archive file is opened for one of the following actions (a minimal authority-check sketch follows after the list):
    Write
    Delete
    Read
    Reload
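    A minimal sketch of such a check in your own coding; the activity and field values are example assumptions that should be verified against the authorization object definition in SU21.

        " check whether the current user may work with archive files of object MM_EKKO
        AUTHORITY-CHECK OBJECT 'S_ARCHIVE'
          ID 'ACTVT'    FIELD '01'          " example activity (verify the intended action in SU21)
          ID 'APPLIC'   FIELD 'MM'          " application area of the archiving object
          ID 'ARCH_OBJ' FIELD 'MM_EKKO'.    " archiving object name
        IF sy-subrc <> 0.
          MESSAGE 'No archiving authorization for MM_EKKO' TYPE 'E'.
        ENDIF.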
    Database Tables in Archive Administration (DB15)
    This enables you to display all of the tables for a specific archiving object, as well as the list of archiving objects that refer to a particular database table. It also displays storage and space statistics and provides further information, such as the time and number of the last archiving session and various details on the client used.
    Network Graphic
    You can use the network graphic to show any existing dependencies between archiving objects. It shows business process flows and contexts that can influence the archiving sequence. In particular, at the start of an archiving object, you can use the network graphic to obtain a good overview of related documents.
    In an archiving session, you must take into account any dependencies between archiving objects that require a specific archiving sequence. In general, you cannot archive data for an archiving object that has preceding objects until these preceding objects have been archived.
    You can use the network graphic to determine whether the archiving object that you want to use has preceding objects. If so, the preceding objects should be implemented before the current archiving object. The nodes in the network graphic represent the archiving objects. A node displays the following information:
    Archive Object Name
    Application Component Name
    Short Description
    Date of last archiving
    Status of the session
    If the status is 'Green': archiving and deletion were successful.
    If the status is 'Yellow': successfully archived but not yet deleted, or archiving still running, or delete in progress, or delete cancelled.
    If the status is 'Red': not yet archived, or archiving cancelled.
    Standard Log (Spool List)
    During archiving, a log is usually generated. This can be done during the write, delete, read, or reload phases. This is usually in the form of a standard log. In some cases, an application-specific log may be generated.  Depending on the archiving action that was carried out, the standard log contains statistical information per archiving session or archive file according to the following categories:
    Archiving session number
    Number of data objects for processing
    Archive session size in megabytes
    Total header data in %
    Table space in MB occupied for:
              Tables
              Indexes
    Number of table entries processed
    You can call the standard log from the screen Archive Administration: Overview of Archiving Sessions. Choose Spool List.
    Accessing Archived Data
    Data that was archived using SAP Data Archiving has been relocated from the database but not placed beyond the application. Data is still available for read access and analysis. In some cases, archived data can even be reloaded into the database.
    Note:-
    A prerequisite of read access and reload access, is that the file can be found in the file system.
    Three types of access are possible:
    (Read) access to a single data object, such as an accounting document
    Direct access or single document access requires an index that can be built either during archiving or at a later point. A complex search of the documents stored in the archive files, in which all orders of an article in a particular batch are required for a product recall action, is not possible.
    The Archive Information System (AS) supports direct access using archive information structures that can be generated automatically either when the archive files are being written, or at a later point.
    Analysis of an archive file (sequential read)
    It is possible to run an analysis for one or several archiving sessions. The results of the analyzed data objects are displayed in a list. Furthermore, some archiving objects offer the option of a combined analysis. With this option, you can link current data in the database and archived data.
    Reloading into the database
    Archived data does not usually need to be reloaded because it remains accessible by the applications. There is also a lot of data that cannot be reloaded or for which reloading is problematic. For this reason, reload programs do not exist for all archiving objects.
    Archiving Session Overview
    On this screen, you can display and edit management information on archiving sessions. An archiving session comprises the write and delete jobs. Within a status area, archiving sessions are, by default, organized in groups of 20. The sessions are ordered according to their status.
    Interrupting and Continuing
    So that data archiving can be seamlessly integrated into the production system, you can interrupt an archiving session during the write phase and continue it at a later time. This enables you to react, during archiving, to specific time constraints or hard-disk space shortages. You can continue and complete interrupted archiving sessions when you have more time or more storage space.
    To interrupt an archiving session:
    The archiving object must be registered in transaction AOBJ as interruptible, otherwise the Archive Development Kit (ADK) is unable to inform the write program of the interruption request.
    The write program must be able to process the interruption request.
    The archiving session must be run in production mode (not test mode) and be in process.
    The delete phase must be able to start before the write phase has finished (setting in transaction AOBJ).
    To continue an archiving session:
    The session must have been interrupted within the context of the interruption concept. Archiving sessions that were interrupted for other reasons or that were terminated by archive management cannot be continued.
    The delete phase must have completed for the data that was archived up to the point of interruption, that is, the archiving session must have the status "completed".
    Database Action Before and After Archiving
    Archiving uses application software that depends on and affects the organization of the database data. You should therefore organize the database before and after archiving.
    Before Archiving
    Archiving application data helps to prevent storage and performance bottlenecks. Since relocating data can, in some circumstances, in itself impair performance (this is the case if you need to access archived data), you need to consider carefully what data to archive. To determine whether or not you should archive data, consider the following questions:
    If there are memory problems, can more memory be assigned to the table?
    How likely is it that you will need to access the archived data again? How often?
    Is the data accessed using an optimal index?
    Does the application perform a full table scan on the tables that contain the data to be archived?
    After Archiving
    Reorganize index: If data has been archived or simply deleted and the associated tables were accessed via an index, the index should be reorganized. Deleting table entries leaves holes in the table which are still indexed. Reorganization can shorten the access paths, reducing response times.
    Update the database statistics: If your database uses a cost-based optimizer, you must choose Update Statistics to recalculate the access paths.
    Reorganize tablespace or database space: Whether you should reorganize the tablespace depends on the reason for archiving.
    Do you expect a lot of new data for the archived tables?
    Do you want to make space for other tables?
    Note:-
    Reorganization takes a long time and may need to be repeated after archiving. Throughput during a reorganization:
    With export/import: approximately 60-100 MB/hour.
    With unload/load: approximately 250-300 MB/hour.
    Perform an SQL Trace after reorganization.
    Statistics
    When writing, deleting, reading, or reloading, statistical data on each archiving run is automatically generated and persistently stored in the database. The data archiving administrators can analyze these figures so that they can better plan future archiving projects and request the necessary resources. Statistics also provide pertinent information on the role of data archiving in reducing the data volume in the database.
    You can call this screen directly from the Archive Administration (SARA), or using the transaction SAR_DA_STAT_ANALYSIS. It displays the following information:
         Archiving Session Number
         Archiving Object Name
         Client ID on which the archiving session was carried out
         Date on which the archiving session was carried out
         Status of the session number
         Portion of the Header data in the archiving session
         DB Space (WRITE) - virtual storage space in MB occupied in the database by an incomplete archiving session
         DB Storage Space (DELETE) - virtual storage space in MB occupied in the database by an incomplete archiving session
         DB Space (Reload) - virtual storage space in MB
         Written Data Objects in an incomplete archiving session
         Deleted Data Objects for an incomplete archiving session in database
         Reloaded Data Objects
         Number of delete jobs
         Write job duration
         Delete job duration
         Reload job duration
    Logical Path and File
    Archive files are stored in the file system under a physical path and file name that is derived from a user-definable logical path or file name. The definition can be divided into the following steps:
    Definition of the logical path name
    Definition of the logical file name
    Assignment of the logical file name to the archiving object
    By default, the system uses the logical file name ARCHIVE_DATA_FILE and the logical path name ARCHIVE_GLOBAL_PATH. Consequently, the names only need to be changed if they have to be adjusted to meet special requirements. (A small FILE_GET_NAME example follows below.)
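    To see which physical file a logical file name resolves to, a small sketch; FILE_GET_NAME is the commonly used FM for this, and the typing of the result variable is an assumption to verify in your system.

        DATA: lv_file TYPE filename-fileextern.        " resolved physical file name

        " resolve the physical path/file from the logical file name used by archiving
        CALL FUNCTION 'FILE_GET_NAME'
          EXPORTING
            logical_filename = 'ARCHIVE_DATA_FILE'
          IMPORTING
            file_name        = lv_file
          EXCEPTIONS
            file_not_found   = 1
            OTHERS           = 2.
        IF sy-subrc = 0.
          WRITE: / 'Physical file:', lv_file.
        ENDIF.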
    Data Archiving Monitor
    Use this indicator to activate or deactivate the data archiving monitor (transaction SAR_SHOW_MONITOR). If you mark this checkbox before data archiving, archiving-relevant information on the write and delete jobs is updated. This information can be analyzed using the data archiving monitor. If there are errors, alerts are issued.
    The data archiving monitor offers the following information:
    Overview of all the archiving objects that have been run
    Detailed information on the individual archiving sessions
    Processing status display
    Help on analyzing open alerts

  • Archiving purchase orders of a specific order-type

    hello,
    Does anyone have a simple document or plan for archiving purchase orders ("archiving for dummies"), as I have never done this before?
    Points to be taken into consideration:
    - orders of a specific type (ZRL0)
    - orders must be older than 10 days
    thanks,

    Hi,
    In order to archive POs, there are a number of considerations that must be taken into account.
    The archiving program (SARA -> MM_EKKO, Write) does a large number of checks before archiving can be done.
    The first step is to go to SARA -> MM_EKKO, Customizing. Here you can specify which kinds of documents you want to archive and set the retention period (document life) for each type of document.
    For example :
    Go to C MM-PUR : Reorganization PO
    You will find several type of PO Doc types
    DB   Dummy purchase order        Standard
    NB   Standard PO                 Standard
    NB   Standard PO             K   Consignment
    NB   Standard PO             L   Subcontracting
    NB   Standard PO             S   Third-party
    NB   Standard PO             D   Service
    UB   Stock transport order   U   Stock transfer
    Suppose you want to archive standard POs: go into the Customizing option and set residence time 1 and residence time 2.
    Residence time 1 is the number of days that must elapse before a deletion indicator can be set for the document. The elapsed days are calculated from the last change date of the document record in table EKKO.
    Residence time 2 is the number of days that must elapse before a document with a deletion indicator can actually be deleted from table EKKO.
    Once the Customizing is finished, run the SARA preprocessing job (which sets the deletion indicators for the selected set of documents), then the write job (which writes the documents with deletion status to an archive file), and then the delete step (which removes the documents from EKKO and the other related purchasing tables). To get a preview of what gets archived and deleted, run the archive write job in "Test" mode rather than "Production" mode.
    Hope this helps.
    -Chandra

  • Change invoice status on old sales order?

    Hi,
    We are trying to archive old sales orders, but we have a problem with some of them.
    The sales orders are not invoiced (for different reasons), so we cannot archive them (overall status: Being Processed).
    Since the sales orders are very old, we do not want to create invoices for them; we just want to somehow force the status so we can archive them.
    What's the easiest way to do this?
    Thank you!
    Best Regards
    Lars

    Hi,
    The easiest way is to maintain a reason for rejection for these orders.
    If they are small in number, maintain the reason for rejection individually.
    If the number of orders is high, go for transaction MASS:
    Select the object type BUS2032.
    Execute.
    Select "Sales Order Item Data" on the "Tables" tab.
    Click on "Fields".
    Search for the field MASSVBAP-ABGRU and select that line. Click on "Execute".
    Enter the document number range and maintain the rejection reason. (Alternatively, the reason for rejection can be set programmatically with BAPI_SALESORDER_CHANGE; a sketch follows below.)
    Regards,
    Krishna.
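    A minimal sketch of the BAPI alternative; the order number, item number, and rejection reason code are example values to adapt.

        DATA: ls_header_inx TYPE bapisdh1x,
              lt_items      TYPE STANDARD TABLE OF bapisditm,
              ls_item       TYPE bapisditm,
              lt_items_x    TYPE STANDARD TABLE OF bapisditmx,
              ls_item_x     TYPE bapisditmx,
              lt_return     TYPE STANDARD TABLE OF bapiret2.

        ls_header_inx-updateflag = 'U'.              " change an existing order

        ls_item-itm_number = '000010'.               " example item
        ls_item-reason_rej = 'Z1'.                   " rejection reason code (example)
        APPEND ls_item TO lt_items.

        ls_item_x-itm_number = '000010'.
        ls_item_x-updateflag = 'U'.
        ls_item_x-reason_rej = 'X'.                  " mark the field as changed
        APPEND ls_item_x TO lt_items_x.

        CALL FUNCTION 'BAPI_SALESORDER_CHANGE'
          EXPORTING
            salesdocument    = '0000001234'          " example order number
            order_header_inx = ls_header_inx
          TABLES
            return           = lt_return
            order_item_in    = lt_items
            order_item_inx   = lt_items_x.

        " check lt_return for errors, then commit
        CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
          EXPORTING
            wait = 'X'.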

  • Archiving of Info Records

    Hi,
    I have 22 materials, and the user wants me to archive all the info records related to these 22 materials.
    Please guide me on the prerequisites that need to be completed before archiving all the info records.
    Note: We had errors in Production system, which we want to reproduce in Quality system.
    Cheers,
    Kumar.S

    Hi
    Follow the below steps:
    1. Flag the info records for deletion (ME15).
    This sets the deletion flag only in the purchase info record.
    Once the deletion flag is set, you need to archive the records using SARA with archiving object MM_EINA, which deletes the info records from the database.
    2. Run the archiving program RM06IW30 (menu path: Logistics - Materials Management - Purchasing - Master Data - Info Record - Follow-on Functions - Archive).
    Create Archive File: Info Record:
    a) Select Action: Archive and enter a new variant, for example Z_EINA_ARCH_01, then press 'Maintain'.
    b) On the selection screen, enter the data range (vendor, material, etc.) you want to archive.
    c) Deselect the 'Test' flag if you don't want to test first.
    d) Press the green back arrow and enter the description of this new variant on the screen that follows.
    e) Save the variant, which brings you back to the selection screen. Press the green back arrow again.
    f) To start the archiving process (batch job), press the 'Start Date' button and select the time when you want to start. Select 'Immediate' for instant processing and press the 'Save' button at the bottom of the 'Start Time' window.
    g) Select the 'Spool Parameter' button and save the entries. Optionally enter a valid printer to have the result printed.
    h) You are now ready to start the process. Press the 'Start' button and monitor the progress with the 'Job Overview' button. You can also take the 'fast path' by using transaction SE38, program RM06IW30, to archive info records. For large data volumes, use background jobs and run them during off-peak times. If you run the program online, you will see a confirmation in the status bar saying 'New archive file created: ...'.
    Delete Archived Records: Info Record
    a) Follow the menu path: Tools - Administration - Administration - Archiving
    b) Select the Object Name MM_EINA for info records
    c) Select the menu button 'Delete'
    d) Select the menu button: 'Archive Selection'
    e) Click the archive created in previous step
    f) Select Start Date for process and Spool Parameters for output
    g) Submit selection.
    h) Check status by pressing the Job Overview button 
    regards
    Antony
