Archiving & Storage

Hi,
I've been asked to provide a simple archiving solution by one of our functional guys, which involves storing the archive file and storage file on the same physical server where the SAP instance lives.
I have quite easily managed to set up the archiving directory ARCHIVE_GLOBAL_PATH "<MYIPADDRESS>\archive\<MYSID>\<FILENAME>" under basic customizing through transaction SARA. My functional colleague has successfully saved archive files to this folder; however, he now needs to store these in a storage area.
I have set up a content repository (transaction OAC0) called Z1, and on the object to be archived, FI_ACCPAYB (Vendor master data), I have set Z1 under "File storage to storage system". This is where I believe I have the issue!
Z1 is setup as follows:
Storage Type : RFC Archive
Version : 0031
RFC Destination : <MYSID>
Transfer Directory :
<MYIPADDRESS>\archive\<MYSID>\storage\
What this is actually doing is using an RFC back to itself, rather than to an external SAP system. When I go to store this file through SARA management and the background job runs, it cancels with the following error:
The archive file 000033-001FI_ACCPAYB is being processed
Error opening file 000033-001FI_ACCPAYB
Error occurred when checking the stored archive file 000033-001FI_ACCPAYB
When I check the folder above (<MYIPADDRESS>\archive\<MYSID>\storage\), the archive file has been copied to this directory; however, the status light inside SARA management still shows red (not stored).
Is there an alternative to RFC Archive? Any suggestions would be gratefully received.
Thanks,
James

Hi James,
Let's start from the basics...
Archiving is done in 2 steps (or 3 steps, depending on whether storage is used):
1. Archive - reads the data from the DB and saves it to the filesystem in the form of archive files.
2. Delete - deletes the already-archived data from the DB.
(3. Store - moves the archive file into a storage system)
Data archiving is based on ADK, the Archive Development Kit, so steps 1 and 2 are based on ADK. But when step 3 comes into the picture, something else is also involved: it's called ArchiveLink!
Now, ArchiveLink comes in 2 flavors: one is RFC based (for which you created a content repository) and the second is the HTTP Content Server interface.
To use these, you need a storage system (either an external storage system or the SAP Content Server) which can 'understand' and 'talk' ArchiveLink.
As I understand from your query, you are trying to store your archive files in the filesystem again, which is not possible directly; you need to use a storage system.
So now you have 3 options:
1. Leave the archive files in the file system. If so, leave the content repository ID blank in the object-specific customizing in SARA.
2. Use an external storage system - various vendors are available on the market.
3. Use the SAP Content Server - see the following links for more details:
http://help.sap.com/saphelp_nw04/helpdata/en/59/fba637fcf7dc39e10000009b38f8cf/frameset.htm
http://help.sap.com/saphelp_nw04/helpdata/en/8c/e9ddbb5d9a524bbb7854d31b963248/frameset.htm
Hope this helps,
Naveen

Similar Messages

  • DVD archival storage

    Is there a best brand/type of DVD for archival storage of digital video? Secondly, I'm using a Sony DVD recorder to transfer old Hi8 analog tapes to DVD (since it cannot be done with iMovie). What DVD format should I use to then play and edit on my iMac20?

    DVDs and archival storage are really something of an oxymoron; they are not reliable for this purpose. Tape is the very best archival medium, but since you don't have that option, I would suggest you look for Taiyo Yuden DVD-R media.
    The common DVD format is MPEG2, for which you will need MPEG Streamclip (freeware) and Apple's QuickTime MPEG-2 plug-in ($20.00) to work with it in iMovie.
    You could make all your requirements (archive, play and edit) much simpler if you bought yourself a miniDV cam and shelved the DVD cam.

  • Can a time machine backup also be used for archive storage?

    Can the same time machine backup drive also be used for archive storage from another Macbook?

    The way Time Machine works, you can guarantee that anything on your computer now will be backed up and archival copies kept. But there are two caveats. First, Time Machine trims backups. For 24 hours they are kept hourly (approximately). Then all but the most current are trimmed and a week's worth of daily backups is kept. Then the week is trimmed to the most current versions and weekly backups are kept until the drive is too full. At that point the oldest weekly backup(s) will be erased to make room for the next hourly backup. When trimming occurs, files that were backed up but are no longer on your computer will also be trimmed. This is the second caveat.
    Since this trimming is automatic rather than under your control, Time Machine isn't a good archival backup system. Two programs that I can recommend that do have true archival ability are Carbon Copy Cloner and ChronoSync. Both allow you to decide what gets archived and how many archival periods to allow.

  • Video Chat Archive/Storage

    Hello Friends,
    I have developed an audio/video/text chat application with multiple users. I would like to store/save a live session as a single video so that we can manage archived live sessions and make them available to other users just like a normal video/FLV.
    I know we can store a single live stream with DVR, but I need to store all the live cams in a single video.
    Note: I'm using FMIS for the above application (with RTMP).
    Any help will be very much appreciated.
    regards/Amitpal

    This is not possible with FMS. You would need to use other software to post-process the videos into a single file (I'm not aware of such software... if you find something suitable, please share your findings).
    What you can do with FMS is develop a timecode method and use that timecode to play the streams back in a synchronized manner. This would go against your plan of a single downloadable file, and would require a specific playback application designed to support your timecode system.

  • Archiving material master - storage location

    Hi experts,
    I am in the process of archiving storage locations for a few material codes which are split-valuated.
    The materials are extended to 3-4 storage locations, and one storage location has to be removed.
    I am using transaction code SARA with archiving object MM_MATNR and getting the following error:
    plant/storage location dependent batch records exist.
    Please suggest how to correct the error.
    A relevant solution will be suitably rewarded.
    regards,
    Pawan Khurana

    I had not spoken about the tick for delete in test mode on the selection screen.
    I spoke about the variants for test and production runs in the customizing for your archiving object.
    If the variant that is saved for the production run has the tick for test run, then it just will not work.
    Further, you should select X Complete for the detail log, and list for P_PROT_O.
    This way you should get a spool file that tells you more about errors or successfully archived items.
    What did you do with the box in front of "Consider batch record without deletion flag"?
    If it is empty, are you sure you have manually set the deletion indicator in MSC2N?
    What about the other boxes in this options section?

  • Why does the iCal app on my MacBook Pro only have a 2-year archive? Where did everything before that go? And can I get it back?

    I use my iCal as a daybook and every now and again need to reference events or occurrences from several years ago. I tried to look something up from back in 2010, and the iCal on my MacBook Pro only has data from Jan 1, 2011 onwards. Everything I've put in prior to that seems to have disappeared! I can't find anything under preferences that mentions archiving/storage. Can I change a setting so that this info is kept forever? And can I get my previous info back?

    You can only sync with one computer. When you synced to the new computer, it would have erased the data and synced what was on the new computer.
    You always need to have a backup; an iPhone is not a storage device.
    When you get a new computer, you need to transfer your library before syncing.
    If you don't have a backup, you may be out of luck.
    The only other option would be if you're in the US and have access to the iCloud beta to reload purchases:
    http://support.apple.com/kb/ht2519

  • How to find out the storage category

    Hi all,
    I am currently working on archiving some of our DMS documents. We are doing this by moving the docs that meet the criteria to a different storage category on a different server.
    I would like to quickly establish how many docs we have archived so far by seeing how many docs are in the "Archive" storage category. Is there a back end table that has this information? I just want to do a quick count of all docs where storage category is "Archive".
    Thanks in advance!
    Kathy

    Hi,
    You have the DOKAR, DOKNR, DOKVR and DOKTL fields in table DMS_DOC2LOIO; based on those values, get the LO_OBJID from DMS_DOC2LOIO. Then pass the LO_OBJID into DMS_PH_CD1-LOIO_ID to get the required details.
    Thanks & Regards
    Bala Krishna
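
    For what it's worth, here is a minimal ABAP sketch of that lookup, run in the reverse direction to get a quick count. The DMS_DOC2LOIO fields and the DMS_PH_CD1-LOIO_ID link come from the reply above; the storage-category field name (STOR_CAT) and the category value 'ARCHIVE' are assumptions - check both in SE11 before relying on the result.

    REPORT zdms_count_archive_category.

    DATA: lt_phio       TYPE STANDARD TABLE OF dms_ph_cd1,
          lt_docs       TYPE STANDARD TABLE OF dms_doc2loio,
          lv_phio_count TYPE i,
          lv_doc_count  TYPE i.

    " 1. Physical info objects in the 'Archive' storage category
    "    (field name STOR_CAT and value 'ARCHIVE' are assumptions)
    SELECT * FROM dms_ph_cd1 INTO TABLE lt_phio
      WHERE stor_cat = 'ARCHIVE'.

    " 2. Map them back to the document info records via DMS_DOC2LOIO
    IF lt_phio IS NOT INITIAL.
      SELECT * FROM dms_doc2loio INTO TABLE lt_docs
        FOR ALL ENTRIES IN lt_phio
        WHERE lo_objid = lt_phio-loio_id.
    ENDIF.

    DESCRIBE TABLE lt_phio LINES lv_phio_count.
    DESCRIBE TABLE lt_docs LINES lv_doc_count.

    WRITE: / 'Physical objects in the Archive category:', lv_phio_count,
           / 'Linked document info records:', lv_doc_count.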

  • Moving Photos from iCloud Photo Library to Local Storage

    Scenario - I've a fully migrated library of photos/videos using iCloud Photo Library on iPhone and Mac.  It's near the limit of the iCloud storage plan I purchased and want to retain.  I'd like to move older and less frequently used content from iCloud Photo Library to more permanent archival storage.
    [This is for two reasons.  First, I prefer to use the Full Resolution setting on mobile devices, and that will be impossible as the entire library grows beyond the storage capacity of even the largest mobile devices.  Second, I don't feel the need to pay for super-sized iCloud storage for content rarely needed and only needed on a Mac.]
    The only option I've identified in Photos to do this is to Export (and delete from iCloud), which exports the original photos, but does not preserve useful Photos metadata and organization, such as Albums.
    What one might like to see is a way to designate selected portions of the Library for local storage only (including backup within the Photos app library package), so those photos can be manipulated within Photos alongside iCloud content but don't consume iCloud or mobile device space. Or, in the alternative, one would like to see a way to export content and merge it into a separate Photos library package, preserving the metadata/organization. In that way, one could maintain one Photos library as current iCloud-synced content, and one or more local-only Photos library packages with archival content (with, importantly, the Export function to move content between the two while preserving metadata).
    Does anyone know if there's a way to do this?  If not, Apple, would you consider coming up with a way to address this need?

    Nissin101 wrote @ 3:36pm EMT: "Well, I was able to move photos from the camera roll to the photo library by sending the pictures via email to my dad's BlackBerry, then saving them to my computer from his phone, then putting them back into the photo library."
    This is what I said originally.
    Nissin101 wrote @ 4:08pm EMT: "Alright, I guess that answers my question then. However, just as I said, I was able to transfer photos from my camera roll to my photo library, so at least that is possible."
    I never said that I did it directly, nor did I mean to imply that I was looking for a direct solution. This, I guess, is where our misunderstanding comes from. I just did not feel like repeating the whole process I went through. Regardless, I would rather this thread not derail into who said what and whose misunderstanding it was. I now know that it is not possible to get pictures from the photo library to the camera roll in any way, so my question is answered, for now at least.

  • Archive Link and toa01

    Hi everyone.
    I would like to ask if somebody could help me with this one.
    I would like to know how many of the billing documents we have produced are stored in our Content Manager solution. Therefore I made a program which takes the billing doc number plus 4 zeroes and the billing date, and compares these with the first 10 digits of the DOC ID field and the archive date in table TOA01.
    This works fine mostly, but I found some examples where the billing doc has a copy in the Content Manager and yet, when I search for it through transaction OAAD, I find nothing. I entered the 10-digit billing doc number in the SAP OBJECT ID field and found nothing, just as my program says. But when I take the DOC ID from the found billing doc and enter that in the field, I do find the document, and in fact I find the DOC ID, which was unknown before.
    Does anyone have a solution for this one?
    Thanks.
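
    For reference, a minimal sketch of the kind of TOA01 lookup described above, for a single billing document. SAP_OBJECT = 'VBRK' and the 10-character, zero-padded object key are assumptions about how the links were created, so compare the output with what OAAD shows for a document that is found.

    REPORT ztoa01_check_billing_doc.

    PARAMETERS p_vbeln TYPE vbeln_vf OBLIGATORY.   "billing document number

    DATA: lt_toa01 TYPE STANDARD TABLE OF toa01,
          ls_toa01 TYPE toa01,
          lv_objid TYPE toa01-object_id.

    lv_objid = p_vbeln.                     "assumed: key stored as the 10-char VBELN

    " All ArchiveLink entries for this billing document
    SELECT * FROM toa01 INTO TABLE lt_toa01
      WHERE sap_object = 'VBRK'             "assumed business object type
        AND object_id  = lv_objid.

    IF lt_toa01 IS INITIAL.
      WRITE: / 'No ArchiveLink entry found for billing document', p_vbeln.
    ELSE.
      LOOP AT lt_toa01 INTO ls_toa01.
        WRITE: / ls_toa01-object_id, ls_toa01-arc_doc_id, ls_toa01-ar_date.
      ENDLOOP.
    ENDIF.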

    Hi Neha,
    I'm sorry I don't have an answer to your question, but I wonder if you could help me.
    I'm looking at OSS note 530792 to configure the GOS 'create attachment' option to copy attachments to the archive server. Currently, these are written to the SAP Office tables SOC3, SOFFCONT1, etc., and I want to use ArchiveLink and SAPHTTP to copy them to the archive storage.
    Have you successfully managed to configure your system since you mentioned TOA01?
    In the same GOS menu I've activated the 'Business document' option and can copy these to the archive server by correctly configuring OAC2 and OAC3.
    I'd really appreciate it if you could please share your knowledge.
    Thanks.
    Soyab

  • Maxtor Personal Storage 3200 ext. HD. Yea or Nay

    I saw an ad for a large office supply store and they had this drive for $69 over the weekend.
    I went to Maxtor's website and they showed no Mac compatibility. I wondered why, so I dropped tech support a line. This is the response I got; it includes some info on the chipset as well:
    "Dear Nick Stearns:
    Thank you for sending your MAXTOR E-mail inquiry.
    Maxtor Personal Storage 3200 drives come pre-formatted for NTFS and these drives are only supported on Windows 2000 and XP computers. However, you can use a 3200 drive on a Macintosh computer if you format (erase) the drive with the Mac OS X Disk Utility. These drives have Prolific chipsets in them and we have only seen compatibility problems with some systems using the Intel 82801 USB chipset."
    Has anyone used one of these drives successfully?
    I have a backup boot drive, if I get this one it would be primarily for archive storage.
    Not a necessity but seems like a real good deal, unless it won't work.
    Thanks for your replies in advance.

    Nick, I saw your other thread and admonished you there - LOL!
    They are right, it should work, but there is a huge difference between should and will. For archiving it probably would be fine.
    Otherwise, get an enclosure from OWC and a bare drive from someone on sale and do it that way.

  • Archiving on Blu-ray... what are the issues that people are afraid of?

    I'm considering how my latest storage-monster toy (a hi-definition camcorder) is going to eat disk space, and realise that very, very soon I'm going to want to move archived projects onto some external media that isn't going to consume the spare room in my house.
    I dislike the idea of using a ton of external USB disks for the reasons above and the environmental factors associated with those disks, and tape drives... well, it gets kinda expensive real quick when you start to look at current models and the limited lifecycles that tape drives have before they need replacing.
    My thoughts are that Blu-ray is the sweet spot at the moment, but I have heard people on this forum saying that the life of the media is questionable and comparable to various DVD media types.
    I did a little digging and found that archive-quality non-organic BD-Rs that use a SiCu substrate approximate the UDO disc format that is traditionally used by large organisations for archival storage.
    The UDO format has a life of approximately 100 years, and some BD-R vendors are certifying their media for ~50 years (costing about £25 for a dual-layer 50 GB disc today).
    The recording technology on the media is radically different to the various recordable DVD technologies, which typically use an organic dye that is subject to deterioration over time.
    My thoughts are that BD-R will see most consumers out until the next generation of technologies (protein-coated discs and holographic discs) starts to become mainstream, and in the meantime we have some significant advantages over the tape guys, such as size, random access and costs (which are being driven down all the time by consumer uptake).
    So, what have I missed? Are there additional factors, or are people only nervous because of FUD?
    -Andy

  • Put Together a Data Archiving Strategy and Execute It Before Embarking on an SAP Upgrade

    Organizations invest a significant amount in an SAP upgrade project. However, few really know that data archiving before embarking on an SAP upgrade yields significant benefits, not only from a cost standpoint but also due to the reduction in complexity during an upgrade. This article not only describes why this is a best practice but also details what benefits accrue to organizations as a result of data archiving before an SAP upgrade. Avaali is a specialist in the area of Enterprise Information Management. Our consultants come with significant global experience implementing projects for the world's largest corporations.
    Archiving before Upgrade
    It is recommended to undertake archiving before upgrading your SAP system in order to reduce the volume of transaction data that is migrated to the new system. This results in shorter upgrade projects and therefore less upgrade effort and cost. More importantly, production downtime and the risks associated with the upgrade will be significantly reduced. Storage cost is another important consideration: database size typically increases by 5% to 10% with each new SAP software release – and by as much as 30% if a Unicode conversion is required. Archiving reduces the overall database size, so typically no additional storage costs are incurred when upgrading.
    It is also important to ensure that data in the SAP system is cleaned before you embark on an upgrade. Most organizations tend to accumulate messy and unwanted data such as old material codes, technical data and subsequent posting data. Cleaning your data beforehand smooths the upgrade process, ensures you only have what you need in the new version and helps reduce project duration. Consider archiving, or even purging if needed, to achieve this. Make full use of the upgrade and enjoy a new, more powerful and leaner system with enhanced functionality that can take your business to the next level.
    Archiving also yields Long-term Cost Savings
    By implementing SAP data archiving before your upgrade project, you will also put in place a long-term archiving strategy and policy that will help you generate ongoing cost savings for your organization. In addition to moving data from the production SAP database to less costly storage devices, archived data is also compressed by a factor of five relative to the space it would take up in the production database (a factor of five means the archived data occupies only about 20% of its original space, which is where the 80% figure comes from). Compression dramatically reduces space consumption on the archive storage media and, based on average customer experience, can reduce hardware requirements by as much as 80% or 90%. In addition, backup time, administration time and associated costs are cut in half. Storing data on less costly long-term storage media reduces total cost of ownership while providing users with full, transparent access to archived information.

    Maybe this article can help; it uses XML for structural-change flexibility: http://www.oracle.com/technetwork/oramag/2006/06-jul/o46xml-097640.html

  • Content Server for Archiving

    Hello,
    I wanted to confirm whether we can use the content server that is used to store the originals of a DIR to also store scanned documents using ArchiveLink. What are the pros and cons?
    Thanks,
    Paddy

    Dear Athol Hill,
    Thanks for the quick reply.
    Let me, along with the rest of the DMS community, clearly understand the pros and cons of using the content server for storing original files.
    I believe SAP recommends going for an SAP Content Server installation for storing original files for DMS? Please explain in your own words.
    So if we use the same content server for storing originals directly and also for archiving, do you mean to say that the original files stored in the archiving portion of the content server can be copied to CDs and DVDs?
    If this is the case, can there be a transfer of the original files from one repository of the content server (acting as DMS storage) to another repository (acting as archive storage)?
    So if this transfer is made, can any original files then be copied to optical drives?
    One basic question I have: do all these original storage devices (content server, archive and vault) only act as a secure storage system, or is that only a feature of the content server?
    If so, then even if we copy the archived files to CDs and DVDs, they cannot be viewed directly without the help of an SAP logon transaction. Is my understanding right?
    Is there any storage system supported by SAP that allows data to be stored somewhere other than the secure storage area, making it available outside SAP transactions for later use?
    Thanks and regards
    Shanti

  • SAP Archiving – STO error on version ECC 6.0

    Hi experts. I have spent a long time on this error and I hope that you can help me. Let me explain the problem.
    We have an archiving project for non-standard objects. These objects run on SAP version 4.6C, but now, on version ECC 6.0, the store program gives an execution error. The log of the STO job in the SARA transaction is the following:
    Job started
    Step 001 started (program RSARCH_STORE_FILE, variant , user ID ARUIZ)
    Archive file 000241-001ZIECI_RF02 is being processed
    Archive file 000241-001ZIECI_RF02 does not exist (Message no. BA111)
    Error occurred when checking the stored archive file 000241-001ZIECI_RF02 (Message no. BA194)
    Job cancelled after system exception ERROR_MESSAGE (Message no. 00564)
    The write and delete programs run correctly.
    A standard archiving object like FI_TCJ_DOC runs OK (WRI, DEL and STO programs). The customizing for both objects is nearly identical in the OAC0 and FILE transactions; the differences are in the SAP directories. Here are the most important customizing settings:
    Transaction: FILE
    ZIECI_RF02 (No Standard)
    Logical File Path -->     ZZA_ARCHIVE_GLOBAL_PATH     
    Physical Path     -->     /usr/sap/CX6/ADK/files/ZA/<FILENAME>
    Logical File Name Definition -->      
                                ZZA_ARCHIVE_DATA_FILE_ZIECI_RF02
    FI_TCJ_DOC (Standard)
    Logical File Path -->     ARCHIVE_GLOBAL_PATH
    Physical Path     -->  <P=DIR_GLOBAL>/<FILENAME>
    Logical File Name Definition --> ARCHIVE_DATA_FILE
    Transaction: AOBJ
    Customizing settings:
    ZIECI_RF02     
    Logical File Name -->     ZZA_ARCHIVE_DATA_FILE_ZIECI_RF02     
    FI_TCJ_DOC (Standard)
    Logical File Name -->     ARCHIVE_DATA_FILE
    I also tried assigning the standard logical file name ARCHIVE_DATA_FILE to my own archiving object (ZIECI_RF02), and the error persists.
    The other parameters have the same values for both objects:
    Delete Jobs: Start Automatically
    Content Repository: ZA (Start Automatically)
    Sequence: Delete Before Storing
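
    To compare how the two logical file names actually resolve on the ECC 6.0 system, here is a minimal sketch using the standard FILE_GET_NAME function (this only shows the path resolution behind the FILE customizing above, not the storage step itself; placeholder parameters are left out):

    REPORT zcheck_logical_filenames.

    DATA lv_file TYPE filename-fileextern.

    " Resolve the custom logical file name
    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'ZZA_ARCHIVE_DATA_FILE_ZIECI_RF02'
      IMPORTING
        file_name        = lv_file
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.
    IF sy-subrc = 0.
      WRITE: / 'ZZA_ARCHIVE_DATA_FILE_ZIECI_RF02 ->', lv_file.
    ELSE.
      WRITE: / 'ZZA_ARCHIVE_DATA_FILE_ZIECI_RF02 could not be resolved'.
    ENDIF.

    " Resolve the standard logical file name for comparison
    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'ARCHIVE_DATA_FILE'
      IMPORTING
        file_name        = lv_file
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.
    IF sy-subrc = 0.
      WRITE: / 'ARCHIVE_DATA_FILE ->', lv_file.
    ENDIF.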
    I located the point in the store program (RSARCH_STORE_FILE) where my archiving fails. I wrote a test program around the same function module, and debugging it I can see the following:
    " Test program: read the first 4 KB of the stored archive file directly
    " from content repository ZA over HTTP, to reproduce the error outside
    " RSARCH_STORE_FILE.
    REPORT zarm_06_prueba_http_archivado.

    DATA: length TYPE i,                       "number of bytes returned
          t_data TYPE TABLE OF tabl1024.       "raw content in 1024-byte rows

    BREAK aruiz.                               "stop here in the debugger

    " Fetch the 'data' component of the archived document from repository ZA
    CALL FUNCTION 'SCMS_HTTP_GET'
      EXPORTING
        mandt                 = '100'
        crep_id               = 'ZA'
        doc_id                = '47AAF406C02F6C49E1000000805A00A9'
        comp_id               = 'data'
        offset                = 0
        length                = 4096
      IMPORTING
        length                = length
      TABLES
        data                  = t_data
      EXCEPTIONS
        bad_request           = 1
        unauthorized          = 2
        not_found             = 3
        conflict              = 4
        internal_server_error = 5
        error_http            = 6
        error_url             = 7
        error_signature       = 8
        OTHERS                = 9.

    IF sy-subrc <> 0.
      " Output the error exactly as raised by the content server access
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
              WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    If I execute the program, SAP returns the following error:
    Error in HTTP Access: IF_HTTP_CLIENT -> RECEIVE 1
    The sy-subrc value is 6 -> error_http
    The method call is the following:
    call method http_client->receive
         exceptions
           http_communication_failure = 1
           http_invalid_state         = 2
           others                     = 3.
    Here sy-subrc is 1 -> http_communication_failure.
    The unusual thing is that the archiving objects work perfectly on SAP 4.6C; furthermore, on ECC 6.0 both the standard and non-standard objects use the SAP Content Server with the same connection (HTTP), IP, port and repository.
    I hope someone can help me. Many thanks for your time.
    Best regards.

    Hi Samanjay.
    To answer your questions:
    1) I tried the archiving from SARA with the object ZIECI_RF02 under both customizing variants. The store program fails with logical file name = ZZA_ARCHIVE_DATA_FILE_ZIECI_RF02 and with logical file name = ARCHIVE_DATA_FILE (example in answer 2).
    2) AL11 transaction (SAP directories):
    /usr/sap/CX6/ADK/files/ZA/
    cx6adm     09.04.2008     18:02:15     ZIECI_RF02_20080409.180215.ARUIZ
    This is the file from ZIECI_RF02 in AL11; I tried this yesterday. The archive key 000241-001ZIECI_RF02 is the one generated to save the file in repository ZA; in this case the archive key is 000342-001ZIECI_RF02. You can view this in table ADMI_FILES via transaction SE11.
    Entries on ADMI_FILES with create date = 09.04.2008
    DOCUMENT:       342
    ARCHIV KEY:     000342-001ZIECI_RF02
    CREAT DATE:    09.04.2008
    CREAT TIME :    18:18:28
    OBJ COUNT:       1
    FILENAME:         ZIECI_RF02_20080409.18182
    STATUS OPT:       Not Stored
    STATUS FIL:        Archiving Completed
    PATHINTERN:    ZZA_ARCHIVE_GLOBAL_PATH
    CREP:      
    ARCH DOCID:   
    Now I put the same information for FI_TCJ_DOC (standard object):
    AL11 transaction (SAP directories):
    /usr/sap/CX6/SYS/global
    cx6adm   10.04.2008       11:24:15     FI_FI_TCJ_DOC_20080410_112409_0.ARCHIVE
    Entries on ADMI_FILES with create date = 10.04.2008
    DOCUMENT:       343
    ARCHIV KEY:      000343-001FI_TCJ_DOC
    CREAT DATE:     10.04.2008
    CREAT TIME:      11:24:09
    OBJ COUNT:        2
    FILENAME:          FI_FI_TCJ_DOC_20080410_112409_0.ARCHI
    STATUS OPT:       Stored
    STATUS FIL:         Archiving Completed
    PATHINTERN:      ARCHIVE_GLOBAL_PATH
    CREP:                   ZA
    ARCH DOCID:      47FD890364131EABE1000000805A00A9
    Finally, I ran the example with archiving object ZIECI_RF02, but assigned the standard logical file name.
    AOBJ (customizing settings):
    Object Name:           ZIECI_RF02  Data archiving: cancelled invoices
    Logical File Name:  ARCHIVE_DATA_FILE
    Then I executed the archiving in the SARA transaction.
    AL11 Transaction (Sap Directories):
    /usr/sap/CX6/SYS/global
    cx6adm    10.04.2008          12:33:25     FI_ZIECI_RF02_20080410_123324_0.ARCHIVE
    Entries on ADMI_FILES with create date = 10.04.2008
    DOCUMENT:      345
    ARCHIV KEY:    000345-001ZIECI_RF02
    CREAT DATE:    10.04.2008
    CREAT TIME:     12:33:24
    OBJ COUNT:       1
    FILENAME:         FI_ZIECI_RF02_20080410_123324_0.ARCHIVE
    STATUS OPT:     Not Stored
    STATUS FIL:       Archiving Completed
    PATHINTERN:    ARCHIVE_GLOBAL_PATH
    CREP:
    ARCH DOCID:
    That is the unusual part. At first I thought the problem was the SAP directory, but with this test I found a new reason for the error.
    3) The details of repository ZA are:
    Content Rep:     ZA
    Description:       Document Area
    Document Area: Data Archiving
    Storage type:      HTTP Content Server
    Version no:         0046   Content Server version 4.6
    HTTP server:     128.90.21.59
    Port Number:     1090   
    HTTP Script:     ContentServer/ContentServer.dll
    Phys. Path:         /usr/sap/CX6/SYS/global/
    Many thanks for your answer and your time. If you have any questions, ask me.
    Best regards.

  • Archiving information from "old" SAP systems and switching off the servers

    Good day
    I hope someone can offer some advice. Our company has a few sites across the globe. Three of the sites in South Africa (where I'm working at the moment) had their own SAP systems. Three-odd years ago, we implemented a new SAP system, combining the separate SAP systems into the new one, running off the same box, etc. The "old" SAP systems were still available after the new one went live, in the event that people wanted to look back at history.
    We are now busy with a project where we are going to merge our current SAP implementation into our global SAP system... obviously, only the current master data is going to be transferred to the new system. It's an ideal time for us to clean up our data.
    My question: once we go live on our global SAP system, we want to "export" or archive, from the 3 "old" SAP systems, the master data from the HR module (like employees who have left the company - by law, the records must be kept for at least 5 years after an employee has left), HSEC and FI modules into a single datastore... this way we don't have to keep the three servers running in the event that history is required. The servers will be kept offline and only switched back on if a detailed audit is required.
    The global SAP system does not have an archive solution implemented, so we can't access archived data from the new system...
    So we need to export master data from our 3 old SAP boxes into a Unicode-type file system, and our local South African sites need to be able to access the information using a text-type editor or a web-based interface we can run off our SAP Portal...
    Thank you
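
    As a rough illustration, here is a minimal sketch of the kind of flat-file export described above, writing one HR infotype to a tab-delimited UTF-8 text file. The infotype table (PA0002), the field list and the target path are illustrative assumptions, and the ENCODING addition needs a Unicode-capable release (6.10+); the reply below explains why a plain export like this may not be sufficient on its own.

    REPORT zexport_hr_master_data.

    DATA gv_file TYPE string VALUE '/tmp/hr_master_0002.txt'.  "assumed target path

    TYPES: BEGIN OF ty_emp,
             pernr TYPE pa0002-pernr,   "personnel number
             nachn TYPE pa0002-nachn,   "last name
             vorna TYPE pa0002-vorna,   "first name
             gbdat TYPE pa0002-gbdat,   "date of birth
           END OF ty_emp.

    DATA: lt_emp  TYPE STANDARD TABLE OF ty_emp,
          ls_emp  TYPE ty_emp,
          lv_line TYPE string.

    " Basic personal data only; a real export would cover far more infotypes
    SELECT pernr nachn vorna gbdat
      FROM pa0002 INTO TABLE lt_emp.

    OPEN DATASET gv_file FOR OUTPUT IN TEXT MODE ENCODING UTF-8.
    IF sy-subrc <> 0.
      WRITE: / 'Could not open', gv_file.
      EXIT.
    ENDIF.

    LOOP AT lt_emp INTO ls_emp.
      CONCATENATE ls_emp-pernr ls_emp-nachn ls_emp-vorna ls_emp-gbdat
        INTO lv_line SEPARATED BY cl_abap_char_utilities=>horizontal_tab.
      TRANSFER lv_line TO gv_file.
    ENDLOOP.

    CLOSE DATASET gv_file.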

    Hello Cornelius,
    To introduce myself, I am an active SDN/BPX member as well as an ASUG Archiving and Information Lifecycle Management Volunteer.
    Decommissioning Legacy Systems (SAP and non-SAP systems) is quite the topic these days with divestitures becoming such a common occurrence. 
    Some things to think about in this situation are:
    - Identifying the data to be decommissioned:
      - what information would be needed for an audit
      - what information would be needed for a lawsuit situation
    - Identifying the access requirements for this data.
    - Identifying the retention policies for this data (these will be driven by regulatory requirements, etc.).
    - If this has not already been done, defining retention periods for this data - it will most likely have different retention periods based on the type of data.
    You mention that you do not have an archive solution in place yet.
    I would recommend that you research that option, because a typical requirement for retaining legacy data is that it be stored in such a manner that it is protected from any type of modification.
    There are a lot of 3rd party archiving/storage solutions that provide this type of secure storage.
    You are correct that you will want to export or archive the legacy data as opposed to loading the data into your current system. Loading it would probably involve a large increase in the size of your SAP database, and you would probably run into number range overlaps.
    But, if this data is exported to a filesystem, you will be losing the business logic associated with the data, and the information is no longer in its original format (this could have legal ramifications). 
    I would recommend using SAP Archiving functionality for this effort.  If you are looking for outside resources, there are many vendors out there that have a lot of experience with these types of projects.
    Two that come to mind are Dolphin IT and OpenText.
    Or, depending on the timeline of your project, SAP is coming out with a tool that can be used for decommissioning SAP systems. It is part of their SAP ILM solution. It will be called Retention Warehouse. Here is a paragraph from their latest white paper describing this new tool:
    "With the retention warehouse, SAP now offers a standardized method for system decommissioning in answer to these pressing issues. It allows you to reuse the archived data outside the original system in a central retention warehouse."
    For more detail on this topic, the white paper can be found on the SAP ILM Website:
    service.sap.com/ilm
    This tool is part of a larger initiative from SAP to help customers with their ILM (Information Lifecycle Management) strategy. This tool is near and dear to me as I was fortunate enough to be part of an ASUG Influence Council that helped with some of the requirements for this solution.
    I hope this information is useful. 
    Good luck and Best Regards,
    Karin Tillotson
