Archiving content using 'Self Export/Import'

Hi,
We are trying to archive the documents uploaded to the Content Server to a specific location on the same server (not using the migration functionality).
Is it possible to handle this case using the “Self Export/Import” functionality?
Or can we use the "Content Tracker -> Reduce data" option?
Can you explain how to do this task?
Thank you
Edited by: Nir on Sep 2, 2011 7:07 AM

Consider disk space, and where your Archives directory is located.
Sometimes people set up their Vault and Weblayout directories on a shared disk but forget to move the Archives directory to a shared location with plenty of space. If it is not moved, it stays in the same file structure as the core Content Server files, where the allotted disk space may not be large (in some customer cases), and filling that disk may stop your server.
Just a consideration.

Similar Messages

  • System copy using SAPInst (export/import, database-independent process) failed

    Hello,
    I am doing a system copy using the SAPInst export/import process.
    Source system: SAP NetWeaver '04 (BW 3.5, Kernel: UC 640, patch level 196).
    The export fails at Phase 2 (Database Export) with R3load jobs running 1, waiting 0. Below are the log details.
    SAPSDIC.log
    (EXP) INFO:  entry for BAPICONTEN                        in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20000621155733
    (EXP) INFO:  entry for BAPICONTENT255                    in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20031127161249
    (EXP) INFO:  entry for BAPICONVRS                        in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20010131174038
    (EXP) INFO:  entry for BAPICREATORDATA                   in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20000621155733
    (EXP) INFO:  entry for BAPICRMDH1                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMDH2                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMEXP                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175513 > 20031211120627
    (EXP) INFO:  entry for BAPICRMEXT                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175514 > 20031211120627
    (EXP) INFO:  entry for BAPICRMKEY                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMKEY_T                      in DDNTT is newer than in DDNTT_CONV_UC: 20051229175835 > 20031211120803
    (EXP) INFO:  entry for BAPICRMMSG                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMMSG_T                      in DDNTT is newer than in DDNTT_CONV_UC: 20051229175835 > 20031211120803
    (EXP) INFO:  entry for BAPICRMOBJ                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175514 > 20031211120628
    (EXP) INFO:  entry for BAPICRMPAREX_T                    in DDNTT is newer than in DDNTT_CONV_UC: 20051229175452 > 20031211120305
    (EXP) INFO: limit reached, 5000 tables in DDNTT are newer than in DDNTT_CONV_UC
    (EXP) INFO: NameTab check finished. Result=2  #20100113131216
    (EXP) INFO: check for inactive NameTab entries: Ok.
    (RSCP) INFO: I18N_NAMETAB_TIMESTAMPS not in env: checks are ON (Note 738858)
    (DB) INFO: disconnected from DB
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: END OF LOG: 20100113131216
    ***SAPCLUST.log ****
    (NT)  Warn:  EDIDOC: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "EDIDOC"
    (NT)  Warn:  PCDCLS: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "PCDCLS"
    (NT)  Warn:  SFHOA: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "SFHOA"
    (NT)  Warn:  SFHYT: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "SFHYT"
    (NT)  Warn:  UMG_TEST_C: normal NameTab from 20040211095029 younger than alternate NameTab from 20031113150115!
    (EXP) TABLE: "UMG_TEST_C"
    myCluster (55.22.Exp): 712: error when retrieving alternate nametab description for physical table UMG_TEST_F.
    myCluster (55.22.Exp): 713: return code received from nametab is 32
    myCluster (55.22.Exp): 299: error when retrieving physical nametab for table UMG_TEST_F.
    (CNV) ERROR: data conversion failed.  rc = 2
    (RSCP) INFO: I18N_NAMETAB_TIMESTAMPS not in env: checks are ON (Note 738858)
    (DB) INFO: disconnected from DB
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: END OF LOG: 2010011312563
    Please suggest.
    Thanks & Regards
    Ganesh

    Is your DB Unicode? If so, did you select the Unicode flag in SAPinst?
    This thread, "System Copy Error while exporting ABAP", might offer some help.
    -Zach

  • System copy using R3load (export/import)

    Hi,
    We are testing a system copy using R3load (export/import) with our production data.
    Environment: 4.6C / Oracle.
    The export takes more than 20 hours, and we want to reduce the export time drastically; we hope to achieve this by tuning the input parameters.
    During export, there is a parameter for splitting the *.STR and *.EXT files for R3load.
    The default for this parameter is 'No, do not split STR and EXT files'.
    Question 1: If we choose Yes instead of No and split the *.STR and *.EXT files, will the export time be reduced?
    Question 2: If the answer to Question 1 is yes, will the time saved be significant? Roughly what percentage reduction can we expect compared to not splitting?
    Best Regards
    L Raghunahth

    Hi,
    The export time depends on the size of your database (especially the size of your biggest tables) and your hardware capacity.
    > Question 1: If we choose Yes instead of No and split the *.STR and *.EXT files, will the export time be reduced?
    If you have a few very large tables, multiple CPUs, and decent disk storage, then splitting might significantly reduce the export time.
    > Question 2: If the answer to Question 1 is yes, will the time saved be significant? Roughly what percentage reduction can we expect compared to not splitting?
    Since you did not tell us your database size and hardware details, there is no way to give you anything but very basic metrics. Did you specify a parallel degree at all? Was your hardware idle for the 20 hours or already fully loaded?
    20 hours for a 100 GB database is very slow; it is reasonable (rather fast, in my opinion) for a 2 TB database.
    Best regards, Michael

  • Client Copy Using Client Export/Import

    Hi Everyone,
    This is regarding a client copy using the client export/import method. We have two systems, production (PRD) and development (DEV), and I am planning to do a client copy from PRD to DEV. I know most of the steps for a client copy, but I have some doubts, so if someone can help me out I would really appreciate it.
    1) Which user should be used to perform the client export?
    2) Which user should be used to perform the client delete?
    3) Is there a particular user recommended for performing the client copy, or can any user do it?
    4) After deleting the client through SCC5, do we run a reorganization to reclaim the space? Could anyone tell me which tables need to be reorganized, or provide the steps for the reorganization? I would be really thankful.
    Waiting for your replies.
    Best Regards

    Hi Mohammed,
    1) Which user should be used to perform the client export?
    You can use your own user ID defined on the system.
    2) Which user should be used to perform the client delete?
    Same as above; just make sure you have an account in the client and that you log on to the right client.
    3) Is there a particular user recommended for performing the client copy, or can any user do it?
    I am not aware of any requirement to use a particular user. I have been doing copies and deletions with my own user ID, as system administrator.
    4) After deleting the client through SCC5, do we run a reorganization to reclaim the space? Which tables need to be reorganized, and what are the steps?
    For the reorganization I personally use the SAPDBA tool. However, if you use Oracle there are other alternatives as well. What is your DB vendor? I will append more info in a bit.
    Rgds,
    Loukas

  • Possible to use stsadm export/import to copy a subsite within the same site under a different subsite name?

    Within the same site, is it possible to use stsadm export/import to copy a subsite to a new subsite with a different name?
    I do not want to use the Save as Template method; I need a command-line approach.

    You should create an empty site first.
    Go to your site collection and click Site Actions -> New Site, select a custom template, and click Create.
    Here you choose meeting2 as the title/URL.
    Now execute your import command (Import-SPWeb http://vee:111/meeting2 -Path C:\temp\meeting_exp.cmp -UpdateVersions -Overwrite).
    Wait until the import is finished, then check your site.
    That should do it.

  • Portal content Transport Packages Export/Import Question.

    Hi everyone,
    I have a problem when exporting portal content transport packages: the .epa file is correct on the server, but it becomes corrupted after downloading it from the server to my local machine.
    The bug seems to be in the download function itself. If anyone has encountered this case, please help me. I need your help!
    Regards,
    Fish

    Hi Ree,
    As I mentioned in my previous reply, it is working perfectly for me. The import creates the required files in the PCD.
    Let me explain how it is done:
    Create a transport package (TP).
    Create a folder and add the required objects (worksets, pages, iViews, etc.) to the folder.
    Add the folder with all its contents (select 'Add all objects to transport package' from the context menu) to the TP.
    Export the TP, giving it a name of your choice (removing the numbers).
    Download it to the local machine using the download link.
    The file will be saved as TPName.epa on your local drive.
    For the import, select 'Client' as the source for the package files.
    Browse to and select the TPName.epa file.
    Upload the file, and you can see the contents.
    Then import it into the new environment. The complete folder structure will be created in the PCD exactly as you created it for the TP.
    I hope you are doing it like this; it should work.
    Are you able to open the downloaded file with WinZip?
    cheers
    Kiran

  • Export/Import Process in the UI for Variations Content Translation is Generating CMP Files with No XML

    We have a SharePoint 2010 publishing website that uses variations to deliver content in multiple languages. We are using a third-party translation company to translate publishing pages. The pages are exported using the export/import UI process described here: http://blogs.technet.com/b/stefan_gossner/archive/2011/12/02/sharepoint-variations-the-complete-guide-part-16-translation-support.aspx.
    Certain sub-sites are extremely content-intensive. They may contain many items in the Pages library as well as lists and other sub-sites.
    For some sub-sites (not all), the exported CMP file contains no XML files. There should be a Manifest.XML, Requirements.XML, ExportSettings.XML, etc., but there are none. After renaming the CMP file to CAB and extracting it, the only files it contains are DAT files.
    The only difference I can see between the sub-sites that generate CMP files with no XML files and those that do not is size. For example, one site of 114 MB produces a CMP file with no XML files, while small sites do not have this problem. If size is the problem, I would expect the process to generate an error instead of creating a single CMP file that contains only DAT files. However, I do not know exactly what the export/import process in the UI is doing.
    This leads to two questions:
    1. Does anyone know why some CMP files, when renamed to *.CAB and extracted, would not contain the necessary XML files?
    2. If exporting using the UI will not work, can I use PowerShell? I have tried Export-SPWeb, but the Manifest.XML does not contain translatable content, and I have not found any parameters for Export-SPWeb that cause the exported CMP to be in the same format as the one produced by the export/import process in the UI.
    As a next step, we could try developing custom code using the Publishing Service, but before doing this, I would like to understand why the export/import process in the UI generates a CMP that contains no XML files.
    If no one can answer this, I would appreciate some general help on understanding exactly what the export/import process is doing, that is, the one that runs when you select the export or import option in the Site Manager drop-down. Understanding what it actually does will help us troubleshoot why there are no XML files in certain export CMPs and assist with determining an alternate approach.
    Thanks in advance
    Kim Ryan, SharePoint Consultant kim.ryan@[no spam]pa-tech.com

    I wanted to bump this post to try to get some more responses to your problem; I'm running into the same problem as well. We're running a SharePoint 2010 site and are looking at adding variations now. The two sub-sites with the most content take a while to generate the .cmp file (one to two minutes of the browser's loading bar spinning, waiting on the file). Both files are generated with a lot of .dat files but no .xml files. I was thinking, like you, that it must be a size issue, but I'm not sure. Did you ever happen to find a solution to this problem?

  • How to use loadercli for export/import

    Hello,
    I want to migrate my Content Server data, which is on SAP DB version 7.3 (32-bit), to a new server with MaxDB 7.6 (64-bit).
    I wanted to know:
    1) How do I export and import using loadercli? I want to use the loader of my new server, i.e. the MaxDB loader, to do the export/import. Can anyone tell me the steps and the commands for this?
    2) I have referred to SAP Note 962019, but need some more help with the steps given in this note for a heterogeneous copy.
    Note: I do not have any SAP system on the server, only the Content Server, and hence I do not have the user SAPR3 in the database.
    Regards,
    Bhavesh

    > Please check the following (point 7) in note 962019:
    Ok...
    > 7. The procedure described in this note supports the following migrations:
    > and I am dealing with the same situation hence choose the export import rather than normal backup recovery.
    Well, it says the procedure supports these migrations, which is true.
    It does not say: "please do use the export/import migration in these situations".
    Please do read point 1 of the note, where it says:
    "Homogeneous system copies (backup/recovery) can be executed only in
    systems that fulfill  the following conditions:
    a) The database versions in the source system and the target system
        do not differ.
    b) The processor uses the same byte sorting sequence (Little Endian
        -> Little Endian; Big Endian -> Big Endian).
    *If the specified operating system prerequisites are met in the
    source and target system, you MUST use the homogeneous system copy
    and not the heterogeneous system copy.*"
    So in your case, export/import is not even supported!
    > 2) I know the Cs 6.10 is out of support and it need to upgrade to new CS version that is 6.4 , but if i am already installing the new server for content server with CS 6.4 and MAXDB 7.6 why then why cant i use the export import from older server to new server??
    > Other wise i need to upgrade the MAXDB then upgrade CS and then take the backup and restore, isnt it more logivcal to do export import using loder as we have all prerequisites available??
    1. You don't meet the prerequisites (see above); you can use backup/restore, so you have to!
    2. Export/import is a terribly slow and clumsy procedure.
    3. Backup/restore is an easy, safe, and fast way to perform the system copy.
    If you still really, really want to stick with export/import, feel free to try it out.
    The note contains all the required steps, and the MaxDB documentation even includes tutorials for the Loader.
    regards,
    Lars

  • Using set/get parameters or export/import in BSP.

    Hi All,
    Is it possible to use SET/GET parameters or EXPORT/IMPORT in BSP?
    We need to set/export some variables from a BAdI and get/import them in the BSP application.
    A code snippet would be of great help.
    Thanks,
    Anubhav

    Hi Anubhav,
    You can use the EXPORT / IMPORT statements for your requirement.
    From the BAdI, use EXPORT to send the variable data to the INDX database table under your own ID, e.g.:
    * work area required for the export to the database table
          DATA: WA_INDX TYPE INDX.
    * here CNAME is the variable you want to export
    EXPORT PNAME = CNAME TO DATABASE INDX(XY) FROM WA_INDX CLIENT
                SY-MANDT ID 'ZVAR1'.
    In the BSP application, use the IMPORT statement to fetch back the value saved under the ID above:
    IMPORT PNAME = LV_CNAME
      FROM DATABASE INDX(XY) TO WA_INDX CLIENT
      SY-MANDT ID 'ZVAR1'.
    Afterwards, delete the entry so it does not waste space:
      DELETE FROM DATABASE INDX(XY)
        CLIENT SY-MANDT
        ID 'ZVAR1'.
    Regards,
    Samson Rodrigues
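    A small, hedged addition to the snippet above: after the IMPORT in the BSP application it is worth checking SY-SUBRC, so you can tell whether the BAdI has actually exported anything under that ID yet (a sketch reusing the same names):
    IMPORT PNAME = LV_CNAME
      FROM DATABASE INDX(XY) TO WA_INDX CLIENT SY-MANDT ID 'ZVAR1'.
    IF SY-SUBRC <> 0.
      CLEAR LV_CNAME.   " nothing has been exported under ID 'ZVAR1' yet
    ENDIF.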

  • Report Migration Using LCM & Simple Import & Export

    Hi,
    I just want to know the difference between these two: migrating the reports using LCM, or using a simple import & export.
    Will LCM also migrate the user security associated with the report?
    Can anybody shed some light on this?
    Thanks
    Jagadesh R K.

    Yes, LCM can be used to export/import reports and application provisioning.
    For more details, refer the below documents
    EPM 11.1.1.x
    http://download.oracle.com/docs/cd/E12825_01/epm.111/epm_lifecycle_management.pdf
    System 9.3.1
    http://download.oracle.com/docs/cd/E10530_01/doc/epm.931/hss_alcm_util.pdf
    Hope this helps.
    Regards,
    Manmohan Sharma

  • Table Values in Export/Import/Tables RFC calls

    Hi
    I know that with Adaptive RFC, the best practice is to use the Tables section of the function module rather than the import/export parameters.
    However, in ECC 6.0, when creating entries in the Tables section of the FM, it tells me that this section is obsolete.
    Should I start to use the export/import parameters instead?
    Cheers
    Ian

    Hi Ashu
    Thanks for the reply, but there is no code snippet; it is a 'best practice' question.
    The document "Effective Web Dynpro - Adaptive RFC" (https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/11c3b051-0401-0010-fe9a-9eabd9c216de) states that table values should not be passed in the exporting/importing parameters of an R/3 RFC-enabled function module; they should always be added to the Tables section.
    However, in ECC 6.0, when adding entries to the Tables section of an R/3 function module, it says that this practice is obsolete.
    Therefore, what is the current best practice for Adaptive RFC?
    Cheers
    Ian
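    For what it is worth, here is a minimal sketch of the newer style that the obsolete warning points towards: typing an ordinary IMPORTING/EXPORTING parameter with a DDIC table type instead of using a TABLES parameter. All names here (Z_GET_ORDERS, ZTT_ORDERS, IV_CUSTOMER, ET_ORDERS) are made-up placeholders, not taken from the thread:
    * the RFC-enabled FM is assumed to declare ET_ORDERS as an EXPORTING
    * parameter typed with the SE11 table type ZTT_ORDERS
    DATA: LT_ORDERS TYPE ZTT_ORDERS.
    CALL FUNCTION 'Z_GET_ORDERS'
      EXPORTING
        IV_CUSTOMER = '0000001000'   " placeholder key value
      IMPORTING
        ET_ORDERS   = LT_ORDERS.
    Whether this is actually preferable to TABLES for Adaptive RFC consumers is exactly the open question above; the sketch only shows the syntax that the warning steers towards.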

  • Problem with the Export/Import Utility when importing a .portal file

    Hi there!
    I got the following error when trying to import a .portal file into a desktop by using the Export/Import Utility:
    <13-11-2009 12:13:26 PM CLST> <Error> <netuix> <BEA-423487> <Import: did not add navigable [2203] as it would cause an illegal dependency.>
    The error is shown several times, for the numbers [2203] through [2285].
    As a result, I can't see some pages of my portal; it's as if they don't exist.
    I hope you can help me fix it, because I have not found any information about it.
    Best regards
    P.S.: I am using BEA WebLogic 10.0.1.
    Edited by: user12231353 on 16-nov-2009 12:38


  • Studio export/import button not working.

    To import an existing LAR file into Studio, do I need to configure anything?
    I'm not able to use export/import even when logged in as admin.

    There are two ways to get to the Manage Pages interface for any community, which is where LAR import/export functionality lives. You can access it by going to the Welcome Menu on the page, and clicking "Manage Pages." If you use this path, you will be unable to import a LAR file, as the application considers you to be "in" that community, and can't overwrite the pages with the LAR you're trying to import. Instead, you'll want to access this page by going to Control Panel -> Communities. For the desired community, click Actions -> Manage Pages. Now you should be able to go to the export/import section and import your LAR.

  • EXPORT & IMPORT URGENT?

    Can anyone help me with the syntax used for EXPORT and IMPORT with a memory ID?
    I am using a user exit for country key functionality.
    When I run my report, I want to export the country key field value (PARAMETER ID = 'MOL') from my report and import this value in the user exit.
    Can you please help me with the syntax I need to write?
    It's urgent!

    Hi Abhay,
    Define a field for export.
    In the first program:
    DATA: field TYPE char10.
    field = '12345'.
    EXPORT field TO MEMORY ID 'TEST'.
    In the other program:
    DATA: field TYPE char10.
    IMPORT field FROM MEMORY ID 'TEST'.
    Cheers,
    Stefan.
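    Since the question mentions PARAMETER ID 'MOL', the SPA/GPA parameter mechanism is another option when the report and the user exit run in the same logon session. A minimal sketch; the variable name, its length, and the sample value are placeholder assumptions (use the data element behind the parameter ID in your system):
    * In the report: put the value into the SAP memory parameter.
    DATA: lv_country(3) TYPE c.
    lv_country = 'DE'.                          " placeholder value
    SET PARAMETER ID 'MOL' FIELD lv_country.
    * In the user exit: read it back.
    DATA: lv_country(3) TYPE c.
    GET PARAMETER ID 'MOL' FIELD lv_country.
    IF sy-subrc <> 0.
      CLEAR lv_country.                         " parameter not set in this session
    ENDIF.
    Note the difference: EXPORT/IMPORT ... TO/FROM MEMORY ID (as in the snippet above) uses ABAP memory, which is shared along the call sequence within one external session, while SET/GET PARAMETER uses SAP memory, which is shared across all sessions of the user's logon.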

  • Multiple servers in parallel export/import question

    Hello all,
    We plan to use a parallel export/import with the Distribution Monitor, and we want to use several servers for the export and also for the import.
    The export_monitor.cmd.properties file has the FTP option ftphost, which is the hostname of the import server; it looks as if this will only use one application server (AS) from the source system and one AS in the target system.
    - Is this true, or can we use more application servers in both the export and the import?
    - For the export, how do we specify which AS to use?
    - For the export, how do we specify more than one AS?
    - For the import, how do we specify more than one AS?
    Thanks,
    Terry

    You can use the Distribution Monitor for that purpose. Check Note 855772 - Distribution Monitor.
    Markus
