PREPARE before R3load Export/Import?

We are planning an upgrade from SAP 4.7 to ECC 6.0, where the system will also be moved onto new hardware with new OS and Oracle versions.  Our procedure is to perform a Homogeneous System Copy of our current SAP 4.7 system, using R3load to complete an Export, then using R3load to Import the existing system onto the new hardware.
Our current system setup, where the Export will be completed is:
SAP 4.7 (Domain Installation)
Windows 2000
Oracle 9.2.0.5.
Our new system, SAP 4.7 (Local Installation), will be Imported using the R3load procedure onto our new hardware with Windows 2003 and Oracle 10.2.0.2.  Once the SAP 4.7 copy has been completed on the new hardware, we will begin the SAP upgrade to ECC 6.0.
Based on our procedure, am I still able to effectively complete the PREPARE on our current system before the R3load Export is run?  Will this PREPARE still be effective once the system has been Imported (copied) onto different versions of OS and Oracle?  Or will I need to re-run the PREPARE once it has been copied onto the new OS and Oracle versions?  Any experience would be very helpful in making this decision.
Thanks in advance.
Tina

It might be best to wait until after the system copy for PREPARE.
1) If this machine is your transport domain controller, there will be files in the transport directory on the source system that you will need to copy to the new system.
2) If you are migrating from 32-bit to 64-bit Windows, PREPARE will load the 32-bit kernel tools on the source system but you will need the 64-bit kernel on the new system.
3) You would need to copy /usr/sap/put contents from the source system.
In addition, running PREPARE after the system copy would provide a validation of the system copy procedure.  I have been through this process, and PREPARE has always been run after the system copy to the new hardware.  A sketch of the directory copies for points 1 and 3 follows below.
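For points 1 and 3, these are plain directory copies. A minimal sketch using xcopy, assuming the usual sapmnt share layout on Windows; the hostnames and shares below are placeholders, not your actual systems:

    rem Hypothetical hosts/shares -- adjust to your landscape
    rem 1) Transport directory from the old transport domain controller:
    xcopy \\oldhost\sapmnt\trans \\newhost\sapmnt\trans /E /H /I /Y
    rem 3) PREPARE working directory, if PREPARE was already run on the source:
    xcopy \\oldhost\sapmnt\put \\newhost\sapmnt\put /E /H /I /Y

(/E copies subdirectories including empty ones, /H includes hidden and system files, /I treats the target as a directory, /Y suppresses overwrite prompts.)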

Similar Messages

  • System copy using R3load ( Export / Import )

    Hi,
    We are testing a system copy using R3load (export/import) with our production data.
    Environment : 46C / Oracle.
    While executing the export, it takes more than 20 hours for the data to be exported. We want to reduce the export time drastically, and we want to achieve that by scrutinizing the input parameters.
    During the export, there is a parameter for splitting the *.STR and *.EXT files for R3load.
    The default input for the above parameter is No, do not split STR and EXT files.
    My question 1: If we input Yes instead of No and split the *.STR and *.EXT files, will the export time get reduced?
    My question 2: If the answer to question 1 is yes, will the reduction be significant? Roughly what percentage of time can we expect to save, compared to a No input?
    Best Regards
    L Raghunahth

    Hi
    The time of the export depends on the size of your database (and the size of your biggest tables) and your hardware capacity.
    > My question 1: If we input Yes instead of No and split the *.STR and *.EXT files, will the export time get reduced?
    If you have a few very large tables, multiple CPUs and decent disk storage, then splitting might significantly reduce the export time.
    > My question 2: If the answer to question 1 is yes, will the reduction be significant? Roughly what percentage of time can we expect to save, compared to a No input?
    As you did not tell us your database size and hardware details, there is no way to give you anything but very basic metrics. Did you specify a parallel degree at all, and was your hardware idle for those 20 hours or already fully loaded?
    20 hours for a 100 GB database is very slow. It is reasonable (rather fast, in my opinion) for a 2 TB database.
    Best regards, Michael
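    The parallel degree is usually set through the Migration Monitor rather than on R3load directly. A minimal sketch of an export_monitor_cmd.properties with illustrative values; the property names are from the MigMon guide as I recall them, so verify them against your MigMon version:
        # export_monitor_cmd.properties -- illustrative values only
        # directory the dump files are written to
        exportDirs=/export/ABAP
        # directory containing the *.STR and *.TSK files
        installDir=/migmon/exp
        # number of parallel R3load export processes
        jobNum=6
        # extra arguments passed to each R3load call
        loadArgs=-stop_on_error
    Raising jobNum only helps while the CPUs and disks still have headroom; with a few huge tables, splitting them first is what makes the parallelism usable.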

  • R3load Export (System Copy)

    Dear Experts,
    System copy (source version)
    Environment: R/3 4.6C / Oracle 8.1.7 / Solaris 8
    using the R3load (export/import) procedure.
    We have a problem.
    The database is around 430 GB, and we need approx. 140 GB for the export image files.
    But we have no more than 50 GB free on any single hard disk. (We have many hard disks, but free space is not more than 50 GB on any of them.)
    So we are thinking of splitting the STR files and then doing the export.
    But is this possible?
    Actually, I want to specify multiple locations for the export files.
    Or can you suggest another alternative?
    Thanking you in Anticipation.
    Best Regards
    L Raghunahth
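    If splitting the STR files alone is not enough, the Migration Monitor can also write the dump to several directories at once. A sketch, assuming its exportDirs property accepts a path list with the Unix ':' separator (check the MigMon guide for your release):
        # export_monitor_cmd.properties -- spread the dump over several filesystems
        exportDirs=/exp1/ABAP:/exp2/ABAP:/exp3/ABAP
        jobNum=4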

    Hi,
    Thanks to Markus and Michael for your suggestions.
    I think I should opt for the NFS approach, as it is a production system.
    Next, our system is R/3 4.6C (46D 64-bit kernel, current patch level 1748).
    For just executing the R3load export, is this patch level OK? I could not find any prerequisites regarding the kernel requirements.
    Our customer is a little hesitant to apply the latest kernel patches, as it is the PRD system.
    I wish to know the minimum kernel patch level required for a successful execution of the R3load export only.
    Could you please let me know where I can find this prerequisite?
    Thanks in Advance.
    Best Regards
    L Raghunahth

  • SAP Export/Import via R3load

    Hello,
    SAP recommends a sorted export/import for Oracle databases. However, an unsorted export/import is very much quicker than a sorted one:
    "very much quicker" meaning sorted = e.g. 56 hours and unsorted = e.g. 12 hours.
    Are there any problems if I take the unsorted export?
    regards,
    stefan

    Hello Stefan,
    >> SAP recommends a sorted export/import for Oracle databases
    That recommendation relies on database internals such as the clustering factor of a table's indexes. You will benefit from a sorted export in cases of primary index access and so on... but this topic is much bigger than your question.
    >> "very much quicker" meaning sorted = e.g. 56 hours and unsorted = e.g. 12 hours.
    Where did you get these values from? You can split big tables or packages... or define parallel access paths and so on... and it depends on your database configuration (sort areas, PGA, etc.).
    I have done some migrations/exports with DistMon... and optimized them manually... and got from roughly 72 hours (a normal export with some split packages) down to only 12 hours. I have done only sorted exports.
    >> Are there any problems if I take the unsorted export?
    Some tables must be exported sorted (for example cluster tables) - have a look at SAP Note 954268... but generally there are no "problems".
    If you have a BW system... some rules change...
    @ Markus:
    >> You mean the usage of "R3load -loadprocedure fast"? Or how do you do the "unsorted unload"?
    The option "loadprocedure fast" has nothing to do with the unload process. It speeds up the import (the inserts) by bypassing the buffer cache. Refer to SAP Note 1045847 and the following link:
    http://download-east.oracle.com/docs/cd/B10501_01/server.920/a96524/c21dlins.htm
    Regards
    Stefan
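    For illustration, the flag goes on the R3load import call; the package and log names below are placeholders, and the exact option set is described in SAP Note 1045847:
        rem Illustrative import call -- package/log names are placeholders
        R3load -i SAPAPPL0.cmd -dbcodepage 4103 -l SAPAPPL0.log -loadprocedure fast -stop_on_error
    With the Migration Monitor you would put -loadprocedure fast into the loadArgs property instead. The unsorted unload itself is, as far as I recall Note 954268, switched on by removing ORDER_BY_PKEY from the prikey line of the DDL<DBS>.TPL file used for the export.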

  • BI Export Import

    ERROR: 2011-06-12 01:50:01 com.sap.inst.migmon.LoadTask run
    Unloading of 'SAPDODS' export package is interrupted with R3load error.
    Process 'E:\bin\uc\R3load.exe -e SAPDODS.cmd -datacodepage 4103 -l SAPDODS.log -stop_on_error' exited with return code 2.
    For mode details see 'SAPDODS.log' file.
    Standard error output:
    sapparam: sapargv( argc, argv) has not been called.
    sapparam(1c): No Profile used.
    sapparam: SAPSYSTEMNAME neither in Profile nor in Commandline
    ==============SAPDODS.log FILE===============
    (EXP) TABLE: "/BIC/AZSD_O2040" #20110612013025
    (EXP) ERROR: DbSlPrepare/BegRead failed
      rc = 103, table "/BIC/B0000104000"
      (SQL error 208)
      error message returned by DbSl:
    Invalid object name '/BIC/B0000104000'.
    Statement(s) could not be prepared.
    (DB) INFO: disconnected from DB
    E:\bin\uc\R3load.exe: job finished with 1 error(s)
    E:\bin\uc\R3load.exe: END OF LOG: 20110612013026
    ===========================================
    While doing the export/import process I am getting an error. Can you please advise what could be the reason?
    -Addi

    The indexes of all fact and E-fact tables were not correct. After confirmation from SAP, we ran the report RSDD_MSSQL_CUBEANALYZE to fix the issue.
    After that the issue was resolved.

  • Homogeneous system copy using export/import on Win/Ora

    Dear experts,
    I need to know the homogeneous system copy procedure using the export/import method for the following scenario, as we are in the process of a data center migration of SAP systems from one location to another.
    Source system:
    ECC6/BI7/APO (ABAP stack)
    OS: Windows 2003
    DB: Oracle 10g
    Target system: new hardware installed in a virtual environment (VMware)
    No change of SID
    ECC6/BI7/APO
    OS: Windows 2008
    DB: Oracle 11g
    It would be great if anyone could share the documents/links (not the SAP homogeneous system copy guide) you have prepared for a homogeneous system copy using export/import on Windows/Oracle, as so far I have only done system copies using backup/restore for the ABAP stack and export/import for the Java stack.
    Therefore I am not sure about the procedure/prerequisites/checklist that needs to be prepared; I have gone through the SAP guide, but it is not a step-by-step document.
    Cheers

    There are two approaches for your project:
    1) Use Oracle 11g on the target database.
    2) Continue with Oracle 10.2 on the target database.
    If you want to use Oracle 11.2 on your target database, you have to check whether installers supporting an 11.2 database are available. Only if such installers are available can you install your target database on Oracle 11.2. Otherwise, Oracle 10.2.0.4 is supported on Windows 2008 R2, and you can use that version to build your target database. It works fine, absolutely no problem.
    Later on you can upgrade Oracle 10.2 to 11.2.
    If installers are available you can also use Oracle 11.2 directly; there is absolutely no problem.
    As far as I know, installers are available and supported with Oracle 11.2. Also check the PAM before arriving at any decision.

  • How to turn off specific spot (Pantone) colors before exporting the document to PDF, so that these colors are not included in the PDF

    Hello, I have one question. Please help me with InDesign... I have prepared a Photoshop document with a Pantone color in the Channels palette. I imported this Photoshop file into an InDesign document, and when I export it from InDesign to PDF, the Pantone color overlays the CMYK... I have the Pantone color only for a partial varnish, not for the customer proof.
    For example: I need to disable all Pantone colors before I export the document to a PDF file. What I need to do is very clear, but I cannot find a solution. I know about the separations preview palette under Window/Output in InDesign, but that does not fix it! Please help.

    OK... I understand the overprint process... I have a spot color (Pantone) in the imported image because it is prepared for a partial varnish... but I am wondering how I can export the InDesign document to PDF without the spot color that recolors my design, as I cannot send it to the customer like that...
    Is the only way I can figure out to delete the spot channel in Photoshop? That is a very annoying and useless process.
    I looked at this thread:
    Photoshop images with spot channels in Indesign CS6 (layout question)
    and there is an answer to my question!
    My mistake was that I set the spot color for the partial varnish in Photoshop to a Pantone pink color! That was the problem! I had not known that I can set a white spot color (Lab color) instead, and it stays invisible in InDesign and in the exported PDF too!!! This is exactly what I want!

  • R3load EXPORT tips for a 1.5 TB MaxDB database are welcome!

    Hello to ALL;
    this post is addressed to ALL MaxDB gurus!!
    I have a few questions for the MaxDB performance guys about an R3load export on MaxDB!
    (remark about myself:
    I am a certified OS/DB migration consultant and have done over 500 OS/DB migrations since 1995, successfully)
    BUT now I am faced with the following setup for the export:
    HP IA64 BL870c, 4 dual-core CPUs + hyperthreading active (appears as 16 "CPUs" in top or glance)
    64 GB RAM, HP-UX 11.31;
    MaxDB 7.7.07
    ECC 5.0; Basis/ABA 640.22
    SAP kernel 640 Unicode, patch 381
    8 sapdata volumes are configured on separate 200 GB LUNs and VGs in HP-UX; on the storage side, the 8 220 GB LUNs are located on (only) nine 300 GB disks at RAID-5 level, and within HP-UX each VG/LVOL is mirrored via LVM to a second disaster datacenter (200 m distance)
    LOGFILES:
    2 x 4 GB LUNs, RAID-1 on the storage, and via LVM also mirrored to the failover center
    MaxDB data size: 1600 GB overall, of which 1350 GB used; TEMP usage about 25 GB!
    MaxDB parameter settings:
    MAXCPU 10 (4 x IA64 quad-core + hyperthreading shows 16 cores in top and glance)
    CACHE_SIZE = I/O buffer cache = 16 GB (2,000,000 pages)
    Data cache hit ratio: 99.61%
    Undo cache: 99.98%
    The following SAP Notes for MaxDB performance and migrations are well known and have already been processed:
    928037, 1077887, 1385089, 1327874, 954268, 1464560, 869267
    My major problem is the export runtime: over 4 days on the first dry run (6 R3loads running on a Linux application server), and 46 hours on the second run (6 R3loads running on the DB server, IA64).
    The third trial run was aborted by me after 48 hours, when 50% of the export dump space had been written. In all 3 dry runs, no more than approx. 3.5 GB of dump space was written per hour!
    My first question to all MaxDB gurus: how can I influence/optimize the TEMP area in MaxDB?? I did not find any hints in SDN, SAP Notes, the MaxDB wiki or Google... As far as I know, this TEMP area resides within the MaxDB datafiles, so it is spread over my 48 datafiles across the 8 LUNs/VGs/disks. But I see little throughput on the disks, and the MaxDB kernel uses only ONE of ten CPU cores (approx. 80% - 120% out of 1000%).
    The throughput and CPU usage do not change whether I use 2, 4, 10 or 16 R3load processes in parallel. The "result" is always the same: approx. 3.5 GB of export dump per hour, and 1 CPU used by the MaxDB kernel process!
    So the BIG question for me: WHERE is the bottleneck?? (RAID-5 disk LUNs mirrored with LVM???)
    On HP-UX I have increased the scsi_queue_depth_length default value from 8 to 32, 64 and 128 --> no impact.
    2nd question:
    I have read SAP Note 1327874 - FAQ: SAP MaxDB Read Ahead/Prefetch, and we are running MaxDB 7.7.07.19 (MaxDB 7.8 is not supported via PAM for the IA64 640 UC kernel). As far as I understood, this parameter will NOT help me on an export in primary key sequence - is this correct? THUS: which parameter helps to speed up the export runtime?
    MAXCPU is set to 10, but ONLY 1 of them is used??
    So this post is for ALL MaxDB GURUS
    who are interested in contributing to this "highly sophisticated" migration project with a 1.5 TB MaxDB database and ONLY 24h of downtime!!
    All tips and hints are welcome, and I will give you continued updates on this running project until WE have done a successful migration job.
    PS: The import has not yet started, but it should be done on vSphere 5 and SLES 11 SP1 with MaxDB 7.8... and of course in parallel to the export with the Migration Monitor. BUT again a challenge: 200 km distance from the source to the target system!!!
    NICE PROJECT;
    best regards Alfred

    Hi Alfred,
    nice project ... just some simple questions:
    Did you open a message at SAP? Maybe you could buy some upgrade support; this could be useful to get direct access to SAP support...
    Which byte order do you use? I know Itanium can use both. It should differ between source and target, otherwise you would simply use a backup for the migration.
    And the worst question, which I do not even want to ask: what about your MAXCPU parameter? Is it set to more than 1? This could be the reason why only one CPU is used.
    Best regards
    Christian
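    To rule out the MAXCPU suspicion quickly, the DBM CLI can read the active value. A sketch with placeholder SID and credentials; the command names are as I recall them from the MaxDB 7.7 DBM reference, so verify them for your version:
        # placeholder SID/credentials -- run on the DB server
        dbmcli -d PRD -u control,secret param_directget MAXCPU
        # change it (takes effect after a database restart):
        dbmcli -d PRD -u control,secret param_directput MAXCPU 10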

  • Error when exporting/importing

    I am using MDM 7.1
    When I export the schema from my Dev system to import into QA I get the following Error:
    "This repository requires additional steps before transport. See the MDS log for details.
    In the log my issue is that I am trying to export an Assignment that includes an expression that uses "look-ups".
    In my Dev system I removed the expression to confirm if this is the issue, once I no longer have expressions with look-ups then it allows me to export the schema. I then tries to import it to QA (since the expressions are not changing I planned on excluding them from the import as a temporary workaround.
    however I get the same error message when trying to import. It seems that I can not export or import with a system that has an assignment with an expression that uses look-ups.
    Is there some config I am missing?

    Hi Brad,
    assignments/validations are a general problem when it comes to schema exports/imports! What you can do, in case there are not too many assignments, is to delete the assignments and recreate them manually after you have imported the schema.
    Hope this helps a little.
    Regards,
    Erdal

  • System copy using SAPinst (export/import, database-independent process) failed

    Hello,
    I am doing a system copy using the SAPinst export/import process.
    Source system: SAP NetWeaver '04 (BW 3.5, kernel: UC 640, patch level 196)
    The export fails in Phase 2 - Database Export, at R3load jobs: running 1, waiting 0. Below are the log details:
    SAPSDIC.log
    (EXP) INFO:  entry for BAPICONTEN                        in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20000621155733
    (EXP) INFO:  entry for BAPICONTENT255                    in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20031127161249
    (EXP) INFO:  entry for BAPICONVRS                        in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20010131174038
    (EXP) INFO:  entry for BAPICREATORDATA                   in DDNTT is newer than in DDNTT_CONV_UC: 20040211101817 > 20000621155733
    (EXP) INFO:  entry for BAPICRMDH1                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMDH2                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMEXP                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175513 > 20031211120627
    (EXP) INFO:  entry for BAPICRMEXT                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175514 > 20031211120627
    (EXP) INFO:  entry for BAPICRMKEY                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMKEY_T                      in DDNTT is newer than in DDNTT_CONV_UC: 20051229175835 > 20031211120803
    (EXP) INFO:  entry for BAPICRMMSG                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175654 > 20031211120714
    (EXP) INFO:  entry for BAPICRMMSG_T                      in DDNTT is newer than in DDNTT_CONV_UC: 20051229175835 > 20031211120803
    (EXP) INFO:  entry for BAPICRMOBJ                        in DDNTT is newer than in DDNTT_CONV_UC: 20051229175514 > 20031211120628
    (EXP) INFO:  entry for BAPICRMPAREX_T                    in DDNTT is newer than in DDNTT_CONV_UC: 20051229175452 > 20031211120305
    (EXP) INFO: limit reached, 5000 tables in DDNTT are newer than in DDNTT_CONV_UC
    (EXP) INFO: NameTab check finished. Result=2  #20100113131216
    (EXP) INFO: check for inactive NameTab entries: Ok.
    (RSCP) INFO: I18N_NAMETAB_TIMESTAMPS not in env: checks are ON (Note 738858)
    (DB) INFO: disconnected from DB
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: END OF LOG: 20100113131216
    ***SAPCLUST.log ****
    (NT)  Warn:  EDIDOC: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "EDIDOC"
    (NT)  Warn:  PCDCLS: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "PCDCLS"
    (NT)  Warn:  SFHOA: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "SFHOA"
    (NT)  Warn:  SFHYT: normal NameTab from 20040211095029 younger than alternate NameTab from 20000621155733!
    (EXP) TABLE: "SFHYT"
    (NT)  Warn:  UMG_TEST_C: normal NameTab from 20040211095029 younger than alternate NameTab from 20031113150115!
    (EXP) TABLE: "UMG_TEST_C"
    myCluster (55.22.Exp): 712: error when retrieving alternate nametab description for physical table UMG_TEST_F.
    myCluster (55.22.Exp): 713: return code received from nametab is 32
    myCluster (55.22.Exp): 299: error when retrieving physical nametab for table UMG_TEST_F.
    (CNV) ERROR: data conversion failed.  rc = 2
    (RSCP) INFO: I18N_NAMETAB_TIMESTAMPS not in env: checks are ON (Note 738858)
    (DB) INFO: disconnected from DB
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: job finished with 1 error(s)
    D:\usr\sap\B35\SYS\exe\run/R3load.exe: END OF LOG: 2010011312563
    Please suggest.
    Thanks & Regards
    Ganesh

    Is your DB unicode?  If so, did you select the unicode flag in sapinst?
    This thread might offer some help: "System Copy Error while exporting ABAP".
    -Zach

  • Regarding Distribution Monitor for export/import

    Hi,
    We are planning to migrate a 1.2 TB database from Oracle 10.2g to MaxDB 7.7, and we are currently testing the migration on a test system with 1.2 TB of data. First we tried a simple export/import, i.e. without Distribution Monitor: we were able to export the database in 16 hours, but the import had been running for more than 88 hours, so we aborted it. Later we found that we can use Distribution Monitor to distribute the export/import load over multiple systems so that the import completes within a reasonable time.
    We used 2 application servers for the export/import: the export completed within 14 hours, but here again the import was running for more than 80 hours, so we aborted it. We also did table splitting for the big tables, but no luck. 8 parallel processes were running on each server, i.e. one CI and 2 app servers. We followed the DistributionMonitorUserGuide document from SAP.
    I observed that on the central system, CPU and memory utilization was above 94%, but on the 2 application servers we added, CPU and memory utilization was very low, i.e. 10%. Please find the system configuration below:
    Central instance - 8 CPUs (550 MHz), 32 GB RAM
    App server 1 - 8 CPUs (550 MHz), 16 GB RAM
    App server 2 - 8 CPUs (550 MHz), 16 GB RAM
    Also, when I used the top Unix command on the app servers, I could see only one R3load process in the run state, while the other 7 R3load processes were sleeping. On the central instance all 8 R3load processes were in the run state. I think that the app servers not running all 8 R3load processes at a time could be the reason for the very slow import.
    Please can someone let me know how to improve the import time? Also, if someone has done a database migration from Oracle 10.2g to MaxDB, it would be helpful if they could tell how they did the migration. And any specific document available for a database migration from Oracle to MaxDB would be helpful.
    Thanks,
    Narendra

    > Also, when I used the top Unix command on the app servers, I could see only one R3load process in the run state, while the other 7 R3load processes were sleeping. On the central instance all 8 R3load processes were in the run state. I think that could be the reason for the very slow import.
    > Please can someone let me know how to improve the import time?
    R3load connects directly to the database and loads the data. The question here is: how is your database configured (in the sense of caches and memory)?
    > Also, if someone has done a database migration from Oracle 10.2g to MaxDB, it would be helpful if they could tell how they did the migration. And any specific document available for a database migration from Oracle to MaxDB would be helpful.
    There are no such documents available, since the process of migrating to another database is called a "heterogeneous system copy". This process requires a certified migration consultant to be on-site to do or assist the migration. Those consultants are trained specially for certain databases and know the tips and tricks for improving the migration time.
    See
    http://service.sap.com/osdbmigration
    --> FAQ
    For MaxDB there's a special service available, see
    Note 715701 - Migration to SAP DB/MaxDB
    Markus
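    To answer the cache question concretely, the configured I/O buffer cache can be read with the DBM CLI. A sketch with placeholder SID and credentials; the parameter and command names are as I recall them for MaxDB 7.7, so verify for your version:
        # placeholder SID/credentials -- shows the configured I/O buffer cache (in pages)
        dbmcli -d TST -u control,secret param_directget CACHE_SIZE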

  • Export/Import and Client Copy

    Hi All,
    Could you please help me with the major differences between export/import, client copy, and a system refresh? How do they differ from each other?
    Regards
    Rajesh

    Hello Rajesh,
    I have captured some information from SAP Help:
    Local Copy: Copying Clients Within a System:
    You can improve the performance of the client copy, for example, by excluding tables or packages, with Edit -> Expert Settings.
    You can exclude tables from the client copy, for example if they are not relevant for the target client, in the Tables tab.
    Copying Clients Between Systems (Remote Copy):
    The same product is installed, with the same release, in both systems
    The client copier can copy a client into another system. The systems can be on different platforms. You can change the client number.
    When you copy a client from one system to another, the data is transferred directly via the RFC interface - there is no intermediate storage on hard disk.
    Transporting Clients Between Systems; Client Export (SCC8):
    The client copier can copy a client into another system, which can be on a different platform. You can change the client number.
    You are no longer required to transport clients before you can copy them between systems. You can make a remote copy instead.
    Up to three transport requests are created, depending on the selected copy profile and the existing data.
    The transport request for texts is e.g. only created if the source client contains customer texts.
    <sid>KO<no>  cross-client data
    <sid>KT<no>  client-specific data
    <sid>KX<no>  texts and forms
    The data export is performed automatically and asynchronously. The output of the export includes the names of the transport requests that are to be imported.
    Import transport requests into the target client (STMS)
    Choose one of the transport requests of the client transport in the Transport Management System (TMS). The other transport requests belonging to this client transport are then automatically added in the correct order.
    Import these transport requests into the target client.
    Client Import Postprocessing (SCC7):
    You need to perform postprocessing activities to adapt the runtime environment to the current state of the data.
    Copy by Transport Request :
    This function transports customizing changes that have been recorded in a transport request between two clients in a system.
    You can choose whether you only copy the object list of the request or also the object lists of unreleased tasks in the request.
    Entries in the target client are overwritten or deleted according to the key entries in the transport request.
    Choose Administration -> System administration -> Administration -> Client admin. -> Special Functions -> Copy Transport Request.
    For more info, see: http://help.sap.com/saphelp_nw70/helpdata/en/69/c24c0f4ba111d189750000e8322d00/frameset.htm
    Regards ,
    Santosh Karadkar
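    For completeness, the same import can also be driven with the tp command line instead of STMS. A sketch with placeholder request numbers, SIDs and profile path:
        rem placeholder request/SID/profile -- run from the transport host
        tp addtobuffer DEVKO00123 QAS pf=/usr/sap/trans/bin/TP_DOMAIN_DEV.PFL
        tp import DEVKO00123 QAS client=100 pf=/usr/sap/trans/bin/TP_DOMAIN_DEV.PFL
    As described above, STMS pulls in the matching KT/KX requests of a client transport in the correct order automatically; with plain tp you have to keep that order yourself.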

  • Error during export & import of rules in RAR 5.2

    Hi all,
    We followed the steps for exporting & importing the rules as described in the config guide, but we are receiving the error below during the import of the rules. Can anybody please shed some light on why this error message is appearing?
    "Error in Table Data ==> VIRSA_CC_CRPROF
    SQL==>Insert into VIRSA_CC_CRPROF(VSYSKEY,PROFILE,RULESETID,RISKLEVEL,STATUS)Values(??????)
    Record==>D VIRSA_CC_CRPROF null"
    We also ensured that the downloaded file is not truncated, and saved it in a separate folder. Does the file need to be saved as ANSI or Unicode text? We saved it as ANSI.
    Also, the background job is not getting scheduled automatically during the import of the rules. Is that due to the above error?
    Thanks and Best Regards,
    Srihari.K

    Hello Sri,
    The background job for generating the rules won't be scheduled before you upload the file successfully.
    The most obvious reason for this error message is that a line in your file - in table VIRSA_CC_CRPROF - is corrupted. Make sure all lines for table VIRSA_CC_CRPROF have all the predefined fields (VSYSKEY, PROFILE, RULESETID, RISKLEVEL, STATUS).
    If you keep hitting this problem, just delete this table; after the upload you'll add the critical profiles manually - I bet you don't have many of them.
    Also, from my experience, I always choose to save the downloaded file in Unicode UTF-8 format.
    Once the file is saved in another format there's no use for it; download it again and save it directly as Unicode.
    Make sure you don't have empty fields: even in fields where you have no values, you must leave a space, otherwise you'll keep hitting the same issue.
    Regards,
    Iliya

  • SLD Export/Import of Technical and Business Systems

    Dear all,
    I am new to SLD administration.
    We have the task of updating the local SLD on the production PI system PPI (target) with the business systems ZEC and ZSC from the production master SLD on the production Solution Manager system PSM (source), using export/import of the necessary technical and business systems.
    Please confirm whether I am following the correct steps:
    Go to SLD Administration on the PPI target system:
    All Instances
    Instances by Class
    Model
    (which one of these options needs to be selected? please let me know)
    Go to SLD Administration on the PSM source system,
    select the Export button ->
    select the business systems one by one and perform the export, which will create a .ZIP file
    on the local machine.
    Go to SLD Administration on the PPI target system,
    select the Import button ->
    select the export ZIP file created in the previous steps from PSM and provide its location on the desktop,
    then press Start Import (only if you get the Continue Import option).
    Please let me know if the above procedure is correct for business systems.
    If yes, what needs to be done for technical systems? Do they require export/import as well?
    When I go to the technical systems option, the Export option is disabled, but when I go to the
    business systems option, the Export button is enabled and I am able to perform the export.
    Do I need to export/import only the business systems?
    Please help me.
    Regards,
    RR

    Manoj,
    It seems the entries were added manually... should we remove them and run the import again for the same systems, or just try to overwrite? Regarding the overwrite option (as we have an open OSS message), SAP has suggested that if you end up with problems when trying to overwrite, then the PPI SLD is broken and will have to be rebuilt, because someone made a manual change to it and got the GUIDs out of sync with the master.
    What I do not understand: if the target system PPI SLD gets broken during the overwrite operation, as mentioned by SAP, what is the way/steps/procedure to rebuild the SLD?
    In the rebuild process, do I need to import (Export -> Export lines -> ALL) the export backup created before the start of this activity? Please let me know the SLD rebuild steps, and about the remove/re-import question.
    I appreciate your time on this. Thanks in advance.

  • Export/Import Process in the UI for Variations Content Translation is Generating CMP Files with No XML

    We have a SharePoint 2010 publishing website that uses variations to deliver content in multiple languages. We are using a third-party translation company to translate publishing pages. The pages are exported using the export/import UI process described here: "http://blogs.technet.com/b/stefan_gossner/archive/2011/12/02/sharepoint-variations-the-complete-guide-part-16-translation-support.aspx".
    Certain sub-sites are extremely content-intensive. They may contain many items in the Pages library as well as lists and other sub-sites.
    For some sub-sites (not all), the exported CMP file contains no XML files. There should be a Manifest.XML, Requirements.XML, ExportSettings.XML, etc., but there are none. After renaming the CMP file to CAB and extracting it, the only files it contains are DAT files.
    The only difference I can see between the sub-sites that generate CMP files with no XML files is size. For example, there is one site that is 114 MB that produces a CMP file with no XML files. Small sites do not have this problem. If size is the problem, then I would think the process would generate an error instead of creating a single CMP file that contains only DAT files. However, I do not know exactly what the export/import process in the UI is doing.
    This leads to two questions:
    1. Does anyone know why some CMP files, when renamed to *.CAB and extracted, would not contain the necessary XML files?
    2. If exporting using the UI will not work, can I use PowerShell? I have tried Export-SPWeb, but the Manifest.XML does not contain translatable content. I have not found any parameters that I can use with Export-SPWeb to make the exported CMP match the format of the one produced by the export/import process in the UI.
    As a next step, we could try developing custom code using the Publishing Service, but before doing this, I would like to understand why the export/import process in the UI generates a CMP that contains no XML files.
    If no one can answer this question, I would appreciate some general help on understanding exactly what is happening with the export/import process -- that is, the one that runs when you select the export or import option in the Site Manager drop-down. Understanding what it is actually doing will help us troubleshoot why there are no XML files in certain export CMPs and assist with determining an alternate approach.
    Thanks in advance
    Kim Ryan, SharePoint Consultant kim.ryan@[no spam]pa-tech.com
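    For question 2, a minimal Export-SPWeb attempt of the kind described above looks like this; the URL and path are placeholders (parameters per the SharePoint 2010 cmdlet):
        # placeholder URL/path -- run in the SharePoint 2010 Management Shell
        Export-SPWeb -Identity "http://portal/sites/pub/en-us" -Path "E:\exports\en-us.cmp" -IncludeVersions All -IncludeUserSecurity
    As the poster notes, the resulting manifest is the content deployment format rather than the variations translation package format, so this confirms the limitation rather than solving it.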

    I wanted to bump this post to see about getting some more responses to your problem. I'm running into the same problem as well. We're running a SharePoint 2010 site and are looking at adding variations now. The two subsites with the most content take a while to generate the .cmp file (one to two minutes of the browser loading bar spinning waiting on the file). Both files are generated with a lot of .dat files but no .xml files. I was thinking like you that it must be a size issue. Not sure though. Did you ever happen to find a solution to this problem?
