TDMS shell creation - R3szchk running for a long time

Hi,
I am running a shell creation package. In the Export ABAP phase, R3szchk has been running for a long time and nothing is written to the log file:
-rw-r-----    1 root     sapinst           0 Oct  4 19:56 /tmp/sapinst_instdir/ERPEhP4/LM/COPY/DB6/EXP/CENTRAL/AS-ABAP/EXP/R3szchk.exe.log
Where else can I check to find out the issue?
Thanks
Din
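A minimal sketch of checks that can help while R3szchk appears to hang (the sapinst directory is the one from the listing above; the commands and transaction names are standard tools, not taken from this thread):
ps -ef | grep -i r3szchk        # is the R3szchk process still alive?
ls -lrt /tmp/sapinst_instdir/ERPEhP4/LM/COPY/DB6/EXP/CENTRAL/AS-ABAP/EXP    # look for new or growing output files (e.g. DBSIZE.XML)
# R3szchk determines table sizes by querying the database, so a long runtime often
# shows up as long-running SQL on the DB6 side; check with db2top or transaction DBACOCKPIT.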

Thanks Pankaj for the valuable information.
--- The size of the exports may depend on the data in your client-independent tables. There are certain client-independent tables which hold a lot of data, for example STXL, STXH and the DYNP* tables. It also depends on the volume of custom development you have done in your systems. Most of the client-independent tables are exported in full during shell.
I could understand that. I ran the optional activity "Check size of large tables (optional)" in the TDMS Shell package, and it returned some large tables, such as
STXH - 190,854,900 KB
STXL - 26,819,840 KB
DYNPSOURCE - 20,452,251 KB
Some of these are client-independent and some are client-dependent tables. For example, STXH is a client-dependent table with a size of 190 GB; does that mean this table will be exported in full during the shell export?
However, you need to be sure that because of certain mistakes you are not exporting the complete system (including application data). As stated by Marcus, you need to make sure that you are using Migration monitor in manual mode and that you configure the Migration monitor control file with correct parameter values.
I think I configured the export monitor control file correctly and started it manually.
Here is the export monitor startup log:
Export Monitor is started.
CONFIG: 2011-10-05 00:49:22
Application options:
dataCodepage=4102
dbType=DB6
ddlFile=/shell_install/DDLDB6.TPL
ddlMap=
exportDirs=/shell_export/ABAP
ftpExchangeDir=
ftpExportDirs=
ftpHost=
ftpJobNum=3
ftpPassword=*****
ftpUser=
host=
installDir=/shell_install
jobNum=6
loadArgs=
mailFrom=
mailServer=
mailTo=
monitorTimeout=30
netExchangeDir=
orderBy=name
port=
r3loadExe=
server=
taskArgs=
trace=all
tskFiles=yes
Thanks
Din
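A minimal sketch of how the export monitor is typically started manually in console mode with a configuration like the one above (directory names are the ones from this thread; export_monitor.sh and export_monitor.log are the standard migration monitor file names, assumed here rather than quoted from the post):
cd /shell_install                 # installDir from the configuration above; the migration monitor is assumed to be unpacked here
./export_monitor.sh               # console mode; reads export_monitor_cmd.properties from the current directory
# in a second session, watch progress:
tail -f /shell_install/export_monitor.log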

Similar Messages

  • Migration monitor in TDMS shell creation

    Hi,
    I am working on a TDMS shell creation for an ECC system and I have a few questions.
    1) Is it mandatory to run the migration monitor manually for both exporting and importing the database with the R3load option in TDMS shell creation, or only for the R3load export?
    2) How do I run the export monitor activity in the background?
    3) How do I run parallel R3load processes to export a single package instead of running one R3load per package?
    OS: AIX
    DB: DB2
    SAP ECC 6
    Thanks
    Praveen

    Answers to your questions are below -
    1) It is mandatory during the export of data from the source system.
    2) It should not be run in the background.
    3) Use table splitting with R3TA, but that has to be done before the export.

  • TDMS shell creation

    Is TDMS shell creation for the ABAP stack only?
    What happens in the ABAP+Java systems?
    Thanks,
    Sri

    Hi Pankaj,
    I don't think TDMS is meant for Java. At present it is only meant for the ABAP stack, to transfer data using different scenarios.
    You can transfer the data from the source while the Java settings in the target remain the same.
    Thanks,
    Sridhar

  • TDMS 3.0 for ERP 6.0: step-by-step procedure for TDSH6 Shell creation

    Dear All,
    I need to perform a shell creation. I have never done this before and need help with step-by-step instructions.
    I need to refresh my development system with a TDMS shell creation from the production client in the production system.
    I am familiar with TDTIM and have performed many TDTIM copies. I have also gone through the master and operational guides for TDMS.
    I will try to list my understanding so far after reading the guides.
    Please correct me if I am wrong and also suggest the correct steps.
    1. All three of my systems (central, sender and receiver) are ready from an installation point of view, with authorizations and the latest SPs.
    2. I log into the central/control system and start a new package "ERP Shell creation package for AP Release higher 4.6".
    3. As I have never executed this option, I do not know what steps are inside it.
       But I assume I will be asked for the details of the source client from which the shell needs to be created (000 or the production client).
       What type of shell: slave or master?
    4. During the shell creation from the sender system client, is there downtime for the sender system?
    5. Once the shell creation is done, an export dump will be created.
    6. I perform all the preprocessing steps of a homogeneous system copy on the receiver system.
    7. Stop the receiver system, start the sapinst database instance installation as a homogeneous system copy with the R3load option.
    8. Provide the export dump location to sapinst.
    9. After the database instance is installed, perform the system-copy-specific post-processing steps.
    10. After the system is ready it has only client 000 available in the development system, and I need to create a new client from 000 which will act as my new development client.
    I request all you experts to please provide your comments at the earliest, as the activity starts on the 5th of this month.
    Regards,
    Prateek.

    Hi,
    Let me answer your questions.
    Some queries
    1. Is there downtime for the production system (sender system)?
       SAP recommends that nothing much should be going on during the export of the sender (production) system, but so far I have never encountered an issue doing the export in online mode.
    2. Does shell creation work on both the production and 000 clients, or only the production client?
       Be careful! A shell creation is a normal homogeneous system copy using R3load. You will create a 'new' system with the DDIC and the client-independent customizing of the sender system. ALL former data of the target system will be deleted.
    You mention that you will refresh a development system that way. Please be aware of the implications that a development refresh has. That said, after the refresh you will have the 000 client and parts (users ... authorizations of the sender) of the other clients on the new system.
    3. When I import the dump using sapinst, how many clients are available in the receiver system?
    See above. All client configuration will be available, but nearly no data will be in these clients except 000. A manual part of the shell creation is to clean up the target system (delete the clients that are not needed). This is a fast process.
    4. Somewhere in the guide it is mentioned that downtime depends on the type of shell I create.
    Kindly suggest whether I should create a master shell or a slave.
    Hm. The master shell is just a concept: you create a completely new system and put that system into your transport landscape without filling it with data. You can then use that system as the source for a normal (database-dependent) system copy without impacting your production system. The result is a very small copy that can be repeated very often without an impact on prod/QA...
    Downtime, as described in point 1, is not required.
    5. Once I execute this shell creation activity for, let's say, receiver system A, can I then use the same export dump for a second receiver system B?
    Yes.
    6. Lastly, please suggest the activity execution time, as the customer has given us 2 days to execute this activity.
    This depends on your knowledge of homogeneous system copies and the preparation, plus items like the size of STXL and other tables that will be exported/imported. If you are very familiar with the system copy process using R3load, 2 days are feasible.
    I have seen shell creations done within 1-2 days but also much longer ones!
    As a prerequisite I would always recommend having the latest R3* + migmon executables available on sender and target.
    I hope I have clarified some of your items.
    Best Regards
    Joerg

  • TDMS for BI - shell creation and SMIGR_CREATE_DDL

    If a homogeneous system copy of a BI system is done, it's necessary to run the job SMIGR_CREATE_DDL because depending on what database is used, special SQL scripts for tables must be used (concerning primary keys).
    How does that integrate into a shell creation? Is the shell creation process aware of those scripts? I mean, it's theoretically possible that the SQL scripts contain a table that is excluded from being transferred?
    Markus

    Markus and Pankaj:
    I have created a shell (BW) using Markus' procedure as listed below.
    Because Markus did not mention that SMIGR_CREATE_DDL should be run at the beginning, I did not run it.
    How can I determine what is missing in the created shell?
    We do have partitioned tables and bitmap indexes.
    Thanks!
    Here is the way provided by Markus:
    I basically do shell creation as follows:
    - install the latest system copy tools (R3load, R3ldctl, R3szchk, libdb<database>slib.dll)
    - create an installation directory and give full permissions to <sid>adm and SAPService<SID>
    - create a directory for export location and give full permissions to <sid>adm and SAPService<SID>
    - open a cmd.exe as <SID>adm, step to the installation directory and execute "R3ldctl -l R3ldctl.log -p ." (note the "dot" which means actual directory)
    - in parallel start the client export of client 000 in source system using SCC8 (profile SAP_ALL) and note the file names you get in the last dialog window
    - when R3ldctl is finished give permissions to SAPService<SID> for all files in the installation directory
    - proceed with TDMS
    - when you are at the point to start the system copy start sapinst, choose system copy and select start migration monitor manually (VERY important!)
    - sapinst will run R3ldctl, R3szchk and then prompts you to start migration monitor
    - step to your normal installation directory (c:\program files\sapinst_instdir....), open export_monitor_cmd.properties and adapt the file. The important thing is that you need to point to YOUR DDL<DB>.TPL file that you created in step 4 (in my list here)
    - start export_monitor.cmd and export the system
    - proceed with TDMS to adapt the database sizes (DBSIZE.XML)
    Import:
    - if you have an already installed system uninstall it
    - start sapinst normally, choose system copy and point to the export you created
    - install the system normally (as a new installation)
    - if you want to make sure the import works the same way as the export, choose "start migration monitor manually"
    - if sapinst stops and requests you to start migration monitor copy the kernel from source system to target system
    - configure import_monitor_cmd.properties and start migmon
    - logon in client 000, start transaction STMS and create a domain (basically only a transport profile will be created)
    - start program BTCTRNS1 (this will suspend all jobs from the source system but not delete them)
    - copy the files from the client copy to TRANSDIR/cofiles and TRANSDIR/data and import them to client 000 (either use STMS or use command line tp)
    - adapt profile parameters (RZ10)
    - run SGEN
    - invalidate connections in SM59 that point to other production systems
    - finished
    - to re-enable all the jobs run BTCTRNS2; however, I'd do this only if you're sure you have invalidated RFCs and/or sending jobs (e.g. from SCOT)
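    Since the question at the top of this thread is on AIX/DB6, a hedged sketch of the equivalent manual preparation on UNIX (directory names are the ones used earlier in the thread, the <sid>adm user is a placeholder, and the latest system copy tools are assumed to be in that user's PATH):
    SIDADM=prdadm                                   # placeholder for your <sid>adm user
    mkdir -p /shell_install /shell_export/ABAP      # installation and export directories (run as root)
    chown "$SIDADM":sapsys /shell_install /shell_export/ABAP
    su - "$SIDADM" -c "cd /shell_install && R3ldctl -l R3ldctl.log -p ."    # writes the *.STR files and DDL<DB>.TPL to the current directory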

  • What is the "BW shell creation package for BW system" in TDMS?

    I see that for lower versions of SAP systems, such as 4.6 and BW (e.g. 3.0, etc.), there is a "shell creation package" for those types of systems.
    Could you explain what is that for?
    Thanks!

    1) where to find the info about the copy sequence such as: first shell creation, then initial package or whatever, then ...?
    > The TDMS solution operation guide will be helpful; it is available on the SAP Service Marketplace. In short, the sequence is -
    first shell (only if the repository is not in sync), then an initial package (e.g. TDTIM), and then refresh packages (as and when needed).
    2) For HR we do not have a shell creation; what we have is an "initial package for master data and customizing", and the next step is "ERP initial package for HCM personnel dev. PA & PD". Why do we not have a shell creation here?
    > For HR, as only a few objects are transferred, the need for a full shell system does not arise. Only the objects to be transferred are synced.
    3) Will TDMS replace system copy for ERP, BW, CRM completely?
    > Shell will not replace it, but we may say that it supplements it. When you need a complete copy, meaning the repository along with the application data, then you need to go for a system copy; but if you only need the repository to be copied from source to destination, then shell is helpful. Also note that Shell is a TDMS package whereas system copy is an SAP standard tool.
    I hope the above response is helpful.
    Regards,
    Pankaj

  • SAP TDMS System Shell creation

    Hello,
    I am new to SAP TDMS. I want to know about system shell creation.
    Is the system shell used to create a totally "new" system, or can it be done on an existing system? Can it be done on an existing system by using a separate client?
    If we use an existing system, what happens to its already present data? The operations guide says we need to do a homogeneous system copy using TDMS tools, and this will delete the data in the existing system. So does that mean all database content will be deleted, or only the client-specific data?
    Many Thanks.
    Ravindra

    You can get more details  from the following documents:
    Operations Guide
    http://service.sap.com/instguides -> SAP Components -> SAP Test Data Migration Server -> TDMS 3.0
    Security Guide
    http://service.sap.com/security -> Security in Detail -> SAP Security Guides -> SAP Test Data Migration Server

  • What is the "shell creation" in TDMS?

    I cannot find the exact definition of above term "shell creation". Would you please help? Thanks!

    In TDMS, there are different ways of selecting data from the production server. So there are different Process Types.
    Shell Creation is one of those types.
    SHELL CREATION- Migrated data contains only cross-client data and the required client-specific user and address data, but no other client-dependent data.
    Hope this answers your question

  • ERP shell creation package

    For any ERP related package (except client deletion package), do we have to run "ERP shell creation package" at the very beginning? 
    The reason I ask this question is as follows:
    To bring HR configuration to the target, we have 2 different methods:
    1) client copy
    2) TDMS package called "ERP initial package for MDC"
    Therefore I have the impression that at least method 1) does not need a shell creation.
    I am not sure whether method 2) needs a shell creation first or not.
    Could you help explain?
    Is there any other exception for shell creation?
    Thanks!

    Hello
    No, shell is not a mandatory step before a TDMS ERP-related package. The general requirement of TDMS packages (time-based or object-based) is that the repository on both the sender and the receiver should be identical. So if the repository on both systems is already in sync, then Shell is not needed. Also, if there are only minor differences in the repository, it may still not be required to do a shell, provided the inconsistencies can be removed manually or by importing some missing transports.
    If the differences between the two systems are many, or if the receiver system needs to be built from scratch, then Shell is required.
    I hope that helps
    Best regards
    Pankaj

  • TDMS Shell - DB Export from source/sender system taking a VERY long time

    We're trying to build a TDMS Receiver system using the TDMS Shell technique. We've run into a situation wherein the initial  DB Export from source/sender system is taking a VERY long time.
    We are on ECC 6.0, running on AIX 6.1 and DB2 UDB v9.7. We're executing the DB export from sapinst, per instructions. Our DB export parallelizes, then the parallel processes one by one whittle away to just one remaining, and at that point we find out that the export is single-threaded and exporting table BSIS.
    BSIS is an FI transactional data table. We're wondering why the DB export is trying to get BSIS and its contents out. Isn't the DB export in the TDMS Shell technique only supposed to get SAP-essential config and master data, and NOT transactional data?
    Our BSIS table is nearly 700 GB in size by itself. That export has been running for nearly a week now, with no end in sight.
    What are we doing wrong? We suspect we may have missed something, but we really don't think we did. We also suspect that the EXCLUSION table in the TDMS Shell technique may be the KEY to this whole thing. It's supposed to automatically exclude very large tables, but in this case it most certainly missed excluding BSIS for some reason.
    Anyway, we're probably going to fire up an OSS Message with SAP Support to help us address this perplexing issue. Just thought we'd throw it out there to the board to see if anyone else somewhere has run into similar circumstances and challenges.  In the meantime, any feedback and/or advice would be dearly appreciated. Cheers,

    Hello
    Don't be bothered about the other TPL file DDLDB6_LRG.TPL; we are only concerned with DDLDB6.TPL.
    Answer the following questions to help me analyze the situation -
    1) What is the current size of the export dump?
    2) Since when has the export been running?
    3) What is the size of the source DB? Do you have a huge amount of custom development?
    4) Did you try to use table splitting?
    5) Do you suspect that there may be other transaction tables (like BSIS) which have been exported completely?
    6) Did you update the SAP kernel of your source system to the latest version before starting the Shell package?
    7) Were the DB statistics updated during the shell, or were they already updated before starting the Shell?
    8) Is your system a distributed system, i.e. are the central instance and the database instance on different servers?

  • Shell creation issue

    Dear experts, I've got a question regarding shell creation of TDMS.
    I've finished the export from an ECC IDES system (SR3). The size of the export files is over 30 GB... is that normal? Can I reduce the data? It will take more than 200 GB of disk space in the target system.
    Is this set up in the step "Determine Tables to be Excluded from the Export"?
    Which data can be reduced in the tables for an IDES shell creation?
    Thanks!

    Hi Jett
    Your first question is not very clear to me -
    Seems that the data folder of the export is still the same as before. Is it normal?
    > I think you mean that the data folder of the export is the same size as before. Yes, that is normal. However, I feel the doubt that remains in your mind is that you want to be sure it is doing a reduced transfer and not a full transfer. Well, you can't make that decision based on the size of the export folder. To decide, you can do the following -
    Pick one excluded table and search for it in the .TOC files in the export folder. If you find it in any of the .TOC files, that indicates a full transfer is happening instead of a reduced one. If you don't find it in a .TOC file but you do find the excluded table in the .STR files, that indicates a reduced transfer is happening and everything is fine. The presence of the excluded table in a .STR file means that the structure information of the excluded table is transferred although its data is not. (A shell sketch of this check is shown after this post.)
    Regarding your second set of queries -
    Please understand that there are two parts to shell creation. One is excluding the data to be transferred for a set of tables, so that no data at all is transferred to the target system for the excluded tables. The second part is how much file space (on the target system) should be allocated to the tables on the excluded-tables list; by default the import process assigns the same space to a table in the target system as the space allocated to it in the source system. So if a table for which no data is transferred was, say, 10 MB in the sender, the import process will allocate 10 MB to that table in the target system as well, although it will be empty. To avoid this and to reduce the overall size of the target system, we reduce the sizes allocated to the excluded tables in the target system in the activity "Determine and Modify Size of Tables".
    Now for the question of what the ratios mean and on what basis we assign them -
    > For example, Ratio20 means that the import process will assign 20% of the original size (in the sender system) to the table in the target system. Regarding how the ratios are determined: they are based on how much data you expect each table to hold later on. So this depends on which TDMS process (viz. TDTIM, TDMDC, TDTCC) you want to execute after shell creation. In the case of the TDTIM process type it may also depend on the period for which you want to transfer the data. For this reason we have provided various templates (viz. TDTIM, TDMDC, etc.) in the activity "Determine and Modify Size of Tables"; we have tried to build this intelligence into the system. So if you want to execute a TDMDC scenario after shell creation, choose the TDMDC template in the activity "Determine and Modify Size of Tables" and the system will automatically assign different ratios to different tables.
    In my last post I suggested that if you want to further reduce the size of your target system (although the assigned template will already reduce it), you can do so by further reducing the ratios for the tables for which you don't expect much data to be transferred. This functionality of altering the ratios for a certain table or set of tables is available in the activity "Determine and Modify Size of Tables". SAP already delivers ratios which should be optimal for the different tables; if you want to override those settings and provide your own ratios, you do so at your own risk.
    I hope this post explains the entire concept. In case you have any doubts, feel free to write again.
    Regards
    Pankaj.
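    A concrete illustration of the .TOC/.STR check Pankaj describes above, as a hedged shell sketch (the table name and the export path are only examples):
    cd /shell_export/ABAP                               # assumed export directory
    find . -name "*.TOC" -exec grep -li "BSIS" {} +     # any hit here suggests the table's data was exported (full transfer)
    find . -name "*.STR" -exec grep -li "BSIS" {} +     # a hit only here means just the structure is exported (reduced transfer)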

  • Is an existing client the only prerequisite for shell creation?

    I am preparing my first shell creation. From what I read, I get the impression that an existing client is
    the only prerequisite for a shell creation. I already created the CPIC user and installed DMIS_  on the sender and central systems.
    Could you confirm?  Thanks!

    TDMS Shell is a form of system copy where you copy an existing source system into a target system but only the non-application related data. TDMS Shell is a prerequisite for other TDMS processes.
    Regarding prerequisites for Shell - all prerequisites and preparation that apply to SAP system copy apply for Shell also. Refer to SAP system copy guide for the same.
    Best regards
    Pankaj

  • Shell creation on iSeries

    Hi,
    Does anyone have experience with shell creation on iSeries?
    With the program R3LDCTLDB4 we've created the STR & TPL files and they have been adjusted via TDMS, but now we somehow have to start a system copy based on these files.
    When starting SAPinst, it makes preparations to copy the whole system.
    Does anyone have an idea how to tell SAPinst to copy the system based on the already generated STR & TPL files (on iSeries)?
    Kind regards,
    Nicolas De Corte

    Start SAPinst with the parameter SAPINST_CWD = path of the directory where the edited TPL files are placed.
    For example, if your installation directory is d:\sapinst (the directory where your TPL files are placed), then start sapinst like this -
    SAPINST SAPINST_CWD = "D:\SAPINST"
    CWD stands for current working directory. It is also advised to set the environment parameters SAP_DIR and TMP to D:\SAPINST.
    I hope it helps.
    Best regards,
    Pankaj.

  • Using a UNIX shell script to run a Java program (packaged in a JAR)

    Hi,
    I have an application (very small) that connects to our database. It needs to run in our UNIX environment so I've been working on a shell script to set the class path and call the JAR file. I'm not making a lot of progress on my own. I've attached the KSH (korn shell script) file code.
    Thanks in advance to anyone who knows how to set the class path and / or call the JAR file.
    loggedinuser="$(whoami)"
    CFG_DIR="`dirname $0`"
    EXIT_STATUS=${SUCCESS}
    export PATH=/opt/java1.3/bin:$PATH
    OLDDIR="`pwd`"
    cd $PLCS_ROOT_DIR
    java -classpath $
    EXIT_STATUS=$?
    cd $OLDDIR
    echo $EXIT_STATUS
    exit $EXIT_STATUS

    Hi,
    I have an application (very small) that connects to
    our database. It needs to run in our UNIX environment
    so I've been working on a shell script to set the
    class path and call the JAR file.
    #!/bin/sh
    exec /your/path/to/java -cp your:class:paths:here -MoreJvmOptionsHere your.package.and.YourClass "$@"
    Store this in a file of any name, e.g. yuckiduck, and then change the permissions to make it executable: chmod a+x yuckiduck
    The exec makes sure the shell used to run the script does not hang around until that Java program finishes. While this is only a minor thing, it is nevertheless infinite waste, because it does use some resources but the return on that investment is 0.
    CFG_DIR="`dirname $0`"
    You would like to fetch the directory of the installation out of $0. This breaks as soon as someone makes a (soft) link in some other directory to this script and calls it by its soft-linked name. Your best bet, if you don't know a lot of script programming, is to hardcode CFG_DIR.
    OLDDIR="`pwd`"
    cd $PLCS_ROOT_DIR
    Very bad technique in UNIX. UNIX supports the notion of a "current directory". If your user calls this program in a certain directory, you should assume that (s)he does this on purpose. Making your application dependent on a start in a certain directory ignores the very helpful concept of 'current directory' and is therefore a bug.
    cd $OLDDIR
    This has no effect at all because it only affects the next two lines of code and nothing else. These two lines, however, don't depend on the current directory. In particular this (as the cd above) does not change the current directory for the interactive shell your user is working in.
    echo $EXIT_STATUS
    exit $EXIT_STATUS
    Echoing the exit status is an interesting idea, but if you don't do this for a very specific purpose, I recommend not to do this for the simple reason that no other UNIX program does it.
    Harald.
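    Putting Harald's points together, a hedged sketch of what the wrapper could look like (the install directory, JAR path and main class are placeholders, not from the original post):
    #!/bin/sh
    # hard-coded installation directory, as suggested above (placeholder path)
    CFG_DIR=/opt/myapp
    # exec replaces the shell so no extra process lingers; "$@" passes all arguments through
    exec /opt/java1.3/bin/java \
        -classpath "$CFG_DIR/lib/myapp.jar:$CFG_DIR/config" \
        com.example.Main "$@"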

  • Shell Creation after SAPINST Error: Graphics profile STRW, SWBOCUSTOMIZ

    Hi all
    we have finished the shell creation and SAP is live, but now we have some graphical problems: Graphics profile STRW, SWBOCUSTOMIZ.
    Did we make some error, or does somebody have an idea what we could do?
    OBJECTS_OBJREF_NOT_ASSIGNED_NO
    CX_SY_REF_IS_INITIAL
    66 definition_instance = lrh_system_container.
    67 ENDIF.
    68 **** In case we read the method container: hook it on as an include.
    69 IF lrh_method_container IS NOT INITIAL.
    70 CALL METHOD lrh_task_container->include_set
    71 EXPORTING
    72 name = mc_method_cont_include_name
    73 included_container = lrh_method_container
    74 cascaded_save = space.
    75 ENDIF.
    76
    77 IF im_container_id IS SUPPLIED.
    78 CALL METHOD lrh_task_container->set_guid
    79 EXPORTING
    80 guid_32 = im_container_id.
    81 ENDIF.
    82 IF im_persistence_classname IS SUPPLIED.
    83 CALL METHOD lrh_task_container->set_persistence <-ERROR
    84 EXPORTING
    85 persistence_classname = im_persistence_classname
    86 overwrite = 'X'.
    87 ENDIF.
    88
    89 ex_task_container = lrh_task_container.
    90
    91 CATCH cx_swf_ifs_exception.
    Thanks for help.

    > Did we make some error, or does somebody have an idea what we could do?
    I'd create a client copy (SAP_UCSV) from the source system.
    Markus
