HCM TDMS data transfer phase hanging and no tables transferred

Hello SAP Colleagues / Practitioners,
Background: We are running TDMS 4.0 HCM PA PD Copy
We are now at the data transfer phase of the data copy, and it has already been an hour since we triggered it, but the tables are still not being populated. Is there a way to check whether the transfer is really processing? The job CNV_MBTCNV_MBT_PEM_START is running in the control system, but in the receiver system there are no TDMS work processes running. Is this hanging, or is there a way to check whether the tables are really being processed? Thank you.
Regards,
Meinard

You can check the transfer progress in transaction DTLMON in the central system. Enter the mass transfer ID of your package and click the Execute button; on the next screen, open the 'Relevant tables' tab, where you can see how many tables have been processed. For more detailed information, switch the view to 'Calculate throughput'.

Similar Messages

  • TDMS Data transfer Step : Long runtime for tables GEOLOC and GEOOBJR

    Hi Forum,
    The data transfer for tables GEOLOC and GEOOBJR is taking too long (almost 3 days for 1.7 million records). There are no scrambling rules applied to these tables; in fact, I am using TDMS for the first time, so I am not using any scrambling rules to start with.
    Also, there are 30 processes which have gone into error. How can I run those erroneous jobs again?
    Any help is greatly appreciated.
    Regards,
    Anup

    Thanks Harmeet,
    I changed the write type for those activities and re-executed them, and they completed successfully.
    Now the data transfer is complete, but I see a difference in the number of records for these two tables (GEOLOC and GEOOBJR).
    Can you please let me know what might be the reason?
    Regards,
    Anup

  • Error in the Data Transfer phase

    Dear all,
    During the Data Transfer phase, we get 117 errors.
    The error for each failed process is:
    Execution of program /1CADMC/OLC_500000000043327 failed, return code 3
    Message no. DMC_RT_MSG046
    RC = 3: Error writing the data
    I compared the table structures in source and target system, and they are the same.
    Therefore I suspect that something went wrong during the Data Selection phase.
    Does anyone else have this problem?

    Hi Nicolas,
    Error code 3 suggests that there was a problem writing data to the receiver system. This may happen for various reasons (for example, no tablespace available in the target system). To be sure, please check the system log (SM21) and the runtime errors (ST22).
    In order to check whether something went wrong during data selection, you can refer to the selection logs of the tables for which the data transfer aborted.
    In case you are not able to identify the issue, kindly open an OSS Message for faster resolution of the problem.
    Regards,
    Suman

  • Secure the file/data transfer between XI and any third-party system

    Hi All,,
    I would like to secure, at OS level via SSH, the file/data transfer between XI and any third-party system, using 'Run OS command before processing' and 'OS command after processing'. Right now my XI server is installed on iSeries OS.
    With iSeries we can't call Unix commands, so I expect we need to go for AS/400 (CL) programming. If we create the AS/400 program, how can I call it in XI?
    If anyone has an idea, please let me know whether it will work or not.
    Thanks in advance.
    Venkat

    Hi,
    Thanks for your reply.
    I have read some blogs like /people/krishna.moorthyp/blog/2007/07/31/sftp-vs-ftps-in-sap-pi about calling Unix shell scripts in XI.
    But as far as I know, on iSeries OS we cannot write shell scripts, so we need to go for an AS/400 program. If we go with AS/400, I am not sure how we would call that program or whether it will work; I need some help there, please.
    Thanks,
    Venkat

  • Data Transfer Between HR and Erec

    Hi all,
    We have 2 backend systems: one for HR (separate from E-Recruiting) and another one for E-Recruiting.
    We have now set up the ALE data transfer using message type HRMD_ABA.
    In the E-Rec system only the ERECRUIT component is deployed in the backend; no EA-HR or SAP HR component is deployed.
    Meaning no PA* tables.
    Now when we transfer a personnel number (HR!001-A008 with position) to the E-Rec box, it says no pernr exists.
    However, I cannot transfer the pernr from ECC to E-Rec, as we don't have the PA* tables in E-Rec.
    Can you share the object types that are transferred from ECC to E-Rec and from E-Rec to ECC?
    What are the components to be deployed in the backend E-Rec box?
    In the Portal, which backend system (system object) should be used for the requisitions application, and which for the recruiter and recruitment admin applications?
    Thanks,
    Nachy

    Hi,
    1) First get your users created with roles in your E-Rec system
    (for example recruiter, manager, recruitment admin).
    2) Get your structure created in your HR backend system.
    3) Make sure you maintain the basic infotypes, with infotype 0105 maintained (system user name / e-mail ID).
    Assign the user to 0105 subtype 0001.
    4) With the ALE distribution you have set up, first move the org structure,
    next move the person (pers. no.).
    In ALE choose insert mode, and go to WE10 to check that your IDocs are clean, without errors.
    Make sure all the personal data, including address and postal code, is maintained.
    Let me know if it helps.
    Regards
    Santosh

  • Data transfer from R3 and BW TO external system

    Hi experts
    I have a question for which I couldn't find a good answer.
    It concerns data transfer from both the R/3 system and BW (here the data is master data) to an external system.
    What are the various options to do this, both from the BW side and the R/3 side?
    Which of them offers the best performance?
    Thanks
    BR
    Amanda

    Hi
    I'm aware that I can use the Open Hub functionality for this. Can anyone explain how to do this, as I have been unable to implement it for this situation?
    Does anybody know other solutions, like creating views or function modules?
    It would be of great help if you could explain any solution in greater detail (including Open Hub).
    Thanks
    BR
    Amanda

  • Will I lose my saved data in apps, games, and pictures by transferring over to a new computer?

    I got a new iPad. I'm going to sync it with all my songs and apps from my computer. I'm getting a new computer soon. When I transfer the apps and songs to that computer and update my iPad on the new computer, will I lose my pictures and saved app data because of the new computer?

    You won't lose the data if you transfer purchases and restore the iPad from a backup. Read this thread where this very same issue was discussed. Read my post (Demo) and read King Penguin's post as well. As long as you follow these instructions you should be able to move all of your apps, music, saved data, photos and all iTunes content to your new computer.
    Remember to turn off auto syncing in iTunes, Transfer Purchases and backup the iPad before you sync any content at all. It is all explained here.
    https://discussions.apple.com/thread/3305461

  • External data transfer into CO - Profitability Analysis tables

    Gurus, Sending this again as the earlier one was not answered. Please give us some insight ASAP.
    We are implementing custom allocations to CO-PA (Profitability Analysis) records externally and trying to post close to a million records into the CO-PA table CE1xxxx. What is the most efficient method for posting the externally created records into the CO-PA table? Transaction KE21 is used for one entry at a time; we need to perform a mass data transfer. We also checked the BAPI BAPI_COPAACTUALS_POSTCOSTDATA. It clearly states that it is not for mass data transfer, and it updates the CE4xxxx table only, while we need the data posted to the CE1xxxx table. Any ideas?
    There is a transaction KEFC - External data transfer to CO-PA. Has anyone used it? Please provide your insight on this transfer method.
    Any suggestions are appreciated.
    Thank you.

    Ashish,
    We use KEFC on a regular basis to upload actual sales data to PA from a third system.
    An upload file is created in Excel and saved as a TXT file. The structure of that Excel file must match the structure defined in Customizing: use 'Define Data Interface' to find the structure, and 'Define Structure of External Data Transfer' to see the respective values of the columns you need in your Excel file.
    Hope this works for you!
    Regards,

  • Data needed from emp and dept tables

    Wondering if somebody can query the emp table and dept table that come with some versions of Oracle already built in.
    I need the data produced by these two queries:
    select * from emp
    select * from dept

    If you look in ORACLE_HOME/sqlplus/demo you'll find demobld.sql, which contains the script to build all the SCOTT demo tables.
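    For example, from SQL*Plus (a minimal sketch, assuming the demo script is still shipped with your Oracle version and you are connected as a test user such as SCOTT):
    -- builds and populates the SCOTT demo tables (EMP, DEPT, ...)
    @?/sqlplus/demo/demobld.sql
    SELECT * FROM dept;
    SELECT * FROM emp;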

  • Most efficient data transfer between RT and FPGA

    This post is related to THIS post about DMA overhead.
    I am currently investigating the most efficient way to transfer a set of variables to an FPGA target for our application.  We have been using DMA FIFOs for communications in both directions (to and from the FPGA) but I have recently been questioning whether this is the most efficient approach.
    Our application must communicate several parameters (around 120 different variables in total) to the FPGA.  Approximately 16 of these are critical, meaning that they must be sent every iteration of our RT control loop.  The others are also important but can be sent at a slightly slower rate without jeopardising the integrity of our system.  Until now we have sent these 16 critical parameters plus ONE non-critical parameter over a DMA to the FPGA card.  Each 32-bit value sent incorporates an ID which allows the FPGA to demultiplex to the appropriate global variables on the FPGA.  Thus over time a complete set of parameters is sent at approx. 200 Hz (we run a 20 kHz control loop on the RT system).  The DMA transfers are currently a relatively large factor in limiting the execution speed of our RT loop: of the 50 us available per time-slot running at 20 kHz, approximately 12-20 us are taken by the DMA transfers to and from the FPGA target.  Our FPGA loop is running at 8 MHz.
    According to NI the most efficient way to transfer data to a FPGA target is via DMA.  While this may in general be true, I have found that for SMALL amounts of data, DMA is not terribly efficient in terms of speed.  Below is a screenshot of a benchmark program I have been using to test the efficiency of different types of transfer to the FPGA.  In the test I create a 32MB data set (Except for the FXP values which are only present for comparison - they have no pertinence to this issue at the moment) which is sent to the FPGA over DMA in differing sized blocks (with the number of DMA writes times the array size being constant).  We thus move from a single really large DMA transfer to a multitude of extremely small transfers and monitor the time taken for each mode and data type.  The FPGA sends a response to the DMA transfers so that we can be sure that when reading the response DMA that ALL of the data has actually arrived on the FPGA target and is not simply buffered by the system.
    We see that the minimum round-trip time for the DMA write and subsequent DMA read for confirmation is approximately 30 us.  When sending less than 800 Bytes, this time is essentially constant per packet.  Only when we start sending more than 800 Bytes at a time do we see an increase in the time taken per packet.  A packet of 1 Byte and a packet of 800 Bytes take approximately the SAME time to transfer.  Our application is sending 64 Bytes of critical information to the FPGA target each time, meaning that we are clearly in the "less efficient" region of DMA transfers.
    If we compare the times taken when communicating over FP controls, we see that irrespective of how many controls we write at a time, the overall throughput is constant, with a timing of 2.7 us for 80 Bytes.  For a small dedicated set of parameters, the usage of front panel controls seems to be significantly faster than sending per DMA.  Only once we need to send more than 800 Bytes does the DMA start to become rapidly more efficient.

    So to continue:
    For small data sets the usage of FP controls may be faster than DMAs.  OK.  But we're always told that each and every FP control takes up resources, so how much more expensive is the version with FP controls compared to the DMA?
    According to the resource usage guide for the card I'm using (HERE) the following is true:
    DMA (1023 Elements, I32, no Arbitration) : 604 Flip-Flops 733 LUT 1 Block RAM
    1x I32 FP Control: 52 Flip-Flops 32 LUTs 0 Block RAM
    So the comparison would seem to yield the following result (for 16 elements):
    DMA : 604 Flip-Flops 733 LUTs 1 Block RAM
    FP : 832 Flip-Flops (16 x 52) 512 LUTs (16 x 32) 0 Block RAM
    We require more flip-flops, fewer LUTs and no Block RAM.  It's a swings and roundabouts scenario.  Depending on which resources are actually limited on the target, one version or the other may be preferred.
    However, upon thinking further I realised something else.  When we use the DMA, it is purely a communications channel.  Upon arrival, we unpack the values and store them into global variables in order to make the values available within the FPGA program.  We also multiplex other values in the DMA so we can't simply arrange the code to be fed directly from the DMA which would negate the need for the globals at all.  The FP controls, however, ARE already persistent data storage values and assuming we pass the values along a wire into subVIs, we don't need additional globals in this scenario.  So the burning question is "How expensive are globals?".  The PDF linked to above does not explicitly mention the difference in cost between FP controls and globals so I'll have to assume they're similar.  This of course massively changes the conclusion arrived to earlier.
    The comparison now becomes:
    DMA + Globals : 1436 Flip-Flops (604 + 16 x 52) 1245 LUTs (733 + 16 x 32) 1 Block RAM
    FP : 832 Flip-Flops 512 LUTs 0 Block RAM
    This seems very surprising to me.  I'm suspicious of my own conclusion here.  Can someone with more knowledge of the resource requirement differences between globals and FP controls weigh in?  If this is really the case, we need to re-think our approach to communications between RT and FPGA, most likely employing a hybrid approach.
    Shane.

  • Data Transfer between BW and BO

    I have two servers in different networks, with a firewall between the two networks. One can get through to the Internet, and the other one can only get through to the enterprise intranet. BusinessObjects Enterprise XI 3.1 SP3 is set up on the server which can get through to the Internet, and Business Warehouse is set up on the other server, which can only get through to the enterprise intranet. My question is how to deploy BusinessObjects so that we can transfer the data in BW to BO.

    Hi Roland,
    When I create a new connection to SAP Business Warehouse, there is an error:
    DBD: can't connect to the SAP BW Server. CMALLC : rc=27 > Connect from SAP gateway to RFC server failed. Connect_PM GWHOST=192.168.1.66, GWSERV=sapgw00, SYSNR=00
    Location SAP-Gateway on host saptest/sapgw00
    Error   timeout during allocate
    Time   Tue Feb 22 15:25:58 2011
    Release  720
    Component SAP-Gateway
    Version  2
    RC      242
    Module  gwr3cpic.c
    Line     1911
    Detail    no connect of TP sapdp00 from host 192.168.1.66 after 20 sec
    Counter  2
    Is it related to the BOE solution kit? I also get errors when importing SAP roles in the CMC.
    Exception in JSP: /jsp/auth/sapsec_import_role.jsp:22 19: 20: <% 21: String context=secSAPR3ImportRoleBean.getContextPath(); 22: secSAPR3ImportRoleBean.init(request); 23: response.setHeader("Expires", "0"); 24: %> 25: Stacktrace:
    Thank you very much!

  • Data transfer between SAP and SRM

    Hi all,
    I'm new to SRM and would like some help.
    We are working with SAP 4.7 and now with SRM 5.0, and I need to make the two systems communicate.
    I need to send the master data (material, customer, material group, cost center) from SAP to SRM.
    I need to send the shopping cart from SRM to create a PR in SAP.
    I need to send the RFQ, PO and contract documents from SAP to SRM. For the RFQ, SAP needs to receive the price data updated by the customer.
    How do I make this connection? Are there standard interfaces for these data? What are those interfaces?
    Thanks,
    Luciano Lessa

    Hello Ashutosh,
    Thanks for your help.
    I have one question about the replication of the material master. The step-by-step guide says to complete the CRMSUBTAB table with the following values:
    User  ObjectName  U/D             Obj. Class   Function  Obj. Type  Funct. Name
    CRM       empty       Download    Material        empty      empty       CRS_MATERIAL_EXTRACT
    CRM       empty       Download    Material        empty      empty       CRS_CUSTOMIZING_EXTRACT
    CRM       empty       Download    Material        empty      empty       CRS_SERVICE_EXTRACT
    In our system is so:
    User  ObjectName  U/D             Obj. Class                    Function  Obj. Type  Funct. Name
    CRM       empty       Download    Material                        empty      empty       CRS_MATERIAL_EXTRACT
    CRM       empty       Download    CUSTOMIZING             empty      empty       CRS_CUSTOMIZING_EXTRACT
    CRM       empty       Download    SERVICE_MASTER      empty      empty       CRS_SERVICE_EXTRACT
    Is this configuration correct?
    The field Obj. Class is a key field of the table, so it is not possible to add three entries with the same value.
    Thanks,
    Luciano Lessa

  • Data Transfer problem from cursor to table

    Hi,
    I created an extract function module to extract data through a DataSource via RSA3. When I run the FM directly, the cursor value (S_CURSOR) is displayed as 3850, and when I try to transfer the data using FETCH NEXT into an internal table (E_T_DATA), 498 entries are populated (i.e. E_T_DATA[] = 498), but if I open the table all lines are blank. Please help me.
    OPEN CURSOR WITH HOLD S_CURSOR FOR
        SELECT MATNR
        FROM MARA AS A "INNER JOIN MARC AS C
        "ON A~MATNR = C~MATNR
        "INTO TABLE E_T_DATA
        WHERE MATNR IN L_R_MATNREAN
          AND MTART IN ('ZPLU','ZPAK',
                          'ZCOM','ZTIN',
                          'ZSCR','ZEXW',
                          'ZCOU','ZGVR')
          AND EAN11 = ''
          AND ATTYP = '00'.
    FETCH NEXT CURSOR S_CURSOR
                 APPENDING CORRESPONDING FIELDS
                 OF TABLE E_T_DATA
                 PACKAGE SIZE S_S_IF-MAXSIZE.
      IF SY-SUBRC <> 0.
        CLOSE CURSOR S_CURSOR.
        RAISE NO_MORE_DATA.
      ENDIF.

    First of all, you have to put your FETCH into a DO ... ENDDO loop, roughly as sketched below.
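    A minimal sketch of that loop, assuming you want to drain the cursor completely in one call (the cursor, table and variable names are taken from the post above; everything else is illustrative):
    DO.
      " Fetch the next package of rows from the open cursor
      FETCH NEXT CURSOR S_CURSOR
            APPENDING CORRESPONDING FIELDS
            OF TABLE E_T_DATA
            PACKAGE SIZE S_S_IF-MAXSIZE.
      IF SY-SUBRC <> 0.
        " No more data: close the cursor and leave the loop
        CLOSE CURSOR S_CURSOR.
        EXIT.
      ENDIF.
      " ... process or hand over the current package of E_T_DATA here ...
    ENDDO.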

  • How to delete the data in P,Q and S table

    Hi All,
    I want to delete the data in /BI0/PCOSTCENTER, /BI0/QCOSTCENTER and /BI0/SCOSTCENTER. We have transported the cost center InfoObject from 3.0B to BI 7, but the 3.0B version does not contain the ALPHA conversion, so we need to delete the InfoObject and install it again from BI Content.
    When I try to activate the BI 7 object, it gives the error message that the above tables contain data.
    Please suggest how to delete them and activate 0COSTCENTER with the BI 7 version.
    Points will be awarded.
    Thanks
    Sathiya

    Program RSDMD_DEL_BACKGROUND is the appropriate choice to delete master data. You should be able to search OSS notes on the program to get a little more info on it.
    It ensures referential integrity by checking all InfoProviders to make sure the master data values of that InfoObject do not exist in any InfoProvider. This might not be an issue in your case, since you are dealing with a newly activated object that you don't have in any InfoProviders, but it is still the best way to delete master data.

  • Data recovery from BSIS and BSAS tables

    Hi all,
    The records in the tables BSIS and BSAS were deleted due to ad hoc circumstances.
    How do I recover this data in SAP?
    Please help!!!
    Thanks,
    Raghav

    Hello Raghav,
    two possibilities without a complete restore (in a standard SAP environment):
    1) Flashback Query (but it only works if your UNDO tablespace still contains all the data - parameter undo_retention); see the sketch after the links below:
    http://docs.oracle.com/cd/E11882_01/appdev.112/e25518/adfns_flashback.htm#ADFNS01003
    2) LogMiner
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/logminer.htm#i1005553
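    A minimal sketch of option 1 (Flashback Query), assuming the deletion still lies within the undo_retention window; the schema owner SAPSR3 and the timestamp are placeholders you would have to adapt, and changing SAP tables directly on the database should of course be coordinated with SAP support:
    -- check what the table looked like shortly before the deletion
    SELECT COUNT(*)
      FROM sapsr3.bsis AS OF TIMESTAMP
           TO_TIMESTAMP('2012-05-10 08:00:00', 'YYYY-MM-DD HH24:MI:SS');
    -- if the old rows are still visible, copy the missing ones back
    INSERT INTO sapsr3.bsis
      SELECT * FROM sapsr3.bsis AS OF TIMESTAMP
               TO_TIMESTAMP('2012-05-10 08:00:00', 'YYYY-MM-DD HH24:MI:SS')
      MINUS
      SELECT * FROM sapsr3.bsis;
    COMMIT;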
    Regards
    Stefan
