TDMS data transfer step: long runtime for tables GEOLOC and GEOOBJR

Hi Forum,
The data transfer for tables GEOLOC and GEOOBJR is taking too long (almost 3 days for 1.7 million records). There are no scrambling rules applied to these tables; in fact, since I am using TDMS for the first time, I am not using any scrambling rules to start with.
Also, there are 30 processes which have ended in error. How can I run those erroneous jobs again?
Any help is greatly appreciated.
Regards,
Anup

Thanks Harmeet,
I changed the write type for those activities and re-executed them, and they completed successfully.
Now the data transfer is complete, but I see a difference in the number of records for these two tables (GEOLOC and GEOOBJR).
Can you please let me know what might be the reason?
Regards,
Anup

Similar Messages

  • [svn:osmf:] 11205: Fix bug FM-169: Trait support for data transfer sample doesn't display bytes loaded and bytes total for SWF element

    Revision: 11205
    Author:   [email protected]
    Date:     2009-10-27 15:04:26 -0700 (Tue, 27 Oct 2009)
    Log Message:
    Fix bug FM-169: Trait support for data transfer sample doesn't display bytes loaded and bytes total for SWF element
    Ticket Links:
        http://bugs.adobe.com/jira/browse/FM-169
    Modified Paths:
        osmf/trunk/apps/samples/framework/PluginSample/src/PluginSample.mxml
        osmf/trunk/apps/samples/framework/PluginSample/src/org/osmf/model/Model.as

    The bug is known, and a patch has been submitted: https://bugs.freedesktop.org/show_bug.cgi?id=80151. There has been no update since Friday, so I wonder what the current status is, or whether it is up for review at all.
    Does anyone know how we can be notified when this patch hits the kernel?

  • How to clear previous data entries in sm30 transaction for table maintainen

    Whenever I maintain 8-10 records in SM30 for table maintenance and then go back to SM30 to enter new records, I can still see the previous entries.
    When I click on "New Entries", the data is cleared.
    What I need is to clear the data before clicking "New Entries",
    so that it appears to the user as a fresh screen.
    Is this possible? If yes, how?
    Please suggest a way to do it.

    hi Nilesh,
    When you click "New Entries" the data is not cleared; you are simply taken to another screen, so it appears blank. If you want to delete all the records, write the logic in your code:
    CASE sy-ucomm.
      WHEN 'NEWENTRIES'.
        DELETE FROM dbtable.  "deletes all entries from the table
        COMMIT WORK.
    ENDCASE.
    regards,
    Nagaraj

  • Quicken 2007 investment data transfer to Quicken Essentials for Mac

    Will my 10 plus years of investment data transfer from Quicken 2007 for Windows to Quicken Essentials for Mac without loss of data? 

    There is a version of Quicken 2007 for Lion and Mountain Lion available. If you own Quicken 2007 for Mac, it is a US$15 upgrade.
    Many users balked at the loss of capabilities in having to move to Quicken Essentials, and Quicken eventually came around and "ported" Quicken 2007 so that it could run on 10.7 and later. I run it and it mostly works, but for tax estimates they force you to their online tax estimator, or suggest you buy their stand-alone product that will accept tax info from Quicken; no more ballpark estimates from inside Quicken.
    https://quicken.intuit.com/personal-finance-software/quicken-2007-osx-lion.jsp
    It appears the conversion from Quicken for PC is incomplete, and requires generation of a QIF file which is then imported:
    https://quicken.intuit.com/support/help/pre-sales-questions/convert-data-from-quicken-for-windows-to-quicken-for-mac/GEN82194.html

  • Feature request: Support for {table abbr} and {table} in Table scripts

    Hi Data Modeler Team,
    support for {table abbr} and {table} in table scripts ("Table properties"->"Scripts"->All four tabs) would be greatly appreciated as it makes it easier to use boilerplate SQL in these areas.
    Is it possible to add this feature to one of the upcoming releases?
    Thank you,
    Blama

    Hi,
    I've logged an enhancement request on this.
    Thanks,
    David

  • JSF Custom component for table pagination and sorting

    Hi,
    I want a custom component that renders a table with pagination and column sorting features.
    Does anyone know where I can get such a custom component?
    (If the component is available with source, it would be even more helpful for me.)
    Thanks & regards,
    Haroon

    I know two companies that offer JSF components for table pagination and sorting, AJAX based:
    http://www.teamdev.com/quipukit/demo/datatable/DataTable_filteringAndPaging.jsf
    http://java.samples.infragistics.com/NetAdvantage/JSF/2006.1/
    The problem? They are not open source.
    I am also looking into this subject, because I want to develop my own custom component and add some features. If anyone has tips, references, or samples of their own, it would be really appreciated.
    Thank you.
    Paul
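Component libraries aside, the underlying pagination and sorting logic is small. A language-agnostic sketch in Python (the function name and row layout are illustrative, not part of any JSF API):

```python
def paginate(rows, page, page_size, sort_key=None, descending=False):
    """Sort rows by a column, then return the slice for a 1-based page number."""
    if sort_key is not None:
        rows = sorted(rows, key=lambda r: r[sort_key], reverse=descending)
    start = (page - 1) * page_size
    return rows[start:start + page_size]

# Three rows, sorted by name, first page of two.
rows = [{"name": "carol"}, {"name": "alice"}, {"name": "bob"}]
print(paginate(rows, page=1, page_size=2, sort_key="name"))
# [{'name': 'alice'}, {'name': 'bob'}]
```

A real component would wrap this with state for the current page and sort column; the slicing and sorting themselves do not change.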

  • I am passing on my iPod touch 4th generation to my husband. I restored it so he could put in his own data, but it keeps asking for my password and using my security questions. How can he change that?


    Whose account ID shows in Settings > Store (Settings > iTunes & App Stores if it's on iOS 6)? If it's yours, then tap on it and log out, and your husband can then log in with his own account.

  • HCM TDMS data transfer phase hanging and no tables transferred

    Hello SAP Colleagues / Practitioners,
    Background: we are running a TDMS 4.0 HCM PA PD copy.
    We are now at the data transfer phase of the data copy, and it has already been an hour since we triggered it, but the tables are still not being populated. Is there a way to check whether the transfer is really processing? The job CNV_MBTCNV_MBT_PEM_START is running in the control system, but in the receiver system there are no TDMS work processes running. Is this hanging, or is there a way to check whether the tables are really being processed? Thank you.
    Regards,
    Meinard

    You can check the transfer progress in transaction DTLMON in the central system. Enter the mass transfer ID of your package and click the Execute button; on the next screen, click the 'Relevant tables' tab, where you can see how many tables have been processed. For more detailed information, change the view to 'Calculate throughput'.

  • External data transfer into CO - Profitability Analysis tables

    Gurus, sending this again as the earlier post was not answered. Please give us some insight ASAP.
    We are implementing custom allocations to CO-PA (Profitability Analysis) records externally and are trying to post close to a million records into CO-PA table CE1xxxx. What is the most efficient method for posting the externally created records into the CO-PA table? Transaction KE21 is used for one entry at a time; we need to perform a mass data transfer. We also checked the BAPI BAPI_COPAACTUALS_POSTCOSTDATA. Its documentation clearly says that it is not for mass data transfer, and it updates only the CE4xxxx table, while we need the data posted to the CE1xxxx table. Any ideas?
    There is a transaction KEFC - external data transfer to CO-PA. Has anyone used it? Please share your insight on this transfer method.
    Any suggestions are appreciated.
    Thank you.

    Ashish,
    We use KEFC on a regular basis to upload actual sales data to PA from a third-party system.
    An upload file is created in Excel and saved as a TXT file. The structure of that Excel file must match the structure defined in Customizing: use 'Define data interface' to find the structure, and 'Define structure of external data transfer' to see the respective values of the columns you need in your Excel file.
    Hope this works for you!
    Regards,

  • Socket data transfer too long

    hi
    when i start up the 1st WLS for clustering, the following exception message is shown constantly:
    <MulticastSocket> Error sending blocked message
    java.io.IOException: A message for a socket data transfer is too long.
        at java.net.PlainDatagramSocketImpl.send(Native Method)
    what could cause the exception, and is it a fatal error?
    appreciate if any help can be given.
    thanx ...
    yunyee

    hi
    i'm running on AIX 4.3.3, and the JDK version is 1.2.2.
    Viresh Garg <[email protected]> wrote:
    >Could you also post your OS and JDK version.
    Yun Yee wrote:
    >>i'm running WLS 5.1, and i have Commerce Server 3.1 as well, with Service Pack 8.
    >>can the exception be rectified by service packs?
    "Shiva" <[email protected]> wrote:
    >>>Would help if you could also tell about the WLS version, Service Packs and the environment you are using.
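For what it's worth, the underlying error is the operating system rejecting a UDP datagram larger than the maximum allowed size (IPv4 UDP payload is capped at 65,507 bytes), and it is easy to reproduce outside WebLogic. A minimal Python sketch (the address and port are arbitrary; the send fails locally before anything reaches the network):

```python
import socket

# Sending a datagram larger than the UDP maximum fails at the OS level
# with EMSGSIZE ("Message too long"), the same condition the WLS
# multicast code is reporting as an IOException.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
try:
    sock.sendto(b"x" * 100_000, ("127.0.0.1", 9999))
except OSError as exc:
    print(exc)  # e.g. "Message too long"
finally:
    sock.close()
```

The usual fix is to keep each datagram under the limit (and, for multicast, under the configured interface MTU-related limits) rather than to catch the error.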

  • Long runtime for CU50

    Hi there, is there any way to update the statistics for table CABN? We encounter a long runtime when executing transaction code CU50, and we found that the process keeps accessing the CABN table, which contains more than 10k characteristic records. Thanks

    If you are running on IBM i (i5/OS, OS/400), there is no need to update statistics for a database table, because that is done automatically by the database.
    If you have a slow transaction, you can analyze it through transaction ST05 and then use the Explain function on the longest running statement. Within the Explain, there is a function "Index advised", that might help in your case.
    Kind regards,
    Christian Bartels.

  • Need Help:Reading Data from RU payroll cluster for table GRREC

    Hi...
    I need help on how to read data from the RU payroll cluster, table GRREC, for a given employee and run date, and get the values from structure PC292.
    Please let me know the includes and the import and export statements to be used.
    Thanks in advance,
    RAVI.

    Hi,
    Here goes the pseudocode.
    Includes:
    INCLUDE: rpppxd00,
             rpppxd10,
             rpc2cd09,
             rpc2rx02_ce, "if LDB PNP_CE is used; otherwise use the same include without _ce
             rpc2rx29,
             rpc2rx39,
             rpppxm00,
             rpc2ruu0_ce.
    Declarations:
    DATA: i_rgdir  LIKE pc261        OCCURS 0 WITH HEADER LINE,
          i_result TYPE pay99_result OCCURS 0 WITH HEADER LINE,
          i_grrec  LIKE pc292        OCCURS 0 WITH HEADER LINE.
    Selection:
    GET pernr.
    " Get the RGDIR entries for the current PERNR and the selected MOLGA
    get_rgdir_data TABLES i_rgdir
                   USING  pernr-pernr
                          p_molga.  "parameter
    cd-key-pernr = pernr-pernr.
    rp-imp-c2-cu.
    i_rgdir[] = rgdir[].
    LOOP AT i_rgdir WHERE fpbeg LE pn-endda
                      AND fpend GE pn-begda
                      AND srtza EQ 'A'
                      AND void  NE 'V'.
      " Import the payroll result for this sequence number
      get_result_tabs TABLES i_result
                      USING  'RU'  "RU cluster
                             pernr-pernr
                             i_rgdir-seqnr.
      rx-key-pernr = pernr-pernr.
      UNPACK i_rgdir-seqnr TO rx-key-seqno.
      rp-imp-c2-ru.
      i_grrec[] = i_result-inter-grrec[].
      LOOP AT i_grrec.
        " Evaluate the required wage types here and pass the data to the output table.
      ENDLOOP.
    ENDLOOP.
    END-OF-SELECTION.

  • Lock object for table KNVP and KNVK

    Hello Friends,
    Can you please tell me the standard lock objects for the tables KNVP and KNVK.
    Thanks and regards,
    Kuldeep

    Hi
    See this and proceed accordingly.
    Lock objects are used in SAP to avoid inconsistency while data is being inserted into or changed in the database.
    SAP provides three types of lock objects:
    - Read lock (shared lock):
    protects read access to an object. The read lock allows other transactions read access but not write access to
    the locked area of the table.
    - Write lock (exclusive lock):
    protects write access to an object. The write lock allows other transactions neither read nor write access to
    the locked area of the table.
    - Enhanced write lock (exclusive lock without cumulation):
    works like a write lock, except that the enhanced write lock also protects against further accesses from the
    same transaction.
    You can create a lock object in transaction SE11; enter any meaningful name starting with EZ, for example EZTEST_LOCK.
    Use: you can see this in almost all transactions: when you open an object in change mode, SAP does not allow any other user to open the same object in change mode.
    Example: in HR, when a personnel number is entered in the master data maintenance screen, SAP does not allow any other user to change the same personnel number.
    Technically:
    When you create a lock object, the system automatically creates two function modules:
    1. ENQUEUE_<lock object name>, to set the lock.
    2. DEQUEUE_<lock object name>, to release the lock.
    You have to use these function modules in your program.
    Check this link for an example:
    http://help.sap.com/saphelp_nw04s/helpdata/en/cf/21eea5446011d189700000e8322d00/content.htm
    TABLES: vbak.
    CALL FUNCTION 'ENQUEUE_EZLOCK3'
      EXPORTING
        mode_vbak      = 'E'
        mandt          = sy-mandt
        vbeln          = vbak-vbeln
        x_vbeln        = ' '
        _scope         = '2'
        _wait          = ' '
        _collect       = ' '
      EXCEPTIONS
        foreign_lock   = 1
        system_failure = 2
        OTHERS         = 3.
    IF sy-subrc <> 0.
      MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
        WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
    ENDIF.
    Normally ABAPers create the lock objects, because we know when, how, and where to lock an object; after completing our updates we unlock the objects in the tables.
    Purpose: if multiple users try to access a database object, inconsistency may occur. The locking mechanism avoids that inconsistency while still giving multiple users access to the database objects.
    Steps: first create a lock object in SE11, for example for table MARA. It will create two function modules:
    1. ENQUEUE_<lock object>
    2. DEQUEUE_<lock object>
    Before updating the table, first lock it by calling the ENQUEUE function module; after updating, release the lock with the DEQUEUE function module.
    Go to SE11.
    Select the radio button "Lock object".
    Give a name starting with EZ or EY.
    Example: EYTEST
    Press the Create button.
    Give a short description.
    Example: Lock object for table ZTABLE.
    In the Tables tab, give the table name.
    Example: ZTABLE
    Save and generate.
    Your lock object is now created. You can see the lock modules
    in the menu: GOTO -> Lock Modules. There you can see the ENQUEUE and DEQUEUE functions.
    Lock objects:
    http://www.sap-img.com/abap/type-and-uses-of-lock-objects-in-sap.htm
    http://help.sap.com/saphelp_nw04s/helpdata/en/cf/21eea5446011d189700000e8322d00/content.htm
    Match code objects:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/41/f6b237fec48c67e10000009b38f8cf/content.htm
    http://searchsap.techtarget.com/tip/0,289483,sid21_gci553386,00.html
    Regards
    Anji
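Outside SAP, the shared-versus-exclusive semantics described above can be sketched as a small reader-writer lock in Python (illustrative only; the SAP enqueue server is a central service with its own protocol and works differently):

```python
import threading

class ReadWriteLock:
    """Shared (read) vs. exclusive (write) lock, mirroring the semantics above:
    many concurrent readers are allowed, but a writer excludes everyone else."""

    def __init__(self):
        self._cond = threading.Condition()
        self._readers = 0
        self._writer = False

    def acquire_read(self):
        with self._cond:
            while self._writer:          # readers wait while a writer holds the lock
                self._cond.wait()
            self._readers += 1

    def release_read(self):
        with self._cond:
            self._readers -= 1
            if self._readers == 0:
                self._cond.notify_all()  # wake a waiting writer

    def acquire_write(self):
        with self._cond:
            while self._writer or self._readers:  # exclusive: no readers, no writer
                self._cond.wait()
            self._writer = True

    def release_write(self):
        with self._cond:
            self._writer = False
            self._cond.notify_all()

lock = ReadWriteLock()
lock.acquire_read()
lock.acquire_read()   # two readers may hold the lock at once
lock.release_read()
lock.release_read()
lock.acquire_write()  # a writer gets the lock only when no readers remain
lock.release_write()
```

The "enhanced write lock" in the list above has no direct analogue here, since this sketch does not track which transaction owns the lock.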

  • Archiving object for tables GLPCC and GLPCO

    Hi All,
    The GLPCA table stores the actual line items in PCA, and the GLPCT table stores the totals records of PCA.
    Which transactions are affected in the regular course of business by these tables?
       GLPCC - EC-PCA: Transaction Attributes
       GLPCO - EC-PCA: Object Table for Account Assignment
    There is no archiving object for the tables GLPCC and GLPCO when checked in DB15. What is the alternative for archiving the entries in those tables?
    Thanks in advance
    Regards
    Anupama

    Hello Anupama,
    The standard archiving objects for PCA are EC_PCA_ITM and EC_PCA_SUM; most of the data is archived with these two objects, so it would be better to use them. About 80% of the data can be archived through the standard archiving objects.
    For more information you may read SAP Data Management Guide.
    Regds
    Ambadas

  • How to know the data transfer amount (MB or KB) between SAP and BW

    Hello everybody,
    Does anybody know how to calculate the amount of data transferred between SAP and BW? It would be fantastic to know the transfer speed and other similar data. Is there any transaction where you can see this information?
    thanks for your help.
    Manolo

    A quick, although less accurate, way is to add up your field lengths to get your total record size and then multiply it by the number of records.
    So, 5 fields with 10-char length = 50 chars * 8 bytes/char = 400 bytes.
      (The 8 bytes/char figure may be higher on Unicode systems.)
    With 500 records you get 200,000 bytes, /1024 = about 195 KB.
    This is just a rough estimate, and it is pretty quick to figure out. Not all fields will be completely filled, so this is the maximum size for this record set.
    Brian
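That arithmetic can be wrapped in a couple of reusable lines (the function name and the flat 8 bytes/char factor are just this thread's rough assumptions, not an SAP formula):

```python
def estimate_transfer_size(field_lengths, num_records, bytes_per_char=8):
    """Upper-bound size of a record set: sum of field lengths in chars,
    times bytes per char, times the number of records."""
    record_bytes = sum(field_lengths) * bytes_per_char
    total_bytes = record_bytes * num_records
    return total_bytes, total_bytes / 1024  # (bytes, KB)

# The example above: 5 fields of 10 chars each, 500 records.
total_bytes, total_kb = estimate_transfer_size([10] * 5, num_records=500)
print(total_bytes, round(total_kb))  # 200000 195
```

As noted, this is a maximum: partially filled fields make the real transfer smaller.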
