Time needed for automated data transfer - hypothetical

An EBS client who maintains integration between their large back-office customer base and CRM On Demand has had past challenges with the overall size of the data they wish to integrate and the time it takes to do so. The client wants to understand, for a given data size (X GB), roughly how long it might take to, for instance, synchronize their data. Since they would not quantify their data size, I can only ask: how much data can be moved, and how fast? This is urgent. Any experienced help using strategies such as batch integration leveraging web services, Cast Iron systems, or PIP would be appreciated!

Disclaimer
The Author of this posting offers the information contained within this posting without consideration and with the reader's understanding that there's no implied or expressed suitability or fitness for any purpose. Information provided is for informational purposes only and should not be construed as rendering professional advice of any kind. Usage of this posting's information is solely at reader's own risk.
Liability Disclaimer
In no event shall Author be liable for any damages whatsoever (including, without limitation, damages for loss of use, data or profit) arising out of the use or inability to use the posting's information even if Author has been advised of the possibility of such damage.
Posting
"And the solution was using an FTP client which was able to send files using multiple parallel threads."
Yep, although that assumes you have multiple files (for which it can work very well).
For a single large file, a variation of what Milan described is to use a utility that can slice and dice (and reassemble) the file across multiple TCP flows.
"Another possibility would be using some WAN accelerators (like Riverbeds, e.g.) which would send "local ACKs" to the clients on both sides and improve the efficiency of the WAN file transfer."
Yep, again, and such products can even "transfer" faster than the line rate. The latter is possible because they generally compress in-line and also use local caching (negating the need to send some data). One of their drawbacks is that their transfer rates can be highly variable and inconsistent.
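As a minimal sketch of the slice-and-reassemble idea: the snippet below just chunks a local file and copies the byte ranges on parallel threads, then the pieces land back in place. In a real transfer each worker would push its range over its own TCP connection (e.g. an HTTP Range request or FTP REST); that part is out of scope here and the function names are illustrative.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def copy_chunk(src, dst, offset, length):
    """Copy one byte range. A real transfer tool would send this range
    over its own TCP connection instead of a local file copy."""
    with open(src, "rb") as f:
        f.seek(offset)
        data = f.read(length)
    # Destination is pre-allocated, so each worker can write in place.
    with open(dst, "r+b") as f:
        f.seek(offset)
        f.write(data)

def parallel_copy(src, dst, workers=4):
    """Split src into `workers` byte ranges and copy them concurrently."""
    size = os.path.getsize(src)
    chunk = max(1, -(-size // workers))  # ceiling division
    with open(dst, "wb") as f:
        f.truncate(size)                 # pre-allocate destination
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for off in range(0, size, chunk):
            pool.submit(copy_chunk, src, dst, off, min(chunk, size - off))
```

The reassembly step is free here because each worker writes at its own offset; a network-based variant would do the same seek-and-write on the receiving side.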

Similar Messages

  • I bought a new iMac today. I'm using migration assistant to move all my software, but the time just keeps getting longer. It says connect an Ethernet cable for faster data transfer. I did, but that doesn't seem to help. Any ideas?


    m1doc,
    Are you migrating from a Mac or a MS Window machine? Either way you probably should be in touch with AppleCare, you have 90 days of free AppleCare telephone support. They can usually help on issues like this. If you don't know the phone number please use http://support.apple.com/kb/HE57 to help find the number in your country.

  • How to estimate the time needed for unicode conversion

    Experts:
    I am going to perform an upgrade from 46C (non-unicode) to ECC6/EHP4.
In the action plan, it's hard to estimate the time needed for the Unicode conversion.
    We do not have a sandbox to benchmark that time.
    Could you please help share your experience here?
    Thanks!!

    Hi,
Usually it is very hard to give a proper time estimate.
There are some hints to get a rough feeling (SAP note 857081 and [SMP link|http://service.sap.com/~form/sapnet?_SHORTKEY=01100035870000380759&_OBJECT=011000358700001279022010E] ).
However, please note that SAP highly recommends doing a sandbox conversion - otherwise there is a high risk that the PRD conversion will take (much) longer than expected.
    Best regards,
    Nils Buerckel

  • Change Time Constraint for Personal Data InfoType

    Hi,
How do I change the Time Constraint for the Personal Data infotype?
    I tried to do it in Customisation Procedure --> Infotypes, but the option to change Time Constraint is disabled for Infotype 0002.
    Thanks

    Hi,
You can change time constraints in the general attributes of an infotype; this can be done through table maintenance view V_T582A.
But note that you cannot change the time constraints for the mandatory infotypes of a personnel record in an organization, i.e. 0000, 0001, 0002.
For example, without personal details such as a name, no personnel record can exist in an organization.
    regards,
    Soori

  • Can i buy a smaller macbook air (64gb) but use the time capsule for additional data ?


    If you plan to store important "original" or "master" files and documents on the Time Capsule, then you might want to think about a way to back up those files to another hard drive.
    Why? If the Time Capsule has a problem...and you have no backups....you lose everything.

  • Insert or Cut Time and move automation data - how to do it?

Hi, when I cut or insert time it moves the regions but not the automation data. I have tried switching on 'Move Automation with Regions', but it doesn't move the automation that goes past the end of regions, e.g. plug-in settings like delay feedback etc.
This seems really crazy to me - why, if you had done a lot of detailed automation programming at the end of a song, would you not want it to move when you cut 4 bars from earlier in the song?
    any ideas appreciated very much!!
    best
    tommy b

After using Logic for years I have discovered a new bit and something very weird.
    New Bit:
    As well as Cut/Insert Time in the Region Menu
    there is a Cut/Insert Time in the Edit menu.
    if I start here:
    http://www.applebananacarrot.com/downloads/pre.jpg
    if I do the region Cut ( from the region menu on the arrange window):
    look what happens to the automation data on track 21
    http://www.applebananacarrot.com/downloads/postregioncut.jpg (track 21 data didn't move)
    and compare it to this one doing everything else the same but using the Cut from the main logic edit menu.
    http://www.applebananacarrot.com/downloads/posteditcut.jpg Track 21 automation data did move.
    That to me seems wrong....
    whaddya think?
    best
    tommy banana

  • How do I create a delay or time-lag for my data in LABVIEW

In my data acquisition system I am using acquired data to create a digital output. I want to delay this output, or create a sort of time lag for it. Is there an easy way to incorporate this?

    What you might consider doing is using a sequence. Run the data through this sequence and inside of the sequence, put a delay equal to the time you wish to delay the signal. Then, the output will not be available until the delay has completed.
If you need continuous streaming data, this might not work too well. Then you will need some sort of a data buffer - perhaps using a queue might be one possible solution. I have not used it, but I think that queues can have a time stamp. You could possibly artificially alter the time stamp.
    Hope this helps,
    Jason
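Jason's time-stamped queue idea can be sketched outside LabVIEW like this: samples enter the buffer with their acquisition time and are only released once they are older than the desired lag. The class and method names are illustrative, not an NI API.

```python
import time
from collections import deque

class DelayLine:
    """Buffer that releases samples only after `delay` seconds have passed."""

    def __init__(self, delay):
        self.delay = delay
        self.buf = deque()  # holds (timestamp, sample) pairs in arrival order

    def push(self, sample, now=None):
        """Store a sample with its acquisition timestamp."""
        self.buf.append((time.monotonic() if now is None else now, sample))

    def pop_ready(self, now=None):
        """Return, in order, every sample whose age has reached the delay."""
        now = time.monotonic() if now is None else now
        out = []
        while self.buf and now - self.buf[0][0] >= self.delay:
            out.append(self.buf.popleft()[1])
        return out
```

In a streaming loop you would `push` each acquired value and drive the digital output from whatever `pop_ready` returns; passing `now` explicitly (as the tests do) also makes the lag deterministic for simulation.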

  • Need to learn data transfer

Hi ABAPers,
I need to learn data transfer: BDC, BI, CT, LSMW. Please send me links, or send material to my mail [email protected]. Material will be rewarded.

    Hi,
        BDC:
    Batch Data Communication (BDC) is the process of transferring data from one SAP System to another SAP system or from a non-SAP system to SAP System.
    Features :
    BDC is an automatic procedure.
This method is used to transfer large amounts of data that are available in electronic form.
    BDC can be used primarily when installing the SAP system and when transferring data from a legacy system (external system).
    BDC uses normal transaction codes to transfer data.
    Types of BDC :
    CLASSICAL BATCH INPUT (Session Method)
    CALL TRANSACTION
    BATCH INPUT METHOD:
    This method is also called as ‘CLASSICAL METHOD’.
    Features:
    Asynchronous processing.
    Synchronous Processing in database update.
    Transfer data for more than one transaction.
    Batch input processing log will be generated.
    During processing, no transaction is started until the previous transaction has been written to the database.
    CALL TRANSACTION METHOD :
    This is another method to transfer data from the legacy system.
    Features:
    Synchronous processing. The system performs a database commit immediately before and after the CALL TRANSACTION USING statement.
    Updating the database can be either synchronous or asynchronous. The program specifies the update type.
    Transfer data for a single transaction.
    Transfers data for a sequence of dialog screens.
    No batch input processing log is generated.
    For BDC:
    http://myweb.dal.ca/hchinni/sap/bdc_home.htm
    https://www.sdn.sap.com/irj/sdn/wiki?path=/display/home/bdc&
    http://www.sap-img.com/abap/learning-bdc-programming.htm
    http://www.sapdevelopment.co.uk/bdc/bdchome.htm
    http://www.sap-img.com/abap/difference-between-batch-input-and-call-transaction-in-bdc.htm
    http://help.sap.com/saphelp_47x200/helpdata/en/69/c250684ba111d189750000e8322d00/frameset.htm
    http://www.sapbrain.com/TUTORIALS/TECHNICAL/BDC_tutorial.html
    LSMW
    Please check this link and you can download LSMW Workbench documents.
    http://www.erpgenie.com/sap/saptech/lsmw.htm
    Go thro these steps..
Using Tcode MM01, maintain the source fields as follows:
1) mara-matnr char(18)
    2) mara-mbrsh char(1)
    3) mara-mtart char(4)
    4) makt-maktx char(40)
    5) mara-meins char(3)
The flat file format is as follows:
    MAT991,C,COUP,Srinivas material01,Kg
    MAT992,C,COUP,Srinivas material02,Kg
MAT993,C,COUP,Srinivas material03,Kg
    MAT994,C,COUP,Srinivas material04,Kg
    MAT995,C,COUP,Srinivas material05,Kg
    goto Tcode LSMW
    give Project Name
    Subproject Name
    object Name
    Press Enter -
    Press Execute Button
    It gives 13 radio-Button Options
    do the following 13 steps as follows
    1) select radio-Button 1 and execute
    Maintain Object Attributes
    select Standard Batch/Direct Input
    give Object -- 0020
    Method -- 0000
    save & Come Back
    2) select radio-Button 2 and execute
    Maintain Source Structures
select the source structure and click on the create button
    give source structure name & Description
    save & Come Back
    3) select radio-Button 3 and execute
    Maintain Source Fields
    select the source structure and click on create button
    give
    first field
    field name matnr
    Field Label material Number
    Field Length 18
    Field Type C
    Second field
    field name mbrsh
    Field Label Industrial Sector
    Field Length 1
    Field Type C
    Third field
    field name mtart
    Field Label material type
    Field Length 4
    Field Type C
    fourth field
    field name maktx
    Field Label material description
    Field Length 40
    Field Type C
    fifth field
    field name meins
    Field Label base unit of measurement
    Field Length 3
    Field Type C
    save & come back
    4) select radio-Button 4 and execute
    Maintain Structure Relations
    go to blue lines
    select first blue line and click on create relationship button
    select Second blue line and click on create relationship button
    select Third blue line and click on create relationship button
    save & come back
    5) select radio-Button 5 and execute
    Maintain Field Mapping and Conversion Rules
    Select the Tcode and click on Rule button there you will select constant
    and press continue button
    give Transaction Code : MM01 and press Enter
    after that
1) select MATNR field, click on Source field (this is the field mapping), select MATNR and press Enter
2) select MBRSH field, click on Source field (this is the field mapping), select MBRSH and press Enter
3) select MTART field, click on Source field (this is the field mapping), select MTART and press Enter
4) select MAKTX field, click on Source field (this is the field mapping), select MAKTX and press Enter
5) select MEINS field, click on Source field (this is the field mapping), select MEINS and press Enter
    finally
    save & come back
    6) select radio-Button 6 and execute
    Maintain Fixed Values, Translations, User-Defined Routines
    Create FIXED VALUE Name & Description as MM01
    Create Translations Name & Description as MM01
    Create User-Defined Routines Name & Description as MM01
    after that delete all the above three just created in the 6th step
    FIXED VALUE --MM01
    Translations --MM01
    User-Defined Routines --MM01
    come back
    7) select radio-Button 7 and execute
    Specify Files
    select On the PC (Frontend) -- and click on Create button(f5)
give the path of the file, like "c:\material_data.txt"
    description : -
for separators, select the comma radio button
    and press enter save & come back
    8) select radio-Button 8 and execute
    Assign Files
    Save & come back
    9) select radio-Button 9 and execute
    Read Files
    Execute
    come back
    come back
    10) select radio-Button 10 and execute
    Display Imported Data
    Execute and press enter
    come back
    Come back
    11) select radio-Button 11 and execute
    Convert Data
    Execute
    come back
    Come back
    12) select radio-Button 12 and execute
    Display Converted Data
    Execute & come back
    13) select radio-Button 13 and execute
    Start Direct Input Program
    --select the Program
    --select continue button
    --go with via physical file
    --give the lock mode as 'E'
    and execute
    Refer these links,
    LSMW
    http://www.sapgenie.com/saptech/lsmw.htm
    http://service.sap.com/lsmw
    http://www.sapbrain.com/TUTORIALS/TECHNICAL/LSMW_tutorial.html
    http://www.scmexpertonline.com/downloads/SCM_LSMW_StepsOnWeb.doc
    Regards
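The comma-separated flat file shown in the steps above can be generated with a short script. The field order matches the source structure (matnr, mbrsh, mtart, maktx, meins) and the sample values come from the post; the output file name is illustrative.

```python
import csv

# One row per material, in source-structure field order:
# matnr, mbrsh, mtart, maktx, meins
rows = [
    (f"MAT{991 + i}", "C", "COUP", f"Srinivas material{i + 1:02d}", "Kg")
    for i in range(5)
]

# newline="" lets the csv module control line endings itself.
with open("material_data.txt", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```

The resulting file can then be pointed to in step 7 ("Specify Files") of the LSMW procedure.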

  • Input needed for writing data back into BW/ECC data

    Hello Everyone,
    Can anyone please let me know an example or process you have used for writing data back into BW (DSO/Cubes) or ECC systems.
We do not have Integrated Planning in our current system configuration.
    So, any sample code for using any of the BAPI/RFC for writing data would be appreciated.
    Also, I am trying to find out if there is any way to schedule the models developed in VC 7.1 in background to automate certain type of data processing which goes across multiple systems.
    Any help would be appreciated.
    Thanks.

    Hello,
    Can anyone please help me out on this one....
I am aware of a few BAPIs such as RSDRI_ODSO_INSERT_RFC, but I am not sure what action has to be used to transfer the data from the table within VC to this BAPI, or how to define the parameters, as the ones available for the BAPI I mentioned do not fit the data I have in the table within VC.
    The following are the columns I have in the table within VC,
    1. GL Account
    2. Credit Amount
    3. Debit Amount
    4. Sales Amount
    I have defined the DSO with the same where G/L Account is the keyfield and the rest being data fields.
    Any help would be really appreciated.
    Thanks.

  • Changing time-out for scheduled data refresh

Using a Power Query connection, is it possible to extend the time-out for scheduled data refreshes? The amount of data to be retrieved is rather limited, but there are thousands of rows (NAV server).
    If not, any suggestions to how to reduce latency?
    Thanks.

    Thorm,
    Is this still an issue?
    Thanks!
Ed Price, Azure & Power BI Customer Program Manager

  • COPA exit COPA0007 for external data transfer

    Hello all,
Has anyone already used the COPA0007 enhancement?
Is there any customizing needed in order for SAP to stop execution at a breakpoint in the associated function modules?
I'm trying to change data in an external data transfer, and I cannot stop execution when running transaction KEFC.
    Best regards,
    Francisco Fontoura

    Hi,
      Please read note 508273.
    regards
    Waman

  • How To Use OnCommand Workflow Automation For Automated Data Protection During Failover

    Introduction:
    The NetApp OnCommand Workflow Automation (WFA) engineering team recently published a WFA pack that lets you recreate SnapMirror and SnapVault protection after MetroCluster switchover and switchback. A switchover/switchback occurs during a planned or unplanned failover. Introducing automation promotes best practices for continuous data protection during these types of occurrences. Watch this video to learn how to take the steps for implementing and executing this pack.
    Step 1:
    Recreate SnapMirror and SnapVault protection after MetroCluster switchover.
    Step 2:
    Retain SnapMirror and SnapVault data before MetroCluster switchback.
    Step 3:
    Recreate SnapMirror and SnapVault protection after MetroCluster switchback.
    Other Resources:
    This WFA pack is available for download on the Storage Automation Store.

Hi Ian,
The WFA engineering team is getting ready to publish a blog article to accompany this new WFA pack. I hope you will find it helpful (stay tuned).
Kristina

• Upgrade issue 4.6C to 4.7: DP90 "Data transfer module not allowed"

    Hi All.
We have an issue in an upgrade project from 4.6C to 4.7.
The process we follow is Contract, Notification, Service order, Debit memo request.
While doing DP90 for a service order we are facing the error: Data transfer module is not allowed
After this, the system gives you only the option to exit.
    In the log display we get following message :
    Data transfer module is not allowed
    Message no. V1247
    Diagnosis
    A wrong data transfer module was entered in Customizing for copying control. When you enter a sales order with reference to a quotation, you cannot copy header data from an invoice, for example. The module number is 306.
    System Response
    Processing is terminated.
    Procedure
    Check the copying control for your transaction in Customizing or contact your system administrator.
    Any idea how to deal with this
    Regards,
    Amrish Purohit

    Hi Amrish,
There is a standard SAP program: SDTXT1AID. Start it, and in the upper section you will find your incomplete parameters. Mark your relevant parameters (or all of them) and press F8
(not as a test run).
    I hope it will help you.
    regards,
    Zsuzsa

  • Approximate time needed for normal shutdown of the instance

How do I figure out how much time is needed to stop the Oracle instance?
Should I do an Oracle "shutdown immediate" when the OS is shutting down, or a "shutdown normal"?
What will be the result if there are still long-running transactions when an OS shutdown is initiated?

Anton Kharus wrote:
How do I figure out how much time is needed to stop the Oracle instance?
By measuring the shutdown timings from time to time.
Should I do an Oracle "shutdown immediate" when the OS is shutting down, or a "shutdown normal"?
What will be the result if there are still long-running transactions when an OS shutdown is initiated?
    Oracle is only shutdown at a system shutdown if you configure shutdown scripts to do so.
    Depending on the OS you have these scripts must be placed somewhere in /etc/rc?.d or /sbin/rc?.d
It of course depends on the shutdown type and the number of running sessions.
Shutdown normal will wait until all users have disconnected.
Shutdown immediate will not wait for users; it rolls back active transactions and disconnects sessions.
Shutdown transactional will wait for users to end their active transactions.
Shutdown abort will not wait, but leaves the database in a to-be-recovered state.
Shutdown abort should only be used in case all the other shutdown commands do not work, or when some kind of error has left database processes running.
    HTH,
    FJFranken
    My Blog: http://managingoracle.blogspot.com
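FJFranken's suggestion to measure shutdown timings "from time to time" can be as simple as wrapping the shutdown command with a wall-clock timer. A minimal sketch, where the sqlplus invocation in the comment is hypothetical and the demo uses a harmless stand-in command:

```python
import subprocess
import sys
import time

def timed_run(cmd):
    """Run a command to completion and return its wall-clock duration in seconds."""
    start = time.monotonic()
    subprocess.run(cmd, check=True)
    return time.monotonic() - start

# A real measurement might wrap something like (hypothetical invocation):
#   timed_run(["sqlplus", "/ as sysdba", "@shutdown_immediate.sql"])
# Demo with a stand-in command that just sleeps briefly:
elapsed = timed_run([sys.executable, "-c", "import time; time.sleep(0.2)"])
```

Logging `elapsed` at each planned shutdown over a few weeks gives the empirical baseline the reply describes, which is what you would then feed into OS shutdown-script timeouts.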

  • Libraries need for ADF Data visualization

    Hi all,
I want to ask for the list of libraries needed by ADF Data Visualization.
    Thanks
    Hadi Wijaya

    Hi,
Since you ask this question on the JDeveloper 10.1.3 forum, just be aware that DVT components work on JSF 1.2 only, and not in JDeveloper 10.1.3.
    Frank
