Transferring big files

Hi,
I would like to transfer really big files (500 MB and above). This is a file-to-file transfer without any conversion (except code page). How can I do that with XI without pulling the whole payload through XI, which would waste a lot of time and space in the database (log tables/message tables)?
The file is written by a BW process under a temporary filename; after the file is written, it is renamed to the filename XI is looking for. XI should then trigger the file transfer without reading the content.
Any ideas?

Use the Chunk Mode of the file adapter if you are on PI 7.11 (for binary data transfers):
/people/niki.scaglione2/blog/2009/10/31/chunkmode-for-binary-file-transfer-within-pi-71-ehp1
Also check the "Large Files Handling" section in this guide:
http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/2016a0b1-1780-2b10-97bd-be3ac62214c7?quicklink=index&overridelayout=true
Regards,
Ravi
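
For what it's worth, chunk mode boils down to streaming the file through in fixed-size pieces instead of materializing the whole payload in the XI message tables. Outside of PI, the same pass-through pattern looks roughly like this in plain Java (a minimal sketch; the paths and the 1 MB buffer size are placeholders, not PI internals):

    import java.io.*;

    // Minimal sketch of a chunked pass-through copy: the file is streamed in
    // fixed-size buffers, so memory use stays flat regardless of file size.
    public class ChunkedCopy {
        public static void main(String[] args) throws IOException {
            File source = new File("/data/in/bigfile.dat");   // placeholder paths
            File target = new File("/data/out/bigfile.dat");
            byte[] buffer = new byte[1024 * 1024];            // 1 MB chunks (arbitrary)
            InputStream in = new BufferedInputStream(new FileInputStream(source));
            OutputStream out = new BufferedOutputStream(new FileOutputStream(target));
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);                   // never holds the whole file
            }
            out.close();
            in.close();
        }
    }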

Similar Messages

  • I transferred big files from Windows to my MacBook, should I install antivirus on my MacBook Pro?

    I transferred big files from Windows to my MacBook because I wanted to retire my old PC. Should I install antivirus on my MacBook Pro because of the files I moved? Would you recommend it? Or should I feel comfortable since I'm using OS X Mavericks? Anybody with the same case? Thanks.

    Files moved from a Windows machine may contain malware, but that malware would not be a threat to your Mac. Windows malware cannot infect a Mac.
    As for protecting your Mac against Mac malware, see my Mac Malware Guide.

  • WIFI dies when transferring big files over the network

    Hi, I have a Dell Inspiron 1520 with the ipw3945 driver. When I transfer big files over the network, my wifi dies. Does anybody know what it could be? If you need additional information, feel free to ask...
    Thanks in advance

    I guess we shall see. I can't imagine we are the only 7 with these issues. Mine is mainly just lag... and overall slowness. I hope to work through this. If it lasts much longer I will have to revert all 3 of my lappies back to FC, which I really do not want to do. But on the lappies I connect via wireless 90% of the time and need it to work.
    By the way, how are you guys configuring your wifi? I am using the iwl drivers and wicd.
    Well, I did some more testing and loaded up netcfg and still get the same results, so that rules out that part. I also did some packet captures and I am getting a bunch of TCP retransmissions followed by TCP segment lost followed by a reset, which in turn kills my RDP session (Remote Desktop Protocol). I also went back several versions of the ucode (firmware) and still get the same results, so I guess it seems to be a kernel issue. Back to FC I go. Damn shame...
    Last edited by BKJ (2008-10-08 01:08:56)

  • Transferring big files to MS XP

    Hi.
    How can I transfer a 500 MB file from my G3 233 (CD tray, OS 9.2.2) to a Windows XP machine? Internet or Ethernet or any other solution.
    ; o} knut f

    Networked file sharing between classic OS 9 and Windows can be done, but it is not straightforward. You can read about some of the options here:
    http://docs.info.apple.com/article.html?artnum=31318
    I can remember reading about another option involving free client/server software that you could download. . . hopefully forum member Strider will see your post, as he was the one who always recommended this method. It's been a few months at least since he posted this, and I just can't remember the name of the program he suggested.
    Many here have recommended the use of AOL instant messenger for file transfers of this type. I'm not sure if 500 MB is too big or not. HTH

  • Broken FTP connection and big files problem

    I have a problem with downloading big files.
    Does anybody know how to resume a download over an FTP connection?
    Or how can I read bytes from a file over FTP using something like random access, to avoid restarting the download from the beginning?
    "InputStream" does not support "seek"-like methods.

    From RFC 959
    RESTART (REST)
    The argument field represents the server marker at which file transfer is to be restarted. This command does not cause file transfer but skips over the file to the specified data checkpoint. This command shall be immediately followed by the appropriate FTP service command which shall cause file transfer to resume.
    You should also be aware of RFC 959 Section 3.4.2 on BLOCK MODE transfers which is what allows FTP to REST a connection and "skip" n-bytes of a file.
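    To make that concrete: a minimal sketch of a resumed download using Apache Commons Net (my assumption for the client library; any FTP client that can send REST works the same way). The length of the partial local file becomes the restart offset, and the remaining bytes are appended:

    import java.io.BufferedOutputStream;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import org.apache.commons.net.ftp.FTP;
    import org.apache.commons.net.ftp.FTPClient;

    // Sketch: resume an interrupted download by sending REST <offset> before RETR
    // and appending the remaining bytes to the partial local file.
    public class FtpResume {
        public static void main(String[] args) throws IOException {
            File local = new File("bigfile.zip");           // partial file from the broken transfer
            long offset = local.exists() ? local.length() : 0;

            FTPClient ftp = new FTPClient();
            ftp.connect("ftp.example.com");                 // placeholder host and credentials
            ftp.login("user", "password");
            ftp.enterLocalPassiveMode();
            ftp.setFileType(FTP.BINARY_FILE_TYPE);
            ftp.setRestartOffset(offset);                   // issues REST <offset> with the next RETR

            InputStream in = ftp.retrieveFileStream("/pub/bigfile.zip");
            BufferedOutputStream out =
                new BufferedOutputStream(new FileOutputStream(local, true)); // append mode
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            out.close();
            in.close();
            ftp.completePendingCommand();                   // finalize the RETR reply
            ftp.logout();
            ftp.disconnect();
        }
    }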

  • Keeping "CS Web Service session" alive while uploading big files.

    Hi.
    I have a problem when uploading big files: the upload takes longer than the session timeout value, causing it to fail.
    As you all know uploading a file is a three step process:
    1). Create a new DocumentDefinition Item on the server as a placeholder.
    2). Open an HTTP connection to the created placeholder and transfer the data using the HTTPConnection.put() method.
    3). Create the final document using the FileManager by passing in the destination folder and the document definition.
    The problem is that step 2 takes so long that the "CS Web Service Session" times out, and thus step 3 cannot be completed. The Developer Guide gives a utility method for creating an HTTP connection for step 2 and states the following: "..you must create a cookie for the given domain and path in order to keep the session alive while transferring data." But this only keeps the session of the HTTP connection alive, not the "CS Web Service Session". In my case step 2 completes successfully, and the moment I perform step 3 it throws an ORACLE.FDK.SessionError:ORACLE.FDK.SessionNotConnected exception.
    How does one keep the "CS Web Service Session" alive?
    Thanks in advance
    Regards.

    Okay, even a thread that pushes dummy stuff through once in a while doesn't help. I'm getting the following when the keep-alive thread kicks in while uploading a big file:
    "AxisFault
    faultCode: {http://xml.apache.org/axis/}HTTP
    faultSubcode:
    faultString: (409)Conflict
    faultActor:
    faultNode:
    faultDetail:
    {}:return code: 409
    <HTML><HEAD><TITLE>409 Conflict</TITLE></HEAD><BODY><H1>409 Conflict</H1>Concurrent Requests On The Same Session Not Supported</BODY></HTML>
    {http://xml.apache.org/axis/}HttpErrorCode:409
    (409)Conflict
         at org.apache.axis.transport.http.HTTPSender.readFromSocket(HTTPSender.java:732)
         at org.apache.axis.transport.http.HTTPSender.invoke(HTTPSender.java:143)
         at org.apache.axis.strategies.InvocationStrategy.visit(InvocationStrategy.java:32)
         at org.apache.axis.SimpleChain.doVisiting(SimpleChain.java:118)
         at org.apache.axis.SimpleChain.invoke(SimpleChain.java:83)
         at org.apache.axis.client.AxisClient.invoke(AxisClient.java:165)
         at org.apache.axis.client.Call.invokeEngine(Call.java:2765)
         at org.apache.axis.client.Call.invoke(Call.java:2748)
         at org.apache.axis.client.Call.invoke(Call.java:2424)
         at org.apache.axis.client.Call.invoke(Call.java:2347)
         at org.apache.axis.client.Call.invoke(Call.java:1804)
         at oracle.ifs.fdk.FileManagerSoapBindingStub.existsRelative(FileManagerSoapBindingStub.java:1138)"
    I don't understand this: the exception talks about "Concurrent Requests On The Same Session", but if there is already a request going on, why is the session timing out in the first place?!
    I must be doing something really stupid somewhere. Ai ai ai, what an unproductive day...
    Any help? It will be greatly appreciated...
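    The 409 points at overlapping requests rather than an expired session: the keep-alive ping apparently fires while another call on the same session is still in flight. One pattern that may help (a rough sketch under assumptions; pingSession() stands in for whatever cheap FDK read call your session object offers, since I don't have the API at hand) is to serialize the pinger and the real web-service calls on a shared lock, and to keep the long HTTP put of step 2 outside that lock so pings can flow during the upload:

    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;

    // Sketch: serialize keep-alive pings and real web-service calls on one lock,
    // so a ping can never overlap another request on the same session.
    public class SessionKeepAlive {
        private final Object sessionLock = new Object();
        private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

        // Hypothetical stand-in for any cheap call that touches the CS session.
        private void pingSession() {
            // e.g. fetch session info; must be a real server round trip
        }

        public void start(long intervalSeconds) {
            scheduler.scheduleAtFixedRate(new Runnable() {
                public void run() {
                    synchronized (sessionLock) {    // waits while a real call runs
                        pingSession();
                    }
                }
            }, intervalSeconds, intervalSeconds, TimeUnit.SECONDS);
        }

        // Wrap every real web-service call (steps 1 and 3) in the same lock.
        // The long HTTP put of step 2 runs outside the lock, so pings continue.
        public void runWebServiceCall(Runnable call) {
            synchronized (sessionLock) {
                call.run();
            }
        }

        public void stop() {
            scheduler.shutdownNow();
        }
    }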

  • Not enough space on my new SSD drive to import my data from a Time Machine backup, how can I import my latest backup minus some big files?

    I just got a new 256 GB SSD for my Mac. I want to import my data from a Time Machine backup, but it's larger than 256 GB since it used to be on my old drive. How can I import my latest backup while keeping some big files out on the external drive?

    Hello Salemr,
    When you restore from a Time Machine backup, you can tell it not to transfer folders like Desktop, Documents, Downloads, Movies, Music, Pictures and Public. Take a look at the article below for the steps to restore from your backup.
    Move your data to a new Mac
    http://support.apple.com/en-us/ht5872
    Regards,
    -Norm G. 

  • Photoshop CC: slow performance on big files

    Hello there!
    I've been using PS CS4 since release and upgraded to CS6 Master Collection last year.
    Since my system broke down some weeks ago (a RAM module failed), I gave Photoshop CC a try. At the same time I moved to new rooms and couldn't get my hands on the DVD of my CS6, resting somewhere at home...
    So I tried CC.
    Right now I'm using it with some big files. File size is between 2 GB and 7.5 GB max (all PSB).
    Photoshop seemed to run fast in the very beginning, but for a few days now it's been so unbelievably slow that I can't work properly.
    I wonder if it is caused by the growing files or some other issue with my machine.
    The files contain a large number of layers and masks, nearly 280 layers in the biggest file (mostly with masks).
    The images are 50 x 70 cm at 300 dpi.
    When I try to make some brush strokes on a layer mask in the biggest file, it takes 5-20 seconds for the brush to draw... I couldn't figure out why.
    And it doesn't depend on the brush size as much as you may expect... even very small brushes (2-10 px) show this issue from time to time.
    Also, switching masks on and off (gradient maps, selective color or levels) takes ages to display, sometimes more than 3 or 4 seconds.
    The same with panning around in the picture, zooming in and out, or moving layers.
    It's nearly impossible to work on these files in a reasonable time.
    I've never seen this in CS6.
    Now I wonder if there's something wrong with PS or the OS. But I've never worked with files this big before.
    In March I worked on some 5 GB files with 150-200 layers in CS6, and it worked like a charm.
    System specs:
    i7-3930K (3.8 GHz)
    Asus P9X79 Deluxe
    64 GB DDR3 1600 MHz Kingston HyperX
    GTX 570
    2x Corsair Force GT3 SSD
    Wacom Intuos 5 M Touch (I have some issues with the touch from time to time)
    Win 7 Ultimate 64
    all system updates
    newest drivers
    PS CC
    System and PS are running on the first SSD, scratch is on the second. Both are set to be used by PS.
    79% of the RAM is allocated to PS, the cache level is set to 5 or 6, history states to 70. I also tried different cache tile sizes from 128K to 1024K, but it didn't help a lot.
    When I open the largest file, PS takes 20-23 GB of RAM.
    Any suggestions?
    best,
    moslye

    Is it just slow drawing, or is actual computation (image size, rotate, GBlur, etc.) also slow?
    If the slowdown is drawing, then the most likely culprit would be the video card driver. Update your driver from the GPU maker's website.
    If the computation slows down, then something is interfering with Photoshop. We've seen some third party plugins, and some antivirus software cause slowdowns over time.

  • Adobe Photoshop CS3 crashes each time it loads a big file

    I was loading a big batch of photos from iPhoto on my iMac into Adobe Photoshop CS3 and it kept crashing, yet each time I reopen Photoshop it loads the photos again and crashes again. Is there a way to stop this cycle?

    I don't think that too many users here actually use iPhoto (even the Mac users)
    However, Google is your friend. A quick search came up with some other non-Adobe forum entries:
    .... but the golden rule of iPhoto is NEVER EVER MESS WITH THE IPHOTO LIBRARY FROM OUTSIDE IPHOTO. In other words, anything you might want to do with the pictures in iPhoto can be done from *within the program,* and that is the only safe way to work with it. Don't go messing around inside the "package" that is the iPhoto Library unless you are REALLY keen to lose data, because that is exactly what will happen.
    ..... everything you want to do to a photo in iPhoto can be handled from *within the program.* This INCLUDES using a third-party editor, and saves a lot of time and disk space if you do it this way:
    1. In iPhoto's preferences, specify a third-party editor (let's say Photoshop) to be used for editing photos.
    2. Now, when you right-click (or control-click) a photo in iPhoto, you have two options: Edit in Full Screen (ie iPhoto's own editor) or Edit with External Editor. Choose the latter.
    3. Photoshop will open, then the photo you selected will automatically open in PS. Do your editing, and when you save (not save as), PS "hands" the modified photo back to iPhoto, which treats it exactly the same as if you'd done that stuff in iPhoto's own editor and updates the thumbnail to reflect your changes. Best of all, your unmodified original remains untouched so you can always go back to it if necessary.

  • I've doubled my RAM but still can't save big files in photoshop...

    I have just upgraded my RAM from the shipped 4 GB to 8 GB, but I still can't save big images in Photoshop. It seems to have made no difference to the speed of general tasks and saving, and I can't save big files in Photoshop at all. I already moved massive amounts of files off my computer onto my external hard drive, so there is much less on my computer now, and twice the RAM, but it is making no noticeable difference. When I check Memory under "About This Mac" it shows that the RAM is installed and the machine now has twice the memory. Could this be something to do with Photoshop? I'm running CS6.

    Also, I just calculated 220 cm into inches and this is roughly 86.5 inches. Over 7 feet in length.
    With an image that large, you may want to consider downsampling to a lower DPI; that makes images of this size much easier to work on and process, and easier for your print house to handle as well.
    You might want to consider working with these rather large images at 225 DPI instead of 300 if resolution at close distances is still a concern.
    Or what you could try is working with the images at 300 DPI, but then saving/exporting them as a JPEG at the highest image quality setting.
    I do a lot of projects where I use a high-resolution JPEG image to save on image processing overhead.
    The final printed images still come out pretty clear, clean and crisp without having to process those large files at a print house or having to deal with using the full-resolution image in a page layout or illustration program.

  • Error in loading big file

    Hi All,
    I have an application on WebLogic 8.1 SP3; the database is SQL Server 2000. I use the code below to load a local file into the database. The column type in the database is image.
    PreparedStatement pStatement = null;
    InputStream myStream = new InputStream();  // a vendor-specific embedded-stream wrapper in the original post
    myStream.setEmbeddedStream( is );
    pStatement.setBinaryStream( 1, myStream, -1 );
    pStatement.executeUpdate();
    pStatement.close();
    pStatement = null;
    is is an InputStream reading from a local file, and the SQL statement is
    insert into file_content(content) values(?)
    This works fine for files smaller than 150 MB, but for big files (>150 MB) it doesn't work and I get the error message below:
    <Feb 11, 2005 12:00:41 PM PST> <Notice> <EJB> <BEA-010014> <Error occurred while attempting to rollback transaction: javax.transaction.SystemException: Heuristic hazard: (weblogic.jdbc.wrapper.JTSXAResourceImpl, HeuristicHazard, (javax.transaction.xa.XAException: [BEA][SQLServer JDBC Driver]Object has been closed.))
    javax.transaction.SystemException: Heuristic hazard: (weblogic.jdbc.wrapper.JTSXAResourceImpl, HeuristicHazard, (javax.transaction.xa.XAException: [BEA][SQLServer JDBC Driver]Object has been closed.))
    at weblogic.transaction.internal.ServerTransactionImpl.internalRollback(ServerTransactionImpl.java:396)
    at weblogic.transaction.internal.ServerTransactionImpl.rollback(ServerTransactionImpl.java:362)
    Can anybody help? Thanks in advance.

    Fred Wang wrote:
    > (snip: full question quoted above)
    I already answered this as to the cause in the MS newsgroups... It's the DBMS choking so badly on the size of your image file that it actually kills the connection. Note that the DBMS has to save the whole image to the log as well as to the DBMS table. The fundamental response is to get DBA help to configure the DBMS to be able to do what you want. If you want and can, you could split your image column into multiple image columns, split your data into 100 MB chunks, insert an empty row and then update it column by column, and then concatenate the data again in the client when you get it out, etc., but it is much better to post to the MS server newsgroups for ideas.
    Joe
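    A rough sketch of that chunking idea in plain JDBC, as a row-per-chunk variation on Joe's multiple-columns suggestion (the table file_content_chunks(file_id, seq, content) is hypothetical):

    import java.io.ByteArrayInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    // Sketch: store a big file as 100 MB chunks, one row per chunk, so no single
    // statement pushes the whole image through the driver and the transaction log.
    // Assumed (hypothetical) table: file_content_chunks(file_id int, seq int, content image)
    public class ChunkedBlobWriter {
        private static final int CHUNK = 100 * 1024 * 1024; // 100 MB per chunk

        public static void store(Connection con, int fileId, InputStream in)
                throws SQLException, IOException {
            PreparedStatement ps = con.prepareStatement(
                "insert into file_content_chunks(file_id, seq, content) values (?, ?, ?)");
            try {
                byte[] buf = new byte[CHUNK];
                int seq = 0;
                int read;
                while ((read = fill(in, buf)) > 0) {
                    ps.setInt(1, fileId);
                    ps.setInt(2, seq++);
                    ps.setBinaryStream(3, new ByteArrayInputStream(buf, 0, read), read);
                    ps.executeUpdate(); // each chunk commits on its own under auto-commit
                }
            } finally {
                ps.close();
            }
        }

        // Fill buf as far as the stream allows; returns bytes read, 0 at end of stream.
        private static int fill(InputStream in, byte[] buf) throws IOException {
            int off = 0;
            while (off < buf.length) {
                int n = in.read(buf, off, buf.length - off);
                if (n < 0) break;
                off += n;
            }
            return off;
        }
    }

    Reading it back is the reverse: select the rows ordered by seq and concatenate the streams on the client.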

  • While transferring several files to the G-Raid, the G-Raid device locks up with the warning: "The Finder can't complete the operation because some data in (whatever file) can't be read or written (Error code -36)."

    While transferring several files to the G-Raid, the G-Raid device locks up with the following warning: "The Finder can't complete the operation because some data in (whatever file) can't be read or written (Error code -36)." Not sure this is a G-Raid problem, can anyone help?

    This is likely a problem with the drive itself. This error indicates unspecified I/O errors with the drive, which can be from a hardware fault, or a formatting issue. If the drive has valuable data on it, then try connecting it to another PC to get the data off, and then try fully partitioning and formatting the drive with Disk Utility. If this corrects the error then you should be able to use the drive again as it was likely a formatting issue, but if the problem continues then it is likely a hardware fault and the best solution is to be safe and replace the drive.

  • What are good ways to send a big file (20 MB-100 MB) to my friend?

    What are good ways to send a big file (20 MB-100 MB) to my friend?
    Thanks in advance

    If this is over the internet, iChat is probably your best bet.
    But if you just want a local transfer:
    Plug a FireWire cable into both of your computers, shut down one of them, then hold "T" and press its power button (Target Disk Mode); the restarted computer should pop up as an external drive on the second computer.

  • USB key ejecting itself when transferring large files

    I have an LG 16 GB USB key formatted as Mac OS Extended (Journaled). I've been encountering a particular issue when I copy files taken on another computer from the key to my MacBook Pro (OS X 10.6.2): after the transfer starts, the key ejects itself with the "the disk was not ejected properly" message, then reappears right away in the Finder...
    The problem doesn't present itself when I copy a file from my MacBook Pro to the key and then transfer it back to my computer, and I've tried this with both small files (under 200 MB) and large files (over 4 GB).
    I've also noticed the transfer rate is very slow when this happens.
    So far the problem only presents itself when I transfer files from one of the Mac Pros in my university's studios, but since I work a lot on those machines, this is becoming quite a problem. I don't have the complete specs of these machines; they all run Leopard, probably 10.5-something, not Snow Leopard... perhaps that's the problem?
    I've seen similar posts on the forum concerning this issue with Time Machine backup drives, suggesting it had something to do with the sleep preferences, but I've tried all the potential solutions from those posts and the problem persists.
    So far the only solution I've found has been to transfer the files to my girlfriend's MacBook (running Tiger), and then transferring the files in target mode from her computer to mine... quite inconvenient...
    I have a feeling this might be due to the formatting of my key. I have a 2 GB USB key formatted as MS-DOS (FAT32) and have had no problems with it so far. The reason I formatted in Mac OS Extended is simple: I work in video, and with HD I often find myself moving around single files larger than 4 GB, which I've come to understand isn't possible on FAT32.
    I'd like to know how to resolve this, and I'd especially like to know if it is indeed a format issue, since I'm soon going to acquire a new external hard drive for the sole purpose of storing my increasingly large media files and would like to know how to format it.

    Hi,
    I have an external USB card reader (indeed, two different readers) which has the same problem with self-ejecting disks. Every transfer of data from my camera's SD card is a pain now. There are many different threads related to the "the disk was not ejected properly" situation, but no working solution yet.

  • Can't move big files even with Mac OS Extended (Journaled)!

    Hi all =)
    This time I really can't understand it.
    I'm trying to move a big file (an app, 9 GB large) from my MacBook's Applications folder to an external USB drive, HFS+ formatted, but at about 5.5 GB the process stops and I get this error:
    "The Finder can't complete the operation because some data in "application.app" can't be read or written. (error code -36)"
    Tried searching for this error code on the internet with no results.
    Tried transferring the same file to a different drive (which is also HFS+) but I still get the same error code.
    Both drives have plenty of available free space.
    Also tried different USB cables.
    The app in question has just been fully downloaded, without errors, from the App Store.
    What should I try now? Any suggestion welcome... this situation is so frustrating!
    Thanks

    LowLuster wrote:
    The Applications folder is a System folder and there are restrictions on it. I'm not sure why you are trying to copy an App out of it to an external drive. Try copying it to a folder on the ROOT of the internal drive first, a folder you create using a Admin account, and then copying that to the external.
    Thanks a lot LowLuster, you actually solved my situation!
    But apparently in this "forum" you can't unselect the chosen answer if you clicked the wrong one by mistake, you can't edit your messages, and you can't even delete your messages... such freedom down here, jeez!
