Handling large files in the scope of WSRP portlets

Hi there,
just wanted to ask if there are any best practices with respect to handling large file uploads/downloads when using WSRP portlets (apart from bypassing WebCenter altogether for these use cases, that is). We keep getting OutOfMemoryErrors and TimeoutExceptions as soon as the file being transferred grows larger than a few hundred megabytes. The portlet is happily streaming the file as part of its javax.portlet.ResourceServingPortlet.serveResource(ResourceRequest, ResourceResponse) implementation, so the problem must somehow lie within WebCenter itself.
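For reference, the resource-serving code streams in small chunks along these lines (a minimal sketch, not our exact code; the portlet class name, file path, and buffer size are placeholders):

    import java.io.*;
    import javax.portlet.*;

    public class DownloadPortlet extends GenericPortlet {
        @Override
        public void serveResource(ResourceRequest request, ResourceResponse response)
                throws PortletException, IOException {
            File file = new File("/data/export/archive.bin"); // placeholder path
            response.setContentType("application/octet-stream");
            response.setProperty("Content-Length", String.valueOf(file.length()));
            try (InputStream in = new BufferedInputStream(new FileInputStream(file));
                 OutputStream out = response.getPortletOutputStream()) {
                byte[] buffer = new byte[8192]; // stream in 8 KB chunks, never holding the whole file
                int read;
                while ((read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
            }
        }
    }

So nothing in the portlet itself should be buffering the full payload.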
Thanks in advance,
Chris

Similar Messages

  • Handling Large files in PI scenarios?

    Hello,
    We have a lot of scenarios (almost 50) where we deal with file interfaces on at least the receiver or sender side. Some of them are plain file transfers where we use AAE, and some are ones where we have to do message mapping (sometimes very complex ones).
    The interfaces work perfectly fine with a normal file that doesn't have many records, but recently we started testing big files with over 1000 records, and it's taking a lot of time to process. It is also causing other messages that get lined up in the same queue to wait for the amount of time it takes the first message to process.
    This must be a very common scenario where PI has to process large files, especially files coming from banks. What is the best way to handle their processing? Apart from having better system hardware (we are currently in the test environment; the production environment will definitely be better), is there any technique that might help us improve the processing of large files without data loss and without interrupting other messages?
    Thanks,
    Yash

    Hi Yash,
    Check these blogs for the structure you are mentioning:
    /people/shabarish.vijayakumar/blog/2006/02/27/content-conversion-the-key-field-problem
    /people/shabarish.vijayakumar/blog/2005/08/17/nab-the-tab-file-adapter
    Regards,
    ---Satish

  • Handling large files with FTP in XI

    Hi All,
    I have a file scenario where I have to post a file with a size of more than 500 MB and up to 700 fields in each line.
    Another scenario worked fine when the file size was below 70 MB with fewer fields.
    Could anyone help me in handling such a scenario, with a large file size and without splitting the file?
    1) From your previous experience, did you use any tools to help with the development of the FTP interfaces?
    2) The client looked at Itemfield but is not willing to use it due to the licensing costs. Did you use any self-made pre-processors?
    3) Please let me know the good and bad experiences you had when using the XI File Adapter.
    Thanks & Regards,
    Raghuram Vedagiri.

    500 MB is huge. XI will not be able to handle such a huge payload, for sure.
    Are you using XI as a mere FTP mover, or are you using content conversion with mapping etc.?
    1. Either use splitting logic to split the file outside XI (using scripts; a rough sketch follows below) and then let XI handle those files,
    2. or size up your hardware (Java heap etc.) to make sure that XI can handle this file (not recommended, though). SAP recommends a size of 5 MB as the optimum.
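    A rough sketch of such an external splitter in Java (the file names and chunk size are assumptions; a real one must also keep logically related records together):

        import java.io.*;

        public class FileSplitter {
            public static void main(String[] args) throws IOException {
                int linesPerChunk = 100000; // tune so each part lands well under the adapter limit
                int chunk = 0, count = 0;
                try (BufferedReader in = new BufferedReader(new FileReader("big.dat"))) {
                    PrintWriter out = new PrintWriter(new FileWriter("big.part" + chunk));
                    for (String line; (line = in.readLine()) != null; ) {
                        out.println(line);
                        if (++count % linesPerChunk == 0) { // start a new part file
                            out.close();
                            out = new PrintWriter(new FileWriter("big.part" + ++chunk));
                        }
                    }
                    out.close();
                }
            }
        }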
    Regards
    Bhavesh

  • Java proxies for handling large files

    Dear all,
    Kindly explain how to handle this step by step, as I do not know much about Java.
    What is the advantage of using Java proxies here? Do we implement the split logic in Java code for handling a 600 MB file?
    please mail me the same to [email protected]

    Hi Srinivas,
    Check out this blog for the large file handling issue:
    /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
    This will help you.
    Please also see the documents below; these might help you:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/a068cf2f-0401-0010-2aa9-f5ae4b2096f9
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/f272165e-0401-0010-b4a1-e7eb8903501d
    /people/prasad.ulagappan2/blog/2005/06/27/asynchronous-inbound-java-proxy
    /people/rashmi.ramalingam2/blog/2005/06/25/an-illustration-of-java-server-proxy
    We can also find them on your XI/PI server in these folders:
    aii_proxy_xirt.jar: j2ee\cluster\server0\bin\ext\com.sap.aii.proxy.xiruntime
    aii_msg_runtime.jar: j2ee\cluster\server0\bin\ext\com.sap.aii.messaging.runtime
    aii_utilxi_misc.jar: j2ee\cluster\server0\bin\ext\com.sap.xi.util.misc
    guidgenerator.jar: j2ee\cluster\server0\bin\ext\com.sap.guid
    Java Proxy
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/a068cf2f-0401-0010-2aa9-f5ae4b2096f9
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f272165e-0401-0010-b4a1-e7eb8903501d
    Please reward if useful.

  • Hardware Question – Handling large files in Photoshop

    I'm working with some big TIFF files (~1 GB) for large-scale hi-res printing (60" x 90", 10718 x 14451), and my system is lagging hard like never before (Retina MacBook Pro 2012, 2.6 GHz i7, 8 GB RAM, 512 GB HD).
    So far I've tried:
    1) converting to .psd and .psb
    2) changing the scratch disk to an external Thunderbolt SSD
    3) allocating all available memory to the program within Photoshop preferences
    4) closing all other applications
    In general I'm being told that I don't have enough RAM. So what are the minimum recommended system requirements to handle this file size more comfortably? Newest Retina Pro with 16GB RAM? Or switch to iMac w/ 32? Mac Pro?
    Thanks so much!

  • Handling Large File

    Hi all,
    We need to handle a large file (880 MB). Is there any provision at the adapter level to break the file into smaller chunks?
    Can we avoid using shell scripts and OS-level commands?
    Thanks,
    Srinivas

    Hi Srinivas,
    If it is a text file, then you could break up the file into multiple recordsets, e.g.:
    [Converting Text Format in the Sender File/FTP Adapter to XML   |http://help.sap.com/saphelp_nwpi711/helpdata/en/44/658ac3344a4de0e10000000a1553f7/frameset.htm]
    and
    [#821267 File Adapter FAQ|http://service.sap.com/sap/support/notes/821267]
    (see question 14, Memory Requirements, in the FAQ note)
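    For example, the sender channel's content conversion settings would look something like this (parameter names as in the File Adapter's content conversion; the values are only illustrative):

        Recordset Structure: Row,*
        Recordsets per Message: 1000

    Each XI message then carries at most 1000 recordsets instead of the entire file.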
    Regards
      Kenny

  • How does Time Machine handle large files?

    I'm relatively new at the whole Time Capsule / Time Machine process and have learned that large files (e.g. an Aperture library) are backed up each time there is a change, and this can lead to the TC filling up quicker than normal.
    How does this work with daily and weekly backups?
    For example, say my Aperture library is 1 GB and I import a load of photos from my camera, so it goes up to 2 GB. I've learned that I should disable Time Machine while I'm in Aperture (or at least before 10.6... not sure now). So given I've done that and imported the files to Aperture, I want to edit them later and ultimately move them into iPhoto to keep the Aperture album small.
    When I turn Time Machine back on, the next hourly backup will know the library has changed and will back it up, and this will go on until a daily backup has been taken - does this delete the 24 hourly backups? Or does it merge them?
    If I then do the editing the following week, then export the photos so the library is back to 1 GB again... backed up hourly/daily/weekly etc., what am I left with?
    Do I have the original, the 2 GB version, and the new 1 GB version, i.e. 4 GB? Is there a cunning way I can work to change the files within a week so only one of the changes is in the backup?

    Orpheus999 wrote:
    When I turn Time Machine back on, the next hourly backup will know the library has changed and will back it up, and this will go on until a daily backup has been taken - does this delete the 24 hourly backups? Or does it merge them?
    The Time Machine panel of System Preferences says this:
    Time Machine keeps
    - Hourly backups for the past 24 hours
    - Daily backups for the past month
    - Weekly backups until your backup disk is full
    Each time Time Machine runs it creates what appears to be an entirely new backup set, although it does this in a way that doesn't require it to copy files that have already been copied. So merging isn't necessary. Another effect of how it operates is that each unique version of a file (as opposed to packages of files) only exists on the backup volume once.
    According to the contents of my Time Machine backup file, hourly backups are literally kept for 24 hours, not until the next "daily" backup. For a "daily" backup, it seems to keep the oldest "hourly" backup for a day.
    If I then do the editing the following week, then export the photos so the library is back to 1 GB again... backed up hourly/daily/weekly etc., what am I left with?
    Do I have the original, the 2 GB version, and the new 1 GB version, i.e. 4 GB? Is there a cunning way I can work to change the files within a week so only one of the changes is in the backup?
    You might be able to exclude those files from being backed up at certain times, but I can't be sure this would result in older copies of those files being retained.

  • Handling large files

    Hi,
    how to process large files in SAP XI?
    Thanks,
    Manogna

    Hi Seshagiri,
    Check the blogs below:
    Huge processing of files
    Night Mare-Processing huge files in SAP XI
    Step-by-Step Guide in Processing High-Volume Messages Using PI 7.1's Message Packaging
    SAP XI acting as a (huge) file mover
    Managing bulky flat messages with SAP XI (tunneling once again) - UPDATED
    Regards,
    Kummari

  • Handling Large Files in XI

    We have designed a couple of integration processes for the project. We need to handle messages of more than 5 MB at a rate of approximately 100 msgs/hour, and we did the tuning as per the tuning guide. We are now facing the following issues:
    1. A JCo connection failure error occurs whenever XI handles large files
    2. ICM_HTTP_INTERNAL_ERROR occurs whenever XI handles large files
    Does anyone have a solution for these issues?

    I am sure that you have already checked the sizing requirements for large messages. If not, it might be worth looking at this:
    The memory consumption of XI depends on the number of processes running in parallel and the size of the message.
    In general, an extra sizing for XI memory consumption is not required. The total memory of the SAP Web Application Server should be sufficient except in the case of large messages (>1MB).
    To determine the memory consumption for processing large messages, you can use the following rules of thumb:
    - Allocate 3 MB per process (for example, the number of parallel messages per second may be an indicator).
    - Allocate 4 KB per 1 KB of message size in the asynchronous case, or 9 KB per 1 KB of message size in the synchronous case.
    Example: asynchronous concurrent processing of 10 messages, each 1 MB in size, requires (3 MB + 4 * 1 MB) * 10 = 70 MB of memory.
    With mapping or content-based routing, where an internal representation of the message payload may be necessary, the memory requirements can be much higher (possibly exceeding 20 KB per 1 KB of message, depending on the type of mapping).
    The size of the largest message thus depends mainly on the size of the available main memory. On a normal 32-bit operating system, there is an upper boundary of approximately 1.5 to 2 GB per process, limiting the respective largest message size.
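    (By the same rule of thumb, the synchronous case for the same load would work out to (3 MB + 9 * 1 MB) * 10 = 120 MB.)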
    Hope it helps.
    Cheers, Sachin K

  • Handling large files for displaying within a JTextPane

    I have some (very) large trace files, sometimes more than 100 MB, and I want to group the traces, which are originally stored unordered in the trace file.
    When I tried to read in a just 16 MB large trace file using the command line option -Xmx60m, I got an OutOfMemoryError.
    Is there a way to handle (very) large (text) files most effectively? For instance, how can I best group (some kind of sorting) a trace file of, let's say, 100 MB? Maybe by splitting the file?
    I want to display the text of this/these text files in a JTextPane coloring some particular parts of the traces.
    Thank you in advance,
    Dirk
    Berlin, Germany
    [email protected]

    I don't know of any way to display 100MB of text effectively. I'm sure the user isn't going to sit there and scroll through 100MB of data to find what they are looking for. I would probably load the data into a database and create some queries the user can invoke to filter the data.
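    If you go the database route, here is a minimal sketch using an embedded engine such as H2 (the JDBC URL, table layout, and filter query are assumptions):

        import java.io.*;
        import java.sql.*;

        public class TraceLoader {
            public static void main(String[] args) throws Exception {
                try (Connection con = DriverManager.getConnection("jdbc:h2:./traces");
                     BufferedReader in = new BufferedReader(new FileReader("trace.log"))) {
                    con.createStatement().execute(
                        "CREATE TABLE IF NOT EXISTS trace(id IDENTITY, line VARCHAR)");
                    PreparedStatement ins = con.prepareStatement(
                        "INSERT INTO trace(line) VALUES (?)");
                    for (String line; (line = in.readLine()) != null; ) {
                        ins.setString(1, line);  // one row per trace line
                        ins.addBatch();
                    }
                    ins.executeBatch();
                    // the GUI then pages through filtered results instead of loading 100 MB:
                    ResultSet rs = con.createStatement().executeQuery(
                        "SELECT line FROM trace WHERE line LIKE '%ERROR%' LIMIT 100");
                    while (rs.next()) System.out.println(rs.getString(1));
                }
            }
        }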

  • Best way to handle large files in FCE HD and iDVD.

    Hi everyone,
    I have just finished working on a holiday movie that my octogenarian parents took. They presented me with about 100 minutes of raw footage that I have managed to edit down to 64 minutes. They have viewed the final version, which I recorded back to tape for them. They now want to know if I can put it onto a DVD for them as well. The problem is that the FCE HD file is 13 GB.
    So here is my question.
    What is the best way to handle this problem?
    I have spoken to a friend of mine who is a professional editor. She said to reduce the movie duration to about 15 minutes because it's probably too long and boring (rather hurtful, really). Anyway, that is out of the question as far as my oldies are concerned.
    I have seen info on Toast 8 that mentions a "Fit to DVD" process that purports to squash 9 GB of movie onto a 4.7 GB disc. I can't find out whether it will also put 13 GB onto a dual-layer 8.5 GB disc.
    Do I have to split the movie into two parts and make two dual-layer DVDs? If so, I have to ask: how come "Titanic", at 3hrs+, fits on one disc?
    Have I asked too many questions?

    Take a deep breath. Relax. All is fine.
    iDVD does not look at the size of your video file; it looks at the length. iDVD can accommodate up to 2 hours of movie.
    iDVD gives you different options depending on the length of your movie. Although I won't agree with your friend about reducing the length of your movie to 15 minutes, if you could trim out a few minutes to get it under an hour, the setting in iDVD called Best Performance (though the new version may have renamed it) gives you the best quality. Still, any iDVD setting will give you good quality, even at 64 minutes.
    In FCE, export as QuickTime Movie, NOT any flavour of QuickTime Conversion. Select chapter markers if you have them. If everything is on one system, uncheck the Make Movie Self-Contained button. Drop the QT file into iDVD.

  • Mac faster at handling large files than a PC?

    Hey guys.
    Don't want to start a OS wars discussion here.
    But I just wonder...
    I have a pretty fast quad-core laptop with 8 GB RAM and Windows 7 (64-bit), and still, when it comes to bigger files, it doesn't handle them much faster than before, I think. (Or am I spoilt already?)
    Does anybody have experience with the latest MacBook Pro with 8 GB RAM, using Adobe CS with bigger files every day (files about the size of 500 MB to 1 GB in PS, e.g.), in comparison with a similar Windows machine?
    Thanks for your replies,
    Björn

    Macs have faster interfaces like FireWire 800 and ExpressCard slots that allow eSATA devices for very fast transfer speeds.
    As far as internal storage goes, SSDs and RAID setups are just the same as on PCs.
    What it most likely comes down to is that PCs are built a lot more cheaply on a widespread scale, while Macs are used a lot in the video and creative fields where high input/output is required.
    So by default a lot of Macs come with faster interfaces, and by default a lot of PCs come with slower interfaces.
    Also, a Windows PC requires anti-malware running, constant defragging, and other things to keep the performance up, which isn't so much the case with Macs.
    Someone who knows what they are doing can make either machine very fast.

  • Handle HttpRequest and Response in WSRP Portlets

    Hi all,
    I have deployed a Fusion ADF application as a WSRP portlet on WebCenter Portal.
    In my portlet application code I want to perform some file handling operations (like opening a file in the browser) using ADF table components.
    I used the HTTP servlet request and response, but the portlet throws the following exception:
    java.lang.ClassCastException: org.apache.myfaces.trinidadinternal.config.dispatch.DispatchResourceResponse cannot be cast to javax.servlet.http.HttpServletResponse
    Does anyone have some idea about how to handle HTTP requests and responses in WSRP portlets?

    Hi
    Portlets don't use HttpServletRequest & HttpServletResponse.
    Use PortletRequest & PortletResponse instead.
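    In an ADF/JSF view running under the portlet bridge, you can usually reach the portlet response through the ExternalContext. A minimal sketch (the bean name, content type, and byte-array parameter are placeholders):

        import java.io.IOException;
        import java.io.OutputStream;
        import javax.faces.context.FacesContext;
        import javax.portlet.ResourceResponse;

        public class FileDownloadBean {
            public void writeFile(byte[] bytes) throws IOException {
                Object raw = FacesContext.getCurrentInstance().getExternalContext().getResponse();
                if (raw instanceof ResourceResponse) {      // a resource (file-serving) request
                    ResourceResponse res = (ResourceResponse) raw;
                    res.setContentType("application/pdf");  // example content type
                    OutputStream out = res.getPortletOutputStream();
                    out.write(bytes);  // stream to the portlet response instead of casting to HttpServletResponse
                }
            }
        }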
    Regards.

  • Loading large files in Java Swing GUI

    Hello Everyone!
    I am trying to load large files (more than 70 MB of XML text) in a Java Swing GUI. I tried several approaches.
    1) Byte-based loading with a loop similar to:

        pane.setText("");
        InputStream file_reader = new BufferedInputStream(new FileInputStream(file));
        int BUFFER_SIZE = 4096;
        byte[] buffer = new byte[BUFFER_SIZE];
        int bytesRead;
        String line;
        while ((bytesRead = file_reader.read(buffer, 0, BUFFER_SIZE)) != -1) {
            line = new String(buffer, 0, bytesRead);
            pane.append(line); // append each chunk to the text component
        }
        file_reader.close();

    But this gives me unacceptable response times for large files and runs out of Java heap memory.
    2) I read in several places that I could load only small chunks of the file at a time, and when the user scrolls up or down the next/previous chunk is loaded. To achieve this, I am guessing extensive manipulation of the scroll bar in the JScrollPane will be needed, or perhaps adding an external JScrollBar? Can anyone provide sample code for that approach? (Bear in mind that I am writing code for an editor, so I will need to interact via clicks, mouse wheel rotation, keyboard buttons, and so on...)
    If anyone can help me, post sample code, or point me to useful links that deal with this issue or with writing code for editors in general, I would be very grateful.
    Thank you in advance.

    Hi,
    I'm replying to your question from another thread.
    To handle large files I used the new I/O library. I'm trying to remember off the top of my head, but the classes involved were RandomAccessFile, FileChannel and MappedByteBuffer. The MappedByteBuffer was the best way for me to read and write to the file.
    When opening the file I had to scan through the contents of the file using a SwingWorker thread and a progress monitor. Whilst doing this I indexed the file into manageable chunks. I also created a cache to further optimise file access.
    In all it worked really well, and I was surprised by the performance of the new I/O libraries. I remember loading 1 GB files, and whilst having to wait a few seconds to perform the indexing, you wouldn't know that the data for the JList was being retrieved from a file whilst the application was running.
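    A minimal sketch of that mapping approach (the file name and window size are placeholders; the actual indexing and cache are elided):

        import java.io.RandomAccessFile;
        import java.nio.MappedByteBuffer;
        import java.nio.channels.FileChannel;

        public class MappedScanner {
            public static void main(String[] args) throws Exception {
                try (RandomAccessFile raf = new RandomAccessFile("trace.log", "r");
                     FileChannel channel = raf.getChannel()) {
                    long size = channel.size(), pos = 0;
                    int window = 16 * 1024 * 1024;  // map the file in 16 MB windows
                    while (pos < size) {
                        long len = Math.min(window, size - pos);
                        MappedByteBuffer buf = channel.map(FileChannel.MapMode.READ_ONLY, pos, len);
                        while (buf.hasRemaining()) {
                            if (buf.get() == '\n') {
                                // record (pos + buf.position()) as a line-start offset for the index
                            }
                        }
                        pos += len;
                    }
                }
            }
        }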
    Good Luck,
    Martin.

  • Splitting large file in XI

    Can we split an incoming file in XI? We are getting a large file of size 80 MB and want to cut it down to 40 MB each.
    The sender system is sending the 80 MB file in a single shot; they cannot change it.
    It has become mandatory for me to break it up in XI. (The scenario is File to Proxy.)

    Hi Viswanath,
    Handling large files, say anything above 100 MB, is always a problem with the File adapter, as the data has to be moved from the Adapter Engine to the Integration Engine and vice versa.
    Third-party tools are generally used for that. Conversion Agent by Itemfield is one of the best approaches.
    Also, on the Advanced tab of the file sender adapter, select the check box next to Advanced Mode. There you can specify the Maximum File Size (Bytes) option.
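    For your 40 MB target that would be roughly (parameter name as given above; the value is just 40 * 1024 * 1024):

        Advanced Mode: checked
        Maximum File Size (Bytes): 41943040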
    Huge processing of files
    Night Mare-Processing huge files in SAP XI
    Step-by-Step Guide in Processing High-Volume Messages Using PI 7.1's Message Packaging
    SAP XI acting as a (huge) file mover
    Managing bulky flat messages with SAP XI (tunneling once again) - UPDATED
    Regards,
    Vinod.
