Out of memory when converting large files using Web service call

I'm running into an out of memory error on the LiveCycle server when converting a 50 MB Word document with a web service call.  I've already tried increasing the heap size, but I'm at the limit for the 32-bit JVM on Windows.  I could upgrade to a 64-bit JVM, but it would be a pain and I'm trying to avoid it.  I've tried converting the 50 MB document through the LiveCycle admin and it works fine; the issue only occurs when using a web service call.  I have a test client, and its memory spikes while it's generating the web service call, taking over a gigabyte of memory.  I assume it takes a similar amount of memory on the receiving end, which is why LiveCycle is running out of memory.  Does anyone have any insight into why passing a 50 MB file requires so much memory?  Is there any way around this?
-Kelly

Hi,
You are correct that a complete 64-bit environment would solve this. The problem is that you get the out of memory error when the file is written to memory on the server. You can work around it by creating an interface that stores large files on the server's hard disk instead, which lets you convert files as large as LiveCycle can handle without any memory issue.
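
For what it's worth, a minimal sketch of that idea in plain Java; the class and method names below are hypothetical, not part of the LiveCycle API. The point is to copy the incoming document to disk in small chunks so the heap never holds the whole file, and then hand the conversion service a file reference rather than an in-memory byte array.

    import java.io.*;

    // Hypothetical helper: stage an incoming document on the server's disk in
    // 8 KB chunks so memory use stays roughly constant regardless of file size.
    public class LargeFileStaging {
        public static File saveToWorkDir(InputStream in) throws IOException {
            File staged = File.createTempFile("lc-upload-", ".doc");
            OutputStream out = new BufferedOutputStream(new FileOutputStream(staged));
            try {
                byte[] buf = new byte[8192];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);   // never more than one buffer in memory
                }
            } finally {
                out.close();
                in.close();
            }
            return staged;   // pass this file reference to the conversion step
        }
    }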

Similar Messages

  • How to send large files using web service

    hello everyone,
    I am new to this forum, so please pardon me if I post a silly problem...
    I have created a service which sends a file when the client (JSP) requests it. I am using JBoss as my server. The purpose of this application is that when the client requests some file, the service sends it back... and most of the time we only need to send PDFs and PPTs...
    The problem is that this service sends .txt and .java files of any size easily, but when I tried sending a PDF or PPT I got an xml.SAXParseException...
    I thought this error is because of some characters, but how do I fix it?
    I am working on Linux.
    code snippet is:
    import java.io.*;

    public class MyHelloService {
        // Returns the file contents as a String. Note: putting raw binary bytes
        // into a String is what corrupts the SOAP/XML response for PDF and PPT
        // files (hence the SAXParseException); a Base64 sketch follows this thread.
        public String file_size(String name) {
            String s = "";
            FileInputStream in = null;
            try {
                System.out.println("name received is: " + name);
                in = new FileInputStream(name);
                int size = in.available();
                System.out.println("FILE SIZE IS: " + size);
                byte[] sendata = new byte[size];
                int read = in.read(sendata);
                if (read > 0) {
                    s = new String(sendata, 0, read);
                }
            } catch (Exception e) {
                System.out.println("EXCEPTION IN JWS: " + e);
                s = "nofilefounderror";
            } finally {
                try { if (in != null) in.close(); } catch (IOException ignore) {}
            }
            return s;
        }
    }
    Please tell me what I am doing wrong and how to fix it.
    One more thing: can I send a byte array from a web service? I tried but couldn't get it to work, so I am reading everything into a single byte array and then converting it to a String.
    Is it possible to send the file in chunks? If yes, how?
    Waiting for the reply... please answer as soon as possible.
    Rashi

    hi,
    I am sending the file from server to client, i.e. the client requests a file and the service sends it back... there is no socket connection... I am using JBoss and Apache Axis.
    Please help me out...
    Rashi
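
    (A side note that is not from the thread itself: the usual remedy for a SAXParseException on binary payloads is to Base64-encode the bytes before placing them in the SOAP response, since raw PDF/PPT bytes are not valid XML character data; Apache Axis does this for you when the return type is byte[], which it maps to xsd:base64Binary. A minimal sketch, assuming Java 8+ for java.util.Base64:)

    import java.io.File;
    import java.io.IOException;
    import java.nio.file.Files;
    import java.util.Base64;

    // Sketch only: read the file as bytes and return a Base64 string, which is safe
    // to embed in an XML/SOAP message. The client decodes it back to bytes before
    // writing the PDF or PPT to disk.
    public class FileAsBase64 {
        public static String encodeFile(String name) throws IOException {
            byte[] data = Files.readAllBytes(new File(name).toPath());
            return Base64.getEncoder().encodeToString(data);
        }

        public static void decodeToFile(String base64, String target) throws IOException {
            Files.write(new File(target).toPath(), Base64.getDecoder().decode(base64));
        }
    }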

  • ABAP source code to connect to third party systems using web service calls?

    Hi all,
    Can anyone provide an example of ABAP source code to connect to third-party systems using web service calls? The base system is CRM.

    Do you want to call a web service in a remote system, or do you want to provide a web service?
    If you want to call a web service you should create a proxy object via SE80. Open your development package, right click on the tree entry and choose: Create -> Enterprise Service / Web Service -> Proxy Object and provide the needed information (including the WSDL description file). You may then use the proxy object to call the web service (if the connection and everything else works right).
    See http://help.sap.com/saphelp_nw04/helpdata/en/9b/dad1ae3908ee44a5caf57e10918be9/content.htm

  • Server goes out of memory when annotating TIFF File. Help with Tiled Images

    I am new to JAI and have a problem with the system going out of memory
    Objective:
    1) Load a TIFF file (each approx. 5-8 MB when compressed with CCITT.6 compression)
    2) Annotate the image (consider it a simple drawString with the Graphics2D object of the RenderedImage)
    3) Send it to the servlet OutputStream
    Problem:
    Server goes out of memory when 5 threads try to access it concurrently
    Runtime conditions:
    VM param set to -Xmx1024m
    Observation
    Writing the files takes a lot of time when compared to reading the files
    Some more information
    1) I need to do the annotating at pre-defined positions on the images (e.g. in the first quadrant, or maybe in the second quadrant).
    2) I know that using the TiledImage class it's possible to load a portion of the image and process it.
    Things I need help with:
    I do not know how to send the whole file back to the servlet output stream after annotating a tile of the image.
    If I write the tiled image back to a file, or to the output stream, it gives me only the portion of the tile I read in and watermarked, not the whole image file.
    I have attached the code I use when I load up the whole image
    Could somebody please help with the TiledImage solution?
    Thx
    public void annotateFile(File file, String wText, OutputStream out, AnnotationParameter param) throws Throwable {
        ImageReader imgReader = null;
        ImageWriter imgWriter = null;
        TiledImage in_image = null, out_image = null;
        IIOMetadata metadata = null;
        ImageOutputStream ios = null;
        try {
            // Read the source TIFF and keep its metadata so compression can be preserved.
            Iterator readIter = ImageIO.getImageReadersBySuffix("tif");
            imgReader = (ImageReader) readIter.next();
            imgReader.setInput(ImageIO.createImageInputStream(file));
            metadata = imgReader.getImageMetadata(0);
            in_image = new TiledImage(JAI.create("fileload", file.getPath()), true);
            System.out.println("Image Read!");

            // Annotate (watermark) the image.
            Annotater annotater = new Annotater(in_image);
            out_image = annotater.annotate(wText, param);

            // Write the annotated image to the servlet output stream.
            Iterator writeIter = ImageIO.getImageWritersBySuffix("tif");
            if (writeIter.hasNext()) {
                imgWriter = (ImageWriter) writeIter.next();
                ios = ImageIO.createImageOutputStream(out);
                imgWriter.setOutput(ios);
                ImageWriteParam iwparam = imgWriter.getDefaultWriteParam();
                if (iwparam instanceof TIFFImageWriteParam) {
                    // Re-apply the compression recorded in the source TIFF directory.
                    iwparam.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
                    TIFFDirectory dir = (TIFFDirectory) out_image.getProperty("tiff_directory");
                    double compressionParam = dir.getFieldAsDouble(BaselineTIFFTagSet.TAG_COMPRESSION);
                    setTIFFCompression(iwparam, (int) compressionParam);
                } else {
                    iwparam.setCompressionMode(ImageWriteParam.MODE_COPY_FROM_METADATA);
                }
                System.out.println("Trying to write Image ....");
                imgWriter.write(null, new IIOImage(out_image, null, metadata), iwparam);
                System.out.println("Image written....");
            }
        } finally {
            if (imgWriter != null)
                imgWriter.dispose();
            if (imgReader != null)
                imgReader.dispose();
            if (ios != null) {
                ios.flush();
                ios.close();
            }
        }
    }
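
    (Not from the original post, but two knobs that may help with the concurrent out-of-memory symptom, offered only as a sketch: JAI's shared tile cache can be capped so decoded tiles cannot grow without bound, and a semaphore can limit how many annotation requests run at once. Whether this is enough depends on how much of each TIFF the "fileload" operation decodes up front; the numbers below are illustrative.)

    import java.util.concurrent.Semaphore;
    import javax.media.jai.JAI;

    // Sketch: bound the memory JAI may spend on cached tiles (here 128 MB) and
    // allow at most two annotation jobs to run concurrently. Both values would
    // need tuning against the real heap size (-Xmx1024m in this case).
    public class AnnotationThrottle {
        private static final Semaphore PERMITS = new Semaphore(2);

        static {
            JAI.getDefaultInstance().getTileCache().setMemoryCapacity(128L * 1024 * 1024);
        }

        public static void annotateWithLimit(Runnable annotationJob) throws InterruptedException {
            PERMITS.acquire();           // block if two jobs are already running
            try {
                annotationJob.run();
            } finally {
                PERMITS.release();
            }
        }
    }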

    user8684061 wrote:
    "You are right, SGA is too large for my server. I guess Oracle set the SGA automatically when I chose the default installation, but why would the SGA be so big? Is Oracle not smart enough?"
    The default database configuration reserves 40% of physical memory for the SGA of an instance, which you as a user can always change. I don't see anything wrong with that, or any reason to say Oracle is not smart.
    "If I don't decrease the SGA, but increase max-shm-memory, would it work?"
    That needs support from the CPU architecture (32-bit or 64-bit) and the kernel as well. Read more about huge pages.

  • How can I use web service call for edit a report with SSRS in Java Struts2 web application

    Hello, I'm new to SSRS technology and I would like to make web service calls to my SSRS server. Can somebody help me?
    - What API should I import into my project? I use Maven; can I find this API in the Maven repository?
    - I would like a sample of code which initializes the ReportingService, does the call and processes the result.
    We use SQL Server Reporting Services 2008 R2, and currently we make HTTP calls like this: http://<ssr_server>/ReportServer/Pages/ReportViewer.aspx?%2fSSRS_OMB%2fMyReport&rs:Command=Render&MyParam=<value>
    Regards

    Hi ombinte,
    SQL Server Reporting Services provides access to the full functionality of the report server through the Report Server Web service. Because the Report Server Web service is an XML Web service which uses Simple Object Access Protocol (SOAP) over Hypertext Transfer Protocol (HTTP), any SOAP-aware application or development tool can communicate with the SSRS web service.
    There are three primary ways to develop Reporting Services applications based on the Web service, please see:
    Develop applications using Microsoft Visual Studio and the Microsoft .NET Framework SDK.
    Develop applications using the rs utility (RS.exe), the Reporting Services script environment.
    Develop applications using any SOAP-enabled set of development tools.
    For more information about Report Server Web Service, you can refer to the following document:
    http://technet.microsoft.com/en-us/library/ms152787.aspx
    Hope this helps.
    Thanks,
    Katherine Xiong
    TechNet Community Support
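
    (A minimal Java sketch of the SOAP route, for orientation only. The host in the WSDL URL is a placeholder, the QName values should be copied from your server's actual WSDL, and the proxy interface and operations, e.g. LoadReport and Render on the ReportExecution2005 endpoint, come from wsimport-generated stubs rather than from this snippet.)

    import java.net.URL;
    import javax.xml.namespace.QName;
    import javax.xml.ws.Service;

    // Sketch: point JAX-WS at the report server's WSDL and obtain a generic Service.
    // In practice you would first run
    //   wsimport -keep http://your-ssrs-server/ReportServer/ReportExecution2005.asmx?wsdl
    // and then call getPort(...) with the generated service endpoint interface.
    public class SsrsClientSketch {
        public static void main(String[] args) throws Exception {
            URL wsdl = new URL("http://your-ssrs-server/ReportServer/ReportExecution2005.asmx?wsdl");
            // Namespace and service name must match the <wsdl:service> element in your WSDL.
            QName serviceName = new QName(
                    "http://schemas.microsoft.com/sqlserver/2005/06/30/reporting/reportingservices",
                    "ReportExecutionService");
            Service service = Service.create(wsdl, serviceName);
            System.out.println("Connected to service: " + service.getServiceName());
            // ReportExecutionServiceSoap port = service.getPort(ReportExecutionServiceSoap.class);
            // port.loadReport("/SSRS_OMB/MyReport", null);   // generated method names may differ
        }
    }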

  • Download a XML file from Web Services Using Flex

    Hi All...
    I am new to Flex. I'm developing a Windows application using Flex/AIR. I have connected to the web services with user authentication; now I want to download an XML file using web services in Flex.
    How can I do this? Please reply...
    Thanks in advance
    Vasanth

    Hi All....
    I have done this myself using sample tutorials...
    here is the code for your reference guys
    // Build the download URL from the user's login name and player type,
    // then let FileReference prompt the user to save the XML file.
    plyLoginName = txtEmailIdDownload.text;
    var urlpath:String = "your url p?LoginName=" + plyLoginName
            + "&PlayerType=" + chkseasonvalue;
    Alert.show(urlpath);

    var request:URLRequest = new URLRequest(urlpath);
    var fileRef:FileReference = new FileReference();
    // download() is asynchronous; listen for COMPLETE before reporting success.
    fileRef.addEventListener(Event.COMPLETE, function(e:Event):void {
        Alert.show("File downloaded successfully");
    });
    fileRef.download(request, "yourfilename.xml");

    txtEmailIdDownload.text = "";
    txtPWDownload.text = "";
    thanks
    Vasanth

  • Transport file by web services

    hi SDN folks,
    I have a problem.
    I have an ABAP program that transfers a file to the server; then this file is put on a web page.
    Now I must download the file using web services.
    Is it possible, and how can I do it?
    thanks

    Hello everyone.
    I have the same problem as described in the first post.
    I created a Web Service, but in WSADMIN my service doesn't appear.
    When I go to WSCONFIG it is not possible for me to save the created Web Service (press F5). It seems that I cannot save anywhere in the system; no matter what I do, the save button is disabled.
    I extended my license and specified the user type as "student" rather than "developer". Could that be the problem?
    Anyone have an idea?
    Thx a lot

  • Uploading large files using nio in http client

    Hi,
    I'm developing a multithreaded Swing client which needs to be capable of uploading and downloading large ( 100mb ) files to servlets via http.
    Downloading has presented no problem, but writing to the OutputStream of a URLConnection, whilst fine for small files, gave me 'out of memory' errors on large ones. ( As I understand it, the whole stream is buffered before sending and it just runs out of room.)
    So I replaced the URLConnection with a SocketChannel, and after writing the http header, used a loop to write chunks of data from a FileChannel.
    I needed to monitor any incoming data to truncate the upload if the servlet gives a premature response indicating an error condition, so I set the SocketChannel to non-blocking and put a read into the loop.
    Immediately I got a bunch of Swing exceptions and the frame was unable to render itself, I guess due to thread starvation problems.
    I currently have this clunky code that seems to work ok:
    int percent = 0;
    long total = 0L;
    long chunk = 1024 * 8;
    // fc is the FileChannel of the file being uploaded, length its size in bytes,
    // socketChannel the connection to the servlet, and buffer a small ByteBuffer
    // used to sniff any early (error) response from the server.
    while (total < length) {
        chunk = ((length - total) < chunk) ? length - total : chunk;
        long ii = fc.transferTo(total, chunk, socketChannel);
        total += ii;
        yield();
        // Briefly poll for a premature server response without blocking the upload.
        socketChannel.configureBlocking(false);
        if (socketChannel.read(buffer) > 0) break;
        socketChannel.configureBlocking(true);
    }
    What I can't figure out is why Swing objects if I permanently set the SocketChannel to non-blocking rather than just whilst I read it, and why I get the following exception from a polling thread that runs concurrently:
    'No buffer space available (maximum connections reached?)'
    Anyone got any ideas or a better way to do it?
    Chris

    "I assume that the OutputStream obtained from URLConnection is internally buffered, and URLConnection waits until the whole stream has been input before inserting the initial Content-Length header. Only when it knows the complete length is the content sent."
    Hmm, this is quite unfortunate. Especially since it is not necessary: HTTP allows the close of the data stream to serve as the end-of-data marker (Content-Length is an optional header; in its absence the receiver simply reads all of the data until the sender closes the stream).
    "Has anyone else had experience of sending large (100mb) files through Java http clients on small PCs?"
    I've sent large-ish (~1-10 MB) payloads via URLConnection but not as large as yours. Interesting issue to figure out... I am out of ideas for the moment.

  • Out of memory when i render HD with many layers

    I get an error message ("out of memory") when I render full HD 1920 x 1080 in Final Cut Pro 5 with many layers (6 to 9 layers). I have 5.5 GB of memory, and when it crashes 1.5 GB of memory is still free in my Activity Monitor. If I buy more RAM, will my problem be resolved? Please help me if anybody knows about this problem.
    thank
    Yan

    I am having the same problem...(out of memory on renders)
    I've been watching my activity monitor and cancelling the render before I run out of memory... but I'm not getting much done and I can't do a big render at night.
    I've done all the usual things... no CMYK files, trashed preferences, etc.
    Two questions:
    I am using Magic Bullet filters and they seem to stack up the inactive RAM. (When the inactive RAM fills up, that's when I get the "out of memory".) You said you saw a post about that a few days ago... Was there an answer besides changing out the filters, or can you point me towards the posting?
    I'm also trying to figure out how to dump or clear the inactive RAM once I'm done with a render. I found that if I do a full restart I can clear my RAM and get a bunch of renders done before I fill up the inactive RAM again. This is going to be a long project if I have to do that.
    thanks
    Mark

  • TS1368 when downloading large files such as videos of television season do you need to turn off autolock to prevent download from halting

    When downloading large files such as videos of a television season, do you need to turn off auto-lock to prevent the download from halting? And if so, if the download will take a significant amount of time (not sure how much offhand), how do you prevent burn-in from the prolonged screen image?

    Among the alternatives not mentioned... Using a TiVo DVR, rather than the X1; a Roamio Plus or Pro would solve both the concern over the quality of the DVR and provide the MoCA bridge capability the poster so desperately wanted the X1 DVR to provide. (Although TiVos support only MoCA 1.1.) Just get a third-party MoCA adapter for the distant location. Why the hang-up on having a device provided by Comcast? This seems especially ironic given the opinions expressed regarding payments over time to Comcast. If a MoCA 2.0 bridge was the requirement, they don't exist outside providers. So couldn't the poster have simply requested a replacement XB3 from the local office and configured it down to only providing MoCA bridging -- and perhaps as a wireless access point? Comcast would bill him the monthly rate for the extra device, but such is the state of MoCA 2.0. Much of the OP sounds like frustration over devices not providing capabilities the poster *thinks* they should have.

  • Ora-04030 process out of memory when trying to allocate nn bytes..

    Hi,
    Again we are facing the same error: after increasing pga_aggregate_target from 2G to 3G, the Journal Import program request still ends in ORA-04030 (process out of memory when trying to allocate nn bytes).
    Could this be a bug?
    DBMS: 9.2.0.3
    OS: Sun SPARC, SunOS 5.10

    If possible, apply the latest patchset to bring your database to the latest version, which should be more reliable and resolve many known bugs.
    Do you have enough free physical memory to use the 3 GB?
    Hope this helps.
    Regards.
    FR.

  • "General Error" and "Out of Memory" for only certain files?

    I have Final Cut Express 4 on Mac OS X Leopard and it has been working fine, up until now.
    For some reason, when I try to view two clips in the tab to the left of the screen to find sections of them to put into my project, I get a message that says "General Error", and when I click OK it says "Error: Out of Memory". They are both MP4 files, 242.9 MB and 294.2 MB, and I have viewed them both in QuickTime. I don't understand why it says it is out of memory when I still have 7 GB left on my computer and it still lets me view and add other files to my sequence.
    Can someone help me out and tell me how to fix this? I'd really appreciate advice!

    MPEG-4 is not a format that works in FCE. You'll have to convert it to one of FCE's formats. Without knowing details about the original format, it's impossible to say what you should convert it to.

  • Processing Large Files using Chunk Mode with ICO

    Hi All,
    I am trying to process large files using an ICO. I am on PI 7.3 and I am using the new PI 7.3 feature to split the input file into chunks.
    I know that we cannot use mapping while using chunk mode.
    While trying I noticed the points below:
    1) I created a Data Type, Message Type and Interfaces in the ESR and used them in my scenario (no mapping was defined); sender and receiver data types were the same.
    Result: the scenario did not work. It created only one chunk file (.tmp file) and terminated.
    2) I used a dummy interface in my scenario and it worked fine.
    So, please confirm whether we should always use dummy interfaces in the scenario while using chunk mode in PI 7.3, or is there something I am missing?
    Thanks in Advance,
    - Pooja.

    Hello,
    According to this blog:
    File/FTP Adapter - Large File Transfer (Chunk Mode)
    The following limitations apply to the chunk mode in File Adapter
    As per the screenshots in that blog, the split never considers the payload; it's just a binary split. So the following limitations apply:
    Only for File Sender to File Receiver
    No Mapping
    No Content Based Routing
    No Content Conversion
    No Custom Modules
    You are probably doing content conversion, which is why it is not working.
    Hope this helps,
    Mark
    Edited by: Mark Dihiansan on Mar 5, 2012 12:58 PM

  • When moving a file using drag and drop, as I hover over the destination folder it no longer opens, so I can't drop the file into the folder.  Why?

    When moving a file using drag and drop, as I hover over the destination folder it no longer opens, so I can't drop the file into the folder.  Why?  I have a hierarchical embedded folder structure under \Documents.  Usually, as I drag the file to be moved over the various folders, they blink twice, then expand/open to reveal their contents (in Finder).  Now, when the file to be moved is held (hovering) over the destination folder, it will not blink or expand, so there's no place to drop the file.  If I release the mouse button, the file image disappears, showing that it is "going back to the place it came from", i.e. it was not moved.  This just started happening.  I noticed it after upgrading to Mavericks.  Is this a bug, or could anything else be causing this?

    As of today, Apple Support discovered during a remote support session with me that the same thing happens on their own system.  It is isolated to "Column View"; the Icon and List views in Finder behave correctly.  Issue is being escalated to engineering, and Apple will call to report status in a couple days.  In all likelihood, this is a bug that will need to be addressed in a future update.

  • Does logging out of Firefox when you're not using it save energy?

    Have been reading about how much energy servers consume. If I log out of Firefox when I'm not using it, does that decrease the amount of energy servers are using?

    If you've never replaced the hard drive, it could be slowing down - 3 to 5 years is the typical lifespan for a notebook hard drive.
    Do you know how much RAM is in the computer?
    If the computer mostly works OK, you may just need to invest in it a little - replacing the hard drive & upgrading the RAM will breathe new life into it, most likely!
    You do not need to shut it down each time you're done using it - in fact, I leave my MacBook asleep 90% of the time and reboot it every couple of weeks or when it seems to be getting a bit sluggish. I've been doing that for the last 7 years and she's still chugging along.
    ~Lyssa
