Importing a large PTE file

I am trying to import a very large file and I am getting this error:
com.plumtree.openfoundation.io.XPIOException: Posted content length of 41225780 exceeds limit of 20971520
     com.plumtree.openfoundation.util.XPException.GetInstance(XPException.java:385)
     com.plumtree.openfoundation.util.XPException.GetInstance(XPException.java:350)
     com.plumtree.openfoundation.web.XPRequest.InitRequest(XPRequest.java:201)
     com.plumtree.openfoundation.web.XPRequest.<init>(XPRequest.java:111)
     com.plumtree.uiinfrastructure.web.XPPage.service(XPPage.java:283)
     javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
Anyone know where to increase this limit?
andrew morris | bdg | [email protected]

Try modifying the file $PT_HOME/settings/common/serverconfig.xml.
I've found a parameter there named gateway:maximum-upload-size; raising it should fix this.
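As a rough sketch (the exact element structure is an assumption, so check your own serverconfig.xml for the real syntax), the idea is to raise the limit above the 41225780 bytes being posted. The 20971520 in the stack trace is 20 MB; 52428800 would be 50 MB:

    <!-- Sketch only: element names may differ in your ALUI version -->
    <setting name="gateway:maximum-upload-size">
      <value xsi:type="xsd:integer">52428800</value>  <!-- 50 MB, up from the 20 MB default -->
    </setting>

You'll likely need to restart the portal application for the change to take effect.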

Similar Messages

  • Hello, I would like to know how to import a large .pdf file to my iPad?

    This document should help:
    http://forums.adobe.com/docs/DOC-2532
    How large? It could be limited by the RAM in your iPad.

  • Importing a large wmv file.

    I am trying to import a large wmv file (1 hr 6 min) into Premiere Pro CS6 so that I can blur some faces (it is a traffic stop), but the file will not import. I have done this before several times, but the largest file was 16 min. Is this file too big to import? For legal reasons it cannot be edited in any way except to blur the faces. I was going to divide it into smaller files to import, but was told it had to be intact.

    The duration should not make any difference.
    What is the error message you get, or what happens when you try to import it?
    Have you tried another wmv file as a test?

  • Import very large csv files into SAP

    Hi
    We're not using PI, but have middleware called Trading Networks. Our design is fixed (not my decision): we do not upload files to the Application Server and import them from there. Instead, the design dictates that we must write RFCs, and Trading Networks will call the RFC per interface with the very large file sent as a table of strings. This takes 14 minutes to import into a plain Z-table in SAP, from where we'll integrate. As a test we uploaded the file to the Application Server and integrated it into the Z-table from there. This took 4 minutes. However, our architect is not impressed that we'll stretch the available Application Server to its limits.
    I want to propose that the large file be split into, e.g., 4 parts at the Trading Networks level, calling 4 threads of the RFC, which could reduce integration time to maybe 3 minutes 30 seconds. Does anyone have suggestions in this regard, especially a proposed, working, elegant solution for integrating large files with our current environment? This will form the foundation of our project.
    Thank you and best regards,
    Adrian

    Zip compression can be tried. The RFC would receive a zip stream, which can be decompressed on the SAP side using CL_ABAP_ZIP; a rough sketch follows.
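    A minimal ABAP sketch of the receiving side, assuming the RFC gets the compressed payload as an XSTRING and the archive contains a single entry named data.csv (both names are hypothetical):

    " Sketch only: unpack a zipped payload inside the RFC
    DATA: lo_zip     TYPE REF TO cl_abap_zip,
          lv_zipped  TYPE xstring,   " compressed stream from Trading Networks
          lv_content TYPE xstring.   " decompressed bytes

    CREATE OBJECT lo_zip.
    lo_zip->load( zip = lv_zipped ).                " parse the zip archive
    lo_zip->get( EXPORTING name    = 'data.csv'     " hypothetical entry name
                 IMPORTING content = lv_content ).  " raw file contents

    From there, convert lv_content to character data and insert into the Z-table as before. The win is in transfer time; the ABAP-side decompression itself is fast.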

  • Importing a large audio file. How do I expand the 'measure' limit?

    I can import the large WAV file no problem. My problem is I cannot listen to it all, because the default 'measure' limit is somewhere in the 200's and the imported file is significantly longer than that. I can drag it at the top to be as long as I like, but it takes a long time to manually drag it out. There's got to be an easier way to do this. Any help would be great.

    The actual figure from the manual is "8550 quarter notes, or about 2158 bars in 4/4 time".
    Whatever the time signature or tempo, the limiting factor is the number of quarter notes — 8550 of them. So if your file would extend over more quarter notes than that at the current tempo, then adjust the tempo to be as low as is necessary to give you the duration you require.
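    To put numbers on it (assuming the quarter note carries the beat): the maximum duration in minutes is 8550 divided by the tempo in bpm. At 120 bpm that is 8550 / 120 ≈ 71 minutes; dropping to 40 bpm stretches it to 8550 / 40 ≈ 214 minutes. So for a file D minutes long, any tempo at or below 8550 / D bpm will accommodate it.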

  • Camera tracker not working on large .mov file

    I am using the camera tracker and it works perfectly on a small .mov file; however, once I import a larger .mov file everything crumbles. Text doesn't work, points don't show up, etc.
    Any ideas?

    The first idea is for you to provide the name of the program you are using, so a moderator can move this message to that program's forum, where actual users may respond.

  • Is there a way to import large XML files into HANA efficiently? Are there any data services provided to do this?

    1. Is there a way to import large XML files into HANA efficiently?
    2. Will it process it node by node or the entire file at a time?
    3. Are there any data services provided to do this?
    This is for a project use case. I also have a requirement to process bulk XML files; please suggest how to accomplish this task.

    Hi Patrick,
    I am addressing a similar issue: "Getting data from huge XMLs into HANA."
    Using OData services, can we handle huge data (i.e. create the schema / load into HANA) on the fly?
    In my scenario, I get a folder of different complex XML files which are to be loaded into the HANA database.
    Then I need to transform & cleanse the data.
    Can I use OData services to transform and cleanse the data?
    If so, how can I create OData services dynamically?
    Any help is highly appreciated.
    Thank you.
    Regards,
    Alekhya

  • PTE file imports with different versions of ALUI?

    Hi,
    I am wondering if anyone knows of any issues with the import/export of data in ALUI via the PTE file from one version of ALUI to another. For example, are there any issues with exporting data from v6.5 of ALUI and importing it into v6.0 or v6.1. I have been able to do this before, but I don't know if ALUI is happy with seeing a newer version of the PTE file that it is trying to process. Any thoughts you have are appreciated!

    You could try renaming the new file with the old name, but I don't think
    that's going to be a recipe for success.
    Other than that, you're using a very dangerous workflow. Everything here
    is designed to revolve around a single InDesign file with however many
    assignments or packages there are.
    Bob

  • I imported a large file of photos from my computer to my iPhone by accident. How can I remove them?

    I imported a large file of photos from my computer to my iPhone by accident. How can I remove them all?

    Resync to iTunes. Images placed in the Photos app via sync cannot be deleted directly from the device.

  • Can't Import large .avi file from digital camera into iPhoto 08

    I have a digital camera with a 4GB SD card. I was shooting video and have several small .avi files on the card and one large .avi file (2.67 GB) on the same card. I did not have a problem importing the small files. iPhoto hung while trying to import the large file. I read in this forum that I should use Image Capture, which I tried. When I used Image Capture on the smaller files (just to see how it worked), my camera would flash while it was transferring. When I tried Image Capture on the large file, the camera did not flash and after about 20 minutes, there didn't seem to be anything going on with the transfer. I also tried an old card reader through Image Capture, and that didn't work either. Should Image Capture be able to handle this size file? Do you know of any way to break up the large file into smaller files on the SD card? What am I missing? Thanks!

    Canon's cameras do not mount on the Desktop. It's a 'feature' of the brand.
    There are several possibilities for Image Capture - including minor directory damage on the card, etc. The only solution is to get the card to mount, and that means a recent card reader.
    Regards
    TD

  • Trouble Importing Large mp3 File

    I have a very large mp3 file (extension is .mp3, lower-cased) and I'm trying to import it into GarageBand so I can split it up into multiple files to make multiple podcasts out of it. The file is roughly 500MB and 8.5 hours long. I did not record it (hotel sound recording staff at our conference did) so I don't know what sort of settings were used in saving it as an mp3. It imports into iTunes and plays fine there.
    When I try to drag it into a podcast track, GarageBand gives me an "Importing" message for a minute or two and then nothing happens (blank track, file not there). The same thing happens if I try to drag the version that plays fine in iTunes into GarageBand.
    What can I do to get my file to a point where I can edit it?
    If I need to convert it to something other than an mp3, can you tell me what (free?) software I need to download in order to do so?
    Thank you!

    awpwebmaster wrote:
    What can I do to get my file to a point where I can edit it?
    try the Never-Fail™ conversion method:
    http://www.bulletsandbones.com/GB/GBFAQ.html#neverfail
    (Let the page FULLY load. The link to your answer is at the top of your screen)

  • Importing large .m2v files problem

    I am currently using the CS3 version of Encore and have come across a problem when trying to import a large (3.1 GB) m2v file that was created in Premiere Pro CS3.
    When I try to import, Encore starts to import but then suddenly closes down completely. I can import smaller m2v files without a problem.
    Does anyone have any suggestions?
    My system is an Intel Core 2 Quad QX6800, 4 GB RAM, and a GeForce 8800 GTX graphics card, and I'm using the XP Pro operating system.
    Ian

    I had similar problems with CS3. This is my research and solution (so far...)
    When outputting MPG files for HD, if you don't do multiplexing in Premiere, it generates an M2V file and a WAV file. Encore basically crashed on every import of the m2v (irrespective of importing as timeline or as asset) if it was larger than some value (experimentally around 8 GB).
    Someone suggested that it had to do with the XMPSES files, so put the m2v and wav files in a separate directory and import from there (or delete the XMP and XMPSES files). I tried this and it does work, but the chapter-point information is lost, so in the end I tried to avoid creating >7 GB files for import into Encore CS3. However, later with another project, Encore did not even accept 4 GB files.
    In the end I found the solution to be to have Premiere generate a transport stream (i.e. a multiplexed m2t file instead of two separate files for audio and video (wav and m2v)). Get a license for the AC3 encoder, switch on multiplexing, and Encore happily imports the resulting files without any problem (until now).
    Hope this helps
    Lucien

  • Problem importing large AVI files into iMovie '11

    I have a large AVI file, DV, from an old video camera that I want to import into iMovie '11. The file is 10.37 GB, but when I import it to iMovie only a few scenes are visible. Also, iMovie says the file is 4.71 GB.
    How do I import the whole file?
    Thanks
    /Torbjörn

    If you can convert the AVI movie back to QuickTime on the PC, that would be your best option. AVI files are limited to 4 GB in size; to get around this limit, the AVI file might be referencing other, smaller files on the PC. The AVI file format has been pretty much abandoned by Microsoft, which now uses the WMV format.

  • Having a problem importing a large file from a Sony DR-60 external drive via Log and Transfer in FCP. This is the first time this has happened; I have the Sony plug-in. HD disc compatibility is sorted, as the first 2 short clips are imported. Any advice will be welcome.

    Problem importing a large file from a Sony DR-60 external drive to FCP. I've never had a problem before; the first two short clips are captured in Log and Transfer, but the larger 2-hour clip only downloads the first few seconds. Any advice?

    I can think of 2 things to check.
    1. The Scratch disc has plenty of space.
    2. System Settings > Scratch Discs does not have a restriction in the 'Limit Capture/Export File Segment Size To:' field.
    Al

  • Import Large XML File to Table

    I have a large (819MB) XML file I'm trying to import into a table in the format:
    <ROW_SET>
    <ROW>
    <column_name>value</column_name>
    </ROW>
    <ROW>
    <column_name>value</column_name>
    </ROW>
    </ROW_SET>
    I've tried importing it with xmlsequence(...).extract(...) and ran into the 'number of nodes exceeds maximum' error.
    I've tried importing it with XMLTable(... passing XMLTYPE(bfilename('DIR_OBJ','large_819mb_file.xml'), nls_charset_id('UTF8'))) and I gave up after it ran for 15+ hours ( COLLECTION ITERATOR PICKLER FETCH issue ).
    I've tried importing it with:
    insCtx := DBMS_XMLStore.newContext('schemaname.tablename');
    DBMS_XMLStore.clearUpdateColumnList(insCtx);
    DBMS_XMLStore.setUpdateColumn(insCtx,'column1name');
    DBMS_XMLStore.setUpdateColumn(insCtx,'columnNname');
    ROWS := DBMS_XMLStore.insertXML(insCtx, XMLTYPE(bfilename('DIR_OBJ','large_819mb_file.xml'), nls_charset_id('UTF8')));
    and ran into ORA-04030: out of process memory when trying to allocate 1032 bytes (qmxlu subheap,qmemNextBuf:alloc).
    All I need to do is read the XML file and move the data into a matching table in a reasonable time. Once I have the data in the database, I no longer need the XML file.
    What would be the best way to import large XML files?
    Oracle Database 11g Release 11.2.0.1.0 - 64bit Production
    PL/SQL Release 11.2.0.1.0 - Production
    "CORE     11.2.0.1.0     Production"
    TNS for Linux: Version 11.2.0.1.0 - Production
    NLSRTL Version 11.2.0.1.0 - Production

    This (rough) approach should work for you.
    CREATE TABLE HOLDS_XML
            (xml_col XMLTYPE)
          XMLTYPE xml_col STORE AS SECUREFILE BINARY XML;
    INSERT INTO HOLDS_XML
    VALUES (xmltype(bfilename('DIR_OBJ','large_819mb_file.xml'), nls_charset_id('UTF8')));
    -- Should be using AL32UTF8 for DB character set with XML
    SELECT ...
      FROM HOLDS_XML HX,
           XMLTable(...
              PASSING HX.xml_col ...)
    How it differs from your approach:
    By using the HOLDS_XML table with SECUREFILE BINARY XML storage (which became the default in 11.2.0.2) we are providing a place for Oracle to store a parsed version of the XML. This allows the XML to be stored on disk instead of in memory. Oracle can then access the needed pieces of XML from disk by streaming them instead of holding the whole XML in memory and parsing it repeatedly to find the information needed. That is what COLLECTION ITERATOR PICKLER FETCH means. A lot of memory work. You can search on that term to learn more about it if needed.
    The XMLTable approach then simply reads this XML from disk and should be able to parse the XML with no issue. You have the option of adding indexes to the XML, but since you are simply reading it all one time and tossing it, there is most likely no advantage to indexes.
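    To make the elided parts concrete, here is a hypothetical completion of that skeleton. The target table and the col1/col2 names and types are illustrative placeholders only - substitute the real element names from your <ROW> structure:

    -- Sketch only: schemaname.tablename and col1/col2 are hypothetical
    INSERT INTO schemaname.tablename (col1, col2)
    SELECT X.col1, X.col2
      FROM HOLDS_XML HX,
           XMLTable('/ROW_SET/ROW'
                    PASSING HX.xml_col
                    COLUMNS col1 VARCHAR2(100) PATH 'col1',
                            col2 NUMBER        PATH 'col2') X;

    Because the XML sits in a binary-XML column, Oracle can stream the /ROW_SET/ROW nodes from disk rather than holding the whole 819 MB document in process memory.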
