Uploading larger files with FileUpload Utility

Hi,
I have the following problem. My customer is using the FileUpload utility (http://www.oracle.com/technology/sample_code/products/forms/extracted/hyperlink/fileupload.html)
to upload files from the client onto the server.
This utility works fine as long as the file size stays below roughly 6 MB.
I have now seen that there are "known restrictions" with this utility and that larger files can cause problems.
Does anyone know this utility and can help me adjust it so that files bigger than 6 MB can also be uploaded? Or can anyone give a hint on how to solve this problem another way?
Regards
Michael
Edited by: MichaelD on Feb 5, 2009 4:18 PM

Well, this seems to be a memory problem when reading the file into a String (presumably a lot of unused intermediate String objects pile up in memory while the file is being read).
But the main question is: why not use WebUtil instead of the Upload Bean demo?
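For anyone hitting the same wall before switching to WebUtil: the usual fix for this kind of memory blow-up is to stream the file in fixed-size byte chunks instead of accumulating it in a String. A minimal, generic sketch (class and method names are my own, not part of the FileUpload utility):

```java
import java.io.*;

public class ChunkedCopy {
    // Copies a file of any size using a fixed 8 KB buffer, so memory use
    // stays constant regardless of file size. Accumulating the content in
    // a String instead allocates many intermediate objects and eventually
    // exhausts the heap on multi-megabyte files.
    public static long copy(File src, File dst) throws IOException {
        long total = 0;
        InputStream in = new BufferedInputStream(new FileInputStream(src));
        OutputStream out = new BufferedOutputStream(new FileOutputStream(dst));
        try {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) >= 0) { // -1 signals end of file
                out.write(buf, 0, n);         // write only what was read
                total += n;
            }
        } finally {
            out.close();
            in.close();
        }
        return total;
    }
}
```

The same pattern applies on the upload path: read a chunk, send the chunk, discard the buffer contents, repeat.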
regards

Similar Messages

  • Can't create log file with java.util.logging

    Hi,
    I have created a class to create a log file with java.util.logging
    This class works correctly as standalone (without jdev/weblogic)
    import java.io.IOException;
    import java.text.DateFormat;
    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.logging.*;
    public class LogDemo {
         private static final Logger logger = Logger.getLogger( "Logging" );
         public static void main( String[] args ) throws IOException {
             Date date = new Date();
             DateFormat dateFormat = new SimpleDateFormat("yyyyMMdd");
             String dateStr = dateFormat.format(date);
             String logFileName = dateStr + "SEC" + ".log";
             Handler fh;
             try {
               fh = new FileHandler(logFileName);
               //fh.setFormatter(new XMLFormatter());
               fh.setFormatter(new SimpleFormatter());
               logger.addHandler(fh);
               logger.setLevel(Level.ALL);
               logger.log(Level.INFO, "Initialization log");
               // force a bug
               ((Object)null).toString();
             } catch (IOException e) {
                  logger.log( Level.WARNING, e.getMessage(), e );
             } catch (Exception e) {
                  logger.log( Level.WARNING, "Exception", e);
             }
         }
    }
    But when I use this class...
    import java.io.File;
    import java.io.IOException;
    import java.text.DateFormat;
    import java.text.SimpleDateFormat;
    import java.util.Date;
    import java.util.logging.FileHandler;
    import java.util.logging.Handler;
    import java.util.logging.Level;
    import java.util.logging.Logger;
    import java.util.logging.XMLFormatter;
    public class TraceUtils {
      public static Logger logger = Logger.getLogger("log");
      public static void initLogger(String ApplicationName) {
        Date date = new Date();
        DateFormat dateFormat = new SimpleDateFormat("yyyyMMdd");
        String dateStr = dateFormat.format(date);
        String logFileName = dateStr + ApplicationName + ".log";
        Handler fh;
        try {
          fh = new FileHandler(logFileName);
          fh.setFormatter(new XMLFormatter());
          logger.addHandler(fh);
          logger.setLevel(Level.ALL);
          logger.log(Level.INFO, "Initialization log");
        } catch (IOException e) {
          System.out.println(e.getMessage());
        }
      }
    }
    and when I call it in a backing bean, I see the message in the console but the log file is not created.
    TraceUtils.initLogger("SEC");
    Why?
    Thanks for your help.

    I have uncommented this line in logging.properties and it works:
    # To also add the FileHandler, use the following line instead.
    handlers= java.util.logging.FileHandler, java.util.logging.ConsoleHandler
    But I have another problem: JDeveloper ignores the parameters of the FileHandler constructor.
    It creates a general log file, plus additional log files each time I call the method logp.
    So I played with these parameters
    fh = new FileHandler(logFileName,true);
    fh = new FileHandler(logFileName,0,1,true);
    fh = new FileHandler(logFileName,10000000,1,true);
    without success. I want only one log file; how can I do that?
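    For reference, java.util.logging's file rotation is driven by the FileHandler constructor arguments, and the stray extra files usually appear because a new FileHandler is created (while the old one still holds a lock on the file) on every call to the init method. A hedged sketch that keeps a single file (class, logger, and file names are made up):

```java
import java.io.IOException;
import java.util.logging.*;

public class SingleFileLog {
    private static final Logger logger = Logger.getLogger("single");

    // Call once at startup: one FileHandler, no size-based rotation
    // (limit 0 = unlimited), a single file (count 1), append across runs.
    public static void init(String fileName) throws IOException {
        FileHandler fh = new FileHandler(fileName, 0, 1, true);
        fh.setFormatter(new SimpleFormatter());
        logger.addHandler(fh);
        logger.setUseParentHandlers(false); // keep records out of the console handler
        logger.setLevel(Level.ALL);
    }

    public static Logger get() { return logger; }
}
```

    The key is to call init exactly once per JVM; creating a second FileHandler on the same pattern makes java.util.logging fall back to a new file name, which is the "another log file each time" symptom described above.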

  • Large file with fstream with WS6U2

    I can't read large files (>2GB) with the STL fstream. Can anyone do this, or is this a limitation of the WS6U2 fstream classes? I can read large files with low-level C functions.

    I thought that WS6U2 meant Forte 6 Update 2. As for more information: the OS is SunOS 5.8, and the file system is NFS-mounted from an HP-UX 11.00 box and is largefile-enabled. My belief is that fstream does not implement access to large files, but I can't be sure.
    Anyway, I'm not sure what you mean by the compiler's access to the OS largefile support, but as I mentioned before, I can read more than 2GB with open() and read(). My problem is with fstream. My belief is that fstream must be largefile-enabled. Any idea?

  • Problem unzipping larger files with Java

    When I extract small zip files with Java it works fine, but when I extract large zip files I get errors. Can anyone help me out, please?
    import java.io.*;
    import java.util.*;
    import java.net.*;
    import java.util.zip.*;
    public class updategrabtest {
         public static String filename = "";
         //public static String filesave = "";
         public static boolean DLtest = false, DBtest = false;
         // update
         public static void main(String[] args) {
              System.out.println("Downloading small zip");
              download("small.zip"); // a few k
              System.out.println("Extracting small zip");
              extract("small.zip");
              System.out.println("Downloading large zip");
              download("large.zip"); // 1 meg
              System.out.println("Extracting large zip");
              extract("large.zip");
              System.out.println("Finished.");
              // update database
              boolean maindb = false; //database wasn't updated
         }
         // download
         public static void download(String filesave) {
              try {
                   java.io.BufferedInputStream in = new java.io.BufferedInputStream(
                        new java.net.URL("http://saveourmacs.com/update/" + filesave).openStream());
                   java.io.FileOutputStream fos = new java.io.FileOutputStream(filesave);
                   java.io.BufferedOutputStream bout = new BufferedOutputStream(fos, 1024);
                   byte data[] = new byte[1024];
                   int count;
                   // write only the bytes actually read; writing the whole
                   // buffer on every pass pads the file with stale bytes
                   while ((count = in.read(data, 0, 1024)) >= 0)
                        bout.write(data, 0, count);
                   bout.close();
                   in.close();
              }
              catch (Exception e) {
                   System.out.println ("Error writing to file");
                   //System.exit(-1);
              }
         }
         // extract
         public static void extract(String filez) {
              filename = filez;
              try {
                   getZipFiles();
              }
              catch (Exception e) {
                   e.printStackTrace();
              }
         }
         // extract (part 2)
         public static void getZipFiles() {
              try {
                   //String destinationname = ".\\temp\\";
                   String destinationname = ".\\";
                   byte[] buf = new byte[1024]; //1k
                   ZipInputStream zipinputstream = null;
                   ZipEntry zipentry;
                   zipinputstream = new ZipInputStream(new FileInputStream(filename));
                   zipentry = zipinputstream.getNextEntry();
                   while (zipentry != null) {
                        //for each entry to be extracted
                        String entryName = zipentry.getName();
                        System.out.println("entryname " + entryName);
                        int n;
                        FileOutputStream fileoutputstream;
                        File newFile = new File(entryName);
                        String directory = newFile.getParent();
                        if (directory == null) {
                             if (newFile.isDirectory())
                                  break;
                        }
                        fileoutputstream = new FileOutputStream(destinationname + entryName);
                        while ((n = zipinputstream.read(buf, 0, 1024)) > -1)
                             fileoutputstream.write(buf, 0, n);
                        fileoutputstream.close();
                        zipinputstream.closeEntry();
                        zipentry = zipinputstream.getNextEntry();
                   } //while
                   zipinputstream.close();
              }
              catch (Exception e) {
                   e.printStackTrace();
              }
         }
    }

    In addition to the other advice, also change every instance of..
    kingryanj wrote:
              catch (Exception e)
                   System.out.println ("Error writing to file");
                   //System.exit(-1);
    ..to..
    catch (Exception e) {
         e.printStackTrace();
    }
    I am a big fan of the stacktrace.
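    One more likely culprit, in addition to the advice above: the extraction code never creates parent directories, so any zip containing a folder hierarchy fails with FileNotFoundException on the first nested entry. A more defensive extraction sketch (class and variable names are illustrative, not from the original post):

```java
import java.io.*;
import java.util.zip.*;

public class SafeUnzip {
    // Extracts a zip, creating parent directories as needed and
    // writing only the bytes actually read on each pass.
    // Returns the number of file entries extracted.
    public static int extract(File zipFile, File destDir) throws IOException {
        int entries = 0;
        ZipInputStream zin = new ZipInputStream(
                new BufferedInputStream(new FileInputStream(zipFile)));
        try {
            ZipEntry entry;
            byte[] buf = new byte[8192];
            while ((entry = zin.getNextEntry()) != null) {
                File out = new File(destDir, entry.getName());
                if (entry.isDirectory()) {
                    out.mkdirs();
                } else {
                    out.getParentFile().mkdirs(); // nested entries need this
                    FileOutputStream fos = new FileOutputStream(out);
                    int n;
                    while ((n = zin.read(buf)) > 0)
                        fos.write(buf, 0, n);
                    fos.close();
                    entries++;
                }
                zin.closeEntry();
            }
        } finally {
            zin.close();
        }
        return entries;
    }
}
```

    (For untrusted archives you would also want to reject entry names containing "..", but that is beyond the scope of this thread.)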

  • Can't print large files with 10.5.7

    Hi,
    we finally have updated our G5s (2x2 GHz, 4 GB RAM, 500 GB system disk). Since then we are unable to print documents from CS3 and CS4 which result in CUPS print files > 1 GB. It doesn't work with GMG RIPs, HP LaserWriters, or Adobe PDF printers. The jobs always show up as "stopped" in the queue. The CUPS error log says:
    I [11/Jun/2009:16:42:09 +0200] [Job 55] Adding start banner page "none".
    I [11/Jun/2009:16:42:09 +0200] [Job 55] Adding end banner page "none".
    I [11/Jun/2009:16:42:09 +0200] [Job 55] File of type application/pictwps queued by "dididi".
    I [11/Jun/2009:16:42:09 +0200] [Job 55] Queued on "EpsonISOcoated_39L02" by "dididi".
    I [11/Jun/2009:16:42:09 +0200] [Job 55] Started filter /usr/libexec/cups/filter/pictwpstops (PID 1363)
    I [11/Jun/2009:16:42:09 +0200] [Job 55] Started backend /usr/libexec/cups/backend/pap (PID 1364)
    E [11/Jun/2009:16:42:10 +0200] [Job 55] can't open PSStream for stdout
    E [11/Jun/2009:16:42:10 +0200] [Job 55] pictwpstops - got an error closing the PSStream = -50
    E [11/Jun/2009:16:42:10 +0200] PID 1363 (/usr/libexec/cups/filter/pictwpstops) stopped with status 1!
    I [11/Jun/2009:16:42:10 +0200] Hint: Try setting the LogLevel to "debug" to find out more.
    E [11/Jun/2009:16:42:12 +0200] [Job 55] Job stopped due to filter errors.
    We have tested a catalog file with 150 pages which didn't print. If we broke it up into 6 pieces (print pages 1-25, 26-50, ...) it did work. Unfortunately we have jobs with single pages exceeding this limit. All together we have tested around 10 different files.
    We have tried resetting the printer queue, reinstalling the drivers, and deleting and recreating all printers, but nothing helps. It looks like CUPS or one of its helpers can't handle files larger than 1 GB?
    Any help is welcome.

    Hi there,
    Does your printer/RIP support protocols other than AppleTalk? If it does, then I would change it to something like HP Jetdirect-Socket.
    PaHu

  • Handling large files with FTP in XI

    Hi All,
    I have a file scenario where I have to post a file larger than 500 MB, with up to 700 fields in each line.
    Another scenario worked fine with a file size below 70 MB and fewer fields.
    Could anyone help me in handling such a scenario, with a large file size and without splitting the file?
    1) From your previous experience did you use any Tools to help with the development of the FTP interfaces?
    2) The client looked at ItemField, but is not willing to use it, due to the licensing costs. Did you use any self-made pre-processors?
    3) Please let me know the good and bad experiences you had when using the XI File Adapter?
    Thanks & Regards,
    Raghuram Vedagiri.

    500 MB is huge. XI will not be able to handle such a huge payload, for sure.
    Are you using XI as a mere FTP, or are you using content conversion with mapping etc.?
    1. Either use splitting logic to split the file outside XI (using scripts) and then let XI handle these files,
    2. or size your hardware (Java heap etc.) to make sure that XI can handle this file (not recommended, though). SAP recommends a size of 5 MB as the optimum.
    Regards
    Bhavesh
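    To make option 1 concrete, a splitter along these lines could run outside XI (the chunk size and the .partN naming are arbitrary choices of this sketch, not anything XI prescribes):

```java
import java.io.*;

public class FileSplitter {
    // Splits a large file into numbered parts of roughly chunkBytes each,
    // e.g. data.txt.part0, data.txt.part1, ... Returns the part count.
    public static int split(File src, long chunkBytes) throws IOException {
        InputStream in = new BufferedInputStream(new FileInputStream(src));
        int part = 0;
        try {
            byte[] buf = new byte[8192];
            int n = in.read(buf);
            while (n >= 0) {
                OutputStream out = new BufferedOutputStream(
                        new FileOutputStream(src.getPath() + ".part" + part));
                long written = 0;
                // fill this part until the chunk budget is used up
                while (n >= 0 && written < chunkBytes) {
                    out.write(buf, 0, n);
                    written += n;
                    n = in.read(buf);
                }
                out.close();
                part++;
            }
        } finally {
            in.close();
        }
        return part;
    }
}
```

    Each part can then be picked up by the file adapter as a normal, smaller message. Note that a part can overshoot the nominal chunk size by up to one buffer, since the split only happens on buffer boundaries; splitting on line boundaries would need a line-aware reader instead.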

  • How to upload large file with http via post

    Hi guys,
    Does anybody know how to upload a large file (>100 MB) from an applet to a servlet over HTTP via the POST method? Thanks in advance.
    Regards,
    Mark.

    Hi SuckRatE
    Thanks for your reply. Could you give me some client-side code to upload a large file? I use URL to connect to the server, and it throws an out-of-memory exception. Part of the client code is below:
    // connect to the servlet
    URL theServlet = new URL(servletLocation);
    URLConnection servletConnection = theServlet.openConnection();
    // inform the connection that we will send output and accept input
    servletConnection.setDoInput(true);
    servletConnection.setDoOutput(true);
    // Don't use a cached version of the URL connection.
    servletConnection.setUseCaches(false);
    servletConnection.setDefaultUseCaches(false);
    // Specify the content type: we will send binary data
    servletConnection.setRequestProperty("Content-Type", "application/octet-stream");
    // send the file to the servlet.
    OutputStream outStream = servletConnection.getOutputStream();
    FileInputStream filein = new FileInputStream(largeFile);
    //BufferedReader in = new BufferedReader(new InputStreamReader(servletConnection.getInputStream()));
    //System.out.println("tempCurrent = " + in.readLine());
    byte abyte[] = new byte[2048];
    int cnt = 0;
    while ((cnt = filein.read(abyte)) > 0)
         outStream.write(abyte, 0, cnt);
    filein.close();
    outStream.flush();
    outStream.close();
    Regards,
    Mark.
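    The out-of-memory error in code like the above is most likely HttpURLConnection buffering the entire request body in order to compute a Content-Length header. Since Java 5, setChunkedStreamingMode tells the connection to stream the body instead. A hedged sketch of the client side (the class name, URL, and file are placeholders; the servlet must read the request body as a stream):

```java
import java.io.*;
import java.net.*;

public class ChunkedPost {
    // Streams a file as an HTTP POST body without buffering it in memory.
    // Without setChunkedStreamingMode, HttpURLConnection holds the whole
    // body in RAM to compute Content-Length, which throws
    // OutOfMemoryError on files of a few hundred MB.
    public static int post(File file, URL servlet) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) servlet.openConnection();
        conn.setDoOutput(true);
        conn.setRequestMethod("POST");
        conn.setRequestProperty("Content-Type", "application/octet-stream");
        conn.setChunkedStreamingMode(8192); // stream in 8 KB chunks
        OutputStream out = conn.getOutputStream();
        InputStream in = new BufferedInputStream(new FileInputStream(file));
        try {
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) > 0)
                out.write(buf, 0, n);
        } finally {
            in.close();
            out.close();
        }
        return conn.getResponseCode(); // completes the exchange
    }
}
```

    If the server insists on a Content-Length header, setFixedLengthStreamingMode(file.length()) gives the same constant-memory behaviour.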

  • Reading large file with JCA Adapter in OSB

    Hello,
    We are searching for a solution for reading a large file (>50 MB) from a network drive and delivering it to a queue via OSB 11gR4 (10.3.4). The problem is reading the file with the JCA File Adapter: it seems it cannot handle files as large as ours. The documentation provides a way to bypass the file-size limitation by using Chunk Read, but that seems to require BPEL process execution, which is not possible in our environment. Does anyone know a way to implement this without a BPEL process?
    Our usecase:
    read file from network drive -> transfer with OSB -> deliver MQ
    Other options than JCA File Adapter can be considered, if anyone can advice...

    If it's a plain routing use case and no message processing is required then you may simply use OSB's FILE transport instead of JCA adapter. Create a messaging type proxy service and select request message type as "binary". Also enable the content streaming (Disk buffer, compression).
    From OSB Dev guide -
    Oracle JCA Adapter for FTP and Files – Attachments (large payload support), pre- and post-processing of files, using a re-entrant valve for processing ZIP files, content streaming, and file chunked read are not supported with Oracle Service Bus.
    http://download.oracle.com/docs/cd/E17904_01/doc.1111/e15866/jca.htm#BABBICIA
    You may also refer -
    Reading huge flat file in OSB 11gR1
    Regards,
    Anuj

  • Can't transfer large files with remote connection

    Hi all,
    I'm trying to figure out why we can't transfer large files (> 20 MB) over a remote connection.
    The remote machine is a G5 running 10.4.11; the files reside on a Mac OS X SATA RAID. We are logged into the remote machine via AFP with an administrator's account.
    We can transfer files smaller than 20 MB with no problem. When we attempt to transfer files larger than 20 MB, we get the error "The operation can't be completed because you don't have sufficient permissions for some of the items."
    We can transfer large files from the remote machine to this one (a Mac Pro running 10.4.11) with no problem.
    The console log reports the following error:
    NAT Port Mapping (LLQ event port.): timeout
    I'm over my head on this one - does anyone have any ideas?
    Thanks,
    David

    I tried both these things with no luck.
    The mDNSResponder starts up right after the force quit - is this right?
    The following is the console log, which differs from the previous logs where we had the same problem:
    DNSServiceProcessResult() returned an error! -65537 - Killing service ref 0x14cb4f70 and its source is FOUND
    DNSServiceProcessResult() returned an error! -65537 - Killing service ref 0x14cb3990 and its source is FOUND
    DNSServiceProcessResult() returned an error! -65537 - Killing service ref 0x14cb2b80 and its source is FOUND
    DNSServiceProcessResult() returned an error! -65537 - Killing service ref 0x14cb1270 and its source is FOUND
    Feb 4 06:13:57 Russia mDNSResponder-108.6 (Jul 19 2007 11: 41:28)[951]: starting
    Feb 4 06:13:58 Russia mDNSResponder: Adding browse domain local.
    In HrOamApplicationGetCommandBars
    In HrOamApplicationGetCommandBars
    In HrOamApplicationGetCommandBars
    Feb 4 06:23:12 Russia mDNSResponder-108.6 (Jul 19 2007 11: 41:28)[970]: starting
    Feb 4 06:23:13 Russia mDNSResponder: Adding browse domain local.
    Feb 4 06:26:00 Russia configd[35]: rtmsg: error writing to routing socket
    Feb 4 06:26:00 Russia configd[35]: rtmsg: error writing to routing socket
    Feb 4 06:26:00 Russia configd[35]: rtmsg: error writing to routing socket
    Feb 4 06:26:00 Russia configd[35]: rtmsg: error writing to routing socket
    Feb 4 06:26:00 Russia configd[35]: rtmsg: error writing to routing socket
    Feb 4 06:26:00 Russia configd[35]: rtmsg: error writing to routing socket
    Feb 4 06:26:00 Russia configd[35]: rtmsg: error writing to routing socket
    Feb 4 06:26:00 Russia configd[35]: rtmsg: error writing to routing socket
    Thanks,
    David

  • Extracting file from a TAR file with java.util.zip.* classes

    Is there a way to extract files from a .TAR file using the java.util.zip.* classes?
    I tried in some ways but I get the following error:
    java.util.zip.ZipException: error in opening zip file
    at java.util.zip.ZipFile.<init>(ZipFile.java:127)
    at java.util.zip.ZipFile.<init>(ZipFile.java:92)
    Thank you
    Giuseppe

    download the tar.jar from the above link and use the sample program below
    import com.ice.tar.*;
    import java.util.zip.GZIPInputStream;
    import java.io.*;
    public class untarFiles {
         public static void main(String args[]) {
              try {
                   untar("c:/split/20040826172459.tar.gz", new File("c:/split/"));
              } catch(Exception e) {
                   e.printStackTrace();
                   System.out.println(e.getMessage());
              }
         }
         private static void untar(String tarFileName, File dest) throws IOException {
              //assuming the file you pass in is not a dir
              dest.mkdir();
              //create tar input stream from a .tar.gz file
              TarInputStream tin = new TarInputStream( new GZIPInputStream( new FileInputStream(new File(tarFileName))));
              //get the first entry in the archive
              TarEntry tarEntry = tin.getNextEntry();
              while (tarEntry != null) { //create a file with the same name as the tarEntry
                   File destPath = new File(
                        dest.toString() + File.separatorChar + tarEntry.getName());
                   if (tarEntry.isDirectory()) {
                        destPath.mkdir();
                   } else {
                        FileOutputStream fout = new FileOutputStream(destPath);
                        tin.copyEntryContents(fout);
                        fout.close();
                   }
                   tarEntry = tin.getNextEntry();
              }
              tin.close();
         }
    }

  • Suspend to file with pm-utils

    I have a MacBook Pro 3 and, since I triple-boot it, I can't spare a partition for swap space, so I use swap in tmpfs (the problem is the same with a swap file). However, I want to be able to suspend to disk: is there a way to suspend to a file instead of to a swap partition with pm-utils?
    Thanks in advance!

    Could be an issue with the graphics card. From the log it seems you are using the open-source ATI driver. Personally, I never had luck getting resume to work with this driver, but my card was too old for the closed-source driver. The symptoms were the same as yours: the machine would suspend or hibernate but failed to wake up.

  • Streaming Large Files with QuickTime

    Hey guys, I am in need of some serious help. I have streamed files with QuickTime before, no problem. But now I need to stream about 75 minutes' worth of video. How do I do this and keep quality video and sound? I don't mind having a large file as long as it streams, but 1.2 gigabytes is a little ridiculous. I have seen this done with WMV, but I am having trouble doing it with QuickTime. It's very frustrating because there are so many compression packages, and so many options within those. I would like to finish before I die. Anybody have any ideas?

    Streaming ANY size file is really an issue of bandwidth out of the server. The closer the server is to a backbone and the better the connection to the backbone, the better feed you have.
    Everybody sing!
    The backbone connected to the T-1,
    The T-1 connected...
    Oh nevermind.

  • Unable to open zip files with Archive Utility or StuffIt

    Hello,
    I am running Mac OS X 10.7.3. I have been attempting to download and install the Eclipse Java IDE from both www.eclipse.org and from a Stanford course website. In the first case the file is offered as a .tar.gz, and in the second case it is a .zip. When I attempt to open either file using Archive Utility or StuffIt Expander I get a never-ending pinwheel. I am forced to restart to get the Finder working properly again and quiet the fan down. My wife's computer, also running Mac OS X 10.7.3, can extract the files without a problem.

    Jsanders is basically a BlackBerry superhero. Stick around for a while and don't be a stranger, AntMcl; you can learn a lot around here, I promise.

  • Can I receive large files with Adobe Send?

    Does Adobe Send make it possible to receive large files?

    The incoming address can be whichever one you prefer (just let the sender know). When you receive an email from Send, it just includes a link to the sent file. That way, you don't bump up against any file size limitations that your email client might impose.
    Best,
    Sara

  • How to transfer a large file with jpgs and other picture formats from PC to iPad

    I have a large amount of music, photos in various formats, and PowerPoint files, all together in an alphabetized folder. How can I transfer it from my PC to my iPad?

    You're not going to be able to just drag a folder over. The iPad has no file manager like computers do, so every file has to be associated with an app.
    Did you ever sync your iPad with your computer? If so, use iTunes to move the files. However, those files will be moved in bits and pieces: for example, the music will be moved via the Music app and the photos via the Photos app. You'll need a PowerPoint-capable app to open your PowerPoint files.
