Upload a large file to OCS10g workspace

Hi,
I have a workspace on OCS10g and the quota of the workspace is 2GB.
But when I upload a large file - about 100 MB - errors occur.
It works fine for small files.
I have already set the parameter IFS.DOMAIN.MEDIA.CONTENTTRANSFER.ContentLimit to 0,
which should allow files of unlimited size to be uploaded to a workspace.
Is there any other prerequisite to upload a large file to a workspace?
Regards,
Kitae

Have you checked the Apache Timeout directive? It might need to be larger. There are no other general tricks or parameters to tweak that our dev team is aware of. Please file a metalink tar for this and detail the error message. I'm sure Oracle Support Services will be able to help you mine the logs to find out what is going on.
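
For reference, the Apache Timeout directive mentioned above lives in httpd.conf. A minimal sketch with a purely illustrative value (pick something larger than your slowest expected transfer):
# httpd.conf - raise the I/O timeout so long-running uploads are not cut off
Timeout 900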

Similar Messages

  • Upload very large file to SharePoint Online

    Hi,
    I tried uploading very large files to SharePoint Online via the REST API using Java. Uploading files of ~160 MB works well, but if the file size is between 400 MB and 1.6 GB I either receive a SocketException (Connection reset by peer) or an error from SharePoint telling me "the security validation for the page is invalid".
    So first of all I want to ask how to work around the security validation error. As far as I understand, the token which I have added to the X-RequestDigest header of my HTTP POST request seems to have expired (by default it expires after 1800 seconds).
    Uploading such large files is time consuming, so I can't change the token in my header while uploading the file. How can I extend the expiration time of the token (if that is possible)? Could it help to continuously send POST requests with the same token while uploading the file and thereby prevent the token from expiring? Is there any other strategy to upload such large files which need much more than 1800 seconds to upload?
    Additionally, any thoughts on the socket exception? It happens quite sporadically; sometimes it happens after uploading 100 MB, and other times I have already uploaded 1 GB.
    Thanks in advance!

    Hi,
    Thanks for the reply. The reason I'm looking into this is so users can migrate their files to SharePoint Online/Office 365. The max file limit is 2 GB, so I thought I would try to cover this.
    I've looked into that link before, and when I try to use those endpoints, i.e. StartUpload, I get the following response:
    {"error":{"code":"-1, System.NotImplementedException","message":{"lang":"en-US","value":"The method or operation is not implemented."}}}
    I can only presume that these endpoints have not been implemented yet, even though the documentation says they are only available for Office 365, which is what I am using.
    Also, the other strange thing is that I can actually upload a file of 512 MB to the server, as I can see it in the web UI. The problem I am experiencing is getting a response; it hangs on this line until the timeout is reached.
    WebResponse wresp = wreq.GetResponse();
    I've tried flushing and closing the stream, and SendChunked, but still no success.
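
    One possible way to deal with the expiring X-RequestDigest described above (a sketch only, assuming the documented /_api/contextinfo endpoint and a hypothetical openAuthenticatedConnection helper that attaches your SharePoint Online credentials): request a fresh form digest shortly before the old one expires and use the new value for the remaining requests.
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.util.Scanner;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class DigestRefresher {

        // Hypothetical stand-in: a real client must attach its SharePoint Online
        // authentication (cookies or bearer token) to the connection here.
        static HttpURLConnection openAuthenticatedConnection(String url) throws Exception {
            return (HttpURLConnection) new URL(url).openConnection();
        }

        // POST an empty body to /_api/contextinfo and pull FormDigestValue out of the JSON reply.
        static String refreshDigest(String siteUrl) throws Exception {
            HttpURLConnection conn = openAuthenticatedConnection(siteUrl + "/_api/contextinfo");
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Accept", "application/json;odata=verbose");
            conn.setDoOutput(true);
            conn.getOutputStream().close();                 // empty POST body
            try (InputStream in = conn.getInputStream();
                 Scanner s = new Scanner(in, "UTF-8").useDelimiter("\\A")) {
                String body = s.hasNext() ? s.next() : "";
                Matcher m = Pattern.compile("\"FormDigestValue\"\\s*:\\s*\"([^\"]+)\"").matcher(body);
                if (m.find()) {
                    return m.group(1);                      // pass this in the X-RequestDigest header
                }
                throw new IllegalStateException("No FormDigestValue in response");
            }
        }
    }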

  • Uploading Very Large Files via HTTP

    I am developing some classes that must upload files to a web server via HTTP and multipart/form-data. I am using Apache's Tomcat FileUpload library contained within the commons-fileupload-1.0.jar file on the server side. My code fails on large files or large quantities of small files because of the memory restriction of the VM. For example when uploading a 429 MB file I get this exception:
    Exception in thread "main" java.lang.OutOfMemoryError
    I have never been successful in uploading more than ~30 MB, regardless of the server-side component.
    In a production environment I cannot alter the client's VM memory settings, so I must code my client classes to handle such cases.
    How can this be done in Java? This is the method that reads in a selected file and immediately writes it to the output stream for the web resource, referenced by bufferedOutputStream:
    private void write(File file) throws IOException {
      byte[] buffer = new byte[bufferSize];
      BufferedInputStream fileInputStream = new BufferedInputStream(new FileInputStream(file));
      // read in the file
      if (file.isFile()) {
        System.out.print("----- " + file.getName() + " -----");
        while (fileInputStream.available() > 0) {
          if (fileInputStream.available() >= 0 &&
              fileInputStream.available() < bufferSize) {
            buffer = new byte[fileInputStream.available()];
          }
          fileInputStream.read(buffer, 0, buffer.length);
          bufferedOutputStream.write(buffer);
          bufferedOutputStream.flush();
        }
        // close the file's input stream
        try {
          fileInputStream.close();
        } catch (IOException ignored) {
          fileInputStream = null;
        }
      } else {
        // do nothing for now
      }
    }
    The problem is that the entire file, and any subsequent files being read in, are all being packed onto the output stream and don't actually begin moving until close() is called. Eventually the VM gives way.
    I need my client code to behave no differently than a typical web browser when uploading or downloading a file via HTTP. I know of several commercial applets that can do this; why can't I? Can someone please educate me or at least point me to a useful resource?
    Thank you,
    Henryiv

    Are you guys suggesting that the failures I'm experiencing in my client code are a direct result of the web resource's (servlet) caching of my request (files)? Because the exception that I am catching is on the client machine and is not generated by the web server.
    trumpetinc, your last statement intrigues me. It sounds as if you are suggesting having the client code and the servlet code open sockets and talk directly with one another. I don't think our customers would like that too much.

    Answering your first question:
    Your original post made it sound like the server was running out of memory. Is the out-of-memory error happening in your client code?
    If so, then the code you provided is a bit confusing - you don't tell us where you are getting the bufferedOutputStream - I'll just assume that it is a properly configured member variable.
    OK - so now, on to what is actually causing your problem:
    You are sending the stream in a very odd way. I highly suspect that your call to
    buffer = new byte[fileInputStream.available()];
    is resulting in a massive buffer (fileInputStream.available() probably just returns the size of the file).
    This is what is causing your out of memory problem.
    The proper way to send a stream is as follows:
         static public void sendStream(InputStream is, OutputStream os, int bufsize)
                     throws IOException {
              byte[] buf = new byte[bufsize];
              int n;
              while ((n = is.read(buf)) > 0) {
                   os.write(buf, 0, n);
              }
         }

         static public void sendStream(InputStream is, OutputStream os)
                     throws IOException {
              sendStream(is, os, 2048);
         }
    The simple implementation with the hard-coded 2048 buffer size is fine for almost any situation.
    Note that in your code, you are allocating a new buffer every time through your loop. The purpose of a buffer is to have a block of memory allocated that you then move data into and out of.
    Answering your second question:
    No - actually, I'm suggesting that you use an HTTPUrlConnection to connect to your servlet directly - no need for special sockets or ports, or even custom protocols.
    Just emulate what your browser does, but do it in the applet instead.
    There's nothing that says you can't send a large payload to an HTTP servlet without multipart MIME encoding it. That just happens to be what browsers do when uploading a file using a standard HTML form tag.
    I can't see that a customer would have anything to say on the matter at all - you are using standard ports and standard communication protocols... Unless you are not in control of the server side implementation, and they've already dictated that you will mime-encode the upload. If that is the case, and they are really supporting uploads of huge files like this, then their architect should be encouraged to think of a more efficient upload mechanism (like the one I describe) that does NOT mime encode the file contents.
    - K
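
    A minimal sketch of the streaming upload K describes - posting a file straight to a servlet over HttpURLConnection with one fixed-size buffer and no multipart encoding. The servlet URL and buffer size are illustrative assumptions; chunked streaming mode is what keeps the JDK from buffering the whole request body in memory:
    import java.io.BufferedInputStream;
    import java.io.FileInputStream;
    import java.io.IOException;
    import java.io.InputStream;
    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class RawUploadClient {
        public static void main(String[] args) throws IOException {
            // Illustrative servlet URL - replace with the real upload endpoint.
            URL url = new URL("http://example.com/app/uploadServlet");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("POST");
            conn.setDoOutput(true);
            conn.setChunkedStreamingMode(8192);   // stream instead of buffering the whole body
            conn.setRequestProperty("Content-Type", "application/octet-stream");
            try (InputStream in = new BufferedInputStream(new FileInputStream(args[0]));
                 OutputStream out = conn.getOutputStream()) {
                byte[] buf = new byte[8192];      // one reusable fixed-size buffer
                int n;
                while ((n = in.read(buf)) > 0) {
                    out.write(buf, 0, n);
                }
            }
            System.out.println("Server responded: " + conn.getResponseCode());
        }
    }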

  • Problem upload too large file in a Content Area ORA-1401

    Hi All
    I have a customer who is trying to upload a file whose name is too long into a content area.
    If the filename is 80 characters, it works fine.
    But if the filename is longer than 80 characters, the following error occurs:
    ORA-01401: inserted value too large for column
    DAD name: portal30
    PROCEDURE : PORTAL30.wwv_add_wizard.edititem
    URL : http://sbastida-us:80/pls/portal30/PORTAL30.wwv_add_wizard.edititem
    I checked the table WWV_DOCUMENT and it has a column named FILENAME defined as VARCHAR2(350).
    If I run a query on this table, the FILENAME column stores the filename of the uploaded file.
    Is this a new bug? May I file it?
    Do you have any idea about this issue?
    Thanks in advance,
    Catalina
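
    A quick way to see which column actually carries an 80-character limit is a generic data-dictionary query (not from the thread; the PORTAL30 owner is an assumption based on the DAD name above):
    -- List the columns of the portal document table and their declared lengths
    SELECT column_name, data_type, data_length
      FROM all_tab_columns
     WHERE owner = 'PORTAL30'
       AND table_name = 'WWV_DOCUMENT'
     ORDER BY column_id;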

    Catalina,
    The following restrictions apply to the names of documents that you are uploading into the repository:
    Filenames must be 80 characters or less.
    Filenames must not include any of these characters: \ / : * ? < > | " % # +
    Filenames can include spaces and any of these characters: ! @ ~ & . $ ^ ( ) - _ ` ' [ ] { } ; =
    Regards,
    Jerry

  • Problem when uploading a large file in PI - weird SQL I/O errors

    Hi guys,
    I'm facing a very difficult problem when uploading a 35 MB file with an FTPS adapter. I see in the logs that, after the translation to XML, it grows to 170 MB.
    I receive the following error in the CC Monitoring:
    Error: com.sap.aii.af.ra.ms.api.DeliveryException: Problem inserting 41827ca7-6b8c-4a87-198d-ad8a81fcb12b(OUTBOUND) into the database: com.sap.engine.services.dbpool.exceptions.BaseSQLException: Connection is invalid.
    When I look in the NWA Monitoring, I see the following details:
    SQL error occurred on connection affhb201:X11:SAPSR3DB: code=17,002, state="null", message="Io exception: Socket closed";
    SQL statement is "INSERT INTO "XI_AF_MSG" ("MSG_ID","DIRECTION","MSG_BYTES","TIMES_FAILED","SENT_RECV_TIME","STATUS","CONN_NAME","MSG_TYPE","REF_TO_MSG_ID","ADDRESS","TRANSPORT","CREDENTIAL","TRAN_HEADER","MSG_PROFILE","CONVERSATION_ID","SCHEDULE_TIME","PERSIST_UNTIL","FROM_P........
    I cannot check the Visual Admin Logs 'cause I don't have access to them yet.
    I'm pretty convinced that some swap memory, message size or other setting on the adapter engine or on the Java stack is the limiting factor. I do not get any error message in CC Monitoring when uploading a smaller, 6 MB version of the same file.
    Can you please help me solve this or give me some helpful pointers?
    We have never experienced anything like this in the PI system before. In addition, I didn't find any useful resource on SDN or in the SAP Notes for this.
    Let me know if you need more info about this.
    Best regards,
    George

    Hi George
    I am facing the same issue, where did you configure the message split in the Communication Channel?
    If I do the message split as you said, is it going to create several files or how does it work?
    Thanks in advance
    Emmanuel

  • Does Skydrive Support resumable upload for large files

    Hi,
    Does SkyDrive support resumable upload? A 33 MB file took almost 5 minutes to upload, so to minimise upload time I want to use resumable upload.
    Please explain how it should be done.
    Thanks,
    Ashlesha

    Hi u_os,
    This forum is to discuss problems of Windows Forms. Your question is not related to the topic of this forum.
    I suggest posting it in the OneDrive Forum http://answers.microsoft.com/en-us/onedrive/forum?auth=1 for support, where you can contact OneDrive experts.
    Best regards,
    Youjun Tang

  • BT Cloud - large file ( ~95MB) uploads failing

    I am consistently getting upload failures for any files over approximately 95MB in size.  This happens with both the Web interface, and the PC client.  
    With the Web interface the file upload gets to a percentage that would be around the 95MB amount, then fails showing a red icon with a exclamation mark.  
    With the PC client the file gets to the same percentage equating to approximately 95MB, then resets to 0%, and repeats this continuously.  I left my PC running 24/7 for 5 days, and this resulted in around 60GB of upload bandwidth being used just trying to upload a single 100MB file.
    I've verified this on two PCs (Win XP, SP3), one laptop (Win 7, 64 bit), and also my work PC (Win 7, 64 bit). I've also verified it with multiple different types and sizes of files. Everything from 1 KB to ~95 MB uploads perfectly, but anything above this size (I've tried 100 MB, 120 MB, 180 MB, 250 MB, 400 MB) fails every time.
    I've completely uninstalled the PC Client, done a Windows "roll-back", reinstalled, but this has had no effect.  I also tried completely wiping the cloud account (deleting all files and disconnecting all devices), and starting from scratch a couple of times, but no improvement.
    I phoned technical support yesterday and had a BT support rep remote control my PC, but he was completely unfamiliar with the application and after fumbling around for over two hours, he had no suggestion other than trying to wait for longer to see if the failure would clear itself !!!!!
    Basically I suspect my Cloud account is just corrupted in some way and needs to be deleted and recreated from scratch by BT.  However I'm not sure how to get them to do this as calling technical support was futile.
    Any suggestions?
    Thanks,
    Elinor.
    Solved!

    Hi,
    I too have been having problems uploading a large file (362Mb) for many weeks now and as this topic is marked as SOLVED I wanted to let BT know that it isn't solved for me.
    All I want to do is share a video with a friend and thought that BT cloud would be perfect!  Oh, if only that were the case :-(
    I first tried web upload (as I didn't want to use the PC client's Backup facility) - it failed.
    I then tried the PC client Backup.... after about 4 hrs of "progress" it reached 100% and an icon appeared.  I selected it and tried to Share it by email, only to have the share fail and no link.   Cloud backup thinks it's there but there are no files in my Cloud storage!
    I too spent a long time on the phone to Cloud support during which the tech took over my PC.  When he began trying to do completely inappropriate and irrelevant  things such as cleaning up my temporary internet files and cookies I stopped him.
    We did together successfully upload a small file and sharing that was successful - trouble is, it's not that file I want to share!
    Finally he said he would escalate the problem to next level of support.
    After a couple of weeks of hearing nothing, I called again and went through the same farce again with a different tech.  After which he assured me it was already escalated.  I demanded that someone give me some kind of update on the problem and he assured me I would hear from BT within a week.  I did - they rang to ask if the problem was fixed!  Needless to say it isn't.
    A couple of weeks later now and I've still heard nothing and it still doesn't work.
    Why can't Cloud support at least send me an email to let me know they exist and are working on this problem.
    I despair of ever being able to share this file with BT Cloud.
    C'mon BT Cloud surely you can do it - many other organisations can!

  • OOM happens inside Weblogic 10.3.6 when application uploads large files

    An Oracle Fusion BI Apps application is uploading large files (100+ MB) to Oracle Cloud Storage. The application works properly when run outside the WebLogic server. When deployed on Fusion Middleware WebLogic 10.3.6, we get this OOM error during the upload of large files:
    java.lang.OutOfMemoryError: allocLargeObjectOrArray: [B, size 268435472
        at jrockit/vm/Allocator.allocLargeObjectOrArray(JIZ)Ljava/lang/Object;(Native Method)
        at jrockit/vm/Allocator.allocObjectOrArray(Allocator.java:349)[optimized]
        at weblogic/utils/io/UnsyncByteArrayOutputStream.resizeBuffer(UnsyncByteArrayOutputStream.java:59)[inlined]
        at weblogic/utils/io/UnsyncByteArrayOutputStream.write(UnsyncByteArrayOutputStream.java:89)[optimized]
        at com/sun/jersey/api/client/CommittingOutputStream.write(CommittingOutputStream.java:90)
        at com/sun/jersey/core/util/ReaderWriter.writeTo(ReaderWriter.java:115)
        at com/sun/jersey/core/provider/AbstractMessageReaderWriterProvider.writeTo(AbstractMessageReaderWriterProvider.java:76)
        at com/sun/jersey/core/impl/provider/entity/InputStreamProvider.writeTo(InputStreamProvider.java:98)
        at com/sun/jersey/core/impl/provider/entity/InputStreamProvider.writeTo(InputStreamProvider.java:59)
        at com/sun/jersey/api/client/RequestWriter.writeRequestEntity(RequestWriter.java:300)
        at com/sun/jersey/client/urlconnection/URLConnectionClientHandler._invoke(URLConnectionClientHandler.java:213)
        at com/sun/jersey/client/urlconnection/URLConnectionClientHandler.handle(URLConnectionClientHandler.java:149)
    It looks like WebLogic is using its default WebLogic HTTP handler; switching to the Sun HTTP handler via the start-up JVM/Java option "-DUseSunHttpHandler=true" solves the OOM issue.
    It seems that instead of streaming the file content with a fixed-size byte array, the whole file is being buffered into memory during the upload.
    Is it possible to solve this OOM by changing a setting of the WebLogic HTTP handler, without switching to the Sun HTTP handler, as there are many other applications deployed on this WebLogic instance?
    We are concerned about whether there would be any impact on performance or any other issues.
    Please advise; your response is highly appreciated.
    Thanks!
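
    For what it is worth, the stack trace above shows the Jersey 1.x client writing the request entity through CommittingOutputStream into a growing in-memory byte array. If that client is in use, enabling chunked encoding on it may avoid the buffering regardless of the HTTP handler - a sketch under that assumption (the chunk size and URL are illustrative):
    import com.sun.jersey.api.client.Client;
    import com.sun.jersey.api.client.ClientResponse;
    import com.sun.jersey.api.client.config.ClientConfig;
    import com.sun.jersey.api.client.config.DefaultClientConfig;
    import java.io.FileInputStream;

    public class ChunkedJerseyUpload {
        public static void main(String[] args) throws Exception {
            // Ask the Jersey 1.x client to use HTTP chunked encoding so request
            // bodies are streamed instead of buffered in memory.
            ClientConfig config = new DefaultClientConfig();
            config.getProperties().put(ClientConfig.PROPERTY_CHUNKED_ENCODING_SIZE, 64 * 1024);
            Client client = Client.create(config);

            ClientResponse resp = client
                    .resource("https://storage.example.com/container/object")   // illustrative URL
                    .type("application/octet-stream")
                    .put(ClientResponse.class, new FileInputStream(args[0]));
            System.out.println("HTTP status: " + resp.getStatus());
        }
    }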

    Hi,
    If you have a backup, restore the file below and then try to start WebLogic:
    \Oracle\Middleware\user_projects\domains\<domain_name>\config\config.lok
    Thanks,
    Sharmela

  • Beach Ball of Death on OSX while trying to upload large files(updated)

    Hi,
    I did a search on this and while I got lots of hits none on this issue.
    We have a web portal that allows customers to upload files to a DAM server; we have control of the server.
    The issue is not the back end; the back end is PHP and all set up correctly. We have 2 GB of RAM assigned to PHP, upload_max_filesize is set to 1.8 GB, and the max post size is set accordingly.
    The Flex app loads, the user selects a large file, say 1.6 GB (we do a file-size check to make sure it's below the limit), clicks the upload button, the beach ball of death shows, and a little while later the file uploads.
    Great so far. Now the user tries another file of the same size: beach ball of death, and the script times out (we capture this and end gracefully). If the user restarts Safari then the file will upload just fine.
    It seems you can upload a large file the first time, and small files below 1.2 GB subsequently, but you cannot upload multiple large files.
    It seems like some sort of memory leak. I was looking at converting this upload app to send files via FTP, but if the Flash Player needs to process/load the files first then I don't think this will work; we're looking to increase the file size ceiling to 4 GB via FTP.
    Code is a bit involved but in the end just calls file reference upload, then beach ball appears
    Any ideas? player version 10.0.32.18 debug
    UPDATED 09_17_09
    It appears to be a memory leak in the player when running in Safari; the file is read but the memory is not freed after completion. Firefox frees the used memory after the upload and you can continue. I'm not sure if it's an Apple or an Adobe issue.
    However, why bother reading the file first? Can we have an HTTP stream file upload in the web Flash Player like AIR has? That way the player would not care about the file size and the limitation would reside on the server, which we can deal with.
    Message was edited by: flashharry!

  • Upload large file in sharepoint

    Hi,
    Is it possible to upload large files in SharePoint? In our scenario we need to upload large files (>3 GB) to document libraries.
    Where does SharePoint store files? Is there any way we can have a custom ASPX page for uploading large files in chunks to a document library?
    Kindly suggest.
    Thank you

    It is not possible to upload more than 2 GB.
    If you try to upload big files you will need to make some changes to your SharePoint to support the 2 GB limit.
    Error message when you try to upload a large file to a document library on a Windows SharePoint Services 3.0 site: "Request timed out"
    http://support.microsoft.com/?id=925083
    For files bigger than 2 GB, I can recommend creating links to those files from where they can be downloaded.
    André Lage Microsoft SharePoint and CRM Consultant
    Blog:http://aaclage.blogspot.com
    Codeplex: http://spupload.codeplex.com/ http://simplecamlsearch.codeplex.com/
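
    For reference, the "changes to your SharePoint" that the KB article above alludes to usually mean raising the web application's maximum upload size in Central Administration and the limits in the web application's web.config - a hedged sketch of the kind of values involved (numbers are illustrative; maxRequestLength is in KB, maxAllowedContentLength in bytes):
    <!-- web.config of the SharePoint web application (illustrative values only) -->
    <configuration>
      <system.web>
        <httpRuntime maxRequestLength="2097151" executionTimeout="999999" />
      </system.web>
      <system.webServer>
        <security>
          <requestFiltering>
            <requestLimits maxAllowedContentLength="2147483647" />
          </requestFiltering>
        </security>
      </system.webServer>
    </configuration>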

  • Very large file upload 2 GB with Adobe Flash

    Hello, does anyone know how I can upload very large files with Adobe Flash, or how to use SWFUpload?
    Thanks in advance; all help will be much appreciated.

    1. yes
    2. I'm getting an error message from PHP
    if( $_FILES['Filedata']['error'] == 0 ){ 
      if( move_uploaded_file( $_FILES['Filedata']['tmp_name'], $uploads_dir.$_FILES['Filedata']['name'] ) ){ 
        echo 'ok'; 
        exit(); 
      } 
      echo 'error';    // Overhere
      exit();
    }
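
    If the handler above keeps reaching the 'error' branch for big files, the usual suspects are the PHP limits rather than the Flash side. A sketch of the php.ini directives that typically need raising for ~2 GB uploads (values are illustrative assumptions):
    ; php.ini - illustrative values for very large uploads
    upload_max_filesize = 2048M
    post_max_size = 2048M
    max_execution_time = 3600
    memory_limit = 512M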

  • How to upload large file with http via post

    Hi guys,
    Does anybody know how to upload a large file (>100 MB) from an applet to a servlet over HTTP via the POST method? Thanks in advance.
    Regards,
    Mark.

    Hi SuckRatE
    Thanks for your reply. Could you give me some client-side code to upload a large file? I use a URL connection to the server. It throws an out of memory exception. Part of the client code is below:
    // connect to the servlet
    URL theServlet = new URL(servletLocation);
    URLConnection servletConnection = theServlet.openConnection();
    // inform the connection that we will send output and accept input
    servletConnection.setDoInput(true);
    servletConnection.setDoOutput(true);
    // Don't use a cached version of the URL connection.
    servletConnection.setUseCaches (false);
    servletConnection.setDefaultUseCaches(false);
    // Specify the content type of the data we will send
    servletConnection.setRequestProperty("Content-Type",
        "application/octet-stream");
    // send the user string to the servlet.
    OutputStream outStream = servletConnection.getOutputStream();
    FileInputStream filein = new FileInputStream(largeFile);
    //BufferedReader in = new BufferedReader(new InputStreamReader(servletConnection.getInputStream()));
    //System.out.println("tempCurrent = " + in.readLine());
    byte abyte[] = new byte[2048];
    int cnt = 0;
    while((cnt = filein.read(abyte)) > 0)
    outStream.write(abyte, 0, cnt);
    filein.close();
    outStream.flush();
    outStream.close();
    Regards,
    Mark.
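
    One likely cause of the out-of-memory exception in the code above (an observation, not something stated in the thread): by default HttpURLConnection buffers the entire request body in memory so it can set the Content-Length header. Enabling a streaming mode on the connection from the post, before calling getOutputStream(), avoids that buffering - a minimal sketch:
    // Assumes servletConnection is an HttpURLConnection; call this before getOutputStream().
    HttpURLConnection http = (HttpURLConnection) servletConnection;
    http.setChunkedStreamingMode(8192);                      // stream the body in 8 KB chunks
    // or, if the size is known up front:
    // http.setFixedLengthStreamingMode(largeFile.length());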

  • Large file upload

    Hi all,
    Is there a way to upload a large file?
    The problem:
    1 file with fixed length records (59 bytes wide)
    Over 17 million lines (actually a download from an extremely old system), making a total of almost 1 GB.
    I have a database table that exactly matches the record data.
    So the question is: how do I upload this amount of data? The normal upload functions do not apply because I run out of memory (system I-MODE too large).
    Regards,
    Rob.

    Hi Rob,
    I'm not too sure if this solution will be acceptable in your case, but here's one way that I can think of -
    You can FTP this file to the Application Server and then process this file through a simple ABAP program and update the file's data to the desired database table.
    Regards,
    Anand Mandalika.

  • "Unable to sync" when uploading large files to Creative Cloud

    Hi,
    I recently tried to upload some large files (1.1 GB, 2 GB) to my Creative Cloud. After the upload completed nine hours later, I got the error message "unable to sync".
    Any suggestions?
    Regards,
    pdm208

    I'm having the same issue. I removed the offending big files. I turned sync on and off, but it's still unable to sync. Any thoughts?

  • 16.0.2 crashes with large file uploads

    Hi, I have noticed that Firefox 16.0.2 crashes when I try uploading a large file (1.2GB) to a website.
    The entire browser closes and I get a window offering to submit a report for me. Can anybody tell me what I need to change to stop this from happening please?
    Thanks,
    Paul

    Be sure to roll back to 16.0.1; the initial v16 update had a serious security flaw.
