URLConnection Posting Large DataFiles

I am looking for a way to STREAM large data files, say 100MB+, when the heap size is only 32MB.
I believe I should be able to simply do what the code below shows, but...
The AcmeUtils.copyStream method simply conks out with java.lang.OutOfMemoryError.
It appears that the server never even sees the request before the out-of-memory error occurs, which indicates to me that this URL OutputStream is in fact buffering the ENTIRE stream in memory before even contacting the server. If I use files that are smaller than the heap it works no problem, but that's no good for very large files.
Any suggestions?
Thanks,
Dan
----- CODE snippet ----------------
URL url = new URL(strURL);
connection = (HttpURLConnection) url.openConnection();
connection.setRequestMethod("POST");
connection.setRequestProperty("Content-Length", "" + m_lCurrentFileLength);
connection.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
connection.setUseCaches(false);
connection.setDoOutput(true);
outStream = connection.getOutputStream();
// AcmeUtils simply copies 4k chunks from the file's input stream to the connection's output stream
AcmeUtils.copyStream(inStream, outStream);
inputStream = new DataInputStream(connection.getInputStream()); // then read the response

HttpURLConnection and URLConnection are meant more for web-browser-type interactions, where you send a "simple" request to a server and receive information.
In that situation it is reasonable to compile the whole request in memory before anything is sent to the server.
If you need to send a "large" file, then you will have to write your own communications using java.net.Socket and create the HTTP request yourself.
Send the following String:
"POST /nameOfResource HTTP/1.0\r\n" +
"Content-length: " + m_lCurrentFileLength + "\r\n" +
"Content-type: application/x-www-form-urlencoded\r\n" +
"\r\n"
followed by the contents, using AcmeUtils if you want. (A rough sketch of this Socket-based approach is below.)
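
For what it's worth, here is a rough sketch of that Socket-based approach (only a sketch; the host/port/path parameters are placeholders, and the copy loop mirrors the 4k-chunk idea of AcmeUtils.copyStream). Because the file is copied straight onto the socket in small chunks, only one buffer's worth is ever held in memory:

import java.io.*;
import java.net.Socket;

public class RawHttpPost {
    // Streams a large file as the body of an HTTP/1.0 POST without buffering it all in memory.
    public static void post(String host, int port, String path, File file) throws IOException {
        try (Socket socket = new Socket(host, port);
             OutputStream out = new BufferedOutputStream(socket.getOutputStream());
             InputStream in = new BufferedInputStream(new FileInputStream(file))) {

            String headers =
                    "POST " + path + " HTTP/1.0\r\n" +
                    "Host: " + host + "\r\n" +
                    "Content-length: " + file.length() + "\r\n" +
                    "Content-type: application/x-www-form-urlencoded\r\n" +
                    "\r\n";
            out.write(headers.getBytes("ISO-8859-1"));

            // Copy the file in 4k chunks, like AcmeUtils.copyStream does.
            byte[] buffer = new byte[4096];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
            out.flush();

            // Read whatever the server sends back on the same socket.
            BufferedReader response = new BufferedReader(
                    new InputStreamReader(socket.getInputStream(), "ISO-8859-1"));
            for (String line; (line = response.readLine()) != null; ) {
                System.out.println(line);
            }
        }
    }
}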

Similar Messages

  • Lecture Poster Large/Not printing correctly

    I selected Lecture Poster Large. I replaced all the photos using Aperture. The large photo at the bottom was also replaced by dragging in a photo from Aperture.
    I was very happy with the result. The photo preview looks exactly like the project. But when I print I see only the large photo. Shouldn't a project print like the preview? Why are the photos on top and the text not printing?
    I'm printing to a RICOH Aficio SP color printer. I've never had this issue before.

    OK, so I'm going to answer my own question.
    I chose the small version, and it worked.

  • Viewing & Posting larger movies

    Does anyone know of a way (or template) in iweb (09) that allows you to post larger viewable movies on web pages, streaming or otherwise? I know once you drag & drop a larger movie file, you're prompted by a warning that says loading times will be slower with any movie over 10mb. And even if you do, the movie itself remains small. Any ideas?
    Thanks
    Chris.

    cloizou wrote:
    Any ideas?
    Chris ~ One idea is to post your video to YouTube and use iWeb's YouTube widget to embed it into your iWeb page:
    http://www.apple.com/ilife/iweb/#widgets
    For high definition video, YouTube was recently ranked the best of several video sites:
    _Which HD video Web service is the best?_
    Another idea is, if you subscribe to MobileMe you can use iWeb's +MobileMe Gallery+ widget to embed a link to your videos published from iMovie. Click here for a video tutorial and here's a sample video on Apple's demo Gallery:
    http://gallery.mac.com/emily_parker#100624
    With both the above ideas, your videos are separate from your iWeb site so publishing/upload times are shorter. Also viewers will be able to click a fullscreen icon.

  • Http post large images

    Hello people...
    I've been working on this for almost a week.
    I am trying to post an image and some text with it to my server.
    I know that I can post large images to an ASP.NET page, but now I am dealing with Apache 2.2.3 and am using a multipart POST.
    I've tested this code using Nokia devices and it is working fine,
    but on Sony Ericsson devices (Z520 and Z610) it is giving a 502 proxy error response.
    Below are the client and server headers.
    WHAT IS WRONG WITH SONY ERICSSON DEVICES?
    Any idea what is happening?
    Here is my code:
    InputStream is = null;
            OutputStream os = null;
            DataInputStream dis = null;
            StringBuffer responseMessage = new StringBuffer();      
            StringBuffer b = new StringBuffer();
            HttpConnection c= null;
            try {
                String message1 =  "-----------------------------2789938726602\r\n" +
                        "Content-Disposition: form-data; name=\"post[id]\"\r\n\r\n" +
                        "-----------------------------2789938726602\r\n" +
                        "Content-Disposition: form-data; name=\"post[title]\"\r\n\r\n" +
                        " bahjat\r\n" +
                        "-----------------------------2789938726602\r\n" +
                        "Content-Disposition: form-data; name=\"post[discription]\"\r\n\r\n" +
                        "Description\r\n" +
                        "-----------------------------2789938726602\r\n" +
                        "Content-Disposition: form-data; name=\"post[username]\"\r\n\r\n" +
                        "Username\r\n" +
                        "-----------------------------2789938726602\r\n" +
                        "Content-Disposition: form-data; name=\"post[password]\"\r\n\r\n" +
                        "Password\r\n" +
                        "-----------------------------2789938726602\r\n" +
                        "Content-Disposition: form-data; name=\"post[category_id]\"\r\n\r\n" +
                        "1\r\n" +
                        "-----------------------------2789938726602\r\n" +
                        "Content-Disposition: form-data; name=\"post[file]\"; filename=\"mobile.jpg\"\r\n" +
                        "Content-Type: image/jpeg\r\n\r\n";
                 String message2 ="-----------------------------2789938726602--"; 
                 byte[] Message1 = message1.getBytes();
                 byte[] Message2 = message2.getBytes();
                 int total_length = Message1.length + Message2.length + postByte.length ;
                c = (HttpConnection) Connector.open(URL, Connector.READ_WRITE);
                c.setRequestMethod(HttpConnection.POST);
                c.setRequestProperty("Host","hq.d1g.com")  ;
                c.setRequestProperty("Accept"," text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5");
                c.setRequestProperty("Accept-Charset"," ISO-8859-1,utf-8;q=0.7,*;q=0.7");
                c.setRequestProperty("User-Agent", "Profile/MIDP-2.0 Confirguration/CLDC-1.0");
                c.setRequestProperty("Accept-Encoding", "gzip, deflate");
                c.setRequestProperty("Accept-Language","en-us,en;q=0.5");
                c.setRequestProperty("Http-version","HTTP/1.1");
                c.setRequestProperty("Content-Type", "multipart/form-data;boundary=---------------------------2789938726602");
                c.setRequestProperty("Content-Length", Integer.toString(total_length) ); 
                c.setRequestProperty("Keep-Alive","300");
                c.setRequestProperty("Connection","Keep-Alive");
                os = c.openOutputStream();  
                os.write(Message1);
                os.write(postByte);
                os.write(Message2);          
                os.close();
                Message1 = null;
                Message2 = null;
               int rc = c.getResponseCode();
               if (rc != HttpConnection.HTTP_OK) {
                    b.append("Response Code: " + c.getResponseCode() + "\n");
                    b.append("Response Message: " + c.getResponseMessage() + "\n\n");
               }
               Resived_String = b.toString();
               is = c.openInputStream() ;
                // retrieve the response from server
               int ch;
               while( ( ch = is.read() ) != -1 )
                    responseMessage.append( (char)ch );
               String s=responseMessage.toString();
               Resived_String+=s;
            } catch (Exception ce) {
                Resived_String = ce.getMessage();
            } finally {
                if (dis != null)
                    dis.close();
                if (is != null)
                    is.close();
                if (os != null)
                    os.close();
            }
            return Resived_String;

    Here are the headers as I captured them with Wireshark.
    client
    POST /gallery/post/update HTTP/1.1
    Host: hq.d1g.com
    Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5
    Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
    User-Agent: Profile/MIDP-2.0 Confirguration/CLDC-1.0
    Accept-Encoding: gzip, deflate
    Accept-Language: en-us,en;q=0.5
    Http-version: HTTP/1.1
    Content-Type: multipart/form-data;boundary=---------------------------2789938726602
    Keep-Alive: 300
    Connection: Keep-Alive
    User-Agent: UNTRUSTED/1.0
    Transfer-Encoding: chunked
    server response
    HTTP/1.1 502 Proxy Error
    Date: Wed, 04 Apr 2007 12:49:40 GMT
    Content-Length: 493
    Connection: close
    Content-Type: text/html; charset=iso-8859-1
    <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
    <html><head>
    <title>502 Proxy Error</title>
    </head><body>
    Proxy Error
    The proxy server received an invalid response from an upstream server.
    The proxy server could not handle the request POST /gallery/post/update.
    Reason: Error reading from remote server
    <hr>
    <address>Apache/2.2.3 (Fedora) Server at hq.d1g.com Port 80</address>
    </body></html>
    and sometimes I get this response instead:
    404 Not Found
    So, any idea what I can do?

    Thanks for your reply. I think there is something wrong with the code, but there is only
    a SWC and I can't analyze the C/C++ code, so I really don't know how to solve
    it. Can you provide me some C/C++ code that I can compile to Alchemy
    code?
    2009/10/28 zazzo9 <[email protected]>
    It is either a bug with your code or with the library, which may be
    aggravated by a bug in alchemy where it fails to report exhausted memory
    properly.  The library you are using is just a demonstration and may likely
    have bugs.

  • Error 500 when posting large form to windows azure from Android

    I have an HTML page with a form containing a large (>1400 character) hidden field value that will not submit from the Android browser to any site hosted on Windows Azure Websites. I have tested this on 4.1 and 4.2 on different devices.
    I have checked the error logs from Azure and am receiving a 500 error.
    The page will submit from all modern desktop browsers, and iOS devices.
    Unfortunately I cannot provide a link to the actual page, as it resides under our company's extranet, but if I point the target of this simple form at any Azure website (*.azurewebsites.net) from an Android device, I get the 500 error.
        <form action="http://*****.azurewebsites.net/" method="post">
            <input type="hidden" name="imageData" value="/9j/4AAQSkZJRgABAQEAYABgAAD/4RuURXhpZgAATU0AKgAAAAgABQEyAAIAAAAUAAAASkdGAAMAAAABAAQAAEdJAAMAAAABAD8AAIKYAAIAAAAWAAAAXodpAAQAAAABAAAAdAAAANQyMDA5OjAzOjEyIDEzOjQ4OjM5AE1pY3Jvc29mdCBDb3Jwb3JhdGlvbgAABJADAAIAAAAUAAAAqpAEAAIAAAAUAAAAvpKRAAIAAAADMDIAAJKSAAIAAAADMDIAAAAAAAAyMDA4OjAyOjA3IDExOjMzOjExADIwMDg6MDI6MDcgMTE6MzM6MTEAAAAABgEDAAMAAAABAAYAAAEaAAUAAAABAAABIgEbAAUAAAABAAABKgEoAAMAAAABAAIAAAIBAAQAAAABAAABMgICAAQAAAABAAAaWgAAAAAAAABgAAAAAQAAAGAAAAAB/9j/2wBDAAgGBgcGBQgHBwcJCQgKDBQNDAsLDBkSEw8UHRofHh0aHBwgJC4nICIsIxwcKDcpLDAxNDQ0Hyc5PTgyPC4zNDL/2wBDAQkJCQwLDBgNDRgyIRwhMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjL/wAARCAB4AKADASEAAhEBAxEB/8QAHwAAAQUBAQEBAQEAAAAAAAAAAAECAwQFBgcICQoL/8QAtRAAAgEDAwIEAwUFBAQAAAF9AQIDAAQRBRIhMUEGE1FhByJxFDKBkaEII0KxwRVS0fAkM2JyggkKFhcYGRolJicoKSo0NTY3ODk6Q0RFRkdISUpTVFVWV1hZWmNkZWZnaGlqc3R1dnd4eXqDhIWGh4iJipKTlJWWl5iZmqKjpKWmp6ipqrKztLW2t7i5usLDxMXGx8jJytLT1NXW19jZ2uHi4+Tl5ufo6erx8vP09fb3+Pn6/8QAHwEAAwEBAQEBAQEBAQAAAAAAAAECAwQFBgcICQoL/8QAtREAAgECBAQDBAcFBAQAAQJ3AAECAxEEBSExBhJBUQdhcRMiMoEIFEKRobHBCSMzUvAVYnLRChYkNOEl8RcYGRomJygpKjU2Nzg5OkNERUZHSElKU1RVVldYWVpjZGVmZ2hpanN0dXZ3eHl6goOEhYaHiImKkpOUlZaXmJmaoqOkpaanqKmqsrO0tba3uLm6wsPExcbHyMnK0tPU1dbX2Nna4uPk5ebn6Onq8vP09fb3+Pn6/9oADAMBAAIRAxEAPwDv9tJtr6C54obaTbRcQm2kK07gJtpNtO4hNtJtqrisJtpu2nckQrTdtNMTE20mynckbtpNtVcQ3bTdlO5DNvbRtrjudomKNtFxCbaTbTuITbTX2opZiAo6moqVoUoOpUdkt2OMJTkoxV2wVd8e9QcZ7qR"
    />
            <input type="submit" />
        </form>
    To further test this I created a brand new site on Azure Websites, leaving everything as default, with no code or anything uploaded (so you get the "This web site has been successfully created" page). Posting the form to this site gives a 405 response (as expected), except from Android, which gives the 500.
    Is there a setting or something I am missing somewhere?
    Following up, I have identified this event in the error log if this helps:
    138. GENERAL_READ_ENTITY_END
    BytesReceived="0", ErrorCode="The I/O operation has been aborted because of either a thread exit or an application request.
     (0x800703e3)".

    Originally I thought it may have been a browser issue, so I installed the latest Chrome and Firefox browsers as well, which all respond the same.
    This is on 2 separate devices, both modern hardware running Android v4.1 and v4.2

  • URLConnection with large amount of data causes OutOfMemoryError: Java heap

    I am setting up a system where my customers can send their data files to my server. If I use a simple socket with a server permanently running on a chosen port, my customers are able to transfer files of any size without a problem. However, if I adopt a different architecture, using a web server and a CGI program to receive their submissions, the client program that they run will crash with the exception OutOfMemoryError: Java heap if the file they are sending is any larger than about 30 MB.
    The code in the two architectures is almost identical:
    Socket con = new Socket( host, portno);
    //URL url = new URL("http://"+host+":"+portno+path);
    //URLConnection con = url.openConnection();
    //con.setDoOutput(true);
    File source_file = new File(file_name);
    FileInputStream source = new FileInputStream(source_file);
    out = new BufferedOutputStream(con.getOutputStream());
    // First, Send a submission header.
    data_out = submit_info.getBytes();
    len = data_out.length + 1;
    data_len[0] = (byte)len;
    out.write(data_len, 0, 1);
    out.write(data_out, 0, data_out.length);
    // Then send the file content.
    buff = new byte[(int)buffSize];
    content_length = source_file.length();
    long tot = 0;
    while ( tot < content_length ) {
        // Read data from the file.
        readSize = (int) Math.min( content_length - tot, buffSize );
        nRead = source.read(buff, 0, readSize);
        if ( nRead <= 0 ) break;   // read() returns -1 at end of stream
        tot += nRead;
        // Send data.
        out.write(buff, 0, nRead);
    }
    "buffSize" is 4096. This code works fine, but if the first line is commented out and the next three are uncommented, the OutOfMemory exception is thrown within the loop when "tot" is around 30 million.
    I have tried calling the garbage collector within the loop but it makes no difference. I am unable to anticipate the size of files that my customers will submit, so I cannot set the heap size in advance to cope with what they will send. Fortunately, using a simple Socket avoids the problem, so there seems to be something wrong with how URLConnection works.

    Set the URLConnection to use chunked mode. This saves it from having to buffer the entire contents before sending to ascertain its length.
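
    As a sketch of that suggestion (nothing more; the URL and file arguments here are placeholders): HttpURLConnection provides setChunkedStreamingMode (and setFixedLengthStreamingMode when the length is known up front), and either one stops it from buffering the whole request body just to work out the Content-Length:

    import java.io.*;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class StreamingUpload {
        public static void upload(String strURL, File file) throws IOException {
            HttpURLConnection con = (HttpURLConnection) new URL(strURL).openConnection();
            con.setRequestMethod("POST");
            con.setDoOutput(true);
            // Either of these keeps HttpURLConnection from buffering the whole body in memory:
            con.setChunkedStreamingMode(4096);                 // sends Transfer-Encoding: chunked
            // con.setFixedLengthStreamingMode(file.length()); // or declare Content-Length up front

            try (OutputStream out = new BufferedOutputStream(con.getOutputStream());
                 InputStream in = new BufferedInputStream(new FileInputStream(file))) {
                byte[] buffer = new byte[4096];
                int read;
                while ((read = in.read(buffer)) != -1) {
                    out.write(buffer, 0, read);
                }
            }
            // Reading the response code completes the exchange.
            System.out.println("Server responded: " + con.getResponseCode());
        }
    }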

  • Posting large amount of data via sync web service

    Hi,
    I'm trying to post large data via a sync SOAP adapter with the receiver as an ABAP proxy. I'm getting a "log version" status in SXMB_MONI whenever I try to post data >5k. Any idea how to trace it? If it is a timeout, shouldn't there be an error message in SXMB_MONI instead of "log version"? Thanks.

    Hi,
    This message occurs when a process on the receiving system is somehow locked and does not return a response.
    Since your receiving system is an ABAP system, go to SM66 and SM12 in the ABAP system. In SM66 look for transactions which have been running for a very long time, and in SM12 look for locks that interfere with the processing of the message on the ABAP system.
    It may also be that the receiving system was locked at the time the request was sent, and therefore no response was received.
    Also check this blog:
    /people/michal.krawczyk2/blog/2005/05/10/xi-i-cannot-see-some-of-my-messages-in-the-sxmbmoni
    Hope it helps.
    Regards,
    Rohit

  • URLConnection POST to external server requiring keep-alive fails because request is HTTP/1.0

              I have a class that when run as a "main" transmits a HTTP/1.1 post successfully
              to an external server. This external server requires keep-alive connections.
              However when instantiated inside a weblogic servlet container, the post fails
              because the HTTP protocol is set to HTTP/1.0. I have tried this with V5.1 SP11
              and then with V6.1 SP2 with the same result. The code works under Tomcat.
              I can find no way to force HTTP/1.1 in the URLConnection. Any suggestions?
              

    Great. I have a question for the BEA folks, if they ever read this newsgroup:
              what is the reason for installing the WLS protocol handlers, and, if there is
              one, why is the implementation still buggy? I have seen many, many instances where
              code making outgoing connections failed to work in WLS, and the solution is
              always the same - use the handler which comes with the JVM.
              Bob Bowman <[email protected]> wrote:
              > <[email protected]> wrote:
              >>If it works as a standalone application and fails inside WebLogic, most
              >>likely this
              >>is caused by WebLogic http handler implementation. You can try to modify
              >>your code
              >>like this:
              >>
              >>URL url = new URL(null, "http://some_url", new sun.net.www.protocol.http.Handler());
              >>HttpURLConnection conn = (HttpURLConnection)url.openConnection();
              >>
              >>(you will need to modify weblogic.policy to allow your code to specify
              >>protocol
              >>handler).
              >>
              >>Bob Bowman <[email protected]> wrote:
              >>
              >>> I have a class that when run as a "main" transmits a HTTP/1.1 post
              >>successfully
              >>> to an external server. This external server requires keep-alive connections.
              >>> However when instantiated inside a weblogic servlet container, the
              >>post fails
              >>> because the HTTP protocol is set to HTTP/1.0. I have tried this with
              >>V5.1 SP11
              >>> and then with V6.1 SP2 with the same result. The code works under
              >>Tomcat.
              >>
              >>> I can find no way to force HTTP/1.1 in the URLConnection. Any suggestions?
              >>
              >>--
              >>Dimitri
              > Worked like a champ! Thanks.
              Dimitri
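
    Pulled out of the quoting above, the suggested workaround amounts to roughly this sketch (it relies on the JDK's own sun.net.www.protocol.http.Handler, and weblogic.policy has to allow the code to specify a protocol handler; the URL is a placeholder):

    // Bypass WebLogic's HTTP protocol handler by passing the JVM's handler explicitly.
    URL url = new URL(null, "http://some_url", new sun.net.www.protocol.http.Handler());
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestMethod("POST");
    conn.setDoOutput(true);
    // ... write the request body and read the response as usual; the request should now
    // go out over the JDK handler, which speaks HTTP/1.1 and supports keep-alive.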
              

  • Error while doing an expdp on a large datafile

    Hello,
    I tried an export using expdp in Oracle 10g Express Edition. It was working perfectly until the DB size reached 2.1 GB. I got the following error message:
    ---------------- Start of error message ----------------
    Connected to: Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    Starting "SERVICE_2_8"."SYS_EXPORT_SCHEMA_05": service_2_8/******** LOGFILE=3_export.log DIRECTORY=db_pump DUMPFILE=service_2_8.dmp CONTENT=all
    Estimate in progress using BLOCKS method...
    Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
    ORA-39125: Worker unexpected fatal error in KUPW$WORKER.GET_TABLE_DATA_OBJECTS while calling DBMS_METADATA.FETCH_XML_CLOB []
    ORA-01116: error in opening database file 5
    ORA-01110: data file 5: '/usr/lib/oracle/xe/oradata/service_3_0.dbf'
    ORA-27041: unable to open file
    Linux Error: 13: Permission denied
    Additional information: 3
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 95
    ORA-06512: at "SYS.KUPW$WORKER", line 6235
    ----- PL/SQL Call Stack -----
    object line object
    handle number name
    0x3b3ce18c 14916 package body SYS.KUPW$WORKER
    0x3b3ce18c 6300 package body SYS.KUPW$WORKER
    0x3b3ce18c 9120 package body SYS.KUPW$WORKER
    0x3b3ce18c 1880 package body SYS.KUPW$WORKER
    0x3b3ce18c 6861 package body SYS.KUPW$WORKER
    0x3b3ce18c 1262 package body SYS.KUPW$WORKER
    0x3b0f9758 2 anonymous block
    Job "SERVICE_2_8"."SYS_EXPORT_SCHEMA_05" stopped due to fatal error at 03:04:34
    ---------------- End of error message ----------------
    SELinux was disabled completely and I have set permissions of 0777 on the appropriate datafile.
    Still, it is not working.
    Can you please tell me how to solve this problem or do you have any ideas or suggestions regarding this?

    Hello rgeier,
    I cannot access this tablespace which is service_3_0 (2.1 gb) through a php web-application or through sqlplus. I can access a small tablespace which is service_2_8 through the web-application or through sqlplus. When I tried to access service_3_0 through sqlplus, the following error message was returned:
    ---------------- Start of error message ----------------
    ERROR at line 1:
    ORA-01116: error in opening database file 5
    ORA-01110: data file 5: '/usr/lib/oracle/xe/oradata/service_3_0.dbf'
    ORA-27041: unable to open file
    Linux Error: 13: Permission denied
    Additional information: 3
    ---------------- End of error message ----------------
    The following are the last set of entries in the alert_XE.log file in the bdump folder:
    ---------------- Start of alert log ----------------
    db_recovery_file_dest_size of 40960 MB is 9.96% used. This is a
    user-specified limit on the amount of space that will be used by this
    database for recovery-related files, and does not reflect the amount of
    space available in the underlying filesystem or ASM diskgroup.
    Wed Aug 20 05:13:59 2008
    Completed: alter database open
    Wed Aug 20 05:19:58 2008
    Shutting down archive processes
    Wed Aug 20 05:20:03 2008
    ARCH shutting down
    ARC2: Archival stopped
    The value (30) of MAXTRANS parameter ignored.
    kupprdp: master process DM00 started with pid=27, OS id=7463 to execute - SYS.KUPM$MCP.MAIN('SYS_EXPORT_SCHEMA_06', 'SERVICE_2_8', 'KUPC$C_1_20080820054031', 'KUPC$S_1_20080820054031', 0);
    kupprdp: worker process DW01 started with worker id=1, pid=28, OS id=7466 to execute - SYS.KUPW$WORKER.MAIN('SYS_EXPORT_SCHEMA_06', 'SERVICE_2_8');Wed Aug 20 05:40:48 2008
    The value (30) of MAXTRANS parameter ignored.
    The value (30) of MAXTRANS parameter ignored.
    The value (30) of MAXTRANS parameter ignored.
    The value (30) of MAXTRANS parameter ignored.
    The value (30) of MAXTRANS parameter ignored.
    ---------------- End of alert log ----------------

  • Help in sql loader loading a large datafile.

    Hi All
    Could someone help me load an XLS file, which is about 250MB and has about 2,981,807 rows, into a database table?
    When I try to open the XLS file it opens with only 65,000 records, but I have a lot more records in it.
    thanks in advance

    Hi, I used SQL*Loader, but the file is too big to handle; it cannot even be opened in Notepad or as an XLS sheet...
    Is there any other way to split the large file?

  • How to Split a Large datafile

    Hi Folks,
    I have a problem here with Oracle 10g running on Windows 2003 (32-bit). We have a datafile (.dbf) of size 21GB and we are not able to copy it during a cold backup; the OS throws the error below, so we are working with the OS vendor on this. Please let me know if there is any better way to split datafiles in Oracle.
    OS error: insufficient system resources exist to complete the requested service.
    Thanks
    Raghu

    There is no direct Oracle command to split a datafile. Instead you have to use a workaround.
    Is your 21 GB datafile completely occupied by data? Check with the query given below:
    select tablespace_name, bytes/1024/1024 from dba_segments where tablespace_name='21 GB tablespace';
    A) Shrink all the objects in that datafile. With this you will be able to free space from the objects by resetting the high-water mark.
    If the file is not completely occupied, resize it with the command: alter database datafile 'file_name' resize 20g; Reduce it until you get an error message that the datafile contains data beyond the resize value.
    B) Create one more tablespace and move a few of the tables from the 21 GB tablespace to the new tablespace, then rebuild the moved tables' indexes:
    alter table table_name move tablespace new_tablespace_name;
    alter index index_name_of_moved_table rebuild tablespace new_tablespace_name;
    Then resize the 21 GB datafile.

  • Slow http post / unable to post large files

    We have a configuration with an ASA 5520 running release 7.1.2 and a web server on the DMZ. On the web server we can post a file through an HTTP connection; in this HTTP connection we select the file to upload from the HTTP client to the web server.
    The upload works for very small files, but not for files larger than about 100KB. In the logs I don't see any errors; I just see that the web server IP with source port 80 builds a connection to the client, which is torn down after a few minutes.
    Any idea?
    Thanks for your help.

    Hi.
    The ASA will either pass or drop the packet; it will not queue it. The likely culprit is the "ident" protocol. You need to check two things.
    1. Some HTTP servers require that the originating client IP have a proper in-addr.arpa (reverse DNS) record. You can check this by doing an
    nslookup on the IP initiating the request.
    2. Some HTTP servers try to identify the userid of the process that initiated the request. This is the IDENT protocol. If this is the case, you should see a teardown on port 113. If you see something like that, then issue the command
    service resetinbound outside
    --Pls rate if this helps--

  • Large Datafile Sizes with Oracle 10g on ECC 6.0 SR3

    We installed ECC 6.0 SR3 on Oracle 10.2.0.2 and are wondering if the space usage is normal.  We've allocated 125 GB for the partition, and the datafiles have the following sizes:
    Sapdata1: 30.5 GB
    Sapdata2: 26.2 GB
    Sapdata3: 31.5 GB
    Sapdata4: 28.6 GB
    This leaves us with 7GB of free space on the database drive. 116GB of used space for the datafiles of a brand new SAP installation does not seem right. It doesn't seem normal to me, coming from an MS SQL Server background where around 75 GB is sufficient.
    Any feedback or assistance will be greatly appreciated....

    > We installed ECC 6.0 SR3 on Oracle 10.2.0.2 and are wondering if the space usage is normal.  We've allocated 125 GB for the partition, and the datafiles have the following sizes:
    >
    > Sapdata1: 30.5 GB
    > Sapdata2: 26.2 GB
    > Sapdata3: 31.5 GB
    > Sapdata4: 28.6 GB
    > This leaves us with 7GB of free space on the database drive. 116GB of used space for the datafiles of a brand new SAP installation does not seem right. It doesn't seem normal to me, coming from an MS SQL Server background where around 75 GB is sufficient.
    Oracle doesn't use the datafiles the same way SQL Server does. If you e.g. drop a table from the database, the space is still occupied, whereas on SQL Server the allocated space is decreased automatically.
    Oracle uses "extents" to expand the space a table consumes, and the size of an extent may be much more than is actually necessary.
    Those 116 GB are not filled completely; they are allocated on the filesystem.
    Markus

  • Big Datafile Creation

    Hi all,
    My OS: Windows Server 2003
    Oracle Version: 10.2.0.1.0
    Is there a possibility to add a big datafile of more than 30G?
    Regards,
    Vikas

    Vikas Kohli wrote:
    Thanks for your help,
    but if I already have a tablespace, every time it is about to fill up I need to add a 30g datafile. Is there any possibility that I can specify a big datafile, or do I need to create a new bigfile tablespace and move the tables from the old tablespace to the new one?
    You have to understand that a bigfile tablespace is a tablespace with a single, but very large, datafile.
    Have you read the link I posted before?
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14231/tspaces.htm#i1010733

  • Is there a restriction on length of Http request posted to a Iplanet Web Server ?

    I am submitting an Http Request by POST method to my server. The request will be received and forwarded by Iplanet Web Server to a clustered Weblogic environment, which will then be handled by Java Servlets. The problem I am facing is that when the length of the request shoots up to around 2000 bytes, NES doesn't receive and forward the request to Weblogic. There is no fixed cut-off for the length, it keeps fluctuating.
    I observed the following after hitting the server hundreds of times with Http Requests that vary in length.
    1) If length of Request is below 1500 bytes, request always goes through successfully.
    2) In a range of 1500 bytes to 2000 bytes, request fails most of the times. (Around 80 %)
    3) Requests with length above 2000 bytes (2K) invariably fail, though it just worked for a 2.4K request on one occasion (out of some 100 hits).
    Note that by "length of request", I mean the data I post through Http Request (I just have a single parameter in Http request having the literal "XML" as key and a well formed XML document as value. That is equivalent to submitting a HTML form just having a text area called "XML" containing a XML document. )
    I presume that the length of actual Http Request is slightly greater than that of XML (probably by around 100 bytes).
    My Iplanet documentation says that the upper limit on the length of a POST Http Request is 64 K, which is way above the value (2-3 K) at which I am facing the problem.

    Hi Ganesh
    Did you check the HTTP persistent connection timeout of your server? Check the value which has been set by default, then try increasing the timeout value and try again.
    Follow these steps; it might solve your problem:
    (1) Go to the Web Server Administration Server and select the server you want to manage.
    (2) Select Preferences >> Performance Tuning.
    (3) Set the HTTP Persistent Connection Timeout to a value of your choice (e.g. 180 sec for three minutes). Note: if you are posting a large amount of data or a large file, increase the value accordingly.
    (4) Apply the changes and restart the server.
    Setting the timeout to a lower value, however, may prevent the transfer of large files, as the timeout does not refer to the time that the connection has been idle. For example, if you are using a 2400 baud modem and the request timeout is set to 180 seconds, then the maximum file size that can be transferred before the connection is closed is 432,000 bits (2400 multiplied by 180; a quick check of this figure is sketched at the end of this reply).
    If this does not solve your problem, notify me.
    regards
    T.Raghulan
    [email protected]
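
    For what it's worth, a quick check of that arithmetic (the 432,000-bit figure works out to roughly 52 KB):

    // Back-of-the-envelope check: link speed (bits/sec) * timeout (sec) bounds the transfer size.
    int baud = 2400;          // bits per second
    int timeoutSeconds = 180; // HTTP persistent connection timeout
    long maxBits = (long) baud * timeoutSeconds;              // 432000 bits
    System.out.println(maxBits / 8 + " bytes, about " + (maxBits / 8 / 1024) + " KB");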
