Java.io.IOException during large file processing on PI 7.1

Hello Colleagues,
For a large-file scenario on our PI 7.1 system, we have to verify that we are able to process big files through PI.
While handing over a large file (200 MB XML) from the Adapter Framework (File Adapter) to the Integration Engine, we receive the following error:
Transmitting the message to endpoint http://<host>:<port>/sap/xi/engine?type=entry using connection File_http://sap.com/xi/XI/System failed, due to: com.sap.engine.interfaces.messaging.api.exception.MessagingException: Error transmitting the message over HTTP. Reason: java.io.IOException: Error writing to server.
Message processing stops and the message remains in the Adapter Framework. Files up to 100 MB we are able to process successfully.
Could you please tell me why this happens and how we can solve it?
Since it is not a java.lang.OutOfMemoryError but an IOException, I am unsure whether it could still be a memory issue.
Many thanks in advance!
Regards,
Jochen

Hi Jochen,
Indeed, it is an I/O error: the Adapter Engine was not able to send the message to the Integration Server. This typically happens due to memory/heap size issues.
Look at these threads; they describe the same problem. Please try the remedies suggested there:
Mail to Proxy scenario with attachment. AF channel error.
Error with huge file
problem with big file in file-to-proxy scenario
Is there any additional information in the Adapter Engine messaging tool?
Regards
Suraj
Edited by: S.R.Suraj on Oct 1, 2009 8:55 AM
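A common reason large payloads exhaust the heap is that the whole message is buffered as one byte array before transmission. As a general illustration only (this is not PI's actual internal code), a fixed-size streaming copy keeps memory usage constant regardless of file size:

```java
import java.io.*;

class StreamCopy {
    // Copy in fixed-size chunks so heap usage stays constant
    // regardless of payload size (no single 200 MB byte[]).
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8 * 1024]; // 8 KB reusable buffer
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] payload = new byte[1_000_000]; // stand-in for a large message
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(payload), sink);
        System.out.println("copied " + copied + " bytes"); // copied 1000000 bytes
    }
}
```

Whether a given adapter actually streams or buffers is configuration- and version-dependent, which is why the heap tuning suggested in the linked threads matters.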

Similar Messages

  • Java.io.IOException: There is no process to read data written to a pipe.

Hi all,
I am facing a problem when I run my application.
I am using JDK 1.3 and Tomcat 4.0.3.
My application works absolutely fine, but when I check Tomcat's
local_host log file I find the following stack trace in it:
    2006-01-04 10:59:00 StandardWrapperValve[default]: Servlet.service() for servlet default threw exception
    java.io.IOException: There is no process to read data written to a pipe.
         at java.net.SocketOutputStream.socketWrite(Native Method)
         at java.net.SocketOutputStream.write(SocketOutputStream.java(Compiled Code))
         at org.apache.catalina.connector.ResponseBase.flushBuffer(ResponseBase.java(Compiled Code))
         at org.apache.catalina.connector.ResponseBase.write(ResponseBase.java(Compiled Code))
         at org.apache.catalina.connector.ResponseBase.write(ResponseBase.java(Compiled Code))
         at org.apache.catalina.connector.ResponseStream.write(ResponseStream.java:312)
         at org.apache.catalina.connector.http.HttpResponseStream.write(HttpResponseStream.java:189)
         at org.apache.catalina.servlets.DefaultServlet.copyRange(DefaultServlet.java:1903)
         at org.apache.catalina.servlets.DefaultServlet.copy(DefaultServlet.java:1652)
         at org.apache.catalina.servlets.DefaultServlet.serveResource(DefaultServlet.java:1197)
         at org.apache.catalina.servlets.DefaultServlet.doGet(DefaultServlet.java:519)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:740)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:247)
         at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:193)
         at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:243)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:566)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:472)
         at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:943)
         at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:190)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:566)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:472)
         at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:943)
         at org.apache.catalina.core.StandardContext.invoke(StandardContext.java:2343)
         at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:180)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:566)
         at org.apache.catalina.valves.ErrorDispatcherValve.invoke(ErrorDispatcherValve.java:170)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:564)
         at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:170)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:564)
         at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:468)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:564)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:472)
         at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:943)
         at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:174)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:566)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:472)
         at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:943)
         at org.apache.catalina.connector.http.HttpProcessor.process(HttpProcessor.java(Compiled Code))
         at org.apache.catalina.connector.http.HttpProcessor.run(HttpProcessor.java:1107)
         at java.lang.Thread.run(Thread.java:498)
    2006-01-04 10:59:00 ErrorDispatcherValve[localhost]: Exception Processing ErrorPage[exceptionType=java.lang.Exception, location=/error]
    java.lang.IllegalStateException
         at java.lang.RuntimeException.<init>(RuntimeException.java:39)
         at java.lang.IllegalStateException.<init>(IllegalStateException.java:36)
         at org.apache.catalina.connector.ResponseFacade.reset(ResponseFacade.java:243)
         at org.apache.catalina.valves.ErrorDispatcherValve.custom(ErrorDispatcherValve.java:384)
         at org.apache.catalina.valves.ErrorDispatcherValve.throwable(ErrorDispatcherValve.java:250)
         at org.apache.catalina.valves.ErrorDispatcherValve.invoke(ErrorDispatcherValve.java:178)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:564)
         at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:170)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:564)
         at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:468)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:564)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:472)
         at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:943)
         at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:174)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:566)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:472)
         at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:943)
         at org.apache.catalina.connector.http.HttpProcessor.process(HttpProcessor.java(Compiled Code))
         at org.apache.catalina.connector.http.HttpProcessor.run(HttpProcessor.java:1107)
         at java.lang.Thread.run(Thread.java:498)
What I don't get is that nowhere in the entire stack trace am I able to locate which of my application files is causing the errors.
I searched the net and found a few possible root causes, but I am not able to find out exactly which class file is causing the stack trace.
Any suggestions are most welcome.
Thanks in advance.

Did you do something strange like writing the object out using the servlet response's output stream and then attempt to redirect or forward the user to another page? That is usually how the IllegalStateException gets generated. You would still see a valid response from the caller's perspective, but since you attempted to forward or redirect after data had already been written to the stream on the server, an exception is thrown there.
    - Saish
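Saish's point can be illustrated with a simplified model. This is a hypothetical sketch, not the real servlet API: once the response buffer fills and is flushed to the client, the response is "committed", and the reset that a forward/redirect needs becomes impossible:

```java
class MockResponse {
    private final StringBuilder buffer = new StringBuilder();
    private boolean committed = false;
    private final int bufferSize = 8; // tiny buffer for demonstration

    void write(String s) {
        buffer.append(s);
        if (buffer.length() >= bufferSize) flush(); // data goes on the wire
    }

    void flush() { committed = true; }

    boolean isCommitted() { return committed; }

    // A forward/redirect must reset the response first;
    // that is impossible once bytes have already been sent.
    void reset() {
        if (committed)
            throw new IllegalStateException("response already committed");
        buffer.setLength(0);
    }

    public static void main(String[] args) {
        MockResponse r = new MockResponse();
        r.write("a little"); // 8 chars -> buffer full -> auto-flush -> committed
        try {
            r.reset(); // what a subsequent forward/redirect attempts
        } catch (IllegalStateException e) {
            System.out.println("IllegalStateException: " + e.getMessage());
        }
    }
}
```

The fix is the same in the real container: decide between writing the body yourself and forwarding/redirecting, never both.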

  • Java.io.IOException: Not enough files, skip(42225498) failed

    dear expert,
During the migration of SAP CRM, in the "Import Java Dump" phase, the following error occurs:
    Apr 21, 2008 12:18:45 PM com.sap.inst.jload.Jload logStackTrace
    SEVERE: java.io.IOException: Not enough files, skip(42225498) failed
            at com.sap.inst.jload.io.SplitInputStream.skip(SplitInputStream.java:368)
            at com.sap.inst.jload.io.SplitInputStream.close(SplitInputStream.java:193)
            at com.sap.inst.jload.io.ChoppedInputStream.close(ChoppedInputStream.java:61)
            at java.io.BufferedInputStream.close(BufferedInputStream.java:398)
            at java.io.FilterInputStream.close(FilterInputStream.java:159)
            at com.sap.inst.jload.io.DataFile.closeDataInputStream(DataFile.java:169)
            at com.sap.inst.jload.db.DBTable.load(DBTable.java:301)
            at com.sap.inst.jload.Jload.dbImport(Jload.java:323)
            at com.sap.inst.jload.Jload.executeJob(Jload.java:397)
            at com.sap.inst.jload.Jload.main(Jload.java:621)
            at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
            at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
            at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
            at java.lang.reflect.Method.invoke(Method.java:324)
            at com.sap.engine.offline.OfflineToolStart.main(OfflineToolStart.java:81)
    Apr 21, 2008 12:18:57 PM com.sap.inst.jload.db.DBConnection disconnect
I already increased ulimit nofiles to 5000.
All migration source files are already on local disk.
Our system is HP-UX with Oracle.
Thanks for your help.
    rgds
    echo

Hi, thanks a lot for the help.
Anyway, can you guide me on how to set ulimit permanently on HP-UX?
    current ulimit configuration :
    ulimit -a
    time(seconds)        unlimited
    file(blocks)         unlimited
    data(kbytes)         1048576
    stack(kbytes)        131072
    memory(kbytes)       unlimited
    coredump(blocks)     4194303
    nofiles(descriptors) 60000
    thanks a lot
    rgds
    echo
    Edited by: echo haryono on Apr 22, 2008 11:35 AM

  • Large file processing in file adapter

    Hi,
We are trying to process a large file of ~280 MB and we are getting timeout errors. I applied all the recommended tunings for memory and heap sizes, and still the problem persists. I want to know whether installing a decentral Adapter Engine just for this large-file processing might solve the problem, which I doubt.
Based on my personal experience, there may be a practical file-size limit in XI of about 100 MB, even with minimal mapping and no BPM.
    Any comments on this would be appreciated.
    Thanks
    Steve

    Dear Steve,
    This might help you,
    Topic #3.42
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/70ada5ef-0201-0010-1f8b-c935e444b0ad#search=%22XI%20sizing%20guide%22
    /people/sap.user72/blog/2004/11/28/how-robust-is-sap-exchange-infrastructure-xi
This sizing guide and its memory calculations will be useful for dealing further with this issue.
    http://help.sap.com/bp_bpmv130/Documentation/Planning/XISizingGuide.pdf#search=%22Message%20size%20in%20SAP%20XI%22
File Adapter: Size of your processed messages
    Regards
    Agasthuri Doss

  • Servlet request terminated with IOException:java.io.IOException: There is no process to read data written to a pipe.

    Hi,
    I am getting this following error. Could anyone please throw some light.
    Thanks
    Nilesh
    <HTTP> Servlet request terminated with IOException:
    java.io.IOException: There is no process to read data written to a pipe.
         at java.net.SocketOutputStream.socketWrite(Native Method)
         at java.net.SocketOutputStream.write(SocketOutputStream.java(Compiled Code))
         at weblogic.servlet.internal.ChunkUtils.writeChunks(ChunkUtils.java(Compiled
    Code))
         at weblogic.servlet.internal.ResponseHeaders.writeHeaders(ResponseHeaders.java(Compiled
    Code))
         at weblogic.servlet.internal.ServletResponseImpl.writeHeaders(ServletResponseImpl.java(Compiled
    Code))
         at weblogic.servlet.internal.ServletOutputStreamImpl.flush(ServletOutputStreamImpl.java(Compiled
    Code))
         at weblogic.servlet.internal.ServletOutputStreamImpl.finish(ServletOutputStreamImpl.java(Compiled
    Code))
         at weblogic.servlet.internal.ServletContextManager.invokeServlet(ServletContextManager.java(Compiled
    Code))
         at weblogic.socket.MuxableSocketHTTP.invokeServlet(MuxableSocketHTTP.java(Compiled
    Code))
         at weblogic.socket.MuxableSocketHTTP.execute(MuxableSocketHTTP.java(Compiled
    Code))
         at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:129)

    I forgot to mention.
    I am using Weblogic 5.1 with SP 9
    Nilesh
    "Nilesh Shah" <[email protected]> wrote:
    >
    Hi,
    I am getting this following error. Could anyone please throw some light.
    Thanks
    Nilesh
    <HTTP> Servlet request terminated with IOException:
    java.io.IOException: There is no process to read data written to a pipe.
         at java.net.SocketOutputStream.socketWrite(Native Method)
         at java.net.SocketOutputStream.write(SocketOutputStream.java(Compiled
    Code))
         at weblogic.servlet.internal.ChunkUtils.writeChunks(ChunkUtils.java(Compiled
    Code))
         at weblogic.servlet.internal.ResponseHeaders.writeHeaders(ResponseHeaders.java(Compiled
    Code))
         at weblogic.servlet.internal.ServletResponseImpl.writeHeaders(ServletResponseImpl.java(Compiled
    Code))
         at weblogic.servlet.internal.ServletOutputStreamImpl.flush(ServletOutputStreamImpl.java(Compiled
    Code))
         at weblogic.servlet.internal.ServletOutputStreamImpl.finish(ServletOutputStreamImpl.java(Compiled
    Code))
         at weblogic.servlet.internal.ServletContextManager.invokeServlet(ServletContextManager.java(Compiled
    Code))
         at weblogic.socket.MuxableSocketHTTP.invokeServlet(MuxableSocketHTTP.java(Compiled
    Code))
         at weblogic.socket.MuxableSocketHTTP.execute(MuxableSocketHTTP.java(Compiled
    Code))
         at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:129)

  • File Splitting for Large File processing in XI using EOIO QoS.

    Hi
I am currently working on a scenario to split a large file (700 MB) using the sender file adapter's "Recordset Structure" property (e.g. Row,5000). As the splits are mapped, they are appended to a destination file. In an example scenario, if a 700 MB file comes in with 20000 records, the destination file should end up with 20000 records.
To ensure no records are missed during processing through XI, the EOIO quality of service is used. A trigger record (with the same structure as the main payload recordset) is appended to the incoming file by a UNIX shell script before it is read by the sender file adapter.
XPath conditions are evaluated in the receiver determination to either append the records to the main destination file or create a trigger file containing only the trigger record.
The problem we face is that "Recordset Structure" (e.g. Row,5000) splits in chunks of 5000, and when the remaining records of the main payload number fewer than 5000 (say 1300), those remaining 1300 lines get grouped with the trigger record and written to the trigger file instead of the actual destination file.
For the sake of this forum I have listed a sample XML file representing the inbound file, with the last record (Duns = "9999") as the trigger record that marks the end of the file after splitting and appending.
    <?xml version="1.0" encoding="utf-8"?>
    <ns:File xmlns:ns="somenamespace">
    <Data>
         <Row>
              <Duns>"001001924"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001925"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001926"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001927"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001928"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"001001929"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
         <Row>
              <Duns>"9999"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Cage_Code>"3NQN1"</Cage_Code>
              <Extract_Code>"A"</Extract_Code>
         </Row>
    </Data>
    </ns:File>
In the sender file adapter I have, for test purposes, changed the "Recordset Structure" to "Row,5" for the sample inbound XML file above.
I have two XPath expressions in the receiver determination to take the last recordset with Duns = "9999" and send it to the receiver (communication channel) that creates the trigger file.
In my test case the first 5 records get appended to the correct destination file, but the last two records (the 6th and 7th) get sent to the receiver channel that is only supposed to take the trigger record (the last record, with Duns = "9999").
Destination file (this is where all the records with Duns NE "9999" are supposed to get appended):
    <?xml version="1.0" encoding="UTF-8"?>
    <R3File>
         <R3Row>
              <Duns>"001001924"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
         <R3Row>
              <Duns>"001001925"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
         <R3Row>
              <Duns>"001001926"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
          <Extract_Code>"A"</Extract_Code>
         </R3Row>
     <R3Row>
              <Duns>"001001927"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
     <R3Row>
              <Duns>"001001928"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Extract_Code>"A"</Extract_Code>
         </R3Row>
    </R3File>
    Trigger File:
    <?xml version="1.0" encoding="UTF-8"?>
    <R3File>
     <R3Row>
              <Duns>"001001929"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Ccr_Extract_Code>"A"</Ccr_Extract_Code>
         </R3Row>
     <R3Row>
              <Duns>"9999"</Duns>
              <Duns_Plus_4>""</Duns_Plus_4>
              <Ccr_Extract_Code>"A"</Ccr_Extract_Code>
         </R3Row>
    </R3File>
I've tested the XPath condition in XML Spy and it works fine. My doubts are about the "Recordset Structure" property set to "Row,5".
    Any suggestions on this will be very helpful.
    Thanks,
    Mujtaba
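The grouping behaviour described above can be reproduced outside of XI. This is an illustrative sketch of fixed-size chunking (hypothetical helper, not the adapter's actual code): the remainder records inevitably land in the same final chunk as the appended trigger record, so routing by chunk position cannot separate them; only a condition on the record content (the Duns value itself) can.

```java
import java.util.*;

class RecordChunker {
    // Partition rows into chunks of at most 'size', like "Recordset Structure: Row,N".
    static List<List<String>> chunk(List<String> rows, int size) {
        List<List<String>> out = new ArrayList<>();
        for (int i = 0; i < rows.size(); i += size)
            out.add(new ArrayList<>(rows.subList(i, Math.min(i + size, rows.size()))));
        return out;
    }

    public static void main(String[] args) {
        // 6 data rows plus the trigger row appended at the end, as in the sample file.
        List<String> rows = new ArrayList<>();
        for (int i = 0; i < 6; i++) rows.add("00100192" + (4 + i));
        rows.add("9999"); // trigger record

        List<List<String>> parts = chunk(rows, 5); // "Row,5"
        // The last chunk holds the leftover data rows AND the trigger:
        System.out.println(parts.get(parts.size() - 1));
    }
}
```

This is why an XPath test on the whole last message routes the leftover data rows to the trigger file along with the trigger record.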

    Hi Debnilay,
We do have a 64-bit architecture and still we have the file-processing problem. Currently we are splitting the file into smaller chunks and processing them, but we want to process the file as a whole.
    Thanks
    Steve

  • Bottleneck in Large file processing

    Hi,
We are experiencing timeout and memory issues in large-file processing. I want to know whether the J2EE Adapter Engine or the Integration Engine is the bottleneck when processing large messages, such as files over 300 MB, without splitting them.
    Thanks
    Steve

    Hi Mario,
We are testing a scenario to find out the maximum file size that XI can handle, based on the blog
( /people/william.li/blog/2006/09/08/how-to-send-any-data-even-binary-through-xi-without-using-the-integration-repository ) without any mapping. Up to 20 MB it works OK; after that we get a timeout error.
    Data from Moni:
    com.sap.engine.services.httpserver.exceptions.HttpIOException: Read timeout. The client has disconnected or a synchronization error has occurred. Read [1704371] bytes. Expected [33353075]. at com.sap.engine.services.httpserver.server.io.HttpInputStream.read(HttpInputStream.java:186) at com.sap.aii.af.service.util.ChunkedByteArrayOutputStream.write(ChunkedByteArrayOutputStream.java:181) at com.sap.aii.af.ra.ms.transport.TransportBody.<init>(TransportBody.java:99) at com.sap.aii.af.ra.ms.impl.core.transport.http.MessagingServlet.doPost
This could be due to the ICM timeout settings, which we are planning to increase.
I would like to hear about others' experience with the maximum file size they could process. Of course I know it depends on the environment.
    Thanks
    Steve
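The "Read [x] bytes. Expected [y]" message in the trace above is the server-side pattern of reading until the declared length arrives and failing on a premature end of stream. As a general illustration (hypothetical helper, not SAP's actual implementation):

```java
import java.io.*;

class FullyReader {
    // Keep reading until 'expected' bytes arrive; a premature EOF mirrors the
    // "Read [x] bytes. Expected [y]" error seen when the client disconnects
    // (or a timeout fires) mid-transfer.
    static byte[] readFully(InputStream in, int expected) throws IOException {
        byte[] buf = new byte[expected];
        int off = 0;
        while (off < expected) {
            int n = in.read(buf, off, expected - off);
            if (n < 0)
                throw new IOException(
                    "Read [" + off + "] bytes. Expected [" + expected + "].");
            off += n;
        }
        return buf;
    }

    public static void main(String[] args) {
        try {
            // Stream ends after 10 bytes although 20 were announced:
            readFully(new ByteArrayInputStream(new byte[10]), 20);
        } catch (IOException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

With a 33 MB payload and a slow sender, raising the ICM/HTTP read timeout gives the loop time to receive all expected bytes before the server gives up.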

  • Java proxies for handling large files

    Dear all,
Kindly explain how to handle this step by step, as I do not know much about Java.
What is the advantage of using Java proxies here? Do we implement the split logic in Java code for handling a 600 MB file?
    please mail me the same to [email protected]

Hi Srinivas,
Check out this blog for the large-file-handling issue:
    /people/sravya.talanki2/blog/2005/11/29/night-mare-processing-huge-files-in-sap-xi
This will help you.
Please also see the documents below; they might help you:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/a068cf2f-0401-0010-2aa9-f5ae4b2096f9
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/f272165e-0401-0010-b4a1-e7eb8903501d
    /people/prasad.ulagappan2/blog/2005/06/27/asynchronous-inbound-java-proxy
    /people/rashmi.ramalingam2/blog/2005/06/25/an-illustration-of-java-server-proxy
We can also find them on your XI/PI server in these folders:
aii_proxy_xirt.jar
j2ee\cluster\server0\bin\ext\com.sap.aii.proxy.xiruntime
aii_msg_runtime.jar
j2ee\cluster\server0\bin\ext\com.sap.aii.messaging.runtime
aii_utilxi_misc.jar
j2ee\cluster\server0\bin\ext\com.sap.xi.util.misc
guidgenerator.jar
j2ee\cluster\server0\bin\ext\com.sap.guid
    Java Proxy
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/a068cf2f-0401-0010-2aa9-f5ae4b2096f9
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f272165e-0401-0010-b4a1-e7eb8903501d
Please reward points if useful.
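As for the split logic asked about above: whether it lives in a Java proxy or elsewhere, the key is to stream line by line rather than load 600 MB at once. A rough sketch under that assumption (hypothetical helper names, not SAP API):

```java
import java.io.*;
import java.nio.file.*;
import java.util.*;

class FileSplitter {
    // Split a large text file into part files of at most 'linesPerPart' lines,
    // streaming line by line so the whole file is never held in memory.
    static List<Path> split(Path source, Path targetDir, int linesPerPart)
            throws IOException {
        List<Path> parts = new ArrayList<>();
        try (BufferedReader reader = Files.newBufferedReader(source)) {
            BufferedWriter writer = null;
            String line;
            int count = 0;
            while ((line = reader.readLine()) != null) {
                if (count % linesPerPart == 0) { // start a new part file
                    if (writer != null) writer.close();
                    Path part = targetDir.resolve("part-" + parts.size() + ".txt");
                    parts.add(part);
                    writer = Files.newBufferedWriter(part);
                }
                writer.write(line);
                writer.newLine();
                count++;
            }
            if (writer != null) writer.close();
        }
        return parts;
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("split");
        Path src = dir.resolve("big.txt");
        List<String> lines = new ArrayList<>();
        for (int i = 0; i < 12; i++) lines.add("record-" + i);
        Files.write(src, lines);
        System.out.println(split(src, dir, 5).size() + " parts"); // 3 parts
    }
}
```

In a real scenario the part size would be tuned so each resulting message stays well within what the engine can process.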

  • GB crashing during large file export

GarageBand '09 is crashing during an export of a project that is 2 hours and 20 minutes long. I've tried different export settings, but the program still quits when it reaches about 3/4 of the project length.
No crashes in the past. I had been exporting the project to rough cuts at various lengths from 1 hour to 1 1/2 hours, but upon finalizing the project, which happens to total over 2 hours, it can't make it through an export.
It's an audiobook: talking only, no effects or other instruments, very straightforward.
I've tried exporting from the master .book file and from a version with all the clips joined, tracks locked, and extra material deleted from the project.
    Here is some of the crash report (just pasting the first page of the crash report).
    Process: GarageBand [2716]
    Path: /Applications/GarageBand.app/Contents/MacOS/GarageBand
    Identifier: com.apple.garageband
    Version: 5.1 (398)
    Build Info: GarageBand_App-3980000~2
    Code Type: X86 (Native)
    Parent Process: launchd [138]
    Date/Time: 2010-03-29 11:14:16.969 -0400
    OS Version: Mac OS X 10.6.2 (10C540)
    Report Version: 6
    Interval Since Last Report: 1758706 sec
    Crashes Since Last Report: 39
    Per-App Interval Since Last Report: 219501 sec
    Per-App Crashes Since Last Report: 6
    Exception Type: EXC_BREAKPOINT (SIGTRAP)
    Exception Codes: 0x0000000000000002, 0x0000000000000000
    Crashed Thread: 0 Dispatch queue: com.apple.main-thread
    Thread 0 Crashed: Dispatch queue: com.apple.main-thread
    0 com.apple.CoreFoundation 0x94dd0554 CFRelease + 196
    1 com.apple.AppKit 0x9468967a -[NSBitmapImageRep initWithFocusedViewRect:] + 2257
    2 com.apple.garageband 0x001f5fea 0x1000 + 2052074
    3 com.apple.garageband 0x00205650 0x1000 + 2115152
    4 com.apple.garageband 0x0014f7c2 0x1000 + 1370050
    5 com.apple.garageband 0x0015a58a 0x1000 + 1414538
    6 com.apple.AppKit 0x945edb6c -[NSView _drawRect:clip:] + 3721
    7 com.apple.AppKit 0x945eb238 -[NSView _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectFor View:topView:] + 2217
    8 com.apple.AppKit 0x945ebbcb -[NSView _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectFor View:topView:] + 4668
    9 com.apple.AppKit 0x945ebbcb -[NSView _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectFor View:topView:] + 4668
    10 com.apple.AppKit 0x945ebbcb -[NSView _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectFor View:topView:] + 4668
    11 com.apple.AppKit 0x945ebbcb -[NSView _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectFor View:topView:] + 4668
    12 com.apple.AppKit 0x945ebbcb -[NSView _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectFor View:topView:] + 4668
    13 com.apple.AppKit 0x945ebbcb -[NSView _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectFor View:topView:] + 4668
    14 com.apple.AppKit 0x945ebbcb -[NSView _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectFor View:topView:] + 4668
    15 com.apple.AppKit 0x945ebbcb -[NSView _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectFor View:topView:] + 4668
    16 com.apple.AppKit 0x945ebbcb -[NSView _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectFor View:topView:] + 4668
    17 com.apple.AppKit 0x945ebbcb -[NSView _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectFor View:topView:] + 4668
    18 com.apple.AppKit 0x945ebbcb -[NSView _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectFor View:topView:] + 4668
    19 com.apple.AppKit 0x945ebbcb -[NSView _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectFor View:topView:] + 4668
    20 com.apple.AppKit 0x94689b5f -[NSNextStepFrame _recursiveDisplayRectIfNeededIgnoringOpacity:isVisibleRect:rectIsVisibleRectFor View:topView:] + 311
    21 com.apple.AppKit 0x945e7111 -[NSView _displayRectIgnoringOpacity:isVisibleRect:rectIsVisibleRectForView:] + 3309
    22 com.apple.AppKit 0x94547d6e -[NSView displayIfNeeded] + 818
    23 com.apple.AppKit 0x944fb9c5 -[NSNextStepFrame displayIfNeeded] + 98
    24 com.apple.AppKit 0x94511094 -[NSWindow displayIfNeeded] + 204
    25 com.apple.AppKit 0x945425aa _handleWindowNeedsDisplay + 696
    26 com.apple.CoreFoundation 0x94e43892 __CFRunLoopDoObservers + 1186
    27 com.apple.CoreFoundation 0x94e0018d __CFRunLoopRun + 557
    28 com.apple.CoreFoundation 0x94dff864 CFRunLoopRunSpecific + 452
    29 com.apple.CoreFoundation 0x94dff691 CFRunLoopRunInMode + 97

Sorry about listing so much data from the GB crash report.
Does anyone have troubleshooting tips for crashes during large mixdowns/exports?
The file that crashes during export is over 2 hours long, but it's only 1 voice track with no effects!

  • P45 Neo2-FR hangs during large file moves

    Hello all,
I have a newly built system that I moved my old XP install onto. (I removed the old video and hardware drivers, then did a repair install on the new hardware.) That process appears to have been successful. The system runs normally most of the time, including resource-heavy 3D apps like Second Life. I have also accumulated over 32 hours running Memtest86 with no errors found (a 24-hour session and an 8+ hour one).
During the upgrade/rebuild I backed up almost 0.5 TB of data onto an external drive, and I'm now trying to move those files onto the new RAID array or a partition on the main hard drive. During any large copy operation, the system locks up. It's usually several minutes into the copy, and a hard reset is required to recover. The external drive is formatted FAT32, so I already addressed file-size limitations when I copied the data to the external. The external drive is USB/FireWire/eSATA, but so far I've only used USB on the new hardware due to a lack of cables.
Has anyone seen something like this before? I'd really like to get the files back onto my internal drives. The external is a loaner from a friend, so there's a little time pressure. Any help will be appreciated.
    Thanks
    Todd

Update on this. I have not done a reinstall to date, since I don't have a spare hard drive to do it on.
However, I can eliminate a Windows or driver problem from the list. I see exactly the same symptoms - random hangs of the machine - under Ubuntu 8 as well as Windows. The Linux install is clean, from a downloaded image, and updated as recommended by the package manager.
It seems that the most likely time for the system to hang is during periods of intense I/O activity. I had thought it was large file copies because that was what I was doing a lot of at the time of my original post.
I have tried moving my video card to the first PCI-X slot. No change.
I have removed 2 of the 4 memory modules, leaving 2x2 GB installed in bank 0. No change.
I have added extra fans - a large one on the side of the hard drive bay (the case is an Antec Sonata III) and a smaller fan aimed at the memory. No change.
I am using the JMicron RAID controller - is this still problematic? I seem to recall some mentions of this.
Any suggestions to help isolate the problem would be appreciated.

  • Large file processing in XI 3.0

    Hi,
We are trying to process a large file of ~280 MB and we are getting timeout errors. I applied all the recommended tunings for memory and heap sizes, and still the problem persists. I want to know whether installing a decentral Adapter Engine just for this file processing might solve the problem, which I doubt.
Based on my personal experience, there may be a practical file-size limit in XI of about 100 MB, even with minimal mapping and no BPM.
    Any comments on this would be appreciated.
    Thanks
    Steve

    Hi Debnilay,
We do have a 64-bit architecture and still we have the file-processing problem. Currently we are splitting the file into smaller chunks and processing them, but we want to process the file as a whole.
    Thanks
    Steve

  • External USB Hard Drive disconnects during large file transfers.

    I have a Western Digital 2TB USB Hard Drive that Ive been using perfectly for a couple of years on my MacBook Pro. A couple of months ago I bought a 27inch iMac and started using it as my main computer. So I had that hard drive connected to my iMac.
    I started to notice that during large transfers (be it many files or 1 big one), usually into the Gigs, I suddenly got a warning message saying that the Hard Drive was inappropriately disconnected. It doesnt appear in Disk Utility so I have to manually connect and disconnect the drive in order for it to show up again on the iMac. I tested it again on my MacBook and the same thing happened.
    Another problem:
    Sometimes it will disappear from the iMac without a warning message; it just doesn't show up (usually after long periods of inactivity). I then have to manually disconnect and reconnect for it to show up.
    So what do you think? Is my Hard Drive Dying?

    I've been having similar problems with Lion on both external USB2 drive docks and, more recently, a Mediasonic 4-bay FireWire RAID drive.  It was suggested that power-saving mode was shutting these down, but changing that setting didn't completely solve the problem.  I'm returning the Mediasonic for a replacement but suspect that something in the OS is causing it.
    If the replacement drive still causes the problem, I'll do a rebuild of the system, eliminating apps that were migrated over, even though the system is only 5 months old.  If anyone has solved similar issues, please let me know as well.  Thanks.

  • Network Drive Losing Connection During Large File Transfer...

    Hey, everyone.
    Recently bought an Airport Extreme and LaCie USB HDD/Hub for my hybrid Windows/Mac home network. I formatted the drive with FAT32 as instructed, and hooked it up.
    My first order of business was to back up all of my media on my Windows desktop machine, but it hasn't been working out so well. A short while after starting the 35+ gig transfer (a few minutes at most), I get an error message saying that the destination folder is no longer there. Sure enough, checking 'My Computer' shows that the drive is now a 'Disconnected Network Drive'.
    Cycling power on everything and rebooting Windows restores the network drive, but the core problem remains.
    Anyone else seen this and have any idea how to deal with it?
    Thanks in advance for any help.

    Hi,
    I'm having the same problem. Here's the relevant parts of my setup:
    Airport Extreme
    HP printer and Western Digital "My Book" 500 G USB drive connected by 4/1 usb splitter
    MacBookPro connected via Gigabit
    Windows XP PC connected via Gigabit
    I loaded my media onto the USB drive by connecting it directly to my PC; I've also loaded other iTunes media onto the USB drive from the MacBook. If I point iTunes on the PC at the USB drive, I can play the media just fine. When I try to copy large files (700 MB) from my PC to the USB drive (via the Airport), I'm told the destination folder isn't there. I've also noticed Windows tooltips telling me that it lost the internet connection, and if I have the Airport Utility open on my Mac it also loses the Airport for a few seconds. I am able to copy small files (the largest that's worked so far is 1.6 MB) to the USB drive via the Airport.

  • Large file processing issue

    Hi,
    A 2 MB source file is generating an output file of over 180 MB, causing it to fail in pre-prod and production. The same process runs successfully in the Development box, where there is no web dispatcher or restriction on size.
    The recommendation from SAP is that we try to reduce the output file size.
    Kindly look into the issue ASAP.
    Appreciate your help.
    Thanks,
    Satya Kumar

    Hi Satya,
    There are many ways are available check the below links
    /people/stefan.grube/blog/2007/02/20/working-with-the-payloadzipbean-module-of-the-xi-adapter-framework
    /people/aayush.dubey2/blog/2007/10/10/zip-transfer-unzip-increase-the-performance-of-your-java-abap-applications
    /people/pooja.pandey/blog/2005/10/17/number-formatting-to-handle-large-numbers
    /people/alessandro.guarneri/blog/2007/02/21/sap-xi-acting-as-a-huge-file-mover
    /people/alessandro.guarneri/blog/2006/03/05/managing-bulky-flat-messages-with-sap-xi-tunneling-once-again--updated
    One more option is to develop a ZIP adapter: send the zipped file, and after processing unzip the file again.
    Regards
    Ramesh
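The zip-before-transfer idea in Ramesh's reply can be sketched with the JDK's own `java.util.zip` classes. This is only an illustrative sketch of the compress/decompress round trip, not the PayloadZipBean module itself; the entry name is an assumption:

```java
import java.io.*;
import java.util.zip.*;

public class PayloadZipper {

    /** Compress a payload into a single-entry ZIP archive. */
    public static byte[] zip(String entryName, byte[] payload) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ZipOutputStream zos = new ZipOutputStream(bos)) {
            zos.putNextEntry(new ZipEntry(entryName));
            zos.write(payload);
            zos.closeEntry();
        }
        return bos.toByteArray();
    }

    /** Unzip the first entry back into a byte array on the receiving side. */
    public static byte[] unzip(byte[] archive) throws IOException {
        try (ZipInputStream zis = new ZipInputStream(new ByteArrayInputStream(archive))) {
            zis.getNextEntry();
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            byte[] buf = new byte[8192];
            int n;
            while ((n = zis.read(buf)) != -1) {
                out.write(buf, 0, n);
            }
            return out.toByteArray();
        }
    }
}
```

Repetitive XML markup usually compresses very well with deflate, which is why the linked blogs recommend zipping large payloads before moving them through XI.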

  • OutOfMemory error on large file processing in sender file adapter

    Hi Experts,
    I got file to IDOC scenario, sender side we are using remote adapter engine, files are processing fine when the size is below 5mb, if the file size is more than 5mb we are getting java.lang.OutOfMemoryError: java heap space. can anyone suggest me what parameter i need to change in order to process more than 5mb.

    Hi Praveen,
    The suggestion from SAP is not to process huge files in one go. Instead, you can break the file into chunks and process them.
    To increase the heap memory:
    For Java heap size and other memory requirements, refer to the following OSS note for details:
    Note 723909 - Java VM settings for J2EE 6.40/7.0
    Also check out OSS Note 862405 - XI Configuration Options for Lowering Memory Requirements
    There are OSS notes available for Java heap-size settings specific to your OS environment too.
    Thanks,
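Once the heap has been raised per those notes, it is easy to verify what the JVM actually received. A minimal sketch, using only the standard `Runtime` API; the `-Xmx` value in the usage line below is just an example, your actual sizing comes from the OSS notes:

```java
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long maxMb = rt.maxMemory() / (1024 * 1024);      // upper bound, set by -Xmx
        long totalMb = rt.totalMemory() / (1024 * 1024);  // heap currently committed
        long freeMb = rt.freeMemory() / (1024 * 1024);    // unused part of the committed heap
        System.out.println("Max heap:       " + maxMb + " MB");
        System.out.println("Committed heap: " + totalMb + " MB");
        System.out.println("Free heap:      " + freeMb + " MB");
    }
}
```

Run it with the same flags as the server JVM, e.g. `java -Xms256m -Xmx1024m HeapCheck`, and check that the reported max matches what you configured.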
