Xserve Data Written speeds

Due to an application upgrade for one of our departments, I had to upgrade our Xserve G5 w/megaraid from 10.3.9 to 10.5.7. Since the upgrade, I have noticed that the internal drive speed seems to be slow when our apps are attempting to do a mysqldump of the databases (internal drive to internal drive). Diagnostics don't show any problems.
When I go to Activity Monitor, it shows a Data Read speed of 8-12 MB/sec (sometimes a little lower or higher) and a Data Written speed of around 6-8 MB/sec (sometimes a little higher, but sometimes much lower, around 1-2 MB/sec).
Is this normal? I'm not sure what the "normal" data write speed is for Xserve PCI RAID hard drives. I have three 80 GB Apple drive modules in a RAID 5 via megaraid.
Thanks.

Similar problem here. I have three Macs (an iMac, a MacBook, and a PowerBook G4) connected to the same AirPort Extreme and its attached USB hard drive. Every one of them recognises the changes I made to the hard drive except the iMac, which doesn't have a clue what's going on.

Similar Messages

  • Internal Disk to Disk Data Transfer Speed Very Slow

    I have a G5 Xserve running Tiger with all updates applied that has recently started experiencing very slow Drive to Drive Data transfer speeds.
    When transferring data from one drive to another (internal to internal, internal to USB, internal to FW, USB to USB, or any other combination of the three) we are only getting about 2 GB/hr transfer speeds.
    I initially thought the internal drive was going bad. I tested the drive and found some minor header issues, etc., that could be repaired, so I replaced the internal boot drive.
    I tested again and immediately got the same issue.
    I also tried booting from a FW drive and I got the same issue.
    If I connect to the server over the Ethernet network, I get what I would expect to be typical data transfer rates of about 20+ GB/hr. That's much higher than the internal rates, and I am copying data from the same internal drives, so I really don't think the drive is the issue.
    I called AppleCare and discussed the issue with them. They said it sounded like a controller issue, so I purchased a replacement MLB from them. After replacing it, data transfer speeds jumped back to normal for about a day, maybe two.
    Now we are back to experiencing slow data transfer speeds internally (2 GB/hr) and normal transfer speeds (20+ GB/hr) over the network.
    Any ideas on what might be causing the problem would be appreciated.

    As suggested, do check for other I/O load on the spindles. And check for general system load.
    I don't know of a good built-in GUI I/O monitor here (particularly for Tiger Server), though there are iopending and DTrace, and Apple provides performance scripts (http://support.apple.com/kb/HT1992) with Leopard and Leopard Server. top would show you busy processes.
    Also look for memory errors and memory constraints and check for anything interesting in the contents of the system logs.
    The next spot after the controllers (and it's usually my first "hardware" stop for these sorts of cases, and usually before swapping the motherboard) is the disks that are involved, and whatever widgets are in the PCI slots. Loose cables, bad cables, and spindle swaps. Yes, disks can sometimes slow down like this, and that's not usually a Good Thing. I know you think this isn't the disks, but that's one of the remaining common hardware factors. And don't presume any SMART disk monitoring has predictive value; SMART can miss a number of these cases.
    (Sometimes you have to use the classic "field service" technique of swapping parts and of shutting down software pieces until the problem goes away. Then work from there.)
    And the other question is around how much time and effort should be spent on this Xserve G5 box; whether you're now in the market for a replacement G5 box or a newer Intel Xserve box as a more cost-effective solution.
    (How current and how reliable is your disk archive?)
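    If you want a crude, tool-independent number for raw sequential write throughput on a suspect volume before swapping more parts, a minimal sketch along these lines can help; the scratch-file path and sizes are placeholder assumptions, and this measures only sequential writes, not the mixed I/O a real workload generates. Compare the figures between the internal volume and a FW or USB volume.
    import java.io.FileOutputStream;
    import java.io.IOException;

    public class WriteSpeedCheck {
        public static void main(String[] args) throws IOException {
            // Scratch-file path on the volume under test -- placeholder, adjust as needed.
            String path = (args.length > 0) ? args[0] : "/Volumes/RAID/scratch.bin";
            byte[] buffer = new byte[1024 * 1024];   // write 1 MiB per call
            int totalMiB = 512;                      // total amount to write

            long start = System.nanoTime();
            FileOutputStream out = new FileOutputStream(path);
            try {
                for (int i = 0; i < totalMiB; i++) {
                    out.write(buffer);
                }
                out.getFD().sync();                  // force the data to disk, not just the cache
            } finally {
                out.close();
            }
            double seconds = (System.nanoTime() - start) / 1e9;
            System.out.printf("Wrote %d MiB in %.1f s (about %.1f MiB/s)%n",
                    totalMiB, seconds, totalMiB / seconds);
        }
    }
    If every volume reports similarly poor figures while the network path stays fast, that points more towards the controller or bus side than towards a single disk.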

  • Adobe Encore bytes used versus Data Written - ?

    Hello all!
    I'm perplexed!
    I exported a 30 second clip from Adobe Premiere CS5.5 (via AME) using the Mpeg-2 DVD setting/preset. I imported the video file as a timeline into Adobe Encore, and imported the .wav file as an asset. No menus, as this was a "first play" test. 
    When it was time to build the project, the "bytes used" read 19.5MB. However, once the build finished, the "data written" read 550MB. This concerns me.
    Q1 - Why is there such a vast difference in size between the bytes used and the data written?
    Q2 - My official project is nearly 2 hours. When I load it up, I'm sure "bytes used" will be multiple GBs, which is fine because I'm using a double-layer DVD. However, my concern is that the data written will be gigantic! Am I assuming correctly?
    If I'm right, there is a problem. As we know, the DVD won't burn beyond the space allowed. How is it possible that Encore says there is enough space with bytes used, but the data written FAR exceeds the capacity of the DVD?
    The reason I did the 30-second clip is because my first 3 attempts to burn the DVD failed. Adobe Encore freezes and won't finish, EVEN THOUGH there was enough space on the DVD according to "bytes used". I ASSUME it's because the data WRITTEN exceeds the available space.
    Is this making sense? Has anyone encountered this, or have any suggestions? Do I need to provide any more information to help assess the situation?

    3 things
    1 - I don't remember the exact number, but there is some "overhead" written to the DVD (navigation commands to the player ???)
    2 - check to be sure you did not put an entry in the ROM space, to write computer files to the DVD
    3 - I don't have a link, but I "think" I remember a really old discussion saying that there is a minimum amount that must be written to a DVD for it to work in a standalone player, and that Encore will "pad" the write if there is not enough data
    As far as Encore freezing... do the below as a test
    Create a DVD output ISO file and then use the FREE http://www.imgburn.com/index.php?act=download to write the ISO to DVD (send the author a PayPal donation if you like his program)
    Read http://forums.adobe.com/thread/1322583 for notes on installing Imgburn WITHOUT any toolbar add-ons
    When you write to disc with Imgburn, use the SLOWEST possible speed setting, so your burner has the best chance to create "good, well formed" laser burn holes... since no DVD player is required to read a burned disc, having a "good" one from a high quality blank will help
    Use Taiyo Yuden single-layer or Verbatim two-layer, or Falcon Pro for inkjet-printable two-layer

  • Problems with Photoshop performance and data transfer speed on iMac

    Two months ago I started noticing slow performance in Photoshop (above all when using the clone stamp tool) on my 27" iMac (late 2012). I ran the AHT and found that 8 GB of the 32 GB of RAM were bad.
    I removed them, but the problem didn't disappear. I also noticed that the data transfer speed (both copying and pasting from/to the internal HD and from a CF card/external HD) was really slow.
    I tried many solutions suggested by Apple support; none of them worked out. In the end, I tried uninstalling and re-installing Photoshop: no more problems!!!
    Ten days ago I received a new 8 GB RAM module and installed it back... suddenly the problem came back. I tried re-installing Photoshop again, but this time the problem persists!
    Has anyone had the same experience? All the other CC programs work well (LR, AE, Premiere...)

    Yes, it does!
    What seems very strange to me is how the data transfer speed could be affected!
    (Just to say, I've already tried resetting the SMC and PRAM, I've tried different accounts, and I've also re-installed the OS; the next step would be formatting the disk and installing the OS from scratch.)

  • How to increase AFP data transfer speed?

    When I connect to our server from a WAN source outside of our facility, the file data transfer rate is extremely slow. We just upgraded to a 10 Mbps fiber service, and that has dramatically increased our website data transfer speed. I would like to find a way to access the file server from a remote location and work, but the data transfer rate makes productivity impossible. I would appreciate hearing how other organizations are set up and functioning with file transfer sizes averaging 10 to 15 MB.
    Thanks,
    Brian
    OS X 10.6.8 Server

    If you don't have enough network bandwidth for your time requirements, you have little chance of success with the direct approach. Techniques such as file compression can only provide limited help. If you're transferring multiple copies of the files, then you can push one copy of the file to a hosted provider and serve additional copies from there.
    As for your network, a ten million bits per second network connection is the speed of first-generation Ethernet. That Ethernet was a fast network back in 1985. In the era of one billion bits per second Gigabit Ethernet, and increasingly commonly ten-gigabit Ethernet links, a 10 Mb link is glacial.
    A typical DSL network is asymmetric, meaning you'll have 10 Mb down (theoretically) and some fraction of that up. So you might not be getting that 10 Mb in the direction you're copying files. And this is best case; various ISP network links around aren't providing their rated speeds.
    AFP stinks over a WAN, and you're also opening up your file system to remote attackers.
    As for WebDAV, read this.   In addition to WebDAV, you can also try an sftp or other "simpler" copy command as a test, and see what you get for that.  (sftp is also encrypted, which has benefits, though the encryption also requires more processing time.)
    But beyond techniques such as data compression (and which may or may not be an option here) or incremental or "delta" changes to the data (which probably isn't an option here) or working locally and batching over the changes, there are few good ways to contend with a too-slow-for-your-needs link.
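    To put those numbers in perspective, here is a rough back-of-the-envelope sketch of the best-case transfer time over the link; the 85% usable-rate figure is only an assumed allowance for protocol overhead, not a measured value.
    public class LinkTimeEstimate {
        public static void main(String[] args) {
            double fileMB = 15.0;          // typical file size from the post, in megabytes
            double linkMbps = 10.0;        // nominal link speed, in megabits per second
            double usableFraction = 0.85;  // assumed fraction of the nominal rate actually usable

            double seconds = (fileMB * 8) / (linkMbps * usableFraction);
            System.out.printf("Best case: about %.0f seconds per %.0f MB file%n", seconds, fileMB);
            // On an asymmetric line the upload direction is often only a fraction of
            // 10 Mb/s, so scale the result accordingly.
        }
    }
    That works out to roughly 9-14 seconds per 10-15 MB file in the best case, and considerably longer in the upload direction of an asymmetric line.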

  • Very slow network directory listing - but fast data transfer speed once listed?

    Hello,
    I have really tried to sort this myself before opening up to the community, however I have run out of ideas, and hope someone can offer the magic solution I have missed.
    I am currently using a 3.4 GHz i7 iMac on a gigabit LAN, running OS X 10.7.2, connecting to a Windows Server 2008 R2 machine over Ethernet.
    If I go to a network directory that I haven't recently accessed, it can take up to 60 seconds to show the contents of that directory. Once I have accessed that folder, if I come out of it and go back in, it is instant again; but the first time it lists the directory it looks like I have opened an empty folder, which after anything from 10 seconds to a minute will suddenly show the files that are there.
    Internet connectivity is fast through the network, and file transfers across the LAN are fast (showing as approx. 300 Mb per second). I can play and edit HD content across the network with no slowdown, so I am confident that this issue is not related to the network speed itself and is more to do with a setting on this Mac.
    Symptoms are very similar to this post: https://discussions.apple.com/message/12245148#12245148 - however, I understand that in OS X Lion SMB was removed, so I cannot find this file to edit.
    I have tried bypassing additional hubs in the network by wiring direct cables to the switch that is connected to the file server, this made no difference.
    I have also tried disconnecting the Ethernet cable and running over Wi-Fi. This fixes the listing problem, but when editing HD content over a network drive this connection is not fast enough to carry the data without interruption (some projects are linked to up to 900 GB of HD video content!).
    Using Ethernet, I have tried DHCP, DHCP with a manual address, and manual mode. All reproduce this problem. I have tried using the Windows workgroup, and tried without it.
    I have also followed this suggestion: https://discussions.apple.com/thread/2134936?threadID=2134936&tstart=45 and used OpenDNS. This did not fix the issue.
    For argument's sake, I have also just tested a MacBook Pro running Snow Leopard to see if it was OS related. This reproduces the exact same problem: near-instant directory listing on Wi-Fi, a long and arduous wait on Ethernet.
    I cannot work out why directory listing is instant over Wi-Fi but not over Ethernet on two different Macs running two different versions of OS X. I also do not understand why, if the network is having trouble listing the directories, the data transfer speed is 300 Mbps when I copy files across the wired network from the file server to the Mac.
    Does anyone have any other ideas as to what could be the problem here? We are about to start work on a very large project, where the content we are editing is spread out across around 200 different network folders (different shoots captured over the past 2 years). We really don't have the time to wait 60 seconds each time we need to access one of those directories to look for a file, and I am very close to pulling all my hair out!
    I really look forward to hearing from anyone who can offer any insight.

    If you suspect that the Windows update had something to do with your LAN going slow, then try the following:
    1.  Look for updates for your client's LAN NIC driver; or
    2.  Uninstall the updates.

  • Java.io.IOException: There is no process to read data written to a pipe.

    Hi all
    I am facing a problem when I run my application.
    I am using JDK 1.3 and Tomcat 4.0.3.
    My application actually works absolutely fine, but when I check the localhost log file of Tomcat I find the following stack trace in it:
    2006-01-04 10:59:00 StandardWrapperValve[default]: Servlet.service() for servlet default threw exception
    java.io.IOException: There is no process to read data written to a pipe.
         at java.net.SocketOutputStream.socketWrite(Native Method)
         at java.net.SocketOutputStream.write(SocketOutputStream.java(Compiled Code))
         at org.apache.catalina.connector.ResponseBase.flushBuffer(ResponseBase.java(Compiled Code))
         at org.apache.catalina.connector.ResponseBase.write(ResponseBase.java(Compiled Code))
         at org.apache.catalina.connector.ResponseBase.write(ResponseBase.java(Compiled Code))
         at org.apache.catalina.connector.ResponseStream.write(ResponseStream.java:312)
         at org.apache.catalina.connector.http.HttpResponseStream.write(HttpResponseStream.java:189)
         at org.apache.catalina.servlets.DefaultServlet.copyRange(DefaultServlet.java:1903)
         at org.apache.catalina.servlets.DefaultServlet.copy(DefaultServlet.java:1652)
         at org.apache.catalina.servlets.DefaultServlet.serveResource(DefaultServlet.java:1197)
         at org.apache.catalina.servlets.DefaultServlet.doGet(DefaultServlet.java:519)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:740)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:247)
         at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:193)
         at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:243)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:566)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:472)
         at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:943)
         at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:190)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:566)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:472)
         at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:943)
         at org.apache.catalina.core.StandardContext.invoke(StandardContext.java:2343)
         at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:180)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:566)
         at org.apache.catalina.valves.ErrorDispatcherValve.invoke(ErrorDispatcherValve.java:170)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:564)
         at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:170)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:564)
         at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:468)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:564)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:472)
         at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:943)
         at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:174)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:566)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:472)
         at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:943)
         at org.apache.catalina.connector.http.HttpProcessor.process(HttpProcessor.java(Compiled Code))
         at org.apache.catalina.connector.http.HttpProcessor.run(HttpProcessor.java:1107)
         at java.lang.Thread.run(Thread.java:498)
    2006-01-04 10:59:00 ErrorDispatcherValve[localhost]: Exception Processing ErrorPage[exceptionType=java.lang.Exception, location=/error]
    java.lang.IllegalStateException
         at java.lang.RuntimeException.<init>(RuntimeException.java:39)
         at java.lang.IllegalStateException.<init>(IllegalStateException.java:36)
         at org.apache.catalina.connector.ResponseFacade.reset(ResponseFacade.java:243)
         at org.apache.catalina.valves.ErrorDispatcherValve.custom(ErrorDispatcherValve.java:384)
         at org.apache.catalina.valves.ErrorDispatcherValve.throwable(ErrorDispatcherValve.java:250)
         at org.apache.catalina.valves.ErrorDispatcherValve.invoke(ErrorDispatcherValve.java:178)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:564)
         at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:170)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:564)
         at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:468)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:564)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:472)
         at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:943)
         at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:174)
         at org.apache.catalina.core.StandardPipeline.invokeNext(StandardPipeline.java:566)
         at org.apache.catalina.core.StandardPipeline.invoke(StandardPipeline.java:472)
         at org.apache.catalina.core.ContainerBase.invoke(ContainerBase.java:943)
         at org.apache.catalina.connector.http.HttpProcessor.process(HttpProcessor.java(Compiled Code))
         at org.apache.catalina.connector.http.HttpProcessor.run(HttpProcessor.java:1107)
         at java.lang.Thread.run(Thread.java:498)
    What I don't get is that in the entire stack trace I am not able to locate which of my application files is causing the errors.
    I searched on the net and found a few root causes, but I am not able to find out exactly which class file is causing the stack trace.
    Any suggestions are most welcome.
    Thanks in advance.

    Did you do something strange like writing the object out using the servlet response's output stream and then attempting to redirect or forward the user to another page? That is usually how the IllegalStateException gets generated. You would still see a valid response from the caller's perspective, but since you attempted to forward or redirect after data had already been written to the stream on the server, an exception is thrown there.
    - Saish
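    If that is what is happening, the pattern to look for in your own code is roughly the one sketched below; the servlet and JSP names are made up for illustration. The rule of thumb is to either write the response body yourself or forward/redirect, never both, and never after the response has been committed.
    import java.io.IOException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    // Hypothetical servlet illustrating the problematic write-then-forward pattern.
    public class ReportServlet extends HttpServlet {
        protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                throws ServletException, IOException {

            // Problematic: data is written (and flushed) to the response stream...
            resp.getOutputStream().write("partial output".getBytes());
            resp.flushBuffer();   // the response is now committed

            // ...and then a forward is attempted on the already-committed response.
            // This is what typically produces the IllegalStateException; if the client
            // has also dropped the connection, the broken-pipe IOException shows up too.
            req.getRequestDispatcher("/result.jsp").forward(req, resp);
        }
    }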

  • There is no process to read data written to a pipe

    Hi
    I have an application server and a database server. I have a process that is running on the application server - that is processing records in a huge file - 2.41 GB. This file contains 20 million records. The process reads and processes one record at a time. For each record, the process reads the record, goes to the database to search for a match, does not find it - since on purpose we have created the test bed to not match, and then it updates the bookmark in one of the database tables and moves on the next record.
    The process runs fine for 86,399 seconds and then throws 'java.sql.SQLException: Io exception: There is no process to read data written to a pipe'.
    Note that in those 86,399 seconds it processes 11,934,696 records.

    > It is important to understand why I have to create the connection. If the connection has already been established and is in use, why, after processing nearly 11.9 million records, do I need to close the connection and create a new one?
    It might be important to you, but it isn't important to me. When I work, I work towards a solution and not understanding. So if closing the connection and re-opening it every 100,000 records solves it, then that is what I would do. I would do it knowing that something in the driver or the database is preventing me from solving it without doing that, and also knowing that I am not going to be able to fix either the driver or the database and don't have time to wait for those fixes.
    Not to mention that if it was taking an entire day to run anyway, then I would consider closing and opening a connection a very minor part of what would be a much bigger problem.
    > In this particular case my Java program is reading a record from a file and extracting certain fields from that record. Based on those fields it is looking for a match in the database. It is the same code, with application and business logic that takes a different path if a match is found in the database.
    ...and again I can only wonder how much faster it would be if you did it in the database. I would suspect at least an order of magnitude.
    > I guess the issue here is not Java vs. the database. The issue is what causes this exception in my test scenario.
    So you are looking for a reason and not a solution? Then it should be simple enough to create a small test app, reproduce the bug, and file a bug report with the vendor. Then wait for between 6 weeks and 2 years for an answer. Maybe. Because they might never respond.
    Or disassemble the driver and debug it. And since I consider it unlikely that the problem is in the Java code, you will also need a CPU opcode reference and a debugger that can debug the database itself. Then just start running it and step through the instructions to figure it out.
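    For what it's worth, 86,399 seconds is within a second of 24 hours, which hints at some daily connection limit or timeout rather than anything record-count related. A minimal sketch of the periodic-reconnect workaround discussed above might look like this; the class name, connection details, and the 100,000-record interval are placeholders, and openConnection() stands in for however the application really obtains its connections.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.SQLException;
    import java.util.List;

    public class BookmarkBatchJob {

        // Placeholder connection details -- substitute the real URL and credentials.
        private static Connection openConnection() throws SQLException {
            return DriverManager.getConnection("jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "pwd");
        }

        public static void process(List<String> records) throws SQLException {
            final int RECYCLE_EVERY = 100000;   // interval suggested in the reply; tune as needed
            Connection con = openConnection();
            try {
                int count = 0;
                for (String record : records) {
                    // ... look up the record and update the bookmark table here ...

                    if (++count % RECYCLE_EVERY == 0) {
                        // Close and reopen the connection periodically so that no single
                        // long-lived connection ever hits whatever limit is killing it.
                        con.close();
                        con = openConnection();
                    }
                }
            } finally {
                con.close();
            }
        }
    }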

  • Io exception: There is no process to read data written to a pipe.

    Hi there,
    I get the following exception when I try to open a connection from an AIX machine to an Oracle database.
    Error: Io exception: There is no process to read data written to a pipe.
    java.sql.SQLException: Io exception: There is no process to read data written to a pipe.
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:184)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:226)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:339)
    at oracle.jdbc.driver.OracleConnection.<init>(OracleConnection.java:406)
    at oracle.jdbc.driver.OracleDriver.getConnectionInstance(OracleDriver.java:457)
    at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:332)
    at java.sql.DriverManager.getConnection(DriverManager.java:559)
    at java.sql.DriverManager.getConnection(DriverManager.java:189)
    The connection code looks like:
    Class.forName("oracle.jdbc.driver.OracleDriver");
    DriverManager.registerDriver (new OracleDriver());
    String url = "jdbc:oracle:thin:@"+dbserver+":"+dbPort+":"+db;
    Connection con = DriverManager.getConnection(url, user, pwd);
    Anyone got the same/similar problems and found a solution?
    Regards
    mark

    Hi,
    I got the same error, and the problem was in the resolution of host names by DNS.
    We changed the configuration in netsvc.conf so that it checks the /etc/hosts file, and this fixed the problem.
    See more details here:
    http://www.regatta.cmc.msu.ru/doc/usr/share/man/info/ru_RU/a_doc_lib/files/aixfiles/netsvc.conf.htm
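    One way to check whether name resolution is the culprit before touching netsvc.conf is to time a lookup of the database host from the same machine and JVM; a quick sketch (the host name below is a placeholder):
    import java.net.InetAddress;

    public class ResolveCheck {
        public static void main(String[] args) throws Exception {
            // Placeholder host name -- pass the real database host as an argument.
            String dbHost = (args.length > 0) ? args[0] : "dbserver.example.com";
            long start = System.currentTimeMillis();
            InetAddress addr = InetAddress.getByName(dbHost);
            long millis = System.currentTimeMillis() - start;
            System.out.println(dbHost + " -> " + addr.getHostAddress() + " in " + millis + " ms");
            // A lookup that takes many seconds (or throws UnknownHostException) points
            // at the DNS/netsvc.conf configuration rather than the JDBC driver itself.
        }
    }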

  • Servlet request terminated with IOException:java.io.IOException: There is no process to read data written to a pipe.

    Hi,
    I am getting the following error. Could anyone please throw some light on it?
    Thanks
    Nilesh
    <HTTP> Servlet request terminated with IOException:
    java.io.IOException: There is no process to read data written to a pipe.
         at java.net.SocketOutputStream.socketWrite(Native Method)
         at java.net.SocketOutputStream.write(SocketOutputStream.java(Compiled Code))
         at weblogic.servlet.internal.ChunkUtils.writeChunks(ChunkUtils.java(Compiled Code))
         at weblogic.servlet.internal.ResponseHeaders.writeHeaders(ResponseHeaders.java(Compiled Code))
         at weblogic.servlet.internal.ServletResponseImpl.writeHeaders(ServletResponseImpl.java(Compiled Code))
         at weblogic.servlet.internal.ServletOutputStreamImpl.flush(ServletOutputStreamImpl.java(Compiled Code))
         at weblogic.servlet.internal.ServletOutputStreamImpl.finish(ServletOutputStreamImpl.java(Compiled Code))
         at weblogic.servlet.internal.ServletContextManager.invokeServlet(ServletContextManager.java(Compiled Code))
         at weblogic.socket.MuxableSocketHTTP.invokeServlet(MuxableSocketHTTP.java(Compiled Code))
         at weblogic.socket.MuxableSocketHTTP.execute(MuxableSocketHTTP.java(Compiled Code))
         at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:129)

    I forgot to mention.
    I am using Weblogic 5.1 with SP 9
    Nilesh
    "Nilesh Shah" <[email protected]> wrote:
    >
    Hi,
    I am getting this following error. Could anyone please throw some light.
    Thanks
    Nilesh
    <HTTP> Servlet request terminated with IOException:
    java.io.IOException: There is no process to read data written to a pipe.
         at java.net.SocketOutputStream.socketWrite(Native Method)
         at java.net.SocketOutputStream.write(SocketOutputStream.java(Compiled
    Code))
         at weblogic.servlet.internal.ChunkUtils.writeChunks(ChunkUtils.java(Compiled
    Code))
         at weblogic.servlet.internal.ResponseHeaders.writeHeaders(ResponseHeaders.java(Compiled
    Code))
         at weblogic.servlet.internal.ServletResponseImpl.writeHeaders(ServletResponseImpl.java(Compiled
    Code))
         at weblogic.servlet.internal.ServletOutputStreamImpl.flush(ServletOutputStreamImpl.java(Compiled
    Code))
         at weblogic.servlet.internal.ServletOutputStreamImpl.finish(ServletOutputStreamImpl.java(Compiled
    Code))
         at weblogic.servlet.internal.ServletContextManager.invokeServlet(ServletContextManager.java(Compiled
    Code))
         at weblogic.socket.MuxableSocketHTTP.invokeServlet(MuxableSocketHTTP.java(Compiled
    Code))
         at weblogic.socket.MuxableSocketHTTP.execute(MuxableSocketHTTP.java(Compiled
    Code))
         at weblogic.kernel.ExecuteThread.run(ExecuteThread.java:129)

  • XI SOAP SocketException:There is no process to read data written to a pipe.

    Hi guys!
    I get this error and have no idea where it comes from.
    It occurs in the scenario SAP R/3 -> Integration Process -> SAP R/3, where in the integration process I map the R/3 message to a SOAP call that synchronously calls an external web service, map the SOAP response to the target SAP message, and send it into R/3.
    The message is sent from R/3 correctly and received by the IP correctly; the problem occurs in the SOAP communication.
    Log from messaging system - received messages - sync:
    2007-07-12 15:29:36 Success Message successfully received by messaging system. Profile: XI URL: http://xxxx:50200/MessagingSystem/receive/AFW/XI Credential (User): PIISUSER
    2007-07-12 15:29:36 Success Using connection SOAP_http://sap.com/xi/XI/System. Trying to put the message into the request queue.
    2007-07-12 15:29:36 Success Message successfully put into the queue.
    2007-07-12 15:29:36 Success The message was successfully retrieved from the request queue.
    2007-07-12 15:29:36 Success The message status set to DLNG.
    2007-07-12 15:29:36 Success Delivering to channel: CC_IS2SOAP
    2007-07-12 15:29:36 Success SOAP: request message entering the adapter
    2007-07-12 15:29:37 Error SOAP: call failed
    2007-07-12 15:29:37 Error SOAP: error occured: com.sap.aii.af.ra.ms.api.RecoverableException: java.net.SocketException: There is no process to read data written to a pipe.
    2007-07-12 15:29:37 Error Exception caught by adapter framework: java.net.SocketException: There is no process to read data written to a pipe.
    2007-07-12 15:29:37 Error Delivery of the message to the application using connection SOAP_http://sap.com/xi/XI/System failed, due to: com.sap.aii.af.ra.ms.api.RecoverableException: java.net.SocketException: There is no process to read data written to a pipe.. Setting message to status failed.
    2007-07-12 15:29:37 Error The message status set to FAIL.
    2007-07-12 15:29:37 Error Returning synchronous error message to calling application: com.sap.aii.af.ra.ms.api.RecoverableException: java.net.SocketException: There is no process to read data written to a pipe..
    Any idea what could be wrong? Thanks a lot!
    Olian

    Olian,
    You can also do the same process as mentioned in the weblog:
    /people/bhavesh.kantilal/blog/2006/11/20/webservice-calls-from-a-user-defined-function
    Regards,
    ---Satish

  • Java.sql.SQLException : There is no process to read data written to a pipe

    I have an IDoc being sent from SAP R/3 to Oracle using JDBC Adapter
    I get the following error message:
    Last message processing started 02:03:43 2004-12-14, Error: Transform error in xml processor class, rollback:
    ERROR:Processing request: Error when executing statement for table/stored proc. 'MATERIAL_MASTER': java.sql.SQLException: Io exception: There is no process to read data written to a pipe.
    Did anybody encounter this exception before? Any solutions?
    Thanks in advance.
    Anand

    Hi Friends,
    This problem was fixed when I increased the "number of retries" in the JDBC adapter configuration.
    Thanks
    Anand

  • Professional photo program that transfers data written on photos to PC

    I have a Mac running OS X 10.5.8 and I am looking for a professional photo program or application with which I can write information or data on photos, or on the side of the photos, and have all of that data transfer to other programs, including PC-based programs. I need to be able to share photos with data written on them with colleagues who run PCs.
    Please give me some ideas as to what program to use. I already own iPhoto and Aperture, and the data does not transfer.
    Please help me.
    Thanks, Eric

    Have you tried, after editing the photo and while it is in full view, taking a screenshot? The saved screenshot should appear as a new photo.

  • Data Retrieval Speed in Oracle Spatial vs. ESRI ArcSDE

    I would appreciate any opinions regarding data retrieval performance between Oracle Spatial and ESRI ArcSDE. Would an end-user (using ESRI software) experience significant differences in data retrieval speed depending on how the data were stored in Oracle (MDSYS.SDO_GEOMETRY versus ESRI Binary/BLOB formats)?
    Knowing that the ESRI binary formats are tailored to their software front-end apps (ArcGIS, ArcMap, ArcCatalog, and ArcInfo), wouldn't this be a "non-issue" until the spatial dataset gets "large", and even then, wouldn't performance be (almost) equal if the spatial indexes were created properly?
    Thanks for your inputs,
    Bruce

    John,
    You can't do that type of query in SQL from SQL*Plus using SDEBINARY. However, you can perform spatial queries in ArcMap if you are using SDEBINARY. You can use the query builder to perform point-in-polygon type queries.
    Hope that helps.
    For my two cents, I think SDO_GEOMETRY gives you a more robust database to work with, because you have the added power of Oracle Spatial functions. If you are using SDEBINARY you are limited to only what you can do through ArcGIS.
    If you are concerned more about performance than accessibility, especially with a large number of users, then SDEBINARY might be the better choice.
    I love Oracle Spatial and am hoping that the performance issue will not be a serious one when we start putting ArcIMS-developed apps into production.
    Dave
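    For context, the kind of point-in-polygon query that SDO_GEOMETRY opens up outside of ArcGIS looks roughly like the sketch below from JDBC. The table name, column name, coordinates, and connection details are invented for illustration; the SRID is left NULL, which only matches geometries stored without an SRID, and a spatial index on the geometry column is assumed.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class PointInPolygonQuery {
        public static void main(String[] args) throws Exception {
            // Needed for older Oracle JDBC drivers; harmless with newer ones.
            Class.forName("oracle.jdbc.driver.OracleDriver");

            // Placeholder connection details and schema -- adjust to your environment.
            Connection con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@dbhost:1521:ORCL", "user", "pwd");

            // Which parcels contain the given point? (SDO_RELATE with a CONTAINS mask.)
            String sql =
                "SELECT parcel_id FROM parcels p " +
                "WHERE SDO_RELATE(p.geom, " +
                "      MDSYS.SDO_GEOMETRY(2001, NULL, MDSYS.SDO_POINT_TYPE(?, ?, NULL), NULL, NULL), " +
                "      'mask=CONTAINS') = 'TRUE'";

            PreparedStatement ps = con.prepareStatement(sql);
            ps.setDouble(1, -122.45);   // example X / longitude
            ps.setDouble(2, 37.77);     // example Y / latitude
            ResultSet rs = ps.executeQuery();
            while (rs.next()) {
                System.out.println("Containing parcel: " + rs.getString(1));
            }
            rs.close();
            ps.close();
            con.close();
        }
    }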

  • Data Load Speed

    Hi all.
    We are starting the implementation of SAP at the company where I work, and I have been designated to prepare the data load from the legacy systems. I have already asked our consultants about the data load speed, but they didn't really answer what I need.
    Does anyone have statistics on the data load speed using tools like LSMW, CATT, eCATT, etc., per hour?
    I know that the speed depends on what data I'm loading and also on the CPU speed, but any information is good to me.
    Thank you and best regards.

    Hi Friedel,
    Again, here are the complete details regarding the data transfer techniques.
    Call Transaction:
    1. Synchronous processing
    2. Synchronous and asynchronous database updates
    3. Transfer of data for an individual transaction each time the CALL TRANSACTION statement is executed
    4. No batch input log gets generated
    5. No automatic error handling
    Session Method:
    1. Asynchronous processing
    2. Synchronous database updates
    3. Transfer of data for multiple transactions
    4. Batch input log gets generated
    5. Automatic error handling
    6. SAP's standard approach
    Direct Input Method:
    1. Best suited for transferring large amounts of data
    2. No screens are processed
    3. The database is updated directly using standard function modules, e.g. check the program RFBIBL00
    LSMW:
    1. A code-free tool which helps you to transfer data into SAP
    2. Suited for one-time transfers only
    CALL DIALOG:
    This approach is outdated; you should choose one of the above techniques.
    Also check the knowledge pool for more reference:
    http://help.sap.com
    Cheers,
    Abdul Hakim
