Posting a large amount of data via a sync web service

Hi,
I'm trying to post large data via the sync SOAP adapter, with an ABAP proxy as receiver. I get "Log version" status in SXMB_MONI whenever I try to post data >5k. Any idea how to trace it? If it is a timeout, shouldn't there be an error message in SXMB_MONI instead of a log version? Thanks.

Hi,
This message occurs when the process on the receiving system is somehow locked and does not return a response.
Since your receiving system is an ABAP system, go to SM66 and SM12 on that system. In SM66 look for transactions that have been running for a very long time, and in SM12 look for locks that interfere with the processing of the message on the ABAP system.
It is also possible that the receiving system was locked at the time the request was sent, and therefore no response was received.
Also check this blog:
/people/michal.krawczyk2/blog/2005/05/10/xi-i-cannot-see-some-of-my-messages-in-the-sxmbmoni
Hope it helps.
Regards,
Rohit

Similar Messages

  • Connecting to Data via a web service:  WSDL vs HTTP REST

    I created an app and was easily able to get my data from my web service by connecting using the WSDL file. I then added a RESTful interface to my web service and tested it via SoapUI. Then I created a copy of my original app, deleted the link to my service, and added it again, this time using the HTTP connection. I was able to connect, and my value objects were created via the XML parser. However, for some reason, my date fields are not working: in the debugger they show as invalid. I checked the XML for both the WSDL version and the HTTP RESTful version and they look the same. So why all of a sudden do my dates not work?
    SOAPui from web service
    <endDate>2011-05-30T09:53:00-06:00</endDate>
    RESTful address
    <endDate>2011-05-30T09:53:00-06:00</endDate>
    RESTful Flash
    endDate:Date
    WSDL Flash
    endDate:Date

    DS Version: 14.2.1.224 (14 sp1)
    Job Server 14.2.2.5xx (14 sp2)
    Configuration on the Web Service:
    URL:
    https://maxcvservices.dnb.com/Company/V4.0?wsdl
    https://maxcvservices.dnb.com/FirmographicsProduct/V3.2?wsdl
    WSS User: provided the name
    WSS Pass: provided the pass
    Timeout set to 500
    All other settings are blank.
    Does something need to be defined on the job server?
    The WSDL does not even open on the workstation unless I use an adapter. I had found an article/help/note (something) that the web service in SAP DS does not work with HTTPS and it does with an HTTP Adapter.
    I do not have access to Job Server, will have to schedule some changes if there is config that has to happen there.
    Thanks for your input.
    Cheers,
    Ryan

  • Receiving very large amounts of data via ibrd in Visual Basic

    I am trying to read anything from a single byte up to 2Mb of data back into a Visual Basic program. Until now I have been using something like
    Dim Response as string * 60000
    ibrda hDevice, Response
    which works fine up to the magic 65536ish string size limit, but I want more!
    Ideally reading into a very large byte array would suffice, e.g.
    Dim ByteBuffer(0 To 2000000 - 1) As Byte
    ibrd hDevice, ByteBuffer
    But I cannot find a way to get the gpib-32.dll to accept the ByteBuffer as a destination in Visual Basic, even though ibrd32 is declared in vbib-32.bas as accepting Any type as a destination, and a Long as a count, implying that the 65536 limit doesn't apply for that call.
    It may be possible to use repeated ibrd calls until one such call fails to fill the buffer, concatenating the results each time, but this seems a crude solution when the DLL would appear to be able to do the job in one go.
    Using ibrdf may work, but would rather not use a file as intermediate storage.
    TIA

    I'm wondering if that 65536 is a VB cap for the String type. Referring to the language interface:
    Declare Function ibrd32 Lib "Gpib-32.dll" Alias "ibrd" (ByVal ud As Long, sstr As Any, ByVal cnt As Long) As Long

    Sub ibrd(ByVal ud As Integer, buf As String) <---
        Dim cnt As Long
        cnt = CLng(Len(buf))
        Call ibrd32(ud, ByVal buf, cnt)
    End Sub
    Isn't buf a String type?

  • Streaming large amounts of data of socket causes corruption?

    I'm writing an app to transfer large amounts of data via a simple client/server architecture between two machines.
    Problem: if I send the data too 'fast', the data arrives corrupted:
    - Calls to read() return wrong data (wrong 'crc').
    - Subsequent calls to read() do not return -1 but allow me to read e.g. another 60 or 80 KB.
    - available() always returns '0'; but I'll get rid of that method anyway (as recommended in other forum entries).
    The behaviour is somewhat difficult to reproduce, but it fails reliably for me when transferring the data between two separate machines and when setting the number of packets (Sender.TM) to 1000 or larger.
    Workaround: reduce the number of packets sent to e.g. 1, or introduce the 'sleep' on the sender side. Another workaround: changing to java.nio.* alone did not help, but when I got rid of the Streams and used solely ByteBuffers, the problem disappeared. Unfortunately the Streams are required by other parts of my application.
    I'm running the code on two dual-CPU machines connected via
    Below are the code of the Sender and the Listener. Please excuse the style as this is only to demonstrate the problem.
    import java.io.IOException;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.nio.channels.Channels;
    import java.nio.channels.SocketChannel;
    import java.util.Arrays;

    public class SenderBugStreams {
        static final int TM = 10000;      // number of packets
        static final int TM_SIZE = 1000;  // bytes per packet
        static final int CRC = 2;         // trailing marker byte
        static int k = 5;

        public static void main(String[] args) throws IOException {
            InetSocketAddress targetAdr = new InetSocketAddress(args[0], ListenerBugStreams.DEFAULT_PORT);
            System.out.println("connecting to: " + targetAdr);
            SocketChannel socket = SocketChannel.open(targetAdr);
            sendData(socket);
            socket.close();
            System.out.println("Finished.");
        }

        private static void sendData(SocketChannel socket) throws IOException {
            OutputStream out = Channels.newOutputStream(socket);
            byte[] ba = new byte[TM_SIZE];
            Arrays.fill(ba, (byte) (k++ % 127));
            System.out.println("Sending..." + k);
            for (int i = 0; i < TM; i++) {
                out.write(ba);
                // Workaround mentioned above: slowing the sender down masks the problem.
                // try { Thread.sleep(10); } catch (InterruptedException e) { throw new RuntimeException(e); }
            }
            out.write(CRC);
            out.flush();
            out.close();
        }
    }

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.InetSocketAddress;
    import java.nio.channels.Channels;
    import java.nio.channels.ServerSocketChannel;
    import java.nio.channels.SocketChannel;

    public class ListenerBugStreams {
        static int DEFAULT_PORT = 44521;

        public static void main(String[] args) throws IOException {
            ServerSocketChannel serverChannel = ServerSocketChannel.open();
            serverChannel.socket().bind(new InetSocketAddress(DEFAULT_PORT));
            System.out.print("Waiting...");
            SocketChannel clientSocket = serverChannel.accept();
            System.out.println(" starting, IP=" + clientSocket.socket().getInetAddress() +
                ", Port=" + clientSocket.socket().getLocalPort());
            // read data from socket
            readData(clientSocket);
            clientSocket.close();
            serverChannel.close();
            System.out.println("Closed.");
        }

        private static void readData(SocketChannel clientSocket) throws IOException {
            InputStream in = Channels.newInputStream(clientSocket);
            // read and ingest packets
            byte[] ba = null;
            for (int i = 0; i < SenderBugStreams.TM; i++) {
                ba = new byte[SenderBugStreams.TM_SIZE];
                in.read(ba);  // this line is the bug -- see the answer that follows
                System.out.print("*");
            }
            // verify checksum
            int crcIn = in.read();
            if (SenderBugStreams.CRC != crcIn) {
                System.out.println("ERROR: Invalid checksum: " + SenderBugStreams.CRC + "/" + crcIn);
            }
            System.out.println(ba[0]);
            int x = in.read();
            int remaining = 0;
            while (x != -1) {
                remaining++;
                x = in.read();
            }
            System.out.println("Remaining:" + in.available() + "/" + remaining);
            System.out.println(" " + SenderBugStreams.TM + " objects ingested.");
            in.close();
        }
    }

    Here is your trouble:
    in.read(ba);
    read(byte[]) does not read N bytes, it reads *up to* N bytes. If one byte has arrived, then it reads and returns that one byte. You always need to check the return value of read(byte[]) to see how much you got (and also check for EOF). TCP chops up the written data into whatever packets it feels like, and that makes the amount read(byte[]) returns pretty random.
    You can use DataInputStream which has a readFully() method; it loops calling read() until it gets the full buffer's worth. Or you can write a little static utility readFully() like so:
        // Returns false if it hits EOF immediately. Otherwise reads the full buffer's
        // worth. If it encounters EOF mid-buffer, it throws an EOFException.
        public static boolean readFully(InputStream in, byte[] buf)
                throws IOException {
            return readFully(in, buf, 0, buf.length);
        }

        public static boolean readFully(InputStream in, byte[] buf, int pos, int len)
                throws IOException {
            int got_total = 0;
            while (got_total < len) {
                int got = in.read(buf, pos + got_total, len - got_total);
                if (got == -1) {
                    if (got_total == 0)
                        return false;
                    throw new EOFException("readFully: end of file; expected " +
                                           len + " bytes, got only " + got_total);
                }
                got_total += got;
            }
            return true;
        }
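    To see the difference in behaviour, here is a small self-contained sketch. The class and method names are illustrative; the chunked stream simulates TCP handing back partial reads, while the readFully helper uses the same looping logic as the utility above:

```java
import java.io.ByteArrayInputStream;
import java.io.EOFException;
import java.io.IOException;
import java.io.InputStream;

public class ReadFullyDemo {
    // Same looping logic as the readFully utility above.
    public static boolean readFully(InputStream in, byte[] buf) throws IOException {
        int gotTotal = 0;
        while (gotTotal < buf.length) {
            int got = in.read(buf, gotTotal, buf.length - gotTotal);
            if (got == -1) {
                if (gotTotal == 0) return false;
                throw new EOFException("expected " + buf.length + " bytes, got " + gotTotal);
            }
            gotTotal += got;
        }
        return true;
    }

    // Simulates TCP delivering data in small pieces: at most 3 bytes per read().
    static class ChunkedStream extends ByteArrayInputStream {
        ChunkedStream(byte[] data) { super(data); }
        @Override
        public synchronized int read(byte[] b, int off, int len) {
            return super.read(b, off, Math.min(len, 3));
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[10];
        for (int i = 0; i < data.length; i++) data[i] = (byte) i;

        byte[] buf = new byte[10];
        int got = new ChunkedStream(data).read(buf);        // naive single read
        System.out.println("naive read got " + got);        // only 3 of 10 bytes

        byte[] buf2 = new byte[10];
        boolean ok = readFully(new ChunkedStream(data), buf2);  // loops until full
        System.out.println("readFully ok=" + ok + " last=" + buf2[9]);
    }
}
```

    A naive read() returns after the first chunk; readFully keeps looping until the buffer is actually full, which is exactly what the listener above needs.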

  • Is there any way to connect a Time Capsule to a MacBook Pro directly via USB? I have a large amount of data that I want to back up and it is taking a very long time (35 GB is taking 3 hrs; I have 2 TB of files in total).

    I want to use the Time Capsule as backup for an archive which is currently stored on a 2 TB WESC HD.

    No, you cannot back up via a direct USB connection.
    But gigabit Ethernet is much faster anyway. Are you connected directly by Ethernet?
    Is the drive you are backing up from plugged into the TC? That will slow it down something chronic; plug that drive in by its fastest connection method. WESC, sorry, I have no idea. If Ethernet, use that; otherwise USB direct to the computer. Always think about which way the files come and go: since you are copying from the computer, everything has to go that way, and it makes things slower if they go over the same cable, if you catch the drift.

  • MessageExpired because of large amount of data

    Hi Experts, i need some advice for my following scenario.
    I have 50,000 records that need to be inserted into SAP R/3 HR. I am using a BPM that calls the RFC with a sync message to get the response from the RFC. However, I think that because of the large amount of data, I get the error "MessageExpired" in my monitoring.
    After reading some documentation, I found that a sync message has a timeout period (xiadapter.inbound.timeout.default = 180000 [ms]). My question is: if 180000 ms is not enough for the RFC to process all 50,000 records, I can increase the timeout, right? But then, what maximum value would be most appropriate so that it does not affect the performance of XI? Does increasing the timeout value affect anything at all?
    Need the advice and inputs from you experts...Thank you so much in advance...

    Made,
    I posted this answer to a similar request a couple of weeks ago
    I had a similar issue some time back and used the following three parameters
    1. Visual Administrator
    SAP XI Adapter: RFC parameter syncMessageDeliveryTimeoutMsec
    This parameter is explained in SAP Note 730870 Q14 and SAP Note 791379
    2. Alert Configuration
    SA_COMM parameter CHECK_FOR_ASYNC_RESPONSE_TIMEOUT
    This parameter specifies the maximum time in seconds that can pass between the arrival of the synchronous request message on the Integration Server and the asynchronous response message. If this time period is exceeded, a system error is returned to the caller.
    3. Integration Process TimeoutControlStep
    My design is such that timeout parameter 3, set in the Integration Process, is triggered first. This gives me control to handle the timeout via an SMTP email before either of the other timeouts takes effect.
    Regards,
    Mike

  • How do I pause an iCloud restore for app with large amounts of data?

    I am using an iPhone app which holds 10 GB of data (media files).
    Unfortunately, although all data was backed up, my iPhone 4 was faulty and needed to be replaced with a new handset. On restore, the 10 GB of data takes a very long time to restore over wi-fi. If the restore is interrupted (I reached the halfway point during the night) because I go to work or take the dog for a walk, I end up, of course, on 3G for a short period of time.
    Next time I am in a wi-fi zone, the app starts restoring again right from the beginning.
    How does anyone restore an app with large amounts of data, or pause a restore?

    You can use classifications but there is no auto feature to archive like that on web apps.
    In terms of the blog, Like I have said to everyone that has posted about blog preview images:
    http://www.prettypollution.com.au/business-catalyst-blog
    Just one example of an image at the start of the blog post rendering out, not hard at all.

  • With journaling, I have found that my computer is saving a large amount of data, logs of all the changes I make to files; how can I clean up these logs?

    With journaling, I have found that my computer is saving a large amount of data, logs of all the changes I make to files; how can I clean up these logs?
    For example, in Notes, I have written three notes; however if I click on 'All On My Mac' on the side bar, I see about 10 different versions of each note I make, it saves a version every time I add or delete a sentence.
    I also noticed, that when I write an email, Mail saves about 10 or more draft versions before the final is sent.
    I understand that all this journaling provides a level of security, and prevents data lost; but I was wondering, is there a function to clean up journal logs once in a while?
    Thanks
    Roz

    Are you using Microsoft Word? Microsoft thinks the users are idiots. They put up a lot of pointless messages that annoy & worry users. I have seen this message from Microsoft Word. It's annoying.
    As BDaqua points out...
    When you copy information via Edit > Copy (command + C) or Edit > Cut (command + X), you place the information on the clipboard. When you paste information, Edit > Paste (command + V), you copy information from the clipboard to your data file.
    If you Edit > Cut (command + X), do not paste the information, and then quit Word, you could be losing information. Microsoft is very worried about this: when you quit Word, it checks whether there is information on the clipboard and, if so, puts out this message.
    You should be saving your work more than once a day. I'd save every 5 minutes. Command + S does a save.
    Robert

  • Azure Cloud service fails when sent large amount of data

    This is the error;
    Exception in AZURE Call: An error occurred while receiving the HTTP response to http://xxxx.cloudapp.net/Service1.svc. This could be due to the service endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being
    aborted by the server (possibly due to the service shutting down). See server logs for more details.
    Calls with smaller amounts of data work fine. Large amounts of data cause this error.
    How can I fix this??

    Go to the web.config file, look for the <binding> that is being used for your service, and adjust the various parameters that limit the maximum length of the messages, such as
    maxReceivedMessageSize.
    http://msdn.microsoft.com/en-us/library/system.servicemodel.basichttpbinding.maxreceivedmessagesize(v=vs.100).aspx
    Make sure that you specify a size that is large enough to accommodate the amount of data that you are sending (the default is 64 KB).
    Note that even if you set a very large value here, you won't be able to go beyond the maximum request length that is configured in IIS. If I recall correctly, the default limit in IIS is 8 megabytes.
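    As a sketch, the relevant web.config section looks roughly like this (the binding name "LargeMessageBinding" and the 10 MB values are placeholders, not recommendations; the binding must be the one your endpoint actually references):

```xml
<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <!-- "LargeMessageBinding" is a placeholder name -->
      <binding name="LargeMessageBinding"
               maxReceivedMessageSize="10485760"
               maxBufferSize="10485760">
        <readerQuotas maxArrayLength="10485760"
                      maxStringContentLength="10485760" />
      </binding>
    </basicHttpBinding>
  </bindings>
</system.serviceModel>
```

    If you then hit the ASP.NET request limit, the `<httpRuntime maxRequestLength="..."/>` setting (specified in KB) may also need raising.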

  • How can I edit large amount of data using Acrobat X Pro

    Hello all,
    I need to edit a catalog that contains a large amount of data, mainly the product prices. Currently I can only export the document into an Excel file and then paste the new prices onto the catalog one by one using Acrobat X Pro, which is extremely time-consuming. I am sure there's a better way to make this faster while keeping the data accurate. Thanks a lot in advance if anyone's able to help!

    Hi Chauhan,
    Yes, I am able to edit text/images via the toolbox, but the catalog contains more than 20,000 prices, and all I can do is delete the original price info from the catalog and replace it with the revised data from Excel. Repeating this process over 20,000 times would be a waste of time and manpower... Not sure if I've made my situation clear enough? Please just ask away, I really hope to sort it out. Thanks!

  • Uploading of large amount of data

    Hi all,
    I really hope you can help me. I have to upload quite a large amount of data from flat files to an ODS (via the PSA, of course), but the process takes a very long time. I load into the PSA and then packet by packet into the ODS. Loading about 1,300,000 lines from a flat file takes 6 or more hours, which seems strange to me. Is that normal or not? Or should I use another upload method or set up the ODS some other way? Thanks

    hi jj,
    welcome to the SDN!
    in my limited experience, 6 hours for 1.3M records is a bit too long. Here are some things you could try and look into:
    - Load from the application server, not from the client computer (meaning, move your file to the server where BW is running, to minimize network traffic).
    - Check your transfer rules and any customer exits related to loading, as the smallest performance-inefficient bits of code can cause a lot of problems.
    - Check the size of the data packets you're transmitting, as it could also cause problems, via tcode RSCUSTA2 (I think, but I'm not 100% sure).
    Hope this helps you out - please remember to give out points as a way of saying thanks to those that help you, okay? =)
    ryan.

  • Scrolling of large amount of data with Unscrolled HEADER .How to do???

    I really need to scroll a large amount of data without scrolling the header, i.e. while scrolling, only the data scrolls, so that the header stays visible. I am using JSP in the Struts framework. Waiting for your help, friends. Thanks in advance.

    1) Install Google at your machine.
    2) Open it in your favourite web browser.
    3) Note the input field and the buttons.
    4) Enter smart keywords like "scrollable", "table", "fixed" and "header" in that input field.
    5) Press the first button. You'll get a list of links.
    6) Follow the relevant links and read them thoroughly.
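    For what it's worth, the standard pattern those keywords lead to is a scrollable body under a fixed header. A minimal markup sketch (modern CSS; the class name and sizes are illustrative, and it works regardless of whether the rows are rendered by a JSP/Struts loop):

```html
<div class="grid" style="height: 300px; overflow-y: auto;">
  <table>
    <thead>
      <!-- position: sticky keeps the header row pinned while the body scrolls -->
      <tr><th style="position: sticky; top: 0; background: #eee;">Header</th></tr>
    </thead>
    <tbody>
      <!-- rows rendered by the server-side loop go here -->
      <tr><td>Row 1</td></tr>
      <tr><td>Row 2</td></tr>
    </tbody>
  </table>
</div>
```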

  • Sending large amounts of data spontaneously

    In my normal experience with the internet connection, the amount of data sent is about 50 to 80% of that received, but occasionally Firefox starts transmitting large amounts of data spontaneously; what it is, I don't know, and where it's going, I don't know. For example, today the AT&T status screen showed about 19 MB received and about 10 MB sent after about an hour of on-line time. A few minutes later, I looked down at the status screen and it showed 19.5 MB received and 133.9 MB sent, and the number was steadily increasing. Just before I went on line today, I ran a complete scan of the computer with McAfee and it reported nothing needing attention. I ran the scan because a similar effusion of spontaneously sent data had happened yesterday. When I noticed the data pouring out today, I closed Firefox and it stopped. When I opened Firefox right afterward, the transmission of data did not recommence. My first thought was that my computer had been captured by the bad guys and I was now a robot, but McAfee says not to worry. But should I worry anyway? What's going on, or, lacking a good answer to that, how can I find out what's going on? And how can I make it stop, unless I'm seeing some kind of maintenance operation that Mozilla or Microsoft is subjecting me to?

    Instead of using URLConnection, open a Socket to the server port (probably 80), send a POST HTTP request followed by the data; you may then (optionally) receive data from the server to check that the servlet is OK. This is the same protocol as URLConnection, but you have control over when the data is actually sent...
    Socket sock = new Socket(getHost(), 80);
    DataOutputStream dos = new DataOutputStream(sock.getOutputStream());
    dos.writeBytes("POST servletname HTTP/1.0\r\n");
    dos.writeBytes("Content-type: text/plain\r\n");  // optional, but good if you know it
    dos.writeBytes("Content-length: " + lengthOfData + "\r\n");  // again optional, but good if you can know it without caching the data first
    dos.writeBytes("\r\n");   // gotta have a blank line before the data
      // send data now
    DataInputStream dis = new DataInputStream(sock.getInputStream());  // optional, if you want to receive
      // receive any feedback from the servlet if you want
    dis.close();
    dos.close();
    sock.close();
    I'm guessing that URLConnection caches the data so it can fill in "Content-length".
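    The request framing from the snippet above can be assembled and inspected without opening a socket at all. A self-contained sketch (class name, "/servletname", and "example.host" are placeholders I introduced for illustration):

```java
import java.io.ByteArrayOutputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;

public class PostFraming {
    // Builds the raw bytes of a minimal HTTP POST request:
    // request line, headers, a blank line, then the body.
    public static byte[] buildPost(String path, String host, byte[] body) throws IOException {
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        DataOutputStream dos = new DataOutputStream(baos);
        dos.writeBytes("POST " + path + " HTTP/1.0\r\n");
        dos.writeBytes("Host: " + host + "\r\n");
        dos.writeBytes("Content-Type: text/plain\r\n");
        dos.writeBytes("Content-Length: " + body.length + "\r\n");
        dos.writeBytes("\r\n");   // blank line separates headers from body
        dos.write(body);
        dos.flush();
        return baos.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] req = buildPost("/servletname", "example.host",
                               "hello".getBytes(StandardCharsets.US_ASCII));
        System.out.println(new String(req, StandardCharsets.US_ASCII));
    }
}
```

    To actually send it, you would write these bytes to sock.getOutputStream() as in the snippet above; knowing the body length up front is what lets you emit Content-Length without caching the data.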

  • Pull large amounts of data using odata, client API and so takes a long time in project server 2013

    We are trying to pull large amounts of data in Project Server 2013 using both client API and OData calls, but it seems to take a long time. How is this done?
    In Project Server 2010 we did this by creating SQL views in the reporting database and, for lists, creating a view in the content database. Our IT dept is saying we can't do this anymore. How does a view in the Project database or content database create issues, as long as we don't add a field to a table? So how is one to do this without creating a view?

    Hello,
    If you are using Project Server 2013 on premises, I would recommend using T-SQL against the dbo schema in the Project Web Database for your reports; this will be far quicker than the APIs. You can create custom objects in the dbo schema, see the link below:
    https://msdn.microsoft.com/en-us/library/office/ee767687.aspx#pj15_Architecture_DAL
    It is not supported to query the SharePoint content database directly with T-SQL or add any custom objects to the content database.
    Paul

  • JSP and large amounts of data

    Hello fellow Java fans
    First, let me point out that I'm a big Java and Linux fan, but somehow I ended up working with .NET and Microsoft.
    Right now my software development team is working on a web tool for a very important microchip manufacturing company. This tool handles big amounts of data; some of our online reports generate more than 100,000 rows which need to be displayed on a web client such as Internet Explorer.
    We make use of Infragistics, which is a set of controls for .NET. Infragistics allows me to load data fetched from a database into a control they call UltraWebGrid.
    Our problem comes up when we load large amounts of data into the UltraWebGrid, sometimes 100,000+ rows; during this loading our IIS server's memory gets exhausted and it can take up to 5 minutes for the server to finish processing and display the 100,000+ row report. We have already proved that the database server (SQL Server) is not the problem; our problem is the IIS web server.
    Our team is now considering migrating this web tool to Java and JSP. Can you all help me with some links, information, or past experiences with loading and displaying large amounts of data like this in JSP? Help will be greatly appreciated.

    Who in the world actually looks at a 100,000-row report?
    Anyway, if I were you and had to do it because some clueless management person decided it was a good idea... I would write a program that, once a day, week, year, or whatever your time period is, produces the report (maybe as a PDF, though you could do it in HTML if you really must have it that way) and have it as a static file that you link to from your app.
    Then the user will just have to wait while it downloads, but the web server or web application server will not be bogged down trying to produce that monstrosity.
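    If the report must stay interactive rather than pre-generated, the usual compromise is server-side paging: fetch and render only one page of rows per request instead of all 100,000. A minimal sketch of the paging arithmetic (class and method names are illustrative; in practice the slicing would be pushed down into the SQL query):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class ReportPager {
    // Returns the slice of rows for a 1-based page number.
    public static <T> List<T> page(List<T> rows, int pageNo, int pageSize) {
        int from = Math.min((pageNo - 1) * pageSize, rows.size());
        int to = Math.min(from + pageSize, rows.size());
        return rows.subList(from, to);
    }

    // Ceiling division: how many pages are needed for totalRows rows.
    public static int pageCount(int totalRows, int pageSize) {
        return (totalRows + pageSize - 1) / pageSize;
    }

    public static void main(String[] args) {
        List<Integer> rows = IntStream.range(0, 100_000).boxed().collect(Collectors.toList());
        System.out.println(pageCount(rows.size(), 500));   // 200 pages
        System.out.println(page(rows, 3, 500).get(0));     // first row of page 3: 1000
    }
}
```

    Each request then renders at most pageSize rows, so the web server never materializes the full report in memory.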
