How to Send and Receive Large Amounts of Data to and from a Web Service

Hi All,
My requirement is: a .NET web service should receive a file, make some modifications to it, and return the modified file. I need to write a client (in Java) to invoke that web service. Please help with writing the Java client code for accessing that service, and suggest any changes required in the web service code as well.
My .NET web service web method code:
[WebMethod]
public byte[] ByteEcho(byte[] data) {
    // ... some modification code ...
    return data;
}

That will work fine for small files, but it can potentially cause OutOfMemoryErrors on the client and/or the server for large files. If you want to send/receive large files, you need to stream them.
Also, be aware that you cannot send a raw byte[] via XML; you will need to encode the data using some sort of binary-to-text encoding, such as Base64.
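For the Java side, here is a rough sketch of a hand-rolled SOAP 1.1 call to that ByteEcho method (assuming Java 8+). The endpoint URL, the default http://tempuri.org/ namespace, and the SOAPAction value are assumptions; check the service's WSDL for the real values. Note that this still buffers the whole file in memory and Base64-encodes it, so for genuinely large files a generated JAX-WS client with MTOM/streaming enabled is the better route.

// Rough sketch only: endpoint URL, namespace, and element names are assumptions
// (taken from the ASMX defaults); verify them against the service's WSDL.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Base64;

public class ByteEchoClient {
    public static void main(String[] args) throws Exception {
        // Read the whole file and Base64-encode it (SOAP cannot carry raw bytes).
        byte[] fileBytes = Files.readAllBytes(Paths.get("input.dat"));
        String base64Data = Base64.getEncoder().encodeToString(fileBytes);

        String envelope =
            "<?xml version=\"1.0\" encoding=\"utf-8\"?>" +
            "<soap:Envelope xmlns:soap=\"http://schemas.xmlsoap.org/soap/envelope/\">" +
            "<soap:Body><ByteEcho xmlns=\"http://tempuri.org/\">" +
            "<data>" + base64Data + "</data>" +
            "</ByteEcho></soap:Body></soap:Envelope>";

        URL url = new URL("http://yourserver/YourService.asmx"); // hypothetical endpoint
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        conn.setRequestProperty("SOAPAction", "http://tempuri.org/ByteEcho");

        try (OutputStream out = conn.getOutputStream()) {
            out.write(envelope.getBytes("UTF-8"));
        }

        // The response carries the modified bytes as Base64 text inside
        // <ByteEchoResult>; parse it out (ideally with an XML parser) and decode it.
        StringBuilder response = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                response.append(line);
            }
        }
        System.out.println(response);
    }
}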

Similar Messages

  • ERROR MESSAGE WHEN RETRIEVING AND DISPLAYING LARGE AMOUNTS OF DATA

    Hello,
    I am querying my database (MySQL) and displaying the data in a
    DataGrid (note that I am using Flex 2.0).
    It works fine when the amount of data populating the grid is
    small, but when I have a large amount of data I get the following
    error messages and the grid is not populated.
    ERROR 1
    faultCode:Server.Acknowledge.Failed
    faultString:'Didn't receive an acknowledge message'
    faultDetail: 'Was expecting
    mx.messaging.messages.AcknowledgeMessage, but receive Null'
    ERROR 2
    faultCode:Client.Error.DeliveryInDoubt
    faultString:'Channel disconnected'
    faultDetail: 'Channel disconnected before an acknowledge was
    received'
    Note that my DataGrid is populated when I run the query on my
    server, but it does not work on my client PCs.
    Your help would be greatly appreciated here.
    Awaiting a reply.
    Regards

    Hello,
    I am using Remote Object services,
    using a component (ColdFusion as the destination).

  • Error when exporting large amount of data to Excel from Apex4

    Hi,
    I'm trying to export over 30,000 lines of data from a report in Apex 4 to an Excel spreadsheet, this is not using a csv file.
    It appears to be working and then I get 'No Response from Application Web Server'. The report works fine when exporting smaller amounts of data.
    We have just upgraded the application to Apex 4 from Apex 3, where it worked without any problem.
    Has anyone else had this problem? We were wondering if it was a parameter in Apex4 that needs to be set.
    We are using Application Express 4.1.1.00.23 on Oracle 11g.
    Any help would be appreciated.
    Thanks
    Sue

    Hi,
    >
    I'm trying to export over 30,000 lines of data from a report in Apex 4 to an Excel spreadsheet, this is not using a csv file.
    >
    How? Application Builder > Data Workshop? Apex Page Process? (Packaged) procedure?
    >
    It appears to be working and then I get 'No Response from Application Web Server'. The report works fine when exporting smaller amounts of data.
    We have just upgraded the application to Apex 4 from Apex 3, where it worked without any problem.
    >
    Have you changed your webserver in the process? Say moved from OHS to ApexListener?
    >
    Has anyone else had this problem? We were wondering if it was a parameter in Apex4 that needs to be set.
    We are using Application Express 4.1.1.00.23 on Oracle 11g.
    Any help would be appreciated.

  • General question about storage, networking, and using large amounts of data

    I have been using a powermac tower G5 up until now. I bought my daughter a macbook pro 15" i7, talk about a zippy little machine. What a screamer. I can't believe how fast it is with final cut express, Aperture, and Photoshop. I tend to use higher end stuff, video, photos, etc. Hence I need not only a good fast processor, but the storage....need lots of hard disk space.
    oh, did I mention I bought this for my daughter who's about to go off to college? Heh, when I'm home, she loses the machine to me.... *evil grin*
    Anyways, I haven't considered using a laptop before for me personally as I just don't care for the small screen size, no mouse, and things like that.
    Seems I've changed my opinion, esp seeing as the new mouse track pad is a game changer. I still am not crazy about the small screen size, but I can get over it.
    The one thing I can't get around is the small disk storage. I have 4 TB on my big machine. Between my video, photography, and music, I am nearing a full TB of just raw material. If I count all the things I've stored, especially finished videos I make of photography projects, well, you can imagine the chomping that occurs on disk storage.
    So what is a man to do about storage?
    I'm really getting hooked on this zippy little machine. The idea of dragging it around is compelling. Having my work not tied to a desk is fun.
    but the lack of storage is sucky, terribly so.
    I have one possible option, taking the G5 and making it a server. I can use it for backup and high speed transfer, working on the transfer part today.
    I thought I'd poll the collective wisdom of the community and see what do you peeps think is the best overall way to handle such a situation?
    What is the absolute fastest best way to use external drives? Firewire 800 is ok, but when working with 400gb aperture database files....
    I'm hoping there is some way to start posturing myself to really use a laptop for my full computing experience.
    Or am I trying to make a small portable machine act like a tower? Is this not a reasonable hope at this point in time?
    I'm open to suggestions. I'm going to purchase something when my daughter takes this laptop with her to college. Either a new mac pro tower or laptop. I love the portability but need speed and massive storage.
    Thanks for opining.

    Storage:
    1. Replace the drive with a larger one. Up to 1 TB is now available for notebook drives.
    2. Get a portable external drive for additional storage needs. See the options at OWC as an example.
    Drive speed:
    1. FW 800 should be fast enough for your work. If the computer has an ExpressCard slot then consider a card supporting full speed eSATA. You may then achieve speeds closer to the internal bus.
    A laptop is not a tower, so stop thinking of it as, or comparing it to, your tower. Besides, the laptop's speed compared to your G5 is more like comparing a Ford Focus to a Ferrari F150.
    As for "polling" opinions here: polling is forbidden by the forums' Terms of Use.

  • Sending Large amount of data (250 K +) from Oracle to flex client

    I have an Oracle database with more than 250K rows of data that need to be sent to the AdvancedDataGrid and Flex charts for runtime analysis. What would be an ideal solution to implement this?

    I would say paging is the way to go; downloading 250K rows to a client is not sensible. Any analysis that may be required should be performed server-side. Dumping 250K data points into a chart is also unlikely to be performant or necessary.
    You get paging with LCDS, and the new FlashBuilder 4 code generation features also support paging.
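    For reference, here is a rough sketch of the server-side half of that paging approach in plain Java/JDBC (the table and column names are hypothetical); whichever paging mechanism you use on the Flex side ultimately drives a query like this with a page index and page size:

    // Hypothetical server-side paging query for Oracle using ROWNUM,
    // so the client only ever receives one page (e.g. 500 rows) at a time.
    import java.sql.*;
    import java.util.*;

    public class ReadingDao {
        private static final String PAGE_SQL =
            "SELECT id, label, value FROM (" +
            "  SELECT t.*, ROWNUM rn FROM (" +
            "    SELECT id, label, value FROM readings ORDER BY id" +  // hypothetical table/columns
            "  ) t WHERE ROWNUM <= ?" +
            ") WHERE rn > ?";

        public List<String[]> fetchPage(Connection conn, int pageIndex, int pageSize) throws SQLException {
            List<String[]> rows = new ArrayList<>();
            try (PreparedStatement ps = conn.prepareStatement(PAGE_SQL)) {
                ps.setInt(1, (pageIndex + 1) * pageSize); // upper row bound
                ps.setInt(2, pageIndex * pageSize);       // lower row bound
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        rows.add(new String[] {
                            rs.getString("id"), rs.getString("label"), rs.getString("value") });
                    }
                }
            }
            return rows;
        }
    }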

  • Store agent started to transfer large amounts of data, eating all my bandwidth.  Seemed to just start 5-6 days ago.

    About 5-6 days ago, storeagent started running continuously, sending and receiving large amounts of data. This eats up all my bandwidth quickly, essentially rendering my internet access worthless, since I have to use satellite internet. I have tried stopping it in Activity Monitor, but it restarts again. I thought I might have had a virus or something. I downloaded Trend Micro for Mac, but found its core services essentially did the same thing. I uninstalled it, but found that storeagent is still running non-stop. Ideas?

    The storeagent process is a normal part of Mac OS X, not a virus. Remove Trend Micro, which is a quite poor choice for protecting yourself against malware in the first place (see the results of my Mac anti-virus testing 2014), and which isn't really necessary anyway (see my Mac Malware Guide).
    As for what it might be doing, as babowa points out, it should be present when the App Store app is open, and at that time, it might be occupied with downloading updates or something similar. If you keep force-quitting it in Activity Monitor, that probably ruins whatever download it was working on, so it has to start all over again, perpetuating the cycle. In general, it is a very bad idea to force-quit processes that are part of Mac OS X without a very good reason and an understanding of what they are.
    Go to System Preferences -> App Store:
    You will probably want to turn off automatic download of newly available updates, as well as automatic download of apps purchased on other Macs (if you have other Macs). I do not advise turning off the master "Automatically check for updates" box, or the one for installing security updates, as disabling those will reduce the security of your system. These security updates are typically small, so they should have very little impact on your total internet usage.

  • JSP and large amounts of data

    Hello fellow Java fans
    First, let me point out that I'm a big Java and Linux fan, but somehow I ended up working with .NET and Microsoft.
    Right now my software development team is working on a web tool for a very important microchip manufacturing company. This tool handles large amounts of data; some of our online reports generate more than 100,000 rows, which need to be displayed in a web client such as Internet Explorer.
    We make use of Infragistics, which is a set of controls for .NET. Infragistics allows me to load data fetched from a database into a control they call UltraWebGrid.
    Our problem comes up when we load large amounts of data into the UltraWebGrid, sometimes 100,000+ rows; during this load our IIS server's memory gets exhausted, and it can take up to 5 minutes for the server to finish processing and display the 100,000+ row report. We have already proved that the database server (SQL Server) is not the problem; our problem is the IIS web server.
    Our team is now considering migrating this web tool to Java and JSP. Can you help me with some links, information, or past experiences you have had with loading and displaying large amounts of data like this in JSP? Help will be greatly appreciated.

    Who in the world actually looks at a 100,000 row report?
    Anyway, if I were you and I had to do it because some clueless management person decided it was a good idea... I would write a program that, once a day, week, year, or whatever your time period is, produces the report (perhaps as a PDF, though you could do it in HTML if you really must have it that way) and serves it as a static file that you link to from your app.
    Then the user just has to wait while it downloads, but the web server or web application server will not be bogged down trying to produce that monstrosity.
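    To make that concrete, here is a rough sketch of such a job in Java (the table, columns, and output path are hypothetical); it is meant to be run from a scheduler (cron, Quartz, Windows Task Scheduler) rather than per request:

    // Sketch of a nightly job that writes the big report to a static HTML file.
    // Table name, columns, and output target are hypothetical placeholders.
    import java.io.*;
    import java.nio.charset.StandardCharsets;
    import java.sql.*;

    public class NightlyReportJob {
        public static void generate(Connection conn, File target) throws Exception {
            try (PreparedStatement ps = conn.prepareStatement(
                     "SELECT lot_id, step, yield FROM production_report ORDER BY lot_id");
                 ResultSet rs = ps.executeQuery();
                 Writer out = new BufferedWriter(new OutputStreamWriter(
                     new FileOutputStream(target), StandardCharsets.UTF_8))) {
                out.write("<html><body><table>\n");
                while (rs.next()) {
                    // Real code should HTML-escape the values before writing them.
                    out.write("<tr><td>" + rs.getString("lot_id") + "</td><td>"
                            + rs.getString("step") + "</td><td>"
                            + rs.getString("yield") + "</td></tr>\n");
                }
                out.write("</table></body></html>\n");
            }
            // The web app then links to the generated file (e.g. /reports/daily.html)
            // instead of rendering the 100,000+ rows on every request.
        }
    }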

  • HT3529 I am confused about messaging. Shouldn't I be able to send and receive messages with cellular data off?  I can usually send the message, but often do not receive messages until I either have wifi or turn cellular on.

    I am confused about messaging. Shouldn't I be able to send and receive messages with cellular data off and without wifi? I can send the message but often do not receive messages from others, iPhone or other, until  I turn on cellular or wifi.

    Depends on the type of Message.
    SMS messages will send and receive with cellular data off. And while you can guarantee that you send using SMS, you cannot guarantee that whoever replies to you does so as well. They may be replying through iMessage if they are using iPhones.
    However, Android phones should be sending through SMS.
    You can turn off iMessage if you want to, though people with limited SMS text messaging in their plans may not appreciate it, and may stop messaging you.

  • How to solve delay in a program with data aquisition and processing

    Hello, I am a beginner in LabVIEW programming. I am working on a system which contains a roller, a piston, and an A/D card from Data Translation Inc. (DT304). I am using LabVIEW to get speed data for the roller (which comes in as a voltage and is then converted into speed) from the Analog Input channels, and then using the Analog Output channels to control the friction force being applied to the roller by a piston.
    I am using a Waveform Chart to show the speed of the roller. However, as I add more components to the program, I get more and more delay in the speed display and also in the output control of the piston. Also, the longer I use the program (e.g. from morning to night), the slower it becomes.
    My question is: how does LabVIEW store data, and how are the data stored and released in the buffer? Is it because I am loading too much data into the buffer that it becomes slower? And how do I solve these delays?
    I am using the "save to file" function to write data to a ".lvm" file, but I changed the saving logic to use a button to enable and disable saving, so I don't think the problem is that I am always saving the data to a file.
    Thank you very much.
    Jessie

    I'm betting it's two things:
    1. Program Architecture
    and 2. Dynamically resizing arrays.
    With the program architecture... if you have one do-while loop that acquires data from the DAQ, processes it, opens/indexes/writes/closes a file, then comes back around... as your arrays and files increase in size, the loop is going to take longer to execute.  A Producer-Consumer loop (a good tutorial on them can be found here; the same idea is sketched in Java below) has one loop that acquires data and stuffs the data into a queue.  This will buffer the data while it's in transition between acquisition and processing.  A second loop in parallel takes data in from the queue, processes it, then comes back around.  The two loops operate independently of each other, so even if the consumer loop takes longer as the files or math gets more complex, the producer loop continues to run at full speed.
    Second is the arrays.  Every time you append data into an array, LabVIEW has to make a copy of the data that's in the array.  If you append small amounts of data to an array over and over and over again, eventually LabVIEW is going to be copying very large amounts of data over and over and over again.  The producer-consumer architecture can alleviate this problem.
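    LabVIEW is graphical, so the pattern can't really be shown as text, but purely as an illustration here is the same producer-consumer idea sketched in Java with a bounded BlockingQueue (the acquire() and process() methods are hypothetical stand-ins for the DAQ read and the processing/file writes):

    // Producer-consumer sketch: the producer loop only acquires samples and
    // enqueues them; the consumer loop does the slow processing, so a slow
    // consumer never stalls acquisition.
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;

    public class ProducerConsumerDemo {
        public static void main(String[] args) {
            BlockingQueue<double[]> queue = new ArrayBlockingQueue<>(1024); // bounded buffer

            Thread producer = new Thread(() -> {
                try {
                    while (!Thread.currentThread().isInterrupted()) {
                        queue.put(acquire());          // blocks only if the buffer is full
                    }
                } catch (InterruptedException e) { /* shutdown */ }
            });

            Thread consumer = new Thread(() -> {
                try {
                    while (!Thread.currentThread().isInterrupted()) {
                        double[] block = queue.take(); // waits for data
                        process(block);                // slow work happens off the acquisition path
                    }
                } catch (InterruptedException e) { /* shutdown */ }
            });

            producer.start();
            consumer.start();
        }

        private static double[] acquire() {            // hypothetical: replace with the real DAQ read
            return new double[] { Math.random() };
        }

        private static void process(double[] block) {  // hypothetical: filtering, display, file writes
        }
    }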

  • Looking for ideas for transferring large amounts of data between systems

    Hello,
    I am looking for ideas based on best practices for transferring Large Amounts of Data in and out of a Netweaver based application.
    We have a new system we are developing in Netweaver that will utilize both the Java and ABAP stack, and will require integration with other SAP and 3rd Party Systems. It is a standalone product that doesn't share any form of data store with other systems.
    We need to be able to support 10s of millions of records of tabular data coming in and out of our system.
    Since we need to integrate with so many different systems, we are planning to use RFC for our primary interface in and out of the system. As it turns out RFC is not good at dealing with this large amount of data being pushed through a single call.
    We have considered a number of possible ideas, however we are not very happy with any of them. I would like to see what the community has done in the past to solve problems like this as well as how SAP currently solves this problem in other applications like XI, BI, ERP, etc.

    Primoz wrote: Do you use KDE (Dolphin) 4.6 RC or 4.5?
    Also, I've noticed that if I move/copy things with Dolphin they're substantially slower than if I use cp/mv. But cp/mv works fine for me...
    Also, run Dolphin from a terminal to try and see what the problem is.
    Hope that helps at least a bit.
    Could you explain why Dolphin should be slower? I'm not attacking you, I'm just asking.
    Because I thought that Dolphin is just a "little" wrapper around the cp/mv/cd/ls applications/commands.

  • Firefox is using large amounts of CPU time and disk access, and I need to know how to shut down most of this so I can actually use the browser.

    Firefox is a very busy piece of software. It's using large amounts of CPU time and disk access. It puts my usage at low priority, so I have to wait for some time to be able to use my pointer or keyboard. I don't know what it uses all that CPU and disk access time for, but it's of no use to me. It often takes off with massive use of resources when I'm not doing anything, and I may not have use of my pointer for several minutes. How can I shut down most of this so I can use the browser to get my work done? I just want to use the website access part of the software, and drop all the extra. I don't want Firefox to be able to recover after a crash. I just want to browse with a minimum of interference from Firefox. I would think that this is the most commonly asked question.

    Firefox consumes a lot of CPU resources
    * https://support.mozilla.com/en-US/kb/Firefox%20consumes%20a%20lot%20of%20CPU%20resources
    High memory usage
    * https://support.mozilla.com/en-US/kb/High%20memory%20usage
    Check and tell if it's working.

  • Need to send and receive larger byte array

    I have a small WCF service, self-hosted in a console app. On the same computer I have a client application. Both are running in the debugger in VS2008. The proxy code and the config file were generated from the running service using svcutil.
    In one of the service calls, the service reads a pdf file and sends the contents as a byte array. The client receives the byte array, saves it as a pdf file, and displays it.
    Everything is fine for all other kinds of calls, and this one works fine also as long as the file is small (say 14K). But if the file is larger (say 84K), the client crashes (VHOST has stopped working).
    Is there some kind of setting that will allow me to send and receive larger byte arrays (> 100 K)?
    Thanks,
    Jon Jacobs
    In transmission, subatomic particles managed by professionals.
    No innocent electrons were harmed.

    Hi Jon,
    You'll want something like this to increase the message size quotas:
    <bindings>
      <basicHttpBinding>
        <binding name="basicHttp" allowCookies="true"
                 maxReceivedMessageSize="20000000"
                 maxBufferSize="20000000"
                 maxBufferPoolSize="20000000">
          <readerQuotas maxDepth="32"
                        maxArrayLength="200000000"
                        maxStringContentLength="200000000"/>
        </binding>
      </basicHttpBinding>
    </bindings>
    The justification for the values is simple: they are sufficiently large to accommodate most messages, and you can tune them to fit your needs. The low default value is basically there to prevent DoS-type attacks; making it 20000000 would allow a distributed DoS attack to be effective, whereas the default size of 64K would require a very large number of clients to overpower most servers these days.
    If you're still getting this error message while using the WCF Test Client, it's because the client has a separate MaxBufferSize setting.
    To correct the issue:
    Right-Click on the Config File node at the bottom of the tree
    Select Edit with SvcConfigEditor
    A list of editable settings will appear, including MaxBufferSize.
    Note: Auto-generated proxy clients also set MaxBufferSize to 65536 by default.
    Let me know if this helped.
    Regards,
    Raghu

  • My phone is using large amounts of data; when I go to System Services, it's my Mapping Services that's causing it. What are Mapping Services and how do I switch them off? I really need help.

    My phone is using large amounts of data; when I go to System Services, it's my Mapping Services that's causing it. What are Mapping Services and how do I switch them off? I really need help.

    I have the same problem. I switched off Location Services, Maps under cellular data, and whatever else Maps could be involved in, and then just last night it chewed through 100 MB... I'm also on Vodacom, so I'm seeing a pattern here somehow. Siri was switched on, however, so I have switched it off now and will see what happens. But I'm going to go to both Apple and Vodacom this afternoon, because this must be sorted out; it's a serious issue we have on our hands and some uproar needs to be made against those responsible!

  • Sending large amounts of data spontaneously

    In my normal experience with the internet connection, the amount of data sent is about 50 to 80% of that received, but occasionally Firefox starts transmitting large amounts of data spontaneously; what it is, I don't know, and where it's going, I don't know. For example, today the AT&T status screen showed about 19 MB received and about 10 MB sent after about an hour of online time. A few minutes later, I looked down at the status screen and it showed 19.5 MB received and 133.9 MB sent, and the number was steadily increasing. Just before I went online today, I ran a complete scan of the computer with McAfee and it reported nothing needing attention. I ran the scan because a similar effusion of spontaneously sent data had happened yesterday. When I noticed the data pouring out today, I closed Firefox and it stopped. When I opened Firefox right afterward, the transmission of data did not recommence. My first thought was that my computer had been captured by the bad guys and was now a robot, but McAfee says not to worry. But should I worry anyway? What's going on, or, not having a good answer to that, how can I find out what's going on? And how can I make it stop, unless I'm seeing some kind of maintenance operation that Mozilla or Microsoft is subjecting me to?

    Instead of using URLConnection, open a Socket to the server port (probably 80), send a POST HTTP request followed by the data; you may then (optionally) receive data from the server to check that the servlet is OK. This is the same protocol as URLConnection, but you have control over when the data is actually sent...
    Socket sock = new Socket(getHost(), 80);
    DataOutputStream dos = new DataOutputStream(sock.getOutputStream());
    dos.writeBytes("POST /servletname HTTP/1.0\r\n");   // request line needs the path and HTTP version
    dos.writeBytes("Content-Type: text/plain\r\n");     // optional, but good if you know it
    dos.writeBytes("Content-Length: " + lengthOfData + "\r\n");  // again optional, but good if you can know it without caching the data first
    dos.writeBytes("\r\n");   // gotta have a blank line before the data
      // send data now
    DataInputStream dis = new DataInputStream(sock.getInputStream());  // optional, if you want to receive
      // receive any feedback from the servlet if you want
    dis.close();
    dos.close();
    sock.close();
    I'm guessing that URLConnection caches the data so it can fill in "Content-length".

  • HT204161 I want to use iCloud for everything else except Messages. I do not want to send an iMessage and receive it on both my iPhone and my Mac.

    I want to use iCloud for everything else except Messages. I do not want to send an iMessage and receive it on both my iPhone and my Mac.
    How do I do that?

    Howdy kingtonz,
    If I understand you correctly, you want to limit your use of the Messages application to non-iCloud sources, such as text messaging through your phone carrier, is that right?
    You can turn off the use of iMessages in Settings > Messages > iMessage. See this help article -
    Messages settings - iPhone
    Thanks for using Apple Support Communities.
    Best,
    Brett L 
