Adding large amounts of data to multiple traces

Hi,
I have a CNiGraph object. I want to add 100 traces (plots) with 500,000 points each, and later update each trace with 50,000 points/sec. How can I do this? I tried it, and adding just 300 points/trace for 100 traces took 45 seconds, so I must be missing something. What is the fastest way to add the points to the traces? For the initial fill I would use PlotXvsY (as ALL traces can have different X values and the traces may differ in length), then ChartXvsY for the updates. I don't even get as far as the ChartXvsY, since the first stage takes forever. One likely reason is that I am getting a screen update for every trace added (can this be disabled?). I also don't want the graph to rescale on every trace update, just scroll to the end. I tried turning off the AutoScale function for the plots and setting their visibility to false, but that still took about 23 seconds (for 100 traces, 300 points/trace).
Please help.
Thanks,
Miklos

Instead of using URLConnection, open a Socket to the server port (probably 80) and send a POST HTTP request followed by the data. You may then (optionally) read data back from the server to check that the servlet handled it OK. This is the same protocol URLConnection speaks, but you have control over when the data is actually sent:
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.Socket;

Socket sock = new Socket(getHost(), 80);
DataOutputStream dos = new DataOutputStream(sock.getOutputStream());
dos.writeBytes("POST /servletname HTTP/1.0\r\n"); // the request line needs a path and an HTTP version
dos.writeBytes("Content-Type: text/plain\r\n");   // optional, but good if you know it
dos.writeBytes("Content-Length: " + lengthOfData + "\r\n"); // again optional, but good if you can know it without caching the data first
dos.writeBytes("\r\n");   // gotta have a blank line before the data
  // send data now
DataInputStream dis = new DataInputStream(sock.getInputStream()); // optional, if you want to receive
  // receive any feedback from the servlet if you want
dis.close();
dos.close();
sock.close();

I'm guessing that URLConnection caches the data so it can fill in "Content-Length".
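For what it's worth, JDK 1.5 and later also let you keep URLConnection and still avoid that caching: streaming mode sends the body as you write it instead of buffering it to compute Content-Length. A minimal sketch (the URL is a placeholder, and lengthOfData is assumed to be known up front):

import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;

HttpURLConnection conn = (HttpURLConnection) new URL("http://host/servletname").openConnection();
conn.setDoOutput(true);
conn.setRequestMethod("POST");
conn.setRequestProperty("Content-Type", "text/plain");
conn.setFixedLengthStreamingMode(lengthOfData); // or setChunkedStreamingMode(0) if the length is unknown
OutputStream out = conn.getOutputStream();
// write the data here; it goes on the wire as you write it, nothing is cached
out.close();
int status = conn.getResponseCode(); // completes the exchange and gives you the servlet's verdict
conn.disconnect();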

Similar Messages

  • LMS 4.1 -- Adding larger amounts of devices (>10) results in WebUI freeze and not collecting Inv/Conf data

    Hi all,
    with LMS 4.1 running on Solaris 10, I discovered the following.
    Adding a larger amount (>10) of devices one after the other (via DCRCLI) results in IC_Server no longer collecting the inventory data, nor will ArchiveMgmt collect configs, and after a while DataCollection (ANI) gets stuck in an endless loop.
    After that, navigating in the CW2k WebUI, the UI freezes.
    Is that a known problem with interprocess communication and the event bus in LMS 4.1?
    Any feedback is welcome.
    Best Regards
    Lothar

    After stopping the system I discovered that RmeDbEngine won't stop.
    Issuing the dbstop command also leaves RmeDbEngine running.
    We had to kill the process with signal 9.

  • How to pass large amount of data between steps

    Hi all,
    I have some LabVIEW VIs for data acquisition.
    I need to pass a large amount of data (array size > 5,000,000 each time) from one step to another,
    but TestStand does not allow an array size larger than 5,000,000.
    Any suggestions?
    czhen
    Win 7 SP1 & LabVIEW 2012 SP1, Teststand 2012 SP1
    Attachments: Array Size Limits.png (34 KB)

    In your LabVIEW code, put the data into a data value reference and pass that reference between your TestStand steps. As an added bonus, you will not get an extra copy of the data at each step. You will need to use the In Place Element structure to get your data back out of the data value reference. (See the sketch below.)
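The same idea in ordinary Java, as a conceptual analogy only (this is not the TestStand API): every step receives a small reference object, and the big array itself is never copied.

import java.util.concurrent.atomic.AtomicReference;

// Plays the role of the data value reference: all steps share this handle.
AtomicReference<double[]> dvr = new AtomicReference<>(new double[5000000]);

// "Step 1": fill the array in place through the reference.
double[] data = dvr.get();
for (int i = 0; i < data.length; i++) {
    data[i] = i * 0.5; // hypothetical acquisition values
}

// "Step 2": read through the same reference; no copy of the 5M elements is made.
System.out.println("first sample: " + dvr.get()[0]);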

  • Large Amount of Data in JSF

    Hello,
    I am using the Table Group component for displaying data in my application designed in Java Studio Creator.
    I have enabled paging on the component. I use a CachedRowSet on the page bean for getting the data. This works very well at the moment in my development environment, where I am testing on a small amount of data.
    I was wondering how this component performs with very large amounts of data (>75,000 rows). I noticed that there is a button available for users to retrieve all the rows. So apart from that instance, when viewing in paged mode, does the component fetch all the results from the database every time?
    Which component would be best suited for displaying large amounts of data in a table format?
    Thanks In Advance!!

    Thanks for your reply. The table control that I use does have paging as a feature and I have enabled it. It still takes time to load the data initially.
    I wonder if it has to do with the logic of the paging. How do you specify which set of 20 records to extract from SQL? (See the sketch below.)
    Thanks for your help!!
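Since you are already using CachedRowSet, note that paging is built into it; a minimal sketch (the query, table, and column names are made up, and on Creator-era JDKs you would construct com.sun.rowset.CachedRowSetImpl directly instead of using RowSetProvider):

import java.sql.Connection;
import javax.sql.rowset.CachedRowSet;
import javax.sql.rowset.RowSetProvider;

Connection conn = dataSource.getConnection();   // your existing DataSource
CachedRowSet crs = RowSetProvider.newFactory().createCachedRowSet();
crs.setCommand("SELECT id, name FROM customers ORDER BY id"); // hypothetical query
crs.setPageSize(20);        // the rowset caches only 20 rows at a time
crs.execute(conn);          // runs the query and loads the first page
do {
    while (crs.next()) {
        System.out.println(crs.getInt("id") + ": " + crs.getString("name"));
    }
} while (crs.nextPage());   // asks the database for the next 20 rows
crs.close();
conn.close();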

  • Open Large amount of data

    Hi
    I have a file in .dat format on the application server; it contains a large amount of data, maybe 2 million records or more. I need to open the file to check the record count. Is there any software or any option to open the file? I have tried opening it with Notepad, Excel... it gives an error.
    please let me know
    Thanks

    Hi,
    Try this:
    Go to AL11 and go to the file's directory. In the file list there is a field called length, which is the total length of the file in characters.
    If you know the length of a single line, divide the length of the file by the length of a single line; I believe you will get the number of records (see the sketch after this reply).
    Thanks,
    Naren
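The same arithmetic as a throwaway sketch, if you can copy the file to a machine of your own (the path and record length are made up, and this only works for fixed-length records):

import java.io.File;

File dat = new File("/tmp/big.dat");    // hypothetical local copy of the file
long recordLength = 1200;               // length of one line, in characters
System.out.println("records: " + dat.length() / recordLength);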

  • Bex Report Designer - Large amount of data issue

    Hi Experts,
    I am trying to execute (on the Portal) a report made in BEx Report Designer with about 30,000 pages, and the only thing I get is a blank page. Everything works fine at about 3,000 pages. Do I need to set something to allow processing of such a large amount of data?
    Regards
    Vladimir

    Hi Sauro,
    I have not seen this behavior, but it has been a while since I tried to send an input schedule that large. I think the last time was on a BPC NW 7.0 SP06 system, and it worked OK. If you are on a recent support package, then you should search for relevant notes (none come to mind for me, but searching yourself is always a good idea), and if you don't find one, you should open a support message with SAP, with very specific instructions for recreating the problem from a clean input schedule.
    Good luck,
    Ethan

  • Advice needed on how to keep large amounts of data

    Hi guys,
    I'm not sure what the best way is to make large amounts of data available to my Android app on the local device.
    For example, records of food ingredients, in the hundreds?
    I have read and successfully created .db files using this tutorial:
    http://help.adobe.com/en_US/AIR/1.5/devappsflex/WS5b3ccc516d4fbf351e63e3d118666ade46-7d49.html
    However, to populate the database I use Flash? So this kind of defeats the purpose of it. There is no point in me shifting a massive array of data from Flash to a SQL database when I could access the data directly from the AS3 array.
    So maybe I could create the .db with an external program? But then how would I include that .db in the APK file and deploy it to users' Android devices?
    Or maybe I create an AS3 class with an XML object in it and use that as a means of data storage?
    Any advice would be appreciated

    You can use any means you like to populate your SQLite database, including using external programs, (temporarily) embedding a text file with SQL statements, executing some SQL from AS3 code, etc. (See the sketch below.)
    Once you have populated your db, deploy it with your project:
    http://chrisgriffith.wordpress.com/2011/01/11/understanding-bundled-sqlite-databases-in-air-for-mobile/
    Cheers, - Jon -
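For the "external program" route, anything that can write a SQLite file will do. Here is a minimal sketch in Java, assuming the Xerial sqlite-jdbc driver on the classpath (the table and rows are made up); the ingredients.db it writes is what you then bundle as described in the link above:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;

public class BuildDb {
    public static void main(String[] args) throws Exception {
        // Creates (or opens) ingredients.db in the working directory.
        try (Connection conn = DriverManager.getConnection("jdbc:sqlite:ingredients.db")) {
            Statement st = conn.createStatement();
            st.execute("CREATE TABLE IF NOT EXISTS ingredient (id INTEGER PRIMARY KEY, name TEXT)");
            PreparedStatement ps = conn.prepareStatement("INSERT INTO ingredient(name) VALUES (?)");
            for (String name : new String[] { "salt", "sugar", "flour" }) { // your hundreds of records
                ps.setString(1, name);
                ps.executeUpdate();
            }
        }
    }
}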

  • Error in Generating reports with large amount of data using OBIR

    Hi all,
    we have integrated OBIR (Oracle BI Reporting) with OIM (Oracle Identity Management) to generate custom reports. Some of the custom reports contain a large amount of data (approx. 80-90K rows with 7-8 columns), and the queries for these reports primarily use the audit tables and the resource form tables. When we try to generate a report, it works fine in HTML, where the report is generated directly in the console, but when we try to generate the same report and save it as PDF or Excel, it fails with the following error:
    [120509_133712190][][STATEMENT] Generating page [1314]
    [120509_133712193][][STATEMENT] Phase2 time used: 3ms
    [120509_133712193][][STATEMENT] Total time used: 41269ms for processing XSL-FO
    [120509_133712846][oracle.apps.xdo.common.font.FontFactory][STATEMENT] type1.Helvetica closed.
    [120509_133712846][oracle.apps.xdo.common.font.FontFactory][STATEMENT] type1.Times-Roman closed.
    [120509_133712848][][PROCEDURE] FO+Gen time used: 41924 msecs
    [120509_133712848][oracle.apps.xdo.template.FOProcessor][STATEMENT] clearInputs(Object) is called.
    [120509_133712850][oracle.apps.xdo.template.FOProcessor][STATEMENT] clearInputs(Object) done. All inputs are cleared.
    [120509_133712850][oracle.apps.xdo.template.FOProcessor][STATEMENT] End Memory: max=496MB, total=496MB, free=121MB
    [120509_133818606][][EXCEPTION] java.net.SocketException: Socket closed
    at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:99)
    at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
    at weblogic.servlet.internal.ChunkOutput.writeChunkTransfer(ChunkOutput.java:525)
    at weblogic.servlet.internal.ChunkOutput.writeChunks(ChunkOutput.java:504)
    at weblogic.servlet.internal.ChunkOutput.flush(ChunkOutput.java:382)
    at weblogic.servlet.internal.ChunkOutput.checkForFlush(ChunkOutput.java:469)
    at weblogic.servlet.internal.ChunkOutput.write(ChunkOutput.java:304)
    at weblogic.servlet.internal.ChunkOutputWrapper.write(ChunkOutputWrapper.java:139)
    at weblogic.servlet.internal.ServletOutputStreamImpl.write(ServletOutputStreamImpl.java:169)
    at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
    at oracle.apps.xdo.servlet.util.IOUtil.readWrite(IOUtil.java:47)
    at oracle.apps.xdo.servlet.CoreProcessor.process(CoreProcessor.java:280)
    at oracle.apps.xdo.servlet.CoreProcessor.generateDocument(CoreProcessor.java:82)
    at oracle.apps.xdo.servlet.ReportImpl.renderBodyHTTP(ReportImpl.java:562)
    at oracle.apps.xdo.servlet.ReportImpl.renderReportBodyHTTP(ReportImpl.java:265)
    at oracle.apps.xdo.servlet.XDOServlet.writeReport(XDOServlet.java:270)
    at oracle.apps.xdo.servlet.XDOServlet.writeReport(XDOServlet.java:250)
    at oracle.apps.xdo.servlet.XDOServlet.doGet(XDOServlet.java:178)
    at oracle.apps.xdo.servlet.XDOServlet.doPost(XDOServlet.java:201)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
    at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
    at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:292)
    at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
    at oracle.apps.xdo.servlet.security.SecurityFilter.doFilter(SecurityFilter.java:97)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3496)
    at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
    at weblogic.security.service.SecurityManager.runAs(Unknown Source)
    at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2180)
    at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2086)
    at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1406)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    It seems we hit this issue when the query processing takes a long time. Do I need to perform any additional configuration to generate such reports?

    java.net.SocketException: Socket closed
         at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:99)
         at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
         at weblogic.servlet.internal.ChunkOutput.writeChunkTransfer(ChunkOutput.java:525)
         at weblogic.servlet.internal.ChunkOutput.writeChunks(ChunkOutput.java:504)
         at weblogic.servlet.internal.ChunkOutput.flush(ChunkOutput.java:382)
         at weblogic.servlet.internal.CharsetChunkOutput.flush(CharsetChunkOutput.java:249)
         at weblogic.servlet.internal.ChunkOutput.checkForFlush(ChunkOutput.java:469)
         at weblogic.servlet.internal.CharsetChunkOutput.implWrite(CharsetChunkOutput.java:396)
         at weblogic.servlet.internal.CharsetChunkOutput.write(CharsetChunkOutput.java:198)
         at weblogic.servlet.internal.ChunkOutputWrapper.write(ChunkOutputWrapper.java:139)
         at weblogic.servlet.internal.ServletOutputStreamImpl.write(ServletOutputStreamImpl.java:169)
         at com.tej.systemi.util.AroundData.copyStream(AroundData.java:311)
         at com.tej.systemi.client.servlet.servant.Newdownloadsingle.producePageData(Newdownloadsingle.java:108)
         at com.tej.systemi.client.servlet.servant.BaseViewController.serve(BaseViewController.java:542)
         at com.tej.systemi.client.servlet.FrontController.doRequest(FrontController.java:226)
         at com.tej.systemi.client.servlet.FrontController.doPost(FrontController.java:128)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
         at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
         at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:292)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:175)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3498)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(Unknown Source)
         at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2180)
         at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2086)
         at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1406)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:17
    (Please help us find a solution to this issue; it is in production and we need it ASAP.)
    Thanks in Advance
    Edited by: 909601 on Jan 23, 2012 2:05 AM

  • With journaling, I have found that my computer is saving a large amount of data, logs of all the changes I make to files; how can I clean up these logs?

    For example, in Notes I have written three notes; however, if I click on 'All On My Mac' in the sidebar, I see about 10 different versions of each note I make; it saves a version every time I add or delete a sentence.
    I also noticed that when I write an email, Mail saves about 10 or more draft versions before the final is sent.
    I understand that all this journaling provides a level of security and prevents data loss, but I was wondering: is there a function to clean up journal logs once in a while?
    Thanks
    Roz

    Are you using Microsoft Word? Microsoft thinks its users are idiots; it puts up a lot of pointless messages that annoy and worry users. I have seen this message from Microsoft Word. It's annoying.
    As BDaqua points out...
    When you copy information via Edit > Copy (Command+C) or Edit > Cut (Command+X), you place the information on the clipboard. When you paste information, Edit > Paste (Command+V), you copy information from the clipboard into your data file.
    If you Edit > Cut (Command+X), do not paste the information, and then quit Word, you could be losing information. Microsoft is very worried about this: when you quit Word, it checks whether there is information on the clipboard and, if so, puts out this message.
    You should be saving your work more than once a day. I'd save every 5 minutes; Command+S does a save.
    Robert

  • Looking for ideas for transferring large amounts of data between systems

    Hello,
    I am looking for ideas, based on best practices, for transferring large amounts of data into and out of a NetWeaver-based application.
    We have a new system we are developing in NetWeaver that will utilize both the Java and ABAP stacks and will require integration with other SAP and third-party systems. It is a standalone product that doesn't share any form of data store with other systems.
    We need to be able to support tens of millions of records of tabular data coming into and out of our system.
    Since we need to integrate with so many different systems, we are planning to use RFC as our primary interface into and out of the system. As it turns out, RFC is not good at dealing with such a large amount of data being pushed through a single call.
    We have considered a number of possible ideas, but we are not very happy with any of them. I would like to see what the community has done in the past to solve problems like this, as well as how SAP currently solves this problem in other applications like XI, BI, ERP, etc.

    Primoz wrote: Do you use KDE (Dolphin) 4.6 RC or 4.5?
    Also, I've noticed that if I move/copy things with Dolphin they're substantially slower than if I use cp/mv. But cp/mv works fine for me...
    Also, run Dolphin from a terminal to try and see what the problem is.
    Hope that helps at least a bit.
    Could you explain why Dolphin should be slower? I'm not attacking you, I'm just asking.
    Because I thought that Dolphin is just a "little" wrapper around the cp/mv/cd/ls commands.

  • Azure Cloud service fails when sent large amount of data

    This is the error;
    Exception in AZURE Call: An error occurred while receiving the HTTP response to http://xxxx.cloudapp.net/Service1.svc. This could be due to the service endpoint binding not using the HTTP protocol. This could also be due to an HTTP request context being aborted by the server (possibly due to the service shutting down). See server logs for more details.
    Calls with smaller amounts of data work fine. Large amounts of data cause this error.
    How can I fix this??

    Go to the web.config file, look for the <binding> that is being used by your service, and adjust the various parameters that limit the maximum length of the messages, such as maxReceivedMessageSize (sketched below):
    http://msdn.microsoft.com/en-us/library/system.servicemodel.basichttpbinding.maxreceivedmessagesize(v=vs.100).aspx
    Make sure that you specify a size large enough to accommodate the amount of data that you are sending (the default is 64 KB).
    Note that even if you set a very large value here, you won't be able to go beyond the maximum request length configured for IIS/ASP.NET; if I recall correctly, the default there (httpRuntime maxRequestLength) is 4 MB.
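A hedged sketch of what that can look like in web.config (the binding name is made up, and the sizes below are 10 MB; adjust them to your payload):

<system.serviceModel>
  <bindings>
    <basicHttpBinding>
      <!-- hypothetical binding name; it must match the bindingConfiguration on your endpoint -->
      <binding name="LargeDataBinding"
               maxReceivedMessageSize="10485760"
               maxBufferSize="10485760">
        <readerQuotas maxArrayLength="10485760" maxStringContentLength="10485760" />
      </binding>
    </basicHttpBinding>
  </bindings>
</system.serviceModel>

<!-- and in <system.web>, the ASP.NET request limit (value in KB): -->
<!-- <httpRuntime maxRequestLength="102400" /> -->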

  • DSS problems when publishing large amount of data fast

    Has anyone experienced problems when sending large amounts of data using the DSS? I have approximately 130 to 150 items that I send through the DSS to communicate between different parts of my application.
    There are several loops publishing data. One publishes approximately 50 items at a rate of 50 ms; another about 40 items with a 100 ms publishing rate.
    I send a command to a subprogram (125 ms) that reads and publishes the answer on a DSS URL (approx. 125 ms), so that is one item on the DSS about every 250 ms. But this data is not seen in my main GUI window that reads the DSS URL.
    My questions are:
    1. Is there any limit in speed (frequency) for data publishing in DSS?
    2. Can DSS become unstable if loaded too much?
    3. Can I lose/miss data in any situation?
    4. In the DSS Manager I have doubled MaxItems and MaxConnections. How will this affect my system?
    5. When I run my full application I have experienced the following error: Fatal Internal Error: "memory.ccp", line 638. Can this be a result of my large application and the heavy load on the DSS? (See attached picture.)
    Regards
    Idriz Zogaj
    Idriz "Minnet" Zogaj, M.Sc. Engineering Physics
    Memory Profesional
    direct: +46 (0) - 734 32 00 10
    http://www.zogaj.se

    LuI wrote:
    >
    > Hi all,
    >
    > I am frustrated with VISA serial comm. It looks so neat, and it's
    > fantastic what it is supposed to do for a developer, but sometimes one
    > runs into very deep trouble.
    > I have an app where I have to read large amounts of data streamed by
    > 13 µCs at 230kBaud. (They do not necessarily need to stream all at the
    > same time.)
    > I use either a Moxa multiport adapter C320 with 16 serial ports or -
    > for test purposes - a Keyspan serial-2-USB adapter with 4 serial
    > ports.
    Does it work better if you use the serial port(s) on your motherboard?
    If so, then get a better serial adapter. If not, look more closely at
    VISA.
    Some programs have some issues on serial adapters but run fine on a
    regular serial port. We've had that problem recently.
    Best, Mark

  • Freeze when writing large amount of data to iPod through USB

    I used to take backups of my PowerBook to my 60 GB iPod video. Backups are taken with tar in Terminal, directly to the mounted iPod volume.
    Now, every time I try to write a big amount of data to the iPod (from a MacBook Pro), the whole system freezes (the mouse cursor moves, but nothing else can be done). When the USB cable is pulled out, the system recovers and acts as it should. This happens every time a large amount of data is written to the iPod.
    The same iPod works perfectly (when backing up) on the PowerBook, and small amounts of data can easily be written to it (on the MacBook Pro) without problems.
    Does anyone else have the same problem? Any ideas why is this and how to resolve the issue?
    MacBook Pro, 2.0Ghz, 100GB 7200RPM, 1GB Ram   Mac OS X (10.4.5)   IPod Video 60G connected through USB

    Ex-PC user... never had a problem.
    Got a MacBook Pro last week... having the same issues, and this is now with an exchanged machine!
    I've read elsewhere that it's something to do with the USB timing out, and that if you get a new USB hub (powered separately) and attach the iPod through it, it should work. Kind of a bummer, but the folks who tried it say it works.
    Me, I can upload to the iPod piecemeal, manually... but even then, it sometimes freezes.
    The good news is that once the iPod is loaded, the problem shouldn't happen. It's the large amounts of data.
    Apple should DEFINITELY fix this, though. Unbelievable.
    MacBook Pro 2.0   Mac OS X (10.4.6)  

  • Getting Short dump due to large amount of data

    Hi experts,
    When we run the RALM_ME_MEASP_FULL_DOWNLOAD_SD program, we get a short dump every time due to the large amount of data.
    Please suggest how to run this program without a short dump.
    Thanks & Regards
    Prashant Gupta

    Hi,
    you are running the wrong app, I guess. The service you are running is for MAM 2.0. If you are interested in MAM 3.0 or MAM 2.5, please use the correct service.
    If you search for
    *FULL_DOWNLOAD_SD
    you will find them all. Use the ones without the RALM in front; then the timeout should be solved as well. Besides that, there is a great guide available that helps to solve some issues around server-driven replication:
    [MAM SERVER DRIVEN SETUP GUIDE|http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/818ac119-0b01-0010-ba8b-b6e3f3490a63?quicklink=index&overridelayout=true]
    Hope that helps to solve your issue.
    Regards,
    Oliver

  • Regarding TXT File data truncation due to large amount of data

    Hi Guys,
    I am downloading data to a .txt file in a background job, and the records get truncated because of the large amount of data. With less data it works fine.
    I have checked the internal table size for this, and in any case I have declared it with OCCURS 0 only.
    So please help me find the reason. I am confused: is there any size limitation for a TXT file?
    Please help me guys..
    Thanks in advance..
    Prabhu.R

    Hi Rakesh,
    Two ways:
    1. Ask your BASIS team to increase the memory limit.
    2. Check the PACKAGE SIZE option of the SELECT statement.
    With it you don't select all the data at once, but in packets of the specified size; you get a packet of data, process it, and continue (see the sketch below).
    Just press F1 on PACKAGE SIZE; that explanation will be enough to proceed further.
    Thanks,
    Vinod.
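A minimal sketch of the PACKAGE SIZE pattern, in ABAP since that is where the option lives (the table, work area, and packet size are made up):

DATA lt_chunk TYPE TABLE OF mara.  " hypothetical source table

SELECT * FROM mara
         INTO TABLE lt_chunk
         PACKAGE SIZE 10000.
  " each loop pass sees one packet of up to 10,000 rows:
  " process it / append it to the file, then the next packet replaces it
ENDSELECT.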

Maybe you are looking for

  • MAIN MENU

    Do you know which image formats support transparency in SAP, or how to upload an image to SAP without a background?

  • Implemenet/Extend interface/class in default package

    Filename: DefaultInterface.java public interface DefaultInterface {     // Abstract Methods... Filename: ConcreteClass.java package com.company; import DefaultInterface; public class ConcreteClass implements DefaultInterface {     // Implementation of

  • OBIEE 11.1.1.5.0 Write back feature error.

    I am working on the OBIEE 11g (11.1.1.5.0) write-back feature. After all the steps that were mentioned in the 11g documentation, there's an error message saying "Write Back Error", with no other detail information. Do you have any idea about it? http://download.o

  • Email server ports

    Hi there -- I have my Mail set up to check my email (POP) on ports 25 and 110 for my home Comcast cable network. Anytime that I travel, which is often enough to be annoying, the servers don't work. This is all Greek to me, so I don't know if there's

  • Measurement of the dynamic pressure by IEPE sensor

    Hello, I work on cylinder pressure measurement in a combustion engine. I have a piezoelectric pressure sensor with a charge output of 1 pC/psi and an inline charge amplifier of 20 mV/pC, which gives an overall sensitivity of 20 mV/psi. As my instruments are of an IEPE