Stream data to xygraph

Trying to solve what I thought would be a simple problem. I have an app that produces an array of 10 values every 20 ms (approximately - the intervals are not exact). The data system will be running for weeks, monitoring and controlling processes - data storage is handled elsewhere in the program. All values are doubles, starting with a time tag, then 9 values. Currently I'm plotting to a waveform chart, which is easy and lets me set the history length, but the timing is not right since a chart assumes all data updates arrive at regular intervals. So, my assumptions, after going through many examples and tutorials, are:
 - I must use an XY graph to plot against time, since the data intervals are not regular
 - an XY graph can only accept one data value per time value, in 2-element clusters (though I did once create an XY graph that took a 3-element cluster - I can't repeat it!)
I created a sample VI that creates a data stream and sends it to a queue (similar to the data system, except that this example does have regular timing), then plots the data to XY graphs using two different methods, both taken from NI discussion forum sample code. Both seem to work, but neither is very elegant compared to the simplicity of a waveform chart, and both appear to regenerate the entire graph every time new data is added (efficiency?).
So, am I missing something? Is there an easier way to continuously add points to a graph and keep the array to a manageable size? Or should I just pick one of the methods in the attached VI?
Thanks.
Attachments:
TestXYGraph4.vi ‏47 KB
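Since the attached VI isn't available here, a rough text-form sketch of the usual answer to "keep the array to a manageable size" may help. This is Python (LabVIEW is graphical, so this is only an analogy), with a hypothetical buffer size: each signal keeps its (time, value) pairs in a fixed-length buffer so old points fall off as new ones arrive, much like a chart's history length.

```python
from collections import deque

HISTORY = 1000  # plays the role of a chart's history length (assumed value)

# One bounded buffer per plotted signal: appending past maxlen silently
# drops the oldest point, so the XY arrays never grow without bound.
history = [deque(maxlen=HISTORY) for _ in range(9)]

def add_sample(t, values):
    """Append one (time, value) pair to each signal's buffer."""
    for buf, v in zip(history, values):
        buf.append((t, v))

# Simulate 5000 updates of 9 values each (irregular timing doesn't matter here)
for i in range(5000):
    add_sample(i * 0.02, [float(i)] * 9)
```

In LabVIEW terms this corresponds to rotating/truncating the XY arrays in a shift register before wiring them to the graph, rather than rebuilding them from the full history.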

That's over four million collects per day (times 10 values equals 43.2 million doubles!)  You want to store all that data in memory?  What if you lose power?  I would store the data to file (TDMS) as it's collected and then create a reader to extract the regions of interest for display/analysis or whatever.
SIDE NOTE:  Rather than using a stop button with locals to communicate between loops, use the queue you already have. Monitor it in one loop with Queue Status (wire the error out to your stop terminal). In the other loop, wire a destroy-queue AFTER the loop; when you stop that loop the queue is destroyed, the monitoring loop sees the queue is gone and generates an error, and that error stops its loop.
Using LabVIEW: 7.1.1, 8.5.1 & 2013

Similar Messages

  • Stream data from a subvi to the main vi

    Hi
    I'm wondering if someone can help me. I'm trying to stream data from a subVI to the main VI without opening and closing the subVI continuously. The data also needs to be extracted from a while loop within the subVI. Background info: the subVI controls a photomultiplier tube, which, once it is on, is best kept on. The PMT signal is generated in a while loop of the subVI. Attached are some basic VIs showing what I'm trying to do.
    Thanks
    Attachments:
    Main page.vi ‏11 KB
    test2.vi ‏13 KB

    If you are streaming measurements continuously I would consider using a circular buffer instead of the queued producer/consumer approach.
    Check out this implementation of such a buffer; use it only if your DAQ device does not have an internal circular buffer.
    https://decibel.ni.com/content/docs/DOC-20403
    Alternatively, if you are using a DAQmx device, consider using the device buffer for sharing DAQ data between loops.
    Br,
    /Roger
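For reference, the core of an overwrite-on-full circular buffer like the one linked above can be sketched in a few lines. This is a generic Python illustration of the idea, not the code from DOC-20403:

```python
class CircularBuffer:
    """Fixed-size buffer that overwrites the oldest sample when full."""

    def __init__(self, size):
        self.data = [None] * size
        self.size = size
        self.head = 0    # next write position
        self.count = 0   # number of valid samples

    def write(self, sample):
        self.data[self.head] = sample
        self.head = (self.head + 1) % self.size
        self.count = min(self.count + 1, self.size)

    def read_all(self):
        """Return the buffered samples, oldest first."""
        if self.count < self.size:
            return self.data[:self.count]
        return self.data[self.head:] + self.data[:self.head]
```

Writes are O(1) and memory stays fixed, which is why this suits continuous measurement streams better than an unbounded queue.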

  • Stream data from a subvi to the main vi - path refnum

    hello everyone
    sorry for my English
    http://forums.ni.com/t5/LabVIEW/stream-data-from-a-subvi-to-the-main-vi/m-p/2205150/highlight/true#M...
    - refnum boolean worked
    - refnum graph worked
    - refnum numeric worked
    how do I create a path refnum to stream data from a subVI to the main VI?
    where do I start?
    thanks

    Good morning Saille,
    I believe that what you have today is this (a very simplified drawing):
    Today you have the Meter Application + USB Driver controlling your meter. Basically, your application is divided into three layers:
    Application - where the program's main functionality lives (user interface, data presentation, file generation, etc.)
    VISA - a software architecture for instrument control. Basically, it communicates with the USB driver in order to send and receive data packets over the USB bus.
    Device Driver - low-level software instructions to control a peripheral over a bus.
    In the drawing, I point out two layers where you can try to act in order to automate your measurements:
    Application - You can use VI Server to control the Meter Application (if that application was developed in LabVIEW, which I assume to be true). However, you need to find out whether the product's developer allowed this (see Using VI Server to Pass Data Between a VI and a LabVIEW Executable). Another alternative is to talk to the manufacturer to see whether they have developed an API (Application Programming Interface) that lets you control the instrument directly.
    VISA - LabVIEW provides an API called NI-VISA for sending and receiving information over several buses (see Serial Instrument Control Tutorial)
    It is also possible to access the driver directly, but I don't recommend it. The result isn't worth the effort!
    I hope this has cleared up your questions, and I hope you succeed with your application!!
    Best regards,
    Felipe Flores
    Applications Engineering
    National Instruments Brasil

  • Streaming data to LONG columns in Oracle 7.3.2.3.0

    I am trying to stream data to a LONG column. I'm using Oracle Server 7.3.2.3.0 on AIX and JDBC driver 8.0.4 on Windows NT 4 SP5.
    I include sample tables/programs at the end, but here's the summary of what's happening:
    I'm creating a byte array of length 2500. If I use setAsciiStream I get the following exception when I execute the prepared statement:
    java.sql.SQLException: Data size bigger than max size for this type
    at oracle.jdbc.dbaccess.DBError.check_error(DBError.java)
    at oracle.jdbc.ttc7.TTCItem.setArrayData(TTCItem.java)
    at oracle.jdbc.driver.OraclePreparedStatement.setItem(OraclePreparedStatement.java)
    at oracle.jdbc.driver.OraclePreparedStatement.setAsciiStream(OraclePreparedStatement.java)
    at TestOracle.main(TestOracle.java:26)
    If I use setBinaryStream I get this exception:
    java.sql.SQLException: ORA-01461: can bind a LONG value only for insert into a LONG column
    at oracle.jdbc.ttc7.TTIoer.processError(TTIoer.java)
    at oracle.jdbc.ttc7.Oall7.receive(Oall7.java)
    at oracle.jdbc.ttc7.TTC7Protocol.doOall7(TTC7Protocol.java)
    at oracle.jdbc.ttc7.TTC7Protocol.parseExecuteFetch(TTC7Protocol.java)
    at oracle.jdbc.driver.OracleStatement.doExecuteOther(OracleStatement.java)
    at oracle.jdbc.driver.OracleStatement.doExecuteWithBatch(OracleStatement.java)
    at oracle.jdbc.driver.OracleStatement.doExecute(OracleStatement.java)
    at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java)
    at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java)
    at oracle.jdbc.driver.OraclePreparedStatement.execute(OraclePreparedStatement.java)
    at TestOracle.main(TestOracle.java:27)
    My Oracle7 manual states that LONG columns can store 2GB of text.
    I tried the above with LONG RAW columns and it worked fine.
    Can anyone explain why I get this error? I've tried it with different sizes, and when the data is <2000 bytes it works fine for LONG columns.
    My table is simple:
    create table TestLongs (key INTEGER PRIMARY KEY, data LONG);
    My Java code is also very simple:
    import java.io.ByteArrayInputStream;
    import java.sql.*;

    public class TestOracle {
        public static void main(String[] args) {
            Connection con = null;
            PreparedStatement pstmt = null;
            try {
                Class.forName("oracle.jdbc.driver.OracleDriver");
                con = DriverManager.getConnection(
                    "jdbc:oracle:thin:@itchy:1526:test",
                    "System", "<OMITTED>");
                byte[] data = new byte[2500];
                for (int i = 0; i < 2500; i++)
                    data[i] = 53;
                String sql = "INSERT INTO TestLongs (key, data) VALUES(1, ?)";
                pstmt = con.prepareStatement(sql);
                ByteArrayInputStream bis = new ByteArrayInputStream(data);
                pstmt.setAsciiStream(1, bis, data.length);
                pstmt.execute();
            } catch (SQLException e) {
                System.err.println("An error occurred with the database: " + e);
                e.printStackTrace();
            } catch (Exception e) {
                System.err.println("Oracle JDBC driver not found. " + e);
                e.printStackTrace();
            } finally {
                try {
                    if (pstmt != null)
                        pstmt.close();
                    if (con != null)
                        con.close();
                } catch (SQLException e) {
                    System.err.println("Unable to close statement/connection.");
                }
            }
        }
    }

    Robert Greig (guest) wrote:
    : I am trying to stream data to a LONG column. I'm using Oracle Server 7.3.2.3.0 on AIX and JDBC driver 8.0.4 on Windows NT 4 SP5.
    I tried it with the old 7.3.x JDBC driver and it works fine. I also noticed after further testing that it sometimes worked with the 8.0.4 driver. Looks like a bug in the 8.0.4 driver or some wacky incompatibility.

  • How to stream data from TDS3000?

    Hi there,
    I would like to stream data from my scope into LabVIEW for further analysis.
    Hardware: Tektronix TDS3014C
    Software: LabVIEW SignalExpress 2.5.1 + Tektronix Extensions
    I don't know exactly when the interesting transient signal appears, so I would like to save ~5-10 s of streamed data.
    Right now I'm only getting fractions of about 2 us, and then the scope switches into waiting for a trigger.
    Is there a way to deactivate the trigger, or a keyword I could search for?
    Thanks for your help,
    nook

    muks,
    I think you got this post confused with another.
    nook,
    You can rarely stream data continuously from a GPIB scope. Check the manual, but often the scope cannot transmit at the same time it is acquiring, so you get a sequential operation: wait, trigger, acquire, transfer, repeat.
    Can't you set the trigger for the transient?
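To see why that sequential cycle yields only "fractions" of the signal, a back-of-the-envelope calculation helps. The transfer time below is an assumed figure for illustration, not a TDS3014C specification:

```python
# All numbers are hypothetical, just to illustrate the sequential cycle.
ACQUIRE_S = 2e-6     # the scope captures ~2 us per record (as in the question)
TRANSFER_S = 50e-3   # assumed GPIB readout dead time per record

def coverage(total_s):
    """Fraction of the signal actually captured by the
    wait -> trigger -> acquire -> transfer -> repeat cycle."""
    cycle_s = ACQUIRE_S + TRANSFER_S
    records = int(total_s / cycle_s)
    return records * ACQUIRE_S / total_s
```

With these assumed numbers, well under 0.1% of a 10 s window is ever captured, which is why triggering on the transient (rather than streaming) is usually the practical answer.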

  • Using TDM VI in Lv 7.1 to stream data to disk

    I found the TDM VIs, and the structure of the data is very interesting for my application.
    I tried to use them to stream an array of waveform data, but the TDM VIs are very slow.
    After that, I saw in the documentation that TDM files can't support streaming to disk, but I think that's really a basic function.
    Furthermore, the TDM VIs write binary data, so normally it should work!
    Is there another possibility with TDM files to stream data to disk?
    Thanks in advance for your help.
    GHELEYNS Nicolas
    Phénix Industries s.a.

    Hi,
    The LabVIEW Storage VIs do not yet support streaming data to disk. However, you can still benefit from the Storage VIs' structured approach to saving data:
    - Stream the data to disk using standard binary write VIs
    - After the acquisition, read the data back in and write it out again using the Storage VIs
    - Delete the original binary file
    Although there is the extra step of writing data to an intermediate binary file, you benefit because you can structure your data and save it with the descriptive properties that the Storage VIs offer.
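The three steps above can be sketched generically in Python. The file names and the JSON "structured" format are stand-ins for the real TDM/Storage VI output, just to show the two-pass flow:

```python
import json
import os
import struct
import tempfile

# Hypothetical paths; the real app would produce a TDM file in step 2.
workdir = tempfile.mkdtemp()
raw_path = os.path.join(workdir, "acq.bin")
final_path = os.path.join(workdir, "acq.structured")

# Step 1: during acquisition, stream raw doubles with plain binary writes
samples = [0.1 * i for i in range(100)]
with open(raw_path, "wb") as f:
    for s in samples:
        f.write(struct.pack("<d", s))

# Step 2: after acquisition, rewrite with descriptive properties attached
with open(raw_path, "rb") as f:
    data = [struct.unpack("<d", f.read(8))[0] for _ in range(100)]
with open(final_path, "w") as f:
    json.dump({"properties": {"channel": "ch0", "units": "V"},
               "data": data}, f)

# Step 3: delete the intermediate binary file
os.remove(raw_path)
```

The trade-off is the same as in the reply: the raw write keeps up with acquisition, and the structure/metadata is added in a cheap offline pass.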
    Regards.
    JorisV

  • ABAP PROGRAM " Streaming DATA " is not working in SAP ITS 6.20

    dear all,
    I am trying to create an application in ABAP (WAS 6.20); the concept of the application is real-time streaming of data. In the SAP GUI the application works properly.
    In the second scenario I published this application to SAP ITS 6.20; when we run the program in a browser, the application can't stream data in real time.
    Is there any configuration in SAP ITS to make my application stream data (real time)?
    thanks for your help
    rgds
    echo

    Hi Echo,
    Please check function module aleweb_download.
    You might also check with service it00 (up- download) how this works.
    Thanks and regards,
    Dieter

  • Develop streaming data processing applications in C# with Stream Computing Platform and Storm in HDInsight. Can this be done with Visual Studio Community sign up?

    Hello,
    I am a student and love Visual Studio Community 2013 for implementing some of my research projects. I am currently working on a project that involves streaming data analysis. I found this article ( http://azure.microsoft.com/en-us/documentation/articles/hdinsight-hadoop-storm-scpdotnet-csharp-develop-streaming-data-processing-application/ ) but do not have an MSDN subscription (I cannot afford it) as required in the article. Can this be done somehow with a Visual Studio Community 2013 sign-up?
    Thank you all in advance for your time and for your help.
    J.K.W

    Hi,
    I just confirmed that the key point with Visual Studio Community is that, although it is free like Express, it does not have the limitations that Visual Studio Express had. So, to answer your question: yes, you can do all your development work as a student with a VS Community 2013 sign-up. You can also refer to this blog for more details -
    http://blogs.msdn.com/b/quick_thoughts/archive/2014/11/12/visual-studio-community-2013-free.aspx
    Regards,
    DebarchanS
    DebarchanS - MSFT

  • Use labview to collect AE streaming data from pacpci2

    Hello everyone,
    I'm using LabVIEW to collect AE streaming data from a pacpci2. I have the DLL from PAC, and I have written some VIs for streaming data. Now I don't know how to read data into my program from the PCI card. Can the "getMessage" function do this?
    Hope somebody with this experience can help me. Thanks a lot.
    henry

    No idea because you are leaving all the details out of your message.
    What does "AE" mean in the way you use it?
    What is "pacpci2"?
    What PAC are you using?
    What "Get Message" function are you talking about?

  • For clearing streaming data on stop button of FMLE

    Hi
    I am using FMLE 3.2 for streaming and AMS 5.
    I have to clear the stream data from the application on the click event of the stop button in FMLE 3.2.
    How can I do that? Please help me.
    I searched for this, and in the end I found:
    http://help.adobe.com/en_US/adobemediaserver/devguide/WSd391de4d9c7bd609-52e437a812a3725df a0-8000.2.3.html#WS5262178513756206-67d61d971377ecd8c14-8000
    but I don't understand where to use that tag, and in which file?
    Please help me.
    Thanks in Advance.

    I do not have any insight into your application architecture or the changes you have made, so it will not be easy to explain. Let me try one more time.
    You have added a tab and embedded your own WDA interface view in the component configuration. When an event is triggered, the FPM transfers control to the component that belongs to the UIBB, in this case your custom component.
    Your custom component does not have any knowledge of the other UIBBs in the application.
    The solution I proposed was to analyse the current application and see whether IF_FPM_SHARED_DATA and IF_FPM_TABBED_CONF_EXIT are implemented.
    If they are, your work is simplified: all events first come to OVERRIDE_CONFIG_TABBED, so you can capture your save event in OVERRIDE_CONFIG_TABBED and handle it there.
    If the above interfaces are not implemented, then it is difficult to get things done the way you would like.

  • When I save a pages document as a pdf and then try to email the document, it loads as an application octet-stream dat file instead of a pdf. What's happening?

    When I save a Pages document as a pdf and then try to email the document, it loads as an application octet-stream dat file instead of a pdf and recipients can't open it. Some load correctly, but about half don't load as a pdf. What's happening?

    Hi mjmonck-
    Which email provider are you using? There are different solutions possible for different providers. Yahoo is having trouble with attachments, outside of any Firefox issue. I've seen a Google user fix this by manually adding the mail URL to their ad block program- which mistakenly perceived the webmail page as an advertisement.
    Hope this helps.

  • AXI Stream Data FIFO sends data automatically

    Hi all!
    I've been doing a few beginner experiments with AXI peripherals, following some online tutorials on how to create AXI peripherals and connect them to the PS on my Zynq board. So far I've managed to successfully create an AXI DMA to send data from the PS to a PL AXI Stream Data FIFO and vice versa. However, if I expand my PL with a custom IP with an AXI Stream interface and connect it to the AXI Stream slave interface of my AXI Stream Data FIFO to send data to the FIFO, my Linux application gets stuck when trying to read the FIFO's data via DMA. After some simulations and debugging I figured out that my FIFO is empty, which causes my application to hang because the DMA waits for data.
    For this reason I separated my custom IP and FIFO from the DMA, to test and debug only the connection between the custom IP and the FIFO without the influence of the DMA. Thus the master interface of the FIFO is not connected. Every second my custom IP should open a transfer and send 10 32-bit values to the FIFO, where they should be stored and not sent on.
    However, debugging this PL with an ILA core shows that my FIFO stops being ready-to-receive after receiving and storing 3 32-bit values. It then waits until it sees a high TLAST from the custom IP, and during this waiting period it does not store my remaining values. After TLAST goes high with the 10th 32-bit value, the FIFO's TVALID is instantly set high and the FIFO starts to transmit the data. This is the point where I get confused: why does the FIFO start to transmit when there is no receiver connected to the FIFO's AXI Stream master interface? To my knowledge the FIFO needs TREADY high from a slave peripheral to start a transmission.
    I believe this behaviour is the reason the values are not being stored in the FIFO.
    Hopefully somebody knows what's going wrong here or what I am doing wrong. I really appreciate any help because I have been trying to solve this problem for 2 weeks without any success. It is really frustrating...
    Please see attachments for better understanding. If you need further information let me know.
    Best regards,
    timmaexd
     

    Thanks for replying and checking my diagrams! Fortunately, I was able to solve the problem.
    After you advised me to check m_tready, I found that this signal was constantly high: after synthesis it had been automatically tied to VCC.
    So I connected the DMA to the FIFO, replacing VCC with the DMA, and the data was then stored correctly.
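The handshake rule at the heart of this thread (and of the VCC-on-m_tready bug) can be modelled with a toy simulation. This is plain Python to illustrate the behaviour, not an RTL model of the Xilinx FIFO:

```python
def axis_transfer(tvalid, tready):
    """A beat moves from master to slave only when TVALID and TREADY
    are both high in the same cycle (the AXI-Stream handshake rule)."""
    return tvalid and tready

def run_fifo(beats_in, downstream_tready):
    """Toy FIFO: accepts beats, then forwards them only while the
    downstream slave asserts TREADY."""
    stored = list(beats_in)
    sent = []
    while stored and axis_transfer(True, downstream_tready):
        sent.append(stored.pop(0))
    return stored, sent

# m_tready tied to VCC (the bug): the FIFO drains immediately
stored, sent = run_fifo([1, 2, 3], downstream_tready=True)

# m_tready low (no receiver connected): everything stays buffered
stored2, sent2 = run_fifo([1, 2, 3], downstream_tready=False)
```

With m_tready forced high the FIFO "sees" a willing receiver and transmits, exactly the mystery the poster observed; with it low, the data stays buffered as expected.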
     

  • Streaming data into LONG results in ORA-24307

    I am migrating an application to a new environment. During testing I am receiving the following error when trying to insert a record into an Oracle 8.1.7.4 table:
    Error saving report: update BlobPiece osetpi(): ORA-24307: invalid length for piece
    The table has a column defined as LONG. I am using a prepared statement and setAsciiStream to perform the insert. I have been able to determine that the error happens when the data being streamed exceeds 4000 bytes (i.e. 4001 bytes).
    Connection to the table is made by using a datasource with the ORACLE_OCI driver. If I change the connection to use an Oracle thin driver, the insert is performed (code provided below).
    I would prefer not to change the connection method if possible. Any help resolving this problem would be appreciated. Thanks.
    Here is the insert code being used:
    public void create(SavedReportImpl report, String userId)
    throws ReportManagerException {
    StringBuffer sql = new StringBuffer();
    // NOTE: Must utilize a prepared statement since this insert contains
    // an Oracle Long column type
    java.sql.PreparedStatement statement = null;
    String modelCd = report.getModel().getModelCd().toUpperCase();
    if (modelCd.length() != 7) {
        modelCd = " " + modelCd;
    }
    sql.append("INSERT ");
    sql.append("INTO ");
    sql.append("PRCT012 ");
    sql.append("(");
    sql.append("MODL_YR_NBR, ");
    sql.append("PRICE_CD, ");
    sql.append("MDSNG_MODL_DESGTR, ");
    sql.append("EFFECTIVE_DT, ");
    sql.append("REPORT_TP, ");
    sql.append("REPORT_SUBTP, ");
    sql.append("RESTRICTION, ");
    sql.append("PRICE_DESC, ");
    sql.append("VEHICLE_LINE, ");
    sql.append("MODL_DESC, ");
    sql.append("CURRENCY_CD, ");
    sql.append("CURRENCY_NM, ");
    sql.append("SOP_INDCTR, ");
    sql.append("LAST_UPDT_USERID, ");
    sql.append("LAST_UPDT_TMSTM, ");
    sql.append("REPORT_HTML_STRING) ");
    sql.append("VALUES (" );
    sql.append("'" + report.getModel().getModelYear() + "', ");
    sql.append("'" + report.getPriceCd().toUpperCase() + "', ");
    sql.append("'" + modelCd + "', ");
    sql.append("'" + dateFormatter.formatDatetoDBDateString(report.getEffectiveDate()) + "', ");
    sql.append(report.getReportType() + ", ");
    sql.append(report.getReportSubtype() + ", ");
    if (report.getModel().getRestriction() == null || report.getModel().getRestriction().length() == 0) {
        sql.append("NULL, ");
    } else {
        sql.append("'" + report.getModel().getRestriction().toUpperCase() + "', ");
    }
    sql.append("'" + report.getPriceCdDescription() + "', ");
    sql.append("'" + report.getModel().getVehicleLine().toUpperCase() + "', ");
    sql.append("'" + report.getModel().getDescription() + "', ");
    sql.append("'" + report.getCurrency().getCurrencyCd().toUpperCase() + "', ");
    sql.append("'" + report.getCurrency().getCurrencyNm() + "', ");
    sql.append(report.isSOP() ? "'Y', " : "'N', ");
    sql.append("'" + userId.toUpperCase() + "', ");
    sql.append("'" + dateFormatter.formatDatetoDBDateTimeString(new java.util.Date()) + "', ");
    sql.append("?) "); // LONG COLUMN TO BIND
    //NOTE: only bind one column when there is an ORACLE LONG
    try {
    statement = connection.prepareStatement(sql.toString());     
    String html = com.eds.csdd.util.StringUtils.replaceAll(report.getHTML(), "'", "''");
    byte[] bytes = html.getBytes();
    java.io.InputStream is = new java.io.ByteArrayInputStream(bytes);
    statement.setAsciiStream(1, is, bytes.length);
    int resultCode = statement.executeUpdate();
    } catch (java.sql.SQLException sqle) {
    throw new ReportManagerException(sqle.getMessage());
    } catch (Exception e) {
    e.printStackTrace();
    throw new ReportManagerException(e.getMessage());                    
    } finally {
        if (statement != null) {
            try { statement.close(); } catch (java.sql.SQLException re) {}
        }
    }
    }
    Here is the connection code using the datasource:
    public java.sql.Connection getConnection(String dataSourceName) throws java.sql.SQLException {
    javax.naming.InitialContext               dsCTX          = null;
    javax.sql.DataSource               ds1          = null;
    java.sql.Connection                    conn          = null;
    String dataSource = rte.getProperty(dataSourceName, "datasource","datasourcenotfound");
    try {
         dsCTX = new javax.naming.InitialContext();
         ds1 = (javax.sql.DataSource)dsCTX.lookup("java:comp/env/" + dataSource);
    } catch (javax.naming.NamingException e){
         throw new java.sql.SQLException("Naming Exception:" + e.getMessage());
    }
    if (ds1 == null)
         throw new java.sql.SQLException("datasource not provided");
    else
         conn = ds1.getConnection();
    return conn;
    }
    Here is the connection code using the thin driver:
    public java.sql.Connection getConnection(String dataSourceName) throws java.sql.SQLException {
    java.sql.DriverManager.registerDriver (new oracle.jdbc.driver.OracleDriver());
    // open a connection to the database
    java.sql.Connection conn = java.sql.DriverManager.getConnection (
                             "jdbc:oracle:thin:@###.##.###.###:1521:SID",
                             "userid",
                             "password");
    return conn;
    }


  • How do I get websites that have streaming data to work?

    I use Firefox 1.5.0.7 as my internet browser. There are two websites I want to receive streaming data from. One is a radio station (www.fan960), the other is a naturecam site (www.orca-live.net/community/index.html). Both sites tell me I have to download various players (Adobe player and Flash player), yet I download them, follow the installation instructions, and they still do not work.
    I post this here because I wonder why my MacBook Pro does not already have a player that would work for these purposes. One reason I decided to buy a Mac was the selling point that they work right out of the box. Any suggestions as to what I should do to enjoy these websites to their full extent?

    Personally, I use iTunes to play radio. It works great.
    So, I am told, does QuickTime.
    How would you go about adding a station to iTunes, as these are not listed as available stations?

  • Exceptions, odd behavior of streaming data to outputstream

    I have a servlet which writes mp3 data to the dataoutputstream of a servlet response object. For some reason, the servlet method writes the data out and gets an exception. Then the method/servlet is called again automatically and begins to write the data out again. The exception is below. In the end the mp3 is delivered to the client fine, however with server side exceptions and odd behavior.
    try {
        int len = 0;
        resp.setContentType("audio/mpeg");
        String filename = req.getParameter("file");
        File mp3 = new File(mediaDir + filename);
        byte[] buf = new byte[1024];
        FileInputStream fis = new FileInputStream(mp3);
        DataOutputStream o = new DataOutputStream(resp.getOutputStream());
        while ((len = fis.read(buf)) != -1) {
            o.write(buf, 0, len);
        }
        o.flush();
        resp.flushBuffer();
        o.close();
        fis.close();
    } catch (Exception e) {
        System.out.println(e.getMessage());
        e.printStackTrace();
    }
    ClientAbortException:  java.net.SocketException: Connection reset by peer: socket write error
         at org.apache.catalina.connector.OutputBuffer.realWriteBytes(OutputBuffer.java:366)
         at org.apache.tomcat.util.buf.ByteChunk.flushBuffer(ByteChunk.java:403)
         at org.apache.tomcat.util.buf.ByteChunk.append(ByteChunk.java:323)
         at org.apache.catalina.connector.OutputBuffer.writeBytes(OutputBuffer.java:392)
         at org.apache.catalina.connector.OutputBuffer.write(OutputBuffer.java:381)
         at org.apache.catalina.connector.CoyoteOutputStream.write(CoyoteOutputStream.java:76)
         at java.io.DataOutputStream.write(DataOutputStream.java:90)
         at TrafficControl.streamAudio(TrafficControl.java:639)
         at TrafficControl.processRequest(TrafficControl.java:136)
         at TrafficControl.doGet(TrafficControl.java:61)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:689)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:802)
         at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:252)
         at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:173)
         at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:213)
         at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:178)
         at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:126)
         at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:105)
         at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:107)
         at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:148)
         at org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:856)
         at org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.processConnection(Http11Protocol.java:744)
         at org.apache.tomcat.util.net.PoolTcpEndpoint.processSocket(PoolTcpEndpoint.java:527)
         at org.apache.tomcat.util.net.LeaderFollowerWorkerThread.runIt(LeaderFollowerWorkerThread.java:80)
         at org.apache.tomcat.util.threads.ThreadPool$ControlRunnable.run(ThreadPool.java:684)
         at java.lang.Thread.run(Thread.java:595)
    thanks
    Edited by: black_lotus on 19-Feb-2009 3:08 PM

    There are some versions of some browsers (MS IE) that can call a servlet twice; they only look at the headers at the first request, in order to decide whether to display a "save or open" dialog, or some such reason. Try different browsers; also log the User-Agent header to see if it is "contype", which is present when the multiple request thing happens.
    http://support.microsoft.com/default.aspx?scid=kb;EN-US;q293792
    "Connection reset" can also happen if the client closes the connection without reading the entire response. Occasional resets will happen as users cancel download.
    Not a source of exceptions but something you may still want to consider when sending large responses: by default, the servlet container will have to buffer the entire file in memory to find out its length. To save memory, either set content length before writing the data, or use chunked encoding (google should find details.) That way your write() actually streams data to the user.
