VISA read timeout

I don't know why I'm getting a timeout error at the VISA Read function (the error says "Timeout expired before operation completed"). I made sure the LabVIEW serial port and my device's serial port are configured exactly the same. I even tested HyperTerminal with the same port settings as in LabVIEW and it works perfectly, but my LabVIEW program gives me an error. My device terminates each command it sends with a carriage return, so I set up VISA Read to stop reading data when it encounters the CR character. I also made sure the carriage return was in fact being sent by my device. Any thoughts?

I think it's not seeing your termination character on the read and/or write.  When I send a string to the serial bus, I always use "Concatenate Strings" to add the appropriate termination character(s) to the end.  Assuming your read termination is CR/LF, I've always had better luck stopping on the LF and stripping the CR with "Trim Whitespace."
Also, don't be afraid of making the timeout a second or two.  If everything is going right, it will (normally) give the serial port plenty of time to complete its operation, but if things are going wrong, it will give you time to "notice" something is wrong.
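If it helps to see the same settings spelled out in text form, here is a rough PyVISA (Python VISA binding) sketch of that advice; the port name and command are made-up placeholders:

  import pyvisa

  rm = pyvisa.ResourceManager()
  inst = rm.open_resource("ASRL1::INSTR")  # hypothetical COM1; use your own port
  inst.timeout = 2000                      # ms -- a second or two, as suggested above
  inst.read_termination = "\n"             # stop the read on the LF
  inst.write_termination = "\r\n"          # appended to every command, like Concatenate Strings

  inst.write("MEAS?")                      # hypothetical command; CR/LF is added automatically
  reply = inst.read().strip()              # strips the leftover CR, like Trim Whitespace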
Bill
(Mid-Level minion.)
My support system ensures that I don't look totally incompetent.
Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.

Similar Messages

  • How to run vi continuously when VISA read timeout happens

    Hello,
    I am using LabVIEW 2010 to read an Agilent 6000 series oscilloscope. The oscilloscope reads data from another experimental machine, which fails occasionally due to sample failure. When the machine fails, it stops sending signal to the oscilloscope, which leads to a VISA Read timeout error, and the whole VI is terminated. Is there a way to keep the VI running when the error happens? For example, when the error happens, it stores the error in the error wire; when the VI reads the error code from the wire, it sends a command to stop my experimental machine and then stops the VI.
    So far, it simply stops while executing the VISA Read function. In that case, why do we have error in and error out? The VI simply stops when the error happens.
    Thanks
    Lawrence

    I'll start off with automatic error handling. By default, LabVIEW enables automatic error handling. So consider a case like the one below.
    If, let's say, the VISA Read returns an error but you did not pass the error information to VISA Close, what will happen is that LabVIEW will highlight the VISA Read (since the error comes from that function), pause execution at the VISA Read, and show an error dialog box. The dialog box will prompt you to either continue (move on to the VISA Close) or stop (LabVIEW will stop the VI at the VISA Read).
    Now, consider another scenario like this.
    By passing the error information from one function to another, let's say the VISA Read returns an error; that error will pass to VISA Close and finally to Simple Error Handler.vi. Simple Error Handler will then generate the error dialog box after all the VISA operations are completed. This is called manual error handling (it is enabled by simply wiring the error information from one function to the next, terminating at Simple Error Handler).
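    The same two styles can be written out in text form. Here is a rough PyVISA analogy (the resource name and commands are placeholders), where the try/finally plays the role of the error wire running through VISA Close into Simple Error Handler, and the except branch does exactly what Lawrence asked for: catch the timeout, stop the machine, and carry on cleanly:

      import pyvisa
      from pyvisa.errors import VisaIOError

      rm = pyvisa.ResourceManager()
      scope = rm.open_resource("GPIB0::7::INSTR")  # hypothetical oscilloscope address
      try:
          data = scope.query(":WAV:DATA?")         # hypothetical query; may time out
      except VisaIOError as e:
          scope.write(":STOP")                     # hypothetical "stop the machine" command
          print("VISA error handled:", e)          # Simple Error Handler equivalent
      finally:
          scope.close()                            # VISA Close always runs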
    http://www.ni.com/gettingstarted/labviewbasics/handlingerrors.htm
    As for VISA Close, here is its help page:
    http://zone.ni.com/reference/en-XX/help/371361K-01/lvinstio/visa_close/
    See that button on the left side of this post...
    If you feel my post is helpful, all you need is just (at most) 2 seconds to click that button, to show your appreciation. Thank you~~

  • Visa Read Timeout Occurs with multiple Reentrant VI Calls

    I have written a test application in LabVIEW (6.1) which will be used to test (burn in) up to 15 serial instruments through a 16-port USB-to-RS232 hub. Here's how it works:
    When the app loads, I transmit a Connect command to each of 15 COM ports (one at a time) using VISA. If I receive the proper response from the unit on that port, I add the port to an array and continue on to the next system. Once I've found all systems on the hub, I wire my array of active VISA references to a For Loop in which I open up to 15 reentrant VIs which run in the background in parallel. Each of these reentrant VIs (all identical with the exception of the VISA resource they use) sends commands to its respective instrument and receives a response. One function in particular, "Get Unit Status", is important: the response determines whether or not the instrument is functioning correctly.
    Here's the problem: in my main loop, I am continuously acquiring indicator values from each of the reentrant VIs running in the background. After a period of time (not consistent) I will lose communication with a port (the symptom is no response from the unit). I've looked closely at the comm engine I created and found that the VISA Write function completes without error; then, when I perform a VISA Read, I immediately get the "Timeout occurred before operation completed" error (please keep in mind that this occurs after 100-5000 successful writes/reads). Eventually another port will drop out, followed by another. This then stops occurring, and the remaining systems run to completion without a problem.
    Some background on how I'm setting up my VISA sessions...
    When I originally scan for systems (before I load and run the reentrant VIs):
    - Init Visa Port
    - 19200, 8, N, 1
    - Use Termination = True
    - Timeout = 400 ms (I've tried larger values already); 400 ms should be plenty
    - Termination Char = 13 (\r)
    - Open Visa Session
    - Visa Write "CONN18\r" (the command required to connect to my instrument)
    - Visa Read with 1 for the requested byte count, to read 1 byte at a time, concatenating the results until \r is received (or a 1000 ms timeout occurs -- this is not a VISA timeout). I've also tried 16 for the requested byte count and just waiting for VISA to time out -- both methods work.
    Once all 16 ports are scanned I Close ALL of the ports using the Visa Close Function.
    It is important to know at this time that I "AM" using proper wiring flow to ensure open occurs before write, write occurs before read, etc.
    I'm assuming at this time that all of my Visa sessions are closed.
    On to the Reentrant VIs:
    Inside each reentrant VI I first Initialize all of my variables and Init/Open a 'New'? Visa session using the same parameters mentioned above.
    Then I enter the "Run" case structure where all of the communication begins.
    I am using the same communications engine to operate the instrument as before (the only difference being that all of the VIs in the comm engine are now reentrant and operate at higher priorities). I have actually saved two different versions of the engine (one for the reentrant calls and one for when I first scan for systems from my main GUI).
    When I init the reentrant VI, I place the Duplicate Visa Resource output of my VISA Open function on a shift register. When I enter the Run case, it takes the resource from the register on the left, wires it through any comm engine VIs, then back out to the shift register on the right, and keeps going for a 12-hour period or until "Get Unit Status" has returned 60 naughty results.
    On my main GUI I am continuously (every 500 ms) getting certain indicator values from each reentrant VI, and I am also setting some control values on each reentrant VI. There is no VISA interaction between the reentrant VIs and the main GUI.
    As I said earlier, up to 15 systems will run for a time, then one will stop responding, followed by another, and another until a few remaining systems will run to completion.
    Any advice as to why I'm encountering the timeouts with the VISA Read function as I have mentioned would be appreciated. I managed to find one suggestion which uses the Bytes at Port function to ensure there is data at the port before doing a read; otherwise, skip the read and retry the whole operation. I haven't tried this yet.
    Sorry for the wordiness of my question. If anyone would like some screen shots of portions of my code (I can't submit the actual code because some of it is confidential) I'd be happy to post them.
    Doug.

    Hi Doug,
    The first thing I would recommend is the solution you have already found, to check and see if there is data at the port before attempting a read. I would be interested to see if this will solve the problem. Does there seem to be any trend to which ports have this timeout error? How many ports does it cut down to before operation seems to continue as expected? Does this number vary, or is it always the same number of ports? I think the best thing to do will be to identify constant attributes of how the error is occurring so that we can narrow it down and see what is going on.
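    In text form, the suggested check looks roughly like this with PyVISA (the port name and command are placeholders):

      import time
      import pyvisa

      rm = pyvisa.ResourceManager()
      inst = rm.open_resource("ASRL5::INSTR")   # hypothetical port
      inst.timeout = 400                        # ms, matching the setup described above

      inst.write("STAT?")                       # hypothetical status query
      time.sleep(0.05)                          # give the instrument a moment to answer
      if inst.bytes_in_buffer > 0:              # the "Bytes at Port" check suggested above
          reply = inst.read()
      else:
          print("no data at port; skip the read and retry the whole operation")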
    John M

  • VISA Read Timeout and VISA Write not working

    I'm writing a program that sets a triangle wave pattern voltage (using the Triangle Wave VI) on a DC power source (GW Instek PST-3202) through an RS-232 connection. I know my instrument parameters are set up correctly because I've been able to control this power source using a different VI with the same parameters. I also know I am connected to the power source because I can talk to it using MAX. I have a string indicator before my VISA Write statement that shows me the command being sent to VISA Write. It shows the correct syntax for setting a voltage on this instrument. However, when I run the VI, the voltage is not changed on my instrument, even though the correct command is sent to the instrument. After the VISA Write operation I have inserted a VISA Read function in order to read the newly set voltage from both a numeric and a waveform indicator. Using the error out cluster I am able to see that there is a timeout error from VISA Read. I thought perhaps I needed to clear or flush the buffer after each read, but that didn't work either. I'm stumped as to why VISA Write isn't writing to the device. I have included my VI below, as well as the initialization subVI for the power supply.
    Attachments:
    Triangle Volt Mod.vi 27 KB
    GW Instek initialization.vi 18 KB

    Yep, you need to append the End of Line character to the end of your command. And for your read to work properly, you need to request the data; that requires a query command to be sent. Look closely at the differences in command and communication structures between your working VI and your mod.
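    In PyVISA terms (the port and SCPI strings here are guesses, not the PST-3202's verified syntax), the pattern is:

      import pyvisa

      rm = pyvisa.ResourceManager()
      psu = rm.open_resource("ASRL3::INSTR")   # hypothetical port
      psu.write_termination = "\n"             # the End of Line character gets appended
      psu.read_termination = "\n"

      psu.write(":CHAN1:VOLT 1.5")             # a set command produces no reply...
      volts = psu.query(":CHAN1:VOLT ?")       # ...only a query does; query = write + read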
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • VISA READ timeout error - multiple GPIB resources

    Hi,
    I am working on a three-instrument GPIB network (optical attenuator, fiber amplifier, spectrum analyzer), controlled using VISA sessions in LabVIEW. When run separately, the three corresponding VIs (which are located in three different LabVIEW projects) work as expected. However, when they are run simultaneously, one of them gives VISA Read -1073807339 timeout errors. These errors seem to happen when another instrument is sending or receiving data or instructions at the same time as it is.
    The exact context for these errors is either:
      -  another VI is running, which includes sending several queries and reading the answers every 100 ms,
      -  upon starting the failing VI, I get a timeout error from one of the first subVIs containing a VISA Read operation to be executed (sometimes initialize.vi (in state 1), sometimes one of the subVIs run from the Idle state (state 0) upon timeout of the event structure).
    or:
      -  the failing VI is running,
      -  upon starting another VI, which includes repetitively sending queries and reading answers, the failing VI throws an error from one of the first subVIs containing a VISA Read operation to be executed (one of the subVIs run from the Idle state (state 0) upon timeout of the event structure).
    What I tried :
      -  gradually increasing the delay between the VISA Write and Read operations for the relevant instrument (from 10 ms to 10 s), to no avail. More puzzling is my observation that, when this VI is run alone, increasing the write/read delay leads to the same timeout errors. I could not find any mention of such behavior through Google and forum searches; hopefully this can point to a solution to the main issue,
      - switching between synchronous and asynchronous VISA WRITE / READ operations,
      - reordering the GPIB network from a star topology to a linear topology (all three instruments do have different GPIB addresses in case anyone wonders).
    My thoughts :
    It seems to me that the error is related to a delay introduced between a VISA query and its associated read operation by the transmission of another query to another instrument on the same GPIB network. However, I have no idea why transmitting a query to another instrument would introduce such a delay, or why this delay would lead to a timeout error (and only from one instrument, while the write/read VIs in each driver are basically the same). Hopefully a more experienced LabVIEW user will be able to shed some light on my issue.
    Included is the project containing the failing VI (main.vi) and the custom driver it makes use of. 
    Attachments:
    Manlight EDFA Control - Failing VI.zip 73 KB
    EXFO FVA 3150 - Driver.zip 348 KB

    Thank you for your input crossrulz. This is indeed what I realized while looking into semaphores.
    Let me first make our architecture clear so that I'm 100% certain we are talking about the same thing. We have an NI GPIB-USB-HS GPIB controller connected to a linear GPIB network of three instruments. I was convinced that a network allowing up to 15 instruments to be connected at the same time would allow for parallel operation, but it seems I was mistaken.
    I like how semaphores work, and I don't see any obstacle to gathering all these VIs into one project. My conception of a Labview project was that one Labview project was intended to gather subVIs, libraries and controls used in a more complex "main" VI, which would ultimately be made into a single standalone executable. It seems I was mistaken too and that a single Labview project should be used to gather several standalone VIs designed to work together, and their subVIs. Hopefully I got it right now.
    The other option that you suggest for accessing the same GPIB bus from different projects (having a TCP control interface running and controlling communications through the bus) might indeed be a bit overkill for what I'm trying to achieve, and I would need to spend too much time learning and developing it.
    A last option I looked into is the VISA Lock Async VI, but I don't yet understand (1) whether locking the VISA session for one instrument on the bus would lock the entire bus; (2) whether it would be possible to use this approach with VIs running in different projects; and (3) whether it would just yield errors when one VI tries to access the locked GPIB bus, instead of making it wait until the resource is available.
    I'll look further into these options today, but would appreciate any additional information / advice you might have. Thank you.
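    For what it's worth, a VISA lock written out in text form looks something like this in PyVISA (the address and query are placeholders). As I understand it, a pending lock waits up to its timeout rather than erroring out immediately, though whether it serializes the whole bus or just the one resource is exactly the open question above:

      import pyvisa

      rm = pyvisa.ResourceManager()
      edfa = rm.open_resource("GPIB0::10::INSTR")  # hypothetical address

      # Hold the lock only around each write/read pair.
      with edfa.lock_context(timeout=2000):        # ms; waits, then raises if still locked
          answer = edfa.query("POW?")              # hypothetical query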

  • VISA read timeout error

    Good day to all,
    I have this problem with my LabVIEW program. It is being used to interface with an energy meter to retrieve data. I am able to execute the VISA "Open" and "Write" functions, but am unable to execute "Read". Attached below is a screenshot of the error.
    An EKI-1521 is being used to bridge the serial connection from the meter to Ethernet, connecting to the computer running the LabVIEW program. Attached are the settings for the EKI-1521.
    This is extracted from the main program after narrowing the problem down to the VISA Read function. The subVIs in this program implement the meter's proprietary protocol and do not contribute to this read error.
    It might be interesting to note that I am able to run another program to interface with another meter using the Modbus protocol. It also uses the same EKI-1521 with the correct VCOM settings (this should rule the EKI-1521 out as the source of the fault).
    Thanks & Regards,
    Andrew
    Attachments:
    1.png 169 KB
    2.png 210 KB

    Hi,
    Stop me if I'm mistaken, but you don't set the VISA properties in your LabVIEW code (port type, baud rate, etc.).
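    For comparison, here is a text-form sketch of setting those properties with PyVISA (the port name and values are placeholders; they must match the meter's settings):

      import pyvisa
      from pyvisa.constants import Parity, StopBits

      rm = pyvisa.ResourceManager()
      meter = rm.open_resource("ASRL4::INSTR")  # hypothetical VCOM port from the EKI-1521
      meter.baud_rate = 9600                    # every value below must match the meter
      meter.data_bits = 8
      meter.parity = Parity.none
      meter.stop_bits = StopBits.one
      meter.timeout = 2000                      # ms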
    Best regards,
    V-F

  • VISA Read over USB Problem: After a while the 0xBFFF0015 Timeout Error occurred

    Hi,
    I have trouble using LabVIEW with my non-NI USB device:
    The device is an analog input DAQ board. I was able to set up communication with the board using a VISA driver created specifically for this board and direct firmware calls, using the product's firmware specification provided by the manufacturer.
    This method has been working pretty well so far, but when I try to get a large amount of data (64k samples @ 100 ksps), VISA Read returns the 'VISA: (Hex 0xBFFF0015) Timeout expired before operation completed.' error.
    Please see the attached screenshot of the block diagram for details.
    First, an 'analog input scan start' command is sent to the DAQ device, and then the VI tries to read all collected data from the device; once the right amount of data is retrieved, or no more data is available, the data collecting process (the while loop) ends and an 'analog input scan stop' command is sent to the DAQ device.
    The data collection starts with no problem, but after a while, around loop #400, the VISA Read hangs and then returns the error mentioned above.
    I tried increasing the timeout value, but it didn't help: the error occurred after the same number of loops; the VISA Read still got hosed, just after the longer timeout expired.
    I also tried adding some delays in the loop, but that didn't help either.
    I am not sure what I am missing here, and I would highly appreciate it if anyone could give me some guidance on how to solve this issue.
    Thanks,
    John
    Attachments:
    usb-read.png 18 KB

    I just wanted to point out that this is not an NI board, to keep people from thinking this is a hardware issue. And I think that the rest of the code is irrelevant in this case.
    I believe that I am not using the VISA functions correctly. I assumed that someone who has used these functions before would be able to point out the obvious steps missing from the data collecting process using the VISA functions.
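    For anyone reading along, the loop described above looks roughly like this in PyVISA (the commands, byte counts, and resource name are assumptions taken from the post, not the board's real protocol):

      import pyvisa
      from pyvisa.errors import VisaIOError

      rm = pyvisa.ResourceManager()
      daq = rm.open_resource("USB0::0x1234::0x5678::INSTR")  # hypothetical device
      daq.timeout = 5000                                     # ms

      daq.write("AISCAN:START")                # hypothetical scan-start command
      data = bytearray()
      try:
          while len(data) < 64000:             # expected transfer size (assumed)
              data.extend(daq.read_bytes(4096))
      except VisaIOError:
          pass                                 # a timeout here means no more data
      finally:
          daq.write("AISCAN:STOP")             # hypothetical scan-stop command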

  • Visa test panel in NI MAX ver 5.4.0 read timeout

    I am using the VISA test panel in NI MAX ver 5.4.0, and I connect to a CVI application in which I currently have a server socket up and running. My server accepts the connection from the VISA test panel OK, and when I write from the VISA side my application sees it immediately. Then I try to read, and the VISA read times out, at which point the test panel displays what I sent with the 'send' command. I cannot determine why VISA writes to my application OK but the send is not received until the timeout occurs in the VISA tool. In my application I am doing a simple 'send' on a valid socket. I am not sure why, but if anyone has any answers I would appreciate it. Here is a snippet of my code in my main:
      /* Declarations added for completeness; isServerInitComplete and
         runRDPServer are flags defined elsewhere in the app */
      char sendbuf[64];
      char recvbuf[256], *ptrCh = recvbuf;
      int BytesToRead = sizeof(recvbuf) - 1;
      int iResult, BytesReceived, BytesSent;
      int ServerSock, AcceptedSock;
      WSADATA WSAData;

      strcpy (sendbuf, "ACK\n");
      /* Initialize WINSOCK before any socket calls are made */
      iResult = WSAStartup(MAKEWORD(2,2), &WSAData);
      if ( iResult != 0 )
          return SOCKET_ERROR;
      if ((ServerSock = SockOpenServer(SERVER_PORT_NUMBER_TCP)) == SOCKET_ERROR)
          return SOCKET_ERROR;
      /* By now the socket should be successfully bound.                */
      /* Wait for clients to connect. As each client connects,          */
      AcceptedSock = SockWaitForAccept(ServerSock);
      /* Indicate the socket has been initialized and a client has been accepted */
      if (AcceptedSock > 0)
          isServerInitComplete = TRUE;
      else if (AcceptedSock == SOCKET_ERROR)
          runRDPServer = FALSE;
      while (runRDPServer)
      {   /* braces added: the loop body is more than one statement */
          BytesReceived = recv(AcceptedSock, ptrCh, BytesToRead, 0);
          if (BytesReceived > 0)
          {
              /* for now just send back an ACK to acknowledge the receipt of data */
              if (strncmp(ptrCh, "*IDN", 4) == 0)
                  BytesSent = send(AcceptedSock, sendbuf, (int)strlen(sendbuf), 0 );
              if (strncmp(ptrCh, "*ESE", 4) == 0)
                  runRDPServer = FALSE;
          }
      }

    Stephanie, I did go look at the link you provided this morning. That particular paper seems to deal with serial connections. The app I have is for TCP/IP, and that's where my problem is. I did find out today that when talking to my app using a LabVIEW TCP app there is no timeout in the receipt of data, whereas the VISA connection times out but still reads the buffer after the timeout occurs. So there is something in that VISA connection to my app, which is a server in C creating the socket. We are currently going to continue with a LabVIEW TCP connection to my app. Thanks for your input.
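    For reference, the usual fix for raw-socket VISA reads returning only at timeout is to enable a termination character on the session. In PyVISA form (the host and port are placeholders), it looks like:

      import pyvisa

      rm = pyvisa.ResourceManager()
      # ::SOCKET means a raw TCP socket rather than a VXI-11 instrument session
      sock = rm.open_resource("TCPIP0::192.168.1.10::5025::SOCKET")  # hypothetical host/port
      sock.read_termination = "\n"   # read returns as soon as "ACK\n" arrives, not at timeout
      sock.write_termination = "\n"

      print(sock.query("*IDN?"))     # the server above replies "ACK\n" to *IDN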

  • Why do I get a "termination character was read" warning with VISA read and TCPIP?

    I am using VISA Reads with TCPIP raw sockets without issue with NI-VISA 3.0.1, but when I moved to NI-VISA 4.4 I was getting timeout errors. The timeout errors went away when I set the termination character enable property to true (which seemed to be the default in NI-VISA 3.0.1), but now I get a warning stating that the "termination character was read".
    Can I disable this warning?   Can I set the termination character enable to default?   How can I get rid of this annoying warning?

    Hey Dagwood,
    Unfortunately there isn't a way to globally change the attribute VI_ATTR_TERMCHAR_EN to VI_TRUE. I spoke with R&D about possibly using the registry, and they say it's not accessible through that. As to why this change was made, the developer who made the switch isn't around anymore, so I can't offer his reasoning as an explanation. The best thing to do in your code would be, while initializing, to use the VISA Property Node to make the change; until that VISA resource is closed, the change will remain set to the value you assign. I'm sorry we cannot provide any other solution for this inconvenience. Also, if you feel this is a large burden on your programming practice, you can definitely submit a product suggestion for the ability to change the global default values of VISA attributes.
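    In text form, the per-session workaround described above looks roughly like this with PyVISA (the resource name is a placeholder):

      import pyvisa
      from pyvisa import constants

      rm = pyvisa.ResourceManager()
      inst = rm.open_resource("TCPIP0::10.0.0.5::5025::SOCKET")  # hypothetical resource

      # Equivalent of a VISA Property Node at initialization; the setting
      # stays in effect until this session is closed.
      inst.set_visa_attribute(constants.VI_ATTR_TERMCHAR, ord("\n"))
      inst.set_visa_attribute(constants.VI_ATTR_TERMCHAR_EN, constants.VI_TRUE)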
    Thanks,
    David Pratt
    AES - Test Side Products
    NIC

  • Error -1073807253 occurred at VISA Read in transient SR830.vi VISA: (Hex 0xBFFF006B) A framing error occurred during transfer.

    Hi,
    I have written a program in LabVIEW to make transient C-V measurements using a Stanford Research SR830 lock-in amplifier. The program seems to be running fine, but sometimes it gives an error:
    Error -1073807253 occurred at VISA Read in transient SR830.vi
    Possible reason(s):
    VISA:  (Hex 0xBFFF006B) A framing error occurred during transfer.
    but if I press OK, the program starts running again. What might be the problem? BTW, I googled a bit and I see that in the LabVIEW topic "RS-232 Framing Error with HP 34401A Multimeter" by pkennedy32, this is what is said about framing errors:
    ""Framing Error" in an RS-232 context means a very specific thing - when the receiver was expecting a stop bit, the line was not in SPACE condition. This can be the result of:
    1... Baud rate mismatch (although other problems would likely crop up first).
    2... Data Length problem, If I send 8 data bits and you expect 7, the stop bit is in the wrong place.
    3... Parity setting mismatch - If I send 7 data bits + parity and you expect 7 data bits and no parity, the stop bit is in the wrong place.
    4... Mismatch in # Stop bits - If I send you 7 Data bits + parity + one stop bit, and you expect 7 data bits + parity + TWO stop bits, the second one might not be correct, although most devices do not complain about this.
    But I must say that this is the same COM port setting that I use to measure C-V hysteresis, and I never get this error there.
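    One quick sanity check for causes 1-4 above is to print what the PC side is actually configured for and compare it against the SR830's settings; a PyVISA sketch (the port name is assumed):

      import pyvisa

      rm = pyvisa.ResourceManager()
      lockin = rm.open_resource("ASRL1::INSTR")    # hypothetical port for the SR830
      for attr in ("baud_rate", "data_bits", "parity", "stop_bits"):
          print(attr, "=", getattr(lockin, attr))  # compare each against the instrument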
    I attach the program herewith for your kind perusal. Please help me resolve this issue.
    Thanks in advance.
    Attachments:
    transient SR830.vi 94 KB
    csac.vi 8 KB
    sr830 initialize1.vi 15 KB

    @Dennis Knutson you are right: I checked the read indicator in backslash mode, and instead of a \n it is sending \r, so I changed the \n in my write strings to \r. But if I keep the VISA Close inside my loop instead of outside as you suggested, the termination character appears to come in the middle of the read string instead of at the end, and since the read terminates at the \r, it displays some junk value before the \r. If I put the VISA Close outside the loop and play along with the bytes at the read buffer, I see the whole read string with the \r at the end. But whenever the values are in exponential form (when close to zero), like 6.938839e-5, I always get a timeout error, whatever timeout I put at the VISA initialize. Subsequently, if I stop the program and run it again, the program hangs and I do not get any reading. Then after I close it again and start, sometimes it hangs some more or starts working. If I put an arbitrarily large byte count at the VISA Read, then I always get the "timeout expired before the operation completed" error.
    @Ravens Fan I have removed the CSAC VI altogether and am taking the CH1 and CH2 readings separately instead of as one string, so no more issues with that.
    I use the control at the delay so that I can choose how much delay I want to set, and I use the math operation because I am adding up the delay time to keep track of the time elapsed, since in the end I have to plot time vs. the CH1 and CH2 readings.
    I am not sure but probably I am making some silly errors. Please help me out. 
    Attachments:
    transient SR830-2.vi 103 KB
    sr830 initialize1.vi 15 KB

  • Memory leak in Real-Time caused by VISA Read and Timed Loop data nodes? Doesn't make sense.

    Working with LabVIEW 8.2.1 Real-Time to develop a host of applications that monitor or emulate computers on RS-422 busses. The following screen shots were taken from an application that monitors a 200 Hz transmission. After a few hours, the PXI station would crash with an awesome array of angry messages... most implying something about a loss of memory. After much hair pulling and passing of the buck, my associate discovered, while watching the available memory on the controller, that memory loss was occurring with every loop containing a VISA Read and error propagation using the data nodes (see Memory Leak.jpg). He found that if he switched the error propagation to regular old-fashioned shift registers, the available memory was rock-solid (a la No Memory Leak.jpg).
    Any ideas what could be causing this?  Do you see any problems with the way we code these sorts of loops?  We are always attempting to optimize the way we use memory on our time-critical applications and VISA reads and DAQmx Reads give us the most heartache as we are never able to preallocate memory for these VIs.  Any tips?
    Dan Marlow
    GDLS
    Solved!
    Go to Solution.
    Attachments:
    Memory Leak.JPG 136 KB
    No Memory Leak.JPG 137 KB

    Hi thisisnotadream,
    This problem has been reported, and you seem to be exactly reproducing the conditions required to see this problem. This was reported to R&D (# 134314) for further investigation. There are multiple possible workarounds, one of which is the one that you have already found of wiring the error directly into the loop. Other situations that result in no memory leak are:
    1. If the Bytes at Port property node is not there, a read just happens in every iteration, and the resulting timeouts are ignored.
    2. If the case structure is gone and the code just blindly checks the bytes at port and reads every iteration.
    3. If the Timed Loop is turned into a While Loop.
    Thanks for the feedback!
    Regards,
    Stephen S.
    National Instruments
    Applications Engineering

  • Visa Read Overrun Problem

    Dear all,
    I am using the VISA Read function to read data coming from my hardware.
    The data is coming really fast: the hardware sends 1600 bytes every 10 ms.
    I am using an Event Structure (timeout event) to check whether 1600 bytes are at the port or not. The timeout is set to 1 ms; I even tried 0 ms.
    Baud rate = 3,000,000.
    But the program gives a buffer overrun error when I try to read the data using VISA Read.
    I have set the buffer size to 64000, but with no success. The problem still persists.
    I have played around with the receive buffer of the COM port in Device Manager. I tried all the values from 64 to 1600, again with no success.
    Every time I run the program I get an error at the VISA Read function stating, "A character was not read from the H/W before the next char arrived."
    And I am not doing any other processing besides reading data from the port in the timeout event.
    What could be the problem?
     Thanks,
    Ritesh 

    Ritesh,
    The over-run is still caused by the driver of the hardware.  What you are observing is that the driver is able to keep up with bursts of data coming into the port.  So for the first few thousand bytes coming in, there is no error.  However, after a while, the driver falls behind and cannot keep up (hence the over-run error).
    How does FTDI recommend that you use their ports in LV?  VISA may interact with the driver in a different way or set properties to different values.  All of this can have an impact on how the driver performs.  Also, this DLL that they provided may not return overrun errors the same way that VISA does.  The malformed string may just be missing data that was missed because of an over-run error.
    You have not mentioned anything about the types of flow control that are available to you.  The only real way to prevent over-run errors in data intensive communication is to have ways to hold-off the data (flow control).  Since the data will be interrupted for a fraction of a second, the driver will have a chance to get the data out of the hardware in time.  Communication will automatically resume once the data is read out of the hardware.
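    If the FTDI driver and cabling support it, hardware handshaking can be requested through VISA. A PyVISA sketch (the port name is assumed, and whether RTS/CTS is actually available depends on your hardware):

      import pyvisa
      from pyvisa import constants

      rm = pyvisa.ResourceManager()
      port = rm.open_resource("ASRL6::INSTR")   # hypothetical port
      port.baud_rate = 3000000
      # Ask for RTS/CTS flow control so the sender is held off briefly
      # instead of overrunning the receive FIFO.
      port.set_visa_attribute(constants.VI_ATTR_ASRL_FLOW_CNTRL,
                              constants.VI_ASRL_FLOW_RTS_CTS)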
    Thanks,
    Steven T.

  • VISA Read timing issues

    I am using an RS232 to control an older model Power Supply (OXFORD PS 120-10).
    I have successfully written several VIs that all work; the only problem is that VISA Read takes WAY too long. I'm talking tens of seconds to refresh. I need to have it refreshing in milliseconds, or at least tenths of seconds, for the measurements we need. All of the VIs I have written have the same timing issue.
    Attached is the most basic serial read/write VI. Is there any way to improve the read rate? Or might this just be an instrumentation issue? The strange thing is that the Write commands work almost instantaneously (I can see them on the instrument's display).
    Please help if you can. I've only been working with LabVIEW for a few weeks and am very much still in the learning process.
    Thanks!
    Attachments:
    READandWRITE timing test.vi 14 KB

    Do you have the communications protocol for the power supply? If you do not have everything right, you will have problems with communications.
    Tens of seconds is a clue that you may be getting timeout errors, because the default timeout is 10 seconds. Try placing an indicator on the error out wire inside the loop (after the Read) to see if an error occurs on any iteration. The way you have the VI set up, you only see the error from the last iteration of the loop.
    You are writing a carriage return to the instrument. If it requires that, it almost certainly sends a carriage return with the response. (This is why I asked about the protocol). If the instrument sends a carriage return (or other termination character), then you should Enable Termination Character on the Configuration VI and set the termination character to the correct value. The default is line feed (hex A or decimal 10). A carriage return is hex D or decimal 13. You must wire the numeric value to the termination character input for any value other than the default. Then change the byte count value (at the Read input) to a number larger than the longest message the instrument will ever send, perhaps 100 or 500. The Read will end as soon as the termination character is received, regardless of the number of characters.
    I suspect that this is the problem - the instrument sends fewer than 10 characters in most messages but does send a termination character.
    Lynn

  • LabVIEW VISA Read Byte Count Limitations

    I have written a program that acquires and analyzes data from the Tektronix TDS3032B oscilloscope. The scope can transfer either 500 or 10,000 points. However, it seems the VISA Read function (I am using LabVIEW 6.0.2) is unable to read enough bytes to receive the entire data string. I am almost certain that this is not a timeout issue because I can set the timeout extremely high and still get the same behavior. What would be the best way to fix this problem?

    I assume that you receive a timeout error when trying to read a big string (many points). Did you try a single read with a low byte count, less than the length of the expected string, or fast consecutive reads?
    If you are able to receive parts from the expected string, the problem looks to be buffer related. Thus, you can try to use "VISA Advanced > Interface Specific > Set I/O Buffer Size" to set enough room for a large string.
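    In PyVISA terms (the address and query are assumed), the buffer/chunk idea looks like this:

      import pyvisa

      rm = pyvisa.ResourceManager()
      scope = rm.open_resource("GPIB0::2::INSTR")  # hypothetical address for the TDS3032B
      scope.timeout = 10000                        # ms; large transfers take a while
      scope.chunk_size = 102400                    # room for the full 10,000-point record

      scope.write("CURVE?")                        # waveform query (check the TDS3032B manual)
      waveform = scope.read_raw()                  # reads chunk by chunk until the block arrives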
    Hope this helps

  • VISA Read and Bytes at Port Timing Question

    Hi,
    I have a question that doesn't seem to be documented in the VISA Read function help. My application normally queries a serial instrument, waits, and then reads the port (with Bytes at Port property node wired to the byte count input of the VISA Read). However, I also need to be able to handle strings received from the instrument asynchronously without my vi requesting any data. So in the False Case in my vi (the True Case is where I write a command to the instrument) I have a Bytes at Port property wired to the VISA Read function's byte count input without using a VISA Write. This works fine if the \r\n terminated string is sent in one packet. However, sometimes there is a slight delay (only a few milliseconds) between characters. When that happens, the VISA Read returns, but I don't get the entire intended string. (Of course I know I have to keep reading in a loop until I get the \n and then assemble the received characters (sub strings) into my complete string for processing.)
    This is my question: What is the time delay between characters at which the VISA Read terminates? This is not specified. I assume it could be as little as just slightly more than 1 stop bit at the baud rate being used. Does anyone know? NI employees?
    When a string of more than one character (byte) is sent, as soon as the stop bit time has expired, the next start bit is normally sent immediately. Is it possible that if the next start bit doesn't come by, say, the mid-bit position time at the baud rate being used, the VISA Read returns immediately? Or does it wait at least 1 character time (at the baud rate)? This should be documented. Furthermore, for future versions it might be useful to add an input to the VISA Read to specify in milliseconds how long to wait AFTER the 'byte count' number of bytes have been received before returning the string (or character).
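    For completeness, the assemble-until-terminator loop mentioned above looks roughly like this in PyVISA (the port name is assumed):

      import time
      import pyvisa

      rm = pyvisa.ResourceManager()
      inst = rm.open_resource("ASRL2::INSTR")   # hypothetical port
      message = ""
      while not message.endswith("\n"):         # keep going across inter-character gaps
          n = inst.bytes_in_buffer              # "Bytes at Port" equivalent
          if n:
              message += inst.read_bytes(n).decode("ascii")
          else:
              time.sleep(0.001)                 # brief poll interval before checking again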
    Thanks for your help.
    Ed

    I looked up the PC16550D data sheet (http://www.national.com/ds/PC/PC16550D.pdf). On p. 19 it says:
    When RCVR FIFO and receiver interrupts are enabled, RCVR FIFO timeout interrupts will occur as follows:
    A. A FIFO timeout interrupt will occur, if the following conditions exist:
        - at least one character is in the FIFO
        - the most recent serial character received was longer than 4 continuous character times ago (if 2 stop bits are  programmed the second one is included in this time delay).
        - the most recent CPU read of the FIFO was longer than 4 continuous character times ago.
    The maximum time between a received character and a timeout interrupt will be 160 ms at 300 baud with a 12-bit receive character (i.e., 1 Start, 8 Data, 1 Parity and 2 Stop Bits).
    B. Character times are calculated by using the RCLK input for a clock signal (this makes the delay proportional to the baudrate).
    C. When a timeout interrupt has occurred it is cleared and the timer reset when the CPU reads one character from the RCVR FIFO.
    D. When a timeout interrupt has not occurred the timeout timer is reset after a new character is received or after the CPU reads the RCVR FIFO.
    So, this UART uses 4 character times to determine that no more characters are coming in. And the delay is baud-rate dependent. This makes sense because I see that at, say, 115200 baud I receive more "partial strings" than I do at 9600 baud (where the sending device has more time to send the next character)!
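    The quoted 160 ms figure checks out; a quick back-of-the-envelope run (plain Python, with the numbers taken straight from the datasheet quote above):

      bits_per_char = 12                 # 1 start + 8 data + 1 parity + 2 stop
      baud = 300
      char_time = bits_per_char / baud   # 0.04 s = 40 ms per character at 300 baud
      print(4 * char_time)               # 0.16 s -> the 160 ms FIFO timeout quoted above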
    Kudos for making me investigate this further! Thanks for listening. Hope this may help others in the future.
