Intermittent Extra Serial Byte in Visa Read

Hello All,
I'm hoping you can help me with a problem that I've been having. The background:
Using a microcontroller to gather data from various sources and sending all the data back to LabVIEW via an FTDI virtual COM port. Each type of data sent back in one chunk has a three-byte header that tells me (in an abstract way) how many bytes follow for that type of data. This repeats until all the data sources are exhausted and all the data is received. However, occasionally there is an extra byte in the buffer, which pushes the headers out of whack and has me looking for bytes that aren't there. I'm attaching screen shots of the serial port config and of the serial port read VIs. I've tried flushing the buffer before I do a data acquisition and ask for data, but sometimes the extra byte happens inside the data stream, so again it pushes the headers out of place.
I have a secondary COM port output on the microcontroller that echoes the data being sent to LabVIEW; it is connected to a second PC that is capturing the data. The data is in binary format. I have disabled the termination character requirement for the receive buffer, but enabled it for the transmit (as you should see from the setup). When I review the logs of the captured data on the second PC, the microcontroller is not sending the extra byte. It appears that even though LabVIEW reads the byte from the buffer, it SOMEHOW remains there, or I could be way off and it's a bug in my code. I'm hoping you can help by perhaps seeing something that I'm not seeing.
As for my serial buffer read, I do recognize that I don't have a timeout in the event that the number of bytes to read exceeds the total bytes in the buffer; this has actually been beneficial in capturing the error as it happens. I didn't use the Bytes at Port property because speed is important and this property is notoriously slow. Also, that really wouldn't solve the extra byte problem.
This is the serial read VI
This is the Serial Port Setup
Attachments:
Serial Port Setup.png ‏81 KB
get data.png ‏186 KB
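For anyone sketching this framing scheme in text form: the post says each chunk carries a three-byte header that (abstractly) encodes how many bytes follow. A minimal Python sketch, assuming purely for illustration that the last header byte is the length field, shows why a single stray byte desynchronizes every header that follows:

```python
import io

HEADER_LEN = 3  # the post describes a three-byte header per chunk

def parse_chunks(stream, header_len=HEADER_LEN):
    """Consume (header, payload) chunks until the stream is exhausted.

    The real header encoding is 'abstract' in the post; here we assume,
    only for illustration, that the last header byte holds the payload
    length.
    """
    chunks = []
    while True:
        header = stream.read(header_len)
        if not header:
            break  # all data sources exhausted
        if len(header) < header_len:
            raise ValueError("truncated header: %r" % header)
        payload_len = header[-1]  # assumed length field
        payload = stream.read(payload_len)
        if len(payload) < payload_len:
            raise ValueError("short payload: expected %d, got %d"
                             % (payload_len, len(payload)))
        chunks.append((header, payload))
    return chunks

# Two well-formed chunks parse cleanly:
good = parse_chunks(io.BytesIO(b"\x00\x01\x02AB\x00\x01\x01C"))

# One stray byte shifts every later header, so parsing eventually fails:
try:
    parse_chunks(io.BytesIO(b"\x00\x01\x02XAB\x00\x01\x01C"))
except ValueError:
    pass  # the desynchronized stream is detected here
```

A per-chunk checksum, or a distinctive sync byte at the start of each header, would let the reader detect a stray byte and resynchronize instead of chasing bytes that aren't there.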

teejimenez wrote:
The initialization only happens once when the program starts. There is of course a close VISA session vi used later, when I'm done with the test.
No! The Initialize Serial Port is inside the loop and executed on every loop iteration.
Rolf Kalbermatter
CIT Engineering Netherlands
a division of Test & Measurement Solutions

Similar Messages

  • VISA Read function Read buffer problem in serial communication

Hi, I use the VISA Write and Read functions in a serial communication app; the device continuously sends 0x00 if it does not receive a request from the LabVIEW program running on the PC.
And the request sent by LabVIEW is programmable. I ran into a weird problem: each time the request changes, the VISA Read buffer output still shows the previous request the first time; from the second read on, it shows the right request.
It works like this: Req code: ... 50, 51, 51, 51, 50...; VISA Read buffer: ... 50, 50, 51, 51, 51, 51, 50...
    Please refer to the program.
    Attachments:
    readOne_test.vi ‏21 KB

    How are you running this?  You don't have a while loop around it.  Is it part of a larger VI?  Please don't tell me you are using the run continuously button.
You don't have any wait statement between your VISA Write and your Bytes at Port. So it is very likely the receive buffer is still empty, since you didn't give your VI time to wait for the device to turn around and give a reply. If you read 0 bytes, your VISA Read string will be empty. How does your decoder subVI (which you didn't include) handle an empty string?
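The turnaround problem described in this reply can be made concrete in a small sketch (Python here, since LabVIEW is graphical): poll the port until the expected reply has arrived or a deadline passes, instead of reading immediately after the write. `SlowPort` is a hypothetical stand-in for a device that needs a moment to answer; the `.in_waiting`/`.read()` interface is pyserial-style, not the VISA API.

```python
import time

def read_reply(port, n_bytes, timeout_s=1.0, poll_s=0.01):
    """Poll until at least n_bytes have arrived or the deadline passes.

    `port` is any object with .in_waiting and .read(n); it stands in
    for the VISA session in the post.
    """
    deadline = time.monotonic() + timeout_s
    buf = b""
    while len(buf) < n_bytes and time.monotonic() < deadline:
        avail = port.in_waiting
        if avail:
            buf += port.read(min(avail, n_bytes - len(buf)))
        else:
            time.sleep(poll_s)  # give the device time to turn around
    return buf

class SlowPort:
    """Test double: data becomes available only after a short delay,
    like a real instrument turning a request around."""
    def __init__(self, data, delay_s):
        self._data = data
        self._ready_at = time.monotonic() + delay_s

    @property
    def in_waiting(self):
        return len(self._data) if time.monotonic() >= self._ready_at else 0

    def read(self, n):
        out, self._data = self._data[:n], self._data[n:]
        return out
```

Reading immediately after the write is the equivalent of calling `read_reply` with a zero timeout: you get an empty string, which is exactly the failure mode described above.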

  • Error x3FFF0006 in VISA Read/Write

    Hi,
I am getting this error message sometimes in VISA Read and sometimes in VISA Write. It's not consistent; it's only thrown in certain VIs. What could be the issue? I am attaching my main VI, where I configure the serial port, and the VI where this error is thrown.
    Thanks.
    Attachments:
    Main.vi ‏39 KB
    Error.vi ‏30 KB

    It's a warning and not an error.
The VI that is showing the warning is a little weird. You configure the serial port to terminate the read with a termination character, but then when you do a read you specify the byte count, and if the bytes read do not equal what you specified, you try to do another read. You usually don't do both.
    If you specify a termination character with the VISA Configure Serial Port, the VISA Read will terminate when the termination character is detected. If you specify 100 bytes to read and there are only 50 bytes and a termination character, all you will read is 50 bytes. If you disable the termination character detection, you will always get the warning. In this case, as long as you use the VISA Bytes at Serial Port to verify that there really isn't any more data in the buffer, you can simply ignore the warning.
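The two stop conditions described above can be modeled in a few lines of Python (a sketch over an in-memory stream, not the VISA API itself): the read ends when the byte count is reached, or earlier when the termination character is seen and detection is enabled.

```python
import io

def visa_style_read(stream, byte_count, term=b"\n", term_enabled=True):
    """Mimic VISA Read's stop conditions on an in-memory stream.

    Returns (data, terminated_normally). With termination detection
    disabled, a request larger than the available data comes back
    short, which is the 'warning' case described above.
    """
    out = bytearray()
    while len(out) < byte_count:
        b = stream.read(1)
        if not b:
            break  # buffer ran dry
        out += b
        if term_enabled and b == term:
            return bytes(out), True  # stopped at the terminator
    return bytes(out), len(out) == byte_count

# 50 bytes plus a terminator, 100 requested: the read stops early.
data = b"x" * 50 + b"\n"
short_read, ok = visa_style_read(io.BytesIO(data), 100)
```

With `term_enabled=False` on the same stream, the second element comes back false: the analogue of always getting the warning once detection is disabled.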

  • How do I stop Serial "VISA Read" from giving me packets instead of available bytes.

    Dear Labvillians,
    Highlight:
    How do I stop serial "VISA read" from giving me packets instead of bytes?
    Background:
    I have a system that serially publishes 14 byte packets on a semi-regular interval.
At busy times, the producer of these packets queues the data, effectively producing super-packets in multiples of 14 bytes, sometimes as large as 8 packets (112 bytes).
My protocol handler has been designed to process bytes, packets or super-packets.
    My application now has multiple devices and the order of message processing is critical to correct functionality.
    My observation is that the VISA read waits until the end of a packet/ super-packet before passing the data to the application code. (See Plot Below)
    My expectation is that VISA read should give me available bytes, and not get too smart for itself and wait for a packet.
    I have observed this on PXI, Embedded PC, cFP and most recently, cRIO
    I have experimented with the cRIO's Scan interface rate, which helps with reducing the packet backlog but doesn't resolve to sub-packet byte read.
    I understand that one solution is to Write FPGA code to handle it and pass the bytes through R/T-FIFO, and there are some great examples on this site.
    Unfortunately this doesn't help with non FPGA devices.
    I have also dabbled in event based serial reads but it is diabolical on vxWorks devices.
    Any Help is appreciated
    iTm - Senior Systems Engineer
    uses: LABVIEW 2012 SP1 x86 on Windows 7 x64. cFP, cRIO, PXI-RT
    Solved!
    Go to Solution.

    Sometimes Talking to yourself is helpful.
    I hope this is a useful Nugget for someone in the future
    iTm - Senior Systems Engineer
    uses: LABVIEW 2012 SP1 x86 on Windows 7 x64. cFP, cRIO, PXI-RT

  • Bytes missing in serial communication UART - NI Visa READ

    hello everyone,
I am using NI-VISA Read to transfer data from a microcontroller to a laptop using write-to-binary-file.vi... I have observed a very weird phenomenon, in that some bytes appear to be missing from the data file. This, however, does not occur on all laptops... On some laptops, I have never observed a single byte missing from the data file...
The OS running on the laptop with missing bytes is Win 7, whereas the laptop running perfectly fine has Win XP.
Has anyone ever experienced such an issue before? I am using a Silicon Labs CP2102 as UART-USB converter at a baud rate of 921600 bps... the data size is normally in GBs... and the number of missing bytes is purely random and varies from 2 bytes to 32 bytes... The way I determine whether a byte is missing is by right-clicking the file, checking Properties, and comparing size on disk with size of file... they should be the same to ensure no data loss...
    Now on LabVIEW 10.0 on Win7

    Oh.. Created by mistake ,.. How to delete this post ???
    Now on LabVIEW 10.0 on Win7

  • VISA Read gets incorrect data from serial connection

    I am having difficulty using the VISA functions in LabVIEW to read data from a virtual COM port. Data is being sent from a serial to USB chip via a USB connection using OpenSDA drivers. I have a python program written to read from this chip as well, and it never has an issue. However, when trying to achieve the same read in LabVIEW I am running into the problem of getting incorrect data in the read buffer using the VISA Read function.
I have a VISA Configure Serial Port function set up with a control to select the COM port that the device is plugged into. Baud rate is default at 9600. The termination char of my data is a newline, which is also the default. Enable termination char is set to true. A VISA Open function follows this configuration and then feeds the VISA Resource Name Out into a while loop where a VISA Read function displays the data in the read buffer. Byte count for the VISA Read is set to 20 so I can see more of the erroneous data; however, actual data will only be 6-12 bytes. The while loop has a wait function, and no matter how much I slow down the readings I still get incorrect data (I have tried 20 ms through 1000 ms).
    The data I expect to receive in the read buffer from VISA Read is in the form of "0-255,0-255,0-255\n", like the following:
    51,93,31\n
    or
    51,193,128\n
    And occasionally I receive this data correctly, however I intermittently (sometimes every couple reads, sometimes a couple times in a row) get incorrect readings like this:
    51,1\n
    51,193739\n
    \n
    51,1933,191\n
    51,,193,196\n
    51,1933,252 
    51,203,116203,186\n
    Where it seems like the read data is truncated, missing characters, or has additional characters. Looking at these values, however, it seems like the read was done incorrectly because the bytes seem partially correct (51 is the first number even in incorrect reads).
    I have search but haven't found a similar issue and I am not sure what to try from here on. Any help is appreciated. Thanks!
    Attachments:
    Serial_Read_debugging.vi ‏13 KB

The first thing is that none of the error clusters are connected, so you could be getting errors that you are not seeing. Are you sure about the comm parameters? Finally, I have never had a lot of luck looking for termination characters. You might want to just capture data and append each read into one long string, just to see if you are still seeing this strangeness.
What sort of device is returning the data? How often does it spit out the data? How much distance is there between it and your computer? Can you configure it to append something like a checksum or CRC to the data?
    Mike...
    Certified Professional Instructor
    Certified LabVIEW Architect
    LabVIEW Champion
    "... after all, He's not a tame lion..."
    Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps
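Mike's suggestion to append each read into one long string generalizes to a small reassembly buffer (sketched here in Python): feed it whatever fragments each read returns, and let it hand back only complete newline-terminated messages.

```python
class LineReassembler:
    """Accumulate arbitrary read fragments and emit only complete
    newline-terminated messages, in order.

    This makes the downstream parser indifferent to whether a read
    returned a partial line, one line, or several lines at once.
    """
    def __init__(self):
        self._buf = b""

    def feed(self, fragment):
        self._buf += fragment
        # Everything before the last newline is complete; the tail
        # stays buffered until its terminator arrives.
        *complete, self._buf = self._buf.split(b"\n")
        return [m + b"\n" for m in complete]

# Fragments split mid-message reassemble cleanly:
r = LineReassembler()
first = r.feed(b"51,9")             # no complete line yet
rest = r.feed(b"3,31\n51,193,128\n")
```

Garbled lines like `51,193739\n` would still indicate real byte corruption on the wire; the buffer only rules out the truncation caused by reading mid-message.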

The VI is identifying the number of bytes to be read, but the VISA Read VI is not able to read the data from the port.

We are trying to communicate with the AT106 balance from Mettler Toledo. The VI is attached.
We are sending "SI", which is a standard command recognised by the balance. The balance reads it. The indicator after the property node indicates that there are 8 bytes available on the serial port. However, the VISA Read VI fails to read the bytes at the serial port and gives the following error:
    Error -1073807253 occurred at VISA Read in visa test.vi
    Possible reason(s):
    VISA: (Hex 0xBFFF006B) A framing error occurred during transfer.
The VI is attached.
    Thanks
    Vivek
    Attachments:
    visa_test.vi ‏50 KB

    Hello,
    You should also definitely check the baud rates specified; a framing error often occurs when different baud rates are specified, as the UARTs will be attempting to transmit and receive at different rates, causing the receiving end to either miss bits in a given frame, or sample bits more than once (depending on whether the receiving rate is lower or higher respectively). You should be able to check the baud rate used by your balance in the user manual, and match it in your VI with the baud rate parameter of the VISA Configure Serial Port VI.
    Best Regards,
    JLS
    Sixclear

  • VISA Read and Bytes at Port Timing Question

    Hi,
    I have a question that doesn't seem to be documented in the VISA Read function help. My application normally queries a serial instrument, waits, and then reads the port (with Bytes at Port property node wired to the byte count input of the VISA Read). However, I also need to be able to handle strings received from the instrument asynchronously without my vi requesting any data. So in the False Case in my vi (the True Case is where I write a command to the instrument) I have a Bytes at Port property wired to the VISA Read function's byte count input without using a VISA Write. This works fine if the \r\n terminated string is sent in one packet. However, sometimes there is a slight delay (only a few milliseconds) between characters. When that happens, the VISA Read returns, but I don't get the entire intended string. (Of course I know I have to keep reading in a loop until I get the \n and then assemble the received characters (sub strings) into my complete string for processing.)
    This is my question: What is the time delay between characters at which the VISA Read terminates? This is not specified. I assume it could be as little as just slightly more than 1 stop bit at the baud rate being used. Does anyone know? NI employees?
    When a string of more than one character (byte) is sent, as soon as the stop bit time has expired, the next start bit is normally sent immediately. Is it possible that if the next start bit doesn't come by, say, the mid-bit position time at the baud rate being used, the VISA Read returns immediately? Or does it wait at least 1 character time (at the baud rate)? This should be documented. Furthermore, for future versions it might be useful to add an input to the VISA Read to specify in milliseconds how long to wait AFTER the 'byte count' number of bytes have been received before returning the string (or character).
    Thanks for your help.
    Ed

    I looked up the PC16550D data sheet (http://www.national.com/ds/PC/PC16550D.pdf). On p. 19 it says:
    When RCVR FIFO and receiver interrupts are enabled, RCVR FIFO timeout interrupts will occur as follows:
    A. A FIFO timeout interrupt will occur, if the following conditions exist:
        - at least one character is in the FIFO
        - the most recent serial character received was longer than 4 continuous character times ago (if 2 stop bits are  programmed the second one is included in this time delay).
        - the most recent CPU read of the FIFO was longer than 4 continuous character times ago.
    The maximum time between a received character and a timeout interrupt will be 160 ms at 300 baud with a 12-bit receive character (i.e., 1 Start, 8 Data, 1 Parity and 2 Stop Bits).
    B. Character times are calculated by using the RCLK input for a clock signal (this makes the delay proportional to the baudrate).
    C. When a timeout interrupt has occurred it is cleared and the timer reset when the CPU reads one character from the RCVR FIFO.
    D. When a timeout interrupt has not occurred the timeout timer is reset after a new character is received or after the CPU reads the RCVR FIFO.
    So, this UART uses 4 character times to determine that no more characters are coming in. And the delay is baud-rate dependent. This makes sense because I see that at, say, 115200 baud I receive more "partial strings" than I do at 9600 baud (where the sending device has more time to send the next character)!
    Kudos for making me investigate this further! Thanks for listening. Hope this may help others in the future.
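The datasheet figures quoted above are easy to sanity-check. A small helper (Python, as a worked-arithmetic sketch) computes the four-character-time receiver timeout for a given frame format:

```python
def fifo_timeout_ms(baud, data_bits=8, parity=False, stop_bits=1):
    """Approximate 16550 receiver FIFO timeout: four character times
    after the last received character, per the datasheet quoted above."""
    # One character frame = start bit + data bits (+ parity) + stop bits.
    frame_bits = 1 + data_bits + (1 if parity else 0) + stop_bits
    char_time_ms = frame_bits / baud * 1000.0
    return 4 * char_time_ms

# The datasheet's worst case: 300 baud, 12-bit character (1 start,
# 8 data, 1 parity, 2 stop) gives the quoted 160 ms.
worst_case = fifo_timeout_ms(300, data_bits=8, parity=True, stop_bits=2)

# At 115200 baud, 8N1, the timeout shrinks to well under a millisecond,
# which is why high baud rates produce more partial reads.
fast_case = fifo_timeout_ms(115200)
```

This matches the observation above: the slower the baud rate, the longer the sender has to deliver the next character before the UART declares the transfer over.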

  • I pull fiftyfour bytes of data from MicroProcessor's EEPROM using serial port. It works fine. I then send a request for 512 bytes and my "read" goes into loop condition, no bytes are delivered and system is lost

    I pull fiftyfour bytes of data from MicroProcessor's EEPROM using serial port. It works fine. I then send a request for 512 bytes and my "read" goes into loop condition, no bytes are delivered and system is lost

    Hello,
    You mention that you send a string to the microprocessor that tells it how many bytes to send. Instead of requesting 512 bytes, try reading 10 times and only requesting about 50 bytes at a time.
If that doesn't help, try directly communicating with your microprocessor through HyperTerminal. If you are not on a Windows system, please let me know. Also, if you are using an NI serial board instead of your computer's serial port, let me know.
    In Windows XP, go to Start, Programs, Accessories, Communications, and select HyperTerminal.
    Enter a name for the connection and click OK.
    In the next pop-up dialog, choose the COM port you are using to communicate with your device and click OK.
In the final pop-up dialog, set the communication settings for communicating with your device.
    Type the same commands you sent through LabVIEW and observe if you can receive the first 54 bytes you mention. Also observe if data is returned from your 512 byte request or if HyperTerminal just waits.
    If you do not receive the 512 byte request through HyperTerminal, your microprocessor is unable to communicate with your computer at a low level. LabVIEW uses the same Windows DLLs as HyperTerminal for serial communication. Double check the instrument user manual for any additional information that may be necessary to communicate.
    Please let me know the results from the above test in HyperTerminal. We can then proceed from there.
    Grant M.
    National Instruments

LabVIEW VISA Read Byte Count Limitations

I have written a program that acquires and analyzes data from the Tektronix TDS3032B oscilloscope. The scope can transfer either 500 or 10,000 points. However, it seems the VISA Read function (I am using LabVIEW 6.0.2) is unable to read enough bytes to receive the entire data string. I am almost certain that this is not a timeout issue, because I can set the timeout extremely high and still get the same behavior. What would be the best way to fix this problem?

I assume that you receive a timeout error when trying to read a big string (many points). Did you try one reading with a low byte count, less than the length of the expected string? Or fast consecutive readings?
    If you are able to receive parts from the expected string, the problem looks to be buffer related. Thus, you can try to use "VISA Advanced > Interface Specific > Set I/O Buffer Size" to set enough room for a large string.
    Hope this helps

  • Visa Read Timeout Occurs with multiple Reentrant VI Calls

    I have written a test application in Labview (6.1) which will be used to test (burn-in) up to 15 serial instruments through a 16 Port USB->RS232 Hub. Here's how it works:
When the App loads, I am transmitting a Connect command to each of 15 COM ports (one at a time) using VISA. If I receive the proper response from the unit on that port, I add the port to an array and continue on to the next system. Once I've found all systems on the hub, I wire my array of active VISA references to a for loop in which I open up to 15 reentrant VIs which will run in the background in parallel. Each of these reentrant VIs (all are identical with the exception of the VISA resource they use) running in the background is sending commands to the respective instrument and receiving a response. One function in particular, "Get Unit Status", is important, and the response determines whether or not the instrument is functioning correctly. Here's the problem: in my Main Loop, I am continuously acquiring indicator values from each of the reentrant VIs that are running in the background. After a period of time (not consistent) I will lose communication with a port (the symptom is no response from the unit). I've looked closely at the COMM engine I created and found that the VISA Write function is completing without error; then when I perform a VISA Read I immediately get the "Timeout occurred before operation completed" error (please keep in mind that this occurs after 100-5000 successful attempts at writing/reading). Eventually another port will drop out, followed by another. This then seems to stop, and the remaining systems run to completion without a problem.
    Some background on what how I'm setting up my Visa Sessions...
    When I originally scan for systems (before I load and run the Reentrant VIs)
    - Init Visa Port
    - 19200, 8, N, 1
    - Use Termination = True
    - Timeout = 400mS (I've tried larger values already) 400mS should be plenty
    - Termination Char=13 (/r)
    - Open Visa Session
    - Visa Write "CONN18/r" (the command required to connect to my instrument)
- VISA Read with 1 for the requested byte count to read 1 byte at a time, concatenating the results until /r is received (or a 1000 ms timeout occurs; this is not a VISA timeout). I've also tried 16 for the requested byte count and just waiting for VISA to time out; both methods work.
    Once all 16 ports are scanned I Close ALL of the ports using the Visa Close Function.
    It is important to know at this time that I "AM" using proper wiring flow to ensure open occurs before write, write occurs before read, etc.
    I'm assuming at this time that all of my Visa sessions are closed.
    On to the Reentrant VIs:
    Inside each reentrant VI I first Initialize all of my variables and Init/Open a 'New'? Visa session using the same parameters mentioned above.
    Then I enter the "Run" case structure where all of the communication begins.
    I am using the same Communications Engine to operate the instrument as before (the only difference being that all of the VIs in the comm engine are now reentrant and operate at higher priorities) I have actually saved two different versions of the engine (one for the reentrant calls and one for when I first scan for systems from my Main GUI).
    When I init the reentrant VI, I am placing the Duplicate Visa Resource output of my Visa Open Function on a shift register. When I enter the Run case, it takes the Resource from the register on the left, wires through any Comm Engine Vis then back out to the shift register on the right and keeps going for a 12-hour period or until "Get Unit Status" has returned 60 naughty results.
    On my Main GUI I am continuously (every 500mS) I am Getting certain Indicator Values from each reentrant VI AND I am also setting some control Values on each reentrant VI. There is no VISA interaction between each Reentrant VI, and the Main GUI.
    As I said earlier, up to 15 systems will run for a time, then one will stop responding, followed by another, and another until a few remaining systems will run to completion.
Any advice as to why I'm encountering the timeouts with the VISA Read function as I have mentioned would be appreciated. I managed to find one suggestion which uses the Bytes at Port function to ensure there is data at the port before doing a read; otherwise, skip the read and retry the whole operation. I haven't tried this yet.
    Sorry for the wordiness of my question. If anyone would like some screen shots of portions of my code (I can't submit the actual code because some of it is confidential) I'd be happy to post them.
    Doug.

    Hi Doug,
    The first thing I would recommend is the solution you have already found, to check and see if there is data at the port before attempting a read. I would be interested to see if this will solve the problem. Does there seem to be any trend to which ports have this timeout error? How many ports does it cut down to before operation seems to continue as expected? Does this number vary, or is it always the same number of ports? I think the best thing to do will be to identify constant attributes of how the error is occurring so that we can narrow it down and see what is going on.
    John M
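The pattern John M recommends, checking for data before reading and otherwise retrying the whole transaction, is tiny when sketched in Python over a hypothetical pyserial-style port object (not the VISA API itself):

```python
def guarded_read(port):
    """Read only when data is actually waiting; return None to tell
    the caller to skip this attempt and retry the write/read cycle.

    `port` is any object with .in_waiting and .read(n); it stands in
    for one of the 15 VISA sessions in the post.
    """
    n = port.in_waiting
    if n == 0:
        return None  # nothing there yet; retry the whole operation
    return port.read(n)

class StubPort:
    """Test double holding a fixed receive buffer."""
    def __init__(self, data):
        self._data = data

    @property
    def in_waiting(self):
        return len(self._data)

    def read(self, n):
        out, self._data = self._data[:n], self._data[n:]
        return out
```

This avoids burning the VISA timeout on a port that has silently stalled, though it does not explain why the port stalls; a per-port retry counter would make a dead port visible instead of blocking its reentrant VI.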

  • NI VISA read stops at zero character, returning an 0xBFFF003E error

    Hi
    I’m trying to read some serial data from a UUT using the NI-VISA read function. The data is mostly text but does include some control codes. The first of these appears after the ‘OK’ in the Serial Bytes window on the front panel. More text should follow but for some reason, the read function stops at the first zero character (index 144 in the Byte Array), and returns an 0xBFFF003E (-1073807298) error. I found another thread where someone had a similar problem and I’ve tried the fix for this plus a few other things, but nothing’s worked. If I use Hyperterminal, the entire data block is returned as it should be.
    I wondered if this was anything to do with the 7.1 version of Labview I’m using (upgrade is on the cards). The version of NI VISA I’m running is 4.2.
    Very much appreciate any thoughts.
    Thanks
    Bruce

    The error code itself is a generic VISA error which often happens with USB to RS-232 interfaces. Does your device connect to the PC through USB as a virtual COMM port? If so what chip and Windows driver is it using?
Also, your function somehow looks wrong. The only criteria for the read loop to terminate are an error on the VISA Read or the TestStand termination status becoming true. Generally, if you use VISA Bytes at Serial Port you are almost always doing something wrong! That function absolutely does not synchronize with anything in your data. You will read whatever is there at that moment, and that could be a partial message, no bytes at all (LabVIEW is typically many times faster than any serial device, even if it is super high speed), or multiple messages.
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • How to set "VISA Read" as a event source?

I want to make VISA Read one of the event sources.
For instance, whenever VISA Read reads one byte, it can create an event, so the event structure can process it.
How can I realize this? Thank you.

    hi there
Please try the attached vi (not tested, because I don't have any serial device at hand).
It uses a string indicator as the data buffer. If there is new data, the vi sets the "Val(Sgnl)" property of the control, which itself raises the "value changed" event. I don't see a possibility to connect user events and VISA. There are the VISA events (see the VISA advanced palette), but they also can't be connected to an event structure.
    Best regards
    chris
    CL(A)Dly bending G-Force with LabVIEW
    famous last words: "oh my god, it is full of stars!"
    Attachments:
    event_ValSignaling_7.1.vi ‏53 KB

  • How can use VISA Read and the Wait Function properly

I am trying to read an instrument. This is a simple instrument because all it does is feed data; there aren't any write commands whatsoever. I am trying to read the instrument at time intervals. It seems to work fine except for the one-second interval. Every 5 seconds I get no reading and there are zero bytes at the serial port. Any suggestions on what I should do?

There is a VISA serial VI called VISA Bytes at Serial Port. Use a comparison node for =0 and wire this to a case structure. In the false case, read your data.

  • Visa Read problem from a PIC24's UART port

    Thanks for taking the time to read my post.
    I am using Labview 8 on Win XP CPU 3.4, 2GB RAM PC.
I am using a microcontroller, a PIC24 (on the Explorer 16 development board), which I have programmed to acquire an AC signal at a rate of about 4 kHz. The PIC24 has a 10-bit ADC and the acquired value is then padded to 16 bits in total (format: 000000xxxxxxxxxx).
The serial settings for the PIC and my LabVIEW program are 115200, 8 data bits, no parity, 1 stop bit.
In order to send the acquired value to my PC's serial port (remember, UART = 8 bits of data only) I divide the 16-bit word into MSB and LSB. I then send the MSB and LSB 8-bit values one after the other.
The total data rate for this communication is about 64 kbps.
The problem is that when I run the code I have written in LabVIEW, the CPU usage shoots up to 70% and I also see buffer overruns (a non-continuous sine wave). If I add a time delay in my while loop, the buffer overruns increase.
    Also if I try and use the ‘bytes at port’ VI the data I get is meaningless.
    I would be grateful if someone can look at my code and give me some suggestion as to how I could make the ‘Visa Read’ VI more efficient.
    Regards
    Alex
    Attachments:
    Serial Client for PIC24.vi ‏101 KB

    Dear all,
    I do not know if you have been following my post, but I am still getting buffer overruns (a non continuous sine wave) when using VISA Serial Read.
    The only way to avoid this, is by making the VISA Read VI, read 2-bytes at a time (with no time delay in the main while loop). However, when I do this I see the CPU usage shoot up to around 60% (which is something I would expect anyway, as the main while loop is executing as fast as possible).
    I have attached the working code below and would appreciate ANY comments BIG or small.
    I am still puzzled as to why when I connect the ‘Bytes at Port’ Property Node, the data I get is not correct.
    I have gone through the Labview Examples, as well as the LV Basics 1 course examples (which are similar) and I have also looked in the Labview for Everyone / Labview Graphical Programming books.
    However, I have found the examples to be far too simple, for what I am trying to achieve.
    I am seriously thinking of purchasing the LV Instrument Control Self-Paced Course, but I am not quite certain this would help me much. I have read the Course outline provided by NI, but this did not provide me with more valuable information.
    Can anyone that has ‘done’ this course advice me as to whether the material contains info on ‘high’ speed acquisition using VISA Serial Read/Write?
The course is slightly pricey at a cost of around £240 (with academic discount) and, as far as I understand, the course examples (might) use two HP instruments (a multimeter and a function generator) and a Tektronix oscilloscope, none of which I am in direct contact with.
    Regards
    Alex
    Attachments:
    Serial Client PIC24-Serial - Read 2-bytes.vi ‏42 KB
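The MSB/LSB split described in this thread recombines on the PC side with simple shifts. A Python sketch (the MSB-first byte order is taken from the post's description; it is worth verifying against the PIC code):

```python
def decode_samples(raw):
    """Rebuild 10-bit ADC samples from an MSB-first byte stream.

    Each 16-bit word is 000000xxxxxxxxxx, sent as two bytes: MSB
    first, then LSB. An odd trailing byte means the pairing slipped,
    which is one way a misaligned read produces meaningless data.
    """
    if len(raw) % 2:
        raise ValueError("odd byte count: MSB/LSB pairing lost")
    samples = []
    for i in range(0, len(raw), 2):
        word = (raw[i] << 8) | raw[i + 1]
        samples.append(word & 0x03FF)  # keep the 10 ADC bits
    return samples

# 0x0123 and 0x03FF round-trip from their MSB/LSB byte pairs:
vals = decode_samples(bytes([0x01, 0x23, 0x03, 0xFF]))
```

Reading an even number of bytes per VISA Read (2 at a time, as the poster found) keeps the pairing intact; a Bytes at Port read can return an odd count mid-pair, which would explain the meaningless data it produces here.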
