0 bytes at serial port after VISA Write

Hi,
I'm having problems communicating with a Pollux box controlling a stepper motor (both from Micos). The Pollux box has an RS232/485 connection to the computer. 
I have tried adaptations of LabVIEW's Basic Serial Read and Write example. Even for the basic example there have never been any bytes at the serial port after the write.
I have attached some sample code for LV 10.0.1(64bit) and the OS is Windows Server Enterprise 2008.
The Visa drivers have just been installed.
After some research I have added a 'Set I/O buffer size' as advised by http://digital.ni.com/public.nsf/allkb/60DDFED7EFEFE7188625705700750821?OpenDocument
The time-out is set to 10 s, I have a wait of 1 s before the read and 0.5 s between each write (the characters are written individually in case that was the problem).
The command getaxisno_ should return the default axis number, which is 1. However, there is never any reply.
For Venus2, the termination character is a space when transmitting commands, and received data ends with ASCII CR LF.
I'm not sure what else could be wrong. According to MAX the device is working properly. However, when I try to open a VISA test panel in MAX, the program crashes. I'm not sure if this is a related problem.
I'd be really grateful for any advice, have been trying to fix this for a few days now.
Thanks in advance!
Ciara
Attachments:
Getaxisno test.vi 18 KB

Thanks, I'll search those now.
When I click the test panel button, the MAX window disappears and the program closes. The next time I open it, an Unexpected Error box appears with a message saying there was an unexpected error and that I should search for Info Code MAXKnownException.
'The exception occurred in the NIMax process in the function (Unknown).'
I have searched this and the info advised me to send a report with log files etc to NI which I have done. Just waiting on the reply.
I copied the Visa resource name from MAX and the termination character is the same as what a previous owner of the equipment used in Visual Basic. 
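For comparison with the attached VI, a minimal text-based sketch of the same exchange, written with Python and PyVISA, would look roughly like this. The resource name, baud rate, and exact Venus2 command syntax below are assumptions, not values verified against a Pollux box:

```python
import pyvisa

rm = pyvisa.ResourceManager()
# The resource name is an assumption -- copy the real one from MAX.
pollux = rm.open_resource("ASRL1::INSTR")

# Port settings are assumptions; match them to the Pollux box configuration.
pollux.baud_rate = 19200
pollux.write_termination = " "     # Venus2 commands are terminated with a space
pollux.read_termination = "\r\n"   # replies end with CR LF
pollux.timeout = 10000             # 10 s, in milliseconds

# getaxisno should answer with the default axis number (1) if the link works.
print(pollux.query("getaxisno"))

pollux.close()
rm.close()
```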

Similar Messages

  • Closing a serial port after executing a for loop of write and reads.

    Hello,
    LabVIEW is opening and then closing the port for each write. I have tried to leave the close outside of the for loop, but LabVIEW won't allow it. What do I need to change so that all the writes and reads execute on one open and close of the serial port?
    Thanks.
    Attachments:
    Controller.vi 27 KB

    J_es--
    The program that you posted looks to be OK for the most part; you might consider putting an open after your configure (but that's trivial). The other minor issue is that the loop tunnel coming out of your for loop is currently being auto-indexed. Auto-indexing is used to index data on each iteration of the loop; you are using a static address (not an array), so you don't need it. If you right-click the tunnel and disable indexing, the broken wire will go away. Other than that it should be OK.
    I would suggest looking at one of the shipping examples that come with LabVIEW. "Basic Serial Read and Write" is essentially the same thing that you are doing, is tested here, and might save you a bit of time. Anyway, take a look if you have a second. Best of luck with your application!
    John H.
    Applications Engineer
    National Instruments
    http://www.ni.com/support
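    The same open-once / loop / close-once structure, sketched in Python with PyVISA rather than LabVIEW (the resource name and command are placeholders):

    ```python
    import pyvisa

    rm = pyvisa.ResourceManager()
    inst = rm.open_resource("ASRL1::INSTR")   # open and configure once, before the loop

    try:
        for step in range(10):
            inst.write(f"SETP {step}")        # placeholder command
            print(inst.read())                # one read per write, inside the loop
    finally:
        inst.close()                          # close once, after the loop
        rm.close()
    ```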

  • Writing bytes to serial port in each iteration of loop

    Hello everyone,
    Well, I am new to LabVIEW, so my question might be of beginner level, but I have searched the forums and didn't find anything regarding it. Kindly help me through it, as I have really been stuck on it for the last three days.
    I have to make a packet of bytes in each iteration depending on the bytes I get from the array that is the input to the for loop: take five bytes, then build a packet according to the protocol. Once the packet is made I have to send it to the serial port in the same iteration, so for each iteration I have to send a packet to the serial port. I have tried shift registers, local variables, and also auto-indexing, which returns the array at the end of the loop, but these things don't solve my problem. Kindly help me solve this issue.
    Many many thanks in advance
    Engr. Yasir Amin 

    First of all thanks alot for your reply.
    The problem is that I have made a separate subVI for this protocol stuffing, and the output of that subVI is attached to the VISA serial write in another main VI in which I am using this subVI. I send parameters to the subVI, which builds the packet and then sends the output; the output is then sent to the VISA serial write. That VISA serial write is actually common to many subVIs. On the interface it is the user's choice to select the request they want to send; based on the request, the particular subVI is chosen, and that subVI generates the string in the main VI, which is written to the serial port. All the other subVIs are working fine because they don't have a loop; they simply make the string and send it to the port. But this subVI with the loop has the problem that it sends only the last iteration's string to the port.
    I can't post the code as I am using my laptop in my room and the code is on my office laptop, but tomorrow I can share it from the office.
    Thanks
    Yasir
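    One way to send a packet per iteration is to perform the write inside the loop itself, rather than passing the assembled string out of the loop. A rough Python/PyVISA sketch of that idea (build_packet, the framing bytes, and the resource name are placeholders, not the real protocol):

    ```python
    import pyvisa

    def build_packet(chunk: bytes) -> bytes:
        """Placeholder for the protocol-stuffing subVI."""
        return b"\x02" + chunk + b"\x03"    # assumed framing, not the real protocol

    rm = pyvisa.ResourceManager()
    inst = rm.open_resource("ASRL1::INSTR")

    data = bytes(range(25))                 # example input array
    for i in range(0, len(data), 5):        # take five bytes per iteration
        inst.write_raw(build_packet(data[i:i + 5]))   # send in the same iteration

    inst.close()
    rm.close()
    ```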

  • Executable does not release serial port after being stopped

    Greetings:
    I have created an executable that communicates through a serial port.  However, if I stop the executable (not close it), the executable still has control over the COM port.  This causes issues if I need to jump on Hyperterminal to send a couple quick queries to my system.  In order to send the queries, I have to actually close the executable in order to release the COM port.  It's problematic because I then have to start the executable back up, go through another configuration query, etc. and there are parameters from the previous run I would prefer to not lose.
    Is there a way to force a LabVIEW executable to release its hold on a COM port after it is stopped, without closing the executable completely? If it makes a difference, I'm using LV 7.1 on a WinXP machine. Thanks for any help!

    Gumby_Dammit wrote:
    I changed the settings to have the stop button visible in the menu bar along with the start button.  This is because the user has to select their COM port before continuing.  
    Raven has already said it, but I'll say it a different way. No user should ever interact with the front panel unless the VI is running, and when it is running the user should never have access to the abort button in the toolbar. When you open Internet Explorer, do you first need to click a run button? No, it runs when you open it. Write your program so it is always running and has no abort button; then, if the user closes the program, you can run the proper cleanup, like closing COM ports.
    Unofficial Forum Rules and Guidelines - Hooovahh - LabVIEW Overlord
    If 10 out of 10 experts in any field say something is bad, you should probably take their opinion seriously.
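    The underlying pattern -- keep the application running and always close the port in a cleanup path rather than aborting -- looks like this in text form (a Python/PyVISA sketch; the resource name and query are assumptions):

    ```python
    import pyvisa

    def main() -> None:
        rm = pyvisa.ResourceManager()
        inst = rm.open_resource("ASRL3::INSTR")    # COM port selected at startup (assumed)
        try:
            for _ in range(100):                   # stands in for the program's main loop
                inst.write("MEAS?")                # placeholder query
                print(inst.read())
        finally:
            inst.close()                           # the port is released even if the loop stops early
            rm.close()

    if __name__ == "__main__":
        main()
    ```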

  • How to display raw, not ascii, bytes from serial port?

    Hello, I have an instrument that outputs raw data bytes, not ASCII, to a serial port. How do you display raw data bytes, not ASCII, from a serial port? Since raw data bytes are not directly displayable, there has to be something in LabVIEW to interpret/display the received bytes as data in hex or binary. The STRING-TO-HEX and STRING-TO-BINARY functions are not applicable in this case because the bytes were not string type in the first place. For example, if the received byte is 0x4D, I need to display 4D in hex or 77 in decimal, and not the ASCII character "M" corresponding to the 0x4D ASCII code. I am currently using a VISA Read function to receive the data from a COM port. VISA Read outputs string type. Thanks for any feedback or suggestions in advance.

    BC@Baxter wrote:
    ...The STRING-TO-HEX and STRING-TO-BINARY functions are not applicable in this case because the bytes were not string type in the first place...
    Of course they are string type, just not formatted in a human readable form. So, YES, these functions are not appropriate.
    BC@Baxter wrote:
    For example, if the receive byte is 0x4D, I need to display 4D in HEX or 77 in Decimal and not the ascii character "M" corresponding to 0x4D ascii code. 
    Dennis is right, but there is no "decimal" display for strings, so the second requirement needs a tiny little bit more code.
    If you have a single byte string, you can typecast it into a U8 number and set the display format as decimal, hex, octal, or binary as desired. In decimal, it would display the number 77. Since it is now a numeric, you can even do math with it.
    In general, the string has multiple bytes, and you would typecast it into an array of U8 numbers (Since this is an often used function, there is also a "string to byte array" which does the same thing).
    Typecasting is the secret. You can e.g. easily typecast 2 bytes into a U16, 4 bytes into a I32, 8 bytes into a DBL,  etc. For more detailed requirements, there is also the "unflatten from string". Check the online help for details. Good luck!
    LabVIEW Champion. Do more with less code and in less time.
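    The typecast advice above, transposed to text form: read the raw bytes and reinterpret them numerically instead of as characters (a Python/PyVISA sketch; the resource name is an assumption):

    ```python
    import struct
    import pyvisa

    rm = pyvisa.ResourceManager()
    inst = rm.open_resource("ASRL1::INSTR")

    raw = inst.read_raw()                  # the bytes as they arrived, no ASCII interpretation
    print(raw.hex(" "))                    # e.g. "4d" instead of the character "M"
    print(list(raw))                       # each byte as a decimal U8, e.g. 77

    # The equivalent of typecasting several bytes into a wider number:
    if len(raw) >= 2:
        (u16,) = struct.unpack(">H", raw[:2])   # 2 bytes -> U16 (big-endian assumed)
        print(u16)

    inst.close()
    rm.close()
    ```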

  • Bytes at serial port returns allways zero value

    Hello, in my code Bytes at Port does not work as I'm expecting. The program receives data normally, but Bytes at Port always returns zero.
    Second problem: is it possible to use VISA Read in two parallel loops? The purpose of my first loop is sending commands, and I want to evaluate the synchronous answers immediately in this loop. The second loop receives and evaluates the asynchronous data flowing from the measurement device when a measurement is activated.
    Thanks.

    You're sort of answering your first question with your second question. In the first loop the Bytes at Port is meaningless, unless you are intending to use it to perform a read in there. However, since you're reading in the second loop (incorrectly, by the way - more on that later), then there's nothing in the buffer, so you'll get zero.
    You can read in as many places as you want. Whether or not it makes sense to do so is a different story. It depends on how the data is coming over the serial port. If the instrument you're talking to is sending data continuously, but "sneaks" in a response to a command between two messages, then you obviously don't want to be reading the response to the command in your second loop. The serial port is just one pipe. It's up to you to determine where the data belongs. Furthermore, you can't be guaranteed of getting all of the bytes at once, which means you might read, say, part of the "asynchronous" message, then get the rest in the next iteration along with the response to a command you sent. As far as Bytes at Port is concerned, it just sees a bunch of bytes. It doesn't know that, for example, the first 2 are the remainder of the asynchronous message and the rest are the response to the command. Does the instrument not have something in the communication protocol to allow you to distinguish?
    As promised, the way you're doing the read: You are setting up the configure VI to disable the termination character, yet you are wiring a termination character. Which is it? Do you want the read to stop at the termination character, or not? If you don't use a termination character, then you need to use Bytes At Port so you know how many bytes to read. You have that property node in the second loop, but you are not using it. Instead you are wiring a constant of 1000 to the VISA Read. This will only work if you have termination enabled. But you don't. See the problem?
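    The two read styles contrasted above, sketched in Python/PyVISA (the resource name and command are assumptions): either enable a termination character and read up to it, or disable it and size each read from the number of bytes actually at the port:

    ```python
    import pyvisa

    rm = pyvisa.ResourceManager()
    inst = rm.open_resource("ASRL1::INSTR")

    # Style 1: termination character enabled -- each read returns one complete message.
    inst.read_termination = "\r\n"
    inst.write("MEAS?")                    # placeholder command
    reply = inst.read()

    # Style 2: no termination character -- ask how many bytes are waiting, read exactly that.
    inst.read_termination = None
    waiting = inst.bytes_in_buffer         # the text equivalent of "Bytes at Port"
    chunk = inst.read_bytes(waiting) if waiting else b""

    inst.close()
    rm.close()
    ```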

  • Serial control with VISA write

    Hi,
    I'm trying to write a vi that will control a stepper motor that moves a
    linear stage, and use a linear transducer (potentiometer) as feedback
    to get precise placement of the stage.  I'm having difficulty using
    VISA write to send commands to the stepper motor.  I can use the
    HyperTerminal to send commands, for example /2P0R to start the motor
    moving.  But when I try to get VISA write to write the same command to
    the serial port, nothing happens.  My code is attached.  Can anyone
    tell me what I'm doing wrong?
    Thanks in advance.
    Attachments:
    Untitled2.vi 27 KB

    VISA will indeed throw an error if the COM resource is tied up, but the OP's program does not use any error I/O wiring.
    ~~~~~~~~~~~~~~~~~~~~~~~~~~
    "It’s the questions that drive us.”
    ~~~~~~~~~~~~~~~~~~~~~~~~~~
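    In a text binding, the missing error wiring corresponds to catching the exceptions VISA raises -- a Python/PyVISA sketch (the resource name and command terminator are assumptions):

    ```python
    import pyvisa
    from pyvisa.errors import VisaIOError

    rm = pyvisa.ResourceManager()
    try:
        motor = rm.open_resource("ASRL1::INSTR")   # fails if another program holds the port
        motor.write_termination = "\r"             # assumed terminator for the /2P0R command
        motor.write("/2P0R")                       # same command that works in HyperTerminal
        motor.close()
    except VisaIOError as err:
        print("VISA error:", err)                  # the equivalent of wiring the error cluster
    finally:
        rm.close()
    ```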

  • Control Camera Attibutes through serial port using VISA commands

    Hi there,
    I'm using a Basler acA2000-340kc camera through an PCIe-1473R FPGA as frame grabber.
    I would like to be able to configure the camera through the serial port directly in LabVIEW, without using third-party software such as Pylon (which I can do now).
    According to this forum post 
    http://forums.ni.com/t5/Instrument-Control-GPIB-Serial/My-Basler-acA2040-180km-NIR-is-not-visible-in...
    this is possible just taking the VI posted here
    https://decibel.ni.com/content/docs/DOC-5049
    and converting it from IMAQ to VISA, as RIO frame grabbers cannot use IMAQ. I have been trying to do this by changing the IMAQ VIs to their VISA equivalents, but I have not had good results.
    Does anyone know which are the steps to go from that piece of code to one that can be used to control the camera in my case?
    Thanks a lot,

    Hi i.popa,
    Basler makes the Basler Binary Protocol Library for serial communication with their cameras. If you make sure you're running the serial server section of your FPGA code to keep the serial port open, you should be able to use calls from this library to communicate with your camera. We have a community example with more information on using the Call Library Function Node to call a DLL. This DLL will take care of all the processes needed to write to and read from the camera registers, so this implementation will probably be similar to the example program you listed above, using calls to this library instead of the IMAQ Serial functions.
    Good luck with your application!
    Emily C
    Applications Engineer
    National Instruments

  • Reading/writing to serial port w/ VISA in Labview

    I'm writing a Labview program to control and to read data from a Varian vacuum pump controller. It is connected to the serial port in my computer, and I have been trying to open a VISA session to communicate with the instrument. So far, however, the computer cannot see the instrument--I get the same error messages reading and writing to the instrument as I do to an empty serial port. Does anyone have any suggestions on what the problem might be? Thank you.

    There are a lot of possibilities. Is the cable correct? You probably need a crossover cable (TX and RX swapped). You also need to check that the serial ports on both ends are set the same (i.e. baud rate, handshaking, stop bits, etc.). Make sure that the port is enabled. I've seen people try to use a COM port only to find that a modem card installation caused the port to be disabled. You could also try using HyperTerminal to talk to the instrument. If you can communicate there, you can eliminate any hardware problem.
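    A checklist version of that advice in Python/PyVISA -- every setting below must match the pump controller's documented configuration (the values and the query shown are placeholders):

    ```python
    import pyvisa
    from pyvisa.constants import Parity, StopBits

    rm = pyvisa.ResourceManager()
    pump = rm.open_resource("ASRL1::INSTR")

    pump.baud_rate = 9600                  # placeholder values -- use the controller's settings
    pump.data_bits = 8
    pump.parity = Parity.none
    pump.stop_bits = StopBits.one
    pump.read_termination = "\r"
    pump.timeout = 2000                    # ms

    print(pump.query("STATUS?"))           # placeholder; use the Varian protocol's real syntax

    pump.close()
    rm.close()
    ```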

  • Java program to receive byte from serial port

    Hello everyone, I am currently working on a project which requires communication between a PIC16F877 and a PC. My PIC continuously measures a voltage and sends the result (a byte ranging from 0 to 255) to the RS-232 port. On the PC side, my Java program keeps reading the serial port, and it works for values 0-127 and 160-255. But if the byte value is between 127 and 160, it is not received correctly; I get something like 375, 8224, or even 65532 (no obvious pattern, but more likely to be 300+, 700+, 8000+ or 65500+). I have been trying very hard to solve this, and I think the problem is due to the PC software. I am feeling quite frustrated, and any suggestion would be much appreciated.
    P.S.: I used BufferedReader.read() in my program.
    --mengyao

    fumengyao wrote:
    > p.s.: i used BufferedReader.read() in my program
    This may not matter in your case, but don't use a Reader; use a Stream (BufferedInputStream, maybe).
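    That advice is Java-specific (read the raw InputStream rather than a character Reader), but the same principle in Python with pySerial shows the point: read bytes, never decoded characters (port name and baud rate are assumptions):

    ```python
    import serial   # pySerial

    ser = serial.Serial("COM1", baudrate=9600, timeout=1)   # settings assumed
    while True:
        b = ser.read(1)            # one raw byte, value 0-255
        if not b:
            break                  # timeout, nothing more to read
        print(b[0])                # prints 0-255, never mangled by character decoding
    ser.close()
    ```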

  • Bytes at serial port are unreadable?

    I am trying to communicate with an Omega HH42 digital thermometer. I receive bytes at the port, but they do not show up in the display. I have tried the '\' Codes display, Hex display, etc. Has anyone seen this before? I am not new to serial communication, but this has me stumped.

    Hey Carl,
    I'm trying to serially communicate with the HH42 as well.
    Do you still know what combination of timing and termination characters that you had to use to read temperature?
    Thanks

  • Write to Serial Port without splitting string data

    Hi, all, thank you for your help in advance.
    Problem:
    I am trying to write to the serial port with the VISA Write function. Somehow the string I try to send is split before it is sent. For example, if I want to send "127", it sends "7", "2", "1". I don't know if there's any way to configure the function so that it sends out the whole string at once. I use the return count to indicate how many times it splits the data, so "127" now returns "3" (sent three times; I would like it to return "1", so that "127" is sent whole).
    Project:
    I am working on an application where a DC motor is controlled by a controller talking to the PC's serial port. "127" stands for its maximum power. The controller divides the power into 128 steps, so I need to input a number from 0 to 127 to command the speed.
    Any help or suggestion will be appreciated!

    Thanks for the prompt replies.
    About Number/ASCII
    I am using the ATmega128 controller chip to read the signals sent from the computer's serial port; it then sends signals to the motor controller. The ATmega chip reads the ASCII string and converts it to a hexadecimal number, sending that number to the motor controller. I can program the ATmega chip so that it either translates the ASCII string into hex as mentioned or accepts it as is. Either way, I want it to read the two bytes of information at once (00 to 7F).
    If VISA serial write can send only one byte at a time, then I may have to program the chip so that it buffers the readings. I have tried using the number/hex converter and the number/string converter; in either case, the fact that VISA Write sends one byte at a time hinders the programming. For example, I defined the numbers 1 to 5 to represent 20% to 100% power output in 20% increments. Then I defined "10" as "90%" power, but it reads "1" and "0" separately, so the actual output is "20%" then "0%".
    I used the example VI provided by NI, Advanced Serial Write and Read (attached here for convenience). Not all of the modifications I made are saved.
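    One way to guarantee the controller sees a single byte per command is to send the value as one raw byte rather than its ASCII digits -- a Python/PyVISA sketch (the resource name is an assumption):

    ```python
    import pyvisa

    rm = pyvisa.ResourceManager()
    motor = rm.open_resource("ASRL1::INSTR")

    power = 127                       # 0-127, maximum power per the controller's protocol
    motor.write_raw(bytes([power]))   # sends the single byte 0x7F, not the characters "1", "2", "7"

    motor.close()
    rm.close()
    ```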

  • Serial VISA 'Write' -why is it slow to return even with large buffer?

    Hi,
    I'm writing a serial data transfer code 'module' that will run 'in the background' on a cRIO-9014.  I'm a bit perplexed about how VISA write in particular seems to work.
    What I'm seeing is that the VISA Write takes about 177ms to 'return' from a 4096 byte write, even though my write buffer has been set to >> 4096.
    My expectation would be that the write completes near instantly as long as the VISA driver available buffer space is greater than the bytes waiting to be written, and that the write function would only 'slow down' up to the defined VISA timeout value if there was no room in the buffer.
    As such, I thought it would be possible to 'pre-load' the transmit buffer at a high rate, then, by careful selection of the time-out value relative to the baud rate, it would self-throttle once the buffer fills up?
    Based on my testing this is not the case, which leaves me wondering:
    a) If you try to set the transmit buffer to an unsupported value, will you get an error?
    b) Assuming 'yes' to a, what the heck is the purpose of the serial write buffer? I see no difference running with serial buffer size == data chunk size and serial buffer size >> data chunk size??
    QFang
    CLD LabVIEW 7.1 to 2013

    Hi, I can quickly show the low-level part as a PNG. It's a subVI for transferring file segments. Some things, like the thin 'in-line' VI with (s) as the icon, were added to help me look at where the hold-up is. I cropped the image to make it more readable; the cut-off left and right sides are just the input and output clusters.
    In a nutshell, the VISA Write takes as much time to 'return' as it would take to transfer x bytes at y baud rate. In other words, even though there is supposed to be a (software or hardware) write and read buffer on the COM port, the VISA Write function seems to block until the message has physically left the port (or it writes to the buffer at the same speed the buffer empties out of the port). This is very unexpected to me, and it is what prompted me to ask what the point of the write buffer is in the first place. The observations are on a 9014 RT target's built-in serial port; I'm not sure if the same is observed on other targets or other OSes. [edit: The observation holds even when transmitting block sizes of, say, 4096 with a buffer size of 4096, 2*4096, or 10*4096, etc. I also tried smaller and larger block sizes with larger-still buffers. I was able to verify that the buffer-resize function does error out if I give it an insane buffer size request, so I'm taking that to mean that when I assign e.g. a 4 MiB buffer space with no error, the write buffer actually IS 4 MiB, but I have not found a property to read back what the HW buffer is, so all I have to base that on is the lack of an error during buffer size setting. /edit]
    The rest of the code is somewhat irrelevant to this discussion; however, to better understand it, the idea is that the remote side of the connection will request various things, including a file. The remote side can request a file as a stream of messages each of size 'Block Size (bytes)', or it can request a particular block (for handling e.g. re-transmission if the file's MD5 checksum does not match). The other main reason for doing block transfers is that VISA Write hogs a substantial amount of CPU, so if you were to attempt to write e.g. a 4 MiB file out the serial port, assuming your VISA time-out is sufficiently long for that size of transfer, the write would succeed, but you would see ~50% CPU from this one thread alone, and (depending on baud rates) it could remain at that level for a very long time. So, by transferring smaller segments at a time, I can arbitrarily insert delays between segments to let the CPU sleep (at the expense of longer transfer times). The first inner case shown, which opens the file, only runs for new transfers; the open file ref is kept in a shift register in the calling VI. The 'get file offset' function after the read was just something I was looking at during (continued) development, and it is not required for the functionality that I'm describing.
    QFang
    CLD LabVIEW 7.1 to 2013
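    A quick way to check whether a write returns immediately or blocks for the physical transfer time is to time it and compare against bytes x 10 / baud (a Python/PyVISA sketch on a desktop OS; behaviour on a cRIO target may differ):

    ```python
    import time
    import pyvisa

    rm = pyvisa.ResourceManager()
    port = rm.open_resource("ASRL1::INSTR")
    port.baud_rate = 230400                         # assumed
    payload = b"\x55" * 4096

    expected = len(payload) * 10 / port.baud_rate   # ~10 bit times per byte (start/stop bits)
    t0 = time.perf_counter()
    port.write_raw(payload)
    elapsed = time.perf_counter() - t0

    print(f"write returned in {elapsed:.3f} s; full transfer takes ~{expected:.3f} s")
    # If elapsed is close to expected, the write blocks until the data physically leaves
    # the port, no matter how large the transmit buffer was set.

    port.close()
    rm.close()
    ```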

  • Slow Baud Rate serial port (VISA)

    The latest version of LabVIEW, 7.1, has the serial functions incorporated into VISA resources, and it is not possible to work with a baud rate lower than 110 bps. I have a large set of applications that work at 5 bps. It's a serial protocol that sends a byte, e.g. AA (hex), over the serial line at 5 bps, and afterwards the rate is increased.
    I tried to use a DLL (CIN function), but I think it is impossible because of how the resource is declared in the C code, so I'm thinking of using a dedicated COM DLL with the Call Library Function node.
    I'd like to know if somebody has a tip or has had a similar problem.
    Thanks in advance
    Fabrizio
    Test Engineer

    Matthias Müller writes:
    > Hello,
    > I'm using LabView to controll a spektrometer through the serial port. I
    > use VISA for the communication with the device. Unfortunately, the
    > device is always in 9600baud mode after power on. So I have to change
    > the baud-rate each time by a command i send to the device. So I open a
    > VISA session in 9600 mode and communicate with the divice to set it to
    > 57600baud. After that, i have to reset my local serial port to the same
    > baud-rate. I do this with an property-node, where i change 'Serial Baud
    > Rate'.
    > Unfortunately, after I did this, the vi's that want to communicate after
    > the reset of the baud-rate stop with an error:
    > -1073807298
    > VI_ERROR_IO
    > Could not perform read/write operation because of I/O error.
    >
    > (I try to write to the serial port after change of the baud-rate)
    >
    > I would be very glad, if someone could give me an advice, why it doesn't
    > work, or how to make it work.
    > Thank you a lot.
    > Matthias
    Matthias,
    my first approach would be to close the first VISA session after the
    property node. Data dependency is achieved by the error cluster feed
    into a new session with the new properties.
    IMHO you've discovered another bug in serial VISA.
    HTH,
    Johannes Nieß
    P.S.: What brand/model of spectrometer are you programming for?
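    Johannes's suggestion in text form: talk at 9600, send the speed-change command, then close that session and open a new one at the new rate before any further I/O (a Python/PyVISA sketch; the device commands are placeholders):

    ```python
    import pyvisa

    rm = pyvisa.ResourceManager()

    spec = rm.open_resource("ASRL1::INSTR")
    spec.baud_rate = 9600                  # the instrument always powers up at 9600
    spec.write("BAUD 57600")               # placeholder for the device's real speed-change command
    spec.close()                           # close the 9600 session first, as suggested above

    spec = rm.open_resource("ASRL1::INSTR")
    spec.baud_rate = 57600                 # reopen at the new rate before further I/O
    print(spec.query("*IDN?"))             # placeholder query
    spec.close()
    rm.close()
    ```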

  • Warning 1073676424 from VISA Set I/O Buffer Size.vi on a serial port

    I am porting an application from LabVIEW 6.1 on Windows to LabVIEW 7 on OS X (Mac). It was very painless except some GUI modifications.
    The application involves reading 30 KB of data from an instrument through an RS-232 serial port. I found that the application misses data whenever the computer is busy. The problem came down to the unchanged buffer size.
    Attempting to change the buffer size of a serial port with "VISA Set I/O Buffer Size.vi" fails with warning 1073676424 (The specified I/O buffer is not supported). Even the example VI from the NI web site, "Advanced_Serial_Write_and_Read.vi", gives the same warning.
    I wonder what I am missing.

    Under the hood VISA is using the POSIX serial interface for Mac OS X (same as for Linux and Solaris). This interface does not support changing the buffer size. Hence, the buffer size is fixed to the internal OS buffer size. The only thing that changing the buffer size will do (for the out buffer) is to have VISA not flush the data after every write. This is a limitation in the serial API for Mac OS X. Therefore, VISA reports a warning.
