About VISA read

I wrote different values to different registers using VISA Write, then connected a VISA Read after the last VISA Write. How can I tell which register the returned read buffer comes from?

Hi,
this is the capture:
the instrument is an Emcore TTX 1994, and the application note is attached. I only want to read the value from a certain register, but there are many VISA Writes chained before the VISA Read, so I can't tell which register's value the VISA Read returns.
Attachments:
11FM1046B - TTX1994 application note.pdf ‏4150 KB
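
A write only sends data; nothing in a later read identifies which register it belongs to. The usual cure is to pair each register query with its own read, so the buffer you get back can only belong to the query you just sent. Below is a minimal sketch of that pattern in Python/pyvisa terms (the resource name and the "REG:..." command strings are made-up placeholders; the real syntax comes from the TTX1994 application note):

```python
# Pair every register query with its own read instead of issuing many
# writes followed by one read at the end.
import pyvisa

rm = pyvisa.ResourceManager()
inst = rm.open_resource("GPIB0::10::INSTR")  # assumed resource name

# Plain writes can be batched; nothing comes back from a write.
inst.write("REG:WRITE 0x01,100")   # hypothetical command syntax
inst.write("REG:WRITE 0x02,200")

# To read a specific register, send its query and read immediately:
# the reply now in the buffer belongs to that query alone.
value = inst.query("REG:READ? 0x02")  # query = write + read, back to back
print(value)
```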

Similar Messages

  • Why do I get a "termination character was read" warning with VISA read and TCPIP?

    I am using VISA Reads over TCPIP raw sockets without issue with NI-VISA 3.0.1, but when I moved to NI-VISA 4.4 I started getting timeout errors. The timeout errors went away when I set the termination character enable property to true (which seemed to be the default in NI-VISA 3.0.1), but now I get a warning stating that the "termination character was read".
    Can I disable this warning? Can I set the termination character enable to default? How can I get rid of this annoying warning?

    Hey Dagwood,
    Unfortunately there isn't a way to globally change the attribute VI_ATTR_TERMCHAR_EN to VI_TRUE.  I spoke with R&D about possibly using the registry, and they say it's not accessible that way.  As to why this change was made, the developer who made the switch isn't around anymore, so I can't find his reasoning as an explanation for you.  The best thing to do in your code is to use the VISA Property Node to make the change while initializing; until that VISA Resource is closed, the change will remain set to the value you assign (see the sketch after this post).  I'm sorry we cannot provide any other solution for this inconvenience.  Also, if you feel this is a large burden on your programming practice, you can definitely submit a product suggestion for the ability to change global default values for VISA attributes.
    Thanks,
    David Pratt
    AES - Test Side Products
    NIC
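
A sketch of the per-session workaround in Python/pyvisa terms (the answer above is about LabVIEW's Property Node, but the attribute is the same): set VI_ATTR_TERMCHAR_EN right after opening, and it stays in effect until the session is closed. The TCPIP address is made up.

```python
import pyvisa
from pyvisa import constants

rm = pyvisa.ResourceManager()
inst = rm.open_resource("TCPIP0::192.168.1.10::5025::SOCKET")

# High-level: sets the termchar to \n and enables termination on reads.
inst.read_termination = "\n"
# Raw-attribute equivalent of the Property Node write:
inst.set_visa_attribute(constants.VI_ATTR_TERMCHAR_EN, constants.VI_TRUE)

print(inst.query("*IDN?"))
inst.close()  # the attribute change lives and dies with this session
```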

  • VISA read in exe file is not working

    Hi all,
    I am having problems with VISA Read in an exe file I created.
    I am trying to write to and read from a programmable power supply via RS232. The VI writes a command to the instrument to set the voltage level. It then writes another command, requesting the resulting current value. This value is then read by VISA Read.
    The VI works fine on the development PC, which has LabVIEW installed. The exe file also works fine on this PC. However, when I try to run the exe file on another PC (I've tried several), everything seems to work except for the VISA Read functions. The voltage level command is sent, as well as the ON and OFF commands, but the current is not read back.
    I guess there must be something I have missed in the installation. I am working in LabVIEW 8.5. I have created an installer and included
    Runtime Engine 8.5.1
    VISA runtime 4.5
    Is there something else I should do? I am really running out of ideas here...
    I hope someone has a clue about this!
    Clara 

    Clara-
    1. Have you verified that the COM port settings in Windows (check under Device Manager) match how you initialize them (baud, bits, parity, and flow control), and that these match the power supply's settings?
    2. Also, are you trapping an error message after the attempted Read command? (This will make it a lot easier to diagnose.)
    3. Do you programmatically close the VISA session at the end of the program?
    4. You can always post the code to see if the forum members will catch the problem.
    ~js
    2006 Ultimate LabVIEW G-eek.

  • Error -1073807253 occurred at VISA Read in transient SR830.vi VISA: (Hex 0xBFFF006B) A framing error occurred during transfer.

    Hi,
    I have written a program with LabVIEW to make transient C-V measurements using a Stanford Research SR830 lock-in amplifier. The program seems to be running fine, but sometimes it gives an error:
    Error -1073807253 occurred at VISA Read in transient SR830.vi
    Possible reason(s):
    VISA:  (Hex 0xBFFF006B) A framing error occurred during transfer.
    but if I press OK, the program starts running again. What might be the problem? BTW, I googled a bit and I see that in the LabVIEW topic "RS-232 Framing Error with HP 34401A Mulitmeter" by pkennedy32, this is what is said about framing errors:
    ""Framing Error" in an RS-232 context means a very specific thing - when the receiver was expecting a stop bit, the line was not in SPACE condition. This can be the result of:
    1... Baud rate mismatch (although other problems would likely crop up first).
    2... Data Length problem, If I send 8 data bits and you expect 7, the stop bit is in the wrong place.
    3... Parity setting mismatch - If I send 7 data bits + parity and you expect 7 data bits and no parity, the stop bit is in the wrong place.
    4... Mismatch in # Stop bits - If I send you 7 Data bits + parity + one stop bit, and you expect 7 data bits + parity + TWO stop bits, the second one might not be correct, although most devices do not complain about this.
    But I must say that this is the same COM port setting that I use to measure C-V hysteresis, and I never get this error there.
    I attach the program herewith for your kind perusal. Please help me resolve this issue.
    Thanks in advance.
    Attachments:
    transient SR830.vi ‏94 KB
    csac.vi ‏8 KB
    sr830 initialize1.vi ‏15 KB

    @Dennis Knutson you are right. I checked the read indicator in backslash mode, and instead of a \n it is sending \r, so I changed the \n in my write strings to \r. But if I keep the VISA Close inside my loop instead of outside as you suggested, the termination character appears to come in the middle of the read string instead of at the end, and since the read terminates at the \r, it displays some junk value before the \r. If I put the VISA Close outside the loop and play with the byte count at the read buffer, I see the whole read string with the \r at the end. But whenever the values are in exponential form (when close to zero), like 6.938839e-5, I always get a timeout error, whatever timeout I set at the VISA initialize. And subsequently, if I stop the program and run it again, the program hangs and I do not get any reading. Then after I close it again and restart, sometimes it hangs for some more or starts working. If I put an arbitrarily large byte count at the VISA Read, then I always get the "timeout expired before operation completed" error.
    @Ravens Fan I have removed the CSAC VI altogether and am taking the CH1 and CH2 readings separately instead of as one string. So, no more issues with that.
    I use the control at the delay so that I can choose how much delay I want to set, and I use the math operation because I am adding up the delay time to keep track of the time elapsed, because in the end I have to plot time vs. the CH1 and CH2 readings.
    I am not sure, but probably I am making some silly errors. Please help me out. (A termination-character sketch follows the attachment list.)
    Attachments:
    transient SR830-2.vi ‏103 KB
    sr830 initialize1.vi ‏15 KB
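
For reference, here is a minimal sketch (Python/pyvisa) of the fix discussed in this thread: declare \r as the instrument's termination character on both write and read, and let the read stop at the terminator rather than at a guessed byte count. The serial resource name is an assumption, and OUTR? 1 is the SR830's CH1 display query (check the manual for your setup):

```python
import pyvisa

rm = pyvisa.ResourceManager()
sr830 = rm.open_resource("ASRL1::INSTR",
                         baud_rate=9600,
                         write_termination="\r",  # commands end in \r
                         read_termination="\r")   # replies end in \r too
sr830.timeout = 5000  # ms; generous, but the read returns at the \r anyway

reading = float(sr830.query("OUTR? 1"))  # parses "6.938839e-5" style values
print(reading)
```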

  • VISA Read gets incorrect data from serial connection

    I am having difficulty using the VISA functions in LabVIEW to read data from a virtual COM port. Data is being sent from a serial-to-USB chip via a USB connection using OpenSDA drivers. I have a Python program written to read from this chip as well, and it never has an issue. However, when trying to achieve the same read in LabVIEW, I run into the problem of getting incorrect data in the read buffer from the VISA Read function.
    I have a VISA Configure Serial Port function set up with a control to select the COM port that the device is plugged into. Baud rate is the default 9600. The termination char of my data is a newline, which is also the default. Enable termination char is set to true. A VISA Open function follows this configuration and then feeds the VISA Resource Name Out into a while loop, where a VISA Read function displays the data in the read buffer. The byte count for the VISA Read is set to 20 so I can read more of the erroneous data; actual data will only be 6-12 bytes. The while loop has a wait function, and no matter how much I slow down the readings I still get incorrect data (I have tried 20 ms through 1000 ms).
    The data I expect to receive in the read buffer from VISA Read is in the form of "0-255,0-255,0-255\n", like the following:
    51,93,31\n
    or
    51,193,128\n
    And occasionally I receive this data correctly; however, I intermittently (sometimes every couple of reads, sometimes a couple of times in a row) get incorrect readings like these:
    51,1\n
    51,193739\n
    \n
    51,1933,191\n
    51,,193,196\n
    51,1933,252 
    51,203,116203,186\n
    Where it seems like the read data is truncated, missing characters, or has additional characters. Looking at these values, however, it seems like the read was done incorrectly because the bytes seem partially correct (51 is the first number even in incorrect reads).
    I have searched but haven't found a similar issue, and I am not sure what to try from here. Any help is appreciated. Thanks!
    Attachments:
    Serial_Read_debugging.vi ‏13 KB

    The first thing is that none of the error clusters are connected, so you could be getting errors that you are not seeing. Are you sure about the comm parameters? Finally, I have never had a lot of luck looking for termination characters. You might want to just capture data and append each read into one long string, just to see if you are still seeing this strangeness (a capture sketch follows this post).
    What sort of device is returning the data? How often does it spit out the data? How much distance is there between it and your computer? Can you configure it to append something like a checksum or CRC to the data?
    Mike...
    Certified Professional Instructor
    Certified LabVIEW Architect
    LabVIEW Champion
    "... after all, He's not a tame lion..."
    Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps
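
Mike's capture-and-append test can be tried in a few lines. A rough sketch in Python/pyvisa (the original VI is LabVIEW; the port name and timings here are assumptions): drain whatever bytes are at the port, append them to one long string, and only split on newlines afterwards to see whether the records themselves are intact.

```python
import time
import pyvisa

rm = pyvisa.ResourceManager()
port = rm.open_resource("ASRL3::INSTR", baud_rate=9600)  # assumed port

captured = ""
for _ in range(200):                  # ~10 s capture window
    n = port.bytes_in_buffer          # bytes currently waiting at the port
    if n:
        captured += port.read_bytes(n).decode("ascii", errors="replace")
    time.sleep(0.05)

for record in captured.split("\n"):
    print(repr(record))               # inspect each "r,g,b" line for damage
```

If the accumulated string splits into clean records, the data itself is fine and the problem is in how the individual reads are being terminated.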

  • Read in Binary form in VISA read function

    Dear All,
    I have connected my device to the serial port, and the data read from the buffer is stored in text format. I want to view the data in binary format.
    Actually, I have performed the same function in Visual Basic. There also, if I view the data in text format, it shows some junk values, but if I view the data in binary, it shows the actual data coming from the instrument.
    I don't know how to modify the VISA Read function. Can anyone please tell me how I can read the data in binary format?
    Thanks
    Ritesh

    Oops!  No it didn't.
    LabVIEW 2012 (if that makes a difference)
    I'm using the VISA Read function to take ADC readings in from a microcontroller.  The VISA Read function outputs the data as a string.  Easy to convert the string to U8, either with the conversion function or type cast function, and works great except for a tiny corner case when the ADC reading is zero.  The VISA Read function treats the 8-bit zero reading as a null character and strips it out.
    Apparently, since this is done by the VISA Read function as it's building the string, type casting and or converting the output string from the VISA Read function doesn't "bring the zeros back".
    I've tried setting the VISA property "Discard NUL Characters" to false, and that didn't seem to help.
    My current workaround is just to never have the micro send a zero ADC reading. 
    Anyway, I'm a LabVIEW noob, so while this isn't essential to my project, I remain curious about how to send LabVIEW serial data that isn't automatically treated as characters and thrown into a string with all the zeros stripped out (see the sketch below).
    Regards,
    Who
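
The usual way around the NUL-stripping is to treat the reply as raw bytes rather than a display string; in LabVIEW terms that is the same data you get by type casting the read string to a U8 array. A sketch in Python/pyvisa (port name and byte count are assumptions):

```python
import pyvisa

rm = pyvisa.ResourceManager()
mcu = rm.open_resource("ASRL4::INSTR", baud_rate=9600)

raw = mcu.read_bytes(16)   # bytes object: nothing stripped, zeros included
adc = list(raw)            # unsigned 8-bit values, e.g. [0, 255, 17, ...]
print(adc)
```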

  • Memory leak in Real-Time caused by VISA Read and Timed Loop data nodes? Doesn't make sense.

    Working with LV 8.2.1 Real-Time to develop a host of applications that monitor or emulate computers on RS-422 buses.   The following screen shots were taken from an application that monitors a 200 Hz transmission.  After a few hours, the PXI station would crash with an awesome array of angry messages...most implying something about a loss of memory.  After much hair pulling and passing of the buck, my associate, while watching the available memory on the controller, was able to discover that memory loss was occurring with every loop containing a VISA Read and error propagation using the data nodes (see Memory Leak.jpg).  He found that if he switched the error propagation to regular old-fashioned shift registers, then the available memory was rock-solid (a la No Memory Leak.jpg).
    Any ideas what could be causing this?  Do you see any problems with the way we code these sorts of loops?  We are always attempting to optimize the way we use memory on our time-critical applications and VISA reads and DAQmx Reads give us the most heartache as we are never able to preallocate memory for these VIs.  Any tips?
    Dan Marlow
    GDLS
    Attachments:
    Memory Leak.JPG ‏136 KB
    No Memory Leak.JPG ‏137 KB

    Hi thisisnotadream,
    This problem has been reported, and you seem to be exactly reproducing the conditions required to see this problem. This was reported to R&D (# 134314) for further investigation. There are multiple possible workarounds, one of which is the one that you have already found of wiring the error directly into the loop. Other situations that result in no memory leak are:
    1. If the Bytes at Port property node is not there, and a read just happens in every iteration with the resulting timeouts ignored.
    2. If the case structure is gone, and the code just blindly checks the bytes at port and reads every iteration.
    3. If the Timed Loop is turned into a While Loop.
    Thanks for the feedback!
    Regards,
    Stephen S.
    National Instruments
    Applications Engineering

  • How to read three parameters, for example temperature, current and voltage, from the same port with VISA read and separate them in a table

    Hi, I want to read parameters with VISA Read from three sensors connected to a PIC16F877A, using an XBee module. But I want to put each of those parameters (temperature, voltage and current) in a table. It's the first time I've used LabVIEW, so I don't know if there is a solution for my problem; if anyone has any idea, please help me. Thanks in advance. 

    [email protected] wrote:
    Hi, I want to read parameters with VISA Read from three sensors connected to a PIC16F877A, using an XBee module. But I want to put each of those parameters (temperature, voltage and current) in a table. It's the first time I've used LabVIEW, so I don't know if there is a solution for my problem; if anyone has any idea, please help me. Thanks in advance. 
    The short answer is: "Yes, of course."  But you are going to have to do the legwork and learn LabVIEW basics before we can offer meaningful help (a parsing sketch follows this post).
    Bill
    (Mid-Level minion.)
    My support system ensures that I don't look totally incompetent.
    Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
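
To give the question a concrete shape, here is a sketch in Python/pyvisa of the parsing step (the record format "temperature,voltage,current\n" and the port name are guesses; the real framing depends on what the PIC16F877A firmware sends over the XBee link):

```python
import pyvisa

rm = pyvisa.ResourceManager()
link = rm.open_resource("ASRL5::INSTR", baud_rate=9600,
                        read_termination="\n")

table = []  # one row per sample: [temperature, voltage, current]
for _ in range(10):
    fields = link.read().split(",")
    if len(fields) == 3:  # skip malformed records
        table.append([float(f) for f in fields])

for temperature, voltage, current in table:
    print(f"{temperature:8.2f} {voltage:8.2f} {current:8.2f}")
```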

  • Will LabVIEW VIs developed on a Macintosh run on a Windows PC? How about vice versa?

    Will LabVIEW VIs developed on a Macintosh run on a Windows PC?  How about vice versa?
    Thanks,
    Dennis U

    As long as you don't use platform-specific tools (such as IMAQ Vision, ActiveX...) or advanced DAQmx functions, LabVIEW portability between platforms is excellent.
    On Macintosh, LabVIEW works fine with Virtual PC, but is very slow (5-10 times slower...). Try it and see if it suits your needs.
    Chilly Charly    (aka CC)
             E-List Master - Kudos glutton - Press the yellow button on the left...        

  • How to run a VI continuously when a VISA read timeout happens

    Hello,
    I am using LabVIEW 2010 to read an Agilent 6000 series oscilloscope. The oscilloscope reads data from another experimental machine which fails occasionally due to sample failure. When the machine fails, it stops sending signal to the oscilloscope, which leads to a VISA read timeout error, and the whole VI is terminated.  Is there a way that I can keep the VI running when the error happens? For example, when the error happens, it stores the error in the error wire; when the VI reads the error code from the wire, it sends a command to stop my experimental machine and then stops the VI. 
    So far, it just stops while executing the VISA Read function. In that case, why do we have error in and error out? The VI simply stops when the error happens. 
    Thanks
    Lawrence

    I'll start off with automatic error handling. By default, LabVIEW enables automatic error handling. So consider a case like the one below:
    If, let's say, the VISA Read returns an error but you did not pass the error information to VISA Close, what will happen is that LabVIEW will highlight the VISA Read (since the error comes from that function), pause execution at the VISA Read, and an error dialog box will appear. The dialog box will prompt you to either continue (move on to the VISA Close) or stop (LabVIEW will stop the VI at the VISA Read).
    Now, consider another scenario like this:
    By passing the error information from one function to another, if VISA Read returns an error, that error will pass to VISA Close and finally to Simple Error Handler.vi. Simple Error Handler will then generate the error dialog box after all the VISA operations are completed. This is called manual error handling (it is enabled by just wiring the error information from one function to the next, terminating at Simple Error Handler). A sketch of the same idea in text-based form follows this post.
    http://www.ni.com/gettingstarted/labviewbasics/handlingerrors.htm
    As for VISA Close, here is the information about VISA Close.
    http://zone.ni.com/reference/en-XX/help/371361K-01/lvinstio/visa_close/
    See that button on the left side of this post...
    If you feel my post is helpful, all you need is just (at most) 2 seconds to click that button, to show your appreciation. Thank you~~
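
The same manual-error-handling idea, sketched in Python/pyvisa: trap the read timeout, send a stop command, and shut down cleanly instead of dying at the read. The scope address, the SCPI query, and the experimental machine's stop command are all assumptions.

```python
import pyvisa
from pyvisa import constants

rm = pyvisa.ResourceManager()
scope = rm.open_resource("USB0::0x0957::0x1755::MY1234::INSTR")
scope.timeout = 2000  # ms

try:
    while True:
        data = scope.query(":WAVEFORM:DATA?")  # poll the oscilloscope
        # ... process data ...
except pyvisa.errors.VisaIOError as e:
    if e.error_code == constants.StatusCode.error_timeout:
        print("Read timed out: the machine has stopped feeding the scope.")
        # machine.write("STOP")  # hypothetical command to halt the machine
    else:
        raise
finally:
    scope.close()
```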

  • Visa Read problem from a PIC24's UART port

    Thanks for taking the time to read my post.
    I am using LabVIEW 8 on a Win XP PC (3.4 GHz CPU, 2 GB RAM).
    I am using a PIC24 microcontroller (on the Explorer 16 development board), which I have programmed to acquire an AC signal at a rate of about 4 kHz. The PIC24 has a 10-bit ADC, and the acquired value is then padded to 16 bits in total (format: 000000xxxxxxxxxx).
    The serial settings for the PIC and my LabVIEW program are 115200, 8 data bits, no parity, 1 stop bit.
    In order to send the acquired value to my PC's serial port (remember, UART = 8 bits of data only), I divide the 16-bit word into MSB and LSB. I then send the MSB and LSB 8-bit values one after the other.
    The total data rate for this communication is about 64 kbps.
    The problem is that when I run the code I have written in LabVIEW, the CPU usage shoots up to 70% and I also see buffer overruns (a non-continuous sine wave). If I add a time delay in my while loop, the buffer overruns increase.
    Also, if I try to use the 'Bytes at Port' property, the data I get is meaningless.
    I would be grateful if someone could look at my code and give me some suggestions as to how I could make the 'VISA Read' VI more efficient.
    Regards
    Alex
    Attachments:
    Serial Client for PIC24.vi ‏101 KB

    Dear all,
    I do not know if you have been following my post, but I am still getting buffer overruns (a non-continuous sine wave) when using VISA serial read.
    The only way to avoid this is by making the VISA Read VI read 2 bytes at a time (with no time delay in the main while loop). However, when I do this, the CPU usage shoots up to around 60% (which is something I would expect anyway, as the main while loop is executing as fast as possible). A framing sketch follows this post.
    I have attached the working code below and would appreciate ANY comments, BIG or small.
    I am still puzzled as to why, when I connect the 'Bytes at Port' property node, the data I get is not correct.
    I have gone through the LabVIEW examples, as well as the LV Basics 1 course examples (which are similar), and I have also looked in the LabVIEW for Everyone / LabVIEW Graphical Programming books.
    However, I have found the examples to be far too simple for what I am trying to achieve.
    I am seriously thinking of purchasing the LV Instrument Control Self-Paced Course, but I am not quite certain it would help me much. I have read the course outline provided by NI, but it did not provide me with more valuable information.
    Can anyone who has done this course advise me as to whether the material contains info on high-speed acquisition using VISA serial read/write?
    The course is slightly pricey at around £240 (with academic discount), and as far as I understand, the course examples (might) use two HP instruments (a multimeter and a function generator) and a Tektronix oscilloscope, none of which I have direct access to.
    Regards
    Alex
    Attachments:
    Serial Client PIC24-Serial - Read 2-bytes.vi ‏42 KB
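
For comparison, here is the two-byte framing in Python/pyvisa form: read a large block of raw bytes in one call and rebuild the 16-bit samples, instead of doing one 2-byte read per loop iteration (the block read is what keeps CPU usage down). The port name and block size are assumptions, and the sketch assumes the stream stays aligned on the MSB:

```python
import pyvisa
from pyvisa import constants

rm = pyvisa.ResourceManager()
pic = rm.open_resource("ASRL6::INSTR", baud_rate=115200, data_bits=8,
                       parity=constants.Parity.none,
                       stop_bits=constants.StopBits.one)

BLOCK = 512                              # samples per read
raw = pic.read_bytes(BLOCK * 2)          # 2 bytes per 16-bit sample
samples = [(raw[i] << 8) | raw[i + 1]    # MSB first, then LSB
           for i in range(0, len(raw), 2)]
print(samples[:10])                      # 10-bit ADC values, 0..1023
```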

  • VISA READ timeout error - multiple GPIB resources

    Hi,
    I am working on a GPIB network of 3 instruments (optical attenuator, fiber amplifier, spectrum analyzer), controlled using VISA sessions in LabVIEW. When run separately, the three corresponding VIs (which are located in three different LabVIEW projects) work as expected. However, when they are run simultaneously, one of them gives VISA READ -1073807339 timeout errors. These errors seem to happen when another instrument is sending / receiving data / instructions at the same time as it is.
    The exact context for these errors is either:
      -  another VI is running, which includes sending several queries and reading the answers every 100 ms,
      - upon starting the failing VI, I get a timeout error from one of the first subVIs containing a VISA READ operation to be executed (sometimes initialize.vi (in state 1), sometimes one of the subVIs run from the Idle state (state 0) upon timeout of the event structure).
    or:
      - the failing VI is running,
      - upon starting another VI, which includes repetitively sending queries and reading answers, the failing VI throws an error from one of the first subVIs containing a VISA READ operation to be executed (one of the subVIs run from the Idle state (state 0) upon timeout of the event structure).
    What I tried:
      -  gradually increasing the delay between the VISA WRITE and READ operations for the relevant instrument (from 10 ms to 10 s), to no avail. More puzzling is my observation that, when this VI is run alone, increasing the WRITE / READ delay leads to the same timeout errors. I could not find any mention of such behavior through Google and forum searches. Hopefully this can point to a solution to the main issue,
      - switching between synchronous and asynchronous VISA WRITE / READ operations,
      - reordering the GPIB network from a star topology to a linear topology (all three instruments do have different GPIB addresses, in case anyone wonders).
    My thoughts:
    It seems to me that the error is related to a delay introduced between a VISA query and its associated read operation by the transmission of another query to another instrument on the same GPIB network. However, I have no idea why transmitting a query to another instrument would introduce such a delay, or why this delay would lead to a timeout error (and only from one instrument, while the write / read VIs in each driver are basically the same). Hopefully a more experienced LabVIEW user will be able to shed some light on my issue.
    Included is the project containing the failing VI (main.vi) and the custom driver it makes use of. 
    Attachments:
    Manlight EDFA Control - Failing VI.zip ‏73 KB
    EXFO FVA 3150 - Driver.zip ‏348 KB

    Thank you for your input crossrulz. This is indeed what I realized while looking into semaphores.
    Let me first make our architecture clear so that I'm 100% certain we are talking about the same thing. We have an NI GPIB-USB-HS GPIB controller connected to a linear GPIB network of three instruments. I was convinced that a network allowing up to 15 instruments to be connected at the same time would allow for parallel operation, but it seems I was mistaken.
    I like how semaphores work, and I don't see any obstacle to gathering all these VIs into one project. My conception of a LabVIEW project was that one project was intended to gather the subVIs, libraries and controls used in a more complex "main" VI, which would ultimately be made into a single standalone executable. It seems I was mistaken there too, and that a single LabVIEW project should be used to gather several standalone VIs designed to work together, along with their subVIs. Hopefully I've got it right now.
    The other option that you suggest for accessing the same GPIB bus from different projects (having a TCP control interface running and controlling communications through the bus) might indeed be a bit overkill for what I'm trying to achieve, and I would need to spend too much time learning and developing it.
    A last option I looked into is the VISA Lock Async VI, but I don't understand yet whether 1. locking the VISA session for one instrument on the bus would lock the entire bus; 2. it would be possible to use this approach with VIs running in different projects; and 3. it would just yield errors when one VI tries to access the locked GPIB bus, instead of making it wait until the resource is available (see the sketch below).
    I'll look further into these options today, but would appreciate any additional information / advice you might have. Thank you.
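
On point 3: a VISA lock normally makes the second session wait, up to a lock timeout, rather than fail outright. Here is a sketch of the idea in Python/pyvisa terms; the lock is taken per session (per instrument), and what prevents the timeouts is that each program serializes its own write/read pair, so another query can no longer land in between. The GPIB address and the command are assumptions:

```python
import pyvisa

rm = pyvisa.ResourceManager()
edfa = rm.open_resource("GPIB0::3::INSTR")

# lock_context acquires an exclusive lock, waiting up to the given timeout
# if another session holds it, and releases it when the block exits.
with edfa.lock_context(timeout=2000):  # ms
    edfa.write("POW?")                 # hypothetical query
    answer = edfa.read()               # nothing interleaves inside the lock
print(answer)
```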

  • VISA read timeout

    I don't know why I'm getting a timeout error at the VISA Read function (the error says "timeout expired before operation completed"). I made sure the LabVIEW serial port and my device's serial port are configured exactly the same. I even tested HyperTerminal with the same port settings as in LabVIEW and it works perfectly, but my LabVIEW program gives me an error. My device terminates each command it sends with a carriage return, so I set up VISA READ to stop reading data when it encounters the CR character. I also made sure the carriage return was in fact being sent by my device. Any thoughts?

    I think it's not seeing your termination character on the read and/or write.  When I send a string to the serial bus, I always use "Concatenate Strings" to add the appropriate termination character(s) to the end.  Assuming your read termination is CR/LF, I've always had better luck stopping on the LF and stripping the CR with "Trim Whitespace" (sketched below).
    Also, don't be afraid of making the timeout a second or two.  If everything is going right, it will (normally) give the serial port plenty of time to complete its operation, but if things are going wrong, it will give you time to "notice" that something is wrong.
    Bill
    (Mid-Level minion.)
    My support system ensures that I don't look totally incompetent.
    Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.
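
Bill's recipe, sketched in Python/pyvisa (his answer is about LabVIEW primitives; the resource name and command are assumptions): append the terminator on every write, stop the read on the LF, strip the leftover CR, and give the timeout a couple of seconds of slack.

```python
import pyvisa

rm = pyvisa.ResourceManager()
dev = rm.open_resource("ASRL2::INSTR",
                       write_termination="\r\n",  # added to every command
                       read_termination="\n")     # stop reading at the LF
dev.timeout = 2000  # ms: long enough to "notice" when something is wrong

reply = dev.query("MEAS?").rstrip("\r")  # the "Trim Whitespace" step
print(reply)
```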

  • Visa Read Overrun Problem

    Dear all,
    I am using the VISA Read function to read data coming from my hardware.
    The data is coming really fast: the hardware sends 1600 bytes every 10 ms.
    I am using an Event Structure (timeout event) to check whether 1600 bytes are at the port or not. The timeout is set to 1 ms; I even tried 0 ms.
    Baud rate = 3000000.
    But the program gives a buffer overrun problem when I try to read the data using VISA Read.
    I have set the buffer size to 64000, but with no success. The problem still persists.
    I have played around with the receive buffer of the COM port in Device Manager. I tried all the values from 64 to 1600, again with no success.
    Every time I run the program, I get an error at the VISA Read function stating, "A character was not read from the H/W before the next char arrived."
    And I am not doing any other processing, other than reading data from the port in the timeout event. 
    What could be the problem?
     Thanks,
    Ritesh 

    Ritesh,
    The over-run is still caused by the driver of the hardware.  What you are observing is that the driver is able to keep up with bursts of data coming into the port, so for the first few thousand bytes coming in there is no error.  However, after a while, the driver falls behind and cannot keep up (hence the over-run error).
    How does FTDI recommend that you use their ports in LV?  VISA may interact with the driver in a different way or set properties to different values.  All of this can have an impact on how the driver performs.  Also, the DLL that they provided may not return overrun errors the same way that VISA does.  The malformed string may just be missing data that was dropped because of an over-run error.
    You have not mentioned anything about the types of flow control that are available to you.  The only real way to prevent over-run errors in data-intensive communication is to have a way to hold off the data (flow control); see the sketch after this post.  Since the data will be interrupted for a fraction of a second, the driver will have a chance to get the data out of the hardware in time.  Communication will automatically resume once the data is read out of the hardware.
    Thanks,
    Steven T.
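
Steven's point in sketch form (Python/pyvisa): at 3 Mbaud the robust fix is hardware flow control, so the transmitter is held off while the driver drains its FIFO. The port name is an assumption, and the FTDI side must be wired and configured for RTS/CTS as well.

```python
import pyvisa
from pyvisa import constants

rm = pyvisa.ResourceManager()
port = rm.open_resource("ASRL7::INSTR", baud_rate=3000000)

# Enable RTS/CTS hardware handshaking on the VISA side.
port.set_visa_attribute(constants.VI_ATTR_ASRL_FLOW_CNTRL,
                        constants.VI_ASRL_FLOW_RTS_CTS)

frame = port.read_bytes(1600)  # one 10 ms burst per read
print(len(frame))
```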

  • VISA Read timing issues

    I am using RS232 to control an older model power supply (Oxford PS 120-10).
    I have successfully written several VIs that all work; the only problem is that VISA Read takes WAY too long. I'm talking tens of seconds to refresh. I need to have it refreshing in milliseconds, or at least tenths of a second, for the measurements we need. All of the VIs I have written have the same timing issue. 
    Attached is the most basic serial read/write VI. Is there any way to improve the read rate? Or might this just be an instrumentation issue? The strange thing is the write commands work almost instantaneously (I can see them on the instrument's display).
    Please help if you can. I've only been working with LabVIEW for a few weeks and am very much still in the learning process. 
    Thanks!
    Attachments:
    READandWRITE timing test.vi ‏14 KB

    Do you have the communications protocol for the power supply? If you do not have everything right, you will have problems with communications.
    Tens of seconds is a clue that you may be getting timeout errors, because the default timeout is 10 seconds. Try placing an indicator on the error out wire inside the loop (after the Read) to see if an error occurs on any iteration. The way you have the VI set up, you only see the error from the last iteration of the loop.
    You are writing a carriage return to the instrument. If it requires that, it almost certainly sends a carriage return with the response. (This is why I asked about the protocol). If the instrument sends a carriage return (or other termination character), then you should Enable Termination Character on the Configuration VI and set the termination character to the correct value. The default is line feed (hex A or decimal 10). A carriage return is hex D or decimal 13. You must wire the numeric value to the termination character input for any value other than the default. Then change the byte count value (at the Read input) to a number larger than the longest message the instrument will ever send, perhaps 100 or 500. The Read will end as soon as the termination character is received, regardless of the number of characters.
    I suspect that this is the problem: the instrument sends fewer than 10 characters in most messages but does send a termination character (see the sketch after this post).
    Lynn
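
Lynn's suggestion in Python/pyvisa form, for reference: enable the CR terminator (decimal 13), and the read returns as soon as the terminator arrives instead of sitting out the 10-second default timeout. The resource name and the "R1" query are assumptions (Oxford supplies use short Rn-style read commands; check the PS 120-10 manual):

```python
import pyvisa

rm = pyvisa.ResourceManager()
psu = rm.open_resource("ASRL1::INSTR",
                       write_termination="\r",  # the supply expects a CR
                       read_termination="\r")   # ...and replies with one
psu.timeout = 2000  # ms; no longer the limiting factor once termchar works

psu.write("R1")      # hypothetical parameter-read command
print(psu.read())    # returns at the CR, regardless of message length
```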
