Visa test panel in NI MAX ver 5.4.0 read timeout

I am using the VISA test panel in NI MAX ver 5.4.0 to connect to a CVI application in which I currently have a server socket up and running. My server accepts the connection from the VISA test panel fine, and when I write from the VISA side my application sees the data immediately. But when I then try to read, VISA times out, and only at that point does it display what I sent with the 'send' command. I cannot determine why VISA writes to my application fine, yet my reply is not displayed until the timeout occurs in the VISA tool. In my application I am doing a simple 'send' on a valid socket. I am not sure why, but if anyone has any answers I would appreciate it. Here is a snippet of my code in my main:
  char sendbuf[64];
  strcpy (sendbuf, "ACK\n");

  /* Initialize Winsock before any socket calls are made */
  iResult = WSAStartup(MAKEWORD(2,2), &WSAData);
  if (iResult != 0)
   return SOCKET_ERROR;

  if ((ServerSock = SockOpenServer(SERVER_PORT_NUMBER_TCP)) == SOCKET_ERROR)
   return SOCKET_ERROR;

  /* By now the socket should be successfully bound. */
  /* Wait for a client to connect.                   */
  AcceptedSock = SockWaitForAccept(ServerSock);

  /* Indicate the socket has been initialized and a client has been accepted */
  if (AcceptedSock > 0)
   isServerInitComplete = TRUE;
  else if (AcceptedSock == SOCKET_ERROR)
   runRDPServer = FALSE;

  while (runRDPServer)
  {
   BytesReceived = recv(AcceptedSock, ptrCh, BytesToRead, 0);
   if (BytesReceived > 0)
   {
    /* For now just send back an ACK to acknowledge the receipt of data */
    if (strncmp(ptrCh, "*IDN", 4) == 0)
     BytesSent = send(AcceptedSock, sendbuf, (int)strlen(sendbuf), 0);
    if (strncmp(ptrCh, "*ESE", 4) == 0)
     runRDPServer = FALSE;
   }
   else
   {
    /* recv() returned 0 (peer closed) or SOCKET_ERROR: stop serving */
    runRDPServer = FALSE;
   }
  }

Stephanie, I did go look at the link you provided this morning. That particular paper seems to deal with serial connections, whereas the app I have is TCP/IP, and that's where my problem is. I did find out today that when talking to my app using a LabVIEW TCP client there is no timeout on the receipt of data, whereas the VISA connection times out but still reads the buffer after the timeout occurs. So there is something in that VISA connection to my app, which is a server in C creating the socket. We are currently going to continue with a LabVIEW TCP interface to my app. Thanks for your input.

Similar Messages

  • Cannot send command via VISA test panel in MAX

    Hello all,
    I am having a small issue using the VISA test panel in MAX. I am trying to send a simple command to a device and then retrieve the response data. I am able to do this successfully via LabVIEW, but for whatever reason I cannot get a good response in MAX. The command that I am sending is simply a letter (the address of the device) followed by an end-of-line character (\r\n in ASCII). In MAX, I simply append the \r\n onto the string sent in the buffer (Send End on Writes == FALSE). All serial settings are the same as what I have in LV (19.2, 8-n-1). Does anyone have any thoughts? This is kind of frustrating, as I just want to do something simple and don't want to have to build a whole routine in LV to do this.
    Cheers, Matt
    Matt Richardson
    Certified LabVIEW Developer
    MSR Consulting, LLC
    Solved!

    Thanks, Broken.  I am getting a time out error.  Clearing the buffers doesn't seem to be making a difference. This is baffling to me.  I must be missing something fundamental.
    Something that I didn't add before is that I seem to be able to send the command via LabVIEW without a read, and then read it via MAX. This must have something to do with the way it is sending the command. Arrrrgh!
    Matt

  • MAX visa test panel viread error 0xBFFF0015

    I am running LabVIEW 8.5 and VISA 4.2. When I go to MAX and open the VISA test panel, viRead malfunctions. When I test it, it returns a timeout error (hex 0xBFFF0015). The viWrite tab also sometimes gives me the same timeout error. What can I do to correct this error?
    Also, in the LabVIEW program, we are using an RS-232 serial port to collect data. Using the VISA Configure Serial Port function, we have set the timeout to 10 seconds, and we have set the VISA read byte count to 1. In our code, we have three subVIs in a VISA session. The first one works fine; the second and third malfunction, because the bytes at the port show a value of 0. Is there any relation between this problem and the previous one? Thanks for any help!

    I do not know if you are familiar with the instrument; it is a CSAT3 3D sonic anemometer. I have attached the three subVIs, since I am not sure exactly what the command is. I am very new to LabVIEW, so I am not too familiar with the commands.
    The initialize VI is the first one executed, the second is the SA sync, and the read anemometer is the third one.
    Message Edited by Ning123 on 06-20-2008 10:22 AM
    Attachments:
    Read_Anemometer_steve.vi ‏153 KB
    Initialize_SA_Comm.vi ‏136 KB
    SA_sync_steve.vi ‏39 KB

  • Have to type each character twice when supplying commands in Open VISA test panel

    I have a set of commands to work with a system connected to the MS Windows PC. When I type the commands in Hyperterminal (with/without ) they yield the expected results.
    In LabVIEW, to get the expected results I MUST type each character in the command twice (except blank spaces) in the Open VISA test panel (viWrite/viRead). If the command is given the usual way (with each character typed only once), the result is always 'illegal command'. The same commands work fine in Hyperterminal.

    What confused Ashwin is that you said "In LabVIEW". OK, so you're using the MAX test panel. There's one thing that's not clear. You said "when I type the commands in Hyperterminal (with/without )". With/without what? How is your Hyperterminal session configured? Have you tried something like PortMon to see what's actually being sent from the serial port driver?

  • VISA (Serial port) commands fail in the VI, but work in VISA test panel

    Hi, I have an instrument which has a USB connection. When I connect it to a Windows 7 PC, it automatically picks up the driver and shows up as a "USB Serial Port (COM7)" (the manufacturer is FTDI).
    When I open up NI MAX, this device shows up as COM7 (ASRL7::INSTR) under "Serial and Parallel". And, when I click on Open VISA Test Panel and try the "*IDN?" command, it works OK.
    However, the VI (which just sends a VISA command "*IDN?") gives me a timeout error (0xBFFF0015) or a device/resource not present error (0xBFFF0011). Attached is a screenshot of the VI.
    Any ideas why?
    Thanks. 
    PS: I went through the process to create the VISA-USB driver, but that has some other issue, but I am trying to understand why this occurs. This device also has a GPIB port and when I use a USB-GPIB adapter, it works very well. 
    Solved!
    Attachments:
    problem_vi.JPG ‏24 KB

    Look up the examples that ship with LabVIEW. What you have there isn't quite complete for serial VISA. While GPIB will work great with what you have, serial VISA requires you to configure your serial port first and is a little more involved to read.
    Bill
    (Mid-Level minion.)
    My support system ensures that I don't look totally incompetent.
    Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.

  • BK Precision device works in VISA test panel but fails when using driver

    I am trying to control a BK Precision XLN power supply.  I have set it up as a TCPIP VISA instrument using sockets (port 5025).  It validates fine in NI MAX, and when using the VISA test panels I can write to and read from it (although I get a timeout error on the read if I specify too many bytes).  When I use the supplied instrument driver blocks in LV, VISA read commands always timeout and fail to return anything, regardless of how many bytes are specified.

    I don't think it is a VXI-11/LXI instrument; it did not autodetect, and I had to use the 'Manual Entry of Raw Socket' option to set up the device. I am sending a termination character on the write (\n). If I don't use a termination character on the read, it captures the data but then gives a timeout error while waiting for more bytes. If I do use a termination character on the read, it works as you would expect, except that some messages contain multiple termination characters (line feeds or carriage returns), so you have to perform multiple reads to get the whole message. (This is all in the test panels; I haven't gotten anything back using the VISA blocks in a VI.)
    Given that behaviour it seems more like what you would get from a telnet terminal interface, but even so I am still confused why I can talk to the device with the test panel and not with the VISA blocks in a VI.
    Thanks,
    Stearns

  • VISA Test Panel doesn't work on Non-Developer Environment

    I'm trying to make an executable to be installed on other machines. I have included VISA support and NI MAX along with the runtime environment software. In order to see how flexible my code needs to be, I'm trying to use NI MAX to see how to read/write to the different instruments. However, the VISA Test Panel button is completely grayed out and cannot be clicked. While it is not working, I've been using Hyperterminal, but I need more details than Hyperterminal can provide (but NI MAX can). I can see the COM Ports correctly and they can be written to/read from using LabVIEW (it's the output format that I need to know).
    I'm using LabVIEW 2013, but it also happened with LabVIEW 2012 SP1 f4.
    Thanks,
    Quevvy

    LabVIEW is definitely relevant in this case: it was installed using an installer BUILT IN LabVIEW.
    But installing the full version of NI-VISA worked.

  • Timeout error (Hex 0xBFFF0015) in NI VISA test panel when attempting to read from device

    Hello,
    I am attempting to control the set position of three daisy chained four-way actuator valves. They are VICI Valco and model # EUHA. I have them connected via RS-232 to USB into my computer. I was able to communicate with them when I sent some simple commands through hyperterminal and when I open MAX the devices appear and the panel says the devices are functioning properly. However, when I go into the test panel and try to run some default commands, I get the timeout error, 0xBFFF0015, when reading the command. Thank you for any help you can provide.
    Solved!

    Okay, I'm running it again today and I'm noticing that it's displaying results when I attempt to read from the valve, but it still tells me it timed out. Aside from the timeout error, both Hyperterminal and VISA provide comparable results. The two commands I have been trying to read from the valves are AM (to check what mode the actuator is set to) and CP (to check what position the valve is at).

  • How does the "Read Status Byte" work in the VISA test panel?

    I'm currently trying to replace an HP-85 with a modern computer running VB.NET, which will communicate with an HP 3421A. Unfortunately the 3421A was created before SCPI was created, so I believe the commands I am able to send are dictated by what the 3421A lists in the manual (http://exodus.poly.edu/~kurt/manuals/manuals/HP%20Agilent/HP%203421A%20Operation,%20Programming%20%20&%20Configuration.pdf).
    The commands I'm trying to use to troubleshoot are "DCV", which reads a voltage, and "SR", which returns 24 different registers, with register 1 being the status byte. When I write "DCV" and then read, it returns a correct voltage. When I write "DCV" and click "Read Status Byte" in the VISA panel, it tells me that data is available to be read, and when I then read, it returns a correct voltage. But when I write "DCV", then "SR", and then read, it returns 24 different bytes, with the first byte being the status register; it no longer recognizes the "DCV" command from before, and the status byte is now 0, which leads me to believe that "SR" is overwriting the "DCV" command from before.
    How can I mimic the functionality of the "Read Status Byte" button in the VISA panel? How does this work on instruments created before SCPI was implemented? Sending "*STB?\n" doesn't work. I am new to instrumentation so if I left out some necessary information please let me know. 

    Thank you for the suggestion. I found a solution by adding NationalInstruments.NI4882.dll and using the SerialPoll function.

  • NI-DAQmx task works in MAX or DAQ Assistant test panel but not in LabVIEW

    I am attempting to read a single AI channel from a PCI-6024E card via an SCB-68. I have created an NI-DAQmx Analog Input Voltage Task in MAX for this channel, sampling in continuous acquisition mode at 100 kHz, 10000 samples at a time, with RSE terminal configuration. If I use the Test feature from MAX, the channel acquires data as expected.
    In LabVIEW, I call this task using a DAQmx Task Name Constant. If I right-click this constant and select "Edit Task", the DAQ Assistant opens and I can use the Test feature from the DAQ Assistant to see that the data is still being acquired as expected.
    However, when I try to programmatically read this channel in LabVIEW using the VI "DAQmx Read (Analog Wfm 1Chan NSamp).vi", the VI returns a constant DC value of 500 mV, which I know is incorrect (I can monitor the signal across the two terminals in the SCB-68 with a DMM to know that the signal coming in varies as expected, and as I read using the test panels). This erroneous reading occurs even if I make a new VI, drop the task name constant on the diagram, right-click the task name constant and select "Generate Code - Example" and let LabVIEW create its own acquisition loop (which is very similar to what I already set up, using the "DAQmx Read" VI).
    Any ideas why the Test Panels work correctly but the LabVIEW code does not?

    Hello bentnail,
    I'm not sure why the test panels are reading the value correctly but the LabVIEW code is not, but there are a couple of things we can try.
    1) What happens if you just use the DAQ Assistant and place it on the block diagram? Does it read out the correct values?
    2) Try running a shipping example that comes with LabVIEW. "Acq&Graph Voltage-Int Clk.vi" should work well.
    3) What kind of signal are you expecting to read (peak-to-peak voltage, frequency, etc.)?
    Thanks,
    E.Lee
    Eric
    DE For Life!

  • Why can I not use "differential" on the MAX test panel for channels 8-16?

    I have a USB-6218
    This is a 32 channel device with 16 Differential inputs.
    Why can't I use the MAX test panel to view any channel beyond AI-7 using the differential setting? It grays out differential even though my device has this capability.
    WHY???? 

    I am using MAX 4.1.0.3001 in conjunction with LV 8.0 and the USB-6218.
    I go to MAX, expand Devices, expand MX devices, right-click on the USB-6218 (device 2), and go to the test panels.
    From there I am unable to view any differential pair beyond 8.
    How am I supposed to thru-calibrate my equipment? Generally, to achieve a valid calibration I use MAX itself to record the raw voltages the board "sees" as I apply known loads to the sensors.
    The reason I need to check all this out is that DIFF pairs > 8 also are not working in my VI! I'm not sure if this is related or not.

  • MAX 4.2.1.3001 Task Test Panel Button Missing

    I just upgraded to LabVIEW 8.2.1, along with the drivers, NI-DAQmx, MAX, etc., and now when I use MAX to view my existing tasks, they no longer have a "Test Panels" button. They appear to now have an integrated test graph. That is nice, but before, I was able to press the "Test Panels" button, select a type of "Values" instead of a graph type, and this would allow me to see the raw values that each channel was returning. What happened to this functionality in 8.2.1.3001? I know that I can go to each individual channel and see its raw values, but that is not acceptable. I need to see all of the values of each of my task channels together in one place, the same way I was before "upgrading" to the new MAX. Am I missing something?
    Thanks for listening!
    Rick Howard
    Exquadrum, Inc.

    This post and also this post, which are the same, have been replied to on this DAQ forums post, on which you can continue posting.  Please post only once on the most relevant forum.
    Regards,
    Raajit L
    National Instruments

  • In MAX, clicking Test Panels gives "Executable version (7.1.1) doesn't match resource file (7.1)"

    Hi,
    I haven't used LabVIEW for a while.
    I plugged in a USB-6251, ran Measurement and Automation Explorer, and saw that the USB-6251 was listed under NI-DAQmx Devices and was green.
    I clicked on the Test Panels button and got a window with the message:
    Get Executable version (7.1.1) doesn't match resource file (7.1)
    When I clicked OK, MAX locked up.
    I found this:
    http://digital.ni.com/public.nsf/websearch/680E61A4D02158A186256F7A0073C228?OpenDocument
    which told me to repair the LabVIEW Runtime thingy.  I did that and it's still broken.
    My MAX version is 4.2.0.3001
    I assume that's a pretty old version, but I would have thought it would have been updated when I upgraded to LabVIEW 8.2.1. But I don't see a MAX folder in the 8.2.1 install.

    I thought I had replied to this, but somehow it didn't get there. In answer to the question about running the device driver CD after installing LV 8.2: I don't know the answer.
    Anyway, I installed LV 8.6.1 and now it goes into the test panels OK, so I should be able to progress.

  • For a 3.33v input, DAQ & MAX test panels read .018v. What is missing in the setup?

    Hi All:
    I am resurrecting a Labview setup that has not been used for a few years.
    The goal is to measure strain.
    Software is:
            LabVIEW 6.0
            NI-DAQ 6.8.0f9
            MAX 2.0.3.6
    Hardware is:
            PCI-MIO-16E-1
            SCXI-1000 Chassis
            SCXI-1121 Module
            SCXI-1320 Terminal Block
    As a first step I decided to test the hardware using MAX.
    I reset all of the SCXI-1121 jumpers to factory settings except the Filter jumpers.
    Filter jumpers were set at 10kHz  and  I left them at that setting.
    In MAX I followed the software procedures for setting up the PCI-MIO-16E, SCXI-1000 Chassis,
    and SCXI-1121 Module (Module was found using the auto detect). MAX verifies the PCI-MIO-16E
    card, and the SCXI-1000 chassis.
    I verified that EX 0 thru 3  were 3.33v using a digital meter.
    I then wired EX 0 to CH 0 and, using the MAX Test Panel, entered the setup string
    ob0!sc1!md1!0
    in the channel box on the Analog Input tab.
    I expected to see a 3.33v reading on the test panel, but got a reading of .0184v
    (I assume this means zero, or is noise). The result is the same on channels 1, 2, and 3.
    As far as my bifocals can tell, all cables are connected properly, and there are no bent pins.
    I have re-verified the jumper settings per the SCXI-1121 User Manual. I have printed the
    “Knowledge Base”, “SCXI Trouble Shooting Resources” and walked through them step by step
    to verify the set up of hardware and software.
    I have set up a virtual channel for each channel and used the DAQ virtual test channel
    test panel, and I get the same result .0184v.
    Anybody have any suggestions or clues as to what I have missed?
    Thank You
    Peter Waltz

    Hello Peter,
    If you are reading 0.0184V on your analog input, then it probably means that you are reading nothing at all.
    This seems like a driver issue more than anything else. Make sure that you have the latest possible driver installed for this setup. Traditional DAQ 7.0 or 7.1 is compatible with LabVIEW 6.0 and should work fine.
    Once you have the right driver, try connecting your analog output to your analog input. Output a sinewave or a specific voltage and see what the AI reads.
    Most likely the driver upgrade would solve your problem.
    You can download the drivers from here.
    Chetan K
    Application Engineer
    NI
    Message Edited by CKap on 05-08-2006 02:57 PM

  • 6025e MAX test panel

    I'm using the MAX test panel to test my digital I/O pins; however, I couldn't find any manual explaining the functions of the test panel. There are 2 tabs for DIO. What are the operations/functions on the 2 DIO tabs?
    For the first Digital I/O tab:
    a. which port does it test?
    b. why can't I select the input state?
    For the second Digital I/O tab:
    a. I can only select ports 2, 3, 4. What do ports 2, 3, 4 refer to? For the 6025E, I have only ports 0, 1, 2, 3.
    b. why can't I set the input I want for the input port?
    c. how do I check if the digital I/O port is responding?
    Thanks & best regards

    For the PCI-6025E, the digital port numbers are:
    0: DIO
    2: PA
    3: PB
    4: PC
    There is a big difference between DIO and the 3 other digital ports. While each line of DIO can be individually set to be a DI or a DO, all 8 lines of PA, PB or PC must have the same direction.
    See the user manual for more information.
    The first tab is used for the DIO port (DIO0 to DIO7). You can set the direction of each single line.
    The second tab allows you to define one input port and one output port (eg PA (2) = DI and PB (3) = DO). You can then see the state of each line of the input port and set the state of each line of the output port.
    You can easily test your hardware by wiring a DO line to a DI line (e.g. PA0 to PB0). Change the state of the DO and check that the DI is correct.
