VISA Problem

Hi all,
I have tried to get started with VISA, at this point with serial communication
(COM port).
First I put a VISA Open on the diagram, then I tried to connect a
resource name constant, and that's the problem.
LabVIEW says something about different classes... I'm a little confused...
What might be the problem?
MfG / best regards
M.Waschk mailto:[email protected]

Hi,
try the following: after placing the VISA Open, switch to wiring mode,
right-click the resource name input terminal, and select Create Constant.
Now you can select the resource you want to use.
Henrik

Similar Messages

  • Visa problem with old instrument

    Hello!
    I have a problem with an old Jobin Yvon HR320 monochromator which I am trying to access via VISA. Sadly I have no manual for the instrument. My task is to convert an old VEE program to LabVIEW 7.0 using an Agilent PCI GPIB card. In VEE the monochromator is initialised with the string "248222 ", which returns a "B" if it is not initialised yet. This works fine in LV. The next step is to send "O2000" as a string, then, in VEE, Write Byte "0", and finally a blank. This should return the character "f". In LV I don't know how to write the zero as a byte. The VEE I/O bus monitor shows that there is a difference between writing the zero as text and as a byte. Maybe someone knows a solution for this problem?
    Thanks
    C.Kandler
    P.S. I have read a lot about NI Spy, but I can't find it in LabVIEW. Is there a possibility to download this tool?

    C.,
    a LabVIEW string is more like a Pascal string than a C string.
    To write binary data, create a string constant or control, right-click on it and choose "'\' Codes Display" or "Hex Display". Now you can enter a null byte as '\00' or '00' (without the quotes, of course), depending on which display mode you have chosen. The first mode is easier to combine with normal printable text, whereas the latter is better for binary strings.
    NI Spy was installed here as C:\Program Files\National Instruments\NI Spy\NISpy.exe. It may be a component of the higher-level editions of LabVIEW, not of the Base edition.
    But if you create probes you can debug your app just as well without NI Spy.
    HTH and
    Greetings from Germany!
    Uwe
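    A minimal sketch of the same idea in Python (the command values come from the post above; everything else is illustrative), showing that the text "0" and a null byte are different bytes on the bus:

```python
# The ASCII character "0" and a null byte differ on the wire:
ascii_zero = "0".encode("ascii")   # 0x30 -- what sending the text "0" writes
null_byte = bytes([0])             # 0x00 -- what VEE's Write Byte "0" writes
assert ascii_zero != null_byte     # this is the difference the bus monitor shows

# "O2000", then the null byte, then a blank, as described in the post:
command = b"O2000" + null_byte + b" "
print(command.hex(" "))            # -> 4f 32 30 30 30 00 20
```

    In LabVIEW, entering \00 in a string constant set to "'\' Codes Display" produces the same 0x00 byte.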

  • VISA problem when setting serial read buffer on MAC?

    Hello,
    I have the following rather strange problem with serial communication on a Macintosh:
    1. I wrote a program using the simple serial VIs provided in LabVIEW and write and receive amounts of data up to about 25000 bytes. The serial buffers are both set to 32000 bytes. If I wait long enough, everything works perfectly (this was just a test VI for reference).
    2. I wrote the same program using VISA calls. I initialize the port exactly the same way (or so I think), and set and flush both the read and write buffers. Then I send the data, which seems to work, except that I have to increase the general timeout of the VISA session (which is already odd, as the data should just be sent to the buffer). However, the read operation always fails. It seems that no matter how long I wait after the write operation, the buffer only contains 63 bytes. Where is the rest going? (From the program described above and the reaction of the instrument I know that everything is sent!) What I read is the echo of what was sent plus one extra byte; it seems those 63 bytes are the first 63 sent, so the buffer is not being overwritten as I would expect if the buffer size were too small.
    If anyone could shed some light on this problem I would be very grateful.
    Best regards
    Koen

    I haven't tried this yet, but I had the same problem as you. This was
    pasted from an earlier response to a post I made a month ago.
    >There are 2 things to try here:
    >1) Set the serial input end mode attribute to 0. The
    >attribute/property name is either "ASRL End In" or "Serial End Mode
    >for Reads".
    >2) Use the "VISA Set I/O Buffer Size" function in the VISA Interface
    >Specific subpalette. This will let you set the receive and/or
    >transmit buffer size.
    >LabVIEW VISA Software Dude...
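    Besides enlarging the VISA I/O buffer, a common workaround is to read in a loop until the expected count arrives instead of doing one large read. A Python sketch of that loop (`read_chunk` is a hypothetical stand-in for a VISA Read call, and the 63-byte cap imitates the behaviour described above):

```python
# Accumulate chunks until the expected byte count has arrived.
def read_exactly(read_chunk, expected, max_tries=1000):
    data = b""
    tries = 0
    while len(data) < expected and tries < max_tries:
        data += read_chunk(expected - len(data))
        tries += 1
    return data

# Simulated port that, like the driver above, returns at most 63 bytes per call.
sent = bytes(range(256)) * 100          # 25600 bytes "sent" by the instrument
pos = 0
def fake_read(n):
    global pos
    chunk = sent[pos:pos + min(n, 63)]
    pos += len(chunk)
    return chunk

assert read_exactly(fake_read, len(sent)) == sent
```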

  • NI CVS-1454, LabVIEW RT 8.2, NI-VISA problem: "NI-VISA loads modules... At Least one passport on your system could not be loaded"

    Hello,
    I'm stuck on a problem with the NI CVS-1454 vision system.
    I installed LabVIEW 8.2 RT on it because I only had licenses for that version.
    In my program, the camera has to react to digital signals in order to perform either analysis 1 or analysis 2.
    Since I needed to connect I/O, I used the NI-IMAQ I/O > Open I/O functions.
    The camera can't do anything with that, because I have to install software on the system.
    So I went into NI MAX to install NI-RIO FCF, which automatically asks to install NI-VISA and NI-VISA Server.
    But once that is done, I get a message at every startup of the CVS:
    "NI-VISA loads modules (Passports) to access GPIB, Serial, VXI, etc.... At Least one passport on your system could not be loaded. You should run the Measurement & Automation Explorer"
    I've been at this for two hours but I can't find the problem.
    Is it a version compatibility issue? Should I handle the I/O on the CVS differently?
    Thanks for your help.
    Seb

    Hello,
    Have you tried the tests suggested by Sami Fatallah above? If so, what were the results?
    Indeed, I think a reinstallation of the CVS (in safe mode) would be wise.
    As for the software configuration, everything looks correct, so you don't actually need the link I sent you in my previous message. However, a repair as suggested above could be useful.
    Regarding how IMAQ I/O works, it gives you access to the inputs/outputs via the FPGA. This is transparent to you, though, because a bitfile is generated at installation, so you need neither LabVIEW FPGA nor any FPGA programming. More details below:
    http://digital.ni.com/public.nsf/allkb/98EE3EF87B2058F2862575BC005EDBC6
    By the way, what type of camera are you connecting to your CVS?
    http://forums.ni.com/t5/Machine-Vision/CVS-1456-Connection-Problems-VISA-error/m-p/1041051?requireLo...
    Here is a useful link for diagnosing and solving CVS problems in general. It won't help in this specific case, but I recommend bookmarking it.
    http://digital.ni.com/public.nsf/allkb/35D5C8DB4F1A66F5862571EA005D2FE3
    Don't hesitate to keep us informed of how the situation evolves.
    Best regards,
    Yannick D.
    National Instruments France

  • Mac visa problem

    I developed a spectrometer application for a customer using VISA RAW communication on a Mac.
    The software had been running fine until he decided to install the latest OS update (10.5.7). Now the software doesn't work. The problem seems to lie with NI-VISA not being started on boot.
    Has anyone got any idea where I should look to try to solve this problem?
    The software runs fine on my Mac mini with all the latest updates installed. The customer was just here and I was able to verify that the software was indeed not running. The VISA configurator wouldn't start, the VISA server wouldn't start, and even though the spectrometer was visible in the System Profiler (i.e. the hardware was recognised by the system), my software couldn't access it.
    Can anyone help?
    Shane.
    Say hello to my little friend.
    RFC 2323 FHE-Compliant

    Lynn,
    thanks for taking the time to help me out with my Mac adventures (again).
    The programs VISA Server and VISA Configurator are not themselves the problem; the inability to start them is simply an indicator of where the problem might lie.
    The problem is that the entire VISA service seems to be unavailable. I have a piece of software (a previous version of which you are familiar with) which ran fine before the upgrade and now doesn't. I tried reinstalling VISA but it just doesn't seem to work. I'm not using anything beyond VISA control in and out and an initial VISA search in the software.
    I'll check out the startup items tomorrow in the office (it's evening here already).
    Thanks
    Shane.
    Say hello to my little friend.
    RFC 2323 FHE-Compliant

  • Serial VISA problems

    Hello,
    I keep having problems with the VISA serial driver. The problem
    is that VISA seems to reserve the serial port even after the
    VISA session has been closed. Also, if I open COM1 with VISA,
    I cannot use HyperTerminal with COM2. I have to shut down LabVIEW
    in order to use _any_ COM ports after I've used one with VISA.
    If I use the old serial VIs I don't have to close LV in order to
    use COM ports with HyperTerminal, ProComm, etc.
    Has anybody else encountered this kind of behaviour with VISA serial?
    I run LabVIEW 5.1f1 in WinNT4.0
    timo
    Timo Miettunen
    Senior Test Designer
    Fincitec Oy, Production Department
    e-mail: [email protected]
    Address: Lumikontie 2, PO BOX 11, FIN-94600 KEMI, Finland
    Tel: +358-16-2151245, Fax: +358-16-221561

    Timo,
    An easy workaround might be the "Close all VISA sessions" option in the Preferences/Misc tab. Before this option was introduced I used visa sessions.vi in .../LabVIEW/vi.lib/visa.llb.
    Do you use "Find VISA Resources", and maybe some homemade plug-and-play that retrieves info strings from all ports? I'd look for sessions not getting closed in your program.
    I'm using LV 5.1f0 on NT4.0 SP3 and haven't seen your kind of problem. I doubt it is an NT permissions problem.
    Johannes Niess
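    The habit behind both workarounds can be sketched without hardware (Python, with a fake class standing in for a VISA serial session; all names are illustrative): every session opened must be closed on every code path, or the OS keeps the port reserved.

```python
class FakeSerialSession:
    """Stand-in for a VISA serial session (no real hardware involved)."""
    open_count = 0                      # how many sessions are still open

    def __init__(self, name):
        self.name = name
        FakeSerialSession.open_count += 1

    def query(self, cmd):
        if cmd == "*IDN?":
            return "FAKE,INSTR,0,1.0"
        raise ValueError("bad command")

    def close(self):
        FakeSerialSession.open_count -= 1

def query_port(name, cmd):
    session = FakeSerialSession(name)
    try:
        return session.query(cmd)
    finally:
        session.close()                 # runs even if query() raises

query_port("COM1", "*IDN?")
assert FakeSerialSession.open_count == 0   # nothing left reserved
```

    The try/finally (a with-statement in idiomatic Python) is the textual equivalent of always wiring the session through to a VISA Close on every branch of the diagram.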

  • USB-VISA problem!!!

    Hi,
    I'm having some problems with the VISA module. I've installed a USB device defined as BULK-HID, according to these instructions ( http://zone.ni.com/devzone/cda/tut/p/id/4478 ), and everything seems to be OK; but when I try to set the reference for the VISA Open, the VISA Resource Name doesn't find the USB device, although MAX does!
    Any ideas?

    Go to MAX. Open the instrument under the "GPIB" heading and select the VISA tab. By default the alias will be blank; put something in there like "DMM1". In your block diagram, place the VISA Open VI and right-click on the resource name: Create Constant. Your alias should be in the drop-down box. I think you can wire a string to this input if you have to, but you need to match the name exactly. Something else to try: can you use a regular VISA name like GPIB0::7::INSTR?

  • LabVIEW VISA problem

    Hi, I'm very new to LabVIEW and I'm having trouble communicating
    with an instrument on RS232 port COM1. I decided to test whether
    LabVIEW could communicate with the device, so I loaded the setup
    software that came with it, which established a connection on COM1.
    Then I went into the Instrument I/O Assistant and was able to
    successfully validate the port settings. I went into the VISA Test
    Panel and ran the Basic I/O *IDN?\n and got a read count of 6 back,
    with a status of 0. I executed viGetAttribute VI_ATTR_TMO_VALUE and
    it returned 2000. So I added a Query and Parse step for the *IDN?\n
    command, and I get a timeout status code bfff0015 no matter what I
    set the timeout value to. So I'm wondering if this means the VIs that
    came with our instrument won't be able to communicate with it, or is
    there some obvious thing I'm doing wrong?
    Thanks for your patience,
    -Shane

    Hi Shane,
    Also make sure that you are using the proper termination character with the command you send to your device. Here and here are documents that discuss adding termination characters automatically for you with VISA. Otherwise, you can just add this character to the end of the string you write to the device. I hope this helps!
    Regards,
    Missy S.
    Calibration Engineer
    National Instruments
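    The termination-character advice amounts to a one-line helper; a Python sketch (names illustrative):

```python
# Append a termination character to a command only if it is missing,
# so the instrument knows the command is complete.
def with_termination(cmd, term="\n"):
    return cmd if cmd.endswith(term) else cmd + term

assert with_termination("*IDN?") == "*IDN?\n"
assert with_termination("*IDN?\n") == "*IDN?\n"          # never doubled
assert with_termination("*RST", term="\r\n") == "*RST\r\n"
```

    Equivalently, VISA can be configured to append the character automatically, as the documents mentioned above describe.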

  • How to debug DAQ VIs (problem with data)

    I am trying to debug (and understand) the attached VI.  I’m not asking anyone to analyze/debug it, since it is far more than an example; I am new to these forums and feel like asking such a thing would be inappropriate.
    I would like advice on how to debug it.  I am also going to describe my problem to see if the cause may be obvious to someone.
    The outputs of the "Angle Calc" sub-vi are fed into a state machine.  The state machine actuates outputs to move a motor and measures changing angles.  It first moves up/down, then left/right.  The attached VI is called by a top-level VI and the front panel is not usually open.  For some reason, when the front panel is opened, it behaves differently when it is called.  The behavior only changes for the right/left movements.
    Here's where I'm hoping someone can point out some better debugging methods for me.
    I have been using the Express Write to LVM File VI and the Convert to Dynamic Data function to write acquired data to text files for analysis after the acquisition has completed.  Are there better methods recommended for doing this?  I wish I could run through the code while recording the execution and then step through a playback of it!
     As for the specific problem I am having...
    In the "Right Find Stall" case, the signal wired from the "Angle Calc" VI (array size 100) is connected to a Max/Min Array function.  When I log the data coming from the Min/Max function, I see an alternation between valid data and 0 (i.e. 12.2, 0, 13.3, 0, 14.0, 0, 14.5, 0...).  When I log the data being fed into the Min/Max function I do not see 0s. 
    A guess of mine is that the array is empty and therefore no data is logged but Min/Max returns a 0 (not sure why this would happen).  I also have no idea why this would ALWAYS happen with the front panel opened.  I have seen the effect with the front panel closed, but never to the same degree; usually just a seemingly random zero or two but not a pattern of every other point...
    Thanks all,
    Dave
    Attachments:
    Functional_Test.vi ‏2710 KB

    Angle_calc receives array controls in and passes array indicators out. It performs a trig function and a bit of multiplication.
    I'll check out the Write to Spreadsheet File VI; I haven't used it in the past. If it will let me write text out, that'll be enough reason to stop using the Express VI.
    I've used Execution Highlighting and single-stepping before, but with data acquisition, by the time the application makes it around to the next read, the DAQ card has sampled a ton of data and the data coming in is no longer useful (a different state should be active by then).
    I do think the problem is related to the computer slowing down with the front panel open. I've logged the "backlog" number of the Read function and found that it is greater than 0 much more often when the VI's front panel is open, and I've seen correlations between the 0s I discussed earlier and the backlog going above 0 (usually the scan after a non-0 backlog seems to be a problem).
    I need to familiarize myself with the DAQ basics: the buffer, the backlog, the relation to sampling rate, etc.
    In a simple VI I just wrote, I noticed that if I increase the sampling rate high enough, data being acquired into a chart cuts out on a regular basis; instead of a smooth line, I see small dashes or dots of data with spaces in between. I don't understand this; I would expect data to be lost, but not "no data" to be read in. I think understanding what is happening here would be helpful.

  • TestStand Deployment Error: "Unable to locate all subVIs from saved VIs because a subVI is missing"

    Hi,
    I am a Systems and Software Engineer based in Vancouver. I developed an automated test system using LabVIEW 2013 and TestStand 2013 with custom operator interface.
    I encountered a "missing VIs" problem which is kind of weird, because the sequence passed analysis in both the TestStand Sequence Editor and the TestStand Deployment Utility >> Distributed Files tab.
    But when I try building the installer, at the point "Calling distribution VIs" it always throws an error saying "An error occurred while trying to read the dependencies of the VIs, possibly because the VIs are not saved in the current version of LabVIEW. Do you want to save any modified VIs now?". I tried both options (i.e. Yes and No), but neither solved the issue.
    This is part of the original error message displayed in TestStand Deployment Utility:
    While Processing VIs...
    Error: Unable to locate all subVIs from saved VIs because a subVI is missing or the VI is not saved in the current version of LabVIEW.
    The call chain to missing VIs:
    1 - ATE_AccelerometerTest.vi
    2 - CreateAndMergeErrors.vi (missing)
    3 - LogControl_CheckForErrorSendUpdates.vi (missing)
    All missing VIs are coming from userlib.
    Actions taken:
    - Analyzed the sequence file using the TestStand Sequence Editor and the TestStand Deployment Utility
    - Verified that "Search Directories" includes all necessary files/dependencies
    - Mass-compiled the directory of the missing VIs
    - Added all needed files and folders to the workspace file
    The result is still the same after all of these actions.
    The last debugging step I did was to locate the sequence steps calling the missing VIs mentioned above (e.g. ATE_AccelerometerTest.vi),
    and I found that the step seems to be an empty action step. Would this be possible even if it already passed the analysis?
    Other considerations:
    I am using LabVIEW 2013 SP1 and TestStand 2013. We tried building from three (3) computers and succeeded only once, on a freshly installed computer.
    Hope to hear from you soon.
    With Best Regards,
    Michael Panganiban
    Systems and Software Engineer
    www.synovus.ca
    [email protected]
     

    Hi All,
    We were able to resolve the issue. First, note that the release notes for TestStand 2013 are outdated; we confirmed with an NI engineer in Austin that TestStand 2013 works fine with LabVIEW 2013 SP1.
    Secondly, we played around with the TestStand deployment options, which resolved the issue. The images are attached.
    We just enabled "Remove Unused VI Components". The cause could be one of the libraries (lvlib) we included in the build, but we haven't figured it out yet, because we verified that all VIs are working. It could also be something else that I think would be very difficult to find based on this information. However, if anybody experiences the same issue, this could be helpful.
    Again, we are back to using TestStand 2013 and LabVIEW 2013 SP1.
    Any comments and feedback are appreciated. Otherwise, you can close this support request.
    Thank you.
    With Best Regards,
    Michael Panganiban
    Systems and Software Engineer
     

  • VISA Shared resources by lock CVI functions

    Hello,
    Following up on this post: http://forums.ni.com/t5/LabWindows-CVI/VISA-Shared-resources/m-p/1000856#M43685
    I use LabWindows/CVI 2013 SP2.
    I have a shared-VISA problem: two VISA functions (Read and Write) on the same resource.
    I want to protect access to the VISA resource with lock functions, as proposed there and in the "ThreadLockTimeout" NI example.
    Only at the beginning of the software, step 1: cmtStatus = CmtNewLock ("", 0, &LockHandleRS), preceded by freeing the handle just in case.
    Only at the end, step 4: cmtStatus = CmtDiscardLock (LockHandleRS), followed by LockHandleRS = 0.
    In each VISA read and write function (20 ms < time < 100 ms, timeout = 2 s),
    I have tried different methods, without success:
    - to lock (step 2):
         1) cmtStatus = CmtGetLock (LockHandleRS)
         2) cmtStatus = CmtTryToGetLock (LockHandleRS, &lock), inside a while loop (for the timeout)
         3) cmtStatus = CmtGetLockEx (LockHandleRS, 0, timeOut, &lock)
         4) and even statusRS = viLock (portVISA, VI_EXCLUSIVE_LOCK, timeOut, VI_NULL, VI_NULL)
    - the VISA read/write function:
         viWrite for the request, followed by viRead for the answer
         (VISA read function (measurements): handles an acknowledge echo and reads the data response)
         (VISA write function (commands): handles only an acknowledge echo)
    - to unlock (if the lock succeeded) (step 3):
         1)2)3) cmtStatus = CmtReleaseLock (LockHandleRS)
         4) statusRS = viUnlock (portVISA)
    I display every bad cmtStatus and statusRS, but nothing appears.
    The VISA read function is called from an EveryNCallback DAQmx callback (every 0.5 s).
    The VISA write function is called occasionally, by an operator action through various interfaces.
    If I use only the periodic VISA read function, there is no problem with lock/unlock.
    N.B.: The lock call (step 2) takes only about 100 µs (for case 2, at the first iteration).
    N.B.: I use the same external lock/unlock functions for the VISA read and write.
    But at nearly every VISA write request (3 out of 5), I can see two consecutive successful lock calls (VISA read and write)???
    N.B.: same time (about 100 µs).
    Followed, of course, by two unlock calls (step 3).
    And every time, the echo meant for the VISA read function is consumed by the VISA write function, and the echo handling of the VISA read is truncated.
    So I treat it as two VISA errors (read/write).
    ======================================
    I have tried to adapt the NI example to show my problem.
    I hope I have not made big mistakes that would defeat the purpose of this example.
    With these few code lines, I can reproduce my problem (by pressing the F1 button repeatedly).
    I did not implement all the functions I tried; in a bigger application it is very easy to reproduce (even with the other functions).
    N.B.: In my final application, thread no. 1 looks more like a CVICALLBACK than a thread with a while loop inside, but the problem stays the same.
    Thanks a lot for your consideration.
    Certified LabWindows/CVI DEVELOPER (2004)
    LabVIEW since 5.01 | LabWindows/CVI since 4.01
    Attachments:
    ThreadLockTimeout.7z ‏127 KB

    Hello,
    With NI France support, we improved the NI example further.
    We showed, by adding timing, that the example can display the messages in the wrong order (together with a software slowdown at that moment; why?) (see Explication°1.png).
    If we display the message before the release (//#define M2), we cannot reproduce the double lock (and release), nor any software slowdown.
    So this example cannot be used to help resolve the lock trouble in my application.
    It is not representative of my problem, which is still unresolved.
    Certified LabWindows/CVI DEVELOPER (2004)
    LabVIEW since 5.01 | LabWindows/CVI since 4.01
    Attachments:
    ThreadLockTimeout n°2.7z ‏289 KB
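    For readers hitting the same symptom, the design point under discussion can be sketched without CVI or hardware (Python's threading.Lock standing in for CmtGetLock/viLock; all names illustrative): the lock must span the whole write-plus-read transaction, not be taken separately by the read function and the write function, otherwise one caller's echo can be consumed by the other.

```python
import threading

port_lock = threading.Lock()
log = []

def transaction(tag, n_steps=3):
    with port_lock:                  # one lock spans viWrite + viRead
        for step in range(n_steps):
            log.append((tag, step))  # stands in for write, then echo reads

threads = [threading.Thread(target=transaction, args=(t,)) for t in "AB"]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Each transaction's steps appear contiguously, never interleaved.
for i in range(0, len(log), 3):
    tags = {tag for tag, _ in log[i:i + 3]}
    assert len(tags) == 1
```

    If instead each append took and released the lock on its own, the two transactions could interleave, which is the byte-level mix-up the post describes.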

  • (My Labview built executable) has encountered a problem and needs to close.

    All,
    I've built a LabVIEW executable that interfaces with GPIB and the Texas Instruments EV2300 via USB.
    The VI runs beautifully on the development machine, and after I built the VI into an executable it also runs fine there. So I built an installer that installs the LabVIEW 2010 runtime environment on the target computer. I ran the installer on a production-line computer that I know can run GPIB and the EV2300, but when I try to start the executable I get an error that says "(your application) has encountered an error and needs to close."
    I can get LabVIEW executables that don't talk to GPIB or USB to run on the target machine, so it's not the runtime environment. And I have VB6 applications that can run the GPIB and EV2300 interfaces without a hitch.
    I'm not doing anything weird with the file paths in my program, just creating paths from strings using the canned VI for converting strings to paths. In the course of the program, I read a few text files and write one.
    Is there anything I need to know about building an application with an installer? I need to get this up and running so the test program can run on a production line.
    Ro ma wa ichi ni chi ni shi te na ra zu
    Solved!
    Go to Solution.

    RTSLVU wrote:
    I can get labview executables that don't talk to GPIB or USB to run on the target machine, so it's not the runtime environment.  And I have VB6 applications that can run the GPIB and EV2300 interfaces without a hitch.
    Sounds like a VISA problem. Download and install the latest VISA runtime on the target machine.
    Ditto that! At least one of the installed drivers on the target does not have LabVIEW 2010 support. Check MAX and ni.com to resolve all the drivers on the target that have no support for 2010.
    And yes, you may need to re-qualify the pre-existing EXEs if their code changes to use the new drivers and you are in a heavily regulated industry. If so, it might be easier to build in an older LabVIEW version. Very sad, but it might even be simpler to incorporate the Unit Test Framework to handle regression testing of older code with newer drivers.
    Worse: what if the EXEs were built with a legacy version of LabVIEW that is no longer supported by the latest drivers?
    Time to start thinking about life-cycle maintenance plans, with an upgrade required every 5 years!
    Jeff

  • VISA: (Hex 0xBFFF003E) Could not perform operation because of I/O error

    Hello
    I'm trying to communicate with a device which has to be tested with my application. I send it a command that makes the device send me back some values. These values are sent continuously at a baud rate of 57600.
    I created a VI to test this communication, but I always get the error:
    VISA: (Hex 0xBFFF003E) Could not perform operation because of I/O error
    I have googled this, but none of the described solutions solved my problem (e.g. setting the buffer size to 48, various byte-count values for the VISA Read function, etc.).
    Note: the VI works properly in LabVIEW 6.1! I have tested it. I also tried to receive the data with a terminal program (works fine). But another COM port (on a USB serial converter) shows me the same error.
    I just updated to VISA 3.4.1, but no success.
    LabVIEW 8.0.1

    Yes, it works!
    Of course: buffer to 4096 and mask to 48.
    I added a Flush I/O Buffer (mask 16) before reading from VISA (the problem was reading data), and suddenly it works fine.
    I can't reproduce where exactly the error was. Could it be that the RX buffer was already full at the time I wanted to read from VISA, or at the time the COM port received data? Some questions remain: how does LV handle the buffers? Are the buffers cleared when LV starts? When does LV flush the buffers?
    thx
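    Why the flush helped can be sketched without hardware (Python, all names illustrative): stale bytes left in the receive buffer are returned before the fresh reply unless they are discarded first.

```python
from collections import deque

rx = deque(b"OLD-GARBAGE")      # leftovers from a previous exchange

def flush():
    """Stands in for Flush I/O Buffer with mask 16 (receive buffer)."""
    rx.clear()

def write(cmd):
    rx.extend(b"REPLY:" + cmd)  # fake instrument places its reply in RX

def read(n):
    return bytes(rx.popleft() for _ in range(min(n, len(rx))))

flush()                          # discard the stale bytes first
write(b"MEAS?")
assert read(64) == b"REPLY:MEAS?"
```

    Without the flush, the same read would have returned the old garbage followed by the reply, which looks exactly like a corrupted transfer.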

  • VISA - unable to queue operation & device reported an input protocol error.

    To summarise I have two main questions:
    How can I clear the VISA operation queue and what causes an input protocol error during transfer?
    I am using a Tektronix AFG3022B Function Generator with latest firmware connected via USB and LabVIEW 2013 SP1 patch 2 (13.0.1f2) 32bit with NI-VISA 5.4.1.
    If I have a small timeout, say 100 ms, and perform a save operation and an operation-complete query on the device (*SAV 1;*OPC?), then do up to 10 VISA reads, the first few reads time out (as expected), but then I frequently get the following error:
    -1073807305 (0xBFFF0037) VI_ERROR_INP_PROT_VIOL. "Device reported an input protocol error during transfer."
    Once this occurs, the whole VISA session seems to become unstable, and many of the read operations fail returning this error code rather than the timeout error (0xBFFF0015, VI_ERROR_TMO).
    This only seems to happen when I do save/recall operations (either internal or to a mass storage device connected to the instrument); all other commands/queries time out for the first few reads (as expected) and then return the value. Do you think this is probably down to poor instrument firmware/hardware, or am I using VISA incorrectly? I would have thought that attempting to read VISA data too soon should only generate a timeout error, not a protocol error.
    Reason for the small timeouts:
    Instead of setting the timeout to 4 seconds and having no way of cancelling the read operation, I set the timeout to 100 ms and run a for loop up to 40 times, ignoring timeout errors unless it's the last loop iteration. This means I can cancel the operation within ~100 ms.
    What causes the VISA protocol error? I can't find much information on it.
    Once the protocol error occurs and I keep repeating the *SAV command, I get the following error:
    -1073807303 (0xBFFF0039) VI_ERROR_IN_PROGRESS. "Unable to queue the asynchronous operation because there is already an operation in progress."
    Once this error occurs, how can I force a clear of the VISA queue without unplugging/power-cycling the instrument?
    Some NI I/O Trace captures and a test VI are attached to show what happens.
    Using LV2014 SP1
    Attachments:
    VISA problems.zip ‏38 KB

    I thought I would try to duplicate your problems, but I only have a Tektronix TDS Scope that has USB.
    I am a big fan of using *OPC? to synchronize instrument control, and your problem is of interest to me.
    This very simple write/read returns the *OPC? '1' return character consistently in 175-185 ms. No problem.
    Trying your short timeout with a loop, I do not get what either of us expected.
    If I run this with an adequate timeout (something well longer than the needed 185 ms), it returns in 175-185 ms with one pass through the for loop. As expected, no problem.
    BUT! If I try setting the timeout to something below the needed 175 ms like your 100ms...
    Sometimes it works, but it takes over 2000 ms to return while only going through the for loop twice.
    But sometimes it does not work at all, taking well over 10 s to exit the for loop with a timeout error.
    I never see the queue error that you are seeing.
    How long does it take to do a simple *SAV?
    I don't think the queue error is coming from the instrument. You would have to send the instrument a SYSTEM:ERROR? query before it would report any of its own errors.
    Be sure to set your termination-character setting. I prefer to turn it off for writes so I control when the terminator is sent, but turn it on for reads.
    Also, I noticed from your trace files that you use *OPC? after a *IDN?. This is not needed, as you will know the *IDN? is complete when you receive the string back. In fact, my Tektronix scope did not like it if I sent it "*IDN?;*OPC?\n": it would not send me anything back, not even the IDN.
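    Omar's advice above (terminator off for writes, on for reads, and *OPC? only where a synchronization point is actually needed) can be sketched as a small helper. The attribute names (`timeout`, `write_termination`, `read_termination`) follow PyVISA's message-based resource API rather than LabVIEW; `inst` stands for any session object with `write` and `read` methods, and the 2000 ms timeout is an assumed value, not anything confirmed for this instrument.

    ```python
    def configure_session(inst):
        """Termination-character advice: terminator off for writes
        (we append the newline ourselves) and on for reads."""
        inst.timeout = 2000            # ms; assumed generous enough for *SAV
        inst.write_termination = ""    # send the terminator explicitly
        inst.read_termination = "\n"   # stop reads at the newline
        return inst

    def save_and_wait(inst):
        """Combined save + operation-complete query; the read blocks
        until the instrument answers '1', i.e. the save has finished."""
        inst.write("*SAV 1;*OPC?\n")
        return inst.read()

    def identify(inst):
        """*IDN? alone, with no trailing *OPC?; receiving the reply
        string is itself the synchronization point."""
        inst.write("*IDN?\n")
        return inst.read()
    ```

    With real hardware, `inst` would come from something like `pyvisa.ResourceManager().open_resource(...)` with an instrument-specific resource name.
    
    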
    Is the SAV command on your instrument so long that you need to be able to break out of the loop?
    Omar

  • Vista "Problem Reports and Solutions" - Compatibility issues?

    The Vista Problem Reports and Solutions control panel is reporting three problems for which I cannot seem to find a resolution.
    (So far I've just installed Vista 32 Business edition on my 20" iMac, installed the Boot Camp drivers, updated Boot Camp to 2.1, and applied the Vista updates.)
    The Problem Reports and Solutions control panel reports the following:
    1. Download and install the driver for Apple performance counter.
    This problem was caused by Apple performance counter, which was created by Apple Inc.
    2. Compatibility issue between Intel 82801GBM (ICH7-M) LPC Interface Controller - 27B9 and Windows. This problem was caused by a compatibility issue between Intel 82801GBM (ICH7-M) LPC Interface Controller - 27B9 and this version of Windows. Intel 82801GBM (ICH7-M) LPC Interface Controller - 27B9 was created by Intel Corporation and is distributed by Apple Inc.
    3. Problem caused by Apple Desktop Null Driver.
    This problem was caused by a compatibility issue between this version of Windows and Apple Desktop Null Driver. This product is usually distributed by the company that manufactured your device or computer. Note. If you bought Apple Desktop Null Driver from a retailer and installed it yourself, you will need to contact its manufacturer.
    Are these genuine, and if so, is there a solution? No solutions are proposed by Vista itself.
    Mac OS 10.5.5 , bootcamp 2.1
    Message was edited by: noutram

    A performance counter is not something that is usually in use by any application except by the developers of the unit itself. Nothing to be alarmed about (well, except perhaps that it should have been removed).
    A null driver is (as the name indicates) null; there is no functionality assigned to it. The MS error message is just brilliant, "there is an error in your null driver". Like what? There is no functionality in a null driver. So no problems there either.
    I have gotten the two error messages above on the very first-generation 17" MBP for ages, and so far I have not seen any impact on my work.
    The Intel issue is interesting, it might be a good idea to see if Intel has an updated driver for the problem.
