Problems using LabVIEW

Hi, I am a student, and as a project I am building a test bench with 2 pumps.
For the programming and the measurements I am using the NI USB-6008 DAQ.
I've got 3 analog sensors on ai0, ai1, and ai2, plus a digital input and a digital output.
But I can't control the digital output, nor read the digital input. Can someone help me, please?
Thanks

Hi Jeroenooms,
Have you tried to handle only the digital input (without the digital output), or only the digital output (without the digital input)?
And does the analog input work on its own?
Regards,
Julien Roland - District Sales Manager
NI Belgium - Technical Support
Don't forget to rate a good answer
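In case it helps to isolate the problem, here is a minimal digital I/O check done outside LabVIEW with the NI-DAQmx Python API (the nidaqmx package). It is only a sketch: the device name "Dev1" and the port/line assignments are assumptions that must match your wiring and your device name in MAX.

    # Minimal NI-DAQmx sketch (Python "nidaqmx" package) for testing DO and DI in isolation.
    # Assumptions: the USB-6008 appears in MAX as "Dev1", the output line is port0/line0
    # and the input line is port1/line0 -- adjust to your actual wiring.
    import time
    import nidaqmx

    # Digital output test: toggle one line a few times.
    with nidaqmx.Task() as do_task:
        do_task.do_channels.add_do_chan("Dev1/port0/line0")
        for state in (True, False, True, False):
            do_task.write(state)
            time.sleep(0.5)

    # Digital input test: read one line once.
    with nidaqmx.Task() as di_task:
        di_task.di_channels.add_di_chan("Dev1/port1/line0")
        print("Digital input reads:", di_task.read())

If the output and input each work on their own this way (or with the equivalent two small LabVIEW tasks), the problem is likely in how the tasks are combined rather than in the hardware.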

Similar Messages

  • Problems using Labview as ActiveX Server

    Hello,
    I have been having difficulty using LabVIEW as an ActiveX server. I have reviewed all the postings on this subject; most are pre-LabVIEW 8.2 and thus do not account for the changes made between 8.2 and 8.5 that broke the ActiveX server functions. I have looked at the recommendations for changing the code to export (exported VIs in a DLL or source distribution) and have tried these with no success. The closest example I have found was posted here: http://forums.ni.com/ni/board/message?board.id=170&thread.id=283417 but the example code posted there does not work for me and still generates error 3005.
    What I need is simple. I want to turn my application into a VI server: expose a VI that accesses elements (controls, queues, globals, etc.) that are in the VI server's context. I would then like to build a VI or DLL that calls the 'exposed' VI in the VI server to pass data to or from it. The V test.zip example file in the post linked above is a pretty good example of this; it just does not seem to work when I build it in 8.5. Are there any good, current examples of using LabVIEW as a compiled ActiveX server and calling the exposed VIs from an external application (LabVIEW, Visual Basic, etc.)? I am only interested in cases where LabVIEW is the server, or is both client and server.
    I have used a tool called ActiveXplorer to examine the registered exe when the VI server is running. It always shows that there is no type library associated and that the object is not creatable. A .tlb is created by the project build, however, whereas the previous LabVIEW version, 8.2.1, did not build that correctly. I have also tried this in 8.6, and it generates a similar error 3005. So what am I missing?
          Thanks
           Louis Ashford
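    For orientation, a client of such a server looks roughly like the sketch below. It uses Python with the pywin32 COM bindings instead of a LabVIEW or Visual Basic client, and the prog ID "MyApp.Application" and the VI path are placeholders, not details taken from this thread; the methods mirror the VI Server ActiveX interface.

        # Hedged sketch of a COM client calling a VI exposed by a LabVIEW-built ActiveX server.
        # "MyApp.Application" and the VI path are placeholders; a correctly registered build
        # exposes a prog ID named after the executable.
        import win32com.client

        app = win32com.client.Dispatch("MyApp.Application")           # attach to / launch the server EXE
        vi = app.GetVIReference(r"C:\path\to\Exposed.vi", "", False)  # errors like 3005 surface here if the
                                                                      # type library is missing or not creatable
        vi.SetControlValue("input value", 42)                         # write a front-panel control
        vi.Run(False)                                                 # run the VI synchronously
        print(vi.GetControlValue("output value"))                     # read back an indicator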

    Mike,
    Thank you for your response to my question. The problem is that the example you cite does not use the LabVIEW VI as the server. Excel is actually the automation server, and the Automation Open is using an Excel automation object. I am sure that Excel creates proper automation objects; LabVIEW, however, does not seem to. So while this example shows how LabVIEW can function as a client, it is not an example of a compiled LabVIEW server being accessed by a LabVIEW VI. Possibly I am not looking at the VI that you are thinking of.
    The examples I am aware of:
                        "ActiveX Event Callback for Excel.vi" (Excel is the server, not a LabVIEW VI)
                        "ActiveX Event Callback for IE.vi" (same, the LabVIEW VI is the client)
                        "Write Table to XL.vi" (again, Excel is the server)
                        "3D Graph Properties - Torus.vi" (accesses an ActiveX control, not an ActiveX EXE)
                        "3D Lorenz Attractor Draw at Completion using 3D Curve.vi" (uses an ActiveX control, not an ActiveX EXE server)
                        "3D Parametric Surface - Ribbon.vi" (uses an ActiveX control, not an ActiveX EXE server)
                        "3D Surface Example - Fluctuating Sine Wave.vi" (uses an ActiveX control, not an ActiveX EXE server)
                        "Excel Macro Example.vi" (uses Excel as the automation server, not LabVIEW)
                        "FamilyTree.vi" (uses an MSComctlLib.ITreeView object, not LabVIEW as the server)
                        "SlideShow.vi" (uses PowerPoint._Application, not LabVIEW as the server)
    Most of the posts I have seen are for versions prior to LabVIEW 8.2 (where the ActiveX server was broken). I have seen only a few posts that actually address the issue I am talking about, and thus far no real solution has been offered. I get the same results when compiling and testing this with 8.6 as well. So have you tried this, Mike? Possibly I am missing something very simple.
    The example I did find and linked to is a pretty simple one, and it does not work on my machine at all. After running the server once, you can select the automation server that is registered with Windows, and this then breaks the client VI. I have found that reselecting the GetVIReference node in the client VI 'fixes' the client VI as far as LabVIEW is concerned, and it no longer shows an error. Now when you run the client VI it does in fact find the VI server and launches it OK. However, the Automation Open then hangs for quite some time and returns the error
    "Error -2146959355 occurred at Server execution failed
    in Client_reader.vi". Obviously the automation EXE (server) was seen, because it was opened, yet it did not return a valid reference, so the subsequent property nodes in the client VI fail. Something is wrong with LabVIEW's opening or creation of automation objects.
    Thanks,
                    Louis Ashford

  • Are there any problems using LabVIEW 5.1.1 on Win2K?

    I am currently upgrading the company's systems from Windows NT 4.0 to Windows 2000, and I am wondering whether there are any problems with running LabVIEW on Windows 2000.

    No, in fact I'm running LabVIEW 5.1.1 and LabVIEW 6.0.2 on a Win2K system.
    A Rafiq
    National Instruments
    http://www.ni.com/support

  • I want to play video on my computer to do some analysis on its frames; the problem I face is that I can't change the video frame rate using LabVIEW, but I can change the frame rate of the video outside of LabVIEW using another program

    Hi all,
    I want to play a video on my computer to do some analysis on its frames. The problem I face is that I can't change the video frame rate using LabVIEW, but I can change the frame rate of the video outside of LabVIEW using another program.
    I used the IMAQ AVI Read Frame VI.
    For example, I have an AVI video whose frame rate is 25 fps. My image processing code is fast enough to process more than 25 fps, so I want to accelerate the video acquisition.

    Hi abdelhady,
    I looked into this further, and reading an AVI file into LabVIEW faster than its frame rate won't be possible. LabVIEW could read frames faster than 25 fps, but because it pulls whatever frame is available at that point in time, this would just give you duplicate frames. If you want to read frames faster than 25 fps, you would need to speed up your AVI file before reading it into LabVIEW.
    There's a good shipping example to show how to read in from an AVI file, "Read AVI File.vi". You'll notice that they add timing to make sure that the while loop runs at the right speed to match up with the frames per second of the file being read. This is to make sure you're not reading duplicate frames.
    Thank you,
    Emily C
    Applications Engineer
    National Instruments
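    To make the pacing idea from the shipping example concrete outside of LabVIEW, here is a rough sketch in Python with OpenCV. It is purely illustrative: the file name is a placeholder, and IMAQ AVI Read Frame itself behaves as described above.

        # Rough sketch of reading an AVI in a paced loop (OpenCV stand-in, not IMAQ).
        # "video.avi" is a placeholder file name.
        import time
        import cv2

        cap = cv2.VideoCapture("video.avi")
        fps = cap.get(cv2.CAP_PROP_FPS) or 25.0      # nominal frame rate stored in the file
        period = 1.0 / fps

        while True:
            start = time.time()
            ok, frame = cap.read()                   # sequential decode: each call returns the next frame
            if not ok:
                break
            # ... process the frame here ...
            # Keep the wait below to pace the loop at the file's native rate, as the
            # "Read AVI File.vi" shipping example does; drop it to run as fast as the
            # decoder and your processing allow.
            time.sleep(max(0.0, period - (time.time() - start)))

        cap.release()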

  • Problem in a particular scan of our software designed using labview

    We are using a piece of software called CROP for our experiments. The software is designed and built using LabVIEW. It seems to run fine, but there is a problem in one of the scans that we use. The scan in question, called the 'Threshold Scan', is coded so that it steps the voltage by itself after the scan has run at the current voltage for a set period of time. For example, if the current value is 30, the increment is 30, and the final value is 300 with a time period of 1 minute, the scan automatically steps from 30 to 60, 60 to 90, and so on after each minute until the final value of 300. The problem is that this scan runs perfectly on two of our Windows 7 lab systems, but on all other systems running Windows 7 or 8 it blanks out after the first voltage value and stops abruptly; in this case it stops at 30 and does not increment and perform the scan for each voltage up to 300. The code seems to be correct, and it has been hard to figure out why the scan runs perfectly on two of our lab systems and not on any other system.
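    For reference, the sweep logic described above boils down to something like the following plain sketch (not the actual CROP code; the measurement call is a placeholder). It is also where the "# of points" value mentioned in the reply below comes from.

        # Plain sketch of the Threshold Scan logic described above (not the actual CROP code).
        # measure_at() stands in for whatever the real software does at each voltage.
        import time

        def measure_at(voltage):
            print(f"scanning at {voltage} V")        # placeholder for the real measurement

        def threshold_scan(start=30.0, increment=30.0, final=300.0, dwell_s=60.0):
            n_points = int(round((final - start) / increment)) + 1   # 30, 60, ..., 300 -> 10 points
            for i in range(n_points):
                voltage = start + i * increment
                measure_at(voltage)
                time.sleep(dwell_s)                  # dwell at this voltage before stepping

        threshold_scan()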

    Hi raj177,
    Looking over your code I don't see anything glaringly obvious, but not having all of the sub-vi's makes it hard to thoroughly explore. That being said, I do have some troubleshooting steps we could start with.
    Have you tried probing the starting, ending, and increment voltages coming out of the "threshold wizard" sub-VI while the code is running on the troublesome machines? Also, can you probe the "# of points" wire?
    The measured values will help us determine whether the problem is inside or outside of the for loop. 
    -edited for misspelling-
    Christopher S. | Applications Engineer
    Certified LabVIEW Developer
    "If in doubt... flat out." - Colin McRae

  • Problems with Serial Communication using Labview 6 and Solaris 8

    I am working on a driver for a temperature controller, but I am stuck at the very basics. I am using LabVIEW 6 and the platform is Solaris 8 on a Sun Ultra 60 workstation. I cannot get the serial communication to work. When I run the raw (uncompiled) code it works (I can read from and write to ttya and ttyb), but once compiled I get error code 37 (device not found). I have tried the following steps to fix this, with no luck.
    1) I made sure that the "serpdrv" file is in the same folder as the executable. I also made sure the serpdrv file is added as a support file when building the app.
    2) I changed from the traditional serial VIs to LabVIEW 6's new VISA functions. With these "new" VIs, when I try to initialize the VISA device and wire a control to the "VISA reference" input, only 1 serial port shows up (ASRL2; ASRL1 is missing). I am not sure if this is part of the same problem or a whole new issue.
    3) I reinstalled both visa and labview 6.0.2 update hoping this would help with no luck
    4) I placed the following entry into the ".labviewrc" file
    labview.serialdevices: "/dev/ttya:/dev/ttyb"
    If anybody has had the same problem I would love to hear about it, and whether you have any solutions.
    Jamie Shea

    Hi Jamie,
    1. Do you have the NI-VISA driver installed on the machine on which you are running this executable? If you are running the executable on the same machine on which the development program ran fine, then you can ignore this point.
    2. If you have made all the changes suggested by other discussions on this topic, then try changing the Port input to the VISA serial configure VI from a control to a constant. In some cases I have seen this do the trick. I think this point should solve your problem; if it does, do tell me. :-)

  • I use LabVIEW 7.1 but I have a problem when I use LabVIEW to read data from serial communication

    I use LabVIEW 7.1, but I have a problem when I use LabVIEW to read data from a serial port.
    I opened the example VI Advanced Serial Write and Read from the LabVIEW serial communication examples, but it gives this error message: Error -1073807202 occurred at Property Node in VISA Configure Serial Port (Instr).vi -> Advanced Serial Write and Read.vi
    This error code is undefined. No one has provided a description for this code, or you might have wired a number that is not an error code to the error code input.
    I don't know why. Please help me. Thank you.

    When I copy that code into "Explain Error" I get: "VISA:  (Hex 0xBFFF009E) A code library required by VISA could not be located or loaded."
    You may have a bad install of VISA or the wrong version of VISA loaded. Try re-installing VISA. You can get the latest version from the NI support site: http://digital.ni.com/softlib.nsf/webcategories/85256410006C055586256BBB002C0E91?opendocument&node=1....
    Also ensure that you are not pointing the example towards a serial port that does not exist.
    Please let us know what you find and what gets this working for you.
         Rob
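    For what it's worth, the decimal code in the original post and the hex code quoted from Explain Error are the same 32-bit value; a quick way to convert one to the other:

        # Convert a negative LabVIEW/VISA error code to the hex form used in the VISA documentation.
        code = -1073807202
        print(hex(code & 0xFFFFFFFF))   # prints 0xbfff009e, i.e. VISA error 0xBFFF009E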

  • Creating a TestStand GOTO step programmatically using LabVIEW. Problems.

    Hi
    I am trying to programmatically create and configure a TestStand GOTO step using LabVIEW. I can create the step fine, but I cannot work out how to specify the GOTO destination.
    Has anybody got any ideas?
    Regards
    Steve
    There are 10 types of people in the world that understand binary, those that do and those that don't.

    Managed to solve this one myself by setting the CustomActionExpr to 'True' and the CustomTrueActionTargetByExpr to the name of the step I wanted to jump to.
    Steve
    Message Edited by SercoSteve on 01-24-2006 02:13 AM
    There are 10 types of people in the world that understand binary, those that do and those that don't.
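    For anyone scripting the same fix outside of LabVIEW Invoke Nodes, the two property writes look roughly like the sketch below, written against the TestStand API from Python. The property names are the ones SercoSteve reports; how you obtain the Goto step reference and whether the target name must be quoted as an expression are assumptions to verify against your TestStand version.

        # Hedged sketch: apply SercoSteve's fix to an existing TestStand Goto step.
        # goto_step is a TestStand Step object obtained elsewhere (via LabVIEW Invoke
        # Nodes or any COM client); the quoting of the target step name is an assumption.
        def configure_goto_step(goto_step, target_step_name):
            props = goto_step.AsPropertyObject()
            props.SetValString("CustomActionExpr", 0, "True")
            props.SetValString("CustomTrueActionTargetByExpr", 0, '"%s"' % target_step_name)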

  • Problem with Labview and an ARM Cortex

    Good morning,
    I am currently trying to use LabVIEW with a board from ST Microelectronics (MCBSTM32) with an ARM Cortex processor.
    I use the SDK and have followed the tutorials.
    But when I try to launch the program (the simple loop as written in tutorial no. 2: http://zone.ni.com/devzone/cda/tut/p/id/7029 ), Keil gives me an error via LabVIEW:
    "Argument 'DARMSTM' not permitted for option 'device'."
    It seems that Keil does not allow an ARM Cortex from ST as the device.
    Moreover, after hitting this problem I am unable to use any Keil project, even one that worked before, without LabVIEW. I need to restart the computer.
    I also tried launching the Keil project generated by LabVIEW without using LabVIEW, and it works. But as soon as I use LabVIEW, I get the error.
    Has anyone already seen this error, or does anyone know how to solve it?
    Thank you for your answers, and sorry for my bad English.
    Regards,
    Raphaël VAISSIERE

    Hi Raphi,
    So let me make sure I understand,
    The project created in LabVIEW errors out with the message "Argument 'DARMSTM' not permitted for option 'device'."
    If you open the same project in Keil uVision, it runs fine
    Here are my questions:
    1. How does the code run when run through Keil? Does it deploy and run fine?
    2. Did you follow the porting procedure completely?
    Your target, the STM32F103RB, is technically supported by Keil, but you need to port the RTX kernel to it. This paragraph explains it:
    To determine if your target already supports the RTX Real-Time Kernel, browse to the \Keil\ARM\Startup directory, then browse to the folder that corresponds to the manufacturer of your ARM microcontroller. If there is an RTX_Conf*.c file for your target, then the RTX Real-Time Kernel has already been ported for your ARM device. If no such file exists, skip to chapter 4 for more information on the RTX Real-Time Kernel and a guide for porting RTX to your ARM microcontroller. 
    You also need to port the Real-Time agent to it.
    I just want to make sure that you have followed the guidelines. If you have and are still having problems, we will continue to explore this.
    Thanks,
    National Instruments
    LabVIEW Embedded Product Support Engineer

  • I am trying to use Labview and RP1210 compliant hardware to connect to a truck J1939 bus and receive messages.

    I am trying to use LabVIEW and RP1210-compliant hardware to connect to a truck J1939 bus and receive messages.
    Specifically, I am attempting to read data frames using RP1210_READMESSAGE. I am able to configure the hardware and send a message to the J1939 bus, but I think I have not configured something correctly. I can use RP1210_SENDMESSAGE and see the message I have sent on the bus using CANalyzer. When I use RP1210_READMESSAGE I get the timestamp of a message, and the function returns the correct number of bytes (the number matches the number of bytes I sent out plus four bytes for the timestamp). What I am having trouble with is actually receiving the data. I have seen the same behavior with two different pieces of hardware (Vector CANcase XL and Nexiq USB Link), so I don't think the issue is vendor specific.
    Has anyone been able to make the RP1210_READMESSAGE function work correctly?
    Thanks for any help

    Thanks
    I have already tried that. The links are for the NI RP1210 wrapper. The problem I am having is using LabVIEW to interface with the RP1210 layer. The read function's char *fpchAPIMessage argument is the output, a pointer to a character array. In this variable I can receive the timestamp of the message, but not the message itself. The return shows that the correct number of bytes is available (18 for an 8-byte message), but I can only get the 4-byte timestamp. I think I have to dereference this pointer to view the data, and I am not sure how to do that.
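    To illustrate what dereferencing that output amounts to: RP1210_ReadMessage fills a caller-allocated byte buffer, and the timestamp occupies the first four bytes. Below is a rough Python/ctypes sketch; the DLL name is a placeholder for your vendor's RP1210 DLL, and in LabVIEW the equivalent is configuring the Call Library Function Node parameter as a string/array passed by pointer with a preallocated minimum size.

        # Rough ctypes sketch of RP1210_ReadMessage buffer handling (vendor DLL name is a placeholder).
        # The RP1210A prototype is approximately:
        #   short RP1210_ReadMessage(short nClientID, char *fpchAPIMessage,
        #                            short nBufferSize, short nBlockOnRead)
        import ctypes

        rp1210 = ctypes.WinDLL("VendorRP1210.dll")          # placeholder: your vendor's API DLL
        client_id = 1                                       # whatever RP1210_ClientConnect returned

        buf = ctypes.create_string_buffer(512)              # caller-allocated message buffer
        n = rp1210.RP1210_ReadMessage(client_id, buf, ctypes.sizeof(buf), 0)

        if n > 0:
            raw = buf.raw[:n]
            timestamp = int.from_bytes(raw[:4], "big")      # first 4 bytes: timestamp (check the byte
                                                            # order against your vendor's documentation)
            payload = raw[4:]                               # remaining bytes: J1939 message fields + data
            print(timestamp, payload.hex())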

  • Run-time engine problem in Labview 2009 and Labview 2010

    I have a problem with LabVIEW 2009 and LabVIEW 2010. I upgraded my LabVIEW 2009 to 2010, but it turned out to be a trial version because I did not have the serial number, so I uninstalled LabVIEW 2010. However, then the strange behavior started: I could no longer use my LabVIEW 2009. So I uninstalled LabVIEW 2009 as well, but then I could not reinstall it. Every time I got a runtime error and could not proceed with the installation. In addition, any installation related to LabVIEW fails, and the same error comes up every time. It is very annoying.
    So, what is the deal?
    I attached the error here. Any comments or advice are welcomed and appreciated.
    Attachments:
    error.docx (2305 KB)

    By chance is this machine's language set to any non-English locale?  You would check the locale setting by:
    Opening Control Panel.
    Opening "Regional and Language Options".
    Looking Under "Regional Options" >> "Standards and Formats"
    If it is set to something besides English, try setting it to English and please report back what locale it was set to (or whether this solves the problem).
    Regards,
    - WesW

  • I use LabVIEW 8.0 and I cannot run the DAQmx driver for a PCMCIA 6062E DAQ card

    I use LabVIEW 8.0 on Windows XP. When I installed a PCMCIA DAQ card (6062E), I could not run the DAQmx driver, while the Traditional driver works fine. When I install the DAQ card it sometimes works well with the Traditional driver, but mostly I see an error when reading an analog input from the PCMCIA DAQ device: an error about the 'base address' occurs. I use the latest version of the driver, DAQmx 8.0.1, and I have installed it correctly, but again I could not see any signal on the test panel for the DAQmx driver. I don't know what the problem is. Please reply to this message; I need your help.
    Thanks

    Hello,
    Thanks for your reply. I will try to answer your questions.
    I have checked the Device Manager and I have seen that the device is detected by Windows XP. To answer another question, the card consistently shows up under both NI-DAQmx Devices and Traditional NI-DAQ (Legacy) Devices in Measurement & Automation Explorer. I reset the driver for Traditional NI-DAQ in MAX, then reset the device for the NI-DAQmx driver in MAX, and again tried to run the DAQmx test panel, but I still saw only a wrong or noisy signal; I could not see the sinusoidal signal that I connected. I do get the test panels to run with the card using the DAQmx driver, but I could not run the DAQ diagnostic utility. The error is as shown below.
    03.05.2006 09:49:35
    Results saved to:  C:\Documents and Settings\serkan\Desktop\Diagnostic Results.txt
    Selected Device: Dev1
    Device Type: DAQCard-6062E
    Serial Number: 107509E
    Device Support: (PASS)
    NI-DAQmx Version: 8.0 (PASS)
    Device Reset: (FAIL)
    Error -50002 occurred at an unidentified location
    Possible reason(s):
    The specified device is not a valid device. The operation could not be completed as specified.
     ------ DIAGNOSTIC UTILITY ABORTED -----
    And for your last question, I can say that I have used the DAQ card on other laptops, but the results were the same. I hope to see your reply.
    Thanks,
    Serkan Buhan
    Electrical-Electronics Engineer
    Researcher
    TUBITAK-BILTEN

  • How to use LabVIEW to control a robotic arm with an EMG signal

    Hello,
    I am working on a simulation of an active exoskeleton (wearable robot) for the upper limb using LabVIEW for my senior project. I need to use an EMG signal as the input to move the elbow joint (flexing and extending). I downloaded the LabVIEW Biomedical Toolkit to get a ready-made simulated EMG signal, but I have little experience with LabVIEW.
    The design approach I am planning to use is to establish a threshold for the EMG signal using a comparator (above 0, for example) and increment a counter (+1) every time the signal passes the threshold. Then I would map the counter value to a degree value (for example, every 10000 counts = 1 degree of movement) and feed it to a simple simulated structure for the joint (a simple angle between 2 lines) or a meter to represent the movement. Zero crossing could also be used instead of the comparator, and the signal will be filtered, which is easy to do. However, my problem is converting this logic into LabVIEW. I don't know how to set up a counter for the signal and make a given number of counts correspond to a specific degree of movement, and I also don't know how to build the simulated joint structure in LabVIEW, or even how to drive a simple meter indicator in LabVIEW.
    I have only one month to do this project, so any help or ideas you provide are highly appreciated.
    Thank you,
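    As a rough illustration of the counting logic described in the question (a NumPy stand-in for the LabVIEW blocks; the threshold, the counts-per-degree factor, and the synthetic signal are all placeholder assumptions):

        # Sketch of the threshold-counter idea described above (NumPy stand-in for the LabVIEW blocks).
        # The threshold, counts-per-degree factor, and the synthetic signal are placeholders.
        import numpy as np

        rng = np.random.default_rng(0)
        emg = rng.normal(0.0, 1.0, size=100_000)               # stand-in for the simulated EMG signal

        threshold = 0.0
        above = emg > threshold
        crossings = np.count_nonzero(~above[:-1] & above[1:])  # rising crossings of the threshold

        counts_per_degree = 10_000
        elbow_angle = crossings / counts_per_degree            # e.g. every 10000 counts = 1 degree
        print(f"{crossings} crossings -> {elbow_angle:.3f} degrees of elbow movement")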

    CarlFHellwig 
    Thank you for providing this example; I just implemented it in the software to check the counter results.
    In fact, the design approach I decided on lately is to establish a threshold for the EMG signal using a comparator and correlate the EMG signal with the angles of movement of a simple simulated structure ("motor") for a single joint (a simple angle between 2 lines), e.g. 30, 60, 90, 120, and 150 degrees for flexing and extending. Zero crossing could also be used instead of the comparator, and the signal will be filtered, which is easy to do.
    In other words, the idea is to drive a motor to different angles based on the input EMG signal. I am now stuck on developing the algorithm for how the angles relate to the RMS value of the signal while flexing and extending, and how to convert this RMS into angular velocity and angular position to form the simulation.
    I would be grateful if someone could point me to a person who did a similar project so I can discuss some issues with them.
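    And a similarly rough sketch of the second idea, mapping a windowed RMS value onto the discrete angles mentioned above (the RMS calibration range and the window length are placeholder assumptions):

        # Rough sketch of mapping a windowed EMG RMS value onto the discrete angles mentioned above.
        # The calibration range (rms_min/rms_max) and the window length are placeholder assumptions.
        import numpy as np

        def rms(window):
            return float(np.sqrt(np.mean(np.square(window))))

        def rms_to_angle(value, rms_min=0.05, rms_max=1.0, angles=(30, 60, 90, 120, 150)):
            # Normalize the RMS into 0..1, then snap to the nearest allowed angle.
            x = float(np.clip((value - rms_min) / (rms_max - rms_min), 0.0, 1.0))
            target = angles[0] + x * (angles[-1] - angles[0])
            return min(angles, key=lambda a: abs(a - target))

        rng = np.random.default_rng(1)
        emg_window = rng.normal(0.0, 0.4, size=500)    # stand-in for one window of filtered EMG
        print(rms_to_angle(rms(emg_window)))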
     

  • How do I turn off background NI file activity when not using LabVIEW? Files such as nimxs.exe, nipalsm.exe, nipalsm.exe, nisvcloc.exe, nicitdl5.exe

    How do I turn off background NI file activity when not using LabVIEW? I use LabVIEW rarely, and I also use my computer for demanding multimedia applications, including multitrack digital audio recording. I need to reduce background activity as much as possible, and looking at Task Manager I see files such as nimxs.exe, nipalsm.exe, nipalsm.exe, nisvcloc.exe, and nicitdl5.exe running even when I haven't used LabVIEW (8, I believe). These files keep running even after I right-click the NI icon on the lower right and turn off the application. I would like the computer to boot up without these files, and for the necessary files to be activated only when I start the program.
    Can this be done, or do I have to remove LabVIEW from the system to improve performance?
    Thanks

    Hello, those processes are part of NI services that start when Windows boots. These processes serve various purposes, and stopping them can have undefined and unknown consequences for your NI products. They run at "normal" priority, meaning they should be preempted by any process running at a higher priority, which I would expect your other applications to be doing given their time sensitivity. That being said, in Windows XP (I can't speak for other OSes) you can lower their priorities even further through Task Manager (right-click the process»Set Priority) to remove them from contention for resources. If you want to prevent them from running, you can set the service startup type to "manual" in Control Panel»Administrative Tools»Services: right-click the service, go to Properties, and under the General tab choose Startup Type»Manual. This will start the processes only when something directly starts them. LabVIEW will start these processes as it attempts to use them. When you shut down LabVIEW you will need to manually shut down these processes through Task Manager. Again, doing this can cause problems with the NI products on your system, and it is not advised.
    Travis M
    LabVIEW R&D
    National Instruments
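    If you go the deprioritizing route rather than disabling the services, the Task Manager step Travis describes can also be scripted; here is a sketch using the third-party psutil package (process names taken from the question above; this only lowers priority, it does not change service startup types):

        # Sketch: lower the priority of the NI background processes named in the question.
        # Requires the third-party psutil package on Windows; mirrors Task Manager's
        # "Set Priority" step and does not touch the services' startup type.
        import psutil

        NI_PROCESSES = {"nimxs.exe", "nipalsm.exe", "nisvcloc.exe", "nicitdl5.exe"}

        for proc in psutil.process_iter(["name"]):
            name = (proc.info["name"] or "").lower()
            if name in NI_PROCESSES:
                try:
                    proc.nice(psutil.BELOW_NORMAL_PRIORITY_CLASS)   # Windows priority class
                    print("Lowered priority of", name)
                except psutil.AccessDenied:
                    print("Need administrator rights for", name)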

  • How do I browse & play .mp3 files from a USB flash drive using Labview 8.0?

    I'm a tester and want to test USB flash drives connected to a PC.
    I want to browse the file system and play back the audio files, .jpeg files, etc. on the USB drive. I have LabVIEW 8.0. When I searched for examples or VIs that can detect a USB flash drive, I couldn't find any. If you could point me in the right direction, I can find a solution to my problem.
    Thanks.

    You probably won't find specific examples for this.  A USB flash drive is mounted by the OS and appears as a new memory storage device to all software on the PC, including LabVIEW.  You can certainly write a LabVIEW program to open files on the stick and play/run them but I would create a VI that wrote files to the stick and then read them back and checked them for bit-level errors.  Keep in mind that your OS might be doing parity checking under the hood which would make your test rather meaningless.
    BTW, testing memory by looking at pics or listening to MP3s would be useless of course.  A human can't see or hear bit-level errors in a high bandwidth data stream.
    Using LabVIEW: 7.1.1, 8.5.1 & 2013
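    A minimal version of the write-and-verify idea suggested above, sketched in plain Python rather than as a VI (the drive letter and test size are placeholders; as noted, OS-level caching can mask real errors unless you flush it):

        # Minimal sketch of the write/read-back check suggested above (plain Python, not a VI).
        # "E:" and the 16 MiB size are placeholders for the mounted USB stick and test length.
        import os

        TEST_FILE = r"E:\flash_test.bin"
        SIZE = 16 * 1024 * 1024

        pattern = os.urandom(SIZE)                   # random data so repeating bits don't hide errors
        with open(TEST_FILE, "wb") as f:
            f.write(pattern)
            f.flush()
            os.fsync(f.fileno())                     # push past OS write caching before verifying

        with open(TEST_FILE, "rb") as f:
            readback = f.read()

        mismatches = sum(a != b for a, b in zip(pattern, readback))
        print("OK" if pattern == readback else f"{mismatches} mismatched bytes")
        os.remove(TEST_FILE)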
