LabVIEW for Arduino: reading a frequency

I'm trying to figure out a way to read a frequency from a device. The device indicates how far away something is with a beeping signal. I tapped into this beeping signal, and it is now connected to an Arduino pin that sends a high/low signal to LabVIEW. I got this much to work, but now I need the distance that the device is measuring. Basically, I need to know if it's possible to show a number on a numeric indicator based on how fast the frequency changes. I'm not sure if this is enough information, but I'm attaching my basic high/low code. I used a basic meter to display whether the signal is high or low.
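To put a number on it, one approach (a sketch, not from the original post) is to timestamp consecutive rising edges on the Arduino, turn the period into a frequency, and then map frequency to distance with a calibration measured against the device itself. The constants passed to `frequency_to_distance_cm` below are made-up placeholders:

```c
/* Beep frequency (Hz) from the interval between two consecutive
   rising edges, timestamped in microseconds.  On an Arduino the
   timestamps would come from micros() in a polling loop or an
   edge interrupt. */
double beep_frequency_hz(unsigned long prev_edge_us, unsigned long curr_edge_us)
{
    unsigned long period_us = curr_edge_us - prev_edge_us;
    if (period_us == 0)
        return 0.0;
    return 1000000.0 / (double)period_us;
}

/* Hypothetical linear calibration: faster beeping means the object
   is closer.  The slope (cm_per_hz) and offset must be fitted from
   real measurements of this particular device. */
double frequency_to_distance_cm(double freq_hz, double cm_per_hz, double offset_cm)
{
    return offset_cm - cm_per_hz * freq_hz;
}
```

The resulting distance value is what you would wire to the numeric indicator in LabVIEW, in place of the raw high/low reading.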

Hi,
In the example I posted before, you don't want to use any of the DAQmx VIs. You would replace these with the Arduino code but keep the same frequency calculation.
What errors are you getting when using the pulses or frequency functions, and which ones have you tried?
You could do a manual calculation of the RPM. Assuming you get one pulse per rotation, count the pulses from the optical sensor for 1 second. This gives you rotations per second; multiply by 60 to get rotations per minute. The drawback is that if the 1-second window ends when the motor is halfway through a rotation, there is no way to account for the partial rotation. If there is more than one pulse per rotation, you will need to factor that into the calculation as well.
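That arithmetic, generalized to more than one pulse per rotation, is just the following (a small illustrative C helper, not part of the original answer):

```c
/* RPM from a pulse count taken over a 1-second window.  With one
   pulse per rotation, pulses-per-second * 60 = RPM; with N pulses
   per rotation, divide by N, as described above. */
double rpm_from_pulse_count(unsigned long pulses_in_one_second,
                            unsigned int pulses_per_rev)
{
    if (pulses_per_rev == 0)
        return 0.0;
    return (double)pulses_in_one_second * 60.0 / (double)pulses_per_rev;
}
```

Counting more pulses per rotation, or counting over a longer window, reduces the half-rotation quantization error mentioned above.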
Regards,
Greg H.
Applications Engineer
National Instruments

Similar Messages

  • LabVIEW for memory read/write?

I know LabVIEW has the In Port / Out Port VIs for I/O-space reads and writes. I am wondering whether LabVIEW has a read/write-memory VI. As we know, Windows 2000/XP do not allow reading or writing I/O ports and memory directly from user level (kernel level is OK).

Maybe this post can be of interest to you (I haven't tried this myself, but it looks like a solution). Hope this helps.

  • How to use IXXAT CAN to USB in LabVIEW for CAN read, when I am having VCI V3 driver?

I have the VCI V3 driver for the IXXAT CAN-to-USB adapter. How can I use it in LabVIEW for reading data from a CAN device?


  • Adaptation of LIFA (LabVIEW Interface For Arduino) to other Arduino compatible boards

Hello everyone,
I am trying to use a "Sakura board", which is an Arduino-compatible board with similar programming to the Arduino. The board is manufactured by Renesas; more specifications can be found at the following link: http://sakuraboard.net/gr-sakura_en.html
What I would like to know is whether the firmware used by LIFA to interface the Arduino with LabVIEW, or the ChipKit firmware, can be used with this kind of Arduino-compatible board.
My aim is to use the Sakura board to acquire signals from various pins (digital or analog) and to set other (digital) pins in parallel, using LabVIEW, the way this is done with LIFA or the LabVIEW interface for ChipKit.
Thanks in advance for your support.

    I haven't had any experience with the Sakura board, so can only go on the information on the website.
    I've used the Arduino LabVIEW interface previously, and it basically uses serial communications between the PC and the Arduino. This serial interface is provided by the on-board USB-serial converter (an ATMega8u2 running custom firmware).
    From what I can tell about the Sakura, it appears as a storage device when plugging into a PC, and doesn't seem to have a USB to serial converter. If this is the case, then the LIFA toolkit won't work. That said, my understanding of the Sakura board may be incorrect (for example I'm unsure what operation modes the slide switch changes between). If you can confirm it has a USB-serial port then LIFA may work. I guess the key thing is that the serial port is required.
    You may be able to use a software serial port (http://arduino.cc/en/Reference/SoftwareSerial) on the Sakura to communicate with a PC, though that would require a PC with a serial port, or an external USB-serial converter. This approach would also require you to modify the LIFA arduino code (located in vi.lib\LabVIEW Interface for Arduino\Firmware\LIFA_Base) to reference the software serial port. It's probably a long shot that this will work, as the software serial library has been written for AVR chips, making use of their specific internal timers.

About the LabVIEW Interface for Arduino Toolkit

Hello,
I would like to install the NI LabVIEW Interface for Arduino Toolkit into LabVIEW 2013.
From the NI home page I went to the free NI add-ons and tried to download it from the LabVIEW Interface for Arduino Toolkit page, but
after Step One, installing VI Package Manager, launching that software shows a message that it cannot connect to the LabVIEW Tools Network or the VI Package Network.
Could someone advise me on how to resolve this?
Thank you in advance.

Thank you for your reply. I am pasting the screens that actually appeared; please take a look.
First, when I start VI Package Manager, the screen on the left is displayed.
An error seems to have occurred; when I press OK anyway, the next image is displayed.
That screen says that no packages exist in the library,
so, as instructed, I opened Tools > Check the VI Package Network for Available Packages.
The next image shows the result.
That screen seems to say that there is no problem,
but when I press OK it returns to the first screen.
When I launched the LabVIEW Tools Network, the screen on the left appeared,
showing that I am not connected to the network.
So, to try to connect, I chose Activation from the Help menu and attempted to register for the 30-day trial, and the screen below was displayed.
It said the server could not be reached, so I searched online for the bold black text shown there; apparently I would have to purchase VIPM Professional.
My belief is that the cause is that the LabVIEW Tools Network has not been connected or activated, but I do not know how to fix it.
My understanding is that VI Package Manager can be used for free, and that the LabVIEW Interface for Arduino Toolkit is also a free add-on, so I would like to install it within the free tier. (I have purchased LabVIEW 2013.)
Thank you in advance.

  • Analog Read Pin VI for Arduino interface (LIFA)

    Hi all,
I'm new to LabVIEW and Arduino but hoping to incorporate both into my classroom. The Uno I have is functional; I've had some success with it already in the sketch editor. I'm hitting a wall with the interface, though.
I downloaded VISA but probably overlooked something in the setup. I can't get the Analog Read Pin VI (from the examples) to work. After setting a constant on Init.vi to specify the correct COM port, I hit Run and I can see the TX/RX lights flashing on the Uno. The data from the analog pin is not being read, however; the meter doesn't move.
Hopefully it's something simple I'm overlooking. I'd really like to move forward and explore LIFA.
    Thanks
    Solved!
    Go to Solution.

    Hi SpuriousD,
    We actually have a specific community group dedicated to LabVIEW Interface for Arduino (LIFA) located at: www.ni.com/arduino. You would most likely get a better response if you posted your question on this board, and it also has some great resources.
    Bill E. | Applications Engineer | National Instruments

  • LabVIEW for ARM 2009 Read from text file bug

    Hello,
If you use the Read from Text File VI to read text files from an SD card, there is a bug when you select the "Read Lines" option:
you cannot select how many lines you want to read; it always reads the whole file, which causes a memory fault on big files!
I fixed this in the code (but the software doesn't recognize an EOF anymore) in CCGByteStreamFileSupport.c.
At row 709 the memory is allocated, but it tries to allocate too much (since you only want to read lines).
Looking at the code, it appears it is supposed to allocate 256 for a string:
Boolean bReadEntireLine = (linemode && (cnt == 0));
if(bReadEntireLine && !cnt) {
  cnt = BUFINCR;    // BUFINCR = 256
But cnt is never false, since if you select "Read Lines" it holds the size of the file!
The variable linemode is also the size of the file... STRANGE!
My solution:
Boolean bReadEntireLine = (linemode && (cnt > 0));  // was: == 0
if(bReadEntireLine) {    // was: if(bReadEntireLine && !cnt) {
  cnt = BUFINCR;
Now the "Read Lines" option does work, and reads one line until it sees CR or LF, or until the count of 256 is reached.
Maybe the code is good but the wiring of the VI's inputs to these variables is not (cnt and linemode are the size of the file!).
count should be the number of lines, like chars in char mode.
linemode should be 0 or 1.
Hope someone can fix this in the new version!
    greets,
    Wouter
    Wouter.
    "LabVIEW for ARM guru and bug destroyer"

I have another solution; EOF works with this one.
cnt is the number of bytes not yet read, so the first time it tries to read (and allocate) 4 MB.
You only want to say: if it's in line mode and cnt > 256 (BUFINCR), then cnt = BUFINCR.
The next time, cnt is the number of bytes still unread: the old value minus the line (up to CR/LF), or minus cnt (256) if that limit is reached.
With this solution the program does not try to allocate the whole file, only a maximum of 256 bytes.
In CCGByteStreamFileSupport.c, row 705:
 if(linemode && (cnt>BUFINCR)){
   cnt = BUFINCR;
Don't use the count input when using the VI in line mode; count does not make sense there, as cnt will be the total file size. Also, the output will be an array.
linemode seems to be the value of the file size, but I checked this and it is just 0 or 1, so that part is fine.
Update: damn, it doesn't work!
    Wouter.
    "LabVIEW for ARM guru and bug destroyer"

Please help me with my electrical engineering homework: temperature control and watering system for a greenhouse using LabVIEW and Arduino

temperature control and watering system for a greenhouse using LabVIEW and Arduino
Specification:
1. Max temp: 28 °C (when the temperature is above 28 °C, the fan turns ON)
2. Min temp: 20 °C (when the temperature is below 20 °C, the heater turns ON)
3. Watering system: aquaponic (plants and fish grow in separate tanks connected to each other). The plant roots help filter the water for the fish; the fish waste fertilizes the plants. So I need a pump to distribute the water.
Please help me create the VI file simulation. I'm sorry I'm not fluent in English. May God bless you all.
    Attachments:
YOOOSHH.vi 88 KB

Duplicate thread. Please keep the discussion in the thread where you already have a response. It is also the more appropriate thread for your question.
    Lynn

  • LabVIEW controlled Arduino outputting self-clocking signal

    Hello All,
I am trying to control my LED light strip (http://www.adafruit.com/products/1376) using LabVIEW and an Arduino. To do so I need to make the Arduino output a digital self-clocking signal, specifically a combination of two square waves of fixed frequency with varying duty cycle. The two waves represent either a 0 or a 1 to the LED strip.
    0   =   HIGH for 400 ns then LOW for 850 ns
    1   =   HIGH for 800 ns then LOW for 450 ns
    In the end, using either of those waves as a bit, I would like the full signal to have 3600 bits. After one full signal the pin would remain LOW until I want to change the color again, and would send another 3600 bit signal.
    (If you want more info on the data transmission protocol, there is more info here: http://learn.adafruit.com/adafruit-neopixel-uberguide/advanced-coding#writing-your-own-library)
Is there a way to control the output of the Arduino at such high speeds through LabVIEW? If you can think of any way of doing this, please let me know.
    Thanks!

AFAIK, the Arduino LabVIEW interface module only reads and writes pin values, and it can't do this with the timing precision you need. The only way around this is to write some Arduino code, which can't be done in LabVIEW.
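For reference, the bit encoding from the post can be tabulated in C. This only captures the timing values; actually producing edges at 400-850 ns on an Arduino takes cycle-counted assembly or a dedicated library (for example Adafruit's NeoPixel library), not plain digitalWrite calls:

```c
/* High/low durations, in nanoseconds, for one bit of the LED strip
   protocol described in the post. */
typedef struct {
    unsigned high_ns;
    unsigned low_ns;
} bit_timing;

bit_timing encode_bit(int bit)
{
    bit_timing t;
    if (bit) {          /* 1 = HIGH for 800 ns, then LOW for 450 ns */
        t.high_ns = 800;
        t.low_ns  = 450;
    } else {            /* 0 = HIGH for 400 ns, then LOW for 850 ns */
        t.high_ns = 400;
        t.low_ns  = 850;
    }
    return t;
}
```

Note that both symbols last 1250 ns total, so a full 3600-bit frame takes about 4.5 ms.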

  • When using DAQ assistant to read frequency

    When using DAQ assistant to read frequency and Task timing is set to:
    N Samples, Clock settings to read 26,
    Frequency setup to rising edge,
    1 counter with 10 kHz to 1 kHz range.
    I get back a single number.
    Can I assume this is an average reading of 25 samples with the first sample unused?
    What is the base clock used?
    Is the “26” 26 cycles of the frequency to be measured?

    Hello,
    If you choose to acquire N samples from the DAQ Assistant then it will acquire all of these samples and return them to LabVIEW as an array.  However, when you use the DAQ Assistant it outputs the data in the dynamic data type first.  This data type makes it easy to graph and run the data through other express VIs.  If you were to create a Numeric Indicator from this data type it would just display the last element from the dynamic data array.  To display this data properly in a numeric format convert the dynamic data to an array of doubles by using the Convert From Dynamic Data function in LabVIEW.  Then you can select to convert it to a 1D array of scalars and when you create an indicator off of the output of this function all of the data should be displayed.
    The timebase that is used for lower frequency measurements is the onboard clock, which is internally connected to the Source.  Then you connect your signal to be measured to the Gate of the counter.  Since the frequency of the onboard clock is known it can be used to calculate the frequency of an unknown source based on when the counter is on and off (determined by the Gate). 
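The arithmetic behind that counter scheme is worth making explicit (a sketch; the 80 MHz timebase used in the test is only an example value, so check your device's specifications for the actual onboard clock):

```c
/* Frequency of the signal on the Gate, computed from how many ticks
   of the known onboard timebase (wired to the Source) were counted
   during one period of the Gate signal. */
double gate_frequency_hz(double timebase_hz, unsigned long ticks_per_gate_period)
{
    if (ticks_per_gate_period == 0)
        return 0.0;
    return timebase_hz / (double)ticks_per_gate_period;
}
```

At 80 MHz, a 1 kHz input accumulates 80,000 ticks per period; lower input frequencies accumulate more ticks per period and are therefore resolved more precisely.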
    Have a good day,
    Brian P.
    Applications Engineer

  • LabView for ARM - MCB2300 Audio

    Hi, and thanks for reading!
  My name is Chuck and I'm an undergrad ME student taking a mechatronics course. We were asked to create a proximity alarm with the MCB2300 and an IR proximity sensor. I have the entire program running correctly, but this lab has brought up a couple of questions about how to better implement audio with LabVIEW for ARM processors.
  I understand how interrupts work, and I've seen a couple of examples online of using an interrupt with a timed loop, but I believe the current version of LabVIEW (2010) doesn't support that feature any more. I had a couple of questions about how to get a similar feature to work with LV 2010.
  I was thinking I could have the proximity trigger enable an interrupt that I could use to generate higher-quality audio than I am already making with a While - Timer loop. However, I'm not sure how to make an increment in the interrupt VI without using some form of a loop. The solution I'm thinking of at the moment is to make a For loop run once and to have an incrementing integer separate from the loop iteration (which would only go from 0 to 1) that stores its most recent value in the shift register.
  My other question is about playing a sound file through the MCB2300. I wrote a VI that reads a .wav file and writes each sample as the output needed to drive a speaker, but that decompression turns a 10 KB .wav file into a 300 KB text file. I also don't have a way to really load the text file onto the board. Is there any reasonable way to go about this? I found an example online that processes audio data using the MicroVision software, but I don't want to learn a new language to implement this.
  Sorry for such a long post; I just had a couple of questions and was looking for some feedback. Any help would be greatly appreciated.
    Thanks so much!

    charlestrep91 wrote:
    Hi everyone,
I just got my LabVIEW for ARM Cortex-M3 evaluation kit, and I can't download a simple program to the target. I'm using the Keil ULINK 2 programmer, and I get this error when compiling/downloading:
    [4:23:16 PM] Status: Error
    SWD Communication Failure
    Error: Flash Download failed  -  Target DLL has been cancelled
    Detail: [UVSC_PRJ_FLASH_DOWNLOAD, MSG: 0X100A, STATUS: Ex.] (1) 
    Status: FLASH download error.
    I have read about this error and NI simply refers to Ulink2 user's guide which has this description for this error:
    Serial Wire Debug communication is corrupted. The target SWD interface is not working properly. Mainly caused by the target: debug block not powered or clocked properly. Avoid Deep-Sleep modes while debugging. Lower the Max Clock frequency in the ULINK USB-JTAG/SWD Adapter section.
    I have tried to "Lower the Max Clock frequency in the ULINK USB-JTAG/SWD Adapter section" but it didn't resolve the problem.
    I have also tried to download the program using the usb port on the dev board but instead I get this error:
    [4:51:22 PM] Status: ErrorUnexpected error occurred.
    [Source: Target is in debug mode
    Detail: [UVSC_PRJ_ADD_GROUP, MSG: 0x1002, 
    STATUS: 0xA] Code: 10]
    What am I supposed to do with that?? I'm wondering if the dev board is defective. And this was supposed to be plug and play...
    Any help is greatly appreciated!
I'll ask the obvious question: are you intending to use SWD, or just to download through the JTAG? Check your settings. I have not used the ARM with LV, but can you download anything using the Keil software? Give that a try; that may tell us where the problem lies. Then try to duplicate your settings in LV from the Keil software.
    Reese, (former CLAD, future CLD)
    Some people call me the Space Cowboy!
    Some call me the gangster of love.
    Some people call me MoReese!
    ...I'm right here baby, right here, right here, right here at home

Microphone calibration for dB reading from a PC sound card?

We are trying to acquire a sound signal from a regular laptop sound card into LabVIEW and obtain a sound-intensity reading and a frequency reading. Through FFT we are able to find the strongest signal, but the dB readings are all over the map. Using a sound generator we were able to conclude that the frequency readings were indeed correct, but the sound intensity was way off and non-repeatable. So what I'm looking to know is whether it is possible to calibrate the mic within LabVIEW, or to read a sound-level measurement in dB with a regular PC mic and sound card. Even if it were just a gain measurement, how do I set the baseline for 0 dB? Please help!
    thanks
    Adrien Samson
    Halla Climate Control INC

A computer sound card is simply a low-cost audio input/output. Low cost means it usually has no calibration options at all, and even if it had, Windows would not know how to access them, much less allow an application to use them through a known API.
So you have to do your calibration yourself, in software. The way that works is to apply a known signal, measure the sound signal in volts, and then calculate a calibration function that translates those volts into the dB you know you applied. An additional problem could be that the analog path on your sound card is good enough for the human ear but totally useless for measurement purposes, since the amplifier transfer function is anything but constant over the frequency range. Sound measurement with cheap microphones is another big problem, since such microphones are at best a guesstimator, not an accurate measurement device.
Basically you have to repeat the calibration at the same frequency and also at various frequencies, in order to see whether the microphone is actually useful at all (repeatability of the calibration value at the same frequency) and reasonably constant over the frequency range, for both the mic and the sound card.
With most standard sound cards none of this is a given, and unless you also use a really good microphone, even the most accurate sound card can't do much for you.
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • Labview interface arduino

Hi, I am using an Arduino with LabVIEW to automate an extruder machine for my master's project. Can you please tell me how to obtain a wiring diagram for the Arduino circuit after completing the graphical circuit in LabVIEW? Can I get some more examples of the LabVIEW interface for Arduino related to motion control and automation?
    Thanks & regards,
    Akshay Wankhede

    Hello Askhu,
    LabVIEW plus Arduino sounds like a fun combination!
    SparkFun even sells a kit for it, which is handy because they also provide a good starting point for a support community:
    https://www.sparkfun.com/products/10812
    And we sell an interface toolkit that may be helpful:
    http://sine.ni.com/nips/cds/view/p/lang/en/nid/212478
    And our own community page appears to start here:
    https://decibel.ni.com/content/groups/labview-interface-for-arduino
    I don't know much about any of this, but these seem like good places to start looking for more information!
    Good luck,
    Edwin!

I am trying to integrate a Simulink model (.mdl) file with the SIT of LabVIEW for RCP and HIL purposes. I am using LabVIEW 8.6, Simulink 6.6 with RTW 6.6 and RTW Embedded Coder 4.6, Visual C Express 2008 and Visual C++ Express 2008.

I am trying to integrate a Simulink model (.mdl) file with the SIT of LabVIEW for RCP and HIL purposes. I am using LabVIEW 8.6, Simulink 6.6 with RTW 6.6 and RTW Embedded Coder 4.6, Visual C Express 2008 and Visual C++ Express 2008. I have selected the system target file as nidll.tlc, the make command as make_rtw, and the template nidll_vs.tmf. When I try to generate the .dll file I get the following error.
    Attachments:
SITProblem.JPG 101 KB

    Hi,
No, I could not solve the issue. Presently we are using a MicroAutoBox (from dSPACE) for the RCP.
Himadri

How do you add a third-party sensor to LabVIEW for Lego Mindstorms?

I recently purchased an IR sensor from Mindsensors (DIST-Nx-Long-v3), which I need for a SLAM (Simultaneous Localization and Mapping) application that I am developing using the LabVIEW for Lego Mindstorms software. I installed the Mindsensors IR sensor, and it works under NXT-G and RobotC, but I am having trouble finding a way to get LabVIEW for Lego Mindstorms to install the sensor.
    The Mindsensors website gives the following instructions for installing the IR Sensor:
1. Unzip the folder mindsensors.com LVEE
2. Open a blank VI in LVEE
3. On the Block Diagram, go to Tools -> Advanced -> Edit Palette Set...
Unfortunately, on the Block Diagram of LabVIEW for Lego Mindstorms, there is no "Advanced -> Edit Palette Set" under Tools.
As an alternative, I consulted the documentation that came with LabVIEW for Lego Mindstorms. The Schematic Editor of LabVIEW for Lego Mindstorms lists several sensors, i.e. the Lego Mindstorms sensors and several HiTechnic (HT) sensors, but there are no procedures listed in the documentation for adding other third-party sensors to the Functions palette.
    So, how does one go about adding a third party sensor to LabVIEW for Lego Mindstorms?

    Hi Ethan,
As you can see from my Word document, I am a little light on the proper terminology. That's because LVLM comes with inadequate documentation.
I have already followed your recommended protocol for installing a third-party sensor (in fact, it's the protocol recommended by Mindsensors) with the application set in Remote mode (.lvrbt), and it does create a sub-palette with all the Mindsensors functions on it. But when I drag the Mindsensors icon to the Block Diagram and select "Distance Sensor," the Distance Sensor (an IR sensor) doesn't work (even though the Mindsensors Distance Sensor does work with NXT-G, RobotC, and LVLM under other circumstances; see below).
If I repeat the above process with the application set in Direct mode (.vi), I also get the sub-palette with all the Mindsensors functions on it. When I drag the Mindsensors icon to the Block Diagram and select "Distance Sensor," the Distance Sensor does work.
What I need for my mapping application is for the Distance Sensor to work in Remote mode. I called NI tech support, and the first engineer told me to simply drag the Mindsensors functions (.vi) onto the Block Diagram. I did this, but when I selected the Distance Sensor, the icon appeared but the sensor did not work. Since I have no idea what's under the hood of a VI or a function, I assumed that simply dragging the VI/function onto the desktop didn't install it properly. I went back to the Applications Engineer, and he confessed that he did not understand the LVLM product.
My frustration is being punted to new people, none of whom so far (other than you, of course) understand LVLM.
