7334: digital signal processor for position tracking

Hello!
I'm using the 7334 motion control board, LabVIEW 7.1, and MDrive23 motors (stepper motors with integrated encoders).
In my first tests with MAX, closed-loop mode behaved poorly. For example, after running back and forth between 0 and 10000 several times, the motor stopped at position 12 instead of the target position 5000. Or it stopped at 4740 after starting and stopping several times around 5000.
Increasing the deadband to 50 didn’t help.
Reducing the velocity, I could reach 5043 instead of 5000, but 500 steps/s is a very low velocity and the result is still not satisfying.
Therefore I tried to control the position from LabVIEW, and that worked quite well: in a loop I read the current position, compare it to the target position, and stop the motor accordingly.
But I'm a bit surprised that I have to do this manually. I thought the digital signal processor on the board would do it for me: stop the motor when the target position is (nearly) reached and set the "move complete" flag.
Doing it with software intervention is quite time-critical for us. We need velocities much higher than 1000 steps/s, and I imagine a software deadband would have to be rather large to work.
Where is the trick?
(Thank you!)
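For illustration, here is a minimal sketch of the software position check described above, written in Python-style pseudocode. start_move(), read_position() and stop_motor() are hypothetical placeholders, not the actual NI-Motion API; the point is only the structure of the loop and why its deadband grows with velocity.

import time

TARGET = 5000      # target position in steps
DEADBAND = 50      # acceptable error band in encoder counts

def move_with_software_check(target=TARGET, deadband=DEADBAND):
    start_move(target)                  # command the move (hypothetical call)
    while True:
        pos = read_position()           # current encoder position (hypothetical call)
        if abs(target - pos) <= deadband:
            stop_motor()                # stop as soon as we are inside the band
            return pos
        time.sleep(0.001)               # poll roughly every millisecond

At high velocities, the distance travelled during one polling interval plus the deceleration distance determines how large such a software deadband has to be, which matches the concern above.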

This type of error typically occurs when the settings for encoder counts/rev and stepper steps/rev are wrong. Please refer to this document, which explains how to find the correct settings for your hardware.
You may also find the information in this thread useful (especially the last post in this thread).
Jochen Klier
National Instruments Germany
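To make the scaling issue concrete, here is a small worked example; the numbers are assumptions chosen for illustration, not the actual MDrive23 or 7334 settings.

# Closed-loop stepping compares commanded steps (converted into expected encoder
# counts) against the counts the encoder actually reports. If counts/rev is
# configured wrong, the two scales disagree and the loop settles in the wrong place.
steps_per_rev       = 200     # assumed stepper resolution
counts_per_rev_real = 2048    # assumed real encoder resolution
counts_per_rev_cfg  = 4096    # wrong value entered in MAX (assumption)

commanded_steps = 5000
expected_counts = commanded_steps * counts_per_rev_cfg / steps_per_rev    # what the board waits for
actual_counts   = commanded_steps * counts_per_rev_real / steps_per_rev   # what the encoder delivers at 5000 steps

# Roughly speaking, the board keeps correcting until it sees expected_counts,
# i.e. until the motor has physically moved commanded_steps * (cfg / real)
# = 10000 steps instead of 5000.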

Similar Messages

  • Using the Microsoft Kinect for Position Tracking in LabVIEW

    Hi,
    I am currently working on a project to take 2-D ultrasound images and reconstruct them into a 3-D image for a physician to use. In this project, we need to be able to know the relative position of the ultrasound probe in order to connect the 2-D images to their position on the part of the body being imaged.
    My current plan is to utilize the Microsoft Kinect in LabVIEW to track the ultrasound probe as it takes 2-D images, in order to later organize the images by position.
    Can anyone point me in the right direction for this task? I have found many different utilities for LabVIEW use of the Kinect, but I am specifically looking to track one single point as it moves a short distance. I could even put a red ball on the end of the probe, and have the Kinect only track the movement of that red ball.
    Any help with this task, or any integration of the Kinect with LabVIEW would be a great help,
    Boutros

    Hello all,
    I wanted to update everyone on how our project is coming, and for some further help.
    We have followed some of the work of this group: https://decibel.ni.com/content/docs/DOC-16978
    and have tested a large portion of their code. The benefit of this project is that they have isolated individual limbs to be tracked by the Kinect. So, we can choose to isolate and track the "right arm" if the physician is using the ultrasound probe with that arm.
    We have tested the code and it is working successfully. The only problem is that we are unsure how to export the position data into our reconstruction code. We want to take the position data acquired by this LabVIEW routine, as it is sampled in the time domain, and export it for use in the reconstruction of the 2-D ultrasound images. I am unsure how to take this data, which is provided in real time, and store it as position data points sampled in time.
    Please let me know if any clarification is needed about my issue or the project in general.
    Any help would be greatly appreciated,
    Peter
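    As a rough illustration of the logging step Peter is asking about, here is a small Python sketch that stores each tracked position together with a timestamp. get_tracked_position() and tracking_active() are hypothetical placeholders for whatever the Kinect/LabVIEW routine provides, and the 20 Hz sample period is an assumption.

    import csv
    import time

    SAMPLE_PERIOD = 0.05   # 20 Hz logging rate (assumption)

    with open("probe_positions.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t_seconds", "x", "y", "z"])
        t0 = time.time()
        while tracking_active():                 # hypothetical stop condition
            x, y, z = get_tracked_position()     # hypothetical Kinect read
            writer.writerow([time.time() - t0, x, y, z])
            time.sleep(SAMPLE_PERIOD)

    The reconstruction code can then interpolate between these time-stamped samples to find the probe position at the moment each 2-D ultrasound frame was captured.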

  • Wait for the digital signal

    Is there something like "Wait for Front Panel Activity" for digital signals, e.g. a "wait until the digital signal goes high"?
    Or is it bad practice to do it with a sequence structure and a while loop inside the first frame?

    If you need to wait for a digital signal before continuing with your program, the best way is to have a while loop that continuously reads the signal and stops when the signal goes high. You can put this loop in the first frame of a sequence, or you can wire an output from the loop (such as the error out from the device read VI) into your next step. Be sure to put a delay (100 ms) in your loop so that it does not hog CPU time.
    - tbob
    Inventor of the WORM Global
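    For comparison, the same polling idea looks roughly like this with the nidaqmx Python driver (the thread is about LabVIEW, so treat the device and line names as assumptions):

    import time
    import nidaqmx

    def wait_for_high(line="Dev1/port0/line0", poll_s=0.1):
        """Block until the digital line reads high, polling every poll_s seconds."""
        with nidaqmx.Task() as task:
            task.di_channels.add_di_chan(line)
            while not task.read():      # current boolean state of the line
                time.sleep(poll_s)      # ~100 ms delay so the loop does not hog the CPU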

  • Generate digital signal for 6722 or 6221

     Hi,
    first thanks for the help I already got on this board.
    Now I've got the following problem:
    I would like to generate a digital signal: high for 300 µs, low for 300 µs, high for 300 µs, low for 300 µs, then high for about 2 ms.
    I am looking for a way to generate this digital sequence. At the same time I will read the information back via an analog input (hopefully I now have a solution for that part).
    I tried to find examples but wasn't successful. So far I have not been able to produce any digital sequence that worked.
    My hardware is either 6722 or 6221 - which should be OK regarding the timing.
    Thanks

    Find attached example.
    Attachments:
    Digital Waveform [LV 2009].vi (14 KB)
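    For readers without the attachment, one possible way to produce the pattern is a hardware-timed digital output, sketched here with the nidaqmx Python driver. The device name, line and the 100 kHz sample clock are assumptions, and hardware-timed DO is only available on ports that support it (port0 on the 6221, for example).

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    RATE = 100_000   # 100 kHz sample clock -> 10 us per sample (assumption)
    # 300 us high, 300 us low, 300 us high, 300 us low, then ~2 ms high:
    pattern = ([True] * 30 + [False] * 30) * 2 + [True] * 200

    with nidaqmx.Task() as task:
        task.do_channels.add_do_chan("Dev1/port0/line0")
        task.timing.cfg_samp_clk_timing(rate=RATE,
                                        sample_mode=AcquisitionType.FINITE,
                                        samps_per_chan=len(pattern))
        task.write(pattern, auto_start=True)
        task.wait_until_done()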

  • Sequence: 1) create an output digital signal, 2) make a delay of microseconds or less, 3) measure a digital signal with board 6023E

    hello
    I need to create a digital signal, and for this I use a counter output; then I need a delay, after which a signal measurement follows, and for this I use a counter again. My problem is how to generate this delay between creating and measuring the signal; the delay must be as short and as precise as possible. One of my attempts was to build a sequence and, to make the delay, create a task for a virtual output with a maximum frequency of 5,000,000 Hz and just one pulse, but I don't know how long LabVIEW takes to go from the first task (create an output pulse and end it) through the delay task just described to the last task (measure a digital signal). Another thing I don't know is how long LabVIEW takes to create and clear a task. One piece of information beyond the kind of board (6023E) that may be useful for this problem is my PC: in this case I use a Pentium 4 at 3 GHz.
    thanks  

    ...My problem is how I can make this delay between creating and measuring a signal.
    ...how long LabVIEW takes to end one task and start another one.
    What I tried to say is that even if you use the counter of the 6023E board to create the delay but trigger the measurement from LabVIEW under Windows, you leave the real-time path. The execution time of LabVIEW code is then not deterministic.
    One way to solve this is to trigger the measurement in hardware and use LabVIEW only to read the data from the buffer of the DAQ card.
    You can find more information about DAQ in NI's measurement manual, delivered with your LabVIEW software.
    Lorand
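    Sketched with the nidaqmx Python driver, the hardware-triggered approach Lorand describes could look roughly like this; the device name, trigger terminal and rates are assumptions, and the counter-output task that produces the delayed pulse is not shown.

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    with nidaqmx.Task() as ai:
        ai.ai_channels.add_ai_voltage_chan("Dev1/ai0")
        ai.timing.cfg_samp_clk_timing(rate=10_000,
                                      sample_mode=AcquisitionType.FINITE,
                                      samps_per_chan=1000)
        # Start the acquisition on the rising edge of the counter pulse, so the
        # delay is defined by hardware rather than by Windows/LabVIEW timing:
        ai.triggers.start_trigger.cfg_dig_edge_start_trig("/Dev1/Ctr0InternalOutput")
        ai.start()                                          # armed, waiting for the trigger
        # ... generate the counter pulse here (separate counter-output task) ...
        data = ai.read(number_of_samples_per_channel=1000)  # read the buffered data afterwards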

  • Is there any tutorial for using the ARM Cortex-A processors of Zynq for digital signal processing?

    Hello, everyone.
    Is there any tutorial for using the ARM Cortex-A processors of Zynq devices (such as the A9 and A53) for digital signal processing problems?
    Please tell me, thanks.

    Hi
    Check below links
    http://www.xilinx.com/training/zynq/software-acceleration-for-dsp-functions-with-zynq.htm
    https://www.youtube.com/watch?v=ErEG7ZREcJQ
    http://www.xilinx.com/support/documentation/application_notes/xapp1170-zynq-hls.pdf
    http://www.xilinx.com/support/documentation/application_notes/xapp890-zynq-sobel-vivado-hls.pdf
    http://www.xilinx.com/support/documentation/application_notes/xapp1167.pdf

  • Generating a digital signal

    Hello,
    I am trying to generate a digital signal that can be controlled in time, i.e. switching ON for 10 minutes and OFF for 5 minutes. I managed to generate a digital signal that goes high, but I am unable to control it. As I am a beginner in LabVIEW, any kind of help is appreciated.
    Please see the attachment for the developed block diagram
    Attachments:
    digital signal generation.vi (17 KB)

    It appears that all you posted is one of the shipping examples. Is that all you've tried? You can't control it because you're not actually doing anything inside the loop. You're simply setting the value high before the loop starts, and not changing it in the loop. What kind of device do you have? Is it software controlled? If so, you will need to keep track of the time inside the loop. You can use the Elapsed Time VI to do this. Attached is a simple example to get you started to see how this can be done. I'm sure you can figure out how to integrate what you did and what I've shown you.
    To learn more about LabVIEW it is recommended that you go through the introduction material, tutorial(s), and other material in the NI Developer Zone's Learning Center which provides links to other materials and other tutorials. You can also take the online courses for free.
    Attachments:
    Switching signal.vi (29 KB)
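    The same on/off cycling can be written as a software-timed loop; this sketch uses the nidaqmx Python driver (device name and line are assumptions) and is only as accurate as a software sleep, which is more than enough for 10 minute / 5 minute switching.

    import time
    import nidaqmx

    ON_SECONDS  = 10 * 60
    OFF_SECONDS = 5 * 60

    with nidaqmx.Task() as task:
        task.do_channels.add_do_chan("Dev1/port0/line0")
        while True:
            task.write(True)            # switch the line ON
            time.sleep(ON_SECONDS)
            task.write(False)           # switch the line OFF
            time.sleep(OFF_SECONDS)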

  • Is there a better way to generate custom-timed digital signals

    I am trying to generate digital outputs of highs and lows with particular delays on different lines. Each DAQ Assistant activates a single line on a port of the USB-6501. There are more complex highs and lows that I need to generate with variable time differences between high and low. The code below accomplishes what I am trying to achieve, but for a long pattern of high and low signals it is very time-consuming to do it this way. I am sure there is a better way; I am not an expert on LabVIEW, so I haven't uncovered its full potential. Can anybody suggest a more effective and quicker way to do this? I would highly appreciate it. Thanks!
    It is not shown in the code below, but through the DAQ Assistant I have initialized the lines to low logic level.

    See the attached file
    Attachments:
    Digital Signal.vi (335 KB)
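    One software-timed alternative to a DAQ Assistant per line is a single task that drives all lines of the port and steps through a (states, duration) table. The sketch below uses the nidaqmx Python driver; the device name and example pattern are assumptions. Note that the USB-6501 has no hardware sample clock, so timing accuracy is limited to what the operating system can provide (roughly milliseconds).

    import time
    import nidaqmx
    from nidaqmx.constants import LineGrouping

    # Each entry: (one boolean per line of port0, time in seconds to hold that state)
    pattern = [
        ([True,  False, False, False, False, False, False, False], 0.010),
        ([True,  True,  False, False, False, False, False, False], 0.025),
        ([False, False, False, False, False, False, False, False], 0.050),
    ]

    with nidaqmx.Task() as task:
        task.do_channels.add_do_chan("Dev1/port0/line0:7",
                                     line_grouping=LineGrouping.CHAN_PER_LINE)
        for states, hold_s in pattern:
            task.write(states)      # update all eight lines at once
            time.sleep(hold_s)      # hold the state for the requested duration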

  • How to input/output a digital signal and acquire an analog signal at the same time?

    Dasylab, version: 8.0.04
    Acquirement Card: PCI1002L
    Acquiring analog signals with DasyLab is no problem as long as there are no digital inputs or outputs, and using DasyLab for digital input or output alone is no problem either, but when I do both at the same time, DasyLab tells me the rate is too high and stops.
    So I searched the manual (user guide), and it says:
    To internally equalize measurement time and system time in the analog input, digital input and counter hardware modules, use the following settings:
       Synchronization: PC Clock
       Sampling rate: <= 5 Hz
       Block size: = 1
    The problem is that if I set the sampling rate to 5 Hz, the data acquisition speed is not enough for my application.
    So, how can I improve it? Can anyone give me an example program? Thanks!
    by the way, I come from China, my English isn't good, I'm sorry.
    Allen, China.

    Hi,
    Have things changed over the years?
    I need to synchronise a digital output module (NI 9474) and an analog input module (NI 9203). I need to measure time intervals from an edge in signal A to an edge in signal B, and I would like accuracies on the order of 1 ms. Currently the signals are not synchronised, with errors on the order of twice the block length (block size / sample rate), sometimes much higher. The best I got so far was a block size of around 20 at a sample rate of 1 kHz.
    If I use the master and slave settings in the RTSL settings, my program doesn't run properly.
    If I use digital signals for both input and output, I can synchronise them with RTSL settings and everything is good, but I can't always do that.
    Also, if I do anything in the GUI (such as scrolling or going to another window), my output gets badly messed up.
    1. What can be done to synchronise AI with DO?
    2. Is there something that can be done to avoid messing up the output when something happens in the user interface? (I can tell when the outputs get messed up because they switch some valves, and that is loud.)
    Thanks in advance!

  • How do I convert analogue 5.1 surround sound into a digital signal? (Pioneer SE-DIR800C headphones)

    I understand that my SB Audigy 2 card will only output 5.1 sound from games through its 3 analogue connectors. How do I convert this analogue signal into a digital one? What hardware do I need to purchase?
    I have read the thread on "Digital Connections, SPDIF and Dolby Digital Info", and also I've thoroughly read the "Creative Speaker Connectivity Guide".
    The reason I ask is that I have just bought a pair of Pioneer SE-DIR800C surround sound headphones. These are supposedly fantastic for recreating surround sound on headphones because of a clever little decoder box. They accept a DTS / Dolby Digital signal via digital coaxial and optical inputs, and there is a single analogue input which only accepts stereo sound.
    My Xbox and PS2 will output true digital 5.1 sound via an optical cable, and I am confident this will work perfectly with these headphones. I'm really looking forward to playing Halo in surround sound! But for gaming on my PC, I'll be limited to just an analogue stereo signal, unless I can find some device that will convert my SoundBlaster's analogue 5.1 outputs into a digital signal, which I can then plug into the little decoder box for these headphones.
    So, back to my question:
    What hardware do I need to buy to convert an analogue 5.1 signal (via 6 x RCA or 3 x 3.5 mm stereo) into a true digital signal (via optical or coaxial S/PDIF)? Is it some type of headphone amplifier I need? If possible, please recommend makes and models of equipment.

    Thanks, Stefan.
    OK, basically I have found 3 options for getting digital 5.1 out of a PC:
    a) Creative DTS-610 (approx. $90). This converts the SB's analogue output into digital, but it's not available in the UK and it also introduces a noticeable 50 ms sound delay. Also, who wants an extra box hanging out of their PC?
    b) Buy a new sound card which has Dolby Digital Live output. For example, 1) BlueGears HDA X Mystique 7.1, 2) Turtle Beach Montego DDL, or 3) Terratec Aureon 7.1 PCI (NOT the Space, FireWire or Universe cards). I couldn't find the first two available in the UK, so I have just ordered the Terratec card from Komplett.
    c) Buy a new motherboard that has built-in Dolby Digital Live output, for example the ABIT Fatal1ty AA8XE. Unfortunately my PC is only just over a year old and I'm not quite ready to replace it.
    I hope this info is useful for people. Maybe Creative will start making a card with DDL output too.

  • How to check that 6 digital signals change value at the same time with a PCI-6229?

    I am using DAQ card PCI-6229.
    Channel 1 generates a digital signal.
    Channels 2, 3, 4, 5, 6 and 7 acquire digital signals.
    I want to check:
    1. Whether the rising edges of channels 2-7 occur at the same time;
    2. Whether the time delay from the rising edge of channel 1 to the rising edges of channels 2-7 is within a certain range.
    I know I can use a counter to measure the separation between two edges, but I only have two counters, and it is too time-consuming to check the channels one by one.
    I don't know how to check that the rising edges of 6 different channels occur at the same time.
    Does anyone have any suggestions?
    Thanks

    Hello,
    You can use the DAQ card's digital input change detection circuitry to detect changes on the input lines; you can then use a counter to measure the relative time between samples. Please read "DI Change Detection Applications" on page 6-9 of the manual for more information. Let me know if this helps.
    Christian A
    National Instruments
    Applications Engineer
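    For reference, the change-detection idea looks roughly like this with the nidaqmx Python driver (the device and line names are assumptions). Each returned sample is the port state captured when one of the watched lines changed and, as Christian notes, a counter is still the right tool for measuring the relative edge times precisely.

    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    with nidaqmx.Task() as task:
        task.di_channels.add_di_chan("Dev1/port0/line1:7")
        # Sample the port whenever a rising edge occurs on any of the watched lines:
        task.timing.cfg_change_detection_timing(
            rising_edge_chan="Dev1/port0/line1:7",
            sample_mode=AcquisitionType.CONTINUOUS)
        task.start()
        # Blocks until ten change events have been captured; each sample shows
        # which lines were high at that moment.
        samples = task.read(number_of_samples_per_channel=10)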

  • Using the XSLT processor for non-workbench XSLT

    Hi there,
    is it possible to use the built-in XSLT processor for arbitrary XSLT transformations which aren't checked into the ABAP Workbench but are instead given as a runtime object (string or iXML)?
    Instead of the built-in command CALL TRANSFORMATION which according to the doc is restricted to workbench transformations, I am looking for an option like this:
    data: lo_transformation type ref to if_ixml_document,
          lo_source         type ref to if_ixml_document,
          lo_target         type ref to if_ixml_document.
    * I get lo_transformation and lo_source from somewhere out there
    try.
        lo_target ?= cl_some_fine_class_which_i_am_looking_for=>transform(
                          io_source         = lo_source
                          io_transformation = lo_transformation ).
      catch cx_xslt_runtime_error.
    endtry.
    Does anybody know such a feature?
    For a background about this problem - in German language - see my blog
    http://ruediger-plantiko.blogspot.com/2007/08/xslt-in-bsp-anwendungen-und-in-abap.html
    Thanks and Regards,
    Rüdiger

    Dear Rashid,
    thanks - this is the answer! I wonder why I didn't find this class one year ago. A little test program shows that it works fine and even performs well (about 0.5 ms for creating the new dynamic XSLT program with the method set_source_stream( )). For usage in web apps, it would be nice to know whether the temporary program remains available in the application server's buffer after the end of the process. I can't check this, since it happens at the C/C++ level, and SE30 doesn't track the method set_source_stream() itself (it could show a decrease in runtime after the first call).
    Here comes a little self-contained ABAP program to test the functionality. It works well on our system with SAPKB70012.
    Thanks and regards,
    Rüdiger
    * --- Test usage of a dynamically given non-workbench XSLT program
    report  zz_test_cl_xslt_processor.
    data:
    * iXML master
      go_xml type ref to if_ixml,
    * iXML stream factory
      go_sf  type ref to if_ixml_stream_factory.
    load-of-program.
      go_xml = cl_ixml=>create( ).
      go_sf  = go_xml->create_stream_factory( ).
    start-of-selection.
      perform start.
    * --- Start
    form start.
      data: lo_source    type ref to if_ixml_document,
            lo_result    type ref to if_ixml_document,
            lo_processor type ref to cl_xslt_processor,
            lv_p         type progname,
            lo_ex        type ref to cx_xslt_exception.
      perform get_source changing lo_source.
      create object lo_processor.
      try.
    * Set source
          lo_processor->set_source_node( lo_source ).
    * Set result
          lo_result = go_xml->create_document( ).
          lo_processor->set_result_document( lo_result ).
    * This could be time-critical, the creation of a dynamical XSLT prog?
          perform set_transformation using lo_processor
                                     changing lv_p.
    * call xslt-proc
          lo_processor->run( lv_p ).
    * Display result
          call function 'SDIXML_DOM_TO_SCREEN'
            exporting
              document    = lo_result
              title       = 'Result of Transformation'
            exceptions
              no_document = 1
              others      = 2.
        catch cx_xslt_exception into lo_ex.
          sy-msgli = lo_ex->get_text( ).
          message sy-msgli type 'I'.
      endtry.
    endform.                    "start
    * --- Set XSLT transformation from stream
    form set_transformation using io_processor type ref to cl_xslt_processor
                            changing cv_p type progname.
      data: lo_trans     type ref to if_ixml_istream.
    * sv_p contains temp. name of XSLT program after first call
      statics: sv_p   type string.
      if sv_p is initial.
    * It seems that the name can be buffered on appserver level?
        import progname to sv_p
               from shared buffer indx(zx) id 'ZZ_TEST_XSLT_PROC'.
        if sv_p is initial.
          sv_p = 'X'.
        endif.
      endif.
    * Provide the stream containing the XSLT document (as a stream)
      perform get_transformation changing lo_trans.
    * Set transformation
      io_processor->set_source_stream( exporting stream = lo_trans
                                       changing  p      = sv_p ).
    * Buffer progname on server - seems to work
      export progname from sv_p
             to shared buffer indx(zx) id 'ZZ_TEST_XSLT_PROC'.
    * string -> c move necessary, since xslt-proc-interface doesn't use
    * the generic type csequence for program name
      cv_p = sv_p.
    endform.                    "set_transformation
    * --- Parse a source given as string into an if_ixml_document
    form get_source changing co_src type ref to if_ixml_document.
      data: lv_s      type string,
            lo_stream type ref to if_ixml_istream,
            lo_parser type ref to if_ixml_parser.
      concatenate
    `<?xml version="1.0" encoding="iso-8859-1"?>`
    `<countings filiale="2412" invnu="TIEFKUEHL SEPT.07">`
    `<count recNum="1" gid="1" ean="59111828843" menge="1"`
    `preis="0" recNumFrom="1"></count>`
    `</countings>`
    into lv_s.
    * Create the input stream and map it into an if_ixml_document
      lo_stream   = go_sf->create_istream_string( lv_s ).
      co_src      = go_xml->create_document( ).
      lo_parser   = go_xml->create_parser( document       = co_src
                                           istream        = lo_stream
                                           stream_factory = go_sf ).
      lo_parser->parse( ).
    endform.                    "get_source
    * --- Put the transformation given as string into an if_ixml_istream
    form get_transformation changing co_trans type ref to if_ixml_istream.
      data: lv_s   type string.
      concatenate
      `<?xml version="1.0" encoding="iso-8859-1"?>`
      `<xsl:transform version="1.0"`
      ` xmlns:xsl="http://www.w3.org/1999/XSL/Transform"`
      ` xmlns:asx="http://www.sap.com/abapxml">`
      `<xsl:strip-space elements="*"></xsl:strip-space>`
      `<xsl:template match="countings">`
      ` <asx:abap>`
      `   <asx:values>`
      `     <SELOPT>`
      `       <WERKS><xsl:value-of select="@filiale"></xsl:value-of></WERKS>`
      `       <INVNU><xsl:value-of select="@invnu"></xsl:value-of></INVNU>`
      `     </SELOPT>`
      `     <COUNTINGS>`
      `       <xsl:for-each select="count">`
      `         <ZSRS_ZWSTI_LINE>`
      `           <MATNR></MATNR>`
      `           <EAN11><xsl:value-of select="@ean"></xsl:value-of></EAN11>`
      `           <MAKTX></MAKTX>`
      `           <MENGE><xsl:value-of select="@menge"></xsl:value-of></MENGE>`
      `           <MEINH></MEINH>`
      `           <UNAME></UNAME>`
      `           <EXVKW></EXVKW>`
      `           <WAERS></WAERS>`
      `           <FF></FF>`
      `           <GID><xsl:value-of select="@gid"></xsl:value-of></GID>`
      `           <RECNUM><xsl:value-of select="@recNum"></xsl:value-of></RECNUM>`
      `           <RECNUM_FROM><xsl:value-of select="@recNumFrom"></xsl:value-of></RECNUM_FROM>`
      `           <REF_RECNUM><xsl:value-of select="@refRecNum"></xsl:value-of></REF_RECNUM>`
      `         </ZSRS_ZWSTI_LINE>`
      `       </xsl:for-each>`
      `     </COUNTINGS>`
      `   </asx:values>`
      ` </asx:abap>`
      `</xsl:template>`
      `</xsl:transform>`
      into lv_s.
      co_trans = go_sf->create_istream_string( lv_s ).
    endform.                    "get_transformation

  • I'm looking at the Dell Inspiron desktop with a 4th Generation Intel Core i5 processor for Photoshop work versus the Dell XPS 8700 i7. Is it worth spending the extra $400?

    I'm looking at the Dell Inspiron desktop with a 4th Generation Intel® Core™ i5 processor for Photoshop work versus the Dell XPS 8700 i7. Is it worth spending the extra $400? My old desktop is an AMD machine about 5 years old, so there will be a huge change in speed compared to what I am used to.
    Here are the specs on both:
    Inspiron
    Processor: 4th Generation Intel® Core™ i5-4460 (6M cache, up to 3.40 GHz)
    Operating System: Windows® 8.1 (64-bit) English
    Memory: 8GB dual-channel DDR3 1600MHz (4GB x 2)
    Hard Drive: 1TB 7200 rpm SATA 6Gb/s
    Video Card: NVIDIA® GeForce® 705 1GB DDR3
    Ports (front): 2 x USB 2.0, 8-in-1 media card reader, mic and headphone jacks
    Ports (rear): 4 x USB 2.0, 2 x USB 3.0, HDMI, VGA, RJ-45 (10/100/1000 Ethernet), 3-stack audio jacks supporting 5.1 surround sound
    Media Card Reader: integrated 8-in-1 (SD, SDXC, SDHC, MS, MS PRO, MMC, MMC Plus, xD-Picture Card)
    Memory Slots: 2 DIMM slots
    Bluetooth: BT 4.0 via 1705 WLAN card
    Chipset: Intel® H81 PCH
    Power: 300 W power supply
    XPS 8700
    Processor: 4th Generation Intel® Core™ i7-4790 (8M cache, up to 4.0 GHz)
    Operating System: Windows 8.1 (64-bit) English
    Memory: 12GB dual-channel DDR3 1600MHz (4GB x 2 + 2GB x 2)
    Hard Drive: 1TB 7200 rpm SATA 6.0 Gb/s
    Video Card: NVIDIA GeForce GTX 745 4GB DDR3
    CPU Thermal: 86W
    Graphics Thermal: 225W/150W/75W
    Power: 460W; optional 80 PLUS Bronze, 85% efficient supply available on ENERGY STAR configurations
    Bays: support for 4 HDD bays, including 3 x 3.5" HDDs; capable of a 1 SSD + 3 HDD configuration
    Media Card Reader: 19-in-1 (CF Type I, CF Type II, Microdrive, miniSD, MMC, MMC mobile, MMC plus, MS, MS Pro, MS Pro Duo, MS Duo, MS Pro-HG, RS-MMC, SD, SDHC Class 2/4/6, SM, xD)
    Memory Slots: 4 DIMM

    From my personal experience, I wouldn't go for an integrated card. This is one of the most important components for Photoshop, so invest in a decent graphics card (ATI or NVIDIA). It doesn't have to be a really expensive one - I have been using an ATI Radeon with 256MB of RAM on my Dell Studio for almost two years and it still rocks (even when I work with 3D in Photoshop CS5 Extended).
    I would also invest in more RAM (this can be added easily - I bought an extra 4GB as my Studio came with 3GB).
    I wouldn't worry about the processor - I'm on an Intel Core 2 Duo and it works very well; it's very quick, which is very important as I run Photoshop training.
    I hope this helps.

  • How to convert digital signal to analog

    Hi..
    I am using an NI 9375 (DIO module) to read the output from a flow sensor.
    The output of the flow sensor is a digital signal (Boolean = True/False).
    How can I convert the digital signal to an analog value to get the flow sensor reading?
    I have already tried to convert the digital signal to a frequency so that I can convert the frequency to the flow reading,
    but it doesn't work (I am referring to http://www.ni.com/white-paper/14549/en/).

    nhan91213 wrote:
    How can I convert the digital signal to an analog value to get the flow sensor reading? I have already tried to convert the digital signal to a frequency so that I can convert the frequency to the flow reading, but it doesn't work (I am referring to http://www.ni.com/white-paper/14549/en/).
    FYI - if your flow meter's pulse frequency is higher than 500 Hz you won't get reliable readings with your algorithm, and it will be very unreliable above 1 kHz, because the fastest that loop runs is 1 ms (1 kHz). In that case you could tie the timed loop to a higher-rate (hardware) clock source to go faster than a 1 ms loop time. If no hardware clock is available, I think you can use the high-resolution timer (in LV 2014; not sure whether it was also available in previous versions) and a regular while loop, with the algorithm modified for faster timing.
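    In outline, the software frequency measurement discussed above boils down to counting rising edges over a known window and scaling by the sensor's K-factor. In the sketch below, read_line() is a hypothetical placeholder for reading the NI 9375 input, and the K-factor value is an assumption you would take from the flow sensor's datasheet. As noted in the reply, a loop like this is only trustworthy for pulse rates well below about 500 Hz.

    import time

    K_FACTOR = 450.0     # pulses per litre (assumption from the sensor datasheet)
    WINDOW_S = 1.0       # measure over a 1 second window

    def measure_flow_lpm():
        edges = 0
        last = read_line()                 # hypothetical read of the NI 9375 line
        t_end = time.time() + WINDOW_S
        while time.time() < t_end:
            state = read_line()
            if state and not last:         # count rising edges only
                edges += 1
            last = state
        freq_hz = edges / WINDOW_S
        return freq_hz * 60.0 / K_FACTOR   # litres per minute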

  • Can we design an analog-to-digital conversion (ADC) block diagram in LabVIEW?

    Hi everyone, I have a doubt: can we convert a function generator's output (i.e. analog signals) to a digital signal in LabVIEW? Can anyone help me create an ADC circuit, that is, an ADC block diagram, in LabVIEW? Please give me solutions to my problem.

    Hi guru,
    from signal generation VIs you get arrays containing float values (aka DBL). Those floats are just digital values made of 0s and 1s…
    Any value in your computer is digital!
    To provide a solution you need to rephrase your question! What exactly do you expect as result of your operation?
    Best regards,
    GerdW
    CLAD, using 2009SP1 + LV2011SP1 + LV2014SP1 on WinXP+Win7+cRIO
    Kudos are welcome
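    If the goal is to simulate an ADC stage on signal data that, as GerdW points out, is already digital inside the computer, the operation is just sampling plus quantisation. Here is a small Python/numpy sketch of that idea; the sample rate, bit depth and voltage range are assumptions.

    import numpy as np

    fs, f_sig, n_bits, v_ref = 1000.0, 10.0, 8, 5.0        # assumptions
    t = np.arange(0, 0.1, 1.0 / fs)                        # 100 ms of samples
    analog = 2.5 + 2.0 * np.sin(2 * np.pi * f_sig * t)     # "analog" input within 0..5 V

    levels = 2 ** n_bits
    codes = np.clip(np.round(analog / v_ref * (levels - 1)), 0, levels - 1).astype(int)
    quantised = codes * v_ref / (levels - 1)               # back to volts, now with 8-bit resolution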

    I have done a search and reviewed previously reported crashes with LV8.5, but did not find a match to what I'm seeing. SYNOPSIS: When selecting Find > Callers, under the project Dependencies, LV crashes as shown in the image below. I created a new pr