How to change sampling frequency

Hi,
what is the simplest way to change the sampling frequency for multiple AI channels (some differential and some RSE, and with different min/max values) during continuous data acquisition?
Thanx
Vedran

The only way I have found to do this is to use a counter as your clock source for your analog input lines. You can vary the frequency of this while the program is running. Take a look at the attached program. I have an E-series card, so that is what this is based on.
Hopefully it will be useful.
Randall Pursley
Attachments:
Counter Clock Acq.vi ‏180 KB
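For reference, here is a rough NI-DAQmx C sketch of the same counter-as-sample-clock idea; the device name "Dev1", the channel list, ranges, and rates are placeholder assumptions, not taken from the attached VI:

    #include <NIDAQmx.h>
    #include <stdio.h>

    int main(void)
    {
        TaskHandle ai = 0, co = 0;
        float64    rate = 10000.0;          /* initial sample rate in Hz (assumption) */
        float64    data[2 * 1000];          /* 2 channels x 1000 samples per read */
        int32      read = 0;

        /* Counter 0 generates a continuous pulse train used as the AI sample clock. */
        DAQmxCreateTask("", &co);
        DAQmxCreateCOPulseChanFreq(co, "Dev1/ctr0", "", DAQmx_Val_Hz,
                                   DAQmx_Val_Low, 0.0, rate, 0.5);
        DAQmxCfgImplicitTiming(co, DAQmx_Val_ContSamps, 1000);

        /* AI task: channels with mixed terminal configurations and ranges. */
        DAQmxCreateTask("", &ai);
        DAQmxCreateAIVoltageChan(ai, "Dev1/ai0", "", DAQmx_Val_Diff,
                                 -10.0, 10.0, DAQmx_Val_Volts, NULL);
        DAQmxCreateAIVoltageChan(ai, "Dev1/ai1", "", DAQmx_Val_RSE,
                                 -5.0, 5.0, DAQmx_Val_Volts, NULL);

        /* Clock the AI conversions from the counter's internal output. */
        DAQmxCfgSampClkTiming(ai, "/Dev1/Ctr0InternalOutput", rate,
                              DAQmx_Val_Rising, DAQmx_Val_ContSamps, 10000);

        DAQmxStartTask(ai);
        DAQmxStartTask(co);

        for (int i = 0; i < 20; i++) {
            DAQmxReadAnalogF64(ai, 1000, 10.0, DAQmx_Val_GroupByChannel,
                               data, 2 * 1000, &read, NULL);
            printf("read %ld samples per channel\n", (long)read);

            if (i == 10) {
                /* Change the sampling frequency on the fly by retiming the
                   counter; the AI task itself keeps running. */
                rate = 25000.0;
                DAQmxWriteCtrFreqScalar(co, 0, 10.0, rate, 0.5, NULL);
            }
        }

        DAQmxStopTask(co);  DAQmxClearTask(co);
        DAQmxStopTask(ai);  DAQmxClearTask(ai);
        return 0;
    }

Because the AI task simply follows whatever clock edges arrive, retiming the counter changes the acquisition rate without stopping or reconfiguring the AI task.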

Similar Messages

  • How to change the frequency of pulse train on the fly using an array of values?

    Hi all!
    First I want to thank you for the great job you are doing on this forum.
    I am still busy trying to control a stepper motor by sending pulses from my E-series 6024 to a Compumotor S6 stepper driver. I've managed to get it working. I desperately need to control the motor using the values from an array. I believe we can use two approaches for that:
    1st - I can get an array of the "numbers of pulses". Each element must run for 10 milliseconds. Using that we can calculate the array of frequencies needed to send the number of pulses within 10 milliseconds for each specific element. Could we use the arrays of "number of pulses" and frequencies in a "finite pulse train" and update with each element every 10 milliseconds?
    2nd - Or could we use the frequency array in a "continuous pulse train" VI and update it every 10 milliseconds?
    Please note that I must use the values as they are.
    Can someone please build a good example for me? Your help will be appreciated.
    Regards
    Chris
    Attachments:
    number_of_steps.txt ‏17 KB
    frequency.txt ‏15 KB

    Tiano,
    I will try to better explain the paragraph on LabVIEW. The original paragraph reads ...
    "While in a loop for continuous pulse train generation, make two calls to Counter Set Attribute.vi to set the values for "pulse spec 1" (constant 14) and "pulse spec 2" (constant 15). Following these calls you would make a call to Counter Control.vi with the control code set to "switch cycle" (constant 7). The attached LabVIEW programs demonstrate this flow."
    You can make two calls to Counter Set Attribute or you can make a call to Set Pulse Specs which, if you open this VI, you will see is just making two calls to Counter Set Attribute. What you are doing with the Counter Set Attribute VIs is setting two registers called "pulse spec 1" and "pulse spec 2". These two registers are used to configure the frequency and duty cycle of your output.
    The example program which is attached to this Knowledge Base demonstrates how to change the frequency of a continuous generation on the fly. Why continuous? Because changing the frequency of a finite train would be easy: when the train completes its finite generation you would just change the frequency and run a finite train again. You would not care about the time delay due to reconfiguration of the counter.
    If you would like to change the frequency of the pulse train using a knob, this functionality will have to be added in the while loop. The while loop will be continuously checking for the new value of the knob and using the knob value to set the pulse specs.
    LabVIEW is a language, and as with learning all new languages (spoken or programmatic) there is a lot of learning to be accomplished. The great thing is that LabVIEW is much easier than most languages and the learning curve should be much smaller. Don't fret, you'll be an expert before you know it. Especially since you're tackling a challenging first project.
    Regards,
    Justin Britten

  • How to change the frequency of a finite pulse train?

    Hi all! I know I've asked a similar question once, but I need a better answer. I would like some ideas or examples on how to modify the LV example "Finite Pulse Train Generation - Intermediate for AM9513-Based Devices" so that the frequency can be changed while the VI is executing.
    Will I have any problem if I use this VI with DAQ-STC-based devices (E-series)?

  • How to change the frequency of a finite/continuous pulse train in the program?

    Hello!...
    I am trying to modify the Finite Pulse Train VI example to change the frequency in a 1:2 ratio with the data of a spreadsheet file as the values are read one by one.
    My aim is to get the pulse train to clock a stepper motor controller, in such a way that the speed of the motor increases when the data in the spreadsheet increases and vice versa.
    I think that if I want the clock to change its speed, I must change the frequency, and if I want it to increase/decrease like the data in the spreadsheet, I must tell the frequency whether the data is increasing or decreasing in value.
    Obviously the program must do this while it is running.
    Can anyone see where I have gone wrong and/or give any suggestions?
    I have attached my vi and the spreadsheet file.
    Thanks
    Tiano
    Attachments:
    My_VI.vi ‏99 KB
    wave.txt ‏18 KB

    Tiano,
    Try to build the pieces of the program separately. Get a VI running that reads the spreadsheet and reports the data in it sequentially in an indicator (forget about the pulse generation at this time). This indicator will be wired to the frequency input of the pulse generator VI.
    You have read the data and have a transposed array. Now you must learn to use the Index Array function in a For Loop to extract the data sequentially. You will wire the increment output [i] of the For Loop into Index Array to shift from one data point to the next.
    You have a For Loop with no constant wired to [N] to tell it how many times to run. If you use Array Size to extract the number of entries in the spreadsheet data, this can be the constant that tells the For Loop how many times to run.
    You are using the Tab Control in a way I have never seen before (I don't think it will work this way either). A Tab Control is for presenting and hiding controls, indicators, labels, etc. on the front panel before running the VI. It is not used to control the program's functionality itself. Typically you would use a Boolean to choose between two options for a Case Structure.
    You can also change the path constant that selects the spreadsheet file into a control. This would allow you to browse for any file you want and eliminate the need for the Case Structure altogether.
    Mike
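    As a rough textual analogue of the structure Mike describes (read the whole spreadsheet first, then index through the values one at a time, with the loop count taken from the array size), here is a minimal C sketch; the file name wave.txt and single-column layout are assumptions:

        #include <stdio.h>
        #include <stdlib.h>

        /* Read one column of numbers from a text file into an array (the
         * equivalent of Read From Spreadsheet File), then index through the
         * values one by one (the equivalent of Index Array inside a For Loop
         * whose count comes from Array Size). */
        int main(void)
        {
            const char *path = "wave.txt";      /* assumed file name */
            double *values = NULL;
            size_t count = 0, capacity = 0;
            double v;

            FILE *fp = fopen(path, "r");
            if (fp == NULL) {
                perror("fopen");
                return 1;
            }
            while (fscanf(fp, "%lf", &v) == 1) {
                if (count == capacity) {
                    capacity = capacity ? capacity * 2 : 64;
                    double *tmp = realloc(values, capacity * sizeof *values);
                    if (tmp == NULL) { free(values); fclose(fp); return 1; }
                    values = tmp;
                }
                values[count++] = v;
            }
            fclose(fp);

            /* "Array Size" -> loop count; "Index Array" -> values[i] */
            for (size_t i = 0; i < count; i++) {
                double frequency = values[i];   /* would feed the pulse generator here */
                printf("step %zu: frequency = %g Hz\n", i, frequency);
            }

            free(values);
            return 0;
        }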

  • In NI-DAQmx, how to change the frequency of a pulse train?

    I was using "GPCTR_Change_Parameter()" to change the frequency of my pulse train in Traditional NI-DAQ and LabWindows/CVI; is there any function that can do the same thing in NI-DAQmx without restarting the task?

    You can do this in one of two ways, depending on the version of NI-DAQ you are using. If you are using 7.2 or later, you can use the counter write functions (ex: DAQmxWriteCtrFreq()), which are available for each flavor of pulse generation task (Freq, Time, Ticks). If you are using NI-DAQ 7.0 or 7.1, you can modify the attributes directly (DAQmx_CO_Pulse_HighTime, DAQmx_CO_Pulse_LowTime, etc.). Keep in mind that for each attribute pair, one of the attributes causes the counter to load the new pair; the attributes that trigger the load are LowTime, LowTicks, and Frequency.
    I hope this helps!
    gus....
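    For NI-DAQ 7.2 or later, a minimal NI-DAQmx C sketch of the write-function route gus describes might look like this; the counter name "Dev1/ctr0" and the frequencies are placeholder assumptions, and error handling is omitted:

        #include <NIDAQmx.h>
        #include <stdio.h>

        int main(void)
        {
            TaskHandle co = 0;
            float64    freq = 1000.0;      /* starting frequency in Hz (assumption) */

            /* Continuous pulse train on a counter; "Dev1/ctr0" is a placeholder. */
            DAQmxCreateTask("", &co);
            DAQmxCreateCOPulseChanFreq(co, "Dev1/ctr0", "", DAQmx_Val_Hz,
                                       DAQmx_Val_Low, 0.0, freq, 0.5);
            DAQmxCfgImplicitTiming(co, DAQmx_Val_ContSamps, 1000);
            DAQmxStartTask(co);

            /* In place of GPCTR_Change_Parameter(): update the running task directly.
               On NI-DAQ 7.0/7.1 you would set the DAQmx_CO_Pulse_HighTime /
               DAQmx_CO_Pulse_LowTime attributes instead, as described above. */
            for (int i = 1; i <= 5; i++) {
                freq = 1000.0 * i;
                DAQmxWriteCtrFreqScalar(co, 0, 10.0, freq, 0.5, NULL);
                printf("now generating %.0f Hz\n", freq);
                /* ...the pulse train keeps running at the new rate... */
            }

            DAQmxStopTask(co);
            DAQmxClearTask(co);
            return 0;
        }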

  • How to lock sampling frequency for 5673 and 5663

    Hello,
    In general I'm trying to lock the transmitter and receiver together. It seems easy to lock the carrier frequency; however, no matter what I do, I seem to have a drift in my sampling frequency (on the order of 1 ppm).
    Is the sampling clock in the 5663 digitizer tied to the same clock reference as the LO?  ... I didn't think I would need the TClk mechanism here since I don't care about delay ...
    Any thoughts are greatly appreciated. Below are a few specifics.
    I'm using the 5673 as transmitter and 5663 as receiver. I've noticed the following:
    1. When each is using 'Reference Clock Source' = OnboardClock, I have a noticeable carrier offset at the receiver (e.g. a 5.8 GHz carrier has a ~7 kHz offset) and this is fine.
    2. When each is using 'Reference Clock Source' = PXI Clock, with the chassis physically tied, I see no noticeable carrier offset at the receiver.
    3. When the 5673 is using 'Reference Clock Source' = OnboardClock and the 5663 is using 'Reference Clock Source' = ClkIn (ClkIn/Out physically tied), I see no noticeable carrier offset at the receiver.

    Hi Clayton, 
    The digitizer sample clock time base source is different from the Reference Clock source. I've copied  a description below of the difference between the two from the digitizers help. 
    Clocking:Reference (Input) Clock Source:
    Specifies the input source for the PLL reference clock.
    Clocking: Sample Clock Timebase Source:
    Specifies the source of the sample clock timebase, which is the timebase used to control waveform sampling.
    Yes, the default configuration of the NI 5663 is for the NI 5652 to export its internal 10 MHz reference to the NI 5622 so that the NI 5622 and the NI 5652 devices are frequency-locked. The NI 5663 can also be configured to lock to an external (10MHz only) reference source. The NI 5663 can also be configured to lock to the PXI 10 MHz backplane clock. Locking to the PXI 10 MHz reference does not require a cable, but this configuration does not provide the same frequency and phase noise performance as the NI 5652 internal Reference clock. All of this information is provided in detail in the NI RF VSA Help. See the directory below:
    Regards,
    Travis Ann
    Customer Education Product Marketing Manager
    National Instruments

  • How to change replication frequency

    Server 2012 R2: how can I change the Hyper-V replication frequency after replication has already been enabled? When I set it up I set it to 30 seconds, but I am finding that is too frequent and would like to increase it to 5 or 15 minutes.
    davidh

    Hi, it's easy: open the settings of the virtual server that is being replicated.
    Go to the Management section > Replication.
    On this page there is a tab to select the replication time.
    Click Apply.
    regards
    Mark

  • For PCI-4451, How to change the frequency of an output signal without any undefined state

    I would like to generate a signal using the output channels of the PCI-4451 and at the same time capture data using the input channels. Is there a method by which I can generate a continuous signal without any undefined state while the signal frequency is being changed?
    Attachments:
    Input_Signal.bmp ‏729 KB

    Wee,
    DSA boards such as the PCI-4451 have a number of very desirable properties including high precision and a high sampling rate. The trade-off for the combination of these two properties is that the DSA boards cannot adjust their sampling rate on-the-fly. Instead they have to be stopped and reconfigured. During this reconfiguration time, the value of the board's output becomes flat and level.
    If you are looking to adjust the board's output without seeing these flat spots, you have to take a different approach to programming a DSA board. Instead of reconfiguring the board, what you need to do is allow the board to continue to run (at the same output frequency), and then overwrite the output buffer that the board reads to output values. This will allow you to output new data without the flat spots. Below you will find an example that demonstrates this behavior.
    If you determine that this is not an acceptable workaround due to the limitations on output rates, you may want to look at using a Multifunction DAQ board (60xx or 62xx) or an arbitrary waveform generator (54xx) device instead.
    Best of luck with your application.
    Regards,
    Jed R.
    Applications Engineer
    National Instruments
    Attachments:
    Cont_Gen_Voltage_Wfm-Int_Clk-With_Updates_for_DSA.llb ‏161 KB
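    As a rough NI-DAQmx C illustration of the "keep the task running and overwrite the output buffer" approach Jed describes (the PCI-4451 itself is programmed through Traditional NI-DAQ, so the device name, channel, and rates here are placeholder assumptions):

        #include <NIDAQmx.h>
        #include <math.h>

        #define BLOCK 1000                      /* samples written per loop iteration */

        int main(void)
        {
            TaskHandle ao = 0;
            float64 data[BLOCK];
            float64 rate = 51200.0;             /* fixed sample rate (assumption) */
            float64 freq = 100.0;               /* tone frequency, changed on the fly */
            double  phase = 0.0;
            int32   written = 0;

            DAQmxCreateTask("", &ao);
            DAQmxCreateAOVoltageChan(ao, "Dev1/ao0", "", -10.0, 10.0,
                                     DAQmx_Val_Volts, NULL);
            DAQmxCfgSampClkTiming(ao, "", rate, DAQmx_Val_Rising,
                                  DAQmx_Val_ContSamps, BLOCK * 4);

            /* Do not let the board recycle old data; we keep feeding fresh samples,
               so a frequency change never stops the generation. */
            DAQmxSetWriteRegenMode(ao, DAQmx_Val_DoNotAllowRegen);

            for (int block = 0; block < 100; block++) {
                if (block == 50)
                    freq = 250.0;               /* "change" the output frequency here */
                for (int i = 0; i < BLOCK; i++) {
                    data[i] = sin(phase);       /* continuous phase: no glitch */
                    phase += 2.0 * 3.14159265358979 * freq / rate;
                }
                DAQmxWriteAnalogF64(ao, BLOCK, 0, 10.0, DAQmx_Val_GroupByChannel,
                                    data, &written, NULL);
                if (block == 1)
                    DAQmxStartTask(ao);         /* start once the buffer is primed */
            }

            DAQmxStopTask(ao);
            DAQmxClearTask(ao);
            return 0;
        }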

  • How to change Bandmode / Frequency

    If I go into the "Mobile Network Engineering Screens" menu and then into "Cell Information..." the bandmode is set to "GSM850", which is causing me to get SOS. How do I change the bandmode to GSM900? I've tried clicking on GSM800 and selecting "All", but when I go back to the menu and then back into Cell Information it changes back to GSM850.

    I do believe that my phone supports this because I've checked the specifications on the BlackBerry website and it says it does support GSM900, so I think it's just a case of changing the settings. I don't see why a shop would sell a phone that can't support the UK band frequency.

  • Change sampling frequency in software?

    Hi,
    I have a Phidgets Interface Kit, with a temperature sensor, that is sampling at a high rate. I would like to re-sample the signal, or somehow reprocess the signal, so that the output refreshes once per second. Is there a way I can do this in LabVIEW? I tried the Repack VI... but I don't know if this is the best way to handle this task.
    Any ideas are appreciated,
    Fred

    If you are using the "On Sensor Change" event, then you should change to some sort of loop timer where you use the "getSensorValue" property.
    Some of the examples just use a while loop, so they poll the Phidget as fast as possible. You will need to change the loop to use a timer.
    Hope this helps.
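    In C terms (Phidget21 library), the "poll on a timer instead of reacting to every sensor-change event" idea might look roughly like this; the sensor index, attachment timeout, and 1-second interval are assumptions:

        #include <phidget21.h>
        #include <stdio.h>
        #include <unistd.h>   /* sleep(); on Windows use Sleep(1000) from <windows.h> */

        int main(void)
        {
            CPhidgetInterfaceKitHandle kit = 0;
            int value = 0;

            CPhidgetInterfaceKit_create(&kit);
            CPhidget_open((CPhidgetHandle)kit, -1);              /* any serial number */
            if (CPhidget_waitForAttachment((CPhidgetHandle)kit, 10000)) {
                printf("No InterfaceKit attached\n");
                return 1;
            }

            /* Poll at 1 Hz instead of handling every OnSensorChange event. */
            for (int i = 0; i < 60; i++) {
                CPhidgetInterfaceKit_getSensorValue(kit, 0, &value);  /* index 0 assumed */
                printf("sensor value: %d\n", value);
                sleep(1);                                        /* refresh once per second */
            }

            CPhidget_close((CPhidgetHandle)kit);
            CPhidget_delete((CPhidgetHandle)kit);
            return 0;
        }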

  • How to change the frequency of Time Machine backups

    Time Machine backs up every hour, and it takes several minutes. I don't need to back up that often; once a day is fine. But I don't see any way to set the frequency of backups.

    There are a number of applications that can be installed to set different time schedules for TM. I have not provided links as I know nothing about them.
    If you do a web search on the title of your post you should find them.

  • Flex sampling frequency changes when I use it with apple loops

    Flex sampling frequency changes when I use it with apple loops in a 24 bit 88.2 project

    Sorry!
    Flex changes sampling frequency when I use it with apple loops in a 24 bit 88.2 project.

  • How to use on board counter to change sample rate dynamically on pci-6134

    Hi,
    I am relatively new to LabVIEW.
    I am building a power quality measurement system and I need to vary the sampling rate of my PCI-6134 dynamically (all channels simultaneously). What I need is a constant number of samples in each period of the measured signal (grid voltage), which changes slightly all the time. Therefore I will have to measure the voltage, find its exact frequency and then adjust the sampling rate of the DAQ accordingly. I know that there will always be some delay, but I would rather not go into any predictive algorithms...
    I have found information in the forum that one possible solution is to use an onboard counter to change the sampling frequency, but I have no idea how to do that. Can someone help me or possibly show an example?
    Is there a simple way to solve that problem?
    Thanks in advance
    Andrzej

    At least at a glance, the code generally looks like it ought to work.  Two thoughts:
    1. Instead of getting into PFI3 vs PFI8 routing stuff, can't you just specify "Dev1/Ctr0InternalOutput" as the AI Sample Clock source?  (You may need to right-click the terminal to get at the menu that exposes the so-called "advanced terminals").
    2. Try writing both the freq AND duty cycle  properties when you want to update the freq.  Or try using the DAQmx Write vi instead of a property node.  My past experience suggests that writing only the freq property *should* still work, but writing both isn't hard to try and may turn out to help if the behavior of your version of DAQmx differs somehow.
    -Kevin P.
    P.S. Bonus 3rd thought.  I just went back to reread the thread more carefully, including your first screenshot.  I'm now thinking that maybe the hardware actually WAS behaving properly, and that you just weren't aware of it.  When you query the AI task for its sampling rate, all the task can know is whatever rate you told it when you configured it outside the loop.  So even as you change the counter freq to change the actual hardware sampling rate on the fly, the AI task will continue to report its original freq.  After all, how is *it* supposed to know?
    Try an experiment: set your original freq very low so that the AI task produces a timeout error without getting all the requested samples within the 10 sec timeout window.  Run and verify the timeout.  Then run again, but after 3-5 seconds set a new frequency that will produce all those samples in another 1 sec or less.  Verify that you get the samples rather than timing out.  That should demonstrate that the counter freq change really *does* produce a change to the hardware sample rate, even though the task property node remains unaware.
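    Kevin's two suggestions, condensed into a rough NI-DAQmx C sketch (device, channel, and rate values are placeholders; the LabVIEW property-node route would map to the DAQmxSetCOPulseFreq / DAQmxSetCOPulseDutyCyc accessors, while the DAQmx Write route is what is shown):

        #include <NIDAQmx.h>

        int main(void)
        {
            TaskHandle co = 0, ai = 0;
            float64 buf[1000];
            int32   n = 0;

            /* Counter 0: continuous pulse train that clocks the AI task. */
            DAQmxCreateTask("", &co);
            DAQmxCreateCOPulseChanFreq(co, "Dev1/ctr0", "", DAQmx_Val_Hz,
                                       DAQmx_Val_Low, 0.0, 5000.0, 0.5);
            DAQmxCfgImplicitTiming(co, DAQmx_Val_ContSamps, 1000);

            /* Thought 1: name the counter's internal output directly as the
               AI sample clock source instead of routing through PFI3/PFI8. */
            DAQmxCreateTask("", &ai);
            DAQmxCreateAIVoltageChan(ai, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                                     -10.0, 10.0, DAQmx_Val_Volts, NULL);
            DAQmxCfgSampClkTiming(ai, "/Dev1/Ctr0InternalOutput", 5000.0,
                                  DAQmx_Val_Rising, DAQmx_Val_ContSamps, 5000);

            DAQmxStartTask(ai);
            DAQmxStartTask(co);

            /* Thought 2: when retiming on the fly, write frequency AND duty cycle. */
            DAQmxWriteCtrFreqScalar(co, 0, 10.0, 12500.0, 0.5, NULL);

            /* The hardware now samples at the new rate even though the AI task's
               configured rate property still reports 5000 Hz (Kevin's P.S.). */
            DAQmxReadAnalogF64(ai, 1000, 10.0, DAQmx_Val_GroupByChannel,
                               buf, 1000, &n, NULL);

            DAQmxStopTask(co);  DAQmxClearTask(co);
            DAQmxStopTask(ai);  DAQmxClearTask(ai);
            return 0;
        }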

  • Change 5622 sampling frequency

    Hi,
    I am working on a PXI project to generate a single-tone frequency signal (which will be modulated by vital signs) and then demodulate the signal and view the time-domain demodulated signal and its spectrum. I am able to generate the signal and view the demodulated wave, but the problem is I am using the PXI 5663 for the receiver (and the 5673 to transmit the single-tone signal) and the 5622 digitizer has a very high sampling rate (150 MS/s), which is too high for vital signs (whose frequency is not more than 10-12 Hz).
    I want to change the sampling rate of 5622 and I read a couple of sites (http://zone.ni.com/reference/en-XX/help/370592P-01/digitizers/5622_clocking/) and documents on 
    I am using PXI Clock as the Reference Clock Source to synchronize the 5673 and 5663, so how can I change the clock frequency of the 5622?
    I tried using the Decimate and Resample VIs to change the sampling frequency of the received waveform, but I did not get any output. (Please find attached the VI and the front panel description of the demodulate VI; the transmission VI is the default RFSG Single Tone Generation.)
    I am stuck at this point and don't know how to proceed.
    Could someone please help me out?
    Thanks so much,
    Sharmi
    Attachments:
    MT niRFSA Demodulate AM.vi ‏1590 KB
    frontpanel.png ‏196 KB

    Hi Sharmi,
    Since you already have a service request open with NI Tech Support regarding this issue, I just wanted to let you know that we will be focusing our efforts on helping you out under your support ticket.
    Regards,
    Jason L.
    Product Support Engineer
    National Instruments
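    The follow-up happened under the support ticket, but as a generic illustration of the decimation step Sharmi describes (reducing a high-rate demodulated record to a rate suitable for a ~10 Hz vital-sign trace), a block-averaging decimator in C might look like this; the rates and decimation factor are arbitrary assumptions, and a proper anti-aliasing filter would normally precede it:

        #include <stddef.h>
        #include <stdio.h>

        /* Reduce the sample rate of `in` by `factor` using simple block averaging.
         * Returns the number of output samples written to `out`.
         * NOTE: for real use, apply an anti-aliasing low-pass filter before
         * (or instead of) plain averaging; this is only a sketch. */
        size_t decimate_by_averaging(const double *in, size_t n_in,
                                     double *out, size_t factor)
        {
            size_t n_out = 0;
            for (size_t i = 0; i + factor <= n_in; i += factor) {
                double sum = 0.0;
                for (size_t k = 0; k < factor; k++)
                    sum += in[i + k];
                out[n_out++] = sum / (double)factor;
            }
            return n_out;
        }

        /* Example: a record at an assumed effective rate of 1 MS/s decimated by
         * 100000 gives 10 S/s, plenty for a signal of at most 10-12 Hz. */
        int main(void)
        {
            double in[10] = {1, 1, 1, 1, 2, 2, 2, 2, 3, 3};
            double out[5];
            size_t n = decimate_by_averaging(in, 10, out, 2);
            for (size_t i = 0; i < n; i++)
                printf("%g\n", out[i]);   /* prints 1 1 2 2 3 */
            return 0;
        }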
