How to set an accurate sampling rate in FPGA programming?

Hi all,
       I'm using a cRIO-9004 and an NI 9233 to acquire high-speed analog inputs. The 9233 specifies a set of data rates that can be programmed. I want to acquire data without missing a single sample. Please comment on this.

A couple of things to add to what Mike said:
For C Series modules like the NI 9233 when used on the CompactRIO platform, all configuration and programming is done from within LabVIEW.
To configure the data rate at which the 9233 will acquire data, you either need to open the module's properties page (in LabVIEW) and set it there, or set it programmatically using a LabVIEW FPGA property node on the module. I suggest you look at the examples that ship with CompactRIO for the 9233.
JMota

Similar Messages

  • [Q] How to set the sampling rate separately?

    Hello,
    I'm using LV 5.1 on Windows 98 with an SCXI-1200.
    I want to set a different sampling rate for each input channel.
    I have only ever used "AI Acquire Waveforms.vi".
    Can anybody help me?
    Example codes are highly appreciated.
    Regards,
    Hyun-ho Lee
    [email protected]

    The inputs of the SCXI-1200 are multiplexed to a single Analog-to-Digital converter. Because of this, the only difference in sampling rates that are achievable would be integer divisions of a common high frequency. This is functionally identical to acquiring all channels at the highest rate, and decimating (throwing away) data from channels that need lower data rates.
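    A minimal sketch of that idea in plain Python/NumPy (the channel names and rates are made up for illustration; this is not the SCXI-1200 driver API): acquire every channel at the common high rate, then keep every Nth sample for a channel that only needs rate/N.

        import numpy as np

        common_rate = 1000.0                     # Hz, the single multiplexed scan rate (assumed)
        wanted_rates = {"ch0": 1000.0,           # keep every sample
                        "ch1": 250.0,            # keep every 4th sample
                        "ch2": 100.0}            # keep every 10th sample

        # Stand-in for data already acquired with all channels at common_rate.
        raw = {name: np.random.randn(10000) for name in wanted_rates}

        decimated = {}
        for name, rate in wanted_rates.items():
            n = int(round(common_rate / rate))   # integer division of the common rate
            decimated[name] = raw[name][::n]     # throw away the samples in between

        for name, data in decimated.items():
            print(name, len(data), "samples kept")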

  • How is Core Audio sample rate set?

    When I play a particular movie in QuickTime, the audio and video are out of sync. The movie plays fine on other computers.
    This Mac Pro (OS 10.6.8) is used exclusively for Pro Tools and Final Cut Pro. The audio hardware is a Pro Tools HD Native card:
    Native card --> Pro Tools Digital I/O boxes --> Lavry D/A converters --> monitors
    The sample rate of the Digital I/O boxes is set by an external master clock, a Lavry Gold A/D converter.
    I opened the DigiDesign Core Audio Manager and it says:
    Connected @ 44.1K, 32 In/32 Out, Buffer Size 512
    Yet the movie is at 48K (I know because I created it in Final Cut Pro), and the external clock is set at 48K. So, I don't know where the 44.1K in the Digidesign Core Audio Manager came from. This is why I am guessing the problem is sample rate.
    Please help me understand what sets the sample rate.
    Does the application using Core Audio set it?
    If so, how is this made consistent with the sample rate set for the hardware by the external master clock?
    Do I have to be sure I always change the external clock setting to be consistent with the movie being played?
    BTW, I have not had a sync problem when I play video in Final Cut Pro; it is always in sync. So, I have never worried about how Core Audio works. The sync problem is only with QuickTime.

    You might ask on the GarageBand, Logic or Final Cut forums. There is not much audio traffic here.

  • How to coerce the sampling rate??

    I think I found my problem with sampling rate.
    I'm using a PCI-5122 scope card, and in many of my acquisitions I'm setting the sample rate to 40 MS/s. Apparently this is not a valid number and the scope reverts to 50 MS/s.
    Later, when I try to calculate cycles per second based on cycles per sample, I need the actual sample rate, and 40 MS/s ain't it.
    I'm trying to figure out how the sampling rate gets coerced.
    Please help.

    The digitizer coerces the sample rate because of how the sample clock is derived from the Reference Clock. The following information is on page 13 of the specifications:
    http://digital.ni.com/manuals.nsf/websearch/C6B059C1BDD70101862574C8005567F1
    The sample clock is created by dividing down the Reference Clock (the internal reference clock is 100 MS/s) by a decimation factor N, where N is an integer between 2 and 65530.
    Thus 50 MS/s uses a decimation factor of 2, and 33.3 MS/s is the next valid sample rate, with a decimation factor of 3. So when you specify a sample rate that is not possible, the driver automatically coerces the requested sample rate up to the next valid rate. You can obtain the actual sample rate used in an acquisition from the NI-SCOPE property "Actual Sample Rate" or the LabVIEW VI "niScope Sample Rate.vi". Using this property, you can get the values you need for your calculations.
    I hope this helps!
    Nathan
    Product Support Engineer
    National Instruments

  • How to set the sampling interval using DAQ device?

    Now I have a capacity sensor and a 24 bit DAQ device (http://www.mccdaq.com/usb-data-acquisition/USB-2404-10.aspx).
    The DAQ device has a maximum sampling rate of 50 kS/s. My question is how to set the interval of data collection.
    For example, I set the sampling rate to 2000 S/s in continuous sampling mode, use DAQmx Read.vi in a while loop, and set 'number of samples per channel' to 100. I want to plot these data as a function of time on an XY graph and also save them, so I add a 'Mean' function to average the 100 samples in each loop iteration (that means there are 20 data points output per second).
    But when I set 'number of samples per channel' much smaller (to get more data points per second), there are some problems. It seems the program cannot read and average the data at a higher loop frequency. I don't know where the problem is. All in all, how do I collect data more frequently? Maybe I didn't express my question clearly; I'll upload a simple program later if necessary. Thanks.
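    A minimal sketch of the read-a-chunk-then-average pattern described above, written against the NI-DAQmx Python API rather than LabVIEW (the channel name "Dev1/ai0" and the rates are assumptions for illustration only):

        import time
        import nidaqmx
        from nidaqmx.constants import AcquisitionType

        RATE = 2000      # hardware sample rate in S/s (as in the post)
        CHUNK = 100      # samples read and averaged per loop -> 20 averaged points/s

        with nidaqmx.Task() as task:
            task.ai_channels.add_ai_voltage_chan("Dev1/ai0")   # hypothetical channel
            task.timing.cfg_samp_clk_timing(rate=RATE, sample_mode=AcquisitionType.CONTINUOUS)
            task.start()

            t0 = time.time()
            for _ in range(100):                     # 100 averaged points
                samples = task.read(number_of_samples_per_channel=CHUNK)
                avg = sum(samples) / len(samples)    # one output point per CHUNK samples
                print(f"{time.time() - t0:6.2f} s  {avg:+.4f} V")

    Making CHUNK smaller gives more averaged points per second, but also more driver calls per second, which is where a loop like this starts to struggle.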

    Another question: in continuous mode, NI-DAQmx uses 'samples per channel' to determine the buffer size. But according to the page you pointed to, if the acquisition is continuous (sample mode on DAQmx Timing.vi set to Continuous Samples), NI-DAQmx will allocate a buffer according to a table, and for sample rates between 100 and 10,000 S/s the buffer size is 10 kS. So if I set the sample rate to 5000 S/s and set samples per channel to 20000, what exactly is the buffer size? 20 kS or 10 kS?
    Thanks.

  • Setting default sample rate to 48KHz???

    Can someone tell me how to set my Logic Pro to default to 48 kHz instead of 44.1 kHz? Also, can you tell me how to convert the sample rate of audio files within Logic (the command appears to have moved since Logic 7)? Thanks. Dave

    To convert the sample rate of audio files, you have to use the Copy/Convert command in the Audio Bin's local menu, under *Audio File*.

  • Setting the sampling rate in SignalExpress

    I am using a cDAQ-9172 with a strain gauge module and a thermocouple module in SignalExpress. I want to acquire data at a pretty low rate (2 Hz), but I am unable to use 1 Sample (On Demand) in the acquisition setup. I get the following error:
    Error -201087 occurred at DAQ Assistant
    Possible Reason(s):
    Measurements: Task contains physical channels on one or more devices that require you to specify the Sample Clock Rate.
    Specify a Sample Clock Rate.
    Device: cDAQ1
    I am unable to setup a sample clock rate as it is 'blacked' out by the software in the 1 sample on demand mode.
    When I try to use continuous or 'N' sample mode, the cDAQ is sampling at a rate of around 1600 Hz, even though I am putting in a value of 1 Hz on the setup screen. The large amount of data will prevent me from downloading the data to Excel for further reporting. I obviously don't need to sample my TC module at that rate either.
    Is there anything I can 'easily' do to decrease the sampling rate to a low level? Everything I have tried doesn't seem to work, and I don't really want to move away from SignalExpress.
    Jay 

    Hi Jay
    I am assuming that you are using the NI-9237. This strain module supports only a specific set of sampling rates; if you specify a rate that is not supported, it will coerce the rate to the next higher sampling rate. Also, since you are using the cDAQ chassis, there is only one sample clock, so it must choose the highest sampling rate. So it is not possible to sample at a lower rate. Please see the following link.
    If you have access to LabVIEW, I would suggest taking your readings at the fast rate and then averaging the samples before you write them to your file, as sketched below. The following link has some more information on this.
    If you are limited to SignalExpress, this task is tricky. If you select an N-Sample acquisition, set the samples to read to two or more, and set the post-acquisition delay under the Execution Control tab to 1000, your acquisition will acquire those samples at the very fast rate but wait 1000 ms between each acquisition. This will lower the overall sample rate.
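    If you can post-process, the averaging idea looks roughly like this (a plain Python sketch on already-acquired data; the 1613 S/s figure is just an example of the kind of coerced rate mentioned above):

        import numpy as np

        acquired_rate = 1613.0     # S/s, roughly the lowest rate the module will coerce to (assumed)
        target_rate = 2.0          # S/s, the rate actually wanted
        block = int(acquired_rate // target_rate)    # samples averaged into one output point

        strain = np.random.randn(int(acquired_rate) * 10)    # stand-in for ~10 s of fast data

        usable = len(strain) - len(strain) % block            # trim to a whole number of blocks
        averaged = strain[:usable].reshape(-1, block).mean(axis=1)   # ~2 points per second
        print(len(averaged), "averaged points from", usable, "raw samples")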
    Chris_K
    National Instruments
    Applications Engineer

  • NI 9234 : sampling rate : cRIO, FPGA

    For the NI 9234 module, the front panel and block diagram of my cRIO FPGA code are shown below.
    The sampling rate (data rate) is selectable from a fixed set of values, as shown below.
    Do you have any idea how I can sample at a lower rate than the smallest option (1.652 kS/s)?

    Hi Cashany, 
    Take a look at this link. Page 16 describes the limitation that you are running into. Basically, the master timebase can only be divided down to certain data rates because of the way the device physically handles digital and analog filtering, so there really isn't a way to divide down past that point (the sketch below shows roughly where those rates come from).
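    For reference (the 13.1072 MHz internal master timebase and the divide-by-256/n structure are my reading of the 9234 documentation, so treat them as assumptions):

        F_MASTER = 13.1072e6    # internal master timebase (assumption from the manual)

        # Data rate = (fM / 256) / n for n = 1..31
        rates = [F_MASTER / 256 / n for n in range(1, 32)]
        print(f"fastest: {max(rates):.1f} S/s   slowest: {min(rates):.1f} S/s")
        # -> fastest: 51200.0 S/s, slowest: ~1651.6 S/s, i.e. the 1.652 kS/s floor

    Anything effectively slower than that floor has to be produced in software, for example by averaging or decimating the 1.652 kS/s stream on the FPGA or on the host.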
    Ryan
    Applications Engineer
    National Instruments

  • How to know the sampling rate for NI6624?

    Dears,
    I am trying to measure a transient signal, a pulse train whose frequency varies with time. The target frequency increases from 0 Hz to 50 Hz as the measurement time goes from 0 s to 1 s. I am using the NI 6624 card and LabVIEW DAQmx. In the block diagram, the measurement method "Low Frequency with 1 Counter" is set in "DAQmx Create Channel (CI-Frequency).vi", and "Finite Samples" mode is chosen in "DAQmx Timing (Implicit).vi". The transient signal points (increasing-frequency points) are then acquired successfully within 1 s. Now I have a question: how do I estimate the time step "dt" between these data points? Knowing the default sampling rate of the card seems like a good way to help me define the "dt" and calculate the time stamp of each data point. If that idea is right, how can the internal sampling rate of the NI 6624 be obtained? Besides that, for this transient counter signal, any other way to get the time stamps of the data points is also welcome.
    Thanks for anyone comment,
    Adan

    Adan,
    When selecting "Implicit" as the DAQmx timing type, you are indicating that a data point will be taken for every measurement the counter performs. When you create a task of type "Low Frequency with 1 Counter," the counter simply uses the card's internal timebase to measure the period between edges of your signal. It then takes this period measurement and converts it to a frequency. Therefore, the spacing between the samples you read out is simply the inverse of the corresponding frequency measurement sample (see the sketch below).
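    A sketch of rebuilding the time axis from the frequency samples themselves (assuming `freqs` already holds the values returned by DAQmx Read; the numbers here are made up):

        import numpy as np

        freqs = np.array([5.0, 10.0, 20.0, 40.0, 50.0])   # Hz, stand-in for the counter readings

        dt = 1.0 / freqs      # each sample took one period of the measured signal
        t = np.cumsum(dt)     # time stamp of each sample relative to the start
        for ti, f in zip(t, freqs):
            print(f"{ti:8.4f} s   {f:6.2f} Hz")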
    Hope this helps,
    Ryan Verret
    Product Marketing Engineer
    Signal Generators
    National Instruments

  • How to set the scan rate in the example "NI435x.vi" for the 435x DAQ device?

    I am using LabVIEW 6i and a 4351 Temperature/Voltage DAQ device, and am using the example VI "NI 435x thermocouple.vi".
    How do I set the scan rate in this VI? I added a counter just to see the time between acquisitions, and it is roughly 4 s for only 4 channels. The notch filter is set at 60 Hz, but is there a control for the scan rate in this example? I'd like to acquire at around 1 Hz.

    Using this DAQ example, it may be easiest to simply place a wait function in the loop and software-time when the data is extracted from the board (a rough sketch follows below). You can also take a look at the NI-435x palette and the examples it ships with, along with the timing specs on page 3-3 of the 435x manual, and set the filters to meet your needs:
    http://www.ni.com/pdf/manuals/321566c.pdf
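    A bare sketch of that software-timing idea (Python rather than LabVIEW, and read_435x_channels() is a hypothetical stand-in for the NI-435x read call, not a real function):

        import time

        def read_435x_channels():
            """Hypothetical stand-in for reading one scan of all channels from the board."""
            return [0.0, 0.0, 0.0, 0.0]

        PERIOD = 1.0                       # seconds between scans (~1 Hz)
        next_tick = time.monotonic()
        for _ in range(10):
            values = read_435x_channels()
            print(time.strftime("%H:%M:%S"), values)
            next_tick += PERIOD
            time.sleep(max(0.0, next_tick - time.monotonic()))

    Keep in mind the board itself still needs time per scan (roughly 4 s for 4 channels in your case), so the filter settings have to allow the loop period you want.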
    Regards,
    Chris

  • How to store the sampling rate into the header of the data file?

    I want to store data parameters such as the sampling rate in the data file. I am currently using the Write LabVIEW Measurement File VI; the data is saved, but a parameter like the sampling rate is not in the file. How can I store this parameter?

    If you want the sampling rate to appear in a different location of the header, the Express VI and its subVIs can be modified to write it any way you want: right-click on the Express VI and select Open Front Panel, and you then have a VI that can be modified. If you want to keep an unmodified Write LabVIEW Measurement File, you could use it as-is, read the whole file back in with one of the file read functions, insert a string with the sampling rate, and then write the whole thing back out again. It might be simpler, though, to use Write Characters to File to create your own header and then use Write to Spreadsheet File to write the data (a rough sketch of that pattern is below). There is also the Export Waveforms to Spreadsheet File function on the Waveform > Waveform File I/O palette; it uses a slightly different format than a .lvm file, and it too can be modified if you don't like the default header.
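    Outside of LabVIEW, the roll-your-own-header idea amounts to this (a Python sketch of the same pattern: write a small header first, then append the data as delimited text; the file name and values are made up):

        import numpy as np

        sample_rate = 1000.0                  # Hz, the parameter to preserve
        data = np.random.randn(100, 2)        # stand-in for two acquired channels

        with open("measurement.txt", "w") as f:
            f.write(f"# Sampling rate (Hz): {sample_rate}\n")   # custom header line
            f.write("# Channels: 2\n")
            np.savetxt(f, data, delimiter="\t", fmt="%.6f")     # tab-delimited data block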

  • FMLE won't let me set the sample rate with Blackmagic Decklink Studio 2 (Windows)

    Hi,
    I am trying to set the audio device in FMLE to the Decklink Studio 2's audio inputs. However, the sample rate sticks to 44.1 kHz and I can't change it to the 48 kHz sample rate that the Decklink defaults to (the standard for broadcast is 48 kHz).
    Try as I may, it isn't changeable. I seem to remember a similar problem where the sample rate defaults to the system audio card (on the motherboard), which is at 44.1 kHz.
    Any ideas on a workaround? The audio from the Decklink is *much* better than the built-in card.
    thanks
    jeff

    One thing you might try doing is trashing the preference file in your user directory:
    ~/Library/Preferences/com.apple.compressor.Compressor.plist
    Like repairing permissions, this is a bit of a "catch-all" fix, not something specific to this particular problem.
    Also, have you tried different file types like: TIFF or JPEG, there could be an issue with output from Fireworks. You could use QT Pro to transcode the file. Sometimes that can work to fix funky problems.
    Does the Text Overlay filter work?
    BG

  • How to set tolerance exchange rate?

    Hi all,
    I am looking for the customizing needed to set an exchange-rate tolerance for currency translation when creating an FI document.
    Eg: using transaction FB01.
    40 100 GBP 55,54 USD (55,52 to 55,55 accepted)
    50 100 GBP 55,54 USD (55,52 to 55,55 accepted)
    Standard message: F5 061 : "Balance in local currency " & " is too large for automatic adjustment".
    I am looking for where I can customize this automatic adjustment.
    Thanks for your help.
    David
    Edited by: David31 on Jul 21, 2011 6:00 PM

    Hi David,
    You can maintain the permissible percentage of exchange-rate deviation in OB64. While posting the document, the exchange rate entered manually should be within the rate maintained in OB08, plus or minus the deviation percentage specified in OB64.
    Regards,
    Nikhil

  • Does anyone know how to find the maximum sample rate on a Mac running 10.7.5?

    Trying to see if I can download an audio plug-in that requires a maximum sample rate of 199 kHz. Where do I find this info for a 2009 Mac running 10.7.5?

    According to http://support.apple.com/kb/ht3913:
    "The internal microphone supports recording at bit depths of 16, 20, or 24 bits per sample and at sample rates of 44.1 kHz, 48 kHz, or 96 kHz."

  • How to set screen refresh rate for linux console (runlevel 3)?

    Hi,
    I have a CRT monitor, so the refresh rate always needs to be configured. I use the 720x400 resolution in the console, but the refresh rate is 70 Hz. Is there any way to get it higher in the console (with KMS or without)?
    Last edited by AleXoundOS (2009-11-04 12:27:09)

    Thank you for the replies.
    But setting the resolution, e.g. vga=773, would not change the refresh rate, or would it?
    It would, but usually not to the maximum the monitor can provide. Maybe it is possible to find the place where it looks for the available resolutions linked with refresh rates...
    According to the Gentoo page I can use uvesafb to set the refresh rate as well as other settings, and then I found the uvesafb page in the ArchWiki. But setting a proper refresh rate is not as easy as I expected, because I need to install uvesafb and it needs an additional daemon. I also think KMS can't work here. Still, it's better than nothing; if I don't find any other way, I will try this one.
