Impact of high, derived clock rate

I am contemplating using the CompactRIO system for a laser instrument that requires high-speed gating. I understand the FPGA portion of CompactRIO runs at a default rate of 40 MHz, but the user can derive slower or faster rates, up to around 200 MHz. My questions are:
1. With a 200 MHz derived rate, I could create a Single-Cycle Timed Loop to implement a counter with 5 ns resolution, correct?
2. With this new, faster derived rate, I assume existing I/O functions may not work as originally designed. Would a solution be to create additional Single-Cycle Timed Loops to connect these I/O functions to, say, the RT system, where the cycle time (in ticks) is some multiple of the 200 MHz derived rate? In other words, if a tick is now 5 ns, have this loop wait 200 ticks (1 µs) before running again, giving the I/O 1 µs to complete before the next cycle.
Thanks,
Steve

Hello Steve,
Derived clocks on the FPGA are different from the top-level timing source. By default, LabVIEW FPGA applications maintain a 40 MHz top-level timing source, while Single-Cycle Timed Loops have a selectable timing source. When using SCTLs, you may elect to use either a derived clock or the top-level source. Multiple loops can execute simultaneously at different rates, based on your selected clock configuration. Please see the video demonstrations linked below for a more detailed explanation:
(Note: Microphone volume is very quiet, you will need to increase your system volume significantly to hear the narration.)
Configure FPGA Clocks Part 1
http://www.screencast.com/users/Patrick_Corcoran/folders/Jing/media/374e6b30-45a4-4028-b2de-4d113e6d...
Configure FPGA Clocks Part 2
http://www.screencast.com/users/Patrick_Corcoran/folders/Jing/media/b843d308-0c43-4e97-a9b4-7ecd4211...
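As a quick sanity check of the arithmetic in the original question (the 200 MHz figure is Steve's example; the maximum derivable rate depends on the target), the tick period and wait time work out as follows:

```python
# Tick arithmetic for a derived FPGA clock (illustrative only).

def tick_period_ns(clock_hz):
    """Period of one tick of a clock, in nanoseconds."""
    return 1e9 / clock_hz

derived_hz = 200e6           # example derived rate from the question
tick_ns = tick_period_ns(derived_hz)
print(tick_ns)               # 5.0 ns per tick

# Waiting 200 ticks of a 5 ns clock before the next loop iteration:
wait_ticks = 200
wait_us = wait_ticks * tick_ns / 1000
print(wait_us)               # 1.0 microsecond between iterations
```

So a 200-tick wait on a 5 ns clock does indeed give the I/O a 1 µs window between iterations, as the question surmises.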
Please post any additional questions or comments on these videos.
Thanks!
Patrick Corcoran
Application Engineering Specialist | Control
National Instruments

Similar Messages

  • MyRIO memory, data transfer and clock rate

    Hi
    I am trying to do some computations on a previously obtained file, sampled at 100 Msps, using the myRIO module. I have two questions: one about data transfer and one about clock rate.
    1. Currently, I access my file (size 50 MB) on my development computer's hard drive from the FPGA through a DMA FIFO, taking one block of around 5500 points at a time. I have been running the VI in emulation mode for the time being. I was able to transfer through DMA from the host, but it is very slow (I can see each point being transferred!). The timer in the FPGA while loop reports 2 ticks per loop, yet the data transfer takes a long time. There could be two reasons for this: either the serial cable is the bottleneck (the DMA happens fast, but the update shown to the user is slower), or the timer is not recording the data-transfer time. Which could it be?
    If I put the file on the myRIO module, I will have to compile every time, but will it behave the same way as with the dev PC (will the DMA transfer be faster)? And here too, do I need to put the file on the USB stick? MAX says there is 293 MB of primary disk free space on the module, but I am not able to see this space at all. If I put my file in this memory, will the data transfer be faster? That is, can I use any static memory on the board (>50 MB) to hold my file, or can I use any data transfer method other than a FIFO? This forum thread (http://forums.ni.com/t5/Academic-Hardware-Products-ELVIS/myRIO-Compile-Error/td-p/2709721/highlight/...) discusses this issue, but I would like to know the speed of the transfer too.
    2. The data in the file is sampled at 100 Msps. The filter blocks inside the FPGA ask for the FPGA clock rate and the sampling rate. I created a 200 MHz derived clock, specified it, and gave the sampling rate as 100 Msps, but the filter gives zero results. Do these blocks work with derived clock rates, or is that a property of the SCTL alone?
    Thanks a lot
    Arya
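    For context on the transfer-time question, a back-of-the-envelope calculation (assuming 16-bit samples, which the post does not state) gives the scale of the data involved:

```python
# Rough DMA transfer arithmetic; the sample width is an ASSUMPTION.

file_bytes = 50 * 1024 * 1024       # 50 MB file
bytes_per_sample = 2                # ASSUMED: 16-bit samples
samples = file_bytes // bytes_per_sample

block_size = 5500                   # points per DMA block, from the post
blocks = -(-samples // block_size)  # ceiling division: blocks needed

sample_rate = 100e6                 # 100 Msps original acquisition rate
realtime_seconds = samples / sample_rate
print(blocks)                       # number of 5500-point blocks to stream
print(realtime_seconds)             # how much acquisition time the file spans
```

    Under these assumptions the file holds roughly a quarter second of data spread over a few thousand blocks, which is why per-block overhead dominates the observed transfer time.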

    Hi Sam
    Thanks for the quick reply. I will keep the terminology in mind. I am trying to analyse the data file (each block of 5500 samples corresponds to a single frame of data) by running some intensive signal processing algorithms on each frame, then averaging the results and displaying them.
    I tried putting the file on the RT target, both on a USB stick and in the RT target's internal memory. I decided to write the delay time for each loop, after the transfer had completed, to a text file on the system. I ran the code by building an exe for both the USB-stick and internal-memory methods, and by compiling with the FPGA emulator for the dev PC VI. (A screenshot of the last method is attached; the same code, with minor modifications, is used for the other two.) To my surprise, all three gave 13 ms as the delay. I certainly expected the transfer from RT internal memory to be faster than from USB, and the one from the dev PC to be the slowest. I will work on this more and try to figure out why this is happening.
    When I transferred the data file (50 MB) into the RT flash memory, MAX showed a 50 MB decrease in free physical memory but only a 20 MB decrease in primary disk free space. Why is this? Could you please tell me the difference between them? I did not find any useful online resources when I searched.
    Meanwhile, the other question still stands: is it possible to run filter blocks with derived clock rates? Can we specify clock rates like 200 MHz and sampling rates like 100 Msps in the filter configuration window? I tried, but obtained zero results.
    Thanks and regards
    Arya
    Attachments:
    Dev PC VI.PNG ‏33 KB
    FPGA VI.PNG ‏16 KB
    Delay text file.PNG ‏4 KB

  • Derived clock problem?

    I am trying to derive a 25 MHz clock using an NI PXI-7842R, and the LabVIEW project won't allow that exact clock.
    But when I try the same thing for a PXI-7830R target, I am successful.
    What is going on?
    I am choosing a base clock of 40MHz
    to get 25MHz, the multiplier is 5 and the divisor is 8
    When I right click on New Derived clock, I only get the option of entering a new clock frequency.
    Why does the tool not let me just specify the multiplier and divisor?
    I am using LabVIEW 2010
    Attachments:
    derived clocks.lvproj ‏16 KB

    LabVIEW FPGA does use the built-in DCMs for the Virtex-5; however, the parameters for the DCM with a 40 MHz input clock do not allow for 25 MHz on the Virtex-5.
    When instantiating a derived clock, there are four possible outputs: a 1X clock; a 2X clock; CLKDV, a phase-aligned clock operating at a fraction of the input clock; and CLKFX, which takes a multiplier and divisor and creates a clock at (M/D) × the input rate.
    Obviously, 1X and 2X will not work because they yield 40 MHz and 80 MHz clocks, respectively. The other two are limited by the specs for the DCM.
    In the Virtex-5 FPGA User Guide, section Clock Management Technology, you will find a section on DCM attributes. CLKDV_DIVIDE (p. 58) is an attribute that tells the DCM what rate the CLKDV output should run at. If you look at the available attribute values, you can see that 5/8 is not one of the valid configurations, so we can't use CLKDV.
    We also cannot use CLKFX, because the Virtex-5 DC and Switching Characteristics Data Sheet shows that the valid CLKFX output ranges are 32 MHz to 140 MHz in Low-Frequency Mode and 140 MHz to 350 MHz in High-Frequency Mode (p. 57). Since 25 MHz is below the minimum rate, we can't create it from this DCM output port.
    You can do this on the PXI-7830R because that board has a Virtex-II FPGA with different characteristics and attributes.
    Donovan
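    Donovan's argument can be restated as a small feasibility check. The CLKFX ranges are the Virtex-5 numbers quoted above; the CLKDV divisor list is an abbreviated, illustrative subset (see the user guide for the full set):

```python
# Feasibility check for deriving a clock from a Virtex-5 DCM.

# CLKFX output limits quoted in the answer (data sheet p. 57).
CLKFX_RANGES = [(32e6, 140e6),    # Low-Frequency Mode
                (140e6, 350e6)]   # High-Frequency Mode

# ILLUSTRATIVE subset of legal CLKDV_DIVIDE values; consult the
# Virtex-5 FPGA User Guide (p. 58) for the complete list.
CLKDV_DIVIDES = [1.5, 2, 2.5, 3, 4, 5, 8, 16]

def clkfx_possible(target_hz):
    """CLKFX can only synthesize outputs inside one of its two ranges."""
    return any(lo <= target_hz <= hi for lo, hi in CLKFX_RANGES)

def clkdv_possible(base_hz, target_hz):
    """CLKDV divides the base clock by one of a fixed set of divisors."""
    return any(abs(base_hz / d - target_hz) < 1.0 for d in CLKDV_DIVIDES)

print(clkfx_possible(25e6))        # False: 25 MHz is below the 32 MHz floor
# False: 40 MHz / 25 MHz would need a divisor of 1.6, which is not legal.
print(clkdv_possible(40e6, 25e6))
```

    Both checks fail for 25 MHz from a 40 MHz base, which matches why the project refuses the clock on the 7842R.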

  • I2C interface with 400kHz clock rate

    The max clock rate of NI's USB-8451 I2C interface is 250kHz.
    Please support my request for an interface with at least 400kHz.
    Has anybody used interfaces (with LV driver) from third-party suppliers with a clock rate of 400kHz or higher ?

    Hi Christian,
    Thank you for your suggestion.
    Because I don't use the LV FPGA Module, that is too expensive a solution!
    Best regards

  • DDS Compiler (Xilinx Coregen IP) maximum achievable clock rate

    Hi,
    I am using LabVIEW FPGA 2011 and a FlexRIO 7965R (i.e. Virtex-5 SX95T). I have compiled a sinusoid generator using the built-in Xilinx CORE Generator IP named 'DDS Compiler'. The output of the DDS Compiler is sent to the host VI using a DMA FIFO. My SCTL runs at 346 MHz, as the largest clock that can be provided is 346.666 MHz. According to the DDS Compiler data sheet (DS558), the code with the configuration settings I have used should run at 450 MHz (mentioned in column 4 of Table 8 on page 28). But my code gives a maximum achievable clock rate of 350.63 MHz. I have attached my code and images of the compilation results. Can somebody check and tell me why I am not getting the 450 MHz rate? Is there any limitation on the clock rate due to VI-scoped or DMA FIFOs?
    Thanks
    Attachments:
    DDS Compiler.zip ‏858 KB
    images.zip ‏53 KB

    Hi Sandee,
    What are you planning on doing with this sine wave? Are you using a FlexRIO Adapter Module (FAM)?
    The reason you can't get anything higher than 346.66 MHz is that the clock it is derived from is a 40 MHz clock. Once you multiply it up that high, the accuracy of the frequency is no longer reliable.
    The beauty of FlexRIO is that you can bring in external clocks with several of our FAMs. These external clocks are piped directly to the FPGA. On top of that, if you're using the 6587 FAM, for example, you can generate up to a 500 MHz clock on the FAM and provide that to the FPGA.
    To do that, first add your adapter module to the project.
    Right-click on your FPGA target -> select New FPGA Base Clock -> IO Module Clock 0 from the drop-down.
    Then specify your frequency to be 450 or 500, whatever you want it to be.
    Then on your block diagram, provide your Single-Cycle Timed Loop with IO Module Clock 0 as the source.
    I'm going to compile it now and I'll update with the results.
    National Instruments
    FlexRIO & R-Series Product Support Engineer
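    One way to see where the 346.666 MHz ceiling comes from: derived clocks are synthesized as rational multiples (M/D) of the 40 MHz base, so only certain frequencies are reachable. A brute-force search, with arbitrary illustrative M/D bounds rather than the real device limits, finds the closest reachable frequency under a cap:

```python
# Searching derived-clock candidates BASE_HZ * M / D (illustrative sketch).

BASE_HZ = 40e6   # top-level clock the derived clock is synthesized from

def best_derived(target_hz, cap_hz, max_m=64, max_d=64):
    """Closest BASE_HZ * M / D to target_hz without exceeding cap_hz.

    The M and D bounds here are ARBITRARY illustration values; real
    DCM/PLL parameter ranges are device-specific.
    """
    best = None
    for m in range(1, max_m + 1):
        for d in range(1, max_d + 1):
            f = BASE_HZ * m / d
            if f > cap_hz:
                continue
            if best is None or abs(f - target_hz) < abs(best - target_hz):
                best = f
    return best

# Asking for 450 MHz under a ~346.67 MHz ceiling lands on 40 MHz * 26 / 3,
# i.e. the 346.666... MHz figure reported in the post.
print(best_derived(450e6, cap_hz=346.67e6) / 1e6)
```

    The search is only a sketch of the idea; the actual ceiling is set by the FPGA's clock-management hardware, which is why an external clock from a FAM sidesteps it.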

  • How to affect the actual FPGA clock rate?

    Hello,
    I developed an FPGA VI which should run at an 80 MHz clock rate on the PCI-7833R board.
    This was no problem at the beginning, but meanwhile the code has grown (slices: 30%). Now the compiler reports an error with the timing constraints: the maximum clock it can reach is 74 MHz.
    Unfortunately I need the 80 MHz to be fast enough.
    What parts of the code influence the actual clock rate?
    How can I get it faster?

    I think you misunderstood what I meant:
    The problem is not how many ticks a while loop needs for one cycle.
    The problem is the clock rate setting for the FPGA board. It is set to 80 MHz, not the default 40 MHz.
    When I compile my VI, the compiler reports that it can't meet the 80 MHz timing. That means, for example, that a Single-Cycle Timed Loop doesn't take 12.5 ns but, say, 13.4 ns.
    If I separate my code into two parts and compile each part as a separate project, the compiler can meet the 80 MHz. Only together does it fail.
    So the whole problem has something to do with the size of the project, the depth of the structures, etc.
    I need to know, for example, which structures slow the clock down the most.

  • Atm0 clock rate on 1760 router

    I have a 2651XM router with a WIC1-ADSL card, and I've seen that by tweaking the atm0 and dialer0 settings I can literally double the download speed on my crappy ADSL line. One of the settings I use on the 2651 is:
    # int atm0/0
    # clock rate aal5 7000000
    However, I also have a 1760 router on ADSL at a different location, and it won't accept 7000000.
    1760(config-if)#clock rate aal5 ?
            1000000
            1300000
            1600000
            2000000
            2600000
            3200000
            4000000
            5300000
            8000000    <default>
      <1000000-8000000>  clock rates in bits per second, choose one from above
    From the point of view of squeezing every last drop of speed from the ADSL line, which clock rate should I use?

    Tony,
    Read this
    http://www.cisco.com/en/US/docs/ios-xml/ios/interface/command/ir-c2.html#wp4276769922
    Regards,
    Alex.

  • High-Speed Clock Signal Generation Using an FPGA Output

    Hi,
    Here is a screenshot of a LabVIEW FPGA program. I am trying to generate a 5 MHz clock signal at Connector 0, DIO12, but I measure around 2 MHz with an oscilloscope. Would someone tell me what's wrong?
    LabVIEW 2011
    FPGA target: PXI-7841
    I set DIO12 to Never Arbitrate in the property settings.

    I'm not certain but maybe some of the delay is happening because you are changing the mode of the pin.  According to the specs on the card it is capable of having a Maximum Clock Rate of 40MHz under the DIO section.  So if you are purely doing digital reads, or digital writes you should be able to update/read 40,000,000 times a second.  If you perform a read, which can take up to 1/40M of a second, then invert which takes some time but practically none, then another write which can take up to 1/40M of a second, your loop rate should still be faster than the 10MHz you showed.  That's why I suspect there is time involved in changing over the pin from a read mode to a write.
    If you try to do something similar with the analog outputs, you'll notice the maximum update rate is only 1 MHz. So while your logic and code can run at 40 MHz, you can only update the analog value at 1 MHz. Because of this, in the past I have had two loops: one running at the maximum clock rate doing the logic calculations, then sending the result to another loop that updates the output at its maximum rate, which in the case of an analog out is only 1 MHz.
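    The read-invert-write reasoning above can be put into numbers: a square wave needs two pin updates per period, so the output frequency is the update rate divided by twice the cycles spent per toggle. The 10-cycle figure below is an assumed illustration, not a measured value:

```python
# Square-wave frequency from a toggle loop (cycle counts are illustrative).

CLOCK_HZ = 40e6   # maximum DIO update rate of the card, per the specs above

def square_wave_hz(cycles_per_toggle):
    """Output frequency when the pin toggles once every cycles_per_toggle
    clock cycles; two toggles make one full period of the square wave."""
    return CLOCK_HZ / (2 * cycles_per_toggle)

# One toggle per clock cycle: the theoretical 20 MHz ceiling.
print(square_wave_hz(1) / 1e6)   # 20.0

# If read + invert + write plus pin-direction turnaround costs, say,
# 10 cycles per toggle (ASSUMED), the output drops to 2 MHz -- the kind
# of figure seen on the scope.
print(square_wave_hz(10) / 1e6)  # 2.0
```

    This is why a pure write loop (no read, no direction change) gets much closer to the pin's rated clock rate.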

  • Error Disabling the Clock Rate on Interface

    When I try to disable the clock rate on a serial interface of a 3640 router, the router returns the following error:
    Router(Config-if)#no clock rate 56000
    FECPM PM doesn't support clock rate 0
    How can I disable the clock rate?

    Hi
    I even kept the interface in admin shutdown mode, but it refuses to accept the no clock rate command.
    After using show controllers serial 0/1,
    it gives the output as "serial not connected".
    Kindly see how to remove the command.
    regards
    Sohail JKB

  • Can someone explain the origin or history of the term 'derived clock'?

    I am interested in knowing the origin or history of the term 'derived clock' as it applies to FPGA programming.  See the link below.
    https://decibel.ni.com/content/docs/DOC-3003
    Thank you.

    A derived clock is nothing more than a clock you created (derived) from a real hardware clock.  This is done by multiplying and/or dividing the base clock.
    This is needed in some applications to make certain segments run faster or slower than the base clock alone would allow, usually in signal processing.
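    In formula terms, the multiply/divide relationship looks like this (a generic sketch, not tied to any particular FPGA family):

```python
from fractions import Fraction

def derived_clock_hz(base_hz, multiplier, divisor):
    """A derived clock is the base (hardware) clock scaled by M/D."""
    return base_hz * Fraction(multiplier, divisor)

# 40 MHz base multiplied by 5 and divided by 1 gives a 200 MHz derived clock.
print(float(derived_clock_hz(40e6, 5, 1)))   # 200000000.0
# 40 MHz * 5/8 would be 25 MHz (whether the hardware's clock-management
# block actually allows that ratio is device-specific).
print(float(derived_clock_hz(40e6, 5, 8)))   # 25000000.0
```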

  • Get actual clock rate error

    I'm using an ADAC 5500 DAQ card with LabVIEW. I got an error called 'get actual clock rate'. What is the meaning of this error? How can I get the actual clock rate, and what should I do? Thanks for your response.

    Hello,
    I visited IOtech's website and found that the ADAC/5500 Series specifications data sheet mentions that LabVIEW is supported with this piece of hardware. They mention: "complete drivers with example programs for rapid application specific development".
    My suggestion is to contact IOtech. They are the ones who wrote the driver to interface their hardware with LabVIEW and should know what the error means. In this case, our ability to help you is limited since we did not write the driver. However, they should be able to provide you with more information about the error and how to fix it.
    ADAC/5500 Series
    IOtech support
    Hope this helps!
    S Vences
    Applications Engineer
    National Instruments

  • Can I do hardware compare using HSDIO when the generation and acquisition clock rates are different?

    Hi there
    My application should generate a digital stream at clock rate A, with a bus width (number of bits) varying from 1 to 4 bits.
    The DUT will always respond on one bit, but at clock rate B.
    1. Can I use the hardware compare mechanism of the HSDIO boards?
    2. If the answer is yes, is there any sample that can help me get started?
    Thanks in advance,
    Gabel Daniel

    Hi,
    One good example to use as a starting point can be found in the NI Example Finder.  Navigate to "Hardware Input and Output" >> "Modular Instruments" >> NI-HSDIO >> Dynamic Acquisition and Generation >> Hardware Compare - Error Locations.vi.  You'll need to explicitly wire in your new rate into the "niHSDIO Configure Sample Clock.vi" on your response task.
    There is also a portion of this code that warns if the stimulus and response waveforms are not the same size.  This is not necessary, and could be deleted from the program. You are allowed to have different size stimulus and response waveforms.
    Jon S
    Applications Engineer
    National Instruments

  • Sample clock rate when exporting from Matlab

    Hi all,
    I'm quite new to LabVIEW and got stuck on (probably) some basics. I am trying to use a data vector generated by Matlab as an output with a PCI-6115 (see the pic). I use the "Read From Spreadsheet" VI. Everything works fine, but I cannot change the sample clock rate. I wanted to use 100 000 kHz for this (binary) output, but I only get 1 Hz, which is some default value. The "DAQmx Timing" VI I'm using says it "uses the dt component of the waveform input to determine the sample clock rate". I can find this dt inside "DAQmx Timing", but nothing happens when I change it. Any help is greatly appreciated!

    Do you see those red coercion dots on the inputs of DAQmx Timing and DAQmx Write? They mean you are using an incorrect data type. A 1D array has no timing information; the waveform data type includes a dt value, and this is what is required to set the sample rate of the DAQ card. If you don't have this in the text file, you cannot automatically provide it. The Read From Spreadsheet VI cannot automatically create a waveform data type even if the file does contain dt information. Since you did not attach the text file you are reading, there is no way of knowing whether your problem goes all the way back to Matlab or lies somewhere else.
    In the future, please think about what information you should provide. As mentioned, the text file is important. It's also hard to debug an image. Attaching the actual code or a snippet is much better. A snippet is a png file that can be imported into LabVIEW as actual code. It's an option under the Edit menu in LabVIEW 2009.
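    The role of dt can be illustrated outside LabVIEW: a bare 1D array carries no timing, and pairing it with dt = 1/fs is exactly what fixes the sample clock rate. The 100 kHz figure below is illustrative, not taken from the poster's file:

```python
# A 1D array alone has no timing information; dt supplies it.

fs = 100_000.0          # desired sample clock rate in Hz (ILLUSTRATIVE)
dt = 1.0 / fs           # the waveform's dt component

samples = [0.0, 1.0, 0.0, -1.0] * 4   # stand-in for data read from a file

# Timestamps implied by dt -- this is what the output timing ultimately uses.
t = [i * dt for i in range(len(samples))]
print(dt)               # seconds between samples
print(t[1])             # timestamp of the second sample
```

    In LabVIEW terms, bundling the Y array with this dt into the waveform data type is what lets DAQmx Timing pick up the intended rate instead of a 1 Hz default.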

  • Scan clock external - log scan at less than the external clock rate

    I am using the Clock config vi to set the scan clock to external and connected to PFI0
    PFI 0 is connected to a TTL level encoder, which is attached to the rotating shaft of a brake test dynamometer. The encoder provides 60 pulses per full rotation of the shaft. Thus I acquire 60 scans of the channels to the buffer for every turn of the shaft. So far, so good, this works fine.
    However, I wish to use the same software on another dynamometer which is fitted with a different encoder. This encoder gives 1024 pulses per turn of the shaft. I do NOT want to acquire a scan for every pulse (I don't want 1024 scans per turn of the shaft); I would prefer a 16th of this, say 64 scans per rotation. Can I use the Clock Config VI to set acquisition at some fraction of the external clock rate?
    I notice an example on your site describes setting the external channel clock to a fraction of the external clock, but I can't see anything similar for the scan clock.
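    The division being asked for can be sketched as a pulse counter that passes every Nth edge. Whether the Clock Config VI supports this in hardware is the actual question; the sketch only illustrates the intended logic:

```python
def divide_pulses(pulses, n):
    """Yield one scan trigger for every n input pulses (a divide-by-n)."""
    count = 0
    for edge in pulses:
        count += 1
        if count == n:
            count = 0
            yield edge

# 1024 encoder pulses per revolution, divided by 16 -> 64 scans per revolution.
pulses_per_rev = list(range(1024))
scans = list(divide_pulses(pulses_per_rev, 16))
print(len(scans))   # 64
```

    A counter/timer output configured this way (divide the encoder signal by 16, then use the result as the scan clock) is the usual hardware equivalent of this loop.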

    Hi,
    I would post your question to the Measurement Hardware section of the Discussion Forum. Your question would get the best response in that Forum.
    Mike

  • AI Convert Clock Rate?

    I am not clear on the AI Convert Rate. I have 8 analog input channels that I would like to read based on a PFI7 sampling clock. The documentation says it will use 10 µs + the fastest clock cycle for the interchannel A/D delay. Do I have to specify the convert clock rate as the 10 µs + 50 ns delay myself, or will it be done automatically?

    Hi SoftwareGuy2009,
    The convert clock is set automatically with the DAQmx driver.  If you need to  change the convert clock rate, you can use the AI Convert Rate property with a DAQmx Timing Property Node.  You can find some more information on this with the KnowledgeBase articles below:
    How is the Convert (Channel) Clock Rate Determined in NI-DAQmx and Traditional NI-DAQ?
    How is the Convert Rate Determined for Analog Input Operations with M Series Cards in NI-DAQmx Base...
    How Do I Increase Interchannel Delay Using NI_DAQmx or Traditional NI-DAQ (Legacy)?
    Regards,
    Jim Schwartz
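    The relationship between the sample clock and the convert clock can be checked with simple arithmetic. The 10 µs + 50 ns figure is taken from the question above; treat all numbers as illustrative:

```python
# Scan-time arithmetic for multiplexed analog input (illustrative numbers).

n_channels = 8
convert_period_s = 10e-6 + 50e-9    # interchannel delay from the question

# Time for the convert clock to step through all channels in one scan:
scan_time_s = n_channels * convert_period_s
print(scan_time_s * 1e6)            # microseconds consumed per scan

# The external sample clock on PFI7 must therefore stay below roughly:
max_sample_rate_hz = 1.0 / scan_time_s
print(round(max_sample_rate_hz))    # upper bound on the scan rate, in Hz
```

    In other words, with this interchannel delay the eight conversions occupy about 80 µs per scan, capping the usable external sample clock at roughly 12 kHz; the driver's automatically chosen convert rate has to satisfy the same constraint.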
