External Clock PCI-6110

Hello / Bonjour
I am using LabVIEW 6.1 / Windows 2000 and an NI PCI-6110 card.
The system that I am developing uses an external clock (connected to PFI7) to sample a signal on channel 0; a trigger connected to TRIG1 starts an acquisition. The program controlling the DAQ is a modified version of the “Acquire N Scans ExtScanClk D-Trig.vi” example; I have removed the two “AI Clock Config” sub-VIs from that example. The program then uses the STARTSCAN signal as an external clock.
I have three questions:
1. How do you configure STARTSCAN for either rising or falling edge polarity, as stipulated in the PCI-6110 user manual?
2. What is the purpose of the two “AI Clock Config” sub-VIs in “Acquire N Scans ExtScanClk D-Trig.vi”, since the PCI-6110 card doesn’t seem to need them to run properly with an external clock?
3. Should I configure the CONVERT* signal if I want to sample the channel 0 signal immediately when a STARTSCAN pulse appears?
Thank you

Thank you rgharrison for your answer
I tried to make some modifications and tests following your comments.
I have incorporated the "AI Clock Config" VI in my application (which is now very similar to "Acquire N Scans ExtScanClk D-trig.vi", except that there is only one "AI Clock Config" VI), and I have these questions and comments:
As the PCI-6110 samples all channels simultaneously (and I want to sample only one channel), should I set the channel clock as you mentioned?
Given that my external clock is connected to PFI7 and I have specified "PFI pin, high to low" in "clock source code", it doesn't matter to the board which PFI I set as "clock source string": the board still finds an external clock. But if I leave "clock source string" blank, I get an error. I also noticed that the parameters of the "actual clock rate specification" cluster remain equal to zero. Is that normal, or am I misunderstanding something?
I don't understand why I should set the "clock source" as fast as possible, since I want to use an external clock and sample my signal once per external clock period. Also, in the "Acquire N Scans ExtScanClk D-trig.vi" example, after setting the clock as fast as possible, the following "AI Start" VI disables the internal clock, so again, why set up the clock only to disable it right after?
If I set the "clock source" to the maximum rate and the scan clock (my actual external clock) is slower than the "clock source", can the board acquire more than one sample per external clock period?
Regards

Similar Messages

  • How do I use a quadrature encoder as an external clock (PCI 6229)

    Hello (a similar post has been placed on the DAQ forum; apologies, as I did not know the best place).
    I have a PCI 6229 M Series data acquisition card. I want to use a quadrature encoder as the external clock driving the acquisition of a number of signals. I have set up reading 24 signals each time a clock pulse is received, using the DAQ Assistant, and set my external clock to pin PFI8 (I think); this is then connected to an encoder output. This works well enough until the encoder is run too fast, at which point it appears I am either missing pulses or getting bounce.
    How can I set up clocking using a quadrature encoder? I have seen a number of questions on this forum regarding quadrature encoders and reverse counting, but not on using them as an external clock.
    Basically I want to have the stability and "bounceless" nature of using two outputs from a quadrature encoder whilst still using an external clock. Is this just a case of configuring controls to certain PFIs? If so, how is it done?
    Any help or pointers would be helpful. So far I have managed very nicely by simply using the DAQ Assistant, and its interface suggests that, if configured for a certain PFI pin, I could actually still use it.
    Thanks in advance.
    Kevin

    Hi,
    Well, I've had a look into this for you and I'm not quite sure I understand what you are looking for.
    Is it possible for you to phone back in to support?
    The reason you are seeing bounce at high speeds, or indeed loss of points, is due to the sampling rate that you have set up.
    What you will find is that the trigger will start an acquisition of a number of points at a certain rate. If your sampling rate is too low, then you will not finish that sample batch before the next set of samples is recorded.
    It is possible to use an external clock into a trigger or digital line; however, this will limit the number of samples you can take to the speed of your encoder.
    If you increase your sampling rate and then configure a start trigger from a single input from the encoder, you will be able to record a number of samples after a rising/falling edge. (Set the clock as an internal clock.)
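    In DAQmx C terms, that internal-clock-plus-start-trigger approach might look roughly like this (a minimal sketch; "Dev1", ai0:23, PFI8, and the 100 kHz rate are assumptions for illustration, not taken from the thread):
    #include <NIDAQmx.h>
    // Hedged sketch: one finite AI batch on the internal clock, started by a
    // digital edge from one encoder output. Restart the task for each batch.
    TaskHandle task = 0;
    float64 data[24000];                    // 24 channels x 1000 samples
    int32 read = 0;
    DAQmxCreateTask("", &task);
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0:23", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    // Internal clock, fast enough to finish the batch before the next pulse
    DAQmxCfgSampClkTiming(task, "", 100000.0, DAQmx_Val_Rising,
                          DAQmx_Val_FiniteSamps, 1000);
    DAQmxCfgDigEdgeStartTrig(task, "/Dev1/PFI8", DAQmx_Val_Rising);
    DAQmxStartTask(task);
    DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                       data, 24000, &read, NULL);
    DAQmxClearTask(task);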
    Hope this helps
    AdamB
    Applications Engineering Team Leader | National Instruments | UK & Ireland

  • Is it possible to read digital data using an external clock (PCI-6259 M)?

    I'm using an NI PCI-6259 M Series card and trying to write my program in VC++ 6.0 using the functions in the DAQmx driver.
    Question 1: Not all functions listed in the NI-DAQmx C Reference Help seem to be supported by my NI card. Where can I find information about which functions are supported?
    Question 2: I want to read data from a device that clocks out data on the falling edge of a clock signal. The clock signal and the data signal are routed to two DIO terminals on the NI card. The question is whether it is possible to read data using that clock as a sample clock. See the two code examples below, which don't work: in both cases 10 samples are read at once, even if the external clock is not present.
    Example 1
    // Create tasks
    status = DAQmxCreateTask("", &m_ReadTrimTask);
    // Set up read task
    status = DAQmxCreateDIChan(m_ReadTrimTask, "Dev1/port2/line0", "", DAQmx_Val_ChanPerLine);
    status = DAQmxCfgChangeDetectionTiming(m_ReadTrimTask,"Dev1/port2/line6","Dev1/port2/line6",DAQmx_Val_FiniteSamps, 10);
    // Read data
    int32 sampsPerChanRead, numBytesPerSamp;
    status = DAQmxReadDigitalLines(m_ReadTrimTask, 10, 10.0, DAQmx_Val_GroupByChannel, result, 10, &sampsPerChanRead, &numBytesPerSamp ,NULL);
    Example 2
    // Create tasks
    status = DAQmxCreateTask("", &m_ReadTrimTask);
    // Set up read task
    status = DAQmxCreateDIChan(m_ReadTrimTask, "Dev1/port2/line0", "", DAQmx_Val_ChanPerLine);
    status = DAQmxSetSampTimingType(m_ReadTrimTask, DAQmx_Val_SampClk);
    status = DAQmxSetSampClkRate(m_ReadTrimTask, 1000.0);
    status = DAQmxSetSampClkActiveEdge(m_ReadTrimTask, DAQmx_Val_Falling);
    status = DAQmxSetSampClkSrc(m_ReadTrimTask, "Dev1/port2/line6");
    // Read data
    int32 sampsPerChanRead, numBytesPerSamp;
    status = DAQmxReadDigitalLines(m_ReadTrimTask, 10, 10.0, DAQmx_Val_GroupByChannel, result, 10, &sampsPerChanRead, &numBytesPerSamp ,NULL);

    Hello Magnus,
    Thank you for contacting National Instruments.
    "Question1: Not all functions listed in the NI-DAQmx C Reference Help seems to be supported by my NI-card, where can I find information about which of the functions that are supported?"
    The best place to look for this information would be the M Series Help Manual. There you can find the features of your PCI-6259 and what operations it supports.
    "Question2: I want to read data from a device that clock out data on the falling edge of a clock signal. The clock signal and the data signal are routed to two DIO terminals on the NI-card. The question is if it is possible to read data using the clock as a sample clock? See two code examples below that doesn’t work. In both cases 10 samples are read at once, even if the external clock is not present."
    Look at the "ContReadDigChan-ExtClk_Fn.c" example project which ships with the NI-DAQ driver. This is located at: C:\Program Files\National Instruments\NI-DAQ\Examples\DAQmx ANSI C\Digital\Read Values\Cont Read Dig Chan-Ext Clk.
    You will have to make some minor modifications to convert this to a finite acquisition, but that is simply a matter of changing the "sampleMode" parameter of the DAQmxCfgSampClkTiming() function. You will also have to route your clock signal to a PFI line and specify which line in your code.
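    Condensed, the finite externally clocked read from that example looks roughly like this (a sketch under assumed names: "Dev1", port0/line0, and PFI0 are illustrative):
    #include <NIDAQmx.h>
    // Hedged sketch: finite DI read paced by an external clock on a PFI line.
    // Note the clock must arrive on a PFI terminal, not a port2 line.
    TaskHandle task = 0;
    uInt8 data[10];
    int32 sampsRead = 0, bytesPerSamp = 0;
    DAQmxCreateTask("", &task);
    DAQmxCreateDIChan(task, "Dev1/port0/line0", "", DAQmx_Val_ChanPerLine);
    DAQmxCfgSampClkTiming(task, "/Dev1/PFI0", 1000.0, DAQmx_Val_Falling,
                          DAQmx_Val_FiniteSamps, 10);   // finite: 10 samples
    DAQmxStartTask(task);
    DAQmxReadDigitalLines(task, 10, 10.0, DAQmx_Val_GroupByChannel,
                          data, sizeof data, &sampsRead, &bytesPerSamp, NULL);
    DAQmxClearTask(task);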
    I hope this helps.
    Sean C.
    Applications Engineering
    National Instruments

  • PCI-6229 Interrupt on external clock

    Hi,
    I want to generate an interrupt from an external clock (2 kHz). I can't find anything on generating an interrupt from a Digital Input. So I have tried using the G0 TC interrupt.
    I modified gptcex1 for a countdown timer as described in the source file.
    - set to reload on TC
    - set source switching to same.
    - enabled G0 TC interrupts
    - connected my input clock to PFI 8
    It appears that "reload on TC" reloads 0xFFFFFFFF instead of my initial count. So I have to reload my initial count in the ISR.
    board->G0_Load_A.writeRegister(GPCT0_INITIAL_COUNT);
    This results in an interrupt every GPCT0_INITIAL_COUNT + 1 cycles. This means that I can only generate an interrupt every second cycle at best.
    Is there some way to overcome this limitation or is there a better way to achieve my aim?

    Russell_Thamm wrote:
    I want to generate an interrupt from an external clock (2 kHz). I can't find anything on generating an interrupt from a Digital Input.
    <snip>
    or is there a better way to achieve my aim?
    Hi Russell,
    Ideally, the best course of action is to use change detection [1]. Unfortunately, the M Series DDK does not have change detection support like the X Series DDK does [2].
    However, if you have the analog input or analog output subsystem available, you can use your 2 kHz signal as the sample clock and enable interrupts on the START signal. Since the device will expect to acquire or generate samples, you'll need to either throw that acquired data away or provide safe, dummy data to generate.
    Analog input: tInterrupt_A_Enable :: setAI_START_Interrupt_Enable
    Analog output: tInterrupt_B_Enable :: setAO_START_Interrupt_Enable
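    At register level that might look roughly like this (a sketch built only from the names above; `board` is an initialized tMSeries pointer, and the flush() call is an assumption about the DDK's register objects):
    // Hedged sketch: assumes an AI task is already configured to run off the
    // 2 kHz external sample clock, with its acquired data discarded in the ISR.
    board->Interrupt_A_Enable.setAI_START_Interrupt_Enable(1); // interrupt on each START
    board->Interrupt_A_Enable.flush();                         // commit to hardware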
    [1] M Series User Manual :: Change Detection (page 104)
    http://www.ni.com/pdf/manuals/371022k.pdf
    [2] How to generate an interrupt using DI change detection on m-series card
    http://forums.ni.com/t5/Driver-Development-Kit-DDK/How-to-generate-an-interrupt-using-DI-change-dete...
    Joe Friedchicken
    NI VirtualBench Application Software
    Senior Software Engineer :: Multifunction Instruments Applications Group
    Software Engineer :: Measurements RLP Group (until Mar 2014)
    Applications Engineer :: High Speed Product Group (until Sep 2008)

  • How to increase speed of analog read using PCI-6110 and LabVIEW 7.1?

    Hi All,
    I'm a fairly novice user of LabVIEW and I'm trying to figure out how to make my system as fast as possible. Basically, I'm reading in an analog signal (AI0) together with a trigger signal (PFI0), and I want to sample the analog signal on the falling edge of the trigger signal. What I'm finding, though, is that the acquisition rate of this system is excessively slow. My analog signals occur faster than 1 kHz, but the speed of my system is only about 100 Hz. A colleague of mine told me that the DAQmx Trigger.vi I was using was probably the culprit for the slow performance I was observing. Could this be right? If so, how could I resolve this issue?
    Another thought I had was to input the trigger signal into the external source input of the DAQmx Timing.vi. In this way I'd be using the trigger signal as an external clock. If operated in Continuous mode, then I think the Sample Clock would cause a sample to be read only when the trigger signal changes its state (from high to low). Could this work?
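    For reference, that external-clock idea expressed in DAQmx C (a sketch with assumed names; "Dev1", ai0, and PFI0 are illustrative, and the rate argument is only a buffer-sizing hint when the clock is external):
    #include <NIDAQmx.h>
    // Hedged sketch: continuous AI paced by the trigger signal on PFI0,
    // sampling on its falling edge.
    TaskHandle task = 0;
    float64 data[1000];
    int32 read = 0;
    DAQmxCreateTask("", &task);
    DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                             -10.0, 10.0, DAQmx_Val_Volts, NULL);
    DAQmxCfgSampClkTiming(task, "/Dev1/PFI0", 1600.0, DAQmx_Val_Falling,
                          DAQmx_Val_ContSamps, 1000);
    DAQmxStartTask(task);
    DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                       data, 1000, &read, NULL);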
    Thanks for reading,
    Ben

    Hi Marni,
    Thank you so much for your post! I tried doing as you suggested, using the Cont Acq&Graph Voltage-Ext Clk-Dig Start.vi and I first set the trigger signal as the external clock source. It works beautifully! Now, with 1.6 kHz signal in, I'm reading in 1.6 kSa/s. This is a dramatic improvement from before. I also tried using the internal clock source option, but it didn't seem to trigger when I wanted it to. I didn't spend too much time debugging it, though, since the other method works so well. This brings up an interesting question. Is the DAQmx Trigger.vi really that slow? Previously, I could read in 100 kSa/s at best using the Trigger VI. Anyways, I'm a happy camper now:-)
    By the way, to answer your question about the DAQ device, I'm using the NI PCI 6110 Multifunction DAQ.
    Thanks again!
    Ben

  • How can I optimally stream data to disk at 5 MS/s on 4 channels using a PCI-6110?

    I am using a PCI-6110 to acquire data streaming from a camera. Data arrives on all four analog input channels at 5 MHz. An A2D clock is attached to PFI8, also originating from the camera and operating at 5 MHz. Finally, there is a Valid-data signal from the camera that signals when the data stream coming in on the four analog channels is actual data (each Valid-data pulse is about 132 samples wide). This is attached to PFI0. Ideally, I would like to have a program that captures data at the full rate and is correctly synchronized with the Valid-data line. It should also be able to stream to disk for several seconds at least before pausing for the buffer.
    Though I'm something of an inexperienced programmer, I have attempted programs both in LabVIEW and C (using DAQmx) to resolve this, but I have experienced much slower behavior than I'd like. In LabVIEW, this seems to be the result of slow streaming to disk, such that I have to pause and reset the task every frame (132x512 samples per channel). Using DAQmx in C, I have found it necessary to restart the task after every line (132 samples per channel) in order to correctly trigger from the Valid-data pulse. Both are far from optimal. I would appreciate any advice that those of you with more experience reading at very high rates from this card may have to offer.
    Thanks,
    Steven Israel
    Duke University

    I'm not sure if I fully understand your application, but maybe some of these hints will help.
    First your streaming to disk rate is going to vary widely depending on your system. Sometimes using a faster processor helps, but most of the time using a faster hard drive will give you the best performance increase.
    Second, writing less data will often have a dramatic impact in streaming to disk applications. If you're reading back scaled data, I would strongly recommend switching to unscaled or raw reads and storing off the scaling constants so you can scale the data off line. This could reduce the required streaming to disk bandwidth by a factor of 4 alone.
    Third, since the 6110 is a 12-bit device, you can try compressing two 12-bit samples into one 24-bit piece of data. This will allow you to store in three bytes what used to take four.
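    A hedged sketch of that packing in plain C (unpacking must sign-extend the 12-bit two's-complement values before applying the stored scaling constants):
    #include <stdint.h>
    #include <stddef.h>
    // Pack pairs of 12-bit samples (carried in int16_t, as raw reads return
    // them) into 3 bytes per pair. Returns the number of bytes written.
    size_t pack12(const int16_t *in, size_t n, uint8_t *out)
    {
        size_t o = 0;
        for (size_t i = 0; i + 1 < n; i += 2) {
            uint16_t a = (uint16_t)in[i]     & 0x0FFF;       // low 12 bits
            uint16_t b = (uint16_t)in[i + 1] & 0x0FFF;
            out[o++] = (uint8_t)(a & 0xFF);                      // a[7:0]
            out[o++] = (uint8_t)((a >> 8) | ((b & 0x0F) << 4));  // a[11:8] | b[3:0]
            out[o++] = (uint8_t)(b >> 4);                        // b[11:4]
        }
        return o;
    }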
    Last, you might consider using your signal on PFI0 as a pause trigger. Depending on the frequency of the pulses on your Valid-data signal, this could significantly reduce the required bandwidth of both the PCI bus and your hard drive. However, since the 6110 uses pipelined ADCs, you need to be careful if the time between valid data pulses is greater than one millisecond. This would violate the minimum sampling frequency of the board, and the points in the pipeline that are returned after unpausing (3 in this case) might not be digitized at the full accuracy specification for the device. Because of the pipelining, you also need to be careful how you correlate the data. The first time you unpause for the Valid-data pulse, the first three points will be garbage points. The second and subsequent times you unpause, the first three points are really the last three points from the previous Valid-data pulse. See the Device Considerations->Timing chapter in the NI-DAQmx Help file for more information on timing considerations with pipelined ADCs.
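    The pause-trigger setup itself is only a few DAQmx property calls (a sketch; "Dev1" and the active level are assumptions, and `task` is the externally clocked AI task):
    // Hedged sketch: acquire only while the Valid-data line on PFI0 is high.
    DAQmxSetPauseTrigType(task, DAQmx_Val_DigLvl);     // digital-level pause
    DAQmxSetDigLvlPauseTrigSrc(task, "/Dev1/PFI0");    // Valid-data line
    DAQmxSetDigLvlPauseTrigWhen(task, DAQmx_Val_Low);  // pause while low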
    I hope this information helps. Good luck!

  • Multiple PCI-6110 in one computer - Minimum Computer Specs?

    I am building a computer system which will house 4 PCI-6110's. What minimum processor, RAM etc. is recommended. Also, can you mix the PCI bus speeds without adverse effects? Dell has a model that has 3 100MHz PCI slots and 1 30MHz slot. Would it be acceptable?
    Thanks in advance.

    Hello,
    In general, the specifications of a board are not dependent on the specifications of the computer. If boards use DMA, as all of our high-speed boards do now, the computer is not the limiting factor. The main factor is the software, not the hardware. If you have LabVIEW, refer to the LabVIEW Release Notes for your particular version for more information about the minimum system requirements.
    System performance is only an issue with single-point and continuous double-buffered operations, when the computer's ability to run a loop continuously is the only factor. Publishing specifications for double-buffered operations is essentially impossible because of the many variables present, such as buffer size, acquisition rate, bus type, bus activity, CPU activity, and so on. Speed is an issue for single-point I/O only with digital I/O and handshaking. National Instruments does publish specifications for typical transfer rates with actual computers for boards such as the DIO-96.
    Hardware specifications are not dependent on the system if the board uses DMA. The specifications are more dependent on your software and other processor activity occurring at the same time as the data acquisition.
    Regarding your second question, your PCI-6110 requires a bus that has a clock speed of 33 MHz, so it will definitely work in your first 3 slots, but I am not sure it will work in your 30 MHz slot, since that is too close to the required clock speed. It would be better if you can find a PC whose 4 slots all run at 33 MHz or faster, just to be sure they will all work.
    Please let me know if you have any further questions.
    Best regards,
    LA

  • External clocking on 6711 card

    Hi all,
    I just read the manual for the PCI-6711 card; it supports an external clock. Does that mean we can change the update rate by using a different external clock rate?

    chris88 wrote:
    Hi dragondriver,
    They are both correct: the PCI-6711 supports external clocking for analog output tasks, but for digital I/O tasks the timing can only be software-timed, meaning that you cannot use an external clock for digital tasks.
    The main difference between RTSI and PFI is the physical connection: PFI lines connect through the connector block you are using, while RTSI is connected through a RTSI bus cable used to synchronize several DAQ cards.
    You can choose any PFI <0 to 9>, and this article shows how to select which PFI line is the source of the sample clock.
    Regards
    Thanks for the clarification. I have only one doubt. Someone told me that the 6711 only supports software-timed digital pulse input/output, but that it supports externally clocked counter output. So when you said software-timed digital I/O tasks, do you mean all digital outputs, including counter output? Thanks.

  • External clocking problem with NI-DAQ 6534 after driver upgrade

    Hello newsgroup,
    I've got a problem with pattern generation using the NI-DAQ 6534.
    I'm using it in 32-bit output mode with external clocking.
    After changing the PC and the OS from a PII 333 MHz running Win98 to a P4 1.8 GHz running WinXP Pro, the use of new drivers was necessary, because the old driver doesn't work at all. The following problem occurs with the current driver: using my old program, external clocking doesn't work anymore, but internal clocking works fine. I'm using C++, not LabVIEW.
    I'm not sure if this is the right newsgroup for my problem. If not, please give me a hint as to where it belongs.
    Regards
    Michael

    Michael:
    What version of NI-DAQ did you have? What version did you upgrade to? Is your 6534 a PCI board or is it an AT/ISA based device? Have you changed your code at all? Your code should be fully compatible with a new version of NI-DAQ. Please provide more exact details of the problem so I can help further. Are you getting errors, or is it just not working?
    Sincerely,
    Brooks B
    Applications Engineer
    National Instruments

  • DAQ Card is inducing noise on an external clock signal resulting in false triggering

    I am using an optical encoder as an external clock source for analog measurements with a PCI-6036 card. The optical encoder signal is "filtered" using a Schmitt trigger circuit, and proper shielding practices are used on the encoder cable. I am using a recent version of LabVIEW.
    When the TTL square wave signal from the encoder is viewed on an oscilloscope (without the DAQ attached), it is a picture-perfect square wave, nothing that would cause any problems.
    When I then connect the encoder output to my PFI line (with or without the oscilloscope), I get false triggering due to intermittent (seemingly random...) high-frequency noise "blips". Out of 360 expected samples, I will typically get between 2 and 6 "bonus" samples... When captured on a scope, the noise looks like a decaying sine wave and lasts for only a few us. The peak magnitude is typically around 2 V as shown on the scope, which is apparently just enough to make my card grab a sample.
    Since the noise is only present when the encoder signal is attached to the DAQ, it seems that the DAQ is somehow inducing noise into the signal.
    I have been fighting this problem for a while now and tried the following:
    1) I first tried the raw encoder signal, but then added the Schmitt trigger, increased the signal strength of the encoder lines by adding resistors to ground, double-checked my shielding, etc.
    2) I verified that the ground potential between my card and my conditioning circuit was not causing problems. The conditioning circuit and encoder are now powered from the card itself, which should resolve any possible problem with grounds.
    3) Cursed at various inanimate objects (made me feel better but didn't help the situation).
    4) Checked whether I could set a minimum pulse width required to trigger off of an external scan clock (I can't with my hardware).
    5) Swapped my card with a card of a different type (problem is still there).
    If anybody out there has some recommendations, I am open to anything.

    Hello OSU_Mech_Eng,
    I'm not quite sure how the DAQ card could be inducing glitches into your digital signal. Digital signals from mechanical devices like quadrature encoders can often be glitchy or bouncy, but your Schmitt trigger should act as a debouncing filter to clean up that digital signal. It sounds like you have thoroughly troubleshot this problem, and I would recommend moving on and trying to use a counter on your DAQ board to generate the digital signal, rather than using the raw signal from your encoder/Schmitt trigger.
    You were correct when you stated that the best way to do this is by configuring your counter to perform retriggerable pulse generation. You can use the signal from your encoder to gate the internal clock on your DAQ card, creating a clean digital signal. By setting the minimum pulse width of the signal, you will be able to ignore the small glitches in your signal. Here is a link to some KnowledgeBase articles describing how to do this (a rough DAQmx sketch follows the links):
    How Do I Remove Glitches or Add a Debounce Filter to My Digital Signal?
    How Do I Define the Parameters for Pulse Generation in NI-DAQmx?
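    In DAQmx C terms, the retriggerable pulse generation might look roughly like this (a sketch with assumed device/terminal names; the KB articles above are the authoritative walkthrough):
    // Hedged sketch: ctr0 emits one clean 5 us pulse per encoder edge on PFI0;
    // use /Dev1/Ctr0InternalOutput as the AI sample clock instead of the raw
    // encoder signal. Glitches arriving during a pulse cannot retrigger it,
    // which is what filters them out.
    TaskHandle ctr = 0;
    DAQmxCreateTask("", &ctr);
    DAQmxCreateCOPulseChanTime(ctr, "Dev1/ctr0", "", DAQmx_Val_Seconds,
                               DAQmx_Val_Low, 0.0,   // idle low, no delay
                               5e-6, 5e-6);          // 5 us low, 5 us high
    DAQmxCfgImplicitTiming(ctr, DAQmx_Val_FiniteSamps, 1); // one pulse per trigger
    DAQmxCfgDigEdgeStartTrig(ctr, "/Dev1/PFI0", DAQmx_Val_Rising);
    DAQmxSetStartTrigRetriggerable(ctr, 1);          // re-arm after each pulse
    DAQmxStartTask(ctr);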
    For further reference, on all of NI's new M-Series DAQ cards (PCI-625x), the PFI circuitry contains built-in debouncing filters to protect against small glitches in digital signals. If you have an M-Series card lying around, it might be helpful to give that a try.
    I hope this helps,
    Travis Gorkin
    Applications Engineering
    National Instruments
    www.ni.com/support

  • FPGA Base Clocks PCI-5640R

    Hi,
    I am using an NI PCI-5640R. The following FPGA base clocks are available to me:
    (1)   Configuration_Clk
    (2)   RTSI_Ref_Clk
    (3)   DAC_0_IQ_Clk
    (4)   DAC_1_IQ_Clk
    (5)   ADC_0_Port_A_Clk
    (6)   ADC_1_Port_A_Clk
    Can anyone help me with brief descriptions of each clock? When is each clock to be used?
    Kindly send me some URL links where I can find details about the usage of these clocks.
    Thanks and Regards,
    Rashid 

    Hello Rashid,
    (1) Configuration_Clk - This is a 20 MHz onboard clock, which runs independently of the other clocks. Its role is to provide a fixed-frequency configuration clock that is used by the STC2 ASIC for PCI DMA operations. This clock is not synchronized to the 200 MHz VCXO or to the external clock.
    (2) RTSI_Ref_Clk - This is a device reference clock derived from the 200 MHz internal VCXO by setting a dividing factor of 1, 2, 4, 8, or 16.
    The I/Q clocks are the signals that indicate the rate of the baseband data: the rate of the data before the digital upconversion in the DAC, and the rate of the data after the digital downconversion in the ADC. They are described below.
    (3) DAC_0_IQ_Clk and (4) DAC_1_IQ_Clk -
        DAC_<i>_IQ_Clk = 2 × REFCLK_DAC<i> × ClockMultiplier_DAC<i> / Interpolation_DAC<i>
    where
    REFCLK_DAC<i> is the specified device reference clock divided by the N2 or N3 CDC divisor. Specify the divisor using the ni5640R CDC Program VI.
    ClockMultiplier_DAC<i> (M) is equal to 1, or 4 ≤ M ≤ 20. Configure the clock multiplier using the ni5640R DAC Program VI.
    Interpolation_DAC<i> is the hardware interpolation rate, determined by the DAC's fixed 4× interpolator times a programmable 2× to 63× CIC interpolating filter. The programmable CIC interpolator can be configured using the ni5640R DAC Profile VI.
    (5) ADC_0_Port_A_Clk and (6) ADC_1_Port_A_Clk -
        ADC_<i>_Port_A_Clk = ENC_ADC<i> × ClockMultiplier_ADC<i> / (PredivideFactor_ADC<i> × Decimation_ADC<i>)
    where
    ENC_ADC<i> is the device reference clock divided by the N0 or N1 CDC divisor. Specify the divisor using the ni5640R Configure Timebase VI.
    ClockMultiplier_ADC<i> (M) is equal to 1, or 4 ≤ M ≤ 20. Configure the clock multiplier using the ni5640R Input Port VI.
    PredivideFactor_ADC<i> (N) is equal to 1, 2, 4, or 8. Configure the predivide factor using the ni5640R Input Port VI.
    Decimation_ADC<i> is the decimation factor for a particular channel in the ADC. Decimation is performed in various filters throughout the processing channel. Each channel includes one CIC filter (decimates by 1 to 32), two FIR-HB filters (each decimates by 2), one DRC filter (decimates by 1 to 16), and one CRCF filter (decimates by 1 to 16). Configure all of these filters using the ni5640R ADC Configure DDC VI.
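    As a worked illustration of the ADC formula (the numbers here are made up, not from the manual): with ENC_ADC0 = 50 MHz, a clock multiplier of 4, a predivide factor of 2, and a total decimation of 8, ADC_0_Port_A_Clk = 50 MHz × 4 / (2 × 8) = 12.5 MHz.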
    The figure below shows how all the above clocks are derived. [figure not reproduced]
    Thanks
    NI-khil

  • External clock LabVIEW crash with Measurement Computing board

    Hi,
    I'm using a Measurement Computing PCI-DAS6402-16 board with LabVIEW (using Universal Library VIs) in a 4-stroke engine DAQ application. Amongst the internally clocked signals obtained, there is an in-cylinder pressure signal, which is timed by a crank-angle decoder sending an analog pulse signal.
    Although there seems to be no problem while the motor is running, it appears that when the externally clocked signal is not getting any pulses (while the motor is not running), LabVIEW stops responding and waits for the external clock to start; if that doesn't happen within 10 s, it crashes.
    I don't know whether it is a software problem (because the externally and internally clocked signals are both in the same while loop) or a hardware problem (because with an NI card there seem to be no problems).
    THNX in Advance

    Well, it couldn't be a more typical Windows crash:
    If a channel is externally timed with an analog pulse signal, and the VI starts running but the external clock isn't, then:
    - LabVIEW stops responding to any clicks (there is no .exe at this point), but it keeps working (I know that because if I start the ext. clock at this point it "un-sticks" and starts working).
    - But if for some seconds no ext. pulse signal is generated, then the LabVIEW app stops responding and I get the typical 'send error report' window.
    Could it be that the Universal Library for Measurement Computing boards has a bug? The same thing happens with the "Analog input External Clock.vi" example given with the Library... so it can't be my mistake. Or can it?!

  • Reading digital port with external clock at maximum speed

    I have a PCI-6509. I am programming a fast loop to read 32-bit values from the digital input using an external clock (injected into one of the card's pin inputs).
    The environment is Windows2000 + Visual C++ .
    basically I am doing
     DAQmxErrChk (DAQmxCreateTask("",&taskHandle));
     DAQmxErrChk (DAQmxCreateDIChan(taskHandle,"Dev1/port0:3","",DAQmx_Val_ChanForAllLines));
     DAQmxErrChk (DAQmxSetSampTimingType(taskHandle,DAQmx_Val_ChangeDetection));
     DAQmxErrChk (DAQmxCfgChangeDetectionTiming (taskHandle,"/Dev1/port5/line4:4", NULL, DAQmx_Val_FiniteSamps  , sampsRequested));
     DAQmxErrChk (DAQmxStartTask(taskHandle));
     getSystemTime...
     DAQmxErrChk (DAQmxReadDigitalU32(taskHandle,-1,-1,DAQmx_Val_GroupByChannel,data,sampsRequested,&sampsRead,NULL));
     getSystemTime...
    I pass a data buffer big enough to hold the number of samples I am requesting (for example, 10000 uInt32 values).
    It works fine. I get my values, but the speed I get is only around 10 kHz.
    I print the system time before and after the call to DAQmxReadDigitalU32, as you can see from the above code.
    This makes no sense, because my external clock on "/Dev1/port5/line4:4" (as selected in the ChangeDetectionTiming call) runs at 2.0 MHz.
    Could someone tell me what parameters I have to pass to this sequence of NI DAQ functions calls in order to really read at the speed of my external clock?
    many thanks in advance,
    Roberto AButer.
    Note:
    I am getting desperate with the web pages, documentation, online help, and the plethora of products and so on.
    I paid NI a considerable amount of money for the card, the LabVIEW software, etc., and I just want to write the simplest 20-line C program to read my digital signal at the speed it is being injected. Should that be that difficult????

    Hello caca,
    The board you are using is specified as a static I/O board. That means it was not designed to do high-speed DIO operations.
    You cannot use a hardware clock to time your input and output operations, so you have to use software timing or, as you chose, the change detection interrupt. But the maximum speed you can achieve this way is somewhat limited, of course.
    Check this thread for some more information.
    Ingo Schumacher
    Systems Engineer, Sound & Vibration, National Instruments Germany

  • How can I use external clock to implement a delay?

    Hi all,
    I am testing using an external clock to drive Dev/PFI0 (on a 6711 device), which is used as the clock for the analog output. I have in mind two applications for the external clock, but I don't have much of an idea about the implementation yet.
    First of all, I have a sequence of data (an array) with each sample being sent at an interval of 1 us. I use an external clock (10 MHz) driving PFI0, so it is pretty easy to achieve that goal. I am wondering what happens if I want each sample to be sent at a different time. For example, if I have 5 samples, I want the first one sent 1 us after the task starts, then wait 2 us to send the 2nd sample, wait 5 us to send the 3rd sample, wait 11 us to send the 4th sample, and wait 1 us to send the last sample. Is it possible to achieve that based on the external clock?
    The second question is about delay. My code requires critical timing, and the built-in delay doesn't behave very well because I am running on Windows. I can increase the priority of the VI to highest; it helps a bit, but it's still not perfect. I am wondering if it is possible to implement a hardware delay based on the external clock. Any ideas?

    Hello dragondriver,
    To answer your first question: yes, you could send data in that fashion. You would have to programmatically build a pulse train and use that to trigger the sending of data (see the sketch below). The answer to the second question is essentially the same: you should be able to programmatically build a pulse train with a delay and use it as a trigger to begin whatever operation you have.
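    A hedged sketch of such a pulse train in DAQmx C terms (names are assumptions, and note that some older DAQ-STC counters may not support buffered, variable-tick pulse generation, so verify this against your board): the counter counts ticks of the 10 MHz clock on PFI0 and emits pulses with periods of 1, 2, 5, 11, and 1 us, so its output can serve as the AO sample clock.
    // Hedged sketch: finite, variable-period pulse train timed by the external
    // 10 MHz clock; route /Dev1/Ctr0InternalOutput to the AO task as its
    // sample clock source. Each period = (high + low) ticks of 0.1 us.
    TaskHandle ctr = 0;
    uInt32 high[5] = {2, 2, 2, 2, 2};          // 0.2 us high per pulse
    uInt32 low[5]  = {8, 18, 48, 108, 8};      // totals: 1, 2, 5, 11, 1 us
    DAQmxCreateTask("", &ctr);
    DAQmxCreateCOPulseChanTicks(ctr, "Dev1/ctr0", "", "/Dev1/PFI0",
                                DAQmx_Val_Low, 0, 8, 2); // placeholders, overwritten below
    DAQmxCfgImplicitTiming(ctr, DAQmx_Val_FiniteSamps, 5);
    DAQmxWriteCtrTicks(ctr, 5, 0, 10.0, DAQmx_Val_GroupByChannel,
                       high, low, NULL, NULL);
    DAQmxStartTask(ctr);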
    Jonathan L.
    Applications Engineer
    National Instruments

  • Is the PXIe-PCIe8361 adequate for this system? And external clock questions...

    Hi all,
    I have spent some time piecing together a system and I'd like a sanity check before pulling the trigger on this purchase.  The system will contain the following hardware:
    1. Chassis: PXIe-1078
    2. Controller: PXIe-PCIe8361
    3. 3 x PXIe-6363 (16 analog inputs each card, 32 digital inputs each card, all internally clocked @ 10kHz)
    4. 2 x PXI-6224 (32 digital inputs on one, 8 digital inputs on the other, externally clocked in "bursts" of 62.5 kHz)
    5. Labview software
    The three PXIe-6363 cards will be responsible for a mix of analog and digital measurements made @ 10 kHz, timed continuously by the onboard clock.
    One PXI-6224 will be clocked externally @ 62.5 kHz and will be used to collect digital data on a 32-bit port.  These clock pulses will not be continuous, but will occur in bursts lasting for 2ms every 20ms.
    The other PXI-6224 will be clocked externally @ 62.5 kHz as well and will be used to collect digital data on an 8-bit port. These clock pulses will not be continuous, but will occur in bursts lasting for 2 ms at random intervals.
    My questions are:
    1. Am I planning anything that looks unreasonable for this hardware?
    2. Should I expect issues with data transfer rates with the PXIe-PCIe8361? I will be operating well within the advertised 110 MB/s throughput of the device. I plan to stream using this method: NI Fast TDMS data streaming.
    3.  I have only ever used NI cards for continuous measurements made by an onboard clock.  When I set up a task to collect data that is externally-timed, will the DAQ be expecting a "continuous" clock pulse, or will the system wait patiently for clock pulses to arrive at any rate (any rate within the spec of the card, of course)?
    Thanks, any input is appreciated.

    Hello LucasH0011
    1. As long as you put the PXI-6224 and the PXIe-6363 cards in the corresponding slots, meaning the express (PXIe-6363) in the express slots and the hybrid-compatible (PXI-6224) in the hybrid slots, you should be fine.
    2. I think you would not have issues with the transfer rate.
    3. Your timing specifications sound reasonable to me; I think you will be fine.
    Here is a document that has useful concepts for the use of cards:
    http://www.ni.com/white-paper/3615/en/
    It is for the M-Series, but the concepts apply to the X-Series as well. 
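    Regarding question 3: an externally clocked task simply waits for edges, however bursty. A minimal DAQmx C sketch (device and terminal names are assumptions):
    // Hedged sketch: 32-bit DI on port0, paced by a bursty external clock on
    // PFI0. The rate argument is only a buffer-sizing hint; the read waits
    // indefinitely for edges to arrive.
    TaskHandle task = 0;
    uInt32 data[4096];
    int32 read = 0;
    DAQmxCreateTask("", &task);
    DAQmxCreateDIChan(task, "Dev2/port0", "", DAQmx_Val_ChanForAllLines);
    DAQmxCfgSampClkTiming(task, "/Dev2/PFI0", 62500.0, DAQmx_Val_Rising,
                          DAQmx_Val_ContSamps, 100000);
    DAQmxStartTask(task);
    DAQmxReadDigitalU32(task, 4096, DAQmx_Val_WaitInfinitely,
                        DAQmx_Val_GroupByChannel, data, 4096, &read, NULL);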
    Regards 
    Ernesto
