PXI Watchdog

I'd like some information on the PXI-8108 watchdog. Namely, is it an independent piece of hardware? If so, what can it do when it trips? Also, is there any undervoltage lockout capability on the PXI-8108?
Phillip Marks

Hi gtg811q-
     The purpose of the Watchdog Timer is to provide a software-independent means of recovery upon failure of a system or device. The Watchdog Timer consists of a binary counter and control registers that assert fixed control signals upon timeout of the counter. Here is another KnowledgeBase article that discusses what a watchdog timer does.
     In terms of undervoltage lockout capability, I think what you are asking is whether the 8108 will shut down if the power supply is below spec. If so, then yes: the 8108 will shut down without the proper minimum voltage.
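     To make the counter-and-reset idea concrete, here is a toy simulation in C of the usual watchdog pattern (purely illustrative; this is not the 8108's actual register interface, and the constants are made up). The application reloads the counter while healthy; if it hangs and stops reloading, the counter expires and the watchdog asserts its recovery signal:

     #include <stdbool.h>
     #include <stdio.h>

     #define WDT_RELOAD 5            /* made-up reload value, in ticks */

     static int wdt_counter = WDT_RELOAD;

     /* The application calls this periodically while healthy. */
     static void wdt_kick(void) { wdt_counter = WDT_RELOAD; }

     /* The hardware decrements the counter every tick; on expiry it
        asserts a fixed control signal (simulated here by a message). */
     static bool wdt_tick(void) {
         if (--wdt_counter <= 0) {
             printf("Watchdog expired: asserting system reset\n");
             return true;
         }
         return false;
     }

     int main(void) {
         for (int t = 0; t < 20; t++) {
             if (t < 8) wdt_kick();  /* healthy: the app keeps kicking */
             /* after t == 8 the "app" hangs and stops kicking */
             if (wdt_tick()) break;  /* fires about WDT_RELOAD ticks later */
         }
         return 0;
     }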
     I hope this helps.  Best of luck with your application!
Gary P.
Applications Engineer
National Instruments
Visit ni.com/gettingstarted for step-by-step help in setting up your system.

Similar Messages

  • Booting PXI automatically on receiving a trigger

    Hi all
    I have 2 PXI systems which are redundant to each other; I mean, if one PXI fails, the other should boot up automatically. Is this possible in any way?
    I don't want to run both PXI systems simultaneously, because this leads to some other problems.
    Thanks for any reply
    Regards
    Arun

    Hi Arun,
    John's suggestion is good and is definitely the most straightforward way to get the functionality you need. Because you mentioned that you didn't want to have both PXI chassis powered simultaneously for unspecified reasons, I'll throw out another suggestion which may suit your needs.
    Most of the PXI chassis from NI feature a DB-9 connector on the back with connections to the power supply for power monitoring or inhibiting via an external signal. There is a white paper with some details:
    Taking Advantage of the Power Monitoring and Remote Power Reset Capabilities of PXI-1000B, PXI-1006,...
    Although it refers to some of our older chassis, the feature is found on nearly all of our newer PXI chassis, and further details can be found in the user manual for the chassis you are using.
    You would need to feed a level signal from System #1 into the DB-9 connector of System #2 to inhibit power; but if you are planning to use the PXI Watchdog, which fires a pulse when System #1 fails, you may need to build some type of external circuit that can accept a pulse trigger and change the state of a level signal that you then feed into the DB-9 on the back of System #2.
    Cheers,
    Josh H.

  • PXIe-8135 Watchdog

    Hello,
    I'm putting together a testing rack that will be used to control a system during testing. The heart of the rack is a PXIe-1065 chassis with a PXIe-8135 controller. The rack will be controlled remotely through Remote Desktop into the controller (this is necessary because the test involves pressurizing sections of the system high enough that it is dangerous for any personnel to be in the room while the system is pressurized). The test control software will not be LabVIEW, but our own software that interfaces with the modules in the PXIe chassis.
    I would like a watchdog in the rack to monitor the PXIe controller and restart it without any need for someone to go in the room in the event that the system freezes up or otherwise becomes unresponsive. I'm aware of some possible solutions for a remote restart (Intel AMT, remote-controlled power strip), but response times on those could be too slow depending on the state of the rack when it goes unresponsive. I'm fairly certain I'm not the first to do this, so I was hoping to get some info and ideas.
    From browsing the manual, it looks like there are watchdogs internal to the controller, but these are not accessible unless you're using LabVIEW RT. Does anybody know if that is correct?
    Are there any other options for a watchdog internal to the PXIe controller?
    If not, any recommendations for an external watchdog? I assume it would have to connect to the inhibit connector on the back and connect the proper pins if it detects a problem.
    Thanks!

    Hi pghohnst,
    It is possible to control the power of a PXI chassis with the DB-9 connector found on the back of the chassis. More information can be found in the KnowledgeBase article below.
    Remote Power of a PXI Chassis: http://digital.ni.com/public.nsf/allkb/FF5AB8BB6A1157DB8625756D00502D55?OpenDocument
    Regards,
    Jason D
    Applications Engineer
    National Instruments

  • PXI-6528 Watchdog Timer parameters

    I am using the PXI-6528 DIO in a PXI chassis. This board is an opto-isolated TTL DIO. The card has an on-board watchdog timer. I have DAQmx Generate - Digital Line Output (PXI-6528) steps in Sequence steps with "reuse hardware" permitted, which set lines high or low as needed for commanding solid-state relays (SSRs). When the project runs, the behavior non-deterministically sets all lines low when transitioning to the next Sequence step. If I run each DO step independently, it works perfectly every time.
    To investigate this phenomenon, I did a simple bench check to rule out SE software errors. I simply patched from the DO port to the DI port on the same card, then added a DI Acquire step just after every DO step to electrically read the states. The result is that the DO port really is going low regardless of whether the SE software programmed it high or low! It seems to me that the only thing that can override the SE logic is the watchdog timer.
    Reading this NI White paper:
    http://www.ni.com/white-paper/14616/en/#toc4
    It seems that it is critical to configure the watchdog timer to achieve stable behavior from the 6528 card. BUT neither MAX, the DAQ Assistant, nor SE has an obvious way to configure the watchdog timer.
    QUESTION: Within the SE-DAQmx-Generate-Digital Line Output step, what are the parameters that control or disable the watchdog timer for the PXI-6528?

    Update:
    I have found more information at this URL:
    http://zone.ni.com/reference/en-XX/help/370471AC-01/cdaqmxsupp/pci-6528/
    This is the C programming reference specific to the PXI-6528. Reviewing the watchdog timer properties, we find that the "Timeout" property can disable the timer with a value of -1. In the DAQmx Assistant GUI for DO, the Advanced Timing tab does have a Timeout property which will accept a value of -1; however, one must set the Generation Mode to N Samples to activate the Timeout input field. As the 6528 fundamentally cannot generate clocked samples, we must then set the Generation Mode back to 1 Sample (On Demand), with the effect that the Timeout property is greyed out on the GUI. Does this signify that the -1 value for Timeout is ignored?
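    For what it's worth, the watchdog on watchdog-capable DAQmx devices can also be configured explicitly through the DAQmx C API, bypassing the DAQ Assistant GUI entirely. Below is a minimal sketch (the device name "PXI1Slot4" and the 0.5 s timeout are assumptions; check your device name in MAX):

    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle wd = 0;

        /* Create a watchdog task: if it is not reset within 0.5 s,
           port0/line0 is driven low. Line/state pairs are passed as
           varargs, terminated by NULL. */
        DAQmxCreateWatchdogTimerTask("PXI1Slot4", "wdTask", &wd, 0.5,
                                     "PXI1Slot4/port0/line0", DAQmx_Val_Low,
                                     NULL);
        DAQmxStartTask(wd);

        /* Application loop: reset ("pet") the timer faster than the
           timeout for as long as the software is healthy. */
        DAQmxControlWatchdogTask(wd, DAQmx_Val_ResetTimer);

        DAQmxClearTask(wd);  /* clearing the task disarms the watchdog */
        return 0;
    }

    As far as I can tell, the hardware watchdog only runs while such a task is armed, so if no watchdog task exists, the greyed-out Timeout value should be moot.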

  • Unable to install the RT Watchdog on a PXI-8106 RT controller

    Hi,
    After upgrading my NI Developer Suite from rel. 8.2 to 8.2.1 (clean install, meaning uninstalling rel. 8.2 and freshly installing rel. 8.2.1, incl. DAQmx 8.5), I am installing the software through MAX on the new PXI-8106 RT controller. Everything is working okay except the installation of the NI Watchdog 2.1.5f0. For some reason, the module does not appear in the listing of National Instruments Real-Time software displayed by MAX, but a check of the LabVIEW installation on the PC (Windows XP SP2) confirms that all elements are available on the host. A second installation does not help, and the module NI_Watchdog still does not appear in the listing.
    What is wrong? Does the new PXI-8106 controller not support the watchdog timer function?
    Thanks for any reply
    Philippe


  • Controlling PXI-8110 watchdog

    We are running a resource-intensive application on the PXI-8110, and during operation the 8110 reboots. I believe this may be a built-in watchdog function? Is there any way to control this? Thanks.

    Hi Wirehead,
    You're right that the 8110 includes an onboard watchdog timer device, but this device won't reboot the system unless you explicitly arm it to do so. In other words, the timer must be enabled by software (using LabVIEW VIs); otherwise it cannot restart the system.
    What OS are you using, and what is your application doing when it restarts?  Does it always restart at the same point of execution?
    Thanks,
    Eric G.
    National Instruments R&D

  • Watchdog PXI-8184 on Windows to reboot the PXI when the application is frozen

    Hi,
    I have a LabVIEW application running under Windows XP on a PXI-8184 controller. I want to know if it is feasible to trigger some kind of watchdog on this PXI controller in my application, to check whether Windows or my application is frozen.
    Thanks
    Alexandre Boyer

    Hey Alexandre,
    If you want your system to reboot automatically if it crashes, you can set this up in Windows. Go to My Computer>>Properties>>Advanced Tab>>Startup and Recovery>>Settings and check the "Automatically Restart" box.  This should reboot the system controller whenever the system stops unexpectedly.
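    If you prefer to script that setting (e.g., when deploying several controllers), the same Windows recovery option can, to my knowledge, be toggled from an elevated command prompt via WMI; this is standard Windows administration, nothing PXI-specific:

    wmic recoveros set AutoReboot = True

    Note that this only covers OS crashes (blue screens); it will not catch an application hang, which still needs a watchdog-style solution.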
    I hope this helps,
    Jason W
    Application Engineer
    National Instruments

  • How to use SMB trigger input on PXI-8187 in LabVIEW

    Hello!
    I have a PXI-8187 controller installed in a PXI-1031 chassis and two PXI-4071 digital multimeters. I want to trigger the measurement with an external trigger. One way is to use the AUX port on the DMM; another could be the TRIG input on the PXI-8187 CPU unit. Is it possible to use the latter in LabVIEW 7? How?
    Best regards!

    There are multiple reasons why this is not a supported feature for triggering under Windows and why NI does not recommend using the controller's SMB connector for this purpose. National Instruments has hardware that was designed specifically for extremely accurate clock generation, synchronization, and triggering between PXI chassis (PXI-665x) and other recommended alternatives listed below.
    Non-deterministic propagation delays
    1. The controller's SMB input is not guaranteed to have a defined propagation delay to the backplane. This means the SMB trigger on the Windows PXI controller is useless for customers that want deterministic triggering. Unlike our PXI-665x boards which have a maintained API and precise triggering properties, the PXI controller's SMB connector and routing circuitry is intended for Watchdog use under LabVIEW RT and isn't designed for this use case. A trigger propagating through the SMB circuitry and going to the backplane, or vice-versa, could have a great deal of variance in the propagation delay which makes gathering useful triggers impossible. This propagation delay is not something National Instruments specs.
    An excellent available workaround: Use the measurement hardware's trigger inputs
    2. The triggering inputs on our measurement hardware (which customers likely have in their system) can route triggering signals to the PXI backplane (usually through PFI lines) with deterministic results. This functionality is fully supported in most of our APIs and will be maintained between generations of devices (see the routing sketch after this list).
    NOT backward or forward compatible
    3. The SMB input is NOT guaranteed to be compatible between different versions of our PXI controllers under Windows. The hardware properties, along with an excellent workaround, are the reasons we do not support routing triggers through the SMB connector on our PXI family of embedded controllers under Windows. If you are still interested in the beta software despite the shortcomings, visit www.ni.com/support and submit an e-mail request.
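    To illustrate point 2: routing a trigger from a device's PFI terminal onto a PXI backplane trigger line is a one-call operation in the DAQmx C API. A minimal sketch (the terminal names are assumptions; substitute your own device):

    #include <NIDAQmx.h>

    int main(void)
    {
        /* Route the signal present on PFI0 of the module in slot 2 onto
           backplane trigger line 0, where other modules can receive it. */
        DAQmxConnectTerms("/PXI1Slot2/PFI0", "/PXI1Slot2/PXI_Trig0",
                          DAQmx_Val_DoNotInvertPolarity);

        /* ... run acquisitions that use PXI_Trig0 as their trigger ... */

        DAQmxDisconnectTerms("/PXI1Slot2/PFI0", "/PXI1Slot2/PXI_Trig0");
        return 0;
    }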
    Adam Ullrich
    PXI/VXI Product Support Engineer
    National Instruments

  • Watchdog blocks and the 6514

    We have an NI system with a 6514 DAQ board as part of a real-time PXIe-1075 chassis. Our system design requires a safety feature implemented with a watchdog timer that will send a digital output signal in the event of the software locking up. The dynamics of our system are on the order of milliseconds response time, which is well within the capability of the hardware. I've seen a watchdog example for the 6514, but looking at it more closely has raised more questions in my mind that I'd like to ask. The example was found by filtering to the 6514 hardware platform.
    (1) The example uses non-real-time watchdog blocks; should I be using the real-time version of these watchdog blocks? Are the non-real-time versions specific to the 6514, which does support watchdog functionality?
    (2) Should I consider using a real-time loop structure in this code? I believe I should, and that this loop should get the highest priority in the task scheduling of the real-time process, as it is a safety feature of the system.
    Thanks for any feedback
    Rob

    From the content of your question, I suspect that you may be mistaken about the watchdog function built into the 6514. You will require an external digital signal from an external watchdog module to operate in conjunction with the NI card.
    WDT205 Watchdog Timer Module
    The above link is such a module. I have personally never used this particular module, but its specs look good, and the price is YAYYYY!  The ones I have used are 3 to 4 times the price.
    As a general rule, you should never consider doing watchdog functionality in software, simply because if your software crashes and you want the watchdog to catch it, well, what can your software do if it has already crashed?
    Anyhow, my two bits for what it may be worth.
    Good luck,
    Dave

  • PXI-6528 board status

    Hi,
    I want to develop my own Linux driver for the PXI-6528. Can anyone tell me how I can read the board status, i.e., the register address of the board status register?
    Thanks
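    NI has not published the 6528 register map in its public manuals (register-level details historically shipped with the NI Measurement Hardware DDK), so any offset below is a placeholder. Purely as a sketch of the mechanics, a user-space test on Linux can map the board's PCI BAR through sysfs and read a register at an assumed offset:

    #include <fcntl.h>
    #include <stdint.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <unistd.h>

    int main(void)
    {
        /* Both the PCI address and the register offset are placeholders:
           find the board's address with lspci, and get the real offsets
           from the Measurement Hardware DDK documentation. */
        int fd = open("/sys/bus/pci/devices/0000:05:00.0/resource0", O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }

        volatile uint8_t *bar0 = mmap(NULL, 4096, PROT_READ, MAP_SHARED, fd, 0);
        if (bar0 == MAP_FAILED) { perror("mmap"); return 1; }

        uint32_t status = *(volatile uint32_t *)(bar0 + 0x0); /* assumed offset */
        printf("status register: 0x%08x\n", status);

        munmap((void *)bar0, 4096);
        close(fd);
        return 0;
    }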


  • PXI-6528

    Dear all
    I am using the PXI-6528 for my project. In MAX the card passes the self-test, and pins 49 and 50 show 5 V, but when I write the digital port high, the corresponding pins show 0 V only. I also want to know how to configure the power-on states on this card (0-60 VDC).
    Thanks
    Venkateswaran V
    Automation Engr
    Pricol Ltd


  • Poor PXI IO performance on Latitude E6410 with ExpressCard 8360

    Hello,
    I have a Dell Latitude E6410 with a Core i5 M520 which is giving me very poor I/O performance when using an ExpressCard 8360 card to connect to a PXI rack.
    The sustained I/O rate that I can get appears to be about 1/3 of what I can get using the same ExpressCard on a Dell Latitude E6400 (with a Core 2 Duo processor).
    I am using the A05 BIOS (latest at the time of writing) on the E6410.
    Wade.

    I am running Windows XP (32 bit) sp3 in both cases.
    The E6410 has 4GByte of memory fitted.
    The E6400 has 2GByte of memory fitted.
    I have also used the same ExpressCard 8360 via a PXIe-to-ExpressCard adapter in a desktop machine, with performance figures similar to the E6400's - i.e. much better than the E6410's.
    The Desktop Machine is an HP Compaq D7900 with 4GByte of memory, Core2Duo E8500 also running Windows XP sp3 (32 bit).
    Also, on the Desktop, I am running NI PXI Platform Services 2.3.2 and NI-Visa runtime version 4.3.
    On the E6410, I am running NI PXI Platform Services 2.5.2 and NI-Visa runtime version 4.6.
    I no longer have access to the E6400, so I am not sure what software versions were installed. However, they are unlikely to be newer than the versions installed on the E6410.
    Wade.

  • Choosing a PXIe controller for streaming 200 MBps

    Warning: This is a long post with several questions. My apologies in advance.
    I am a physics professor at a small liberal-arts college, and will be replacing a very old multi-channel analyzer for doing basic gamma-ray spectroscopy. I would like to get a complete PXI system for maximum flexibility. Hopefully this configuration could be used for a lot of other experiments, such as pulsed NMR. But the most demanding role of the equipment would be gamma-ray spectroscopy, so I'll focus on that.
    For this, I will need to be measuring either the maximum height of an electrical pulse, or (more often) the integrated voltage of the pulse.  Pulses are typically 500 ns wide (at half maximum), and between roughly 2-200 mV without a preamp and up to 10V after the preamp.  With the PXI-5122 I don't think I'll need a preamp (better timing information and simpler pedagogy).  A 100 MHz sampling rate would give me at least 50 samples over the main portion of the peak, and about 300 samples over the entire range of integration.  This should be plenty if not a bit of overkill.
    My main questions are related to finding a long-term solution and keeping up with the high data rate. I'm mostly convinced that I want the NI PXIe-5122 digitizer board and the cheapest (8-slot) PXIe chassis. But I don't know which controller to use, or which software environment (LabVIEW / LabWindows / homebrew C++). This system will likely run about $15,000, which is more than my department's yearly budget. I have special funds to accomplish this now, but I want to minimize any future expenses in maintenance and updates.
    The pulses to be measured arrive at random intervals, so performance will be best when I can still measure the heights or areas of pulses arriving in short succession. Obviously if two pulses overlap, I have to get clever and probably ignore them both. But I want to minimize dead time - the time after one pulse arrives before I become receptive to the next one. Dead times of less than 2 or 3 microseconds would be nice.
    I can imagine two general approaches.  One is to trigger on a pulse and have about a 3 us (or longer) readout window.  There could be a little bit of pileup inspection to tell if I happen to be seeing the beginning of a second pulse after the one responsible for the trigger.  Then I probably have to wait for some kind of re-arming time of the digitizer before it's ready to trigger on another pulse.  Hopefully this time is short, 1 or 2 us.  Is it?  I don't see this in the spec sheet unless it's equivalent to minimum holdoff (2 us).  For experiments with low rates of pulses, this seems like the easiest approach.
    The other possibility is to stream data to the host computer, and somehow process the data as it rolls in.  For high rate experiments, this would be a better mode of operation if the computer can keep up.  For several minutes of continuous data collection, I cannot rely on buffering the entire sample in memory.  I could stream to a RAID, but it's too expensive and I want to get feedback in real time as pulses are collected.
    With this in mind, what would you recommend for a controller? The three choices that seem most reasonable to me are an embedded controller running Windows (or Linux?), an embedded controller running the LabVIEW Real-Time OS, or a fast interface card like the PCIe-8371 and a powerful desktop PC. If all options are workable, which one would give me the lowest cost of upgrades over the next decade or so? I like the idea of a real-time embedded controller because I believe any run-of-the-mill desktop PC (whatever IT gives us) could connect and run the user interface, including data display and higher-level analysis. Is that correct? But I am unsure of the life span of an embedded controller, and am a little wary of the increased cost and need for periodic updates. How are real-time OS upgrades handled? Are they necessary? Real-time sounds nice and all that, but in reality I do not need to process the data stream in a real-time environment. It's just the computer and the digitizer board (not a control system), and both should buffer data very nicely. Is there a raw performance difference between the two OSes available for embedded controllers?
    As for live processing of the streaming data, is this even possible? I'm not thinking very precisely about this (I would really have to just try and find out), but it seems like it could possibly work on a 2 GHz dual-core system. It would have to handle 200 MBps, but the data processing is extremely simple. For example, one thread could mark the beginnings and ends of pulses and do simple pile-up inspection. Another thread could integrate the pulses (no curve fitting or interpolation necessary, just simple addition) and store the results in a table or list. Naively, I'd have not quite 20 clock cycles per sample. It would be tight. Maybe just getting the data into the CPU cache is prohibitively slow. I'm not really even knowledgeable enough to make a reasonable guess. If it were possible, I would imagine that I would need to code it in LabWindows/CVI and not LabVIEW. That's not a big problem, but does anyone else have a good read on this? I have experience with C/C++, and some with LabVIEW, but not LabWindows (yet).
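    (For concreteness, the per-pulse work described above amounts to something like the following single-pass C sketch: threshold crossing, integration, and a crude width-based pile-up rejection. The threshold and window length are made-up numbers, and a real implementation would pull blocks from the digitizer's fetch API rather than a synthetic buffer.)

    #include <stdint.h>
    #include <stdio.h>

    #define THRESHOLD 100   /* ADC counts; made-up value      */
    #define MAX_WIDTH 300   /* max samples per pulse; made up */

    /* Scan a block of samples once, integrating each pulse that
       crosses THRESHOLD; overly wide pulses are treated as pile-up. */
    static void process_block(const int16_t *s, size_t n)
    {
        size_t i = 0;
        while (i < n) {
            if (s[i] <= THRESHOLD) { i++; continue; }
            int64_t area  = 0;
            size_t  start = i;
            while (i < n && s[i] > THRESHOLD)
                area += s[i++];                 /* simple addition */
            if (i - start <= MAX_WIDTH)         /* else: pile-up   */
                printf("pulse at %zu, area %lld\n", start, (long long)area);
        }
    }

    int main(void)
    {
        int16_t buf[1000] = {0};
        for (int k = 0; k < 50; k++) buf[200 + k] = 500;  /* fake pulse */
        process_block(buf, 1000);
        return 0;
    }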
    What are my options if this system doesn't work out? The return policy is somewhat unfriendly, as 30 days may pass quickly while I struggle with the system while teaching full time. I'll have some student help and eventually a few long days over the summer. An alternative system could be built around XIA's Pixie-4 digitizer, which should mostly just work out of the box. I somewhat prefer the NI PXI-5122 solution because it's cheaper, has better performance and much more flexibility, and suffers less from vendor lock-in. XIA's software is proprietary and very costly. If support ends or XIA gets bought out, I could be left with yet another legacy system. Bad.
    The Pixie-4 does the peak detection and integration in hardware (FPGAs I think) so computing requirements are minimal.  But again I prefer the flexibility of the NI digitizers.  I would, however, be very interested if data from something as fast as the 5122 could be streamed into an FPGA-based DSP module.  I haven't been able to find such a module yet.  Any suggestions?
    Otherwise, am I on the right track in general on this kind of system, or badly mistaken about some issue?  Just want some reassurance before taking the plunge.

    drnikitin,
    The reason you did not find the spec for the rearm time for the 5133 is that the USB-5133 is not capable of multi-record acquisition. The rearm time is a spec for the reference trigger, and that trigger is used when fetching the next record. So every time you want to do another fetch, you will have to stop and restart your task.
    To grab a lot of data, increase your minimum record size. Keep in mind that you have 4 MB of onboard memory per channel.
    Since you will only be able to fetch 1 record at a time, there really isn't a way to use streaming. When you call fetch, it will transfer the amount of data you specify to PC memory through the USB port (up to 12 MB/s for USB 2.0, ideally).
    Topher C,
    We do have a digitizer that has onboard signal processing (OSP), which would be quicker than performing post-processing. It is the NI 5142, which is essentially a 5122 but with built-in OSP. It may be a little out of your price range, but it may be worth a look.
    For more information on streaming, take a look at these two links (if you haven't already):
    High-Speed Data Streaming: Programming and Benchmarks
    Streaming Options for PXI Express
    When dealing with different LabVIEW versions, it is important to note that previous versions will be compatible with new versions, such as going from 8.0 to 8.5. Keep in mind that if you go too far back then LabVIEW may complain, but you still may be able to run your VI. If you have a newer version going to an older version, we do have options in LabVIEW to save your VI for older versions. It's usually just 1 version back, but in LabVIEW 8.5 you can save for LabVIEW 8.2 and 8.0.
    ESD,
    Here is the link I was referring to earlier about DMA transfers. DMA is actually done every time you call a fetch or read function in LabVIEW or CVI (through NI-SCOPE).
    Topher C and ESD,
    LabVIEW is a combination of a compiled language and an interpreted language. Whenever you make a change to the block diagram, LabVIEW compiles itself, so that when you hit run, it is ready to execute. During execution, LabVIEW uses the run-time engine to reference shared libraries (such as DLLs). Take a look at this DevZone article about how LabVIEW compiles its block diagram (user code).
    I hope all of this information helps!
    Ryan N
    National Instruments
    Application Engineer
    ni.com/support

  • Convert PXIe-8135 controller to dual-boot Windows 7 and LabVIEW RT

    Hello. I have a PXIe-8135 controller that originally was just running Windows 7. We are trying to convert it to a dual-boot system that also runs LabVIEW Real-Time. (There is a host computer that will run LabVIEW 2014 with the RT module, and the controller will become a target.)
    I have created a FAT32 partition on the hard drive of the controller. Now, I’m trying to install the real-time OS with a USB flash drive made using the MAX utility, but I cannot boot using the USB drive for some reason. I keep getting the message “waiting for USB device to initialize”.  
    In BIOS, legacy USB support is [ENABLED] and boot configuration is set to [Windows/other OS]. I’ve tried removing the drive, waiting, and reinserting. I’ve tried two different USB drives (both 8 GB, different brands).
    I’m not sure what to do next. Apart from the USB boot issue, is converting the PXIe-8135 even possible?  I read about SATA/PATA hard drive issues with older controllers, but I don't know about this one.
    Thanks, in advance, for your help!
    -Jeff

    Per Siana's licensing comment, more information on purchasing a deployment license if you do not have one for this target can be found here.
    The RT Utility USB key is used to set up non-NI hardware with LabVIEW Real-Time, but you should not need it in this situation to convert to dual-boot (*). Try this:
    1. Since you already have a FAT32 partition created, go into BIOS setup and change to booting 'LabVIEW RT'.
    2. The system will attempt to boot LabVIEW RT, see that the partition is empty, and switch over into LabVIEW RT Safe Mode. (This safe mode is built into the firmware, which is why you don't really need the USB key.)
    3. The system should come up correctly and be detectable from MAX, and you can proceed with installing software.
    4. To switch back to Windows, go back to BIOS setup and choose 'Windows/Other OS'
    (*) One area where the USB key is helpful on a dual-boot system is with formatting the disk. If you want to convert the partition designated for LabVIEW RT from FAT32 to Reliance, the USB key lets you attempt to format that single partition and leave the rest of the disk untouched. If you format from MAX, the standard behavior is to format only one RT partition if found; but if none is found, it will format the entire disk. Formatting from MAX on a dual-boot system is consequently riskier, and you could lose your Windows partition.

  • Start and Stop Trigger using PXI-6120 and DigitalStartAndStopTrigger.vi not working :-(

    Hello,
    I've been trying for a while now to get my PXI unit to capture a waveform between a start and a stop (reference) trigger, using the NI example DigitalStartAndStopTrigger.vi downloaded from the NI website. However, whilst the start and stop triggers seem to be working - i.e. the VI runs and stops at the correct times - there is never any data read from my DAQmx-compatible PXI-6120 card. So I can see the VI is running around the acquisition loop, but the property node AvailSampPerChan is always returning zero... this has me slightly puzzled. I thought this might just be a driver issue, so I've updated my box to the following software versions (see below) and installed the latest drivers, e.g. DCDNov07.exe (also from the NI site), but nothing has changed.
    My software as of now:
    Labview 7.1 (with the 7.1.1 upgrade applied)
    Max 4.3.0.49152
    DAQmx 8.6.0f12
    Trad DAQ 7.4.4f7
    Before I updated, I had the same problem with the following versions:
    Labview 7.1 (with the 7.1.1 upgrade applied)
    Max 4.2.1.3001
    DAQmx 8.5.0f5
    Trad DAQ 6.9.3f4
    So to cut a long story short I still have the same problem with the triggers... does anybody have any ideas what is going wrong?
    To add insult to injury, the traditional DAQ example ai_start-stop_d-trig.vi was almost working correctly before I did the upgrade. It had the strange behaviour of capturing the AI0 channel but on the wrong edges (e.g. if I set Start on Rise and Stop on Fall, it would do the opposite: Start on Fall and Stop on Rise).
    I'm going to leave my box doing a mass compile overnight, but I'd really like it if someone could suggest a solution or point me in the right direction.
    Many thanks,
    Mike

    Hi Graham
    I'm out of the lab today but I'll try and answer your questions as best I can...
    1) What are the values you have set for Buffer size, Rate, samples per read and post trigger Samples?
    At the moment I have all the values (e.g. sample rate, buffer size etc) unchanged apart from the ones I mentioned in my previous post (see above). I have in the past played around with changing the buffer sizes and rates in the example VI but as this appeared to have no effect on the behaviour I now have them setup as in the download.
    2) Does the program end after the stop trigger is implemented?
    Yep, if I toggle the trigger line high then low I see the program exits the read loop and the VI stops running as expected.
    3) Lastly, can you give me the details of your triggering method? Are you using a digital train of user-set digital pulses? How long is the program running?
    I'm using WriteDigChan.vi to manually toggle the first digital line of the PXI-6733 card, which is wired directly to PFI0 of the PXI-6120 card. Generally, I just start the VI running and then toggle the line high, wait a couple of seconds, and then toggle it low.
    To me it all looks like it should be acquiring samples, but as I said yesterday, it just refuses to fill the buffer with any data (and hence no samples are read).
    Any ideas? and thanks for you help,
    Mike
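    (In case it helps to cross-check the example VI: the same start/stop reference-trigger setup can be expressed in a few DAQmx C calls; the channel and terminal names below are assumptions. If this bare-bones version does fill the buffer, the problem is likely in the example VI rather than the driver or hardware.)

    #include <stdio.h>
    #include <NIDAQmx.h>

    int main(void)
    {
        TaskHandle task = 0;
        float64    data[10000];
        int32      read = 0;

        DAQmxCreateTask("", &task);
        DAQmxCreateAIVoltageChan(task, "PXI1Slot2/ai0", "",
                                 DAQmx_Val_Cfg_Default, -10.0, 10.0,
                                 DAQmx_Val_Volts, NULL);
        DAQmxCfgSampClkTiming(task, "", 100000.0, DAQmx_Val_Rising,
                              DAQmx_Val_FiniteSamps, 10000);

        /* Start acquiring on a rising edge at PFI0 ... */
        DAQmxCfgDigEdgeStartTrig(task, "/PXI1Slot2/PFI0", DAQmx_Val_Rising);
        /* ... and stop on a falling edge (reference trigger), keeping
           100 pretrigger samples. */
        DAQmxCfgDigEdgeRefTrig(task, "/PXI1Slot2/PFI0", DAQmx_Val_Falling, 100);

        DAQmxStartTask(task);
        DAQmxReadAnalogF64(task, -1, 30.0, DAQmx_Val_GroupByChannel,
                           data, 10000, &read, NULL);
        printf("read %d samples per channel\n", (int)read);

        DAQmxClearTask(task);
        return 0;
    }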
