NI-1752 Smart Camera - Timeout without external trigger

I'm working with an NI 1752 Smart Camera, which is triggered by an Ethernet/IP step. It's loaded with VBAI 2014, and so is my PC.
Whenever I upload (or re-upload) VBAI 2014 to it, Image Acquisition works normally. But after stopping an inspection in Inspection Mode and then changing back to Configuration Mode, Image Acquisition stops working and I get a Timeout Error. The error message states that the "camera didn't receive the trigger signal in the specified timeout period", which doesn't make sense because I'm not using the External Trigger functionality.
Rebooting the camera does nothing to fix this. Sometimes it starts working again (I haven't been able to pin down what I did to make that happen), then stops. I tried testing with all Acquisition Modes (Wait for next image, Immediate, Buffer), but that makes no difference. The network path from my PC to the camera has plenty of bandwidth and no latency problems. I also set the camera to initialize all steps upon startup; no difference.
Has anyone else got this problem?
Thanks,
Hazim Salem

Hi Brad,
Thanks for your interest in this problem.
I do indeed get the same timeout error when trying to take images with NI MAX. It gives me error 0xBFFF8004. I found this error in a solved forum post, but the solution in that post didn't fix it.
Also, starting with an empty inspection and adding an "Acquire Image" step doesn't work either.
So far, I've observed that the camera captures images immediately after reboot in Configuration Mode, and also the first time I switch to Inspection Mode. Switching back to Configuration Mode (i.e., the second time in that mode), the image acquisition starts timing out. Switching back to Inspection Mode doesn't fix it. The only way to fix it is to reboot the camera.
Here are some things about this smart camera that might have to do with this problem:
The camera came preloaded with VBAI 2014. Running on that version, I've never experienced any problems whatsoever.
We sent the camera to the customer to be installed on-site. The customer didn't have VBAI 2014 (we forgot to deliver it to them and they didn't ask), so I'm sure they installed VBAI 2013 on it.
I noticed that re-installing software on the camera does not completely wipe it clean; I re-installed VBAI 2014 on it 5 times yesterday, and the inspections I had on it are still there.
Since that's the case, there may be some incompatible VBAI 2013 executables or settings affecting it.
The actual timeout problem started when I configured the camera (in "Configure Target") to run my VBAI 2014 inspection on startup. Once that happened, even after I disabled that option again, I kept getting these timeout errors.
Thinking about it, I'm sure that resetting the camera to factory defaults will solve this issue. I'm just curious whether I did something wrong, or whether there's a bug somewhere in the software installation.

Similar Messages

  • Smart camera & PC

    Hello,
    When running VBAI 2013 on a PC in RUN mode, all image layers can be watched on the monitor.
    Using the NI-1752 smart camera connected to PC will I be able to watch all image layers as well in real time?
    If the answer is YES, do I need to use the second Ethernet port, or is everything (development and monitoring) done over Ethernet port 1?
    Thanks,
    gsc

    Hi,
    Yes, there are 2 ways to monitor what the Smart Camera is inspecting. Both are using Ethernet Port 1.
    1) Display the inspection interface in a web browser on the host.
    You can use the default Inspection Interface or design your own. (Tools>>Inspection Interface Configuration...)
    Then, go to Target>>Target Options. Select the Web Server category and enable the web server.
    You can then go to inspection mode and run your inspection.
    The inspection can be monitored from a web browser on a PC by typing in the IP address of the Smart Camera.
    2) Use shared variables.
    You can create an image shared variable (Tools>>Variable Manager). In the System Variable tab, create an image variable and share it.
    Set it using the Set Variable step located in the User Additional Tools palette.
    You can then create a VI on the host that reads this image variable (and any other shared variable) and displays it on your LabVIEW VI.
    This application note explains how to do that:
    http://www.ni.com/white-paper/12322/en/
    Hope this answers your question.
    Best regards,
    Christophe

  • Display image in Smart camera web interface

    Hi!
    I need some assistance with LV-RT. 
    What I have succeeded in doing is enabling the web server on an NI 17XX camera and displaying a VI with an Image Display on it. I think I've got the hang of it.
    What I would like to know is whether there is another way to display images on the camera web page. I really don't need the whole VI up there, just an image and a string with a serial number. One way I know would work is to write an image file to the system memory and then embed that image in the web page, but I'm not sure that's the best solution, as the writing process takes time and it is not recommended to write too often to the program memory (am I right?).
    So, is there some way to assign some object to, say, a memory bank and display it (without having to install the runtime, maybe) on the page?
    Thank you for your attention,
    Mart

    Hey Mart,
    Going back to your original question: are you looking to stream images to a web page from your smart camera, or simply to see what your camera is seeing when you go to a particular site? Take a look at the sample project below. It is sort of a workaround, but it posts an image to a web service. Essentially it snaps an image, saves it, and then posts the saved image to a web service. You will have to change the file path for the image, then build and deploy the web service from the project view. It would also be beneficial to take a look at the web service build specifications. The snapped image would then be viewable from your local machine's web browser at the address: "http://localhost/Stream_Service/ShowImag"
    Let us know if you have trouble getting this example working. Also, perhaps revisiting what your end goal is would be helpful. What was it that you had working?
    Hope this helps
    -Ben
    Message Edited by BCho on 04-10-2009 02:27 PM
    WaterlooLabs
    Attachments:
    SnapWebService.zip ‏14 KB

  • How to connect external trigger PCIe1427

    Hi,
    We are working with a PCIe 1427 frame grabber and an IR camera with the Camera Link standard. Our camera has asynchronous reset on CC3, so we set external trigger 2 in NI MAX and were successfully able to grab the video. But we need to implement the same thing in a LabVIEW application, so we tried to use the IMAQ Configure Trigger3 and IMAQ Trigger Drive VIs, but were unsuccessful in resetting the camera and grabbing video.
    We are using IMAQ 4.9 driver and Vision development module with Labview development suite 2013 SP1.
    Kindly provide any solutions.
    with regards,
    Sri 

    Hi all,
    Any suggestions in this regard?

  • How to connect external trigger PCIe-1433?

    I am having an issue getting the external trigger to work using a Camera Link camera, a PCIe-1433 card and a signal generator.  The signal generator is connected to the SMB connector on the 1433 card and is providing a 20 Hz signal with a 20 ms pulse width at TTL voltages (0-4V).  From memory, we are using the MAX software, and under the Acquisition tab I have CC1 set to external; however, the camera is not receiving this external trigger.
    I've attached a rough diagram of our setup.
    What could my issues be?  My thoughts are:
    1) Are there other settings in MAX I need to ensure?  I know the SMB connector can be an input or an output.  How can I make sure this is set to input?
    2) My input signal is 0-4V which is TTL, but does the signal need to be 0-5V?
    Any thoughts would be greatly appreciated.  Thank you.
    Attachments:
    ExternalTrigger.jpg ‏29 KB

    Hello tuckturn
    Thank you very much for getting in touch with us. 
    1)  In Measurement and Automation Explorer, the default is an input.  You would need to use LabVIEW to change the SMB connector to be an output. Could you please share a screenshot of your camera attributes in Measurement and Automation Explorer?
    2)  Where does this input signal come from?  Do you have a 5V TTL compliant output to test this?  Can you please provide me the specification for whatever device is outputting the voltage?
    Thanks again.
    Sincerely,
    Greg S.

  • How to connect external trigger PCI-1433

    Hi all:
    I met the same problem as this old post: https://forums.ni.com/t5/Machine-Vision/How-to-connect-external-trigger-PCIe-1433/m-p/1677560/highli...
    however the post did not give solution.
    I am having an issue getting the external trigger (from a PCI 6259) to work with a Camera Link camera and a PCIe-1433 card. The PCI 6259 is connected to a BNC-2111; the output trigger signal from CTR0 is connected to the SMB connector on the 1433 card and is providing a pulse train at TTL voltages (0-5V).  Using the MAX software, under the Acquisition tab I have CC1 set to external 0; however, the camera is not receiving this external trigger. The camera is a Basler 4096-140km.
    I measured the output signal from CTR0, and the signal seems good.
    I tried the RTSI trigger before and failed. I thought SMB would be more straightforward; it seems I was wrong.
    Looking forward to your reply!
    Attachments:
    camera configuration.jpg ‏168 KB
    External Camera Trigger(SMB).vi ‏57 KB

    Bobjojo,
    You will actually need to affect the properties of both the camera and the frame grabber to take advantage of this triggering mode. I found a document that contains a good run down of the process to acquire in this mode (linked below). As far as the programming for the acquisition is concerned, the frames will be built at the driver level for the specified frame height. This means that the IMAQ driver will composite the line scans for you, and any simple acquisition (the document shows using the Vision Acquisition Express VI) will be able to pull the image into LabVIEW.
    www.ni.com/white-paper/13786/en/pdf
    Karl G.
    Applications Engineer
    ni.com/support

  • Smart camera 1742 selecting inspection by tcp/ip and labview

    Hello everybody!
    I'm working with two NI 1742 smart cameras on which I have several inspections for different products. I'm trying to select the inspection for each product via TCP/IP and LabVIEW 2010. I have configured the TCP/IP connection and tested it with the Vision Builder Ethernet terminal; that part works very well. But I can't do the same from LabVIEW, where I show images from the smart cameras via shared variables.
    If someone knows how I can do this, please let me know; I would be thankful.
    This is my code, but there is something wrong in it.
    thanks and regards
    Solved!
    Go to Solution.
    Attachments:
    USER1.vi ‏85 KB

    You are already using the VBAI API in LabVIEW to start the inspection. Why not use the Open Inspection VI to select which inspection to run? Then you don't need a product select state and TCP commands. You can use the Get Target Inspections VI to list the paths to the inspections on your target, since they are saved in numbered folders and it won't be obvious what the path is. Once you have the path, it is very simple to keep one session open to the target and use the API to open inspections, run them, and even get images instead of using the variables (you can use Get Inspection Image for the image, or just keep using your shared variable, since it sounds like that works fine).
    There are two possibilities I can think of without more info on why your current setup doesn't work:
    1. When you start the inspection using the API, I don't think product select works, since it is expected that you just use the API to switch inspections. You can test this out by commenting out your VBAI API calls in LV. Start the inspection from the VBAI executable when connected to the target and the Product Select is in use. Run your VI that sends a TCP command to switch to the other inspection and see if it works (you can reconnect from VBAI and see if it's running the new inspection).
    2. There is an error with the TCP command...check the error out of your TCP VIs.
    Again, instead of trying to get your current model working, I would switch to using the API for running inspections and selecting the inspections. If you do use the API, you don't need to stop the current inspection, when changing inspections. The Open Inspection VI will stop any current inspection before loading the new inspection. It will wait until the current iteration is done before stopping the current inspection, and the clean up state of that inspection will be called as well.
    Hope this helps,
    Brad

  • NI-1744 Smart Cameras go offline using Vision Builder AI and LV2010?

    I've run into a perplexing and frustrating problem.
    I have a set of 6 NI-1744 smart cameras used to monitor sample motion in an automated system
    A central robot moves samples between 6 satellite chambers; there is a camera mounted above the entrance to each of the satellites; the sample carriers are drilled with up to 10 holes encoding their numbers in binary (0-1023). There is also an L-shaped registration mark drilled near the region for the number encoding. Matters are complicated somewhat in that the second generation of carriers has a slightly more complicated registration mark (which includes the simple L of the first generation).
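    The hole-pattern numbering described above (up to 10 holes encoding 0-1023 in binary) amounts to a simple bit decode. A minimal sketch, assuming holes are ordered from least to most significant bit (the real hole ordering on the carriers is not stated):

    ```python
    def decode_carrier_number(holes_present):
        """Decode a carrier number from its drilled-hole pattern.

        holes_present: sequence of up to 10 booleans, one per hole
        position, assumed ordered least significant bit first.
        Returns an integer in the range 0-1023.
        """
        number = 0
        for bit, present in enumerate(holes_present):
            if present:
                number |= 1 << bit
        return number

    # A carrier with holes at positions 0, 1 and 3 encodes 1 + 2 + 8 = 11.
    print(decode_carrier_number(
        [True, True, False, True] + [False] * 6))  # -> 11
    ```

    The decode is independent of which vision step produced the per-hole booleans, so it applies to both carrier generations.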
    What I would like to accomplish is this:
    Most of the time, I just want to be able to see what's under each camera, illuminating from above using a ring light attached to the camera. Call that Inspection A.
    I may need to capture an image without the ring light. Call that Inspection B
    When a sample transfer is occurring, I need to run a more complicated inspection (Inspection C):
    A light is turned on underneath the end of the sample transfer arm (the "fork"), backlighting the region where the hole pattern would be if a carrier is in fact present.
    Acquire an image of the fork with the backlight on.
    Check to see if the image shows the fork to be empty. If so, return that information and the inspection is complete.
    If not, look for the (backlit) registration mark and establish a coordinate system for the holes. Look to see which holes are present, calculate the sample number, return the data and the inspection is complete
    If not, look for the 2nd gen registration mark. If it is found, establish the coordinates as in step 4, calculate the number, return the data and complete.
    If none of the registration patterns are found, turn on the ring light, capture an image and return it so the operator can intervene and enter the appropriate data.
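    The decision sequence for Inspection C can be sketched as plain control flow. The four callables here are hypothetical stand-ins for the actual vision steps (fork-empty check, the two registration-mark searches, and the hole reader), not VBAI API names:

    ```python
    def run_inspection_c(image, find_mark_gen1, find_mark_gen2,
                         fork_is_empty, read_holes):
        """Sketch of the Inspection C decision flow described above.
        Each callable is a placeholder for a real vision step."""
        if fork_is_empty(image):
            return {"status": "empty"}
        # Try the gen-1 registration mark first, then the gen-2 mark.
        for find_mark in (find_mark_gen1, find_mark_gen2):
            coords = find_mark(image)
            if coords is not None:
                holes = read_holes(image, coords)
                number = sum(1 << bit for bit, h in enumerate(holes) if h)
                return {"status": "read", "number": number}
        # No mark found: the operator must intervene.
        return {"status": "operator_needed"}
    ```

    The ring-light capture for the operator-intervention branch is omitted here; only the branching logic is illustrated.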
    I have successfully built the inspection, and it appears to work in the Vision Builder configuration interface, or if run from the Vision Builder inspection interface. If I attempt to *use* the inspection, accessing it via the VBAI interface in Labview, one or more of the cameras will hang after one or more inspections and stops responding to LabVIEW. It will take multiple reboots to get it back on line and visible to either VB or Labview.
    Originally, I had thought to configure the camera to run inspections continuously and select inspection A, B, or C based on a variable I could access using the LabVIEW Shared Variable Engine.  Every time I tried that, the camera in question would hang. My current sort-of-successful software uses the VBAI functions in LabVIEW to open Inspection A, B, or C on the camera, run that inspection a single time, and return the results and inspection image. That still ends up with one or more cameras hanging, especially if I've added the case to handle the more complicated registration mark.
    I think I may be running out of memory in the camera. I have occasionally received an out of memory error message when running the inspection in debug (step) mode in the VB configuration interface, at which point the camera will disconnect from the VB interface. When I look at the system monitor tab for a camera in MAX, I can see that it is showing me 11.5M free/124M total memory, and 72.3Mfree/118M total disk space. If I understand those numbers, that means that less than 10% of the memory is free when the inspection starts!
    I'm only communicating with the cameras via the ethernet interface. Are there any software components I don't need to have installed on the 1744 (MAX lists about 12 different things installed!) in that case?
    Alternatively, it appears there are VBAI functions available that might let me acquire an image and then process it via a local copy of Vision Builder running on the host PC, rather than in the camera. Is that so, and if so, would it be faster than running the inspection in the camera?
    I've attached a zipfile with the inspection that appears to stall the camera, and samples of typical images captured of the empty fork and a sample holder.
    Kevin Roche
    Advisory Engineer/Scientist
    Spintronics and Magnetoelectronics group
    IBM Research Almaden
    Attachments:
    Carrier Read Problem.zip ‏657 KB

    The good news: thanks to some offline assistance from Brad, my cameras are no longer crashing.
    The bad news: my labview VI using VBAI functions still fails for one or more cameras after a while (typically >12 hours of monitoring). 
    It's very odd: the cameras are still visible online.
    I can ping them.
    I can connect to them via VBAI, run inspections that way (either in configuration mode or inspection mode), and disconnect successfully.
    My calling VI, however, insists that it can't connect (usually with a -354700 error from VBAI Connect.vi, occasionally with a -354705).
    I am using shift registers in the monitor loop of the main VI so that I can pass the VBAI session reference for each camera back to the subvis that actually load and run the desired inspection once a connection has been made. After an indeterminate time, one or more of those appears to go bad, but if I attempt to reset and connect to the camera again (I did include a control to let me close and reconnect if necessary from the monitor loop), I still get the above errors.
    The only way to get them back online within LabVIEW this morning proved to be to stop the main VI and reset all the controls to default, reboot the cameras, and then restart the VI. At that point it was able to connect to all 6 again and has been running happily for over 8 hours.
    I set up the persistent sessions using the shift registers because I have observed in the past continually opening and closing resources like that can lead to memory problems. It also dramatically reduced my cycle time when all I want to do is get the latest images from the cameras to under 2 seconds.
    Any ideas? Is there some subtlety to disconnecting/reconnecting to the VB in the cameras via Labview that I'm missing?
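    One generic pattern worth trying when a persistent session goes bad is to always close the stale reference before reopening, with a short delay between attempts. This is a sketch only; `open_session` and `close_session` are placeholders for the real VBAI Connect/Close calls, not their actual signatures:

    ```python
    import time

    def reconnect(open_session, close_session, session,
                  attempts=3, delay_s=2.0):
        """Close-then-reopen retry pattern. Returns a fresh session or
        re-raises the last connection error after all attempts fail."""
        last_error = None
        for _ in range(attempts):
            try:
                if session is not None:
                    close_session(session)  # release the stale session first
                    session = None
                return open_session()
            except Exception as err:
                last_error = err
                time.sleep(delay_s)
        raise last_error
    ```

    In the LabVIEW version the equivalent is making sure the old session reference in the shift register is explicitly closed (even if the close itself errors) before calling Connect again, rather than reusing or abandoning it.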
    I am working on a slightly smarter version of the inspection that can be allowed to run continuously in a camera and simply return the appropriate result when asked, rather than having to start and stop different inspections as I do now, but that is not ready for deployment yet. It's the symptom of running just fine for X hours and then losing one or more cameras that is baffling me right now.
    Kevin Roche
    Advisory Engineer/Scientist
    Spintronics and Magnetoelectronics group
    IBM Research Almaden

  • How do I acquire 1 image every time an external trigger fires

    I am using an NI 1428 frame grabber card and a common aperture camera to acquire real-time images.
    Does anyone know how to listen for an external trigger (ext 0) and, after it fires, grab the current image on the frame grabber card? I want to do this inside a loop that grabs one RGB image every time the trigger fires.

    Hello,
    Check out the following shipping example in the LabVIEW directory:
    examples\imaq\IMAQ Signal IO.llb\LL Triggered Snap.vi
    Let me know if you have further questions!
    Regards,
    Yusuf C.
    Applications Engineering
    National Instruments

  • External trigger for internal event

    Hello,
    I would like to use an external trigger on a PFI-line to trigger an
    internal event, e.g. the switch of an indicator on the front panel from
    "1" to "2".
    How do I do that?

    Thanks for your answer!
    The problem was that I wanted to use a PFI line on a PCI board (BNC-2110) from NI to trigger the start of a voltage output task. On the other hand, I wanted to use the same trigger event for stopping a while loop, which would otherwise abort the task after some seconds if the trigger didn't come; the data generation itself is rather long, so you have to set the "timeout" terminal of "DAQmx Wait Until Done" to -1. The problem was that it is not possible (at least on this board) to use a digital input line for triggering the task and the PFI line for getting a digital input at the same time. So I split the signal up: it was wired to both lines, worked as the trigger on the PFI line, and was monitored on a DI line.
    This setup worked, but as there were still some problems, I changed it again. Now I check some digital input lines (which provide a stimulus) in a while loop while waiting for the trigger; if the stimulus changes, the task (which is waiting for the trigger) is aborted. The problem is that if the stimulus doesn't change and the task is triggered as usual and completes, I can't report that fact to the while loop! So the while loop goes on and blocks the rest of the diagram. I just wrote another posting concerning that issue, if you would have a look... It can really be a little hard sometimes, can't it...
    This suggestion concerning the event structure and Value (Signaling) sounds interesting; I'm just trying to figure out the possibilities of how to use an event structure, but it's not always really intuitive.
    Best regards,
    Mitja
    Message Edited by Mitja on 11-14-2005 07:09 PM

  • Best way to sync 3 angles without external audio?

    Just installed FCP X today — coming off of FCE 4, so I have lots of learning to do, but am liking it so far.
    My first editing project is a concert of Bach's music I shot on Canon dSLRs a couple months ago. We had three angles, and while I will eventually be getting a copy of the professionally-recorded sound from the concert, I only have my on-camera audio to work with right now.
    So my question is, how does one go about using the synchronization feature to sync three camera angles without having external audio?
    Each camera has several clips for the whole concert (because of record-time limits). I was able to sync up the first clip from each angle by extracting the audio from one angle, importing it as a file, and using it as the sync reference, but there doesn't seem to be a way to add to and build onto the synchronized clip (i.e., adding another extracted audio file and syncing it with previously synced footage). Perhaps I'm missing something?
    Thanks in advance for any help!

    And, yes, I did search the forum beforehand.
    Oh, and another thing I should mention:
    My DSLR (Canon 5D Mark II) actually records the DATE and TIME of when I hit the record button. This is located under the "Date Created" metadata, which is viewable in the Windows media browser (Explorer). AND it also records a "Date Modified" to indicate when I stopped recording. This is very useful,
    but is this a good way to get multiple cams synced in time?
    Thanks!
    Dave
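
    The created/modified timestamps Dave describes can at least give coarse relative offsets between clips. A minimal sketch, assuming all camera clocks agree; note that whole-second file timestamps are far coarser than frame-accurate audio sync, so this is only a rough pre-alignment:

    ```python
    from datetime import datetime

    def sync_offsets(creation_times):
        """Given each clip's record-start time, return each clip's offset
        in seconds relative to the earliest clip. Camera clocks must be
        in agreement for these offsets to mean anything."""
        earliest = min(creation_times.values())
        return {name: (t - earliest).total_seconds()
                for name, t in creation_times.items()}

    # Hypothetical clips from three cameras at the same concert:
    clips = {
        "cam_a.mov": datetime(2011, 6, 25, 20, 0, 0),
        "cam_b.mov": datetime(2011, 6, 25, 20, 0, 3),
        "cam_c.mov": datetime(2011, 6, 25, 19, 59, 58),
    }
    print(sync_offsets(clips))  # cam_c is earliest: offsets 2.0, 5.0, 0.0
    ```

    The file names and times above are hypothetical; the point is only that "Date Created" arithmetic gets clips into the right neighborhood before fine sync.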

  • ACE GigE - external trigger - example needed

    Hello,
    I have a CVS-1457RT with a Basler ACE GigE cam. I am having issues when I try to use a hardware trigger for the camera (through the I/O connection on the camera).
    I've been trying for a few days now, so I decided it was time to ask here.
    Has anyone used a similar setup, meaning the camera with an external trigger? (The controller is not the issue.) And would it be possible to share the acquisition VIs?
    best regards,
    Henrik

    Hi Ehlert,
    Sorry you are having problems getting this to work!
    There are a few things to keep in mind when configuring the triggers....
    You will need to enable the trigger on the Ace camera to trigger the specific event (Frame Start) that you want to occur on the specific trigger signal. You might want to consult the Ace user manual for the specifics of doing this, as they have quite a number of options for how the trigger can be configured. Are you using the 1457RT to send the signal to the camera? If so, you will then need to configure the 1457RT to send an appropriate trigger. We have some Vision RIO examples that show how to do this. Keep in mind that you also may be experiencing a hardware setup issue. If you are using the isolated outputs on the CVS, you will need to provide external power for the outputs to work. It is probably helpful to examine your physical trigger line with a scope and verify that the trigger signal is present when you expect, allowing you to isolate which end is not configured appropriately.
    In general, assuming your trigger signal to the camera is working, setting the camera up for triggering is pretty simple:
    -Set Trigger Selector to "Frame Start" (to set context for the other trigger features)
    -Set Trigger Source to "Line1" (only option on Ace GigE)
    -Set Trigger Mode to "On"
    -Set other related features (rising/falling edge, delay, ...)
    Eric

  • Why do I often get blank frames while acquiring images using external trigger under Windows 7?

    I'm using Windows 7 and the latest LabVIEW and IMAQdx right now. The program was developed based on an example provided with LabVIEW. Everything ran well on Windows XP using either the internal trigger or the external trigger. But when I ran the program on a desktop with Windows 7, I just could not get the images continuously using the external trigger. Although I can get images, blank frames show up very often. When I choose the internal trigger, there is no problem.
    At first I thought there might be something wrong with my program. But when I tried the LabVIEW example with both the internal trigger mode and the external trigger mode, it had the same problem.
    Is there anyone could give me some suggestions to solve this problem?
    Thanks!

    Hi Matt,
    The images are either "all blank" or "all good".
    The hardware setup is exactly the same.
    The blank images are not acquired periodically.
    Similar results were also seen when using MAX to control the camera.
    We found a solution to this problem, but we don't know how to explain it. When we were using the 5-Hz external trigger, we thought the video mode would not affect the image acquisition, so we left the video mode option at its default (15 fps). But when we changed the video mode to 7.5 fps or 30 fps, the problem disappeared. So now, if we want to get rid of the blank frames while using the external trigger, we just need to choose a video mode other than 15 fps. Interestingly, when we switched back to the XP system, the camera also acquired inconsistent blank frames if we set the video mode to 15 fps.
    Thanks again for your help.
    Hubert 

  • Externally trigger binned counting

    Hi,
    I have a PCI-6221 and am trying to achieve the following.  I've tried searching and found similar questions, but nothing exactly the same, so apologies if I have overlooked a previous answer.
    What I would like to do is count total pulses per 100 µs for some fixed period of time, say 100 ms.  The start of the 100 ms total measurement period should be triggered by an external signal which is much slower, e.g. 1 Hz.  This is because I need to sum multiple measurement events as follows, where the time bins given are times after the trigger signal and each measurement is triggered by the external signal.
    Measurement   0-100 µs   100-200 µs   200-300 µs   etc.
         1            2           1            0
         2            1           2            1
         3            2           1            1
         4            0           1            0
      total           5           5            2
    If I could get the rolling total for each measurement that would also be fine, I can subtract each time bin from the previous in software.  The previous example using this method would therefore be:
    Measurement   0-100 µs   100-200 µs   200-300 µs   etc.
         1            2           3            3
         2            1           3            4
         3            2           3            4
         4            0           1            1
      total           5          10           12
    I have looked at previous answers such as http://forums.ni.com/ni/board/message?board.id=40&message.id=2056&query.id=1096458#M2056 and, whilst I converted that to a finite number of measurements OK, I couldn't work out how to set it to digitally trigger without errors.  I'm using LabVIEW 7.1.
    Thanks very much, I hope that this was clear.
    Alex
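
    The subtraction Alex mentions (recovering per-bin counts from a rolling total) is a one-line transformation in software. A minimal sketch:

    ```python
    def bins_from_rolling(rolling):
        """Convert a rolling (cumulative) count trace back into per-bin
        counts by subtracting each bin's value from the next, as described
        in the post above."""
        return [rolling[0]] + [b - a for a, b in zip(rolling, rolling[1:])]

    # Measurement 1 from the rolling-total table: 2, 3, 3 -> 2, 1, 0
    print(bins_from_rolling([2, 3, 3]))  # -> [2, 1, 0]
    ```

    Applying this to each row of the second table reproduces the corresponding row of the first table.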

    I haven't fully fleshed out this idea, but here it is in a think-out-loud form.
    I'm assuming you don't have clocked analog output in your app, and will lean on using an AO task solely for its clock.
    Here's the idea:
    1. One of your counters is set as a retriggerable single pulse generator.  Let's say it's CTR0.  Use the external ~1Hz signal as its trigger, and set it up to generate a single pulse with a 100 msec high time and a minimal low time.  You'll need a DAQmx property node to make it retriggerable.
    2. Configure an AO task to generate at 10 kHz, corresponding to your 100 microsec bins.  Configure it to be pause triggered by the output of CTR0.  Thus, the AO generates samples at 10 kHz, but only while CTR0 output is high (for 100 msec).  You'll have to write a buffer of analog values to generate, but it can just be an array full of 0 voltage, and you don't have to wire that output to anything either.
    3. Configure your other counter, CTR1, for period measurement.  This may be tricky to configure with DAQmx property nodes, but ultimately you'll need to treat your pulses of interest as the timebase for the measurement while using the AO sample clock as the signal to be measured.   That sounds backwards, but the effect will be to store the number of your pulses of interest that occur between AO clock edges.  You will need to research "duplicate count prevention" to make sure that 0 values get buffered properly.
    I *think* that sort of arrangement could work.  About every second, you get an external trigger.  That trigger causes CTR0 to generate a single pulse with a 100 msec high time.  Your AO task generates a clock pulse every 100 microsec, but only during the 100 msec when CTR0 is high.  Each of those AO clock pulses buffers a value for your CTR1 period measurement.  This should produce a measurement buffer like the first one you listed.
    4. (Simpler Alternative) Configure CTR1 for simple counting.  Use DAQmx Timing.vi to specify the use of the AO Sample clock, and specify your external pulses of interest as the signal to measure.  By default, LabVIEW will expect to see those pulses at CTR1's default SOURCE pin.  This arrangement should produce a measurement buffer like the second one you listed, though you may still need to research "duplicate count prevention" to make sure that gets set up right.
    5. I can't help but think there's another way to do this without the AO stuff, but nothing's coming to me now.
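    The timing relationships in the steps above can be sanity-checked in software before touching hardware. Below is a plain-Python simulation (no DAQmx involved, and no LabVIEW equivalent implied) of the gated-binning idea: an external trigger opens a 100 msec gate, a 10 kHz "AO sample clock" defines 100 microsec bins inside that gate, and we count how many pulses of interest land in each bin, which is what CTR1 would end up buffering. The pulse timestamps are hypothetical.

```python
# Software sanity check of the gated-binning scheme described above:
# an external trigger opens a 100 ms gate; within the gate, a 10 kHz
# "AO sample clock" defines 100 us bins, and we count how many pulses
# of interest fall into each bin (what CTR1 would buffer).
# Pure simulation -- no DAQmx hardware involved.

def bin_pulses(pulse_times, trigger_time, gate_s=0.100, bin_s=0.0001):
    """Return per-bin pulse counts for one trigger's 100 ms gate."""
    n_bins = int(round(gate_s / bin_s))      # 1000 bins of 100 us each
    counts = [0] * n_bins
    for t in pulse_times:
        offset = t - trigger_time
        if 0.0 <= offset < gate_s:           # pulses inside the gate only
            counts[int(offset / bin_s)] += 1
    return counts

if __name__ == "__main__":
    # Hypothetical pulses: three land inside the gate, one arrives after it.
    pulses = [0.00005, 0.00012, 0.05003, 0.20000]
    counts = bin_pulses(pulses, trigger_time=0.0)
    print(len(counts))                        # -> 1000
    print(counts[0], counts[1], counts[500])  # -> 1 1 1
```

    Note that pulses arriving outside the 100 msec gate are simply not counted, which mirrors the pause-triggered AO clock: no clock edges, no buffered samples.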
    -Kevin P
    Message Edited by Kevin Price on 02-01-2010 02:19 PM

  • How to externally trigger the execution of TestStand with a start and abort button through a digital interface?

    Hello,
    I'm currently evaluating TestStand as an alternative to an in-house developed Test Sequencer.
    To start our own Test Sequencer we use a small box, connected to a DIO board. The box has a start button and an abort button. The box also has a pass, a fail and a running led.
    The interface with this box is made via some digital lines of a PCI-DIO-96.
    In our own Test Sequencer we have groups named Init, Run, Abort, Exit.
    The Init group is executed at startup (only once). It is used to initialize all the HW and SW. -> I guess this is the Pre UUT Loop callback in TestStand.
    The Exit group is only executed once at the end of the day when the application is terminated. It is used to free all the used hardware and cleanup all the resources. This probably is the Post UUT Loop Callback in TS.
    When a product needs to be tested, the operator presses the "start" button, which triggers our own Test Sequencer; the run and abort groups are then executed (first the run group, afterwards the abort group).
    When the product is being tested the "running led" of the little box lights up to indicate to the operator that the application is running. (only when the run and abort group are running)
    The Run group has all the functional tests in it. (MainSequence)
    The abort group is used to put everything back in its original state after the test on this single product is done. (Post UUT)
    When executing the tests, if something goes wrong (the operator gets stuck in a clamper, ...), the operator can still press the abort button, and execution immediately jumps from the currently executing step in the run group to the first step of the abort group. So whenever something goes wrong, the abort group is called immediately.
    At the end of the run and abort groups, if no errors occurred, the "pass" led lights up. If one or more steps went wrong, the "fail" led lights up.
    This setup can also be used to test multiple products in parallel. In that case, each parallel tester has such a small box with a "start" and "abort" button and a pass, fail and running led. (It is possible that they are all connected to the same PCI-DIO-96 board.)
    My question:
    Is it possible to do something similar like this in Teststand? If yes, is there an example available that shows me how to do this in TestStand? (externally trigger the execution of TestStand)
    Typically, in the Init group (Pre UUT Loop) the digital interface box gets initialised.
    In the Close group (Post UUT Loop) the digital interface box is taken out of scope.
    Note: The PCI-DIO-96 board to which the digital interface box is connected will also be used in the rest of the developed application (MainSequence, ...)
    What's really important for me is that I can create a process model that all the application developers in our organisation can use. I really don't want any of the application developers (who have limited software experience) to have to mess around with all the features of TestStand.
    For them it's important that they just add the function DigitalInterfaceBoxInit () in the Pre UUT Loop Callback and the DigitalInterfaceBoxClose () in the Post UUT Loop Callback, and then everything works!
    It is important that the application developers do not have to create any global variables, other functions, synchronisations, parallel sequences, ... in TestStand. All this needs to be done with a simple call to the DigitalInterfaceBoxInit function.
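    To make the desired group semantics concrete, here is a small hypothetical Python sketch of the in-house sequencer's behavior (not TestStand code, and the group contents are made up): the run group executes step by step, pressing abort jumps straight to the abort group, the abort group always runs as cleanup, and the pass/fail result reflects whether every step succeeded.

```python
# Hypothetical sketch of the in-house sequencer semantics described
# above: the run group executes step by step; if the abort button is
# pressed, execution jumps straight to the abort group; the abort
# group always runs; pass/fail reflects whether every step succeeded.

def execute(run_group, abort_group, abort_pressed=lambda: False):
    """Run the test groups; return 'pass' or 'fail'."""
    all_ok = True
    for step in run_group:
        if abort_pressed():          # operator hit the abort button
            all_ok = False
            break
        if not step():               # a step returns False on failure
            all_ok = False
    for step in abort_group:         # cleanup always executes
        step()
    return "pass" if all_ok else "fail"

if __name__ == "__main__":
    run = [lambda: True, lambda: True]
    cleanup = [lambda: True]
    print(execute(run, cleanup))              # -> pass
    print(execute([lambda: False], cleanup))  # -> fail
```

    In TestStand terms, the run group maps to MainSequence, the abort group to the Post UUT cleanup, and the abort button to terminating the execution early.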
    Thanks in advance for all the help!
    Best Regards,
    Dennis Vanotterdijk
    [email protected]

    Dennis-
    Your application sounds very well suited to TestStand's abilities. I am also quite impressed with your knowledge of how TestStand's architecture is arranged when you are still just considering whether or not to use it.
    I think that TestStand would work really well for you in this application. As you mentioned, it will provide you with a form of standardization for your application developers to work from. It also provides the flexibility for you to add your custom routines in many different places. TestStand 2.0 also makes parallel and batch testing much easier, so you could develop one test for your product and execute different instances of it in parallel to test multiple products at once.
    As for your specific question about how to control TestStand using a DIO board: I think this is very feasible and should not be too difficult. Since TestStand provides you the ability to create a custom operator interface, your operator interface could monitor the status of your DIO board and launch/abort executions based on the values read. Usually executions are launched/aborted when a button on the GUI is pressed; however, I do not see anything different about basing the action on a DI signal vs. a mouse click. I am sure your application is more involved than this high-level description, but from the sounds of it I think it is very possible to do with TestStand.
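    The polling idea can be sketched generically. The following hypothetical Python snippet (no TestStand or NI-DAQ API calls; `read_lines` is a stand-in for a real DIO read from a board such as the PCI-DIO-96) shows how an operator interface could turn rising edges on two digital input lines into launch/abort actions:

```python
# Hedged sketch of the polling idea above: an operator interface polls
# two digital input lines and turns rising edges into "launch" and
# "abort" actions. read_lines() is a stand-in for a real DIO read;
# no TestStand API calls are shown.

def poll(read_lines, on_start, on_abort, cycles):
    """Poll the DI lines `cycles` times, firing callbacks on rising edges."""
    prev_start = prev_abort = False
    for _ in range(cycles):
        start, abort = read_lines()
        if start and not prev_start:     # rising edge on start button
            on_start()
        if abort and not prev_abort:     # rising edge on abort button
            on_abort()
        prev_start, prev_abort = start, abort

if __name__ == "__main__":
    # Simulated line states: start pressed, held, released, then abort.
    samples = iter([(0, 0), (1, 0), (1, 0), (0, 0), (0, 1)])
    events = []
    poll(lambda: next(samples),
         on_start=lambda: events.append("start"),
         on_abort=lambda: events.append("abort"),
         cycles=5)
    print(events)    # -> ['start', 'abort']
```

    Edge detection matters here: reacting to the level rather than the edge would re-launch the execution on every polling cycle while the button is held down.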
    Based on my experience of building test systems with TestStand and your description of the application, I would feel very confident in using TestStand to achieve all the goals you mentioned. If you have further detailed questions on how one of your features might be implemented, feel free to contact one of our Application Engineers or reach us at www.ni.com/ask and we would be glad to help you.
    Best regards,
    Richard McDonell
    National Instruments
