Double Rate GigE camera

I recently bought a PhotonFocus DR1-D2048x1088C-192-G2-8 camera. It is a standard GigE camera, but it has a function called "double rate" which, I believe, compresses the image data to achieve almost double the frame rate of a standard GigE camera.
This works in the other software I have tried, but in both MAX and the NI Vision software we have, the picture comes up hazy/snowy when double rate is activated.
I believe this is a driver issue, and the people I bought the camera from are asking whether National Instruments software can call a specific driver or DLL rather than its built-in one.
I am in no way an expert, so I hope I have made this clear enough.
P.S. I could not see the NI Vision forum and apologise if this is in the wrong spot.

Sure, this can be done.
The amount of work involved depends on quite a few factors.
You usually cannot call third-party library functions directly, because LabVIEW cannot handle the specific data-format conversions between the LabVIEW format and the library's format.
Therefore, you need to create a C wrapper around that library and export C-style functions that use simple data types (e.g. int*, int, ...).
You will need to get a BGRA8 pixel format for colour, or MONO8 / MONO16 / MONO32 for grayscale, from the manufacturer's functions.
If the library's decoder has some different output, you will need to convert it.
For how to get the image from Vision into your library's memory and back, look here.
Then, in LabVIEW, you can call these exported functions through a Call Library Function Node.
If you are not familiar with C, this cannot be done, unless the libraries you want to use are already designed for LabVIEW.
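
For illustration only, a minimal sketch of such a wrapper might look like the code below. The vendor function pf_decode_double_rate() is a hypothetical stand-in (the real PhotonFocus SDK call will have a different name and signature); the point is that the exported function uses only simple data types that the Call Library Function Node can pass.

    /* drwrapper.c - sketch of a C wrapper exposing a vendor "double rate" decoder
       to LabVIEW. pf_decode_double_rate() is a HYPOTHETICAL stand-in for whatever
       function the PhotonFocus SDK actually provides - replace it with the real
       call from the manufacturer's documentation.                                */

    #include <stdint.h>

    /* hypothetical vendor prototype, normally pulled in from the vendor's header */
    extern int pf_decode_double_rate(const uint8_t *compressed, int compressed_size,
                                     uint8_t *decoded, int width, int height);

    /* Exported C-style entry point for LabVIEW's Call Library Function Node.
       Only simple data types are used (byte-array pointers and plain ints), so
       LabVIEW can pass a pre-allocated U8 array to receive the MONO8 image.      */
    #ifdef _WIN32
    __declspec(dllexport)
    #endif
    int DR_DecodeFrame(const uint8_t *compressed, int compressed_size,
                       uint8_t *mono8_out, int width, int height)
    {
        if (compressed == NULL || mono8_out == NULL || width <= 0 || height <= 0)
            return -1;                   /* simple error code returned to LabVIEW */

        /* the vendor decoder fills mono8_out with width * height grayscale pixels */
        return pf_decode_double_rate(compressed, compressed_size,
                                     mono8_out, width, height);
    }

In LabVIEW you would then point a Call Library Function Node at the compiled DLL, pass the raw frame in as a U8 array, and pass a pre-sized U8 array (width x height elements) to receive the decoded MONO8 image.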

Similar Messages

  • Triggering a GigE camera

    Newbie question for you all.  Hope this makes sense:
    At the moment, we are using a Dalsa area scan camera and a NI PCI-1424 frame-grabber with a parallel digital interface.  We use LabVIEW and IMAQ software.  Our software creates an IMAQ session that sits and waits for a single external trigger on the PCI-1424 board.  When it receives the trigger, we acquire a series of frames at a given frame rate.
    Now we want to upgrade the whole system and are considering a GigE camera from Basler (Pioneer piA640-210gm).  If we decide to do this, how would we configure the triggering?
    I noticed that NI's GigE board (PCIe-8231) does not have any trigger inputs.  So I guess the only way to trigger the acquisition is to use the camera's trigger input?  But that doesn't even seem possible.  The user manual on the camera indicates that external triggering is used when you have a pulse train that triggers every frame.  We want the camera to wait for just one rising edge, which then triggers a pre-determined acquisition that is set up in the software.  Any ideas how to do that with GigE?
    Also, we use NI-IMAQ software right now.  If we upgrade to GigE, then we are going to have to totally re-write our software to use NI-IMAQdx, right?
    - John

    Hi John,
    With Firewire and GigE cameras the triggering is generally* done on the camera rather than on the interface into the PC. The reasoning is that the interfaces into the PC do not have low-latency, deterministic methods to trigger the camera except via a separate external trigger wire linking the PC and camera. (*With Firewire, since you are bound to the distance limits of the bus, it is generally easier to make this connection than with GigE.) NI does sell products like the 8255R (a FireWire interface combined with reprogrammable I/O suitable for triggering cameras via an external cable), but there is no device that combines this functionality with GigE ports (though there is no reason why you can't combine one with a separate GigE network card). However, as I'll describe below, GigE Vision has its own tricks that reduce/eliminate the need for this...
    With the triggering moved to the camera, it is up to the camera vendor to decide the complexity of the triggering methods they implement. On GigE Vision cameras, because they use GenICam XML files to allow the cameras to self-describe their features, any capabilities the camera manufacturer can dream up will be exposed through the driver. We are starting to see GigE cameras on the market with very complex triggering capabilities (including built-in pulse-generation capabilities, complex input and output interactions, etc.) that can rival features on many framegrabbers. You can configure all of these features within MAX or your application just like you can control any other feature of the camera. Note that while there is flexibility to implement any feature desired, there is a "Standard Features Naming Convention" that, among other things, includes complex triggering definitions. This list aims to ensure that cameras that implement the same features (such as common triggering modes) use the same names and behavior for their features. (A rough sketch of setting such trigger features programmatically is included at the end of this reply.)
    As to whether the Basler Pioneer will support what you want, I am unsure. I checked the triggering capability here: http://www.baslerweb.com/downloads/17785/pioneer_manual.pdf. According to their docs, they support an "AcquisitionStart" trigger, meaning you should be able to trigger the start of a single-frame, multi-frame, or continuous acquisition when that trigger comes in (and the source can be varied, including external I/O pins). However, I tried this on a Basler Scout that we have, and only a single frame was generated in this mode when triggered via a software trigger. It's possible I have an earlier firmware that is behaving incorrectly, but in theory the Basler camera should be able to do what you want if they named it correctly. I would confirm the expected behavior with Basler to be certain. As I mentioned, there should be plenty of other cameras on the market that do support this triggering mode if the Basler does not.
    With regards to re-writing your software for IMAQdx, it's true that you would have to make the translation from IMAQ code. However, the APIs are very similar for most things and porting the code shouldn't be too bad.
    Please let me know if you have any more questions regarding this,
    Eric G
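
    Purely as an illustration of configuring such trigger features programmatically rather than in MAX, here is a rough sketch using the NI-IMAQdx C API. The attribute paths are placeholders: the exact names come from each camera's own XML file, so copy them from the attribute tree shown in MAX for your camera, and add proper error handling.

        /* Sketch only: configure an "AcquisitionStart" trigger on a GigE Vision
           camera through the NI-IMAQdx C API (niimaqdx.h). The attribute paths
           below are PLACEHOLDERS - copy the exact names for your camera from the
           attribute tree in MAX.                                                 */

        #include "niimaqdx.h"

        int configure_acquisition_start_trigger(void)
        {
            IMAQdxSession session;

            if (IMAQdxOpenCamera("cam0", IMAQdxCameraControlModeController, &session)
                    != IMAQdxErrorSuccess)
                return -1;

            /* Enum-valued GenICam features can be written as strings. */
            IMAQdxSetAttribute(session, "CameraAttributes::AcquisitionTrigger::TriggerSelector",
                               IMAQdxValueTypeString, "AcquisitionStart");
            IMAQdxSetAttribute(session, "CameraAttributes::AcquisitionTrigger::TriggerMode",
                               IMAQdxValueTypeString, "On");
            IMAQdxSetAttribute(session, "CameraAttributes::AcquisitionTrigger::TriggerSource",
                               IMAQdxValueTypeString, "Line1");

            IMAQdxCloseCamera(session);
            return 0;
        }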

  • Triggering GigE Camera with Hardware Trigger

    Hello,
    Here is an outline of what I want to accomplish:
    -LabVIEW program starts running and waits for the GigE camera to output frames
    -Hardware trigger leads to GigE camera outputting frames
    -Some simple arithmetic is done on each frame to generate the average pixel value--> this average value is plotted for each frame
    -Repeat the above three steps
    Please see the attached VI. I have successfully set my camera's settings in MAX to make it wait for an external hardware trigger. However, the output of IMAQdx Grab2.vi inside the While Loop is only a single frame (even though in MAX I have set the Acquisition Mode to MultiFrame - 255 Frames).
    Any help would be appreciated!
    Thank You.
    Attachments:
    ImageGrab.vi 57 KB

    The "problem" that you are having is the frame rate of video acquisition, which you think is about 20 Hz.  Take the very simple VI I posted and run it with your camera -- all it does is continuously take frames (and display them) -- does this have an acceptable rate?  I suspect it will.
    If so, then "start with what works and add to it", rather than trying to "fix what is broken".  First, let's consider how to (better) control the Start and Stop of frame acquisition.  I like your idea of using the 6009, but I recommend (if you are using a 3V signal as the trigger) that you wire the trigger to one of the Digital I/O ports (as 0 and 3V are acceptable TTL levels for False and True).
    Your Video loop will be "clocked" by the Camera at its frame rate, so you might consider using the same loop to clock the DIO.  Take a Digital sample of the line to which you've wired your TTL signal, and as long as it is True, run the loop.  [You'll need to think about how to get the loop started ...].  It should, I think, be possible to read from your USB 6009 within a 60th of a second -- if it is too slow, there are other ways of handling this with a separate parallel loop, but let's not go there until we see it is a problem.
    So now, in principle, you've gone from a simple loop showing frames at 60 Hz to a loop controlled by a TTL signal showing frames at 60 Hz.  All we need now is to process those frames.
    Here is where you want to use a Producer/Consumer pattern -- you don't want to do processing inside this loop (because the loop cannot run faster than all of its parts, taken together, and if you are processing incoming data, you have to get the data, then do the processing).  Instead, you have two loops running in parallel -- the Producer loop acquiring the video frames, and then "exporting" them to a Consumer loop that processes them.  (A rough sketch of this structure is included at the end of this reply.)
    Are you familiar with this pattern?  There are numerous examples around (look in File/New/From Template/Framework/Design Patterns, and at some of the Sample Projects).  It uses a Queue, with data put onto the Queue by the Producer and removed by the Consumer.  You might need to increase the number of buffers for your camera, but you should be able to do quite a bit of processing in 1/60 of a second.
    Bob Schor
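
    Purely to illustrate the structure described above (in LabVIEW you would use the queue-based Producer/Consumer template itself, not C code), here is a minimal sketch of the same pattern with a fixed-size thread-safe queue and POSIX threads; the frame rate and frame count are made-up numbers.

        /* Producer/Consumer sketch (POSIX threads): the producer "acquires" frames
           and queues them; the consumer dequeues and processes them, so slow
           processing never blocks acquisition. Illustrative only - in LabVIEW this
           maps to two parallel while loops joined by a queue.                     */

        #include <pthread.h>
        #include <stdio.h>
        #include <unistd.h>

        #define QUEUE_LEN 16

        typedef struct { int frame_id; } Frame;

        static Frame queue[QUEUE_LEN];
        static int head = 0, tail = 0, count = 0, done = 0;
        static pthread_mutex_t lock      = PTHREAD_MUTEX_INITIALIZER;
        static pthread_cond_t  not_empty = PTHREAD_COND_INITIALIZER;
        static pthread_cond_t  not_full  = PTHREAD_COND_INITIALIZER;

        static void *producer(void *arg)              /* acquisition loop */
        {
            for (int i = 0; i < 100; i++) {
                Frame f = { i };
                usleep(16667);                        /* stand-in for a ~60 Hz camera */
                pthread_mutex_lock(&lock);
                while (count == QUEUE_LEN) pthread_cond_wait(&not_full, &lock);
                queue[tail] = f; tail = (tail + 1) % QUEUE_LEN; count++;
                pthread_cond_signal(&not_empty);
                pthread_mutex_unlock(&lock);
            }
            pthread_mutex_lock(&lock);
            done = 1;
            pthread_cond_signal(&not_empty);
            pthread_mutex_unlock(&lock);
            return NULL;
        }

        static void *consumer(void *arg)              /* processing loop */
        {
            for (;;) {
                pthread_mutex_lock(&lock);
                while (count == 0 && !done) pthread_cond_wait(&not_empty, &lock);
                if (count == 0 && done) { pthread_mutex_unlock(&lock); break; }
                Frame f = queue[head]; head = (head + 1) % QUEUE_LEN; count--;
                pthread_cond_signal(&not_full);
                pthread_mutex_unlock(&lock);
                printf("processing frame %d\n", f.frame_id);  /* e.g. average pixel value */
            }
            return NULL;
        }

        int main(void)
        {
            pthread_t p, c;
            pthread_create(&p, NULL, producer, NULL);
            pthread_create(&c, NULL, consumer, NULL);
            pthread_join(p, NULL);
            pthread_join(c, NULL);
            return 0;
        }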

  • Simultaneous capture with USB and GigE cameras?

    Hi,
    I am unable to capture (continuously) simultaneously from a USB and a GigE camera; is there a limitation in LabVIEW? Both cameras work if I use them one at a time. I am using LabVIEW 2012. Both cameras are IMAQdx devices and I am using IMAQdx VIs.
    Thanks,
    Best,
    Saumil

    I have just converted to low level, just in case. Not tested yet...
    USB camera: USB 2.0, 10 frames/sec, 1280x1024 -> about 13 MBytes/sec
    GigE camera: 300 frames/sec, 640x480 -> about 90 MBytes/sec
    So both cameras' data rates are within their respective interface limits (the arithmetic is sketched at the end of this post). However, when I started to archive raw frames from the USB camera, everything in LabVIEW became super slow. The slowness was reduced dramatically once I started storing to compressed AVI files (rates went from 15 MBytes/sec to approx. 1.5 MBytes/sec). I have not tested archival with the GigE camera yet; I am expecting trouble, I just don't know how severe.
    Thanks for your help,
    Saumil
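
    For anyone repeating this check with their own cameras, the figures above are just resolution x frame rate x bytes per pixel; a tiny sketch of the arithmetic, assuming 8-bit monochrome pixels (1 byte per pixel):

        /* Rough data-rate check, assuming 8-bit monochrome pixels (1 byte/pixel). */
        #include <stdio.h>

        static double mbytes_per_sec(int width, int height, double fps, int bytes_per_pixel)
        {
            return (double)width * height * bytes_per_pixel * fps / 1.0e6;
        }

        int main(void)
        {
            printf("USB  camera: %.1f MB/s\n", mbytes_per_sec(1280, 1024,  10.0, 1)); /* ~13 MB/s */
            printf("GigE camera: %.1f MB/s\n", mbytes_per_sec( 640,  480, 300.0, 1)); /* ~92 MB/s */
            return 0;
        }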

  • NI VBAI GigE Camera Lost Packets

    We are running VBAI on a fast PC through an Intel Pro/1000 card and a jumbo-frame GigE switch to a large number of Basler Ace GigE cameras.
    It is understandable that multiple cameras cannot simultaneously transfer complete images to the PC at the full 1000 Mb/s data rate, due to the limited bandwidth between the PC and the switch.
    One way around this is to throttle the maximum data rate for each camera down so that the sum is not greater than 1000 Mb/s.
    However, this means that image transfers always take longer, even if only one camera happens to be in operation much of the time.
    Is this a fundamental limitation of GigE Vision, or are GigE Vision cameras clever enough to operate as fast as possible, via resends of the occasional lost packet?
         Nelson

    We have come up with a solution.
    The issue is that the cameras, switch, network card, and VisionBuilder cannot handle any instance where more than one camera is sending images to VisionBuilder and the total of the data rates for those cameras exceeds the 1 Gb/s network card bandwidth.
    (It would be nice if someone made a network switch with slightly more packet buffer memory...100 MB?...so that the occasional collision does not result in garbage images, while allowing most captures to operate at maximum speed.)
    Solution:
    (1) We added 3 additional 1 Gb/s network ports to our vision PC.
    (2) In order to redistribute camera traffic across more than one network port, we assigned each network card, and the corresponding cameras, to a different subnet.
    (3) We lowered the data rate for the less time-critical cameras.
    Even after lowering the data rates for several cameras in NI-MAX down to 200 Mb/s, such that lost packets should no longer have been possible, we still saw them.
    After a detailed examination we found that the VisionBuilder image acquisition steps do not pay attention to the data rates that you assign in NI-MAX, and always default to the maximum 1000 Mb/s rate, hence the saturated data rates and lost packets.
    The solution to this problem turns out to be to explicitly set the desired data rate in each VisionBuilder Image Acquisition step, using the Attributes tab.  While you are at it, you should also check that other critical parameters, such as the packet size, are correct, and update them if not.  (A small sanity-check sketch of the bandwidth budgeting is included at the end of this post.)
    After explicitly correcting the data rates in all image acquisition steps, we ran an image capture stress test that ran all the vision processes (several programs running simultaneously) about 10x faster than required, and observed no lost packets at all.
    Problem solved.
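
    As a sanity check when assigning per-camera data-rate limits across ports, a small sketch that sums the limits on each NIC and flags any port whose total exceeds a safe fraction of the 1000 Mb/s link might look like the following; the NIC count and rates here are made-up placeholders, not our real configuration.

        /* Sketch: verify that the per-camera data-rate limits assigned in the
           acquisition steps never add up to more than one NIC can carry.
           The numbers below are made-up placeholders.                          */

        #include <stdio.h>

        #define NICS 4

        int main(void)
        {
            /* per-NIC sum of the data-rate limits (Mb/s) set in the acquisition steps */
            double assigned_mbps[NICS] = { 800.0, 600.0, 400.0, 950.0 };
            const double usable_mbps = 900.0;  /* leave headroom below the 1000 Mb/s link */

            for (int i = 0; i < NICS; i++) {
                printf("NIC %d: %.0f Mb/s assigned - %s\n", i, assigned_mbps[i],
                       assigned_mbps[i] > usable_mbps ? "oversubscribed, lost packets likely"
                                                      : "ok");
            }
            return 0;
        }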
     

  • Trying to detect GigE camera (TM-1405GE) in MAX

    My company recently bought a new camera.  It is the TM-1405GE by Pulnix.  This camera is a GigE camera and I have already installed the software that comes with it and can acquire images with that software.  My problem is that when I try to use LabVIEW or MAX I cannot detect the camera.  Under the Software section of MAX it says I have the IMAQdx software, but under the Devices and Interfaces folder there is no NI-IMAQdx Devices tab.  I have also tried just writing a simple VI using IMAQdx to acquire an image, but the camera isn't being detected.  Any suggestions would be much appreciated.

    Well, MAX is the place to look for the cameras.
    We had issues with Dalsa cams and I kept nagging the local reseller until it was fixed. The problem was that the camera wasn't fully GigE Vision compliant; the same may be true of yours.
    See here for more info.
    Maybe you have to switch the camera to another software backend?
    Ton
    Free Code Capture Tool! Version 2.1.3 with comments, web-upload, back-save and snippets!
    Nederlandse LabVIEW user groep www.lvug.nl
    My LabVIEW Ideas
    LabVIEW, programming like it should be!

  • Multiple channels from GigE camera

    Hello NI Folks,
    I am using a GigE camera for my machine vision application. I have to save the data from all three channels coming out of the camera.
    I am using the example VI from National Instruments, 'Grab and Setup attributes.vi', to get the attributes and save the image. But the problem is that this example VI can only grab data from one channel. Can anyone give me an idea which part of this example VI I should edit to grab all three channels?
    I checked this issue in MAX as well. I selected all three channels in MAX and tried to take a snap from the camera, but only one image is captured; there is no option in MAX to view all three channels. I am attaching the MAX screenshot below, which shows that all the channels from my GigE camera are activated.
    How do I make LabVIEW read all three channels from the camera?
    Regards
    Neo.

    Hello Mr.Alexander Glasner,
    This is the camera I am using
    http://www.automationtechnology.de/cms/index.php?id=243&L=1
    It is used for the laser triangulation technique. This camera gives out 3D data on one channel, intensity data on another channel, and a grayscale image on a third channel. All these channels come out of the camera through a single Gigabit Ethernet connection. The camera manufacturer provides software with which streaming of the desired channels is possible, but I want to do it in LabVIEW. Is there any block in the IMAQ library to separate these channels from the GigE interface?
    My point with MAX is that MAX is able to grab all the attributes, including the channel names DC0, DC1, DC2, but it is not able to display the three channels separately when they are selected.
    Regards
    Neo

  • Problems acquiring and saving multiple camera images using a switch with GigE cameras

    Hi Folks,
    We are having an issue with connecting 6 GigE cameras via an Ethernet switch. We can acquire and store from individual cameras, but once we increase the number of cameras we end up with jumpy AVI files.
    Each camera has been physically labelled and attached to the switch, so camera 1 is attached to port 1, which then corresponds to Cam1 in MAX. When recording multiple files, what appears to happen is that the AVI file from cam1 actually contains images from multiple cameras, as if cam1 is being renamed/reallocated on each frame and each time a different camera is allocated. How can we fix this?
    We are new to this so any help or advice would be much appreciated.
    Thanks in advance,
    Cathy
    Attachments:
    Camerad.png 60 KB
    Camera.vi 91 KB

    I think your problem is caused by using the same image name on each instance of IMAQ Create, "Grab and Save to AVI Acq" being the string you are using. I am pretty sure that if you make each image name unique you won't get image reference problems.
    Senior Software Engineer
    www.Adansor.com

  • GigE camera slow initialization

    I'm using an Aviiva EM1 GigE camera with LabVIEW 2011. I installed the latest IMAQdx driver. I can acquire images without problems. However, the initialization of the camera takes about 38 seconds with MAX or IMAQdx Open. When using GEVPlayer (Pleora) it takes only 10 seconds. What can I do to speed up the initialization in LabVIEW?
    Regards

    Hey,
    How about your network settings? Have you enabled jumbo frames? If yes, then there may be an incompatibility between the ev2 and LabVIEW.
    Sasi.
    Certified LabVIEW Associate Developer
    If you can DREAM it, You can DO it - Walt Disney

  • Hi, my GigE camera takes a long time to acquire a single image

    Hi all,
    I need some help regarding GigE camera image acquisition. It takes very long to acquire an image, nearly 5000 ms.
    I am using a Sentech GE500A GigE camera.
    I have attached my VI code.
    Attachments:
    caemra.vi 97 KB

    Use Measurement & Automation Explorer to verify that the camera integration time is not set to a really high value. (The arithmetic behind this is sketched at the end of this post.)
    Machine Vision, Robotics, Embedded Systems, Surveillance
    www.movimed.com - Custom Imaging Solutions
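
    For context, the time per frame can never be shorter than the exposure (integration) time plus readout overhead, so an exposure accidentally left near 5 seconds would explain a 5000 ms acquisition exactly. A trivial sketch of that relationship, with illustrative numbers:

        /* The minimum time per frame is bounded by the exposure (integration) time
           plus readout/transfer overhead. The numbers here are illustrative only. */
        #include <stdio.h>

        int main(void)
        {
            double exposure_ms = 5000.0;  /* a suspiciously long exposure set in MAX */
            double readout_ms  = 20.0;    /* example readout/transfer overhead       */

            double min_frame_time_ms = exposure_ms + readout_ms;
            printf("Minimum time per frame: %.0f ms (at most %.2f fps)\n",
                   min_frame_time_ms, 1000.0 / min_frame_time_ms);
            return 0;
        }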

  • GigE Camera File Error

    Hi all,
    Just got a new Hitachi KP-F83GV GigE camera.  It is listed on the NI website as supported and current.  Running LabVIEW 8.5.1 and IMAQdx 3.0.1.  So here's my issue...
    I plugged in the camera, fired up a small example program (below), and everything worked great! Just for kicks, I stopped the program and tried to start it again.  I get the following error: "NI-IMAQdx: (Hex 0xBFF69012) Attribute value is out of range."
    All I have to do to fix the problem, though, is reboot (unplug/replug power) the camera!  This is really annoying, as it's likely I'll be firing up multiple LabVIEW programs in one sitting and it's a pain to have to power cycle after each test.
    Any idea why the attribute is only "out of range" on the 2nd/3rd/4th attempt and not the first?

    Aha - LabVIEW 8.5.1 and IMAQdx 3.0.1 don't play nice.
    LabVIEW 8.6.1 and Vision Acquisition 2009 are on their way... hopefully that will fix the problem.  I'll certainly be back if it doesn't.

  • GigE cameras

    Hi, I'm having some trouble managing 4 cameras; two are FireWire and two are GigE. I have no problems running my program with highlight execution, but if I run it continuously the following error appears: 0xBFF69002 "Invalid Parameter" and no image is displayed from one of the GigE cameras. This only occurs with one of the GigE cameras; the other ones work perfectly.
    Can anybody help me? I would appreciate your time...
       Regards...

    OK, sorry about the brief explanation. The cameras I have connected are a DALSA cam (GigE), a Marlin cam (FireWire), a Guppy cam (FireWire) and a JAI cam (GigE). I can see every camera in MAX; the one I have problems with is the DALSA cam. But, just as I described, if I run my program with highlight execution it works, in a slow way, but it works; if I run it continuously, the program gives me that error...
    I don't have the file right here, I could upload it on Monday the 13th, but the program flow is as follows: first I check the number of cameras connected, I create folders for every camera connected, and then I start the while loop where I open a camera session, then configure it, then take a snap and save it into its folder, and then I close the session. I sweep through all the connected cameras. The program ends when an error occurs or when I press the STOP button.
    It's very simple, as you can see, but I have this problem. The configuration of the cameras was adjusted: the frame rate of the DALSA is 150 fps, the other ones are faster (near 8000), the snaps are 1024 x 780, and every camera has its own configuration for the type of snap (RGB 32 or grayscale) as I need it...
       Thank you for your time... =)

  • How to calibrate my GigE camera

    Hi,
    I am using the following GigE camera and lens in a machine vision application:
    5 MP GigE camera
    Sensor size = 2/3"
    Optical magnification = x4
    FOV = 1.65 mm x 2.20 mm
    I need to calibrate my camera so that I can get real-world parameter values. The example I have seen covers stereo calibration (with two cameras), not a single camera with this low FOV.
    I am new to this.

    Yes, I created that calibration image using Vision Assistant.
    I am able to apply that calibration image to another Vision Assistant script, and that also works.
    I don't know how to use it in a LabVIEW VI file.
    If I create a VI file using Vision Assistant, it also shows the calibration axis info and grid descriptor as inputs.
    I don't know how to insert the calibration image into the VI file directly. (A simple scale calculation based on the FOV is also sketched below.)
    Attachments:
    ll.vi 69 KB
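
    As an aside: for a fixed-magnification setup like this, a simple pixel-to-millimetre scale (as opposed to a full grid calibration) can be computed directly from the FOV and the image size. The 2448 x 2048 resolution below is a made-up placeholder; use the actual resolution of the 5 MP camera.

        /* Simple scale factor from a known FOV: mm per pixel = FOV / pixels.
           The 2448 x 2048 resolution is a placeholder - use your camera's.   */
        #include <stdio.h>

        int main(void)
        {
            double fov_x_mm = 2.20, fov_y_mm = 1.65;  /* from the lens / magnification */
            int    px_x = 2448,     px_y = 2048;      /* placeholder sensor resolution */

            printf("Scale: %.5f mm/px (X), %.5f mm/px (Y)\n",
                   fov_x_mm / px_x, fov_y_mm / px_y);
            return 0;
        }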

  • Why does MAX 5.0 not report the right XML file settings from an AVT GigE camera?

    I have an AVT Prosilica GC660M GigE camera. It has had trouble synchronizing with an external trigger. My ex-colleague posted about this a long time ago (http://forums.ni.com/t5/Machine-Vision/Problem-with-external-trigger-on-GigE-camera/m-p/1060572/high...), so I think it is better to start a new thread. Inside MAX, there is only one option under "ExposureMode", namely "timed", which only uses the preconfigured exposure time and cannot be triggered by an external signal. The workaround is to manually set the register inside the camera. It was thought to be a bug that a firmware update should fix. However, the problem remains after I recently upgraded the firmware, and so I installed the AVT package. Both MAX and the AVT GigEViewer correctly retrieve the camera firmware version as 1.42.02. Yet MAX still shows only the "timed" option, while GigEViewer shows all four enums described in their attribute manual: "Manual, Auto, AutoOnce, External". I have tried deleting the XML and the files associated with the camera under NI-IMAQdx\Data, but every time MAX regenerates them exactly the same, with only the "timed" option in the XML file. Now it feels more like something is wrong with MAX instead. I wonder how to fix it.
    Thanks,
    Lei

    Hi Lei,
    I am not exactly sure why you are not seeing all the attributes, but it could be a few different things causing it. First, let's just make sure you have all your attributes being displayed (follow the instructions from the first link below). What happens if you delete the XML file, then set your camera to "External" in the GigEViewer software, then load MAX? Do you see any changes? Can you post a screenshot showing the camera attributes from MAX? If you have your camera set to External from GigEViewer and you try to grab from the camera in MAX, do you get a timeout error, does it trigger externally, what happens?
    Why Can I Not See All of My IMAQdx Camera's Attributes in Measurement & Automation Explorer
    http://digital.ni.com/public.nsf/allkb/9FA7FEE4FC51F043862574A30075B7A1?OpenDocument
    Why Won't My Allied Vision Technology Camera Work With National Instruments Software?
    http://digital.ni.com/public.nsf/allkb/470DA6BDE3883EB686257341006BCB56?OpenDocument
    Tim O
    Applications Engineer
    National Instruments

  • GigE camera - grab image with black bars

    Hi everybody,
    I am using NI Vision Acquisition to grab images from a GigE camera.
    I receive images just as I trigger the camera - so far so good.
    Every other picture shows some black bars.
    Does anyone have an idea what it could be?
    It's not the hardware; the camera works perfectly with the SDK from the manufacturer.
    Thanks for any help

    Sounds like an incomplete frame; this could be caused by missing packets.  To prevent missing packets during acquisition you can do two key things: ensure the packet size configured on the camera does not exceed what the network adapter can handle, and ensure that the bandwidth of image data being output by the camera does not saturate the interface bandwidth.  I would lower the camera packet size to 1500 initially to determine if the packet size is the cause of the problem.  You can throttle how much data the camera sends by manipulating the inter-packet delay; some manufacturers have a very handy feature called StreamBytesPerSecond.  Depending on which mechanism you can use, increase the inter-packet delay or reduce StreamBytesPerSecond.  (A rough calculation of the camera's output bandwidth is sketched below.)
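
    To judge whether throttling is needed at all, the camera's raw output rate is roughly resolution x frame rate x bytes per pixel, plus a little packet-header overhead; a small sketch of that check, using made-up numbers:

        /* Rough check of whether a camera's output can saturate the GigE link.
           The resolution, frame rate and pixel depth below are made-up examples. */
        #include <stdio.h>

        int main(void)
        {
            int    width = 1920, height = 1080, bytes_per_pixel = 1;  /* Mono8 */
            double fps = 60.0;
            double overhead = 1.05;         /* ~5% for GVSP/UDP/IP packet headers */

            double bytes_per_sec = width * height * bytes_per_pixel * fps * overhead;
            double gige_payload  = 115e6;   /* practical GigE payload, ~115 MB/s  */

            printf("Camera output: %.1f MB/s (%s)\n", bytes_per_sec / 1e6,
                   bytes_per_sec > gige_payload ? "will drop packets - throttle it"
                                                : "fits on the link");
            return 0;
        }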
