Vision system to look at a bubble level?

Would it be possible to use some type of NI vision system to view the bubble in a bubble-type level and to somehow "measure" how far off the bubble is left or right?
This would be some type of system that would (somehow) set a table to a calibrated "known" level position. I'm not sure if this would be some type of gyroscope or another calibrated level, but we would have to start with a good, solid, calibrated level surface. We would then place a level to be tested on it and use a vision system to "view" the bubble to see how far off it is, and report it as a % accuracy or uncertainty.
I'm not looking for LabVIEW code, only whether this can be done with some NI vision system. Is there a camera that can see the bubble and calculate some % of the bubble outside of the lines?
Seems like an odd application I know, but I'm just looking for input.
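For what it's worth, once the bubble is segmented from the vial, the "how far off center" number is straightforward to compute. A rough sketch in Python with NumPy (a synthetic frame stands in for a real camera image, and the threshold is deliberately crude; in a real setup NI Vision's particle analysis or edge tools would do the segmentation):

```python
import numpy as np

def bubble_offset_percent(image, vial_left, vial_right):
    """Return how far the bubble centroid sits from the vial center,
    as a percentage of the half-span between the reference lines.
    Assumes bubble pixels are brighter than the background."""
    mask = image > image.mean()   # crude threshold to isolate the bubble
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        raise ValueError("no bubble found")
    center = (vial_left + vial_right) / 2.0
    half_span = (vial_right - vial_left) / 2.0
    return 100.0 * (xs.mean() - center) / half_span

# Synthetic frame: a bright "bubble" shifted right of the vial center.
frame = np.zeros((40, 200))
frame[15:25, 120:140] = 1.0      # bubble centroid lands at x = 129.5
print(bubble_offset_percent(frame, 50, 150))  # 59.0 (% of half-span, right)
```

The vial reference-line positions (here 50 and 150 pixels) would come from a one-time calibration of the fixture, not from the image itself.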

Alternative approach I'd use if space permits and tilt angle is within reason:
Use a digital protractor and then use vision tools for character recognition to read the display. If the angle range is high, then use a fiducial and a coordinate system to compensate for the angle so the characters can still be read.
http://www.globalindustrial.com/p/tools/test-measurement/Measuring-Protractors/pro-360-digital-protr...
EDIT: There are also inclinometers with analog and digital outputs/communications that would alleviate the need for vision entirely.
-AK2DM
~~~~~~~~~~~~~~~~~~~~~~~~~~
"It's the questions that drive us."
~~~~~~~~~~~~~~~~~~~~~~~~~~

Similar Messages

  • How Open source Flex looks to a mid level developer

    Here is my response to a post Matt made on a blog:
    The whole post:
    http://blog.simb.net/2009/01/19/take-flex-back-for-the-community/
    @Matt
    1st off thanks, now my input:
    The crux of your criticism seems to be that the process that we use for decision-making is closed; it's certainly not how we feel
    Unfortunately, this seems like a case of bad communication by Adobe and the Flex team. If notable community experts get the impression of closed doors, then *certainly* the rest of us get this impression as well.
    but there has been very little participation from the community so far
    I'm a solid dev, and not an expert, but I can contribute in things such as testing and low-level optimization. But the Adobe open source site is a galactic disaster and discourages me from getting involved. It has so many webpages that just go around in circles, filled with text that is verbose; this frustrates me a great deal. As opposed to something straightforward like:
    http://framework.zend.com/download/latest
    or
    http://code.google.com/p/papervision3d/
    Everything is here and easily notable. I avoid the seemingly endless pages and confusion. If I want to get to the Flex dev mailing list I have to register, then choose the lists, add the lists to my account, go through my account prefs, read some directions... then set up some other preferences... jeeeezzzz. If I want to submit a feature request I am directed to create another login for some Bug and Issue management? HUH? I thought I was submitting a feature request.
    Right on Papervision's Google HOME page I see Getting started > Papervision3D Mailing List.
    Next, when I finally do get to the Flex SVN, this trunk is nothing like I have seen. What IS all this stuff, and where do I go to learn about it? There are countless subfolders filled with things I vaguely understand, and if I want to learn what something is, where do I go? Is there even a src folder?
    Here is a classic example.
    Flex trunk Read Me:
    http://opensource.adobe.com/svn/opensource/flex/sdk/trunk/README.txt
    vs.
    Zend Trunk Read me:
    http://framework.zend.com/svn/framework/standard/trunk/README.txt
    *What* is the Flex team thinking with this?
    I just spent 5 minutes or so twirling through all these SVN directories and Adobe's website and I am clueless. I feel like the project is unnecessarily complex and disorganized, even if it isn't. This just convinced me to delete my Flex SDK bookmark in my SVN client.
    Adobe just lost one potential contributor to Flex SDK.
    We have another idea that we're bouncing around that I hope to share in a week or two but won't get into now.
    Saying stuff like this is what gives people the impression of 'closed'. Why not post these ideas the Flex team has? Even if it is stupid or incomplete.
    That's the whole point.

    Wow, thanks for your response. This is great.

    I now understand that I had a combination of prejudice and ignorance when regarding Adobe Open Source. Perhaps my ignorance on these particulars is something that other devs encounter (that whole perception thing). I have chatted with people who consider the Flex SDK and Flex Builder... not the same... but fused together so much that it'd be just crazy to try and use one without the other. When I mention using Flex without Flex Builder... people get that glazed-over look in their eyes... (FYI I'm an FDT man, but am keeping a close eye on the new Flex Builder)

    I'll use Zend as an example (it's the best that I can come up with), but I feel like there is a difference between the open source framework and Zend Studio (commercial software). Anyway, I'm really excited about how you've responded to all this. I have more ideas, but I'll sit on them and think it through and get ready for next week's meeting.

    Thanks,

    Alan

    On Jan 23, 2009, at 1:31 AM, Matt Chotin wrote:

    > A new message was posted by Matt Chotin in General Discussion --
    > How Open source Flex looks to a mid level developer
    >
    > I've delayed responding to this because frankly I'm not sure what to say for much of it. The first thing I have to point out is that many of the things you mention are related to Flex Builder and not the Flex SDK, which is the part that is open source. So things like the FB features, and NDA as part of its prerelease program, etc. are part of the commercial offerings from Adobe, not the Flex SDK. Every build of the SDK has been available from the open source site; the MAX issues were for commercial products.
    >
    > The roadmap for Flex 4 has been posted on the Gumbo page now for a while. We don't put up specific dates because we don't know specific dates, and as we get closer to feeling certain on a date we've put it up. I think that having dates up that constantly change is counter-productive.
    >
    > We use a code name because you never know if another version is going to jump into the middle or what could happen; locking in on a version number is often just setting yourself up for confusion later. Plenty of other projects use code names too; the idea that a code name denotes secrecy is frankly ludicrous.
    >
    > Regarding your question about 1000 developers and 80% wanting to go in a different direction: if 80% of our customers think we should move one way, don't you think it'd be pretty silly as a company to go against them? Adobe as a company, and the Flex team as a product team, are very focused on delivering our customers value. If we fail to deliver value, not only is our free open source product not used, but our paid products aren't used as well. The things that you sometimes run into are long-term vision vs. short-term pains, and that may be where some aspects of open source vs. closed differ. The Adobe team has a long-term vision of Flex, which we have tried to share and get feedback on, and we make decisions based on that vision while taking into account the short-term needs of developers. I think this is a pretty reasonable approach overall, and you as a Flex/Flash developer have probably benefited from it.
    >
    > I'm sorry you feel like Adobe is getting the only benefit of open source and you aren't. We certainly believe that we've put pieces in place to allow for everyone to benefit, and will continue to take suggestions on how we can improve.
    >
    > Matt
    >
    > On 1/21/09 7:54 AM, "Alan Klement" <[email protected]> wrote:
    >
    > A new message was posted by Alan Klement in General Discussion --
    > How Open source Flex looks to a mid level developer
    >
    > Matt, you're right. The Adobe Open Source site does have sufficient resources, but the perception I had, as a developer interested in getting involved, was that the information was either not there, incomplete, or difficult to find. The perception to me is that Adobe is not serious about "Open Source"; even if it is, the perception I have is that it isn't.
    >
    > I don't mean disrespect, but I don't take "Adobe Open Source" seriously. To illustrate my point I'll use this (albeit a bit extreme) example:
    >
    > Suppose the Flex community consists of 1000 devoted developers. 80% of these developers decide to take the SDK in a direction they feel it needs to go. This decision, regardless of whether it's 'good' or not, renders it incompatible with other Adobe products, namely Adobe Catalyst. Will Adobe accept the community's decision?
    >
    > When Adobe can answer 'yes' to that question, I will take Adobe's commitment seriously.
    >
    > And there are so many other things that send me red flags.
    >
    > - Why is the new name of Flex not public, and why am I, as others, breaking NDA to talk about the renaming process with other Flex devs?
    > - What business does NDA have in an 'Open' project?
    > - Why is Adobe tight-lipped about a release for Flex 4? Where is the roadmap? https://wiki.mozilla.org/Releases
    > - Why are new features in Flex Builder not public? Adobe asks 'what do you want', but never tells us the results of these polls and what features it is actively working on.
    > - Why are there builds of Flex 4 passed out at MAX, but unavailable to non-attendees? Had I *paid* to go to MAX, I'd have Flex 4...
    > - You mentioned that there wasn't a 'budget' to make it easier for devs to submit feature suggestions? Set up a Google mailing list; that'll cost you 0 dollars.
    > - What is the term 'budget' doing in open source? If Adobe won't do something, ask the community to chip in.
    > - Who are the other Adobe Flex devs? I can go to other open source projects and see the names and contact info of other devs. Why aren't THEY posting their opinions on this message board?
    > - Why all the 'codename' garbage? 'Codename' denotes secrecy.
    > - and on and on...
    >
    > To me, Adobe looks like they want all the benefits of an open source project, but none of the consequences. Being open source means releasing a degree of control over the software. Hell, Richard Stallman is still trying to convince people to change 'Linux' to 'GNU/Linux'.
    >
    > I would like to help, but I just don't think my efforts would be seriously considered. I work all day developing Flex applications and front-end Flash web sites. I don't want to then spend my free time engrossed with a project's red tape (Adobe policies), only to have my efforts blown off.
    >
    > Sorry guys, it just looks like a win-win for Adobe and a lose-lose for me.

  • I want to buy NI cam for my machine vision system

    Dear member
    I need to buy an NI camera to use in my machine vision system, which is used for recognition of screw heads like in the second figure.
    The problem is that my webcam is not able to produce a high-quality image,
    so I need a high-quality NI camera.
    I need only an NI camera with a USB cable, not an NI Smart Camera with a processor.
    What is your suggestion, please?
    best regards

    AdnanZ wrote:
    For your industrial machine vision application, you would be better off with an industrial protocol like Gigabit Ethernet (GigE) or USB 3.0 cameras. You should have a look at Basler's Ace line of cameras (scroll down and have a look at the Gigabit Ethernet and USB 3.0 tabs).
    Although there are advantages to USB 2.0 (cost effectiveness, USB 2.0 ports available everywhere), it is still the least standardized and least popular camera bus considered. The one obstruction to the widespread adoption of USB 2.0 for vision applications is the lack of a hardware specification for video acquisition devices. Each vendor has to implement its own hardware and software design, which means that a special driver must be written to connect each USB 2.0 camera to each different software package. As a result, IEEE 1394 is much more prominent in vision applications. Also, most image acquisition drivers for USB 2.0 use utilities like DirectShow to acquire images into the PC. While these tools work well, they are a burden on the CPU. As a result, USB 2.0 image acquisition can be processor-intensive. Utilities like DirectShow also do not provide any type of interface for triggering or communication. Because of this, without a special driver, it is very difficult to synchronize USB cameras with each other or with the rest of a system.
    GigE Vision and USB 3.0 Vision are built to avoid this (by using GenICam) and are supported by National Instruments' Vision Acquisition Software. If you have to choose between GigE and USB 3.0, just remember that GigE is good for longer cable lengths (100 m vs. 8 m for USB 3.0) and works well with multiple cameras. Otherwise, USB 3.0 has better bandwidth (350 MB/s vs. 125 MB/s for GigE) and is plug and play (no extra power required).
    Before you choose the camera, try to understand what resolution and fps you need for your application, and then select the camera.
    To get the right lens, you need to know the CCD size of the camera, and the working distance and field of view of the object. Working distance is the distance from the front of the lens to the object, and field of view (FOV) is defined by the smallest rectangle of sides x and y which contains the object at the object plane. You need to use these to calculate the focal length and then choose the right lens.
    Also, be sure to select the right lighting. Metaphase has some good products.
    Wherever you are based, you should be able to find a local vendor for the camera, lens and lighting. Be sure to do the due diligence. You should then be able to get a good image for your application.
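To make the focal-length step concrete, here is a hedged back-of-the-envelope calculation using the usual pinhole approximation (the sensor width, working distance, and field of view below are made-up example numbers, not values from this thread):

```python
def approx_focal_length(sensor_mm, working_distance_mm, fov_mm):
    """Pinhole approximation: sensor / f = FOV / WD, so
    f = sensor * WD / FOV. Good enough for a first lens shortlist when
    the working distance is much larger than the focal length."""
    return sensor_mm * working_distance_mm / fov_mm

# Hypothetical numbers: a 1/3" sensor (~4.8 mm wide), 300 mm working
# distance, 100 mm horizontal field of view.
print(round(approx_focal_length(4.8, 300, 100), 1))  # 14.4 -> pick a ~14 mm lens
```

You would then round to the nearest stock focal length and re-check the resulting field of view, since lenses come in discrete steps (8, 12, 16, 25 mm, etc.).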
    Thanks for sharing this information.
    Is CCD size the same as pixel size? What does CCD mean? Are there other types, and what are the advantages of a CCD?
    Also,
    if I have bought a camera with a specific focal length and then add a lens with a 14 mm focal length, will the total be the default focal length plus the lens focal length (14)?
    What is the effect of focal length and pixel size on images?
    best regards

  • Getting error while opening a saved for later notification: The selected action is not available. The cause may be related to security. Contact your system administrator to verify your permission level for this action.

    Hi All,
    While opening a saved-for-later notification, we are getting the error: "The selected action is not available. The cause may be related to security. Contact your system administrator to verify your permission level for this action."
    This is a custom notification.
    Please help.
    Thanks
    Raghava

    Hi All,
    Please help on this issue.
    Thanks
    Raghava

  • The session variable, NQ_SESSION.OU_ORG, has no value definition.Please have your System Administrator look at the log for more details on this error. (HY000)

    Hi All,
    I have created a user 'Bitest' and group 'Bi_Test_Group'. Assigned the user to the group and the group to BI consumer role.
    I gave access to only the Procurement and Spend Catalog folder reports and dashboards.
    When I log in to BI Presentation Services with the above-created user and open any Procurement and Spend Catalog dashboard, I get the below error in every report.
    It's a BI Apps 7.9.6.3 installation. I gave the group read access to all Procurement and Spend subject areas.
    Error Codes: OAMP2OPY:OPR4ONWY:U9IM8TAC:OI2DL65P:OI2DL65P 
    Odbc driver returned an error (SQLExecDirectW). 
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 23006] The session variable, NQ_SESSION.OU_ORG, has no value definition.Please have your System Administrator look at the log for more details on this error. (HY000) 
    SQL Issued: {call NQSGetQueryColumnInfo('SELECT Fact."PO Amount" FROM "Procurement and Spend - Purchase Orders"')}
    SQL Issued: SELECT Fact."PO Amount" FROM "Procurement and Spend - Purchase Orders"
    Please help me in resolving this issue and getting results on Dashboard.
    Thanks in advance
    Thanks,
    Sandeep

    Check your query or connection pool settings, etc.

  • How to get rid of message in Konsole "System Serial#" looks incorrect or invalid

    I have a one-year-old MBP; in reality it is some weeks older than that.
    The display showed a thin yellow line on the left side.
    I took it to the Apple Store and, after some discussion about the warranty, they took it and repaired it.
    The next day I went to pick it up (they told me to fetch it).
    First thing, I checked for the error. Sadly, it came up again.
    What did they do? They changed the MLB (motherboard).
    OK, I now have a "new" MBP.
    The same error (yellow line) appeared, and they took it in to repair it again, this time changing the display.
    Three days later they told me to fetch it again.
    Heading to the store, I had a look at the display... very good, no yellow line. Great!
    BUT! The display is slightly twisted in the case. Grrrr...
    If it were a laptop for $300, OK, but not on a system for $2000.
    I will bring it back in a few days to "repair" it, again.
    Now to the mentioned Console message:
    "Hardware SerialNumber "System Serial#" looks incorrect or invalid"
    After replacing the MLB, the hardware changed.
    Is this a problem for the software?
    What can happen, and what restrictions/problems will I have?
    How do I get rid of it?

    When the logic board is replaced, the serial number has to be flashed back into the records - there is a special tool for that and they will need to do that or you will show an incorrect serial number forever and it may result in problems. That part is easy to fix - simply tell them they need to flash/re-register your old serial number.
    As for the other problems: good luck; if they keep replacing/repairing for the same problem more than 3 times, ask (very politely) if one could consider replacing the machine as the repairs are not fixing the problem.

  • System keeps looking for a 'server' that does not exist. It says the old name of my AirPort Extreme. Clicking it away (4x) it gets through.

    The system keeps looking for a 'server' that does not exist. It says the old name of my AirPort Extreme. Clicking it away (4x) finally gets through.

    Other weirdness to report: my neighbor upstairs appears to have a Linksys router network on channel 6. My AEBS is on channel 1 so there shouldn't be interference... but according to iStumbler sometimes the signal leaps to 64000 (keep in mind my own Airport never tops 60)! Is that potentially the problem?
    Might it be possible that your neighbor just acquired a 802.11n (pre-N) wireless router recently?
    802.11n effectively increases capacity by doubling the number of Wi-Fi radios and increasing the number of antennas used to push signals out of those radios. 802.11n splits a data frame into multiple pieces. It then transmits these pieces in parallel using multiple radios that are attached to multiple antennas. These antennas blast out signals from virtually the same vantage point – scattering the signals everywhere.
    You may find the following ZDNet article interesting: http://blogs.zdnet.com/Ou/?p=247
    Even if he did not, and you are on good terms, you might want to ask if he can temporarily turn off his wireless, or, at least, reduce the signal strength to see if that will help in your situation.

  • Where is original point of Z axis in a stereo vision system?

    Hello, everyone
    I use the LabVIEW Stereo Vision module. After I calibrate my stereo vision system, I want to verify the accuracy of the system. I can get every point's depth in my picture, but where is the origin of the Z axis?
    In the newest NI Vision Concepts Help, it is written that NI Vision renders 3D information with respect to the left rectified image such that the new optical center will be at the (0, 0, Z) position. Is the origin of the Z axis on the CCD of the left camera or at the optical center of my left camera's lens?
    Can anyone help me?
    CLAD
    CAU
    Attachments:
    未命名.jpg ‏63 KB

    Hello,
    I would say that the coordinate system origin is at the optical centre, that is, the camera's projection centre.
    So, yes, the optical centre of the left camera's lens. This seems most logical to me...
    Best regards,
    K
    https://decibel.ni.com/content/blogs/kl3m3n
    "Kudos: Users may give one another Kudos on the forums for posts that they found particularly helpful or insightful."

  • 11 years of LabVIEW-based Vision System development

    Hi,
    I design and develop machine vision systems, mostly using LabVIEW. I have been using LabVIEW since version 6.0 and have successfully deployed 27 vision systems. Please send me a private message if you are interested in hiring me.
    I have no location preference.

    Hi there! 
    Did you get hired yet? If not, I'm a recruiter with a small company (a think tank) in Los Angeles, CA. I'm actively seeking a LabVIEW Vision Systems Programmer with 4+ years of LabVIEW programming experience and demonstrated knowledge of NI LabVIEW Vision. Does this sound like your experience? 
    Please let me know! 
    Cheers,
    Kate 

  • I have a flaw on the screen of my MacBook air .  It looks like a bubble on a screen protector, but I don't have one.  Anyone else have this problem?

    I have a flaw on the screen of my MacBook air .  It looks like a bubble on a screen protector, but I don't have one.  Anyone else have this problem?

    Hi, I had the exact same problem and searched Google for any suggestions. In the end I just very gently scratched it off.

  • Using Consumer Digital Camera for the Vision System

    I am evaluating whether I can use a consumer digital camera for the vision system. I wanted to see if anybody out there has done that in the past. The other question is: do consumer camera manufacturers provide driver software in order to retrieve the images directly from the camera to the computer?

    I think your suggestion is right; using a firewire camera is much more straightforward. But some consumer cameras have advantages too because of the very high resolution, especially the newer models of digital still cameras. If the cameras support the PTP protocol, then I have a LabVIEW driver for them. Check out more details at:
    http://www.hytekautomation.com/Products/PTPCam.html
    Irene He
    Bruce Ammons wrote in message news:<[email protected]>...
    > You would need a really, really good reason to use a consumer camera
    > instead of a machine vision camera. The hassle you would go through
    > to get drivers working and get everything else to work the way you
    > want is not worth it. You may pay more for a machine vision camera,
    > but it will work with a minimum amount of effort. Nowadays, the
    > firewire cameras are cost effective and you can put together a system
    > that doesn't cost much more than your consumer camera would.
    >
    > Bruce

  • Launch code targeted to compact vision system

    Following the NI tutorial on "Easy Ways to Launch Code Targeted to the Real-Time Controller," I am attempting to use the 'System Exec.vi' with an assembled command line in a 'LabVIEW for Windows' program on the host such that I can skip the steps associated with opening LabVIEW on the real-time target and manually downloading the code I want to run on the real-time target.
    As such, in the LabVIEW for Windows host program, I use the command line to feed into the 'system exec.vi' with the exact following syntax (including quotations):
    "C:\Program Files\National Instruments\LabVIEW 7.1\LabVIEW.exe" -target 192.168.10.15 "C:\Program Files\National Instruments\ROTH LABVIEW\DAPC-NICC_71\RT_grab_images.vi"
    I perform this step just prior to calling by reference the 'RT_grab_images.vi' to acquire a set of images using the compact vision system and transfer these back to the host. However, I get an error 7, file not found (I am assuming it is the 'RT_grab_images.vi' file that is not found). If I have already manually downloaded the 'RT_grab_images.vi' (which negates the advantage of using 'system exec.vi' with the command line shown), everything works fine.
    Can someone comment on this?
    Thanks,
    Don
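As a side note on the command line itself, the quoting rule is that each path containing spaces gets its own pair of quotes. A small Python sketch (a hypothetical helper, shown only to illustrate the string format passed to System Exec.vi):

```python
def build_system_exec_command(labview_exe, target_ip, vi_path):
    # Wrap each path that may contain spaces in its own pair of quotes,
    # matching the syntax used with System Exec.vi in the post above.
    return f'"{labview_exe}" -target {target_ip} "{vi_path}"'

cmd = build_system_exec_command(
    r"C:\Program Files\National Instruments\LabVIEW 7.1\LabVIEW.exe",
    "192.168.10.15",
    r"C:\Program Files\National Instruments\ROTH LABVIEW\DAPC-NICC_71\RT_grab_images.vi",
)
print(cmd)
```

This only assembles the string; whether the VI is actually present on the target (the error 7 issue discussed below the question) is a separate matter.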

    OK, this worked, and with one of the NI engineers (Michael Haag) we discovered a way to programmatically execute this procedure using System Exec.vi and a script file. For example, see the attached.
    Below is more information from Michael for the case where the VI you need to launch has subVIs, and where you need to launch from directories other than C:\.
    "Don,
    Alright, with a little more persistence and work I think we now have a
    working solution. I realized that the problem was that after copying the
    VI over to the CVS, the VI was not in memory and was never "downloaded."
    This means that all of the necessary subVIs (like all the standard LabVIEW
    VIs from vi.lib) were never loaded onto the CVS and therefore the VI would
    not execute when we used VI server, unless the VI had already been loaded
    into the CVS' memory by targeting it.
    To solve this problem, all we need to do is build a library (.llb) file for
    our program that includes all subVIs, even those from vi.lib. This can be
    done in LabVIEW, by going to the File menu and selecting Save With Options.
    Make sure the "Include vi.lib files" option is checked and then create the
    distribution. I have attached a new screenshot of my block diagram where
    you can see that now I reference the main VI through the llb file (see the
    path in the block diagram). In this case I placed it in the startup
    folder, but that should not be a requirement. I also modified my script
    file to take into account these changes.
    I have rebooted the CVS several times to make sure nothing is in memory and
    ran this new code. This appears to correct that last problem and should
    get you moving forward with your application. Let me know how this goes
    for you, and hopefully we have found a solution here.
    Regards
    Michael Haag
    Applications Engineer
    National Instruments"
    "Don,
    The error you are receiving when trying to call directory "C:\Program
    Files\National Instruments\ROTH LABVIEW\DAPC-NICC_71" is definitely a
    result of the DOS command prompt not handling the spaces and long
    file/directory names. If you want to use that directory then you'll
    probably have to figure out what the 8 character representation of those
    files and directories are. For example if you want to get into C:\Program
    Files\ then it will most likely be C:\Progra~1\. If you go to a DOS
    command prompt (In windows go to Start >> Run and type in COMMAND), and
    then type "dir /x" to display the directories, then the /x will have it
    show the 8-character name for the directories and files.
    Attachments:
    rt_script.txt ‏1 KB
    system_exec_to_launch_ftp_script.vi ‏12 KB

  • How to measure diameter using vision system

    sir
             I am new to vision system, i want to measure diameter of a cylinder please send some example vi's

    Hello ksr,
    You can use a function called Clamp to determine the distance between two edges in an image.  If you have any specific questions regarding this, please respond and provide more details regarding your project.
    Regards,
    Jasper S.
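To illustrate what a Clamp-style edge-to-edge measurement does conceptually, here is a rough NumPy sketch over a single scan line (a synthetic intensity profile stands in for a real image row; a real application would use the IMAQ Clamp function and a calibrated mm-per-pixel factor):

```python
import numpy as np

def clamp_distance(profile, threshold):
    """Find the first and last pixels brighter than `threshold` along one
    scan line and return the distance between those two edges, in pixels.
    Converting to real units (the cylinder diameter in mm) would apply a
    calibration factor afterwards."""
    above = np.nonzero(profile > threshold)[0]
    if above.size < 2:
        raise ValueError("fewer than two edges found")
    return int(above[-1] - above[0])

# Synthetic scan line: a bright cylinder spanning pixels 70..129
# on a dark background.
line = np.zeros(200)
line[70:130] = 255
print(clamp_distance(line, 128))  # 59 pixels between the two edge pixels
```

In practice you would average several scan lines (or use the built-in rake of search lines in Clamp) to reduce noise sensitivity.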

  • I have a customer looking for a high-speed vision system (500 fps). Has anyone been successful (or not) with this kind of application with NI hardware?

    Stu

    The digital boards can handle these high frame rates as long as the bytes/sec is within the specs of the board, as Bruce mentioned. The question is still what you want to do with these images. You mentioned memory storage; how much memory will it take? The 1424 can have 80 MB of onboard memory, so you could store it all there and then bring it to the host computer at your convenience. If you bring it directly to the host computer, keep in mind the maximum sustained bandwidth of the PCI bus is about 100 MB/s, and if you have 3 digital boards outputting data, this could be a limiting factor (overhead of the three boards communicating on PCI may reduce bandwidth also). Make sure you have plenty of RAM to hold all this data if you do bring it directly to the host computer. Graftek is very good at recommending specific cameras, and Robert Eastlund with www.graftek.com (an imaging alliance member with NI) has done these types of applications before and could give you more specific advice.
    So in summary, our boards should have no problem with high frame rates as long as the cameras meet our specs (a pixel clock of 40 MHz max, and the new 1424s have a 100 MB/s max data transfer).
    Hope this helps,
    Brad Buchanan
    National Instruments
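The bus-budget arithmetic above is simply width x height x bytes-per-pixel x frame rate. A quick sketch (the 256 x 256 8-bit resolution is a made-up example, not the customer's actual camera):

```python
def data_rate_mb_per_s(width, height, bytes_per_pixel, fps):
    """Sustained data rate a camera pushes across the bus, in MB/s
    (decimal megabytes, matching the ~100 MB/s PCI figure above)."""
    return width * height * bytes_per_pixel * fps / 1e6

# Hypothetical 500 fps camera, 256 x 256, 8-bit mono:
rate = data_rate_mb_per_s(256, 256, 1, 500)
print(rate)  # 32.768 MB/s, well inside a ~100 MB/s PCI budget for one board
```

Three such cameras would still fit nominally, but as noted above, PCI arbitration overhead eats into the headroom, so the per-board onboard memory is the safer route at these rates.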

  • Different Final Cut systems outputting audio at different levels.

    I recently got a new Mac Pro to update my Final Cut rig.
    I have 2 BetaSP machines that are set so -20db tone coming out of FCP hits their VU meters at 0db.
    Today I swap out the machines, and the new machine is putting out -20 at a much lower level. So I figure, adjust the machines, go about your business.
    Problem is, I have 1000 or so video files that have been mixed with the old system, and while the Tone output from the new FCP is lower, the audio from these files is not.
    I've gone through all the settings I can find, and nothing seems to hint that the audio output is adjustable.
    Any help is greatly appreciated.

    Tony Bullard wrote:
    I recently got a new Mac Pro to update my Final Cut rig.
    congratulations
    I have 2 BetaSP machines that are set so -20db tone coming out of FCP hits their VU meters at 0db.
    Today I swap out the machines, and the new machine is putting out -20 at a much lower level. So I figure, adjust the machines, go about your business.
    In the previous line you mean 'The new machine' is the new MacPro?
    And with 'adjust the machines' you mean the BetaSP's?
    please be more specific.
    Problem is, I have 1000 or so video files that have been mixed with the old system, and while the Tone output from the new FCP is lower, the audio from these files is not.
    So you are saying that when you send out a -20db tone out of FCP (on the new MacPro) is coming out lower level than the old mac? And that video files come out on the same audio level on both Macs?
    It's digital, so should be the same all the way. Somewhere a setting is wrong.
    I've gone through all the settings I can find, and nothing seems to hint that the audio output is adjustable.
    It is. Depends what you are looking at.
    And what your hardware setup is. How are you going from FCP to the BetaSP deck?
    So what's your system setup.
    How are you comparing?
    Are you aware of the Master slider in the audiomixer? (Option 6). I always keep that on 0. It only applies to sequences or clips that you have opened at that very moment in the Viewer or the Canvas. Is it changed in level somewhere?
    Depending on the hardware setup:
    Did you check the audio output from the Mac under System Preferences?
    Or if you use a Decklink card for output did you check the Audio Processing settings in the Apple > System Preferences > Decklink settings?
    Or what other setup do you have?
    Rienk
