sbRIO 9602 3D model (CAD, Altium)

Hi all,
I wonder whether an sbRIO 9602 3D model is available that can be loaded into Altium, so I can check the correspondence between the sbRIO and a custom board.
Thanks in advance.

We supply example daughter card templates in Multisim/Ultiboard for all the various sbRIO platforms, and they have been checked and verified by R&D. I believe that if you have purchased the sbRIO Developer's SDK you have access to additional mechanical CAD drawing files in the docs, but I don't believe they are formatted specifically for Altium.
- Pat N

Similar Messages

  • DIO connector name/vendor on sbRIO 9602

    Hello,
    I'm working with Altium Designer on a PCB on which I have to place a female connector that mates with one of the four DIO connectors on the sbRIO 9602 board.
    Can you please tell me what this type of connector is called and whether NI provides a library that can be imported into Altium?
    Thanks.
    MR

    These two links could be useful; the first one gives information about the layout:
    http://digital.ni.com/hardref.nsf/websearch/d922bf5c635e4ed7862574970065ab34
    while the second one gives the connector model:
    http://www.ni.com/pdf/manuals/374991c.pdf
    You may also find
    http://forums.ni.com/t5/Circuit-Design-Suite-Multisim/Exporting-to-Altium/m-p/1973313
    interesting for your problem.

  • Maximum data rate RT host can handle (sbRIO 9602)

    Hi all, I have an sbRIO 9602.
    In my application the FPGA reads data from a custom module at 1 MHz, and I'm wondering whether
    the RT host has the capability to read that data and display it on a graph without losing samples.
    I want to know this before looking at how to implement any buffered/producer-consumer/DMA transfer, because
    if the RT host simply can't manage this data rate I will sooner or later get a buffer overflow.
    What I've thought of so far is to copy 1000 contiguous samples into a memory in a one-shot way, read that memory from the RT host, and display it on a waveform graph.
    But if there is a way to show the acquired data continuously, that is obviously better.
    Thanks in advance.
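    As a rough, language-neutral illustration of the producer/consumer idea mentioned above (Python here rather than LabVIEW, and every name below is hypothetical): the acquisition side pushes fixed-size blocks into a bounded queue while the display side drains it at its own pace, so a brief stall in the consumer applies back-pressure instead of silently overflowing a buffer.

        # Sketch of a producer/consumer pipeline (illustrative only; the real
        # producer would be the DMA FIFO read on the RT host).
        import queue
        import threading
        import time

        BLOCK_SIZE = 1000                       # samples per transfer, as in the one-shot idea above
        data_queue = queue.Queue(maxsize=100)   # bounded queue: back-pressure instead of unbounded growth

        def producer():
            """Simulates copying 1000-sample blocks out of the acquisition buffer."""
            for block_index in range(50):
                block = [block_index] * BLOCK_SIZE   # placeholder for real samples
                data_queue.put(block)                # blocks if the consumer falls behind
                time.sleep(0.001)                    # 1000 samples at 1 MHz is about 1 ms per block

        def consumer():
            """Simulates the display/logging loop running at its own rate."""
            for _ in range(50):
                block = data_queue.get()
                # update the graph or write to disk here
                data_queue.task_done()

        t1 = threading.Thread(target=producer)
        t2 = threading.Thread(target=consumer)
        t1.start()
        t2.start()
        t1.join()
        t2.join()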

    Hi, thank you all for your suggestions.
    In my RT host I read 10,000 values every cycle, so the loop rate drops to 1 MHz / 10,000 = 100 Hz, I think.
    What I want to do now is detect peaks in my data and build a histogram of those peaks in RT.
    I've placed a peak detector VI that takes my array of 10,000 values as input and generates an array containing only the peaks, but I don't understand how to use the Histogram PtByPt VI. It works on a single data value at a time, so to build a meaningful histogram I would have to wait 10,000 cycles, and that may not keep up with the 100 Hz rate.
    Previously I tried to detect the peaks and generate the histogram directly on the FPGA to be as fast as possible, given the 1 MHz acquisition, but now that I work in 10,000-value chunks I wonder whether I can do this on the RT host side.
    Any suggestion will be really appreciated.
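    As an illustration of one way around the point-by-point limitation (a minimal Python sketch, not LabVIEW, with an assumed bin range and hypothetical names): each 10,000-sample chunk is reduced to its peaks and those peaks are added into a running bin-count array, so the histogram stays current at the 100 Hz chunk rate instead of advancing one value per cycle.

        # Sketch: accumulate a histogram chunk by chunk (bin range/count are assumptions).
        import numpy as np

        N_BINS = 100
        BIN_RANGE = (0.0, 10.0)                  # hypothetical signal range
        running_counts = np.zeros(N_BINS, dtype=np.int64)

        def update_histogram(peaks):
            """Add the peaks found in one 10,000-sample chunk to the running histogram."""
            global running_counts
            counts, _ = np.histogram(peaks, bins=N_BINS, range=BIN_RANGE)
            running_counts += counts

        # Example: one simulated chunk's worth of detected peak values.
        chunk_peaks = np.random.uniform(BIN_RANGE[0], BIN_RANGE[1], size=37)
        update_histogram(chunk_peaks)
        print(running_counts.sum())              # total peaks accumulated so far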

  • Best option to mass deploy application to sbRIO-9602

    Hi,
    I need to deploy a LabVIEW RT application to a large number of sbRIO-9602s. What would be the best way of going about this?
    All RIOs will have an identical setup.
    The software drivers on the RIO need to be upgraded to the latest version.
    FPGA bitfiles uploaded.
    FTP folders created and files uploaded.
    Any specific resources I should be looking at?
    Solved!
    Go to Solution.

    Take a look at this.
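    (The linked resource above is the usual route, typically NI's replication/imaging tooling. Purely as an illustration of the "FTP folders created and files uploaded" step, and not an NI-endorsed method, a script along these lines could push the same files to a list of targets; all hosts, credentials, and file names below are placeholders.)

        # Sketch: copy the same files to several sbRIO targets over FTP.
        from ftplib import FTP, error_perm

        TARGETS = ["10.0.0.11", "10.0.0.12", "10.0.0.13"]   # placeholder addresses
        USER, PASSWORD = "admin", ""                        # placeholder credentials
        FILES = ["fpga_top.lvbitx", "config.ini"]           # placeholder file names
        REMOTE_DIR = "deploy"

        for host in TARGETS:
            with FTP(host) as ftp:
                ftp.login(USER, PASSWORD)
                try:
                    ftp.mkd(REMOTE_DIR)          # create the folder if it does not exist yet
                except error_perm:
                    pass                         # already there
                ftp.cwd(REMOTE_DIR)
                for name in FILES:
                    with open(name, "rb") as fh:
                        ftp.storbinary("STOR " + name, fh)
                print(host + ": done")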

  • How to get the name of an sbRIO programmatically

    How to get "name" and "Comment" of sbRIO which I write to sbRIO-9602 in MAX in Identification box (Model, Serial number, MAC address, Name)?
    Solved!
    Go to Solution.

    That shows:
     - IP settings
          - IP address:     136...etc.
          - DNS name:       MyMaster5     - THAT IS WHAT I NEED, just in the wrong place - in MAX it is not under IP settings, which is why I didn't search for it there.
          - subnet mask:    255.255.255.0
          - gateway:        136...
          - DNS server:     136...
     - MAC address:         00800...
     - serial number:       16....
     - system state:        Running
     - model name:          sbRIO-9602
     - model code:          7373
     - password-protected restarts?    T/F
     - halt if TCP/IP fails?           T/F
     - locked?                         T/F
     - use DHCP?                       T/F
    Still there is no "Comment" as in MAX, but that's not so important now.
    I have LabVIEW Development System 2009 SP1 Professional and Real-Time Development 2009 SP1.
    Thanks very much!

  • How to get measurement data from an IMU using an sbRIO over SPI?

    Hi All,
    I'm trying to use the SPI communication protocol to bridge between my VectorNav VN100 IMU chip and LabVIEW through the sbRIO 9602. I amended the SPI sample code from the NI website for my own device and configuration, and it compiled and ran well. The problem is that the IMU we are using needs a 16-byte command and gives back a response containing the measurement data, but it seems that with the code I have I can send a command of at most 32 bits at a time. I hope someone who has experience implementing an IMU in LabVIEW can give me some help! Thanks a lot!
    Nick

    Kyle,
    Thank you very much!
    I've attached the amended program and the manual for the IMU I am using.
    Since I only need one port, I just amended the configuration for port 1 and adjusted the code for my device.
    I think the problem lies in the data-sending mechanism; you can check the requirement on pages 12 and 13 of the manual.
    Best Regards!
    Nick
    Attachments:
    Vectornav VN100.pdf ‏1774 KB
    spi_dual_port_example.zip ‏755 KB
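    If the FPGA SPI engine in the example really is limited to 32 bits per transfer, one common workaround is to keep chip select asserted and clock the 16-byte command out as a sequence of 32-bit words, reading the response back the same way. Whether that is acceptable to the VN100 depends on its framing requirements, so treat the following Python sketch of the byte/word bookkeeping as an assumption-laden illustration only (all names are hypothetical).

        # Sketch: split a 16-byte SPI command into four 32-bit words (MSB first).
        # This assumes the FPGA code holds chip select low across all four transfers.

        def pack_words(command_bytes, word_size=4):
            """Group a byte string into big-endian words of word_size bytes each."""
            assert len(command_bytes) % word_size == 0
            return [int.from_bytes(command_bytes[i:i + word_size], "big")
                    for i in range(0, len(command_bytes), word_size)]

        command = bytes(range(16))               # placeholder 16-byte command
        words = pack_words(command)
        print([hex(w) for w in words])           # four 32-bit values to send back to back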

  • Asynchronous serial input with an sbRIO FPGA

    Background:
    As part of my capstone project, I'm trying to read data transmitted serially from an IMU. The host is an sbRIO 9602.
    As far as I'm aware, the protocol is not exactly standard: data is sent asynchronously in packets. Each packet consists of 12+ bytes in immediate sequence, each having a start and stop bit, and then the line goes idle [high] until the next packet. Each data byte is preceded by a frame bit, and only contains 7 bits of actual data, so the packet has to be further processed to get actual useful data.
    I've seen the FPGA serial I/O example program floating around. The code I inherited was actually based on it, but it's overly complex for this application, and I'm not convinced it would actually work for this protocol at all. I couldn't get it to work, at any rate. I rewrote the sampling code in its entirety twice trying to get it to work, but haven't made a lot of progress. On the bright side, the current VI is much simpler and more straightforward than my first attempt...
    The problem:
    I can read the first 70 or so bits of a packet fine, then the program skips a bit. That throws off the start/stop bits and basically renders everything after it meaningless. The screenshot (not reproduced here) showed the data as read in, in order from top to bottom.
    I'm fairly certain this means my sampling interval isn't perfect [this suggests about 1.4% too long], but I'm totally stumped on how to avoid it. What's worse, this is actually on the lowest possible output setting from the IMU, 2400 baud. Realistically, we're hoping to have it working at either 230.4k or 460.8k baud.
    The prior version of my code had the packet being read in 10-bit [1 byte] chunks, processing, then reading the next chunk. I encountered exactly the same error. I assumed that having too much code in the sampling process was causing the timing to be thrown off, so I changed it to read off the entire packet into a bit array and then process it afterward [while no data is coming in]. I've attached this version of the code for reference. It's cleaner, but no change as far as the error is concerned.
    Am I correct in my evaluation, or is there something else going on? More to the point, is there a way of fixing or working around the problem so that I can get reliable samples [even at 100-200x the bit rate]?
    As an aside, I've only been working with LabVIEW for a couple weeks; please tell me if I'm using poor habits or doing anything incorrectly.
    Any help will be immensely appreciated. Thank you.
    Attachments:
    IMU_serial_in.vi ‏61 KB

    Hi Ryan,
    I have a suggested methodology, but I don't currently have any example code to share that would get you started.
    The challenge is that even if you sample at exactly the right baud rate for your incoming signal, the phase of the FPGA clock will not be exactly the same as that of the source signal. Add to that the fact that your sample frequency and baud rate will always be slightly different, and you get the sampling drift effect you described, where data is eventually clocked in wrong. On short transmissions this may not be a problem because the sampling can be re-aligned with a start bit, but for long, continuous streaming it eventually fails as the sampling and source signals drift out of phase.
    I would suggest over-sampling the DIO line, using a debounce filter if necessary, and use a measured time between edge detections to constantly adjust your sampling period and phase to keep your sampling aligned with the incoming data.
    The LabVIEW code I imagine would be a state machine inside a single-cycle timed loop. Essentially, the state machine would detect edges that occur near the baud rate you expect to receive, and then adjust the sampling period to ensure you are sampling the data between transitions, while the incoming waveform is stable.
    With this method running at 40 MHz, you would have roughly 43 clock ticks (samples) per bit period at 921.6 kbps, so you should be able to pull out the right samples at the right time in the waveform.
    Hope this helps, and if I find a good example of this, I'll send it your way.
    Cheers,
    Spex
    National Instruments
    To the pessimist, the glass is half empty; to the optimist, the glass is half full; to the engineer, the glass is twice as big as it needs to be...
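    A rough software model of the resynchronising sampler described above (written in Python for readability; the real implementation would be a single-cycle timed loop on the FPGA, and the clock/baud figures are taken from the post above, not measured): hunt for the falling edge of each start bit, then sample every following bit in the middle of its bit period, so clock error can only accumulate within one frame before the next start bit re-aligns the phase.

        # Sketch: oversampled receiver that re-aligns on every start bit.
        # Assumes a 40 MHz sample clock and 921.6 kbaud, i.e. about 43 ticks per bit.

        SAMPLE_CLOCK_HZ = 40_000_000
        BAUD = 921_600
        TICKS_PER_BIT = round(SAMPLE_CLOCK_HZ / BAUD)        # ~43

        def decode(samples, bits_per_frame=10):
            """samples: list of 0/1 line readings at SAMPLE_CLOCK_HZ. Returns one bit-list per frame."""
            frames = []
            i = 1
            while i + bits_per_frame * TICKS_PER_BIT < len(samples):
                # hunt for the idle(1) -> start(0) falling edge of the next frame
                if not (samples[i - 1] == 1 and samples[i] == 0):
                    i += 1
                    continue
                # phase is re-established here, so drift only accumulates within this frame
                frame = [samples[i + TICKS_PER_BIT // 2 + b * TICKS_PER_BIT]
                         for b in range(bits_per_frame)]
                frames.append(frame)
                i += bits_per_frame * TICKS_PER_BIT          # skip past this frame, then hunt again
            return frames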

  • How should I install/set up a single copy of Windows 7 to run through Boot Camp and Parallels 7 on my new MacBook Pro?

    I just bought a new MacBook with 8 GB of RAM and a 750 GB hard drive and want to be able to run Windows 7 through Boot Camp and Parallels. How do I set that up and install a single copy of Windows? (I want to be able to use heavy programs - Photoshop, 3D modeling CAD, etc. - by installing them once and using them through either Parallels 7 or Boot Camp.) Please let me know of anything that might be a red flag with this approach, and give clear, concise instructions on which to do first and what settings to use for Boot Camp and Parallels.

    Boot Camp boots your computer directly into Windows for full hardware access and performance, just like a PC. It's free from Apple.
    https://www.apple.com/support/bootcamp/
    Virtual machine software (paid) like Parallels and VMware Fusion can take the Boot Camp Windows installation and run a copy of it in a window inside OS X, at the same time as you use OS X, but with less performance. It's usually easier to use.
    A free virtual machine option is VirtualBox; it might not have all the bells and whistles of the paid options above, but it works just fine.
    We can't provide detailed installation instructions here; it's too much. You will have to read Apple's instructions and the manual for your virtual machine software.
    http://manuals.info.apple.com/en_US/boot_camp_install-setup_10.7.pdf

  • Can't open iTunes 6

    When I download iTunes 6 it opens really fast and I can't click anything. I have tried installing and reinstalling it about 5 times and it still doesn't work.
    PC   Windows XP  

    I know what you mean, and I am not feeling the love for Apple corporation right now. I can get iTunes to open in msconfig (but of course no internet, and thus no Music Store, while in msconfig). After 2 hours one night and another hour the next night with Apple on the phone, they have decided that it isn't their responsibility to correct the problem since it opens with msconfig, meaning that there is a conflict with some other 3rd-party app that prevents iTunes from opening. My argument here is that iTunes version 4 worked fine (and worked fine after I reinstalled it when version 6 would not work... unfortunately version 4 will not work with 5th-gen. iPods). Why is it that all of my 3D modeling/CAD software vendors will work with me when there is a problem like this until it is fixed, and Apple won't? Now I can't open any of my MS Office products and many internet connections won't work. I have everything backed up, so I'm going to have to reformat because nothing is working anymore. Best of luck with iTunes 6, but be warned: if it opens with msconfig, then Apple will do NOTHING to help you.

  • Opening and editing Protel99SE projects

    Hi all,
    is it possible to open and edit Protel99SE projects in Ultiboard 12.0?
    I'm a beginner in this field, and I just want to know how to handle a Protel99SE project so I can edit it and add a connector to the PCB
    to interface an sbRIO 9602 through its DIO connectors.
    Thanks in advance.
    MR

    Hi,
    Ultiboard does have support for some Protel files. You can open .pcb and .ddb files. In Ultiboard, go to File >> Open, select the file in question, and see if you can open it.
    Hope this helps.
    Regards,
    Tayyab R,
    National Instruments.

  • All that jitters is not gold . . .

    I have dead-ended.
    I recently purchased a Wacom Intuos3 tablet. When I try to use it, the cursor has a severe case of the jitters. It also randomly causes popups to appear, as if I were holding down the "control" key, even though none of the keys on the tablet or pen are set for "control". It is worst in my 3D modeling & CAD apps (the primary reason I got it), but it will still do it with only the Finder running. In CAD, it's bad enough to cause the coordinates to change at 1/16 inch resolution. The cursor is fine when the tablet is on (USB) but it is not active. It is only when I use the pen or Wacom mouse on the tablet that it occurs. My bluetooth mouse appears to have no effect on it.
    The techs at Wacom have been very helpful, but they're stuck, too. They believe it is some kind of RFI, first suggesting that it was possibly too close to my display (which does not seem to be the case), and now they think it is coming from the computer itself. At this point, all they can suggest is putting a USB card in the computer. I also believe it is some kind of environmental RFI, because how I hold the pen or touch the tablet affects the severity of the jitter. I tried it on a friend's dual G5 (although hers is Intel, mine PPC) and it worked fine. And no, it's not practical to bring hers here to see if it still works.
    I have the most recent driver (as does she). I have sat in the dark with everything in the room turned off. I unplugged the cordless phone. I have turned off the bluetooth and unmounted and turned off my external firewire drives. I have no wireless networking. I have had no mice or keyboards connected. I have plugged the tablet into different USB ports. The only things on in the room are the computer, the Apple cinema display and the dsl modem (about 30" away). I'm in a ranch house sitting on a concrete slab. The nearest commercial radio transmitter is about 6 miles away. Everything in the house that I can think of that could possibly cause RFI has been turned off (including two lights on dimmers at the other end of the house). I plugged it into the front port and moved the tablet 8' away from the desk and it still jitters just as much (if it were the computer causing it, the interference should have dropped by a factor of about 16). The only thing that slows it down, but doesn't eliminate it, is if I put my bare foot on the computer case - I'm assuming I'm draining some of the RFI I'm channeling to the tablet.
    So . . . Does anyone know if the computer can be a source of that much RFI, or of something else in my environment that may be causing it.
    Thanks.
    dual 2.5 G5   Mac OS X (10.4.8)  

    Hello, all! The saga continues . . . .
    Well, I'm closer to isolating the problem from my recent machinations, but let me first address your latest round of suggestions.
    Daniel:
    Not ground loops. Right now, my setup is such that the recording equipment is only connected to the computer by optical cable (I/O). The computer analog out runs to an amplifier, which I had pulled. And I know people are checking in, because the "views" count keeps going up. Thanks.
    japamac:
    Manual-read(?) meter. No home security system. Water meter is remote-read, but they drive by & ping it. And it is possible: I felt like I was "probed" the other night. And we refer to my next-door neighbor as "The Alien". He's OK, but he REALLY keeps to himself, and sometimes he does things that are toward the edges of the bell curve. Whenever we have random interference on the TV, we say he's trying to contact the mothership. As for Big Brother, I'm sure I was on lists in the 60's & 70's as a "known associate". Years ago, when my sister would call, we would sometimes get strange clicks & noises on the line, so we would just start talking to the FBI agents. They might just be using more sophisticated bugs.
    DaddyPaycheck:
    I've tried it in the dark with everything turned off. Today, everything in the room was unplugged except the computer and display, everything was disconnected from the computer, fans & humidifier in the house turned off, fridge unplugged, etc., and I waited until the furnace cycled off. The house was dead. It jittered.
    So here's what I did (after I turned the house back on). I took things to a neighbor's a couple of doors down. Her power comes from a pole about 150' from mine, but it's about 3/4-1 mile away on the grid. That side of the neighborhood goes out about 4x more often than this side. Once last summer, they were out for about 36 hours and we flickered a couple of times. It worked fine. So maybe it's interference on the line.
    I take it to the next-door neighbor's (not the Alien); the house between. The two poles are on the back corners of their property. 2 years ago when they remodeled, they had the service drop moved to "my" pole, and have missed out on maybe half a dozen power outages. I set it up and . . . . it works fine. It's not line interference.
    {SIDEBAR: The neighbor was quite impressed with the Mac (she runs her medical transcription service over the internet), is now quite unenchanted with Dell, and will probably be converted before the year is out.}
    I lug the stuff home and set it up in the workshop, the farthest place in the house away from "the office" (and not really a healthy environment for a computer). The jitter is slight, but it's there. I move the rig to this end of the house, but set up in the hall down from the office. More jitter. I set it back up, reconnect everything, do a little troubleshooting . . . St. Vitus' Dance. It's the room. (And yes, it is the end of the house near the Alien's)
    I pondered a bit and came up with a plausible, yet bizarre, reason for the interference. I'm going to play this one close to the vest, because it will take me awhile to set things up to test my thesis. Hopefully tomorrow.
    So, thanks again to everyone, and I'm still entertaining suggestions. I could very well be dead wrong again.
    dual 2.5 G5   Mac OS X (10.4.8)  

  • New User...maybe

    Hi,
    my name is Giuseppe Falcone and I develop applications for CAD/CAM programs.
    What I need is to create a tool that, starting from CAD files, shows those files (their geometry) in a PDF and, most of all, lets me animate individual components.
    I've seen this done with the Acrobat 3D Toolkit, but what I need to know is:
    * Which license do I have to buy to create the tool (one that does everything: export, PDF creation, and animation)?
    * Can I embed the PDF (the result) in my tool?
    * Which license will my customer, the end user, need to run my application? Can they use it for free?
    Thanks a lot, bye,
    Peppe.

    Hi
    There already exists a tool like that. Instructor 3D allows the creation of 3D animations based on your 3D models (CAD data). This tool also supports importing your animations into PDF files.
    For more information visit www.instructor3d.com
    Raphi

  • 9205 FPGA configuration in differential mode

    Hello,
    I'm currently working on a project based on an sbRIO-9602 and NI-9205 acquisition modules.
    In order to acquire 32 analog channels in differential mode, I bought two NI-9205 modules. However, I have a doubt about the FPGA configuration step:
    the values of channel 0, for example, are the differential between terminals AI0 and AI8. So should I simply read channel 0 (in software), or do I have to compute the difference AI0-AI8 in software myself?
    I hope that's clear,
    thanks in advance
    Guillaume
    Solved!
    Go to Solution.

    For everything related to wiring, see:
    http://www.ni.com/pdf/manuals/374188d.pdf
    For everything related to using the cRIO, see:
    http://www.ni.com/pdf/manuals/372596b.pdf
    To find these links, you can type 9205 into the search on ni.com; you will find your module (third link). After selecting that link you will find a Resources tab that contains help on using this module.
    Regards.
    Nacer M. | Certified LabVIEW Architect

  • Quadro K3100 for Adobe pp cs6

    G'day Adobe people,
    I'm sure someone's probably asked this somewhere, but I couldn't find the thread.
    I am upgrading to CS6 Production Premium and also need to upgrade my laptop accordingly.
    I was looking at the top-of-the-line 15" Apple Retina: great specs, except for what I believe is an inadequate card (GT 750)
    for an AUD $3,300+ computer. I was also looking at mobile workstations like the 17" HP ZBooks; they look great and on paper are
    more powerful. The one I'm considering has a Quadro K3100 4 GB card.
    Now I know the old adage: if gaming, go GTX; if modelling, CAD, multi-layering, etc., go Quadro.
    If I could configure this ZBook with a GTX 780 I would, but they only offer Quadros.
    All I'm doing is video editing, rendering Full HD files, possibly 2K (REDCODE) at most.
    Should the card mentioned still be more than adequate in speed/performance for such tasks?
    Is there anyone out there (3D animators, etc.) with experience with this question?
    Regards

    Welcome to the forum.
    I have not seen anything (yet) that would make you go Quadro. Unless you need 10-bit output with a very expensive 10-bit external monitor, you really want to go with the GeForce GTX series, as Eric said.
    If you want a real off-the-shelf video-editing laptop, you have to forget the "workstation" category and go for the "gaming" category. One of the best of the current generation for you to look at is something like the brand-new (release date 2014-4-7, very soon, at $3000) ASUS G750JZ-XS72. It has a 17.3-inch HD display, a GTX 880M with 4 GB of video RAM, 32 GB of regular RAM, and a 1 TB 7200 rpm hard drive plus two 256 GB SSDs, with an i7-4700HQ.
    Configured properly and plugged into AC power it would be among the best quad-core laptops available. And because of the two-SSD RAID it would be better than my G750JW laptop, which is currently the highest-scoring laptop on our Premiere Pro BenchMark (PPBM6). You will have to register on our site to see the results.
    From your greeting I guess you are in Australia.
    MSI's GT70 DominatorPro is better because of its very high-end CPU; the i7-4930MX is one step better and only $1000 more than the ASUS above.
    If you want a really high-end, custom-configured (up to 6-core) laptop, see ADK Video Editing for their 9000-series laptops and discuss it with Eric Bowen.

  • How hot does your Helix get under heavy load?

    I've had my Helix for a few days now and have finally got around to installing some software on it. I have the i7 with a 256 GB SSD and 8 GB of RAM. So far I've been very impressed with how 'snappy' it all runs, and besides a few annoyances with the touchscreen/pen not working correctly 100% of the time, I am very happy with the system.
    Anyway, I wanted to see how the Helix would run when pushed really hard, so I had SolidWorks running (3D modeling/CAD software) and Netflix going on an external monitor. The back of the tablet became super hot: hot enough that you would definitely not want to touch it (let alone try to hold it) or have it on your lap. I installed TP Fan Control and it read that the CPU got up to 86 degrees Celsius. Granted, this is not representative of normal operating conditions, but I'm concerned about how hot it got. Has anyone else noticed their Helix running really hot under heavy load? Is this normal?
    Solved!
    Go to Solution.

    ryanpm wrote:
    I'm in the same boat as you on this. It can get very bad sometimes. Under light use, it never gets as cool as I feel it should. I go through the currently running applications and try to run as minimally as possible, but it still creates a fair amount of heat. My battery life is around 6 hours with both batteries too. I definitely don't get the 8-9 hours it's supposed to be getting.
    I think the 10-hour figure was for the i5 model. It's still five times longer than my X201t with a 4-cell battery, so I'm happy with this. Shorter battery time is something you have to live with if you are going to use an i7 machine on the go.
    I also heard that the RTM version of the Helix actually IS a revised version of the initial batch (or the previewers' batch)... I heard the first machines were truly HOT.
    [Added] Also, I think long-term heat damage will be of less concern in the case of the Helix. The industry already has many years of slate tablets with standard CPUs that did have problems with heat (namely, the Asus EP121, which had its LCD turn yellow because of heat, and the Samsung Slate 7, which had serious throttling issues in its early days). If Lenovo has the intelligence to learn from that history, which I want to believe they do, they probably will have dealt with it, considering this machine may actually be a second revision. And even if I don't believe Lenovo, I still trust Yamato Lab.
    Kim
    W540( ), Helix(Sold), Tablet 2(Sold) Tablet 2( ), X120e(Sold), X201 Tablet, X41 Tablet(Sold), X41, X32*4, 701C*4
