PXI-8420

Can a PXI-8430/8 be used to replace a PXI-8420/8 in an automated test bench? I have existing software written in LabVIEW for a PXI-8420/8 for which I do not have source code. I only have an executable.

Hi bsexton,
The only thing you might have to change between the PXI-8420/8 and the PXI-8430/8 would be the ports. Is the executable set up to allow you to select ports? In any case, the PXI-8420/8 is discontinued, and the recommended replacement is the PXI-8430/8.
Andy K.
Applications Engineer
National Instruments

Similar Messages

  • Considering the PXI-8420/16 as a solution for running one application to communicate with multiple devices.

    I am developing an application that needs to communicate through a serial interface with several devices under test at the same time. Would the PXI-8420/16 be a solution?

    If the NI serial cards do not do concurrent communication, then how do they work? I have a similar application but the communication doesn't necessarily need to be concurrent. It just cannot be sequential. Maybe the NI serial cards will suit my needs but I need an explanation of their operation before I can determine that.

  • Identification of a PXI-8420

    Hi,
    I am using RS-232 cards like the PXI/PCI-8420/2 in some (not all) of our systems. To identify which system it is, I have to find out which card is available. I can see the card in MAX, but I haven't found a way to detect it from software.
    - How can I identify from my software whether such a card is available in the system?
    - Can I distinguish in software between a PXI-8420/2 and a PXI-8430/8?
    Thanks in advance
    Oliver

    The reason this is not so easy is that DAQ boards are accessed through an NI driver, but the COM ports of the 84xx devices are accessed through operating-system functions (even VISA doesn't talk to the ports directly, but through the Windows API). With DAQ it's easy for us to implement a function to identify devices, but this feature is not available for serial devices.
    Here are some other ideas:
    1. Open and close VISA sessions to a list of ports in an initialization step in your program. If you don't receive an error, you know that the port is present.
    2. Check the registry for instances of the devices you are looking for. I don't have an 84xx device here, but I would expect it to show up somewhere under HKEY_LOCAL_MACHINE\SYSTEM\ControlSet001\Control\Class\.
    Find out where the key for your hardware is located and use the Windows Registry Access VIs from the Advanced palette to read it.
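    As a quick illustrative sketch of idea 1, here is how the open/close probing could look in Python using the pyvisa package (an assumption on my part; in LabVIEW you would use the VISA Open/Close VIs, and the resource names below are examples that differ per system):

    ```python
    # Sketch of idea 1: probe the VISA serial resources and report which
    # ones open successfully. Assumes pyvisa and an installed VISA
    # implementation (e.g. NI-VISA); ASRL resources map to COM ports on
    # Windows. Purely illustrative, not the poster's actual code.
    import pyvisa

    rm = pyvisa.ResourceManager()

    for resource in rm.list_resources("ASRL?*::INSTR"):
        try:
            session = rm.open_resource(resource)
            session.close()
            print(f"{resource}: present")
        except pyvisa.errors.VisaIOError as err:
            print(f"{resource}: not available ({err.abbreviation})")
    ```

    Idea 2 could be sketched the same way with Python's winreg module, walking the device class keys under the path above.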
    Best regards,
    Jochen Klier
    National Instruments Germany

  • Need std and non-std baud rates on PXI-8420 (16 channel RS232)

    I found a similar request in the knowledge base, and the resolution was that NI might make a card on a case-by-case basis.
    I need the aforementioned PXI card to support standard and non-standard baud rates (i.e. not a simple crystal change, where the software sets up for 57.6 kbaud but the hardware actually runs at 62.5 kbaud) that are easily configured in LabVIEW/VISA.
    The rates I need are as follows:
    All std rates (i.e. 1200,2400,4800,9600,...)
    10.4K (possible with std ports, but unsure about NI hardware and VISA).
    62.5K
    Thanks.

    The golden rule in serial (UART) communication is that if two communicating partners are within ±2% of each other, they'll happily communicate error-free. Since you often only know one side of the equation, the window narrows to ±1%.
    Next, take a look at how baud rates are calculated. On most NI-Serial hardware, there is a 7.3728 MHz oscillator that is divided by 16 (to create what I'll call the 'base' frequency), and then divided again by a divisor latch. A divisor latch value of 1 yields 460800, 2 = 230400, 3 = 153600, etc. This is where the 'standard' baud rates come from.
    However, on some OSes, we divide the clock source again by 4 before it hits the divisor latch. This means 1 = 115200, 2 = 57600, etc. While this does create a set of 'standard' baud rates, it also limits the number of possible baud rates. This is something that will be fixed in the future.
    Now hopefully the following table will make sense. In systems that have a base of 460800, we support baud rates that are within ±1% of the following (baud rates marked with * are supported in 115200-base systems):
    110-9216*, 9404, 9600*, 9804, 10017, 10240, 10472*, 10716, 10971, 11239, 11520*, 11815, 12126, 12454, 12800*, 13165, 13552, 13963, 14400*, 14864, 15360, 15889, 16457*, 17066, 17723, 18432, 19200*, 20034, 20945, 21942, 23040*, 24252, 25600, 27105, 28800*, 30720, 32914, 35446, 38400*, 41890, 46080, 51200, 57600*, 65828, 76800, 92160, 115200*, 153600, 230400, 460800.
    10.4k is supported in both systems at the baud rate 10472.
    If you're running at 62.5k when set to 57.6k, then you're likely using an 8 MHz oscillator. In this case, you have a base of 500000, and with a divisor latch value of 48 you'll be running at 10416. This corresponds to a 'normal' baud rate setting of 9600. So if you set your modified card to 9600, you should get 10.416k.
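    To make the divisor arithmetic concrete, here is a small Python sketch of the calculation described above (the clock figures are the ones quoted in this post; the divisor search range is an arbitrary assumption):

    ```python
    # Illustrative check of the divisor-latch arithmetic described above.
    # Base frequencies per the post: 7.3728 MHz / 16 = 460800, or 500000
    # for a card fitted with an 8 MHz oscillator.

    def find_divisor(base_hz, target, tolerance=0.01, max_divisor=4096):
        """Return (divisor, actual rate) within +/-1% of target, or None."""
        for divisor in range(1, max_divisor + 1):
            rate = base_hz / divisor
            if abs(rate - target) / target <= tolerance:
                return divisor, rate
        return None

    print(find_divisor(460800, 10400))  # divisor 44 -> 10472.7, i.e. '10.4k'
    print(500000 / 48)                  # 10416.7, reachable on the 8 MHz card
    ```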

  • Programmatically identifying com ports by their physical port number

    I have a system with an NI PXI-8420/16 card. Once in a while a glitch in my PXI system occurs such that my physical RS-232 ports get reassigned different COM port numbers. Then I have to go into MAX and reassign my aliases to re-align my LabVIEW application with my port hardware. Is there a way to stop these glitches from happening? Or is there a way to programmatically identify a physical port's COM number?

    See if this helps: http://forums.ni.com/ni/board/message?board.id=170&message.id=490257#M490257

  • Poor PXI IO performance on Latitude E6410 with ExpressCard 8360

    Hello,
    I have a Dell Latitude E6410 with a Core-i5 M520 which is giving me very poor I/O performance when using an ExpressCard 8360 card to connect to a PXI rack.
    The sustained I/O rate appears to be about one third of what I can get using the same ExpressCard on a Dell Latitude E6400 (with a Core2Duo processor).
    I am using the A05 bios (latest at time of writing) on the E6410.
    Wade.

    I am running Windows XP (32 bit) sp3 in both cases.
    The E6410 has 4GByte of memory fitted.
    The E6400 has 2GByte of memory fitted.
    I have also used the same ExpressCard 8360 via a PXIe-to-ExpressCard adaptor in a desktop machine, with performance figures similar to the E6400's, i.e. much better than the E6410's.
    The Desktop Machine is an HP Compaq D7900 with 4GByte of memory, Core2Duo E8500 also running Windows XP sp3 (32 bit).
    Also, on the Desktop, I am running NI PXI Platform Services 2.3.2 and NI-Visa runtime version 4.3.
    On the E6410, I am running NI PXI Platform Services 2.5.2 and NI-Visa runtime version 4.6.
    I no longer have access to the E6400, so I am not sure what software versions were installed. However, they are unlikely to be newer than the versions installed on the E6410.
    Wade.

  • Choosing a PXIe controller for streaming 200 MBps

    Warning: This is a long post with several questions. My apologies in advance.
    I am a physics professor at a small liberal-arts college, and will be replacing a very old multi-channel analyzer for doing basic gamma-ray spectroscopy. I would like to get a complete PXI system for maximum flexibility. Hopefully this configuration could be used for a lot of other experiments such as pulsed NMR. But the most demanding role of the equipment would be gamma-ray spectroscopy, so I'll focus on that.
    For this, I will need to be measuring either the maximum height of an electrical pulse, or (more often) the integrated voltage of the pulse.  Pulses are typically 500 ns wide (at half maximum), and between roughly 2-200 mV without a preamp and up to 10V after the preamp.  With the PXI-5122 I don't think I'll need a preamp (better timing information and simpler pedagogy).  A 100 MHz sampling rate would give me at least 50 samples over the main portion of the peak, and about 300 samples over the entire range of integration.  This should be plenty if not a bit of overkill.
    My main questions are related to finding a long-term solution, and keeping up with the high data rate. I'm mostly convinced that I want the NI PXIe-5122 digitizer board, and the cheapest (8-slot) PXIe chassis. But I don't know what controller to use, or software environment (LabVIEW / LabWindows / homebrew C++). This system will likely run about $15,000, which is more than my department's yearly budget. I have special funds to accomplish this now, but I want to minimize any future expenses in maintenance and updates.
    The pulses to be measured arrive at random intervals, so performance will be best when I can still measure the heights or areas of pulses arriving in short succession.  Obviously if two pulses overlap, I have to get clever and probably ignore them both.  But I want to minimize dead time - the time after one pulse arrives that I become receptive to the next one.  Dead times of less than 2 or 3 microseconds would be nice.
    I can imagine two general approaches.  One is to trigger on a pulse and have about a 3 us (or longer) readout window.  There could be a little bit of pileup inspection to tell if I happen to be seeing the beginning of a second pulse after the one responsible for the trigger.  Then I probably have to wait for some kind of re-arming time of the digitizer before it's ready to trigger on another pulse.  Hopefully this time is short, 1 or 2 us.  Is it?  I don't see this in the spec sheet unless it's equivalent to minimum holdoff (2 us).  For experiments with low rates of pulses, this seems like the easiest approach.
    The other possibility is to stream data to the host computer, and somehow process the data as it rolls in.  For high rate experiments, this would be a better mode of operation if the computer can keep up.  For several minutes of continuous data collection, I cannot rely on buffering the entire sample in memory.  I could stream to a RAID, but it's too expensive and I want to get feedback in real time as pulses are collected.
    With this in mind, what would you recommend for a controller? The three choices that seem most reasonable to me are getting an embedded controller running Windows (or Linux?), an embedded controller running the LabVIEW Real-Time OS, or a fast interface card like the PCIe-8371 and a powerful desktop PC. If all options are workable, which one would give me the lowest cost of upgrades over the next decade or so? I like the idea of a real-time embedded controller because I believe any run-of-the-mill desktop PC (whatever IT gives us) could connect and run the user interface, including data display and higher-level analysis. Is that correct? But I am unsure of the lifespan of an embedded controller, and am a little wary of the increased cost and need for periodic updates. How are real-time OS upgrades handled? Are they necessary? Real-time sounds nice and all that, but in reality I do not need to process the data stream in a real-time environment. It's just the computer and the digitizer board (not a control system), and both should buffer data very nicely. Is there a raw performance difference between the two OSes available for embedded controllers?
    As for live processing of the streaming data, is this even possible? I'm not thinking very precisely about this (I would really have to just try and find out), but it seems like it could possibly work on a 2 GHz dual-core system. It would have to handle 200 MBps, but the data processing is extremely simple. For example, one thread could mark the beginnings and ends of pulses and do simple pile-up inspection. Another thread could integrate the pulses (no curve fitting or interpolation necessary, just simple addition) and store results in a table or list. Naively, I'd have not quite 20 clock cycles per sample. It would be tight. Maybe just getting the data into the CPU cache is prohibitively slow. I'm not really even knowledgeable enough to make a reasonable guess. If it were possible, I would imagine that I would need to code it in LabWindows/CVI and not LabVIEW. That's not a big problem, but does anyone else have a good read on this? I have experience with C/C++, and some with LabVIEW, but not LabWindows (yet).
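    For concreteness, here is the back-of-envelope arithmetic behind that estimate (treating each sample as 2 bytes is my assumption, based on the 200 MBps figure):

    ```python
    # Back-of-envelope cycle budget for the streaming idea above.
    # Assumptions: 2 bytes per sample (14-bit data in 16-bit words),
    # 200 MB/s sustained, 2 GHz dual-core CPU.
    bytes_per_second = 200e6
    samples_per_second = bytes_per_second / 2   # 100e6 samples/s

    cpu_hz = 2e9
    cycles_per_sample_one_core = cpu_hz / samples_per_second
    print(cycles_per_sample_one_core)       # 20.0 cycles on a single core
    print(cycles_per_sample_one_core * 2)   # 40.0 with both cores in play
    ```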
    What are my options if this system doesn't work out? The return policy is somewhat unfriendly, as 30 days may pass quickly as I struggle with the system while teaching full time. I'll have some student help and eventually a few long days over the summer. An alternative system could be built around XIA's Pixie-4 digitizer, which should mostly just work out of the box. I somewhat prefer the NI PXI-5122 solution because it's cheaper, has better performance and much more flexibility, and suffers less from vendor lock-in. XIA's software is proprietary and very costly. If support ends or XIA gets bought out, I could be left with yet another legacy system. Bad.
    The Pixie-4 does the peak detection and integration in hardware (FPGAs I think) so computing requirements are minimal.  But again I prefer the flexibility of the NI digitizers.  I would, however, be very interested if data from something as fast as the 5122 could be streamed into an FPGA-based DSP module.  I haven't been able to find such a module yet.  Any suggestions?
    Otherwise, am I on the right track in general on this kind of system, or badly mistaken about some issue?  Just want some reassurance before taking the plunge.

    drnikitin,
    The reason you did not find the spec for the rearm time of the 5133 is that the USB-5133 is not capable of multi-record acquisition. The rearm time is a spec for the reference trigger, and that trigger is used when fetching the next record. So every time you want to do another fetch, you will have to stop and restart your task.
    To grab a lot of data, increase your minimum record size. Keep in mind that you have 4 MB of onboard memory per channel.
    Since you will only be able to fetch one record at a time, there really isn't a way to use streaming. When you call fetch, it will transfer the amount of data you specify to PC memory through the USB port (up to 12 MB/s for USB 2.0, ideally).
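    To put rough numbers on that (a sketch only, using the 4 MB per channel and ideal 12 MB/s figures above):

    ```python
    # Rough fetch-time estimate from the figures quoted above: a full
    # 4 MB per-channel record moved over USB 2.0 at an ideal ~12 MB/s.
    record_bytes = 4 * 1024 * 1024
    usb_bytes_per_second = 12 * 1024 * 1024

    print(f"Full-record fetch: ~{record_bytes / usb_bytes_per_second:.2f} s")
    # -> ~0.33 s per fetch, plus the stop/restart overhead of the task
    ```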
    Topher C,
    We do have a digitizer that has onboard signal processing (OSP), which would be quicker than performing post-processing. It is the NI 5142, and it can perform a number of signal processing functions. It is essentially a 5122 but with built-in OSP. It may be a little out of your price range, but it may be worth a look.
    For more information on streaming, take a look at these two links (if you haven't already):
    High-Speed Data Streaming: Programming and Benchmarks
    Streaming Options for PXI Express
    When dealing with different LabVIEW versions, it is important to note that previous versions will be compatible with new versions, such as going from 8.0 to 8.5. Keep in mind that if you go too far back then LabVIEW may complain, but you still may be able to run your VI. If you have a newer version going to an older version, then we do have options in LabVIEW to save your VI for older versions. It's usually just one version back, but in LabVIEW 8.5 you can save for LabVIEW 8.2 and 8.0.
    ESD,
    Here is the link I was referring to earlier about DMA transfers. DMA is actually done every time you call a fetch or read function in LabVIEW or CVI (through NI-SCOPE).
    Topher C and ESD,
    LabVIEW is a combination of a compiled language and an interpreted language. Whenever you make a change to the block diagram, LabVIEW compiles itself; this way, when you hit run, it is ready to execute. During execution, LabVIEW uses the run-time engine to reference shared libraries (such as DLLs). Take a look at this DevZone article about how LabVIEW compiles its block diagram (user code).
    I hope all of this information helps!
    Ryan N
    National Instruments
    Application Engineer
    ni.com/support

  • Convert PXIe-8135 controller to dual-boot Windows 7 and LabVIEW RT

    Hello. I have a PXIe-8135 controller that originally was just running Windows 7. We are trying to convert it to a dual-boot system to also run LabVIEW Real-Time. (There is a host computer that will run LabVIEW 2014 with the RT module, and the controller will become a target.)
    I have created a FAT32 partition on the hard drive of the controller. Now, I’m trying to install the real-time OS with a USB flash drive made using the MAX utility, but I cannot boot using the USB drive for some reason. I keep getting the message “waiting for USB device to initialize”.  
    In BIOS, legacy USB support is [ENABLED] and boot configuration is set to [Windows/other OS]. I’ve tried removing the drive, waiting, and reinserting. I’ve tried two different USB drives (both 8 GB, different brands).
    I’m not sure what to do next. Apart from the USB boot issue, is converting the PXIe-8135 even possible?  I read about SATA/PATA hard drive issues with older controllers, but I don't know about this one.
    Thanks, in advance, for your help!
    -Jeff

    Per Siana's licensing comment, more information on purchasing a deployment license if you do not have one for this target can be found here.
    The RT Utility USB key is used to set up non-NI hardware with LabVIEW Real-Time, but you should not need it in this situation to convert to dual-boot (*). Try this:
    1. Since you already have a FAT32 partion created, go into BIOS setup and change to booting 'LabVIEW RT'.
    2. The system will attempt to boot LabVIEW RT, see that the partition is empty, and switch over into LabVIEW RT Safe Mode. (this safemode is built into the firmware, which is why you don't really need the USB key).
    3. The system should come up correctly and be detectable from MAX, and you can proceed with installing software.
    4. To switch back to Windows, go back to BIOS setup and choose 'Windows/Other OS'
    (*) One area where the USB key is helpful on a dual-boot system is with formatting the disk. If you want to convert from FAT32 to Reliance on the partition designated for LabVIEW RT, the USB key lets you attempt to format a single partition and leave the rest of the disk untouched. If you format from MAX, the standard behavior is to format only one RT partition if found; if not found, it will format the entire disk. Formatting from MAX on a dual-boot system is consequently riskier, and you could lose your Windows partition.

  • Start and Stop Trigger using PXI-6120 and DigitalStartAndStopTrigger.vi not working :-(

    Hello,
    I've been trying for a while now to get my PXI unit to capture a waveform between a start and stop (reference) trigger using the NI example DigitalStartAndStopTrigger.vi downloaded from the NI website. However, whilst the start and stop triggers seem to be working (i.e. the VI runs and stops at the correct times), there is never any data read from my DAQmx-compatible PXI-6120 card. So I can see the VI is running around the acquisition loop, but the property node AvailSampPerChan is always returning zero... this has me slightly puzzled. I thought this might just be a driver issue, so I've updated my box to the following software versions (see below) and installed the latest drivers, e.g. DCDNov07.exe (also from the NI site), but nothing has changed.
    My software as of now:
    LabVIEW 7.1 (with the 7.1.1 upgrade applied)
    Max 4.3.0.49152
    DAQmx 8.6.0f12
    Trad DAQ 7.4.4f7
    before I updated I had the same problem but with the following versions:
    LabVIEW 7.1 (with the 7.1.1 upgrade applied)
    Max 4.2.1.3001
    DAQmx 8.5.0f5
    Trad DAQ 6.9.3f4
    So to cut a long story short I still have the same problem with the triggers... does anybody have any ideas what is going wrong?
    To add insult to injury, the traditional DAQ example ai_start-stop_d-trig.vi was almost working correctly before I did the upgrade. It had the strange behaviour of capturing the AI0 channel but on the wrong edges (e.g. if I set Start on Rise and Stop on Fall, it would do the opposite: Start on Fall and Stop on Rise).
    I'm going to leave my box doing a mass compile overnight, but I'd really like it if someone could suggest a solution or point me in the right direction.
    Many thanks,
    Mike

    Hi Graham
    I'm out of the lab today but I'll try and answer your questions as best I can...
    1) What are the values you have set for Buffer size, Rate, samples per read and post trigger Samples?
    At the moment I have all the values (e.g. sample rate, buffer size etc) unchanged apart from the ones I mentioned in my previous post (see above). I have in the past played around with changing the buffer sizes and rates in the example VI but as this appeared to have no effect on the behaviour I now have them setup as in the download.
    2) Does the program end after the stop trigger is implemented?
    Yep, if I toggle the trigger line high then low I see the program exits the read loop and the VI stops running as expected.
    3) Lastly, can you give me the details of your triggering method? Are you using a digital train of user-set digital pulses? How long is the program running?
    I'm using the WriteDigChan.vi to manually toggle the first digital line of the PXI-6733 card, which is wired directly to PFI0 of the PXI-6120 card. Generally, I just start the VI running and then toggle the line high, wait a couple of seconds and then toggle it low.
    To me it all looks like it should be acquiring samples but as I said yesterday it just refuses to fill the buffer with any data (and hence no samples are read).
    Any ideas? And thanks for your help,
    Mike

  • Trouble capturing waveform from PXI-4472

    I'm really a very green newbie at this stuff, so bear with me...
    I've got a PXI-4472 data acquisition board and a PXI-5411 waveform generator. I've connected the arbitrary out of the 5411 to the channel 0 in on the 4472. An external oscilloscope shows a 1v-amplitude sine wave being generated.
    I created a very simple VI to show what the 4472 is capturing. It connects an NI-DAQ channel I generated to the standard "AI Acquire Waveform.vi", then out to a waveform chart, all within a while loop with a stop button. Problem is, all the waveform chart seems to show is the running average of the waveform instead of the waveform itself (a solid line, a tad above zero).
    I can hook the 4472 input channel up to a DC-out power supply, and when I vary the voltage, the waveform chart changes as well.
    So my question (whew!): What's wrong here that's not allowing me to capture a waveform from the 4472 (in turn from the 5411) and display it on my waveform chart?
    Thanks in advance for the help.

    Never mind.... it was a sample rate problem. I upped the sample rate and it came out ok.

  • Triggering PXI-4110 to measure a current value while HSDIO PXI-6552 is generating a waveform

    Hi,
    Some questions about using the PXI-4110 to measure current while the PXI-6552 is generating a waveform.
    1. Let's say I need to measure current at 3 points, i.e. while the PXI-6552 is generating sample 1000, 2000 and 3500. On the edge of samples 1000, 2000 and 3500, the PXI-6552 will send a pulse via a PFI line or via a PXI backplane trigger line. My question is: is it possible to trigger the PXI-4110 (hardware trigger or software trigger) to measure current values at these points?
    2. Let's say I need to measure the current at 0 ms (start of waveform generation by the PXI-6552), 1 ms, 2 ms, 3 ms, 4 ms... and so on, for 1000 points of measurement, with the code diagram as shown in the figure below. Is it possible for the VI "niDCPower Measure Multiple" to measure exactly at 1 ms, 2 ms, 3 ms...? How much time does it take to acquire one point of measurement with "niDCPower Measure Multiple"?
    Thanks for viewing this post. Your advice on hardware or software methods is much appreciated. Thanks in advance.
    Message Edited by engwei on 02-02-2009 04:24 AM
    Attachments:
    [email protected] 46 KB

    Hi engwei,
    1. Unfortunately, the 4110 does not support hardware triggering. Therefore you cannot implement direct triggering through the backplane or anything like that. However, there are a couple of possible workarounds you can try:
    a) Use software triggering: say your 6552 is generating in one while loop, and your 4110 is to measure in another while loop. You can use a software synchronization method like notifiers to send a notification to your 4110 loop when your 6552 has generated the desired sample. This method, however, will not be very deterministic, because the delay between the trigger and the response depends on your processor speed and load. Therefore, if you have other applications running in the background (like antivirus), they will increase the delay.
    b) Use hardware triggering on another device: if you have another device that supports hardware triggering (like an M-series multifunction DAQ module), you can configure this device to be triggered by a signal from the 6552, perform a very quick task (like a very short finite acquisition), then immediately execute the DCPower VI to perform the measurement. The trigger can be configured to be retriggerable for multiple use. This will most likely have a smaller time delay than the first option, but there will still be a delay (the time it takes to perform the short finite acquisition on the M-series). Please refer to the attached screenshot for an idea of how to implement this.
    2. To make your 4110 measure at specific time intervals, you can use one of the methods discussed above. As for how long it will take to acquire 1 measurement point, you may find this link helpful: http://zone.ni.com/devzone/cda/tut/p/id/7034
    This article is meant for the PXI-4130 but the 4110 has the same maximum sampling rate (3 kHz) and so the section discussing the speed should apply for both devices.
    Under the Software Measurement Rate section, it is stated that the default behavior of the VI is to take an average of 10 samples. This corresponds to a maximum sampling rate of 300 samples/second. However, if you configure it to not do averaging (take only 1 sample) then the maximum rate of 3000 samples/second can be achieved.
    It is also important to note that your program can only achieve this maximum sampling rate if your software loop takes less time to execute than the actual physical sampling. For example, if you want to sample at 3000 samples/second, taking one sample takes 1/3000 seconds, or about 333 microseconds. If your software execution time is less than 333 microseconds, then you can achieve this maximum rate (because the speed is limited by the hardware, not the software). However, if your software takes more than 333 microseconds to execute, then the software loop time will define the maximum sampling rate you can get, which will be lower than 3000 samples/second.
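    As a rough illustration of that last point, here is a small Python sketch (the 500 microsecond loop time is a made-up example value):

    ```python
    # Sketch of the rate-limiting logic above: the achievable sampling
    # rate is set by whichever is slower, the hardware sampling period
    # or the software loop iteration time.
    hardware_rate = 3000.0                  # samples/s (4110 max, no averaging)
    hardware_period = 1.0 / hardware_rate   # ~333 microseconds per sample

    software_loop_time = 500e-6             # hypothetical 500 us loop iteration

    effective_rate = 1.0 / max(hardware_period, software_loop_time)
    print(f"Effective rate: {effective_rate:.0f} samples/s")  # 2000, not 3000
    ```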
    I hope this answers your question.
    Best regards,
    Vern Yew
    Applications Engineer, NI ASEAN
    Attachments:
    untitled.JPG 18 KB

  • Problems performing offset null and shunt calibration in NI PXI-4220

    I am using a 350 Ω strain gage for the measurements, and I have already created a task in MAX. When I try to perform the offset null in the task, the program shows a progress bar and the LEDs on the 4220 board start flashing, but when the progress bar stops, MAX hangs. It has been impossible for me to perform the offset null. What can I try?
    What would be the correct values for the parameters, besides the gage parameters, for the strain measurements?

    Hello,
    Thank you for contacting National Instruments.
    Usually when this problem occurs, it is due to incorrect task configuration or an incorrectly matched quarter-bridge completion resistor. Ensure that you have the correct strain configuration chosen. The default is Full Bridge I. If you only have a single strain gauge in your configuration, you will need to change your configuration. Also, if you are using a quarter-bridge completion resistor, make sure that it is 350 Ω, not 120 Ω. If the resistor is 120 Ω, you will more than likely not be able to null your bridge.
    Please see the PXI-4220 User Manual for more information about your configuration and signal connections: http://digital.ni.com/manuals.nsf/websearch/F93CCA9A0B4BA19B86256D600066CD03?OpenDocument&node=132100_US
    Also, you can download and install the latest NI-DAQ 7.2 driver: http://digital.ni.com/softlib.nsf/websearch/50F76C287F531AA786256E7500634BE3?opendocument&node=132070_US
    The 7.2 driver has a signal connections tab displayed when configuring your DAQmx task, which shows you how to correctly connect your signals.
    Regards,
    Bill B
    Applications Engineer
    National Instruments

  • Memory upgrade on PXI-8105 and PXI-8106 controllers

    Hi,
    I've recently upgraded the memory of three PXIs; one with a PXI-8105 controller and two with PXI-8106 controllers. Both the 8105 and 8106 can take a maximum of 4 GB (2x2 GB) of DDR2-667 (PC2-5300) memory (see links below). However, on all three systems, both the BIOS and the OS only see 3.3 GB. Any idea why this might be the case?
    I've tried flashing the BIOS (v1.4 on both PXIs), but with no success.
    We're using COTS memory (i.e. not bought from NI) but I'd be hard pushed to believe that that is the cause of the problem.
    Thanks.
    Links;
    Max memory capacity of PXI-8105: http://sine.ni.com/nips/cds/view/p/lang/en/nid/202630
    Max memory capacity of PXI-8106: http://sine.ni.com/nips/cds/view/p/lang/en/nid/203442
    BIOS upgrade page: http://digital.ni.com/public.nsf/allkb/9C9362590B05CD6E86256B270082164A

    Are these controllers running Windows? If so, this is normal expected behavior. I think it might have something to do with the fact that Windows reserves the rest of the memory for driver addressing (could be totally wrong there). The same thing happens on my Dell desktop PC with 4GB of memory in it.
    Jarrod S.
    National Instruments

  • How can I get the conditioned output from the PXI-1520 in a PXI-1011 combined chassis?

    Respected Sir,
    I am using the PXI-SCXI combined chassis PXI-1011 for my application. I have placed three SCXI-1520 modules, a PXI-7352 motion card and a PXI-6052E DAQ card in the combined chassis. As you know, the 1520 and the PXI-6052E are connected internally through the SCXI backplane, which is not user-accessible. Now I need the conditioned output of the 1520 to be used as an analog input for the motion control card PXI-7352. How can I do that? Could the PXI-1180 solve my problem? If so, how do I connect the PXI-1180 to the PXI-1011?
    Kindly clarify me as soon as possible.
    Thanking you,
    Ramkumar. D

    Dear Sir,
    I have already placed my DAQ card at the correct place and configured it. I need some more clarification from you. I have attached my Query in .txt format.
    Kindly reply as soon as possible.
    Thanks,
    Ramkumar. D
    Attachments:
    Clarification.txt 2 KB

  • How to connect a Compact RIO to a PXI System

    Hello,
    I want to connect my cRIO-9074 with my PXI System.
    First of all my Hardware configuration:
    1. "PXI-1036" Chassis with "PXI-8101" Embedded Controller and "PXI-8231" Ethernet
    2. "cRIO 9074" Integrated 400 MHz Realt-Time Controller
    3. "NI 9144" EtherCAT slave chassis
    The PXI system is connected to my network through the Ethernet port on the "PXI-8101" controller. The "NI 9144" is connected to the "cRIO-9074" through their EtherCAT ports. That works without a problem.
    Now I want to connect my "cRIO-9074" to my PXI system through the Ethernet port on the "PXI-8231", but I can't find a way to get this to work.
    In MAX I configured the second Ethernet port on the PXI as "TCP/IP" with an IP address and subnet. Then I connected the cRIO and turned it on.
    But I can't find the cRIO anywhere in MAX or in a LabVIEW Real-Time project.
    When I configure the "PXI-8231" as an EtherCAT port, I can connect my "NI 9144" to it and use it in a LabVIEW project. But configured as Ethernet and connected to the "cRIO-9074", it doesn't work.
    Is there any way to get this working? Or is this not possible with my hardware?
    I know I could connect the cRIO and PXI through a switch to my network and then use both in a LabVIEW Real-Time project. But I want to build a mobile measurement station with as few devices as possible. If there is no other way I will use a switch, but without one would be better.
    Thanks
    Daniel Löffler

    Hi Daniel,
    it is not possible to connect the cRIO-9074 controller to the PXI system through the Ethernet port, because the RT cRIO controller cannot behave as a slave. A cRIO controller is always configured as a master, never as a slave. The extension chassis NI 9144 is configured as a slave, so you can use it with your cRIO controller or with a PXI controller as well. A chain of PXI controller, cRIO controller and NI 9144 is not possible.
    Best regards,
    ENIA
    NI Germany
