NI PXI-CardBus 8310

How can I install the NI PXI-CardBus 8310 driver on Windows 7?
(I tried the NI PXI-CardBus 8310 Version 1.0 installer, but the installation failed.)

Dear wwwg,
This is Terao from the National Instruments Japan technical support department.
Thank you for your continued use of the discussion forums.
The NI PXI-CardBus 8310 you are currently using (NI PXI-CardBus 8310 Driver Software 1.0) does not officially support Windows 7; please use it in a Windows XP environment.
That said, compatibility between the PXI-CardBus 8310 and a PC depends heavily on the PC's BIOS design, so we publish a list of PCs (including those tested in-house) on which the CardBus 8310 has been confirmed to work:
"What Laptops Have Been Tested with the NI PXI-CardBus 8310 PXI Controller?"
http://zone.ni.com/devzone/cda/tut/p/id/5036
The current model, the NI PXIe-ExpressCard8360, does support Windows 7:
http://sine.ni.com/nips/cds/view/p/lang/ja/nid/202665
Best regards.
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+
 Junichi Terao | Applications Engineer | National Instruments Japan Corp.
 Support information: http://www.ni.com/support/ja
 KnowledgeBase: http://www.ni.com/kb
 Address: Nomura Fudosan Shibadaimon Bldg. 8F/9F, 1-9-9 Shibadaimon, Minato-ku, Tokyo 105-0012
 Toll-free support line: 0120-527196
+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+-+

Similar Messages

  • PXI 8310 Cardbus doesn't work with HP compaq nx9105 laptop

    Hi,
I have to install an NI PXI 8310 CardBus on an HP Compaq nx9105 (which is a tested laptop) with an AMD Athlon 64 processor, 350 MHz, 1 GB RAM, Win XP Pro. The PCMCIA controller is a "Texas Instruments PCI-1620 CardBus Controller with UltraMedia".
I followed the instructions in the Installation Guide: after installing the CardBus driver I shut down the computer, inserted the CardBus 8310, and when I powered up an error message appeared:
"One or more card was not configured in your system. If you are seeing this message for the first time, please use the NIDocker icon on your taskbar, select 'Reset Bridge Settings', and reboot the system. If the issue persist etc..."
I performed these actions, but after rebooting the same error arises.
In Device Manager (Control Panel) I can see the situation shown in the attachment. The PCMCIA controller doesn't work, and the device properties report that "the specified device generates a resource conflict (code 12)".
Please advise.
    Regards
    Pietro
    Attachments:
gest perif x Connessione.JPG 89 KB

    Hi Pietro,
Based on the error you are seeing, it might be caused by an improper installation of the driver or controller. The installation instructions specify that once the driver is installed you must reboot the laptop twice before installing the 8310. If you complete only one reboot, the driver may not finish its configuration, which later corrupts the installation of the 8310. Therefore, try the following:
1. Disconnect the 8310 from the laptop.
2. Uninstall the driver.
3. Re-install the driver.
4. Reboot the laptop twice - after you reboot once there should be a prompt asking you to reboot again.
5. Install the 8310.
If this does not resolve the problem, run PCIScope (http://www.drivertools.net/Products/Utilities/EngineeringUtilities/PCIScope.htm), a third-party tool (a 14-day free trial is available), and send me a capture of the diagnostics. PCIScope evaluates and debugs the PCI bridge/subsystems.
I have seen the list of laptops compatible with the NI CardBus, but your PC was tested with a different controller, so it is possible that the laptop is incompatible because of that controller. With the PCIScope capture we can better determine the root cause of the problem. If that is indeed the case, we will revise our documentation as to which laptops with which controllers are supported.
    MarcoC
    NI Italy

  • Fifo communication problem shifted data

    Hello,
Currently, a laptop equipped with an NI CardBus 8310 communicates with an FPGA module that controls an industrial device. However, we have to transfer this program to a more powerful computer. The data sent to the FPGA is transferred through a FIFO.
Unfortunately, we ran into several communication problems while using the main program and the FPGA program:
- We get several error messages, such as:
-52007 (the most frequent): "Called another software component", which happens when we try to run the FPGA program and when we stop the main program.
61046: a clock error which occurs at Close FPGA VI Reference.
We tried to isolate these problems by creating a small program (without the FIFO) to test the communication between the powerful computer and the FPGA. It worked.
- Then we created a program (on the computer) including the FIFO communication to test with the FPGA program. It roughly worked, but the data were shifted. (This program runs on the laptop but not on the computer.)
- Then we created a new project just to test the communication. It's a simple program on the FPGA: when we click the OK button, it increments variables and transmits them to the host. In the host VI, when we press a button, we tell the FPGA "OK button = true", enabling the host to read the data from the FPGA. This program uses the same FIFO method as the main program, but it doesn't work. In debug mode we can see the variables incrementing, but in real-time mode the program blocks in the FIFO read.
- Note that we have two versions of LabVIEW: LabVIEW 8.6 on the laptop and LabVIEW 2009 on the computer.
Hoping that you will be able to help us,
PS: in all test programs we implemented the FIFO the same way as in the first program, which works on the laptop with LabVIEW 8.6.

Hello,
Thank you for your answer.
I understand your answer, but our problem concerns the FIFO. The program runs well on a laptop, but when we run it on another computer the data from the FIFO are shifted. We don't understand why.
Moreover, we wrote another program with FIFO communication, and it blocked during the FIFO read; we had to stop it with the abort button. Do you know why?
We tested the FPGA program in simulation and it worked normally.
Best regards
Mathieu

  • Is there a driver for the PXI-8310 that works with Windows 7?

    The only one I can seem to find is for XP.

    Hello LouisTI,
Unfortunately, there is no Windows 7 driver for the PXI-CardBus 8310, as this is a legacy device that is currently going EOL (end of life). The XP driver for the 8310 (which you may already have found) can be downloaded here:
    http://joule.ni.com/nidu/cds/view/p/id/370/lang/en
    If using Win 7 is critical, we recommend the PXI-ExpressCard 8360 as a replacement for the 8310:
    http://sine.ni.com/nips/cds/view/p/lang/en/nid/202294
    Hope that helps!
    James M.  |  Applications Engineer  |  National Instruments

PXI-8360, PXIe-1078 not identified in MAX

Hello,
I have a PXI-1078 chassis containing PXI modules (4110, 4130, 4132, 2532), controlled by a laptop through a PXI-8360.
My problem is that in Device Manager the standard PCI-to-PCI bridge shows no problem, but in MAX, when I try to add a chassis, the "remote controller" list shows an NI PXI 8310 instead of the 8360. On top of that, my PC does not detect the PXI modules present in the chassis as new devices in Device Manager.
What can I do to resolve this problem?
Note that I have applied all the updates available from NI support:
MAX --> 5.0
NI PXI Platform Services --> 3.0.1
NI VISA --> 5.1.1
plus the August driver updates.
Thanks in advance
TheShadowx
Solved!
Go to Solution.

Hello,
And thank you for posting here.
What is your OS, your LabVIEW version and the model of your computer?
Did you respect the order of installation for our products?
http://digital.ni.com/public.nsf/allkb/779E54A45478FA2C86256D0500774FCB?OpenDocument
Is your BIOS up to date? Updating it can fix some PXI problems. If your BIOS is already updated, you can try the NI MXI-Express BIOS Compatibility Software 1.4, which is still in beta but works fine. You can find it here:
http://digital.ni.com/betaprogram/mainbetacust.nsf/main.htm
Here are two links to other posts where the problem is quite similar; they contain useful links:
http://forums.ni.com/t5/PXI/Express-card-MXI-laptop-compatibility-1033-and-8360/m-p/697994?requireLo...
http://forums.ni.com/t5/NI-Applications-Engineers/PXIe-1073-via-MXIe-8360-not-visible-in-MAX/td-p/14...
    Regards,
    Jérémy C.
    National Instruments France
Hands-on introduction to LabVIEW and measurement
October 2-23, throughout France

PXI switching boards prevent the PC from booting

    Hi,
I have a PXI-1044 with some PXI boards (PXI-4065 / 6509 / 2529 / 8431 / 8432 / 2576 (3 boards))
and the PXI-8310 controller.
When the system was configured for the first time, it worked OK,
but two of the boards (PXI-8432 / PXI-2576) had to swap slots (because of space problems with their connectors).
When the system is powered on, both LEDs (power on the PXI chassis, and the controller LED) light up, but Windows XP cannot boot;
the controller LED switches off, and the computer reboots, again and again.
I removed all the boards and checked that all of them are recognized through MAX except the two boards that were swapped.
I moved the PXI-8432 to another slot (never used before) and the system seems to work OK;
this board is recognized.
What is the right procedure for interchanging boards when MAX has a previous configuration?
I would like to use the slots I am having problems with; is that possible?
    thanks
    Solved!
    Go to Solution.

Instead of deleting this thread, or trying to, we can always cross-link threads.
Continued here.
    - Partha
    LabVIEW - Wires that catch bugs!

  • Blackberry 8310 not showing reply or send in sms text options menu

I have a BlackBerry 8310. I am able to receive SMS text messages, but when I click on Options there is no "Reply" option in the menu for me to reply to the text. The SIM works fine in other phones, so something in the phone is not set right. There is also no option for me to compose a fresh SMS text message. Please help?

    Here ya' go:
    1. Get a good backup of your device: BB device backup (I keep all my backups in a separate folder created for that purpose so they are always in one place and easy to find.)
    2. Download the most recent version of device software for your carrier: Downloading BB device software
    3. Reinstall the device software: Clean reload of device software
    4. Pull the battery out with the device ON. Wait about a minute and put the battery back in. Let the BB completely restart.
    5. Once you have the device OS successfully installed you will then need to reinstall any 3rd party apps you currently have on your device: Install applications using the app loader tool
    6. Complete another battery pull routine after each 3rd party app installation to reclaim memory and refresh your BB.
    Let us know how it goes.
    IrwinII
    Please remember to "Accept as Solution" the post which solved your thread. If I or someone else have helped you, please tell us you "Like" what we had to say at the bottom right of the post.

Poor PXI IO performance on Latitude E6410 with ExpressCard 8360

    Hello,
I have a Dell Latitude E6410 with a Core-i5 M520 which is giving me very poor IO performance when using an ExpressCard 8360 to connect to a PXI rack.
The sustained IO rate I can get is about 1/3 of what I get using the same ExpressCard on a Dell Latitude E6400 (with a Core2Duo processor).
I am using the A05 BIOS (the latest at the time of writing) on the E6410.
    Wade.

    I am running Windows XP (32 bit) sp3 in both cases.
    The E6410 has 4GByte of memory fitted.
    The E6400 has 2GByte of memory fitted.
I have also used the same ExpressCard 8360 via a PXIe-to-ExpressCard adapter in a desktop machine, with performance figures similar to the E6400's, i.e. much better than the E6410's.
The desktop machine is an HP Compaq D7900 with 4 GB of memory and a Core2Duo E8500, also running Windows XP SP3 (32-bit).
    Also, on the Desktop, I am running NI PXI Platform Services 2.3.2 and NI-Visa runtime version 4.3.
    On the E6410, I am running NI PXI Platform Services 2.5.2 and NI-Visa runtime version 4.6.
I no longer have access to the E6400, so I am not sure what software versions were installed. However, they are unlikely to be newer than the versions installed on the E6410.
    Wade.

  • Choosing a PXIe controller for streaming 200 MBps

Warning: this is a long post with several questions. My apologies in advance.
I am a physics professor at a small liberal-arts college, and will be replacing a very old multi-channel analyzer used for basic gamma-ray spectroscopy. I would like to get a complete PXI system for maximum flexibility. Hopefully this configuration could be used for a lot of other experiments, such as pulsed NMR, but the most demanding role of the equipment would be gamma-ray spectroscopy, so I'll focus on that.
    For this, I will need to be measuring either the maximum height of an electrical pulse, or (more often) the integrated voltage of the pulse.  Pulses are typically 500 ns wide (at half maximum), and between roughly 2-200 mV without a preamp and up to 10V after the preamp.  With the PXI-5122 I don't think I'll need a preamp (better timing information and simpler pedagogy).  A 100 MHz sampling rate would give me at least 50 samples over the main portion of the peak, and about 300 samples over the entire range of integration.  This should be plenty if not a bit of overkill.
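For what it's worth, those sample-count estimates are easy to check; this is just the arithmetic from the figures quoted above (100 MS/s rate, 500 ns pulse width, ~3 us window), nothing vendor-specific:

```python
# Sanity check of the sample-count estimates quoted above.
rate = 100e6          # 100 MHz sampling rate, in samples/s
pulse_fwhm = 500e-9   # pulse width at half maximum, 500 ns
window = 3e-6         # ~3 us total integration window

samples_on_peak = rate * pulse_fwhm    # samples across the main peak
samples_in_window = rate * window      # samples across the whole window
print(round(samples_on_peak), round(samples_in_window))  # -> 50 300
```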
My main questions are related to finding a long-term solution, and keeping up with the high data rate. I'm mostly convinced that I want the NI PXIe-5122 digitizer board, and the cheapest (8-slot) PXIe chassis. But I don't know what controller to use, or software environment (LabVIEW / LabWindows / homebrew C++). This system will likely run about $15,000, which is more than my department's yearly budget. I have special funds to accomplish this now, but I want to minimize any future expenses in maintenance and updates.
    The pulses to be measured arrive at random intervals, so performance will be best when I can still measure the heights or areas of pulses arriving in short succession.  Obviously if two pulses overlap, I have to get clever and probably ignore them both.  But I want to minimize dead time - the time after one pulse arrives that I become receptive to the next one.  Dead times of less than 2 or 3 microseconds would be nice.
    I can imagine two general approaches.  One is to trigger on a pulse and have about a 3 us (or longer) readout window.  There could be a little bit of pileup inspection to tell if I happen to be seeing the beginning of a second pulse after the one responsible for the trigger.  Then I probably have to wait for some kind of re-arming time of the digitizer before it's ready to trigger on another pulse.  Hopefully this time is short, 1 or 2 us.  Is it?  I don't see this in the spec sheet unless it's equivalent to minimum holdoff (2 us).  For experiments with low rates of pulses, this seems like the easiest approach.
    The other possibility is to stream data to the host computer, and somehow process the data as it rolls in.  For high rate experiments, this would be a better mode of operation if the computer can keep up.  For several minutes of continuous data collection, I cannot rely on buffering the entire sample in memory.  I could stream to a RAID, but it's too expensive and I want to get feedback in real time as pulses are collected.
    With this in mind, what would you recommend for a controller?  The three choices that seem most reasonable to me are getting an embedded controller running Windows (or Linux?), an embedded controller running Labview real-time OS, or a fast interface card like the PCIe8371 and a powerful desktop PC.  If all options are workable, which one would give me the lowest cost of upgrades over the next decade or so?  I like the idea of a real-time embedded controller because I believe any run-of-the-mill desktop PC (whatever IT gives us) could connect and run the user interface including data display and higher-level analysis.  Is that correct?  But I am unsure of the life-span of an embedded controller, and am a little wary of the increased cost and need for periodic updates.  How are real-time OS upgrades handled?  Are they necessary?  Real-time sounds nice and all that, but in reality I do not need to process the data stream in a real-time environment.  It's just the computer and the digitizer board (not a control system), and both should buffer data very nicely.  Is there a raw performance difference between the two OSes available for embedded controllers?
As for live processing of the streaming data, is this even possible? I'm not thinking very precisely about this (I would really have to just try and find out), but it seems like it could possibly work on a 2 GHz dual-core system. It would have to handle 200 MBps, but the data processing is extremely simple. For example, one thread could mark the beginnings and ends of pulses and do simple pile-up inspection. Another thread could integrate the pulses (no curve fitting or interpolation necessary, just simple addition) and store results in a table or list. Naively, I'd have not quite 20 clock cycles per sample. It would be tight. Maybe just getting the data into the CPU cache is prohibitively slow. I'm not really knowledgeable enough to make a reasonable guess. If it were possible, I would imagine that I would need to code it in LabWindows/CVI and not LabVIEW. That's not a big problem, but does anyone else have a good read on this? I have experience with C/C++, and some with LabVIEW, but not LabWindows (yet).
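To show how little work each pulse actually needs, here is a minimal, hypothetical NumPy sketch of the per-pulse step described above (threshold crossings mark pulse boundaries, plain summation does the integration). It says nothing about sustaining 200 MBps; all names are illustrative, not NI API calls:

```python
import numpy as np

def integrate_pulses(samples, threshold):
    """Return (start, end, integral) for each complete pulse above threshold.

    A pulse starts at a rising crossing of `threshold` and ends at the
    next falling crossing; its integral is the plain sum of the samples
    in between (no curve fitting or interpolation).
    """
    above = samples > threshold
    d = np.diff(above.astype(np.int8))   # +1 at rising edges, -1 at falling
    rises = np.flatnonzero(d == 1) + 1
    falls = np.flatnonzero(d == -1) + 1
    pulses = []
    for r in rises:
        later = falls[falls > r]
        if later.size == 0:
            break                        # pulse truncated at end of block
        f = later[0]
        pulses.append((int(r), int(f), float(samples[r:f].sum())))
    return pulses
```

Pile-up inspection could then be a second pass over the (start, end) pairs, flagging pulses separated by less than some minimum gap.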
What are my options if this system doesn't work out? The return policy is somewhat unfriendly, as 30 days may pass quickly while I struggle with the system and teach full time. I'll have some student help and eventually a few long days over the summer. An alternative system could be built around XIA's Pixie-4 digitizer, which should mostly just work out of the box. I somewhat prefer the NI PXI-5122 solution because it's cheaper, performs better, is much more flexible, and suffers less from vendor lock-in. XIA's software is proprietary and very costly. If support ends or XIA gets bought out, I could be left with yet another legacy system. Bad.
    The Pixie-4 does the peak detection and integration in hardware (FPGAs I think) so computing requirements are minimal.  But again I prefer the flexibility of the NI digitizers.  I would, however, be very interested if data from something as fast as the 5122 could be streamed into an FPGA-based DSP module.  I haven't been able to find such a module yet.  Any suggestions?
    Otherwise, am I on the right track in general on this kind of system, or badly mistaken about some issue?  Just want some reassurance before taking the plunge.

    drnikitin,
The reason you did not find the spec for the rearm time for the 5133 is that the USB-5133 is not capable of multi-record acquisition. The rearm time is a spec for the reference trigger, and that trigger is used when fetching the next record. So every time you want to do another fetch you will have to stop and restart your task.
To grab a lot of data, increase your minimum record size. Keep in mind that you have 4 MB of onboard memory per channel.
Since you will only be able to fetch one record at a time, there really isn't a way to use streaming. When you call fetch, it will transfer the amount of data you specify to PC memory through the USB port (up to 12 MB/s for USB 2.0, ideally).
Topher C,
We do have a digitizer with onboard signal processing (OSP), which would be quicker than performing post-processing: the NI 5142, which can perform several signal processing functions. It is essentially a 5122 but with built-in OSP. It may be a little out of your price range, but it may be worth a look.
For more information on streaming, take a look at these two links (if you haven't already):
High-Speed Data Streaming: Programming and Benchmarks
Streaming Options for PXI Express
When dealing with different LabVIEW versions, it is important to note that previous versions will be compatible with new versions, such as going from 8.0 to 8.5. Keep in mind that if you go too far back then LabVIEW may complain, but you still may be able to run your VI. If you are going from a newer version to an older version, LabVIEW does have options to save your VI for older versions. It's usually just one version back, but in LabVIEW 8.5 you can save for LabVIEW 8.2 and 8.0.
ESD,
Here is the link I was referring to earlier about DMA transfers. DMA is actually performed every time you call a fetch or read function in LabVIEW or CVI (through NI-SCOPE).
Topher C and ESD,
LabVIEW is a combination of a compiled language and an interpreted language. Whenever you make a change to the block diagram, LabVIEW compiles itself, so when you hit Run it is ready to execute. During execution, LabVIEW uses the run-time engine to reference shared libraries (such as DLLs). Take a look at this DevZone article about how LabVIEW compiles its block diagram (user code).
    I hope all of this information helps!
    Ryan N
    National Instruments
    Application Engineer
    ni.com/support

  • Convert PXIe-8135 controller to dual-boot Windows 7 and LabVIEW RT

Hello. I have a PXIe-8135 controller that originally was just running Windows 7. We are trying to convert it to a dual-boot system that also runs LabVIEW Real-Time. (There is a host computer that will run LabVIEW 2014 with the RT module, and the controller will become a target.)
    I have created a FAT32 partition on the hard drive of the controller. Now, I’m trying to install the real-time OS with a USB flash drive made using the MAX utility, but I cannot boot using the USB drive for some reason. I keep getting the message “waiting for USB device to initialize”.  
    In BIOS, legacy USB support is [ENABLED] and boot configuration is set to [Windows/other OS]. I’ve tried removing the drive, waiting, and reinserting. I’ve tried two different USB drives (both 8 GB, different brands).
    I’m not sure what to do next. Apart from the USB boot issue, is converting the PXIe-8135 even possible?  I read about SATA/PATA hard drive issues with older controllers, but I don't know about this one.
    Thanks, in advance, for your help!
    -Jeff
    Solved!
    Go to Solution.

    Per Siana's licensing comment, more information on purchasing a deployment license if you do not have one for this target can be found here.
    The RT Utility USB key is used to set up non-NI hardware with LabVIEW Real-Time, but you should not need it in this situation to convert to dual-boot (*). Try this:
1. Since you already have a FAT32 partition created, go into BIOS setup and change the boot configuration to 'LabVIEW RT'.
    2. The system will attempt to boot LabVIEW RT, see that the partition is empty, and switch over into LabVIEW RT Safe Mode. (this safemode is built into the firmware, which is why you don't really need the USB key).
    3. The system should come up correctly and be detectable from MAX, and you can proceed with installing software.
    4. To switch back to Windows, go back to BIOS setup and choose 'Windows/Other OS'
(*) One area where the USB key is helpful on a dual-boot system is formatting the disk. If you want to convert the partition designated for LabVIEW RT from FAT32 to Reliance, the USB key lets you attempt to format that single partition and leave the rest of the disk untouched. If you format from MAX, the standard behavior is to format only one RT partition if found; if none is found, it will format the entire disk. Formatting from MAX on a dual-boot system is consequently riskier, and you could lose your Windows partition.

Start and Stop Trigger using PXI-6120 and DigitalStartAndStopTrigger.vi not working :-(

    Hello,
I've been trying for a while now to get my PXI unit to capture a waveform between a start and stop (reference) trigger using the NI example DigitalStartAndStopTrigger.vi downloaded from the NI website. However, whilst the start and stop triggers seem to be working, i.e. the VI runs and stops at the correct times, there is never any data read from my DAQmx-compatible PXI-6120 card. So I can see the VI running around the acquisition loop, but the property node AvailSampPerChan always returns zero... this has me slightly puzzled. I thought this might just be a driver issue, so I've updated my box to the following software versions (see below) and installed the latest drivers, e.g. DCDNov07.exe (also from the NI site), but nothing has changed.
    my software as of now.
    Labview 7.1 (with the 7.1.1 upgrade applied)
    Max 4.3.0.49152
    DAQmx 8.6.0f12
    Trad DAQ 7.4.4f7
    before I updated I had the same problem but with the following versions:
    Labview 7.1 (with the 7.1.1 upgrade applied)
    Max 4.2.1.3001
    DAQmx 8.5.0f5
    Trad DAQ 6.9.3f4
    So to cut a long story short I still have the same problem with the triggers... does anybody have any ideas what is going wrong?
To add insult to injury, the traditional DAQ example ai_start-stop_d-trig.vi was almost working correctly before I did the upgrade. It had the strange behaviour of capturing the AI0 channel but on the wrong edges (e.g. if I set Start on Rise and Stop on Fall, it would do the opposite: Start on Fall and Stop on Rise).
I'm going to leave my box doing a mass compile overnight, but I'd really like it if someone could suggest a solution or point me in the right direction.
    Many thanks,
    Mike

    Hi Graham
I'm out of the lab today, but I'll try to answer your questions as best I can...
    1) What are the values you have set for Buffer size, Rate, samples per read and post trigger Samples?
    At the moment I have all the values (e.g. sample rate, buffer size etc) unchanged apart from the ones I mentioned in my previous post (see above). I have in the past played around with changing the buffer sizes and rates in the example VI but as this appeared to have no effect on the behaviour I now have them setup as in the download.
    2) Does the program end after the stop trigger is implemented?
    Yep, if I toggle the trigger line high then low I see the program exits the read loop and the VI stops running as expected.
3) Lastly, can you give me the details of your triggering method? Are you using a digital train of user-set digital pulses? How long is the program running?
I'm using WriteDigChan.vi to manually toggle the first digital line of the PXI-6733 card, which is wired directly to PFI0 of the PXI-6120 card. Generally, I just start the VI running, toggle the line high, wait a couple of seconds, and then toggle it low.
To me it all looks like it should be acquiring samples, but as I said yesterday it just refuses to fill the buffer with any data (and hence no samples are read).
Any ideas? And thanks for your help,
    Mike

  • Trouble capturing waveform from PXI-4472

    I'm really a very green newbie at this stuff, so bear with me...
    I've got a PXI-4472 data acquisition board and a PXI-5411 waveform generator. I've connected the arbitrary out of the 5411 to the channel 0 in on the 4472. An external oscilloscope shows a 1v-amplitude sine wave being generated.
I created a very simple VI to show what the 4472 is capturing. It connects an NI-DAQ channel I generated to the standard "AI Acquire Waveform.vi", then out to a waveform chart, all within a while loop with a Stop button. Problem is, all the waveform chart seems to show is the running average of the waveform instead of the waveform itself (a solid line, a tad above zero).
I can hook the 4472 input channel up to a DC-out power supply, and when I vary the voltage, the waveform chart changes as well.
    So my question (whew!): What's wrong here that's not allowing me to capture a waveform from the 4472 (in turn from the 5411) and display it on my waveform chart?
    Thanks in advance for the help.

    Never mind.... it was a sample rate problem. I upped the sample rate and it came out ok.

Triggering PXI-4110 to measure one current value while HSDIO PXI-6552 generates a waveform

    Hi,
Some questions about using the PXI-4110 to measure current while the PXI-6552 is generating a waveform.
1. Let's say I need to measure current at 3 points, i.e. while the PXI-6552 is generating sample 1000, 2000 and 3500. On the edge of samples 1000, 2000 and 3500, the PXI-6552 will send a pulse via a PFI line or via a PXI backplane trigger line. My question is: is it possible to trigger the PXI-4110 (by hardware or software trigger) to measure the current at these points?
2. Let's say I need to measure the current at 0 ms (start of waveform generation by the PXI-6552), 1 ms, 2 ms, 3 ms, 4 ms... and so on, for 1000 measurement points, with the code diagram as shown in the figure below. Is it possible for the VI "niDCPower Measure Multiple" to measure exactly at 1 ms, 2 ms, 3 ms...? How much time does it take "niDCPower Measure Multiple" to acquire one measurement point?
    Thanks for viewing this post. Your advice on hardware used or software method is much appreciated. Thanks in advance.  
    Message Edited by engwei on 02-02-2009 04:24 AM
    Attachments:
[email protected] 46 KB

    Hi engwei,
    1. Unfortunately, the 4110 does not support hardware triggering. Therefore you cannot implement direct triggering through the backplane or anything like that. However, there are a couple of possible workarounds you can try:
a) Use software triggering: say your 6552 is generating in one while loop, and your 4110 is to measure in another while loop. You can use a software synchronization method like notifiers to send a notification to your 4110 loop when your 6552 has generated the desired sample. This method, however, will not be very deterministic, because the delay between the trigger and the response depends on your processor speed and load. If you have other applications running in the background (like antivirus), the delay will increase.
b) Use hardware triggering on another device: if you have another device that supports hardware triggering (such as an M-series multifunction DAQ module), you can configure this device to be triggered by a signal from the 6552, perform a very quick task (like a very short finite acquisition), then immediately execute the DCPower VI to perform the measurement. The trigger can be configured to be retriggerable for multiple use. This will most likely have a smaller time delay than the first option, but there will still be a delay (the time it takes to perform the short finite acquisition on the M-series). Please refer to the attached screenshot for an idea of how to implement this.
    2. To make your 4110 measure at specific time intervals, you can use one of the methods discussed above. As for how long it will take to acquire 1 measurement point, you may find this link helpful: http://zone.ni.com/devzone/cda/tut/p/id/7034
    This article is meant for the PXI-4130 but the 4110 has the same maximum sampling rate (3 kHz) and so the section discussing the speed should apply for both devices.
    Under the Software Measurement Rate section, it is stated that the default behavior of the VI is to take an average of 10 samples. This corresponds to a maximum sampling rate of 300 samples/second. However, if you configure it to not do averaging (take only 1 sample) then the maximum rate of 3000 samples/second can be achieved.
    It is also important to note that your program can only achieve this maximum sampling rate if your software loop takes less time to execute than the actual physical sampling. For example, sampling at 3000 samples/second means one sample takes 1/3000 seconds, or about 333 microseconds. If your software execution time is less than 333 microseconds, you can achieve the maximum rate (because the speed is limited by the hardware, not the software). However, if your software takes more than 333 microseconds to execute, the software loop time defines the maximum sampling rate you can get, which will be lower than 3000 samples/second.
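    The rate arithmetic above fits in a few lines; this is only a sketch (the helper function is mine, the 3 kHz figure comes from the linked article):

```python
HW_RATE = 3000.0  # maximum physical sampling rate of the 4110/4130, samples/s

def effective_rate(samples_to_average, software_loop_s):
    """Achievable measurement rate: limited by whichever is slower,
    the hardware (averaging N samples takes N/HW_RATE seconds) or
    the software loop wrapped around the measure call."""
    hw_time = samples_to_average / HW_RATE
    return 1.0 / max(hw_time, software_loop_s)

print(effective_rate(10, 0.0))    # default 10-sample averaging -> ~300 S/s
print(effective_rate(1, 0.0))     # no averaging               -> ~3000 S/s
print(effective_rate(1, 0.001))   # 1 ms software loop dominates -> ~1000 S/s
```

    The last case shows the point of the paragraph: once the software loop takes longer than the hardware sampling (1 ms > 333 µs), the loop time, not the 3 kHz hardware limit, sets the rate.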
    I hope this answers your question.
    Best regards,
    Vern Yew
    Applications Engineer, NI ASEAN
    Attachments:
    untitled.JPG ‏18 KB

  • Application loader not working on blackberry curve 8310

    Hi all,
    I have a problem with my 8310. I want to install an Asian language, but the Application Loader is not working. I get this error message:
    "The blackberry desktop software does not have
    Blackberry device software for the device that
    you have connected to the computer. Contact
    your wireless service provider or system administrator"
    blackberry desktop manager version 4.2.2.14
    blackberry device manager version 4.2.2.10
    application loader version 4.2.2.16
    Which package should I install?
    Thanks in advance.

    Hello, you have an 8310. You must use an OS built for that exact model; an OS for the 8300 or 8330 will not work.
    The search box at the top-right of this page is your true friend, and so is the public Knowledge Base:

  • Problems performing offset null and shunt calibration in NI PXI-4220

    I am using a 350 Ω strain gage for the measurements, and I have already created a task in MAX. When I try to perform an offset null on the task, the program shows a progress bar and the LEDs on the 4220 board start flashing, but when the progress bar stops, MAX freezes. It has been impossible for me to perform the offset null; what can I try?
    Also, what are the correct values for the parameters, besides the gage parameters, for strain measurements?

    Hello,
    Thank you for contacting National Instruments.
    Usually when this problem occurs, it is due to an incorrect task configuration or an incorrectly matched quarter-bridge completion resistor. Ensure that you have the correct strain configuration selected. The default is Full Bridge I; if you only have a single strain gauge in your configuration, you will need to change it. Also, if you are using a quarter-bridge completion resistor, make sure it is 350 Ω, not 120 Ω. If the resistor is 120 Ω, you will most likely not be able to null your bridge.
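    To see why a 120 Ω completion resistor makes nulling impossible, consider the ideal quarter-bridge equation; this sketch (the function and the ideal-bridge simplification are mine, not from the manual) computes the unstrained bridge output as a fraction of the excitation voltage:

```python
def quarter_bridge_offset_ratio(r_gauge, r_completion):
    """Unstrained output of an ideal quarter bridge whose other two
    arms are equal: Vo/Vex = r_gauge/(r_gauge + r_completion) - 1/2."""
    return r_gauge / (r_gauge + r_completion) - 0.5

print(quarter_bridge_offset_ratio(350, 350))  # matched: 0.0, easy to null
print(quarter_bridge_offset_ratio(350, 120))  # mismatched: ~0.24 of Vex
```

    With a matched 350 Ω resistor the unstrained output is zero; with 120 Ω it is roughly 0.24 × Vex (about 245 mV/V), orders of magnitude larger than a strain gauge's mV/V-scale signal and far outside any offset-null range.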
    Please see the PXI-4220 User Manual for more information about your configuration and signal connections: http://digital.ni.com/manuals.nsf/websearch/F93CCA9A0B4BA19B86256D600066CD03?OpenDocument&node=132100_US
    Also, you can download and install the latest NI-DAQ 7.2 driver: http://digital.ni.com/softlib.nsf/websearch/50F76C287F531AA786256E7500634BE3?opendocument&node=132070_US
    The 7.2 driver adds a signal connections tab, displayed when configuring your DAQmx task, which shows you how to correctly connect your signals.
    Regards,
    Bill B
    Applications Engineer
    National Instruments
