PXI interference at 25.5 kHz

Hello,
I have a custom hardware setup with a PXI system to assist with hardware control and data acquisition. The point of the data acquisition is to read in a signal from the hardware, do an FFT, and analyze 1 kHz - 256 kHz. The signals are small and near the noise floor of the ADC.
I am getting a continuous signal at 25.5 kHz in all of my measurements. I used a spectrum analyzer to figure out where this signal was coming from, and ended up taking everything apart until I was down to only the PXI. The PXI itself was generating the signal and putting it everywhere (even on the casing).
What generates this signal on the PXI? Can I turn it off? It appears the instant the PXI is powered on.
The signal is quite strong, about -70 dBm at the DAQ port. You can even find the signal on the ground of the SH68 connectors.
I've tested on two different PXI systems in two different places and it's the same result.
System: PXI-8101e or PXI-8135 (appears for both, seems to be stronger on the PXI-8135)
PXI-6115 (DAQ)
PXI-6723 (AO)
PXIe-1071 (Chassis)

Jeff·Þ·Bohrer wrote:
Turn off the lights.  Does it go away?
(Electronic ballasts in some fluorescent systems can cause similar problems) 
No, some of the tests were outside. I tested for environmental noise by making a big loop. I was catching some 10 kHz noise on the loop that wasn't in the measurement, so if the 25.5 kHz were coming from the environment, I would expect it to show up there. If the lights are corrupting the mains, it's kind of disappointing that this makes it through to the PXI, as none of my other measurement equipment picks up that signal.
To be frank, it is disappointing that a signal measurement device has this kind of signal corruption. It is unusable for all but the most basic measurements, and people aren't going to sink this kind of money into a -50 dBm noise floor. At least I've moved up a frequent-flyer class troubleshooting this between the two locations.

Similar Messages

  • How to structure the DMA buffer for the PXIe-6341 DAQ card for analog output with different frequencies on each channel

    I'm using the MHDDK for analog out/in with the PXIe-6341 DAQ card.
    The examples, e.g. aoex5, show a single timer (the outTimerHelper::loadUI method), but the example loads the DMA data with the same vector size for each channel.
    There is a comment in the outTimerHelper::programUpdateCount call which implies that different buffer sizes per channel can be used.
       (the comment is: Switching between different buffer sizes will not be used)
    Does anyone know what the format of the DMA buffer should be for multiple channels with different frequencies?
    For example, say we want ao0 with a 1 kHz sine wave and ao1 with a 1.5 kHz sine wave. What does the DMA buffer look like?
    With the same frequency for each channel, the data is interleaved, e.g. (ao0#0, ao1#0; ao0#1, ao1#1, ...), but when the frequencies for each channel are different, what does the buffer look like?

    Hello Kenstern,
    The data is always interleaved because each card has only a single timing engine per subsystem.
    For AO you must specify the number of samples that AO will output, as well as the number of channels. Because there is only one timing engine for AO, each AO channel gets updated on the same tick of the update clock. The data is arranged interleaved exactly as the example shows, because each AO channel needs data to output on each tick of the update clock. The data itself can change based on the frequency you want to output.
    kenstern wrote:
    For example, say we want ao0 with a 1 kHz sine wave and ao1 with a 1.5 kHz sine wave. What does the DMA buffer look like?
    With the same frequency for each channel, the data is interleaved, e.g. (ao0#0, ao1#0; ao0#1, ao1#1, ...), but when the frequencies for each channel are different, what does the buffer look like?
    In your example, you need to come up with an update rate that works for both waveforms (1 kHz and 1.5 kHz sine waves). To get a good representation of a sine wave, you need to update more than 10x as fast as your fastest frequency; I would recommend 100x if possible.
    Update frequency: 150 kHz
    Channels: 2
    Then you create buffers that contain full cycles of each waveform you want to output at that update frequency. These buffers must also be the same size.
    Buffer 1: contains data for the 1 kHz sine wave, 300 points, 2 sine wave cycles
    Buffer 2: contains data for the 1.5 kHz sine wave, 300 points, 3 sine wave cycles
    You then interleave them as before. When the data is run through the DACs, the channels output different sine waves even though both update at the same rate.
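    To make the interleaving concrete, here is a minimal C sketch that builds such a buffer for the numbers above (300 points per channel at a 150 kHz update clock; the names and sizes are illustrations, not MHDDK calls):

        #include <math.h>
        #include <stdio.h>

        #ifndef M_PI
        #define M_PI 3.14159265358979323846
        #endif

        #define UPDATE_RATE 150000.0  /* 150 kHz update clock */
        #define N_SAMPLES   300       /* per channel: 2 cycles of 1 kHz, 3 cycles of 1.5 kHz */

        int main(void)
        {
            /* Interleaved layout: ao0#0, ao1#0, ao0#1, ao1#1, ... */
            double buf[N_SAMPLES * 2];
            for (int i = 0; i < N_SAMPLES; i++) {
                double t = i / UPDATE_RATE;
                buf[2 * i]     = sin(2.0 * M_PI * 1000.0 * t);  /* ao0: 1 kHz   */
                buf[2 * i + 1] = sin(2.0 * M_PI * 1500.0 * t);  /* ao1: 1.5 kHz */
            }
            /* Both waveforms complete whole cycles in 300 updates (2 ms),
               so this buffer can be looped/regenerated without a glitch. */
            printf("first update: ao0=%.3f ao1=%.3f\n", buf[0], buf[1]);
            return 0;
        }

    The key point is that both per-channel buffers must span the same number of updates; the different output frequencies come entirely from the data, not from the timing engine.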

  • Is the PXIe-PCIe8361 adequate for this system? And external clock questions...

    Hi all,
    I have spent some time piecing together a system and I'd like a sanity check before pulling the trigger on this purchase.  The system will contain the following hardware:
    1. Chassis: PXIe-1078
    2. Controller: PXIe-PCIe8361
    3. 3 x PXIe-6363 (16 analog inputs per card, 32 digital inputs per card, all internally clocked @ 10 kHz)
    4. 2 x PXI-6224 (32 digital inputs on one, 8 digital inputs on the other, externally clocked in "bursts" of 62.5 kHz)
    5. LabVIEW software
    The three PXIe-6363 cards will be responsible for a mix of analog and digital measurements made @ 10 kHz, timed continuously by the onboard clock.
    One PXI-6224 will be clocked externally @ 62.5 kHz and will be used to collect digital data on a 32-bit port. These clock pulses will not be continuous, but will occur in bursts lasting 2 ms every 20 ms.
    The other PXI-6224 will be clocked externally @ 62.5 kHz as well and will be used to collect digital data on an 8-bit port. These clock pulses will not be continuous, but will occur in bursts lasting 2 ms at random intervals.
    My questions are:
    1. Am I planning anything that looks unreasonable for this hardware?
    2. Should I expect issues with data transfer rates with the PXIe-PCIe8361? I will be operating well within the advertised 110 MB/s throughput of the device. I plan to stream using this method: NI Fast TDMS data streaming
    3. I have only ever used NI cards for continuous measurements made by an onboard clock. When I set up a task to collect externally timed data, will the DAQ expect a "continuous" clock pulse, or will the system wait patiently for clock pulses to arrive at any rate (any rate within the spec of the card, of course)?
    Thanks, any input is appreciated.

    Hello LucasH0011
    1. As long as you put the PXI-6224 and PXIe-6363 cards in the corresponding slots, meaning the express (PXIe-6363) in the express slots and the hybrid-compatible (PXI-6224) in the hybrid slots, you should be fine.
    2. I don't think you will have issues with the transfer rate.
    3. Your timing specifications sound reasonable to me; I think you will be fine. Externally clocked tasks simply wait for edges to arrive (see the sketch below).
    Here is a document that has useful concepts for the use of cards:
    http://www.ni.com/white-paper/3615/en/
    It is for the M-Series, but the concepts apply to the X-Series as well. 
    Regards 
    Ernesto
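    On question 3, an externally clocked DAQmx task just follows whatever edges arrive; gaps between bursts simply pause the acquisition. A minimal NI-DAQmx C sketch of such a task (slot and terminal names are assumptions for your system; error checking omitted):

        #include <NIDAQmx.h>
        #include <stdio.h>

        int main(void)
        {
            TaskHandle task = 0;
            uInt32 data[1000];
            int32  read = 0;

            DAQmxCreateTask("", &task);
            /* 32-bit digital input port on the PXI-6224 (names are examples) */
            DAQmxCreateDIChan(task, "PXI1Slot4/port0", "",
                              DAQmx_Val_ChanForAllLines);
            /* External sample clock on PFI0. The rate is only the expected
               maximum (used for buffer sizing); the hardware clocks in one
               sample per edge, so 2 ms bursts of 62.5 kHz followed by idle
               time are handled naturally. */
            DAQmxCfgSampClkTiming(task, "/PXI1Slot4/PFI0", 62500.0,
                                  DAQmx_Val_Rising, DAQmx_Val_ContSamps, 10000);
            DAQmxStartTask(task);
            /* Blocks until 1000 external clock edges have arrived, however
               long that takes (-1.0 = wait indefinitely). */
            DAQmxReadDigitalU32(task, 1000, -1.0, DAQmx_Val_GroupByChannel,
                                data, 1000, &read, NULL);
            printf("read %d samples\n", (int)read);
            DAQmxStopTask(task);
            DAQmxClearTask(task);
            return 0;
        }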

  • Anyone know what can cause the MXI-3 and PXI-1010 chassis to prevent PC bootup?

    Situation: The PC locked up with the display blank. The PC would not boot up (not even POST) after power was cycled (several attempts).
    Resolution: Simple. The chassis power was cycled and the PC was fully operational.
    Background: The PC is a Dell Dimension 733 running Win98 and used daily for production test. The test program is a combination of TestStand, instrument DLLs and LabVIEW.

    Hi,
    If the PCI-MXI-3 is locking up the computer, the most common cause is a bad MXI-3 card. To check this, try the MXI-3 card and chassis with another type of computer. If it doesn't boot in more than one type of computer, you will want to return the board to National Instruments for repair.
    Another thing that we have seen in some computers is interference or noise from the power supply interrupting communication through the MXI-3 devices. The only things you can really try in this case are moving the PCI-MXI-3 card to another slot in the PC to see if it works, or using a different type of PC.
    Power cycling the chassis is not a valid solution, as this will drop the MXI-3 link and you will no longer be able to talk to devices in the PXI system. You always need to successfully power the PXI-MXI-3 card and chassis before you power up your PC.
    Best Regards,
    Amy Hindman
    Applications Engineer
    National Instruments

  • PXI-4071 shielding cable

    Hi, we use a PXI-4071 in a PXI-1033 for measurements, and we connected shielded cables a few meters long. We wrapped the cables in foil from the kit. After that we connected all the shields to the ground from the power supply. But we still see interference of about 1 nA and a DC component of 5 nA. Has anyone faced a similar problem? Is it possible that we are measuring current too often? Can you advise on shielded cables for the multimeter in this case? 

    alyast wrote:
    Yes, we are pretty sure about the shielding, but we don't know why there is a constant current component (5 nA) while the declared sensitivity is 0.1 nA. Probably some bug with timing in LabVIEW or something? I don't know what could be the reason for this behavior, and therefore I'm asking noob questions.
    Why would you suspect the software when all the evidence points to an issue with the HARDWARE?  And if you're measuring in the nanoamp region, you'd better be ABSOLUTELY certain that your grounds are true and accurate. Grounding to your power supply is probably not good enough. Your power supply probably has to be grounded to a copper stake going (what is it, six feet?) into the ground.
    Bill
    (Mid-Level minion.)
    My support system ensures that I don't look totally incompetent.
    Proud to say that I've progressed beyond knowing just enough to be dangerous. I now know enough to know that I have no clue about anything at all.

  • PXI-5661

    Hi
    I want to know whether the downconverter (5600) of the 5661 can be used separately with another digitizer such as the 5122e.
    Are the two modules, i.e. the 5600 and the 5142, completely detachable, and if so, do I need separate drivers for both?
    Secondly, as I understand it, the 5600 has 20 MHz bandwidth, but what I need is a downconverter that can be used for streaming 50 MHz+ BW data using the 5122e. How do I go about this problem?
    Thirdly, for streaming RF data to a RAID 0 or the PXI controller HDD, do I need to use an RF switch (because the 2593 is recommended by you), and do I need an AWG such as the 5421?
    What exactly is the utility of the RF switch here?
    dannn

    Thank you for the clarification, Danyal. You could make the Viterbi and CELP calculations on a Windows machine. I did find a few places where we already have some example code for the Viterbi, if you are interested. Here is a forum post that talks about Viterbi, and here and here are some example programs from our website. 
    The answer to your question about whether these algorithms can be done in real time is yes. You could implement Viterbi and CELP both on Windows and on an RT (Real-Time) operating system. I would recommend using the RTOS and not a Windows machine for your real-time measurements; Windows will not be as deterministic as a real-time machine for your calculations. You can just use LabVIEW Real-Time to program your algorithm. I could not find any examples that would achieve this for you; however, the programming in real time is very similar, so I think you could use the above code to make your programs in real time as well. Just to clarify, programming on an RTOS instead of a Windows machine will not make it faster, only more deterministic. Also, the only things running on your RT machine will be the programs you wrote, so there will be no "lag" or interference caused by other programs running in the background (as on a Windows machine).
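    For anyone curious about the algorithm itself, the core of Viterbi decoding is a short dynamic program. Below is a toy C sketch with made-up HMM parameters (a real decoder would work with log-probabilities to avoid underflow on long observation sequences):

        #include <stdio.h>

        #define N_STATES 2
        #define N_OBS    3
        #define T_LEN    4

        int main(void)
        {
            /* Illustrative HMM: initial, transition, and emission probabilities */
            double init[N_STATES]            = {0.6, 0.4};
            double trans[N_STATES][N_STATES] = {{0.7, 0.3}, {0.4, 0.6}};
            double emit[N_STATES][N_OBS]     = {{0.5, 0.4, 0.1}, {0.1, 0.3, 0.6}};
            int obs[T_LEN] = {0, 1, 2, 1};

            double v[T_LEN][N_STATES];   /* best path probability ending in state s */
            int back[T_LEN][N_STATES];   /* backpointers for path reconstruction   */

            for (int s = 0; s < N_STATES; s++)
                v[0][s] = init[s] * emit[s][obs[0]];

            for (int t = 1; t < T_LEN; t++)
                for (int s = 0; s < N_STATES; s++) {
                    v[t][s] = 0.0; back[t][s] = 0;
                    for (int p = 0; p < N_STATES; p++) {
                        double cand = v[t-1][p] * trans[p][s];
                        if (cand > v[t][s]) { v[t][s] = cand; back[t][s] = p; }
                    }
                    v[t][s] *= emit[s][obs[t]];
                }

            /* Trace back the most likely state sequence */
            int path[T_LEN], best = 0;
            for (int s = 1; s < N_STATES; s++)
                if (v[T_LEN-1][s] > v[T_LEN-1][best]) best = s;
            path[T_LEN-1] = best;
            for (int t = T_LEN-1; t > 0; t--)
                path[t-1] = back[t][path[t]];

            for (int t = 0; t < T_LEN; t++) printf("%d ", path[t]);
            printf("\n");
            return 0;
        }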
    To summarize, it depends on what kind of determinism you are looking for. If you are looking to process the data as soon as possible, you might want to look at some kind of FPGA solution, but if you are just looking for some kind of determinism, you might be better off with a real-time operating system on your 8130. 
    Hope this helps.
    Regards,
    Raajit L
    National Instruments

  • Poor PXI IO performance on Latitude E6410 with ExpressCard 8360

    Hello,
    I have a Dell Latitude E6410 with a Core i5 M520 which is giving me very poor IO performance when using an ExpressCard 8360 card to connect to a PXI rack.
    The sustained IO rate that I can get appears to be about one third of what I can get using the same ExpressCard on a Dell Latitude E6400 (with a Core2Duo processor).
    I am using the A05 BIOS (latest at time of writing) on the E6410.
    Wade.

    I am running Windows XP (32 bit) sp3 in both cases.
    The E6410 has 4GByte of memory fitted.
    The E6400 has 2GByte of memory fitted.
    I have also used the same ExpressCard 8360 via a PXIe-to-ExpressCard adaptor in a desktop machine, with similar performance figures to the E6400, i.e. much better than the E6410.
    The Desktop Machine is an HP Compaq D7900 with 4GByte of memory, Core2Duo E8500 also running Windows XP sp3 (32 bit).
    Also, on the Desktop, I am running NI PXI Platform Services 2.3.2 and NI-Visa runtime version 4.3.
    On the E6410, I am running NI PXI Platform Services 2.5.2 and NI-Visa runtime version 4.6.
    I no longer have access to the E6400, so I am not sure what software versions were installed. However, they are unlikely to be newer than the versions installed on the E6410.
    Wade.

  • Choosing a PXIe controller for streaming 200 MBps

    Warning: This is a long post with several questions. My apologies in advance.
    I am a physics professor at a small liberal-arts college, and will be replacing a very old multi-channel analyzer for doing basic gamma-ray spectroscopy. I would like to get a complete PXI system for maximum flexibility. Hopefully this configuration could be used for a lot of other experiments, such as pulsed NMR. But the most demanding role of the equipment would be gamma-ray spectroscopy, so I'll focus on that.
    For this, I will need to be measuring either the maximum height of an electrical pulse, or (more often) the integrated voltage of the pulse.  Pulses are typically 500 ns wide (at half maximum), and between roughly 2-200 mV without a preamp and up to 10V after the preamp.  With the PXI-5122 I don't think I'll need a preamp (better timing information and simpler pedagogy).  A 100 MHz sampling rate would give me at least 50 samples over the main portion of the peak, and about 300 samples over the entire range of integration.  This should be plenty if not a bit of overkill.
    My main questions are related to finding a long-term solution and keeping up with the high data rate. I'm mostly convinced that I want the NI PXIe-5122 digitizer board and the cheapest (8-slot) PXIe chassis. But I don't know what controller to use, or which software environment (LabVIEW / LabWindows / homebrew C++). This system will likely run about $15,000, which is more than my department's yearly budget. I have special funds to accomplish this now, but I want to minimize any future expenses in maintenance and updates.
    The pulses to be measured arrive at random intervals, so performance will be best when I can still measure the heights or areas of pulses arriving in short succession.  Obviously if two pulses overlap, I have to get clever and probably ignore them both.  But I want to minimize dead time - the time after one pulse arrives that I become receptive to the next one.  Dead times of less than 2 or 3 microseconds would be nice.
    I can imagine two general approaches.  One is to trigger on a pulse and have about a 3 us (or longer) readout window.  There could be a little bit of pileup inspection to tell if I happen to be seeing the beginning of a second pulse after the one responsible for the trigger.  Then I probably have to wait for some kind of re-arming time of the digitizer before it's ready to trigger on another pulse.  Hopefully this time is short, 1 or 2 us.  Is it?  I don't see this in the spec sheet unless it's equivalent to minimum holdoff (2 us).  For experiments with low rates of pulses, this seems like the easiest approach.
    The other possibility is to stream data to the host computer, and somehow process the data as it rolls in.  For high rate experiments, this would be a better mode of operation if the computer can keep up.  For several minutes of continuous data collection, I cannot rely on buffering the entire sample in memory.  I could stream to a RAID, but it's too expensive and I want to get feedback in real time as pulses are collected.
    With this in mind, what would you recommend for a controller? The three choices that seem most reasonable to me are an embedded controller running Windows (or Linux?), an embedded controller running the LabVIEW Real-Time OS, or a fast interface card like the PCIe8371 and a powerful desktop PC. If all options are workable, which one would give me the lowest cost of upgrades over the next decade or so? I like the idea of a real-time embedded controller because I believe any run-of-the-mill desktop PC (whatever IT gives us) could connect and run the user interface, including data display and higher-level analysis. Is that correct? But I am unsure of the life-span of an embedded controller, and am a little wary of the increased cost and need for periodic updates. How are real-time OS upgrades handled? Are they necessary? Real-time sounds nice and all that, but in reality I do not need to process the data stream in a real-time environment. It's just the computer and the digitizer board (not a control system), and both should buffer data very nicely. Is there a raw performance difference between the two OSes available for embedded controllers?
    As for live processing of the streaming data, is this even possible? I'm not thinking very precisely about this (I would really have to just try it and find out), but it seems like it could possibly work on a 2 GHz dual-core system. It would have to handle 200 MB/s, but the data processing is extremely simple. For example, one thread could mark the beginnings and ends of pulses and do simple pile-up inspection. Another thread could integrate the pulses (no curve fitting or interpolation necessary, just simple addition) and store results in a table or list. Naively, I'd have not quite 20 clock cycles per sample. It would be tight. Maybe just getting the data into the CPU cache is prohibitively slow. I'm not really even knowledgeable enough to make a reasonable guess. If it were possible, I would imagine I would need to code it in LabWindows/CVI and not LabVIEW. That's not a big problem, but does anyone else have a good read on this? I have experience with C/C++, and some with LabVIEW, but not LabWindows (yet).
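    For what it's worth, the per-sample work described here really is tiny. A sketch of the kind of single-pass pulse marking and integration loop (the threshold, types, and missing baseline/pile-up handling are all placeholders):

        #include <stdint.h>
        #include <stdio.h>

        #define THRESHOLD 100   /* ADC counts, placeholder value */

        /* Toy single-pass pulse finder/integrator over a block of digitizer
           samples: mark threshold crossings, sum samples while above the
           threshold. Real code would subtract a measured baseline and
           inspect for pile-up before accepting a pulse. */
        void process_block(const int16_t *buf, size_t n,
                           double *areas, size_t max_pulses, size_t *n_pulses)
        {
            size_t count = 0;
            double area = 0.0;
            int in_pulse = 0;

            for (size_t i = 0; i < n; i++) {
                if (buf[i] > THRESHOLD) {
                    if (!in_pulse) { in_pulse = 1; area = 0.0; }
                    area += buf[i];
                } else if (in_pulse) {
                    in_pulse = 0;            /* falling edge: pulse complete */
                    if (count < max_pulses) areas[count++] = area;
                }
            }
            *n_pulses = count;
        }

    At 200 MB/s of 16-bit samples that is one compare and one add per 10 ns sample, so the back-of-envelope ~20 cycles per sample is tight but not absurd, provided the data stays in cache.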
    What are my options if this system doesn't work out? The return policy is somewhat unfriendly, as 30 days may pass quickly as I struggle with the system while teaching full time. I'll have some student help and eventually a few long days over the summer. An alternative system could be built around XIA's Pixie-4 digitizer, which should mostly just work out of the box. I somewhat prefer the NI PXI-5122 solution because it's cheaper, performs better, is much more flexible, and suffers less from vendor lock-in. XIA's software is proprietary and very costly. If support ends or XIA gets bought out, I could be left with yet another legacy system. Bad.
    The Pixie-4 does the peak detection and integration in hardware (FPGAs I think) so computing requirements are minimal.  But again I prefer the flexibility of the NI digitizers.  I would, however, be very interested if data from something as fast as the 5122 could be streamed into an FPGA-based DSP module.  I haven't been able to find such a module yet.  Any suggestions?
    Otherwise, am I on the right track in general on this kind of system, or badly mistaken about some issue?  Just want some reassurance before taking the plunge.

    drnikitin,
    The reason you did not find the spec for the rearm time for the 5133 is that the USB-5133 is not capable of multi-record acquisition. The rearm time is a spec for the reference trigger, and that trigger is used when fetching the next record. So every time you want to do another fetch, you will have to stop and restart your task.
    To grab a lot of data, increase your minimum record size. Keep in mind that you have 4 MB of onboard memory per channel.
    Since you will only be able to fetch 1 record at a time, there really isn't a way to use streaming. When you call fetch, it will transfer the amount of data you specify to PC memory through the USB port (up to 12 MB/s for USB 2.0, ideally).
    Topher C,
    We do have a digitizer that has onboard signal processing (OSP), which would be quicker than performing post-processing: the NI 5142. It is essentially a 5122 but with built-in OSP. It may be a little out of your price range, but it may be worth a look.
    For more information on streaming, take a look at these two links (if you haven't already):
    High-Speed Data Streaming: Programming and Benchmarks
    Streaming Options for PXI Express
    When dealing with different LabVIEW versions, it is important to note that previous versions will be compatible with newer versions, such as going from 8.0 to 8.5. Keep in mind that if you go too far back then LabVIEW may complain, but you still may be able to run your VI. If you have a newer version going to an older version, then we do have options in LabVIEW to save your VI for older versions. It's usually just 1 version back, but in LabVIEW 8.5 you can save for LabVIEW 8.2 and 8.0.
    ESD,
    Here is the link I was referring to earlier about DMA transfers. DMA is actually done every time you call a fetch or read function in LabVIEW or CVI (through NI-SCOPE).
    Topher C and ESD,
    LabVIEW is a combination of a compiled language and an interpreted language. Whenever you make a change to the block diagram, LabVIEW compiles itself, so that when you hit run, it is ready to execute. During execution, LabVIEW uses the run-time engine to reference shared libraries (such as DLLs). Take a look at this DevZone article about how LabVIEW compiles its block diagram (user code).
    I hope all of this information helps!
    Ryan N
    National Instruments
    Application Engineer
    ni.com/support

  • Convert PXIe-8135 controller to dual-boot Windows 7 and LabVIEW RT

    Hello. I have a PXIe-8135 controller that originally was just running Windows 7. We are trying to convert it to a dual-boot system that also runs LabVIEW Real-Time. (There is a host computer that will run LabVIEW 2014 with the RT module, and the controller will become a target.)
    I have created a FAT32 partition on the hard drive of the controller. Now I'm trying to install the real-time OS with a USB flash drive made using the MAX utility, but I cannot boot from the USB drive for some reason. I keep getting the message "waiting for USB device to initialize".
    In BIOS, legacy USB support is [ENABLED] and boot configuration is set to [Windows/other OS]. I’ve tried removing the drive, waiting, and reinserting. I’ve tried two different USB drives (both 8 GB, different brands).
    I’m not sure what to do next. Apart from the USB boot issue, is converting the PXIe-8135 even possible?  I read about SATA/PATA hard drive issues with older controllers, but I don't know about this one.
    Thanks, in advance, for your help!
    -Jeff

    Per Siana's licensing comment, more information on purchasing a deployment license if you do not have one for this target can be found here.
    The RT Utility USB key is used to set up non-NI hardware with LabVIEW Real-Time, but you should not need it in this situation to convert to dual-boot (*). Try this:
    1. Since you already have a FAT32 partition created, go into BIOS setup and change the boot setting to 'LabVIEW RT'.
    2. The system will attempt to boot LabVIEW RT, see that the partition is empty, and switch over into LabVIEW RT Safe Mode. (This safe mode is built into the firmware, which is why you don't really need the USB key.)
    3. The system should come up correctly and be detectable from MAX, and you can proceed with installing software.
    4. To switch back to Windows, go back to BIOS setup and choose 'Windows/Other OS'
    (*) One area where the USB key is helpful on a dual-boot system is with formatting the disk. If you want to convert from FAT32 to Reliance on the partition designated for LabVIEW RT, the USB key lets you attempt to format a single partition and leave the rest of the disk untouched. If you format from MAX, the standard behavior is to format only the RT partition if one is found; if none is found, it will format the entire disk. Formatting from MAX on a dual-boot system is consequently riskier, and you could lose your Windows partition.

  • Start and Stop Trigger using PXI-6120 and DigitalStartAndStopTrigger.vi not working :-(

    Hello,
    I've been trying for a while now to get my PXI unit to capture a waveform between a start and a stop (reference) trigger, using the NI example DigitalStartAndStopTrigger.vi downloaded from the NI website. However, whilst the start and stop triggers seem to be working, i.e. the VI runs and stops at the correct times, there is never any data read from my DAQmx-compatible PXI-6120 card. So I can see the VI is running around the acquisition loop, but the property node AvailSampPerChan is always returning zero... this has me slightly puzzled. I thought this might just be a driver issue, so I've updated my box to the following software versions (see below) and installed the latest drivers, e.g. DCDNov07.exe (also from the NI site), but nothing has changed.
    My software as of now:
    LabVIEW 7.1 (with the 7.1.1 upgrade applied)
    MAX 4.3.0.49152
    DAQmx 8.6.0f12
    Trad DAQ 7.4.4f7
    Before I updated I had the same problem, but with the following versions:
    LabVIEW 7.1 (with the 7.1.1 upgrade applied)
    MAX 4.2.1.3001
    DAQmx 8.5.0f5
    Trad DAQ 6.9.3f4
    So to cut a long story short I still have the same problem with the triggers... does anybody have any ideas what is going wrong?
    To add insult to injury, the traditional DAQ example ai_start-stop_d-trig.vi was almost working correctly before I did the upgrade. It had the strange behaviour of capturing the AI0 channel but on the wrong edges (e.g. if I set start on rise and stop on fall, it would do the opposite: start on fall and stop on rise).
    I'm going to leave my box doing a mass compile overnight, but I'd really like it if someone could suggest a solution or point me in the right direction.
    Many thanks,
    Mike

    Hi Graham
    I'm out of the lab today but I'll try and answer your questions as best I can...
    1) What are the values you have set for Buffer size, Rate, samples per read and post trigger Samples?
    At the moment I have all the values (e.g. sample rate, buffer size, etc.) unchanged apart from the ones I mentioned in my previous post (see above). I have in the past played around with changing the buffer sizes and rates in the example VI, but as this appeared to have no effect on the behaviour, I now have them set up as in the download.
    2) Does the program end after the stop trigger is implemented?
    Yep, if I toggle the trigger line high then low I see the program exits the read loop and the VI stops running as expected.
    3) Lastly, can you give me the details of the triggering method? Are you using a digital train of user-set digital pulses? How long is the program running?
    I'm using WriteDigChan.vi to manually toggle the first digital line of the PXI-6733 card, which is wired directly to PFI0 of the PXI-6120 card. Generally, I just start the VI running and then toggle the line high, wait a couple of seconds, and then toggle it low.
    To me it all looks like it should be acquiring samples but as I said yesterday it just refuses to fill the buffer with any data (and hence no samples are read).
    Any ideas? And thanks for your help,
    Mike
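    In case it helps anyone searching later, the corresponding start + reference (stop) trigger setup in the DAQmx C API looks roughly like the sketch below; the LabVIEW example wires the same attributes. Channel, rate, and terminal names are assumptions, and error checking is omitted:

        #include <NIDAQmx.h>

        /* Sketch: finite analog acquisition armed by a digital start trigger
           (rising edge on PFI0) and ended by a digital reference trigger
           (falling edge on the same line), matching "start on rise, stop on
           fall". Names are examples only. */
        int configure(TaskHandle *taskOut)
        {
            TaskHandle task = 0;
            DAQmxCreateTask("", &task);
            DAQmxCreateAIVoltageChan(task, "PXI1Slot2/ai0", "",
                                     DAQmx_Val_Cfg_Default, -10.0, 10.0,
                                     DAQmx_Val_Volts, NULL);
            DAQmxCfgSampClkTiming(task, "", 100000.0, DAQmx_Val_Rising,
                                  DAQmx_Val_FiniteSamps, 100000);
            DAQmxCfgDigEdgeStartTrig(task, "/PXI1Slot2/PFI0", DAQmx_Val_Rising);
            /* The reference trigger ends the record; 1000 pretrigger samples
               must be acquired before it can be recognized. */
            DAQmxCfgDigEdgeRefTrig(task, "/PXI1Slot2/PFI0", DAQmx_Val_Falling,
                                   1000);
            *taskOut = task;
            return 0;
        }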

  • Trouble capturing waveform from PXI-4472

    I'm really a very green newbie at this stuff, so bear with me...
    I've got a PXI-4472 data acquisition board and a PXI-5411 waveform generator. I've connected the arbitrary output of the 5411 to the channel 0 input on the 4472. An external oscilloscope shows a 1 V amplitude sine wave being generated.
    I created a very simple VI to show what the 4472 is capturing. It connects a NI-DAQ channel I generated to the standard "AI Acquire Waveform.vi", then out to a waveform chart, all within a while loop with a stop button. Problem is, all the waveform chart seems to show is the running average of the waveform instead of the waveform itself (a solid line, a tad above zero).
    I can hook the 4472 input channel up to a DC-out power supply, and when I vary the voltage, the waveform chart changes as well.
    So my question (whew!): What's wrong here that's not allowing me to capture a waveform from the 4472 (in turn from the 5411) and display it on my waveform chart?
    Thanks in advance for the help.

    Never mind.... it was a sample rate problem. I upped the sample rate and it came out ok.
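    For anyone finding this later, the likely mechanism is undersampling: when the sample rate is at or below the signal frequency, the samples land at nearly the same phase each time, so the chart shows a slowly varying near-DC level rather than the sine. A toy C sketch of the effect (frequencies here are examples, not the original setup):

        #include <math.h>
        #include <stdio.h>

        #ifndef M_PI
        #define M_PI 3.14159265358979323846
        #endif

        int main(void)
        {
            /* 1 kHz sine sampled at 1.01 kS/s: aliases down to ~10 Hz, so
               consecutive samples barely change. Sampling well above 2 kS/s
               (and ideally much faster) recovers the waveform shape. */
            double f_signal = 1000.0, f_sample = 1010.0;
            for (int i = 0; i < 10; i++)
                printf("%d: %.3f\n", i, sin(2.0 * M_PI * f_signal * i / f_sample));
            return 0;
        }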

  • Triggering PXI-4110 to measure 1 current value while HSDIO PXI-6552 generates a waveform

    Hi,
    Some questions about using the PXI-4110 to measure current while the PXI-6552 is generating a waveform. 
    1. Let's say I need to measure 3 points of current values, i.e. while the PXI-6552 is generating samples 1000, 2000 and 3500. On the edge of samples 1000, 2000 and 3500, the PXI-6552 will send a pulse via a PFI line or via a PXI backplane trigger line. My question is: is it possible to trigger the PXI-4110 (hardware trigger or software trigger) to measure current values at these points?
    2. Let's say I need to measure the current at 0 ms (the start of waveform generation by the PXI-6552), 1 ms, 2 ms, 3 ms, 4 ms... and so on, for 1000 points of measurement; the code diagram is shown in the figure below. Is it possible for the VI "niDCPower Measure Multiple" to measure exactly at 1 ms, 2 ms, 3 ms...? How much time does it take to acquire 1 point of measurement with "niDCPower Measure Multiple"?
    Thanks for viewing this post. Your advice on the hardware to use or the software method is much appreciated. Thanks in advance.  
    Message Edited by engwei on 02-02-2009 04:24 AM
    Attachments:
    [email protected] 46 KB

    Hi engwei,
    1. Unfortunately, the 4110 does not support hardware triggering. Therefore you cannot implement direct triggering through the backplane or anything like that. However, there are a couple of possible workarounds you can try:
    a) Use software triggering: say your 6552 is generating in one while loop, and your 4110 is to measure in another while loop. You can use a software synchronization method like notifiers to send a notification to your 4110 loop when your 6552 has generated the desired sample. This method, however, will not be very deterministic, because the delay between the trigger and the response depends on your processor speed and load. Therefore, if you have other applications running in the background (like antivirus), they will increase the delay.
    b) Use hardware triggering on another device: if you have another device that supports hardware triggering (like an M-series multifunction DAQ module), you can configure this device to be triggered by a signal from the 6552, perform a very quick task (like a very short finite acquisition), then immediately execute the DCPower VI to perform the measurement. The trigger can be configured to be retriggerable for multiple uses. This will most likely have a smaller time delay than the first option, but there will still be a delay (the time it takes to perform the short finite acquisition on the M-series). Please refer to the attached screenshot for an idea of how to implement this.
    2. To make your 4110 measure at specific time intervals, you can use one of the methods discussed above. As for how long it will take to acquire 1 measurement point, you may find this link helpful: http://zone.ni.com/devzone/cda/tut/p/id/7034
    This article is meant for the PXI-4130, but the 4110 has the same maximum sampling rate (3 kHz), so the section discussing speed should apply to both devices.
    Under the Software Measurement Rate section, it is stated that the default behavior of the VI is to take an average of 10 samples. This corresponds to a maximum measurement rate of 300 readings/second. However, if you configure it not to average (take only 1 sample), then the maximum rate of 3000 readings/second can be achieved.
    It is also important to note that your program can only achieve this maximum rate if your software loop takes less time to execute than the actual physical sampling. For example, if you want to sample at 3000 samples/second, taking one sample takes 1/3000 of a second, or about 333 microseconds. If your software execution time is less than 333 microseconds, you can achieve this maximum rate (because the speed is limited by the hardware, not the software). However, if your software takes more than 333 microseconds to execute, then the software loop time will define the maximum sampling rate you can get, which will be lower than 3000 samples/second.
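    A quick, generic way to check which side of that boundary a software-timed loop falls on is to time the loop body directly. A small C sketch (the POSIX clock is an assumption; on Windows you would use the equivalent high-resolution counter, and the loop body is a placeholder for the actual measure call):

        #include <stdio.h>
        #include <time.h>

        int main(void)
        {
            struct timespec t0, t1;
            const int iters = 1000;

            clock_gettime(CLOCK_MONOTONIC, &t0);
            for (int i = 0; i < iters; i++) {
                /* placeholder for the per-point work, e.g. the
                   niDCPower Measure Multiple call */
            }
            clock_gettime(CLOCK_MONOTONIC, &t1);

            double us_per_iter = ((t1.tv_sec - t0.tv_sec) * 1e9 +
                                  (t1.tv_nsec - t0.tv_nsec)) / 1e3 / iters;
            /* Budget is 1/3000 s, roughly 333 us per point, at the 4110's max rate */
            printf("loop body: %.1f us per iteration (budget ~333 us)\n",
                   us_per_iter);
            return 0;
        }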
    I hope this answers your question.
    Best regards,
    Vern Yew
    Applications Engineer, NI ASEAN
    Attachments:
    untitled.JPG 18 KB

  • Red interference during capture with FCP 5.1.4

    I have FCP version 5.1.4. I am using OS X 10.6 Snow Leopard (MacBook Pro). I have captured and edited without any problems. All of a sudden I am getting interference during capture. This interference comes in the form of a bright red color which suddenly fills half my screen from corner to corner, forming a right-angled triangle and hiding half of my picture. It flashes and then remains. Anyone know what the **** this is and how I can rectify it?
    I would be EXTREMELY grateful for any help anyone can give me. Thanks in advance.

    This is happening to me today as well. Has it got something to do with 10.6.7? I have just updated and now I seem to be having the same problem.
    Please help.

  • K7N2G-ILSR IGP graphics interference

    When using the IGP, I see moving diagonal lines on my desktop.
    I don't think it's a memory compatibility issue, since I don't experience any reboots, lock-ups, crashes, etc. I suspect it's the PSU, but I need a second opinion on this.
    I already experimented with the BIOS settings with no luck, even on default.
    Here are my specs:
    XP 1700+
    MSI K7N2G-ILSR
    2 pcs. TwinMOS 512 MB DDR400 (TwinMOS chips)
    Enermax 330 W (+3.3 V = 30 A, +5 V = 32 A, +12 V = 12 A, combined +3.3 V and +5 V = 160 W)
    I know that the 12 V rail is too low, but like I said, I don't experience reboots or anything, just the annoying interference.
    I haven't had this problem whenever I use a separate AGP card (GeForce2 MX).

    Your combined is low as well... it should be more like 200 W to get enough power to this setup...
    Cheers!! 8)

  • Interference on display.

    I'm getting random interference patterns on the display. Sometimes an image will become distorted and have lines running through it. Has anyone had this problem? I've had this on iOS 4.2 & 4.3, so I'm hesitant to say it's a software issue. These issues have also caused a lot of lag lately on my iPod.

    There are many reports of graphics issues after updating to 4.3, but I'm not aware of the same with 4.2.
    See this thread: http://discussions.apple.com/thread.jspa?threadID=2776843&tstart=25
    Peace.
