SCXI AC Voltage Measurement

Hi,
I am attempting to measure a shunt voltage (AC) using an SCXI-1001 / 1102 / 1300 setup. Data transfer is via an SCXI-1600 USB module on Windows XP. I've installed the XP hotfix associated with the data-corruption issues.
I've been attempting to collect at 100 kS/s and get total nonsense when viewing the waveform in MAX. I've checked the channel with a DC voltage and it appears correct, and the AC signal is correct when measured with a multimeter at the terminal block and with a scope.
This is a differential measurement, 0-100 mV AC @ 60 Hz. Task settings: continuous / 200k buffer size / 100k rate.
Any suggestions?
Thanks in advance!
Jon

Hi cosgrove,
How do you display the data collected? Is it in a log file or a graph/chart? Have you tried a different channel on the module, or a different chassis? What is the signal source and how is it connected to the SCXI system? You could try referencing the signal to ground by adding large (100 kOhm or more) resistors from the positive input to ground and from the negative input to ground. Have you tried varying the sample rate? A 60 Hz waveform would be perfectly defined at far less than a 100 kHz sampling rate.
Let me know if any of this helps and we can work from there.
Ben R.
Modular Instruments Product Marketing Engineer
National Instruments
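Ben's point about the sample rate can be sanity-checked numerically. A short Python sketch (illustrative, not part of the original setup): once you sample an integer number of cycles with more than a handful of points per cycle, the measured RMS of a 60 Hz tone already matches the theoretical A/sqrt(2), so a 100 kHz rate buys nothing for this signal.

```python
import math

def sine_rms(freq_hz, rate_hz, amplitude, n_cycles=60):
    """Sample a sine at rate_hz and return the RMS of the samples."""
    n = int(rate_hz / freq_hz * n_cycles)   # an integer number of cycles
    samples = [amplitude * math.sin(2 * math.pi * freq_hz * i / rate_hz)
               for i in range(n)]
    return math.sqrt(sum(s * s for s in samples) / len(samples))

# 20 samples/cycle (1.2 kHz) already reproduces A/sqrt(2) for a 60 Hz tone:
print(sine_rms(60, 1200, 0.1))
print(sine_rms(60, 6000, 0.1))
```

At 1.2 kHz the result is already exact to floating-point precision, so if the acquired waveform looks like nonsense at 100 kS/s, the problem is not an undersampled signal.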

Similar Messages

  • Voltage measurement through SCXI 1327

    Hello,
    I am using an SCXI 1327 module to take some AC voltage measurements. I have been able to configure it successfully in LabVIEW to take readings using the DAQ Assistant tool. Currently I am just supplying AC signals of 5 Vpp max. I understand that the module is capable of measuring up to 300 V rms.
    I would like to know what changes I need to make in the configuration of the module for higher-voltage measurements.
    Thanks in advance,
    Ashish.

    The 1327 is a terminal block that acts as a front end for the 1120D, 1121, or 1125 measurement modules from the SCXI product range; the 1327 on its own can't make any measurements.
    Inside the terminal block is a set of DIP switches per input. These can be set to 1:1 (for low-voltage measurement) or 100:1 (for the higher measurements, up to 300 V). When you switch them to 100:1, make sure you also make this change in MAX for the module accessory on a channel-by-channel basis, otherwise you will get some very strange readings. MAX doesn't automatically compensate for the DIP switch positions; you have to alter this as a separate step. - Mike
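    Mike's warning about mismatched DIP/MAX settings is pure bookkeeping, which a small hypothetical helper makes concrete (the function name and the 100:1 factor are illustrative):

```python
def actual_voltage(module_reading, dip_ratio, max_ratio):
    """Reconstruct voltages from what the module sees after the attenuator.

    dip_ratio: attenuation set by the terminal-block DIP switch (1 or 100)
    max_ratio: attenuation the software (MAX) has been told about (1 or 100)
    Returns (true_voltage, reported_voltage).
    """
    true_voltage = module_reading * dip_ratio   # what is really applied
    reported = module_reading * max_ratio       # what the software displays
    return true_voltage, reported

# Switch at 100:1 but MAX still configured 1:1 -> readings 100x too small:
true_v, shown_v = actual_voltage(module_reading=2.5, dip_ratio=100, max_ratio=1)
```

A 100:1 switch with MAX still at 1:1 displays 2.5 V when 250 V is actually applied, which is exactly the kind of "very strange readings" described above.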

  • Low frequency Voltage measurement

    Hi,
    I am using PCI 6025E to control load bank and to measure voltage, current and frequency of a generator.
    While current and frequency measurement is not a problem, voltage measurement is troublesome.
    Since on test bench already there are few PCI boards and huge number of sensors, I do not want to add NI 9205 or any other additional boards and clutter the area (other people are also working there).
    The problem with voltage measurement is that, of the three generators, two run from 400 RPM to 1600 RPM, i.e., roughly 15-55 Hz.
    I had two solutions for this:
    1. Putting in a step-down transformer (230 Vac / 5 Vac) and then measuring the voltage. But when testing at something like 15 Hz I have to deal with magnetic saturation in a commercially available transformer, and I do not want to go to the trouble of building my own.
    2. Using a high-precision voltage divider. The problem then is galvanic isolation: an isolation transformer would have the same low-frequency problem.
    Could someone advise me? There is a bit of confusion in my head.
    Thanks.
    Solved!

    If you have a sine generator (15 Hz might be hard for a soundcard), you can do a calibration of your transformer.
    (If only the voltage amplitude is of interest, use an isolated DMM as a reference and run your generators.)
    Or you buy an isolated voltage-sense amplifier (Weidmüller, Phoenix, WAGO, ...).
    Greetings from Germany
    Henrik
    LV since v3.1
    “ground” is a convenient fantasy
    '˙˙˙˙uıɐƃɐ lɐıp puɐ °06 ǝuoɥd ɹnoʎ uɹnʇ ǝsɐǝld 'ʎɹɐuıƃɐɯı sı pǝlɐıp ǝʌɐɥ noʎ ɹǝqɯnu ǝɥʇ'

  • How to take voltage measurements every minute

    Hi all,
    I am currently new to LabVIEW and I am trying to take timed voltage measurements. Here is what I am trying to do:
    I am using LabVIEW 2010 and I am trying to take temperature and humidity readings in sync with each other. To measure the temperature I am using a cDAQ with a 9213 module; for humidity I am using the voltage from a 9234 module and converting it using the ambient temperature and the voltage from my humidity sensor.
    Now what I want to do is take voltage and temperature measurements every minute or 30 seconds, but my buffer fills with voltage readings and gives me an error, or the table takes values one by one from what I'm assuming is the buffer, instead of grabbing the current data from the module.
    My real question here is: how do I take current voltage measurements every 30 seconds?
    Thank you for any advice.
    Solved!

    Did you stop your VI when changing voltages?  That would explain it, since the chart retains its memory while the table's Express VI is reset with each run.
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • Voltage measurement error

    Hello,
    I am using a PXI system (PXIe-1073 chassis, PXIe-6363 DAQ card, PXI-2567 relay driver card) connected to my custom PCB.
    My PCB has a 2.5V output precision reference IC whose output is connected to an analog input of PXIe-6363 DAQ card through the 68-pin connector and DAQ cable (SHC68-68).
    When I measure the 2.5V output at the 68-pin connector (Analog input pin of the connector) using an external DMM (with High Z input impedance), I measure exactly 2.5003V, which is what I expect. However, LabVIEW reports back 2.4987V with an error of 1.6mV.
    So, I disconnect my PXI system from my PCB and connect the DAQ card cable to the SCB-68 connector block and supply 2.5V external supply to the analog input and GND through the connector block. Now, both external DMM and LabVIEW report exactly 2.5V. This suggests that the DAQ card and software (voltage measurement VI) are working fine.
    Then why do I have an error of 1.6mV when the PXI system is connected to my PCB ? If the problem is in my PCB, then why does the external DMM measure accurately right at the DAQ 68-pin connector ? What could be going on ?
    Thanks
    Jeet

    Hello,
    Yes, the PXI chassis is still connected to the PCB when I measured with an external DMM.
    Upon further investigation, I found that my switched mode power supply was injecting 240 mV-pp noise on the Ground. This Ground noise was getting filtered down to 60 mV-pp through a choke but it still wasn't good enough. So I used another RF bead (with 1K impedance @ 100 MHz) to further suppress the ground noise and that did the trick. Now my voltage measurements are right on.
    Thanks
    Jeet.

  • What is the fastest way to calculate the rms voltage from a large voltage measurement file?

    I have collected voltage measurements of a 60 Hz waveform and would like to calculate the RMS voltage in order to determine voltage drift. The data was collected at 100 kHz and the TDMS file is 4 hours long (a huge amount of data!). I have run the RMS tool in the basic mathematics block successfully with small sets of data, but the large set seems to take forever to run (48 hours so far and it hasn't finished yet).
    1. Is there a faster way to analyze this much data?
    2. How does the "one sided interval width" affect my results and analysis time?
    3. Can I calculate the RMS voltage with an FFT?
    4. Is there a way to reduce my data file to a manageable size so the RMS function will run quickly, yet maintain accuracy?

    Hi Dave,
    Just a thought: if you have a 60 Hz signal, then 100 kHz is overkill for the sampling rate. If you resample the waveform at, say, 20 x 60 Hz = 1200 Hz, you should have the same detail as in the 100 kHz signal, and an RMS calculation on the resampled channel should take substantially less time. The two commands are listed below.
    ChnResampleChnBased
    ChnResampleFreqBased
    Paul
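    If resampling is not an option, the RMS itself can be computed in a single pass over the file in chunks, so the whole 4-hour record never has to sit in memory. A minimal Python sketch (the chunk source would be whatever your TDMS reader hands you; the names here are illustrative):

```python
import math

def streaming_rms(chunks):
    """RMS over an iterable of sample chunks without loading everything.

    Keeps only a running sum of squares and a sample count, so memory use
    is one chunk at a time regardless of total file length.
    """
    sum_sq = 0.0
    count = 0
    for chunk in chunks:
        sum_sq += sum(x * x for x in chunk)
        count += len(chunk)
    return math.sqrt(sum_sq / count)

# Identical to a one-shot RMS over the concatenated data:
data = [0.1, -0.1, 0.2, -0.2, 0.15, -0.05]
chunked = streaming_rms([data[:3], data[3:]])
```

Because only the sum of squares is accumulated, chunking gives the same answer as processing the whole record at once, at a fraction of the memory.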

  • Slow AC voltage measurement readings with hp34401a

    Hi,
    I am currently using the hp34401a to record an input and an output signal voltage (these are then converted programmatically to a frequency response). This measurement, however, takes exceptionally long for what it is (approx. 3 seconds per measurement). I am using Configure Measurement.vi (set to AC voltage on auto range) and Read (Single Point).vi. If anyone can give me some ideas on how to speed up these readings, that would be appreciated.
    I have had a look at a similar test done with LabVIEW 6.2.1, where the testing is much faster. The same instrument is being used, but with Traditional NI-DAQ.
    Thanks
    Tania
    Note: Labview 8.6

    TanWal wrote:
    I have tried all the possible/suggested ways I can think of to improve the reading. There seems to be little that can be done in AC. Even applying the fast filter did nothing to improve the time. I tried a DC voltage measurement, then a multipoint read using Basic RMS.vi to obtain an AC value, but this is very inaccurate. It seems that this DMM is very limited in its speed for this type of measurement.
    I took a look at the spec. Forget the DC approach if you have frequencies higher than 80 Hz. However, if you have sampled waveforms, do a tone detection (FFT-based) or, better, a SAM (sine approximation via minimum MSE) to get the best out of your data (for an FRF with sine excitation). With RMS measurements you will measure hum and noise too; however, you didn't tell us your uncertainty goal or your DUT.
    I don't use this instrument, but from the spec and my experience with HPs (Agilent) you should be able to get 10 readings/s in AC mode, BUT you still have to wait for the settling of the AC filter after switching (I assume you use the front and rear inputs).
    I would encourage you to write a small state machine for the task and post it here.  
    Greetings from Germany
    Henrik
    LV since v3.1
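    Henrik's distinction between broadband RMS and a tone measurement can be illustrated with the Goertzel algorithm, which evaluates a single DFT bin at the excitation frequency and therefore ignores hum and noise at other frequencies. A Python sketch (the frequencies and amplitudes are made up):

```python
import math

def goertzel_amplitude(samples, freq_hz, rate_hz):
    """Amplitude of one tone via the Goertzel algorithm (single DFT bin)."""
    n = len(samples)
    k = round(n * freq_hz / rate_hz)          # nearest DFT bin
    w = 2 * math.pi * k / n
    coeff = 2 * math.cos(w)
    s1 = s2 = 0.0
    for x in samples:
        s0 = x + coeff * s1 - s2
        s2, s1 = s1, s0
    power = s1 * s1 + s2 * s2 - coeff * s1 * s2   # |X[k]|^2
    return 2 * math.sqrt(power) / n

rate = 6000
t = [i / rate for i in range(6000)]           # 1 s of data, whole cycles
tone = [0.1 * math.sin(2 * math.pi * 60 * ti) for ti in t]
hum = [0.02 * math.sin(2 * math.pi * 180 * ti) for ti in t]  # 3rd harmonic
signal = [a + b for a, b in zip(tone, hum)]
```

The broadband RMS of `signal` is sqrt(0.1^2/2 + 0.02^2/2), about 0.072, inflated by the harmonic, while the Goertzel estimate at 60 Hz recovers the 0.1 tone amplitude alone.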

  • Keithley 2000 voltage measurement

    Hi,
    I want to write some code to read the voltage measured by a Keithley 2000 meter over an RS232 serial port and display it on a graph, using LabVIEW 8.5 on a Windows Vista laptop.
    I guess the first step is to download the driver for the Keithley 2000: http://sine.ni.com/apps/utf8/niid_web_display.model_page?p_model_id=248. I want to go step by step. Please give me the main steps for how to proceed and I will then work out the details based on your directions.
    How do I know that the laptop is recognizing the meter and that the RS232 connection is sound? What are the next key steps I need to work on?
    Thanks

    For fast readings, it is important not to use the MEAS? command, but READ? (or INIT followed by FETCh?, if I remember correctly). Normally these DMMs operate at 1 PLC (power line cycle -> 50/60 Hz), so you need to adjust the NPLC (number of power line cycles) setting.
    There are two options for working with the Keithley DMMs inside LabVIEW. You could use the LabVIEW drivers for the Keithley 2000; those can be found on ni.com and on the Keithley homepage. Probably they only implement GPIB, in which case you need to make some modifications to them for RS232. The other option is to send the commands as found in the DMM manual via VISA.
    Felix
    www.aescusoft.de
    My latest community nugget on producer/consumer design
    My current blog: A journey through uml
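    Felix's advice (avoid MEAS?, fix the range, set NPLC) might look like the following in Python with VISA. The SCPI strings follow the Keithley 2000 manual's abbreviated forms, but verify them against your manual before trusting them; the serial settings in the commented usage are illustrative only:

```python
# Assumed Keithley 2000 setup commands; double-check against the manual
# and adjust range/NPLC for your signal.
FAST_DCV_SETUP = [
    "*RST",
    ":SENS:FUNC 'VOLT:DC'",
    ":SENS:VOLT:DC:NPLC 1",    # integration time in power-line cycles
    ":SENS:VOLT:DC:RANG 10",   # fixed range avoids autorange delays
]

def read_voltage(inst):
    """Trigger one measurement and fetch it (:READ? = trigger + fetch)."""
    return float(inst.query(":READ?"))

# Usage sketch with PyVISA (untested; serial settings are assumptions):
# import pyvisa
# rm = pyvisa.ResourceManager()
# dmm = rm.open_resource("ASRL1::INSTR", baud_rate=9600, read_termination="\r")
# for cmd in FAST_DCV_SETUP:
#     dmm.write(cmd)
# print(read_voltage(dmm))
```

The `inst` argument only needs a `query` method, so the helper works with any PyVISA-style instrument object.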

  • Signal distortion in Voltage Measurement

    Hello,
    I am using a PCI-MIO-16E4 for voltage measurement from a single-phase voltage supply panel, with a CB-68LP as the connector block. When I view the output signal on a scope it shows the exact value and a pure sinusoid, but in LabVIEW 7.1 the magnitude is correct while the sinusoid is distorted. The signal is not continuous and looks like a triggered pulse in the middle of every cycle: the signal breaks, goes up like a square pulse, comes back down, and then the sinusoid continues. I don't know if there is a problem with triggering options that I did not select. I am very new to LabVIEW. The example file I am using (obtained from Example Finder/DAQmx/Voltage) is attached below.
    Any help is appreciated.
    Thanks
    Attachments:
    Cont Acq&Graph Voltage-Int Clk.vi ‏68 KB

    Hello Serges,
    Thanks for your response and suggestions. Let me explain my problem more clearly. I am now using channel 6 for measuring voltage, so I have connected wires from the transducer output to pins 25 and 58 on the CB-68LP. These numbers I obtained from the connection diagram in LV itself, and also from other theses and weblinks.
    Initially I get the right signal for some time, both on the scope and in LV, using our DAQ with the connections specified above. It is the same for all other channels. I am measuring in differential mode only (can you suggest whether this mode is wrong, and where to change the setting in the attached VI?) and have verified it in the test panel too. After a couple of minutes the signal gets distorted in LV, and in MAX as well, as attached above. But the scope shows the same sinusoid without any kind of distortion. I feel the problem could be the connector block, the connecting wire, or my LabVIEW file. The DAQ card is new; we got it from NI one week ago. I will attach a preview of the correct signal with this post.
    Attachments:
    correct1.jpg ‏2305 KB

  • Recording Temperature and Voltage measurements using Keithley 2182 Nanovoltmeter

    Hello all,
    I am relatively new to LabVIEW and looking to extend a VI I am currently using.
    I am trying to record voltage and temperature measurements from a Keithley 2182 nanovoltmeter over a GPIB cable. I have a VI that can do this for either voltage or temperature, but not both. At the moment I only record what is shown on the display of the nanovoltmeter.
    Could somebody explain how I could get LabVIEW either to switch between voltage and temperature on the nanovoltmeter, or whether it is possible to take simultaneous measurements of temperature and voltage, and how I would achieve that?
    Thanks
    Mike

    Hi,
    For each read, whether temperature or voltage, there is a certain command that is sent to the voltmeter.
    I'm pretty sure you cannot read both in parallel, but you can do it successively: one voltage read, one temperature read, and so on.
    There should be something like:
    while not STOP do
      1. send GPIB command for changing Keithley to Voltage Measurement
      2. send GPIB command for Voltage Read
      3. read GPIB -> Voltage
      4. send GPIB command for changing Keithley to Temperature Measurement
      5. send GPIB command for Temperature Read
      6. read GPIB -> Temperature
    end
    You can take a look in the VI to see which commands are sent for the voltage and temperature reads, and mix them as I described above.
    If you don't manage it, share your VIs (for temperature and voltage); maybe it will be easier for me (or someone else) to give you some additional advice.
    Paul
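    Paul's loop could be sketched in Python against a PyVISA-style instrument object. The SCPI strings below are the abbreviated forms I would expect from the 2182 manual, so verify them there before use:

```python
# Assumed Keithley 2182 function-select commands (check the manual):
VOLT_SELECT = ":SENS:FUNC 'VOLT'"
TEMP_SELECT = ":SENS:FUNC 'TEMP'"

def read_pair(inst):
    """One voltage reading, then one temperature reading, in sequence.

    Mirrors the pseudocode loop body: select function, trigger, read.
    """
    inst.write(VOLT_SELECT)
    volts = float(inst.query(":READ?"))
    inst.write(TEMP_SELECT)
    temp = float(inst.query(":READ?"))
    return volts, temp

# In a real program this would be called inside a while-not-stop loop,
# logging each (volts, temp) pair.
```

The `inst` object only needs `write` and `query` methods, so it can be a PyVISA GPIB resource or a test stub.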

  • PXIe-6363 Voltage Measurements

    Hi Guys,
    First post here. I tried to search the documentation first but came up empty.
    I'm using the analog inputs on the 6363 card to make some voltage measurements. I have 2 questions:
    I know the channels are limited to +/- 10 volts for the inputs. I have a 25 volt line with a series current-sense resistor (50 mOhm). I want to make a differential measurement across the resistor using 2 of the analog channels. The actual voltage across the two points will be in the mV range, but the single-ended voltage on each will be ~25 V. Will this work?
    In a similar setup but on a 5 volt line can I make a differential measurement across a current sense resistor using 2 of the channels and then turn around and use one of those same channels to make a single ended measurement wrt gnd?
    Thanks for your help!
    Solved!

    There is a line in the data sheet that states
    Maximum working voltage for analog inputs (signal + common mode)
    ±11 V of AI GND
    So with a common mode voltage of 25V, you will hurt your card.  The 5V line should be fine.
    If you look at Digikey, there are chips specifically made to sense a current shunt.  The amplifiers will turn the differential voltage into a single ended relative to ground.
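    The data-sheet check above can be written down as a quick helper. The +/-11 V working-voltage figure is from the reply; the pin-potential model (high-side shunt, both pins referenced to AI GND) and the function itself are illustrative:

```python
AI_MAX_WORKING_V = 11.0   # +/-11 V of AI GND for the 6363 (per data sheet)

def diff_measurement_safe(line_voltage, shunt_drop):
    """True if both input pins stay inside the card's working-voltage range.

    With a high-side shunt, the + pin sits at line_voltage and the - pin
    at line_voltage - shunt_drop; both must be within +/-11 V of AI GND.
    """
    hi_pin = line_voltage
    lo_pin = line_voltage - shunt_drop
    return all(abs(v) <= AI_MAX_WORKING_V for v in (hi_pin, lo_pin))

# 25 V line with a 50 mOhm shunt at 10 A (0.5 V drop): not safe.
# 5 V line, same shunt: fine.
```

This is just the common-mode arithmetic from the reply; a dedicated current-sense amplifier sidesteps the limit entirely.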

  • Voltage Measuring error in Labview with scxi-1100/1300

    I used the DAQ wizard to set up a simple loop to measure 6 channels on my SCXI-1100 card with a 1300 terminal block.  I actually measured the voltage on the terminal block and get 4.6 volts, but LabVIEW reads between 1.1 and 0.8?  I believe it is something I set up wrong in NI MAX or in my simple LabVIEW loop, because we switched out the 1100 & 1300 cards - any suggestions on what to check next?  Could my PCI card or SCXI chassis be causing this problem?
    thanks 

    Yes.  I am sure it's something simple that I am not doing but I am stumped!
    I have the following setup:  
    PCI-MIO16E-4
    SCXI-1001
          SCXI-1100/1300
    When I open the test panel it reads the real voltage (which I confirmed on the terminal block with a voltmeter), but when I run the sample program or a simple DAQ Assistant scan the numbers are completely different.  I attached screenshots - please take a look and see if you can catch what I have set wrong.
    thanks again!
    Attachments:
    SCXI_1100_Settings_and_readings.zip ‏1844 KB

  • Voltage measurement using SCXI 1302

    I am trying to measure voltages using SCXI 1302

    Great!  Are you having any problems?
    The SCXI-1302 is just a terminal block.  If you have a specific question, you might want to post some more information.  Like:
    1)  What SCXI module are you using (the one connected to the 1302)?
    2)  What SCXI chassis are you using?
    3)  What DAQ device are you cabling to?
    4)  What version of NI-DAQ do you have?
    5)  Are you getting any particular errors?
    6)  Are you testing in Measurement & Automation Explorer?
    7)  What programming language are you using, and what version?  LabVIEW, CVI/LabWindows, C++, etc.
    -Alan A.

  • SCXI 1327 voltage resolution

    My lab has the following setup
    SCXI 1000 chassis
    SCXI 1121
    SCXI 1327
    I am currently cycling 12 volt electric vehicle batteries, but am having trouble configuring the NI hardware to provide a resolution of 1mV.
    I am also having difficulty finding the limitations on the input signal value if I use 1:1 signals.  The operating voltage range would be around 10.5V to 14V.
    Thanks in advance for the feedback.

    Hi HNEI,
    I forgot that you were using an SCXI-1327.  With the SCXI-1327 you can set, on a per-channel basis, whether or not to attenuate the signal by a factor of 100.  Since we want the final measurement to use as much of the board's +/-10 V range as possible, we can attenuate the signal by a factor of 100 and use the formula (signal * gain / attenuation = max voltage) to find the best gain setting for the SCXI-1121:
    14 V * X / 100 = 10 V   ==>  X = 100 * 10 / 14 = 71.42
    So in this case the closest gain we can use without overdriving the board is 50.
    The best part about doing things this way is that you will increase your accuracy by over 250-fold!  Prior accuracy might have been 728 uV; now it should be 2.86 uV.  Definitely an improvement.
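    Otis's calculation generalizes to a small helper that picks the largest usable gain from whatever gain set the module offers. The gain list below is illustrative; check the SCXI-1121 manual for the actual selectable gains:

```python
def best_gain(v_max, attenuation, adc_limit=10.0,
              gains=(1, 2, 5, 10, 20, 50, 100, 200, 500, 1000)):
    """Largest gain keeping v_max * gain / attenuation within the ADC range.

    Returns None if even the lowest gain would overdrive the digitizer.
    The default gain tuple is an assumption, not the 1121's real list.
    """
    usable = [g for g in gains if v_max * g / attenuation <= adc_limit]
    return max(usable) if usable else None

# 14 V battery, 100:1 terminal-block attenuation, +/-10 V digitizer:
# the ideal gain is 71.42, and the closest real gain below it is 50.
```

This reproduces the worked example above: with 100:1 attenuation the ideal gain is 71.42, so 50 is chosen; without attenuation no gain fits a 14 V signal.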
    Sorry about the oversight, but I'm glad to see that things should work even better for you now.
    All the best,
    Otis
    Training and Certification
    Product Support Engineer
    National Instruments

  • SCXI-1121 voltage problem

    I have an SCXI-1121 connected to an SCXI-1327 in a PXI-1052 chassis. The excitation voltage fluctuates within 1 mV, which makes my measurements unusable, and instead of 3.33 V I get 3.35 V on one channel and 3.345 V on a second.  Is this normal for this device, or should I send it in for repair? Or maybe I should use some filter to stabilise the excitation?

    Hi ilia.art,
    Did you use a multimeter to measure the voltage on each channel?  Or a high impedance probe?  According to the data sheet, the excitation voltage should be 3.333 V plus or minus 0.04%, so it is possible that your board is out of spec, but it's important to make sure that the measurements were taken correctly before you waste your time and money to send it back.  What are you using this excitation voltage for and how important is this accuracy to your application?  The more information you provide about your application, the more we'll be able to help.
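    Sarah's spec check (3.333 V plus or minus 0.04%) is easy to put into a small illustrative helper:

```python
def within_spec(measured, nominal=3.333, tolerance_pct=0.04):
    """Check a reading against the excitation spec (3.333 V +/- 0.04%)."""
    band = nominal * tolerance_pct / 100.0   # about 1.3 mV
    return abs(measured - nominal) <= band

# Both reported readings fall outside the roughly 3.3317-3.3343 V band:
print(within_spec(3.35), within_spec(3.345))
```

Both reported readings (3.35 V and 3.345 V) are outside the band, which is why confirming the measurement method first, as suggested above, matters before sending the card back.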
    If you do decide that you are interested in sending your card back, you can call our customer service department at (800) 531-5066 and they will help you start the process to do an RMA (Return Merchandise Authorization).  Or if you have a support contract, you can create a service request and save some time on hold to get routed to an applications engineer who will begin the process for you.
    Sarah Y
    SDR Product Manager
    National Instruments | Ettus Research
