PXI-4110 current limit

I am programming a PXI-4110 with LabWindows. I am trying out the current limiting. I have an output set to 5 V with a 1 kΩ load resistor, which makes the current 5 mA. I set the current limit to 2 mA to see it work. When I read back the voltage and current they are still 5 V and 5 mA. Here is my code:
g_voltageLevel_ch0 = 5.0;
g_currenLimit_ch0= 2.0e-3;
 status = niDCPower_ConfigureOutputFunction(vi_4110_0, channel0Name, NIDCPOWER_VAL_DC_VOLTAGE);
 niDCPower_error_message(vi_4110_0, status, errorMessage);
 status = niDCPower_ConfigureSense(vi_4110_0, channel0Name, NIDCPOWER_VAL_LOCAL);
 niDCPower_error_message(vi_4110_0, status, errorMessage);
 status = niDCPower_ConfigureVoltageLevel(vi_4110_0, channel0Name, g_voltageLevel_ch0);
 niDCPower_error_message(vi_4110_0, status, errorMessage);
 status = niDCPower_ConfigureCurrentLimit(vi_4110_0, channel0Name, NIDCPOWER_VAL_CURRENT_REGULATE, g_currenLimit_ch0);
 niDCPower_error_message(vi_4110_0, status, errorMessage);
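If the limit were taking effect I would expect the channel to go into current regulation, so the output should drop to roughly 2 mA x 1 kΩ = 2 V and the channel should report that it is in compliance. For reference, this is the read-back check I use after changing the limit (just a sketch; it reuses the session handle, channel name, status and error buffer from my code below):

 ViReal64  measuredVoltage, measuredCurrent;
 ViBoolean inCompliance;

 status = niDCPower_MeasureMultiple(vi_4110_0, channel0Name, &measuredVoltage, &measuredCurrent);
 niDCPower_error_message(vi_4110_0, status, errorMessage);
 status = niDCPower_QueryInCompliance(vi_4110_0, channel0Name, &inCompliance);
 niDCPower_error_message(vi_4110_0, status, errorMessage);
 /* With a 1 kOhm load and a 2 mA limit I expect measuredCurrent ~ 2 mA,
    measuredVoltage ~ 2 mA * 1 kOhm = 2 V, and inCompliance == VI_TRUE
    (the channel is limiting the current). */
 printf("V = %g V, I = %g A, inCompliance = %d\n", measuredVoltage, measuredCurrent, (int)inCompliance);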

I'm getting an error now. I have a 1 kΩ resistor on each of the outputs so I will see some current.
Look for the comment lines below to see the notes I am adding about the code.
When I start my program I call a function PXI4110_HardwareInit() to get things set up. This part seems to work OK.
/************************************** PXI-4110 ****************************************/
// PXI-4110 hardware definitions for initialization
static ViSession vi_4110_0 = VI_NULL, vi_4110_1 = VI_NULL, vi_4110_2 = VI_NULL;
/* The channel names are length 2 because the maximum size of the string
from each textbox on the UI is 1 character. */
static ViChar channel0Name[2] = "0";  // 0V to 6V channel
static ViChar channel1Name[2] = "1";  // 0V to 20V channel
static ViChar channel2Name[2] = "2";  // 0V to -20V channel
// inputs
ViReal64 g_voltageLevel_ch0 = 5.0;
ViReal64 g_voltageLevel_ch1 = +13.0;
ViReal64 g_voltageLevel_ch2 = -13.0;
ViReal64 g_currenLimit_ch0 = 500.0e-3;
ViReal64 g_currenLimit_ch1 = 500.0e-3;
ViReal64 g_currenLimit_ch2 = 500.0e-3;
static void Set_PXI_4110_Outputs(void);   // forward declaration (defined below)

void PXI4110_HardwareInit(void)
{
    ViStatus status;
    ViChar errorMessage[256];
    ViChar resourceDCSupplyName[256] = "PXI-4110";
    // set channel 0 to +5.0V
    status = niDCPower_InitializeWithChannels(resourceDCSupplyName, channel0Name, VI_TRUE, VI_NULL, &vi_4110_0);
    if (status < 0)
        niDCPower_error_message(vi_4110_0, status, errorMessage);
    status = niDCPower_ConfigureOutputFunction(vi_4110_0, channel0Name, NIDCPOWER_VAL_DC_VOLTAGE);
    niDCPower_error_message(vi_4110_0, status, errorMessage);
    status = niDCPower_ConfigureSense(vi_4110_0, channel0Name, NIDCPOWER_VAL_LOCAL);
    niDCPower_error_message(vi_4110_0, status, errorMessage);
    status = niDCPower_ConfigureCurrentLimitRange(vi_4110_0, channel0Name, 1.0);
    niDCPower_error_message(vi_4110_0, status, errorMessage);
    Set_PXI_4110_Outputs();   // sets the voltage level and current limit for channel 0
    status = niDCPower_ConfigureOutputEnabled(vi_4110_0, channel0Name, VI_FALSE);
    status = niDCPower_Initiate(vi_4110_0);
    niDCPower_error_message(vi_4110_0, status, errorMessage);
}
static void Set_PXI_4110_Outputs(void)
{
    ViStatus status;
    ViChar errorMessage[256];
    // channel 0
    status = niDCPower_ConfigureVoltageLevel(vi_4110_0, channel0Name, g_voltageLevel_ch0);
    niDCPower_error_message(vi_4110_0, status, errorMessage);
    status = niDCPower_ConfigureCurrentLimit(vi_4110_0, channel0Name, NIDCPOWER_VAL_CURRENT_REGULATE, g_currenLimit_ch0);
    niDCPower_error_message(vi_4110_0, status, errorMessage);
}
Then I send a message over Ethernet to enable the outputs. When I enable the outputs, this function reads the voltage and current and sends the readings to an application on my host computer. The readings match what I set and what I see with a voltmeter.
void PXI_4110_enable_outputs(void)
{
    ViStatus status;
    ViReal64 measuredVoltage, measuredCurrent;
    ViBoolean inCompliance;
    ViChar errorMessage[256];
    // set the outputs
    // Set_PXI_4110_Outputs();
    status = niDCPower_ConfigureOutputEnabled(vi_4110_0, channel0Name, VI_TRUE);
    /* Wait for the outputs to settle. */
    Delay(50e-3);
    // check channel 0
    status = niDCPower_MeasureMultiple(vi_4110_0, channel0Name, &measuredVoltage, &measuredCurrent);
    niDCPower_error_message(vi_4110_0, status, errorMessage);
    status = niDCPower_QueryInCompliance(vi_4110_0, channel0Name, &inCompliance);
    niDCPower_error_message(vi_4110_0, status, errorMessage);
}
Now I send a message to change the current limit. This is where I have trouble. I try to limit the current on channel 0 to 2 mA. Since the output voltage is 5 V and the resistor is 1 kΩ, the current will exceed the current limit, which is what I am trying to test. I get an error in this function:
   g_currenLimit_ch0 = CommandData->data; 
   status = niDCPower_ConfigureCurrentLimit(vi_4110_0, channel0Name, NIDCPOWER_VAL_CURRENT_REGULATE, g_currenLimit_ch0);
   niDCPower_error_message(vi_4110_0, status, errorMessage);
I see g_currenLimit_ch0 = 2e-3, which is what I sent, but my status is a negative number. The error message is "Invalid value for parameter or property".
vi_4110_0 is the handle I got earlier.
channel0Name = "0"
and g_currenLimit_ch0 = 2e-3.
I do not understand why I am seeing an error here.
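In case it matters, here is the debug check I am planning to add just before the call that fails, to see what window of current-limit values the driver will accept for this channel at the present level (a sketch; I am assuming niDCPower_QueryMinCurrentLimit / niDCPower_QueryMaxCurrentLimit report the valid limits for the configured range, and it reuses the globals from my init code):

 static void Check_CurrentLimit_Window(ViReal64 requestedLimit)
 {
     ViStatus status;
     ViChar   errorMessage[256];
     ViReal64 minLimit = 0.0, maxLimit = 0.0;

     status = niDCPower_QueryMinCurrentLimit(vi_4110_0, channel0Name, g_voltageLevel_ch0, &minLimit);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
     status = niDCPower_QueryMaxCurrentLimit(vi_4110_0, channel0Name, g_voltageLevel_ch0, &maxLimit);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
     printf("Valid current limit for %g V: %g A to %g A (requested %g A)\n",
            g_voltageLevel_ch0, minLimit, maxLimit, requestedLimit);
 }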
thanks in advance for your help,
Don

Similar Messages

  • 4110 current range selection

    Hello. I am having trouble with PXI-4110 current range programming. My algorithm:
    1. Initialize the device.
    2. Property node: active channel, output enabled = true, output function = DC voltage.
    3. Property node: active channel, voltage level (variable), current limit autorange = on.
    4. Measure Multiple, read the result (current, current range immediate value).
    My implementation of this algorithm is attached.
    The current limit range value stays at 1. The required voltage is 3.3 V.
    Could you send me an example of changing the current limit range?
    Thank you!
    Attachments:
    Vcc Icc meas.vi 21 KB

    Hi Gorge-
         To set the current or voltage ranges, you must use the VIs found on the DC Voltage palette.  They can be found on the block diagram by navigating to Measurement I/O»NI DCPower»Source»DC Voltage.  There are four VIs in there that configure the voltage and current range and limit.  For an example of how to use these, you can look in the LabVIEW Example Finder.  In LabVIEW, click on Help»Find Examples.  Once the example finder opens, navigate to Hardware Input and Output»Modular Instruments»NI DCPower»NI-DCPower Source DC Current.vi.  While this VI sources current and probably doesn't do what you want, it demonstrates how to use the Current/Voltage Range/Limit VIs.  These will allow you to set the voltage or current range of your device.
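     For anyone working from the LabWindows/CVI side instead of LabVIEW, the equivalent C-API calls look roughly like this (a sketch; the resource name, channel and values are placeholders, and I am assuming the ConfigureVoltageLevelRange/ConfigureCurrentLimitRange functions as documented):

        ViSession vi = VI_NULL;
        ViStatus  status;
        status = niDCPower_InitializeWithChannels("PXI-4110", "0", VI_TRUE, VI_NULL, &vi);
        status = niDCPower_ConfigureOutputFunction(vi, "0", NIDCPOWER_VAL_DC_VOLTAGE);
        status = niDCPower_ConfigureVoltageLevelRange(vi, "0", 6.0);    /* volts */
        status = niDCPower_ConfigureVoltageLevel(vi, "0", 3.3);
        status = niDCPower_ConfigureCurrentLimitRange(vi, "0", 1.0);    /* amps */
        status = niDCPower_ConfigureCurrentLimit(vi, "0", NIDCPOWER_VAL_CURRENT_REGULATE, 0.5);
        status = niDCPower_Initiate(vi);
        /* ... measure ... */
        status = niDCPower_close(vi);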
         I hope this helps.  Best of luck with your application!
    Gary P.
    Applications Engineer
    National Instruments
    Visit ni.com/gettingstarted for step-by-step help in setting up your system.

  • Triggering PXI-4110 to measure 1 current value while HSDIO PXI-6552 generates a waveform

    Hi,
    I have some questions about using the PXI-4110 to measure current while the PXI-6552 is generating a waveform.
    1. Let's say I need to measure 3 points of current values, i.e. while the PXI-6552 is generating sample 1000, 2000 and 3500. On the edge of samples 1000, 2000 and 3500, the PXI-6552 will send a pulse via a PFI line or via a PXI backplane trigger line. My question is: is it possible to trigger the PXI-4110 (hardware trigger or software trigger) to measure current values at these points?
    2. Let's say I need to measure the current at 0 ms (start of waveform generation by the PXI-6552), 1 ms, 2 ms, 3 ms, 4 ms... and so on for 1000 points of measurement, with the code diagram as shown in the figure below. Is it possible for the VI "niDCPower Measure Multiple" to measure exactly at 1 ms, 2 ms, 3 ms...? How much time does it take to acquire one measurement point with "niDCPower Measure Multiple"?
    Thanks for viewing this post. Your advice on hardware used or software method is much appreciated. Thanks in advance.  
    Attachments:
    [email protected] 46 KB

    Hi engwei,
    1. Unfortunately, the 4110 does not support hardware triggering. Therefore you cannot implement direct triggering through the backplane or anything like that. However, there are a couple of possible workarounds you can try:
    a) Use software triggering: Say your 6552 is generating in one while loop, and your 4110 is to measure in another while loop. You can use a software synchronization method like notifiers to send a notification to your 4110 loop when your 6552 has generated the desired sample. This method, however, will not be very deterministic because the delay between the trigger and the response depends on your processor speed and load. Therefore, if you have other applications running in the background (like antivirus) it will increase the delay.
    b) Use hardware triggering on another device: If you have another device that supports hardware triggering (like an M-series multifunction DAQ module), you can configure this device to be triggered by a signal from the 6552, perform a very quick task (like a very short finite acquisition), then immediately execute the DCPower VI to perform the measurement. The trigger can be configured to be re-triggerable for multiple uses. This will most likely have a smaller time delay than the first option, but there will still be a delay (the time it takes to perform the short finite acquisition on the M-series). Please refer to the attached screenshot for an idea of how to implement this.
    2. To make your 4110 measure at specific time intervals, you can use one of the methods discussed above. As for how long it will take to acquire 1 measurement point, you may find this link helpful: http://zone.ni.com/devzone/cda/tut/p/id/7034
    This article is meant for the PXI-4130 but the 4110 has the same maximum sampling rate (3 kHz) and so the section discussing the speed should apply for both devices.
    Under the Software Measurement Rate section, it is stated that the default behavior of the VI is to take an average of 10 samples. This corresponds to a maximum sampling rate of 300 samples/second. However, if you configure it to not do averaging (take only 1 sample) then the maximum rate of 3000 samples/second can be achieved.
    It is also important to note that your program can only achieve this maximum sampling rate if your software loop takes less time to execute than the actual physical sampling. For example, if you want to sample at 3000 samples/second, taking one sample takes 1/3000 seconds or about 333 microseconds. If your software execution time is less than 333 microseconds, then you can achieve this maximum rate (because the speed is limited by the hardware, not the software). However, if your software takes more than 333 microseconds to execute, then the software loop time will define the maximum sampling rate you can get, which will be lower than 3000 samples/second.
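    In LabWindows/CVI terms, a software-timed measurement loop would look something like the sketch below (Timer() and Delay() are the CVI utility functions, and the 1 ms interval is only illustrative); it can only hold the interval if one iteration, including niDCPower_MeasureMultiple, finishes faster than the interval:

        #include <utility.h>
        #include "nidcpower.h"

        void MeasureAtFixedInterval(ViSession vi, const char *channel, int numPoints)
        {
            ViReal64 voltage[1000], current[1000];
            double   interval = 1.0e-3;            /* 1 ms between points */
            double   next = Timer();
            int      i;

            for (i = 0; i < numPoints && i < 1000; i++)
            {
                niDCPower_MeasureMultiple(vi, channel, &voltage[i], &current[i]);
                next += interval;
                while (Timer() < next)
                    ;                               /* wait for the next time slot */
            }
        }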
    I hope this answers your question.
    Best regards,
    Vern Yew
    Applications Engineer, NI ASEAN
    Attachments:
    untitled.JPG 18 KB

  • Measure power/current being delivered by PXI-4110 on a specific channel?

    I have a PXI-1033 chassis and inside it there's a PXI-4110. How do I use LabVIEW 8.2.1 to determine the power/current consumption on a specific output channel?


  • PXI-4110 Cards failing since upgrading to Calibration Executive 3.4

    Since we upgraded to Calibration Executive 3.4 all of our PXI-4110 Cards have been failing the following:
    Calibration, As Found:
    | Channel | As Found DMM Reading | As Left DMM Reading | Test Value | Low Limit | Reading | High Limit | Pass/Fail |
    | 2 | 0.999942 A | 0.999942 A | 0.25000 A | 0.99557 A | 0.94065 A | 1.00432 A | Failed |
    | 2 | 0.999991 A | 0.999991 A | 0.50000 A | 0.99524 A | 0.93736 A | 1.00474 A | Failed |
    | 2 | 0.999761 A | 0.999761 A | 0.75000 A | 0.99370 A | 0.94076 A | 1.00582 A | Failed |
    This failure occurred on 3 brand-new cards, and I even tried the last PXI-4110 from before the upgrade; it also failed these tests, and when an adjustment is attempted I receive the following error:
    Error -300010 occurred at Get UserField Data.vi
    Complete call chain:
         cex_UtilityGetUserFieldData.vi
         cex_UtilityGetUserFieldDataByIndex.vi
         _NI 4110_Get Channel Range and Level.vi
         Adjust_NI 4110_Current Out Accuracy.vi
    at step Adjust Current Output Accuracy
    I am using an Agilent 34401 for the DMM.

    JVP
    Here are the files you wanted. Sorry this took so long. The hard drive on the computer that we have the Calibration Executive software on died and we had to reinstall everything. I tried to run another card with the same results; the calibration report I am sending you is from today. I sent the report in two formats, XLS and PDF.
    Attachments:
    ni_support.zip 197 KB
    71517 Wednesday, January 11, 2012 8-28-29.xls 49 KB
    71517 Wednesday, January 11, 2012 8-28-29.pdf 30 KB

  • Measuring sub 125 microamps using PXI 4110 - Is it possible?

    I have a question concerning the PXI 4110. We are trying to use this card to provide voltage to a device while at the same time using it to monitor the real-time current being supplied. What we have found is that it is capable of providing voltage from 0 to 6 volts, but we have not been able to read below ~500 microamps. The data sheet says it has a resolution of 0.01 mA or 10 microamps, but we have not been able to get the card to read consistently under 500 microamps.
    Is there a minimum current draw required before the 10-microamp resolution becomes usable?
    Here is what we are trying to monitor using this card:
    Voltage supplied between 3.5 to 6 volts.
    When the device is asleep, current draw is between 50 to 100 microamps.
    When the device is awake, current draw can be as high as 100 milliamps.
    Will this card work?  Is there a better solution to this? 
    Thanks

    Hi John,
    Thanks for the additional information. Based on your sampling rate you should be fine with a software- or hardware-timed device. What you need to determine is how accurately you need to measure your sleep current. The issue is that you do not know which mode your DUT will be in, so you cannot change the current range to match. At this time our power supplies have output and measure ranges that are coupled together. This means that when you change to a lower range your output current will be limited to the maximum of that range, which can be an issue when the DUT 'wakes up'.
    I would recommend selecting a device that can remain in the higher current range (100 mA or greater) and that has the measurement capability that meets your needs. You need to know how tightly you must measure the sleep current, i.e. within +/- X uA. When selecting the device you will then want to look at its measurement accuracy on the range you will use. The calculation is +/- (Measurement * Gain Error + Offset Error). You will find that the largest portion of the error will be offset, due to the range you will be using. You also need to take into account the resolution of the instrument, because that is the smallest possible change you can measure.
    One comment on offset error is that for a given test setup/temperature/humidity/etc and test voltage it will stay fairly constant. This means that you can characterize the offset of your system if all of those factors remain constant. I would recommend that you would set up your test, including all fixturing/cabling excluding the DUT. You can set the supply to your test voltage and measure the current. In this setup the ideal current would be 0uA because it is an 'open', but due to leakages in the system there will be an offset. You can take this reading as a baseline 'zero' and subtract it from future readings to improve your measurements. You will want to be careful of Dielectric Absorption (DA) because it can mislead you when making measurements like this, but it is less of an issue when talking about uA and more of an issue when measuring pA. It would be a good idea to repeat this characterization periodically to ensure that your measurements are accurate, ideally once per DUT, but you can scale that back as necessary.
    I hope this is helpful. It is a good idea to evaluate the hardware in your test setup to ensure that the measurements meet your needs. I would also add the PXI-4132 to your list of options to consider for its 100mA range. I think that these other devices would be better than the PXI-4110 in your application because of the low current measurements you need. If you can use the additional channels the PXIe-4140/4141 are good options, if not the PXI-4132 would be a good option. You should also consider the different connectors for PXI vs PXIe and what will work for your chassis.  
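    In case it helps, here is a rough C-API sketch of the zeroing procedure described above (session setup and the no-DUT fixture condition are assumed; the helper function names are placeholders of my own):

        #include "nidcpower.h"

        /* With the fixture connected but no DUT, measure the 'open' leakage once. */
        ViReal64 CharacterizeOffset(ViSession vi, const char *channel)
        {
            ViReal64 v, iOffset;
            niDCPower_MeasureMultiple(vi, channel, &v, &iOffset);  /* ideally 0 A */
            return iOffset;
        }

        /* Subtract the stored baseline from later readings. */
        ViReal64 MeasureDutCurrent(ViSession vi, const char *channel, ViReal64 iOffset)
        {
            ViReal64 v, i;
            niDCPower_MeasureMultiple(vi, channel, &v, &i);
            return i - iOffset;
        }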
    Steve B

  • PXI 4110 and DC Power Express VI

    hi everyone,
    I'm really new to LabVIEW and I have to use a PXI 4110 to supply a DC voltage with a square wave (or whatever shape I choose).
    I tried using the "DC Power Express VI" but I can't understand how to connect the generated signal. Right now I have the waveform and I put a block which transforms it from dynamic data to an array, but it doesn't work!
    I tried looking at the examples, but none of them uses the Express VI...
    thanks for any help!!

    As bronzacuerta mentioned, the PXI-4110 is not intended to generate a square waveform, or any specific waveform. It is intended to regulate a constant voltage or current, adjusting for changing loads. By changing the output using the NI-DCPower API, you may be able to approximate a square waveform, but it will not be a very good one, both in terms of rise/fall times and due to software-timed output updates.
    A potentially better option (based on speed/flexibility/accuracy requirements) instead of a Multifunction DAQ card is a signal generator.
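    To give an idea of what software-timed updates look like, a C-API approximation would be something like the sketch below (Delay() is the LabWindows/CVI utility call, the parameters are illustrative, and the rise/fall times and period jitter remain limited by software timing, as described above):

        #include <utility.h>
        #include "nidcpower.h"

        void ApproximateSquareWave(ViSession vi, const char *channel,
                                   ViReal64 lowLevel, ViReal64 highLevel,
                                   double halfPeriodSeconds, int cycles)
        {
            int n;
            for (n = 0; n < cycles; n++)
            {
                niDCPower_ConfigureVoltageLevel(vi, channel, highLevel);
                Delay(halfPeriodSeconds);
                niDCPower_ConfigureVoltageLevel(vi, channel, lowLevel);
                Delay(halfPeriodSeconds);
            }
        }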
    Tobias
    Staff Software Engineer
    Modular Instruments
    National Instruments

  • Don't know how to set current limit for servomotors in MID

    My application is to control a servo motor. I'm using a PXI-7350, driving a motor (Pittman GM8712F434) with a MID-7654. The motor is connected to mechanical parts and the mechanical parts are connected to a potentiometer as feedback. Now, according to the spec of the motor, it lists reference voltage 19.1 V, inductance 5.4 mH, peak current 1.76 A. So, I set the DIP switches of the MID to 0.85 A for continuous current, 1.70 A for peak current, and standard inductance. The motor didn't move and the fault LED lit. Then I set both continuous and peak current to around 1.7 A. The motor still didn't move and the fault LED was still lit. I kept increasing the current until it hit 3 A. It started moving slowly but the fault LED was still lit. Finally I set the peak current to 3.5 A, and the motor ran normally. Now to the questions.
    1. Is the peak current in the spec the same as the peak current limit in the MID? How do I determine the peak current limit? I'm afraid that my little motor is going to burn out.
    2. How do I set the continuous current; is it supposed to be half of the peak current?
    3. If the +10/-10 V command voltage (torque limit) gets mapped to the peak current (say 1.7 A), will +5/-5 V be mapped to 0.85 A? And what happens after 2.7 sec if I set the continuous current to 0.85 A? Will +5/-5 V be mapped to half of 0.85?
    4. How does the '2.7 sec' thing work? When I set peak and continuous current to 3.5 A and 1.7 A, after 2.7 sec the motor should run slower or even stop (in my case), but it went nuts as if it ran at maximum current.
    I set Kp to 20 and the others to zero; everything else is the same as the default. Thank you, I appreciate your answers.

    infoiupui,
    There are several DIP switches on the front of the MID 7654 that can be toggled to vary the continuous and peak current values for the MID. There is a small panel that can be removed from the lower left corner of the MID. You can look in the manual for the 7654, linked below, to determine the particular settings that you should use to achieve given continuous and peak current values.
    MID-7654/7652 Servo Power Motor Drive User Guide
    http://digital.ni.com/manuals.nsf/webAdvsearch/E382785B7D67553E86256A5D0072BBE1?OpenDocument&vid=niwc&node=132100_US
    Take a look at pages 10 through 13 of this document for information on how to modify those DIP switches to achieve current values that will work with the motor that you are using.
    I am not clear on what you are referring to when you mention 2.7 seconds. Could you clarify that for me?
    I hope that helps! Let me know if you have any additional questions on this issue.
    Regards,
    Scott R.
    Applications Engineer
    National Instruments

  • PXI-4110 niDCPower_init

    I am just starting to write a program for my PXI-4110 Programmable DC Power Supply.
    I am using the function niDCPower_init(). When I read the documentation about this function it made this statement:
    "This function is deprecated. Use niDCPower_InitializeWithChannels instead."
    My fp file (nidcpower.fp) does not have this function. Do I need a newer fp file? Is this function niDCPower_init() only a poor choice or is there something wrong with it?
    Deprecated was an odd word to use to describe this function. It implies that the function is disapproved of but maybe it works. What is the opinion of others?
    I am going to try it out but would like to know if this is the function I want to use.
    thanks in advance

    DPearce,
    Thank you for using NI forums! Deprecated functions use the deprecated programming model. 
    In terms of the niDCPower_init function, it is obsolete. If you do not have niDCPower_InitializeWithChannels, I would recommend downloading the most current version of our driver. You can download the driver here. You can find more information on this function here. The VI version is here (I tend to find the VI version a little more reader friendly!). 
    Also, I would check out the shipping examples that come with this driver. That will help you see the various functions in action and assist you in determining which one you want to use. 
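    For what it's worth, in C the change is roughly the following (a sketch; the resource name is whatever alias you have configured in MAX, and the deprecated prototype is quoted from memory):

        ViSession vi;
        ViStatus  status;

        /* Deprecated entry point: */
        status = niDCPower_init("PXI-4110", VI_TRUE, VI_TRUE, &vi);

        /* Replacement: initialize with an explicit channel list ("0,1,2" or just "0"),
           reset the device, no option string: */
        status = niDCPower_InitializeWithChannels("PXI-4110", "0,1,2", VI_TRUE, VI_NULL, &vi);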
    Let me know if you have any other questions!
    Katie
    Katie Collette
    National Instruments

  • PXI 4130 current range error

    Hi, 
    I'm trying to configure a PXI-4130 with LabVIEW.
    DC voltage and range set to 1.5 V - no error.
    Current limit and range set to 200 uA, 2 mA and 20 mA - I get the following error.
    The error says that I'm not setting the right range, which is not true. It also reports Property: Current Limit, Corresponding Value: 100.0e-3; in fact this is set to 2 mA in the VI.
    I tried with the VI in the llb as well as the NI-DCPower property node.
    Confused !! Please help.
    Property Node (arg 2) in PXI_4130_SMU.vi <ERR>
    Invalid value for parameter or property.
    Conflicting Properties: Current Limit Range, Current Limit
    Property: Current Limit Range
    Requested Value:  2.0e-3
    Possible Values:  200.0e-6,  2.0e-3,  20.0e-3,  200.0e-3,  2.000000
    Property: Current Limit
    Corresponding Value:  100.0e-3
    Specified Range Minimum:  40.0e-6
    Specified Range Maximum:  2.0e-3
    Channel Name: 1
    Status Code: -225152
    Thanks
    Manu

    Hi Manu,
    What version of the DC Power driver are you working with? Are you explicitly setting both the Current Limit and the Current Limit Range in your VI? If you open up the Example Finder (Help>>Find Examples) you can locate the DC Power Examples (Hardware Input and Output>>Modular Instruments>>NI-DCPOWER) and I would recommend that you open up and try running the example titled "NI-DCPower Source DC Voltage.vi". This example will configure the SMU to source a DC voltage for the configured current limit and current limit range. If properly configured, this VI should run without errors and can be used as guide for writing your VI. 
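    The key point from the error text is that the current limit has to lie inside the current limit range you select; requesting a 2 mA range while the current limit property is still at 100 mA produces exactly that conflict. In C-API terms this would look roughly like the following (a sketch; vi is assumed to be an open session to the 4130, and the values follow the error message):

        ViStatus status;
        /* Keep the limit inside the selected range. */
        status = niDCPower_ConfigureCurrentLimitRange(vi, "1", 2.0e-3);   /* 2 mA range */
        status = niDCPower_ConfigureCurrentLimit(vi, "1", NIDCPOWER_VAL_CURRENT_REGULATE, 1.0e-3);   /* 1 mA limit */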
    Steve B

  • What is the proper way to close all open sessions of a NI PXI-4110 for a given Device alias?

    I've found that, when programming the NI PXI-4110, if the "niDCPower Initialize With Channels" VI (NI-DCPower palette) is called with a device
    alias that already has one or more sessions open (due to an abort or other programming error), the reference out returns a device reference with "(n)" post-fixed to it, where n is an integer that increments with each initialize call. In my clean-up, I would like to close all open sessions. For example, let's say the device alias is "NIPower_1" in NI MAX, and there are 5 open sessions: NIPower_1, NIPower_1 (1), NIPower_1 (2), NIPower_1 (3), and NIPower_1 (4). A simple initialize or reset (using the niDCPower Initialize With Channels VI, etc.) does not clear them. What is the proper way to close all open sessions?
    Thanks in advance. Been struggling with this for days!

    When you Initialize a session to a device that already has a session open, NI-DCPower closes the previous session and returns a new one. You can verify this very easily: try to use the first session after the second session was opened.
    Unfortunately, there is a small leak and that is what you encountered: the previous session remains registered with LabVIEW, since we unregister inside the Close VI and this was never called. So the name of the session still shows in the control like you noted: NIPower_1, NIPower_1 (1), NIPower_1 (2), NIPower_1 (3), and NIPower_1 (4), etc.
    There may be a way to iterate over the registered sessions, but I couldn't find it. However, you can unregister them by calling "IVI Delete Session". Look for it inside "niDCPower Close.vi". If you don't have the list of open sessions, but you have the device name, then you can just append (1), (2) and so forth and call "IVI Delete Session" in a loop. There's no problem calling it on sessions that were never added.
    However - I consider all this a hack. What you should do is write code that does not leak sessions. Anything you open, you should close. If you find yourself in a situation where there are a lot of leaked sessions during development, relaunching LabVIEW will clear it out. If relaunching LabVIEW is too much of an annoyance, then write a VI that does what I described above and run it when needed. You can even make it "smarter" by getting the names of all the NI-DCPower devices in your system using the System Configuration or niModInst APIs.
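    In C-API terms the same discipline is simply pairing every successful initialize with a close (a sketch; the alias is taken from your example):

        ViSession vi = VI_NULL;
        ViStatus  status = niDCPower_InitializeWithChannels("NIPower_1", "0,1,2", VI_TRUE, VI_NULL, &vi);
        if (status >= 0)
        {
            /* ... configure, initiate, measure ... */
            niDCPower_close(vi);   /* closing is what unregisters the session name */
            vi = VI_NULL;
        }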
    Hope this helps.
    Marcos Kirsch
    Principal Software Engineer
    Core Modular Instruments Software
    National Instruments

  • IVI Configuration with PXI-4110 in TestStand

    Hello All,
    Set-Up:
    PXI-1033 connected through MXI to a PC running Windows 7. LabVIEW 2014. TestStand 2014 (32-bit). PXI-4110 Power Supply and PXI-4070 DMM.
    In MAX I can open both soft panels and control both units and they work great.  In LabVIEW I can control both cards as well. 
    In MAX I have set up a driver and logical name for my DMM. This unit works great within TestStand using an IVI DMM step.
    I then proceeded to set up the 4110 in MAX with an IVI driver and logical name. Here are my settings:
    Name: ni4410_PS
    Hardware: Added hardware asset and select the PS. This one is checked, no other assets are checked.
    Software Module: NI-DCPower, 4110 is listed as a device that is supported.
    Virtual Names: This is where I am confused. Under physical name there are four options that come up (0, 1, 2, and 3). This power supply only has 3 outputs, so I am unsure why four come up. I have made 4 virtual names, one for each of the options. I named them ch0, ch1, ch2, and ch3 respectively.
    When I put an IVI Power Supply step in TestStand everything seems to be working. I open the configuration window and set my values for each channel. If I try to validate the setup by unchecking simulate and clicking on init, I do not get an error. As soon as I click on 'Configure' or 'Show Soft Front Panel' I get the following error:
    "The IVI Configure operation failed for logical name 'NI PS 1'. Details: Extension capability not supported by instrument driver. (Base) (-30717)"
    Any information would be appreciated. I tried playing with it for a couple of hours yesterday and had a couple of co-workers try to help. We are all under the assumption that this should be working.
    Thank You!!
    Jesse

    Hi jesserzamora,
    Have you seen this link: http://digital.ni.com/public.nsf/allkb/331F2717DBCD1F858625758200745773?OpenDocument
    It discusses a similar failure with the IVI Power Supply step in TestStand. 
    Julia P.

  • NI DC Soft Front Panel, minor bug with PXI-4110

    Hi
    In the NI DC Soft Front Panel V14.0, with the PXI-4110, the negative voltage scroll works as expected down to -10 V, but then rolls over to 0. If one changes in -1 V steps, it goes ... -8, -9, -10, -1, -2 ... instead of -8, -9, -10, -11, -12 ...
    While we're at it, a simple thing I miss is an on/off button for all three voltages.
    (Also, imo, it would be logical to get negative voltages with the down arrow, not up).
    My 2C

    Hello Janaf,
    I fully agree with both of your statements. I have filed a Corrective Action Request (CAR), which you can monitor in future releases of NI-DCPower to see if this is fixed in the SFP. CAR number: 512257
    I added notes that only manual entry of numbers less than -10 works, and that it is not logical to use the increment (up) arrow to decrease the voltage output.
    Best Regards
    Jonas Mäki
    Applications Engineering
    National Instruments

  • Firewire (400) Current limit

    I want to connect an external 2.5" drive to my FireWire port, and would love not to have to use a power supply. Does anybody know what the current limit of the PowerBook's FireWire 400 port is? I know that it's 500 mA for USB, but I can't find any info on FireWire.

    7 watts, according to the developer notes. That's at 12.8 V (no load), so roughly 550 mA.
    About the same as USB 2.0 then.

  • PXI-4110 Long-Term Fuse Reliability Recall

    I hope someone here can provide the info I am looking for. This past October I had a PXI-4110 Triple Output Supply go belly up. It would not pass self-test at all and appeared to have no communication. After a $530 RMA repair, the board was fine. Today, I have 3 more of the exact same card (all purchased at the same time and only a short time out of warranty). All show the same error as the one in October. All failed in one day (today) with error number -200175. Speaking to NI, I find out that it is most likely a fuse and it needs to be sent in on an RMA, again at $530 a pop, to replace a fuse. A fuse that is apparently at the heart of a "Long-Term Fuse Reliability Recall" that specifically affects the PXI-4110.
    Now, at this point it is not even the money, (though it angers me to no end knowing I paid $530 to fix a product with a known defect and that was under recall, but was not informed of it), but the fact that losing the cards for the 10 business days will effectively cost over $50K in lost shippable product.
    We have Certified soldering techs who work under a microscope replacing surface mount components all day. We can replace the fuse. My understanding is that the fuse gets "tired" and is replaced with a different one.
    I need to know 2 things:
    What is the original value of F9 on this card?
    What is it replaced with?
    I find it hard to believe that NI would not give me the info I need to make a quick repair. We did not receive any recall letter either. Can anyone help?

    Hi Franco,
    I believe that the situation regarding being charged for your RMA is being resolved, as your board was covered under the recall notification that was sent out. The notification was sent to the company we have on record as having purchased the board. The RMA charge would not apply for a unit under recall.
    The recommended method for repairing the boards is to send them in to NI for repair. We cannot authorize the modification of our boards, and it would void the warranty on the product. When the F9 fuse blew, it is possible that other components were damaged, and the act of replacing the fuse would likely change the calibration of the device. When sent back to NI for repair, the device undergoes testing and calibration to ensure it meets all specifications.
    We recommend that affected 4110's be sent in for a preventative repair.
    Steve B
