PXI-4110 niDCPower_init

I am just starting to write a program for my PXI-4110 Programmable DC Power Supply.
I am using the function niDCPower_init(). When I read the documentation about this function it made this statement:
"This function is deprecated. Use niDCPower_InitializeWithChannels instead."
My fp file (nidcpower.fp) does not have this function. Do I need a newer fp file? Is this function niDCPower_init() only a poor choice or is there something wrong with it?
Deprecated was an odd word to use to describe this function. It implies that the function is disapproved of but maybe it works. What is the opinion of others?
I am going to try it out but would like to know if this is the function I want to use.
thanks in advance

DPearce,
Thank you for using the NI forums! Deprecated functions belong to the older NI-DCPower programming model; they still work, but new code should use the newer, channel-based model.
niDCPower_init is obsolete. If your nidcpower.fp file does not contain niDCPower_InitializeWithChannels, I would recommend downloading the most current version of our driver. You can download the driver here. You can find more information on this function here. The VI version is here (I tend to find the VI version a little more reader friendly!).
Also, I would check out the shipping examples that come with this driver. That will help you see the various functions in action and assist you in determining which one you want to use. 
Let me know if you have any other questions!
Katie
Katie Collette
National Instruments

Similar Messages

  • Triggering PXI-4110 to measure 1 current value while HSDIO PXI-6552 generating waveform

    Hi,
    Some questions about using the PXI-4110 to measure current while the PXI-6552 is generating a waveform.
    1. Let's say I need to measure 3 points of current values, i.e. while the PXI-6552 is generating sample 1000, 2000 and 3500. On the edge of samples 1000, 2000 and 3500, the PXI-6552 will send a pulse via a PFI line or via a PXI backplane trigger line. My question is: is it possible to trigger the PXI-4110 (hardware trigger or software trigger) to measure current values at these points?
    2. Let's say I need to measure the current at 0 ms (start of waveform generation by the PXI-6552), 1 ms, 2 ms, 3 ms, 4 ms... and so on for 1000 points of measurement, with the code diagram as shown in the figure below. Is it possible for the VI "niDCPower Measure Multiple" to measure exactly at 1 ms, 2 ms, 3 ms...? How much time does it take to acquire one measurement point with "niDCPower Measure Multiple"?
    Thanks for viewing this post. Your advice on hardware used or software method is much appreciated. Thanks in advance.  
    Message Edited by engwei on 02-02-2009 04:24 AM
    Attachments:
    [email protected] 46 KB

    Hi engwei,
    1. Unfortunately, the 4110 does not support hardware triggering. Therefore you cannot implement direct triggering through the backplane or anything like that. However, there are a couple of possible workarounds you can try:
    a) Use software triggering: Say your 6552 is generating in one while loop, and your 4110 is to measure in another while loop. You can use a software synchronization method like notifiers to send a notification to your 4110 loop when your 6552 has generated the desired sample. This method, however, will not be very deterministic, because the delay between the trigger and the response depends on your processor speed and load. Therefore, if you have other applications running in the background (like antivirus software) they will increase the delay.
    b) Use hardware triggering on another device: If you have another device that supports hardware triggering (maybe an M-series multifunction DAQ module), you can configure this device to be triggered by a signal from the 6552, perform a very quick task (like a very short finite acquisition), then immediately execute the DCPower VI to perform the measurement. The trigger can be configured to be re-triggerable for repeated use. This will most likely have a smaller time delay than the first option, but there will still be a delay (the time it takes to perform the short finite acquisition on the M-series). Please refer to the attached screenshot for an idea of how to implement this.
    2. To make your 4110 measure at specific time intervals, you can use one of the methods discussed above. As for how long it will take to acquire 1 measurement point, you may find this link helpful: http://zone.ni.com/devzone/cda/tut/p/id/7034
    This article is meant for the PXI-4130 but the 4110 has the same maximum sampling rate (3 kHz) and so the section discussing the speed should apply for both devices.
    Under the Software Measurement Rate section, it is stated that the default behavior of the VI is to take an average of 10 samples. This corresponds to a maximum sampling rate of 300 samples/second. However, if you configure it to not do averaging (take only 1 sample) then the maximum rate of 3000 samples/second can be achieved.
    It is also important to note that your program can only achieve this maximum sampling rate if your software loop takes less time to execute than the actual physical sampling. For example, if you want to sample at 3000 samples/second, taking one sample takes 1/3000 seconds, or 333 microseconds. If your software execution time is less than 333 microseconds, then you can achieve this maximum rate (because the speed is limited by the hardware, not the software). However, if your software takes more than 333 microseconds to execute, then the software loop time will define the maximum sampling rate you can get, which will be lower than 3000 samples/second.
    I hope this answers your question.
    Best regards,
    Vern Yew
    Applications Engineer, NI ASEAN
    Attachments:
    untitled.JPG 18 KB

  • What is the proper way to close all open sessions of a NI PXI-4110 for a given Device alias?

    I've found that, when programming the NI PXI-4110, if the "niDCPower Initialize With Channels" VI (NI-DCPower palette) is called with a device
    alias that already has one or more sessions open (due to an abort or other programming error), the reference out returns a device reference post-fixed with "(n)", where n is an integer that increments with each initialize call. In my cleanup, I would like to close all open sessions. For example, say the device alias is "NIPower_1" in NI MAX, and there are 5 open sessions: NIPower_1, NIPower_1 (1), NIPower_1 (2), NIPower_1 (3), and NIPower_1 (4). A simple initialize or reset (using the niDCPower Initialize With Channels VI, etc.) does not clear them. What is the proper way to close all open sessions?
    Thanks in advance. Been struggling with this for days!

    When you Initialize a session to a device that already has a session open, NI-DCPower closes the previous session and returns a new one. You can verify this very easily: try to use the first session after the second session was opened.
    Unfortunately, there is a small leak and that is what you encountered: the previous session remains registered with LabVIEW, since we unregister inside the Close VI and this was never called. So the name of the session still shows in the control like you noted: NIPower_1, NIPower_1 (1), NIPower_1 (2), NIPower_1 (3), and NIPower_1 (4), etc.
    There may be a way to iterate over the registered sessions, but I couldn't find it. However, you can unregister them by calling "IVI Delete Session". Look for it inside "niDCPower Close.vi". If you don't have the list of open sessions, but you have the device name, then you can just append (1), (2) and so forth and call "IVI Delete Session" in a loop. There's no problem calling it on sessions that were never added.
    However - I consider all this a hack. What you should do is write code that does not leak sessions. Anything you open, you should close. If you find yourself in a situation where there are a lot of leaked sessions during development, relaunching LabVIEW will clear it out. If relaunching LabVIEW is too much of an annoyance, then write a VI that does what I described above and run it when needed. You can even make it "smarter" by getting the names of all the NI-DCPower devices in your system using the System Configuration or niModInst APIs.
    Hope this helps.
    Marcos Kirsch
    Principal Software Engineer
    Core Modular Instruments Software
    National Instruments

  • Measuring sub 125 microamps using PXI 4110 - Is it possible?

    I have a question concerning the PXI 4110. We are trying to use this card to both provide voltage to a device and, at the same time, monitor the real-time current being supplied. What we have found is that it is capable of providing voltage from 0 to 6 volts, but we have not been able to read below ~500 microamps. The data sheet says it has a resolution of 0.01 mA, or 10 microamps, but we have not been able to get the card to read consistently under 500 microamps.
    Is there a minimum current draw required before the 10 microamp resolution applies?
    Here is what we are trying to monitor using this card:
    Voltage supplied between 3.5 to 6 volts.
    When the device is asleep, current draw is between 50 to 100 microamps.
    When the device is awake, current draw can be as high as 100 milliamps.
    Will this card work? Is there a better solution to this?
    Thanks

    Hi John,
    Thanks for the additional information. Based on your sampling rate you should be fine with a software- or hardware-timed device. What you will need to determine is how accurately you need to measure your sleep current. The issue is that you do not know which mode your DUT will be in, so you cannot change your current range accordingly. At this time our power supplies have output and measure ranges that are coupled together. This means that when you change to a lower range, your output current will be limited to the max of that range, which can be an issue when the DUT 'wakes up'.
    I would recommend selecting a device that can remain in the higher current range (100 mA or greater) and has the measurement capability that meets your needs. You first need to determine how accurately you must measure the sleep current, i.e. within +/- X uA. When selecting the device you will then want to look at its measurement accuracy on the range you will use. The calculation is +/- (Measurement * Gain Error + Offset Error). You will find that the largest portion of the error will be offset, due to the range you will be using. You also need to take into account the resolution of the instrument, because that is the smallest possible change you can measure.
    One comment on offset error is that for a given test setup/temperature/humidity/etc and test voltage it will stay fairly constant. This means that you can characterize the offset of your system if all of those factors remain constant. I would recommend that you would set up your test, including all fixturing/cabling excluding the DUT. You can set the supply to your test voltage and measure the current. In this setup the ideal current would be 0uA because it is an 'open', but due to leakages in the system there will be an offset. You can take this reading as a baseline 'zero' and subtract it from future readings to improve your measurements. You will want to be careful of Dielectric Absorption (DA) because it can mislead you when making measurements like this, but it is less of an issue when talking about uA and more of an issue when measuring pA. It would be a good idea to repeat this characterization periodically to ensure that your measurements are accurate, ideally once per DUT, but you can scale that back as necessary.
    I hope this is helpful. It is a good idea to evaluate the hardware in your test setup to ensure that the measurements meet your needs. I would also add the PXI-4132 to your list of options to consider for its 100mA range. I think that these other devices would be better than the PXI-4110 in your application because of the low current measurements you need. If you can use the additional channels the PXIe-4140/4141 are good options, if not the PXI-4132 would be a good option. You should also consider the different connectors for PXI vs PXIe and what will work for your chassis.  
    Steve B

  • IVI Configuration with PXI-4110 in TestStand

    Hello All,
    Set-Up:
    PXI-1033 connected through MXI to a PC running Windows 7. LabVIEW 2014. TestStand 2014 (32-bit). PXI-4110 Power Supply and PXI-4070 DMM.
    In MAX I can open both soft panels and control both units and they work great.  In LabVIEW I can control both cards as well. 
    In MAX I have set up a driver and logical name for my DMM. This unit works great within TestStand using an IVI DMM step.
    I then proceeded to setup the 4110 in MAX with an IVI driver and logical name. Here are my settings:
    Name: ni4410_PS
    Hardware: Added a hardware asset and selected the PS. This one is checked; no other assets are checked.
    Software Module: NI-DCPower, 4110 is listed as a device that is supported.
    Virtual Names: This is where I am confused, Under physical name there are four options that come up (0, 1, 2, and 3). This power supply only has 3 outputs so I am unsure why four come up. I have made 4 virtual names, one for each of the options. I named them ch0, ch1, ch2, and ch3 respectively.
    When I put an IVI Power Supply step in TestStand everything seems to be working. I open the configuration window and set my values for each channel. If I try to validate the setup by unchecking simulate and clicking on init, I do not get an error. As soon as I click on 'Configure' or 'Show Soft Front Panel' I get the following error:
    "The IVI Configure operation failed for logical name 'NI PS 1'. Details: Extension capability not supported by instrument driver. (Base) (-30717)"
    Any information would be appreciated. I tried playing with it for a couple hours yesterday and had a couple of co-workers try to help. We are all under the assumption that this should be working.
    Thank You!!
    Jesse
    Solved!
    Go to Solution.

    Hi jesserzamora,
    Have you seen this link: http://digital.ni.com/public.nsf/allkb/331F2717DBCD1F858625758200745773?OpenDocument
    It discusses a similar failure with the IVI Power Supply step in TestStand. 
    Julia P.

  • PXI-4110 and the DC Power Express VI

    hi everyone,
    I'm really new to LabVIEW and I have to use a PXI-4110 to supply a DC voltage with a square wave (or whatever shape I choose).
    I tried using the "DC Power Express VI" but I can't understand how to connect the generated signal. Right now I have the wave, and I put in a block which transforms the wave from dynamic data to an array... but it doesn't work!
    I tried watching the examples, but none of them uses the Power Express VI...
    thanks for any help!!

    As bronzacuerta mentioned, the PXI-4110 is not intended to generate a square waveform, or any specific waveform. It is intended to regulate a constant voltage or current, adjusting for changing loads. By changing the output using the NI-DCPower API, you may be able to approximate a square waveform, but it will not be a very good one, both in terms of rise/fall times and due to software-timed output updates.
    Instead of that, a potentially better option (depending on your speed, flexibility, and accuracy requirements) is a multifunction DAQ card or a signal generator.
    Tobias
    Staff Software Engineer
    Modular Instruments
    National Instruments

  • NI DC Power Soft Front Panel, minor bug with PXI-4110

    Hi
    In the NI DC Power Soft Front Panel V14.0, with the PXI-4110, the negative voltage scroll works as expected down to -10 V, but then rolls over to 0. If one changes in -1 V steps, it goes... -8, -9, -10, -1, -2... instead of -8, -9, -10, -11, -12...
    While at it, a simple thing I miss is an on/off button for all three voltages.
    (Also, IMO, it would be logical to get negative voltages with the down arrow, not up.)
    My 2C

    Hello Janaf,
    I fully agree with both of your statements. I have filed a Corrective Action Report, so you can check upcoming releases of NI-DCPower to see whether this is fixed in the SFP. CAR number: 512257.
    I added notes that only manual entry of numbers less than -10 works, and that it is not logical to use the increment (up) arrow to decrease the voltage output.
    Best Regards
    Jonas Mäki
    Applications Engineering
    National Instruments

  • PXI-4110 current limit

    I am programming a PXI-4110 with LabWindows. I am trying out the current limiting. I have an output set to 5V with a 1K load resistor; this makes the current 5mA. I set the current limit to 2mA to see it work. When I read the voltage and current it is still 5V and 5mA. Here is my code:
    g_voltageLevel_ch0 = 5.0;
    g_currenLimit_ch0= 2.0e-3;
     status = niDCPower_ConfigureOutputFunction(vi_4110_0, channel0Name, NIDCPOWER_VAL_DC_VOLTAGE);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
     status = niDCPower_ConfigureSense(vi_4110_0, channel0Name, NIDCPOWER_VAL_LOCAL);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
     status = niDCPower_ConfigureVoltageLevel(vi_4110_0, channel0Name, g_voltageLevel_ch0);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
     status = niDCPower_ConfigureCurrentLimit(vi_4110_0, channel0Name, NIDCPOWER_VAL_CURRENT_REGULATE, g_currenLimit_ch0);
     niDCPower_error_message(vi_4110_0, status, errorMessage);

    I'm getting an error now. I have a 1K resistor on each of the outputs so I will see some current.
    Look for the comment lines below to see the notes I am sending you about the code.
    When I start my program I call a function PXI4110_HardwareInit() to get things set up. This part seems to be OK.
    /************************************** PXI-4110 ****************************************/
    // PXI-4110 hardware definitions for initialization
    static ViSession vi_4110_0 = VI_NULL, vi_4110_1 = VI_NULL, vi_4110_2 = VI_NULL;
    /* The channel names are length 2 because the maximum size of the string
    from each textbox on the UI is 1 character. */
    static ViChar channel0Name[2] = "0";  // 0V to 6V channel
    static ViChar channel1Name[2] = "1";  // 0V to 20V channel
    static ViChar channel2Name[2] = "2";  // 0V to -20V channel
    // inputs
    ViReal64 g_voltageLevel_ch0 = 5.0;
    ViReal64 g_voltageLevel_ch1 = +13.0;
    ViReal64 g_voltageLevel_ch2 = -13.0;
    ViReal64 g_currenLimit_ch0 = 500.0e-3;
    ViReal64 g_currenLimit_ch1 = 500.0e-3;
    ViReal64 g_currenLimit_ch2 = 500.0e-3;
    void PXI4110_HardwareInit(void)
    {
     ViStatus status;
     ViChar errorMessage[256];
     ViChar resourceDCSupplyName[256] = "PXI-4110";
     // set channel 0 to +5.0V
     status = niDCPower_InitializeWithChannels(resourceDCSupplyName, channel0Name, VI_TRUE, VI_NULL, &vi_4110_0);
     if (status < 0)
      niDCPower_error_message(vi_4110_0, status, errorMessage);
     status = niDCPower_ConfigureOutputFunction(vi_4110_0, channel0Name, NIDCPOWER_VAL_DC_VOLTAGE);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
     status = niDCPower_ConfigureSense(vi_4110_0, channel0Name, NIDCPOWER_VAL_LOCAL);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
     status = niDCPower_ConfigureCurrentLimitRange(vi_4110_0, channel0Name, 1.0);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
     Set_PXI_4110_Outputs();
     status = niDCPower_ConfigureOutputEnabled(vi_4110_0, channel0Name, VI_FALSE);
     status = niDCPower_Initiate(vi_4110_0);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
    }
    static void Set_PXI_4110_Outputs(void)
    {
     ViStatus status;
     ViChar errorMessage[256];
     // channel 0
     status = niDCPower_ConfigureVoltageLevel(vi_4110_0, channel0Name, g_voltageLevel_ch0);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
     status = niDCPower_ConfigureCurrentLimit(vi_4110_0, channel0Name, NIDCPOWER_VAL_CURRENT_REGULATE, g_currenLimit_ch0);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
    }
    Then I send a message over Ethernet to enable the outputs. When I enable the outputs, this function reads the voltage and current outputs and sends this information to an application on my host computer. This information corresponds to what I set and what I see with a volt meter.
    void PXI_4110_enable_outputs(void)
    {
     ViStatus status;
     ViReal64 measuredVoltage, measuredCurrent;
     ViBoolean inCompliance;
     ViChar errorMessage[256];
     // set the outputs
     // Set_PXI_4110_Outputs();
     status = niDCPower_ConfigureOutputEnabled(vi_4110_0, channel0Name, VI_TRUE);
     /* Wait for the outputs to settle. */
     Delay(50e-3);
     // check channel 0
     status = niDCPower_MeasureMultiple(vi_4110_0, channel0Name, &measuredVoltage, &measuredCurrent);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
     niDCPower_QueryInCompliance(vi_4110_0, channel0Name, &inCompliance);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
    }
    Now I send a message to change the current limit. This is where I have trouble. I try to limit the current on channel 0 to 2 mA. Since the output voltage is 5 volts and the resistor is 1K, the current will exceed the current limit, which is what I am trying to test. I get an error in this function:
       g_currenLimit_ch0 = CommandData->data; 
       status = niDCPower_ConfigureCurrentLimit(vi_4110_0, channel0Name, NIDCPOWER_VAL_CURRENT_REGULATE, g_currenLimit_ch0);
       niDCPower_error_message(vi_4110_0, status, errorMessage);
    I see g_currenLimit_ch0 = 2e-3, which is what I sent, but my status is a negative number. The message is "Invalid value for parameter or property".
    vi_4110_0 is the handle I got earlier,
    channel0Name = "0",
    and g_currenLimit_ch0 = 2e-3.
    I do not understand why I am seeing an error here.
    thanks in advance for your help,
    Don

  • PXI-4110 Cards failing since upgrading to Calibration Executive 3.4

    Since we upgraded to Calibration Executive 3.4 all of our PXI-4110 Cards have been failing the following:
    Calibration, As Found:
    Channel | As Found DMM Reading | As Left DMM Reading | Test Value | Low Limit | Reading   | High Limit | Pass/Fail
    2       | 0.999942 A           | 0.999942 A          | 0.25000 A  | 0.99557 A | 0.94065 A | 1.00432 A  | Failed
    2       | 0.999991 A           | 0.999991 A          | 0.50000 A  | 0.99524 A | 0.93736 A | 1.00474 A  | Failed
    2       | 0.999761 A           | 0.999761 A          | 0.75000 A  | 0.99370 A | 0.94076 A | 1.00582 A  | Failed
    This failure occurred on 3 brand-new cards. I have even tried the last PXI-4110 calibrated before the upgrade, and it also failed these tests. When an adjustment is attempted I receive the following error:
    Error -300010 occurred at Get UserField Data.vi
    Complete call chain:
         cex_UtilityGetUserFieldData.vi
         cex_UtilityGetUserFieldDataByIndex.vi
         _NI 4110_Get Channel Range and Level.vi
         Adjust_NI 4110_Current Out Accuracy.vi
    at step Adjust Current Output Accuracy
    I am using an Agilent 34401 for the DMM.

    JVP
    Here are the files you wanted. Sorry this took so long. The hard drive on the computer that has the Calibration Executive software died, and we had to reinstall everything. I tried to run another card, with the same results; the calibration report I am sending you is from today. I sent the report in two formats, XLS and PDF.
    Attachments:
    ni_support.zip 197 KB
    71517 Wednesday, January 11, 2012 8-28-29.xls 49 KB
    71517 Wednesday, January 11, 2012 8-28-29.pdf 30 KB

  • Measure power/current being delivered by PXI-4110 on a specific channel?

    I have a PXI-1033 chassis and inside it there's a PXI-4110. How do I use LabVIEW 8.2.1 to determine the power/current consumption on a specific output channel?


  • PXI-4110 Long-Term Fuse Reliability Recall

    I hope someone here can provide the info I am looking for. This past October I had a PXI-4110 Triple Output Supply go belly up. It would not pass self-test at all, and appeared to have no communication. After a $530 RMA repair, the board was fine. Today, I have 3 more of the exact same card (all purchased at the same time and only a short time out of warranty). All show the same error as the one in October. All failed in one day (today) with error number -200175. Speaking to NI, I found out that it is most likely a fuse and it needs to be sent in on an RMA. Again at $530 a pop. To replace a fuse. A fuse that is apparently at the heart of a "Long-Term Fuse Reliability Recall" that specifically affects the PXI-4110.
    Now, at this point it is not even the money, (though it angers me to no end knowing I paid $530 to fix a product with a known defect and that was under recall, but was not informed of it), but the fact that losing the cards for the 10 business days will effectively cost over $50K in lost shippable product.
    We have Certified soldering techs who work under a microscope replacing surface mount components all day. We can replace the fuse. My understanding is that the fuse gets "tired" and is replaced with a different one.
    I need to know 2 things:
    What is the original value of F9 on this card?
    What is it replaced with?
    I find it hard to believe that NI would not give me the info I need to make a quick repair. We did not receive any recall letter either. Can anyone help???

    Hi Franco,
    I believe that the situation regarding the RMA charge is being resolved, as your board was covered under the recall notification that was sent out. The notification was sent to the company we have on record as having purchased the board. The RMA charge does not apply for a unit under recall.
    The recommended method for repairing the boards is to send them in to NI for repair. We cannot authorize the modification of our boards, and doing so would void the warranty on the product. When the F9 fuse blew, it is possible that other components were damaged, and replacing the fuse would likely change the calibration of the device. When sent back to NI for repair, the device undergoes testing and calibration to ensure it meets all specifications.
    We recommend that affected 4110's be sent in for a preventative repair.
    Steve B

  • PXI-4110 question

    Hello
    I have a question about the DC power supply (PXI-4110): is there any way to use it from -V to +V (e.g. -4V ... +4V)?
    If yes, how?
    Thank you in advance
    Toni

    There are three channels: ch0, ch1, and ch2. The second channel, ch1, is capable of outputting up to +20 V, and the third channel, ch2, is capable of outputting down to -20 V. So you can use the last two channels, ch1 and ch2, for +/-4 V. When you program ch2, you must put in -4 as the value; you cannot put in a positive value. When wiring, however, the Lo (-) terminal of ch1 must be wired to the Hi (+) terminal of ch2, and this is your ground reference. The Hi terminal of ch1 will be +4 V and the Lo terminal of ch2 will be -4 V.
    - tbob
    Inventor of the WORM Global

  • PXIe-8101 Unable To Compile Code

    Hello,
    My system is a PXIe-1065 chassis (I've also had the same issue in the PXIe-1082 chassis), PXIe-8101 controller with Windows 7 installed by NI, and the following instruments:
    DMM PXI-4071, Power Supply PXI-4110, DAQ PXIe-6356, Timing Card PXIe-6674T, and Motion Control PXI-7332.
    I also have a PXIe-8361 which I will use on occasion and will be what I use in the long term.
    Now for the actual problem. What happens is I'll put together some code in LabVIEW using the instrument driver VIs and everything will work fine. I can make VIs that call subVIs that call the instrument VIs and it works great. But if I double-click on an instrument VI and then try to look at the block diagram, for example, opening "DAQmx Create Channel (AI-Voltage-Basic).vi", it seems to break the code. Initially it won't say anything, but once I hit the run button the arrow will change to the broken gray arrow and state that the code failed to compile (this is without actually changing anything in the VI; I open it and look at what it is doing).
    Also any code I've written that called that VI will now get the same error, if I click show error it will bring up whatever window I was looking at last, so if I was looking at the front panel, then clicked show error it would just bring up the front panel again without highlighting anything. If I create a new file and build a small vi that uses the instrument drivers it works fine, if I copy the vi's from the broken code and paste it into the new file it will fail with the same error.
    It doesn't have this behavior with all the instrument VI's but will happen with some. I mentioned it happens with "DAQmx Create Channel (AI-Voltage-Basic).vi" but then it doesn't happen with "DAQmx Timing(Sample Clock).vi". I haven't gone through and checked exactly which VI's it happens with but it seems to happen with other instruments drivers as well, not just for the DAQ. It also only seems to happen when using the PXIe-8101. If I connect it to a computer this issue doesn't pop up. My company primarily controls the chassis using MXI cards so we don't have any available embedded controllers I could swap in to see if it still happens.
    It's not a big issue as looking into the instrument vi's isn't necessary for me to do my job, I just do it to try to better understand what exactly is happening but it is something I'm curious about.
    DISCLAIMER: I'm a hardware engineer so I may not be familiar with some concepts/terminology/best practices so... patience is appreciated

    At the time I first noticed this error I was just starting to use LabVIEW, so I simply avoided opening that VI; now I've been working with LabVIEW for about 9 months. I haven't been working with it consistently, as not all my projects require it, but I would say I've spent at least half my time working with it.
    Well, I seem to have recorded it slightly off. When I looked into the DAQmx VI it gave an error, and when I clicked show details it gave the following:
    When I tried to exit it asked me if I wanted to save the changes; I chose defer decision and eventually chose do not save. I closed LabVIEW and reopened the VI, tried to run it, and it gave the error:
    I selected show error and it gave:
    Then where it said it failed to compile was within another VI in the same project. When I first opened it, it said SubVI is not executable. When I clicked show error it took me to a VI that had the DAQmx VI in it, and it was saying the VI failed to compile.
    I've attached the SubVI from which I opened the DAQmx. It's a basic vi to read analog data. Is that what you meant for screenshots?
    Attachments:
    DAQaread.vi 27 KB

  • PXI-4071 constant current

    We use a 4110 voltage source for current measurements on a PXI-4071. We supply a high-ohm (1.5 gigaohm) resistor with a few volts and measure the current on the 4071. All cables are shielded to the ground of the 4110 voltage source, but the cables are about 5 meters long. What can be the reason for a constant current of 5 nA at zero supply voltage? We don't know whether it is a software or hardware problem, and I already asked this question in the LabVIEW forum; enabling autozero in MAX doesn't help. Could bad shielding of the ground in the PXI-4110 voltage source be a reason?

    I would focus on the hardware side initially. I would try the following steps.
    1) Make a current measurement on the 4071 with the inputs open to see if there is an offset. If so, we should look into the DMM configuration.
    2) Connect your cabling and/or resistor without the 4110. If this adds the offset then it seems to be shielding/noise related.
    3) Connect the cabling and/or resistor to the 4110 with the output disabled. If this adds the offset then it could be ground related or also related to an offset in the 4110 output.
    Steve B

  • PXI cards not detected

    I'm using SignalExpress with a NI PXIe-1062Q chassis and the following cards:
    PXIe-8360
    PXI-4071 DMM
    PXIe-6556 waveform generator/analyser
    PXI-4110
    All cards are detected by NI MAX. However, when I create a waveform in SignalExpress, it's unable to see my hardware; it can only see the 4071.
    I'm following the correct startup procedure of booting the chassis first, then booting my PC.
    Any suggestions?

    Hello AidanFrost,
    The reason for this could be that you do not have the correct drivers installed. Which drivers do you have on your computer at the moment?
