Triggering PXI-4110 to measure 1 current value while HSDIO PXI-6552 generates a waveform

Hi,
I have some questions about using the PXI-4110 to measure current while the PXI-6552 is generating a waveform.
1. Let's say I need to measure 3 points of current values, i.e. while the PXI-6552 is generating sample 1000, 2000 and 3500. On the edge of samples 1000, 2000 and 3500, the PXI-6552 will send a pulse via a PFI line or via a PXI backplane trigger line. My question is: is it possible to trigger the PXI-4110 (hardware trigger or software trigger) to measure current values at these points?
2. Let's say I need to measure the current at 0 ms (start of waveform generation by the PXI-6552), 1 ms, 2 ms, 3 ms, 4 ms... and so on, for 1000 points of measurement; the code diagram is shown in the figure below. Is it possible for the VI "niDCPower Measure Multiple" to measure exactly at 1 ms, 2 ms, 3 ms...? How much time does "niDCPower Measure Multiple" need to acquire one point of measurement?
Thanks for viewing this post. Your advice on hardware or software methods is much appreciated. Thanks in advance.
Message Edited by engwei on 02-02-2009 04:24 AM
Attachments:
[email protected] ‏46 KB

Hi engwei,
1. Unfortunately, the 4110 does not support hardware triggering. Therefore you cannot implement direct triggering through the backplane or anything like that. However, there are a couple of possible workarounds you can try:
a) Use software triggering: Say your 6552 is generating in one while loop, and your 4110 is to measure in another while loop. You can use a software synchronization method like notifiers to send a notification to your 4110 loop when your 6552 has generated the desired sample. This method, however, will not be very deterministic, because the delay between the trigger and the response depends on your processor speed and load. Therefore, if you have other applications running in the background (like antivirus), the delay will increase.
b) Use hardware triggering on another device: If you have another device that supports hardware triggering (like an M-series multifunction DAQ module), you can configure this device to be triggered by a signal from the 6552, perform a very quick task (like a very short finite acquisition), then immediately execute the DCPower VI to perform the measurement. The trigger can be configured to be re-triggerable for multiple use. This will most likely have a smaller time delay than the first option, but there will still be a delay (the time it takes to perform the short finite acquisition on the M-series). Please refer to the attached screenshot for an idea of how to implement this.
2. To make your 4110 measure at specific time intervals, you can use one of the methods discussed above. As for how long it takes to acquire one measurement point, you may find this link helpful: http://zone.ni.com/devzone/cda/tut/p/id/7034
This article is written for the PXI-4130, but the 4110 has the same maximum sampling rate (3 kHz), so the section discussing speed applies to both devices.
Under the Software Measurement Rate section, it is stated that the default behavior of the VI is to take an average of 10 samples. This corresponds to an effective maximum of 300 readings/second. However, if you configure it not to average (take only 1 sample), the maximum rate of 3000 samples/second can be achieved.
It is also important to note that your program can only achieve this maximum sampling rate if your software loop takes less time to execute than the actual physical sampling. For example, if you want to sample at 3000 samples/second, taking one sample takes 1/3000 seconds, or 333 microseconds. If your software execution time is less than 333 microseconds, you can achieve this maximum rate (because the speed is limited by the hardware, not the software). However, if your software takes more than 333 microseconds to execute, the software loop time will define the maximum sampling rate you can get, which will be lower than 3000 samples/second.
I hope this answers your question.
Best regards,
Vern Yew
Applications Engineer, NI ASEAN
Attachments:
untitled.JPG ‏18 KB

Similar Messages

  • Measure power/current being delivered by PXI-4110 on a specific channel?

    I have a PXI-1033 chassis and inside it there's a PXI-4110. How do I use LabVIEW 8.2.1 to determine the power/current consumption on a specific output channel?


  • Measuring sub 125 microamps using PXI 4110 - Is it possible?

    I have a question concerning the PXI-4110. We are trying to use this card to provide voltage to a device while at the same time using it to monitor the real-time current being supplied. What we have found is that it is capable of providing voltage from 0 to 6 volts, but we have not been able to read below ~500 microamps. The data sheet says it has a resolution of 0.01 mA, or 10 microamps, but we have not been able to get the card to read consistently under 500 microamps.
    Is there a minimum current draw required before the 10 microamp resolution applies?
    Here is what we are trying to monitor using this card:
    Voltage supplied between 3.5 to 6 volts.
    When the device is asleep, the current draw is between 50 and 100 microamps.
    When the device is awake, the current draw can be as high as 100 milliamps.
    Will this card work?  Is there a better solution to this? 
    Thanks

    Hi John,
    Thanks for the additional information. Based on your sampling rate you should be fine with a software or hardware timed device. The answer that you will need to determine is how accurately you need to measure your sleep current. The issue is that you do not know the mode that your DUT will be in to change your current range. At this time our power supplies have output and measure ranges that are coupled together. This means that when you change to a lower range your output current will be limited to the max of that range, which can be an issue when the DUT 'wakes up'.
    I would recommend selecting a device that can remain in the higher current range (100 mA or greater) and that has the measurement capability that meets your needs. You would need to decide that you need to measure the sleep current within +/- X uA. When selecting the device you will then want to look at its measurement accuracy on the range you will use. The calculation is +/- (Measurement * Gain Error + Offset Error). You will find that the largest portion of the error will be offset, due to the range you will be using. You also need to take into account the resolution of the instrument, because that is the smallest possible change you can measure.
    One comment on offset error is that for a given test setup/temperature/humidity/etc and test voltage it will stay fairly constant. This means that you can characterize the offset of your system if all of those factors remain constant. I would recommend that you would set up your test, including all fixturing/cabling excluding the DUT. You can set the supply to your test voltage and measure the current. In this setup the ideal current would be 0uA because it is an 'open', but due to leakages in the system there will be an offset. You can take this reading as a baseline 'zero' and subtract it from future readings to improve your measurements. You will want to be careful of Dielectric Absorption (DA) because it can mislead you when making measurements like this, but it is less of an issue when talking about uA and more of an issue when measuring pA. It would be a good idea to repeat this characterization periodically to ensure that your measurements are accurate, ideally once per DUT, but you can scale that back as necessary.
    I hope this is helpful. It is a good idea to evaluate the hardware in your test setup to ensure that the measurements meet your needs. I would also add the PXI-4132 to your list of options to consider for its 100mA range. I think that these other devices would be better than the PXI-4110 in your application because of the low current measurements you need. If you can use the additional channels the PXIe-4140/4141 are good options, if not the PXI-4132 would be a good option. You should also consider the different connectors for PXI vs PXIe and what will work for your chassis.  
    Steve B

  • PXI-4110 current limit

    I am programming a PXI-4110 with LabWindows. I am trying out the current limiting. I have an output set to 5V with a 1K load resistor; this makes the current 5mA. I set the current limit to 2mA to see it work. When I read the voltage and current it is still 5V and 5mA. Here is my code:
    g_voltageLevel_ch0 = 5.0;
    g_currenLimit_ch0= 2.0e-3;
     status = niDCPower_ConfigureOutputFunction(vi_4110_0, channel0Name, NIDCPOWER_VAL_DC_VOLTAGE);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
     status = niDCPower_ConfigureSense(vi_4110_0, channel0Name, NIDCPOWER_VAL_LOCAL);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
     status = niDCPower_ConfigureVoltageLevel(vi_4110_0, channel0Name, g_voltageLevel_ch0);
     niDCPower_error_message(vi_4110_0, status, errorMessage);
     status = niDCPower_ConfigureCurrentLimit(vi_4110_0, channel0Name, NIDCPOWER_VAL_CURRENT_REGULATE, g_currenLimit_ch0);
     niDCPower_error_message(vi_4110_0, status, errorMessage);

    I'm getting an error now. I have a 1 K resistor on each of the outputs so I will see some current.
    Look for these lines to see the comments I am sending you about the code.
    When I start my program I call a function PXI4110_HardwareInit() to get things set up. This part seems to look OK.
    /************************************** PXI-4110 ****************************************/
    // PXI-4110 hardware definitions for initialization
    static ViSession vi_4110_0 = VI_NULL, vi_4110_1 = VI_NULL, vi_4110_2 = VI_NULL;
    /* The channel names are length 2 because the maximum size of the strings
    from each textbox on the UI is 1 character. */
    static ViChar channel0Name[2] = "0";  // 0V to 6V channel
    static ViChar channel1Name[2] = "1";  // 0V to 20V channel
    static ViChar channel2Name[2] = "2";  // 0V to -20V channel
    // inputs
    ViReal64 g_voltageLevel_ch0 = 5.0;
    ViReal64 g_voltageLevel_ch1 = +13.0;
    ViReal64 g_voltageLevel_ch2 = -13.0;
    ViReal64 g_currenLimit_ch0 = 500.0e-3;
    ViReal64 g_currenLimit_ch1 = 500.0e-3;
    ViReal64 g_currenLimit_ch2 = 500.0e-3;
    static void Set_PXI_4110_Outputs(void);
    void PXI4110_HardwareInit(void)
    {
        ViStatus status;
        ViChar errorMessage[256];
        ViChar resourceDCSupplyName[256] = "PXI-4110";
        // set channel 0 to +5.0V
        status = niDCPower_InitializeWithChannels(resourceDCSupplyName, channel0Name, VI_TRUE, VI_NULL, &vi_4110_0);
        if (status < 0)
            niDCPower_error_message(vi_4110_0, status, errorMessage);
        status = niDCPower_ConfigureOutputFunction(vi_4110_0, channel0Name, NIDCPOWER_VAL_DC_VOLTAGE);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
        status = niDCPower_ConfigureSense(vi_4110_0, channel0Name, NIDCPOWER_VAL_LOCAL);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
        status = niDCPower_ConfigureCurrentLimitRange(vi_4110_0, channel0Name, 1.0);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
        Set_PXI_4110_Outputs();
        status = niDCPower_ConfigureOutputEnabled(vi_4110_0, channel0Name, VI_FALSE);
        status = niDCPower_Initiate(vi_4110_0);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
    }
    static void Set_PXI_4110_Outputs(void)
    {
        ViStatus status;
        ViChar errorMessage[256];
        // channel 0
        status = niDCPower_ConfigureVoltageLevel(vi_4110_0, channel0Name, g_voltageLevel_ch0);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
        status = niDCPower_ConfigureCurrentLimit(vi_4110_0, channel0Name, NIDCPOWER_VAL_CURRENT_REGULATE, g_currenLimit_ch0);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
    }
    Then I send a message over Ethernet to enable the outputs. When I enable the outputs, this function reads the voltage and current outputs and sends this information to an application on my host computer. This information corresponds to what I set and what I see with a voltmeter.
    void PXI_4110_enable_outputs(void)
    {
        ViStatus status;
        ViReal64 measuredVoltage, measuredCurrent;
        ViBoolean inCompliance;
        ViChar errorMessage[256];
        // set the outputs
        // Set_PXI_4110_Outputs();
        status = niDCPower_ConfigureOutputEnabled(vi_4110_0, channel0Name, VI_TRUE);
        /* Wait for the outputs to settle. */
        Delay(50e-3);
        // check channel 0
        status = niDCPower_MeasureMultiple(vi_4110_0, channel0Name, &measuredVoltage, &measuredCurrent);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
        niDCPower_QueryInCompliance(vi_4110_0, channel0Name, &inCompliance);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
    }
    Now I send a message to change the current limit. This is where I have trouble. I try to limit the current on channel 0 to 2 mA. Since the output voltage is 5 V and the resistor is 1 K, the current will exceed the current limit, which is what I am trying to test. I get an error in this function:
       g_currenLimit_ch0 = CommandData->data; 
       status = niDCPower_ConfigureCurrentLimit(vi_4110_0, channel0Name, NIDCPOWER_VAL_CURRENT_REGULATE, g_currenLimit_ch0);
       niDCPower_error_message(vi_4110_0, status, errorMessage);
    I see g_currenLimit_ch0 = 2e-3, which is what I sent, but my status is a negative number. The message is "Invalid value for parameter or property".
    vi_4110_0 is the handle I got earlier,
    channel0Name = "0",
    and g_currenLimit_ch0 = 2e-3.
    I do not understand why I am seeing an error here.
    thanks in advance for your help,
    Don

  • Problem in getting the current value of the drop down while calling value change listener

    I have 2 drop down lists. I am trying to get the value of the first drop down from the other drop down's value change listener. Initially one drop down contains a default value. The first time, I got the value when the value change listener was called. But if I change the default value to another in the first drop down and then call the value change listener of the second drop down, I get the old value in the bean. Can anyone suggest an approach?

    If I use the following code it gives me the current index.
                valueChangeEvent.getComponent().processUpdates(FacesContext.getCurrentInstance());
                System.out.println(valueChangeEvent.getNewValue());
    This is also giving me current index.
    BindingContainer container = BindingContext.getCurrent().getCurrentBindingsEntry();
    AttributeBinding attrIdBinding = (AttributeBinding)container.getControlBinding("PersonTypeId1");
    if(attrIdBinding.getInputValue()!=null)
                   System.out.println(attrIdBinding.getInputValue().toString());
    But at last I got some help from Shay Shmeltzer's Weblog.
    BindingContainer bindings = BindingContext.getCurrent().getCurrentBindingsEntry();
    // Get the specific list binding
    JUCtrlListBinding listBinding = (JUCtrlListBinding) bindings.get("PersonTypeId1");
    // Get the value which is currently selected
    Object selectedValue = listBinding.getSelectedValue();
    long value = 0L;
    if (selectedValue != null) {
        System.out.println("Sudip.. Person Type using bindings" + selectedValue.toString());
    }
    But this returns "ViewRow [oracle.jbo.Key[300000860721156 ]]", where 300000860721156 is the original value. Would you please help me figure this out?

  • Making Current Values Default While VI is Running

    Is there a way to 'make the current values default' while a VI is running?

    You know, I try to read these posts really quickly because I do this at work (my boss doesn't mind because I learn things too). Anyway, my dyslexic mind interpreted the subject line as "initialize to default values". Sorry. Craig, I think that's twice you've corrected me. Thanks.
    But now I have a question. Why would you want to set the current values to default? So that after stopping and re-starting the program it starts with the last known values? (I know, that's really 2 questions.) The way I do this is to use the simple configuration read and write VIs with an ini file. On program close, write the values. To make sure this is done, take the stop button away from your users and use a front panel quit button that executes the write code before calling LabVIEW exit. Obviously, on program start, read from this file to return your controls to their last known states.
    Hope this helps, sorry for the confusion.
    Jared
    "Craig Graham" wrote:
    >Uh.. perhaps I missed this. Certainly you can set default values via>property
    nodes, but I was under the impression only for a VI that isn't>running or
    locked for running- i.e. in the hierarchy of another VI that is>running.
    Am I mistaken?>>Hence a VI can load settings into another VI's front panel
    controls, set>them as default and then use an "invoke node" to run the VI,
    but there's no>way a VI can set its own front panel defaults unless you use
    some horrible>method whereby it launches another VI, via "invoke node" since
    the>hierarchies must be independent, the other VI aborts the first, sets
    the>font panel defaults and then relaunches it, using "invoke node" before>killing
    itself.>>jared wrote in message>news:[email protected]...>>>>
    Yes, you have to use property nodes. This will only work in the>development>>
    enviroment. If you compile into an .exe it won't work. This was discussed>>
    last week or the week before on this group. Search using keywords from>your>>
    subject line, i'm sure you'll find alot of info. I also created a sample>>
    of how to implement this in an .exe. If this is something you would like,>>
    email me and i will respond with it attached.>>>> Jared>>>> "Peter T"
    wrote> >>Is there a way to 'make the current values default' while a VI
    is>running?>>>

  • To measure current in DAQ, should we use the RMS value or the DC average value?

    To measure current in DAQ, should we use the RMS value or the DC average value?

    I am taking measurements of single-phase or 3-phase voltage and current and displaying them on the front panel. The voltage measurement is valid, but with the current measurement I have a problem: should I take the DC average value or the RMS value? When I take the DC average value it gives the result I want, but the RMS value gives a wrong result for my circuit situation. As we know, in the AC case the RMS value is normally used, but here it gives a wrong result, so please guide me in this regard.
    THANK YOU
    THANK YOU 
    Attachments:
    Final pp self.vi ‏156 KB
    Untitled.png ‏46 KB

  • Recording measured voltage and current values from LCR 4284A

    Hello,
    I want to use the labview VI to record the measured voltage and current values from LCR 4284A. These are shown as Vm and Im on the instrument screen. Is it possible to record these values using a VI?
    I have looked through the drivers that are available through the "Instrument driver network" (Link) for 4284A but none of the VIs give any option to monitor Vm and Im. If you know how this can be done then I would really appreciate a reply.
    Thanks
    Mansoor

    I haven't used the instrument, but there are functions to enable/read the voltage level and current level monitors. Are those different from what you are asking for? What does the manual say about Vm and Im and getting them programmatically? If you can find the command in the manual, you can just open the VI Tree and do a text search for the command.

  • How do I correctly time while loops using SCPI and VISA/Ethernet communication to send DC current Values to a Power supply?

    I'm rather new to using LabVIEW and I'm having an issue with a test data acquisition lab I'm trying to set up using a Keithley 6221 AC/DC current source and a basic PCI M-series NI-DAQmx card.
    First of all, I'm looking to update the current value on a power supply at a rate of at least 10 Hz, and I'm using SCPI commands and VISA communication through Ethernet to do so. Attached below is the VI I have written.
    The issue I'm having is this: my VI will loop through the values fine in software, or so it seems. I am unable to get the power supply to update sequentially unless I set my loop delay to something greater than 130 ms. If I try to run the loop faster it starts skipping values; instead of counting 1, 2, 3, 4, 5, 1, 2, ...etc. it goes 1, 2, 4, 5, 2, 4, 1, 2 on the display of the power supply, and my DAQ unit also skips these values, so I know that the number just isn't getting written to the power supply in time. I was wondering if this is due to my sloppy programming, or if it is a hardware issue, my computer, or the method of communication with the power supply? Is this because I'm using Ethernet and VISA communication? Is there a faster way to communicate, or is GPIB faster? Any input at all would be extremely helpful.
    On a side note: right now I'm using a pre-determined array of values that I can update on the fly, but in the future this will be put into a closed-loop control system. The value for the power supply output will be determined by the loop and sent that way.
    Attachments:
    basic DC loop 6221 Keithley.vi ‏145 KB

    Also, one of the string constants is NOT set to '\' Codes Display. Here is an alternative method of calculating the index. I removed the DAQ and VISA stuff since I could not run it.
    Lynn
    Attachments:
    basic DC loop 6221 Keithley.2.vi ‏16 KB

  • PXI-4110 Cards failing since upgrading to Calibration Executive 3.4

    Since we upgraded to Calibration Executive 3.4 all of our PXI-4110 Cards have been failing the following:
    Calibration - As Found
    Channel | As Found DMM Reading | As Left DMM Reading | Test Value | Low Limit   | Reading    | High Limit | Pass/Fail
    2       | 0.999942 A           | 0.999942 A          | 0.25000 A  | 0.99557 A   | 0.94065 A  | 1.00432 A  | Failed
    2       | 0.999991 A           | 0.999991 A          | 0.50000 A  | 0.99524 A   | 0.93736 A  | 1.00474 A  | Failed
    2       | 0.999761 A           | 0.999761 A          | 0.75000 A  | 0.99370 A   | 0.94076 A  | 1.00582 A  | Failed
    This failure occurred on 3 brand new cards. I have even tried the last PXI-4110 from before the upgrade and it also failed these tests, and when an adjustment is attempted I receive the following error:
    Error -300010 occurred at Get UserField Data.vi
    Complete call chain:
         cex_UtilityGetUserFieldData.vi
         cex_UtilityGetUserFieldDataByIndex.vi
         _NI 4110_Get Channel Range and Level.vi
         Adjust_NI 4110_Current Out Accuracy.vi
    at step Adjust Current Output Accuracy
    I am using an Agilent 34401 for the DMM.

    JVP
    Here are the files you wanted. Sorry this took so long; the hard drive on the computer that has the Calibration Executive software died and we had to reinstall everything. I tried to run another card with the same results; the calibration report I am sending you is from today. I sent the report in two formats, XLS and PDF.
    Attachments:
    ni_support.zip ‏197 KB
    71517 Wednesday, January 11, 2012 8-28-29.xls ‏49 KB
    71517 Wednesday, January 11, 2012 8-28-29.pdf ‏30 KB

  • How to measure current on DC servo motor.

    Need to measure current on DC servo motor. What is the best way to measure the current on the motor as it moves to position? Using PXI-7344 controller and MID-7654 drive unit.  

    This is not an easy task. Here are some ideas:
    Direct current measurement:
    Add shunt resistors into the motor cabling and measure the voltage over the resistors. As the 7654 doesn't provide DC current but PWM current (32 kHz) you will have to acquire the data pretty fast (at least with 200 kHz). This could be done with a DAQ-board like the PCI-6220. Additionally you will have to do some math to calculate the RMS value of the current.
    Control voltage measurement:
    You could measure the output voltage of the 7344 as it is proportional to the duty cycle of the PWM current signal of the 7654. You will also need some additional measurement hardware but you wouldn't have to use shunt resistors and RMS calculations. The major disadvantage of this option is cabling as there is a single cable connection between the controller and the drive and you would have to use e. g. two external connector blocks and an additional cable in order to be able to connect your measurement hardware to the voltage output of the 7344.
    Torque measurement:
    Torque is also proportional to motor current so you could add a torque sensor to your motor and measure the output signal with e. g. a PCI-6220. Depending on your hardware setup this might be the best option.
    I hope that helps,
    Jochen Klier
    National Instruments Germany

  • How to compare, current value in :block.text_item with the database value

    Hi
    Could you please tell me
    How to compare the current value in :block.text_item with the corresponding database column value.
    I am using forms 10g
    There is a block and there is a text item in that block.
    When I run the form and query the block (tabular), the :block.text_item shows me whatever value is in the database.
    Now I add some value in the :block.text_item to the existing value,
    so
    the :block.text_item contains the old + newly added value,
    whereas
    the database table contains the old value.
    Now, on a button click, I want to find out what value I have added.
    Could you please tell me, is it possible without writing a select query?

    Hello,
    > Now on a button click, I want to find out what is the value that I have added
    So you mean the user will always add to the existing value? Because that approach will fail in one case. Let's say:
    The value in the database is ABCD.
    The user opened the form, removed the D and wrote E; now the value is ABCE and the length is still 4, so there is no addition.
    Anyway, you can get the database value at runtime: there is a property for items called DATABASE_VALUE. It gives the value which is in the database while you are running the form, before saving. You can use it like this:
    Trigger = WHEN-MOUSE-DOUBLE-CLICK on item level
    DECLARE
      vItemValue  DATATYPE; -- Set the data type according to your desired field.
      vValueAdded DATATYPE; -- Set the data type according to your desired field.
    BEGIN
      vItemValue := GET_ITEM_PROPERTY('ITEM_NAME', DATABASE_VALUE); -- Returns the database value in vItemValue.
      IF LENGTH(:FORM_ITEM_NAME) > LENGTH(vItemValue) THEN -- Something was added
        vValueAdded := SUBSTR(:FORM_ITEM_NAME, LENGTH(vItemValue) + 1);
        MESSAGE('Added value is : ' || vValueAdded); -- Shows the added value.
      END IF;
      -- Now suppose you want to show the old and the new value in the message,
      -- not the added one. Then there is no need for the IF condition; you can
      -- just use MESSAGE like this (and I would prefer this way):
      MESSAGE('Old Value : ' || vItemValue || '  New Value - ' || :FORM_ITEM_NAME);
    END;
    Hope it is clear.
    -Ammad

  • Measuring different values with different intervals

    I have a program which measures temperatures every second (or however often I want) and writes them to a file. But now I want it to open a digital line to measure the current through a resistor, and if the current is higher than a certain limit, the program should open another digital line. But I don't want it to open the digital line/measure the current too often because the resistor will get hot, so I want it to measure maybe every 15 minutes. How should I make the program measure these values at different intervals?
    I am really a beginner in LabVIEW, and I have no clue how to do this. If someone could give me any advice, even if it's only about where I can look to find a solution, I would be very grateful.

    On your front panel, you can have two numerics: one is the interval between temperature reads, the other is the interval between digital reads. If intervals are on the order of seconds (not milliseconds) you're probably better off checking the interval using Get Date/Time in Seconds rather than using Wait (ms). Add shift registers to your loop to save the last time read (one shift register for the last temperature read, another for the last digital read). Each time through the loop, check if the current time (from Get Date/Time in Seconds on the Time & Dialog palette) is more than the specified interval from the last time read. If it is, take the reading and save the current time as the last time read to the shift register. Use separate case structures for the temperature reading and the digital reading.
    See the attached LabView 6.1 example.
    Attachments:
    ReadAtDifferentIntervals.vi ‏35 KB

  • IVI Configuration with PXI-4110 in TestStand

    Hello All,
    Set-Up:
    PXI-1033 connected through MXI to a PC running Windows 7. LabVIEW 2014. TestStand 2014 (32-bit). PXI-4110 Power Supply and PXI-4070 DMM.
    In MAX I can open both soft panels and control both units and they work great.  In LabVIEW I can control both cards as well. 
    In MAX I have set up a driver and logical name for my DMM. This unit works great within TestStand using an IVI DMM step.
    I then proceeded to setup the 4110 in MAX with an IVI driver and logical name. Here are my settings:
    Name: ni4410_PS
    Hardware: Added hardware asset and select the PS. This one is checked, no other assets are checked.
    Software Module: NI-DCPower, 4110 is listed as a device that is supported.
    Virtual Names: This is where I am confused. Under physical name there are four options that come up (0, 1, 2, and 3). This power supply only has 3 outputs, so I am unsure why four come up. I have made 4 virtual names, one for each of the options, named ch0, ch1, ch2, and ch3 respectively.
    When I put an IVI Power Supply step in TestStand, everything seems to be working. I open the configuration window and set my values for each channel. If I try to validate the setup by unchecking simulate and clicking on init, I do not get an error. As soon as I click on 'Configure' or 'Show Soft Front Panel' I get the following error:
    "The IVI Configure operation failed for logical name 'NI PS 1'. Details: Extension capability not supported by instrument driver. (Base) (-30717)"
    Any information would be appreciated. I tried playing with it for a couple of hours yesterday and had a couple of co-workers try to help. We are all under the assumption that this should be working.
    Thank You!!
    Jesse

    Hi jesserzamora,
    Have you seen this link: http://digital.ni.com/public.nsf/allkb/331F2717DBCD1F858625758200745773?OpenDocument
    It discusses a similar failure with the IVI Power Supply step in TestStand. 
    Julia P.

  • Pxi 4110 and dcpower express vi

    hi everyone,
    I'm really new to LabVIEW and I have to use a PXI-4110 to supply a DC voltage with a square wave (or whatever shape I choose).
    I tried using the "dcpower express vi" but I can't understand how to connect the generated signal. Right now I have the wave, and I put in a block which transforms the wave from dynamic data to an array... but it doesn't work!
    I tried watching the examples, but none of them uses Power Express...
    thanks for any help!!
    thanks for any help!!

    As bronzacuerta mentioned, the PXI-4110 is not intended to generate a square waveform, or any specific waveform. It is intended to regulate a constant voltage or current, adjusting for changing loads. By changing the output using the NI-DCPower API, you may be able to approximate a square waveform, but it will not be a very good one, both in terms of rise/fall times and due to software-timed output updates.
    Depending on your speed/flexibility/accuracy requirements, a potentially better option than a multifunction DAQ card is a signal generator.
    Tobias
    Staff Software Engineer
    Modular Instruments
    National Instruments
