Measure power/current being delivered by PXI-4110 on a specific channel?

I have a PXI-1033 chassis and inside it there's a PXI-4110. How do I use LabVIEW 8.2.1 to determine the power/current consumption on a specific output channel?

Similar Messages

  • Triggering PXI-4110 to measure 1 current value while HSDIO PXI-6552 generating waveform

    Hi,
    Some question about PXI-4110 to measure current while PXI-6552 is generating the waveform. 
    1. Let's say I need to measure 3 points of current values, i.e. while the PXI-6552 is generating sample 1000, 2000 and 3500. On the edge of samples 1000, 2000 and 3500, the PXI-6552 will send a pulse via a PFI line or via a PXI backplane trigger line. My question is: is it possible to trigger the PXI-4110 (hardware trigger or software trigger) to measure current values at these points?
    2. Let's say I need to measure the current at 0 ms (start of waveform generation by the PXI-6552), 1 ms, 2 ms, 3 ms, 4 ms... and so on for 1000 points of measurement, code diagram as shown in the figure below. Is it possible for the VI "niDCPower Measure Multiple" to measure exactly at 1 ms, 2 ms, 3 ms...? How much time does it take to acquire 1 point of measurement with "niDCPower Measure Multiple"?
    Thanks for viewing this post. Your advice on hardware or software methods is much appreciated. Thanks in advance.
    Message Edited by engwei on 02-02-2009 04:24 AM
    Attachments:
    [email protected] ‏46 KB

    Hi engwei,
    1. Unfortunately, the 4110 does not support hardware triggering. Therefore you cannot implement direct triggering through the backplane or anything like that. However, there are a couple of possible workarounds you can try:
    a) Use software triggering: Say your 6552 is generating in one while loop, and your 4110 is to measure in another while loop. You can use a software synchronization method like notifiers to send a notification to your 4110 loop when your 6552 has generated the desired sample. This method, however, will not be very deterministic, because the delay between the trigger and the response depends on your processor speed and load. Therefore, if you have other applications running in the background (like antivirus) they will increase the delay.
    b) Use hardware triggering on another device: If you have another device that supports hardware triggering (like an M-series multifunction DAQ module), you can configure this device to be triggered by a signal from the 6552, perform a very quick task (like a very short finite acquisition), then immediately execute the DCPower VI to perform the measurement. The trigger can be configured to be re-triggerable for multiple uses. This will most likely have a smaller time delay than the first option, but there will still be a delay (the time it takes to perform the short finite acquisition on the M-series). Please refer to the attached screenshot for an idea of how to implement this.
    2. To make your 4110 measure at specific time intervals, you can use one of the methods discussed above. As for how long it will take to acquire 1 measurement point, you may find this link helpful: http://zone.ni.com/devzone/cda/tut/p/id/7034
    This article is meant for the PXI-4130 but the 4110 has the same maximum sampling rate (3 kHz) and so the section discussing the speed should apply for both devices.
    Under the Software Measurement Rate section, it is stated that the default behavior of the VI is to take an average of 10 samples. This corresponds to a maximum sampling rate of 300 samples/second. However, if you configure it to not do averaging (take only 1 sample) then the maximum rate of 3000 samples/second can be achieved.
    It is also important to note that your program can only achieve this maximum sampling rate if your software loop takes less time to execute than the actual physical sampling. For example, if you want to sample at 3000 samples/second, taking one sample takes 1/3000 seconds, or 333 microseconds. If your software execution time is less than 333 microseconds, then you can achieve this maximum rate (because the speed is limited by the hardware, not the software). However, if your software takes more than 333 microseconds to execute, then the software loop time will define the maximum sampling rate you can get, which will be lower than 3000 samples/second.
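    The loop-time reasoning above reduces to one calculation: the achievable rate is set by whichever period is longer, the hardware sample period or the software loop time. A minimal sketch (plain C, no driver calls; the 3 kHz figure comes from the linked article):

    ```c
    /* Achievable software-timed measurement rate: the longer of the hardware
     * sample period and the software loop execution time sets the pace.
     * Both times in seconds; result in samples/second. */
    double achievable_rate(double hw_period_s, double sw_loop_s)
    {
        double limiting = (sw_loop_s > hw_period_s) ? sw_loop_s : hw_period_s;
        return 1.0 / limiting;
    }
    ```

    With a 333 us hardware period and a 200 us loop, the hardware limits you to about 3000 S/s; with a 1 ms loop, the software limits you to about 1000 S/s.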
    I hope this answers your question.
    Best regards,
    Vern Yew
    Applications Engineer, NI ASEAN
    Attachments:
    untitled.JPG ‏18 KB

  • Measure Power using Labview and two PXI-4071 DMMs

    Hello,
    I am trying to make a simple VI. I am measuring input AC voltage and current using two PXI-4071 DMMs. I want to calculate power and power factor. I am able to read the voltage and current fine. I just don't know how to calculate the power and power factor. I'm looking for real power, not volt-amps. Please help.
    Scott
    Attached is my VI
    Attachments:
    Measure AC Voltage and Current.vi ‏42 KB
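    For reference, the quantities Scott is asking about can be computed from sampled waveforms: real power is the mean of the instantaneous v*i product, apparent power is Vrms*Irms, and power factor is their ratio. A sketch of the calculation (this assumes time-aligned voltage and current samples, which two free-running DMMs do not by themselves provide):

    ```c
    #include <math.h>
    #include <stddef.h>

    /* Real power (W), apparent power (VA) and power factor from
     * time-aligned voltage and current samples. */
    typedef struct { double real_w; double apparent_va; double pf; } PowerResult;

    PowerResult compute_power(const double *v, const double *i, size_t n)
    {
        double p_sum = 0.0, v_sq = 0.0, i_sq = 0.0;
        for (size_t k = 0; k < n; k++) {
            p_sum += v[k] * i[k];          /* instantaneous power */
            v_sq  += v[k] * v[k];
            i_sq  += i[k] * i[k];
        }
        PowerResult r;
        r.real_w      = p_sum / (double)n;                               /* mean of v*i */
        r.apparent_va = sqrt(v_sq / (double)n) * sqrt(i_sq / (double)n); /* Vrms * Irms */
        r.pf          = r.real_w / r.apparent_va;
        return r;
    }
    ```

    Note that two independent DMM readings give only Vrms and Irms, hence apparent power; real power needs the phase relationship, which requires synchronized sampled waveforms.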

    Scott,
    It appears that another applications engineer (Sean_N) has been assigned your initial forum post. I assure you that he is working on finding an answer to your question. We would like for our customers to limit questions to one forum so that we do not duplicate efforts on our end. I will contact Sean to remind him that you are awaiting a response. I will also point all future traffic from this post to your initial one.
    Thanks,
    Kareem W.
    National Instruments
    Web Product Manager

  • Measuring sub 125 microamps using PXI 4110 - Is it possible?

    I have a question concerning the PXI 4110. We are trying to use this card both to provide voltage to a device and, at the same time, to monitor the real-time current being supplied. We have found that it is capable of providing voltage from 0 to 6 volts, but we have not been able to read below ~500 microamps. The data sheet says it has a resolution of 0.01 mA (10 microamps), but we have not been able to get the card to read consistently under 500 microamps.
    Is there a minimum current draw required before the 10 micro-amp range becomes true? 
    Here is what we are trying to monitor using this card:
    Voltage supplied between 3.5 to 6 volts.
    When the device is asleep, current draw is between 50 and 100 microamps.
    When the device is awake, current draw can be as high as 100 milliamps.
    Will this card work?  Is there a better solution to this? 
    Thanks

    Hi John,
    Thanks for the additional information. Based on your sampling rate you should be fine with a software- or hardware-timed device. What you will need to determine is how accurately you need to measure your sleep current. The issue is that you do not know which mode your DUT will be in when you change your current range. At this time our power supplies have output and measure ranges that are coupled together. This means that when you change to a lower range your output current will be limited to the maximum of that range, which can be an issue when the DUT 'wakes up'.
    I would recommend selecting a device that can remain in the higher current range (100 mA or greater) and that has the measurement capability that meets your needs. You need to decide how accurately you must measure the sleep current, i.e. within +/- X uA. When selecting the device you will then want to look at its measurement accuracy on the range you will use. The calculation would be +/- (Measurement * Gain Error + Offset Error). You will find that the largest portion of the error will be offset, due to the range you will be using. You also need to take into account the resolution of the instrument, because that is the smallest possible change you can measure.
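    The accuracy calculation above can be sketched as a small helper (gain error as a fraction of the reading, offset in amps; the numbers in any real budget must come from the instrument's specifications, and the ones in the note below are made up for illustration):

    ```c
    #include <math.h>

    /* Worst-case measurement uncertainty: +/-(|reading| * gain_error + offset_error).
     * reading and offset_error in amps; gain_error as a fraction (e.g. 0.0005 = 0.05%). */
    double measurement_uncertainty(double reading, double gain_error, double offset_error)
    {
        return fabs(reading) * gain_error + offset_error;
    }
    ```

    For example, at a 100 uA sleep current with a hypothetical 0.05% gain error and 3 uA offset, the bound is 100e-6 * 0.0005 + 3e-6 = 3.05 uA, dominated by the offset term, as the post says.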
    One comment on offset error: for a given test setup/temperature/humidity/etc. and test voltage, it will stay fairly constant. This means that you can characterize the offset of your system if all of those factors remain constant. I recommend setting up your test, including all fixturing/cabling but excluding the DUT. You can set the supply to your test voltage and measure the current. In this setup the ideal current would be 0 uA because it is an 'open', but due to leakages in the system there will be an offset. You can take this reading as a baseline 'zero' and subtract it from future readings to improve your measurements. You will want to be careful of dielectric absorption (DA) because it can mislead you when making measurements like this, but it is less of an issue when talking about uA and more of an issue when measuring pA. It would be a good idea to repeat this characterization periodically to ensure that your measurements are accurate, ideally once per DUT, but you can scale that back as necessary.
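    The baseline-'zero' procedure above amounts to averaging a few open-circuit readings and subtracting the stored value from live measurements; a minimal sketch:

    ```c
    #include <stddef.h>

    /* Average n open-circuit readings (fixturing in place, no DUT) to
     * characterize the system's leakage offset. All values in amps. */
    double characterize_baseline(const double *open_readings, size_t n)
    {
        double sum = 0.0;
        for (size_t k = 0; k < n; k++)
            sum += open_readings[k];
        return sum / (double)n;
    }

    /* Subtract the stored baseline from a raw reading. */
    double corrected_current(double raw, double baseline)
    {
        return raw - baseline;
    }
    ```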
    I hope this is helpful. It is a good idea to evaluate the hardware in your test setup to ensure that the measurements meet your needs. I would also add the PXI-4132 to your list of options to consider for its 100 mA range. I think these other devices would be better than the PXI-4110 in your application because of the low current measurements you need. If you can use the additional channels, the PXIe-4140/4141 are good options; if not, the PXI-4132 would be a good one. You should also consider the different connectors for PXI vs PXIe and what will work for your chassis.
    Steve B

  • How to measure the current/power running through iMac

    Hi
    After I moved from my old house to my new appartment, my 24" late 2006 iMac instantly started making this annoying high pitch noise.
    I've tracked it to coming from the area around the hard drive fan.
    I believe there might be a problem with the ground connection in the electricity system in the building.
    Is there a way to measure the current running through my dear iMac?
    Thank you very much.
    //Spaceranger
    ps.: also see my previous post where I tried to figure out what might be the problem.

    See here for the full tech specs of your machine:
    http://support.apple.com/kb/SP28
    You could connect a Volt/ Amp meter to the power outlet which will tell you what voltage/ current is being supplied, but you have to ensure the meter is set correctly or it could damage the meter being used.
    I'd ask an electrician to come in and have a look at the supply. They're qualified and know what they're talking about and may be able to shed some light on the problem.

  • PXI-4110 current limit

    I am programming a PXI-4110 with LabWindows. I am trying out the current limiting. I have an output set to 5V with a 1K load resistor; this makes the current 5mA. I set the current limit to 2mA to see it work. When I read the voltage and current it is still 5V and 5mA. Here is my code:
    g_voltageLevel_ch0 = 5.0;
    g_currenLimit_ch0 = 2.0e-3;
    status = niDCPower_ConfigureOutputFunction(vi_4110_0, channel0Name, NIDCPOWER_VAL_DC_VOLTAGE);
    niDCPower_error_message(vi_4110_0, status, errorMessage);
    status = niDCPower_ConfigureSense(vi_4110_0, channel0Name, NIDCPOWER_VAL_LOCAL);
    niDCPower_error_message(vi_4110_0, status, errorMessage);
    status = niDCPower_ConfigureVoltageLevel(vi_4110_0, channel0Name, g_voltageLevel_ch0);
    niDCPower_error_message(vi_4110_0, status, errorMessage);
    status = niDCPower_ConfigureCurrentLimit(vi_4110_0, channel0Name, NIDCPOWER_VAL_CURRENT_REGULATE, g_currenLimit_ch0);
    niDCPower_error_message(vi_4110_0, status, errorMessage);
    Solved!

    I'm getting an error now. I have a 1K resistor on each of the outputs so I will see some current.
    Look for these lines to see comments I am sending you about the code.
    When I start my program I call a function PXI4110_HardwareInit() to get things set up. This part seems to look OK.
    /************************************** PXI-4110 ****************************************/
    // PXI-4110 hardware definitions for initialization
    static ViSession vi_4110_0 = VI_NULL, vi_4110_1 = VI_NULL, vi_4110_2 = VI_NULL;
    /* The channel names are length 2 because the maximum size of the strings
    from each textbox on the UI is 1. */
    static ViChar channel0Name[2] = "0";  // 0V to 6V channel
    static ViChar channel1Name[2] = "1";  // 0V to 20V channel
    static ViChar channel2Name[2] = "2";  // 0V to -20V channel
    // inputs
    ViReal64 g_voltageLevel_ch0 = 5.0;
    ViReal64 g_voltageLevel_ch1 = +13.0;
    ViReal64 g_voltageLevel_ch2 = -13.0;
    ViReal64 g_currenLimit_ch0 = 500.0e-3;
    ViReal64 g_currenLimit_ch1 = 500.0e-3;
    ViReal64 g_currenLimit_ch2 = 500.0e-3;
    void PXI4110_HardwareInit(void)
    {
        ViStatus status;
        ViChar errorMessage[256];
        ViChar resourceDCSupplyName[256] = "PXI-4110";
        // set channel 0 to +5.0V
        status = niDCPower_InitializeWithChannels(resourceDCSupplyName, channel0Name, VI_TRUE, VI_NULL, &vi_4110_0);
        if (status < 0)
            niDCPower_error_message(vi_4110_0, status, errorMessage);
        status = niDCPower_ConfigureOutputFunction(vi_4110_0, channel0Name, NIDCPOWER_VAL_DC_VOLTAGE);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
        status = niDCPower_ConfigureSense(vi_4110_0, channel0Name, NIDCPOWER_VAL_LOCAL);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
        status = niDCPower_ConfigureCurrentLimitRange(vi_4110_0, channel0Name, 1.0);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
        Set_PXI_4110_Outputs();
        status = niDCPower_ConfigureOutputEnabled(vi_4110_0, channel0Name, VI_FALSE);
        status = niDCPower_Initiate(vi_4110_0);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
    }
    static void Set_PXI_4110_Outputs(void)
    {
        ViStatus status;
        ViChar errorMessage[256];
        // channel 0
        status = niDCPower_ConfigureVoltageLevel(vi_4110_0, channel0Name, g_voltageLevel_ch0);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
        status = niDCPower_ConfigureCurrentLimit(vi_4110_0, channel0Name, NIDCPOWER_VAL_CURRENT_REGULATE, g_currenLimit_ch0);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
    }
    Then I send a message over Ethernet to enable the outputs. When I enable the outputs, this function reads the voltage and current outputs and sends this information to an application on my host computer. This information corresponds to what I set and what I see with a volt meter.
    void PXI_4110_enable_outputs(void)
    {
        ViStatus status;
        ViReal64 measuredVoltage, measuredCurrent;
        ViBoolean inCompliance;
        ViChar errorMessage[256];
        // set the outputs
        // Set_PXI_4110_Outputs();
        status = niDCPower_ConfigureOutputEnabled(vi_4110_0, channel0Name, VI_TRUE);
        /* Wait for the outputs to settle. */
        Delay(50e-3);
        // check channel 0
        status = niDCPower_MeasureMultiple(vi_4110_0, channel0Name, &measuredVoltage, &measuredCurrent);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
        niDCPower_QueryInCompliance(vi_4110_0, channel0Name, &inCompliance);
        niDCPower_error_message(vi_4110_0, status, errorMessage);
    }
    Now I send a message to change the current limit. This is where I have trouble. I try to limit the current on channel 0 to 2 mA. Since the output voltage is 5 volts and the resistor is 1K, the current will exceed the current limit, which is what I am trying to test. I get an error in this function:
       g_currenLimit_ch0 = CommandData->data; 
       status = niDCPower_ConfigureCurrentLimit(vi_4110_0, channel0Name, NIDCPOWER_VAL_CURRENT_REGULATE, g_currenLimit_ch0);
       niDCPower_error_message(vi_4110_0, status, errorMessage);
    I see g_currentLimit_ch0 = 2e-3, which is what I sent, but my status is a negative number. The message is "Invalid value for parameter or property".
    vi_4110_0 is the handle I got earlier.
    channel0Name ="0"
    and g_currentLimit_ch0 = 2e-3
    I do not understand why I am seeing an error here?
    thanks in advance for your help,
    Don

  • Pxi 4110 and dcpower express vi

    hi everyone,
    I'm really new to LabVIEW and I have to use a PXI 4110 to supply DC voltage with a square wave (or whatever shape I choose).
    I tried using the "dcpower express vi" but I can't understand how to connect the generated signal. Right now I have the wave and I put a block which transforms the wave from dynamic data to an array... but it doesn't work!
    I tried watching the examples, but none of them uses the power express VI...
    thanks for any help!!

    As bronzacuerta mentioned, the PXI-4110 is not intended to generate a square waveform, or any specific waveform. It is intended to regulate a constant voltage or current, adjusting for changing loads. By changing the output using the NI-DCPower API, you may be able to approximate a square waveform, but it will not be a very good one, both in terms of rise/fall times and due to software-timed output updates.
    A potentially better option than a multifunction DAQ card (depending on your speed/flexibility/accuracy requirements) is a signal generator.
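    For what it's worth, the software-timed approximation mentioned above boils down to computing the target level on each update tick and writing it with niDCPower_ConfigureVoltageLevel. The level computation can be sketched like this (the timed loop and driver calls are omitted, since they depend on your setup):

    ```c
    #include <math.h>

    /* Level of an ideal square wave at time t (seconds): 'high' for the first
     * half of each period, 'low' for the second. In a real program this result
     * would be passed to niDCPower_ConfigureVoltageLevel from a timed loop. */
    double square_level(double t, double period, double low, double high)
    {
        double phase = fmod(t, period);
        return (phase < period / 2.0) ? high : low;
    }
    ```

    Each update is software-timed, so the edge placement jitters with OS scheduling, which is one reason the reply steers toward a signal generator for a clean waveform.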
    Tobias
    Staff Software Engineer
    Modular Instruments
    National Instruments

  • How to measure power consumption of a brushless DC motor

    Hi,
    I need to measure the power consumption of a Moog 23-23 motor using a PXI platform, because I need to save the instantaneous power consumption data. I have a DAQmx 6259 and an FPGA module. The problem is that I don't know whether I should measure the current and voltage of all three coils of the motor or if it is enough to measure just one. Also, is it better to measure all the signals in differential mode directly with the DAQmx, or should I use a differential amplifier first?
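    On the one-coil-versus-three question: total instantaneous power of a three-phase machine is the sum of the per-coil v*i products, so measuring a single coil is only enough if the phases can be assumed balanced. A sketch of the arithmetic (time-aligned samples across channels are assumed):

    ```c
    /* Total instantaneous power from per-coil voltage and current samples
     * taken at the same instant. */
    double instantaneous_power_3ph(const double v[3], const double i[3])
    {
        return v[0] * i[0] + v[1] * i[1] + v[2] * i[2];
    }
    ```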

    I don't see anything wrong with your approach; however, I'll throw out a few thoughts:
    If you could tie the sending of the change-voltage command (in your DLL) to the START TASK command for DAQ, you could reduce the variability in the time between the two events. Maybe that's important, maybe not.
    Can you set the voltage via some LabVIEW code, rather than a DLL?
    You might or might not want a variable sampling rate - if you expect 10 mSec, you might want to sample at 10 kHz to catch the 1% difference between 10.2 and 10.3 mSec. But if you're expecting 500 mSec, you could sample at 200 Hz to catch the 1% difference between 500 mSec and 505 mSec, thereby saving data space and processing time. Maybe that's important, maybe not.
    Steve Bird
    Culverson Software - Elegant software that is a pleasure to use.
    Culverson.com
    Blog for (mostly LabVIEW) programmers: Tips And Tricks

  • PXI-4110 niDCPower_init

    I am just starting to write a program for my PXI-4110 Programmable DC Power Supply.
    I am using the function niDCPower_init(). When I read the documentation about this function it made this statement:
    "This function is deprecated. Use niDCPower_InitializeWithChannels instead."
    My fp file (nidcpower.fp) does not have this function. Do I need a newer fp file? Is this function niDCPower_init() only a poor choice or is there something wrong with it?
    Deprecated was an odd word to use to describe this function. It implies that the function is disapproved of but maybe it works. What is the opinion of others?
    I am going to try it out but would like to know if this is the function I want to use.
    thanks in advance

    DPearce,
    Thank you for using the NI forums! Deprecated functions belong to the older, deprecated programming model.
    In terms of the niDCPower_init function, it is obsolete. If you do not have niDCPower_InitializeWithChannels, I would recommend downloading the most current version of our driver. You can download the driver here. You can find more information on this function here. The VI version is here (I tend to find the VI version a little more reader friendly!). 
    Also, I would check out the shipping examples that come with this driver. That will help you see the various functions in action and assist you in determining which one you want to use. 
    Let me know if you have any other questions!
    Katie Collette
    National Instruments

  • Emails are only being delivered to the WebOutlook inbox and not my desktop outlook

    1. Big Problem: How do I get emails sent to my primary email address (name @ wealthblush dot com, hosted by GoDaddy) to go to both the Outlook Web App inbox AND the inbox of my desktop Outlook 2013? They only go to Outlook Web App and not my desktop Outlook 2013.
    How is the email account configured in Outlook (POP3, IMAP or Exchange):
    POP/SMTP is for the user @ wealthblush dot com - this is the one where the email ONLY goes to the Outlook Web App, but not into desktop Outlook 2013.
    IMAP/SMTP is for the user @ charter dot net email. Emails go to both clients, but if I delete on one client it is not deleted on the other.
    Exchange/ActiveSync is the outlook.com email and it only goes to my desktop Outlook.
    I'm not getting any error messages. Mail is being delivered to the user @ wealthblush dot com, but only to Outlook Web App and NOT my desktop Outlook.
    2. How can I change the inbox an email account puts incoming emails into? I read somewhere the way to do this is to create a search folder? This question is actually in advance of fixing question number 1 above.
    3. I don't know how I should configure my emails with the new Outlook 2013 now being in the cloud:
        - I have an @ outlook dot com email I set up when MS came out with the free outlook dot com (hotmail), but I never used it because the reminders didn't work the same as in desktop Outlook (not being able to snooze for a few hours or days or weeks, if I remember right, and the way we get the alerts too).
        - I also have an email with my ISP (@ charter dot net) which is a mix of personal and business emails. I've had this account for over 10 years.
        - Finally, I have a business website (www dot wealthblush dot com) hosted by GoDaddy with one real email @ wealthblush dot com. I've already set up the aliases for several other emails in my new Office 365 account (support, sales, webmaster at wealthblush dot com), which are configured to pass into specific folders using Outlook 2013 rules in my desktop Outlook. I'm not sure if those rules are replicated into the Office 365 account. I still want the aliases to go into those folders like before.
    4. In the case of my outlook.com email, they do go to both inboxes (on the web and desktop), but when I delete an email from one place, shouldn't it get deleted in the other place as well? How do I get them to sync?
    5. And then there are my existing .pst files. I have 4 of them with old emails I've saved over the years that take up just over one GB. I save these .pst files to DVD every day so I can go back to look at something if I need to (and for hard drive crashes). Does MS want me to import all that stuff to their servers and get rid of my .pst files? Does MS want me to put everything into one .pst file? It seems like it. If I do, then I assume things need to be synchronized all the time between my file and your .pst file? The synchronization process is as close as you can get to real-time, right? Or does each individual change get sent as it happens?
    6. I also have a SkyDrive account from a while ago too. Is this where I should put my documents so they're accessible when I'm not at my desk? Now, that would be a LOT of data (probably close to 80 GB). Or is the intention for us users to only put certain current docs in this area? I assume these files can be synced with my desktop too, right? But I don't know how. Do you keep old versions of the files in SkyDrive so we can recover them? I want to know if I can use your service to hold my files (they would ONLY exist on your system and you'll back them up and I can get old versions back when needed).
    7. Is an option to keep everything on my desktop Outlook 2013 like I did in Outlook 2010, bypassing the cloud and MS mail servers altogether?
    Sorry about all the questions, but I'm just getting the feel for your approach and need to decide how this all should fit together, do it, then go back to my business. I'm by myself at my company right now, but I'm planning on hiring a few people real soon too, so I need to keep this in mind. I've worked with Outlook since it came out, and I've configured it, but mainly only for myself and my family.

    R1: It seems you didn't configure your email account in your Outlook client properly. Please refer to the following KB article and try to create a new mail profile to configure your email account again:
    http://support.microsoft.com/kb/2758902
    R2: Do you want to create subfolders under Inbox to categorize your emails? If so, we can just right-click on Inbox > select New Folder... After that, we need to create rules to move or copy your emails to each subfolder. See:
    http://office.microsoft.com/en-us/outlook-help/manage-email-messages-by-using-rules-HA010355682.aspx
    Besides, in order to avoid confusion and keep track of troubleshooting steps, we usually troubleshoot one issue per thread in order to find a resolution efficiently. Concerning your other question, I suggest we create new posts for your other questions via:
    http://social.technet.microsoft.com/Forums/en-US/home?forum=outlook
    Thanks for the understanding.
    Steve Fan
    TechNet Community Support

  • My 4S has stopped ringing. Voicemail's disabled but if you call me, you'll hear it ring your end but it does nothing at mine and freezes if I make a call. Texts are not being delivered or coming through delayed. Ringer is on, tone enabled - help!

    My iPhone 4S has stopped ringing altogether. Voicemail is disabled but if you call me, you'll hear it ringing at your end but it does nothing my end and freezes in the phone app if I make a call (doesn't ring or connect and doesn't freeze the phone, just the calling part). Texts are not being delivered or coming through delayed. Ringer is on, tone enabled - help!
    I've disabled roaming so it only runs on wifi (this after getting a £4k phone bill...) and I can use Skype, Viber and WhatsApp with no problem at all.
    Would really appreciate any help at all!
    Many thanks.

    Hi there - I am with Orange and they said nothing is wrong with the account and service is running normally. They said if they had cut me off I would've received a text (debatable with the current message receiving situation!) and when I called out it would say 'calls from this number are barred'. Also if you called me it would say something similar. But it doesn't; it will ring and ring until it rings off, but nothing happens at all on my handset. Not even a missed call notification. If I call out, it will display that it is calling the number but that's it. If I cancel the call it will constantly display 'ending call'. If I come out of the phone and go to another app then revisit the phone, it will start calling that last dialled number - without ever getting as far as ringing or connecting.

  • HT3728 my broadband speed has all of a sudden been reduced dramatically over the weekend. the isp asked me to eliminate my apple airport express base station to make sure that my 20mb band is being delivered which i did and confirmed so.

    MY BROADBAND HAS BEEN REDUCED DRAMATICALLY IN MY HOME NETWORK USING MY AIRPORT EXTREME BASE STATION.
    THIS HAPPENED  SO UNEXPECTEDLY OVER THE WEEKEND. I CALL MY ISP AND THEY SIMPLY TOLD ME TO ELIMINATE THE AIRPORT
    EXTREME BASE  STATION  TO CONFIRM THAT  WHAT I  HAVE PURCHASED  IS BEING DELIVERED WHICH I  DID AND CONFIRMED.
    MEANWHILE USING MY  NETWORK UNTIL LAST  FRIDAY EVERYTHING LOOKED  JUST  GREAT.
    I  WONDER WHETHER  I WOULD HAVE TO   RESET MY  AIRPORT EXTREME TO FACTORY  SETTINGS TO RECONFIGURE  IT.

    Hmm. Your ISP is incorrect in telling you that removing any of your routers on the local area network will have anything to do with the bandwidth performance that they provide you at the Internet modem. On the other hand, changing/removing networking equipment on your local network DOES affect bandwidth performance on that local network.
    Most likely what the ISP is asking you to do is verify that you are getting their promised download/upload bandwidth at the modem. It appears that you did that and verified that you are getting the stated bandwidth.
    So, we should take a look at why, when adding the AirPort Extreme, you are getting less.
    First off, which AirPort Extreme model do you have? What is the make & model of the broadband modem that your ISP is providing you?
    It is always a good idea to perform a complete power recycle of your networking equipment when introducing new hardware. Please check out the following AirPort User Tip for details.
    (p.s. Please don't use all capital letters when you post. It appears to readers that you are shouting. Thx!)

  • Messages in queue not being delivered

    Hi
    I am using IMS 5.2. After a recent migration of user accounts to a new login ID, some users' emails are not being delivered to their accounts and are accumulating in the ims-ms queue. When I did the migration, I copied the users' email to their new accounts. I did reconstruct -f -r a couple of times, and emails appeared to be delivered to the new login accounts.
    There were a few emails left in the queue, but now there are close to 6000 and users are not getting emails delivered. When I look at the messages in the queue, they state the user is over quota. I checked the users' accounts and they are not over quota.
    Any suggestions on how to correct the problem or get the message to be delivered?
    Thanks
    Randy

    Jay,
    iPlanet Messaging Server 5.2 (built Feb 21 2002)
    libimta.so 5.2 (built 15:07:23, Feb 21 2002)
    SunOS cpmail 5.8 Generic_108528-27 sun4u sparc
    SUNW,Ultra-60
    This is the original release. There have been over thirty patch/hotfix releases, and hundreds of fixes. I URGE you to go to Sun's download site, get 5.2p2, and install it immediately.
    Thanks for the advice, Jay. The product is installed as part of the portal we are using (which I upgraded last June). I was unaware of the versioning and patches associated with the version of IMS that was installed. I will move forward with confirming with our portal vendor that the latest SP and patches will support the integrated IMS.
    How did you "migrate" your users to new login id?When you "copied" the files, did you re-set
    ownership/permisions of the files?
    I used the following process in a script to migrate
    the users from old login to new login:
    1. created the new login mailbox with mboxutil -c
    2. moved new mailbox to different partitions
    There is a utility/command that does all of this for you.
    mboxutil -r
    When you "moved" the mailbox, I assume you use unix cp, right? Bad Idea.
    current versions of mboxutil do this all for you, and don't need reconstruct afterwards.
    I suspect that you have either not properly "moved" your user in LDAP or the folder database, or have gotten file ownership or permissions wrong after the move.
    Unix files are owned by a user and group, and have various read/write permissions set by the OS. You're on a Unix system, so you should know about such.
    I tried using mboxutil -r initially but was getting a lot of errors with it in my test environment. In hindsight, I am thinking this had to do with the version of IMS I have installed. The unix file move was done with the IMS admin account, so it would also own all the messages etc. on the OS side. I have obviously made a mess of things! So again, I will put in a support call to our portal vendor about how to proceed on this. I really appreciate your input.
    3. retrieved hashed directory information for old and
    new mailbox for user
    4. copied files from old to new mailbox with cp -R
    oldmailbox/* newmailbox/
    5. after all users were migrated, I did a reconstruct
    -f -r, and a reconstruct -f -m
    I am not sure what you mean by re-set
    ownership/permissions.
    Have you looked at the "default" log for an error message?
    I found the following errors in the default log:
    default.60.1093908097:[31/Aug/2004:04:15:01 -0600] cpmail stored[28697]: Store Critical: Error checking mailbox database: Ignoring log file: /usr/iplanet/ims52/msg-cpmail/store/mboxlist/log.0000000543: magic number 0, not 40988
    default.62.1094084554:[02/Sep/2004:04:15:02 -0600] cpmail stored[20883]: Store Critical: Error checking mailbox database: Ignoring log file: /usr/iplanet/ims52/msg-cpmail/store/mboxlist/log.0000000543: magic number 0, not 40988
    Your reconstruct command may have been wrong, too. That's what the errors are telling you.
    Also, there are other problems in your system. Permission problems, most likely.
    I would do a reconstruct -m first, and see if the above goes away.
    I will proceed with what you suggested, and let you know the results. Again, thank you very much for your time and input on this.
    As well, when I do an mboxutil -u on an affected user,
    the quota information says the user is well under
    quota. For example:
    $ mboxutil -u xxxxxxxxx
    diskquota  size(K)  %use  msgquota  msgs  %use  user
    25600      2604     10%   no quota  19    19    xxxxxxxxxxx
    The error message doesn't really have anything to do with quota.
    In imta/queue/ims-ms, the user has 183 messages
    waiting to be delivered. All of the messages say the
    user is over quota.
    Any suggestions?
    Randy, you need to re-think how you're moving users. You have somewhat messed up your mailstore...
    I will re-think things. In fact, I have a number of users to migrate. I will stop what I am doing and get this right before I proceed! There is no sense compounding the problem when there is a better alternative. One thing for sure: I will keep using this forum before messing with a production system again!
    You may need to contact tech support for the immediate help you will need to straighten this all out.
    Thanks, Jay.
    Randy

  • Measuring power at two different points (single phase)

    Hi all,
    I am a new user of LabVIEW 2013 with the Electrical Power Suite.
    May I know how to measure two different points of a single-phase system using the same VI, as in the DAQ power and energy example?
    I notice that several wiring types are available. Only one voltage and one current are available when 1-phase voltage and 1 current is selected, and when I select 3-phase voltage and 2 current, the values I obtain do not seem to be correct.
    From the figure attached, I want to measure the voltage and current at the solar/PV to obtain the power value. I also need to measure the voltage and current at the load, such as a water heater. This is a single-phase network.
    (2 voltage and 2 current measurements on a single phase)
    Hope for guidance.

    Hi Lewis,
    The file is attached.
    I actually want to measure power at two different points in a single-phase system, as shown in the first attachment.
    There is an example of power quality measurement (DAQ), as shown in the second attachment.
    Since there are two measuring points in my setup, may I know how to obtain two voltage measurements at the same time in DAQ?
    Hope for guidance.
    Attachments:
    Two measuremernt point (single phase).png ‏7 KB
    power & energy measurement.jpg ‏172 KB
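    Wiring selection aside, the per-point power math is simple enough to sketch in plain Python. Average real power over whole line cycles is P = (1/N) Σ v[n]·i[n]. The arrays below are simulated sinusoids standing in for the two simultaneously sampled voltage/current channel pairs; all amplitudes and the 30° phase lag are made-up illustration values, and real code would read the arrays from the DAQ task instead.

```python
import math

def average_power(volts, amps):
    """Average real power P = (1/N) * sum(v[n] * i[n]) over one or
    more whole line cycles of simultaneously sampled v/i data."""
    assert len(volts) == len(amps)
    return sum(v * a for v, a in zip(volts, amps)) / len(volts)

# Simulated 50 Hz waveforms: 1000 samples over exactly one cycle.
N = 1000
theta = [2 * math.pi * n / N for n in range(N)]

# Point 1: solar/PV side -- 230 Vrms, 10 A peak, current lags 30 deg.
v_pv = [230 * math.sqrt(2) * math.sin(t) for t in theta]
i_pv = [10 * math.sin(t - math.radians(30)) for t in theta]

# Point 2: load (water heater) -- resistive, current in phase.
v_ld = [230 * math.sqrt(2) * math.sin(t) for t in theta]
i_ld = [8 * math.sin(t) for t in theta]

p_pv = average_power(v_pv, i_pv)   # approaches Vrms*Irms*cos(30 deg)
p_ld = average_power(v_ld, i_ld)   # approaches Vrms*Irms (cos 0 = 1)
print(round(p_pv, 1), "W at PV,", round(p_ld, 1), "W at load")
```

    For the PV point the result approaches Vrms·Irms·cos φ, and for the resistive load Vrms·Irms, which is a quick sanity check that both channel pairs are wired and scaled consistently.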

  • IVI Configuration with PXI-4110 in TestStand

    Hello All,
    Set-Up:
    PXI-1033 connected through MXI to a PC running Windows 7. LabVIEW 2014. TestStand 2014 (32-bit). PXI-4110 Power Supply and PXI-4070 DMM.
    In MAX I can open both soft panels and control both units and they work great.  In LabVIEW I can control both cards as well. 
    In MAX I have set up a driver and logical name for my DMM. This unit works great within TestStand using an IVI DMM step.
    I then proceeded to set up the 4110 in MAX with an IVI driver and logical name. Here are my settings:
    Name: ni4410_PS
    Hardware: Added hardware asset and select the PS. This one is checked, no other assets are checked.
    Software Module: NI-DCPower, 4110 is listed as a device that is supported.
    Virtual Names: This is where I am confused, Under physical name there are four options that come up (0, 1, 2, and 3). This power supply only has 3 outputs so I am unsure why four come up. I have made 4 virtual names, one for each of the options. I named them ch0, ch1, ch2, and ch3 respectively.
    When I put an IVI Power Supply step in TestStand everything seems to be working. I open the configuration window and set my values for each channel. If I validate the setup by unchecking Simulate and clicking Init, I do not get an error. But as soon as I click 'Configure' or 'Show Soft Front Panel' I get the following error:
    "The IVI Configure operation failed for logical name 'NI PS 1'. Details: Extension capability not supported by instrument driver. (Base) (-30717)"
    Any information would be appreciated. I tried playing with it for a couple of hours yesterday and had a couple of co-workers try to help. We are all under the assumption that this should be working.
    Thank You!!
    Jesse
    Solved!

    Hi jesserzamora,
    Have you seen this link: http://digital.ni.com/public.nsf/allkb/331F2717DBCD1F858625758200745773?OpenDocument
    It discusses a similar failure with the IVI Power Supply step in TestStand. 
    Julia P.
