Varying input voltage refrigerator testing application

I am working on an application for refrigerator testing. One of the specs we need to comply with requires that we vary the input voltage (e.g., 127 VAC for a 120 VAC unit) through some sort of voltage regulator. The biggest unit we test has a 2 hp compressor with a max current draw of 18 amps. We would like to know what equipment we would need in order to accomplish this through the LabVIEW application we are working on. We could do this through GPIB, serial, etc. Any help on this matter will be appreciated.

You might be right. I was thinking of one small 1.5 HP controller (Commander SE) we are using that is powered by a single-phase 240 VAC supply. So I was thinking the motor was single-phase, but actually the controller does the voltage chopping to create a 3-phase supply to the motor.
It may still be possible to use one where you just use one phase relative to ground, but I'm not sure, and I'm not an electrical engineer, so I can't tell you how to do such a thing. It may be worthwhile to ask them whether it can be done, or to look around for other motor speed controller manufacturers that make a single-phase motor controller.
I'm sorry, but I don't know where else to look or be of any help.

Similar Messages

  • Variant Input Parameters with ActiveX Objects

    I've created an ActiveX library that exports a variety of objects using Visual C++ version 6.0. I'm testing the methods and properties of these objects using both LabVIEW and Visual Basic clients. The problem I'm facing is that when testing under LabVIEW, I can't access methods that expect a VARIANT input parameter. The C++ declaration of one such method looks like this:
    void CChassis::Configure (VARIANT config, int rate);
    I can create the ActiveX object with both LabVIEW and Visual Basic. I can invoke methods that do not use VARIANT input parameters using both as well; but, when I create a second ActiveX Object (a CConfiguration) and use the LabVIEW Variant operator, my attempt to invoke Configure fails with a "No Such Interface (E_NOINTERFACE)" error. The invocation fails identically if I wire the CConfiguration directly to the Variant input, as well. If I instead create a constant VARIANT as an input parameter, my Configure method gets called, although the method recognizes that I passed an invalid object.
    Using Visual Basic I can invoke the Configure method without problems while encapsulating a CConfiguration object within a Variant.
    So my question is, have I found a LabVIEW error, or is my approach flawed? If my approach is flawed, where have I gone wrong? Thanks in advance! I'm using LabVIEW version 7.0.

    Hi,
    Have you tried putting plain data, or just an object reference, into your variant parameter?
    I think plain data should work; for object references I don't know.
    Some time ago I also wrote a small ActiveX library in Delphi where I passed object references as parameters. I did not use VARIANT parameters, but the actual interface declaration types (like IMyObject**). This worked fine. I also used "dual" interfaces for the objects that get passed as parameters, but I don't really know whether this is helpful for VB.
    Finally, you can also use a workaround (see the sketch below):
    For each of your objects that you want to use from LabVIEW and pass as a parameter, you add a new property, "int32 Object_Handle".
    When an object is created, it asks an "ObjectHandleManager" (written by you) for its own unique Object_Handle, which it stores in its "Object_Handle" property.
    The ObjectHandleManager generates a new handle and remembers which handle is associated with the calling object's reference.
    Then, whenever you want to pass an object reference to one of your own library functions, you can pass the Object_Handle as an integer value instead. In the called function, you ask your ObjectHandleManager for the real object reference.
    But that's not a very comfortable workaround.
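    The handle-registry idea above is language-agnostic; here is a minimal sketch of it in Java, purely for illustration (the original library is C++/ActiveX, and the class and method names below are hypothetical):
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.atomic.AtomicInteger;
    // Hypothetical registry that maps small integer handles to live object references,
    // so a caller that cannot pass an object reference directly can pass an int instead.
    final class ObjectHandleManager {
        private static final AtomicInteger nextHandle = new AtomicInteger(1);
        private static final Map<Integer, Object> handles = new ConcurrentHashMap<>();
        // Called by an object when it is created; returns its unique Object_Handle.
        static int register(Object obj) {
            int handle = nextHandle.getAndIncrement();
            handles.put(handle, obj);
            return handle;
        }
        // Called inside a library function that received the handle as an integer.
        static Object resolve(int handle) {
            return handles.get(handle);
        }
        // Should be called when the object is destroyed, to avoid leaking references.
        static void unregister(int handle) {
            handles.remove(handle);
        }
    }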

  • Variant input to a simulation subsystem in LVSim 2.0

    I have a simulation subsystem that takes a variant as one of its inputs. This worked well for my purposes in LabVIEW 7.1. I've now upgraded to LabVIEW 8.0 and Simulation Module 2.0. The subsystem appears correct when opened, but when I use it in a simulation loop, the variant input appears as type void and connecting a variant to it causes an error. Is this an error with my VI, or are variants no longer allowed in simulation subsystems?

    Chris,
    I see what you mean. I looked through the release notes and Help files but did not see any mention of the variant data type.  I will see if R&D has a more definite answer.
    Best Regards,
    Doug M
    Applications Engineer
    National Instruments

  • PCI-6023E max input voltage

    IN THE NAME OF GOD
    I have a PCI-6023E card and a potentiometer. The output of the pot is differential.
    The (-) end is always at 13.5 V and the (+) end is between 5 and 22 V. I connected the output of the pot to the input of the card in differential mode, and I could measure the output of the pot (i.e., 13.5 - (5 to 22) V) successfully. Is this a correct connection to the card?
    The maximum working voltage of each input of the card, according to the manual, is 11 V, but my sensor (i.e., the pot) outputs are 13.5 V and 5 to 22 V!
    I used a BNC-2110 terminal block in differential mode for this experiment.
    Please explain to me how the card can successfully measure this input.
    Can this damage the card in the long term?
    thanks
    Edward. Hb.

    Thanks, Mr. Alan A.
    I am member "434343".
    I received your response to my question about the max input voltage of the PCI-6023E, but I have some questions.
    According to "Data Acquisition Specification - a Glossary" by Richard House (Application Note 092):
    "Maximum Working Voltage (signal + common mode) = the highest voltage level that can be input to the board without saturating the input."
    And according to your response:
    "measuring a signal with a higher common-mode voltage than that (11 V) might not be accurate"
    But in my experiment with a potentiometer (a floating differential source with the (-) end at 13.5 V and the (+) end at 5 to 22 V), I could successfully measure the signal without saturation and without error (max error is about +/- 0.02 V)!
    So I think either the maximum working voltage of the PCI-6023E is greater than 11 V,
    or my measurement has a small error.
    I think I don't need to use a signal conditioning circuit.
    Please help me.
    Thanks again.
    Edward Hb.

  • Please help on online test application

    Hi all, I'm developing an online test application. I'm done with the login page and with uploading the questions, and I chose to display all the questions to the user, but I don't know how to do the insert into the database, because I used a while loop to display the questions and options, so I'm confused. Please help.

    Thanks for your response. The thing is that I've already retrieved the questions and options from my SQL table. I just want to submit to the database, and it's not that I don't know how to do that; I just don't know how to submit multiple entries at once, because I have something like:
    <% while (rs.next()) {
           // one row per question: the question id plus the four option texts
           String id   = rs.getString(1);
           String opta = rs.getString(2);
           String optb = rs.getString(3);
           String optc = rs.getString(4);
           String optd = rs.getString(5); %>
        <input type="radio" name="<%= id %>" value="<%= opta %>" />
        <input type="radio" name="<%= id %>" value="<%= optb %>" />
        <input type="radio" name="<%= id %>" value="<%= optc %>" />
        <input type="radio" name="<%= id %>" value="<%= optd %>" />
    <% } %>
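    One common way to submit all of those selections at once (not from this thread, just a sketch) is to read every submitted radio-group parameter on the server side and write the rows in a single JDBC batch. The table and column names below (user_answer, question_id, selected_option) are hypothetical:
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.Map;
    // Hypothetical helper: 'answers' maps question id -> selected option value,
    // e.g. built from request.getParameterMap() in the submitting servlet or JSP.
    final class AnswerDao {
        static void saveAnswers(Connection conn, Map<String, String> answers) throws SQLException {
            String sql = "INSERT INTO user_answer (question_id, selected_option) VALUES (?, ?)";
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                for (Map.Entry<String, String> e : answers.entrySet()) {
                    ps.setString(1, e.getKey());     // radio group's name = question id
                    ps.setString(2, e.getValue());   // chosen option's value
                    ps.addBatch();                   // queue this row
                }
                ps.executeBatch();                   // send all rows in one round trip
            }
        }
    }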

  • SCXI-1120D input voltage protection:

    Hello All,
    I have a SCXI-1120D input voltage protection question:
    The data sheet specifies 250 Vrms. Does that mean signal plus CMV only? What if my signal itself has a large potential difference (thus not CMV), say 20 V? Will the SCXI-1120D be damaged?
    Thanks
    Feilong

    Hi Feilong,
    You are correct. The 250 Vrms refers to your signal plus the CMV. See Figure 3-2 on page 32 of the SCXI 1120D User Manual. The paragraph above Figure 3-2 also explains this. I hope this helps!
    Regards,
    Ima
    Applications Engineer
    National Instruments

  • How do I tell a variant input that a string is coming in?

    I'm just learning how to use LabVIEW, so my apologies if this is a simple/stupid question.
    I'm trying a variant on a DataSocket Read VI that was supplied with LabVIEW (I'm running version 7.1).  I've got the "data" output line tied into a string.  The "type (Variant)" input is connected to a 1-D array of unsigned bytes.  There is an error in the wire connecting the "data" output to the string element, though:
    You have connected two terminals of different types. The type of the source is 1-D array of unsigned byte [8-bit integer (0 to 255)].
    The type of the sink is string.
    All well and good.  How do I change the type of the source to string?  When I right-click on the source and select "Replace", I can't see anything in there that looks like a string that isn't a constant... that is, a string type.  I'd appreciate any insight you can lend me on this.

    In the Strings palette you can find the Empty String Constant or just a String Constant. Wire that to the type (Variant) input connector and it should produce the string output you want.
    Or you can insert a Byte Array To String function in the broken wire.
    Regards,
    André
    Using whatever version of LV the customer requires. (LV5.1-LV2012) (www.carya.nl)

  • I'm not able to set Analog Input voltages, etc

    OK,
    so I have an NI DAQ I/O card (NI PCI-6723, or something to that effect) and I'm trying to configure the channels for analog input signals.
    When I try to choose any kind of setting for an analog signal by:
    >> drop a DAQ Assistant VI onto the block diagram
    >>>> choose Analog Input
    >>>>>>> Voltage
    I get this message: "No supported devices found".
    I thought that since I have the I/O card, I should be able to configure channels for both input and output.
    I also have an ADAC card (from IOTECH) installed, but when I look in MAX, I only see the DAQ card.
    I'm really confused at this point. Can anyone break things down for me... what am I doing wrong?
    -r

    stuartG,
    you were right. What I'm using for the input is an ADAC 5501MF from IOTECH.
    Following their manual (http://www.iotech.com/productmanuals/adac_lvi.pdf), it says that the ADAC-LVi libraries can be found in the "Functions -> User Libraries" palette of LabVIEW.
    Does this mean that I have to install the ADAC-LVi driver in the LabVIEW User Libraries directory?
    If that's not the case, then I don't see why ADAC-LVi is not showing up in the LabVIEW User Libraries, since I've installed the driver correctly (in C:\Program Files\ADAC).
    Do you have any idea why?
    thanks
    -r

  • Interested in finding out about using LabVIEW in network testing applications.

    Network testing applications, such as:
    - General Ethernet and POS testing
    - BERT
    - RFC2544 back to back, Frame loss, Throughput
    - RFC2285
    - IP Multicast
    - QoS
    - Load Balancing
    - Wireless IP
    - BGP
    - Other Networking testing

    LabVIEW provides a lot of tools that make it easy for VIs to communicate and be controlled remotely over the network, such as VI Server, DataSocket, TCP/IP, OPC, etc. I do not have much experience in the environments you described, but general Internet and Ethernet programming with LabVIEW is very easy and interesting.

  • How to Implement a Time Limit Feature in an Online Test Application?

    I am creating an online test application. The time limit for a test is stored in the database.
    How do I implement the time limit such that when the test is started (the user clicks the Start button to go to the fragment containing the questions) the time left is shown, and the test ends (goes to the home page) when the timer reaches zero?
    Thanks

    Hi,
    A timestamp is a date and thus cannot be used directly to determine how many seconds have passed. What you need to do is take the difference between a saved timestamp and the current time (see the sketch below).
    http://docs.oracle.com/javase/1.4.2/docs/api/java/sql/Timestamp.html
    If you have two timestamps, calling getTime() on each and taking the difference gives you a value in milliseconds. To convert that to seconds as an int value, divide it by 1000.
    Frank
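    A minimal sketch of that calculation, assuming the test's start time was saved as a java.sql.Timestamp and the limit is stored in minutes (the class, method, and variable names are illustrative):
    import java.sql.Timestamp;
    final class TestTimer {
        // Returns how many whole seconds are left, given the saved start time and the
        // allowed duration in minutes; zero or a negative value means the test should end.
        static long secondsLeft(Timestamp startedAt, int limitMinutes) {
            long elapsedMillis = System.currentTimeMillis() - startedAt.getTime();
            long elapsedSeconds = elapsedMillis / 1000;        // millisecond difference -> seconds
            return (long) limitMinutes * 60 - elapsedSeconds;  // remaining seconds
        }
    }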

  • Reading a video input using a java application

    I want to read a live video input inside a Java application and extract frames from that video, but I don't have any knowledge of how to do this.
    If any of you know how to implement this, would you please send some sample application code to one of the following email addresses?
    [email protected]
    [email protected]
    [email protected]
    [email protected]

    I am not really sure how to solve this problem; you might be interested in checking this example on Accessing Individual Decoded Video Frames: http://java.sun.com/products/java-media/jmf/2.1.1/solutions/FrameAccess.html. A rough sketch of a related approach is included below.
    regards
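    For what it's worth, here is a rough, untested JMF sketch of grabbing single decoded frames via FrameGrabbingControl, which is a different route than the codec-based FrameAccess example linked above; whether that control is available depends on the capture source, and the "vfw://0" locator below is only a placeholder:
    import java.awt.Image;
    import javax.media.Buffer;
    import javax.media.Manager;
    import javax.media.MediaLocator;
    import javax.media.Player;
    import javax.media.control.FrameGrabbingControl;
    import javax.media.format.VideoFormat;
    import javax.media.util.BufferToImage;
    public class FrameGrabDemo {
        public static void main(String[] args) throws Exception {
            // "vfw://0" is a placeholder capture locator; substitute your own device or file URL.
            Player player = Manager.createRealizedPlayer(new MediaLocator("vfw://0"));
            player.start();
            // Ask the player for the frame-grabbing control; this may be null if unsupported.
            FrameGrabbingControl grabber =
                    (FrameGrabbingControl) player.getControl("javax.media.control.FrameGrabbingControl");
            if (grabber != null) {
                Buffer frame = grabber.grabFrame();   // one decoded video frame
                Image img = new BufferToImage((VideoFormat) frame.getFormat()).createImage(frame);
                System.out.println("Grabbed a frame: " + img);
            }
            player.close();
        }
    }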

  • Online test applications

    Hi Java experts,
    I am creating a Java online test application, and I got stuck in the following scenario.
    Scenario:
    My task is that while my application is running, other operating system commands and menus should be disabled.
    The operating system commands and menus should remain disabled until my application exits.
    I know it can be done; can anyone suggest how?
    Thanks in advance.

    Try to download an early version of the JVM

  • Unable to Create EJB Test Application ....

    Folks - I realize that I need to have my EJB offer remote interfaces in order to be able to set up an EJB Test Application from within SJSE6, but nonetheless the option is unavailable off the main menu for the logical Session Bean node.
    Any thoughts ?
    John Schmitt

    Hi,
    In order to generate the test app, you have to create a J2EE application and add the beans to it using the explorer; then the option should be enabled.
    Also, it is much better to use Studio to create the beans and all of the components of the application. Do not create the beans by hand; the IDE will create all of the interfaces for you automatically when you use it to create the beans.
    Regards Jirka

  • Why does my test application quit (disappear) during testing?

    Hello,
    I wrote a test application using CVI 2009 under Windows XP. The application runs fine, but when I repeat the test, the application quits around the 9th or 10th repeat. When I reload and run the application, it quits again around the 9th or 10th repeat. I do not have a clue what is going on. I have attached error logs A80428Error.bmp and A80428ErrorLog.bmp. Would someone please help?
    Thank you.
    Robert Mensah
    Attachments:
    A80428Error.zip (10 KB)
    A80428ErrorLog.zip (128 KB)

    Hello Robert,
    A couple of hints on your problem:
    1. Are you sure the system you are running on is stable? In the event viewer screenshot you posted I noticed *a lot* of application hang errors on several different programs (cvi.exe, which by the way appears to be release 8.51, explorer.exe, your app, NI Example Finder, and so on). Maybe you should consider reinstalling the system and see if that fixes some of the errors.
    2. Did you happen to develop your application with a CVI 2009 evaluation version? If so, keep in mind that every executable built with an eval version will hang after approximately 10 minutes of running (more or less the difference between the records at 3/3/2010 12:03 and 3/3/2010 12:14 in the system log).
    3. If none of the above applies, you could recompile the application with the "Generate map file" checkbox checked in the Build >> Target Settings panel: this will generate a report inside the cvibuild directory with mappings for all function calls in your application. By searching for the fault address shown in the error log inside the map file, you can narrow the faulty condition down to a specific function in your program.

  • Theme Testing Application

    Hello
    A couple of weeks ago I came across, in one of the messages, a link to a Theme Testing Application by Carl Blackstrom. I've lost the URL for this; can anyone point me in the right direction, please?
    Regards
    Pync

    Try http://htmldb.oracle.com/pls/otn/f?p=40722:101
    http://htmldb.oracle.com/pls/otn/f?p=24317:51 is also useful
