DC motor closed-loop control through LabVIEW using a USB-6008

Hello
I am doing a project in which I want to control the speed of a DC motor (PID control) through LabVIEW 2010. I have a DC motor, a drive unit that regulates the voltage to the motor from an analog input of 0 to 3.3 V (it can be used for both directions), and an NI USB-6008 board. The problem is that I can't feed the USB-6008 with the digital signal from my hall-effect speed sensor. Is it possible to control the motor with this setup? I could also try to read the pulses as an analog input and transform the frequency into rpm. I am then thinking of generating an analog signal to feed the motor drive. Any suggestion of a possibly better hardware setup would be more than helpful to me.
Sincerely
Jason Chaloulos

Hello Michael
Thanks for the reply. I came across those topics before, and all of them are trying to generate a PWM signal as the output. I want to use just an analog output signal, so timing on the output is not that important, I guess. I am struggling with getting the frequency from the digital signal that my hall-effect sensor generates. The maximum speed of the motor is 3000 rpm, and with my tooth wheel the maximum output frequency of the sensor will be 300 Hz, which I see is well within the limits of my NI board. Is there any tutorial or documentation that might help me with this one? Thank you in advance.
Kind regards
Jason
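
As a rough illustration of the approach discussed above (not tied to any particular driver API), here is a Python sketch of the two pieces: converting the measured pulse frequency to rpm, assuming 6 pulses per revolution (consistent with 300 Hz at 3000 rpm), and a clamped software PI loop that turns the speed error into a 0 to 3.3 V command for the drive. The gains are placeholders and would have to be tuned on the real hardware.

    # Sketch only: pulse frequency -> rpm, then a clamped PI speed loop
    # producing a 0-3.3 V analog command for the motor drive.
    PULSES_PER_REV = 6          # teeth on the sensor wheel (assumption)
    KP, KI = 0.002, 0.01        # placeholder gains, tune on the real rig
    V_MIN, V_MAX = 0.0, 3.3     # input range accepted by the drive unit

    def frequency_to_rpm(freq_hz):
        """Hall-sensor pulse frequency (Hz) -> shaft speed (rpm)."""
        return freq_hz * 60.0 / PULSES_PER_REV

    def pi_step(setpoint_rpm, measured_rpm, integral, dt):
        """One PI update; returns (command_voltage, new_integral)."""
        error = setpoint_rpm - measured_rpm
        integral += error * dt
        command = KP * error + KI * integral
        clamped = min(max(command, V_MIN), V_MAX)
        if clamped != command:
            integral -= error * dt   # crude anti-windup when saturated
        return clamped, integral

Each loop iteration would read the pulse frequency, convert it to rpm, run one PI step, and write the returned voltage to an AO channel of the USB-6008.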

Similar Messages

  • Stepper motor closed loop control using 7344

    I have some question about the closed loop control of stepper motor using 7344. From the NI website I got that closed loop steppers work differently from closed loop servos. Instead of adjusting the output on each PID iteration like a servo system, closed loop steppers will do a pull-in move at the end of a move to adjust for any difference between the target position and the encoder feedback. By default, it will attempt the pull-in move three times.
    Now, for example, I have five points: (0,0), (10,20), (30,40), (31,60), (50,65). The requested time interval I set is 10 ms. This means that in the first 10 ms the machine should move from (0,0) to (10,20). But for some reason the machine cannot reach (10,20); it moves to (8,16) and the first 10 ms passes. In the second 10 ms, how does it move? If it moves from (8,16) to (30,40), the following error would accumulate. In practice, it does not accumulate. Can anyone explain the closed loop control of stepper motors? Thanks a lot!

    Requested Interval is additional data for the Position buffer type. The Position buffer type requires a Requested Interval parameter, and indicates the time between contouring data points in milliseconds. For all other buffer types, the Requested Interval parameter is ignored.
    The controller will use the closest value it can that is greater than or equal to the interval value you requested. Your time interval must be a whole multiple of the PID rate.
    For example, given a PID rate of 250 µs (0.25 ms), a time interval of 11.2 ms between points is physically impossible, so a call to configure the buffer with an interval of 11.2 causes the buffer to actually be configured for 11.25 ms, since 11.25 ms is the nearest possible interval greater than 11.2 ms. (NI's documentation lists the valid intervals for each PID rate.)
    My question is this. For example, I have five points: (0,0), (10,20), (30,40), (31,60), (50,65), and I set the requested time interval to 10 ms. This means that in the first 10 ms the machine should move from (0,0) to (10,20). If it is impossible for the machine to reach (10,20) in that time, there are two ways to handle it. The first is that the interval stays 10 ms, the machine only reaches, say, (8,16) within those 10 ms, and (8,16) becomes the start point of the second move towards (30,40). The second is that the interval is extended, say it takes 14 ms to reach (10,20), and after 14 ms, (10,20) is the new start point for the second move. I think it works the second way; if not, the following error would accumulate. Can anyone explain the closed loop control of stepper motors in detail? Thanks a lot!
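
    As a small illustration of the rounding rule quoted above, here is a Python sketch (purely illustrative) of how a requested contouring interval maps onto the nearest whole multiple of the PID rate:

        import math

        def actual_interval(requested_ms, pid_rate_ms):
            """Round a requested contouring interval up to the nearest
            whole multiple of the PID rate, as the controller does."""
            return math.ceil(requested_ms / pid_rate_ms) * pid_rate_ms

        # With a 0.25 ms (250 us) PID rate, requesting 11.2 ms gives 11.25 ms.
        print(actual_interval(11.2, 0.25))   # 11.25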

  • Closed loop control of DC motor drive using PCI 6251

    Hi
    I am using an NI PCI-6251 for closed loop control of a DC drive (shunt motor, 220 V, 0.5 HP, 2.5 A rated, 1500 rpm).
    The DC motor is fed from a bidirectional DC-DC converter. This converter works in boost mode (forward motoring) and buck mode (braking mode/battery charging).
    The converter is built using a Mitsubishi IGBT module. A 48 V battery bank is connected at the low-voltage side of the converter and the motor at the high-voltage side.
    An IR sensor is used to measure the speed of the motor and a Hall current sensor is used to sense the converter current.
    In the control part I have an inner current control loop (PWM controller) and an outer speed control loop; both controllers are PI controllers. To measure the speed I used the counter in the DAQ Assistant, configured for frequency, and this frequency is converted to rpm in LabVIEW. After wiring the speed and current signals to the controllers I started tuning the PI gains, but at start-up the duty cycle was 100%, so the IGBT was damaged immediately. How do I tune the controllers during closed loop operation? The switching frequency of the converter is 20 kHz.
    Can anybody suggest how to run the DC drive in closed loop?
    Thank you.
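
    One common way to avoid the 100% duty-cycle start-up problem described above is to clamp and slew-limit the output of the inner current PI loop so the duty cycle can only ramp up gradually. The following is only a Python sketch of that idea, with placeholder gains and limits, not the poster's LabVIEW code:

        DUTY_MAX = 0.9          # hard ceiling below 100% duty
        DUTY_SLEW = 0.05        # max duty-cycle change per control period
        KP, KI = 0.05, 2.0      # placeholder gains, to be tuned

        class CurrentPI:
            def __init__(self):
                self.integral = 0.0
                self.duty = 0.0

            def update(self, current_setpoint, current_measured, dt):
                error = current_setpoint - current_measured
                self.integral += error * dt
                raw = KP * error + KI * self.integral
                # Limit how fast the duty cycle may change, then clamp it.
                raw = min(max(raw, self.duty - DUTY_SLEW), self.duty + DUTY_SLEW)
                raw = min(max(raw, 0.0), DUTY_MAX)
                self.duty = raw
                return raw

    Starting with low gains, a reduced DUTY_MAX and a small slew limit lets the loops be tuned without ever commanding full duty at power-up.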

    Hi Premenanda
    The frequency data received from the DAQ Assistant has units of Hz, or in terms of motor speed, rps (rotations per second). I can see that you are trying to convert that to rpm (rotations per minute), which can be done by dividing the rps value by 60. Instead, you are multiplying it by 60 and also comparing the result with 2000; the multiplied value is displayed in your program only if it is smaller than 2000, which is what is causing the problem here (as seen in the file 'Original code.png').
    I have attached a file named 'modified code.png' that shows how the conversion from rps to rpm is done. Please make the required changes in your code and let me know if it works.
    Attachments:
    modified code.png ‏18 KB
    Original code.png ‏15 KB

  • Closed loop control of step motors possible with 7344?

    Hi All
    I have been looking around here for a while for an answer to this question.
    For me, a closed loop control system is correcting for following errors all the time, also when the move is complete.
    If the load on a system changes, there will often be a need for compensation in order to keep the position right. Then you need a closed loop control system.
    I have made a system using DC motors with a 7342 controller. We are in the process of getting a new, similar system, where the manufacturer chose step motors in the belief that they can be used in closed loop.
    Is it really true that the NI version of closed loop only performs what you call a "pull-in" at the end of the move in order to correct for lost steps or flexibility in the connection between motor and object/encoder?
    I would call this "backlash compensation" or something like that.
    Why don't you make a real closed loop option? The control voltage from your servo motor control could be fed into a voltage-to-frequency converter with a matching sign signal.
    This shouldn't be too complicated to accomplish.
    /b

    You are right. NI motion control boards don't apply a control algorithm during moves when configured in closed loop stepper mode. In this mode the board monitors the following error during the move, generates an error when the following error limit is exceeded, and starts the pull-in moves at the end of the move if necessary.
    If you really want to do PID control with stepper motors you still
    could connect an external V/f converter to the DAC outputs of the board
    and control the axes like DC motor axes. Some vendors implement this
    feature on their board but NI has decided not to do that as we don't
    think that this approach provides a good solution. Here are some reasons:
    The most important issue is the fact that you could easily exceed
    the maximum rate of change of the motor frequency in the case that the
    following error increases during the move. For a DC motor this would mean that the
    voltage of the controller increases to compensate for this error and
    the motor will follow - probably with some delay, but it will follow. For
    the stepper this could mean that the acceleration becomes so high
    that the motor stops immediately as it can't follow with a delay like a DC motor. What
    should the controller do then? Restart the move automatically? Generate
    an error? There is a big chance that you end up with a system moving in a stop and go mode.
    The second issue is related to the control algorithm itself. The transfer function of a stepper motor is not really linear, so it will be quite hard to find good tuning parameters. A DC motor can be tuned by analyzing the step response and the Bode plot. For the reasons described for the first issue, this won't work for a stepper, so how should you tune it?
    I have talked to some motion control engineers who have tried to control steppers with a PID algorithm, and they all agree that this is a real pain and is not recommended.
    So if your system requirements involve online following error compensation I strongly recommend using a DC motor.
    Best regards,
    Jochen Klier
    National Instruments Germany
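
    To put the first issue above in concrete terms: if the loop were closed through an external V/f converter, the commanded step frequency would at least need its rate of change limited, because a growing following error could otherwise demand an acceleration the motor cannot deliver and it would stall. A rough Python sketch of such a limiter, with an illustrative acceleration limit:

        MAX_STEP_ACCEL = 20000.0   # steps/s^2 the motor/load can follow (example value)

        def limit_step_frequency(requested_hz, previous_hz, dt):
            """Clamp the change in commanded step frequency to MAX_STEP_ACCEL."""
            max_delta = MAX_STEP_ACCEL * dt
            delta = requested_hz - previous_hz
            delta = min(max(delta, -max_delta), max_delta)
            return previous_hz + delta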

  • Creating a simple closed loop control

    Hi everyone,
    I need a bit of help creating a (very) simple closed loop control.
    Note: I am using 3 Measurement Computing boards: one USB-2416, and two USB-TC's. Since Measurement Computing has their own driver (called ULx), it might be a bit more difficult to understand than the NI drivers, but keep in mind ULx is the equivalent to DAQmx.
    I need to output a 0 or 1 binary value based on measurements from 4 voltage input channels compared to constants. I.e., if a voltage measurement < 3.5 (constant), send a boolean value (see a, b, c, d in the attached photo), which in turn gets sent to the ULx Write VI, and that outputs the binary value of 0 or 1. The problem I've run into is that I need to compare single channels from different boards, which gives me single boolean values for each channel. The ULx Write VI only accepts boolean arrays. Is there a way (and there probably is, I just haven't been able to find it) of arranging these lone boolean values into a boolean array?
    (see attached file for what I have so far)
    PS. I am a COMPLETE rookie, only been using LV for a week or so, so please explain everything as if I have no clue what's going on.
    Thanks in advance!

    Hi,
    the attached example I've made should demonstrate what you need to do.
    Some things to think about:
    1. It seems that the ULx VIs return arrays of signals. If so, you can select signals with "Index Array" as done in my example. That consumes less space and does not change the type of the signals to variant.
    2. If you compare the signals to your constants you will receive boolean arrays, because every element of the measured signal is compared to the constant (see my example). You may then decide how to process that info - another hooray for signals :-)
    Regards
    Florian
    Attachments:
    Example01.vi ‏24 KB
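
    For the original question of packing single comparisons into one boolean array, the logic itself is simply this (shown in Python with placeholder thresholds; in LabVIEW the same thing is done by wiring the individual comparison results into a Build Array node before the ULx Write VI):

        thresholds = [3.5, 3.5, 2.0, 1.0]          # one constant per input channel

        def build_output_states(readings):
            """readings: list of 4 channel voltages -> list of 4 booleans."""
            return [value < limit for value, limit in zip(readings, thresholds)]

        print(build_output_states([3.2, 4.0, 1.5, 0.8]))   # [True, False, True, True]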

  • Connecting Agilent DSO3062A Oscilloscope with LabVIEW using USB

    I am trying to connect an Agilent DSO3062A Oscilloscope to LabVIEW using USB, but the device is not recognized by MAX. I have installed the drivers for the scope and it works fine with the scope software. The problem is that LabVIEW doesn't recognize the scope. Please tell me the procedure for setting up the scope with LabVIEW 8.2. I am using Windows XP.

    Hi there,
    I think the issue is that the DSO3062A will not be recognized by LabVIEW as a GPIB instrument through the USB connection. If you check out the 3000 series manual (http://cp.literature.agilent.com/litweb/pdf/D3000-97016.pdf) you'll notice on page 8 it says you can't use the USB for programming; it can only be used with their scope software. You need the N2861 module installed on the back to connect via GPIB, and then a GPIB interface to your computer. I hope that helps.
    -jmart

  • What amplifier is recommended for my K-type thermocouple using a USB-6008

    I am currently doing a project on a temperature control system using a USB-6008. According to the spec, the minimum AI range is +/-1 V, and the thermocouple gives voltages of just a few mV, so what kind of amplifier would be recommended? Is there an amplifier chipset available?

    Analog Devices AD594/AD595 Monolithic Thermocouple Amplifiers (Type J / Type K)
    This would give you a conversion of 10 mV per °C; using a single +5 V supply you could get to a couple of hundred °C.
    I get 11 bits of resolution available for the +/-4 V range (11 because I can't see a negative supply available, so we are only using one side of the differential range).
    That's a resolution of 2048 counts over 4 V, giving 0.001953 V per count.
    The devices above give 10 mV per °C, so you would have about 0.2 °C resolution, a maximum temperature of around 400 °C, and a minimum temperature of 0 °C.
    The device draws about 1 mW, so you should have no problems with power consumption. (Hint: earth/ground loops might be a problem.)
    I should add that whilst I have used this device many hundreds of times, I have never applied it to the USB 6008.
    http://www.analog.com/en/prod/0%2C2877%2CAD594%2C00.html
    Attachments:
    46185785AD594_fbs.gif ‏17 KB
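
    As a sanity check on the numbers above, here is a short Python sketch of the conversion and the resolution arithmetic (the 10 mV/°C scale is the AD594/AD595 output; the rest follows from using half of the USB-6008's +/-4 V range):

        MV_PER_DEG_C = 10.0

        def voltage_to_temp_c(volts):
            """AD595 output voltage -> temperature in degrees C."""
            return volts * 1000.0 / MV_PER_DEG_C

        counts = 2 ** 11                 # usable half of the 12-bit +/-4 V range
        volts_per_count = 4.0 / counts   # ~0.001953 V
        deg_c_per_count = volts_per_count * 1000.0 / MV_PER_DEG_C

        print(voltage_to_temp_c(0.25))     # 25.0 degrees C
        print(round(deg_c_per_count, 3))   # ~0.195 degrees C per count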

  • Using USB-6008 software timing to generate a waveform

    I am using a USB-6008 device. Based on its User Guide, this device only supports software-timed output, and the maximum update rate is 150 Hz.
    By placing a 1 ms time delay VI, I get a 1.660 Hz sine signal. Is placing a time delay VI the wrong way to do software timing?
    I would like to generate a 60 Hz signal using software-timed output on the USB-6008 device. How should I do this?
    Thanks a lot.

    DephinTW,
    The update speed of the USB-6008 is largely dependent on the speed of your computer as well as anything else on your USB bus. If you are using an older computer or have other USB devices on the bus, you may not achieve this rate. In order to output a 60 Hz signal, the minimum output sampling rate is 120 Hz. This corresponds to an 8.3 ms delay between samples. Instead of using the "Wait" VI, you should try using the "Wait Until Next ms Multiple" VI. Place this in parallel with your generation. This will synchronize the output with multiples of the PC clock, rather than waiting for a fixed amount of time, irrespective of code and USB overhead.
    Hope this helps,
    Ryan V.
    National Instruments
    Ryan Verret
    Product Marketing Engineer
    Signal Generators
    National Instruments
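
    For what it's worth, the same deadline-based idea looks like this as a plain Python sketch; write_analog_output() is a placeholder for whatever actually writes the AO sample. Note that at exactly 120 S/s, two samples per cycle with a 90-degree phase offset just alternate between +A and -A, so the held DAC output is effectively a 60 Hz square wave; the point of the sketch is only the timing scheme.

        import math
        import time

        SAMPLE_RATE = 120.0            # minimum update rate for a 60 Hz output
        PERIOD = 1.0 / SAMPLE_RATE     # ~8.3 ms between updates

        def write_analog_output(volts):
            pass   # placeholder: write one software-timed AO sample

        def generate(duration_s=1.0, amplitude=1.0):
            start = time.monotonic()
            deadline = start
            while deadline - start < duration_s:
                t = deadline - start
                write_analog_output(amplitude * math.sin(2 * math.pi * 60.0 * t + math.pi / 2))
                deadline += PERIOD
                # Sleep until the next absolute deadline rather than for a fixed
                # delay, so code and USB overhead do not accumulate as error.
                time.sleep(max(0.0, deadline - time.monotonic()))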

  • Problem using PID toolkit and control design toolkit to design the closed loop system. (for DC motor closed loop)

    Hi, I am facing some problems while upgrading a project. The project is an open-loop system; it only goes as far as controlling the speed of the motor, detecting the speed of the motor, and switching the direction of the motor while controlling it (the motor is a 12 VDC motor).
    To improve the project, a closed-loop DC motor control system will be implemented to correct the error in the motor's speed and maintain the speed at the initial setting. I would like to use the PID control method for the closed loop system.
    I have already installed the PID toolkit V8.2 and the Control Design toolkit V2.1.2, and I'm using a USB-6221 DAQ and LabVIEW 8.5.
    Below is my problem.
    http://img177.imageshack.us/my.php?image=howtocreatethesetpointnx2.jpg
    Questions: 1. How do I generate an icon for the set point? My setpoint is a duty cycle...
               2. How do I feed the output duty cycle back to the summing junction?
    Below is my basic concept:
    http://img237.imageshack.us/my.php?image=closedloopbz5.jpg
    Problem 2:
    http://img357.imageshack.us/my.php?image=problem1yk2.jpg
    Question: Why can't the PID toolkit icon be wired to CD Series.vi?
    Below is my original program...
    http://rapidshare.com/files/140538836/pwm_generate-final_PSMII.vi.html

    Hi Cyrus
    Have you had the opportunity to see our Developer Zone article on the PID toolkit? This article also has sample code at the bottom that may help you in developing your application. I have also linked below knowledge bases on setting up a setpoint profile and generating a PWM signal from a digital output line.
    PID toolkit
    http://zone.ni.com/devzone/cda/tut/p/id/6440 
    How to generate a set point profile:
    http://digital.ni.com/public.nsf/allkb/125F27AC143B6AFD86256C2B0004A4DC?OpenDocument
     How to generate a PWM on a digital output line:
    http://digital.ni.com/public.nsf/allkb/1561D31534F07D608625727900391114?OpenDocument 
    Thank You
    Eric Reid
    National Instruments
    Motion R&D
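
    Since the last link above covers generating a PWM signal on a digital output line, here is a rough software-timed version of the duty-cycle bookkeeping as a Python sketch; set_digital_line() is a placeholder for the actual digital write. Software timing only suits slow PWM (a few Hz at best), and on a 6221 a counter output would normally be used for fast, hardware-timed PWM instead.

        import time

        def set_digital_line(state):
            pass   # placeholder: write one boolean to a digital output line

        def software_pwm(duty_cycle, period_s=0.5, cycles=20):
            """Software-timed PWM: high for duty_cycle*period, low for the rest."""
            duty_cycle = min(max(duty_cycle, 0.0), 1.0)
            for _ in range(cycles):
                set_digital_line(True)
                time.sleep(duty_cycle * period_s)
                set_digital_line(False)
                time.sleep((1.0 - duty_cycle) * period_s)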

  • Tension control in labview using PCI 7354 Motion Control Board

    I want to make a program for web tension control of a roll-to-roll system in LabVIEW using a PCI-7354 motion control board. Is there any interface program that can convert my controller output into a format that can be used directly as the input of the PCI-7354 (motion controller board) for tension control?
    Thank you!

    Thanks for your reply. I have already made the code for a straight-line move and the motor moves successfully, but now I want to add an H-bridge to be able to reverse the direction of the motor. For your information, I only use a PCI-7340 controller and a UMI-7774, where the controller is connected to the UMI-7774, the UMI is connected to a PWM circuit, and the PWM circuit is connected to the DC motor. What do you think I can do?

  • Disable "Close" button of command prompt through LabVIEW using Win32 APIs

    Hello all,
    I am trying to disable the close button of a third-party console application that I am invoking through LabVIEW. I tried using GetSystemMenu() and DeleteSystemMenu() from user32.dll, but somewhere I am doing it wrong.
    Can anyone suggest a solution to this?
    Thanks!
    FraggerFox
    -FraggerFox!
    Certified LabVIEW Architect, Certified TestStand Developer
    "What you think today is what you live tomorrow"

    Do you have the handle to the window? Offhand, I don't know how to delete the menu item, but here is some C# code that I used to grey out the close button of a window in another project:
            [DllImport("user32.dll")]
            private static extern IntPtr GetSystemMenu(IntPtr hWnd, bool bRevert);
            [DllImport("user32.dll")]
            private static extern bool EnableMenuItem(IntPtr hMenu, uint uIDEnableItem,
                uint uEnable);

            private const UInt32 SC_CLOSE = 0xF060;        // system-menu "Close" item
            private const UInt32 MF_BYCOMMAND = 0x00000000;
            private const UInt32 MF_ENABLED = 0x00000000;
            private const UInt32 MF_GRAYED = 0x00000001;
            private const UInt32 MF_DISABLED = 0x00000002;

            // Enable or disable the Close item of the window's system menu.
            // Handle is the target window's handle (here, the form's own handle).
            private void EnableClose(bool enable)
            {
                IntPtr pSysMenu = GetSystemMenu(Handle, false);
                if (pSysMenu != IntPtr.Zero)
                {
                    EnableMenuItem(pSysMenu, SC_CLOSE,
                        MF_BYCOMMAND | (enable ? MF_ENABLED : MF_DISABLED));
                }
            }
    This was fairly easy to convert over once I had the handle (this is from a library I picked up somewhere).
    Hope this helps.
    A
    Attachments:
    WINUTIL.LLB ‏609 KB
    DisableCloseButton.vi ‏44 KB

  • Can I use USB 6008 with LabVIEW 6.1 RT?

    Hi there,
    I've been using LabVIEW 6.1 RT for my applications and so far I haven't felt the need to upgrade.
    However, I decided to try the USB-6008 and I'm learning the hard way that this device will only work with NI-DAQmx and apparently not with the latest legacy NI-DAQ 7.4.2. In fact, I can't see the device in MAX 4.1.
    Is there any way I can use the USB-6008 with my 6.1 RT version of LabVIEW, or should I send the thing back to NI?
    I will appreciate your guidance on this issue.
    Thanks a lot,
    at

    Hi at -
    Allisso is right. There is no way to use the USB-6008 with LV 6.1, because neither of its drivers (DAQmx and DAQmx Base) supports that version of LV. You'll have to either upgrade to a current version of LV or work with another device. I recommend contacting your sales rep at NI for help.
    David Staab, CLA
    Staff Systems Engineer
    National Instruments

  • Controlling a heating/cooling element with LabVIEW and a USB-6008 DAQ card

    Greetings All
    I'm looking for a heating/cooling type of element that I can control with LabVIEW and my USB-6008 card. Heating/cooling will just be for clean water, and the temperature range will be from 0 °C to 100 °C.
    Thanks
    The heating and cooling elements can be separate products. Any recommendations?

    A simple kettle element will of course be sufficient for up to 100 °C.
    To get the temperature down you need a cooling system; what springs readily to mind, of course, is a refrigerator.
    Just a note of warning: water and electricity are dangerous bedfellows.
    On that note, perhaps one of those vortex air units would be safer - they blow hot in one direction and cold in the other, nominally to 100 °C.
    see: -
    http://www.airtxinternational.com/how_vortex_tubes_work.php

  • Switching pneumatic solenoid valves using USB 6008

    Hi, 
    I am a student working on a project in which I need to build a program to control/switch pneumatic solenoid valves to operate a pneumatic cylinder using the USB-6008. I am very new to LabVIEW and DAQ.
    The solenoid has two ports, one for extending the stroke arm of the pneumatic cylinder and the other for collapsing the stroke arm. As the solenoids work off a 12 V supply, I have built a Darlington pair as a firing circuit for them.
    I have built different programs, but I'm only able to send a signal to extend the stroke arm and then manually switch the wiring to collapse the stroke arm. So I have a problem building a program that sends a signal to my circuit in such a way that the ports of the solenoid go ON and OFF after a certain time.
    I really appreciate your help
    Elmira
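
    One way to get the timed ON/OFF behaviour is to write to two digital output lines of the USB-6008 with software timing. The sketch below assumes the NI-DAQmx Python package (nidaqmx) and two hypothetical lines, Dev1/port0/line0 for the extend port and Dev1/port0/line1 for the retract port, driving the Darlington firing circuit; the timings are placeholders. In LabVIEW the equivalent is two DAQmx digital writes separated by waits inside a loop.

        import time
        import nidaqmx

        EXTEND_TIME_S = 2.0      # placeholder dwell times
        RETRACT_TIME_S = 2.0

        with nidaqmx.Task() as task:
            task.do_channels.add_do_chan("Dev1/port0/line0")   # extend port (assumed wiring)
            task.do_channels.add_do_chan("Dev1/port0/line1")   # retract port (assumed wiring)

            for _ in range(5):                 # five extend/retract cycles
                task.write([True, False])      # energise extend, de-energise retract
                time.sleep(EXTEND_TIME_S)
                task.write([False, True])      # energise retract, de-energise extend
                time.sleep(RETRACT_TIME_S)

            task.write([False, False])         # leave both ports off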

    Dear Lynn,
    Thanks for the reply.
    I'm going to try to describe my system better this time.
    I'm trying to create a pitch motion using a pneumatic system consisting of an air compressor, a solenoid valve, and a cylinder.
    The compressor is made by PowerFist and is capable of storing 5 gallons of air at a rated pressure of 125 psi, and can deliver 2.5 cfm (cubic feet per minute) at 90 psi. The compressed air is sent directly to the solenoid valve. It operates using an electric motor running from a standard 110 V power outlet. The cylinder has a stroke length of 19.7" and a bore size of 1.6", capable of operating at up to 130 psi.
    The solenoid valve used is 5-way, 3-position. This means there are five different paths the air can take inside the solenoid, and the solenoid can be set in three positions depending on the movement of the mechanical core inside. When air is transferred into the valve, the mechanical core moves, directing the air to the appropriate ports of the solenoid. The two outputs of the solenoid are sent directly to the pneumatic cylinder, where the extension and collapsing movements are performed. The solenoid works off a 12 V supply, so I'm using a Darlington pair transistor as the firing circuit.
    Please see the attachment for the pneumatic diagram.
    Thank you
    Elmira
    Attachments:
    pneumatic diagram.jpg ‏33 KB

  • Cannot use USB-6008 in LabVIEW... it shows up in Devices on the computer, however

    I was in class the other night and was running a VI. We went to lab and, without shutting down LabVIEW, my partner plugged in my USB-6008 DAQ. Since that time, the 6008 will not show up; I get a message about having no supported devices. I am using DAQmx 8.0, which came with my student version, and I have reinstalled LabVIEW 8.0 and DAQmx 8.0, but I still cannot get LabVIEW to recognize the DAQ. When I plug it in, my hardware profiles recognize that it is hooked up, but LabVIEW will not. I have also uninstalled it, plugged it in, and reinstalled it using the automatic hardware install wizard, and still nothing in LabVIEW. The DAQ has the flickering green LED and works on other laptops. Also, my flash drive works correctly in all of the USB ports, so I do not think that could be the problem. I would really like to get this working again, but as I said, the DAQ is not recognized by MAX or DAQmx inside LabVIEW. Any help would be greatly appreciated.
    Thanks

    Hi,
    You might want to try MSI-Blast.
    Have a look at this thread.
    http://forums.ni.com/ni/board/message?board.id=170&message.id=108288&query.id=113978#M108288
    Patrick Allen
