Servo Motor and NI Card

Hello,
I'm pretty familiar with LabVIEW, especially with the vision part of it, but I'm not familiar at all with servo motors.
Problem statement:
I'd like to make a VI that controls a servo motor that will rotate a screw every x amount of time (in both directions, obviously).
Limitations:
The screw turning should be as smooth as possible, as it is a very delicate instrument and experiment.
As stated above, I don't know the first thing about servo motors and NI cards, and I would like to get some advice from you. Where should I begin? What are your recommendations for NI cards/motors?
I don't think the software aspect will be very challenging; it's just two nested loops: an outer while loop that controls the whole experiment time and an inner loop that controls the turning function.
Unless I underestimate the complexity of setting up a servo motor VI.
I'd be happy to hear any thoughts/ideas you might have.
Thank you,
A
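
As a rough illustration of the two-loop structure described above, here is a plain C stand-in for the LabVIEW loops, with assumed timings and a placeholder motion call:

/* Sketch only: outer loop bounded by the total experiment time, inner step
 * that turns the screw every x seconds, alternating direction each time. */
#include <stdbool.h>
#include <time.h>
#include <unistd.h>

static void move_screw(bool clockwise)
{
    (void)clockwise;                      /* placeholder for the actual motion command */
}

int main(void)
{
    const double experiment_s = 3600.0;   /* total run time (example)       */
    const unsigned interval_s = 60;       /* turn the screw every x seconds */
    bool clockwise = true;
    time_t start = time(NULL);

    while (difftime(time(NULL), start) < experiment_s) {   /* whole experiment */
        move_screw(clockwise);            /* one smooth turn                 */
        clockwise = !clockwise;           /* reverse direction next time     */
        sleep(interval_s);                /* wait until the next turn        */
    }
    return 0;
}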

Hey A,
I would start by checking this website.
I would recommend using a PCI-7332 card with a UMI-7772 and an AKD servo drive (P00606). Then you can select the motor from here. Depending on the size of your application, choose a motor, and then make sure that the drive has enough current to power it.
Regards,
A. Zaatari
National Instruments
Applications Engineer

Similar Messages

  • Stepper motor and NI6025E card

    I want to control a stepper motor with a 6025E National Instruments card. My stepper motor is driven by a microstep controller.
    Has anybody already done this? I am interested in some examples...
    Please HELP me...
    Frederic
    [email protected]

    I don't have any examples but I have done this before and it's not that difficult in principle. Getting precise timing can be tricky, and you'll have to use counters rather than the regular DIO ports for that kind of precision.
    I had an MC3479 stepper motor driver to control each motor (there were 2). The way I did it was to use a counter for the clock signal and the DIO ports for control. I generated a gated pulse train on one of the counters, with a DIO pin (DIO 0) as the gate. The Direction, Full/Half step, and other control lines were also connected to the DIO port. I just wrote out whatever data word I needed to set up the motion I wanted, with pin 0 high to let the clock move the motor. The motor had to be stopped by clearing bit 0, which I did after a timing delay that was changed in the program. It required a little calculation to match the time delay to the frequency of the pulse train to get the number of steps I needed, and it was only approximate; I wasn't worried about exact position as much as being able to adjust the position dynamically.
    For precise position you might try using just a digital line to increment the clock. This is simplest, but the timing will depend on your program. You can also use the second counter to generate a very precise gate signal, which gives accuracy as well as time-specific rotation, but it ties up that counter and is programmatically more complex.
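
    As a rough illustration of the timing calculation mentioned above (all numbers are examples): with a gated pulse train the motor takes one step per clock pulse, so the gate time for a desired move is simply the step count divided by the pulse frequency.

    /* Sketch only: how long to hold the gate line high for a desired number
     * of steps, given the counter-generated pulse frequency. */
    #include <stdio.h>

    int main(void)
    {
        double pulse_hz     = 500.0;    /* clock frequency from the counter   */
        double steps_wanted = 200.0;    /* e.g. one rev of a 1.8-degree motor */
        double gate_s = steps_wanted / pulse_hz;
        printf("hold the gate high for ~%.3f s (~%.0f steps)\n",
               gate_s, gate_s * pulse_hz);
        return 0;
    }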

  • CompactRIO restart when run VI of NI 9505 and servo motor

    I plan to use an NI 9505 to control a servo motor, and the cables are connected according to the NI 9505 manual (the M+ and M- ports are connected directly to the servo motor). I can read the encoder value from a VI, but when I try to control the servo motor with even a very simple VI, an error occurs and the CompactRIO restarts. When I disconnect the motor from the 9505 module, the voltage between the M+ and M- ports measures 24 V. I think the motor current through the 9505 module may be too high, but why does this happen, and how can I solve it? Can anybody help me? Thank you very much.

    If you believe that your motor pulls more than 5 A, take a look at this from the NI 9505 product page. It's the first bullet point.
    Continuous current of up to 5 A at 40 °C (or 1 A at 70 °C) at 30 V - for higher power add NI 9931
    The NI 9931 will allow the 9505 to supply up to 7.3 A.
    www.movimed.com - Custom Imaging Solutions

  • Servo motor parameter estimation

    Hello,
    I am new to the System Identification Toolkit in LabVIEW. I want to perform DC servo motor parameter estimation using system identification. Is it possible to connect the hardware (DC servo motor) and let LabVIEW estimate the parameters automatically? If not, how should I proceed?
    If I am not wrong, we need to give a response signal and a stimulus signal as input and obtain the parameters, right?
    Also, which algorithm should I adopt? ARX, ARMAX, General-Linear, Box-Jenkins, etc.?
    Thank you.

    Hi,
    National Instruments' motion page has some useful information about the motors that we sell. If you are interested in microstepping, we sell stepper motor drives as well. This information should help you decide on a motor that fits your needs.
    Wes P
    Applications Engineer
    National Instruments
    Certified LabVIEW Developer
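
    As a side note on the stimulus/response question above: ARX estimation simply fits a difference-equation model to logged stimulus and response data by least squares. Below is a minimal first-order sketch in C (illustrative data and a hand-rolled 2x2 solver, not the System Identification Toolkit):

    /* Fit y[k] = -a*y[k-1] + b*u[k-1] by ordinary least squares. */
    #include <stdio.h>

    static void arx1_fit(const double *u, const double *y, int n, double *a, double *b)
    {
        /* Normal equations (Phi'Phi) theta = Phi'y with phi[k] = [-y[k-1], u[k-1]]. */
        double s11 = 0, s12 = 0, s22 = 0, r1 = 0, r2 = 0;
        for (int k = 1; k < n; k++) {
            double p1 = -y[k - 1], p2 = u[k - 1];
            s11 += p1 * p1;  s12 += p1 * p2;  s22 += p2 * p2;
            r1  += p1 * y[k];
            r2  += p2 * y[k];
        }
        double det = s11 * s22 - s12 * s12;   /* assumes well-conditioned data */
        *a = (s22 * r1 - s12 * r2) / det;
        *b = (s11 * r2 - s12 * r1) / det;
    }

    int main(void)
    {
        /* Hypothetical logged data: step stimulus and a first-order response. */
        double u[8] = {0, 1, 1, 1, 1, 1, 1, 1};
        double y[8] = {0, 0, 0.5, 0.75, 0.875, 0.9375, 0.96875, 0.984375};
        double a, b;
        arx1_fit(u, y, 8, &a, &b);
        printf("a = %f, b = %f\n", a, b);   /* expect a = -0.5, b = 0.5 */
        return 0;
    }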

  • Can we cascade PID and PIV loops to control the servo motor

    hi
    Presently we are using a PID loop for controlling the motion of a servo motor with an NI 7352 card. We are not able to get the desired response from this implementation. Is there any other alternative, such as using a PIV loop or a PID loop cascaded with PIV, to achieve a better response? If there is a possibility, please help us to proceed further with this.
    Also, tell us which is more reliable: 1) using PID alone, 2) using PIV alone, or 3) using PIV and PID cascaded.
    Please mail the reply to this query to [email protected]

    Sidda,
    Before you start thinking about advanced control architectures, I want to ask you to tell me some details about your system behavior and the control parameters that you have used. I have used 73xx boards for very dynamic systems, and I have always been able to find control parameters that resulted in very fast and stable system behavior.
    If you need some help with tuning, please have a look at this link. In many cases autotuning doesn't result in good system behavior, but you will find a lot of interesting hints about the manual tuning process there (e.g. that increasing the Kd gain typically results in a better damped system).
    If this doesn't help, please attach some screenshots of your step response and the control parameters that you have used.
    Best regards,
    Jochen Klier
    National Instruments Germany
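
    For reference, here is the textbook discrete PID form being tuned in this kind of loop, as a small C sketch. This is illustrative only (not the 7352's on-board loop; gains and loop rate are assumptions); the Kd term acting on the error derivative is what adds damping to the step response.

    #include <stdio.h>

    typedef struct {
        double kp, ki, kd;      /* the gains being tuned            */
        double integral;        /* running integral of the error    */
        double prev_error;      /* error from the previous sample   */
        double dt;              /* loop period in seconds           */
    } pid_ctrl;

    static double pid_update(pid_ctrl *c, double setpoint, double measured)
    {
        double error = setpoint - measured;
        c->integral += error * c->dt;
        double derivative = (error - c->prev_error) / c->dt;   /* Kd acts on this */
        c->prev_error = error;
        return c->kp * error + c->ki * c->integral + c->kd * derivative;
    }

    int main(void)
    {
        pid_ctrl c = { 2.0, 0.5, 0.1, 0.0, 0.0, 0.001 };   /* example gains, 1 kHz loop */
        printf("first command: %f\n", pid_update(&c, 1.0, 0.0));
        return 0;
    }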

  • Position Control of compumotor 406LXR linear servo table and GV-U6E(motor drive) with PCI-7344

    Since I had the answers referring:
    http://exchange.ni.com/servlet/ProcessRequest?RHIVEID=101&RPAGEID=135&HOID=506500000008000000DD490000&USEARCHCONTEXT_CATEGORY_0=_14_&USEARCHCONTEXT_CATEGORY_S=0&UCATEGORY_0=_14_&UCATEGORY_S=0
    The wiring step was done well. However, I still have a problem: the MAX configuration doesn't match the Compumotor drive settings. The drive setup is shown below, and I want to know how I can set up the PCI-7344 configuration in MAX or in a LabVIEW VI.
    ;Uploaded from device address 0
    ;Gemini GV Servo Drive Setup
    ;Motor Setup
    DMTR 1703 ;Motor ID (406-x-LXR-M-x-D13-x-x-x-x-E5-x-x-x)
    DMTIC 2.48 ;Continuous Current (Amps-RMS)
    DMTICD 0.00 ;Continuous Current Derating (% derating at rated speed)
    DMTKE 17.6 ;Motor Ke (Volts (0-to-peak)/krpm)
    DMTRES 10.10 ;Motor Winding Resistance (Ohm)
    DMTJ 119.300 ;Motor Rotor Inertia (kg*m*m*10e-6)
    DPOLE 1 ;Number of Motor Pole Pairs
    DMTW 40.5 ;Motor Rated Speed (rev/sec)
    DMTIP 7.40 ;Peak Current (Amps-RMS)
    DMTLMN 3.4 ;Minimum Motor Inductance (mH)
    DMTLMX 3.4 ;Maximum Motor Inductance (mH)
    DMTD 0.000000 ;Motor Damping (Nm/rad/sec)
    DMTRWC 0.23 ;Motor Thermal Resistance (degrees Celsius/Watt)
    DMTTCM 20.0 ;Motor Thermal Time Constant (minutes)
    DMTTCW 0.33 ;Motor Winding Time Constant (minutes)
    DMTAMB 40.00 ;Motor Ambient Temperature (degrees Celsius)
    DMTMAX 90.00 ;Maximum Motor Winding Temperature (degrees Celsius)
    DHALL 1 ;Disable Hall Sensor Checking
    DMTLQS 0 ;Set Q Axis Inductance Saturation
    DMTLDS 0 ;Set D Axis Inductance Saturation
    DTHERM 0 ;Disable motor thermal switch input
    ;Drive Setup
    DMODE 2 ;Drive Control Mode
    DRES 8400 ;Drive Resolution (counts/rev)
    DPWM 16 ;Drive PWM Frequency (kHz)
    SFB 1 ;Encoder Feedback
    ERES 8400 ;Encoder Resolution (counts/rev)
    ORES 8400 ;Encoder Output Resolution (counts/rev)
    DMEPIT 42.00 ;Electrical Pitch (mm)
    SHALL 0 ;Invert Hall Sensors
    DMTLIM 1.5 ;Torque Limit (Nm)
    DMTSCL 1.5 ;Torque Scaling (Nm)
    DMVLIM 119.000000 ;Velocity Limit (rev/sec)
    DMVSCL 119.000000 ;Velocity Scaling (rev/sec)
    ;Load Setup
    LJRAT 0.0 ;Load-to-Rotor Inertia Ratio
    LDAMP 0.0000 ;Load Damping (Nm/rad/sec)
    ;Fault Setup
    FLTSTP 1 ;Fault on Startup Indexer Pulses Enable
    FLTDSB 1 ;Fault on Drive Disable Enable
    SMPER 8400 ;Maximum Allowable Position Error (counts)
    SMVER 0.000000 ;Maximum Allowable Velocity Error (rev/sec)
    DIFOLD 0 ;Current Foldback Enable
    ;Digital Input Setup
    INLVL 11000000 ;Input Active Level
    INDEB 50 ;Input Debounce Time (milliseconds)
    INUFD 0 ;Input User Fault Delay Time (milliseconds)
    LH 0 ;Hardware EOT Limits Enable
    ;Digital Output Setup
    OUTBD 0 ;Output Brake Delay Time (milliseconds)
    OUTLVL 0100000 ;Output Active Level
    ;Analog Monitor Setup
    DMONAV 0 ;Analog Monitor A Variable
    DMONAS 100 ;Analog Monitor A Scaling (% of full scale output)
    DMONBV 0 ;Analog Monitor B Variable
    DMONBS 100 ;Analog Monitor B Scaling (% of full scale output)
    ;Servo Tuning
    DIBW 1500 ;Current Loop Bandwidth (Hz)
    DVBW 100 ;Velocity Loop Bandwidth (Hz)
    DPBW 40.00 ;Position Loop Bandwidth (Hz)
    SGPSIG 1.000 ;Velocity/Position Bandwidth Ratio
    SGIRAT 1.000 ;Current Damping Ratio
    SGVRAT 1.000 ;Velocity Damping Ratio
    SGPRAT 1.000 ;Position Damping Ratio
    DNOTAF 0 ;Notch Filter A Frequency (Hz)
    DNOTAQ 1.0 ;Notch Filter A Quality Factor
    DNOTAD 0.0000 ;Notch Filter A Depth
    DNOTBF 0 ;Notch Filter B Frequency (Hz)
    DNOTBQ 1.0 ;Notch Filter B Quality Factor
    DNOTBD 0.0000 ;Notch Filter B Depth
    DNOTLG 0 ;Notch Lag Filter Break Frequency (Hz)
    DNOTLD 0 ;Notch Lead Filter Break Frequency (Hz)
    SGINTE 1 ;Integrator Option
    SGVF 0 ;Velocity Feedforward Gain (%)
    SGAF 0 ;Acceleration Feedforward Gain (%)
    Regards,
    JinHo

    First of all, before connecting the drive to the 73xx controller, I would check that the drive and motor configuration works correctly, independently of the controller. The Gemini drives come with a utility called Motion Planner that allows you to configure your motor and drive for standalone operation, so you can test whether the motor and drive combination works by itself. Refer to page 18 of the Gemini GV installation guide, which you can find at the Compumotor site or download from:
    http://www.compumotor.com/manuals/gemini/Gemini_GV_HW_Install_Guide.pdf
    Once you have tested your motor and drive combination, use Motion Planner to make sure that the drive is configured in torque mode and that the command signal is sent from the I/O connector and not through RS-232. The next step is to connect the 7344 through the UMI-7764 breakout box as follows:
    UMI-7764 (from 7344)    GV-U6E
    AOut                    Cmd+         (pin 23)
    AOGnd                   Cmd-         (pin 24)
    InhOut                  Enable-      (pin 2)
    +5V                     Enable+      (pin 24)
    EncA                    AX+          (pin 8)
    EncA-                   AX-          (pin 9)
    EncB                    BX+          (pin 10)
    EncB-                   BX-          (pin 11)
    Index                   ZX+          (pin 12)
    Index-                  ZX-          (pin 13)
    +5V                     Encoder +5V  (pin 5)
    Verify that your enable line is connected in open-collector mode (as shown in the diagram above). Our inhibit outputs can sink current but not source it, so if your enable line is not behaving properly, make sure that the +5 V supply you are using for the UMI can source enough current for your enable line to work. Consult Compumotor on the specs of their enable switch.
    Once the connections are done properly, all you need to do is configure and initialize your board for servo operation in MAX and then you can start your tuning process. Refer to the Tuning PID for servos tutorial in:
    http://www.ni.com/support/motnsupp.htm
    for instructions on tuning your servomotor properly. Also for more information on using MAX, refer to the following tutorial:
    http://zone.ni.com/devzone/conceptd.nsf/webmain/081957EE013C7A4586256B92007818E0?opendocument

  • How to make the servo motor to move in steps of set degrees and stop

    How do I make the servo motor move in steps of a set number of degrees and then stop?

    Hi,
    I think the following document would be a good starting place: NI Developer Zone Tutorial: Single Axis Moves. It includes links to several example programs that you may find useful for your application. Keep in mind that there are many motion examples that ship with LabVIEW as well.
    I strongly recommend that you check out the following documents as well:
    NI Developer Zone Tutorial: Simple Point to Point Motion
    NI Developer Zone Tutorial: Hands-On Motion
    NI Developer Zone Tutorial: Axis Settings for Motion Controllers
    These tutorials will help give you a good foundation for understanding motion control systems.
    Best wishes!
    Dawna P.
    Applications Engineer
    National Instruments
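
    As a small aside on the arithmetic behind such point-to-point moves, converting a step of a set number of degrees into a target position is just a scaling by the encoder counts per revolution. A tiny sketch (the counts-per-revolution value is only an example; use your own axis setting):

    #include <stdio.h>

    static long degrees_to_counts(double degrees, long counts_per_rev)
    {
        return (long)((degrees / 360.0) * (double)counts_per_rev);
    }

    int main(void)
    {
        printf("%ld counts\n", degrees_to_counts(15.0, 8000));   /* -> 333 counts per 15-degree step */
        return 0;
    }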

  • Whether the NI step controller can control the digital servo driver and motor

    I would like to confirm whether an NI stepper controller can control a digital servo drive and motor. If it can, what is the difference between a servo controller and a stepper controller for controlling a servo motor? Thank you so much!
    Best regards

    The PCI/PXI/FW-7344 high-precision controller can be used to control both stepper and servo motors. The older PC-Step controllers are made for stepper-only control. The difference between the two control types is that with stepper control, the controller outputs two digital signals that control the step and direction of the motor. The drive uses these digital signals to energize the stepper phases and commutate the motor. In a servo system, the controller outputs a -10 to +10 V analog signal that the drive then uses to output the appropriate current to the motor, which is proportional to torque.
    You can get more information from the following website:
    http://zone.ni.com/devzone/devzone.nsf/webcategories/69771825AE23E98E86256786000BEA02?opendocument
    Please let us know if you have any further questions.
    Regards,
    Andy Bell
    Applications Engineer
    National Instruments
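
    To make the analog command side concrete, here is a small illustrative sketch of scaling a torque setpoint into the -10 to +10 V command range (the full-scale torque value is an assumption; the real scaling comes from the drive's configuration):

    #include <stdio.h>

    static double torque_to_command_volts(double torque_nm, double torque_at_10v_nm)
    {
        double volts = 10.0 * (torque_nm / torque_at_10v_nm);
        if (volts >  10.0) volts =  10.0;   /* clamp at the rails */
        if (volts < -10.0) volts = -10.0;
        return volts;
    }

    int main(void)
    {
        /* example: drive scaled so 2.0 N*m corresponds to a full-scale 10 V command */
        printf("%.2f V\n", torque_to_command_volts(0.5, 2.0));   /* -> 2.50 V */
        return 0;
    }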

  • Servo motor control using CRIO+FPGA and 9477 digital out module

    Hello experts,
    I have a Futaba BLS551 brushless digital servo (3 wires: +, -, signal). I also have a cRIO with Real-Time and FPGA and a 9477 digital output module. How can I generate servo signals using this module?
    please help...
    Thanks,

    freemason,
    In order to control your servo motor with the FPGA and/or DIO module, you will have to write drivers to control your motor and drive. While this is possible, it is an extremely complicated and time-consuming process. I would highly recommend that you consider using the NI 9514 with SoftMotion, as it will provide full servo functionality and is relatively easy to use.
    Regards,
    Sam K
    Applications Engineer
    National Instruments

  • Servo motor control using MCB2300 and Labview

    Hello Everyone,
    I have to drive a servo motor using an MCB2300 board and LabVIEW. I am new to LabVIEW as well as to the MCB2300 board. I understand that I have to generate a pulse to control the servo motor.
    I have gone through some of the posts but could not find anything useful.
    I need to do it ASAP; a fast and detailed answer is much appreciated.
    Thanks in advance.

    I don't know if this will help you, but think about this tip:
    On the back of my (digital) servo's package there was some data printed (see attached image).
    You've heard of PWM (pulse width modulation)? If not, look it up on Wikipedia, microcontroller.net, or (if you read German) http://www.rn-wissen.de/index.php/Servos, or Google it.
    The description of the servo says that I have to send a positive (+5 V +/- 1 V) signal every 200 ms +/- 1 ms, with a length depending on the angle I want the servo to be set to. The signal is coded in time: the longer the pulse, the higher the angle. The signal length ranges from 70 to 240 ms (with my servo, and not exactly). The signal must be repeated every 200 ms, as I said before. I don't know if you understand C, but here is a function I wrote which works fine for me:
    void set_Servo_0(uint8_t angle)
    {
        DDRA |= (1 << 0);                 // make PA0 an output (Atmel-specific port setup)
        uint8_t tick;                     // counts how often the pulse is sent
        for (tick = 0; tick < 2; tick++)  // send the pulse twice
        {
            if (getStopwatch1() > 200)    // getStopwatch1() is a library-specific timer in ms
            {
                PORTA |= (1 << 0);        // pulse high
                sleep(angle);             // hold the line high for 'angle' time units = pulse width
                PORTA &= ~(1 << 0);       // then pull the line low again
                setStopwatch1(0);         // reset the timer for the next 200 ms period
            }
        }
        mSleep(250);                      // settling delay between successive angle commands; without it,
                                          // pulses are lost to incorrect timing (not nice, but it works)
    }
    This function takes the angle as an integer from 7 to 24 and puts the servo in the corresponding position once.
    Maybe you can adapt it. Good luck.
    Attachments:
    Servo.jpg ‏207 KB
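
    A hypothetical usage of the function above (not from the original post): sweeping the servo through the angle range mentioned, one position at a time.

    uint8_t a;                    // requires <stdint.h> and the function above
    for (a = 7; a <= 24; a++)     // 7..24 maps to the servo's travel range
        set_Servo_0(a);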

  • CRIO using 9502 and 9514 for servo motor

    Hello,
    The 9514 states it is an interface from the cRIO to a servo driver and the 9502 states that it is a servo driver.  Can you hook the two together to work as a system?  Is there any benefit of using the two together as opposed to just one?  I couldn't find any reference in either manual of the other's use.
    Thanks,
    Zach

    "Why would the interface to an external drive have the PID loop on board, when the end product driver itself does not?"
    Essentially the cRIO FPGA is part of the 9502 module. Putting the control logic on the FPGA instead of in the module means that the module is less expensive and that it is more customizable. The performance is the same if not better.
    The goal with a lot of cRIO products, and especially new motion products, is to give customers an API that can be opened and customized by customers all the way down to the actual IO pins if necessary.
    In motion that means that when you use the 9502 you can run the example as is, or you can open it up and modify the logic that defines that hardware to customize it to your needs.
    While the 9502 is less expensive than the 9514 plus an external drive, there are some tradeoffs. The 9502 can support up to 4 A continuous current. We are doing a lot of work to make the LabVIEW FPGA-based 950x modules integrate seamlessly with NI SoftMotion. In LabVIEW 2011 it takes some custom programming to write the interface code between the NI SoftMotion engine and the FPGA.
    The 9514 can potentially control a drive that can drive larger motors. It is also easier to use with the NI SoftMotion engine in LabVIEW 2011.

  • Servo motors, rpm measurements and accelerometers

    Hello reader,
    Please excuse my question if it seems trivial but I am an undergrad trying to make LabVIEW work! We are running LabVIEW 8 and plan to use USB-6009 to make the measurements. I have read that this device isn't the best for control tasks. Any suggestions?
    For our experiment, we wish to have six controllable spinning cylinders attached to servo motors or stepper motors, which ever works the best. Each cylinder will be spinning at a set rpm which we wish to measure the desired rpm versus actual at all times. If possible, we would like to have an accelerometer functioning as well.
    I was wondering if this is possible with LabVIEW. If so, does anyone have any suggestions on how one might do this?
    Thank you very much for your time!
    -I. Fritz

    ifritz,
    there are several considerations for choosing a servo or a stepper:
    Inertia of the cylinders
    Maximum rpm values
    Maximum acceleration and deceleration values
    The maximum velocity for stepper motors is typically 3000 rpm. At this velocity the torque that the motor can provide has decreased to a small fraction of the specified maximum torque. Stepper motors have very limited capabilities to compensate for following errors, so you need to calculate carefully the required torque at a given velocity and compare these values with the motor specs. Additionally, you need to make sure that the ratio between the motor's inertia and the inertia of the load is not too low. To avoid torque reflections this value should be somewhere in the range of 1:1 to 1:5.
    If you find a stepper, that is a good match for your application, the difference between commanded speed and measured speed should be very close to zero (except potentially some micro-oscillations).
    If you need higher torque and/or velocities and dynamic following-error compensation, a servo motor should be a better choice, but you will have to tune the system, which makes configuration a bit harder.
    In any case the USB-6008 is the wrong choice for the control task, as it's not fast enough and can't be used in real-time control applications.
    Depending on the type of your feedback signal, it might be usable for your measurements, but I also doubt that this is a good choice. According to your post, you need to compare commanded velocity and real velocity. This implies that you have access to the data of the trajectory generator and that you can acquire these data at the same rate as your feedback signal. This requirement conflicts with most of the motion control units available on the market. The minimum system that could meet this requirement is a real-time system with one or more PCI or PXI multifunction DAQ plug-in boards (depending on the number and type of output channels that you need to control the motors) and, optionally (recommended), the NI SoftMotion Development Module.
    Depending on your accuracy and speed requirements there might also be other solutions, but with the USB-6008 you are definitely on the wrong track.
    Kind regards,
    Jochen Klier
    National Instruments
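
    As a back-of-envelope illustration of the torque and inertia check suggested above (all numbers are examples; friction and load torque are ignored):

    #include <stdio.h>

    int main(void)
    {
        const double pi = 3.14159265358979;
        double j_load  = 2.0e-4;           /* cylinder inertia, kg*m^2 (example)     */
        double j_motor = 1.0e-4;           /* motor rotor inertia, kg*m^2 (example)  */
        double rpm_target = 1500.0;        /* desired speed                          */
        double accel_time = 0.25;          /* seconds to reach that speed (example)  */
        double omega  = rpm_target * 2.0 * pi / 60.0;   /* rad/s   */
        double alpha  = omega / accel_time;             /* rad/s^2 */
        double torque = (j_motor + j_load) * alpha;     /* N*m     */
        printf("load:rotor inertia ratio = %.1f:1, required torque = %.3f N*m\n",
               j_load / j_motor, torque);
        return 0;
    }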

  • Difference between BLDC motors and Brushless Servo Motors

    Hi
    I am new to the concept of motors and motor control. I don't know if I have come to the right place, but it would be great if anyone could help me find out the differences between brushless DC motors and brushless servo motors. Can I use NI motion controllers to control a brushless DC motor?
    Regards,
    Suman.

    Suman,
    Brushless DC motors and brushless servo motors typically mean the same thing. The word "servo" implies that a position feedback resource (e.g. a quadrature encoder) is used for closed-loop control.
    You can use NI motion controllers to control brushless motors. Please refer to this thread in the forums for more information.
    Best regards,
    Jochen Klier
    National Instruments Germany

  • Accelerometer sense tilt then interface with USB-6008 and generate PWM to control M995 servo motor

    hi everyone,
    Currently I'm doing a final-year project in LabVIEW using the USB-6008. My project involves sensing tilt with an accelerometer and converting the tilt into PWM to control the turning of a servo motor.
    The accelerometer I'm using is an ADXL322, which has a dual-axis ±2 g sensing range.
    The servo motor I'm using is an MG995. In the neutral position it requires a 1500-microsecond pulse; to turn +90 degrees it requires 2200 microseconds, and to turn -90 degrees it requires 800 microseconds.
    Currently I'm facing a problem generating the PWM signal to control the operation of the servo motor.
    Attached is the VI design that I have done to date.
    The program extracts the tilt reading from the accelerometer using the USB NI 6008, converts the tilt into an angle in degrees, and then generates PWM to control two servo motors.
    Hopefully somebody can help me with this. Thanks.
    Attachments:
    FYP.vi ‏253 KB

    Currently I need to generate PWM to control a servo motor, model MG995, but I am facing a problem generating a pulse width between 800 and 2200 microseconds to control the rotation of the servo motor. Is there any example I can refer to?
    Below is the VI that I have done to date.
    Hopefully somebody can identify my mistake, because my VI is not able to turn the servo motor.
    Attachments:
    pwm.vi ‏128 KB
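
    For reference, the angle-to-pulse-width mapping implied by the figures in this post (1500 us at neutral, 800 us at -90 degrees, 2200 us at +90 degrees) is a simple linear interpolation. A small C sketch, assuming that linearity:

    #include <stdio.h>

    static double angle_to_pulse_us(double angle_deg)
    {
        if (angle_deg < -90.0) angle_deg = -90.0;   /* clamp to the servo's travel */
        if (angle_deg >  90.0) angle_deg =  90.0;
        return 1500.0 + (angle_deg / 90.0) * 700.0; /* ~700 us per 90 degrees      */
    }

    int main(void)
    {
        printf("%.0f us at -90, %.0f us at 0, %.0f us at +90\n",
               angle_to_pulse_us(-90), angle_to_pulse_us(0), angle_to_pulse_us(90));
        return 0;
    }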

  • Interfacing external servo motor in NI sbRIO-9632 and programming for it.

    Hi..
    I am using the NI Starter Kit containing an sbRIO-9632 and programming it through LabVIEW. Now I want to interface an external gripper to it, using a servo motor for its motion, but how can I program this external motor in LabVIEW? I mean, do I need to set some pin high on the sbRIO?
    Please help me get started.

    Hi sharry,
    You should be able to connect the signal pin of your servo directly to one of the digital pins of the sbRIO. Then you can simply write a pulse of varying width using the FPGA to drive the servo. A good example of this can be found in the Example Finder: search for "PWM"; the example you want is called "Controlling a servo using PWM.lvproj".
    The 5V and GND pins of the servo should be connected externally, so that the servo doesn't draw its power from the sbRIO.
    Also be aware that the digital pins on the sbRIO operate at 3.3 V, not 5 V. So if you're outputting a pulse to control a servo, it will be a 3.3 V pulse. For most servos this won't be a problem (3.3 V is within the range for a HIGH signal in TTL and CMOS logic levels), but it's worth checking your servo's datasheet first, or doing a quick test.
    Kind regards,
    Josh E
    Applications Engineer
    National Instruments UK & Ireland
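
    As a small aside on the numbers involved, converting the servo frame time and pulse width into FPGA clock ticks is simple scaling. A sketch assuming a 40 MHz timebase (check your target's actual clock) and a typical 20 ms RC-servo frame:

    #include <stdio.h>

    int main(void)
    {
        const double clock_hz = 40e6;    /* assumed FPGA clock               */
        double frame_ms = 20.0;          /* typical RC-servo period          */
        double pulse_us = 1500.0;        /* neutral-position pulse (example) */
        long frame_ticks = (long)(clock_hz * frame_ms / 1000.0);
        long pulse_ticks = (long)(clock_hz * pulse_us / 1e6);
        printf("frame = %ld ticks, pulse = %ld ticks\n", frame_ticks, pulse_ticks);
        return 0;
    }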
