Tuning servo motor problems

I have a PCI-7344 connected to a UMI and an Ecodrive amplifier, with a linear module and servo. Feedback is an incremental encoder, 16404 counts/rev (400 counts/mm). When I try to autotune, the servo motor goes berserk, so I have to tune it manually with "Step response".
I am able to get a nice response curve according to "Understanding Servo Tune", but the problem is this: the curve starts at -50 ms, and at 50 ms it seems stable, yet the indicated settling time is 500-600 ms. When I use "Wait for Move Complete.vi" it waits about the same period before it returns "move complete". I was expecting/hoping for a 50-100 ms settling time. How do I tune this? I run in position mode and there is no problem regarding velocity, acceleration or position feedback; the only thing is that the settling time is too long. I have no following error. Is there any other way to get a signal exactly when a position is reached?
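One alternative to waiting on "Wait for Move Complete.vi" is to poll the encoder position yourself and declare the move done as soon as the reading stays inside a small tolerance window. A rough host-side sketch, assuming a hypothetical `read_position` callback that returns the current position in counts (e.g. a wrapper around the board's read-position call):

```python
import time

def wait_in_position(read_position, target, tolerance=4, settle_samples=5, timeout=2.0):
    """Return True once `settle_samples` consecutive readings fall within
    `tolerance` counts of `target`, or False on timeout.

    `read_position` is a hypothetical callback returning the current
    encoder position in counts.
    """
    deadline = time.monotonic() + timeout
    in_window = 0
    while time.monotonic() < deadline:
        if abs(read_position() - target) <= tolerance:
            in_window += 1
            if in_window >= settle_samples:
                return True
        else:
            in_window = 0
        time.sleep(0.001)  # poll roughly every millisecond
    return False
```

With a 400 counts/mm encoder, a tolerance of 4 counts corresponds to 10 µm, so the "in position" signal can fire as soon as the axis is physically close enough, rather than after the loop's full settling criterion.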

Hi, thank you for your answer.
I still have problems with this, so here are some more questions.
1. I am getting a perfect curve on the step response, similar to the attached picture (1), only with a shorter time scale. The step response reports under the chart that the settling time is 500-600 ms, which is the settling time (<100 ms) and the steady state (>400-500 ms) together. This period seems to be the same one I am experiencing with "Move Complete.vi".
I have Kp=7, Kd=14 and Ki=1, and I have tried to adjust these values, but this is the best I get.
Does the loop need that time to determine that the move is complete?
2. The move I want to do in my application is about 20000 counts (50 mm) in 130 ms, wait 100 ms, then return to start in 130 ms. This is possible if I use a software delay and start the return move before the first move has settled according to "Move Complete.vi", but this generates an increasing stop-position error on each move if I loop the sequence with a 100 ms rest time. These timings should be possible with the hardware I am using. Any suggestions?
3. I have set the driver/amplifier to velocity mode (incremental encoder feedback); how does that compare to torque mode in the driver in this case?
4. It is possible to get a second feedback signal from the drive (torque, velocity or position from an analog output), but it is 8-bit and less accurate than the encoder. I also have a second external encoder for precision measurement and a force cell, but they are only active over part of the servo's range (within the desired move). Is there any other way I could set this up?
Regards
Attachments:
153.gif ‏29 KB

Similar Messages

  • Tuning servo motor to withstand external force of another servo motor

    Hi,
    I have two brushless DC servo motors connected to their own third-party amplifier operating in torque mode. These motors control a reel-to-reel system, where a metal ribbon travels from one reel to the other. One amplifier is connected to the NI 9516, where I am using Softmotion to control its velocity. The other amplifier is set at a constant torque to take up the ribbon.
    The problem I am having is with tuning the velocity-controlled motor to withstand the torque caused by the take-up motor. I need the motor to operate at a constant velocity while withstanding the forces caused by the pull of the other motor. So far I have been unsuccessful at tuning it to counteract that external force. I was hoping to have SoftMotion control the motor's velocity while the motor itself is set in torque mode, but it seems I have to set the motor to velocity mode to solve my problem. Is that the only answer, or is there another way?
    Linus

    Just to let you know, I already solved the problem. It seems that my initial settings for the servo amplifier were incorrect, and that I was providing too little power to the motor. It was set to low power because it was making a noise when set to a normal value. I eventually learned that the source of the noise was not the power, but the initial gain tuning parameters, which were not set correctly. When I changed the tuning parameters, I was able to supply more power to the motors, which gave them enough torque to withstand external forces.
    I initially used the gain tuning values recommended by the Getting Started with the AKD EtherCAT Drives guide, even though I was not using the same brand of motor. I thought the suggested tuning parameters would apply to my brushless servo motor, but it turns out that's not the case.

  • Servo motor encoder pulses/counter data erroneous

    First off, I am very new to using LabVIEW. I am trying to complete a project a former employee was working on.
    For a quick background on what I am working with: I am using an NI DAQCard-6036E connected to an SC-2345. The SC-2345 is then connected to a load sensor, an Omron R88D servo driver, and an Omron servo motor. The servo motor has an incremental encoder with a resolution of around 2048 pulses per revolution. My LabVIEW program includes a counter that records the data from the encoder on the servo motor.
    I have been able to get accurate data when testing through Measurement & Automation Explorer by manually turning the motor. Also, when running the specific DAQ Assistant I am using for my counter, I get correct readings when turning the motor manually. Once I run my complete program, instead of getting 2048 pulses per revolution, I am getting between 34000 and 36000 pulses per revolution.
    The most logical assumption is that vibration in the motor itself, or some sort of noise, is interfering with my signal. I first attempted to change any settings in the Omron servo driver that might reduce vibration in the motor: I tried changing the rigidity settings, turning the auto-tuning function on and off, and a few other settings specified by the user manual. If I turn the rigidity settings as low as possible, I get around 2000 pulses per revolution, but the data is very sporadic. Also, my equipment needs to be very rigid, and with the lowest rigidity setting for the servo driver I can almost stop the motor with minimal force. My equipment needs to travel at a near-constant speed with force fluctuations of up to 200 N. Any suggestions on which direction I should go in finding a countermeasure?
    Thanks

    The model number of the servo motor is R88M-W10030L. The servo motor rotates at a constant speed. The program is designed to drive the servo motor, connected to a ball screw, in one direction. Once the load sensor reaches a desired load, it reverses at a constant speed until no load is on the sensor. Throughout, it records load vs. displacement.
    I have found a few things that will alter the pulse counts. If you apply resistive pressure to the servo motor while it is rotating, the pulse output will vary. Also, when you apply pressure to the casing of the servo motor itself, the pulses will often jump around. I was almost certain my false pulses were caused by vibration. After having no success adjusting settings to reduce vibration (according to the user manual), I ran the program while moving several wires around to see if any were loose. After applying force to the power lines and encoder cable, the program ran several times with an average of 2000 pulses per revolution and would actually subtract pulses while going in reverse (what I want it to do), although I still saw positive and negative jumps in pulse counts while traveling forward at constant speed.
    Today I re-wired the equipment, separating as many wires as possible. After the re-wire, the equipment/program is back to sending 34000+ pulses per revolution, and does not subtract pulses while reversing. I have read the 'Using Quadrature Encoders with E Series DAQ Boards' article. Referring to the article, I am running similar to "method 1". I am already using a signal conditioning box, but have the counter data run directly through. Do you believe running the signals through an SCC-CTR01 might solve the problems?

  • From where in the UMI Flex 6C should I give the enable signal to the servo motor amplifier/driver?

    I am using an ADS 50/5 Maxon amplifier to run Maxon servo motors with the Flex 6C. The problem I am facing is how to enable the amplifier so that it works only when desired. I have tried connecting the enable input to the inhibit output, but it is not working. Before connecting to enable, the inhibit output shows 5 V, but as soon as I connect it, it drops to 1.5 V. I am feeding encoder feedback to the 6C. The motor runs when I connect enable directly to the power supply, but then it runs continuously. Can anybody suggest the correct connections?
    regards
    vkmehta

    If the voltage of the inhibit output drops to around 1.5 V as soon as you connect it to the enable input of your servo amplifier, you should check the specifications for the amplifier's enable input. It seems that an optocoupler is used: if you apply 5 V without any series resistance to the input LED of an optocoupler, the input voltage will be clamped between 1.5 and 2 V. The optocoupler may also be damaged by excessive current through the input LED.
    Check the manufacturer's specifications for correct input wiring. In most cases a series (current-limiting) resistor is needed, somewhere around 470 ohms.
    You should also check whether you are applying the correct logic level to the enable input. Some servo amplifiers need a logic 1 (= 5 V) on the input; some require a logic 0 (= 0 V) to shut down the amplifier.
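The series-resistor value follows from Ohm's law across the LED. A quick sketch of the calculation; the 1.5 V forward drop and 10 mA forward current are typical assumptions, so check the actual optocoupler or amplifier datasheet:

```python
def optocoupler_series_resistor(v_supply, v_f=1.5, i_f=0.010):
    """Series resistor (ohms) to drive an optocoupler input LED.

    v_supply : drive voltage, e.g. the 5 V inhibit output
    v_f      : LED forward voltage drop, typically 1.2-1.6 V (assumed)
    i_f      : target forward current in amps, typically 5-20 mA (assumed)

    R = (V_supply - V_f) / I_f
    """
    return (v_supply - v_f) / i_f

# For a 5 V output driving a 1.5 V LED at 10 mA:
# (5.0 - 1.5) / 0.010 = 350 ohms -> round up to the next standard value,
# e.g. 390 or 470 ohms, which matches the suggestion above.
```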

  • Servo motor using parallel port

    Hello,
            I have an XY mirror-scan servo motor used for an optical scanning experiment.
    The question I have is:
          I already have a program to control a stepper motor via the parallel port. Can I use the same to control a servo motor?
    Or is the SERVO motor concept different from that of a STEPPER motor?
    Thank you
    Abhilash S Nair
    Research Assistant @ Photonic Devices and Systems lab
    [ LabVIEW Professional Development System - Version 11.0 - 32-bit ]
    LabVIEW Gear:
    1. NI PXI-7951R & NI 5761
    2. The Imaging Source USB 3.0 monochrome camera with trigger : DMK 23UM021
    OPERATING SYSTEM - [ MS windows 7 Home Premium 64-bit SP-1 ]
    CPU - [Intel Core i7-2600 CPU @ 3.40Ghz ]
    MEMORY - [ 16.0 GB RAM ]
    GPU - [ NVIDIA GeForce GT 530 ]

    You will need a DAQ card that can generate the voltage needed to send a command signal to the 671.  The 671 will need to be tuned to the 6880 with whatever sized mirror is attached.  (If you bought the galvo and servo driver as a package it should already be tuned.)
    CTI systems take in an analog command from -10 to +10 Volts.  Almost all the NI DAQ cards (and many other brands) output +/- 10 Volts so that will be easy.
    Then you will need to decide how to scan your target.  A ramp pattern or triangle wave is the usual choice for scanning objects so you need to generate that in LabVIEW code along with the code that will read your sensor.  This should be done simultaneously but you really don't need a very expensive DAQ card to accomplish that.  Look on the NI website for options in your price range and do some research...
    Is your system one axis (one 6880 and one 671)?  If so you will scan a raster (ramp or triangle) to measure a single line of light intensity, move the stage a tiny distance and scan another line.  When you put all the lines together into a 2D image you will have a representation of one face of your object.  Many people use a rotary stage to spin the object while scanning to assemble a 3D model of the object.  This is a bit more complex of course.
    Using LabVIEW: 7.1.1, 8.5.1 & 2013
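The ramp/triangle scan pattern described above can be generated numerically before it is written to the analog output. A sketch using NumPy; the amplitude, frequency and sample rate are placeholders to be matched to your DAQ card and the ±10 V galvo driver input range:

```python
import numpy as np

def triangle_wave(amplitude, freq_hz, sample_rate, n_samples):
    """Symmetric triangle scan waveform in volts, centred on 0 V.

    Suitable as an analog-output command for a +/-10 V galvo driver;
    keep `amplitude` within the driver's input range.
    """
    t = np.arange(n_samples) / sample_rate
    # fold a 0..1 sawtooth phase into a -1..+1 triangle
    phase = (t * freq_hz) % 1.0
    tri = 4.0 * np.abs(phase - 0.5) - 1.0
    return amplitude * tri
```

Writing one period of this array per scan line, while simultaneously reading the light sensor, gives the line-by-line raster described above.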

  • Servo Motor Control

    Hi all...
    I'm working with the AT-MIO-16E-10 board, and I use its two counters (0 and 1) to control two servo motors. When I use them to generate a continuous pulse train, both motors move continuously. But my problem is that when I generate a finite pulse train (e.g. number of pulses = 10000), only one motor can move. Why, and how can I control two servo motors with this board?

    sushma:
    You currently have 4 other threads asking for help with control of a stepper, in one of which you said you had a working solution. Now you are asking about servos, which operate totally differently from stepper motors. Please keep the discussion in one thread and include what hardware you have to work with (motor driver, motor, DAQ card) so that others may help in an efficient and timely manner.
    Thanks
    ~~~~~~~~~~~~~~~~~~~~~~~~~~
    "It’s the questions that drive us.”
    ~~~~~~~~~~~~~~~~~~~~~~~~~~

  • Servo Motor and NI Card

    Hello,
    I'm pretty familiar with LabVIEW, especially the vision part of it, but I'm not familiar at all with servo motors.
    Problem statement:
    I'd like to make a VI that controls a servo motor that will rotate a screw every x amount of time (in both directions, obviously).
    Limitations:
    The screw turning should be as smooth as possible, as it is a very delicate instrument and experiment.
    As stated above, I don't know the first thing about servo motors and NI cards, and I would like to get some advice from you. Where should I begin? What are your recommendations for NI cards/ Motors?
    I don't think the software aspect will be very challenging; it's just two nested loops: a while loop that controls the whole experiment time, and an inner loop that controls the turning function.
    Unless I underestimate the complexity of setting up a servo motor VI.
    I'd be happy to hear any thoughts/ideas you might have.
    Thank you,
    A

    Hey A,
    I would start by checking this website.
    I would recommend using a PCI-7332 card with a UMI-7772 and an AKD servo drive P00606. Then you can select the motor from here. Depending on the size of your application, choose a motor, and then make sure that the drive has enough current to power it.
    Regards,
    A. Zaatari
    National Instruments
    Applications Engineer

  • Controlling a servo motor to generate constant force

    Hi,
    I have a machine which interacts with the foot, and I would like to control the motor so that it generates a constant load at the end effector (applied to the foot). I am able to read force and position information. What kind of control algorithm should I use to maintain a constant force generated by the servo motor? A force-feedback controller that feeds the force information back to the controller and updates the command signal seems ideal, but my force sensors could be noisy and I don't know what filters and what cutoff frequency I should choose. Can I assume a linear relationship between the current command to the motor and the amount of force generated at the output, or could that be wrong due to back EMF, friction, etc.?
    I would appreciate it if you could advise.
    I use CompactRIO and LabVIEW FPGA 8.6.
    Thanks
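One common structure for this problem is a PI force loop with a first-order low-pass filter on the noisy force signal. A rough host-side sketch only: the gains, the filter cutoff, and the linear current-to-force assumption (`newtons_per_amp`) are all placeholders that would have to be identified on the real machine, and an FPGA implementation would use fixed-point arithmetic:

```python
import math

class FilteredForceController:
    """PI force controller with a first-order low-pass on the feedback.

    Assumes motor current maps roughly linearly to end-effector force
    via `newtons_per_amp`; friction and back EMF make this approximate.
    """

    def __init__(self, kp, ki, dt, cutoff_hz, newtons_per_amp=1.0):
        self.kp, self.ki, self.dt = kp, ki, dt
        # first-order low-pass coefficient: alpha = dt / (dt + 1/(2*pi*fc))
        self.alpha = dt / (dt + 1.0 / (2.0 * math.pi * cutoff_hz))
        self.filtered = 0.0
        self.integral = 0.0
        self.newtons_per_amp = newtons_per_amp

    def update(self, force_setpoint, force_measured):
        """Return a current command (A) from a noisy force reading (N)."""
        self.filtered += self.alpha * (force_measured - self.filtered)
        error = force_setpoint - self.filtered
        self.integral += error * self.dt
        force_cmd = self.kp * error + self.ki * self.integral
        return force_cmd / self.newtons_per_amp
```

The cutoff frequency is a trade-off: low enough to suppress sensor noise, high enough not to add destabilizing phase lag inside the loop bandwidth.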

    I think it can be done with a simple force sensor and a DC motor, provided you have a means to control the voltage from an analog output.
    The only problem you might encounter is that it may oscillate around your force setpoint, but you can find some simple ways to smooth that out.
    It really depends (quite a bit) on what control precision your design is willing to tolerate.

  • Controlling a Servo Motor Using LabVIEW, Phidget & Mac OS

    Can anyone help me with this problem? 
    I'm attempting to control a servo motor attached to a Phidget, using a VI in LabVIEW on a Mac. The Phidget works fine with the Mac, LabVIEW works fine with the Mac, but there seems to be a problem combining the three. The problem I'm encountering is that the examples on the Phidget website (www.phidgets.com) for LabVIEW use ActiveX. Can anyone suggest a solution for this? Is it possible to use one of the control options in LabVIEW (GPIB, VISA, etc etc.) instead?
    I'm not great with computers so really have no idea where to start. 
    Thanks!  

    Thanks for your reply Jeff! So I can insert a CIN to the block diagram, right click and use the 'create .c file' option, insert the example code for controlling the motor from the Phidgets website (after tweaking it to make it specific to my setup) and that should work? How do I load the C library? Or a better question may be, what is the C library? (I wasn't joking when I said I'm totally new to this!)
    I've been working my way through 'C for Dummies' this week but I have to admit, the bit about header files and libraries lost me. I downloaded a bunch of stuff from the Phidgets website, including the phidget21.h and phidget21.lib files...do I '#include' both of these at the top of the example code?
    I've been in touch with the Phidget Support team (who are indeed great!) and received a similar reply ("You would have to call into the mac Phidget21 Framework directly").
    Once the CIN is all set up do you know what degree of control I'll have over the motor? The aim is to have the motor move in steps from -60deg to +60deg around a central point. Would this need to be defined in the code and then linked into LabVIEW or is this something I could control from within LabVIEW? The idea of my project is to use the motor to move a light source around a sensor. The sensor is hooked up to an NI DAQ that will record and display values (after some manipulation) on the front panel. I hope to display the sensor values and the corresponding motor position values.  

  • Accelerometer sense tilt then interface with USB-6008 and generate PWM to control M995 servo motor

    hi everyone,
    currently I'm doing a final-year project in LabVIEW using the USB-6008. My project involves sensing tilt with an accelerometer and converting the tilt into PWM to control the turning of a servo motor.
    The accelerometer I'm using is an ADXL322, which has a dual-axis ±2 g sense range.
    The servo motor I'm using is an MG995. When the servo motor is in the neutral position it requires a 1500-microsecond pulse; to turn +90 degrees it requires 2200 microseconds, and to turn -90 degrees it requires 800 microseconds.
    Currently I'm facing a problem generating the PWM signal to control the operation of the servo motor.
    Attached is my VI design that I have done to date.
    The program extracts the tilt from the accelerometer using the NI USB-6008, converts the tilt into an angle in degrees, and then generates PWM to control two servo motors.
    Hopefully somebody can help me with this. Thanks.
    Attachments:
    FYP.vi ‏253 KB
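The angle-to-pulse-width conversion described above is just linear interpolation between the three pulse widths given in the post (800 µs at -90°, 1500 µs at neutral, 2200 µs at +90°). A sketch of the calculation, done piecewise because the two halves have the same span here but need not in general:

```python
def tilt_to_pulse_us(angle_deg, min_us=800, neutral_us=1500, max_us=2200):
    """Map a tilt angle in [-90, +90] degrees to an MG995 pulse width in us.

    Pulse widths are taken from the post: 800 us at -90 degrees,
    1500 us at neutral, 2200 us at +90 degrees.
    """
    angle_deg = max(-90.0, min(90.0, angle_deg))  # clamp to servo range
    if angle_deg >= 0:
        return neutral_us + (max_us - neutral_us) * angle_deg / 90.0
    return neutral_us + (neutral_us - min_us) * angle_deg / 90.0
```

In the VI, this number would set the high time of the pulse, repeated at the servo's frame rate (typically every 20 ms).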

    Currently I need to generate PWM to control a servo motor, model MG995, but I am facing a problem generating the PWM between 800 µs and 2200 µs to control the rotation of the servo motor. Is there any example I can refer to?
    Below is my VI that I have done to date.
    Hopefully somebody can identify my mistake, because my VI is not able to turn the servo motor.
    Attachments:
    pwm.vi ‏128 KB

  • Using serial port to communicate with servo motor drive

    I wish to use LabVIEW 7.1 to control a brushless servo motor with encoder (from MCG) and a digital drive (from AMC) via the serial port. The application simply involves progressively loading the shaft of the motor in 180-degree increments and reading the torque. Is this possible? If so, how complicated is this approach? Thanks!

    Communicating with your AMC drive via LabVIEW is no problem. There is an example in the Example Finder (Help >> Find Examples . . ) called Basic Serial Write and Read.vi, which shows how to send and receive serial commands. You will have to find out what serial commands your drive responds to by looking at the AMC drive user manual. Certainly there are commands to position your servo motor at various increments and at various speeds.
    As for reading torque, however, I would be surprised if the drive directly output a torque value. One possibility might be to read the current being sent to the servo and interpret that as a torque reading, provided that the drive has a serial command to read the current output (servo motors normally have a fairly linear current-to-torque relationship). Current-to-torque conversions are motor-specific, and the accuracy of such a method might not be too good.
    A more accurate method would be to use a torque sensor (Transducer Techniques sells these, along with many other sensor companies) or a strain gauge. Both types of sensors would need to be read in with a DAQ board (see link for National Instruments DAQ boards).
    Hopefully this gives you some ideas - good luck with your project!

  • Can we cascade PID and PIV loops to control the servo motor

    hi
    presently we are using a PID loop for controlling the motion of a servo motor with an NI-7352 card. We are not able to get the desired response from this implementation. Is there any alternative, such as using a PIV loop or PID cascaded with PIV, to achieve a better response? If so, please help us proceed with it.
    Also, tell us which is more reliable: 1) using PID alone, 2) using PIV alone, 3) using PIV and PID cascaded.
    please mail replies to this query to [email protected]

    Sidda,
    before you start thinking about advanced control architectures, I want to ask you for some details about your system behavior and the control parameters that you have used. I have used 73xx boards for highly dynamic systems and I have always been able to find control parameters that resulted in very fast and stable system behavior.
    If you need some help with tuning, please have a look at this link. In many cases autotuning doesn't result in good system behavior, but you will find a lot of interesting hints about the manual tuning process there (e.g. that increasing the Kd gain typically results in a better-damped system).
    If this doesn't help, please attach some screenshots of your step response and the control parameters that you have used.
    Best regards,
    Jochen Klier
    National Instruments Germany
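The hint that increasing Kd damps the system can be illustrated with a toy simulation: a PID loop around a double-integrator plant (command maps to acceleration, a crude stand-in for a torque-mode servo, not the 73xx board's actual control loop). The derivative acts on measured velocity, as in a PIV loop, to avoid derivative kick on setpoint changes:

```python
def pid_step(kp, ki, kd, dt=0.001, steps=1000, setpoint=1.0):
    """Step response of a PID position loop on a double-integrator plant.

    Returns the position trajectory so different gain sets can be
    compared. The plant and gains are illustrative only.
    """
    pos = vel = integral = 0.0
    traj = []
    for _ in range(steps):
        err = setpoint - pos
        integral += err * dt
        # derivative on measurement (velocity feedback), not on error
        accel = kp * err + ki * integral - kd * vel
        vel += accel * dt
        pos += vel * dt
        traj.append(pos)
    return traj

# Compare max(pid_step(100, 0, 5)) with max(pid_step(100, 0, 25)):
# the higher Kd removes most of the overshoot, at the cost of a slower rise.
```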

  • Interfacing an external servo motor with the NI sbRIO-9632 and programming for it

    Hi..
    I am using the NI Starter Kit containing the sbRIO-9632 and programming it through LabVIEW. Now I want to interface an external gripper to it, and I am using a servo motor for its motion, but how is it possible to program this external motor in LabVIEW? I mean, do I need to drive some pin high on the sbRIO?
    Please help me get started.

    Hi sharry,
    You should be able to connect the signal pin of your servo directly to one of the digital pins of the sbRIO. Then you can simply write a pulse of varying width using the FPGA to drive the servo. A good example of this can be found in the Example Finder: search for "PWM", and the example you want is called "Controlling a servo using PWM.lvproj".
    The 5 V and GND pins of the servo should be connected externally, so that the servo doesn't draw its power from the sbRIO.
    Also be aware that the digital pins on the sbRIO operate at 3.3 V, not 5 V, so if you're outputting a pulse to control a servo, it will be a 3.3 V pulse. For most servos this won't be a problem (3.3 V is within the range for a HIGH signal in TTL and CMOS logic levels), but it's worth checking your servo's datasheet first, or doing a quick test.
    Kind regards,
    Josh E
    Applications Engineer
    National Instruments UK & Ireland

  • How do I zero the LV indicator that reads the signal from a servo motor's resolver and the position of a linear transducer?

    I am acquiring, through a PCI-6221 board, the signal from a servo motor's resolver, but when I command the servo motor to reference zero, the board does not indicate that the motor is at the 0° position.
    This also happens with a Heidenhain MT 1271 linear position transducer, where I cannot reference the working zero position.


  • Why doesn't my servo motor work?

    Hello all,
    I can't get my servo motors to work, so I have two questions.
    Firstly, I'm using a PXI-7340 & UMI-7764 to control an AC servo motor (TAMAGAWA SEIKI Co.), but in MAX the motor doesn't work with the servo type, and there is no signal between Analog Output and Analog Output Ground.
    Have I connected the UMI to the motor correctly?
    Currently, I connect the encoder signals by converting the differential line driver output (motor driver) to TTL (UMI) using a quadruple differential line receiver.
    Is that right? If it is, why doesn't the motor work?
    Secondly, when I use the stepper type in MAX, the motor works open-loop, but it doesn't work closed-loop. Why?
    The polarities of the encoder signals (A, B, Index) are matched to my servo motors by settings in MAX.
    Thanks,
    megumi
    Attachments:
    TAMAGAWA SEIKI motor&driver.pdf ‏3063 KB

    Thank you for your reply.
    "Does not work" means that the motor doesn't move with the servo type.
    In MAX, there is no error when the target position is less than 999 in 1-D Interactive, but the motor doesn't move. When the target position is 1000 or more and I click "Apply" and "Start", the "Motor off" and "Following error occurred" error lamps turn red, and the motor doesn't move.
    I first thought the encoder signals were wrong, because the motor had not moved in closed-loop stepper mode with encoder feedback. So I corrected the encoder wiring, but the motor still doesn't move.
    The wiring is "STEP", "DIR", "AO", and "ENC A, B and Index-bar" respectively. In MAX, the type is servo, the feedback is encoder, and limit switches are not used.
    I can supply other wiring or setup information if you need it.
    Regards,
    Megumi
