Controlling a servo motor to generate constant force

Hi,
I have a machine that interacts with the foot, and I would like to control the motor so that it generates a constant load at the end effector (applied to the foot). I am able to read force and position information. What kind of control algorithm should I use to make the servo motor generate a constant force? A force feedback controller that feeds the force measurement back to the controller and updates the command signal seems ideal, but my force sensors can be noisy, and I don't know what filter and cut-off frequency I should choose. Can I assume a linear relationship between the current command to the motor and the amount of force generated at the output, or could that be wrong due to back EMF, friction, etc.?
I would appreciate any advice.
I use a CompactRIO and LabVIEW FPGA 8.6.
Thanks

I think it can be done with a simple force sensor and a DC motor, provided you have a means of controlling the voltage from an analog output.
The only problem you might encounter is that the output may oscillate around your force setpoint, but there are simple ways to smooth that out.
It really depends (quite a bit) on how much control precision your design can tolerate.
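To make that concrete, here is a minimal sketch in C of a discrete PI force loop with a first-order low-pass filter on the noisy force reading. All names, gains, the sample time, and the cutoff frequency are placeholder assumptions to be tuned on the actual rig; on a cRIO this logic would typically live in the FPGA or RT loop rather than plain C.

```c
#include <assert.h>

/* Hypothetical discrete PI force controller with a first-order low-pass
 * filter on the force measurement. Gains, sample time, and cutoff are
 * illustrative placeholders, not tuned values. */
typedef struct {
    double kp, ki;    /* PI gains */
    double integral;  /* accumulated error */
    double alpha;     /* filter coefficient, 0 < alpha <= 1 */
    double filtered;  /* filtered force estimate */
} force_ctrl;

/* alpha = dt / (dt + tau), with tau = 1/(2*pi*fc) for cutoff fc [Hz]
 * and sample time dt [s]. */
double lowpass_alpha(double dt, double fc) {
    const double pi = 3.14159265358979;
    double tau = 1.0 / (2.0 * pi * fc);
    return dt / (dt + tau);
}

/* One control step: filter the raw force reading, then compute the
 * current command from the filtered error. */
double force_ctrl_step(force_ctrl *c, double setpoint,
                       double raw_force, double dt) {
    c->filtered += c->alpha * (raw_force - c->filtered);  /* low-pass */
    double err = setpoint - c->filtered;
    c->integral += err * dt;
    /* command = P + I term; output units depend on the amplifier scaling */
    return c->kp * err + c->ki * c->integral;
}
```

A common starting point is to put the filter cutoff well below the sensor's noise band but several times above the closed-loop bandwidth you need. The integral term also answers the linearity worry: even if the current-to-force map is imperfect because of friction and back EMF, the integrator slowly absorbs that offset.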

Similar Messages

  • Controlling a Servo Motor Using LabVIEW, Phidget & Mac OS

    Can anyone help me with this problem? 
    I'm attempting to control a servo motor attached to a Phidget, using a VI in LabVIEW on a Mac. The Phidget works fine with the Mac, LabVIEW works fine with the Mac, but there seems to be a problem combining the three. The problem I'm encountering is that the examples on the Phidget website (www.phidgets.com) for LabVIEW use ActiveX. Can anyone suggest a solution for this? Is it possible to use one of the control options in LabVIEW (GPIB, VISA, etc etc.) instead?
    I'm not great with computers so really have no idea where to start. 
    Thanks!  

    Thanks for your reply Jeff! So I can insert a CIN to the block diagram, right click and use the 'create .c file' option, insert the example code for controlling the motor from the Phidgets website (after tweaking it to make it specific to my setup) and that should work? How do I load the C library? Or a better question may be, what is the C library? (I wasn't joking when I said I'm totally new to this!)
    I've been working my way through 'C for Dummies' this week but I have to admit, the bit about header files and libraries lost me. I downloaded a bunch of stuff from the Phidgets website, including the phidget21.h and phidget21.lib files...do I '#include' both of these at the top of the example code?
    I've been in touch with the Phidget Support team (who are indeed great!) and received a similar reply ("You would have to call into the mac Phidget21 Framework directly").
    Once the CIN is all set up do you know what degree of control I'll have over the motor? The aim is to have the motor move in steps from -60deg to +60deg around a central point. Would this need to be defined in the code and then linked into LabVIEW or is this something I could control from within LabVIEW? The idea of my project is to use the motor to move a light source around a sensor. The sensor is hooked up to an NI DAQ that will record and display values (after some manipulation) on the front panel. I hope to display the sensor values and the corresponding motor position values.  

  • Tuning servo motor to withstand external force of another servo motor

    Hi,
    I have two brushless DC servo motors connected to their own third-party amplifier operating in torque mode. These motors control a reel-to-reel system, where a metal ribbon travels from one reel to the other. One amplifier is connected to the NI 9516, where I am using Softmotion to control its velocity. The other amplifier is set at a constant torque to take up the ribbon.
    The problem I am having is with tuning the velocity-controlled motor to withstand the torque caused by the take-up motor. I need the motor to operate at a constant velocity while withstanding the forces caused by the pull of the other motor. So far I have been unsuccessful at tuning it to counteract that external force. I was hoping to have SoftMotion control the motor's velocity while the motor itself is set in torque mode, but it seems I have to set the motor to velocity mode to solve my problem. Is that the only answer, or is there another way?
    Linus

    Just to let you know, I already solved the problem. It seems that my initial settings for the servo amplifier were incorrect, and that I was providing too little power to the motor. It was set to a low power because it was making a noise when set to a normal value. I eventually learned that the source of the noise was not the power, but that the initial gain tuning parameters were not set correctly. When I changed the tuning parameters, I was able to supply more power to the motors, which gave them enough torque to withstand external forces.
    I initially used the gain tuning values recommended by the Getting Started with the AKD EtherCAT Drives guide, even though I was not using the same brand of motor. I thought the suggested tuning parameters would apply to my brushless servo motor, but it turns out that's not the case.

  • Can somebody help me develop LabVIEW code for generating pulses to drive an AC servo motor?

    Can somebody help me develop LabVIEW code for generating pulses to drive an AC servo motor? I am using an NI 9401 card. Thanks.

    Driving an AC servo motor (I missed "AC" in the previous message) requires some complex hardware. This is generally done by drives designed specifically for this purpose. I doubt you will be able to accomplish this with the hardware you currently have, and it might be cheaper to just buy a drive for the motor and control the speed through the drive.

  • Accelerometer senses tilt, then interfaces with USB-6008 to generate PWM controlling an MG995 servo motor

    Hi everyone,
    I am currently doing a final year project in LabVIEW using the USB-6008. My project involves sensing tilt from an accelerometer and converting the tilt into PWM to control the turning of a servo motor.
    The accelerometer I'm using is the ADXL322, which has a dual-axis ±2 g sense range.
    The servo motor I'm using is the MG995. In the neutral position it requires a 1500-microsecond pulse; at +90 degrees it requires 2200 microseconds, and at -90 degrees it requires 800 microseconds.
    Currently I'm facing a problem generating the PWM signal to control the servo motor.
    Attached is the VI I have designed to date.
    The program reads the tilt range from the accelerometer through the NI USB-6008, converts the tilt into an angle in degrees, and then generates PWM to control two servo motors.
    Hopefully somebody can help me with this. Thanks.
    Attachments:
    FYP.vi ‏253 KB

    Currently I need to generate PWM to control a servo motor, model MG995, but I am facing a problem generating the PWM between 800 µs and 2200 µs to control the rotation of the servo motor. Is there any example I can refer to?
    Below is the VI I have done to date.
    Hopefully somebody can identify my mistake, because my VI is not able to turn the servo motor.
    Attachments:
    pwm.vi ‏128 KB
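    For what it's worth, the angle-to-pulse-width mapping is just linear interpolation between the figures quoted above (800 µs at -90°, 1500 µs neutral, 2200 µs at +90°). A small C sketch of that calculation, with a made-up function name for illustration; the same arithmetic would go into the LabVIEW code that sets the pulse width:

```c
#include <assert.h>

/* Map a commanded angle in degrees (-90..+90) to a pulse width in
 * microseconds, using the MG995 figures quoted above. Out-of-range
 * angles are clamped. */
int angle_to_pulse_us(double angle_deg) {
    if (angle_deg < -90.0) angle_deg = -90.0;
    if (angle_deg >  90.0) angle_deg =  90.0;
    /* linear interpolation around the 1500 us neutral point */
    if (angle_deg >= 0.0)
        return (int)(1500.0 + angle_deg * (2200.0 - 1500.0) / 90.0 + 0.5);
    return (int)(1500.0 + angle_deg * (1500.0 - 800.0) / 90.0 + 0.5);
}
```

The pulse then has to be repeated at the servo's frame rate (typically every 20 ms for hobby servos), which is the part the VI's timing loop must get right.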

  • Servo motor encoder pulses/counter data erroneous

    First off, I am very new to using LabVIEW. I am trying to complete a project a former employee was working on.
    For a quick background on what I am working with: I am using an NI DAQCard-6036E connected to an SC-2345. The SC-2345 is connected to a load sensor, an Omron R88D servo driver, and an Omron servo motor. The servo motor has an incremental encoder with a resolution of around 2048 pulses per revolution. My LabVIEW program includes a counter that records the data from the encoder on the servo motor. I have been able to get accurate data when testing through Measurement & Automation Explorer by manually turning the motor, and I also get correct readings through the DAQ Assistant I am using for my counter when turning the motor by hand. But once I run my complete program, instead of getting 2048 pulses per revolution, I get between 34000 and 36000 pulses per revolution. The most logical assumption is that there is vibration in the motor itself, or some sort of noise is interfering with my signal. I first attempted to change any settings in the Omron servo driver that might reduce vibration in the motor: I tried changing the rigidity settings, turning the auto-tuning function on and off, and a few other settings specified by the user manual that might cause vibration. If I set the rigidity as low as possible, I get around 2000 pulses per revolution, but the data is very sporadic. Also, my equipment needs to be very rigid, and with the lowest rigidity setting I can almost stop the motor with minimal force. My equipment needs to travel at a near-constant speed with force fluctuations of up to 200 N. Any suggestions on which direction I should go to find a countermeasure?
    Thanks

    The model number of the servo motor is R88M-W10030L. The servo motor rotates at a constant speed. The program is designed to drive the servo motor, connected to a ball screw, in one direction. Once the load sensor reaches a desired load, it reverses at a constant speed until there is no load on the sensor. Throughout, it records load vs. displacement. I have found a few things that alter the pulse counts. If you apply resistive pressure to the servo motor while it is rotating, the pulse output varies. Also, when you apply pressure to the casing of the servo motor itself, the pulses often jump around. I was almost certain my false pulses were caused by vibration. After having no success adjusting settings to reduce vibration (according to the user manual), I ran the program while moving around several wires to see if any were loose. After applying force to the power lines and encoder cable, the program ran several times with an average of 2000 pulses per revolution and would actually subtract pulses while going in reverse (what I want it to do), although I saw positive and negative jumps in pulse counts while traveling forward at constant speed. Today I re-wired the equipment, separating as many wires as possible. After the re-wire, the equipment/program is back to sending 34000+ pulses per revolution, and does not subtract pulses while reversing. I have read the 'Using Quadrature Encoders with E Series DAQ Boards' article; referring to the article, I am running similar to "method 1". I am already using a signal conditioning box, but the counter data runs directly through it. Do you believe running the signals through an SCC-CTR01 might solve the problems?
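    One software-side safeguard worth knowing about here: a proper x4 quadrature decoder rejects transitions where both channels appear to change at once, which is exactly the signature of electrical noise on the encoder lines. A minimal C sketch of the state-table approach (illustrative only; hardware filtering such as the SCC-CTR01 route is still the better fix for the root cause):

```c
#include <assert.h>

/* Minimal x4 quadrature decoder. Each call takes the current A/B line
 * levels and updates the count; transitions where both bits flip at
 * once (typical of noise) are counted as errors, not position. */
typedef struct { int prev; long count; long errors; } quad_dec;

void quad_step(quad_dec *q, int a, int b) {
    /* indexed by (prev_state << 2) | new_state; entries are the count
     * delta (+1, -1, 0) or the sentinel 2 for an invalid transition */
    static const int table[16] = {
         0, +1, -1,  2,
        -1,  0,  2, +1,
        +1,  2,  0, -1,
         2, -1, +1,  0
    };
    int now = (a << 1) | b;
    int d = table[(q->prev << 2) | now];
    if (d == 2) q->errors++; else q->count += d;
    q->prev = now;
}
```

A steadily climbing `errors` counter on the real system would confirm that noise, not genuine rotation, is inflating the pulse count.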

  • Servo motor

    Hi All,
    Currently I am working on a project that involves programming LabVIEW to control an AC servo motor. I have done some simple simulation and controller design in Simulink, and I wish to implement my controller from Simulink in LabVIEW, but I am struggling to find a way to input my controller as a transfer function. I know there is an add-on toolkit for Control Design and Simulation in LabVIEW, but due to a tight budget I cannot purchase it. Therefore I am asking whether there is any other way I can implement/test the controller I designed in Simulink in LabVIEW.
    Thanks.  
    *P.S. Attachment is the basic VI I am using to run/test my servo motor. I need to add on a controller in front, before the signal being sent to motor. 
    Attachments:
    IP2.png ‏17 KB
    26 April Single Axis.vi ‏23 KB

    From the z-transform you can generate an algorithm that you can then enter in LabVIEW using the normal functions (shift registers, add functions, etc.). For example, a PD controller might be described in z and then converted to a difference-equation algorithm.
    You could then enter this algorithm in LabVIEW.
    If you are still stuck, I recommend you look in a good control textbook, which will explain it better than I can.
    David
    www.controlsoftwaresolutions.com
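    As an illustration of what David describes, a discrete PD controller in difference-equation form might look like the following C sketch (gains and sample time are placeholders; in LabVIEW, the e[k-1] term is exactly what a shift register would carry between loop iterations):

```c
#include <assert.h>
#include <math.h>

/* Discrete PD controller from the z-domain form
 *   U(z) = (Kp + Kd*(1 - z^-1)/T) * E(z)
 * which unrolls to u[k] = Kp*e[k] + (Kd/T)*(e[k] - e[k-1]). */
typedef struct { double kp, kd, dt, prev_err; } pd_ctrl;

double pd_step(pd_ctrl *c, double err) {
    double u = c->kp * err + c->kd * (err - c->prev_err) / c->dt;
    c->prev_err = err;  /* the "shift register": remember e[k-1] */
    return u;
}
```

The same unrolling works for any transfer function exported from Simulink: each z^-1 in the expression becomes one stored past value.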

  • Servo motor using parallel port

    Hello,
            I have a XY mirror scan Servo motor used for optical scanning experiment.
    The question I have is:
          I already have a program to control a stepper motor via the parallel port. Can I use the same program to control a servo motor?
    Or is the SERVO motor concept different from that of a STEPPER motor?
    Thank you
    Abhilash S Nair
    Research Assistant @ Photonic Devices and Systems lab
    [ LabView professional Development System - Version 11.0 - 32-bit ]
    LabView Gear:
    1. NI PXI-7951R & NI 5761
    2. The Imaging Source USB 3.0 monochrome camera with trigger : DMK 23UM021
    OPERATING SYSTEM - [ MS windows 7 Home Premium 64-bit SP-1 ]
    CPU - [Intel Core i7-2600 CPU @ 3.40Ghz ]
    MEMORY - [ 16.0 GB RAM ]
    GPU - [ NVIDIA GeForce GT 530 ]

    You will need a DAQ card that can generate the voltage needed to send a command signal to the 671.  The 671 will need to be tuned to the 6880 with whatever sized mirror is attached.  (If you bought the galvo and servo driver as a package it should already be tuned.)
    CTI systems take in an analog command from -10 to +10 Volts.  Almost all the NI DAQ cards (and many other brands) output +/- 10 Volts so that will be easy.
    Then you will need to decide how to scan your target.  A ramp pattern or triangle wave is the usual choice for scanning objects so you need to generate that in LabVIEW code along with the code that will read your sensor.  This should be done simultaneously but you really don't need a very expensive DAQ card to accomplish that.  Look on the NI website for options in your price range and do some research...
    Is your system one axis (one 6880 and one 671)?  If so you will scan a raster (ramp or triangle) to measure a single line of light intensity, move the stage a tiny distance and scan another line.  When you put all the lines together into a 2D image you will have a representation of one face of your object.  Many people use a rotary stage to spin the object while scanning to assemble a 3D model of the object.  This is a bit more complex of course.
    Using LabVIEW: 7.1.1, 8.5.1 & 2013
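    As a sketch of the ramp/triangle idea above, here is one way to fill a buffer with a single period of a triangle wave for the ±10 V analog command (the buffer size and amplitude are arbitrary examples; in LabVIEW this would be the waveform you hand to the analog output write):

```c
#include <assert.h>
#include <math.h>

/* Fill buf with one period of a triangle wave between -amp and +amp.
 * n is the number of samples per period; the wave rises over the first
 * half of the buffer and falls over the second half. */
void triangle_wave(double *buf, int n, double amp) {
    int half = n / 2;
    for (int i = 0; i < n; i++) {
        double phase = (i < half) ? (double)i / half
                                  : (double)(n - i) / (n - half);
        buf[i] = -amp + 2.0 * amp * phase;
    }
}
```

Regenerating this buffer continuously on one analog output while sampling the light sensor on an input channel gives the line-scan pattern described above.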

  • Tuning servo motor problems

    I have a PCI 7344 connected to a UMI and an EcoDrive amplifier, with a linear module and servo. Feedback is an incremental encoder, 16404 counts/rev (400 counts/mm). When I try to autotune, the servo motor goes berserk, so I have to tune it manually with "Step response".
    I am able to get a nice response curve according to "Understanding Servo Tune", but the problem is that the curve starts at -50 ms and seems stable at 50 ms, yet the settling time is indicated to be 500-600 ms. When I use "wait for move complete.vi" it waits about the same period before it returns "move complete". I was expecting/hoping for a 50-100 ms settling time. How should I tune this? I run position mode and there is no problem regarding velocity, acceleration, or position feedback; the only
    thing is that the settling time is too long. I have no following error. Is there any other way to get a signal exactly when a position is reached?

    Hi, thank you for your answer.
    I still have problems with this, so here is some more questions.
    1. I am getting a perfect curve on the step response, similar to the attached picture (1), only with a shorter time scale. The step response reports under the chart that the settling time is 500-600 ms, which is the settling time (<100 ms) and the steady state (>400-500 ms) together. This period seems to be the same as what I experience with "move complete.vi".
    I have Kp=7, Kd=14 and Ki=1, and I have tried to adjust these values, but this is the best I get.
    Does the loop need that time to determine that the move is complete?
    2. The move I want to do in my application is a movement of about 20000 counts (50 mm) in 130 ms, wait 100 ms, and return to start in 130 ms. This is possible if I use a
    software delay and start the return move before the first move has settled according to "move complete.vi". But this generates an increasing stop-position error on each move if I loop this sequence with 100 ms rest time. These timings should be possible with the hardware I am using. Any suggestions?
    3. I have set the driver/amplifier in velocity mode (incremental encoder feedback); how does that compare to torque mode in the driver in this case?
    4. It is possible to get a second feedback signal from the drive (torque, velocity or position from an analog output), but this is 8-bit and less accurate than the encoder. I also have a second external encoder for precision measurement and a force cell, but they are only active over part of the servo's range (during the desired move). Is there any other way I could set this up?
    Regards
    Attachments:
    153.gif ‏29 KB

  • Servo Motor Control

    Hi all...
    I'm working with the AT-MIO-16E-10 board,
    and I use its two counters (0 and 1) to control two servo motors. When I
    use them to generate continuous pulse trains, both motors move
    continuously. But my problem is that when I generate a finite pulse train
    (e.g. number of pulses = 10000), only one motor can move. Why, and how can I
    control two servo motors with this board?

    sushma:
    You currently have 4 other threads asking for help with control of a stepper, in one of which you said you had a working solution. Now you are asking about servos, which operate totally differently from stepper motors. Please keep the discussion in one thread and include what hardware you have to work with (motor driver, motor, DAQ card) so that others may help in an efficient and timely manner.
    Thanks
    ~~~~~~~~~~~~~~~~~~~~~~~~~~
    “It’s the questions that drive us.”
    ~~~~~~~~~~~~~~~~~~~~~~~~~~

  • Servo motor control using CRIO+FPGA and 9477 digital out module

    Hello experts,
    I have a futaba BLS551 brushless motor digital servo (3 wires-+,-, signal). i also have a CRIO+real-time+fpga and 9477 digital out module. how can i generate servo signals using this module
    please help...
    Thanks,

    freemason,
    In order to control your servo motor with the FPGA and/or DIO module, you will have to write drivers to control your motor and drive. While this is possible, it is an extremely complicated and time-consuming process. I would highly recommend you consider the NI 9514 with SoftMotion instead, as it provides full servo functionality and is relatively easy to use.
    Regards,
    Sam K
    Applications Engineer
    National Instruments

  • Servo motor control using MCB2300 and Labview

    Hello Everyone,
    I have to drive a servo motor using an MCB2300 board and LabVIEW. I am new to LabVIEW as well as to the MCB2300 board. I understand that I have to generate pulses to control the servo motor.
    I have gone through some of the posts but could not find anything useful.
    I need to do this as soon as possible; a fast and detailed answer would be much appreciated.
    Thanks in advance.

    I don't know if this will help you, but consider this tip:
    On the back of my (digital) servo's package some data was printed (see attached image).
    Have you heard of PWM (pulse width modulation)? If not, look it up on
    Wikipedia, microcontroller.net, or (if you speak German) http://www.rn-wissen.de/index.php/Servos, or Google it.
    The description of the servo says that I have to send, every 200 ms +/- 1 ms, a positive (+5 V +/- 1 V) signal whose length depends on the angle I want the servo to move to. The signal is coded in terms of time: the longer the signal, the larger the angle. The signal length is between 70 and 240 ms (with my servo, and not exactly). The signal must be repeated every 200 ms, as I said before. I don't know if you understand C, but here is a function I wrote which works fine for me:
    void set_Servo_0(uint8_t angle)
    {
        DDRA |= 2;                       // specific port declaration for my µC (Atmel)
        uint8_t tick;                    // counts how often the signal has been sent
        for (tick = 0; tick < 2; tick++) // loop: send the signal a couple of times
        {
            if (getStopwatch1() > 200)   // getStopwatch1 is a library-specific function that measures time in ms steps
            {
                PORTA |= 1;              // port high
                sleep(angle);            // 'angle' is the function parameter; stay high for this time = positive signal
                PORTA &= ~1;             // then pull the port low again
                setStopwatch1(0);        // reset the timer
            }
        }
        mSleep(250); // finally wait this long (in ms) when sending different angle parameters one
                     // after another, to let the whole system (µC + servo + rest of program) settle
                     // down; otherwise signal steps are lost due to incorrect timing (not nice, but works).
    }
    This function takes the angle as an integer from 7 to 24 and puts the servo in the corresponding position one time.
    Maybe you can adapt it. Good luck.
    Attachments:
    Servo.jpg ‏207 KB

  • Servo motors, rpm measurements and accelerometers

    Hello reader,
    Please excuse my question if it seems trivial but I am an undergrad trying to make LabVIEW work! We are running LabVIEW 8 and plan to use USB-6009 to make the measurements. I have read that this device isn't the best for control tasks. Any suggestions?
    For our experiment, we wish to have six controllable spinning cylinders attached to servo motors or stepper motors, whichever works best. Each cylinder will spin at a set rpm, and we wish to compare the desired rpm versus the actual rpm at all times. If possible, we would like to have an accelerometer functioning as well.
    I was wondering if this is possible with LabVIEW. If so, does anyone have any suggestions on how one might do this?
    Thank you very much for your time!
    -I. Fritz

    ifritz,
    there are several  considerations for choosing a servo or a stepper:
    Inertia of the cylinders
    maximum rpm values
    maximum acceleration and deceleration values 
    The maximum velocity for stepper motors is typically 3000 rpm. At this velocity the torque that the motor can provide has decreased to a small fraction of the specified maximum torque. Stepper motors have very limited capabilities to compensate following errors, so you need to calculate carefully the required torque at a given velocity and compare these values with the motor specs. Additionally you need to make sure that the ratio between the motor's inertia and the inertia of the load is not too low. To avoid torque reflections this value should be somewhere in the range of 1:1 and 1:5.
    If you find a stepper, that is a good match for your application, the difference between commanded speed and measured speed should be very close to zero (except potentially some micro-oscillations).
    If you need higher torque and/or velocities and dynamic following error compensation, a servo motor should be a better choice, but you will have to tune the system which makes it a bit harder to configure the system.
    In any case the USB-6008 is the wrong choice for the control task, as it's not fast enough and can't be used in real-time control applications.
    Depending on the type of your feedback signal, it might be used for your measurements, but I also doubt that this is a good choice. According to your post, you need to compare commanded velocity and real velocity. This implies that you have access to the data of the trajectory generator and that you can acquire these data at the same rate as your feedback signal. This requirement conflicts with most of the available motion control units on the market. The minimum system that could meet this requirement is a real-time system with one or more PCI or PXI multifunction DAQ plug-in boards (depending on the number and type of output channels that you need to control the motors) and, optionally (recommended), the NI SoftMotion Development Module.
    Depending on your accuracy and speed requirements there might also be other solutions, but with the USB-6008 you are definitely on the wrong track.
    Kind regards,
    Jochen Klier
    National Instruments
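    The inertia-ratio rule of thumb above is easy to check numerically. A trivial C helper, reading the 1:1 to 1:5 band quoted above as "load inertia between 1x and 5x the rotor inertia" (that interpretation, and the function name, are mine; the inertia figures would come from the motor datasheet):

```c
#include <assert.h>

/* Return 1 if the load-to-rotor inertia ratio falls inside the
 * ~1:1 to 1:5 band recommended above for steppers, else 0.
 * Both inertias in consistent units, e.g. kg*m^2. */
int inertia_ratio_ok(double load_inertia, double rotor_inertia) {
    double ratio = load_inertia / rotor_inertia;
    return ratio >= 1.0 && ratio <= 5.0;
}
```

For six spinning cylinders this check, plus the torque-at-speed comparison against the motor curve, is usually enough to decide stepper vs. servo before buying hardware.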

  • How do I zero the LabVIEW indicator that reads the signal from a servo motor's resolver and the position of a linear transducer?

    I am acquiring, through a PCI 6221 board, a signal from a servo motor's resolver, but when I command the servo motor to reference zero, the board does not indicate that the motor is at the 0° position.
    The same happens with a Heidenhain MT 1271 linear position transducer, where I cannot reference the working zero position.

    Discussion moved from the SAP Community in Portuguese (Portuguese Language Community) to Databases & Technology (Database & Technology).
    Hey Paulo, thanks for contributing.

  • Why does not my servo motor work?

    Hello all,
    I can't get my servo motors to work, so I have two questions.
    Firstly, I'm using a PXI-7340 and UMI-7764 to control an AC servo motor (TAMAGAWA SEIKI Co.),
    but in MAX the motor doesn't work with the servo type, and there is no signal between Analog Output and Analog Output Ground.
    Have I connected the UMI to the motor correctly?
    Currently, I connect the encoder signals by converting the differential line driver output (motor driver) to TTL (UMI) using a quadruple differential line receiver.
    Is that right?
    If it is right, why doesn't the motor work?
    Secondly, when I use the stepper type in MAX, the motor works open-loop, but it doesn't work closed-loop. Why?
    The polarities of the encoder signals (A, B, Index) are matched to my servo motors by settings in MAX.
    Why?
    Thanks,
    megumi
    Attachments:
    TAMAGAWA SEIKI motor&driver.pdf ‏3063 KB

    Thank you for your reply.
    "Does not work" means that the motor doesn't move with the servo type.
    In MAX, there is no error when the target position is less than 999 in 1-D Interactive, but the motor doesn't move. When the target position is 1000 or more and I click "Apply" and "Start", the "Motor off" and "Following error occurred" error lamps turn red, and the motor doesn't move.
    I first thought the encoder signals were wrong, because the motor had not moved in stepper mode with closed-loop encoder feedback.
    So I rewired the encoders correctly, but the motor still doesn't move.
    The wiring is "STEP", "DIR", "AO", and "ENC A, B and Index bar" respectively. In MAX, the type is servo mode, the feedback is the encoder, and limit switches are not used.
    I can supply other wiring or setup information if you need it.
    Regards,
    Megumi

Maybe you are looking for

  • Sapscript: display in main window table ... repeat table haeder on new page

    Hi all, I have a form with one Window -> Main Window, in this Window it will be written many different information... there is also a table, when the table cause a new-page, the header (discription of the table) should be written (repeated) also on t

  • Screen Change when using LDB - pnp

    Hi , I am facing a strange problem. I am using LDB -> pnp and HRIN0013 as Master data rep , in one of the programs that i developed. The Screen and program behave fine in the development server. Once i transport it to the production server, the Scree

  • Execute txkWfClone.sh in a shell script

    We have a requirement that needs txkWfClone.sh to run inside a shell script (unix). We need to pass the username/password in the shell script. Basically we need to run this as a silent process. Any suggestions on how to accomplish that?

  • Export to Excel with colored row

    Hi, I'm developing a alv report in web dynpro. The first  requirement is to put traffic lights based on some criteria. This part has been done. 2nd requirement is to export this alv report into excel sheet. If traffic light of the row is 'RED' then i

  • Do anybody know how to read file...

    Hello, I have a problem with reading config file. Do anybody know how to read config file from the same dir as class. Now the situation is like: filePath I hardcoded like: fileName = "C:\\apache-tomcat-5.5.17\\webapps\\ROOT\\WEB-INF\\classes\\config.