AC servo motor

Is it necessary to use the UMI-7764 if I want to implement fuzzy logic or another advanced controller?
My AC servo motor has its own drive. I intend to use the FP-2000, FP-AIO-610 and FP-QUAD-510 (I do not intend to use the PCI-7344 for motion control).
TQ.
SADIAH

The UMI-7764 is a breakout box with signal conditioning that interfaces an NI motion control plug-in board such as the PCI-7344 to a third-party drive. You won't need this device for your hardware setup.
Please note that you shouldn't expect control loop rates higher than 100 to 200 Hz with your hardware. This is primarily caused by the FP-2000, which is designed for the relatively slow I/O typically used in industrial automation applications.
If you are looking for a high-performance solution in an even more compact form factor, please have a look at our CompactRIO products, which can run control loops at up to 200 kHz. You might also be interested in the NI SoftMotion Development Module for LabVIEW, which provides many of the features you will need for your application (trajectory generation, spline engine, ...). A minimal loop-timing sketch follows this reply.
Best regards,
Jochen Klier
National Instruments Germany
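To illustrate the loop-rate point above, here is a minimal sketch of a fixed-rate software control loop in plain Python (not NI API code). The read_position() and write_drive_command() helpers are hypothetical placeholders for the FP-QUAD-510 / FP-AIO-610 reads and writes, and the gains are illustrative only; a fuzzy controller would simply replace the PID calculation inside the loop, but the achievable period remains bounded by the FieldPoint I/O latency.

```python
# Minimal sketch, not NI-specific: a fixed-rate discrete PID loop at ~100 Hz,
# roughly the software loop rate achievable with FieldPoint I/O.
import time

KP, KI, KD = 2.0, 0.5, 0.01   # example gains, to be tuned on the real axis
DT = 0.01                     # 10 ms period -> 100 Hz loop rate

def read_position():
    raise NotImplementedError("replace with an FP-QUAD-510 encoder read")

def write_drive_command(volts):
    raise NotImplementedError("replace with an FP-AIO-610 analog output write")

def control_loop(setpoint, duration_s=5.0):
    integral = 0.0
    prev_error = 0.0
    t_next = time.monotonic()
    for _ in range(int(duration_s / DT)):
        error = setpoint - read_position()
        integral += error * DT
        derivative = (error - prev_error) / DT
        write_drive_command(KP * error + KI * integral + KD * derivative)
        prev_error = error
        t_next += DT
        time.sleep(max(0.0, t_next - time.monotonic()))  # keep a steady 100 Hz
```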

Similar Messages

  • How do I zero the LabVIEW indicator that reads the signal from a servo motor's resolver and the position of a linear transducer?

    I am acquiring, through a PCI-6221 board, the signal from a servo motor's resolver, but when I command the servo motor to reference zero, the board does not indicate that the motor is at the 0° position.
    The same thing happens with a Heidenhain MT 1271 linear position transducer, for which I cannot reference the working zero position.


  • Why does not my servo motor work?

    Hello all,
    I can't get my servo motors to work, so I have two questions.
    First, I'm using a PXI-7340 and a UMI-7764 to control an AC servo motor (TAMAGAWA SEIKI Co.),
    but in MAX the motor doesn't work with the servo type, and there is no signal between Analog Output and Analog Output Ground.
    Have I connected the UMI to the motor correctly?
    At the moment I connect the encoder signals by converting the differential line driver output (motor driver) to TTL (UMI) using a quadruple differential line receiver.
    Is that right?
    If it is right, why doesn't the motor work?
    Second, when I use the stepper type in MAX, the motor works in open loop, but with the stepper type it doesn't work in closed loop. Why?
    The encoder polarities (A, B, Index) are matched to my servo motors by the settings in MAX, so why doesn't it work?
    Thanks,
    megumi
    Attachments:
    TAMAGAWA SEIKI motor&driver.pdf ‏3063 KB

    Thank you for your reply.
    The "does not work" means that the motor doesn't move with servo type.
    In MAX, there is no error when a target position is less than 999 in 1-D Interactive, but doesn't move. When the target position is more than 1000 and I click "Apply" and "Start", error ramps of "Motor off" and "Following error occurred" are red. And the motor doesn't move.
    I thought that firstly signals of encorder have been wrong because the motor have not moved with stepper mode of closed-loop and feedback of encorder.
    So I would connect correctly wiring of encorders, but the motor doesn't move.
    Wiring of all is "STEP", "DIR", "AO", "ENC A, B and Index bar" respectively. In MAX, type is servo mode, feedback is encorder, kinds of limit switch are not used.
    I supply it if you need the other setup of wiring or information.
    Regards,
    Megumi

  • Servo motor encoder pulses/counter data erroneous

    First off, I am very new to using LabVIEW. I am trying to complete a project a former employee was working on.
    For a quick background on what I am working with: I am using an NI DAQCard-6036E connected to an SC-2345. The SC-2345 is then connected to a load sensor, an Omron R88D servo driver, and an Omron servo motor. The servo motor has an incremental encoder with a resolution of around 2048 pulses per revolution. My LabVIEW program includes a counter that records the data from the encoder on the servo motor. I have been able to get accurate data when testing through Measurement & Automation Explorer by manually turning the motor. Also, when running the specific DAQ Assistant I am using for my counter, I get correct readings when manually turning the motor. Once I run my complete program, however, instead of getting 2048 pulses per revolution I get between 34000 and 36000 pulses per revolution. The most logical assumption is that there is vibration in the motor itself or some sort of noise interfering with my signal. I first attempted to change any settings on the Omron servo driver that might reduce vibration in the motor: I tried changing the rigidity settings, turning the auto-tuning function on and off, and a few other settings specified by the user manual that might cause vibration. If I turn the rigidity setting as low as possible, I get around 2000 pulses per revolution, but the data is very sporadic. Also, my equipment needs to be very rigid, and with the lowest rigidity setting on the servo driver I can almost stop the motor with minimal force. My equipment needs to travel at a near-constant speed with force fluctuations of up to 200 N. Any suggestions on which direction I should go to find a countermeasure?
    Thanks
    Solved!
    Go to Solution.

    The model number of the servo motor is R88M-W10030L. The servo motor rotates at a constant speed. The program is designed to drive the servo motor, connected to a ball screw, in one direction. Once the load sensor reaches a desired load, it reverses at a constant speed until no load is on the sensor. Throughout, it records load vs. displacement. I have found a few things that will alter the pulse counts. If you apply resistive pressure to the servo motor while it is rotating, the pulse output will vary. Also, when you apply pressure to the casing of the servo motor itself, the pulses will often jump around, so I was almost certain my false pulses were caused by vibration. After having no success adjusting settings to reduce vibration (according to the user manual), I ran the program while moving around several wires to see if any were loose, etc. After applying force to the power lines and encoder cable, the program ran several times with an average of 2000 pulses per revolution and would actually subtract pulses while going in reverse (what I want it to do); although the average was around 2000 pulses per revolution, I saw positive and negative jumps in pulse counts while traveling forward at constant speed. Today I re-wired the equipment, separating as many wires as possible. After the re-wire, the equipment/program is back to sending 34000+ pulses per revolution and does not subtract pulses while reversing. I have read the 'Using Quadrature Encoders with E Series DAQ Boards' article. Referring to the article, I am running something similar to "method 1". I am already using a signal conditioning box, but the counter signals run straight through it. Do you believe running the signals through an SCC-CTR01 might solve the problems?
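    For reference, here is a minimal sketch (Python with the nidaqmx package rather than LabVIEW) of a quadrature-encoder position read with X4 decoding. The device name, counter, and PFI terminals are placeholders, and note that E Series boards such as the DAQCard-6036E may still need the external conversion described in the 'Using Quadrature Encoders with E Series DAQ Boards' article, whereas newer M/X Series counters decode quadrature natively.

    ```python
    # Hedged sketch: on-demand read of an angular encoder position via a counter.
    # "Dev1", ctr0, and the PFI terminals are placeholders; adjust to your wiring.
    import nidaqmx
    from nidaqmx.constants import AngleUnits, EncoderType

    with nidaqmx.Task() as task:
        ch = task.ci_channels.add_ci_ang_encoder_chan(
            "Dev1/ctr0",
            decoding_type=EncoderType.X_4,   # X4 gives 4 x 2048 counts per revolution
            zidx_enable=False,
            units=AngleUnits.DEGREES,
            pulses_per_rev=2048,
        )
        ch.ci_encoder_a_input_term = "/Dev1/PFI8"    # route A/B to the wired terminals
        ch.ci_encoder_b_input_term = "/Dev1/PFI10"
        task.start()
        print(task.read())                           # current angle in degrees
    ```

    The sketch only covers the software side; the cable-flexing symptoms described above usually point to wiring, grounding, or shielding rather than the counter configuration itself.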

  • Test stand design for high speed brushless dc servo motor efficiency testing

    We are in the process of designing two test stands around the brushless dc servo motor in the attached specifications.  The first we would like to operate using Labview and a PXI-7352 controller.  The second will be a real time configuration.  The motor will need to run in the 25-50,000 RPM range.  We want to bring the hall sensors back to the controller and do PID and other control there.  The sinusoidal commutation from the controller would then go to an as yet unselected servo amplifier.  When the motor is under test we want to be able to measure accurately the power that is being used by the motor - not the motor and controller / amplifier.
    Can the PXI-7353 handle this motor speed range?
    Will Labview and the PID toolkit be sufficient software to program the first test stand?
    Any example VI's available for this approach (I could not locate any)?
    What is the appropriate controller for a real time system?
    What is the best way to instrument the motor to determine power required at a given test point?
    Any suggestion on a servo amplifier?
    Thanks in advance,
    David
    Attachments:
    B0912-050 Brushless Motor.pdf ‏91 KB

    Mr. Zaatari,
    As you will note, there are six questions posted above.  As you also know, I have been waiting for your response for 5 days and decided to use this alternate method in hope of getting my questions answered.  As you further know you have not answered these questions.   It is a shame that you will  not post your "answers" here so that the rest of this wonderful NI community might have the benefit of your knowledge as well.
    I will respond further to you by email.
    David

  • Can somebody help me develop LabVIEW code for generating pulses to drive an AC servo motor?

    Can somebody help me develop LabVIEW code for generating pulses to drive an AC servo motor? I am using an NI 9401 card... thanks.

    Driving an AC servo motor (I missed "AC" in the previous message) requires some complex hardware. This is generally done by drivers specifically designed for this purpose. I doubt you will be able to accomplish it with the hardware you currently have, and it might be cheaper to just buy a driver for the motor and control the speed through the driver.

  • Time varying velocity input for servo motor

    I'm running a PXI-8145RT CPU with a 7344 motion controller. I want to load a time-varying velocity profile for my servo motor. How can I do this? The "Load Velocity" function does not have an input for such a profile, but only for a maximum velocity that will be reached after a certain time.

    Are you using Motion Assistant or a programming language like LabVIEW? If you are using Motion Assistant, you can do a contoured move with a position-velocity-time profile (see the sketch after this reply for one way to generate such a profile). I have never done this, but it sounds like what you are looking for. This assumes that you have a controller that can do contoured moves.
    Hope that this helps,
    Bob
    Bob Young - Test Engineer - Lapsed Certified LabVIEW Developer
    DISTek Integration, Inc. - NI Alliance Member
    mailto:[email protected]
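    As a rough illustration of the kind of data a position-velocity-time (contoured) move consumes, here is a hedged sketch that builds a trapezoidal velocity profile and integrates it into equally spaced position points. The units and time step are placeholders, and the step that actually downloads the buffer to the 7344 (via the FlexMotion contoured-move functions) is not shown.

    ```python
    # Hedged sketch: trapezoidal velocity profile sampled into (t, v, p) points.
    import numpy as np

    def trapezoid_profile(v_max, accel, total_time, dt=0.01):
        """Velocity and position points for a simple trapezoidal move (example units)."""
        t = np.arange(0.0, total_time, dt)
        v = np.minimum.reduce([
            accel * t,                      # acceleration ramp
            np.full_like(t, v_max),         # cruise
            accel * (total_time - t),       # deceleration ramp
        ])
        v = np.clip(v, 0.0, None)
        p = np.cumsum(v) * dt               # integrate velocity into position points
        return t, v, p

    t, v, p = trapezoid_profile(v_max=1000.0, accel=5000.0, total_time=2.0)
    ```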

  • How to make the servo motor move in steps of set degrees and stop

    How can I make the servo motor move in steps of set degrees and stop?

    Hi,
    I think the following document would be a good starting place: NI Developer Zone Tutorial: Single Axis Moves. It includes links to several example programs that you may find useful for your application. Keep in mind that there are many motion examples that ship with LabVIEW as well.
    I strongly recommend that you check out the following documents as well:
    NI Developer Zone Tutorial: Simple Point to Point Motion
    NI Developer Zone Tutorial: Hands-On Motion
    NI Developer Zone Tutorial: Axis Settings for Motion Controllers
    These tutorials will help give you a good foundation for understanding motion control systems.
    Best wishes!
    Dawna P.
    Applications Engineer
    National Instruments

  • Servo motor

    Hi All,
    Currently I am working on a project that involves programming LabVIEW to control an AC servo motor. I have done some simple simulation and controller design in Simulink, and I wish to implement my Simulink controller in LabVIEW, but I am struggling to find a way to enter my controller in the form of a transfer function. I know there is a Control Design and Simulation add-on toolkit for LabVIEW, but due to a tight budget I could not purchase that add-on. Therefore I am posting this to ask whether there is any other way I can implement and test the controller I designed in Simulink in LabVIEW.
    Thanks.  
    *P.S. The attachment is the basic VI I am using to run/test my servo motor. I need to add a controller in front of it, before the signal is sent to the motor.
    Attachments:
    IP2.png ‏17 KB
    26 April Single Axis.vi ‏23 KB

    From the z-transform you can derive a difference equation (an algorithm) that you can then enter in LabVIEW using the normal functions (shift registers, add functions, etc.). For example, a PD controller might be described in z and then converted to an algorithm as shown below.
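    (The example referenced above appears to have been an image that did not survive. As an illustrative stand-in rather than the original poster's exact derivation: a discrete PD controller D(z) = Kp + (Kd/T)·(1 − z⁻¹) converts to the difference equation u[k] = Kp·e[k] + (Kd/T)·(e[k] − e[k−1]), which looks like this in text form.)

    ```python
    # Illustrative stand-in for the missing example (not the original poster's image):
    # discrete PD controller u[k] = Kp*e[k] + (Kd/T)*(e[k] - e[k-1]),
    # i.e. D(z) = Kp + (Kd/T)*(1 - z^-1). In LabVIEW, e[k-1] would live in a shift register.
    def pd_step(error, prev_error, kp=1.0, kd=0.05, dt=0.001):
        return kp * error + (kd / dt) * (error - prev_error)
    ```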
    You could then enter this algorithm in LabVIEW
    If you are still stuck I recommend you look in a good control text book which will explain it better than I can
    David
    www.controlsoftwaresolutions.com

  • From where on the UMI / Flex-6C should I give the enable signal to the servo motor amplifier/driver?

    I am using an ADS 50/5 Maxon amplifier to run Maxon servo motors with the Flex-6C. The problem I am facing is how to enable the amplifier so that it works only when desired. I have tried connecting the enable input to the inhibit output, but it is not working: before connecting it to enable, the inhibit output shows 5 V, but as soon as I connect it, it drops to 1.5 V. I am giving encoder feedback to the 6C. The motor runs when I connect enable directly to the power supply, but then it runs continuously. Can anybody suggest the correct connections?
    regards
    vkmehta

    If the output voltage of the inhibit output drops to around 1.5 V as soon as you connect it to the enable input of your servo amplifier, you should check the specifications of the amplifier's enable input. It seems that an optocoupler is used, and if you apply 5 V without any series resistance to the input LED of an optocoupler, the input voltage will be clamped between 1.5 and 2 V. The optocoupler may also get damaged due to excessive current through the input LED.
    Check the manufacturer's specifications for the correct input wiring. In most cases a series (current-limiting) resistor is needed, somewhere around 470 ohms (see the sizing example below).
    You should also check whether you are applying the correct logic level to the enable input. Some servo amplifiers need a logic 1 (= 5 V) on the input, some require a logic 0 (= 0 V) to shut down the amplifier.
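    A rough sizing example for that series resistor (the forward voltage and LED current below are generic illustrative values, not ADS 50/5 specifications; use the figures from the amplifier's datasheet):

    ```python
    # Rough series-resistor sizing for an optocoupler enable input.
    # 1.5 V forward drop and 10 mA LED current are generic example values, not ADS 50/5 specs.
    v_supply = 5.0      # V, from the inhibit output
    v_forward = 1.5     # V, optocoupler LED forward drop (example)
    i_led = 0.010       # A, target LED current (example)

    r_series = (v_supply - v_forward) / i_led
    print(f"series resistor ≈ {r_series:.0f} ohms")   # ≈ 350 ohms; a standard 390 or 470 ohm part works
    ```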

  • CompactRIO restart when run VI of NI 9505 and servo motor

    I plan to use the NI 9505 to control a servo motor, and the cables are connected according to the NI 9505 manual (the M+ and M- ports are connected directly to the servo motor). I can read the encoder value from a VI, but when I try to control the servo motor with even a very simple VI, an error occurs and the CompactRIO restarts. When I disconnect the motor from the 9505 module, the voltage between the M+ and M- ports measures 24 V. I think the motor current through the 9505 module may be too high, but why, and how can I solve it? Can anybody help me? Thank you very much.

    If you believe that your motor pulls more than 5A, take a look at this from the NI 9505 product page. It's the first bullet point.
    Continuous current of up to 5 A at 40 °C (or 1 A at 70 °C) at 30 V - for higher power add NI 9931
    The NI 9931 will allow the 9505 to supply up to 7.3A.
    www.movimed.com - Custom Imaging Solutions

  • How to measure current on DC servo motor.

    Need to measure current on DC servo motor. What is the best way to measure the current on the motor as it moves to position? Using PXI-7344 controller and MID-7654 drive unit.  

    This is not an easy task. Here are some ideas:
    Direct current measurement:
    Add shunt resistors to the motor cabling and measure the voltage across the resistors. As the 7654 doesn't provide DC current but PWM current (32 kHz), you will have to acquire the data pretty fast (at least at 200 kHz). This could be done with a DAQ board like the PCI-6220. Additionally, you will have to do some math to calculate the RMS value of the current (see the sketch after these options).
    Control voltage measurement:
    You could measure the output voltage of the 7344, as it is proportional to the duty cycle of the PWM current signal of the 7654. You would also need some additional measurement hardware, but you wouldn't have to use shunt resistors and RMS calculations. The major disadvantage of this option is cabling: there is a single cable connection between the controller and the drive, so you would have to use e.g. two external connector blocks and an additional cable in order to connect your measurement hardware to the voltage output of the 7344.
    Torque measurement:
    Torque is also proportional to motor current, so you could add a torque sensor to your motor and measure its output signal with e.g. a PCI-6220. Depending on your hardware setup this might be the best option.
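    Here is a minimal sketch of the RMS math for the shunt-resistor option (the shunt value is a placeholder, and the synthetic data merely stands in for a fast acquisition buffer from the DAQ board):

    ```python
    # RMS current from shunt-resistor voltage samples (example values only).
    import numpy as np

    def rms_current(voltage_samples, r_shunt=0.1):
        """RMS current from shunt voltage samples; r_shunt is an example value."""
        current = np.asarray(voltage_samples) / r_shunt   # Ohm's law per sample
        return np.sqrt(np.mean(current ** 2))             # RMS over the acquisition window

    # Synthetic data standing in for a fast (e.g. 200 kS/s) acquisition buffer:
    t = np.arange(0, 0.01, 1 / 200_000)
    fake_pwm_voltage = 0.2 * (np.sign(np.sin(2 * np.pi * 32_000 * t)) + 1) / 2  # 32 kHz square-ish PWM
    print(rms_current(fake_pwm_voltage))
    ```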
    I hope that helps,
    Jochen Klier
    National Instruments Germany

  • Servo motor using parallel port

    Hello,
    I have an XY mirror-scan servo motor used for an optical scanning experiment.
    The question I have is:
    I already have a program to control a stepper motor via the parallel port. Can I use the same program to control a servo motor?
    Or is the SERVO motor concept different from that of a STEPPER motor?
    Thank you
    Abhilash S Nair
    Research Assistant @ Photonic Devices and Systems lab
    [ LabView professional Development System - Version 11.0 - 32-bit ]
    LabView Gear:
    1. NI PXI-7951R & NI 5761
    2. The Imaging Source USB 3.0 monochrome camera with trigger : DMK 23UM021
    OPERATING SYSTEM - [ MS windows 7 Home Premium 64-bit SP-1 ]
    CPU - [Intel Core i7-2600 CPU @ 3.40Ghz ]
    MEMORY - [ 16.0 GB RAM ]
    GPU - [ NVIDIA GeForce GT 530 ]

    You will need a DAQ card that can generate the voltage needed to send a command signal to the 671. The 671 will need to be tuned to the 6880 with whatever size mirror is attached. (If you bought the galvo and servo driver as a package, it should already be tuned.)
    CTI systems take an analog command from -10 to +10 Volts. Almost all NI DAQ cards (and many other brands) output ±10 Volts, so that will be easy.
    Then you will need to decide how to scan your target. A ramp pattern or triangle wave is the usual choice for scanning objects, so you need to generate that in LabVIEW code along with the code that reads your sensor (a small waveform-generation sketch follows this reply). This should be done simultaneously, but you really don't need a very expensive DAQ card to accomplish it. Look on the NI website for options in your price range and do some research.
    Is your system one axis (one 6880 and one 671)? If so, you will scan a raster (ramp or triangle) to measure a single line of light intensity, move the stage a tiny distance, and scan another line. When you put all the lines together into a 2D image, you will have a representation of one face of your object. Many people use a rotary stage to spin the object while scanning to assemble a 3D model of the object. This is a bit more complex, of course.
    Using LabVIEW: 7.1.1, 8.5.1 & 2013
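    Following up on the ramp/triangle suggestion above, here is a hedged sketch (Python with the nidaqmx package; the device name, update rate, and amplitude are placeholders) that builds one period of a triangle wave and regenerates it continuously on an analog output. The same pattern maps directly onto the DAQmx VIs in LabVIEW.

    ```python
    # Hedged sketch: continuous triangle-wave output on one AO channel.
    # "Dev1/ao0", the 1 kS/s update rate, and the ±5 V amplitude are placeholders.
    import numpy as np
    import nidaqmx
    from nidaqmx.constants import AcquisitionType

    rate = 1000                      # samples per second
    period_samples = 200             # one triangle period = 200 samples (5 Hz scan)
    amplitude = 5.0                  # volts

    # One period of a triangle wave from -amplitude to +amplitude and back.
    half = np.linspace(-amplitude, amplitude, period_samples // 2, endpoint=False)
    triangle = np.concatenate([half, half[::-1]])

    with nidaqmx.Task() as ao_task:
        ao_task.ao_channels.add_ao_voltage_chan("Dev1/ao0")
        ao_task.timing.cfg_samp_clk_timing(rate,
                                           sample_mode=AcquisitionType.CONTINUOUS,
                                           samps_per_chan=len(triangle))
        ao_task.write(triangle)      # buffer is regenerated continuously by default
        ao_task.start()
        input("Scanning... press Enter to stop.")
    ```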

  • Control a servo motor through LabVIEW

    I want to control a servo motor through LabVIEW. I have the following hardware: servo motor, Electro-Craft BDC-12, S8VM-15024 CD power supply, NI PCI-7831R, and SCB-68 connector block. I am new to the field and I have no idea how to connect the amplifier to the connector block and create a program in LabVIEW to control the motor. Thank you for your help.

    reposted here: http://forums.ni.com/t5/LabVIEW/I-want-to-control-a-servo-motor-thru-labview/td-p/1651742

  • Tuning servo motor to withstand external force of another servo motor

    Hi,
    I have two brushless DC servo motors, each connected to its own third-party amplifier operating in torque mode. These motors control a reel-to-reel system, where a metal ribbon travels from one reel to the other. One amplifier is connected to the NI 9516, where I am using SoftMotion to control its velocity. The other amplifier is set at a constant torque to take up the ribbon.
    The problem I am having is with tuning the velocity-controlled motor to withstand the torque caused by the take-up motor. I need the motor to operate at a constant velocity while withstanding the forces caused by the pull of the other motor. So far I have been unsuccessful at tuning it to counteract that external force. I was hoping to have SoftMotion control the motor's velocity while the motor itself is set in torque mode, but it seems I have to set the motor to velocity mode to solve my problem. Is that the only answer, or is there another way?
    Linus

    Just to let you know, I have already solved the problem. It seems that my initial settings for the servo amplifier were incorrect and I was providing too little power to the motor. It was set to a low power because it was making a noise when set to a normal value. I eventually learned that the source of the noise was not the power level, but that the initial gain tuning parameters were not set correctly. When I changed the tuning parameters, I was able to supply more power to the motors, which gave them enough torque to withstand external forces.
    I initially used the gain tuning values recommended by the Getting Started with the AKD EtherCAT Drives guide, even though I was not using the same brand of motor. I thought the suggested tuning parameters would apply to my brushless servo motor, but it turns out that's not the case.
