Understanding velocity control on 9505 FPGA VI

I was able to quickly use the 9505 Servo Drive example to do closed-loop position control on the FPGA, but I'm having trouble figuring out how to modify the code to support velocity control (which I need for a different axis).  I'm trying to make my modifications on the FPGA VI itself rather than on the RT SoftMotion side.
Here are the issues I'm having:
If I use the PI Velocity Loop with my velocity and an arbitrary setpoint velocity, I can see the velocity stabilize (motor output looks uniform), but I can't correlate it with the velocity setpoint other than that a bigger number is faster.  Velocity (from the Encoder loop) seems to read back in values of just 0-2 on my FPGA VI (I don't fully understand the units, but I see my position change on my encoder readback in thousands of counts per second).  Why are they so different, and how do I scale my encoder velocity correctly?
Using just the shipping 9505 Servo Drive example (position control), how do I set the velocity of a move from within the FPGA code (in interactive mode)?

I was able to get my velocity readings to match my velocity setpoints in velocity control mode by increasing my Timer Config.Target Period and the Encoder 'calls per sample encoder' from 2,000 to 40,000.  Now my velocity readback is within +/- 1 of my velocity setpoint across my drive range.  This makes sense to me now: if you read the encoder too quickly, it hasn't accumulated enough counts to accurately calculate the velocity.  It was in the documentation, but I didn't fully understand it.
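The scaling behavior described above can be checked numerically: the FPGA velocity reading is the number of encoder counts accumulated in one sample period, so a short period yields tiny, coarsely quantized values. A sketch with assumed numbers (the 40 MHz base clock is the usual cRIO FPGA default; the tick counts mirror the 2,000 to 40,000 change above):

```python
# Velocity as seen by the FPGA loop is counts accumulated per sample
# period, so a short period yields tiny, heavily quantized readings.
# Hypothetical numbers for illustration, not taken from the project.

FPGA_CLOCK_HZ = 40_000_000  # assumed 40 MHz FPGA base clock

def counts_per_sample(velocity_cps, target_period_ticks):
    """Encoder counts accumulated in one sample period."""
    sample_period_s = target_period_ticks / FPGA_CLOCK_HZ
    return velocity_cps * sample_period_s

# At 2,000 ticks (50 us) a 40,000 counts/s move accumulates only ~2
# counts per sample, matching the 0-2 readback described above.
print(counts_per_sample(40_000, 2_000))   # 2.0
# At 40,000 ticks (1 ms) the same move accumulates 40 counts,
# enough resolution to correlate readback with the setpoint.
print(counts_per_sample(40_000, 40_000))  # 40.0
```

In other words, lengthening the sample period trades loop rate for velocity resolution, which is why the longer period made the readback track the setpoint.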
So now I'm down to one question (from my last post):
I still don't understand how to set the velocity of a move when using position control (the out-of-the-box example) from within the FPGA code (on the 9505 example).  Is the best way to do this just to limit the 'Current Output Range' depending on the max velocity requested?  The NI SoftMotion RT VIs and Express VIs are able to set 'Velocity', but I'm really unclear on what the SoftMotion RT VIs are actually doing in terms of interacting with the FPGA code.  Can you explain how that works?

Similar Messages

  • Closed loop velocity control based on load cell force

    Hello,
    My application is for a drill, that drills into rocks of various densities for the purpose of collecting rock core samples.
    My setup has 2 motors which get controlled, one spins the drill bit at a constant velocity, the other moves the drill mechanism along a Z axis.
    For efficient cutting, it is desired to apply a constant force between the drill bit and the rock.  I have a load cell which measures the force being exerted by the drill bit on the rock surface, and this force can be adjusted by changing the velocity of the Z axis.  So I would like to employ closed loop control to adjust the Z axis velocity to maintain a constant force on the rock.
Platform: cRIO-9073 with NI 9505 & 9215 modules, LabVIEW 2010 Full with RT and FPGA modules.
    The load cell is by Transducer Techniques, and I use their TMO-1 module to condition the signal, the output of which is attached to an input of the 9215 module, where 0-100 lbs equates to 0-8VDC.
    The motors and encoders for the Drill and Z axis are connected directly to the 9505 modules.
Right now I am using a modified version of the example found in ...\examples\CompactRIO\Module Specific\NI 9505\Velocity Control (closed loop)\Velocity Control (closed loop) - NI 9505.lvproj to accomplish velocity control of the motors.
    My questions are:
    1) Do I have the appropriate NI hardware/software for this task?
    2) With my current hardware setup, what would be an appropriate way to control my Z axis velocity rate based on Analog feedback from the load cell?
    3) Development time is a critical factor, so are there any toolkits etc that are easy to get started with that would drastically decrease my development time, or do I already have everything I need?
    Thank you for your time
    -MK Hokie

1. Your hardware and software look appropriate, assuming the motors are compatible with the 9505s.  You didn't mention the NI SoftMotion Module in your software list, which is something you will need.
    2.  There are a few ways of doing this.  One method would be to have a force PID loop that would attempt to maintain a force setpoint by directly outputting values to your torque loop.  In this case, the drill would essentially move as fast as it could while maintaining the force setpoint.  Another option is to have the force loop output a velocity setpoint.  You would then have a velocity PID loop that outputs torque values to the torque loop.  By adding this additional velocity loop you could have control over your maximum and minimum velocities.  There are likely other alternatives as well, but these are the first two that come to mind.
    3.  Unfortunately there are no shipping examples that close the loop on force feedback.  My advice would be to start with the NI 9505 shipping example and adapt it to your needs.  There are quite a few things you will want to change though.  Do you know if you will need to use the trajectory generator to move the drill into position before starting the force control?
    Assuming you don't need any trajectory generation, you can scrap the entire RT portion of the NI 9505 example and just create the necessary FPGA code.  On the FPGA, you won't need the Spline or Synchronization code either because this information would no longer be coming from RT.   You could take these out and replace the position loop with a force loop and possibly a velocity loop and your FPGA program would basically be finished.  In fact the only real motion IP that you will need is for the encoders (assuming you want velocity control) and PID.  Then you would need to create an RT VI that allows you to send down enable, disable, PID gains, and setpoints. 
If you do need trajectory generation, you would want to keep most of the example code the way it is, but then program in a 'Force Mode' that utilizes the force and velocity control as described above.  You could think of it as having two different routines programmed side by side.
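The cascaded scheme from point 2 (an outer force loop producing a velocity setpoint for an inner velocity loop that outputs torque commands) can be sketched as two nested PI controllers. A minimal illustration with made-up gains and limits, not the actual FPGA motion IP:

```python
class PI:
    """Simple PI controller with output clamping (illustrative gains)."""
    def __init__(self, kp, ki, out_min, out_max):
        self.kp, self.ki = kp, ki
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0

    def step(self, setpoint, feedback, dt):
        error = setpoint - feedback
        self.integral += error * dt
        out = self.kp * error + self.ki * self.integral
        return max(self.out_min, min(self.out_max, out))

# Outer loop: force error -> velocity setpoint (clamped to safe speeds).
force_loop = PI(kp=0.5, ki=0.1, out_min=-1000.0, out_max=1000.0)
# Inner loop: velocity error -> torque (current) command.
velocity_loop = PI(kp=0.8, ki=0.2, out_min=-2.0, out_max=2.0)

def control_step(force_sp, force_fb, velocity_fb, dt):
    """One iteration of the cascaded force -> velocity -> torque loop."""
    velocity_sp = force_loop.step(force_sp, force_fb, dt)
    return velocity_loop.step(velocity_sp, velocity_fb, dt)
```

Clamping the outer loop's output is what provides the maximum/minimum velocity control mentioned above.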
    Regards,
    Burt S

  • 7342 velocity control

Refer to the question:
In the 7342's user manual, it is recommended to use the current loop in the drive and the velocity or position l...

    Okay. I see what you're talking about. I'm including the text here so I don't have to look it up again. It appears that the difference between closing the torque (current) loop and the velocity loop on the motor drive itself is that the parameters used for PID control of velocity set up in MAX for the motion controller wouldn't be able to be utilized. If you want to have that nice velocity control through the software on your PC rather than through whatever interface you might have through the motor drive, the velocity loop needs to be closed back on the controller and not externally on the drive.
    Hopefully this is helpful to you. Best of luck.
    Jim
"National Instruments recommends that when you use the UMI-7774/7772 with a servo motor drive, you use the torque (current) mode option on the drive instead of the velocity mode to simplify the system setup and tuning. In torque mode, only the current loop is closed on the drive. Both the position and velocity loops are closed on the motion controller using encoder feedback. When you use torque mode, you can set all of the control gains with the National Instruments motion controller software. Using torque mode on the drive also reduces noise and enables you to achieve high position repeatability in the motion system."

  • PCI-7344 Servo Velocity Control

    Hi,
    I have several servos to control fast changing velocity with ComponentWorks++.
    The samples included in samples directory take the following steps:
1. flex_load_rpsps
2. flex_load_rpm         ; velocity
3. flex_load_target_pos  ; position
4. flex_start
For velocity control, is it correct to call flex_load_rpm and flex_start repeatedly while the servo is running, whenever a new velocity should be given?
And when the position is not important (because of the velocity control and the very long travel length), what value can be given to the flex_load_target_pos function?
Can the flex_load_target_pos function also be overridden while the servo is running, like the velocity?

You should call flex_set_op_mode and set your controller to velocity mode. This way you don't have to load a target position (flex_load_target_pos).
Yes, you can repeatedly call flex_load_rpm and flex_start to change velocity on the fly.
    Hope this helps.
    Ken
    Applications Engineer
    National Instruments

  • Velocity control, DC motor

    Hi,
Please refer to my attachment. May I ask how to convert my output variable (after coming out of the PID VI, marked by the biggest arrow) to a PWM duty-cycle duration? The scaling I have done doesn't seem correct.
    [This code is modified from the 9505 example, but I am not using the 9505, and I am not doing the position nor current loop. My hardware set up is just a DC motor with a full-H-bridge circuit, a linear encoder and an fpga.]
    Thank you.
    Attachments:
    velocityControl-code.JPG ‏178 KB

    Brian,
Thank you for the suggestion about the speed controllers. Can you give me the names of the manufacturers of some DC motor speed controllers?
Buying an off-the-shelf solution is preferred over designing a custom circuit for my application.
As for your idea about using stepper motors, I am trying to control existing assemblies that already have the DC motors embedded in them. What I am doing is trying to assist the M.E.s in adding control to their proof-of-concept models, and we do not want to spin a custom circuit board and write custom software for a microcontroller for each concept model.
    Thanks again for your suggestion.
    --Jon
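On the original duty-cycle scaling question, one common approach (sketched here with assumed ranges, not the poster's actual PID output span) is to map the magnitude of the signed PID output linearly onto the PWM high-time and use the sign to select the H-bridge direction:

```python
PWM_PERIOD_TICKS = 1000   # assumed FPGA ticks per PWM period
PID_OUT_MAX = 1000.0      # assumed full-scale PID output magnitude

def pid_to_pwm(pid_out):
    """Map a signed PID output onto (direction, duty ticks) for an H-bridge."""
    clamped = max(-PID_OUT_MAX, min(PID_OUT_MAX, pid_out))
    direction = clamped >= 0                      # sign -> bridge direction
    duty_ticks = int(abs(clamped) / PID_OUT_MAX * PWM_PERIOD_TICKS)
    return direction, duty_ticks

print(pid_to_pwm(500.0))   # (True, 500)  -> forward, 50% duty
print(pid_to_pwm(-250.0))  # (False, 250) -> reverse, 25% duty
```

The key is that both the PID output range and the PWM counter period must be known so the full-scale output maps to 100% duty; a mismatch between those two spans is the usual cause of scaling that "doesn't seem correct."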

Understanding I don't have my 4S, becoming understanding I may be getting the shaft from Best Buy?

    So I have to vent.(guess all we can do before our phones come.)
    I went to my local bestbuy on the 7th to "PRE ORDER" 2 sprint iphone 4s. A 16 for my brother for his birthday,and a 32 for myself. I pre ordered both phones within a minute apart. Litterly. Receipt shows the 16 at 10:18 am,and the 32 at 10:19 am. I was only one to pre order a 32. I read all about the issues,so me and my brother decide to get to the store on the 14th to see if they came in. NONE. It was a shot. I get a call at 10:30 on launch day. It is the BBM manager saying the 16 came in but the 32 did not. Cool. I understand. We go in,and we are in and out. No hiccups. Brother got his phone and we left.
    I was told it would be in no later than Saturday the 15th. I understand. Apple has a finicky way of controlling there market profits. The 25th passes,and still no phone. I understand. I went to Tennessee to see my in laws. I left the 16-20 no calls saying phone had come in. I even checked there local BB to see if they had any. As you all can imagine they didn't. They received the iphone 4,but no S. So,now I am leaning towards BB favor. Hey, my store didn't get them,and neither did another state. I understand.
    I get an email on the 21st. Saying sorry your pre order is not available,but BB is hoping to have all pre orders fulfilled in 21 days. I understand. I get another email the 23 saying my pre order is unavailable to ship. This is where I start to not be so understanding. How,do I get one pre order phone,and not another,after almost 10 days. Mind you,I have been calling my local store everyday,and I am being told sorry we only have 16 or 64 white no blacks. I even went as far as to call and say I received an email saying my phone was in. That is when the BBM manager says they have 16 and 64 in. I was not offered either one. I call the 1888 number. Lady says sorry I am not seeing any inventory from any store. I was like OK, how does corporate not know inventory?
    So it is now the 24th,and I still have no 32 black sprint iphone 4s. I am know understanding I may be getting the shaft. I am finding it so hard to believe the store has not received one,not one 32 black for sprint in 17 days. Again,this is after I am told them get shipments everyday. But,guess what, my store still has 16 and 64 coming in. I just do not get it. I have one a 16,that is sold out everywhere,and I am still waiting on a phone that was not "sold out" until 5 days after I pre ordered mine.
    So as I stated,I am understanding I may be getting the shaft,run-around. Whatever,you want to call it. But, I bet if a higher person looked into this,they would see BBM has probably sold a number of 32 sprint iphone 4s. Thanks all for letting me vent. Just my overall impression of BB has gone down.
    I also want to add,this is not a bashing of BB,BBM,or even Apple. I know I want to hottest phone released. I know I may have to wait,but to tell me you have not gotten a single 32 in,17 days after pre order. After, I am told shipments come in everyday,I find that hard to believe.

I've got the same story. I pre-ordered the black 32 GB 4S from Verizon on the 7th, just like you. And just like you, apparently my BB has not received a single 32 GB Verizon 4S in 10 days.

Understanding DAQmxTaskControl()

    Hi all , 
I have a few questions about the DAQmxTaskControl() function:
1. What's the difference between DAQmxStartTask(gAItaskHandle) and DAQmxTaskControl(gAItaskHandle, DAQmx_Val_Task_Start)? Is there any difference in the time they take?
2. What's the use of DAQmxTaskControl(gAItaskHandle, DAQmx_Val_Task_Commit)? I see it makes my function run about 40 ms faster, but does anybody know how that's done and what its purpose is?
    Kobi Kalif
    Software Engineer

    Hi Kobi,
If we look at the function help for DAQmxStartTask and DAQmxTaskControl, we find the following descriptions. To better understand them, see the state-machine diagram in the Performance section of the NI-DAQmx FAQ.
    DAQmxStartTask
    "Transitions the task from the committed state to the running state, which begins measurement or generation. Using this function is required for some applications and optional for others."
    DAQmxTaskControl
    "Alters the state of a task according to the action you specify. To minimize the time required to start a task, for example, DAQmxTaskControl can commit the task prior to starting."
Basically, when you use DAQmxTaskControl you are explicitly telling the DAQmx driver what state to be in. In this case you are telling it to transition to the committed state instead of waiting for DAQmxStartTask to make that transition for you. Task control gives you more advanced control over the state of your acquisition, so it makes sense that it would improve your performance.
    If you need more information on the different states, I would recommend the NI-DAQmx help file on your computer. Start Menu » All Programs » National Instruments » NI-DAQ » NI-DAQmx Help - search for "task state model". 
    Jake H | Applications Engineer | National Instruments

  • Servo motor control using CRIO+FPGA and 9477 digital out module

    Hello experts,
I have a Futaba BLS551 brushless-motor digital servo (3 wires: +, -, signal). I also have a cRIO + Real-Time + FPGA and a 9477 digital output module. How can I generate servo signals using this module?
Please help...
    Thanks,

    freemason,
In order to control your servo motor with the FPGA and/or DIO module, you will have to write drivers to control your motor and drive.  While this is possible, it is an extremely complicated and time-consuming process.  I would highly recommend you consider using the NI 9514 with SoftMotion, as it will provide full servo functionality and is relatively easy to use.
    Regards,
    Sam K
    Applications Engineer
    National Instruments
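For reference on the "servo signals" the question asks about: a hobby servo like the Futaba BLS551 is typically commanded by a 50 Hz pulse train whose 1.0-2.0 ms high time encodes position (assumed standard RC-servo timing; check the servo's datasheet). The angle-to-pulse math that an FPGA PWM loop would have to implement can be sketched as:

```python
# Assumed standard hobby-servo timing: 20 ms frame (50 Hz),
# pulse high time 1.0-2.0 ms mapping linearly to 0-180 degrees.
FRAME_MS = 20.0
MIN_PULSE_MS, MAX_PULSE_MS = 1.0, 2.0

def angle_to_pulse_ms(angle_deg):
    """Pulse width in ms for a commanded angle in [0, 180] degrees."""
    angle = max(0.0, min(180.0, angle_deg))
    return MIN_PULSE_MS + (angle / 180.0) * (MAX_PULSE_MS - MIN_PULSE_MS)

def angle_to_duty(angle_deg):
    """Duty cycle of the 50 Hz frame for a commanded angle."""
    return angle_to_pulse_ms(angle_deg) / FRAME_MS

print(angle_to_pulse_ms(90))  # 1.5 (center position)
print(angle_to_duty(90))      # 0.075
```

On a 9477 this would mean toggling one digital line with that timing in an FPGA loop, which is exactly the low-level driver work the answer above warns about.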

  • How to accomplish Torque and velocity control of an axis using NI motion controller

I used a torque transducer as the primary feedback, and I am able to control the torque. I used encoder feedback as the secondary feedback to control velocity. But when I run the axis at 5 rpm, I observe velocity variations of more than 200%. What changes should I make to control both torque and velocity accurately?

    Hello,
    Here are a couple of suggestions that might help:
    - Tune the PID parameters so that step response will be overdamped
    - Increase the PID loop update period
- Reduce the noise on the incoming analog signal. The 7350-series controllers are better for this because they have 16-bit ADC resolution versus 12-bit on the 7340 series.
    Hope this helps! Let me know if you have any questions regarding this.
    Best regards,
    Yusuf Cavdarli
    Applications Engineer
    National Instruments

  • Velocity control finding home

Hi, I've got some problems with a motion control function.
I attached a simple VI to find the home on the motor axis. When the stepper motor finds the home, it slows down immediately. I'd like to control the deceleration; I do not want it so rough. Is it possible to control this deceleration rate from the VI?
One more question: how can I read the clamps of the "limit switch terminal block" of my MID-7602?
    Thanks.
    Ciao.
    Attachments:
    Find_Home.vi ‏45 KB

    Hi,
I think you have to set the "Approach Velocity %" parameter; the following link explains it and shows you where it is in MAX. If you want to change it programmatically in LabVIEW, you have to use "Load Reference Parameter.flx".
To read the values of the limits you can use "Read Limit Status.vi".
    Best regards
    AlbertoA

  • Position control with 9505

Hi, can anyone help me with position control of a DC motor using the 9505 card? The motor should stop at whichever position I give.

    Hi,
    I would suggest you start by looking at LabVIEW shipping examples.  These can be found using the example finder in LabVIEW.
    Also, please elaborate a little more on what specifically you need help with.
    Regards,
    Greg H.
    Applications Engineer
    National Instruments

  • Understanding Fan, Fan Control, and Heating Intel iMac

Greetings all, I'm a recent convert from the world of Windows to Mac, and I've enjoyed it more than you can imagine. I'm a network admin at a major airport and have come to hate going from working on Windows during the day to working on Windows at home. Although I wish no harm to Windows... it keeps me with a job. >:)
    I'm going to show my age here so don't laugh too much...
    I have trouble understanding the heat fans inside my iMac.
    I do play video games, namely WoW (Yes still a young'n).
    When I first got my iMac (3 months ago) and started playing, I noticed the back panel would get extremely hot to the touch. I was worried because I never heard the fans kick up past their default speed.
I read that the iMac handles heat by passing it to the back panel to allow for ambient air cooling (love the concept). But being a Windows user for years taught me to check my temps. I downloaded iStat Pro, and my CPU while playing was at an alarming 60 deg Centigrade.
I checked for blockages in airflow and checked to make sure my fans were running, which iStat showed at 1200 RPM @ 60 deg. The machine idles at about 38-39 deg, also at 1200 RPM.
    I did get this iMac used to cut costs, wiped it clean and put a fresh install of OSX.
    My question is: **Are the speeds factory fixed and if not, at what temp do the Fans kick into overdrive and start pushing some air?** 60deg is dangerously hot in such a compact unit in my opinion.
Note: I did install and currently use smcFanControl to max out my fans when playing games to keep the temps down, so I know the fan speeds can be changed manually. But shouldn't the firmware inside the computer auto-adjust fan speeds based on temperature?

Things have really changed... or the quality of the system is that much better than my old Shuttle FragBoxes. 60 deg on my Athlon XP was within the limits but would spell premature age failure running it that hot. It's been a while since I had a true Intel product; when I was building PCs I couldn't afford them.
It's good to know that the fans will be auto-increased by the firmware. I suspected as much but just didn't know 100%, so thanks for the response.
Ideally, from a personal perspective, I would like to keep my heat below 59 deg, only because if the CPU is that hot, the ambient air inside the case is warm, and I do not want premature failure of the bridges, HDD controller, or any other component prone to heat failure. I'll take smcFanControl off the startup processes and just manually switch it on and off when playing games. I can replace a fan cheaper than replacing the whole unit.
    Thanks for your responses.

Understanding TestStand Debug Deployment License

    After reading the License Agreement and the TestStand Help section on licenses I still would like an explanation as to exactly what the license does and does not allow. My understanding is with a TestStand Debug/Deployment license I can install TestStand on a machine, let's call it the development machine, and also install TestStand on another machine, my production machine. The development machine is where I would develop applications to deploy onto my production machine. The production machine is where I would actually test my products. The production machine can also be used as a platform to debug my deployed applications. If this is true then can I develop any additional tests on the production machine, e.g. I found a problem in the sequence so I need to add a new sequence to correct the error? Or I found a problem with a LabVIEW VI and I need to create a new VI to correct it. Can I develop VI's and sequences on the production machine or do I need to develop on the development machine transfer the files over to the production machine and hope it works (this type of process could prove to be very time consuming and tedious)?

    Thanks for the info. Now what about LabVIEW. Here are some options I'm considering:
1. Run a LabVIEW Run-Time Engine on the production machine and an FDS of LabVIEW on the development machine, and in turn run a TestStand Deployment license on the production machine. This seems like the most cost-effective method. The problem I'm having is that some VIs are developed in LabVIEW 6i, 7.0, and 7.1, and the LabVIEW 7.0 Run-Time Engine does not work with TestStand 3.1; it seems only the LabVIEW 7.1 Run-Time Engine will work with TestStand 3.1. But I don't have a LabVIEW 7.1 development license (not sure how the 7.1 VIs came to be; before my time). I've also thought of using standalone LabVIEW applications, but the only license I have for Application Builder is from LabVIEW 6i PDS from the Developer Suite. If TestStand 3.1 is supposed to work with any LabVIEW Run-Time Engine, then I'm doing something wrong.
    2. Run a LabVIEW Development/Deployment license with a TestStand Development/Deployment license. This is the most expensive method but the easiest to implement.
    3. Run a LabVIEW FDS and a TestStand Deployment license on the production machine.
A final question: does NI make available a TestStand Run-Time Engine as a free download, similar to the LabVIEW Run-Time Engine?

Understanding the webcam.dll file

Hi there everyone, I am trying to learn how to use the Call Library Function Node in LabVIEW, and this would be my first time doing so.
I am looking at an example online for USB and webcams.
I found this example with WEBCAMGRAB.DLL.
I looked into the block diagram and I see 3 inputs.
The reason I know this is because the block diagram said so.
Let's say that I don't have the block diagram and only the WEBCAMGRAB.DLL file. How would I know how to use the file in the Call Library Function Node?
How do I know that the int32s are driver, width, and height?
    Best regards,
    Krispiekream
    Solved!
    Go to Solution.
    Attachments:
    untitled.PNG ‏36 KB
    WebcamGrab.zip ‏15 KB

Thank you, smercurio_fc and jmcbee, for your help on this issue.
It's always a challenge learning new things.
I did get a chance to read over the document you gave me, and the example is really easy.
What if I don't have the .h file?
Do you have examples of using a .DLL with hardware?
I got the example somewhere on this forum... it's been 5 years since I took my last C programming class...
Here is the DLL.H:
#ifndef _DLL_H_
#define _DLL_H_
#include <windows.h>  /* needed for CHAR; the header as posted omits this include */
#define EXPORT __declspec (dllexport)
EXPORT long SetUp(long driver, long width, long height);
EXPORT long GetDigitizer(long DigitizerID, CHAR *str);
EXPORT void Grab(unsigned char *LVPict);
EXPORT void ShowDialog(long WhichDlg);
#endif
    Best regards,
    Krispiekream
    Attachments:
    dll.c ‏6 KB
    dll.h ‏1 KB
    test.vi ‏34 KB

What am I doing wrong / not understanding??!

Still having strange Outlook issues. Hi guys, this is my first post on this message board! Glad to be a part of these forums; I am on CrackBerry as well and have posted this there. I've been having a strange issue that I can't seem to get any info about. It has to do with my Outlook Exchange 2010 (ActiveSync) account at work.
The problem is very strange: basically, if I get an email in my Outlook inbox and look at it but don't reply right away, then that email vanishes into thin air. I can still see the email in my inbox on my PC, but when I search my inbox, deleted, and sent folders on my Z10, nothing is there. However, if I reply to the email from my Z10, then voila, there it is, saved in my inbox and my Hub, simply because I replied to it. What can this be?
I've tried syncing 7, 14, and 30 days, and deleting the account and re-adding it, and nothing. My Hotmail works just fine, and I asked the IT guy at my work and he has no idea, yet my Lumia 920 works fine with the Outlook at my work. This is driving me nuts, and I need to get any information I can, since I can't always reply to emails right away even if I check them. They can't just delete or disappear from the phone!!! Thank you ahead of time, guys, despite this sounding stupid.
On a side note, I thought I would mention that in the Outlook settings, "keep a copy of emails" is checked, and I wiped the phone and started from scratch yesterday and this is still happening. I am losing all patience, since my work email is very important to me and I love this phone, but this cannot remain. Any feedback would be great to try and guide me in the right direction, please!
    Julian

    So to be clear, the emails remain in Outlook but are disappearing from the Z10?
    How is Outlook connecting to the email server?
EDIT: The symptoms really seem to suggest that the emails are somehow being removed from the inbox on the Exchange server by Outlook. The Z10 should only be reflecting what is on the server... I don't think it can do anything else.
    -: DrewRead :-
