NXT ROBOT Question

Can anybody help with the question below, please?
Using your LabVIEW interface, input different values of power and record the output value of the number of wheel rotations. You can convert the latter into speed by measuring the diameter of the wheel. Then type 'ident' in the MATLAB command window to launch a graphical user interface for system identification. Using the data collected for input power and recorded speed, you can derive the transfer function of your robot in the Laplace domain.
You will notice that it is possible to approximate the transfer function by a first-order system.
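Outside the ident GUI, the first-order fit can be sanity-checked with the standard 63.2% method: the DC gain is the steady-state speed divided by the power step, and the time constant is the time at which the speed reaches 63.2% of its steady-state value. The sketch below is a rough, non-authoritative illustration only; the logged speed values, power step and sample time are made-up placeholders, and your real data from the LabVIEW log should be used instead.
// Sanity-check of a first-order fit G(s) = K / (tau*s + 1) from logged step-response data.
// Assumes a constant power step 'u' applied at t = 0 and speed samples 'y' taken every 'dt' seconds.
public class FirstOrderFit {
    public static void main(String[] args) {
        double u = 60.0;                        // applied power step (placeholder units)
        double dt = 0.05;                       // sample time in seconds (placeholder)
        double[] y = {0.0, 1.8, 3.3, 4.5, 5.5, 6.3, 6.9, 7.4, 7.8, 8.1,
                      8.4, 8.6, 8.7, 8.8, 8.9, 9.0, 9.0, 9.1, 9.1, 9.1};   // speed in m/s (placeholder)

        double yss = y[y.length - 1];           // steady-state speed
        double K = yss / u;                     // DC gain of the first-order model

        // Time constant: first sample where the response reaches 63.2% of steady state.
        double target = 0.632 * yss;
        double tau = Double.NaN;
        for (int i = 0; i < y.length; i++) {
            if (y[i] >= target) { tau = i * dt; break; }
        }
        System.out.printf("Estimated model: G(s) = %.4f / (%.2f s + 1)%n", K, tau);
    }
}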
Add a disturbance to your system, such as wind drag, road profile, or friction. You might want to model the disturbance as a sinusoidal waveform.
Analyse the system response in LabVIEW in order to tune the response.
Finally, write a LabVIEW program to test your controller.
Set both the Integral and Derivative gains to 0.
Increase the Proportional gain through the values 0.5, 0.8, 1, 1.1, 1.3, and 1.5.
Select a suitable simulation stop time.
Compare the results of each simulation and draw some conclusions.
Record the steady-state error. Has it dropped to near zero?
Record the rise time. Has it decreased to less than 0.5 seconds, and for what values of Kp?
Check whether this response is realistic, i.e. a real cruise-control system generally cannot change the speed of the vehicle from 0 to 10 m/s in less than 0.5 seconds.
Adjust the gain (Kp) to give a reasonable rise time, and add an integral controller to eliminate the steady-state error.
Change both Kp and Ki and see what happens to the response. When you adjust the integral gain Ki, start with a small value, since a large Ki can destabilise the response (Ki = 0.001, 0.002, 0.003, 0.004).
Then adjust both the proportional gain Kp and the integral gain Ki to obtain the desired response. Record the values of Kp and Ki that meet all the design criteria.
Write the PID-feedback control LabVIEW program as described above.
Check whether your system can stabilise the speed to within 1%. What are the optimal PID parameters that you have found? Can you choose PID parameters that both reach the set speed quickly and maintain the speed accurately once the system has reached the set speed? Can you control the speed in small steps?
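Before wiring this up in LabVIEW, the Kp sweep and the effect of adding a small Ki can be previewed with a rough discrete-time simulation like the sketch below. The plant gain, time constant, setpoint and gain values are made-up placeholders for illustration only; the Kp and Ki ranges given in the exercise apply to your identified model, not to this toy plant, and the LabVIEW PID and simulation VIs remain the intended tools for the actual tuning.
import java.util.Locale;

// Rough closed-loop simulation of a PI controller on a first-order plant
// G(s) = K / (tau*s + 1), discretised with a simple Euler step.
public class PiTuningSketch {
    static final double K = 1.0, TAU = 0.5;      // assumed plant gain and time constant
    static final double DT = 0.01, T_END = 20;   // Euler step size and simulation stop time (s)
    static final double SETPOINT = 10.0;         // target speed (m/s)

    // Simulate one run and report the steady-state error and the 10%-90% rise time.
    static void simulate(double kp, double ki) {
        int n = (int) (T_END / DT);
        double[] y = new double[n + 1];
        double integral = 0;
        for (int i = 0; i < n; i++) {
            double e = SETPOINT - y[i];
            integral += e * DT;
            double u = kp * e + ki * integral;              // PI control law
            y[i + 1] = y[i] + DT * (-y[i] + K * u) / TAU;   // Euler step of the plant
        }
        double yFinal = y[n];
        double t10 = -1, t90 = -1;
        for (int i = 0; i <= n; i++) {
            if (t10 < 0 && y[i] >= 0.1 * yFinal) t10 = i * DT;
            if (t90 < 0 && y[i] >= 0.9 * yFinal) t90 = i * DT;
        }
        System.out.printf(Locale.US, "Kp=%.2f Ki=%.2f  steady-state error=%.3f  10-90%% rise time=%.2f s%n",
                kp, ki, SETPOINT - yFinal, t90 - t10);
    }

    public static void main(String[] args) {
        for (double kp : new double[]{0.5, 0.8, 1.0, 1.1, 1.3, 1.5}) simulate(kp, 0);
        for (double ki : new double[]{0.05, 0.1, 0.2, 0.5}) simulate(1.0, ki);
    }
}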

Duplicate Post.

Similar Messages

  • System Identification and Transfer Function of the LEGO MINDSTORM NXT Robot

    Does it work? Quantify your results and show that it is working.
    Hand in a short report on the PID project on cruise control of the LEGO MINDSTORMS NXT robot. Your report should contain an introduction to cruise control and the uses of PID controllers, and a brief explanation of how PID feedback control works. The printout of the Block Diagram and Front Panel of your LabVIEW program should be attached at the end of your report.
    Put it all together. There are many questions that can be explored and written up in the results section. Be creative.
    Here are some things that may be addressed: check whether your system can stabilise the speed to within 1%. What are the optimal PID parameters that you have found? Can you choose PID parameters that both reach the set speed quickly and maintain the speed accurately once the system has reached the set speed? Can you control the speed in small steps? Explain your results.

    Thanks for that.
    DanHarryman wrote:
    Hi ToolMonkey,
    You should be able to build a control system to do this using the PID control VIs. The following paper is a good place to start when working with PID systems.
    http://www.ni.com/white-paper/6440/en
    Let me know how you get on using some of the functions in this. 
    Thanks 
    Dan.H
    UKAE

  • LabView Connection to NXT block Question

        Hi,
    I am trying to connect to an NXT block using Bluetooth. I have paired the device with the Bluetooth software and created a serial connection, so I can just send data from LabVIEW using VISA, but it's proving more difficult than I thought.
    All I have to do is send a string or integer to the NXT block.
    How can I create a simple sending function using LabVIEW VISA for just an integer or string?
    Attachments:
    Spike.vi ‏72 KB

    A couple of comments:
    1. I recommend that you invoke the Get Device Info VI after creating the NXT object to verify that you can communicate with the NXT.
    2. I don't remember what the default of the require response input to the Send Direct Command VI is, but you need to specify TRUE if TRUE is not the default.
    3. You should wire the response buffer output to an indicator so you can examine the response from the NXT. I'm not sure how you are determining that you are not getting a response, since you aren't checking for one.
    JacoNI wrote:
    Also I noticed you keep referring to spike.rtx. What is this? My VI running on the NXT block is named "Spike Remote".
    Will this have an effect? The VI jpg has got this name in hex, but still no response :/
    4. "spike.rxe" was the name of the program on the NXT in my example. Programs on the NXT have an extension of rxe. I would recommend the use of the "String to Byte Array" VI rather than building the array by hand. The bytes you specify appear to be decimal rather than hex numbers and probably don't represent the string you expect. You also are missing the extension. Wire "Spike Remote.rxe" into the "String to Byte Array" VI and append a NULL byte.
    geoff
    Geoffrey Schmit
    Fermi National Accelerator Laboratory
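    For reference, here is a rough, non-authoritative sketch of the raw bytes involved if you bypass the toolkit and write a direct command straight to the Bluetooth serial port with VISA. The opcode values (0x00 for a direct command that requires a response, 0x00 for StartProgram) and the 2-byte little-endian length header are from the LEGO NXT Bluetooth Developer Kit as I recall it; verify them against the official documentation before relying on this.
    import java.nio.charset.StandardCharsets;
    import java.util.Arrays;

    // Sketch: build the Bluetooth packet for the NXT "StartProgram" direct command.
    public class NxtStartProgramPacket {
        public static void main(String[] args) {
            String program = "Spike Remote.rxe";                // program name incl. .rxe extension
            byte[] name = (program + "\0").getBytes(StandardCharsets.US_ASCII); // NULL-terminated

            byte[] command = new byte[2 + name.length];
            command[0] = 0x00;                                  // direct command, response required
            command[1] = 0x00;                                  // StartProgram opcode
            System.arraycopy(name, 0, command, 2, name.length);

            // The Bluetooth transport expects a 2-byte little-endian length header in front.
            byte[] packet = new byte[2 + command.length];
            packet[0] = (byte) (command.length & 0xFF);
            packet[1] = (byte) ((command.length >> 8) & 0xFF);
            System.arraycopy(command, 0, packet, 2, command.length);

            System.out.println(Arrays.toString(packet));        // bytes to write to the serial port
        }
    }
    In LabVIEW the same bytes can be assembled with "String to Byte Array" plus a NULL byte, exactly as described above, and written with VISA Write.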

  • NXT shortest track calculation

    Hello,
    I have an urgent question.
    One of our teachers has given us the challenge of building an NXT robot that can collect and sort coloured balls lying on a "playing ground" as shown in the picture (the yellow dots are ball locations).
    We get the "locations" of the coloured balls 5 minutes before the start of the challenge. (We know the places, but we don't know which colour will be at which place.) So we thought about entering that by highlighting buttons for one colour of balls and leaving them blank for the other colour.
    Now, we have made VIs for turning left/right and for stopping/driving over a crossing point (where two black lines cross), we can use the ultrasonic and light sensors, etc. We did lots of tests with them.
    But now the problem starts:
    We have to write a LabVIEW program that can calculate the shortest way to a specified ball (our goal is to collect 3 balls of the same colour at a time before returning to base) following the black lines.
    Then we have to transmit the calculated track to the NXT through Bluetooth. (Shortest way obviously means as few turns as possible.)
    But we don't have any clue how to program such a thing... we only have very, very basic knowledge of LabVIEW and time is really running out...
    If anybody has an idea of how to make such a program, or can give us detailed information on how to program it, or wants to be our hero and make the program for us...
    Please let us know, we are desperate.
    Many thanks,
    Vincent
    (Sorry for the English mistakes... I'm used to speaking Dutch.)
    Attachments:
    STEL SITUATIE.png ‏23 KB

    Hello Vincent,
    I do not seem to be able to open your attachment.
    Is there an error that you're running into?
    What algorithm are you trying to implement to solve this problem?
    Do you need to write a program that calculates the shortest path or the shortest amount of time needed?
    Shortest path doesn't necessarily mean the smallest number of turns to me.
    Did you already have a look at this page (http://nxtmastery.com/) to get started with the program?
    Have you already defined your algorithm in pseudo-code?
    This will help you with implementing your algorithm.
    Kind Regards,
    Thierry C - Applications Engineering Specialist Northern European Region - National Instruments
    CLD, CTA
    If someone helped you, let them know. Mark as solved and/or give a kudo.
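    If you do go down the route of defining the algorithm first, a breadth-first search over the crossing points is probably the simplest notion of "shortest path" to implement; note that it minimises the number of crossings driven through, not the number of turns, which would need a different edge cost. Below is a minimal, non-LabVIEW sketch on a made-up rectangular grid; the grid size, start position and ball position are placeholders, and blocked line segments would simply be skipped as neighbours.
    import java.util.ArrayDeque;
    import java.util.Arrays;
    import java.util.Queue;

    // Breadth-first search over a grid of line crossings: each crossing is a cell and
    // moving to a neighbouring crossing costs one step. Prints the crossings to drive through.
    public class ShortestTrack {
        static final int ROWS = 5, COLS = 5;                   // placeholder playing-field size

        public static void main(String[] args) {
            int[] start = {0, 0}, goal = {3, 4};               // placeholder base and ball positions
            int[][] prevR = new int[ROWS][COLS], prevC = new int[ROWS][COLS];
            for (int[] row : prevR) Arrays.fill(row, -1);
            boolean[][] seen = new boolean[ROWS][COLS];

            Queue<int[]> queue = new ArrayDeque<>();
            queue.add(start);
            seen[start[0]][start[1]] = true;
            int[][] moves = {{1, 0}, {-1, 0}, {0, 1}, {0, -1}};

            while (!queue.isEmpty()) {
                int[] cur = queue.poll();
                for (int[] m : moves) {
                    int r = cur[0] + m[0], c = cur[1] + m[1];
                    if (r < 0 || r >= ROWS || c < 0 || c >= COLS || seen[r][c]) continue;
                    seen[r][c] = true;
                    prevR[r][c] = cur[0];                      // remember where we came from
                    prevC[r][c] = cur[1];
                    queue.add(new int[]{r, c});
                }
            }

            // Walk back from the ball to the base to list the crossings in driving order.
            StringBuilder path = new StringBuilder();
            int r = goal[0], c = goal[1];
            while (r != -1) {
                path.insert(0, "(" + r + "," + c + ") ");
                int pr = prevR[r][c], pc = prevC[r][c];
                r = pr;
                c = pc;
            }
            System.out.println("Route: " + path);
        }
    }
    The resulting list of crossings could then be turned into turn/drive commands and sent to the NXT over Bluetooth.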

  • Connecting LEGO mindstorms NXT to my computer

    I am trying to learn LabVIEW using the LEGO MINDSTORMS NXT robot. I installed LabVIEW 2009 and started working, but when I open the NXT Terminal in the NXT module it claims that the NXT is not connected, although it is (via USB). Any ideas? Could this have to do with the fact that I am using Windows Vista?
    Thanks

    This article describes the installation and configuration process for connecting LabVIEW to the LEGO MINDSTORMS NXT educational robot wirelessly. This will give LabVIEW developers a mobile platform for developing projects and autonomous robots. We need to install some programs on the computer:
    https://decibel.ni.com/content/docs/DOC-32448    
    Atom
    Certified LabVIEW Associate Developer

  • How to make a robot roam and simultaneously display its coordinate using other vi

    Hi, I am using a Cricket to find the coordinates of a moving robot, as my receiver is attached to the robot. I made a VI that can extract the exact coordinates from my Cricket sensor data. How can I move the robot and simultaneously display its coordinates, i.e. drive the robot with the roaming VI while, at the same time, my VI displays its coordinates?

    Hi,
    Is this an NXT robot, and are you deploying your code on the robot? Are you running the robot from a VI on your PC using Bluetooth, etc.?
    If you are running both VIs in LabVIEW you can use parallel loops. If they have to communicate with each other, then you have to decide which approach to use for parallel-loop communication, such as simple variables or more complex choices like queues, etc.
    Can you provide more information about your system and version of LabVIEW?
    Mark Ramsdale
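    In text form, the parallel-loops-plus-queue pattern Mark describes looks roughly like the sketch below: one loop produces coordinates while the other consumes and displays them. In LabVIEW you would use two while loops and the Queue Operations functions instead; the class, loop period and coordinate values here are made up purely for illustration.
    import java.util.concurrent.ArrayBlockingQueue;
    import java.util.concurrent.BlockingQueue;
    import java.util.concurrent.TimeUnit;

    // Text-language analogue of two parallel LabVIEW loops talking through a queue:
    // a "roaming" loop pushes (x, y) coordinates and a "display" loop reads and shows them.
    public class ParallelLoopsSketch {
        public static void main(String[] args) throws InterruptedException {
            BlockingQueue<double[]> queue = new ArrayBlockingQueue<>(16);

            Thread producer = new Thread(() -> {
                for (int i = 0; i < 20; i++) {
                    double[] xy = {i * 0.5, Math.sin(i * 0.5)};    // placeholder coordinates
                    try {
                        queue.put(xy);
                        TimeUnit.MILLISECONDS.sleep(100);          // loop period
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                        return;
                    }
                }
            });

            Thread consumer = new Thread(() -> {
                try {
                    for (int i = 0; i < 20; i++) {
                        double[] xy = queue.take();
                        System.out.printf("x=%.2f  y=%.2f%n", xy[0], xy[1]);
                    }
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
            });

            producer.start();
            consumer.start();
            producer.join();
            consumer.join();
        }
    }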

  • Getting results to the Lego mindstorm NXT brick

    In a nutshell, my group and I are trying to use eye movements to control the LEGO MINDSTORMS NXT, i.e. look left to turn left, look right to turn right, etc. We will be acquiring the eye movement signals through LabVIEW SignalExpress and running those signals into LabVIEW to move the NXT robot. We need assistance both with importing the signal into LabVIEW and with the subsequent use of that signal by the NXT. We already have the NXT toolkit and are able to successfully run basic programs. The problem we've had is that we need the results continuously imported into the NXT so that we can maintain control of the vehicle. So basically we need pretty close to real-time results, sort of like a joystick type of thing. My group members and I have limited experience with LabVIEW, so any assistance is appreciated.

    Hello King945,
    Here is a link that walks through sending and reading Bluetooth messages. I have also included a basic NXT Front Panel Control VI. The VI will work if the NXT toolkit is installed and the NXT brick is wired to the computer through a USB cable. Also, if you integrate the Bluetooth messaging with this VI, you should be able to control the NXT using front panel controls over Bluetooth.
    Wear
    National Instruments
    Product Support Engineer
    Attachments:
    Front_Panel_Steering_Control.vi ‏102 KB

  • NXT case structure

    Hey, I'm making an NXT robot for a competition and it has to go through two phases. Shifting between phase 1 and phase 2 is controlled by a timer. The only issue I'm having is how to set up multiple shift registers for my program.
    Attachments:
    Group2Phase2.vi ‏16 KB

    Sorry, I don't have the NXT toolkit installed, but maybe you should move the select node outside the case structure and wire its output to the outer case selector to switch between phases 1 and 2.
    If you need more shift registers, just add them.
    LabVIEW Champion . Do more with less code and in less time .
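    For what it's worth, the suggested structure has a simple text-language analogue: the values that LabVIEW shift registers would carry are just variables that persist between loop iterations, and the phase is chosen by a timer check made outside the case structure. The phase duration and loop period below are placeholders.
    // Analogue of a timed two-phase loop: 'counter' and 'lastReading' play the role of
    // shift registers, and the elapsed-time check plays the role of the outer case selector.
    public class TwoPhaseLoop {
        public static void main(String[] args) throws InterruptedException {
            long start = System.currentTimeMillis();
            long phaseOneMillis = 5000;            // placeholder: switch to phase 2 after 5 s
            int counter = 0;                       // carried between iterations
            double lastReading = 0;                // a second carried value

            for (int i = 0; i < 100; i++) {
                boolean phaseTwo = System.currentTimeMillis() - start > phaseOneMillis;
                if (!phaseTwo) {
                    counter++;                     // phase 1 work
                } else {
                    lastReading = counter * 0.1;   // phase 2 work using the carried value
                }
                Thread.sleep(100);                 // loop period
            }
            System.out.println("counter=" + counter + "  lastReading=" + lastReading);
        }
    }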

  • NXT PID Controller from PC

    Hello,
    I've implemented a PID controller on the PC to control an NXT robot in these steps:
    1. The NXT sends the sensor value (light sensor) to the PC.
    2. The PC performs all calculations (PID controller and motor power for each wheel).
    3. The NXT receives the motor powers and applies them to the corresponding motors.
    With a USB connection, the robot follows the line very well and the execution time is only 15 ms. But with Bluetooth, the execution time goes up to 60 ms (!!) and the robot cannot follow the line properly.
    Is it normal for Bluetooth to have this latency? Is the only solution to implement the PID controller directly on the NXT?
    I've attached the robot's program.
    Thanks!
    Attachments:
    Robot 9 (enviat).vi ‏20 KB

    Bluetooth is always going to be slower than USB, and best performance would be doing the PID control on the NXT.
    Can you attach your host-side program as well, so we can see how you are sending/receiving messages on the PC?
    Are you using a Mac with a built-in Bluetooth radio? Sometimes people get better performance on a Mac with an external dongle.
    Another option would be disabling the status check when a message is written to the NXT.
    See the following vi: 
    C:\Program Files (x86)\National Instruments\LabVIEW 2012\vi.lib\NXT\DirectCommands\NXTToolkit.DC.MessageWrite.vi
    There is a true constant on the diagram that specifies "requireResponse". This enables getting a response back confirming that the MessageWrite succeeded. I'm not sure how important it is, but setting it to false would reduce the number of Bluetooth transactions in the loop by one.
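    To see why the 15 ms vs 60 ms loop time matters, it helps to write the PID step with an explicit sample period dt: the integral and derivative terms scale with dt, so gains tuned at 15 ms do not carry over directly to a 60 ms loop. The sketch below is a toy simulation only; the gains and the fake "plant" are placeholders, and a real implementation would read the light sensor and write motor powers instead.
    // Discrete PID step with an explicit sample period dt, plus a toy model of the
    // robot's response, to compare a 15 ms loop against a 60 ms loop.
    public class LineFollowPid {
        double kp, ki, kd, integral, prevError;

        LineFollowPid(double kp, double ki, double kd) {
            this.kp = kp; this.ki = ki; this.kd = kd;
        }

        // One controller update: returns the steering correction for the motors.
        double update(double error, double dt) {
            integral += error * dt;
            double derivative = (error - prevError) / dt;
            prevError = error;
            return kp * error + ki * integral + kd * derivative;
        }

        public static void main(String[] args) {
            for (double dt : new double[]{0.015, 0.060}) {
                LineFollowPid pid = new LineFollowPid(2.0, 0.5, 0.05);  // placeholder gains
                double offset = 1.0;                       // toy offset from the line
                for (double t = 0; t < 2.0; t += dt) {
                    double correction = pid.update(-offset, dt);
                    offset += correction * dt;             // toy response of the robot
                }
                System.out.printf("dt = %.0f ms  final offset = %.4f%n", dt * 1000, offset);
            }
        }
    }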

  • Seeking Help with Album Image Issues in iPod Classic

    Hi folks -
    I have a new 160 iPod Classic and have two album image-related issues I'd appreciate some help with.
    In the case of several albums where I'd associated my own art with them in iTunes, the formatting appeared off in my iPod due to the vertical/horizontal proportions. So I corrected the images, replaced the images via iTunes, and re-synced. All of the old images were replaced - except for two. Is there any way to get my iPod to recognize the change to these two images, as it did for all the others?
    Secondly, my "The Allman Brothers Band Live at Fillmore East" album was appearing twice in both the Cover Flow and Album list. That was strange, except that after the latest sync it now appears six times! It's only in one place in iTunes, and not erroneously set as a compilation or anything. This is the only album displaying this behavior...
    Thanks for any suggestions or advice!
    Frank
    Message was edited by: frank3si

    Yes my e-mail address is [email protected] 
    Thank you for your kind attention to my problem. I am looking for one on one brief consultation with my laptop in Cincinnati. If not then I will compose a clear question with VI.
    These manuals are well known to me: the NI Vision Concepts Manual, the IMAQ Vision for LabVIEW User Manual, and NI-IMAQ for USB Cameras. My problem is moving to the next step: create an array from the USB image, perform math on the array, and display the results.
    Sincerely,
    Tom Lohre, cell 513-236-1704, [email protected]
    http://tomlohre.com/images/lafley.jpg
    A.G. Lafley, Chairman & CEO of Procter & Gamble: http://tomlohre.com/lafley.htm
    A.G. Lafley enjoyed hearing of Tom's painting robot and thought it played well to his new book, "The Game-Changer: How You Can Drive Revenue and Profit Growth With Innovation." http://tomlohre.com/newart.htm
    Tom Lohre artist/scientist
    Has an operating painting robot using RoboLab/RCX
    Developing a LabVIEW/NXT robot that analyzes an image for aesthetic quality.

  • Bluetooth not supported for emulated applications?

    This is a quote I received from a tech support group recently, and I'm trying to find out if this is true or not:
    "Bluetooth on Intel based Macintosh system is not supported by Mac OS X for emulated applications"
    So if we want to make use of Bluetooth, are we out of luck if the application is not a universal binary?
    This is part of a running discussion I'm having with the Lego company regarding the Bluetooth capabilities of their new Mindstorms NXT robot kit. I'm aware of the issues of my specific case and I'm pursuing this with them based on an install of their software on an XP machine, but I'm wondering about the accuracy of their general statement above.
    Thoughts?

    Why exactly are you trying to pair your iPhone with your MacBook over Bluetooth? If it is for transferring data via Bluetooth, that won't work, as the iPhone's Bluetooth can only be used for tethering (Internet) on your MacBook Pro.
    iOS: Supported Bluetooth profiles : http://support.apple.com/kb/ht3647

  • LEGO MINDSTORMS

    Hi, I have finished a program in LabVIEW to manage a robotic car using two touch sensors.
    The program should do the following:
    1. The Power variable holds the current power level selected by the user (0-100). Start at 80. The TargetSteer variable holds the steering motor angle that the main sequence is currently asking for, which can range from -75 (left) to 75 (right). Start at 0 for straight ahead.
    2. If the Left Arrow button is bumped, subtract 20 from Power and write the new value back into Power. If the value gets less than 20, set it back to 20, so that 20 is the minimum allowed power. Confirm the button press with a beep.
    3. If the Right Arrow button is bumped, add 20 to Power and write the new value back into Power. If the value gets greater than 100, set it back to 100, so that 100 is the maximum allowed power. Confirm the button press with a beep.
    4. Display the current power on the screen, with some spaces after it to ensure that the previous value is erased.
    5. Test both touch sensors by switching on sensor 1 (giving two paths) and then, within each, on sensor 4, which results in four possible paths corresponding to the four button combinations (both, left only, right only, neither). In each case, perform the action:
    Both pressed: set TargetSteer to 0 (straight ahead) and apply the selected power to both motors to go straight.
    Left pressed: set TargetSteer to -75 (full left) and send Power to C and Power/2 to B (the rear-wheel power difference assists the steering).
    Right pressed: set TargetSteer to 75 (full right) and send Power to B and Power/2 to C (the rear-wheel power difference assists the steering).
    Neither pressed: set TargetSteer to 0 (straight) and stop.
    6. Get the current steering angle from its motor rotation sensor, subtract that from the desired steering motor angle in TargetSteer, and use that to determine how much power, and in what direction, to turn the steering motor to get to the desired angle. The power used is proportional to the angle difference, which produces a smooth progressive motion. The duration is unlimited; the power is just briefly updated, then the loop repeats and re-calculates a new power. As the steering approaches the desired angle, the power reduces until it reaches 0 at the desired steering angle and stops there. If it overshoots, opposite power will bring it back. The angle difference can be either positive or negative; the motor direction is determined by testing whether the difference is greater than zero or not, and the Motor block's Power port will take the absolute value of the power wired in.
    I am sure some of these steps are wrong, and some I do not know how to do. I am a beginner in LabVIEW. Please find attached my program. Please help me with it.
    Thank you
    Attachments:
    Prakticka cast BP pokus 2.vi ‏33 KB
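    As a rough, non-LabVIEW illustration of step 6 only, the proportional steering update can be sketched as below. The steering motor is simulated by a simple made-up response; in the real program the angle would come from the NXT rotation sensor and the power would be wired to the Motor block (which takes the absolute value, with the direction set by the sign test). The gain and the fake motor response are placeholders.
    // Simulation of step 6: drive the steering motor toward TargetSteer with a power
    // proportional to the remaining angle error, recalculated every loop iteration.
    public class SteeringLoop {
        public static void main(String[] args) throws InterruptedException {
            double targetSteer = -75;     // requested angle: -75 (full left) .. 75 (full right)
            double angle = 0;             // current steering motor angle (simulated)
            double gain = 0.6;            // placeholder proportional gain

            for (int i = 0; i < 50; i++) {
                double error = targetSteer - angle;
                double power = gain * error;                        // proportional power
                boolean forward = power > 0;                        // direction from the sign
                double magnitude = Math.min(Math.abs(power), 100);  // Motor block uses |power|

                // Fake motor response: the angle moves a little in the commanded direction.
                angle += (forward ? 1 : -1) * magnitude * 0.1;

                System.out.printf("angle = %6.1f  error = %6.1f  power = %6.1f%n", angle, error, power);
                Thread.sleep(20);         // brief update, then the loop repeats and recalculates
            }
        }
    }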

    This article shows how to control the direction and speed of a LEGO NXT robot using the arrow keys on the computer keyboard; the range of the remote control is limited by the reach of the installed Bluetooth transmitter. I hope it will be useful. Greetings,
    https://decibel.ni.com/content/docs/DOC-36250
    Atom
    Certified LabVIEW Associate Developer

  • Implementing a mediaplayer into my application

    Hello,
    I am working on a project for school. This involves editing a freeware Java program for a LEGO NXT robot so that it works with keyboard input as well as button input, and implementing a media player in the application so that live video from the robot can be viewed through the app. The video will come from an RTP stream, but I haven't got the slightest idea how to implement the media player in the application. The app itself is a GUI which holds the buttons to control the NXT robot. The media player is of the type specified in JMF 2.1.1, as I found JMF most suitable for the situation.
    My first try was this:
    if (mediaPlayer != null)
        mediaPlayer.close();
    String location = "rtp:10.110.110.224:1234";
    try {
        MediaLocator ml = new MediaLocator(location);
        if (ml == null)
            System.err.println("Can't build MediaLocator for RTP");
        mediaPlayer = (MediaPlayer) Manager.createPlayer(ml);
    } catch (NoPlayerException e) {
        System.err.println("Error: " + e);
    } catch (MalformedURLException e) {
        System.err.println("Error: " + e);
    } catch (IOException e) {
        System.err.println("Error: " + e);
    }
    mediaPlayer.setControlPanelVisible(true);
    mediaPlayer.setFixedAspectRatio(true);
    mediaPlayer.setPlaybackLoop(false);
    mediaPlayer.prefetch();
    addComponent(contentPane, mediaPlayer, 10, 28, 400, 200);
    mediaPlayer.start();
    addComponent is a method which places the component in a pre-defined container and adds it to the GUI. But somehow, it doesn't recognize the RTP address specified as the location.
    I already tried something with the AVReceive3 class which Sun so generously distributed (http://java.sun.com/javase/technologies/desktop/media/jmf/2.1.1/solutions/AVReceive3.java), but this didn't seem to work either.
    I hope that you can help me find a solution fast,
    best wishes,
    Roy

    Get rid of the file:// part of the RTP address. rtp:// is like http:// or file://...it's the prefix telling the URL what protocol to use.
    String location = "rtp://10.110.110.224:1234";
    I've tried your solution, but it doesn't seem to work.
    I've tried another approach:
    // Setting the properties for the mediaPlayer object
    if (mediaPlayer != null)
        mediaPlayer.close();
    // Source for streaming video
    String location = "rtsp://localhost:1234";
    // String location = "file://C:/Documents and Settings/Administrator/Desktop/Capture.AVI";
    try {
        ml = new MediaLocator(location);
        if (ml == null)
            System.err.println("Unable to build MediaLocator");
        DataSource ds = new DataSource(ml);
    }
    I switched to using localhost, as the stream is intended to be streamed from a server to the client program, and I've tried using a DataSource.
    HOWEVER, using this approach insists on implementing the 4 standard abstract methods which come with the DataSource constructor:
    DataSource ds = new DataSource(ml) {
        public InputStream getInputStream() throws IOException {
            throw new UnsupportedOperationException("Not supported yet.");
        }
        public OutputStream getOutputStream() throws IOException {
            throw new UnsupportedOperationException("Not supported yet.");
        }
        public String getContentType() {
            throw new UnsupportedOperationException("Not supported yet.");
        }
        public String getName() {
            throw new UnsupportedOperationException("Not supported yet.");
        }
    };
    And after that, it insists on getting rid of the MediaLocator argument, as I implemented an anonymous class.
    I'm sure it's rather obvious that I'm a novice to this stuff... but I really want to fix this, as I could use the experience.

  • Gift Certificate for NI Week 2012 worth $500

    Hi, I have a gift certificate for NIWeek 2012 worth $500. Unfortunately I'm not able to attend NIWeek, so anybody is welcome to ask for the gift code.
    Thanks!
    Ashok

    Matt got the certificate. Thanks for a great community. Tom
    Tom Lohre artist/scientist
    Has an operating painting robot using RoboLab/RCX
    Developing a LabVIEW/NXT robot that analyzes an image for aesthetic quality.

  • Move motors to specific coordinates

    Hey everyone,
    I'm working on developing an NXT robot that can move motors to a given set of coordinates. I am using LabVIEW 8.6. So far I'm using the Initialize Mouse VI to get the coordinates from the picture that I've pasted onto the LabVIEW front panel. My problem is that I do not know how to make the motors move to these coordinates. I know that I can set the motors to run for a specified time or number of degrees, but I do not know any other way. I've attached the program with the Initialize Mouse VI to get the coordinates; I am unable to download this onto the NXT. I've also attached the basic program of how we are moving the motors. I think it would be best to combine these somehow. Any help is appreciated! Thanks.
    Attachments:
    get coordinates.vi ‏300 KB
    Untitled 10.vi ‏10 KB

    As in most vision/motion systems, you will need to consider that the system has to be calibrated, in your case from pixels to degrees of rotation. First consider the width and height of your clickable picture control. For the sake of discussion, let's say that it is 100x100 pixels. Now you will need to determine the distance that the motors can travel in each axis, in degrees. Let's say 720 degrees (two full rotations).
    Position the motors in the upper-left corner of the travel area and zero them out, then move them to the opposite corner and read the number of degrees. Let's say 720 degrees on X (two full rotations) and 270 (3/4 rotation) on Y. These are the "real world" coordinates that the pixel counts will be converted into.
    The conversion looks like this: Xd = X*(720/100), Yd = Y*(270/100).
    When you click the upper-left-most pixel in your image (0,0), you will send Xd = 0*(720/100) = 0, Yd = 0*(270/100) = 0.
    When you click the point (22,77) in your image, you will send Xd = 22*(720/100) = 158.4 deg, Yd = 77*(270/100) = 207.9 deg.
    Hope this helps.
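    The calibration described above fits in a small conversion function. The sketch below uses the same illustrative numbers as the reply (a 100x100 pixel picture control, 720 degrees of X travel and 270 degrees of Y travel); swap in the sizes measured on your own system.
    // Pixel-to-degrees calibration: a click at (x, y) inside the picture control maps
    // linearly onto the measured motor travel in each axis.
    public class PixelToDegrees {
        static final double PIC_WIDTH = 100, PIC_HEIGHT = 100;  // picture control size (pixels)
        static final double X_TRAVEL = 720, Y_TRAVEL = 270;     // measured motor travel (degrees)

        static double[] toDegrees(double xPixel, double yPixel) {
            double xDeg = xPixel * (X_TRAVEL / PIC_WIDTH);
            double yDeg = yPixel * (Y_TRAVEL / PIC_HEIGHT);
            return new double[]{xDeg, yDeg};
        }

        public static void main(String[] args) {
            double[] d = toDegrees(22, 77);
            System.out.printf("Xd = %.1f deg, Yd = %.1f deg%n", d[0], d[1]);   // 158.4, 207.9
        }
    }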
