LabVIEW Output, BASIC STAMP II

How can I write a LabVIEW program that does the following (see
below) and have it interface with a Basic Stamp II?
From a graph, I could retrieve X-Values (X1 and X2) and Y-Values (Y1
and Y2).
Target (X1, Y1)
Laser (X2, Y2)
Change in X (X2 - X1)
Change in Y (Y2 - Y1)
If ChangeX >= 5, Move Right;
else if ChangeX <= -5, Move Left;
else no X movement.

If ChangeY >= 5, Move Down;
else if ChangeY <= -5, Move Up;
else no Y movement.
| increasing Y
|
|
|
V ----------- > increasing X
If the change in X is positive (greater than zero), move the laser to
the right; if it is negative (less than zero), move it to the left. If
the change in Y is positive, move the laser down; if it is negative,
move it up. I used 5 instead of 0 to give room for error.
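The thresholded logic above can be sketched in ordinary code. This is an illustrative sketch only (LabVIEW itself is graphical); the command strings and function names are placeholders, not part of any LabVIEW or Parallax API, and the sign convention follows the pseudocode (ChangeX = X2 - X1, with Y increasing downward).

```python
# Deadband of 5 counts instead of 0 to give room for error, as described above.
DEADBAND = 5

def axis_command(delta, positive_cmd, negative_cmd):
    """Map a coordinate difference to a movement command with a deadband."""
    if delta >= DEADBAND:
        return positive_cmd
    if delta <= -DEADBAND:
        return negative_cmd
    return "STOP"

def aim_commands(target, laser):
    """Return (x_command, y_command) given Target (X1, Y1) and Laser (X2, Y2)."""
    dx = laser[0] - target[0]   # Change in X = X2 - X1, as in the pseudocode
    dy = laser[1] - target[1]   # Change in Y = Y2 - Y1; Y increases downward
    return (axis_command(dx, "RIGHT", "LEFT"),
            axis_command(dy, "DOWN", "UP"))
```

A real implementation would map each returned command to the serial bytes the Basic Stamp expects.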
Please e-mail me if you can help, have information, or have any
questions.
Thanks,
Paul

The example code that was provided by Parallax is attached.  As for the USB-to-serial driver, I downloaded it and I can see the BS2 in my Device Manager (COM5).  However, when I ran both the example code from Parallax and an example LabVIEW program for a basic serial write and read, I was unable to get communication between my laptop and the BS2.  If you have code that you developed before, it would be really helpful if I could test it with my microcontroller. Thanks for all your help, guys!
Attachments:
StampSerial.llb ‏63 KB

Similar Messages

  • Communicating between LabVIEW and Basic Stamp 2.0

    Hello,
    We are trying to control the movement of a Parallax Boe-Bot (robot with
    Board of Education) via LabView programming.  The BOE operates on
    Basic Stamp programming to control the pulse width sent to two servo
    motors.
    Is there a way to establish communication between LabVIEW and Basic
    Stamp 2.0?   We need to send commands from LabVIEW to the
    Basic Stamp to control the pulse width and therefore the rotation of
    the two servo motors.  We also need to send a signal from an
    accelerometer mounted on the robot to the LabVIEW ADC. 
    The ultimate goal is to do this with wireless communication, but for
    right now we are simply trying to figure out any way to send the data
    between LabVIEW and the Basic Stamp.
    Any ideas?
    Thank you,
    Adam

    Hi Adam,
    You can communicate with a BASIC Stamp microcontroller over a serial (RS-232) connection using VISA.  To see some LabVIEW example code, just search for Serial in the NI Example Finder (Help>>Find Examples). This will show you the basics of serial communication in LabVIEW.  I also found this link by searching the NI discussion forums for "basic stamp"
    http://www.parallax.com/html_pages/downloads/software/software_basic_stamp.asp
    Here you can download an example that uses LabVIEW 7.0 or later to communicate with a BASIC Stamp 2.
    I hope this is helpful!  Let us know if you have further questions.
    Megan B.
    National Instruments

  • Basic Stamp 2 MicroController

    I am currently working on a project that involves a Basic Stamp 2 microcontroller purchased through Parallax.com.  We currently have the microcontroller connected to our laptops through a USB-to-serial converter and are able to achieve proper communication with the example LabVIEW code provided through Parallax.com.  We loaded the Basic Stamp Editor code that I attached below and ran the LabVIEW program (also attached).  The problem we are running into is that we need the ability to read and write each pin.  We need to set a specific pin HIGH or LOW depending on the user input (e.g. the user selects pin 3 to set HIGH or LOW).  The program that was loaded from the Basic Stamp Editor just echoes the user input.  We are beginners with the software, so we are not sure what has to be changed to get it working.
    If there is any more information that is needed please let us know! We are nearly 4 weeks until our due date =/
    Attachments:
    StampSerial.llb ‏63 KB
    LabVIEWIO Notepad.txt ‏3 KB

    Hi,
    Looking at your Stamp code, it is only programmed to drive one output pin and one input pin.  Your serial protocol needs to be expanded to allow you to set both pin and state; currently you only send the state.  You could just copy the code down and change the DO command to O1, O2, etc.
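    One hypothetical way to expand the protocol as suggested: send two bytes per command, a state letter followed by an ASCII pin digit. Neither the framing nor the letters come from the attached Parallax example; this is only a sketch of "send pin and state".

```python
# Hypothetical two-byte serial protocol: first byte selects HIGH/LOW,
# second byte is the pin number as an ASCII digit (pins 0-9). This framing
# is an illustration, not part of the attached Stamp code.

def encode_pin_command(pin, high):
    """Build the two-character message the PC would send over the COM port."""
    if not 0 <= pin <= 9:
        raise ValueError("pin must be 0-9 for this single-digit framing")
    return ("H" if high else "L") + str(pin)

def decode_pin_command(msg):
    """What the Stamp-side firmware would do: recover (pin, state)."""
    state, pin = msg[0], int(msg[1])
    return pin, state == "H"
```

    On the Stamp side, the matching PBASIC would SERIN two bytes and branch on the first to decide HIGH or LOW on the selected pin.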
    Also, the LabVIEW code you have is not too good.  There is no while loop.  Are you running the code once or using the run-continuous button?  Run continuous is very bad here because you are continuously opening and closing the COM port.  You should move the serial port open and close functions outside the loop, which will require moving the open and close code out of the subVI.  If the code is user-interface driven, you should use an event structure within your main loop.  If you are not using an event structure, you must put a small wait inside the loop or your code could use 100% CPU and lock up your PC.
    Search the LabVIEW examples for ways to code your application.
    Good luck,
    Michael

  • Communicating To Basic Stamp with RS232

    I am new to programming in LabVIEW.
    I recently was given a requirement to use LabVIEW to control a test setup rigged with a Basic Stamp 2p microcontroller.
    All I need to do is to get LabVIEW to send some coded commands to the Stamp so that it can retrieve and execute the commands and then return some data collected from the sensors it is monitoring.
    Below is how I try to manage the communication:
    1. LabVIEW sends out "0" and then reads from COM1, with a 5 s timeout. It repeats this until it gets a reply from the Stamp within 5 seconds.
    2. The Basic Stamp will not reply if it is doing something else. It replies "0" if it is ready to receive instructions, and "1" if it is ready to send data.
    3. When LabVIEW gets a "0" reply, it sends out the two coded instructions. If the reply is "1", LabVIEW reads from COM1.
    Here's the problem.
    I start my experiment by sending the first set of instructions to the Stamp, which it has no problem executing.
    But when I try to send a second instruction, i.e. when LabVIEW tries to send out "0" and wait for a "0" reply, it reads back the value of the last instruction it sent out. Depending on how many bytes I set it to read, LabVIEW will cycle until the last byte of the previous instruction is consumed, and only then will it read in the "0".
    I do not believe the problem is with my Stamp program or the hardware, because when I manually sent instructions using HyperTerminal (with echo off) I did not see the instructions coming back.
    The way LabVIEW behaves looks like ECHO ON in HyperTerminal, except that the data I send out seems to be sitting in the read buffer.
    I managed to bypass this by doing something really stupid: each time I need to read from the COM port, I close the VISA connection and open it again. This seems to clear the buffer, but it puts a substantial amount of wait time into both my Stamp and LabVIEW programs, as I have to ensure each side is ready to communicate and they wait for each other to get ready.
    Does anyone know how I can stop the data I send out from appearing when I read from the COM port?
    The only non-default setting on the VISA connection is that I stepped the baud rate down to 4800 to ensure that the Stamp does not lose data.
    Thank you.
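    If the adapter really is looping transmitted bytes back, one software alternative to closing and reopening the VISA session is to read back and discard exactly the bytes just written before interpreting the reply. A sketch on plain byte strings (the `received` buffer stands in for what VISA Read would return):

```python
# Stripping a local echo in software: after writing `sent`, the first
# len(sent) bytes read back are the echo and the remainder is the Stamp's
# actual reply. This is a sketch of the idea, not VISA code.

def strip_echo(sent, received):
    """Drop the echoed copy of `sent` from the front of `received`."""
    if received.startswith(sent):
        return received[len(sent):]
    return received  # no echo present; return the data unchanged

reply = strip_echo(b"0", b"00")   # echoed "0" followed by the Stamp's "0"
```

    In LabVIEW terms this means reading len(sent) extra bytes after each write and discarding them, or flushing the receive buffer before the real read.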

    I tried using the Send/Read example, but since my Stamp is programmed to communicate at 4800 baud, I had to add the VISA Configure to the LabVIEW example to get it talking properly. The results are the same: I see what I send out in the read message box, and my Stamp is not programmed to echo messages. The only other thing I can think of that could cause this is the serial cable. To disable the Stamp from resetting, I rigged an adapter so that pins 2, 3 & 5 on the DB9 cable are connected to the Stamp, pins 1, 4 & 6 are tied together, and pins 7 & 8 are tied together (as suggested in the Basic Stamp manual, pg. 399). Would this cause the data to be routed back to the read buffer?
    Thank you.
    Attachments:
    Manual.jpg ‏64 KB

  • Working with basic stamp rs-232

    hi there,
    I'm currently using a Basic Stamp together with LabVIEW 8.5. Communication between the hardware (4 ultrasound sensors) and the software uses RS-232 serial communication.
    However, the sampling rate is very slow (as low as 10 samples per second). May I know how to increase the sampling rate?
    Is it due to the serial communication?

    Who wrote the embedded code for the microcontroller?  That's probably where the sampling rate is set (or allowed to be set).  Is there documentation that came with the firmware?  Also, what are you sampling?  Depending on what you are sampling, sampling rates can vary greatly. 
    For instance, you would sample temperature at a very low rate compared with wind velocity. 
    Does your circuit allow for buffering?  There are many details about your environment that you need to share before we can offer any suggestions on how to increase the sampling rate, or determine whether it is possible to do so.
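    One quick sanity check on whether the serial link itself is the bottleneck: with 8N1 framing each byte costs 10 bit times, so the wire's ceiling is baud/10 bytes per second. The bytes-per-sample figure below is a guess (4 sensors at 2 bytes each plus a terminator), not taken from the poster's firmware:

```python
# Rough ceiling on serial sample rate, assuming 8 data bits, no parity,
# 1 stop bit (start + 8 data + stop = 10 bit times per byte).

def max_samples_per_second(baud, bytes_per_sample):
    bytes_per_second = baud / 10.0   # 8N1 framing
    return bytes_per_second / bytes_per_sample

print(max_samples_per_second(9600, 9))   # ~106 samples/s at 9600 baud
```

    If the firmware's processing loop delivers far fewer samples than this ceiling, the delay is in the embedded code, not the serial link.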
    R

  • Labview output data to mathcad

    I use LabVIEW to acquire data and Mathcad to process it. I am wondering how to link the two programs together.
    In other words, LabVIEW outputs a big 2D array and Mathcad needs to read it. An example of this would be a great help.
    Thanks
    liming
    Solved!
    Go to Solution.

    I would really like to apologize for frustrating you with this issue. That was not my intention, and I am sorry. I have done a lot more research on using the DLL for Mathcad in LabVIEW, and I think I may be on to where the problem possibly is. It seems that we may need to point the ActiveX containers to the correct location for Mathcad. Unfortunately, you need Mathcad installed in order to point them to the correct location, so I was wondering if you could double-click the Mathcad Open.vi (in the File IO with Mathcad.vi) to open its block diagram. Now go to the block diagram of the Mathcad Open.vi and open the Mathcad_App_Open.vi. Once there, right-click the Application Reference and click "Select ActiveX Class" >> Browse. From the drop-down menu, choose the library type for something with Mathcad in the name (it will probably be IMathcad, but if it's not, choose anything with Mathcad). After that is selected, choose the object with Application in the title from the box below the drop-down menu and select OK. Run the example and see if the problem disappears.
    I have provided some screenshots of the functions that I was looking at.
    National Instruments
    RIO Embedded Hardware PSE
    CompactRIO Developers Guide
    Attachments:
    Mathcad Open VI.GIF ‏7 KB
    Mathcad_App_Open.GIF ‏9 KB
    Application Reference.GIF ‏15 KB

  • My accelerometer outputs a charge proportional to the acceleration to which it is subjected. Can LabVIEW output the waveform in voltage? If not, how can I overcome the problem?

    I am using LabVIEW RT with a PCI-7030 DAQ board and a CB-68LP. This is urgent; can somebody please help?

    I suggest you consult the manufacturer of the devices to see if they sell or can recommend signal conditioning gear to output a voltage or a current.
    Generally speaking, the charge can be measured by using a capacitor (where V = Q/C), but the leakage current that is introduced when monitoring the voltage across the cap will adversely affect the measurements.
    I have designed circuits to do this, but it is quite unorthodox.
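    As a worked instance of the V = Q/C relation mentioned above (the figures are illustrative, not from any accelerometer datasheet):

```python
# V = Q / C for the capacitor-based charge measurement described above.
# The 10 pC / 1 nF figures are examples only.

def charge_to_voltage(charge_coulombs, capacitance_farads):
    return charge_coulombs / capacitance_farads

v = charge_to_voltage(10e-12, 1e-9)   # 10 pC into 1 nF -> 0.01 V
```

    This also shows why the capacitor must be small relative to the charge: a large C makes the measurable voltage tiny, which is where leakage and noise dominate.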
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • How can I use Easy Custom Device Tool to do the same thing LabVIEW the Basic Averaged DC-RMS VI does

    I am using a cRIO-9024 and am reading PWM with an NI 9205 analog input card.  It was suggested that I use the Basic Averaged DC-RMS VI to convert the PWM signal voltage into an analog (RMS) value for use in my system.  The issue is that the 9024 cannot use LabVIEW models (it runs VxWorks). I am working through the Easy Custom Device Tool, and wonder if someone can guide me in the right direction.
    Solved!
    Go to Solution.

    Hello,
    Can you give more details about your VeriStand configuration? Are you using the Scan Interface (and thus the Scan Engine custom device), or are you using the FPGA of your cRIO chassis to access the I/O?
    If you are programming the FPGA and using an FPGA personality in VeriStand, you can probably add calculations such as RMS to the FPGA code directly.
    If you are using the Scan Engine, you will need a custom device. The Easy Custom Device Tool is just a tool; you will need to know more about what a custom device is and how to develop one:
    - http://www.ni.com/white-paper/12712/en/
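    For reference, the core of what Basic Averaged DC-RMS returns for one window of samples is just the mean (DC) and root-mean-square. A custom device would apply something like this minimal sketch per block of acquired samples:

```python
import math

# Minimal stand-in for the DC and RMS values of one window of samples,
# the same quantities the Basic Averaged DC-RMS VI computes (without its
# windowing options). Real use would run this per block of PWM voltages.

def dc_rms(samples):
    n = len(samples)
    dc = sum(samples) / n
    rms = math.sqrt(sum(s * s for s in samples) / n)
    return dc, rms

dc, rms = dc_rms([0.0, 5.0, 0.0, 5.0])   # 50% duty PWM between 0 V and 5 V
```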
    Regards
    Eric M. - Application Engineering Specialist
    Certified LabVIEW Architect
    Certified LabWindows™/CVI Developer

  • LabView course basic 1 and basic 2

    Hey,
    I'm rather new to LabVIEW, and to get more experience in programming
    LabVIEW I thought it might be a good idea to take the Basics 1 and Basics 2
    courses provided by NI (and in some time go and see the DAQ course).
    Anyone out there who took this course (these courses) ?
    My basic topic is DAQ. (Not faster than 100 S/s at this moment). Before I
    decide to follow this course it would be a good idea to find out if I
    already got the experience I would get from this course. In that case it's a
    waste of money of course.
    Who can tell me what you really learn at Basic 1 and 2. Is it a waste of
    money (and time)?
    Perhaps it helps if I send some vi's I've created and tell them what they
    do. (In separate e-mail)
    please reply on
    e-mail
    Boozz.

    I've taken both Basics I and II, and the LabVIEW DAQ course. In my case
    they were worth it. I'm not sure if your situation is the same as mine,
    though. My schedule is usually pretty full. So I was able to take these
    courses and get up to a point where I could do all the basic functions in
    one week. That was a big help. I might add that I'm an instructor by
    trade, not a programmer, so I don't have years of C programming behind me.
    Also, my company paid for the courses, which made my decision a lot easier!
    Another benefit that really helped me is that the instructors really cared,
    and offered to give me some help even after the course was over. I didn't
    want to abuse this offer, so I was very careful in how much I asked of them,
    but they both were more than
    helpful. This is the kind of service that's
    hard to find out there! Both instructors worked for National Instruments
    Alliance Member companies. They were knowledgeable and serious about
    ensuring I got my money's worth.
    If you are already familiar with wiring VIs, then I would suggest that you
    go to the NI web site and look at the detailed course outline for these
    courses. If you are already familiar with what is covered in these courses,
    then obviously you wouldn't gain much by taking them. If you need to learn
    a lot of what is shown in the outlines, then there is no quicker way to get
    up to speed.
    Best of luck to you,
    Mike

  • Using LabView (Output to file function) to retrieve GPS RMC data

    Hello Everyone,
    I was wondering if anyone knows the process of getting the RMC data from a GPS unit into a labview file.
    I already have it being processed by the program, and now I need to acquire the data and put it into a file where I can analyze it. The tricky part for me is getting it into a file (speed, directional properties, latitude and longitude) and organised so that I can shove it into excel and do my calculations there.
    If anyone has any tips or answers, it would be greatly appreciated :-)
    - Luke

    If you can interface to the GPS unit, it should not be a problem to write the data to an ASCII file. Are you using an RS-232 link for communication with your GPS?
    Once the data is in the computer, you have several options for parsing the string. An RMC sentence may look something like this: $GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A  The first thing to do is to create an array of strings using the
    "Spreadsheet String To Array" function with "," as the delimiter.
    EDIT: Take a look here also http://www.gpsinformation.org/dale/nmea.htm#RMC
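    Applying the comma-split idea to the sample sentence might look like the sketch below; the field indices follow the standard NMEA 0183 RMC layout, and checksum verification is omitted for brevity:

```python
# Parse the RMC fields needed (speed, course, lat, lon) by splitting on
# commas, as suggested above. Field positions follow the NMEA 0183 RMC
# sentence layout.

def parse_rmc(sentence):
    f = sentence.split("*")[0].split(",")   # drop checksum, split fields
    return {
        "time":   f[1],
        "status": f[2],                 # A = valid, V = void
        "lat":    f[3] + f[4],          # e.g. "4807.038N"
        "lon":    f[5] + f[6],
        "speed_knots": float(f[7]),
        "course_deg":  float(f[8]),
    }

rmc = parse_rmc("$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A")
```

    Writing one such record per line, comma- or tab-separated, gives a file Excel can open directly.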
    Besides which, my opinion is that Express VIs Carthage must be destroyed deleted
    (Sorry no Labview "brag list" so far)

  • Can Labview output source code in text format?

    I am trying to file a patent that includes some Labview code. The patent office recommends attaching code (either source code or machine code) to show reduction to practice.
    However, it appears that the USPTO only accepts CD-Rs that have ASCII text on them. Is there a way to export Labview source code or machine code (from application builder?) to text?
    Does anyone have any advice for dealing with the PTO when Labview code is involved?
    Thanks,
    Casey

    > I am trying to file a patent that includes some Labview code. The
    > patent office recommends attaching code (either source code or machine
    > code) to show reduction to practice.
    >
    > However, it appears that the USPTO only accepts CD-Rs that have ASCII
    > text on them. Is there a way to export Labview source code or machine
    > code (from application builder?) to text?
    >
    > Does anyone have any advice for dealing with the PTO when Labview code
    > is involved?
    >
    I believe this has been done before. I did a quick search on
    www.uspto.gov, and found lots of them referencing LabVIEW -- meaning
    that LabVIEW was mentioned in the patent, and probably that LV code was
    the mentioned embodiment of the patent. I didn't browse all of them to
    see if or how they included source, but surely you can find a precedent
    for including a pdf or rtf with your diagrams.
    If you can't find anything else, print your VIs to RTF and see about
    submitting that.
    Greg McKaskle

  • Calculating scale factor from miliamps input to labview output

    I am setting up a testing fixture with FieldPoint and LabVIEW. I am bringing a 4-20 mA signal into the FieldPoint. I need to scale it so I can have pressure and vacuum readings on my LabVIEW screen. Do I use a multiply function or make a subVI?
    Thanks, Bob

    I built a subVI for my amp meter. I have a current transducer that is set to read 100 amps, so we used y = mx + b and got the scaling right on. I copied the subVI and used it for a pressure reading. This time the scale was based on a 250 PSI transducer that puts out a 4-20 mA signal. Again, this one works perfectly; it is dead on compared to the precision analog gauge. The problem is with my vacuum transducer. It is a 0-200 psi or 0-30" Hg unit. I calculated the y = mx + b for this subVI and I cannot get a valid vacuum reading. I checked the port to make sure I was pulling vacuum there.
    In the subVI, if you click on it, it shows the original calculation for the amps. I tried to save the subVI under a different name, but it will not allow two subVIs in the same state. I was able to get the pressure to work by changing the value of the constant that is connected to the Y multiplier.
    So how do I get the vacuum meter to show a 0-30" value?
    Thanks for the reply, Bob
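    As a sketch of the y = mx + b scaling for a 4-20 mA transducer: the slope is the engineering span over the 16 mA current span, and the offset is chosen so 4 mA reads zero. Building one scaler per transducer, rather than copying a subVI and editing its constants, avoids exactly the mix-up described above. The span figures match the transducers mentioned:

```python
# y = m*x + b scaling for a 4-20 mA transducer, as in the subVI above.
# m = span / 16 mA, b chosen so that 4 mA maps to the low end of the span.

def make_scaler(low_eng, high_eng, low_ma=4.0, high_ma=20.0):
    m = (high_eng - low_eng) / (high_ma - low_ma)
    b = low_eng - m * low_ma
    return lambda ma: m * ma + b

psi = make_scaler(0.0, 250.0)     # 0-250 PSI pressure transducer
vac = make_scaler(0.0, 30.0)      # 0-30" Hg vacuum transducer
```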

  • Generating DLL with LabVIEW - A basic question

    Hi everybody !
    I'm trying to generate a simple DLL with LV. I have read as much as I could on the subject, prepared a small test VI (calculate x*x), built the corresponding DLL, and inserted and configured the DLL into another VI... Everything seems to be fine, except that the DLL returns 0 whatever I do.
    I'm obviously missing something here, and I would be very grateful if somebody could guide me out of this stupid corner.
    CC
    Chilly Charly    (aka CC)
             E-List Master - Kudos glutton - Press the yellow button on the left...        

    Hi Chrisger & Becktho !
    Thanks for your replies.
    Unfortunately I'm still stuck.
    First thing: I already followed exactly the path developed by Chrisger. I also tried changing the calling convention as suggested by Becktho. No success.
    Second thing: the DLL provided by Chrisger works perfectly (sorry Becktho, you just forgot the attachment...). At least that's something that shows I'm not entirely stupid, since I'm able to use a DLL compiled by someone else!
    So, this seems to be a problem related to the way I am generating the DLL. Just in case, I have attached the files I'm using (the source, the .bld file, and the generated DLL). Could you give them a look/trial?
    CC
    Chilly Charly    (aka CC)
             E-List Master - Kudos glutton - Press the yellow button on the left...        
    Attachments:
    Test DLL.zip ‏18 KB

  • PING))) Ultrasonic Distance Sensor (#28015) LabVIEW

    We are doing a project with the PING))) Ultrasonic Distance Sensor (#28015) using LabVIEW 9.0.  We are having difficulty interfacing the PING))) sensor with the NI ELVIS II+.  Also we are unsure of how to make a time delay in the microsecond range.  Any suggestions are appreciated.

    LabVIEW, under Windows, is not capable of a microsecond delay. Windows, in general, has a latency of 1 to 10 ms depending on who you ask and what version of Windows, machine, etc.; however, I have seen LabVIEW commit to delays of 2 ms reliably. You just have to test it. LabVIEW Real-Time is a different story, but I don't think you are running RT with ELVIS.
    As far as "having difficulty interfacing the PING))) sensor with the NI ELVIS II+", that question is so general I wouldn't know where to start. But I must say that the sensor you are using was designed to interface directly with microelectronics like the Basic Stamp. You will not reliably measure microsecond pulses through high-level hardware and Windows. But let's say you don't need microseconds... the sensor outputs TTL. If you can read a digital input with your ELVIS, then that's where to start to get "something" out of the PING))) and into ELVIS/LabVIEW.
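    Even if Windows cannot time the echo pulse itself, converting a measured pulse width to distance is simple arithmetic, assuming sound travels at roughly 343 m/s at room temperature (the pulse covers the round trip, so the time is halved):

```python
# Convert a PING))) echo pulse width to distance. The 343 m/s figure is an
# assumption (speed of sound at roughly 20 degrees C), not a sensor spec.

SPEED_OF_SOUND_M_PER_S = 343.0

def pulse_us_to_distance_m(pulse_us):
    one_way_s = (pulse_us / 2.0) * 1e-6   # half the round trip, in seconds
    return one_way_s * SPEED_OF_SOUND_M_PER_S

d = pulse_us_to_distance_m(1166)   # ~0.2 m
```

    So even a coarse millisecond-resolution measurement still resolves distances on the order of tens of centimeters.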
    Richard

  • Passing two variables from LabView to Javascipt program or Vice Versa

    Ok, let me be blunt here. Below is my code to move an Ethernet camera. As you can see, when the page is opened you are immediately prompted to enter values for an X-coordinate and a Y-coordinate. Once your values have been entered, the camera moves based on them. Fantastic, right? Here is where I'm scratching my head. My LabVIEW program basically takes in sound (short version here) and is able to determine the position the sound is coming from as an x, y value. (NOTE: the program thus far determines the position but has not yet been extended to output an x, y value; for argument's sake, let's say it does.) My question to all of you is this: I have read other threads that asked this and noticed that the code (highlighted in blue) is what they think would do the trick. However, this needs to work continuously: either my program constantly feeds data to the X, Y coordinates, or it constantly needs to be prompted through LabVIEW. I don't know what would work best, or even what is feasible at this point.
    Any help, suggestions comments or opinions anything would be greatly appreciated.
    <script type = "text/javascript">
    <!--
    var x,          // x variable axis
        y,          // y variable axis
        counter;    // basic counter
    var newWindow;  // variable used to open new windows
    /* The following code should be used to implement LabVIEW into this program.
       It is left commented out and unmodified since it was given to me.
       NOTES: Consider initializing lvapp, vi and the variables.
       NOTES: The response function should be changed to X and Y.
       NOTES: The parameter "Output" should be whatever the team uses to distinguish X and Y.
        lvapp = new ActiveXObject("Labview.Application");
        viPath = "C:\\test.vi";
        vi = lvapp.GetVIReference(viPath);     // Load the vi into memory
        vi.FPWinOpen = 1;                      // Open front panel
        vi.SetControlValue("Input", 125);      // set the input parameter, 125 is just a sample
        vi.Run();                              // run the VI here; the "Call" method without
                                               // parameters does not work since it uses
                                               // the defaults of the controls
        response = vi.GetControlValue("Output");  // get the output parameter
        alert("Result from LV: " + response);
    */
    // Initialization phase
    counter = 0; // prepare to loop
    while (counter <= 0) { // loop once
        // prompt the user for the coordinates
        x = window.prompt("Enter X-axis coordinate:", "0");
        y = window.prompt("Enter Y-axis coordinate:", "0");
        counter = counter + 1;

        // if statements for camera control
        // Home position
        if (x == 0 && y == 0) {
            newWindow = open('http://192.168.0.3/cgi-bin/camctrl.cgi?move=home', 'secondWindow',
                'scrollbars,resizable,width=500,height=400');
            if (newWindow && !newWindow.closed) {
                newWindow.close();
            }
        }
        if (x == 1 && y == 0) {
            newWindow = open('http://192.168.0.3/cgi-bin/camctrl.cgi?move=right', 'secondWindow',
                'scrollbars,resizable,width=500,height=400');
            if (newWindow && !newWindow.closed) {
                newWindow.close();
            }
        }
    }
    // -->
    </script>
    </head>

    I know that it's not very elegant, but how about writing the data to a file which could be read by the other application?  You could use some sort of locking scheme or queue to make sure the data stays current.
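    A sketch of that file-based handoff, with the writer replacing the file atomically via rename so the reader never sees a half-written coordinate pair (the file name and JSON format are arbitrary choices, not anything the camera or LabVIEW requires):

```python
import json
import os
import tempfile

# Writer side (what LabVIEW, or a helper script, would do): write to a temp
# file and rename it over the target, so a concurrent reader always sees a
# complete file. os.replace is atomic on the same filesystem.

def write_coords(path, x, y):
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        json.dump({"x": x, "y": y}, f)
    os.replace(tmp, path)

# Reader side (what the page's script would poll):
def read_coords(path):
    with open(path) as f:
        d = json.load(f)
    return d["x"], d["y"]
```

    The JavaScript side would then poll the file (or an HTTP endpoint serving it) instead of prompting the user.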
    "There is a God shaped vacuum in the heart of every man which cannot be filled by any created thing, but only by God, the Creator, made known through Jesus." - Blaise Pascal
