Arm controller

Hello everybody
I'm searching for a VI concerning supervision with the Control Design and Simulation of an arm controller. This example is supposed to be included in the LabVIEW examples, but I can't find it on my machine.
Here is the picture.
Thank you.
Solved!
Go to Solution.
Attachments:
ui_picture_control.jpg ‏112 KB

Hello Zanoubia,
I hope you're doing well today. What version of LabVIEW do you have installed? From what I can tell, it looks like these examples are not installed with the Base LabVIEW package. So that could be part of the reason why you're missing them. Did you once have these examples and they are now no longer on your machine?
When you navigate to the LabVIEW Examples folder, does the /picture directory exist? (C:\Program Files\National Instruments\LabVIEW 20##\examples\picture)
If you do have LabVIEW Full or Professional, you should be able to reinstall these file components by performing a repair on your software. If you can please elaborate on the question above and I will do my best to help you.
Tim A.
National Instruments

Similar Messages

  • USB DAQ with ARM controller boards

    Hello,
     I wish to use a USB DAQ device (for example the USB-6509) with an ARM Cortex-M3 board. I recently purchased a Cortex-M3 board (by Luminary Micro) from NI with the LabVIEW embedded software. I wish to control and monitor a large number of digital I/Os using the USB-6509, but the controller card I wish to use is the Cortex-M3 board.
    Can someone tell me if this is possible at all? If yes, how do I go about it?
    Thanks in advance
    Nabhi

    Hi Andy,
    There are a few interfaces that can be used depending on the data you want to send.
    If you want to use WiFi, then dongles will be difficult, as you would have to write a USB driver for them. The simplest method would be to use a WiFi Ethernet access point to put it on the network.
    If you are looking at general wireless technologies, there is quite a lot based on serial links. If you look at http://www.active-robots.com/products/radio-solutions/index.shtml, there is a lot based on RS-232, which is available on the ARM7 board. You may also find some based on UART, SPI, or I2C ports that should be able to connect.
    I also wanted to draw your attention to the Single-Board RIO. It offers the same form factor but is much more powerful, giving you a lot of DIO and a programmable FPGA. It is a single-board version of the CompactRIO, which has the same features but is larger and heavier, so depending on the project the cRIO may not be suitable. Check out ni.com/singleboard and maybe get in touch with your local NI branch for more information. The sbRIO will offer the same connectivity options as the ARM board, with a WiFi access point being the simplest way to connect.
    Cheers,
    James Mc
    ========
    CLA and cRIO Fanatic
    wiresmithtech.com/blog

  • 1 Button 2 Functions

    I'm working with the Luminary Micro Evaluation Board (LM3S8962). In a case structure, one case clears the screen when the select button is pushed. I would like to make it so that it also stops the program when the same button is pushed and held for a period of time, let's say 5 seconds, but if the select button is just pushed (not held), it only clears the screen and keeps running. I can't seem to figure out how to use the Elapsed Time VI correctly, or maybe there is another way? Also, with what I've tried, the Elapsed Time VI seems to pick up counting where it left off, so by the time I push select, say, 3 times (even though I push and release each time), it returns a true. Can anyone help?
    Attachments:
    ElapsedTimeImage.PNG ‏36 KB

    I'm an electronics student and I just got started with LabVIEW, so everything is still pretty confusing. I opened up an event structure, but as I started dissecting it and reading about it I became very lost. If you don't mind, can you break it down like you're talking to a four-year-old? Do I use the event structure outside of the case structure? Do I use it in the same case as the reset button, or make another case? What terminals do I need to connect to what? I was up reading about the parts of the event structure, which seemed to get me nowhere because I needed to know what the other things involved are. What I am building is an Etch-a-Sketch to run on an ARM controller with an OLED display. Everything works except this stop-button idea.

  • Sensor data to be displayed on indicators

    hello friends,
    I'm working on a SCADA automatic irrigation system project in which I have used 4 sensors and an ARM controller.
    I am able to display all my values in an array in LabVIEW, but I would like to show them on indicators (temperature, tank, etc.).
    Please tell me the detailed procedure or steps that need to be done, as I'm a beginner.
    Thank you.
    Attachments:
    DSC_0210.jpg ‏694 KB
    DSC_0211.jpg ‏829 KB

    Duplicate - http://forums.ni.com/t5/LabVIEW/how-to-display-array-values-on-indicators/m-p/3135587
    It is just rude to start a new thread with the exact same question and the exact same images from your phone. You have not done anything there that shows you are paying attention to the answers that have been given, or performed any of the suggested steps to debug your problem.

  • Zedboard Xilinx Zync 7000 interface using labview

    Hello,
    I am doing my thesis on the ZedBoard, developing a DDR3 memory test and verification. For that I need to implement a dedicated LabVIEW graphical user interface based on NI Measurement Studio.
    The topic is: Study of an algorithmic test setup for DDR3 SDRAM with a Xilinx Zynq SoC.
    I have implemented my algorithm in the Xilinx SDK, but I need to make a GUI using LabVIEW that helps execute these programs. Please let me know how I can do this.
    1. Is it possible to directly access the Zynq SoC using LabVIEW? If yes, how?
    2. Or do I need to do the coding in the Xilinx SDK, and how can I run this code using LabVIEW?
    Please give me a detailed reply, since I am new to LabVIEW and not sure how to start. If you have any example design, please share it with me.
    Thanks & Regards,
    Nithin Ponnarasseri
    Solved!
    Go to Solution.

    No, you can't develop directly in LabVIEW and deploy that program to the Zynq board. NI has its own Zynq-based hardware platforms (cRIO and myRIO), but the tools to target those boards are specific to the NI hardware implementation and won't work with other hardware. Developing an interface for another hardware platform is a lot of work and needs to be adapted for every single flavor of a new hardware platform, and NI does not support this for third-party hardware.
    So your option will be to develop an application with the Xilinx SDK for the Zynq ARM controller and provide some form of communication interface (serial port, TCP/IP, or similar) in that application, over which you can send commands from LabVIEW to your embedded application.
    Rolf Kalbermatter
    CIT Engineering Netherlands
    a division of Test & Measurement Solutions

  • Choosing a PXIe controller for streaming 200 MBps

    Warning: This is a long post with several questions. My apologies in advance.
    I am a physics professor at a small liberal-arts college, and will be replacing a very old multi-channel analyzer for doing basic gamma-ray spectroscopy. I would like to get a complete PXI system for maximum flexibility. Hopefully this configuration could be used for a lot of other experiments, such as pulsed NMR, but the most demanding role of the equipment would be gamma-ray spectroscopy, so I'll focus on that.
    For this, I will need to be measuring either the maximum height of an electrical pulse, or (more often) the integrated voltage of the pulse.  Pulses are typically 500 ns wide (at half maximum), and between roughly 2-200 mV without a preamp and up to 10V after the preamp.  With the PXI-5122 I don't think I'll need a preamp (better timing information and simpler pedagogy).  A 100 MHz sampling rate would give me at least 50 samples over the main portion of the peak, and about 300 samples over the entire range of integration.  This should be plenty if not a bit of overkill.
    My main questions are related to finding a long-term solution, and keeping up with the high data rate.  I'm mostly convinced that I want the NI PXIe-5122 digitizer board, and the cheapest (8-slot) PXIe chassis.  But I don't know what controller to use, or software environment (LabView / LabWindows / homebrew C++).  This system will likely run about $15,000, which is more than my department's yearly budget.  I have special funds to accomplish this now, but I want to minimize any future expenses in maintenance and updates.
    The pulses to be measured arrive at random intervals, so performance will be best when I can still measure the heights or areas of pulses arriving in short succession.  Obviously if two pulses overlap, I have to get clever and probably ignore them both.  But I want to minimize dead time - the time after one pulse arrives that I become receptive to the next one.  Dead times of less than 2 or 3 microseconds would be nice.
    I can imagine two general approaches.  One is to trigger on a pulse and have about a 3 us (or longer) readout window.  There could be a little bit of pileup inspection to tell if I happen to be seeing the beginning of a second pulse after the one responsible for the trigger.  Then I probably have to wait for some kind of re-arming time of the digitizer before it's ready to trigger on another pulse.  Hopefully this time is short, 1 or 2 us.  Is it?  I don't see this in the spec sheet unless it's equivalent to minimum holdoff (2 us).  For experiments with low rates of pulses, this seems like the easiest approach.
    The other possibility is to stream data to the host computer, and somehow process the data as it rolls in.  For high rate experiments, this would be a better mode of operation if the computer can keep up.  For several minutes of continuous data collection, I cannot rely on buffering the entire sample in memory.  I could stream to a RAID, but it's too expensive and I want to get feedback in real time as pulses are collected.
    With this in mind, what would you recommend for a controller?  The three choices that seem most reasonable to me are getting an embedded controller running Windows (or Linux?), an embedded controller running Labview real-time OS, or a fast interface card like the PCIe8371 and a powerful desktop PC.  If all options are workable, which one would give me the lowest cost of upgrades over the next decade or so?  I like the idea of a real-time embedded controller because I believe any run-of-the-mill desktop PC (whatever IT gives us) could connect and run the user interface including data display and higher-level analysis.  Is that correct?  But I am unsure of the life-span of an embedded controller, and am a little wary of the increased cost and need for periodic updates.  How are real-time OS upgrades handled?  Are they necessary?  Real-time sounds nice and all that, but in reality I do not need to process the data stream in a real-time environment.  It's just the computer and the digitizer board (not a control system), and both should buffer data very nicely.  Is there a raw performance difference between the two OSes available for embedded controllers?
    As for live processing of the streaming data, is this even possible?  I'm not thinking very precisely about this (I'd really have to just try and find out), but it seems like it could possibly work on a 2 GHz dual-core system.  It would have to handle 200 MBps, but the data processing is extremely simple.  For example, one thread could mark the beginnings and ends of pulses and do simple pile-up inspection.  Another thread could integrate the pulses (no curve fitting or interpolation necessary, just simple addition) and store the results in a table or list.  Naively, I'd have not quite 20 clock cycles per sample.  It would be tight.  Maybe just getting the data into the CPU cache is prohibitively slow.  I'm not really even knowledgeable enough to make a reasonable guess.  If it were possible, I imagine I would need to code it in LabWindows/CVI and not LabVIEW.  That's not a big problem, but does anyone else have a good read on this?  I have experience with C/C++, and some with LabVIEW, but not LabWindows (yet).
    What are my options if this system doesn't work out?  The return policy is somewhat unfriendly, as 30 days may pass quickly as I struggle with the system while teaching full time.  I'll have some student help and eventually a few long days over the summer.  An alternative system could be built around XIA's Pixie-4 digitizer, which should mostly just work out of the box.  I somewhat prefer the NI PXI-5122 solution because it's cheaper, has better performance and much more flexibility, and suffers less from vendor lock-in.  XIA's software is proprietary and very costly.  If support ends or XIA gets bought out, I could be left with yet another legacy system.  Bad.
    The Pixie-4 does the peak detection and integration in hardware (FPGAs I think) so computing requirements are minimal.  But again I prefer the flexibility of the NI digitizers.  I would, however, be very interested if data from something as fast as the 5122 could be streamed into an FPGA-based DSP module.  I haven't been able to find such a module yet.  Any suggestions?
    Otherwise, am I on the right track in general on this kind of system, or badly mistaken about some issue?  Just want some reassurance before taking the plunge.

    drnikitin,
    The reason you did not find the spec for the rearm time for the 5133 is because the USB-5133 is not capable of multi-record acquisition. The rearm time is a spec for the reference trigger, and that trigger is used when fetching the next record. So every time you want to do another fetch, you will have to stop and restart your task.
    To grab a lot of data, increase your minimum record size. Keep in mind that you have 4 MB of onboard memory per channel.
    Since you will only be able to fetch 1 record at a time, there really isn't a way to use streaming. When you call fetch, it will transfer the amount of data you specify to PC memory through the USB port (up to 12 MB/s for USB 2.0, ideally).
    Topher C,
    We do have a digitizer that has onboard signal processing (OSP), which would be quicker than performing post processing. It is the NI 5142, and it can perform the following signal processing functions. It is essentially a 5122 but with built-in OSP. It may be a little out of your price range, but it may be worth a look.
    For more information on streaming, take a look at these two links (if you haven't already):
    High-Speed Data Streaming: Programming and Benchmarks
    Streaming Options for PXI Express
    When dealing with different LabVIEW versions, it is important to note that previous versions will be compatible with new versions, such as going from 8.0 to 8.5. Keep in mind that if you go too far back then LabVIEW may complain, but you still may be able to run your VI. If you have a newer version going to an older version, then we do have options in LabVIEW to save your VI for older versions. It's usually just 1 version back, but in LabVIEW 8.5 you can save for LabVIEW 8.2 and 8.0.
    ESD,
    Here is the link I was referring to earlier about DMA transfers. DMA is actually done every time you call a fetch or read function in LabVIEW or CVI (through NI-SCOPE).
    Topher C and ESD,
    LabVIEW is a combination of a compiled language and an interpreted language. Whenever you make a change to the block diagram, LabVIEW compiles itself. This way, when you hit run, it is ready to execute. During execution, LabVIEW uses the run-time engine to reference shared libraries (such as DLLs). Take a look at this DevZone article about how LabVIEW compiles its block diagram (user code).
    I hope all of this information helps!
    Ryan N
    National Instruments
    Application Engineer
    ni.com/support

  • How do I use a xbox360 controller in labview 7?

    So I've just been given a project in my research lab..
    We have to control a gantry arm using LabVIEW. The gantry control is already completely written in LabVIEW using a DataSocket server. However, I've never worked with LabVIEW before. All I know is that it takes in a number as a velocity and moves the arm according to that velocity. What I have to do is use the Xbox 360 controller to input that number (a velocity), as well as calibrate it. If possible, I'd like to use the magnitude of the joystick angle to input a different velocity.
    How would I go about doing this? I have installed all the necessary software to talk to the controller, but I don't know how to set it up in LabVIEW.
    Also, Windows XP sees it as a game controller, if that makes any difference.
    Message Edited by ShadowGray on 08-13-2008 10:56 AM

    You cannot connect those two nodes directly because they're different datatypes, as the message indicates. The axis info contains 8 elements. Thus, you have to pick the one that applies to you and unbundle it from the cluster. For example:
    To learn more about LabVIEW it is recommended that you go through the tutorial(s) and look over the material in the NI Developer Zone's Learning Center which provides links to other materials and other tutorials. You can also take the online courses for free.
    Message Edited by smercurio_fc on 08-13-2008 01:08 PM
    Attachments:
    Example_VI_BD.png ‏1 KB

  • Logic Pro X keeps sending C0 notes to MIDI out when record is armed on an audio track

    I have an annoying problem with Logic Pro X, which I bought a month ago. Here's a little scenario that describes the problem.
    I disconnect my audio interface, so that Logic Pro has to rely on my MacBook Pro's internal audio. This way I exclude my RME-FF800 from being of any influence.
    I start up the Midi Monitor tool, to analyse what comes out of Logic; MIDI-wise.
    I start up Logic and set up a small test project in Logic Pro X that contains 1 audio track.
    There's a little button on the audio track with a red-colored "R". This is the button to "arm" the track for recording audio.
    I push the little "R" button, the R-character turns white, the button as a whole turns red. The button starts blinking.
    From this moment on, Logic starts sending MIDI messages ....... huh?  Every second or so, a low C.
    In my regular studio setup this means that I hear a low repeated note coming out of my piano, every time I record audio.  How annoying!!
    By the way ... a Guitar/Bass track has the same problem. All the other track types (External midi, software instrument, drummer) don't have this problem.
    Help is deeply appreciated, because it drives me crazy, and if I can't fix this I think I'll have to return to my previous DAW of choice (Cubase 7.5).
    Here's the Midi Monitor output.
    20:46:01.241          To MIDI Monitor (midi monitor)          Note On          2          C0          127
    20:46:02.008          To MIDI Monitor (midi monitor)          Note Off          2          C0          0
    20:46:02.774          To MIDI Monitor (midi monitor)          Note On          2          C0          127
    20:46:03.541          To MIDI Monitor (midi monitor)          Note Off          2          C0          0
    20:46:04.308          To MIDI Monitor (midi monitor)          Note On          2          C0          127
    20:46:05.075          To MIDI Monitor (midi monitor)          Note Off          2          C0          0
    20:46:05.841          To MIDI Monitor (midi monitor)          Note On          2          C0          127
    20:46:06.608          To MIDI Monitor (midi monitor)          Note Off          2          C0          0
    20:46:07.374          To MIDI Monitor (midi monitor)          Note On          2          C0          127
    20:46:08.140          To MIDI Monitor (midi monitor)          Note Off          2          C0          0
    20:46:08.908          To MIDI Monitor (midi monitor)          Note On          2          C0          127
    20:46:09.675          To MIDI Monitor (midi monitor)          Note Off          2          C0          0
    20:46:10.441          To MIDI Monitor (midi monitor)          Note On          2          C0          127
    20:46:11.208          To MIDI Monitor (midi monitor)          Note Off          2          C0          0
    20:46:11.975          To MIDI Monitor (midi monitor)          Note On          2          C0          127
    20:46:12.741          To MIDI Monitor (midi monitor)          Note Off          2          C0          0
    20:46:13.507          To MIDI Monitor (midi monitor)          Note On          2          C0          127
    20:46:14.274          To MIDI Monitor (midi monitor)          Note Off          2          C0          0
    20:46:15.043          To MIDI Monitor (midi monitor)          Note On          2          C0          127
    20:46:15.808          To MIDI Monitor (midi monitor)          Note Off          2          C0          0
    20:46:16.574          To MIDI Monitor (midi monitor)          Note On          2          C0          127
    20:46:17.341          To MIDI Monitor (midi monitor)          Note Off          2          C0          0
    Logic Pro X = version 10.0.6
    Mac Book Pro = late 2008.
      Model Name:          MacBook Pro
      Model Identifier:          MacBookPro5,1
      Processor Name:          Intel Core 2 Duo
      Processor Speed:          2,53 GHz
      Number of Processors:          1
      Total Number of Cores:          2
      L2 Cache:          6 MB
      Memory:          4 GB
      Bus Speed:          1,07 GHz
      Boot ROM Version:          MBP51.007E.B06
      SMC Version (system):          1.33f8
    Mac OSX = 10.9.1
      System Version:          OS X 10.9.1 (13B42)
      Kernel Version:          Darwin 13.0.0
      Boot Volume:          Macintosh HD
      Boot Mode:          Normal
      Secure Virtual Memory:          Enabled

    The note is likely being sent out because of a control surface you have installed or used, be it a hardware keyboard, a controller, or even an iPad controller app such as Logic Remote or TouchOSC, or even an iOS software synth app that is sending MIDI over wireless.
    So, first make sure you have force-quit any iPad/iPhone Logic/MIDI-related apps, and not just closed them.
    Then, if that doesn't fix things, delete Logic's control surface preferences and Logic's own user preferences too, as described here:
    Delete the user preferences
    You can resolve many issues by restoring Logic Pro X back to its original settings. This will not impact your media files. To reset your Logic Pro X user preference settings to their original state, do the following:
    In the Finder, choose Go to Folder from the Go menu.
    Type ~/Library/Preferences in the "Go to the folder" field.
    Press the Go button.
    Remove the com.apple.logic10.plist file from the Preferences folder. Note that if you have programmed any custom key commands, this will reset them to the defaults. You may wish to export your custom key command as a preset before performing this step. See the Logic Pro X User Manual for details on how to do this. If you are having trouble with a control surface in Logic Pro X, then you may also wish to delete the com.apple.logic.pro.cs file from the preferences folder.
    If you have upgraded from an earlier version of Logic Pro, you should also remove ~/Library/Preferences/Logic/com.apple.logic.pro.
    Restart the computer.
    Do not forget to quit Logic before doing this, and restart the Mac afterwards.

  • Solaris 9 , SAN attached MSL library : robotic arm access

    Could someone provide information regarding a Solaris 9 configuration issue?
    On a V880 using LP10K Emulex HBAs in a Brocade-switched SAN environment, I am having trouble configuring an HP MSL6060 library with 2 LTO3 drives.
    I am able to tar to the 2 LTO drives, but I do not know the steps to configure access to the library's robotic controller arm itself.
    Emulex HBAnyware shows 4 objects for the library, i.e. the library controller, the 2 drives, and the library NSR.
    At this point, at the OS level, devfsadm creates only the tape special files.
    I am more familiar with HP-UX than Solaris: what would be the Solaris commands to confirm the hardware is seen and usable at the OS level (the equivalent of ioscan -fnC disks plus insf to create special files on HP-UX)?
    On Solaris 9, I have used the modinfo, modunload, modload, and devfsadm commands after st.conf modifications to gain access to the drives without rebooting,
    BUT after sd.conf modifications, update_drv -f and devfsadm -v do not create any special file pointing to the robotic arm.
    The overall objective is to configure Data Protector indirect backup for the Solaris 9 server and the ability for that host to manage the robotic arm.
    Any help appreciated.

    I'm not sure if this will help, since this is from a very small environment, but here goes:
    I recently attached an old Compaq Manatee class mini-library to a Blade 100 via a dual-channel LSI SCSI adapter and was trying to manipulate the robot arm via the open-source mtx utility.
    Using the SCSI generic (sgen) driver, I was able to get this to work. Before I did the OS level change, at the prom, I did a probe-scsi-all and the system showed the two DLT7000 drives and the HP type 8 media changer. But without updating the sgen driver configuration, I could only see the DLT drives at the Solaris 9 level.
    The configuration file for the sgen driver resides at: /kernel/drv/sgen.conf The file itself actually is quite clearly commented by default, which saved me a lot of research. In addition, man sgen is an excellent resource.
    I added the following two lines to my sgen.conf file:
    device-type-config-list="changer";
    inquiry-config-list=    "*",            "*";
    In addition, since my changer is at SCSI id 6, I uncommented the following line:
    name="sgen" class="scsi" target=6 lun=0;
    After that I did an init 6, and lo and behold in my /var/adm/messages file:
    May 15 13:54:22 baetica17.arxmicarus.lan scsi: [ID 193665 kern.info] sgen0 at glm0: target 6 lun 0
    May 15 13:54:22 baetica17.arxmicarus.lan genunix: [ID 936769 kern.info] sgen0 is /pci@1f,0/pci@5/scsi@1/sgen@6,0
    Success! In addition, I found the following new directories and files in /dev:
    /dev/scsi/changer/c1t6d0
    which is a symlink to the following physical device file:
    c1t6d0 -> ../../../devices/pci@1f,0/pci@5/scsi@1/sgen@6,0:changer
    To make specifying this physical device file easier in mtx, I also added this symlink to /dev:
    /dev/changer -> ../devices/pci@1f,0/pci@5/scsi@1/sgen@6,0:changer
    My example here is from my simple hardwired SCSI environment. I'm not experienced in the specific SAN environment you're operating in, but my guess is that you would probably need to make sure that you have some sort of persistent reservation for the changer so that the physical device of the changer doesn't change between reboots.
    Hope this at least gets you going in a productive direction! Please let me (and anyone else who might read this) know how you solved the issue at your site.
    - Michael

  • Warner/superior electric's SS2000PCi motion controller interfacing with LabVIEW 6i

    Sir,
    In our application, we are controlling the movement of an X-Y arm on an X-Y table. For this we are using Superior Electric products:
    (a) Slo-Syn SS2000PCi Programmable Step Motor Controller
    (b) MD808 Motor Drive
    We are using two such controllers and motor drives to drive two 2 A Sanyo Denki stepper motors: one each along the X-axis and Y-axis. Along with the arm movement, data acquisition also has to be carried out, so the motion control and data acquisition have to be synchronized in software. The problem now is how to program the controller. Though MCPI Version 4.41 is available, we want to program the controller in LabVIEW 6i so that we can synchronize both motion control and data acquisition. There is no driver compatible with LabVIEW 6i.
    Are there any 32-bit DLLs for this controller? If anyone has these DLLs, please let me know. My e-mail ID: [email protected]
    That way I can call these DLLs in LabVIEW 6i and program it. Or else send me at least the detailed low-level command set of the controller in PDF format so that I can develop our own driver.
    Regards,
    Nagendra

    Nagendra,
    Unfortunately, I was unable to find any helpful resources for you based on a cursory web search. I recommend that you contact the manufacturer of the hardware and ask them if they have a driver (DLL) that you can use to interface with LabVIEW.
    Good luck with your application, and have a good day.
    Sincerely,
    Darren N.
    NI Applications Engineer
    Darren Nattinger, CLA
    LabVIEW Artisan and Nugget Penman

  • ARM Embedded serial buffers

    Hi All,
    Colleagues and I have had some very recent experience with serial buffers when using the ARM Embedded module (1.1) for LabVIEW with the LPC23xx series controllers (we are using the LPC2368/87). I thought it best to share, should others find some crazy things going on.
    We were losing packet information when large packets came through on the default 64-byte allocation for the incoming serial buffer. These large packets happened fairly regularly, with other small packets in between. At times we found that 2 bytes would be dropped, thus failing the CRC check and forcing us to drop the packet. It took some time to work out why (surely it was to do with our firmware!), but we soon realised we had to adjust this serial buffer size from 64 to something much larger (256 or 512), even though we have another buffer within our producer loop. This serial buffer change is done in the ARM_serial.c file, easily found using Keil uVision; line #48:
    #define SER_IN_BUF_SIZE   64
    Even though we initialised the buffer size (using "Serial Port Init.vi") to 512 there was no change to the hardcoded value of 64.
    In terms of RAM allocation, a change in the 'serial in buffer' size (power of 2 change) will generate a x4 memory allocation in your heap space (ZI data). This means, if I increase my buffer size by 256 bytes I also reduce my available heap space (run-time RAM) by 1024 bytes. This can be significant if you are very tight on memory (changed using line #100 in your 'target'.s file - LPC2300.s for me).
    I hope that helps for somebody.

    Resolving this problem did take us some time, first in identifying why we were still losing packets after setting the buffer size on the Init VI to 512, and second in realising that the setting wasn't taking effect. We had packets (the largest ones) with a size of 130 bytes coming in every 100 ms. Handling these packets takes time and heap space, especially if you use more than one queue (for more than one consumer loop). In terms of memory efficiency, we have stayed away from using too many queues due to dynamic memory allocation; if too many packets came in and we weren't able to process them quickly enough, the heap would become full and the controller would crash (a "Memory Abort" error, as indicated in the LabVIEW processor status window).
    We previously went over the known issues site and couldn't find a mention of serial buffer size allocation input on the ...init.vi (see: http://digital.ni.com/public.nsf/websearch/270545BCCF971FE9862574F20049095C?opendocument&Submitted&&...).
    You mentioned that it is an intended design, which is surprising. Having the option for the user to control their hardware settings using firmware (LabVIEW) would have been a real plus for NI.
    I also noticed that the serial buffer size is allocated for each port (well, we think so anyway). We have 4 ports on our controller, which is why we see a quadruple increase in heap allocation with an increase in buffer size. Is there some way for us to set the buffer size per port, giving the default 64 bytes to the unused ports and increasing the allocation only for those that need it? This would give the user more control to use memory efficiently, especially if you are only using 2 ports and are tight on memory.

  • Using data from a camera to move a robot arm

    Hey guys,
    I have a 6-motor robot arm that operates through serial commands.  The six motors are the base, elbow, shoulder, wrist, hand-twist, and hand-grip motors.  Each motor accepts a numerical value from 500-5500 to control its position; from reading the manual, I believe this is done through PWM.  My problem is that I am designing a game in which one component requires the robot arm to pick up a chip from a fixed position and move it to a game board.  The position on the game board where the robot arm puts the chip will vary depending on the user input.
    I thought of just recording the arm's position values for each location, but with PWM that can be slightly inaccurate, because the arm may not return exactly to the same location when the command is sent.  The squares on the board are 2 cm x 2 cm, so a bit of precision would be nice.  My remedy was to use a USB web camera to snap a picture of the board and set up a coordinate system from that image to guide the robot to the correct location.
    My questions are: (1) Is there an easier way to do this that I am missing? (2) If I went with this method, would I probably need 2 cameras - one for the x-y plane and one for the z plane? (3) Also, I'm having trouble finding examples of using IMAQ to record continuous images, attach a coordinate system to the region of interest, and then correlate that to movement of the robot arm.
    If anyone has had any experience doing this and can lead me in the right direction, that would be great and I'd deeply appreciate it!
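    For the "record the values" approach, one common pattern is to calibrate a few known poses and interpolate between them. A minimal sketch (the calibration numbers below are made up; real ones come from jogging the arm to each board corner and recording the commands that got it there):

    ```python
    # Hypothetical sketch: map a board square to a motor command in the
    # 500-5500 range by linear interpolation between two calibrated poses.
    # BASE_AT_COL0 / BASE_AT_COL6 are invented calibration values.

    def interpolate(cmd_at_min, cmd_at_max, frac):
        """Linearly interpolate a motor command; frac in [0.0, 1.0],
        clamped to the 500-5500 range the motors accept."""
        cmd = cmd_at_min + frac * (cmd_at_max - cmd_at_min)
        return max(500, min(5500, int(round(cmd))))

    # Example: base motor sweeps columns 0..6 of a 7-column board.
    BASE_AT_COL0, BASE_AT_COL6 = 1200, 4800   # assumed calibration points
    def base_command(col, num_cols=7):
        return interpolate(BASE_AT_COL0, BASE_AT_COL6, col / (num_cols - 1))
    ```

    The same two-point calibration would be repeated per joint; it won't fix PWM repeatability, but it keeps the lookup table small.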

    I'm not quite sure what you mean by sending serial commands and then referencing PWM. Most likely, your motor controller/drive receives serial commands from LabVIEW and then controls the motors with PWM signals to actually drive the movement. 
    If you are having slight inaccuracy in the movement, I'm also guessing that you are not using stepper motors. The much easier option than optical feedback would be to use encoder feedback on the axes, but I understand this would be difficult if you have predefined hardware without access to adding physical encoders. Without knowing more details of your hardware, it is difficult to recommend another solution.
    Regardless, the IMAQ programming would be very difficult to accurately identify and guide the robot manipulator. The simpler application I would implement if forced to use image recognition would be to re-zero the robot's position after a move by comparing images when in a "home" position. You could use a fairly simple pattern matching routine and then find the displacement from the original match. This would still take quite a bit of programming, but wouldn't be anywhere near as difficult as continuous processing.
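    The re-zeroing idea can be sketched with a toy template match (a pure-Python stand-in for pattern matching; IMAQ's actual functions are not shown here):

    ```python
    # Toy stand-in for pattern matching: slide a small template over a
    # grayscale image, score each offset by sum of absolute differences
    # (SAD), and report the best-match position. Comparing that position
    # between a "home" image and a later image gives the displacement
    # used to re-zero the arm.

    def match_template(image, template):
        """Return (row, col) of the best SAD match of template in image.
        image and template are 2-D lists of grayscale values."""
        ih, iw = len(image), len(image[0])
        th, tw = len(template), len(template[0])
        best_pos, best_sad = None, float("inf")
        for r in range(ih - th + 1):
            for c in range(iw - tw + 1):
                sad = sum(abs(image[r + i][c + j] - template[i][j])
                          for i in range(th) for j in range(tw))
                if sad < best_sad:
                    best_pos, best_sad = (r, c), sad
        return best_pos

    def displacement(home_pos, current_pos):
        """Pixel offset between the stored home match and the current one."""
        return (current_pos[0] - home_pos[0], current_pos[1] - home_pos[1])
    ```

    A real implementation would use IMAQ's (or OpenCV's) optimized matcher, but the flow is the same: match once in the home pose, match again after a move, and correct by the displacement.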
    Karl G.
    Applications Engineer
    ni.com/support

  • [USB network problem]arm linux to use RNDIS host funtion

    I am trying to use the RNDIS host function on my ARM Linux device, so I can use an Android phone to browse my device's WebUI over a USB cable.
    My device's platform is the at91sam9260, kernel version 2.6.19.
    If I plug my Android phone into my device and turn the phone's USB tethering function on, my device can find it, but the usb0 network device never shows up.
    [root@ICIM ~]# usb 1-1: USB disconnect, address 2
    usb 1-1: new full speed USB device using at91_ohci and address 3
    usb 1-1: configuration #1 chosen from 1 choice
    [root@ICIM ~]# lsusb
    Bus 001 Device 003: ID 04e8:6863 Samsung Electronics Co., Ltd
    Bus 001 Device 001: ID 0000:0000
    root@ICIM ~]# ifconfig -a
    eth0 Link encap:Ethernet HWaddr 00:24:DE:00:02:40
    inet addr:192.168.0.81 Bcast:192.168.0.255 Mask:255.255.255.0
    UP BROADCAST MULTICAST MTU:1500 Metric:1
    RX packets:0 errors:0 dropped:0 overruns:0 frame:0
    TX packets:0 errors:0 dropped:0 overruns:0 carrier:0
    collisions:0 txqueuelen:1000
    RX bytes:0 (0.0 B) TX bytes:0 (0.0 B)
    Interrupt:21 Base address:0x4000
    lo Link encap:Local Loopback
    inet addr:127.0.0.1 Mask:255.0.0.0
    UP LOOPBACK RUNNING MTU:16436 Metric:1
    RX packets:1 errors:0 dropped:0 overruns:0 frame:0
    TX packets:1 errors:0 dropped:0 overruns:0 carrier:0
    collisions:0 txqueuelen:0
    RX bytes:100 (100.0 B) TX bytes:100 (100.0 B)
    If I plug my Android phone into a Linux PC and turn the phone's USB tethering function on, the Linux PC can find the phone, and the usb0 network appears.
    Then I ping my phone from the Linux PC, and it succeeds:
    [root:/home/xjl/EMS_Kernel/linux-2.6.19_ICIM] lsmod |grep rndis_host
    rndis_host 7108 0
    cdc_ether 6464 1 rndis_host
    usbnet 13646 2 rndis_host,cdc_ether
    [root:/home/xjl/EMS_Kernel/linux-2.6.19_ICIM] ifconfig -a
    eth1 Link encap:Ethernet HWaddr 00:16:76:D4:98:EC
    inet addr:192.168.5.158 Bcast:192.168.5.255 Mask:255.255.255.0
    inet6 addr: fe80::216:76ff:fed4:98ec/64 Scope:Link
    UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1
    RX packets:12142625 errors:0 dropped:0 overruns:10372 frame:10372
    TX packets:7310990 errors:0 dropped:0 overruns:0 carrier:0
    collisions:0 txqueuelen:1000
    RX bytes:2921302257 (2.7 GiB) TX bytes:3720558580 (3.4 GiB)
    lo Link encap:Local Loopback
    inet addr:127.0.0.1 Mask:255.0.0.0
    inet6 addr: ::1/128 Scope:Host
    UP LOOPBACK RUNNING MTU:16436 Metric:1
    RX packets:12 errors:0 dropped:0 overruns:0 frame:0
    TX packets:12 errors:0 dropped:0 overruns:0 carrier:0
    collisions:0 txqueuelen:0
    RX bytes:720 (720.0 b) TX bytes:720 (720.0 b)
    usb0 Link encap:Ethernet HWaddr CE:3A:A2:93:89:73
    inet addr:192.168.42.101 Bcast:192.168.42.255 Mask:255.255.255.0
    inet6 addr: fe80::cc3a:a2ff:fe93:8973/64 Scope:Link
    UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1
    RX packets:11 errors:0 dropped:0 overruns:0 frame:0
    TX packets:20 errors:0 dropped:0 overruns:0 carrier:0
    collisions:0 txqueuelen:1000
    RX bytes:1094 (1.0 KiB) TX bytes:5718 (5.5 KiB)
    [root:/home/xjl/EMS_Kernel/linux-2.6.19_ICIM] ping 192.168.42.129
    PING 192.168.42.129 (192.168.42.129) 56(84) bytes of data.
    64 bytes from 192.168.42.129: icmp_seq=1 ttl=64 time=2.17 ms
    64 bytes from 192.168.42.129: icmp_seq=2 ttl=64 time=0.473 ms
    64 bytes from 192.168.42.129: icmp_seq=3 ttl=64 time=0.543 ms
    64 bytes from 192.168.42.129: icmp_seq=4 ttl=64 time=1.27 ms
    64 bytes from 192.168.42.129: icmp_seq=5 ttl=64 time=0.568 ms
    64 bytes from 192.168.42.129: icmp_seq=6 ttl=64 time=1.21 ms
    64 bytes from 192.168.42.129: icmp_seq=7 ttl=64 time=1.21 ms
    64 bytes from 192.168.42.129: icmp_seq=8 ttl=64 time=1.24 ms
    64 bytes from 192.168.42.129: icmp_seq=9 ttl=64 time=1.20 ms
    64 bytes from 192.168.42.129: icmp_seq=10 ttl=64 time=0.505 ms
    ^C
    --- 192.168.42.129 ping statistics ---
    10 packets transmitted, 10 received, 0% packet loss, time 9850ms
    rtt min/avg/max/mdev = 0.473/1.041/2.176/0.506 ms
    [root:/home/xjl/EMS_Kernel/linux-2.6.19_ICIM]
    Below is my kernel's USB configuration. On the Linux PC, the RNDIS host function uses rndis_host.ko, cdc_ether.ko, and usbnet.ko.
    I compiled all of these drivers into my kernel, but it doesn't work.
    usb support
    <*> Support for Host-side USB
    [ ] USB verbose debug messages
    --- Miscellaneous USB options
    [*] USB device filesystem
    [ ] Enforce USB bandwidth allocation (EXPERIMENTAL)
    [ ] Dynamic USB minor allocation (EXPERIMENTAL)
    --- USB Host Controller Drivers
    < > ISP116X HCD support
    <*> OHCI HCD support
    < > SL811HS HCD support
    --- USB Device Class drivers
    <*> USB Modem (CDC ACM) support
    < > USB Printer support
    --- NOTE: USB_STORAGE enables SCSI, and 'SCSI disk support'
    --- may also be needed; see USB_STORAGE Help for more information
    < > USB Mass Storage support
    [ ] The shared table of common (or usual) storage devices
    --- USB Input Devices
    < > USB Human Interface Device (full HID) support
    USB HID Boot Protocol drivers --->
    < > Aiptek 6000U/8000U tablet support
    < > Wacom Intuos/Graphire tablet support
    < > Acecad Flair tablet support
    < > KB Gear JamStudio tablet support
    < > Griffin PowerMate and Contour Jog support
    < > USB Touchscreen Driver
    < > Yealink usb-p1k voip phone
    < > X-Box gamepad support
    < > ATI / X10 USB RF remote control
    < > ATI / Philips USB RF remote control
    < > Keyspan DMR USB remote control (EXPERIMENTAL)
    < > Apple USB Touchpad support
    --- USB Imaging devices
    < > USB Mustek MDC800 Digital Camera support (EXPERIMENTAL)
    < > Microtek X6USB scanner support
    USB Network Adapters --->
    [*] USB Monitor
    --- USB port drivers
    USB Serial Converter support --->
    usb network adapters
    < > USB CATC NetMate-based Ethernet device support (EXPERIMENTAL)
    <*> USB KLSI KL5USB101-based ethernet device support
    < > USB Pegasus/Pegasus-II based ethernet device support
    < > USB RTL8150 based ethernet device support (EXPERIMENTAL)
    <*> Multi-purpose USB Networking Framework
    <M> ASIX AX88xxx Based USB 2.0 Ethernet Adapters
    --- CDC Ethernet support (smart devices such as cable modems)
    < > GeneSys GL620USB-A based cables
    <M> NetChip 1080 based cables (Laplink, ...)
    < > Prolific PL-2301/2302 based cables
    < > MosChip MCS7830 based Ethernet adapters
    <*> Host for RNDIS devices (EXPERIMENTAL)
    <*> Simple USB Network Links (CDC Ethernet subset)
    [ ] ALi M5632 based 'USB 2.0 Data Link' cables
    [ ] AnchorChips 2720 based cables (Xircom PGUNET, ...)
    [*] eTEK based host-to-host cables (Advance, Belkin, ...)
    [*] Embedded ARM Linux links (iPaq, ...)
    [ ] Epson 2888 based firmware (DEVELOPMENT)
    <M> Sharp Zaurus (stock ROMs) and compatible
    Can anyone familiar with the RNDIS host help me? Is my kernel configuration right?
    Last edited by snakewind (2013-06-08 08:04:06)

    We don't support archlinux-arm here, try archlinuxarm.org.

  • Program LPC2378 with code generated by LabView ARM embedded, without using LabView

    I would like to take the C files that are generated by the LabVIEW ARM embedded module and use Keil software (or another lower-cost tool) to load them onto custom boards with the LPC2378 controller, without using LabVIEW.  Are there detailed instructions for this?  I am using LabVIEW 8.6 and ARM embedded module 1.1.  
    My reason for this is that the boards are being made in a manufacturing facility overseas, and they need to be programmed at their end. 
    I read a forum post describing the use of hex files, but not with enough detail for me to figure this out.
    Thanks in advance.

    Hey Bob,
    What you're looking to do should be possible. You'll just need to build the application in LabVIEW, and then you should be able to use uVision's command line to deploy the system. As for the C code itself, it is generated when you build your build specification, and you should see it placed in the same directory as your LabVIEW project, in a folder named after your project. For example, if your project were called My ARM Project, you'll see a folder called "My ARM Project" created/updated whenever you build the build specification in that project, and it will contain the C code. To see the code organized logically, after building your build specification, right-click on the ARM target and select "Show Keil uVision". This opens a uVision window in "LabVIEW mode" where you can see all of the C files associated with your application; your VI-specific code will be under a folder titled "VIs". You could also just open the C code files from disk in uVision, which is what you'll need to do if LabVIEW isn't installed. The main project file, LabVIEW.uvproj, can be found in the built directory described above, in the "target"/Application/"uvision version"/Project directory. For example, my main file was built in MyProject\EK_LM3S8962\Application\4.01\Project. From there, follow uVision's process for compiling and deploying the target code from the command line:
    Command Line
    http://www.keil.com/support/man/docs/uv4/uv4_commandline.htm
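    As a sketch of the batch-build step (flags are per the UV4 command-line docs linked above; the paths are placeholders, so treat this as an assumption to verify against your install):

    ```python
    # Sketch: drive a uVision batch build from a script (no LabVIEW needed).
    # "UV4.exe" and the project path below are placeholders.
    import subprocess

    def uv4_build_command(uv4_exe, uvproj_path, log_path):
        # -b: build the project and exit; -o: write build output to a log
        return [uv4_exe, "-b", uvproj_path, "-o", log_path]

    def run_build(uv4_exe, uvproj_path, log_path="build.log"):
        cmd = uv4_build_command(uv4_exe, uvproj_path, log_path)
        # check=False: inspect the return code instead of raising
        return subprocess.run(cmd, check=False).returncode  # 0 = success
    ```

    The manufacturing site would then only need uVision (and the flash tool for the LPC2378), not LabVIEW, to rebuild and program the boards.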
    Hope this helps!
    --Ryan_S

  • PC xbox controller

    Hey,
    I'm 14 and looking for a local store that carries a PC-compatible Xbox 360 look-alike controller. I know these are made, but I want a store that's local. I also want to know if this controller is easier for first-person shooter games (in particular Brothers in Arms: Hell's Highway) than the standard PC keyboard. Thanks

    daneboy502 wrote:
    Hey,
    IM 14 and looking for a store that carries a PC compatible xbox 360 look alike controller. I know that these are made but I want a store that's local. I also want to know if this controller is easier for first person shooter games (in particular brothers in arms hells highway) then the standard PC keyboard. Thanks
    No, it is not easier.  Unless the game is severely crippled so that mouse aiming is disabled, people using a mouse to aim will utterly and completely dominate you.
    Pad vs. keyboard/mouse is why you never see cross-platform FPS gaming (at least not between PC and consoles) - because mouse users will utterly destroy pad users.
    The order of control accuracy/speed is:
    Gamepads are slightly better than keyboard-only
    Mouse + keyboard is so superior to gamepads that mousers won't be able to tell the difference between you and a keyboarder - to them you'll be cannon fodder
    Connecting gamepads to PCs is rarely useful for anything other than running classic console games on emulators.  It could be useful for some game types like flight simulators and maybe driving sims, but it's vastly inferior for FPS games and strategy games unless the game itself is massively crippled.
    *disclaimer* I am not now, nor have I ever been, an employee of Best Buy, Geek Squad, nor of any of their affiliate, parent, or subsidiary companies.
