What is the best way to graph CAN data?

Hi All,
Even after years of trying, I am still really lousy at using the LabVIEW graphing functions, so I hope you will be patient with me.
I am currently writing code to monitor a CAN bus.  I have messages with five different arbitration IDs coming in at irregular time intervals; generally there is between 5ms and 100ms between packet arrivals.  My first program currently captures the raw CAN data, parses it according to the format requirements for each arbitration ID, and displays the corresponding parsed values.  It also saves the raw data into a txt file.  Each entry contains a time stamp, arbitration ID, message length and 8 CAN bytes, which is the length of each CAN packet.
I have another program which takes the raw data from the original txt file and resaves it as formatted data, broken out into parsed values.  I use this two-step process to make my CAN capture program run faster.
I have two graphing needs right now:
1.  Graphing the quasi-real-time values of the parsed CAN data as they are acquired in my CAN capture program.  This needs to happen in as little time as possible, giving me usable graphs or charts without bogging down the program.  If graphing the data takes up too much CPU time, I could graph data samples once every XXXms instead of graphing everything that comes in.  If graphing proves too CPU intensive, I might just give up on this real-time graphing.
2.  Graphing the entire parsed data set in my second, post-capture program.  Speed is not as important in this program because I am not capturing real time data while it is running.  This graph needs to include each data point vs. its associated timestamp.
Any recommendations on which graphing functions to use in these programs would be greatly appreciated.
Thanks!

I'm not sure what you mean by "graphing functions."  I have code running now that receives a CAN message every 1ms.  Every 100ms, all the data received in that period is added to a chart.  I'm using a less-expensive USB CAN card (so no on-board filtering) and my code has no problem filtering by arb id in software.  I'm doing this by grabbing all received messages from the buffer every 10ms and, in a for loop, checking the arbitration ID of each packet.  If the ID matches the one I want to graph, I convert the CAN packet to 4 16-bit values and put them in a queue.  In a separate loop, every 100ms I flush the queue and write the contents to the chart.  Here's the relevant bit of my code.  Does this help?
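In case the attached VI isn't handy, the same filter-and-queue pattern looks roughly like this in Python-style pseudocode (the arbitration ID, function names, and byte order below are illustrative only; the real implementation is the LabVIEW code attached above):

import queue
import struct
import time

# producer() and consumer() would each run in their own loop/thread,
# mirroring the two parallel LabVIEW while loops described above.
plot_queue = queue.Queue()

def producer(read_can_messages):
    # Acquisition loop: every 10 ms grab everything the CAN interface has buffered,
    # keep only the arbitration ID of interest, unpack each 8-byte payload into
    # four unsigned 16-bit values, and enqueue them for the display loop.
    WANTED_ID = 0x123                                       # illustrative arbitration ID
    while True:
        for arb_id, data in read_can_messages():            # data = 8 raw payload bytes
            if arb_id == WANTED_ID:
                plot_queue.put(struct.unpack('>4H', data))   # byte order assumed big-endian
        time.sleep(0.010)

def consumer(update_chart):
    # Display loop: every 100 ms flush whatever has accumulated and hand it to the
    # chart in one batch, so the UI update never slows down acquisition.
    while True:
        batch = []
        while not plot_queue.empty():
            batch.append(plot_queue.get())
        if batch:
            update_chart(batch)
        time.sleep(0.100)

The point of the queue is simply to decouple acquisition from display: the capture loop never waits on the chart, and the chart only redraws ten times a second no matter how fast packets arrive.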

Similar Messages

  • I am giving away a computer, what is the best way to wipe out data prior to departure

    I am giving away a computer; what is the best way to wipe out the data prior to departure?

    Did the Mac come with two grey disks when new? If so, use disk one to erase the drive using Disk Utility and then re-install the OS from the same disk. Once installed, quit, and the new owner can set the Mac up as a new out-of-the-box Mac when they first boot it. The grey disks need to be passed on with the computer.
    If you need detailed instructions on how to erase and re-install, please post back.
    If the Mac came with Lion or Mountain Lion installed, the above process can be done using the Recovery HD, since no restore disks have been supplied with Macs since Lion.
    The terms of the licence state that a Mac should be sold/passed on with the OS that was installed on the machine when new (or words to that effect).

  • What is the best way to capture the current date and time?

    I have a field in a table to capture the current date and time... I am using SQL Server.
    field name: enter_datetime     datatype: datetime     length: 8
    What is the best way to get the current date and time and insert it into the table?
    Is it this way?
    <cfset curtime = 'dateformat(#now()#,'mm/dd/yyyy')&" "&timeformat(#now()#,'hh:mm:ss')'>
    With this approach the time does not appear to be entered correctly.
    Or is there a better way?

    > get current date and time and insert to table?
    You can use cfqueryparam:
    <cfqueryparam value="#now()#" cfsqltype="cf_sql_timestamp">
    Or, as was suggested, set the default for your table column to getdate(). Then you won't have to insert anything; SQL Server will do it automatically when a new record is created.

  • What is the best way to insert massive data into an existing Excel file?

    Dear gurus,
    I am wondering what the best way is to insert a large amount of data into an existing Excel file, mainly from a performance perspective.
    The file is read from BDS, and we want to insert data into it.
    The ways I can think of are:
    1. OLE automation
       I think performance will be a big problem.
    2. Office integration
       I am not sure whether it faces the same performance issue?
    3. XXL_SIMPLE_API/FULL_API
       I am not sure whether they can insert data into an existing Excel file?
    Could you please give me some advice?
    br.
    jun

    Hi,
    If you want to APPEND data (add data to an existing Excel file) from SAP, then use the GUI_DOWNLOAD fm with the APPEND = 'X' parameter.
    Best regards,
    Prashant

  • What is the best way to transfer my data from my old iMac to my new iMac?

    What is the best way to transfer my data from my old iMac to my new iMac? I don't have a Time Capsule, so I cannot use Time Machine.
    thanks

    Your best bet by far is to use Setup Assistant when the new Mac first starts up to transfer directly from the old one. 
    It's similar to Migration Assistant, but doing it that way will avoid having an extra user account.
    Either way, you can connect them via FireWire, Thunderbolt, or your network, but avoid wireless if at all possible.  You can connect both to your network via Ethernet, or put a single cable between the two.
    See Using Setup Assistant on Mountain Lion or Lion for details.

  • What is the best way to edit meta data..

    What is the best way to edit metadata and tag photos, faces, places, etc., and have the data saved to the original photo?
    On a PC I would just use Windows Gallery. iPhoto on the Mac allows for some tagging, but it doesn't save to the original file.
    I like to have my photos in a folder, edit them and save the changes.
    What software would work best on a Mac to accomplish this?
    Thanks for any help,
    Nick

    iPhoto is a database and any metadata you add or edit is available in any app - if you learn how to use it.
    iPhoto is a non-destructive processor. It never touches the original file - it treats it like a film shooter treats the negative.
    If you want a copy of the original file with the metadata included simply export a copy.
    This User Tip
    https://discussions.apple.com/docs/DOC-4921
    has details of the options in the Export dialogue.
    As an FYI:
    For help accessing your photos in iPhoto see this user tip:
    https://discussions.apple.com/docs/DOC-4491

  • What is the best way to move all data and apps from an old iPad to a new iPad Air?

    What is the best way to convert (update) from an original iPad to a new iPad Air?

    How to Transfer Everything from an Old iPad to New iPad
    http://osxdaily.com/2012/03/16/transfer-old-ipad-to-new-ipad/
    iOS: Transferring information from your current iPhone, iPad, or iPod touch to a new device
    http://support.apple.com/kb/HT2109
    Moving Content to a New iPad
    http://tinyurl.com/qzk2a26
    Transferring your prepaid cellular data account depends on your carrier. AT&T lets you move it yourself when you go to Cellular Data in Settings and log into your account with your previous AT&T user name and password. For iPads with Sprint service, you can set up an account on the new iPad and contact Sprint Customer Care (888-211-4727 and go through the menus) to deactivate the old plan and get credit for unused service. For Verizon, call the company’s customer service number for mobile broadband support (800-786-8419) and ask to have your account transferred.
     Cheers, Tom

  • What's the best way to read JSON data?

    Hi all;
    What is the best way to read in JSON data? And once it is read in, is the best way to use it to turn it into XML and apply XPath?
    thanks - dave

    jtahlborn wrote:
    Without having a better understanding of what your definition of "use it" is, this question is essentially unanswerable. Jackson is a fairly popular library for translating JSON to/from Java objects. The json website provides a very basic library for parsing to/from XML. Which one is the "best" depends on what you want to do with it.
    Good point. We have a reporting product (http://www.windward.net) and we've had a number of people ask us for JSON support. But how complex the data is and what they want to pull is all over the place. The one thing that's common is that they generally want to pull down the JSON data and then put specific items from it in the report.
    XML/XPath struck me as a good way to do this for a few reasons. First, it seems to map well to the JSON data layout. Second, it provides a known query language. Third, we have a really good XPath wizard and we could then use it for JSON as well.
    ??? - thanks - dave
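    For what it's worth, pulling specific items straight out of parsed JSON is only a few lines in most languages; here is a minimal Python sketch (the sample document and field names are invented):

    import json

    # Made-up JSON payload for illustration
    raw = '{"report": {"title": "Q3 Sales", "items": [{"region": "EMEA", "total": 1250}]}}'

    doc = json.loads(raw)                              # parse into plain dicts/lists
    title = doc["report"]["title"]                     # drill down to a specific item
    first_total = doc["report"]["items"][0]["total"]   # index into an array, XPath-style
    print(title, first_total)                          # -> Q3 Sales 1250

    Whether that is simpler than converting to XML and reusing an existing XPath wizard really does depend on how irregular the incoming JSON is.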

  • What is the best way to get/change data in R/3 from external interface?

    Hi SAP gurus,
    I am having trouble working out what the best technology is to access and maintain all SAP functionality and data in R/3 systems.
    Does anyone know if connectors (.NET or JCo) are the only solution for manipulating SAP system data, or does another way exist?
    One thing more: what is the best connector, with the most functionality?
    E.g. the Screen Painter was made in C++ and is executed by a user event in the R/3 system, so I would like to know if there is any way to do the same but replace the Screen Painter with another custom application.
    Regards

    Hello Vitor
    Not all functions and data are externally accessible. Only those business objects (e.g. sales order, customer, material) for which BAPIs are available (transaction BAPI) can be accessed via RFC.
    Regards,
        Uwe

  • What is the best way to remove personal data from the hard drive in preparation for selling my old Mac?

    I am going to sell or perhaps donate my old Power Mac G5.  Is there an easier, yet comprehensive, way to remove my data from the hard drive?  My best approach so far is dumping anything I do not want found into the trash.
    I am running Mac OS X 10.5.8 on the Power Mac G5.

    For better security than the simple Erase (which does not actually over-write the data blocks, only clears the directory), choose Security Options, and Zero all data, one pass.
    After a Zero all data, the only way to recover the data is to disassemble the drive in a clean room and use expensive test gear to recover a little of the data. That is good for all but Military Secrets.
    If you really trust no one under any circumstances, remove the drive and beat it to death with a hammer.

  • What is the best way to back up data on an older MacBook?

    My MacBook is about 5 years old.  It is still working just fine and I intend to use it until it crashes.  What is the easiest and most inexpensive way to back up all of my pictures/music/data so that I don't lose them when that day comes?

    Time Machine. It's already included with Leopard.
    Buy an external hard disk and plug it in, Time Machine does the rest.
    The least expensive option is to find a Windows PC discarded due to viruses and harvest its HD. Purchase an inexpensive USB enclosure, install the HD, format it with Disk Utility. Your total investment is $20 for the enclosure plus a little time.

  • What is the best way to get storage data for a hard disk using the REST API

    Hello All,
    Given that I have disk info for a virtual machine/role from the Service Management REST API (for example using GetRole), how can I retrieve the container/blob related info for it?
    I have credentials for the Service Management REST API and I have the OSVirtualHardDisk info, but I am not sure how to detect correctly which storage account to connect to and then which container to use. Yes, I know that there is the OSVirtualHardDisk.MediaLink property, which contains the storage account name and container inside of it, but I am not sure it is good practice to make assumptions about its format. Alternatively, I have another solution: just retrieve all storage accounts from the Service Management REST API, then compare the URL of each account with the disk's MediaLink, and use the appropriate storage account for further data retrieval. But it seems to me that could retrieve too much info.
    So generally I am trying to find the correct way to join the Service Management REST API and the Storage Services REST API for disks.

    Hi,
    From my experience, your first approach is correct. The media link points exactly to the location of the blob. With the link, you can access the blob if you have the storage account key. If you want to extract more information, such as what the container is and what the blob is, you can parse the link.
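    As a rough illustration, parsing the media link is only a few lines in most languages; this Python sketch assumes the usual https://<account>.blob.core.windows.net/<container>/<blob> shape, and the URL itself is made up:

    from urllib.parse import urlparse

    # Made-up media link following the standard blob endpoint layout
    media_link = "https://mystorageacct.blob.core.windows.net/vhds/myrole-os-disk.vhd"

    parsed = urlparse(media_link)
    account = parsed.netloc.split(".")[0]                      # "mystorageacct"
    container, blob = parsed.path.lstrip("/").split("/", 1)    # "vhds", "myrole-os-disk.vhd"
    print(account, container, blob)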
      >> From my point of view it is a bad way to retrieve the storage account name and container.
    In addition, you are welcome to post feature requests on
    http://feedback.windowsazure.com/forums/34192--general-feedback
    Best Regards,
    Ming Xu

  • After finally completing a clean install, what is the best way to replace my data?

    Please understand that I moved from PC to Mac in late 2011. Unlike Windows, where you have to know how it all works in order to use it, Mac is so simple that I, to my own detriment, did not take the time to learn more than just my daily use. I have been busy moving country and other crazy life stuff. I am rectifying this mistake. I did have an issue with the clean install process working through without interruption (some file issue), so I had to repeat it a few times.
    1. Completed a clean install through to completion of setup, new account, etc. The machine runs amazingly better. Completed updates. I understand Migration Assistant can deal with duplicate user issues.
    Q? Do I need to run disk repair or anything else?
    2. I thought myself ill-equipped with the knowledge and understanding to manually replace my data via Finder. I simply do not know where things are supposed to go.
    3. Tried reinstalling everything through setup, but it installed everything, including all the issues I had before. Went through the clean install process again.
    4. With this in mind I am hesitant and very wary about using Migration Assistant to move everything back, as I simply do not know where the issue lies, even if it allows greater folder & file separation.
    I have got to the stage where I am ready to recover my data which is on my backup external hard drive. The data was recovered using Time Machine.
    I wish to do what is best for my machine. It is getting older and I wish to do what I can for its longevity.
    Q? So what data is best recovered in which way, and in what order?
    5. Is it best to, say, recover applications from the App Store and redo all updates if required,
    then install non-App Store apps (e.g. Microsoft Office for Mac, and others) through Finder,
    and users through Migration Assistant, etc.?
    I have been learning and working on this since my last post. Lots of watching YouTube and reading for hours.
    There are so many conflicting opinions, and maybe this is due to my lack of specifics; most posts are for general use, so I do understand.
    Again thank you in advance. I still think Mac has it all over PC.

    I read through the data migration post below, however I wasn't sure of its applicability given it was from '09.
    Re: 2. There seems to be data I have yet to bring across. How important it is I don't know. What exactly it is I have also yet to find out. Will proceed with the downloading and such.
    Thank you for your help on this.

  • What's the best way to read/write data from a file (preferably a *.txt file)?

    As in the title.  I have revived a couple of old VIs to read and write three numbers and a 1D array of clusters to/from a *.txt file.  The functionality is not very user friendly, and it would also be useful if one could open the text files (manually, not through LabVIEW) and still be able to see/understand what is there.
    I was wondering if anyone would be able to come up with a more efficient and/or user-friendly method (compatible with LabVIEW 6.1).
    James
    Never say "Oops." Always say "Ah, interesting!"
    Attachments:
    Read Spec.vi ‏110 KB
    Write Spec.vi ‏58 KB

    My primary goal is to have something that works and is easy and comprehensive to operate.  Generating a human-readable file is just a bonus but would be nice if it could be achieved.
    I enclose pictures of the initial file dialog (for both loading and saving the data, referred to as Spec(s) from here on), and of the front panel screen seen when  a) loading a spec and  b) saving a spec.  In the file dialog, you have to already know the exact string to input, or else you'll just be told the file doesn't exist (this applies to both loading and saving).  When saving a spec, you cannot see any files previously saved, nor even any previous specs saved within the file.  This means that one can unwittingly overwrite specs without realising it.
    I'm not sure if I've explained this very well, but the current functionality means that far too much can go wrong.  Additionally, if you forget the file name, you'll have no way of knowing what it should be (the data files are stored on a 'locked' part of our network accessible only by Admin or through the LabVIEW executable).
    Never say "Oops." Always say "Ah, interesting!"
    Attachments:
    File Dialog.JPG ‏23 KB
    Select The Required Test Spec.JPG ‏10 KB
    Name of specification.JPG ‏6 KB
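    For what it's worth, one human-readable layout for a spec like this is a labelled header plus a tab-separated table, which any text editor can display and any program can parse back. The sketch below is Python pseudocode rather than LabVIEW, and every field name and value in it is invented:

    # Write a spec as labelled scalars followed by a delimited table of cluster elements.
    def write_spec(path, gain, offset, samples, clusters):
        with open(path, "w") as f:
            f.write(f"gain\t{gain}\n")
            f.write(f"offset\t{offset}\n")
            f.write(f"samples\t{samples}\n")
            f.write("name\tlow_limit\thigh_limit\n")        # column headers for the array of clusters
            for c in clusters:
                f.write(f"{c['name']}\t{c['low']}\t{c['high']}\n")

    write_spec("test_spec.txt", 1.5, 0.02, 100,
               [{"name": "step1", "low": 0.0, "high": 5.0}])

    The same idea carries over to LabVIEW 6.1 with functions like Format Into String and the text-file write VIs, and a self-describing file like this makes accidental overwrites much easier to spot.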

  • What's the best way to export UTF8 data into Excel?

    Hi,
    The database character set is UTF8:
         PARAMETER                  VALUE
         NLS_NCHAR_CHARACTERSET     UTF8
         NLS_CHARACTERSET           UTF8
    My requirement is to export Oracle data into an Excel file using the UTL_FILE Oracle-supplied package.
    But while writing the data into the Excel file it loses the UTF8 encoding and writes garbage data into the file.
    I am quite sure that while retrieving the data from the database it is perfectly encoded in UTF8, but after the data is written to the file the UTF8 encoding is lost, resulting in garbage data.
    e.g.
    I am retrieving this field from the database, which is UTF8 encoded in the DB as “Langäcker 55”, but when it is written into Excel it becomes “Langäcker 55”.
    Is Oracle UTL_FILE eating the UTF8 encoding while writing to the file? If yes, is there a utility which can help me keep the UTF8 encoding intact while writing to the Excel file? Is there anything I can do in Oracle to get this done?
    Thanks in advance.
    Thanks,
    Santosh

    Until now I have not found anyone who could write real Excel files (MS binary format) in PL/SQL, so I assume that you write your data as strings separated by a character like ; or |, i.e. CSV.
    Try to open your file with 'wb' (write binary) instead of 'w' and then use put_raw:
    -- open the file in binary mode (the directory object and file name are just examples)
    v_file := utl_file.fopen('MY_DIR', 'export.csv', 'wb', 32767);
    -- write the data as raw bytes so the UTF8 byte sequence passes through untouched
    utl_file.put_raw(v_file, utl_raw.cast_to_raw(<your data>));
    -- flush any remaining buffered data
    utl_file.fflush(v_file);
    -- close the file
    utl_file.fclose(v_file);
    Regards
    Marcus
