TDMS & DIAdem best practices: what if my signal has pauses/breaks?

I built an LV2011 datalogging application that stores lots of data in TDMS files. The basic architecture is like this:
Every channel has these properties:
     To = Start time
     dt =  Sampling interval
Channel values:
     1D array of DBL values
After datalogging starts, I just keep appending to the channel values.  And if the TDMS file size goes over 1 GB, I create a new file and start over.  The application runs continuously for days/weeks, so I get a lot of TDMS files.
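As a rough illustration of that layout (not the actual LabVIEW code), here is a Python sketch using the third-party nptdms package; the package, the group/channel names, the chunk sizes, and the use of datetime properties are assumptions made purely for this example:

    import os
    from datetime import datetime, timedelta, timezone

    import numpy as np
    from nptdms import TdmsWriter, ChannelObject  # third-party nptdms package (assumed available)

    MAX_BYTES = 1_000_000_000            # create a new file once the current one passes ~1 GB
    GROUP, CHANNEL = "Log", "Signal"     # hypothetical names
    DT = 0.001                           # sampling interval dt, in seconds

    def append_block(path, values, t0, dt):
        # t0 and dt travel with the data as channel properties, as in the LabVIEW logger
        props = {"wf_start_time": t0, "wf_increment": dt}
        ch = ChannelObject(GROUP, CHANNEL, np.asarray(values, dtype=np.float64), properties=props)
        with open(path, "ab") as f, TdmsWriter(f) as writer:
            writer.write_segment([ch])

    # simulated logging loop with size-based rollover
    file_index, path = 0, "log_0000.tdms"
    t0 = datetime.now(timezone.utc)
    samples_in_file = 0
    for _ in range(10):                              # stand-in for the continuous DAQ loop
        block = np.random.random(1000)               # stand-in for one chunk of acquired DBL samples
        append_block(path, block, t0, DT)
        samples_in_file += block.size
        if os.path.getsize(path) > MAX_BYTES:        # over 1 GB: create a new file and start over
            t0 += timedelta(seconds=samples_in_file * DT)   # new file continues where the old one ended
            samples_in_file = 0
            file_index += 1
            path = f"log_{file_index:04d}.tdms"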
It works fine. But now I need to change my system to allow the data acquisition to pause/restart. This means there will be breaks in the signal (probably between 30 sec and 10 mins). I had originally considered recording two values for every datapoint (value & timestamp), like an XY graph. But I am opposed to this on principle because I feel it fills up your hard disk unnecessarily (twice as much disk footprint for the same data?).
Also, I have never used DIAdem, but I want to ensure that my data can be easily opened and analyzed in DIAdem.
My question: are there best practices for storing signals that pause/break like this? I would like to just start a new recording with a new start time (To) and have DIAdem be able to somehow "link" these signals, e.g. have it know that it's a continuation of the same signal.
Obviously, I should install DIAdem and play with it. But I thought I would ask the experts about best practice first, since I have zero DIAdem knowledge.

Hi josborne,
Are you planning on creating a new TDMS file each time the acquisition stops and starts again, or were you wanting to store multiple start/stop sections within the same TDMS file? The easiest way to handle the date/time offset is to store one waveform per Channel per start/stop section and use the "wf_start_time" Channel property that is native to TDMS waveform data, whether you're wiring an orange floating-point array or a brown waveform to the TDMS Write.vi. DIAdem 2011 can easily access the date/time offset when it is stored in this Channel property (assuming that it is stored as a date/time and not as a DBL or a string). If you only have one start/stop section per TDMS file, I would definitely also add a "DateTime" property at the File level. If you want to store multiple start/stop sections in a single TDMS file, I would recommend using a separate Group for each start/stop section. Make sure you're storing the following Channel properties in the TDMS file if you want that information to flow naturally into DIAdem:
"wf_xname"
"wf_xunit_string"
"wf_start_time"
"wf_start_offset"
"wf_increment"
Brad Turpin
DIAdem Product Support Engineer
National Instruments

Similar Messages

  • Best Practice - What a complete PO means?

    Hi! I would like to determine the best practice for managing the PO process and ensuring that the PO is in a state ready for archiving in the future (with minimum effort to correct the PO).
    Is it fair to say that a PO is considered 'complete' if the Complete Delivery indicator is set and the invoiced quantity equals the final goods-receipted quantity? Or do we have to be more stringent, in that the goods-receipted and invoiced quantities have to match the ordered quantity?
    For example:
    PO Qty = 5
    GR Qty = 4 (complete delivery indicator turned on)
    IR Qty = 4
    In the above example, is the PO considered complete? The reason I ask is that I understand the PO can only be archived if it is set with a Deletion Flag, and when I manually try to delete the PO, an error is output indicating that the quantity delivered is smaller than the quantity ordered (assuming no under-delivery tolerance is maintained here). So, in this case, is it good practice to always update the PO quantity to match the final receipted quantity?
    Appreciate your advice on the above.
    Cheers!
    SF

    Hi,
    If the PO qty is 5 and the GR qty is 4,
    you can flag the PO as delivery completed.
    Logically the PO will then be considered closed for procurement.
    It is sometimes not possible to change the PO qty to the GR qty, because depending on the settings it may trigger the release strategy again.
    Based on your business requirement, you can either change the PO qty to the GR qty
    or mark the PO as delivery completed.
    Thx
    Raju

  • Best Practices: What is a best backup plan on BO 4.0

    Hi Experts!
    I have worked with BO since 2007. I worked a lot with BO XI 3.1 and now with BI 4.0. I always have the same question about backups: what is the best backup plan for BO 4.0?
    I know many ways to do this, but as a consultant developer, backup is usually not my responsibility; still, I always have to advise my clients and users.
    A short summary of the approaches I know for BI 4.0:
    - Stop the services and make a backup of the repository database and the FileStore folder (optionally including the Tomcat folder and other BO installation folders)
    - Create a job in LCM and a schedule to export an LCMBIAR file
    - Use the Upgrade Management Tool to generate a BIAR file from the command line
    I found an interesting post by Raphael Branger, but his preferred option is to use 360View, which I don't know, and clients usually want to use SAP solutions, so the preference is to use BO's own way of making backups.
    Backup & Recovery in BO 4.0
    Note: I agree with Raphael about the old Import Wizard; I don't know why the Upgrade Management Tool doesn't allow importing a BIAR file of the same version as the target. It is terrible.
    So let me ask the big question: what is the best backup plan for BO 4.0?
    I know this depends on the environment and many variables, but let us consider a general environment with a standard installation.
    Thanks everybody!

    Thanks Mrinal and Ajay,
    In my experience I always use the full backup: repository database backup + FileStore folder backup (I usually recommend including the BO installation folder and the Tomcat folder too, because of custom configurations). That backup is essential; it is the baseline.
    But this backup is not flexible. The usual problem in a BO production environment is accidental deletion of some reports or objects. Since BO XI R2 I have used the Import Wizard to generate BIAR files from the command line; I usually create a BAT file with the command line to create those files. In BO 4 the Import Wizard is gone and the Upgrade Management Tool replaces it, but it can also create BIAR files from the command line. Suppose a BO user deleted a report and only reported the deletion a month later: we don't need to restore all objects from a month-old full backup; with BIAR files, we can restore only that report. That is the advantage of using BIAR files.
    So, my strategy is to use the full backup (repository database + BO installation folder) and also create BIAR files.
    What do you think about the backup by generating BIAR files?

  • RD Session Host lock down best practice document

     
    Hello,
    I am currently working on deploying an RDS farm. My farm has several RD Session Host servers. Today I learned that you can do some bad things to the RD Session Hosts if a user presses
    Ctrl + Alt + End while in an open session. I locked all of this down using different GPOs, which include disabling access to Task Manager and cmd, and to locking the server, reboot, shutdown, etc.
    However, this being said, how would I know what else to lock down, since I am new to this topic? I tried to find a Microsoft document about best practices for what should be locked down, but I wasn't
    successful, and unfortunately a search in the forum did not bring up anything else.
    With all the different features and options Windows Server 2008 R2 has, I do not even know where to start.
    Can someone please point me in the right direction?
    Thank you
    Marcus

    Hi,
    RD Session Host lockdown best practices are different for each business; every enterprise admin can only find the solution most suitable for them based on their own IT infrastructure.
    I collected some resource info for you.
    Remote Desktop Services: Frequently Asked Questions
    http://www.microsoft.com/windowsserver2008/en/us/rds-faq.aspx
    Best Practices Analyzer for Remote Desktop Services
    http://technet.microsoft.com/en-us/library/dd391873(WS.10).aspx
    Remote Desktop Session Host Capacity Planning for 2008 R2
    http://www.microsoft.com/downloads/details.aspx?FamilyID=CA837962-4128-4680-B1C0-AD0985939063&displaylang=en   
    RDS Hardware Sizing and Capacity Planning Guidance.
    http://blogs.technet.com/iftekhar/archive/2010/02/10/rds-hardware-sizing-and-capacity-planning-guidance.aspx
    Technical Overview of Windows Server® 2008 R2 Remote Desktop Services
    http://download.microsoft.com/download/5/B/D/5BD5C253-4259-428B-A3E4-1F9C3D803074/TDM%20RDS%20Whitepaper_RC.docx
    Remote Desktop Load Simulation Tools
    http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=c3f5f040-ab7b-4ec6-9ed3-1698105510ad
    Hope this helps.
    Technology changes life……

  • Best Practice question - null or empty object?

    Given a collection of objects where each object in the collection is an aggregation, is it better to leave references in the object as null or to instantiate an empty object? Now I'll clarify this a bit more.....
    I have an object, MyCollection, that extends Collection and implements Serializable(work requirement). MyCollection is sent as a return from an EJB search method. The search method looks up data in a database and creates MyItem objects for each row in the database. If there are 10 rows, MyCollection would contain 10 MyItem objects (references, of course).
    MyItem has three attributes:
    public class MyItem implements Serializable {
        String name;
        String description;
        MyItemDetail detail;
    }
    When creating MyItem, let's say that this item didn't have any details, so there is no reason to create MyItemDetail. Is it better to leave detail as a null reference, or should a MyItemDetail object be created? I know this sounds like a specific app requirement, but I'm looking for a best practice - what most people do in this case. There are reasons for both approaches. Obviously, a bunch of empty objects going over RMI is a strain on resources, whereas a bunch of null references is not. But on the receiving end, you have to account for the possibility that the MyItemDetail reference is null - is this a hassle or not?
    I looked for this at Java Practices (http://www.javapractices.com) but found nothing.

    > I know this sounds like a specific app requirement, but I'm looking for a best practice - what most people do in this case.
    It depends, but in general I use null.
    > Stupid.
    Thanks for that insightful comment.
    > I do a lot of database work though. And for that null means something specific.
    > Sure, return null if you have a context where null means something, like for example that you got no result at all. But as I said before, it's best to keep the nulls at the perimeter of your design. Don't let nulls slip through.
    As I said, I do a lot of database work, and there null does mean something specific. Thus (in conclusion) that means that, in "general", I use null most of the time.
    Exactly what part of that didn't you follow?
    And exactly what sort of value do you use for a Date when it is undefined? What non-null value do you use such that your users do not have to write exactly the same code they would write to check for null anyway?
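    (The thread is about Java/RMI, but the trade-off is language-independent. Purely as an illustration, here is a small Python sketch of the two options being argued about: leaving the missing detail as null/None versus always shipping an empty "null object". All names are hypothetical.)

        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class ItemDetail:
            text: str = ""                        # an "empty object" stand-in

        @dataclass
        class Item:
            name: str
            description: str
            detail: Optional[ItemDetail] = None   # option A: missing detail stays None (null)

        def render(item: Item) -> str:
            # with option A, every consumer must remember the None check
            if item.detail is None:
                return f"{item.name}: (no detail)"
            return f"{item.name}: {item.detail.text}"

        # option B: always attach a (possibly empty) detail object, so consumers never see None
        EMPTY_DETAIL = ItemDetail()
        print(render(Item("gadget", "demo")))                 # option A path: (no detail)
        print(render(Item("widget", "demo", EMPTY_DETAIL)))   # option B path: no None check needed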

  • Best practice for dealing with Recordsets, JDBC and JSP?

    I've spent the last three years developing web apps using JSP, Struts and Kodo JDO for persistence. All of the content for the apps was created as Java objects using model classes and saved to an Oracle db. Thus, data retrieved from the db was as instances of the model classes and then put into Struts form beans, etc.
    I changed jobs last month and am now having to use Servlets with JDBC to retrieve records from db tables, returning them in Recordsets. Oh, and I can't use Struts in my JSPs either. I'm beginning to think that I had it easy at my previous job, but maybe that's just because I was used to it.
    So here are my problems/questions:
    I have two tables with a one to many relationship that I need to retrieve data from, show in a jsp and be able to update eventually.
    So here's what I am doing:
    a) In a servlet, I use a SQL statement to join the tables and retrieve the results into a Recordset.
    b) I created a class with a bunch of String attributes to copy the Recordset data into, one Recordset row per instance of the bean, and then close the Recordset
    c) I then add the beans to an ArrayList and save the ArrayList into the session.
    d) Then, in the JSP, I retrieve the ArrayList from the session and iterate over each bean instance, printing the data out to the jsp. There are some logic statements to determine when not to print redundant data caused by the one to many join.
    e) I have not written the code to update the data yet but was planning on having separate jsps for updating the (one) table and the (many) table.
    Would most of you do something similar? Would you use one SQL statement to retrieve all of the data for display and use logic to avoid printing the redundant part of the data? Or would you have used separate SQL queries, one for each table? Would you have saved the results into something other than an instance of a bean class that represents one record in the RecordSet? Would you have had a bean class with attributes other than Strings - like had a collection attribute to hold the results from the "many" table? The way that I am doing everything just seems so cumbersome and difficult compared to using Struts and JDO before.
    Your help/opinion will be greatly appreciated!

    "Would you use one SQL statement to retrieve all of the data for display?" Yes.
    "... and use logic to avoid printing the redundant part of the data?" No.
    I believe in minimising the number of queries. If it is a simple one-many join on a db table, then one query is better than one + n queries.
    However I prefer to store the objects in a bean class with attributes other than strings - ie one object, with a collection attribute to hold the related "many" records.
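    (To make that mapping concrete, here is a rough, language-agnostic sketch in Python of the one-joined-query approach with a parent object holding a collection of its "many" records. The table and class names are made up for illustration, and sqlite3 simply stands in for the real JDBC connection.)

        import sqlite3                      # stand-in for the real JDBC/Oracle connection
        from dataclasses import dataclass, field

        @dataclass
        class Child:
            detail: str

        @dataclass
        class Parent:
            parent_id: int
            name: str
            children: list[Child] = field(default_factory=list)   # the "many" side lives inside the bean

        def load_parents(conn) -> list[Parent]:
            # one joined query; fold the repeated parent columns into a single object per parent
            rows = conn.execute(
                "SELECT p.id, p.name, c.detail "
                "FROM parent p LEFT JOIN child c ON c.parent_id = p.id "
                "ORDER BY p.id"
            )
            parents: dict[int, Parent] = {}
            for pid, name, detail in rows:
                parent = parents.setdefault(pid, Parent(pid, name))
                if detail is not None:
                    parent.children.append(Child(detail))
            return list(parents.values())

        # tiny in-memory demo
        conn = sqlite3.connect(":memory:")
        conn.executescript(
            "CREATE TABLE parent(id INTEGER, name TEXT);"
            "CREATE TABLE child(parent_id INTEGER, detail TEXT);"
            "INSERT INTO parent VALUES (1, 'first'), (2, 'second');"
            "INSERT INTO child VALUES (1, 'a'), (1, 'b');"
        )
        for p in load_parents(conn):
            print(p.name, [c.detail for c in p.children])

    The JSP (or whatever renders the page) then just walks each parent and its children, so no de-duplication logic is needed at display time.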
    Does the fact you are not using Struts mean that you have to use scriptlet code? (shudder)
    Or are you using JSTL, or other custom tags?
    How about tools like Ant? Junit testing?
    "The way that I am doing everything just seems so cumbersome and difficult compared to using Struts and JDO before."
    Anything different takes adjusting to. Sounds like you know what you're doing for the most part. I agree: in terms of best practices, what you have described so far sounds like a step backwards from what you were previously doing.
    However, I wouldn't go complaining about it too loudly or too quickly. If you're new on the block, there's nothing like making a pain of yourself and complaining about how backwards the existing work is to put your new workmates' backs up.
    Look on it as a challenge. Maybe discuss it quietly with a team leader, to see if they understand how much easier/better/less error prone such approaches can be?
    Struts, cumbersome as it can be, definitely has the advantage of pushing you to follow good MVC practice.
    Good luck,
    evnafets

  • Best practice for migrating between environments and versions?

    Hi to all,
    we've got a full suite of solutions custom developed in SAP BPC 7.0, SP 7. We'd like to understand whether
    - there are best practices for copying these applications from one environment to another (another client)
    - there are best practices in case the client has a newer version of SAP BPC (they would install 7.5, while we're still stuck with 7.0).
    Thank you very much
    Daniele

    Hi Daniele
    I am not entirely sure what you are asking. Could you please provide additional information?
    Are you looking for best practice recommendations for Governance, for example: Change transports between DEV, QA and PRD in BPC 7.0?
    What is the best method? Server Manager backup and restore, etc.?
    And
    Best Practice recommendations on how to upgrade to a different version of BPC, for example: Upgrading from BPC 7.0 to 7.5 or 10.0 ?
    Kind Regards
    Daniel

  • Best Practice for Recording to HDD's.

    I want some answers from people who have been using Logic for a while and have experience with this topic. I have a 2010 MBP 13" with one FireWire 800 port. After looking for a cheap, decent-quality interface I purchased a Saffire Pro 14, which has a 6-pin FireWire interface and, I assume, is FireWire 400 Mbps. The interface also doesn't have an extra FireWire port, and the manual says not to "daisy-chain" because of latency and CPU spikes. Honestly, I haven't noticed any problems with my HDD I/O in Logic; when I skip around it peaks, but there are never any errors. I formatted an old 160 GB SATA 1 drive and hooked it up over USB, and I noticed that when I "skip" around it doesn't peak (get red) anymore. I want to know people's experience with daisy-chaining FireWire devices and using USB HDDs.
    When I invest more money, would it be OK to stick with my interface for now and get a Glyph drive, daisy-chaining my interface on at the end? Should I stick with USB storage? Should I stick with the internal 5400 RPM drive, or upgrade it to a 7200 RPM drive (my original idea)? In I.T. we call it best practices: what's the best practice for storing your audio/Logic files on a Mac when running Logic?

    Please don't just tell me FireWire is faster than USB... I'm an I.T. professional (wishing I had gone to school for recording). I know how computers work; I build and fix them all day. I want to know, from people's personal experience, what they run and what they suggest. Thanks so much!

  • Best practice for scripts in UCCX

    General Question
    When I wrote scripts for Nortel Symposium, I always created a primary script for queuing calls to skill sets, so that the press-1 menus, welcome prompts, day-of-week checks, etc. were all in different scripts, but the core function, Select Resource, was in a single simple script by itself, which was pointed to by the initial menu scripts.
    In UCCX, do people use a single script for each call flow (Welcome, Day of Week, Select Resource, etc. all in one script), or do you call subscripts/flows for the actual Select Resource? What is the best practice here? What do other people do?
    The reason I had subscripts in Nortel was so that when the call was offered to the skill set, the admin person knew that there were no additional queue offerings, so any time-to-answer figures were purely on the queue; also, any changes in other scripts did not affect other queues.

    Here are a few tips/best practices that may help you out:
    http://www.cisco.com/en/US/docs/voice_ip_comm/cust_contact/contact_center/crs/express_7_0/reference/guide/UCCX_Best_Practices.pdf

  • Best practice for creating JCO destinations

    Hi All,
    I have a project which uses 10 to 12 BAPIs. What is the best practice:
    1) Create 10 JCO destinations, one for each BAPI, or
    2) Create one JCO destination and use it for all BAPIs?
    Can someone tell me what the best practice is, and what the advantages and disadvantages are?
    Regards,
    Rajini.

    Hi,
    these docs will help you get an idea of the JCO best practices:
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/705f2b2e-e77d-2b10-de8a-95f37f4c7022
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/85a483cb-0d01-0010-2990-c5168f01ce8a
    Regards,
    ramesh

  • Best practice for Recruitment

    Hi Guys,
    My client is located in 500 places all over India. The operational head at each client location recruits employees as and when required and sends ESI and PF enrollment forms to the head office.
    Now they are asking me what SAP Best Practice would suggest. Can anyone tell me what "SAP Best Practice" is and what it suggests?
    Regards
    Prasad

    I think what they are looking for is best practices in SAP-enabled recruitment processes in light of the new-hire forms (based on the statement above).
    What that means is: what is the right approach to doing this, based on different SAP Recruitment implementations?
    You may be able to get some of this information from the SAP Service Marketplace.
    Based on what I have seen, the best practice is to provide an online tool (I think SAP E-Recruitment offers this) for applicants to sign in with a given ID and password and submit documents electronically. The paper documents can then be mailed to the head office at a later date.
    Hope that helps.

  • Workflow setup:Best Practices

    Hi All,
    Could anyone please share knowledge related to Oracle Workflow setup best practices? What are the high-level steps?
    I am looking at embedded workflow setup for R11 or R12.
    Thanks for your time!
    Regards,

    This is a very broad question - narrowing it to specifics might help folks respond better.
    There are a lot of documents on MOS that refer to best practices from a technology stack perspective.
    Oracle Workflow Best Practices Release 12 and Release 11i          (Doc ID 453137.1)
    As far as functional practices are concerned, these may vary from module to module, as functionality and workflow implementation vary from module to module.
    FAQ: Best Practices For Custom Order Entry Workflow Design          (Doc ID 402144.1)
    HTH
    Srini

  • Database Administration - Best Practices

    Hello Gurus,
    I would like to know various best practices for managing and administering Oracle databases. To give you all an example of what I am thinking about: if you join a new company and would like to see whether all the databases conform to some kind of standard/best practices, what would you look for? For instance: are the control files multiplexed, is there more than one member in each redo log group, is the temp tablespace using a TEMPFILE or otherwise... something of that nature.
    Do you guys have something in place which you use on a regular basis? If so, I would like to get your thoughts and insights on this.
    Appreciate your time and help with this.
    Thanks
    SS

    I have a template that I use to gather preliminary information so that I can at least get a glimpse of what is going on. I have posted the text below... it looks better as a spreadsheet.
    System Name               
    System Description               
         Name      Phone     Pager
    System Administrator               
    Security Administrator               
    Backup Administrator               
    Below This Line Filled Out for Each Server in The System               
    Server Name               
    Description (Application, Database, Infrastructure,..)               
    ORACLE version/patch level          CSI     
              Next Pwd Exp     
    Server Login               
    Application Schema Owner               
    SYS               
    SYSTEM               
         Location          
    ORACLE_HOME               
    ORACLE_BASE               
    Oracle User Home               
    Oracle SQL scripts               
    Oracle RMAN/backup scripts               
    Oracle BIN scripts               
    Oracle backup logs               
    Oracle audit logs               
    Oracle backup storage               
    Control File 1               
    Control File 2               
    Control File 3                    
    Archive Log Destination 1                    
    Archive Log Destination 2                    
    Datafiles Base Directory                    
    Backup Type     Day     Time     Est. Time to Comp.     Approx. Size
    archive log                    
    full backup                    
    incremental backup                    
    As for "Best" practices, well I think that you know the basics from your posting but a lot of it will also depend on the individual system and how it is integrated overall.
    Some thoughts I have for best practices:
    Backups ---
    1) Nightly if possible
    2) Tapes stored off site
    3) Archives backed up throughout the day
    4) To Disk then to Tape and leave backup on disk until next backup
    Datafiles ---
    1) Depending on hardware used.
    a) separate datafiles from indexes
    b) separate high-I/O datafiles/indexes on dedicated disks/LUNs/trays
    2) file names representative of usage (similar to its tablespace name)
    3) Keep them at a reasonable size, < 2 GB (again, system architecture dependent)
    Security ---
    At least meet DOD - DISA standards where/when possible
    http://iase.disa.mil/stigs/stig/database-stig-v7r2.pdf
    Hope that gives you a start
    Regards
    tim

  • Best practice for database move to new disk

    Good morning,
    Hopefully this is a straightforward question/answer, but we know how these things go...
    We want to move a SQL Server Database data file (user database, not system) from the D: drive to the E: drive.
    Is there a best practice method?
    My colleague has offered "ALTER DATABASE XXXX MODIFY FILE" whilst I'm more inclined to use "sp_detach_db".
    Is there a best practice method or is it much of a muchness?
    Regards,
    Andy

    Hello,
    A quick search of the MSDN blogs does not show any official statement about ALTER DATABASE MODIFY FILE vs ATTACH. However, you can see a huge number of articles promoting and supporting
    the use of ALTER DATABASE in all kinds of scenarios (replication, mirroring, snapshots, AlwaysOn, SharePoint, Service Broker).
    http://blogs.msdn.com/b/sqlserverfaq/archive/2010/04/27/how-to-move-publication-database-and-distribution-database-to-a-different-location.aspx
    http://blogs.msdn.com/b/sqlcat/archive/2010/04/05/moving-the-transaction-log-file-of-the-mirror-database.aspx
    http://blogs.msdn.com/b/dbrowne/archive/2013/07/25/how-to-move-a-database-that-has-database-snapshots.aspx
    http://blogs.msdn.com/b/sqlserverfaq/archive/2014/02/06/how-to-move-databases-configured-for-sql-server-alwayson.aspx
    http://blogs.msdn.com/b/joaquint/archive/2011/02/08/sharepoint-and-the-importance-of-tempdb.aspx
    You cannot find the same about ATTACH. In fact, I found the following article:
    http://blogs.msdn.com/b/sqlcat/archive/2011/06/20/why-can-t-i-attach-a-database-to-sql-server-2008-r2.aspx?Redirected=true
    Hope this helps.
    Regards,
    Alberto Morillo
    SQLCoffee.com

  • SAP Best Practice Scoping

    Hi,
    We have installed SAP Best Practices.
    If anybody has done this installation, please share how you handled this problem. After importing the solution, when I click on scoping
    I get a Dojo script error and cannot get through the scoping page.
    I have activated all BSP services in SICF but am still getting the same Dojo script error.
    Please share the solution if anyone has come across the problem.
    Regards
    Yogesh

    Hi,
    Please see the attached link:
    http://www.scribd.com/doc/6075383/Sap-Tutorial?autodown=pdf
    Anil
