Periodically storing spectra in TDMS

I need to periodically store spectra from 4 voltages and 4 currents;
each spectrum has 100 spectral lines. This means 800 spectral lines form one
record, and one record is created every aggregation interval (200 ms up to 10
min, user selectable). For a long time I have used a binary data format for this, and it
works fine and fast. Now I need to optionally store the data in TDMS format
for a customer who wants to process it in DIAdem.
I found only one example of periodically storing spectra:
Express write data (time and frequency domain).vi. I tried to create a data
file with approximately 500 records. The file is about 2 MB, but displaying
it in the LabVIEW TDMS File Viewer takes tens of seconds at 100% CPU
utilization! Loading this 2 MB file into DIAdem also gives significantly
longer response times than the example files.
I am looking for an effective way to handle this type of data in TDMS format.
I will also need to read the data back in LabVIEW, and the reader must be fast.
I have a few typical tasks for the reader:
- read the time sequence of a selected harmonic of a selected signal over the whole measurement
- read one record (all harmonics) of one signal
Please help.
Bilik

Hi Bilik,
One reason you are getting poor performance in the TDMS code you posted is that you are writing one data point at a time to each channel. Since each buffer write in TDMS is prefaced with a binary header describing its contents, you are creating a binary header for every data point you store. The LabVIEW 2009 Beta, which is already live, contains TDMS VIs with a new option to create a single binary header for the entire file (giving up the ability to change anything during the acquisition); with this option you would not encounter this problem.
But you can sidestep this problem in already shipping LabVIEW versions by setting a channel-level property that automatically buffers channel writes up to a certain minimum buffer size, then flushes that buffer to disk. This avoids excessive binary header writes without changing your code inside the acquisition loop, though it does require you to add the channel property sets outside (to the left of) the acquisition loop. Just assign the "NI_MinimumBufferSize" property a value like 500, 1000, or 2000 on each of the channels. This also avoids the need to defragment the TDMS file afterwards, since you are not creating it with serious fragmentation in the first place. However, I did notice after adding all this to your code that the write performance of your binary file is still much better than TDMS, and that I don't understand.
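Brad's point about per-point headers can be illustrated with simple arithmetic. This is only a sketch: the header size below is an assumed round number, not the exact TDMS segment-header size, which varies with the metadata written.

```python
# Rough sketch of why per-point TDMS writes bloat the file.
# HEADER_BYTES is an illustrative assumption, not the exact TDMS figure.
HEADER_BYTES = 28          # assumed per-write segment-header overhead
VALUE_BYTES = 8            # one double-precision spectral line
CHANNELS = 8               # 4 voltages + 4 currents
LINES_PER_CHANNEL = 100
RECORDS = 500

# One write per data point: every single value pays the header cost.
per_point = RECORDS * CHANNELS * LINES_PER_CHANNEL * (HEADER_BYTES + VALUE_BYTES)

# One buffered write per channel per record (e.g. NI_MinimumBufferSize = 100):
buffered = RECORDS * CHANNELS * (HEADER_BYTES + LINES_PER_CHANNEL * VALUE_BYTES)

print(per_point, buffered)  # buffered writes amortize the header cost
```

With these assumed numbers the per-point file carries roughly four times the buffered file's bytes, which is the fragmentation the minimum-buffer-size property avoids.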
Perhaps Herbert can take a look at the edits I made and suggest further improvements.
Brad Turpin
DIAdem Product Support Engineer
National Instruments
Attachments:
TimeFreqDomain-TDMS.vi ‏33 KB
TimeFreqDomain-BIN.vi ‏27 KB

Similar Messages

  • Where is the "Icon Display Period" attribute stored?

    G'day All,
    I have a procedure that needs to reference the "Icon Display
    Period" attribute for a particular content area (this is normally accessed via
    the Content Area properties page). Try as I might, I cannot track down
    the table/column that stores this little gem of a value :-( Anyone have
    any ideas where this might be located?
    Cheers
    Kal.

    G'day Sharmila,
    Thanks for your quick response :-) Unfortunately neither WWV_THINGS nor WWV_CORNERS
    seems to contain what I'm looking for (either that or I'm really missing something here).
    My understanding is that WWV_CORNERS contains information about individual folders.
    I use this table to get the UPDATEDATE and CREATEDATE attributes, which I want to
    compare against the difference between the current date and the "Icon Display Period".
    WWV_THINGS seems to contain information about individual items, but I'm only really
    interested in folders for the moment.
    Now, "Icon Display Period" seems to be a content-area-wide property; the default is 7 days.
    I've looked at tables like WWSBR_ALL_CONTENT_AREAS and WWSBR_ALL_SITES,
    but they don't seem to hold very much information at all :-( Hmmm... just looking at the
    Content Area Properties page, there seems to be quite a lot of information there. Surely
    it must be stored somewhere?
    Anyone have any ideas?
    Kal.

  • Valid period for stored documents (Archive Link)

    Hello,
    Can someone please confirm whether documents stored in ArchiveLink can be retrieved after more than 10 years, or is there an expiry date setting?
    Thanks.

    SAP does not work like "Mission Impossible": a document does not destroy itself after a certain time.
    If you archive documents, it is still up to you when you dispose of the archive.
    E.g. you have to keep financial data for 10 years as required by law,
    but you want to keep your database small and have good performance,
    so you decide to archive everything older than e.g. 2 years.
    The older documents are written to an archive file and deleted from the table spaces.
    The archive can stay in the SAP file system (quickest access) or in an external archive system like IXOS.
    You have to make sure that you keep this archive of data older than 2 years for another 8 years to comply with the legal requirement.
    Once the 10 years have passed, you archive the archive information itself (which records where your archive is located) and remove the indexes to the archived documents older than 10 years.
    Then you take the disk from the IXOS system and destroy it.
    If you don't archive the archive info (SARA for BC_ARCHIVE) and don't destroy the archive files, then you may be able to access the data far beyond the 10 years.

  • Total issue qty for given period.

    Hello
    Please let me know whether the total issue quantity for a given period is stored anywhere in R/3.
    There is a requirement to calculate, for a given period, the value of total stock minus the value of total issue quantity.
    Thanks
    Nilesh

    Hi,
    Whether you issue stock or receive stock, the system generates a material document, so you can find it at any time in the MSEG table.
    Thanks & Regards
    Suresh

  • 0FISCPER3 - Default to last closed period in Report Variable

    Hi all,
    We have a requirement in our reports to default the Posting Period to the last closed period.
    In the InfoCube we have 0CALMONTH (In Period) and also 0FISCPER3 (Posting Period). The values for Posting Period vary from 1 to 16.
    What is the functionality behind Posting Period?
    How can we find the last closed period, given that there are 12 months but 16 posting periods?
    Kindly clarify.
    Thanks!

    Hello,
    The fiscal year variant contains the number of posting periods in the fiscal year and the number of special periods.
    You can define a maximum of 16 posting periods per fiscal year in the Controlling component (CO).
    The fiscal year variant specifies the number of periods and special periods in a fiscal year and how the SAP system determines the assigned posting periods.
    You have 12 periods in SAP plus four special periods. These periods are stored in what is called the fiscal year variant (K1, K2, K3 and K4).
    You can use the function module DATE_TO_PERIOD_CONVERT to get the period for a given date.
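As a rough illustration of the date-to-period mapping that DATE_TO_PERIOD_CONVERT performs, here is a simplified sketch. It assumes a 12-period fiscal year variant with a fixed start month and ignores the special periods 13-16, which are chosen at posting time rather than derived from the date.

```python
def date_to_period(date_str, start_month=1):
    """Map a YYYYMMDD date to (fiscal_year, posting_period).

    Simplified assumption: a 12-period fiscal year variant whose year
    starts in `start_month` (e.g. 4 for an April-March year). Special
    periods 13-16 are not derivable from the date and are ignored.
    """
    year, month = int(date_str[:4]), int(date_str[4:6])
    period = (month - start_month) % 12 + 1
    fiscal_year = year if month >= start_month else year - 1
    return fiscal_year, period

print(date_to_period("20240715"))     # calendar-aligned variant
print(date_to_period("20240201", 4))  # April-start variant
```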
    Also see
    [Determining Posting Periods|http://help.sap.com/saphelp_nw04/helpdata/en/c6/f0c33b8e2811d4b3090050dadefebd/frameset.htm]
    special periods
    Thanks
    Chandran

  • TDMS with timestamp and configurable channels

    I am trying to figure out a good way to line up data in a TDMS file with its corresponding timestamps when the user of my application adds channels.
    Here's the scenario,
    Let's say I've been recording 10 channels (called Ch0, Ch1, ... Ch9) in a group called 'Data'. My data appears as follows in the TDMS file:
    (10 Ch's, 3 Samples) 
    Timestamp   | Ch0 | Ch1 | Ch2 | Ch3 | Ch4 | Ch5 | Ch6 | Ch7 | Ch8 | Ch9
    12:00:01 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000
    12:00:02 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000
    12:00:03 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000
    Let's say the user now decides to add an additional channel to the group. The TDMS Write function will add the channel, but the new data does not line up with the corresponding timestamp; instead it starts writing from the first row, as follows:
    (11 Ch's, 6 samples, 3 new channel samples)
    Timestamp   | Ch0 | Ch1 | Ch2 | Ch3 | Ch4 | Ch5 | Ch6 | Ch7 | Ch8 | Ch9  | Ch10
    12:00:01 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 | 1100
    12:00:02 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 | 1100
    12:00:03 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 | 1100
    12:00:04 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 |
    12:00:05 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 |
    12:00:06 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 |
    I would want it to appear as follows:
    Timestamp   | Ch0 | Ch1 | Ch2 | Ch3 | Ch4 | Ch5 | Ch6 | Ch7 | Ch8 | Ch9  | Ch10
    12:00:01 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 |
    12:00:02 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 |
    12:00:03 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 |
    12:00:04 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 | 1100
    12:00:05 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 | 1100
    12:00:06 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 | 1100
    Other than just starting a brand new TDMS file, can anyone think of a way to line the data up with its corresponding timestamp in this scenario? Or is there a way to determine the corresponding timestamp when I read the TDMS file back?
    Any help is appreciated
    Thanks,
    -CAC

    From your description that the channel values have corresponding timestamps, I assume you are writing waveform data rather than basic data types (integer, double, etc.), which carry no timestamp information.
    The reason the new data in Ch10 does not line up with the same-timestamp data in the other channels is twofold:
    When writing data to a channel, TDMS cannot leave the first several positions vacant and start writing from the nth position.
    From the point of view of TDMS as a file format, it is only responsible for data logging and does not assume any relationship between channels.
    In your case, the new data in Ch10 has timestamps from 12:00:04 AM to 12:00:06 AM, but those values are definitely stored from the first position of Ch10, and the .tdms file format cannot do anything to line up the data between channels:
    Timestamp   | Ch0 | Ch1 | Ch2 | Ch3 | Ch4 | Ch5 | Ch6 | Ch7 | Ch8 | Ch9  | Ch10
    12:00:01 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 | 1100 (12:00:04 AM)
    12:00:02 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 | 1100 (12:00:05 AM)
    12:00:03 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 | 1100 (12:00:06 AM)
    12:00:04 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 |
    12:00:05 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 |
    12:00:06 AM | 100 | 200 | 300 | 400 | 500 | 600 | 700 | 800 | 900 | 1000 |
    For your second question, there is an approach to determine the corresponding timestamp of each channel value read from a .tdms file. The waveform data type is stored in a .tdms file as three components:
    - the value array value[]
    - the starting timestamp t0, which is the base timestamp of the channel (the timestamp of its first value)
    - the time increment dt, the interval between two samples (e.g. one sample per second gives dt = 1.0; four samples per second gives dt = 0.25)
    So it's very easy to calculate: the timestamp of value[index] = t0 + (dt * index), for 0 <= index < the number of samples in value[].
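The formula above can be sketched in a few lines of Python (pure arithmetic on assumed t0 and dt values; in LabVIEW you would read value[], t0 and dt from the waveform itself):

```python
from datetime import datetime, timedelta

def sample_timestamps(t0, dt, n):
    """Timestamp of value[index] = t0 + (dt * index), for 0 <= index < n."""
    return [t0 + timedelta(seconds=dt * i) for i in range(n)]

# Four samples per second -> dt = 0.25 (assumed example values)
t0 = datetime(2009, 1, 1, 0, 0, 1)
for ts in sample_timestamps(t0, 0.25, 4):
    print(ts)
```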
    Here is a VI in the attachment that demonstrates how to get the timestamp of each value in the channel.
    Regards,
    Tianbin
    Attachments:
    GetDataTimestamp.vi ‏21 KB

  • Timestamp in TDMS file

    Hi,
    I'm trying to read some timestamp values stored in TDMS files. This is how I'm doing it (see file),
    and this is my final result. Why does it only show a lot of numbers and not a time??? I am going nuts soon.
    Attachments:
    Frustated time.png ‏66 KB

    The TDMS file will just have a start time (t0), so add those numbers (seconds) to the start time and you'll have your timestamps.
    Chris
    Certified LabVIEW Architect
    Certified TestStand Architect

  • Customer exit to get the result between two fiscal periods

    Hi Guys,
    I have a requirement to write a customer exit in which I have to get results for a range of fiscal periods,
    i.e. between fiscal period 1 and fiscal period 3.
    I get this fiscal period from another variable, called version, which consists of a combination of fiscal period and text.
    I have filtered out the fiscal period and stored it in Final_val (an integer). How can I use Final_val dynamically to get the results between Final_val+1 and Final_val+3? (That means if Final_val is 2008010, then I have to get the results between 2008011 and 2009001.)
    Please provide me a solution, with a possible piece of code.
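The roll-over arithmetic in the example above (2008010 shifted 3 periods forward lands on 2009001) can be sketched as plain integer math. This is an illustrative helper, not the customer-exit ABAP itself, and it assumes a 12-period year with keys of the form YYYYPPP:

```python
def shift_period(yyyyppp, offset, periods_per_year=12):
    """Shift a fiscal period key of the form YYYYPPP by `offset` periods,
    rolling over the year boundary (assumes periods 1..periods_per_year)."""
    year, period = divmod(int(yyyyppp), 1000)
    total = year * periods_per_year + (period - 1) + offset
    new_year, new_period = divmod(total, periods_per_year)
    return new_year * 1000 + new_period + 1

# 2008010 + 1 -> 2008011; 2008010 + 3 rolls into the next year -> 2009001
print(shift_period(2008010, 1), shift_period(2008010, 3))
```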

    Hi Diogo,
    Here is the code
    WHEN 'ZC_PVR'.
        DATA: FIN_YEAR(4) TYPE C,
              FIN_DATE(3) TYPE C,
              FIN_VAL(7) TYPE C.
        IF I_STEP = 2.
          READ TABLE I_T_VAR_RANGE INTO LT_VAR_RANGE WITH KEY VNAM = 'ZC_VCS'.
          IF SY-SUBRC EQ 0.
            CONCATENATE '20' LT_VAR_RANGE-LOW+2(2) INTO FIN_YEAR.
            CONCATENATE '0' LT_VAR_RANGE-LOW+4(2) INTO FIN_DATE.
            CONCATENATE FIN_YEAR FIN_DATE INTO FIN_VAL.
            CLEAR L_S_RANGE.
            L_S_RANGE-LOW =  FIN_VAL.
            L_S_RANGE-HIGH =  ''.
            L_S_RANGE-SIGN = 'I'.
            L_S_RANGE-OPT = 'BT'.
            APPEND L_S_RANGE TO E_T_RANGE.
          ENDIF.
        ENDIF.
    which I am using to filter the fiscal period. After this, when I tried to restrict on this "ZC_PVR" variable and set the offsets
    (zc_pvr+1 to zc_pvr+3 under value ranges), I got an error saying "the variable may be deleted or used incorrectly".
    Could you please suggest something?

  • Change Periodic data to YTD data

    Hello,
    I have periodic data stored in the system (as a Periodic measure).
    Is there any way to convert the data to an accumulated type (changing to a YTD measure)?
    Regards,
    Miguel.

    If your data is in YTD format, then you can select that in the web application parameters.
    Suppose you are bringing actuals data from ECC in YTD format; then you can select a data source predefined for YTD format. Some data sources are preconfigured for MTD format and some for YTD.
    However, if it is merely a reporting issue, try the solution suggested by Niranjan.
    If you provide more details, I can try to help as much as I can.
    Regards,

  • Unable to delete period from Periods in Control tables

    I am trying to delete the period 'Sep' from Periods in the Control Tables,
    but I get the message 'FDM was unable to find the Record to delete'.
    I am not sure whether the record was deleted internally, but I can still see it in my period list in the Control Tables.
    Could anyone suggest how I should delete a period from the Control Tables, and where these periods are stored internally in FDM?
    Waiting for a reply!

    Hi,
    There is also a note in Metalink which suggests the record be deleted from the Workbench Client. I did that and it worked. The cause is stated as the record not being entered correctly, hence the application is unable to find it.
    Thanks
    Ramya

  • Using an external hard drive with iPhoto

    I stored my pictures on an external hard drive, but since they were in iPhoto format I have to import all of them to my computer just to view them. Is there a way to fix this, or will I just have to keep the pictures as JPEGs? I was hoping that I could use iPhoto to load and view pictures periodically while storing them on the external hard drive, rather than relying on Finder and Preview.

    iPhoto isn’t a photo format. If the drive contains a volume formatted as Mac OS Extended, which can be a partition or disk image, launch iPhoto with the Option key held down and either create a library on that volume or choose one already there.
    (110321)

  • NODES statement in a payroll report in SAP ABAP HR

    hi,
    I am working on an SAP ABAP HR report for payroll, and I am using the NODES statement in that report. It shows the error ""PERNR" is not a node of the logical database __S". How can I solve this error in this report?
    NODES pernr .
    INFOTYPES: 0000, 0001, 2001.
    TABLES: t554s, pcl1, pcl2.
    INCLUDE rpclst00.
    INCLUDE rpc2rx09.                      "Payroll results datadefns-Intl.
    INCLUDE rpc2rxx0.                      "Payroll results datadefns-Intl.
    INCLUDE rpc2rgg0.                      "Payroll results datadefns-GB
    INCLUDE rpcfdcg0.                      "Payroll results datadefns-GB
    INCLUDE rpcdatg0.
    INCLUDE rpc2cd00.                      "Cluster Directory defns.
    INCLUDE rpc2ps00.                      "Cluster: Generierte Schematas
    INCLUDE rpc2pt00.
    INCLUDE rpcfdc10.
    INCLUDE rpcfdc00.
    INCLUDE rpppxd00.
    INCLUDE rpppxd10.
    INCLUDE rpcfvp09.
    INCLUDE rpcfvpg0.
    INCLUDE rpppxm00.
    TYPES: BEGIN OF t_salrate,
        seqnr    TYPE pc261-seqnr,
        begda    TYPE p2001-begda,
        endda    TYPE p2001-endda,
        split(2) TYPE c,
        val      TYPE p DECIMALS 2,
       END OF t_salrate.
    DATA: it_salrate TYPE STANDARD TABLE OF t_salrate INITIAL SIZE 0,
          wa_salrate TYPE t_salrate.
    *Selection screen
    SELECTION-SCREEN BEGIN OF BLOCK block1 WITH FRAME TITLE text-001.
    SELECT-OPTIONS: so_awart FOR p2001-awart.
    SELECTION-SCREEN END OF BLOCK block1.
    *START-OF-SELECTION.
    START-OF-SELECTION.
    GET pernr.
* get payroll results data
      rp-init-buffer.
      CLEAR rgdir. REFRESH rgdir.
      CLEAR rt. REFRESH rt.
      CLEAR: rx-key.
* set key to current pernr
      MOVE pernr-pernr(8) TO cd-key-pernr.
* retrieve payroll results for specific pernr (personnel number)
      rp-imp-c2-cd.
      IF rp-imp-cd-subrc = 0.                                "rgdir success
        rx-key-pernr = pernr-pernr.
        SORT rgdir BY seqnr ASCENDING.
        CLEAR rgdir.
      ENDIF.
      SORT rgdir BY fpbeg fpend ASCENDING seqnr DESCENDING.
* RGDIR is the table where all payroll results periods are stored
      LOOP AT rgdir WHERE  abkrs IN pnpabkrs        "pay area
                      AND  srtza EQ 'A'
                      AND  void  NE 'V'.
        IF sy-subrc NE 0.
* set key to specific payroll results period (current RGDIR loop pass)
          UNPACK rgdir-seqnr   TO   rx-key-seqno.
* retrieve data for specific payroll results period (current RGDIR loop pass)
          rp-imp-c2-rg.
* loop at WPBP data for specific payroll results period
          LOOP AT wpbp.
            wa_salrate-seqnr = rgdir-seqnr.
            wa_salrate-begda = wpbp-begda.
            wa_salrate-endda = wpbp-endda.
            wa_salrate-split = wpbp-apznr.
* loop at RT data for specific payroll results period
            LOOP AT rt WHERE lgart EQ '/010' AND             "wage type
                             apznr EQ wpbp-apznr.            "payroll split
              wa_salrate-val = ( rt-betpe * ( wpbp-adivi / wpbp-kdivi ) ).
              APPEND wa_salrate TO it_salrate.
            ENDLOOP.
          ENDLOOP.
* process BT table
          LOOP AT BT.
          ENDLOOP.
* process NIPAY table
          LOOP AT NIPAY.
          ENDLOOP.
* etc.
        ENDIF.
      ENDLOOP.
    *END-OF-SELECTION.
    END-OF-SELECTION.

    Hi,
    Have you put a Logical Database in the attributes of the program that you have created ?
    Regards,
    Samson Rodrigues.

  • Field in MB5B T-Code

    Hi all,
    I am using transaction MB5B.
    Can anybody please tell me which table and field are used for the opening stock?
    The technical info shows MARD-LABST, but my value doesn't match the table value.
    Waiting for your reply.
    Thanks in advance

    Hi,
    The open stock of a material is available in the Accounting 1 view of the material master.
    Here is some information on finding the opening stock of a material in a period. This information is stored in the tables MBEWH, MARDH and MCHBH.
    As of Release 4.5, stock fields and valuation fields relating to the previous period or to an earlier period are stored in history tables (for example, MBEWH, MARDH, MCHBH), and no longer in those tables in which the current stock data is stored (for example, MBEW, MARD, MCHB).
    These history tables can contain one entry for each period. An entry is created in the history table only if the stock-relevant data or valuation-relevant data changes. The value of the entry relates to the end of the period. The history tables do not contain any entries for the current period.
    Example
    Scenario
    At the start of period 02, there are 10 pieces of material 4711 in stock.
    Goods receipt
    5 pieces are received in period 02.
    System response
    The system records a stock of 10 pieces in the history table for period 01. At the same time, the system increases the current stock to 15 pieces.
    Goods receipt
    2 more pieces are received in period 02.
    System response
    The history table is unaffected by this event because an entry already exists for period 01. The system increases the current stock to 17 pieces.
    Goods issue
    4 pieces are withdrawn from stock in period 04.
    System response
    The system records a stock of 17 pieces in the history table for period 03. At the same time, the system reduces the current stock to 13 pieces. (The history table does not contain an entry for period 02 because there were no goods movements in period 03.)
    Rules
    If the history table does not contain an entry for the previous period, the values for the previous period are the same as those for the current period.
    If the history table does not contain an entry for a period n that precedes the previous period, the values for period n are the same as those for period n+1.
    Since the second rule can be applied recursively, it is possible to determine (in accordance with both rules) the values for any periods as of the period in which Release 4.5 or a subsequent release was implemented.
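The two rules above amount to a backward fill over a sparse history table. A minimal sketch with hypothetical data (real code would read MARDH and its siblings):

```python
def stock_at_end_of_period(history, current_stock, period):
    """Resolve the stock at the end of `period` from a sparse history table.

    `history` maps period -> stock recorded for that period. A missing entry
    means "same as the next period that has an entry" (rule 2, applied
    recursively); if no later entry exists, it means "same as the current
    stock" (rule 1). Hypothetical data, not actual MARDH reads.
    """
    p = period
    latest = max(history, default=period)
    while p not in history:
        p += 1
        if p > latest:
            return current_stock   # rule 1: fall back to the current stock
    return history[p]

# Scenario from the example: history entries for period 01 (10 pcs) and
# period 03 (17 pcs); current stock is 13 pcs in period 04.
history = {1: 10, 3: 17}
print(stock_at_end_of_period(history, 13, 2))  # same as period 03 -> 17
print(stock_at_end_of_period(history, 13, 4))  # no later entry -> 13
```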
    Rgds,
    Prajith

  • ASO members with formula not rolling up

    Hi Gurus,
    In our ASO cube we have members which calculate percentages that are not rolling up to the parent. They roll up for MTD, but not for QTD. The members are in the Account dimension. Please let me know if you require more info to get the exact idea.

    These are MDX formula members, right? Is the problem that the results are recalculated at the QTD level rather than the MTD level summing up the Period dimension?
    Is Period a stored dimension? The members / formulae in dynamic dimensions or hierarchies will always be calculated after roll-up of stored dimensions / hierarchies.
    See the section on 'Calculation Order' here: http://docs.oracle.com/cd/E26232_01/doc.11122/esb_dbag/alocare.html#alocare1058144
    If that is the problem you're seeing, the only options are a) to use an ASO calculation to derive the percentages as stored values, which can then roll up in stored hierarchies, or b) calculate the percentages outside of Essbase - in a relational staging area, for example.
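The effect described above can be seen with made-up numbers: a percentage member recalculated at QTD level (the ratio of the rolled-up operands) is generally not the sum of the monthly percentages that a stored hierarchy would produce.

```python
# Made-up monthly values for the two measures behind a % member.
numerator = [10, 20, 30]       # e.g. monthly margin
denominator = [100, 100, 200]  # e.g. monthly sales

# Dynamic formula member at QTD: recalculated from rolled-up operands.
qtd_recalculated = sum(numerator) / sum(denominator)

# Naive roll-up of the monthly percentages (what summing in a stored
# hierarchy would give) is a different, usually meaningless, number.
qtd_summed = sum(n / d for n, d in zip(numerator, denominator))

print(qtd_recalculated, qtd_summed)
```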

  • Display planned depreciation values for depre areas

    Hi,
    I have 2 depreciation areas, 01 for book and 04 for local depreciation.
    How do I display the periodic depreciation values for 04 on the Asset Explorer posted values tab if I don't want to maintain a derived depreciation area?
    Your assistance will be greatly appreciated.
    Thank you

    Hi,
    If area 04 does not post, you will not get any info on the "posted values" tab, simply because this area does not post to G/L.
    Only parallel depreciation areas, or real depreciation areas that are part of the definition of a posting derived depreciation area, also get period values stored; this is the exception. A standalone real depreciation area which does not post and is not parallel will not store period depreciation values.
    Regards,
    Markus
