Raw time series with power spectrum

I want to generate another VI with a raw time series and its power spectrum. Use soundgen.vi with a sampling frequency of 1 kHz and a data size of 2000 samples, so that it covers 2 s.
How do I display the raw time series data and a power spectrum of it?
How do I find the rest of the peaks and identify them, and why are the peaks where they are?
Attachments:
soundgen.vi (15 KB)

To start, search the examples that ship with LabVIEW for "FFT". This will show you a good cross-section of what LabVIEW has to offer in terms of analysis capabilities. The thing to remember is that you can perform this analysis on data coming directly from a device or post-process data that you read from a data file. After going through the examples, if you have specific questions, we'll be able to give more specific answers...
Mike...
Certified Professional Instructor
Certified LabVIEW Architect
LabVIEW Champion
"... after all, He's not a tame lion..."
Be thinking ahead and mark your dance card for NI Week 2015 now: TS 6139 - Object Oriented First Steps
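For reference, here is a minimal Python/NumPy sketch of the analysis asked about above. It is not LabVIEW, and the two test tones are assumptions standing in for soundgen.vi's actual output; the point is that with fs = 1 kHz and N = 2000 samples the record lasts 2 s and the spectrum has a bin spacing of fs/N = 0.5 Hz, so peaks appear at the bins closest to the signal's component frequencies.

```python
# Minimal power-spectrum sketch (Python/NumPy stand-in for the LabVIEW VI).
# The two test tones below are assumptions, not the actual soundgen.vi output.
import numpy as np
import matplotlib.pyplot as plt

fs = 1000.0          # sampling frequency, Hz
n = 2000             # number of samples -> 2 s record, bin width fs/n = 0.5 Hz
t = np.arange(n) / fs

# Hypothetical stand-in for soundgen.vi: two tones at 50 Hz and 120 Hz
x = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

# One-sided power spectrum: |X(f)|^2 / N^2, doubled for the positive frequencies
spectrum = np.fft.rfft(x)
freqs = np.fft.rfftfreq(n, d=1 / fs)
power = (np.abs(spectrum) ** 2) / n**2
power[1:-1] *= 2     # account for the discarded negative-frequency bins

fig, (ax_time, ax_freq) = plt.subplots(2, 1)
ax_time.plot(t, x)                      # raw time series
ax_time.set_xlabel("Time (s)")
ax_freq.plot(freqs, power)              # power spectrum; peaks at 50 Hz and 120 Hz
ax_freq.set_xlabel("Frequency (Hz)")
plt.show()
```

With a 0.5 Hz bin width, any tone at a multiple of 0.5 Hz falls exactly on a bin; otherwise its energy leaks into the neighbouring bins.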

Similar Messages

  • Memory leak with Power Spectrum function

    Hi
I have a memory leak in my application. Observing it with the Desktop Execution Trace Toolkit, the Power Spectrum function (from NI_AALPro.lvlib) seems to have two reference leaks.
The first one refers to the "Open VI Reference" function; I worked around that by replacing the "Open VI Reference" with an "Initialize" Boolean input, but I don't understand the second leak.
Any ideas?
Thanks...
(See attachments: Desktop Execution Trace + code)
LV 8.6.1 + Desktop Execution Trace Toolkit 2009
Same problem with LV 2012 + Desktop Execution Trace Toolkit 2012
    Attachments:
DesktopExecutionTraceToolkit.png (82 KB)
MemoryLeakPowerSpectrum.zip (7 KB)

    Hi Mathilde,
    Thank you for using NI Discussion Forums!
I reproduced this problem with LV 2012 + Desktop Execution Trace Toolkit 2012 and will look into it further.
Are there many calls to this function in your code? Is the leak causing a problem for you?
    Thank you.
    Regards,
    Audrey_P
    National Instruments France

  • Dynamic Time Series with Planning connection?

    Hi guys,
I have built some reports in HFR with a connection to a Planning application (the architect requested this). But the only way I can get the YTD function working properly is with an Essbase connection (I get an error with the Planning connection).
My question is: can reports that use the DTS function be built with a Planning connection? The version is 11.1.2.2.
Could you please help me?
    Thanks!
    Lu

Is the report showing any Planning-specific content, such as supporting detail or cell comments?
If not, you can use an Essbase connection. However, if you do so, the way metadata is presented is going to be different (if you use a Planning connection, the member-level security will mimic what is in Planning).
You also cannot use attributes with a Planning connection.
    Regards
    Celvin Kattookaran

  • Power Spectrum Density conversion to Time Series Data

    Hi,
This may seem an odd request, but is there a way to convert power spectral density (PSD) data back to the time series that generated it in the first place? I have lost the original time series data but still have the PSD and need the time series to do other analysis.
    Thanks,
    Rhys Williams

    Hate to be the bearer of bad news, but there are an infinite number of time series that will generate a given PSD.  You lose all phase information upon taking the PSD.  For this reason I almost always save time domain data, or at least complex FFT values.  
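To make the phase-loss point concrete, here is a small NumPy sketch (an illustration, not a recovery method; the single-line magnitude spectrum is an assumption): it attaches random phases to a fixed one-sided magnitude spectrum and synthesizes a waveform. Every run produces a different time series, yet all of them share the same PSD, which is why the original cannot be recovered from the PSD alone.

```python
# Many different time series share one PSD: build one candidate with random phases.
import numpy as np

rng = np.random.default_rng()
n = 1024                                  # assumed record length
freqs = np.fft.rfftfreq(n, d=1.0)         # unit sample spacing assumed

# Hypothetical one-sided magnitude spectrum (think: square root of a saved PSD)
magnitude = np.zeros(freqs.size)
magnitude[10] = 1.0                       # a single spectral line

# Random phases leave the PSD untouched but change the waveform completely
phases = rng.uniform(0.0, 2.0 * np.pi, size=freqs.size)
phases[0] = 0.0                           # keep the DC term real
spectrum = magnitude * np.exp(1j * phases)
candidate = np.fft.irfft(spectrum, n=n)   # one of infinitely many possible signals

# The candidate reproduces the magnitude spectrum (and hence the PSD) exactly.
print(np.allclose(np.abs(np.fft.rfft(candidate)), magnitude))
```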

  • Perf issues with Time series function in OBIEE

    Attached is the SQL:
    SELECT SUM (T256675.ACTIVITY_GLOBAL2_AMT) AS c1,
    SUM (T256675.ACTIVITY_GLOBAL3_AMT) AS c2
    FROM X_FINANCIAL_HIERARCHY_DH T610485 /* Dim_X_GLACCT_ALTVIEW_D */,
    X_FINANCIAL_HIERARCHY_DH T610414 /* Dim_X_MGMT_ENTITY_D */,
    W_GL_ACCOUNT_D T256463 /* Dim_W_GL_ACCOUNT_D */,
    W_GL_BALANCE_F T256675 /* Fact_W_GL_BALANCE_F */,
    SAWITH3,
    SAWITH6
    WHERE ( T256463.ROW_WID = T256675.GL_ACCOUNT_WID
    AND T256463.X_FIN_HIER1_WID = T610485.ROW_WID
    AND T256463.X_FIN_HIER5_WID = T610414.ROW_WID
    AND T256675.BALANCE_DT_WID = SAWITH3.c3
    AND SAWITH6.c1 = SAWITH3.c1
    AND T610414.ACCOUNT_HIER8_NAME = 'Worldwide'
    AND T610414.HIERARCHY_SOURCE = 'EntityMgmt'
    AND T610485.ACCOUNT_HIER7_NAME = 'Controllable Expenses'
    AND T610485.HIERARCHY_SOURCE = 'AltViews'
    AND SAWITH6.c3 = '2009 / 11'
    AND SAWITH6.c2 >= SAWITH3.c2
The SAWITH3 and SAWITH6 subqueries, generated by the time series functions, are causing full table scans on W_GL_BALANCE_F and W_GL_ACCOUNT_D. Although we are only interested in the 2009/11 data, the queries generated by the TODATE time series function scan the entire W_DAY_D table and cause several performance issues.
The W_GL_BALANCE_F table has indexes on GL_ACCOUNT_WID and BALANCE_DT_WID.
How can we force the use of the indexes on these columns for better performance?
Please advise us on the right approach to improve performance.
    SQL for SAWITH3
    ===========
WITH SAWITH0 AS (
    SELECT T31328.ROW_WID AS c3, T31328.PER_NAME_FSCL_QTR AS c4,
    ROW_NUMBER () OVER (PARTITION BY T31328.PER_NAME_FSCL_QTR ORDER BY T31328.PER_NAME_FSCL_QTR DESC)
    AS c5,
    T31328.PER_NAME_FSCL_MNTH AS c6,
    ROW_NUMBER () OVER (PARTITION BY T31328.PER_NAME_FSCL_QTR, T31328.PER_NAME_FSCL_MNTH ORDER BY T31328.PER_NAME_FSCL_QTR DESC,
    T31328.PER_NAME_FSCL_MNTH DESC) AS c7
    FROM W_DAY_D T31328 /* Dim_W_DAY_D_Common */),
SAWITH1 AS (
    SELECT CASE
    WHEN CASE SAWITH0.c5
    WHEN 1
    THEN SAWITH0.c3
    ELSE NULL
    END IS NOT NULL
    THEN RANK () OVER (ORDER BY CASE SAWITH0.c5
    WHEN 1
    THEN SAWITH0.c3
    ELSE NULL
    END ASC NULLS LAST)
    END AS c1,
    CASE
    WHEN CASE SAWITH0.c7
    WHEN 1
    THEN SAWITH0.c3
    ELSE NULL
    END IS NOT NULL
    THEN RANK () OVER (PARTITION BY SAWITH0.c4 ORDER BY CASE SAWITH0.c7
    WHEN 1
    THEN SAWITH0.c3
    ELSE NULL
    END ASC NULLS LAST)
    END AS c2,
    SAWITH0.c3 AS c3, SAWITH0.c4 AS c4, SAWITH0.c6 AS c5
    FROM SAWITH0),
SAWITH2 AS (
    SELECT MIN (SAWITH1.c1) OVER (PARTITION BY SAWITH1.c4) AS c1,
    MIN (SAWITH1.c2) OVER (PARTITION BY SAWITH1.c4, SAWITH1.c5)
    AS c2,
    SAWITH1.c3 AS c3
    FROM SAWITH1),
SAWITH3 AS (
    SELECT DISTINCT SAWITH2.c1 + 5 AS c1, SAWITH2.c2 AS c2,
    SAWITH2.c3 AS c3
    FROM SAWITH2),
    SQL for SAWITH6
    ===========
SAWITH4 AS (
    SELECT T31328.PER_NAME_FSCL_MNTH AS c3, T31328.ROW_WID AS c4,
    T31328.PER_NAME_FSCL_QTR AS c5,
    ROW_NUMBER () OVER (PARTITION BY T31328.PER_NAME_FSCL_QTR ORDER BY T31328.PER_NAME_FSCL_QTR DESC)
    AS c6,
    ROW_NUMBER () OVER (PARTITION BY T31328.PER_NAME_FSCL_QTR, T31328.PER_NAME_FSCL_MNTH ORDER BY T31328.PER_NAME_FSCL_QTR DESC,
    T31328.PER_NAME_FSCL_MNTH DESC) AS c7
    FROM W_DAY_D T31328 /* Dim_W_DAY_D_Common */),
SAWITH5 AS (
    SELECT CASE
    WHEN CASE SAWITH4.c6
    WHEN 1
    THEN SAWITH4.c4
    ELSE NULL
    END IS NOT NULL
    THEN RANK () OVER (ORDER BY CASE SAWITH4.c6
    WHEN 1
    THEN SAWITH4.c4
    ELSE NULL
    END ASC NULLS LAST)
    END AS c1,
    CASE
    WHEN CASE SAWITH4.c7
    WHEN 1
    THEN SAWITH4.c4
    ELSE NULL
    END IS NOT NULL
    THEN RANK () OVER (PARTITION BY SAWITH4.c5 ORDER BY CASE SAWITH4.c7
    WHEN 1
    THEN SAWITH4.c4
    ELSE NULL
    END ASC NULLS LAST)
    END AS c2,
    SAWITH4.c3 AS c3, SAWITH4.c5 AS c4
    FROM SAWITH4),
    Thanks
    Srini Pendem

    Hi Srini,
I've had quite a few performance issues with ToDate and Ago. I tended to create materialized views on my fact tables that materialize those values as columns, so I could avoid the ToDate and Ago functions altogether (a rough sketch of this idea appears after this post). Is there any chance you can implement a similar solution and bypass the time series functions entirely?
Just to check, those tables belong to BI Apps, right? If so, you can log an SR about the performance issue, since it's an issue with the model that Oracle sold you.
    Good luck!
    -Joe
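As a language-neutral illustration of the approach Joe describes (precomputing the period-ago and to-date values as plain columns so reports never call AGO or TODATE at query time), here is a small pandas sketch; the table and column names are made up, and in practice this would live in a materialized view rather than Python.

```python
# Conceptual sketch only: materialize AGO/TODATE-style values as ordinary columns.
# The month/region/sales names are illustrative, not the BI Apps schema.
import pandas as pd

fact = pd.DataFrame(
    {
        "month":  ["2009-09", "2009-10", "2009-11", "2009-09", "2009-10", "2009-11"],
        "region": ["East", "East", "East", "West", "West", "West"],
        "sales":  [100, 200, 300, 50, 60, 70],
    }
)
fact["year"] = fact["month"].str[:4]
fact = fact.sort_values(["region", "month"])

# "1 month ago" stored as a column (what AGO would otherwise compute per query)
fact["sales_month_ago"] = fact.groupby("region")["sales"].shift(1)

# Year-to-date stored as a column (what TODATE would otherwise compute per query)
fact["sales_ytd"] = fact.groupby(["region", "year"])["sales"].cumsum()

print(fact)
```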

  • Time Series initialization dates with fiscal periods!

    Dear Experts
    Problem:
    I cannot initialize planning area.
    Period YYYY00X is invalid for periodicity P FYV
    Configuration
    Storage Buckets Profile: Week, Month, Quarter, Year, Post Period FYV all checked.
    Horizon
    Start: 24.02.2013
    End: 31.12.2104
    No Time stream defined.
    Fiscal Year Variant
2012
  Month:  1   2   3   4   5   6   7   8   9  10  11  12
  Edate: 28  25  31  28  26  30  28  25  29  27  24  31
  FP:     1   2   3   4   5   6   7   8   9  10  11  12
2013
  Month:  1   2   3   4   5   6   7   8   9  10  11  12
  Edate: 26  23  30  27  25  29  27  24  28  26  23  31
  FP:     1   2   3   4   5   6   7   8   9  10  11  12
2014
  Month:  1   2   3   4   5   6   7   8   9  10  11  12
  Edate: 25  22  29  26  24  28  26  23  27  25  22  31
  FP:     1   2   3   4   5   6   7   8   9  10  11  12
2015
  Month:  1   2   4   5   5   7   8   8  10  10  11  12
  Edate: 31  28   4   2  30   4   1  29   3  31  28  31
  FP:     1   2   3   4   5   6   7   8   9  10  11  12
    Question
What dates should I enter for the planning area initialization start and end in order to initialize for the maximum duration, given the settings above?
I have tried a few dozen combinations, but none were accepted. To start, I tried the same dates as in the horizon of the storage bucket profile, but given the error text I cannot work out what dates I am expected to enter for the time series creation.
    Thanks
    BS

    Thanks Mitesh,
No, it's 2014.
    Here is what worked.
    Storage Bucket Horizon
    Start: 24.02.2013
    End: 22.11.2014
    Time Series Initialization
    Start: 01.02.2013
    End Date: 31.12.2014
    The fiscal year variant is what I pasted above.
I thought time series could only be initialized for a subset of the period in the storage bucket profile! This is my first experience of the initialization period being larger than the storage horizon. Is that really the case? I have always gone by the book, and this was just a chance discovery.
I was aware from the SAP notes of the requirement to have one more year on either side in the fiscal year variant for it to be used for initialization, hence the range of dates I tried.
I would appreciate your comments on this.
    All dates in dd/mm/yyyy.
    Thanks
    BS

  • Issue with nested aggregated in time series

    Hi Experts,
I have a small requirement using time series, e.g.:
(sum(current year4:Quarters bookingamt) - sum(previous year4:Quarters bookingamt)) / sum(previous year4:Quarters bookingamt)
Bookingamt is my measure name.
How do we implement the above logic? Please help with this.
    Thanks,
    C Mahesh

    Hi Mahesh,
These functions help you analyze data with respect to time. There are two types of time series functions in OBIEE:
    1) Ago
    2) To Date
Following are the prerequisites and the procedure to create time series functions in the OBIEE RPD.
a) Create a dimensional hierarchy (e.g. Year → Qtr → Month) and convert it to a time dimension: double-click the hierarchy and check the "Time dimension" box.
b) Define the chronological key. The chronological key should always be at the lowest level in the table, and its data should be in a format such as 20090102 or 200109 (year/month format): double-click the lowest (detail) level, go to the Keys tab, and check "Chronological key".
c) Duplicate the existing column, rename it, and specify the formula for the column.
1) Ago: when specified in a column, this function displays data that is one month ago, two months ago, and so on, depending on the specified formula. Double-click the new column, check "Use logical columns", and open the Expression Builder. In the left pane select Functions, then Time Series, then AGO. For the metric, go to the logical tables and select the measure; for the level, go to the time dimension and select Month, and give the period as 1 (since it is one month ago). Then move the two new columns to the presentation layer.
Example: sales 100, 200, 300 → -, 100, 200 (1 month AGO).
2) To Date: this function displays aggregated measures based on the specified formula. Double-click the new column, check "Use logical columns", and open the Expression Builder. In the left pane select Functions, then Time Series, then TODATE. Select the metric (e.g. revenue) from the logical tables and, for the level, go to the time dimension and select Year (for YTD), then click OK.
YTD (year to date) displays values aggregated month by month through the year; similarly for QTD and MTD.
Example: sales 100, 200, 300 → YTD sales 100, 300, 600.
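As a language-neutral sketch of the two semantics above (plain Python, not OBIEE syntax), reproducing the example numbers:

```python
# AGO and TO DATE semantics on the example series sales = 100, 200, 300.
sales = [100, 200, 300]

# 1 month AGO: each period shows the previous period's value (none for the first).
ago_1 = [None] + sales[:-1]          # -> [None, 100, 200]

# Year to date: running total of the periods seen so far within the year.
ytd, running = [], 0
for value in sales:
    running += value
    ytd.append(running)              # -> [100, 300, 600]

print(ago_1, ytd)
```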
    http://www.rittmanmead.com/2007/04/obi-ee-time-dimensions-and-time-series-calculations/ -- How to implement step by step with screen shots.
    http://www.oraclebidwh.com/2009/12/time-series-wizard-in-obiee/ -- you can understand easily how to implement this with screen shot.
Hope it helps.
    Thanks,
    satya

  • Issues using two-pass with dynamic time series?

Are there any issues with using a dynamic time series member [Q-T-D(May)] on a Dynamic Calc account that is defined as two-pass and has a time balance property associated with it?
The account is defined as "Dynamic Calc" + "Two-Pass" + "TB First" and works fine, but it does not return a result when used with DTS as Q-T-D(May). The same account seems to work fine without the two-pass option.
    Thanks in advance,
    cheers,
    revati

    Hi Revati,
Yes, you are right; I tried the same scenario you mentioned.
Y-T-D values are not fetched with two-pass, but in that case two-pass is not required, because the member already has "Dynamic Calc".
So you can proceed with Dynamic Calc without two-pass.
Actual (New York, Cola):
Period                          Sales   COGS    Margin  Margin %      Margin1 %
Jan                             6000    5500    500     8.333333333   8.333333333
Feb                             5500    5000    500     9.090909091   9.090909091
Mar                             7000    6500    500     7.142857143   7.142857143
Qtr1                            18500   17000   1500    8.108108108   8.108108108
Q-T-D(Mar), with two-pass       18500   17000   1500    8.108108108   #missing
Q-T-D(Mar), without two-pass    18500   17000   1500    8.108108108   7.142857143
    Praveenkumar.I
    TSPL
    mobile:09975686985

  • Power Spectrum with RFSA

    Hi all,
I have some questions about fetching a power spectrum with the RFSA drivers.
When I set the span and resolution bandwidth to 100 MHz and 1 MHz respectively in ni5660 Configure for Spectrum, and use ni5660 Read Averaged Power Spectrum to fetch the power spectrum, does the averaged power spectrum output cover the whole 100 MHz span in one read?
If so, how can I draw the spectrum in a Waveform Chart the way the RFSA demo does? The RFSA demo did use a Waveform Chart to draw the spectrum, didn't it?
    Thanks
    Liu Yuan

    Hello Liu Yuan,
If you take a look at the ni5660 Power in Band example in the LabVIEW Example Finder (Help -> Find Examples), you will notice it uses a Waveform Graph. The Read Averaged Power Spectrum VI is contained within a while loop, which means that it will continuously output the averaged power spectrum. If you want to use the RFSA drivers, I suggest looking at the RFSA Getting Started Spectrum example in the Example Finder. This example only reads the spectrum once, but you could put a while loop around the read VI to get continuous spectrum output (a pseudocode sketch of this pattern follows below). If you have any more questions, feel free to post back.
    Regards,
    Benjamin Cook
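In pseudocode form, the pattern described above is simply a single-shot read wrapped in a loop. The Python-style sketch below assumes hypothetical read_averaged_power_spectrum() and update_graph() helpers standing in for "ni5660 Read Averaged Power Spectrum" and the front-panel Waveform Graph; it is not the RFSA API.

```python
# Continuous spectrum display: wrap a single-shot spectrum read in a loop.
# read_averaged_power_spectrum() and update_graph() are hypothetical stand-ins
# for "ni5660 Read Averaged Power Spectrum" and the front-panel Waveform Graph.

def acquire_continuously(stop_requested, read_averaged_power_spectrum, update_graph):
    """Fetch and display averaged power spectra until the caller asks to stop."""
    while not stop_requested():
        freqs, power_dbm = read_averaged_power_spectrum()  # one full-span sweep
        update_graph(freqs, power_dbm)                     # refresh the display
```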

  • How to find the year ago measure with out using time series functions

    hi all
Is there any way to find year-ago sales without using time series functions like AGO?
    Thanks
    Sreedhar

    Hello Madan,
    Thanks for the reply.
It still doesn't take the product into account.
    My columns are as below
Prod   Week End Date   Current Sales   Prior Sales   % Change
A      12/4/2010       100             0
A      12/11/2010      200             100
A      12/18/2010      300             200
B      12/4/2010       400             300   (this prior value is not for product B; I want it to be 0 as well, but we get product A's last sales amount)
Is there any way this can be done? I have tried EVALUATE and MSUM.
I cannot build a time dimension, as all I have is a view.
    Thanks,
    Deep

Printing wirelessly over a Time Capsule from a MacBook Air to a Lexmark 5600/6600 series: only the test page prints

I am using a MacBook Air and trying to print wirelessly over a Time Capsule to a Lexmark 5600/6600 series printer. I can print a test page, but nothing else. Can anyone help?

    Check here, if this can help:
    http://support.apple.com/kb/HT4670?viewlocale=en_US

Will default macros work with time series key figures?

    Hi,
I want to update time series key figures using default macros. For example, I want to populate safety stock in a time series key figure while running the heuristics in the background.
The whole idea is to enable reporting: we could not extract safety stock because it is an auxiliary key figure.
If someone has done this in a better way, could you please share it with me?
I would also like to know whether default macros work with time series key figures.
Any help is rewarded with points.
    Thanks & Regards,
    Jagadeesh.

    Hi,
I don't think I was clear in my earlier post, so let me make my scenario very clear.
We want to do reporting on an SNP planning area, where we report on safety stock and stock on hand, which are currently auxiliary key figures.
As we cannot extract data from auxiliary key figures, we want to track that information in time series key figures. For that, we have modified the macros and copied the auxiliary key figures into time series key figures. All of the macros are default macros, all of the prerequisites have been set, and all of the macros that are supposed to run before this macro have also been set up.
We are not running background jobs for these macros exclusively; instead we run the heuristics in the background and assume that all of the default macros will be executed once the heuristics are done, i.e. the values in the auxiliary key figures are copied to the time series key figures.
Now we have found that the time series key figures are not getting updated by the default macros.
I hope that makes the scenario clear, but my question is still not answered:
1. Without running background jobs separately for the macros, will the time series key figures get updated?
    Thanks & Regards,
    Jagadeesh.

  • Use of time series functions with horizontally fragmented fact tables

    Hi Guys,
In OBIEE 10g it wasn't possible to use time series functions (AGO, TODATE) on horizontally fragmented fact tables. This was due to be fixed in 11g.
Has this been fixed? Has anybody used this new functionality? What are the limitations?
    Tkx
    Emil

    Hello,
Can you give us some examples of your "horizontally fragmented fact tables"? Then we can tell you whether this can be done or not.
    Thanks,

  • Problem with time series functions against Essbase

    Hello all,
I'm having problems displaying correct results when using the AGO function in OBIEE. I'm using Essbase 9.3.1 and OBIEE 10.1.3.4.1. A logical column named MAGO1 is created in the repository like this: IFNULL( AGO("Measure01", Time.Month, 1), 0). In Answers, when I use the columns Measure01, MAGO1, and Time.Month with a filter just on the Time.Month column, the results are OK. When I add another filter, for example on Dim1.Gen2,Dim1, MAGO1 returns zero. But when I include Dim1.Gen2,Dim1 both as a column and as a filter, the results are OK again. I don't want Dim1.Gen2,Dim1 as a column, just in a filter. Hiding the column is not an option either: I want grouped results, not multiple rows for one month, which is what I get if I remove the Dim1.Gen2,Dim1 filter. The filter is selected from a dashboard prompt and can be a value from Dim1.Gen2,Dim1 or All Choices. The reason for all this is that I'm using this Answers report as a data source for BI Publisher, so additional formatting on the Answers report doesn't help.
I suspect that something is wrong with the hierarchy in the time dimension or the other dimensions, or maybe the MDX is not generated correctly. If you need more details about my repository or configuration, please feel free to ask. Any help is greatly appreciated.
    Thanks,
    D.

I am using BI Publisher with an Answers report as the data source, so changes like hiding a column or displaying the results as a pivot table won't help in this case.
Anyway, I have discovered something else. When I put logical columns with the AGO function in the report, they return correct results. But when I add an additional logical column with the TODATE function, all of the logical columns with AGO and TODATE return zero. This happens only when the result has more than one row: with only one row, for example at the grand-total level of a dimension, the result is OK, but with more rows at a lower level the zero values appear.
Some columns that are used in the filter do not appear as columns in the report; this is the main problem. If I put all of the filtered columns in the report, the result is OK, but I don't want to do that because I want the option to select values at different levels from the dashboard prompt.
Hope this narrows down the problem.
    Thanks,
    D.

Fragmentation with time series calculation

    Hi,
I have three tables:
    Table1 for Region 1 , Frag Logic =Region = Region 1
    Table 2 for Region 2 Frag Logic =Region = Region 2
    Table 3 for Region 3 Frag Logic =Region = Region 3
We use fragmentation to query based on the region. Now I have duplicated Table 1 to calculate the previous date for the time series:
    Table1 for Region 1 , Frag Logic =Region = Region 1
    Table1 for Region 1 Prev ?
    Table 2 for Region 2 Frag Logic =Region = Region 2
    Table 2 for Region 2 Prev ?
    Table 3 for Region 3 Frag Logic =Region = Region 3
    Table 3 for Region 3 Prev ?
After adding the Prev tables I am almost lost. Do I need to add the fragmentation logic for the Prev tables as well, or is that not needed since both the main and the Prev tables are created as aliases of the base tables? Please share your thoughts.

Using time series with fragmentation in OBIEE is identified as a bug:
    http://obieetalk.com/time-series-and-fragmentation
