Saving last run data in Power Pivot

Hi All,
I am using a cube as the source for my Power Pivot reports.
Suppose I create a report and run it on a monthly basis using Power Pivot.
Say my first run is on 1 Jan 2014; if I run the report again on 1 Feb 2014, it will overwrite the data I got on 1 Jan 2014.
Is there any option to "do not refresh automatically", or a way to save the previous state of the data without overwriting it?
Thanks
Syed Faiz
SFH

Hi Syed,
Generally, BI analysis focuses on historical data in most cases. If you want to review previous historical data in your report (such as a PivotTable, Power View and so on), you should consider implementing Time Intelligence functions, filters or slicers for your scenario.
Regards,
Elvis Long
TechNet Community Support

Similar Messages

  • How to sync/pull 'phpMyAdmin' data into Power Pivot?

    Hi,
    I work for one of the leading e-commerce companies in India (Lenskart.com). I have just installed Power Pivot and am looking forward to using my database/CRM data in Power Pivot, as the data runs into the millions of rows. I have the login info for my server/DB; the rest
    I do not know how to configure in Power Pivot so that I can pull that data into Power Pivot in Excel. I need help setting it up.
    thanks..

    Hi Pranay,
    This is the set-up I use also. 
    You need to make sure you first have the ODBC MySQL Connector installed on your machine (presuming you are on Windows):
    http://dev.mysql.com/downloads/connector/odbc/ 
    WINDOWS
    Then you need to go into "ODBC Data Sources". If you are using Windows 8 or above, simply type this into the Start screen. Make sure you select the correct version, i.e. 64-bit if you are running a 64-bit system.
    Click 'Add...'
    Choose your driver, for me this is 'MySQL ODBC 5.3 Unicode Driver'.
    Click "Finish".
    You will then be prompted with a configuration screen; the only info I have needed is:
    Data Source Name
    TCP/IP Server
    Port (usually 3306)
    User
    Password
    Database
    Click 'Test' to make sure the connection is working. In the past I have had errors caused by my IP address not being added to the allow list on the server firewall, so be wary of this.
    Click 'OK' and that is your connection done. 
    POWERPIVOT
    In Power Pivot, open your data model and then select 'From Other Sources'.
    Choose 'Others (OLEDB/ODBC)'.
    Enter a name for the connection.
    Click 'Build'
    Click the 'Provider' tab
    Click 'Microsoft OLE DB Provider for ODBC Drivers'.
    Click 'Next'
    From the 'Use Data Source Name' drop-down, select the data source you created above.
    Click 'Test Connection' 
    Click 'OK'
    Click 'Next'
    You will then have the option to either write your own query to extract data or to select from a list of tables. 
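    If you go the query route, a quick way to confirm the connection works before importing millions of rows is a small test query. A minimal sketch in MySQL SQL; the table and column names (orders, order_id, order_date, order_total) are placeholders for whatever exists in your database:
    -- Hypothetical sample: pull a limited number of rows to verify the ODBC connection.
    SELECT order_id,
           customer_id,
           order_date,
           order_total
    FROM   orders
    WHERE  order_date >= '2014-01-01'   -- keep the preview small
    LIMIT  100;                         -- MySQL syntax; drop this once you import the full table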
    Hope that helps!

  • How to get the last run date.

    We intend to develop an incremental data load mapping using this strategy:
    1) The mapping reads the date it was last run from an auxiliary table.
    2) It selects from the source only those rows that were inserted or updated after said date.
    3) Then, a post-mapping process updates the last run date in the auxiliary table, using SYSDATE.
    The problem with this logic is that there is a gap: if the mapping starts running at 1:00 and ends at 2:00, the rows that are inserted in between will never be loaded.
    Is there any way to get the time when the mapping started running? Is there a better way to do this?
    Any help would be appreciated.
    Juan Algaba

    There is always the possibility of some record updates slipping through the cracks if you are depending on dates, unless you are very careful. All of the audit tasks that the OWB-generated code performs take time. Any pre- or post-process that needs to run takes time. So which date is the best cutoff point to equate to "when the last run of the merge (or insert or update) statement completed"?
    Plus, how do you handle reloads if the previous load failed and your mapping had incremental commits?
    Is your source on another server? If so, are the dates in perfect sync? The audit tables are populated with the SYSDATE of your runtime schema. Is that the same as the SYSDATE on your remote source database?
    I would qualify my query to look for all updates since the start of the last run that finished successfully, adjusted if necessary for SYSDATE differences if the source is in a remote schema. And make sure that your code handles any reloads gracefully in the event that this brings back data you have already loaded once.
    Because we use Oracle Streams to load a local staging area, we also have custom code to dump the primary keys of all data changes into utility staging tables while Streams is updating the local copy. So, our Person table has an st_Person_delta table that just holds the primary keys that have been updated by Streams since the last ETL run.
    During the datamart load we disable the Streams apply to stabilize our environment, and join these lists of PKs to their source tables to drive our ETL. So we only select data where Streams has performed an update to the row since our last run. When the ETL is done, we truncate the primary-key staging tables and then turn Streams back on to start loading our new delta into the staging tables again.
    The ETL gets pretty complex, though, when many tables join together in one mapping and you need to check all possible source-table deltas to see if any of them got updated, to determine the delta for a given dimension or fact record; but it works great once you get it all done.
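    A common way to close the gap described in the original question is to record the run's start timestamp in the auxiliary table before extraction begins, and to filter the source against the start time of the last successful run rather than the time the run finished. A minimal Oracle SQL sketch; the table and column names (etl_control, src_orders, last_update_date, etc.) are invented for illustration:
    -- 1) Record the start of this run; the status is flipped to 'SUCCESS' only at the end.
    INSERT INTO etl_control (map_name, run_start, status)
    VALUES ('LOAD_ORDERS', SYSDATE, 'RUNNING');
    -- 2) Extract everything changed since the START of the last successful run,
    --    so rows inserted while the previous run was still executing are not lost.
    SELECT o.*
    FROM   src_orders o
    WHERE  o.last_update_date >= (SELECT MAX(run_start)
                                  FROM   etl_control
                                  WHERE  map_name = 'LOAD_ORDERS'
                                  AND    status   = 'SUCCESS');
    -- 3) Only after the load has committed, mark this run as successful.
    UPDATE etl_control
    SET    status = 'SUCCESS'
    WHERE  map_name  = 'LOAD_ORDERS'
    AND    run_start = (SELECT MAX(run_start) FROM etl_control WHERE map_name = 'LOAD_ORDERS');
    Because the cutoff is the start of the previous successful run, some rows can be picked up twice, so the load itself should be a MERGE (or otherwise idempotent), which matches the reload advice in the reply above.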

  • To find last run date of custom concurrent program in oracle apps

    Hello Experts,
    Can you please tell me how to find the last run date of a custom concurrent program in Oracle Apps (through a backend query)?
    Thanks in advance,
    Edited by: 981527 on Mar 7, 2013 3:01 AM

    try the below:
    select fcp.user_concurrent_program_name
    ,fcr.request_date
    ,fu.user_name
    ,fcr.actual_start_date
    ,fcr.actual_completion_date
    ,fcr.phase_code
    ,fcr.status_code
    ,fcr.argument1
    ,fcr.argument2
    ,fcr.argument3
    from fnd_concurrent_programs_vl fcp
    ,fnd_concurrent_requests fcr
    ,fnd_user fu
    where fcp.user_concurrent_program_name like 'Payroll Run'
    and fcp.concurrent_program_id = fcr.concurrent_program_id
    and fcr.requested_by = fu.user_id
    order by fcr.actual_completion_date desc
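    If all you need is the single most recent run date, a trimmed-down variant of the query above should do; the program name is a placeholder, and phase_code = 'C' restricts the result to completed requests:
    SELECT MAX(fcr.actual_start_date) AS last_run_date
    FROM   fnd_concurrent_programs_vl fcp
          ,fnd_concurrent_requests    fcr
    WHERE  fcp.concurrent_program_id        = fcr.concurrent_program_id
    AND    fcp.user_concurrent_program_name = 'XX Custom Program'   -- replace with your program name
    AND    fcr.phase_code                   = 'C';                  -- completed requests only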

  • How to find the last run date of the report..

    Please can anyone help me:
    how do I find the last run date of a report?
    For example, my report is ZGRIR and I executed it last week;
    I want to find out when it was executed.
    The last run date is required.
    Please, can anyone help me...

    The transaction STAT is limited, because I want all the execution dates even if they are 3 years old.
    I have tried transaction ST03, but it is limited to the last 3 months.
    Check transaction STAT and its report RSSTAT00.
    You can copy RSSTAT00 to ZRSSTAT00 and modify it accordingly.
    *& Report ZDSAP *
    REPORT ZDSAP .
    DATA: d_ref TYPE REF TO data,
    d_ref2 TYPE REF TO data ,
    i_alv_cat TYPE TABLE OF lvc_s_fcat,
    ls_alv_cat LIKE LINE OF i_alv_cat.
    TYPES tabname LIKE dcobjdef-name .
    parameter: p_tablen type tabname.
    data: begin of itab occurs 0.
    INCLUDE STRUCTURE dntab.
    data: end of itab.
    FIELD-SYMBOLS : <F_FS> TYPE table,
    <F_FS1> TYPE TABLE,
    <F_FS2> TYPE ANY,
    <F_FS3> TYPE TABLE.
    REFRESH itab.
    CALL FUNCTION 'NAMETAB_GET'
    EXPORTING
    langu = sy-langu
    tabname = p_tablen
    TABLES
    nametab = itab
    EXCEPTIONS
    no_texts_found = 1.
    LOOP AT itab .
    ls_alv_cat-fieldname = itab-fieldname.
    ls_alv_cat-ref_table = p_tablen.
    ls_alv_cat-ref_field = itab-fieldname.
    APPEND ls_alv_cat TO i_alv_cat.
    ENDLOOP.
    * internal table build
    CALL METHOD cl_alv_table_create=>create_dynamic_table
    EXPORTING it_fieldcatalog = i_alv_cat
    IMPORTING ep_table = d_ref .
    ASSIGN d_ref->* TO <F_FS>.
    SELECT * FROM (p_tablen) INTO CORRESPONDING FIELDS OF TABLE <F_FS>.
    LOOP AT <F_FS> ASSIGNING <F_FS2>.
    *your code goes here.
    ENDLOOP.

  • Extract data in Power Pivot from MDX with different levels of hierarchy

    Hi All,
    My requirement is to extract the data from a tabular model and put it in Power Pivot; as all the measure calculations are present in the model, I am just extracting the aggregated data.
    The issue appears when rolling up the data. Is there any way to extract the data so that it displays the right value at each level? As per the example below, at Level 2 the data is getting summed up, which is not correct.
    Example:
    Level 1: A = 10%, B = 20%
    Level 2 (rollup): 30%
    Thanks Chandan

    Hi Chandan,
    According to your description, you created a SQL Server Analysis Services tabular model and are extracting data into Power Pivot from MDX; the problem is that the hierarchy data rolls up incorrectly depending on the hierarchy level, right?
    I have tested this in my local environment and cannot reproduce the issue. So, as per my understanding, the issue may be caused by the hierarchy settings in your tabular model. Here are some links about how to create and manage hierarchies in a tabular model;
    please refer to them and check whether the settings are correct.
    http://msdn.microsoft.com/en-in/library/hh213003.aspx
    http://www.youtube.com/watch?v=qKKNY1Vj_2c
    If the issue persists, please provide us with more information about your hierarchy creation steps, so that we can make further analysis.
    Regards,
    Charlie Liao
    TechNet Community Support

  • Create a listing of reports with last run date and # of times run

    Hi All,
    I want to create a listing of reports with the last run date and the number of times each report has been run in the past 18 months.
    Could anybody please help me with the query for this?
    Regards,
    Sk
    Edited by: user10989244 on Aug 18, 2009 7:12 AM

    Hi Sk
    Assuming you have the collection of statistics enabled, which it is out of the box, you can get the information you need from the EUL5_QPP_STATS table. This script will help:
    SELECT
    QPP.QS_DOC_OWNER WORKBOOK_OWNER,
    QPP.QS_DOC_NAME WORKBOOK_NAME,
    QPP.QS_DOC_DETAILS WORKSHEET_NAME,
    QPP.QS_CREATED_BY RUN_BY,
    TRUNC(MAX(QPP.QS_CREATED_DATE)) LAST_USED_DATE,
    COUNT(QPP.QS_ID) TIMES_USED
    FROM
    EUL5_QPP_STATS QPP
    WHERE
    QPP.QS_CREATED_DATE >= ADD_MONTHS(SYSDATE, -18)
    GROUP BY
    QPP.QS_DOC_OWNER,
    QPP.QS_DOC_NAME,
    QPP.QS_CREATED_BY,
    QPP.QS_DOC_DETAILS
    ORDER BY 1,2,3;
    Best wishes
    Michael

  • Calculating 60 days from last run date and sum a total for those 60 days

    I need to calculate 60 days back from the last run date (I will create prompts), then use that range to sum a total of items.
    Can someone be of assistance?

    Hi,
    You can create a calculation that works out the date 60 days before your run date, e.g. if rundate is your date item then the calculation will be:
    rundate-60
    Rod West
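    If you would rather push the whole calculation into the query, the same idea looks roughly like this in SQL; :run_date stands for your prompt value and the table/column names (order_items, item_amount, item_date) are placeholders:
    -- Sum the item totals for the 60 days leading up to the prompted run date.
    SELECT SUM(item_amount) AS total_last_60_days
    FROM   order_items
    WHERE  item_date >  :run_date - 60   -- 60 days before the run date
    AND    item_date <= :run_date;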

  • Last run date & time & multiple files handling

    Hi friends,
    I am working on an outbound interface in which I retrieve data from a custom DB table, process it with other DB tables, write the final output to a file on the application server, and update the custom DB table.
    Now I have an issue: when I give plant as input on the selection screen (select-options), I should also get those plants from my custom table which have no last run date and time.
    How do I get that?
    The next part of my question: for multiple plant entries, multiple files have to be created on the application server.
    Presently, I have concatenated the file path and name. Do I need to loop over the same with s_werks? Will it work?
    Also, I used an internal table to store the update data. When there are multiple plants, how do I handle the update?
    Rgds,
    Simran

    Hi.
    In your reply I didn't understand two things.
    1) This is how I'm doing the selection:
        select lt_run_dt
               lt_run_tm
        into table itab1
        from z_table
        where prog_name EQ sy-repid
        AND   plant IN s_werks.
    Will it give me plants whose last_run_date and last_run_time are not in the Z table? (See the sketch at the end of this thread.)
    2) If you are going with a select-option instead of a parameter, and you have to write a FILE per PLANT, then you can loop at the plants -> extract the data -> write to the app server -> update the Z table -> end loop.
    Does that mean:
    loop at s_werks
    open dataset
    transfer
    modify
    endloop
    Kindly clarify to solve this issue.
    Rgds,
    Simran
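    To answer question 1) above: no, that SELECT only returns plants that already have rows in the Z table. Plants with no last-run entry need an anti-join against the plant master. A rough SQL sketch, where T001W is the standard plant master and z_table and its columns mirror the names used in this thread (treat them as placeholders):
    -- Plants from the selection that have no last-run record in the Z control table.
    SELECT w.werks
    FROM   t001w w
    WHERE  w.werks IN ('P100', 'P200')              -- stands in for the s_werks selection
    AND    NOT EXISTS (SELECT 1
                       FROM   z_table z
                       WHERE  z.plant     = w.werks
                       AND    z.prog_name = 'ZPROGRAM');   -- placeholder program name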

  • Help find the Last run date from previous month

    Hi all
    I am stuck trying to find the Last Run Date from the previous month.
    select distinct(date_ran) from TABLE X where date_ran like '%/11-%' order by date_ran desc gives
    "03/30/11-06:19
    "03/25/11-03:01
    "03/24/11-03:00
    "03/23/11-03:00
    "03/22/11-03:00
    "03/21/11-03:00
    "03/18/11-03:00
    "03/17/11-00:00
    "03/16/11-06:31
    "02/15/11-07:42
    "02/15/11-06:00
    "02/14/11-08:19
    Here the result I am trying to achieve is 02/15/11-07:42. Note the column date_ran is a VARCHAR2(255) and not a TIMESTAMP.
    I tried
    select ADD_MONTHS(to_date(max(date_ran), 'MM/DD/YYYY-HH24:MI'),-1)
    from daily_tests_a
    where date_ran like '%/11-%'
    order by date_ran desc
    which returns 2/28/0011 6:19:00 AM and that is not the result I am looking for
    Regards
    SMK

    Welcome to the forum!
    user2931503 wrote:
    I am stuck trying to find the Last Run Date from the previous month. The result I am trying to achieve is 02/15/11-07:42; note the column date_ran is a VARCHAR2(255) and not a TIMESTAMP.
    That's a very bad idea. Points in time should always be stored in DATE (or maybe TIMESTAMP) columns. If not, you're going to waste resources converting them to DATEs whenever you need them to behave like DATEs, and risk run-time errors because of bad data.
    Normally, the "previous" month means the month before the current month. In April, the previous month is March.
    By "previous" month, do you mean the 2nd latest month in the table? That is, this data contains data from February and March, 2011; March 2011 is the latest of those months, February 2011 is the next-to-last, and so is that why you want a date from February?
    I tried
    select ADD_MONTHS(to_date(max(date_ran), 'MM/DD/YYYY-HH24:MI'),-1)
    from daily_tests_a
    where date_ran like '%/11-%'
    order by date_ran desc
    which returns 2/28/0011 6:19:00 AM and that is not the result I am looking for
    Regards
    SMK
    Try this:
    WITH got_real_date AS
    (
        SELECT  TO_DATE ( SUBSTR (date_ran, 1, 14)
                        , 'MM/DD/RR-HH24:MI'
                        )        AS real_date
        FROM    table_x
    --  WHERE   ...              -- If you need any filtering, put it here
    )
    SELECT  TO_CHAR ( MAX (real_date)
                    , 'MM/DD/RR-HH24:MI'
                    )            AS last_date
    FROM    got_real_date
    WHERE   real_date < (
                          SELECT  TRUNC (MAX (real_date), 'MONTH')
                          FROM    got_real_date
                        );
    Edited by: Frank Kulash on Apr 5, 2011 3:59 PM

  • Does ODI keep a record of the last run date of an interface?

    Target:Oracle
    Source:MS SQL
    ODI:11g
    My requirement: I have a source table, and whatever operations are performed on that table (insert, update, etc.), I want to load those records into the target table. For that I have created an interface which runs daily at 6 PM. While loading, I have put in a filter which loads only the data that was inserted/updated on the current date (GETDATE()). Now suppose my ODI has been down (for whatever reason) for 3 days. In that case I have no way to load the records that changed in those three days. When I start ODI on the fourth day, how can I get the data that changed in the last three days?
    Is there any place where ODI keeps a record of the last run date of an interface, or something similar?
    If I use CDC, then once ODI is down the journalizing cannot keep the changed-data info.
    How can I load the changed data into the target?
    Thanks,
    Shrinivas

    If you have a date column in both the source and the target, you can use a filter and write the expression like this:
    Source_date > (select max(date_created/update) from schema.target_table)
    I have validated this expression and it was validated by ODI; I did not try running it, so please give it a shot.
    Suresh
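    If you want something more robust than "changes since the current calendar day", one option is a small control table on the target side that stores the timestamp of the last successful load; the interface filter then compares against that value instead of GETDATE(), so a three-day outage is caught up automatically on the next run. A sketch with invented names (odi_load_control, last_load_date, SRC.last_update_date):
    -- Filter on the interface: pick up everything changed since the last successful load.
    SRC.last_update_date > (SELECT last_load_date
                            FROM   odi_load_control
                            WHERE  interface_name = 'INT_LOAD_TARGET')
    -- Post-integration step, executed only after the load succeeds:
    UPDATE odi_load_control
    SET    last_load_date = SYSDATE
    WHERE  interface_name = 'INT_LOAD_TARGET';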

  • BI Report information: Report technical name - Report name - Last run date

    hi,
    I want to know how to gather information about all the reports present in the production environment in the following format:
    Report technical name - Report name - Last run date
    I know the table RSZCOMPDIR, where we can find the technical name and last run information, but this table does not give the report name (description).
    Is there any other table which can give me this information?

    Hi Amruta,
    check the following forum link, which was already posted; it will probably help:
    How to find out Query last used by whom
    Regards
    Sree
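    If you are comfortable reading the BEx metadata tables directly, another option (a sketch only, not verified) is to join RSZCOMPDIR to the element text table RSZELTTXT, which as far as I recall stores the descriptions keyed by the same UID. Treat the exact field names below as assumptions to be checked in SE11:
    -- Sketch only: verify the field names (COMPID, COMPUID, ELTUID, TXTLG, LASTUSED) in SE11 first.
    SELECT c.compid   AS report_technical_name
          ,t.txtlg    AS report_name
          ,c.lastused AS last_run
    FROM   rszcompdir c
          ,rszelttxt  t
    WHERE  t.eltuid  = c.compuid
    AND    t.objvers = c.objvers
    AND    c.objvers = 'A'       -- active version
    AND    t.langu   = 'E';      -- description language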

  • How to get the last run date of an ISchedulerTask?

    Hello,
    I have implemented an ISchedulerTask, and inside the run method I would like to get the last timestamp/date when the task ran. Any idea how to achieve that?
    Rgds,
    Roy

    Hi
    You can create a custom metadata property and set it with the date when the ISchedulerTask runs.
    So, in your code you can read this property whenever you want to read the previous date.
    Hope this works.
    Thanks
    Puneet

  • Maintaining Last Run Date!

    Hi,
    we have a specific requirement to create a delta-enabled DataSource using a Z function module, which will use a Z control table for maintaining deltas.
    Any pointers to the same will be appreciated.
    Regards,
    San!

    hi siggi,
    I have created all this... but my problem is like this:
    Through the Z FM, for the first run, I have to filter the records based on some condition like vbreve-rrsta = 'z'.
    Going forward, I have to filter the records based on vbreve-erdat (the date on which the record was created) against last_run_date (i.e. the last time we ran the delta).
    The code for this:
    SELECT <fields> FROM vbreve
      INTO TABLE g_internal_table
      WHERE ( ( erdat = g_last_run_date AND erzet > g_last_run_time )
           OR ( erdat > g_last_run_date AND erdat < g_current_date )
           OR ( erdat = g_current_date AND erzet LE g_current_time ) )
        AND rrsta = 'z'.
    g_last_run_date = the previous run date and g_last_run_time = the previous run time;
    g_current_date = the current run date (sy-datum) and g_current_time = the current run time (sy-uzeit).
    These have to be stored in a Z table.
    For the first run (initial value), set g_last_run_date = 01/1900**
    I could use some help.

  • How to access SharePoint list data in a Power Pivot model

    Hi,
    I have a SharePoint list and I want to access it in a Power Pivot workbook.
    I tried exporting the list as a data feed, but that option is missing under Export.
    I am stuck; please help with any other approach.
    Regards,
    Krishnaa

    Hello,
    I have answered this question in this thread below:
    http://social.technet.microsoft.com/Forums/office/en-US/92a04e1b-a5f7-4dbe-88e7-0a1064622641/sharepoint-list-access-in-power-pivot?forum=sqlkjpowerpointforsharepoint
    Please go through the following articles:
    Using SharePoint List Data in PowerPivot:
    http://msdn.microsoft.com/en-us/library/hh230322.aspx
    Using a SharePoint List as a data source:
    http://blogs.technet.com/b/excel_services__powerpivot_for_sharepoint_support_blog/archive/2013/07/11/excel-services-using-a-sharepoint-list-as-a-data-source.aspx
    Regards,
    Elvis Long
    TechNet Community Support
