Historical data load schedule

Hello,
Is it possible to see, check, or configure somewhere in CRMOD the time frame for the nightly ETL runs that populate the historical/analytics subject areas?
Regards,
Alexei

You cannot configure when the ETL executes; the schedule is not user-configurable.
You can, however, check the last time the ETL ran in either of two ways:
You can check the last ETL run (last historical refresh) end time by clicking the Reports tab -> Pipeline Analysis -> Top 10 Opportunities. The bottom of the report states when it was last refreshed.
Alternatively, go to Reports -> Design View and select the Contact subject area. Pull in Contact ID and Contact Name, and in the column formula for Contact Name apply the formula VALUEOF(LAST_REFRESH_DT).
Execute the report and you will get the time when the ETL was last executed.
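As a concrete sketch of that second method, the column formula is just the one-line expression below, where VALUEOF reads the LAST_REFRESH_DT variable referenced above (the choice of column carrying the formula is arbitrary):

```
VALUEOF(LAST_REFRESH_DT)
```

Whichever column carries the formula, every row of the result will show the timestamp of the last historical refresh.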
Regards
Paul

Similar Messages

  • Trading Partner field (VBUND) is left blank in EC-PCA historical data load

    Hello everyone,
I have a question regarding the trading partner field (VBUND) in EC-PCA. In my company we are turning on the Profit Center Accounting functionality, and we need to know how to get this trading partner field populated when we perform the historical data load.
We have tested the transactions 1KE8 (FI docs), 1KEC (MM docs) and 1KE9 (SD docs) that are used to transfer actual data to the GLPCA table. Unfortunately, the VBUND field is left blank when using these transactions.
This impacts the open journal entries for AR/AP (Accounts Receivable and Accounts Payable): entries posted prior to the go-live of the EC-PCA functionality would not have the trading partner information, while those posted after go-live would carry it (i.e., the documents would not match).
    Does anyone know how to solve this problem?

I worked on your issue, but it seems there is no standard option to update the VBUND field apart from writing an ABAP program to update GLPCA from the FI documents.
    Regards
    Siva

  • Init Load,Historical Data Load & Delta Load with 'No Marker Update'

What is the difference between compression in the Inventory cube for an init load, a historical data load, and a delta load with 'No Marker Update'?
Please help with this query.
    Thanks
    Gaurav

    Hi Gaurav,
    I believe you will find the answers here...
    [http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/e0b8dfe6-fe1c-2a10-e8bd-c7acc921f366&overridelayout=true]
    regards
    Pavel

  • Historical data load for newly added fields

    Hi,
We are using cube 0UCSA_C01 with delta update, and it has gone live. The requirement now is to add 2 new fields to this cube, but the problem is how to load the historical data for these fields, since that data has already been updated. Please let me know how this can be achieved and the procedure to follow.
Thanks; help will be appreciated.

    Hi,
How to populate historical data into a newly added field in an InfoProvider:
    /people/dinesh.lalchand/blog/2006/02/07/loopback-process--follow-up
    http://sap.ittoolbox.com/documents/popular-q-and-a/adding-characteristics-to-infocubes-2138
    Thanks,
    JituK

  • Update/Delta Mechanism while scheduling data loads

    Hello BW Gurus,
When loading transactional data, which update method should be selected?
    I have 2 scenarios.
Scenario 1: Create an InfoPackage to load historical data from a particular date until today (e.g., dates between 1.4.2002 and 31.10.2004).
Scenario 2: Create an InfoPackage to load data in the future, i.e., from tomorrow to 31.12.9999 (e.g., 01.11.2004 - 31.12.9999).
I also want to capture changes/modifications to the historical data (i.e., dates between 1.4.2002 and 31.10.2004).
So when I create an InfoPackage, I initially select a date range (between 1.4.2002 and 31.10.2004) and "Init Delta" as the update method.
My question is: having selected "Init Delta", does it load the entire data set from 1.4.02 to 31.10.04 like a full update, and also create a delta queue in R/3 (RSA7) to capture changes for the records in the selected period?
For scheduling the future load (Scenario 2), should I create an InfoPackage with the "delta" update method, so that it captures all new records and changes from then on?
    Is there a document from where I can learn the same or can anyone guide me with the steps.
    thanks.

    Hi,
Normally you have 2 InfoPackages:
1. Initialization of the delta with data transfer.
This package extracts all historical data and sets a flag in the source system so that, from this moment on, all changes (updates, deletions, creations) are trapped in the delta queues.
2. Delta upload.
This package extracts all the delta data since the last init or delta run.
    regards
    Siggi

  • Historical Data with it's delta loading from PSA to ODS in 3.x model

    Hello ,
I want to load one year of historical data, with its deltas, into an ODS object in the BW 3.x model.
Can you please give the steps to load historical data from the PSA table into the ODS object (not from the source system)?
Thanks a lot
    Regards
    BI Beginner

    Hi
Run a full-load InfoPackage from PSA to ODS with selections on 0CALDAY (give the one-year selection).
Make this full load a repair full request: in display mode of the InfoPackage, in the menu bar click Scheduler -> Repair Full Request, check the option, and click OK.
Now execute the InfoPackage.
If you run it like this, the init you have between your PSA and ODS will not get disturbed.
    Regards,
    Venkatesh.

  • How to automate the data load process using data load file & task Scheduler

    Hi,
I am trying to automate loading data into a Hyperion Planning application with the help of a Data_Load.bat file and Task Scheduler.
I have created the Data_Load.bat file, but I am unable to complete the rest of the process.
Could you help me automate the data load process using the Data_Load.bat file and Task Scheduler, or tell me what other files are required to achieve this?
    Thanks

To follow up on your question: are you using MaxL scripts for the data load?
If so, I have seen an issue where, if the batch file (e.g., load_data.bat) does not contain the full path to the MaxL script, running it through Task Scheduler appears to work but the log and/or error file is not created. In other words, the batch claims it ran from Task Scheduler although it didn't do what you needed it to.
If you are using MaxL, use this as the batch command:
"essmsh C:\data\DataLoad.mxl". Alternatively, you can use the full path for essmsh; either way works. The only reason the MaxL might then not work is if the batch has not been updated with all the MaxL PATH changes, or if you need to update your environment variables so that the essmsh command works in a command prompt.
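As a concrete sketch of that advice (the Essbase client install path below is an assumption; substitute your own), a batch file for Task Scheduler could use full paths throughout and redirect output, so a log is produced even when the scheduler's working directory differs from the script's:

```bat
@echo off
rem Full paths everywhere: Task Scheduler does not guarantee the working
rem directory, so relative paths can silently break log/error file creation.
set ESSMSH=C:\Oracle\Middleware\EPMSystem11R1\products\Essbase\EssbaseClient\bin\essmsh.exe
set SCRIPT=C:\data\DataLoad.mxl
set LOGFILE=C:\data\DataLoad.log

rem Capture both stdout and stderr so a failed run still leaves evidence.
"%ESSMSH%" "%SCRIPT%" > "%LOGFILE%" 2>&1
```

The explicit redirection also works around the symptom described above, where the task "succeeds" but no log appears.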

  • Automate the data load process using task scheduler

    Hi,
I am trying to automate loading data into a Hyperion Planning application with the help of Task Scheduler.
I have created the Data_Load.bat file, but I am unable to complete the rest of the process.
Could you help me automate the data load process using the Data_Load.bat file and Task Scheduler?
    Thanks

    Thanks for your help.
I have done the data loading using the Data_Load.bat file. The .bat file calls a .msh file; following your steps with this approach, automatic data load is possible.
    If you need this then please let me know.
    Thanks again.
    Edited by: 949936 on Oct 26, 2012 5:46 AM
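To complete the picture, the Task Scheduler side of this thread's setup can also be registered from the command line with schtasks (the task name, batch path, and run time below are assumptions for illustration):

```
schtasks /create /tn "HyperionDataLoad" /tr "C:\data\Data_Load.bat" /sc daily /st 02:00
```

This registers a daily 02:00 run of the batch; /tn names the task and /tr is the command it executes.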

  • Historical data from ACE20 load balancer modules

    Hello,
I am trying to get some historical data from our ACE20 load balancer modules, which are housed in a 6504, but I cannot seem to find where to get this information. I would like information such as memory used, CPU, active connections, throughput, HTTP requests, etc., over the past month. Is there any way I can do this? I am using ANM 2.0.
    Thanks

    Hello,
Below are the steps to load via DB Connect.
Go to RSA1 -> Modeling -> Source System and double-click on the source system. On the right-hand side, right-click on the topmost node. This takes you to a screen where you can give the table name (if you know it) or just execute, and it will display all the tables you have in Oracle. Double-click on the selected table, select the fields required, and generate the DataSource. Then come back to RSA1 -> Source System and replicate the DataSource. Assign an InfoSource as usual, create an InfoPackage, and start the load.
Note that this does not support the delta process.
    This document could help:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/2f0fea94-0501-0010-829c-d6b5c2ae5e40
    Regards,
    Dhanya

  • Loading historical data in DBI

    Hi
We are implementing DBI v7.x in a project. Any suggestions on how we can load 3 years of historical data (currently in a non-Oracle environment) into DBI?
    Thanks
    Senthil

You need to have Oracle Apps implemented in order to implement DBI.
However, the latest versions of DBI have the flexibility to get data from other sources and create DBI dashboards with the old data.

  • Loading historical data with 0CO_PC_ACT_02

    Hello,
I need to load historical values with InfoSource 0CO_PC_ACT_02. I can load the current period, but I get no data from R/3 for historical periods. When I run RSA3 on R/3 for a historical period, I get no data, so I don't believe this is a BW issue.
My questions:
1. Is there a job or something on R/3 I need to run to enable this DataSource to pull historical data? If so, what is it and how do I run it?
2. Or is this DataSource simply not able to pull historical data?
    Thanks.
    Dave

    Hi All,
I have the same issue. Has anyone found a workaround to load historical data with this extractor (0CO_PC_ACT_02)?

  • Loading Historical data to the new field without deleting the data in cube

    Dear BI Experts,
I have enhanced a generic DataSource with a new field in BI 7.0.
I need to load historical data for the newly appended field.
As we have a very large volume of data, it is not possible to delete the data and run the init again.
Is there any other possibility to load historical data for the newly appended field without deleting the old requests?
    Thanks for your Kind  help.
    Kind Regards,
    Sunil

    Dear Sushant,
    Thanks for your reply.
But I am just wondering if there is any possibility of loading historical data for the new field using the remodeling concept, without deleting the old requests.
I do not know much about the remodeling concept, but I have heard that it is not recommended.
Can you please suggest and help?
    Can you please suggest and help.
    Thanks and Regards,
    Sunil Kotne

  • Reg: Loading historic data for the enhanced field

    Hello All,
We need to add a new field, 0VENDOR, to our DataSource 0FI_GL_4. This field is available in our BSEG table, so we are planning to go ahead with a DataSource enhancement.
Now, please advise on how to update the historical data for this newly added field. I have heard there is BW functionality/a program to do so without deleting the entire data set. Kindly advise on the possible solutions.
    Thanks & Regards
    Sneha Santhanakrishnan

    HI Sneha,
Using the remodeling option you will be able to do that, i.e., load historical data for new attributes without deleting the existing data. The limitation is that in remodeling you can only assign a constant, the value of another attribute, or values determined using an EXIT.
When you load data from the source system and need the historical data as well, the best practice is to delete the existing data and reload it from the source system.
If you don't want to do that, there is one trick, though I am not sure whether it will work. The idea is to populate the historical values for 0VENDOR using the customer-exit option of remodeling. To get the historical values in the customer exit, you need all of that data in a BW table; here you could create a generic extractor that stores all the documents and their respective vendors, so that as you load data from the source system you get the historical values as well.
Then read that table in the customer exit and populate the vendor value. This is a one-time process to populate the historical values.
    Regards,
    Durgesh.

  • Data load issue in BI7.0 Enviornment

    Hi,
We recently did a technical upgrade from BW 3.x to BI 7.0. We have a new requirement: I need to create a copy cube (B) of the base cube (A) and load the data from cube A into cube B.
I need to enhance cube A with a few key figures and need historical data for them, which is why I am not using the remodeling method. These key figures are attributes of master data.
To load data from base cube A to new cube B, do I need to follow these steps:
1. Export DataSource
2. Create update rules
3. Create an InfoPackage
or do I need to create:
1. A transformation
2. A DTP?
Cube A is loaded from two ODS objects, and I am going to write an update routine between the ODS and cube A to populate the new key figures, reading the values from master data (the new key figures are attributes of master data).
Can someone please advise me on this?

Thanks a lot, Khaja, for the quick response.
I will store the key figures in cube A, and they require historical data.
These key figures are attributes of the material master. I will populate them in the cube from master data using an update routine. If I go for remodeling, I will not get historical data.
If I get the key figures from master data, will that load the historical data?
I want to copy cube A to cube B and load the data from cube A to cube B.
Once the enhancement is done, I will bring the data back from cube B to cube A.
Please advise me:
Once the data is loaded into cube B and the whole of cube A is deleted, instead of bringing the data back from B to A, I would like to schedule data from the ODS to cube A, which is in the flow.
Do I need to create a transformation and DTP, or update rules and an InfoPackage?
Please advise me on this scenario.

  • Reloading Historical Data into Cube: 0IC_C03...

    Dear All,
I'm working on SAP BW 7.3. I have five years of historical data loaded into InfoCube 0IC_C03 (Material Stocks/Movements (as of 3.0B)), and the deltas are also scheduled (DataSources: 2LIS_03_BF & 2LIS_03_UM) for the InfoCube. I have a new business requirement to reload the entire historical data into the InfoCube with the addition of "Fiscal Quarter", and I do not have a DSO in the data flow. I know it is a tricky InfoCube (Inventory Management) to work on, and reloading the entire historical data can be challenging.
    1. How to approach this task, what steps should I take?
2. Drop/delete the entire contents of the InfoCube, then delete the setup tables, refill them, and run a "Repair Full Load": is this the way?
3. What key points should I keep in mind, given that different BO reports are already running against this InfoCube?
4. Should I run "Repair Full Load" requests for all DataSources (2LIS_03_BX, 2LIS_03_BF & 2LIS_03_UM)?
    I will appreciate any input.
    Many thanks!
    Tariq Ashraf

    Hi Tariq,
Unfortunately, you will need some downtime to execute a stock initialization in a live production system; otherwise you cannot guarantee data integrity. There are, however, ways to minimize it. Please see SAP Note 753654 - How can downtime be reduced for setup table update. You can dramatically reduce the downtime by distinguishing between closed and open periods, since closed periods can be updated retrospectively.
To make it more concrete, you could consider, e.g., the periods prior to 2014 as "closed". You can then do the initialization starting from January 1, 2014. This will save you a lot of downtime; closed periods can be uploaded retrospectively and do not require downtime.
Regarding the marker update, please have a look at the document "Re-initialization of the Material Stocks/Movements cube (0IC_C03) with 2LIS_03_BF, 2LIS_03_BX and 2LIS_03_UM in BW 7.x"; steps 10, 11, 12 and 13 contain important information.
Regarding your steps, they look OK to me, but please double-check against the same document.
    Please have a look at the following document for more background information around inventory management scenarios:
    How to Handle Inventory Management Scenarios in BW (NW2004)
    Last but not least, you might want to have a look at the following SAP Notes:
    SAP Note 436393 - Performance improvement for filling the setup tables;
    SAP Note 602260 - Procedure for reconstructing data for BW.
    Best regards,
    Sander
