Automatic loading from data marts

Hi All,
I have a cube with a data flow in which ODS1 feeds ODS2, which finally feeds the cube.
I have put this entire process in a process chain.
But many times, due to an erroneous record, the load into ODS1 fails, and I then have to correct the record in the PSA and manually activate ODS1 as well.
I now want to know whether the system will pick up the remaining processes in the process chain and continue loading automatically, OR whether automated loading settings can be defined in the InfoPackage so that the subsequent data targets are loaded once I have loaded ODS1 successfully?
Also, if I make such a setting, will the job run under my username or under BWALEREMOTE?
Thanks,
Sharmishtha Biswas

Thanks for your reply,
But is it possible that the data mart does this automatically?
I have observed that some of the subsequent InfoProviders get loaded automatically after I manually load and activate one data mart.
I would like to find an explanation for this.
Thanks,
sharmishtha

Similar Messages

  • Problem loading from DATA MART to ODS, SERVICE-API

    Hi gurus,
    I have a problem loading data from a data mart to an ODS (full load).
    But if I try the extractor itself (test in RSA3), it works fine.
    I have already replicated, generated, and checked the transfer rules for the data mart, but when I try to load data I get these two messages:
    Message no. R3005
    "The InfoSource 8TEST_MM specified in the data request, is not defined in the
    source system."
    Message no. RSM340
    Errors in source system.
    BTW: This test system was copied from the production system. So far I have had no problems with the system, but I had never tried loading from data marts before.
    Any ideas?
    Regards, Uros

    Thanks, for your answer.
    I already did that and everything is fine: I can see the InfoSource, and if I test the extractor it works fine, but the InfoPackage gives me the errors mentioned above.
    I already looked through notes and I couldn't find anything useful.
    I forgot to mention that I generated the export DataSource from a transactional ODS.
    Regards, Uros

  • Table on delta loads from data mart

    Hi,
    I am loading data from two DSOs (let's call them A and B) to another DSO (C) in BI 7 with a BW 3.5 delta InfoPackage.
    Now I want to know where I can find information on the timestamp or last request from the last delta load from A and B to C.
    So, in fact, I would like to know how the system knows, at the next delta load, which requests in A and B have not yet been loaded to C.
    In which table can I find the information for ODS A and B that the system uses to determine which data in the change log has or has not been loaded to ODS C or other targets? (In fact, I am looking for a comparison table in BW like the ROOS* tables in R/3.)
    Thanks in advance!
    Kind regards,
    Bart

    Hi Guys,
    Thanks for the answers.
    I know how to check everything in the Workbench, but I want to know where the delta information is technically stored.
    Just for the sake of completeness:
    Due to some issues, several successive loads from A and B were correctly loaded into DSO C (the 'new' table) but could not be activated. It is not possible to do a repeat or anything else; I will not go into too much detail, so please just take this for granted.
    The only way we can 'solve' the problem is to make the system believe that the last 3 loads (activated data) from A and the last two loads from B have not yet been loaded to C. Simply deleting the last deltas in C and running a new delta from A and B to C will not work.
    Therefore I want to 'manipulate' the table that is read by a delta load. If I can change the timestamp or request numbers in that table, I can make the system believe that some requests have not been loaded to C yet.
    Dirty, but I think it should work. I am still trying to find out which table contains the information about the data mart DataSources (8A and 8B) and the last delta load to the targets.
    Hope this is more clear.
    Thanks in advance!
    Kind regards,
    Bart
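    Conceptually, the datamart interface behaves like a per-target delta pointer: the source keeps its requests in load order, and each target remembers the last request it received. The sketch below is a simplified Python model of that bookkeeping (it is not SAP code; the class and names are illustrative, not real BW tables). Bart's planned 'manipulation' corresponds to resetting the pointer.

```python
# Simplified model of datamart delta bookkeeping -- illustrative only,
# not SAP code and not real BW table names.

class SourceDso:
    def __init__(self):
        self.requests = []        # activated request IDs, in load order
        self.delta_pointer = {}   # target name -> last request ID delivered

    def activate(self, request_id):
        self.requests.append(request_id)

    def delta_load(self, target):
        """Deliver every request newer than the pointer, then advance it."""
        last = self.delta_pointer.get(target)
        start = self.requests.index(last) + 1 if last in self.requests else 0
        new = self.requests[start:]
        if new:
            self.delta_pointer[target] = new[-1]
        return new

    def reset_pointer(self, target, request_id):
        """Move the pointer back so older requests look 'not yet loaded'."""
        self.delta_pointer[target] = request_id

a = SourceDso()
for rid in ["REQ1", "REQ2", "REQ3", "REQ4", "REQ5"]:
    a.activate(rid)

a.delta_load("C")               # first delta delivers all five requests
a.reset_pointer("C", "REQ2")    # pretend REQ3..REQ5 were never sent
print(a.delta_load("C"))        # -> ['REQ3', 'REQ4', 'REQ5']
```

    In the real system the pointer lives in BW metadata tables, which is exactly what Bart is trying to locate; the model only shows why resetting it makes the next delta re-deliver the old requests.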

  • Can we use 0INFOPROV as a selection in Load from Data Stream

    Hi,
    We have implemented BW-SEM BPS and BCS (SEM-BW - 602 and BI 7 ) in our company.
    We have two BPS cubes for Cost Center and Revenue Planning and we have Actuals Data staging cube, we use 0SEM_BCS_10 to load actuals.
    We created a MultiProvider on BPS cubes and Staging cube as a Source Data Basis for BCS.
    Issue:
    When loading plan or actuals data into the BCS cube (0BCS_C11) using the Load from Data Stream method, we have a performance issue. We automated the load process in a process chain; sometimes it takes about 20 hours for the plan data load alone, for 3 group currencies, followed by the elimination tasks.
    What I noticed is that when loading plan data, for example, the system also reads the actuals cube, which is not required; there is no selection available on the Mapping or Selection tab where I could restrict the data load to a particular cube.
    I tried to add 0INFOPROV to the data basis, but it does not show up as a selection option in the data collection tasks.
    Is there a way to restrict the data load into BCS using this load option, so that I can restrict which cube the data is read from?
    I know there is a filter BAdI available, but I am not sure how it works.
    Thanks !!
    Naveen Rao Kattela

    Thanks Eugene,
    We do have other characteristics, like Value Type (10 = Actual, 20 = Plan) and Version (100 = USD Actual, 200 = USD Plan), but when I load data into BCS using the Load from Data Stream method, the request goes to all the underlying cubes, which in my case are the planning cubes and the actuals cube. I don't want the request to go to the actuals cube when I am running only the plan load; I think this is causing a performance issue.
    For this reason I am thinking of using 0INFOPROV, as we do in BEx queries, to filter the InfoProvider so that data load performance improves.
    I was able to bring 0INFOPROV into the data basis by adding it to the characteristics folder used by the data basis.
    I can see this InfoObject on the Data Stream Fields tab; I check-marked it for use in the selection and regenerated the data basis.
    I was expecting this field to now be available for selection in the data collection method, but it is not.
    So if it is confirmed that there is no way to use 0INFOPROV as a selection, I will suggest to my client a redesign of the data basis itself.
    Thanks,
    Naveen Rao Kattela

  • TSV_TNEW_PAGE_ALLOC_FAILED - BCS load from data stream task

    Hi experts,
    We had a short dump when executing the BCS Load from Data Stream task. The message is TSV_TNEW_PAGE_ALLOC_FAILED:
    No storage space available for extending an internal table.
    What happened, and how can we solve this error?
    Thanks
    Marilia

    Hi,
    Most likely, the remedy for your problem is the same as in my answer to your other question:
    Raise Exception when execute UCMON

  • Issue in loading data from data mart

    The actual load from APO to BW is working, but the load fails when data is loaded from the base cube to the target cube (data mart); this flow is in 3.5.
    Attached document contains the system msg.

    Your load is still in process.
    Where is the load failure message?
    Try to activate the source 8DSO/cube and then activate the update rules through the programs.
    Can you check your job details in SM37?
    From the InfoPackage monitor, copy the request ID.
    Go to SM37, Job name - <enter copied request id>.
    Use job status - Active/Cancelled/Finished.
    Use proper date ranges.
    User - *

  • Automatic Loading from BW to APO

    Hello,
    Scenario
    1. Data is loaded from R/3 to BW using process chains.
    2. Data is loaded from BW to APO manually, using an InfoPackage.
    Solution required
    Once the data has been loaded successfully into BW, the InfoPackage in APO should be triggered automatically to load the data from BW.
    Can you share your thoughts?
    Thanks
    KT

    IGNORE THIS! misunderstood your question
    > You can include the InfoPackage that loads the data from BW -> APO as the last step of the process chain that loads data into BW from R/3.
    > You can schedule the InfoPackage that loads BW -> APO on an event, and trigger that event in the last step of the process chain that loads data from R/3 -> BW.
    Regards,
    Sree
    Message was edited by: Sree Damodararaj
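    Sree's second suggestion is the standard event pattern: one job subscribes to an event, and another raises it when it finishes. A minimal Python sketch of that pattern (illustrative only -- not SAP code; the event name and the lambda stand in for the real InfoPackage scheduling):

```python
# Minimal publish/subscribe sketch of event-triggered scheduling --
# illustrative only, not the SAP background-job API.

subscribers = {}

def schedule_on_event(event, job):
    """Register a job to run when the named event is raised."""
    subscribers.setdefault(event, []).append(job)

def raise_event(event):
    """Run every job waiting on the event; return their results."""
    return [job() for job in subscribers.get(event, [])]

# The APO-side InfoPackage is scheduled to start on the event...
schedule_on_event("BW_LOAD_DONE", lambda: "APO InfoPackage started")

# ...and the last step of the BW process chain raises it.
print(raise_event("BW_LOAD_DONE"))   # -> ['APO InfoPackage started']
```

    In SAP terms, the event would be a background-processing event raised by the final step of the BW chain, with the APO InfoPackage scheduled to start on that event.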

  • Error message while loading data from data mart

    Please can some one help on this.
    While loading data from one cube to another (I am actually creating a backup of the original cube), for 1 record I am getting the error message "internal error occurred with time split". The long text of the error message says Message no. RSAU101.

    Take a look at this thread:
    Internal error occured with time split
    Hope it helps,
    Gilad

  • Automatic loading from ODS to Cube in 3.5

    Hi All
    I was under the impression that in version 3.5, in order to load a delta from an ODS to a cube, you had to run the 8-series InfoPackage.
    However, I have recently noticed that this InfoPackage runs automatically after a delta load into the ODS, even when the load is not via a process chain.
    Can somebody tell me where and how this setting is maintained?
    Regards
    A

    Hi,
    Go to the ODS display mode and check whether "Update Data Automatically" is ticked under Settings.
    Regards,
    Kams

  • Missing records while fetching data from Data mart

    Hi,
    I have some missing records in the ODS. The data is fetched from another BW system.
    It is a delta load, and all the loads have been successful so far, but some records are still missing.
    If I look in the Reconstruction tab, some requests show the transfer structure status as a clock ("transfer structure has changed since the last request"). Could this change in the time stamp status be the cause?
    I have done a re-initialization and the missing data was fetched, but I would like to know the cause of the missing records.
    Regrads,
    ANita

    Hi Kedar,
    If there were a time stamp difference, the data load should have failed, yet all the delta loads were successful.
    But now they have realised that some records are missing. Just for analysis purposes, I was looking into the Reconstruction tab, and the transfer structure status was displayed as a clock,
    although there was actually no change to the transfer structure.
    Sometimes we used to get a timestamp error, and
    we used to replicate the DataSource and activate the transfer structure.
    What needs to be done to avoid this in the future? This load is triggered through a process chain, and each time the data load status was successful.
    We cannot replicate the DataSource every time we load through the process chain unless the transfer structure has changed.
    But my concern is whether this is the cause of the missing records, or something else.
    Regards,
    ANita

  • BAdI UC_DATATRANSFER for BCS Mapping in "Load from Data Stream" method

    Hello Everyone,
    I need some help on finishing up the code for the UC_DATATRANSFER BAdI.
    I have looked in SDN and other places, but could not get a comprehensive breakdown of the documentation beyond the F1 documentation available on the BAdI.
    So, any help would be appreciated.
    The Steps so far completed,
    1. Have activated the BAdI and have created the filter value for the BAdI.
    2. After the BAdI has been activated, I was able to go into the MAP method and have written the logic for profit center derivation from consolidation hierarchy.
    The issue is there are four components for the Map method,
    IT_DATA_SOURCE
    IS_DATA_TARGET
    ES_DATA_TARGET
    ET_DATA_TARGET
    The data is available from the source system in the table IT_DATA_SOURCE.
    But this is not changeable, as it is an "Importing" parameter, whereas the actual ET_DATA_TARGET, which is passed into the FINALIZE method of the BAdI, is not filled initially.
    When I try to do a MOVE-CORRESPONDING from IT_DATA_SOURCE into ET_DATA_TARGET, I keep getting short dumps because the lengths of the two tables are not the same.
    Did anyone else face the same issue as above when trying to do the BAdI implementation for Mapping.
    I will really appreciate if any one can provide me a sample code if possible.
    Let me know if you need additional information.
    Thanks
    Dharma.

    Hello,
    Thanks for looking into the question.
    I already tried that; I get a short dump stating that the object tables are not convertible.
    When I looked into the table structures, I found that "IS_DATA_TARGET", "ES_DATA_TARGET" and "ET_DATA_TARGET" belong to the same category, being flat structures or tables of length 484 as per the debugger,
    whereas the structure "IT_DATA_SOURCE" has length 404.
    For this reason, when I write
    ET_DATA_TARGET = IT_DATA_SOURCE, I keep getting short dumps.
    Also, is your consolidation process legal or managerial?
    Our consolidation process is legal, and we have the Company and Profit Center fields assigned to the Consolidation Unit role in the data basis definition.
    Can you please let me know what the structure lengths are in your system?
    Thanks
    Dharma.
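    The dumps come from assigning between structures of different lengths; a MOVE-CORRESPONDING-style copy instead transfers only the fields the two structures share and leaves target-only fields initial. Here is a small Python analogue of that idea (illustrative only -- not ABAP, and the field names are made up, not the real data basis fields):

```python
# Python analogue of a MOVE-CORRESPONDING-style copy: transfer only the
# fields both sides share; target-only fields stay initial (None here).
# Field names are illustrative, not the real BCS data basis fields.

def move_corresponding(source_row, target_fields):
    """Build a target row, copying matching fields from the source row."""
    return {field: source_row.get(field) for field in target_fields}

# Source structure (shorter, like the length-404 IT_DATA_SOURCE)...
source = {"company": "C100", "item": "300100", "value": 42.0}

# ...target structure with an extra field (like the length-484 targets).
target_fields = ["company", "item", "value", "profit_ctr"]

row = move_corresponding(source, target_fields)
print(row)   # profit_ctr stays None until the mapping logic fills it
```

    In the ABAP BAdI, the usual way around this kind of dump is to loop over IT_DATA_SOURCE, MOVE-CORRESPONDING each row into a work area typed like a line of ET_DATA_TARGET, fill the extra fields, and APPEND -- rather than assigning the whole table in one statement.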

  • Oracle BI Publisher no columns loaded from data model

    Hello Guys,
    I use Oracle BI Publisher 11g. I have created a data model using the
    JDBC thin connection to an 11g database. The creation of the data model
    worked fine using a SQL query. Now, in the report creation wizard, I selected
    the "Guide Me" option to see how nice the wizard is.
    The problem comes here: in the second window I should select the columns,
    but I only read "no columns available". Why is that?

    Hi Metalray,
    For a BIP report, sample XML data is important; without sample data you cannot generate BIP reports.
    Follow the steps below to get your expected BIP output:
    After creating the data set in the data model, save the data model, then click
    the "Get XML Output" option at the top-right corner.
    Then select a number from the "Number of rows to return" dropdown,
    and click Run. Save the sample data by clicking the "Save as Sample Data" option.
    This takes you back to the data model page; now save the data model again
    and use it for creating the reports.
    You should now be able to see the available columns in the second window of the "Guide Me" option.
    Please mark if it helps you.

  • Data Mart load does not support Delta load?

    For a data mart load, we create an InfoPackage for delta load. Under the Data Selection tab, for the InfoObject 0FISCPER, we pick the OLAP variable 0FPER [Current Fiscal Year/Period (SAP Exit)] and save the InfoPackage. But when we get in again, we find that under the Data Selection tab, on the 0FISCPER row, the From Value column has become 001/2005, the To Value column has become 011/2005, and the whole Data Selection tab screen has become non-editable!
    The correct behavior should be that after saving the InfoPackage, the From Value/To Value columns on the 0FISCPER row are blank and only the Type column value is 7, and that after saving the InfoPackage and getting back in, the Data Selection tab screen is editable.
    But if we select Full update mode and then switch back to the Data Selection tab, the behavior is correct.
    We wonder: if SAP doesn't support delta load for data mart loads, why does the Delta radio button show up under the Update tab after we have done an initial load? It doesn't make any sense!

    Dear,
    as already suggested, you cannot use different selection criteria for your delta, since the delta automatically has to load on the basis of the initial selection. And if you think about it, this is logical: you cannot initialize a flow for certain conditions and then decide to upload other (different) records in delta mode...
    Hope it is clearer now...
    Bye,
    Roberto
    Clearly, if you perform a full load (like a 'repair' full, from a functional point of view), this problem does not exist (data is taken from the active table and not from the change log!)
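    In other words, the init load fixes the selection, and every delta is defined relative to that baseline: it replays only change-log records inside the initialized selection. A small Python sketch of that constraint (conceptual only, not SAP code; the field names are illustrative):

```python
# Conceptual sketch: a delta inherits the selection of its init --
# it replays only change-log records inside the initialized selection.
# Not SAP code; field names are illustrative.

def init_load(active_table, selection):
    """Initial full extraction: the baseline for all later deltas."""
    return [rec for rec in active_table if selection(rec)]

def delta_load(change_log, init_selection):
    """A delta reuses the init selection -- it is not a free parameter."""
    return [rec for rec in change_log if init_selection(rec)]

in_2005 = lambda rec: rec["fiscper"].endswith("/2005")

active = [{"fiscper": "001/2005", "amount": 10}]
changes = [{"fiscper": "002/2005", "amount": 5},    # inside the init scope
           {"fiscper": "001/2006", "amount": 7}]    # outside -> not replayed

baseline = init_load(active, in_2005)
delta = delta_load(changes, in_2005)
print(delta)   # only the 2005 change is delivered
```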

  • How to load the exist Data Mart Request in to cube.

    Hi Friends,
    I have a scenario: I have a DSO and one cube, and I am loading the data from the DSO to the cube.
    The number of records in the DSO is large, and due to some calculation in the routine I need to load the data year by year (I have data for the years 2007 and 2008). I have loaded the 2007 data into the InfoCube, and in the DSO I can see the data mart symbol against the request ID once the load has finished successfully.
    When I try to load the 2008 data from the DSO to the cube, I get "0" records. I realised that the data mart symbol already exists in the DSO.
    How can I load the 2008 data in this scenario? If I delete the data mart symbol, I am deleting the cube request.
    Does anyone have an idea on this?
    Thanks in advance.

    Hi,
    Things are not clear.
    Is the load happening as a delta or a full load, through a DTP, or are you using the 3.5 flow?
    In any case, if you do a full load or a full repair based on the year selection, it should pick the records from the source; the data mart status has nothing to do with full loads.
    The data mart status comes into the picture only when you schedule delta loads.
    Do a full load from the DSO to the cube based on selections on year; there is no need to delete the data mart status, or the delta will bring that request again when it is scheduled. Does that request in the DSO contain data for 2008 only? If yes, then you can just delete the data mart status for it and do a delta; if not, do full loads as described.
    Thanks
    Ajeet

  • Reporting services button not auto loading the data

    Hi,
    I am having issues loading data from reporting services (using the Reporting Services button). When I check the "Refresh on Load" box, the data is not pulled automatically; it is pulled only if I press the button. The trigger cell feature is not working either. This all started happening when I installed SP1; in the first version of Xcelsius 2008 Engage I had no problem. Also, I am pulling data based on a date range: the data is only pulled automatically if I type the numbers into the cells. If the date range is calculated, it no longer works, and I have to press the Reporting Services button for it to pull the data.
    Is there any setting, hotfix, or known issue regarding this?
    I also opened a model that I had created with the first version of Xcelsius 2008 using SP1. The model did not automatically load the data. The Flash file of that module works fine (since it was created with the earlier version).
    I am sure there is a bug in the Reporting Services button.
    Edited by: Ricardo Rivera on Oct 23, 2008 12:06 AM

    Hi,
    I just upgraded to the new SP fix. I tested it against an old file that automatically pulls the data from reporting services, but it didn't work. I still have to test building from scratch with this new version. I don't think they have corrected the Reporting Services button issue.
    Thanks
    Ricardo
