Datamart within the same data target

In which scenario would we load data back into the same data target (e.g., ODS1 already holds the data, and its data mart export DataSource 8ODS1 loads back into the same ODS1)?

Hi,
This approach is used in some unique scenarios; it is called a repump/loopback process in some landscapes. Consider a case where the requirement is that all customers must appear in the target for every request, even if a customer had no activity that day (for a daily load), week (for a weekly load), or month (for a monthly load). In that case we create an update rule that looks up the existing data in the ODS, reads the customers available there, and checks whether all of those customers are coming in as part of the delta load. If a customer that already exists in the ODS is missing from the delta, a new record is created for that customer with blank key figures and loaded into the ODS along with the new request.
One consequence you can notice in this scenario: the number of records in each delta will be greater than or equal to that of the previous one.
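The lookup-and-fill logic of such a loopback load can be sketched as follows. This is a minimal illustration in Python rather than an actual BW update rule, and the record layout and field names are hypothetical:

```python
def loopback_merge(existing_customers, delta_records):
    """Ensure every customer already present in the ODS appears in this load:
    customers missing from the delta get a blank (zero key figure) record."""
    delta_customers = {rec["customer"] for rec in delta_records}
    fillers = [
        {"customer": cust, "amount": 0}  # blank record for an inactive customer
        for cust in sorted(existing_customers - delta_customers)
    ]
    return delta_records + fillers

existing = {"C1", "C2", "C3"}                 # customers already in the ODS
delta = [{"customer": "C1", "amount": 100}]   # only C1 was active this period
merged = loopback_merge(existing, delta)
print(len(merged))  # 3: the delta record plus blank records for C2 and C3
```

Because records are only ever added, each request is at least as large as the previous one, which matches the observation above.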
Refer this blog
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/f421c86c-0801-0010-e88c-a07ccdc69601
Thanks,
JituK

Similar Messages

  • Request Datamarted with QM status Red

    Hi Friends,
    Could anyone explain whether it is possible for a request to be datamarted when its QM status is red?
    I am facing a scenario where my DSO has a red request that has been datamarted (loaded to further targets).
    Now I want to delete this request. If anyone has encountered such a scenario, please explain how to proceed.
    Thanks in advance,
    Vishnu

    Hi Vishnu,
    If a request is red in the DSO, it would not be loaded to the InfoObject.
    In any case, if you want to delete a request from the DSO that has already been loaded to a further data target, you can use selective deletion, where you specify the key-field values according to which the data will be deleted: right-click the DSO -> Manage -> Contents -> Selective Deletion.
    You can also write an ABAP program to delete data from the DSO table selectively. To get the request ID, you can use the RSICCONT table.
    Hope this answers your questions.
    You can also refer the following threads.
    Last request Details of DSO
    Deleting WDSO Data
    Request Deletion from Write-optimized DSO
    Thanks,
    Chinmaya Bhgata

  • FULL Upload from DataMart with aggregates

    HI,
    Is it normal that in BW 7.0, when I load data from one cube to another (datamart), the system reads only the requests that have been rolled up into the aggregate?
    Thank You.
    AV

    Aggregates are materialized subsets of InfoCube data, where the data is pre-aggregated and stored in an InfoCube structure.
    In BI 7, the data mart interface is not used for loads within a single instance (the same BI system).
    For more details you may wish to check:
    What is GENERATE EXPORT DATASOURCE
    So from BI 7.0 you load from one data target to another using a DTP, and you just need the request to be available for reporting (green); it does not have to be rolled up into aggregates. Fact tables and aggregate tables are different.
    Hope it Helps
    Chetan
    @CP..

  • Cube to cube datamart in 3.5

    I have built a cube-to-cube datamart and cannot determine how to initiate the load. When building a datamart with an ODS as the source, it is quite obvious (right-click the ODS and select Update ODS Data in Data Target). However, this option does not exist in the case of a cube.
    It's a one-time load of some historical data. I don't require scheduling of deltas, just a single full load. I do, however, need selectability on a date field to exclude recent data.
    Can someone please advise how to initiate this data load?
    Many thanks,
    Brett

    Hi,
    Go to the InfoSource tab in RSA1 and search for 8<your InfoCube technical name>.
    E.g., if your source InfoCube is 0TCT_C01, then search for 80TCT_C01.
    Create an InfoPackage on the 8* technical name and schedule the load. This will load data from your source InfoCube to the target InfoCube.
    If it is not clear please let me know.
    Thanks
    Santosh R C
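The prefix rule described above (the export DataSource name is simply "8" plus the provider's technical name) can be illustrated with a tiny sketch; the helper function itself is hypothetical, only the naming convention comes from the thread:

```python
def export_datasource_name(provider_tech_name: str) -> str:
    """Derive the export DataSource name for a data mart load:
    prefix '8' to the source provider's technical name."""
    return "8" + provider_tech_name

print(export_datasource_name("0TCT_C01"))  # 80TCT_C01
```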

  • Problem with ADF tree table when disclosed and re-opened

    I have a problem using the ADF tree table. Normally the tree table does not expand all nodes, even though there is an attribute for that (strangely, it expands only the first node). So we have to use this tip (http://thepeninsulasedge.com/frank_nimphius/2007/12/19/adf-faces-rc-initially-expanding-all-nodes-in-a-tree-or-tree-table/) to expand the nodes when the page first runs. It works, but when I click (-) to close a node and then (+) to open it again, the animation seems to loop forever and the clicked node doesn't open. Does anyone know this problem? Please help me.
    Thanks.

    As you are loading in development,
    try to create a new InfoPackage and try loading only the data target you want to load.
    Also try to activate all the related objects.
    Regards
    Edited by: 200816 on Oct 7, 2009 11:58 AM

  • Datamart using stock cube

    Hi experts,
    I have a stock cube which is compressed and has had its marker updated up to today. Now I am trying to set up a datamart load using this cube.
    Is it possible to do a datamart load with a cube which is compressed and has a marker update?
    Please suggest the procedure if it is possible.
    regards,
    rajesh.

    Hi Rajesh,
    Check SAP Note 375098 (Datamart extraction from non-cumulative InfoCubes).

  • Need help in MDX query

    Hi All 
    I am new to MDX and need an MDX query on the cube to get the required output. Given below is the scenario with the data.
    I maintain the data in a data mart table with the structure given below. We have daily and weekly data in a single table, with a granularity indicator; Count is the measure column. While loading the data into the mart table we populate the week key from the week table and the month key from the month table, and join them in the cube.
    We need to calculate the inventory for a particular month. If a user selects a particular month, the output should be Count = 30 as a measure called Closed and Count = 16 as a measure called Open.
    We need an MDX query to get this output.
    Granularity  Count  WeekKey  MonthKey
    Weekly       16     W1       M1
    Weekly       17     W1       M1
    Weekly       18     W1       M1
    Weekly       19     W1       M1
    Weekly       20     W1       M1
    Weekly       21     W1       M1
    Weekly       22     W1       M1
    Weekly       23     W2       M1
    Weekly       24     W2       M1
    Weekly       25     W2       M1
    Weekly       26     W2       M1
    Weekly       27     W2       M1
    Weekly       28     W2       M1
    Weekly       29     W2       M1
    Weekly       30     W2       M1
    Weekly       16     W3       M1
    Weekly       17     W3       M1
    Weekly       18     W3       M1
    Weekly       19     W3       M1
    Weekly       20     W3       M1
    Weekly       21     W3       M1
    Weekly       22     W3       M1
    Weekly       23     W4       M1
    Weekly       24     W4       M1
    Weekly       25     W4       M1
    Weekly       26     W4       M1
    Weekly       27     W4       M1
    Weekly       28     W4       M1
    Weekly       29     W4       M1
    Weekly       30     W4       M1
    Thanks in advance
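One plausible reading of the requirement above (my interpretation; the poster does not define Open/Closed precisely): for a selected month, Open is the lowest weekly count and Closed the highest, which yields 16 and 30 for month M1. That logic can be sketched with a few of the sample rows (field layout hypothetical):

```python
# Sample rows from the mart table: (granularity, count, week_key, month_key)
rows = [
    ("Weekly", 16, "W1", "M1"),
    ("Weekly", 22, "W1", "M1"),
    ("Weekly", 23, "W2", "M1"),
    ("Weekly", 30, "W2", "M1"),
]

def open_closed(rows, month):
    """Open = minimum weekly count in the month, Closed = maximum."""
    counts = [c for _, c, _, m in rows if m == month]
    return min(counts), max(counts)

opened, closed = open_closed(rows, "M1")
print(opened, closed)  # 16 30
```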

    Hi Venkatesh,
    According to your description, you need to count the members that meet a condition within a particular month, right?
    In MDX, we can achieve this by using the Count and Filter functions. I have tested it on the AdventureWorks cube; the sample query below is for your reference.
    with member [Measures].[ConditionalCount]
    as
    count(filter([Date].[Calendar].[Month].&[2008]&[2].children, [Measures].[Internet Order Count] > 50))
    select {[Measures].[Internet Order Count], [Measures].[ConditionalCount]} on 0,
    [Date].[Calendar].[Date].members on 1
    from
    (select [Date].[Calendar].[Month].&[2008]&[2] on 0 from [Adventure Works])
    Reference
    http://msdn.microsoft.com/en-us/library/ms144823.aspx
    http://msdn.microsoft.com/en-us/library/ms146037.aspx
    If this is not what you want, please elaborate on your requirement, such as the detailed structure of your cube, so that we can analyze it further.
    Regards,
    Charlie Liao
    TechNet Community Support

  • How to use "All Values" in the pages of pivot tables

    Hi everyone,
    I’m using Oracle BI Answers and I have the following problem:
    I’m building reports based on a data mart with 2 fact tables, each with about 28 million records per month, and around 14 dimensions on average.
    The users want the choice of 5 parameters, for example (region, district, type of product, month of analysis, subtype of product), where they can select an individual option as well as an “all values” option. They want this for all the parameters.
    We are having problems doing this without the report timing out.
    Approaches used:
    We are trying to do this by using pivot tables and combo boxes in the reports (one for each parameter). The combos are in the pages area.
    We tried a calculated item with Sum(*) to get an “All Values” entry in the combo box, but this makes the report extremely slow.
    We also tried Sum in the pages section of the pivot table, but it only displays the “All pages” value of the first combo box.
    So at this point I don’t know what else to try. Can you help me with the best way to do this?
    Thanks
    Edited by: user8727081 on Sep 28, 2009 2:45 AM

    Hi,
    I had the same problem, and I solved it using dashboard prompts.
    But sometimes users are not happy. I'll explain why.
    Suppose a dimension Nation:
    Italy, Germany, UK, France, USA
    1) I create a report with sales quantity.
    2) Report has a filter --> "nation" = is prompted
    3) I create a dashboard prompt "Nations" based on dimension Nation with All Values
    4) I publish the report in a dashboard with the dashboard prompt "Nations"
    When the user chooses a nation from the prompt, all nations are displayed, even those for which I have NO sales.
    My goal: only nations with sales must be shown
    Thanks
    Enrico

  • Delta datasource management in process chain

    Hi experts,
    I have the following problem with process chains.
    I have one <b>Delta</b> InfoSource used to load two different InfoProviders (ODS 1 and ODS 2). ODS 2 must be loaded only once ODS 1 is loaded and activated. The delta process implies that the <b>2</b> ODSs are data targets for the delta InfoPackage.
    In my process chain, I have :
    'execute infopackage IFPCK1'
    <i>(with 'only PSA' option and of course the two datatargets ODS1 and ODS 2 selected)</i>
    then a first
    'Read PSA and Update Data Target' process
    <i>(with options : datatarget = ODS 1, execute infopackage IFPCK1)</i>
    followed by
    'Activate ODS' ODS 1
    then a second
    'Read PSA and Update Data Target' process
    <i>(with options : datatarget = ODS 2, execute infopackage IFPCK1)</i>
    Each time, I run the Process chain, I have the following message for <b>ODS 2</b>
    "the request <b>(ODS1)</b> already exists in infocube/ODS; Not loaded."
    It is as if the first Read PSA process tries to load both ODSs instead of only one.
    Any suggestions to help me solve this problem?
    Thanks in advance

    Hi Daniel,
    unfortunately there is no easy way around this. I would suggest creating an ODS 3 as an exact copy of the InfoSource, loading ODS 1 and ODS 3 in parallel, and then loading ODS 2 from ODS 3.
    Best regards
       Dirk

  • Normalized (3NF) vs. Denormalized (Star Schema) Data Warehouse

    What are the benefits of a normalized data warehouse (3NF) over a denormalized one (star schema)?
    If the DW is in 3NF, do we need to create a separate physical database containing several data marts (star schemas) with physical tables, which feed the cube, or should we create views (SSAS data source views) in star-schema form on top of the 3NF warehouse, which feed the cube?
    Please explain the pros and cons of 3NF and denormalized DW.
    Thanks in advance.
    Zaim Raza.

    Hi Zaim,
    1) Normally, a 3NF schema is typical for the ODS layer, which is simply used to fetch data from sources and to generalize, prepare, and cleanse the data for the upcoming load to the data warehouse.
    2) When it comes to the DW layer (Data Warehouse), the data modeler's general challenge is to build a historical data silo. A star schema with slowly changing facts and slowly changing dimensions is only partially suitable; Data Vault and other similar specialized methods provide, in my opinion, wider possibilities and more flexibility.
    3) A star schema is perfectly suitable for data marts. SQL Server 2008 and higher contain numerous query-optimizer improvements to handle such workloads efficiently, and SQL Server 2012 introduced columnstore indexes, which make it possible to create robust star-model data marts with SQL query performance comparable to MS OLAP.
    So, your choice is:
    1) Create a solid, consistent DW solution.
    2) Create separate data marts on top of the DW for specific business needs.
    3) Create the necessary indexes, PK/FK keys, and statistics (on the FKs in fact tables) to help the SQL optimizer as much as possible.
    4) Forget the approach of defining an SSAS data source view on top of 3NF (or any other DWH modeling method), since that is the road to performance and maintenance issues in the future.

  • BI Repository Refresh

    Hi every body;
    I have created a datamart with OWB.
    Then I created the BI repository.
    My physical layer is populated by importing data from my datamart.
    After that, I inserted new records into my datamart.
    When I checked the data in the physical layer of my BI repository, I didn't find the new records that I had inserted into my datamart.
    My question is:
    How do I refresh the data in the BI repository?
    Is there any way to automate the synchronisation between the BI repository and my database (datamart)?
    Thank you in advance.

    "My physical layer is populated by importing data from my datamart.
    After that i have inserted new records in my datamart."
    Are you really talking about records? Or did you just change the structure? Did you actually load data?
    Are you pointing to the correct data source?

  • Datawarehouse with dependent datamarts

    Hello Everyone,
    Could you please guide me on how to
    1) Create a data warehouse with a focus on building dependent data marts on top of it
    2) What is the simplest procedure to load data from the data warehouse into these data marts?
    3) Can I have multiple fact tables in the data warehouse, with each fact table and its set of dimensions catering to a data mart?
    4) Additionally, can we load data directly into these data marts without loading it into the data warehouse first?
    Any reference material to deal with this scenario would be highly appreciated
    Thanks Much,
    Sri

    Hi,
    not using a central data warehouse causes several BIG problems. You will build cubes for a specific department and will not be able to integrate them, including their ETL processes, for other departments. Larissa Moss calls these 'swim lanes'! I would never do this; it's a low level of maturity for a DWH, the way things were done some years ago. Golden rule: think big, start small. You can build cubes without building the complete enterprise model for your whole company. Just model the parts you need for your project, but keep in mind that it must fit other departments too.
    And try to put all department-specific logic into the load processes from the DWH to the data marts. Then you can reuse all the data and load processes of your DWH to build data marts for other departments. If you load data marts straight out of the source, the data marts for other departments (which may differ only slightly from existing ones) must be built from scratch. You will have much redundant code, a really bad architecture, too many process flows, ...
    Regards,
    Detlef

  • Can anyone clarify data marts for me?

    Hi All,
    I am trying to load data from an ODS to a master data object.
    The master data object has already been assigned to the PCFILES source system, so I am trying to assign another source system, which will be the MYSELF system. When I enter the name of the source system (MYSELF), a popup comes up asking me to assign the DataSource. Here is my problem: I need to update the master data with the contents of the ODS, so my DataSource should be the 8(ODS) one, right? But I am not finding that DataSource name in the list, although I am able to see the DataSource under the MYSELF source system in the data marts section.
    And one more question: if you go to the InfoSource screen of any ODS (I mean the 8ODS InfoSource), the menu bar has drop-down menus like 1. InfoSource, 2. Edit, 3. Goto, 4. ???, 5. Environment.
    What is menu 4, and why is its name not displayed? If we click on it we get options like 1. InfoSource status, 2. ??
    What is option 2 here?
    If we click on the question mark we get two radio buttons:
    1. Transfer program 8ztods_vc
    2. PSA access program
    Can anyone tell me what will happen if we choose one of them, and where we can see the results if we choose one and execute it?
    Thanks in Advance ,

    Hi,
    Here you tried to connect an InfoSource with direct update to a transactional DataSource (8...). That is why you are not able to see the 8... DataSource.
    So create an InfoSource for flexible update for your master data characteristic. After activating the communication structure, when you try to assign the DataSource you will see the 8... DS.
    With rgds,
    Anil Kumar Sharma .P
    Please do not forget to assign points if it helps you.

  • Request not yet updated with datamart

    Hey guys,
    I am having trouble with an Upload.
    Some days ago I uploaded an Excel file into an ODS without any problems. Everything turned out green:
    Load: green
    Activation: green
    Yet the data is not in my InfoCube, because when I run a query the data is simply not there.
    Now when I click 'Manage' on the ODS, I see a message that says 'Request not yet updated with data mart'.
    What can I do about this or what might be the problem?
    Thank you very much,
    Filip

    Hey Jürgen,
    I tried the "update 3.x" but it wasn't working.
    Next I tried to generate an export DataSource and "update 3.x data"; this 'worked' in that it didn't give me an error, but my request is still "not yet updated with the datamart".
    Any ideas?
    Filip

  • Request failed when loading multiple data targets with a single DataSource

    When I am loading multiple data targets with a single DataSource and the request fails,
    I want to delete the bad request from all the data targets at once.

    Hi Neeraj,
    The only thing you can do is go into the Monitor screen of that InfoPackage and select the data targets from there. In the next screen you can see all the targets included in the InfoPackage at the top. Unfortunately, you have to delete the bad request manually from each target.
    Regards
    Sandeep
