EDW layer

Hi all,
What is the best way to implement an EDW layer for the Sales Order, Billing, and Delivery areas (2LIS*) if all the users are looking for, to start with, is standard business content reports? Apart from using the standard business content data flow, we want the EDW layer to be solid, built on proper assumptions, and able to handle future requirements in a short period of time.
In short, I am looking for a good data model for the EDW layer in SD.
Thanks.

The key of the write-optimised DSO will just be the package ID and the line within the package (i.e. a duplication of the PSA/acquisition layer).
We will use this as short-term storage (for support problems, i.e. reloads of 5-7 days).
The corporate memory is a deep-storage version of this - and in a big installation it will be archived off to NLS until needed.
With regard to why corporate memory is different... you don't need those huge DSOs of > 100,000,000 records lying around doing nothing but increasing CPU load for indexes and structures just for short-term reloads.
I would normally put an InfoSource between the PSA and the corporate memory and propagation layers which adds some value to the data, but not business-specific value - e.g. adding the year, the load date, etc. (a sketch follows below).
I would then feed the corporate memory layer back to the propagation layer for long-term reloads and for building future developments.
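
As an illustration of that technical (non-business) enrichment, here is a minimal end-routine sketch. LOADDATE and LOADYEAR are hypothetical InfoObjects you would add to the corporate memory / propagation structures; result_package and tys_tg_1 are the standard names generated for a BW transformation end routine:

    " End routine sketch: stamp each record with technical load info.
    " LOADDATE / LOADYEAR are illustrative field names - replace them
    " with the InfoObjects you actually model in these layers.
    FIELD-SYMBOLS: <result_fields> TYPE tys_tg_1.

    LOOP AT result_package ASSIGNING <result_fields>.
      <result_fields>-loaddate = sy-datum.     " date of the load
      <result_fields>-loadyear = sy-datum(4).  " calendar year
    ENDLOOP.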

Similar Messages

  • 4 LSA architecture - EDW layer - write optimized DSO settings

    Dear Colleagues,
    I have a question regarding the 4 LSA architecture described in the following article:
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/306f254c-1e3f-2d10-9da0-bcff4e35e0ef
    When we activate an SAP business content data flow and want to add a write-optimized DSO in the EDW layer to get a faster upload from the source system into SAP BI, should we always check the "Do not check uniqueness of data" setting in the write-optimized DSO to avoid data upload errors?
    Based on your experience, what would be your recommendation?
    Cheers,

    I would suggest setting it to "ON".
    - If the check is ON, it will allow loading several records with the same semantic key; duplicates are then handled in the next layer.
    - If the check is OFF, it will not allow loading the same record twice and throws an error.
    Have you seen this?
    /people/martin.mouilpadeti/blog/2007/08/24/sap-netweaver-70-bi-new-datastore-write-optimized-dso

  • EDW layer as HANA for Non SAP environment

    Dear All,
    We would like to build HANA as an EDW layer. Could you please suggest best practices, node configuration, etc.?
    Regards,
    manoj.

    Hi Manoj,
    I suggest you check out the material about BW on HANA on http://experiencesaphana.com.
    There is no general answer, as the configuration highly depends on your data volume, update frequency, etc.
    The largest single HANA server available comes with 2 TB of memory - that gives you about 1-1.2 TB of data storage (the rest will be used for processing the data). That doesn't sound like much, but early BW on HANA projects have seen a data volume reduction of about 1:5, so if your existing data warehouse is less than 5 TB, a single node might be an option...
    If you go for a multi-node system, there's not a lot of options for landscape configuration. Just follow the [scaleout guide|http://help.sap.com/hana/hana1_imdb_scale_en.pdf]...
    Not an official best practice, but personally I would say that bigger is better - so if you decide to build a multi-node landscape, I would go with XL servers...
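    To make the sizing arithmetic explicit (treating the ~1:5 reduction above as a working assumption, not a guarantee):

    \[ 2\,\text{TB RAM} \times (0.5\text{-}0.6) \approx 1\text{-}1.2\,\text{TB usable}, \qquad \tfrac{5\,\text{TB source DWH}}{5} = 1\,\text{TB} \le 1.2\,\text{TB} \]

    So a source warehouse much beyond 5 TB points to a multi-node landscape.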
    cheers
    --Juergen

  • EDW Architecture question

    Hi experts.
    I was searching SAP help for orientation about EDW architecture details, but I didn’t find anything.
    We’re developing a new BW (2004s version) project, and the challenge is to have one physical layer with the EDW and another layer just for reporting.
    My question is about the communication between these two layers, given that they are on two different machines. If my reporting layer InfoCubes are loaded from EDW layer DSOs, how can I create a transformation linking both objects? Is it possible? Or is our concept not right?
    Thanks in advance.
    TP

    Physical layer with EDW: this will be the modeling area, and the data will reside in your InfoProviders (DSO/cube etc.).
    Reporting: you will use the front end, i.e. the BI reporting tools, to create, edit and use reports in BEx tools like Analyzer, WAD and Report Designer. The data will come from your InfoProviders.
    My question is about the communication between these two layers, given that they are on two different machines.
    Do you mean keeping two production servers, where one will be used just for backup and another one for all users? By the way, users will be using reports only through the BW reporting tools and will be able to access them through the network.
    If my reporting layer InfoCubes are loaded from EDW layer DSOs, how can I create a transformation linking both objects? Is it possible? Or is our concept not right?
    Check these:
    Modeling:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/6ce7b0a4-0b01-0010-52ac-a6e813c35a84
    https://www.sdn.sap.com/irj/sdn/docs?rid=/webcontent/uuid/93eaff2d-0a01-0010-d7b4-84ac0438dacc
    Hope it helps..

  • ADM and EDW

    hi
    Can any of you please explain what the ADM layer and the EDW layer are in BI 2007? What is the purpose and what are the advantages of each?
    What can we include in the ADM and EDW layers?
    regards
    balaji.p

    Hi Balaji,
                EDW layer: Enterprise Data Warehouse layer. This layer replicates the same number of records as in R/3 into your data targets; the main purpose is to store the records in the data targets, although in 7.0 the PSA already offers this facility. Remove all the keys in your ODS for this layer.
    ADM layer: this layer does not replicate the same number of records as in R/3 in your data targets. Here you do not remove the keys in your data targets, and aggregation can take place in this layer.
    please let us know.

  • Difference between Teradata TVS technology and new Teradata Foundation Layer

    Experts ,
    What is the difference between Teradata TVS technology and the new Teradata Foundation Layer technology?
    What are the advantages and disadvantages of each?
    Thanks

    Imagine I had a MultiProvider that needed to look at process data from R/3 and operational data from, say, a post office tracking system.
    The post office tracking system was on a set of non-R/3 boxes, and historically an EDW and reporting marts were built using non-SAP products on a Teradata platform.
    Within BW you can create a MultiCube over a BW cube that has the ECC data in it, plus a virtual remote cube which reads Teradata at run time to get the tracking data for that country etc.
    BW does not store the data (why would it need to - it is already stored in the Teradata EDW system) but consumes it at run time in a query.
    To read the data at runtime from Teradata you could either write your own function module (a sketch of that option is at the end of this post) or you could buy the TVAS module from Teradata, which does it for you (at a cost) and, in Teradata's words, is optimised for Teradata SQL and handles hierarchies much better etc.
    Teradata Foundation is a whole different matter - in the case above it would involve you moving your BW box and putting it on Teradata. You could benefit from the massively parallel processing within Teradata that your InfoPackages and DSO activations within your BW LSA layers need.
    Teradata will easily handle, in its normal non-BW applications, tens of terabytes in very little time, and it is normal to rebuild reporting marts from scratch each night rather than run deltas, given the sheer power of the product.
    How far this integration within the BW application has come at this moment I don't know, as I haven't been updated since my last conference call with them.
    So it's horses for courses - if a client already has a huge Teradata platform and an Oracle-based BW, it is a door opener to move BW to that.
    Or if it's a greenfield site with huge - and I mean huge - data volumes that you need to model within an EDW layer, then it might make sense to look at Teradata or one of the other MPP database vendors.
    I notice you work in consultancy in presales - so I had best leave the rest of the free advice there... but it gives you a flavour.
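    For the "write your own function module" route, here is a minimal, hypothetical ADBC (ABAP Database Connectivity) sketch for reading an external Teradata table at runtime over a secondary database connection. The connection name TD_EDW and the table/column names are invented for the example; the ADBC classes themselves are standard:

    " ADBC sketch: read tracking data from a non-SAP EDW at runtime.
    " 'TD_EDW' must be maintained as a secondary connection (DBCON);
    " the table EDW.TRACKING and its columns are hypothetical.
      DATA: lo_con    TYPE REF TO cl_sql_connection,
            lo_stmt   TYPE REF TO cl_sql_statement,
            lo_result TYPE REF TO cl_sql_result_set,
            lr_data   TYPE REF TO data.
      DATA: BEGIN OF ls_track,
              parcel_id TYPE c LENGTH 20,
              status    TYPE c LENGTH 2,
            END OF ls_track,
            lt_track LIKE STANDARD TABLE OF ls_track.

      TRY.
          lo_con    = cl_sql_connection=>get_connection( 'TD_EDW' ).
          lo_stmt   = lo_con->create_statement( ).
          lo_result = lo_stmt->execute_query(
            `SELECT parcel_id, status FROM edw.tracking WHERE country = 'GB'` ).
          GET REFERENCE OF lt_track INTO lr_data.
          lo_result->set_param_table( lr_data ).
          lo_result->next_package( ).   " fetch the result rows
          lo_result->close( ).
          lo_con->close( ).
        CATCH cx_sql_exception cx_parameter_invalid.
          " handle connection/SQL errors appropriately
      ENDTRY.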

  • How to update data from PSA to target in BI 7.0

    Hi all,
    Can you please tell me the procedure/process to load data from the PSA to a data target? We got a load failure while activating the data in the DSO. We made the necessary changes to the incorrect data in the PSA and saved. When I run the DTP again it gives me 0 records. Is there any option like in 3.5, where we could right-click the PSA and schedule the update/start the update immediately - or any other process which solves my problem?
    Thank you all,
    Regards,
    Praveen

    A new object concept has been available for the ETL process as of SAP NetWeaver 2004s. To implement the EDW layer paradigm, SAP has changed the concept of the ETL data flow and process design.
    The most important innovations in the modified object concept are as follows:
    1. When a DataSource is activated, a PSA table is generated in the inbound layer of BI so that data can already be loaded.
    2. The InfoPackage is only used to load the data from the source system into the PSA.
    3. This is followed by the data transfer process (DTP) step, which transfers data within BI from one persistent object to another using transformations and filters.
    4. The definition of which data targets the data from the DataSource is updated into is made in transformations. This is also where fields of the DataSource are assigned to InfoObjects of the target objects in BI.

  • Three types of DSO in BI 7.0

    Hi,
    I went through the sap help documents related to the three types of DSO.
    1. Standard
    2. Direct Update
    3. Write Optimized
    I still have not clearly understood in which case we use which kind of DSO. Could anyone explain that in your own words?
    I am new to BI; your explanation is appreciated.
    Also, what is this APD process that is used with the direct update DSO? Any documents, links or info on it?
    Thanks,

    DataStore Objects for Direct Update
    DataStore objects for direct update ensure that the data is available quickly. The data from this kind of DataStore object is accessed transactionally. The data is written to the DataStore object (possibly by several users at the same time) and reread as soon as possible.
    It is not a replacement for the standard DataStore object. It is an additional function that can be used in special application contexts.
    APD -
    In the BI system, you can use a DataStore object for direct update as a data target for an analysis process
    In the BI system data is collected from the heterogeneous databases in the various systems that the enterprise is using and is consolidated, managed, and staged for evaluation purposes. There is often further, valuable potential in this data.
    This is completely new information that is displayed in the form of meaningful correlations between data but that is too well hidden or complex to be discovered by simple observation or intuition.
    The Analysis Process Designer (APD) allows you to find and identify these hidden or complex relationships between data in a simple way. Various data transformations are provided for this purpose, such as statistical and mathematical calculations, data cleansing and structuring processes.
    The results of the analysis are saved in BI InfoProviders or in CRM systems. They are available for all decision and application processes and thus can be decisive (strategically, tactically, and operatively).
    Examples of analysis processes include the calculation of ABC classes, determination of frequency distribution or of scoring information.
    http://help.sap.com/saphelp_nw04s/helpdata/en/49/7e960481916448b20134d471d36a6b/content.htm
    Standard DataStore Object
    The standard DataStore object is filled with data during the extraction and load process in the BI system.
    It has three tables: new data, active data, and change log.
    Write-Optimized DataStore Objects
    Data that is loaded into write-optimized DataStore objects is available immediately for further processing.
    You use write-optimized DataStore objects in the following scenarios:
    ●     You use a write-optimized DataStore object as a temporary storage area for large sets of data if you are executing complex transformations for this data before it is written to the DataStore object. Subsequently, the data can be updated to further (smaller) InfoProviders. You only have to create the complex transformations once for all data.
    ●     You use write-optimized DataStore objects as the EDW layer for saving data. Business rules are only applied when the data is updated to additional InfoProviders.
    The system does not generate SIDs for write-optimized DataStore objects and you do not need to activate them. This means that you can save and further process data quickly. Reporting is possible on the basis of these DataStore objects. However, we recommend that you use them as a consolidation layer, and update the data to additional InfoProviders, standard DataStore objects, or InfoCubes.
    http://help.sap.com/saphelp_nw04s/helpdata/en/f9/45503c242b4a67e10000000a114084/content.htm
    Hope it Helps
    Chetan
    @CP..

  • Types of Infocube & DSO in BI 7.0 version

    Hi,
    What are the types of InfoCube in BI 7.0? Please provide details and documentation, and explain the usage of each type.
    What are the types of DSO in BI 7.0? Please provide details and documentation, and explain the usage of each type.
    Thanks,
    Vijay.

    Hi Gopi,
    Use a standard DataStore object and set the Unique Data Records flag if you want to use the following functions:
    ○       Delta calculation
    ○       Single record reporting
    ○       Unique data
    Write optimized DSO
    ●      You use a write-optimized DataStore object as a temporary storage area for large sets of data if you are executing complex transformations for this data before it is written to the DataStore object. The data can then be updated to further (smaller) InfoProviders. You only have to create the complex transformations once for all data.
    ●      You use write-optimized DataStore objects as the EDW layer for saving data. Business rules are only applied when the data is updated to additional InfoProviders.
    DataStore object for direct update
    you can use a DataStore object for direct update as a data target for an analysis process. The DataStore object for direct update is also required by diverse applications, such as SAP Strategic Enterprise Management (SEM).
    http://wiki.sdn.sap.com/wiki/display/BI/AboutDSOtypesinBI
    Hope this helps.....

  • 2LIS_11_VAITM - Deleted Line Items

    I am having an issue with sales order line items that have been deleted.
    I load my data using 2LIS_11_VAITM into a DSO EDW layer. I then pass this up to another DSO layer. I noticed that users were deleting line items from orders, and this was causing an issue in my data. To remove the problem I have linked ROCANCEL to 0RECORDMODE in the technical rule up to my EDW layer, and now in my EDW layer the line item disappears completely when it is deleted out of the ERP system and the deletion is passed into BW.
    I got the idea for this from here: Deleted line item appears in BI report
    My problem lies in the layer above, where I am still left with a line in the DSO relating to the deleted item, but the key figures are zero. Is there any way to also get this line deleted, as in my EDW layer?
    Thanks,
    Joel

    Hi Joel,
    What is not completely clear to me is what happens in the first DSO: are the deleted records removed from the active table, or are the key figures set to 0?
    Furthermore, what is the update rule for the key figures: overwrite or addition (in the transformation to both DSOs)?
    The record mode is indeed the crucial factor. It must become clear which one is delivered by the DataSource: 'R' for a reversed record, or 'D' for an entire deletion? And what is the record mode in the update to the second DSO? (For reference, the standard 0RECORDMODE values are: ' ' after image, 'X' before image, 'A' additive image, 'D' delete, 'R' reverse image, 'N' new image.)
    Best regards,
    Sander

  • Comparison of records between two DSOs using an ABAP program

    Hello Experts,
    As per my business requirements we implemented 30 DSOs (EDW layer) for the X region, with reference to the Y region DSOs. For the X region DSOs we created transformations and DTPs, and we loaded the data into them through process chains. After loading, I want to compare the records of the two sets of DSOs with an ABAP program.
    Here my source is: X region DSOs
    Here my target is: Y region DSOs
    These two are the mandatory fields, and the optional fields are:
    1. Sales org
    2. Sales division
    3. Document creation date
    For this type of requirement I want to implement an ABAP program. If anyone has ABAP code for a similar requirement, please send it.
    Thanks & Regards,

    Hi Saurabh,
    If your requirement is to compare the values based on sales org, sales division and date, you can build a report; if instead you want to look up both DSOs and perform some operation during the load, you need to write a routine at transformation level.
    I'm sending you some sample code that I used to look up two ODSs and delete the already-existing sales documents in BI.
    " Start routine sketch: look up two DSO active tables and delete
    " source records whose customer already exists in either of them.
    DATA: BEGIN OF wa_uxxx,
            /bic/zaw TYPE /bic/azd_udlit00-/bic/zaw,
          END OF wa_uxxx.
    DATA: i_uxxx     LIKE STANDARD TABLE OF wa_uxxx,  " uxxxx lite docs
          i_uxxxx    LIKE STANDARD TABLE OF wa_uxxx,  " uxxxx docs
          wa_srcpack TYPE tys_sc_1.

    IF source_package IS NOT INITIAL.
      " Case 1: for uxxxx lite data
      SELECT /bic/zaw
        FROM /bic/azd_udlit00
        INTO TABLE i_uxxx
        FOR ALL ENTRIES IN source_package
        WHERE /bic/zaw = source_package-/bic/zcustomer.
      " Case 2: for uxxxx data
      SELECT /bic/zaw
        FROM /bic/azd_udan00
        INTO TABLE i_uxxxx
        FOR ALL ENTRIES IN source_package
        WHERE /bic/zaw = source_package-/bic/zcustomer.
    ENDIF.

    " Drop any source record whose customer was found in either DSO.
    LOOP AT source_package INTO wa_srcpack.
      READ TABLE i_uxxx INTO wa_uxxx
           WITH KEY /bic/zaw = wa_srcpack-/bic/zcustomer.
      IF sy-subrc <> 0.
        READ TABLE i_uxxxx INTO wa_uxxx
             WITH KEY /bic/zaw = wa_srcpack-/bic/zcustomer.
      ENDIF.
      IF sy-subrc = 0.
        DELETE source_package.  " sales doc already exists in BI
      ENDIF.
    ENDLOOP.
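    Note that the sample above removes overlapping records during a load rather than comparing two DSOs after the fact. For the actual compare requirement, a minimal standalone report sketch could look like the following - the active-table names /BIC/AZXREG00 and /BIC/AZYREG00 and the field names are placeholders for your X and Y region DSOs:

    REPORT z_compare_dso.
    " Compare sketch: list records present in the X region DSO but
    " missing from the Y region DSO. All object names are placeholders;
    " doc_numb is assumed unique in the Y region active table.
    DATA: BEGIN OF wa_doc,
            doc_numb TYPE c LENGTH 10,  " sales document number
            salesorg TYPE c LENGTH 4,   " sales org (optional filter)
          END OF wa_doc.
    DATA: lt_x LIKE STANDARD TABLE OF wa_doc,
          lt_y LIKE SORTED TABLE OF wa_doc WITH UNIQUE KEY doc_numb.

    START-OF-SELECTION.
      " Read both active tables (add WHERE clauses for sales org,
      " sales division and creation date as needed).
      SELECT doc_numb salesorg FROM ('/BIC/AZXREG00') INTO TABLE lt_x.
      SELECT doc_numb salesorg FROM ('/BIC/AZYREG00') INTO TABLE lt_y.

      LOOP AT lt_x INTO wa_doc.
        READ TABLE lt_y TRANSPORTING NO FIELDS
             WITH TABLE KEY doc_numb = wa_doc-doc_numb.
        IF sy-subrc <> 0.
          WRITE: / wa_doc-doc_numb, 'missing in Y region DSO'.
        ENDIF.
      ENDLOOP.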
    Hope this helps...
    Regards
    KP

  • Extract data from ECC to Oracle using Data Services 4.0

    How do I extract data from ECC 6.0 business content extractors to Oracle using SAP BO Data Services 4.0?

    Are you trying to use the SAP BW Business Content to extract data out of ECC and load into Oracle tables with Data Services? If that's the case, then you cannot do that. The SAP BW Business Content was developed to only be used in conjunction with SAP BW. When using Data Services to access the extractors in ECC, it has to have an SAP BW InfoPackage associated with it to execute. In this architecture, Data Services is only a pass through from ECC to BW and allows the ability to do some transformations of data prior to loading into the EDW layer (staging tables basically) on SAP BW.
    To connect ECC to Oracle, you're going to have to have all of the SAP BusinessObjects supplied Function Modules loaded onto ECC, along with a non-dialog logon account that has the ability to pass dynamic ABAP programs, generate the programs and schedule them. Depending on how you want to process the output, you may also have to have the ability to write to files on the ECC application servers and have an FTP account created on the application servers that can GET flat files and potentially DELETE them (you're going to need to delete periodically, otherwise your jobs will crash when the file space allocation has been consumed).

  • Record mode "R"

    Hi,
    Here I have a write-optimized DSO as the EDW layer, from which the data is loaded to other DSOs.
    Now I have a record with record mode value 'R' (reverse posting) in the EDW layer. I understand that the record with that particular key combination will be deleted from the active table of the ODS if the record mode value is 'R'.
    However, I believe I should then be able to see the same record with record mode 'R' in the change log table, but I am unable to see the record with that key combination in the change log table at all.
    Can you please explain why this is?
    Thanks,
    Siva.

    Hi,
    Delta Administration:
    Data that is loaded into write-optimized DataStore objects is available immediately for further processing; the activation step that was necessary up to now is no longer required. Note that the loaded data is not aggregated: if two data records with the same logical key are extracted from the source, both records are saved in the DataStore object, since the technical key is unique for each record. The record mode (InfoObject 0RECORDMODE, with values space, X, A, D, R) responsible for aggregation remains, but the aggregation of data can take place at a later time in standard DataStore objects or InfoCubes. Write-optimized DataStore objects do not support image-based delta (0RECORDMODE); they support request-level delta, and you get a brand-new delta request for each data load. When you load a DataStore object that is optimized for writing, the delta administration is supplied with the change log request and not the load request.
    Since write-optimized DataStore objects do not have a change log, the system does not create a delta (in the sense of a before image and an after image) - which is why you cannot find your 'R' record there. When you update data into the connected InfoProviders, the system only updates the requests that have not yet been posted.
    Write-optimized DataStore objects therefore support request-level delta only. In order to capture before- and after-image deltas, you have to post the latest request into further targets like standard DataStore objects or InfoCubes.
    Thanks

  • Why ODS in this example

    Hello Gurus
    I have a doubt which is not clear for me completely.
    I know the ODS is used for detailed reports etc., but why is an ODS used if the DataSource can handle deltas?
    Please explain with an example so I understand completely.
    Greetings

    George,
    Apart from what our friends said in the above posts: today BW is not simply a data warehouse but has become an EDW (Enterprise Data Warehouse). The basic concept in an EDW is to bring all data into BW at the EDW level and then move the required data to the next level, as needed for analysis. To provide this EDW layer, ODS objects are used to store the data. Even if the DataSource supports delta, the data is kept safe in BW to cope with various anticipated situations.

  • Key Performance Indicators calculation

    Hi Guru's,
    I need some information to understand where it is feasible to calculate the KPIs. Currently my project is planning to calculate them at the ETL level. If it is done at the ETL level, what are the precautions that we have to take? Does it impact SAP design standards and the performance of the database? Please give me your valuable suggestions quickly.
    Thanks in advance,
    Thanks and Regards,
    Ganni Venkat.

    Basically you plan to hard-code RKFs and CKFs into columns at the DSO and cube level instead of using RKFs and CKFs at the OLAP level during run time (a small illustration of what that hard-coding looks like follows at the end of this post).
    This design within reporting marts (certainly not at the DSO level) is one way of doing things - but over time it can become unsustainable if the business model changes (which they all do).
    I for one would only hard-code the KPIs in a layer above the reporting layer (i.e. in a consolidation layer), where the data models can be flattened and the load is via a snapshot.
    The response times at the database level will be fantastic, however, as there is no memory-based OLAP processing going on.
    As mentioned, though, the next time somebody uses a separate document type etc. or wants to restate history, you have a huge problem on your hands reloading data into the reporting marts.
    If however you are one of the new Teradata-based sites, then I would think about doing it this way, as building report-specific marts based on full loads from an EDW layer is normal within Teradata models.
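    To illustrate: "hard-coding a KPI at the ETL level" typically means a field routine in the transformation computes the figure once at load time instead of a CKF computing it at query runtime. A minimal, hypothetical sketch (NETVAL and COSTVAL are invented source fields; RESULT fills the hard-coded KPI column):

    " Field routine sketch: compute a margin KPI at load time.
    " Field names are invented for the example.
      RESULT = source_fields-netval - source_fields-costval.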
