How to use SDDM for a multi-layer data warehouse

Hi,
I'm evaluating the use of SDDM for the data design of a data warehouse. My DWH will be built using a 3-layer architecture: an Operational Data Store, an enterprise data warehouse layer, and a datamart layer.
The first two layers are relational models: the ODS is just a load of all sources; the EDWH comprises both master data and subject areas in a relational model. The datamart is the top layer, built as a multidimensional model.
My question is whether it's better to use a single SDDM 'design' in which I create multiple relational models and a multidimensional model, or a separate design for each layer.
Thanks
Pilot

If they all have a common logical model you could use one design, as SDDM lets you tie multiple relational and multidimensional models to one logical model. (Actually, you can do it all in one design even without that.)
Right now I am using one design to hold my source table designs, a stage table layer (like source plus ETL metadata columns), my integration layer (EDW), and my dimensions/star-schema presentation layer. The only challenge is that if you have multiple sources with the same table names, you get a conflict in the relational model.
Each of my layers has a different schema name in the database, so I am able to create and manage sub-views based on the schema names. That makes it possible to generate sub-view-based DDL scripts easily for managing the build, reports, etc.
Having it in one design also lets me create diagrams to show the model evolution (and data lineage) from source to stage to EDW to star.
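A minimal sketch of that schema-per-layer convention (the schema names ods, edw and dm here are hypothetical, one per layer):

CREATE TABLE ods.customer (           -- raw load from the source system
  customer_id    NUMBER,
  customer_name  VARCHAR2(100),
  load_ts        TIMESTAMP            -- ETL metadata column
);

CREATE TABLE edw.customer (           -- integrated EDW version
  customer_key   NUMBER PRIMARY KEY,
  customer_id    NUMBER,
  customer_name  VARCHAR2(100)
);

CREATE TABLE dm.dim_customer (        -- star-schema presentation layer
  customer_key   NUMBER PRIMARY KEY,
  customer_name  VARCHAR2(100)
);

Because each layer has its own schema, the same table name can appear in every layer without colliding in the relational model, and a sub-view per schema gives you one DDL script per layer.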

Similar Messages

  • User Exit for Changing Payment Due date for Invoices

    HI Gurus,
Could anyone suggest a user exit which can be used for due date calculation on all invoice entries, like MIRO, FB60, etc.
This is required for overwriting the due date determined via the payment terms with a particular day of the week (e.g. Friday).

    Hi
    User exit
    Re: user exits in MM
    Regards
    Ram

How do I bring the management pack to the data warehouse?

I have been working with Service Manager a lot, but this is my first custom management pack in SCOM.
What I am trying to do is create two classes and create a relationship. Data will be entered in these classes using PowerShell. These classes should be brought to the data warehouse as well. There is no discovery or monitoring involved. I have sealed the management pack.
I created the class and relationship and imported the management pack in the SCOM console. The import was successful and I can see the tables are created in the OperationsManager database, but I don't see them in the data warehouse.
I don't see any place from where I can force-sync the management packs.
Is there something special I have to do to bring these classes to the data warehouse?

    Hi 
SCOM will not create tables for the MP in the DW. The DW is intended to store the results of monitors and rules, such as alerts, performance data, event IDs, etc., for reporting.
Refer to the link below for more information:
http://blogs.technet.com/b/stefan_stranger/archive/2009/08/15/everything-you-wanted-to-know-about-opsmgr-data-warehouse-grooming-but-were-afraid-to-ask.aspx
    Regards
    sridhar v

  • How to Scale SDDM for Large Designs

We have an Oracle Designer repository with 5 databases and about 20 schemas that need to interact. Three of the databases are COTS systems while two are for custom development. On importing to SDDM we have almost 7000 objects and it takes forever to save the design. SDDM does not scale very well. If I add one column to one table I have to save the ENTIRE design. I hate to think about what that will mean when it comes time to sync with Subversion. Does it have to compare everything in the ENTIRE design to determine what has changed?
Am I missing something? Is there a way to save just one table without saving the whole design? If not, SDDM is only useful for small silo databases that don't need to interact. Can I have 20 schema-based designs that can relate to other designs in the same database? For instance, we have multiple application-based schemas in a single database and they all refer to one reference schema with shared lookup data in the same database.
I wish we could stay on Designer, but our sysadmins do not want to support legacy OSes, defined as anything older than MS Server 2008 R2 and Windows 7. And Data Modeler has some really nice features, but it doesn't support our current design approach. We like to provide our developers with diagrams with all the tables they will be interacting with on all the databases.
Any other users out there with large designs like this? How do you use SDDM?
    Marcus Bacon

    Phillip,
    Thanks for the reply.
"How do you count these 7000 objects?" We have about 2500 entities and tables and no, not all on one diagram. This is the number that was imported from Designer. That is why I am struggling with SDDM, because in Designer we divide and conquer. We have about 20 applications for our custom work and 3 COTS applications. We share/shortcut entities/tables needed from the COTS apps into our custom applications.
"You need a 64-bit OS." I have been lazy on this one since the 32-bit install includes the JVM for me. I have a 64-bit box with 16 GB of memory, dual quad-core CPUs, and two 500 GB disk drives which are slow and not RAID. Guess I will uninstall again and install 64-bit before complaining again.
Saving takes a long time, and the last time I imported I received some errors yet the log page said 0. When I tried to save it, SDDM blew itself away. I would open a support request, but right now I do not have support because the government people dropped it and we have been in the process of getting it back for several months. Any thoughts on the errors?
"In Data Modeler 3.2 tables from other models can be referred to, thus you can have 20 models (500 objects each) and you can select to load only the one or two models you need to work on." I keep hearing rumors of 3.2 and I may try to wait until it is out before we migrate. I know, Oracle doesn't comment on release dates, so I won't ask if it will be out in the next few months.
"There is a 'save' at model level, but as I said you can use save at design level as well - only changed objects will be saved." How do you save at these different levels?
"What version of Data Modeler do you use?" I am currently using SDDM 3.1.3.709.

  • BMM - Decision criteria on how to define LTS for a multi-source object

I am curious to know what others use as their decision criteria when determining how to set up a BMM logical table that has multiple sources that map to multiple columns in the logical table.
    For example,
    If you have defined logical table Dim - Customer in the BMM that has 4 fields:
    Dim - Customer
    Customer Id
    Customer Name
    Customer Address
    Customer Phone
    The sources for this logical table from the physical layer are:
    CUSTOMER (all customers exist here)
    CUSTOMER_ADDRESS (A customer may or may not have an address, but can have only one address to simplify this conversation)
    CUSTOMER_PHONE (A customer may or may not have a phone, but can have only one phone to simplify this conversation)
    Field Mappings from Logical Table to Physical Table are:
    Dim - Customer
    Customer Id - CUSTOMER
    Customer Name - CUSTOMER
    Customer Address - CUSTOMER_ADDRESS
    Customer Phone - CUSTOMER_PHONE
How would you set up the logical table and LTSs for this object in the BMM, given the following requirements?
    1. The Address and Phone tables should only be joined in when fields from those tables are used in a report/query.
2. Since a customer may or may not have an address or phone, the BMM table needs to be set up such that adding those fields into a query will not filter out all customers that don't have data for those fields, i.e. display null if it does not exist.
    My initial approach for resolving this would be to set it up as follows:
    A single Logical Table Source: LTS_CUSTOMER
    Three Tables added to the LTS:
    CUSTOMER, CUSTOMER_ADDRESS, CUSTOMER_PHONE
    Two joins defined for the three tables:
    CUSTOMER - CUSTOMER_ADDRESS (LEFT OUTER JOIN)
    CUSTOMER - CUSTOMER_PHONE (LEFT OUTER JOIN)
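For illustration, with a single LTS defined this way the physical SQL the BI server generates looks roughly like the following sketch (join keys and aliases are assumptions, not taken from the original post):

SELECT c.customer_id,
       c.customer_name,
       a.customer_address,
       p.customer_phone
FROM   customer c
LEFT OUTER JOIN customer_address a ON a.customer_id = c.customer_id
LEFT OUTER JOIN customer_phone   p ON p.customer_id = c.customer_id;

Both outer joins are emitted even when the report only requests Customer Id or Customer Name, which is exactly the first issue below.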
    Issues/Questions:
- The problem I have run into is that any query that uses Customer Id or Customer Name still causes a join to the CUSTOMER_ADDRESS and CUSTOMER_PHONE tables even though I didn't use a field from those tables. I assume this is because OBIEE thinks that any query to that logical table using that LTS has to query it as the LTS is defined (so all three tables are joined in).
- Is there a way around this? If so, is it to specify separate LTS tables for the logical table Dim - Customer? If you do this, then there is no way to specify a LEFT OUTER JOIN and the query will just default to what is specified in the physical layer as an inner join.
- Curious as to what people's thoughts are in general for how to decide whether to specify multiple sources in a single LTS or just add multiple sources to the logical table, and the decision criteria for going either way.
    Thanks in advance.
    K

    Thanks a lot for the example, that fixed the problem.
I've been reading the NSAPI programmer's guide several times now, and I was under the impression that you should either use <object> or <client>; I never thought about this approach. Now when I re-read it, it makes sense, but it wasn't very clear at first.
I think I'll e-mail Sun to suggest adding the <Object> tag to their <Client> tag example (http://docs.sun.com/source/817-6252/npgobjcn.html#wp1045056) to make things a little more clear.
    Thanks again, now I can finally implement the final configuration settings and plan for the actual switch later this evening.

How do you set up for your multiple cores and other settings?

Since the preferences don't really give us an option, does that mean that FCPX just "knows" how good your computer is and what you want to devote (memory-wise, etc.) to it?

I have the same problem - beach balls and temporary freezing, on 2 Mac Pros (8-core, 32 GB RAM, HD5870 card). All the while CPU and other resources are not maxed out; this is on 10.6.8. There seem to be 'some' bugs - for some people - that in fact make FCPX quite slow (and slow the system down as well), all the while there is no bottleneck in the hardware...

Enhancement/User Exit for logic setting call date / horizon

Due to planning in IP10 we have a horizon set by the system for (e.g.) 80%. This is working OK for small planning intervals (i.e. up to 24-month intervals).
However, when the interval exceeds this limit, we want to set the call date to be at most 30 days (for instance) ahead of the scheduled start date of the order.
Thus we need a user-defined way to manipulate/set the call date via a user exit (logic in ABAP - or via IMG settings if possible) that differs from the standard SAP setting for this date.
The question is then how we - as best practice - can do so.
Please advise if you need more information on this issue.

    Hi
Find the available exits with the following program:
*& Report ZFINDUSEREXIT
report zfinduserexit.

tables: tstc, tadir, modsapt, modact, trdir, tfdir, enlfdir.
tables: tstct.

data: jtab like tadir occurs 0 with header line.
data: field1(30).
data: v_devclass like tadir-devclass.

parameters: p_tcode like tstc-tcode obligatory.

* Find the development class of the program behind the transaction.
select single * from tstc where tcode eq p_tcode.
if sy-subrc eq 0.
  select single * from tadir where pgmid    = 'R3TR'
                               and object   = 'PROG'
                               and obj_name = tstc-pgmna.
  move tadir-devclass to v_devclass.
  if sy-subrc ne 0.
*   The transaction may call a function group instead of a report.
    select single * from trdir where name = tstc-pgmna.
    if trdir-subc eq 'F'.
      select single * from tfdir where pname = tstc-pgmna.
      select single * from enlfdir where funcname = tfdir-funcname.
      select single * from tadir where pgmid    = 'R3TR'
                                   and object   = 'FUGR'
                                   and obj_name eq enlfdir-area.
      move tadir-devclass to v_devclass.
    endif.
  endif.

* Collect all enhancements (SMOD objects) in that development class.
  select * from tadir into table jtab
         where pgmid    = 'R3TR'
           and object   = 'SMOD'
           and devclass = v_devclass.

  select single * from tstct where sprsl eq sy-langu
                               and tcode eq p_tcode.

  format color col_positive intensified off.
  write: /(19) 'Transaction Code - ',
          20(20) p_tcode,
          45(50) tstct-ttext.
  skip.

  if not jtab[] is initial.
    write: /(95) sy-uline.
    format color col_heading intensified on.
    write: /1  sy-vline,
            2  'Exit Name',
            21 sy-vline,
            22 'Description',
            95 sy-vline.
    write: /(95) sy-uline.
    loop at jtab.
      select single * from modsapt where sprsl = sy-langu
                                     and name  = jtab-obj_name.
      format color col_normal intensified off.
      write: /1  sy-vline,
              2  jtab-obj_name hotspot on,
              21 sy-vline,
              22 modsapt-modtext,
              95 sy-vline.
    endloop.
    write: /(95) sy-uline.
    describe table jtab.
    skip.
    format color col_total intensified on.
    write: / 'No of Exits:', sy-tfill.
  else.
    format color col_negative intensified on.
    write: /(95) 'No User Exit exists'.
  endif.
else.
  format color col_negative intensified on.
  write: /(95) 'Transaction Code Does Not Exist'.
endif.

* Clicking an exit name on the list opens it in SMOD.
at line-selection.
  get cursor field field1.
  check field1(4) eq 'JTAB'.
  set parameter id 'MON' field sy-lisel+1(10).
  call transaction 'SMOD' and skip first screen.
If there are no available user exits you could go for BAdIs.
To search for a BAdI, go to SE24 and display class CL_EXITHANDLER. Double-click on method GET_INSTANCE and set a breakpoint on the CASE statement. Execute, and start the required transaction in a new session. Look at the variable exit_name; it will show the available BAdIs.
    Please reward if useful....
    regards
    Dinesh

How to create reports for InfoPath form data

    Hi All,
I have a requirement to generate reports in SharePoint based on InfoPath form data which is submitted into a form library. Do I need to store each InfoPath form's data in a SQL/MS Access database? If yes, can anyone help me with how
to save InfoPath form data into the DB while submitting into the form library, and, based on the data stored in the DB, how to create and view the report in SharePoint?
    Thanks in Advance,
    Satish
    Thanks & Regards Satish Dugasani

Thank you for asking the questions for clarification; my requirement is like below:
1. Need to insert all InfoPath form field data into the Access database when I am submitting to the form library
2. Create Access reports, e.g. based on time period, based on submitter name, based on state, city, combinations of these, etc.
3. Sync the Access database every time I update the form
4. View the Access reports in the SharePoint site
PS: I worked on this with the approach below:
1. Wrote a web service to sync the Access database for the InfoPath form
2. Created the Access reports
3. Published using "Publish to Access Services"; by providing the site URL it creates a subsite where I am able to see reports, tables, macros, etc., but it is not user friendly: I am not able to apply my branding because it points to /_Layouts/accsrv/Modifyapplication.aspx.
Could anyone suggest me on this?
    Thanks & Regards Satish Dugasani

How to do an INIT for Sales Order Item Data (DS 2LIS_11_VAITM)

    Hi all
I have a report on sales order item data and I have to reload it (DS 2LIS_11_VAITM).
Can any of you please explain the steps to do the INIT? Refill the setup tables, etc.
    Many Thanks in advance
    Ishi

    Hi Hari
Many thanks for the explanation and steps. I deleted the setup tables and executed the job to fill them.
It says in R/3 'Start of Processing'; I ticked to continue and it's still running.
In the meantime I checked RSA3 and it says 1007 records selected. I am refreshing it and the number of records is still the same.
Can you tell me how long it takes to fill the table?
And the R/3 job is still running ('Start of Processing').
    Thanks again
    Ishi

How to listen out for a stream of data coming from a (web)server

    Hello
I have an applet that connects on a socket to a server in the same location as the web server. This all works fine for sending commands to the server, but the server can send data to the client at any moment in time. So how do I listen out for that activity? Do I launch a separate thread that sits listening for incoming data?
What is the way to do it?
    Angus

    Check out my InfoFetcher class
    http://forum.java.sun.com/thread.jspa?threadID=750441&messageID=4291848

BAdI or user exit for CJ20N milestone date

    HI...
I have to access the field LST_ACTDT (milestone date) in CJ20N, and I have to send an alert mail using that value.
But I am unable to find a BAdI or user exit to get that value. I am using the BAdI WORKBREAKDOWN_UPDATE, but I'm not reaching a solution.
    Can anyone help me out..

    Hi Priyank,
    Instead of that you can directly use the BAPI -BAPI_PROJECT_GETINFO
CALL FUNCTION 'BAPI_PROJECT_GETINFO'
  EXPORTING
    PROJECT_DEFINITION     =
    WITH_ACTIVITIES        = 'X'
    WITH_MILESTONES        = 'X'
    WITH_SUBTREE           = 'X'
  IMPORTING
    E_PROJECT_DEFINITION   =
    RETURN                 =
  TABLES
    I_WBS_ELEMENT_TABLE    =   " pass the PSPNR value by appending it to this table, i.e. IT_WBS_ELEMENT-PSPNR
    E_WBS_ELEMENT_TABLE    =
    E_WBS_MILESTONE_TABLE  =   " catch this in an internal table
    E_WBS_HIERARCHIE_TABLE =
    E_ACTIVITY_TABLE       =
    E_MESSAGE_TABLE        =
If you wish to have the details of the WBS only, then when entering the WBS element in I_WBS_ELEMENT_TABLE you have to set:
WITH_ACTIVITIES = 'X'
WITH_MILESTONES = 'X'
WITH_SUBTREE = 'X'
Only then will you get the details of the corresponding WBS via BAPI_PROJECT_GETINFO.
In the milestone internal table you will have all the dates.
Please refer to the function module documentation for further details.

  • Screen/ User Exit for Internal Order Master Data

    Hi,
I would like to add one more field to the internal order master data in transaction KO01. I know the table AUFK and I know there is an include structure (CI_AUFK), but I don't know how to add the field to it.
If you go to screen exit COOPA003, there is a structure like the CI_AUFK structure, but there is no calling screen in SMOD.
Can anybody please let me know how to do that?
It's really very urgent.
Your help would be really appreciated.
    Thanks,
    Dhaval Shah.

    Hi Chandra,
Thanks for the reply. Can you please let me know from which transaction I have to do it? I would really appreciate your help if you could write down the whole process.
I am a functional person and I am using this screen exit for the first time, so I'll be really thankful to you.
    Thanks,
    Dhaval Shah.

  • User exit for copying Inbound delivery data to Batch

Hi, my client wants to copy the field 'country of origin' from the inbound delivery to the batch when we create a new batch in the inbound delivery.
Can anyone tell me which user exit I can use? Thanks.

    Removed, for new thread

How to optimize DELETE for a large amount of data?

    Hi Guys,
I am creating a DELETE SQL statement for deleting a large amount of data, approximately a million records. When I tested it, it took a large amount of execution time. Is there a way I can optimize this query?
    I highly appreciate any idea from you guys!
    Thanks,
    Jay

JayPavia wrote:
"I am creating a DELETE SQL statement for deleting a large amount of data, approximately a million records. When I tested it, it took a large amount of execution time. Is there a way I can optimize this query? I highly appreciate any idea from you guys!"
Jay,
    one quite nice approach is described in the AskTom thread: http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:2345591157689#2356959585102
In a nutshell: you can create a partitioned table with the same layout (using e.g. RANGE partitioning with a single default partition), insert the remaining data into that table, and swap the two segments using ALTER TABLE ... EXCHANGE PARTITION.
This allows you to load the data into the partitioned table using all means available to you, e.g. a direct-path parallel insert with the NOLOGGING attribute on the table to make it as fast as possible, rebuilding/creating any indexes in bulk after the load completes.
Furthermore, at least seamless read access to the table is possible while you're performing this operation, and you don't have to deal with renaming/granting etc., which you would need to do with a traditional CTAS approach.
    Of course, if the table needs to be online for DML while performing the DELETE you can't use these options.
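A minimal sketch of that approach (the table name big_table and the keep-predicate are hypothetical):

-- Staging table with the same layout, RANGE-partitioned with a single
-- catch-all partition so its segment can be exchanged with the table.
CREATE TABLE big_table_stage
  NOLOGGING
  PARTITION BY RANGE (id)
    (PARTITION p_all VALUES LESS THAN (MAXVALUE))
  AS SELECT * FROM big_table WHERE 1 = 0;

-- Direct-path parallel insert of only the rows to keep.
INSERT /*+ APPEND PARALLEL(s, 4) */ INTO big_table_stage s
SELECT * FROM big_table
WHERE  created_dt >= DATE '2010-01-01';  -- hypothetical keep-predicate
COMMIT;

-- Rebuild/create indexes on big_table_stage in bulk here, then swap:
ALTER TABLE big_table_stage EXCHANGE PARTITION p_all WITH TABLE big_table;

-- big_table now holds only the kept rows; the staging table holds the
-- old segment and can be dropped.
DROP TABLE big_table_stage;

After the exchange, the 'delete' has effectively happened as a data-dictionary operation rather than row by row.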
    Regards,
    Randolf
    Oracle related stuff blog:
    http://oracle-randolf.blogspot.com/
    SQLTools++ for Oracle (Open source Oracle GUI for Windows):
    http://www.sqltools-plusplus.org:7676/
    http://sourceforge.net/projects/sqlt-pp/

How to read data channel by channel and appended block by appended block?

    Good day,
I wrote a little program that saves blocks of data after a trigger to a file; it appends these blocks of data as many times as you set it.
Each appended data block has a time stamp, a signal path, and a burst number as a header.
I use the storage approach to save it in TDM format. This all seems to work nicely.
Now my question:
Afterwards I want to read each block or burst separately. How can I do this? I can't seem to find an easy way to read just one data block in the file.
    best regards
    Joost van Heijenoort
    Ursa Minor Space and Navigation

Already found out that I have to save it with unique channel names...
this helps...
    best regards
    Joost
