Is MDM a tool to design a Data Warehouse or a Database?

Hi
Is MDM a tool to design a data warehouse or a database? Is it an ETL tool?
Actually, what do you design in the repository? Does it follow any schema? Where does the data get stored?
Would I load the data into MDM?
What do I design for?
I understand the usage of MDM and the difference between MDM and SAP BI: MDM is for master data,
while BI is for both master and transactional data, and a lot more...
I would really appreciate your help.
Thanks and regards
Novino

Hi Novino
I suggest you read the MDM reference guides; they will answer most of your queries.
What is the name of the schema you mentioned?
Is it a star schema, snowflake schema, extended star schema, or something else?
MDM can use any schema file, either custom or a standard SAP IDoc structure. Star and snowflake schemas are database design principles. You can store any XSD file as a valid schema in MDM; it holds information about the data types, field names, metadata, etc. in the MDM repository.
Could you please let me know what the prerequisites are, in terms of knowledge, to design a repository?
Do I have to be specialized in any kind of database design?
The most important job is data modeling. You need to understand the master object in question and the business needs or processes that MDM will support. Use the Data Manager reference guide; it will equip you for data modeling.
Do I have to know XML and ABAP as well to work on MDM?
Not necessarily, but knowing them adds value when working on MDM integration scenarios such as consolidation and harmonization of data.
Hope it helped.
regards
Ravi

Similar Messages

  • How can I extract data from a data warehouse to BI

    Hello Gurus,
    Our requirement is to extract data from a data warehouse (Informatica, Business Objects).
    How can I extract it, and what are the steps to follow?
    Regards

    Hi Lakshmi,
    You can use third-party tools to extract data from the DW, such as Informatica PowerCenter or BusinessObjects Data Integrator.
    To use third-party source systems, you have to create RFC destinations from BW to an RFC server running on the third-party ETL server.
    These tools usually work in a similar way. First, they extract data from the source database. Then, the tool stages the data in an intermediate database before loading it to the target system.
    Transformations can occur at any point in the data flow, although you will usually want to store an untransformed copy of the data before significantly manipulating it.
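    As a rough illustration of that staged flow (this is not tied to any specific tool; the connection strings and table names below are made-up placeholders), the extract-and-stage pattern looks roughly like this in plain JDBC:

        import java.sql.*;

        public class StagedEtlSketch {
            public static void main(String[] args) throws SQLException {
                // 1. Extract from the source data warehouse (placeholder connection details).
                try (Connection src = DriverManager.getConnection("jdbc:oracle:thin:@src-host:1521/DW");
                     Connection stg = DriverManager.getConnection("jdbc:postgresql://stage-host/staging")) {

                    try (Statement read = src.createStatement();
                         ResultSet rs = read.executeQuery("SELECT cust_id, revenue FROM sales_facts");
                         // 2. Stage an untransformed copy in the intermediate database.
                         PreparedStatement write = stg.prepareStatement(
                                 "INSERT INTO stg_sales_facts (cust_id, revenue) VALUES (?, ?)")) {
                        while (rs.next()) {
                            write.setLong(1, rs.getLong("cust_id"));
                            write.setBigDecimal(2, rs.getBigDecimal("revenue"));
                            write.addBatch();
                        }
                        write.executeBatch();
                    }
                    // 3. Transformations and the final load into the target follow from here;
                    //    the hand-over to BW itself goes through the RFC destination described above.
                }
            }
        }

    The point of the staging step is simply that you keep the untransformed copy mentioned above before doing any heavier manipulation.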
    Thanks

  • Order Data Warehouse SD report?

    Hi all, kindly give a solution for this scenario.
    If there is no quantity on the request, it will not show a "0" order quantity there; it will show the first promised quantity by its side and show the promised quantity there. Please change your query accordingly and do not take the schedule lines as they are from the order table. There will not be any row with 0 quantity.
    Explaining the case you asked about: 1 piece requested on 11-Dec and promised on 12-Dec will appear as below.
    Request Date    Promise Date    Order Qty
    11-Dec          12-Dec          1
    Once invoicing happens, the invoice quantity will drive the query.
    Just take an example: 3 pieces requested on 11-Dec, all 3 promised on 12-Dec, and only one invoiced on 12-Dec. As of 12-Dec, it will appear as below:
    Request Date    Promise Date    Invoice Number    Billing Date    Order Qty
    11-Dec          12-Dec          ABC               12-Dec          1
    11-Dec          12-Dec                                            2
    If the remaining 2 were billed on 15-Dec, it will appear as below on 15-Dec:
    Request Date    Promise Date    Invoice Number    Billing Date    Order Qty
    11-Dec          12-Dec          ABC               12-Dec          1
    11-Dec          12-Dec          DEF               15-Dec          2

    Yes, all jobs are running and the cubes are also processed. I have two servers: 1) a data warehouse server containing the SQL database and 2) the Service Manager console.
    I didn't connect an Operations Manager server, but I created one incident manually from the Service Manager console > Work Items, so it should show in the data warehouse incident report.

  • Service Manager Data Warehouse Connection Issue

    We have 3 virtual servers for SCSM. The server scenario is as follows:
    SCSM management server, console, and management server SQL database.
    SCSM data warehouse server, reporting server, and data warehouse & reporting SQL database.
    SharePoint server, SCSM web server, and SharePoint server SQL database.
    Only the SharePoint SQL server is running on SQL Server 2008; the other SQL servers are running on SQL Server 2012. Previously the data warehouse connectivity was fine, but recently the data warehouse connection is not working. When I try to unregister the data source connection, it shows an error. I tried to check the data source with the PowerShell cmdlet Get-SCDWSource, but it does not show any source. How can we unregister and re-register the data source along with the data warehouse connection?
    Mithun Dey Web: http://cloudmithun.wordpress.com If this gives you the necessary resolution, please mark it as Answer.

    I have 2-3 errors.
    In the SCSM Management Server:
    Event ID: 34152
    The description for Event ID 34152 from source Data Connectors cannot be found. Either the component that raises this event is not installed on your local computer or the installation is corrupted. You can install or repair the component on the local computer.
    If the event originated on another computer, the display information had to be saved with the event.
    The following information was included with the event: 
    Domain Connector
    Error resolving user with domain PRIMEBANK and name as040121. Exception: System.DirectoryServices.DirectoryServicesCOMException (0x8007052E): The user name or password is incorrect.
       at System.DirectoryServices.DirectoryEntry.Bind(Boolean throwIfFail)
       at System.DirectoryServices.DirectoryEntry.Bind()
       at System.DirectoryServices.DirectoryEntry.get_AdsObject()
       at System.DirectoryServices.PropertyValueCollection.PopulateList()
       at System.DirectoryServices.PropertyValueCollection..ctor(DirectoryEntry entry, String propertyName)
       at System.DirectoryServices.PropertyCollection.get_Item(String propertyName)
       at System.DirectoryServices.AccountManagement.PrincipalContext.DoLDAPDirectoryInitNoContainer()
       at System.DirectoryServices.AccountManagement.PrincipalContext.DoDomainInit()
       at System.DirectoryServices.AccountManagement.PrincipalContext.Initialize()
       at System.DirectoryServices.AccountManagement.PrincipalContext.get_QueryCtx()
       at System.DirectoryServices.AccountManagement.Principal.FindByIdentityWithTypeHelper(PrincipalContext context, Type principalType, Nullable`1 identityType, String identityValue, DateTime refDate)
       at System.DirectoryServices.AccountManagement.UserPrincipal.FindByIdentity(PrincipalContext context, String identityValue)
       at Microsoft.EnterpriseManagement.ServiceManager.Sdk.Connectors.Lfx.Workflows.ADGroupExpander.MarkDeletions()
    In the Data Warehouse server:
    Event ID: 33519
    The description for Event ID 33519 from source Data Warehouse cannot be found. Either the component that raises this event is not installed on your local computer or the installation is corrupted. You can install or repair the component on the local computer.
    If the event originated on another computer, the display information had to be saved with the event.
    The following information was included with the event: 
    bd773a44-6c16-1ad9-fe59-344395587b6d
    ServiceManager.ServiceCatalog.Cubes.Library
    7.5.2905.0
    SM Management Group
    Either the MP has failed deployment Or any of its references has failed deployment
    Mithun Dey Web: http://cloudmithun.wordpress.com If this gives you the necessary resolution, please mark it as Answer.

  • Deploy data warehouse with Control Center

    Hi,
    I am a rookie in data warehousing and the ETL process. I have successfully deployed our sequences, tables, and dimensions. As the next step we would like to deploy our mappings. The deployment finished with errors:
    For the first, 'bigger' mapping we received the following error message:
    RPE-01012: PL/SQL mappings cannot be deployed to this target schema. The target schema and the Control Center have to be in the same instance.
    The OWB_USER role must be granted to the target schema. Use the OWB security interface to make the target schema an OWB user.
    For the second, 'simpler' mapping we got the following warning message:
    ORA-06550: PACKAGE BODY, line 0, column 0: PL/SQL: Compilation unit analysis terminated
    ORA-06550: PACKAGE BODY, line 1, column 15:
    PLS-00304: cannot compile body of 'PRODUCT_MAP' without its specification
    ORA-06550: PACKAGE BODY, line 1, column 15:
    PLS-00905: Object DWH.PRODUCT_MAP is invalid.
    ORA-06550: PACKAGE, line 171, column 1:
    PL/SQL: Declaration ignored
    ORA-06550: PACKAGE, line 171, column 24:
    PLS-00201: identifier 'OWBSYS.WB_RT_MAPAUDIT' must be declared.
    Has anyone solved something similar?
    Thank you!
    Regards, Robert

    The issue is not relevant any more.

  • Data Warehouse Development Position Available in the Dallas/Ft. Worth Area

    Esoftsolutions, Inc. is an award-winning systems integration firm headquartered in Plano. We specialize in providing both solutions and services in several areas including data warehousing, ERP package implementations/upgrades, and infrastructure-related upgrades and enhancements. Recognized as one of the best and brightest companies to work for by our employees and one of the fastest-growing companies in the nation, esoftsolutions, Inc. is seeking qualified candidates for the following data warehouse position.
    This individual will work closely and communicate effectively with a variety of clients, participate in the evaluation and creation of project requirements, and design, develop, and implement efficient and effective ETL processes to Essbase, Oracle staging, fact, and summary tables, consolidating solutions to complex business requirements in a global environment while working both independently and as part of a larger team. He/she will work with Genio ETL tools, Oracle import utilities, and Hyperion Essbase to build data management programs for data warehouse applications (models/OLAP data marts) for the Marketing, Finance, and Operations departments, analyze and verify data for reasonableness, and work on special projects. The key responsibilities associated with this role are:
    RESPONSIBILITIES:
    - Design, build, and document ETL programs.
    - Support ETL programs and make adjustments as needed.
    - Assist with troubleshooting models/data marts.
    - Respond to special requests from the finance team, marketing dept., and various operational areas.
    - Create and update documentation.
    - Modify or assist in the management of Oracle tables as needed.
    - Build new models as requested.
    - Prepare the models for the weekly build.
    - Build weekly models (approx. 14).
    - Verify the data in the models; research as needed.
    - Prepare the models for the monthly build.
    - Maintain users, groups, and libraries on the Express server.
    - Provide reports on a monthly basis.
    QUALIFICATIONS:
    Degree/Major: BS Information Systems/Accounting Information Systems
    - Relational database background required (Oracle preferred).
    - Excellent verbal and written communication skills; project management experience a plus.
    - Excellent problem-solving skills.
    - MS Excel skills required.
    - 3-5 years of ETL data warehousing experience required.
    - 2-3 years of experience using Genio tools preferred.
    - C++, Pascal, PL/SQL, Delphi, InterDev programming experience or equivalent required.
    - Essbase experience welcomed.
    - NT server and workstation.
    - Strong understanding of financial concepts (GL, Cash, A/P, A/R, Project Accounting).
    - Previous application development desired.
    - Self-motivated and resourceful.
    - Preference for working in a cooperative team environment.
    - Demonstrated ability and willingness to learn new technology quickly.
    This is a long-term opportunity in the Dallas area. The client resides in the manufacturing vertical. The compensation is based on skills and experience and is comprised of a salary plus bonus structure. This is an immediate opportunity with a well-established client. Qualified applicants may respond to the e-mail address below with a Word copy of their resume. Qualified applicants will be contacted by our recruiting office. Unfortunately, relocation is not budgeted for this position. We respectfully request responses from individuals only; we are unable to support third-party inquiries at this time.
    John Siggins
    esoftsolutions, inc.
    5045 Lorimar Drive, Suite 200
    Plano, TX
    [email protected] / http://www.esoftsolutions.com

    elset191 wrote:
    crossrulz wrote:
    Wow. I was still in high school when that original post was made. I doubt the position is still available.
    7th grade for me.  I wish I had known about this opening!
    I was in divorce court in 2000. Man I feel old on the forums sometimes.
    Mark Yedinak
    "Does anyone know where the love of God goes when the waves turn the minutes to hours?"
    Wreck of the Edmund Fitzgerald - Gordon Lightfoot

  • Why SAP HANA? What makes it different from the data warehousing concept?

    Hi,
    Greetings, guys! I am wondering about, and kind of lost with, SAP's newly introduced database, SAP HANA. I have been informed that HANA was developed mainly for fast query processing. The older method of query processing takes a lot of time, i.e. the request has to go to the database and retrieve the information. But if HANA brings IMDB (in-memory database) technology, the data warehouse concept also addresses the waiting time of query processing.
    Could someone explain to me why HANA was introduced if the data warehouse concept already existed?
    Message was edited by: Tom Flanagan

    Hi Venkatesh Garlapat,
    SAP HANA is not only about reinventing the DW concepts again; it is a lot more than that.
    Google probably would have helped you, but let me point you to the right content, which can clarify your doubts. Please refer to:
    http://www.saphana.com/welcome
    About HANA | SAP HANA
    Features | SAP HANA
    SAP HANA Outperforms | SAP HANA
    SAP HANA Cloud Platform | SAP HANA
    Data Compatibility | SAP HANA
    I hope the above links help.
    Happy Learning.
    Regards
    Kumar

  • Enterprise data warehousing

    Hi BW gurus,
    Can you please forward this document to my mail ID [email protected]:
    TBW10 BI - Enterprise Data Warehousing 
    Full points will be assigned.
    Regards,
    Shahid

    Hi Syed,
    I sent you the EDW document; please check and confirm.
    Regards
    Chandru

  • Data modeling tool to design a dwh - SQL Server 2008 R2-2012

    Hi,
    I need to know if there is a Microsoft tool to model the data structures of a data warehouse. I generally use SQL Server diagrams, but I'd like to know about a more capable tool. Visio?
    Thanks

    Hi,
    You could download a 60-day free trial of Visio:
    http://office.microsoft.com/en-gb/visio/visio-professional-2013-free-trial-flowchart-software-FX103791368.aspx
    Have a look at a related link:
    http://social.technet.microsoft.com/Forums/sqlserver/en-US/620009b1-9625-4e1f-8574-f4049718ffe2/shapestemplates-bi-for-visio
    Kind Regards,
    Kieran.
    Kieran Patrick Wood http://www.innovativebusinessintelligence.com http://uk.linkedin.com/in/kieranpatrickwood http://kieranwood.wordpress.com/

  • Report Design Tool

    Hi all,
    Could you tell me about any tools for designing report layouts, or another way to design them, besides Quick Viewer, SAP Query, and Report Painter/Writer? Those can't group data and are very complex.
    Thank you!

    Crystal Reports can be used to create HTML files which can be loaded into a Java application. It also provides a C interface which can be accessed via JNI.
    Given that I did that four (5?) years ago, it might have gotten easier since.
    I used the following search string via www.google.com
    "Crystal Reports" +java
    And the first page I looked at seemed to suggest that more support for java is now available.

  • How to design master data? What are time scenarios? Explain.

    How do you design master data? What are time scenarios?
    Can anybody explain time scenarios?

    Hi,
    To cover it all would be huge, but here is a slice of it:
    Design    - Create design documents
    Prod.     - Make the product
    Quality   - Confirm the quality of the product
    Sales     - Market the product
    Purchase  - Procure the items
    Accounts  - Control the cost of manufacturing
    These steps should be understood as a general approach. To what extent they must be carried out depends
    on the actual situation and the experience of the project members involved.
    After deciding on the business process being dealt with, the basic steps to implementing a BI-based solution are:
    1. Focus on the structure of the information: develop a complete understanding of the underlying business processes and create an Entity Relationship Model (ERM) of the business process. The ERM is a function of the information.
    2. Focus on analytical needs and overcome model complexity: create a valid data model and translate the ERM into the multi-dimensional model (MDM) / star schema. The MDM is a function of the analytical processing.
    3. Build the solution as part of an integrated data warehouse: on the BI side, the star schemas are the InfoCubes. Translate the MDM / star schema into one or more InfoCubes.
    Coming to time: how real-world changes are dealt with, i.e. how the different time aspects are handled, is the most important topic with data warehouses.
    The attributes of a characteristic that will reside in its master data table are determined in the modeling phase. Each attribute can be defined individually as being time dependent: there is one 'time dependent' checkbox for each attribute in the 'attribute' tab page section.
    Time dependency of an attribute allows you to keep track of the changes over time in the relation between the characteristic and the time-dependent attribute values.
    In terms of technical implementation, two master data tables exist if we have both non-time-dependent and time-dependent attributes:
    one master data table stores all relations to non-time-dependent attributes (table name: /BIC/P<InfoObject name>), and
    one table stores the relations to time-dependent attributes (table name: /BIC/Q<InfoObject name>).
    The time-dependent attribute master data table has additional DATETO and DATEFROM system attributes. In queries, the different constellations are addressed using the key date (query properties). The validity attributes are not available for navigation.
    The text table, or rather the description attributes, may also be defined as time dependent.
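    To make the key-date idea concrete, here is a minimal sketch in plain Java (a conceptual illustration only, not SAP code; the class and field names are hypothetical): each attribute value carries a validity interval, mirroring DATEFROM/DATETO, and a lookup with a key date returns the value whose interval contains that date.

        import java.time.LocalDate;
        import java.util.List;
        import java.util.Optional;

        // Illustrative model of one time-dependent attribute value,
        // conceptually one row of a /BIC/Q-style table.
        record AttributeValue(String value, LocalDate dateFrom, LocalDate dateTo) {}

        class TimeDependentAttribute {
            private final List<AttributeValue> history;

            TimeDependentAttribute(List<AttributeValue> history) {
                this.history = history;
            }

            // Key-date access: return the value whose validity interval contains keyDate.
            Optional<String> valueAt(LocalDate keyDate) {
                return history.stream()
                        .filter(v -> !keyDate.isBefore(v.dateFrom()) && !keyDate.isAfter(v.dateTo()))
                        .map(AttributeValue::value)
                        .findFirst();
            }
        }

        // Example: an attribute whose value changes on 01-Jul-2024.
        // new TimeDependentAttribute(List.of(
        //         new AttributeValue("SMITH", LocalDate.of(2024, 1, 1), LocalDate.of(2024, 6, 30)),
        //         new AttributeValue("JONES", LocalDate.of(2024, 7, 1), LocalDate.of(9999, 12, 31))
        // )).valueAt(LocalDate.of(2024, 8, 15));   // -> Optional[JONES]

    This mirrors what the query key date does against the time-dependent table: it selects the attribute row whose DATEFROM-DATETO interval covers the chosen date.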
    SID tables with respect to master data:
    The SID table is always generated if an InfoObject is not defined as 'attribute only' (tab page General). This table is used if access to an InfoCube or DataStore object is via a navigational attribute or via a characteristic without attributes. Table name: /BIC/S<InfoObject name>.
    The non-time-dependent attribute SID table of a characteristic is used for access via non-time-dependent attributes. Table name: /BIC/X<InfoObject name>.
    The time-dependent attribute SID table of a characteristic is used for access via time-dependent attributes. Table name: /BIC/Y<InfoObject name>.
    with regards,
    hari kv

  • Designer to Data Modeler - triggers

    Hello,
    Our client still uses Designer to design its database model and is considering an upgrade. Since the last version you can import your repository from Designer into Data Modeler. When you select your objects before the import, you can't seem to select your triggers.
    After the import, the triggers are not present in the physical model at the table.
    Was anyone successful with this feature?

    Hello Joni,
    The Data Modeler documentation does indeed say that triggers are imported (http://www.oracle.com/technetwork/developer-tools/datamodeler/datamodelerfaq-167683.html#designer).
    I use the import functionality from Designer often, but the triggers are never included. I extract the triggers as PL/SQL modules and do not store them in DM.
    Joop

  • PowerDesigner's Data Architect, Enterprise Architect and Information Architect

    Does anyone have the product differences between PowerDesigner's Data Architect, Enterprise Architect and Information Architect?

    Dear Rahaman,
    Please refer to the PD 16.5 packaging options below and you can't miss the differences.
    DataArchitect\DM includes Shell/Core*, XML, PDM, LDM
    DataArchitect\RE  includes Shell/Core*, XML, PDM, LDM, Repository
    Information Architect includes Shell/Core*, XML, PDM, LDM, Repository, DMM, CDM, BPM
    EnterpriseArchitect includes Shell/Core*, XML, PDM, LDM, Repository, DMM, CDM, BPM, OOM, EAM, EA Frameworks
    ** Shell & Core includes UI, Gap and Impact Analysis Tools, Reporting Tools, RQM, Free Model, and other common components
    There exists different modeling features you can use in each of packages in PowerDesigner 16.
    If you want to use a repository for storing models and sharing them with other modelers, you should have at least DataArchitect/RE option.
    And, If you should use OOM, then it is required that you have the EnterpriseArchitect option.
    Best Regards
    Jerry

  • Are there any data upload tools other than BDC?

    Are there any data upload tools other than BDC?

    Hi Hassan,
    LSMW
    http://sapabap.iespana.es/sapabap/manuales/pdf/lsmw.pdf
    Direct Input
    http://help.sap.com/saphelp_di471/helpdata/EN/fa/097174543b11d1898e0000e8322d00/content.htm
    BAPIs
    Example:
    BAPI_QUOTATION_CREATEFROMDATA2 - Customer Quotation: Create Customer Quotation
    BAPI_PO_CREATE1 - Create Purchase Order
    Please close your previous threads if you have received the answers.
    Regards,
    AS

  • Design pattern / data loading solution

    Hello all!
    I have been working on a few projects which involve loading data, sometimes remotely, sometimes locally, sometimes JSON, sometimes XML. The problem I am having is that, due to the speed of development and the changing minds of various clients, I am finding my designs are too rigid and I would like them to be more flexible. I have been trying to think of a reusable solution to data loading and would like some advice, as I imagine many of you out there have had the same problem.
    What I would like to do is create a generic LoadingOperation abstract class, which has member variables of type Parser and Loader, which have parse() and loadData() methods respectively. Parser and Loader are interfaces, and classes that implement them could be XMLParser and JSONParser, LocalLoader and RemoteLoader, etc. With something like this I would like to have a new class extending LoadingOperation for each thing to be loaded, whether that's a local XML file, remote JSON, or whatever.
    The problem is that a specific Parser implementation cannot return custom data types without breaking the polymorphic behavior of the LoadingOperation class. I have been messing around with generics and declaring subclasses of LoadingOperation like
    class SpecificLoader extends LoadingOperation<CustomDataType>
    and doing similar things with Parser classes, but this seems a bit weird.
    Does anyone have any suggestions on what I'm doing wrong or could be doing better? I want to be able to react quickly to changing specifications (ignoring the fact that they shouldn't be changing that much!) and have a logical separation of code, etc.
    thanks for any help!
    PS: I have also asked this question here: [http://stackoverflow.com/questions/4329087/design-pattern-data-loading-solution]

    rackham wrote:
    [...] Does anyone have any suggestions on what I'm doing wrong or could be doing better? I want to be able to react quickly to changing specifications (ignoring the fact that they shouldn't be changing that much!) and have a logical separation of code, etc.
    That depends on the specifics.
    The fact that it seems like processes are similar doesn't mean that they are in fact the same. My code editor and Word both seem to be basically the same, but I am rather sure that generalizing between the two would be a big mistake.
    And I speak from experience (parsing customer data and attempting to generalize the process).
    The problem with attempting to generalize is that you may generalize functionality that is not in fact the same. You then end up with conditional logic all over the place to deal with differences depending on the users; rather than saving time, that actually costs time because the code becomes more fragile.
    That doesn't mean it isn't possible, just that you should ensure that it is in fact common behavior before implementing anything.
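    For reference, here is a minimal sketch of the generic structure the original poster describes (the type names are the poster's own or hypothetical, and, as the reply above points out, whether this much generalization pays off depends on how similar the loads really are):

        // Produces the raw payload (e.g. a local file read or a remote fetch).
        interface Loader {
            String loadData();
        }

        // Turns the raw payload into a typed result (e.g. JSON or XML parsing).
        interface Parser<T> {
            T parse(String raw);
        }

        // Generic operation: the type parameter keeps the result strongly typed,
        // so callers that only need a LoadingOperation<T> stay polymorphic.
        abstract class LoadingOperation<T> {
            private final Loader loader;
            private final Parser<T> parser;

            protected LoadingOperation(Loader loader, Parser<T> parser) {
                this.loader = loader;
                this.parser = parser;
            }

            public T execute() {
                return parser.parse(loader.loadData());
            }
        }

        // One concrete combination, e.g. a local JSON file producing a CustomDataType:
        // class SpecificLoader extends LoadingOperation<CustomDataType> {
        //     SpecificLoader() {
        //         super(new LocalLoader("data.json"), new JsonParser<>(CustomDataType.class));
        //     }
        // }

    Each subclass then only fixes the type parameter and wires a Loader to a Parser; any per-source conditional logic creeping in is a sign, as noted above, that the behavior is not really common.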
