OBIEE Best Practice Data Model/Repository Design for Objectives/Targets

Hello World!
We are faced with a design question that has become somewhat difficult, and we need some help. We want to be able to compare actual measures side by side with their corresponding objectives/targets. It sounds simple, but our objectives are static (they cannot be aggregated) and are defined across multiple dimensions and at multiple levels. We need some best-practice tips on how to design our data model and repository so that we can see the objective/target for a measure regardless of which dimensions are used in the criteria and regardless of the level.
Here are some more details:
Example of the existing objective table:

Dimension1   Dimension2   Dimension3   Obj1   Obj2   Quarter
NULL         NULL         NULL         .99    1.8    1Q13
DIM1VAL1     NULL         NULL         .99    2.4    1Q13
DIM1VAL1     DIM2VAL1     NULL         .98    2.41   1Q13
DIM1VAL1     DIM2VAL1     DIM3VAL1     .97    2.3    1Q13
DIM1VAL1     NULL         DIM3VAL1     .96    1.9    1Q13
NULL         DIM2VAL1     NULL         .97    2.2    1Q13
NULL         DIM2VAL1     DIM3VAL1     .95    2.0    1Q13
NULL         NULL         DIM3VAL1     .94    3.1    1Q13
- Right now we have quarterly objectives set using 3 different dimensions, so if an author adds one or more (or zero) of those dimensions to the criteria for a given measure, they can get back a different objective (see the sketch after this list). They could add Dimension1 and get 99%. They could add Dimension1 and Dimension2 and get 98%. They could add all three dimensions and get 97%. They could add zero dimensions (the highest grain) and get 99%. With our existing structure, adding a new dimension to the mix would make the number of possible combinations grow dramatically. (Not flexible.)
- We would like the final solution to be flexible enough that we could view objectives sliced by an altogether different set of dimensions and possibly get back different objective values.
- We currently have 3 fact tables with 3+ conformed dimension tables and a few unique dimension tables.
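To make the lookup behavior we are after concrete, here is a minimal, illustrative sketch; the class, record, and method names are made up for the example and are not our actual schema. A NULL dimension on a row means "not qualified by that dimension", and the row chosen is the one whose non-NULL dimensions exactly match the dimensions used in the criteria:

    import java.util.List;
    import java.util.Optional;

    public class ObjectiveLookup {

        // One row of the objective table; a NULL dimension means "not qualified by that dimension".
        record ObjectiveRow(String dim1, String dim2, String dim3,
                            double obj1, double obj2, String quarter) {}

        // Both NULL, or both present and equal.
        private static boolean matches(String rowValue, String criteriaValue) {
            return (rowValue == null && criteriaValue == null)
                || (rowValue != null && rowValue.equals(criteriaValue));
        }

        // Pick the row whose non-NULL dimensions match exactly the dimensions used in the criteria.
        static Optional<ObjectiveRow> resolve(List<ObjectiveRow> rows, String quarter,
                                              String dim1, String dim2, String dim3) {
            return rows.stream()
                    .filter(r -> r.quarter().equals(quarter))
                    .filter(r -> matches(r.dim1(), dim1)
                              && matches(r.dim2(), dim2)
                              && matches(r.dim3(), dim3))
                    .findFirst();
        }

        public static void main(String[] args) {
            List<ObjectiveRow> rows = List.of(
                new ObjectiveRow(null, null, null, 0.99, 1.8, "1Q13"),
                new ObjectiveRow("DIM1VAL1", null, null, 0.99, 2.4, "1Q13"),
                new ObjectiveRow("DIM1VAL1", "DIM2VAL1", null, 0.98, 2.41, "1Q13"),
                new ObjectiveRow("DIM1VAL1", "DIM2VAL1", "DIM3VAL1", 0.97, 2.3, "1Q13"));

            // Dimension1 only -> .99; Dimension1 + Dimension2 -> .98; all three -> .97; none -> .99
            System.out.println(resolve(rows, "1Q13", "DIM1VAL1", null, null).map(ObjectiveRow::obj1).orElse(null));
            System.out.println(resolve(rows, "1Q13", "DIM1VAL1", "DIM2VAL1", null).map(ObjectiveRow::obj1).orElse(null));
        }
    }

The same NULL-means-any matching could equally be expressed in SQL or in repository logical table sources; the sketch is only meant to pin down the behavior we want, not to prescribe an implementation.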
Has anyone implemented a similar data model, with the proper repository joins, to show side-by-side objectives/targets where the objectives were static and could be displayed at differing levels with flexible dimensions as described above? If so, could you share how you structured it?
Any help would be greatly appreciated.

Hi, yes, this suggestion is nice. First configure the sensors (activity or variable), then configure the sensor action as a JMS topic, which will in turn insert the data into a database. Alternatively, when you configure the sensor action as a database, the data goes to the Oracle Reports schema. Is there any chance of altering that, i.e. changing the configuration files so that the data does not go to the Reports schema but instead to a custom schema created by a user? I don't know if it can be done. My problem is that when I configure the JMS topic for the sensor actions, I see blank data coming through; for some reason the data is not getting posted. I have used an ESB with a routing service based on the schema that I am monitoring. Can anyone help?
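One way to check whether anything is actually being published to the topic is a plain JMS subscriber along these lines. This is only a hedged sketch: the JNDI names jms/SensorTopicCF and jms/SensorTopic are placeholders for whatever your sensor action configuration actually uses.

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.Message;
    import javax.jms.MessageConsumer;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.jms.Topic;
    import javax.naming.InitialContext;

    public class SensorTopicProbe {
        public static void main(String[] args) throws Exception {
            // JNDI names below are placeholders; use the ones defined for your sensor action.
            InitialContext ctx = new InitialContext();
            ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/SensorTopicCF");
            Topic topic = (Topic) ctx.lookup("jms/SensorTopic");

            Connection conn = cf.createConnection();
            Session session = conn.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageConsumer consumer = session.createConsumer(topic);
            conn.start();

            // Wait up to 30 seconds for a single sensor message and dump its payload.
            Message msg = consumer.receive(30_000);
            if (msg == null) {
                System.out.println("No message arrived - the sensor action may not be publishing at all.");
            } else if (msg instanceof TextMessage) {
                System.out.println("Sensor payload: " + ((TextMessage) msg).getText());
            } else {
                System.out.println("Received non-text message: " + msg);
            }

            conn.close();
            ctx.close();
        }
    }

Note that a non-durable subscriber only sees messages published while it is connected, so keep the probe running while the BPEL instance fires the sensor. If the probe never sees a message, the sensor action itself is likely not publishing; if it does, the problem is more likely on the ESB/routing side.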

Similar Messages

  • Best practice in Infoprovider & Query design for access by BO Universe

    Hello Experts,
    Are there any best practices, identified by practitioners or suggested by SAP, for the development of InfoProviders and queries that will be accessed by a BO Universe?
    Best practices should be from the perspective of performance, design simplicity, adaptability to change, etc.
    Appreciate your help.
    Regards,
    Pritesh.
    Edited by: pritesh prakash on Jul 19, 2010 10:51 AM

    Thanks Suresh.
    Thanks Suresh.
    My project plan is to build InfoCubes and queries, which will then be used to build a Universe on top of them. Thus I am looking for do's and don'ts when designing InfoCubes and queries so that there won't be any issues (performance or otherwise) when they are accessed by the Universe built on them.
    Hope I have made it more clear now.
    Regards,
    Pritesh.

  • SQL Developer Data Modeler Repository

    Hi,
    I would like to know how to save all my applications into the Data Modeler Repository instead of doing it piece by piece and having to create a dmd file for every single application I imported into Data Modeler.
    In Oracle Designer, everything is in the repository and I can run reports against the tables and views belonging to that repository, so I can produce PDF documentation out of it. Can we do the same with Data Modeler?
    Thank you!

    Hello Anonymous,
    I'll start with your first comment - "I would like to know how to save all my applications into the Data Modeler Repository". The Repository is just for reporting purposes, as Dimitar points out, so saving all your applications in the Data Modeler is merely File - Save. This action saves the current design (with all its models) to the file system. The main premise here is that the Data Modeler is a file-based product, and all models and designs are saved locally to the file system. This is a positive move for many customers who are now building applications in Java or using any of the application development environments that work with a variety of files. They are taking all the artifacts and placing them under version control using open-source versioning tools like Subversion. Using the Data Modeler, they can do the same.
    For reporting you have a choice, one of which has already been explained. Creating a schema before you export the design is a one-off step; after that you can export new versions of the model to the same repository. The reason we did this was that many of the Designer audience wanted to write and run their own reports, so here you have a set of tables against which you can write whatever SQL queries you want.
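    Purely as an illustration of the "write your own SQL" option, here is a hedged JDBC sketch. The connection URL, the reporting schema account, and the DMRS_TABLES name are assumptions about a typical Data Modeler reporting repository and may differ in your release, so check the repository's own object list first.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.ResultSet;
        import java.sql.Statement;

        public class ReportingRepositoryList {
            public static void main(String[] args) throws Exception {
                // Connection details and object names below are placeholders/assumptions;
                // point them at the schema you exported the design into (Oracle JDBC driver on the classpath).
                String url = "jdbc:oracle:thin:@//dbhost:1521/orcl";
                try (Connection con = DriverManager.getConnection(url, "dm_reporting", "password");
                     Statement st = con.createStatement();
                     // DMRS_TABLES is assumed here; list the schema's tables to confirm the real names.
                     ResultSet rs = st.executeQuery("SELECT name FROM dmrs_tables ORDER BY name")) {
                    while (rs.next()) {
                        System.out.println(rs.getString(1));
                    }
                }
            }
        }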
    The other reporting option is to use the integrated reports in the product, so you do not have to export the design, open another product, or write SQL.
    Have you used the Design Rules in the Data Modeler? These are a set of "reports" for quality-assurance purposes. There are many of them, at all levels of the design, which verify that your model is defined according to a set of database rules and general business rules, and if you like, you can add your own rules.
    I attended ODTUG last week in LA and was really pleased to listen to a number of talks by well seasoned Designer users who have successfully transitioned to the Data Modeler and who had many positive experiences.
    Yes, the tools are different - but the one tool was not designed to replace the other; the Data Modeler does none of the application design and generation that Designer does. Those customers who have wanted to replace another tool with the Data Modeler (people are moving to the Data Modeler from different tools) have reported to me that they are pleased with the change.
    Regards
    Sue Harper
    Product Manager

  • Upcoming SAP Best Practices Data Migration Training - Chicago

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    SAP America, Downers Grove in Chicago, IL:
    November 3 – 5, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Services
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Services to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Services and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1.     Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2.     Data Services fundamentals – Architecture, source and target metadata definition. Process of creating batch Jobs, validating, tracing, debugging, and data assessment.
    3.     Installation and configuration of the SBOP Data Services – Installation and deployment of the Data Services and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    5.     Overview of Data Quality within the Data Migration process – A demonstration of the Data Quality functionality available to partners using the full Data Services toolset as an extension to the Data Services license.
    Logistics & How to Register
    Nov. 3 – 5: SAP America, Downers Grove, IL
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 8AM – 3PM
                     Address:
                     SAP America – Buckingham Room
                     3010 Highland Parkway
                     Downers Grove, IL USA 60515
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Services on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultant and IDoc experts involved in data migration and integration projects
    •         Functional experts that perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register please use the hyperlink below.
    http://service.sap.com/~sapidb/011000358700000917382010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Upcoming SAP Best Practices Data Migration Training - Berlin

    YOU ARE INVITED TO ATTEND HANDS-ON TRAINING
    Berlin, Germany: October 06 – 08, 2010
    Installation and Deployment of SAP Best Practices for Data Migration & SAP BusinessObjects Data Integrator
    Install and learn how to use the latest SAP Best Practices for Data Migration package. This new package combines the familiar IDoc technology together with the SAP BusinessObjects (SBOP) Data Integrator to load your customer's legacy data to SAP ERP and SAP CRM (New!).
    Agenda
    At the end of this unique hands-on session, participants will depart with the SBOP Data Integrator and SAP Best Practices for Data Migration installed on their own laptops. The three-day training course will cover all aspects of the data migration package including:
    1.     Offering Overview – Introduction to the new SAP Best Practices for Data Migration package and data migration content designed for SAP BAiO / SAP ERP and SAP CRM
    2.     Data Integrator fundamentals – Architecture, source and target metadata definition. Process of creating batch Jobs, validating, tracing, debugging, and data assessment.
    3.     Installation and configuration of the SBOP Data Integrator – Installation and deployment of the Data Integrator and content from SAP Best Practices. Configuration of your target SAP environment and deploying the Migration Services application.
    4.     Customer Master example – Demonstrations and hands-on exercises on migrating an object from a legacy source application through to the target SAP application.
    Logistics & How to Register
    October 06 – 08: Berlin, Germany
                     Wednesday 10AM – 5PM
                     Thursday 9AM – 5PM
                     Friday 9AM – 4PM
                     SAP Deutschland AG & Co. KG
                     Rosenthaler Strasse 30
                     D-10178 Berlin, Germany
                     Training room S5 (1st floor)
    Partner Requirements:  All participants must bring their own laptop to install SAP Business Objects Data Integrator on it. Please see attached laptop specifications and ensure your laptop meets these requirements.
    Cost: Partner registration is free of charge
    Who should attend: Partner team members responsible for customer data migration activities, or for delivery of implementation tools for SAP Business All-in-One solutions. Ideal candidates are:
    •         Data Migration consultant and IDoc experts involved in data migration and integration projects
    •         Functional experts that perform mapping activities for data migration
    •         ABAP developers who write load programs for data migration
    Trainers
    Oren Shatil – SAP Business All-in-One Development
    Frank Densborn – SAP Business All-in-One Development
    To register please follow the hyperlink below
    http://intranet.sap.com/~sapidb/011000358700000940832010E

    Hello,
    The link does not work. Is this training still available?
    Regards,
    Romuald

  • Best FREE data modeling tool

    What is the best FREE data modeling tool?  Thanks.
    Kalman Toth Database & OLAP Architect
    SQL Server 2014 Database Design
    New Book / Kindle: Beginner Database Design & SQL Programming Using Microsoft SQL Server 2014

    Hi Kalman,
    According to my knowledge, Microsoft Office Visio is helpful for building data models. For more information, please review this article:
    Create a Database Model (also known as Entity Relationship diagram).
    Besides, you can also use other third-party tools such as
    Erwin or SQL Power Architect to design SQL Server databases. However, Microsoft cannot make any representations regarding the quality, safety, or suitability of any third-party software or information.
    There is also a discussion about free data modeling tool in the following thread for your reference.
    https://social.msdn.microsoft.com/Forums/en-US/b70d2cdb-dc7f-4e89-a0ae-9dbf5687199e/free-data-modelling-tool?forum=databasedesign
    Thanks,
    Lydia Zhang

  • How to load best practices data into CRM4.0 installation

    Hi,
      We have successfully installed CRM4.0 on a lab system and now would like to install the CRM best practice data into it.
      If I refer to the CRM BP help site http://help.sap.com/bp_crmv340/CRM_DE/index.htm,
    it looks like I need to install at least the following in order to run it properly:
    C73: CRM Essential Information 
    B01: CRM Generation 
    C71: CRM Connectivity 
    B09: CRM Replication 
    C10: CRM Master Data 
    B08: CRM Cross-Topic Functions
    I am not sure where to start and where to end. At a minimum I need CRM Sales to start with.
    Is there just one installation CD, or a number of them? Also, are they available in the download area of service.sap.com?
    Appreciate the response.

    Of course you need to install the Best Practices configuration, or do your own config.
    Simply installing CRM 4.0 from the distribution CD/DVD will get you a plain-vanilla CRM system with no configuration and obviously no data. The Best Practices guide you through the process of configuring CRM, and even automate some tasks. If you use some of the CATT processes of the Best Practices you can even populate data in your new system (BP data, or replace the input files with your own data).
    In 12 years of SAP consulting, I have NEVER come across a situation where you simply install SAP from the distribution media and can start using it without ANY configuration.
    My advice is to work through the base configuration modules first, either by importing the BP config/data or by following the manual instructions to create the config/data yourself. Next, look at what your usage of CRM is going to be, for example Internet Sales, Service Management, et cetera, and then install the config for those modules.

  • What are the best practices to migrate VPN users for an inter-forest migration?

    What are the best practices to migrate VPN users for an inter-forest migration?

    It depends on various factors. There is no "generic" solution or best-practice recommendation. Which migration tool are you planning to use?
    Quest (QMM) has a VPN migration solution/tool.
    With ADMT you can develop your own service-based solution if required. I believe this was mentioned in my blog post.
    Santhosh Sivarajan | Houston, TX | www.sivarajan.com
    ITIL,MCITP,MCTS,MCSE (W2K3/W2K/NT4),MCSA(W2K3/W2K/MSG),Network+,CCNA
    Windows Server 2012 Book - Migrating from 2008 to Windows Server 2012
    This posting is provided AS IS with no warranties, and confers no rights.

  • Best practice standard User Access Test for WIN2012 AD

    What is the best-practice standard user access test for Windows Server 2012 AD?

    Hello,
    as before, add a computer to the domain and log on with a domain user account to the computer.
    You should be able, from the client machine, to open the shared folders on the DCs with either:
    \\DCName\sysvol
    \\DCName\netlogon
    or
    \\NetBiosDomainName\sysvol
    \\NetBiosDomainName\netlogon
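    If you want to script that check from the client instead of browsing, a small sketch along these lines works; the DCName/NetBiosDomainName placeholders are the same ones as above, and it should be run while logged on as the domain user you are testing.

        import java.io.File;

        public class ShareAccessCheck {
            public static void main(String[] args) {
                // Replace these UNC paths with your own DC / domain names.
                String[] shares = {
                    "\\\\DCName\\sysvol",
                    "\\\\DCName\\netlogon",
                    "\\\\NetBiosDomainName\\sysvol",
                    "\\\\NetBiosDomainName\\netlogon"
                };
                for (String share : shares) {
                    File f = new File(share);
                    // canRead() reflects whether the currently logged-on domain user can access the share.
                    boolean ok = f.isDirectory() && f.canRead();
                    System.out.println(share + " -> " + (ok ? "accessible" : "NOT accessible"));
                }
            }
        }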
    Best regards
    Meinolf Weber
    MVP, MCP, MCTS
    Microsoft MVP - Directory Services
    My Blog: http://blogs.msmvps.com/MWeber
    Disclaimer: This posting is provided AS IS with no warranties or guarantees and confers no rights.

  • Quick question regarding best practice and dedicating NICs for traffic separation.

    Hi all,
    I have a quick question regarding best practice and dedicating NICs for traffic separation for FT, NFS, iSCSI, VM traffic etc. I get that it's best practice to try and separate traffic where you can, especially for things like FT, however I just wondered if there was a preferred method of achieving this. What I mean is ...
    -     Is it OK to have everything on one switch but set each respective portgroup to having a primary and failover NIC, i.e. FT, iSCSI and all the others failover (this would sort of give you a backup in situations where you have limited physical NICs)?
    -    Or should I always aim to separate things entirely with their own respective NICs and their own respective switches?
    During the VCAP exam for example (not knowing in advance how many physical NICs will be available to me), how would I know which stuff I should segregate onto its own separate switch? Is there some sort of ranking order of priority/importance? FT, for example, I would rather not stick on its own dedicated switch if I could only afford to give it a single NIC, since this to me seems like a failover risk.

    I know the answer to this probably depends on however many physical NICs you have at your disposal, but I wondered if there are any golden 100% rules, for example that FT must absolutely be on its own switch with its own NICs even at the expense of reduced resiliency should the absolute worst happen? Obviously I know it's also best practice to separate NICs by vendor and hosts by chassis and switch etc.

  • Viewing data kept in "services for object" for several objects in alv grid?

    Dear Experts,
    Is it possible to view the data kept in "Services for Object" (the button at the top-left side of the title bar of the master data header screen of objects like Functional Location/Equipment/Notification/Order) for several objects in one list, such as an ALV grid?
    Please guide how to do it.
    Thanks and Regards,
    R N Sabat.

    Hi,
    The data stored via Services for Object can be viewed later on in change/display mode.
    Kapil

  • Oracle HTTP Server as web tier for OBIEE - Best Practices?

    Hi All,
    A bit of a cross-topic issue - I have a customer (IHAC) who wants to add a web tier (in the form of an additional DMZ server with Oracle HTTP Server installed) to an existing OBIEE installation.
    Are there any best practices regarding all the security aspects - installations, ports, SSL, certificates, keystores etc. ?
    Thank you in advance,
    Roman

    I am not using WebLogic; I'm doing a standalone setup.
    Standalone:
    FMW 11g Web Tier products are configured without a domain and administered from the command line. In this case, be sure to UN-check the selection for “Associate to WebLogic Domain” during the installation prompts, and uncheck the Web Cache option. Only OHS is installed.
    Is it possible to install the 64-bit Sun JDK on an AIX 7.1 machine?

  • Question - Best practice data source for Vs2008 and Crystal Reports 2008

    I have posted a question here
    CR2008 using data from .NET data provider (ADO.NET DATASET from a .DLL)
    but I think that perhaps I need general community advice on best practice with data sources.
    In Crystal reports I can choose the data source location from any number of connection types, eg ado.net(xml), com, oledb, odbc.
    Now in regard to the post, the reports have all been created in Crystal Reports 6.3, upgraded to Crystal XI, and now I'm using the latest and greatest. I wrote the Crystal Reports 6.3/XI reports back in the day to do the following: the reports use a function from a COM object which returns an ADO recordset, which is then consumed fine.
    So I don't want to rewrite all these reports, of which there are many.
    I would like to know if any developers are actually using .NET Class libraries to return ADO.NET datasets via the method call or if you are connecting directly to XML data via whatever source ( disk, web service, http request etc).
    I have not been able to eliminate the problem listed in the post mentioned above, which is that the Crystal Report is calling the .NET class library method twice before displaying the data. I have confirmed this by debugging the class lib.
    So any guidance or tips is appreciated.
    Thanks

    This is already being discussed in one of your other threads. Let's close this one out and concentrate on the one I've already replied to.
    Thanks

  • Best Practice: A J2EE Blue-Print for a Typical Web App

    Consider a typical synchronous Struts-based web application which does a simple DB search and post. What are some of the main patterns and components that should be used if following the "industry best practices"?
    Does the following flow seem accurate?
    A Struts Action creates a TransferObject and passes it to a Business Delegate. The Delegate finds the appropriate BusinessObject, the BusinessObject uses the Data Access Object, the CRUD operation happens, and the result is sent back to the Action in the same TransferObject.
    Which of these components need an interface?
    What's the best way for these components to interact with each other (factory, etc.)?
    Message was edited by:
    kmkiani

    There are 3 tiers in a Java EE application. (Presentation, Business, Integration).
    The BusinessDelegate in this scenario would be a Presentation-tier business delegate. This guy would interact with a Session Facade who lives on the Business-tier. The SessionFacade is the abstraction on the Business-tier and the Business Delegate is the abstraction on the Presentation-tier. It is these guys that have direct communication. This design enables low coupling between the actual implementations of each area. If done properly, you could go from EJB to Web Service to POJO business models without ever having to change anything in the Presentation-tier.
    These object-oriented design patterns are primarily for Enterprise applications with extensive Quality-of-Service requirements.
    In your scenario, the Presentation-tier would contain a MVC-based web application, i.e. Struts. The business model and business/domain requirements would be implemented in the Business-tier.
    Presentation Tier - Struts Web Application
    Business Tier - (EJB | POJO | WEB SERVICES) Application
    Integration Tier - (Relational Database | File System | XML Database | EIS)
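    As a rough sketch of that layering (the Order* names are invented for this example, and each top-level type would normally live in its own source file):

        // Transfer Object carried between the tiers.
        class OrderDto {
            private final long id;
            private final String status;

            OrderDto(long id, String status) {
                this.id = id;
                this.status = status;
            }

            long getId() { return id; }
            String getStatus() { return status; }
        }

        // Business-tier abstraction: the Session Facade contract the presentation tier codes against.
        interface OrderFacade {
            OrderDto findOrder(long orderId);
        }

        // Presentation-tier Business Delegate: hides lookup/remoting details from the Struts Action.
        class OrderDelegate {
            private final OrderFacade facade;

            // The facade could come from a JNDI lookup, an EJB reference, or a plain POJO factory;
            // the Action never needs to know which.
            OrderDelegate(OrderFacade facade) {
                this.facade = facade;
            }

            OrderDto findOrder(long orderId) {
                return facade.findOrder(orderId);
            }
        }

    Swapping the facade implementation (EJB, web service, plain POJO) then only changes the wiring behind the delegate, which is exactly the low coupling described above.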

  • Publish data model with designer

    Hi,
    so far I have used ERwin for data modelling. ERwin can generate HTML output consisting of the graphical data model plus hyperlinks on the tables. If you click on a table, a report pops up with all the details, e.g. table and column comments.
    I find this very useful to publish the data model.
    Is there something similar available with Oracle Designer?
    I found this announcement, but no product: http://otn.oracle.com/products/designer/pdf/ROB_Announce.pdf
    Best Regards,
    Jens

    If you have any recent release of Designer installed, then you have the ROB automatically installed. Select your Start menu in Windows and then navigate through your menus to the Designer sub menu and you'll see a link for setting up the ROB.
    The ROB was made available as part of Designer from March 2003. The latest release of Designer, viz. Designer 10g (9.0.4.4) and the equivalent 9i and 6i releases, all now available on OTN, have additional support for the ROB. This offers the ability to save an ERD or Server Model Diagram directly to the ROB.
    You do need to add the "hotspot" functionality manually. The ROB help gives a step-by-step approach to adding these hotspots. There are a number of demos on the Designer demo page on using the ROB and configuring the App Server to use it. http://otn.oracle.com/products/designer/demos.htm
    Regards
    Sue

Maybe you are looking for

  • Purchase order ship to party no at item level

    Hi All, I am able to get the Ship-To Party address from the ADRC table (I am getting ADRNR from EKPO), but my client requirement is to get the KUNNR and not the address. Is there any way to get KUNNR based on ADRNR at item level? I have searched through the (KNA1

  • How can I get Firefox to show ALL the pictures of people's profiles once I log onto myspace?

    Every time I log onto MySpace, on every profile I go onto to look at, there are at least some if not most of their pictures blank, it won't show me their pictures, even when I look at my profile pictures it won't show most of them, it'll be a blank s

  • Format & Formula Bars

    Question is there any way to enlarge the Formula Bar and Format Bar in Numbers? The tools are so small and difficult to read. Dick

  • Error on Micro SD While taking picture. Micro SD Undetected.

    Hello this is my first time using a Micro SD, my last phone was Xperia S. While I was taking picture I got an error, my phone does not detect my Micro SD and also my Laptop can't detect it. It's a 32GIG Transcender. My phone is fine but no more Micro

  • Copy and paste from an Adobe XI document

    I am unable to select a document or part of a document and paste it. When I use the selection tool it will highlight the portion selected, but as soon as I let up on the mouse the highlighting disappears and I am unable to copy or paste.