Best practices in InfoProvider & query design for access by a BO Universe

Hello Experts,
Are there any best practices, identified by practitioners or suggested by SAP, for the development of InfoProviders and queries for access by a BO Universe?
Best practices should be from the perspective of performance, design simplicity, adaptability to change, etc.
Appreciate your help.
Regards,
Pritesh.
Edited by: pritesh prakash on Jul 19, 2010 10:51 AM

Thanks Suresh.
My project plan is to build InfoCubes and queries, on which Universes will then be built. Thus I am looking for do's and don'ts when designing InfoCubes and queries so that there won't be any issues (performance or otherwise) when they are accessed by a Universe built on top.
I hope I have made it clearer now.
Regards,
Pritesh.

Similar Messages

  • OBIEE Best Practice Data Model/Repository Design for Objectives/Targets

    Hello World!
    We are faced with a design question that has become somewhat difficult and we need some help. We want to be able to compare side-by-side actual measures with their corresponding objectives/targets. Sounds simple. But our objectives are static (not able to be aggregated), multi-dimensional, and multi-level. We need some best practice tips on how to design our data model and repository properly so that we can see the objective/target for a measure regardless of the dimensions that are used in the criteria and regardless of the level.
    Here are some more details:
    Example of an existing objective table:

    Dimension1   Dimension2   Dimension3   Obj1   Obj2   Quarter
    NULL         NULL         NULL         .99    1.8    1Q13
    DIM1VAL1     NULL         NULL         .99    2.4    1Q13
    DIM1VAL1     DIM2VAL1     NULL         .98    2.41   1Q13
    DIM1VAL1     DIM2VAL1     DIM3VAL1     .97    2.3    1Q13
    DIM1VAL1     NULL         DIM3VAL1     .96    1.9    1Q13
    NULL         DIM2VAL1     NULL         .97    2.2    1Q13
    NULL         DIM2VAL1     DIM3VAL1     .95    2.0    1Q13
    NULL         NULL         DIM3VAL1     .94    3.1    1Q13
    - Right now we have quarterly objectives set using 3 different dimensions. So, if an author were to add one or more (or zero) dimensions to their criteria for a given measure, they could get back a different objective. They could add Dimension1 and get 99%. They could add Dimension1 and Dimension2 and get 98%. They could add all three dimensions and get 97%. They could add zero dimensions (highest grain) and get 99%. Using our existing structure, if we were to add a new dimension to the mix, the possible combinations would grow dramatically. (Not flexible.)
    - We would like our final solution to be flexible enough so that we could view objectives with altogether different dimensions and possibly get different objectives.
    - We currently have 3 fact tables with 3+ conformed dimension tables and a few unique dimension tables.
    Could anyone share a similar situation where you have implemented a data model structure with the proper repository joins to handle showing side-by-side objectives/targets where the objectives were static and could be displayed at differing levels with flexible dimensions as described?
    Any help would be greatly appreciated.

    Hi, yes, this suggestion is nice. First configure the sensors (activity or variable), then configure the sensor action as a JMS topic, which will in turn insert the data into a DB. Or, when you configure the sensor action as a DB, the data goes to the Oracle Reports schema. Is there any chance of altering the DB target, i.e. by changing config files, so that the data does not go to that Reports schema and instead goes to a custom schema created by a user? I don't know if it can be done. My problem is that when I configure the JMS topic for sensor actions, I see blank data coming through; for some reason or other the data is not getting posted. I have used an ESB with a routing service based on the schema which I am monitoring. Can anyone help?

  • What are the best practices to migrate VPN users for an inter-forest migration?

    What are the best practices to migrate VPN users for an inter-forest migration?

    It depends on various factors. There is no "generic" solution or best practice recommendation. Which migration tool are you planning to use?
    Quest (QMM) has a VPN migration solution/tool.
    ADMT - you can develop your own service-based solution if required. I believe it was mentioned in my blog post.
    Santhosh Sivarajan | Houston, TX | www.sivarajan.com
    ITIL,MCITP,MCTS,MCSE (W2K3/W2K/NT4),MCSA(W2K3/W2K/MSG),Network+,CCNA
    This posting is provided AS IS with no warranties, and confers no rights.

  • Best practice standard user access test for WIN2012 AD

    What is the best practice standard user access test for WIN2012 AD?

    Hello,
    as before, add a computer to the domain and log on with a domain user account to the computer.
    You should be able from the client machine to open the shared folders on the DCs, either with:
    \\DCName\sysvol
    \\DCName\netlogon
    or
    \\NetBiosDomainName\sysvol
    \\NetBiosDomainName\netlogon
    Best regards
    Meinolf Weber
    MVP, MCP, MCTS
    Microsoft MVP - Directory Services
    My Blog: http://blogs.msmvps.com/MWeber
    Disclaimer: This posting is provided AS IS with no warranties or guarantees and confers no rights.

  • Quick question regarding best practice and dedicating NICs for traffic separation.

    Hi all,
    I have a quick question regarding best practice and dedicating NICs for traffic separation for FT, NFS, iSCSI, VM traffic etc. I get that it's best practice to try and separate traffic where you can, especially for things like FT, however I just wondered if there was a preferred method to achieving this. What I mean is ...
    -     Is it OK to have everything on one switch but set each respective portgroup to have a primary and failover NIC, i.e. FT, iSCSI and all the others fail over? (This would sort of give you a backup in situations where you have limited physical NICs.)
    -    Or should I always aim to separate things entirely with their own respective NICs and their own respective switches?
    During the VCAP exam for example (not knowing in advance how many physical NICs will be available to me), how would I know which stuff I should segregate on its own separate switch? Is there some sort of ranking order of priority/importance? FT, for example, I would rather not stick on its own dedicated switch if I could only afford to give it a single NIC, since this to me seems like a failover risk.

    I know the answer to this probably depends on however many physical NICs you have at your disposal, however I wondered if there are any golden 100% rules, for example that FT must absolutely be on its own switch with its own NICs even at the expense of reduced resiliency should the absolute worst happen? Obviously I know it's also best practice to separate NICs by vendor and hosts by chassis and switch etc.

  • How to search in BI 7.1 query designer for the required key figures & chars

    Hi All,
    Can anyone please tell me how to search in the BI 7.1 Query Designer for the required key figures & characteristics. I have a query being built on a MultiProvider which has many cubes.
    So, I have a huge list of key figures and characteristics, and I am not able to search for the required one by name or technical name.
    How can we search and pick the required object from the enormous list of the MultiProvider in the BEx Query Designer?
    Thanks
    Phani

    There is no search feature available. You have to make an educated guess as to which dimension your InfoObject might sit under and select it.

  • See & in query Designer for variable

    Hi guys,
    When I turn technical names on in the Query Designer, sometimes I see & at the front and end of a variable, and sometimes I don't. I was wondering why.

    I see & in the Query Designer for the variables. Sometimes, for the same variable, I don't see the & sign at the front and back.
    This is what I have observed, which in no way is a complete assessment:
    When you have multiple variables on the active hierarchy characteristic and you delete one, you see '&' in the other variables.

  • Best practice BW Query design for Crystal Reports integration

    Hi all,
    I am looking for a guide on best practices when designing a BW Query to be used as the data foundation for a Crystal Report.
    The scenario is that I am responsible for developing the Crystal Reports part, but not the BW Query part; therefore I would like to provide a list of best practices to the person who is responsible for the Query, and in this way make sure that the integration will work as well as possible. The setup is of course using the BO Integration Kit for SAP.
    An example is how to use authorization variables in the query to provide data security. This is just one example; there are probably a number of other things to be aware of. A document containing suggestions for best practices is what I am looking for, or if such a document does not exist, input on what should be on such a list.
    Thank you in advance.
    Regards,
    Rasmus

    Hi Rasmus,
    in regards to the best practices for Crystal Reports, you can leverage all the knowledge you have on query design today already. If you are not the person designing the query, I think it is important to make sure the people designing the queries understand how Crystal Reports leverages the elements of the BI query.
    /people/ingo.hilgefort/blog/2008/02/19/businessobjects-and-sap-part-2
    You should try to put as much as possible into the BI query from the logic point of view.
    And you can also build common BI queries - there is no need to build a separate BI query for each report.
    Ingo

  • [XI 3.1] BEST PRACTICE method of Oracle connection for RPTs on Linux

    Business Objects XI (3.1) - SP3.
    Running on Red Hat Enterprise Linux OS.
    7,000+ Crystal Reports 2008 *.rpt objects ONLY (No Universe / No WebI).
    All reports connecting to Oracle 10g databases.
    ==================
    In the past, all of this infrastructure was running on Windows Server OS and providing the database access via a Named ODBC connection (eg. "APP_DATA".)
    This made it easy to manage as all the Report Developers had a standard System DSN called "APP_DATA" which was the same as the System DSN name on all of our DEV, TEST/UAT, and PROD servers for Business Objects.
    When we wanted to move/promote a *.rpt file from DEV to PROD we did not have to change any "Database Connection" info, as it was all taken care of by pointing the System DSN called "APP_DATA" at a different physical Oracle server at the ODBC level.
    Now, that hardware is moving from Windows OS to Red Hat Linux and we are trying to determine the Best Practices (and Pros/Cons) of using one of the three methods below to access the Oracle database for our *.rpts....
    1.) Oracle Native connection
    2.) ODBC connection
    3.) JDBC connection
    Here's what we have determined so far -
    1a.) Oracle Native connection should be the most efficient method of passing SQL-query to the DB with the fewest issues and best speed [PRO]
    1b.) Oracle Native connection may not be supported on Linux - http://www.forumtopics.com/busobj/viewtopic.php?t=118770&view=previous&sid=9cca754b468fc67888ab2553c0fbe448 [CON]
    1c.) Using Oracle Native would require special-handling on the *.rpts at either the source-file or the CMC level to change them from DEV -> TEST -> PROD connection. This would result in a lot more Developer / Admin overhead than they are currently used to. [CON]
    2a.) A 3rd-Party Linux ODBC option may be available from EasySoft - http://www.easysoft.com/products/data_access/odbc_oracle_driver/index.html - which would allow us to use a similar Developer / Admin overhead to what we are used to. [PRO]
    2b.) Adding a 3rd-Party Vendor into the mix may lead to support issues if we have problems with the results or speeds of our queries. [CON]
    3a.) JDBC appears to be the "de facto standard" when running Oracle SQL queries from Linux. [PRO]
    3b.) There may be issues with results or speeds of our queries when using JDBC. [CON]
    3c.) Using JDBC requires the explicit-IP of the Oracle server to be defined for each connection. This would require special-handling on the *.rpts at either the source-file (and NOT the CMC level) to change them from DEV -> TEST -> PROD connection. This would result in a lot more Developer / Admin overhead than they are currently used to. [CON]
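    To make 3c concrete, here is a minimal Java sketch contrasting the two URL styles (the host names and the APPDATA service name are hypothetical): a thin-driver URL embeds the physical host, while an OCI/native-style URL references only a TNS alias that each server's tnsnames.ora resolves.

        public class OracleUrls {
            public static void main(String[] args) {
                // Thin JDBC: the physical host is baked into every connection
                // string, so promoting a report from DEV to PROD means editing
                // the URL (the overhead described in 3c).
                String thinUrl = "jdbc:oracle:thin:@//prod-oracle.example.com:1521/APPDATA";
                // OCI/native: the string carries only a TNS alias; tnsnames.ora
                // on each server decides which physical database it points to.
                String ociUrl = "jdbc:oracle:oci:@APP_DATA";
                System.out.println(thinUrl);
                System.out.println(ociUrl);
            }
        }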
    ==================
    We would appreciate some advice from anyone who has been down this road before.
    What were your Best Practices?
    What can you add to the Pros and Cons listed above?
    How do we find the "sweet spot" between quality/performance/speed of reports and easy-overhead for the Admins and Developers?
    As always, thanks in advance for your comments.

    Hi,
    I just saw this article and I would like to add some infos.
    First, you can quite easily reproduce the same way of working as with the ODBC entries by playing with the Oracle name resolution on the server. By changing some files (sqlnet.ora, tnsnames.ora, ...) you can define a different Oracle server for a specific name that will be the same across all environments.
    The database name will be resolved differently depending on the environment and will therefore access a different database.
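    For illustration, a minimal tnsnames.ora sketch of this idea (host and service names are hypothetical); every environment keeps the same APP_DATA alias but points it at its own server:

        # tnsnames.ora on the DEV Business Objects server
        APP_DATA =
          (DESCRIPTION =
            (ADDRESS = (PROTOCOL = TCP)(HOST = dev-oracle.example.com)(PORT = 1521))
            (CONNECT_DATA = (SERVICE_NAME = APPDATA))
          )
        # The PROD server defines the same APP_DATA alias against its own host,
        # so reports promoted from DEV to PROD need no connection changes.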
    The second option is to change the connection in the .rpt files in an automated way with a tool like the Schedule Manager. This is an additional web application to deploy that can change the connection settings on thousands of .rpt reports in a few clicks. You can find it here:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/80af7965-8bdf-2b10-fa94-bb21833f3db8
    The last option is to do it with a small SDK script; for this purpose, a few lines of code can change all the reports in one pass.
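    As a rough sketch only of such an SDK script, assuming the XI 3.1 Java Enterprise SDK (the CMS host, credentials, and the exact repointing calls are placeholders/assumptions to verify against the SDK documentation for your version):

        import com.crystaldecisions.sdk.framework.CrystalEnterprise;
        import com.crystaldecisions.sdk.framework.IEnterpriseSession;
        import com.crystaldecisions.sdk.occa.infostore.IInfoObject;
        import com.crystaldecisions.sdk.occa.infostore.IInfoObjects;
        import com.crystaldecisions.sdk.occa.infostore.IInfoStore;

        public class RepointReports {
            public static void main(String[] args) throws Exception {
                // Log on to the CMS (placeholder host and credentials)
                IEnterpriseSession session = CrystalEnterprise.getSessionMgr()
                        .logon("Administrator", "password", "cms-host:6400", "secEnterprise");
                IInfoStore infoStore = (IInfoStore) session.getService("InfoStore");
                // Fetch all Crystal Reports objects from the repository
                IInfoObjects reports = infoStore.query(
                        "SELECT * FROM CI_INFOOBJECTS WHERE SI_KIND='CrystalReport'");
                for (Object o : reports) {
                    IInfoObject report = (IInfoObject) o;
                    // The datasource update itself would go here, e.g. via the
                    // report plugin's logon properties -- verify the exact
                    // interfaces (IReport / report logons) for your SDK version.
                    System.out.println("Would repoint: " + report.getTitle());
                }
                infoStore.commit(reports); // persist any changes made above
                session.logoff();
            }
        }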
    After some implementations on Linux against Oracle databases, I would also prefer the native connection. ODBC and JDBC are deprecated ways to connect to the database. You can use DataDirect connectors, which are quite good, but at volume you will see the difference.

  • Best Practice: A J2EE Blue-Print for a Typical Web App

    Consider a typical synchronous Struts-based Web application which does a simple DB search and post. What are some of the main patterns and components that should be used if following the "industry best practices"?
    Does the following flow seem accurate?
    The Struts Action creates a TransferObject and passes it to a Business Delegate. The Delegate finds the appropriate BusinessObject, the BusinessObject uses the Data Access Object, the CRUD operation happens, and the result is sent back to the Action in the same TransferObject.
    Which of these components needs an interface?
    What's the best way for this components to interact with each other (factory, etc.)?
    Message was edited by:
    kmkiani

    There are 3 tiers in a Java EE application. (Presentation, Business, Integration).
    The BusinessDelegate in this scenario would be a Presentation-tier business delegate. This guy would interact with a Session Facade who lives on the Business-tier. The SessionFacade is the abstraction on the Business-tier and the Business Delegate is the abstraction on the Presentation-tier. It is these guys that have direct communication. This design enables low coupling between the actual implementations of each area. If done properly, you could go from EJB to Web Service to POJO business models without ever having to change anything in the Presentation-tier.
    These object-oriented design patterns are primarily for Enterprise applications with extensive Quality-of-Service requirements.
    In your scenario, the Presentation-tier would contain a MVC-based web application, i.e. Struts. The business model and business/domain requirements would be implemented in the Business-tier.
    Presentation Tier - Struts Web Application
    Business Tier - (EJB | POJO | WEB SERVICES) Application
    Integration Tier - (Relational Database | File System | XML Database | EIS)
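    For what it's worth, a minimal sketch of that delegate/facade interaction (all class names are illustrative, not from any specific framework):

        // Business-tier abstraction: could be backed by an EJB session bean,
        // a POJO service, or a web-service client without callers knowing.
        interface AccountFacade {
            String findAccountName(long accountId);
        }

        // One possible backing: a plain POJO implementation of the facade.
        class PojoAccountFacade implements AccountFacade {
            public String findAccountName(long accountId) {
                // BusinessObject/DAO CRUD work would happen behind this facade
                return "Account-" + accountId;
            }
        }

        // Presentation-tier business delegate: the only class the Struts Action
        // talks to. Swapping EJB/POJO/web-service implementations never touches
        // the presentation tier.
        class AccountDelegate {
            private final AccountFacade facade;
            AccountDelegate() {
                // A service locator or factory would normally resolve this.
                this.facade = new PojoAccountFacade();
            }
            String lookupName(long accountId) {
                return facade.findAccountName(accountId);
            }
        }

        public class DelegateDemo {
            public static void main(String[] args) {
                System.out.println(new AccountDelegate().lookupName(42L));
            }
        }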

  • What is best practice in FR? Original report access for users, or snapshots?

    Hi,
    can anyone please let me know what the best practice in FR is? Should I give my users access to the original reports, or snapshot access only? Users are not happy with snapshot access, mainly during the closing period. What are the complications if I give access to the original reports? I'm using the Batch Scheduler, but users are sometimes unable to see the data. What could be the reason for this?
    Thanks,
    PVR

    Hi,
    There are certainly many variables to look at. Server size is one concern; report size is another. Giving access to original reports is fine as long as concurrency and heavy usage don't take the servers down. That said, the reports whose originals will be opened up shouldn't run for 10 minutes per query. Having 10 users connected to such reports will probably take reporting services down, which makes the entire system useless. These types of reports are better scheduled and shown as snapshots. However, some of these reports might have time-dependent information which needs to be refreshed at query time. In this case you could either give access to the original reports or schedule the reports to run, say, every 2 hours.
    Cheers,
    Alp

  • Best Practice while configuring Traffic Manager for Azure Website

    Hi Team,
    I want to understand what the best practice is when configuring Traffic Manager for an Azure website.
    To give you the background, let me explain my requirement. I have one website for which 40% of the target audience would be in the East US, 40% in the UK, and the remaining 20% in Asia-Pacific.
    Now, what I want is a failover + performance based Traffic Manager configuration.
    My thinking:
    1) We need to create one website with 2 instances in each region (East US, East Asia, West US for example), so 3 deployments of the website in total (with a region-based URL for each).
    2) Create a traffic manager based on performance and add those 3 instances; that would become website-tmonperformance.
    3) Create a traffic manager based on failover and add those 3 instances; that would become website-tmonfailover.
    4) Create a traffic manager and ?? I don't know the criterion, but add both of the above traffic managers here and take your final URL for the end user.
    I am not sure (1) whether this is the right approach, and (2) if it is, which criterion we should select when creating the final traffic manager in step 4: round-robin, performance, or failover?
    After all this, if a user tries to access the site from the US, will Traffic Manager divert them to the US data centre, or will it wait for failover and until then serve them from East Asia, if East Asia is my 1st instance in the configuration?
    Regards, Brijesh Shah

    Hi Jonathan,
    Thanks for your quick reply. Actually, the question is a bit different. Let me explain it another way.
    I was asking for a recommendation from the Azure Traffic Manager team on whether my understanding is correct or not. We want performance with failover.
    So, we have one Azure website: take todoapp as an example. I deployed it in 3 different regions. Now, I want to have performance-based routing as well as failover-based routing, but obviously I can't give two URLs to my end user. So, on top of that I will require one more traffic manager. So:
    Step 1: I will create one traffic manager with the performance criterion, named TMForPerformance.trafficmanager.com, where I will add all those 3 instances (all are from different regions, so it won't create any issue).
    Step 2: I will create one more traffic manager with the failover criterion, named TMForFailover.trafficmanager.com, where I will add all those 3 instances (all are from different regions, so it won't create any issue).
    Step 3: I will create one final traffic manager with the performance criterion, named todoapp.trafficmanager.com, where I will add these two traffic managers instead of the 3 different regions' websites.
    Question 1) Is this the correct structure if we want to achieve performance with failover, or is there any better solution?
    Question 2) In step 3, which criterion should we select? Performance, round robin, or failover?
    Regards, Brijesh Shah
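    For reference, the nested layout sketched in steps 1-3 above can be expressed with today's az CLI (which postdates this discussion); resource names are placeholders and the exact flags should be checked against current documentation:

        # Child profile (failover routing is called "Priority" in the CLI); repeat per child
        az network traffic-manager profile create -g myRG -n TMForFailover \
            --routing-method Priority --unique-dns-name tmforfailover

        # Parent profile with performance routing
        az network traffic-manager profile create -g myRG -n todoapp \
            --routing-method Performance --unique-dns-name todoapp

        # Attach a child profile to the parent as a nested endpoint
        az network traffic-manager endpoint create -g myRG --profile-name todoapp \
            -n failover-child --type nestedEndpoints \
            --target-resource-id <child-profile-resource-id> \
            --min-child-endpoints 1 --endpoint-location "East US"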

  • Best practice for creating a corporate design (layout) with a Web Dynpro

    I now have a Web Dynpro application that has the functionality I want.
    But how do I create a corporate design (layout) for my customer?
    I heard that there is a special portal editor for making layouts?
    Do you know a best practice? Or is it best to work with the Web Dynpro Explorer itself?

    Hi,
    if the application parameter WDTHEMEROOT is not available together with the other application parameters in the Workbench, then you might have a lower support package...
    For the style sheet editor you need a portal installation; it belongs to the portal. There is no ABAP transaction for that, nor an ABAP-only style sheet editor.
    Regards, Heidi
    PS: See also CSS for WebDynpro ABAP without Portal
    Message was edited by:
            Heidi von Geisau

  • What are the best practices recommended by Microsoft to give access to an intranet portal externally from the internet

    Hi
    what are the best practices recommended by Microsoft?
    I have an intranet portal in my organization used by employees, and I want to give employees access externally from the internet as well.
    Can I use the same URL for employees to access the intranet portal internally and externally, or different URLs?
    Like (https://extranet.xyz.com.in) and (http://intranet.xyz.com.in);
    the internal URL accessed by employees is (http://intranet.xyz.com.in),
    and this portal is configured with claims-based authentication.
    Here I have an F5 for load balancing. A request from external users to the F5 is an HTTPS request, and from the F5 to the SharePoint server it is an HTTP request;
    the SharePoint server responds to the F5 over HTTP, but the F5 response to external users is HTTPS. So
    when I change the settings below in alternate access mappings, all links change to HTTPS,
    but only the authentication link still shows HTTP, and the authentication page does not open.
    adil

    Hi,
    One of my clients has an environment similar to yours, with an internal pair of F5s and a pair used for access from the internet.
    I am only going to focus on the method using an F5 Load Balancer and SSL Offloading. The setup of the F5 will not be covered in detail, but a reference to the documentation supporting SharePoint and SSL Offloading is provided below.
    Since you are going to be using SSL Offloading, you do not need to extend your WebApps to use separate IIS WebSites with unique IP addresses.
    Configure the F5 with SSL Offloading.
    Configure an internal AAM for SSL (HTTPS) for each WebApp that maps to the public HTTP FQDN AAM setting for each WebApp.
    Our environment has an additional component: we require RSA authentication for all internet-facing sites. So we have the extra step of extending the WebApp to a separate IIS WebSite and configuring RSA for each extended WebSite.
    Reference SharePoint F5 Configuration:
    http://www.f5.com/featured/video/ssl-offloading/
    -Ivan

  • Creating restrictions in BEx Query Designer for (1-a), where a is a variable.

    Dear All,
    >> Suppose for the variable in the BEx Query Designer, I've created a restriction for 0FISCPER using the variable 0P_PER (as 0P_PER - 1): by right-clicking, a dialog box appears; in it I selected 0P_PER3, then clicked on the offset variable, and when I set the offset variable it works for 0P_PER - 1, 0P_PER + 1, etc.
    But if I want to have (1 - 0P_PER), then please tell me how to define this restriction.
    Please explain the steps to create the (1 - 0P_PER) restriction in the BEx Query Designer in BW1.
    I'm using the SAP BI 7.2 GUI.
    Expecting your reply soon,
    with Regards,
    Jerald

    My requirement is that I want to have "1 - fiscal year". But this I could not define using offset variables, since with offsets we can only define "fiscal year + 1" or "fiscal year - 1".
    So there must be some other way to define "1 - fiscal year". Please help me define "1 - fiscal year" by giving me the steps.
    Thank you,
    with Regards,
    Jerald
