Universe on an OLTP database

Hi All,
I am using BO XIR2.
I wanted to know what is the best way/approach to create a universe on an OLTP database.
The database is highly normalized.
I pulled in the required tables and added joins. To resolve loops I included contexts and some aliases.
Now when I create a test report, nearly all of the objects turn out to be incompatible.
I suspect the approach I am using to build the universe is the one suited to a data warehouse (denormalised) database,
and that this is what is producing the bad structure.
Can anyone help me on the approach to be followed to create a universe on a highly normalised database?
Are there any restrictions in such a universe for reporting?
Please help.
Thanks in advance,
Edited by: BO User on Sep 16, 2009 10:58 PM

Hi,
The objects in a context should be under the same class.
Since you are using an OLTP database, you will not have a clean dimensional model the way you would in a data warehouse.
Include only the joins your reports actually need and remove unnecessary ones, as extra joins lead to more loops.
Use views wherever they help.
Avoid using too many derived tables in your universe, as they will lead to performance issues. Use a derived table when you cannot achieve something with the normal join approach and you need a quick solution.
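To illustrate the "use views" advice: a database view can pre-join a few normalized tables so the universe sees a single flat table (the table and column names below are invented for illustration):

```sql
-- Hypothetical sketch: flatten a normalized customer/order pair into one
-- reporting view so the universe can treat it as a single table.
CREATE VIEW v_customer_orders AS
SELECT c.customer_id,
       c.customer_name,
       o.order_id,
       o.order_date,
       o.order_total
FROM   customers c
JOIN   orders    o ON o.customer_id = c.customer_id;
```

Pushing the join into a view keeps the universe structure simple and avoids some loop/context problems, at the cost of hiding the join from the universe designer.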
Regards,
Santhosh

Similar Messages

  • In an Oracle RAC environment with an OLTP database: load-balancing options and advantages

    In an Oracle RAC environment hosting an OLTP database, what are the options for load balancing, along with their advantages?

    You can use a software load balancer.
    https://forums.oracle.com/forums/search.jspa?threadID=&q=Software+AND+Load+AND+Balancer&objID=c3&dateRange=all&userID=&numResults=15&rankBy=10001
    Installing and Configuring Web Cache 10g and Oracle E-Business Suite 12 [ID 380486.1]
    Thanks,
    Hussein

  • Purpose of a T-SQL HEAP structure in an OLTP database

    Hello,
    In an OLTP database when is a HEAP (table with no clustered indexes) to be preferred over a table with a clustered index?
    Same question but for OLAP database.
    TIA,
    edm2

    For an OLAP database I would add a clustered index to any table that needs to produce sorted results. That way the data is already pre-sorted (by the clustered index key), saving a lot of time when the query is actually run. This becomes more important as the number of rows returned by the query grows.
    I have also seen people create indexed views in an OLAP database to speed up aggregation.
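For concreteness, the two suggestions above might look like this in T-SQL (table and column names are invented for illustration):

```sql
-- Hypothetical sketch: a clustered index on the column most queries sort or
-- range-filter by, so results come back pre-sorted:
CREATE CLUSTERED INDEX IX_FactSales_OrderDate
    ON dbo.FactSales (OrderDate);
GO

-- An indexed view pre-computes an aggregation for OLAP-style queries:
CREATE VIEW dbo.v_SalesByDay
WITH SCHEMABINDING
AS
SELECT OrderDate,
       SUM(Amount)  AS TotalAmount,
       COUNT_BIG(*) AS RowCnt   -- COUNT_BIG(*) is mandatory in an indexed view
FROM   dbo.FactSales
GROUP BY OrderDate;
GO

-- The view is only "indexed" once it gets a unique clustered index:
CREATE UNIQUE CLUSTERED INDEX IX_v_SalesByDay
    ON dbo.v_SalesByDay (OrderDate);
```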
    Best Regards, Uri Dimant, SQL Server MVP
    http://sqlblog.com/blogs/uri_dimant/

  • Implementing Oracle discoverer on OLTP databases

    We have a database (520 tables) normalized down to a very fine grain, e.g. Person -> Phone -> PhoneType, Person -> Email -> EmailType.
    Because it is so thoroughly normalized, generating a report requires more than 10 joins. On top of this we need to link to a value-based hierarchy so that we can produce output by geography.
    The hierarchy is value-based and the data is maintained in the table recursively.
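A value hierarchy stored recursively, as described above, is usually walked in Oracle with CONNECT BY; a minimal sketch with invented table and column names:

```sql
-- Hypothetical sketch: each row points at its parent; walk the tree top-down.
SELECT LEVEL,
       LPAD(' ', 2 * (LEVEL - 1)) || region_name AS region_path
FROM   geo_hierarchy
START WITH parent_id IS NULL          -- roots of the hierarchy
CONNECT BY PRIOR region_id = parent_id;
```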
    1. My first question: can I implement Oracle Discoverer on OLTP databases?
    2. I guess Oracle Discoverer will not support more than 6 or 7 joins because it encounters fan-trap situations; am I correct?
    Please suggest/advise. Thanks in advance.
    Regards
    Krishna

    Hi Krishna,
    1) Yes you can (there are no laws against that).
    But keep in mind that an OLTP database is intended/tuned to process transactions, not to report on. So you might run into performance and/or modelling issues. Moreover, the "transactional users" might experience slower performance because of the "reporting users", and they won't like you for it...
    2) There is no limit on the number of joins. Running into fan-trap situations depends on the functional requirements and the way you set up the EUL for them, not (directly) on the number of joins.
    Regards,
    Sabine

  • Requirements for High-Load OLTP Database

    Hi guys!
    Need your Best Practise!
    I will install and configure a high-load OLTP database:
    5 million users
    500 transactions per second
    What requirements are needed?
    Do you have any papers or documents?

    Denis :) wrote:
    Hi guys!
    Need your Best Practise!
    I will install&configure High-Load OLTP Database.
    5 million users
    concurrent users?
    500 transactions per second
      1* select 5000*60*60/5000000 from dual
    SQL> /

    5000*60*60/5000000
    ------------------
                   3.6

    So each user does about 4 transactions per hour.
    How big is a single transaction?
    How much redo is generated every day?
    > What requirements is need?
    More hardware is more better!
    > Do you have any papers or documents?

  • OLTP Database Design...

    Dear Team,
    I would like to know some books/links that explain the concepts of OLTP database design.
    If any of the experts here have done this before, please share your experiences and ideas.
    Thanks
    nic...

    Nicloei W wrote:
    Dear Indra Budiantho
    The link is helpful, but it's asking me to buy the book, which would take time..
    You think this stuff can be reduced to an executive summary? Why do you think entire books are written on the subject? Entire college courses dedicated to it?
    > is there any better way out? some live examples, white papers??
    There are no shortcuts to learning complex subjects.
    Start by googling "Third Normal Form" or "Data Normalization".
    And I emphasize "start".
    >
    Regards
    nic

  • Adaptive Cursor Sharing and OLTP databases

    Hi all,
    I am reading a book about performance which asserts that, for OLTP environments, it is recommended to disable the Adaptive Cursor Sharing feature (setting the hidden parameter _OPTIMIZER_EXTENDED_CURSOR_SHARING_REL to NONE and CURSOR_SHARING to EXACT). The book recommends this to avoid the overhead related to the feature.
    I know that with this feature you can avoid the bind-peeking issue, as it creates more than one execution plan for the same query. So, is it really good practice to disable it?
    Thanks in advance.

    OK, thanks for pointing out.
    Getting back to your original question:
    > So, is it really a good practice to disable it?
    No, it is not, especially not without the approval of Oracle Support.
    Furthermore, I feel confirmed by that point from the book review by Charles Hooper as well as his additional points:
    The book states that in an OLTP type database, "we probably want to disable the Adaptive Cursor Sharing feature to eliminate the related overhead." The book then suggests changing the CURSOR_SHARING parameter to a value of EXACT, and the _OPTIMIZER_EXTENDED_CURSOR_SHARING_REL parameter to a value of NONE. First, the book should not suggest altering a hidden parameter without mentioning that hidden parameters should only be changed after consulting Oracle Support. Second, it is not the CURSOR_SHARING parameter that should be set to a value of EXACT, but the _OPTIMIZER_ADAPTIVE_CURSOR_SHARING parameter that should be set to a value of FALSE (see Metalink (MOS) Doc ID 11657468.8). Third, the blanket statement that adaptive cursor sharing should be disabled in OLTP databases seems to be an incredibly silly suggestion for any Oracle Database version other than 11.1.0.6 (this version contained a bug that led to an impressive number of child cursors due to repeated executions of a SQL statement with different bind variable values). (page 327)"
    http://hoopercharles.wordpress.com/2012/07/23/book-review-oracle-database-11gr2-performance-tuning-cookbook-part-2/
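Rather than changing parameters, one can first observe the feature using the standard dynamic views; a sketch for 11g (the SQL_ID is a placeholder to substitute):

```sql
-- Documented parameter (hidden "_" parameters should only be touched
-- with Oracle Support's approval):
SELECT name, value
FROM   v$parameter
WHERE  name = 'cursor_sharing';

-- Is a given statement actually generating bind-sensitive / bind-aware
-- child cursors, i.e. is adaptive cursor sharing kicking in at all?
SELECT sql_id, child_number, is_bind_sensitive, is_bind_aware
FROM   v$sql
WHERE  sql_id = '&sql_id';   -- substitute the SQL_ID you are investigating
```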

  • Configuring raid 0+1 for an oltp database of 1 terabyte size on centos 4.5

    Hi all,
    I have to configure RAID 0+1 for an OLTP database of 1 terabyte in size on CentOS 4.5.
    Could anyone suggest a step-by-step configuration guide or a link?
    Thanks and Regards
    Edited by: DBA24by7 on Mar 15, 2009 2:20 PM

    > it is centos 4.5 which is almost like redhat linux.
    And thus completely unsupported by Oracle - which begs the question of why anyone would bother to go to the expense of setting up a RAID configuration for an unsupported database?
    Anyway, you should be using RAID 1+0
    see here: http://www.acnc.com/04_01_10.html
    Paul... (lots of RAID questions today!)

  • Transaction Isolation Level to Read UnCommited in Non OLTP Database

    HI,
    We have a database that is not used for OLTP processing; it is an OLAP DB. The only operations on that DB are SELECT and incremental INSERT (for the DWH), no UPDATE/DELETE, and we perform ROLAP operations on it.
    By default the SQL Server isolation level is READ COMMITTED. As our DB is an OLAP SQL Server DB, we need to change the isolation level to READ UNCOMMITTED. We googled it, but we can only achieve this
    at the transaction level, with SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED,
    or with ALLOW_SNAPSHOT_ISOLATION ON or READ_COMMITTED_SNAPSHOT.
    Is there any other way to change the isolation level to READ UNCOMMITTED for the entire database, instead of doing it at the transaction level or enabling ALLOW_SNAPSHOT_ISOLATION or READ_COMMITTED_SNAPSHOT?
    Please use Marked as Answer if my post solved your problem and use Vote As Helpful if a post was useful.

    Hi,
    My first question would be: why do you want to change the isolation level to read uncommitted? Are you aware of the repercussions? You will get dirty, possibly wrong, data.
    The isolation level is basically associated with the connection, so it is defined per connection.
    >> Transaction level only by SET isolation Level TO Read UNCOMMITTED or ALLOW_SNAPSHOT_ISOLATION ON or READ_COMMITTED_SNAPSHOT
    Be cautious: READ UNCOMMITTED and the snapshot isolation levels are not the same. The former is a pessimistic isolation level and the latter are optimistic. The snapshot isolation levels are totally different from read uncommitted because they use row versioning. I guess you won't require snapshot isolation in an OLAP DB.
    Please read below blog about setting Isolation level Server wide
    http://blogs.msdn.com/b/ialonso/archive/2012/11/26/how-to-set-the-default-transaction-isolation-level-server-wide.aspx
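For reference, the alternatives discussed above look like this in T-SQL (the database name is invented; as far as I know there is no supported setting that makes READ UNCOMMITTED the default for an entire database):

```sql
-- Per-session: affects only the current connection.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

-- Database-wide alternative based on row versioning: readers are not blocked
-- by writers, but (unlike READ UNCOMMITTED) they never see dirty data.
-- Run while no other connections are active in the database.
ALTER DATABASE MyOlapDb SET READ_COMMITTED_SNAPSHOT ON;
```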

  • Error when setting up connection from Universe Designer to Oracle database

    Hello,
    I am attempting to create a connection to an Oracle 10 database in Universe Designer.  I am currently running Business Objects XI 3.1 SP3 on a 32 bit machine.  I have the Oracle client installed.  In Universe Designer when I attempt to create a new connection using the Oracle 10 client I receive the following error.
    CS:DBDriver failed to load : C:\Program Files\Business Objects\BusinessObjects Enterprise 12.0\win32_x86\dataAccess\connectionServer\dbd_oci.dll (The specified module could not be found.)
    The dbd_oci.dll file does exist in the directory listed above.  Has anyone encountered this issue?
    Thanks in advance.
    Brock

    Hello Brock,
              Do you have multiple SQL clients installed on your system? If so, uninstall the previous versions. Check if the PATH variable is set to correct bin path and then try testing it again.
    Regards,
    Sanjay

  • Issues with universes for AS400 DB2 databases

    I create Business Objects universes and Webi reports accessing data from AS400 DB2 databases (via ODBC). I have a couple of queries that I have been having trouble resolving. I hope someone in the forum is working on a similar set-up.
    My queries are as follows:
    1. If I have 2 different libraries with identical set of tables (physical files), after creating a universe in designer based on the tables in one library, is there an automated way to create another identical universe pointing to or referring to the 2nd library?  Every physical file and field structure within it are the same. The only difference is that they are located in 2 different libraries, containing different data.
    2. For physical files with 'multiple members', the reports generated by Business Objects seem to access only the data in the first member of the file. Is there a setting where this behavior can be overridden or bypassed? I think 'multiple members' is a characteristic that is unique to tables in AS400 DB2.
    Would really appreciate the help! Thanks!
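On question 2, one common workaround (sketched here with invented library, file and member names) is to expose each member through an SQL alias, since SQL interfaces normally see only the first member of a multi-member file:

```sql
-- DB2 for i: an SQL alias can target a specific member of a multi-member file.
CREATE ALIAS MYLIB.SALES_M2 FOR MYLIB.SALES (MEMBER2);

-- Reports (and the universe) can then query the alias as an ordinary table:
SELECT * FROM MYLIB.SALES_M2;
```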

    Since it doesn't appear the Online Capture for DB2 is supported, I am trying to work around this and do the Offline capture.
    I copied the files generated by the Offline Capture option (startDump.sh, getForeignKeys.sh, getProcedures.sh, and db2v9.ocp) to the Linux box where the DB2 database resides.
    After connecting to the DB2 Database (DB2DB1), I entered:
    sh startDump.sh DB2DB1 USRNAME USRPSWD
    to run the script startDump.sh. It produced files:
    Connect.dat
    Schemas.dat
    It also created a subfolder (DB2DB1) containing:
    checkConstraints.dat
    foreignKeys.dat
    indexes.dat
    primarykeys.dat
    routines.dat
    synonyms.dat
    tables.dat
    triggers.dat
    views.dat.
    My question is what do I do with these files? I believe the documentation said that file db2v9.ocp was supposed to be modified, but it wasn’t modified after I ran startDump.sh. Was I supposed to run getForeignKeys.sh and getProcedures.sh, or did startDump.sh already run these? Can anyone provide an example of how this is to be run? The documentation doesn't provide a DB2 example.
    Note: The DB2 Supplement: http://download.oracle.com/docs/cd/E15846_01/doc.21/e15286.pdf doesn't seem to discuss the offline capture. It doesn't reference the startDump.sh file at all.

  • Universe based on Unidata database advice

    Hi there,
    I am looking for advice to building a universe based on a unidata database
    What is the best way to connect to the database?
    Will I end up building lots of derived tables with loads of UNION statements?
    Are there any tools out there to create a view that Business Objects can use?
    regards
    James


  • OLTP Database and DW Database Creation

    Hi,
    Is there any specific parameter we need to take care during creation of Database Instance on Exadata Machine for OLTP and DW types?
    Thanks,

    Hi Friend
    Good Query.
    Recommended for DWH :
    1. Real Application Clusters
    2. Partitioning
    Recommended for OLTP :
    1. Real Application Clusters
    2. Advanced Compression
    and we should evaluate all the database options to determine their value for your customer's specific situation.
    In terms of database options: IORM (I/O Resource Manager) + DBRM (DB Resource Manager) play a very big role in prioritizing activities in OLTP and DWH environments based on load.
    Hope it helps...
    Thanks
    LaserSoft

  • Universe Builder - Invalid Oracle Database

    Post Author: LobiGabi
    CA Forum: Olap
    Hi All,
    I have an Oracle connection that I created in BO Designer.
    I would like to connect to Oracle cubes in Universe Builder, but I get the following message:
    Invalid Oracle Database. OLAP Package not detected. Connection aborted.
    What's the problem?
    LobiGabi

    Anyway .... this is how to resolve this issue:
    Note: Ensure that the SHARED_POOL_SIZE and JAVA_POOL_SIZE parameters are set to at least 150M each, and then restart the instance in MIGRATE mode and execute catpatch as detailed in the following steps:
    Before starting this process, notify users of database that the database will be unavailable for approximately 2 hours. Proceed with proper notification procedures before shutting down the database.
    1. sqlplus /nolog
    2. spool $ORACLE_HOME/rdbms/log/catpatch_ddmmyy.log
    3. connect / as sysdba
    4. alter system set shared_pool_size = 200M scope=spfile; (if necessary)
    5. alter system set java_pool_size = 150M scope=spfile; (if necessary)
    6. shutdown immediate
    7. startup migrate
    8. @$ORACLE_HOME/rdbms/admin/catpatch.sql
    9. spool off
    10. shutdown immediate
    11. startup
    12. (validate): select comp_name, version, status from dba_registry;

  • Where to find the Siebel OLTP Database and OBAW database after downloading

    Hi All,
    I downloaded Oracle Apps 7.9.6.1 (424114 KB) and I'm not able to see the following:
    1. Prebuilt dashboards and reports
    2. OLTP DB and DW DB
    Do I need to download them separately from somewhere?
    Thanks in Advance.
    Jaan

    Hi All,
    My objective is to query the transactional database and the data warehouse tables. Could someone please help me find these tables in OBIEE Apps? I'm not able to update rows in the repository either.
    Any help appreciated.
    Thanks
    Jaan
