Best practice to create a multi-tier (at least 3-level) table

What is the best practice for creating a multi-tier (minimum 3 levels) table? Could anyone provide a sample structure?
Thanks.

Can you be more specific as to what you are trying to do? What do you mean by a 3-level table?
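One common reading of a "3-level table" is a parent-child hierarchy spread across three tables linked by foreign keys. A minimal sketch, assuming that is what is meant (all table and column names are illustrative only):

CREATE TABLE REGION (
  REGION_ID   NUMBER PRIMARY KEY,
  REGION_NAME VARCHAR2(50)
);
CREATE TABLE COUNTRY (
  COUNTRY_ID   NUMBER PRIMARY KEY,
  COUNTRY_NAME VARCHAR2(50),
  REGION_ID    NUMBER NOT NULL REFERENCES REGION   -- level 2 points to level 1
);
CREATE TABLE CITY (
  CITY_ID    NUMBER PRIMARY KEY,
  CITY_NAME  VARCHAR2(50),
  COUNTRY_ID NUMBER NOT NULL REFERENCES COUNTRY    -- level 3 points to level 2
);

If every level is the same kind of entity, a single self-referencing table (a PARENT_ID column with a foreign key back to the same table) handles any depth instead.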

Similar Messages

  • What is the best practice for creating a primary key on a fact table?

    What is the best practice for the primary key on a fact table?
    1. Using composite key
    2. Create a surrogate key
    3. No primary key
    In the documentation, I can only find: "From a modeling standpoint, the primary key of the fact table is usually a composite key that is made up of all of its foreign keys."
    http://download.oracle.com/docs/cd/E11882_01/server.112/e16579/logical.htm#i1006423
    I also found a relevant thread stating that a primary key on the fact table is necessary.
    Primary Key on Fact Table.
    But if no business requirement enforces uniqueness of the records and there is no materialized view, do we still need a primary key? Is there any other adverse effect if there is no primary key on the fact table? And are there any benefits from not creating a primary key?

    Well, the natural combination of the dimensions connected to the fact would be a natural primary key, and it would be composite.
    Having an artificial PK might simplify things a bit.
    Having no PK leads to a major mess. A fact should represent a business transaction or some general event. If you're loading data, you want to be able to identify the records that have been processed. Also, without a PK, if you forget to create a unique key, access to this fact table will be slow. Plus, having no PK means that if you want to use different tools, like the Data Modeller in JBuilder or the OWB insert/update functionality, they won't work, since there's no PK. Defining a PK for every table is good practice. Not defining a PK is asking for a load of problems, from performance to functionality and data quality.
    Edited by: Cortanamo on 16.12.2010 07:12
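    A hedged sketch of the two options discussed above; the dimension tables (DATE_DIM, PRODUCT_DIM, STORE_DIM) and all column names are illustrative assumptions, not from the thread:
    -- Option 1: composite primary key made up of the dimension foreign keys.
    CREATE TABLE SALES_FACT (
      DATE_ID    NUMBER NOT NULL REFERENCES DATE_DIM,
      PRODUCT_ID NUMBER NOT NULL REFERENCES PRODUCT_DIM,
      STORE_ID   NUMBER NOT NULL REFERENCES STORE_DIM,
      SALES_AMT  NUMBER,
      CONSTRAINT SALES_FACT_PK PRIMARY KEY (DATE_ID, PRODUCT_ID, STORE_ID)
    );
    -- Option 2 (alternative definition of the same fact): surrogate key from a
    -- sequence, plus a unique constraint on the natural dimension combination so
    -- duplicate loads are still rejected.
    CREATE TABLE SALES_FACT_SK (
      SALES_FACT_ID NUMBER PRIMARY KEY,              -- populated from a sequence
      DATE_ID       NUMBER NOT NULL,
      PRODUCT_ID    NUMBER NOT NULL,
      STORE_ID      NUMBER NOT NULL,
      SALES_AMT     NUMBER,
      CONSTRAINT SALES_FACT_SK_UK UNIQUE (DATE_ID, PRODUCT_ID, STORE_ID)
    );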

  • Best practices for creating and querying a history table?

    Suppose I have a table of name-value pairs, and I want to keep track of changes to them so that I can query the value of any pair at any point in time.
    A direct approach would be to use a schema like this:
    CREATE TABLE NAME_VALUE_HISTORY (
      NAME     VARCHAR2(...),
      VALUE    VARCHAR2(...),
      MODIFIED DATE
    );
    When a name-value pair is updated, a new row is added to this table with the date of the change.
    To determine the value associated with a name at a particular point in time, one uses a query like:
      SELECT * FROM NAME_VALUE_HISTORY
      WHERE NAME = :name
        AND MODIFIED IN (SELECT MAX(MODIFIED)
                           FROM NAME_VALUE_HISTORY
                          WHERE NAME = :name AND MODIFIED <= :time)
    My question is: is there a better way to accomplish this? What indexes/hints would you recommend?
    What about a two-table approach like this one? http://pratchev.blogspot.com/2007/05/keeping-history-data-in-sql-server.html
    Edited by: user10936714 on Aug 9, 2012 8:35 AM
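    For the point-in-time lookup above, a likely recommendation is a composite index that covers both the outer filter and the MAX(MODIFIED) subquery. A minimal sketch against the schema in the question (the index name is illustrative):
    -- One range scan on (NAME, MODIFIED) serves both WHERE NAME = :name and the
    -- MAX(MODIFIED) ... MODIFIED <= :time subquery.
    CREATE INDEX NAME_VALUE_HISTORY_IX
      ON NAME_VALUE_HISTORY (NAME, MODIFIED);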

    user10936714 wrote:
    There is one advantage... recording the change of a value is just one insert, and it is also atomic without the use of transactions.
    At the risk of being dumb, why is that an advantage? Oracle always and everywhere uses transactions so it's not like you're avoiding some overhead by not using transactions.
    If, for instance, the performance of reading the value of a name at a point in time is not important, then you can get by with just using one table - the history table.
    If you're not overly concerned with the performance implications of having the current data and the history data in the same table, rather than rolling your own solution, I'd be strongly tempted to use Workspace Manager to let Oracle keep track of the changes.
    You can create a table, enable versioning, and do whatever DML operations you'd like
    SQL> create table address(
      2    address_id number primary key,
      3    address    varchar2(100)
      4  );
    Table created.
    SQL> exec dbms_wm.enableVersioning( 'ADDRESS', 'VIEW_WO_OVERWRITE' );
    PL/SQL procedure successfully completed.
    SQL> insert into address values( 1, 'First Address' );
    1 row created.
    SQL> commit;
    Commit complete.
    SQL> update address
      2     set address = 'Second Address'
      3   where address_id = 1;
    1 row updated.
    SQL> commit;
    Commit complete.
    Then you can either query the history view:
    SQL> ed
    Wrote file afiedt.buf
      1  select address_id, address, wm_createtime
      2*   from address_hist
    SQL> /
    ADDRESS_ID ADDRESS                        WM_CREATETIME
             1 First Address                  09-AUG-12 01.48.58.566000 PM -04:00
             1 Second Address                 09-AUG-12 01.49.17.259000 PM -04:00
    Or, even cooler, you can go back to an arbitrary point in time, run a query, and see the historical information. I can go back to a point between the time that I committed the first change and the second change, query the ADDRESS view, and see the old data. This is invaluable if you want to take existing queries and/or reports and run them as of certain dates in the past when you're trying to debug a problem.
    SQL> select *
      2    from address;
    ADDRESS_ID ADDRESS
             1 First Address
    You can also do things like set savepoints, which are basically named points in time that you can go back to. That lets you do things like create a savepoint for the data as soon as month-end processing is completed so you can easily go back to "July Month End" without needing to figure out exactly what time that occurred. And you can have multiple workspaces so different users can be working on completely different sets of changes simultaneously without interfering with each other. This was actually why Workspace Manager was originally created-- to allow users manipulating spatial data to have extremely long-running transactions that could span days or months-- and to be able to switch back and forth between the current live data and the data in each of these long-running scenarios.
    Justin
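    To make the "go back to an arbitrary point in time" step concrete, a rough sketch using the DBMS_WM calls as I recall them (the timestamp, format, and return-to-present step are illustrative; check the Workspace Manager documentation for the exact signatures):
    -- Position the session at an instant between the two commits, query the
    -- ordinary ADDRESS view, then come back to the present.
    SQL> exec dbms_wm.GotoDate( '09-AUG-2012 13:49:00', 'DD-MON-YYYY HH24:MI:SS' );
    SQL> select * from address;     -- shows 'First Address' only
    SQL> exec dbms_wm.GotoDate( to_char(sysdate,'DD-MON-YYYY HH24:MI:SS'), 'DD-MON-YYYY HH24:MI:SS' );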

  • BEST PRACTICES FOR CREATING DISCOVERER DATABASE CONNECTION -PUBLIC VS. PRIV

    I have enabled SSO for Discoverer. So when you browse to http://host:port/discoverer/viewer you get prompted for your SSO
    username/password. I have enabled users to create their own private
    connections. I logged in as portal and created a private connection. I then, from
    Oracle Portal, created a portlet and added a Discoverer worksheet using the private
    connection that I created as the portal user. This works fine... users access
    the portal and they can see the worksheet. When they click the analyze link, the
    users are prompted to enter a password for the private connection. The
    following message is displayed:
    The item you are requesting requires you to enter a password. This could occur because this is a private connection or
    because the public connection password was invalid. Please enter the correct
    password now to continue.
    I originally created a public connection... and then followed the same steps from Oracle Portal to create the portlet and display the
    worksheet. Worksheet is displayed properly from Portal, when users click the
    analyze link they are taken to Discoverer Viewer without having to enter a
    password. The problem with this is that when a user browses to
    http://host:port/discoverer/viewer they enter their SSO information and then
    any user with an SSO account can see the public connection...very insecure!
    When private connections are used, no connection information is displayed to
    SSO users when logging into Discoverer Viewer.
    For the very first step, when editing the Worksheet portlet from Portal, I enter the following for Database
    Connections:
    Publisher: I choose either the private or public connection that I created
    Users Logged In: Display same data to all users using connection (Publisher's Connection)
    Users Not Logged In: Do not display data
    My question is what are the best practices for creating Discoverer Database
    Connections.
    Is there a way to create a public connection, but not display it at http://host:port/discoverer/viewer?
    Can I restrict access to http://host:port/discoverer/viewer to specific SSO users?
    So overall, I want roughly 40 users to have access to my Portal Page Group. I then want to
    display portlets with Discoverer worksheets. Certain worksheets I want to have
    the ability to display the analyze link. When the SSO user clicks on this they
    will be taken to Discoverer Viewer without being prompted for any logon information. All
    SSO users will see the same data...there is no need to restrict access based on
    SSO username...1 database user will be set up in either the public or private
    connection.

    You can make it happen by creating a private connection for the 40 users via a capi script, and when creating the portlet, select the 2nd option in the Users Logged In section. That way the portlet uses their own private connection every time a user logs in, so it won't ask for a password.
    Another thing: there is an option for entering a password or not in ASC in the Discoverer section, if your version is 10.1.2.2. Let me know if you need more information.
    Thanks,
    Kiran

  • What are the best practices to create surrounding borders?

    Good day everyone,
    I was wondering what the best practice is to create a look in my iOS app like the one below. How are they accomplishing the creation of the borders? Is there a tool in Xcode IB to do that?
    Thank you in advance

    Once again, thanks for your input; however, I am still not clear how you accomplished the rounded corners, as you do not mention that in your reply.
    I did some research on my end and I was able to accomplish what I want with a UIView using the code below in an outlet:
    redView.layer.cornerRadius = 10;
    redView.layer.borderColor = [UIColor greenColor].CGColor;
    redView.layer.borderWidth = 5;
    However, I cannot do the same for a UITableView or a UITableViewCell.
    Thanks

  • Best practice to create Voyager conn on SAP BI INFOCUBE OR  SAP BI Query

    I wanted to know which is the best practice for creating a Voyager connection: on a SAP BI InfoCube or on a SAP BI Query?
    Which gives better performance?

    Hi Nirmansyah,
    Please check the below link.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/business-intelligence/s-u/step-by-step%20procedure%20for%20creating%20customized%20bex%20maps.pdf
    Hope this helps.
    Veerendra.

  • Best practice to create a database

    Please can you send me the best practice for creating a database which will be used in the future for a data warehouse?

    Hi,
    If the database is purely for data warehousing, you can create it by using DBCA; or, for a purely transactional database, you can also create it manually, i.e. create the control file, datafiles and so on yourself.
    Thanks and Regards
    Venkat.K.Raju
    Mindlance,
    Oracle Applications Team
    BANGLORE..66
    Mobile:+919986556688
    Land:080-41464843 Ext-4942
    [email protected]
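    For what the "create manually" route looks like, a bare-bones sketch (normally you would let DBCA generate and run this; every name, path, size and password below is illustrative only):
    CREATE DATABASE dwhdb
      USER SYS IDENTIFIED BY "change_me"
      USER SYSTEM IDENTIFIED BY "change_me"
      LOGFILE GROUP 1 ('/u01/oradata/dwhdb/redo01.log') SIZE 100M,
              GROUP 2 ('/u01/oradata/dwhdb/redo02.log') SIZE 100M
      DATAFILE '/u01/oradata/dwhdb/system01.dbf' SIZE 700M AUTOEXTEND ON
      SYSAUX DATAFILE '/u01/oradata/dwhdb/sysaux01.dbf' SIZE 550M AUTOEXTEND ON
      DEFAULT TEMPORARY TABLESPACE temp
        TEMPFILE '/u01/oradata/dwhdb/temp01.dbf' SIZE 1G
      UNDO TABLESPACE undotbs1
        DATAFILE '/u01/oradata/dwhdb/undotbs01.dbf' SIZE 1G;
    After CREATE DATABASE you would still run catalog.sql and catproc.sql and create your application tablespaces, which is why DBCA is usually the easier path.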

  • Best practice to create users - Hybrid scenario

    I would like to know what is the best practice for creating new users in a Hybrid scenario with all mailboxes hosted and no mailboxes on-premise.
    Currently when creating a new user we go to our local EMC and create a 'New Remote Mailbox'.  This creates the mailbox in Office365 and the local user account in one wizard.
    After the new user is created we have to manually add the user to the correct distribution groups, and security groups.
    We would like a way to create new users using a template which already has the correct distribution groups and security groups.  This is how we did it prior to setting up the Hybrid scenario.
    Is this possible?  Can we create a user from a template and have the mailbox created in Office365 at the same time?  We do not wish to create the mailbox locally then migrate it.

    Thanks for the response DJStatik.  I think this tool might be useful for creating users in bulk, however we are looking for a user template in the traditional sense.
    Occasionally we have a new user and (prior to O365) would 'copy' the template user to ensure correct groups etc.
    We have tried making users from the existing template we have, but in order to create the mailbox in O365, you need to 'mail enable' the user.  We have noticed that doing this process causes issues with Autodiscover for that particular user.
    To avoid the autodiscover issue, we have found it best to create the user and the mailbox in the same wizard - hence the new process for which we would like a template.

  • Best practices to create Universe on top of SAP BW

    Hi Experts,
    I would like to know the Best Practices for Creating Universe on top of SAP BW.
    1. Is it advisable to create a Universe on top of a SAP BW cube directly? In my case, we are starting a fresh BW implementation for one application on SAP BI 7 and BOE 3.1.
    2. If we create a universe on top of a BEx Query, what needs to be done if we want to upgrade to Business Objects 4.0?
    3. If we create a universe on a BEx Query, will SAP support universes on BEx in future releases?
    4. What is the support period for BEx and its integration with Business Objects?
    Thanks
    Bhnau.

    Hi,
    1. Is it advisable to create a Universe on top of a SAP BW cube directly? In my case, we are starting a fresh BW implementation for one application on SAP BI 7 and BOE 3.1.
    First, go through the link below. This guide describes how you can create a universe based on a cube or a query, and which features are available via each.
    http://help.sap.com/businessobject/product_guides/boexir3/en/xi3_sap_olap_universes_en.pdf
    You can check on page 12 of the PDF what is supported by a cube and by a BW Query when designing a universe.
    Best practice says to design the universe on top of one generic query per cube, because CFK/RFK cannot be generated in a universe on top of a BW cube.
    2. If we create a universe on top of a BEx Query, what needs to be done if we want to upgrade to Business Objects 4.0?
    Check the links below.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c0d937fa-1261-2e10-6388-e71afb6b5ff6?quicklink=index&overridelayout=true
    http://biguru.wordpress.com/
    Thanks,
    Amit

  • Best Practices for creating reports/Dashboards from BW systems

    Hi Gurus,
    What are the best practices for creating BO Dashboards / Xcelsius from BW systems?
    Prasad

    You can use the BICS connector that leverages BW queries directly.  It is listed in the Connection Manager as "SAP NetWeaver BW Connection".  You will need both the ABAP and Java stack and SSO configured between the two.  You will also need to have SAP GUI and BEx installed on the machine you are doing development on.  Note that dashboards using this connection can only be hosted in NW Portal for the time being until the next release of BI 4.x platform.
    Here are some links on getting started with the BICS connector:
    [Building Fast and Efficient Dashboards with BW and Xcelsius|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/d0ab8cce-1851-2d10-d5be-b5147a651c58]
    [Requirements for BICS|http://wiki.sdn.sap.com/wiki/display/BOBJ/prerequisitestoXcelsiusandSAPNetWeaverBW+Connection]

  • Best practice for creating JCO destinations

    Hi All,
       I have a project which uses 10 to 12 BAPIs. What is the best practice?
      1) Create 10 JCo destinations, one for each BAPI.
      2) Create one JCo destination and use it for all BAPIs.
         Can someone tell me which is the best practice? What are the advantages and the disadvantages?
    Regards,
    Rajini.

    Hi,
    These docs help you get an idea of the JCo best practices:
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/705f2b2e-e77d-2b10-de8a-95f37f4c7022
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/85a483cb-0d01-0010-2990-c5168f01ce8a
    Regards,
    ramesh

  • Looking for best practices when creating DNS reverse zones for DHCP

    Hello,
    We are migrating from ISC DHCP to Microsoft DHCP. We would like the DHCP server to automatically update DNS A and PTR records for computers when they get an IP. The question is, what is the best practice for creating the reverse lookup zones in DNS? Here
    is an example:
    10.0.1.0/23
    This would give out IPs from 10.0.1.1-10.0.2.254. So with this in mind, do we then create the following reverse DNS zones?:
    1.0.10.in-addr.arpa AND 2.0.10.in-addr.arpa
    OR do we only create:
    0.10.in-addr.arpa, and both 10.0.1 and 10.0.2 addresses will get stuffed into that one zone.
    Or is there an even better way that I haven't thought about? Thanks in advance.

    Hi,
    Based on your description, both methods are fine: creating two reverse DNS zones (1.0.10.in-addr.arpa and 2.0.10.in-addr.arpa), or creating one reverse DNS zone (0.10.in-addr.arpa).
    Best Regards,
    Tina

  • Best practices for creating application schema

    All,
    Can anyone recommend best practices (or a pointer to a URL) for creating an application schema? A novice installer created a schema, and the tablespace ran out of disk space in 2 days and the system came to a halt at a production site. The tablespace was created with one datafile and with MAXSIZE specified. I am looking for Do's and Don'ts on a production system.
    Thanks for any help,
    Vissu

    I'm not sure that you can boil this down to a "Do's and Don'ts" list unless you want to get overly general...
    For example, do make sure that you provision space appropriately. "Appropriately" however, is going to be radically different in different environments. Some shops set all their data files to autoextend in production and monitor utilization at the OS level. Other shops specify exact file sizes and monitor utilization at the Oracle level. Each approach has its own advantages and disadvantages, you just need to make sure that your application uses the same approach that every other application in the organization uses.
    Do have an idea about the space utilization of the application, but don't go overboard. Running out of space in 2 days means someone failed to do a basic analysis. On the other hand, I've seen people spend way more time than they should making 5 year projections based on some relatively soft assumptions and getting worried about internal overheads that were much smaller than the error bars in their baseline estimates. Of course, the precision necessary also depends on the implications-- a 20% error in a multi-TB data warehouse is going to have a lot more impact than a 20% error in a 20 GB OLTP application.
    Justin
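    To make the two provisioning styles above concrete, a hedged sketch (tablespace names, paths, and sizes are purely illustrative):
    -- Style 1: autoextend with a cap, utilization monitored at the OS level.
    CREATE TABLESPACE APP_DATA
      DATAFILE '/u02/oradata/prod/app_data01.dbf' SIZE 2G
      AUTOEXTEND ON NEXT 512M MAXSIZE 30G;
    -- Style 2: fixed file size, utilization monitored from inside Oracle.
    CREATE TABLESPACE APP_DATA_FIXED
      DATAFILE '/u02/oradata/prod/app_data_f01.dbf' SIZE 10G AUTOEXTEND OFF;
    -- Either way, keep an eye on free space, for example:
    SELECT tablespace_name, ROUND(SUM(bytes)/1024/1024) AS free_mb
      FROM dba_free_space
     GROUP BY tablespace_name;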

  • Best practice for creating RFC destination entries for 3rd parties(Biztalk)

    Hi,
    We are on SAP ECC 6 and we have been creating multiple RFC destination entries for the external 3rd party applications such as Biz-talk and others using TCP/IP connection type and sharing the programid.
    The RFC connections with IDoc as the data flow have been made using synchronous mode for the time-critical ones (a few) and asynchronous mode for the majority. The RFC destination entries have been created for many interfaces, each with a unique RFC destination and its corresponding port defined in SAP.
    We have both inbound and outbound connectivity. With the large number of RFC destinations being added, we wanted to review this. We wanted to check with others who had encountered a similar situation and were keen to learn from their experiences.
    We also wanted to know if there are any best practices to optimize the number of RFC destinations.
    Here were a few suggestions we had in mind to tackle the same.
    1. Create unique RFC destinations for every port defined in SAP for external applications as Biztalk for as many connections. ( This would mean one for inbound, one for outbound)
    2. Create one single RFC destination entry for the external host/application and the external application receiving the idoc control record to interpret what action to perform at its end.
    3. Create RFC destinations based on the modules it links with such as materials management, sales and distribution, warehouse management. This would ensure we can limit the number of RFC to be created and make it simple to understand the flow of data.
    I have checked the SAP best practices website, SAP OSS notes and help pages, but could not find the specific information I was after.
    I do understand we can have an unlimited number of RFC destinations and maximum connections using the appropriate profile parameters for the gateway, RFC, client connections, and additional app servers.
    I would appreciate it if you can suggest the best architecture or practice for setting up RFC destinations in an optimized manner.
    Thanks in advance
    Sam

    Not easy to give a perfect answer
    1. Create unique RFC destinations for every port defined in SAP for external applications as Biztalk for as many connections. ( This would mean one for inbound, one for outbound)
    -> Be careful if you have multiple clients (for example in acceptance): RFCs are client-independent but ports are not! You could run into trouble.
    2. Create one single RFC destination entry for the external host/application and the external application receiving the idoc control record to interpret what action to perform at its end.
    -> This could be the best solution... it's easier to create partner profiles, and the control record will contain the correct partner.
    3. Create RFC destinations based on the modules it links with such as materials management, sales and distribution, warehouse management. This would ensure we can limit the number of RFC to be created and make it simple to understand the flow of data.
    -> For this, consider option 2 instead.
    We send to our message broker with 1 RFC destination, sending multiple IDoc types, different partners, different ports.

  • Best Practice while creating Contract, Purchase Requisition, Purchase Order

    Hi
    What is the best practice with respect to Contract & Purchase Requisition?
    In T-code ME31K, there is a button "Reference to PReq", meaning that we can create a contract with reference to a Purchase Requisition. (We have done the same.)
    While creating the Purchase Order using Contract, we could find Purchase Requisition reference in the Purchase Order; similarly when we created the Purchase Order using Purchase Requisition, we could find Contract reference in the Purchase Order.
    I have done the following:
    1. Create a Contract.
    2. Create a Purchase Requisition
    3. Assign Requisition and Create Purchase Order using T Code ME57.
    4. Create Purchase Order
    Here in this case we could find references of both Contract & Purchase Requisition.
    I just want to know what practice we should adopt/advise while creating a Contract, Purchase Requisition, and Purchase Order.
    Regards,

    Hi,
    In the ME51N screen, enter the contract number in the "AGREEMENT" tab, and if there are more items in the contract, you can enter the item number in the tab next to the AGREEMENT tab, then proceed with creating the PR by entering the other details. You can find this tab in the 4th column from the extreme right end of the PR screen.
    Regards,
    Biju K
    Edited by: Bijay Kumar Barik on Sep 10, 2009 2:23 PM
