Content tab for a fact table

Hi,
Please help me understand the use of the Content tab for a fact table in the OBIEE repository.
Thanks.

If you have multiple LTS, you should set the content level appropriately; otherwise you can get errors during consistency checks. I could not find any link that talks only about the content level, but see the link below and let us know if you have any doubts:
http://kr.forums.oracle.com/forums/thread.jspa?threadID=604637
The Content tab is also handy when you are using aggregate tables.
Regards,
Sandeep

Similar Messages

  • Logical level for logical fact table sources

    It is clear that for fact aggregates we should use the Content tab of the Logical Table Source dialog to assign the correct logical level to each dimension.
    The question is: is it mandatory to assign the logical level for each dimension even for non-aggregate fact tables (where it would normally be set to the most detailed level of each dimension)? Is there any known issue if the logical levels in the Content tab are not set?
    The reason I'm asking is a strange bug I have (which I'm not going to discuss here) whose only workaround seems to be NOT setting the logical levels (on the Content tab) for the logical fact table sources.
    thank you !

    If levels are not set, they are treated as the lowest (most detailed) level by default,
    so it should not matter whether you set them or not.
    Generally we set them explicitly on facts when we are using aggregate tables.
    Your current issue may be specific to your setup; I would suggest checking for an implicit fact column, and for any table mapped into the source to force a join, etc.
    Mark as helpful if this helps.
    Let me know how it goes.
    Edited by: Srini VEERAVALLI on Feb 5, 2013 8:33 AM
    Any updates on this?
    Edited by: Srini VEERAVALLI on Feb 14, 2013 9:09 AM

  • "Select" Physical table as LTS for a Fact table

    Hi,
    I am very new to OBIEE, still in the learning phase.
    Scenario 1:
    I have a "Select" Physical table which is joined (inner join) to a Fact table in the Physical layer. I have other dimensions joined to this fact table.
    In BMM, I created a logical table for the fact table with 2 Logical Table Sources (the fact table & the select physical table). No errors in the consistency check.
    When I create an analysis with columns from the fact table and the select table, I don't see any data for the select table column.
    Scenario 2:
    In this scenario, I created an inner join between "Select" physical table and a Dimension table instead of the Fact table.
    In BMM, I created a logical table for the dimension table with 2 Logical Table Sources (the dimension table & the select physical table). No errors in the consistency check.
    When I create an analysis with columns from the dimension table and the select table, I see data for all the columns.
    What am I missing here? Why is it not working in the first scenario?
    Any help is greatly appreciated.
    Thanks,
    SP

  • ETL for loading Fact tables

    Hi all
    I have a fundamental question regarding loading records into fact tables. As you know, we always load dimensions before loading facts. In our data model, most fact tables have a special dimension that holds the basic attributes along with the source keys of the records that are valid for our data warehouse. The source of the fact table includes the same source tables that are involved in loading that dimension.
    For example, for the Student Fact table we have a Student Dimension that is loaded before the Student Fact and therefore holds the most recent changes (the source keys of the records to be loaded into the DW). My question is: which of the options below is more appropriate for loading the Student Fact table?
    Option 1 - Load the Student Fact table using the same source tables that are used to load the Student Dimension, ignoring the fact that the same validation logic has already been applied in the Dimension job, and applying it again in the Fact job.
    Option 2 - Inner join the Student Dimension in the Student Fact ETL job, so that the source records (coming from the source system) are limited to the validated records we want to load into the DW. This option makes loading the fact table faster: by letting the dimension contribute to loading its corresponding fact, we don't have to pull all records from the source and validate them against the rules again (that has already been done in the Dimension job, and we can reuse the validated record IDs in the Fact job).
    As we are new to building jobs for a dimensional model, I thought it would be good to ask for your opinions and experience.
    Thanks in advance,
    Tootia

    I don't see any problem with re-using your "which students do we care about?" logic and rules by borrowing the list of students from the dimension table. You can join to it, or do a lookup() into it and then filter out nulls (assuming you return a null on a no-match), or use a Validation transform with the "Exists in table" option -- which to use is a question of style and performance; any of them would work. Indeed, when migrating data into SAP using iDocs, I use a similar idea: I always join the iDoc base segment "map" table to the load of any child segment table to "automatically" filter out any child segment records not present in the base, re-using the selection logic as you suggest, which then only needs to be built in the base segment. Keeps things tidy.
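    As a rough SQL sketch of that join approach (all table and column names below are made up, just to show the shape of it):
    -- Only source rows whose student already made it into the dimension survive the join,
    -- so the validation rules live in one place: the dimension load.
    insert into dw.student_fact (student_key, enrolment_count, fee_amount)
    select d.student_key,
           s.enrolment_count,
           s.fee_amount
    from   stg.student_source s
    inner  join dw.student_dim d
           on d.source_student_id = s.source_student_id;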
    Best wishes,
    Jeff Prenevost
    Data Services Practice Manager
    itelligence

  • Issue with Multiple LTS for a fact table and filters

    Hello,
    I am facing an issue with OBIEE 10g.
    In my model, I have a huge fact table F1 (partitioned and indexed). The average response time for queries targeting it was ~30-60 seconds, which did not really satisfy our end users.
    So, we decided to create a materialized view, which removes some dimensions that are not used by default, but might be used if the end user adds some filters. I added the Materialized view in the Physical Layer and in the corresponding Logical Table Source.
    I then tried to see if it works, but I was a bit surprised by the result. Indeed,
    -> If the report does not reference a truncated dimension, it targets the materialized view. -> Perfect
    -> If the report does reference a truncated dimension in the columns, it targets the Fact Table. -> Perfect
    -> If the report does reference a truncated dimension in the filters, it targets the materialized view. For this reason, the filter is never resolved and no join on the dimension table is applied, even though it exists in the generated logical SQL. -> KO.
    A suggestion could be to add the filtered columns to the report, but I am not satisfied with this approach because the materialized view would then never be used.
    Another suggestion could be to use query rewrite, but I'd like to keep full control over how the queries are generated.
    Does someone know if the filters are not evaluated to determine which LTS to use? How can I force this evaluation?
    Regards,

    Hi,
    If I understand your description correctly, your materialized view skips some dimensions (infrequently used ones). However, when you reference these skipped dimensions in filters, the queries are hitting the materialized view and failing, as those values do not exist there. In this case, you could resolve it as follows:
    1. Create dimensional hierarchies for all dimensions.
    2. In the fact table's logical table sources, set the Content tab properly. (Yes, I think this is the key.)
    When you skip some dimensions, the grain of the new fact source (the materialized view in this case) changes. For example:
    Say a fact is available with the keys for the Product, Customer and Promotion dimensions. The grain for this is Product * Customer * Promotion.
    Say another fact is available with the keys for Product and Customer. The grain for this is Product * Customer (in fact, I would say it is Product * Customer * Promotion Total).
    So in the second case the grain of the table has changed, and setting the appropriate content levels for these sources lets the BI Server switch between them automatically.
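    To make the grain difference concrete, here is a minimal sketch (all names are hypothetical, assuming the aggregate source is an Oracle materialized view as in your case):
    -- Base fact source: grain Product * Customer * Promotion
    --   Content tab: Product = Detail, Customer = Detail, Promotion = Detail
    -- Aggregate source below: Promotion rolled up to its grand total
    --   Content tab: Product = Detail, Customer = Detail, Promotion = Total
    create materialized view sales_fact_mv as
    select product_key,
           customer_key,
           sum(sales_amount) as sales_amount
    from   sales_fact
    group  by product_key, customer_key;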
    Please try these settings and let me know if it works.
    Thank you,
    Dhar

  • BMM issue for multiple fact tables

    Hi All,
    I have three facts F1, F2, F3 and two conformed dimensions D1, D2 in my sample RPD.
    In the BMM layer, I tried to model it as follows instead of creating one logical fact table, since there are many more fact tables to come
    in the future.
    F1<----D1--->F2<----D2--->F3
    When I deploy this sample RPD and run a report from F1 and F2, I get the error "No fact table exists at the requested level of detail".
    I have not created any dimension hierarchies to set up the content levels on the fact tables.
    I would be very thankful for any advice.
    Thanks,
    Vishal

    Hi,
    Please refer the below link.
    http://satyaobieesolutions.blogspot.in/2012/07/implementing-multiple-fact-tables-in.html
    My suggestion would be to bring both facts in as logical table sources of a single logical fact table in the BMM layer, joined to the multiple dimensions.
    Build a dimension hierarchy for each dimension, and then in the Content tab of each logical table source map the dimensions to the fact sources at the appropriate Detail or Total level.
    Refer to the link below:
    http://108obiee.blogspot.com/2009/08/joining-two-fact-tables-with-different.html
    Hope this helps.
    Thanks,
    Satya
    Edited by: Satya Ranki Reddy on Jul 26, 2012 7:34 AM

  • Error for the fact table while processing the cube - attribute key cannot be found when processing

    Please help, as I am new to SSAS and this is an urgent requirement. This is a MOLAP cube, and below is the error I receive when processing the cube. The cube is set to Process Full. Several similar errors pop up for various dimensions.
    "Errors in the OLAP storage engine: The attribute key cannot be found when processing: Table: 'Fact_Table', Column: 'ID', Value: '1'. The attribute is 'Id'. Errors in the OLAP storage engine: The attribute key was converted to an unknown member because
    the attribute key was not found. Attribute Id of Dimension: 17 - Ves - PoC Cont from Database: DB, Cube: IPNCube, Measure Group: iSrvy, Partition: Partition1, Record: 1."
    Thanks in advance.

    Thanks for the recommendations David.
    It would be really great if you could clear up some of my doubts:
    To my knowledge, all the dimensions need to be processed first and then the fact table is processed.
    So if the IDs are not present in the dimension tables, they should not be present in the fact table either.
    Here we found null values in the dimension table while the IDs were present in the fact table. What might cause such a situation?
    Also, how frequently does the cube need to be processed? Currently the ETL that processes the cube is scheduled as an hourly SQL Server Agent job every day.
    Is there any possibility that the cube is still processing when the SQL job for the next run executes and tries to access and process it again?
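    One way to confirm the mismatch is to look for fact keys that have no matching dimension row, for example (just a sketch; the table and column names are guessed from the error message):
    -- Orphan fact keys: these are the values SSAS converts to the unknown member
    SELECT f.ID, COUNT(*) AS affected_rows
    FROM   dbo.Fact_Table AS f
    LEFT   JOIN dbo.Dim_PoC_Cont AS d
           ON d.Id = f.ID
    WHERE  d.Id IS NULL
    GROUP  BY f.ID;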

  • Distinct count for multiple fact tables in the same cube

    I'm fairly new to working with SSAS, but have been working with DW environments for many years.
    I have a cube which has 4 fact tables.  The central fact table is Encounter and then I also have Visit, Procedure and Medication.  Visit, Procedure and Medication all join to Encounter on Encounter Key.  The relationship between Encounter
    and Procedure and Encounter and Medication are both an optional 1 to 1.  The relationship between Encounter and Visit is an optional 1 to many.
    Each of the fact tables join to the Patient dimension on the Patient Key.  The users are looking for a distinct count of patients in all 4 fact tables.  
    What is the best way to accomplish this so that my cube does not take all day to process? Please let me know if you need any more information about my cube in order to answer this.
    Thanks for the help,
    Andy

    Hi Andy,
    Each distinct count measure causes an ORDER BY clause in the SELECT sent to the relational data source during processing. In SSAS 2005 or later, a new measure group is created for each distinct count measure (a design strategy for improving performance).
    Besides, please take a look at the following distinct count optimization techniques:
    Create Customized Aggregations
    Define a Processing Plan
    Create Partitions of Equal Size
    Use Partitions Comprised of a Distinct Range of Integers
    Distribute the Hash of Your UserIDs
    Modulo Function
    Hash Function
    Choose a Partitioning Strategy
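    For example, the "distinct range" technique above means giving each partition of the distinct count measure group a source query that covers a non-overlapping range of the column being counted, along these lines (table, column and boundary values are hypothetical):
    SELECT * FROM dbo.FactEncounter WHERE PatientKey >= 0       AND PatientKey < 1000000
    SELECT * FROM dbo.FactEncounter WHERE PatientKey >= 1000000 AND PatientKey < 2000000
    SELECT * FROM dbo.FactEncounter WHERE PatientKey >= 2000000 AND PatientKey < 3000000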
    For more detailed information, please refer to the article below:
    Analysis Services Distinct Count Optimization:
    http://www.microsoft.com/en-us/download/details.aspx?id=891
    In addition, here is a good article about SSAS Best Practices for your reference:
    http://technet.microsoft.com/en-us/library/cc966525.aspx
    Hope this helps.
    Elvis Long
    TechNet Community Support

  • Pulling values from Variables for my Fact Table

    Hi All,
    The Load_Date in my fact table is currently current_timestamp - 1, as the data is one day old when it is loaded. I want to get that date from a variable because, if I need to rerun with an earlier date such as 20 or 30 days before the current date, I do not want to touch the mapping; I just want to modify the variable and run the package. Please advise on how to do this.
    Thanks for your time and help.

    As Bhabani said, create a project variable with the code you would like to run to generate the earlier date. For example, "select current_timestamp - 30". Then, add this variable to the Package before the Interface, setting it to refresh. In the Interface, change the mapping from what it is now (current_timestamp - 1) to the variable (#VARIABLE_NAME). The variable will now be refreshed and the value added to the mapping.
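    For instance, the variable's refreshing query could be something like this (assuming an Oracle source; LOAD_DATE and MY_PROJECT are hypothetical names):
    select current_timestamp - 1 from dual
    and the Load_Date target mapping in the Interface then simply becomes:
    #MY_PROJECT.LOAD_DATE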
    If you want to change it to "current_timestamp - 20", just update the variable.
    Regards,
    Michael Rainey

  • Find the partition for the fact table

    Oracle version : Oracle 10.2
    I have one fact table with daily partitions.
    I am inserting some test data into this table for an old date, 20100101.
    I am able to insert this record into the table as below:
    insert into fact_table values (20100101,123,456);
    However, I observed that the partition for this date does not appear in all_tab_partitions; moreover, I am not able to select the data using
    select * from facT_table partition(d_20100101)
    but I am able to extract the data using
    select * from facT_table where date_id=20100101
    Could someone please let me know how to find the partition into which this data was inserted,
    and, if the partition for date 20100101 is not present, why does the insert for that date work?

    user507531 wrote:
    However, I observed that the partition for this date does not appear in all_tab_partitions; moreover, I am not able to select the data using
    select * from facT_table partition(d_20100101)
    Wrong approach.
    but I am able to extract the data using
    select * from facT_table where date_id=20100101
    Correct approach.
    Could someone please let me know how to find the partition into which this data was inserted,
    and, if the partition for date 20100101 is not present, why does the insert for that date work?
    Who says that the date is invalid? This is a range partition, which means that each partition covers a range. If you read up in the SQL Reference Guide on how a range partition is defined, you will notice that each partition is defined with the end value of the range it covers. There is no start value, as the previous partition's end value is the border between this partition and the prior one.
    I suggest that you familiarise yourself with a database feature before using it; otherwise incorrect use, and wrong assumptions about it, are the likely result.
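    For what it is worth, one way to see which partition the row actually landed in is to ask the data dictionary (a sketch; adjust the owner and table names as needed):
    -- The data object number embedded in each rowid identifies the partition segment
    select distinct o.subobject_name as partition_name
    from   fact_table f
           join all_objects o
                on o.data_object_id = dbms_rowid.rowid_object(f.rowid)
    where  f.date_id = 20100101;
    -- ALL_TAB_PARTITIONS shows the exclusive upper bound (HIGH_VALUE) of each range
    select partition_name, high_value
    from   all_tab_partitions
    where  table_name = 'FACT_TABLE'
    order  by partition_position;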

  • Using WHERE NOT EXISTS for a Fact Table Load

    I'm trying to set up a fact table load using T-SQL, and I need to use WHERE NOT EXISTS. All of the fields from the fact table are listed in the WHERE NOT EXISTS clause. What I expect is that if the value of any one of the fields is different, the whole record will be treated as a new record and inserted into the table. However, in my testing, when I 'force' a field value, new records are not inserted.
    The following is my query:
    declare 
    @Created_By nchar(50)
    ,@Created_Date datetime --do we need utc check?
    ,@Updated_By nchar(50)
    ,@Updated_Date datetime
    select @Created_By = system_user
    ,@Created_Date = getdate()
    ,@Updated_By = system_user
    ,@Updated_Date = getdate()
    insert fact.Appointment (
    Slot_ID
    , Slot_DateTime
    , Slot_StartDateTime
    , Slot_EndDateTime
    , Slot_Duration_min
    , Slot_CreateDateTime
    , Slot_CreateDate_DateKey
    , Healthcare_System_ID
    , Healthcare_Service_ID
    , Healthcare_Supervising_Service_ID
    , Healthcare_Site_ID
    , Booked_Appt_ID
    , Appt_Notification_Submission_DateKey
    , Appt_Notification_Completion_DateKey
    , Appt_Notification_Duration
    , Appt_Notification_ID
    , Patient_ID
    , Physician_ID
    , Referral_ID
    , Specialty
    , LanguageRequested
    , Created_Date
    , Created_By
    , Updated_Date
    , Updated_By
    )
    select distinct
    Slot.Slot_ID 
    , Slot.Slot_Start_DateTime  as Slot_DateTime --???
    , Slot.Slot_Start_DateTime
    , Slot.Slot_End_DateTime
    , datediff(mi,slot.Slot_Start_DateTime,slot.Slot_End_Datetime) as Slot_Duration_Min 
    , Slot.Created_Date as Slot_CreateDateTime
    , SlotCreateDate.Date_key as Slot_CreateDate_DateKey
    , HSite.Healthcare_System_ID
    , HSite.Healthcare_Service_ID
    , HSite.Healthcare_Service_ID as Healthcare_Supervising_Service_ID
    , HSite.Healthcare_Site_ID 
    , Ref.Booked_Appt_ID 
    , ApptSubmissionTime.Date_key as Appt_Notification_Submission_DateKey
    , ApptCompletionTime.Date_key as Appt_Notification_Completion_DateKey
    , datediff(mi,appt.SubmissionTime,appt.CompletionTime) as Appt_Notification_Duration
    , Appt.Appt_Notification_ID 
    , pat.Patient_ID 
    , 0 as Physician_ID
    , ref.Referral_ID
    , Hsrv.Specialty
    , appt.[Language] as LanguageRequested
    ,@Created_Date as Created_Date
    ,@Created_By as Created_By
    ,@Updated_Date as Updated_Date
    ,@Updated_By as Updated_By
    from dim.Healthcare_System HSys
    inner join dim.Healthcare_Service HSrv
    on HSys.Healthcare_System_ID = HSrv.HealthCare_System_ID 
    inner join dim.Healthcare_Site HSite
    on HSite.HealthCare_Service_ID = HSrv.Healthcare_Service_ID
    and HSite.HealthCare_System_ID = HSrv.HealthCare_System_ID 
    inner join dim.Referral Ref 
    on Ref.ReferralSite_ID = HSite.Site_ID
    and Ref.ReferralService_ID = HSite.Service_ID
    and Ref.ReferralSystem_ID = HSite.System_ID 
    right join (select distinct Slot_ID, Source_Slot_ID, Slot_Start_DateTime, Slot_End_DateTime, Created_Date from dim.slot)slot
    on ref.Source_Slot_ID = slot.Source_Slot_ID
    inner join dim.Appointment_Notification appt
    on appt.System_ID = HSys.System_ID
    inner join dim.Patient pat 
    on pat.Source_Patient_ID = appt.Source_Patient_ID
    inner join dim.SystemUser SysUser
    on SysUser.Healthcare_System_ID = HSys.Healthcare_System_ID
    left join dim.Calendar SlotCreateDate
    on SlotCreateDate.Full_DateTime = cast(Slot.Created_Date as smalldatetime)
    left join dim.Calendar ApptSubmissionTime
    on ApptSubmissionTime.Full_DateTime = cast(appt.SubmissionTime as smalldatetime)
    left join dim.Calendar ApptCompletionTime
    on ApptCompletionTime.Full_DateTime = cast(appt.CompletionTime as smalldatetime)
    where not exists (
    select
    Slot_ID
    , Slot_DateTime
    , Slot_StartDateTime
    , Slot_EndDateTime
    , Slot_Duration_min
    , Slot_CreateDateTime
    , Slot_CreateDate_DateKey
    , Healthcare_System_ID
    , Healthcare_Service_ID
    , Healthcare_Supervising_Service_ID
    , Healthcare_Site_ID
    , Booked_Appt_ID
    , Appt_Notification_Submission_DateKey
    , Appt_Notification_Completion_DateKey
    , Appt_Notification_Duration
    , Appt_Notification_ID
    , Patient_ID
    , Physician_ID
    , Referral_ID
    , Specialty
    , LanguageRequested
    , Created_Date
    , Created_By
    , Updated_Date
    , Updated_By
    from fact.Appointment
    )
    I don't have any issues with the initial insert, but records are not inserted on subsequent inserts when one of the WHERE NOT EXISTS field values changes.
    What am I doing wrong?
    Thank you for your help.
    cdun2

    So I set up a WHERE NOT EXISTS condition as shown below. I ran the query, then updated Slot_Duration_Min to 5. Some of the Slot_Duration_Min values resolve to 15, so what I expect is that when I run the query again, the records where Slot_Duration_Min resolves to 15 will be inserted again, but they are not. I am using OR for the conditions in the WHERE clause because if any one of the values is different, a new record needs to be inserted:
    declare 
    @Created_By nchar(50)
    ,@Created_Date datetime
    ,@Updated_By nchar(50)
    ,@Updated_Date datetime
    select
    @Created_By = system_user
    ,@Created_Date = getdate()
    ,@Updated_By = system_user
    ,@Updated_Date = getdate()
    insert fact.Appointment (
    Slot_ID
    , Slot_DateTime
    , Slot_StartDateTime
    , Slot_EndDateTime
    , Slot_Duration_min
    , Slot_CreateDateTime
    , Slot_CreateDate_DateKey
    , Healthcare_System_ID
    , Healthcare_Service_ID
    , Healthcare_Supervising_Service_ID
    , Healthcare_Site_ID
    , Booked_Appt_ID
    , Appt_Notification_Submission_DateKey
    , Appt_Notification_Completion_DateKey
    , Appt_Notification_Duration
    , Appt_Notification_ID
    , Patient_ID
    , Physician_ID
    , Referral_ID
    , Specialty
    , LanguageRequested
    , Created_Date
    , Created_By
    , Updated_Date
    , Updated_By
    )
    select distinct
    Slot.Slot_ID 
    , Slot.Slot_Start_DateTime  as Slot_DateTime --???
    , Slot.Slot_Start_DateTime
    , Slot.Slot_End_DateTime
    , datediff(mi,slot.Slot_Start_DateTime,slot.Slot_End_Datetime) as Slot_Duration_Min 
    , Slot.Created_Date as Slot_CreateDateTime
    , SlotCreateDate.Date_key as Slot_CreateDate_DateKey
    , HSite.Healthcare_System_ID
    , HSite.Healthcare_Service_ID
    , HSite.Healthcare_Service_ID as Healthcare_Supervising_Service_ID
    , HSite.Healthcare_Site_ID 
    , Ref.Booked_Appt_ID 
    , ApptSubmissionTime.Date_key as Appt_Notification_Submission_DateKey
    , ApptCompletionTime.Date_key as Appt_Notification_Completion_DateKey
    , datediff(mi,appt.SubmissionTime,appt.CompletionTime) as Appt_Notification_Duration
    , Appt.Appt_Notification_ID 
    , pat.Patient_ID 
    , 0 as Physician_ID
    , ref.Referral_ID
    , Hsrv.Specialty
    , appt.[Language] as LanguageRequested
    ,@Created_Date as Created_Date
    ,@Created_By as Created_By
    ,@Updated_Date as Updated_Date
    ,@Updated_By as Updated_By
    from dim.Healthcare_System HSys
    inner join dim.Healthcare_Service HSrv
    on HSys.Healthcare_System_ID = HSrv.HealthCare_System_ID 
    inner join dim.Healthcare_Site HSite
    on HSite.HealthCare_Service_ID = HSrv.Healthcare_Service_ID
    and HSite.HealthCare_System_ID = HSrv.HealthCare_System_ID 
    inner join dim.Referral Ref 
    on Ref.ReferralSite_ID = HSite.Site_ID
    and Ref.ReferralService_ID = HSite.Service_ID
    and Ref.ReferralSystem_ID = HSite.System_ID 
    right join (select distinct Slot_ID, Source_Slot_ID, Slot_Start_DateTime, Slot_End_DateTime, Created_Date from dim.slot)slot
    on ref.Source_Slot_ID = slot.Source_Slot_ID
    inner join dim.Appointment_Notification appt
    on appt.System_ID = HSys.System_ID
    inner join dim.Patient pat 
    on pat.Source_Patient_ID = appt.Source_Patient_ID
    inner join dim.SystemUser SysUser
    on SysUser.Healthcare_System_ID = HSys.Healthcare_System_ID
    left join dim.Calendar SlotCreateDate
    on SlotCreateDate.Full_DateTime = cast(Slot.Created_Date as smalldatetime)
    left join dim.Calendar ApptSubmissionTime
    on ApptSubmissionTime.Full_DateTime = cast(appt.SubmissionTime as smalldatetime)
    left join dim.Calendar ApptCompletionTime
    on ApptCompletionTime.Full_DateTime = cast(appt.CompletionTime as smalldatetime)
    where not exists (
    select
    Slot_ID
    , Slot_DateTime
    , Slot_StartDateTime
    , Slot_EndDateTime
    , Slot_Duration_min
    , Slot_CreateDateTime
    , Slot_CreateDate_DateKey
    , Healthcare_System_ID
    , Healthcare_Service_ID
    , Healthcare_Supervising_Service_ID
    , Healthcare_Site_ID
    , Booked_Appt_ID
    , Appt_Notification_Submission_DateKey
    , Appt_Notification_Completion_DateKey
    , Appt_Notification_Duration
    , Appt_Notification_ID
    , Patient_ID
    , Physician_ID
    , Referral_ID
    , Specialty
    , LanguageRequested
    , Created_Date
    , Created_By
    , Updated_Date
    , Updated_By
    from fact.Appointment fact
    where 
    Slot.Slot_ID  = fact.Slot_ID 
    or
    Slot.Slot_Start_DateTime   = fact.Slot_DateTime  
    or
    Slot.Slot_Start_DateTime = fact.Slot_StartDateTime
    or
    Slot.Slot_End_DateTime = fact.Slot_EndDateTime
    or
    datediff(mi,slot.Slot_Start_DateTime,slot.Slot_End_Datetime) =
    fact.Slot_Duration_min
    or
    Slot.Created_Date  = fact.Slot_CreateDateTime
    or
    SlotCreateDate.Date_key = fact.Slot_CreateDate_DateKey
    or
    HSite.Healthcare_System_ID = fact.Healthcare_System_ID
    or
    HSite.Healthcare_Service_ID = fact.Healthcare_Service_ID
    or
    HSite.Healthcare_Service_ID  =
    fact.Healthcare_Service_ID 
    or
    HSite.Healthcare_Site_ID  = fact.Healthcare_Site_ID 
    or
    Ref.Booked_Appt_ID  = fact.Booked_Appt_ID 
    or
    ApptSubmissionTime.Date_key =
    fact.Appt_Notification_Submission_DateKey
    or
    ApptCompletionTime.Date_key =
    fact.Appt_Notification_Completion_DateKey
    or 
    datediff(mi,appt.SubmissionTime,appt.CompletionTime)  = fact.Appt_Notification_Duration
    or
    Appt.Appt_Notification_ID = fact.Appt_Notification_ID 
    or
    pat.Patient_ID  =
    fact.Patient_ID 
    or
    0 = 0
    or
    ref.Referral_ID = fact.Referral_ID
    or
    Hsrv.Specialty = fact.Specialty
    or
    appt.[Language] = fact.LanguageRequested
    )
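    For comparison, the conventional NOT EXISTS pattern correlates the subquery on every column with AND, so a row is skipped only when an identical row already exists. A partial sketch using a few of the columns above (the remaining columns would be compared the same way):
    where not exists (
        select 1
        from   fact.Appointment f
        where  f.Slot_ID              = Slot.Slot_ID
        and    f.Slot_StartDateTime   = Slot.Slot_Start_DateTime
        and    f.Slot_EndDateTime     = Slot.Slot_End_DateTime
        and    f.Slot_Duration_min    = datediff(mi, slot.Slot_Start_DateTime, slot.Slot_End_Datetime)
        and    f.Appt_Notification_ID = Appt.Appt_Notification_ID
        and    f.Patient_ID           = pat.Patient_ID
        -- ...and so on; columns that can be NULL need ISNULL()/IS NULL handling,
        -- since NULL = NULL never evaluates to true
    )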

  • Reason for dim fact tables ratio

    Friends.
    What is the exact reason that dimension tables should not be greater than 20% of the fact table?
    How does that really impact performance?

    Hi,
    In an extended star schema, the dimension tables store the DIM IDs and SID IDs used to access the data of the actual characteristics. These dimension tables are joined to the fact table via the DIM IDs. Now, if a DIM table is as large as the fact table, the purpose of the DIM table is defeated; it is as good as looking up the master data for each record from the fact table itself.
    That's why it is ideally suggested that the size ratio of a DIM table to the fact table be 1:10.
    Hope this info helps you.
    Regards,
    Yogesh.

  • Indexes for OWB fact tables

    Hi All,
    I have completed development in OWB with 10 fact tables; each fact table has an index such as SA_PROD_S_IDX1_1.
    Now the client has given us the production database (tablespaces and indexes) with the default tablespace USERS and unlimited quota on MASTER_DATA, MASTER_IDX, TRANS_DATA, and TRANS_IDX.
    My problem is that I can assign a tablespace to my fact tables in the table properties, but how do I assign my indexes to the index tablespaces?
    Please help me in this regard.
    Regards,
    Kumar.

    Hi,
    I have another problem.
    I created process flows, and by using email notification I can send SUCCESS / ERROR / WARNING notifications.
    In the process flow I have 3 maps.
    When there is an error or warning in any of the map executions, I only get the message that I put in the email subject and body, but I do not see exactly which map raised the error or warning.
    Is there any way to send the execution details of the map through the mail notification?
    Or please suggest any other solution.
    Regards,
    Kumar

  • 00933 error when clicking on the data tab for the selected table

    Hi,
    I'm getting the following error when selecting a table from the tree view then clicking on the data tab:
    An error was encountered performing the requested operation:
    ORA-00933: SQL command not properly ended
    00933.00000 - "SQL command not properly ended"
    *Cause:
    *Action:
    Vendor code 933
    The exact same tables created from SQL*Plus for a different user in the same database work fine. Note that I'm not typing any SQL in myself. It would appear this happens on all tables created as this user. I cannot see any difference between the permissions for this user and others that work, but the failing user does have a "-" in the username and password, which I'm suspicious of.
    This is SQL Developer v1.2.0 (build 29.98), the most recent I believe, against Oracle Database 10g Express Edition Release 10.2.0.1.0 on Linux (Ubuntu).
    Thanks
    Martin

    "-" is not allowed in identifiers and is almost certainly the cause of your problem. This can be got round by quoting i.e. using "STUPID-NAME" instead of STUPID-NAME.
    I'm slightly surprised that SQL Developer doesn't quote the statement. Perhaps it quotes table names but not user names.
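    For example (hypothetical names), the data tab's query would need the owner quoted:
    SELECT * FROM "STUPID-NAME".MY_TABLE;
    rather than:
    SELECT * FROM STUPID-NAME.MY_TABLE;   -- fails to parse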

  • Joining two fact tables for subject area

    When I try to use two simple fact tables joined through a dimension, I get the "No fact table exists at the requested level of detail" error in Answers when I pull columns from both the Fact 1 and Fact 2 tables. I have set the content on both fact tables to the lowest granularity of the dimension, with the CUST_ID, RAT_ID, ACT_ID keys. We have one-to-many relationships between the dimension and both fact tables. Any feedback is highly appreciated.
    Fact1: keys are: CUST_ID, RAT_ID, ACT_ID, YEAR
    Fields are CUST_ID, RAT_ID, ACT_ID, YEAR , Rev1, Transaction Date
    Fact2: keys are: CUST_ID, RAT_ID, ACT_ID, YEAR
    Fields are CUST_ID, RAT_ID, ACT_ID, YEAR , Rev2, CreationDate
    Dimension keys are CUST_ID, RAT_ID, ACT_ID
    Thanks,
    uday

    Hi LC,
    We have to add two fact tables F1 and F2 to an existing BMM. These fact tables have history tables F11 and F22, and we have to use partition logic for this. How did you do the partitioning? You should be using fragmentation logic for that. In any case, you will add the F11/F22 tables to the F1 and F2 LTS, so when you join F1 and F2 to the common dimension it should work for the calculated measures, but don't forget to create hierarchies and specify the content levels for the fact tables.
    Thanks,
    Saichand
