Import individual dimension/fact tables, or use views?
Hi,
Just a quick question to establish whether there is a difference between importing a view created on top of our dimensional model and importing the individual tables, i.e. the dimensions and the fact table.
Many thanks,
Rhys
Thank you both for your responses.
I am fairly new to dimensional modelling and data warehousing, and was just curious.
My instinct is to use the dimension and fact tables rather than views, but I was wondering what sort of circumstances would favour one approach over the other?
In our first steps we modelled a fairly simple process, imported the model and displayed the data on our dashboards, and this worked just fine.
Regards,
Rhys
Similar Messages
-
Import SQL into Fact Table - Conversion Error
Hi All,
I am new to BPC MS and do not know much about MS SQL.
I am facing a conversion error between Unicode and non-Unicode string data types.
We are using SAP BPC 10.0 MS SP13, EPM-Addin SP18 on .Net 4
The below is the error log
Total Step: 4
SQLToTxt: Failed in 0 seconds
Import SQL into Fact Table: Failed in 0 seconds
[Selection]
DB = BPC
TABLE = GlBalanceOpening
COLUMNS = Account,Auxiliary,Curr,BalanceDateTime,Amount
TRANSFORMATION = \JMG\LIQ_JMG\DataManager\TransformationFiles\GLOpeningBalances.xls
CLEARDATA = No
RUNLOGIC = No
PROCESSCUBE = Yes
CHECKLCK = No
[Message]
An error occurred while executing a package.
Package Error Events:
ErrorCode = -1071636234
Source = SQLToTxt
SubComponent= OLE DB Source [68]
Description = Column "Account" cannot convert between unicode and non-unicode string data types.
IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
Package Error Events:
ErrorCode = -1071636234
Source = SQLToTxt
SubComponent= OLE DB Source [68]
Description = Column "AccountName" cannot convert between unicode and non-unicode string data types.
IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
Package Error Events:
ErrorCode = -1071636234
Source = SQLToTxt
SubComponent= OLE DB Source [68]
Description = Column "Auxiliary" cannot convert between unicode and non-unicode string data types.
IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
Package Error Events:
ErrorCode = -1071636234
Source = SQLToTxt
SubComponent= OLE DB Source [68]
Description = Column "AuxiliaryName" cannot convert between unicode and non-unicode string data types.
IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
Package Error Events:
ErrorCode = -1071636234
Source = SQLToTxt
SubComponent= OLE DB Source [68]
Description = Column "Curr" cannot convert between unicode and non-unicode string data types.
IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
Package Error Events:
ErrorCode = -1071636234
Source = SQLToTxt
SubComponent= OLE DB Source [68]
Description = Column "Dr" cannot convert between unicode and non-unicode string data types.
IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
Package Error Events:
ErrorCode = -1071636234
Source = SQLToTxt
SubComponent= OLE DB Source [68]
Description = Column "Cr" cannot convert between unicode and non-unicode string data types.
IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
Package Error Events:
ErrorCode = -1073450901
Source = SQLToTxt
SubComponent= SSIS.Pipeline
Description = "component "OLE DB Source" (68)" failed validation and returned validation status "VS_ISBROKEN".
IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
Package Error Events:
ErrorCode = -1073450996
Source = SQLToTxt
SubComponent= SSIS.Pipeline
Description = One or more component failed validation.
IDOfInterfaceWithError= {C81DFC5A-3B22-4DA3-BD3B-10BF861A7F9C}
Package Error Events:
ErrorCode = -1073594105
Source = SQLToTxt
SubComponent=
Description = There were errors during task validation.
IDOfInterfaceWithError= {B4E78907-3D9C-4229-9DB9-6A311E45C779}
Thanks and Regards,
Raj
Hi Raj,
You have to modify the script of your package; please see SAP Note 1629737 - Error in ImportSQL Data Manager Package.
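Beyond the note, a generic workaround for this class of SSIS error is to align the string types in the source query itself, so that the OLE DB Source metadata matches the Unicode destination. This is a sketch only, using the table and column names from the log above; the nvarchar lengths are assumptions:

```sql
-- Cast each offending column to Unicode (nvarchar) in the source SELECT.
-- An alternative is adding a Data Conversion transformation inside the package.
SELECT CAST(Account   AS nvarchar(50)) AS Account,
       CAST(Auxiliary AS nvarchar(50)) AS Auxiliary,
       CAST(Curr      AS nvarchar(10)) AS Curr,
       BalanceDateTime,
       Amount
FROM GlBalanceOpening;
```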
Regards
Roberto -
Ordering your child block using other tables, without using views
Hi all,
Just play with this: the Query Data Source Name property of a data block.
Benefit: you can order the child block using other tables, without using a view.
I have used two tables in this property, and it works fine.
e.g. Dept is master block
and the child block is Emp, in combination with emp_category.
So now I can order the Emp block by emp category.
master block : dept
Query datasource name : dept
child block : emp
Query datasource name : emp a , emp_category b
From
Chirag Patel
Ok, I'd probably go with the two-insert method, but as an alternative, here is a way of doing it in a single insert:
create table emp_info as
select 1 empno, 'a' empname, 101 deptid, 'aaa' deptname from dual union all
select 2, 'b' empname, 201 deptid, 'bbb' deptname from dual union all
select 3, 'c' empname, 101 deptid, 'aaa' deptname from dual union all
select 4, 'd' empname, 101 deptid, 'aaa' deptname from dual union all
select 5, 'e' empname, 301 deptid, 'ccc' deptname from dual;
create table emp (empno number primary key, empname varchar2(3), deptid number);
create table dept (deptid number primary key, deptname varchar2(3));
insert all
WHEN rn = 1 THEN
into dept (deptid, deptname)
values (deptid, deptname)
WHEN rn > 0 THEN
into emp (empno, empname, deptid)
values (empno, empname, deptid)
select empno, empname, deptid, deptname, rn
from (select empno,
empname,
deptid,
deptname,
row_number() over (partition by deptid order by empno) rn
from emp_info);
8 rows inserted
commit;
select * from emp;
EMPNO EMP DEPTID
1 a 101
3 c 101
4 d 101
2 b 201
5 e 301
select * from dept;
DEPTID DEP
101 aaa
201 bbb
301 ccc
You should test both methods to see which one is more performant. -
Snapshot Dimension & Fact tables
We are currently designing a logical multidimensional model from OLTP tables. The dimension tables have monthly snapshots, because some or all of the attributes might change on a monthly basis; likewise, the fact table has a monthly snapshot.
I know that according to Kimball's modelling, such dimension attributes should be implemented using mini-dimensions, with the combination key placed as a foreign key in the fact table, but this step needs an ETL job to handle the mini-dimension and the fact table. However, in our situation, and given the scope limitations, there is no time to design a separate ETL process to handle a mini-dimension.
An example for our records:
Customer:
Cust_ID
Month_ID
Attr1
Attr2
Attr3
Attr20
Installments
Installment_ID
Cust_ID
Month_ID
Attr1
Attr2
Attr3
Attr5
Measure1
Measure2
Measure3
Measure4
So the Installments table contains attributes as well as measures. What is the suitable OBIEE logical BM design: should we consider the Installments table a fact, or should we divide it into dimension and fact tables?
Xerox wrote:
should we consider the Installments table a fact, or should we divide it into dimension and fact tables?
You already gave the answer yourself: you should create an Installment dimension and an Installment fact table, using the same physical table as the logical table source. The logical dimension should contain only the attributes; the logical fact should contain only the measures. -
Cartesian product of three dimensions - fact table is too big
Hi, I need some advice on OLAP design. I have tables with orders, customers and product types. I want to know the number of orders for every combination of customer, date since the customer was created, and product type. I also need to filter out days on which no order was made, and the opposite (by dimension). Each customer also has an interval of time during which he is considered a new customer, and I need to be able to filter those customers too (again by dimension).
At the moment my fact table holds every combination of customer / product type / date, with the number of orders and a was-a-new-customer bit flag. The problem is that I have approx. 100k customers, so this table has billions of rows. Is there any other possible solution? The cube is browsed in Excel, where I cannot easily filter by measure/calculation.
It seems like every customer is a new customer at some point in time, so you should probably store the first order date in the customer table. Depending on the date (period) selected, a customer then becomes new, active or inactive. Also create measures for new customer sales and old customer sales, by comparing the customer's first order date + 30 days with the last date in the selected period. I would assume it would be difficult to do this in the cube. Did you try to solve the problem using SQL queries?
Good luck.
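The suggestion above can be sketched in SQL (a sketch only; orders and customer are assumed table and column names, and the 30-day "new customer" window is taken from the text):

```sql
-- Split sales into new- vs old-customer buckets using the customer's
-- first order date; within 30 days of it, a sale counts as "new".
SELECT c.customer_id,
       SUM(CASE WHEN o.order_date <  c.first_order_date + 30
                THEN o.amount ELSE 0 END) AS new_cust_sales,
       SUM(CASE WHEN o.order_date >= c.first_order_date + 30
                THEN o.amount ELSE 0 END) AS old_cust_sales,
       SUM(o.amount)                      AS total_sales
FROM   orders   o
JOIN   customer c ON c.customer_id = o.customer_id
GROUP  BY c.customer_id;
```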
Customer   First Ord Date   New customer till
Cust A     3/1/2014         4/1/2014

Period     New Cust Sales   Old Cust Sales   Total Sales (New + old)
2014-01
2014-02
2014-03    10                                10
2014-04                     20               20
2014-05
-
Calc problem with fact table measure used as part of bridge table model
Hi all,
I'm experiencing problems with the calculation of a fact table measure ever since I've used it as part of a calculation in a bridge table relationship.
In a fact table, PROJECT_FACT, I had a column (PROJECT_COST) whose default aggregate was SUM. Whenever PROJECT_COST was used with any dimension, the proper aggregation was done at the proper levels. But, not any longer. One of the relationships PROJECT_FACT has is with a dimension, called PROJECT.
PROJECT_FACT contains details of employees and each day they worked on a PROJECT_ID. So for a particular day, employee, Joe, might have a PROJECT_COST of $80 for PROJECT_ID 123, on the next day, Joe might have $40 in PROJECT_COST for the same project.
Dimension table, PROJECT, contains details of the project.
A new feature was added to the software - multiple customers can now be charged for a PROJECT, where as before, only one customer was charged.
This percentage charge break-down is in a new table - PROJECT_BRIDGE. PROJECT_BRIDGE has the PROJECT_ID, CUSTOMER_ID, BILL_PCT. BILL_PCT will always add up to 1.
So, the bridge table might look like...
PROJECT_ID CUSTOMER_ID BILL_PCT
123 100 .20
123 200 .30
123 300 .50
456 400 1.00
678 400 1.00
So for project 123 there is a breakdown across multiple customers (.20, .30, .50).
Let's say in PROJECT_FACT, if you were to sum up all PROJECT_COST for PROJECT_ID = 123, you get $1000.
Here are the steps I followed:
- In the Physical layer, PROJECT_FACT has a 1:M with PROJECT_BRIDGE as does PROJECT to PROJECT_BRIDGE (a 1:M).
PROJECT_FACT ===> PROJECT_BRIDGE <=== PROJECT
- In the Logical layer, PROJECT has a 1:M with PROJECT_FACT.
PROJECT ===> PROJECT_FACT
- The fact logical table source is mapped to the bridge table, PROJECT_BRIDGE, so now it has multiple tables it maps to (PROJECT_FACT & PROJECT_BRIDGE). They are set for an INNER join.
- I created a calculation measure, MULT_CUST_COST, using physical columns, that calculates the sum of the PROJECT_COST X the percentage amount in the bridge table. It looks like: SUM(PROJECT_FACT.PROJECT_COST * PROJECT_BRIDGE.BILL_PCT)
- I brought MULT_CUST_COST into the Presentation layer.
We still want the old PROJECT_COST around until it gets phased out, so it's in the Presentation layer as well.
Let's say I had a request with only PROJECT_ID, MULT_CUST_COST (the new calculation), and PROJECT_COST (the original). I'd expect:
PROJECT_ID MULT_CUST_COST PROJECT_COST
123 $1000 $1000
I am getting this for MULT_CUST_COST, however, for PROJECT_COST, it's tripling the value (possibly because there are 3 percent amounts?)...
PROJECT_ID MULT_CUST_COST PROJECT_COST
123 $1000 (correct) $3000 (incorrect, it's been tripled)
If I were to look at the SQL, it would have:
SELECT SUM(PROJECT_COST),
SUM(PROJECT_FACT.PROJECT_COST * PROJECT_BRIDGE.BILL_PCT),
PROJECT_ID
FROM ...
GROUP BY PROJECT_ID
PROJECT_COST used to work correctly before modeling a bridge table.
Any ideas on what I did wrong?
Thanks!
Hi,
Phew, what a long question!
If I understand correctly, I think the problem lies with your old cost measure, or rather with combining it with your new one in the same request. If you think about it, your query as described will bring back 3 rows from the database, which is why your old cost measure is being multiplied. I suspect that if you took it out of the query, your bridge table would work properly for the new measure alone?
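The row multiplication is easy to see in a toy query (a sketch; assume a single $1000 fact row for project 123 and the three bridge rows above): the join repeats each fact row once per BILL_PCT row, so the plain SUM is tripled, while the weighted SUM stays correct because the percentages add up to 1.

```sql
SELECT f.PROJECT_ID,
       SUM(f.PROJECT_COST)              AS plain_sum,    -- tripled by the join fan-out
       SUM(f.PROJECT_COST * b.BILL_PCT) AS weighted_sum  -- correct: weights sum to 1
FROM   PROJECT_FACT   f
JOIN   PROJECT_BRIDGE b ON b.PROJECT_ID = f.PROJECT_ID
GROUP  BY f.PROJECT_ID;
```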
I would consider migrating your historic data into the bridge table model so that you have a single type of query; each historic row would then have a single row in the bridge with a BILL_PCT of 1.0.
Best of luck,
Paul
http://total-bi.com -
Join Between Dimension & Fact Table
Hi all,
Here is my scenerio:
There are two tables, one dimension table and one fact table.
Normally, they will be joined with a key, says Date_ID.
However, these two tables do not have the same key.
Question: can I create derived columns in both tables to act as the key (assuming the key values are the same) and join them together? If yes, in which layer (physical or business model) should the join be done? Will the data be shown correctly in Answers?
Thanks.
Thanks for the reply.
I know that if two tables share a key, say a primary key and a foreign key, they can be joined directly. However, these tables do not have a common key/column to join on.
Actually, I've tried creating dummy columns in both tables and joining them in the physical layer, then using those dummy columns in the business model layer to get the values from other columns, with a complex join carried out later in the business model layer. It seems that this does work. -
How do I use Derived Table to dynamically choose fact table
How do I use the Derived Table functionality to dynamically choose a fact table?
I am using BO XI R2 querying against Genesys Datamart kept in Oracle 10g. The datamart contains aggregated fact tables at different levels (no_agg, hour, day, week, etc...) I would like to build my universe so that if the end user chooses a parameter to view reports at daily granularity, then the daily fact table is used; choose hourly granularity, then hourly fact table is used, etc....
I tried using dynamic SQL in Oracle syntax, but the Business Objects universe didn't like that type of coding.
The tables look something like this:
O_LOB1_NO_AGG o
inner join V_LOB1_NO_AGG v on o.object_id = v.object_id
inner join T_LOB1_NO_AGG t on v.timekey = t.timekey
Likewise, in the 'hour', 'day', 'week', etc. fact tables, the primary key to foreign key names and relationships are the same, and the columns in each O_, V_, T_ fact table are the same or very similar (just aggregated at different levels of time).
I was thinking of going a different route and using aggregate awareness, but there are many lines of business (20+) and multiple time dimensions (7), and I believe aggregate awareness would require me to place all relevant tables in the universe as separate objects, creating a large universe with many table objects that would not be maintenance-friendly. I was also going to choose the line of business (LOB) dynamically in the derived tables, based on an end-user LOB parameter, but that is out of scope for my current question; it just points you down the train of thought I am travelling. Thanks for any help you can provide!
You can create a derived table containing a union like the following:
select a,b,c from DailyFacts where @prompt('View'....) = 'Daily' and (<rest of your where conditions here if necessary>)
union
select a,b,c from MonthlyFacts where @prompt('View'....) = 'Monthly' and (<rest of your where conditions here if necessary>)
union
select a,b,c from YearlyFacts where @prompt('View'....) = 'Yearly' and (<rest of your where conditions here if necessary>)
I assume that you are familiar with the @prompt syntax
Regards,
Stratos -
Dimension Table Attributes giving No Fact Table Exists Error
Hi Experts,
My OBIEE version is 11.1.1.6.10. There is a presentation table whose columns come from multiple logical tables. I am dragging into the report 2 columns that come from 2 different logical tables.
The result displays, but when I checked the query, a fact table is also included, and the results are affected by it.
Could anybody give me any ideas?
Hi SriramKarthik,
Aj (bi007) already gave you the answer on how to change the implicit fact table, which is the one you are seeing in your physical query.
I'm just not sure what your question is: are you surprised to see a fact table used in the physical SQL, and looking for a way to avoid it because, as you say, the results are impacted by that implicit fact table? You can't get rid of that table; you can only choose which one it is (the implicit one).
OBIEE must join your 2 dimensions one way or another, and in your BMM these dimensions are joined through a fact table; that's why you see it in your query. -
Modelling Time Dimension with Fact Table containing Start Date and End Date
Hi Gurus,
I have a time dimension with levels from year down to date. I have a fact table consisting of Start Date, End Date, Person ID and Department ID.
How do I design the time dimension against the fact table for the scenario below?
In the dashboard I have start month and end month as prompts.
In the report I need to display Count(Person ID) for the months between the Start Date and End Date, as a trend.
For instance, if I select Jan-2009 as start date and Apr-2009 as end date, then I need to display the Count(Person ID) for Jan-2009, Feb-2009, Mar-2009 and Apr-2009.
I cannot connect the time dimension to only the Start Date, or only the End Date, and still get the trend along the months.
Please advise on the issue I am having.
Hi,
Thanks for the response. In fact I tried using a complex join in the physical layer: I joined the Time table to the fact table using >=, then took an alias of the Time table and joined it to the fact table using <=. Coming to the BMM, I do not know how to design this: if I merge the time dimension and its alias into a single table, the values will not be correct, and if I keep them as separate columns, I cannot show the trend, as they are different columns.
Can you please let me know where I am going wrong?
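At the physical-SQL level, what is being attempted amounts to a range join of the time dimension between the fact's start and end dates. A sketch with assumed table and column names (this shows the query shape only, not the BMM design):

```sql
-- One row per (person, month) for every month the interval covers,
-- so Count(Person ID) can be trended by month.
SELECT t.MONTH_NAME,
       COUNT(f.PERSON_ID) AS person_count
FROM   FACT_TABLE f
JOIN   TIME_DIM   t
  ON   t.MONTH_START BETWEEN f.START_DATE AND f.END_DATE
GROUP  BY t.MONTH_NAME;
```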
Thanks -
Delete only the Fact table using process chain?
Hi Experts,
I have an issue where I have to delete the contents of an InfoCube. When I drag the process type "Delete Contents of the Data Target" into the process chain, I am able to delete the contents of the cube.
But in my case I need to delete only the FACT table, not the DIMENSION tables. The system does prompt whether to delete the FACT table only, or both the FACT and DIM tables.
Please advise whether it is possible to delete only the FACT table using process chains.
Thanks in advance
Hi,
Check this link:
http://help.sap.com/saphelp_nw04/helpdata/en/80/1a6473e07211d2acb80000e829fbfe/content.htm
Regards,
Shikha -
Reg: Using Multiple fact tables in the RPD!
Hello everyone,
Can I get some help with the following scenario?
We use OBIEE 11g.
We have a report that uses only 1 fact table to retrieve the data, and it takes a very long time to bring up the data in the report.
My question is: is there a way to improve the performance of the report by splitting the data from the 1 fact table into 2 fact tables and making the report use both?
Are there any other ways of tuning the report for better performance.
Thanks for the help in advance!
Ajay.
Hi Ajay,
Follow these steps for tuning:
1. First, tune the SQL query generated by the report and make the necessary changes (put indexes on the required columns so that the query scans an index instead of the entire table).
2. Create an aggregate fact table and use aggregate navigation to improve performance (the idea is to reduce the data set the SQL is fired against).
3. Create partitions on the FACT table in the database, so that only the required partitions are queried, as per the filters.
4. Lastly, if none of the above tunes your query, you can try splitting the FACT table into two or more tables and then use fragmentation content in the BMM layer of the RPD, so that the appropriate tables are hit for each report.
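Point 3 might look like this in DDL (a sketch with invented table and column names; Oracle range-partitioning syntax):

```sql
-- Range-partition the fact by date key so that date-filtered reports
-- scan only the relevant partitions (partition pruning).
CREATE TABLE SALES_FACT (
  DATE_KEY   NUMBER,
  PRODUCT_ID NUMBER,
  AMOUNT     NUMBER
)
PARTITION BY RANGE (DATE_KEY) (
  PARTITION p2013 VALUES LESS THAN (20140101),
  PARTITION p2014 VALUES LESS THAN (20150101),
  PARTITION pmax  VALUES LESS THAN (MAXVALUE)
);
```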
Mark Correct/Helpful if it helps.
Best of Luck,
Kashi -
Using WHERE NOT EXISTS for a Fact Table Load
I'm trying to set up a fact table load using T-SQL, and I need to use WHERE NOT EXISTS. All of the fields from the fact table are listed in the WHERE NOT EXISTS clause. What I expect is that if the value of any one of the fields is different, the whole record is treated as a new record and inserted into the table. However, in my testing, when I "force" a field value, new records are not inserted.
The following is my query:
declare
@Created_By nchar(50)
,@Created_Date datetime --do we need utc check?
,@Updated_By nchar(50)
,@Updated_Date datetime
select @Created_By = system_user
,@Created_Date = getdate()
,@Updated_By = system_user
,@Updated_Date = getdate()
insert fact.Appointment
Slot_ID
, Slot_DateTime
, Slot_StartDateTime
, Slot_EndDateTime
, Slot_Duration_min
, Slot_CreateDateTime
, Slot_CreateDate_DateKey
, Healthcare_System_ID
, Healthcare_Service_ID
, Healthcare_Supervising_Service_ID
, Healthcare_Site_ID
, Booked_Appt_ID
, Appt_Notification_Submission_DateKey
, Appt_Notification_Completion_DateKey
, Appt_Notification_Duration
, Appt_Notification_ID
, Patient_ID
, Physician_ID
, Referral_ID
, Specialty
, LanguageRequested
, Created_Date
, Created_By
, Updated_Date
, Updated_By
select distinct
Slot.Slot_ID
, Slot.Slot_Start_DateTime as Slot_DateTime --???
, Slot.Slot_Start_DateTime
, Slot.Slot_End_DateTime
, datediff(mi,slot.Slot_Start_DateTime,slot.Slot_End_Datetime) as Slot_Duration_Min
, Slot.Created_Date as Slot_CreateDateTime
, SlotCreateDate.Date_key as Slot_CreateDate_DateKey
, HSite.Healthcare_System_ID
, HSite.Healthcare_Service_ID
, HSite.Healthcare_Service_ID as Healthcare_Supervising_Service_ID
, HSite.Healthcare_Site_ID
, Ref.Booked_Appt_ID
, ApptSubmissionTime.Date_key as Appt_Notification_Submission_DateKey
, ApptCompletionTime.Date_key as Appt_Notification_Completion_DateKey
, datediff(mi,appt.SubmissionTime,appt.CompletionTime) as Appt_Notification_Duration
, Appt.Appt_Notification_ID
, pat.Patient_ID
, 0 as Physician_ID
, ref.Referral_ID
, Hsrv.Specialty
, appt.[Language] as LanguageRequested
,@Created_Date as Created_Date
,@Created_By as Created_By
,@Updated_Date as Updated_Date
,@Updated_By as Updated_By
from dim.Healthcare_System HSys
inner join dim.Healthcare_Service HSrv
on HSys.Healthcare_System_ID = HSrv.HealthCare_System_ID
inner join dim.Healthcare_Site HSite
on HSite.HealthCare_Service_ID = HSrv.Healthcare_Service_ID
and HSite.HealthCare_System_ID = HSrv.HealthCare_System_ID
inner join dim.Referral Ref
on Ref.ReferralSite_ID = HSite.Site_ID
and Ref.ReferralService_ID = HSite.Service_ID
and Ref.ReferralSystem_ID = HSite.System_ID
right join (select distinct Slot_ID, Source_Slot_ID, Slot_Start_DateTime, Slot_End_DateTime, Created_Date from dim.slot)slot
on ref.Source_Slot_ID = slot.Source_Slot_ID
inner join dim.Appointment_Notification appt
on appt.System_ID = HSys.System_ID
inner join dim.Patient pat
on pat.Source_Patient_ID = appt.Source_Patient_ID
inner join dim.SystemUser SysUser
on SysUser.Healthcare_System_ID = HSys.Healthcare_System_ID
left join dim.Calendar SlotCreateDate
on SlotCreateDate.Full_DateTime = cast(Slot.Created_Date as smalldatetime)
left join dim.Calendar ApptSubmissionTime
on ApptSubmissionTime.Full_DateTime = cast(appt.SubmissionTime as smalldatetime)
left join dim.Calendar ApptCompletionTime
on ApptCompletionTime.Full_DateTime = cast(appt.CompletionTime as smalldatetime)
where not exists
select
Slot_ID
, Slot_DateTime
, Slot_StartDateTime
, Slot_EndDateTime
, Slot_Duration_min
, Slot_CreateDateTime
, Slot_CreateDate_DateKey
, Healthcare_System_ID
, Healthcare_Service_ID
, Healthcare_Supervising_Service_ID
, Healthcare_Site_ID
, Booked_Appt_ID
, Appt_Notification_Submission_DateKey
, Appt_Notification_Completion_DateKey
, Appt_Notification_Duration
, Appt_Notification_ID
, Patient_ID
, Physician_ID
, Referral_ID
, Specialty
, LanguageRequested
, Created_Date
, Created_By
, Updated_Date
, Updated_By
from fact.Appointment
I don't have any issues with the initial insert, but records are not inserted on subsequent inserts when one of the WHERE NOT EXISTS field values changes.
What am I doing wrong?
Thank you for your help.
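As background on the pattern itself: a NOT EXISTS duplicate check normally correlates every comparison column with AND, so a row is skipped only when all of its columns already match an existing row; chaining the comparisons with OR makes the subquery find a "match" whenever any single column coincides, which suppresses almost every insert. A minimal sketch with invented names (staging.Slot is hypothetical, and NULLable columns would need explicit handling):

```sql
INSERT INTO fact.Appointment (Slot_ID, Slot_Duration_min)
SELECT s.Slot_ID, s.Slot_Duration_min
FROM   staging.Slot s              -- hypothetical source
WHERE  NOT EXISTS (
         SELECT 1
         FROM   fact.Appointment f
         WHERE  f.Slot_ID           = s.Slot_ID            -- every column ANDed
         AND    f.Slot_Duration_min = s.Slot_Duration_min  -- any difference => insert
       );
```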
cdun2
So I set up a WHERE NOT EXISTS condition as shown below. I ran the query, then updated Slot_Duration_Min to 5; some of the Slot_Duration_Min values resolve to 15. What I expect is that when I run the query again, the records where Slot_Duration_Min resolves to 15 should be inserted again, but they are not. I am using OR between the conditions in the WHERE clause because if any one of the values is different, a new record needs to be inserted:
declare
@Created_By nchar(50)
,@Created_Date datetime
,@Updated_By nchar(50)
,@Updated_Date datetime
select
@Created_By = system_user
,@Created_Date = getdate()
,@Updated_By = system_user
,@Updated_Date = getdate()
insert fact.Appointment
Slot_ID
, Slot_DateTime
, Slot_StartDateTime
, Slot_EndDateTime
, Slot_Duration_min
, Slot_CreateDateTime
, Slot_CreateDate_DateKey
, Healthcare_System_ID
, Healthcare_Service_ID
, Healthcare_Supervising_Service_ID
, Healthcare_Site_ID
, Booked_Appt_ID
, Appt_Notification_Submission_DateKey
, Appt_Notification_Completion_DateKey
, Appt_Notification_Duration
, Appt_Notification_ID
, Patient_ID
, Physician_ID
, Referral_ID
, Specialty
, LanguageRequested
, Created_Date
, Created_By
, Updated_Date
, Updated_By
select distinct
Slot.Slot_ID
, Slot.Slot_Start_DateTime as Slot_DateTime --???
, Slot.Slot_Start_DateTime
, Slot.Slot_End_DateTime
, datediff(mi,slot.Slot_Start_DateTime,slot.Slot_End_Datetime) as Slot_Duration_Min
, Slot.Created_Date as Slot_CreateDateTime
, SlotCreateDate.Date_key as Slot_CreateDate_DateKey
, HSite.Healthcare_System_ID
, HSite.Healthcare_Service_ID
, HSite.Healthcare_Service_ID as Healthcare_Supervising_Service_ID
, HSite.Healthcare_Site_ID
, Ref.Booked_Appt_ID
, ApptSubmissionTime.Date_key as Appt_Notification_Submission_DateKey
, ApptCompletionTime.Date_key as Appt_Notification_Completion_DateKey
, datediff(mi,appt.SubmissionTime,appt.CompletionTime) as Appt_Notification_Duration
, Appt.Appt_Notification_ID
, pat.Patient_ID
, 0 as Physician_ID
, ref.Referral_ID
, Hsrv.Specialty
, appt.[Language] as LanguageRequested
,@Created_Date as Created_Date
,@Created_By as Created_By
,@Updated_Date as Updated_Date
,@Updated_By as Updated_By
from dim.Healthcare_System HSys
inner join dim.Healthcare_Service HSrv
on HSys.Healthcare_System_ID = HSrv.HealthCare_System_ID
inner join dim.Healthcare_Site HSite
on HSite.HealthCare_Service_ID = HSrv.Healthcare_Service_ID
and HSite.HealthCare_System_ID = HSrv.HealthCare_System_ID
inner join dim.Referral Ref
on Ref.ReferralSite_ID = HSite.Site_ID
and Ref.ReferralService_ID = HSite.Service_ID
and Ref.ReferralSystem_ID = HSite.System_ID
right join (select distinct Slot_ID, Source_Slot_ID, Slot_Start_DateTime, Slot_End_DateTime, Created_Date from dim.slot)slot
on ref.Source_Slot_ID = slot.Source_Slot_ID
inner join dim.Appointment_Notification appt
on appt.System_ID = HSys.System_ID
inner join dim.Patient pat
on pat.Source_Patient_ID = appt.Source_Patient_ID
inner join dim.SystemUser SysUser
on SysUser.Healthcare_System_ID = HSys.Healthcare_System_ID
left join dim.Calendar SlotCreateDate
on SlotCreateDate.Full_DateTime = cast(Slot.Created_Date as smalldatetime)
left join dim.Calendar ApptSubmissionTime
on ApptSubmissionTime.Full_DateTime = cast(appt.SubmissionTime as smalldatetime)
left join dim.Calendar ApptCompletionTime
on ApptCompletionTime.Full_DateTime = cast(appt.CompletionTime as smalldatetime)
where not exists
select
Slot_ID
, Slot_DateTime
, Slot_StartDateTime
, Slot_EndDateTime
, Slot_Duration_min
, Slot_CreateDateTime
, Slot_CreateDate_DateKey
, Healthcare_System_ID
, Healthcare_Service_ID
, Healthcare_Supervising_Service_ID
, Healthcare_Site_ID
, Booked_Appt_ID
, Appt_Notification_Submission_DateKey
, Appt_Notification_Completion_DateKey
, Appt_Notification_Duration
, Appt_Notification_ID
, Patient_ID
, Physician_ID
, Referral_ID
, Specialty
, LanguageRequested
, Created_Date
, Created_By
, Updated_Date
, Updated_By
from fact.Appointment fact
where
Slot.Slot_ID = fact.Slot_ID
or
Slot.Slot_Start_DateTime = fact.Slot_DateTime
or
Slot.Slot_Start_DateTime = fact.Slot_StartDateTime
or
Slot.Slot_End_DateTime = fact.Slot_EndDateTime
or
datediff(mi,slot.Slot_Start_DateTime,slot.Slot_End_Datetime) =
fact.Slot_Duration_min
or
Slot.Created_Date = fact.Slot_CreateDateTime
or
SlotCreateDate.Date_key = fact.Slot_CreateDate_DateKey
or
HSite.Healthcare_System_ID = fact.Healthcare_System_ID
or
HSite.Healthcare_Service_ID = fact.Healthcare_Service_ID
or
HSite.Healthcare_Service_ID =
fact.Healthcare_Service_ID
or
HSite.Healthcare_Site_ID = fact.Healthcare_Site_ID
or
Ref.Booked_Appt_ID = fact.Booked_Appt_ID
or
ApptSubmissionTime.Date_key =
fact.Appt_Notification_Submission_DateKey
or
ApptCompletionTime.Date_key =
fact.Appt_Notification_Completion_DateKey
or
datediff(mi,appt.SubmissionTime,appt.CompletionTime) = fact.Appt_Notification_Duration
or
Appt.Appt_Notification_ID = fact.Appt_Notification_ID
or
pat.Patient_ID =
fact.Patient_ID
or
0 = 0
or
ref.Referral_ID = fact.Referral_ID
or
Hsrv.Specialty = fact.Specialty
or
appt.[Language] = fact.LanguageRequested -
Trying to create a fact table,error:unable to extend temp segement
Using Oracle 10g Release 2 (10.2)
SQL>
create materialized view facts_table
( s_id
, g_id
, sb_id
, sc_id
, y_id)
refresh with rowid
as select s.s_id
, g.g_id
, sb.sb_id
, sc.sc_id
, y.academicyear
from student s
, grade g
, subject sb
, school sc
, comqdhb.teachinggroup y;
ERROR at line 3:
ORA-01652: unable to extend temp segment by 1024 in tablespace POSTGRADS
I am creating the fact table as a materialized view because that is what was specified for us.
I am trying to create the fact table from the dimension tables, and it gives this error.
What is my mistake? Kindly help.
Also, when creating the fact table, do all the columns compulsorily need to be foreign keys?
Edited by: Trooper on Jan 10, 2009 5:25 AM
Edited by: Trooper on Jan 10, 2009 6:37 AM
Well, basically what you're saying is absolutely right.
I realized that what I am doing right now is a stupendous blunder.
Basically, my aim is this: there are 5 dimension tables that are created:
Student -> s_id primary key, upn (unique pupil no), name
Grade -> g_id primary key, grade, exam_level, values
Subject -> sb_id primary key, subjectid, subname
School -> sc_id primary key, schoolno, school_name
Year -> y_id primary key, year (like 2008)
s_id, g_id, sb_id, sc_id and y_id are sequences.
select * from student;
S_ID UPN FNAME COMMONNAME GENDER DOB
==============================
9062 1027 MELISSA ANNE f 13-OCT-81
9000 rows selected
select * from grade;
G_ID GRADE E_LEVEL VALUE
73 A a 120
74 B a 100
75 C a 80
76 D a 60
77 E a 40
78 F a 20
79 U a 0
80 X a 0
18 rows selected
These are basically the dimensional views.
Now, according to the specification given, I need to create a fact table, facts_table, which contains all the dimension tables' primary keys as foreign keys.
To keep the example smaller than the actual 5 dimension tables, let's say there are 2 dim tables, student and grade, with s_id and g_id as primary keys.
create materialized view facts_table(s_id,g_id)
as
select s.s_id,g.g_id
from (select distinct s_id from student)s
, (select distinct g_id from grade)gThis results in massive duplication as there is no join between the two tables.But basically there are no common things between the two tables to join,how to solve it?
Consider it when i do it for 5 tables the amount of duplication being involved, thats why there is not enough tablespace.
I was hoping, if there is no other way, to first create a fact table with just one column:
create materialized view facts_table(s_id)
as
select s_id
from student;
then
alter materialized view facts_table add column g_id number;
Then populate this g_id column by fetching all the g_id values from the grade table using some sort of loop, even though we should not use PL/SQL. I don't know if this works?
Any suggestions.
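For what it's worth, a fact table is normally populated from a transactional source row set, looking up each dimension's key through its natural key, rather than by combining the dimensions themselves; that avoids the cartesian product entirely. A sketch only: results is an assumed source table carrying the natural keys (upn, grade, subjectid, schoolno, year), which do exist in the dimensions listed above:

```sql
-- Each source row contributes one fact row; the dimension keys are
-- looked up via the natural keys, so there is no cartesian product.
create materialized view facts_table
refresh with rowid
as
select s.s_id, g.g_id, sb.sb_id, sc.sc_id, y.y_id
from   results r                          -- assumed transactional source
join   student s  on s.upn        = r.upn
join   grade   g  on g.grade      = r.grade
join   subject sb on sb.subjectid = r.subjectid
join   school  sc on sc.schoolno  = r.schoolno
join   year    y  on y.year       = r.year;
```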
Edited by: Trooper on Jan 10, 2009 6:38 AM -
Error VLD-0917 (unknown error) while deploying FACT table
Hi,
I keep getting this error while trying to deploy a FACT table:
VLD-0917: An unknown error occurred while generating <fact table name>.
An unknown error occurred while generating <fact table name>. Error details: java.lang.NullPointerException.
Do you know what it could be? This fact table was linked to a dimension that I modified (I added one more level to its hierarchy). Since I made this change, the fact table cannot be validated.
This happens to all fact tables that use this dimension. I created a new cube with the new definition of the dimension and had no trouble at all. I disassociated the dimension from the existing cubes and they validate OK. What could be happening?
I would appreciate any help you could provide.
Regards,
--oswaldo.
[osantos]
Hi,
I realized that I'm getting this error for all fact tables; I cannot deploy any of them. What could be happening? I have a dimension, linked to all the facts, which I changed recently: I had to redefine its default hierarchy as a value-based one. I don't know if this affected my cubes.
Any idea what might be happening here?
Best Regards,
--oswaldo.
[osantos]