Duplicate Physical Tables
Hi,
I am trying to import two physical tables into the repository. Both tables are the same, as I would like to do a self-join between the two tables and a fact table.
I find that I can only view data from the table I originally imported. I then copy this table, but when I try to view the data behind the copied table, I get the error message 'table or view does not exist'.
Below are the tables:
Customer_Dim - original imported table (can view data)
Customer_Dim#1 - copy of the above table (unable to view data)
Is this the normal behaviour?
Can anyone help?
Thanks
Hi,
Create an alias table for Customer_Dim; don't copy and paste.
Right-click Customer_Dim -> New Object -> Alias -> give the alias table a different name.
You can then view the data for the alias table.
Thanks,
Balaa...
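To illustrate why an alias (rather than a physical copy) is the right tool, here is a minimal sketch with hypothetical table and column names (not from the original post), run through Python's sqlite3 for convenience: the same physical table is referenced twice under two alias names in one query, with no second copy ever existing in the database.

```python
import sqlite3

# Hypothetical data: one physical table, referenced twice under two
# alias names -- the pattern an OBIEE alias table models.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customer_dim (cust_id INTEGER, parent_id INTEGER, name TEXT)")
cur.executemany("INSERT INTO customer_dim VALUES (?, ?, ?)",
                [(1, None, "HQ"), (2, 1, "Branch A"), (3, 1, "Branch B")])

# Self-join: the SAME table appears twice, once per alias.
rows = cur.execute("""
    SELECT child.name, parent.name
    FROM customer_dim AS child
    JOIN customer_dim AS parent ON child.parent_id = parent.cust_id
    ORDER BY child.name
""").fetchall()
print(rows)  # [('Branch A', 'HQ'), ('Branch B', 'HQ')]
```

This is also why the copied Customer_Dim#1 fails with 'table or view does not exist': the copy is only a repository object, and no table by that name exists in the source database, whereas an alias always queries the original table.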
Similar Messages
-
Error: 38015, Physical tables have multiple joins.
Hi,
I have 5 dimensions and 1 fact table. One of the dimension tables has 2 keys, which are referenced by the fact table.
I have created aliases for all tables on which I have defined joins.
But it is giving me an error like:
ERRORS:
GLOBAL:
*[38015] Physical tables "obidb".."ORDER_DETAILS"."FACT" and "obidb".."ORDER_DETAILS"."BILLING_ACCOUNT" have multiple joins.*
Delete new foreign key object if it is a duplicate of existing foreign key.
Please give me any suggestions.....
Thanks.
Hi,
Did you delete the existing foreign key before joining alias_dim1 (fk1) and dim1 (fk2) to the fact?
Double-check your model; this usually shows up as a circular join, and you can resolve that issue using the alias method.
In your model, check all your FK relationships. If you find FKs ending with #1 (duplicates), delete them and check metadata consistency. If that does not work, delete the dimension, import it afresh, create an alias of the dim, and then join it to the required fact. Also check the link below:
http://mtalavera.wordpress.com/2012/03/29/obieerpd-fails-global-consistency-on-joins-between-tables/
Thanks
Deva
Edited by: Devarasu on Nov 23, 2012 4:44 PM -
Error when Check global consistency: Physical tables have multiple Joins
Hi
I have a table that has multiple joins with a dimension in the physical layer. This is a fact table, and the dimension is a geographic dimension; in the fact table I have three codes: customer geography, account geography and office geography. This is a simple model and is correct for my DWH. However, when I check global consistency, the Consistency Check Manager displays the following error (three times):
ERRORS:
GLOBAL:
[38015] Physical tables "ODS".."ODS"."FT_INTERFAZ_CICLO_FACTURACION" and "ODS".."ODS"."DIM_GEOGRAFIA" have multiple joins. Delete new foreign key object if it is a duplicate of existing foreign key.
[38015] Physical tables "ODS".."ODS"."FT_INTERFAZ_CICLO_FACTURACION" and "ODS".."ODS"."DIM_GEOGRAFIA" have multiple joins. Delete new foreign key object if it is a duplicate of existing foreign key.
[38015] Physical tables "ODS".."ODS"."FT_INTERFAZ_CICLO_FACTURACION" and "ODS".."ODS"."DIM_GEOGRAFIA" have multiple joins. Delete new foreign key object if it is a duplicate of existing foreign key.
How can I solve this error?
Thanks
Edwin
I have one dim table named Team.
In the dim table there are two keys, Team Key and Team Type Key.
In the fact table there are 4 foreign keys:
a) Sales team key
b) Sales team type key
c) Trader team key
d) Trader team type key
For this purpose, I am going to create alias tables in the physical layer. Can anybody explain the whole process to me? -
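A minimal sketch of the role-playing alias pattern asked about above, with hypothetical table and column names (not from the post), run through Python's sqlite3 for convenience: one Team dimension is queried under two aliases, each joined to the fact on a different role-playing key.

```python
import sqlite3

# Hypothetical names: one Team dimension, a fact with two role-playing
# foreign keys, and one alias per role -- what OBIEE alias tables
# (e.g. Team_Dim_Sales, Team_Dim_Trader) would implement.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE team_dim (team_key INTEGER, team_name TEXT)")
cur.executemany("INSERT INTO team_dim VALUES (?, ?)", [(10, "Alpha"), (20, "Beta")])
cur.execute("CREATE TABLE sales_fact (sales_team_key INTEGER, trader_team_key INTEGER, amount REAL)")
cur.execute("INSERT INTO sales_fact VALUES (10, 20, 99.5)")

# Each alias joins the fact on a different role-playing key.
rows = cur.execute("""
    SELECT s.team_name AS sales_team, t.team_name AS trader_team, f.amount
    FROM sales_fact f
    JOIN team_dim AS s ON f.sales_team_key = s.team_key
    JOIN team_dim AS t ON f.trader_team_key = t.team_key
""").fetchall()
print(rows)  # [('Alpha', 'Beta', 99.5)]
```

The same idea extends to four keys: one alias of the dimension per foreign key, each with its own physical join to the fact.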
Duplicating Physical Tables vs Aliasing Physical Tables
Hi All,
What is the significance of aliasing a physical table over duplicating it? In which scenarios do we use each strategy?
Thanks and Regards,
Sreekanth.
Hi,
An alias is different from a duplicate. A duplicated table carries all its joins with other tables and is mainly used for reference, whereas an alias is used when we have two joins to the same fact.
thanks and regards
Ranganathan T.A. -
Oracle Data Modeler commit change seg physical table
Hello,
Sometimes when I save a data model, Oracle Data Modeler (under SVN) shows an outgoing change to a physical table segment. If these changes are committed, duplicate files are produced in the physical model, and later, when the physical model is opened, it reports errors about the duplicate files.
How can this error be corrected? Why does the segment change occur?
Thanks
Hello,
Version 4.0.0.833.
There is no pattern; it happens occasionally when saving the data model. How can this error be corrected? Why does the segment change occur?
Thanks -
Add records to the physical table from Internal table
Hello,
I am trying to insert the records from an internal table into a physical table, but it is not inserting. Please let me know how I can add the records to the table zebp_iv_cf_log.
* I have used only a few fields, for example
* After looping I get about 800 records in the it_non_ebp table
loop at non_ebp_inv.
it_non_ebp-zspgrv = non_ebp_inv-spgrv.
it_non_ebp-zspgrq = non_ebp_inv-spgrq.
it_non_ebp-zspgrs = non_ebp_inv-spgrs.
it_non_ebp-inv_ref_num = non_ebp_inv-xblnr.
it_non_ebp-zspgrc = non_ebp_inv-spgrc.
it_non_ebp-zlifnr = non_ebp_inv-lifnr.
append it_non_ebp.
endloop.
insert zebp_iv_cf_log from table it_non_ebp[] accepting duplicate keys .
I also tried inserting one record at a time by putting the INSERT statement inside a loop, but it just keeps processing.
Shall appreciate the response.
Thks & Rgds,
Hemal
Edited by: hemalgandhi on Jun 12, 2009 6:27 PM
Hi,
The internal table you are using for APPEND must be declared with the header line option, as:
DATA it_non_ebp TYPE STANDARD TABLE OF zebp_iv_cf_log WITH HEADER LINE.
or else you can have a separate workarea of type zebp_iv_cf_log
as
data wa_zebp_iv_cf_log type zebp_iv_cf_log.
then the code should be like :
loop at non_ebp_inv.
wa_zebp_iv_cf_log-zspgrv = non_ebp_inv-spgrv.
wa_zebp_iv_cf_log-zspgrq = non_ebp_inv-spgrq.
wa_zebp_iv_cf_log-zspgrs = non_ebp_inv-spgrs.
wa_zebp_iv_cf_log-inv_ref_num = non_ebp_inv-xblnr.
wa_zebp_iv_cf_log-zspgrc = non_ebp_inv-spgrc.
wa_zebp_iv_cf_log-zlifnr = non_ebp_inv-lifnr.
append wa_zebp_iv_cf_log to it_non_ebp.
clear wa_zebp_iv_cf_log.
endloop.
and use
modify zebp_iv_cf_log from table it_non_ebp accepting duplicate keys .
instead of
insert zebp_iv_cf_log from table it_non_ebp[] accepting duplicate keys .
Please try it out and let me know if you face any problem -
Using case when statement in the select query to create physical table
Hello,
I have a requirement where I have to execute a CASE WHEN statement with a session variable while creating a physical table using a select query. Let me explain with an example.
I have a physical table based on a select statement with one column.
SELECT 'VALUEOF(NQ_SESSION.NAME_PARAMETER)' AS NAME_PARAMETER FROM DUAL. Let me call this table as the NAME_PARAMETER table.
I also have a customer table.
My dashboard has two pages. Page 1 contains a table on the customer table, with column navigation to my second dashboard page.
On my second dashboard page I created a report based on the NAME_PARAMETER table and a prompt based on the customer table that sets the NAME_PARAMETER request variable.
EXECUTION
When I click on a particular customer, the prompt sets the variable NAME_PARAMETER and the NAME_PARAMETER table shows the appropriate customer.
Everything works as expected. Yay!!
Now I created another table, NAME_PARAMETER1, with a little modification to the earlier table. The query is as follows:
SELECT CASE WHEN 'VALUEOF(NQ_SESSION.NAME_PARAMETER)'='Customer 1' THEN 'TEST_MART1' ELSE TEST_MART2' END AS NAME_PARAMETER
FROM DUAL
Now I pull in this table into the second dashboard page along with the NAME_PARAMETER table report.
Surprisingly, the NAME_PARAMETER table report executes as is, but the other report, based on the NAME_PARAMETER1 table, fails with the following error.
Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 16001] ODBC error state: S1000 code: 1756 message: [Oracle][ODBC][Ora]ORA-01756: quoted string not properly terminated. [nQSError: 16014] SQL statement preparation failed. (HY000)
SQL Issued: SET VARIABLE NAME_PARAMETER='Novartis';SELECT NAME_PARAMETER.NAME_PARAMETER saw_0 FROM POC_ONE_DOT_TWO ORDER BY saw_0
If anyone has any explanation to this error and how we can achieve the same, please help.
Thanks.Hello,
Updates :) sorry.. the error was a stupid one. I resolved it, and then got stuck at my next step.
I am creating a physical table using a select query. But I am trying to obtain the name of the table dynamically.
Here is what I am trying to do. The select query of the physical table is as follows:
SELECT CUSTOMER_ID AS CUSTOMER_ID, CUSTOMER_NAME AS CUSTOMER_NAME FROM 'VALUEOF(NQ_SESSION.SCHEMA_NAME)'.CUSTOMER.
The idea behind this is to obtain the data from the same table in different schemas dynamically, based on a session variable. Please let me know if there is a way to achieve this; if not, please let me know whether this can be achieved by any other method in OBIEE.
Thanks. -
Using multiple physical tables in a single logical dimension table
I have two physical tables that are related on a 1 to 1 basis based on a natural key. One of these tables is already part of my RPD file (actually, it is the W_EMPLOYEE_D from the Oracle BI Applications). The second table contains additional employee attributes from a custom table added to the data warehouse. Unfortunately, I don't seem to be able to display ANY data from this newly added custom table! I'm running on OBIEE 11.1.1.6.
Here's what I've tried to do. Let's call the original table E1 and the new one E2. E1 is part of the repository already and has functioned perfectly for years.
- In my physical model, I have imported E2 and defined the join between E1 and E2.
- In my logical table for E1, I've mapped E2 to E1 (E2 appears as a source), set up an INNER JOIN in the joins section for E1 and added the attributes from E2 in the folder
- In the SOURCES for this logical table, I've set the logical level of the content for E2 appropriately (detail level, same as E1)
- In my presentation folder for E1, I've copied in the attributes from E2 that were included in my logical table
Consistency check runs smoothly, no warnings or errors. Note: E2 contains hundreds of rows, all of which have matching records in E1.
Now, when I create an analysis that includes only an attribute sourced from E2, I get a single row returned, with a NULL value. If I create an analysis that includes one attribute from E1 and one from E2, I get all the valid E1 records with data showing, but with NULL for the E2 attributes. Remember, I have an inner join, which means that the query is "seeing" E2 data; it is just choosing not to show it to me!
Additionally, I can grab the query from the NQQuery.log file. When I run this SQL in SQL*Developer, I get PERFECT results - both E1 and E2 attributes show up - so the query engine is generating valid SQL. The log file does not indicate any errors either; it shows the correct number of rows being added to cache.
If I create a report that includes attributes from E1, E2 and associated fact metrics, I get similar results. The reports seem to run fine, but all my E2 attributes are NULL in Answers. I've verified the basics (data types, etc.), and when I "Query Related Objects" in the repository, everything looks consistent across all 3 layers and all objects. E2 is located in the same (Oracle) database and schema as E1, and there are no security constraints in effect.
I've experimented with a lot of different things without success, but I expected that the above configuration should have worked. Note that I cannot set up E2 as a new separate dimension, as it does not contain the key value used to join to the facts, nor do the facts contain the natural key that is in both E1 and E2.
Sorry for the long post - just trying to head off some of the questions you might have.
Any ideas welcomed! Many thanks!
Eric
Hi Eric,
I would like you to re-check the content level settings here, as they are the primary cause of this kind of behavior. You may notice that the same information is written in the logical plan of the query too.
Also, as per your description
"In the SOURCES for this logical table, I've set the logical level of the content for E2 appropriately (detail level, same as E1)"
I would like you to check this point again: since you mapped E2 to E1 in the same logical table source with an inner join, you should set the content level at E1's levels, not E2's (E2 has now become part of the E1 hierarchy too). This might be the reason the BI Server is choosing to eliminate (null out) the values from E2, even though you can see them in the SQL client.
Hope this helps.
Thank you,
Dhar -
OBIEE generated SQL differs if it's a Physical Table or Select Table...
Hi!
I have some tables defined in the Physical Layer, which some are Physical Tables and others are OBIEE "views" (tables created with a Select clause).
My problem is that the generated SQL for the same table differs (as expected) depending on whether it is a Physical Table or a "Select Table", and this difference causes problems in the returned data. When it is a Physical Table, the final report returns the correct data, but when it is a Select Table it returns incorrect/incomplete data. The report joins this table with a table from a different database (it is a join between Sybase IQ and SQL Server).
This is the generated SQL in the log:
-- Physical Table generated SQL
select T182880."sbl_cust_acct_row_id" as c1,
T182880."sbl_cust_acct_ext_key" as c2,
T182880."sbl_cust_source_sys" as c3
from
"SGC_X_KEY_ACCOUNT" T182880
order by c2, c3
-- "Select Table" generated SQL
select
sbl_cust_acct_ext_key,
ltrim(rtrim(sbl_cust_source_sys)) as sbl_cust_source_sys,
sbl_cust_acct_row_id,
sbl_cust_acct_camp_contact_row_id,
ods_date,
ods_batch_no,
ods_timestamp
from dbo.SGC_X_KEY_ACCOUNT
As you may notice, the main differences are the use of aliases (which I think has no influence on the report result) and the use of ORDER BY (which I am starting to think is the main cause of the correct data being returned).
Don't forget that the OBIEE server is joining the data from this table with data from another table in a different database, so the join is made in memory (in the OBIEE engine). Maybe in the OBIEE engine the ORDER BY is essential to guarantee a correct join... but then again, I have some other tables in the Physical Layer that are defined as "Select" whose generated SQL uses the aliases and the ORDER BY clause...
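The hunch about ORDER BY can be illustrated with a toy sketch (hypothetical data; this is not the actual OBIEE engine code): a merge-style join that assumes sorted inputs silently loses matches when one input arrives unsorted, which is consistent with the incomplete results described above.

```python
def merge_join(left, right):
    """Merge join on the first tuple element.
    Correct ONLY if both inputs are sorted by that element
    (unique keys assumed for simplicity)."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i][0] == right[j][0]:
            out.append((left[i][0], left[i][1], right[j][1]))
            i += 1
            j += 1
        elif left[i][0] < right[j][0]:
            i += 1
        else:
            j += 1
    return out

left_rows = [(1, "a"), (2, "b"), (3, "c")]        # sorted input
right_sorted = [(1, "x"), (2, "y"), (3, "z")]
right_unsorted = [(3, "z"), (1, "x"), (2, "y")]   # same rows, no ORDER BY

print(merge_join(left_rows, right_sorted))    # all three matches found
print(merge_join(left_rows, right_unsorted))  # matches silently dropped
```

This only illustrates why sorted inputs matter to a merge-style stitch join; the join strategy the BI Server actually picks is internal to OBIEE.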
To solve my problem, I had to transform the "Select Table" into a "Physical Table". The reason it was defined as a "Select Table" was that it had a restriction in the WHERE clause (which I have already eliminated, although the performance will be worse).
I'm confused. Help!
Thanks.
FPG
Hi FPG,
Not sure if this is a potential issue for you at all, but I know it caused me all kinds of headaches before I figured it out. It had to do with the values on the "Features" tab in the database object's settings in the Physical Layer:
Different SQL generated for physical table query vs. view object query?
Mine had to do with SQL from View objects not being submitted as I would expect; it sounds like yours has more to do with ORDER BY. I believe I remember seeing some Order By and Group By settings in the "Features" list. You might make a copy of your RPD, experiment with setting some of those if they aren't already selected, and retest your queries with the new DB settings.
Jeremy -
Files moving to NFS error folder - Could not insert message into duplicate check table
Hi Friends
Has anyone faced this error? Could you suggest why it occurs?
The CSV files fail on the sender channel and move to the NFS error path, and the log says the following:
Error: com.sap.engine.interfaces.messaging.api.exception.MessagingException: Could not insert message into duplicate check table. Reason: com.ibm.db2.jcc.am.SqlTransactionRollbackException: DB2 SQL Error
Hi Uma - is that a duplicate file? Have you enabled the duplicate file check in the sender channel?
Please check whether the note below is applicable:
1979353 - Recurring TxRollbackException with MODE_STORE_ON_ERROR stage configuration -
Dynamic physical table name vs. Cache
Hello, Experts!
I'm facing quite an interesting problem. I have two physical tables with the same structure but different data. The requirement is to show the same reports with one table or the other. The idea is to change the physical table name dynamically using a session variable. The session variable can be changed in the UI, so this worked until the cache was turned on. With the cache on, the logical statements sent to the OBI back end are the same even for different values of the session variable that stores the physical table name. Once the cache is populated, every user will get values from the cache. This is a possible source of discrepancy, because some users might run reports with tableA values and some with tableB values.
Are there any options to make OBI use the data for the proper physical table name (i.e., according to the session variable value)? Cloning the model is not an option because it would be far too hard and complex to maintain both; besides, the same reports need to work sometimes with one table name and sometimes with the other...
PS: the cache is set to be common for all users.
Lucas
Thank you, I've found another way to make it work. In fact there are two ways of doing it: filter the LTS and have all data filtered from a single table using a session variable, or use fragmentation content, also with a session variable.
Now the tricky part is to set the variable from the UI. Currently I'm issuing raw SQL: call NQSSetSessionValue( 'String SV_SIGNOFF=aaa;' ), but I have to figure out how to change a non-system session variable's value without needing administrator rights.
There is the GoURL method, but it's not working...
2. Add In ORACLE_HOME/bifoundation/web/display/authenticationschemas.xml
<RequestVariable source="url" type="informational" nameInSource="lang"
biVariableName="NQ_SESSION.LOCALE" />
inside the top <AuthenticationSchemaGroup> </AuthenticationSchemaGroup> tag -
"Select" Physical table as LTS for a Fact table
Hi,
I am very new to OBIEE, still in the learning phase.
Scenario 1:
I have a "Select" Physical table which is joined (inner join) to a Fact table in the Physical layer. I have other dimensions joined to this fact table.
In BMM, I created a logical table for the fact table with 2 Logical Table Sources (the fact table & the select physical table). No errors in the consistency check.
When I create an analysis with columns from the fact table and the select table, I don't see any data for the select table column.
Scenario 2:
In this scenario, I created an inner join between "Select" physical table and a Dimension table instead of the Fact table.
In BMM, I created a logical table for the dimension table with 2 Logical Table Sources (the dimension table & the select physical table). No errors in the consistency check.
When I create an analysis with columns from the dimension table and the select table, I see data for all the columns.
What am I missing here? Why is it not working in the first scenario?
Any help is greatly appreciated.
Thanks,
SP
Hi,
If I understand your description correctly, your materialized view skips some dimensions (infrequent ones). However, when you reference these skipped dimensions in filters, the queries hit the materialized view and fail, as these values do not exist. In this case, you could resolve it as follows:
1. Create dimensional hierarchies for all dimensions.
2. In the fact table's logical sources, set the content tabs properly. (Yes, I think this is it.)
When you skipped some dimensions, the grain of the new fact source (the materialized view in this case) is changed. For example:
Say a fact is available with the keys for Product, Customer, Promotion dimensions. The grain for this is Product * Customer * Promotion
Say another fact is available with the keys for Product, Customer. The grain for this is Product * Customer (In fact, I would say it is Product * Customer * Promotion Total).
So in the second case, the grain of the table is changed. So setting appropriate content levels for these sources would automatically switch the sources.
So, I request you to try these settings and let me know if it works.
Thank you,
Dhar -
Hi, I am really new to OBIEE 10g.
I have already set up a SQL Server 2005 database in the Physical layer and imported a view, vw_Dim_retail_branch.
The view has 3 columns: branch_id, branch_code, branch_desc.
Now I want to set up the Business model to map this physical table (view).
I created a new Business model
Added new logical table Dim_retail_branch
In the sources, added the vw_Dim_retail_branch as source table.
But in the Logical Table Source window, the Column Mapping tab is blank. I thought it would identify all the columns from vw_Dim_retail_branch, but it does not. "Show mapped columns" is ticked.
What should I do here? Manually type each column?
Hi,
You can just drag and drop the columns from the physical layer to the BMM layer.
Select the 3 columns and drag and drop them onto the logical table you created in the BMM layer.
for more reference : http:\\mkashu.blogspot.com
Regards,
VG -
Create physical table using select in repository
Hi Gurus,
Can we create a physical table in the OBIEE 11.1.1.6 repository using a stored procedure or a select?
What is the right syntax?
Thank you so much
JOE
Hi,
Yes. In the physical layer, just use a select, like below.
for example,
select field1, field2, ..., field_n
from tables
UNION
select field1, field2, ..., field_n
from tables;
http://gerardnico.com/wiki/dat/obiee/opaque_view
http://www.clearpeaks.com/blog/oracle-bi-ee-11gusing-select_physical-in-obiee-11g
http://allaboutobiee.blogspot.com/2012/05/obiee-11g-deployundeploy-view-in.html
Thanks
Deva -
The error I receive while performing a global consistency check is: [38091] Physical table 'D_TIME__EVENT_TIME' joins to non-fact table 'FS_IND_SUBS_RGE_ACT' that is outside of its time dimension table source 'D_TIME__EVENT_TIME'.
I have had this problem for some time and it is getting frustrating. The table D_TIME__EVENT_TIME is an alias of the d_time_event table. I have created a foreign key between the tables D_TIME__EVENT_TIME (time dimension) and FS_IND_SUBS_RGE_ACT in the physical layer, but every time I check consistency I get the error.
I have created multiple star schemas using the time dimension table. There are a couple of other tables that required physical foreign keys with D_TIME__EVENT_TIME, and those had no errors.
On the side, I have created another alias (d_time_) of the parent table to validate the steps taken. I created a physical foreign key between the d_time_ table and the fs_ind_subs_rge_act table, and this was successful.
I am at a loss as to what to do next. I need some help, any help.
Edited by: 794286 on Sep 20, 2010 11:35 AM
Hi,
Refer to Re: Time Dimension Problem; Joe mentioned some good points there.
thanks,
saichand.v