Error in creating time dimension
Hi,
I created a simple time dimension with two levels, year and month, and each level has one element that serves as the chronological key. This is the error I got when checking global consistency:
"The physical table 'my time table' , which is part of a time dimension table, is also used in the logical table 'my time table', which is not part of any dimension".
What exactly does it mean?
Thanks for any assistance.
Hi ,
Please provide details about how you are bringing the table into the presentation layer. It seems the tool is treating the table as a fact table, so it is very important how you define the relationships in the logical diagram. Sometimes it is easier to bring a dummy table along in the physical layer and build a sensible logical diagram. Since you are trying to create a time dimension, you want it to have drillability, so you have to create a hierarchy on the same table. The key here is to make sure the tool understands what you really want; sometimes you have to work around this, since the logical diagram is the only way the tool makes sense of the tables you bring in.
Thank you,
Mohammad Farhan Alam
Similar Messages
-
Create time dimension table in repository without data warehouse
Hi,
I want to implement only a BI repository solution for my customer (not data warehousing). Is it possible to transform the data with repository tools, so that the time columns in the fact tables are categorized by a "time dimension" table?
To be more concrete:
The "Sales" table has a "time of sale" column containing the timestamp when the sale was performed. I have imported this table into the "physical layer" of the repository. Now I want to create a new "time dimension" table, something like:
CREATE TABLE dimension_time (
Day_Key INT NOT NULL PRIMARY KEY,
Day_Timestamp DATETIME NOT NULL,
Day_Name NVARCHAR(32) NOT NULL,
Day_Text NVARCHAR(32) NOT NULL
-- ... remaining columns (day-of-week, week, month, season, quarter, semester, year, ...) were omitted from the original post; the INSERTs below supply values for all of them
);
INSERT INTO dimension_time VALUES (20110101, {d '2011-01-01'}, '1/1', 'January 1', 'Saturday', 0, 6, 1, 1, 185, 1, 201052, 'W52', 'Week 52', 52, 201101, '01', 'January', 1, 7, 1004, 'Winter', 'Winter', 20111, 'Q1', '1st Quarter', 1, 20103, 'Q3', '3rd Quarter', 3, 20111, 'S1', '1st Semester', 1, 20102, 'S2', '2nd Semester', 2, 2011, '2011', '2011', 2010, '10/11', '2010/2011', 0);
INSERT INTO dimension_time VALUES (20110102, {d '2011-01-02'}, '2/1', 'January 2', 'Sunday', 0, 7, 2, 2, 186, 2, 201052, 'W52', 'Week 52', 52, 201101, '01', 'January', 1, 7, 1004, 'Winter', 'Winter', 20111, 'Q1', '1st Quarter', 1, 20103, 'Q3', '3rd Quarter', 3, 20111, 'S1', '1st Semester', 1, 20102, 'S2', '2nd Semester', 2, 2011, '2011', '2011', 2010, '10/11', '2010/2011', 0);
and afterwards add a new column to the "Sales" fact table for the "time dimension ID", and through the repository populate this column based on the "time of sale" column and the corresponding "time dimension ID".
I know that an ETL process could perform this, but I do not want to go for data warehousing (it is not real-time, needs more resources, etc.).
Is it possible to perform such an action with the repository alone?
Thank you.
Hi,
I can do that, but it would be useful only for creating the "time dimension" table. The "sales" fact table also needs to be altered (so that the "time" column no longer contains the time value itself, but the ID of the corresponding row in the "time dimension" table).
I know that on DW this procedure is done automatically by the ETL process.
My question is: does the repository have any tools similar to this?
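For what it's worth, the timestamp-to-dimension-ID mapping described above is a pure function of the date, which is why it is normally computed in ETL or in a view rather than by a repository tool. A minimal Python sketch of that function (names are illustrative, not repository functionality), matching the YYYYMMDD keys used in the dimension_time example:

```python
from datetime import datetime

def date_key(ts: datetime) -> int:
    """Derive a YYYYMMDD integer surrogate key from a sale timestamp,
    matching keys like 20110101 in the dimension_time example above."""
    return ts.year * 10000 + ts.month * 100 + ts.day

# A sale recorded at any time of day on 2 Jan 2011 maps to key 20110102.
print(date_key(datetime(2011, 1, 2, 14, 30)))  # -> 20110102
```

Whatever tool computes it, the point is that the key is derived, never stored redundantly with the raw timestamp as the join column.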
Thank you. -
ERROR while loading time dimension table
I need to load a time dimension from CSV into an Oracle table; while loading I got the error below.
My source data type is date and the target is date.
ODI-1226: Step sample day fails after 1 attempt(s).
ODI-1240: Flow sample day fails while performing a Loading operation. This flow loads target table W_SAMPLE_DATE.
ODI-1228: Task SrcSet0 (Loading) fails on the target ORACLE connection Target_Oracle.
Caused By: java.sql.SQLException: ORA-30088: datetime/interval precision is out of range
while creating the C$ table:
create table WORKSCHEMA.C$_0W_SAMPLE_DATE
(
C3_ROW_WID NUMBER(10) NULL,
C1_CALENDAR_DATE TIMESTAMP NULL, -- generated DDL had TIMESTAMP() with an empty precision, which raises ORA-30088
C2_DAY_DT TIMESTAMP NULL
)
Check the source data and use the correct conversion function, e.g. TO_DATE(SRC.DATE, 'MM/DD/YYYY'); use NVL if required.
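As a quick way to verify that the source CSV actually matches the format the TO_DATE fix above assumes, rows can be pre-validated before the load; a hedged Python sketch (the MM/DD/YYYY format is an assumption carried over from the suggested TO_DATE mask):

```python
from datetime import datetime

def parse_src_date(value: str) -> datetime:
    """Parse a source date string in MM/DD/YYYY form, the format the
    TO_DATE fix above assumes; raises ValueError on bad rows so they
    can be routed to an error table instead of failing the whole load."""
    return datetime.strptime(value.strip(), "%m/%d/%Y")

print(parse_src_date("01/02/2011"))  # -> 2011-01-02 00:00:00
```

Running a check like this over the CSV before the interface runs surfaces malformed rows early, instead of as an ODI flow failure.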
-
OWB 904 - problems creating time dimension
I'm using an Oracle9iR2 DB and OWB 904. I don't have much experience with OWB 904 yet and am having problems creating the time dimension.
I tried to work through the example which comes with OWB 904 and did everything as written in the readme.txt file:
1. loaded the time table functions owb_time_seq.sql and owb_time.sql into my runtime user schema
2. imported the owb_bp_time.mdl file into my design repository, which created the demo project 'OWB_BP'
The problems which I have with this demo are:
1.) I get warnings when I validate the mappings (TF_TIME_MAP):
VLD-1002: Mapping object T_TIME is not bound to a repository object.
VLD-1004: Column length of L_DAY_NAME is longer than the target column length.
VLD-1004: Column length of L_MONTH_NAME is longer than the target column length.
VLD-1004: Column length of L_QUARTER_NAME is longer than the target column length.
VLD-1004: Column length of L_YEAR_NAME is longer than the target column length.
VLD-3260: No Output Attribute name is specified. In this case, the attribute’s physical name will be used.
VLD-1123: Missing location information for Module WAREHOUSE.
VLD-1115: Commit frequency is defaulted to Bulk Size.
I know how to solve the errors VLD-1002, VLD-1004, VLD-1123 and VLD-1115, but I have no idea how to solve VLD-3260 and VLD-1002, because reconcile outbound is not possible for the T_TIME dimension.
2.) When I try to import the table function
TIMEDATA (IN VARCHAR2, IN NUMBER) return TABLE
from the runtime user schema into my design repository I get the error message: Argument Data type is not supported.
3.) I can't deploy the mapping TF_TIME_MAP.
Is it possible to use the table function from the demo to create a time dimension and import it into my own project? My aim is to create a BI Beans-compliant time dimension, and I want to know what I have to bear in mind to accomplish that. The OWB904UsersGuide didn't give me enough information, so I'm asking if you can help me out.
Thanks in advance,
Dirk
Dirk,
The way you should approach the time dimension is the following:
- Run the SQL scripts into your target schema (you already did).
- Import the MDL file (you already did).
- Copy and paste the times dimension into your own project to be able to use it.
- Copy and paste the mapping you want to use to your own project.
- If necessary, modify the times dimension according to your needs; change the mapping accordingly.
- Open the mapping, do a right mouse click on the time dimension and select 'Reconcile inbound'. Select matching strategy to match by bound name.
We do not currently support the table function as an object in the metadata repository. I.e. if it exists at runtime then you can call it (as the time dimension load mappings do).
With the objects in the target schema you should be able to deploy the mapping.
Thanks,
Mark. -
Error While Loading Time Dimension
Hello
I have a requirement to load data into the Time dimension, so I mapped the data from the Time dimension which Oracle creates.
While deploying I am getting an error which says there is an error in the Time Dimension package created by Oracle.
Can anyone tell me where I should go to rectify the error?
Thanks
Sriks
Did the mapping validate OK?
If validation was OK, locate the package on the DB (same name as the mapping) and compile it; this should give you more information on what is causing the error, e.g. a table does not exist.
Si -
Error while creating Time Card Using API
Hello
I have a requirement to run an OTL interface once a day to create/update time cards in OTL for data created on sysdate-1 in the Service module (data will be fed from the Service module). For example, this interface will run on sysdate to create time cards for data created in the Service module on sysdate-1.
I have some sample data and code I am using, but it errors out with various errors. Can anybody help me with this?
My interface should run as expected below, and my time card range will be Monday 24-Dec-2012 to Sunday 30-Dec-2012:
Interface Run Date    Time Entry to Process
25-Dec-2012           24-Dec-2012 data
26-Dec-2012           25-Dec-2012 data
CREATE OR REPLACE PROCEDURE main_process (o_errbuf OUT VARCHAR2,
o_ret_code OUT NUMBER,
l_chr_from_date IN VARCHAR2,
l_chr_to_date IN VARCHAR2)
IS
l_chr_skip VARCHAR2 (1);
l_chr_task_exists VARCHAR2 (100);
l_chr_error_msg VARCHAR2 (4000);
l_num_tbb_id NUMBER;
l_num_request_id NUMBER := fnd_global.conc_request_id;
l_num_login_id NUMBER := fnd_global.login_id;
l_num_user_id NUMBER := fnd_global.user_id;
l_num_resp_id NUMBER := fnd_Profile.VALUE ('resp_ID');
l_num_otl_appl_id CONSTANT NUMBER (3) := 809; -- This is the appl_id for OTL, do not change
l_chr_proj_attr1 CONSTANT VARCHAR2 (7) := 'Task_Id';
l_chr_proj_attr2 CONSTANT VARCHAR2 (10) := 'Project_Id';
l_chr_proj_attr3 CONSTANT VARCHAR2 (16) := 'Expenditure_Type';
l_chr_proj_attr4 CONSTANT VARCHAR2 (19) := 'Expenditure_Comment';
l_chr_proj_attr5 CONSTANT VARCHAR2 (23) := 'SYSTEM_LINKAGE_FUNCTION';
i_num_count NUMBER;
i_num_rows_inserted NUMBER := NULL;
l_chr_message fnd_new_messages.MESSAGE_TEXT%TYPE;
l_chr_hdr_eligible VARCHAR2 (5);
l_tbl_timecard_info hxc_self_service_time_deposit.timecard_info;
l_tbl_attributes_info hxc_self_service_time_deposit.app_attributes_info;
l_tbl_messages hxc_self_service_time_deposit.message_table;
l_new_timecard_id NUMBER;
l_new_timecard_ovn NUMBER;
l_num_time_building_block_id hxc_time_building_blocks.time_building_block_id%TYPE;
I NUMBER; --PENDING REMOVE
l_message VARCHAR2 (2000); --PENDING REMOVE
l_time_building_block_id NUMBER; --PENDING REMOVE
CURSOR process_line
IS
SELECT ROWID ROW_ID, a.*
FROM xxpowl.xxpowl_hxt_otl_intf a
WHERE 1 = 1 --a.request_id = l_num_request_id
AND a.status_flag = 'NEW' -- 'VALID'
AND a.error_desc IS NULL
AND a.service_activity_type IS NOT NULL;
BEGIN
FND_FILE.PUT_LINE (
fnd_file.LOG,
'Entered From Date:'
|| (TO_DATE ( (l_chr_from_date), 'YYYY/MM/DD HH24:MI:SS')));
FND_FILE.PUT_LINE (
fnd_file.LOG,
'Entered To Date:'
|| (TO_DATE ( (l_chr_to_date), 'YYYY/MM/DD HH24:MI:SS')));
FND_GLOBAL.APPS_INITIALIZE (user_id => l_num_user_id,
resp_id => l_num_resp_id,
resp_appl_id => l_num_otl_appl_id);
BEGIN
FND_FILE.PUT_LINE (
fnd_file.LOG,
'Inserting the data in the custom table XXPOWL_HXT_OTL_INTF');
INSERT INTO xxpowl.XXPOWL_HXT_OTL_INTF (SR_NUMBER,
PROJECT_TASK_OWNER,
PROJECT_TASK_OWNER_PERSON_ID,
PROJECT_ID,
PROJECT_TASK_ID,
ASSIGNEE,
ASSIGNEE_ID,
ASSIGNEE_PERSON_ID,
ASSIGNEE_SUPERVISOR_PERSON_ID,
TASK_NUMBER,
PARENT_TASK_NUMBER,
DEBRIEF_NUMBER,
DEBRIEF_HEADER_ID,
DEBRIEF_DATE,
TASK_ASSIGNMENT_ID,
DEBRIEF_OBJECT_VERSION_NUMBER,
DEBRIEF_PER_COMPLETE,
DEBRIEF_LINE_ID,
DEBRIEF_PROCESS,
SERVICE_ACTIVITY,
SERVICE_ACTIVITY_TYPE,
INVENTORY_ITEM_ID,
ITEM,
BUSINESS_PROCESS_ID,
DEBRIEF_TRANSACTION_TYPE_ID,
DEBRIEF_UOM_CODE,
DEBRIEF_HOURS,
CONV_DEBRIEF_IN_HOURS,
DEBRIEF_START_TIME,
DEBRIEF_END_TIME,
DEBRIEF_SERVICE_DATE,
NOTES,
OTL_ELIGIBLE_FLAG,
NOTIF_SENT_FLAG,
NOTIF_ELIGIBLE_FLAG,
ERROR_DESC,
STATUS_FLAG,
INITIAL_NOTIF_SENT_DATE,
CREATION_DATE,
CREATED_BY,
LAST_UPDATE_DATE,
LAST_UPDATED_BY,
LAST_UPDATE_LOGIN,
REQUEST_ID,
INITIAL_REQUEST_ID)
(SELECT b.incident_number sr_number,
h.resource_name project_task_owner,
h.source_id project_task_owner_person_id,
b.external_attribute_1 project_id,
b.external_attribute_2 project_task_id,
jtf_task_utl.get_owner (d.resource_type_code, d.resource_id)
assignee,
d.resource_id assignee_id,
jrre.source_id assignee_person_id,
paa.supervisor_id assignee_supervisor_person_id,
c.task_number task_number,
(SELECT task_number
FROM jtf_tasks_b z
WHERE c.parent_task_id = z.task_id)
parent_task_number,
a.debrief_number,
a.debrief_header_id debrief_header_id,
a.debrief_date debrief_date,
a.task_assignment_id task_assignment_id,
a.object_version_number debrief_object_version_number,
a.attribute1 debrief_per_complete,
e.debrief_line_id debrief_line_id,
cbp.name debrief_process,
cttv.name service_activity,
DECODE (cttv.attribute1,
'Y', 'Service-Billable',
'N', 'Service-Non Billable')
service_activity_type,
e.inventory_item_id inventory_item_id, --f.segment1 item,
(SELECT segment1
FROM mtl_system_items_b f
WHERE e.inventory_item_id = f.inventory_item_id
AND f.organization_id = b.inv_organization_id)
item, --241 --Mater Org
e.business_process_id business_process_id,
e.transaction_type_id debrief_transaction_type_id,
e.uom_code debrief_uom_code,
e.quantity debrief_hours,
(g.conversion_rate * e.quantity) conv_debrief_in_hours,
e.labor_start_date debrief_start_time,
e.labor_end_date debrief_end_time,
e.service_date debrief_service_date,
(SELECT note_tl.notes
FROM JTF_NOTES_B NOTE, JTF_NOTES_TL NOTE_TL
WHERE NOTE.JTF_NOTE_ID = NOTE_TL.JTF_NOTE_ID
AND NOTE_TL.LANGUAGE = USERENV ('LANG')
AND NOTE.SOURCE_OBJECT_ID = a.DEBRIEF_HEADER_ID
AND note.jtf_note_id =
(SELECT MAX (note1.jtf_note_id)
FROM JTF_NOTES_B note1
WHERE NOTE1.SOURCE_OBJECT_ID =
a.DEBRIEF_HEADER_ID))
notes,
'Y',
'N',
'X',
NULL,
'NEW',
NULL,
SYSDATE,
l_num_user_id,
SYSDATE,
l_num_user_id,
l_num_login_id,
l_num_request_id,
l_num_request_id
FROM csf_debrief_headers a,
cs_incidents_all_b b,
jtf_tasks_b c,
jtf_task_assignments d,
csf_debrief_lines e, --mtl_system_items_b f,
mtl_uom_conversions g,
jtf_rs_resource_extns_vl h,
cs_business_processes cbp,
cs_transaction_types_vl cttv --, cs_sr_task_debrief_notes_v notes
jtf_rs_resource_extns_vl jrre,
per_all_assignments_f paa
WHERE a.task_assignment_id = d.task_assignment_id
AND d.task_id = c.task_id
AND c.source_object_id = b.incident_id
AND c.source_object_type_code = 'SR'
AND a.debrief_header_id = e.debrief_header_id
AND e.uom_code = g.uom_code
AND g.uom_class = 'Time'
AND b.incident_owner_id = h.resource_id(+)
AND e.business_process_id = cbp.business_process_id
AND e.transaction_type_id = cttv.transaction_type_id
AND d.resource_id = jrre.resource_id
AND jrre.source_id = paa.person_id --= 181
AND TRUNC (SYSDATE) BETWEEN paa.effective_start_date
AND paa.effective_end_date
AND TRUNC (e.last_update_date) BETWEEN TO_DATE (
(l_chr_from_date),
'YYYY/MM/DD HH24:MI:SS')
AND TO_DATE (
(NVL (
l_chr_to_date,
l_chr_from_date)),
'YYYY/MM/DD HH24:MI:SS')
AND NOT EXISTS
(SELECT 1
FROM xxpowl.XXPOWL_HXT_OTL_INTF old
WHERE old.debrief_header_id =
e.debrief_header_id
AND old.task_number = c.task_number
AND TRUNC (old.DEBRIEF_SERVICE_DATE) =
TRUNC (e.SERVICE_DATE)
AND old.DEBRIEF_TRANSACTION_TYPE_ID =
e.TRANSACTION_TYPE_ID
AND old.request_id <> l_num_request_id
AND old.DEBRIEF_UOM_CODE = e.uom_code
AND old.DEBRIEF_HOURS = e.quantity));
i_num_rows_inserted := SQL%ROWCOUNT;
fnd_file.put_line (fnd_file.LOG,
'No of rows Inserted:' || i_num_rows_inserted);
COMMIT;
EXCEPTION
WHEN OTHERS
THEN
l_chr_error_msg :=
l_chr_error_msg
|| 'Error while Inserting debrief lines data into custom table.Error: '
|| SQLERRM
|| '. Exiting the interface without processing further';
fnd_file.put_line (fnd_file.LOG, l_chr_error_msg);
ROLLBACK;
RETURN;
END;
FOR process_line_rec IN process_line
LOOP
l_chr_error_msg := NULL;
l_tbl_timecard_info.delete;
l_tbl_attributes_info.delete;
l_tbl_messages.delete;
l_new_timecard_id := NULL;
l_new_timecard_ovn := NULL;
l_num_time_building_block_id := NULL;
l_chr_message := NULL;
i_num_count := 0;
l_num_tbb_id := NULL;
BEGIN
hxc_timestore_deposit.create_time_entry (
p_measure => process_line_rec.conv_debrief_in_hours,
p_day => TO_DATE ( (process_line_rec.debrief_service_date),
'DD/MM/RRRR'),
p_resource_id => process_line_rec.assignee_person_id,
p_comment_text => process_line_rec.notes
|| '....Remove this notes in the code logic....Request Id:'
|| l_num_request_id, --pending lokesh
p_app_blocks => l_tbl_timecard_info,
p_app_attributes => l_tbl_attributes_info,
p_time_building_block_id => l_num_time_building_block_id);
fnd_file.put_line (
fnd_file.LOG,
'Step#1 completed, TIME_BUILDING_BLOCK_ID:'
|| l_num_time_building_block_id);
EXCEPTION
WHEN OTHERS
THEN
l_chr_error_msg :=
l_chr_error_msg
|| '.Error in INSERT API CALL at Step#1 for the debrief line id: '
|| process_line_rec.debrief_line_id
|| '.Error:'
|| SQLERRM;
END;
BEGIN
-- Classify Time
-- Attribute1
hxc_timestore_deposit.create_attribute (
p_building_block_id => l_num_time_building_block_id,
p_attribute_name => 'Task_Id',
p_attribute_value => process_line_rec.project_task_id,
p_app_attributes => l_tbl_attributes_info);
EXCEPTION
WHEN OTHERS
THEN
l_chr_error_msg :=
l_chr_error_msg
|| '.Error in INSERT API CALL at Step#2 for the debrief line id: '
|| process_line_rec.debrief_line_id
|| '.Error:'
|| SQLERRM;
END;
BEGIN
-- Attribute2
hxc_timestore_deposit.create_attribute (
p_building_block_id => l_num_time_building_block_id,
p_attribute_name => 'Project_Id',
p_attribute_value => process_line_rec.project_id,
p_app_attributes => l_tbl_attributes_info);
EXCEPTION
WHEN OTHERS
THEN
l_chr_error_msg :=
l_chr_error_msg
|| '.Error in INSERT API CALL at Step#3 for the debrief line id: '
|| process_line_rec.debrief_line_id
|| '.Error:'
|| SQLERRM;
END;
BEGIN
-- Attribute3
hxc_timestore_deposit.create_attribute (
p_building_block_id => l_num_time_building_block_id,
p_attribute_name => 'Expenditure_Type',
p_attribute_value => 'Service-Billable',
-- p_attribute_value=> 'Service-Non Billable',
p_app_attributes => l_tbl_attributes_info);
EXCEPTION
WHEN OTHERS
THEN
l_chr_error_msg :=
l_chr_error_msg
|| '.Error in INSERT API CALL at Step#4 for the debrief line id: '
|| process_line_rec.debrief_line_id
|| '.Error:'
|| SQLERRM;
END;
BEGIN
-- Attribute4
hxc_timestore_deposit.create_attribute (
p_building_block_id => l_num_time_building_block_id,
p_attribute_name => 'Expenditure_Comment',
p_attribute_value => 'Expenditure Comment created by API',
p_app_attributes => l_tbl_attributes_info);
EXCEPTION
WHEN OTHERS
THEN
l_chr_error_msg :=
l_chr_error_msg
|| '.Error in INSERT API CALL at Step#5 for the debrief line id: '
|| process_line_rec.debrief_line_id
|| '.Error:'
|| SQLERRM;
END;
BEGIN
-- Attribute5
hxc_timestore_deposit.create_attribute (
p_building_block_id => l_num_time_building_block_id,
p_attribute_name => 'SYSTEM_LINKAGE_FUNCTION',
p_attribute_value => 'ST',
p_app_attributes => l_tbl_attributes_info);
EXCEPTION
WHEN OTHERS
THEN
l_chr_error_msg :=
l_chr_error_msg
|| '.Error in INSERT API CALL at Step#6 for the debrief line id: '
|| process_line_rec.debrief_line_id
|| '.Error:'
|| SQLERRM;
END;
BEGIN
hxc_timestore_deposit.execute_deposit_process (
p_validate => FALSE,
p_app_blocks => l_tbl_timecard_info,
p_app_attributes => l_tbl_attributes_info,
p_messages => l_tbl_messages,
p_mode => 'SAVE', -- p_mode-> 'SUBMIT', 'SAVE', 'MIGRATION', 'FORCE_SAVE' or 'FORCE_SUBMIT'
p_deposit_process => 'OTL Deposit Process',
p_timecard_id => l_new_timecard_id,
p_timecard_ovn => l_new_timecard_ovn);
COMMIT;
hxc_timestore_deposit.log_messages (p_messages => l_tbl_messages);
IF (l_tbl_messages.COUNT <> 0)
THEN
l_chr_error_msg :=
l_chr_error_msg
|| '.Error in INSERT API CALL at Step#7 for the debrief line id: '
|| process_line_rec.debrief_line_id
|| '.Error:'
|| SQLERRM
|| '.API Error messages are:';
i_num_count := l_tbl_messages.FIRST;
LOOP
EXIT WHEN (NOT l_tbl_messages.EXISTS (i_num_count));
l_chr_message :=
fnd_message.get_string (
appin => l_tbl_messages (i_num_count).application_short_name,
namein => l_tbl_messages (i_num_count).message_name);
fnd_file.put_line (
fnd_file.LOG,
(l_tbl_messages (i_num_count).message_name));
l_chr_error_msg := l_chr_error_msg || l_chr_message || '.';
i_num_count := l_tbl_messages.NEXT (i_num_count);
END LOOP;
END IF;
EXCEPTION
WHEN OTHERS
THEN
fnd_file.put_line (
fnd_file.LOG,
'**** Error.....Inside Exception Block in execute_deposit_process');
BEGIN
hxc_timestore_deposit.log_messages (
p_messages => l_tbl_messages);
EXCEPTION
WHEN OTHERS
THEN
fnd_file.put_line (
fnd_file.LOG,
'*****Error....Inside Exception Block in hxc_timestore_deposit.log_messages.Error:'
|| SQLERRM);
END;
l_chr_error_msg :=
l_chr_error_msg
|| 'Error in INSERT API CALL at Step#7 for the debrief line id: '
|| process_line_rec.debrief_line_id
|| '.Error:'
|| SQLERRM
|| '.API Error messages are:';
IF (l_tbl_messages.COUNT <> 0)
THEN
i_num_count := l_tbl_messages.FIRST;
LOOP
EXIT WHEN (NOT l_tbl_messages.EXISTS (i_num_count));
l_chr_message :=
fnd_message.get_string (
appin => l_tbl_messages (i_num_count).application_short_name,
namein => l_tbl_messages (i_num_count).message_name);
l_chr_error_msg := l_chr_error_msg || l_chr_message || '.';
i_num_count := l_tbl_messages.NEXT (i_num_count);
END LOOP;
END IF;
END;
COMMIT;
IF l_chr_error_msg IS NOT NULL OR l_new_timecard_id IS NULL
THEN
fnd_file.put_line (fnd_file.LOG,
'***** Error *****' || l_chr_error_msg);
o_ret_code := 1;
END IF;
BEGIN
fnd_file.put_line (
fnd_file.LOG,
'Updating the status of the record in custom table for debrief_line_id:'
|| process_line_rec.debrief_line_id);
UPDATE xxpowl.XXPOWL_HXT_OTL_INTF
SET status_flag =
DECODE (
l_chr_error_msg,
NULL, DECODE (
l_new_timecard_id,
NULL, 'FAILED',
DECODE (SIGN (l_new_timecard_id),
'1', 'SUCCESS',
'FAILED')),
'FAILED'),
time_building_block_id = l_new_timecard_id,
last_update_date = SYSDATE,
request_id = l_num_request_id,
last_updated_by = l_num_user_id,
error_desc = l_chr_error_msg
WHERE ROWID = process_line_rec.row_id;
COMMIT;
EXCEPTION
WHEN OTHERS
THEN
l_chr_error_msg :=
l_chr_error_msg
|| '.Error while updating the status to SUCCESS.Error: '
|| SQLERRM;
END;
END LOOP; --process_line
EXCEPTION
WHEN OTHERS
THEN
FND_FILE.PUT_LINE (
fnd_file.LOG,
'Unknown/Unhandled Exception Error......' || SQLERRM);
o_ret_code := 2;
END main_process;
SHOW ERR;
Error Messages:
1. Error in INSERT API CALL at Step#7 for the debrief line id: 41011.Error:ORA-00001: unique constraint (HXC.HXC_ROLLBACK_TIMECARDS_PK) violated.
2. Error in INSERT API CALL at Step#7 for the debrief line id: 41011.Error:ORA-00001: unique constraint (HXC.HXC_LATEST_DETAILS_FK) violated
Thanks a lot for the help!
Hello
We get this error message when the program is run from an application other than "Time and Labor Engine". So try to run this concurrent program (if you registered it as a concurrent program) from the Time and Labor Engine application.
--Lokesh -
Error while creating user dimension in awm
Hi,
I am new to AWM. I am trying to create a dimension in my workspace, but I receive an error after I click 'Create'. When I check the detailed error, I think it is something to do with OLAP_TABLE. Please find the error message below:
===================================================
Your metadata changes have been saved, with the following errors
Invalid Metadata Objects:
Invalid Object "TAN.TICKERSYMBOL": "CREATE OR REPLACE VIEW "TAN"."TICKERSYMBOL_VIEW" AS SELECT "DIM_KEY", "LEVEL_NAME", "MEMBER_TYPE", "DIM_ORDER", "LONG_DESCRIPTION", "SHORT_DESCRIPTION" FROM TABLE(CUBE_TABLE('"TAN"."TICKERSYMBOL"') )
ORA-00902: invalid datatype "
====================================================
Please advise.
Which version of the database (not AWM!) are you using? (e.g. 11.2.0.3)
My guess is that there is a problem with your OLAP installation and that you will need to open a service request to get it resolved. But the following may help us to track down the problem.
alter session set events='902 trace name errorstack';
alter session set tracefile_identifier=OLAP;
SELECT
"DIM_KEY",
"LEVEL_NAME",
"MEMBER_TYPE",
"DIM_ORDER",
"LONG_DESCRIPTION",
"SHORT_DESCRIPTION"
FROM TABLE(CUBE_TABLE('"TAN"."TICKERSYMBOL"') )
Once you have run this (in a single session), look for a trace file containing the word OLAP in its file name. It should contain a stack trace that will tell us where the ORA-00902 is being raised. The trace file will be long, but the interesting part begins like this:
----- Call Stack Trace -----
calling call entry argument values in hex
location type point (? means dubious value)
skdstdst()+29 call kgdsdst() 7FFFADE14590 ? 000000000 ?
7FFFADDF6E50 ? 7FFFADDF6F68 ?
7FFFADE14C48 ? 7FFFADE14540 ?
ksedst()+112 call skdstdst() 7FFFADE14590 ? 000000000 ?
If you can find this, please copy it in reply. -
Creating Time Dimension from date columns in fact tables.
I remember watching a demo of a BI tool a couple of years ago, which I swear was OBIEE, where the presenter stated it was possible to create a time dimension in the Administration Tool based on a date column in another table.
Can you guys tell me if there's such functionality in OBIEE?
If so, how could I achieve that?!
Thanks in advance!
Marcos
Hi,
Are you trying to use the fact table as a dimension table? That is, the fact table has some dimension columns?
You can do this by treating the fact table as a dimension table: create a dimension hierarchy on that table, e.g.
Year level: Extract(year from fact_date_column)
Month-and-year level: CAST(Extract(month from fact_date_column) AS CHAR(5)) || CAST(Extract(year from fact_date_column) AS CHAR(5))
But be careful while doing this: make sure that all joins and content levels are correct.
As far as I know this is not a good approach; experts can add some words, let's see! :-)
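The two level expressions above can be mirrored outside the RPD to preview the member labels they would produce; a small Python sketch (a plain date stands in for fact_date_column, which is an assumed column name):

```python
from datetime import date

def year_level(d: date) -> str:
    # Mirrors Extract(year from fact_date_column)
    return str(d.year)

def month_year_level(d: date) -> str:
    # Mirrors casting month and year to CHAR and concatenating them
    return f"{d.month}{d.year}"

print(year_level(date(2011, 3, 15)))        # -> 2011
print(month_year_level(date(2011, 3, 15)))  # -> 32011
```

Note that the concatenated form is not zero-padded, so March 2011 becomes "32011"; padding the month to two digits would give sortable members.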
thanks,
saichand.v -
Creating Time dimension in BW data model. - like seen in logical data model
Hello all,
I have been struggling with this and am looking for help from anyone on this forum.
We are trying to create a logical data model of our BW system. We are going live next month with the Student module for universities. We have multiple InfoCubes and DSOs, and since there is so much crossover between them, most of the reporting is done on InfoSets.
One thing we were considering: is it possible to create something like a common time dimension table for every InfoProvider? Basically, when we provide reports to the end user, can we give them a drop-down menu that offers a time frame for reporting rather than free selection?
For example: can we create something which appears in the drop-down as current month's data, last month's data, three months ago, four months ago, five months ago, one year ago, two years ago? Can we make data slices like these in our cube and deliver them to the end user?
We have a few date InfoObjects in our cube, like receipt date, decision date, cancellation date and the like.
Please let me know if any one has done any similar thing, it will be very helpful.
Thank you so much in advance.
If you add your common time dimension to your data model, first identify for each InfoProvider the time against which 'current month' and the other frames should be applied, and map it to your dimension.
Just a question: are you not using the time dimension in your cubes? Ideally this should be your time dimension linking them all.
When you use a time dimension with values like 'current month' and 'current year', you will have to address their historisation as well (because the current month now will not be so current after 2 months).
So in the data load procedure these values need to change every day (meaning drop and reload), with routines to populate them based on the reporting date.
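The relative buckets discussed above ('current month', 'last month', ...) are a pure function of the reporting date, which is exactly why they must be recomputed on each load; a hedged Python sketch of such a classification routine (bucket labels are illustrative):

```python
from datetime import date

def relative_month_bucket(d: date, today: date) -> str:
    """Classify a date into drop-down buckets like those described above,
    relative to the reporting date (so the values can be rebuilt daily)."""
    diff = (today.year - d.year) * 12 + (today.month - d.month)
    if diff == 0:
        return "current month"
    if diff == 1:
        return "last month"
    if 2 <= diff <= 11:
        return f"{diff} months ago"
    years = diff // 12
    return "one year ago" if years == 1 else f"{years} years ago"

print(relative_month_bucket(date(2009, 3, 10), today=date(2009, 5, 6)))  # -> 2 months ago
```

Because the output shifts as `today` advances, a bucket stored in a cube is stale the next month, which is the historisation problem the reply points out.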
Edited by: hemant vyas on May 6, 2009 1:56 PM -
Error in Creating Time series objects in APO 5.0
Hi, we are implementing APO 5.0. I am trying to create time series objects for Planning Area 9ASNP05, but I get the following runtime error.
We are on Support Package 10.
Runtime Errors PERFORM_TOO_MANY_PARAMETERS
Exception CX_SY_DYN_CALL_PARAM_NOT_FOUND
The exception, which is assigned to class 'CX_SY_DYN_CALL_PARAM_NOT_FOUND', was
not caught in procedure "RSS_TEMPLATE_INSTANTIATE" "(FUNCTION)", nor was it propagated by a RAISING clause.
Since the caller of the procedure could not have anticipated that the
exception would occur, the current program is terminated.
The reason for the exception is: A PERFORM was used to call the routine "INSTANTIATE" of the program
"GP_MET_RSSG_HEADER_COMMENT".
This routine contains 15 formal parameters, but the current call
contains 16 actual parameters.
Any ideas?
Hi
I am getting exactly the same error, on SCM 5.0 SR2, running on Windows 2003 64-bit and SQL Server 2000.
Conditions for error:
Using transaction:
/SAPAPO/MSDP_ADMIN
Select: Planning Object Structures
Short Dump occurs if you either:
1. Attempt to deactivate an active Planning Object Structure
2. Attempt to create a Characteristic Combination
Gives a runtime error shortdump PERFORM_TOO_MANY_PARAMETERS
Error analysis:
An exception occurred that is explained in detail below.
The exception, which is assigned to class 'CX_SY_DYN_CALL_PARAM_NOT_FOUND', was
not caught in
procedure "RSS_TEMPLATE_INSTANTIATE" "(FUNCTION)", nor was it propagated by a
RAISING clause.
Since the caller of the procedure could not have anticipated that the
exception would occur, the current program is terminated.
The reason for the exception is:
A PERFORM was used to call the routine "INSTANTIATE" of the program
"GP_MET_RSSG_HEADER_COMMENT".
This routine contains 15 formal parameters, but the current call
contains 16 actual parameters.
Has anyone seen this before?
Did you find a solution Bhavesh ?
Thanks. -
Hi,
I want to create a time dimension using ODI.
How do I create the table below?
CREATE TABLE Dim_Time(
Timekey INT IDENTITY(1,1),
Day CHAR(2),
Month VARCHAR(9),
Quarter CHAR(1),
Year CHAR(4),
Week CHAR(2),
FullDate DATETIME
);
Thank you.
You can use an Excel file as a source that contains one column with one date per row;
then you map the other columns using the date functions in ODI.
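The suggestion above (derive every Dim_Time column from a single date column) can be sketched as a row generator; a hedged Python illustration — the week convention is an assumption, since the original post does not specify one:

```python
from datetime import date

def dim_time_row(key: int, d: date) -> dict:
    """Derive the Dim_Time columns listed above from one calendar date.
    Week here is the ISO week number; the original post does not say
    which week convention it intends."""
    return {
        "Timekey": key,
        "Day": f"{d.day:02d}",
        "Month": d.strftime("%B"),
        "Quarter": str((d.month - 1) // 3 + 1),
        "Year": str(d.year),
        "Week": f"{d.isocalendar()[1]:02d}",
        "FullDate": d,
    }

row = dim_time_row(1, date(2011, 1, 2))
print(row["Month"], row["Quarter"], row["Week"])  # -> January 1 52
```

Each expression here has a direct counterpart among ODI's date functions, which is all the mapping suggested in the reply amounts to.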
Hi
I have a fact table which stores its time data in two columns: month_id and year_id.
My fact table looks like this
Sales,Month_ID,Year_ID
How can I model a level-based time dimension for this sort of table?
You can map the time dimension to a view on top of your fact table, as long as the fact is sufficiently dense in time (e.g. there are no missing months). The view would add end-date and time-span info as appropriate. Alternatively, you can define a view that generates dates. The following thread shows the latter approach:
populating a date dimension
You can adapt the view in it to derive time info from your fact if you like, but the standalone view is probably easier. -
Ora-00942 when creating time dimension
hi All,
I am new to data warehousing. I have created a date dimension "date_dim" in a target location; it created a corresponding mapping "date_dim_map" in the mappings section.
But when I try to deploy the dimension I get error ORA-00942: table or view does not exist.
I get the same error when I try to deploy the corresponding map created by this dimension.
Any ideas what the problem could be?
Regards
Oops, got it: I had to deploy the corresponding table first.
-
Unable to create time series ago function
Hello guys
I am running into a problem with the time series AGO function.
I have a simple schema: fact <----> time dimension.
I have created a time dimension hierarchy where month is the lowest level; I have also marked it as a "time dimension" with month as the "chronological key".
When applying the AGO function, I supply my measure from the fact table, the level "month" from the time dimension table, and 1 (meaning 1 month ago). When I compile it, I get the following error:
[nQSError: 27009] Unresolved identifier: "FTTNBILLSUM"."FTTNOBIEEBILLTIME"."Month".
Could anybody help me out as what the system doesn't like about this column?
Thanks
Thanks Maden,
I made the correction and now it gives a different error: [nQSError: 22038] Function AGO requires at least one measure attribute in its first argument.
My code is:
Ago("FTTNBILLSUM"."F_Bill_Sum"."ATM_UNITS_Sum" , "FTTNBILLSUM"."D_Time"."Month" , 1)
I have a time hierarchy named D_Time, with Month at the lowest level with chronological keys; there is no level above it except the grand total level, so it is only a 2-level hierarchy.
I have a fact table F_Bill_Sum, which is joined to D_Time. I made D_Time both the name of the table and of the hierarchy.
Please let me know where this error is coming from.
Thanks -
After reading the following article, I decided to use the SSAS dimension wizard to generate our Date dimension, which creates a DATETIME PK.
http://www.made2mentor.com/2011/05/date-vs-integer-datatypes-as-primary-key-for-date-dimensions/
I have also created a separate Time dimension, as a granularity of an hour is required.
The Time dimension is very simple and contains only a surrogate key (INTEGER) and the actual time in hours (VARCHAR).
DimTime(TimeKey, TimeInHours)
Our fact table will now have a link to both the Date and Time dimensions using their PKs.
Our analysis is required by hour, day, week, month and year.
My query is: will this current structure cause any problems when creating MDX scripts to analyse our data (i.e. drilldown and rollup queries) by Hour - Day - Week - Month - Year?
Hi Darren,
According to your description, there is day and hour granularity in your fact table, and you want a hierarchy like Hour - Day - Week - Month - Year, right?
In your scenario, you created a time table that contains only a surrogate key (INTEGER) and the actual time in hours (VARCHAR). We cannot create an Hour - Day - Week - Month - Year hierarchy without any relationship between the date table and the time table. As I understand it, you need to create a foreign key in the time table and join those tables in the data source view; then you can create such a hierarchy. Here are some links about creating a time dimension, please see:
http://www.ssas-info.com/analysis-services-articles/59-time-dimension/1224-date-and-time-dimensions-template
http://www.codeproject.com/Articles/25852/Creating-Time-Dimension-in-Microsoft-Analysis-Serv
Regards,
Charlie Liao
TechNet Community Support
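The split described in this thread (one fact row carrying both a date key and an hour-grain time key) amounts to decomposing the fact timestamp; a minimal Python sketch, where the key formats are assumptions matching the DATETIME PK and integer surrogate mentioned above:

```python
from datetime import datetime

def fact_keys(ts: datetime):
    """Split a fact timestamp into the two surrogate keys discussed
    above: a DATETIME date key (truncated to midnight) and an integer
    hour key 0-23 for the hour-grain DimTime."""
    date_key = ts.replace(hour=0, minute=0, second=0, microsecond=0)
    time_key = ts.hour
    return date_key, time_key

dk, tk = fact_keys(datetime(2011, 1, 2, 14, 30))
print(dk, tk)  # -> 2011-01-02 00:00:00 14
```

Loading both keys on every fact row is what lets the Hour level hang below Day once the date and time tables are related in the data source view.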