Viewing my loaded data
Can anyone tell me why I am getting the error below after executing my control file, and also why I can't view the data that was loaded?
SQL*Loader: Release 10.2.0.1.0 - Production on Thu Jul 12 16:28:57 2007
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: workattribute2.ctl
Data File: workattribute.dat
Bad File: workattribute.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table WORKATTRIBUTE, loaded from every logical record.
Insert option in effect for this table: APPEND
Column Name                  Position   Len  Term Encl Datatype
RESUMEID                     FIRST      *              CHARACTER
    Terminator string : X'09'
WORKID                       NEXT       *              CHARACTER
    Terminator string : X'09'
ID                           NEXT       *              CHARACTER
    Terminator string : X'09'
TYPE                         NEXT       *              CHARACTER
    Terminator string : X'09'
Record 1: Rejected - Error on table WORKATTRIBUTE, column RESUMEID.
Field in data file exceeds maximum length
Record 2: Rejected - Error on table WORKATTRIBUTE, column RESUMEID.
Field in data file exceeds maximum length
[... the same rejection ("Field in data file exceeds maximum length" on column RESUMEID) repeats for records 3 through 51 ...]
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table WORKATTRIBUTE:
0 Rows successfully loaded.
51 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 66048 bytes(64 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 51
Total logical records rejected: 51
Total logical records discarded: 0
Run began on Thu Jul 12 16:28:57 2007
Run ended on Thu Jul 12 16:29:05 2007
Elapsed time was: 00:00:08.09
CPU time was: 00:00:00.12
Thank you in advance.
This is my control file:
LOAD DATA
INFILE Job.dat
APPEND INTO TABLE job
FIELDS TERMINATED BY x'09'
(PositionID, jobID, title, code, family)
=======================================================
DATA FILE
PositionID jobID title code family
50003314 1 NULL 001810 NULL
50004849 1 NULL 001255 NULL
50004854 1 NULL 001322 NULL
50004966 1 NULL 001382 NULL
===========================================================================
Error From Log file
SQL*Loader: Release 10.2.0.1.0 - Production on Fri Jul 13 10:17:33 2007
Copyright (c) 1982, 2005, Oracle. All rights reserved.
Control File: Position1.ctl
Data File: Job.dat
Bad File: Job.bad
Discard File: none specified
(Allow all discards)
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table JOB, loaded from every logical record.
Insert option in effect for this table: APPEND
Column Name Position Len Term Encl Datatype
POSITIONID FIRST * WHT CHARACTER
JOBID NEXT * WHT CHARACTER
TITLE NEXT * WHT CHARACTER
CODE NEXT * WHT CHARACTER
FAMILY NEXT * WHT CHARACTER
Record 1: Rejected - Error on table JOB, column POSITIONID.
Field in data file exceeds maximum length
Record 2: Rejected - Error on table JOB, column POSITIONID.
Field in data file exceeds maximum length
Record 3: Rejected - Error on table JOB, column POSITIONID.
Field in data file exceeds maximum length
Record 4: Rejected - Error on table JOB, column POSITIONID.
Field in data file exceeds maximum length
Record 5: Rejected - Error on table JOB, column POSITIONID.
Field in data file exceeds maximum length
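In both logs every record is rejected with "Field in data file exceeds maximum length". A likely cause (this is a diagnosis, not something stated in the post): for delimited fields SQL*Loader limits CHAR to 255 bytes by default, and if the terminator in the control file does not match what is actually in the data file (generic whitespace vs. a single tab, or a Windows CR/LF file loaded on Unix), the whole record can be read into the first field and blow past that limit. A sketch of a control file that addresses both possibilities; the CHAR sizes and the NULLIF clauses are assumptions for illustration, not taken from the original post:

```sql
LOAD DATA
INFILE 'Job.dat'
APPEND INTO TABLE job
FIELDS TERMINATED BY X'09'   -- an actual tab character, not generic whitespace (WHT)
TRAILING NULLCOLS
(
  PositionID CHAR(4000),                        -- explicit size overrides the 255-byte default
  jobID      CHAR(4000),
  title      CHAR(4000) NULLIF title = 'NULL',  -- load the literal string NULL as a database NULL
  code       CHAR(4000),
  family     CHAR(4000) NULLIF family = 'NULL'
)
```

If the data file was produced on Windows, it may also help to state the record terminator explicitly (e.g. INFILE 'Job.dat' "STR X'0D0A'") or run the file through dos2unix first. Once rows load without rejection, they can be viewed with an ordinary SELECT * FROM job;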
Similar Messages
-
Hi,
I have loaded the data into an ODS from an External Flat file and executed the job. Can anyone guide me how to view the loaded data in the ODS ?
Thanks
Hi Madhu,
You can simply go to transaction LISTCUBE, give the ODS name, and choose Execute.
Regards
Hemant -
Error in viewing successfully loaded data into cube..
hi gurus,
I have a problem viewing my data in the cube. The data was successfully loaded from the LIS environment. The errors I received were 'Your user master record is not sufficiently maintained for object Commodity and Company Code Auth object'..'System error: RSDRC / FORM AUTHORITY_CHECK USER NOT AUTHORIZED ZFIN___ ZFIN___'... 'System error: RSDRC / FUNC RSDRC_BASIC_CUBE_DATA_GET ERROR IN RSDRC_BASIC_QUERY_DATA_GET ZFIN____ 64' ... 'System error: RSDRC / FORM DATA_GET ERROR IN RSDRC_BASIC_CUBE_DATA_GET ZFIN____ 64 ...
Has anybody encountered these errors? Can you give me a step-by-step procedure for dealing with them?
This thread will solve your problem; look for Bhanu's post in it.
Re: Cube Security -
Can't load data through smart view (ad hoc analysis)
Hi,
There is an EPM application where I want to give planners the ability to load data through Smart View (ad hoc analysis). In Shared Services there are four options in
EssbaseCluster-1: Administrator, Create/Delete Application, Server Access, Provisioning Manager. Only Administrator can submit data in Smart View (ad-hoc analysis). But I don't want to grant Essbase Administrator to planners; I'm just interested in giving them the ability to load data through ad-hoc analysis. Please suggest!
I take it that you refreshed the Planning security; if not, refresh the security of those users. Managing Security Filters
Check in EAS whether those filters are created with "Write" permissions.
Regards
Celvin
http://www.orahyplabs.com -
Not able to load data in tables with correct way
Hi
I made a trigger to load data through a view into tables.
CODE FOR TRIGGER>>>>
CREATE OR REPLACE TRIGGER "WELL_GENERATOR_TRIGGER_1"
INSTEAD OF INSERT ON VIEW_WELL_GENERATOR_FORM
FOR EACH ROW
DECLARE
rowcnt number;
BEGIN
INSERT INTO facility (FAC_PK) VALUES (:NEW.FAC_PK);
SELECT COUNT(*) INTO rowcnt FROM WELL WHERE WEL_PK = :NEW.WEL_PK;
IF rowcnt = 0 THEN
INSERT INTO WELL (WEL_PK,SITE,FAC_FK) VALUES(:NEW.WEL_PK,:NEW.SITE,:NEW.FAC_PK);
ELSE
UPDATE WELL SET WELL.SITE = :NEW.SITE,
WELL.FAC_FK = :NEW.FAC_PK
WHERE WELL.WEL_PK = :NEW.WEL_PK;
END IF;
SELECT COUNT(*) INTO rowcnt FROM WELL_STATUS WHERE WELL_STATUS.STA_PK = :NEW.STA_PK;
IF rowcnt = 0 THEN
INSERT INTO WELL_STATUS (WELL_TYPE, WELL_TYPE_DATE, OPER_STATUS, CLASS,WEL_FK)
VALUES(:NEW.WELL_TYPE, :NEW.WELL_TYPE_DATE, :NEW.OPER_STATUS, :NEW.CLASS,:NEW.WEL_PK);
ELSE
UPDATE WELL_STATUS SET WELL_STATUS.WELL_TYPE = :NEW.WELL_TYPE,
WELL_STATUS.WELL_TYPE_DATE = :NEW.WELL_TYPE_DATE,
WELL_STATUS.OPER_STATUS = :NEW.OPER_STATUS,
WELL_STATUS.CLASS = :NEW.CLASS,
WELL_STATUS.WEL_FK = :NEW.WEL_PK
WHERE STA_PK = :NEW.STA_PK;
END IF;
SELECT COUNT(*) INTO rowcnt FROM PERMIT WHERE PERMIT.PER_PK = :NEW.PER_PK;
IF rowcnt = 0 THEN
INSERT INTO PERMIT (AUT_STATUS,WEL_FK) VALUES (:NEW.AUT_STATUS,:NEW.WEL_PK);
ELSE
UPDATE PERMIT SET PERMIT.AUT_STATUS = :NEW.AUT_STATUS,
PERMIT.WEL_FK = :NEW.WEL_PK
WHERE PERMIT.PER_PK = :NEW.PER_PK;
END IF;
END;
But still I am not getting the result I want. For example, in the WELL_STATUS table I was not able to insert the value for WEL_FK from WEL_PK, and in the PERMIT table I was likewise not able to insert the value WEL_PK into the column WEL_FK.
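If STA_PK and PER_PK are ending up with the same value as WEL_PK, one likely cause is that those keys are never set explicitly, so they pick up the wrong value from a shared default or another trigger. A minimal sketch of how each table could take its key from its own sequence inside the trigger's insert branches, assuming sequences named STATUS_SEQ and PERMIT_SEQ exist (the sequence names are illustrative, not from the original post):

```sql
-- Hypothetical sequences, created once:
--   CREATE SEQUENCE status_seq;
--   CREATE SEQUENCE permit_seq;

IF rowcnt = 0 THEN
  -- let WELL_STATUS generate its own primary key instead of reusing :NEW.WEL_PK
  INSERT INTO well_status (sta_pk, well_type, well_type_date, oper_status, class, wel_fk)
  VALUES (status_seq.NEXTVAL, :NEW.well_type, :NEW.well_type_date,
          :NEW.oper_status, :NEW.class, :NEW.wel_pk);
END IF;

IF rowcnt = 0 THEN
  -- same idea for PERMIT: its own key from its own sequence
  INSERT INTO permit (per_pk, aut_status, wel_fk)
  VALUES (permit_seq.NEXTVAL, :NEW.aut_status, :NEW.wel_pk);
END IF;
```

Listing STA_PK and PER_PK explicitly in the INSERT column lists, as above, also makes it visible exactly where each key value comes from, which helps rule out another trigger or default silently filling them.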
Now, the WEL_PK value is autogenerated from a sequence, as are STA_PK and PER_PK. But instead of taking its value from its own sequence, STA_PK takes the value of WEL_PK (I don't know why, but in the data it shows me the same value for WEL_PK and STA_PK). The same thing happens with PER_PK.
So where am I wrong? I really need your help.
workspace:PRACTISE
UN:[email protected]
PW:testing
Application: 39289 - TESTTING
Thanks
Edited by: vijaya45 on Jul 13, 2009 9:44 PM
Edited by: vijaya45 on Jul 14, 2009 12:48 AM
Hello vijaya45,
Not sure if this will help, but I noticed you currently have a WELL_GENERATOR_TRIGGER_1 and a WELL_GENERATOR_TRIGGER_2, and "1" is disabled but "2" is enabled. Which one is supposed to be running at this point?
John -
BI 7.0 data load issue: InfoPackage can only load data to PSA?
BI 7.0 backend extraction gurus,
We created a generic datasource on R3 and replicated it to our BI system, created an InfoSource, the Transformation from the datasource to the InfoSource, an ODS, the transformation from the InfoSource to the ODS.
After the transformation between the InfoSource and the ODS is created on this BI system, a new folder called "Data Transfer Process" also appears under this ODS in the InfoProvider view. In the Data Transfer Process, on the Extraction tab, we pick 'Full' in the Extraction Mode field; on the Execute tab there is an 'Execute' button. Clicking this button (note: so far we have not created an InfoPackage yet) appears to run the data load, but we find there is no data available even though all the statuses show green (we do have a couple of records in the R3 table).
Then we tried to create an InfoPackage. On the Processing tab, the 'Only PSA' radio button is checked and all the others, such as 'PSA and then into Data Targets (Package by Package)', are dimmed! On the Data Target tab, the ODS can't be selected as a target! There are also some new columns in this tab: 'Maintain Old Update Rule' is marked with a red 'X', and under another column, 'DTP(S) are active and load to this target', there is an inactive icon, which is odd since we have already activated the Data Transfer Process! Anyway, we started the data load from the InfoPackage, and the monitor shows the records being brought in; but since 'Only PSA' is checked in the Processing tab with everything else dimmed, no data goes to the ODS! Why, in BI 7.0, can 'Only PSA' be checked with all the other options dimmed?
Many new features in BI 7.0! Anyone's ideas/experience on how to load data in BI 7.0 are greatly appreciated!
You don't have to select anything.
Once the data is loaded to the PSA, in the DTP you have the option of FULL or DELTA: full loads all the data from the PSA, and delta loads only the last load of the PSA.
Go through these links for lucid explanations:
Infopackage -
http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03808225cf5167e10000000a1553f6/content.htm
DTP
http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f98e07cc483255e10000000a1553f7/frameset.htm
Creating DTP
http://help.sap.com/saphelp_nw2004s/helpdata/en/42/fa50e40f501a77e10000000a422035/content.htm
Pre-requisite:
You have used transformations to define the data flow between the source and target object.
Creating transformations-
http://help.sap.com/saphelp_nw2004s/helpdata/en/f8/7913426e48db2ce10000000a1550b0/content.htm
Hope it Helps
Chetan
@CP.. -
Errors when loading data to ODS
Hi,
I am getting the following dump when loading data to the ODS.
What might be the problem?
Runtime Error MESSAGE_TYPE_X
Date and Time 29.09.2006 14:26:52
ShrtText
The current application triggered a termination with a short dump.
What happened?
The current application program detected a situation which really
should not occur. Therefore, a termination with a short dump was
triggered on purpose by the key word MESSAGE (type X).
What can you do?
Print out the error message (using the "Print" function)
and make a note of the actions and input that caused the
error.
To resolve the problem, contact your SAP system administrator.
You can use transaction ST22 (ABAP Dump Analysis) to view and administer
termination messages, especially those beyond their normal deletion
date.
This is especially useful if you want to keep a particular message.
Error analysis
Short text of error message:
Test message: SDOK_GET_PHIO_ACCESS 001
Technical information about the message:
Message class....... "1R"
Number.............. 000
Variable 1.......... "SDOK_GET_PHIO_ACCESS"
Variable 2.......... 001
Variable 3.......... " "
Variable 4.......... " "
Variable 3.......... " "
Variable 4.......... " "
How to correct the error
Probably the only way to eliminate the error is to correct the program.
You may be able to find an interim solution to the problem
in the SAP note system. If you have access to the note system yourself,
use the following search criteria:
"MESSAGE_TYPE_X" C
"SAPLSDCL" or "LSDCLF00"
"INTERNAL_ERROR"
If you cannot solve the problem yourself and you wish to send
an error message to SAP, include the following documents:
1. A printout of the problem description (short dump)
To obtain this, select in the current display "System->List->
Save->Local File (unconverted)".
2. A suitable printout of the system log
To obtain this, call the system log through transaction SM21.
Limit the time interval to 10 minutes before and 5 minutes
after the short dump. In the display, then select the function
"System->List->Save->Local File (unconverted)".
3. If the programs are your own programs or modified SAP programs,
supply the source code.
To do this, select the Editor function "Further Utilities->
Upload/Download->Download".
4. Details regarding the conditions under which the error occurred
or which actions and input led to the error.
System environment
SAP Release.............. "640"
Application server....... "bomw093a"
Network address.......... "132.186.125.66"
Operating system......... "Windows NT"
Release.................. "5.2"
Hardware type............ "4x Intel 801586"
Character length......... 8 Bits
Pointer length........... 32 Bits
Work process number...... 16
Short dump setting....... "full"
Database server.......... "BOMW093A"
Database type............ "ORACLE"
Database name............ "BIW"
Database owner........... "SAPDAT"
Character set............ "English_United State"
SAP kernel............... "640"
Created on............... "Nov 4 2004 23:26:03"
Created in............... "NT 5.0 2195 Service Pack 4 x86 MS VC++ 13.10"
Database version......... "OCI_920_SHARE "
Patch level.............. "43"
Patch text............... " "
Supported environment....
Database................. "ORACLE 8.1.7.., ORACLE 9.2.0.."
SAP database version..... "640"
Operating system......... "Windows NT 5.0, Windows NT 5.1, Windows NT 5.2"
Memory usage.............
Roll..................... 8112
EM....................... 6271776
Heap..................... 0
Page..................... 24576
MM Used.................. 3921120
MM Free.................. 258392
SAP Release.............. "640"
User and Transaction
Client.............. 800
User................ "IC881147"
Language key........ "E"
Transaction......... " "
Program............. "SAPLSDCL"
Screen.............. "SAPMSSY0 1000"
Screen line......... 6
Information on where terminated
The termination occurred in the ABAP program "SAPLSDCL" in "INTERNAL_ERROR".
The main program was "RSRD_BROADCAST_PROCESSOR ".
The termination occurred in line 25 of the source code of the (Include)
program "LSDCLF00"
of the source code of program "LSDCLF00" (when calling the editor 250).
The program "SAPLSDCL" was started as a background job.
Job name........ "SECOQUERY"
Job initiator... "IC881147"
Job number...... 14265102
Source Code Extract (Include LSDCLF00; the terminating statement at line 25 is marked >>>>>)
*---------------------------------------------------------------------*
*       INCLUDE LSDCLF00                                              *
*---------------------------------------------------------------------*
*       FORM INTERNAL_ERROR                                           *
*       Handles unexpected error conditions (internal errors)         *
*  -->  VALUE(U_ROUTINE)    Routine/function module where error occured
*  -->  VALUE(U_ERROR_CODE) Identifier in routine (e.g. number)       *
*  -->  VALUE(U_VAR1)       Variable containing further information   *
*  -->  VALUE(U_VAR2)       Variable containing further information   *
*  -->  VALUE(U_VAR3)       Variable containing further information   *
*  -->  VALUE(U_VAR4)       Variable containing further information   *
*---------------------------------------------------------------------*
form internal_error
     using value(u_routine)
           value(u_error_code)
           value(u_var1)
           value(u_var2)
           value(u_var3)
           value(u_var4).
>>>>>  message x000 with u_routine u_error_code u_var1 u_var2.
endform.

*&------ Form BAD_OBJECT_TO_SYMSG -------------------------------------*
*       maps error information in u_bad_object into system message    *
*       variables                                                     *
*  -->  VALUE(U_BAD_OBJECT)  structure containing error information   *
*---------------------------------------------------------------------*
form bad_object_to_symsg
     using value(u_bad_object) type sdokerrmsg.
  sy-msgid = u_bad_object-id.
  sy-msgty = u_bad_object-type.
  sy-msgno = u_bad_object-no.
  sy-msgv1 = u_bad_object-v1.
  sy-msgv2 = u_bad_object-v2.
Contents of system fields
Name       Val.
SY-SUBRC   0
SY-INDEX   0
SY-TABIX   1
SY-DBCNT   4
SY-FDPOS   0
SY-LSIND   0
SY-PAGNO   0
SY-LINNO   1
SY-COLNO   1
SY-PFKEY
SY-UCOMM
SY-TITLE   Report Dissemaintion Framework: Executing the Transferred Settings
SY-MSGTY   X
SY-MSGID   1R
SY-MSGNO   000
SY-MSGV1   SDOK_GET_PHIO_ACCESS
SY-MSGV2   001
SY-MSGV3
SY-MSGV4
Active Calls/Events
No. Ty. Program Include Line
Name
15 FORM SAPLSDCL LSDCLF00 25
INTERNAL_ERROR
14 FORM SAPLSDCI LSDCIU13 303
PHIO_GET_CONTENT_ACCESS
13 FUNCTION SAPLSDCI LSDCIU13 113
SDOK_PHIO_GET_CONTENT_ACCESS
12 FUNCTION SAPLSKWF_CONTENT LSKWF_CONTENTU02 63
SKWF_PHIO_CONTENT_ACCESS_GET
11 METHOD CL_RSRA_KWF_UTILITIES=========CP CL_RSRA_KWF_UTILITIES=========CM00B 50
CL_RSRA_KWF_UTILITIES=>COPY_MIME_TO_FOLDER
10 METHOD CL_RSRA_KWF_TMPL==============CP CL_RSRA_KWF_TMPL==============CM002 28
CL_RSRA_KWF_TMPL=>GET_STYLESHEET
9 METHOD CL_RSRA_KWF_TMPL==============CP CL_RSRA_KWF_TMPL==============CM001 227
CL_RSRA_KWF_TMPL=>CONSTRUCTOR
8 METHOD CL_RSRA_ENGINE_BC=============CP CL_RSRA_ENGINE_BC=============CM010 9
CL_RSRA_ENGINE_BC=>SET_TEMPLATE_FOLDER
7 METHOD CL_RSRA_ENGINE_BC=============CP CL_RSRA_ENGINE_BC=============CM001 75
CL_RSRA_ENGINE_BC=>CONSTRUCTOR
6 METHOD CL_RSRA_JOB===================CP CL_RSRA_JOB===================CM003 47
CL_RSRA_JOB=>EXECUTE_SINGLE
5 METHOD CL_RSRA_JOB===================CP CL_RSRA_JOB===================CM00E 14
CL_RSRA_JOB=>EXECUTE_SINGLE_RC
4 METHOD CL_RSRD_PRODUCER_RA===========CP CL_RSRD_PRODUCER_RA===========CM001 147
CL_RSRD_PRODUCER_RA=>IF_RSRD_F_PRODUCER_RT~PRODUCE
3 METHOD CL_RSRD_SETTING===============CP CL_RSRD_SETTING===============CM005 28
CL_RSRD_SETTING=>EXECUTE_NODES
2 METHOD CL_RSRD_SETTING===============CP CL_RSRD_SETTING===============CM002 73
CL_RSRD_SETTING=>EXECUTE
1 EVENT RSRD_BROADCAST_PROCESSOR RSRD_BROADCAST_PROCESSOR 197
START-OF-SELECTION
Chosen variables (the raw character/hex dump rows and internal-table header blocks have been abridged; only the printable values are kept)

No. 15 Ty. FORM, Name INTERNAL_ERROR
  U_ROUTINE      = SDOK_GET_PHIO_ACCESS   (= SY-MSGV1)
  U_ERROR_CODE   = 001                    (= SY-MSGV2)
  U_VAR1, U_VAR2 = (blank)                (= SY-MSGV3, SY-MSGV4)
  SDOKI_MODE_DELETE = 5

No. 14 Ty. FORM, Name PHIO_GET_CONTENT_ACCESS
  PHIO_OBJECT_ID-CLASS = (blank), DUMMY_VERSTYPE = 0, BUFF_XPIRE = 000000000000,
  NO_BUFFER = (blank), SUBRC_AUX = 3, SY-XFORM = CONVERSION_EXIT,
  CONTEXT[] = Table IT_5544[0x89], PROPERTIES[] = Table[initial]

No. 13 Ty. FUNCTION, Name SDOK_PHIO_GET_CONTENT_ACCESS
  CLIENT = 800, CONTENT_ONLY = X, TEXT_AS_STREAM = X, ACCESS_MODE = 00,
  OBJECT_ID = (blank), MODEL_RETURNED = (blank), SUBRC_AUX = 0,
  X_DOCUMENT_URL = X, X_DOCUMENT_HTTPS_URL = X, X_COMPONENT_ACCESS = X,
  COMPONENTS[] = Table[initial], COMPONENT_ACCESS[] = Table IT_5533[0x8616],
  FILE_CONTENT_ASCII[] = Table IT_5534[0x1022], FILE_CONTENT_BINARY[] = Table IT_5535[0x1022],
  SY-REPID = SAPLSDCI

No. 12 Ty. FUNCTION, Name SKWF_PHIO_CONTENT_ACCESS_GET
  PHIO = (blank), USE_URL_AT = (blank), X_ALLOW_MODEL = (blank),
  X_CONTENT_ONLY = X, X_CONTENT_OR_URL_ONLY = (blank), X_RAW_MODE = (blank),
  X_TEXT_AS_STREAM = X, ACCESS_MODE = 00, ERROR = 000, X_MODEL_RETURNED = (blank),
  COMPONENT_ACCESS[] = Table IT_5533[0x8616], CONTEXT[] = Table IT_5544[0x89],
  FILE_CONTENT_ASCII[] = Table IT_5534[0x1022], FILE_CONTENT_BINARY[] = Table IT_5535[0x1022]
PROPERTIES[]
Table[initial]
SKWFA_C_ACT_READ
03
33
03
SYST-REPID
SAPLSKWF_CONTENT
5454545454445445222222222222222222222222
310C3B76F3FE45E4000000000000000000000000
%_ARCHIVE
2222222222222222222222222222222222222222222222222222222222222222222222222222222222222222222222
0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000
PHIO+1(42)
222222222222222222222222222222222222222222
000000000000000000000000000000000000000000
CL_ABAP_TABLEDESCR=>TABLEKIND_STD
S
5
3
SKWFC_YES
X
5
8
No. 11 Ty. METHOD
Name CL_RSRA_KWF_UTILITIES=>COPY_MIME_TO_FOLDER
I_S_MIME_IO
FM_FOLDER 3AA00E1E0D0E3DCBE10000000A1144B5
4454444452234433434343434444333333334333343
6DF6FC4520031100515040534325100000001114425
I_S_FOLDER_IO
FBW_FLD 08BSWBLVV6N1IJCIG3VHEX2H8
4455444222233455445534344444354453432222222
627F6C40000082372C666E19A397368582880000000
L_T_PROPERTY_REQUEST
Table IT_5512[1x25]
CLASS=CL_RSRA_KWF_UTILITIESMETHOD=COPY_MIME_TO_FOLDERDATA=L_T_PROPERTY_REQUEST
Table reference: 316
TABH+ 0(20) = 28D5D23C68D5D23C000000003C01000088150000
TABH+ 20(20) = 0100000019000000FFFFFFFF04C5010038110000
TABH+ 40( 8) = 10000000C1248400
store = 0x28D5D23C
ext1 = 0x68D5D23C
shmId = 0 (0x00000000)
id = 316 (0x3C010000)
label = 5512 (0x88150000)
fill = 1 (0x01000000)
leng = 25 (0x19000000)
loop = -1 (0xFFFFFFFF)
xtyp = TYPE#000069
occu = 16 (0x10000000)
access = 1 (ItAccessStandard)
idxKind = 0 (ItIndexNone)
uniKind = 2 (ItUniqueNon)
keyKind = 1 (default)
cmpMode = 2 (cmpSingleMcmpR)
occu0 = 1
collHash = 0
groupCntl = 0
rfc = 0
unShareable = 0
mightBeShared = 1
sharedWithShmTab = 0
isShmLockId = 0
gcKind = 0
isUsed = 1
>>>>> Shareable Table Header Data <<<<<
tabi = 0xC00BE03C
pghook = 0x00000000
idxPtr = 0x00000000
refCount = 0 (0x00000000)
tstRefCount = 0 (0x00000000)
lineAdmin = 16 (0x10000000)
lineAlloc = 16 (0x10000000)
store_id = 3644 (0x3C0E0000)
shmIsReadOnly = 0 (0x00000000)
>>>>> 1st level extension part <<<<<
regHook = 0x00000000
hsdir = 0x00000000
ext2 = 0x600DE03C
>>>>> 2nd level extension part <<<<<
tabhBack = 0x98D5033D
delta_head = 000000000000000000000000000000000000000000000000000000000000000000000000
pb_func = 0x00000000
pb_handle = 0x00000000
L_S_PROPERTY_REQUEST
KW_RELATIVE_URL
4555444545455542222222222
B7F25C14965F52C0000000000
L_T_PHIO
Table IT_5494[1x43]
CLASS=CL_RSRA_KWF_UTILITIESMETHOD=COPY_MIME_TO_FOLDERDATA=L_T_PHIO
Table reference: 309
TABH+ 0(20) = E8D4D23CE0D3D23C000000003501000076150000
TABH+ 20(20) = 010000002B000000FFFFFFFF04C5010018120000
TABH+ 40( 8) = 10000000C1248000
store = 0xE8D4D23C
ext1 = 0xE0D3D23C
shmId = 0 (0x00000000)
id = 309 (0x35010000)
label = 5494 (0x76150000)
fill = 1 (0x01000000)
leng = 43 (0x2B000000)
loop = -1 (0xFFFFFFFF)
xtyp = TYPE#000073
occu = 16 (0x10000000)
access = 1 (ItAccessStandard)
idxKind = 0 (ItIndexNone)
uniKind = 2 (ItUniqueNon)
keyKind = 1 (default)
cmpMode = 2 (cmpSingleMcmpR)
occu0 = 1
collHash = 0
groupCntl = 0
rfc = 0
unShareable = 0
mightBeShared = 0
sharedWithShmTab = 0
isShmLockId = 0
gcKind = 0
isUsed = 1
>>>>> Shareable Table Header Data <<<<<
tabi = 0x2010E03C
pghook = 0x00000000
idxPtr = 0x00000000
refCount = 0 (0x00000000)
tstRefCount = 0 (0x00000000)
lineAdmin = 16 (0x10000000)
lineAlloc = 16 (0x10000000)
store_id = 3643 (0x3B0E0000)
shmIsReadOnly = 0 (0x00000000)
Hi Priya,
Not sure: check the syntax in your Update Rules, and also at the level of the start routine.
Ciao.
Riccardo. -
Hi All,
We have created one cube and loaded data successfully.
But we have one dimension named PeriodType; its members are Annual, Quarterly, and Monthly.
The client asked us to make Monthly + and the remaining members ~.
For that we created a view and added a column, aggr_cons.
I defined it as +* for Monthly and the rest as *~*, and I'm using that column as the consolidation operator.
But the data load is loading only a few records.
Actually, the client asked me to create an attribute dimension, but that is not possible in EIS.
We are using EIS 7.1.2 and SQL server 2005.
For PeriodType dimension we have written the query like
select distinct PeriodType,aggr_cons
from ClaimsData_2
PeriodType is a column in the table, and it contains Annual, Quarterly, and Monthly.
Please let me know if you have any ideas on how to do this.
The only thing is I have to make Monthly +* and the remaining members *~*.
Thanks,
Prathap
Hi Prathap,
First, make the following changes in your data source (SQL Server, Oracle, or whatever you are using):
1- Create a new table, say 'Population', and add two columns, say id and population (with values like 100, 200, etc.). Define id as the primary key.
2- Now, assuming you have a SQL table called 'Product', add a column called 'Attribute' and relate it to the 'id' column of the 'Population' table through a foreign key.
Now make the following changes in the OLAP metadata & metaoutline:
1- Suppose you have a Product dimension; enable one of its columns as an attribute.
2- Now open the metaoutline and expand the Product dimension in the left panel. It will show the attributes you associated.
3- Select an attribute and drag it to the right panel. An attribute dimension will be created automatically.
Hope this answers your question.
Atul K, -
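The view-based approach discussed above can be sketched with a CASE expression that derives the consolidation-operator column directly in the query, instead of maintaining a separate aggr_cons column. This is a minimal illustration only: SQLite stands in for SQL Server 2005, the ClaimsData_2 table and PeriodType column come from the question, and the question's literal +*/*~* tokens are simplified here to plain + and ~ operators.

```python
import sqlite3

# In-memory stand-in for the SQL Server table named in the question.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ClaimsData_2 (PeriodType TEXT)")
conn.executemany(
    "INSERT INTO ClaimsData_2 VALUES (?)",
    [("Annual",), ("Quarterly",), ("Monthly",), ("Monthly",)],
)

# Derive the consolidation operator per period type:
# '+' for Monthly, '~' for everything else.
rows = conn.execute(
    """
    SELECT DISTINCT PeriodType,
           CASE WHEN PeriodType = 'Monthly' THEN '+' ELSE '~' END AS aggr_cons
    FROM ClaimsData_2
    """
).fetchall()
print(sorted(rows))  # [('Annual', '~'), ('Monthly', '+'), ('Quarterly', '~')]
```

The same CASE expression should work inside the SQL Server view feeding the EIS dimension query, so the operator mapping lives in one place rather than in a manually maintained column.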
Getting an error while loading data from ERP Integrator to HFM
Hello,
We are getting the following error while loading the data from ERPI to HFM.
2013-12-31 22:44:54,133 INFO [AIF]: ERPI Process Start, Process ID: 300
2013-12-31 22:44:54,137 INFO [AIF]: ERPI Logging Level: DEBUG (5)
2013-12-31 22:44:54,139 INFO [AIF]: ERPI Log File: C:\Windows\TEMP\/aif_501_300.log
2013-12-31 22:44:54,141 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Java HotSpot(TM) 64-Bit Server VM (Oracle Corporation)]
2013-12-31 22:44:54,143 INFO [AIF]: Java Platform: java1.7.0_25
2013-12-31 22:44:56,821 INFO [AIF]: COMM Process Periods - Insert Periods - START
2013-12-31 22:44:56,828 DEBUG [AIF]:
SELECT l.SOURCE_LEDGER_ID
,l.SOURCE_LEDGER_NAME
,l.SOURCE_COA_ID
,l.CALENDAR_ID
,'0' SETID
,l.PERIOD_TYPE
,NULL LEDGER_TABLE_NAME
FROM AIF_BALANCE_RULES br
,AIF_COA_LEDGERS l
WHERE br.RULE_ID = 27
AND l.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = br.SOURCE_LEDGER_ID
2013-12-31 22:44:56,834 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_PERIODS (
PROCESS_ID
,PERIODKEY
,PERIOD_ID
,ADJUSTMENT_PERIOD_FLAG
,GL_PERIOD_YEAR
,GL_PERIOD_NUM
,GL_PERIOD_NAME
,GL_PERIOD_CODE
,GL_EFFECTIVE_PERIOD_NUM
,YEARTARGET
,PERIODTARGET
,IMP_ENTITY_TYPE
,IMP_ENTITY_ID
,IMP_ENTITY_NAME
,TRANS_ENTITY_TYPE
,TRANS_ENTITY_ID
,TRANS_ENTITY_NAME
,PRIOR_PERIOD_FLAG
,SOURCE_LEDGER_ID
SELECT DISTINCT brl.LOADID PROCESS_ID
,pp.PERIODKEY PERIODKEY
,prd.PERIOD_ID
,COALESCE(prd.ADJUSTMENT_PERIOD_FLAG, 'N') ADJUSTMENT_PERIOD_FLAG
,COALESCE(prd.YEAR,0) GL_PERIOD_YEAR
,COALESCE(prd.PERIOD_NUM,0) GL_PERIOD_NUM
,prd.PERIOD_NAME GL_PERIOD_NAME
,COALESCE(prd.PERIOD_CODE, CAST(COALESCE(prd.PERIOD_NUM,0) AS VARCHAR(38))) GL_PERIOD_CODE
,(COALESCE(prd.YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM,0)) GL_EFFECTIVE_PERIOD_NUM
,COALESCE(ppa.YEARTARGET, pp.YEARTARGET) YEARTARGET
,COALESCE(ppa.PERIODTARGET, pp.PERIODTARGET) PERIODTARGET
,'PROCESS_BAL_IMP' IMP_ENTITY_TYPE
,(COALESCE(prd.YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM,0)) IMP_ENTITY_ID
,prd.PERIOD_NAME IMP_ENTITY_NAME
,'PROCESS_BAL_TRANS' TRANS_ENTITY_TYPE
,(COALESCE(prd.YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM,0)) TRANS_ENTITY_ID
,pp.PERIODDESC TRANS_ENTITY_NAME
,'N' PRIOR_PERIOD_FLAG
,1 SOURCE_LEDGER_ID
FROM (
AIF_BAL_RULE_LOADS brl
INNER JOIN TPOVCATEGORY pc
ON pc.CATKEY = brl.CATKEY
INNER JOIN TPOVPERIOD_FLAT_V pp
ON pp.PERIODFREQ = pc.CATFREQ
AND pp.PERIODKEY >= brl.START_PERIODKEY
AND pp.PERIODKEY <= brl.END_PERIODKEY
LEFT OUTER JOIN TPOVPERIODADAPTOR_FLAT_V ppa
ON ppa.PERIODKEY = pp.PERIODKEY
AND ppa.PERIODFREQ = pp.PERIODFREQ
AND ppa.INTSYSTEMKEY = 'FMTEST2'
INNER JOIN AIF_GL_PERIODS_STG prd
ON prd.SOURCE_SYSTEM_ID = 3
AND prd.CALENDAR_ID IN ('10000')
AND prd.SETID = '0'
AND prd.PERIOD_TYPE = 'Month'
AND prd.ADJUSTMENT_PERIOD_FLAG = 'N'
AND prd.START_DATE > pp.PRIORPERIODKEY
AND prd.START_DATE <= pp.PERIODKEY
WHERE brl.LOADID = 300
ORDER BY pp.PERIODKEY
,GL_EFFECTIVE_PERIOD_NUM
2013-12-31 22:44:56,915 INFO [AIF]: COMM Process Periods - Insert Periods - END
2013-12-31 22:44:56,945 INFO [AIF]: COMM Process Periods - Insert Process Details - START
2013-12-31 22:44:56,952 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_DETAILS (
PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,TARGET_TABLE_NAME
,EXECUTION_START_TIME
,EXECUTION_END_TIME
,RECORDS_PROCESSED
,STATUS
,LAST_UPDATED_BY
,LAST_UPDATE_DATE
SELECT PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,'AIF_EBS_GL_BALANCES_STG' TARGET_TABLE_NAME
,CURRENT_TIMESTAMP EXECUTION_START_TIME
,NULL EXECUTION_END_TIME
,0 RECORDS_PROCESSED
,'PENDING' STATUS
,'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' LAST_UPDATED_BY
,CURRENT_TIMESTAMP LAST_UPDATE_DATE
FROM (
SELECT DISTINCT PROCESS_ID
,IMP_ENTITY_TYPE ENTITY_TYPE
,IMP_ENTITY_ID ENTITY_ID
,IMP_ENTITY_NAME ENTITY_NAME
,(COALESCE(SOURCE_LEDGER_ID,0) * 100000000 + GL_EFFECTIVE_PERIOD_NUM) ENTITY_NAME_ORDER
FROM AIF_PROCESS_PERIODS
WHERE PROCESS_ID = 300
) q
ORDER BY ENTITY_NAME_ORDER
2013-12-31 22:44:56,963 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_DETAILS (
PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,TARGET_TABLE_NAME
,EXECUTION_START_TIME
,EXECUTION_END_TIME
,RECORDS_PROCESSED
,STATUS
,LAST_UPDATED_BY
,LAST_UPDATE_DATE
SELECT PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,'TDATASEG' TARGET_TABLE_NAME
,CURRENT_TIMESTAMP EXECUTION_START_TIME
,NULL EXECUTION_END_TIME
,0 RECORDS_PROCESSED
,'PENDING' STATUS
,'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' LAST_UPDATED_BY
,CURRENT_TIMESTAMP LAST_UPDATE_DATE
FROM (
SELECT PROCESS_ID
,TRANS_ENTITY_TYPE ENTITY_TYPE
,MIN(TRANS_ENTITY_ID) ENTITY_ID
,TRANS_ENTITY_NAME ENTITY_NAME
,MIN(COALESCE(SOURCE_LEDGER_ID,0) * 100000000 + GL_EFFECTIVE_PERIOD_NUM) ENTITY_NAME_ORDER
FROM AIF_PROCESS_PERIODS
WHERE PROCESS_ID = 300
AND PRIOR_PERIOD_FLAG = 'N'
GROUP BY PROCESS_ID
,TRANS_ENTITY_TYPE
,TRANS_ENTITY_NAME
) q
ORDER BY ENTITY_NAME_ORDER
2013-12-31 22:44:56,970 INFO [AIF]: COMM Process Periods - Insert Process Details - END
2013-12-31 22:44:57,407 DEBUG [AIF]: EBS/FS GL Balances Print Variables - Printing Variables - START
2013-12-31 22:44:57,408 DEBUG [AIF]:
p_process_id: 300
p_sql_db_type: ORACLE
p_partitionkey: 12
p_rule_id: 27
p_source_system_id: 3
p_application_id: 26
p_target_application_type: HFM
p_is_multi_currency: true
p_data_load_method: CLASSIC_VIA_EPMI
p_bal_balance_method_code: STANDARD
p_bal_ledger_group_code: SINGLE
p_bal_amount_type: MONETARY
p_prd_entity_name: JAN-13
p_prd_period_id: 135
p_prd_gl_period_name: JAN-13
p_prd_source_ledger_id: 1
p_prd_source_coa_id: 101
p_source_ledger_id: 1
p_source_coa_id: 101
p_bal_actual_flag: A
p_bal_seg_column_name: SEGMENT1
p_max_ccid_loaded_to_stg: 148137
2013-12-31 22:44:57,408 DEBUG [AIF]: EBS/FS GL Balances Print Variables - Printing Variables - END
2013-12-31 22:44:57,806 INFO [AIF]: LKM EBS/FS Extract Type - Load Audit AND Full Refresh - START
2013-12-31 22:44:57,817 DEBUG [AIF]:
SELECT p.PROCESS_ID
,br.RULE_NAME
,l.SOURCE_LEDGER_NAME
FROM AIF_GL_LOAD_AUDIT aud
,AIF_PROCESSES p
,AIF_BALANCE_RULES br
,AIF_COA_LEDGERS l
WHERE aud.SOURCE_SYSTEM_ID = 3
AND aud.SOURCE_LEDGER_ID = 1
AND aud.GL_PERIOD_ID = 135
AND aud.BALANCE_TYPE = 'A'
AND p.PROCESS_ID = aud.LAST_LOADID
AND p.STATUS = 'RUNNING'
AND p.PROCESS_ID <> 300
AND br.RULE_ID = p.RULE_ID
AND l.SOURCE_SYSTEM_ID = aud.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = aud.SOURCE_LEDGER_ID
2013-12-31 22:44:57,826 DEBUG [AIF]:
SELECT 'Y' VALID_FLAG
FROM GL_PERIOD_STATUSES
WHERE APPLICATION_ID = 101
AND SET_OF_BOOKS_ID = 1
AND PERIOD_NAME = 'JAN-13'
AND CLOSING_STATUS IN ( 'O','C','P' )
2013-12-31 22:44:57,847 DEBUG [AIF]:
SELECT 'Y' EXISTS_FLAG
FROM GL_TRACK_DELTA_BALANCES
WHERE SET_OF_BOOKS_ID = 1
AND PROGRAM_CODE = 'FEM'
AND PERIOD_NAME = 'JAN-13'
AND ACTUAL_FLAG = 'A'
AND EXTRACT_LEVEL_CODE = 'DTL'
AND CURRENCY_TYPE_CODE = 'B'
AND ENABLED_FLAG = 'Y'
2013-12-31 22:44:57,883 DEBUG [AIF]:
SELECT MAX(DELTA_RUN_ID) MAX_DELTA_RUN_ID
FROM GL_BALANCES_DELTA
WHERE SET_OF_BOOKS_ID = 1
AND PERIOD_NAME = 'JAN-13'
AND ACTUAL_FLAG = 'A'
2013-12-31 22:44:57,898 DEBUG [AIF]:
SELECT brl.EXECUTION_MODE
,( SELECT CASE COUNT(aud.DELTA_RUN_ID) WHEN 0 THEN 'N' ELSE 'Y' END
FROM AIF_GL_LOAD_AUDIT aud
WHERE aud.SOURCE_SYSTEM_ID = 3
AND aud.SOURCE_LEDGER_ID = 1
AND aud.GL_PERIOD_ID = 135
AND aud.BALANCE_TYPE = 'A'
AND aud.LAST_LOADID <> brl.LOADID
AND COALESCE( aud.STATUS, 'SUCCESS' ) = 'SUCCESS'
) GL_LOAD_AUDIT_SUCCESS_FLAG
,( SELECT CASE COUNT(aud.DELTA_RUN_ID) WHEN 0 THEN 'N' ELSE 'Y' END
FROM AIF_GL_LOAD_AUDIT aud
WHERE aud.SOURCE_SYSTEM_ID = 3
AND aud.SOURCE_LEDGER_ID = 1
AND aud.GL_PERIOD_ID = 135
AND aud.BALANCE_TYPE = 'A'
AND aud.LAST_LOADID <> brl.LOADID
AND aud.STATUS = 'RUNNING'
) GL_LOAD_AUDIT_RUNNING_FLAG
FROM AIF_BAL_RULE_LOADS brl
WHERE brl.LOADID = 300
2013-12-31 22:44:57,904 DEBUG [AIF]:
INSERT INTO AIF_GL_LOAD_AUDIT ( LAST_LOADID
,DELTA_RUN_ID
,SOURCE_SYSTEM_ID
,SOURCE_LEDGER_ID
,GL_PERIOD_ID
,BALANCE_TYPE
,GL_EXTRACT_TYPE
,STATUS
) VALUES ( 300
,0
,3
,1
,135
,'A'
,'FULLREFRESH'
,'RUNNING'
2013-12-31 22:44:57,907 DEBUG [AIF]:
DELETE FROM AIF_EBS_GL_BALANCES_STG
WHERE SOURCE_SYSTEM_ID = 3
AND SOURCE_COA_ID = 101
AND SOURCE_LEDGER_ID = 1
AND ACTUAL_FLAG = 'A'
AND PERIOD_NAME = 'JAN-13'
2013-12-31 22:44:59,283 INFO [AIF]: LKM EBS/FS Extract Type - Load Audit AND Full Refresh - END
2013-12-31 22:45:06,507 INFO [AIF]: COMM End Process Detail - Update Process Detail - START
2013-12-31 22:45:06,514 DEBUG [AIF]:
UPDATE AIF_PROCESS_DETAILS
SET STATUS = 'SUCCESS'
,RECORDS_PROCESSED = CASE
WHEN RECORDS_PROCESSED IS NULL THEN 0
ELSE RECORDS_PROCESSED
END + 57408
,EXECUTION_END_TIME = CURRENT_TIMESTAMP
,LAST_UPDATED_BY = CASE
WHEN ('native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' IS NULL) THEN LAST_UPDATED_BY
ELSE 'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
END
,LAST_UPDATE_DATE = CURRENT_TIMESTAMP
WHERE PROCESS_ID = 300
AND ENTITY_TYPE = 'PROCESS_BAL_IMP'
AND ENTITY_NAME = 'JAN-13'
2013-12-31 22:45:06,519 INFO [AIF]: COMM End Process Detail - Update Process Detail - END
2013-12-31 22:45:07,106 INFO [AIF]: EBS/FS Load Data - Load TDATASEG_T - START
2013-12-31 22:45:07,112 INFO [AIF]:
Import Data from Source for Period 'January 2013'
2013-12-31 22:45:07,115 DEBUG [AIF]:
SELECT brl.PARTITIONKEY
,brl.CATKEY
,brl.PLAN_TYPE
,CASE brl.PLAN_TYPE
WHEN 'PLAN1' THEN 1
WHEN 'PLAN2' THEN 2
WHEN 'PLAN3' THEN 3
WHEN 'PLAN4' THEN 4
WHEN 'PLAN5' THEN 5
ELSE 0
END PLAN_NUMBER
,brl.EXECUTION_MODE
,br.AMOUNT_TYPE
,br.BALANCE_SELECTION
,br.CURRENCY_CODE
,br.INCL_ZERO_BALANCE_FLAG
,br.BAL_SEG_VALUE_OPTION_CODE
,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
FROM AIF_BAL_RULE_LOADS brl
,AIF_TARGET_APPLICATIONS app
,AIF_BALANCE_RULES br
WHERE brl.LOADID = 300
AND app.APPLICATION_ID = brl.APPLICATION_ID
AND br.RULE_ID = brl.RULE_ID
2013-12-31 22:45:07,120 DEBUG [AIF]:
SELECT PERIODKEY
FROM TPOVPERIOD
WHERE PERIODDESC = 'January 2013'
2013-12-31 22:45:07,122 INFO [AIF]:
Import Data from Source for Ledger 'JWR Books'
2013-12-31 22:45:07,125 DEBUG [AIF]:
SELECT COA_SEGMENT_NAME
,ACCOUNT_TYPE_FLAG
,BALANCE_TYPE_FLAG
FROM AIF_COA_SEGMENTS
WHERE SOURCE_SYSTEM_ID = 3
AND SOURCE_COA_ID = '101'
AND (
ACCOUNT_TYPE_FLAG = 'Y'
OR BALANCE_TYPE_FLAG = 'Y'
2013-12-31 22:45:07,127 DEBUG [AIF]:
SELECT adim.BALANCE_COLUMN_NAME DIMNAME
,adim.DIMENSION_ID
,dim.TARGET_DIMENSION_CLASS_NAME
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1
) COA_SEGMENT_NAME1
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2
) COA_SEGMENT_NAME2
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3
) COA_SEGMENT_NAME3
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4
) COA_SEGMENT_NAME4
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5
) COA_SEGMENT_NAME5
,(SELECT CASE mdd.ORPHAN_OPTION_CODE
WHEN 'CHILD' THEN 'N'
WHEN 'ROOT' THEN 'N'
ELSE 'Y'
END DIMENSION_FILTER_FLAG
FROM AIF_MAP_DIM_DETAILS_V mdd
,AIF_MAPPING_RULES mr
WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
AND mdd.RULE_ID = mr.RULE_ID
AND mdd.DIMENSION_ID = adim.DIMENSION_ID
) DIMENSION_FILTER_FLAG
,tiie.IMPCONCATCHAR
FROM TPOVPARTITION tpp
INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
ON adim.APPLICATION_ID = 26
INNER JOIN AIF_DIMENSIONS dim
ON dim.DIMENSION_ID = adim.DIMENSION_ID
LEFT OUTER JOIN TBHVIMPITEMERPI tiie
ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
AND tiie.IMPMAPTYPE = 'ERP'
WHERE tpp.PARTITIONKEY = 12
AND adim.BALANCE_COLUMN_NAME IS NOT NULL
ORDER BY adim.BALANCE_COLUMN_NAME
2013-12-31 22:45:07,154 DEBUG [AIF]:
INSERT INTO TDATASEG_T (
LOADID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,VALID_FLAG
,CHANGESIGN
,CODE_COMBINATION_ID
,SOURCE_LEDGER_ID
,GL_PERIOD_YEAR
,GL_PERIOD_NUM
,YEAR
,PERIOD
,ATTR1
,ATTR2
,ATTR3
,ATTR4
,ATTR5
,ATTR6
,ATTR7
,ATTR8
,ATTR9
,ATTR10
,ATTR11
,ATTR12
,ATTR13
,ATTR14
,ACCOUNT
,ACCOUNTX
,ENTITY
,ENTITYX
,ICP
,ICPX
,UD1
,UD1X
,UD2
,UD2X
,UD3
,UD3X
,UD4
,UD4X
,DATAVIEW
,DATAKEY
,STAT_BALANCE_FLAG
,CURKEY
,AMOUNT_PTD
,AMOUNT_YTD
,AMOUNT
,AMOUNTX
SELECT pprd.PROCESS_ID LOADID
,12 PARTITIONKEY
,4 CATKEY
,27 RULE_ID
,pprd.PERIODKEY
,'Y' VALID_FLAG
,0 CHANGESIGN
,ccid.CODE_COMBINATION_ID
,bal.SOURCE_LEDGER_ID
,pprd.GL_PERIOD_YEAR
,pprd.GL_PERIOD_NUM
,pprd.YEARTARGET YEAR
,pprd.PERIODTARGET PERIOD
,pprd.PROCESS_ID ATTR1
,bal.SOURCE_SYSTEM_ID ATTR2
,bal.SOURCE_LEDGER_ID ATTR3
,pprd.GL_PERIOD_YEAR ATTR4
,pprd.GL_PERIOD_NAME ATTR5
,bal.ACTUAL_FLAG ATTR6
,bal.BUDGET_VERSION_ID ATTR7
,bal.ENCUMBRANCE_TYPE_ID ATTR8
,ccid.ACCOUNT_TYPE ATTR9
,NULL ATTR10
,NULL ATTR11
,NULL ATTR12
,NULL ATTR13
,NULL ATTR14
,ccid.SEGMENT4 ACCOUNT
,NULL ACCOUNTX
,ccid.SEGMENT1 ENTITY
,NULL ENTITYX
,NULL ICP
,NULL ICPX
,ccid.SEGMENT5 UD1
,NULL UD1X
,ccid.SEGMENT3 UD2
,NULL UD2X
,ccid.SEGMENT6 UD3
,NULL UD3X
,ccid.SEGMENT2 UD4
,NULL UD4X
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('R','E','D','C') THEN 'Periodic' ELSE 'YTD' END ) DATAVIEW
,TDATASEG_DATAKEY_S.NEXTVAL
,'N' STAT_BALANCE_FLAG
,bal.CURRENCY_CODE CURKEY
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) * ( bal.PERIOD_NET_DR - bal.PERIOD_NET_CR ) AMOUNT_PTD
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) * ( bal.BEGIN_BALANCE_DR - bal.BEGIN_BALANCE_CR + bal.PERIOD_NET_DR - bal.PERIOD_NET_CR ) AMOUNT_YTD
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) *
( CASE
WHEN ccid.ACCOUNT_TYPE IN ('R','E','D','C') THEN ( bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
ELSE ( bal.BEGIN_BALANCE_DR - bal.BEGIN_BALANCE_CR + bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
END
AMOUNT
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) *
( CASE
WHEN ccid.ACCOUNT_TYPE IN ('R','E','D','C') THEN ( bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
ELSE ( bal.BEGIN_BALANCE_DR - bal.BEGIN_BALANCE_CR + bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
END
AMOUNTX
FROM AIF_EBS_GL_BALANCES_STG_V bal
,AIF_EBS_GL_CCID_STG ccid
,AIF_PROCESS_PERIODS pprd
WHERE bal.SOURCE_SYSTEM_ID = 3
AND bal.SOURCE_LEDGER_ID = 1
AND bal.ACTUAL_FLAG = 'A'
AND ccid.SOURCE_SYSTEM_ID = bal.SOURCE_SYSTEM_ID
AND ccid.SOURCE_COA_ID = bal.SOURCE_COA_ID
AND ccid.CODE_COMBINATION_ID = bal.CODE_COMBINATION_ID
AND pprd.PROCESS_ID = 300
AND pprd.PERIODKEY = '2013-01-01'
AND pprd.SOURCE_LEDGER_ID = bal.SOURCE_LEDGER_ID
AND pprd.GL_PERIOD_NAME = bal.PERIOD_NAME
AND (
bal.BEGIN_BALANCE_DR <> 0
OR bal.BEGIN_BALANCE_CR <> 0
OR bal.PERIOD_NET_DR <> 0
OR bal.PERIOD_NET_CR <> 0
AND bal.CURRENCY_CODE <> 'STAT'
AND bal.TRANSLATED_FLAG IS NULL
2013-12-31 22:45:09,269 INFO [AIF]: Monetary Data Rows Imported from Source: 12590
2013-12-31 22:45:09,293 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_AUDIT (
LOADID
,TARGET_APPLICATION_TYPE
,TARGET_APPLICATION_NAME
,PLAN_TYPE
,SOURCE_LEDGER_ID
,EPM_YEAR
,EPM_PERIOD
,SNAPSHOT_FLAG
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,EXPORT_TO_TARGET_FLAG
SELECT DISTINCT PROCESS_ID LOADID
,'HFM' TARGET_APPLICATION_TYPE
,'FMTEST2' TARGET_APPLICATION_NAME
,NULL PLAN_TYPE
,SOURCE_LEDGER_ID
,YEARTARGET EPM_YEAR
,PERIODTARGET EPM_PERIOD
,'Y' SNAPSHOT_FLAG
,12 PARTITIONKEY
,4 CATKEY
,27 RULE_ID
,PERIODKEY
,'N' EXPORT_TO_TARGET_FLAG
FROM AIF_PROCESS_PERIODS
WHERE PROCESS_ID = 300
AND PERIODKEY = '2013-01-01'
AND SOURCE_LEDGER_ID = 1
2013-12-31 22:45:09,297 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_PRD_AUDIT (
LOADID
,SOURCE_LEDGER_ID
,GL_PERIOD_ID
,DELTA_RUN_ID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
SELECT DISTINCT pprd.PROCESS_ID LOADID
,pprd.SOURCE_LEDGER_ID
,pprd.PERIOD_ID GL_PERIOD_ID
,(SELECT MAX(gl.DELTA_RUN_ID)
FROM AIF_GL_LOAD_AUDIT gl
WHERE gl.SOURCE_SYSTEM_ID = 3
AND gl.SOURCE_LEDGER_ID = pprd.SOURCE_LEDGER_ID
AND gl.BALANCE_TYPE = 'A'
AND gl.GL_PERIOD_ID = pprd.PERIOD_ID
) DELTA_RUN_ID
,12 PARTITIONKEY
,4 CATKEY
,27 RULE_ID
,pprd.PERIODKEY
FROM AIF_PROCESS_PERIODS pprd
WHERE pprd.PROCESS_ID = 300
AND pprd.PERIODKEY = '2013-01-01'
AND pprd.SOURCE_LEDGER_ID = 1
2013-12-31 22:45:09,302 INFO [AIF]:
Total Data Rows Imported from Source: 12590
2013-12-31 22:45:09,305 INFO [AIF]: EBS/FS Load Data - Load TDATASEG_T - END
2013-12-31 22:45:09,381 INFO [AIF]: COMM Update Data - Init DataLoadUtil - START
2013-12-31 22:45:09,385 INFO [AIF]: COMM Update Data - Init DataLoadUtil - END
2013-12-31 22:45:09,485 INFO [AIF]: COMM Update Data - Update TDATASEG_T/TDATASEGW - START
2013-12-31 22:45:09,491 DEBUG [AIF]:
DELETE FROM TDATASEG_T
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND AMOUNT = 0
2013-12-31 22:45:09,559 WARN [AIF]:
Warning: Data rows with zero balances exist
2013-12-31 22:45:09,636 INFO [AIF]: Zero Balance Data Rows Deleted: 1879
2013-12-31 22:45:09,654 DEBUG [AIF]:
SELECT DIMNAME
,CASE WHEN RULE_ID IS NULL THEN 'N' ELSE 'Y' END RULE_MAP_FLAG
,SRCKEY
,TARGKEY
,WHERECLAUSETYPE
,WHERECLAUSEVALUE
,CHANGESIGN
,SEQUENCE
,DATAKEY
,MAPPING_TYPE
FROM (
SELECT DISTINCT tdm.DIMNAME
,tdm.RULE_ID
,NULL SRCKEY
,NULL TARGKEY
,tdm.WHERECLAUSETYPE
,tdm.WHERECLAUSEVALUE
,NULL CHANGESIGN
,1 SEQUENCE
,COALESCE(tdm.SYSTEM_GENERATED_FLAG,'N') SYSTEM_GENERATED_FLAG
,NULL DATAKEY
,CASE
WHEN tdm.WHERECLAUSETYPE IS NULL THEN 1
ELSE 3
END MAPPING_TYPE
FROM TDATAMAP_T tdm
WHERE tdm.LOADID = 300
AND tdm.PARTITIONKEY = 12
AND tdm.TDATAMAPTYPE = 'ERP'
AND (tdm.RULE_ID IS NULL OR tdm.RULE_ID = 27)
AND tdm.WHERECLAUSETYPE IS NULL
UNION ALL
SELECT tdm.DIMNAME
,tdm.RULE_ID
,tdm.SRCKEY
,tdm.TARGKEY
,tdm.WHERECLAUSETYPE
,tdm.WHERECLAUSEVALUE
,tdm.CHANGESIGN
,CASE tpp.PARTSEQMAP
WHEN 0 THEN CASE
WHEN (tdm.WHERECLAUSETYPE = 'MULTIDIM') THEN 2
WHEN (tdm.WHERECLAUSETYPE = 'BETWEEN') THEN 3
WHEN (tdm.WHERECLAUSETYPE = 'LIKE') THEN 4
ELSE 0
END
ELSE tdm.SEQUENCE
END SEQUENCE
,COALESCE(tdm.SYSTEM_GENERATED_FLAG,'N') SYSTEM_GENERATED_FLAG
,tdm.DATAKEY
,CASE
WHEN tdm.WHERECLAUSETYPE IS NULL THEN 1
ELSE 3
END MAPPING_TYPE
FROM TDATAMAP_T tdm
INNER JOIN TPOVPARTITION tpp
ON tpp.PARTITIONKEY = tdm.PARTITIONKEY
WHERE tdm.LOADID = 300
AND tdm.PARTITIONKEY = 12
AND tdm.TDATAMAPTYPE = 'ERP'
AND (tdm.RULE_ID IS NULL OR tdm.RULE_ID = 27)
AND tdm.WHERECLAUSETYPE IN ('MULTIDIM','BETWEEN','LIKE')
) q
ORDER BY DIMNAME
,RULE_ID
,SEQUENCE
,SYSTEM_GENERATED_FLAG
,SRCKEY
2013-12-31 22:45:09,672 INFO [AIF]:
Processing Mappings for Column 'ACCOUNT'
2013-12-31 22:45:09,677 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ACCOUNTX = ACCOUNT
,ACCOUNTR = 121
,ACCOUNTF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '*' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ACCOUNTX IS NULL
AND (1=1)
2013-12-31 22:45:10,044 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:10,053 INFO [AIF]:
Processing Mappings for Column 'ENTITY'
2013-12-31 22:45:10,057 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ENTITYX = ENTITY
,ENTITYR = 122
,ENTITYF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '*' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ENTITYX IS NULL
AND (1=1)
2013-12-31 22:45:10,426 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:10,435 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ENTITYX = ENTITY
,ENTITYR = 132
,ENTITYF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '*' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ENTITYX IS NULL
AND (1=1)
2013-12-31 22:45:10,446 INFO [AIF]: Data Rows Updated by Location Mapping 'DEFAULT' (LIKE): 0
2013-12-31 22:45:10,448 INFO [AIF]:
Processing Mappings for Column 'ICP'
2013-12-31 22:45:10,452 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ICPX = '[ICP None]'
,ICPR = 130
,ICPF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[ICP None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ICPX IS NULL
AND (1=1)
2013-12-31 22:45:10,784 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:10,798 INFO [AIF]:
Processing Mappings for Column 'UD1'
2013-12-31 22:45:10,802 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD1X = '[None]'
,UD1R = 124
,UD1F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD1X IS NULL
AND (1=1)
2013-12-31 22:45:11,134 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:11,156 INFO [AIF]:
Processing Mappings for Column 'UD2'
2013-12-31 22:45:11,160 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD2X = '[None]'
,UD2R = 123
,UD2F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD2X IS NULL
AND (1=1)
2013-12-31 22:45:11,517 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:11,531 INFO [AIF]:
Processing Mappings for Column 'UD3'
2013-12-31 22:45:11,535 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD3X = '[None]'
,UD3R = 125
,UD3F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD3X IS NULL
AND (1=1)
2013-12-31 22:45:11,870 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:11,883 INFO [AIF]:
Processing Mappings for Column 'UD4'
2013-12-31 22:45:11,887 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD4X = '[None]'
,UD4R = 128
,UD4F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD4X IS NULL
AND (1=1)
2013-12-31 22:45:12,192 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:12,204 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ATTR14 = DATAKEY
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:12,739 DEBUG [AIF]:
UPDATE TDATASEG_T
SET VALID_FLAG = 'N'
WHERE 1=1
AND (
(1=0)
OR TDATASEG_T.ACCOUNTX IS NULL
OR TDATASEG_T.ENTITYX IS NULL
OR TDATASEG_T.ICPX IS NULL
OR TDATASEG_T.UD1X IS NULL
OR TDATASEG_T.UD2X IS NULL
OR TDATASEG_T.UD3X IS NULL
OR TDATASEG_T.UD4X IS NULL
AND LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND VALID_FLAG = 'Y'
2013-12-31 22:45:12,754 INFO [AIF]:
Total Data Rows available for Export to Target: 10711
2013-12-31 22:45:12,773 INFO [AIF]: COMM Update Data - Update TDATASEG_T/TDATASEGW - END
2013-12-31 22:45:12,808 INFO [AIF]: COMM End Process Detail - Update Process Detail - START
2013-12-31 22:45:12,823 DEBUG [AIF]:
UPDATE AIF_PROCESS_DETAILS
SET STATUS = 'SUCCESS'
,RECORDS_PROCESSED = CASE
WHEN RECORDS_PROCESSED IS NULL THEN 0
ELSE RECORDS_PROCESSED
END + 10711
,EXECUTION_END_TIME = CURRENT_TIMESTAMP
,LAST_UPDATED_BY = CASE
WHEN ('native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' IS NULL) THEN LAST_UPDATED_BY
ELSE 'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
END
,LAST_UPDATE_DATE = CURRENT_TIMESTAMP
WHERE PROCESS_ID = 300
AND ENTITY_TYPE = 'PROCESS_BAL_TRANS'
AND ENTITY_NAME = 'January 2013'
2013-12-31 22:45:12,829 INFO [AIF]: COMM End Process Detail - Update Process Detail - END
2013-12-31 22:45:12,987 INFO [AIF]: COMM Update YTD Amounts - Update YTD Amounts - START
2013-12-31 22:45:12,993 DEBUG [AIF]:
SELECT brl.PARTITIONKEY
,brl.CATKEY
,pprd.YEARTARGET
,pprd.PERIODTARGET
,pprd.SOURCE_LEDGER_ID
FROM AIF_BAL_RULE_LOADS brl
,AIF_PROCESS_PERIODS pprd
WHERE brl.LOADID = 300
AND pprd.PROCESS_ID = brl.LOADID
GROUP BY brl.PARTITIONKEY
,brl.CATKEY
,pprd.YEARTARGET
,pprd.PERIODTARGET
,pprd.SOURCE_LEDGER_ID
HAVING COUNT(*) > 1
2013-12-31 22:45:12,995 INFO [AIF]: COMM Update YTD Amounts - Update YTD Amounts - END
2013-12-31 22:45:13,052 INFO [AIF]: COMM Load TDATAMAPSEG/TDATASEG - Load TDATAMAPSEG/TDATASEG - START
2013-12-31 22:45:13,057 DEBUG [AIF]:
SELECT brl.PARTITIONKEY
,brl.CATKEY
,brl.EXECUTION_MODE
FROM AIF_BAL_RULE_LOADS brl
WHERE brl.LOADID = 300
2013-12-31 22:45:13,059 DEBUG [AIF]:
SELECT PERIODKEY
FROM AIF_APPL_LOAD_AUDIT
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND RULE_ID = 27
ORDER BY PERIODKEY
2013-12-31 22:45:13,061 INFO [AIF]:
Processing Data for PeriodKey '2013-01-01'
2013-12-31 22:45:13,065 DEBUG [AIF]:
DELETE FROM TDATAMAPSEG
WHERE PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND (
TDATAMAPTYPE = 'ERP'
OR (
TDATAMAPTYPE = 'MULTIDIM'
AND EXISTS (
SELECT 1
FROM TDATAMAPSEG parent
WHERE parent.PARTITIONKEY = TDATAMAPSEG.PARTITIONKEY
AND parent.DATAKEY = TDATAMAPSEG.TARGKEY
AND parent.CATKEY = TDATAMAPSEG.CATKEY
AND parent.PERIODKEY = TDATAMAPSEG.PERIODKEY
AND parent.TDATAMAPTYPE = 'ERP'
)
)
)
2013-12-31 22:45:13,074 INFO [AIF]: Number of Rows deleted from TDATAMAPSEG: 8
2013-12-31 22:45:13,077 DEBUG [AIF]:
INSERT INTO TDATAMAPSEG (
DATAKEY
,PARTITIONKEY
,CATKEY
,PERIODKEY
,DIMNAME
,SRCKEY
,SRCDESC
,TARGKEY
,WHERECLAUSETYPE
,WHERECLAUSEVALUE
,CHANGESIGN
,SEQUENCE
,VBSCRIPT
,TDATAMAPTYPE
,SYSTEM_GENERATED_FLAG
)
SELECT DATAKEY
,PARTITIONKEY
,4
,'2013-01-01'
,DIMNAME
,SRCKEY
,SRCDESC
,TARGKEY
,WHERECLAUSETYPE
,WHERECLAUSEVALUE
,CHANGESIGN
,SEQUENCE
,VBSCRIPT
,TDATAMAPTYPE
,SYSTEM_GENERATED_FLAG
FROM TDATAMAP_T
WHERE LOADID = 300
2013-12-31 22:45:13,081 INFO [AIF]: Number of Rows inserted into TDATAMAPSEG: 8
2013-12-31 22:45:13,083 DEBUG [AIF]:
DELETE FROM TDATASEG
WHERE LOADID < 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND RULE_ID = 27
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:15,659 INFO [AIF]: Number of Rows deleted from TDATASEG: 10711
2013-12-31 22:45:15,728 DEBUG [AIF]:
INSERT INTO TDATASEG (
DATAKEY
,PARTITIONKEY
,CATKEY
,PERIODKEY
,CURKEY
,DATAVIEW
,CALCACCTTYPE
,CHANGESIGN
,JOURNALID
,AMOUNT
,AMOUNTX
,AMOUNT_PTD
,AMOUNT_YTD
,DESC1
,DESC2
,ACCOUNT
,ACCOUNTX
,ACCOUNTR
,ACCOUNTF
,ENTITY
,ENTITYX
,ENTITYR
,ENTITYF
,ICP
,ICPX
,ICPR
,ICPF
,UD1
,UD1X
,UD1R
,UD1F
,UD2
,UD2X
,UD2R
,UD2F
,UD3
,UD3X
,UD3R
,UD3F
,UD4
,UD4X
,UD4R
,UD4F
,UD5
,UD5X
,UD5R
,UD5F
,UD6
,UD6X
,UD6R
,UD6F
,UD7
,UD7X
,UD7R
,UD7F
,UD8
,UD8X
,UD8R
,UD8F
,UD9
,UD9X
,UD9R
,UD9F
,UD10
,UD10X
,UD10R
,UD10F
,UD11
,UD11X
,UD11R
,UD11F
,UD12
,UD12X
,UD12R
,UD12F
,UD13
,UD13X
,UD13R
,UD13F
,UD14
,UD14X
,UD14R
,UD14F
,UD15
,UD15X
,UD15R
,UD15F
,UD16
,UD16X
,UD16R
,UD16F
,UD17
,UD17X
,UD17R
,UD17F
,UD18
,UD18X
,UD18R
,UD18F
,UD19
,UD19X
,UD19R
,UD19F
,UD20
,UD20X
,UD20R
,UD20F
,ATTR1
,ATTR2
,ATTR3
,ATTR4
,ATTR5
,ATTR6
,ATTR7
,ATTR8
,ATTR9
,ATTR10
,ATTR11
,ATTR12
,ATTR13
,ATTR14
,ARCHIVEID
,HASMEMOITEM
,STATICDATAKEY
,LOADID
,RULE_ID
,CODE_COMBINATION_ID
,STAT_BALANCE_FLAG
,VALID_FLAG
)
SELECT
DATAKEY
,PARTITIONKEY
,CATKEY
,PERIODKEY
,CURKEY
,DATAVIEW
,CALCACCTTYPE
,CHANGESIGN
,JOURNALID
,AMOUNT
,AMOUNTX
,AMOUNT_PTD
,AMOUNT_YTD
,DESC1
,DESC2
,ACCOUNT
,ACCOUNTX
,ACCOUNTR
,ACCOUNTF
,ENTITY
,ENTITYX
,ENTITYR
,ENTITYF
,ICP
,ICPX
,ICPR
,ICPF
,UD1
,UD1X
,UD1R
,UD1F
,UD2
,UD2X
,UD2R
,UD2F
,UD3
,UD3X
,UD3R
,UD3F
,UD4
,UD4X
,UD4R
,UD4F
,UD5
,UD5X
,UD5R
,UD5F
,UD6
,UD6X
,UD6R
,UD6F
,UD7
,UD7X
,UD7R
,UD7F
,UD8
,UD8X
,UD8R
,UD8F
,UD9
,UD9X
,UD9R
,UD9F
,UD10
,UD10X
,UD10R
,UD10F
,UD11
,UD11X
,UD11R
,UD11F
,UD12
,UD12X
,UD12R
,UD12F
,UD13
,UD13X
,UD13R
,UD13F
,UD14
,UD14X
,UD14R
,UD14F
,UD15
,UD15X
,UD15R
,UD15F
,UD16
,UD16X
,UD16R
,UD16F
,UD17
,UD17X
,UD17R
,UD17F
,UD18
,UD18X
,UD18R
,UD18F
,UD19
,UD19X
,UD19R
,UD19F
,UD20
,UD20X
,UD20R
,UD20F
,ATTR1
,ATTR2
,ATTR3
,ATTR4
,ATTR5
,ATTR6
,ATTR7
,ATTR8
,ATTR9
,ATTR10
,ATTR11
,ATTR12
,ATTR13
,ATTR14
,ARCHIVEID
,HASMEMOITEM
,STATICDATAKEY
,LOADID
,RULE_ID
,CODE_COMBINATION_ID
,STAT_BALANCE_FLAG
,VALID_FLAG
FROM TDATASEG_T
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:16,838 INFO [AIF]: Number of Rows inserted into TDATASEG: 10711
2013-12-31 22:45:16,858 DEBUG [AIF]:
DELETE FROM TDATASEG_T
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:17,123 INFO [AIF]: Number of Rows deleted from TDATASEG_T: 10711
2013-12-31 22:45:17,153 DEBUG [AIF]:
DELETE FROM TDATAMAP_T
WHERE LOADID = 300
2013-12-31 22:45:17,156 INFO [AIF]: Number of Rows deleted from TDATAMAP_T: 8
2013-12-31 22:45:17,161 INFO [AIF]: COMM Load TDATAMAPSEG/TDATASEG - Load TDATAMAPSEG/TDATASEG - END
2013-12-31 22:45:17,993 DEBUG [AIF]:
SELECT CASE app.METADATA_LOAD_METHOD
WHEN 'EPMA' THEN CASE dim.TARGET_DIMENSION_CLASS_NAME
WHEN 'Generic' THEN dim.TARGET_DIMENSION_NAME
ELSE dim.TARGET_DIMENSION_CLASS_NAME
END
ELSE dim.TARGET_DIMENSION_NAME
END TARGET_DIMENSION_NAME
,adim.BALANCE_COLUMN_NAME
FROM AIF_TARGET_APPLICATIONS app
,AIF_TARGET_APPL_DIMENSIONS adim
,AIF_DIMENSIONS dim
WHERE app.APPLICATION_ID = 26
AND adim.APPLICATION_ID = app.APPLICATION_ID
AND dim.DIMENSION_ID = adim.DIMENSION_ID
AND dim.TARGET_DIMENSION_CLASS_NAME IN ('Custom1','Custom2','Custom3','Custom4','Generic')
2013-12-31 22:45:17,997 DEBUG [AIF]:
SELECT SCENARIO "Scenario"
,YEAR "Year"
,PERIOD "Period"
,DATAVIEW "View"
,DATAVALUE "Value"
,ACCOUNT "Account"
,ENTITY "Entity"
,ICP "ICP"
,UD2 "Area"
,UD1 "Tail"
,UD3 "Special"
,UD4 "Facility"
,AMOUNT "DataValue"
FROM AIF_HS_BALANCES
WHERE LOADID = 300
2013-12-31 22:45:18,000 INFO [SimpleAsyncTaskExecutor-9]: ODI Hyperion Financial Management Adapter
2013-12-31 22:45:18,002 INFO [SimpleAsyncTaskExecutor-9]: Load task initialized.
2013-12-31 22:45:18,028 INFO [AIF]: LKM COMM Load Data into HFM - Load Data to HFM - START
2013-12-31 22:45:18,031 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_DETAILS (
PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,TARGET_TABLE_NAME
,EXECUTION_START_TIME
,EXECUTION_END_TIME
,RECORDS_PROCESSED
,STATUS
,LAST_UPDATED_BY
,LAST_UPDATE_DATE
) VALUES (
300
,'PROCESS_BAL_EXP_HFM'
,NULL
,'FMTEST2'
,NULL
,NULL
,CURRENT_TIMESTAMP
,NULL
,NULL
,'RUNNING'
,'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
,CURRENT_TIMESTAMP
)
2013-12-31 22:45:18,034 INFO [SimpleAsyncTaskExecutor-9]: Connecting to Financial Management application [FMTEST2] on [10.150.20.40] using user-name [admin].
2013-12-31 22:45:18,155 INFO [SimpleAsyncTaskExecutor-9]: Connected to Financial Management application.
2013-12-31 22:45:18,157 INFO [SimpleAsyncTaskExecutor-9]: HFM Version: 11.1.2.2.300.
2013-12-31 22:45:18,160 INFO [SimpleAsyncTaskExecutor-9]: Options for the Financial Management load task are:
<Options>
<Option name=LOG_FILE_NAME value=C:\Windows\TEMP\/aif_501_300.log/>
<Option name=IMPORT_MODE value=Replace/>
<Option name=CONSOLIDATE_ONLY value=false/>
<Option name=CONSOLIDATE_PARAMETERS value=""/>
<Option name=LOG_ENABLED value=true/>
<Option name=ACCUMULATE_WITHIN_FILE value=false/>
<Option name=DEBUG_ENABLED value=true/>
<Option name=CONSOLIDATE_AFTER_LOAD value=false/>
<Option name=FILE_CONTAINS_SHARE_DATA value=false/>
</Options>
2013-12-31 22:45:18,168 INFO [SimpleAsyncTaskExecutor-9]: Load Options validated.
2013-12-31 22:45:18,176 ERROR [SimpleAsyncTaskExecutor-9]: Error occurred during load process ORA-00904: "DATAVALUE": invalid identifier
com.hyperion.odi.common.ODIHAppException: ORA-00904: "DATAVALUE": invalid identifier
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:216)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)
at org.python.core.PyObject.__call__(PyObject.java:355)
at org.python.core.PyMethod.__call__(PyMethod.java:215)
at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)
at org.python.core.PyMethod.__call__(PyMethod.java:206)
at org.python.core.PyObject.__call__(PyObject.java:397)
at org.python.core.PyObject.__call__(PyObject.java:401)
at org.python.pycode._pyx161.f$0(<string>:98)
at org.python.pycode._pyx161.call_function(<string>)
at org.python.core.PyTableCode.call(PyTableCode.java:165)
at org.python.core.PyCode.call(PyCode.java:18)
at org.python.core.Py.runCode(Py.java:1204)
at org.python.core.Py.exec(Py.java:1248)
at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1889)
at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$2.doAction(StartScenRequestProcessor.java:580)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor.doProcessStartScenTask(StartScenRequestProcessor.java:513)
at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$StartScenTask.doExecute(StartScenRequestProcessor.java:1066)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:724)
Caused by: java.sql.SQLSyntaxErrorException: ORA-00904: "DATAVALUE": invalid identifier
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:462)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:931)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:481)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:205)
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:548)
at oracle.jdbc.driver.T4CStatement.doOall8(T4CStatement.java:202)
at oracle.jdbc.driver.T4CStatement.executeForDescribe(T4CStatement.java:942)
at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1283)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1441)
at oracle.jdbc.driver.OracleStatement.executeQuery(OracleStatement.java:1690)
at oracle.jdbc.driver.OracleStatementWrapper.executeQuery(OracleStatementWrapper.java:446)
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:212)
... 38 more
2013-12-31 22:45:18,208 DEBUG [AIF]:
UPDATE AIF_PROCESS_DETAILS
SET STATUS = 'FAILED'
,RECORDS_PROCESSED = CASE
WHEN RECORDS_PROCESSED IS NULL THEN 0
ELSE RECORDS_PROCESSED
END + 0
,EXECUTION_END_TIME = CURRENT_TIMESTAMP
,LAST_UPDATED_BY = CASE
WHEN ('native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' IS NULL) THEN LAST_UPDATED_BY
ELSE 'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
END
,LAST_UPDATE_DATE = CURRENT_TIMESTAMP
WHERE PROCESS_ID = 300
AND ENTITY_TYPE = 'PROCESS_BAL_EXP_HFM'
AND ENTITY_NAME = 'FMTEST2'
2013-12-31 22:45:18,210 FATAL [AIF]: Error in LKM COMM Load Data into HFM
The issue is that you are mapping "Data Value" to AMOUNT in the target application import format:
SELECT SCENARIO "Scenario"
,YEAR "Year"
,PERIOD "Period"
,DATAVIEW "View"
,DATAVALUE "Value"
,ACCOUNT "Account"
,ENTITY "Entity"
,ICP "ICP"
,UD2 "Area"
,UD1 "Tail"
,UD3 "Special"
,UD4 "Facility"
,AMOUNT "DataValue"
FROM AIF_HS_BALANCES
WHERE LOADID = 300
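The SELECT above references a DATAVALUE column that does not exist in AIF_HS_BALANCES, which is exactly what ORA-00904 reports. One plausible corrected form of the generated query (a sketch only: the alias names come from the import format quoted above, and it assumes the HFM Value member is supplied by the import format rather than by a table column) would simply drop the DATAVALUE reference:

```sql
-- Sketch: what the generated extract query could look like once the
-- import format no longer maps a nonexistent DATAVALUE column.
SELECT SCENARIO "Scenario"
      ,YEAR     "Year"
      ,PERIOD   "Period"
      ,DATAVIEW "View"
      ,ACCOUNT  "Account"
      ,ENTITY   "Entity"
      ,ICP      "ICP"
      ,UD2      "Area"
      ,UD1      "Tail"
      ,UD3      "Special"
      ,UD4      "Facility"
      ,AMOUNT   "DataValue"
FROM AIF_HS_BALANCES
WHERE LOADID = 300
```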
You need to map AMOUNT to "AMOUNT" in the HFM application. Check that the dimension mapping is correct for the class in the target application and that your import format points to the proper target dimension (Amount). -
Problem in Loading Data from SQL Server 2000 to Oracle 10g
Hi All,
I am a university student using ODI for my final project on real-time data warehousing. I am trying to load data from MS SQL Server 2000 into an Oracle 10g target table. Everything goes fine until I execute the interface for the initial load. When I choose the CKM Oracle (Create unique index on the I$ table) KM, the following step fails:
21 - Integration - Prj_Dim_RegionInterface - Create Unique Index on flow table
Where Prj_Dim_Region is the name of my target table in Oracle.
The error message is:
955 : 42000 : java.sql.SQLException: ORA-00955: name is already used by an existing object
java.sql.SQLException: ORA-00955: name is already used by an existing object
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:125)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:316)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:282)
at oracle.jdbc.driver.T4C8Oall.receive(T4C8Oall.java:639)
at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:185)
at oracle.jdbc.driver.T4CPreparedStatement.execute_for_rows(T4CPreparedStatement.java:633)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1086)
at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:2984)
at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:3057)
at com.sunopsis.sql.SnpsQuery.executeUpdate(SnpsQuery.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execStdOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlI.treatTaskTrt(SnpSessTaskSqlI.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
at com.sunopsis.dwg.cmd.e.i(e.java)
at com.sunopsis.dwg.cmd.g.y(g.java)
at com.sunopsis.dwg.cmd.e.run(e.java)
at java.lang.Thread.run(Unknown Source)
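ORA-00955 means the unique index name the CKM generates for the I$ flow table already belongs to another object in the schema, typically a leftover flow table or index from an earlier failed run. A quick way to see what owns the colliding name (a diagnostic sketch; the exact I$ prefix the KM generates varies, so adjust the pattern):

```sql
-- Diagnostic sketch: list objects whose names use ODI's flow-table
-- prefix, to find the leftover object that owns the index name.
-- The underscore is escaped so it is matched literally, not as a wildcard.
SELECT owner, object_name, object_type, created
FROM all_objects
WHERE object_name LIKE 'I$\_%' ESCAPE '\'
ORDER BY created DESC;
```

Dropping the stale I$ objects (or letting the KM's cleanup steps run) usually clears the collision.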
I am using a surrogate key column in my target table along with the natural key. The natural key is populated by the primary key of my source table, but for the surrogate key, I have created a sequence in my Oracle schema where the target table exists and have used the following code for the mapping:
<%=snpRef.getObjectName( "L" , "SQ_PRJ_DIM_REGION" , "D" )%>.nextval
I have chosen to execute this code on target.
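For reference, the setup this mapping assumes (a sketch; the sequence and schema names are the ones quoted in the post) is a plain Oracle sequence that the substitution tag resolves to a qualified name at runtime:

```sql
-- Sketch: the sequence the getObjectName() substitution points at.
-- ODI expands the tag so the mapping executed on the target becomes,
-- e.g., NOVELTYFURNITUREDW.SQ_PRJ_DIM_REGION.NEXTVAL
CREATE SEQUENCE SQ_PRJ_DIM_REGION
  START WITH 1
  INCREMENT BY 1
  NOCACHE;
```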
Among my attempts to solve this problem was to set the Create Index option of the CKM Oracle (Create Index for the I$ Table) to No so that it won't create any index on the flow table. I also tried the simple CKM Oracle KM. Both solutions allowed the interface to execute successfully without any errors, but the data was not loaded into the target table.
When I right-click on the Prj_Dim_Region data store and choose Data, it shows empty. Pressing the SQL button in this data store shows a "New Query" dialog box where I see this query:
select * from NOVELTYFURNITUREDW.PRJ_DIM_REGION
But when I press OK to run it, I get this error message:
java.lang.IllegalArgumentException: Row index out of range
at javax.swing.JTable.boundRow(Unknown Source)
at javax.swing.JTable.setRowSelectionInterval(Unknown Source)
at com.borland.dbswing.JdbTable.accessChange(JdbTable.java:2959)
at com.borland.dx.dataset.AccessEvent.dispatch(Unknown Source)
at com.borland.jb.util.EventMulticaster.dispatch(Unknown Source)
at com.borland.dx.dataset.DataSet.a(Unknown Source)
at com.borland.dx.dataset.DataSet.a(Unknown Source)
at com.borland.dx.dataset.DataSet.a(Unknown Source)
at com.borland.dx.dataset.DataSet.open(Unknown Source)
at com.borland.dx.dataset.StorageDataSet.refresh(Unknown Source)
at com.borland.dx.sql.dataset.QueryDataSet.refresh(Unknown Source)
at com.borland.dx.sql.dataset.QueryDataSet.executeQuery(Unknown Source)
at com.sunopsis.graphical.frame.a.cg.actionPerformed(cg.java)
at javax.swing.AbstractButton.fireActionPerformed(Unknown Source)
at javax.swing.AbstractButton$ForwardActionEvents.actionPerformed(Unknown Source)
at javax.swing.DefaultButtonModel.fireActionPerformed(Unknown Source)
at javax.swing.DefaultButtonModel.setPressed(Unknown Source)
at javax.swing.plaf.basic.BasicButtonListener.mouseReleased(Unknown Source)
at java.awt.AWTEventMulticaster.mouseReleased(Unknown Source)
at java.awt.Component.processMouseEvent(Unknown Source)
at java.awt.Component.processEvent(Unknown Source)
at java.awt.Container.processEvent(Unknown Source)
at java.awt.Component.dispatchEventImpl(Unknown Source)
at java.awt.Container.dispatchEventImpl(Unknown Source)
at java.awt.Component.dispatchEvent(Unknown Source)
at java.awt.LightweightDispatcher.retargetMouseEvent(Unknown Source)
at java.awt.LightweightDispatcher.processMouseEvent(Unknown Source)
at java.awt.LightweightDispatcher.dispatchEvent(Unknown Source)
at java.awt.Container.dispatchEventImpl(Unknown Source)
at java.awt.Window.dispatchEventImpl(Unknown Source)
at java.awt.Component.dispatchEvent(Unknown Source)
at java.awt.EventQueue.dispatchEvent(Unknown Source)
at java.awt.EventDispatchThread.pumpOneEventForHierarchy(Unknown Source)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.pumpEvents(Unknown Source)
at java.awt.EventDispatchThread.run(Unknown Source)
I do not understand what the problem is and have wasted days trying to figure it out. Any help will be highly appreciated as my deadline is too close for this project.
Thank you so much in advance.
Neel
Hi Cezar,
Can you please help me with this scenario?
I have one Oracle data model with 19 source tables and one SQL Server data model with 10 target tables. I have created 10 interfaces which use JournalizedDataOnly on one of the tables in each interface; e.g. in the interface for the DimCustomer target table, I have 2 tables, namely Customer and Address, but the journalizing filter appears only on the Customer table and this option is disabled for Address automatically.
Now I want to create a package using OdiWaitForLogData event detection. Is it possible to put all these 10 interfaces in just one package to populate the target tables? It works fine when I have only one interface and use the name of one table from the interface for the Table Name parameter of the OdiWaitForLogData event, but when I try a comma-separated list of table names [Customer, Address], this error happens:
java.sql.SQLException: ORA-00942: table or view does not exist
and if I use this method <%=odiRef.getObjectName("L","model_code","logical_schema","D")%>, I get this error
"-CDC_SET_NAME=Exception getObjectName("L", "model_code", "logical_schema", "D") : SnpLSchema.getLSchemaByName : SnpLschema does not exist" "
Please let me know how to make this work.
Do I need to create separate data models, each including only those tables which appear in the corresponding interface and package? Or do I need to create multiple packages, each with only one journalized interface, to populate only one target table?
Thank you for your time in advance.
Regards,
Neel -
Unable to load data to Hyperion planning application using odi
Hi All,
When I try to load data into Planning using ODI, the ODI process completes successfully, with the status shown below in the Operator (ReportStatistics), but the data doesn't appear in the Planning data form or in Essbase.
Can anyone please help?
org.apache.bsf.BSFException: exception from Jython:
Traceback (most recent call last):
File "<string>", line 2, in <module>
Planning Writer Load Summary:
Number of rows successfully processed: 20
Number of rows rejected: 0
The source is an Oracle database and the target is the Account dimension.
LKM SQL to SQL and IKM SQL to Hyperion Planning are used.
In the target, the following columns were mapped:
Account (load dimension)
Data Load Cube Name
Driver dimension metadata
Point-of-View
LOG FILE
2012-08-27 09:46:43,214 INFO [SimpleAsyncTaskExecutor-3]: Oracle Data Integrator Adapter for Hyperion Planning
2012-08-27 09:46:43,214 INFO [SimpleAsyncTaskExecutor-3]: Connecting to planning application [OPAPP] on [mcg-b055]:[11333] using username [admin].
2012-08-27 09:46:43,277 INFO [SimpleAsyncTaskExecutor-3]: Successfully connected to the planning application.
2012-08-27 09:46:43,277 INFO [SimpleAsyncTaskExecutor-3]: The load options for the planning load are
Dimension Name: Account Sort Parent Child : false
Load Order By Input : false
Refresh Database : false
2012-08-27 09:46:43,339 INFO [SimpleAsyncTaskExecutor-3]: Begining the load process.
2012-08-27 09:46:43,355 DEBUG [SimpleAsyncTaskExecutor-3]: Number of columns in the source result set does not match the number of planning target columns.
2012-08-27 09:46:43,371 INFO [SimpleAsyncTaskExecutor-3]: Load type is [Load dimension member].
2012-08-27 09:46:43,996 INFO [SimpleAsyncTaskExecutor-3]: Load process completed.
Do any members exist in the Account dimension before the load? If not, can you try adding one member manually and then running the load again?
Cheers
John
http://john-goodwin.blogspot.com/ -
Unable to load data through outline load utility
I am unable to load data using the Outline Load utility.
Assigned data load dimension: Account
Driver dimension: Period
Driver member: Jan
Load file:
Account,Jan,Point-of-View,Data Load Cube Name
Investment,1234,"India,Current,Drat,FY13",Plan1
Command for the Outline Load utility:
OutlineLoad /A:pract /U:admin /I:C:\test1.csv /D:Account /L:C:\lg.log /X:C:\ex.exc
Exception file
[Thu Mar 28 18:05:39 GMT+05:30 2013] Error occurred loading data record 1: Investments,1234,"""India,Current,Draft,FY14""",Plan1
[Thu Mar 28 18:05:39 GMT+05:30 2013] com.hyperion.planning.InvalidMemberException: The member India,Current,Draft,FY14 does not exist or you do not have access to it.
[Thu Mar 28 18:05:40 GMT+05:30 2013]Planning Outline data store load process finished with exceptions: exceptions occured, examine the exception file for more information. 1 data record was read, 1 data record was processed, 0 were successfully loaded, 1 was rejected.
Logfile:
Successfully logged into "pract" application, Release 11.113, Adapter Interface Version 5, Workforce supported and not enabled, CapEx not supported and not enabled, CSS Version 3
[Thu Mar 28 18:05:38 GMT+05:30 2013]Successfully located and opened input file "C:\load.csv".
[Thu Mar 28 18:05:38 GMT+05:30 2013]Header record fields: Account, Jan, Point-of-View, Data Load Cube Name
[Thu Mar 28 18:05:38 GMT+05:30 2013]Located and using "Account" dimension for loading data in "pract" application.
[Thu Mar 28 18:05:40 GMT+05:30 2013]Load dimension "Account" has been unlocked successfully.
[Thu Mar 28 18:05:40 GMT+05:30 2013]A cube refresh operation will not be performed.
[Thu Mar 28 18:05:40 GMT+05:30 2013]Create security filters operation will not be performed.
[Thu Mar 28 18:05:40 GMT+05:30 2013]Examine the Essbase log files for status if Essbase data was loaded.
[Thu Mar 28 18:05:40 GMT+05:30 2013]Planning Outline data store load process finished with exceptions: exceptions occured, examine the exception file for more information. 1 data record was read, 1 data record was processed, 0 were successfully loaded, 1 was rejected.
In fact, those members exist in the outline.
Any help would be appreciated.
Check for the double quotes (the ones John mentioned). Also, I can see that you are taking the Data Load dimension approach. Are the Data Load dimension and driver members defined correctly in Planning?
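On the double-quote point: a Point-of-View cell that contains commas needs exactly one pair of enclosing quotes in the CSV. The exception file shows the field arriving triple-quoted ("""India,Current,Draft,FY14"""), so Planning treats the quotes as part of the member name. A corrected sketch of the load file (member names taken from the rejected record in the exception file; note they differ from the Drat/FY13 members in the file posted above, which suggests the file actually run is not the one shown):

```csv
Account,Jan,Point-of-View,Data Load Cube Name
Investment,1234,"India,Current,Draft,FY14",Plan1
```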
Regards
Celvin
http://www.orahyplabs.com
Please mark the answers as helpful/correct if applicable -
Ever since I upgraded to iTunes 10.4 I've been getting this dreaded message on many occasions when I try to sync my iPhone 4 or iPad 2 with my Win 7 64 bit machine. "iTunes was unable to load data class information from Sync Services. Reconnect or try again later." What happens is that local content (music, videos etc) will sync properly to my iPhone, but other content (such as Outlook information, MobileMe stuff, etc) will not.
I have uninstalled and completely purged all Apple data from my PC (including hidden files and folders under Common Files and in the Registry) and reinstalled iTunes. Yet after one or two syncs, the same problem resurfaces. The other weird part is that the Sync Services crap-out message will happen after I do a successful sync, leave the iPhone connected to the PC, and don't even touch the computer for several hours.
I've actually developed a very tedious workaround that seems to restore syncing, if only for a short time.
- Undock/disconnect all Apple devices from the PC.
- Close iTunes, MobileMe control panel, and Safari (if you have it).
- Start Task Manager (Ctrl + Alt + Del) and shut down iTunesHelper.exe and SyncServer.exe
- Open a Windows Explorer window (like My Computer) and, under Tools > Folder Options > View, toggle on Show Hidden Files and toggle off Hide Protected Operating System Files.
- In Windows Explorer, navigate to "C:\Users\<your name>\AppData\Roaming\Apple Computer". Rename the Sync Services folder to something else, like Sync Services_Old.
- In Windows Explorer, navigate to "C:\Program Files (x86)\Common Files\Apple\Mobile Device Support" and double-click AppleSyncNotifier.exe.
- Go back to Folder Options, turn off Show Hidden Files, and toggle on Hide Protected Operating System Files.
Now you can start up iTunes again and connect your device. It should sync properly again (at least, until it doesn't once more).
Does anyone at Apple have any idea about this error or a solution?
I actually spent a fair amount of time on the phone with a senior Apple tech last week. He directed me to this topic:
http://support.apple.com/kb/HT1923?viewlocale=en_US
It's important that you go through the steps EXACTLY as described here and in the proper order. Also make sure MobileMe control panel is uninstalled (if you have it).
Interestingly, when I went through this procedure and then reinstalled iTunes 10.4 64-bit (I didn't do MobileMe or Safari at this stage, but QuickTime is installed automatically), everything worked perfectly. The aforementioned error messages disappeared and all is working flawlessly, as it should.
I hope my experience will help! Give it a try. -
The Game Center is "unable to load data due to network connectivity issues or errors".
My internet is fine: all other apps that do not rely on the Game Center work, as does my browser and other online functions; additionally, a friend's iPhone connects to his Game Center perfectly while he is sitting next to me.
Closing the Game Center app and then reopening it is ineffective. Opening the options for the Game Center in the settings, logging out of my Apple ID, then logging back in, and then reopening the Game Center is ineffective. Opening the options for the Game Center in the settings, logging out of my Apple ID, then reopening the Game Center app and logging back in from there is ineffective.
This issue has gone on for more than a month, and so the phone has gone through all variations of "turn it off and back on again, refresh the system, etc." that are the basic first suggestions for troubleshooting.
One game app, which required a Game Center profile to launch, stopped working entirely, and would return a message saying something like "connect to the Game Center". I eventually deleted it to free up space. Other games which can be linked to the Game Center to connect with friends simply fail to link, returning "Game Center unavailable". This is the aspect that is currently bothering me the most: I cannot play games with friends unless the app can also link to Facebook and connect to friends from there.
When I open the Game Center, I see the following screen. The bubbles float gently, and it seems as if I have the ability to update my "status". However, anything I post has vanished once the app is reopened.
Upon tapping Games, either the bubble from the main screen or the option on the ribbon at the bottom, the following screen appears. Clicking on Challenges shows a message that achievements must be viewed to issue a challenge. Clicking achievements results in a message that games must be connected (and from my friend's phone I know I have several apps that should appear).
Upon tapping friends, it does show me a list of suggested friends pulled from Facebook, but attempting to send a friend request results in the following screen. After accepting the error box, the "send" button for the request is grayed out and unavailable regardless of adding or deleting friends and/or an accompanying message.
In short, the Game Center is completely unresponsive for no discernible reason, and this problem began with no warning and is now long term. Though my iPhone is older (a 4s), my friend's is even older (a regular 4) and works fine.
I am having the very same problem on my iPad running iOS 7. All is fine on my iPhone, which is running iOS 8: it opens with no problem. I wish they would fix it.
-
Hi All,
I am getting this error message while loading data in BW. The error message is below:
Data not received in PSA Table.
Diagnosis
Data has not been updated in PSA Table . The request is probably still running or there was a short dump.
Procedure
In the short dump overview in BW, look for the short dump that belongs to your data request. Make sure the correct date and time are specified in the selection screen.
You can use the wizard to get to the short dump list, or follow the menu path "Environment -> Short dump -> In Data Warehouse".
Removing errors
Follow the instructions in the short dump.
When I checked the short dump it is as below.
ShrtText
An SQL error occurred when accessing a table.
What can you do?
Make a note of the actions and input which caused the error.
To resolve the problem, contact your SAP system administrator.
You can use transaction ST22 (ABAP Dump Analysis) to view and administer termination messages, especially those beyond their normal deletion date.
How to correct the error
Database error text........: "ORA-14400: inserted partition key does not map to any partition"
Internal call code.........: "[RSQL/INSR//BIC/B0000541000 ]"
Please check the entries in the system log (Transaction SM21).
You may be able to find an interim solution to the problem in the SAP Notes system. If you have access to the note system yourself, use the following search criteria:
"DBIF_RSQL_SQL_ERROR" CX_SY_OPEN_SQL_DBC
"GP12DW003276UZE34XM7O2QXLER" or "GP12DW003276UZE34XM7O2QXLER"
"INSERT_ODS"
Can someone suggest what I should check?
Thanks in advance.
Thanks a lot Manga. I did what you told me, and it gave me this error message:
Inconsistency: High value for table /BIC/B0000541000: 0002 ; PARTNO value in RSTSODS: 0022
So you mean that once I click the Correct Error button, it should be good, and if I do the loading after that, it should not give me this problem. Is that right?
But can you tell me what exactly that error message means? Also, is this not a tablespace issue? I have to wait to correct that error, as the DBA is currently looking into it.
Assigned points. Will assign more points if the issue resolves.
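On what the message means: ORA-14400 says the partition key being inserted (the PARTNO value 0022 that RSTSODS expects) is higher than any existing partition's high value (0002) on the PSA table, so the insert has no partition to land in. It is a partitioning mismatch, not a tablespace issue. The partition high values can be checked directly (a diagnostic sketch; the table name is quoted because of the /BIC/ prefix, and USER_TAB_PARTITIONS works too if you are connected as the schema owner):

```sql
-- Diagnostic sketch: compare the PSA table's partition high values with
-- the PARTNO value RSTSODS expects (0022 vs. high value 0002 in the dump).
SELECT partition_name, high_value
FROM dba_tab_partitions
WHERE table_name = '/BIC/B0000541000'
ORDER BY partition_position;
```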