Error while loading InfoCube
Hi all,
I am getting an error while loading from ODS to InfoCube. I have extracted data from the CRM system to the ODS. When I load from the ODS to the InfoCube, I get the error: "Value 00000000 of characteristic 0DATE is not plausible".
Regards,
kals
You can use a conversion routine for your field 0pstng_date.
A conversion routine can solve your problem. Try the RSDAT conversion routine.
If you have a variable for 0DATE, note that its format is YYYYMMDD,
so give 99991231 instead of 99993112.
Edited by: Prasad B on Aug 21, 2008 8:32 PM
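The replies above can be illustrated with a minimal sketch (Python here, not SAP's actual RSDAT routine; the function names are my own): an initial value like 00000000 fails a YYYYMMDD plausibility check and can be replaced with a valid default such as 99991231 before the load.

```python
from datetime import datetime

def plausible_yyyymmdd(value: str) -> bool:
    """Return True if value is a valid calendar date in YYYYMMDD format."""
    try:
        datetime.strptime(value, "%Y%m%d")
        return True
    except ValueError:
        return False

def normalize_date(value: str, default: str = "99991231") -> str:
    """Replace an initial/implausible date (e.g. 00000000) with a default."""
    return value if plausible_yyyymmdd(value) else default
```

The same check also rejects day/month swaps like 99993112, which is why the reply recommends 99991231.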
Similar Messages
-
Error while loading a Cube in BW - "Waiting for Semaphore"
Hello Experts:
I am having a frequent problem while doing a Delta load from a BW (3.5) cube to an identical cube in BI 7.0.
The load is done daily via a Process Chain. Before the Delta load, I also have a step to delete the index from the
cube in BI. The Delta loads have been successful, but frequently they take about 7 or 8 hours. The number of
records is not large, in the 80,000 to 140,000 range, and loads of a similar size have completed within
7 or 8 minutes. However, when a load gets hung up it takes 7 or 8 hours.
It looks like a lock is happening in the 3.5 BW system. Transaction SM12 shows that it is locked, and transaction SM66
shows that it is hung waiting for resources; the message indicated is "Waiting for Semaphore". After about 7 hours, the load
completes automatically.
This is causing issues because the upward data loads don't happen and users cannot access the data until late afternoon
for their planning in BPC.
Is there a way this "Waiting for Semaphore" error can be avoided, or the time reduced? How can I achieve this? Any
fine-tuning tricks anyone can suggest, please!
Thanks a lot in advance.
Thanks to both of you, appreciate the feedback. I guess I will have to wait for the situation to arise again, as today's delta went normally. I will see if it happens again. I am not sure whether the semaphore number is the same as the "Work Process Number" in SM66 (which was 3 in my case) or the Process ID (which is 22872082 in my case).
If I go to SM50, I should find the Process ID there.
Would I be searching for the OSS notes corresponding to the Process ID (in this case 22872082), or to the semaphore number (if it is different; as I mentioned, 3 in my case)?
Could you please clarify which one is the semaphore number, and whether it is the same as the Process ID I am talking about, so that I can search for the right OSS notes.
Also, finding all this will only tell me why it hung; is there a way to make the load kick off and not get stuck, or do I have to wait it out until the resources get released? Which in my case would still be a wait of 6 or 7 hours!
Appreciate your feedback.
Best, Lee -
Error while loading Purchasing Cube.
Hi All,
I am loading data into the Purchasing Group cube 0PUR_C04 with the DataSource 2LIS_02_SCL. The load fails saying "The material MB0214363L does not exist or is not activated". When I checked the same material in R/3 I could see it is a deleted material. I am unable to load the data in BI even when I select "Update valid records" in Error Handling. Please guide me; it's very urgent. I will assign points.
Thanks and Regards,
Sangini.
Select the option to load even if master data does not exist in the InfoPackage
OR
Load the master data for material and activate it
Now reload the transaction data. -
Hi
I am loading data from R/3 to a DSO and then to a cube. The data loaded fine up to the DSO. When I try to load data from the DSO to the InfoCube, it gives a short dump.
The error is:
"You attempted to assign a field to a typed field symbol, but the field does not have the required type. Adapt the type of the field symbol to the type of the field, use an untyped field symbol, or use the CASTING addition."
Please help.
Regards
sanju.
Are you using a routine in the transformation?
It seems you have used a field symbol there and not defined it correctly.
Can you paste the routine here?
Regards,
MD -
Error while loading data on to the cube : Incompatible rule file.
Hi,
I am trying to load data onto an Essbase cube from a data file. I already have a rule file on the cube, and I am getting the following error while loading the data. Is there a problem with the rule file?
SEVERE: Cannot begin data load. Essbase Error(1019058): Incompatible rule file. Duplicate member name rule file is used against unique name database.
com.essbase.api.base.EssException: Cannot begin data load. Essbase Error(1019058): Incompatible rule file. Duplicate member name rule file is used against unique name database.
at com.essbase.server.framework.EssOrbPluginDirect.ex_olap(Unknown Source)
at com.essbase.server.framework.EssOrbPluginDirect.essMainBeginDataload(Unknown Source)
at com.essbase.api.session.EssOrbPlugin._invokeMainMethod(Unknown Source)
at com.essbase.api.session.EssOrbPlugin._invokeMethod2(Unknown Source)
at com.essbase.api.session.EssOrbPlugin._invokeMethod(Unknown Source)
at com.essbase.server.framework.EssOrbPluginDirect._invokeProtected(Unknown Source)
at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
at com.essbase.api.session.EssOrbPlugin.essMainBeginDataload(Unknown Source)
at com.essbase.api.datasource.EssCube.beginDataload(Unknown Source)
at grid.BudgetDataLoad.main(BudgetDataLoad.java:85)
Error: Cannot begin data load. Essbase Error(1019058): Incompatible rule file. Duplicate member name rule file is used against unique name database.
Feb 7, 2012 3:13:37 PM com.hyperion.dsf.server.framework.BaseLogger writeException
SEVERE: Cannot Load buffer term. Essbase Error(1270040): Data load buffer [3] does not exist
com.essbase.api.base.EssException: Cannot Load buffer term. Essbase Error(1270040): Data load buffer [3] does not exist
at com.essbase.server.framework.EssOrbPluginDirect.ex_olap(Unknown Source)
at com.essbase.server.framework.EssOrbPluginDirect.essMainLoadBufferTerm(Unknown Source)
at com.essbase.api.session.EssOrbPlugin._invokeMainMethod(Unknown Source)
at com.essbase.api.session.EssOrbPlugin._invokeMethod2(Unknown Source)
at com.essbase.api.session.EssOrbPlugin._invokeMethod(Unknown Source)
at com.essbase.server.framework.EssOrbPluginDirect._invokeProtected(Unknown Source)
at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
at com.essbase.api.session.EssOrbPluginEmbedded.invokeMethod(Unknown Source)
at com.essbase.api.session.EssOrbPlugin.essMainLoadBufferTerm(Unknown Source)
at com.essbase.api.datasource.EssCube.loadBufferTerm(Unknown Source)
at grid.BudgetDataLoad.main(BudgetDataLoad.java:114)
Error: Cannot Load buffer term. Essbase Error(1270040): Data load buffer [3] does not exist
Thanks,
Santhosh
"Incompatible rule file. Duplicate member name rule file is used against unique name database."
I am just guessing here, as I have never used the duplicate member name functionality in Essbase, nor do I remember which versions it appeared in. With that said, I think your answer is in the error message.
Guessing again: it appears that your rule file is set to allow duplicate member names while your database is not. With that information in hand (given to you in the error message), I would start exploring there.
Error while loading data from DS into cube
Hello All
I am getting the below error while loading data from Data source to the cube.
Record 1: Time conversion from 0CALDAY to 0FISCPER (fiscal year) failed with value 20070331
What am I supposed to do?
I need your input in this regard.
Regards
Rohit
Hi,
Simply map 0CALDAY to 0FISCPER (you have already done that). You might have forgotten to map the fiscal year variant in the update rules. If it comes from the source, just map it; if it does not come from the source, set it as a constant in the update rules and give it a value:
If your business year runs April to March, make it 'V3'.
If your business year runs January to December, make it 'K4'.
Activate your update rules, delete the old data, and upload again.
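The V3/K4 logic above can be sketched as follows (a Python illustration with a hypothetical function name, not SAP's actual conversion; it assumes the fiscal year is labeled by the calendar year in which it begins, while the real labeling depends on the variant's year-shift configuration):

```python
def calday_to_fiscper(calday: str, variant: str) -> str:
    """Map a YYYYMMDD calendar day to a YYYYPPP fiscal year/period.

    'K4': calendar fiscal year, Jan-Dec, so period = calendar month.
    'V3': fiscal year April-March, so April = period 001 and
    Jan-Mar fall into the previous fiscal year.
    """
    year, month = int(calday[:4]), int(calday[4:6])
    if variant == "K4":
        fisc_year, period = year, month
    elif variant == "V3":
        if month >= 4:
            fisc_year, period = year, month - 3
        else:
            fisc_year, period = year - 1, month + 9
    else:
        raise ValueError(f"unknown fiscal year variant {variant}")
    return f"{fisc_year:04d}{period:03d}"
```

With this convention, the failing value 20070331 maps cleanly under K4 but lands in the prior fiscal year under V3, which is why a missing variant mapping makes the conversion fail.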
Hope it helps
Thanks
Teja -
Error while loading data into cube 0calday to 0fiscper (2lis_13_vdcon)
Hi all,
I am getting the following error while loading the data into the cube:
"Time conversion from 0CALDAY to 0FISCPER (fiscal year V3) failed with value 10081031"
amit shetye
Hi Amit,
This is a conversion problem: the calendar is not maintained for fiscal year variant "V3" for the year 1008.
Maintain the calendar for year 1008 and transfer the global settings from the source (R/3):
RSA1 > Source Systems > context menu > Transfer Global Settings > choose fiscal year variants and calendar > execute
Hope it Helps
Srini -
Error while loading Transactional data from NW BW Infoprovider
Hi,
I am trying to load transactional data using the delivered "Transactional data from NW BW Infoprovider" package and am getting the error message "Error occurs when loading transaction data from other cube".
Below is the transformation file content
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = TAB
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=NO
CREDITPOSITIVE=YES
MAXREJECTCOUNT= 9999999999999
ROUNDAMOUNT=
*MAPPING
ACCOUNT = 0ACCOUNT
BUSINESSTYPE = *NEWCOL(NOBTYPE)
BWSRC = *NEWCOL(R3)
COMPANYCODE = 0COMP_CODE
DATASRC = *NEWCOL(R3)
FUNCTIONALAREA = *IF(0FUNC_AREA = *STR() then *STR(NOFA);0FUNC_AREA)
GAME = *NEWCOL(NOGAME)
INPUTCURRENCY = *IF(0CURRENCY = *STR() then *STR(NOCURR) ; 0CURRENCY)
PROBABILITY = *NEWCOL(NOPROB)
PRODUCT = *NEWCOL(NOPROD)
PROFITCENTER = 0PROFIT_CTR
PROJECT = *NEWCOL(NOPROJECT)
TIME = 0FISCPER
VERSION = *NEWCOL(REV0)
WEEKS = *NEWCOL(NOWEEK)
SIGNEDDATA= 0AMOUNT
*CONVERSION
Below is the Error Log
/CPMB/MODIFY completed in 0 seconds
/CPMB/INFOPROVIDER_CONVERT completed in 0 seconds
/CPMB/CLEAR completed in 0 seconds
[Selection]
InforProvide=ZPCA_C01
TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\R3_TD_P&L.xls
CLEARDATA= Yes
RUNLOGIC= No
CHECKLCK= Yes
[Messages]
Task name CONVERT:
No 1 Round:
Error occurs when loading transaction data from other cube
Application: ACTUALREPORTING Package status: ERROR
Has anyone had this issue before, or is there something that needs to be looked at in the BW-delivered task?
Any input will be appreciated.
Sanjay
Hello Gurus,
Currently I am getting the below error while loading cost center master data from BW to BPC.
Task name MASTER DATA SOURCE:
Record count: 189
Task name TEXT SOURCE:
Record count: 189
Task name CONVERT:
No 1 Round:
Info provider is not available
Application: ZRB_SALES_CMB Package status: ERROR
Can anybody tell me if I have missed anything?
Regards,
BI NEW
Edited by: BI NEW on Feb 23, 2011 12:25 PM -
Error while loading the data from PSA to Data Target
Hi to all,
I'm facing some errors while loading the data to the data target.
Error : Record 1 :Value 'Kuldeep Puri Milan Joshi ' (hex. '004B0075006C0064006500650070002000500075007200690
Details:
Requests (messages): Everything OK
Extraction (messages): Everything OK
Transfer (IDocs and TRFC): Errors occurred
Request IDoc : Application document posted
Info IDoc 2 : Application document posted
Info IDoc 1 : Application document posted
Info IDoc 4 : Application document posted
Info IDoc 3 : Application document posted
Data Package 1 : arrived in BW ; Processing : Data records for package 1 selected in PSA - 1 er
Processing (data packet): Errors occurred
Update PSA ( 2462 Records posted ) : No errors
Transfer Rules ( 2462 -> 2462 Records ) : No errors
Update rules ( 2462 -> 2462 Records ) : No errors
Update ( 0 new / 0 changed ) : Errors occurred
Processing end : Errors occurred
I'm totally new to this issue. Please help me solve this error.
Regards,
Saran
Hi,
I think you are facing an invalid character issue.
This issue can be resolved by correcting the error records in the PSA and updating them into the target. The first step is to confirm that all the records are in the PSA; you can check this in the Details tab of RSMO, in the job log, or in the PSA by sorting the records based on status. Once that is confirmed, force the request to red and delete the particular request from the target cube. Then go to the PSA and edit the incorrect records (correcting or blanking out the invalid entries in the particular field/InfoObject of each incorrect record) and save. Once all the incorrect records are edited, go to RSA1 > PSA, find the particular request, and update it to the target manually (right click on the PSA request > Start update immediately).
I will add the step-by-step procedure to edit PSA data and update it into the target (request based).
In your case the error message says: Error: Record 1: Value 'Kuldeep Puri Milan Joshi'. You just need to convert this to capital letters in the PSA and reload.
Edit the field to KULDEEP PURI MILAN JOSHI in the PSA and push it to the target.
Identifying incorrect records.
The system won't show all the incorrect records the first time; you need to search the PSA table manually to find them all.
1. First see RSMO > Details > expand the update rules / processing tabs, and you will find some of the error records.
2. Then go to the PSA and filter using the status of the records. Filter all the red requests. This may still not show all the incorrect records.
3. Then go to the PSA and filter the incorrect records based on the particular field.
4. If this also doesn't work, go to the PSA and sort (not filter) the records based on the particular field with incorrect values, and it will show all the records. Note down the record numbers and then edit them one by one.
If you want to confirm, find the PSA table and search manually.
Also run the report RS_ERRORLOG_EXAMPLE; with this report you can display all the incorrect records and also find whether the error occurred in the PSA or in the transfer rules.
Steps to resolve this
1. Force the request to red in RSMO > Status tab.
2. Delete the request from target.
3. Go to RSMO > at the top right you can see the PSA maintenance button > click it and go to the PSA.
4. Edit the record.
5. Save PSA data.
6. Go to RSA1 > search by request name > right click > update the request from PSA to target.
Refer how to Modify PSA Data
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/40890eda-1b99-2a10-2d8b-a18b9108fc38
This should solve your problem for now.
As a long-term fix, you can apply a user exit on the source system side, change your update rules to ensure this field is blanked out before being loaded into the cube, or add that particular character to the permitted character list in BW.
RSKC --> type ALL_CAPITAL --> F8 (Execute)
OR
Go to SE38 and execute the program RSKC_ALLOWED_CHAR_MAINTAIN and give ALL_CAPITAL or the char you want to add.
Check the table RSALLOWEDCHAR. It should contain ALL_CAPITAL or the char you have entered.
Refer
/people/sap.user72/blog/2006/07/23/invalid-characters-in-sap-bw-3x-myths-and-reality-part-2
/people/sap.user72/blog/2006/07/08/invalid-characters-in-sap-bw-3x-myths-and-reality-part-1
/people/aaron.wang3/blog/2007/09/03/steps-of-including-one-special-characters-into-permitted-ones-in-bi
http://help.sap.com/saphelp_nw04/helpdata/en/64/e90da7a60f11d2a97100a0c9449261/frameset.htm
For adding Other characters
OSS note #173241 Allowed characters in the BW System
Thanks,
JituK
Edited by: Jitu Krishna on Mar 22, 2008 1:52 PM -
Error while creating info object
I get an error while creating a character-type InfoObject: "InfoObject may be a maximum of 000060 digits". What would cause this error?
Thanks all
Hi,
The maximum length supported by a CHAR-type InfoObject is 60. Check whether you have assigned more.
If you have to load a longer value (more than 60 characters), split it across several InfoObjects: the first 60 characters in the first InfoObject, characters 61-120 in the second, and so on. You can then concatenate them again when needed.
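The split-and-concatenate approach can be sketched as follows (a Python illustration; the function names are my own):

```python
def split_for_infoobjects(text: str, width: int = 60) -> list:
    """Split a long value into 60-character pieces, one per InfoObject."""
    return [text[i:i + width] for i in range(0, len(text), width)]

def recombine(pieces: list) -> str:
    """Concatenate the pieces back into the original value."""
    return "".join(pieces)
```

A 130-character value, for example, becomes three pieces of lengths 60, 60, and 10, and recombining them restores the original string.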
Thanks,
Indrashis -
Getting error While loading data from ERP integrator to HFM
Hello,
We are getting the following error while loading the data from ERPI to HFM.
2013-12-31 22:44:54,133 INFO [AIF]: ERPI Process Start, Process ID: 300
2013-12-31 22:44:54,137 INFO [AIF]: ERPI Logging Level: DEBUG (5)
2013-12-31 22:44:54,139 INFO [AIF]: ERPI Log File: C:\Windows\TEMP\/aif_501_300.log
2013-12-31 22:44:54,141 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Java HotSpot(TM) 64-Bit Server VM (Oracle Corporation)]
2013-12-31 22:44:54,143 INFO [AIF]: Java Platform: java1.7.0_25
2013-12-31 22:44:56,821 INFO [AIF]: COMM Process Periods - Insert Periods - START
2013-12-31 22:44:56,828 DEBUG [AIF]:
SELECT l.SOURCE_LEDGER_ID
,l.SOURCE_LEDGER_NAME
,l.SOURCE_COA_ID
,l.CALENDAR_ID
,'0' SETID
,l.PERIOD_TYPE
,NULL LEDGER_TABLE_NAME
FROM AIF_BALANCE_RULES br
,AIF_COA_LEDGERS l
WHERE br.RULE_ID = 27
AND l.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = br.SOURCE_LEDGER_ID
2013-12-31 22:44:56,834 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_PERIODS (
PROCESS_ID
,PERIODKEY
,PERIOD_ID
,ADJUSTMENT_PERIOD_FLAG
,GL_PERIOD_YEAR
,GL_PERIOD_NUM
,GL_PERIOD_NAME
,GL_PERIOD_CODE
,GL_EFFECTIVE_PERIOD_NUM
,YEARTARGET
,PERIODTARGET
,IMP_ENTITY_TYPE
,IMP_ENTITY_ID
,IMP_ENTITY_NAME
,TRANS_ENTITY_TYPE
,TRANS_ENTITY_ID
,TRANS_ENTITY_NAME
,PRIOR_PERIOD_FLAG
,SOURCE_LEDGER_ID
SELECT DISTINCT brl.LOADID PROCESS_ID
,pp.PERIODKEY PERIODKEY
,prd.PERIOD_ID
,COALESCE(prd.ADJUSTMENT_PERIOD_FLAG, 'N') ADJUSTMENT_PERIOD_FLAG
,COALESCE(prd.YEAR,0) GL_PERIOD_YEAR
,COALESCE(prd.PERIOD_NUM,0) GL_PERIOD_NUM
,prd.PERIOD_NAME GL_PERIOD_NAME
,COALESCE(prd.PERIOD_CODE, CAST(COALESCE(prd.PERIOD_NUM,0) AS VARCHAR(38))) GL_PERIOD_CODE
,(COALESCE(prd.YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM,0)) GL_EFFECTIVE_PERIOD_NUM
,COALESCE(ppa.YEARTARGET, pp.YEARTARGET) YEARTARGET
,COALESCE(ppa.PERIODTARGET, pp.PERIODTARGET) PERIODTARGET
,'PROCESS_BAL_IMP' IMP_ENTITY_TYPE
,(COALESCE(prd.YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM,0)) IMP_ENTITY_ID
,prd.PERIOD_NAME IMP_ENTITY_NAME
,'PROCESS_BAL_TRANS' TRANS_ENTITY_TYPE
,(COALESCE(prd.YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM,0)) TRANS_ENTITY_ID
,pp.PERIODDESC TRANS_ENTITY_NAME
,'N' PRIOR_PERIOD_FLAG
,1 SOURCE_LEDGER_ID
FROM (
AIF_BAL_RULE_LOADS brl
INNER JOIN TPOVCATEGORY pc
ON pc.CATKEY = brl.CATKEY
INNER JOIN TPOVPERIOD_FLAT_V pp
ON pp.PERIODFREQ = pc.CATFREQ
AND pp.PERIODKEY >= brl.START_PERIODKEY
AND pp.PERIODKEY <= brl.END_PERIODKEY
LEFT OUTER JOIN TPOVPERIODADAPTOR_FLAT_V ppa
ON ppa.PERIODKEY = pp.PERIODKEY
AND ppa.PERIODFREQ = pp.PERIODFREQ
AND ppa.INTSYSTEMKEY = 'FMTEST2'
INNER JOIN AIF_GL_PERIODS_STG prd
ON prd.SOURCE_SYSTEM_ID = 3
AND prd.CALENDAR_ID IN ('10000')
AND prd.SETID = '0'
AND prd.PERIOD_TYPE = 'Month'
AND prd.ADJUSTMENT_PERIOD_FLAG = 'N'
AND prd.START_DATE > pp.PRIORPERIODKEY
AND prd.START_DATE <= pp.PERIODKEY
WHERE brl.LOADID = 300
ORDER BY pp.PERIODKEY
,GL_EFFECTIVE_PERIOD_NUM
2013-12-31 22:44:56,915 INFO [AIF]: COMM Process Periods - Insert Periods - END
2013-12-31 22:44:56,945 INFO [AIF]: COMM Process Periods - Insert Process Details - START
2013-12-31 22:44:56,952 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_DETAILS (
PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,TARGET_TABLE_NAME
,EXECUTION_START_TIME
,EXECUTION_END_TIME
,RECORDS_PROCESSED
,STATUS
,LAST_UPDATED_BY
,LAST_UPDATE_DATE
SELECT PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,'AIF_EBS_GL_BALANCES_STG' TARGET_TABLE_NAME
,CURRENT_TIMESTAMP EXECUTION_START_TIME
,NULL EXECUTION_END_TIME
,0 RECORDS_PROCESSED
,'PENDING' STATUS
,'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' LAST_UPDATED_BY
,CURRENT_TIMESTAMP LAST_UPDATE_DATE
FROM (
SELECT DISTINCT PROCESS_ID
,IMP_ENTITY_TYPE ENTITY_TYPE
,IMP_ENTITY_ID ENTITY_ID
,IMP_ENTITY_NAME ENTITY_NAME
,(COALESCE(SOURCE_LEDGER_ID,0) * 100000000 + GL_EFFECTIVE_PERIOD_NUM) ENTITY_NAME_ORDER
FROM AIF_PROCESS_PERIODS
WHERE PROCESS_ID = 300
) q
ORDER BY ENTITY_NAME_ORDER
2013-12-31 22:44:56,963 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_DETAILS (
PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,TARGET_TABLE_NAME
,EXECUTION_START_TIME
,EXECUTION_END_TIME
,RECORDS_PROCESSED
,STATUS
,LAST_UPDATED_BY
,LAST_UPDATE_DATE
SELECT PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,'TDATASEG' TARGET_TABLE_NAME
,CURRENT_TIMESTAMP EXECUTION_START_TIME
,NULL EXECUTION_END_TIME
,0 RECORDS_PROCESSED
,'PENDING' STATUS
,'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' LAST_UPDATED_BY
,CURRENT_TIMESTAMP LAST_UPDATE_DATE
FROM (
SELECT PROCESS_ID
,TRANS_ENTITY_TYPE ENTITY_TYPE
,MIN(TRANS_ENTITY_ID) ENTITY_ID
,TRANS_ENTITY_NAME ENTITY_NAME
,MIN(COALESCE(SOURCE_LEDGER_ID,0) * 100000000 + GL_EFFECTIVE_PERIOD_NUM) ENTITY_NAME_ORDER
FROM AIF_PROCESS_PERIODS
WHERE PROCESS_ID = 300
AND PRIOR_PERIOD_FLAG = 'N'
GROUP BY PROCESS_ID
,TRANS_ENTITY_TYPE
,TRANS_ENTITY_NAME
) q
ORDER BY ENTITY_NAME_ORDER
2013-12-31 22:44:56,970 INFO [AIF]: COMM Process Periods - Insert Process Details - END
2013-12-31 22:44:57,407 DEBUG [AIF]: EBS/FS GL Balances Print Variables - Printing Variables - START
2013-12-31 22:44:57,408 DEBUG [AIF]:
p_process_id: 300
p_sql_db_type: ORACLE
p_partitionkey: 12
p_rule_id: 27
p_source_system_id: 3
p_application_id: 26
p_target_application_type: HFM
p_is_multi_currency: true
p_data_load_method: CLASSIC_VIA_EPMI
p_bal_balance_method_code: STANDARD
p_bal_ledger_group_code: SINGLE
p_bal_amount_type: MONETARY
p_prd_entity_name: JAN-13
p_prd_period_id: 135
p_prd_gl_period_name: JAN-13
p_prd_source_ledger_id: 1
p_prd_source_coa_id: 101
p_source_ledger_id: 1
p_source_coa_id: 101
p_bal_actual_flag: A
p_bal_seg_column_name: SEGMENT1
p_max_ccid_loaded_to_stg: 148137
2013-12-31 22:44:57,408 DEBUG [AIF]: EBS/FS GL Balances Print Variables - Printing Variables - END
2013-12-31 22:44:57,806 INFO [AIF]: LKM EBS/FS Extract Type - Load Audit AND Full Refresh - START
2013-12-31 22:44:57,817 DEBUG [AIF]:
SELECT p.PROCESS_ID
,br.RULE_NAME
,l.SOURCE_LEDGER_NAME
FROM AIF_GL_LOAD_AUDIT aud
,AIF_PROCESSES p
,AIF_BALANCE_RULES br
,AIF_COA_LEDGERS l
WHERE aud.SOURCE_SYSTEM_ID = 3
AND aud.SOURCE_LEDGER_ID = 1
AND aud.GL_PERIOD_ID = 135
AND aud.BALANCE_TYPE = 'A'
AND p.PROCESS_ID = aud.LAST_LOADID
AND p.STATUS = 'RUNNING'
AND p.PROCESS_ID <> 300
AND br.RULE_ID = p.RULE_ID
AND l.SOURCE_SYSTEM_ID = aud.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = aud.SOURCE_LEDGER_ID
2013-12-31 22:44:57,826 DEBUG [AIF]:
SELECT 'Y' VALID_FLAG
FROM GL_PERIOD_STATUSES
WHERE APPLICATION_ID = 101
AND SET_OF_BOOKS_ID = 1
AND PERIOD_NAME = 'JAN-13'
AND CLOSING_STATUS IN ( 'O','C','P' )
2013-12-31 22:44:57,847 DEBUG [AIF]:
SELECT 'Y' EXISTS_FLAG
FROM GL_TRACK_DELTA_BALANCES
WHERE SET_OF_BOOKS_ID = 1
AND PROGRAM_CODE = 'FEM'
AND PERIOD_NAME = 'JAN-13'
AND ACTUAL_FLAG = 'A'
AND EXTRACT_LEVEL_CODE = 'DTL'
AND CURRENCY_TYPE_CODE = 'B'
AND ENABLED_FLAG = 'Y'
2013-12-31 22:44:57,883 DEBUG [AIF]:
SELECT MAX(DELTA_RUN_ID) MAX_DELTA_RUN_ID
FROM GL_BALANCES_DELTA
WHERE SET_OF_BOOKS_ID = 1
AND PERIOD_NAME = 'JAN-13'
AND ACTUAL_FLAG = 'A'
2013-12-31 22:44:57,898 DEBUG [AIF]:
SELECT brl.EXECUTION_MODE
,( SELECT CASE COUNT(aud.DELTA_RUN_ID) WHEN 0 THEN 'N' ELSE 'Y' END
FROM AIF_GL_LOAD_AUDIT aud
WHERE aud.SOURCE_SYSTEM_ID = 3
AND aud.SOURCE_LEDGER_ID = 1
AND aud.GL_PERIOD_ID = 135
AND aud.BALANCE_TYPE = 'A'
AND aud.LAST_LOADID <> brl.LOADID
AND COALESCE( aud.STATUS, 'SUCCESS' ) = 'SUCCESS'
) GL_LOAD_AUDIT_SUCCESS_FLAG
,( SELECT CASE COUNT(aud.DELTA_RUN_ID) WHEN 0 THEN 'N' ELSE 'Y' END
FROM AIF_GL_LOAD_AUDIT aud
WHERE aud.SOURCE_SYSTEM_ID = 3
AND aud.SOURCE_LEDGER_ID = 1
AND aud.GL_PERIOD_ID = 135
AND aud.BALANCE_TYPE = 'A'
AND aud.LAST_LOADID <> brl.LOADID
AND aud.STATUS = 'RUNNING'
) GL_LOAD_AUDIT_RUNNING_FLAG
FROM AIF_BAL_RULE_LOADS brl
WHERE brl.LOADID = 300
2013-12-31 22:44:57,904 DEBUG [AIF]:
INSERT INTO AIF_GL_LOAD_AUDIT ( LAST_LOADID
,DELTA_RUN_ID
,SOURCE_SYSTEM_ID
,SOURCE_LEDGER_ID
,GL_PERIOD_ID
,BALANCE_TYPE
,GL_EXTRACT_TYPE
,STATUS
) VALUES ( 300
,0
,3
,1
,135
,'A'
,'FULLREFRESH'
,'RUNNING'
2013-12-31 22:44:57,907 DEBUG [AIF]:
DELETE FROM AIF_EBS_GL_BALANCES_STG
WHERE SOURCE_SYSTEM_ID = 3
AND SOURCE_COA_ID = 101
AND SOURCE_LEDGER_ID = 1
AND ACTUAL_FLAG = 'A'
AND PERIOD_NAME = 'JAN-13'
2013-12-31 22:44:59,283 INFO [AIF]: LKM EBS/FS Extract Type - Load Audit AND Full Refresh - END
2013-12-31 22:45:06,507 INFO [AIF]: COMM End Process Detail - Update Process Detail - START
2013-12-31 22:45:06,514 DEBUG [AIF]:
UPDATE AIF_PROCESS_DETAILS
SET STATUS = 'SUCCESS'
,RECORDS_PROCESSED = CASE
WHEN RECORDS_PROCESSED IS NULL THEN 0
ELSE RECORDS_PROCESSED
END + 57408
,EXECUTION_END_TIME = CURRENT_TIMESTAMP
,LAST_UPDATED_BY = CASE
WHEN ('native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' IS NULL) THEN LAST_UPDATED_BY
ELSE 'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
END
,LAST_UPDATE_DATE = CURRENT_TIMESTAMP
WHERE PROCESS_ID = 300
AND ENTITY_TYPE = 'PROCESS_BAL_IMP'
AND ENTITY_NAME = 'JAN-13'
2013-12-31 22:45:06,519 INFO [AIF]: COMM End Process Detail - Update Process Detail - END
2013-12-31 22:45:07,106 INFO [AIF]: EBS/FS Load Data - Load TDATASEG_T - START
2013-12-31 22:45:07,112 INFO [AIF]:
Import Data from Source for Period 'January 2013'
2013-12-31 22:45:07,115 DEBUG [AIF]:
SELECT brl.PARTITIONKEY
,brl.CATKEY
,brl.PLAN_TYPE
,CASE brl.PLAN_TYPE
WHEN 'PLAN1' THEN 1
WHEN 'PLAN2' THEN 2
WHEN 'PLAN3' THEN 3
WHEN 'PLAN4' THEN 4
WHEN 'PLAN5' THEN 5
ELSE 0
END PLAN_NUMBER
,brl.EXECUTION_MODE
,br.AMOUNT_TYPE
,br.BALANCE_SELECTION
,br.CURRENCY_CODE
,br.INCL_ZERO_BALANCE_FLAG
,br.BAL_SEG_VALUE_OPTION_CODE
,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
FROM AIF_BAL_RULE_LOADS brl
,AIF_TARGET_APPLICATIONS app
,AIF_BALANCE_RULES br
WHERE brl.LOADID = 300
AND app.APPLICATION_ID = brl.APPLICATION_ID
AND br.RULE_ID = brl.RULE_ID
2013-12-31 22:45:07,120 DEBUG [AIF]:
SELECT PERIODKEY
FROM TPOVPERIOD
WHERE PERIODDESC = 'January 2013'
2013-12-31 22:45:07,122 INFO [AIF]:
Import Data from Source for Ledger 'JWR Books'
2013-12-31 22:45:07,125 DEBUG [AIF]:
SELECT COA_SEGMENT_NAME
,ACCOUNT_TYPE_FLAG
,BALANCE_TYPE_FLAG
FROM AIF_COA_SEGMENTS
WHERE SOURCE_SYSTEM_ID = 3
AND SOURCE_COA_ID = '101'
AND (
ACCOUNT_TYPE_FLAG = 'Y'
OR BALANCE_TYPE_FLAG = 'Y'
2013-12-31 22:45:07,127 DEBUG [AIF]:
SELECT adim.BALANCE_COLUMN_NAME DIMNAME
,adim.DIMENSION_ID
,dim.TARGET_DIMENSION_CLASS_NAME
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1
) COA_SEGMENT_NAME1
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2
) COA_SEGMENT_NAME2
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3
) COA_SEGMENT_NAME3
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4
) COA_SEGMENT_NAME4
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5
) COA_SEGMENT_NAME5
,(SELECT CASE mdd.ORPHAN_OPTION_CODE
WHEN 'CHILD' THEN 'N'
WHEN 'ROOT' THEN 'N'
ELSE 'Y'
END DIMENSION_FILTER_FLAG
FROM AIF_MAP_DIM_DETAILS_V mdd
,AIF_MAPPING_RULES mr
WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
AND mdd.RULE_ID = mr.RULE_ID
AND mdd.DIMENSION_ID = adim.DIMENSION_ID
) DIMENSION_FILTER_FLAG
,tiie.IMPCONCATCHAR
FROM TPOVPARTITION tpp
INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
ON adim.APPLICATION_ID = 26
INNER JOIN AIF_DIMENSIONS dim
ON dim.DIMENSION_ID = adim.DIMENSION_ID
LEFT OUTER JOIN TBHVIMPITEMERPI tiie
ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
AND tiie.IMPMAPTYPE = 'ERP'
WHERE tpp.PARTITIONKEY = 12
AND adim.BALANCE_COLUMN_NAME IS NOT NULL
ORDER BY adim.BALANCE_COLUMN_NAME
2013-12-31 22:45:07,154 DEBUG [AIF]:
INSERT INTO TDATASEG_T (
LOADID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,VALID_FLAG
,CHANGESIGN
,CODE_COMBINATION_ID
,SOURCE_LEDGER_ID
,GL_PERIOD_YEAR
,GL_PERIOD_NUM
,YEAR
,PERIOD
,ATTR1
,ATTR2
,ATTR3
,ATTR4
,ATTR5
,ATTR6
,ATTR7
,ATTR8
,ATTR9
,ATTR10
,ATTR11
,ATTR12
,ATTR13
,ATTR14
,ACCOUNT
,ACCOUNTX
,ENTITY
,ENTITYX
,ICP
,ICPX
,UD1
,UD1X
,UD2
,UD2X
,UD3
,UD3X
,UD4
,UD4X
,DATAVIEW
,DATAKEY
,STAT_BALANCE_FLAG
,CURKEY
,AMOUNT_PTD
,AMOUNT_YTD
,AMOUNT
,AMOUNTX
SELECT pprd.PROCESS_ID LOADID
,12 PARTITIONKEY
,4 CATKEY
,27 RULE_ID
,pprd.PERIODKEY
,'Y' VALID_FLAG
,0 CHANGESIGN
,ccid.CODE_COMBINATION_ID
,bal.SOURCE_LEDGER_ID
,pprd.GL_PERIOD_YEAR
,pprd.GL_PERIOD_NUM
,pprd.YEARTARGET YEAR
,pprd.PERIODTARGET PERIOD
,pprd.PROCESS_ID ATTR1
,bal.SOURCE_SYSTEM_ID ATTR2
,bal.SOURCE_LEDGER_ID ATTR3
,pprd.GL_PERIOD_YEAR ATTR4
,pprd.GL_PERIOD_NAME ATTR5
,bal.ACTUAL_FLAG ATTR6
,bal.BUDGET_VERSION_ID ATTR7
,bal.ENCUMBRANCE_TYPE_ID ATTR8
,ccid.ACCOUNT_TYPE ATTR9
,NULL ATTR10
,NULL ATTR11
,NULL ATTR12
,NULL ATTR13
,NULL ATTR14
,ccid.SEGMENT4 ACCOUNT
,NULL ACCOUNTX
,ccid.SEGMENT1 ENTITY
,NULL ENTITYX
,NULL ICP
,NULL ICPX
,ccid.SEGMENT5 UD1
,NULL UD1X
,ccid.SEGMENT3 UD2
,NULL UD2X
,ccid.SEGMENT6 UD3
,NULL UD3X
,ccid.SEGMENT2 UD4
,NULL UD4X
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('R','E','D','C') THEN 'Periodic' ELSE 'YTD' END ) DATAVIEW
,TDATASEG_DATAKEY_S.NEXTVAL
,'N' STAT_BALANCE_FLAG
,bal.CURRENCY_CODE CURKEY
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) * ( bal.PERIOD_NET_DR - bal.PERIOD_NET_CR ) AMOUNT_PTD
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) * ( bal.BEGIN_BALANCE_DR - bal.BEGIN_BALANCE_CR + bal.PERIOD_NET_DR - bal.PERIOD_NET_CR ) AMOUNT_YTD
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) *
( CASE
WHEN ccid.ACCOUNT_TYPE IN ('R','E','D','C') THEN ( bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
ELSE ( bal.BEGIN_BALANCE_DR - bal.BEGIN_BALANCE_CR + bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
END
AMOUNT
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) *
( CASE
WHEN ccid.ACCOUNT_TYPE IN ('R','E','D','C') THEN ( bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
ELSE ( bal.BEGIN_BALANCE_DR - bal.BEGIN_BALANCE_CR + bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
END
AMOUNTX
FROM AIF_EBS_GL_BALANCES_STG_V bal
,AIF_EBS_GL_CCID_STG ccid
,AIF_PROCESS_PERIODS pprd
WHERE bal.SOURCE_SYSTEM_ID = 3
AND bal.SOURCE_LEDGER_ID = 1
AND bal.ACTUAL_FLAG = 'A'
AND ccid.SOURCE_SYSTEM_ID = bal.SOURCE_SYSTEM_ID
AND ccid.SOURCE_COA_ID = bal.SOURCE_COA_ID
AND ccid.CODE_COMBINATION_ID = bal.CODE_COMBINATION_ID
AND pprd.PROCESS_ID = 300
AND pprd.PERIODKEY = '2013-01-01'
AND pprd.SOURCE_LEDGER_ID = bal.SOURCE_LEDGER_ID
AND pprd.GL_PERIOD_NAME = bal.PERIOD_NAME
AND (
bal.BEGIN_BALANCE_DR <> 0
OR bal.BEGIN_BALANCE_CR <> 0
OR bal.PERIOD_NET_DR <> 0
OR bal.PERIOD_NET_CR <> 0
AND bal.CURRENCY_CODE <> 'STAT'
AND bal.TRANSLATED_FLAG IS NULL
2013-12-31 22:45:09,269 INFO [AIF]: Monetary Data Rows Imported from Source: 12590
2013-12-31 22:45:09,293 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_AUDIT (
LOADID
,TARGET_APPLICATION_TYPE
,TARGET_APPLICATION_NAME
,PLAN_TYPE
,SOURCE_LEDGER_ID
,EPM_YEAR
,EPM_PERIOD
,SNAPSHOT_FLAG
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,EXPORT_TO_TARGET_FLAG
SELECT DISTINCT PROCESS_ID LOADID
,'HFM' TARGET_APPLICATION_TYPE
,'FMTEST2' TARGET_APPLICATION_NAME
,NULL PLAN_TYPE
,SOURCE_LEDGER_ID
,YEARTARGET EPM_YEAR
,PERIODTARGET EPM_PERIOD
,'Y' SNAPSHOT_FLAG
,12 PARTITIONKEY
,4 CATKEY
,27 RULE_ID
,PERIODKEY
,'N' EXPORT_TO_TARGET_FLAG
FROM AIF_PROCESS_PERIODS
WHERE PROCESS_ID = 300
AND PERIODKEY = '2013-01-01'
AND SOURCE_LEDGER_ID = 1
2013-12-31 22:45:09,297 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_PRD_AUDIT (
LOADID
,SOURCE_LEDGER_ID
,GL_PERIOD_ID
,DELTA_RUN_ID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
SELECT DISTINCT pprd.PROCESS_ID LOADID
,pprd.SOURCE_LEDGER_ID
,pprd.PERIOD_ID GL_PERIOD_ID
,(SELECT MAX(gl.DELTA_RUN_ID)
FROM AIF_GL_LOAD_AUDIT gl
WHERE gl.SOURCE_SYSTEM_ID = 3
AND gl.SOURCE_LEDGER_ID = pprd.SOURCE_LEDGER_ID
AND gl.BALANCE_TYPE = 'A'
AND gl.GL_PERIOD_ID = pprd.PERIOD_ID
) DELTA_RUN_ID
,12 PARTITIONKEY
,4 CATKEY
,27 RULE_ID
,pprd.PERIODKEY
FROM AIF_PROCESS_PERIODS pprd
WHERE pprd.PROCESS_ID = 300
AND pprd.PERIODKEY = '2013-01-01'
AND pprd.SOURCE_LEDGER_ID = 1
2013-12-31 22:45:09,302 INFO [AIF]:
Total Data Rows Imported from Source: 12590
2013-12-31 22:45:09,305 INFO [AIF]: EBS/FS Load Data - Load TDATASEG_T - END
2013-12-31 22:45:09,381 INFO [AIF]: COMM Update Data - Init DataLoadUtil - START
2013-12-31 22:45:09,385 INFO [AIF]: COMM Update Data - Init DataLoadUtil - END
2013-12-31 22:45:09,485 INFO [AIF]: COMM Update Data - Update TDATASEG_T/TDATASEGW - START
2013-12-31 22:45:09,491 DEBUG [AIF]:
DELETE FROM TDATASEG_T
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND AMOUNT = 0
2013-12-31 22:45:09,559 WARN [AIF]:
Warning: Data rows with zero balances exist
2013-12-31 22:45:09,636 INFO [AIF]: Zero Balance Data Rows Deleted: 1879
2013-12-31 22:45:09,654 DEBUG [AIF]:
SELECT DIMNAME
,CASE WHEN RULE_ID IS NULL THEN 'N' ELSE 'Y' END RULE_MAP_FLAG
,SRCKEY
,TARGKEY
,WHERECLAUSETYPE
,WHERECLAUSEVALUE
,CHANGESIGN
,SEQUENCE
,DATAKEY
,MAPPING_TYPE
FROM (
SELECT DISTINCT tdm.DIMNAME
,tdm.RULE_ID
,NULL SRCKEY
,NULL TARGKEY
,tdm.WHERECLAUSETYPE
,tdm.WHERECLAUSEVALUE
,NULL CHANGESIGN
,1 SEQUENCE
,COALESCE(tdm.SYSTEM_GENERATED_FLAG,'N') SYSTEM_GENERATED_FLAG
,NULL DATAKEY
,CASE
WHEN tdm.WHERECLAUSETYPE IS NULL THEN 1
ELSE 3
END MAPPING_TYPE
FROM TDATAMAP_T tdm
WHERE tdm.LOADID = 300
AND tdm.PARTITIONKEY = 12
AND tdm.TDATAMAPTYPE = 'ERP'
AND (tdm.RULE_ID IS NULL OR tdm.RULE_ID = 27)
AND tdm.WHERECLAUSETYPE IS NULL
UNION ALL
SELECT tdm.DIMNAME
,tdm.RULE_ID
,tdm.SRCKEY
,tdm.TARGKEY
,tdm.WHERECLAUSETYPE
,tdm.WHERECLAUSEVALUE
,tdm.CHANGESIGN
,CASE tpp.PARTSEQMAP
WHEN 0 THEN CASE
WHEN (tdm.WHERECLAUSETYPE = 'MULTIDIM') THEN 2
WHEN (tdm.WHERECLAUSETYPE = 'BETWEEN') THEN 3
WHEN (tdm.WHERECLAUSETYPE = 'LIKE') THEN 4
ELSE 0
END
ELSE tdm.SEQUENCE
END SEQUENCE
,COALESCE(tdm.SYSTEM_GENERATED_FLAG,'N') SYSTEM_GENERATED_FLAG
,tdm.DATAKEY
,CASE
WHEN tdm.WHERECLAUSETYPE IS NULL THEN 1
ELSE 3
END MAPPING_TYPE
FROM TDATAMAP_T tdm
INNER JOIN TPOVPARTITION tpp
ON tpp.PARTITIONKEY = tdm.PARTITIONKEY
WHERE tdm.LOADID = 300
AND tdm.PARTITIONKEY = 12
AND tdm.TDATAMAPTYPE = 'ERP'
AND (tdm.RULE_ID IS NULL OR tdm.RULE_ID = 27)
AND tdm.WHERECLAUSETYPE IN ('MULTIDIM','BETWEEN','LIKE')
) q
ORDER BY DIMNAME
,RULE_ID
,SEQUENCE
,SYSTEM_GENERATED_FLAG
,SRCKEY
2013-12-31 22:45:09,672 INFO [AIF]:
Processing Mappings for Column 'ACCOUNT'
2013-12-31 22:45:09,677 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ACCOUNTX = ACCOUNT
,ACCOUNTR = 121
,ACCOUNTF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '*' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ACCOUNTX IS NULL
AND (1=1)
2013-12-31 22:45:10,044 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:10,053 INFO [AIF]:
Processing Mappings for Column 'ENTITY'
2013-12-31 22:45:10,057 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ENTITYX = ENTITY
,ENTITYR = 122
,ENTITYF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '*' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ENTITYX IS NULL
AND (1=1)
2013-12-31 22:45:10,426 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:10,435 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ENTITYX = ENTITY
,ENTITYR = 132
,ENTITYF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '*' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ENTITYX IS NULL
AND (1=1)
2013-12-31 22:45:10,446 INFO [AIF]: Data Rows Updated by Location Mapping 'DEFAULT' (LIKE): 0
2013-12-31 22:45:10,448 INFO [AIF]:
Processing Mappings for Column 'ICP'
2013-12-31 22:45:10,452 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ICPX = '[ICP None]'
,ICPR = 130
,ICPF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[ICP None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ICPX IS NULL
AND (1=1)
2013-12-31 22:45:10,784 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:10,798 INFO [AIF]:
Processing Mappings for Column 'UD1'
2013-12-31 22:45:10,802 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD1X = '[None]'
,UD1R = 124
,UD1F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD1X IS NULL
AND (1=1)
2013-12-31 22:45:11,134 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:11,156 INFO [AIF]:
Processing Mappings for Column 'UD2'
2013-12-31 22:45:11,160 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD2X = '[None]'
,UD2R = 123
,UD2F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD2X IS NULL
AND (1=1)
2013-12-31 22:45:11,517 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:11,531 INFO [AIF]:
Processing Mappings for Column 'UD3'
2013-12-31 22:45:11,535 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD3X = '[None]'
,UD3R = 125
,UD3F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD3X IS NULL
AND (1=1)
2013-12-31 22:45:11,870 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:11,883 INFO [AIF]:
Processing Mappings for Column 'UD4'
2013-12-31 22:45:11,887 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD4X = '[None]'
,UD4R = 128
,UD4F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD4X IS NULL
AND (1=1)
2013-12-31 22:45:12,192 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:12,204 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ATTR14 = DATAKEY
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:12,739 DEBUG [AIF]:
UPDATE TDATASEG_T
SET VALID_FLAG = 'N'
WHERE 1=1
AND (
(1=0)
OR TDATASEG_T.ACCOUNTX IS NULL
OR TDATASEG_T.ENTITYX IS NULL
OR TDATASEG_T.ICPX IS NULL
OR TDATASEG_T.UD1X IS NULL
OR TDATASEG_T.UD2X IS NULL
OR TDATASEG_T.UD3X IS NULL
OR TDATASEG_T.UD4X IS NULL
AND LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND VALID_FLAG = 'Y'
2013-12-31 22:45:12,754 INFO [AIF]:
Total Data Rows available for Export to Target: 10711
2013-12-31 22:45:12,773 INFO [AIF]: COMM Update Data - Update TDATASEG_T/TDATASEGW - END
2013-12-31 22:45:12,808 INFO [AIF]: COMM End Process Detail - Update Process Detail - START
2013-12-31 22:45:12,823 DEBUG [AIF]:
UPDATE AIF_PROCESS_DETAILS
SET STATUS = 'SUCCESS'
,RECORDS_PROCESSED = CASE
WHEN RECORDS_PROCESSED IS NULL THEN 0
ELSE RECORDS_PROCESSED
END + 10711
,EXECUTION_END_TIME = CURRENT_TIMESTAMP
,LAST_UPDATED_BY = CASE
WHEN ('native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' IS NULL) THEN LAST_UPDATED_BY
ELSE 'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
END
,LAST_UPDATE_DATE = CURRENT_TIMESTAMP
WHERE PROCESS_ID = 300
AND ENTITY_TYPE = 'PROCESS_BAL_TRANS'
AND ENTITY_NAME = 'January 2013'
2013-12-31 22:45:12,829 INFO [AIF]: COMM End Process Detail - Update Process Detail - END
2013-12-31 22:45:12,987 INFO [AIF]: COMM Update YTD Amounts - Update YTD Amounts - START
2013-12-31 22:45:12,993 DEBUG [AIF]:
SELECT brl.PARTITIONKEY
,brl.CATKEY
,pprd.YEARTARGET
,pprd.PERIODTARGET
,pprd.SOURCE_LEDGER_ID
FROM AIF_BAL_RULE_LOADS brl
,AIF_PROCESS_PERIODS pprd
WHERE brl.LOADID = 300
AND pprd.PROCESS_ID = brl.LOADID
GROUP BY brl.PARTITIONKEY
,brl.CATKEY
,pprd.YEARTARGET
,pprd.PERIODTARGET
,pprd.SOURCE_LEDGER_ID
HAVING COUNT(*) > 1
2013-12-31 22:45:12,995 INFO [AIF]: COMM Update YTD Amounts - Update YTD Amounts - END
2013-12-31 22:45:13,052 INFO [AIF]: COMM Load TDATAMAPSEG/TDATASEG - Load TDATAMAPSEG/TDATASEG - START
2013-12-31 22:45:13,057 DEBUG [AIF]:
SELECT brl.PARTITIONKEY
,brl.CATKEY
,brl.EXECUTION_MODE
FROM AIF_BAL_RULE_LOADS brl
WHERE brl.LOADID = 300
2013-12-31 22:45:13,059 DEBUG [AIF]:
SELECT PERIODKEY
FROM AIF_APPL_LOAD_AUDIT
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND RULE_ID = 27
ORDER BY PERIODKEY
2013-12-31 22:45:13,061 INFO [AIF]:
Processing Data for PeriodKey '2013-01-01'
2013-12-31 22:45:13,065 DEBUG [AIF]:
DELETE FROM TDATAMAPSEG
WHERE PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND (
TDATAMAPTYPE = 'ERP'
OR (
TDATAMAPTYPE = 'MULTIDIM'
AND EXISTS (
SELECT 1
FROM TDATAMAPSEG parent
WHERE parent.PARTITIONKEY = TDATAMAPSEG.PARTITIONKEY
AND parent.DATAKEY = TDATAMAPSEG.TARGKEY
AND parent.CATKEY = TDATAMAPSEG.CATKEY
AND parent.PERIODKEY = TDATAMAPSEG.PERIODKEY
AND parent.TDATAMAPTYPE = 'ERP'
2013-12-31 22:45:13,074 INFO [AIF]: Number of Rows deleted from TDATAMAPSEG: 8
2013-12-31 22:45:13,077 DEBUG [AIF]:
INSERT INTO TDATAMAPSEG (
DATAKEY
,PARTITIONKEY
,CATKEY
,PERIODKEY
,DIMNAME
,SRCKEY
,SRCDESC
,TARGKEY
,WHERECLAUSETYPE
,WHERECLAUSEVALUE
,CHANGESIGN
,SEQUENCE
,VBSCRIPT
,TDATAMAPTYPE
,SYSTEM_GENERATED_FLAG
SELECT DATAKEY
,PARTITIONKEY
,4
,'2013-01-01'
,DIMNAME
,SRCKEY
,SRCDESC
,TARGKEY
,WHERECLAUSETYPE
,WHERECLAUSEVALUE
,CHANGESIGN
,SEQUENCE
,VBSCRIPT
,TDATAMAPTYPE
,SYSTEM_GENERATED_FLAG
FROM TDATAMAP_T
WHERE LOADID = 300
2013-12-31 22:45:13,081 INFO [AIF]: Number of Rows inserted into TDATAMAPSEG: 8
2013-12-31 22:45:13,083 DEBUG [AIF]:
DELETE FROM TDATASEG
WHERE LOADID < 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND RULE_ID = 27
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:15,659 INFO [AIF]: Number of Rows deleted from TDATASEG: 10711
2013-12-31 22:45:15,728 DEBUG [AIF]:
INSERT INTO TDATASEG (
DATAKEY
,PARTITIONKEY
,CATKEY
,PERIODKEY
,CURKEY
,DATAVIEW
,CALCACCTTYPE
,CHANGESIGN
,JOURNALID
,AMOUNT
,AMOUNTX
,AMOUNT_PTD
,AMOUNT_YTD
,DESC1
,DESC2
,ACCOUNT
,ACCOUNTX
,ACCOUNTR
,ACCOUNTF
,ENTITY
,ENTITYX
,ENTITYR
,ENTITYF
,ICP
,ICPX
,ICPR
,ICPF
,UD1
,UD1X
,UD1R
,UD1F
,UD2
,UD2X
,UD2R
,UD2F
,UD3
,UD3X
,UD3R
,UD3F
,UD4
,UD4X
,UD4R
,UD4F
,UD5
,UD5X
,UD5R
,UD5F
,UD6
,UD6X
,UD6R
,UD6F
,UD7
,UD7X
,UD7R
,UD7F
,UD8
,UD8X
,UD8R
,UD8F
,UD9
,UD9X
,UD9R
,UD9F
,UD10
,UD10X
,UD10R
,UD10F
,UD11
,UD11X
,UD11R
,UD11F
,UD12
,UD12X
,UD12R
,UD12F
,UD13
,UD13X
,UD13R
,UD13F
,UD14
,UD14X
,UD14R
,UD14F
,UD15
,UD15X
,UD15R
,UD15F
,UD16
,UD16X
,UD16R
,UD16F
,UD17
,UD17X
,UD17R
,UD17F
,UD18
,UD18X
,UD18R
,UD18F
,UD19
,UD19X
,UD19R
,UD19F
,UD20
,UD20X
,UD20R
,UD20F
,ATTR1
,ATTR2
,ATTR3
,ATTR4
,ATTR5
,ATTR6
,ATTR7
,ATTR8
,ATTR9
,ATTR10
,ATTR11
,ATTR12
,ATTR13
,ATTR14
,ARCHIVEID
,HASMEMOITEM
,STATICDATAKEY
,LOADID
,RULE_ID,
CODE_COMBINATION_ID
,STAT_BALANCE_FLAG
,VALID_FLAG
SELECT
DATAKEY
,PARTITIONKEY
,CATKEY
,PERIODKEY
,CURKEY
,DATAVIEW
,CALCACCTTYPE
,CHANGESIGN
,JOURNALID
,AMOUNT
,AMOUNTX
,AMOUNT_PTD
,AMOUNT_YTD
,DESC1
,DESC2
,ACCOUNT
,ACCOUNTX
,ACCOUNTR
,ACCOUNTF
,ENTITY
,ENTITYX
,ENTITYR
,ENTITYF
,ICP
,ICPX
,ICPR
,ICPF
,UD1
,UD1X
,UD1R
,UD1F
,UD2
,UD2X
,UD2R
,UD2F
,UD3
,UD3X
,UD3R
,UD3F
,UD4
,UD4X
,UD4R
,UD4F
,UD5
,UD5X
,UD5R
,UD5F
,UD6
,UD6X
,UD6R
,UD6F
,UD7
,UD7X
,UD7R
,UD7F
,UD8
,UD8X
,UD8R
,UD8F
,UD9
,UD9X
,UD9R
,UD9F
,UD10
,UD10X
,UD10R
,UD10F
,UD11
,UD11X
,UD11R
,UD11F
,UD12
,UD12X
,UD12R
,UD12F
,UD13
,UD13X
,UD13R
,UD13F
,UD14
,UD14X
,UD14R
,UD14F
,UD15
,UD15X
,UD15R
,UD15F
,UD16
,UD16X
,UD16R
,UD16F
,UD17
,UD17X
,UD17R
,UD17F
,UD18
,UD18X
,UD18R
,UD18F
,UD19
,UD19X
,UD19R
,UD19F
,UD20
,UD20X
,UD20R
,UD20F
,ATTR1
,ATTR2
,ATTR3
,ATTR4
,ATTR5
,ATTR6
,ATTR7
,ATTR8
,ATTR9
,ATTR10
,ATTR11
,ATTR12
,ATTR13
,ATTR14
,ARCHIVEID
,HASMEMOITEM
,STATICDATAKEY
,LOADID
,RULE_ID,
CODE_COMBINATION_ID
,STAT_BALANCE_FLAG
,VALID_FLAG
FROM TDATASEG_T
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:16,838 INFO [AIF]: Number of Rows inserted into TDATASEG: 10711
2013-12-31 22:45:16,858 DEBUG [AIF]:
DELETE FROM TDATASEG_T
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:17,123 INFO [AIF]: Number of Rows deleted from TDATASEG_T: 10711
2013-12-31 22:45:17,153 DEBUG [AIF]:
DELETE FROM TDATAMAP_T
WHERE LOADID = 300
2013-12-31 22:45:17,156 INFO [AIF]: Number of Rows deleted from TDATAMAP_T: 8
2013-12-31 22:45:17,161 INFO [AIF]: COMM Load TDATAMAPSEG/TDATASEG - Load TDATAMAPSEG/TDATASEG - END
2013-12-31 22:45:17,993 DEBUG [AIF]:
SELECT CASE app.METADATA_LOAD_METHOD
WHEN 'EPMA' THEN CASE dim.TARGET_DIMENSION_CLASS_NAME
WHEN 'Generic' THEN dim.TARGET_DIMENSION_NAME
ELSE dim.TARGET_DIMENSION_CLASS_NAME
END
ELSE dim.TARGET_DIMENSION_NAME
END TARGET_DIMENSION_NAME
,adim.BALANCE_COLUMN_NAME
FROM AIF_TARGET_APPLICATIONS app
,AIF_TARGET_APPL_DIMENSIONS adim
,AIF_DIMENSIONS dim
WHERE app.APPLICATION_ID = 26
AND adim.APPLICATION_ID = app.APPLICATION_ID
AND dim.DIMENSION_ID = adim.DIMENSION_ID
AND dim.TARGET_DIMENSION_CLASS_NAME IN ('Custom1','Custom2','Custom3','Custom4','Generic')
2013-12-31 22:45:17,997 DEBUG [AIF]:
SELECT SCENARIO "Scenario"
,YEAR "Year"
,PERIOD "Period"
,DATAVIEW "View"
,DATAVALUE "Value"
,ACCOUNT "Account"
,ENTITY "Entity"
,ICP "ICP"
,UD2 "Area"
,UD1 "Tail"
,UD3 "Special"
,UD4 "Facility"
,AMOUNT "DataValue"
FROM AIF_HS_BALANCES
WHERE LOADID = 300
2013-12-31 22:45:18,000 INFO [SimpleAsyncTaskExecutor-9]: ODI Hyperion Financial Management Adapter
2013-12-31 22:45:18,002 INFO [SimpleAsyncTaskExecutor-9]: Load task initialized.
2013-12-31 22:45:18,028 INFO [AIF]: LKM COMM Load Data into HFM - Load Data to HFM - START
2013-12-31 22:45:18,031 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_DETAILS (
PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,TARGET_TABLE_NAME
,EXECUTION_START_TIME
,EXECUTION_END_TIME
,RECORDS_PROCESSED
,STATUS
,LAST_UPDATED_BY
,LAST_UPDATE_DATE
) VALUES (
300
,'PROCESS_BAL_EXP_HFM'
,NULL
,'FMTEST2'
,NULL
,NULL
,CURRENT_TIMESTAMP
,NULL
,NULL
,'RUNNING'
,'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
,CURRENT_TIMESTAMP
2013-12-31 22:45:18,034 INFO [SimpleAsyncTaskExecutor-9]: Connecting to Financial Management application [FMTEST2] on [10.150.20.40] using user-name [admin].
2013-12-31 22:45:18,155 INFO [SimpleAsyncTaskExecutor-9]: Connected to Financial Management application.
2013-12-31 22:45:18,157 INFO [SimpleAsyncTaskExecutor-9]: HFM Version: 11.1.2.2.300.
2013-12-31 22:45:18,160 INFO [SimpleAsyncTaskExecutor-9]: Options for the Financial Management load task are:
<Options>
<Option name=LOG_FILE_NAME value=C:\Windows\TEMP\/aif_501_300.log/>
<Option name=IMPORT_MODE value=Replace/>
<Option name=CONSOLIDATE_ONLY value=false/>
<Option name=CONSOLIDATE_PARAMETERS value=""/>
<Option name=LOG_ENABLED value=true/>
<Option name=ACCUMULATE_WITHIN_FILE value=false/>
<Option name=DEBUG_ENABLED value=true/>
<Option name=CONSOLIDATE_AFTER_LOAD value=false/>
<Option name=FILE_CONTAINS_SHARE_DATA value=false/>
</Options>
2013-12-31 22:45:18,168 INFO [SimpleAsyncTaskExecutor-9]: Load Options validated.
2013-12-31 22:45:18,176 ERROR [SimpleAsyncTaskExecutor-9]: Error occurred during load process ORA-00904: "DATAVALUE": invalid identifier
com.hyperion.odi.common.ODIHAppException: ORA-00904: "DATAVALUE": invalid identifier
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:216)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)
at org.python.core.PyObject.__call__(PyObject.java:355)
at org.python.core.PyMethod.__call__(PyMethod.java:215)
at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)
at org.python.core.PyMethod.__call__(PyMethod.java:206)
at org.python.core.PyObject.__call__(PyObject.java:397)
at org.python.core.PyObject.__call__(PyObject.java:401)
at org.python.pycode._pyx161.f$0(<string>:98)
at org.python.pycode._pyx161.call_function(<string>)
at org.python.core.PyTableCode.call(PyTableCode.java:165)
at org.python.core.PyCode.call(PyCode.java:18)
at org.python.core.Py.runCode(Py.java:1204)
at org.python.core.Py.exec(Py.java:1248)
at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1889)
at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$2.doAction(StartScenRequestProcessor.java:580)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor.doProcessStartScenTask(StartScenRequestProcessor.java:513)
at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$StartScenTask.doExecute(StartScenRequestProcessor.java:1066)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:724)
Caused by: java.sql.SQLSyntaxErrorException: ORA-00904: "DATAVALUE": invalid identifier
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:462)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:931)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:481)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:205)
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:548)
at oracle.jdbc.driver.T4CStatement.doOall8(T4CStatement.java:202)
at oracle.jdbc.driver.T4CStatement.executeForDescribe(T4CStatement.java:942)
at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1283)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1441)
at oracle.jdbc.driver.OracleStatement.executeQuery(OracleStatement.java:1690)
at oracle.jdbc.driver.OracleStatementWrapper.executeQuery(OracleStatementWrapper.java:446)
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:212)
... 38 more
2013-12-31 22:45:18,208 DEBUG [AIF]:
UPDATE AIF_PROCESS_DETAILS
SET STATUS = 'FAILED'
,RECORDS_PROCESSED = CASE
WHEN RECORDS_PROCESSED IS NULL THEN 0
ELSE RECORDS_PROCESSED
END + 0
,EXECUTION_END_TIME = CURRENT_TIMESTAMP
,LAST_UPDATED_BY = CASE
WHEN ('native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' IS NULL) THEN LAST_UPDATED_BY
ELSE 'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
END
,LAST_UPDATE_DATE = CURRENT_TIMESTAMP
WHERE PROCESS_ID = 300
AND ENTITY_TYPE = 'PROCESS_BAL_EXP_HFM'
AND ENTITY_NAME = 'FMTEST2'
2013-12-31 22:45:18,210 FATAL [AIF]: Error in LKM C
The issue is that you are mapping "Data Value" to amount in the Target Application import format:
SELECT SCENARIO "Scenario"
,YEAR "Year"
,PERIOD "Period"
,DATAVIEW "View"
,DATAVALUE "Value"
,ACCOUNT "Account"
,ENTITY "Entity"
,ICP "ICP"
,UD2 "Area"
,UD1 "Tail"
,UD3 "Special"
,UD4 "Facility"
,AMOUNT "DataValue"
FROM AIF_HS_BALANCES
WHERE LOADID = 300
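ORA-00904 here means the generated SELECT references a DATAVALUE column that does not exist in AIF_HS_BALANCES. As a hedged sketch only (the literal member name '<Entity Currency>' is an assumption for illustration, not the actual fix), a query that supplies the Value dimension as a constant would at least parse:

```sql
-- Sketch: replace the nonexistent DATAVALUE column with a literal Value member.
-- '<Entity Currency>' is an assumed example member, not taken from this log.
SELECT SCENARIO            "Scenario"
      ,YEAR                "Year"
      ,PERIOD              "Period"
      ,DATAVIEW            "View"
      ,'<Entity Currency>' "Value"     -- literal instead of the missing column
      ,ACCOUNT             "Account"
      ,ENTITY              "Entity"
      ,ICP                 "ICP"
      ,UD2                 "Area"
      ,UD1                 "Tail"
      ,UD3                 "Special"
      ,UD4                 "Facility"
      ,AMOUNT              "DataValue"
  FROM AIF_HS_BALANCES
 WHERE LOADID = 300
```

The real correction belongs in the import format / dimension mapping, so that FDMEE stops generating the invalid column reference in the first place.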
You need to map AMOUNT to "AMOUNT" in the HFM application. Check that the dimension mapping is correct for the class in the target application, and that your import format points to the proper target dimension (Amount).
-
Error while Loading data through .csv file
Hi,
I am getting below date error when loading data through into Olap tables through .csv file.
Data stored in .csv is 20071113121100.
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TE_7007 Transformation Evaluation Error [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')]
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TT_11132 Transformation [Exp_FILE_CHNL_TYPE] had an error evaluating output column [CREATED_ON_DT_OUT]. Error message is [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')].
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TT_11019 There is an error in the port [CREATED_ON_DT_OUT]: The default value for the port is set to: ERROR(<<Expression Error>> [ERROR]: transformation error
... nl:ERROR(u:'transformation error')).
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TT_11021 An error occurred moving data from the transformation Exp_FILE_CHNL_TYPE: to the transformation W_CHNL_TYPE_DS.
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> CMN_1086 Exp_FILE_CHNL_TYPE: Number of errors exceeded threshold [1].
Any help is greatly appreciated.
Thanks,
Poojak
1) Wrong forum; you won't get much support loading OLAP cubes in here, I think.
2) Has your CSV file been anywhere near Excel, by chance? The conversion of 20071113121100 to 2.00711E+13 looks very much like what I see when Excel has an invalid number mask / precision specified for a cell.
*** We are using EBS. The file was already generated by consultants through a SQL query. We are getting the error while loading through Informatica. The target table is set up with a date datatype and the source is String(19).
Expression in Informatica is setup as below.
IIF(ISNULL(CREATED_ON_DT), NULL, TO_DATE(CREATED_ON_DT, 'YYYYMMDDHH24MISS'))
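The expression itself is fine for the raw 14-digit string; the failure comes from the Excel-mangled value. As a sketch in Oracle SQL (assuming Informatica's TO_DATE follows the same mask semantics):

```sql
-- The raw value from the source system parses cleanly against the mask:
SELECT TO_DATE('20071113121100', 'YYYYMMDDHH24MISS') FROM DUAL;
-- Once Excel rewrites the cell in scientific notation, the same expression
-- receives '2.00711E+13' and fails, because '.', 'E' and '+' are not valid
-- digits for the YYYYMMDDHH24MISS mask:
SELECT TO_DATE('2.00711E+13', 'YYYYMMDDHH24MISS') FROM DUAL;  -- date-conversion error
```

So the fix is upstream: regenerate or re-export the CSV without letting Excel reformat the column.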
Thanks,
Poojak
-
Error while loading application
Hi,
I'm not sure this is the right forum, but I try anyway.
I'm trying to use a third-party Java application, Unico (I only use it; I don't develop any part of it), which, after working for weeks, suddenly refuses to load at startup and returns an error like "Error while loading application" (no error code, no other hint about the problem).
The tests that did not work: 1. I reinstalled Unico; 2. I created a new login and installed Unico under that login; 3. I started in safe mode (pressing Shift at startup). In every case I got the same message and loading aborted.
The only workaround I found is to use an external startup disk, which I fortunately had available for cases like this: when the system starts up from that disk, Unico WORKS; not only that, I can even run the same copy of the application on my internal disk that otherwise does not load.
My first conclusion is that something went wrong in the system on my startup disk that prevents loading a Java application. The user Library seems to make no difference.
So my questions to the Java experts are:
- what in the system could suddenly block the loading of a Java application? Where should I look in the system (I can use the Terminal and know something about UNIX) to fix this? Could this be caused by a permission problem? (It looks a lot like one, but repairing permissions in Disk Utility didn't help.) If so, where are the Java support files stored in the Library and System Library, so I can check and compare between the good and the bad system?
- is there a better workaround to this problem, other than starting up from a different disk?
Some additional info:
- the application Unico is freely provided by the Italian tax organization, so there is basically no support to use it...
- Unico requires Java 1.3.1; my system is constantly updated/upgraded and runs the most recent release, which is 1.4.0 (I guess...), and in any case it worked very well until a week ago.
- since the last time the application had worked I installed two (non Apple) packages: the TomTom (GPS navigator) setup application, and new Epson printer drivers for my new printer.
- in all cases I repaired permissions and I also verified the disk with the disk utility
- Unico does not use printers (it creates PDF files instead) and does not use USB, which is used by both the Epson driver and the TomTom app.
So I cannot connect the Unico malfunction to this new software, except for the fact that I cannot find any other answer...
Thanks to anybody having any suggestion about this.
Piero
I did some research: it seems to me that most (if not all) of the Java machine is in the /System/Library/Frameworks/JavaVM.framework folder, but I found no evident difference in the file permissions between the good and the bad system (see my previous posts).
There are other locations with Java-related files: e.g. Javaconfig.plist (in /System/Library/Java), which seems to name the locations of all components of the Java machine. I checked some files in /usr/bin referenced in the .plist: again, no difference.
One thing "could" be a clue: just before the crash point, the Unico (Java) program produces a message in the Console:
"[JavaAppLauncher Warning] Specifying a specific version for JVMVersion 1.3.1 is deprecated. Use the more general 1.3* instead."
This would be a clue... except that this message also appears when the program loads without problems (on the "good" system where it works). So it is more of a warning.
Another clue is the message that appears in the Console on the "good" system when Unico loads correctly; this really seems to be an error message (whatever it means):
"Index 1 for 'pxm#' 2062 out of range (must be between 0 and 0)
Attempted to read past end of 'pxm#'(2062) resource"
The funny thing is that when this message appears, Unico loads with no problem, while on the "bad" system it doesn't even get to this point.
So in my opinion both messages are irrelevant to identifying the problem.
Or not?
And again my main question is: what in the system could suddenly block the loading of a Java application ?
Thanks for any hint
Piero
-
Ops Center 12c installation: at the last stage we are getting the error below:
INFO: starting new satellite SMF services
satadm: Enable HTML redirect
satadm: Disabling application/scn/ec-server-splashpage
/opt/sun/xvmoc/bin/svcadm: error while loading shared libraries: libz.so.1: cannot open shared object file: No such file or directory
satadm: ERROR: application/scn/ec-server-splashpage failed to stop
satadm: Starting Enterprise Controller with SMFlite...
svcadm: error while loading shared libraries: libz.so.1: cannot open shared object file: No such file or directory
satadm: ERROR: Error enabling Enterprise Controller run-time milestone svc:/application/scn/satellite-enable:default (rc=127)
Error: starting satellite services
Command: /opt/sun/xvmoc/bin/satadm start -v -w
Exit code: 1
Output:
Please fix the problem and then try this step again.
For a full log of the failed install see the file: /var/tmp/installer.log.14112.
t. Try this step again (correct the failure before proceeding)
x. Exit
Hi Valentin,
Does the copy error occur when you try to copy the files to the main OS that hosts the VM?
Another option is to get those files to the VM through an FTP server. You could use an FTP server installed on the main machine, or one on the network you are connected to, or even send them from your mail if you have an internet connection.
Joel Perez
Oracle Ace Member
DBA Oracle -
Hi guys,
I got an error while loading InfoObject attributes.
I created a characteristic InfoObject "equipment" with master data and text.
When I loaded the text it worked fine, but the problem happened when I started loading the attribute data.
The error is as given below:
Check Load from InfoSource D92_EQUIP , Packet infpackage for equipment -attrib
Please execute the mail for additional information.
Data records were selected in the PSA
Diagnosis
Data records were marked as incorrect in the PSA.
System response
If error messages exist for the data records, the corresponding data packets were not updated using the PSA.
Procedure
Go into the PSA maintenance for the request and read the logged messages.
You find the relevant messages for each record in the PSA table by double-clicking on the status of the record.
Where necessary, edit the incorrect records or remove the error in another way.
Post the incorrect data packets manually.
Note that you are only permitted to edit and manually update the request if it is not contained in the request list in the InfoCube (see InfoCube administration). Where necessary, first delete it from the InfoCube.
Any advice??? It says there is a problem with the data records, but I couldn't find any.
regards
abhi
Message was edited by:
Abhilash Muraleedharan
Hi Abhilash,
You can use the preview functionality on the external page to see whether the records are being pulled properly or not.
If not, there is some problem with the file that you need to fix, or it might be a setting you made, e.g. the delimiter selection in BW. Also check your routines if you are using any.
If the file is being pulled correctly, check the error message under the Details tab on the monitor screen. That will give you an idea of why the load is failing.
Hope that helps.
Regards.