Error while loading Reported Financial Data from Data Stream
Hi Guys,
I'm facing the following error while loading Reported Financial Data from Data Stream:
Message no. UCD1003: Item "Blank" is not defined in Cons Chart of Accts 01
The message appears in the target data: the Item field is not filled in almost 50% of the target records, which triggers the error.
Upon deeper analysis I found that some items are defined with a Debit/Credit sign of + and with no breakdown. When these items appear as negative (credit) amounts in the source data, they are not properly loaded to the target: the Item field is not filled, which causes the error.
For example, item "114190 - Prepayments" is defined with a + Debit/Credit sign. When it is posted as negative (credit) in the source data, it is not properly written to the target.
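To make the symptom concrete, here is a minimal sketch of the intended behavior: a sign-handling rule that keeps the item assignment and only adjusts the value when a credit posting hits a debit-signed item. This is illustrative only; the function, field names, and rule are assumptions, not the consolidation system's actual logic (which the OSS note below governs).

```python
# Illustrative sketch only: a hypothetical sign-handling rule for items
# defined with a fixed Debit/Credit sign. Field names are made up.

def normalize_amount(item_sign: str, amount: float) -> float:
    """Map a signed source amount onto an item with a fixed +/- convention.

    item_sign: '+' for debit-natured items, '-' for credit-natured items.
    A credit posting arrives as a negative amount; for a '+' item the
    loader should keep the item and store the negative value, rather
    than dropping the item assignment.
    """
    if item_sign == '+':
        return amount      # debit item: store as-is, value may be negative
    return -amount         # credit item: flip so credits come out positive

# Example: item 114190 (Prepayments, defined with '+') posted as a credit
record = {"item": "114190", "amount": -2500.0}
loaded = {"item": record["item"],
          "amount": normalize_amount('+', record["amount"])}
# The item stays filled; only the value carries the credit sign.
```

The point of the sketch is the expectation being violated in the thread: a credit posting should never leave the Item field blank, only change the stored sign.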
Do I need to define a breakdown category for these items? I think there's something wrong with the item definitions, or I'm missing something.
I would appreciate your quick assistance with this.
Kind regards,
Amir
Found the answer in OSS Note 642591.
Thanks
Similar Messages
Master data error while loading data
Dear Masters
I'm getting the error below while loading a delta update for master data (attributes) through an InfoPackage to PSA only. I'm on BI 7.0.
When I checked the error message, I found two errors:
1) Update mode R is not supported by the extraction API
2) Error in source system.
I checked the source system and it is fine; I'm also getting data in RSA3.
Error message from the source system
Diagnosis
An error occurred in the source system.
System Response
Caller 09 contains an error message.
Further analysis:
The error occurred in the Service API.
Refer to the error message.
Procedure
How you remove the error depends on the error message.
Note
If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the
Could you please help me to resolve this issue?
Process chains might fail if the DataSource is not in sync between the source system and the BW system. In such a case, go to 'Source Systems' in transaction RSA1, find the DataSource, right-click and select 'Replicate DataSources'. Once this is done, the transfer structure needs to be activated as well. To do this in production, run the report RS_TRANSTRU_ACTIVATE_ALL with the source system name and the InfoSource name; this will activate the transfer structure.
NOTE: For master data using direct upload, the InfoSource name is the name of the InfoObject itself. -
Getting Duplicate data Records error while loading the Master data.
Hi All,
We are getting a duplicate data records error while loading the profit centre master data. The master data contains time-dependent attributes.
The load is a direct update, so I set the request to red and tried to reload from PSA, but it throws the same error.
I checked the PSA: the records with the same profit centre are shown in red.
Could anyone give us suggestions to resolve this issue, please?
Thanks & Regards,
Raju
Hi Raju,
I assume there are no routines written in the update rules and that you are loading the data directly from R/3 (not from an ODS). If that is the case, it could be that the data maintained in R/3 has overlapping time intervals (since time-dependent attributes are involved). Check your PSA to see whether the same profit centre has time intervals that overlap; in that case, you need to get this fixed in R/3. If there are no overlapping time intervals, you can simply increase the error-tolerance limit in your InfoPackage and repeat the load.
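The overlap check described above can be sketched as a small script over the extracted PSA rows. This is a generic illustration, not an SAP tool: the tuple layout (profit centre key, date-from, date-to) is an assumption, and in practice you would export the PSA records and adapt the field names.

```python
# Minimal sketch: find overlapping validity intervals per profit centre
# in an extracted dataset. The (key, date_from, date_to) layout is an
# assumption; adapt to the actual PSA field names.
from datetime import date

def find_overlaps(rows):
    """Return (key, interval_a, interval_b) for each adjacent overlapping pair.

    rows: iterable of (key, date_from, date_to) tuples.
    """
    by_key = {}
    for key, d_from, d_to in rows:
        by_key.setdefault(key, []).append((d_from, d_to))
    overlaps = []
    for key, intervals in by_key.items():
        intervals.sort()  # order by date_from, then compare neighbours
        for (f1, t1), (f2, t2) in zip(intervals, intervals[1:]):
            if f2 <= t1:  # next interval starts before the previous ends
                overlaps.append((key, (f1, t1), (f2, t2)))
    return overlaps

rows = [
    ("PC1000", date(2013, 1, 1), date(2013, 6, 30)),
    ("PC1000", date(2013, 6, 1), date(9999, 12, 31)),  # overlaps June
    ("PC2000", date(2013, 1, 1), date(9999, 12, 31)),
]
print(find_overlaps(rows))  # PC1000's two intervals collide
```

Any key the script reports is a candidate for the duplicate-record error and should be corrected on the R/3 side before reloading.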
Hope this helps you.
Thanks & Regards,
Nithin Reddy. -
Getting an error while loading data from ERP Integrator to HFM
Hello,
We are getting the following error while loading the data from ERPI to HFM.
2013-12-31 22:44:54,133 INFO [AIF]: ERPI Process Start, Process ID: 300
2013-12-31 22:44:54,137 INFO [AIF]: ERPI Logging Level: DEBUG (5)
2013-12-31 22:44:54,139 INFO [AIF]: ERPI Log File: C:\Windows\TEMP\/aif_501_300.log
2013-12-31 22:44:54,141 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Java HotSpot(TM) 64-Bit Server VM (Oracle Corporation)]
2013-12-31 22:44:54,143 INFO [AIF]: Java Platform: java1.7.0_25
2013-12-31 22:44:56,821 INFO [AIF]: COMM Process Periods - Insert Periods - START
2013-12-31 22:44:56,828 DEBUG [AIF]:
SELECT l.SOURCE_LEDGER_ID
,l.SOURCE_LEDGER_NAME
,l.SOURCE_COA_ID
,l.CALENDAR_ID
,'0' SETID
,l.PERIOD_TYPE
,NULL LEDGER_TABLE_NAME
FROM AIF_BALANCE_RULES br
,AIF_COA_LEDGERS l
WHERE br.RULE_ID = 27
AND l.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = br.SOURCE_LEDGER_ID
2013-12-31 22:44:56,834 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_PERIODS (
PROCESS_ID
,PERIODKEY
,PERIOD_ID
,ADJUSTMENT_PERIOD_FLAG
,GL_PERIOD_YEAR
,GL_PERIOD_NUM
,GL_PERIOD_NAME
,GL_PERIOD_CODE
,GL_EFFECTIVE_PERIOD_NUM
,YEARTARGET
,PERIODTARGET
,IMP_ENTITY_TYPE
,IMP_ENTITY_ID
,IMP_ENTITY_NAME
,TRANS_ENTITY_TYPE
,TRANS_ENTITY_ID
,TRANS_ENTITY_NAME
,PRIOR_PERIOD_FLAG
,SOURCE_LEDGER_ID
SELECT DISTINCT brl.LOADID PROCESS_ID
,pp.PERIODKEY PERIODKEY
,prd.PERIOD_ID
,COALESCE(prd.ADJUSTMENT_PERIOD_FLAG, 'N') ADJUSTMENT_PERIOD_FLAG
,COALESCE(prd.YEAR,0) GL_PERIOD_YEAR
,COALESCE(prd.PERIOD_NUM,0) GL_PERIOD_NUM
,prd.PERIOD_NAME GL_PERIOD_NAME
,COALESCE(prd.PERIOD_CODE, CAST(COALESCE(prd.PERIOD_NUM,0) AS VARCHAR(38))) GL_PERIOD_CODE
,(COALESCE(prd.YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM,0)) GL_EFFECTIVE_PERIOD_NUM
,COALESCE(ppa.YEARTARGET, pp.YEARTARGET) YEARTARGET
,COALESCE(ppa.PERIODTARGET, pp.PERIODTARGET) PERIODTARGET
,'PROCESS_BAL_IMP' IMP_ENTITY_TYPE
,(COALESCE(prd.YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM,0)) IMP_ENTITY_ID
,prd.PERIOD_NAME IMP_ENTITY_NAME
,'PROCESS_BAL_TRANS' TRANS_ENTITY_TYPE
,(COALESCE(prd.YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM,0)) TRANS_ENTITY_ID
,pp.PERIODDESC TRANS_ENTITY_NAME
,'N' PRIOR_PERIOD_FLAG
,1 SOURCE_LEDGER_ID
FROM (
AIF_BAL_RULE_LOADS brl
INNER JOIN TPOVCATEGORY pc
ON pc.CATKEY = brl.CATKEY
INNER JOIN TPOVPERIOD_FLAT_V pp
ON pp.PERIODFREQ = pc.CATFREQ
AND pp.PERIODKEY >= brl.START_PERIODKEY
AND pp.PERIODKEY <= brl.END_PERIODKEY
LEFT OUTER JOIN TPOVPERIODADAPTOR_FLAT_V ppa
ON ppa.PERIODKEY = pp.PERIODKEY
AND ppa.PERIODFREQ = pp.PERIODFREQ
AND ppa.INTSYSTEMKEY = 'FMTEST2'
INNER JOIN AIF_GL_PERIODS_STG prd
ON prd.SOURCE_SYSTEM_ID = 3
AND prd.CALENDAR_ID IN ('10000')
AND prd.SETID = '0'
AND prd.PERIOD_TYPE = 'Month'
AND prd.ADJUSTMENT_PERIOD_FLAG = 'N'
AND prd.START_DATE > pp.PRIORPERIODKEY
AND prd.START_DATE <= pp.PERIODKEY
WHERE brl.LOADID = 300
ORDER BY pp.PERIODKEY
,GL_EFFECTIVE_PERIOD_NUM
2013-12-31 22:44:56,915 INFO [AIF]: COMM Process Periods - Insert Periods - END
2013-12-31 22:44:56,945 INFO [AIF]: COMM Process Periods - Insert Process Details - START
2013-12-31 22:44:56,952 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_DETAILS (
PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,TARGET_TABLE_NAME
,EXECUTION_START_TIME
,EXECUTION_END_TIME
,RECORDS_PROCESSED
,STATUS
,LAST_UPDATED_BY
,LAST_UPDATE_DATE
SELECT PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,'AIF_EBS_GL_BALANCES_STG' TARGET_TABLE_NAME
,CURRENT_TIMESTAMP EXECUTION_START_TIME
,NULL EXECUTION_END_TIME
,0 RECORDS_PROCESSED
,'PENDING' STATUS
,'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' LAST_UPDATED_BY
,CURRENT_TIMESTAMP LAST_UPDATE_DATE
FROM (
SELECT DISTINCT PROCESS_ID
,IMP_ENTITY_TYPE ENTITY_TYPE
,IMP_ENTITY_ID ENTITY_ID
,IMP_ENTITY_NAME ENTITY_NAME
,(COALESCE(SOURCE_LEDGER_ID,0) * 100000000 + GL_EFFECTIVE_PERIOD_NUM) ENTITY_NAME_ORDER
FROM AIF_PROCESS_PERIODS
WHERE PROCESS_ID = 300
) q
ORDER BY ENTITY_NAME_ORDER
2013-12-31 22:44:56,963 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_DETAILS (
PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,TARGET_TABLE_NAME
,EXECUTION_START_TIME
,EXECUTION_END_TIME
,RECORDS_PROCESSED
,STATUS
,LAST_UPDATED_BY
,LAST_UPDATE_DATE
SELECT PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,'TDATASEG' TARGET_TABLE_NAME
,CURRENT_TIMESTAMP EXECUTION_START_TIME
,NULL EXECUTION_END_TIME
,0 RECORDS_PROCESSED
,'PENDING' STATUS
,'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' LAST_UPDATED_BY
,CURRENT_TIMESTAMP LAST_UPDATE_DATE
FROM (
SELECT PROCESS_ID
,TRANS_ENTITY_TYPE ENTITY_TYPE
,MIN(TRANS_ENTITY_ID) ENTITY_ID
,TRANS_ENTITY_NAME ENTITY_NAME
,MIN(COALESCE(SOURCE_LEDGER_ID,0) * 100000000 + GL_EFFECTIVE_PERIOD_NUM) ENTITY_NAME_ORDER
FROM AIF_PROCESS_PERIODS
WHERE PROCESS_ID = 300
AND PRIOR_PERIOD_FLAG = 'N'
GROUP BY PROCESS_ID
,TRANS_ENTITY_TYPE
,TRANS_ENTITY_NAME
) q
ORDER BY ENTITY_NAME_ORDER
2013-12-31 22:44:56,970 INFO [AIF]: COMM Process Periods - Insert Process Details - END
2013-12-31 22:44:57,407 DEBUG [AIF]: EBS/FS GL Balances Print Variables - Printing Variables - START
2013-12-31 22:44:57,408 DEBUG [AIF]:
p_process_id: 300
p_sql_db_type: ORACLE
p_partitionkey: 12
p_rule_id: 27
p_source_system_id: 3
p_application_id: 26
p_target_application_type: HFM
p_is_multi_currency: true
p_data_load_method: CLASSIC_VIA_EPMI
p_bal_balance_method_code: STANDARD
p_bal_ledger_group_code: SINGLE
p_bal_amount_type: MONETARY
p_prd_entity_name: JAN-13
p_prd_period_id: 135
p_prd_gl_period_name: JAN-13
p_prd_source_ledger_id: 1
p_prd_source_coa_id: 101
p_source_ledger_id: 1
p_source_coa_id: 101
p_bal_actual_flag: A
p_bal_seg_column_name: SEGMENT1
p_max_ccid_loaded_to_stg: 148137
2013-12-31 22:44:57,408 DEBUG [AIF]: EBS/FS GL Balances Print Variables - Printing Variables - END
2013-12-31 22:44:57,806 INFO [AIF]: LKM EBS/FS Extract Type - Load Audit AND Full Refresh - START
2013-12-31 22:44:57,817 DEBUG [AIF]:
SELECT p.PROCESS_ID
,br.RULE_NAME
,l.SOURCE_LEDGER_NAME
FROM AIF_GL_LOAD_AUDIT aud
,AIF_PROCESSES p
,AIF_BALANCE_RULES br
,AIF_COA_LEDGERS l
WHERE aud.SOURCE_SYSTEM_ID = 3
AND aud.SOURCE_LEDGER_ID = 1
AND aud.GL_PERIOD_ID = 135
AND aud.BALANCE_TYPE = 'A'
AND p.PROCESS_ID = aud.LAST_LOADID
AND p.STATUS = 'RUNNING'
AND p.PROCESS_ID <> 300
AND br.RULE_ID = p.RULE_ID
AND l.SOURCE_SYSTEM_ID = aud.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = aud.SOURCE_LEDGER_ID
2013-12-31 22:44:57,826 DEBUG [AIF]:
SELECT 'Y' VALID_FLAG
FROM GL_PERIOD_STATUSES
WHERE APPLICATION_ID = 101
AND SET_OF_BOOKS_ID = 1
AND PERIOD_NAME = 'JAN-13'
AND CLOSING_STATUS IN ( 'O','C','P' )
2013-12-31 22:44:57,847 DEBUG [AIF]:
SELECT 'Y' EXISTS_FLAG
FROM GL_TRACK_DELTA_BALANCES
WHERE SET_OF_BOOKS_ID = 1
AND PROGRAM_CODE = 'FEM'
AND PERIOD_NAME = 'JAN-13'
AND ACTUAL_FLAG = 'A'
AND EXTRACT_LEVEL_CODE = 'DTL'
AND CURRENCY_TYPE_CODE = 'B'
AND ENABLED_FLAG = 'Y'
2013-12-31 22:44:57,883 DEBUG [AIF]:
SELECT MAX(DELTA_RUN_ID) MAX_DELTA_RUN_ID
FROM GL_BALANCES_DELTA
WHERE SET_OF_BOOKS_ID = 1
AND PERIOD_NAME = 'JAN-13'
AND ACTUAL_FLAG = 'A'
2013-12-31 22:44:57,898 DEBUG [AIF]:
SELECT brl.EXECUTION_MODE
,( SELECT CASE COUNT(aud.DELTA_RUN_ID) WHEN 0 THEN 'N' ELSE 'Y' END
FROM AIF_GL_LOAD_AUDIT aud
WHERE aud.SOURCE_SYSTEM_ID = 3
AND aud.SOURCE_LEDGER_ID = 1
AND aud.GL_PERIOD_ID = 135
AND aud.BALANCE_TYPE = 'A'
AND aud.LAST_LOADID <> brl.LOADID
AND COALESCE( aud.STATUS, 'SUCCESS' ) = 'SUCCESS'
) GL_LOAD_AUDIT_SUCCESS_FLAG
,( SELECT CASE COUNT(aud.DELTA_RUN_ID) WHEN 0 THEN 'N' ELSE 'Y' END
FROM AIF_GL_LOAD_AUDIT aud
WHERE aud.SOURCE_SYSTEM_ID = 3
AND aud.SOURCE_LEDGER_ID = 1
AND aud.GL_PERIOD_ID = 135
AND aud.BALANCE_TYPE = 'A'
AND aud.LAST_LOADID <> brl.LOADID
AND aud.STATUS = 'RUNNING'
) GL_LOAD_AUDIT_RUNNING_FLAG
FROM AIF_BAL_RULE_LOADS brl
WHERE brl.LOADID = 300
2013-12-31 22:44:57,904 DEBUG [AIF]:
INSERT INTO AIF_GL_LOAD_AUDIT ( LAST_LOADID
,DELTA_RUN_ID
,SOURCE_SYSTEM_ID
,SOURCE_LEDGER_ID
,GL_PERIOD_ID
,BALANCE_TYPE
,GL_EXTRACT_TYPE
,STATUS
) VALUES ( 300
,0
,3
,1
,135
,'A'
,'FULLREFRESH'
,'RUNNING'
2013-12-31 22:44:57,907 DEBUG [AIF]:
DELETE FROM AIF_EBS_GL_BALANCES_STG
WHERE SOURCE_SYSTEM_ID = 3
AND SOURCE_COA_ID = 101
AND SOURCE_LEDGER_ID = 1
AND ACTUAL_FLAG = 'A'
AND PERIOD_NAME = 'JAN-13'
2013-12-31 22:44:59,283 INFO [AIF]: LKM EBS/FS Extract Type - Load Audit AND Full Refresh - END
2013-12-31 22:45:06,507 INFO [AIF]: COMM End Process Detail - Update Process Detail - START
2013-12-31 22:45:06,514 DEBUG [AIF]:
UPDATE AIF_PROCESS_DETAILS
SET STATUS = 'SUCCESS'
,RECORDS_PROCESSED = CASE
WHEN RECORDS_PROCESSED IS NULL THEN 0
ELSE RECORDS_PROCESSED
END + 57408
,EXECUTION_END_TIME = CURRENT_TIMESTAMP
,LAST_UPDATED_BY = CASE
WHEN ('native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' IS NULL) THEN LAST_UPDATED_BY
ELSE 'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
END
,LAST_UPDATE_DATE = CURRENT_TIMESTAMP
WHERE PROCESS_ID = 300
AND ENTITY_TYPE = 'PROCESS_BAL_IMP'
AND ENTITY_NAME = 'JAN-13'
2013-12-31 22:45:06,519 INFO [AIF]: COMM End Process Detail - Update Process Detail - END
2013-12-31 22:45:07,106 INFO [AIF]: EBS/FS Load Data - Load TDATASEG_T - START
2013-12-31 22:45:07,112 INFO [AIF]:
Import Data from Source for Period 'January 2013'
2013-12-31 22:45:07,115 DEBUG [AIF]:
SELECT brl.PARTITIONKEY
,brl.CATKEY
,brl.PLAN_TYPE
,CASE brl.PLAN_TYPE
WHEN 'PLAN1' THEN 1
WHEN 'PLAN2' THEN 2
WHEN 'PLAN3' THEN 3
WHEN 'PLAN4' THEN 4
WHEN 'PLAN5' THEN 5
ELSE 0
END PLAN_NUMBER
,brl.EXECUTION_MODE
,br.AMOUNT_TYPE
,br.BALANCE_SELECTION
,br.CURRENCY_CODE
,br.INCL_ZERO_BALANCE_FLAG
,br.BAL_SEG_VALUE_OPTION_CODE
,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
FROM AIF_BAL_RULE_LOADS brl
,AIF_TARGET_APPLICATIONS app
,AIF_BALANCE_RULES br
WHERE brl.LOADID = 300
AND app.APPLICATION_ID = brl.APPLICATION_ID
AND br.RULE_ID = brl.RULE_ID
2013-12-31 22:45:07,120 DEBUG [AIF]:
SELECT PERIODKEY
FROM TPOVPERIOD
WHERE PERIODDESC = 'January 2013'
2013-12-31 22:45:07,122 INFO [AIF]:
Import Data from Source for Ledger 'JWR Books'
2013-12-31 22:45:07,125 DEBUG [AIF]:
SELECT COA_SEGMENT_NAME
,ACCOUNT_TYPE_FLAG
,BALANCE_TYPE_FLAG
FROM AIF_COA_SEGMENTS
WHERE SOURCE_SYSTEM_ID = 3
AND SOURCE_COA_ID = '101'
AND (
ACCOUNT_TYPE_FLAG = 'Y'
OR BALANCE_TYPE_FLAG = 'Y'
2013-12-31 22:45:07,127 DEBUG [AIF]:
SELECT adim.BALANCE_COLUMN_NAME DIMNAME
,adim.DIMENSION_ID
,dim.TARGET_DIMENSION_CLASS_NAME
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1
) COA_SEGMENT_NAME1
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2
) COA_SEGMENT_NAME2
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3
) COA_SEGMENT_NAME3
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4
) COA_SEGMENT_NAME4
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5
) COA_SEGMENT_NAME5
,(SELECT CASE mdd.ORPHAN_OPTION_CODE
WHEN 'CHILD' THEN 'N'
WHEN 'ROOT' THEN 'N'
ELSE 'Y'
END DIMENSION_FILTER_FLAG
FROM AIF_MAP_DIM_DETAILS_V mdd
,AIF_MAPPING_RULES mr
WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
AND mdd.RULE_ID = mr.RULE_ID
AND mdd.DIMENSION_ID = adim.DIMENSION_ID
) DIMENSION_FILTER_FLAG
,tiie.IMPCONCATCHAR
FROM TPOVPARTITION tpp
INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
ON adim.APPLICATION_ID = 26
INNER JOIN AIF_DIMENSIONS dim
ON dim.DIMENSION_ID = adim.DIMENSION_ID
LEFT OUTER JOIN TBHVIMPITEMERPI tiie
ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
AND tiie.IMPMAPTYPE = 'ERP'
WHERE tpp.PARTITIONKEY = 12
AND adim.BALANCE_COLUMN_NAME IS NOT NULL
ORDER BY adim.BALANCE_COLUMN_NAME
2013-12-31 22:45:07,154 DEBUG [AIF]:
INSERT INTO TDATASEG_T (
LOADID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,VALID_FLAG
,CHANGESIGN
,CODE_COMBINATION_ID
,SOURCE_LEDGER_ID
,GL_PERIOD_YEAR
,GL_PERIOD_NUM
,YEAR
,PERIOD
,ATTR1
,ATTR2
,ATTR3
,ATTR4
,ATTR5
,ATTR6
,ATTR7
,ATTR8
,ATTR9
,ATTR10
,ATTR11
,ATTR12
,ATTR13
,ATTR14
,ACCOUNT
,ACCOUNTX
,ENTITY
,ENTITYX
,ICP
,ICPX
,UD1
,UD1X
,UD2
,UD2X
,UD3
,UD3X
,UD4
,UD4X
,DATAVIEW
,DATAKEY
,STAT_BALANCE_FLAG
,CURKEY
,AMOUNT_PTD
,AMOUNT_YTD
,AMOUNT
,AMOUNTX
SELECT pprd.PROCESS_ID LOADID
,12 PARTITIONKEY
,4 CATKEY
,27 RULE_ID
,pprd.PERIODKEY
,'Y' VALID_FLAG
,0 CHANGESIGN
,ccid.CODE_COMBINATION_ID
,bal.SOURCE_LEDGER_ID
,pprd.GL_PERIOD_YEAR
,pprd.GL_PERIOD_NUM
,pprd.YEARTARGET YEAR
,pprd.PERIODTARGET PERIOD
,pprd.PROCESS_ID ATTR1
,bal.SOURCE_SYSTEM_ID ATTR2
,bal.SOURCE_LEDGER_ID ATTR3
,pprd.GL_PERIOD_YEAR ATTR4
,pprd.GL_PERIOD_NAME ATTR5
,bal.ACTUAL_FLAG ATTR6
,bal.BUDGET_VERSION_ID ATTR7
,bal.ENCUMBRANCE_TYPE_ID ATTR8
,ccid.ACCOUNT_TYPE ATTR9
,NULL ATTR10
,NULL ATTR11
,NULL ATTR12
,NULL ATTR13
,NULL ATTR14
,ccid.SEGMENT4 ACCOUNT
,NULL ACCOUNTX
,ccid.SEGMENT1 ENTITY
,NULL ENTITYX
,NULL ICP
,NULL ICPX
,ccid.SEGMENT5 UD1
,NULL UD1X
,ccid.SEGMENT3 UD2
,NULL UD2X
,ccid.SEGMENT6 UD3
,NULL UD3X
,ccid.SEGMENT2 UD4
,NULL UD4X
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('R','E','D','C') THEN 'Periodic' ELSE 'YTD' END ) DATAVIEW
,TDATASEG_DATAKEY_S.NEXTVAL
,'N' STAT_BALANCE_FLAG
,bal.CURRENCY_CODE CURKEY
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) * ( bal.PERIOD_NET_DR - bal.PERIOD_NET_CR ) AMOUNT_PTD
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) * ( bal.BEGIN_BALANCE_DR - bal.BEGIN_BALANCE_CR + bal.PERIOD_NET_DR - bal.PERIOD_NET_CR ) AMOUNT_YTD
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) *
( CASE
WHEN ccid.ACCOUNT_TYPE IN ('R','E','D','C') THEN ( bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
ELSE ( bal.BEGIN_BALANCE_DR - bal.BEGIN_BALANCE_CR + bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
END
AMOUNT
,( CASE WHEN ccid.ACCOUNT_TYPE IN ('A','E','D') THEN 1 ELSE -1 END ) *
( CASE
WHEN ccid.ACCOUNT_TYPE IN ('R','E','D','C') THEN ( bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
ELSE ( bal.BEGIN_BALANCE_DR - bal.BEGIN_BALANCE_CR + bal.PERIOD_NET_DR - bal.PERIOD_NET_CR )
END
AMOUNTX
FROM AIF_EBS_GL_BALANCES_STG_V bal
,AIF_EBS_GL_CCID_STG ccid
,AIF_PROCESS_PERIODS pprd
WHERE bal.SOURCE_SYSTEM_ID = 3
AND bal.SOURCE_LEDGER_ID = 1
AND bal.ACTUAL_FLAG = 'A'
AND ccid.SOURCE_SYSTEM_ID = bal.SOURCE_SYSTEM_ID
AND ccid.SOURCE_COA_ID = bal.SOURCE_COA_ID
AND ccid.CODE_COMBINATION_ID = bal.CODE_COMBINATION_ID
AND pprd.PROCESS_ID = 300
AND pprd.PERIODKEY = '2013-01-01'
AND pprd.SOURCE_LEDGER_ID = bal.SOURCE_LEDGER_ID
AND pprd.GL_PERIOD_NAME = bal.PERIOD_NAME
AND (
bal.BEGIN_BALANCE_DR <> 0
OR bal.BEGIN_BALANCE_CR <> 0
OR bal.PERIOD_NET_DR <> 0
OR bal.PERIOD_NET_CR <> 0
AND bal.CURRENCY_CODE <> 'STAT'
AND bal.TRANSLATED_FLAG IS NULL
2013-12-31 22:45:09,269 INFO [AIF]: Monetary Data Rows Imported from Source: 12590
2013-12-31 22:45:09,293 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_AUDIT (
LOADID
,TARGET_APPLICATION_TYPE
,TARGET_APPLICATION_NAME
,PLAN_TYPE
,SOURCE_LEDGER_ID
,EPM_YEAR
,EPM_PERIOD
,SNAPSHOT_FLAG
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,EXPORT_TO_TARGET_FLAG
SELECT DISTINCT PROCESS_ID LOADID
,'HFM' TARGET_APPLICATION_TYPE
,'FMTEST2' TARGET_APPLICATION_NAME
,NULL PLAN_TYPE
,SOURCE_LEDGER_ID
,YEARTARGET EPM_YEAR
,PERIODTARGET EPM_PERIOD
,'Y' SNAPSHOT_FLAG
,12 PARTITIONKEY
,4 CATKEY
,27 RULE_ID
,PERIODKEY
,'N' EXPORT_TO_TARGET_FLAG
FROM AIF_PROCESS_PERIODS
WHERE PROCESS_ID = 300
AND PERIODKEY = '2013-01-01'
AND SOURCE_LEDGER_ID = 1
2013-12-31 22:45:09,297 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_PRD_AUDIT (
LOADID
,SOURCE_LEDGER_ID
,GL_PERIOD_ID
,DELTA_RUN_ID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
SELECT DISTINCT pprd.PROCESS_ID LOADID
,pprd.SOURCE_LEDGER_ID
,pprd.PERIOD_ID GL_PERIOD_ID
,(SELECT MAX(gl.DELTA_RUN_ID)
FROM AIF_GL_LOAD_AUDIT gl
WHERE gl.SOURCE_SYSTEM_ID = 3
AND gl.SOURCE_LEDGER_ID = pprd.SOURCE_LEDGER_ID
AND gl.BALANCE_TYPE = 'A'
AND gl.GL_PERIOD_ID = pprd.PERIOD_ID
) DELTA_RUN_ID
,12 PARTITIONKEY
,4 CATKEY
,27 RULE_ID
,pprd.PERIODKEY
FROM AIF_PROCESS_PERIODS pprd
WHERE pprd.PROCESS_ID = 300
AND pprd.PERIODKEY = '2013-01-01'
AND pprd.SOURCE_LEDGER_ID = 1
2013-12-31 22:45:09,302 INFO [AIF]:
Total Data Rows Imported from Source: 12590
2013-12-31 22:45:09,305 INFO [AIF]: EBS/FS Load Data - Load TDATASEG_T - END
2013-12-31 22:45:09,381 INFO [AIF]: COMM Update Data - Init DataLoadUtil - START
2013-12-31 22:45:09,385 INFO [AIF]: COMM Update Data - Init DataLoadUtil - END
2013-12-31 22:45:09,485 INFO [AIF]: COMM Update Data - Update TDATASEG_T/TDATASEGW - START
2013-12-31 22:45:09,491 DEBUG [AIF]:
DELETE FROM TDATASEG_T
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND AMOUNT = 0
2013-12-31 22:45:09,559 WARN [AIF]:
Warning: Data rows with zero balances exist
2013-12-31 22:45:09,636 INFO [AIF]: Zero Balance Data Rows Deleted: 1879
2013-12-31 22:45:09,654 DEBUG [AIF]:
SELECT DIMNAME
,CASE WHEN RULE_ID IS NULL THEN 'N' ELSE 'Y' END RULE_MAP_FLAG
,SRCKEY
,TARGKEY
,WHERECLAUSETYPE
,WHERECLAUSEVALUE
,CHANGESIGN
,SEQUENCE
,DATAKEY
,MAPPING_TYPE
FROM (
SELECT DISTINCT tdm.DIMNAME
,tdm.RULE_ID
,NULL SRCKEY
,NULL TARGKEY
,tdm.WHERECLAUSETYPE
,tdm.WHERECLAUSEVALUE
,NULL CHANGESIGN
,1 SEQUENCE
,COALESCE(tdm.SYSTEM_GENERATED_FLAG,'N') SYSTEM_GENERATED_FLAG
,NULL DATAKEY
,CASE
WHEN tdm.WHERECLAUSETYPE IS NULL THEN 1
ELSE 3
END MAPPING_TYPE
FROM TDATAMAP_T tdm
WHERE tdm.LOADID = 300
AND tdm.PARTITIONKEY = 12
AND tdm.TDATAMAPTYPE = 'ERP'
AND (tdm.RULE_ID IS NULL OR tdm.RULE_ID = 27)
AND tdm.WHERECLAUSETYPE IS NULL
UNION ALL
SELECT tdm.DIMNAME
,tdm.RULE_ID
,tdm.SRCKEY
,tdm.TARGKEY
,tdm.WHERECLAUSETYPE
,tdm.WHERECLAUSEVALUE
,tdm.CHANGESIGN
,CASE tpp.PARTSEQMAP
WHEN 0 THEN CASE
WHEN (tdm.WHERECLAUSETYPE = 'MULTIDIM') THEN 2
WHEN (tdm.WHERECLAUSETYPE = 'BETWEEN') THEN 3
WHEN (tdm.WHERECLAUSETYPE = 'LIKE') THEN 4
ELSE 0
END
ELSE tdm.SEQUENCE
END SEQUENCE
,COALESCE(tdm.SYSTEM_GENERATED_FLAG,'N') SYSTEM_GENERATED_FLAG
,tdm.DATAKEY
,CASE
WHEN tdm.WHERECLAUSETYPE IS NULL THEN 1
ELSE 3
END MAPPING_TYPE
FROM TDATAMAP_T tdm
INNER JOIN TPOVPARTITION tpp
ON tpp.PARTITIONKEY = tdm.PARTITIONKEY
WHERE tdm.LOADID = 300
AND tdm.PARTITIONKEY = 12
AND tdm.TDATAMAPTYPE = 'ERP'
AND (tdm.RULE_ID IS NULL OR tdm.RULE_ID = 27)
AND tdm.WHERECLAUSETYPE IN ('MULTIDIM','BETWEEN','LIKE')
) q
ORDER BY DIMNAME
,RULE_ID
,SEQUENCE
,SYSTEM_GENERATED_FLAG
,SRCKEY
2013-12-31 22:45:09,672 INFO [AIF]:
Processing Mappings for Column 'ACCOUNT'
2013-12-31 22:45:09,677 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ACCOUNTX = ACCOUNT
,ACCOUNTR = 121
,ACCOUNTF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '*' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ACCOUNTX IS NULL
AND (1=1)
2013-12-31 22:45:10,044 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:10,053 INFO [AIF]:
Processing Mappings for Column 'ENTITY'
2013-12-31 22:45:10,057 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ENTITYX = ENTITY
,ENTITYR = 122
,ENTITYF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '*' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ENTITYX IS NULL
AND (1=1)
2013-12-31 22:45:10,426 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:10,435 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ENTITYX = ENTITY
,ENTITYR = 132
,ENTITYF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '*' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ENTITYX IS NULL
AND (1=1)
2013-12-31 22:45:10,446 INFO [AIF]: Data Rows Updated by Location Mapping 'DEFAULT' (LIKE): 0
2013-12-31 22:45:10,448 INFO [AIF]:
Processing Mappings for Column 'ICP'
2013-12-31 22:45:10,452 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ICPX = '[ICP None]'
,ICPR = 130
,ICPF = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[ICP None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND ICPX IS NULL
AND (1=1)
2013-12-31 22:45:10,784 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:10,798 INFO [AIF]:
Processing Mappings for Column 'UD1'
2013-12-31 22:45:10,802 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD1X = '[None]'
,UD1R = 124
,UD1F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD1X IS NULL
AND (1=1)
2013-12-31 22:45:11,134 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:11,156 INFO [AIF]:
Processing Mappings for Column 'UD2'
2013-12-31 22:45:11,160 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD2X = '[None]'
,UD2R = 123
,UD2F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD2X IS NULL
AND (1=1)
2013-12-31 22:45:11,517 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:11,531 INFO [AIF]:
Processing Mappings for Column 'UD3'
2013-12-31 22:45:11,535 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD3X = '[None]'
,UD3R = 125
,UD3F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD3X IS NULL
AND (1=1)
2013-12-31 22:45:11,870 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:11,883 INFO [AIF]:
Processing Mappings for Column 'UD4'
2013-12-31 22:45:11,887 DEBUG [AIF]:
UPDATE TDATASEG_T
SET UD4X = '[None]'
,UD4R = 128
,UD4F = 3
,AMOUNTX = AMOUNTX * 1
,CHANGESIGN = CASE 1
WHEN -1 THEN CASE CHANGESIGN
WHEN 1 THEN 0
WHEN 0 THEN 1
ELSE CHANGESIGN
END
ELSE CHANGESIGN
END
,VALID_FLAG = CASE '[None]' WHEN 'IGNORE' THEN 'I' ELSE VALID_FLAG END
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND UD4X IS NULL
AND (1=1)
2013-12-31 22:45:12,192 INFO [AIF]: Data Rows Updated by Location Mapping 'Like' (LIKE): 10711
2013-12-31 22:45:12,204 DEBUG [AIF]:
UPDATE TDATASEG_T
SET ATTR14 = DATAKEY
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:12,739 DEBUG [AIF]:
UPDATE TDATASEG_T
SET VALID_FLAG = 'N'
WHERE 1=1
AND (
(1=0)
OR TDATASEG_T.ACCOUNTX IS NULL
OR TDATASEG_T.ENTITYX IS NULL
OR TDATASEG_T.ICPX IS NULL
OR TDATASEG_T.UD1X IS NULL
OR TDATASEG_T.UD2X IS NULL
OR TDATASEG_T.UD3X IS NULL
OR TDATASEG_T.UD4X IS NULL
AND LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND VALID_FLAG = 'Y'
2013-12-31 22:45:12,754 INFO [AIF]:
Total Data Rows available for Export to Target: 10711
2013-12-31 22:45:12,773 INFO [AIF]: COMM Update Data - Update TDATASEG_T/TDATASEGW - END
2013-12-31 22:45:12,808 INFO [AIF]: COMM End Process Detail - Update Process Detail - START
2013-12-31 22:45:12,823 DEBUG [AIF]:
UPDATE AIF_PROCESS_DETAILS
SET STATUS = 'SUCCESS'
,RECORDS_PROCESSED = CASE
WHEN RECORDS_PROCESSED IS NULL THEN 0
ELSE RECORDS_PROCESSED
END + 10711
,EXECUTION_END_TIME = CURRENT_TIMESTAMP
,LAST_UPDATED_BY = CASE
WHEN ('native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' IS NULL) THEN LAST_UPDATED_BY
ELSE 'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
END
,LAST_UPDATE_DATE = CURRENT_TIMESTAMP
WHERE PROCESS_ID = 300
AND ENTITY_TYPE = 'PROCESS_BAL_TRANS'
AND ENTITY_NAME = 'January 2013'
2013-12-31 22:45:12,829 INFO [AIF]: COMM End Process Detail - Update Process Detail - END
2013-12-31 22:45:12,987 INFO [AIF]: COMM Update YTD Amounts - Update YTD Amounts - START
2013-12-31 22:45:12,993 DEBUG [AIF]:
SELECT brl.PARTITIONKEY
,brl.CATKEY
,pprd.YEARTARGET
,pprd.PERIODTARGET
,pprd.SOURCE_LEDGER_ID
FROM AIF_BAL_RULE_LOADS brl
,AIF_PROCESS_PERIODS pprd
WHERE brl.LOADID = 300
AND pprd.PROCESS_ID = brl.LOADID
GROUP BY brl.PARTITIONKEY
,brl.CATKEY
,pprd.YEARTARGET
,pprd.PERIODTARGET
,pprd.SOURCE_LEDGER_ID
HAVING COUNT(*) > 1
2013-12-31 22:45:12,995 INFO [AIF]: COMM Update YTD Amounts - Update YTD Amounts - END
2013-12-31 22:45:13,052 INFO [AIF]: COMM Load TDATAMAPSEG/TDATASEG - Load TDATAMAPSEG/TDATASEG - START
2013-12-31 22:45:13,057 DEBUG [AIF]:
SELECT brl.PARTITIONKEY
,brl.CATKEY
,brl.EXECUTION_MODE
FROM AIF_BAL_RULE_LOADS brl
WHERE brl.LOADID = 300
2013-12-31 22:45:13,059 DEBUG [AIF]:
SELECT PERIODKEY
FROM AIF_APPL_LOAD_AUDIT
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND RULE_ID = 27
ORDER BY PERIODKEY
2013-12-31 22:45:13,061 INFO [AIF]:
Processing Data for PeriodKey '2013-01-01'
2013-12-31 22:45:13,065 DEBUG [AIF]:
DELETE FROM TDATAMAPSEG
WHERE PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
AND (
TDATAMAPTYPE = 'ERP'
OR (
TDATAMAPTYPE = 'MULTIDIM'
AND EXISTS (
SELECT 1
FROM TDATAMAPSEG parent
WHERE parent.PARTITIONKEY = TDATAMAPSEG.PARTITIONKEY
AND parent.DATAKEY = TDATAMAPSEG.TARGKEY
AND parent.CATKEY = TDATAMAPSEG.CATKEY
AND parent.PERIODKEY = TDATAMAPSEG.PERIODKEY
AND parent.TDATAMAPTYPE = 'ERP'
2013-12-31 22:45:13,074 INFO [AIF]: Number of Rows deleted from TDATAMAPSEG: 8
2013-12-31 22:45:13,077 DEBUG [AIF]:
INSERT INTO TDATAMAPSEG (
DATAKEY
,PARTITIONKEY
,CATKEY
,PERIODKEY
,DIMNAME
,SRCKEY
,SRCDESC
,TARGKEY
,WHERECLAUSETYPE
,WHERECLAUSEVALUE
,CHANGESIGN
,SEQUENCE
,VBSCRIPT
,TDATAMAPTYPE
,SYSTEM_GENERATED_FLAG
SELECT DATAKEY
,PARTITIONKEY
,4
,'2013-01-01'
,DIMNAME
,SRCKEY
,SRCDESC
,TARGKEY
,WHERECLAUSETYPE
,WHERECLAUSEVALUE
,CHANGESIGN
,SEQUENCE
,VBSCRIPT
,TDATAMAPTYPE
,SYSTEM_GENERATED_FLAG
FROM TDATAMAP_T
WHERE LOADID = 300
2013-12-31 22:45:13,081 INFO [AIF]: Number of Rows inserted into TDATAMAPSEG: 8
2013-12-31 22:45:13,083 DEBUG [AIF]:
DELETE FROM TDATASEG
WHERE LOADID < 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND RULE_ID = 27
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:15,659 INFO [AIF]: Number of Rows deleted from TDATASEG: 10711
2013-12-31 22:45:15,728 DEBUG [AIF]:
INSERT INTO TDATASEG (
DATAKEY
,PARTITIONKEY
,CATKEY
,PERIODKEY
,CURKEY
,DATAVIEW
,CALCACCTTYPE
,CHANGESIGN
,JOURNALID
,AMOUNT
,AMOUNTX
,AMOUNT_PTD
,AMOUNT_YTD
,DESC1
,DESC2
,ACCOUNT
,ACCOUNTX
,ACCOUNTR
,ACCOUNTF
,ENTITY
,ENTITYX
,ENTITYR
,ENTITYF
,ICP
,ICPX
,ICPR
,ICPF
,UD1
,UD1X
,UD1R
,UD1F
,UD2
,UD2X
,UD2R
,UD2F
,UD3
,UD3X
,UD3R
,UD3F
,UD4
,UD4X
,UD4R
,UD4F
,UD5
,UD5X
,UD5R
,UD5F
,UD6
,UD6X
,UD6R
,UD6F
,UD7
,UD7X
,UD7R
,UD7F
,UD8
,UD8X
,UD8R
,UD8F
,UD9
,UD9X
,UD9R
,UD9F
,UD10
,UD10X
,UD10R
,UD10F
,UD11
,UD11X
,UD11R
,UD11F
,UD12
,UD12X
,UD12R
,UD12F
,UD13
,UD13X
,UD13R
,UD13F
,UD14
,UD14X
,UD14R
,UD14F
,UD15
,UD15X
,UD15R
,UD15F
,UD16
,UD16X
,UD16R
,UD16F
,UD17
,UD17X
,UD17R
,UD17F
,UD18
,UD18X
,UD18R
,UD18F
,UD19
,UD19X
,UD19R
,UD19F
,UD20
,UD20X
,UD20R
,UD20F
,ATTR1
,ATTR2
,ATTR3
,ATTR4
,ATTR5
,ATTR6
,ATTR7
,ATTR8
,ATTR9
,ATTR10
,ATTR11
,ATTR12
,ATTR13
,ATTR14
,ARCHIVEID
,HASMEMOITEM
,STATICDATAKEY
,LOADID
,RULE_ID,
CODE_COMBINATION_ID
,STAT_BALANCE_FLAG
,VALID_FLAG
SELECT
DATAKEY
,PARTITIONKEY
,CATKEY
,PERIODKEY
,CURKEY
,DATAVIEW
,CALCACCTTYPE
,CHANGESIGN
,JOURNALID
,AMOUNT
,AMOUNTX
,AMOUNT_PTD
,AMOUNT_YTD
,DESC1
,DESC2
,ACCOUNT
,ACCOUNTX
,ACCOUNTR
,ACCOUNTF
,ENTITY
,ENTITYX
,ENTITYR
,ENTITYF
,ICP
,ICPX
,ICPR
,ICPF
,UD1
,UD1X
,UD1R
,UD1F
,UD2
,UD2X
,UD2R
,UD2F
,UD3
,UD3X
,UD3R
,UD3F
,UD4
,UD4X
,UD4R
,UD4F
,UD5
,UD5X
,UD5R
,UD5F
,UD6
,UD6X
,UD6R
,UD6F
,UD7
,UD7X
,UD7R
,UD7F
,UD8
,UD8X
,UD8R
,UD8F
,UD9
,UD9X
,UD9R
,UD9F
,UD10
,UD10X
,UD10R
,UD10F
,UD11
,UD11X
,UD11R
,UD11F
,UD12
,UD12X
,UD12R
,UD12F
,UD13
,UD13X
,UD13R
,UD13F
,UD14
,UD14X
,UD14R
,UD14F
,UD15
,UD15X
,UD15R
,UD15F
,UD16
,UD16X
,UD16R
,UD16F
,UD17
,UD17X
,UD17R
,UD17F
,UD18
,UD18X
,UD18R
,UD18F
,UD19
,UD19X
,UD19R
,UD19F
,UD20
,UD20X
,UD20R
,UD20F
,ATTR1
,ATTR2
,ATTR3
,ATTR4
,ATTR5
,ATTR6
,ATTR7
,ATTR8
,ATTR9
,ATTR10
,ATTR11
,ATTR12
,ATTR13
,ATTR14
,ARCHIVEID
,HASMEMOITEM
,STATICDATAKEY
,LOADID
,RULE_ID,
CODE_COMBINATION_ID
,STAT_BALANCE_FLAG
,VALID_FLAG
FROM TDATASEG_T
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:16,838 INFO [AIF]: Number of Rows inserted into TDATASEG: 10711
2013-12-31 22:45:16,858 DEBUG [AIF]:
DELETE FROM TDATASEG_T
WHERE LOADID = 300
AND PARTITIONKEY = 12
AND CATKEY = 4
AND PERIODKEY = '2013-01-01'
2013-12-31 22:45:17,123 INFO [AIF]: Number of Rows deleted from TDATASEG_T: 10711
2013-12-31 22:45:17,153 DEBUG [AIF]:
DELETE FROM TDATAMAP_T
WHERE LOADID = 300
2013-12-31 22:45:17,156 INFO [AIF]: Number of Rows deleted from TDATAMAP_T: 8
2013-12-31 22:45:17,161 INFO [AIF]: COMM Load TDATAMAPSEG/TDATASEG - Load TDATAMAPSEG/TDATASEG - END
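The log above shows FDMEE's stage-then-swap pattern: mapped rows are written to the temporary table TDATASEG_T, stale rows for the same load slice are deleted from TDATASEG, the staged rows are copied across, and the temporary rows are cleared. A minimal sketch with sqlite3, using an abbreviated column set (the real tables carry far more columns):

```python
import sqlite3

# Stage-then-swap sketch: simplified TDATASEG / TDATASEG_T with only a
# few of the real columns, for illustration only.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE TDATASEG   (LOADID INT, PARTITIONKEY INT, CATKEY INT,
                             PERIODKEY TEXT, ACCOUNT TEXT, AMOUNT REAL);
    CREATE TABLE TDATASEG_T (LOADID INT, PARTITIONKEY INT, CATKEY INT,
                             PERIODKEY TEXT, ACCOUNT TEXT, AMOUNT REAL);
""")
key = (300, 12, 4, "2013-01-01")

# a prior load left stale rows in TDATASEG
con.execute("INSERT INTO TDATASEG VALUES (299, 12, 4, '2013-01-01', '114190', 50.0)")
# the current load staged fresh rows in TDATASEG_T
con.execute("INSERT INTO TDATASEG_T VALUES (300, 12, 4, '2013-01-01', '114190', 75.0)")

# 1. delete earlier loads for the same slice
con.execute("""DELETE FROM TDATASEG
               WHERE LOADID < ? AND PARTITIONKEY = ? AND CATKEY = ? AND PERIODKEY = ?""",
            key)
# 2. copy the staged rows across
con.execute("""INSERT INTO TDATASEG
               SELECT * FROM TDATASEG_T
               WHERE LOADID = ? AND PARTITIONKEY = ? AND CATKEY = ? AND PERIODKEY = ?""",
            key)
# 3. clear the staging table for this load
con.execute("DELETE FROM TDATASEG_T WHERE LOADID = ?", (300,))

rows = con.execute("SELECT LOADID, AMOUNT FROM TDATASEG").fetchall()
print(rows)  # only the fresh load remains
```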
2013-12-31 22:45:17,993 DEBUG [AIF]:
SELECT CASE app.METADATA_LOAD_METHOD
WHEN 'EPMA' THEN CASE dim.TARGET_DIMENSION_CLASS_NAME
WHEN 'Generic' THEN dim.TARGET_DIMENSION_NAME
ELSE dim.TARGET_DIMENSION_CLASS_NAME
END
ELSE dim.TARGET_DIMENSION_NAME
END TARGET_DIMENSION_NAME
,adim.BALANCE_COLUMN_NAME
FROM AIF_TARGET_APPLICATIONS app
,AIF_TARGET_APPL_DIMENSIONS adim
,AIF_DIMENSIONS dim
WHERE app.APPLICATION_ID = 26
AND adim.APPLICATION_ID = app.APPLICATION_ID
AND dim.DIMENSION_ID = adim.DIMENSION_ID
AND dim.TARGET_DIMENSION_CLASS_NAME IN ('Custom1','Custom2','Custom3','Custom4','Generic')
2013-12-31 22:45:17,997 DEBUG [AIF]:
SELECT SCENARIO "Scenario"
,YEAR "Year"
,PERIOD "Period"
,DATAVIEW "View"
,DATAVALUE "Value"
,ACCOUNT "Account"
,ENTITY "Entity"
,ICP "ICP"
,UD2 "Area"
,UD1 "Tail"
,UD3 "Special"
,UD4 "Facility"
,AMOUNT "DataValue"
FROM AIF_HS_BALANCES
WHERE LOADID = 300
2013-12-31 22:45:18,000 INFO [SimpleAsyncTaskExecutor-9]: ODI Hyperion Financial Management Adapter
2013-12-31 22:45:18,002 INFO [SimpleAsyncTaskExecutor-9]: Load task initialized.
2013-12-31 22:45:18,028 INFO [AIF]: LKM COMM Load Data into HFM - Load Data to HFM - START
2013-12-31 22:45:18,031 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_DETAILS (
PROCESS_ID
,ENTITY_TYPE
,ENTITY_ID
,ENTITY_NAME
,ENTITY_NAME_ORDER
,TARGET_TABLE_NAME
,EXECUTION_START_TIME
,EXECUTION_END_TIME
,RECORDS_PROCESSED
,STATUS
,LAST_UPDATED_BY
,LAST_UPDATE_DATE
) VALUES (
300
,'PROCESS_BAL_EXP_HFM'
,NULL
,'FMTEST2'
,NULL
,NULL
,CURRENT_TIMESTAMP
,NULL
,NULL
,'RUNNING'
,'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
,CURRENT_TIMESTAMP
2013-12-31 22:45:18,034 INFO [SimpleAsyncTaskExecutor-9]: Connecting to Financial Management application [FMTEST2] on [10.150.20.40] using user-name [admin].
2013-12-31 22:45:18,155 INFO [SimpleAsyncTaskExecutor-9]: Connected to Financial Management application.
2013-12-31 22:45:18,157 INFO [SimpleAsyncTaskExecutor-9]: HFM Version: 11.1.2.2.300.
2013-12-31 22:45:18,160 INFO [SimpleAsyncTaskExecutor-9]: Options for the Financial Management load task are:
<Options>
<Option name=LOG_FILE_NAME value=C:\Windows\TEMP\/aif_501_300.log/>
<Option name=IMPORT_MODE value=Replace/>
<Option name=CONSOLIDATE_ONLY value=false/>
<Option name=CONSOLIDATE_PARAMETERS value=""/>
<Option name=LOG_ENABLED value=true/>
<Option name=ACCUMULATE_WITHIN_FILE value=false/>
<Option name=DEBUG_ENABLED value=true/>
<Option name=CONSOLIDATE_AFTER_LOAD value=false/>
<Option name=FILE_CONTAINS_SHARE_DATA value=false/>
</Options>
2013-12-31 22:45:18,168 INFO [SimpleAsyncTaskExecutor-9]: Load Options validated.
2013-12-31 22:45:18,176 ERROR [SimpleAsyncTaskExecutor-9]: Error occurred during load process ORA-00904: "DATAVALUE": invalid identifier
com.hyperion.odi.common.ODIHAppException: ORA-00904: "DATAVALUE": invalid identifier
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:216)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java:175)
at org.python.core.PyObject.__call__(PyObject.java:355)
at org.python.core.PyMethod.__call__(PyMethod.java:215)
at org.python.core.PyMethod.instancemethod___call__(PyMethod.java:221)
at org.python.core.PyMethod.__call__(PyMethod.java:206)
at org.python.core.PyObject.__call__(PyObject.java:397)
at org.python.core.PyObject.__call__(PyObject.java:401)
at org.python.pycode._pyx161.f$0(<string>:98)
at org.python.pycode._pyx161.call_function(<string>)
at org.python.core.PyTableCode.call(PyTableCode.java:165)
at org.python.core.PyCode.call(PyCode.java:18)
at org.python.core.Py.runCode(Py.java:1204)
at org.python.core.Py.exec(Py.java:1248)
at org.python.util.PythonInterpreter.exec(PythonInterpreter.java:172)
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:144)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:322)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:170)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2472)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:47)
at oracle.odi.runtime.agent.execution.cmd.ScriptingExecutor.execute(ScriptingExecutor.java:1)
at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2913)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2625)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:558)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:464)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:2093)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1889)
at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$2.doAction(StartScenRequestProcessor.java:580)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:216)
at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor.doProcessStartScenTask(StartScenRequestProcessor.java:513)
at oracle.odi.runtime.agent.processor.impl.StartScenRequestProcessor$StartScenTask.doExecute(StartScenRequestProcessor.java:1066)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:126)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:724)
Caused by: java.sql.SQLSyntaxErrorException: ORA-00904: "DATAVALUE": invalid identifier
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:462)
at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)
at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:931)
at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:481)
at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:205)
at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:548)
at oracle.jdbc.driver.T4CStatement.doOall8(T4CStatement.java:202)
at oracle.jdbc.driver.T4CStatement.executeForDescribe(T4CStatement.java:942)
at oracle.jdbc.driver.OracleStatement.executeMaybeDescribe(OracleStatement.java:1283)
at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1441)
at oracle.jdbc.driver.OracleStatement.executeQuery(OracleStatement.java:1690)
at oracle.jdbc.driver.OracleStatementWrapper.executeQuery(OracleStatementWrapper.java:446)
at com.hyperion.odi.hfm.ODIHFMAppWriter.loadData(ODIHFMAppWriter.java:212)
... 38 more
2013-12-31 22:45:18,208 DEBUG [AIF]:
UPDATE AIF_PROCESS_DETAILS
SET STATUS = 'FAILED'
,RECORDS_PROCESSED = CASE
WHEN RECORDS_PROCESSED IS NULL THEN 0
ELSE RECORDS_PROCESSED
END + 0
,EXECUTION_END_TIME = CURRENT_TIMESTAMP
,LAST_UPDATED_BY = CASE
WHEN ('native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER' IS NULL) THEN LAST_UPDATED_BY
ELSE 'native://DN=cn=911,ou=People,dc=css,dc=hyperion,dc=com?USER'
END
,LAST_UPDATE_DATE = CURRENT_TIMESTAMP
WHERE PROCESS_ID = 300
AND ENTITY_TYPE = 'PROCESS_BAL_EXP_HFM'
AND ENTITY_NAME = 'FMTEST2'
2013-12-31 22:45:18,210 FATAL [AIF]: Error in LKM C
The issue is that you are mapping "Data Value" to amount in the Target Application import format:
SELECT SCENARIO "Scenario"
,YEAR "Year"
,PERIOD "Period"
,DATAVIEW "View"
,DATAVALUE "Value"
,ACCOUNT "Account"
,ENTITY "Entity"
,ICP "ICP"
,UD2 "Area"
,UD1 "Tail"
,UD3 "Special"
,UD4 "Facility"
,AMOUNT "DataValue"
FROM AIF_HS_BALANCES
WHERE LOADID = 300
You need to map AMOUNT to "AMOUNT" in the HFM application. Check that the dimension mapping is correct for the class in the target application and that your import format points to the proper target dimension (Amount).
-
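For illustration only, the failure mode can be reproduced with sqlite3 standing in for Oracle: the generated import-format query references a column (DATAVALUE) that does not exist on the source view, so the database rejects the statement before any row is read. The abbreviated table layout below is an assumption; sqlite reports "no such column" where Oracle reports ORA-00904.

```python
import sqlite3

# Simplified AIF_HS_BALANCES with only a few columns, for illustration.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE AIF_HS_BALANCES (LOADID INT, ACCOUNT TEXT, AMOUNT REAL)")

try:
    # DATAVALUE is not a column of the view -> statement is rejected,
    # mirroring ORA-00904: "DATAVALUE": invalid identifier
    con.execute('SELECT DATAVALUE "Value", AMOUNT "DataValue" FROM AIF_HS_BALANCES')
    failed = False
except sqlite3.OperationalError:
    failed = True

# selecting only columns that exist succeeds
ok = con.execute('SELECT AMOUNT "DataValue" FROM AIF_HS_BALANCES').fetchall()
print(failed, ok)
```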
Error while loading the data from PSA to Data Target
Hi to all,
I'm facing an error while loading the data to the data target.
Error : Record 1 :Value 'Kuldeep Puri Milan Joshi ' (hex. '004B0075006C0064006500650070002000500075007200690
Details:
Requests (messages): Everything OK
Extraction (messages): Everything OK
Transfer (IDocs and TRFC): Errors occurred
Request IDoc : Application document posted
Info IDoc 2 : Application document posted
Info IDoc 1 : Application document posted
Info IDoc 4 : Application document posted
Info IDoc 3 : Application document posted
Data Package 1 : arrived in BW ; Processing : Data records for package 1 selected in PSA - 1 er
Processing (data packet): Errors occurred
Update PSA ( 2462 Records posted ) : No errors
Transfer Rules ( 2462 -> 2462 Records ) : No errors
Update rules ( 2462 -> 2462 Records ) : No errors
Update ( 0 new / 0 changed ) : Errors occurred
Processing end : Errors occurred
I'm totally new to this issue. please help to solve this error.
Regards,
Saran
Hi,
I think you are facing an invalid character issue.
This issue can be resolved by correcting the error records in the PSA and updating them into the target. The first step should be to identify whether all the records are there in the PSA. You can find this out by checking the Details tab in RSMO, the job log, the PSA (sorting records based on status), etc. Once it's confirmed, force the request to red and delete the particular request from the target cube. Then go to the PSA and edit the incorrect records (correcting or blanking out the invalid entries of the particular field/InfoObject for the incorrect record) and save. Once all the incorrect records are edited, go to RSA1 > PSA, find the particular request and update to the target manually (right click on the PSA request > Start update immediately).
I will add the step by step procedure to edit PSA data and update into target (request based).
In your case the error message says Error: Record 1: Value 'Kuldeep Puri Milan Joshi'. You just need to convert this to capital letters in the PSA and reload.
Edit the field to KULDEEP PURI MILAN JOSHI in PSA and push it to target.
Identifying incorrect records.
The system won't show all the incorrect records the first time. You need to search the PSA table manually to find all of them.
1. First see RSMO > Details > expand the update rules / processing tabs and you will find some of the error records.
2. Then you can go to the PSA and filter using the status of records. Filter all the red requests. This may still not show all the incorrect records.
3. Then you can go to the PSA and filter the incorrect records based on the particular field.
4. If this also doesn't work out, go to the PSA and sort (not filter) the records based on the particular field with incorrect values, and it will show all the records. Note down the record numbers and then edit them one by one.
If you want to confirm, find the PSA table and search manually.
Also run the report RS_ERRORLOG_EXAMPLE. With this report you can display all incorrect records of the data, and you can also find whether the error occurred in the PSA or in the transfer rules.
Steps to resolve this
1. Force the request to red in RSMO > Status tab.
2. Delete the request from target.
3. Go to RSMO; at the top right you can see the PSA maintenance button > click it and go to the PSA.
4. Edit the record.
5. Save the PSA data.
6. Go to RSA1 > search by request name > right click > update the request from PSA to target.
Refer how to Modify PSA Data
https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/40890eda-1b99-2a10-2d8b-a18b9108fc38
This should solve your problem for now.
As a long-term fix, you can apply a user exit on the source system side, change your update rules to ensure that this field is blanked out before being loaded into the cube, or add that particular character to the permitted character list in BW.
RSKC --> type ALL_CAPITAL --> F8 (Execute)
OR
Go to SE38 and execute the program RSKC_ALLOWED_CHAR_MAINTAIN and give ALL_CAPITAL or the char you want to add.
Check the table RSALLOWEDCHAR. It should contain ALL_CAPITAL or the char you have entered.
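As a rough illustration of what the permitted-character check does: BW rejects any character outside the allowed set unless it has been registered via RSKC. The base set below approximates the standard BW allowed characters (the exact list lives in RSALLOWEDCHAR); this is a sketch, not BW code.

```python
# Approximation of BW's default permitted character set (illustrative).
BASE_ALLOWED = set(" !\"%&'()*+,-./:;<=>?_0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ")

def invalid_chars(value, extra_allowed=""):
    """Return the characters in value that the check would reject."""
    allowed = BASE_ALLOWED | set(extra_allowed)
    return sorted({c for c in value if c not in allowed})

# lowercase letters are the classic culprit, exactly as in the error
# 'Kuldeep Puri Milan Joshi'
print(invalid_chars("Kuldeep Puri Milan Joshi"))  # lowercase letters flagged
print(invalid_chars("KULDEEP PURI MILAN JOSHI"))  # clean after capitalising
```

Registering ALL_CAPITAL in RSKC effectively widens the allowed set, which is why capitalising the PSA value (or extending the permitted list) clears the error.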
Refer
/people/sap.user72/blog/2006/07/23/invalid-characters-in-sap-bw-3x-myths-and-reality-part-2
/people/sap.user72/blog/2006/07/08/invalid-characters-in-sap-bw-3x-myths-and-reality-part-1
/people/aaron.wang3/blog/2007/09/03/steps-of-including-one-special-characters-into-permitted-ones-in-bi
http://help.sap.com/saphelp_nw04/helpdata/en/64/e90da7a60f11d2a97100a0c9449261/frameset.htm
For adding Other characters
OSS note #173241 Allowed characters in the BW System
Thanks,
JituK
Edited by: Jitu Krishna on Mar 22, 2008 1:52 PM -
Error while loading Transactional data from NW BW Infoprovider
Hi,
I am trying to load the transactional data using the delivered "Transactional data from NW BW Infoprovider" package and am getting the error message "Error occurs when loading transaction data from other cube".
Below is the transformation file content
*OPTIONS
FORMAT = DELIMITED
HEADER = YES
DELIMITER = TAB
AMOUNTDECIMALPOINT = .
SKIP = 0
SKIPIF =
VALIDATERECORDS=NO
CREDITPOSITIVE=YES
MAXREJECTCOUNT= 9999999999999
ROUNDAMOUNT=
*MAPPING
ACCOUNT = 0ACCOUNT
BUSINESSTYPE = *NEWCOL(NOBTYPE)
BWSRC = *NEWCOL(R3)
COMPANYCODE = 0COMP_CODE
DATASRC = *NEWCOL(R3)
FUNCTIONALAREA = *IF(0FUNC_AREA = *STR() then *STR(NOFA);0FUNC_AREA)
GAME = *NEWCOL(NOGAME)
INPUTCURRENCY = *IF(0CURRENCY = *STR() then *STR(NOCURR) ; 0CURRENCY)
PROBABILITY = *NEWCOL(NOPROB)
PRODUCT = *NEWCOL(NOPROD)
PROFITCENTER = 0PROFIT_CTR
PROJECT = *NEWCOL(NOPROJECT)
TIME = 0FISCPER
VERSION = *NEWCOL(REV0)
WEEKS = *NEWCOL(NOWEEK)
SIGNEDDATA= 0AMOUNT
*CONVERSION
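As a rough sketch of what the *MAPPING section above does per source record: copy BW fields, inject fixed values via *NEWCOL, and substitute defaults for blank members via *IF. Python stands in for the BPC transformation engine here; only a subset of the fields is shown, and the sample values are invented for illustration.

```python
def map_record(src):
    """Emulate a subset of the *MAPPING rules for one source record."""
    return {
        "ACCOUNT":        src["0ACCOUNT"],
        "COMPANYCODE":    src["0COMP_CODE"],
        "DATASRC":        "R3",                         # *NEWCOL(R3)
        "FUNCTIONALAREA": src["0FUNC_AREA"] or "NOFA",  # *IF blank -> NOFA
        "INPUTCURRENCY":  src["0CURRENCY"] or "NOCURR", # *IF blank -> NOCURR
        "TIME":           src["0FISCPER"],
        "VERSION":        "REV0",                       # *NEWCOL(REV0)
        "SIGNEDDATA":     src["0AMOUNT"],
    }

rec = map_record({
    "0ACCOUNT": "400000", "0COMP_CODE": "1000", "0FUNC_AREA": "",
    "0CURRENCY": "EUR", "0FISCPER": "2011001", "0AMOUNT": 125.5,
})
print(rec["FUNCTIONALAREA"], rec["INPUTCURRENCY"])  # NOFA EUR
```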
Below is the Error Log
/CPMB/MODIFY completed in 0 seconds
/CPMB/INFOPROVIDER_CONVERT completed in 0 seconds
/CPMB/CLEAR completed in 0 seconds
[Selection]
InforProvide=ZPCA_C01
TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\R3_TD_P&L.xls
CLEARDATA= Yes
RUNLOGIC= No
CHECKLCK= Yes
[Messages]
Task name CONVERT:
No 1 Round:
Error occurs when loading transaction data from other cube
Application: ACTUALREPORTING Package status: ERROR
Has anyone had this issue before, or is there something that needs to be looked at in the BW-delivered task?
Any input will be appreciated.
Sanjay
Hello Guru,
Currently I am getting the below error while loading cost center master data from BW to BPC.
Task name MASTER DATA SOURCE:
Record count: 189
Task name TEXT SOURCE:
Record count: 189
Task name CONVERT:
No 1 Round:
Info provider is not available
Application: ZRB_SALES_CMB Package status: ERROR
Can anybody tell me if I have missed anything?
Regards,
BI NEW
Edited by: BI NEW on Feb 23, 2011 12:25 PM -
BI 7.0 - Error while loading data from R3 to PSA
Hi,
I transported Master Data ATTR/TEXT object from DEV to QA (BI 7.0 NW 2004s); and getting the following error while loading the data from R3 in to PSA using the InfoPackage.
Any clue to resolve this issue is appreciated.
Monitor - STATUS tab:
Error message from the source system
Diagnosis
An error occurred in the source system.
System Response
Caller 09 contains an error message.
Further analysis:
The error occurred in Service API .
Refer to the error message.
Procedure
How you remove the error depends on the error message.
Note
If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
Monitor - Details Tab:
Extraction (messages): Errors occurred
Error occurred in the data selection
Request IDoc : Application document not posted
Error Message:
Diagnosis
In the source system, there is no transfer structure available for
InfoSource 0ACTIONTYPE_TEXT .
System Response
The data transfer is terminated.
Procedure
In the Administrator Workbench, regenerate from this source system the
transfer structure for InfoSource 0COMPANY_TEXT .
Thanks,
Hi,
Please try the below:
The l_dta data target that you want to load is an InfoObject (master data or text
table).
You must declare this for the InfoProvider in Transaction RSD1.
You do this on the 'Master Data/Text' tab by assigning any InfoArea to the InfoObject.
The InfoProvider (that is, this InfoObject) is then displayed below this InfoArea in
the InfoProvider tree.
Rregards,
Senthil -
Error while loading data from a file on application server
Hi all,
Facing an error while loading data from a flat file.
Error 'The argument '##yyyymmdd;@##' cannot be interpreted as a number' while assigning a character value.
I changed the format of the date fields (tried Number, General, Date (International)) in the xls, but I still get the same error. I did check all the data types in the DataSource; all the fields are DATS.
Can you please tell me what could be the problem?
Thank you all,
Praveen
Hi all,
As far as my first question goes, I got through it, but I have one more field in my flat file which actually is a timestamp. In my flat file the data is in this format:
10/21/2006 5:11:48 AM, which I need to change to 10/21/2006.
One more note: I have some of the fields as NULL in this field.
Last Updated Date
10/21/2006 5:11:48 AM
10/21/2006 5:11:48 AM
NULL
NULL
10/21/2006 5:11:48 AM
NULL
I want to display the values as 10/21/2006 and NULL as it is.
Please let me know if we have a conversion routine in datasource which can solve my problem.
Regards,
Praveen -
BPC Error while loading master data from SAP BW
Hello Guru,
I have NW BPC 7.5 installed on BW server. Currently i have started to work on one application where i have to load master data first. I am loading master data for cost center.
I looked into the process chain "BPC: Import BW InfoObject master data" and found that a few processes are in GREEN and a few are in RED (Display). Is that correct or not? Then, as per some threads, I went to "BPC: Convert BW infoObject master data" --> SOURCE_TYPE = IP and added one parameter, "FORMULA_FILE_NO". Then I saved this process chain and activated it.
I created the transformation file in BPC successfully, but when I try to run the load package it throws an error.
Q. Do I need to have the entire process chain with all processes in GREEN to get the master data into BPC?
Q. Do I need to activate any BADIs for this or not?
I appreciate your help in advance. I trie
Hello Guru,
Currently I am getting the below error while loading cost center master data from BW to BPC.
Task name MASTER DATA SOURCE:
Record count: 189
Task name TEXT SOURCE:
Record count: 189
Task name CONVERT:
No 1 Round:
Info provider is not available
Application: ZRB_SALES_CMB Package status: ERROR
Can anybody tell me if I have missed anything?
Regards,
BI NEW
Edited by: BI NEW on Feb 23, 2011 12:25 PM -
Error while loading data from DS into cube
Hello All
I am getting the below error while loading data from Data source to the cube.
Record 1 :Time conversion from 0CALDAY to 0FISCPER (fiscal year ) failed with value 20070331
What am I supposed to do?
I need your input in this regard.
Regards
Rohit
Hi,
Simply map 0CALDAY to 0FISCPER (you have already done that). You might have forgotten to map the fiscal year variant in the update rules. If it comes from the source, just map it; if it doesn't come from the source, make it a constant in the update rules and give it a value:
If your business year is April to March, make it 'V3'.
If your business year is January to December, make it 'K4'.
Activate your update rules, delete the old data and upload again.
Hope it helps
Thanks
Teja -
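The derivation the update rule performs once the fiscal year variant is known can be sketched as follows. This is simplified: real variants can define special periods and shortened fiscal years; the point is only that with V3 (April-March) the failing value 20070331 maps to period 12 of fiscal year 2006, while K4 (January-December) maps it to period 3 of 2007.

```python
from datetime import date

def fiscper(calday, variant):
    """Derive 0FISCPER (YYYYPPP) from 0CALDAY (YYYYMMDD) for K4 or V3."""
    d = date(int(calday[:4]), int(calday[4:6]), int(calday[6:]))
    if variant == "K4":                      # January-December
        return "%04d%03d" % (d.year, d.month)
    if variant == "V3":                      # April-March
        if d.month >= 4:
            return "%04d%03d" % (d.year, d.month - 3)
        return "%04d%03d" % (d.year - 1, d.month + 9)
    raise ValueError("unknown variant " + variant)

print(fiscper("20070331", "V3"))  # 2006012
print(fiscper("20070331", "K4"))  # 2007003
```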
Error while loading data from 0FI_TX_4 to ODS
Dear Friends,
One of our process chains has been failing for a week, every day with the same error, while loading data from R/3 DataSource 0FI_TX_4 to an ODS.
Short dump is raised in R3 system.
"DBIF_RSQL_SQL_ERROR" CX_SY_OPEN_SQL_DBC
"SAPLBWFIR" or "LBWFIRU13"
"BWFIR_READ_BSET_CPUDT_DATA"
ORA-01555
The number of records are below 10000 only.
All other chains are OK.
We have already searched SDN and the Notes, and applied the solutions below, but still no luck:
Increase the size of all the roll back segments
Increase number of REDO logs
Decrease the number of records per data packet
Any more suggestions, please?
Thanks
Tony
How do you know that the number of records is below 10000? Have you checked the number of records in the PSA?
Now I will tell you what I did for this error: I increased the package size such that all the data gets transferred in a single data package. Please make the package size as large as possible so that all the data can be processed in a single package.
I know you have read so many threads on the same but what I am suggesting is what I have done and successfully rectified this issue.
regards,
rocks -
ODI error while loading data from Flat File to oracle
Hi Gurus,
I am getting following error while loading flat file to oracle table :
java.lang.NumberFormatException: 554020
at java.math.BigInteger.parseInt(Unknown Source)
at java.math.BigInteger.<init>(Unknown Source)
at java.math.BigDecimal.<init>(Unknown Source)
at com.sunopsis.sql.SnpsQuery.updateExecStatement(SnpsQuery.java)
at com.sunopsis.sql.SnpsQuery.addBatch(SnpsQuery.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
The connections between source and target is fine.
Kindly suggest on this error.
Thanks
Shridhar
Hi John,
The source is a CSV file; by default all columns are string type. The integration is failing at step 3 (Load Data).
I am loading the data from the CSV file directly into the staging table (Oracle). It's a one-to-one mapping.
Shridhar -
Error while Loading data through .csv file
Hi,
I am getting the below date error when loading data into OLAP tables through a .csv file.
Data stored in .csv is 20071113121100.
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TE_7007 Transformation Evaluation Error [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')]
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TT_11132 Transformation [Exp_FILE_CHNL_TYPE] had an error evaluating output column [CREATED_ON_DT_OUT]. Error message is [<<Expression Error>> [TO_DATE]: invalid string for converting to Date
... t:TO_DATE(u:'2.00711E+13',u:'YYYYMMDDHH24MISS')].
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TT_11019 There is an error in the port [CREATED_ON_DT_OUT]: The default value for the port is set to: ERROR(<<Expression Error>> [ERROR]: transformation error
... nl:ERROR(u:'transformation error')).
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> TT_11021 An error occurred moving data from the transformation Exp_FILE_CHNL_TYPE: to the transformation W_CHNL_TYPE_DS.
TRANSF_1_1_1> CMN_1761 Timestamp Event: [Mon Mar 29 15:06:17 2010]
TRANSF_1_1_1> CMN_1086 Exp_FILE_CHNL_TYPE: Number of errors exceeded threshold [1].
Any help is greatly appreciated.
Thanks,
Poojak
1) Wrong format; you won't get much support loading OLAP cubes in here, I think.
2) Has your CSV file been anywhere near Excel, by chance? The conversion of 20071113121100 to 2.00711E+13 looks very much like what I see when Excel has an invalid number mask / precision specified for a cell.
*** We are using EBS. The file was already generated by consultants through a SQL query. We are getting the error while loading through Informatica. The target table is set up as a date datatype and the source is String(19).
Expression in Informatica is setup as below.
IIF(ISNULL(CREATED_ON_DT), NULL, TO_DATE(CREATED_ON_DT, 'YYYYMMDDHH24MISS'))
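To illustrate the suspected failure mode: once Excel rewrites the 14-digit timestamp in scientific notation, the digits the 'YYYYMMDDHH24MISS' mask needs are gone and cannot be recovered. A small Python check, illustrative only:

```python
from datetime import datetime

# The raw 14-digit value parses cleanly against the mask.
raw = "20071113121100"
parsed = datetime.strptime(raw, "%Y%m%d%H%M%S")

# After Excel's scientific-notation rewrite, converting back yields
# 20071100000000 -- the day, hour, minute and second digits are zeroed.
mangled = "2.00711E+13"
recovered = "%.0f" % float(mangled)
print(parsed, recovered)
```

This is why the fix is to keep the file away from Excel (or format the column as text) rather than to adjust the TO_DATE expression.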
Thanks,
Poojak -
Hi guys,
I got an error while loading InfoObject attributes.
I created a characteristic InfoObject "equipment" with master data and text.
Loading the text worked fine, but the problem happened when I started
loading the attribute data.
error code is as given below
Check Load from InfoSource D92_EQUIP , Packet infpackage for equipment -attrib
Please execute the mail for additional information.
Data records were selected in the PSA
Diagnosis
Data records were marked as incorrect in the PSA.
System response
If error messages exist for the data records, the corresponding data packets were not updated using the PSA.
Procedure
Go into the PSA maintenance for the request and read the logged messages.
You find the relevant messages for each record in the PSA table by double-clicking on the status of the record.
Where necessary, edit the incorrect records or remove the error in another way.
Post the incorrect data packets manually.
Note that you are only permitted to edit and manually update the request if it is not contained in the request list in the InfoCube (see InfoCube administration). Where necessary, first delete it from the InfoCube.
Any advice? It says there is a problem with the data records, but I couldn't find any.
regards
abhi
Message was edited by:
Abhilash Muraleedharan
Hi Abhilash,
You can use the preview functionality on the external page to see whether the records are being pulled properly or not.
If not, then there is some problem with the file which you need to fix. Or it might be a setting you have made, such as the delimiter selection in BW. Also check your routines if you are using any.
If the file is being pulled correctly try seeing the error message under the details tab in the monitor screen. That will give you an idea why the load is failing.
Hope that helps.
regards. -
Error While loading data for LIS InfoSources.
Hi All,
I am repeatedly receiving load failure errors while loading data using 2lis_01_s001 (This is the case with all the InfoSources).
The error message is:
An error occurred in the source system.
Note
If the source system is a Client Workstation, then it is possible that the file that you wanted to load was being edited at the time of the data request. Make sure that the file is in the specified directory, that it is not being processed at the moment, and restart the request.
In our quality system, we set LIS updating to No Update (R3), loaded the data, and then changed the update mode back from No Update to Asynchronous Update (R3). But now we are loading data in production. How should we proceed? Do we have to disable LIS updating whenever we load data from R3 to BW?
Regards
Jay
Hi Jayanthy,
Please check the order of the fields in the two setup tables for the S001 structure. The order of fields in both tables should be the same.
You can see the structure in transaction SE11.
If the order is different, you need to ask the Basis person to change it so that the order of fields in both setup tables is the same. This should fix the issue.
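The field-order comparison suggested above can be sketched as follows. The field lists here are invented for illustration; in practice they come from the setup table structures displayed in SE11.

```python
def first_mismatch(fields_a, fields_b):
    """Return (index, field_a, field_b) at the first divergence, or None."""
    for i, (a, b) in enumerate(zip(fields_a, fields_b)):
        if a != b:
            return i, a, b
    if len(fields_a) != len(fields_b):
        # one list is a prefix of the other
        return min(len(fields_a), len(fields_b)), None, None
    return None

# hypothetical field lists for the two setup tables
table1 = ["MANDT", "VBELN", "POSNR", "MATNR", "NETWR"]
table2 = ["MANDT", "VBELN", "MATNR", "POSNR", "NETWR"]
print(first_mismatch(table1, table2))  # (2, 'POSNR', 'MATNR')
```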
Thanks,
Raj