Load data from SQL 2012 to SQL 2008
If I create an SSIS package with a SQL Server 2012 source and a SQL Server 2008 destination in my data flow, will this work? The SSIS package would be built for SQL Server 2012 and deployed on the SQL Server 2012 server.
Yes, it will work.
That is just data being moved from SQL Server 2012 to SQL Server 2008, so you are good to go.
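The reason the version gap does not matter is that the data flow reads rows over one connection and writes them over another; no database-level compatibility is involved. The same idea can be sketched outside SSIS. This is a minimal illustration, not the SSIS implementation; the server names, table, and connection strings in the commented usage are placeholders, and pyodbc is assumed to be available:

```python
import itertools

def copy_in_batches(fetch_rows, insert_batch, batch_size=1000):
    """Read rows from a source iterator and hand them to a sink in batches.

    fetch_rows   -- any iterable of rows (e.g. a pyodbc cursor)
    insert_batch -- callable that receives one list of rows
    Returns the total number of rows copied.
    """
    total = 0
    it = iter(fetch_rows)
    while True:
        batch = list(itertools.islice(it, batch_size))
        if not batch:
            break
        insert_batch(batch)
        total += len(batch)
    return total

# Hypothetical usage with pyodbc (all names are placeholders, not from this thread):
# src = pyodbc.connect("DRIVER={SQL Server};SERVER=sql2012;DATABASE=SrcDb;Trusted_Connection=yes")
# dst = pyodbc.connect("DRIVER={SQL Server};SERVER=sql2008;DATABASE=DstDb;Trusted_Connection=yes")
# rows = src.cursor().execute("SELECT Id, Name FROM dbo.Customers")
# cur = dst.cursor()
# copy_in_batches(
#     rows,
#     lambda b: cur.executemany("INSERT INTO dbo.Customers (Id, Name) VALUES (?, ?)", b),
#     batch_size=500)
# dst.commit()
```

Batching keeps memory bounded on large tables, which is also roughly what the SSIS data-flow buffer does for you automatically.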
Cheers,
Vaibhav Chaudhari
[MCTS],
[MCP]
Thanks!!!
Similar Messages
-
Unable to load data from FDMEE staging table to HFM tables
Hi,
We have installed EPM 11.1.2.3 with all the latest related products (ODI/FDMEE) in our development environment.
We are in the process of loading data from EBS R12 to HFM using ERPI (Data Management in EPM 11.1.2.3). We can import and validate the data, but when we try to export the data to HFM, the process keeps running for hours (it neither raises an error nor completes).
When we check the process details in the ODI work repository, the process statuses are as follows:
COMM_LOAD_BALANCES - Running .........(From past 1 day, still running)
EBS_GL_LOAD_BALANCES_DATA - Successful
COMM_LOAD_BALANCES - Successful
We can load data into the staging table of the FDMEE database schema, and we can even drill through to the source system (EBS R12) from the Data Load Workbench, but we are not able to load the data into the HFM application.
Log details from the process are below.
2013-11-05 17:04:59,747 INFO [AIF]: FDMEE Process Start, Process ID: 31
2013-11-05 17:04:59,747 INFO [AIF]: FDMEE Logging Level: 4
2013-11-05 17:04:59,748 INFO [AIF]: FDMEE Log File: C:\FDMEE\outbox\logs\OASIS_31.log
2013-11-05 17:04:59,748 INFO [AIF]: User:admin
2013-11-05 17:04:59,748 INFO [AIF]: Location:VisionLoc (Partitionkey:1)
2013-11-05 17:04:59,749 INFO [AIF]: Period Name:JAN (Period Key:1/1/12 12:00 AM)
2013-11-05 17:04:59,749 INFO [AIF]: Category Name:VisionCat (Category key:3)
2013-11-05 17:04:59,749 INFO [AIF]: Rule Name:VisionRule (Rule ID:2)
2013-11-05 17:05:00,844 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Oracle JRockit(R) (Oracle Corporation)]
2013-11-05 17:05:00,844 INFO [AIF]: Java Platform: java1.6.0_37
2013-11-05 17:05:02,910 INFO [AIF]: -------START IMPORT STEP-------
2013-11-05 17:05:02,953 INFO [AIF]: -------END IMPORT STEP-------
2013-11-05 17:05:03,030 INFO [AIF]: -------START EXPORT STEP-------
2013-11-05 17:05:03,108 INFO [AIF]:
Move Data for Period 'JAN'
Any help on above is much appreciated.
Thank you
Regards
Praneeth
Hi,
I have followed steps 1 & 2 above. Now the log shows the following:
2013-11-05 09:47:31,179 INFO [AIF]: FDMEE Process Start, Process ID: 22
2013-11-05 09:47:31,179 INFO [AIF]: FDMEE Logging Level: 4
2013-11-05 09:47:31,179 INFO [AIF]: FDMEE Log File: C:\FDMEE\outbox\logs\OASIS_22.log
2013-11-05 09:47:31,180 INFO [AIF]: User:admin
2013-11-05 09:47:31,180 INFO [AIF]: Location:VisionLoc (Partitionkey:1)
2013-11-05 09:47:31,180 INFO [AIF]: Period Name:JAN (Period Key:1/1/12 12:00 AM)
2013-11-05 09:47:31,180 INFO [AIF]: Category Name:VisionCat (Category key:3)
2013-11-05 09:47:31,181 INFO [AIF]: Rule Name:VisionRule (Rule ID:2)
2013-11-05 09:47:32,378 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Oracle JRockit(R) (Oracle Corporation)]
2013-11-05 09:47:32,378 INFO [AIF]: Java Platform: java1.6.0_37
2013-11-05 09:47:34,652 INFO [AIF]: -------START IMPORT STEP-------
2013-11-05 09:47:34,698 INFO [AIF]: -------END IMPORT STEP-------
2013-11-05 09:47:34,744 INFO [AIF]: -------START EXPORT STEP-------
2013-11-05 09:47:34,828 INFO [AIF]:
Move Data for Period 'JAN'
2013-11-08 11:49:10,478 INFO [AIF]: FDMEE Process Start, Process ID: 22
2013-11-08 11:49:10,493 INFO [AIF]: FDMEE Logging Level: 5
2013-11-08 11:49:10,493 INFO [AIF]: FDMEE Log File: C:\FDMEE\outbox\logs\OASIS_22.log
2013-11-08 11:49:10,494 INFO [AIF]: User:admin
2013-11-08 11:49:10,494 INFO [AIF]: Location:VISIONLOC (Partitionkey:1)
2013-11-08 11:49:10,494 INFO [AIF]: Period Name:JAN (Period Key:1/1/12 12:00 AM)
2013-11-08 11:49:10,495 INFO [AIF]: Category Name:VISIONCAT (Category key:1)
2013-11-08 11:49:10,495 INFO [AIF]: Rule Name:VISIONRULE (Rule ID:1)
2013-11-08 11:49:11,903 INFO [AIF]: Jython Version: 2.5.1 (Release_2_5_1:6813, Sep 26 2009, 13:47:54)
[Oracle JRockit(R) (Oracle Corporation)]
2013-11-08 11:49:11,909 INFO [AIF]: Java Platform: java1.6.0_37
2013-11-08 11:49:14,037 INFO [AIF]: -------START IMPORT STEP-------
2013-11-08 11:49:14,105 INFO [AIF]: -------END IMPORT STEP-------
2013-11-08 11:49:14,152 INFO [AIF]: -------START EXPORT STEP-------
2013-11-08 11:49:14,178 DEBUG [AIF]: CommData.exportData - START
2013-11-08 11:49:14,183 DEBUG [AIF]: CommData.getRuleInfo - START
2013-11-08 11:49:14,188 DEBUG [AIF]:
SELECT brl.RULE_ID
,br.RULE_NAME
,brl.PARTITIONKEY
,brl.CATKEY
,part.PARTVALGROUP
,br.SOURCE_SYSTEM_ID
,ss.SOURCE_SYSTEM_TYPE
,CASE
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'EBS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'PS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FUSION%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FILE%' THEN 'N'
ELSE 'Y'
END SOURCE_ADAPTER_FLAG
,app.APPLICATION_ID
,app.TARGET_APPLICATION_NAME
,app.TARGET_APPLICATION_TYPE
,app.DATA_LOAD_METHOD
,brl.PLAN_TYPE
,CASE brl.PLAN_TYPE
WHEN 'PLAN1' THEN 1
WHEN 'PLAN2' THEN 2
WHEN 'PLAN3' THEN 3
WHEN 'PLAN4' THEN 4
WHEN 'PLAN5' THEN 5
ELSE 0
END PLAN_NUMBER
,br.INCL_ZERO_BALANCE_FLAG
,br.PERIOD_MAPPING_TYPE
,br.INCLUDE_ADJ_PERIODS_FLAG
,br.BALANCE_TYPE ACTUAL_FLAG
,br.AMOUNT_TYPE
,br.BALANCE_SELECTION
,br.BALANCE_METHOD_CODE
,COALESCE(br.SIGNAGE_METHOD, 'ABSOLUTE') SIGNAGE_METHOD
,br.CURRENCY_CODE
,br.BAL_SEG_VALUE_OPTION_CODE
,brl.EXECUTION_MODE
,COALESCE(brl.IMPORT_FROM_SOURCE_FLAG, 'Y') IMPORT_FROM_SOURCE_FLAG
,COALESCE(brl.RECALCULATE_FLAG, 'N') RECALCULATE_FLAG
,COALESCE(brl.EXPORT_TO_TARGET_FLAG, 'N') EXPORT_TO_TARGET_FLAG
,CASE
WHEN (br.LEDGER_GROUP_ID IS NOT NULL) THEN 'MULTI'
WHEN (br.SOURCE_LEDGER_ID IS NOT NULL) THEN 'SINGLE'
ELSE 'NONE'
END LEDGER_GROUP_CODE
,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
,br.LEDGER_GROUP
,(SELECT brd.DETAIL_CODE
FROM AIF_BAL_RULE_DETAILS brd
WHERE brd.RULE_ID = br.RULE_ID
AND brd.DETAIL_TYPE = 'LEDGER'
) PS_LEDGER
,CASE lg.LEDGER_TEMPLATE
WHEN 'COMMITMENT' THEN 'Y'
ELSE 'N'
END KK_FLAG
,p.LAST_UPDATED_BY
,p.AIF_WEB_SERVICE_URL WEB_SERVICE_URL
,p.EPM_ORACLE_INSTANCE
FROM AIF_PROCESSES p
INNER JOIN AIF_BAL_RULE_LOADS brl
ON brl.LOADID = p.PROCESS_ID
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN AIF_SOURCE_SYSTEMS ss
ON ss.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
INNER JOIN AIF_TARGET_APPLICATIONS app
ON app.APPLICATION_ID = brl.APPLICATION_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = br.PARTITIONKEY
INNER JOIN TBHVIMPGROUP imp
ON imp.IMPGROUPKEY = part.PARTIMPGROUP
LEFT OUTER JOIN AIF_COA_LEDGERS l
ON l.SOURCE_SYSTEM_ID = p.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = COALESCE(br.SOURCE_LEDGER_ID,imp.IMPSOURCELEDGERID)
LEFT OUTER JOIN AIF_PS_SET_CNTRL_REC_STG scr
ON scr.SOURCE_SYSTEM_ID = l.SOURCE_SYSTEM_ID
AND scr.SETCNTRLVALUE = l.SOURCE_LEDGER_NAME
AND scr.RECNAME = 'LED_GRP_TBL'
LEFT OUTER JOIN AIF_PS_LED_GRP_TBL_STG lg
ON lg.SOURCE_SYSTEM_ID = scr.SOURCE_SYSTEM_ID
AND lg.SETID = scr.SETID
AND lg.LEDGER_GROUP = br.LEDGER_GROUP
WHERE p.PROCESS_ID = 22
2013-11-08 11:49:14,195 DEBUG [AIF]:
SELECT adim.BALANCE_COLUMN_NAME DIMNAME
,adim.DIMENSION_ID
,dim.TARGET_DIMENSION_CLASS_NAME
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1
) COA_SEGMENT_NAME1
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2
) COA_SEGMENT_NAME2
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3
) COA_SEGMENT_NAME3
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4
) COA_SEGMENT_NAME4
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5
) COA_SEGMENT_NAME5
,(SELECT CASE mdd.ORPHAN_OPTION_CODE
WHEN 'CHILD' THEN 'N'
WHEN 'ROOT' THEN 'N'
ELSE 'Y'
END DIMENSION_FILTER_FLAG
FROM AIF_MAP_DIM_DETAILS_V mdd
,AIF_MAPPING_RULES mr
WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
AND mdd.RULE_ID = mr.RULE_ID
AND mdd.DIMENSION_ID = adim.DIMENSION_ID
) DIMENSION_FILTER_FLAG
,tiie.IMPCONCATCHAR
FROM TPOVPARTITION tpp
INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
ON adim.APPLICATION_ID = 2
INNER JOIN AIF_DIMENSIONS dim
ON dim.DIMENSION_ID = adim.DIMENSION_ID
LEFT OUTER JOIN TBHVIMPITEMERPI tiie
ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
AND tiie.IMPMAPTYPE = 'ERP'
WHERE tpp.PARTITIONKEY = 1
AND adim.BALANCE_COLUMN_NAME IS NOT NULL
ORDER BY adim.BALANCE_COLUMN_NAME
2013-11-08 11:49:14,197 DEBUG [AIF]: {'APPLICATION_ID': 2L, 'IMPORT_FROM_SOURCE_FLAG': u'N', 'PLAN_TYPE': None, 'RULE_NAME': u'VISIONRULE', 'ACTUAL_FLAG': u'A', 'IS_INCREMENTAL_LOAD': False, 'EPM_ORACLE_INSTANCE': u'C:\\Oracle\\Middleware\\user_projects\\epmsystem1', 'CATKEY': 1L, 'BAL_SEG_VALUE_OPTION_CODE': u'ALL', 'INCLUDE_ADJ_PERIODS_FLAG': u'N', 'PERIOD_MAPPING_TYPE': u'EXPLICIT', 'SOURCE_SYSTEM_TYPE': u'EBS_R12', 'LEDGER_GROUP': None, 'TARGET_APPLICATION_NAME': u'OASIS', 'RECALCULATE_FLAG': u'N', 'SOURCE_SYSTEM_ID': 2L, 'TEMP_DATA_TABLE_NAME': 'TDATASEG_T', 'KK_FLAG': u'N', 'AMOUNT_TYPE': u'MONETARY', 'EXPORT_TO_TARGET_FLAG': u'Y', 'DATA_TABLE_NAME': 'TDATASEG', 'DIMNAME_LIST': [u'ACCOUNT', u'ENTITY', u'ICP', u'UD1', u'UD2', u'UD3', u'UD4'], 'TDATAMAPTYPE': 'ERP', 'LAST_UPDATED_BY': u'admin', 'DIMNAME_MAP': {u'UD3': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT5', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD3', 'DIMENSION_ID': 9L}, u'ICP': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'ICP', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT7', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ICP', 'DIMENSION_ID': 8L}, u'ENTITY': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Entity', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT1', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ENTITY', 'DIMENSION_ID': 12L}, u'UD2': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT4', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD2', 'DIMENSION_ID': 11L}, u'UD4': 
{'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT6', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD4', 'DIMENSION_ID': 1L}, u'ACCOUNT': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Account', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT3', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ACCOUNT', 'DIMENSION_ID': 10L}, u'UD1': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT2', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD1', 'DIMENSION_ID': 7L}}, 'TARGET_APPLICATION_TYPE': u'HFM', 'PARTITIONKEY': 1L, 'PARTVALGROUP': u'[NONE]', 'LEDGER_GROUP_CODE': u'SINGLE', 'INCLUDE_ZERO_BALANCE_FLAG': u'N', 'EXECUTION_MODE': u'SNAPSHOT', 'PLAN_NUMBER': 0L, 'PS_LEDGER': None, 'BALANCE_SELECTION': u'FUNCTIONAL', 'BALANCE_AMOUNT_IS': u'PERIODIC', 'RULE_ID': 1L, 'BALANCE_AMOUNT_BS': u'YTD', 'CURRENCY_CODE': None, 'SOURCE_ADAPTER_FLAG': u'N', 'BALANCE_METHOD_CODE': u'STANDARD', 'SIGNAGE_METHOD': u'SAME', 'WEB_SERVICE_URL': u'http://localhost:9000/aif', 'DATA_LOAD_METHOD': u'CLASSIC_VIA_EPMI'}
2013-11-08 11:49:14,197 DEBUG [AIF]: CommData.getRuleInfo - END
2013-11-08 11:49:14,224 DEBUG [AIF]: CommData.insertPeriods - START
2013-11-08 11:49:14,228 DEBUG [AIF]: CommData.getLedgerListAndMap - START
2013-11-08 11:49:14,229 DEBUG [AIF]: CommData.getLedgerSQL - START
2013-11-08 11:49:14,229 DEBUG [AIF]: CommData.getLedgerSQL - END
2013-11-08 11:49:14,229 DEBUG [AIF]:
SELECT l.SOURCE_LEDGER_ID
,l.SOURCE_LEDGER_NAME
,l.SOURCE_COA_ID
,l.CALENDAR_ID
,'0' SETID
,l.PERIOD_TYPE
,NULL LEDGER_TABLE_NAME
FROM AIF_BALANCE_RULES br
,AIF_COA_LEDGERS l
WHERE br.RULE_ID = 1
AND l.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = br.SOURCE_LEDGER_ID
2013-11-08 11:49:14,230 DEBUG [AIF]: CommData.getLedgerListAndMap - END
2013-11-08 11:49:14,232 DEBUG [AIF]:
INSERT INTO AIF_PROCESS_PERIODS (
PROCESS_ID
,PERIODKEY
,PERIOD_ID
,ADJUSTMENT_PERIOD_FLAG
,GL_PERIOD_YEAR
,GL_PERIOD_NUM
,GL_PERIOD_NAME
,GL_PERIOD_CODE
,GL_EFFECTIVE_PERIOD_NUM
,YEARTARGET
,PERIODTARGET
,IMP_ENTITY_TYPE
,IMP_ENTITY_ID
,IMP_ENTITY_NAME
,TRANS_ENTITY_TYPE
,TRANS_ENTITY_ID
,TRANS_ENTITY_NAME
,PRIOR_PERIOD_FLAG
,SOURCE_LEDGER_ID
)
SELECT DISTINCT brl.LOADID PROCESS_ID
,pp.PERIODKEY PERIODKEY
,prd.PERIOD_ID
,COALESCE(prd.ADJUSTMENT_PERIOD_FLAG, 'N') ADJUSTMENT_PERIOD_FLAG
,COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) GL_PERIOD_YEAR
,COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0) GL_PERIOD_NUM
,COALESCE(prd.PERIOD_NAME, ppsrc.GL_PERIOD_NAME,'0') GL_PERIOD_NAME
,COALESCE(prd.PERIOD_CODE, CAST(COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0) AS VARCHAR(38)),'0') GL_PERIOD_CODE
,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) GL_EFFECTIVE_PERIOD_NUM
,COALESCE(ppa.YEARTARGET, pp.YEARTARGET) YEARTARGET
,COALESCE(ppa.PERIODTARGET, pp.PERIODTARGET) PERIODTARGET
,'PROCESS_BAL_IMP' IMP_ENTITY_TYPE
,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) IMP_ENTITY_ID
,COALESCE(prd.PERIOD_NAME, ppsrc.GL_PERIOD_NAME,'0') IMP_ENTITY_NAME
,'PROCESS_BAL_TRANS' TRANS_ENTITY_TYPE
,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) TRANS_ENTITY_ID
,pp.PERIODDESC TRANS_ENTITY_NAME
,'N' PRIOR_PERIOD_FLAG
,2202 SOURCE_LEDGER_ID
FROM (
AIF_BAL_RULE_LOADS brl
INNER JOIN TPOVCATEGORY pc
ON pc.CATKEY = brl.CATKEY
INNER JOIN TPOVPERIOD_FLAT_V pp
ON pp.PERIODFREQ = pc.CATFREQ
AND pp.PERIODKEY >= brl.START_PERIODKEY
AND pp.PERIODKEY <= brl.END_PERIODKEY
LEFT OUTER JOIN TPOVPERIODADAPTOR_FLAT_V ppa
ON ppa.PERIODKEY = pp.PERIODKEY
AND ppa.PERIODFREQ = pp.PERIODFREQ
AND ppa.INTSYSTEMKEY = 'OASIS'
INNER JOIN TPOVPERIODSOURCE ppsrc
ON ppsrc.PERIODKEY = pp.PERIODKEY
AND ppsrc.MAPPING_TYPE = 'EXPLICIT'
AND ppsrc.SOURCE_SYSTEM_ID = 2
AND ppsrc.CALENDAR_ID IN ('29067')
LEFT OUTER JOIN AIF_GL_PERIODS_STG prd
ON prd.PERIOD_ID = ppsrc.PERIOD_ID
AND prd.SOURCE_SYSTEM_ID = ppsrc.SOURCE_SYSTEM_ID
AND prd.CALENDAR_ID = ppsrc.CALENDAR_ID
AND prd.SETID = '0'
AND prd.PERIOD_TYPE = '507'
AND prd.ADJUSTMENT_PERIOD_FLAG = 'N'
WHERE brl.LOADID = 22
ORDER BY pp.PERIODKEY
,GL_EFFECTIVE_PERIOD_NUM
2013-11-08 11:49:14,235 DEBUG [AIF]: CommData.insertPeriods - END
2013-11-08 11:49:14,240 DEBUG [AIF]: CommData.moveData - START
2013-11-08 11:49:14,242 DEBUG [AIF]: CommData.getPovList - START
2013-11-08 11:49:14,242 DEBUG [AIF]:
SELECT PARTITIONKEY
,PARTNAME
,CATKEY
,CATNAME
,PERIODKEY
,COALESCE(PERIODDESC, TO_CHAR(PERIODKEY,'YYYY-MM-DD HH24:MI:SS')) PERIODDESC
,RULE_ID
,RULE_NAME
FROM (
SELECT DISTINCT brl.PARTITIONKEY
,part.PARTNAME
,brl.CATKEY
,cat.CATNAME
,pprd.PERIODKEY
,pp.PERIODDESC
,brl.RULE_ID
,br.RULE_NAME
FROM AIF_BAL_RULE_LOADS brl
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = brl.PARTITIONKEY
INNER JOIN TPOVCATEGORY cat
ON cat.CATKEY = brl.CATKEY
INNER JOIN AIF_PROCESS_PERIODS pprd
ON pprd.PROCESS_ID = brl.LOADID
LEFT OUTER JOIN TPOVPERIOD pp
ON pp.PERIODKEY = pprd.PERIODKEY
WHERE brl.LOADID = 22
) q
ORDER BY PARTITIONKEY
,CATKEY
,PERIODKEY
,RULE_ID
2013-11-08 11:49:14,244 DEBUG [AIF]: CommData.getPovList - END
2013-11-08 11:49:14,245 INFO [AIF]:
Move Data for Period 'JAN'
2013-11-08 11:49:14,246 DEBUG [AIF]:
UPDATE TDATASEG
SET LOADID = 22
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND RULE_ID = 1
AND LOADID < 22
AND PERIODKEY = '2012-01-01'
2013-11-08 11:49:14,320 DEBUG [AIF]: Number of Rows updated on TDATASEG: 1842
2013-11-08 11:49:14,320 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_AUDIT (
LOADID
,TARGET_APPLICATION_TYPE
,TARGET_APPLICATION_NAME
,PLAN_TYPE
,SOURCE_LEDGER_ID
,EPM_YEAR
,EPM_PERIOD
,SNAPSHOT_FLAG
,SEGMENT_FILTER_FLAG
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,EXPORT_TO_TARGET_FLAG
)
SELECT DISTINCT 22
,TARGET_APPLICATION_TYPE
,TARGET_APPLICATION_NAME
,PLAN_TYPE
,SOURCE_LEDGER_ID
,EPM_YEAR
,EPM_PERIOD
,SNAPSHOT_FLAG
,SEGMENT_FILTER_FLAG
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
,'N'
FROM AIF_APPL_LOAD_AUDIT
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND RULE_ID = 1
AND LOADID < 22
AND PERIODKEY = '2012-01-01'
2013-11-08 11:49:14,321 DEBUG [AIF]: Number of Rows inserted into AIF_APPL_LOAD_AUDIT: 1
2013-11-08 11:49:14,322 DEBUG [AIF]:
INSERT INTO AIF_APPL_LOAD_PRD_AUDIT (
LOADID
,GL_PERIOD_ID
,GL_PERIOD_YEAR
,DELTA_RUN_ID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
)
SELECT DISTINCT 22
,GL_PERIOD_ID
,GL_PERIOD_YEAR
,DELTA_RUN_ID
,PARTITIONKEY
,CATKEY
,RULE_ID
,PERIODKEY
FROM AIF_APPL_LOAD_PRD_AUDIT
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND RULE_ID = 1
AND LOADID < 22
AND PERIODKEY = '2012-01-01'
2013-11-08 11:49:14,323 DEBUG [AIF]: Number of Rows inserted into AIF_APPL_LOAD_PRD_AUDIT: 1
2013-11-08 11:49:14,325 DEBUG [AIF]: CommData.moveData - END
2013-11-08 11:49:14,332 DEBUG [AIF]: CommData.updateWorkflow - START
2013-11-08 11:49:14,332 DEBUG [AIF]:
SELECT tlp.PROCESSSTATUS
,tlps.PROCESSSTATUSDESC
,CASE WHEN (tlp.INTLOCKSTATE = 60) THEN 'Y' ELSE 'N' END LOCK_FLAG
FROM TLOGPROCESS tlp
,TLOGPROCESSSTATES tlps
WHERE tlp.PARTITIONKEY = 1
AND tlp.CATKEY = 1
AND tlp.PERIODKEY = '2012-01-01'
AND tlp.RULE_ID = 1
AND tlps.PROCESSSTATUSKEY = tlp.PROCESSSTATUS
2013-11-08 11:49:14,336 DEBUG [AIF]:
UPDATE TLOGPROCESS
SET PROCESSENDTIME = CURRENT_TIMESTAMP
,PROCESSSTATUS = 20
,PROCESSEXP = 0
,PROCESSENTLOAD = 0
,PROCESSENTVAL = 0
,PROCESSEXPNOTE = NULL
,PROCESSENTLOADNOTE = NULL
,PROCESSENTVALNOTE = NULL
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,338 DEBUG [AIF]: CommData.updateWorkflow - END
2013-11-08 11:49:14,339 DEBUG [AIF]: CommData.purgeInvalidRecordsTDATASEG - START
2013-11-08 11:49:14,341 DEBUG [AIF]:
DELETE FROM TDATASEG
WHERE LOADID = 22
AND (
PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
AND VALID_FLAG = 'N'
)
2013-11-08 11:49:14,342 DEBUG [AIF]: Number of Rows deleted from TDATASEG: 0
2013-11-08 11:49:14,342 DEBUG [AIF]: CommData.purgeInvalidRecordsTDATASEG - END
2013-11-08 11:49:14,344 DEBUG [AIF]: CommData.updateAppLoadAudit - START
2013-11-08 11:49:14,344 DEBUG [AIF]:
UPDATE AIF_APPL_LOAD_AUDIT
SET EXPORT_TO_TARGET_FLAG = 'Y'
WHERE LOADID = 22
AND PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY= '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,345 DEBUG [AIF]: Number of Rows updated on AIF_APPL_LOAD_AUDIT: 1
2013-11-08 11:49:14,345 DEBUG [AIF]: CommData.updateAppLoadAudit - END
2013-11-08 11:49:14,345 DEBUG [AIF]: CommData.updateWorkflow - START
2013-11-08 11:49:14,346 DEBUG [AIF]:
UPDATE TLOGPROCESS
SET PROCESSENDTIME = CURRENT_TIMESTAMP
,PROCESSSTATUS = 21
,PROCESSEXP = 1
,PROCESSEXPNOTE = 'Export Successful'
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,347 DEBUG [AIF]: CommData.updateWorkflow - END
2013-11-08 11:49:14,347 DEBUG [AIF]: CommData.exportData - END
2013-11-08 11:49:14,404 DEBUG [AIF]: HfmData.loadData - START
2013-11-08 11:49:14,404 DEBUG [AIF]: CommData.getRuleInfo - START
2013-11-08 11:49:14,404 DEBUG [AIF]:
SELECT brl.RULE_ID
,br.RULE_NAME
,brl.PARTITIONKEY
,brl.CATKEY
,part.PARTVALGROUP
,br.SOURCE_SYSTEM_ID
,ss.SOURCE_SYSTEM_TYPE
,CASE
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'EBS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'PS%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FUSION%' THEN 'N'
WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FILE%' THEN 'N'
ELSE 'Y'
END SOURCE_ADAPTER_FLAG
,app.APPLICATION_ID
,app.TARGET_APPLICATION_NAME
,app.TARGET_APPLICATION_TYPE
,app.DATA_LOAD_METHOD
,brl.PLAN_TYPE
,CASE brl.PLAN_TYPE
WHEN 'PLAN1' THEN 1
WHEN 'PLAN2' THEN 2
WHEN 'PLAN3' THEN 3
WHEN 'PLAN4' THEN 4
WHEN 'PLAN5' THEN 5
ELSE 0
END PLAN_NUMBER
,br.INCL_ZERO_BALANCE_FLAG
,br.PERIOD_MAPPING_TYPE
,br.INCLUDE_ADJ_PERIODS_FLAG
,br.BALANCE_TYPE ACTUAL_FLAG
,br.AMOUNT_TYPE
,br.BALANCE_SELECTION
,br.BALANCE_METHOD_CODE
,COALESCE(br.SIGNAGE_METHOD, 'ABSOLUTE') SIGNAGE_METHOD
,br.CURRENCY_CODE
,br.BAL_SEG_VALUE_OPTION_CODE
,brl.EXECUTION_MODE
,COALESCE(brl.IMPORT_FROM_SOURCE_FLAG, 'Y') IMPORT_FROM_SOURCE_FLAG
,COALESCE(brl.RECALCULATE_FLAG, 'N') RECALCULATE_FLAG
,COALESCE(brl.EXPORT_TO_TARGET_FLAG, 'N') EXPORT_TO_TARGET_FLAG
,CASE
WHEN (br.LEDGER_GROUP_ID IS NOT NULL) THEN 'MULTI'
WHEN (br.SOURCE_LEDGER_ID IS NOT NULL) THEN 'SINGLE'
ELSE 'NONE'
END LEDGER_GROUP_CODE
,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
,br.LEDGER_GROUP
,(SELECT brd.DETAIL_CODE
FROM AIF_BAL_RULE_DETAILS brd
WHERE brd.RULE_ID = br.RULE_ID
AND brd.DETAIL_TYPE = 'LEDGER'
) PS_LEDGER
,CASE lg.LEDGER_TEMPLATE
WHEN 'COMMITMENT' THEN 'Y'
ELSE 'N'
END KK_FLAG
,p.LAST_UPDATED_BY
,p.AIF_WEB_SERVICE_URL WEB_SERVICE_URL
,p.EPM_ORACLE_INSTANCE
FROM AIF_PROCESSES p
INNER JOIN AIF_BAL_RULE_LOADS brl
ON brl.LOADID = p.PROCESS_ID
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN AIF_SOURCE_SYSTEMS ss
ON ss.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
INNER JOIN AIF_TARGET_APPLICATIONS app
ON app.APPLICATION_ID = brl.APPLICATION_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = br.PARTITIONKEY
INNER JOIN TBHVIMPGROUP imp
ON imp.IMPGROUPKEY = part.PARTIMPGROUP
LEFT OUTER JOIN AIF_COA_LEDGERS l
ON l.SOURCE_SYSTEM_ID = p.SOURCE_SYSTEM_ID
AND l.SOURCE_LEDGER_ID = COALESCE(br.SOURCE_LEDGER_ID,imp.IMPSOURCELEDGERID)
LEFT OUTER JOIN AIF_PS_SET_CNTRL_REC_STG scr
ON scr.SOURCE_SYSTEM_ID = l.SOURCE_SYSTEM_ID
AND scr.SETCNTRLVALUE = l.SOURCE_LEDGER_NAME
AND scr.RECNAME = 'LED_GRP_TBL'
LEFT OUTER JOIN AIF_PS_LED_GRP_TBL_STG lg
ON lg.SOURCE_SYSTEM_ID = scr.SOURCE_SYSTEM_ID
AND lg.SETID = scr.SETID
AND lg.LEDGER_GROUP = br.LEDGER_GROUP
WHERE p.PROCESS_ID = 22
2013-11-08 11:49:14,406 DEBUG [AIF]:
SELECT adim.BALANCE_COLUMN_NAME DIMNAME
,adim.DIMENSION_ID
,dim.TARGET_DIMENSION_CLASS_NAME
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1
) COA_SEGMENT_NAME1
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2
) COA_SEGMENT_NAME2
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3
) COA_SEGMENT_NAME3
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4
) COA_SEGMENT_NAME4
,(SELECT COA_SEGMENT_NAME
FROM AIF_COA_SEGMENTS cs
WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5
) COA_SEGMENT_NAME5
,(SELECT CASE mdd.ORPHAN_OPTION_CODE
WHEN 'CHILD' THEN 'N'
WHEN 'ROOT' THEN 'N'
ELSE 'Y'
END DIMENSION_FILTER_FLAG
FROM AIF_MAP_DIM_DETAILS_V mdd
,AIF_MAPPING_RULES mr
WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
AND mdd.RULE_ID = mr.RULE_ID
AND mdd.DIMENSION_ID = adim.DIMENSION_ID
) DIMENSION_FILTER_FLAG
,tiie.IMPCONCATCHAR
FROM TPOVPARTITION tpp
INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
ON adim.APPLICATION_ID = 2
INNER JOIN AIF_DIMENSIONS dim
ON dim.DIMENSION_ID = adim.DIMENSION_ID
LEFT OUTER JOIN TBHVIMPITEMERPI tiie
ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
AND tiie.IMPMAPTYPE = 'ERP'
WHERE tpp.PARTITIONKEY = 1
AND adim.BALANCE_COLUMN_NAME IS NOT NULL
ORDER BY adim.BALANCE_COLUMN_NAME
2013-11-08 11:49:14,407 DEBUG [AIF]: {'APPLICATION_ID': 2L, 'IMPORT_FROM_SOURCE_FLAG': u'N', 'PLAN_TYPE': None, 'RULE_NAME': u'VISIONRULE', 'ACTUAL_FLAG': u'A', 'IS_INCREMENTAL_LOAD': False, 'EPM_ORACLE_INSTANCE': u'C:\\Oracle\\Middleware\\user_projects\\epmsystem1', 'CATKEY': 1L, 'BAL_SEG_VALUE_OPTION_CODE': u'ALL', 'INCLUDE_ADJ_PERIODS_FLAG': u'N', 'PERIOD_MAPPING_TYPE': u'EXPLICIT', 'SOURCE_SYSTEM_TYPE': u'EBS_R12', 'LEDGER_GROUP': None, 'TARGET_APPLICATION_NAME': u'OASIS', 'RECALCULATE_FLAG': u'N', 'SOURCE_SYSTEM_ID': 2L, 'TEMP_DATA_TABLE_NAME': 'TDATASEG_T', 'KK_FLAG': u'N', 'AMOUNT_TYPE': u'MONETARY', 'EXPORT_TO_TARGET_FLAG': u'Y', 'DATA_TABLE_NAME': 'TDATASEG', 'DIMNAME_LIST': [u'ACCOUNT', u'ENTITY', u'ICP', u'UD1', u'UD2', u'UD3', u'UD4'], 'TDATAMAPTYPE': 'ERP', 'LAST_UPDATED_BY': u'admin', 'DIMNAME_MAP': {u'UD3': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT5', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD3', 'DIMENSION_ID': 9L}, u'ICP': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'ICP', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT7', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ICP', 'DIMENSION_ID': 8L}, u'ENTITY': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Entity', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT1', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ENTITY', 'DIMENSION_ID': 12L}, u'UD2': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT4', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD2', 'DIMENSION_ID': 11L}, u'UD4': 
{'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT6', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD4', 'DIMENSION_ID': 1L}, u'ACCOUNT': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Account', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT3', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ACCOUNT', 'DIMENSION_ID': 10L}, u'UD1': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': u'SEGMENT2', 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD1', 'DIMENSION_ID': 7L}}, 'TARGET_APPLICATION_TYPE': u'HFM', 'PARTITIONKEY': 1L, 'PARTVALGROUP': u'[NONE]', 'LEDGER_GROUP_CODE': u'SINGLE', 'INCLUDE_ZERO_BALANCE_FLAG': u'N', 'EXECUTION_MODE': u'SNAPSHOT', 'PLAN_NUMBER': 0L, 'PS_LEDGER': None, 'BALANCE_SELECTION': u'FUNCTIONAL', 'BALANCE_AMOUNT_IS': u'PERIODIC', 'RULE_ID': 1L, 'BALANCE_AMOUNT_BS': u'YTD', 'CURRENCY_CODE': None, 'SOURCE_ADAPTER_FLAG': u'N', 'BALANCE_METHOD_CODE': u'STANDARD', 'SIGNAGE_METHOD': u'SAME', 'WEB_SERVICE_URL': u'http://localhost:9000/aif', 'DATA_LOAD_METHOD': u'CLASSIC_VIA_EPMI'}
2013-11-08 11:49:14,407 DEBUG [AIF]: CommData.getRuleInfo - END
2013-11-08 11:49:14,407 DEBUG [AIF]: CommData.getPovList - START
2013-11-08 11:49:14,407 DEBUG [AIF]:
SELECT PARTITIONKEY
,PARTNAME
,CATKEY
,CATNAME
,PERIODKEY
,COALESCE(PERIODDESC, TO_CHAR(PERIODKEY,'YYYY-MM-DD HH24:MI:SS')) PERIODDESC
,RULE_ID
,RULE_NAME
FROM (
SELECT DISTINCT brl.PARTITIONKEY
,part.PARTNAME
,brl.CATKEY
,cat.CATNAME
,pprd.PERIODKEY
,pp.PERIODDESC
,brl.RULE_ID
,br.RULE_NAME
FROM AIF_BAL_RULE_LOADS brl
INNER JOIN AIF_BALANCE_RULES br
ON br.RULE_ID = brl.RULE_ID
INNER JOIN TPOVPARTITION part
ON part.PARTITIONKEY = brl.PARTITIONKEY
INNER JOIN TPOVCATEGORY cat
ON cat.CATKEY = brl.CATKEY
INNER JOIN AIF_PROCESS_PERIODS pprd
ON pprd.PROCESS_ID = brl.LOADID
LEFT OUTER JOIN TPOVPERIOD pp
ON pp.PERIODKEY = pprd.PERIODKEY
WHERE brl.LOADID = 22
) q
ORDER BY PARTITIONKEY
,CATKEY
,PERIODKEY
,RULE_ID
2013-11-08 11:49:14,409 DEBUG [AIF]: CommData.getPovList - END
2013-11-08 11:49:14,409 DEBUG [AIF]: CommData.updateWorkflow - START
2013-11-08 11:49:14,409 DEBUG [AIF]:
SELECT tlp.PROCESSSTATUS
,tlps.PROCESSSTATUSDESC
,CASE WHEN (tlp.INTLOCKSTATE = 60) THEN 'Y' ELSE 'N' END LOCK_FLAG
FROM TLOGPROCESS tlp
,TLOGPROCESSSTATES tlps
WHERE tlp.PARTITIONKEY = 1
AND tlp.CATKEY = 1
AND tlp.PERIODKEY = '2012-01-01'
AND tlp.RULE_ID = 1
AND tlps.PROCESSSTATUSKEY = tlp.PROCESSSTATUS
2013-11-08 11:49:14,410 DEBUG [AIF]:
UPDATE TLOGPROCESS
SET PROCESSENDTIME = CURRENT_TIMESTAMP
,PROCESSSTATUS = 30
,PROCESSENTLOAD = 0
,PROCESSENTVAL = 0
,PROCESSENTLOADNOTE = NULL
,PROCESSENTVALNOTE = NULL
WHERE PARTITIONKEY = 1
AND CATKEY = 1
AND PERIODKEY = '2012-01-01'
AND RULE_ID = 1
2013-11-08 11:49:14,411 DEBUG [AIF]: CommData.updateWorkflow - END
2013-11-08 11:49:14,412 DEBUG [AIF]:
SELECT COALESCE(usr.PROFILE_OPTION_VALUE, app.PROFILE_OPTION_VALUE, site.PROFILE_OPTION_VALUE) PROFILE_OPTION_VALUE
FROM AIF_PROFILE_OPTIONS po
LEFT OUTER JOIN AIF_PROFILE_OPTION_VALUES site
ON site.PROFILE_OPTION_NAME = po.PROFILE_OPTION_NAME
AND site.LEVEL_ID = 1000
AND site.LEVEL_VALUE = 0
AND site.LEVEL_ID <= po.MAX_LEVEL_ID
LEFT OUTER JOIN AIF_PROFILE_OPTION_VALUES app
ON app.PROFILE_OPTION_NAME = site.PROFILE_OPTION_NAME
AND app.LEVEL_ID = 1005
AND app.LEVEL_VALUE = NULL
AND app.LEVEL_ID <= po.MAX_LEVEL_ID
LEFT OUTER JOIN AIF_PROFILE_OPTION_VALUES usr
ON usr.PROFILE_OPTION_NAME = usr.PROFILE_OPTION_NAME
AND usr.LEVEL_ID = 1010
AND usr.LEVEL_VALUE = NULL
AND usr.LEVEL_ID <= po.MAX_LEVEL_ID
WHERE po.PROFILE_OPTION_NAME = 'JAVA_HOME'
2013-11-08 11:49:14,413 DEBUG [AIF]: HFM Load command:
%EPM_ORACLE_HOME%/products/FinancialDataQuality/bin/HFM_LOAD.vbs "22" "a9E3uvSJNhkFhEQTmuUFFUElfdQgKJKHrb1EsjRsL6yZJlXsOFcVPbGWHhpOQzl9zvHoo3s%2Bdq6R4yJhp0GMNWIKMTcizp3%2F8HASkA7rVufXDWEpAKAK%2BvcFmj4zLTI3rtrKHlVEYrOLMY453J2lXk6Cy771mNSD8X114CqaWSdUKGbKTRGNpgE3BfRGlEd1wZ3cra4ee0jUbT2aTaiqSN26oVe6dyxM3zolc%2BOPkjiDNk1MqwNr43tT3JsZz4qEQGF9d39DRN3CDjUuZRPt4SEKSSL35upncRJiw2uBOtV%2FvSuGLNpZ2J79v1%2Ba1Oo9c4Xhig7SFYbE6Jwk1yXRJLTSw0DKFu%2FEpcdjpOnx%2F6YawMBNIa5iu5L637S91jT1Xd3EGmxZFq%2Bi6bHdCJAC8g%3D%3D" "C%3A%5COracle%5CMiddleware%5Cuser_projects%5Cepmsystem1" "%25EPM_ORACLE_HOME%25%2F..%2Fjdk160_35"
Is there anywhere we need to set the EPM home? Also, could you let me know what PSU1 is?
Thanks in advance.
Praneeth
-
Loading data from a flat file to a relational table, I am getting a SQLLDR error
Hi,
While loading data from a flat file to a relational table, I am getting a SQLLDR error and am unable to proceed further.
The source is a flat file and the target is an Oracle database. I used "LKM File to Oracle (SQLLDR)" and "IKM SQL Control Append"
and ran the interface. When I checked the session in the Operator window, the CTL file was generated successfully,
but the session failed at the "Call sqlldr" step and could not proceed further.
Environment details:
ODI 11g
database:Oracle 11g
Operating system:Windows server 2008
The error message displayed in the "Call sqlldr" session task was:
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last):
File "<string>", line 31, in ?
File "C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\odi\bin\..\..\jdev\extensions\oracle.odi.navigator\scripting\Lib\javaos.py", line 198, in system
File "C:\oracle\product\11.1.1\Oracle_ODI_1\oracledi\client\odi\bin\..\..\jdev\extensions\oracle.odi.navigator\scripting\Lib\javaos.py", line 224, in execute
OSError: (0, 'Failed to execute command ([\'sh\', \'-c\', \'sqlldr DEVELOPER/pass_123@CPRDEV control="F:\\\\flatfile/CROSS_CURR.ctl" log="F:\\\\flatfile/CROSS_CURR.log" > "F:\\\\flatfile/CROSS_CURR.out" \']): java.io.IOException: Cannot run program "sh": CreateProcess error=2, The system cannot find the file specified')
at org.apache.bsf.engines.jython.JythonEngine.exec(JythonEngine.java:146)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.execInBSFEngine(SnpScriptingInterpretor.java:345)
at com.sunopsis.dwg.codeinterpretor.SnpScriptingInterpretor.exec(SnpScriptingInterpretor.java:169)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.scripting(SnpSessTaskSql.java:2374)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java:1615)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execScriptingOrders(SnpSessTaskSql.java:1580)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java:2755)
at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2515)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:534)
at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:449)
at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1954)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:322)
at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:224)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:246)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:237)
at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:794)
at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:114)
at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
at java.lang.Thread.run(Thread.java:619)
Could you give me a solution to sort out this error as soon as possible?
thanks,
keshav.
This is the code that was generated:
import java.lang.String
import java.lang.Runtime as Runtime
from jarray import array
import java.io.File
import os
import re
import javaos

def reportnbrows():
    f = open(r"F:\flatfile/TEST.log", 'r')
    try:
        for line in f.readlines():
            if line.find("MAXIMUM ERROR COUNT EXCEEDED") >= 0:
                raise line
    finally:
        f.close()

ctlfile = r"""F:\flatfile/TEST.ctl"""
logfile = r"""F:\flatfile/TEST.log"""
outfile = r"""F:\flatfile/TEST.out"""
oracle_sid = ''
if len('CPRDEV') > 0: oracle_sid = '@' + 'CPRDEV'
loadcmd = r"""sqlldr DEVELOPER/<@=snpRef.getInfo("DEST_PASS") @>%s control="%s" log="%s" > "%s" """ % (oracle_sid, ctlfile, logfile, outfile)
rc = os.system(loadcmd)
if rc <> 0 and rc <> 2:
    raise "Load Error", "See %s for details" % logfile
if rc == 2:
    reportnbrows() -
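The OSError above is the actual failure: the agent built the sqlldr call as ['sh', '-c', ...], and 'sh' does not exist on Windows, so running the session on an agent installed on the Windows host (or standardizing the OS command) is the usual fix. As a hedged illustration only (using the credentials and paths from this log, not verified against any live database), the call can be assembled as an argument list so that no shell, 'sh' or 'cmd', is needed at all:

```python
import subprocess

def build_sqlldr_cmd(user, password, sid, ctlfile, logfile):
    # An argument list avoids shell quoting entirely, so the same
    # invocation works on both Windows and Unix agents.
    return ["sqlldr",
            "%s/%s@%s" % (user, password, sid),
            "control=%s" % ctlfile,
            "log=%s" % logfile]

cmd = build_sqlldr_cmd("DEVELOPER", "pass_123", "CPRDEV",
                       r"F:\flatfile\CROSS_CURR.ctl",
                       r"F:\flatfile\CROSS_CURR.log")
# subprocess.call(cmd)  # uncomment only where sqlldr is on the PATH
```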
Error While Loading data from HFM to Essbase
Hi,
I am trying to load data from an HFM application to an Essbase application. The LKM I am using is "HFM Data to SQL" and the IKM is "SQL to Hyperion Essbase (DATA)". I am also using a couple of mapping expressions like {CASEWHEN(HFMDATA."Account" = '3110', 'PL10100', HFMDATA."Account")}, {CASEWHEN(HFMDATA."Period" = 'January', 'Jan', HFMDATA."Period")}, etc.
The error I am getting looks like this:
org.apache.bsf.BSFException: exception from Jython:
Traceback (innermost last):
File "<string>", line 23, in ?
com.hyperion.odi.essbase.ODIEssbaseException: Invalid column type specified for Data column [Account]
at com.hyperion.odi.essbase.ODIEssbaseDataWriter.loadData(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.python.core.PyReflectedFunction.__call__(PyReflectedFunction.java)
at org.python.core.PyMethod.__call__(PyMethod.java)
at org.python.core.PyObject.__call__(PyObject.java)
at org.python.core.PyInstance.invoke(PyInstance.java)
I am mapping the Account dimension of HFM to the Account dimension of Essbase. Both source and target column types are "string" with length 80. If, instead of the Essbase application, I make the target an Oracle table whose columns exactly mirror the Essbase application, ODI loads the data perfectly with proper conversion.
Can anybody please help me out?
Many thanks.
N. Laha.
Hi John,
Thank you very much for your response. Pasting here the SQL generated at step "8. Integration - Load data into Essbase" in the Operator, in the Description tab, for your perusal:
from com.hyperion.odi.common import ODIConstants
from com.hyperion.odi.connection import HypAppConnectionFactory
from java.lang import Class
from java.lang import Boolean
from java.sql import *
from java.util import HashMap
# Get the select statement on the staging area:
sql= """select CASEWHEN(C3_ACCOUNT = '3110', 'PL10100', C3_ACCOUNT) "Account",CASEWHEN(C4_PERIOD = 'January', 'Jan', C4_PERIOD) "TimePeriod",'HSP_Inputvalue' "HSP_Rates",C8_SCENARIO "Scenario",CASEWHEN(C2_YEAR = '2007', 'FY09', C2_YEAR) "Years",CASEWHEN(C1_CUSTOM1 = '2012', 'Atacand', C1_CUSTOM1) "Product",CASEWHEN(C7_VIEW = 'YTD', 'Input', C7_VIEW) "Version",CASEWHEN(C6_VALUE = 'USD', 'Local', C6_VALUE) "Currency",C9_ENTITY "Entity",C5_DATAVALUE "Data" from "C$_0AZGRP_PLData" where (1=1) """
srcCx = odiRef.getJDBCConnection("SRC")
stmt = srcCx.createStatement()
srcFetchSize=30
stmt.setFetchSize(srcFetchSize)
rs = stmt.executeQuery(sql)
#load the data
stats = pWriter.loadData(rs)
#close the database result set, connection
rs.close()
stmt.close()
From the SQL, you can see the CASEWHEN expressions. Yes, I added these in the interface mapping section with the help of the ODI expression editor. Below are the expressions I used, one statement per mapping. This is just to make the HFM data compatible with Essbase. For example, the account number in HFM for which I am extracting data is 3110, but there is no account '3110' in the Essbase application, so Essbase would not be able to load the data. To make the HFM data compatible with Essbase, these CASEWHEN statements have been used. I hope it is fine to use such statements in mappings.
CASEWHEN(HFMDATA."Year" = '2007', 'FY09', HFMDATA."Year")
CASEWHEN(HFMDATA."View" = 'YTD', 'Input', HFMDATA."View")
CASEWHEN(HFMDATA."Entity" = '100106_LC', '1030GB', HFMDATA."Entity")
CASEWHEN(HFMDATA."Value" = 'USD', 'Local', HFMDATA."Value")
CASEWHEN(HFMDATA."Account" = '3110', 'PL10100', HFMDATA."Account")
CASEWHEN(HFMDATA."Custom1" = '2012', 'Atacand', HFMDATA."Custom1")
CASEWHEN(HFMDATA."Period" = 'January', 'Jan', HFMDATA."Period")
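Each expression above follows the same pattern: rename one specific source member and pass everything else through. Purely as an illustration of that logic (the member names are the ones from this thread; this is not part of the ODI interface itself), the whole mapping collapses to a dictionary lookup with a default:

```python
# Member maps mirroring the CASEWHEN expressions above: one source
# member is renamed per dimension; all other members pass through.
MAPS = {
    "Account": {"3110": "PL10100"},
    "Period":  {"January": "Jan"},
    "Year":    {"2007": "FY09"},
    "View":    {"YTD": "Input"},
    "Entity":  {"100106_LC": "1030GB"},
    "Value":   {"USD": "Local"},
    "Custom1": {"2012": "Atacand"},
}

def remap(dimension, member):
    # CASEWHEN(x = 'a', 'b', x) is a lookup that defaults to the input.
    return MAPS.get(dimension, {}).get(member, member)
```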
If you can get some idea from the SQL, please help me out!
Many thanks.
N. Laha
Edited by: user8732338 on Sep 28, 2009 9:45 PM -
Error while loading data from FDM (EPM 11.1.1.3)
Hi,
We are loading data into HFM from FDM. While loading data from FDM, it throws the error below.
"Load data started: 5/20/2012 9:30:45 PM
Line 27, Error: Cell for Period Mar is not an input cell.
Actual,2011,Mar,YTD,ENT_JJ,JPY,A23232323,ENT_J76;[None];[None];[None];ICP_Load;3463437"
There were multiple lines throwing this error.
After doing some checking, we found that when we unpost the ICP data in HFM, we are able to load data from FDM. We have not locked the ICP entities, because that would restrict data loading. But we do not understand why merely unposting the ICP values allows us to load data from FDM. Also, we have not closed the period in HFM.
Kindly suggest what the possible reason could be.
Thanks
Edited by: user12121146 on May 27, 2012 5:29 AM
One way to figure this out would be to open a grid at the same POV.
It will rule out a couple of things.
If the cell is locked, you'll see that visually (it will be grey, I believe).
If it is an invalid intersection, you'll see that as well (it will be red).
If you do not have access to the cell, the cell status will say NO ACCESS.
That should help you figure out what is happening, the quick and dirty way. -
Error
[Load data from excel file [1]] Error: SSIS Error Code DTS_E_CANNOTACQUIRECONNECTIONFROMCONNECTIONMANAGER. The AcquireConnection method call to the connection manager "Excel Connection Manager" failed with error code 0xC0202009. There may be error messages posted before this with more information on why the AcquireConnection method call failed.
I am using BIDS (Microsoft Visual Studio 2008) and running the package to load the data from Excel.
My machine has 32-bit Excel, hence I have set the Run64BitRuntime property to False.
But the error still occurs.
I checked on Google; many have set the DelayValidation property to true at the data flow task level, but even using it at both the Excel connection manager and the DFT level, it doesn't work.
Mudassar
That's my connection string:
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=E:\SrcData\Feeds\Utilization.xlsx;Extended Properties="Excel 12.0;HDR=NO";
Excel 2010 is installed, and it's the 32-bit edition.
Are you referring to installing this component, AccessDatabaseEngine_x64.exe?
http://www.microsoft.com/en-us/download/details.aspx?id=13255
Mudassar
You can try an OLE DB provider in that case; see
http://dataintegrity.wordpress.com/2009/10/16/xlsx/
You might need to download and install the MS Access redistributable:
http://www.microsoft.com/en-in/download/details.aspx?id=13255
Visakh -
Using FDM to load data from oracle table (Integration Import Script)
Hi,
I am using an Integration Import Script to load data from an Oracle table into the work tables in FDM.
I am getting the following error while running the script:
"Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done."
Attaching the full error report
ERROR:
Code............................................. -2147217887
Description...................................... Multiple-step OLE DB operation generated errors. Check each OLE DB status value, if available. No work was done.
At line: 22
Procedure........................................ clsImpProcessMgr.fLoadAndProcessFile
Component........................................ upsWObjectsDM
Version.......................................... 1112
Thread........................................... 6260
IDENTIFICATION:
User............................................. ******
Computer Name.................................... *******
App Name......................................... FDMAPP
Client App....................................... WebClient
CONNECTION:
Provider......................................... ORAOLEDB.ORACLE
Data Server......................................
Database Name.................................... DBNAME
Trusted Connect.................................. False
Connect Status.. Connection Open
GLOBALS:
Location......................................... SCRTEST
Location ID...................................... 750
Location Seg..................................... 4
Category......................................... FDM ACTUAL
Category ID...................................... 13
Period........................................... Jun - 2011
Period ID........................................ 6/30/2011
POV Local........................................ True
Language......................................... 1033
User Level....................................... 1
All Partitions................................... True
Is Auditor....................................... False
I am using the following script
Function ImpScrTest(strLoc, lngCatKey, dblPerKey, strWorkTableName)
    'Oracle Hyperion FDM Integration Import Script:
    'Created By: Dhananjay
    'Date Created: 1/17/2012 10:29:53 AM
    'Purpose: A test script to import data from Oracle EBS tables
    Dim cnSS      'ADODB.Connection
    Dim strSQL    'SQL string
    Dim rs        'Recordset
    Dim rsAppend  'tTB table append rs object
    'Initialize objects
    Set cnSS = CreateObject("ADODB.Connection")
    Set rs = CreateObject("ADODB.Recordset")
    Set rsAppend = DW.DataAccess.farsTable(strWorkTableName)
    'Connect to the Oracle database
    cnSS.Open "Provider=OraOLEDB.Oracle.1;Data Source= +server+;Initial Catalog= +catalog+;User ID= +uid+;Password= +pass+"
    'Create query string
    strSQL = "Select AMOUNT,DESCRIPTION,ACCOUNT,ENTITY FROM +catalog+.TEST_TMP"
    'Get data
    rs.Open strSQL, cnSS
    'Check for data
    If rs.BOF And rs.EOF Then
        RES.PlngActionType = 2
        RES.PstrActionValue = "No Records to load!"
        Exit Function
    End If
    'Loop through records and append to the tTB table in the location's DB
    If Not rs.BOF And Not rs.EOF Then
        Do While Not rs.EOF
            rsAppend.AddNew
            rsAppend.Fields("PartitionKey") = RES.PlngLocKey
            rsAppend.Fields("CatKey") = RES.PlngCatKey
            rsAppend.Fields("PeriodKey") = RES.PdtePerKey
            rsAppend.Fields("DataView") = "YTD"
            rsAppend.Fields("CalcAcctType") = 9
            rsAppend.Fields("Amount") = rs.Fields("Amount").Value
            rsAppend.Fields("Desc1") = rs.Fields("Description").Value
            rsAppend.Fields("Account") = rs.Fields("Account").Value
            rsAppend.Fields("Entity") = rs.Fields("Entity").Value
            rsAppend.Update
            rs.MoveNext
        Loop
    End If
    'Records loaded
    RES.PlngActionType = 6
    RES.PstrActionValue = "Import successful!"
    'Assign return value (must match the function name, not SQLIntegration)
    ImpScrTest = True
End Function
Please help me on this
Thanks,
Dhananjay
Edited by: DBS on Feb 9, 2012 10:21 PM
Hi,
I found the problem. It was because of the connection string; the format is different for Oracle tables.
Here is the format:
cnSS.Open "Provider=OraOLEDB.Oracle.1;Data Source=servername:port/SID;Database=DB;User Id=aaaa;Password=aaaa;"
And thanks SH for the quick response.
So, closing the thread.
Thanks,
Dhananjay -
How to load data from MS Excel (CSV format) into NW
Hi,
I am doing a migration project from SAP MS to NW manually.
I need to know how to load data from MS Excel (CSV format) into NW,
for example, 2008 budget data.
Could you please help me with this?
Thanks and Regards
Krishna
Hi,
You need to create a transformation file, and a conversion file if required. First, upload the Excel (CSV) file into BPC using the Manage Data > Upload File option.
Create the transformation file (refer to the SAP Help on how to define a transformation file). You need to specify the mapping correctly, include all your application dimensions, and map them to the appropriate columns of the flat file.
Before running the import package, validate the data in the flat file you uploaded into BPC against the transformation file you created.
Thanks,
Sreeni -
Loading data from R/3 systems to BI infocubes
Hi all BI gurus,
Can anyone please explain, in simple steps, how to load data from SAP R/3 systems into an InfoCube?
Thanks in advance,
karthik
Hi,
Which module do you want?
If it's SD (for example), the steps are:
1> In AWB, go to "Business Content"
2> Go to "InfoProvider"
3> Under the InfoArea, select the SD cubes
4> Drag the related cubes and ODS to the right panel
5> Set the grouping option to "In Data flow before & afterwards"
6> Install the collected objects
Go to R/3:
7> Use transaction RSA5 to transfer all DataSources for the SD module
Go back to BW:
8> Right-click on the source system and choose "Replicate DataSources"
[DataSources|http://help.sap.com/saphelp_nw70/helpdata/en/3c/7b88408bc0bb4de10000000a1550b0/frameset.htm]
Edited by: Obily on Jul 10, 2008 8:36 AM -
How to regularly load data from .csv file to database (using apex)
Hi,
I am using APEX 3. I need to load data from a CSV file into APEX, and I need to perform this automatically through code at a regular interval of 5-10 seconds.
Is it possible? If yes, how? Please reply as early as possible; this will decide whether to use APEX or not for this application.
This is a question for Application Express. I don't know why it is in the BPEL forum.
Edited by: TEJU on Oct 24, 2008 2:57 PM
Hello,
You really need to load the data every 5-10 seconds? Presumably it's read-only?
I would look at using an Oracle external table instead, that way you just drop your CSV in a location the DB can read it and then you build a table that essentially references the underlying CSV (that way when you query the table you view the data in the CSV file).
Take a look at this link for a quick example on usage -
http://www.oracle-base.com/articles/9i/SQLNewFeatures9i.php#ExternalTables
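If the data really must be refreshed by code on a timer rather than exposed through an external table, the job is just a small polling loader. A minimal sketch, using SQLite from the Python standard library purely as a stand-in for the real database, and with a made-up staging table and id/name columns:

```python
import csv
import sqlite3

def load_csv(conn, path):
    # Replace the staging table's contents with the current CSV rows.
    with open(path, newline="") as f:
        rows = [(r["id"], r["name"]) for r in csv.DictReader(f)]
    conn.execute("DELETE FROM staging")
    conn.executemany("INSERT INTO staging (id, name) VALUES (?, ?)", rows)
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (id TEXT, name TEXT)")
# A real job would wrap load_csv in a loop with time.sleep(10) between runs.
```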
Hope this helps,
John.
Blog: http://jes.blogs.shellprompt.net
Work: http://www.apex-evangelists.com
Author of Pro Application Express: http://tinyurl.com/3gu7cd
REWARDS: Please remember to mark helpful or correct posts on the forum, not just for my answers but for everyone! -
Load data from XML files into BI
Hi All,
Can anyone guide me through the steps involved in loading data from XML files into BI?
Thanks
Santosh
Hi James,
I have followed up to step 12.
Please could you let me know how to link the XML file on my local drive to BI?
I am not able to figure out where to specify the path of the XML file to be loaded into BI.
Thanks In Advance
Regards,
San
1. Create an InfoSource (ZIS) with a transfer structure matching the structure in the XML file. (You can open the XML file with MS Excel; this way you can get to know the structure.)
2. Activate the transfer rules.
3. After activation, from the menu bar, select Extras > Create BW DataSource with SOAP Connection.
4. Now activate the InfoSource again; this creates an XML-based DataSource (6AXXX...).
5. These steps create two sub-nodes under the InfoSource (ZIS).
6. Select the Myself system, create an InfoPackage, and execute it in Init mode (without data transfer).
7. This creates an entry in RSA7 (the BW delta queue).
8. Create another delta InfoPackage under it and run it. Now the DataSource (6AXXXXXX...) turns green in RSA7.
9. In Function Builder (SE37), select your FM (search with F4 on the DataSource 6AXXX...).
10. Inside this RFC-based FM, from the menu bar, select Utilities > More Utilities > Create Web Services > From Function Module.
11. A wizard will guide you through the next steps.
12. Once this is done, a web service is enabled in WSADMIN. Select your FM and execute it.
13. From here you can upload the data from the XML file into the BW delta queue.
Edited by: Santosh on Nov 30, 2008 2:22 PM
Loading data from a file - originally posted in Spatial forum
Loading data from a file
Posted: Jun 23, 2008 10:21 AM
Hello,
I want to load my data into Oracle from an RDF file and query it later with SPARQL.
So, I construct a Jena model and then store it using this program; but how can I query it without loading the file again? Thanks.
Oracle oracle = new Oracle(jdbcUrl, user, password);
// read a data file into a default Jena model.
Model model = ModelFactory.createDefaultModel();
InputStream is;
is = new BufferedInputStream( new FileInputStream(filename));
model.read(new InputStreamReader(is), "", "");
is.close();
ModelOracleSem modelDest = ModelOracleSem.createOracleSemModel(oracle, modelName);
GraphOracleSem g = new GraphOracleSem(oracle, modelName);
g.dropApplicationTableIndex();
if (method == 0) {
    psOut.println("start incremental");
    modelDest.add(model);
    psOut.println("end size " + modelDest.size());
} else if (method == 1) {
    psOut.println("start batch load");
    ((OracleBulkUpdateHandler) g.getBulkUpdateHandler()).addInBatch(
        GraphUtil.findAll(model.getGraph()), tbs);
    psOut.println("end size " + modelDest.size());
} else {
    psOut.println("start bulk load");
    ((OracleBulkUpdateHandler) g.getBulkUpdateHandler()).addInBulk(
        GraphUtil.findAll(model.getGraph()), tbs);
    psOut.println("end size " + modelDest.size());
}
g.rebuildApplicationTableIndex();
Hi,
Chapter 1 (section 1.8) in the documentation has a quickstart on these steps - creating a semantic network, a model, loading the data using INSERT statements and so on. For faster load of large amounts of data you can use the bulk load functionality, documented in chapter 1 (section 1.7).
For Java code, you could use the Oracle Jena Adaptor, and the best place to start is with the documentation of the Oracle Jena Adaptor, at http://www.oracle.com/technology/tech/semantic_technologies/htdocs/documentation.html
Melli -
Download data from frames on a webpage
Hi all,
I have a scenario where I need to download data from a web page.
I am able to make HTTP requests using HttpClient easily, and I am receiving the correct response (I can say this because I get the same content when I use a browser for the same request; I checked the page's view-source and confirmed I got the right response).
The data on the web page is in FRAMES. When I make the request using the browser, I can view my data by selecting that particular frame (and viewing the source of that particular frame).
If I need to read the data using a Java program, how can I navigate to that particular frame?
Below is a sample response when I make the request to the web page using HttpClient:
<link rel=STYLESHEET href="Detail.css" type="text/css">
<script language="JavaScript">
var jsp_path="";
var sfrw_servlet="Servlet";
var sfrw_report_view_servlet="Servlet1";
var sfrw_report_download_servlet="Servlet2";
</script>
<html>
<head>
<META http-equiv="Content-Type" content="text/html; charset=ISO-8859-1">
<title> Website</title>
<script language="JavaScript" src="NavigationScripts.js"></script>
</head>
<FRAMESET ROWS="120,*" COLS="*" FRAMEBORDER="NO" BORDER="0" FRAMESPACING="0">
<!-- COGNIZANT MODIFY END 09/05/2007 -->
<FRAME NAME="Branding" SCROLLING="NO" NORESIZE SRC="Branding.jsp?navigationState=DEFAULT" >
<FRAMESET COLS="10,975,*" FRAMEBORDER="NO" BORDER="0" FRAMESPACING="10" ROWS="*">
<FRAME NAME="Navigation" NORESIZE SCROLLING="NO" src="Blank.jsp">
<FRAMESET ROWS="37,*" FRAMEBORDER="NO" BORDER="0" FRAMESPACING="0" ROWS="*">
<FRAME NAME="Header" NORESIZE SCROLLING="NO" src="Blank.jsp">
<FRAME NAME="Detail" NORESIZE SCROLLING="AUTO" src="SearchAll.jsp">
</FRAMESET>
<FRAME NAME="Navigation1" NORESIZE SCROLLING="NO" src="Blank.jsp">
</FRAMESET>
</FRAMESET>
</HTML>
Actually, I am not sure which frame of the above response holds my data. Let's suppose the frame "Detail" holds my data; how can I get data from that frame?
Any help?
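Since each frame is its own document fetched from its own URL, one way (sketched here with Python's stdlib html.parser rather than Java, purely to show the idea) is to parse the frameset for the frame names and src values, then request the wanted frame, e.g. Detail, directly with the same HTTP client:

```python
from html.parser import HTMLParser

class FrameFinder(HTMLParser):
    """Collect the NAME/SRC of every <FRAME> in a frameset page."""
    def __init__(self):
        super().__init__()
        self.frames = {}

    def handle_starttag(self, tag, attrs):
        # html.parser lowercases tag and attribute names for us.
        if tag == "frame":
            d = dict(attrs)
            self.frames[d.get("name")] = d.get("src")

# Trimmed from the sample response shown above.
page = """<FRAMESET ROWS="37,*">
<FRAME NAME="Header" SCROLLING="NO" src="Blank.jsp">
<FRAME NAME="Detail" SCROLLING="AUTO" src="SearchAll.jsp">
</FRAMESET>"""

finder = FrameFinder()
finder.feed(page)
# finder.frames["Detail"] now holds the relative URL to fetch next.
```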
Edited by: LoveOpensource on Mar 28, 2008 5:39 AM
Try FM KCD_EXCEL_OLE_TO_INT_CONVERT:
DATA itab TYPE TABLE OF kcde_cells WITH HEADER LINE.

CALL FUNCTION 'KCD_EXCEL_OLE_TO_INT_CONVERT'
  EXPORTING
    filename    = 'C:\test.xls'
    i_begin_col = 1
    i_begin_row = 1
    i_end_col   = 3
    i_end_row   = 3
  TABLES
    intern      = itab
  EXCEPTIONS
    inconsistent_parameters = 1
    upload_ole              = 2
    OTHERS                  = 3.

IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
ENDIF.

LOOP AT itab.
  WRITE: / itab-row, itab-col, itab-value.
ENDLOOP.
rgds,
PJ -
Unable to load data from an ODS to a Cube
Hi all,
I am trying to load data from an ODS into a cube, and I'm getting the following message at the bottom of the monitor screen: <b>"No corresponding requests were found for request REQU_BZUH9ZPYNL5XLQNB7K8IQDKFS in the ODS/Cube" Message no. RSBM043.</b>
I am unable to load the data. The QM status is yellow. When the process starts, the left-hand side of the monitor shows "0 of 84408 records", which is OK. I have loaded data with this InfoPackage before and it worked well.
Can you tell me what I have to do? I tried waiting, but it was the same: no progress and no data in the cube.
Thank you very much and kind regards,
MM.
Maybe this helps:
When I look at the status, it says that I have a short dump in BW. It was CALL_FUNCTION_REMOTE_ERROR, and the short text is "The function module "SUBST_TRANSFER_UPGEVAL" cannot be used for 'remote'". This short dump occurs well before I start the upload.
Thanks again.
Hello MM,
Can you do the following:
1. Make the total status red and delete the request from the cube.
2. Go to the ODS, remove the data mart status, and try loading it again.
The error message that you get is common when we try to run an InfoPackage with a source object that has no new request (meaning all the requests available in the source have already been moved to the targets). So please try the steps given above.
Hope it will help! -
To load data from a cube in SCM(APO) system to a cube in BI system.
Experts,
Please let me know whether it is possible to load data from a cube in an SCM (APO) system to a cube in a BI system. If so, please explain the steps to perform this.
Thanks,
Meera
Hi,
Think of it this way: to load data from any source, we need a DataSource. You can generate an export DataSource for the cube in APO, then use that DataSource for BW extraction. I think it will work; in my case, I'm directly loading data from APO to BW using DataSources generated on the planning area.
Why do you need to take data from the APO cube? Is there any condition attached to it? If it is not mandatory, you can use the same DataSource and load the data into BW. If calculations are applied while loading the data from APO into the APO cube, check whether they are possible in BW; if so, use the DataSource and do the same calculations directly in BW.
Thanks
Reddy