FDMEE BefImport script for SQL source

Hi Experts,
I am new to FDMEE and am trying to implement it for a process that should pull data from a SQL view and load it into Essbase. I am trying to use the Open Interface table, so I am building a BefImport event script with the help of the admin guide example. I know I am missing many things in the following script, but I am not sure what, so can anyone please help and guide me to build a correct BefImport event script?
Below is the script:
# Sample to import data from a custom SQL source and load it into the FDMEE
# open interface table. This script should be called in the BefImport event.
# It is an alternative to an FDMEE integration import script.
import java.sql as sql

# LOCNAME is the documented fdmContext key for the current location (see the log below).
batchName = "Batch_" + str(fdmContext["LOCNAME"])

insertStmt = """
INSERT INTO AIF_OPEN_INTERFACE (
BATCH_NAME
,COL01
,COL02
,COL03
,COL04
,COL05
,COL06
,COL07
,COL08
,AMOUNT
) VALUES (?,?,?,?,?,?,?,?,?,?)
"""

# The JDBC URL format, user and password are placeholders for the SQL Server source;
# adjust them for the driver in use and make sure its jar is on the FDMEE class path.
sourceConn = sql.DriverManager.getConnection(
    "jdbc:sqlserver://hyperion-financial-dev;databaseName=Custom_hfm_dev1", "testuser", "password")

# Limit the number of rows during test runs (e.g. add TOP 5 to the select if needed).
selectStmt = "SELECT * FROM dbo.ACTUAL_ALL_USD WHERE FYear = '2014' AND period = '1'"
stmt = sourceConn.prepareStatement(selectStmt)
stmtRS = stmt.executeQuery()

# Column names assume the view exposes account, entity, Dim1..Dim6 and Data.
while stmtRS.next():
    params = [batchName, stmtRS.getString("account"), stmtRS.getString("entity"),
              stmtRS.getString("Dim1"), stmtRS.getString("Dim2"), stmtRS.getString("Dim3"),
              stmtRS.getString("Dim4"), stmtRS.getString("Dim5"), stmtRS.getString("Dim6"),
              stmtRS.getBigDecimal("Data")]
    fdmAPI.executeDML(insertStmt, params, False)

fdmAPI.commitTransaction()
stmtRS.close()
stmt.close()
sourceConn.close()
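Since the log below only reports "The script has failed to execute" with no detail, a hedged variant of the same loop that logs a row count and always releases the JDBC objects may help with troubleshooting (fdmAPI.logInfo is the standard FDMEE logging call; everything else mirrors the script above):

# Troubleshooting sketch: same insert loop as above, with logging and guaranteed cleanup.
try:
    rowCount = 0
    while stmtRS.next():
        params = [batchName, stmtRS.getString("account"), stmtRS.getString("entity"),
                  stmtRS.getString("Dim1"), stmtRS.getString("Dim2"), stmtRS.getString("Dim3"),
                  stmtRS.getString("Dim4"), stmtRS.getString("Dim5"), stmtRS.getString("Dim6"),
                  stmtRS.getBigDecimal("Data")]
        fdmAPI.executeDML(insertStmt, params, False)
        rowCount += 1
    fdmAPI.commitTransaction()
    fdmAPI.logInfo("AIF_OPEN_INTERFACE rows inserted for " + batchName + ": " + str(rowCount))
finally:
    # Release the source JDBC resources even if an insert fails.
    stmtRS.close()
    stmt.close()
    sourceConn.close()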
Thanks
VJ

Here is the log:
5,489 INFO  [AIF]: -------START IMPORT STEP-------
2015-04-16 11:37:35,630 DEBUG [AIF]: CommData.preImportData - START
2015-04-16 11:37:35,630 DEBUG [AIF]: CommData.getRuleInfo - START
2015-04-16 11:37:35,630 DEBUG [AIF]:
        SELECT brl.RULE_ID, br.RULE_NAME, brl.PARTITIONKEY, brl.CATKEY, part.PARTVALGROUP, br.SOURCE_SYSTEM_ID, ss.SOURCE_SYSTEM_TYPE
        ,CASE
           WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'EBS%' THEN 'N'    
           WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'PS%' THEN 'N'
           WHEN ss.SOURCE_SYSTEM_TYPE LIKE 'FUSION%' THEN 'N' 
           WHEN ss.SOURCE_SYSTEM_TYPE = 'FILE' THEN 'N'
           WHEN ss.SOURCE_SYSTEM_TYPE = 'EPM' THEN 'N'
           ELSE 'Y'
         END SOURCE_ADAPTER_FLAG
        ,app.APPLICATION_ID, app.TARGET_APPLICATION_NAME, app.TARGET_APPLICATION_TYPE, app.DATA_LOAD_METHOD, brl.PLAN_TYPE
        ,CASE brl.PLAN_TYPE
           WHEN 'PLAN1' THEN 1 WHEN 'PLAN2' THEN 2 WHEN 'PLAN3' THEN 3 WHEN 'PLAN4' THEN 4 WHEN 'PLAN5' THEN 5 WHEN 'PLAN6' THEN 6 ELSE 0
         END PLAN_NUMBER
        ,br.INCL_ZERO_BALANCE_FLAG, br.PERIOD_MAPPING_TYPE, br.INCLUDE_ADJ_PERIODS_FLAG, br.BALANCE_TYPE ACTUAL_FLAG
        ,br.AMOUNT_TYPE, br.BALANCE_SELECTION, br.BALANCE_METHOD_CODE
        ,COALESCE(br.SIGNAGE_METHOD, 'ABSOLUTE') SIGNAGE_METHOD
        ,br.CURRENCY_CODE, br.BAL_SEG_VALUE_OPTION_CODE, brl.EXECUTION_MODE
        ,COALESCE(brl.IMPORT_FROM_SOURCE_FLAG, 'Y') IMPORT_FROM_SOURCE_FLAG
        ,COALESCE(brl.RECALCULATE_FLAG, 'N') RECALCULATE_FLAG
        ,COALESCE(brl.EXPORT_TO_TARGET_FLAG, 'N') EXPORT_TO_TARGET_FLAG
        ,COALESCE(brl.CHECK_FLAG, 'N') CHECK_FLAG
        ,CASE
           WHEN ss.SOURCE_SYSTEM_TYPE = 'EPM' THEN 'NONE'
           WHEN (br.LEDGER_GROUP_ID IS NOT NULL) THEN 'MULTI'
           WHEN (br.SOURCE_LEDGER_ID IS NOT NULL) THEN 'SINGLE'
           ELSE 'NONE'
         END LEDGER_GROUP_CODE
        ,COALESCE(br.BALANCE_AMOUNT_BS, 'YTD') BALANCE_AMOUNT_BS
        ,COALESCE(br.BALANCE_AMOUNT_IS, 'PERIODIC') BALANCE_AMOUNT_IS
        ,br.LEDGER_GROUP
        ,(SELECT brd.DETAIL_CODE FROM AIF_BAL_RULE_DETAILS brd WHERE brd.RULE_ID = br.RULE_ID AND brd.DETAIL_TYPE = 'LEDGER') PS_LEDGER
        ,CASE lg.LEDGER_TEMPLATE WHEN 'COMMITMENT' THEN 'Y' ELSE 'N' END KK_FLAG
        ,p.LAST_UPDATED_BY, p.AIF_WEB_SERVICE_URL WEB_SERVICE_URL, p.EPM_ORACLE_INSTANCE
        ,brl.JOURNAL_FLAG, br.MULTI_PERIOD_FILE_FLAG, br.IMPGROUPKEY, imp.IMPSOURCELEDGERID
        ,imp.IMPGROUPFILETYPE, imp.IMPTARGETSOURCESYSTEMID, imp.IMPSOURCECOAID, part.PARTTARGETAPPLICATIONID
        FROM AIF_PROCESSES p
        INNER JOIN AIF_BAL_RULE_LOADS brl
          ON brl.LOADID = p.PROCESS_ID
        INNER JOIN AIF_BALANCE_RULES br
          ON br.RULE_ID = brl.RULE_ID
        INNER JOIN AIF_SOURCE_SYSTEMS ss
          ON ss.SOURCE_SYSTEM_ID = br.SOURCE_SYSTEM_ID
        INNER JOIN AIF_TARGET_APPLICATIONS app
          ON app.APPLICATION_ID = brl.APPLICATION_ID
        INNER JOIN TPOVPARTITION part
          ON part.PARTITIONKEY = br.PARTITIONKEY
        INNER JOIN TBHVIMPGROUP imp
          ON imp.IMPGROUPKEY = part.PARTIMPGROUP
        LEFT OUTER JOIN AIF_COA_LEDGERS l
          ON l.SOURCE_SYSTEM_ID = p.SOURCE_SYSTEM_ID
          AND l.SOURCE_LEDGER_ID = COALESCE(br.SOURCE_LEDGER_ID,imp.IMPSOURCELEDGERID)
        LEFT OUTER JOIN AIF_PS_SET_CNTRL_REC_STG scr
          ON scr.SOURCE_SYSTEM_ID = l.SOURCE_SYSTEM_ID
          AND scr.SETCNTRLVALUE = l.SOURCE_LEDGER_NAME
          AND scr.RECNAME = 'LED_GRP_TBL'
        LEFT OUTER JOIN AIF_PS_LED_GRP_TBL_STG lg
          ON lg.SOURCE_SYSTEM_ID = scr.SOURCE_SYSTEM_ID
          AND lg.SETID = scr.SETID
          AND lg.LEDGER_GROUP = br.LEDGER_GROUP
        WHERE p.PROCESS_ID = 309
2015-04-16 11:37:35,645 DEBUG [AIF]:
      SELECT adim.BALANCE_COLUMN_NAME DIMNAME
      ,adim.DIMENSION_ID
      ,dim.TARGET_DIMENSION_CLASS_NAME
      ,(SELECT COA_SEGMENT_NAME FROM AIF_COA_SEGMENTS cs WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID1) COA_SEGMENT_NAME1
      ,(SELECT COA_SEGMENT_NAME FROM AIF_COA_SEGMENTS cs WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID2) COA_SEGMENT_NAME2
      ,(SELECT COA_SEGMENT_NAME FROM AIF_COA_SEGMENTS cs WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID3) COA_SEGMENT_NAME3
      ,(SELECT COA_SEGMENT_NAME FROM AIF_COA_SEGMENTS cs WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID4) COA_SEGMENT_NAME4
      ,(SELECT COA_SEGMENT_NAME FROM AIF_COA_SEGMENTS cs WHERE cs.COA_LINE_ID = tiie.IMPSOURCECOALINEID5) COA_SEGMENT_NAME5
      ,(SELECT DISTINCT CASE mdd.ORPHAN_OPTION_CODE
          WHEN 'CHILD' THEN 'N'
          WHEN 'ROOT' THEN 'N'
          ELSE 'Y'
        END DIMENSION_FILTER_FLAG
        FROM AIF_MAP_DIM_DETAILS_V mdd
        ,AIF_MAPPING_RULES mr
        WHERE mr.PARTITIONKEY = tpp.PARTITIONKEY
        AND mdd.RULE_ID = mr.RULE_ID
        AND mdd.DIMENSION_ID = adim.DIMENSION_ID
      ) DIMENSION_FILTER_FLAG
      ,tiie.IMPCONCATCHAR
      FROM TPOVPARTITION tpp
      INNER JOIN AIF_TARGET_APPL_DIMENSIONS adim
        ON adim.APPLICATION_ID = 28
      INNER JOIN AIF_DIMENSIONS dim
        ON dim.DIMENSION_ID = adim.DIMENSION_ID
      LEFT OUTER JOIN TBHVIMPITEMERPI tiie
        ON tiie.IMPGROUPKEY = tpp.PARTIMPGROUP
        AND tiie.IMPFLDFIELDNAME = adim.BALANCE_COLUMN_NAME
        AND tiie.IMPMAPTYPE = 'ERP'
      WHERE tpp.PARTITIONKEY = 13
      AND adim.BALANCE_COLUMN_NAME IS NOT NULL
      AND dim.TARGET_DIMENSION_CLASS_NAME <> 'ICPTRANS'
      AND (adim.VALID_FOR_PLAN1 = 1 OR dim.TARGET_DIMENSION_CLASS_NAME = 'LOOKUP')
      ORDER BY adim.BALANCE_COLUMN_NAME
2015-04-16 11:37:35,661 DEBUG [AIF]: {'APPLICATION_ID': 28L, 'IMPORT_FROM_SOURCE_FLAG': u'Y', 'PLAN_TYPE': u'PLAN1', 'RULE_NAME': u'HFMAct', 'ACTUAL_FLAG': None, 'IS_INCREMENTAL_LOAD': False, 'EPM_ORACLE_INSTANCE': u'D:\\Oracle\\Middleware\\user_projects\\epmsystem1', 'CATKEY': 2L, 'BAL_SEG_VALUE_OPTION_CODE': None, 'INCLUDE_ADJ_PERIODS_FLAG': u'N', 'PERIOD_MAPPING_TYPE': u'EXPLICIT', 'SOURCE_SYSTEM_TYPE': u'OTHERS', 'CHECK_FLAG': u'N', 'LEDGER_GROUP': None, 'TARGET_APPLICATION_NAME': u'Corp', 'RECALCULATE_FLAG': u'Y', 'SOURCE_SYSTEM_ID': 16L, 'TEMP_DATA_TABLE_NAME': 'TDATASEG_T', 'KK_FLAG': u'N', 'IMPGROUPKEY': None, 'AMOUNT_TYPE': None, 'DATA_TABLE_NAME': 'TDATASEG', 'EXPORT_TO_TARGET_FLAG': u'N', 'JOURNAL_FLAG': None, 'SOURCE_APPLICATION_ID': None, 'DIMNAME_LIST': [u'ACCOUNT', u'ENTITY', u'UD1', u'UD2', u'UD3', u'UD4', u'UD5', u'UD6'], 'FCI_FLAG': 'N', 'IMPSOURCECOAID': 0L, 'TDATAMAPTYPE': 'ERP', 'LAST_UPDATED_BY': u'admin', 'DIMNAME_MAP': {u'UD6': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD6', 'DIMENSION_ID': 33L}, u'UD3': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD3', 'DIMENSION_ID': 30L}, u'ENTITY': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Entity', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ENTITY', 'DIMENSION_ID': 26L}, u'UD2': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD2', 'DIMENSION_ID': 23L}, u'UD5': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD5', 'DIMENSION_ID': 32L}, u'UD4': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Generic', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD4', 'DIMENSION_ID': 31L}, u'ACCOUNT': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Account', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'ACCOUNT', 'DIMENSION_ID': 27L}, u'UD1': {'IMPCONCATCHAR': None, 'TARGET_DIMENSION_CLASS_NAME': u'Version', 'COA_SEGMENT_NAME5': None, 'COA_SEGMENT_NAME1': None, 'COA_SEGMENT_NAME2': None, 'COA_SEGMENT_NAME3': None, 'DIMENSION_FILTER_FLAG': None, 'COA_SEGMENT_NAME4': None, 'DIMNAME': u'UD1', 'DIMENSION_ID': 29L}}, 'TARGET_APPLICATION_TYPE': u'HPL', 'PARTITIONKEY': 13L, 'PARTVALGROUP': u'[NONE]', 'LEDGER_GROUP_CODE': u'NONE', 'INCLUDE_ZERO_BALANCE_FLAG': u'N', 'EXECUTION_MODE': None, 'PLAN_NUMBER': 1L, 'MULTI_PERIOD_FILE_FLAG': None, 'PS_LEDGER': None, 'BALANCE_SELECTION': None, 'IMPGROUPFILETYPE': u'ODI', 'BALANCE_AMOUNT_IS': 
u'PERIODIC', 'RULE_ID': 31L, 'BALANCE_AMOUNT_BS': u'YTD', 'CURRENCY_CODE': None, 'SOURCE_ADAPTER_FLAG': u'Y', 'BALANCE_METHOD_CODE': None, 'SIGNAGE_METHOD': u'ABSOLUTE', 'WEB_SERVICE_URL': u'http://STAAP1655D.R02.XLGS.LOCAL:6550/aif', 'DATA_LOAD_METHOD': u'CLASSIC_VIA_EPMI', 'PARTTARGETAPPLICATIONID': 28L, 'IMPTARGETSOURCESYSTEMID': 0L}
2015-04-16 11:37:35,661 DEBUG [AIF]: CommData.getRuleInfo - END
2015-04-16 11:37:35,692 DEBUG [AIF]: CommData.insertPeriods - START
2015-04-16 11:37:35,692 DEBUG [AIF]: CommData.getLedgerListAndMap - START
2015-04-16 11:37:35,692 DEBUG [AIF]: CommData.getLedgerSQL - START
2015-04-16 11:37:35,692 DEBUG [AIF]: CommData.getLedgerSQL - END
2015-04-16 11:37:35,692 DEBUG [AIF]:
          SELECT COALESCE(br.SOURCE_LEDGER_ID,0) SOURCE_LEDGER_ID
          ,NULL SOURCE_LEDGER_NAME
          ,NULL SOURCE_COA_ID
          ,br.CALENDAR_ID
          ,NULL SETID
          ,NULL PERIOD_TYPE
          ,NULL LEDGER_TABLE_NAME
          FROM AIF_BALANCE_RULES br
          WHERE br.RULE_ID = 31
2015-04-16 11:37:35,692 DEBUG [AIF]: CommData.getLedgerListAndMap - END
2015-04-16 11:37:35,692 DEBUG [AIF]: doAppPeriodMappingsExist - Corp: Y
2015-04-16 11:37:35,692 DEBUG [AIF]: Period mapping section: ERPI/EXPLICIT/BUDGET/APFY
2015-04-16 11:37:35,692 DEBUG [AIF]:
        INSERT INTO AIF_PROCESS_PERIODS (
          PROCESS_ID
          ,PERIODKEY
          ,PERIOD_ID
          ,ADJUSTMENT_PERIOD_FLAG
          ,GL_PERIOD_YEAR
          ,GL_PERIOD_NUM
          ,GL_PERIOD_NAME
          ,GL_PERIOD_CODE
          ,GL_EFFECTIVE_PERIOD_NUM
          ,YEARTARGET
          ,PERIODTARGET
          ,IMP_ENTITY_TYPE
          ,IMP_ENTITY_ID
          ,IMP_ENTITY_NAME
          ,TRANS_ENTITY_TYPE
          ,TRANS_ENTITY_ID
          ,TRANS_ENTITY_NAME
          ,PRIOR_PERIOD_FLAG
          ,SOURCE_LEDGER_ID
                SELECT DISTINCT brl.LOADID PROCESS_ID
                ,pp.PERIODKEY PERIODKEY
                ,prd.PERIOD_ID
                ,COALESCE(prd.ADJUSTMENT_PERIOD_FLAG, 'N') ADJUSTMENT_PERIOD_FLAG
                ,COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) GL_PERIOD_YEAR
                ,COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0) GL_PERIOD_NUM
                ,COALESCE(prd.PERIOD_NAME, ppsrc.GL_PERIOD_NAME,'0') GL_PERIOD_NAME
                ,COALESCE(prd.PERIOD_CODE, CAST(COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0) AS VARCHAR(38)),'0') GL_PERIOD_CODE
                ,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) GL_EFFECTIVE_PERIOD_NUM
                ,pp.YEARTARGET YEARTARGET
                ,pp.PERIODTARGET PERIODTARGET
                ,'PROCESS_BAL_IMP' IMP_ENTITY_TYPE
                ,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) IMP_ENTITY_ID
                ,COALESCE(prd.PERIOD_NAME, ppsrc.GL_PERIOD_NAME,'0')+' ('+pp.PERIODDESC+')' IMP_ENTITY_NAME
                ,'PROCESS_BAL_TRANS' TRANS_ENTITY_TYPE
                ,(COALESCE(prd.YEAR, ppsrc.GL_PERIOD_YEAR,0) * 10000 + COALESCE(prd.PERIOD_NUM, ppsrc.GL_PERIOD_NUM,0)) TRANS_ENTITY_ID
                ,pp.PERIODDESC TRANS_ENTITY_NAME
                ,'N' PRIOR_PERIOD_FLAG
                ,0 SOURCE_LEDGER_ID
                FROM (
                  AIF_BAL_RULE_LOADS brl
                  INNER JOIN TPOVCATEGORY pc
                    ON pc.CATKEY = brl.CATKEY
                  INNER JOIN TPOVPERIODADAPTOR_FLAT_V pp
                    ON pp.PERIODFREQ = pc.CATFREQ
                    AND pp.PERIODKEY >= brl.START_PERIODKEY
                    AND pp.PERIODKEY <= brl.END_PERIODKEY
                    AND pp.INTSYSTEMKEY = 'Corp'
                INNER JOIN TPOVPERIODSOURCE ppsrc
                  ON ppsrc.PERIODKEY = pp.PERIODKEY
                  AND ppsrc.MAPPING_TYPE = 'EXPLICIT'
                  AND ppsrc.SOURCE_SYSTEM_ID = 16
                  AND ppsrc.CALENDAR_ID IN ('Jan')
                LEFT OUTER JOIN AIF_GL_PERIODS_STG prd
                  ON prd.PERIOD_ID = ppsrc.PERIOD_ID
                  AND prd.SOURCE_SYSTEM_ID = ppsrc.SOURCE_SYSTEM_ID
                  AND prd.CALENDAR_ID = ppsrc.CALENDAR_ID
                  AND prd.ADJUSTMENT_PERIOD_FLAG = 'N'
                WHERE brl.LOADID = 309
                ORDER BY pp.PERIODKEY
                ,GL_EFFECTIVE_PERIOD_NUM
2015-04-16 11:37:35,692 DEBUG [AIF]: periodSQL - periodParams: ['N', 'PROCESS_BAL_IMP', 'PROCESS_BAL_TRANS', 0L, u'Corp', u'EXPLICIT', 16L, u'Jan', 'N', 309]
2015-04-16 11:37:35,708 DEBUG [AIF]: CommData.insertPeriods - END
2015-04-16 11:37:35,708 DEBUG [AIF]: CommData.getPovList - START
2015-04-16 11:37:35,708 DEBUG [AIF]:
        SELECT DISTINCT brl.PARTITIONKEY, part.PARTNAME, brl.CATKEY, cat.CATNAME, pprd.PERIODKEY
        ,COALESCE(pp.PERIODDESC, CONVERT(VARCHAR,pprd.PERIODKEY,120)) PERIODDESC
        ,brl.RULE_ID, br.RULE_NAME, CASE WHEN (tlp.INTLOCKSTATE = 60) THEN 'Y' ELSE 'N' END LOCK_FLAG
        FROM AIF_BAL_RULE_LOADS brl
        INNER JOIN AIF_BALANCE_RULES br
          ON br.RULE_ID = brl.RULE_ID
        INNER JOIN TPOVPARTITION part
          ON part.PARTITIONKEY = brl.PARTITIONKEY
        INNER JOIN TPOVCATEGORY cat
          ON cat.CATKEY = brl.CATKEY
        INNER JOIN AIF_PROCESS_PERIODS pprd
          ON pprd.PROCESS_ID = brl.LOADID
          LEFT OUTER JOIN TPOVPERIODADAPTOR pp
            ON pp.PERIODKEY = pprd.PERIODKEY
            AND pp.INTSYSTEMKEY = 'Corp'
        LEFT OUTER JOIN TLOGPROCESS tlp
          ON tlp.PARTITIONKEY = brl.PARTITIONKEY
          AND tlp.CATKEY = brl.CATKEY
          AND tlp.PERIODKEY = pprd.PERIODKEY
          AND tlp.RULE_ID = brl.RULE_ID
        WHERE brl.LOADID = 309
        ORDER BY brl.PARTITIONKEY, brl.CATKEY, pprd.PERIODKEY, brl.RULE_ID
2015-04-16 11:37:35,708 DEBUG [AIF]: CommData.getPovList - END
2015-04-16 11:37:35,708 DEBUG [AIF]: CommData.insertImportProcessDetails - START
2015-04-16 11:37:35,708 DEBUG [AIF]:
          INSERT INTO AIF_PROCESS_DETAILS (
            PROCESS_ID
            ,ENTITY_TYPE
            ,ENTITY_ID
            ,ENTITY_NAME
            ,ENTITY_NAME_ORDER
            ,TARGET_TABLE_NAME
            ,EXECUTION_START_TIME
            ,EXECUTION_END_TIME
            ,RECORDS_PROCESSED
            ,STATUS
            ,LAST_UPDATED_BY
            ,LAST_UPDATE_DATE
          SELECT PROCESS_ID
          ,ENTITY_TYPE
          ,ENTITY_ID
          ,ENTITY_NAME
          ,ENTITY_NAME_ORDER
          ,'TDATASEG' TARGET_TABLE_NAME
          ,CURRENT_TIMESTAMP EXECUTION_START_TIME
          ,NULL EXECUTION_END_TIME
          ,0 RECORDS_PROCESSED
          ,'PENDING' STATUS
          ,'admin' LAST_UPDATED_BY
          ,CURRENT_TIMESTAMP LAST_UPDATE_DATE
          FROM (
            SELECT DISTINCT PROCESS_ID
            ,IMP_ENTITY_TYPE ENTITY_TYPE
            ,IMP_ENTITY_ID ENTITY_ID
            ,IMP_ENTITY_NAME ENTITY_NAME
            ,(COALESCE(SOURCE_LEDGER_ID,0) * 100000000 + GL_EFFECTIVE_PERIOD_NUM) ENTITY_NAME_ORDER
            FROM AIF_PROCESS_PERIODS
            WHERE PROCESS_ID = 309
          ) q
          ORDER BY ENTITY_NAME_ORDER
2015-04-16 11:37:35,724 DEBUG [AIF]: CommData.insertImportProcessDetails - END
2015-04-16 11:37:35,724 DEBUG [AIF]: Comm.doScriptInit - START
2015-04-16 11:37:35,833 DEBUG [AIF]: fdmContext: {BATCHSCRIPTDIR=D:\Oracle\Middleware\user_projects\epmsystem1\FinancialDataQuality, INBOXDIR=D:\AppFDM\FDMEE\inbox, LOCNAME=HFM_ACT, SOURCENAME=HFMSQL, APPID=28, SOURCEID=16, APPROOTDIR=D:\AppFDM\FDMEE, IMPORTFORMAT=HFMACTUALSQL, SCRIPTSDIR=D:\AppFDM\FDMEE\data\scripts, EPMORACLEHOME=D:\Oracle\Middleware\EPMSystem11R1, TARGETAPPTYPE=HPL, RULEID=31, CATNAME=Actual, EPMORACLEINSTANCEHOME=D:\Oracle\Middleware\user_projects\epmsystem1, LOADID=309, PERIODNAME=Jan-15, IMPORTMODE=null, SOURCETYPE=OTHERS, PERIODKEY=2015-01-31, EXPORTFLAG=N, TARGETAPPDB=PLAN1, TARGETAPPNAME=Corp, LOCKEY=13, RULENAME=HFMAct, OUTBOXDIR=D:\AppFDM\FDMEE\outbox, MULTIPERIODLOAD=N, EXPORTMODE=null, CATKEY=2, USERNAME=admin, FILEDIR=null, IMPORTFLAG=Y, USERLOCALE=null}
2015-04-16 11:37:35,849 DEBUG [AIF]: The executeEventScript is set to: YES
2015-04-16 11:37:35,849 DEBUG [AIF]: The AppRootFolder is set to: D:\AppFDM\FDMEE
2015-04-16 11:37:35,849 DEBUG [AIF]: The JavaHome is set to: %EPM_ORACLE_HOME%/../jdk160_35
2015-04-16 11:37:35,849 DEBUG [AIF]: The OleDatabaseProvider is set to: SQLNCLI
2015-04-16 11:37:35,849 DEBUG [AIF]: Comm.doScriptInit - END
2015-04-16 11:37:35,849 DEBUG [AIF]: Comm.executeScript - START
2015-04-16 11:37:35,849 INFO  [AIF]: Executing the following script: D:\AppFDM\FDMEE/data/scripts/event/BefImport.py
2015-04-16 11:37:35,849 ERROR [AIF]: The script has failed to execute:
2015-04-16 11:37:35,895 DEBUG [AIF]: Comm.finalizeProcess - START
2015-04-16 11:37:35,895 DEBUG [AIF]: CommData.updateRuleStatus - START
2015-04-16 11:37:35,895 DEBUG [AIF]:
    UPDATE AIF_BALANCE_RULES
    SET STATUS = CASE 'FAILED'
      WHEN 'SUCCESS' THEN
        CASE (
          SELECT COUNT(*)
          FROM AIF_PROCESS_DETAILS pd
          WHERE pd.PROCESS_ID = 309
          AND pd.STATUS IN ('FAILED','WARNING')
        WHEN 0 THEN 'SUCCESS'
        ELSE (
          SELECT MIN(pd.STATUS)
          FROM AIF_PROCESS_DETAILS pd
          WHERE pd.PROCESS_ID = 309
          AND pd.STATUS IN ('FAILED','WARNING')
        END
      ELSE 'FAILED'
    END
    WHERE RULE_ID = 31
2015-04-16 11:37:35,911 DEBUG [AIF]: CommData.updateRuleStatus - END
2015-04-16 11:37:35,911 FATAL [AIF]: Error in COMM Pre Import Data
2015-04-16 11:37:35,911 DEBUG [AIF]: Comm.updateProcess - START
2015-04-16 11:37:35,911 DEBUG [AIF]: Comm.updateProcess - END
2015-04-16 11:37:35,911 DEBUG [AIF]: The fdmAPI connection has been closed.
2015-04-16 11:37:35,911 INFO  [AIF]: FDMEE Process End, Process ID: 309

Similar Messages

  • Pls give some shell scripting for sql/plsql

    pls give some shell scripting for sql/plsql

    794244 wrote:
pls give some shell scripting for sql/plsql
Neither SQL nor PL/SQL is a shell scripting language. Both are server-side languages that execute inside an Oracle database server process.
This is an important concept to understand when using, for example, SQL*Plus to "script" interaction with an Oracle database.

  • Shell scripting for sql queires

    Hi All,
I have written 4 SQL queries. Now I want to write a shell script for them, so please guide me on this.
    1. select * from emp;
    2. select * from dept;
    3. delete from emp;
    4. delete from dept;
    Thank you.

    Hi,
Apologies for the c != k != b stuff; I guess it was too cryptic. It means the
C shell is not equal to the Korn shell, and both are not equal to the Bourne shell.
I can't provide you with any site for such stuff. Maybe Google might help? Or someone who is a nicer guy than me...
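The thread never shows a working example, so here is a minimal sketch of the idea, written in Python to match the scripting language used elsewhere on this page (a plain shell script would pipe the same statements into sqlplus via a here-document); the connect string is a placeholder:

# Run a small batch of SQL statements through SQL*Plus from a script.
import subprocess

statements = """
SELECT * FROM emp;
SELECT * FROM dept;
DELETE FROM emp;
DELETE FROM dept;
COMMIT;
EXIT;
"""

# -S suppresses the SQL*Plus banner; real credentials should come from a secure store.
proc = subprocess.Popen(["sqlplus", "-S", "scott/tiger@ORCL"],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                        universal_newlines=True)
out, _ = proc.communicate(statements)
print(out)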

  • Script for sql*loader

    Hi All,
I am loading data from a .csv file into a table using SQL*Loader.
The SQL*Loader step sits alongside a PL/SQL procedure: the process first creates a table, then loads the data with SQL*Loader, and then runs a procedure.
Today we load the data manually by running the .ctl file with SQL*Loader from the command prompt. Instead of doing this manually, is there a way to run the whole process automatically?
If it is possible, please let me know.
Here is the script:
    drop table emp;
    create table emp
    (eno number,
    ename  varchar2(20),
    job varchar2(20),
    sal number(4),
doj date
);
The table is created; the SQL*Loader control file below then loads the data into the EMP table:
load data
infile <file path>
into table emp
fields terminated by ','
(eno,
ename,
job,
sal)
We currently run this control file manually from the command prompt to load the data.
Then there is a procedure that we need to run to update the EMP table.
For the whole process we simply run the script, except for the data-loading part.
Please let me know how to automate this.
    Thanks.

    Hi,
Create a shell script (Unix) or Windows batch job to automate the entire process. The script will look like this:
1. Connect to SQL*Plus with the createtable.sql file (createtable.sql contains the create-table script).
2. Invoke SQL*Loader with the control file.
3. Connect to SQL*Plus again with the executeproc.sql file (executeproc.sql executes the stored procedure).
This way the entire process sits in one script and can be automated by scheduling it on Unix or Windows; a sketch of the three steps follows this post.
    Thanks,
    Vijay
    Edited by: Vijayaraghavan Krishnan on Nov 27, 2012 4:48 PM
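A minimal Python sketch of the three steps above (Python for consistency with the rest of this page; a .sh or .bat file would call the same two executables). The file names and the connect string are placeholders:

# Automate: create the table -> load the CSV with SQL*Loader -> run the stored procedure.
import subprocess

CONNECT = "scott/tiger@ORCL"   # placeholder credentials

def run_sqlplus(script_path):
    # Run a SQL*Plus script file and raise if it exits with a non-zero status.
    subprocess.check_call(["sqlplus", "-S", CONNECT, "@" + script_path])

run_sqlplus("createtable.sql")                            # step 1: create the table
subprocess.check_call(["sqlldr", "userid=" + CONNECT,     # step 2: load the CSV
                       "control=emp.ctl", "log=emp.log"])
run_sqlplus("executeproc.sql")                            # step 3: run the procedure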

  • DDLIMP utility is failing while running the DDL_OLTP.ctl script for Siebel

    DDLIMP utility is failing while running the DDL_OLTP.ctl script for Siebel source on DB2.
Below are the log details:
    2021 2012-11-08 03:23:46 2012-11-08 03:25:56 -0700 0000002a 001 003f 0001 09 ddlimp 604 736 C:\OBIA\Upgrade\CTLFiles\DDL_.log
    ContextInit     ContextInit     0     0     2012-11-08 03:23:46     Message Facility failed to init. Siebel Root: C:\DAC\client\utilities
    Trace     Trace     3     0     2012-11-08 03:23:46     Siebel Enterprise Applications ODBC DDL Import Utility, Version 7.7 [18030] ENU
    Trace     Trace     3     0     2012-11-08 03:23:46     Copyright (c) 2001 Siebel Systems, Inc. All rights reserved.
    Trace     Trace     3     0     2012-11-08 03:23:46     
    This software is the property of Siebel Systems, Inc., 2207 Bridgepointe Parkway,
    San Mateo, CA 94404.
    User agrees that any use of this software is governed by: (1) the applicable
    user limitations and other terms and conditions of the license agreement which
    has been entered into with Siebel Systems or its authorized distributors; and
    (2) the proprietary and restricted rights notices included in this software.
    WARNING: THIS COMPUTER PROGRAM IS PROTECTED BY U.S. AND INTERNATIONAL LAW.
    UNAUTHORIZED REPRODUCTION, DISTRIBUTION OR USE OF THIS PROGRAM, OR ANY PORTION
    OF IT, MAY RESULT IN SEVERE CIVIL AND CRIMINAL PENALTIES, AND WILL BE
    PROSECUTED TO THE MAXIMUM EXTENT POSSIBLE UNDER THE LAW.
    If you have received this software in error, please notify Siebel Systems
    immediately at (650) 295-5000.
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:23:46     ddlimp /U SIEBEL /P ***** /C siebel /G SSE_ROLE /F C:\OBIA\Upgrade\CTLFiles\DDL_OLTP.CTL /L C:\OBIA\Upgrade\CTLFiles\DDL_
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:23:46     
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:23:46     Connecting to the database...
    DBCLog     DBCLogError     1     0     2012-11-08 03:23:52     SQL Warning, SQL State 01004, 0, [DataDirect][ODBC DB2 Wire Protocol driver]String data, right truncated.
DBCLog     DBCLogError     1     0     2012-11-08 03:23:52     SQL Warning, SQL State '', 40042692
DBCLog     DBCLogError     1     0     2012-11-08 03:23:52     SQL Warning, SQL State 01004, 0, [DataDirect][ODBC DB2 Wire Protocol driver]String data, right truncated.
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:23:52     Connected.
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:23:52     
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:23:52     Reading tables and indexes from DDL file...
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:23:52     Read 522 tables and 1084 indexes from DDL file...
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:23:52     
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:23:52     Reading existing schema...
    SARMLog     SARMInformation     3     0     2012-11-08 03:23:52     SARM is OFF -change param SARMLevel to enable
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     Read 0 tablespaces, 6137 tables and 24846 indexes from existing schema...
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     Running SQL statements against the database...
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     Merging table S_ETL_COSTLST ...
    SQLError     Statement     0     0     2012-11-08 03:25:56     SQL Statement:
    alter table S_ETL_COSTLST modify
    CONFLICT_ID varchar(15)
    DBCLog     DBCLogError     1     0     2012-11-08 03:25:56     [DataDirect][ODBC DB2 Wire Protocol driver][UDB DB2 for Windows, UNIX, and Linux]ILLEGAL SYMBOL modify; VALID SYMBOLS ARE table S_ETL_COSTLST. ADD
    SQLError     Statement     0     0     2012-11-08 03:25:56     SQL Statement:
    alter table S_ETL_COSTLST modify
    CONFLICT_ID varchar(15)
    DBCLog     DBCLogError     1     0     2012-11-08 03:25:56     [DataDirect][ODBC DB2 Wire Protocol driver][UDB DB2 for Windows, UNIX, and Linux]An error occurred during implicit system action type '2'. Information returned for the error includes SQLCODE '-104', SQLSTATE '42601' and message tokens 'modify|table S_ETL_COSTLST|ADD'.
    SQLError     Statement     0     0     2012-11-08 03:25:56     SQL Statement:
    alter table S_ETL_COSTLST modify
    CONFLICT_ID default '0'
    DBCLog     DBCLogError     1     0     2012-11-08 03:25:56     [DataDirect][ODBC DB2 Wire Protocol driver][UDB DB2 for Windows, UNIX, and Linux]ILLEGAL SYMBOL modify; VALID SYMBOLS ARE table S_ETL_COSTLST. ADD
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     37000: [DataDirect][ODBC DB2 Wire Protocol driver][UDB DB2 for Windows, UNIX, and Linux]ILLEGAL SYMBOL modify; VALID SYMBOLS ARE table S_ETL_COSTLST. ADD
    SQLError     Statement     0     0     2012-11-08 03:25:56     SQL Statement:
    alter table S_ETL_COSTLST modify
    CONFLICT_ID default '0'
    DBCLog     DBCLogError     1     0     2012-11-08 03:25:56     [DataDirect][ODBC DB2 Wire Protocol driver][UDB DB2 for Windows, UNIX, and Linux]An error occurred during implicit system action type '2'. Information returned for the error includes SQLCODE '-104', SQLSTATE '42601' and message tokens 'modify|table S_ETL_COSTLST|ADD'.
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     56098: [DataDirect][ODBC DB2 Wire Protocol driver][UDB DB2 for Windows, UNIX, and Linux]An error occurred during implicit system action type '2'. Information returned for the error includes SQLCODE '-104', SQLSTATE '42601' and message tokens 'modify|table S_ETL_COSTLST|ADD'.
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     alter table S_ETL_COSTLST modify
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     CONFLICT_ID default '0'
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     ;
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     writeExecDDL error (UTLOdbcExecDirectDDL pDDLSql).
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     writeExecDDL error (UTLDbDdlColModify).
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     Error in MainFunction (UTLDbDdlDbMerge).
    SQLDBUtilityLog     SQLDBUtilityLog     3     0     2012-11-08 03:25:56     Error in Main function...
    GenericLog     GenericError     1     0     2012-11-08 03:25:56     (logapi.cpp (167) err=1 sys=126) SBL-GEN-00001: (logapi.cpp: 167) error code = 1, system error = 126, msg1 = (null), msg2 = (null), msg3 = (null), msg4 = (null)

If you run DAC on a 64-bit Windows operating system, you must create the ODBC data source for the data warehouse tables using the 32-bit ODBC Administrator tool at %windir%\SysWOW64\odbcad32.exe.
If this helps, please mark the reply.

  • Good book for SQL Powershell????

Can anyone suggest a good book on SQL PowerShell? I am really in need of one. I am a SQL developer and was given some tasks that require me to use PowerShell :-(

    Hello,
The SQL Server snap-in for PowerShell is one of a few hundred; I don't think you will find a dedicated book for it.
But on the Internet you can find a lot of articles about PowerShell for SQL Server. You can start at MSDN here:
    SQL Server PowerShell
    Also a good resource:
    MS Script Center Gallery with PowerShell scripts for SQL Server
    Olaf Helper
    [ Blog] [ Xing] [ MVP]

  • Can I change the default 'File Type' for PL/SQL Source File to '*.prc' ?

    Want to change the 'default' extension for PL/SQL source to '*.prc'
    When navigating 'File' / 'Save As', and selecting the 'File Type' list box,
    The default 'File Type' for PL/SQL Source File is '*.pls'.
    Can I change the default 'File Type' for PL/SQL Source File to '*.prc' ?
I have followed 'Tools' > 'Preferences' > 'File Types' and added '.prc' as a 'Sql Script' file type, because 'PL/SQL Source' is not present in the list.

    You can certainly overwrite the 'Save As' extension as you are saving the file. I have added an ER for more control over this functionality.
    sue

  • Creating SQL-Loader script for more than one table at a time

    Hi,
    I am using OMWB 2.0.2.0.0 with Oracle 8.1.7 and Sybase 11.9.
    It looks like I can create SQL-Loader scripts for all the tables
    or for one table at a time. If I want to create SQL-Loader
    scripts for 5-6 tables, I have to either create script for all
    the tables and then delete the unwanted tables or create the
    scripts for one table at a time and then merge them.
    Is there a simple way to create migration scripts for more than
    one but not all tables at a time?
    Thanks,
    Prashant Rane

No, there is no multi-select for creating SQL-Loader scripts.
You can either create them separately, or create them all and
then discard the ones you do not need.

  • Error while opening SQL source for a Data Load Rules File

Hi, I have created a Data Load rules file. When I try to open a SQL source for this rules file (File->Open SQL), I get an error saying "Your server does not have a SQL connection option, please check with your system administrator". I then get a message "There are no data sources defined. Please create one to continue.". I have created a DSN on my Essbase server. What is the problem, and what needs to be done to open SQL sources? Thanks.

I have Essbase 7.1. I guess for version 7.1 the SQL Interface option is installed with the Analytic Server itself; am I right? I have set up the DSN as well. Please help to resolve this issue. Thanks.

  • Please help me resolve the Lync server 2013 deployment error: "An error occurred while applying SQL script for the feature BackendStore."

I am getting an error at "Step 2 - Setup or Remove Lync Server Components" of the "Install or Update Lync Server System" step:
"An error occurred while applying SQL script for the feature BackendStore. For details, see the log file...."
All previous steps, such as Prepare Active Directory, Prepare first Standard Edition server, Install Administrative Tools, and Create and publish topology, completed without errors. The user that I used to set up the Lync server is a member of:
    Administrators
    CSAdministrator
    Domain Admins
    Domain Users
    Enterprise Admins
    Group Policy Creator Owners
    RTCComponentUniversalServices
    RTCHSUniversalServices
    RTCUniversalConfigReplicator
    RTCUniversalServerAdmins
    Schema Admins
    I have tried to re-install all the things and started to setup a new one many times but the same error still occurred. Please see the log below and give me any ideas/solutions to tackle this problem.
    ****Creating DbSetupInstance for 'Microsoft.Rtc.Common.Data.BlobStore'****
    Initializing DbSetupBase
    Parsing parameters...
    Found Parameter: SqlServer Value lync.lctbu.com\rtc.
    Found Parameter: SqlFilePath Value C:\Program Files\Common Files\Microsoft Lync Server 2013\DbSetup.
    Found Parameter: Publisheracct Value LCTBU\RTCHSUniversalServices;RTC Server Local Group;RTC Local Administrators;LCTBU\RTCUniversalServerAdmins.
    Found Parameter: Replicatoracct Value LCTBU\RTCHSUniversalServices;RTC Server Local Group.
    Found Parameter: Consumeracct Value LCTBU\RTCHSUniversalServices;RTC Server Local Group;RTC Local Read-only Administrators;LCTBU\RTCUniversalReadOnlyAdmins.
    Found Parameter: DbPath Value D:\CsData\BackendStore\rtc\DbPath.
    Found Parameter: LogPath Value D:\CsData\BackendStore\rtc\LogPath.
    Found Parameter: Role Value master.
    Trying to connect to Sql Server lync.lctbu.com\rtc. using windows authentication...
    Sql version: Major: 11, Minor: 0, Build 2100.
    Sql version is acceptable.
    Validating parameters...
    DbName rtcxds validated.
    SqlFilePath C:\Program Files\Common Files\Microsoft Lync Server 2013\DbSetup validated.
    DbFileBase rtcxds validated.
    DbPath D:\CsData\BackendStore\rtc\DbPath validated.
    Effective database Path: \\lync.lctbu.com\D$\CsData\BackendStore\rtc\DbPath.
    LogPath D:\CsData\BackendStore\rtc\LogPath validated.
    Effective Log Path: \\lync.lctbu.com\D$\CsData\BackendStore\rtc\LogPath.
    Checking state for database rtcxds.
    Checking state for database rtcxds.
    State of database rtcxds is detached.
    Attaching database rtcxds from Data Path \\lync.lctbu.com\D$\CsData\BackendStore\rtc\DbPath, Log Path \\lync.lctbu.com\D$\CsData\BackendStore\rtc\LogPath.
    The operation failed because of missing file '\\lync.lctbu.com\D$\CsData\BackendStore\rtc\DbPath\rtcxds.mdf'
    Attaching database failed because one of the files not found. The database will be created.
    State of database rtcxds is DbState_DoesNotExist.
    Creating database rtcxds from scratch. Data File Path = D:\CsData\BackendStore\rtc\DbPath, Log File Path= D:\CsData\BackendStore\rtc\LogPath.
    Clean installing database rtcxds.
    Timeout expired.  The timeout period elapsed prior to completion of the operation or the server is not responding.
    ****Creating DbSetupInstance for 'Microsoft.Rtc.Common.Data.RtcSharedDatabase'****
    Initializing DbSetupBase
    Parsing parameters...
    Found Parameter: SqlServer Value lync.lctbu.com\rtc.
    Found Parameter: SqlFilePath Value C:\Program Files\Common Files\Microsoft Lync Server 2013\DbSetup.
    Found Parameter: Serveracct Value LCTBU\RTCHSUniversalServices;RTC Server Local Group.
    Found Parameter: DbPath Value D:\CsData\BackendStore\rtc\DbPath.
    Found Parameter: LogPath Value D:\CsData\BackendStore\rtc\LogPath.
    Trying to connect to Sql Server lync.lctbu.com\rtc. using windows authentication...
    Sql version: Major: 11, Minor: 0, Build 2100.
    Sql version is acceptable.
    Validating parameters...
    DbName rtcshared validated.
    SqlFilePath C:\Program Files\Common Files\Microsoft Lync Server 2013\DbSetup validated.
    DbFileBase rtcshared validated.
    DbPath D:\CsData\BackendStore\rtc\DbPath validated.
    Effective database Path: \\lync.lctbu.com\D$\CsData\BackendStore\rtc\DbPath.
    LogPath D:\CsData\BackendStore\rtc\LogPath validated.
    Effective Log Path: \\lync.lctbu.com\D$\CsData\BackendStore\rtc\LogPath.
    Checking state for database rtcshared.
    Reading database version for database rtcshared.
    Database version for database rtcshared - Schema Version5, Sproc Version 0, Update Version 1.
    Thanks and Regards,
    Thanh Le

Thanks Lạc Phạm,
I had a similar issue. I ended up uninstalling and reinstalling, but hit the same issue; then I changed the drive, same issue again. It turned out to be an I/O problem: after tuning the disk I/O, the installation went through without any issue.
If anyone is using KVM, the fix was to set the guest disk option cache='writeback',
following these articles: http://www.ducea.com/2011/07/06/howto-improve-io-performance-for-kvm-guests/ and http://itscblog.tamu.edu/improve-disk-io-performance-in-kvm/. This fixed my issue, thanks.

  • An error occurred while applying SQL script for the feature BackendStore.

    Hello,
I am using my AD on Windows Azure VMs. I created a new A3 VM (4 cores, 7 GB memory) running Windows Server 2012 R2, opened port 1433 for MSSQL, made it a member of the domain, and planned to install the first Lync Server 2013 on it.
In "Setup or Remove Lync Server Components" of "Install or Update Lync Server System", I got the red error text "An error occurred while applying SQL script for the feature BackendStore."
I have not enabled the Monitoring or Archiving server in Topology Builder. I added "Network Service" and assigned "Full Control" in the security permissions of "C:\CsData" and "C:\LyncShare".
I executed the SQL setup wizard and upgraded the instances to 2012.
    Please guide.
    Thanks, Divyaprakash Koli

Please check that you have enough disk space on the disk where the folders are.
Check the view log for detailed information.
    The following link is a similar thread for you to refer:
    http://social.technet.microsoft.com/Forums/lync/en-US/a3cb9ab0-7451-4df5-af96-3d2784d1b075/an-error-occurred-while-applying-sql-script-for-the-feature-backendstore-for-details-see-the-log?forum=lyncdeploy
    Lisa Zheng
    TechNet Community Support

  • Shell script for below pl/sql script dbms_file_transfer

Please let me know how to write a shell script for the PL/SQL dbms_file_transfer call below.
I am transferring files from ASM onto the filesystem.
It is working, but I have to put it in a loop.
    begin
    dbms_file_transfer.copy_file(
    source_directory_object => 'src',
    source_file_name => 'ncsn',
    destination_directory_object => 'dest',
    destination_file_name => 'ncsn');
    end;
    Edited by: user8680248 on 27/10/2009 20:55

    user8680248 wrote:
Please let me know how to write a shell script for the PL/SQL dbms_file_transfer call below.
I am transferring files from ASM onto the filesystem.
It is working, but I have to put it in a loop.
    begin
    dbms_file_transfer.copy_file(
    source_directory_object => 'src',
    source_file_name => 'ncsn',
    destination_directory_object => 'dest',
    destination_file_name => 'ncsn');
end;
What database version?
    What are you trying to do exactly?
    It's working but you have to put it in a loop. Fine, what's the problem you are having?
    begin
      loop
        exit when ... whatever the exit condition is ...
        dbms_file_transfer.copy_file(
          source_directory_object => 'src',
          source_file_name => 'ncsn',
          destination_directory_object => 'dest',
          destination_file_name => 'ncsn');
      end loop;
    end;
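If the loop is driven from outside the database instead, a small Python sketch (matching the scripting language used elsewhere on this page) can call DBMS_FILE_TRANSFER.COPY_FILE once per file; cx_Oracle, the file list, and the credentials are assumptions for illustration:

# Copy a list of files from the ASM directory object to the filesystem directory object.
import cx_Oracle

conn = cx_Oracle.connect("system", "password", "localhost/ORCL")   # placeholder connect string
cur = conn.cursor()

files_to_copy = ["ncsn", "ncsn_02", "ncsn_03"]   # assumed file names
for name in files_to_copy:
    # Arguments: source dir object, source file, destination dir object, destination file.
    cur.callproc("DBMS_FILE_TRANSFER.COPY_FILE", ["src", name, "dest", name])

conn.close()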

  • Oracle SQL template to create re-usable DDL/DML Scripts for Oracle database

    Hi,
    I have a requirement to put together a Oracle SQL template to create re-usable DDL/DML Scripts for Oracle databases.
    Only the Oracle DBA will be running the scripts so permissions is not an issue.
    The workflow for any DDL is as follows:-
    1) New Table
    a. Check if the table exists from the system/admin views.
    b. If table exists then give message "Table Exists"
    c. If table does not exist then execute DDL code
    2) Add Column
    a. Check if Column exists for a given table from system/admin views
    b. If column exists in the specified table,
    b1. backup table.
    b2. alter table to make changes to the column
    b3. verify data or execute dml script convert from backup to the new change.
    c. If Column does not exist
    c1. backup table
    c2. alter table to add column
    c3. execute dml to populate column with default value.
    The DML scripts are for populating base tables with data required for business operations.
    3) Add new row
    a. check if row exists by comparing old values of each column with new values to be added for the new record.
    b. If exists, give message row exists
    c. If not exists, add new record.
    4) Update existing record (We have createtime columns in these tables so changes can be tracked)
    a. check if row exists using primary key.
    b. If exists,
    b1. deactivate the record using the "active" column of the table
    b2. Add new record with the changes required.
    c. If does not exist, add new record with the changes required.
    Could you please help with some ideas which can get this done accurately?
    I have tried several ways, but I am not able to put together something that fulfills all requirements.
    Thank you,

    First let me address your question. (This is the easy part.)
1. The existence of tables can be found in DBA_TABLES. Query it, then use conditional logic and EXECUTE IMMEDIATE to process the DDL (a sketch of this check follows below).
    2. The existence of table columns is found in DBA_TAB_COLUMNS. Query it and then conditionally execute your DDL. You can copy the "before picture" of the table using that same dba view, or even better, use DBMS_METADATA.
    As for your DML scripts, they should be restartable, reversible, and re-run-able. They should "fail gracefully" on error, be written in such a way that they can run twice in a row without creating duplicate changes.
    3. Adding appropriate constraints can prevent invalid duplicate rows. Also, you can usually add to the where clause so that the DML does only what it needs to do without even relying on the constraint (but the constraint is there as a safeguard). Look up the MERGE statement to learn how to do an UPSERT (update/insert), which will let you conditionally "deactivate" (update) or insert a record. Anything that you cannot do in SQL can be done with simple procedural code.
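A minimal sketch of the existence checks in points 1 and 2, written in Python (the scripting language used elsewhere on this page) rather than PL/SQL; cx_Oracle, the schema, table, and column names, and the credentials are illustrative assumptions:

# Conditionally create a table, and add a column only when it does not already exist.
import cx_Oracle

conn = cx_Oracle.connect("admin_user", "password", "localhost/ORCL")   # placeholder connect string
cur = conn.cursor()

cur.execute("SELECT COUNT(*) FROM dba_tables WHERE owner = :o AND table_name = :t",
            o="APP_OWNER", t="MY_TABLE")
if cur.fetchone()[0] == 0:
    cur.execute("CREATE TABLE app_owner.my_table (id NUMBER PRIMARY KEY, active CHAR(1))")
else:
    print("Table Exists")

cur.execute("SELECT COUNT(*) FROM dba_tab_columns "
            "WHERE owner = :o AND table_name = :t AND column_name = :c",
            o="APP_OWNER", t="MY_TABLE", c="CREATETIME")
if cur.fetchone()[0] == 0:
    cur.execute("ALTER TABLE app_owner.my_table ADD (createtime DATE DEFAULT SYSDATE)")

conn.close()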
    Now, to the heart of the matter...
    You think I did not understand your requirements?
    Please be respectful of people's comments. Many of us are professionals with decades of experience working with databases and Oracle technology. We volunteer our valuable time and knowledge here for free. It is extremely common for someone to post what they feel is an easy SQL or PL/SQL question without stating the real goal--the business objective. Experienced people will spot that the "wrong question" has been asked, and then cut to the chase.
    We have some good questions for you. Not questions we need answers from, but questions you need to ask yourself and your team. You need to reexamine this post and deduce what those questions are. But I'll give you some hints: Why do you need to do what you are asking? And will this construct you are asking for even solve the root cause of your problems?
    Then ponder the following quotations about asking the right question:
    Good questions outrank easy answers.
    — Paul Samuelson
    The only interesting answers are those which destroy the questions.
    — Susan Sontag
    The scientific mind does not so much provide the right answers as ask the right questions.
    — Claude Levi-Strauss
    You can tell whether a man is clever by his answers. You can tell whether a man is wise by his questions.
    — Mahfouz Naguib
    One hears only those questions for which one is able to find answers.
    — Friedrich Nietzsche
    Be patient towards all that is unresolved in your heart and try to love the questions themselves.
    — Rainer Maria Rilke
    What people think of as the moment of discovery is really the discovery of the question.
    — Jonas Salk
    Judge a man by his questions rather than his answers.
    — Voltaire
    The ability to ask the right question is more than half the battle of finding the answer.
    — Thomas J. Watson

  • PeopleSoft - Backup and Restore scripts for PeopleSoft implementation using MS-SQL 2012?

    I am working for a county government as a consultant for PeopleSoft using SQL Server. When they restore a DEV instance from PROD, they have to manually stop the web server, app server and batch server, then they do the restore. Afterward they have to change
    all references to PROD in the cloned DEV instance. Everything is done manually and takes a couple of hours.
    Can anyone point me to scripts for PeopleSoft restores that automate bringing down/up the web, app and batch servers and all the other tasks so they don't have to be done manually?
    Life is hard, but it's harder when you're stupid! John 'Duke' Wayne

This question should be posted on a PeopleSoft support forum. That said, a quick search on the internet will provide helpful links such as this one: http://www.erpassociates.com/peoplesoft-corner-wiki/peoplesoft-administration/cloning-a-peoplesoft-database.html
    Satish Kartan www.sqlfood.com

  • How to get SQL script for generating table, constraint, indexes?

I'd like to find an Oracle tool for generating simple SQL scripts for creating tables, indexes, and constraints (like Toad does); it has to be an Oracle tool, but not Designer.
Can someone give me some advice?
    Thanks!
    m.

I'd like to find an Oracle tool for
generating simple SQL scripts for creating tables,
indexes, and constraints (like Toad does); it has to be
an Oracle tool, but not Designer.
    SQL Developer is similar to Toad and is an Oracle tool.
    http://www.oracle.com/technology/products/database/sql_developer/index.html
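If a scripted route is acceptable as well, the same DDL can be pulled straight from the data dictionary with DBMS_METADATA; a minimal Python sketch follows, in which cx_Oracle, the table name, and the credentials are illustrative assumptions:

# Extract table DDL (constraints are included inline) and the DDL of its indexes.
import cx_Oracle

conn = cx_Oracle.connect("scott", "tiger", "localhost/ORCL")   # placeholder connect string
cur = conn.cursor()

cur.execute("SELECT DBMS_METADATA.GET_DDL('TABLE', :t) FROM dual", t="EMP")
print(cur.fetchone()[0].read())          # GET_DDL returns a CLOB

# GET_DEPENDENT_DDL raises an error if the table has no indexes.
cur.execute("SELECT DBMS_METADATA.GET_DEPENDENT_DDL('INDEX', :t) FROM dual", t="EMP")
print(cur.fetchone()[0].read())

conn.close()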
