ELT vs ETL

Hi,
In what situations do I position Data Integrator, as against OWB Enterprise ETL?
Regards,
Vijay

Hi,
Data Integrator is only for ETL and is necessary if your targets are not in Oracle. OWB covers ETL, modelling, profiling, auditing, scheduling, and more, but you must have an Oracle database as the target.
Regards,
Detlef

Similar Messages

  • ODI supports ELT technology. How is it better when compared to ETL?

    In any conventional ETL tool we are able to perform complex logic very easily by using the components inside the tool.
    I have not found any such components or transformations in ODI.
    But when I read the documentation, it says that ODI supports complex logic.
    Then how is it better when compared to any ETL tool in the market?

    Hi Harmeet,
    How are you?
    Seems like you are on fire... :-)
    Well, yes, ODI is an E-LT tool, and I guess it is the one and only tool in the market which follows the E-LT architecture.
    Coming to the comparison:
    An ETL tool needs three servers to move the source data to the target: the source system, a transformation engine, and the data warehouse (the data is moved twice in this approach).
    In E-LT, the transformation engine and the data warehouse server are combined into one, so only two servers are needed to do the transformation. Because of this, the cost is much lower and the speed is very good.
    This is one of the big advantages of using the E-LT architecture.
    Experts' comments are welcome... :-)
    Thanks,
    Guru

  • Which tool is better for ETL?

    I want to transfer data from a relational database to Essbase. There are many methods and tools, and I don't know which is better for my case. What I am doing now is:
    1. Create views in Oracle.
    2. Export the view to a flat file.
    3. Import the flat file to Essbase using a rule file and a MaxL import script.
    4. The data size: the flat file has 3-4 million rows; time used: 13 hours.
    I want to use more powerful ETL tools, such as Informatica PowerCenter, OBIEE, FDM, EIS, etc. I have basic knowledge of the above tools, but don't know which tool is better for my case. Please advise.

    First, to streamline your existing process, you could get rid of step 2 by adding an ODBC connection to the server and a SQL statement to the load rule to pull directly from your Oracle database. Of course, with the number of rows you have, if you use the same extract multiple times, I would create a table to load into from the view and pull from that table.
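    In SQL terms, that staging step is just the following (view, table, and column names here are hypothetical):

        -- Materialize the view once so repeated loads don't re-run the view's SQL
        CREATE TABLE sales_stage AS
          SELECT * FROM sales_v;

        -- Optionally index the columns the load rule filters on
        CREATE INDEX sales_stage_ix ON sales_stage (fiscal_period, entity);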
    As for using an ETL (E-LT) tool, Informatica is pretty powerful, and there is a version of it bundled (although not supported in the future) called DIM. I might go the E-LT route with ODI (Oracle Data Integrator). In the future I might use OBIEE, but I don't think it's fully ready for Essbase until the next version, and I don't think it will rank on the same level as a full ETL tool for conversions.
    As for EIS, I would not use it, but I might use Essbase Studio (it can use OBIEE as a source). Again, this is not an ETL tool; it would just take your existing views and load them.

  • Why is ODI an ELT tool?

    Hi,
    I have done some labs in ODI. I am still in the dark about some cases in the ODI labs:
    Q1) For relational table to file (.xls) data export, the transformation occurs in the Sunopsis Memory Engine. How is this case ELT?
    Q2) As the transformation occurs in the Sunopsis Memory Engine (for the previous case), does the Sunopsis Memory Engine have any in-built database?
    Q3) For a simple relational table to relational table data export, the transformation occurs in the staging area, i.e. in a temp table. Where is that table created: in the target schema or in the work schema?
    Q4) The transformation occurs in the staging area and the data is then loaded into the target table. That is ETL, so how is it ELT?
    Please answer.
    Thanks in advance,
    Papai

    Papai wrote:
    Q1) For relational table to file (.xls) data export, the transformation occurs in the Sunopsis Memory Engine. How is this case ELT?
    You cannot do the transformation on a file; for that you need a relational database on which the transformation can be done. So after extraction, the data is loaded into the Sunopsis Memory Engine and the transformation is done there.
    Q2) Does the Sunopsis Memory Engine have any in-built database?
    It is nothing but HyperSQL, a 100% Java database. For more on this you can refer to
    http://odiexperts.com/11g-oracle-data-integrator-%E2%80%93-part-711g-%E2%80%93-sunopsis-memory-engine/ But as far as I know, it is not always good practice to use this as the staging area.
    Q3) Where is the temp table created: in the target schema or in the work schema?
    The temp table is created in the work schema. But if your work schema is the same as your target schema, then the temp table will be created in the target schema. So it depends on you and your architecture.
    Q4) The transformation occurs in the staging area and the data is then loaded into the target table. That is ETL, so how is it ELT?
    It is E-LT: there is no middle tier, hence it is much faster than the E-T-L approach.
    Traditional ETL tools operate by first Extracting the data from various sources, Transforming the data on a proprietary, middle-tier ETL engine, and then Loading the transformed data onto the target data warehouse or integration server.
    E-LT moves the data transformation step to the target RDBMS, changing the order of operations to: Extract the data from the source tables, Load the tables into the destination server, and then Transform the data on the target RDBMS using native SQL operators.
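    A minimal sketch of that E-LT pattern in plain SQL (the table names and the database link are hypothetical):

        -- Extract + Load: copy source rows as-is into a staging table on the target database
        INSERT INTO stg_orders
        SELECT * FROM orders@source_db;

        -- Transform: run set-based on the target RDBMS itself, using native SQL operators
        INSERT INTO w_orders_f (order_id, order_amt, order_status)
        SELECT o.order_id,
               ROUND(o.amount * o.exchange_rate, 2),
               UPPER(o.status)
        FROM   stg_orders o;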

  • Error While running the ETL Load in DAC (BI Financial Analytics)

    Hi All,
    I have installed and configured BI Applications 7.9.5 and Informatica 8.1.1. The first time we ran the ETL load in DAC, it failed, even though every Test Connection was successful. We are getting the error message below.
    The log file pasted below is from the path
    /u01/app/oracle/product/Informatica/PowerCenter8.1.1/server/infa_shared
    /SessLogs
    SDE_ORAR12_Adaptor.SDE_ORA_GL_AP_LinkageInformation_Extract_Full.log
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [ORA_R12] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [9] for mapping parameter:[$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27028 Use override value ['Y'] for mapping parameter:[$$FILTER_BY_LEDGER_ID].
    DIRECTOR> VAR_27028 Use override value ['N'] for mapping parameter:[$$FILTER_BY_LEDGER_TYPE].
    DIRECTOR> VAR_27028 Use override value [04/02/2007] for mapping parameter:[$$INITIAL_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [] for mapping parameter:[$$LAST_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [1] for mapping parameter:[$$LEDGER_ID_LIST].
    DIRECTOR> VAR_27028 Use override value ['NONE'] for mapping parameter:[$$LEDGER_TYPE_LIST].
    DIRECTOR> TM_6014 Initializing session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] at [Thu Feb 12 12:49:33 2009]
    DIRECTOR> TM_6683 Repository Name: [DEV_Oracle_BI_DW_Rep]
    DIRECTOR> TM_6684 Server Name: [DEV_Oracle_BI_DW_Rep_Integration_Service]
    DIRECTOR> TM_6686 Folder: [SDE_ORAR12_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_ORA_GL_AP_LinkageInformation_Extract_Full]
    DIRECTOR> TM_6101 Mapping name: SDE_ORA_GL_AP_LinkageInformation_Extract [version 1]
    DIRECTOR> TM_6827 [u01/app/oracle/product/Informatica/PowerCenter8.1.1/server/infa_shared/Storage] will be used as storage directory for session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full].
    DIRECTOR> CMN_1805 Recovery cache will be deleted when running in normal mode.
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6708 Using configuration property [SiebelUnicodeDB,apps@devr12 bawdev@devbi]
    DIRECTOR> TM_6703 Session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] is run by 64-bit Integration Service [node01_oratestbi], version [8.1.1 SP4], build [0817].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [ASCII]
    MAPPING> CMN_1570 Server Code page: [ISO 8859-1 Western European]
    MAPPING> TM_6151 Session Sort Order: [Binary]
    MAPPING> TM_6156 Using LOW precision decimal arithmetic
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6307 DTM Error Log Disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Thu Feb 12 12:49:34 2009)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Thu Feb 12 12:49:34 2009)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 12582912 bytes and Block size is 128000 bytes.
    READER_1_1_1> DBG_21438 Reader: Source is [devr12.tessco.com], user [apps]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8146 Writer: Target is database [DEVBI], user [bawdev], bulk mode [ON]
    WRITER_1_*_1> WRT_8106 Warning! Bulk Mode session - recovery is not guaranteed.
    WRITER_1_*_1> WRT_8124 Target Table W_GL_LINKAGE_INFORMATION_GS :SQL INSERT statement:
    INSERT INTO W_GL_LINKAGE_INFORMATION_GS(SOURCE_DISTRIBUTION_ID,JOURNAL_LINE_INTEGRATION_ID,LEDGER_ID,LEDGER_TYPE,DISTRIBUTION_SOURCE,JE_BATCH_NAME,JE_HEADER_NAME,JE_LINE_NUM,POSTED_ON_DT,SLA_TRX_INTEGRATION_ID,DATASOURCE_NUM_ID) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_GL_LINKAGE_INFORMATION_GS]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    READER_1_1_1> BLKR_16007 Reader run started.
    WRITER_1_*_1> WRT_8005 Writer run started.
    WRITER_1_*_1> WRT_8158
    *****START LOAD SESSION*****
    Load Start Time: Thu Feb 12 12:49:34 2009
    Target tables:
    W_GL_LINKAGE_INFORMATION_GS
    READER_1_1_1> RR_4029 SQ Instance [SQ_XLA_AE_LINES] User specified SQL Query [SELECT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    DLINK.ACCOUNTING_LINE_CODE LINE_CODE,
          AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE
    FROM XLA_DISTRIBUTION_LINKS DLINK
       , GL_IMPORT_REFERENCES        GLIMPREF
       , XLA_AE_LINES                              AELINE
       , GL_JE_HEADERS                         JHEADER
       , GL_JE_BATCHES                         JBATCH
       , GL_LEDGERS                                 T
       , GL_PERIODS   PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
             (  'AP_INV_DIST', 'AP_PMT_DIST'
              , 'AP_PREPAY')
    AND DLINK.APPLICATION_ID = 200
    AND AELINE.APPLICATION_ID = 200
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID         = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID         = DLINK.AE_HEADER_ID        
    AND AELINE.AE_LINE_NUM           = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID   = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID       = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID                   = T.LEDGER_ID
    AND JHEADER.STATUS                         = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
              TO_DATE('04/02/2007 00:00:00'
                    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('Y', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')]
    READER_1_1_1> RR_4049 SQL Query issued to database : (Thu Feb 12 12:49:34 2009)
    READER_1_1_1> CMN_1761 Timestamp Event: [Thu Feb 12 12:49:34 2009]
    READER_1_1_1> RR_4035 SQL Error [
    ORA-01114: IO error writing block to file 513 (block # 328465)
    ORA-27072: File I/O error
    Linux-x86_64 Error: 28: No space left on device
    Additional information: 4
    Additional information: 328465
    Additional information: -1
    ORA-01114: IO error writing block to file 513 (block # 328465)
    ORA-27072: File I/O error
    Linux-x86_64 Error: 28: No space left on device
    Additional information: 4
    Additional information: 328465
    Additional information: -1
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    DLINK.ACCOUNTING_LINE_CODE LINE_CODE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
    JBATCH.NAME BATCH_NAME,
    JHEADER.NAME HEADER_NAME,
    PER.END_DATE
    FROM XLA_DISTRIBUTION_LINKS DLINK
    , GL_IMPORT_REFERENCES GLIMPREF
    , XLA_AE_LINES AELINE
    , GL_JE_HEADERS JHEADER
    , GL_JE_BATCHES JBATCH
    , GL_LEDGERS T
    , GL_PERIODS PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
    ( 'AP_INV_DIST', 'AP_PMT_DIST'
    , 'AP_PREPAY')
    AND DLINK.APPLICATION_ID = 200
    AND AELINE.APPLICATION_ID = 200
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID = DLINK.AE_HEADER_ID
    AND AELINE.AE_LINE_NUM = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID = T.LEDGER_ID
    AND JHEADER.STATUS = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
    TO_DATE('04/02/2007 00:00:00'
    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('Y', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    Oracle Fatal Error
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    DLINK.ACCOUNTING_LINE_CODE LINE_CODE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
    JBATCH.NAME BATCH_NAME,
    JHEADER.NAME HEADER_NAME,
    PER.END_DATE
    FROM XLA_DISTRIBUTION_LINKS DLINK
    , GL_IMPORT_REFERENCES GLIMPREF
    , XLA_AE_LINES AELINE
    , GL_JE_HEADERS JHEADER
    , GL_JE_BATCHES JBATCH
    , GL_LEDGERS T
    , GL_PERIODS PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
    ( 'AP_INV_DIST', 'AP_PMT_DIST'
    , 'AP_PREPAY')
    AND DLINK.APPLICATION_ID = 200
    AND AELINE.APPLICATION_ID = 200
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID = DLINK.AE_HEADER_ID
    AND AELINE.AE_LINE_NUM = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID = T.LEDGER_ID
    AND JHEADER.STATUS = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
    TO_DATE('04/02/2007 00:00:00'
    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('Y', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    Oracle Fatal Error].
    READER_1_1_1> CMN_1761 Timestamp Event: [Thu Feb 12 12:49:34 2009]
    READER_1_1_1> BLKR_16004 ERROR: Prepare failed.
    WRITER_1_*_1> WRT_8333 Rolling back all the targets due to fatal session error.
    WRITER_1_*_1> WRT_8325 Final rollback executed for the target [W_GL_LINKAGE_INFORMATION_GS] at end of load
    WRITER_1_*_1> WRT_8035 Load complete time: Thu Feb 12 12:49:34 2009
    LOAD SUMMARY
    ============
    WRT_8036 Target: W_GL_LINKAGE_INFORMATION_GS (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
    WRT_8044 No data loaded for this target
    WRITER_1_*_1> WRT_8043 *****END LOAD SESSION*****
    MANAGER> PETL_24031
    ***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
    Thread [READER_1_1_1] created for [the read stage] of partition point [SQ_XLA_AE_LINES] has completed: Total Run Time = [0.673295] secs, Total Idle Time = [0.000000] secs, Busy Percentage = [100.000000].
    Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [SQ_XLA_AE_LINES] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [WRITER_1_*_1] created for [the write stage] of partition point [W_GL_LINKAGE_INFORMATION_GS] has completed. The total run time was insufficient for any meaningful statistics.
    MANAGER> PETL_24005 Starting post-session tasks. : (Thu Feb 12 12:49:35 2009)
    MANAGER> PETL_24029 Post-session task completed successfully. : (Thu Feb 12 12:49:35 2009)
    MAPPING> TM_6018 Session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] run completed with [0] row transformation errors.
    MANAGER> PETL_24002 Parallel Pipeline Engine finished.
    DIRECTOR> PETL_24013 Session run completed with failure.
    DIRECTOR> TM_6022
    SESSION LOAD SUMMARY
    ================================================
    DIRECTOR> TM_6252 Source Load Summary.
    DIRECTOR> CMN_1740 Table: [SQ_XLA_AE_LINES] (Instance Name: [SQ_XLA_AE_LINES])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6253 Target Load Summary.
    DIRECTOR> CMN_1740 Table: [W_GL_LINKAGE_INFORMATION_GS] (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6023
    ===================================================
    DIRECTOR> TM_6020 Session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] completed at [Thu Feb 12 12:49:36 2009]
    Thanks in Advance,
    Prashanth

    You need to increase the temp tablespace. The ORA-01114 / ORA-27072 "No space left on device" errors on file 513 show that the database ran out of temp space (or the underlying disk filled up) while executing the extract query's joins and sorts.
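    A sketch of the fix, assuming the temp tablespace is named TEMP and the path shown has free space (check DBA_TEMP_FILES for the real names and sizes first):

        -- See what temp files exist and how big they are
        SELECT file_name, bytes/1024/1024 AS mb, autoextensible
        FROM   dba_temp_files;

        -- Add a tempfile on a device with room (path and sizes are examples)
        ALTER TABLESPACE temp
          ADD TEMPFILE '/u02/oradata/devbi/temp02.dbf'
          SIZE 2G AUTOEXTEND ON NEXT 256M MAXSIZE 8G;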

  • Not able to extract performance data from .ETL file using xperf commands: getting error "Events were lost in this trace. Data may be unreliable ..."

    Not able to extract performance data from the .ETL file using xperf commands.
    Xperf command:
    xperf -i C:\TempFolder\Test.etl -o C:\TempFolder\BootData.csv -a process
    I get the following error after executing the above command:
    "33288636 events were lost in this trace. Data may be unreliable. This is usually caused by insufficient disk bandwidth for ETW logging. Please try increasing the minimum and maximum number of buffers and/or the buffer size. Doubling these values would be a good first attempt. Please note, though, that this action increases the amount of memory reserved for ETW buffers, increasing memory pressure on your scenario. See "xperf -help start" for the associated command line options."
    I changed the page file size, but that did not work for me.
    Does anyone have an idea how to solve this problem and extract the ETL file data?

    I want to mention one point here. I have 4 machines in total; on 3 of those machines the above commands work properly. Only one machine has this problem.
    Hi,
    You can try using xperf to collect the trace ETL file again and see if it can be extracted on this computer.
    Refer to the following articles:
    start
    http://msdn.microsoft.com/en-us/library/windows/hardware/hh162977.aspx
    Using Xperf to take a Trace (updated)
    http://blogs.msdn.com/b/pigscanfly/archive/2008/02/16/using-xperf-to-take-a-trace.aspx
    Kate Li
    TechNet Community Support

  • Data Migration from Legacy system to ECC systems via ETL through SAP PI

    Hi All,
    I wanted to know if we can migrate data from legacy systems to ECC systems via PI.
    What I understand is that an ETL tool is used to extract the data from the legacy system, and the client is looking to load that data via PI.
    Can we do that? I am concerned because this will involve mass data.
    If I have to use PI, I see the options of PROXY / IDOC / FILE as the receiver in the ECC system (taking the data from the ETL tool).
    If somebody has done this earlier, please share the approach.
    Thanks,
    Pushkar Patel

    Hi,
    I require a few details from you.
    1. Which ETL tool are you using? If Informatica, it already has PowerConnect to connect to SAP, so you can create the source and target structures, and you can also use RFCs to send data to R/3. For other ETL tools: can you prepare RFCs or some other way to send data to R/3? Let me know the tool.
    2. Does R/3 contain the master data tables? If yes, then try to use LSMW for mass upload of data to the tables.
    If your client doesn't want to use either of these options, please elaborate on the case.
    Regards
    Aashish Sinha

  • Fault elt in web-services.xml NOT WORKING

    We are trying to capture an invalid message coming into our service before our service actually processes it. Per the WLS 7 documentation, it provides the ability to add a <fault> element under the <params> element in web-services.xml to perform that.
    Here's how the operations portion of our web-services.xml looks:
    <operations>
      <operation method="echo(java.lang.String)" component="jcComp0" name="echo"
                 handler-chain="diagnosticChain">
        <params>
          <param location="body" class-name="java.lang.String" style="in" name="echoString"
                 xmlns:xsd="http://www.w3.org/2001/XMLSchema" type="xsd:string">
          </param>
          <return-param location="body" class-name="java.lang.String" name="Result"
                        xmlns:xsd="http://www.w3.org/2001/XMLSchema" type="xsd:string">
          </return-param>
          <fault name="InvalidMessageException" class-name="com.gmacfs.routeone.diagnostic.InvalidMessageException"/>
        </params>
      </operation>
    </operations>
    However, when we tried doing that, we got a BIG set of exceptions while trying to build our client. It looks as follows:
    client:
    [clientgen] Generating client jar for diagnostic.ear ...
    [clientgen] Could not read Web Service deployment descriptor
    [clientgen] at weblogic.ant.taskdefs.webservices.clientgen.EARClientGen.run(EARClientGen.java:112)
    [clientgen] at weblogic.ant.taskdefs.webservices.clientgen.ClientGenTask.execute(ClientGenTask.java:270)
    [clientgen] at org.apache.tools.ant.Task.perform(Task.java:217)
    [clientgen] at org.apache.tools.ant.Target.execute(Target.java:164)
    [clientgen] at org.apache.tools.ant.Target.performTasks(Target.java:182)
    [clientgen] at org.apache.tools.ant.Project.executeTarget(Project.java:601)
    [clientgen] at org.apache.tools.ant.Project.executeTargets(Project.java:560)
    [clientgen] at org.apache.tools.ant.Main.runBuild(Main.java:454)
    [clientgen] at org.apache.tools.ant.Main.start(Main.java:153)
    [clientgen] at org.apache.tools.ant.Main.main(Main.java:176)
    [clientgen] --- Nested Exception ---
    [clientgen] Could not read Web Service deployment descriptor
    [clientgen] at weblogic.ant.taskdefs.webservices.clientgen.EARClientGen.getWebServiceDD(EARClientGen.java:332)
    [clientgen] at weblogic.ant.taskdefs.webservices.clientgen.EARClientGen.run(EARClientGen.java:110)
    [clientgen] at weblogic.ant.taskdefs.webservices.clientgen.ClientGenTask.execute(ClientGenTask.java:270)
    [clientgen] at org.apache.tools.ant.Task.perform(Task.java:217)
    [clientgen] at org.apache.tools.ant.Target.execute(Target.java:164)
    [clientgen] at org.apache.tools.ant.Target.performTasks(Target.java:182)
    [clientgen] at org.apache.tools.ant.Project.executeTarget(Project.java:601)
    [clientgen] at org.apache.tools.ant.Project.executeTargets(Project.java:560)
    [clientgen] at org.apache.tools.ant.Main.runBuild(Main.java:454)
    [clientgen] at org.apache.tools.ant.Main.start(Main.java:153)
    [clientgen] at org.apache.tools.ant.Main.main(Main.java:176)
    [clientgen] --- Nested Exception ---
    [clientgen] weblogic.webservice.dd.DDProcessingException: Could not find required
    attribute "type" for element <fault> (Line 28, Column 8)
    [clientgen] at weblogic.webservice.dd.ParsingHelper.getRequiredAttribute(ParsingHelper.java:287)
    [clientgen] at weblogic.webservice.dd.DDLoader.processFaultElement(DDLoader.java:1195)
    [clientgen] at weblogic.webservice.dd.DDLoader.processFaultElements(DDLoader.java:1166)
    [clientgen] at weblogic.webservice.dd.DDLoader.processParamsElement(DDLoader.java:1004)
    [clientgen] at weblogic.webservice.dd.DDLoader.processOperationElement(DDLoader.java:977)
    [clientgen] at weblogic.webservice.dd.DDLoader.processOperationElements(DDLoader.java:853)
    [clientgen] at weblogic.webservice.dd.DDLoader.processOperationsElement(DDLoader.java:841)
    [clientgen] at weblogic.webservice.dd.DDLoader.processWebServiceElement(DDLoader.java:378)
    [clientgen] at weblogic.webservice.dd.DDLoader.processWebServiceElements(DDLoader.java:283)
    [clientgen] at weblogic.webservice.dd.DDLoader.processWebServicesElement(DDLoader.java:271)
    [clientgen] at weblogic.webservice.dd.DDLoader.load(DDLoader.java:249)
    [clientgen] at weblogic.webservice.util.WebServiceWarFile.getWSDD(WebServiceWarFile.java:79)
    [clientgen] at weblogic.ant.taskdefs.webservices.clientgen.EARClientGen.getWebServiceDD(EARClientGen.java:330)
    [clientgen] at weblogic.ant.taskdefs.webservices.clientgen.EARClientGen.run(EARClientGen.java:110)
    [clientgen] at weblogic.ant.taskdefs.webservices.clientgen.ClientGenTask.execute(ClientGenTask.java:270)
    [clientgen] at org.apache.tools.ant.Task.perform(Task.java:217)
    [clientgen] at org.apache.tools.ant.Target.execute(Target.java:164)
    [clientgen] at org.apache.tools.ant.Target.performTasks(Target.java:182)
    [clientgen] at org.apache.tools.ant.Project.executeTarget(Project.java:601)
    [clientgen] at org.apache.tools.ant.Project.executeTargets(Project.java:560)
    [clientgen] at org.apache.tools.ant.Main.runBuild(Main.java:454)
    [clientgen] at org.apache.tools.ant.Main.start(Main.java:153)
    [clientgen] at org.apache.tools.ant.Main.main(Main.java:176)
    [clientgen] --------------- nested within: ------------------
    [clientgen] weblogic.webservice.util.WebServiceJarException: Could not load deployment
    descriptor - with nested exception:
    [clientgen] [weblogic.webservice.dd.DDProcessingException: Could not find required
    attribute "type" for element <fault> (Line 28, Column 8)]
    [clientgen] at weblogic.webservice.util.WebServiceWarFile.getWSDD(WebServiceWarFile.java:81)
    [clientgen] at weblogic.ant.taskdefs.webservices.clientgen.EARClientGen.getWebServiceDD(EARClientGen.java:330)
    [clientgen] at weblogic.ant.taskdefs.webservices.clientgen.EARClientGen.run(EARClientGen.java:110)
    [clientgen] at weblogic.ant.taskdefs.webservices.clientgen.ClientGenTask.execute(ClientGenTask.java:270)
    [clientgen] at org.apache.tools.ant.Task.perform(Task.java:217)
    [clientgen] at org.apache.tools.ant.Target.execute(Target.java:164)
    [clientgen] at org.apache.tools.ant.Target.performTasks(Target.java:182)
    [clientgen] at org.apache.tools.ant.Project.executeTarget(Project.java:601)
    [clientgen] at org.apache.tools.ant.Project.executeTargets(Project.java:560)
    [clientgen] at org.apache.tools.ant.Main.runBuild(Main.java:454)
    [clientgen] at org.apache.tools.ant.Main.start(Main.java:153)
    [clientgen] at org.apache.tools.ant.Main.main(Main.java:176)
    BUILD FAILED
    Does anybody have any ideas?
    Thanks much,
    sami

    Manoj,
    Thanks a lot, THAT DID IT... two very helpful hints from you in a row.
    By the way, one thing worth mentioning is that the WebLogic documentation that we explored did not have enough information about this issue.
    Thanks again.
    sami
    "manoj cheenath" <[email protected]> wrote:
    Buried deep in the stack trace is this little detail:
    Could not find required attribute "type" for element <fault> (Line 28, Column 8)
    So the correct DD should look something like:
    <fault type="typeNS:string"
           xmlns:typeNS="http://www.w3.org/2001/XMLSchema"
           class-name="tutorial.sample9.HelloWorldException"
           name="HelloWorldException">
    </fault>
    Also, check out this example:
    http://manojc.com/?sample9
    There is a known problem: WLS cannot handle exceptions that contain complex data types. This will be fixed in SP1.
    Regards,
    -manoj
    http://manojc.com
    "sami titi" <[email protected]> wrote in message
    news:[email protected]...
    We are trying to capture an invalid message coming into our servicebefore
    our
    service actually processes it. Per WLS7 documentation, it providesthe
    ability
    to add a <fault> elt under the <params> elt in web-services.xml toperform
    that.
    Here's how the operations portion of our web-services.xml looks like:
    <operations>
    <operation method="echo(java.lang.String)" component="jcComp0"name="echo"
    handler-chain="diagnosticChain">
    <params>
    <param location="body" class-name="java.lang.String" style="in"name="echoString"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema" type="xsd:string">
    </param>
    <return-param location="body" class-name="java.lang.String"name="Result"
    xmlns:xsd="http://www.w3.org/2001/XMLSchema" type="xsd:string">
    </return-param>
    <fault name="InvalidMessageException"class-name="com.gmacfs.routeone.diagnostic.InvalidMessageException"/>
    </params>
    </operation>
    </operations>
    However, when we tried doing that, we got a BIG set of exception whiletrying
    to build our client. It looks as follows:
    client:
    [clientgen] Generating client jar for diagnostic.ear ...
    [clientgen] Could not read Web Service deployment descriptor
    [clientgen] atweblogic.ant.taskdefs.webservices.clientgen.EARClientGen.run(EARClientGen.ja
    va:112)
    [clientgen] atweblogic.ant.taskdefs.webservices.clientgen.ClientGenTask.execute(ClientGenT
    ask.java:270)
    [clientgen] at org.apache.tools.ant.Task.perform(Task.java:217)
    [clientgen] at org.apache.tools.ant.Target.execute(Target.java:164)
    [clientgen] atorg.apache.tools.ant.Target.performTasks(Target.java:182)
    [clientgen] atorg.apache.tools.ant.Project.executeTarget(Project.java:601)
    [clientgen] atorg.apache.tools.ant.Project.executeTargets(Project.java:560)
    [clientgen] at org.apache.tools.ant.Main.runBuild(Main.java:454)
    [clientgen] at org.apache.tools.ant.Main.start(Main.java:153)
    [clientgen] at org.apache.tools.ant.Main.main(Main.java:176)
    [clientgen] --- Nested Exception ---
    [clientgen] Could not read Web Service deployment descriptor
    [clientgen] atweblogic.ant.taskdefs.webservices.clientgen.EARClientGen.getWebServiceDD(EAR
    ClientGen.java:332)
    [clientgen] atweblogic.ant.taskdefs.webservices.clientgen.EARClientGen.run(EARClientGen.ja
    va:110)
    [clientgen] atweblogic.ant.taskdefs.webservices.clientgen.ClientGenTask.execute(ClientGenT
    ask.java:270)
    [clientgen] at org.apache.tools.ant.Task.perform(Task.java:217)
    [clientgen] at org.apache.tools.ant.Target.execute(Target.java:164)
    [clientgen] atorg.apache.tools.ant.Target.performTasks(Target.java:182)
    [clientgen] atorg.apache.tools.ant.Project.executeTarget(Project.java:601)
    [clientgen] atorg.apache.tools.ant.Project.executeTargets(Project.java:560)
    [clientgen] at org.apache.tools.ant.Main.runBuild(Main.java:454)
    [clientgen] at org.apache.tools.ant.Main.start(Main.java:153)
    [clientgen] at org.apache.tools.ant.Main.main(Main.java:176)
    [clientgen] --- Nested Exception ---
    [clientgen] weblogic.webservice.dd.DDProcessingException: Could notfind
    required
    attribute "type" for element <fault> (Line 28, Column 8)
    [clientgen] atweblogic.webservice.dd.ParsingHelper.getRequiredAttribute(ParsingHelper.java
    :287)
    [clientgen] atweblogic.webservice.dd.DDLoader.processFaultElement(DDLoader.java:1195)
    [clientgen] atweblogic.webservice.dd.DDLoader.processFaultElements(DDLoader.java:1166)
    [clientgen] atweblogic.webservice.dd.DDLoader.processParamsElement(DDLoader.java:1004)
    [clientgen] atweblogic.webservice.dd.DDLoader.processOperationElement(DDLoader.java:977)
    [clientgen] atweblogic.webservice.dd.DDLoader.processOperationElements(DDLoader.java:853)
    [clientgen] atweblogic.webservice.dd.DDLoader.processOperationsElement(DDLoader.java:841)
    [clientgen] atweblogic.webservice.dd.DDLoader.processWebServiceElement(DDLoader.java:378)
    [clientgen] atweblogic.webservice.dd.DDLoader.processWebServiceElements(DDLoader.java:283)
    [clientgen] atweblogic.webservice.dd.DDLoader.processWebServicesElement(DDLoader.java:271)
    [clientgen] at weblogic.webservice.dd.DDLoader.load(DDLoader.java:249)
    [clientgen] atweblogic.webservice.util.WebServiceWarFile.getWSDD(WebServiceWarFile.java:79
    [clientgen] atweblogic.ant.taskdefs.webservices.clientgen.EARClientGen.getWebServiceDD(EAR
    ClientGen.java:330)
    [clientgen] atweblogic.ant.taskdefs.webservices.clientgen.EARClientGen.run(EARClientGen.ja
    va:110)
    [clientgen] atweblogic.ant.taskdefs.webservices.clientgen.ClientGenTask.execute(ClientGenT
    ask.java:270)
    [clientgen] at org.apache.tools.ant.Task.perform(Task.java:217)
    [clientgen] at org.apache.tools.ant.Target.execute(Target.java:164)
    [clientgen] atorg.apache.tools.ant.Target.performTasks(Target.java:182)
    [clientgen] atorg.apache.tools.ant.Project.executeTarget(Project.java:601)
    [clientgen] atorg.apache.tools.ant.Project.executeTargets(Project.java:560)
    [clientgen] at org.apache.tools.ant.Main.runBuild(Main.java:454)
    [clientgen] at org.apache.tools.ant.Main.start(Main.java:153)
    [clientgen] at org.apache.tools.ant.Main.main(Main.java:176)
    [clientgen] --------------- nested within: ------------------
    [clientgen] weblogic.webservice.util.WebServiceJarException: Couldnot
    load deployment
    descriptor - with nested exception:
    [clientgen] [weblogic.webservice.dd.DDProcessingException: Could not
    find>required>> attribute "type" for element <fault> (Line 28, Column 8)
    [clientgen] atweblogic.webservice.util.WebServiceWarFile.getWSDD(WebServiceWarFile.java:81
    [clientgen] atweblogic.ant.taskdefs.webservices.clientgen.EARClientGen.getWebServiceDD(EAR
    ClientGen.java:330)
    [clientgen] atweblogic.ant.taskdefs.webservices.clientgen.EARClientGen.run(EARClientGen.ja
    va:110)
    [clientgen] atweblogic.ant.taskdefs.webservices.clientgen.ClientGenTask.execute(ClientGenT
    ask.java:270)
    [clientgen] at org.apache.tools.ant.Task.perform(Task.java:217)
    [clientgen] at org.apache.tools.ant.Target.execute(Target.java:164)
    [clientgen] atorg.apache.tools.ant.Target.performTasks(Target.java:182)
    [clientgen] atorg.apache.tools.ant.Project.executeTarget(Project.java:601)
    [clientgen] atorg.apache.tools.ant.Project.executeTargets(Project.java:560)
    [clientgen] at org.apache.tools.ant.Main.runBuild(Main.java:454)
    [clientgen] at org.apache.tools.ant.Main.start(Main.java:153)
    [clientgen] at org.apache.tools.ant.Main.main(Main.java:176)
    BUILD FAILED
    Anybody has any ideas?
    Thanks much,
    sami

  • Error while reporting in parallel with ETL run!!!

    Hi All,
    It has been observed that when a report is run in parallel with the ETL, the report fails with the following error:
    Error during SQL execution: (DA0003)
    Exception: DBD, ORA-12842: Cursor invalidated during parallel execution State: N/A
    Please let me know if we can run reports in parallel with the ETL.

    Hi,
    that's correct. You should first load your DWH via ETL and then report off it.
    Regards
    -Seb.

  • Working on ETL tools interoperability using Common Warehouse Model (CWM)

    Hi All,
    It's just a piece of information, not a question.
    I have been working on proving ETL tool interoperability using the Common Warehouse Metamodel (CWM), an OMG standard. The whole concept is to take the metadata out of an ETL tool, say OWB, and put it into a CWM metadata repository; that metadata can then be used to build the same project in any other tool, say Informatica, or even in the same ETL tool.
    The main thing in this process is to map each ETL tool to the CWM concepts and then, using model-to-model transformations (technologies like Xtend), set up communication between the different ETL tools.
    Till now I have worked with OWB only. I, with my team, have extracted all the information from an OWB project (of medium complexity: two Oracle modules (schemas) and a few tables, views and mappings with various operators), put it in the CWM repository, and extracted it back from the CWM MDR into OWB. We haven't worked with any other ETL tool because no other ETL tool is available to us. We will be working with Pentaho Kettle in the near future and will try to prove the whole process as two-way communication.
    The whole process can be described in the following steps:
    1. Creation of a manual OWB Ecore model (a model representation in the Eclipse Modeling Framework) which gives all the dependencies and relationships of OWB objects like Project, OracleModule, etc.
    2. Creation of the CWM Ecore model from the Rational Rose .mdl that OMG provides on their site.
    3. Generation of Java code (Gen Model) from the above-mentioned Ecore model (needed to create an object from OWB).
    4. Extraction of the project from OWB using the public views that OWB itself exposes. You can refer to http://download.oracle.com/docs/cd/B31080_01/doc/owb.102/b28225/toc.htm for the OWB public views and other APIs; see the sketch after this list.
    5. (Step 4 is actually a part of this step.) Writing Java code which opens a JDBC connection to access the OWB public views, with the Ecore model as imported Java files (step 3 was done for this part only). This Java code returns an OWB project object (an instance of the Ecore model) which is used in the further steps.
    6. Writing Xtend code to do a model-to-model transformation from OWB to CWM.
    7. Writing an openArchitectureWare workflow to combine all the steps into one: it takes the output of the Java code (step 5), feeds it into the Xtend code (step 6), and then passes the Xtend output to the XMIWriter (an oAW component) to write an XMI, which is actually a CWM Ecore model instance.
    8. Saving the above XMI (the CWM model instance) to the CWM MDR using Hibernate and Teneo.
    In the same way we can extract metadata from the CWM MDR and put it into OWB. The only problem with OWB is that we cannot persist an OWB object directly into the OWB repositories, as the OWB tables are very cryptic and tough to understand. So for that we have used Tcl scripts (OMB Plus scripts) to create a project in OWB from the OWB Ecore instance. You can refer to the above Oracle documentation link for the Tcl scripts.
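    As an illustration of step 4, the extraction side is plain SQL against the OWB public views over JDBC. A minimal sketch (ALL_IV_PROJECTS and its columns follow the B28225 public-view documentation linked above; verify the exact names in your release):

        -- List the projects visible in the OWB design repository
        SELECT project_id, project_name
        FROM   all_iv_projects;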
    Let me know if I can assist you if you are working on the same.
    You can mail me for any queries. My email id is [email protected].
    Thanks,
    Deepak

    Hi
    1. Why do we need to install another standalone HTTP server in a separate home? Where do we use that server?
    DA: The separate HTTP server is for the Workflow Monitor, which is not necessary (it has some use cases, mind you).
    2. To make OWB work correctly while using the ETL features, do we always need to run the Workflow Configuration Assistant? I wasn't able to generate code from the OWB editor after building a mapping while the Workflow Configuration Assistant wasn't running.
    DA: Not necessary; what error did you get? Mappings can be designed, deployed and executed without Workflow. Workflow can be used for orchestrating the mappings (i.e. running a bunch of them in a specific order along with other tasks).
    3. Whenever I try to save my work in OWB, I get an error message: Preference.properties (Access is denied). It saves my work, but I don't understand why I am getting this error. It looks like OWB is trying to access some property from the Preferences (Tools menu) but can't.
    DA: It sounds like the directory where you have installed OWB does not have permissions for the OS user executing it. Is the install user different from the execution user? Either run as the install user, or change the permissions of the directories (grant the executing user write permissions on all directories under owb).
    4. I also get an error while closing the Mapping Editor:
    DA: same issue as 3.
    Cheers
    David

  • DAC ETL tasks failing with Error Code

    Here is a copy of just one of the tasks that fail. I did search for this and found a reference to increasing DB Processes to 500, but that had no effect.
    I have also tried reducing the number of Maximum Sessions in Setup, but that had no effect either.
    Some of the tasks completed. The remaining ones stay as Failed even after re-running the ETL.
    Using Informatica 8.1.1 SP5 32bit
    Build: 7.9.5.1.011209.1448
    Release Version: Oracle Business Intelligence Applications 7.9.5.1
    Package: 011209.1448
    Oracle 32bit 10g client
    Windows Server 2003 R2 64bit
    Details:
    pmcmd startworkflow -u Administrator -p **** -s 10.1.11.69:4006 -f SDE_ORAR12_Adaptor -lpf D:\OracleBI\DAC\Informatica\parameters\SDE_ORAR12_Adaptor.SDE_ORA_Stage_ValueSetHier_Extract_Full.txt SDE_ORA_Stage_ValueSetHier_Extract_Full
    Status Desc : Failed
    WorkFlowMessage :
    =====================================
    STD OUTPUT
    =====================================
    Informatica(r) PMCMD, version [8.1.1 SP5], build [186.0822], Windows 32-bit
    Copyright (c) Informatica Corporation 1994 - 2008
    All Rights Reserved.
    Invoked at Thu Apr 15 14:39:18 2010
    Connected to Integration Service at [10.1.11.69:4006]
    Folder: [SDE_ORAR12_Adaptor]
    Workflow: [SDE_ORA_Stage_ValueSetHier_Extract_Full] version [1].
    Workflow run status: [Failed]
    Workflow run error code: [36331]
    Workflow run error message: [WARNING: Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] failed and its "fail parent if this task fails" setting is turned on. So, Workflow [SDE_ORA_Stage_ValueSetHier_Extract_Full] will be failed.]
    Start time: [Thu Apr 15 14:38:58 2010]
    End time: [Thu Apr 15 14:39:03 2010]
    Workflow log file: [d:\Informatica\PowerCenter8.1.1\server\infa_shared\WorkflowLogs\SDE_ORA_Stage_ValueSetHier_Extract_Full.log]
    Workflow run type: [User request]
    Run workflow as user: [Administrator]
    Integration Service: [PowerCenter_Integration_Service]
    Disconnecting from Integration Service
    Completed at Thu Apr 15 14:39:18 2010
    =====================================
    ERROR OUTPUT
    =====================================
    Error Message : Unknown reason for error code 36331
    ErrorCode : 36331
    Workflow Log.
    INFO : LM_36435 [Thu Apr 15 14:38:58 2010] : (4716|4956) Starting execution of workflow [SDE_ORA_Stage_ValueSetHier_Extract_Full] in folder [SDE_ORAR12_Adaptor] last saved by user [Administrator].
    INFO : LM_44195 [Thu Apr 15 14:38:58 2010] : (4716|4956) Workflow [SDE_ORA_Stage_ValueSetHier_Extract_Full] service level [SLPriority:5,SLDispatchWaitTime:1800].
    INFO : VAR_27085 [Thu Apr 15 14:38:58 2010] : (4716|4956) Parameter file [d:\Informatica\PowerCenter8.1.1\server\infa_shared\Temp\SDE_ORA_Stage_ValueSetHier_Extract_Full_a04956] is opened for [workflow [SDE_ORA_Stage_ValueSetHier_Extract_Full]].
    INFO : LM_36330 [Thu Apr 15 14:38:58 2010] : (4716|4956) Start task instance [Start]: Execution started.
    INFO : LM_36318 [Thu Apr 15 14:38:58 2010] : (4716|4956) Start task instance [Start]: Execution succeeded.
    INFO : LM_36505 : (4716|4956) Link [Start --> SDE_ORA_Stage_ValueSetHier_Extract_Full]: empty expression string, evaluated to TRUE.
    INFO : LM_36388 [Thu Apr 15 14:38:58 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] is waiting to be started.
    INFO : LM_36330 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full]: Execution started.
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [TM_6793 Fetching initialization properties from the Integration Service. : (Thu Apr 15 14:38:58 2010)]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [DISP_20305 The [Preparer] DTM with process id [7972] is running on node [node01_OBIEEDEV01].
    : (Thu Apr 15 14:38:58 2010)]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [PETL_24036 Beginning the prepare phase for the session.]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [TM_6721 Started [Connect to Repository].]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [TM_6722 Finished [Connect to Repository]. It took [0.234368] seconds.]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [TM_6794 Connected to repository [PowerCenter] in domain [Domain_OBIEEDEV01] user [Administrator]]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [TM_6721 Started [Fetch Session from Repository].]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [TM_6722 Finished [Fetch Session from Repository]. It took [0.234367] seconds.]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [VAR_27028 Use override value [ORA_R12] for session parameter:[$DBConnection_OLTP].]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [VAR_27028 Use override value [9] for mapping parameter:[$$DATASOURCE_NUM_ID].]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [VAR_27027 Use default value [] for mapping parameter:[mplt_BC_ORA_ValueSetHier.$$LAST_EXTRACT_DATE].]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [TM_6721 Started [Partition Group Formation].]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [TM_6722 Finished [Partition Group Formation]. It took [0.031249] seconds.]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [PETL_24037 Finished the prepare phase for the session.]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [TM_6792 Notifying the Integration Service that the prepare phase has been completed. : (Thu Apr 15 14:38:59 2010)]
    INFO : LM_36488 [Thu Apr 15 14:38:59 2010] : (4716|4956) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] : [TM_6228 Writing session output to log file [d:\Informatica\PowerCenter8.1.1\server\infa_shared\SessLogs\SDE_ORAR12_Adaptor.SDE_ORA_Stage_ValueSetHier_Extract_Full.log].]
    INFO : LM_36682 [Thu Apr 15 14:38:59 2010] : (4716|4960) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full]: started a process with pid [7972] on node [node01_OBIEEDEV01].
    ERROR : LM_36320 [Thu Apr 15 14:39:03 2010] : (4716|4964) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full]: Execution failed.
    WARNING : LM_36331 : (4716|4964) Session task instance [SDE_ORA_Stage_ValueSetHier_Extract_Full] failed and its "fail parent if this task fails" setting is turned on. So, Workflow [SDE_ORA_Stage_ValueSetHier_Extract_Full] will be failed.
    ERROR : LM_36320 [Thu Apr 15 14:39:03 2010] : (4716|4964) Workflow [SDE_ORA_Stage_ValueSetHier_Extract_Full]: Execution failed.
    Thanks

    Can you please post the session log?
    Thanks,
    Austin

  • How to process a PDF file in Clover ETL

    Hi,
    I want to process a PDF document in the CloverETL data integrator. I have created a sample project and an ETL graph with a UniversalDataReader into which I have imported the PDF file. While opening the metadata information it shows an encoding/data-format and invalid delimiter error, and while running I get an error in the console.
    Please assist me with how to process a PDF file with an unstructured data format.
    I am getting the error below:
    ERROR [WatchDog] - Graph execution finished with error
    ERROR [WatchDog] - Node DATA_READER0 finished with status: ERROR caused by: Parsing error: Unexpected record delimiter, probably record has too few fields. in field # 1 of record # 2, value: '<Raw record data is not available, please turn on verbose mode.>'
    ERROR [WatchDog] - Node DATA_READER0 error details:
    org.jetel.exception.BadDataFormatException: Parsing error: Unexpected record delimiter, probably record has too few fields. in field # 1 of record # 2, value: '<Raw record data is not available, please turn on verbose mode.>'
         at org.jetel.data.parser.DataParser.parsingErrorFound(DataParser.java:527)
         at org.jetel.data.parser.DataParser.parseNext(DataParser.java:437)
         at org.jetel.data.parser.DataParser.getNext(DataParser.java:168)
         at org.jetel.util.MultiFileReader.getNext(MultiFileReader.java:415)
         at org.jetel.component.DataReader.execute(DataReader.java:261)
         at org.jetel.graph.Node.run(Node.java:425)
         at java.lang.Thread.run(Thread.java:619)
    Please, can anyone help me?
    Thanks
    Rajini C

    There is a separate forum for the BI/Information Discovery application of Endeca software: Endeca Information Discovery. You should post your message there.
    Thanks.
    Sean

  • How do I upload a CSV file with embedded quotation marks into a table via ETL

    I'm having a problem importing a CSV file via ETL that contains double quotes, and prior solutions aren't helping. My data looks like this:
    A   B114SA   CHLORASCRUB SWAB INS SUBASSEMB
    A   S273SA   CHLORASCRUB MAXI INS SUBASSEMB
    A   2AB286   WEB ZEE ANTISEPT 5410\4.5" CD
    A   2AB512   WEB PDI PVP IODINE PREP PAD 3870/4.5
    A   2AB542   WEB ZEE CLEAN WIPE NP5410/4.5
    If I set the "Text Qualifier" to a double quote (") and run it, it falls over on the third row (the one whose description contains the embedded 4.5" quote), with the following error:
    - Executing (Error)
    Messages
    Error 0xc0202055: Data Flow Task 1: The column delimiter for column "Column 2" was not found.
     (SQL Server Import and Export Wizard)
    Error 0xc0202092: Data Flow Task 1: An error occurred while processing file "H:\AS400_file_transfers\LIMS\ACTITEMPF.CSV" on data row 3.
     (SQL Server Import and Export Wizard)
    Error 0xc0047038: Data Flow Task 1: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on component "Source - ACTITEMPF_CSV" (1) returned error code 0xC0202092.  The component returned a failure code when the pipeline engine
    called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.
     (SQL Server Import and Export Wizard)
    Any help?

    Full support for embedded quotes was added in SSIS 2012.
    Which version are you using?
    http://blogs.msdn.com/b/mattm/archive/2011/07/17/flat-file-source-changes-in-denali.aspx
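    If upgrading isn't an option, one workaround is to bypass the wizard and load with BULK INSERT in CSV mode. A sketch, assuming SQL Server 2017 or later and a hypothetical target table; note that CSV quoting rules require a field containing the " character to be wrapped in quotes with the inner quote doubled:

        -- FORMAT = 'CSV' (SQL Server 2017+) parses quoted fields, including
        -- embedded, doubled quotation marks; dbo.ActItem is a hypothetical table
        BULK INSERT dbo.ActItem
        FROM 'H:\AS400_file_transfers\LIMS\ACTITEMPF.CSV'
        WITH (FORMAT = 'CSV', FIELDQUOTE = '"');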

  • DAC: failed task during ETL for financial apps

    I am trying my first ETL on OBIA 7.9.6.4, using Oracle EBS 12.1.1 as the source system.
    The ETL completes 314 tasks successfully, but it fails on the task named:
    "SDE_ORA_GL_AR_REV_LinkageInformation_Extract"
    DAC Error log:
    =====================================
    STD OUTPUT
    =====================================
    Informatica(r) PMCMD, version [9.1.0 HotFix2], build [357.0903], Windows 32-bit
    Copyright (c) Informatica Corporation 1994 - 2011
    All Rights Reserved.
    Invoked at Wed Sep 18 09:46:41 2013
    Connected to Integration Service: [infor_int].
    Folder: [SDE_ORAR1211_Adaptor]
    Workflow: [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full]
    Instance: [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full]
    Mapping: [SDE_ORA_GL_AR_REV_LinkageInformation_Extract]
    Session log file: [C:\Informatica\server\infa_shared\SessLogs\.SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full.ORA_R1211.log]
    Source success rows: [0]
    Source failed rows: [0]
    Target success rows: [0]
    Target failed rows: [0]
    Number of transformation errors: [0]
    First error code [4035]
    First error message: [RR_4035 SQL Error [
    ORA-00904: "XLA_EVENTS"."UPG_BATCH_ID": invalid identifier
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT DISTINCT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE,
    AELINE.CODE_COMBINATI]
    Task run status: [Failed]
    Integration Service: [infor_int]
    Integration Service Process: [infor_int]
    Integration Service Grid: [infor_int]
    Node Name(s) [node01_AMAZON-9C628AAE]
    Preparation fragment
    Partition: [Partition #1]
    Transformation instance: [SQ_XLA_AE_LINES]
    Transformation: [SQ_XLA_AE_LINES]
    Applied rows: [0]
    Affected rows: [0]
    Rejected rows: [0]
    Throughput(Rows/Sec): [0]
    Throughput(Bytes/Sec): [0]
    Last error code [16004], message [ERROR: Prepare failed. : [
    ORA-00904: "XLA_EVENTS"."UPG_BATCH_ID": invalid identifier
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT DISTINCT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE,
    AELINE.CODE_CO]
    Start time: [Wed Sep 18 09:46:13 2013]
    End time: [Wed Sep 18 09:46:13 2013]
    Partition: [Partition #1]
    Transformation instance: [W_GL_LINKAGE_INFORMATION_GS]
    Transformation: [W_GL_LINKAGE_INFORMATION_GS]
    Applied rows: [0]
    Affected rows: [0]
    Rejected rows: [0]
    Throughput(Rows/Sec): [0]
    Throughput(Bytes/Sec): [0]
    Last error code [0], message [No errors encountered.]
    Start time: [Wed Sep 18 09:46:14 2013]
    End time: [Wed Sep 18 09:46:14 2013]
    Disconnecting from Integration Service
    Completed at Wed Sep 18 09:46:41 2013
    Informatica session logs:
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [ORA_R1211] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [.SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full.ORA_R1211.log] for session parameter:[$PMSessionLogFile].
    DIRECTOR> VAR_27028 Use override value [26] for mapping parameter:[$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27028 Use override value ['N'] for mapping parameter:[$$FILTER_BY_LEDGER_ID].
    DIRECTOR> VAR_27028 Use override value ['N'] for mapping parameter:[$$FILTER_BY_LEDGER_TYPE].
    DIRECTOR> VAR_27028 Use override value [] for mapping parameter:[$$Hint1].
    DIRECTOR> VAR_27028 Use override value [01/01/1970] for mapping parameter:[$$INITIAL_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [01/01/1990] for mapping parameter:[$$LAST_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [1] for mapping parameter:[$$LEDGER_ID_LIST].
    DIRECTOR> VAR_27028 Use override value ['NONE'] for mapping parameter:[$$LEDGER_TYPE_LIST].
    DIRECTOR> TM_6014 Initializing session [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full] at [Wed Sep 18 09:46:13 2013].
    DIRECTOR> TM_6683 Repository Name: [infor_rep]
    DIRECTOR> TM_6684 Server Name: [infor_int]
    DIRECTOR> TM_6686 Folder: [SDE_ORAR1211_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full] Run Instance Name: [] Run Id: [2130]
    DIRECTOR> TM_6101 Mapping name: SDE_ORA_GL_AR_REV_LinkageInformation_Extract [version 1].
    DIRECTOR> TM_6963 Pre 85 Timestamp Compatibility is Enabled
    DIRECTOR> TM_6964 Date format for the Session is [MM/DD/YYYY HH24:MI:SS]
    DIRECTOR> TM_6827 [C:\Informatica\server\infa_shared\Storage] will be used as storage directory for session [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full].
    DIRECTOR> CMN_1805 Recovery cache will be deleted when running in normal mode.
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6708 Using configuration property [DisableDB2BulkMode ,Yes]
    DIRECTOR> TM_6708 Using configuration property [OraDateToTimestamp ,Yes]
    DIRECTOR> TM_6708 Using configuration property [overrideMpltVarWithMapVar,Yes]
    DIRECTOR> TM_6708 Using configuration property [SiebelUnicodeDB,[APPS]@[ 54.225.65.108:1521:VIS] [DWH_REP2]@[AMAZON-9C628AAE:1521:obiaDW1]]
    DIRECTOR> TM_6703 Session [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full] is run by 32-bit Integration Service  [node01_AMAZON-9C628AAE], version [9.1.0 HotFix2], build [0903].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [ASCII]
    MAPPING> CMN_1570 Server Code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> TM_6151 The session sort order is [Binary].
    MAPPING> TM_6156 Using low precision processing.
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6187 Session target-based commit interval is [10000].
    MAPPING> TM_6307 DTM error log disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> TE_7004 Transformation Parse Warning [IIF(EVENT_TYPE_CODE='RECP_REVERSE',
    IIF(UPG_BATCH_ID>0,
    SOURCE_TABLE || '~' || DISTRIBUTION_ID,
    SOURCE_TABLE || '~RECEIPTREVERSE~' || DISTRIBUTION_ID),
    SOURCE_TABLE || '~' || DISTRIBUTION_ID)
    ]; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning [<<PM Parse Warning>> [||]: operand converted to a string
    ... IIF(EVENT_TYPE_CODE='RECP_REVERSE',
    IIF(UPG_BATCH_ID>0,
    SOURCE_TABLE || '~' || >>>>DISTRIBUTION_ID<<<<,
    SOURCE_TABLE || '~RECEIPTREVERSE~' || DISTRIBUTION_ID),
    SOURCE_TABLE || '~' || DISTRIBUTION_ID)
    <<PM Parse Warning>> [||]: operand converted to a string
    ... IIF(EVENT_TYPE_CODE='RECP_REVERSE',
    IIF(UPG_BATCH_ID>0,
    SOURCE_TABLE || '~' || DISTRIBUTION_ID,
    SOURCE_TABLE || '~RECEIPTREVERSE~' || >>>>DISTRIBUTION_ID<<<<),
    SOURCE_TABLE || '~' || DISTRIBUTION_ID)
    <<PM Parse Warning>> [||]: operand converted to a string
    ... IIF(EVENT_TYPE_CODE='RECP_REVERSE',
    IIF(UPG_BATCH_ID>0,
    SOURCE_TABLE || '~' || DISTRIBUTION_ID,
    SOURCE_TABLE || '~RECEIPTREVERSE~' || DISTRIBUTION_ID),
    SOURCE_TABLE || '~' || >>>>DISTRIBUTION_ID<<<<)
    ]; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning [JE_HEADER_ID || '~' || JE_LINE_NUM]; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning [<<PM Parse Warning>> [||]: operand converted to a string
    ... >>>>JE_HEADER_ID<<<< || '~' || JE_LINE_NUM<<PM Parse Warning>> [JE_LINE_NUM]: operand converted to a string
    ... JE_HEADER_ID || '~' || >>>>JE_LINE_NUM<<<<]; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning [AE_HEADER_ID || '~' || AE_LINE_NUM]; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning [<<PM Parse Warning>> [||]: operand converted to a string
    ... >>>>AE_HEADER_ID<<<< || '~' || AE_LINE_NUM<<PM Parse Warning>> [AE_LINE_NUM]: operand converted to a string
    ... AE_HEADER_ID || '~' || >>>>AE_LINE_NUM<<<<]; transformation continues...
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Wed Sep 18 09:46:13 2013)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Wed Sep 18 09:46:13 2013)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 12582912 bytes and Block size is 128000 bytes.
    READER_1_1_1> DBG_21438 Reader: Source is [54.225.65.108:1521/VIS], user [APPS]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8146 Writer: Target is database [AMAZON-9C628AAE:1521/obiaDW1], user [DWH_REP2], bulk mode [ON]
    WRITER_1_*_1> WRT_8106 Warning! Bulk Mode session - recovery is not guaranteed.
    WRITER_1_*_1> WRT_8124 Target Table W_GL_LINKAGE_INFORMATION_GS :SQL INSERT statement:
    INSERT INTO W_GL_LINKAGE_INFORMATION_GS(SOURCE_DISTRIBUTION_ID,JOURNAL_LINE_INTEGRATION_ID,LEDGER_ID,LEDGER_TYPE,DISTRIBUTION_SOURCE,JE_BATCH_NAME,JE_HEADER_NAME,JE_LINE_NUM,POSTED_ON_DT,GL_ACCOUNT_ID,SLA_TRX_INTEGRATION_ID,DATASOURCE_NUM_ID)  VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_GL_LINKAGE_INFORMATION_GS]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    READER_1_1_1> BLKR_16007 Reader run started.
    READER_1_1_1> RR_4029 SQ Instance [SQ_XLA_AE_LINES] User specified SQL Query [SELECT DISTINCT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE,
    AELINE.CODE_COMBINATION_ID,
    AEHEADER.EVENT_TYPE_CODE,
    NVL(XLA_EVENTS.UPG_BATCH_ID,0) UPG_BATCH_ID
    FROM XLA_DISTRIBUTION_LINKS DLINK
       , GL_IMPORT_REFERENCES        GLIMPREF
       , XLA_AE_LINES                              AELINE
       , GL_JE_HEADERS                         JHEADER
       , GL_JE_BATCHES                         JBATCH
       , GL_LEDGERS                                 T
       , GL_PERIODS   PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
             (  'AR_DISTRIBUTIONS_ALL'
              , 'RA_CUST_TRX_LINE_GL_DIST_ALL')
    AND DLINK.APPLICATION_ID = 222
    AND AELINE.APPLICATION_ID = 222
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID         = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID         = DLINK.AE_HEADER_ID        
    AND AELINE.AE_LINE_NUM           = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID   = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID       = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID                   = T.LEDGER_ID
    AND JHEADER.STATUS                         = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
              TO_DATE('01/01/1970 00:00:00'
                    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('N', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')]
    READER_1_1_1> RR_4049 SQL Query issued to database : (Wed Sep 18 09:46:13 2013)
    WRITER_1_*_1> WRT_8005 Writer run started.
    WRITER_1_*_1> WRT_8158
    *****START LOAD SESSION*****
    Load Start Time: Wed Sep 18 09:46:13 2013
    Target tables:
         W_GL_LINKAGE_INFORMATION_GS
    READER_1_1_1> CMN_1761 Timestamp Event: [Wed Sep 18 09:46:13 2013]
    READER_1_1_1> RR_4035 SQL Error [
    ORA-00904: "XLA_EVENTS"."UPG_BATCH_ID": invalid identifier
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT DISTINCT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE,
    AELINE.CODE_COMBINATION_ID,
    AEHEADER.EVENT_TYPE_CODE,
    NVL(XLA_EVENTS.UPG_BATCH_ID,0) UPG_BATCH_ID
    FROM XLA_DISTRIBUTION_LINKS DLINK
       , GL_IMPORT_REFERENCES        GLIMPREF
       , XLA_AE_LINES                              AELINE
       , GL_JE_HEADERS                         JHEADER
       , GL_JE_BATCHES                         JBATCH
       , GL_LEDGERS                                 T
       , GL_PERIODS   PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
             (  'AR_DISTRIBUTIONS_ALL'
              , 'RA_CUST_TRX_LINE_GL_DIST_ALL')
    AND DLINK.APPLICATION_ID = 222
    AND AELINE.APPLICATION_ID = 222
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID         = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID         = DLINK.AE_HEADER_ID        
    AND AELINE.AE_LINE_NUM           = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID   = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID       = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID                   = T.LEDGER_ID
    AND JHEADER.STATUS                         = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
              TO_DATE('01/01/1970 00:00:00'
                    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('N', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    Oracle Fatal Error
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT DISTINCT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE,
    AELINE.CODE_COMBINATION_ID,
    AEHEADER.EVENT_TYPE_CODE,
    NVL(XLA_EVENTS.UPG_BATCH_ID,0) UPG_BATCH_ID
    FROM XLA_DISTRIBUTION_LINKS DLINK
       , GL_IMPORT_REFERENCES        GLIMPREF
       , XLA_AE_LINES                              AELINE
       , GL_JE_HEADERS                         JHEADER
       , GL_JE_BATCHES                         JBATCH
       , GL_LEDGERS                                 T
       , GL_PERIODS   PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
             (  'AR_DISTRIBUTIONS_ALL'
              , 'RA_CUST_TRX_LINE_GL_DIST_ALL')
    AND DLINK.APPLICATION_ID = 222
    AND AELINE.APPLICATION_ID = 222
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID         = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID         = DLINK.AE_HEADER_ID        
    AND AELINE.AE_LINE_NUM           = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID   = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID       = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID                   = T.LEDGER_ID
    AND JHEADER.STATUS                         = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
              TO_DATE('01/01/1970 00:00:00'
                    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('N', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    Oracle Fatal Error].
    READER_1_1_1> CMN_1761 Timestamp Event: [Wed Sep 18 09:46:13 2013]
    READER_1_1_1> BLKR_16004 ERROR: Prepare failed.
    WRITER_1_*_1> WRT_8333 Rolling back all the targets due to fatal session error.
    WRITER_1_*_1> WRT_8325 Final rollback executed for the target [W_GL_LINKAGE_INFORMATION_GS] at end of load
    WRITER_1_*_1> WRT_8035 Load complete time: Wed Sep 18 09:46:13 2013
    LOAD SUMMARY
    ============
    WRT_8036 Target: W_GL_LINKAGE_INFORMATION_GS (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
    WRT_8044 No data loaded for this target
    WRITER_1_*_1> WRT_8043 *****END LOAD SESSION*****
    MANAGER> PETL_24031
    ***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
    Thread [READER_1_1_1] created for [the read stage] of partition point [SQ_XLA_AE_LINES] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [SQ_XLA_AE_LINES] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [WRITER_1_*_1] created for [the write stage] of partition point [W_GL_LINKAGE_INFORMATION_GS] has completed. The total run time was insufficient for any meaningful statistics.
    MANAGER> PETL_24005 Starting post-session tasks. : (Wed Sep 18 09:46:14 2013)
    MANAGER> PETL_24029 Post-session task completed successfully. : (Wed Sep 18 09:46:14 2013)
    MAPPING> TM_6018 The session completed with [0] row transformation errors.
    MANAGER> PETL_24002 Parallel Pipeline Engine finished.
    DIRECTOR> PETL_24013 Session run completed with failure.
    DIRECTOR> TM_6022
    SESSION LOAD SUMMARY
    ================================================
    DIRECTOR> TM_6252 Source Load Summary.
    DIRECTOR> CMN_1740 Table: [SQ_XLA_AE_LINES] (Instance Name: [SQ_XLA_AE_LINES])
      Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6253 Target Load Summary.
    DIRECTOR> CMN_1740 Table: [W_GL_LINKAGE_INFORMATION_GS] (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
      Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6023
    ===================================================
    DIRECTOR> TM_6020 Session [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full] completed at [Wed Sep 18 09:46:14 2013].
    *I ran some queries against my source database (Vision): the table "XLA_EVENTS" exists, and the column "UPG_BATCH_ID" exists as well
    *I added "XLA_EVENTS" to the FROM clause and ran the query in SQL Developer
    *The SELECT clause references a column named "AEHEADER.EVENT_TYPE_CODE",
    but there is no table aliased "AEHEADER" in the FROM clause,
    so I added it manually; it probably refers to "XLA_AE_HEADERS"
    The final query looks like this:
    SELECT DISTINCT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE,
    AELINE.CODE_COMBINATION_ID,
    AEHEADER.EVENT_TYPE_CODE,
    NVL(XLA_EVENTS.UPG_BATCH_ID,0) UPG_BATCH_ID
    FROM XLA_DISTRIBUTION_LINKS DLINK
       , GL_IMPORT_REFERENCES        GLIMPREF
       , XLA_AE_LINES                              AELINE
       , GL_JE_HEADERS                         JHEADER
       , GL_JE_BATCHES                         JBATCH
       , GL_LEDGERS                                 T
       , GL_PERIODS   PER
       , XLA_AE_HEADERS AEHEADER
       , XLA_EVENTS
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
             (  'AR_DISTRIBUTIONS_ALL'
              , 'RA_CUST_TRX_LINE_GL_DIST_ALL')
    AND DLINK.APPLICATION_ID = 222
    AND AELINE.APPLICATION_ID = 222
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID         = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID         = DLINK.AE_HEADER_ID        
    AND AELINE.AE_LINE_NUM           = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID   = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID       = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID                   = T.LEDGER_ID
    AND JHEADER.STATUS                         = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
              TO_DATE('01/01/1970 00:00:00'
                    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('N', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    *When I run that query, it executes for a very long time without returning any results (last time it ran for 4 hours before I cancelled it)
    My questions are:
    -What is wrong with that query?
    -How can I change the query in the workflow?
    Could anyone please help?
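    A likely reason the hand-edited query runs for hours is that XLA_AE_HEADERS and XLA_EVENTS were added to the FROM clause without any join predicates, so Oracle builds a Cartesian product against both tables. A minimal sketch of the conditions that would tie them in, assuming the standard R12 subledger accounting keys:

        AND AEHEADER.APPLICATION_ID = 222
        AND XLA_EVENTS.APPLICATION_ID = 222
        AND AEHEADER.AE_HEADER_ID = AELINE.AE_HEADER_ID
        AND AEHEADER.EVENT_ID = XLA_EVENTS.EVENT_ID

    With those predicates added to the WHERE clause, the optimizer can join on indexed keys instead of scanning every row combination.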

    Thank you very much.
    I found SQ_XLA_AE_LINES and checked its SQL query; it is a perfectly healthy query:
    SELECT  DISTINCT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
          AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE,
    AELINE.CODE_COMBINATION_ID,
    AEHEADER.EVENT_TYPE_CODE,
    NVL(XLA_EVENTS.UPG_BATCH_ID,0) UPG_BATCH_ID
    FROM XLA_DISTRIBUTION_LINKS DLINK
       , GL_IMPORT_REFERENCES        GLIMPREF
       , XLA_AE_LINES                              AELINE
       , XLA_AE_HEADERS AEHEADER
       , GL_JE_HEADERS                         JHEADER
       , GL_JE_BATCHES                         JBATCH
       , GL_LEDGERS                                 T
       , GL_PERIODS   PER
       , XLA_EVENTS
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
             (  'AR_DISTRIBUTIONS_ALL'
              , 'RA_CUST_TRX_LINE_GL_DIST_ALL')
    AND DLINK.APPLICATION_ID = 222
    AND AELINE.APPLICATION_ID = 222
    AND AEHEADER.APPLICATION_ID = 222
    AND XLA_EVENTS.APPLICATION_ID=222
    AND AEHEADER.AE_HEADER_ID = AELINE.AE_HEADER_ID
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID         = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID         = DLINK.AE_HEADER_ID        
    AND AELINE.AE_LINE_NUM           = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID   = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID       = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID                   = T.LEDGER_ID
    AND JHEADER.STATUS                         = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND AEHEADER.EVENT_ID=XLA_EVENTS.EVENT_ID
    AND JHEADER.LAST_UPDATE_DATE >=
              TO_DATE('$$LAST_EXTRACT_DATE'
                    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE($$FILTER_BY_LEDGER_ID, 'Y', T.LEDGER_ID, 1) IN ($$LEDGER_ID_LIST)
    AND DECODE($$FILTER_BY_LEDGER_TYPE, 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ($$LEDGER_TYPE_LIST)
    I compared this query with the query that appears in the error messages, and they are different (the error message is quoted in the first post).
    The query in the error messages is missing several lines, specifically in the FROM clause and the WHERE clause.
    What might cause that issue?
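    Putting the two side by side, the failing query is missing exactly these lines from the healthy one:

        , XLA_AE_HEADERS AEHEADER          -- FROM clause
        , XLA_EVENTS                       -- FROM clause
        AND AEHEADER.APPLICATION_ID = 222
        AND XLA_EVENTS.APPLICATION_ID=222
        AND AEHEADER.AE_HEADER_ID = AELINE.AE_HEADER_ID
        AND AEHEADER.EVENT_ID=XLA_EVENTS.EVENT_ID

    yet it still references AEHEADER.EVENT_TYPE_CODE and XLA_EVENTS.UPG_BATCH_ID in the SELECT list, which is exactly what raises ORA-00904. One place worth checking (an assumption, not a confirmed fix): the _Full session in Informatica Workflow Manager can carry its own session-level SQL override for SQ_XLA_AE_LINES, so the Designer mapping may show the healthy query while the full-load session executes an older, truncated copy of it.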

  • ETL Load error

    Hi
    When I ran the ETL load for Project Analytics, it errored out.
    In the DAC, these 3 tasks failed:
    1. SIL_GlobalCurrencyGeneral_Update
    2. SDE_ORA_UserDimension
    3. SDE_ORA_EmployeeDimension
    Below is the error from the log file. Any help will be appreciated.
    9 SEVERE Thu Sep 24 17:54:48 PDT 2009
    START OF ETL
    10 SEVERE Thu Sep 24 17:57:14 PDT 2009 Unable to evaluate method getNamedSourceIdentifier for class com.siebel.analytics.etl.etltask.PauseTask
    11 SEVERE Thu Sep 24 17:57:15 PDT 2009 Unable to evaluate method getNamedSource for class com.siebel.analytics.etl.etltask.PauseTask
    12 SEVERE Thu Sep 24 17:57:20 PDT 2009 Unable to evaluate method getNamedSourceIdentifier for class com.siebel.analytics.etl.etltask.InformaticaTask
    13 SEVERE Thu Sep 24 17:57:20 PDT 2009 Unable to evaluate method getNamedSource for class com.siebel.analytics.etl.etltask.InformaticaTask
    14 SEVERE Thu Sep 24 17:58:27 PDT 2009 Unable to evaluate method getNamedSourceIdentifier for class com.siebel.analytics.etl.etltask.TaskPrecedingActionScriptTask
    15 SEVERE Thu Sep 24 17:58:27 PDT 2009 Unable to evaluate method getNamedSource for class com.siebel.analytics.etl.etltask.TaskPrecedingActionScriptTask
    16 SEVERE Thu Sep 24 17:58:39 PDT 2009 Starting ETL Process.
    17 SEVERE Thu Sep 24 17:59:14 PDT 2009 Informatica Status Poll Interval new value : 20000(milli-seconds)
    19 SEVERE Thu Sep 24 18:06:59 PDT 2009 /oracle/dac/OracleBI/bifoundation/dac/Informatica/parameters/input/ORACLE specified is not a currently existing directory
    20 SEVERE Thu Sep 24 18:06:59 PDT 2009 /oracle/dac/OracleBI/bifoundation/dac/Informatica/parameters/input/Oracle specified is not a currently existing directory
    21 SEVERE Thu Sep 24 18:06:59 PDT 2009 /oracle/dac/OracleBI/bifoundation/dac/Informatica/parameters/input/oracle specified is not a currently existing directory
    22 SEVERE Thu Sep 24 18:06:59 PDT 2009 /oracle/dac/OracleBI/bifoundation/dac/Informatica/parameters/input/ORACLE (THIN) specified is not a currently existing directory
    24 SEVERE Thu Sep 24 18:07:10 PDT 2009 /oracle/dac/OracleBI/bifoundation/dac/Informatica/parameters/input/FLAT FILE specified is not a currently existing directory
    25 SEVERE Thu Sep 24 18:08:23 PDT 2009 Request to start workflow : 'SILOS:SIL_CurrencyTypes' has completed with error code 0
    26 SEVERE Thu Sep 24 18:08:26 PDT 2009 Request to start workflow : 'SDE_ORAR12_Adaptor:SDE_ORA_ProductMultipleCategories_Full' has completed with error code 0
    27 SEVERE Thu Sep 24 18:08:26 PDT 2009 Request to start workflow : 'SDE_ORAR12_Adaptor:SDE_ORA_ExchangeRateGeneral_Full' has completed with error code 0
    28 SEVERE Thu Sep 24 18:08:26 PDT 2009 Request to start workflow : 'SDE_ORAR12_Adaptor:SDE_ORA_Product_Categories_Derive' has completed with error code 0
    29 SEVERE Thu Sep 24 18:08:26 PDT 2009 Request to start workflow : 'SILOS:SIL_Parameters_Update' has completed with error code 0
    30 SEVERE Thu Sep 24 18:08:26 PDT 2009 Request to start workflow : 'SDE_ORAR12_Adaptor:SDE_ORA_Stage_GLAccountDimension_FinSubCodes' has completed with error code 0
    31 SEVERE Thu Sep 24 18:08:27 PDT 2009 Request to start workflow : 'SDE_ORAR12_Adaptor:SDE_ORA_EmployeeDimension_Addresses_Full' has completed with error code 0
    32 SEVERE Thu Sep 24 18:08:27 PDT 2009 Request to start workflow : 'SDE_ORAR12_Adaptor:SDE_ORA_UserDimension_Full' has completed with error code 0
    33 SEVERE Thu Sep 24 18:08:27 PDT 2009 Request to start workflow : 'SDE_ORAR12_Adaptor:SDE_ORA_GeoCountryDimension' has completed with error code 0
    34 SEVERE Thu Sep 24 18:08:27 PDT 2009 Request to start workflow : 'SILOS:SIL_GlobalCurrencyGeneral_Update' has completed with error code 0
    35 SEVERE Thu Sep 24 18:19:53 PDT 2009 Error while contacting Informatica server for getting workflow status for SDE_ORA_UserDimension_Full
    Error Code = 36331:Unknown reason for error code 36331
    Pmcmd output :
    =====================================
    STD OUTPUT
    =====================================
    Informatica(r) PMCMD, version [8.6.0 HotFix4], build [272.1017], LINUX 32-bit
    Copyright (c) Informatica Corporation 1994 - 2008
    All Rights Reserved.
    Invoked at Thu Sep 24 18:19:40 2009
    Connected to Integration Service: [Integ_r1211].
    Integration Service status: [Running]
    Integration Service startup time: [Thu Sep 24 11:50:39 2009]
    Integration Service current time: [Thu Sep 24 18:19:43 2009]
    Folder: [SDE_ORAR12_Adaptor]
    Workflow: [SDE_ORA_UserDimension_Full] version [1].
    Workflow run status: [Failed]
    Workflow run error code: [36331]
    Workflow run error message: [WARNING: Session task instance [SDE_ORA_UserDimension_Full] failed and its "fail parent if this task fails" setting is turned on. So, Workflow [SDE_ORA_UserDimension_Full] will be failed.]
    Workflow run id [1611].
    Start time: [Thu Sep 24 18:08:26 2009]
    End time: [Thu Sep 24 18:15:45 2009]
    Workflow log file: [oracle/Informatica/PowerCenter8.6.0/server/infa_shared/WorkflowLogs/SDE_ORA_UserDimension_Full.log]
    Workflow run type: [User request]
    Run workflow as user: [Administrator]
    Run workflow with Impersonated OSProfile in domain: []
    Integration Service: [Integ_r1211]
    Disconnecting from Integration Service
    Completed at Thu Sep 24 18:19:43 2009
    =====================================
    ERROR OUTPUT
    =====================================
    37 SEVERE Thu Sep 24 18:19:53 PDT 2009 pmcmd startworkflow -sv Integ_r1211 -d Domain_r1211 -u Administrator -p **** -f SDE_ORAR12_Adaptor -lpf /oracle/Informatica/PowerCenter8.6.0/server/infa_shared/SrcFiles/ORA_R12_Flatfile.DataWarehouse.SDE_ORAR12_Adaptor.SDE_ORA_Stage_GLAccountDimension_FinSubCodes.txt SDE_ORA_Stage_GLAccountDimension_FinSubCodes
    Status Desc : Succeeded
    WorkFlowMessage : Workflow executed successfully.
    Error Message : Successfully completed.
    ErrorCode : 0
    36 SEVERE Thu Sep 24 18:19:53 PDT 2009 Error while contacting Informatica server for getting workflow status for SIL_GlobalCurrencyGeneral_Update
    Error Code = 36331:Unknown reason for error code 36331
    Pmcmd output :
    =====================================
    STD OUTPUT
    =====================================
    Informatica(r) PMCMD, version [8.6.0 HotFix4], build [272.1017], LINUX 32-bit
    Copyright (c) Informatica Corporation 1994 - 2008
    All Rights Reserved.
    Invoked at Thu Sep 24 18:19:12 2009
    Connected to Integration Service: [Integ_r1211].
    Integration Service status: [Running]
    Integration Service startup time: [Thu Sep 24 11:50:39 2009]
    Integration Service current time: [Thu Sep 24 18:19:21 2009]
    Folder: [SILOS]
    Workflow: [SIL_GlobalCurrencyGeneral_Update] version [1].
    Workflow run status: [Failed]
    Workflow run error code: [36331]
    Workflow run error message: [WARNING: Session task instance [SIL_GlobalCurrencyGeneral_Update] failed and its "fail parent if this task fails" setting is turned on. So, Workflow [SIL_GlobalCurrencyGeneral_Update] will be failed.]
    Workflow run id [1610].
    Start time: [Thu Sep 24 18:08:26 2009]
    End time: [Thu Sep 24 18:15:47 2009]
    Workflow log file: [oracle/Informatica/PowerCenter8.6.0/server/infa_shared/WorkflowLogs/SIL_GlobalCurrencyGeneral_Update.log]
    Workflow run type: [User request]
    Run workflow as user: [Administrator]
    Run workflow with Impersonated OSProfile in domain: []
    Integration Service: [Integ_r1211]
    Disconnecting from Integration Service
    Completed at Thu Sep 24 18:19:21 2009
    =====================================
    ERROR OUTPUT
    =====================================
    38 SEVERE Thu Sep 24 18:19:53 PDT 2009 Could not attach to workflow because of errorCode 36331 For workflow SDE_ORA_UserDimension_Full
    39 SEVERE Thu Sep 24 18:19:53 PDT 2009 Could not attach to workflow because of errorCode 36331 For workflow SIL_GlobalCurrencyGeneral_Update
    40 SEVERE Thu Sep 24 18:19:53 PDT 2009
    ANOMALY INFO::: Error while executing : INFORMATICA TASK:SDE_ORAR12_Adaptor:SDE_ORA_UserDimension_Full:(Source : FULL Target : FULL)
    MESSAGE:::
    Irrecoverable Error
    Error while contacting Informatica server for getting workflow status for SDE_ORA_UserDimension_Full
    Error Code = 36331:Unknown reason for error code 36331
    Pmcmd output :
    =====================================
    STD OUTPUT
    =====================================
    Informatica(r) PMCMD, version [8.6.0 HotFix4], build [272.1017], LINUX 32-bit
    Copyright (c) Informatica Corporation 1994 - 2008
    All Rights Reserved.
    Invoked at Thu Sep 24 18:19:40 2009
    Connected to Integration Service: [Integ_r1211].
    Integration Service status: [Running]
    Integration Service startup time: [Thu Sep 24 11:50:39 2009]
    Integration Service current time: [Thu Sep 24 18:19:43 2009]
    Folder: [SDE_ORAR12_Adaptor]
    Workflow: [SDE_ORA_UserDimension_Full] version [1].
    Workflow run status: [Failed]
    Workflow run error code: [36331]
    Workflow run error message: [WARNING: Session task instance [SDE_ORA_UserDimension_Full] failed and its "fail parent if this task fails" setting is turned on. So, Workflow [SDE_ORA_UserDimension_Full] will be failed.]
    Workflow run id [1611].
    Start time: [Thu Sep 24 18:08:26 2009]
    End time: [Thu Sep 24 18:15:45 2009]
    Workflow log file: [oracle/Informatica/PowerCenter8.6.0/server/infa_shared/WorkflowLogs/SDE_ORA_UserDimension_Full.log]
    Workflow run type: [User request]
    Run workflow as user: [Administrator]
    Run workflow with Impersonated OSProfile in domain: []
    Integration Service: [Integ_r1211]
    Disconnecting from Integration Service
    Completed at Thu Sep 24 18:19:43 2009
    =====================================
    ERROR OUTPUT
    =====================================
    Re-Queue to attempt to run again or attach to running workflow
    if Execution Plan is still running or re-submit Execution Plan to execute the workflow.
    EXCEPTION CLASS::: com.siebel.analytics.etl.etltask.IrrecoverableException
    com.siebel.analytics.etl.etltask.InformaticaTask.doExecute(InformaticaTask.java:179)
    com.siebel.analytics.etl.etltask.GenericTaskImpl.doExecuteWithRetries(GenericTaskImpl.java:410)
    com.siebel.analytics.etl.etltask.GenericTaskImpl.execute(GenericTaskImpl.java:306)
    com.siebel.analytics.etl.etltask.GenericTaskImpl.execute(GenericTaskImpl.java:213)
    com.siebel.analytics.etl.etltask.GenericTaskImpl.run(GenericTaskImpl.java:585)
    com.siebel.analytics.etl.taskmanager.XCallable.call(XCallable.java:63)
    java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    java.util.concurrent.FutureTask.run(FutureTask.java:138)
    java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:441)
    java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
    java.util.concurrent.FutureTask.run(FutureTask.java:138)
    java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:885)
    java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:907)
    java.lang.Thread.run(Thread.java:619)
    41 SEVERE Thu Sep 24 18:20:01 PDT 2009
    ANOMALY INFO::: Error while executing : INFORMATICA TASK:SILOS:SIL_GlobalCurrencyGeneral_Update:(Source : FULL Target : FULL)
    MESSAGE:::
    Irrecoverable Error
    Error while contacting Informatica server for getting workflow status for SIL_GlobalCurrencyGeneral_Update
    Error Code = 36331:Unknown reason for error code 36331
    Pmcmd output :

    Shyam,
    I have attached the info you asked for.
    When I reran, this time only SDE_ORA_ExchangeRateGeneral failed. Below is the output of the 2 files:
    1. SDE_ORA_ExchangeRateGeneral_Full
    =====================================
    STD OUTPUT
    =====================================
    Informatica(r) PMCMD, version [8.6.0 HotFix4], build [272.1017], LINUX 32-bit
    Copyright (c) Informatica Corporation 1994 - 2008
    All Rights Reserved.
    Invoked at Fri Sep 25 10:19:38 2009
    Connected to Integration Service: [Integ_r1211].
    Integration Service status: [Running]
    Integration Service startup time: [Thu Sep 24 11:50:39 2009]
    Integration Service current time: [Fri Sep 25 10:19:43 2009]
    Folder: [SDE_ORAR12_Adaptor]
    Workflow: [SDE_ORA_ExchangeRateGeneral_Full] version [1].
    Workflow run status: [Failed]
    Workflow run error code: [36331]
    Workflow run error message: [WARNING: Session task instance [SDE_ORA_ExchangeRateGeneral_Compress_Full] failed and its "fail parent if this task fails" setting is turned on. So, Workflow [SDE_ORA_ExchangeRateGeneral_Full] will be failed.]
    Workflow run id [1806].
    Start time: [Fri Sep 25 10:18:00 2009]
    End time: [Fri Sep 25 10:19:18 2009]
    Workflow log file: [oracle/Informatica/PowerCenter8.6.0/server/infa_shared/WorkflowLogs/SDE_ORA_ExchangeRateGeneral_Full.log]
    Workflow run type: [User request]
    Run workflow as user: [Administrator]
    Run workflow with Impersonated OSProfile in domain: []
    Integration Service: [Integ_r1211]
    Disconnecting from Integration Service
    Completed at Fri Sep 25 10:19:43 2009
    =====================================
    ERROR OUTPUT
    =====================================
    2. SDE_ORA_ExchangeRateGeneral_Full_SESSIONS
    =====================================
    STD OUTPUT
    =====================================
    Informatica(r) PMREP, version [8.6.0 HotFix4], build [272.1017], LINUX 32-bit
    Copyright (c) Informatica Corporation 1994 - 2008
    All Rights Reserved.
    This Software may be protected by U.S. Patent Numbers 6,208,990; 6,044,374; 6,014,670; 6,032,158; 5,794,246; 6,339,775; 6,850,947; 6,895,471; 7,254,590 and other U.S. Patents Pending.
    Invoked at Fri Sep 25 10:08:55 2009
    [[REP_57066] Request timed out.]
    [09/25/2009 10:12:05-[REP_55112] Unable to connect to the Repository Service [Rep_r1211] since the resilience time is up.]
    [Failed to connect to repository service [Rep_r1211].]
    An error occurred while accessing the repository[Failed to connect to repository service [Rep_r1211].]
    [09/25/2009 10:12:05-[REP_55102] Failed to connect to repository service [Rep_r1211].]
    Repository connection failed.
    Failed to execute listobjectdependencies.
    Completed at Fri Sep 25 10:12:05 2009
    =====================================
    ERROR OUTPUT
    =====================================
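    The 36331 code in the pmcmd output is just the workflow-level failure propagated from the failed session ("fail parent if this task fails"), so the real cause sits in the session log, while the REP_57066/REP_55112 messages in the second file show that the pmrep call could not reach the repository service at all. As a sketch (connection values taken from this log, password left masked), the workflow status can be re-queried by hand with:

        pmcmd getworkflowdetails -sv Integ_r1211 -d Domain_r1211 -u Administrator -p **** -f SDE_ORAR12_Adaptor SDE_ORA_ExchangeRateGeneral_Full

    If the repository errors persist, check the health and resilience timeout of the Repository Service [Rep_r1211] in the Informatica Administration Console before re-running the execution plan.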
