ETL in Oracle 9i

I would need to load data from a database into a DW, all in Oracle 9i, without installing any other tool. I have read that it is possible from 9iR2, but I am not able to find out how.
I have installed OWB, but it didn't work...
thanks
Marketecka

Oracle Warehouse Builder
Home-made PL/SQL procedures (a small sketch follows below)
You can also have a look at the Transportable Tablespace feature (I do not know your requirements)
You can transport data between databases using database links
You can also use SQL*Loader for loading data into Oracle
External tables can also be helpful for loading data from flat files
But for building ETL logic like DTS packages allow you to do, I think the best choice would be Oracle Warehouse Builder.
Best Regards
Krystian Zieja / mob
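
If you go the home-made PL/SQL route on 9iR2, a minimal sketch looks like the following (the directory, file, table, and column names are made up for illustration): an external table stages the flat file, and a MERGE loads it into the warehouse table.

CREATE DIRECTORY etl_dir AS '/u01/etl/in';

-- Stage the flat file as an external table (available from 9i).
CREATE TABLE stg_sales_ext (
  sale_id    NUMBER,
  product_id NUMBER,
  amount     NUMBER
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY etl_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('sales.csv')
);

-- MERGE (also new in 9i) does the insert-or-update step of the load.
MERGE INTO dw_sales d
USING stg_sales_ext s
ON (d.sale_id = s.sale_id)
WHEN MATCHED THEN
  UPDATE SET d.amount = s.amount
WHEN NOT MATCHED THEN
  INSERT (sale_id, product_id, amount)
  VALUES (s.sale_id, s.product_id, s.amount);

-- Pulling from another database instead of a file is just a database link:
-- INSERT INTO dw_sales SELECT sale_id, product_id, amount FROM sales@source_db;

Scheduled with DBMS_JOB, statements like these give you a basic ETL pipeline with nothing installed beyond the database itself.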

Similar Messages

  • Description of the Oracle ETL solution

    Hi,
    I would like to have answers to the questions below:
    *2. Description of the Oracle ETL solution*
    => Software name: (I think it's Oracle Data Integrator)
    => Version:
    => Platforms:
    => Products to be installed before the solution:
    => Known conflicts with other applications:
    *3. ETL Solution Architecture*
    *3.1. Architecture Description*
    => Hub & spoke architecture:
    => Peer-to-peer architecture:
    => Bus architecture:
    => Code-generator architecture:
    => Does the solution support parallel processing?
    => Does it allow the installation of remote engines?
    => Does it support load balancing?
    *3.3. Nature of the Repository*
    => Nature of the repository:
    *3.4. Repository Capabilities*
    => Does the solution support metadata standards?
    => Does it allow data exchange with:
    => Does it allow the exchange of metadata?
    => If yes, with which metadata dictionaries?
    *3.5. Repository Management*
    => Are there specific backup procedures?
    *3.6. Access to the Repository*
    => Is the repository accessible to external tools?
    *4. Data Access*
    *4.1. Access to relational data*
    *4.1.1. Access modes*
    => ODBC (for each source / destination):
    => JDBC (ditto)?
    => Native (ditto)?
    => Asynchronous (loaders, etc.) (ditto)?
    => Other means? Other middleware?
    => Agreements with middleware vendors?
    *4.1.2. Data read / written*
    => Can the product read a full table?
    => Can it read a complete view?
    => Can it read from a stored procedure?
    => Is it possible to add a WHERE clause / ORDER BY to these different elements?
    => Can it use loaders / unloaders?
    => Can it read a query?
    => If yes, what tool is used to create the query?
    => Does it use a single SQL grammar for all databases?
    => Does it use an extended grammar to take advantage of database-specific functions?
    => If yes, for which databases?
    => Does it check the validity of flat files?
    => Does it handle database timeouts? (What happens if the database does not respond?)
    => Does it surface the database's error messages?
    *4.2. Non-relational data access*
    => How does the product connect to non-relational sources / destinations?
    => Can it read / write all data types? On which systems?
    => Does it check the validity of XML files?
    *4.3. Access to standard application / ERP data*
    => Access to ERP and packaged-application data:
    *5. Triggering processes*
    *5.1. Message-based triggers*
    => CORBA:
    => JMS:
    => MOMs:
    *5.2. Polling-based triggers*
    => Directory polling:
    => POP / MAPI polling:
    => Database polling:
    *5.3. Database-trigger-based triggering:*
    => Database triggers:
    => If yes, which databases?
    *5.4. Does the product include a scheduler?*
    *5.5. Other trigger modes:*
    *6. Data processing*
    *6.1. Data Transfer*
    => Does the product allow set-based processing?
    => Does it manage transactions?
    => Does it support multiple simultaneous updates?
    => Does it contain standard tools for synchronizing tables?
    *6.2. Data transformation / aggregation / calculations*
    => What functions are available?
    => Detail those executed by the database:
    => Detail those executed by the engine:
    => Are there transformation functions for date / numeric formats?
    => Are there statistical data-quality functions?
    => Does the product allow transcoding against a reference table?
    => Does it support heterogeneous joins?
    => Which join modes between tables are supported:
    => Which operators are supported on joins:
    => Nested query management:
    => Does it allow transformations to be written in a mainstream coding language?
    => Does it reuse existing scripts (PL/SQL, DTS, stored procedures, etc.)?
    *7. Development tool for processing chains*
    => What language is used for the development of processing chains?
    => Does the product offer graphical mapping?
    => Does the graphical interface allow drag-and-drop construction of processing chains:
    => Does it provide a graphical representation of flows?
    => Is it possible to query the data from within the product during development?
    => Does the product contain an impact-analysis tool?
    => What debugging aids are available?
    => Does it allow the generation of technical documentation?
    => Does it allow the generation of functional documentation?
    => Does it allow the documentation to be consulted over the Web?
    => What are the main error-handling functions available?
    *8. Advanced Development*
    => Does the product's engine expose APIs for integrating processing chains into external developments?
    => Does the tool allow the integration of external functions developed in the following languages:
    => Does it provide mechanisms for disaster recovery?
    => Can it tune buffer / index / data-cache parameters to optimize processing?
    => Can it define a process that can be applied to n / m sources / destinations (without duplicating the process)?
    => Does it allow team development?
    => Does it support versioning?
    => Is the versioning compatible with standard tools on the market?
    *9. Deployment / Production rollout*
    => Are processes compiled before being moved to production?
    => Does the product allow the creation of a delivery "package"?
    => How are new processes put into production?
    *10. Administration*
    => Does the product contain a management console?
    => If yes, is it accessible from a Web browser?
    => Possibility of remote control of the administration console:
    => If yes, which interface is offered:
    => Does the administration console require a dedicated server?
    => Does this console allow intervening in running processes?
    => Does this console allow following processing in real time?
    => Does it manage logs automatically?
    => If yes, what is their structure?
    => Can it generate specific logs?
    => Interfacing with monitoring tools:
    => If yes, which ones:
    => Does the product provide alerting?
    *11. Scheduling*
    => Does the product have a proprietary scheduling solution?
    => If yes, is it possible to insert external processing into a chain?
    => If yes, is it possible to include human intervention in the chain?
    => Can it be driven by the following schedulers?
    *12. Security*
    => Can the product use the rights defined in a directory?
    => Can it allow / disallow the creation of scenarios?
    => Can it allow / disallow the updating of scenarios?
    => Can it allow / deny access to metadata?
    => Can it allow / disallow the use of the management console?
    => Can it allow / disallow the introduction of manual tasks?
    *13. Miscellaneous*
    => Is the product's user interface available in multiple languages?
    => If yes, which languages are available?
    => Is the product documentation multilingual?
    => Has the product obtained certifications from other vendors?
    => Are there opportunities for OEM integration?
    => Is the vendor involved in a standards body?
    => If yes, which one(s)?
    => Is there a user group?
    => Is there a developer community?
    => What language is the product developed in?
    => What is the training cycle for the product?
    => What are the product support processes?
    Thanks in advance

    Hi Ingo,
    these files come with the BO installation. From my point of view they are responsible for the general behaviour of the specified database - in my case Oracle - behind BO.
    Thanks a lot for the help.
    Sincerely, Sabine

  • Error While ETL

    I got the following error while running the ETL for Oracle Financials.
    Can someone help with this problem?
    I am using OBIA 7.9.6.4 with EBS R12.1.3 Vision.
    Following is the session log.
    2013-02-27 15:28:21 : DEBUG : (4696 | TRANSF_1_1_1) : (IS | BIA_IS) : node01_BIAPPS : DBG_21082 : Upd_Strategy_W_ORA_PRODUCT_DS_TMP_Ins_Upd (Input Group 0): : Current Input data:
    2013-02-27 15:28:21 : INFO : (4696 | TRANSF_1_1_1) : (IS | BIA_IS) : node01_BIAPPS : CMN_1053 : Input row from EXPTRANS: Rowdata: ( RowType=0(insert but dropped due to abort) Src Rowid=2 Targ Rowid=2
    DATASOURCE_NUM_ID (DATASOURCE_NUM_ID:Double:): "1000.000000000000"
    INVENTORY_ITEM_ID (INVENTORY_ITEM_ID:Double:): "936.0000000000000"
    ORGANIZATION_ID (ORGANIZATION_ID:Double:): "204.0000000000000"
    PROD_NAME (PROD_NAME:Char.100:): "Formed Sheet Metal Shell Bottom"
    PRODUCT_TYPE_CODE (PRODUCT_TYPE_CODE:Char.50:): "P"
    PRODUCT_TYPE_DESC (PRODUCT_TYPE_DESC:Char.255:): "Purchased item"
    PR_PROD_LN (PR_PROD_LN:Char.100:): "Unspecified"
    PR_PROD_LN_DESC (PR_PROD_LN_DESC:Char.255:): "(NULL)"
    CONFIG_CAT_CODE (CONFIG_CAT_CODE:Char.50:): "(NULL)"
    CONFIG_CAT_DESC (CONFIG_CAT_DESC:Char.80:): "(NULL)"
    PRICE_TYPE_CD (PRICE_TYPE_CD:Char.30:): "(NULL)"
    BASIC_PRODUCT (BASIC_PRODUCT:Char.30:): "(NULL)"
    PR_EQUIV_PROD_NAME (PR_EQUIV_PROD_NAME:Char.100:): "(NULL)"
    CONFIG_PROD_IND (CONFIG_PROD_IND:Char.1:): "(NULL)"
    CONTAINER_CODE (CONTAINER_CODE:Char.50:): "(NULL)"
    CONTAINER_DESC (CONTAINER_DESC:Char.80:): "(NULL)"
    INDUSTRY_CODE (INDUSTRY_CODE:Char.50:): "(NULL)"
    INDUSTRY_NAME (INDUSTRY_NAME:Char.80:): "(NULL)"
    INDUSTRY_STD_DESC (INDUSTRY_STD_DESC:Char.255:): "(NULL)"
    INTRODUCTION_DT (INTRODUCTION_DT:Date:): "(NULL)"
    LOW_LEVEL_CODE (LOW_LEVEL_CODE:Char.50:): "(NULL)"
    MAKE_BUY_IND (MAKE_BUY_IND:Char.1:): "B"
    PROD_LIFE_CYCL_CD (PROD_LIFE_CYCL_CD:Char.30:): "(NULL)"
    PROD_REPRCH_PERIOD (PROD_REPRCH_PERIOD:Double:): "(NULL)"
    PROD_STRUCTURE_TYPE (PROD_STRUCTURE_TYPE:Char.30:): "(NULL)"
    PROD_GRP_CODE (PROD_GRP_CODE:Char.50:): "(NULL)"
    PROD_GRP_DESC (PROD_GRP_DESC:Char.80:): "(NULL)"
    STORAGE_TYPE_CODE (STORAGE_TYPE_CODE:Char.50:): "(NULL)"
    STORAGE_TYPE_DESC (STORAGE_TYPE_DESC:Char.80:): "(NULL)"
    BATCH_IND (BATCH_IND:Char.1:): "(NULL)"
    BRAND (BRAND:Char.30:): "(NULL)"
    COLOR (COLOR:Char.30:): "(NULL)"
    CUSTOM_PROD_FLG (CUSTOM_PROD_FLG:Char.1:): "(NULL)"
    PACKAGED_PROD_FLG (PACKAGED_PROD_FLG:Char.1:): "(NULL)"
    RTRN_DEFECTIVE_FLG (RTRN_DEFECTIVE_FLG:Char.1:): "Y"
    SALES_PROD_FLG (SALES_PROD_FLG:Char.1:): "(NULL)"
    SALES_SRVC_FLG (SALES_SRVC_FLG:Char.1:): "N"
    SERIALIZED_FLG (SERIALIZED_FLG:Char.1:): "N"
    NRC_FLG (NRC_FLG:Char.1:): "Y"
    FRU_FLG (FRU_FLG:Char.1:): "(NULL)"
    ITEM_SIZE (ITEM_SIZE:Double:): "(NULL)"
    LEAD_TIME (LEAD_TIME:Char.30:): "(NULL)"
    MTBF (MTBF:Double:): "(NULL)"
    MTTR (MTTR:Double:): "(NULL)"
    PART_NUM (PART_NUM:Char.50:): "CM23597"
    VENDOR_LOC (VENDOR_LOC:Char.50:): "(NULL)"
    VENDOR_NAME (VENDOR_NAME:Char.100:): "Unspecified"
    VENDR_PART_NUM (VENDR_PART_NUM:Char.50:): "(NULL)"
    DISCONTINUATION_DT (DISCONTINUATION_DT:Date:): "(NULL)"
    HAZARD_MTL_DESC (HAZARD_MTL_DESC:Char.80:): "(NULL)"
    HAZARD_MTL_CODE (HAZARD_MTL_CODE:Char.50:): "(NULL)"
    SALES_UOM_CODE (SALES_UOM_CODE:Char.50:): "Ea"
    SALES_UOM_DESC (SALES_UOM_DESC:Char.80:): "Each"
    SERIALIZED_COUNT (SERIALIZED_COUNT:Double:): "(NULL)"
    SHELF_LIFE (SHELF_LIFE:Double:): "1.000000000000000"
    SHIP_MTHD_GRP_CODE (SHIP_MTHD_GRP_CODE:Char.50:): "(NULL)"
    SHIP_MTHD_GRP_DESC (SHIP_MTHD_GRP_DESC:Char.80:): "(NULL)"
    SHIP_MTL_GRP_CODE (SHIP_MTL_GRP_CODE:Char.50:): "(NULL)"
    SHIP_MTL_GRP_DESC (SHIP_MTL_GRP_DESC:Char.80:): "(NULL)"
    SHIP_TYPE_CODE (SHIP_TYPE_CODE:Char.50:): "(NULL)"
    SHIP_TYPE_DESC (SHIP_TYPE_DESC:Char.80:): "(NULL)"
    SOURCE_OF_SUPPLY (SOURCE_OF_SUPPLY:Char.30:): "(NULL)"
    SPRT_WITHDRAWL_DT (SPRT_WITHDRAWL_DT:Date:): "(NULL)"
    UOM (UOM:Char.30:): "(NULL)"
    UOM_I (UOM_I:Char.50:): "(NULL)"
    BASE_UOM_CODE (BASE_UOM_CODE:Char.50:): "Ea"
    BASE_UOM_DESC (BASE_UOM_DESC:Char.80:): "Each"
    UNIT_GROSS_WEIGHT (UNIT_GROSS_WEIGHT:Double:): "0.1000000000000000"
    UNIT_NET_WEIGHT (UNIT_NET_WEIGHT:Double:): "0.1000000000000000"
    UNIT_VOLUME (UNIT_VOLUME:Double:): "0.1000000000000000"
    UNIV_PROD_CODE (UNIV_PROD_CODE:Char.50:): "(NULL)"
    UNIV_PROD_DESC (UNIV_PROD_DESC:Char.80:): "(NULL)"
    UOV_CODE (UOV_CODE:Char.50:): "FT3"
    UOV_DESC (UOV_DESC:Char.80:): "Cubic foot"
    UOW_CODE (UOW_CODE:Char.50:): "Lbs"
    UOW_DESC (UOW_DESC:Char.80:): "Pounds"
    APPLICATION_FLG (APPLICATION_FLG:Char.1:): "(NULL)"
    BODY_STYLE_CD (BODY_STYLE_CD:Char.30:): "(NULL)"
    BODY_STYLE_CD_I (BODY_STYLE_CD_I:Char.50:): "(NULL)"
    CASE_PACK (CASE_PACK:Double:): "(NULL)"
    CTLG_CAT_ID (CTLG_CAT_ID:Char.30:): "(NULL)"
    DEALER_INV_PRICE (DEALER_INV_PRICE:Double:): "(NULL)"
    DETAIL_TYPE (DETAIL_TYPE:Char.30:): "(NULL)"
    DOORS_TYPE_CD (DOORS_TYPE_CD:Char.30:): "(NULL)"
    DRIVE_TRAIN_CD (DRIVE_TRAIN_CD:Char.30:): "(NULL)"
    ENGINE_TYPE_CD (ENGINE_TYPE_CD:Char.30:): "(NULL)"
    FUEL_TYPE_CD (FUEL_TYPE_CD:Char.30:): "(NULL)"
    GROSS_MRGN (GROSS_MRGN:Double:): "(NULL)"
    INVENTORY_FLG (INVENTORY_FLG:Char.1:): "Y"
    MAKE_CD (MAKE_CD:Char.30:): "(NULL)"
    MODEL_CD (MODEL_CD:Char.30:): "(NULL)"
    MODEL_YR (MODEL_YR:Double:): "(NULL)"
    MSRP (MSRP:Double:): "(NULL)"
    ORDERABLE_FLG (ORDERABLE_FLG:Char.1:): "(NULL)"
    PROD_NDC_ID (PROD_NDC_ID:Char.30:): "(NULL)"
    PROD_TYPE (PROD_TYPE:Char.30:): "(NULL)"
    PROFIT_RANK (PROFIT_RANK:Char.30:): "(NULL)"
    REFERRAL_FLG (REFERRAL_FLG:Char.1:): "(NULL)"
    RX_AVG_PRICE (RX_AVG_PRICE:Double:): "(NULL)"
    SERVICE_TYPE (SERVICE_TYPE:Char.30:): "(NULL)"
    STATUS (STATUS:Char.30:): "(NULL)"
    R_TYPE (R_TYPE:Char.30:): "(NULL)"
    SUB_TYPE (SUB_TYPE:Char.30:): "(NULL)"
    SUB_TYPE_CD (SUB_TYPE_CD:Char.30:): "(NULL)"
    TGT_CUST_TYPE (TGT_CUST_TYPE:Char.30:): "(NULL)"
    TRANSMISSION_CD (TRANSMISSION_CD:Char.30:): "(NULL)"
    TRIM_CD (TRIM_CD:Char.30:): "(NULL)"
    UNIT_CONV_FACTOR (UNIT_CONV_FACTOR:Double:): "(NULL)"
    VER_DT (VER_DT:Date:): "(NULL)"
    PAR_INTEGRATION_ID (PAR_INTEGRATION_ID:Char.30:): "(NULL)"
    CREATED_ON_DT (CREATED_ON_DT:Date:): "12/09/1997 15:59:47.000000000"
    CHANGED_ON_DT_OUT (CHANGED_ON_DT:Date:): "02/27/2013 15:28:19.000000000"
    AUX1_CHANGED_ON_DT (AUX1_CHANGED_ON_DT:Date:): "(NULL)"
    AUX2_CHANGED_ON_DT (AUX2_CHANGED_ON_DT:Date:): "(NULL)"
    AUX3_CHANGED_ON_DT (AUX3_CHANGED_ON_DT:Date:): "(NULL)"
    AUX4_CHANGED_ON_DT (AUX4_CHANGED_ON_DT:Date:): "(NULL)"
    SRC_EFF_FROM_DT (SRC_EFF_FROM_DT:Date:): "01/01/1899 00:00:00.000000000"
    SRC_EFF_TO_DT (SRC_EFF_TO_DT:Date:): "(NULL)"
    DELETE_FLG (DELETE_FLG:Char.1:): "N"
    INTEGRATION_ID (INTEGRATION_ID:Char.80:): "936"
    TENANT_ID (TENANT_ID:Char.80:): "DEFAULT"
    X_CUSTOM (X_CUSTOM:Char.10:): "(NULL)"
    CREATED_BY_ID (CREATED_BY_ID:Char.80:): "(NULL)"
    CHANGED_BY_ID (CHANGED_BY_ID:Char.80:): "(NULL)"
    UPDATE_FLG (UPDATE_FLG:Char.10:): "U"
    2013-02-27 15:28:21 : ERROR : (4696 | TRANSF_1_1_1) : (IS | BIA_IS) : node01_BIAPPS : CMN_1086 : Upd_Strategy_W_ORA_PRODUCT_DS_TMP_Ins_Upd: Number of errors exceeded threshold [1].
    2013-02-27 15:28:21 : ERROR : (4696 | TRANSF_1_1_1) : (IS | BIA_IS) : node01_BIAPPS : TM_6085 : A fatal error occurred at transformation [EXPTRANS], and the session is terminating.
    2013-02-27 15:28:21 : ERROR : (4696 | TRANSF_1_1_1) : (IS | BIA_IS) : node01_BIAPPS : TM_6085 : A fatal error occurred at transformation [EXPTRANS], and the session is terminating.
    2013-02-27 15:28:21 : ERROR : (4696 | TRANSF_1_1_1) : (IS | BIA_IS) : node01_BIAPPS : TM_6085 : A fatal error occurred at transformation [mplt_BC_ORA_ProductCategories_Resolve.FILTRANS], and the session is terminating.
    2013-02-27 15:28:21 : ERROR : (4696 | TRANSF_1_1_1) : (IS | BIA_IS) : node01_BIAPPS : TM_6085 : A fatal error occurred at transformation [mplt_BC_ORA_ProductCategories_Resolve.FILTRANS], and the session is terminating.
    2013-02-27 15:28:21 : ERROR : (4696 | TRANSF_1_1_1) : (IS | BIA_IS) : node01_BIAPPS : TM_6085 : A fatal error occurred at transformation [mplt_BC_ORA_ProductCategories_Resolve.EXP_W_ORA_PRODCUT_DS_TMP_INS_REJ], and the session is terminating.
    2013-02-27 15:28:21 : ERROR : (4696 | TRANSF_1_1_1) : (IS | BIA_IS) : node01_BIAPPS : TM_6085 : A fatal error occurred at transformation [mplt_BC_ORA_ProductCategories_Resolve.EXP_W_ORA_PRODCUT_DS_TMP_INS_REJ], and the session is terminating.
    2013-02-27 15:28:21 : ERROR : (4696 | TRANSF_1_1_1) : (IS | BIA_IS) : node01_BIAPPS : TM_6085 : A fatal error occurred at transformation [mplt_BC_ORA_ProductCategories_Resolve.Lkp_W_ORA_PRODUCT_DS_TMP], and the session is terminating.
    2013-02-27 15:28:21 : ERROR : (4696 | TRANSF_1_1_1) : (IS | BIA_IS) : node01_BIAPPS : TM_6085 : A fatal error occurred at transformation [mplt_BC_ORA_ProductCategories_Resolve.Lkp_W_ORA_PRODUCT_DS_TMP], and the session is terminating.
    2013-02-27 15:28:21 : ERROR : (4696 | TRANSF_1_1_1) : (IS | BIA_IS) : node01_BIAPPS : TM_6085 : A fatal error occurred at transformation [mplt_BC_ORA_ProductCategories_Resolve.SQ_W_ORA_PRODUCT_CATEGORIES_RESOLVE], and the session is terminating.
    2013-02-27 15:28:21 : ERROR : (4696 | TRANSF_1_1_1) : (IS | BIA_IS) : node01_BIAPPS : TM_6085 : A fatal error occurred at transformation [mplt_BC_ORA_ProductCategories_Resolve.SQ_W_ORA_PRODUCT_CATEGORIES_RESOLVE], and the session is terminating.
    2013-02-27 15:28:21 : ERROR : (4696 | TRANSF_1_1_1) : (IS | BIA_IS) : node01_BIAPPS : TM_6085 : A fatal error occurred at transformation [mplt_BC_ORA_ProductCategories_Resolve.SQ_W_ORA_PRODUCT_CATEGORIES_RESOLVE], and the session is terminating.
    2013-02-27 15:28:21 : DEBUG : (4696 | TRANSF_1_1_1) : (IS | BIA_IS) : node01_BIAPPS : DBG_21511 : TE: Fatal Transformation Error.
    2013-02-27 15:28:21 : INFO : (4696 | WRITER_1_*_1) : (IS | BIA_IS) : node01_BIAPPS : WRT_8167 : Start loading table [W_ORA_PRODUCT_DS_TMP] at: Wed Feb 27 15:28:21 2013
    2013-02-27 15:28:21 : INFO : (4696 | WRITER_1_*_1) : (IS | BIA_IS) : node01_BIAPPS : WRT_8333 : Rolling back all the targets due to fatal session error.
    2013-02-27 15:28:21 : INFO : (4696 | WRITER_1_*_1) : (IS | BIA_IS) : node01_BIAPPS : WRT_8325 : Final rollback executed for the target [W_ORA_PRODUCT_DS_TMP] at end of load
    2013-02-27 15:28:21 : ERROR : (4696 | WRITER_1_*_1) : (IS | BIA_IS) : node01_BIAPPS : WRT_8170 : Writer run terminated: Abort Session request received from the DTM
    2013-02-27 15:28:21 : INFO : (4696 | WRITER_1_*_1) : (IS | BIA_IS) : node01_BIAPPS : WRT_8168 : End loading table [W_ORA_PRODUCT_DS_TMP] at: Wed Feb 27 15:28:21 2013
    2013-02-27 15:28:21 : INFO : (4696 | WRITER_1_*_1) : (IS | BIA_IS) : node01_BIAPPS : WRT_8035 : Load complete time: Wed Feb 27 15:28:21 2013
    LOAD SUMMARY
    ============
    WRT_8036 Target: W_ORA_PRODUCT_DS_TMP (Instance Name: [W_ORA_PRODUCT_DS_TMP])
    WRT_8044 No data loaded for this target
    2013-02-27 15:28:21 : INFO : (4696 | WRITER_1_*_1) : (IS | BIA_IS) : node01_BIAPPS : WRT_8043 : *****END LOAD SESSION*****
    2013-02-27 15:28:21 : INFO : (4696 | MANAGER) : (IS | BIA_IS) : node01_BIAPPS : PETL_24031 :
    ***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
    Thread [READER_1_1_1] created for [the read stage] of partition point [mplt_BC_ORA_ProductCategories_Resolve.SQ_W_ORA_PRODUCT_CATEGORIES_RESOLVE] has completed.
         Total Run Time = [1.109375] secs
         Total Idle Time = [0.500000] secs
         Busy Percentage = [54.929577]
    Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [mplt_BC_ORA_ProductCategories_Resolve.SQ_W_ORA_PRODUCT_CATEGORIES_RESOLVE] has completed.
         Total Run Time = [0.531250] secs
         Total Idle Time = [0.468750] secs
         Busy Percentage = [11.764706]
         Transformation-specific statistics for this thread were not accurate enough to report.
    Thread [WRITER_1_*_1] created for [the write stage] of partition point [W_ORA_PRODUCT_DS_TMP] has completed. The total run time was insufficient for any meaningful statistics.
    2013-02-27 15:28:21 : INFO : (4696 | MAPPING) : (IS | BIA_IS) : node01_BIAPPS : CMN_1793 : The index cache size that would hold [5170] rows in the lookup table for [mplt_BC_ORA_ProductCategories_Resolve.Lkp_W_ORA_PRODUCT_DS_TMP], in memory, is [186000] bytes
    2013-02-27 15:28:21 : INFO : (4696 | MAPPING) : (IS | BIA_IS) : node01_BIAPPS : CMN_1792 : The data cache size that would hold [5170] rows in the lookup table for [mplt_BC_ORA_ProductCategories_Resolve.Lkp_W_ORA_PRODUCT_DS_TMP], in memory, is [180224] bytes
    2013-02-27 15:28:21 : INFO : (4696 | MANAGER) : (IS | BIA_IS) : node01_BIAPPS : PETL_24005 : PETL_24005 Starting post-session tasks. : (Wed Feb 27 15:28:21 2013)
    2013-02-27 15:28:22 : INFO : (4696 | MANAGER) : (IS | BIA_IS) : node01_BIAPPS : PETL_24029 : PETL_24029 Post-session task completed successfully. : (Wed Feb 27 15:28:22 2013)
    2013-02-27 15:28:22 : INFO : (4696 | MAPPING) : (IS | BIA_IS) : node01_BIAPPS : TE_7216 : Deleting cache files [PMLKUP22595_262151_0_588W64] for transformation [mplt_BC_ORA_ProductCategories_Resolve.Lkp_W_ORA_PRODUCT_DS_TMP].
    2013-02-27 15:28:22 : INFO : (4696 | MAPPING) : (IS | BIA_IS) : node01_BIAPPS : TE_7216 : Deleting cache files [PMLKUP22595_3_0_588W64] for transformation [Lkp_W_USER_D_get_id].
    2013-02-27 15:28:22 : INFO : (4696 | MAPPING) : (IS | BIA_IS) : node01_BIAPPS : TM_6018 : The session completed with [2] row transformation errors.
    2013-02-27 15:28:22 : INFO : (4696 | MANAGER) : (IS | BIA_IS) : node01_BIAPPS : PETL_24002 : Parallel Pipeline Engine finished.
    2013-02-27 15:28:22 : INFO : (4696 | DIRECTOR) : (IS | BIA_IS) : node01_BIAPPS : PETL_24013 : Session run completed with failure.
    2013-02-27 15:28:22 : INFO : (4696 | DIRECTOR) : (IS | BIA_IS) : node01_BIAPPS : TM_6022 :
    SESSION LOAD SUMMARY
    ================================================
    2013-02-27 15:28:22 : INFO : (4696 | DIRECTOR) : (IS | BIA_IS) : node01_BIAPPS : TM_6252 : Source Load Summary.
    2013-02-27 15:28:22 : INFO : (4696 | DIRECTOR) : (IS | BIA_IS) : node01_BIAPPS : CMN_1740 : Table: [SQ_W_ORA_PRODUCT_CATEGORIES_RESOLVE] (Instance Name: [mplt_BC_ORA_ProductCategories_Resolve.SQ_W_ORA_PRODUCT_CATEGORIES_RESOLVE])
         Output Rows [252], Affected Rows [252], Applied Rows [252], Rejected Rows [0]
    2013-02-27 15:28:22 : INFO : (4696 | DIRECTOR) : (IS | BIA_IS) : node01_BIAPPS : TM_6253 : Target Load Summary.
    2013-02-27 15:28:22 : INFO : (4696 | DIRECTOR) : (IS | BIA_IS) : node01_BIAPPS : CMN_1740 : Table: [W_ORA_PRODUCT_DS_TMP] (Instance Name: [W_ORA_PRODUCT_DS_TMP])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    2013-02-27 15:28:22 : INFO : (4696 | DIRECTOR) : (IS | BIA_IS) : node01_BIAPPS : TM_6023 :
    ===================================================
    2013-02-27 15:28:22 : INFO : (4696 | DIRECTOR) : (IS | BIA_IS) : node01_BIAPPS : TM_6020 : Session [SDE_ORA_ProductCategories_Resolve] completed at [Wed Feb 27 15:28:22 2013].
    Regards
    Naeem Akhtar

    lbaert/Luc,
    Thanks, this worked with your suggestion.
    Thanks

  • Task fails while running Full load ETL

    Hi All,
    I am running the full load ETL for Oracle R12 (vanilla instance) HR, but 4 tasks are failing: SDE_ORA_JobDimension, SDE_ORA_HRPositionDimension, SDE_ORA_CodeDimension_Pay_level and SDE_ORA_CodeDimensionJob. I changed the parameter for all these tasks as mentioned in the installation guide and rebuilt. Please help me out.
    The log for SDE_ORA_JobDimension looks like this:
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [ORA_R12] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [9] for mapping parameter:[$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[mplt_BC_ORA_JobDimension.$$JOBCODE_FLXFLD_SEGMENT_COL].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[mplt_BC_ORA_JobDimension.$$JOBFAMILYCODE_FLXFLD_SEGMENT_COL].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[mplt_BC_ORA_JobDimension.$$LAST_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [DEFAULT] for mapping parameter:[$$TENANT_ID].
    DIRECTOR> TM_6014 Initializing session [SDE_ORA_JobDimension_Full] at [Fri Sep 26 10:52:05 2008]
    DIRECTOR> TM_6683 Repository Name: [Oracle_BI_DW_Base]
    DIRECTOR> TM_6684 Server Name: [Oracle_BI_DW_Base_Integration_Service]
    DIRECTOR> TM_6686 Folder: [SDE_ORAR12_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_ORA_JobDimension_Full]
    DIRECTOR> TM_6101 Mapping name: SDE_ORA_JobDimension [version 1]
    DIRECTOR> TM_6827 [C:\Informatica\PowerCenter8.1.1\server\infa_shared\Storage] will be used as storage directory for session [SDE_ORA_JobDimension_Full].
    DIRECTOR> CMN_1805 Recovery cache will be deleted when running in normal mode.
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6703 Session [SDE_ORA_JobDimension_Full] is run by 32-bit Integration Service [node01_HSCHBSCGN20031], version [8.1.1], build [0831].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [ASCII]
    MAPPING> CMN_1570 Server Code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> TM_6151 Session Sort Order: [Binary]
    MAPPING> TM_6156 Using LOW precision decimal arithmetic
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6307 DTM Error Log Disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_JobDimension_Full]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Fri Sep 26 10:52:13 2008)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Fri Sep 26 10:52:14 2008)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 32000000 bytes and Block size is 1280000 bytes.
    READER_1_1_1> DBG_21438 Reader: Source is [dev], user [apps]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8146 Writer: Target is database [orcl], user [obia], bulk mode [ON]
    WRITER_1_*_1> WRT_8106 Warning! Bulk Mode session - recovery is not guaranteed.
    WRITER_1_*_1> WRT_8124 Target Table W_JOB_DS :SQL INSERT statement:
    INSERT INTO W_JOB_DS(JOB_CODE,JOB_NAME,JOB_DESC,JOB_FAMILY_CODE,JOB_FAMILY_NAME,JOB_FAMILY_DESC,JOB_LEVEL,W_FLSA_STAT_CODE,W_FLSA_STAT_DESC,W_EEO_JOB_CAT_CODE,W_EEO_JOB_CAT_DESC,AAP_JOB_CAT_CODE,AAP_JOB_CAT_NAME,ACTIVE_FLG,CREATED_BY_ID,CHANGED_BY_ID,CREATED_ON_DT,CHANGED_ON_DT,AUX1_CHANGED_ON_DT,AUX2_CHANGED_ON_DT,AUX3_CHANGED_ON_DT,AUX4_CHANGED_ON_DT,SRC_EFF_FROM_DT,SRC_EFF_TO_DT,DELETE_FLG,DATASOURCE_NUM_ID,INTEGRATION_ID,TENANT_ID,X_CUSTOM) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_JOB_DS]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    WRITER_1_*_1> WRT_8005 Writer run started.
    READER_1_1_1> BLKR_16007 Reader run started.
    READER_1_1_1> RR_4029 SQ Instance [mplt_BC_ORA_JobDimension.Sq_Jobs] User specified SQL Query [SELECT
    PER_JOBS.JOB_ID,
    PER_JOBS.BUSINESS_GROUP_ID,
    PER_JOBS.JOB_DEFINITION_ID,
    PER_JOBS.DATE_FROM,
    PER_JOBS.DATE_TO,
    PER_JOBS.LAST_UPDATE_DATE AS CHANGED_ON_DT,      PER_JOBS.LAST_UPDATED_BY, PER_JOBS.CREATED_BY, PER_JOBS.CREATION_DATE,
    PER_JOB_DEFINITIONS.LAST_UPDATE_DATE AS AUX1_CHANGED_ON_DT,
    PER_JOB_DEFINITIONS.JOB_DEFINITION_ID,
    PER_JOBS.NAME,
    PER_JOBS.JOB_INFORMATION1, PER_JOBS.JOB_INFORMATION3,
    PER_JOBS. AS JOB_FAMILY_CODE, PER_JOB_DEFINITIONS.  AS JOB_CODE,
      '0' AS X_CUSTOM
    FROM
    PER_JOBS, PER_JOB_DEFINITIONS
    WHERE
    PER_JOBS.JOB_DEFINITION_ID = PER_JOB_DEFINITIONS.JOB_DEFINITION_ID]
    WRITER_1_*_1> WRT_8158
    *****START LOAD SESSION*****
    Load Start Time: Fri Sep 26 10:53:05 2008
    Target tables:
    W_JOB_DS
    READER_1_1_1> RR_4049 SQL Query issued to database : (Fri Sep 26 10:53:05 2008)
    READER_1_1_1> CMN_1761 Timestamp Event: [Fri Sep 26 10:53:06 2008]
    READER_1_1_1> RR_4035 SQL Error [
    ORA-01747: invalid user.table.column, table.column, or column specification
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    PER_JOBS.JOB_ID,
    PER_JOBS.BUSINESS_GROUP_ID,
    PER_JOBS.JOB_DEFINITION_ID,
    PER_JOBS.DATE_FROM,
    PER_JOBS.DATE_TO,
    PER_JOBS.LAST_UPDATE_DATE AS CHANGED_ON_DT, PER_JOBS.LAST_UPDATED_BY, PER_JOBS.CREATED_BY, PER_JOBS.CREATION_DATE,
    PER_JOB_DEFINITIONS.LAST_UPDATE_DATE AS AUX1_CHANGED_ON_DT,
    PER_JOB_DEFINITIONS.JOB_DEFINITION_ID,
    PER_JOBS.NAME,
    PER_JOBS.JOB_INFORMATION1, PER_JOBS.JOB_INFORMATION3,
    PER_JOBS. AS JOB_FAMILY_CODE, PER_JOB_DEFINITIONS. AS JOB_CODE,
    '0' AS X_CUSTOM
    FROM
    PER_JOBS, PER_JOB_DEFINITIONS
    WHERE
    PER_JOBS.JOB_DEFINITION_ID = PER_JOB_DEFINITIONS.JOB_DEFINITION_ID
    Oracle Fatal Error
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    PER_JOBS.JOB_ID,
    PER_JOBS.BUSINESS_GROUP_ID,
    PER_JOBS.JOB_DEFINITION_ID,
    PER_JOBS.DATE_FROM,
    PER_JOBS.DATE_TO,
    PER_JOBS.LAST_UPDATE_DATE AS CHANGED_ON_DT, PER_JOBS.LAST_UPDATED_BY, PER_JOBS.CREATED_BY, PER_JOBS.CREATION_DATE,
    PER_JOB_DEFINITIONS.LAST_UPDATE_DATE AS AUX1_CHANGED_ON_DT,
    PER_JOB_DEFINITIONS.JOB_DEFINITION_ID,
    PER_JOBS.NAME,
    PER_JOBS.JOB_INFORMATION1, PER_JOBS.JOB_INFORMATION3,
    PER_JOBS. AS JOB_FAMILY_CODE, PER_JOB_DEFINITIONS. AS JOB_CODE,
    '0' AS X_CUSTOM
    FROM
    PER_JOBS, PER_JOB_DEFINITIONS
    WHERE
    PER_JOBS.JOB_DEFINITION_ID = PER_JOB_DEFINITIONS.JOB_DEFINITION_ID
    Oracle Fatal Error].
    READER_1_1_1> CMN_1761 Timestamp Event: [Fri Sep 26 10:53:06 2008]
    READER_1_1_1> BLKR_16004 ERROR: Prepare failed.
    WRITER_1_*_1> WRT_8333 Rolling back all the targets due to fatal session error.
    WRITER_1_*_1> WRT_8325 Final rollback executed for the target [W_JOB_DS] at end of load
    WRITER_1_*_1> WRT_8035 Load complete time: Fri Sep 26 10:53:06 2008
    LOAD SUMMARY
    ============
    WRT_8036 Target: W_JOB_DS (Instance Name: [W_JOB_DS])
    WRT_8044 No data loaded for this target
    WRITER_1_*_1> WRT_8043 *****END LOAD SESSION*****
    MANAGER> PETL_24031
    ***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
    Thread [READER_1_1_1] created for [the read stage] of partition point [mplt_BC_ORA_JobDimension.Sq_Jobs] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [mplt_BC_ORA_JobDimension.Sq_Jobs] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [WRITER_1_*_1] created for [the write stage] of partition point [W_JOB_DS] has completed. The total run time was insufficient for any meaningful statistics.
    MANAGER> PETL_24005 Starting post-session tasks. : (Fri Sep 26 10:53:06 2008)
    MANAGER> PETL_24029 Post-session task completed successfully. : (Fri Sep 26 10:53:06 2008)
    MAPPING> TM_6018 Session [SDE_ORA_JobDimension_Full] run completed with [0] row transformation errors.
    MANAGER> PETL_24002 Parallel Pipeline Engine finished.
    DIRECTOR> PETL_24013 Session run completed with failure.
    DIRECTOR> TM_6022
    SESSION LOAD SUMMARY
    ================================================
    DIRECTOR> TM_6252 Source Load Summary.
    DIRECTOR> CMN_1740 Table: [Sq_Jobs] (Instance Name: [mplt_BC_ORA_JobDimension.Sq_Jobs])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6253 Target Load Summary.
    DIRECTOR> CMN_1740 Table: [W_JOB_DS] (Instance Name: [W_JOB_DS])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6023
    ===================================================
    DIRECTOR> TM_6020 Session [SDE_ORA_JobDimension_Full] completed at [Fri Sep 26 10:53:07 2008]

    To make use of the warehouse you would probably want to connect to an EBS instance in order to populate it, since the execution plan you intend to run is designed for the EBS data model.
    I guess if you really didn't want to connect to the EBS instance to pull data, you could build an execution plan using the Universal adapter. That allows you to load from flat files if you wish, but I wouldn't recommend making this a habit for an actual implementation, as it creates another potential point of failure (populating the flat files).
    Thanks,
    Austin

  • Oracle EBS GL Journal Line Description not being extracted by ETL

    It seems that the ETL for Oracle EBS (R12) includes the JE Batch Name, the JE Header Name/Description, and the JE Line Number. However, it does not include the JE Line Description. We have been able to meet all of our requirements with the out-of-the-box ETL and are trying to avoid customization. Has anyone run into this issue, and has anyone customized the ETL to accommodate the need?

    Yes, most of the time there are a lot of customizations...
    If you are doing any customization to an out-of-the-box ETL job, you do that in custom folders such as custom_sde or custom_sil. Your modified ETL job then lives in a new Informatica folder; for DAC to run this new ETL job, you need to create a new folder in DAC with the same name as in Informatica and associate it with the corresponding Informatica folder.
    Please refer to the DAC Guide, Section 5, for step-by-step information.
    Hope this helps!

  • Best strategy to ETL BLOB data

    Hi Gurus, I am in the process of doing some ETL to port an Oracle database application to another Oracle database application, and I am a bit confused about how to port BLOB data tables from one system to the other. Can someone tell me, from hands-on experience, which strategy generally suits best for importing/exporting BLOB data? I have the following options in mind (note: a commercial ETL tool is not an option for operational reasons) and would appreciate some insight into them:
    1. Do the ETL from the Oracle source table to the destination table via some custom JDBC code.
    2. Export the BLOB data tables as a CSV file, then run the custom JDBC code on the exported CSV file.
    3. Write a custom PL/SQL stored procedure that uses the Oracle DBMS packages wrapping the impdp/expdp utilities to do the ETL job.
    Thanks in advance.

    Both applications sit on top of Oracle 10g Release 2. The size of the data is easily in the GBs (so far we don't know the exact volume). As I mentioned in my post, I need an expert opinion about how to proceed with BLOB data migration in general. I hope this information is enough to give an opinion.
    Thanks,
    RJ.
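
    For option 3, the Oracle package that wraps expdp/impdp is DBMS_DATAPUMP, available in 10g Release 2, and it handles BLOB columns natively (avoiding the CSV detour of option 2). A minimal export sketch, with hypothetical table names and the standard DATA_PUMP_DIR directory object, could look like this:

    DECLARE
      h NUMBER;
    BEGIN
      h := DBMS_DATAPUMP.open(operation => 'EXPORT',
                              job_mode  => 'TABLE',
                              job_name  => 'BLOB_MIGRATION_EXP');
      DBMS_DATAPUMP.add_file(h, 'blob_tables.dmp', 'DATA_PUMP_DIR');
      DBMS_DATAPUMP.add_file(h, 'blob_tables.log', 'DATA_PUMP_DIR',
                             filetype => DBMS_DATAPUMP.ku$_file_type_log_file);
      -- Restrict the job to the (hypothetical) tables holding the BLOBs.
      DBMS_DATAPUMP.metadata_filter(h, 'NAME_EXPR', 'IN (''DOC_STORE'',''IMAGE_STORE'')');
      DBMS_DATAPUMP.start_job(h);
      DBMS_DATAPUMP.detach(h);
    END;
    /

    The import side is the same call pattern with operation => 'IMPORT'; with a database link passed in DBMS_DATAPUMP.open's remote_link parameter, the dump file can even be skipped entirely.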

  • How to create execution plan in DAC and how to start ETL steps

    Hi,
    For the ETL configuration I have installed and configured:
    1. Oracle 10g DB
    2. OBIEE and OBIA 7.9.6
    3. Informatica server (here I created the repository and integration services)
    4. DAC server 10g (the DAC setup is also done, e.g. the warehouse tables are created)
    The same installation was done on Windows for the client.
    Now I'm stuck at execution plan creation, and then:
    how to start the ETL, and where to start it.
    Source: Oracle EBIZ R12.1.1
    Target: Data warehouse (10g DB)
    Please help me with the steps up to the ETL start.
    Thanks,
    Srik

    Hi Srik,
    I am assuming you have followed the steps required before a load in the library, for DAC config, CSV lookups, etc.
    Did you check out the example in the documentation?
    Here is the link:
    http://download.oracle.com/docs/cd/E14223_01/bia.796/e14217/windows_ic.htm#BABFGIHE
    If you follow those steps and run the execution plan, you can monitor the progress under the Current Run tab in the DAC and in the Informatica Workflow Monitor.
    Regards
    Nick
    Edited by: Nickho on 2010/06/21 9:24 AM

  • How to start ETL ?

    Hi,
    For the ETL configuration I have installed and configured:
    1. Oracle 10g DB
    2. OBIEE and OBIA 7.9.6
    3. Informatica server (here I created the repository and integration services)
    4. DAC server 10g (the DAC setup is also done, e.g. the warehouse tables are created)
    The same installation was done on Windows for the client.
    Now I'm stuck at the ETL run:
    how to start the ETL, and where to start it.
    Source: Oracle EBIZ R12.1.1
    Target: Data warehouse (10g DB)
    Please help me with the steps up to the ETL start, or provide a doc so that I can try to move forward.
    Thanks,
    Srik

    http://www.rittmanmead.com/2008/07/06/performing-initial-data-loads-into-the-oracle-bi-apps-795-data-warehouse/
    Wrong forum btw, for BI Apps go to:
    Business Intelligence Applications

  • URGENT: Help needed for OLAP tool selection

    Hi Gurus,
    Environment:
    Oracle 10g
    .NET web application, with the backend an Oracle 10g server embedded with an OLAP server.
    Application: CRM
    Features required: OLAP, OLAP reporting, drill-down graphical analysis, US map view for location-specific queries, ad-hoc analysis.
    I have to make a decision on the OLAP tool selection against the above client requirements.
    * Please let me know if anybody has worked on the above combination for the required features and has run into any issues.
    * Please go through the Oracle options below, per feature, and let me know whether there is any issue with .NET / IIS server and Oracle 10g OLAP compatibility, especially for points 4, 6, 8 and 9 below.
    1. Extraction, Transformation and Loading (ETL) tool: Oracle Business Intelligence Warehouse Builder, tightly integrated with the Database Server.
    2. Analysis tool (multidimensional data analysis using cubes): Oracle Business Intelligence Discoverer.
    3. Ad-hoc query and analysis by end users: Oracle Business Intelligence Spreadsheet Add-in; OLAP DML; regular SQL with the OLAP_TABLE function.
    4. OLAP reporting tool: Oracle Reports 10g.
    5. Multidimensional data (cube) query language: OLAP DML; regular SQL with the OLAP_TABLE function.
    6. Web deployment of cubes: available.
    7. Cube data export to XLS and HTML formats: available.
    8. Graphing: Oracle Reports 10g Graph Wizard for graphical analysis.
    9. Geographical map-view-based query: Oracle Locator (location-enabling every Oracle Database).
    Please send me the relevant URLs for support of / issues with the above features.
    TIA,
    Sheilesh


  • New training materials available for OWB 11.2

    Oracle University now offers an instructor-led course on OWB 11.2 entitled Data Integration and ETL with Oracle Warehouse Builder.
    The entire course is 5 days long and is divided into two parts:
    * Part I is 3 days and covers designing and debugging ETL mappings, performing data cleansing, integrating with OBIEE, and other basic ETL functionality included in the Oracle Database license.
    * Part II is 2 days and covers metadata management, accessing non-Oracle sources (code templates), right-time data warehousing, and the other features in the Enterprise ETL / ODI EE license.
    For links to this and other training resources, including the free OBEs, see http://www.oracle.com/technology/products/warehouse/htdocs/OTN_Training.html

    The Linux version of OWB 11gR2 was released with DB 11gR2 for Linux on Sept 1, 2009.
    Typically, OWB conforms to the DB schedule for releasing other platforms. However, upon receiving numerous requests for OWB on Windows, we're evaluating the possibility of releasing a Windows stand-alone client sooner rather than later. At this time, though, there are no official announcements on when to expect that.

  • OBIA -'Financials_Oracle R12'

    After installing OBIA, we are trying to run the out-of-the-box 'Financials_Oracle R12' execution plan, and it failed right at the start with the following error:
    Error while loading nodes.
    ERROR: Circular Dependency detected. Check server log for details.
    'TASK_GROUP_Load_PartyDimension' Has circular dependency with 'SIL_PositionDimensionHierarchy_Softdelete'
    'TASK_GROUP_Load_PartyDimension' Has circular dependency with 'SIL_PositionDimensionHierarchy_Type1Update'
    ABORTING BECAUSE OF CIRCULAR DEPENDENCIES.
    Does anyone know how to resolve this?
    Please note we were able to execute "Procurement and Spend: Oracle R12".
    Thanks,
    Avdhut

    Hi Naeem,
    For me, the only way to size OBIA for your requirements is to do a sizing review with Oracle. They analyze your requirements, recommend an architecture, and estimate the size of your database.
    Oracle provides minimum hardware requirements with the installation of OBIA, but these are only prerequisites (http://docs.oracle.com/cd/E35287_01/bia.7964/e22970.pdf).
    Regarding your suggestions, I have some questions:
    - Why are there 2 Informatica / DAC servers?
    - In my view, your database configuration is under-sized compared to your OBIEE server.
    The minimum prerequisites are:
    - Oracle Business Analytics Warehouse:
    CPU: 2 GHz or better, 8 CPU cores minimum - RAM: 8 GB
    - ETL Server (Oracle Business Intelligence Data Warehouse Administration Console Server and Informatica PowerCenter Services):
    CPU: 2 GHz or better, 4 CPU cores minimum - RAM: 8 GB - Storage: 100 GB free space
    Regards,
    Benoit

  • How to configure OBIA 11.1.1.8.1 with ODI

    Hi forum,
    I want to configure Oracle Business Intelligence Apps 11.1.1.8.1, installed on Linux, with ODI version 11.1.1.7.
    Can anybody please provide:
    1) A tutorial on how to set up ODI for OBIA 11.1.1.8.1?
    2) Then, how to perform functional configuration using Oracle Data Integrator, Oracle BI Applications Configuration Manager, and Functional Setup Manager?
    3) How to create, run, and customize ETL in Oracle Data Integrator?
    Thanks

    a) Do what most people do and read the documentation (Oracle Business Intelligence Applications 11g Release 1 (11.1.1.8.1) Documentation Library) that comes with the product; the installation details are documented there. No one is going to walk through all the installation steps for you in these forums. The links I provided before are good starting points. If you are still struggling, hire a consultant to help you out.
    b) ODI Studio is just a client, so you can install it on whatever OS you like. The repositories are simply databases and can be installed on a database hosted on Windows, Linux, etc., and the agent is Java, so it too is platform independent. You can distribute the setup however you want, as long as the repositories can be reached via JDBC from the Studio client.

  • Implementation strategy for non-SAP sources

    Hi friends,
    Could anyone help with the implementation strategy for non-SAP sources?

    Hi,
    It's the same as with R/3 sources; the only difference is that you'll have different underlying interfaces. Non-SAP systems can be flat files, ETL systems, or legacy systems using an ETL connection, Oracle or Java systems, XML, etc.
    But your strategy would remain the same; only the transactions and the way you configure your DataSources would differ, depending on your non-SAP source system.
    Cheers,
    Kedar

  • Typical Dimension

    Hi
    The tools we are using are:
    OWB 9.0.2.56 for ETL, and
    Oracle Express 6.3.4 server, RAA/RAM and Oracle Express Web Agent.
    The database is Oracle 9i.
    I have to design a fairly typical dimension for the following requirements, which I guess should be common to most data warehousing projects.
    We need a hierarchy for a department dimension in the payroll module with 4 levels, as follows:
    General Department --> Department --> Sub-Department --> Section, with Section at the leaf level.
    The doubts I have are:
    1. I was under the impression all these days that data should be available at the leaf level, but here I see cases where employees do not have a section because they report directly into a sub-department, and similarly some employees have neither a section nor a sub-department because they fall directly under a department. So the data is not always at the leaf level; how is the aggregation going to take place?
    2. Say we have measures like number of employees terminated, recruited, promoted, etc. As discussed in point 1, if Section1 and Section2 fall under Sub-Department1, with 200 employees working in Section1, 250 in Section2, and 150 directly in Sub-Department1, then an aggregation at Sub-Department1 should show the sum of all three figures (200 + 250 + 150), not just the aggregation of Section1 and Section2 (200 + 250).
    3. Another issue is that some cases skip a level, for example a section falling directly under a department. How do I address this in my dimension design?
    Can anyone throw some light on this?
    How can we design and implement this hierarchy in OWB?
    Thanks and Regards
    Nanda Kishore

    There is a way to design a dimension so that all levels of the data can be loaded into it and the aggregation hierarchy is preserved.
    Firstly, the dimension implementation looks like the following:
    ID  Top Level  Middle Level  Detail Level
    1   Total      null          null
    2   Total      East          null
    3   Total      West          null
    4   Total      East          Part1
    5   Total      East          Part2
    6   Total      West          State1
    7   Total      West          State2
    8   Total      West          State3
    The nulls could be replaced with "Not Available" if that is a preferable presentation for an EUL.
    In the ETL for loading the fact table I use a Key Lookup table (index-organized) with the natural key of the dimension as the primary key and the surrogate ID as the output value, giving a table which looks like:
    Natural Key  Dimension ID
    Total        1
    East         2
    West         3
    Part1        4
    Part2        5
    State1       6
    State2       7
    State3       8
    With this structure, data can be loaded into the fact table and rolled up independent of the level of the dimension at which the data comes in (see the sketch below). The roll-up in the EUL also works happily (without any double counting).
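
    As a concrete sketch of that design (table and column names here are illustrative, not from the original post), the two tables and the fact load would look roughly like:

    CREATE TABLE dept_dim (
      dim_id       NUMBER PRIMARY KEY,
      top_level    VARCHAR2(30),
      middle_level VARCHAR2(30),
      detail_level VARCHAR2(30)
    );

    -- Index-organized key lookup: natural key in, surrogate key out.
    CREATE TABLE dept_key_lookup (
      natural_key VARCHAR2(30) PRIMARY KEY,
      dim_id      NUMBER NOT NULL
    ) ORGANIZATION INDEX;

    -- The fact load resolves each incoming natural key at whatever level it arrives.
    INSERT INTO emp_fact (dim_id, emp_count)
    SELECT l.dim_id, s.emp_count
    FROM   staging_counts s, dept_key_lookup l
    WHERE  l.natural_key = s.dept_natural_key;

    Because a member like "West" has its own dimension row, a measure loaded against "West" sits alongside the rows loaded against State1..State3 and rolls up with them, which is exactly the sum-of-all-levels behaviour asked about in point 2.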

  • Unload

    Hi all,
    I am using Oracle 10g. How do I unload (the CREATE command plus the data) from a table to a file, using any command?
    Is it possible to produce a file like this:
    CREATE TABLE TABLE1( COL1 NUMBER, COL2 VARCHAR(10) )
    INSERT INTO TABLE1 VALUES (
    :1,
    :2)
    $DATATYPES NUMBER, CHARACTER
    1,"ASD",
    2,"ASDF"
    Thanks in advance.

    There is no simple way to generate this particular file format from Oracle. You could, of course, write your own script to generate the file, but that is probably not the most efficient use of your time.
    Can SQLBase import CSV files? Various folks have written utilities that dump Oracle data into CSV files. Depending on the Oracle version, you can use the DBMS_METADATA package or your favorite PL/SQL IDE (I would assume that Oracle's SQL Developer would be one of the options here) to generate the DDL. Some IDEs will also generate the CSV file for you.
    If you can make an ODBC connection to SQLBase from the machine where Oracle is running, you could also move the data to SQLBase using Oracle Heterogeneous Services.
    You can also use a variety of ETL tools (Oracle Warehouse Builder, Oracle Data Integrator, etc.) to move the data, but learning a full-blown ETL tool for something this small is probably overkill (though a useful tool to have for later).
    Justin
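
    As a rough sketch of the do-it-yourself route (DBMS_METADATA for the DDL, SQL*Plus spooling for the data; TABLE1/COL1/COL2 are the poster's example names), something like this works on 10g:

    -- Generate the CREATE TABLE statement.
    SET LONG 100000 PAGESIZE 0 HEADING OFF FEEDBACK OFF TRIMSPOOL ON
    SPOOL table1_ddl.sql
    SELECT DBMS_METADATA.GET_DDL('TABLE', 'TABLE1') FROM dual;
    SPOOL OFF

    -- Dump the rows as CSV by concatenating the columns.
    SET LINESIZE 1000
    SPOOL table1.csv
    SELECT col1 || ',"' || col2 || '"' FROM table1;
    SPOOL OFF

    The $DATATYPES header and the bind-style INSERT block of the SQLBase format would still have to be added around this output by hand or by a wrapper script.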
