Error in DAC ETL Full Load (Supply Chain Inventory Transactions)

Hi Gurus,
My Environment is like this,
System: Windows 2003 Server 32-bit
DAC: 10.1.3.4.1 with Patch 10052370
Informatica : Power Center 8.6.1 Hot Fix 11
OBIA: 7.9.6.2
OBIEE: 10.1.3.4.1
I created a plan with one subject area, Supply Chain Inventory Transactions. While running it I am getting an error (out of 241 tasks, 22 failed). Nothing failed in Informatica. TASK_Group_Load_PositionHierarchy failed. When I go through the log, the error is as follows.
Error:
Values :
Null Map
MESSAGE:::ORA-00604: error occurred at recursive SQL level 1
ORA-01654: unable to extend index SYS.I_WRI$_OPTSTAT_H_OBJ#_ICOL#_ST by 128 in tablespace SYSAUX
ORA-06512: at "SYS.DBMS_STATS", line 18408
ORA-06512: at "SYS.DBMS_STATS", line 18429
ORA-06512: at line 1
EXCEPTION CLASS::: java.sql.SQLException
Error:
MESSAGE:::DataWarehouse:DBMS_STATS.GATHER_TABLE_STATS(ownname => 'DWH_REP', tabname => 'W_POSITION_DH', estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE, method_opt => 'FOR ALL INDEXED COLUMNS SIZE AUTO',cascade => false, degree => DBMS_STATS.DEFAULT_DEGREE)
ORA-00604: error occurred at recursive SQL level 1
ORA-01654: unable to extend index SYS.I_WRI$_OPTSTAT_H_OBJ#_ICOL#_ST by 128 in tablespace SYSAUX
ORA-06512: at "SYS.DBMS_STATS", line 18408
ORA-06512: at "SYS.DBMS_STATS", line 18429
ORA-06512: at line 1
Values :
Null Map
Please help me to solve this.

Hi Naveen,
Thanks for your reply,
I added one more temp tablespace (temp3), and that resolved my problem; the load is succeeding now. The temp tablespace is used while executing large queries such as aggregations.
Thanks,
Rajesh Thumuluri
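
For completeness on the original SYSAUX error: the index SYS.I_WRI$_OPTSTAT_H_OBJ#_ICOL#_ST belongs to the optimizer statistics history that DBMS_STATS maintains in SYSAUX. A minimal sketch of how a DBA might reclaim that space, assuming appropriate privileges and that a shorter history retention is acceptable:

-- Check current optimizer stats history retention (default is 31 days)
SELECT DBMS_STATS.GET_STATS_HISTORY_RETENTION FROM DUAL;
-- Shorten the retention and purge history older than 10 days to free SYSAUX space
EXEC DBMS_STATS.ALTER_STATS_HISTORY_RETENTION(10);
EXEC DBMS_STATS.PURGE_STATS(SYSDATE - 10);

Alternatively, adding a datafile to SYSAUX (ALTER TABLESPACE SYSAUX ADD DATAFILE ...) addresses the ORA-01654 directly.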

Similar Messages

  • Reg. Error while Full load - process chain

    Hi,
    I am running a process chain, which is a full load. In the Manage screen it is in RED status, whereas when I checked the Monitor screen it shows "Missing messages". When I checked the error and warning messages, I found that the data packet should be run manually. So I tried the Manual Update option by killing the job manually. After doing this I tried to run the same process with the "Back to Technical Status" option. While performing this I again got an error saying "REQUXXXX terminated, as data packet 0000XX could not be locked".
    How can this error be rectified?
    Please advise.
    Thanks in advance!
    Regards,
    Melissa

    Hi Melissa,
    You said the Monitor screen shows "Missing messages". Where did you get that message? In the data packet, right?
    You found that the data packet should be run manually, so you tried the Manual Update option after killing the job manually. Which job did you kill? The extraction job? If you killed the extraction job, then a manual update is not possible. A manual update is only possible when the extraction job has completed successfully.
    That is why you are getting the error "REQUXXXX terminated, as data packet 0000XX could not be locked". When the request itself doesn't exist, how could you do a manual update for that deleted request?
    Then the only option is: make the QM status red, delete the request from the target, and repeat the load.
    Hope this helps.
    Regards,
    Debjani

  • ETL Full Load, Nothing Shows up in Current Run

    Hi,
    I have followed the:
    Oracle® Business Intelligence Applications
    Installation Guide for Informatica PowerCenter Users
    Release 7.9.6.3
    E19038-01
    Documentation to install OBIA.
    Right now, I have the DAC and Informatica servers installed on a Linux machine. I have installed OBIEE on another Linux machine, EBS on another Linux machine, and Oracle on yet another Linux machine. DAC, Informatica, and OBIEE are all connected to the Oracle machine. EBS has its own Oracle database on its server. I installed OBIA on my local Windows computer with both the DAC and Informatica clients. I have transferred the required files to the OBIEE machine. When I configure DAC, all the connections work and I get no errors.
    I got to the part where I am running "4.19.1 An Example of Running a Full Load ETL".
    I do all the steps it asks, and after I hit "Run Now", a message box appears and says "Start request successfully submitted to server".
    According to the documentation, the run should show up in the "Current Run" tab, but nothing appears.
    I have tried to create 2 more Execution Plans with different subject areas and still nothing.
    I tried running the client on another machine and still nothing.
    Do you guys know what might be the issue?
    Any help would be awesome. Thanks in advance.

    You shouldn't need to customise anything in the Informatica Workflows to get the ETL working - I believe that all the config for Informatica is done in the DAC.
    For the RPD and DAC variables, anything that has OLAP in its name refers to the BI Apps data warehouse database. I'm guessing that your DSNs are correct, but I think that OLAP_USER and OLAPTBO would need to be something different. Normally, you have three key schemas in the warehouse: one for the actual warehouse tables, one for the DAC repository, and one for the Informatica repository. Unlike EBS you can name these schemas whatever you like - we call ours BIAPPS, DAC and INFA respectively, and we'd have OLAP_USER and OLAPTBO set to BIAPPS.

  • Project Analytics 7.9.6.1 - Error while running a full load

    Hi All,
    I am performing a full load for Projects Analytics and get the following error,
    =====================================
    ERROR OUTPUT
    =====================================
    1103 SEVERE Wed Nov 18 02:49:36 WST 2009 Could not attach to workflow because of errorCode 36331 For workflow SDE_ORA_CodeDimension_Gl_Account
    1104 SEVERE Wed Nov 18 02:49:36 WST 2009
    ANOMALY INFO::: Error while executing : INFORMATICA TASK:SDE_ORA11510_Adaptor:SDE_ORA_CodeDimension_Gl_Account:(Source : FULL Target : FULL)
    MESSAGE:::
    Irrecoverable Error
    Error while contacting Informatica server for getting workflow status for SDE_ORA_CodeDimension_Gl_Account
    Error Code = 36331:Unknown reason for error code 36331
    Pmcmd output :
    The session log initializes a NULL value for mapping parameter MPLT_ADI_CODES.$$CATEGORY. This is then used in subsequent SQL and results in an ORA-00936: missing expression error. Below are the initialization section and the load section from the log that contain the error.
    Initialisation
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [ORA_11_5_10] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [ORA_11_5_10.DATAWAREHOUSE.SDE_ORA11510_Adaptor.SDE_ORA_CodeDimension_Gl_Account_Segments.log] for session parameter:[$PMSessionLogFile].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[MPLT_ADI_CODES.$$CATEGORY].
    DIRECTOR> VAR_27028 Use override value [4] for mapping parameter:[MPLT_SA_ORA_CODES.$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27028 Use override value [DEFAULT] for mapping parameter:[MPLT_SA_ORA_CODES.$$TENANT_ID].
    DIRECTOR> TM_6014 Initializing session [SDE_ORA_CodeDimension_Gl_Account_Segments] at [Wed Nov 18 02:49:11 2009].
    DIRECTOR> TM_6683 Repository Name: [repo_service]
    DIRECTOR> TM_6684 Server Name: [int_service]
    DIRECTOR> TM_6686 Folder: [SDE_ORA11510_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_ORA_CodeDimension_Gl_Account_Segments] Run Instance Name: [] Run Id: [17]
    DIRECTOR> TM_6101 Mapping name: SDE_ORA_CodeDimension_GL_Account_Segments [version 1].
    DIRECTOR> TM_6963 Pre 85 Timestamp Compatibility is Enabled
    DIRECTOR> TM_6964 Date format for the Session is [MM/DD/YYYY HH24:MI:SS]
    DIRECTOR> TM_6827 [C:\Informatica\PowerCenter8.6.1\server\infa_shared\Storage] will be used as storage directory for session [SDE_ORA_CodeDimension_Gl_Account_Segments].
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6703 Session [SDE_ORA_CodeDimension_Gl_Account_Segments] is run by 32-bit Integration Service [node01_ASG596138], version [8.6.1], build [1218].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [UNICODE]
    MAPPING> CMN_1570 Server Code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> TM_6151 The session sort order is [Binary].
    MAPPING> TM_6185 Warning. Code page validation is disabled in this session.
    MAPPING> TM_6156 Using low precision processing.
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6307 DTM error log disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> DBG_21075 Connecting to database [orcl], user [DAC_REP]
    MAPPING> CMN_1716 Lookup [mplt_ADI_Codes.Lkp_Master_Map] uses database connection [Relational:DataWarehouse] in code page [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> CMN_1716 Lookup [mplt_ADI_Codes.Lkp_Master_Code] uses database connection [Relational:DataWarehouse] in code page [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> CMN_1716 Lookup [mplt_ADI_Codes.Lkp_W_CODE_D] uses database connection [Relational:DataWarehouse] in code page [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_CodeDimension_Gl_Account_Segments]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Wed Nov 18 02:49:14 2009)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Wed Nov 18 02:49:14 2009)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 32000000 bytes and Block size is 128000 bytes.
    READER_1_1_1> DBG_21438 Reader: Source is [asgdev], user [APPS]
    READER_1_1_1> BLKR_16051 Source database connection [ORA_11_5_10] code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8147 Writer: Target is database [orcl], user [DAC_REP], bulk mode [OFF]
    WRITER_1_*_1> WRT_8221 Target database connection [DataWarehouse] code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    WRITER_1_*_1> WRT_8124 Target Table W_CODE_D :SQL INSERT statement:
    INSERT INTO W_CODE_D(DATASOURCE_NUM_ID,SOURCE_CODE,SOURCE_CODE_1,SOURCE_CODE_2,SOURCE_CODE_3,SOURCE_NAME_1,SOURCE_NAME_2,CATEGORY,LANGUAGE_CODE,MASTER_DATASOURCE_NUM_ID,MASTER_CODE,MASTER_VALUE,W_INSERT_DT,W_UPDATE_DT,TENANT_ID) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8124 Target Table W_CODE_D :SQL UPDATE statement:
    UPDATE W_CODE_D SET SOURCE_CODE_1 = ?, SOURCE_CODE_2 = ?, SOURCE_CODE_3 = ?, SOURCE_NAME_1 = ?, SOURCE_NAME_2 = ?, MASTER_DATASOURCE_NUM_ID = ?, MASTER_CODE = ?, MASTER_VALUE = ?, W_INSERT_DT = ?, W_UPDATE_DT = ?, TENANT_ID = ? WHERE DATASOURCE_NUM_ID = ? AND SOURCE_CODE = ? AND CATEGORY = ? AND LANGUAGE_CODE = ?
    WRITER_1_*_1> WRT_8124 Target Table W_CODE_D :SQL DELETE statement:
    DELETE FROM W_CODE_D WHERE DATASOURCE_NUM_ID = ? AND SOURCE_CODE = ? AND CATEGORY = ? AND LANGUAGE_CODE = ?
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_CODE_D]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    READER_1_1_1> BLKR_16007 Reader run started.
    WRITER_1_*_1> WRT_8005 Writer run started.
    WRITER_1_*_1> WRT_8158
    Load section
    *****START LOAD SESSION*****
    Load Start Time: Wed Nov 18 02:49:16 2009
    Target tables:
    W_CODE_D
    READER_1_1_1> RR_4029 SQ Instance [mplt_BC_ORA_Codes_GL_Account_Segments.Sq_Fnd_Flex_Values] User specified SQL Query [SELECT
    FND_FLEX_VALUES.FLEX_VALUE_SET_ID,
    FND_FLEX_VALUES.FLEX_VALUE,
    MAX(FND_FLEX_VALUES_TL.DESCRIPTION),
    FND_ID_FLEX_SEGMENTS.ID_FLEX_NUM,
    FND_ID_FLEX_SEGMENTS.APPLICATION_COLUMN_NAME
    FROM
    FND_FLEX_VALUES,
    FND_FLEX_VALUES_TL,
    FND_ID_FLEX_SEGMENTS,
    FND_SEGMENT_ATTRIBUTE_VALUES
    WHERE
    FND_FLEX_VALUES.FLEX_VALUE_ID = FND_FLEX_VALUES_TL.FLEX_VALUE_ID AND FND_FLEX_VALUES_TL.LANGUAGE ='US' AND
    FND_ID_FLEX_SEGMENTS.FLEX_VALUE_SET_ID =FND_FLEX_VALUES.FLEX_VALUE_SET_ID AND
    FND_ID_FLEX_SEGMENTS.APPLICATION_ID = 101 AND
    FND_ID_FLEX_SEGMENTS.ID_FLEX_CODE ='GL#' AND
    FND_ID_FLEX_SEGMENTS.ID_FLEX_NUM =FND_SEGMENT_ATTRIBUTE_VALUES.ID_FLEX_NUM AND
    FND_SEGMENT_ATTRIBUTE_VALUES.APPLICATION_ID =101 AND
    FND_SEGMENT_ATTRIBUTE_VALUES.ID_FLEX_CODE = 'GL#' AND
    FND_ID_FLEX_SEGMENTS.APPLICATION_COLUMN_NAME=FND_SEGMENT_ATTRIBUTE_VALUES.APPLICATION_COLUMN_NAME AND
    FND_SEGMENT_ATTRIBUTE_VALUES.ATTRIBUTE_VALUE ='Y'
    GROUP BY
    FND_FLEX_VALUES.FLEX_VALUE_SET_ID,
    FND_FLEX_VALUES.FLEX_VALUE,
    FND_ID_FLEX_SEGMENTS.ID_FLEX_NUM,
    FND_ID_FLEX_SEGMENTS.APPLICATION_COLUMN_NAME]
    READER_1_1_1> RR_4049 SQL Query issued to database : (Wed Nov 18 02:49:17 2009)
    READER_1_1_1> RR_4050 First row returned from database to reader : (Wed Nov 18 02:49:17 2009)
    LKPDP_3> DBG_21312 Lookup Transformation [mplt_ADI_Codes.Lkp_W_CODE_D]: Lookup override sql to create cache: SELECT W_CODE_D.SOURCE_NAME_1 AS SOURCE_NAME_1, W_CODE_D.SOURCE_NAME_2 AS SOURCE_NAME_2, W_CODE_D.MASTER_DATASOURCE_NUM_ID AS MASTER_DATASOURCE_NUM_ID, W_CODE_D.MASTER_CODE AS MASTER_CODE, W_CODE_D.MASTER_VALUE AS MASTER_VALUE, W_CODE_D.W_INSERT_DT AS W_INSERT_DT, W_CODE_D.TENANT_ID AS TENANT_ID, W_CODE_D.DATASOURCE_NUM_ID AS DATASOURCE_NUM_ID, W_CODE_D.SOURCE_CODE AS SOURCE_CODE, W_CODE_D.CATEGORY AS CATEGORY, W_CODE_D.LANGUAGE_CODE AS LANGUAGE_CODE FROM W_CODE_D
    WHERE
    W_CODE_D.CATEGORY IN () ORDER BY DATASOURCE_NUM_ID,SOURCE_CODE,CATEGORY,LANGUAGE_CODE,SOURCE_NAME_1,SOURCE_NAME_2,MASTER_DATASOURCE_NUM_ID,MASTER_CODE,MASTER_VALUE,W_INSERT_DT,TENANT_ID
    LKPDP_3> TE_7212 Increasing [Index Cache] size for transformation [mplt_ADI_Codes.Lkp_W_CODE_D] from [1000000] to [4734976].
    LKPDP_3> TE_7212 Increasing [Data Cache] size for transformation [mplt_ADI_Codes.Lkp_W_CODE_D] from [2000000] to [2007040].
    READER_1_1_1> BLKR_16019 Read [625] rows, read [0] error rows for source table [FND_ID_FLEX_SEGMENTS] instance name [mplt_BC_ORA_Codes_GL_Account_Segments.FND_ID_FLEX_SEGMENTS]
    READER_1_1_1> BLKR_16008 Reader run completed.
    LKPDP_3> TM_6660 Total Buffer Pool size is 609824 bytes and Block size is 65536 bytes.
    LKPDP_3:READER_1_1> DBG_21438 Reader: Source is [orcl], user [DAC_REP]
    LKPDP_3:READER_1_1> BLKR_16051 Source database connection [DataWarehouse] code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    LKPDP_3:READER_1_1> BLKR_16003 Initialization completed successfully.
    LKPDP_3:READER_1_1> BLKR_16007 Reader run started.
    LKPDP_3:READER_1_1> RR_4049 SQL Query issued to database : (Wed Nov 18 02:49:18 2009)
    LKPDP_3:READER_1_1> CMN_1761 Timestamp Event: [Wed Nov 18 02:49:18 2009]
    LKPDP_3:READER_1_1> RR_4035 SQL Error [
    ORA-00936: missing expression
    Could you please suggest what the issue might be and how it can be fixed?
    Many thanks,
    Kiran
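
    For reference, the empty IN list in the lookup override above is exactly what raises the ORA-00936. A minimal illustration of the failure mode (the category value 'GL_ACCOUNT' is hypothetical; the real value is supplied by the DAC parameter file):

    -- With $$CATEGORY unresolved, the generated predicate is invalid SQL (ORA-00936):
    SELECT * FROM W_CODE_D WHERE W_CODE_D.CATEGORY IN ();
    -- Shape of the query once the parameter resolves to a value:
    SELECT * FROM W_CODE_D WHERE W_CODE_D.CATEGORY IN ('GL_ACCOUNT');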

    I have continued related details in the following thread:
    Mapping Parameter  $$CATEGORY not included in the parameter file (7.9.6.1)
    Apologies for the inconvenience.
    Thanks,
    Kiran

  • Delta and Full load Process Chain

    Hi Guru`s,
    I have one standard target (InfoCube) which is fetching data from 3 datasources. Out of the 3 datasources, 2 are standard datasources and 1 is a generic datasource (custom datasource).
    The 2 standard datasources have delta loads and the generic datasource has a full load.
    I have to design the process chain using both the full load and the delta loads.
    Can anybody tell me how I could achieve this requirement?
    Thanks in Advance,
    Venkat

    Hi,
    Please check the following steps.
    Delta chain:
    Step-1 Start variant
    Step-2 Delete PSA request (full - your generic DS)
    Step-3 Delta InfoPackage 1
    Step-4 Delta InfoPackage 2
    Step-5 Full InfoPackage (generic DS)
    Step-6 DTP1
    Step-7 DTP2
    Step-8 DTP3
    Full chain:
    Step-1 Start variant
    Step-2 Delete PSA request for PSA1
    Step-3 Delete PSA request for PSA2
    Step-4 Delete PSA request for PSA3
    Step-5 Full IP1
    Step-6 Full IP2
    Step-7 Full IP3
    Step-8 DTP1
    Step-9 DTP2
    Step-10 DTP3
    I have given the steps up to the DSO only.
    Thanks, Girish

  • Error in Source System while running the Full Load

    Hi Experts,
    We are using 0CRM_UT_SRV_CONT_I datasource from CRM to extract data.
    It is giving an error while running the full load: 'Error occurred in source system', with message number RSM340.
    I want to mention certain points here:
    1) In RSA3 it runs fine.
    2) The connection between the BW and CRM systems is OK.
    3) We have replicated the datasource and sent the active version from the Dev system.
    4) When I run Init without data transfer, the Init is set successfully, with an entry in RSA7 in CRM.
    5) The delta runs fine with green status.
    6) As the source system is a static client, there is no data for the delta.
    The main issue happens when we run either the full load or Init with data transfer.
    Kindly suggest.
    Thanks
    Mayank

    Hi Mayank,
    check the error logs in the source system (CRM) with transactions SM21 and ST22, and then let us know.
    Charly

  • Process chain for full load.

    Hi,
    I have developed a process chain for delta loads. Now my question is: can we use a process chain to load a full-load datasource for master data? If yes, what are the steps?
    If we can use a process chain for a full load of master data, can we then use delta and full-load InfoPackages in the same process chain?
    Please update.

    Hi Sata,
    You can include both full and delta loads in the process chain. But make sure that you execute the Init InfoPackage manually before executing the delta InfoPackage in the process chain.
    For the full-load process chain, add the InfoPackage --> DTP for full load --> delete PSA data variant (after 2 days) --> attribute change run.
    Normally load the hierarchy first, then the attributes and texts.
    Assign points if it helped you.
    Regards,
    Senoy

  • PDS are not displayed in supply chain model SCC07

    Hi,
    Does anyone know if it is possible to add PDS to the supply chain model (transaction SCC07)? Currently I don't see this option (we are not using IPPE in ECC).

    Hello,
    You cannot see PDS in SCC07. Note that PDS are not even part of the menu.
    You can also see that the SAP Help does not mention PDS in the documentation of the Supply Chain Engineer (SCC07):
    https://help.sap.com/saphelp_scm50/helpdata/en/4c/e1af376ed96c21e10000009b38f8cf/content.htm?frameset=/en/40/f5ae37fdcf3…
    Kind Regards,
    Mariano

  • Inventory transaction worker request completing with error.

    When running the Inventory transaction worker, the concurrent request completes with an error.
    Here I attached the error message ..
    Oracle Inventory: Version : 11.5.0 - Development
    Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
    INCTCW module: Inventory transaction worker
    Current system time is 10-OCT-2010 08:22:07
    No completion options were requested.
    Output is not being printed because:
    The print option has been disabled for this report.
    Concurrent request completed
    Current system time is 10-OCT-2010 08:22:08
    Regards
    ***SBJ***

    Hi, Did relinking fix your issue?
    Thanks.
    Hi,
    I have the same problem. The Inventory transaction worker errors out randomly. Did relinking fix your issue?
    Thanks
    A/A
    Hi,
    No, I didn't try to relink INCTCW.
    Please try to relink the file and see if it helps.
    Also, please go through the doc referenced above and check the database log file for any errors.
    Thanks,
    Hussein

  • Errors: ORA-00054 & ORA-01452 while running DAC Full Load

    Hi Friends,
    Previously I ran a full load and it went well, and I also built some sample reports in BI Apps 7.9.6.2.
    Now I have modified a few parameters as per the business and am trying to run the full load again, but I am stuck with a few similar errors. I have cleared a couple of DB errors already.
    Please help me to solve the errors below.
    1. ANOMALY INFO::: Error while executing : TRUNCATE TABLE:W_SALES_BOOKING_LINE_F
    MESSAGE:::com.siebel.etl.database.IllegalSQLQueryException: DataWarehouse:TRUNCATE TABLE W_SALES_BOOKING_LINE_F
    ORA-00054: resource busy and acquire with NOWAIT specified or timeout expired
    -- I checked W_SALES_BOOKING_LINE_F; it contains data.
    2. ANOMALY INFO::: Error while executing : CREATE INDEX:W_GL_REVN_F:W_GL_REVN_F_U1
    MESSAGE:::java.lang.Exception: Error while execution : CREATE UNIQUE INDEX
         W_GL_REVN_F_U1
    ON
         W_GL_REVN_F
         (INTEGRATION_ID ASC
         ,DATASOURCE_NUM_ID ASC)
    NOLOGGING
    with error DataWarehouse:CREATE UNIQUE INDEX
         W_GL_REVN_F_U1
    ON
         W_GL_REVN_F
         (INTEGRATION_ID ASC
         ,DATASOURCE_NUM_ID ASC)
    NOLOGGING
    ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found
    -- Yes, I found duplicate values in table W_GL_REVN_F. But how can I rectify it? I did some investigation, but failed.
    Please tell me the steps to achieve this.
    Thanks in advance..
    Stone

    Hi, please see the answers below.
    1. ANOMALY INFO::: Error while executing : TRUNCATE TABLE:W_SALES_BOOKING_LINE_F
    MESSAGE:::com.siebel.etl.database.IllegalSQLQueryException: DataWarehouse:TRUNCATE TABLE W_SALES_BOOKING_LINE_F
    ORA-00054: resource busy and acquire with NOWAIT specified or timeout expired
    -- I checked W_SALES_BOOKING_LINE_F; it contains data.
    Just restart the load. It seems like your DB processes are busy and the table still has a lock on it, which means something is not yet committed/rolled back.
    If this issue repeats, you can mail your DBA and ask him to look into the issue.
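    If you want to see who holds the lock before restarting, a query along these lines can help; this is only a sketch, assuming you have access to the v$ views:

    -- Find the session(s) holding a lock on the fact table (run as a DBA)
    SELECT s.sid, s.serial#, s.status, s.program, o.object_name
    FROM v$locked_object l
    JOIN dba_objects o ON o.object_id = l.object_id
    JOIN v$session s ON s.sid = l.session_id
    WHERE o.object_name = 'W_SALES_BOOKING_LINE_F';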
    2. ANOMALY INFO::: Error while executing : CREATE INDEX:W_GL_REVN_F:W_GL_REVN_F_U1
    MESSAGE:::java.lang.Exception: Error while execution : CREATE UNIQUE INDEX
    W_GL_REVN_F_U1
    ON
    W_GL_REVN_F
    (INTEGRATION_ID ASC
    ,DATASOURCE_NUM_ID ASC)
    NOLOGGING
    with error DataWarehouse:CREATE UNIQUE INDEX
    W_GL_REVN_F_U1
    ON
    W_GL_REVN_F
    (INTEGRATION_ID ASC
    ,DATASOURCE_NUM_ID ASC)
    NOLOGGING
    ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found
    -- Yes, I found duplicate values in table W_GL_REVN_F. But how can I rectify it? I did some investigation, but failed.
    Please tell me the steps to achieve this.
    Please execute the SQL below to get the duplicate values. If the count is small, you can delete the records based on ROW_WID.
    How many duplicates do you have in total?
    1. SELECT INTEGRATION_ID, DATASOURCE_NUM_ID, COUNT(*) FROM W_GL_REVN_F
       GROUP BY INTEGRATION_ID, DATASOURCE_NUM_ID
       HAVING COUNT(*) > 1;
    2. SELECT ROW_WID, DATASOURCE_NUM_ID, INTEGRATION_ID FROM W_GL_REVN_F
       WHERE INTEGRATION_ID = (value from 1st query);
    3. DELETE FROM W_GL_REVN_F WHERE ROW_WID = (value from 2nd query);
    Hope this helps !!
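
    As a follow-up, the per-row deletes in step 3 can be collapsed into a single statement. This is only a sketch: it assumes ROW_WID is non-null and unique per row, and that keeping the lowest ROW_WID per (INTEGRATION_ID, DATASOURCE_NUM_ID) is acceptable. Back up the table and verify the counts before deleting:

    -- Delete every duplicate except the row with the lowest ROW_WID per key
    DELETE FROM W_GL_REVN_F f
    WHERE f.ROW_WID NOT IN (
        SELECT MIN(ROW_WID)
        FROM W_GL_REVN_F
        GROUP BY INTEGRATION_ID, DATASOURCE_NUM_ID
    );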

  • Error in process chain for PCA full load

    Hello everyone,
    I'm trying to use a process chain in order to delete a previous full load of plan data in a cube prior to the new load (to avoid double records). The successor job in the process chain is loading a delta of actual data into the same cube (same info source).
    When executing the process chain (and the included info package (full load)), the setting "Automatic loading of similar/identical requests from info cube" in the info package is not working (I have ticked "full or init, data-/infosource are the same")...
    I have checked that the function itself works as I have executed the info package manually with success. So the problem is the chain somehow.
    In the chain I just execute the info package as usual... so to my understanding, it should work the same way as if I executed it manually. Or am I wrong? Is some additional setting required in the chain in order to make it work?
    Any ideas?
    Thanks,
    Fredrik

    Hi Fredrik,
    Not all settings in InfoPackages work in chains the same way they do when running the package manually. Mostly you can check that by pressing F1 on the setting. In your case, you need to add a process type for deleting the data to the chain. In your chain maintenance, look at the process types under load processes; there you will find the type you need.
    kind regards
    Siggi

  • How to do a full load in DAC for a particular Module?

    Hi,
    We have one execution plan loading all the BI Analytics modules as of now:
    1)Financials,
    2)Procurement and Spend,
    3)Supply Chain and Order Management and
    4)EAM
    The issue is that if I go to Tools --> ETL Management --> Reset Data Sources in DAC, it resets the refresh dates for all the tables in the warehouse (and hence a full load for all the modules happens when I start the execution plan).
    I don't want a full load for the other modules; I just want to do a full load for a particular module, let's say the EAM module (and I also want to make sure that this particular full load doesn't create issues with the data for the other modules).
    Let me know how to achieve this.
    Any help in this regard would be highly appreciated.
    Thanks
    Ashish

    I can get a list of all the facts and dimensions for a particular module from the BI Apps Content guide, and then I can go and set the refresh dates to null for those tables.
    Now, should I run the execution plan for one particular module, or an execution plan containing all the modules?
    Because I got this from someone:
    "The Data models for the different modules have many common dimensions, so a reload of one module could leave the facts in another in an invalid state as the dimension keys would be moved.
    Therefore it should not be expected that you can reload a module in isolation.
    However, you can selectively do a reload of any table by resetting its refresh date in the DAC console.
    Then the DAC will determine the dependent objects that will also need a full load in order to maintain data consistency."
    Thanks
    Ashish

  • Full Load: Error while executing :TRUNCATE TABLE: S_ETL_PARAM

    Hi All,
    We are using BI Apps 7.9.6.1. The full load was running fine, but now we are facing a problem with truncating the table S_ETL_PARAM.
    I have restarted the Informatica server and also the DAC server, but I am still getting the same error in the DAC log:
    "ANOMALY INFO::: Error while executing : TRUNCATE TABLE:S_ETL_PARAM
    MESSAGE:::com.siebel.etl.database.IllegalSQLQueryException: DBConnection_OLTP:SIEBTRUN ('siebel.S_ETL_PARAM')
    Values :
    Null Map
    EXCEPTION CLASS::: java.lang.Exception"
    Any suggestions?
    Thanks in Advance,
    Deepak

    Are you trying to run an incremental load when you get this truncate error? Can you re-run the full load and see whether that still runs OK? Please also check your DW-side database logs (e.g., the alert log) for any DB-level issue; such errors do not throw friendly messages on the DAC/Informatica side.

  • BI Apps error while performing Full Load....

    Hi Experts,
    I am trying to implement BI Apps on my laptop with 2 VMs: one with BI Apps, OBIEE, Informatica, and DAC; the other VM has EBS R12.1.1 with the Vision demo database. I am following the blog post below from the Deliver BI blog:
    http://deliverbi.blogspot.com/search/label/OBIA%20Setup%20Steps
    PDF: http://www.box.net/shared/7q0gavzd63
    I am now on page 73. I clicked Run Now, and when I check the status of the task it gives an error.
    Status Description:
    Some steps failed.Number of incomplete tasks whose status got updated to stopped :0
    Number of incomplete task details whose status got updated to stopped :1721
    Complete Description in the Description tab:
    ETL Process Id : 21627633
    ETL Name : SHA_LOAD_HR
    Run Name : SHA_LOAD_HR: ETL Run - 2011-02-13 22:32:05.907
    DAC Server : localhost(biapps)
    DAC Port : 3141
    Status: Failed
    Log File Name: SHA_LOAD_HR.21627633.log
    Database Connection(s) Used :
         DataWarehouse jdbc:oracle:thin:@biapps.ven.com:1521:orcl10
         ORA_R1211 jdbc:oracle:thin:@ebsr12.ven.com:1521:VIS10
    Informatica Server(s) Used :
         Oracle_BI_DW_Base_Integration-Oracle_BI_DW_Base_Integration:(10)
    Start Time: 2011-02-13 22:32:05.922
    Message: Some steps failed.Number of incomplete tasks whose status got updated to stopped :0
    Number of incomplete task details whose status got updated to stopped :1721
    Actual Start Time: 2011-02-13 22:32:05.922
    End Time: 2011-02-13 22:34:01.503
    Total Time Taken: 1 Minutes
    Start Time For This Run: 2011-02-13 22:32:05.922
    Total Time Taken For This Run: 1 Minutes
    Log file mentioned in the above description:
    http://www.mediafire.com/?j9br75v6ecga8h6
    Below are the screen shots of the Tasks tab:
    1st half: http://img11.imageshack.us/i/111bud.jpg/
    2nd half: http://img340.imageshack.us/i/222hj.jpg/
    I successfully followed each and every step mentioned in the above document up to page 72, but on page 73 I got the above errors.
    For the source I used the apps/apps@VIS account, where VIS is the instance name for the Vision demo database, but in the above document I was asked to give ebs12 as the connection string name. I am able to log in to the apps/apps@VIS account from my biapps VM.
    Can anyone please help me in running a full load on the EBS R12.1.1 Vision demo database?
    Thanks in advance,
    DK

    Hi DK,
    It was a combination of a couple of things.
    Did you apply the cumulative patch 10052370?
    Also check your steps against the links below:
    http://gerardnico.com/wiki/obia/installation_7961
    http://ahmedshareefuddin.blogspot.com/2010/12/installation-and-configuration-of.html
    Hope this helps,
    regards,

  • SQL Statement error when running ETL from DAC

    Dear all,
    I installed and configured BI Apps 7.9.6.3, then ran the full load of the subject area CRM - Loyalty in DAC. I got a task failure error, so I checked the session log files on the Informatica server.
    $ view SEBL_VERT_811.DATAWAREHOUSE.SDE_SBL_Vert_811_Adaptor.SDE_GeographyDimension_Business.log
    DIRECTOR> VAR_27028 Use override value [0] for user-defined workflow/worklet variable:[$$passInStatus].
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [SEBL_VERT_811] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [SEBL_VERT_811.DATAWAREHOUSE.SDE_SBL_Vert_811_Adaptor.SDE_GeographyDimension_Business.log] for session parameter:[$PMSessionLogFile].
    DIRECTOR> VAR_27028 Use override value [1] for mapping parameter:[MPLT_LOAD_W_GEO_DS.$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27028 Use override value [] for mapping parameter:[$$Hint1].
    DIRECTOR> VAR_27028 Use override value [] for mapping parameter:[$$Hint2].
    DIRECTOR> TM_6014 Initializing session [SDE_GeographyDimension_Business] at [Mon Jul 25 17:29:47 2011].
    DIRECTOR> TM_6683 Repository Name: [Oracle_BI_DW_Base]
    DIRECTOR> TM_6684 Server Name: [Oracle_BI_DW_Server]
    DIRECTOR> TM_6686 Folder: [SDE_SBL_Vert_811_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_GeographyDimension_Business] Run Instance Name: [] Run Id: [260]
    DIRECTOR> TM_6101 Mapping name: SDE_GeographyDimension_Business [version 1].
    DIRECTOR> TM_6963 Pre 85 Timestamp Compatibility is Enabled
    DIRECTOR> TM_6964 Date format for the Session is [MM/DD/YYYY HH24:MI:SS]
    DIRECTOR> TM_6827 [u01/app/oracle/biapps/dev/Informatica/9.0.1/server/infa_shared/Storage] will be used as storage directory for session [SDE_GeographyDimension_Business].
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6708 Using configuration property [DisableDB2BulkMode,Yes]
    DIRECTOR> TM_6708 Using configuration property [ServerPort,6325]
    DIRECTOR> TM_6708 Using configuration property [overrideMpltVarWithMapVar,Yes]
    DIRECTOR> TM_6708 Using configuration property [SiebelUnicodeDB,SIEBEL@ANSDEV dwhadmin@ANBDEV]
    DIRECTOR> TM_6703 Session [SDE_GeographyDimension_Business] is run by 64-bit Integration Service [node01_hkhgc01dvapp01], version [9.0.1 HotFix2], build [1111].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [UNICODE]
    MAPPING> CMN_1570 Server Code page: [UTF-8 encoding of Unicode]
    MAPPING> TM_6151 The session sort order is [Binary].
    MAPPING> TM_6185 Warning. Code page validation is disabled in this session.
    MAPPING> TM_6156 Using low precision processing.
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6187 Session target-based commit interval is [10000].
    MAPPING> TM_6307 DTM error log disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> DBG_21075 Connecting to database [ANBDEV], user [dwhadmin]
    MAPPING> CMN_1716 Lookup [MPLT_LOAD_W_GEO_DS.LKP_W_LST_OF_VAL_G] uses database connection [Relational:DataWarehouse] in code page [UTF-8 encoding of Unicode]
    MAPPING> CMN_1716 Lookup [MPLT_LOAD_W_GEO_DS.LKP_W_GEO_DS] uses database connection [Relational:DataWarehouse] in code page [UTF-8 encoding of Unicode]
    MAPPING> DBG_21694 AGG_COUNTRY_CITY_ZIPCODE Partition [0]: Index cache size = [1048576], Data cache size = [2097152]
    MAPPING> TE_7212 Increasing [Index Cache] size for transformation [AGG_COUNTRY_CITY_ZIPCODE] from [1048576] to [2402304].
    MAPPING> TE_7212 Increasing [Data Cache] size for transformation [AGG_COUNTRY_CITY_ZIPCODE] from [2097152] to [2097528].
    MAPPING> TE_7029 Aggregate Information: Creating New Index and Data Files
    MAPPING> TE_7034 Aggregate Information: Index file is [u01/app/oracle/biapps/dev/Informatica/9.0.1/server/infa_shared/Cache/PMAGG14527_3_0_260.idx]
    MAPPING> TE_7035 Aggregate Information: Data file is [u01/app/oracle/biapps/dev/Informatica/9.0.1/server/infa_shared/Cache/PMAGG14527_3_0_260.dat]
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_GeographyDimension_Business]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Mon Jul 25 17:29:47 2011)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Mon Jul 25 17:29:47 2011)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 36000000 bytes and Block size is 128000 bytes.
    LKPDP_2> DBG_21097 Lookup Transformation [MPLT_LOAD_W_GEO_DS.LKP_W_GEO_DS]: Default sql to create lookup cache: SELECT CITY,COUNTRY,ZIPCODE,STATE_PROV FROM W_GEO_DS ORDER BY CITY,COUNTRY,ZIPCODE,STATE_PROV
    LKPDP_1> DBG_21312 Lookup Transformation [MPLT_LOAD_W_GEO_DS.LKP_W_LST_OF_VAL_G]: Lookup override sql to create cache: SELECT W_LST_OF_VAL_G.VAL AS VAL, W_LST_OF_VAL_G.R_TYPE AS R_TYPE FROM W_LST_OF_VAL_G
    WHERE
         W_LST_OF_VAL_G.R_TYPE LIKE 'ETL%' ORDER BY R_TYPE,VAL
    LKPDP_1> TE_7212 Increasing [Index Cache] size for transformation [MPLT_LOAD_W_GEO_DS.LKP_W_LST_OF_VAL_G] from [1048576] to [1050000].
    LKPDP_2> TE_7212 Increasing [Index Cache] size for transformation [MPLT_LOAD_W_GEO_DS.LKP_W_GEO_DS] from [20000000] to [20006400].
    LKPDP_2> TE_7212 Increasing [Data Cache] size for transformation [MPLT_LOAD_W_GEO_DS.LKP_W_GEO_DS] from [20000000] to [20004864].
    READER_1_1_1> DBG_21438 Reader: Source is [ANSDEV], user [SIEBEL]
    READER_1_1_1> BLKR_16051 Source database connection [SEBL_VERT_811] code page: [UTF-8 encoding of Unicode]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8146 Writer: Target is database [ANBDEV], user [dwhadmin], bulk mode [ON]
    WRITER_1_*_1> WRT_8106 Warning! Bulk Mode session - recovery is not guaranteed.
    WRITER_1_*_1> WRT_8221 Target database connection [DataWarehouse] code page: [UTF-8 encoding of Unicode]
    WRITER_1_*_1> WRT_8124 Target Table W_GEO_DS :SQL INSERT statement:
    INSERT INTO W_GEO_DS(CITY,CONTINENT,COUNTRY,COUNTY,STATE_PROV,ZIPCODE,DATASOURCE_NUM_ID,X_CUSTOM) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8020 No column marked as primary key for table [W_GEO_DS]. UPDATEs Not Supported.
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_GEO_DS]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    READER_1_1_1> BLKR_16007 Reader run started.
    WRITER_1_*_1> WRT_8005 Writer run started.
    WRITER_1_*_1> WRT_8158
    *****START LOAD SESSION*****
    Load Start Time: Mon Jul 25 17:29:47 2011
    Target tables:
    W_GEO_DS
    READER_1_1_1> RR_4029 SQ Instance [SQ_S_ADDR_ORG] User specified SQL Query [SELECT  DISTINCT
    S_ADDR_ORG.CITY,
    S_ADDR_ORG.COUNTRY,
    S_ADDR_ORG.COUNTY,
    S_ADDR_ORG.PROVINCE,
    S_ADDR_ORG.STATE,
    S_ADDR_ORG.ZIPCODE,
          '0' AS X_CUSTOM
    FROM
    V_ADDR_ORG S_ADDR_ORG
    READER_1_1_1> RR_4049 SQL Query issued to database : (Mon Jul 25 17:29:47 2011)
    READER_1_1_1> CMN_1761 Timestamp Event: [Mon Jul 25 17:29:47 2011]
    READER_1_1_1> RR_4035 SQL Error [
    ORA-00942: table or view does not exist
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT DISTINCT
    S_ADDR_ORG.CITY,
    S_ADDR_ORG.COUNTRY,
    S_ADDR_ORG.COUNTY,
    S_ADDR_ORG.PROVINCE,
    S_ADDR_ORG.STATE,
    S_ADDR_ORG.ZIPCODE,
    '0' AS X_CUSTOM
    FROM
    V_ADDR_ORG S_ADDR_ORG
    Oracle Fatal Error
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT DISTINCT
    S_ADDR_ORG.CITY,
    S_ADDR_ORG.COUNTRY,
    S_ADDR_ORG.COUNTY,
    S_ADDR_ORG.PROVINCE,
    S_ADDR_ORG.STATE,
    S_ADDR_ORG.ZIPCODE,
    '0' AS X_CUSTOM
    FROM
    V_ADDR_ORG S_ADDR_ORG
    Oracle Fatal Error].
    READER_1_1_1> CMN_1761 Timestamp Event: [Mon Jul 25 17:29:47 2011]
    READER_1_1_1> BLKR_16004 ERROR: Prepare failed.
    WRITER_1_*_1> WRT_8333 Rolling back all the targets due to fatal session error.
    WRITER_1_*_1> WRT_8325 Final rollback executed for the target [W_GEO_DS] at end of load
    WRITER_1_*_1> WRT_8035 Load complete time: Mon Jul 25 17:29:47 2011
    LOAD SUMMARY
    ============
    WRT_8036 Target: W_GEO_DS (Instance Name: [W_GEO_DS])
    WRT_8044 No data loaded for this target
    WRITER_1_*_1> WRT_8043 *****END LOAD SESSION*****
    MANAGER> PETL_24031
    ***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
    Thread [READER_1_1_1] created for [the read stage] of partition point [SQ_S_ADDR_ORG] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [SQ_S_ADDR_ORG] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [TRANSF_1_2_1] created for [the transformation stage] of partition point [AGG_COUNTRY_CITY_ZIPCODE] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [WRITER_1_*_1] created for [the write stage] of partition point [W_GEO_DS] has completed. The total run time was insufficient for any meaningful statistics.
    MAPPING> CMN_1791 The index cache size that would hold [0] aggregate groups of input rows for [AGG_COUNTRY_CITY_ZIPCODE], in memory, is [0] bytes
    MAPPING> CMN_1790 The data cache size that would hold [0] aggregate groups of input rows for [AGG_COUNTRY_CITY_ZIPCODE], in memory, is [0] bytes
    MAPPING> CMN_1793 The index cache size that would hold [0] rows in the lookup table for [MPLT_LOAD_W_GEO_DS.LKP_W_LST_OF_VAL_G], in memory, is [0] bytes
    MAPPING> CMN_1792 The data cache size that would hold [0] rows in the lookup table for [MPLT_LOAD_W_GEO_DS.LKP_W_LST_OF_VAL_G], in memory, is [0] bytes
    MAPPING> CMN_1793 The index cache size that would hold [0] rows in the lookup table for [MPLT_LOAD_W_GEO_DS.LKP_W_GEO_DS], in memory, is [0] bytes
    MAPPING> CMN_1792 The data cache size that would hold [0] rows in the lookup table for [MPLT_LOAD_W_GEO_DS.LKP_W_GEO_DS], in memory, is [0] bytes
    MANAGER> PETL_24005 Starting post-session tasks. : (Mon Jul 25 17:29:47 2011)
    MANAGER> PETL_24029 Post-session task completed successfully. : (Mon Jul 25 17:29:47 2011)
    MAPPING> TE_7216 Deleting cache files [PMLKUP14527_524289_0_260L64] for transformation [MPLT_LOAD_W_GEO_DS.LKP_W_LST_OF_VAL_G].
    MAPPING> TE_7216 Deleting cache files [PMLKUP14527_524293_0_260L64] for transformation [MPLT_LOAD_W_GEO_DS.LKP_W_GEO_DS].
    MAPPING> TM_6018 The session completed with [0] row transformation errors.
    MANAGER> TE_7216 Deleting cache files [u01/app/oracle/biapps/dev/Informatica/9.0.1/server/infa_shared/Cache/PMAGG14527_3_0_260.idx] for transformation [AGG_COUNTRY_CITY_ZIPCODE].
    MANAGER> TE_7216 Deleting cache files [u01/app/oracle/biapps/dev/Informatica/9.0.1/server/infa_shared/Cache/PMAGG14527_3_0_260.dat] for transformation [AGG_COUNTRY_CITY_ZIPCODE].
    MANAGER> PETL_24002 Parallel Pipeline Engine finished.
    DIRECTOR> PETL_24013 Session run completed with failure.
    DIRECTOR> TM_6022
    SESSION LOAD SUMMARY
    ================================================
    DIRECTOR> TM_6252 Source Load Summary.
    DIRECTOR> CMN_1740 Table: [SQ_S_ADDR_ORG] (Instance Name: [SQ_S_ADDR_ORG])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6253 Target Load Summary.
    DIRECTOR> CMN_1740 Table: [W_GEO_DS] (Instance Name: [W_GEO_DS])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6023
    ===================================================
    DIRECTOR> TM_6020 Session [SDE_GeographyDimension_Business] completed at [Mon Jul 25 17:29:48 2011].
    After reviewing the log, I found that the SELECT statement failed; the SQL below was wrong:
    SELECT DISTINCT
    S_ADDR_ORG.CITY,
    S_ADDR_ORG.COUNTRY,
    S_ADDR_ORG.COUNTY,
    S_ADDR_ORG.PROVINCE,
    S_ADDR_ORG.STATE,
    S_ADDR_ORG.ZIPCODE,
    '0' AS X_CUSTOM
    FROM
    V_ADDR_ORG S_ADDR_ORG
    There is no table V_ADDR_ORG, only S_ADDR_ORG, in the Siebel transaction database. So I don't know why it generates this SQL when transferring data to the OBAW. This is the out-of-the-box BI Application.
    Experts, how could I fix this problem? Please help, thank you very much!
    Best regards,
    Ryan

    Ryan,
    Yes, you missed creating the views for the source tables. In DAC, go to Design >> Tables tab, select any table, right-click on it, then click Change Capture Scripts >> Generate View Scripts. It will ask whether it can generate the script for all the tables, and it will generate the view script for you. Now run the entire script in the source database, and your problem will be solved.
    If this answered your question, please mark my answer as correct.
    Thanks
    Jay.
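
    For context, the full-load change capture views that the DAC script generates are typically thin pass-throughs over the base Siebel tables. The following is a minimal sketch of the idea only; the incremental variants differ, so run the DAC-generated script rather than hand-writing these:

    -- Hypothetical example of a generated full-load view
    CREATE OR REPLACE VIEW V_ADDR_ORG AS
    SELECT * FROM S_ADDR_ORG;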
