URGENT! Please help: DAC Full Load always in 'Running' status at a particular task

Hi Friends,
I started a full load yesterday. There are 257 tasks in total. The load ran fine without issues up to the 248th task, but the 249th task (Load into Activity Fact) stays in 'Running' status and does not complete even after running for 2 hours. I checked the Informatica Workflow Monitor and found that the workflow is also in 'Running' state and never completes. When I right-clicked the session and selected Run Properties, I could see that 0 rows had been inserted into the target table. So I tried to stop the workflow manually; even after that, the task stayed in 'Stopping' status and never stopped. I then manually aborted the workflow.
Below is the session log file. Could you please check it and let me know what is wrong?
Regards,
Vijay

Hi Friends,
We executed a full load again on Saturday, 23rd July 2011. This time we allowed the task 'Load into Activity Fact_CUSTOM' to run without stopping it manually as we did in the previous data load. It executed for 3 hours and 45 minutes and then failed with ORA-01652 (unable to extend temp segment by string in tablespace string). The same task completed successfully in our dev environment. Below is what we found in the session log file; please help us resolve this issue, and please respond as soon as possible, as this is happening in our prod environment.
2011-07-23 14:56:07 : ERROR : (8128 | LKPDP_25:READER_1_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : RR_4035 : SQL Error [
ORA-01652: unable to extend temp segment by 128 in tablespace TEMP
Database driver error...
Function Name : Execute
SQL Stmt : SELECT distinct LOOKUP_TABLE.ROW_WID AS ROW_WID, LOOKUP_TABLE.GEO_WID AS GEO_WID, LOOKUP_TABLE.INTEGRATION_ID AS INTEGRATION_ID, LOOKUP_TABLE.DATASOURCE_NUM_ID AS DATASOURCE_NUM_ID, LOOKUP_TABLE.EFFECTIVE_FROM_DT AS EFFECTIVE_FROM_DT, LOOKUP_TABLE.EFFECTIVE_TO_DT AS EFFECTIVE_TO_DT FROM W_PARTY_D LOOKUP_TABLE, W_ACTIVITY_FS LEFT OUTER JOIN W_CUSTOMER_ACCOUNT_D ON (W_ACTIVITY_FS.CUSTOMER_ACCOUNT_ID = W_CUSTOMER_ACCOUNT_D.INTEGRATION_ID AND W_ACTIVITY_FS.DATASOURCE_NUM_ID = W_CUSTOMER_ACCOUNT_D.DATASOURCE_NUM_ID) WHERE COALESCE(W_ACTIVITY_FS.CUSTOMER_ID, W_CUSTOMER_ACCOUNT_D.PARTY_ID) = LOOKUP_TABLE.INTEGRATION_ID AND W_ACTIVITY_FS.DATASOURCE_NUM_ID = LOOKUP_TABLE.DATASOURCE_NUM_ID AND COALESCE(W_ACTIVITY_FS.PLANNED_START_DT, W_ACTIVITY_FS.CREATED_DT) >= LOOKUP_TABLE.EFFECTIVE_FROM_DT AND COALESCE(W_ACTIVITY_FS.PLANNED_START_DT, W_ACTIVITY_FS.CREATED_DT) < LOOKUP_TABLE.EFFECTIVE_TO_DT ORDER BY LOOKUP_TABLE.INTEGRATION_ID, LOOKUP_TABLE.DATASOURCE_NUM_ID, LOOKUP_TABLE.EFFECTIVE_FROM_DT, LOOKUP_TABLE.EFFECTIVE_TO_DT, LOOKUP_TABLE.ROW_WID, LOOKUP_TABLE.GEO_WID -- ORDER BY INTEGRATION_ID,DATASOURCE_NUM_ID,EFFECTIVE_FROM_DT,EFFECTIVE_TO_DT,ROW_WID,GEO_WID
Oracle Fatal Error
Database driver error...
Function Name : Execute
SQL Stmt : SELECT distinct LOOKUP_TABLE.ROW_WID AS ROW_WID, LOOKUP_TABLE.GEO_WID AS GEO_WID, LOOKUP_TABLE.INTEGRATION_ID AS INTEGRATION_ID, LOOKUP_TABLE.DATASOURCE_NUM_ID AS DATASOURCE_NUM_ID, LOOKUP_TABLE.EFFECTIVE_FROM_DT AS EFFECTIVE_FROM_DT, LOOKUP_TABLE.EFFECTIVE_TO_DT AS EFFECTIVE_TO_DT FROM W_PARTY_D LOOKUP_TABLE, W_ACTIVITY_FS LEFT OUTER JOIN W_CUSTOMER_ACCOUNT_D ON (W_ACTIVITY_FS.CUSTOMER_ACCOUNT_ID = W_CUSTOMER_ACCOUNT_D.INTEGRATION_ID AND W_ACTIVITY_FS.DATASOURCE_NUM_ID = W_CUSTOMER_ACCOUNT_D.DATASOURCE_NUM_ID) WHERE COALESCE(W_ACTIVITY_FS.CUSTOMER_ID, W_CUSTOMER_ACCOUNT_D.PARTY_ID) = LOOKUP_TABLE.INTEGRATION_ID AND W_ACTIVITY_FS.DATASOURCE_NUM_ID = LOOKUP_TABLE.DATASOURCE_NUM_ID AND COALESCE(W_ACTIVITY_FS.PLANNED_START_DT, W_ACTIVITY_FS.CREATED_DT) >= LOOKUP_TABLE.EFFECTIVE_FROM_DT AND COALESCE(W_ACTIVITY_FS.PLANNED_START_DT, W_ACTIVITY_FS.CREATED_DT) < LOOKUP_TABLE.EFFECTIVE_TO_DT ORDER BY LOOKUP_TABLE.INTEGRATION_ID, LOOKUP_TABLE.DATASOURCE_NUM_ID, LOOKUP_TABLE.EFFECTIVE_FROM_DT, LOOKUP_TABLE.EFFECTIVE_TO_DT, LOOKUP_TABLE.ROW_WID, LOOKUP_TABLE.GEO_WID -- ORDER BY INTEGRATION_ID,DATASOURCE_NUM_ID,EFFECTIVE_FROM_DT,EFFECTIVE_TO_DT,ROW_WID,GEO_WID
Oracle Fatal Error].
2011-07-23 14:56:07 : ERROR : (8128 | LKPDP_25:READER_1_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : BLKR_16004 : ERROR: Prepare failed.
2011-07-23 14:56:07 : INFO : (8128 | WRITER_1_*_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : WRT_8333 : Rolling back all the targets due to fatal session error.
2011-07-23 14:56:07 : ERROR : (8128 | TRANSF_1_1_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : TM_6085 : A fatal error occurred at transformation [mplt_SIL_ActivityFact.LKP_W_PARTY_D_With_Geo_Wid], and the session is terminating.
2011-07-23 14:56:07 : ERROR : (8128 | TRANSF_1_1_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : TM_6085 : A fatal error occurred at transformation [mplt_SIL_ActivityFact.EXP_Decode_CustomerId], and the session is terminating.
2011-07-23 14:56:07 : ERROR : (8128 | TRANSF_1_1_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : TM_6085 : A fatal error occurred at transformation [mplt_SIL_ActivityFact.EXP_Decode_CustomerId], and the session is terminating.
2011-07-23 14:56:07 : ERROR : (8128 | TRANSF_1_1_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : TM_6085 : A fatal error occurred at transformation [mplt_SIL_ActivityFact.LKP_W_CUSTOMER_ACCOUNT_D_With_Party_ID], and the session is terminating.
2011-07-23 14:56:07 : ERROR : (8128 | TRANSF_1_1_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : TM_6085 : A fatal error occurred at transformation [mplt_SIL_ActivityFact.LKP_W_CUSTOMER_ACCOUNT_D_With_Party_ID], and the session is terminating.
2011-07-23 14:56:07 : ERROR : (8128 | TRANSF_1_1_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : TM_6085 : A fatal error occurred at transformation [mplt_SIL_ActivityFact.EXPTRANS], and the session is terminating.
2011-07-23 14:56:07 : ERROR : (8128 | TRANSF_1_1_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : TM_6085 : A fatal error occurred at transformation [mplt_SIL_ActivityFact.EXPTRANS], and the session is terminating.
2011-07-23 14:56:07 : ERROR : (8128 | TRANSF_1_1_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : TM_6085 : A fatal error occurred at transformation [FIL_ETL_PROC_WID], and the session is terminating.
2011-07-23 14:56:07 : ERROR : (8128 | TRANSF_1_1_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : TM_6085 : A fatal error occurred at transformation [FIL_ETL_PROC_WID], and the session is terminating.
2011-07-23 14:56:07 : ERROR : (8128 | TRANSF_1_1_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : TM_6085 : A fatal error occurred at transformation [MPLT_Get_ETL_Proc_WID.Exp_Decide_Etl_Proc_Wid], and the session is terminating.
2011-07-23 14:56:07 : INFO : (8128 | WRITER_1_*_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : WRT_8325 : Final rollback executed for the target [W_ACTIVITY_F] at end of load
2011-07-23 14:56:07 : ERROR : (8128 | TRANSF_1_1_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : TM_6085 : A fatal error occurred at transformation [MPLT_Get_ETL_Proc_WID.Exp_Decide_Etl_Proc_Wid], and the session is terminating.
2011-07-23 14:56:07 : INFO : (8128 | MANAGER) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : PETL_24007 : Received request to stop session run. Attempting to stop worker threads.
2011-07-23 14:56:07 : INFO : (8128 | WRITER_1_*_1) : (IS | Oracle_BI_DW_Base_Integration_Service) : node01_MACAW : WRT_8035 : Load complete time: Sat Jul 23 14:56:07 2011
Thanks in advance.
Vinay
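For anyone who hits the same ORA-01652 on this lookup: the large DISTINCT + ORDER BY in the statement above spills sort data to the TEMP tablespace, and the error means TEMP ran out of room. A minimal diagnostic/fix sketch, assuming a DBA account on the warehouse database (the tempfile path and sizes are placeholders, not values from this thread):

    -- How much TEMP space each session is consuming right now:
    SELECT s.sid, s.username, u.tablespace,
           u.blocks * t.block_size / 1024 / 1024 AS used_mb
    FROM   v$tempseg_usage u
           JOIN v$session s ON s.saddr = u.session_addr
           JOIN dba_tablespaces t ON t.tablespace_name = u.tablespace;
    -- Current size and autoextend setting of the TEMP tablespace:
    SELECT tablespace_name, bytes / 1024 / 1024 AS size_mb, autoextensible
    FROM   dba_temp_files;
    -- Add space (illustrative path and sizes; confirm with your DBA):
    ALTER TABLESPACE temp ADD TEMPFILE '/u01/oradata/orcl/temp02.dbf'
      SIZE 4G AUTOEXTEND ON NEXT 512M MAXSIZE 16G;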

Similar Messages

  • Can I use full load for daily loads in LO Cockpit extraction?

    I am using LO Cockpit extraction for 2LIS_11_VAITM with a delta load for the daily loads, and I have the following doubt:
    Can I use a full load for the daily loads instead of a delta in LO Cockpit extraction?
    My understanding is that a full load always takes data from the setup tables in ECC, and the setup tables will not have the delta data.
    Please reply. Thanks.

    You are right that a full load (at least for the extractor at hand) brings data from the setup tables, and you are also right that delta records are not loaded into the setup tables.
    So if you plan to do full loads every day, you would have to fill the setup tables every day. It is at this juncture that I have to remind you that filling the setup tables requires system downtime, to ensure that no changes or creates happen during the process.
    Hence a daily full load is not a very good approach, especially when SAP has made the job easy by doing all the dirty work of identifying changes and writing them to the extraction queue.
    Hope this helps!

  • How to have an Incremental Load after the Full Load?

    Hello,
    It may be naive, but this question occurred to me... I am still dealing with the full load and getting it to finish OK.
    But I am wondering: once I get the full load to work OK, do I need to do something so that the next run is incremental, or is this automatic?
    Thanks.
    Antonio

    Hi,
    1. Set up the source and target tables for the task in DAC.
    2. Once you execute the task (in DAC), the last-refresh-date timestamps of the tables are updated under Setup -> Physical Data Sources (sorry, I do not remember the exact location).
    3. Once the timestamp is set, the incremental (Informatica) workflow is kicked off (unless the task is set up for full load at all times).
    4. If the timestamp is null, the full load runs instead.
    5. You can use a variable (possibly $$LAST_EXTRACT_DATE) to set up the incremental load for the Informatica workflow; see the sketch after this post.
    Regards
    Gergo
    PS: Once a full load takes something like 15 hours (and works OK), the incremental run is handy when it takes just 30 minutes ;)
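    As a minimal sketch of point 5 (assuming the common OBIA convention; the table and column names here are illustrative, not from this thread), the incremental extract query filters on the DAC-supplied last extract date, while the full-load workflow runs the same query without the filter:

        -- Incremental extract: DAC substitutes $$LAST_EXTRACT_DATE from the
        -- task's last refresh timestamp before the session runs.
        SELECT PER_JOBS.JOB_ID,
               PER_JOBS.NAME,
               PER_JOBS.LAST_UPDATE_DATE AS CHANGED_ON_DT
        FROM   PER_JOBS
        WHERE  PER_JOBS.LAST_UPDATE_DATE >
               TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')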

  • COPA delta and Full load inconsistent result

    Hi ,
    When I do a full load for COPA (datasource 1_CO_PA1159000) for one particular order number, it returns 5 line items, but when I try an initialization load it returns only 4 line items. One line item is missing.
    Can anyone let me know what could cause this inconsistency between a full load and an initialization load on the same datasource extraction?
    Feedback from anyone is appreciated.
    Thanks.
    Regards,
    Maili

    Hi,
    Data can be picked from summarization levels or from the base tables, and the source of the data may differ between full and init loads. Please check section 3.4, "Data source for CO-PA extraction", in the "How to" CO-PA extraction document:
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sapportals.km.docs/library/business-intelligence/g-i/how%20to%20connect%20between%20co-pa%20and%20sap%20bw%20for%20a%20replication%20model.0x

  • Application builder - Why does my .exe open in running status?

    I have no problem building the application, but when I open it, it is always in running status, so I cannot input control variables. The other applications I built don't have this problem. The .vi and .exe files are attached. Thanks a lot for your help.
    Etain
    Attachments:
    reformat.zip (258 KB)

    Richard gave you the right answer on how to change the behavior, but typically you would want your app to start automatically and be designed so that nothing happens until you do something like hit a start button; this could be done with an idle state if you are using a state machine, or with a separate while loop. You also might have a problem with the use of your local variables: there is no data flow between getting Header Byte Number and the while loop that reads it, so the while loop may actually start before File Info gets the information. If you want File Info to run first, wire it directly to where you are initializing the shift register.

  • Project Analytics 7.9.6.1 - Error while running a full load

    Hi All,
    I am performing a full load for Projects Analytics and get the following error,
    =====================================
    ERROR OUTPUT
    =====================================
    1103 SEVERE Wed Nov 18 02:49:36 WST 2009 Could not attach to workflow because of errorCode 36331 For workflow SDE_ORA_CodeDimension_Gl_Account
    1104 SEVERE Wed Nov 18 02:49:36 WST 2009
    ANOMALY INFO::: Error while executing : INFORMATICA TASK:SDE_ORA11510_Adaptor:SDE_ORA_CodeDimension_Gl_Account:(Source : FULL Target : FULL)
    MESSAGE:::
    Irrecoverable Error
    Error while contacting Informatica server for getting workflow status for SDE_ORA_CodeDimension_Gl_Account
    Error Code = 36331:Unknown reason for error code 36331
    Pmcmd output :
    The session log initializes a NULL value for the mapping parameter MPLT_ADI_CODES.$$CATEGORY. This is then used in subsequent SQL and results in an ORA-00936 (missing expression) error. Following are the initialization section and the load section of the log containing the error.
    Initialisation
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [ORA_11_5_10] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [ORA_11_5_10.DATAWAREHOUSE.SDE_ORA11510_Adaptor.SDE_ORA_CodeDimension_Gl_Account_Segments.log] for session parameter:[$PMSessionLogFile].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[MPLT_ADI_CODES.$$CATEGORY].
    DIRECTOR> VAR_27028 Use override value [4] for mapping parameter:[MPLT_SA_ORA_CODES.$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27028 Use override value [DEFAULT] for mapping parameter:[MPLT_SA_ORA_CODES.$$TENANT_ID].
    DIRECTOR> TM_6014 Initializing session [SDE_ORA_CodeDimension_Gl_Account_Segments] at [Wed Nov 18 02:49:11 2009].
    DIRECTOR> TM_6683 Repository Name: [repo_service]
    DIRECTOR> TM_6684 Server Name: [int_service]
    DIRECTOR> TM_6686 Folder: [SDE_ORA11510_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_ORA_CodeDimension_Gl_Account_Segments] Run Instance Name: [] Run Id: [17]
    DIRECTOR> TM_6101 Mapping name: SDE_ORA_CodeDimension_GL_Account_Segments [version 1].
    DIRECTOR> TM_6963 Pre 85 Timestamp Compatibility is Enabled
    DIRECTOR> TM_6964 Date format for the Session is [MM/DD/YYYY HH24:MI:SS]
    DIRECTOR> TM_6827 [C:\Informatica\PowerCenter8.6.1\server\infa_shared\Storage] will be used as storage directory for session [SDE_ORA_CodeDimension_Gl_Account_Segments].
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6703 Session [SDE_ORA_CodeDimension_Gl_Account_Segments] is run by 32-bit Integration Service [node01_ASG596138], version [8.6.1], build [1218].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [UNICODE]
    MAPPING> CMN_1570 Server Code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> TM_6151 The session sort order is [Binary].
    MAPPING> TM_6185 Warning. Code page validation is disabled in this session.
    MAPPING> TM_6156 Using low precision processing.
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6307 DTM error log disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> DBG_21075 Connecting to database [orcl], user [DAC_REP]
    MAPPING> CMN_1716 Lookup [mplt_ADI_Codes.Lkp_Master_Map] uses database connection [Relational:DataWarehouse] in code page [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> CMN_1716 Lookup [mplt_ADI_Codes.Lkp_Master_Code] uses database connection [Relational:DataWarehouse] in code page [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> CMN_1716 Lookup [mplt_ADI_Codes.Lkp_W_CODE_D] uses database connection [Relational:DataWarehouse] in code page [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_CodeDimension_Gl_Account_Segments]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Wed Nov 18 02:49:14 2009)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Wed Nov 18 02:49:14 2009)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 32000000 bytes and Block size is 128000 bytes.
    READER_1_1_1> DBG_21438 Reader: Source is [asgdev], user [APPS]
    READER_1_1_1> BLKR_16051 Source database connection [ORA_11_5_10] code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8147 Writer: Target is database [orcl], user [DAC_REP], bulk mode [OFF]
    WRITER_1_*_1> WRT_8221 Target database connection [DataWarehouse] code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    WRITER_1_*_1> WRT_8124 Target Table W_CODE_D :SQL INSERT statement:
    INSERT INTO W_CODE_D(DATASOURCE_NUM_ID,SOURCE_CODE,SOURCE_CODE_1,SOURCE_CODE_2,SOURCE_CODE_3,SOURCE_NAME_1,SOURCE_NAME_2,CATEGORY,LANGUAGE_CODE,MASTER_DATASOURCE_NUM_ID,MASTER_CODE,MASTER_VALUE,W_INSERT_DT,W_UPDATE_DT,TENANT_ID) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8124 Target Table W_CODE_D :SQL UPDATE statement:
    UPDATE W_CODE_D SET SOURCE_CODE_1 = ?, SOURCE_CODE_2 = ?, SOURCE_CODE_3 = ?, SOURCE_NAME_1 = ?, SOURCE_NAME_2 = ?, MASTER_DATASOURCE_NUM_ID = ?, MASTER_CODE = ?, MASTER_VALUE = ?, W_INSERT_DT = ?, W_UPDATE_DT = ?, TENANT_ID = ? WHERE DATASOURCE_NUM_ID = ? AND SOURCE_CODE = ? AND CATEGORY = ? AND LANGUAGE_CODE = ?
    WRITER_1_*_1> WRT_8124 Target Table W_CODE_D :SQL DELETE statement:
    DELETE FROM W_CODE_D WHERE DATASOURCE_NUM_ID = ? AND SOURCE_CODE = ? AND CATEGORY = ? AND LANGUAGE_CODE = ?
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_CODE_D]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    READER_1_1_1> BLKR_16007 Reader run started.
    WRITER_1_*_1> WRT_8005 Writer run started.
    WRITER_1_*_1> WRT_8158
    Load section
    *****START LOAD SESSION*****
    Load Start Time: Wed Nov 18 02:49:16 2009
    Target tables:
    W_CODE_D
    READER_1_1_1> RR_4029 SQ Instance [mplt_BC_ORA_Codes_GL_Account_Segments.Sq_Fnd_Flex_Values] User specified SQL Query [SELECT
    FND_FLEX_VALUES.FLEX_VALUE_SET_ID,
    FND_FLEX_VALUES.FLEX_VALUE,
    MAX(FND_FLEX_VALUES_TL.DESCRIPTION),
    FND_ID_FLEX_SEGMENTS.ID_FLEX_NUM,
    FND_ID_FLEX_SEGMENTS.APPLICATION_COLUMN_NAME
    FROM
    FND_FLEX_VALUES,
    FND_FLEX_VALUES_TL,
    FND_ID_FLEX_SEGMENTS,
    FND_SEGMENT_ATTRIBUTE_VALUES
    WHERE
    FND_FLEX_VALUES.FLEX_VALUE_ID = FND_FLEX_VALUES_TL.FLEX_VALUE_ID AND FND_FLEX_VALUES_TL.LANGUAGE ='US' AND
    FND_ID_FLEX_SEGMENTS.FLEX_VALUE_SET_ID =FND_FLEX_VALUES.FLEX_VALUE_SET_ID AND
    FND_ID_FLEX_SEGMENTS.APPLICATION_ID = 101 AND
    FND_ID_FLEX_SEGMENTS.ID_FLEX_CODE ='GL#' AND
    FND_ID_FLEX_SEGMENTS.ID_FLEX_NUM =FND_SEGMENT_ATTRIBUTE_VALUES.ID_FLEX_NUM AND
    FND_SEGMENT_ATTRIBUTE_VALUES.APPLICATION_ID =101 AND
    FND_SEGMENT_ATTRIBUTE_VALUES.ID_FLEX_CODE = 'GL#' AND
    FND_ID_FLEX_SEGMENTS.APPLICATION_COLUMN_NAME=FND_SEGMENT_ATTRIBUTE_VALUES.APPLICATION_COLUMN_NAME AND
    FND_SEGMENT_ATTRIBUTE_VALUES.ATTRIBUTE_VALUE ='Y'
    GROUP BY
    FND_FLEX_VALUES.FLEX_VALUE_SET_ID,
    FND_FLEX_VALUES.FLEX_VALUE,
    FND_ID_FLEX_SEGMENTS.ID_FLEX_NUM,
    FND_ID_FLEX_SEGMENTS.APPLICATION_COLUMN_NAME]
    READER_1_1_1> RR_4049 SQL Query issued to database : (Wed Nov 18 02:49:17 2009)
    READER_1_1_1> RR_4050 First row returned from database to reader : (Wed Nov 18 02:49:17 2009)
    LKPDP_3> DBG_21312 Lookup Transformation [mplt_ADI_Codes.Lkp_W_CODE_D]: Lookup override sql to create cache: SELECT W_CODE_D.SOURCE_NAME_1 AS SOURCE_NAME_1, W_CODE_D.SOURCE_NAME_2 AS SOURCE_NAME_2, W_CODE_D.MASTER_DATASOURCE_NUM_ID AS MASTER_DATASOURCE_NUM_ID, W_CODE_D.MASTER_CODE AS MASTER_CODE, W_CODE_D.MASTER_VALUE AS MASTER_VALUE, W_CODE_D.W_INSERT_DT AS W_INSERT_DT, W_CODE_D.TENANT_ID AS TENANT_ID, W_CODE_D.DATASOURCE_NUM_ID AS DATASOURCE_NUM_ID, W_CODE_D.SOURCE_CODE AS SOURCE_CODE, W_CODE_D.CATEGORY AS CATEGORY, W_CODE_D.LANGUAGE_CODE AS LANGUAGE_CODE FROM W_CODE_D
    WHERE
    W_CODE_D.CATEGORY IN () ORDER BY DATASOURCE_NUM_ID,SOURCE_CODE,CATEGORY,LANGUAGE_CODE,SOURCE_NAME_1,SOURCE_NAME_2,MASTER_DATASOURCE_NUM_ID,MASTER_CODE,MASTER_VALUE,W_INSERT_DT,TENANT_ID
    LKPDP_3> TE_7212 Increasing [Index Cache] size for transformation [mplt_ADI_Codes.Lkp_W_CODE_D] from [1000000] to [4734976].
    LKPDP_3> TE_7212 Increasing [Data Cache] size for transformation [mplt_ADI_Codes.Lkp_W_CODE_D] from [2000000] to [2007040].
    READER_1_1_1> BLKR_16019 Read [625] rows, read [0] error rows for source table [FND_ID_FLEX_SEGMENTS] instance name [mplt_BC_ORA_Codes_GL_Account_Segments.FND_ID_FLEX_SEGMENTS]
    READER_1_1_1> BLKR_16008 Reader run completed.
    LKPDP_3> TM_6660 Total Buffer Pool size is 609824 bytes and Block size is 65536 bytes.
    LKPDP_3:READER_1_1> DBG_21438 Reader: Source is [orcl], user [DAC_REP]
    LKPDP_3:READER_1_1> BLKR_16051 Source database connection [DataWarehouse] code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    LKPDP_3:READER_1_1> BLKR_16003 Initialization completed successfully.
    LKPDP_3:READER_1_1> BLKR_16007 Reader run started.
    LKPDP_3:READER_1_1> RR_4049 SQL Query issued to database : (Wed Nov 18 02:49:18 2009)
    LKPDP_3:READER_1_1> CMN_1761 Timestamp Event: [Wed Nov 18 02:49:18 2009]
    LKPDP_3:READER_1_1> RR_4035 SQL Error [
    ORA-00936: missing expression
    Could you please suggest what the issue might be and how it can be fixed?
    Many thanks,
    Kiran

    I have continued with related details in the following thread:
    Mapping Parameter  $$CATEGORY not included in the parameter file (7.9.6.1)
    Apologies for the inconvenience.
    Thanks,
    Kiran
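    For readers landing on the same symptom: the failure comes from the lookup override ending in W_CODE_D.CATEGORY IN (), because MPLT_ADI_CODES.$$CATEGORY resolved to an empty default (see the VAR_27027 line in the initialization section). A sketch of the difference, where 'GL_ACCOUNT' is only a placeholder, not a value taken from this log:

        -- Generated with $$CATEGORY empty (fails with ORA-00936: missing expression):
        ... WHERE W_CODE_D.CATEGORY IN () ORDER BY ...
        -- Generated once $$CATEGORY is supplied via the DAC / parameter file:
        ... WHERE W_CODE_D.CATEGORY IN ('GL_ACCOUNT') ORDER BY ...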

  • Errors: ORA-00054 & ORA-01452 while running DAC Full Load

    Hi Friends,
    Previously I ran a full load and it went well; I also built some sample reports in BI Apps 7.9.6.2.
    Now I have modified a few parameters as per the business requirements, and when I try to run a full load again I am stuck with a few similar errors. I have already cleared a couple of DB errors.
    Please help me solve the errors below.
    1. ANOMALY INFO::: Error while executing : TRUNCATE TABLE:W_SALES_BOOKING_LINE_F
    MESSAGE:::com.siebel.etl.database.IllegalSQLQueryException: DataWarehouse:TRUNCATE TABLE W_SALES_BOOKING_LINE_F
    ORA-00054: resource busy and acquire with NOWAIT specified or timeout expired
    --I checked W_SALES_BOOKING_LINE_F; it contains data.
    2. ANOMALY INFO::: Error while executing : CREATE INDEX:W_GL_REVN_F:W_GL_REVN_F_U1
    MESSAGE:::java.lang.Exception: Error while execution : CREATE UNIQUE INDEX
         W_GL_REVN_F_U1
    ON
         W_GL_REVN_F
         (INTEGRATION_ID ASC
         ,DATASOURCE_NUM_ID ASC)
    NOLOGGING
    with error DataWarehouse:CREATE UNIQUE INDEX
         W_GL_REVN_F_U1
    ON
         W_GL_REVN_F
         (INTEGRATION_ID ASC
         ,DATASOURCE_NUM_ID ASC)
    NOLOGGING
    ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found
    -- Yes, I found duplicate values in this table W_GL_REVN_F. But how can I rectify it? I tried a few things, but failed.
    Please tell me the steps to achieve this.
    Thanks in advance.
    Stone

    Hi, please see the answers below each quoted error.
    1. ANOMALY INFO::: Error while executing : TRUNCATE TABLE:W_SALES_BOOKING_LINE_F
    MESSAGE:::com.siebel.etl.database.IllegalSQLQueryException: DataWarehouse:TRUNCATE TABLE W_SALES_BOOKING_LINE_F
    ORA-00054: resource busy and acquire with NOWAIT specified or timeout expired
    --I checked W_SALES_BOOKING_LINE_F; it contains data.
    Just restart the load. It seems your DB processes are busy and the table still has a lock on it, which means something has not yet been committed or rolled back.
    If this issue repeats, you can mail your DBA and ask them to look into the issue.
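    If you want to see for yourself which session holds the lock before involving the DBA, a sketch against the standard Oracle dictionary views (run with a user that can read them) is:

        -- Which session currently holds a lock on the target table:
        SELECT o.object_name, s.sid, s.serial#, s.status, s.program
        FROM   v$locked_object lo
               JOIN dba_objects o ON o.object_id = lo.object_id
               JOIN v$session   s ON s.sid = lo.session_id
        WHERE  o.object_name = 'W_SALES_BOOKING_LINE_F';
        -- Only if the DBA confirms the session is safe to terminate:
        -- ALTER SYSTEM KILL SESSION '<sid>,<serial#>';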
    2. ANOMALY INFO::: Error while executing : CREATE INDEX:W_GL_REVN_F:W_GL_REVN_F_U1
    MESSAGE:::java.lang.Exception: Error while execution : CREATE UNIQUE INDEX
    W_GL_REVN_F_U1
    ON
    W_GL_REVN_F
    (INTEGRATION_ID ASC
    ,DATASOURCE_NUM_ID ASC)
    NOLOGGING
    with error DataWarehouse:CREATE UNIQUE INDEX
    W_GL_REVN_F_U1
    ON
    W_GL_REVN_F
    (INTEGRATION_ID ASC
    ,DATASOURCE_NUM_ID ASC)
    NOLOGGING
    ORA-01452: cannot CREATE UNIQUE INDEX; duplicate keys found
    -- Yes, I found duplicate values in this table W_GL_REVN_F. But how can I rectify it? I tried a few things, but failed.
    Please tell me the steps to achieve this.
    Please execute the following SQL to get the duplicate values. If the count is small you can delete the records based on ROW_WID.
    How many duplicates do you have in total?
    1. SELECT INTEGRATION_ID, DATASOURCE_NUM_ID, COUNT(*)
       FROM W_GL_REVN_F
       GROUP BY INTEGRATION_ID, DATASOURCE_NUM_ID
       HAVING COUNT(*) > 1;
    2. SELECT ROW_WID, DATASOURCE_NUM_ID, INTEGRATION_ID
       FROM W_GL_REVN_F
       WHERE INTEGRATION_ID = (value from the 1st query);
    3. DELETE FROM W_GL_REVN_F WHERE ROW_WID = (value from the 2nd query);
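    If the duplicate count is large, deleting row by row gets tedious. As a sketch only (back up W_GL_REVN_F first, and confirm that keeping the lowest ROW_WID per key is acceptable for your data), a single statement removes every duplicate in one pass:

        -- Keeps the smallest ROW_WID for each (INTEGRATION_ID, DATASOURCE_NUM_ID);
        -- every other row sharing that key is deleted.
        DELETE FROM W_GL_REVN_F
        WHERE  ROW_WID NOT IN (SELECT MIN(ROW_WID)
                               FROM   W_GL_REVN_F
                               GROUP  BY INTEGRATION_ID, DATASOURCE_NUM_ID);
        COMMIT;

    After the delete, re-run the failed CREATE INDEX step (or restart the load) so W_GL_REVN_F_U1 can be built.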
    Hope this helps !!

  • Error in Source System while running the Full Load

    HI Experts,
    We are using 0CRM_UT_SRV_CONT_I datasource from CRM to extract data.
    It gives the error 'Error occurred in source system' (message number RSM340) while running the full load.
    I want to mention certain points here:
    1) In RSA3 it runs fine.
    2) The connection between the BW and CRM systems is OK.
    3) We have replicated the datasource and sent the active version from the dev system.
    4) When I run an init without data transfer, the init is set successfully with an entry in RSA7 in CRM.
    5) The delta runs fine with green status.
    6) As the source system is a static client, there is no data for the delta.
    The main issue happens when we run either the full load or an init with data transfer.
    Kindly suggest.
    Thanks
    Mayank

    Hi Mayank,
    check the error logs in the source system (CRM) with transactions SM21 and ST22, and then let us know what you find.
    Charly

  • Task fails while running Full load ETL

    Hi All,
    I am running the full load ETL for Oracle R12 (vanilla instance) HR, but 4 tasks are failing: SDE_ORA_JobDimension, SDE_ORA_HRPositionDimension, SDE_ORA_CodeDimension_Pay_Level and SDE_ORA_CodeDimensionJob. I changed the parameters for all these tasks as mentioned in the installation guide and rebuilt the execution plan. Please help me out.
    The log looks like this for SDE_ORA_JobDimension:
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [ORA_R12] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [9] for mapping parameter:[$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[mplt_BC_ORA_JobDimension.$$JOBCODE_FLXFLD_SEGMENT_COL].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[mplt_BC_ORA_JobDimension.$$JOBFAMILYCODE_FLXFLD_SEGMENT_COL].
    DIRECTOR> VAR_27027 Use default value [] for mapping parameter:[mplt_BC_ORA_JobDimension.$$LAST_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [DEFAULT] for mapping parameter:[$$TENANT_ID].
    DIRECTOR> TM_6014 Initializing session [SDE_ORA_JobDimension_Full] at [Fri Sep 26 10:52:05 2008]
    DIRECTOR> TM_6683 Repository Name: [Oracle_BI_DW_Base]
    DIRECTOR> TM_6684 Server Name: [Oracle_BI_DW_Base_Integration_Service]
    DIRECTOR> TM_6686 Folder: [SDE_ORAR12_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_ORA_JobDimension_Full]
    DIRECTOR> TM_6101 Mapping name: SDE_ORA_JobDimension [version 1]
    DIRECTOR> TM_6827 [C:\Informatica\PowerCenter8.1.1\server\infa_shared\Storage] will be used as storage directory for session [SDE_ORA_JobDimension_Full].
    DIRECTOR> CMN_1805 Recovery cache will be deleted when running in normal mode.
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6703 Session [SDE_ORA_JobDimension_Full] is run by 32-bit Integration Service [node01_HSCHBSCGN20031], version [8.1.1], build [0831].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [ASCII]
    MAPPING> CMN_1570 Server Code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> TM_6151 Session Sort Order: [Binary]
    MAPPING> TM_6156 Using LOW precision decimal arithmetic
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6307 DTM Error Log Disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_JobDimension_Full]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Fri Sep 26 10:52:13 2008)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Fri Sep 26 10:52:14 2008)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 32000000 bytes and Block size is 1280000 bytes.
    READER_1_1_1> DBG_21438 Reader: Source is [dev], user [apps]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8146 Writer: Target is database [orcl], user [obia], bulk mode [ON]
    WRITER_1_*_1> WRT_8106 Warning! Bulk Mode session - recovery is not guaranteed.
    WRITER_1_*_1> WRT_8124 Target Table W_JOB_DS :SQL INSERT statement:
    INSERT INTO W_JOB_DS(JOB_CODE,JOB_NAME,JOB_DESC,JOB_FAMILY_CODE,JOB_FAMILY_NAME,JOB_FAMILY_DESC,JOB_LEVEL,W_FLSA_STAT_CODE,W_FLSA_STAT_DESC,W_EEO_JOB_CAT_CODE,W_EEO_JOB_CAT_DESC,AAP_JOB_CAT_CODE,AAP_JOB_CAT_NAME,ACTIVE_FLG,CREATED_BY_ID,CHANGED_BY_ID,CREATED_ON_DT,CHANGED_ON_DT,AUX1_CHANGED_ON_DT,AUX2_CHANGED_ON_DT,AUX3_CHANGED_ON_DT,AUX4_CHANGED_ON_DT,SRC_EFF_FROM_DT,SRC_EFF_TO_DT,DELETE_FLG,DATASOURCE_NUM_ID,INTEGRATION_ID,TENANT_ID,X_CUSTOM) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_JOB_DS]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    WRITER_1_*_1> WRT_8005 Writer run started.
    READER_1_1_1> BLKR_16007 Reader run started.
    READER_1_1_1> RR_4029 SQ Instance [mplt_BC_ORA_JobDimension.Sq_Jobs] User specified SQL Query [SELECT
    PER_JOBS.JOB_ID,
    PER_JOBS.BUSINESS_GROUP_ID,
    PER_JOBS.JOB_DEFINITION_ID,
    PER_JOBS.DATE_FROM,
    PER_JOBS.DATE_TO,
    PER_JOBS.LAST_UPDATE_DATE AS CHANGED_ON_DT,      PER_JOBS.LAST_UPDATED_BY, PER_JOBS.CREATED_BY, PER_JOBS.CREATION_DATE,
    PER_JOB_DEFINITIONS.LAST_UPDATE_DATE AS AUX1_CHANGED_ON_DT,
    PER_JOB_DEFINITIONS.JOB_DEFINITION_ID,
    PER_JOBS.NAME,
    PER_JOBS.JOB_INFORMATION1, PER_JOBS.JOB_INFORMATION3,
    PER_JOBS. AS JOB_FAMILY_CODE, PER_JOB_DEFINITIONS.  AS JOB_CODE,
      '0' AS X_CUSTOM
    FROM
    PER_JOBS, PER_JOB_DEFINITIONS
    WHERE
    PER_JOBS.JOB_DEFINITION_ID = PER_JOB_DEFINITIONS.JOB_DEFINITION_ID]
    WRITER_1_*_1> WRT_8158
    *****START LOAD SESSION*****
    Load Start Time: Fri Sep 26 10:53:05 2008
    Target tables:
    W_JOB_DS
    READER_1_1_1> RR_4049 SQL Query issued to database : (Fri Sep 26 10:53:05 2008)
    READER_1_1_1> CMN_1761 Timestamp Event: [Fri Sep 26 10:53:06 2008]
    READER_1_1_1> RR_4035 SQL Error [
    ORA-01747: invalid user.table.column, table.column, or column specification
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    PER_JOBS.JOB_ID,
    PER_JOBS.BUSINESS_GROUP_ID,
    PER_JOBS.JOB_DEFINITION_ID,
    PER_JOBS.DATE_FROM,
    PER_JOBS.DATE_TO,
    PER_JOBS.LAST_UPDATE_DATE AS CHANGED_ON_DT, PER_JOBS.LAST_UPDATED_BY, PER_JOBS.CREATED_BY, PER_JOBS.CREATION_DATE,
    PER_JOB_DEFINITIONS.LAST_UPDATE_DATE AS AUX1_CHANGED_ON_DT,
    PER_JOB_DEFINITIONS.JOB_DEFINITION_ID,
    PER_JOBS.NAME,
    PER_JOBS.JOB_INFORMATION1, PER_JOBS.JOB_INFORMATION3,
    PER_JOBS. AS JOB_FAMILY_CODE, PER_JOB_DEFINITIONS. AS JOB_CODE,
    '0' AS X_CUSTOM
    FROM
    PER_JOBS, PER_JOB_DEFINITIONS
    WHERE
    PER_JOBS.JOB_DEFINITION_ID = PER_JOB_DEFINITIONS.JOB_DEFINITION_ID
    Oracle Fatal Error
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    PER_JOBS.JOB_ID,
    PER_JOBS.BUSINESS_GROUP_ID,
    PER_JOBS.JOB_DEFINITION_ID,
    PER_JOBS.DATE_FROM,
    PER_JOBS.DATE_TO,
    PER_JOBS.LAST_UPDATE_DATE AS CHANGED_ON_DT, PER_JOBS.LAST_UPDATED_BY, PER_JOBS.CREATED_BY, PER_JOBS.CREATION_DATE,
    PER_JOB_DEFINITIONS.LAST_UPDATE_DATE AS AUX1_CHANGED_ON_DT,
    PER_JOB_DEFINITIONS.JOB_DEFINITION_ID,
    PER_JOBS.NAME,
    PER_JOBS.JOB_INFORMATION1, PER_JOBS.JOB_INFORMATION3,
    PER_JOBS. AS JOB_FAMILY_CODE, PER_JOB_DEFINITIONS. AS JOB_CODE,
    '0' AS X_CUSTOM
    FROM
    PER_JOBS, PER_JOB_DEFINITIONS
    WHERE
    PER_JOBS.JOB_DEFINITION_ID = PER_JOB_DEFINITIONS.JOB_DEFINITION_ID
    Oracle Fatal Error].
    READER_1_1_1> CMN_1761 Timestamp Event: [Fri Sep 26 10:53:06 2008]
    READER_1_1_1> BLKR_16004 ERROR: Prepare failed.
    WRITER_1_*_1> WRT_8333 Rolling back all the targets due to fatal session error.
    WRITER_1_*_1> WRT_8325 Final rollback executed for the target [W_JOB_DS] at end of load
    WRITER_1_*_1> WRT_8035 Load complete time: Fri Sep 26 10:53:06 2008
    LOAD SUMMARY
    ============
    WRT_8036 Target: W_JOB_DS (Instance Name: [W_JOB_DS])
    WRT_8044 No data loaded for this target
    WRITER_1_*_1> WRT_8043 *****END LOAD SESSION*****
    MANAGER> PETL_24031
    ***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
    Thread [READER_1_1_1] created for [the read stage] of partition point [mplt_BC_ORA_JobDimension.Sq_Jobs] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [mplt_BC_ORA_JobDimension.Sq_Jobs] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [WRITER_1_*_1] created for [the write stage] of partition point [W_JOB_DS] has completed. The total run time was insufficient for any meaningful statistics.
    MANAGER> PETL_24005 Starting post-session tasks. : (Fri Sep 26 10:53:06 2008)
    MANAGER> PETL_24029 Post-session task completed successfully. : (Fri Sep 26 10:53:06 2008)
    MAPPING> TM_6018 Session [SDE_ORA_JobDimension_Full] run completed with [0] row transformation errors.
    MANAGER> PETL_24002 Parallel Pipeline Engine finished.
    DIRECTOR> PETL_24013 Session run completed with failure.
    DIRECTOR> TM_6022
    SESSION LOAD SUMMARY
    ================================================
    DIRECTOR> TM_6252 Source Load Summary.
    DIRECTOR> CMN_1740 Table: [Sq_Jobs] (Instance Name: [mplt_BC_ORA_JobDimension.Sq_Jobs])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6253 Target Load Summary.
    DIRECTOR> CMN_1740 Table: [W_JOB_DS] (Instance Name: [W_JOB_DS])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6023
    ===================================================
    DIRECTOR> TM_6020 Session [SDE_ORA_JobDimension_Full] completed at [Fri Sep 26 10:53:07 2008]

    To make use of the warehouse you would probably want to connect to an EBS instance in order to populate it, since the execution plan you intend to run is designed for the EBS data model.
    If you really didn't want to connect to the EBS instance to pull data, you could build an execution plan using the Universal adapter, which lets you load from flat files. I wouldn't recommend making this a habit for an actual implementation, though, as it creates another potential point of failure (populating the flat files).
    Thanks,
    Austin
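    On the ORA-01747 root cause: the log lines "Use default value [] for mapping parameter ... $$JOBCODE_FLXFLD_SEGMENT_COL / $$JOBFAMILYCODE_FLXFLD_SEGMENT_COL" show the flexfield segment parameters resolving to empty strings, which leaves fragments like "PER_JOBS. AS JOB_FAMILY_CODE" in the generated SQL. A sketch of the repaired fragment once those parameters carry real column names (SEGMENT1 and ATTRIBUTE1 below are placeholders, not values from this log):

        -- Broken, as generated with empty flexfield parameters:
        --   PER_JOBS. AS JOB_FAMILY_CODE, PER_JOB_DEFINITIONS. AS JOB_CODE
        -- After setting the parameters in DAC (illustrative column names):
        PER_JOBS.ATTRIBUTE1 AS JOB_FAMILY_CODE,
        PER_JOB_DEFINITIONS.SEGMENT1 AS JOB_CODE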

  • ETL Full Load, Nothing Shows up in Current Run

    Hi,
    I have followed the:
    Oracle® Business Intelligence Applications
    Installation Guide for Informatica PowerCenter Users
    Release 7.9.6.3
    E19038-01
    Documentation to install OBIA.
    Right now, I have DAC and Informatica servers installed on a Linux machine. I have installed OBIEE on another Linux machine. I have EBS installed on another Linux machine. Oracle installed on another Linux machine. DAC, Informatica, and OBIEE are all connected to the Oracle machine. The EBS has its own oracle database on its server. I installed OBIA on my local windows computer with both the DAC and Informatica client. I have transferred the required files to the OBIEE machine. When I configure DAC all the connections are working and I get no errors.
    I got to the part where I am running "4.19.1 An Example of Running a Full Load ETL".
    I do all the steps it asks, and after I hit "Run Now" a message box appears that says "Start request successfully submitted to server".
    According to the documentation, the run should show up in the "Current Run" tab, but nothing appears.
    I have tried to create 2 more Execution Plans with different subject areas and still nothing.
    I tried running the client on another machine and still nothing.
    Do you guys know what might be the issue?
    Any help would be awesome. Thanks in advance.

    You shouldn't need to customise anything in the Informatica workflows to get the ETL working; I believe all the configuration for Informatica is done in the DAC.
    For the RPD and DAC variables, anything that has OLAP in its name refers to the BI Apps data warehouse database. I'm guessing that your DSNs are correct, but I think OLAP_USER and OLAPTBO would need to be something different. Normally you have three key schemas in the warehouse: one for the actual warehouse tables, one for the DAC repository and one for the Informatica repository. Unlike EBS, you can name these schemas whatever you like; we call ours BIAPPS, DAC and INFA respectively, and we'd have OLAP_USER and OLAPTBO set to BIAPPS. A sketch of such a schema setup follows.
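    A minimal sketch of creating that three-schema layout (schema names, password and tablespace are assumptions; grant whatever your install guide actually calls for):

        -- Illustrative only: one schema each for the warehouse, the DAC repository
        -- and the Informatica repository.
        CREATE USER biapps IDENTIFIED BY "change_me" DEFAULT TABLESPACE users QUOTA UNLIMITED ON users;
        CREATE USER dac    IDENTIFIED BY "change_me" DEFAULT TABLESPACE users QUOTA UNLIMITED ON users;
        CREATE USER infa   IDENTIFIED BY "change_me" DEFAULT TABLESPACE users QUOTA UNLIMITED ON users;
        GRANT CONNECT, RESOURCE TO biapps, dac, infa;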

  • DAC - Can I run more than one Execution Plan with the full load option enabled?

    Hello All,
    I have configured the subject area Sales – Orders, Backlog and Invoices (OBIEE).
    Now I want to run DAC to load the underlying tables for the subject areas Financials – Revenue Analysis and Financials – GL Budget and Expenses; I have identified the relevant fact tables and built an execution plan in DAC.
    In which mode should I run the execution plan: with the full load option enabled or disabled? A full load has already been run for Sales – Orders, Backlog and Invoices, and this run should not disturb the existing data.
    If I run with the full load option enabled, will it override the existing data for the other subject area?
    Please help me, I am a bit confused!
    Regards,
    SMA

    There is no need to combine all the subject areas into one execution plan. Simply run the second ETL without checking the full load option. It will do a full load for the tables related to the newly added subject areas and an incremental load for the tables common to the old and new subject areas.

  • Core 2 Duo only running at 997 MHz at full load

    Hi, my new/used ThinkPad T60 (Core 2 Duo T2400, 1.8 GHz) only runs at 997 MHz, even at full load. I disabled SpeedStep in the BIOS and power management is set to maximum performance, yet it still runs at that clock speed even at 100% load.
    Link to picture
    thank you for looking
    Eastham.

    Do you have a charged genuine battery installed, and is the CPU operating at normal temps?

  • How to load classes at run time - Urgent

    Hi All,
    How can I load a class file from a .jar file in JDK 1.1 without using URLClassLoader? I have tried to extract the .jar with the FileInputStream and ZipEntry classes and defineClass using the bytes I read from the stream, but the class loader fails to pick up the classes that are already loaded in the system.
    The custom ClassLoader I created was not able to assign an object to a reference variable already in the JVM. I want to know how objects loaded by the custom ClassLoader and by the system ClassLoader can be assigned to each other or communicate.
    Please help me in this regard.
    Thanks in advance,

    Hi,
    The above code works in the straightforward way if the class is available on the classpath.
    I need to load a class using a custom class loader that I extend from the ClassLoader base class.
    I have to read a class file from a .jar file (on a server) and load it at run time. I am able to read the bytes of the specified class file from the .jar, but the class I read extends another class that is already loaded, and when I try to create a newInstance of it I get exceptions. The custom ClassLoader cannot find the already-loaded existing class in the JVM. Can I get a solution for this situation?

  • Full Load: Error while executing :TRUNCATE TABLE: S_ETL_PARAM

    Hi All,
    We are using BI Apps 7.9.6.1. The full load was running fine, but now we are facing a problem truncating the table "S_ETL_PARAM".
    I have restarted the Informatica server and also the DAC server, but I still get the same error in the DAC log:
    "ANOMALY INFO::: Error while executing : TRUNCATE TABLE:S_ETL_PARAM
    MESSAGE:::com.siebel.etl.database.IllegalSQLQueryException: DBConnection_OLTP:SIEBTRUN ('siebel.S_ETL_PARAM')
    Values :
    Null Map
    EXCEPTION CLASS::: java.lang.Exception"
    Any suggestions?
    Thanks in Advance,
    Deepak

    Are you trying to run an incremental load when you get this truncate error? Can you re-run the full load and see whether that still runs OK? Please also check your DW-side database logs, such as the alert log, for any DB-level issue; such errors do not throw friendly messages on the DAC/Informatica side.

  • Cube full load failed

    Hi Experts,
    When I load data from the ODS to the cube (full load), the load fails with the error below.
    Short dump in the Warehouse
    Diagnosis
    The data update was not completed. A short dump has probably been logged in BW providing information about the error.
    System response
    "Caller 70" is missing.
    Further analysis:
    Search in the BW short dump overview for the short dump belonging to the request. Pay attention to the correct time and date on the selection screen.
    You get a short dump list using the Wizard or via the menu path "Environment -> Short dump -> In the Data Warehouse".
    Error correction:
    Follow the instructions in the short dump.
    and I checked in ST22; the error analysis says:
    An exception occurred, which is dealt with in more detail below. The exception, assigned to the class 'CX_SY_OPEN_SQL_DB', was neither caught nor passed along using a RAISING clause in the procedure "WRITE_ICFA" (FORM).
    Since the caller of the procedure could not have expected this exception to occur, the running program was terminated.
    The reason for the exception is: the database system recognized that your last operation on the database would have led to a deadlock, so your transaction was rolled back to avoid this. ORACLE always terminates any transaction that would result in deadlock; the other transactions involved in the potential deadlock are not affected by the termination.
    Please can anyone tell me why the error occurred and how to resolve it?
    Thanks in advance
    David

    David,
    It appears that there was a table lock when you executed your DTP; that is, there were multiple reads on one of the tables used by the DTP at the same time, and this resulted in the error. Check your process chains once.
    For now, delete the request in the InfoCube and reload!
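    If the deadlock recurs, the Oracle side can be checked directly while the loads run. As a sketch (needs access to the BW database and SELECT on the v$ views; the full deadlock graph itself goes to a trace file referenced in the alert log):

        -- Sessions currently waiting on a lock held by another session:
        SELECT sid, blocking_session, event, seconds_in_wait
        FROM   v$session
        WHERE  blocking_session IS NOT NULL;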
    -VA
