Non-OBAW Dims / Facts being loaded using DAC tool?

Guys, I am setting up a warehouse alongside what is available with Oracle Apps. It does not have much to do with the OBAW tables, but I would still like to use the DAC tool to manage the loads.
Is this worth setting up? I know DAC well enough, but I don't know how I would go about setting up something that has nothing to do with any of the existing containers / subject areas.
Any feedback / guidance is greatly appreciated.
Regards,

Hi,
Well, Informatica is well regarded as an ETL tool, so there is no reason why this wouldn't be a good idea, although you should check the third-party licence for the Informatica bundled with OBIEE to make sure this is allowed.
Personally I think the DAC has many good points and also many bad points. If you think the good outweighs the bad then go for it; the process works for the OBIEE Apps, so no reason why it can't work for you.
Regards,
Matt

Similar Messages

  • Using WHERE NOT EXISTS for a Fact Table Load

    I'm trying to set up a fact table load using T-SQL, and I need to use WHERE NOT EXISTS. All of the fields from the fact table are listed in the WHERE NOT EXISTS clause. What I expect is that if the value of any one of the fields is different, the whole record will be treated as a new record and inserted into the table. However, in my testing, when I 'force' a field value, new records are not inserted.
    The following is my query:
    declare 
    @Created_By nchar(50)
    ,@Created_Date datetime --do we need utc check?
    ,@Updated_By nchar(50)
    ,@Updated_Date datetime
    select @Created_By = system_user
    ,@Created_Date = getdate()
    ,@Updated_By = system_user
    ,@Updated_Date = getdate()
    insert fact.Appointment (
    Slot_ID
    , Slot_DateTime
    , Slot_StartDateTime
    , Slot_EndDateTime
    , Slot_Duration_min
    , Slot_CreateDateTime
    , Slot_CreateDate_DateKey
    , Healthcare_System_ID
    , Healthcare_Service_ID
    , Healthcare_Supervising_Service_ID
    , Healthcare_Site_ID
    , Booked_Appt_ID
    , Appt_Notification_Submission_DateKey
    , Appt_Notification_Completion_DateKey
    , Appt_Notification_Duration
    , Appt_Notification_ID
    , Patient_ID
    , Physician_ID
    , Referral_ID
    , Specialty
    , LanguageRequested
    , Created_Date
    , Created_By
    , Updated_Date
    , Updated_By
    )
    select distinct
    Slot.Slot_ID 
    , Slot.Slot_Start_DateTime  as Slot_DateTime --???
    , Slot.Slot_Start_DateTime
    , Slot.Slot_End_DateTime
    , datediff(mi,slot.Slot_Start_DateTime,slot.Slot_End_Datetime) as Slot_Duration_Min 
    , Slot.Created_Date as Slot_CreateDateTime
    , SlotCreateDate.Date_key as Slot_CreateDate_DateKey
    , HSite.Healthcare_System_ID
    , HSite.Healthcare_Service_ID
    , HSite.Healthcare_Service_ID as Healthcare_Supervising_Service_ID
    , HSite.Healthcare_Site_ID 
    , Ref.Booked_Appt_ID 
    , ApptSubmissionTime.Date_key as Appt_Notification_Submission_DateKey
    , ApptCompletionTime.Date_key as Appt_Notification_Completion_DateKey
    , datediff(mi,appt.SubmissionTime,appt.CompletionTime) as Appt_Notification_Duration
    , Appt.Appt_Notification_ID 
    , pat.Patient_ID 
    , 0 as Physician_ID
    , ref.Referral_ID
    , Hsrv.Specialty
    , appt.[Language] as LanguageRequested
    ,@Created_Date as Created_Date
    ,@Created_By as Created_By
    ,@Updated_Date as Updated_Date
    ,@Updated_By as Updated_By
    from dim.Healthcare_System HSys
    inner join dim.Healthcare_Service HSrv
    on HSys.Healthcare_System_ID = HSrv.HealthCare_System_ID 
    inner join dim.Healthcare_Site HSite
    on HSite.HealthCare_Service_ID = HSrv.Healthcare_Service_ID
    and HSite.HealthCare_System_ID = HSrv.HealthCare_System_ID 
    inner join dim.Referral Ref 
    on Ref.ReferralSite_ID = HSite.Site_ID
    and Ref.ReferralService_ID = HSite.Service_ID
    and Ref.ReferralSystem_ID = HSite.System_ID 
    right join (select distinct Slot_ID, Source_Slot_ID, Slot_Start_DateTime, Slot_End_DateTime, Created_Date from dim.slot)slot
    on ref.Source_Slot_ID = slot.Source_Slot_ID
    inner join dim.Appointment_Notification appt
    on appt.System_ID = HSys.System_ID
    inner join dim.Patient pat 
    on pat.Source_Patient_ID = appt.Source_Patient_ID
    inner join dim.SystemUser SysUser
    on SysUser.Healthcare_System_ID = HSys.Healthcare_System_ID
    left join dim.Calendar SlotCreateDate
    on SlotCreateDate.Full_DateTime = cast(Slot.Created_Date as smalldatetime)
    left join dim.Calendar ApptSubmissionTime
    on ApptSubmissionTime.Full_DateTime = cast(appt.SubmissionTime as smalldatetime)
    left join dim.Calendar ApptCompletionTime
    on ApptCompletionTime.Full_DateTime = cast(appt.CompletionTime as smalldatetime)
    where not exists (
    select
    Slot_ID
    , Slot_DateTime
    , Slot_StartDateTime
    , Slot_EndDateTime
    , Slot_Duration_min
    , Slot_CreateDateTime
    , Slot_CreateDate_DateKey
    , Healthcare_System_ID
    , Healthcare_Service_ID
    , Healthcare_Supervising_Service_ID
    , Healthcare_Site_ID
    , Booked_Appt_ID
    , Appt_Notification_Submission_DateKey
    , Appt_Notification_Completion_DateKey
    , Appt_Notification_Duration
    , Appt_Notification_ID
    , Patient_ID
    , Physician_ID
    , Referral_ID
    , Specialty
    , LanguageRequested
    , Created_Date
    , Created_By
    , Updated_Date
    , Updated_By
    from fact.Appointment
    )
    I don't have any issues with the initial insert, but records are not inserted on subsequent runs when one of the WHERE NOT EXISTS field values changes.
    What am I doing wrong?
    Thank you for your help.
    cdun2

    So I set up a WHERE NOT EXISTS condition as shown below. I ran the query, then updated Slot_Duration_Min to 5. Some of the Slot_Duration_Min values resolve to 15. What I expect is that when I run the query again, the records where Slot_Duration_Min resolves to 15 should be inserted again, but they are not. I am using OR between the conditions in the WHERE clause because if any one of the values is different, a new record needs to be inserted:
    declare 
    @Created_By nchar(50)
    ,@Created_Date datetime
    ,@Updated_By nchar(50)
    ,@Updated_Date datetime
    select
    @Created_By = system_user
    ,@Created_Date = getdate()
    ,@Updated_By = system_user
    ,@Updated_Date = getdate()
    insert fact.Appointment (
    Slot_ID
    , Slot_DateTime
    , Slot_StartDateTime
    , Slot_EndDateTime
    , Slot_Duration_min
    , Slot_CreateDateTime
    , Slot_CreateDate_DateKey
    , Healthcare_System_ID
    , Healthcare_Service_ID
    , Healthcare_Supervising_Service_ID
    , Healthcare_Site_ID
    , Booked_Appt_ID
    , Appt_Notification_Submission_DateKey
    , Appt_Notification_Completion_DateKey
    , Appt_Notification_Duration
    , Appt_Notification_ID
    , Patient_ID
    , Physician_ID
    , Referral_ID
    , Specialty
    , LanguageRequested
    , Created_Date
    , Created_By
    , Updated_Date
    , Updated_By
    )
    select distinct
    Slot.Slot_ID 
    , Slot.Slot_Start_DateTime  as Slot_DateTime --???
    , Slot.Slot_Start_DateTime
    , Slot.Slot_End_DateTime
    , datediff(mi,slot.Slot_Start_DateTime,slot.Slot_End_Datetime) as Slot_Duration_Min 
    , Slot.Created_Date as Slot_CreateDateTime
    , SlotCreateDate.Date_key as Slot_CreateDate_DateKey
    , HSite.Healthcare_System_ID
    , HSite.Healthcare_Service_ID
    , HSite.Healthcare_Service_ID as Healthcare_Supervising_Service_ID
    , HSite.Healthcare_Site_ID 
    , Ref.Booked_Appt_ID 
    , ApptSubmissionTime.Date_key as Appt_Notification_Submission_DateKey
    , ApptCompletionTime.Date_key as Appt_Notification_Completion_DateKey
    , datediff(mi,appt.SubmissionTime,appt.CompletionTime) as Appt_Notification_Duration
    , Appt.Appt_Notification_ID 
    , pat.Patient_ID 
    , 0 as Physician_ID
    , ref.Referral_ID
    , Hsrv.Specialty
    , appt.[Language] as LanguageRequested
    ,@Created_Date as Created_Date
    ,@Created_By as Created_By
    ,@Updated_Date as Updated_Date
    ,@Updated_By as Updated_By
    from dim.Healthcare_System HSys
    inner join dim.Healthcare_Service HSrv
    on HSys.Healthcare_System_ID = HSrv.HealthCare_System_ID 
    inner join dim.Healthcare_Site HSite
    on HSite.HealthCare_Service_ID = HSrv.Healthcare_Service_ID
    and HSite.HealthCare_System_ID = HSrv.HealthCare_System_ID 
    inner join dim.Referral Ref 
    on Ref.ReferralSite_ID = HSite.Site_ID
    and Ref.ReferralService_ID = HSite.Service_ID
    and Ref.ReferralSystem_ID = HSite.System_ID 
    right join (select distinct Slot_ID, Source_Slot_ID, Slot_Start_DateTime, Slot_End_DateTime, Created_Date from dim.slot)slot
    on ref.Source_Slot_ID = slot.Source_Slot_ID
    inner join dim.Appointment_Notification appt
    on appt.System_ID = HSys.System_ID
    inner join dim.Patient pat 
    on pat.Source_Patient_ID = appt.Source_Patient_ID
    inner join dim.SystemUser SysUser
    on SysUser.Healthcare_System_ID = HSys.Healthcare_System_ID
    left join dim.Calendar SlotCreateDate
    on SlotCreateDate.Full_DateTime = cast(Slot.Created_Date as smalldatetime)
    left join dim.Calendar ApptSubmissionTime
    on ApptSubmissionTime.Full_DateTime = cast(appt.SubmissionTime as smalldatetime)
    left join dim.Calendar ApptCompletionTime
    on ApptCompletionTime.Full_DateTime = cast(appt.CompletionTime as smalldatetime)
    where not exists (
    select
    Slot_ID
    , Slot_DateTime
    , Slot_StartDateTime
    , Slot_EndDateTime
    , Slot_Duration_min
    , Slot_CreateDateTime
    , Slot_CreateDate_DateKey
    , Healthcare_System_ID
    , Healthcare_Service_ID
    , Healthcare_Supervising_Service_ID
    , Healthcare_Site_ID
    , Booked_Appt_ID
    , Appt_Notification_Submission_DateKey
    , Appt_Notification_Completion_DateKey
    , Appt_Notification_Duration
    , Appt_Notification_ID
    , Patient_ID
    , Physician_ID
    , Referral_ID
    , Specialty
    , LanguageRequested
    , Created_Date
    , Created_By
    , Updated_Date
    , Updated_By
    from fact.Appointment fact
    where 
    Slot.Slot_ID  = fact.Slot_ID 
    or
    Slot.Slot_Start_DateTime   = fact.Slot_DateTime  
    or
    Slot.Slot_Start_DateTime = fact.Slot_StartDateTime
    or
    Slot.Slot_End_DateTime = fact.Slot_EndDateTime
    or
    datediff(mi,slot.Slot_Start_DateTime,slot.Slot_End_Datetime) =
    fact.Slot_Duration_min
    or
    Slot.Created_Date  = fact.Slot_CreateDateTime
    or
    SlotCreateDate.Date_key = fact.Slot_CreateDate_DateKey
    or
    HSite.Healthcare_System_ID = fact.Healthcare_System_ID
    or
    HSite.Healthcare_Service_ID = fact.Healthcare_Service_ID
    or
    HSite.Healthcare_Service_ID  =
    fact.Healthcare_Service_ID 
    or
    HSite.Healthcare_Site_ID  = fact.Healthcare_Site_ID 
    or
    Ref.Booked_Appt_ID  = fact.Booked_Appt_ID 
    or
    ApptSubmissionTime.Date_key =
    fact.Appt_Notification_Submission_DateKey
    or
    ApptCompletionTime.Date_key =
    fact.Appt_Notification_Completion_DateKey
    or 
    datediff(mi,appt.SubmissionTime,appt.CompletionTime)  = fact.Appt_Notification_Duration
    or
    Appt.Appt_Notification_ID = fact.Appt_Notification_ID 
    or
    pat.Patient_ID  =
    fact.Patient_ID 
    or
    0 = 0
    or
    ref.Referral_ID = fact.Referral_ID
    or
    Hsrv.Specialty = fact.Specialty
    or
    appt.[Language] = fact.LanguageRequested
    )
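
    Two things stand out in the queries above. In the first version, the NOT EXISTS subquery has no WHERE clause at all, so once fact.Appointment contains any row the subquery always finds one and nothing further is ever inserted. In the second version, the comparisons are ORed (and the 0 = 0 condition is always true), so the subquery matches whenever any single column agrees with any existing row - close to the opposite of "insert when any column differs". The usual pattern correlates the subquery on the row's business key and ANDs every comparison, so a row is skipped only when an identical row already exists. A minimal sketch of that shape (hedged: it assumes, for illustration only, that Slot_ID identifies a fact row, and the column list is trimmed):

    insert fact.Appointment (Slot_ID, Slot_Duration_min)   -- trimmed column list, sketch only
    select s.Slot_ID
    , datediff(mi, s.Slot_Start_DateTime, s.Slot_End_DateTime)
    from dim.Slot s
    where not exists (
    select 1
    from fact.Appointment f
    where f.Slot_ID = s.Slot_ID   -- correlate on the business key
    and f.Slot_Duration_min = datediff(mi, s.Slot_Start_DateTime, s.Slot_End_DateTime)   -- AND each compared column, never OR
    )

    When every selected column participates in the comparison, an EXCEPT form (SELECT ... FROM the source joins EXCEPT SELECT ... FROM fact.Appointment) expresses the same check and also treats NULLs as equal.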

  • Regarding REFRESHING of Data in Data warehouse using DAC Incremental approach

    My client is planning to move from Discoverer to OBIA, but before that we need some answers.
    1) My client needs the data to be refreshed every hour (incremental load using DAC) because they are using a lot of real-time data.
    We don't have much updated data (e.g. 10 invoices in an hour plus some others). How much time does it usually take to refresh those tables in the data warehouse using DAC?
    2) While a table is being refreshed, can we use that table to generate a report? If yes, what is the state of the data? Stale or incorrect (undefined)?
    3) How does the refresh of Fin Analytics work? Is it one module at a time, or does it treat all 3 modules (GL, AR and AP) as a single unit of refresh?
    I would really appreciate it if I could get an answer to all the questions.
    Thank You,

    Here you go for answers:
    1) Shouldn't be much of a problem for such a small amount of data. It all depends on your execution plan in DAC, which can always be created anew and customized to load data for only those tables (star schema) ---- approximately 15-20 mins, as it does many things apart from loading the tables.
    2) Reports in OBIEE will show the previous data, as the cache will be (should be) turned on. You will get the new data in reports after the refresh is complete and the cache is cleared using one of the various methods (event polling preferred).
    3) Again, for Fin Analytics or any other module, you will have OOTB execution plans. But you can create your own new plans and execute them. GL, AR and AP are also provided separately.
    Hope this answers your questions. You will get to know more by going through the Oracle docs, particularly for DAC.
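
    Since event polling is mentioned in 2): with that approach, the load marks each refreshed physical table in an event table that the BI Server polls, and matching cache entries are purged automatically. A hedged sketch, assuming the standard OBIEE event polling table S_NQ_EPT has been created and registered in the repository (the warehouse table name here is illustrative):

    -- Run once per refreshed table at the end of the hourly DAC load.
    -- UPDATE_TYPE is always 1; the database/catalog/schema columns may be
    -- left NULL when the physical table name is unambiguous in the RPD.
    INSERT INTO S_NQ_EPT
    (UPDATE_TYPE, UPDATE_TS, DATABASE_NAME, CATALOG_NAME, SCHEMA_NAME, TABLE_NAME, OTHER_RESERVED)
    VALUES
    (1, SYSDATE, NULL, NULL, NULL, 'W_GL_BALANCE_F', NULL);
    COMMIT;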

  • Using DAC for running non-BIApps Infa jobs and running 2 EPs in parallel

    Hi,
    We have already set up a BI Apps prod environment using DAC, Informatica and OBIEE 11g for one of our customers.
    Now we want to check the possibility of using DAC for running non-BIApps related Informatica jobs
    (as we only have a weekly run of the DAC execution plan on weekends, and Informatica and DAC are idle most of the time during weekdays).
    The customer wants a separate new small datamart to be set up, which will cater to the reporting requirements of a different department and has no relation or link to the existing BI Apps data warehouse.
    Just wanted to check: will it violate the licensing terms if we use DAC for non-BIApps workflows and run another EP?
    Also, is DAC build 10.1.3.4.1 capable of running two execution plans in parallel?
    We heard a while back that the feature of running two EPs in parallel would be launched in the DAC 11g version. Any pointers or news in this space?
    Thanks in Advance,

    From what I recall, you CANNOT load a "separate" DB instance that is NOT OBIA. If you create a small custom datamart INSIDE the existing OBIA schema, then it is acceptable. However, if you are using DAC (regardless of whether it's one plan or two) to load a NON-OBIA target, this may violate the licensing agreement. You may need a separate standalone license for Informatica and use Informatica's scheduler tool. If you want to use DAC, make sure your target is inside the OBIA DW.
    Please mark correct...

  • "None of the fact tables are compatible with the query request " error

    I've got a situation where I have two facts(Fact_1, Fact_2) and three dimensions(dim_1,dim_2,dim_3) in 1 subject area. I've got dimension hierarchies setup for all the dimension tables.
    Dim_1 is one to many to Fact_1
    Dim_2 is one to many to Fact_2
    Dim_3 is one to many to both Fact_1 and Fact_2
    I've set up the content levels for the LTS for the Facts so that they are the lowest grain for dimensions they join to and the grand total grain for dimensions they do not join to.
    My rpd is consistent. When I run a report using an attribute from Dim_3 and Dim_1 or Dim_3 and Dim_2, the report comes back fine.
    But if I try to run a report using all three Dim tables, I get an error and the message "None of the fact tables are compatible with the query request ".
    First of all, is it possible to make a report using all three dimensions?
    Second, what's the best way to troubleshoot this error? Why are none of the fact tables compatible? I thought that as long as the aggregation levels were set to grand total for non-shared dimensions, Answers would be able to create the report properly.
    Any advice would be greatly appreciated.
    Thanks!
    -Joe

    OBIEE is looking for a fact that can link ALL the dimensions together. This is also known as the implicit fact... you don't have one fact that can relate all the dimensions - you have 2 facts that can only do so together. Perhaps you need to create a single logical fact that has both LTSs for your physical facts and try it that way.
    Then you'd have Dim1, Dim2 and Dim3 all able to join to Fact1 (which is made up of physical facts 1 & 2).

  • Error while creating the DWH tables using DAC

    Hi,
    I am getting an error while creating the DWH tables using DAC. I have created an ODBC DSN using the Merant driver with the DAC repository DB credentials, and the test connection is successful. While creating the tables I gave the OLAP DW credentials and the DSN name which I created earlier, but it throws the error below:
    =====================================
    STD OUTPUT
    =====================================
    CREATING SIEBEL DATABASE OBJECTS
    F:\DAC\bifoundation\dac\UTILITIES\BIN\DDLIMP /I N /s N /u infdomain /p ******* /c DB_DAC /G "SSE_ROLE" /f F:\DAC\bifoundation\dac/conf/sqlgen/ctl-file/oracle_bi_dw.ctl /b "" /K "" /X "" /W N
    Error while importing Siebel database schema.
    =====================================
    ERROR OUTPUT
    =====================================
    Siebel Enterprise Applications ODBC DDL Import Utility, Version 7.7 [18030] ENU
    Copyright (c) 2001 Siebel Systems, Inc. All rights reserved.
    F:\DAC\bifoundation\dac\UTILITIES\BIN\DDLIMP /I N /s N /u infdomain /p ***** /c DB_DAC /G SSE_ROLE /f F:\DAC\bifoundation\dac/conf/sqlgen/ctl-file/oracle_bi_dw.ctl /b /K /X /W N
    Connecting to the database...
    28000: [DataDirect][ODBC Oracle driver][Oracle]ORA-01017: invalid username/password; logon denied
    Unable to connect to the database...
    any help is appreciated.
    Thanks,
    RM

    The fact that you are getting an "ORA-01017: invalid username/password; logon denied" message indicates that you are at least talking to the database.
    The log shows that username "infdomain" is being used. Can you double check the username and password you have in DAC in a SQL*Plus/SQL Developer session?
    Please mark if useful/helpful,
    Andy.
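
    A quick way to rule the DAC-side entry in or out is to try the same credentials outside DAC (hedged example: DDLIMP's /c argument is an ODBC DSN, so substitute whatever TNS alias points at the same database; YOUR_TNS_ALIAS is a placeholder):

    -- In SQL*Plus, connect as the same user DDLIMP is using (you will be
    -- prompted for the password). If this fails with ORA-01017 too, the
    -- credentials themselves are wrong; if it succeeds, suspect the DAC entry.
    CONNECT infdomain@YOUR_TNS_ALIAS
    SHOW USER
    -- A locked or expired account raises different ORA- codes than a bad password:
    SELECT username, account_status FROM user_users;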

  • Error While running the ETL Load in DAC (BI Financial Analytics)

    Hi All,
    I have installed and configured BI Applications 7.9.5 and Informatica 8.1.1. The first time we ran the ETL load in DAC it failed, even though every test connection was successful. We are getting the error message below.
    The log file pasted below is from the path
    /u01/app/oracle/product/Informatica/PowerCenter8.1.1/server/infa_shared/SessLogs
    SDE_ORAR12_Adaptor.SDE_ORA_GL_AP_LinkageInformation_Extract_Full.log
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [ORA_R12] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [9] for mapping parameter:[$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27028 Use override value ['Y'] for mapping parameter:[$$FILTER_BY_LEDGER_ID].
    DIRECTOR> VAR_27028 Use override value ['N'] for mapping parameter:[$$FILTER_BY_LEDGER_TYPE].
    DIRECTOR> VAR_27028 Use override value [04/02/2007] for mapping parameter:[$$INITIAL_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [] for mapping parameter:[$$LAST_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [1] for mapping parameter:[$$LEDGER_ID_LIST].
    DIRECTOR> VAR_27028 Use override value ['NONE'] for mapping parameter:[$$LEDGER_TYPE_LIST].
    DIRECTOR> TM_6014 Initializing session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] at [Thu Feb 12 12:49:33 2009]
    DIRECTOR> TM_6683 Repository Name: [DEV_Oracle_BI_DW_Rep]
    DIRECTOR> TM_6684 Server Name: [DEV_Oracle_BI_DW_Rep_Integration_Service]
    DIRECTOR> TM_6686 Folder: [SDE_ORAR12_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_ORA_GL_AP_LinkageInformation_Extract_Full]
    DIRECTOR> TM_6101 Mapping name: SDE_ORA_GL_AP_LinkageInformation_Extract [version 1]
    DIRECTOR> TM_6827 [u01/app/oracle/product/Informatica/PowerCenter8.1.1/server/infa_shared/Storage] will be used as storage directory for session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full].
    DIRECTOR> CMN_1805 Recovery cache will be deleted when running in normal mode.
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6708 Using configuration property [SiebelUnicodeDB,apps@devr12 bawdev@devbi]
    DIRECTOR> TM_6703 Session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] is run by 64-bit Integration Service [node01_oratestbi], version [8.1.1 SP4], build [0817].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [ASCII]
    MAPPING> CMN_1570 Server Code page: [ISO 8859-1 Western European]
    MAPPING> TM_6151 Session Sort Order: [Binary]
    MAPPING> TM_6156 Using LOW precision decimal arithmetic
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6307 DTM Error Log Disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Thu Feb 12 12:49:34 2009)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Thu Feb 12 12:49:34 2009)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 12582912 bytes and Block size is 128000 bytes.
    READER_1_1_1> DBG_21438 Reader: Source is [devr12.tessco.com], user [apps]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8146 Writer: Target is database [DEVBI], user [bawdev], bulk mode [ON]
    WRITER_1_*_1> WRT_8106 Warning! Bulk Mode session - recovery is not guaranteed.
    WRITER_1_*_1> WRT_8124 Target Table W_GL_LINKAGE_INFORMATION_GS :SQL INSERT statement:
    INSERT INTO W_GL_LINKAGE_INFORMATION_GS(SOURCE_DISTRIBUTION_ID,JOURNAL_LINE_INTEGRATION_ID,LEDGER_ID,LEDGER_TYPE,DISTRIBUTION_SOURCE,JE_BATCH_NAME,JE_HEADER_NAME,JE_LINE_NUM,POSTED_ON_DT,SLA_TRX_INTEGRATION_ID,DATASOURCE_NUM_ID) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_GL_LINKAGE_INFORMATION_GS]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    READER_1_1_1> BLKR_16007 Reader run started.
    WRITER_1_*_1> WRT_8005 Writer run started.
    WRITER_1_*_1> WRT_8158
    *****START LOAD SESSION*****
    Load Start Time: Thu Feb 12 12:49:34 2009
    Target tables:
    W_GL_LINKAGE_INFORMATION_GS
    READER_1_1_1> RR_4029 SQ Instance [SQ_XLA_AE_LINES] User specified SQL Query [SELECT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    DLINK.ACCOUNTING_LINE_CODE LINE_CODE,
          AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE
    FROM XLA_DISTRIBUTION_LINKS DLINK
       , GL_IMPORT_REFERENCES        GLIMPREF
       , XLA_AE_LINES                              AELINE
       , GL_JE_HEADERS                         JHEADER
       , GL_JE_BATCHES                         JBATCH
       , GL_LEDGERS                                 T
       , GL_PERIODS   PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
             (  'AP_INV_DIST', 'AP_PMT_DIST'
              , 'AP_PREPAY')
    AND DLINK.APPLICATION_ID = 200
    AND AELINE.APPLICATION_ID = 200
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID         = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID         = DLINK.AE_HEADER_ID        
    AND AELINE.AE_LINE_NUM           = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID   = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID       = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID                   = T.LEDGER_ID
    AND JHEADER.STATUS                         = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
              TO_DATE('04/02/2007 00:00:00'
                    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('Y', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')]
    READER_1_1_1> RR_4049 SQL Query issued to database : (Thu Feb 12 12:49:34 2009)
    READER_1_1_1> CMN_1761 Timestamp Event: [Thu Feb 12 12:49:34 2009]
    READER_1_1_1> RR_4035 SQL Error [
    ORA-01114: IO error writing block to file 513 (block # 328465)
    ORA-27072: File I/O error
    Linux-x86_64 Error: 28: No space left on device
    Additional information: 4
    Additional information: 328465
    Additional information: -1
    ORA-01114: IO error writing block to file 513 (block # 328465)
    ORA-27072: File I/O error
    Linux-x86_64 Error: 28: No space left on device
    Additional information: 4
    Additional information: 328465
    Additional information: -1
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    DLINK.ACCOUNTING_LINE_CODE LINE_CODE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
    JBATCH.NAME BATCH_NAME,
    JHEADER.NAME HEADER_NAME,
    PER.END_DATE
    FROM XLA_DISTRIBUTION_LINKS DLINK
    , GL_IMPORT_REFERENCES GLIMPREF
    , XLA_AE_LINES AELINE
    , GL_JE_HEADERS JHEADER
    , GL_JE_BATCHES JBATCH
    , GL_LEDGERS T
    , GL_PERIODS PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
    ( 'AP_INV_DIST', 'AP_PMT_DIST'
    , 'AP_PREPAY')
    AND DLINK.APPLICATION_ID = 200
    AND AELINE.APPLICATION_ID = 200
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID = DLINK.AE_HEADER_ID
    AND AELINE.AE_LINE_NUM = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID = T.LEDGER_ID
    AND JHEADER.STATUS = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
    TO_DATE('04/02/2007 00:00:00'
    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('Y', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    Oracle Fatal Error
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    DLINK.ACCOUNTING_LINE_CODE LINE_CODE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
    JBATCH.NAME BATCH_NAME,
    JHEADER.NAME HEADER_NAME,
    PER.END_DATE
    FROM XLA_DISTRIBUTION_LINKS DLINK
    , GL_IMPORT_REFERENCES GLIMPREF
    , XLA_AE_LINES AELINE
    , GL_JE_HEADERS JHEADER
    , GL_JE_BATCHES JBATCH
    , GL_LEDGERS T
    , GL_PERIODS PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
    ( 'AP_INV_DIST', 'AP_PMT_DIST'
    , 'AP_PREPAY')
    AND DLINK.APPLICATION_ID = 200
    AND AELINE.APPLICATION_ID = 200
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID = DLINK.AE_HEADER_ID
    AND AELINE.AE_LINE_NUM = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID = T.LEDGER_ID
    AND JHEADER.STATUS = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
    TO_DATE('04/02/2007 00:00:00'
    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('Y', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    Oracle Fatal Error].
    READER_1_1_1> CMN_1761 Timestamp Event: [Thu Feb 12 12:49:34 2009]
    READER_1_1_1> BLKR_16004 ERROR: Prepare failed.
    WRITER_1_*_1> WRT_8333 Rolling back all the targets due to fatal session error.
    WRITER_1_*_1> WRT_8325 Final rollback executed for the target [W_GL_LINKAGE_INFORMATION_GS] at end of load
    WRITER_1_*_1> WRT_8035 Load complete time: Thu Feb 12 12:49:34 2009
    LOAD SUMMARY
    ============
    WRT_8036 Target: W_GL_LINKAGE_INFORMATION_GS (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
    WRT_8044 No data loaded for this target
    WRITER_1__1> WRT_8043 ****END LOAD SESSION*****
    MANAGER> PETL_24031
    ***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
    Thread [READER_1_1_1] created for [the read stage] of partition point [SQ_XLA_AE_LINES] has completed: Total Run Time = [0.673295] secs, Total Idle Time = [0.000000] secs, Busy Percentage = [100.000000].
    Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [SQ_XLA_AE_LINES] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [WRITER_1_*_1] created for [the write stage] of partition point [W_GL_LINKAGE_INFORMATION_GS] has completed. The total run time was insufficient for any meaningful statistics.
    MANAGER> PETL_24005 Starting post-session tasks. : (Thu Feb 12 12:49:35 2009)
    MANAGER> PETL_24029 Post-session task completed successfully. : (Thu Feb 12 12:49:35 2009)
    MAPPING> TM_6018 Session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] run completed with [0] row transformation errors.
    MANAGER> PETL_24002 Parallel Pipeline Engine finished.
    DIRECTOR> PETL_24013 Session run completed with failure.
    DIRECTOR> TM_6022
    SESSION LOAD SUMMARY
    ================================================
    DIRECTOR> TM_6252 Source Load Summary.
    DIRECTOR> CMN_1740 Table: [SQ_XLA_AE_LINES] (Instance Name: [SQ_XLA_AE_LINES])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6253 Target Load Summary.
    DIRECTOR> CMN_1740 Table: [W_GL_LINKAGE_INFORMATION_GS] (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6023
    ===================================================
    DIRECTOR> TM_6020 Session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] completed at [Thu Feb 12 12:49:36 2009]
    Thanks in Advance,
    Prashanth
    Edited by: user10719430 on Feb 11, 2009 7:33 AM
    Edited by: user10719430 on Feb 12, 2009 11:31 AM

    Need to increase the temp tablespace: the ORA-01114 / ORA-27072 "No space left on device" raised while the SQ_XLA_AE_LINES source query runs is the classic sign of TEMP filling up during a large sort/join (tempfiles are reported with file numbers above DB_FILES, which is why "file 513" looks so high).
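
    A hedged sketch of the usual fix (paths and sizes are illustrative; also confirm the underlying volume really has free space, since the OS-level error is "No space left on device"):

    -- See how big TEMP currently is and whether its tempfiles can autoextend:
    SELECT file_name, bytes/1024/1024 AS size_mb, autoextensible, maxbytes/1024/1024 AS max_mb
    FROM dba_temp_files;

    -- Add a tempfile on a volume with room (illustrative path and sizes):
    ALTER TABLESPACE temp ADD TEMPFILE '/u02/oradata/devbi/temp02.dbf'
    SIZE 2G AUTOEXTEND ON NEXT 256M MAXSIZE 8G;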

  • Content Tab: None of the fact tables are compatible with the query request

    Hi All,
    One thing I am still not clear on after all my years with OBIEE is working with the Content tab in the BMM.
    I have made a rpd the joins in physical layer as shown below:
    https://picasaweb.google.com/114804305606242416264/OBIEEError#5663056545119428530
    And the BMM layer as:
    https://picasaweb.google.com/114804305606242416264/OBIEEError#5663056519553812930
    The error I am getting when I run a request with one column from each of the 3 selected tables:
    Dim - Comment Code Details
    Fact - Complaint
    Dim - Service Details
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 14020] None of the fact tables are compatible with the query request Sr Num:[DAggr(Fact - Complaint.Sr Num by [ Dim - Service Details.Sr Cat Type Cd, Dim - Comment Code Details.Cmtcode name] )]. (HY000).
    I get no consistency check errors. I have read up on this everywhere, and I know I need to set the appropriate aggregation levels in the LTS properties of the various dims and facts to help OBIEE understand our model, but how do I do that? How do I decide what the aggregation levels and details should be?
    When I click the More button I see different options: Copy, Copy From, Get Levels, Check Level - what do these mean?
    For Aggregation Content, group by - Logical Level or Column - which one should I choose, and how should I decide?
    Can anyone explain the Content tab in detail, from scratch, with an example, and why we get these errors? I know many people who are well versed in everything else RPD-related but this. A little effort at explaining from you guys will really be appreciated.
    Thanks in advance,
    Dev

    Hi Deepak,
    Option 1:
    My tables in physical layer are joined as below:
    D1--> F1 <--D2--> F2 <--D3
    Same way i model it in BMM
    D1--> F1 <-- D2--> F2 <--D3
    Here D1 is a non-conformed dimension for F2 and D3 is a non-conformed dim for F1. After creating the dimension hierarchies, I tried setting up the content levels.
    In the Sources > Content tab of fact F1 I set
    Dimensions----------- Logical level
    D1---------------------- D1 Detail
    D2---------------------- D2 Detail
    D3---------------------- D3 Total
    then, in the Sources > Content tab of fact F2 I set
    Dimensions----------- Logical level
    D1---------------------- D1 Total
    D2---------------------- D2 Detail
    D3---------------------- D3 Detail
    Then I also go into all the dimensions and set their content levels to Detail, but it still gives me errors; I'm not sure where I am going wrong in setting the content levels.
    I need to know whether the way I have modeled it in the BMM is right.
    Option 2:
    I can combine the two facts into a single logical fact, or the above design should also work:
    (F1&F2) <-- D1, D2, D3 joined separately using complex logical joins.
    What will the content tab details be?
    Thanks,
    Dev

  • Web AS restarts due to classes being loaded multiple times

    Hi,
    We are developing a custom application which uses Dynpro for UI and EJBs/java for business and DAO layer.
    ->The application size is very big and, to make the build and deployment faster, we have split the application into multiple EARs.
    ->We have made sure that none of the classes are repeated across the EARs.
    ->We are using Web AS 2004s SP12 on Solaris UNIX platform.
    With all the EARs deployed on the application server, we notice the application server being restarted once or twice per day. We have gone through the logs etc. in detail and found that most of the application classes are being loaded multiple times - in some cases, the same class is loaded more than 150 times. Because of this, the permSpace area of the JVM is getting exhausted and the JVM is crashing. permSpace is set to 1024M at the moment.
    Any thoughts on what could be wrong here?
    Please let me know if you need any more details on this.
    Thanks, Chandra

    Hi,
    Which JDK version is in use? If possible, use JDK 1.4.2.17.
    Change the perm size: -XX:PermSize=512m -XX:MaxPermSize=512m
    Check whether the JVM parameters mentioned below exist in the list:
    -verbose:gc
    -Xloggc:GC.log
    //  -Xmx2048m: if -Xmx is mentioned in the JVM parameter list, remove it, as you have already set the JVM heap size  //
    -Xms2048m
    -XX:PermSize=512m -XX:MaxPermSize=512m
    -XX:NewSize=320m -XX:MaxNewSize=320m
    -XX:SurvivorRatio=2
    -XX:TargetSurvivorRatio=90
    -XX:+PrintGCDetails
    -XX:+PrintGCTimeStamps
    -XX:+UseParNewGC
    -XX:+HandlePromotionFailure
    -XX:+PrintClassHistogram
    Regards
    Shaji

  • Essbase answers - None of the fact tables are compatible with the query request "member"

    Hi,
    I have modelled an Essbase database into the repository.
    If I pull in the measure, period and year dimensions, filter on the year (member), and display the year (member) along with the period (alias) and the measure, it errors with:
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 43119] Query Failed: [nQSError: 14020] None of the fact tables are compatible with the query request Fiscal Year.Fiscal Year Code. (HY000)
    However, all other things being equal, if I change the year displayed to the alias, then it works.
    Can anyone tell me why?
    Is there a limitation Essbase brings through whereby you cannot display what you filter on?
    thanks,
    Robert.

    Hi
    I have done the content level settings in each of the tables - D1, F1 and F2 (LTS) - and now I am getting the following error:
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 15018] Incorrectly defined logical table source (for fact table Gl Sets Of Books) does not contain mapping for [Code Combinations.Code Combinations.Affiliate, GL Balances.GL Balances.Currency Code, GL Balances.GL Balances.PTD_Balance, Gl Sets Of Books.Gl Sets Of Books .SoB Name]. (HY000)
    Gl Balances : D1
    Code Combination: F1
    Gl Sets Of Books : F2
    I have checked the joins in the physical and BMM layers; all are fine.

  • None of the fact sources for [column] are compatible with the detail filter

    Hi Experts,
    I am using OBIEE 11.1.1.7.
    I am getting the error below when I add a filter on a non-conformed dimension column.
    Below is the error:
    Error Details
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P 
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 43119] Query Failed: [nQSError: 14023] None of the fact sources for [column] are compatible with the detail filter
    Please help in solving an issue.
    Regards,
    NN

    Hi NN
    You need to follow the steps in the link below to model your BMM layer so that the report works with non-conformed dimensions.
    http://obibb.wordpress.com/2010/05/31/multiple-fact-reporting-on-non-conforming-dimensions/
    Once you have followed these steps you can use the non-conformed dimension in your reports. When you check the logs to see the query generated by OBIEE, you will find it generates 2 queries: one with the conformed dimension and the other with the non-conformed dimension. When you filter on the non-conformed dimension you will no longer see the error, but the results will not be filtered, because the filter is applied only in the one query where it is valid.
    For this to work, instead of using the filter you can drag the column you would like to filter into your view as a prompt, default it to the value you need using a case statement, and hide it (see the sketch below). This is one of the workarounds.
    There may be many other workarounds for this, but you need to understand that for your requirement to work you have to tweak OBIEE into generating 1 query instead of 2.
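
    As an illustration of that case-statement trick (hedged: the column and value are made up - substitute your own non-conformed dimension column and target value), the hidden prompt column's formula in Answers would look something like:

    CASE WHEN "Dim - Service Details"."Sr Cat Type Cd" = 'GOLD'
    THEN "Dim - Service Details"."Sr Cat Type Cd"
    END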

  • None of the fact tables are compatible

    hi,
    I am developing a report from two fact table columns and one dimension table in OBIEE 11.1.1.5.0.
    I am getting the error:
    Error
    View Display Error
    Odbc driver returned an error (SQLExecDirectW).
    Error Details
    Error Codes: OPR4ONWY:U9IM8TAC:OI2DL65P
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 43113] Message returned from OBIS. [nQSError: 43119] Query Failed: [nQSError: 14020] None of the fact tables are compatible with the query request Fact - Retail.Retail. (HY000)
    SQL Issued: SELECT 0 s_0, "TM Vehicle Sales"."- Offtake Facts"."Offtake" s_1, "TM Vehicle Sales"."- Retail Facts"."Retail" s_2, "TM Vehicle Sales"."Distributor"."Country" s_3 FROM "TM Vehicle Sales"
    regards
    vcm

    We need to see your design - is the dim shared between the facts?
    You can anticipate the physical query from your column selection.
    Now pick one column from the 1st fact and a 2nd column from the dim, run a report, get the physical query, and verify the joins between OBIEE's query and your own.
    Then add a column from the 2nd fact and see how it works.
    Edited by: svee on Jun 29, 2012 6:21 AM

  • JRun session id being re-used

    We are using JRun 4.0 on our server in conjunction with MS IIS 6.0 to support dynamic JSP pages and Java servlets. We are using URL encoding to support session handling. In the jrun-web.xml file we have the following parameters to disable the use of cookies for session handling:
    <session-config>
    <cookie-config>
    <active>false</active>
    </cookie-config>
    </session-config>
    With these parameters defined in the jrun-web.xml file, and the use of the response.encodeURL() function, we see that JRun automatically appends a jsessionid=xxxxxxx parameter to the URLs. This has been working well for us for a long time. Recently we noticed that these jsessionid values are being re-used by JRun for different session instances. This means that if a user logs in to a website at a given time and is assigned a session id, say 101011, and after a while logs out, then if another user logs in some time later, this second user may be assigned the same session id (value 101011) for handling his session. If the first user has bookmarked a page on the website, the bookmark is going to include the session id parameter (value 101011), and if the first user accesses the website from the bookmark while the second user is logged in, the first user will get access to the second user's session, which is very insecure.
    This phenomenon is referred to as session fixation and can be used by a hijacker to get access to any other user's session. Is there a way to prevent JRun from re-using these session id values, or to increase the time period after which JRun re-uses these session ids?

    Dax Trajero wrote:
    ... how do I prevent a user who's just ordered from returning to the site and re-using the same session ref?
    Deny a returning, paying(!) customer his session? Yours might be the only shop in town doing that.
    If your session housekeeping is any good, then the session variables pertaining to shopping cart, payment and delivery would have been cleared or re-initialized. Often, starting a new session means logging in again, and there are a number of reasons why that can be undesirable.
    I did an e-commerce course for a year, and learned some strange things. It is in fact to your advantage that a returning customer should keep his session, even after ordering.
    For example, it is well known that the chances of a returning customer placing a new order are much higher when he is already logged in than when he has to log in afresh. You could test that hypothesis yourself. Psychologists have also found that e-shoppers often return to the shop to gloat over the goodies they've just ordered. You wouldn't want to deny them their gloating session, would you?

  • Reversal documents are not being represented when 0PY_PP_C3 is being loaded

    We use the 0HR_PY_PP_2 and 0HR_PY_PP_1 InfoSources to load the 0PY_PP_C2 and 0PY_PP_C1 ODSs. Then 0PY_PP_C1 updates the 0PY_PP_C3 ODS.
    Routines in the update rules include corresponding data from 0PY_PP_C2 during the update of 0PY_PP_C3. If there is no matching data in 2, then the corresponding records in 1 do not go into 3. Our process chain first loads 2; after it activates, it then loads 1, which then updates 3.
    There are 5 reversal documents in 0PY_PP_C2.
    The update rules between 0PY_PP_C1 and 0PY_PP_C3 refer to the Posting Documents ODS: 0PY_PP_C2.
    The issue is that the reversal documents are not being represented when 0PY_PP_C3 is loaded.

    Hi,
    The simple solution is to debug the code written in the routine. As you mentioned that you already have some data in PP_C2, check whether that data is being read properly in the routine.
    After that, check the condition based on which you are skipping records of PP_C1 and not loading them into PP_C3. Is that condition working correctly?
    Put a break point and check how the data is flowing.
    Regards,
    Durgesh.

  • DemoTrust certificate being loaded but not defined in config.xml

    I have a strange problem in that the DemoTrust.jks is being loaded when I start any of the managed servers.
    I have set up custom identity and trust keystores, and there is no reference to the DemoTrust keystore anywhere in the config.xml.
    The problem manifests itself when starting a managed server (WLS 9.2) which tries to connect to the Admin server with the secure administration port enabled for the domain.
    This requires two-way SSL, and the handshake fails as it appears to be using the DemoTrust.jks keystore.
    config.xml extract showing managed server keystore entries
    <custom-trust-key-store-file-name>/home/wlserver92/domain/keystores/trust.jks</custom-trust-key-store-file-name>
    <custom-trust-key-store-type>JKS</custom-trust-key-store-type>
    <custom-trust-key-store-pass-phrase-encrypted>{3DES}QWdToZU27VsIWV4LU3bZnQ==</custom-trust-key-store-pass-phrase-encrypted>
    Extract from startup log file
    <Jun 16, 2008 12:45:16 PM CEST> <Info> <WebLogicServer> <BEA-000377> <Starting WebLogic Server with Java HotSpot(TM) Server VM Version 1.5.0_11-b03 from Sun Microsystems Inc.>
    <Jun 16, 2008 12:45:18 PM CEST> <Notice> <Security> <BEA-090169> <Loading trusted certificates from the jks keystore file /fs01/app/bea/wlserver9.2/weblogic92/server/lib/DemoTrust.jks.>
    <Jun 16, 2008 12:45:18 PM CEST> <Notice> <Security> <BEA-090169> <Loading trusted certificates from the jks keystore file /fs01/app/bea/wlserver9.2/jdk1.5.0_11/jre/lib/security/cacerts.>
    <Jun 16, 2008 12:45:18 PM CEST> <Info> <Management> <BEA-141107> <Version: WebLogic Server Temporary Patch for CR344429 Thu Nov 15 02:22:13 EST 2007
    WebLogic Server 9.2 MP2 Mon Jun 25 01:32:01 EDT 2007 952826 >
    <Jun 16, 2008 12:45:21 PM CEST> <Warning> <Security> <BEA-090477> <Certificate chain received from wls-admin.test.co.uk - xx.xx.xx.xx was not trusted causing SSL handshake failure.>
    <Jun 16, 2008 12:45:21 PM CEST> <Emergency> <Management> <BEA-141151> <The admin server could not be reached at https://wls-admin.test.co.uk:9002.>
    <Jun 16, 2008 12:45:21 PM CEST> <Info> <Configuration Management> <BEA-150018> <This server is being started in managed server independence mode in the absence of the admin server.>
    <Jun 16, 2008 12:45:21 PM CEST> <Info> <WebLogicServer> <BEA-000215> <Loaded License : /fs01/app/bea/wlserver9.2/license.bea>
    <Jun 16, 2008 12:45:21 PM CEST> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to STARTING>
    <Jun 16, 2008 12:45:21 PM CEST> <Info> <WorkManager> <BEA-002900> <Initializing self-tuning thread pool>
    <Jun 16, 2008 12:45:21 PM CEST> <Notice> <Log Management> <BEA-170019> <The server log file /fs01/log/dv31/wlserver92/server01/server01_server.log is opened. All server side log events will be written to this file.>
    <Jun 16, 2008 12:45:22 PM CEST> <Notice> <Security> <BEA-090171> <Loading the identity certificate and private key stored under the alias wls05_test_co_uk from the JKS keystore file /home/wlserver92/domain/keystores/wls05.test.co.uk.jks.>
    <Jun 16, 2008 12:45:23 PM CEST> <Notice> <Security> <BEA-090169> <Loading trusted certificates from the JKS keystore file /home/wlserver92/domain/keystores/trust.jks.>
    <Jun 16, 2008 12:45:23 PM CEST> <Notice> <Security> <BEA-090082> <Security initializing using security realm myrealm.>
    <Jun 16, 2008 12:45:23 PM CEST> <Critical> <Security> <BEA-090402> <Authentication denied: Boot identity not valid; The user name and/or password from the boot identity file (boot.properties) is not valid. The boot identity may have been changed since the boot identity file was created. Please edit and update the boot identity file with the proper values of username and password. The first time the updated boot identity file is used to start the server, these new values are encrypted.>
    <Jun 16, 2008 12:45:23 PM CEST> <Critical> <WebLogicServer> <BEA-000386> <Server subsystem failed. Reason: weblogic.security.SecurityInitializationException: Authentication denied: Boot identity not valid; The user name and/or password from the boot identity file (boot.properties) is not valid. The boot identity may have been changed since the boot identity file was created. Please edit and update the boot identity file with the proper values of username and password. The first time the updated boot identity file is used to start the server, these new values are encrypted.
    weblogic.security.SecurityInitializationException: Authentication denied: Boot identity not valid; The user name and/or password from the boot identity file (boot.properties) is not valid. The boot identity may have been changed since the boot identity file was created. Please edit and update the boot identity file with the proper values of username and password. The first time the updated boot identity file is used to start the server, these new values are encrypted.
    at weblogic.security.service.CommonSecurityServiceManagerDelegateImpl.doBootAuthorization(CommonSecurityServiceManagerDelegateImpl.java:941)
    at weblogic.security.service.CommonSecurityServiceManagerDelegateImpl.initialize(CommonSecurityServiceManagerDelegateImpl.java:1029)
    at weblogic.security.service.SecurityServiceManager.initialize(SecurityServiceManager.java:854)
    at weblogic.security.SecurityService.start(SecurityService.java:141)
    at weblogic.t3.srvr.SubsystemRequest.run(SubsystemRequest.java:64)
    Truncated. see log file for complete stacktrace
    >
    <Jun 16, 2008 12:45:23 PM CEST> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to FAILED>
    <Jun 16, 2008 12:45:23 PM CEST> <Error> <WebLogicServer> <BEA-000383> <A critical service failed. The server will shut itself down>
    <Jun 16, 2008 12:45:23 PM CEST> <Notice> <WebLogicServer> <BEA-000365> <Server state changed to FORCE_SHUTTING_DOWN>
    Has anybody seen this type of behaviour before?

    Hi Sunita,
    To define AIA_INSTANCE you have to export these environment variables at least.
    export AIA_HOME=/app/soa/aiafp
    export MW_HOME=/app/soa/Oracle/Middleware
    export SOA_HOME=/app/soa/Oracle/Middleware/Oracle_SOA1
    export JAVA_HOME=/usr/jrockit-jdk1.6.0_29-R28.2.2-4.1.0/jre
    export AIA_INSTANCE=${AIA_HOME}/aia_instances/AIAPRD
    The paths will vary and should be replaced as per your installation.
    Thanks,
    Shantanu
