Anomaly in DAC behaviour

Dear Gurus,
We are implementing OBIA 7.9.6.3. For that we have installed OBIEE 11.1.1.5, Informatica 9.0.1 and DAC 10.1.3.4.1, with EBS R12.1.2 as the source. We configured the CSV files according to the source and then ran the load for the first time (full load). Some workflows failed in DAC but showed as succeeded in the Informatica Workflow Monitor. When we check the failed workflows in DAC, the error shown is "Index Creation Failed". Without making any changes, we just requeued the workflows and restarted the ETL, and they succeeded.
Kindly help us.
Thanks
MVSST

Hi,
Is it a vanilla implementation? Do you know which table has the duplicate rows that made the index creation fail?
Try checking these tables:
W_USER_DS
select integration_id, datasource_num_id, src_eff_from_dt, count(*)
     from w_user_ds
     group by integration_id, datasource_num_id, src_eff_from_dt
     having count(*)>1
W_EMPLOYEE_DS
select integration_id, datasource_num_id, src_eff_from_dt, count(*)
     from W_EMPLOYEE_DS
     group by integration_id, datasource_num_id, src_eff_from_dt
     having count(*)>1
W_POSITION_DS
select integration_id, datasource_num_id, src_eff_from_dt, count(*)
     from w_position_ds
     group by integration_id, datasource_num_id, src_eff_from_dt
     having count(*)>1
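If any of these queries return rows, one way to clean them up is to keep a single row per key and delete the rest. This is only a sketch (it assumes ROWID-based de-duplication is acceptable for your staging table; verify which row should survive before running it):
delete from w_user_ds a
 where a.rowid > (select min(b.rowid)
                    from w_user_ds b
                   where b.integration_id    = a.integration_id
                     and b.datasource_num_id = a.datasource_num_id
                     and b.src_eff_from_dt   = a.src_eff_from_dt);
Run the same statement against W_EMPLOYEE_DS and W_POSITION_DS if they also show duplicates, then requeue the failed tasks in DAC.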
Hope this helps you figure out which table has the duplicates.
Thanks
Joni

Similar Messages

  • Delay on audio playback with FiiO e10 USB DAC

    Hello,
My USB DAC playback is not working correctly. When I start playback of any music stream, it takes a while (around a second or less) before I actually hear anything. The problem also occurs if short silent periods are present in the audio file/stream. The only way I can listen to these problematic files/streams without the skipping is to keep the master channel active all the time by playing some other audio along with the problematic stream.
    I have had this problem on openSUSE 12.3 too, and I was not able to solve it there either.
    On Windows 7, however, this very same problem occurs only if I choose to use any of the 24 bit audio depth settings for the DAC.
    Here is a screenshot to demonstrate the Windows scenario:
    http://s10.postimg.org/xzwx302jd/fiio_delay.png
I have been trying out various things without success, such as:
    From Arch's PulseAudio wiki:
    - Glitches, skips or crackling
    - Setting the default fragment number and buffer size in PulseAudio
    - Laggy sound
    - Realtime scheduling
    Here is some general information about my current Arch system and sound settings:
    OS: Arch Linux x86_64
    Kernel Release: 3.14.1-1-ARCH
    WM: KWin
    DE: KDE
    Processor Type: Intel(R) Core(TM) i5-3570K CPU @ 3.40GHz
    Sound settings: ALSA + PulseAudio
This problem existed even before I installed PulseAudio.
    aplay -l :
    **** List of PLAYBACK Hardware Devices ****
    card 0: PCH [HDA Intel PCH], device 0: ALC898 Analog [ALC898 Analog]
    Subdevices: 1/1
    Subdevice #0: subdevice #0
    card 0: PCH [HDA Intel PCH], device 1: ALC898 Digital [ALC898 Digital]
    Subdevices: 1/1
    Subdevice #0: subdevice #0
    card 1: NVidia [HDA NVidia], device 3: HDMI 0 [HDMI 0]
    Subdevices: 1/1
    Subdevice #0: subdevice #0
    card 1: NVidia [HDA NVidia], device 7: HDMI 1 [HDMI 1]
    Subdevices: 1/1
    Subdevice #0: subdevice #0
    card 1: NVidia [HDA NVidia], device 8: HDMI 2 [HDMI 2]
    Subdevices: 1/1
    Subdevice #0: subdevice #0
    card 1: NVidia [HDA NVidia], device 9: HDMI 3 [HDMI 3]
    Subdevices: 1/1
    Subdevice #0: subdevice #0
    card 3: Audio [DigiHug USB Audio], device 0: USB Audio [USB Audio]
    Subdevices: 1/1
    Subdevice #0: subdevice #0
    card 3: Audio [DigiHug USB Audio], device 1: USB Audio [USB Audio #1]
    Subdevices: 1/1
    Subdevice #0: subdevice #0
    pacmd list-cards (the last entry only)
    index: 3
    name: <alsa_card.usb-FiiO_DigiHug_USB_Audio-01-Audio>
    driver: <module-alsa-card.c>
    owner module: 9
    properties:
    alsa.card = "3"
    alsa.card_name = "DigiHug USB Audio"
    alsa.long_card_name = "FiiO DigiHug USB Audio at usb-0000:07:00.0-2, full speed"
    alsa.driver_name = "snd_usb_audio"
    device.bus_path = "pci-0000:07:00.0-usb-0:2:1.1"
    sysfs.path = "/devices/pci0000:00/0000:00:1c.7/0000:07:00.0/usb5/5-2/5-2:1.1/sound/card3"
    udev.id = "usb-FiiO_DigiHug_USB_Audio-01-Audio"
    device.bus = "usb"
    device.vendor.id = "1852"
    device.vendor.name = "GYROCOM C&C Co., LTD"
    device.product.id = "7022"
    device.product.name = "DigiHug USB Audio"
    device.serial = "FiiO_DigiHug_USB_Audio"
    device.string = "3"
    device.description = "DigiHug USB Audio"
    module-udev-detect.discovered = "1"
    device.icon_name = "audio-card-usb"
    profiles:
    output:analog-stereo: Analog Stereo Output (priority 6000, available: unknown)
    output:iec958-stereo: Digital Stereo (IEC958) Output (priority 5500, available: unknown)
    off: Off (priority 0, available: unknown)
    active profile: <output:iec958-stereo>
    sinks:
    alsa_output.usb-FiiO_DigiHug_USB_Audio-01-Audio.iec958-stereo/#0: DigiHug USB Audio Digital Stereo (IEC958)
    sources:
    alsa_output.usb-FiiO_DigiHug_USB_Audio-01-Audio.iec958-stereo.monitor/#1: Monitor of DigiHug USB Audio Digital Stereo (IEC958)
    ports:
    analog-output: Analog Output (priority 9900, latency offset 0 usec, available: unknown)
    properties:
    iec958-stereo-output: Digital Output (S/PDIF) (priority 0, latency offset 0 usec, available: unknown)
    properties:
    .asoundrc :
pcm.!default {
    type hw
    card 3
}
ctl.!default {
    type hw
    card 3
}

    emeres wrote:Does this behaviour continue with pulseaudio killed? Use alsa directly to narrow the problem down first.
    Yes. It did exist even before I installed pulseaudio.
    emeres wrote:Check different resolutions, sample rates and channels. Try to determine the lowest buffer and period size, that the DAC accepts.
    If by this you mean this, I went through that guide and it did not make any difference.
I found out that some audio formats, such as wav or mp3, sometimes do not produce this flaw. I gathered stats about this here. VLC was used for playback and for displaying the information. Amarok behaves similarly.
    emeres wrote: You probably load snd-usb-audio module with no options. Try loading it with "nrpacks" using different settings (default is 8, try 1, 2).
I am not sure if this is the way to do it, but I followed this guide and added the file "usbsound.conf" to "/etc/modprobe.d/". usbsound.conf has the line: "options snd_usb_audio nrpacks=2". I tried changing nrpacks to various values and rebooted after every change, but it made no difference to the audio skipping.
    What could it be that .xm (and every other tracker music format) and .ogg files are not playing like mp3 or wav?
Sorry for the delayed reply, and about that image.

  • Airplay music streaming and internal DAC bypass

    Hello everyone, I'm an "audiophile" who loves high quality liquid music, and I need help about Airplay hi-fi music streaming.
    I own an Apple TV 3gen linked with optical cable to my stereo amplifier internal hi-end DAC for optimal sound quality. Everything is fine when I play my iTunes network shared music using the Apple TV iTunes app or computer shared libraries, because the digital file is directly sent without manipulation to the hi-end amplifier's DAC.
When I stream music from my MacBook Pro to the Apple TV with Airplay, I notice that the MacBook Pro's internal DAC isn't bypassed, causing bad sound output. I can tell because every audio control on the MacBook is still active: you can set the volume output and use the equalizer. This means there is a digital-to-analog conversion (for sound control) and then an analog-to-digital conversion (for the Airplay streaming) performed by the MacBook Pro's internal DAC.
As I said before, this workflow causes bad sound output, so is there a way to configure Airplay to stream untouched music files, bypassing the internal device DAC (the DACs of iPhones, iPads and Macs)?

    Max_Ram wrote:
    Are there any hi-end devices out there able to manage a shared iTunes library (like the ATV 3 does) with bit perfect output?
    And has the Airplay feature in some "audiophile grade" devices like NAD, Marantz, etc. the same behaviour?
I suspect you'll find many of the high-end audio devices use custom library software - they may not run with iTunes directly, but I suspect some will be able to access files stored in iTunes. The problem really is that the AppleTV is not configurable to simply output what it is sent - I honestly don't know whether, for example, iTunes downsamples hi-def files before sending to the AppleTV or whether the AppleTV does it internally.
    In many ways it may not matter if AppleTV is not outputting as you want.
    I think the problem with Airplay is that the technology is licensed from Apple for inclusion in those devices - from memory things like the first Zeppelin Air used to boast high-quality DACs but actually the Airplay stream that went to those devices was lower quality.  I've not really tested any of these things for a while to be honest and am rather out of touch with current AV gear.  Thinking about it I can use my Pioneer as an Airplay destination, but as you have noticed volume control/equaliser etc are active so the stream is not the original - must check on sample rate/bit depth though.
    Airplay may or may not be capable of what you want now, but even if it isn't it may evolve to do so in the future. I suspect your best option will be to connect your Mac directly to the DAC if streaming won't work.
    This is old and mentions some variably priced playback apps aimed at audiophiles:
http://www.headfonia.com/os-x-audio-players-amarra-audirvana-pure-music-fidelia-decibel-and-bitperfect/
BitPerfect is available in the App Store - OS X removed some APIs for low-level access to audio output some time ago - not sure if they are available again now.
    Fidelia also in AppStore but has some in-app purchases.
    Audirvana looked promising.
    Not used any of these recently.
    Amarra used to be very expensive.
In the past there have been several lively discussions here where people have moaned that AppleTV does not have a volume control. I have always argued that it is a 'source' device and volume etc. should be controlled on the amp or TV, as with conventional audio sources, but many devices do have volume controls these days, so consumers who like those features want them. They did actually add volume control to AppleTV 1 for stereo audio, but you could disable it in Settings.

  • Error while creating a new DAC connection using connection type MSSQL

    Hi,
    I am trying to create a new DAC connection i.e. a new DAC repository in the SQL Server 2008 database.
    DAC version : 10.1.3.4.1
    Database : SQL Server 2008
    I have downloaded the sqljdbc4.jar file from the below link and placed it in the D:\orahome\10gR3_1\bifoundation\dac\lib folder.
[http://www.microsoft.com/en-us/download/details.aspx?displaylang=en&id=11774]
    I have entered all the details correctly for database name, database host, database port. I created a new Authentication file.
    I get the below error when I try to test the connection.
    MESSAGE:::MSSQL driver not available!
    EXCEPTION CLASS::: java.lang.IllegalArgumentException
    com.siebel.etl.gui.login.LoginDataHandler$LoginStructure.testConnection(LoginDataHandler.java:512)
    com.siebel.etl.gui.login.LoginDataHandler.testConnection(LoginDataHandler.java:386)
    com.siebel.etl.gui.login.ConnectionTestDialog$Executor.run(ConnectionTestDialog.java:290)
    ::: CAUSE :::
    MESSAGE:::com.microsoft.sqlserver.jdbc.SQLServerDriver
    EXCEPTION CLASS::: java.lang.ClassNotFoundException
    java.net.URLClassLoader$1.run(URLClassLoader.java:200)
    java.security.AccessController.doPrivileged(Native Method)
    java.net.URLClassLoader.findClass(URLClassLoader.java:188)
    java.lang.ClassLoader.loadClass(ClassLoader.java:306)
    sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:276)
    java.lang.ClassLoader.loadClass(ClassLoader.java:251)
    java.lang.ClassLoader.loadClassInternal(ClassLoader.java:319)
    java.lang.Class.forName0(Native Method)
    java.lang.Class.forName(Class.java:169)
    com.siebel.etl.gui.login.LoginDataHandler$LoginStructure.testConnection(LoginDataHandler.java:510)
    com.siebel.etl.gui.login.LoginDataHandler.testConnection(LoginDataHandler.java:386)
    com.siebel.etl.gui.login.ConnectionTestDialog$Executor.run(ConnectionTestDialog.java:290)
    The error seems to be a connectivity issue with SQL Server. Am I using the correct jar file?
    Please help me out in resolving this issue. Appreciate the help provided on this forum earlier.
    Thank You

Add
.\lib\sqljdbc4.jar
to the end of the line starting with SQLSERVERLIB in the config.bat file.
Please mark as correct if this helps.

Error while creating a task for creating an generic task /app/dac/CustomSQLs/DBNameBeforeLoad.sql

    Hi ,
I have created a SQL file on the DAC server at /app/dac/CustomSQLs/, just to fire an update SQL against the database.
In the DAC Task tab I created a task with the following:
Command for incremental load: /app/dac/CustomSQLs/DBNameBeforeLoad.sql
Primary source: flatfileconnection
Target source: DBCONNECTION_OLAP
Execution type: SQL FILE
Task phase: GENERAL
I created a subject area and assembled it, then created an execution plan.
When I try to execute this EP, it shows the following error in the DAC log:
    ANOMALY INFO::: Error while creating a task for creating an generic task /app/dac/CustomSQLs/DBNameBeforeLoad.sql
    MESSAGE:::/app/dac/CustomSQLs/DBNameBeforeLoad.sql - invalide template name!
    EXCEPTION CLASS::: com.siebel.analytics.etl.etltask.TaskInitializationException
    com.siebel.analytics.etl.etltask.SQLFileTask.doInit(SQLFileTask.java:69)
    com.siebel.analytics.etl.etltask.GenericTaskImpl.init(GenericTaskImpl.java:194)
Is my above configuration correct?

    verify the following settings:
    1. mapping of ECC plant and storage location and logical system number with the appropriate availability group code since this links the stock types in EWM to ECC org elements (SPRO -> EWM -> Interfaces -> ERP Integration -> Goods movement)
    2. ensure the availability group configuration is correct (SPRO -> EWM -> GR Process -> Configure availability group for putaway) - check all applicable nodes in this configuration since the availability group code is assigned in the pertinent storage types

  • Error while creating the DWH tables using DAC

    Hi,
I am getting an error while creating the DWH tables using DAC. I created an ODBC DSN using the Merant driver with the DAC repository DB credentials, and the test connection is successful. While creating the tables I gave the OLAP DW credentials and the DSN name I created earlier, but it throws the error below:
    =====================================
    STD OUTPUT
    =====================================
    CREATING SIEBEL DATABASE OBJECTS
    F:\DAC\bifoundation\dac\UTILITIES\BIN\DDLIMP /I N /s N /u infdomain /p ******* /c DB_DAC /G "SSE_ROLE" /f F:\DAC\bifoundation\dac/conf/sqlgen/ctl-file/oracle_bi_dw.ctl /b "" /K "" /X "" /W N
    Error while importing Siebel database schema.
    =====================================
    ERROR OUTPUT
    =====================================
    Siebel Enterprise Applications ODBC DDL Import Utility, Version 7.7 [18030] ENU
    Copyright (c) 2001 Siebel Systems, Inc. All rights reserved.
    This software is the property of Siebel Systems, Inc., 2207 Bridgepointe Parkway,
    San Mateo, CA 94404.
    User agrees that any use of this software is governed by: (1) the applicable
    user limitations and other terms and conditions of the license agreement which
    has been entered into with Siebel Systems or its authorized distributors; and
    (2) the proprietary and restricted rights notices included in this software.
    WARNING: THIS COMPUTER PROGRAM IS PROTECTED BY U.S. AND INTERNATIONAL LAW.
    UNAUTHORIZED REPRODUCTION, DISTRIBUTION OR USE OF THIS PROGRAM, OR ANY PORTION
    OF IT, MAY RESULT IN SEVERE CIVIL AND CRIMINAL PENALTIES, AND WILL BE
    PROSECUTED TO THE MAXIMUM EXTENT POSSIBLE UNDER THE LAW.
    If you have received this software in error, please notify Siebel Systems
    immediately at (650) 295-5000.
    F:\DAC\bifoundation\dac\UTILITIES\BIN\DDLIMP /I N /s N /u infdomain /p ***** /c DB_DAC /G SSE_ROLE /f F:\DAC\bifoundation\dac/conf/sqlgen/ctl-file/oracle_bi_dw.ctl /b /K /X /W N
    Connecting to the database...
    28000: [DataDirect][ODBC Oracle driver][Oracle]ORA-01017: invalid username/password; logon denied
    Unable to connect to the database...
    any help is appreciated.
    Thanks,
    RM

    The fact that you are getting an "ORA-01017: invalid username/password; logon denied" message indicates that you are at least talking to the database.
    The log shows that username "infdomain" is being used. Can you double check the username and password you have in DAC in a SQL*Plus/SQL Developer session?
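For example, a quick sanity check from SQL*Plus (just a sketch; it assumes you have DBA access and that the DAC connection really does use "infdomain" - adjust to your environment):
-- Confirm the account exists and is not locked or expired
select username, account_status
  from dba_users
 where username = upper('infdomain');
If the account looks fine, connecting as infdomain with the password held in DAC will reproduce the ORA-01017 if the password is the problem.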
    Please mark if useful/helpful,
    Andy.

  • Error while importing a DW table into DAC

    Hi,
We are on OBIEE 7.9.6 and we have a requirement to add a new DW table. I created a new table in the DW and I am getting the error "Could not connect to data source process. Process failed during creation of connection pool" while trying to import this new table into DAC. I would like to know why I am getting this error.
    Any help is greatly appreciated.
    Thanks,
    Chandra

    Thank you for your response.
I added this table in the DW database and have imported it into Informatica. When I look at the traffic light it is red. Looks like that's the problem.
    Thanks again.

  • Error While running the ETL Load in DAC (BI Financial Analytics)

    Hi All,
I have installed and configured BI Applications 7.9.5 and Informatica 8.1.1. The first time we ran the ETL load in DAC it failed, even though every test connection was successful. We are getting the error message below.
The log file pasted below is from /u01/app/oracle/product/Informatica/PowerCenter8.1.1/server/infa_shared/SessLogs:
SDE_ORAR12_Adaptor.SDE_ORA_GL_AP_LinkageInformation_Extract_Full.log
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [ORA_R12] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [9] for mapping parameter:[$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27028 Use override value ['Y'] for mapping parameter:[$$FILTER_BY_LEDGER_ID].
    DIRECTOR> VAR_27028 Use override value ['N'] for mapping parameter:[$$FILTER_BY_LEDGER_TYPE].
    DIRECTOR> VAR_27028 Use override value [04/02/2007] for mapping parameter:[$$INITIAL_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [] for mapping parameter:[$$LAST_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [1] for mapping parameter:[$$LEDGER_ID_LIST].
    DIRECTOR> VAR_27028 Use override value ['NONE'] for mapping parameter:[$$LEDGER_TYPE_LIST].
    DIRECTOR> TM_6014 Initializing session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] at [Thu Feb 12 12:49:33 2009]
    DIRECTOR> TM_6683 Repository Name: [DEV_Oracle_BI_DW_Rep]
    DIRECTOR> TM_6684 Server Name: [DEV_Oracle_BI_DW_Rep_Integration_Service]
    DIRECTOR> TM_6686 Folder: [SDE_ORAR12_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_ORA_GL_AP_LinkageInformation_Extract_Full]
    DIRECTOR> TM_6101 Mapping name: SDE_ORA_GL_AP_LinkageInformation_Extract [version 1]
    DIRECTOR> TM_6827 [u01/app/oracle/product/Informatica/PowerCenter8.1.1/server/infa_shared/Storage] will be used as storage directory for session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full].
    DIRECTOR> CMN_1805 Recovery cache will be deleted when running in normal mode.
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6708 Using configuration property [SiebelUnicodeDB,apps@devr12 bawdev@devbi]
    DIRECTOR> TM_6703 Session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] is run by 64-bit Integration Service [node01_oratestbi], version [8.1.1 SP4], build [0817].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [ASCII]
    MAPPING> CMN_1570 Server Code page: [ISO 8859-1 Western European]
    MAPPING> TM_6151 Session Sort Order: [Binary]
    MAPPING> TM_6156 Using LOW precision decimal arithmetic
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6307 DTM Error Log Disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning; transformation continues...
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Thu Feb 12 12:49:34 2009)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Thu Feb 12 12:49:34 2009)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 12582912 bytes and Block size is 128000 bytes.
    READER_1_1_1> DBG_21438 Reader: Source is [devr12.tessco.com], user [apps]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8146 Writer: Target is database [DEVBI], user [bawdev], bulk mode [ON]
    WRITER_1_*_1> WRT_8106 Warning! Bulk Mode session - recovery is not guaranteed.
    WRITER_1_*_1> WRT_8124 Target Table W_GL_LINKAGE_INFORMATION_GS :SQL INSERT statement:
    INSERT INTO W_GL_LINKAGE_INFORMATION_GS(SOURCE_DISTRIBUTION_ID,JOURNAL_LINE_INTEGRATION_ID,LEDGER_ID,LEDGER_TYPE,DISTRIBUTION_SOURCE,JE_BATCH_NAME,JE_HEADER_NAME,JE_LINE_NUM,POSTED_ON_DT,SLA_TRX_INTEGRATION_ID,DATASOURCE_NUM_ID) VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_GL_LINKAGE_INFORMATION_GS]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    READER_1_1_1> BLKR_16007 Reader run started.
    WRITER_1_*_1> WRT_8005 Writer run started.
    WRITER_1_*_1> WRT_8158
    *****START LOAD SESSION*****
    Load Start Time: Thu Feb 12 12:49:34 2009
    Target tables:
    W_GL_LINKAGE_INFORMATION_GS
    READER_1_1_1> RR_4029 SQ Instance [SQ_XLA_AE_LINES] User specified SQL Query [SELECT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    DLINK.ACCOUNTING_LINE_CODE LINE_CODE,
          AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE
    FROM XLA_DISTRIBUTION_LINKS DLINK
       , GL_IMPORT_REFERENCES        GLIMPREF
       , XLA_AE_LINES                              AELINE
       , GL_JE_HEADERS                         JHEADER
       , GL_JE_BATCHES                         JBATCH
       , GL_LEDGERS                                 T
       , GL_PERIODS   PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
             (  'AP_INV_DIST', 'AP_PMT_DIST'
              , 'AP_PREPAY')
    AND DLINK.APPLICATION_ID = 200
    AND AELINE.APPLICATION_ID = 200
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID         = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID         = DLINK.AE_HEADER_ID        
    AND AELINE.AE_LINE_NUM           = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID   = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID       = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID                   = T.LEDGER_ID
    AND JHEADER.STATUS                         = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
              TO_DATE('04/02/2007 00:00:00'
                    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('Y', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')]
    READER_1_1_1> RR_4049 SQL Query issued to database : (Thu Feb 12 12:49:34 2009)
    READER_1_1_1> CMN_1761 Timestamp Event: [Thu Feb 12 12:49:34 2009]
    READER_1_1_1> RR_4035 SQL Error [
    ORA-01114: IO error writing block to file 513 (block # 328465)
    ORA-27072: File I/O error
    Linux-x86_64 Error: 28: No space left on device
    Additional information: 4
    Additional information: 328465
    Additional information: -1
    ORA-01114: IO error writing block to file 513 (block # 328465)
    ORA-27072: File I/O error
    Linux-x86_64 Error: 28: No space left on device
    Additional information: 4
    Additional information: 328465
    Additional information: -1
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    DLINK.ACCOUNTING_LINE_CODE LINE_CODE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
    JBATCH.NAME BATCH_NAME,
    JHEADER.NAME HEADER_NAME,
    PER.END_DATE
    FROM XLA_DISTRIBUTION_LINKS DLINK
    , GL_IMPORT_REFERENCES GLIMPREF
    , XLA_AE_LINES AELINE
    , GL_JE_HEADERS JHEADER
    , GL_JE_BATCHES JBATCH
    , GL_LEDGERS T
    , GL_PERIODS PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
    ( 'AP_INV_DIST', 'AP_PMT_DIST'
    , 'AP_PREPAY')
    AND DLINK.APPLICATION_ID = 200
    AND AELINE.APPLICATION_ID = 200
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID = DLINK.AE_HEADER_ID
    AND AELINE.AE_LINE_NUM = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID = T.LEDGER_ID
    AND JHEADER.STATUS = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
    TO_DATE('04/02/2007 00:00:00'
    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('Y', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    Oracle Fatal Error
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    DLINK.ACCOUNTING_LINE_CODE LINE_CODE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
    JBATCH.NAME BATCH_NAME,
    JHEADER.NAME HEADER_NAME,
    PER.END_DATE
    FROM XLA_DISTRIBUTION_LINKS DLINK
    , GL_IMPORT_REFERENCES GLIMPREF
    , XLA_AE_LINES AELINE
    , GL_JE_HEADERS JHEADER
    , GL_JE_BATCHES JBATCH
    , GL_LEDGERS T
    , GL_PERIODS PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
    ( 'AP_INV_DIST', 'AP_PMT_DIST'
    , 'AP_PREPAY')
    AND DLINK.APPLICATION_ID = 200
    AND AELINE.APPLICATION_ID = 200
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID = DLINK.AE_HEADER_ID
    AND AELINE.AE_LINE_NUM = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID = T.LEDGER_ID
    AND JHEADER.STATUS = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
    TO_DATE('04/02/2007 00:00:00'
    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('Y', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    Oracle Fatal Error].
    READER_1_1_1> CMN_1761 Timestamp Event: [Thu Feb 12 12:49:34 2009]
    READER_1_1_1> BLKR_16004 ERROR: Prepare failed.
    WRITER_1_*_1> WRT_8333 Rolling back all the targets due to fatal session error.
    WRITER_1_*_1> WRT_8325 Final rollback executed for the target [W_GL_LINKAGE_INFORMATION_GS] at end of load
    WRITER_1_*_1> WRT_8035 Load complete time: Thu Feb 12 12:49:34 2009
    LOAD SUMMARY
    ============
    WRT_8036 Target: W_GL_LINKAGE_INFORMATION_GS (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
    WRT_8044 No data loaded for this target
WRITER_1_*_1> WRT_8043 *****END LOAD SESSION*****
    MANAGER> PETL_24031
    ***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
    Thread [READER_1_1_1] created for [the read stage] of partition point [SQ_XLA_AE_LINES] has completed: Total Run Time = [0.673295] secs, Total Idle Time = [0.000000] secs, Busy Percentage = [100.000000].
    Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [SQ_XLA_AE_LINES] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [WRITER_1_*_1] created for [the write stage] of partition point [W_GL_LINKAGE_INFORMATION_GS] has completed. The total run time was insufficient for any meaningful statistics.
    MANAGER> PETL_24005 Starting post-session tasks. : (Thu Feb 12 12:49:35 2009)
    MANAGER> PETL_24029 Post-session task completed successfully. : (Thu Feb 12 12:49:35 2009)
    MAPPING> TM_6018 Session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] run completed with [0] row transformation errors.
    MANAGER> PETL_24002 Parallel Pipeline Engine finished.
    DIRECTOR> PETL_24013 Session run completed with failure.
    DIRECTOR> TM_6022
    SESSION LOAD SUMMARY
    ================================================
    DIRECTOR> TM_6252 Source Load Summary.
    DIRECTOR> CMN_1740 Table: [SQ_XLA_AE_LINES] (Instance Name: [SQ_XLA_AE_LINES])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6253 Target Load Summary.
    DIRECTOR> CMN_1740 Table: [W_GL_LINKAGE_INFORMATION_GS] (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
         Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6023
    ===================================================
    DIRECTOR> TM_6020 Session [SDE_ORA_GL_AP_LinkageInformation_Extract_Full] completed at [Thu Feb 12 12:49:36 2009]
    Thanks in Advance,
    Prashanth

You need to increase the temp tablespace.
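For example (a sketch only; the tempfile path and size are illustrative, not taken from this thread):
-- See how big the temp tablespace currently is
select tablespace_name, file_name, bytes/1024/1024/1024 as gb
  from dba_temp_files;
-- Add space; adjust the path and size to your storage
alter tablespace temp
  add tempfile '/u01/oradata/devbi/temp02.dbf' size 4g autoextend on;
Note the "Linux-x86_64 Error: 28: No space left on device" in the log: make sure the underlying filesystem actually has free space before adding the tempfile.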

Regarding REFRESHING of Data in Data warehouse using DAC Incremental approach

My client is planning to move from Discoverer to OBIA, but before that we need some answers.
1) My client needs the data to be refreshed every hour (incremental load using DAC) because they use a lot of real-time data. We don't have much updated data (e.g. 10 invoices in an hour plus some others). How long does it usually take to refresh those tables in the data warehouse using DAC?
2) While a table is being refreshed, can we use that table to generate a report? If yes, what is the state of the data: stale or incorrect (undefined)?
3) How does the refresh of Financials Analytics work? Is it one module at a time, or are all 3 modules (GL, AR and AP) treated as a single unit of refresh?
I would really appreciate answers to all of these questions.
    Thank You,

Here you go for answers:
1) Shouldn't be much of a problem for such a small amount of data. It all depends on your execution plan in DAC, which can always be created new and customized to load data for only those tables (star schema). Approx 15-20 mins, as it does many things apart from loading the tables.
2) Reports in OBIEE will show the previous data, as the cache will be (should be) turned on. You will get the new data in reports after the refresh is complete and the cache is cleared using one of various methods (event polling preferred).
3) Again, for Financials Analytics or any other module, you will have out-of-the-box execution plans, but you can create your own plans and execute them. GL, AR and AP are also provided separately.
Hope this answers your questions. You will learn more by going through the Oracle docs, particularly for DAC.
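On point 2, if you use event polling, the BI Server watches a polling table and purges cache entries for the tables you flag there. A minimal sketch, assuming the standard S_NQ_EPT event polling table layout (verify the column names against your install):
-- Tell the BI Server that a warehouse table was just refreshed
insert into s_nq_ept
  (update_type, update_ts, database_name, catalog_name, schema_name, table_name, other_reserved)
values
  (1, sysdate, 'DataWarehouse', null, null, 'W_GL_BALANCE_F', null);
commit;
The table name W_GL_BALANCE_F is only an example; you would insert one row per refreshed physical table, typically as a post-load step.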

  • Unable to capture video from VHS through DAC-200

    (Running Final Cut Pro 5.1.4)
    I have a DAC-200 to hook up my VCR to my G5 in order to capture video from VHS tapes into Final Cut. I had this set-up working a few months ago, but it was disconnected to rearrange equipment and now I can't get it working again. I have followed the advice from the DAC-200 installation instructions and advice from other forum topics but can't seem to get it to work. I finally was able to get something in the log and capture window besides "preview disabled" and "cannot capture because there is no video." I now can get Final Cut to capture video, however it only captures a blank black screen (and this is all I can see in the preview window).
I've triple-checked every cable hook-up, the G5 recognizes the converter box, and the video tape plays fine to a TV. I've even had our IT guy here at work come check out my set-up, and he can't figure out what is wrong either. (He did get my log & capture window to show white fuzz in the black screen at one point though. Does that mean anything?)
    Any help is greatly appreciated! (I'm happy to answer any other questions too.) Thanks!

I had difficulty with OS X 10.4.10 plus QT 7.2 and put my info on the forum as well as sending feedback to Apple. Shortly afterwards, Apple brought out a small security update which I downloaded and, hey presto, I was then able to use QT 7.2 with OS X 10.4.10. It may be that you missed out on the later small update to QT 7.2? Another factor I found was that some little while later I again had a problem, so I again installed QT 7.2 and over-installed FCP 5.1.4, which has resolved my problems for the time being. Worth a try if you still can't get through on the DAC.
    Ron.

  • IMovie not recognising Data Video DAC AV/DV converter

I use my DataVideo DAC-1 to capture old VHS family movies, drop them into iMovie to edit, and make short DVDs. The system worked a treat with our old G4 and Panther. I have just bought a dual G5 2 Gig machine and iMovie HD, and the system is not seeing the DAC on the FireWire bus. I have tried new cables, plugging into a native bus, etc., but no go. The DAC is admittedly 5 years old but worked perfectly up to now. I am loath to scrap it because of a FireWire incompatibility issue. Anyone else out here with similar issues and a possible solution - or else it's off to mail-order I go to cough up €200.

    Hi Richard,
I cannot help, sorry, but I have read many reports here of converters that worked and then failed...
I recommend http://www.apple.com/feedback/imovie.html to report your system profile and converter specs in detail... maybe there IS some <bad word following> bug???

  • Different query behaviour (index or no index use) between sql and pl/sql

    Hi All,
I have a query inside a cursor in a procedure, let's say:
cursor c_emp (b_dept_no in number)
is
select *
from emp
where dept_no = b_dept_no;
There is a non-unique index on dept_no.
In this procedure I first loop through another cursor which gets the dept_no, and with this dept_no I go into a FOR loop over the c_emp cursor.
When I run this procedure I see that the explain plan of the above query gives me a full table scan on EMP, even though there is an index on dept_no and there are 1.3 million records in the emp table. When I take the above query and explain it separately, it uses the index on dept_no.
I have rebuilt/analyzed the tables and indexes; nothing seems to work. Even hints in the procedure cursor are not helping at all.
I'm using Oracle Database 10.2.0.1.0 and have never seen such behaviour before.
    Anyone an idea what is going on here?
    Kind regards,
    Dave

    Hoi Rob,
it's like I have three tables. The first one I have to get data from for my output; then for every record of table one there are one or more records in table 2 and table 3.
I had to create an XML file and it looks like this:
    <Table1 Record 1>
    <Name>...</Name>
    etc...
    <Table2 Record 1>
    </Table2 Record 1>
    <Table2 Record 2>
    </Table2 Record 2>
    <Table3 Record 1>
    </Table3 Record 1>
    <Table3 Record 2>
    </Table3 Record 2>
    <Table1 Record 2>
    etc.etc.etc.
    So that's why i programmed:
    for r_1 in c_table1 loop
    for r_2 in c_table2(r_1.dept_no) loop
    end loop;
    for r_3 in c_table3(r_1.dept_no) loop
    end loop;
    end loop;
And when I look at the first record, it has 2 records in table2 and 16 records in table3. So nothing huge.
For now I'm unable to trace the session. I will try to post the tkprof later.
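For reference, this is what I plan to run once I can trace (a sketch using DBMS_MONITOR, which should be available on 10.2; the tracefile identifier is just a label I made up):
alter session set tracefile_identifier = 'emp_loop';
exec dbms_monitor.session_trace_enable(waits => true, binds => true);
-- run the procedure here, then:
exec dbms_monitor.session_trace_disable;
Then tkprof the trace file from user_dump_dest; the row source section shows the plan actually used for each bind value, unlike a standalone EXPLAIN PLAN, which does not peek at binds.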
    Kind regards,
    Dave

Have there been updates to transcription support in a PS 4 Windows Update? Behaviour has changed...

I utilise PowerShell transcription support heavily in my PS scripts. Essentially I utilise transcription as a replacement for a logging framework - all logging is performed via Write-Host, Write-Verbose and Write-Debug calls, and then logged to file via a call to Start-Transcript at the start of script execution.
    Transcription support in PS versions prior to 5 is limited in that it is hard coded in the Start-Transcript and Stop-Transcript commands to throw an exception if the host is not an instance of Microsoft.PowerShell.ConsoleHost. Also, to make it even more difficult,
    Stop-Transcript throws an error when there is no active transcript; but there is no way to detect if there is. Therefore, I use reflection to get the ConsoleHost.IsTranscribing internal property value to determine whether to call Stop-Transcript at script
    completion. This logic has worked fine for PS versions 2, 3 and 4. However, more recently I believe a Windows Update patch has changed the way the transcription works. It appears that PS 4 now supports transcription functionality that I thought was PS 5 (CTP)
    specific - nested transcription sessions (i.e. you can now continually call Start-Transcript and it will create more sessions, rather than throw an error when there is an active session).  Also, the internals of how transcription is managed must have
    changed because the IsTranscribing property no longer returns the correct value. I am unsure if there is now a "supported" way of checking for this rather than hacking into the internals of the host. This change results in my script creating multiple
    transcription sessions as it does not know there is already transcription started.
    Does anyone know if there has been any updates in PS 4 for the transcription functionality? Or any other details regarding this?
    Cheers

    Hi Anna,
Thanks for the response. Can you make sense of the info below? Transcription appears completely broken on the build on my machine, on which I have not installed any PS CTP. I am worried that a hotfix such as KB3000850, which has also caused me grief here:
    https://social.technet.microsoft.com/Forums/windowsserver/en-US/eaa4a532-5ff5-4a2f-89ce-7e72c71a8fb7/powershell-30-stuck-in-infinite-loop-resolving-members-internally-is-this-a-ps-bug?forum=winserverpowershell
    https://connect.microsoft.com/PowerShell/feedback/details/1045858/add-member-cmdlet-invokes-scriptproperty-members-when-adding-new-member
...has been installed via Windows Update. I am running Windows 8.1 with all the latest updates applied as of 23 April.
My $PSVersionTable is the same as yours except for "BuildVersion" 6.3.9600.17400. I can also call Start-Transcript repeatedly and it will keep creating sessions. Once I do this, all sessions receive data as I enter more commands in the console (i.e. each file is getting the same data). This is the behaviour I would have expected in PS 5 with the transcription updates, but not in a PS 4 update.
NOTE: In addition to this, the data being fed into the transcription file is NOT in the standard transcription format. The transcription file contains different data and is not a true reflection of what is on the console. The following data is being output to the transcription file after each command:
CurrentMatchIndex  ReplacementIndex  ReplacementLength  CompletionMatches
               -1                 0                  8  {System.Management.Automat...
In addition to that (I haven't had a chance to look into this in detail), the log files generated by the PS module I have written now contain a heap of content that appears to be generated each time an exception is thrown (even if it has been caught?):
"PS>TerminatingError(Invoke-Expression): xxxxxxx" and
"PS>TerminatingError(): "The pipeline has been stopped.""
messages, but no host, verbose or debug data (i.e. terminating-error details only, which I have never seen in the transcript file before). That's correct - the transcription file doesn't contain any of the data it used to, just error info that never used to be in there!!
    Please assist urgently, this is a critical issue for me.

  • APEX Application behaviour in a RAC setup

    Hi
    Caveat first: I'm pretty new to Oracle RAC and just looking into it as an option. We have an APEX application currently running in Oracle 11gR2 single node currently and are considering HA for this.
    My question is: What would be the expected behaviour seen by a User of an APEX application, in the event of a node failure, when running with an OHS / RAC configuration? Will they get "transparent fail-over" and see nothing or will they see an error?
    I appreciate I could post in the APEX forum, but feel that is probably more of a development forum and possibly someone here has had to look at things at this level.
    I have read what I think may be the definitive reference for this:
    http://www.oracle.com/technetwork/developer-tools/apex/learnmore/apex-rac-wp-133532.pdf
but while it covers most of what I want, I don't believe it answers my question.
    This states:
    "The Transparent Application Failover (TAF) feature of Oracle Net Services is a runtime failover for high-availability environments. It enables client applications to automatically reconnect to the database if the connection fails and, optionally, resume a SELECT statement that was in progress. The reconnection happens automatically from within the Oracle Call Interface (OCI) library. For applications that do insert, update or delete transactions, the application must trap the error when the failure occurs, rollback the transaction, and then resubmit. If the application is not written to be TAF aware, the session will get disconnected."
However (as I understand it) APEX runs in the database and would fail with the database; it isn't a typical "client application" connecting to Oracle via a TAF-aware connection pool - it is essentially a large PL/SQL package, and TAF only covers SELECT statements, not packages.
Maybe I'm over-reading this and it's simpler than that: APEX/mod_plsql might just handle it?
- APEX user/HTTP session state is stored in the database (see "APEX: Understanding session state"), which is available on other nodes
- mod_plsql in OHS can detect the error returned and reissue the request to a good server, and APEX on that instance can retrieve the user's HTTP state and process the request (the APEX/RAC doc states mod_plsql can see an error from the database, clean the connection up and form a new connection, but not that it will retry the request for the client onto the other APEX/DB node).
    I'm really just after a (transparent/non-transparent) statement based on experience, but an outline of how the components behave would be useful.
    Thanks in advance
    Dave

    Hi
Any chance of getting that link outside of Metalink? I'm trying to get our customer support ID, but no luck at present.
    I'm aware that APEX can run with RAC (as per the link I posted) - I'm really after next level info around behaviour in that environment.
    Thanks
    Dave

  • Unit Testing in DAC

    Hi,
I am facing a problem with unit testing in DAC. I am able to run full and incremental loads, but I am unable to do unit testing in DAC: it shows an error that my Integration Services are not up when I run an individual task. Are there any setups for unit testing in DAC? I followed http://docs.oracle.com/cd/E15586_01/fusionapps.1111/e14849/dacquickstart.htm#BABCIIIH, topic: To unit test an execution plan task. If anyone could help me run a unit test, that would be a great help.
    Thanks,
    Ram

    I'm struggling with the idea of testing View/Controller classes which depend on "container things" like bindings. Since there are many ways to interact with the 'container things' I guess it depends on exactly how you're using those bindings in your View/Controller classes. I've had success using Mockito along with my own fixture classes that do things like set up a mock ADFContext and RequestContext. Within my test setUp and tearDown, I set up and tear down these fixture classes, which results in a safe, mocked-out environment in which the test can execute. It's often tricky because many of the these objects are static and singleton-like, but so far I've always found a back door by which to inject a mock object. We also have our own utility classes that look-up 'container things', so we of course make sure we have a way to provide mock 'container things' through those utility classes.
    If you haven't come across the concept of 'mock objects' before, then it may be best to do a bit of background reading first:
    [http://www.mockobjects.com/2009/09/brief-history-of-mock-objects.html]
    [http://en.wikipedia.org/wiki/Mock_object]
    There are many mocking libraries about (EasyMock, jMock, rMock) but IMO Mockito is the one to choose right now.
Of course, the alternative is to apply some Inversion of Control (http://martinfowler.com/articles/injection.html) and make sure that the code you write is always ignorant of exactly where the 'container things' have come from. This makes mocking much easier, but I've not yet investigated use of a D.I. container (e.g. Spring) along with ADF - I suspect your options in 10g would be limited.

Maybe you are looking for

  • How many computers can I install on?

Hey everyone, I was wondering how many computers I would be allowed to install my Adobe Creative Suite Master Collection on? Also, if it changes anything, it will be the student pricing for the Master Collection. Thanks a lot!

Accessing one account from another on same computer

I have been using a Mac as a company server. I am now adding a user to that computer, but I am unable to set them up in such a way that they can access the folders under the user account which functions as my server. Is this possible (I could not find

iTunes on Windows 7 not recognising iPhone 5

I have an Acer Aspire Windows 7 laptop that does not have any software on it, and I downloaded iTunes onto it, but it does not recognise my iPhone 5. What could be the problem?

  • Rounding up salary increase

HRMS application, EBS 11.5.10.2, Windows 2003, DB 9.2.0.7. Under Self Service Manager/My Employee Information, the salary increased amount and annualized salary fields need to be rounded up. If salary is increased to $500, it is showing up as $499.99 and same w

  • Order to Cash Process

Hi all, I am aware of the order-to-billing process in SD. Can you please detail for me the order-to-cash process and the extra config requirements compared to the order-to-billing process? Thanks in advance. Jn.