Training Database (Vision)

I'm getting involved with Oracle once again, and am putting together a training plan around SQL, PL/SQL, and Forms & Reports. We intend to do much of this through OPN at our site, and want to set up a training instance to be used with the real-world examples we are putting together.
There used to be a 'Vision' database available for free that we could use to train with. Does that still exist, and if so, how can I get to the download?

Are you referring to the Oracle Vision demo database? If yes, it can be downloaded from:
Oracle® E-Delivery Web site
http://edelivery.oracle.com
AFAIK, there is no free/direct link to this database.
Please note that Solution Beacon provides public access to 11i/R12 application instances (but not database access).
Release 11i Vision Instances
http://www.solutionbeacon.com/tools_r11ivision.htm
Release 12 Vision Instances
http://www.solutionbeacon.com/tools_r12vision.htm
Edited by: hsawwan on Nov 4, 2008 3:24 PM -- Added the links to 11i/R12 instances.

Similar Messages

  • Training database localisation and impact on upgrading

    Dear Experts/Gurus,
    I have a problem finding out which localisation is used in our training database, and what the impact would be if I upgrade it, because when I open it using Choose from Company the version shown is 680119 while the other database shows 680319.
    I appreciate your answer. TIA
    Rgds,

    Hi Steve,
    You can find the localization in field CINF.LawsSet.
    Not sure what you mean by "impact if I upgrade"; an upgrade should be a normal process...
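    For example, a direct query against the company database (assuming you can query it directly; CINF is the company-information table mentioned above) would show it:
    SELECT LawsSet FROM CINF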
    Regards,
    Frank

  • Possible Issues with Duplicate Training Database?

    Hello everyone!
    First off, please forgive my ignorance! I've only been working with databases lightly over the last couple of years, and thus far only with MySQL and MS SQL.
    Recently the company I work for purchased a new software package that required Oracle. We purchased a copy of Oracle 10.2 as part of the deal, and the company that sold it to us, and set it up for us, has denied us dev access to the database, on our server, because it would expose their underlying data schema. I wasn't in charge when the decision was made to spend the money on a crippled copy of Oracle, but that's in the past.
    Because we don't have dev access to Oracle we're unable to create a training database for this new software in house. We have to pay this company to create a test environment for us. I was told it would cost $500, until the tech said he was going to install Oracle on our training server.
    At this point I stopped him and said at $500 they'll either be breaching their licensing agreement, or giving us a free copy! The other possibility is that they were going to install this and then just bill us for Oracle later on, despite quoting the job at $500.
    The tech who was about to do the installation went to check with someone more senior and then returned to inform me that installing the second copy of Oracle would indeed require more licensing fees.
    This verified what I already had told him, and I said we want a training database on our existing Oracle server, and then training copies of the new software can just point to that training database.
    He now needs to take time with the more senior technicians while they're "analyzing the existing database server/environment". He was trying to talk me out of installing a training database on the existing server, and instead buying another copy of Oracle, which I believe to be a total cash grab. He had no solid reason why he couldn't do it, but refused to anyway and said he needs to talk to a more senior tech (the same one who said that yes, we'll need to purchase a second copy of Oracle rather than use a second database on the existing server).
    So, these two databases, one with live data and one a duplicate of that live data used in a training environment where changes won't be reported against, should be able to exist peacefully, should they not? The same version of the same software accessing the same data from the same server, just a different database, shouldn't present any problems with my limited knowledge of Oracle! There will only ever be maybe 5 connections to the training environment at a time, and the server should currently be able to handle twice as many users connected as its maximum (edit: expected) load.
    I know this wouldn't be a problem with MySQL, or MS SQL, and Oracle is a superior database package, so can anyone think of any legitimate reason that might back up the claim that having that duplicate database will be a problem?
    Thanks in advance for any guidance here!

    Thank you very much John!
    We have a per-user license and I don't expect our training needs to exceed even half a dozen people at a time to start (a small but significant fraction of the initial user base). In the future we can easily recommission other resources, or purchase more. The whole project is in the early stages anyway, so the needs are smaller now (but the money we spend on licensing could be recurring).
    I understand the requirements angle from a server perspective, but those points you brought up drill down into what I need to know a bit more.
    I doubt rollback segments would be necessary; I was thinking more along the lines of scheduled replication/duplication of the production database on a weekly or monthly basis, just to keep the material at least somewhat up to date.
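    For what it's worth, a minimal sketch of how such a weekly refresh could be scheduled inside the existing instance; the refresh_training_schema procedure is purely hypothetical and would wrap whatever export/import or copy mechanism ends up being used:
    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'REFRESH_TRAINING',
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'begin refresh_training_schema; end;',  -- hypothetical wrapper around the actual copy
        start_date      => SYSTIMESTAMP,
        repeat_interval => 'FREQ=WEEKLY;BYDAY=SUN;BYHOUR=2',       -- every Sunday at 02:00
        enabled         => TRUE,
        comments        => 'Weekly copy of production data into the training schema');
    END;
    /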
    I also don't think temporary table space would put much load on storage either, at least in the long run.
    RAM isn't much an issue with a 64 bit version running either, for our needs now at least.
    My instinct, reinforced by what you mention about the potential impact of altering the existing database schema, is that it's basically safer to create a separate database, but less resource-intensive to alter the schema of an existing Oracle database to suit training needs (in this situation, maybe, making some vague assumptions about the software), if done correctly (although still carrying risks).
    I do wish I had the time to read up on this more now, and I will as soon as I have the time.
    I probably will download 10g express edition to familiarize myself a bit more now, as soon as I come up with an application for it, to at least try out whatever I read.
    Thanks for the advice

  • Vision information

    I'm not very experienced in LabVIEW, and I'm using OCR with the Vision Assistant.
    I would like to know what the threshold is, and why I have to train the Vision Assistant with characters the database already has when I move the object farther away from the camera.
    Thanks in advance!

    This is from the context help:
    If you manually set the threshold values, and OCR/OCV cannot detect characters in the region you specified, the training process does not create the character set file correctly. For example, if you select Fixed Range from the Mode control, and you select a Max value that is too low, OCR/OCV cannot correctly detect the characters in the region of interest.
    Select a threshold type from the Mode control.
    When you select Linear or Nonlinear, you must set # of Blocks to at least the number of objects in the region of interest. If you selected Fixed Range in Type, use Min and Max or the slider at the bottom of the histogram to determine the threshold value. If you selected Computed, Linear, or Nonlinear in Type, use Characters to indicate the character color in the image. Light on Dark indicates that the image contains light characters on a dark background. Dark on Light indicates that the image contains dark characters on a light background.
    Moving the object farther away is expected to create inconsistencies. If you can post an image I will try to work something out.

  • Missing objects within database

    Hello,
    I'm fairly new to Oracle but have extensive knowledge of all versions of SQL Server. I was given the task of refreshing our training environment with production data. The training database was corrupt and couldn't be recovered, so I needed to start from scratch. The database was built, but there seem to be missing system views, stored procedures, tables and triggers.
    It would be ideal if I could take a production database snapshot, and restore over the existing training database. Can someone please give me the steps on how to do that? The files are all in the same location, but the only difference is that the database names are different.
    Any help would be much appreciated.
    Thank you,
    Pete

    825084 wrote:
    ...have extensive knowledge with all versions of SQL Server.
    Does SQL Server not use different forum spaces for different subjects? Oracle does.
    ...Can someone please give me the steps on how to do that?
    First step: post in the proper forum, General Database Discussions.
    I'm sure you'll get advised on the second step over there.
    Nicolas.

  • Database (in)dependence

    We got a two-day training on ARIS Business Designer last week and now have both the IDS Scheer and the Oracle editions installed on my laptop.
    ARIS Business Architect version is 7.0.2.155701
    Oracle Business Process Architect 7.0.1.163410
    The instructor asked us to use the restore function to setup the lab environment.
    When I try to load the backup into the Oracle environment I get the following error message:
    Could not load java class :C:\..\LocalServer\temp\olite\sp\source\com\idssheer\aris\server\dll\mapimpl\olite\sp\AARISDB_9OnDeleteSetNull.java
    According to the instructor, the ARIS install is running a Sybase SQL server. Somewhere an Oracle database config setting needs to be changed.
    Anyone with a clue?

    Hmm, no sorry. We only have a temporary license for a month for IDS BA. After that it's just the public Oracle BPA edition.
    I can send the training database to reproduce the error.
    Just thinking: what would happen with the demo70 database?
    Guess what? After 4 minutes of loading time: same error, now with AARISDB_10OnDeleteSetNull
    Interesting fact:
    BPA demo70 database size: 3,085 KB (aedfile_70)
    ARIS demo70 database size: 4,902 KB (aedfile_70)
    Logical next step: restore the original (small) demo70 database into Oracle BPA.
    SURPRISE: same error message! I cannot reload the demo70 database.
    Bottomline: script to reproduce the problem:
    start Oracle BA
    backup the demo70 database
    delete the demo70 database (after logging out)
    restore the demo70 database
    Don't try this at home: you have to reinstall the BPA suite to get your demo70 database back.

  • Oracle 8.1.7 database connects to a different database than the one specified

    Dear Experts,
    This is the most mysterious problem I have ever encountered. I executed some scripts that had connect canvas@trn_pcd as part of the script, and I entered the password in SQL*Plus when calling the script. The Production and Training databases had the same passwords. The connection was somehow established to the Production database, and the changes went to the Prod_pcdb8 instance rather than the trn_pcdb instance. There was no local installation of any Oracle tool; I used the SQL*Plus installation from one of the network machines. The DBA has confirmed that the TNSNAMES.ora file does not point trn_pcdb to the PROD_PCDB8 instance. The database the connection was made to is 8.1.7.0.0. What are the possible ways this can happen? Any ideas or experiences in this regard will be highly appreciated.
    Thanks and regards,
    Reddy
    [email protected]

    Thanks Justin. I will try that, but I was not able to reproduce it; it all happened in one SQL*Plus session. I had that session open and ran multiple scripts, each of them having connect user_id@trn_pcdb, and each of the changes in the scripts went to PROD_PCDB. Unfortunately, I had not realised at that point that they were going to PROD_PCDB, until I opened the Training application and, to my surprise, did not find any of the changes we had made. Our applications are developed in Developer 2000 and use a batch file which resets the Oracle registry, so when I started the application it went and reset the Oracle registry on the user machine. This time, I opened a SQL*Plus session from the ORACLE_HOME used by the application, re-ran the scripts, and they correctly went to the trn_pcdb database.
    So what are all the possible ways that the changes could go to a different database even though the connect string points to the right instance? Any ideas in this direction will be highly appreciated.
    Thanks,
    Reddy

  • SQLPlus Default Database Connection

    I have two Oracle databases installed on the same LINUX server. One is a training database that is rarely even running.
    When logging in through SQLPlus and only supplying userid and password, the database to which the connect is attempted varies depending on the userid I am currently logged in as under LINUX. Under one userid, SQLPlus even attempts to connect to the idle training database. The 8i documentation indicates that setting the default database for SQLPlus is OS specific. Does anyone know how this is specified under LINUX?

    Joseph,
    That was one of my guesses, as well, but it doesn't seem to behave consistently.
    The .bash_profile files for both the oracle and mdm users are the same; however, the oracle user is able to default to the correct database, whereas mdm cannot.
    Here's the profile for mdm:
    $ cat .bash_profile
    # .bash_profile
    # Get the aliases and functions
    if [ -f ~/.bashrc ]; then
    . ~/.bashrc
    fi
    # User specific environment and startup programs
    DISPLAY=localhost:0.0; export DISPLAY
    ORACLE_BASE=/usr/oracle; export ORACLE_BASE
    ORACLE_HOME=/usr/oracle; export ORACLE_HOME
    ORACLE_SID=pmisbk; export ORACLE_SID
    #ORACLE_SID=oralin; export ORACLE_SID
    ORA_NLS33=$ORACLE_HOME/ocommon/nls/admin/data; export ORA_NLS33
    ORAENV_ASK=NO; export ORAENV_ASK
    LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$ORACLE_HOME/lib; export LD_LIBRARY_PATH
    umask 022
    NLS_LANG=american_america.EE8ISO8859P2; export NLS_LANG
    #PATH=$PATH:$HOME/bin
    PATH=/usr/local/sbin:/usr/sbin:/sbin:/usr/local/ActivePerl-5.6/bin:$PATH:$HOME/bin:$ORACLE_HOME/bin
    BASH_ENV=$HOME/.bashrc
    USERNAME=""
    export USERNAME BASH_ENV PATH
    . /usr/local/bin/oraenv
    and behavior:
    $ echo $ORACLE_SID
    pmis
    $ sqlplus
    SQL*Plus: Release 8.1.6.0.0 - Production on Thu Apr 25 14:01:10 2002
    (c) Copyright 1999 Oracle Corporation. All rights reserved.
    Enter user-name: mdm
    Enter password:
    ERROR:
    ORA-01034: ORACLE not available
    Enter user-name: mdm@pmisbk
    Enter password:
    Connected to:
    Oracle8i Enterprise Edition Release 8.1.6.1.0 - Production
    With the Partitioning option
    JServer Release 8.1.6.0.0 - Production
    SQL> exit
    Disconnected from Oracle8i Enterprise Edition Release 8.1.6.1.0 - Production
    With the Partitioning option
    JServer Release 8.1.6.0.0 - Production
    and the .bash_profile for oracle:
    $ su - oracle
    Password:
    $ cat .bash_profile
    # .bash_profile
    # Get the aliases and functions
    if [ -f ~/.bashrc ]; then
    . ~/.bashrc
    fi
    # User specific environment and startup programs
    DISPLAY=localhost:0.0; export DISPLAY
    ORACLE_BASE=/usr/oracle; export ORACLE_BASE
    ORACLE_HOME=/usr/oracle; export ORACLE_HOME
    ORACLE_SID=pmis; export ORACLE_SID
    #ORACLE_SID=oralin; export ORACLE_SID
    ORA_NLS33=$ORACLE_HOME/ocommon/nls/admin/data; export ORA_NLS33
    ORAENV_ASK=NO; export ORAENV_ASK
    LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$ORACLE_HOME/lib; export LD_LIBRARY_PATH
    umask 022
    NLS_LANG=american_america.EE8ISO8859P2; export NLS_LANG
    CLASSPATH=.:$ORACLE_HOME/jdbc/lib/classes111.zip:$ORACLE_HOME/sqlj/lib/translator.zip; export CLASSPATH
    PATH=/usr/local/ActivePerl-5.6/bin:$PATH:$ORACLE_HOME/bin
    BASH_ENV=$HOME/.bashrc
    USERNAME=""
    export USERNAME BASH_ENV PATH
    . /usr/local/bin/oraenv
    and the behavior:
    $ sqlplus
    SQL*Plus: Release 8.1.6.0.0 - Production on Thu Apr 25 14:07:47 2002
    (c) Copyright 1999 Oracle Corporation. All rights reserved.
    Enter user-name: mdm
    Enter password:
    Connected to:
    Oracle8i Enterprise Edition Release 8.1.6.1.0 - Production
    With the Partitioning option
    JServer Release 8.1.6.0.0 - Production
    SQL> exit
    Disconnected from Oracle8i Enterprise Edition Release 8.1.6.1.0 - Production
    With the Partitioning option
    JServer Release 8.1.6.0.0 - Production
    Perhaps this is because the Home directory for the oracle user is the same as $ORACLE_HOME.
    You may have noticed the connect string for mdm was to pmisbk. That's the alias in the tnsnames.ora file; otherwise, if you try using pmis you get:
    Enter user-name: mdm@pmis
    Enter password:
    ERROR:
    ORA-12154: TNS:could not resolve service name
    and have to enter:
    Enter user-name: mdm@pmisbk
    Enter password:
    Connected to:
    Oracle8i Enterprise Edition Release 8.1.6.1.0 - Production
    With the Partitioning option
    JServer Release 8.1.6.0.0 - Production
    SQL> exit
    If you specify pmisbk as the ORACLE_SID in the .bash_profile file, when you log into the server it prompts you:
    ORACLE_HOME = [usr/oracle] ?
    presumably because /etc/oratab does not contain an entry for pmisbk. You can respond with RETURN and the variable contains the correct path, but all the other behavior is as described above.

  • Running AP transfer to General Ledger: Complete with Error

    Below is the error I got while trying to run AP Transfer to General Ledger; the request finished with "Complete with Error":
    Starting concurrent program execution...
    +-----------------------------
    Arguments
    p_selection_type='1'
    p_set_of_books_id='1'
    p_include_reporting_sob='N'
    p_batch_name='jginv'
    p_from_date='2007/01/01 00:00:00'
    p_to_date='2008/12/31 00:00:00'
    p_accounting_method='Accrual'
    p_journal_category='A'
    p_validate_account='Y'
    p_gl_transfer_mode='D'
    p_submit_journal_import='Y'
    p_summary_journal_entry='N'
    p_debug_flag='N'
    p_trace_flag='N'
    APPLLCSP Environment Variable set to :
    Current NLS_LANG and NLS_NUMERIC_CHARACTERS Environment Variables are :
    American_America.UTF8
    LOG :
    Report: c:\oracle11i\visappl\ap\11.5.0\reports\US\APXGLTRN.rdf
    Logged onto server:
    Username:
    LOG :
    Logged onto server:
    Username: APPS
    MSG MSG-00001: After SRWINIT
    MSG MSG-00002: After Get_Sob_Ids
    MSG MSG-00002: After Get_Company_Name
    MSG MSG-00003: After Get_NLS_Strings
    MSG MSG-00004: After Get_Base_Curr_Data
    MSG MSG-00005: Calling Transfer Request
    MSG MSG-00005: After calling Transfer Request
    MSG MSG-00100: Error occured in Insert_Liability_Balance
    MSG MSG-00101: Error Code :-20100
    MSG MSG-00102: Error Message :ORA-20100: File o0059033.tmp creation for FND_FILE failed.
    You will find more information on the cause of the error in request log.
    ORA-06512: at "APPS.FND_FILE", line 396
    ORA-06512: at "APPS.FND_FILE", line 499
    ORA-06512: at "APPS.AP_TRIAL_BALANCE_PKG", line 1252
    MSG MSG-00005: Insert_Liability_Balance Failed.
    MSG MSG-00000: User-Defined Exception
    ERR REP-1419: 'beforereport': PL/SQL program aborted.
    Program exited with status 3
    Cause: The program terminated, returning status code 3.
    Action: Check your installation manual for the meaning of this code on this operating system.
    Concurrent Manager encountered an error while running Oracle*Report for your concurrent request 2748532.
    Review your concurrent request log and/or report output file for more detailed information.
    Jon

    "Remember that this problem is from Vision training database and not real life production"
    It does not matter whether you have a Vision demo database or a fresh database installation.
    "The APPLPTMP directory does not even exist"
    - Open a new DOS session
    - cd c:\oracle11i\visappl
    - Run envshell.cmd
    - Type "echo %APPLPTMP%", does it return something?

  • How Can I Use Triggers to Pre-Populate a Table

    I've given up on using collections; time to move on to plan B.
    I'm working on building a training database and I have 3 tables: one which houses employee information (A), one which houses training courses (B), and a third which stores the results of completed training (C). What I'd like to do is store all possible combinations of A*B (M:M) in table C and throw a tabular form on table C to provide for mass updates.
    This would be easy enough if my employee base (A) and my training courses (B) weren't changing throughout the year. What I'd like to accomplish is to place a trigger on each table so that when it is updated the new record is copied to table C along with all combinations of the other table.
    For example:
    If I add a new employee A, a trigger on that table will run which takes the new employee and creates records in table C, with all possible combinations of courses in table B. (1:M)
    or
    If I add a new course B, a trigger on that table will run which takes the new course and creates records in table C, with all possible combinations of employees in table A. (M:1)
    Let's just say the tables look like the following:
    A
    employee_num (PK)
    employee_name
    password
    active
    B
    course_ID (PK)
    course_name
    Course_code
    Course_hours
    C
    ID (PK)
    employee_num (FK)
    course_ID (FK)
    approved_status
    approved_by
    date_taken
    Does anyone have any pseudo code, links, sample applications, encouraging words I could leverage to get started?
    Thanks much.

    I've never created a trigger to do what you're asking. I personally try to limit my use of triggers to surrogate keys.
    That being said, I have done something similar to your situation. If you are managing the employees and the courses in APEX, just make the page process create your new records in the other tables.
    Ex. Add new employee
    Declare
      v_id number;
    Begin
      -- "if creating": in an APEX page process this is typically the Create request
      If :REQUEST = 'CREATE' Then
        -- whatever supplies your next employee_num (a sequence is assumed here)
        Select employee_seq.nextval Into v_id From dual;
        -- issue your insert on the employee table, specifying v_id as the employee_num
        -- (:P1_EMPLOYEE_NAME is an assumed page item)
        Insert Into a (employee_num, employee_name, active)
        Values (v_id, :P1_EMPLOYEE_NAME, 'Y');
        -- then an insert-from-select for the "c" table, something like:
        Insert Into c (employee_num, course_id)
        Select v_id, course_id From b;
      End If;
    End;
    I would do something similar on the courses page to solve ex 2.
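    If you do want the trigger route described in the question, a minimal sketch for example 2 (a new course gets one row in C per existing employee) could look like the following; the table and column names come from the layout above, while the C_SEQ sequence and the 'PENDING' status are assumptions:
    Create Or Replace Trigger b_air
      After Insert On b
      For Each Row
    Begin
      -- one row in C for every existing employee, for the newly added course
      Insert Into c (id, employee_num, course_id, approved_status)
      Select c_seq.nextval, a.employee_num, :new.course_id, 'PENDING'
      From a;
    End;
    /
    The employee-side trigger (example 1) is the mirror image: After Insert On a, selecting course_id from b instead.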

  • DAC: failed task during ETL for financial apps

    I am trying my first ETL on OBIA 7.9.6.4.
    I'm using Oracle EBS 12.1.1 as the source system.
    The ETL completes 314 tasks successfully, but it fails at the task named:
    "SDE_ORA_GL_AR_REV_LinkageInformation_Extract"
    DAC Error log:
    =====================================
    STD OUTPUT
    =====================================
    Informatica(r) PMCMD, version [9.1.0 HotFix2], build [357.0903], Windows 32-bit
    Copyright (c) Informatica Corporation 1994 - 2011
    All Rights Reserved.
    Invoked at Wed Sep 18 09:46:41 2013
    Connected to Integration Service: [infor_int].
    Folder: [SDE_ORAR1211_Adaptor]
    Workflow: [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full]
    Instance: [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full]
    Mapping: [SDE_ORA_GL_AR_REV_LinkageInformation_Extract]
    Session log file: [C:\Informatica\server\infa_shared\SessLogs\.SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full.ORA_R1211.log]
    Source success rows: [0]
    Source failed rows: [0]
    Target success rows: [0]
    Target failed rows: [0]
    Number of transformation errors: [0]
    First error code [4035]
    First error message: [RR_4035 SQL Error [
    ORA-00904: "XLA_EVENTS"."UPG_BATCH_ID": invalid identifier
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT DISTINCT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE,
    AELINE.CODE_COMBINATI]
    Task run status: [Failed]
    Integration Service: [infor_int]
    Integration Service Process: [infor_int]
    Integration Service Grid: [infor_int]
    Node Name(s) [node01_AMAZON-9C628AAE]
    Preparation fragment
    Partition: [Partition #1]
    Transformation instance: [SQ_XLA_AE_LINES]
    Transformation: [SQ_XLA_AE_LINES]
    Applied rows: [0]
    Affected rows: [0]
    Rejected rows: [0]
    Throughput(Rows/Sec): [0]
    Throughput(Bytes/Sec): [0]
    Last error code [16004], message [ERROR: Prepare failed. : [
    ORA-00904: "XLA_EVENTS"."UPG_BATCH_ID": invalid identifier
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT DISTINCT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE,
    AELINE.CODE_CO]
    Start time: [Wed Sep 18 09:46:13 2013]
    End time: [Wed Sep 18 09:46:13 2013]
    Partition: [Partition #1]
    Transformation instance: [W_GL_LINKAGE_INFORMATION_GS]
    Transformation: [W_GL_LINKAGE_INFORMATION_GS]
    Applied rows: [0]
    Affected rows: [0]
    Rejected rows: [0]
    Throughput(Rows/Sec): [0]
    Throughput(Bytes/Sec): [0]
    Last error code [0], message [No errors encountered.]
    Start time: [Wed Sep 18 09:46:14 2013]
    End time: [Wed Sep 18 09:46:14 2013]
    Disconnecting from Integration Service
    Completed at Wed Sep 18 09:46:41 2013
    Informatica session logs:
    DIRECTOR> VAR_27028 Use override value [DataWarehouse] for session parameter:[$DBConnection_OLAP].
    DIRECTOR> VAR_27028 Use override value [ORA_R1211] for session parameter:[$DBConnection_OLTP].
    DIRECTOR> VAR_27028 Use override value [.SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full.ORA_R1211.log] for session parameter:[$PMSessionLogFile].
    DIRECTOR> VAR_27028 Use override value [26] for mapping parameter:[$$DATASOURCE_NUM_ID].
    DIRECTOR> VAR_27028 Use override value ['N'] for mapping parameter:[$$FILTER_BY_LEDGER_ID].
    DIRECTOR> VAR_27028 Use override value ['N'] for mapping parameter:[$$FILTER_BY_LEDGER_TYPE].
    DIRECTOR> VAR_27028 Use override value [] for mapping parameter:[$$Hint1].
    DIRECTOR> VAR_27028 Use override value [01/01/1970] for mapping parameter:[$$INITIAL_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [01/01/1990] for mapping parameter:[$$LAST_EXTRACT_DATE].
    DIRECTOR> VAR_27028 Use override value [1] for mapping parameter:[$$LEDGER_ID_LIST].
    DIRECTOR> VAR_27028 Use override value ['NONE'] for mapping parameter:[$$LEDGER_TYPE_LIST].
    DIRECTOR> TM_6014 Initializing session [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full] at [Wed Sep 18 09:46:13 2013].
    DIRECTOR> TM_6683 Repository Name: [infor_rep]
    DIRECTOR> TM_6684 Server Name: [infor_int]
    DIRECTOR> TM_6686 Folder: [SDE_ORAR1211_Adaptor]
    DIRECTOR> TM_6685 Workflow: [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full] Run Instance Name: [] Run Id: [2130]
    DIRECTOR> TM_6101 Mapping name: SDE_ORA_GL_AR_REV_LinkageInformation_Extract [version 1].
    DIRECTOR> TM_6963 Pre 85 Timestamp Compatibility is Enabled
    DIRECTOR> TM_6964 Date format for the Session is [MM/DD/YYYY HH24:MI:SS]
    DIRECTOR> TM_6827 [C:\Informatica\server\infa_shared\Storage] will be used as storage directory for session [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full].
    DIRECTOR> CMN_1805 Recovery cache will be deleted when running in normal mode.
    DIRECTOR> CMN_1802 Session recovery cache initialization is complete.
    DIRECTOR> TM_6708 Using configuration property [DisableDB2BulkMode ,Yes]
    DIRECTOR> TM_6708 Using configuration property [OraDateToTimestamp ,Yes]
    DIRECTOR> TM_6708 Using configuration property [overrideMpltVarWithMapVar,Yes]
    DIRECTOR> TM_6708 Using configuration property [SiebelUnicodeDB,[APPS]@[ 54.225.65.108:1521:VIS] [DWH_REP2]@[AMAZON-9C628AAE:1521:obiaDW1]]
    DIRECTOR> TM_6703 Session [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full] is run by 32-bit Integration Service  [node01_AMAZON-9C628AAE], version [9.1.0 HotFix2], build [0903].
    MANAGER> PETL_24058 Running Partition Group [1].
    MANAGER> PETL_24000 Parallel Pipeline Engine initializing.
    MANAGER> PETL_24001 Parallel Pipeline Engine running.
    MANAGER> PETL_24003 Initializing session run.
    MAPPING> CMN_1569 Server Mode: [ASCII]
    MAPPING> CMN_1570 Server Code page: [MS Windows Latin 1 (ANSI), superset of Latin1]
    MAPPING> TM_6151 The session sort order is [Binary].
    MAPPING> TM_6156 Using low precision processing.
    MAPPING> TM_6180 Deadlock retry logic will not be implemented.
    MAPPING> TM_6187 Session target-based commit interval is [10000].
    MAPPING> TM_6307 DTM error log disabled.
    MAPPING> TE_7022 TShmWriter: Initialized
    MAPPING> TE_7004 Transformation Parse Warning [IIF(EVENT_TYPE_CODE='RECP_REVERSE',
    IIF(UPG_BATCH_ID>0,
    SOURCE_TABLE || '~' || DISTRIBUTION_ID,
    SOURCE_TABLE || '~RECEIPTREVERSE~' || DISTRIBUTION_ID),
    SOURCE_TABLE || '~' || DISTRIBUTION_ID)
    ]; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning [<<PM Parse Warning>> [||]: operand converted to a string
    ... IIF(EVENT_TYPE_CODE='RECP_REVERSE',
    IIF(UPG_BATCH_ID>0,
    SOURCE_TABLE || '~' || >>>>DISTRIBUTION_ID<<<<,
    SOURCE_TABLE || '~RECEIPTREVERSE~' || DISTRIBUTION_ID),
    SOURCE_TABLE || '~' || DISTRIBUTION_ID)
    <<PM Parse Warning>> [||]: operand converted to a string
    ... IIF(EVENT_TYPE_CODE='RECP_REVERSE',
    IIF(UPG_BATCH_ID>0,
    SOURCE_TABLE || '~' || DISTRIBUTION_ID,
    SOURCE_TABLE || '~RECEIPTREVERSE~' || >>>>DISTRIBUTION_ID<<<<),
    SOURCE_TABLE || '~' || DISTRIBUTION_ID)
    <<PM Parse Warning>> [||]: operand converted to a string
    ... IIF(EVENT_TYPE_CODE='RECP_REVERSE',
    IIF(UPG_BATCH_ID>0,
    SOURCE_TABLE || '~' || DISTRIBUTION_ID,
    SOURCE_TABLE || '~RECEIPTREVERSE~' || DISTRIBUTION_ID),
    SOURCE_TABLE || '~' || >>>>DISTRIBUTION_ID<<<<)
    ]; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning [JE_HEADER_ID || '~' || JE_LINE_NUM]; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning [<<PM Parse Warning>> [||]: operand converted to a string
    ... >>>>JE_HEADER_ID<<<< || '~' || JE_LINE_NUM<<PM Parse Warning>> [JE_LINE_NUM]: operand converted to a string
    ... JE_HEADER_ID || '~' || >>>>JE_LINE_NUM<<<<]; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning [AE_HEADER_ID || '~' || AE_LINE_NUM]; transformation continues...
    MAPPING> TE_7004 Transformation Parse Warning [<<PM Parse Warning>> [||]: operand converted to a string
    ... >>>>AE_HEADER_ID<<<< || '~' || AE_LINE_NUM<<PM Parse Warning>> [AE_LINE_NUM]: operand converted to a string
    ... AE_HEADER_ID || '~' || >>>>AE_LINE_NUM<<<<]; transformation continues...
    MAPPING> TM_6007 DTM initialized successfully for session [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full]
    DIRECTOR> PETL_24033 All DTM Connection Info: [<NONE>].
    MANAGER> PETL_24004 Starting pre-session tasks. : (Wed Sep 18 09:46:13 2013)
    MANAGER> PETL_24027 Pre-session task completed successfully. : (Wed Sep 18 09:46:13 2013)
    DIRECTOR> PETL_24006 Starting data movement.
    MAPPING> TM_6660 Total Buffer Pool size is 12582912 bytes and Block size is 128000 bytes.
    READER_1_1_1> DBG_21438 Reader: Source is [54.225.65.108:1521/VIS], user [APPS]
    READER_1_1_1> BLKR_16003 Initialization completed successfully.
    WRITER_1_*_1> WRT_8146 Writer: Target is database [AMAZON-9C628AAE:1521/obiaDW1], user [DWH_REP2], bulk mode [ON]
    WRITER_1_*_1> WRT_8106 Warning! Bulk Mode session - recovery is not guaranteed.
    WRITER_1_*_1> WRT_8124 Target Table W_GL_LINKAGE_INFORMATION_GS :SQL INSERT statement:
    INSERT INTO W_GL_LINKAGE_INFORMATION_GS(SOURCE_DISTRIBUTION_ID,JOURNAL_LINE_INTEGRATION_ID,LEDGER_ID,LEDGER_TYPE,DISTRIBUTION_SOURCE,JE_BATCH_NAME,JE_HEADER_NAME,JE_LINE_NUM,POSTED_ON_DT,GL_ACCOUNT_ID,SLA_TRX_INTEGRATION_ID,DATASOURCE_NUM_ID)  VALUES ( ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
    WRITER_1_*_1> WRT_8270 Target connection group #1 consists of target(s) [W_GL_LINKAGE_INFORMATION_GS]
    WRITER_1_*_1> WRT_8003 Writer initialization complete.
    READER_1_1_1> BLKR_16007 Reader run started.
    READER_1_1_1> RR_4029 SQ Instance [SQ_XLA_AE_LINES] User specified SQL Query [SELECT DISTINCT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE,
    AELINE.CODE_COMBINATION_ID,
    AEHEADER.EVENT_TYPE_CODE,
    NVL(XLA_EVENTS.UPG_BATCH_ID,0) UPG_BATCH_ID
    FROM XLA_DISTRIBUTION_LINKS DLINK
       , GL_IMPORT_REFERENCES        GLIMPREF
       , XLA_AE_LINES                              AELINE
       , GL_JE_HEADERS                         JHEADER
       , GL_JE_BATCHES                         JBATCH
       , GL_LEDGERS                                 T
       , GL_PERIODS   PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
             (  'AR_DISTRIBUTIONS_ALL'
              , 'RA_CUST_TRX_LINE_GL_DIST_ALL')
    AND DLINK.APPLICATION_ID = 222
    AND AELINE.APPLICATION_ID = 222
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID         = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID         = DLINK.AE_HEADER_ID        
    AND AELINE.AE_LINE_NUM           = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID   = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID       = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID                   = T.LEDGER_ID
    AND JHEADER.STATUS                         = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
              TO_DATE('01/01/1970 00:00:00'
                    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('N', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')]
    READER_1_1_1> RR_4049 SQL Query issued to database : (Wed Sep 18 09:46:13 2013)
    WRITER_1_*_1> WRT_8005 Writer run started.
    WRITER_1_*_1> WRT_8158
    *****START LOAD SESSION*****
    Load Start Time: Wed Sep 18 09:46:13 2013
    Target tables:
         W_GL_LINKAGE_INFORMATION_GS
    READER_1_1_1> CMN_1761 Timestamp Event: [Wed Sep 18 09:46:13 2013]
    READER_1_1_1> RR_4035 SQL Error [
    ORA-00904: "XLA_EVENTS"."UPG_BATCH_ID": invalid identifier
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT DISTINCT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE,
    AELINE.CODE_COMBINATION_ID,
    AEHEADER.EVENT_TYPE_CODE,
    NVL(XLA_EVENTS.UPG_BATCH_ID,0) UPG_BATCH_ID
    FROM XLA_DISTRIBUTION_LINKS DLINK
       , GL_IMPORT_REFERENCES        GLIMPREF
       , XLA_AE_LINES                              AELINE
       , GL_JE_HEADERS                         JHEADER
       , GL_JE_BATCHES                         JBATCH
       , GL_LEDGERS                                 T
       , GL_PERIODS   PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
             (  'AR_DISTRIBUTIONS_ALL'
              , 'RA_CUST_TRX_LINE_GL_DIST_ALL')
    AND DLINK.APPLICATION_ID = 222
    AND AELINE.APPLICATION_ID = 222
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID         = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID         = DLINK.AE_HEADER_ID        
    AND AELINE.AE_LINE_NUM           = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID   = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID       = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID                   = T.LEDGER_ID
    AND JHEADER.STATUS                         = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
              TO_DATE('01/01/1970 00:00:00'
                    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('N', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    Oracle Fatal Error
    Database driver error...
    Function Name : Execute
    SQL Stmt : SELECT DISTINCT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE,
    AELINE.CODE_COMBINATION_ID,
    AEHEADER.EVENT_TYPE_CODE,
    NVL(XLA_EVENTS.UPG_BATCH_ID,0) UPG_BATCH_ID
    FROM XLA_DISTRIBUTION_LINKS DLINK
       , GL_IMPORT_REFERENCES        GLIMPREF
       , XLA_AE_LINES                              AELINE
       , GL_JE_HEADERS                         JHEADER
       , GL_JE_BATCHES                         JBATCH
       , GL_LEDGERS                                 T
       , GL_PERIODS   PER
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
             (  'AR_DISTRIBUTIONS_ALL'
              , 'RA_CUST_TRX_LINE_GL_DIST_ALL')
    AND DLINK.APPLICATION_ID = 222
    AND AELINE.APPLICATION_ID = 222
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID         = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID         = DLINK.AE_HEADER_ID        
    AND AELINE.AE_LINE_NUM           = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID   = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID       = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID                   = T.LEDGER_ID
    AND JHEADER.STATUS                         = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
              TO_DATE('01/01/1970 00:00:00'
                    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('N', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    Oracle Fatal Error].
    READER_1_1_1> CMN_1761 Timestamp Event: [Wed Sep 18 09:46:13 2013]
    READER_1_1_1> BLKR_16004 ERROR: Prepare failed.
    WRITER_1_*_1> WRT_8333 Rolling back all the targets due to fatal session error.
    WRITER_1_*_1> WRT_8325 Final rollback executed for the target [W_GL_LINKAGE_INFORMATION_GS] at end of load
    WRITER_1_*_1> WRT_8035 Load complete time: Wed Sep 18 09:46:13 2013
    LOAD SUMMARY
    ============
    WRT_8036 Target: W_GL_LINKAGE_INFORMATION_GS (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
    WRT_8044 No data loaded for this target
    WRITER_1_*_1> WRT_8043 *****END LOAD SESSION*****
    MANAGER> PETL_24031
    ***** RUN INFO FOR TGT LOAD ORDER GROUP [1], CONCURRENT SET [1] *****
    Thread [READER_1_1_1] created for [the read stage] of partition point [SQ_XLA_AE_LINES] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [TRANSF_1_1_1] created for [the transformation stage] of partition point [SQ_XLA_AE_LINES] has completed. The total run time was insufficient for any meaningful statistics.
    Thread [WRITER_1_*_1] created for [the write stage] of partition point [W_GL_LINKAGE_INFORMATION_GS] has completed. The total run time was insufficient for any meaningful statistics.
    MANAGER> PETL_24005 Starting post-session tasks. : (Wed Sep 18 09:46:14 2013)
    MANAGER> PETL_24029 Post-session task completed successfully. : (Wed Sep 18 09:46:14 2013)
    MAPPING> TM_6018 The session completed with [0] row transformation errors.
    MANAGER> PETL_24002 Parallel Pipeline Engine finished.
    DIRECTOR> PETL_24013 Session run completed with failure.
    DIRECTOR> TM_6022
    SESSION LOAD SUMMARY
    ================================================
    DIRECTOR> TM_6252 Source Load Summary.
    DIRECTOR> CMN_1740 Table: [SQ_XLA_AE_LINES] (Instance Name: [SQ_XLA_AE_LINES])
      Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6253 Target Load Summary.
    DIRECTOR> CMN_1740 Table: [W_GL_LINKAGE_INFORMATION_GS] (Instance Name: [W_GL_LINKAGE_INFORMATION_GS])
      Output Rows [0], Affected Rows [0], Applied Rows [0], Rejected Rows [0]
    DIRECTOR> TM_6023
    ===================================================
    DIRECTOR> TM_6020 Session [SDE_ORA_GL_AR_REV_LinkageInformation_Extract_Full] completed at [Wed Sep 18 09:46:14 2013].
    *I did some queries in my source database (Vision): table "XLA_EVENTS" exists, and column "UPG_BATCH_ID" also exists.
    *I added "XLA_EVENTS" to the FROM clause and ran it in SQL Developer.
    *In the SELECT clause I see a column named "AEHEADER.EVENT_TYPE_CODE", but there is no table named "AEHEADER" in the FROM clause, so I added it manually; it probably refers to "XLA_AE_HEADERS".
    The final query looks like this:
    SELECT DISTINCT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
    AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE,
    AELINE.CODE_COMBINATION_ID,
    AEHEADER.EVENT_TYPE_CODE,
    NVL(XLA_EVENTS.UPG_BATCH_ID,0) UPG_BATCH_ID
    FROM XLA_DISTRIBUTION_LINKS DLINK
       , GL_IMPORT_REFERENCES        GLIMPREF
       , XLA_AE_LINES                              AELINE
       , GL_JE_HEADERS                         JHEADER
       , GL_JE_BATCHES                         JBATCH
       , GL_LEDGERS                                 T
       , GL_PERIODS   PER
       , XLA_AE_HEADERS AEHEADER
       , XLA_EVENTS
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
             (  'AR_DISTRIBUTIONS_ALL'
              , 'RA_CUST_TRX_LINE_GL_DIST_ALL')
    AND DLINK.APPLICATION_ID = 222
    AND AELINE.APPLICATION_ID = 222
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID         = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID         = DLINK.AE_HEADER_ID        
    AND AELINE.AE_LINE_NUM           = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID   = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID       = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID                   = T.LEDGER_ID
    AND JHEADER.STATUS                         = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND JHEADER.CREATION_DATE >=
              TO_DATE('01/01/1970 00:00:00'
                    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE('N', 'Y', T.LEDGER_ID, 1) IN (1)
    AND DECODE('N', 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ('NONE')
    *When I run that query, it executes for a long time without returning any results (last time it ran for 4 hours before I cancelled it).
    My questions are:
    - What's wrong with that query?
    - How can I change the query in the workflow?
    Could anyone please help?

    Thank you very much.
    I found SQ_XLA_AE_LINES and checked its SQL query; it's a perfectly healthy query:
    SELECT  DISTINCT
    DLINK.SOURCE_DISTRIBUTION_ID_NUM_1 DISTRIBUTION_ID,
    DLINK.SOURCE_DISTRIBUTION_TYPE SOURCE_TABLE,
          AELINE.ACCOUNTING_CLASS_CODE,
    GLIMPREF.JE_HEADER_ID JE_HEADER_ID,
    GLIMPREF.JE_LINE_NUM JE_LINE_NUM,
    AELINE.AE_HEADER_ID AE_HEADER_ID,
    AELINE.AE_LINE_NUM AE_LINE_NUM,
    T.LEDGER_ID LEDGER_ID,
    T.LEDGER_CATEGORY_CODE LEDGER_TYPE,
        JBATCH.NAME BATCH_NAME,
       JHEADER.NAME HEADER_NAME,
          PER.END_DATE,
    AELINE.CODE_COMBINATION_ID,
    AEHEADER.EVENT_TYPE_CODE,
    NVL(XLA_EVENTS.UPG_BATCH_ID,0) UPG_BATCH_ID
    FROM XLA_DISTRIBUTION_LINKS DLINK
       , GL_IMPORT_REFERENCES        GLIMPREF
       , XLA_AE_LINES                              AELINE
       , XLA_AE_HEADERS AEHEADER
       , GL_JE_HEADERS                         JHEADER
       , GL_JE_BATCHES                         JBATCH
       , GL_LEDGERS                                 T
       , GL_PERIODS   PER
       , XLA_EVENTS
    WHERE DLINK.SOURCE_DISTRIBUTION_TYPE IN
             (  'AR_DISTRIBUTIONS_ALL'
              , 'RA_CUST_TRX_LINE_GL_DIST_ALL')
    AND DLINK.APPLICATION_ID = 222
    AND AELINE.APPLICATION_ID = 222
    AND AEHEADER.APPLICATION_ID = 222
    AND XLA_EVENTS.APPLICATION_ID=222
    AND AEHEADER.AE_HEADER_ID = AELINE.AE_HEADER_ID
    AND AELINE.GL_SL_LINK_TABLE = GLIMPREF.GL_SL_LINK_TABLE
    AND AELINE.GL_SL_LINK_ID         = GLIMPREF.GL_SL_LINK_ID
    AND AELINE.AE_HEADER_ID         = DLINK.AE_HEADER_ID        
    AND AELINE.AE_LINE_NUM           = DLINK.AE_LINE_NUM
    AND GLIMPREF.JE_HEADER_ID   = JHEADER.JE_HEADER_ID
    AND JHEADER.JE_BATCH_ID       = JBATCH.JE_BATCH_ID
    AND JHEADER.LEDGER_ID                   = T.LEDGER_ID
    AND JHEADER.STATUS                         = 'P'
    AND T.PERIOD_SET_NAME = PER.PERIOD_SET_NAME
    AND JHEADER.PERIOD_NAME = PER.PERIOD_NAME
    AND AEHEADER.EVENT_ID=XLA_EVENTS.EVENT_ID
    AND JHEADER.LAST_UPDATE_DATE >=
              TO_DATE('$$LAST_EXTRACT_DATE'
                    , 'MM/DD/YYYY HH24:MI:SS' )
    AND DECODE($$FILTER_BY_LEDGER_ID, 'Y', T.LEDGER_ID, 1) IN ($$LEDGER_ID_LIST)
    AND DECODE($$FILTER_BY_LEDGER_TYPE, 'Y', T.LEDGER_CATEGORY_CODE, 'NONE') IN ($$LEDGER_TYPE_LIST)
    I compared this query with the query that appears in the error messages and they are different (the error message is quoted in the first post).
    The query that appears in the error messages is missing a couple of lines, specifically in the FROM clause and the WHERE clause.
    What might cause that issue?

  • Some problems installing BIEE11g

    hello everyone,
    I have tried to install BIEE11g three times, and each time I hit the same mistake: 'ASInstance install error'. When I look in the log file it says 'ASInstance error, Error in starting opmn server. operation aborted because of a system call failure or internal error'. To be honest, I could not solve this problem even though I read the BIEE11g documentation.
    If I choose to skip the error and continue the install, I cannot visit http://localhost:7001/analytics and cannot successfully run startall in opmn.
    So I do not know how to solve this problem, please help me!
    Notes: my OS is Win XP SP3, the Vision database is 10.2.0.1.0, and RCU is 1.1.5.0. Actually, when I installed the Oracle schemas with RCU it raised some errors and I chose to continue. Is the Vision version a mismatch?
    thanks,
    regards,
    phoeny

    Hi Guys
    Let me share my experience.
    That Adobe Extension Manager CS4 was not working correctly for me.
    System: Win XP Pro SP2, Adobe CS4 Master Collection. Flex Builder 3,  SDK 3.4.1, Flex Component Kit 1.1.2, Adobe Extension Manager 2.1.112
    In folder
    C:\Documents and Settings\{username}\Local Settings\Application Data\Adobe
    I have both
    Flash CS4 (current)
    and
    Flash CS3 (left with few files after uninstalling of CS3)
    Apparently Extension Manager put FlexComponentBase.swc into the Flash CS3 folder, so I removed the CS3 folder and tried again.
    I was also using Russian as the language for non-Unicode characters while my default (keyboard bindings) is English, so Extension Manager was installing into the ru folder and as a result I could not see "Convert to Flex Component" in the commands of Flash CS4. Simply copying FlexComponentBase.swc to the en folder (Adobe\Flash CS4\en\Configuration\Components) did not help.
    So I set the default language (for non-Unicode characters) in the Regional settings of the Windows Control Panel to English and rebooted XP.
    Then I installed the Flex Component Kit again.
    Finally it went to the correct folder: C:\Documents and Settings\{username}\Local Settings\Application Data\Adobe\Flash CS4\en\Configuration\Components\Flex
    Flash CS4 now shows 2 new options: Convert Symbol to Flex Container and Convert Symbol to Flex Component.
    Doc

  • How to solve the ASInstance error

    hello everyone,
    When I install BIEE11g I always hit the same mistake: 'ASInstance install error'. When I look in the log file it says 'ASInstance error, Error in starting opmn server. operation aborted because of a system call failure or internal error'. Could you tell me how to solve it?
    If I choose to skip the error and continue the install, I cannot visit http://localhost:7001/analytics and cannot successfully run the startall command in opmn.
    Notes: my OS is Win XP SP3, the Vision database is 11.2.0.1.0, and RCU is 1.1.5.0.
    thank you very much.
    regards,
    phoeny


  • IMP-00017: following statement failed with ORACLE error 1276

    I have exported the full database without rows from the live database and tried to import it into the training database.
    Both use the ASM file system, and I have pre-created the tablespaces in the training database.
    But I'm getting the error IMP-00017: following statement failed with ORACLE error 1276.
    It tries to create the tablespace and datafile on the live ASM location rather than on the training database's ASM location.
    The live database ASM location is WEBD and the training database is at TRAIND.
    It tries to import on WEBD rather than TRAIND and throws the error.
    Any help is appreciated; I have googled it and can't find anything helpful.
    I'm a bit new to DBA stuff.
    Is there any way to import and export data on an ASM file system?
    Thanks

    When Data Pump exports the tablespace information, all it knows about is the current location of your datafiles. Let's take a simple example: say you have tablespace foo with a file called foo.dbf that lives in a Windows directory c:\. If you now try to import that into a Linux box, Data Pump does not know that; all it knows is that it needs to create the tablespace datafile in c:\. If you are changing the locations of the datafiles, you need to use remap_datafile=old_datafile:new_datafile ... in your impdp command. Or, pre-create them and use exclude=tablespace. The old_datafile and new_datafile need to be the complete path.
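    For reference, a sketch of the same remap done through the DBMS_DATAPUMP PL/SQL API; the dump-file name, the DP_DIR directory object, and the ASM paths are assumptions you would replace with your own:
    DECLARE
      h         NUMBER;
      job_state VARCHAR2(30);
    BEGIN
      h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'FULL', job_name => 'TRAIN_IMP');
      DBMS_DATAPUMP.ADD_FILE(handle    => h,
                             filename  => 'exp_norows.dmp',
                             directory => 'DP_DIR',
                             filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
      -- rewrite each datafile path from the live disk group to the training one (complete paths required)
      DBMS_DATAPUMP.METADATA_REMAP(handle    => h,
                                   name      => 'REMAP_DATAFILE',
                                   old_value => '+WEBD/webdb/datafile/users01.dbf',
                                   value     => '+TRAIND/traindb/datafile/users01.dbf');
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);
    END;
    /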
    Hope this helps.
    Dean

  • While installing Oracle Apps 11.5.10 the post-installation JSP check fails

    Hi Guys,
    While installing Oracle Apps 11.5.10 on Oracle Linux OEL 4.7 I am facing one post-installation check failure, i.e. the JSP check. Even when I try aoljtest it fails.
    Here is the output of the error log file and the access log file.
    Environment:
    Oracle apps version:11.5.10
    Os :oracle linux 4.7
    bit :32 bit
    database : vision type
    Error_log file contains
    [Thu Aug 23 03:38:27 2012] [error] OPM:Can not find one alive process
    [Thu Aug 23 03:38:27 2012] [error] [client 49.248.45.6] File does not exist: /oa_servlets/AppsLogin
    [Thu Aug 23 03:38:37 2012] [error] OPM:Can not find one alive process
    [Thu Aug 23 03:38:37 2012] [error] [client 49.248.45.6] File does not exist: /servlets/weboam/oam/oamLogin/oam/oamLogin
    Access_log file contains
    173.49.29.80 - - [23/Aug/2012:01:08:08 -0400] "GET /OA_HTML/jsp/fnd/fndhelp.jsp?dbc=/11i/applmgr/visappl/fnd/11.5.0/secure/VIS_ora11i/vis.dbc HTTP/1.1" 404 239
    173.49.29.80 - - [23/Aug/2012:01:08:08 -0400] "GET /OA_HTML/US/ICXINDEX.htm HTTP/1.1" 200 3483
    49.248.45.6 - - [23/Aug/2012:01:12:38 -0400] "GET /aplogon.html HTTP/1.1" 304 -
    49.248.45.6 - - [23/Aug/2012:01:12:40 -0400] "GET /oa_servlets/AppsLogin HTTP/1.1" 404 227
    49.248.45.6 - - [23/Aug/2012:01:12:49 -0400] "GET /servlets/weboam/oam/oamLogin HTTP/1.1" 404 234
    49.248.45.6 - - [23/Aug/2012:01:16:36 -0400] "GET /FND_WEB.PING HTTP/1.1" 404 218
    smtp.ttml.co.in - - [23/Aug/2012:01:19:06 -0400] "GET /servlets/IsItWorking HTTP/1.1" 403 230
    smtp.ttml.co.in - - [23/Aug/2012:01:19:11 -0400] "GET /servlets/IsItWorking HTTP/1.1" 403 230
    smtp.ttml.co.in - - [23/Aug/2012:01:20:43 -0400] "GET /servlets/IsItWorking HTTP/1.1" 403 230
    49.248.45.6 - - [23/Aug/2012:03:38:23 -0400] "GET /oa_servlets/AppsLogin HTTP/1.1" 404 227
    49.248.45.6 - - [23/Aug/2012:03:38:27 -0400] "GET /oa_servlets/AppsLogin HTTP/1.1" 404 227
    49.248.45.6 - - [23/Aug/2012:03:38:32 -0400] "GET / HTTP/1.1" 304 -
    49.248.45.6 - - [23/Aug/2012:03:38:32 -0400] "GET /apptitle.html HTTP/1.1" 304 -
    49.248.45.6 - - [23/Aug/2012:03:38:32 -0400] "GET /applist.html HTTP/1.1" 304 -
    49.248.45.6 - - [23/Aug/2012:03:38:32 -0400] "GET /aplogon.html HTTP/1.1" 304 -
    49.248.45.6 - - [23/Aug/2012:03:38:32 -0400] "GET /appsmed3.gif HTTP/1.1" 304 -
    49.248.45.6 - - [23/Aug/2012:03:38:37 -0400] "GET /servlets/weboam/oam/oamLogin HTTP/1.1" 404 234
    49.248.45.6 - - [23/Aug/2012:03:38:44 -0400] "GET /appdet.html HTTP/1.1" 304 -
    49.248.45.6 - - [23/Aug/2012:03:38:47 -0400] "GET /appdet.html HTTP/1.1" 304 -
    Please provide some solution.

    Please see these docs.
    Servlet and JSP Pages return a "Page Not Found" Error After Install of Applications [ID 372096.1]
    "Unrecognized Option: -Client,Could Not Create The Java Virtual Machine." With Applications 11i [ID 358061.1]
    Receiving ''Internal Server Error'' When Checking Out From Supplier Punchout After Upgrade to Java Development Kit (JDK) 6 [ID 781750.1]
    Thanks,
    Hussein
