Load Huge volume of data

Hi,
We have a go-live coming up soon and I need to initialize the data from a huge table which is almost 1.5 TB. This is a billing condition table. Any tips on how to load the data effectively, without any data loss and in an optimal time frame?
Thanks

Hi,
This assumes you have source downtime, and that you have selection conditions on company code or fiscal year/period, or anything else specific to your requirements.
If it is fiscal year/period, you can pull repair full requests for all the periods, e.g. 001.2009 to 006.2009, and immediately create one more InfoPackage to load 007.2009 to 012.2009. In this way you can keep intervals for all the fiscal years, depending on how much history you have, e.g. 2000 to 2009 or 2001 to 2009.
Before doing this, make sure you have enough background processes and table space available. Also keep SM12 authorizations handy, and be ready to kill a job in SM50 if something goes wrong.
After pulling everything, if there is still downtime left, do an init without data transfer with all the fiscal year/periods in the selections, e.g. 001.2000 to 012.2009.
Please start the InfoPackage with "Start Later in Background".
Cheers,
Vikram

Similar Messages

  • In Bdc I have huge volume of data to upload for the given transaction

    Hi gurus,
    In BDC I have a huge volume of data to upload for the given transaction. I am using the session method, and it takes a lot of execution time to complete the whole transaction. Is there any other method to process this huge volume in minimum time?
    reward awaiting
    with regards
    Thambe

    Selection of the BDC method depends on the type of requirement you have. But you can decide which one will suit your requirement based on the differences between the two methods. The following are the differences between Session & Call Transaction.
    Session method.
    1) synchronous processing.
    2) can transfer large amounts of data.
    3) processing is slower.
    4) error log is created
    5) data is not updated until session is processed.
    Call transaction.
    1) asynchronous processing
    2) can transfer small amount of data
    3) processing is faster.
    4) errors need to be handled explicitly
    5) data is updated automatically
    Batch Data Communication (BDC) is the oldest batch interfacing technique that SAP has provided since the early versions of R/3. BDC is not a typical integration tool, in the sense that it can only be used for uploading data into R/3, and so it is
    not bi-directional.
    BDC works on the principle of simulating user input for transactional screen, via an ABAP program.
    Typically the input comes in the form of a flat file. The ABAP program reads this file and formats the input data screen by screen into an internal table (BDCDATA). The transaction is then started using this internal table as the input and executed in the background.
    In ‘Call Transaction’, the transactions are triggered at the time of processing itself and so the ABAP program must do the error handling. It can also be used for real-time interfaces and custom error handling & logging features. Whereas in
    Batch Input Sessions, the ABAP program creates a session with all the transactional data, and this session can be viewed, scheduled and processed (using Transaction SM35) at a later time. The latter technique has a built-in error processing mechanism too.
    Batch Input (BI) programs still use the classical BDC approach but don't require an ABAP program to be written to format the BDCDATA. The user has to format the data using predefined structures and store it in a flat file. The BI program then reads this and invokes the transaction mentioned in the header record of the file.
    Direct Input (DI) programs work in a very similar way to BI programs. The only difference is that, instead of processing screens, they validate fields and load the data directly into tables using standard function modules. For this reason, DI programs are much faster (RMDATIND, the Material Master DI program, works at least 5 times faster) than their BDC counterparts and so are ideally suited for loading large volumes of data. DI programs are not available for all application areas.
    Synchronous & asynchronous updating:
    http://www.icesoft.com/developer_guides/icefaces/htmlguide/devguide/keyConcepts4.html
    Synchronous & asynchronous processing:
    Asynchronous refers to processes that do not depend on each other's outcome, and can therefore occur on different threads simultaneously. The opposite is synchronous. Synchronous processes wait for one to complete before the next begins. For those Group Policy settings for which both types of processes are available as options, you choose between the faster asynchronous or the safer, more predictable synchronous processing.
    By default, the processing of Group Policy is synchronous. Computer policy is completed before the CTRL+ALT+DEL dialog box is presented, and user policy is completed before the shell is active and available for the user to interact with it.
    Note
    You can change this default behavior by using a policy setting for each so that processing is asynchronous. This is not recommended unless there are compelling performance reasons. To provide the most reliable operation, leave the processing as synchronous.

  • Upload of huge volume of data with DTP

    Hi Experts,
    I have a problem with the upload of a huge volume of data with DTPs.
    My initialization is done, as I am doing reloads. Now I have data from fiscal year period 000.2010 to 016.9999.
    I have a huge volume of data.
    I have tried uploading this data in chunks by splitting it into 3 months per DTP and making full loads.
    But when I processed the DTP, the data packages are decided at the source and I get about 2000 data packages.
    Now my request turns red after processing about 1000 data packages, and the batch processes allocated to this also stop.
    I have tried dividing the DTP by month only and processing it, and I have the same problem. I have deleted the indexes before uploading to the cube and changed the batch processing setting from 3 to 5.
    Please can anyone advise what the problem could be? I am doing these reloads in the quality system.
    How can I upload this data, which runs into millions of records?
    Thanks,
    Tati

    Hi Galban,
    I have increased the parallel processing from 3 to 5, and also looked at the data package size.
    Can you please advise how I can increase the data package size? For my upload, the package size corresponds to the package size in the source and is determined dynamically at runtime.
    Please advise.
    Thanks
    Tati

  • Error while extracting huge volumes of data from BW

    Hi,
    We see this error while extracting huge volumes of data (approx. 3.4 million rows, with a large number of columns):
    R3C-151001: |Dataflow DF_SAPSI_SAPSI3131_SAPBW_To_Teradata
    Error calling R/3 to get table data: <RFC Error:
    Key: TSV_TNEW_PAGE_ALLOC_FAILED
    Status: EXCEPTION SYSTEM_FAILURE RAISED
    No more storage space available for extending an internal table.
    >.
    We are not sure if DoP works with SAP BW as the source, but when we tried with DoP as well, we got the same error.
    Will this issue be resolved with an R/3 or ABAP dataflow? Can anyone suggest some possible solutions for this scenario?
    Sri

    The problem is that you've reached the maximum memory configured for your system.
    If this is a batch job, reconfigure the profile parameter
    abap/heap_area_nondia
    Markus

  • Most efficient method of storing configuration data for huge volume of data

    The scenario I am stuck with is as follows:
    I have a huge volume of raw data (as CSV files).
    This data needs to be rated based on the configuration tables.
    The output is again CSV data with some new fields appended to the original records.
    These new fields are derived from original data based on the configuration tables.
    There are around 15 configuration tables.
    Out of these 15 tables 4 tables have huge configurations.
    One table has 15 million configuration records of 10 columns.
    The other three tables have around 1-1.5 million configuration records of 10-20 columns each.
    Now in order to carry forward my rating process, i'm left with the following methods:
    1) Leave the configurations in database table. Query the table for each configuration required.
    Disadvantage: Even if the indexes are created on the table, it takes a lot of time to query 15 configuration tables for each record in the file.
    2) Load the configurations as key value pairs in RAM using a suitable collection (Eg HashMap)
    Advantage: Processing is fast
    Disadvantage: Takes around 2 GB of RAM per instance.
    Also, when the CPU context switches (I am using an 8-CPU server), the process hangs for 10 seconds.
    This happens very frequently, so the net speed I get is again low.
    3) Store the configurations as CSV sorted files and then perform a binary search on it.
    Advantages: No RAM usage, Same configuration shared by multiple instances
    Disadvantages: Only 1 configuration table has an integer key, so I can't use this concept for the other tables
    (If i'm wrong in that please correct)
    4) Store the configurations as an XML file
    Dont know the advantages/disadvantages for it.
    Please suggest the methodology which should be carried out.
    Edited by: Vishal_Vinayak on Jul 6, 2009 11:56 PM

    Vishal_Vinayak wrote:
    2) Load the configurations as key value pairs in RAM using a suitable collection (Eg HashMap)
    Advantage: Processing is fast
    Disadvantage: Takes around 2 GB of RAM per instance.
    Also when the CPU context swithes (as i'm using a 8 CPU server), the process gets hanged up for 10 secs.
    This happens very frequently, so the net-net speed which i get is again less
    Sounds like you don't have enough physical memory. Your application shouldn't be hanging at all.
    How much memory is attached to each CPU? e.g. numactl --show

  • Maintaining huge volume of data (around 60 million records)

    I've a requirement to load the data from ODS to Cube via a full load. This ODS is getting 50 million records over 6 months, which we have to maintain in BW.
    Can you please advise on the following?
         Can we accommodate 50 million records in the ODS?
    If that is the case, can we put the load of 50 million records from ODS to Cube? Also, each record has to be looked up in another ODS to get the value for another InfoObject. So, is the load going to be successful for the 50 million records? I'm not sure. Or do we get a timeout error?

    Harsha,
    The data load should go through ... some things to do / check...
    Delete the indices on cube before loading and then rebuild the same later after the load completes.
    regarding the lookup - if you are looking up specific values in another DSO - build a suitable secondary index on the DSO for the same ( preferably unique index )
    A DSO or cube can definitely hold 50 million records - we have had cases with 50 million records for 1 month, with the DSO holding data for 6 to 10 months, and the same with the cube. Only that reporting on the cube might be slow at a very detailed level.
    Also please state your version - 3.x or 7.0...
    Also, if you are on Oracle, plan for providing / backing up archive logs, since loading generates a lot of archive logs...
    Edited by: Arun Varadarajan on Apr 21, 2009 2:30 AM

  • Huge volume of data not getting processed

    Hello Everyone,
    It's a single file to multiple IDoc scenario. There is no mapping involved. But the problem is that the file is 50 MB in size and contains 50,000 IDocs. These IDocs get divided and are sent to BW for reporting based on the message ID. Since the message ID should remain the same for all 50,000 IDocs, I cannot split the file.
    But when I process this, it gives me a Lock_Table_Overflow error. The function module IDOC_INBOUND_ASYNCHRONOUS is in error in SM58. I have checked enque/table_size and it is 64000; I think that is enough to process a 50 MB file.
    Please let me know how to proceed further with this.
    Regards,
    Ravi

    Hi Ravi,
    I don't really think this is a problem of PI itself, especially as you get the error in IDOC_INBOUND_ASYNCHRONOUS. An enque/table_size of 64,000 might not be enough for 50,000 IDocs - just think what happens if each IDoc requires two locks.
    Hopefully, you should be able to solve the issue by setting the Queue Processing checkbox in your receiver IDoc adapter in PI. This will force the IDocs to be processed one by one, so that so many locks will not be created simultaneously. The only problem is that I cannot foresee how big the overall increase in processing time will be.
    But you will not know until you try - and please let us know about the results, as there might be others who follow your path.
    Hope this helps,
    Greg

  • Delete volumes of Data in a table.

    We have a table which holds loads of data. When we execute the delete statement, it gives a DB error, as it is not able to delete such a huge volume of data.
    Approach tried so far:
    Keep a counter whose size is, say, 10,000 and execute delete statements every 10,000 records. But as the data is huge, it still takes time.
    Can anybody suggest some approaches ?

    user12944938 wrote:
    Oracle Version : 10g
    SQL : DELETE FROM STUDENT WHERE YEAR = 2000.
    The requirement is more on a yearly basis, but on different tables [STUDENT, ADMIN, MANAGERS, ...].
    We have tried to run in off-hours.
    STUDENT TABLE [ID, FIRST NAME, MIDDLE NAME, LAST NAME, ...] AROUND 200 COLUMNS.
    Well, you left out a lot of the information I asked for, so I don't have much to suggest.
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:2345591157689
    Is a good read and may be of help to you.
    Partitioning could also be an option (at which point you could drop the old partitions).
    Really can't say knowing what we do (and more importantly, do not know) about your situation.
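    For illustration, a rough sketch of the partitioning idea in Oracle SQL (hypothetical table and column names, with only a handful of the real 200 columns shown):
    CREATE TABLE student_part (
      id         NUMBER,
      first_name VARCHAR2(50),
      year       NUMBER
    )
    PARTITION BY RANGE (year) (
      PARTITION p2000 VALUES LESS THAN (2001),
      PARTITION p2001 VALUES LESS THAN (2002),
      PARTITION pmax  VALUES LESS THAN (MAXVALUE)
    );
    -- removing a whole year then becomes a metadata operation instead of row-by-row deletes
    ALTER TABLE student_part DROP PARTITION p2000;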

  • Dealing with large volumes of data

    Background:
    I recently "inherited" support for our company's "data mining" group, which amounts to a number of semi-technical people who have received introductory level training in writing SQL queries and been turned loose with SQL Server Management
    Studio to develop and run queries to "mine" several databases that have been created for their use.  The database design (if you can call it that) is absolutely horrible.  All of the data, which we receive at defined intervals from our
    clients, is typically dumped into a single table consisting of 200+ varchar(x) fields.  There are no indexes or primary keys on the tables in these databases, and the tables in each database contain several hundred million rows (for example one table
    contains 650 million rows of data and takes up a little over 1 TB of disk space, and we receive weekly feeds from our client which adds another 300,000 rows of data).
    Needless to say, query performance is terrible, since every query ends up being a table scan of 650 million rows of data.  I have been asked to "fix" the problems.
    My experience is primarily in applications development.  I know enough about SQL Server to perform some basic performance tuning and write reasonably efficient queries; however, I'm not accustomed to having to completely overhaul such a poor design
    with such a large volume of data.  We have already tried to add an identity column and set it up as a primary key, but the server ran out of disk space while trying to implement the change.
    I'm looking for any recommendations on how best to implement changes to the table(s) housing such a large volume of data.  In the short term, I'm going to need to be able to perform a certain amount of data analysis so I can determine the proper data
    types for fields (and whether any existing data would cause a problem when trying to convert the data to the new data type), so I'll need to know what can be done to make it possible to perform such analysis without the process consuming entire days to analyze
    the data in one or two fields.
    I'm looking for reference materials / information on how to deal with the issues, particularly when a large volume of data is involved.  I'm also looking for information on how to load large volumes of data to the database (current processing of a typical
    data file takes 10-12 hours to load 300,000 records).  Any guidance that can be provided is appreciated.  If more specific information is needed, I'll be happy to try to answer any questions you might have about my situation.

    I don't think you will find a single magic bullet to solve all the issues.  The main point is that there will be no shortcut for major schema and index changes.  You will need at least 120% free space to create a clustered index and facilitate
    major schema changes.
    I suggest an incremental approach to address your biggest pain points.  You mention it takes 10-12 hours to load 300,000 rows, which suggests there may be queries involved in the process which require full scans of the 650 million row table.  Perhaps
    some indexes targeted at improving that process are a good first step.
    What SQL Server version and edition are you using?  You'll have more options with Enterprise (partitioning, row/page compression). 
    Regarding the data types, I would take a best guess at the proper types and run a query with TRY_CONVERT (assuming SQL 2012) to determine counts of rows that conform or not for each column.  Then create a new table (using SELECT INTO) that has strongly
    typed columns for those columns that are not problematic, plus the others that cannot easily be converted, and then drop the old table and rename the new one.  You can follow up later to address columns data corrections and/or transformations. 
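    A rough sketch of that approach, assuming SQL Server 2012+ for TRY_CONVERT and hypothetical table/column names:
    -- count rows in a candidate column that would not convert cleanly to the target type
    SELECT COUNT(*) AS bad_rows
    FROM dbo.RawFeed
    WHERE Col017 IS NOT NULL
      AND TRY_CONVERT(date, Col017) IS NULL;
    -- build the strongly typed copy, leaving problem columns as varchar for now
    SELECT TRY_CONVERT(int, Col001)  AS CustomerId,
           TRY_CONVERT(date, Col017) AS FeedDate,
           Col042                    AS LegacyCode   -- not cleanly convertible yet, keep as varchar
    INTO   dbo.RawFeed_Typed
    FROM   dbo.RawFeed;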
    Dan Guzman, SQL Server MVP, http://www.dbdelta.com

  • Processing large volumes of data in PL/SQL

    I'm working on a project which requires us to process large volumes of data on a weekly/monthly/quarterly basis, and I'm not sure we are doing it right, so any tips would be greatly appreciated.
    Requirement
    Source data is in a flat file in "short-fat" format i.e. each data record (a "case") has a key and up to 2000 variable values.
    A typical weekly file would have maybe 10,000 such cases i.e. around 20 million variable values.
    But we don't know which variables are used each week until we get the file, or where they are in the file records (this is determined via a set of meta-data definitions that the user selects at runtime). This makes identifying and validating each variable value a little more interesting.
    Target is a "long-thin" table i.e. one record for each variable value (with numeric IDs as FKs to identify the parent variable and case.
    We only want to load variable values for cases which are entirely valid. This may be a merge i.e. variable values may already exist in the target table.
    There are various rules for validating the data against pre-existing data etc. These rules are specific to each variable, and have to be applied before we put the data in the target table. The users want to see the validation results - and may choose to bail out - before the data is written to the target table.
    Restrictions
    We have very limited permission to perform DDL e.g. to create new tables/indexes etc.
    We have no permission to use e.g. Oracle external tables, Oracle directories etc.
    We are working with standard Oracle tools i.e. PL/SQL and no DWH tools.
    DBAs are extremely resistant to giving us more disk space.
    We are on Oracle 9iR2, with no immediate prospect of moving to 10g.
    Current approach
    Source data is uploaded via SQL*Loader into static "short fat" tables.
    Some initial key validation is performed on these records.
    Dynamic SQL (plus BULK COLLECT etc) is used to pivot the short-fat data into an intermediate long-thin table, performing the validation on the fly via a combination of including reference values in the dynamic SQL and calling PL/SQL functions inside the dynamic SQL. This means we can pivot+validate the data in one step, and don't have to update the data with its validation status after we've pivoted it.
    This upload+pivot+validate step takes about 1 hour 15 minutes for around 15 million variable values.
    The subsequent "load to target table" step also has to apply substitution rules for certain "special values" or NULLs.
    We do this by BULK collecting the variable values from the intermediate long-thin table, for each valid case in turn, applying the substitution rules within the SQL, and inserting into/updating the target table as appropriate.
    Initially we did this via a SQL MERGE, but this was actually slower than doing an explicit check for existence and switching between INSERT and UPDATE accordingly (yes, that sounds fishy to me too).
    This "load" process takes around 90 minutes for the same 15 million variable values.
    Questions
    Why is it so slow? Our DBAs assure us we have lots of table-space etc, and that the server is plenty powerful enough.
    Any suggestions as to a better approach, given the restrictions we are working under?
    We've looked at Tom Kyte's stuff about creating temporary tables via CTAS, but we have had serious problems with dynamic SQL on this project, so we are very reluctant to introduce more of it unless it's absolutely necessary. In any case, we have serious problems getting permissions to create DB objects - tables, indexes etc - dynamically.
    So any advice would be gratefully received!
    Thanks,
    Chris

    We have 8 "short-fat" tables to hold the source data uploaded from the source file via SQL*Loader (the SQL*Loader step is fast). The data consists simply of strings of characters, which we treat simply as VARCHAR2 for the most part.
    These tables consist essentially of a case key (composite key initially) plus up to 250 data columns. 8*250 = 2000, so we can handle up to 2000 of these variable values. The source data may have any number of variable values in each record, but each record in a given file has the same structure. Each file-load event may have a different set of variables in different locations, so we have to map the short-fat columns COL001 etc. to the corresponding variable definition (for validation etc.) at runtime.
    CASE_ID VARCHAR2(13)
    COL001 VARCHAR2(10)
    ...
    COL250 VARCHAR2(10)
    We do a bit of initial validation in the short-fat tables, setting a surrogate key for each case etc (this is fast), then we pivot+validate this short-fat data column-by-column into a "long-thin" intermediate table, as this is the target format and we need to store the validation results anyway.
    The intermediate table looks similar to this:
    CASE_NUM_ID NUMBER(10) -- surrogate key to identify the parent case more easily
    VARIABLE_ID NUMBER(10) -- PK of variable definition used for validation and in target table
    VARIABLE_VALUE VARCHAR2(10) -- from COL001 etc
    STATUS VARCHAR2(10) -- set during the pivot+validate process above
    The target table looks very similar, but holds cumulative data for many weeks etc:
    CASE_NUM_ID NUMBER(10) -- surrogate key to identify the parent case more easily
    VARIABLE_ID NUMBER(10) -- PK of variable definition used for validation and in target table
    VARIABLE_VALUE VARCHAR2(10)
    We only ever load valid data into the target table.
    Chris
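    For illustration, a minimal SQL sketch of the short-fat to long-thin pivot for a single column (hypothetical table names; the real statement is built dynamically per column and calls the project's own validation functions):
    INSERT INTO intermediate_values (case_num_id, variable_id, variable_value, status)
    SELECT sf.case_num_id,
           :variable_id,                               -- variable definition mapped to COL001 for this file
           sf.col001,
           validate_value(:variable_id, sf.col001)     -- placeholder for the inline PL/SQL validation call
    FROM   short_fat_1 sf
    WHERE  sf.col001 IS NOT NULL;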

  • Changes to write optimized DSO containing huge amount of data

    Hi Experts,
    We have appended two new fields to a DSO containing a huge amount of data (the new InfoObjects are an amount and a currency).
    We are able to make the changes in Development (with the DSO containing data).  But when we tried to
    transport the changes to our QA system, the transport hangs.  The transport triggers a job which
    fills up the logs, so we need to kill the job, which aborts the transport.
    Has anyone of you had the same experience?  Do we need to empty the DSO so we can transport
    successfully?  We really don't want to empty the DSOs, as it will take time to reload.
    Any help?
    Thank you very much for your help.
    Best regards,
    Rose

    Emptying the DSO should not be necessary, neither for a normal DSO nor for a write-optimized DSO.
    What is in the logs - some sort of conversion for all the records?
    Marco

  • Huge volume of records are routing to the remote user other than his position and organization records. Synchronization and DB initialization taking more time around 36 hours.

    A huge volume of records is routing to the remote user, beyond his position and organization records. Synchronization and DB initialization are taking more time, around 36 hours.
    The actual accounts & contacts that need to route are around 2,000 & 3,000, but we have observed lakhs of records routing into the local DB.
    We have verified all the Assignment Rules and Views.
    We ran the docking object visibility rules and observed that some other accounts are routing because the Organization rule passes (these records are not supposed to route).
    Version Siebel 7.7.2.12,
    OS Solaris.

    let me know what would be the reason that the 1st million takes only 15 minutes and the time goes on increasing gradually with the increase of data
    Yes, that's a little strange. I can only guess:
    1. You are in archivelog mode and the archiver is not able to archive the redo logs fast enough.
    2. You don't use Direct Load and DBWR is not able to write the dirty blocks to disk fast enough. You could create more DBWR processes in that case.
    3. Make a snapshot of v$system_event:
    create table begin as select * from v$system_event;
    After the import run:
    create table end as select * from v$system_event;
    Now compare the values:
    select * from begin order by TIME_WAITED_MICRO desc;
    with the values given to you by
    select * from end order by TIME_WAITED_MICRO desc;
    So you can see where your DB spent so much time waiting for something.
    Alternatively, you could start a 10046 trace on the loading session and use tkprof.
    Dim

  • Handling Huge Amount of data in Browser

    I need some information regarding large data handling in a web browser. Browser data will be downloaded to the
    cache of the local machine. So when the browser needs data to be downloaded in terms of MBs and GBs, how can we
    handle that?
    The requirement is as mentioned below.
    A performance monitoring application is collecting performance data of a system every 30 seconds.
    The size of the data collected can be around 10 KB for each interval and it is logged to a database. If this application
    runs for one day, the statistical data size will be around 30 MB (28.8 MB) . If it runs for one week, the data size will be
    210 MB. There is no limitation on the number of days from the software perspective.
    User needs to see this statistical data in the browser. We are not sure if this huge amount of data transfer to the
    browser in one instance is feasible. The user should be able to get the overall picture of the logged data for a
    particular period and if needed, should be able to drill down step by step to lesser ranges.
    For example, if the user queries for data between the dates 10th Nov and 20th Nov, the user expects to get an overall idea of
    the 11 days data. Note that it is not possible to show each 30 second data when showing 11 days data. So some logic
    has to be applied to present the 11 days data in a reasonably acceptable form. Then the user can go and select a
    particular date in the graph and the data for that day alone should be shown with a better granularity than the overall
    graph.
    Note: The applet may not be a signed applet.

    How do you download gigabytes of data to a browser? The answer is simple. You don't. A data analysis package like the one you describe should run on the server and send the requested summary views to the browser.
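    One possible shape for such a server-side summary view (a sketch only, in Oracle-style SQL with hypothetical table and column names), aggregating the 30-second samples into hourly buckets for the requested range:
    SELECT TRUNC(sample_time, 'HH24') AS hour_bucket,
           AVG(metric_value)          AS avg_value,
           MAX(metric_value)          AS max_value
    FROM   perf_samples
    WHERE  sample_time >= :range_start    -- e.g. 10th Nov
      AND  sample_time <  :range_end      -- e.g. 21st Nov (exclusive)
    GROUP BY TRUNC(sample_time, 'HH24')
    ORDER BY hour_bucket;
    The browser then only receives a few hundred summary rows; drilling into a single day would re-run the query with a narrower range and a finer TRUNC granularity.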

  • The content of this page failed to load as expected because data transmission was interrupted. Please try again, or contact your system administrator.

    jdeveloper 11.1.2.0
    version 64
    hi
    This message appears when my page loads and waits to fetch data, but no data comes:
    "The content of this page failed to load as expected because data transmission was interrupted. Please try again, or contact your system administrator. "
    With this error, some pages work fine and some, like this one, give me this message. Why?
    And from the log:
    ####<Sep 19, 2013 10:54:34 AM AST> <Notice> <Security> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1379577274094> <BEA-090082> <Security initializing using security realm myrealm.>
    ####<Sep 19, 2013 10:54:40 AM AST> <Notice> <WebLogicServer> <ABHO-IT-AHMAD> <DefaultServer> <main> <<WLS Kernel>> <> <> <1379577280708> <BEA-000365> <Server state changed to STANDBY>
    ####<Sep 19, 2013 10:54:40 AM AST> <Notice> <WebLogicServer> <ABHO-IT-AHMAD> <DefaultServer> <main> <<WLS Kernel>> <> <> <1379577280777> <BEA-000365> <Server state changed to STARTING>
    ####<Sep 19, 2013 10:54:42 AM AST> <Warning> <oracle.as.jmx.framework.MessageLocalizationHelper> <ABHO-IT-AHMAD> <DefaultServer> <JMX FRAMEWORK Domain Runtime MBeanServer pooling thread> <<anonymous>> <> <0000K4pDfiFE8Ty707MaMF1IEeqy000001> <1379577282352> <J2EE JMX-46041> <The resource for bundle "oracle.jrf.i18n.MBeanMessageBundle" with key "oracle.jrf.JRFServiceMBean.checkIfJRFAppliedOnMutipleTargets" cannot be found.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <Log Management> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <> <1379577302172> <BEA-170027> <The Server has established connection with the Domain level Diagnostic Service successfully.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <WebLogicServer> <ABHO-IT-AHMAD> <DefaultServer> <main> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-0000000000000006> <1379577302287> <BEA-000365> <Server state changed to ADMIN>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <WebLogicServer> <ABHO-IT-AHMAD> <DefaultServer> <main> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-0000000000000006> <1379577302332> <BEA-000365> <Server state changed to RESUMING>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <Security> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000000a> <1379577302454> <BEA-090171> <Loading the identity certificate and private key stored under the alias DemoIdentity from the jks keystore file D:\ORACLE~1\MIDDLE~2\WLSERV~1.3\server\lib\DemoIdentity.jks.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <Security> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000000a> <1379577302528> <BEA-090169> <Loading trusted certificates from the jks keystore file D:\ORACLE~1\MIDDLE~2\WLSERV~1.3\server\lib\DemoTrust.jks.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <Security> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000000a> <1379577302531> <BEA-090169> <Loading trusted certificates from the jks keystore file D:\ORACLE~1\MIDDLE~2\JDK160~1\jre\lib\security\cacerts.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <Security> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000000a> <1379577302580> <BEA-090898> <Ignoring the trusted CA certificate "CN=Entrust Root Certification Authority - G2,OU=(c) 2009 Entrust\, Inc. - for authorized use only,OU=See www.entrust.net/legal-terms,O=Entrust\, Inc.,C=US". The loading of the trusted certificate list raised a certificate parsing exception PKIX: Unsupported OID in the AlgorithmIdentifier object: 1.2.840.113549.1.1.11.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <Security> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000000a> <1379577302584> <BEA-090898> <Ignoring the trusted CA certificate "CN=thawte Primary Root CA - G3,OU=(c) 2008 thawte\, Inc. - For authorized use only,OU=Certification Services Division,O=thawte\, Inc.,C=US". The loading of the trusted certificate list raised a certificate parsing exception PKIX: Unsupported OID in the AlgorithmIdentifier object: 1.2.840.113549.1.1.11.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <Security> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000000a> <1379577302588> <BEA-090898> <Ignoring the trusted CA certificate "CN=T-TeleSec GlobalRoot Class 3,OU=T-Systems Trust Center,O=T-Systems Enterprise Services GmbH,C=DE". The loading of the trusted certificate list raised a certificate parsing exception PKIX: Unsupported OID in the AlgorithmIdentifier object: 1.2.840.113549.1.1.11.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <Security> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000000a> <1379577302589> <BEA-090898> <Ignoring the trusted CA certificate "CN=T-TeleSec GlobalRoot Class 2,OU=T-Systems Trust Center,O=T-Systems Enterprise Services GmbH,C=DE". The loading of the trusted certificate list raised a certificate parsing exception PKIX: Unsupported OID in the AlgorithmIdentifier object: 1.2.840.113549.1.1.11.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <Security> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000000a> <1379577302592> <BEA-090898> <Ignoring the trusted CA certificate "CN=GlobalSign,O=GlobalSign,OU=GlobalSign Root CA - R3". The loading of the trusted certificate list raised a certificate parsing exception PKIX: Unsupported OID in the AlgorithmIdentifier object: 1.2.840.113549.1.1.11.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <Security> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000000a> <1379577302593> <BEA-090898> <Ignoring the trusted CA certificate "OU=Security Communication RootCA2,O=SECOM Trust Systems CO.\,LTD.,C=JP". The loading of the trusted certificate list raised a certificate parsing exception PKIX: Unsupported OID in the AlgorithmIdentifier object: 1.2.840.113549.1.1.11.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <Security> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000000a> <1379577302595> <BEA-090898> <Ignoring the trusted CA certificate "CN=VeriSign Universal Root Certification Authority,OU=(c) 2008 VeriSign\, Inc. - For authorized use only,OU=VeriSign Trust Network,O=VeriSign\, Inc.,C=US". The loading of the trusted certificate list raised a certificate parsing exception PKIX: Unsupported OID in the AlgorithmIdentifier object: 1.2.840.113549.1.1.11.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <Security> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000000a> <1379577302602> <BEA-090898> <Ignoring the trusted CA certificate "CN=KEYNECTIS ROOT CA,OU=ROOT,O=KEYNECTIS,C=FR". The loading of the trusted certificate list raised a certificate parsing exception PKIX: Unsupported OID in the AlgorithmIdentifier object: 1.2.840.113549.1.1.11.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <Security> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000000a> <1379577302605> <BEA-090898> <Ignoring the trusted CA certificate "CN=GeoTrust Primary Certification Authority - G3,OU=(c) 2008 GeoTrust Inc. - For authorized use only,O=GeoTrust Inc.,C=US". The loading of the trusted certificate list raised a certificate parsing exception PKIX: Unsupported OID in the AlgorithmIdentifier object: 1.2.840.113549.1.1.11.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <Server> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000000a> <1379577302628> <BEA-002613> <Channel "DefaultSecure" is now listening on 127.0.0.1:7102 for protocols iiops, t3s, ldaps, https.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <WebLogicServer> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000000a> <1379577302629> <BEA-000331> <Started WebLogic Admin Server "DefaultServer" for domain "DefaultDomain" running in Development Mode>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <Server> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000000a> <1379577302628> <BEA-002613> <Channel "Default" is now listening on 127.0.0.1:7101 for protocols iiop, t3, ldap, snmp, http.>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <WebLogicServer> <ABHO-IT-AHMAD> <DefaultServer> <main> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-0000000000000006> <1379577302763> <BEA-000360> <Server started in RUNNING mode>
    ####<Sep 19, 2013 10:55:02 AM AST> <Notice> <WebLogicServer> <ABHO-IT-AHMAD> <DefaultServer> <main> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-0000000000000006> <1379577302763> <BEA-000365> <Server state changed to RUNNING>
    ####<Sep 19, 2013 10:55:36 AM AST> <Warning> <org.apache.myfaces.trinidadinternal.application.ViewHandlerImpl> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<anonymous>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000002a> <1379577336590> <BEA-000000> <Apache Trinidad is running with time-stamp checking enabled. This should not be used in a production environment. See the org.apache.myfaces.trinidad.CHECK_FILE_MODIFICATION property in WEB-INF/web.xml>
    ####<Sep 19, 2013 10:55:42 AM AST> <Warning> <org.apache.myfaces.trinidad.component.UIXEditableValue> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'> <<anonymous>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000002d> <1379577342537> <BEA-000000> <A Bean Validation provider is not present, therefore bean validation is disabled>
    ####<Sep 19, 2013 11:00:06 AM AST> <Error> <HTTP> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-0000000000000049> <1379577606792> <BEA-101017> <[ServletContext@166723[app:pms module:pms-ViewController-context-root path:/pms-ViewController-context-root spec-version:2.5], request: weblogic.servlet.internal.ServletRequestImpl@13038c7[
    GET /pms-ViewController-context-root/faces/Projects?_adf.ctrl-state=1868ecjjey_3&Adf-Rich-Message=true&unique=1379577605234&oracle.adf.view.rich.STREAM=pt1:t2&javax.faces.ViewState=!11i3l2zal9&Adf-Window-Id=w0 HTTP/1.1
    User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:23.0) Gecko/20100101 Firefox/23.0
    Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
    Accept-Language: en-US,en;q=0.5
    Accept-Encoding: gzip, deflate
    Referer: http://localhost:7101/pms-ViewController-context-root/faces/TaskStuts?_adf.ctrl-state=1868ecjjey_3
    Cookie: JSESSIONID=7wJxS6tWTJX1JPKj5Jp4X4Tl4g29drQTyGbRJ701xTxgT5TGvh3w!498953664
    Connection: keep-alive
    ]] Root cause of ServletException.
    java.lang.NoClassDefFoundError: javax/faces/event/ExceptionQueuedEventContext
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._publishException(LifecycleImpl.java:816)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._handleException(LifecycleImpl.java:1446)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:208)
      at javax.faces.webapp.FacesServlet.service(FacesServlet.java:312)
      at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
      at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
      at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
      at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3715)
      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
      at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
      at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
      at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
      at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
      at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
      at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
      at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    Caused By: java.lang.ClassNotFoundException: javax.faces.event.ExceptionQueuedEventContext
      at weblogic.utils.classloaders.GenericClassLoader.findLocalClass(GenericClassLoader.java:297)
      at weblogic.utils.classloaders.GenericClassLoader.findClass(GenericClassLoader.java:270)
      at weblogic.utils.classloaders.ChangeAwareClassLoader.findClass(ChangeAwareClassLoader.java:64)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:305)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:246)
      at weblogic.utils.classloaders.GenericClassLoader.loadClass(GenericClassLoader.java:179)
      at weblogic.utils.classloaders.ChangeAwareClassLoader.loadClass(ChangeAwareClassLoader.java:43)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._publishException(LifecycleImpl.java:816)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._handleException(LifecycleImpl.java:1446)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:208)
      at javax.faces.webapp.FacesServlet.service(FacesServlet.java:312)
      at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
      at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
      at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
      at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3715)
      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
      at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
      at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
      at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
      at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
      at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
      at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
      at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    >
    ####<Sep 19, 2013 11:00:06 AM AST> <Notice> <Diagnostics> <ABHO-IT-AHMAD> <DefaultServer> <[STANDBY] ExecuteThread: '2' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000004b> <1379577606817> <BEA-320068> <Watch 'UncheckedException' with severity 'Notice' on server 'DefaultServer' has triggered at Sep 19, 2013 11:00:06 AM AST. Notification details:
    WatchRuleType: Log
    WatchRule: (SEVERITY = 'Error') AND ((MSGID = 'WL-101020') OR (MSGID = 'WL-101017') OR (MSGID = 'WL-000802') OR (MSGID = 'BEA-101020') OR (MSGID = 'BEA-101017') OR (MSGID = 'BEA-000802'))
    WatchData: DATE = Sep 19, 2013 11:00:06 AM AST SERVER = DefaultServer MESSAGE = [ServletContext@166723[app:pms module:pms-ViewController-context-root path:/pms-ViewController-context-root spec-version:2.5], request: weblogic.servlet.internal.ServletRequestImpl@13038c7[
    GET /pms-ViewController-context-root/faces/Projects?_adf.ctrl-state=1868ecjjey_3&Adf-Rich-Message=true&unique=1379577605234&oracle.adf.view.rich.STREAM=pt1:t2&javax.faces.ViewState=!11i3l2zal9&Adf-Window-Id=w0 HTTP/1.1
    User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:23.0) Gecko/20100101 Firefox/23.0
    Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
    Accept-Language: en-US,en;q=0.5
    Accept-Encoding: gzip, deflate
    Referer: http://localhost:7101/pms-ViewController-context-root/faces/TaskStuts?_adf.ctrl-state=1868ecjjey_3
    Cookie: JSESSIONID=7wJxS6tWTJX1JPKj5Jp4X4Tl4g29drQTyGbRJ701xTxgT5TGvh3w!498953664
    Connection: keep-alive
    ]] Root cause of ServletException.
    java.lang.NoClassDefFoundError: javax/faces/event/ExceptionQueuedEventContext
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._publishException(LifecycleImpl.java:816)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._handleException(LifecycleImpl.java:1446)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:208)
      at javax.faces.webapp.FacesServlet.service(FacesServlet.java:312)
      at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
      at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
      at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
      at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3715)
      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
      at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
      at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
      at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
      at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
      at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
      at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
      at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    Caused By: java.lang.ClassNotFoundException: javax.faces.event.ExceptionQueuedEventContext
      at weblogic.utils.classloaders.GenericClassLoader.findLocalClass(GenericClassLoader.java:297)
      at weblogic.utils.classloaders.GenericClassLoader.findClass(GenericClassLoader.java:270)
      at weblogic.utils.classloaders.ChangeAwareClassLoader.findClass(ChangeAwareClassLoader.java:64)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:305)
      at java.lang.ClassLoader.loadClass(ClassLoader.java:246)
      at weblogic.utils.classloaders.GenericClassLoader.loadClass(GenericClassLoader.java:179)
      at weblogic.utils.classloaders.ChangeAwareClassLoader.loadClass(ChangeAwareClassLoader.java:43)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._publishException(LifecycleImpl.java:816)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._handleException(LifecycleImpl.java:1446)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:208)
      at javax.faces.webapp.FacesServlet.service(FacesServlet.java:312)
      at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
      at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
      at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
      at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3715)
      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
      at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
      at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
      at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
      at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
      at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
      at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
      at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    SUBSYSTEM = HTTP USERID = <WLS Kernel> SEVERITY = Error THREAD = [ACTIVE] ExecuteThread: '0' for queue: 'weblogic.kernel.Default (self-tuning)' MSGID = BEA-101017 MACHINE = ABHO-IT-AHMAD TXID =  CONTEXTID = e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-0000000000000049 TIMESTAMP = 1379577606792
    WatchAlarmType: AutomaticReset
    WatchAlarmResetPeriod: 30000
    >
    ####<Sep 19, 2013 11:00:10 AM AST> <Alert> <Diagnostics> <ABHO-IT-AHMAD> <DefaultServer> <oracle.dfw.impl.incident.DiagnosticsDataExtractorImpl - Incident Dump Executor (created: Thu Sep 19 11:00:08 AST 2013)> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000004f> <1379577610100> <BEA-320016> <Creating diagnostic image in c:\users\ahmed-it\appdata\roaming\jdeveloper\system11.1.2.0.38.60.17\defaultdomain\servers\defaultserver\adr\diag\ofm\defaultdomain\defaultserver\incident\incdir_20 with a lockout minute period of 1.>
    ####<Sep 19, 2013 11:01:07 AM AST> <Warning> <Common> <ABHO-IT-AHMAD> <DefaultServer> <[STANDBY] ExecuteThread: '4' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-0000000000000048> <1379577667721> <BEA-000632> <Resource Pool "pms" shutting down, ignoring 1 resources still in use by applications..>
    ####<Sep 19, 2013 11:01:24 AM AST> <Warning> <org.apache.myfaces.trinidadinternal.application.ViewHandlerImpl> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'> <<anonymous>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000005a> <1379577684728> <BEA-000000> <Apache Trinidad is running with time-stamp checking enabled. This should not be used in a production environment. See the org.apache.myfaces.trinidad.CHECK_FILE_MODIFICATION property in WEB-INF/web.xml>
    ####<Sep 19, 2013 11:01:34 AM AST> <Warning> <org.apache.myfaces.trinidad.component.UIXEditableValue> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'> <<anonymous>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000005f> <1379577694830> <BEA-000000> <A Bean Validation provider is not present, therefore bean validation is disabled>
    ####<Sep 19, 2013 11:02:06 AM AST> <Error> <javax.faces.event> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'> <<anonymous>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-0000000000000068> <1379577726178> <BEA-000000> <Received 'javax.faces.event.AbortProcessingException' when invoking action listener '#{bindings.Commit.execute}' for component 'cb7'>
    ####<Sep 19, 2013 11:02:06 AM AST> <Error> <javax.faces.event> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'> <<anonymous>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-0000000000000068> <1379577726182> <BEA-000000> <javax.faces.event.AbortProcessingException: ADFv: Abort processing exception.
      at oracle.adfinternal.view.faces.model.binding.FacesCtrlActionBinding.execute(FacesCtrlActionBinding.java:199)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      at java.lang.reflect.Method.invoke(Method.java:597)
      at com.sun.el.parser.AstValue.invoke(Unknown Source)
      at com.sun.el.MethodExpressionImpl.invoke(Unknown Source)
      at com.sun.faces.facelets.el.TagMethodExpression.invoke(TagMethodExpression.java:105)
      at javax.faces.event.MethodExpressionActionListener.processAction(MethodExpressionActionListener.java:148)
      at javax.faces.event.ActionEvent.processListener(ActionEvent.java:88)
      at org.apache.myfaces.trinidad.component.UIXComponentBase.broadcast(UIXComponentBase.java:814)
      at org.apache.myfaces.trinidad.component.UIXCommand.broadcast(UIXCommand.java:179)
      at oracle.adf.view.rich.component.fragment.ContextSwitchingComponent$1.run(ContextSwitchingComponent.java:130)
      at oracle.adf.view.rich.component.fragment.ContextSwitchingComponent._processPhase(ContextSwitchingComponent.java:461)
      at oracle.adf.view.rich.component.fragment.ContextSwitchingComponent.broadcast(ContextSwitchingComponent.java:134)
      at oracle.adf.view.rich.component.fragment.UIXInclude.broadcast(UIXInclude.java:111)
      at oracle.adf.view.rich.component.fragment.ContextSwitchingComponent$1.run(ContextSwitchingComponent.java:130)
      at oracle.adf.view.rich.component.fragment.ContextSwitchingComponent._processPhase(ContextSwitchingComponent.java:461)
      at oracle.adf.view.rich.component.fragment.ContextSwitchingComponent.broadcast(ContextSwitchingComponent.java:134)
      at oracle.adf.view.rich.component.fragment.UIXInclude.broadcast(UIXInclude.java:105)
      at javax.faces.component.UIViewRoot.broadcastEvents(UIViewRoot.java:787)
      at javax.faces.component.UIViewRoot.processApplication(UIViewRoot.java:1252)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._invokeApplication(LifecycleImpl.java:965)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._executePhase(LifecycleImpl.java:346)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:204)
      at javax.faces.webapp.FacesServlet.service(FacesServlet.java:312)
      at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
      at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
      at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
      at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at oracle.adf.model.servlet.ADFBindingFilter.doFilter(ADFBindingFilter.java:173)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at oracle.adfinternal.view.faces.webapp.rich.RegistrationFilter.doFilter(RegistrationFilter.java:121)
      at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:468)
      at oracle.adfinternal.view.faces.activedata.AdsFilter.doFilter(AdsFilter.java:60)
      at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:468)
      at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl._doFilterImpl(TrinidadFilterImpl.java:293)
      at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl.doFilter(TrinidadFilterImpl.java:199)
      at org.apache.myfaces.trinidad.webapp.TrinidadFilter.doFilter(TrinidadFilter.java:92)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:111)
      at java.security.AccessController.doPrivileged(Native Method)
      at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:313)
      at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:413)
      at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:94)
      at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:161)
      at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:136)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3715)
      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
      at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
      at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
      at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
      at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
      at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
      at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
      at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    >
    ####<Sep 19, 2013 11:02:06 AM AST> <Warning> <oracle.adf.controller.faces.lifecycle.Utils> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'> <<anonymous>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-0000000000000068> <1379577726194> <BEA-000000> <ADF: Adding the following JSF error message: ORA-04098: trigger 'PMS.PROJECT_SEQ' is invalid and failed re-validation
    java.sql.SQLSyntaxErrorException: ORA-04098: trigger 'PMS.PROJECT_SEQ' is invalid and failed re-validation
      at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:457)
      at oracle.jdbc.driver.T4CTTIoer.processError(T4CTTIoer.java:405)
      at oracle.jdbc.driver.T4C8Oall.processError(T4C8Oall.java:889)
      at oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:476)
      at oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:204)
      at oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:540)
      at oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:217)
      at oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:1079)
      at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1466)
      at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3752)
      at oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:3887)
      at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeUpdate(OraclePreparedStatementWrapper.java:1508)
      at weblogic.jdbc.wrapper.PreparedStatement.executeUpdate(PreparedStatement.java:172)
      at oracle.jbo.server.OracleSQLBuilderImpl.doEntityDML(OracleSQLBuilderImpl.java:432)
      at oracle.jbo.server.EntityImpl.doDML(EntityImpl.java:8494)
      at pms.model.eo.ProjectsImpl.doDML(ProjectsImpl.java:245)
      at oracle.jbo.server.EntityImpl.postChanges(EntityImpl.java:6751)
      at oracle.jbo.server.DBTransactionImpl.doPostTransactionListeners(DBTransactionImpl.java:3264)
      at oracle.jbo.server.DBTransactionImpl.postChanges(DBTransactionImpl.java:3067)
      at oracle.jbo.server.DBTransactionImpl.commitInternal(DBTransactionImpl.java:2071)
      at oracle.jbo.server.DBTransactionImpl.commit(DBTransactionImpl.java:2352)
      at oracle.adf.model.bc4j.DCJboDataControl.commitTransaction(DCJboDataControl.java:1590)
      at oracle.adf.model.binding.DCDataControl.callCommitTransaction(DCDataControl.java:1414)
      at oracle.jbo.uicli.binding.JUCtrlActionBinding.doIt(JUCtrlActionBinding.java:1428)
      at oracle.adf.model.binding.DCDataControl.invokeOperation(DCDataControl.java:2168)
      at oracle.jbo.uicli.binding.JUCtrlActionBinding.invoke(JUCtrlActionBinding.java:731)
      at oracle.adf.controller.v2.lifecycle.PageLifecycleImpl.executeEvent(PageLifecycleImpl.java:402)
      at oracle.adfinternal.view.faces.model.binding.FacesCtrlActionBinding._execute(FacesCtrlActionBinding.java:252)
      at oracle.adfinternal.view.faces.model.binding.FacesCtrlActionBinding.execute(FacesCtrlActionBinding.java:185)
      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
      at java.lang.reflect.Method.invoke(Method.java:597)
      at com.sun.el.parser.AstValue.invoke(Unknown Source)
      at com.sun.el.MethodExpressionImpl.invoke(Unknown Source)
      at com.sun.faces.facelets.el.TagMethodExpression.invoke(TagMethodExpression.java:105)
      at javax.faces.event.MethodExpressionActionListener.processAction(MethodExpressionActionListener.java:148)
      at javax.faces.event.ActionEvent.processListener(ActionEvent.java:88)
      at org.apache.myfaces.trinidad.component.UIXComponentBase.broadcast(UIXComponentBase.java:814)
      at org.apache.myfaces.trinidad.component.UIXCommand.broadcast(UIXCommand.java:179)
      at oracle.adf.view.rich.component.fragment.ContextSwitchingComponent$1.run(ContextSwitchingComponent.java:130)
      at oracle.adf.view.rich.component.fragment.ContextSwitchingComponent._processPhase(ContextSwitchingComponent.java:461)
      at oracle.adf.view.rich.component.fragment.ContextSwitchingComponent.broadcast(ContextSwitchingComponent.java:134)
      at oracle.adf.view.rich.component.fragment.UIXInclude.broadcast(UIXInclude.java:111)
      at oracle.adf.view.rich.component.fragment.ContextSwitchingComponent$1.run(ContextSwitchingComponent.java:130)
      at oracle.adf.view.rich.component.fragment.ContextSwitchingComponent._processPhase(ContextSwitchingComponent.java:461)
      at oracle.adf.view.rich.component.fragment.ContextSwitchingComponent.broadcast(ContextSwitchingComponent.java:134)
      at oracle.adf.view.rich.component.fragment.UIXInclude.broadcast(UIXInclude.java:105)
      at javax.faces.component.UIViewRoot.broadcastEvents(UIViewRoot.java:787)
      at javax.faces.component.UIViewRoot.processApplication(UIViewRoot.java:1252)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._invokeApplication(LifecycleImpl.java:965)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl._executePhase(LifecycleImpl.java:346)
      at oracle.adfinternal.view.faces.lifecycle.LifecycleImpl.execute(LifecycleImpl.java:204)
      at javax.faces.webapp.FacesServlet.service(FacesServlet.java:312)
      at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
      at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
      at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:300)
      at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at oracle.adf.model.servlet.ADFBindingFilter.doFilter(ADFBindingFilter.java:173)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at oracle.adfinternal.view.faces.webapp.rich.RegistrationFilter.doFilter(RegistrationFilter.java:121)
      at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:468)
      at oracle.adfinternal.view.faces.activedata.AdsFilter.doFilter(AdsFilter.java:60)
      at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl$FilterListChain.doFilter(TrinidadFilterImpl.java:468)
      at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl._doFilterImpl(TrinidadFilterImpl.java:293)
      at org.apache.myfaces.trinidadinternal.webapp.TrinidadFilterImpl.doFilter(TrinidadFilterImpl.java:199)
      at org.apache.myfaces.trinidad.webapp.TrinidadFilter.doFilter(TrinidadFilter.java:92)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at oracle.security.jps.ee.http.JpsAbsFilter$1.run(JpsAbsFilter.java:111)
      at java.security.AccessController.doPrivileged(Native Method)
      at oracle.security.jps.util.JpsSubject.doAsPrivileged(JpsSubject.java:313)
      at oracle.security.jps.ee.util.JpsPlatformUtil.runJaasMode(JpsPlatformUtil.java:413)
      at oracle.security.jps.ee.http.JpsAbsFilter.runJaasMode(JpsAbsFilter.java:94)
      at oracle.security.jps.ee.http.JpsAbsFilter.doFilter(JpsAbsFilter.java:161)
      at oracle.security.jps.ee.http.JpsFilter.doFilter(JpsFilter.java:71)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at oracle.dms.servlet.DMSServletFilter.doFilter(DMSServletFilter.java:136)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at weblogic.servlet.internal.RequestEventsFilter.doFilter(RequestEventsFilter.java:27)
      at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:56)
      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.wrapRun(WebAppServletContext.java:3715)
      at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3681)
      at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
      at weblogic.security.service.SecurityManager.runAs(SecurityManager.java:120)
      at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2277)
      at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2183)
      at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1454)
      at weblogic.work.ExecuteThread.execute(ExecuteThread.java:209)
      at weblogic.work.ExecuteThread.run(ExecuteThread.java:178)
    >
    ####<Sep 19, 2013 11:02:06 AM AST> <Warning> <oracle.adf.controller.faces.lifecycle.Utils> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'> <<anonymous>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-0000000000000068> <1379577726297> <BEA-000000> <ADF: Adding the following JSF error message: ORA-04098: trigger 'PMS.PROJECT_SEQ' is invalid and failed re-validation
    java.sql.SQLSyntaxErrorException: ORA-04098: trigger 'PMS.PROJECT_SEQ' is invalid and failed re-validation
      [stack trace identical to the preceding log entry, omitted]
    >
    ####<Sep 19, 2013 11:06:31 AM AST> <Warning> <Common> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '7' for queue: 'weblogic.kernel.Default (self-tuning)'> <<WLS Kernel>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-0000000000000075> <1379577991286> <BEA-000632> <Resource Pool "pms" shutting down, ignoring 1 resources still in use by applications..>
    ####<Sep 19, 2013 11:06:44 AM AST> <Warning> <org.apache.myfaces.trinidadinternal.application.ViewHandlerImpl> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '3' for queue: 'weblogic.kernel.Default (self-tuning)'> <<anonymous>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-000000000000007d> <1379578004059> <BEA-000000> <Apache Trinidad is running with time-stamp checking enabled. This should not be used in a production environment. See the org.apache.myfaces.trinidad.CHECK_FILE_MODIFICATION property in WEB-INF/web.xml>
    ####<Sep 19, 2013 11:06:48 AM AST> <Warning> <org.apache.myfaces.trinidad.component.UIXEditableValue> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '3' for queue: 'weblogic.kernel.Default (self-tuning)'> <<anonymous>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-0000000000000080> <1379578008443> <BEA-000000> <A Bean Validation provider is not present, therefore bean validation is disabled>
    ####<Sep 19, 2013 11:06:48 AM AST> <Error> <javax.enterprise.resource.webcontainer.jsf.application> <ABHO-IT-AHMAD> <DefaultServer> <[ACTIVE] ExecuteThread: '3' for queue: 'weblogic.kernel.Default (self-tuning)'> <<anonymous>> <> <e234f3f4721f40cd:1d6f37d9:1413536b1b6:-8000-0000000000000083> <1379578008782> <BEA-000000> <Error Rendering View[/Projects]
    oracle.jbo.AttributeLoadException: JBO-27022: Failed to load value at index 2 with java object of type java.lang.Integer due to java.sql.SQLException.
      at oracle.jbo.server.AttributeDefImpl.loadFromResultSet(AttributeDefImpl.java:2435)
      at oracle.jbo.server.ViewRowImpl.populate(ViewRowImpl.java:3842)
      at oracle.jbo.server.ViewDefImpl.createInstanceFromResultSet(ViewDefImpl.java:2378)
      at oracle.jbo.server.ViewObjectImpl.createRowFromResultSet(ViewObjectImpl.java:6005)
      at oracle.jbo.server.ViewObjectImpl.createInstanceFromResultSet(ViewObjectImpl.java:5834)
      at oracle.jbo.server.QueryCollection.populateRow(QueryCollection.java:3568)
      at oracle.jbo.server.QueryCollection.fetch(QueryCollection.java:3423)
      at oracle.jbo.server.QueryCollection.get(QueryCollection.java:2173)
      at oracle.jbo.server.ViewRowSetImpl.getRow(ViewRowSetImpl.java:5115)
      at oracle.jbo.server.ViewRowSetIteratorImpl.doFetch(ViewRowSetIteratorImpl.java:2935)
      at oracle.jbo.server.ViewRowSetIteratorImpl.ensureRefreshed(ViewRowSetIteratorImpl.java:2804)
      at oracle.jbo.server.ViewRowSetIteratorImpl.ensureRefreshed(ViewRowSetIteratorImpl.java:2751)
      at oracle.jbo.server.ViewRowSetIteratorImpl.setRangeStartWithRefresh(ViewRowSetIteratorImpl.java:2724)
      at oracle.jbo.server.ViewRowSetIteratorImpl.setRangeStart(ViewRowSetIteratorImpl.java:2714)
      at oracle.jbo.server.ViewRowSetImpl.setRangeStart(ViewRowSetImpl.java:3015)
      at oracle.jbo.server.ViewObjectImpl.setRangeStart(ViewObjectImpl.java:10639)
      at oracle.adf.model.binding.DCIteratorBinding.setRangeStart(DCIteratorBinding.java:3552)
      at oracle.adfinternal.view.faces.model.binding.RowDataManager._bringInToRange(RowDataManager.java:101)
      at oracle.adfinternal.view.faces.model.binding.RowDataManager.setRowIndex(RowDataManager.java:55)
      at oracle.adfinternal.view.faces.model.binding.FacesCtrlHierBinding$FacesModel.setRowIndex(FacesCtrlHierBinding.java:800)
      at org.apache.myfaces.trinidad.component.UIXCollection.setRowIndex(UIXCollection.java:530)
      at oracle.adfinternal.view.faces.renderkit.rich.TableRenderer.renderDataBlockRows(TableRenderer.java:2694)
      at oracle.adfinternal.view.faces.renderkit.rich.TableRenderer._renderSingleDataBlock(TableRenderer.java:2431)
      at oracle.adfinternal.view.faces.renderkit.rich.TableRenderer._handleDataFetch(TableRenderer.java:1632)
      at oracle.adfinternal.view.faces.renderkit.rich.TableRenderer.encodeAll(TableRenderer.java:558)
      at oracle.adf.view.rich.render.RichRenderer.encodeAll(RichRenderer.java:1452)
      at org.apache.myfaces.trinidad.render.CoreRenderer.encodeEnd(CoreRenderer.java:493)
      at org.apache.myfaces.trinidad.component.UIXComponentBase.encodeEnd(UIXComponentBase.java:913)
      at org.apache.myfaces.trinidad.component.UIXCollection.encodeEnd(UIXCollection.java:617)
      at javax.faces.component.UIComponent.encodeAll(UIComponent.java:1659)
      at oracle.adfinternal.view.faces.util.rich.InvokeOnComponentUtils$EncodeChildVisitCallback.visit(InvokeOnComponentUtils.java:116)
      at com.sun.faces.component.visit.PartialVisitContext.invokeVisitCallback(PartialVisitContext.java:183)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:505)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:354)
      at oracle.adfinternal.view.faces.renderkit.rich.PanelStretchLayoutRenderer._visitFacetAsStretched(PanelStretchLayoutRenderer.java:856)
      at oracle.adfinternal.view.faces.renderkit.rich.PanelStretchLayoutRenderer._visitFacet(PanelStretchLayoutRenderer.java:834)
      at oracle.adfinternal.view.faces.renderkit.rich.PanelStretchLayoutRenderer.visitChildrenForEncodingImpl(PanelStretchLayoutRenderer.java:793)
      at oracle.adf.view.rich.render.RichRenderer.visitChildrenForEncoding(RichRenderer.java:2393)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitChildren(UIXComponent.java:387)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitChildren(UIXComponent.java:669)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:532)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:354)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitAllChildren(UIXComponent.java:411)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitChildren(UIXComponent.java:392)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitChildren(UIXComponent.java:669)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:532)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:354)
      at oracle.adfinternal.view.faces.renderkit.rich.PanelSplitterRenderer._visitFacetAsStretched(PanelSplitterRenderer.java:393)
      at oracle.adfinternal.view.faces.renderkit.rich.PanelSplitterRenderer._visitFacet(PanelSplitterRenderer.java:371)
      at oracle.adfinternal.view.faces.renderkit.rich.PanelSplitterRenderer.visitChildrenForEncodingImpl(PanelSplitterRenderer.java:342)
      at oracle.adf.view.rich.render.RichRenderer.visitChildrenForEncoding(RichRenderer.java:2393)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitChildren(UIXComponent.java:387)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitChildren(UIXComponent.java:669)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:532)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:354)
      at oracle.adfinternal.view.faces.renderkit.rich.DecorativeBoxRenderer.visitChildrenForEncodingImpl(DecorativeBoxRenderer.java:214)
      at oracle.adf.view.rich.render.RichRenderer.visitChildrenForEncoding(RichRenderer.java:2393)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitChildren(UIXComponent.java:387)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitChildren(UIXComponent.java:669)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:532)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:354)
      at oracle.adfinternal.view.faces.renderkit.rich.DecorativeBoxRenderer.visitChildrenForEncodingImpl(DecorativeBoxRenderer.java:214)
      at oracle.adf.view.rich.render.RichRenderer.visitChildrenForEncoding(RichRenderer.java:2393)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitChildren(UIXComponent.java:387)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitChildren(UIXComponent.java:669)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:532)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:354)
      at oracle.adfinternal.view.faces.renderkit.rich.DecorativeBoxRenderer.visitChildrenForEncodingImpl(DecorativeBoxRenderer.java:214)
      at oracle.adf.view.rich.render.RichRenderer.visitChildrenForEncoding(RichRenderer.java:2393)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitChildren(UIXComponent.java:387)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitChildren(UIXComponent.java:669)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:532)
      at org.apache.myfaces.trinidad.component.UIXComponent.visitTree(UIXComponent.java:354)
      at oracle.adfinternal.view.faces.renderkit.rich.PanelStretchLayoutRenderer._visitFacetAsStretched(PanelStretchLayoutRenderer.java:856)
      at oracle.adfinternal.view.faces.renderkit.rich.PanelStretchLayoutRenderer._visitFacet(PanelStretchLayoutRenderer.java:834)
      at oracle.adfinternal.view.faces.renderkit.rich.PanelStretchLayoutRenderer.visitChildrenForEncodingImpl(PanelStretchLayoutRenderer.java:793)
      at oracle.adf.view.rich.render.RichRenderer.visitChildrenForEncoding(RichRenderer.java:2404)
      at

    Are you kidding?  It took me about 3 minutes to scroll down on my tablet to get to the reply button!
    Have you read the error message?
    Quote:
    java.sql.SQLSyntaxErrorException: ORA-04098: trigger 'PMS.PROJECT_SEQ' is invalid and failed re-validation
    Check the trigger and it should work again.
    Timo
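    For anyone hitting the same error, here is a minimal diagnostic sketch in SQL (assuming you can connect as the PMS schema owner; only the trigger name comes from the error message, everything else is illustrative):

        -- Show why the trigger is invalid and failed re-validation
        SELECT line, position, text
          FROM user_errors
         WHERE name = 'PROJECT_SEQ'
           AND type = 'TRIGGER';

        -- A common cause is that the sequence the trigger reads from was dropped;
        -- recreate it if the errors above point at a missing sequence (sequence name is hypothetical)
        -- CREATE SEQUENCE projects_seq START WITH 1 INCREMENT BY 1;

        -- Then recompile the trigger and retry the commit from the ADF page
        ALTER TRIGGER project_seq COMPILE;

    If the trigger compiles cleanly, the commit that raised the JSF error should go through again; otherwise the USER_ERRORS output points at the exact statement that needs fixing.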

  • Report in Excel format fails for huge amount of data with headers!!

    Hi All,
    I have developed an Oracle report which fetches up to 5,000 records.
    The requirement is to fetch up to 100,000 records.
    The report fetches the data if the headers are removed; when headers are included it is not able to fetch the data.
    Has anyone faced this issue?
    Any idea how to fetch a huge amount of data with an Oracle report in Excel format?
    Thanks & Regards,
    KP.

    Hi Manikant,
    According to your description, performance is slow when displaying a huge amount of data with more than 3 measures in PowerPivot, so you need the hardware requirements for building a PowerPivot workbook that displays a huge amount of data with more than 3 measures, right?
    PowerPivot benefits from multi-core processors, large memory and storage capacities, and a 64-bit operating system on the client computer.
    Based on my experience, large memory, multiple processors and even solid state drives benefit PowerPivot performance. Here is a blog post about memory considerations for PowerPivot for Excel for your reference.
    http://sqlblog.com/blogs/marco_russo/archive/2010/01/26/memory-considerations-about-powerpivot-for-excel.aspx
    Besides, you can identify which query is taking the time by using tracing; please refer to the link below.
    http://blogs.msdn.com/b/jtarquino/archive/2013/12/27/troubleshooting-slow-queries-in-excel-powerpivot.aspx
    Regards,
    Charlie Liao
    TechNet Community Support
