SimpleDateFormat fails while parsing dates

I have to parse a string representing a date. I wrote this test case, which fails:
public void testParseDate4() throws Exception {
    String dateAsString = "Fri, 01-Jan-2000 01:00:00 GMT+1";
    String datePattern = "EEE, dd-MMM-yyyy HH:mm:ss z";
    boolean correctlyParsed = false;
    try {
        SimpleDateFormat sdf = new SimpleDateFormat(datePattern);
        java.util.Date utilData = sdf.parse(dateAsString);
        correctlyParsed = true;
    } catch (Exception e) {
        e.printStackTrace();
        correctlyParsed = false;
    }
    assertTrue("Assert date correctly parsed", correctlyParsed);
}
Any help is very appreciated.
valerio

I must correct myself: I checked the JDK 1.5 sources and found the code implementing that case:
case 17: // 'z' - ZONE_OFFSET
    int zoneIndex =
        formatData.getZoneIndex(calendar.getTimeZone().getID());
    if (zoneIndex == -1) {
        value = calendar.get(Calendar.ZONE_OFFSET) +
            calendar.get(Calendar.DST_OFFSET);
        buffer.append(ZoneInfoFile.toCustomID(value));
    } else {
        int index = (calendar.get(Calendar.DST_OFFSET) == 0) ? 1 : 3;
        if (count < 4) {
            // Use the short name
            index++;
        }
        buffer.append(formatData.zoneStrings[zoneIndex][index]);
    }
    break;
I must be doing something improper...
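
For reference, the Javadoc grammar for the general time zone letter z only accepts custom IDs of the form GMT Sign Hours : Minutes, so "GMT+1" (with no minutes part) falls outside it. Here is a minimal sketch of mine, not from the original post, that parses the padded form (passing Locale.ENGLISH is my own addition, since EEE and MMM are locale-sensitive):

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

public class ZoneParseSketch {
    public static void main(String[] args) throws Exception {
        String datePattern = "EEE, dd-MMM-yyyy HH:mm:ss z";
        SimpleDateFormat sdf = new SimpleDateFormat(datePattern, Locale.ENGLISH);
        // "GMT+01:00" matches the documented GMT Sign Hours:Minutes form;
        // the bare "GMT+1" from the failing test does not.
        Date parsed = sdf.parse("Fri, 01-Jan-2000 01:00:00 GMT+01:00");
        System.out.println(parsed);
    }
}

If the input really arrives as "GMT+1", normalising the string to "GMT+01:00" (or stripping and handling the offset separately) before calling parse may be the simpler fix.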

Similar Messages

  • Data load failed while loading data from one DSO to another DSO..

    Hi,
    On SID generation, the data load failed while loading data from the Source DSO to the Target DSO.
    The following errors are occurring:
    Value "External Ref # 2421-0625511EXP  " (HEX 450078007400650072006E0061006C0020005200650066
    Error when assigning SID: Action VAL_SID_CONVERT, InfoObject 0BBP
    So I am not getting WHY it succeeded in one DSO (the Source) but failed in the other DSO (the Target)?
    While analyzing, I checked that "SID Generation upon Activation" is checked in the Source DSO but not in the Target DSO. Is that the reason it failed?
    Please explain..
    Thanks,
    Sneha

    Hi,
    I hope your data flow has been designed so that the 1st DSO acts as a staging device, all transformation rules and routines are maintained between the 1st and 2nd DSO, and SID generation upon activation is maintained on the 2nd DSO. That way the data in the 1st DSO stays the same as your source system data, since no transformation rules or routines are applied there, which helps to avoid data load failures.
    Please analyze the following:
    Have you loaded master data before transaction data? If not, please do that first.
    Go to the properties of the first DSO and check whether "SID Generation upon Activation" is maintained there (I guess it may not be).
    Go to the properties of the 2nd DSO and check whether "SID Generation upon Activation" is maintained there (I hope it is).
    This may be the reason.
    Also check whether any special characters (even lowercase letters) are involved in your transaction data.
    Regards
    BVR

  • Proc_getObject with nolock causes "read operation on a large object failed while sending data to the client"

    In SharePoint 2013, the code for the SharePoint Config database stored procedure dbo.proc_getObject has changed from SharePoint 2010; the difference is the SELECT with NOLOCK.
    I am seeing numerous Error 7886, Severity-20 errors nightly:
    "A read operation on a large object failed while sending data to the client. A common cause for this is if the application is running in READ UNCOMMITTED isolation level. This connection will be terminated."
    Here I have commented out "with (nolock)" to 'fix' the error and to align the stored proc with the SharePoint 2010 code:
    ALTER PROCEDURE [dbo].[proc_getObject]
        @Id uniqueidentifier,
        @RequestGuid uniqueidentifier = NULL OUTPUT
    AS
    SET NOCOUNT ON
    SELECT
        Id,
        ParentId,
        ClassId,
        Name,
        Status,
        Version,
        Properties
    FROM
        Objects --with (nolock)
    WHERE
        Id = @Id
    RETURN 0
    How is this stored procedure called and for what use?
    Why did Microsoft make this change to SharePoint 2013 - some performance gain through dirty reads at the cost of stability? Or what could be a reason for this to happen that can be fixed?

    GetObject retrieves any farm object that is selected (by ID). Changing this is obviously not supported, and NoLock was used to (help) prevent locks when performing a select query.
    I've never seen this issue, and would guess it is unique to your environment given this is such a core piece of the SharePoint infrastructure.
    Trevor Seward
    This post is my own opinion and does not necessarily reflect the opinion or view of Microsoft, its employees, or other MVPs.

  • Exception while Parsing date

    When I try to parse a date like this:
    Date date = new Date();
    date = DateFormat.getDateInstance(DateFormat.SHORT).parse("08.02.05");
    System.out.println(date);
    At run time i get the following Exception:
    Exception in thread "main" java.text.ParseException: Unparseable date: "08.02.05"
    at java.text.DateFormat.parse(DateFormat.java:335)
    at demo.main(demo.java:12)
    I had seen this working in an example and now it is not working. Can anyone tell me why?

    It probably depends on your regional settings. You should specify the pattern instead of using DateFormat.SHORT.
    Kaj
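
    For example, a minimal sketch of what Kaj suggests; reading "08.02.05" as dd.MM.yy (day.month.two-digit-year) is an assumption on my part, so swap the fields if the data is really month-first:

    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class ShortDateSketch {
        public static void main(String[] args) throws Exception {
            // dd.MM.yy is an assumed interpretation of "08.02.05".
            // DateFormat.SHORT only works when the string matches the current locale's short pattern.
            SimpleDateFormat sdf = new SimpleDateFormat("dd.MM.yy");
            Date date = sdf.parse("08.02.05");
            System.out.println(date);
        }
    }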

  • Error while parsing date format

    Hi Friends
    I am using DB2 as the backend for my project and am currently facing an issue with parsing a date.
    The date returned from the DB is in the format yyyy-MM-dd-HH.mm.ss.SSS000 and the return type is String.
    When I try to parse and convert it into a Date object, I get the following error:
    java.text.ParseException: Unparseable date: "2011-05-30-20.23.25.319000"
    The code used to parse it is below:
    String dateTime = "2011-05-30-20.23.25.319000";
    java.text.SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd-HH.mm.ss.SSS000");
    Date date =format.parse(dateTime);
    Please provide your comments on what I am doing wrong here, and if possible let me know the solution.
    Best Regards
    Vikeng

    Use this instead:
    java.text.SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd-HH.mm.ss.SSS");
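
    A hedged sketch of that suggestion, with one caveat that is my own observation rather than part of the reply above: SimpleDateFormat only understands milliseconds, so with lenient parsing the six-digit fraction "319000" can be read as 319000 milliseconds and silently shift the parsed time; truncating the fraction to three digits first avoids that:

    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class Db2TimestampSketch {
        public static void main(String[] args) throws Exception {
            String dateTime = "2011-05-30-20.23.25.319000";
            // Keep only millisecond precision (the first 23 characters) so the lenient
            // parser does not treat "319000" as 319000 milliseconds.
            String millisOnly = dateTime.substring(0, 23);
            SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd-HH.mm.ss.SSS");
            Date date = format.parse(millisOnly);
            System.out.println(date);
        }
    }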

  • RMAN script failed while creating Data Guard 11g

    Hi Friends,
    I am creating a Physical Standby (11g) using RMAN (ACTIVE duplication) on Windows, following the doc: Step by Step Guide on Creating Physical Standby Using RMAN DUPLICATE...FROM ACTIVE DATABASE [ID 1075908.1]
    The folder structure on the Primary DB and on the Physical Standby is totally different.
    While executing the RMAN script I got errors for a lot of locations, so I specified the proper locations in the RMAN script itself (e.g. set diagnostic_dest = 'c:\app\', set db_recovery_file_dest='C:\app\flash_recovery_area').
    Location of control files on Primary DB:
    D:\oradata\MESSTG\CONTROL01.CTL
    D:\oradata\flash_recovery\messtg\MESSTG\CONTROL02.CTL
    Location of control files (planned) on Physical Standby DB:
    C:\app\oradata\MESSTDBY\CONTROL01.CTL
    C:\app\flash_recovery_area\MESSTDBY\CONTROL02.CTL
    How do I specify the control file location of the Physical Standby in set control_files?
    Also, I have 2 control files on the Primary DB, so do I need to specify both control files? If so, how?
    I tried different combinations in the RMAN script but all of them failed:
    set control_files='D:\oradata\MESSTG\CONTROL01.CTL','C:\app\oradata\MESSTDBY\CONTROL01.CTL'
    RMAN Error 1)
    while using set control_files = 'C:\app\oradata\MESSTDBY\CONTROL01.CTL'
    contents of Memory Script:
    backup as copy current controlfile for standby auxiliary format 'D:\ORADATA\MESSTG\CONTROL01.CTL';
    restore clone controlfile to 'C:\APP\ORADATA\MESSTDBY\CONTROL01.CTL' from
    'D:\ORADATA\MESSTG\CONTROL01.CTL';
    executing Memory Script
    Starting backup at 16-APR-13
    channel prmy1: starting datafile copy
    copying standby control file
    released channel: prmy1
    released channel: prmy2
    released channel: prmy3
    released channel: prmy4
    released channel: stby
    RMAN-00571: ===========================================================
    RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
    RMAN-00571: ===========================================================
    RMAN-03002: failure of Duplicate Db command at 04/16/2013 14:18:41
    RMAN-03015: error occurred in stored script Memory Script
    RMAN-03009: failure of backup command on prmy1 channel at 04/16/2013 14:18:41
    ORA-17628: Oracle error 19505 returned by remote Oracle server
    RMAN>
    Error 2)
    while using set control_files='D:\oradata\MESSTG\CONTROL01.CTL','C:\app\oradata\MESSTDBY\CONTROL01.CTL'
    contents of Memory Script:
    backup as copy current controlfile for standby auxiliary format 'D:\ORADATA\MESSTG\CONTROL01.CTL';
    executing Memory Script
    Starting backup at 16-APR-13
    channel prmy1: starting datafile copy
    copying standby control file
    released channel: prmy1
    released channel: prmy2
    released channel: prmy3
    released channel: prmy4
    released channel: stby
    RMAN-00571: ===========================================================
    RMAN-00569: =============== ERROR MESSAGE STACK FOLLOWS ===============
    RMAN-00571: ===========================================================
    RMAN-03002: failure of Duplicate Db command at 04/16/2013 13:22:33
    RMAN-03015: error occurred in stored script Memory Script
    RMAN-03009: failure of backup command on prmy1 channel at 04/16/2013 13:22:33
    ORA-17628: Oracle error 19505 returned by remote Oracle server
    RMAN>
    Regards,
    DB

    Hi,
    Can you paste your RMAN script here?
    I think the following script can help you.
    duplicate target database for standby
    from  active database
    spfile
      set "control_files"="d:\oradata\<standbydbuniquename>\CONTROL01.CTL"
      set "db_name"="<DBNAME>"
      set "db_unique_name"="<STANDBY DB UNIQUE NAME>"
      set "db_file_name_convert"="C:\app\oracle\oradata\<dbname>","d:\oradata\<standbydbuniquename>"
      set "log_file_name_convert"="C:\app\oracle\oradata\<dbname>","d:\oradata\<standbydbuniquename>"
      set "db_recovery_file_dest"="D:\fast_recovery_area"
    nofilenamecheck;
    C:\app\oracle\oradata\<dbname> - this is the primary database datafiles location.
    D:\oradata\<standbydbuniquename> - this is the standby database datafiles location.
    Don't forget to create the diagnostic dests.
    Regards
    Mahir M. Quluzade

  • Data Services job fails while inserting data into SQL Server from Linux

    The SAP Data Services (Data Quality) server is running on a Linux server and on Windows. The Data Services job, which uses the ODBC driver to connect to SQL Server, is failing after selecting a few thousand records, with the following reason given in the Data Services log on the Linux server. We can run the same Data Services job from the Windows server; the only difference is that it uses the SQL Server drivers provided by Microsoft. Of the possible errors listed below, #1 and #4 may not be the reason for the job failure. The DBA checked the other errors and confirmed that the transaction log size is unlimited and the system has space.
    Why does the same job run from the Windows server and fail from Linux? Is it because the ODBC drivers on Windows and Linux work in different ways, or is there a conflict between the Data Services job and the ODBC driver?
    ===== Error Log ===================
    8/25/2009 11:51:51 AM Execution of <Regular Load Operations> for target <DQ_PARSE_INFO> failed. Possible causes: (1) Error in the SQL syntax;
    (2)6902 3954215840 RUN-051005 8/25/2009 11:51:51 AM Database connection is broken; (3) Database related errors such as transaction log is full, etc.; (4) The user defined in the
    6902 3954215840 RUN-051005 8/25/2009 11:51:51 AM datastore has insufficient privileges to execute the SQL. If the error is for preload or postload operation, or if it is for
    ===== Error Log ===================

    This is another method:
    http://www.mssqltips.com/sqlservertip/2484/import-data-from-microsoft-access-to-sql-server/
    Visakh

  • JRE1.5 swing.html parser fails to parse data between script tags

    Hi all...
    I've written a class that extends the Java-provided default HTML parser to parse for text inside a table. In JRE 1.4.* the parser works fine and extracts the data between <script> tags as text. However, now that I've upgraded to 1.5, the data between <script> tags is no longer parsed. Any suggestions, anyone?
    Steve

    According to the API docs, the 1.5 parser supports HTML 3.2, for which the spec states that the content of SCRIPT and STYLE tags should not be rendered. I assume it doesn't have a scripting engine, so it won't get executed either.
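
    A small probe along those lines (my own sketch, not the poster's class): subclass HTMLEditorKit.ParserCallback, track whether the parser is inside a <script> element, and print whatever it actually delivers; the 1.4 vs 1.5 difference described above should show up directly in the output:

    import java.io.StringReader;
    import javax.swing.text.MutableAttributeSet;
    import javax.swing.text.html.HTML;
    import javax.swing.text.html.HTMLEditorKit.ParserCallback;
    import javax.swing.text.html.parser.ParserDelegator;

    public class ScriptTextProbe {
        public static void main(String[] args) throws Exception {
            String html = "<table><tr><td><script>var x = 1;</script></td></tr></table>";
            ParserCallback callback = new ParserCallback() {
                private boolean inScript;
                public void handleStartTag(HTML.Tag t, MutableAttributeSet a, int pos) {
                    if (t == HTML.Tag.SCRIPT) inScript = true;
                }
                public void handleEndTag(HTML.Tag t, int pos) {
                    if (t == HTML.Tag.SCRIPT) inScript = false;
                }
                public void handleText(char[] data, int pos) {
                    System.out.println((inScript ? "script text: " : "text: ") + new String(data));
                }
                public void handleComment(char[] data, int pos) {
                    // Some parser versions hand script bodies back as comments instead.
                    System.out.println("comment: " + new String(data));
                }
            };
            new ParserDelegator().parse(new StringReader(html), callback, true);
        }
    }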

  • ORA-00060 hrjptkfl.ldt, hrzatkfl.ldt  workers failed while uploading data

    Hi,
    I had a strange error while applying the NLS patch: two workers failed with "ORA-00060: deadlock detected while waiting for a resource" for the two files,
    on the table HR_NAVIGATION_UNITS.
    The good thing is that the individual requests these two workers were executing completed successfully.
    After searching a lot I could not find anything relevant to it, so I just restarted the failed workers using adctrl, and the patch completed that phase. I wonder why this happened.
    Anyway, this is not really a question; any ideas are appreciated.
    Regards
    Taher

    Taher,
    "And the patch completed that phase. Wonder why this happened." If the patch completed successfully then there is nothing to worry about. Sometimes when doing the installation the same error is reported in the installation log files, but as long as the installation completes successfully, you can simply ignore those errors.
    Regards,
    Hussein

  • Error while Inserting data into flow table

    Hi All,
    I am very new to ODI and am facing a lot of problems in my 1st interface. So I have many questions here; please forgive me if that irritates you.
    ========================
    I am developing a simple Project to load data from an input source (csv) file into a staging table.
    My plan is to achieve this in 3 interfaces:
    1. Interface-1 : Load the data from an input source (csv) file into a staging table (say Stg_1)
    2. Interface-2 : Read the data from the staging table (stg_1) apply the business rules to it and copy the processed records into another staging table (say stg_2)
    3. Interface-3 : Copy the data from staging table (stg_2) into the target table (say Target) in the target database.
    Question-1 : Is this approach correct?
    ========================
    I don't have any key columns in the staging table (stg_1). When I tried to execute the Flow Control for this, I got an error:
    Flow Control not possible if no Key is declared in your Target Datastore
    Based on one of the responses in this Forum (the response was "FLOW control requires a KEY in the target table"), I introduced a column called "Record_ID", made it a Primary Key column in my staging table (stg_1), and my problem was resolved.
    Question-2 : Is a Key column compulsory in the target table? In BO Data Integrator there is no such compulsion, so I am a little confused.
    ========================
    Next, I have defined one Project-level sequence and mapped the newly introduced key column Record_Id (Primary Key) to it. Now I got another error: "CKM not selected".
    For this, I inserted the "Insert Check (CKM)" knowledge module in my Project, which resolved the "CKM not selected" problem.
    Question-3 : When is this CKM knowledge module required?
    ========================
    After this, the flow/interface is failing while loading data into the intermediate ODI-created flow table (I$):
    1 - Loading - SS_0 - Drop work table
    2 - Loading - SS_0 - Create work table
    3 - Loading - SS_0 - Load data
    5 - Integration - FTE Actual data to Staging table - Drop flow table
    6 - Integration - FTE Actual data to Staging table - Create flow table I$
    7 - Integration - FTE Actual data to Staging table - Delete target table
    8 - Integration - FTE Actual data to Staging table - Insert flow into I$ table
    The error is at Step-8 above. When I opened the "Execution" tab for this step I found the message "Missing parameter Project_1.FTE_Actual_Data_seq_NEXTVAL RECORD_ID".
    Question-4 : What/why is this error? Did I make a mistake while creating the sequence?

    Everyone is new and starts somewhere. And the community is there to help you.
    1.) What is the idea of moving data to stg_1 and then to stg_2? Do you really need it for any purpose other than moving data from the source file to the target DB?
    Otherwise, it is simpler to move data directly from Source File -> Target Table.
    2.) Does your Target table have a Key ?
    3.) CKM (Check KM) is required when you want to do constraint validation (checking) on your data. You can define constraints (business rules) on the target table, and Flow Control will check the data that is flowing from the Source File to the Target table using the CKM. All the records that do not satisfy the constraints will be added to E$ (the error table) and will not be added to the Target table.
    4.) Try to avoid ODI sequences. They are slow and aren't scalable. Try to use a database sequence wherever possible, and use the DB sequence in the target mapping as
    <%=odiRef.getObjectName( "L" , "MY_DB_Sequence_Row" , "D" )%>.nextval
    where MY_DB_Sequence_Row is the oracle sequence in the target schema.
    HTH

  • SSIS 2012 is intermittently failing with the below "Invalid date format" error while importing data from a source table into a destination table with the same exact schema.

    We migrated Packages from SSIS 2008 to 2012. The Package is working fine in all the environments except one.
    SSIS 2012 is intermittently failing with the below error while importing data from a source table into a destination table with the same exact schema.
    Error: 2014-01-28 15:52:05.19
       Code: 0x80004005
       Source: xxxxxxxx SSIS.Pipeline
       Description: Unspecified error
    End Error
    Error: 2014-01-28 15:52:05.19
       Code: 0xC0202009
       Source: Process xxxxxx Load TableName [48]
       Description: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
    An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 11.0"  Hresult: 0x80004005  Description: "Invalid date format".
    End Error
    Error: 2014-01-28 15:52:05.19
       Code: 0xC020901C
       Source: Process xxxxxxxx Load TableName [48]
       Description: There was an error with Load TableName.Inputs[OLE DB Destination Input].Columns[Updated] on Load TableName.Inputs[OLE DB Destination Input]. The column status returned was: "Conversion failed because the data value overflowed
    the specified type.".
    End Error
    But when we reorder the "Updated" column in the destination table, the package imports the data successfully.
    This looks like a bug to me. Any suggestions?

    Hi Mohideen,
    Based on my research, the issue might be related to one of the following factors:
    Memory pressure. Check whether there is memory pressure when the issue occurs. In addition, if the package runs in the 32-bit runtime on that server, use the 64-bit runtime instead.
    A known issue with SQL Native Client. As a workaround, use .NET data provider instead of SNAC.
    Hope this helps.
    Regards,
    Mike Yin
    TechNet Community Support

  • There was an error while writing data back to the server: Failed to commit objects to server : Duplicate object name in the same folder.

    Post Author: dmface15
    CA Forum: Administration
    I am working in a new environment and I am trying to save a report to the Crystal Server via the CMC. I am uploading the report from the Objects tab and attempting to save it to a folder. The report has 1 statically defined parameter and that's it. When I click Submit to save the report I receive the following error message: "There was an error while writing data back to the server: Failed to commit objects to server : Duplicate object name in the same folder." There is not another report within the folder with that name. What could be causing this error message and, equally important, what is the solution?

    The message you received about the duplicate user probably means something hadn't fully updated yet. Once it did, then it worked...
    Regards,
    Tim

  • Failed to parse SQL query: ORA-01403: no data found

    I'm going to post and answer my own question in the hope that others will not have to struggle with this error.
    Using a report of the type PL/SQL Function Body Returning SQL and using generic columns you may run into this error
    failed to parse SQL query:
    ORA-01403: no data found
    The SQL will run stand alone but the report fails.
    There is a setting just below the source you should check:
    "Maximum number of generic report columns"
    In my case the number of columns was dynamic, and when it exceeded the number set as the maximum number of generic columns I received the 1403 error.
    Hope this helps someone.
    Greg

    Thanks so much for the pointer. For anyone else struggling with this too, I found that my generic columns had become unordered. Reordering them solved the problem for me.
    Edited by: user11096971 on Jul 22, 2010 3:19 AM

  • Parsing Date with SimpleDateFormat

    Hi,
    In my application, I want to parse a date which is in String format.
    The format of the date String is "yyyyMMdd'T'HHmmss.SSS'Z'" (e.g. 20031201T100116.000Z).
    //Happy Case - Input Param "20031201T100116.000Z"
    //Output : 2003-12-01 10:01:16.0 is the expected output ------- works well.
    //Exception case Input param Value "00000000T000000.000Z"
    But the above case breaks: the expected output is "0000-00-00 00:00:00", but the output I get is
    "0002-11-30 00:00:00.0"
    I greatly appreciate your inputs.
    Here is my code for the above-stated case:
    private static void printDateFormat(String str) throws Exception {
        String format = "yyyyMMdd'T'HHmmss.SSS'Z'"; // Desired format
        ParsePosition pos = new ParsePosition(0);
        SimpleDateFormat sdf = new SimpleDateFormat(format);
        Date date = sdf.parse(str, pos);
        Calendar cal = Calendar.getInstance();
        cal.setTime(date);
        System.out.println("[After Formatting] " + date.toString());
        java.sql.Timestamp ts = new java.sql.Timestamp(cal.getTime().getTime());
        System.out.println(ts.toString());
    }
    Thanks !
    Priya.Jlus
    IIMS-NZ.

    SimpleDateFormat is set to lenient date parsing by default, so it's trying its best to come up with an actual date for 0000-00-00 (what would you say a 'zero' month should be?). You can prevent this "approximation" by using setLenient(false), but then you will get a null Date reference for non-parseable dates like the one you're using.
    The bottom line is that you're going to have to do some validation for the user input and handle exceptional conditions.
    Hope this helps! :o)
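
    As a small illustration of that (my own sketch, not from the thread): with setLenient(false) and the ParsePosition overload used above, the bogus input comes back as a null Date instead of being rolled into a real one, so the caller has to check for it:

    import java.text.ParsePosition;
    import java.text.SimpleDateFormat;
    import java.util.Date;

    public class StrictParseSketch {
        public static void main(String[] args) {
            SimpleDateFormat sdf = new SimpleDateFormat("yyyyMMdd'T'HHmmss.SSS'Z'");
            sdf.setLenient(false); // reject impossible field values instead of rolling them
            ParsePosition pos = new ParsePosition(0);
            Date date = sdf.parse("00000000T000000.000Z", pos);
            if (date == null) {
                // The ParsePosition overload returns null on a failed strict parse
                // rather than throwing ParseException -- handle the bad input here.
                System.out.println("Invalid timestamp, error index " + pos.getErrorIndex());
            } else {
                System.out.println(date);
            }
        }
    }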

  • Implementing database code fails / error in passing data into a JSP page

    Hi there,
    I have a problem trying to build a portal application. I am using a JSPDynPage, a JSP page and a bean to build the page.
    I have a problem passing data from the JSPDynPage to the bean and on to the JSP page.
    I tested the connection between the bean and the JSP page; there is no error passing information from the bean to the JSP page.
    So I tried to test the database code from the bean.
    Will this work? It does not display the results I want.
    I need a solution ASAP to find out what is wrong, and also the correct code for the database access.
    public String db() {
        try {
            InitialContext iC = new InitialContext();
            DataSource dataSource = (DataSource) iC.lookup("java:env/jdbc/mySQL");
            java.sql.Connection connection = dataSource.getConnection();
            PreparedStatement st = connection.prepareStatement("SELECT name FROM test WHERE id='123'");
            ResultSet resultSet = st.executeQuery();
            while (resultSet.next()) {
                name = resultSet.getString("name");
                name = name.toString();
            }
            connection.close();
            return name;
        } catch (Exception n) {
            e = "Exception";
            return e;
        }
    }
    This is the method I put in the bean to test the database connection.
    Thanks Loads
    Quatre

    Hi there,
    Thanks for the reply; I thought no one was going to reply to me anymore. Thanks loads.
    Bean Class name: Bean1
    Bean Package name: Beans1
    Bean id : myBean1
    JSP Code
    <%@ taglib uri="tagLib" prefix="hbj" %>
    <jsp:useBean id="myBean1" scope="application" class="Beans1.Bean1" />
    <hbj:content id="myContext" >
      <hbj:page title="PageTitle">
       <hbj:form id="myFormId" >
       <hbj:textView id = "ll" text="lalalalalalalalalasasa" />
      <%--
    <hbj:textView id = "l">
                   <% l.setName(myBean1.getName());%>
    </hbj:textView>
    --%>
    <hbj:textView
                    id="appraisal_yr_label"
                    text="Apprasial Year:">
                    </hbj:textView>
                    <hbj:textView id = "la" >
                    <% la.setText(myBean1.getName());%>
                    </hbj:textView>
       </hbj:form>
      </hbj:page>
    </hbj:content>
    Beans1.java
    package Beans1;

    import java.io.Serializable;

    public class Bean1 implements Serializable {
        private String name = new String();
        private String e = new String();
        private String year = new String();

        public void setName(String name) {
            this.name = name;
        }

        public String getName() {
            return name;
        }
    }
    JSPDynPage
    import Beans1.Bean1;
    import com.sapportals.htmlb.*;
    import com.sapportals.htmlb.enum.*;
    import com.sapportals.htmlb.event.*;
    import com.sapportals.htmlb.page.*;
    import com.sapportals.portal.htmlb.page.*;
    import com.sapportals.portal.prt.component.*;
    public class testDBv5 extends PageProcessorComponent {
      public DynPage getPage(){
        return new testDBv5DynPage();
      public static class testDBv5DynPage extends JSPDynPage{
        private Bean1 myBean1= null;
        private String name = new String();
        public void doInitialization(){
              try{
                                  IntitialContext context = new javax.naming.InitialContext();
                                  DataSource dataSource = (DataSource)context.lookup("java:env/jdbc/appDB");
                                  Connection connection = dataSource.getConnection();
                                  Statement stmt = connection.createStatement();
                                  ResultSet rs = stmt.executeQuery("Select AppraisalYear From tblAppraisal Where AppraisalYear='2007'");
                                          while(rs.next()){
                                            year = rs.getString("AppraisalYear");
                                            rs.close();
                                            stmt.close();
                                            connection.close();
                                            return year;
                                       }catch (Exception n){
                                            n.printStackTrace();
              IPortalComponentProfile profile = ((IPortalComponentRequest)getRequest()).getComponentContext().getProfile();
                     Object o = profile.getValue("myBean1");
                     if(o==null || !(o instanceof Bean1)){
                        myBean1 = new Bean1();
                        profile.putValue("myBean1",myBean1);
                     } else {
                          myBean1 = (Bean1) o;
                        Object value = request.getSession().getValue("myBeans1");
                        if (value==null || (value instanceof Bean1)){
                             myBean1 = new Bean1();
                        request.getSession().putValue("myBean1", Object value);
              DataBase cn = new DataBase();
              name = cn.dataBaseConnection();
              myBean1.setName(name);
    //name = "gir";
    //myBean1.setName(name);
    // IPortalComponentRequest request = (IPortalComponentRequest) this.getRequest();
    //IPortalComponentContext myContext = request.getComponentContext();
    //Bean1 myNameContainer = (Bean1) myContext.getValue("myBean1");
    //myNameContainer.setName(name);
          // fill your bean with data here...
        public void doProcessAfterInput() throws PageException {
        public void doProcessBeforeOutput() throws PageException {
          this.setJspName("testDBv5.jsp");
    //testing purpose
      public static void main (String[] arg){
         testDBv5DynPage ef = new testDBv5DynPage();
               ef.doInitialization();
    Thanks Loads
    Quatre
