Export data MaxL failed

Hello all!
I'm trying to export data from a cube using MaxL. I have succeeded doing the same process with many cubes before, but this one is 30 GB in size and it returns this message:
Statement executed with warnings.
Parallel export failed: too many segments[19945] in database. Start single-threaded export.
The thing is that I have not configured it to use more than one thread. What can I do? My MaxL script is:
login 'xxx' 'xxx' on 'xxx';
import database 'app'.'db' data from local text data_file 'c:\cphone.txt'
on error write to 'c:\cphone.err';
logout;
Thanks in advance!

Krishna,
Parallel export failed: too many segments[19945] in database. Start single-threaded export.
^^^ It looks like this is a parallel export.
I had this problem with parallel exports a long (like five+ years) time ago. It had something to do with the dimensionality of the database. I added one more member to a small dimension, maybe Version, and Essbase threw this error. Quite exciting as I was expecting file names to follow the parallel export naming convention. Oh well, a lesson in writing error checking.
The same question was posed back in 2008 and I had the same lack of answer. The OP to that thread never followed up.
WARNING - 1005030 - Parallel export failed: too many segments in database
Maybe someone else has a better answer?
Regards,
Cameron Lackpour
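
For anyone landing here later: in MaxL the difference between a single-threaded and a parallel export is simply how many export files you list (and note that the script quoted in the original post is actually an import statement, not an export). The two statements below are only a sketch; the application, database, and file names are placeholders.

export database 'app'.'db' all data to data_file 'c:\cphone_export.txt';
export database 'app'.'db' level0 data to data_file 'c:\exp1.txt', 'c:\exp2.txt', 'c:\exp3.txt';

Listing more than one file asks Essbase to run the export in parallel; the "too many segments" warning quoted above is what it logs when it decides it cannot, and it then falls back to a single thread.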

Similar Messages

  • Creation of the export data source failed

    I am trying to activate an ODS but am getting a message that the creation of the export data source failed.
    It also gives a message that the RFC connection to the source system is damaged ===> no metadata upload.
    Please can someone help me with this.
    Thanks in advance.
    Michelle

    Hi,
    Check your source system connection by going to RSA1 -> Source System -> right-click and then select Check.
    You will get the info about the source system connection. If the RFC has failed, it is better to get in touch with your Basis team.
    Or try restoring your source system connection via
    Source System -> Restore.
    It sets all the default connections between your systems.
    Once again check the connection. If it says OK, then try the export data source, assign the data source to the info source, and then maintain the update rules.
    Hope this helps-
    Regards-
    MM

  • PSD files created using export data set fail to import into lightroom

    If I create a set of files using variables in a data set, the PSD files won't import into LightRoom. LR says to resave with maximum compatibility, but I already have that set in Photoshop. If I open one of the files and resave it, it will import. Perhaps PS isn't honoring the maximize compatibility setting when exporting.

    Hi,
    I downloaded the 10g database ver 10.2.0.1.0 from the Oracle site and installed it on my laptop. I took an export of a schema which was on my 9i database ver 9.2.0.1.0 using 9i exp, created the tablespace and user in the 10g DB, and tried to import it into the 10g DB using 10g imp with the following:
    imp system/pswd@10gdb fromuser=test touser=test file=expdat.dmp log=expdat.log
    it displays
    .importing SYSTEM'S objects into SYSTEM
    .importing TEST'S objects into TEST
    and it doesn't import any objects, it just hangs. The log file doesn't have any entries.
    Can anyone help me?
    Thanks in advance
    Smdas

  • How to catch failed rows from excel export data conversion

    I am pulling data from SQL Server and exporting to Excel file.  Using SSIS 2008, sending to Excel 2003.  The process is working fine, and I want to grab any data conversion failures, specifically I want to grab any data that fails or is to be truncated.
    I added a flat file destination to the data conversion error line (red) and pointed it to a txt file. This caused an error saying some of the columns were the wrong data type to go in a text file. So I added a data conversion to the first data conversion error line, but the data types won't change.
    The weird thing is, the error says the columns are DT_NTEXT and need to be DT_TEXT, but they aren't, they are DT_WSTR. Anyway, I tried to convert to DT_TEXT and it caused the data conversions in my original conversion to change, which broke the whole package.
    My intention is to grab the failing row so it can be manually converted. So how do I do that without adding 100 more errors?

    Hi teahou,
    do you really use two accounts to post on the MSDN forums?
    I think the data types were not guessed correctly by the Flat File Destination component, so you need to adjust them using the Advanced Editor; the data conversion transformations then naturally become obsolete.
    Arthur
    MyBlog
    Twitter

  • When exporting data from FDM to HFM getting error - Error: Adapter function [fConnect] failed

    Hi ALL!!!
    I am trying to export data from FDM to HFM; I clicked the "Export" option and am getting the error below.
    Error: Adapter function [fConnect] failed.
    Detail: Stacktrace:
    upsWBlockProcessorDM.clsBlockProcessorClass.ActLoad(strLoc[String], strCat[String], strStartPer[String], strEndPer[String], strFile[String], objLoadParam[Object&], blnNoRaiseEvents[Boolean], lngMarshalType[Int32])
    Hyperion.FDM.Dialogs.TargetSystemLoadDialog.buttonOK_ServerClick(sender[Object], e[EventArgs])
    I followed John's suggestions from this link (http://epm.jonhwilliams.com/adapter-function-fconnect-failed-fd/). All the checks are good, but the issue has still not been resolved.
    Could someone please offer your valuable inputs?
    Thanks,
    RSV

    Can you check if the integration options "Use SSO" and "Enable Sticky Server" are set to on?    

  • Data and Cleansing export TO SQL table with Melissa Data appended fails

    I am using Data Quality services with Melissa Data Address Check as reference data.  Everything works fine until I take the option to export Data and Cleansing Info which will give me my cleansed data plus additional data points such as geocodes from
    Melissa.  When I do it fails with the error below.
    (Failed to create a new table geocode in database DQS_STAGING_DATA. Check whether the table already exists  and have the database administrator make sure the DQS Service has CREATE TABLE rights in the destination database and can INSERT to the destination
    table.)
    This error makes no sense as the table does not exist and I do have the proper rights. I can export Data and Cleansing data if Melissa Data is not involved. When I dig further it seems to be complaining about column header lengths.
    The identifier that starts with 'Address Validation_Melissa Data Corporation - Address Check - Verify, Correct, Geocode US and Canadian Addresses_CBSADivisionCod' is too long. Maximum length is 128.;
    The identifier that starts with 'Address Validation_Melissa Data Corporation - Address Check - Verify, Correct, Geocode US and Canadian Addresses_DeliveryPointCo' is too long. Maximum length is 128.;
    The identifier that starts with 'Address Validation_Melissa Data Corporation - Address Check - Verify, Correct, Geocode US and Canadian Addresses_ResponseRecordI' is too long. Maximum length is 128.;
    The identifier that starts with 'Address Validation_Melissa Data Corporation - Address Check - Verify, Correct, Geocode US and Canadian Addresses_DeliveryPointCh' is too long. Maximum length is 128.;
    The identifier that starts with 'Address Validation_Melissa Data Corporation - Address Check - Verify, Correct, Geocode US and Canadian Addresses_CBSADivisionLev' is too long. Maximum length is 128.;
    The identifier that starts with 'Address Validation_Melissa Data Corporation - Address Check - Verify, Correct, Geocode US and Canadian Addresses_CongressionalDi' is too long. Maximum length is 128.;
    The identifier that starts with 'Address Validation_Melissa Data Corporation - Address Check - Verify, Correct, Geocode US and Canadian Addresses_CBSADivisionTit' is too long. Maximum length is 128.;
    I can see no option to control these column headers in DQS. Has anyone else experienced this? Does anyone know of a workaround?
    I have already reported to Melissa data and they agreed the problem was the column header length but said they also had no control of that.

    Hello,
    You can create an SR with an outbound filter. All objects that match the filter will be provisioned to the SQL CS (if you do not define a filter, all objects will be provisioned).
    Or you can create MV extension rules.
    Regards,
    Sylvain

  • Export data to MS excel dynamically, randomly fail

    Hello, there,
    I developed a package to export data from SQL 2005 to an xlsx file.
    In the package, I have a foreach loop to loop through a recordset, i.e. Group_Name, Group_ID.
    Inside the foreach loop, I create a spreadsheet using the Group_Name (passed by variable) in the destination excel file and then use Group_ID to select data that belongs to that Group_ID and populate the data into the Group_Name tab.
    So, if there are 10 Group_Names, then 10 spreadsheets will be created in the excel file, if there are 50 group_names, then 50 will be created ...
    I run this package in my local machine for now...
    What puzzles me is that this package randomly fails in the middle of the process. Sometimes it succeeds. And I am running the same data ... Even when it fails, it does not fail on any particular spreadsheet (group_name); sometimes this one, sometimes that one ...
    When it fails, it gives me an error message like this:
    error: 0xC0202009 at Data Flow Task - PopulateData, Excel Destination [12]: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040E37.
    Error: 0xC02020E8 at Data Flow Task - PopulateData, Excel Destination [12]: Opening a rowset for "GroupName_101" failed. Check that the object exists in the database.
    Error: 0xC004706B at Data Flow Task - PopulateData, SSIS.Pipeline: "Excel Destination" failed validation and returned validation status "VS_ISBROKEN".
    Error: 0xC004700C at Data Flow Task - PopulateData, SSIS.Pipeline: One or more component failed validation.
    Error: 0xC0024107 at Data Flow Task - PopulateData: There were errors during task validation.
    I really want to understand why it is not stable.
    Thanks.

    Thanks for the reply. My process is very similar to yours, using a variable to create the sheets. I am just running it on my desktop: right-click the package, click Execute ... It randomly fails.
    At some point I suspected it was because I have too many applications open: I have VS 2010 open, where this project is being developed, BIDS 2008 open, a couple of SSMS windows open ...
    Not very sure why, just wonder if anybody else had this experience ...
    Thanks.
    SQL Developer from PA
    I think I should tell you that the process is supposed to create about 80 spreadsheets in one Excel file, and quite often it fails at around the 20th sheet. But sometimes it finishes creating and populating data in all 80 sheets without any problem.

  • Exporting data in Applicationmanager fails with Error 10054

    Hi all, when I try to export data via Application Manager, the application terminates immediately with an error message "can't read data - Error 10054 - network error", "can't write data - Error 10054 - network error", and sometimes with the error message "unable to allocate memory - Error 10054". We have an IBM server 340 with 1 GByte RAM and hard disk space for Hyperion of about 3 GByte in RAID 1, running Essbase 6.1. Does anybody know this problem??? Help to solve this issue would be great. Thanks, Joerg [email protected]

    A couple of tips:
    1. Try stopping and restarting the application, then try to export.
    2. If that doesn't work, try stopping and restarting the application and the server, reconnecting, and then doing the export.
    3. Check that you have a good connection to the server. When you run an export you must maintain a connection to the server; if you lose it, the export stops.
    4. If all else fails, open Application Manager on the server itself and run the export from there.
    Hope this helps,
    CB

  • What is your recommendation to improve data export time - MaxL?

    We have a scheduled data export for backup every day using MaxL; it takes 2 hours and we would like to decrease it to 30 min or less.
    What is your recommendation?
    1. Hot back-up
    2. Parallel Calc (if possible)
    3. Copy DB files on Unix box.
    4. ???
    Environment 9.3.0.1
    Thanks in advance!
    Edited by: venuramini on Jul 10, 2009 2:52 PM

    CL,
    I have done the following and my results are very encouraging.
    ExportThreads = 1:   0 lvl  9.40 min, 2.8 GB     All lvl  123.00 min, 2.8 GB
    ExportThreads = 2:   0 lvl  4.55 min, 2.8 GB     All lvl  72.78 min, 2.8 GB
    ExportThreads = 3:   0 lvl  3.32 min, 2.8 GB     All lvl  50.5 min, 2.8 GB
    ExportThreads = 4:   0 lvl  2.56 min, 2.8 GB     All lvl  36.8 min, 2.8 GB
    The only issue with exporting level 0 data is that it has to be re-aggregated after a reload. But that is a small inconvenience compared to taking a long time to export data that we very rarely need.
    Venu
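
    For anyone trying to reproduce this: the thread counts above presumably correspond to the EXPORTTHREADS setting in essbase.cfg, and in MaxL the same effect comes from listing one export file per desired thread. Both lines below are only a sketch with placeholder names and an assumed thread count, so check the Tech Ref for your release.

    EXPORTTHREADS 4

    export database 'app'.'db' level0 data to data_file 'exp1.txt', 'exp2.txt', 'exp3.txt', 'exp4.txt';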

  • Exporting data

    Hi all
    I am trying to export a cube into a text file for archiving purposes. I am using Application Manager 6.5.3.
    I click on Database > Export, select 'All data', tick the Export in Column Format button and type in the path of a location on my C:\ drive. I then get the error
    "Ascii Backup:Failed to open [C:\xxxxxxx].
    From having a quick browse of the forums it appears that you can only export in this fashion to the actual server Essbase is installed on, so I tried just putting the filename in the Server File Name box and that seems to work.
    However, being a relative Essbase virgin I'm not sure where the file actually goes on the server! Can anyone help? Ideally I would like a way to export cubes directly to my C:\ drive as txt files, but if there is a simple workaround that would be great as well.
    Thanks in advance
    Kryz

    Hi,
    The tech ref I believe has enough information to get you started: http://download.oracle.com/docs/cd/E10530_01/doc/epm.931/html_esb_techref/techref.htm
    I know it is for 9.3.1, but in the MaxL section there is an overview of what is new in each version so you don't try to use a command that is not available in the version you are using.
    Have a look at the export data command.
    Cheers
    John
    http://john-goodwin.blogspot.com/
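
    For what it's worth, the MaxL equivalent of the Application Manager export is a single statement. If no full path is given, the export file is created on the Essbase server (under ARBORPATH, typically in the application's directory), so it then has to be copied down to your own machine. This is only a sketch with placeholder names; the exact keyword for column-format output should be checked in the export data entry of the Tech Ref John linked above.

    export database 'app'.'db' all data to data_file 'archive.txt';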

  • Export Data - Error: 1005000 Ascii Backup

    I'm attempting to export data from a database to a folder on my C: drive. I have tried both MaxL and EAS but am receiving the following message:
    "Error: 1005000 ASCII Backup: Failed to open [C:\PlanningUploads\ExportFile\ExportRes.txt]."
    I have never done this before, so I would appreciate advice.
    Many thanks
    David

    You should export to a folder on the Essbase server. You can then copy to a folder on your C drive if you wish.

  • Help needed with Export Data Pump using API

    Hi All,
    I am trying to do an export using the Data Pump API.
    While the export as well as the import works fine from the command line, it fails with the API.
    This is the command line program:
    expdp pxperf/dba@APPN QUERY=dev_pool_data:\"WHERE TIME_NUM > 1204884480100\" DUMPFILE=EXP_DEV.dmp tables=PXPERF.dev_pool_data
    Could you help me with how I should achieve the same as above with the Oracle Data Pump API?
    DECLARE
    h1 NUMBER;
    BEGIN
    h1 := dbms_datapump.open('EXPORT','TABLE',NULL,'DP_EXAMPLE10','LATEST');
    dbms_datapump.add_file(h1,'example3.dmp','DATA_PUMP_TEST',NULL,1);
    dbms_datapump.add_file(h1,'example3_dump.log','DATA_PUMP_TEST',NULL,3);
    dbms_datapump.metadata_filter(h1,'NAME_LIST','(''DEV_POOL_DATA'')');
    dbms_datapump.start_job(h1);
    dbms_datapump.detach(h1);
    END;
    /
    Also, in the API I want to know how to export and import multiple tables (selective tables only) using one single criterion like "WHERE TIME_NUM > 1204884480100\"

    Yes, I have read the Oracle doc.
    I was able to proceed as below, but it gives an error.
    ============================================================
    SQL> SET SERVEROUTPUT ON SIZE 1000000
    SQL> DECLARE
    2 l_dp_handle NUMBER;
    3 l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    4 l_job_state VARCHAR2(30) := 'UNDEFINED';
    5 l_sts KU$_STATUS;
    6 BEGIN
    7 l_dp_handle := DBMS_DATAPUMP.open(
    8 operation => 'EXPORT',
    9 job_mode => 'TABLE',
    10 remote_link => NULL,
    11 job_name => '1835_XP_EXPORT',
    12 version => 'LATEST');
    13
    14 DBMS_DATAPUMP.add_file(
    15 handle => l_dp_handle,
    16 filename => 'x1835_XP_EXPORT.dmp',
    17 directory => 'DATA_PUMP_DIR');
    18
    19 DBMS_DATAPUMP.add_file(
    20 handle => l_dp_handle,
    21 filename => 'x1835_XP_EXPORT.log',
    22 directory => 'DATA_PUMP_DIR',
    23 filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    24
    25 DBMS_DATAPUMP.data_filter(
    26 handle => l_dp_handle,
    27 name => 'SUBQUERY',
    28 value => '(where "XP_TIME_NUM > 1204884480100")',
    29 table_name => 'ldev_perf_data',
    30 schema_name => 'XPSLPERF'
    31 );
    32
    33 DBMS_DATAPUMP.start_job(l_dp_handle);
    34
    35 DBMS_DATAPUMP.detach(l_dp_handle);
    36 END;
    37 /
    DECLARE
    ERROR at line 1:
    ORA-39001: invalid argument value
    ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3043
    ORA-06512: at "SYS.DBMS_DATAPUMP", line 3688
    ORA-06512: at line 25
    ============================================================
    I have a table called LDEV_PERF_DATA and it is in schema XPSLPERF.
    value => '(where "XP_TIME_NUM > 1204884480100")' above is the condition on which I want to filter the data.
    However, the below snippet works fine.
    ============================================================
    SET SERVEROUTPUT ON SIZE 1000000
    DECLARE
    l_dp_handle NUMBER;
    l_last_job_state VARCHAR2(30) := 'UNDEFINED';
    l_job_state VARCHAR2(30) := 'UNDEFINED';
    l_sts KU$_STATUS;
    BEGIN
    l_dp_handle := DBMS_DATAPUMP.open(
    operation => 'EXPORT',
    job_mode => 'SCHEMA',
    remote_link => NULL,
    job_name => 'ldev_may20',
    version => 'LATEST');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.dmp',
    directory => 'DATA_PUMP_DIR');
    DBMS_DATAPUMP.add_file(
    handle => l_dp_handle,
    filename => 'ldev_may20.log',
    directory => 'DATA_PUMP_DIR',
    filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
    DBMS_DATAPUMP.start_job(l_dp_handle);
    DBMS_DATAPUMP.detach(l_dp_handle);
    END;
    ============================================================
    I don't want to export all contents as above; I want to export data based on some conditions and only for selected tables.
    Any help is highly appreciated.

  • PROBLEM IN EXPORTING DATA FROM A RELATIONAL TABLE TO ANOTHER RELATIONAL TAB

    Hi,
    While trying to export data from a source table to a target table, a problem occurs with loading the data into the work table (SrcSet0) [as shown in the Operator]. The work table has been dropped and created successfully; the problem comes with loading the data into this work table (SrcSet0). The error details are mentioned below. Please advise:
    ODI-1227: Task SrcSet0 (Loading) fails on the source ORACLE connection ORACLE_SOURCE.
    Caused By: java.sql.SQLException: SQL string is not Query
         at oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1442)
         at oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:3752)
         at oracle.jdbc.driver.OraclePreparedStatement.executeQuery(OraclePreparedStatement.java:3806)
         at oracle.jdbc.driver.OraclePreparedStatementWrapper.executeQuery(OraclePreparedStatementWrapper.java:1667)
         at oracle.odi.query.JDBCTemplate.executeQuery(JDBCTemplate.java:189)
         at oracle.odi.runtime.agent.execution.sql.SQLDataProvider.readData(SQLDataProvider.java:89)
         at oracle.odi.runtime.agent.execution.sql.SQLDataProvider.readData(SQLDataProvider.java:1)
         at oracle.odi.runtime.agent.execution.DataMovementTaskExecutionHandler.handleTask(DataMovementTaskExecutionHandler.java:67)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:537)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
         at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
         at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
         at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
         at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
         at java.lang.Thread.run(Thread.java:662)
    Thanks
    Anindya

    Hi actdi,
    This is my KM_IKM SQL Incremental Update.xml code.
    I have found it (for (int i=odiRef.getDataSetMin(); i <= odiRef.getDataSetMax(); i++)) and highlighted it (bold text).
    So, please advise.
    Here is the main part of the code, because the full code is too long to post; it exceeds the maximum length for the message.
    So here it is:
    <Object class="com.sunopsis.dwg.dbobj.SnpTxtHeader">
    <Field name="Enc" type="java.lang.String">null</Field>
    <Field name="EncKey" type="java.lang.String">null</Field>
    <Field name="ITxt" type="com.sunopsis.sql.DbInt"><![CDATA[1695003]]></Field>
    <Field name="ITxtOrig" type="com.sunopsis.sql.DbInt"><![CDATA[102]]></Field>
    <Field name="SqlIndGrp" type="java.lang.String"><![CDATA[2]]></Field>
    <Field name="Txt" type="java.lang.String"><![CDATA[insert into <%=odiRef.getTable("L","INT_NAME","A")%>
    <%=odiRef.getColList("", "[COL_NAME]", ",\n\t", "", "(((INS or UPD) and !TRG) and REW)")%>,
    IND_UPDATE
    *<%for (int i=odiRef.getDataSetMin(); i <= odiRef.getDataSetMax(); i++){%>*
    <%=odiRef.getDataSet(i, "Operator")%>
    select <%=odiRef.getPop("DISTINCT_ROWS")%>
    <%=odiRef.getColList(i,"", "[EXPRESSION]", ",\n\t", "", "(((INS or UPD) and !TRG) and REW)")%>,
    <% if (odiRef.getDataSet(i, "HAS_JRN").equals("1")) { %>
    JRN_FLAG IND_UPDATE
    <%} else {%>
    'I' IND_UPDATE
    <%}%>
    from <%=odiRef.getFrom(i)%>
    where (1=1)
    <%=odiRef.getJoin(i)%>
    <%=odiRef.getFilter(i)%>
    <%=odiRef.getJrnFilter(i)%>
    <%=odiRef.getGrpBy(i)%>
    <%=odiRef.getHaving(i)%>
    <%}%>
    ]]></Field>
    </Object>
    <Object class="com.sunopsis.dwg.dbobj.SnpLineTrt">
    <Field name="AlwaysExe" type="java.lang.String"><![CDATA[0]]></Field>

  • Error while exporting data from HCM to VDS

    Hi,
    When I am trying to export data from the SAP HCM system using RPLDAP_EXTRACT, the export fails and the log shows that the LDAP_CREATE failed. In the VDS log:
    com.sap.idm.vds.oper.main_listener Message:
    Exception on add: class java.lang.Integer:null incompatible with class
    java.lang.String:null
    I verified the database connectivity from VDS to IDS, which seems fine since the LDAP_SEARCH is working fine.
    Cheers !
    Zaheer

    It was a bug in SP2, it is fixed in SP2 Patch 4.
    Cheers !!
    Zaheer

  • SQL Developer 2.1.0.63.73 exports DATE as TIMESTAMP

    I believe this is a bug. When I export a table to an XLS file, the values contained in DATE columns are saved as if they were TIMESTAMP:
    e.g. 31-DEC-09 12:00:00 AM would export as 31-DEC-09 12.00.00.000000000 AM
    Not really a huge deal until you try to import it back in, in which case you can't import a TIMESTAMP into a DATE column. First, you'll get an error about the AM/A.M. or PM/P.M. missing. You can't explicitly specify the date format during the import either as Oracle rejects it since it's not supported. The proper way is to cast it back to a date, but you can't do that through the import function.
    Regardless, I think the export function should export DATEs according to the Date Format NLS settings, but it does not.
    If it makes any difference, I'm using the 64-bit Windows version of SQL Developer on Windows 7 64-bit with the Oracle 64-bit client.

    Hi,
    Not sure if this is something related to my previous problem.
    My SQL Developer gives the correct date format when exporting to Excel, but fails when exporting INSERT statements.
    Vasan, one of the SQL Developer team members, gave this workaround, which solved my problem:
    >
    You can add the following in the sqldeveloper.conf to ensure that the driver doesn't report the column type of a DATE column as TIMESTAMP.
    AddVMOption -Doracle.jdbc.mapDateToTimestamp=false
    >
    as suggested in this thread
    Re: 2.1 EA1: Problems on Date type columns
    Hope this helps,
    Buntoro
