Timestamp datatype issue in Expression

Hi
I am extracting data from an Oracle table that has a TIMESTAMP column, connected to an Expression operator. When I check the column's data type in the source table it is TIMESTAMP(3), but in the Expression operator the precision defaults to 6 and cannot be changed. I tried creating a new column in the Expression and it behaved the same way, so my mapping now raises warnings about the mismatch.
Could you please help with this?
Thanks

Hi Cezar,
Thanks for your quick reply.
I am currently doing the second step you mentioned. I even went to the extent of customizing the I$ tables that get created, giving them a custom structure to avoid this. But that is very tedious, as I have hundreds of columns in the table.
Is this a bug in ODI? I ask because this is basic functionality and should ideally be covered by such a good ELT tool.
I have raised an SR for it; let us see what solution they have in store. :)
Anyway, thanks for your valuable suggestion, it is really useful.
I do have a doubt about versions, though: what is the difference between 10.1.3.2.0, which I am using, and 10.1.3.4.0? I hope my version is not causing this issue?
Thanks,
Vikram

Similar Messages

  • How to obtain the number of seconds between 2 fields of TIMESTAMP datatype

    Hello, I need the result of the difference between 2 fields (date1 - date2) which are of TIMESTAMP datatype.
    The result must be given in seconds.
    I am using OWB 10.2.1.0.31.
    Can someone help me?
    Thanks

    check the Puget Sound Oracle Users Group page (http://www.psoug.org/) at
    http://www.psoug.org/reference/timestamp.html
    And more information on the INTERVAL DAY TO SECOND datatype at
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14200/sql_elements001.htm#SQLRF00207
    here's an excerpt from the PSOUG page:
    CREATE TABLE tint_test (
      msg        VARCHAR2(25),
      start_date TIMESTAMP WITH TIME ZONE,
      end_date   TIMESTAMP WITH TIME ZONE,
      duration_1 INTERVAL DAY(5) TO SECOND,
      duration_2 INTERVAL YEAR TO MONTH);

    INSERT INTO tint_test (msg, start_date, end_date)
    VALUES ('my plane ride',
            TIMESTAMP '2004-08-08 17:02:32.212 US/Eastern',
            TIMESTAMP '2004-08-08 19:10:12.235 US/Pacific');

    UPDATE tint_test
    SET duration_1 = (end_date - start_date) DAY(5) TO SECOND,
        duration_2 = (end_date - start_date) YEAR TO MONTH;

    SELECT msg, duration_1, duration_2 FROM tint_test;
    SELECT t.*, end_date - start_date FROM tint_test t;
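    Since the question asks for the result in seconds, one way to get there (a sketch, not from the thread) is to EXTRACT the components of the INTERVAL DAY TO SECOND difference and sum them:

    ```sql
    -- Hypothetical example: total seconds between two TIMESTAMP columns,
    -- built on the tint_test table above. Subtracting two timestamps
    -- yields an INTERVAL DAY TO SECOND, whose parts EXTRACT can read.
    SELECT msg,
           EXTRACT(DAY    FROM (end_date - start_date)) * 86400
         + EXTRACT(HOUR   FROM (end_date - start_date)) * 3600
         + EXTRACT(MINUTE FROM (end_date - start_date)) * 60
         + EXTRACT(SECOND FROM (end_date - start_date)) AS diff_seconds
    FROM   tint_test;
    ```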

  • Timestamp datatype not output correctly in table export

    When using the data export from the table view, the TIMESTAMP datatype is not handled correctly: it shows as oracle.sql.TIMESTAMP@14c0761.
    It works fine from the SQL Explorer view, though.

    I'm using the same build, 1.0.0.15.27.
    You can try any export option; I tried SQL Insert.
    If you right-click from the data grid (SQL Worksheet or Table view) it works fine.
    In the Table view, if you go to Actions -> Export -> SQL Insert, then it doesn't.

  • TIMESTAMP datatype in Oracle 8i.

    The TIMESTAMP datatype is not available in Oracle 8i.
    Is there an equivalent datatype in Oracle 8i for the 9i TIMESTAMP?
    Regards,
    Bhagat

    TIMESTAMP was a new globalization feature in 9i. The only way to handle time in 8i is with the regular DATE type and traditional date arithmetic.
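    For example (a sketch, not from the thread), DATE arithmetic in 8i yields a difference in days, so sub-day resolution is obtained by scaling, down to one second at best:

    ```sql
    -- Hypothetical example: seconds elapsed between two DATE values,
    -- where DATE subtraction returns a NUMBER of days.
    SELECT ROUND((end_dt - start_dt) * 86400) AS elapsed_seconds
    FROM   (SELECT TO_DATE('2004-08-08 17:02:32', 'YYYY-MM-DD HH24:MI:SS') AS start_dt,
                   TO_DATE('2004-08-08 19:10:12', 'YYYY-MM-DD HH24:MI:SS') AS end_dt
            FROM   dual);
    ```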

  • ORA-01790: expression must have same datatype as corresponding expression

    can somebody see a problem with the query below:
    error message: ORA-01790: expression must have same datatype as corresponding expression
    select CREV_D_POSTMARK,CRF1_A_PREM_CREDIT,CRF1_A_SPREM_FLAT,CRF1_A_SPREM_VAR,HSTA_C_UPDATE_SEQ, DTAX_C_LINE_SEQ,
    SCHA_C_PV_EIN_5500, SCHA_C_PV_PN_5500, CRBH_D_RECEIVE, SCHA_A_SIG_EVENT,SCHA_C_EIN_PAGE_2, SCHA_C_PN_PAGE_2,
    SCHA_D_SIGN_ADMN, SCHA_D_UNF_BEN, SCHA_F_INIT_412I, SCHA_F_INIT_AJ_ACT, SCHA_F_INIT_AJ_ADM, SCHA_F_INIT_ENTITL,
    SCHA_F_INIT_GEN_RL, SCHA_F_INIT_NO_UNF, SCHA_F_INIT_NOPROF, SCHA_F_INIT_PART, SCHA_F_INIT_REQMT, SCHA_F_INIT_SIG_EV,
    SCHA_L_SIGN_NAME, SCHA_A_ADJ_VAL_RCV, SCHA_A_ADJ_VAL_TOT, SCHA_A_ADJ_VAL_ASS, SCHA_A_ADJ_VL_NRCV, SCHA_A_ASSET_CONT,
    SCHA_A_ASSET_DISC, SCHA_A_ASSET_VALUE, SCHA_A_SPREM_VAR, SCHA_A_VBVAL_NRCV, SCHA_A_VBVAL_RCV, SCHA_A_VBVAL_TOTAL,
    SCHA_C_EIN, SCHA_C_PN, SCHA_C_FORM_YEAR, SCHA_D_ASSET_VALUE,SCHA_D_DETER_AS_OF, SCHA_D_DIST_INVOL, SCHA_D_STAND_TERM,
    SCHA_D_VAL_PYC, SCHA_F_NO_SIG_EV,SCHA_F_SIG_EVENT_1, SCHA_F_SIG_EVENT_2, SCHA_F_SIG_EVENT_3, SCHA_F_SIG_EVENT_4,
    SCHA_F_SIG_EVENT_5,SCHA_F_SIG_EVENT_6, SCHA_F_SIG_EVENT_7, SCHA_N_PLAN_NAME, SCHA_P_REQ_INT_RAT, SCHA_P_VBVAL_NRCV,
    SCHA_P_VBVAL_RCV, SCHA_Q_ASS_RET_AGE, SCHA_T_ACCRUAL_FAC, SCHA_F_CERT_NREQ_I, SCHA_F_CERT_REQ_I,SCHA_F_CERT_EXP_A,
    SCHA_F_SIG_EV_NEG, SCHA_A_INT_ADJ_VAL, SCHA_D_SIGN_ACTU, SCHA_C_ENCL_SEQ,DTAX_C_UID, DTAX_A_AMOUNT_PAID, CRF1_A_NET_PREMIUM,
    CRBH_Q_SOA_PAYMENT, CRBH_A_CONTROL_TOT,CSOA_A_PREM_PAYMT, CSOA_A_PEN_PAYMENT, CSOA_A_INT_PAYMENT, CSOA_A_PAYMENT from [email protected] minus
    select CREV_D_POSTMARK,CRF1_A_PREM_CREDIT,CRF1_A_SPREM_FLAT,CRF1_A_SPREM_VAR,HSTA_C_UPDATE_SEQ, DTAX_C_LINE_SEQ,
    SCHA_C_PV_EIN_5500, SCHA_C_PV_PN_5500, CRBH_D_RECEIVE, SCHA_A_SIG_EVENT,SCHA_C_EIN_PAGE_2, SCHA_C_PN_PAGE_2,
    SCHA_D_SIGN_ADMN, SCHA_D_UNF_BEN, SCHA_F_INIT_412I, SCHA_F_INIT_AJ_ACT, SCHA_F_INIT_AJ_ADM, SCHA_F_INIT_ENTITL,
    SCHA_F_INIT_GEN_RL, SCHA_F_INIT_NO_UNF, SCHA_F_INIT_NOPROF, SCHA_F_INIT_PART, SCHA_F_INIT_REQMT, SCHA_F_INIT_SIG_EV,
    SCHA_L_SIGN_NAME, SCHA_A_ADJ_VAL_RCV, SCHA_A_ADJ_VAL_TOT, SCHA_A_ADJ_VAL_ASS, SCHA_A_ADJ_VL_NRCV, SCHA_A_ASSET_CONT,
    SCHA_A_ASSET_DISC, SCHA_A_ASSET_VALUE, SCHA_A_SPREM_VAR, SCHA_A_VBVAL_NRCV, SCHA_A_VBVAL_RCV, SCHA_A_VBVAL_TOTAL,
    SCHA_C_EIN, SCHA_C_PN, SCHA_C_FORM_YEAR, SCHA_D_ASSET_VALUE,SCHA_D_DETER_AS_OF, SCHA_D_DIST_INVOL, SCHA_D_STAND_TERM,
    SCHA_D_VAL_PYC, SCHA_F_NO_SIG_EV,SCHA_F_SIG_EVENT_1, SCHA_F_SIG_EVENT_2, SCHA_F_SIG_EVENT_3, SCHA_F_SIG_EVENT_4,
    SCHA_F_SIG_EVENT_5,SCHA_F_SIG_EVENT_6, SCHA_F_SIG_EVENT_7, SCHA_N_PLAN_NAME, SCHA_P_REQ_INT_RAT, SCHA_P_VBVAL_NRCV,
    SCHA_P_VBVAL_RCV, SCHA_Q_ASS_RET_AGE, SCHA_T_ACCRUAL_FAC, SCHA_F_CERT_NREQ_I, SCHA_F_CERT_REQ_I,SCHA_F_CERT_EXP_A,
    SCHA_F_SIG_EV_NEG, SCHA_A_INT_ADJ_VAL, SCHA_D_SIGN_ACTU, SCHA_C_ENCL_SEQ,DTAX_C_UID, DTAX_A_AMOUNT_PAID, CRF1_A_NET_PREMIUM,
    CRBH_Q_SOA_PAYMENT, CRBH_A_CONTROL_TOT,CSOA_A_PREM_PAYMT, CSOA_A_PEN_PAYMENT, CSOA_A_INT_PAYMENT, CSOA_A_PAYMENT from
    [email protected] where to_number(to_char(asum_d_pyc,'YYYY')) = 2001;
    Message was edited by:
    mc

    Any suggestion on an easy way to verify which column may have a different datatype?
    The table has quite a few columns, so it will take time to do it manually....
    Is there a better way to compare these two tables?
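    One approach (a sketch, not from the thread; the table name and database link name are placeholders) is to compare the dictionary views on both sides and list only the columns whose types differ:

    ```sql
    -- Hypothetical example: find columns whose datatype, length, or
    -- precision differ between the local table and the remote one.
    SELECT a.column_name,
           a.data_type   AS local_type,   b.data_type   AS remote_type,
           a.data_length AS local_len,    b.data_length AS remote_len
    FROM   user_tab_columns a
    JOIN   user_tab_columns@remote_db b
           ON  b.table_name  = a.table_name
           AND b.column_name = a.column_name
    WHERE  a.table_name = 'ASUM'
      AND (a.data_type   <> b.data_type
       OR  a.data_length <> b.data_length
       OR  NVL(a.data_precision, -1) <> NVL(b.data_precision, -1));
    ```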

  • Ora-01790 Expression must have same datatype as corresponding expression

    Can somebody help me fix this?
    award_number is VARCHAR2(30).
    The problem is with award_name; it throws an error, and I am not sure why:
    ora-01790 Expression must have same datatype as corresponding expression
    SELECT *
    FROM   award_test
    UNPIVOT (VAL FOR operator IN (
               AWARD_NAME,
               TOTAL_PROCEEDS,
               EARNING_PROCS,
               TOT_PROCS_EARNINGS,
               GROSS_PROCS,
               PROC_REF_DEF_ESCR,
               OTH_UNSP_PROCS,
               ISSUANCE_COST,
               WORK_CAP_EXP))
    PIVOT  (MAX(VAL) FOR award_number IN ('XIAAE' AS "XIAAE", 'XIBNG' AS "XIBNG"))

    It has not resolved my problem; it is still there. If I remove award_name it works fine, so the problem is only with award_name.
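    UNPIVOT requires every column in its IN list to share one datatype; if award_name is character data while the other columns are numeric, ORA-01790 follows. One workaround (a sketch, not from the thread; only a few of the original columns are shown) is to unpivot from an inline view that casts everything to a common type first:

    ```sql
    -- Hypothetical example: convert all unpivoted columns to VARCHAR2 so
    -- their datatypes match before the UNPIVOT runs.
    SELECT *
    FROM  (SELECT award_number,
                  award_name,
                  TO_CHAR(total_proceeds) AS total_proceeds,
                  TO_CHAR(earning_procs)  AS earning_procs
           FROM   award_test)
    UNPIVOT (val FOR operator IN (award_name, total_proceeds, earning_procs));
    ```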

  • ORACLE TIMESTAMP DataType support in Toplink ?

    Currently we have an application that needs to create timestamps with precision down to thousandths of a second.
    Do you know of any other customers with similar requirements, and how they solved this problem?
    We tried changing the SQL DDL to switch the datatype from DATE to TIMESTAMP(3), which can support 1/1000-second precision.
    We find that if our Oracle column is defined as Oracle DATE, the last 3 digits are dropped, which causes duplicate-key exceptions for records that
    get inserted within 1 second of each other, because the timestamp is part of the primary key.
    ts '2004-03-12 17:13:27.792'
    So we changed the Oracle column from DATE to TIMESTAMP(3).
    What we find is that TopLink produces this exception:
    Exception [TOPLINK-3001] (OracleAS TopLink - 10g (9.0.4) (Build 031126)): oracle.toplink.exceptions.ConversionException
    Exception Description: The object [oracle.sql.TIMESTAMP@321b5e39], of class [class oracle.sql.TIMESTAMP], could not be converted to [class java.util.Date].
    at oracle.toplink.exceptions.ConversionException.couldNotBeConverted(ConversionException.java:35)
    at oracle.toplink.internal.helper.ConversionManager.convertObjectToUtilDate(ConversionManager.java:679)
    at oracle.toplink.internal.helper.ConversionManager.convertObject(ConversionManager.java:97)
    at oracle.toplink.internal.databaseaccess.DatabasePlatform.convertObject(DatabasePlatform.java:55
    Then we tried to change our Java code, modifying the Java instance variable type from java.util.Date to java.sql.Timestamp,
    and we got the following error:
    Exception [TOPLINK-3001] (OracleAS TopLink - 10g (9.0.4) (Build 031126)): oracle.toplink.exceptions.ConversionException
    Exception Description: The object [oracle.sql.TIMESTAMP@731de027], of class [class oracle.sql.TIMESTAMP], could not be converted to [class java.sql.Timestamp].
    at org.apache.xerces.impl.XMLNamespaceBinder.endElement(XMLNamespaceBinder.java:650)
    at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanEndElement(XMLDocumentFragmentScannerImpl.java:1011)
    at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl$FragmentContentDispatcher.dispatch(XMLDocumentFragmentScannerImpl.java:1564)
    at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:335)
    We cannot seem to find how to specify TIMESTAMP in the TopLink Mapping Workbench.
    ========================================================================================================
    The TIMESTAMP Datatype
    The new TIMESTAMP datatype is almost identical to DATE and differs in only one way:
    TIMESTAMPs can represent fractional seconds.
    The granularity of a TIMESTAMP value can be as little as a billionth of a second, whereas
    DATE variables can only resolve time to the second.
    When you declare a TIMESTAMP variable, you can optionally specify the precision you wish to use for fractional seconds. The default precision is to the microsecond (six decimal digits); the maximum precision is to the billionth of a second (nine decimal digits).
    ===========================================================================================================
    -----Original Message-----
    From: Cheung, Ka-Kit
    Sent: Friday, March 12, 2004 6:20 PM
    To: Burr, Tim; Julian, Robert; Matthiesen, Sean
    Cc: Tsounis, George; Del Rosso, Peter; Cham, Mei
    Subject: Problem identified : AddressDetail duplicate key problem
    If we look at the exact of the insert statement.
    We see that the last address detail insert have key of
    Address ID = '5a052407-dac6-42ad-bbbf-29edc94488c1', and
    TransactionStartDate = {ts '2004-03-12 17:13:27.792'},
    While in the database, we look like we have an entry of
    Address ID = '5a052407-dac6-42ad-bbbf-29edc94488c1', and
    TransactionStartDate = {ts '2004-03-12 17:13:27.229'},
    If my memory serves me right,
    {ts '2004-03-12 17:13:27.792'} is different from {ts '2004-03-12 17:13:27.229'}
    because these are Java timestamps with sub-second precision, so 229 is different from 792.
    However, when this timestamp is saved to Oracle, I believe (I have to check with Mei) that Oracle only takes
    up to '2004-03-12 17:13:27' and discards the 229 or 792, because that is the maximum precision of the column.
    So the second insert ends up with the same '2004-03-12 17:13:27' after stripping off the 792, matching the record already in the database with '2004-03-12 17:13:27',
    therefore causing the duplicate key exception.
    That is why this happens only once in a while, when 2 rapid-fire inserts occur less than 1 second apart.
    The solution actually is in the ESS code itself.
    The current ESS code sends addDependentToClient multiple times, once for each dependent added
    on the screen.
    The right way is to add all the dependents on the screen at once:
    have a "coarse-grained" method like addDependentsToClient that takes a collection or array of dependents as an input parameter.
    This way we do not cause the participant to create history of themselves multiple times within a very short period. It saves disk space and conforms to a single UOW per submit, which is what I proposed.
    To solve this problem at the root cause, enhance the method to save multiple dependents in one shot rather than looping over multiple calls.
    KK
    and
    INSERT INTO PTTCBSI.ADDRESS_DETAIL
    (LINE_3_AD, ADR_TRAN_UNTIL_DT, MODIFY_DT, CITY_NM, POSTAL_CD, VER_ID, POSTAL_EXT_CD, LINE_2_AD, ADR_TRAN_START_DT, CREATE_DT, AUTHOR_ID, ADDRESS_ID, LINE_1_AD, COUNTY_NM, LINE_4_AD, COUNTRY_ID, STATE_ID)
    VALUES ('Block 5, Apt. 6', {ts '9999-12-31 00:00:00.0'},
    {ts '2004-03-12 17:13:26.385'},
    'Oakwood', '61043', 1, '1234', 'Mailstop 820',
    {ts '2004-03-12 17:13:26.385'},
    {ts '2004-03-12 16:50:12.0'}, 'dataLoad',
    '5a052407-dac6-42ad-bbbf-29edc94488c1',
    'IBM Corp.', NULL, '140 Main Street', 'US', 'NJ')
    UnitOfWork(1238222885)--Connection(2102560837)--
    UPDATE PTTCBSI.ADDRESS_DETAIL
    SET ADR_TRAN_UNTIL_DT = {ts '2004-03-12 17:13:26.385'}, VER_ID = 2 WHERE
    (((ADDRESS_ID = '5a052407-dac6-42ad-bbbf-29edc94488c1') AND
    (ADR_TRAN_START_DT = {ts '2004-03-12 16:52:29.0'})) AND (VER_ID = 1))
    UPDATE PTTCBSI.ADDRESS_DETAIL SET
    ADR_TRAN_UNTIL_DT = {ts '2004-03-12 17:13:27.229'}, VER_ID = 2
    WHERE (((ADDRESS_ID = '5a052407-dac6-42ad-bbbf-29edc94488c1') AND (ADR_TRAN_START_DT = {ts '2004-03-12 17:13:26.0'})) AND (VER_ID = 1))
    UnitOfWork(102762535)--Connection(2102560837)--
    INSERT INTO PTTCBSI.ADDRESS_DETAIL
    (LINE_3_AD, ADR_TRAN_UNTIL_DT, MODIFY_DT, CITY_NM, POSTAL_CD, VER_ID, POSTAL_EXT_CD, LINE_2_AD, ADR_TRAN_START_DT, CREATE_DT, AUTHOR_ID, ADDRESS_ID, LINE_1_AD, COUNTY_NM, LINE_4_AD, COUNTRY_ID, STATE_ID) VALUES
    ('Block 5, Apt. 6', {ts '9999-12-31 00:00:00.0'}, {ts '2004-03-12 17:13:27.229'}, 'Oakwood', '61043', 1, '1234', 'Mailstop 820',
    {ts '2004-03-12 17:13:27.229'},
    {ts '2004-03-12 16:50:12.0'}, 'dataLoad',
    '5a052407-dac6-42ad-bbbf-29edc94488c1',
    'IBM Corp.', NULL, '140 Main Street', 'US', 'NJ')
    INSERT INTO PTTCBSI.ADDRESS_DETAIL
    (LINE_3_AD,
    ADR_TRAN_UNTIL_DT,
    MODIFY_DT,
    CITY_NM, POSTAL_CD, VER_ID, POSTAL_EXT_CD, LINE_2_AD,
    ADR_TRAN_START_DT,
    CREATE_DT,
    AUTHOR_ID,
    ADDRESS_ID,
    LINE_1_AD, COUNTY_NM, LINE_4_AD, COUNTRY_ID, STATE_ID) VALUES
    ('Block 5, Apt. 6', {ts '9999-12-31 00:00:00.0'},
    {ts '2004-03-12 17:13:27.792'},
    'Oakwood', '61043', 1, '1234',
    'Mailstop 820',
    {ts '2004-03-12 17:13:27.792'},
    {ts '2004-03-12 16:50:12.0'},
    'dataLoad',
    '5a052407-dac6-42ad-bbbf-29edc94488c1',
    'IBM Corp.', NULL, '140 Main Street', 'US', 'NJ')
    ClientSession(790235177)--Connection(2102560837)--rollback transaction
    ORA-00001: unique constraint (PTTCBSI.PK_ADDRESS_DETAIL) violated

    KK,
    We are back-porting the support for oracle.sql.TIMESTAMP to 9.0.4 in an upcoming patch set. In the short term, it is possible to enhance TopLink with a custom conversion manager or database platform to add this support if required.
    Doug
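    The fractional-second truncation behind the duplicate-key failure can be demonstrated directly in SQL (a sketch, not from the thread): casting a TIMESTAMP to DATE discards the fractional part, so two distinct timestamps collapse to the same DATE value.

    ```sql
    -- Hypothetical example: two timestamps that differ only in fractional
    -- seconds become equal once reduced to DATE, as a DATE column would do.
    SELECT CASE WHEN CAST(TIMESTAMP '2004-03-12 17:13:27.792' AS DATE)
                   = CAST(TIMESTAMP '2004-03-12 17:13:27.229' AS DATE)
                THEN 'collide' ELSE 'distinct' END AS key_outcome
    FROM   dual;
    -- With the column defined as TIMESTAMP(3) instead, .792 and .229 are
    -- both preserved and the primary key no longer collides.
    ```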

  • Capturing EDT/EST timezone with TIMESTAMP datatype

    Hi guys,
    In a 9i DB, I need to capture date and timestamp with EDT or EST time zone specifiers. For testing purposes, I wrote the following code, with this result:
    SET SERVEROUT ON
    DECLARE
    v_date TIMESTAMP WITH LOCAL TIME ZONE;
    BEGIN
    v_date := TIMESTAMP '2003-06-15 12:16:30 US/Eastern EDT';
    DBMS_OUTPUT.PUT_LINE(v_date);
    END;
    I get the following result:
    15-JUN-03 12.16.30.000000 PM
    Actually I need 15-JUN-03 12.16.30.000000 PM EDT or EST, depending on daylight saving time on the East coast. I also changed my session with ALTER SESSION SET TIME_ZONE='US/Eastern', but I still do not get the expected output. I am putting this to the forum in the hope that someone can help me out with the new TIMESTAMP datatype in this case.
    Any help will highly be appreciated.
    Thanks
    Zahir

    We need to set NLS_TIMESTAMP_TZ_FORMAT and give an appropriate format when printing the timestamp:
    SQL> alter session set nls_timestamp_tz_format = 'DD-MON-YYYY HH24:MI:SS TZD' ;
    Session altered.
    SQL> DECLARE
      2     v_date TIMESTAMP WITH LOCAL TIME ZONE;
      3  BEGIN
      4     v_date := TIMESTAMP '2003-&1-15 12:16:30 US/Eastern';
      5     DBMS_OUTPUT.PUT_LINE(TO_CHAR(v_date, 'DD-MON-YYYY HH:MI:SS TZD'));
      6  END;
      7  /
    Enter value for 1: 01
    old   4:        v_date := TIMESTAMP '2003-&1-15 12:16:30 US/Eastern';
    new   4:        v_date := TIMESTAMP '2003-01-15 12:16:30 US/Eastern';
    15-JAN-2003 01:16:30
    PL/SQL procedure successfully completed.
    SQL> /
    Enter value for 1: 06
    old   4:        v_date := TIMESTAMP '2003-&1-15 12:16:30 US/Eastern';
    new   4:        v_date := TIMESTAMP '2003-06-15 12:16:30 US/Eastern';
    15-JUN-2003 12:16:30
    PL/SQL procedure successfully completed.
    SQL>
    When I enter a date in January, it prints the time as 01:16:30 (even though the time specified in the value was 12:16:30).
    And next, when I enter a date in June, it prints the time as 12:16:30 (what we entered). The hour shifts because US/Eastern is UTC-5 (EST) in January but UTC-4 (EDT) in June, so converting to the fixed session time zone moves the displayed hour by one.

  • Timestamp Datatype invalid in Oracle9i

    I tried to create a column with the TIMESTAMP datatype in 9i, but it returns "invalid datatype".
    Can anyone tell me why this happens?
    I used the following statement:
    create table test (modified timestamp);

    No, TIMESTAMP is not supported in Forms. However, a fix is forthcoming to address problems with the Builder's wizards and their improper handling of unsupported datatypes such as this one.
    Ref.:
    Bug 2567564
    Bug 8836073

  • Conversion from DATE to TIMESTAMP datatype

    Hello,
    My issue is as follows:
    1. I have one variable of type DATE, which I assign the value of SYSDATE:
       mydatevar DATE := SYSDATE;
    2. I want to find *"today"*, truncated to DAY:
       TRUNC (mydatevar, 'DD')
    TRUNC returns the DATE datatype, so in point 2 I will receive, for example, *'2010-01-13 00:00:00'*.
    3. I want to assign the value from point 2 to a variable of type TIMESTAMP:
       mytimestampvar TIMESTAMP := mydatevar;
    which implicitly converts the DATE variable to TIMESTAMP.
    Problem: During the conversion (both implicit and explicit conversion with a format mask) I lose the "00" hours and "00" minutes and receive something like this: "10-JAN-13 *12*.00.00.000000000 AM".
    Question: How can I convert from DATE to TIMESTAMP keeping hours and minutes zeros?
    Why I need this conversion: I have a table with a column "column1" TIMESTAMP(0) and I would like to take only those rows from the table, where "column1" is in range from today 12 o'clock in the morning till now (whatever hour it is).
    NLS characteristics of the database:
    PARAMETER                           VALUE
    NLS_LANGUAGE                           AMERICAN
    NLS_TERRITORY                   AMERICA
    NLS_CURRENCY     $
    NLS_ISO_CURRENCY                    AMERICA
    NLS_NUMERIC_CHARACTERS     .,
    NLS_CHARACTERSET                    AL32UTF8
    NLS_CALENDAR                            GREGORIAN
    NLS_DATE_FORMAT                    DD-MON-RR
    NLS_DATE_LANGUAGE            AMERICAN
    NLS_SORT     BINARY
    NLS_TIME_FORMAT                     HH.MI.SSXFF AM
    NLS_TIMESTAMP_FORMAT             DD-MON-RR HH.MI.SSXFF AM
    NLS_TIME_TZ_FORMAT             HH.MI.SSXFF AM TZR
    NLS_TIMESTAMP_TZ_FORMAT     DD-MON-RR HH.MI.SSXFF AM TZR
    NLS_DUAL_CURRENCY              $
    NLS_COMP                             BINARY
    NLS_LENGTH_SEMANTICS               BYTE
    NLS_NCHAR_CONV_EXCP             FALSE
    NLS_NCHAR_CHARACTERSET     AL16UTF16
    NLS_RDBMS_VERSION             10.2.0.4.0
    Session parameters are the same.
    DBTIMEZONE is "+02:00".

    Verdi wrote:
    Problem: During the conversion (both implicit and explicit conversion with a format mask) I lose the "00" hours and "00" minutes and receive something like this: "10-JAN-13 *12*.00.00.000000000 AM".
    I don't think you are necessarily losing any information whatsoever. It is probably more a function of your NLS_TIMESTAMP_FORMAT and NLS_DATE_FORMAT. For example, your NLS_DATE_FORMAT could be set up by default with HH24 (24-hour time), which would report midnight as "00" hours; however, it looks like your NLS_TIMESTAMP_FORMAT is set up with an "HH" format plus a meridian indicator, which means 12-hour time, so midnight prints as "12 AM".
    Your comparisons should be using date/timestamp data types anyways so as long as the input value is converted properly into a date type this shouldn't matter anyways.
    You can see what is actually stored by using the DUMP function:
    SQL> SELECT  DUMP(TO_TIMESTAMP(TO_CHAR(TRUNC(SYSDATE,'DD'),'MM/DD/YYYY HH:MI:SS AM'))) AS TSTAMP
      2  ,       DUMP(TRUNC(SYSDATE,'DD')) AS DT
      3  FROM DUAL
      4  /
    TSTAMP                                                                      DT
    Typ=187 Len=20: 218,7,1,13,0,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0                  Typ=13 Len=8: 218,7,1,13,0,0,0,0
    As you can see, TSTAMP and DT store nearly the same values (218,7,1,13), but TSTAMP has more bytes because of fractional-second precision.
    HTH!
    Edited by: Centinul on Jan 13, 2010 7:23 AM
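    For the stated goal, rows where a TIMESTAMP(0) column falls between midnight today and now, the comparison can stay entirely in date/timestamp types (a sketch, not from the thread; table and column names are placeholders):

    ```sql
    -- Hypothetical example: rows from midnight today up to the current time.
    -- A DATE compares directly against a TIMESTAMP column via implicit
    -- conversion, so no display format mask is involved at all.
    SELECT *
    FROM   mytable
    WHERE  column1 >= TRUNC(SYSDATE)
    AND    column1 <= SYSDATE;
    ```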

  • SSRS Matrix Issue. Expression calculating in row with no data.

    I have a matrix with multiple column groups for different sales channels. Within each group there are several columns, some with expressions for calculations. The rows are based on sales reps. Not every rep sells in every channel, so there are some columns in which a rep will have no data; I want those to be blank or show a 0, and I can do both. The issue is that for reps with no data, the calculated expression still shows a value: it seems to populate the cell with the first result of an actual calculation. See below. Reps D and E have no calls or sales in this particular channel, yet the closing ratio shows 30.23% (the closing ratio for rep A). I need this to be blank, 0, or N/A, anything other than a wrong %. I have tried numerous IIf and IsNothing statements, e.g. =iif(Fields!count_of_customers.Value = 0,0,Fields!count_of_customers.Value/iif(Fields!Calls.Value = 0,1,Fields!Calls.Value)), plus others I found on the internet, but none of them work. Can someone please help? Sorry there is no picture of the report; my account needs to be confirmed before I can attach pictures. But this is the setup of the report.
    Figure A
    Phone Field
    Rep   Calls   Sales   Closing Ratio   Premium     Rep   Calls   Sales
    A     1000    323     32.3%           $100,250    A     50      5
    B     200     10      5%              $50,000     B     0       0
    C     300     15      5%              $25,000     C     25      5
    D     0       0       32.3%           $0          D     300     50
    E     0       0       32.3%           $0          E     100     15
    F     500     100     20%             $300,000    F     0       0

    Hi RobTencents,
    After testing the issue in my environment, I can reproduce it. To fix it, please try using the expression below instead of the original one:
    =iif(sum(Fields!count_of_customers.Value) = 0,0,sum(Fields!count_of_customers.Value)/iif(sum(Fields!Calls.Value) = 0,1,sum(Fields!Calls.Value)))
    If there are any other questions, please feel free to ask.
    Thanks,
    Katherine Xiong
    TechNet Community Support

  • SQL Developer 3.2 - Exporting data with TIMESTAMP datatype

    Hi,
    We have users attempting to export data with the TIMESTAMP format to Excel (xls format) using SQL Developer 3.2. When they try to sort the timestamp column in either ascending or descending order, Excel does not sort it correctly. I suggested that the users do all the sorting within their SQL Developer session, but they require the ability to slice and dice in Excel.
    This is definitely not an issue with Excel, as the users have previously exported timestamp data from Toad and been able to sort without issue. Any thoughts on what might resolve this?
    Thanks.

    We're not formatting Oracle timestamps as numbers/dates in the Excel export, so Excel treats them as simple strings. The users will need to format the Excel column/cells appropriately to get them to sort the way they want.

  • IR group by function and timestamp datatype

    Are there any limitations on the group-by function, e.g. based on the datatype?
    My problem\misunderstanding is with the IR:
    USING "GROUP BY" -
    I have a TIMESTAMP column (column name TM_TIME) and I can choose it in the group-by clause, but not in the function section. Is there any reason why, and could I get it to work?
    regards
    Thorsten
    Edited by: Fischert on 02.05.2012 03:05

    It really depends on the granularity you need from the timestamp to make it meaningful. The key word is aggregation, so you should ask: what level of aggregation do I need?
    You can reduce the granularity from fractional seconds to some lower level, like seconds, minutes, hours or even days (the equivalent of trunc(date_column)), using CAST, TRUNC, or to_date(to_char(...)) with an appropriate format mask.
    It just depends on the application and data.
    E.g. for the LHC at CERN chasing the Higgs boson, a timestamp is just not fine-grained enough (you need 10 exp -23 or lower, I guess!).
    E.g. if you look at seismic data for oil exploration, a great deal happens in the 5th and 6th decimal places of the timestamp.
    But if you are looking at data logged by a normal SCADA system, then maybe a second is detailed enough for the purpose.
    In a normal business application we are better off "rounding" the timestamp to some meaningful level for aggregation/reporting.
    Regards,
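    As a concrete illustration of the advice above (a sketch, not from the thread; the table name is a placeholder, while TM_TIME follows the question), a TIMESTAMP can be coarsened before grouping:

    ```sql
    -- Hypothetical example: aggregate a TIMESTAMP column per hour.
    -- CAST(... AS DATE) drops the fractional seconds; TRUNC(date, 'HH24')
    -- then truncates to the start of the hour.
    SELECT TRUNC(CAST(tm_time AS DATE), 'HH24') AS tm_hour,
           COUNT(*)                             AS cnt
    FROM   event_log
    GROUP  BY TRUNC(CAST(tm_time AS DATE), 'HH24')
    ORDER  BY tm_hour;
    ```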

  • Time datatype issue from SSIS

    Hi All,
    I have a SSIS pkg, I have
    1. Sart_time (datetime datatype)
    2. End_time (datetime datatype)
    3. Run_time ( Time(0) datatype)
    I am using Run_time to store the difference between the start and end times, to calculate how long the package takes to execute.
    When I run the query below in Management Studio I get the desired result; however, when it is run through an SSIS Execute SQL Task to update the table, it sets the value to NULL. Since I am using the same code in SSIS, I expected it to give the same result.
    select CAST(END_TIME - START_TIME AS TIME(0)) from Log_table
    00:00:56
    Regards
    Neil

    Hello,
    Just as others posted above, please try the DATEDIFF() function rather than the minus operator. Please check the following statement:
    SELECT CONVERT(time(0), DATEADD(ss, DATEDIFF(ss,Sart_time, End_time), 0))
    Regards,
    Fanny Liu
    If you have any feedback on our support, please click here.
    TechNet Community Support

  • SSIS excel connection datatype issue

    Hi,
    Please find the data below:
    ColA            ColB
    74378             11213
    312             21312
    34                 234
    7329             27924
    23423             224353
    42342             13243
    54245             32321
    25423             6483
    5234             84379
    dmf             934293
    52                 9549
    49                 439879
    ads             asdf
    asd             aesdf
    asd             asdf
    asd             asdf
    asdf             asdf
    sadf             asdf
    asdf             sadf
    I am using an Excel connection to load data into SQL Server. When I try this, the character data loads as NULLs. I have tried adding IMEX=1 to the connection string and removing it; in both cases the data viewer I placed after the Excel source shows NULLs. I have also tried OPENROWSET, which for some reason does not create an ad hoc connection, and changing the datatypes in the advanced editor, which throws an error. I have tried every fix I could find, with no luck so far; could you please help resolve this? For some reason the SSIS Excel connection reads only the first 8 rows, decides on a datatype, and the rest show up as NULL.

    The problem with your data is that Excel is attempting to guess the data type of the columns.  By default, the TypeGuessRows property is set to 8, which means it will sample the first eight rows.  In your data, the first eight rows are all numbers,
    so Excel interprets the entire column as numeric.  When it encounters text, it decides it can't convert text to numbers, so it provides SSIS with a NULL.
    IMEX has no effect in the above scenario (with the default TypeGuessRows value) because IMEX applies when Excel finds "mixed types" in the columns - and in your case (with eight rows) it does not.
    TypeGuessRows can be modified (as Reza suggested) but only for values from 0 to 16.  In your specific case, setting it to 16 would have an effect.  Excel would read your file, see some numeric AND text values, and decide those columns have "mixed
    types".  NOW the IMEX attribute comes into play.  If you DON'T specify IMEX=1, then Excel will choose a data type based on majority wins.  In your specific case, it would still choose numeric for column A and column B, because most of the values
    in the first 16 rows are numeric.  If you specify IMEX=1, then Excel behaves differently.  Instead of "majority wins", if multiple data types are detected, it ALWAYS picks text.
    So - in your specific case, with this specific file, increasing TypeGuessRows to 13 or more, AND using IMEX=1 will allow you to read both columns as text, and not receive NULLs.
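    For reference (a sketch, not from the thread), the IMEX flag discussed above goes in the Extended Properties of the OLE DB connection string, while TypeGuessRows is a registry setting read by the Excel driver; the file path and Office version below are placeholders:

    ```
    ; Hypothetical example: Excel connection string with import mode forced on.
    ; IMEX=1 makes the driver treat mixed-type columns as text.
    Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\input.xlsx;
    Extended Properties="Excel 12.0 Xml;HDR=YES;IMEX=1"

    ; TypeGuessRows (rows sampled, 0-16) lives in the registry, e.g.:
    ; HKLM\SOFTWARE\Microsoft\Office\14.0\Access Connectivity Engine\Engines\Excel\TypeGuessRows
    ```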
