Conversion of a timestamp

Hello,
I need to convert a UTC timestamp in the long form to its local version (CET), but if I use the CONVERT statement I lose the decimal part (it becomes 0000000).
Any idea?
Thanks in advance.
Angelo

Hi,
check this code:
REPORT  ZTEST_time.
DATA: time_stamp TYPE timestamp,
      dat TYPE d,
      tim TYPE t,
      tz   TYPE ttzz-tzone,
      dst(1) TYPE c.
tz = 'CET'.
time_stamp = 20030309033000.
CONVERT TIME STAMP time_stamp TIME ZONE tz
        INTO DATE dat TIME tim DAYLIGHT SAVING TIME dst.
WRITE: /(10) dat, (8) tim, dst.
time_stamp = 20030309043000.
CONVERT TIME STAMP time_stamp TIME ZONE tz
        INTO DATE dat TIME tim DAYLIGHT SAVING TIME dst.
WRITE: /(10) dat, (8) tim, dst.
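The reply above sidesteps the fractional part entirely: the ABAP TIME type (t) carries no fractional seconds, so converting into it drops them. The pitfall is the intermediate type, not the time-zone conversion itself. For comparison, a minimal Java sketch (class and method names are mine) where the conversion is done on a type that carries fractions, so they survive:

```java
import java.time.Instant;
import java.time.ZoneId;
import java.time.ZonedDateTime;

public class ZonePreservingConversion {
    // Convert a UTC instant to CET wall-clock time without losing the
    // fractional-second part: the Instant carries nanoseconds, and
    // atZone() only changes the offset, never the precision.
    static ZonedDateTime toCet(Instant utc) {
        return utc.atZone(ZoneId.of("CET"));
    }

    public static void main(String[] args) {
        Instant utc = Instant.parse("2003-03-09T03:30:00.1234567Z");
        ZonedDateTime cet = toCet(utc);
        // The nanosecond field survives the time-zone conversion.
        System.out.println(cet.getNano());   // 123456700
    }
}
```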

Similar Messages

  • Timestamp import format difference

    Oracle 9i accepts the following sql insert format for timestamp
    {ts '2005-01-24 15:04:00' }
    Oracle 10g does not like {ts but accepts the format below
    timestamp '2005-01-24 16:08:00',
    can you confirm what is acceptable to 9i and 10g? (trying to support a product
    to run on both)

    Hello:
    The statement you propose won't work in 9i, nor on other releases:
    SQL*Plus: Release 9.2.0.6.0 - Production on Fri Dec 9 09:53:57 2005
    Copyright (c) 1982, 2002, Oracle Corporation. All rights reserved.
    Enter password:
    Connected to:
    Oracle9i Enterprise Edition Release 9.2.0.6.0 - 64bit Production
    With the Partitioning and Oracle Data Mining options
    JServer Release 9.2.0.6.0 - Production
    9iR2>create table t (ts timestamp);
    Table created.
    9iR2>insert into t values ('2005-01-24 15:04:00');
    insert into t values ('2005-01-24 15:04:00')
    ERROR at line 1:
    ORA-01843: not a valid month
    If it is accepted, it is because of implicit conversion and the default timestamp format in your particular client setup. Your 9i and 10g installations may have different setups, and this explains the issue.
    The only way to support this correctly is to convert explicitly:
    9iR2>insert into t values (to_timestamp('2005-01-24 15:04:00','yyyy-mm-dd hh24:mi:ss'));
    1 row created.
    9iR2>
    This will work correctly in every release that supports the timestamp format.
    The same applies to other conversions, such as numbers and dates.
    HTH,
    Andrea
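Andrea's advice (never rely on implicit conversion and default format masks) applies just as well on the client side. A hedged java.time sketch of the same principle; the class and method names are mine:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class ExplicitTimestampFormat {
    // Parse a timestamp literal with an explicit format mask, the
    // java.time analogue of TO_TIMESTAMP(..., 'yyyy-mm-dd hh24:mi:ss').
    // Relying on a default/implicit format is what breaks between setups.
    static LocalDateTime parse(String literal) {
        DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss");
        return LocalDateTime.parse(literal, fmt);
    }

    public static void main(String[] args) {
        System.out.println(parse("2005-01-24 15:04:00"));
    }
}
```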

  • Time stamp not recognized by Excel

    I am working on a VI to collect data from 16 load cells. I am using FieldPoint. The VI saves this data to Excel and puts a timestamp along with the data. The problem is this: the number in the timestamp cannot be converted by Excel into a usable time. I have placed a probe on both sides of the conversion from the timestamp and verified the problem. There is probably another way to skin this cat.
    gasseous
    One test is worth a thousand expert opinions
    Attachments:
    FE1410 DRH 4.vi ‏35 KB

    I went in and subtracted 5 hours x 60 minutes x 60 seconds from the 126316800 constant. This made the time correct. Surely NI has come up with a better method than this? I need to take this system to all corners of the globe and hate to have to adjust the time to match.
    I also had to delete the Date/Time To Seconds VI and associated hookup in order to get the time to work. I replaced all of this with the Get Date/Time VI and it seems to work fine. Excel can now convert the output to a usable number.
    Thank you for all the help!
    gasseous
    Attachments:
    FE1410 DRH 5.vi ‏36 KB
    FE1410 DRH 5.xlsx ‏14 KB
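For what it's worth, the 126316800 constant appears to be exactly 1462 days expressed in seconds: 1904-01-01 (the LabVIEW epoch) is serial day 1462 in Excel's 1900 date system (which includes a phantom 1900-02-29). The hard-coded 5-hour subtraction is just the poster's local UTC offset, which is why it breaks when the system travels. A sketch, assuming the VI produces seconds since the LabVIEW epoch; class and method names are mine:

```java
public class LabviewToExcel {
    // LabVIEW timestamps count seconds from 1904-01-01 00:00 UTC.
    // In Excel's 1900 date system, 1904-01-01 is serial day 1462
    // (the system includes Excel's phantom 1900-02-29).
    // Hence the 126,316,800 constant in the VI: 1462 days * 86,400 s/day.
    static final long EPOCH_GAP_SECONDS = 1462L * 86_400L;  // 126316800

    // Convert LabVIEW seconds to an Excel serial date. The UTC offset
    // (e.g. -5 h for EST) should come from the local time zone, not a
    // hard-coded subtraction, so the VI keeps working when it travels.
    static double toExcelSerial(double labviewSeconds, int utcOffsetSeconds) {
        return (labviewSeconds + utcOffsetSeconds + EPOCH_GAP_SECONDS) / 86_400.0;
    }

    public static void main(String[] args) {
        // 1904-01-01 00:00 UTC is Excel serial day 1462.
        System.out.println(toExcelSerial(0.0, 0));  // 1462.0
    }
}
```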

  • Data-Export/Import

    Hi,
    I have exported my data via SQL Developer. The file has been saved as a LDR-File. In the file I can find:
    LOAD DATA
    INFILE *
    TRUNCATE
    INTO TABLE "CONTRL"
    FIELDS TERMINATED BY '|'
    OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS (
    DATEINAME ,
    DATEITYP ,
    PFAD ,
    TIMESTMP timestamp "DD.MM.RR" )
    begindata
    20091001_043629_0000786715|CONTRL|c:\TestCON715.txt|01.10.09 04:34:00,000000000|
    Now I am trying to import this file using SQL*Loader, but I cannot, even on the same computer.
    In the log file I find ORA-01830, even if I try some conversions like:
    TIMESTMP timestamp "MM-DD-YY hh24:mi:ss"
    TIMESTMP to_date(:TIMESTMP "MM-DD-YY hh24:mi:ss")
    Either way, I keep getting the same ORA-01830!
    How can I solve this? Why does the export function save a file that is not importable on the same computer?
    Thanks
    Kef

    Look at this test:
    SQL> create table test (t timestamp);
    Table created.
    SQL> insert into test values (
      2  to_timestamp('01.10.09 04:34:00,000000000','mm.dd.rr'));
    to_timestamp('01.10.09 04:34:00,000000000','mm.dd.rr'))
    ERROR at line 2:
    ORA-01830: date format picture ends before converting entire input string
    SQL> insert into test values (
      2  to_timestamp('01.10.09 04:34:00,000000000','mm.dd.rr hh24:mi:ss'));
    to_timestamp('01.10.09 04:34:00,000000000','mm.dd.rr hh24:mi:ss'))
    ERROR at line 2:
    ORA-01830: date format picture ends before converting entire input string
    SQL> insert into test values (
      2  to_timestamp('01.10.09 04:34:00,000000000','mm.dd.rr hh24:mi:ss,ff9'));
    1 row created.
    So I think you must change
    TIMESTMP timestamp "DD.MM.RR" )
    to
    TIMESTMP timestamp "mm.dd.rr hh24:mi:ss,ff9" )
    Max
    [My Italian Oracle blog|http://oracleitalia.wordpress.com/2010/01/31/le-direttive-di-compilazione-pragma/]
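Max's point generalizes: the format mask must describe the entire input string, comma and all nine fractional digits included, or parsing stops early (which is exactly what ORA-01830 reports). A hedged java.time equivalent of his fixed mask; the class name is mine:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class LoaderTimestampFormat {
    // The java.time analogue of the fixed SQL*Loader mask
    // "mm.dd.rr hh24:mi:ss,ff9": the pattern covers the whole field,
    // including the comma and all nine fractional digits.
    static LocalDateTime parse(String field) {
        DateTimeFormatter fmt =
            DateTimeFormatter.ofPattern("MM.dd.yy HH:mm:ss,SSSSSSSSS");
        return LocalDateTime.parse(field, fmt);
    }

    public static void main(String[] args) {
        System.out.println(parse("01.10.09 04:34:00,000000000"));
    }
}
```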

  • How do I make Skype 6.20 interface more compact li...

    I just made a big mistake and upgraded to Skype 6.20 on Win 7. The display of contacts is so widely spaced that I can no longer manage the availability of my work team by glancing at a small Compact view window that displays all of my team (favorites) in a smallish display without needing to scroll. Even with profile pics off, the display shows less than half the number of contacts in the same real estate due to bigger line spacing and font, both of which do not seem adjustable.
    How do I reduce the display font size and Contacts display line spacing, OR revert to my old version of Skype? As is, this is not usable for me and my company and, as much as I hate to say it, makes Google Hangouts a viable alternative.
    Solved!
    Go to Solution.

    If you are using the latest versions of Skype, like 7.0, these are the settings I've observed people are most interested in to make Skype function in a more classic mode. In older releases some of these options will have different names or not exist at all.
    Enable Compact Chat View
    "Tools" - "Options" - "IM settings" - Check "Compact Chat View"
    Enable Compact Sidebar View
    "View" - select "Compact Sidebar View". When it's enabled there will be a check next to it, when it's not, there won't be.
    Disable large emoticons
    "Tools" - "Options" - "IM appearance" - Uncheck "Show large emoticons"
    Disable automatic file downloading
    "Tools" - "Options" - "IM settings" - Uncheck "Automatically accept incoming files" or set it to "choose a folder to save the file in every time"
    Disable showing images in conversations
    "Tools" - "Options" - "IM settings" - Uncheck "Show images in conversations"
    Enable showing timestamps next to messages
    "Tools" - "Options" - "IM appearance" - Check "Show timestamp next to instant messages"
    Toggle multiple chat windows or a single chat window
    "View" - "Split Window View" for multiple chat windows
    "View" - "Single Window View" for a single chat window
    Displaying Skype credit balance (no longer displayed in the main window)
    Your current credit balance can be displayed by viewing your profile in one of the following ways:
    1.  Clicking your display name
    2.  Pressing "Ctrl-I"
    3.  Selecting "Skype" - "Profile" - "Edit your profile"
    4.  Logging into your account on the website ( http://www.skype.com/go/myaccount )
    The Skype website generally does not provide/advertise links to older releases.  Most dynamic links will either point to the latest release or the one before the latest.
    Below are direct downloads/static links which originate from the Skype website: 
    Currently the latest version using the new GUI (these may retire if a newer maintenance release comes out):
    Skype 7.0 (exe version)
    Skype 7.0 (msi version)
    The last version using the older style GUI:
    Skype 6.21 (exe version)
    Skype 6.21 (msi version)

  • Current Tag Query

    Newbie Tag query questions, two questions in one here -
    1) We are able to pull multiple tags within a history query, but when we try a Current query it only pulls back one tag. Why doesn't it return the other tags within the Current?
    2) We have been seeing an odd error message occasionally: "SQLGetData: Conversion Character to Timestamp." We thought it was the DateTime config, but that is set up correctly.
    Any thoughts?

    Brian - glad to see you made it back from AT&T park in one piece...
    1)  Is this just any pair of tags, or just certain combinations?
    2)  Does your ODBC System DSN have any of the advanced settings checked besides Read-Only?  (specifically the Timestamp sent as Character one).
    1/2)  Are any of your tag meta queries failing when it sends the initial request for description and scaling ranges, etc?
    Regards,
    Jeremy

  • Query is slow to return result...

    The following query is slow. I have an index on tzk, and the event_dtg column is a DATE column that allows null values.
    abc_zone has > 60 million records. Statistics on the table and index are current.
    Any idea how to improve the query performance?
    select count (*) tz6
    from abc_zone
    where tzk =6
    and event_dtg > to_date('09/05/2009 01:00:00' , 'MM/DD/YYYY HH24:MI:SS')
    and event_dtg < to_date('04/04/2010 00:00:00' , 'MM/DD/YYYY HH24:MI:SS')
    Oracle 10.2.0.3 on AIX
    Thanks in advance.

    Sorry, I do have an index on event_dtg...
    Here is the explain plan:
    | Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
    | 0 | SELECT STATEMENT | | 1 | 19 | 148 (0)| 00:00:01 |
    | 1 | SORT AGGREGATE | | 1 | 19 | | |
    | 2 | TABLE ACCESS BY INDEX ROWID| ABC_ZONE | 16 | 304 | 148 (0)| 00:00:01 |
    | 3 | INDEX RANGE SCAN | ABC_ZONE_EVENT_DTG | 3439 | | 1 (0)| 00:00:01 |
    Query Block Name / Object Alias (identified by operation id):
    1 - SEL$1
    2 - SEL$1 / ABC_ZONE@SEL$1
    3 - SEL$1 / ABC_ZONE@SEL$1
    17 rows selected.
    I suspect there is some kind of conversion (date to timestamp) that is costly.
    Thanks.

  • WSN and RT time stamps

    Hi WSN engineers again,
    I'm using a cRIO-9014 and my WSNs. They'll be working stand-alone. Is there a way to synchronise their times? Can the cRIO act as a time server for the WSN? I understand how to configure the host PC to act as the time server, but I'm not sure about the RT targets.
    Thanks,
    Jevon

    Hi,
    I tried the same thing: cRIO-9073 should be time server for WSN-9791
    cRIO-9073 has IP 192.168.1.2
    WSN-9791 has IP 192.168.1.3 (time server in MAX is set tot 192.168.1.2)
    I wrote a VI which listens for NTP clients and sends an NTP packet back if the requesting IP is in the list of allowed clients. I can force the VI to run in a loop until the client, in my case the WSN-9791, replies. Then I send an NTP packet to the WSN and quit the VI.
    I had some serious problems creating the real UTC timestamp, but I solved it using Wireshark. The conversion between the NTP timestamp and the LabVIEW timestamp was the big challenge. Wireshark shows the transmitted UTC timestamp.
    My problem: the WSN-9791 does not always accept the NTP timestamp; sometimes it does, sometimes it does not. Is there any solution to make this work 100% of the time? Please keep in mind that my NTP server VI may have some errors or may not follow the NTP standard exactly, and can cause conflicts with the WSN-9791.
    To use the VI, just open UDP port 123 (no other settings), then connect the VI, and close UDP after the VI again. A timeout can also be set, if you want. From my experience the WSN-9791 sends a request to its time server roughly every minute. (I used a host PC, Meinberg NTP Server and Wireshark to check that.)
    If anyone can help, I would really appreciate it, because until now my WSN-9791 is off by one hour. If the cRIO runs on UTC+1, the WSN runs on UTC+2 !?!??
    Bye
    Attachments:
    cRIO NTP Server for WSN.jpg ‏8 KB
    Time_NTP_Server.vi ‏18 KB
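For anyone reimplementing the NTP-to-LabVIEW conversion the poster calls "the big challenge", the core of it is a 70-year epoch offset plus a 32-bit binary fraction; a hedged Java sketch (names are mine). It won't fix the WSN's acceptance problem, but note that NTP carries UTC, so a constant one-hour error usually points at a local-time value being sent where UTC is expected:

```java
public class NtpTimestamp {
    // NTP timestamps count seconds from 1900-01-01 UTC; Unix time counts
    // from 1970-01-01 UTC. The two epochs are 2,208,988,800 s apart.
    static final long NTP_UNIX_OFFSET = 2_208_988_800L;

    // Convert a 64-bit NTP timestamp (32-bit seconds + 32-bit binary
    // fraction of a second) to Unix milliseconds.
    static long ntpToUnixMillis(long ntpSeconds, long ntpFraction) {
        long millis = (ntpSeconds - NTP_UNIX_OFFSET) * 1000L;
        millis += (ntpFraction * 1000L) >>> 32;   // binary fraction -> ms
        return millis;
    }

    public static void main(String[] args) {
        // NTP second 2,208,988,800 with fraction 0x80000000 (= .5 s)
        // is the Unix epoch plus half a second.
        System.out.println(ntpToUnixMillis(NTP_UNIX_OFFSET, 0x80000000L));  // 500
    }
}
```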

  • Recoding for J2ME - Replacing BigInteger

    Hi,
    I'm working on implementing the TOTP class from http://www.ietf.org/id/draft-mraihi-totp-timebased-03.txt
    However, it uses BigInteger to run some of the type conversions for the timestamp and key.
    I've tried all of yesterday, last night and today to get something that functions, but I never seem to get the right/working result....
    Specifically, the line is:
    bArray = new BigInteger(time,16).toByteArray()
    and I need that same value, but obtained in a J2ME-compatible way. Any suggestions?
    time is a string, a base-16 representation of the timestamp.
    Thanks,
    PC_Nerd
    * This works when not compiling for J2ME, just on a standard J2SE (or EE?) compile. * I haven't worked in Java for over a year.

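The thread never received an answer, but the call can be reproduced without BigInteger. Since the TOTP draft zero-pads the hex counter to 16 characters, a plain hex decode of the padded string yields the 8 bytes the HMAC step needs; a sketch under that assumption (note that BigInteger.toByteArray additionally strips leading zero bytes and may prepend a sign byte, so the outputs differ for unpadded input):

```java
public class HexToBytes {
    // J2ME-compatible replacement for new BigInteger(time, 16).toByteArray(),
    // assuming the TOTP draft's zero-padded 16-character hex counter.
    // Uses only Character.digit, which is available in CLDC.
    static byte[] hexToBytes(String hex) {
        if (hex.length() % 2 != 0) {
            hex = "0" + hex;              // ensure whole bytes
        }
        byte[] out = new byte[hex.length() / 2];
        for (int i = 0; i < out.length; i++) {
            int hi = Character.digit(hex.charAt(2 * i), 16);
            int lo = Character.digit(hex.charAt(2 * i + 1), 16);
            out[i] = (byte) ((hi << 4) | lo);
        }
        return out;
    }

    public static void main(String[] args) {
        byte[] b = hexToBytes("00000000023C34A2");  // hypothetical counter value
        System.out.println(b.length);  // 8
    }
}
```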

  • Conversion from DATE to TIMESTAMP datatype

    Hello,
    My issue is as follows:
    1. I have one variable of type DATE, which I assign the value of SYSDATE:
       mydatevar DATE := SYSDATE;
    2. I want to find *"today"*, truncated to DAY:
       TRUNC (mydatevar, 'DD')
    The TRUNC function returns the DATE datatype, so in point 2 I will receive, for example, *'2010-01-13 00:00:00'*.
    3. I want to assign the value from point 2 to a variable of type TIMESTAMP:
       mytimestampvar TIMESTAMP := mydatevar;
    which will implicitly convert the DATE variable to TIMESTAMP.
    Problem: During the conversion (both implicit and explicit conversion with a format mask) I lose the "00" hours and "00" minutes and receive something like this: "10-JAN-13 *12*.00.00.000000000 AM".
    Question: How can I convert from DATE to TIMESTAMP keeping hours and minutes zeros?
    Why I need this conversion: I have a table with a column "column1" TIMESTAMP(0) and I would like to take only those rows from the table, where "column1" is in range from today 12 o'clock in the morning till now (whatever hour it is).
    NLS characteristics of the database:
    PARAMETER                           VALUE
    NLS_LANGUAGE                           AMERICAN
    NLS_TERRITORY                   AMERICA
    NLS_CURRENCY     $
    NLS_ISO_CURRENCY                    AMERICA
    NLS_NUMERIC_CHARACTERS     .,
    NLS_CHARACTERSET                    AL32UTF8
    NLS_CALENDAR                            GREGORIAN
    NLS_DATE_FORMAT                    DD-MON-RR
    NLS_DATE_LANGUAGE            AMERICAN
    NLS_SORT     BINARY
    NLS_TIME_FORMAT                     HH.MI.SSXFF AM
    NLS_TIMESTAMP_FORMAT             DD-MON-RR HH.MI.SSXFF AM
    NLS_TIME_TZ_FORMAT             HH.MI.SSXFF AM TZR
    NLS_TIMESTAMP_TZ_FORMAT     DD-MON-RR HH.MI.SSXFF AM TZR
    NLS_DUAL_CURRENCY              $
    NLS_COMP                             BINARY
    NLS_LENGTH_SEMANTICS               BYTE
    NLS_NCHAR_CONV_EXCP             FALSE
    NLS_NCHAR_CHARACTERSET     AL16UTF16
    NLS_RDBMS_VERSION             10.2.0.4.0
    Session parameters are the same.
    DBTIMEZONE is "+02:00".

    Verdi wrote:
    Problem: During the conversion (both implicit and explicit conversion with a format mask) I lose the "00" hours and "00" minutes and receive something like this: "10-JAN-13 *12*.00.00.000000000 AM".
    I don't think you are necessarily losing any information whatsoever. It's probably a function of your NLS_TIMESTAMP_FORMAT and NLS_DATE_FORMAT. For example, your NLS_DATE_FORMAT could be set up by default with HH24 (24-hour time), which would report midnight as "00" hours. However, it looks like your NLS_TIMESTAMP_FORMAT is set up with an "HH" format and a meridian indicator, which means 12-hour time.
    Your comparisons should be using date/timestamp data types anyway, so as long as the input value is converted properly into a date type this shouldn't matter.
    You can see what is actually stored by using the DUMP function:
    SQL> SELECT  DUMP(TO_TIMESTAMP(TO_CHAR(TRUNC(SYSDATE,'DD'),'MM/DD/YYYY HH:MI:SS AM'))) AS TSTAMP
      2  ,       DUMP(TRUNC(SYSDATE,'DD')) AS DT
      3  FROM DUAL
      4  /
    TSTAMP                                                                      DT
    Typ=187 Len=20: 218,7,1,13,0,0,0,0,0,0,0,0,0,0,3,0,0,0,0,0
    Typ=13 Len=8: 218,7,1,13,0,0,0,0
    As you can see, TSTAMP and DT store nearly the same values (218,7,1,13), but TSTAMP has more precision because of fractional seconds.
    HTH!
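Centinul's explanation can be demonstrated without a database: store midnight and the hour field really is 0; only an HH...AM format mask renders it as "12 AM". A hedged java.time sketch of the same point (class and method names are mine):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

public class MidnightDisplay {
    // A value truncated to midnight still holds hour 0; "12.00.00 AM" is
    // only how an HH...AM mask (like the NLS_TIMESTAMP_FORMAT above)
    // renders that hour. Nothing is lost in the conversion itself.
    static String nlsLike(LocalDateTime t) {
        return t.format(DateTimeFormatter.ofPattern("dd-MMM-yy hh.mm.ss a", Locale.US));
    }

    public static void main(String[] args) {
        LocalDateTime midnight = LocalDateTime.of(2010, 1, 13, 0, 0);
        System.out.println(midnight.getHour());  // 0 -- nothing was lost
        System.out.println(nlsLike(midnight));   // 13-Jan-10 12.00.00 AM
    }
}
```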

  • Timestamp conversion problem.

    I work in the development of a LabVIEW program that communicates with a
    server (written in Java). All acquired data is sent via TCP and the
    server can read everything (waveforms and other information). However,
    there is a problem in timestamp conversion. We don't know how to
    convert the milliseconds correctly. The timestamp is inside a waveform,
    so we cannot convert it to a formatted string.
    Example:
    Original timestamp:
    13:37:19,639
    11/10/2005
    String generated with "Flatten to String":
    0000 0000 BF71 9ABF A3D7 0800 0000 0000
    Converted value (Java server):
    13:37:19,000
    11/10/2005
    Does anyone know the algorithm to obtain the milliseconds from this value: "A3D7 0800"?
    Thanks for attention.
    My regards,
    Vinicius Falseth
    Solved!
    Go to Solution.

    There is a faster way. You can extract the timestamp from the waveform using Get Waveform Components.
    At that point, you can convert it to whatever you want. Attached is a VI which shows a simple conversion to milliseconds, losing a lot of resolution (the 128-bit timestamp goes to a 52-bit double), and a more complex conversion showing the internal structure of the timestamp (it is a 128-bit fixed-point number with the decimal point in the middle).
    You can modify the second conversion to do such things as throw away the integer portion to get higher resolution on the fraction. Or you could just save to Java using a four-integer structure.
    This account is no longer active. Contact ShadesOfGray for current posts and information.
    Attachments:
    timestamp.llb ‏48 KB
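The reply's description of the 128-bit fixed-point layout also answers the original question directly: in the flattened bytes "... A3D7 0800 0000 0000", 0xA3D70800 is the top 32 bits of the fractional part, and scaling it by 1000/2^32 recovers the 639 ms. A hedged Java sketch (names are mine):

```java
public class LabviewFraction {
    // A LabVIEW timestamp is a 128-bit fixed-point number: 64 bits of
    // whole seconds (since 1904-01-01 UTC) and 64 bits of fraction.
    // Given the top 32 bits of the fraction, milliseconds are
    // floor(top32 * 1000 / 2^32).
    static long fractionTopBitsToMillis(long top32) {
        return (top32 * 1000L) >>> 32;
    }

    public static void main(String[] args) {
        // 0xA3D70800 from the flattened string "... A3D7 0800 0000 0000"
        System.out.println(fractionTopBitsToMillis(0xA3D70800L));  // 639
    }
}
```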

  • DNG and MESZ timestamp / sony A100 RAW conversion

    Hello,
    I have a question regarding the timestamp within the EXIF metadata after the RAW to DNG conversion for my A100 pictures. Today I noticed that the timestamp between the original RAW file and the converted DNG file differs by about 3 hours. I can't explain this difference myself. Does anyone here have an idea? I use the Mac version of the current free DNG Converter. My time zone is MESZ/CEST, which means UTC+2 at the moment, if I'm right. So a 2-hour difference I could imagine from such a time zone mismatch, but not 3 hours. And if I look at the RAW files, the timestamp also shows MESZ, just like in the DNG after conversion. The main reason I want to use the DNG format is the linkage with my GPS logger data, but if the timestamps convert incorrectly, that would be a no-go for me.
    Thanks for your help.
    Greetings,
    Tschubi

    I think what you are seeing is a longstanding mismatch between how Exif and XMP record date-time values, and possibly a failure on the part of some of your software to understand how to use date-times read from XMP for the Exif date-time properties. When reading an Exif date-time from XMP, the UTC offset should always be ignored when interpreting the date-time; it has no meaning.
    Three date-time metadata properties are involved in this issue, in XMP they are stored as the following properties:
    exif:DateTimeDigitized
    exif:DateTimeOriginal
    xmp:ModifyDate
    When Adobe software reads Exif metadata from files, it maps the values in the Exif tag(s) to XMP properties. The XMP is stored inside the file when the software saves or updates a file (or sidecar .xmp file for RAW files).
    For example, the exif:DateTimeDigitized value is taken from two Exif tags, 0x9003 ("DateTimeDigitized") and 0x9291 ("SubSecTimeDigitized"). See Part 2 of the XMP Specification for more details about how Exif is mapped to XMP.
    The Exif specification defines no way to record a time zone or UTC offset for its date-time values. If you are trying to figure out what time the picture was taken by reading Exif, it's impossible to know what time zone is correct. If cameras are recording the time zone, they are storing it in proprietary metadata in the file, and not in a standard Exif tag.
    Unfortunately, until recently the XMP Specification defined a date-time string format that requires a UTC offset. That's the mismatch. So, what does the software do when it's supposed to write a UTC offset but doesn't have one to go with the time? Traditionally, Adobe software uses whatever time zone your computer's clock has at the time the XMP property is created (that is, when the Exif is read).
    So, if your camera's clock was set to record 2009:08:28 11:26:07 in the Exif, and you convert your raw file to a DNG with a computer in Europe, you're probably going to get something like 2009-08-28T11:26:07+02:00 in the XMP. But if I convert the same RAW file to a DNG here in Seattle, I'm going to get 2009-08-28T11:26:07-07:00.
    Of course the right thing to do would be to just write 2009-08-28T11:26:07 in the XMP, leaving out the time zone designator (TZD). Expect that in the future. The XMP Specification now says that the TZD is optional and that "software should not assume anything about the missing time zone," but even the most recent release of the DNG Converter (5.5 as I write this) has not yet caught up with this update to the XMP Specification, and it still writes the bogus TZD. Also, any XMP written by older Adobe software will have the TZDs.
    What Adobe software like Bridge and Lightroom has traditionally done is ignore the TZD in the XMP when using these dates. You'll notice that when browsing DNGs with the dates 2009-08-28T11:26:07+02:00 and 2009-08-28T11:26:07-07:00, running Bridge anywhere in the world (that is, regardless of the clock's Time Zone setting on your computer), the metadata panel will still show you that you took the picture at 11:26 am on August 28, 2009. If other software is trying to shift dates to local times based on the UTC offsets recorded in XMP for these dates, that's a bug.
    Note that unlike other date-time values in Exif, the Exif specification says that the GPSTimeStamp is UTC, so this problem does not affect GPS metadata.
    I mentioned the XMP specification a few times, you can download it here:
    http://www.adobe.com/devnet/xmp/
    -David
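David's recommendation (accept date-times with or without a TZD, and ignore the offset when the value came from Exif) can be sketched in code; a hedged java.time example (class and method names are mine) that yields the same wall-clock time either way:

```java
import java.time.LocalDateTime;
import java.time.OffsetDateTime;
import java.time.format.DateTimeFormatter;
import java.time.temporal.TemporalAccessor;

public class XmpDateTime {
    // XMP date-times mapped from Exif may or may not carry a UTC offset
    // (TZD). Accept both forms and drop the offset, since for Exif-derived
    // values it has no meaning.
    static LocalDateTime exifWallClock(String xmpValue) {
        TemporalAccessor parsed = DateTimeFormatter.ISO_DATE_TIME
                .parseBest(xmpValue, OffsetDateTime::from, LocalDateTime::from);
        return (parsed instanceof OffsetDateTime)
                ? ((OffsetDateTime) parsed).toLocalDateTime()  // ignore bogus TZD
                : (LocalDateTime) parsed;
    }

    public static void main(String[] args) {
        // Same wall-clock time regardless of which machine wrote the TZD.
        System.out.println(exifWallClock("2009-08-28T11:26:07+02:00"));
        System.out.println(exifWallClock("2009-08-28T11:26:07-07:00"));
        System.out.println(exifWallClock("2009-08-28T11:26:07"));
    }
}
```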

  • Help to create Materialized View that has timestamp conversion

    I need help creating a materialized view with timestamp conversion from GMT to local time.
    Feel free to make any comments.
    Thanks in advance.
    jon joaquino;)

    Here is one way.
    1. Alter the table hist_table and add a new column pdt_timestamp.
    2. Update the new column using the function 'new_time'
    For example,
    Update hist_table
    set pdt_timestamp = new_time(gmt_timestamp,'GMT','PDT');
    3. create a materialized view log on the table 'hist_table' using the syntax
    create materialized view log on hist_table
    with primary key, rowid, sequence;
    4. create a materialized view now using the syntax:
    (You have to specify your own storage numbers and tablespace name)
    create materialized view mview_hist_table
    pctfree 0 tablespace mymview
    storage (initial 16k next 16k pctincrease 0)
    build immediate
    refresh fast on commit
    as select uid,gmt_timestamp,pdt_timestamp
    from hist_table;
    Please test on your test instance before doing it on production. I have tested this on Oracle 10g for Windows 2000. I assumed that column 'uid' is the primary key on the table. I hope this helps.

  • Incorrect timestamp to string conversions?

    Why does this code give different times when the "UTC format?" input changes? The FieldPoint controller timezone is set to UTC and compensate for DST is disabled. I would expect the two conversions to give the same result in this case, but they don't. RT 8.5.1. The timestamps of readings in MAX are correct, and the string formatted with "UTC format?" True is correct.
    Thanks.
    Matt
    Attachments:
    utc_time.vi ‏8 KB

    We're still seeing this in LabVIEW 2012.
    But it only appears to affect dates in BST; as soon as you get back into GMT, this '-1 day' error goes away.
    Attachments:
    test - BST - UTC format bug.vi ‏16 KB

  • Converting Oracle TIMESTAMP(4) column to SQL datetime column conversion error in ssis

    I am not able to convert an Oracle TIMESTAMP(4) column to a SQL datetime column; I get a conversion error in SSIS.
    I'm connecting an OLEDB Oracle Source to an OLEDB SQL Destination in an SSIS package. I'm trying to insert data from the Oracle datetime column into the SQL datetime column, and I'm getting some errors.
    Please provide helpful info.

    You can transform the data types directly at the source by writing a proper SQL statement, or you can convert them using the Data Conversion component.
    Please refer to the link below:
    http://stackoverflow.com/questions/6256168/how-to-convert-a-timestamp-in-string-format-to-datetime-data-type-within-a-packa
