Read trigger timestamps

Hi,
I have a VI which is supposed to output an analog waveform in response to a trigger on APFI1 (USB X-Series card). The waveform changes depending on the phase of my experiment: during the first part it is just zeros, then it becomes a square pulse, and then a pulse train. My problem is that I need to keep track of when the triggers were elicited, or how many triggers occurred during the small timed while loop on the right, because I need to save that information. The experiment is designed so that the trigger rate varies depending on the signal being output, and I want to see whether more triggers occur when I output a pulse train instead of a square pulse (the pulses are not the trigger! the trigger is derived separately). I spent hours getting the VI to at least allow retriggerable stimulation with different waveforms, but I just have no idea how to monitor and log the trigger timestamps...
Thanks for your help!
Best,
Michael
ps: I guess it would be enough to connect the output's start trigger to a counter and read this counter every time the 100 ms while loop runs. Unfortunately I have no idea how to do this...
Attachments:
OGselfstim _Cont_vs_Pulses_vs1.vi ‏36 KB
retriggered output.png ‏79 KB

Hello Michael,
thanks for your request.
Unfortunately it is a little hard to see what the idea behind your VI is, as I find some contradictions between the VI and your description.
Could you update your VI with comments on the individual tasks, especially in the for loop (in the same way as you did for the while loop)? That would make it much easier for me to understand your intentions.
Once I know them I can build you an example (if none exists yet) of how to produce your monitor.
Regards,
Eduard Gross
National Instruments

Similar Messages

  • Reading Trigger Value

    I have a database trigger for the setup table Investment_objective. In this table the objective_id column is auto-generated, which is achieved by a SELECT statement within the database-level trigger, as follows:
    CREATE OR REPLACE TRIGGER investment_objective_setup
    BEFORE INSERT ON  investment_objective FOR EACH ROW
    WHEN   (new.objective_id IS NULL)
    BEGIN
      SELECT NVL(MAX(objective_id),0)+1 INTO :new.objective_id
        FROM   investment_objective;
    END;
    /
    It's working fine.
    But using the front-end Forms 6i, after the commit the generated value is not displayed in the front-end field :objective_id unless I re-query. My requirement is to show the trigger-generated objective_id for each record as soon as I commit; in short, I want to read the trigger-generated value at the front end at the time the event is triggered.
    Khurram

    Hi,
    You could write a function which returns the next generated sequence value, use that function in the front end, and assign the value to the column.
    Anyway, you are already checking whether :new.objective_id is null in the back-end trigger, so the sequence is not generated twice.
    Thanks and Regards
    Mohan
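    For reference, if the insert were issued from plain JDBC rather than Forms 6i, the trigger-generated value could also be read back in the same round trip via generated-key retrieval. The snippet below is only a minimal sketch of that idea: the OBJECTIVE_NAME column is hypothetical, and an already-open Connection is assumed.
    import java.sql.*;
    public class ReadTriggerValue {
        // Inserts a row and returns the objective_id assigned by the BEFORE INSERT trigger.
        static int insertObjective(Connection conn, String name) throws SQLException {
            // objective_name is a hypothetical column; only objective_id comes from this thread.
            String sql = "INSERT INTO investment_objective (objective_name) VALUES (?)";
            // Ask the driver to hand back the trigger-populated column after the insert.
            try (PreparedStatement ps = conn.prepareStatement(sql, new String[] {"OBJECTIVE_ID"})) {
                ps.setString(1, name);
                ps.executeUpdate();
                try (ResultSet keys = ps.getGeneratedKeys()) {
                    keys.next();
                    return keys.getInt(1);
                }
            }
        }
    }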

  • Daqmx read.vi timestamp incorrect problem (storing the timestamp from start task)

    Hello,
    I have a state machine whose first state, configure DAQ, has [DAQmx Create Channel.vi --> DAQmx Timing (Sample Clock) --> DAQmx Start Trigger --> DAQmx Start Task.vi] connected in order:
    the sample clock is in continuous mode, with 10,000 samples to read and a sample rate of 1000 Hz,
    and the start trigger has rising edge selected with Dev1/PFI0 from a DAQ board.
    The next state of the SM is read channels, which has DAQmx Read.vi set to 1D waveform NChan NSamp.
    The program first configures, then goes to the read channels state and just waits for the rising edge on PFI0.
    But the timestamp is not updated to the current time when DAQmx Read starts; instead it keeps the initial t0 from when the VI was in the configure DAQ state of my SM. In other words, it stores the start time from when Start Task.vi was run and then updates from there, so in effect I am not getting the exact timestamp values while I am running the VI.
    I tried to build the waveform by inputting the actual current time, but then the chart does not scroll and shows only a few values on each read; when I input the t0 from the waveform given out by DAQmx Read.vi, it behaves normally.
    option #1 from this link says
    http://forums.ni.com/ni/board/message?board.id=250&thread.id=47648&view=by_date_ascending&page=2
    Try and do exactly what the driver does.  This
    will require you to do exactly what you are doing in the posted
    example.  Call the current system time immediately prior to calling the
    DAQmx Read and subtract dt * x where x is the number of samples already
    acquired.  This will require you to know exactly how many samples have
    been acquired.  This can be found by calling the Total Samples Per
    Channel Acquired property immediately prior to the DAQmx Read.  This
    introduces some points of inaccuracy.  For example, your system time
    is already inaccurate to some amount.  In addition, it takes some time
    between calling system time, calling the total samples acquired, and
    calling the DAQmx read.  If 2 samples are acquired between calling the
    system time and total samples acquired, you could be off by a few
    samples.  For slower clock rates, you will have more accuracy. 
    But no samples are acquired until the rising edge, so the number of samples is always zero before the read operation.
    http://digital.ni.com/public.nsf/allkb/5D42CCB17A70A06686256DBA007C5EEA
    This link also discusses it. Also, the number of samples in my waveform is constantly changing, around 20-40, so I cannot really input the current time to build the waveform; that's all I can figure out for now.
    Can somebody tell me why this is happening, or is there a fix for it? How can I get the current time into my DAQmx Read.vi so it displays the correct time on my waveform chart?
    Thanks,

    thanks,
    I'm using LV 8.6 and DAQmx driver 8.9.5.
    Yes, the program you posted gets the correct time, but see the attached program; that is exactly what I have in my program.
    If I take the t0 out using Get Waveform Components on the waveform from DAQmx Read, I still get the application start time as the timestamp. Please see the attached code (saved for 8.2) and front panel.
    The waveform time x-axis is not current; it starts from the application start time.
    PS: samples to read = 10,000 and sample rate = 1000.
    Please help.
    Attachments:
    daqmx read problemp.png ‏8 KB
    daqmx read problem.vi ‏36 KB

  • How can i read the timestamp of my *.swf's compile time?

    Hi,
    Sometimes I run into the problem that my builds from Flex Builder are not refreshed/rebuilt properly for various reasons (compile errors, ...).
    To be able to determine whether the build I'm debugging right now is actually the latest one, I'd simply like to display the compile-time timestamp of my app within the app itself.
    I already thought about accessing the file properties through the file system... but come on, there must be another way to get that info...
    I also searched the LiveDocs for compiler constants that can be read, but found nothing that matches my needs yet.
    Thanks for any help or suggestion!
    - Michl

    Same question with a twist: is it somehow possible to get the Flex compiler to understand and evaluate something like this:
    -define+=COMPILE::Timestamp,"new Date()"
    Obviously it doesn't work, otherwise I would have used it and proposed it as a solution myself.
    But I was wondering if there are some tweaks to trick the framework into evaluating this,
    or a way to write the expression itself somehow differently so it gets evaluated?

  • How to Read the Timestamp of a File on Unix from Form?

    If you have a file located on a UNIX operating system, what is the best way to read the timestamp (the date the file was created) of that file from Oracle SQL, SQL*Plus, or Developer Forms 6i/9i/10g?

    Thanks Robin for the reply! Your suggestion seems good!
    However, we found a solution now. I would like to share the way we did it, in case anyone out there needs it:
    In SQL*Plus, we created a Java Source below:
    CREATE OR REPLACE AND COMPILE JAVA SOURCE NAMED "FileDate"
    AS
    import java.io.File;
    import java.text.SimpleDateFormat;
    import java.util.Date;
    public class FileDate {
        public static String GetDate(String strFilePath) {
            String ReturnValue = null;
            if (strFilePath == null || strFilePath.length() == 0) {
                // do nothing
            } else {
                File aFile = new File(strFilePath);
                if (aFile.exists()) {
                    long lngTime = aFile.lastModified();
                    Date aDate = new Date(lngTime);
                    SimpleDateFormat fmtDate = new SimpleDateFormat("dd-MM-yyyy HH:mm:ss");
                    ReturnValue = fmtDate.format(aDate);
                    fmtDate = null;
                    aDate = null;
                }
                aFile = null;
            }
            return ReturnValue;
        }
    }
    /
    -- Then we created a wrapper to call the Compiled Java above:
    CREATE or REPLACE FUNCTION FileDate_GetDate(FilePath IN STRING)
    RETURN VARCHAR2 IS
    LANGUAGE JAVA
    NAME 'FileDate.GetDate(java.lang.String) return java.lang.String';
    -- Then we created a database package:
    -- Package Spec:
    PACKAGE sap_dnld AS
    FUNCTION get_file_date2 (s1 varchar2) RETURN VARCHAR2;
    END sap_dnld;
    -- Package Body:
    PACKAGE BODY sap_dnld AS
    FUNCTION get_file_date2( s1 VARCHAR2 )
    RETURN VARCHAR2 AS LANGUAGE JAVA
    NAME 'FileDate.GetDate(java.lang.String) return java.lang.String';
    END sap_dnld;
    -- Then we tested the whole thing in SQL*Plus:
    set serveroutput on;
    declare
    x varchar2(2000);
    begin
    x := sap_dnld.get_file_date2('directory1/subdirectory/filename.txt');
    dbms_output.put_line(x);
    end;
    -- Then we called the function from the package using a procedure from the form:
    PROCEDURE Find_date IS
    BEGIN
    :control.file_date := sap_dnld.get_file_date2('directory1/subdirectory/filename.txt');
    END;
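    As an aside, if the last-modified timestamp is only needed from plain Java (outside the database), the same information that the Java source above reads via File.lastModified() is also available through NIO. This is just a small sketch; the path is the example one used in the Forms call above.
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.attribute.FileTime;
    import java.text.SimpleDateFormat;
    import java.util.Date;
    public class FileDateNio {
        public static void main(String[] args) throws IOException {
            // Example path taken from the call above; adjust as needed.
            Path p = Paths.get("directory1/subdirectory/filename.txt");
            FileTime ft = Files.getLastModifiedTime(p);   // same value as File.lastModified()
            String formatted = new SimpleDateFormat("dd-MM-yyyy HH:mm:ss")
                    .format(new Date(ft.toMillis()));
            System.out.println(formatted);
        }
    }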

  • DAQmX read trigger

    On the DAQmx Timing block, what does the source 'onboard clock' mean? I am using a USB-6225 as my DAQ device and I didn't find any clock speed for the onboard clock in the manual for that device. I need some help, please.

    First of all, you should clean up your diagram before you submit it here for review.  All of the DAQ settings are on top of each other.  It is impossible to follow the wiring unless I move things out of the way.  How can you troubleshoot anything if you can't see the connections?
    Next, your Current Channel control is a DBL.  It should be I32.  Change it.
    The Onboard Clock IS the internal clock.  In the DAQmx Timing function (Sample Clock), the rate input is the clock speed.
    Instead of having two DAQmx channels in separate threads, it would be better to create one DAQmx task that contains both channels you want to use, provided that all settings for both channels are the same. This appears to be the case in your VI. Use the Create Task function and wire its output into the task input of DAQmx Create Channel; then you don't have to worry about separate identical threads. You will have to change the read function to Multiple Channels Single Sample. Also, why are you using Single Sample when you have the number of samples per channel set to 10? If you want 10 samples, you need to select Multiple Samples, and your output will be an array of 10 samples. If using Multiple Channels Multiple Samples, you can choose a 2D array output. The first row will contain 10 samples for the first channel, and the second row is for the second channel. Use Index Array to separate them for processing.
    - tbob
    Inventor of the WORM Global

  • Help! PXI-6682 timestamping is limited to 2.5 Hz

    I am outputting a 30 Hz signal from a pulse generator, and I have reduced the VI to the bare minimum, but the program still only logs a timestamp about 2-3 times a second.  Checking timestamping in the NI-MAX test panels gives the same result.  The cables are fine and I've tried PFI0, 1, and 2.  What's going on?
    Attachments:
    timestamp_check3.vi ‏18 KB

    Hi Steve-
         You have selected 'Read Single Timestamp' as the polymorphic instance of the niSync Read Trigger Timestamp.vi.  Change that to 'Read Multiple Timestamps' from the drop-down menu.  This will enable you to pull an array of timestamps from the 6682, as opposed to just a single timestamp, as you had it configured.  The reason you were only getting 2-3 Hz is because you were pulling only a single timestamp with every iteration of the while loop, which was only iterating at 2-3 Hz.  With the Multiple Timestamps enabled, your while loop will still iterate at 2-3 Hz, but you can pull several timestamps with every iteration.  To specify the number of timestamps to pull, right-click on the number of timestamps input and select Create»Constant.  Start out with 10 as the constant and increase from there until you are pulling 30 timestamps every second.  Be sure to also wire a timeout value to the timeout input as well (ten seconds should be sufficient, but you can fiddle with this number also). 
         Your number of timestamps indicator will now be the 'detected edges' output of the niSync Read Trigger Timestamp.vi instead of the number of loop iterations.  Be sure to also place an niSync Close.vi at the end of the program to properly close the niSync session and clear it from memory.
         This should now run how you want it to.  I hope this helps, and best of luck with your application!
    Gary P.
    Applications Engineer
    National Instruments
    Visit ni.com/gettingstarted for step-by-step help in setting up your system.

  • Multichannel multisample read timestamp for CCP

    I have been using the multichannel multisample 2D Dbl CAN Read VI to acquire data from my CCP functions. This is almost the same as what is shown in the NI CCP toolset examples. However, now I want to read the timestamps, but when I select the multichan - multisamp - 2D Time & Dbl instance, I get the error:
    HEX: BFF62209
    Read/Write not matched initialized input/output mode.
    The CCP init task does not have a mode selection as CAN Init Start does, so I have no idea how to solve this problem. Any ideas, please?
    Thanks.

    Hello Elmo
    I've had a look at the functions, and the way to specify the mode is via the 'Message Configuration' input on the CCP initialise function. If you look at the help file for this function it shows the breakdown of the cluster input; the fourth item in the cluster is the mode configuration. The numerics map to the following settings:
    0 - Input, 1 - Output, 2 - Timestamped Input, 4 - Output Recent.
    Setting up the initialise with the mode set should pass the correct CAN task reference to allow you to perform the timestamped measurement.
    Hope this helps, if you have any further questions, feel free to ask.
    Regards
    Hannah
    NI

  • Reading Dates and Timestamps created by PL/SQL

    Hello,
    This is probably a FAQ (or stating the obvious) but I want to make sure I've got the right idea about things...
    I recently noticed that our Java app reads dates/timestamps incorrectly from the database. Any date set using 'SYSDATE' in PL/SQL is an hour out when read by Java. This is because we are now in BST and the object read back by the JDBC driver thinks the timezone is GMT.
    From reading around it seems that the DATE and TIMESTAMP types in Oracle don't persist timezone information. Fair enough, but to me this makes it dangerous to use SYSDATE at all in PL/SQL procedures.
    Previously I've never relied on the database itself to generate timestamps so dates have always been stored as UTC. I'm currently using SYS_EXTRACT_UTC(SYSTIMESTAMP) when inserting data in to tables and basically wondering if this is the common way of achieving accuracy.
    (Using types such as "TIMESTAMP WITH LOCAL TIME ZONE" seems to have its own annoyances in JDBC, so I've opted to avoid those.)
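    For what it's worth, here is a minimal JDBC sketch of that approach; the LOG_EVENTS table and CREATED_AT column are hypothetical and an already-open Connection is assumed. The idea is to store UTC on the server side and pass an explicit UTC calendar when reading back, so the driver does not shift the value by the BST offset.
    import java.sql.*;
    import java.util.Calendar;
    import java.util.TimeZone;
    public class UtcTimestampDemo {
        public static void writeAndRead(Connection conn) throws SQLException {
            // Store the server time already converted to UTC (DATE keeps no time zone info).
            try (PreparedStatement ins = conn.prepareStatement(
                    "INSERT INTO log_events (created_at) VALUES (SYS_EXTRACT_UTC(SYSTIMESTAMP))")) {
                ins.executeUpdate();
            }
            Calendar utc = Calendar.getInstance(TimeZone.getTimeZone("UTC"));
            try (PreparedStatement sel = conn.prepareStatement("SELECT created_at FROM log_events");
                 ResultSet rs = sel.executeQuery()) {
                while (rs.next()) {
                    // Passing the UTC calendar tells the driver which zone the stored value is in,
                    // avoiding the one-hour BST/GMT shift described above.
                    Timestamp ts = rs.getTimestamp(1, utc);
                    System.out.println(ts.toInstant());
                }
            }
        }
    }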
    Thanks.

    Tom,
    This may be helpful...
    http://www.javaworld.com/javaworld/jw-10-2003/jw-1003-time.html
    Good Luck,
    Avi.

  • How to write a Timestamp in an Oracle table?

    Hi,
    I am pretty new to SAP IdM and therefore have a very basic question.
    I am using SAP IdM 7.2 SP8 on Oracle.
    I was creating a job where I calculated some values from the database with a 'From Database' path.
    Then I wanted to store these values in a temporary table together with the timestamp when these were stored/calculated.
    Doing this in the Destination tab of the 'From Database' path, I got the following error message...
    java.sql.SQLException: Missing IN or OUT parameter at index::2
    I did several tests and created a little test which recreates the problem...
    1. Create a Job with a 'From Database'-path
    2. In the source statement, insert...
    select
      'Test' AS Message,
      systimestamp AS TempDate
    from dual;
    3. In the destination tab add a temp Table and two columns. Since I want to add a date/timestamp-value I create a column of type 'DATE' here.
    4. After running the job the error appears (here in German).
    Doing some additional tests, I am able to add the date as a varchar2 if I convert the TempDate using a to_char() function.
    Thus it seems like I am misusing the Timestamp/Date data type.
    Do you have recommendations on how I can add the timestamp as a Date/Timestamp value to my table?
    Kind Regards, Andreas

    Hi,
    I never really got this to work properly, and most of what follows is just my own rant caused by the annoyance of not getting this seemingly simple thing to work. These are strictly my own theories, not based on facts or actual knowledge of how Oracle or JDBC works, nor representing my employer in any way. Anyhow :-)
    It's possible that the cause of the problem is that the timestamp is "destroyed" by NLS/JDBC when transferred from the database to the client. This is also one of the issues that varies a bit from version to version of the JDBC driver. Also, the DATE datatype does not include fractional seconds, so you need to create the column as TIMESTAMP if you want that.
    The really fun thing with Oracle (or their JDBC driver) is that you can read a timestamp in the source and JDBC/NLS will convert it according to your locale setting, but you can't write what you read from Oracle back to Oracle.
    In the same way, you can use to_char to get a timestamp in a nice enough format, but when writing it back, to_date may not be able to parse the date using the exact same conversion mask...
    Example:
         select TO_CHAR(current_timestamp,'YYYYMMDD HH24:MI:SS.FFTZHTZM') from dual;
    Gives: 20140314 12:41:03.320000+0100. But both
         select TO_DATE('20140314 12:41:03.320000+0100','YYYYMMDD HH24:MI:SS.FFTZHTZM') from dual;
    and
         select TO_TIMESTAMP('20140314 12:41:03.320000+0100','YYYYMMDD HH24:MI:SS.FFTZHTZM') from dual;
    give errors.
    Rather than argue and fight with it, I usually just end up using a To Database pass, checking SQL Updating, and writing the Create Table and update/insert statements myself. Alternatively, you can create the table with a trigger that automatically adds the timestamp on inserts/updates.
    1st To Database pass has no source, creates the temp table:
    The next pass has the source to read the data and insert it into the temp table:
    Then just use the current_timestamp function in the insert statement to keep it local and unconverted in the database engine...
    Br
    Chris

  • Arrive time of start trigger

    Dear all NI high speed digitizer experts:
    I am asking how I can read the timestamp of the start trigger. In my measurement I configure my NI 5154 digitizer to start a multirecord acquisition by sending a start trigger. Currently I use the timestamp of the first captured waveform as the trigger time. This is not a precise approximation, since the time difference between the first waveform and the trigger varies within a few-ms range.
    Anyone know how to read the trigger timestamp?
    thank you.
    Lixin 

    Hello,
    The timestamp that you're looking at is actually the software timestamp returned by the NI-Scope driver VI, so you're right that it will differ from when the start trigger on the digitizer hardware is received and when that first sample is returned in software, especially if you have a very long record length.
    There are two properties that can help you determine a relative timestamp between records; they are actually time values from a constantly running counter onboard the digitizer. The "Absolute Initial X" and "Relative Initial X" values that get returned as part of the "wfm Info" on the Fetch VI will show you the timestamp of the first sample in each record of a particular acquisition and the time from the trigger to the first sample of the record, respectively. Take a look in the niScope Fetch (poly) detailed help and you can see this explanation.
    These values are good for comparison between records, but not as an absolute timestamp, because this counter is constantly running and is only reset on rollover or when the board is reset. So the values returned may look a little funny, but keep in mind they are derived from the counter counting at a certain rate. What you can do is take the first values returned for AbsoluteInitialX and RelativeInitialX and add/subtract them (depending on your trigger type) to make that value your "0" time, and then for each consecutive record just extrapolate from there.
    Hopefully this helps out.
    Chris W

  • Can I use the timestamp of a Network published global variable to reduce network traffic?

    I would like to use a couple of network-published global variables that will contain large clusters of data.  I want to host them on one device but read them from several - consider a distributed control system.  The data will update very infrequently, but, when it does, I want all my HMIs to know quickly.  I can have all the HMIs just read the data 4x/second (that would be fast enough) but I was wondering if there is a more elegant solution (still using global variables).  If I read only the timestamp 4x/second from each of the HMIs, compare it to the last read, and then poll the whole variable only if the timestamps are different, will that require less resources than just grabbing the whole variable every time?  In other words, does reading the timestamp use the same amount of resources as reading the whole variable?
    With really simple code, assuming the "Setup Data" cluster is quite large, does....
    ...get me any advantage over...

    mark3545 wrote:
    So that means they are already doing what I want anyway, right?  If the reader only gets updated when the writer changes it, I can poll it as often as I want without increasing traffic, correct?
    That is correct.
    There are only two ways to tell somebody thanks: Kudos and Marked Solutions
    Unofficial Forum Rules and Guidelines

  • How to read/unzip a specific file within a zip file

    Hi,
    I have a file within a zip file that contains a timestamp. I want to read this timestamp and then create a destination directory for the remaining zipped files to be unzipped into. Since I know the name of the file with the timestamp in it, I thought I could create a ZipFile and use getEntry to get the entry, but then, other than getting the size and name of the file, I can't do much more with it (like read it) unless I use a stream (ZipInputStream) instead of a file (ZipFile) - do I have this right?
    Does this mean that to get the content I would have to loop through possibly all the files using the stream until I come across the one I want, then get the timestamp, and loop through them all again to write them to the destination directory? Or am I reading this wrong? It seems a bit roundabout.
    Any suggestions would be greatly appreciated.
    Thanks

    This works, though - and you don't have to loop through all the files - just use the ZipFile:
    ZipEntry ze = zipFile.getEntry("path/to/file");
    BufferedReader br = new BufferedReader(new InputStreamReader(zipFile.getInputStream(ze)));
    String line = br.readLine();
    Thanks!
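    For reference, here is a fuller sketch of the same single-pass idea; the archive name, entry name, and directory prefix below are made up, and the timestamp in the file is assumed to be filesystem-safe. It reads the known entry directly via getEntry, builds the destination directory from its contents, and then extracts the remaining entries in one pass.
    import java.io.*;
    import java.nio.file.*;
    import java.util.Enumeration;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipFile;
    public class UnzipByTimestamp {
        public static void main(String[] args) throws IOException {
            try (ZipFile zip = new ZipFile("archive.zip")) {               // hypothetical archive name
                // 1. Read the known timestamp entry directly, no looping needed.
                ZipEntry tsEntry = zip.getEntry("meta/timestamp.txt");     // hypothetical entry name
                String timestamp;
                try (BufferedReader br = new BufferedReader(
                        new InputStreamReader(zip.getInputStream(tsEntry)))) {
                    timestamp = br.readLine().trim();                      // assumes the entry exists and holds one line
                }
                // 2. Create the destination directory named after the timestamp.
                Path destDir = Paths.get("extracted-" + timestamp);
                Files.createDirectories(destDir);
                // 3. Single pass over the remaining entries to extract them.
                Enumeration<? extends ZipEntry> entries = zip.entries();
                while (entries.hasMoreElements()) {
                    ZipEntry e = entries.nextElement();
                    if (e.isDirectory() || e.getName().equals(tsEntry.getName()))
                        continue;
                    Path out = destDir.resolve(e.getName());
                    Files.createDirectories(out.getParent());
                    try (InputStream in = zip.getInputStream(e)) {
                        Files.copy(in, out, StandardCopyOption.REPLACE_EXISTING);
                    }
                }
            }
        }
    }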

  • Xy-graph timestamp issue

    Hi All,
    I'm sure it's simple but I can't figure out what is going wrong with my xy-graph. I read a timestamp as a string from a waveform data file and convert it to a timestamp format. I then plot data vs time.
    The timestamps look correct but the plot is all out of whack. I've attached a simple example. Can anyone help me and see what I'm doing wrong? 
    Thanks!  
    Attachments:
    XY-example.vi ‏19 KB

    The data was taken on two consecutive days. I set the graph to the same format but that didn't work.
    Solution: make the timestamp creator use dd/mm/yy. For some reason I thought it needed to be mm/dd/yy.
    Cheers!

  • Mapping util.Date to Oracle timestamp

    Tuesday, March 22, 2005
    I am currently experiencing difficulty in mapping a java.util.Date
    field to an Oracle TIMESTAMP column.
    Here's what I see. By default, Kodo maps the date field to a DATE
    column. I suppose this makes sense since Oracle's date columns
    have time information that resolves to the second. In this case,
    the client has a business case to store subsecond resolution,
    hence the desire to store the date field in an Oracle TIMESTAMP
    column.
    First question: how should this be done?
    Here's what I've tried. I tried setting the jdbc-type extension
    for the date field to "timestamp". This setting makes no
    difference, and I suspect the reason is that OracleDBDictionary
    has made the mapping from TIMESTAMP to DATE.
    I tried setting the jdbc-sql-type extension for the date field to
    "timestamp". This makes a difference only when I drop the table.
    Then the schematool's refresh action creates a table with date's
    field mapped to a TIMESTAMP column. I have also gone ahead and
    manually altered the table to achieve the same effect.
    Once the mapping is created, I see the following behavior. Kodo
    has no problem reading the TIMESTAMP column and putting the info
    into the date field. It also has no problem saving non-null date
    values into the TIMESTAMP column. But it does have a problem
    storing a null in the date field.
    Second question: what is the workaround to this problem?
    The stack dump (obtained by using the JDO Tools Library
    example) follows.
    Thanks in advance,
    David Ezzio
    enter command:
    --> return book
    Select the book to return:
    1. book [com.ysoft.jdo.book.library.Book-354] "Gone to War" checked out:
    Tue Mar 22 10:38:01 EST 2005
    2. book [com.ysoft.jdo.book.library.Book-356] "Gone to Work" checked
    out: Tue Mar 22 10:33:58 EST 2005
    3. book [com.ysoft.jdo.book.library.Book-357] "Gone Fishing" checked
    out: Tue Mar 22 10:33:58 EST 2005
    4. book [com.ysoft.jdo.book.library.Book-360] "Gone Sailing" checked
    out: Tue Mar 22 10:33:58 EST 2005
    5. book [com.ysoft.jdo.book.library.Book-355] "Gone Hunting" checked
    out: Tue Mar 22 10:33:58 EST 2005
    Enter selection:
    --> 2
    okay
    enter command:
    --> commit
    exception caught in command
    kodo.util.FatalDataStoreException: The transaction has been rolled back. See the nested exceptions for details on the errors that occurred.
    at
    kodo.runtime.PersistenceManagerImpl.throwFlushException(PersistenceManagerImpl.java:1262)
    at
    kodo.runtime.PersistenceManagerImpl.flush(PersistenceManagerImpl.java:1122)
    at
    kodo.runtime.PersistenceManagerImpl.flushSafe(PersistenceManagerImpl.java:1005)
    at
    kodo.runtime.PersistenceManagerImpl.beforeCompletion(PersistenceManagerImpl.java:932)
    at
    kodo.runtime.LocalManagedRuntime.commit(LocalManagedRuntime.java:69)
    at
    kodo.runtime.PersistenceManagerImpl.commit(PersistenceManagerImpl.java:592)
    at
    com.ysoft.jdo.book.library.LibraryHandler.commitTransaction(LibraryHandler.java:175)
    at
    com.ysoft.jdo.book.library.client.CommitTransaction.execute(Library.java:279)
    at
    com.ysoft.jdo.book.common.console.UserInterface.execute(UserInterface.java:196)
    at
    com.ysoft.jdo.book.common.console.UserInterface.pumpCommands(UserInterface.java:186)
    at com.ysoft.jdo.book.library.client.Library.run(Library.java:139)
    at com.ysoft.jdo.book.library.client.Library.main(Library.java:104)
    NestedThrowablesStackTrace:
    kodo.util.DataStoreException: Invalid column type
    at
    kodo.jdbc.sql.DBDictionary.newDataStoreException(DBDictionary.java:3081)
    at kodo.jdbc.sql.SQLExceptions.getDataStore(SQLExceptions.java:77)
    at kodo.jdbc.sql.SQLExceptions.getDataStore(SQLExceptions.java:63)
    at kodo.jdbc.sql.SQLExceptions.getDataStore(SQLExceptions.java:43)
    at
    kodo.jdbc.runtime.PreparedStatementManager.flush(PreparedStatementManager.java:89)
    at
    kodo.jdbc.runtime.UpdateManagerImpl.flush(UpdateManagerImpl.java:445)
    at
    kodo.jdbc.runtime.UpdateManagerImpl.flush(UpdateManagerImpl.java:193)
    at
    kodo.jdbc.runtime.UpdateManagerImpl.flush(UpdateManagerImpl.java:95)
    at
    kodo.jdbc.runtime.JDBCStoreManager.flush(JDBCStoreManager.java:609)
    at
    kodo.runtime.DelegatingStoreManager.flush(DelegatingStoreManager.java:153)
    at
    kodo.runtime.PersistenceManagerImpl.flush(PersistenceManagerImpl.java:1122)
    at
    kodo.runtime.PersistenceManagerImpl.flushSafe(PersistenceManagerImpl.java:1005)
    at
    kodo.runtime.PersistenceManagerImpl.beforeCompletion(PersistenceManagerImpl.java:932)
    at
    kodo.runtime.LocalManagedRuntime.commit(LocalManagedRuntime.java:69)
    at
    kodo.runtime.PersistenceManagerImpl.commit(PersistenceManagerImpl.java:592)
    at
    com.ysoft.jdo.book.library.LibraryHandler.commitTransaction(LibraryHandler.java:175)
    at
    com.ysoft.jdo.book.library.client.CommitTransaction.execute(Library.java:279)
    at
    com.ysoft.jdo.book.common.console.UserInterface.execute(UserInterface.java:196)
    at
    com.ysoft.jdo.book.common.console.UserInterface.pumpCommands(UserInterface.java:186)
    at com.ysoft.jdo.book.library.client.Library.run(Library.java:139)
    at com.ysoft.jdo.book.library.client.Library.main(Library.java:104)
    NestedThrowablesStackTrace:
    java.sql.SQLException: Invalid column type
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:134)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:179)
    at oracle.jdbc.dbaccess.DBError.throwSqlException(DBError.java:269)
    at
    oracle.jdbc.driver.OracleStatement.get_internal_type(OracleStatement.java:6164)
    at
    oracle.jdbc.driver.OraclePreparedStatement.setNull(OraclePreparedStatement.java:1316)
    at
    com.solarmetric.jdbc.DelegatingPreparedStatement.setNull(DelegatingPreparedStatement.java:369)
    at
    com.solarmetric.jdbc.PoolConnection$PoolPreparedStatement.setNull(PoolConnection.java:406)
    at
    com.solarmetric.jdbc.DelegatingPreparedStatement.setNull(DelegatingPreparedStatement.java:369)
    at
    com.solarmetric.jdbc.DelegatingPreparedStatement.setNull(DelegatingPreparedStatement.java:369)
    at
    com.solarmetric.jdbc.DelegatingPreparedStatement.setNull(DelegatingPreparedStatement.java:369)
    at
    com.solarmetric.jdbc.LoggingConnectionDecorator$LoggingConnection$LoggingPreparedStatement.setNull(LoggingConnectionDecorator.java:792)
    at
    com.solarmetric.jdbc.DelegatingPreparedStatement.setNull(DelegatingPreparedStatement.java:369)
    at kodo.jdbc.sql.DBDictionary.setNull(DBDictionary.java:950)
    at
    kodo.jdbc.sql.OracleDictionary.setNull(OracleDictionary.java:450)
    at kodo.jdbc.sql.RowImpl.toSQL(RowImpl.java:828)
    at kodo.jdbc.sql.RowImpl.flush(RowImpl.java:1039)
    at kodo.jdbc.sql.RowImpl.flush(RowImpl.java:975)
    at
    kodo.jdbc.runtime.PreparedStatementManager.flushInternal(PreparedStatementManager.java:160)
    at
    kodo.jdbc.runtime.PreparedStatementManager.flush(PreparedStatementManager.java:84)
    at
    kodo.jdbc.runtime.UpdateManagerImpl.flush(UpdateManagerImpl.java:445)
    at
    kodo.jdbc.runtime.UpdateManagerImpl.flush(UpdateManagerImpl.java:193)
    at
    kodo.jdbc.runtime.UpdateManagerImpl.flush(UpdateManagerImpl.java:95)
    at
    kodo.jdbc.runtime.JDBCStoreManager.flush(JDBCStoreManager.java:609)
    at
    kodo.runtime.DelegatingStoreManager.flush(DelegatingStoreManager.java:153)
    at
    kodo.runtime.PersistenceManagerImpl.flush(PersistenceManagerImpl.java:1122)
    at
    kodo.runtime.PersistenceManagerImpl.flushSafe(PersistenceManagerImpl.java:1005)
    at
    kodo.runtime.PersistenceManagerImpl.beforeCompletion(PersistenceManagerImpl.java:932)
    at
    kodo.runtime.LocalManagedRuntime.commit(LocalManagedRuntime.java:69)
    at
    kodo.runtime.PersistenceManagerImpl.commit(PersistenceManagerImpl.java:592)
    at
    com.ysoft.jdo.book.library.LibraryHandler.commitTransaction(LibraryHandler.java:175)
    at
    com.ysoft.jdo.book.library.client.CommitTransaction.execute(Library.java:279)
    at
    com.ysoft.jdo.book.common.console.UserInterface.execute(UserInterface.java:196)
    at
    com.ysoft.jdo.book.common.console.UserInterface.pumpCommands(UserInterface.java:186)
    at com.ysoft.jdo.book.library.client.Library.run(Library.java:139)
    at com.ysoft.jdo.book.library.client.Library.main(Library.java:104)
    enter command:
    -->

    Hi Stephen,
    There are two related issues that are addressed. One, some Oracle
    drivers return the wrong type (Type.OTHER) for the TIMESTAMP field.
    This is true for the
    9.2.0.1.0 driver that ships with 9iR2. This causes an exception when
    attempting to assign a null to the date field that has been mapped to a
    TIMESTAMP column. Two, all of the 9i drivers (and 10g drivers) return a
    type name of "TIMESTAMP(x)" where x is the precision. This confuses
    Kodo's OracleDictionary which is looking for a string without the
    precision characters.
    Following your suggestion, the following code fixes it just fine. It is
    harmless, in that all it does is do what OracleDictionary intended but
    failed to do. To use it, you must add the following property
    configuration to the kodo.properties file.
    kodo.jdbc.DBDictionary: xxx.jdo.FixedOracleDictionary
    Without the fix, Kodo does not reassign the TIMESTAMP columns to a type
    of DATE. So far as I can tell, as long as the driver returns a
    Types.TIMESTAMP this does not cause a failure.
    This fix will be moot as soon as the bug in OracleDictionary is fixed.
    What I wonder about is why does Kodo reassign type TIMESTAMP to DATE?
    Why don't you treat TIMESTAMP types as TIMESTAMP types? Curious minds
    want to know.
    Best wishes,
    David
    ---- code follows
    package xxx.jdo;
    import java.sql.*;
    import kodo.jdbc.schema.*;
    import kodo.jdbc.sql.*;
    /**
     * Some Oracle drivers do not return the correct type for the TIMESTAMP field.
     * This class fixes this issue for Kodo 3.3. The problem (an exception complaining
     * about an invalid column type) appears when mapping a Java field (Date for example)
     * to an Oracle timestamp field, and only when attempting to set null on the Java field.
     */
    public class FixedOracleDictionary
        extends OracleDictionary
    {
        public Column[] getColumns (DatabaseMetaData meta, String catalog,
            String schemaName, String tableName,
            String columnName, Connection conn)
            throws SQLException
        {
            // Let Kodo's OracleDictionary do its thing
            Column[] cols = super.getColumns (meta, catalog, schemaName, tableName,
                columnName, conn);
            // Catch the columns with a type name of "TIMESTAMP(n)" and mark them as DATE types.
            // This is what the OracleDictionary intended to do, but was foiled by the
            // name which now has a precision.
            for (int i = 0; cols != null && i < cols.length; i++)
            {
                String tName = cols[i].getTypeName();
                if (tName != null && tName.startsWith("TIMESTAMP"))
                    cols[i].setType(Types.DATE);
            }
            return cols;
        }
    }
    ---- code ends
    Stephen Kim wrote:
    This is a bug (1111) with regards to specific combinations of the Oracle 10 driver and db.
    To work around the issue until the next release, getColumns (...) in OracleDictionary needs to be extended/modified to, instead of doing a strict equals() comparison to "TIMESTAMP", do a startsWith("TIMESTAMP").
