[Bug?] Hypertrend Draws 'Fake' Gaps in Data

Hi
I have noticed Hypertrend drawing gaps in the data (when there is actually data there). This can be confirmed when you zoom in on the data. Is this a known issue, or is there any workaround/configuration? The images are from Hypertrend in LabVIEW, but we have seen it in MAX too.
Cheers
-JG
Certified LabVIEW Architect * LabVIEW Champion

Here are some prior discussions on the subject...
http://forums.ni.com/t5/Real-Time-Measurement-and/Distributed-System-Manager-Historical-Trend/m-p/99...
Also see attachment for a Jing video of the problem.  This occurs in DSM 2009 and 2010 and on multiple machines.
Here is the final result after much wrangling with NI:
From: Mark Black [mailto:[email protected]]
Sent: Thursday, January 21, 2010 4:53 PM
To: Sachs, Michael A. (MSFC-ET30)[Intelligent Systems]
Cc: Roger Hebert
Subject: Re: DSM issues
Hi Mike,
We do have a CAR for this issue (#178809), but it's a general CAR for these
types of Hypertrend drawing issues that your case fits into. Currently
this bug is not targeted to be fixed for SP1, but I have contacted our Shanghai
development team for an update.  Have you seen this issue on multiple
different systems?  Have you seen this issue with plots not tied to your
GPS synched cRIO?
Thanks,
Mark Black
Product Support Engineer - LabVIEW R&D
National Instruments
[email protected]
(512) 683-8929
Attachments:
2010-10-18_0907.zip ‏3018 KB

Similar Messages

  • Possible Bug in Draw 1-Bit Pixmap(6_1).vi

    I have been observing this bug in Draw 1-Bit Pixmap(6-1).vi (part of picture.llb) in many LabVIEW versions. Every time I install a new LabVIEW version or update an existing one, the bug reappears. Please fix it once and for all. The array "Color Table" is not wired to Draw 1-Bit Pixmap.vi; it should be wired. I have attached a screen snapshot showing the bug and the comments. In your LabVIEW version, you can just open Draw 1-Bit Pixmap(6-1).vi, wire "Color Table", and save the VI back into its library.
    Attachments:
    Bug_Draw_1-Bit_Pixmap(6_1)-vi.jpg ‏239 KB

    You would probably be much better off breaking that VI and replacing it as needed with a non-deprecated version, something like Draw Unflattened Pixmap.

  • BUG 3.0/3.1 Import data from CSV in non english local

    Hello,
    I have 3.0 on a German XP (NLS Decimal "," and Group ".") but with
    AddVMOption -Duser.language=en
    because of bug 9231534
    I try to import data from a delimited file that contains the number "123,23"
    When I use the Import Data Wizard it generates an insert with "12323.0". This import works, but gives me the wrong values, 12323 instead of 123,23.
    I have 3.1 with the German UI and try the same import.
    It generates a correct insert, "123.23", but executing it fails because it expects the German decimal separator. This is wrong for two reasons:
    The generated script has a dot as the decimal separator, and it would not make sense to use a comma, because that is the value separator in an insert script.
    To execute the script, it should use the same NLS settings that were used to generate it.
    Regards
    Marcus

    Hello,
    has anybody found a solution?
    Testcase: SQL Developer 3.1 on a German XP with default NLS Settings
    CREATE TABLE "TEST_TABLE"
           "NUM" NUMBER
          ,"VCH"  VARCHAR2(10 BYTE)
        ) ;Test file test_insert.dsv
    num;vch
    1;KL
    1,5;tz
    12345,45;oo
    Importing using the wizard inserts the first row correctly, for the others I get
    SET DEFINE OFF
    --Einfügen für Zeilen  1  bis  3  nicht erfolgreich
    --ORA-01722: invalid number
    --Zeile 2
    INSERT INTO TEST_TABLE (NUM, VCH) VALUES (1.5,'tz');
    --Zeile 3
    INSERT INTO TEST_TABLE (NUM, VCH) VALUES (12345.45,'oo');
    Apart from the garbled umlaut in the message, the insert statement itself is correct, because you cannot use the German decimal separator "," in the script. The bug is that it should use the same NLS settings for generating and running the script.
    Regards
    Marcus
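    If the failure comes from an implicit string-to-number conversion under the German NLS settings, one possible workaround (a sketch of the idea only, not a confirmed fix) is to force the session's decimal separator to a dot before running the generated script, or to convert explicitly and name the separators in the call:
    -- Hypothetical workaround: make the session parse "." as the decimal separator
    -- so the generated inserts run unchanged on a German client.
    ALTER SESSION SET NLS_NUMERIC_CHARACTERS = '.,';
    -- Or convert a German-formatted string explicitly, stating the separators:
    SELECT TO_NUMBER('123,23', '999999D99',
                     'NLS_NUMERIC_CHARACTERS='',.''') AS num_value
      FROM dual;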

  • Bug in To do's - Due date won't change unless you go to "Other..."

    Bug in To do's - Due date won't change unless you go to "Other..." when you create or edit a to-do in Reminders>To Do's view
    Mail Version 3.0 (912.1/912)

    Another bug: The "No Date" option doesn't work properly. I have a smart mailbox set up to show all items dated plus/minus a year from now (i.e. all dated items) and four undated items showed up in the list. Selecting "no date" didn't work, so what I had to do was use the "other" option to set a date a couple of years back then go to iCal to set the items to "no date". That cleared them from the smart mailbox in Mail.

  • [svn:osmf:] 11205: Fix bug FM-169: Trait support for data transfer sample doesn' t display bytes loaded and bytes total for SWF element

    Revision: 11205
    Author:   [email protected]
    Date:     2009-10-27 15:04:26 -0700 (Tue, 27 Oct 2009)
    Log Message:
    Fix bug FM-169: Trait support for data transfer sample doesn't display bytes loaded and bytes total for SWF element
    Ticket Links:
        http://bugs.adobe.com/jira/browse/FM-169
    Modified Paths:
        osmf/trunk/apps/samples/framework/PluginSample/src/PluginSample.mxml
        osmf/trunk/apps/samples/framework/PluginSample/src/org/osmf/model/Model.as

    The bug is known, and a patch has been submitted: https://bugs.freedesktop.org/show_bug.cgi?id=80151. There's been no update since Friday, so I wonder what the current status is, or if it's up for review at all.
    Does anyone know how we can be notified when this patch hits the kernel?

  • Time Sparsity and Gaps in Data ???

    Hi,
    I have a cube with 7 dimensions and over 70 million rows. I have made all dimensions, including time, sparse. Now the problem is that there are gaps in my data for certain days after I maintain the cube, which seems strange. Also, the aggregation at lower levels (days) is not correct, whereas aggregations at higher levels (like year) are absolutely correct.
    On the other hand, if I don't make the time dimension sparse, then there are no gaps in the data and everything looks fine. All aggregations are fine in this case at every level.
    Can anyone explain this behaviour, please?
    Regards.


  • Gaps in data saved as LVM

    I am using 2 USB-6009s to collect 16 channels of analog data and the 2 devices are not synced.  The data files attached were collected at 1,000 Hz for 2 sec using the Write to Measurement File function.
    1. When the data were saved as a text (lvm) file [the csv file attached], there are gaps in the data.  However, there are no gaps when saved as an Excel file.
    2. When saved as Excel with 'absolute' timestamps, the time column starts with 00.000 and then resets to 00.000 at irregular intervals (as in the spreadsheet attached).  The time column has continuous time when the 'absolute' option is deselected.
    If the gaps are related to the non-synchronization of the 2 devices, why are there no gaps when saved as Excel files?  Thanks for your time.
    Attachments:
    Analog_16Ch_Timed_Datalogging_Dell.vi ‏90 KB
    AI16_1000Hz_2s(absolute timestamp).xlsx ‏126 KB
    AI16_1000Hz_2s.csv ‏304 KB

    Hi Riccardo:
    At your suggestion, I used Convert from Dynamic Data to a 2D array before saving the array using Write to Spreadsheet File.  When I collected 2 seconds of data at 1,000 Hz, there are fewer than 200 lines of data in the spreadsheet file (no gaps in the data, see the txt file attached). For comparison, the file saved using Write to Measurement File has close to 1,000 lines (with gaps in the data, see the xlsx file attached).  Thanks for your time.
    John
    Attachments:
    A16D1_1000Hz_2s.txt ‏20 KB
    A16D1_1000Hz_2s.xlsx ‏133 KB
    Analog_16AI_1DI_Timed_Datalogging(WriteSpreadsheet).vi ‏116 KB

  • Fill gaps in Dates

    I have a table like this
    DDL for table:
    CREATE TABLE TEST1
    (
      ID        VARCHAR2(20 BYTE),
      HH_START  DATE,
      HH_END    DATE
    );
    Data for table:
    SET DEFINE OFF;
    Insert into TEST1
       (ID, HH_START, HH_END)
    Values
       ('144813', TO_DATE('01/26/2010 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('01/30/2010 00:00:00', 'MM/DD/YYYY HH24:MI:SS'));
    Insert into TEST1
       (ID, HH_START, HH_END)
    Values
       ('144813', TO_DATE('01/30/2010 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('11/18/2010 00:00:00', 'MM/DD/YYYY HH24:MI:SS'));
    Insert into TEST1
       (ID, HH_START, HH_END)
    Values
       ('144813', TO_DATE('01/10/2011 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('01/11/2011 00:00:00', 'MM/DD/YYYY HH24:MI:SS'));
    Insert into TEST1
       (ID, HH_START, HH_END)
    Values
       ('944813', TO_DATE('01/26/2010 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('01/30/2010 00:00:00', 'MM/DD/YYYY HH24:MI:SS'));
    Insert into TEST1
       (ID, HH_START, HH_END)
    Values
       ('944813', TO_DATE('01/30/2010 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('11/18/2010 00:00:00', 'MM/DD/YYYY HH24:MI:SS'));
    Insert into TEST1
       (ID, HH_START, HH_END)
    Values
       ('944813', TO_DATE('01/10/2011 00:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('01/11/2011 00:00:00', 'MM/DD/YYYY HH24:MI:SS'));
    COMMIT;
    The table will contain multiple IDs; I just have two as a sample. The requirement is that we need to fill in the date gaps for each ID. For instance, the final data for 144813 should look like:
    ID     HH_START     HH_END
    144813     1/26/2010     1/30/2010
    144813     1/30/2010     11/18/2010
    144813     11/18/2010     1/10/2011
    144813     1/10/2011     1/11/2011
    The gaps could be in many places. Thank you for your inputs.

    select  id,
            hh_start,
            hh_end
      from  test1
    union all
    select  id,
            hh_start,
            hh_end
      from  (
             select  id,
                     lag(hh_end) over(partition by id order by hh_start) hh_start,
                     hh_start hh_end
               from  test1
            )
     where hh_start != hh_end
    order by id,
             hh_start;
    ID                   HH_START  HH_END
    144813               26-JAN-10 30-JAN-10
    144813               30-JAN-10 18-NOV-10
    144813               18-NOV-10 10-JAN-11
    144813               10-JAN-11 11-JAN-11
    944813               26-JAN-10 30-JAN-10
    944813               30-JAN-10 18-NOV-10
    944813               18-NOV-10 10-JAN-11
    944813               10-JAN-11 11-JAN-11
    8 rows selected.
    SY.
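    For comparison, a hedged alternative sketch (not from the thread) that builds the missing rows directly with LEAD instead of UNION ALL + LAG; it assumes the ranges for an ID never overlap:
    -- Each row pairs its own HH_END with the next HH_START for the same ID;
    -- where the two differ, that pair is exactly the missing gap row.
    select id, hh_end as hh_start, next_start as hh_end
      from (select id, hh_start, hh_end,
                   lead(hh_start) over (partition by id order by hh_start) next_start
              from test1)
     where next_start is not null
       and next_start != hh_end
    union all
    select id, hh_start, hh_end
      from test1
     order by 1, 2;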

  • Finding Gaps In Date Range

    I was recently asked to help create a query at my company to search for date gaps in employment status history. My table data looks similar to this:
    employee_id employment_status beg_date end_date
    1               Active               1990-01-01          1991-01-01
    1               Leave               1991-02-01          1993-06-03
    1               Active               1993-06-04          1995-02-01
    1               Fired               2000-06-01          2299-12-31
    So the gap I'm looking for would be from 1995-02-01 to 2000-06-01.
    Unfortunately, I don't have admin access to the database to be able to create an index or do any fancy PL/SQL; I'm pretty much limited to the most basic SQL possible.
    Any help appreciated!

    If your database supports analytic functions, the following query should give what you want.
    with sample_data as (
      select 1 employee_id, 'Active' employment_status, date '1990-01-01' beg_date, date '1991-01-01' end_date from dual union all
      select 1, 'Leave', date '1991-02-01', date '1993-06-03' from dual union all
      select 1, 'Active', date '1993-06-04', date '1995-02-01' from dual union all
      select 1, 'Fired', date '2000-06-01', date '2299-12-31' from dual
    )
    select employee_id,
           employment_status as last_status,
           end_date as gap_lower_bound,
           next_date as gap_upper_bound,
           next_status
    from (
      select t.*,
             lead(beg_date) over(partition by employee_id order by beg_date) next_date,
             lead(employment_status) over(partition by employee_id order by beg_date) next_status
      from sample_data t
    )
    where next_date > end_date + 1;
    EMPLOYEE_ID LAST_STATUS GAP_LOWER_BOUND GAP_UPPER_BOUND NEXT_STATUS
              1 Active      01/01/1991      01/02/1991      Leave
              1 Active      01/02/1995      01/06/2000      Fired
    Note #1 : the WITH clause is just there to generate some test data "on-the-fly", you can remove it and use your real table in place of SAMPLE_DATA in the main query.
    Note #2 : unless you made a typo, the gap 01/01/1991 to 01/02/1991 should also be retrieved.
    BTW, for specific questions about SQL or PL/SQL please use the {forum:id=75} forum.
    Edited by: odie_63 on 27 Feb 2011 17:16

  • How to find gaps in data?

    Hi!
    I have the following problem.
    Data:
    range_id actual_nr
    AAA 001AAA
    AAA 002AAA
    AAA 003AAA
    AAA 006AAA
    AAA 007AAA
    AAA 009AAA
    BBB 001BBB
    BBB 002BBB
    etc.
    I have to get report in the following form
    from to nr_of_rows
    001AAA 003AAA 3
    006AAA 007AAA 2
    009AAA 1
    001BBB 002BBB 2
    etc.
    As you can see, if there is a gap in the sequence, then I have to calculate how many rows were in the sequence before the gap.
    Can somebody give me some hints, or even a working statement?

    How's this?
    WITH
         Sample_Data
    AS
         (
          SELECT 'AAA' range_id, '001AAA' actual_nr FROM Dual UNION ALL
          SELECT 'AAA' range_id, '002AAA' actual_nr FROM Dual UNION ALL
          SELECT 'AAA' range_id, '003AAA' actual_nr FROM Dual UNION ALL
          SELECT 'AAA' range_id, '006AAA' actual_nr FROM Dual UNION ALL
          SELECT 'AAA' range_id, '007AAA' actual_nr FROM Dual UNION ALL
          SELECT 'AAA' range_id, '009AAA' actual_nr FROM Dual UNION ALL
          SELECT 'BBB' range_id, '001BBB' actual_nr FROM Dual UNION ALL
          SELECT 'BBB' range_id, '002BBB' actual_nr FROM Dual
         )
    SELECT
         MIN(actual_nr_start)     actual_nr_start,
         actual_nr_finish,
         MAX(Total)
    FROM
         (
          SELECT
              range_id,
              MIN(actual_nr)     actual_nr_start,
              MAX(actual_nr)     actual_nr_finish,
              MAX(LEVEL)     Total
          FROM
              Sample_Data
          CONNECT BY
              range_id          = PRIOR range_id
             AND     substr(actual_nr, 1, 3)     = substr(PRIOR actual_nr, 1, 3) + 1
          GROUP BY
              range_id,
              CONNECT_BY_ROOT actual_nr
         )
    GROUP BY
         actual_nr_finish
    ORDER BY
         SubStr(actual_nr_start, -3),
         actual_nr_start;
    Message was edited by:
    Brian Tkatch 2
    consolidated two levels.
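    As an aside, a hedged analytic alternative (not part of the original answer) that produces the same report without CONNECT BY, using the common "value minus row_number" grouping trick; it assumes the numeric part is always the first three characters and reuses the Sample_Data set above (or your real table):
    -- Rows whose numeric prefixes are consecutive share the same (prefix - row_number)
    -- value, so grouping on it collapses each unbroken run into one report line.
    SELECT MIN(actual_nr)  AS "from",
           MAX(actual_nr)  AS "to",
           COUNT(*)        AS nr_of_rows
      FROM (SELECT range_id,
                   actual_nr,
                   TO_NUMBER(SUBSTR(actual_nr, 1, 3))
                   - ROW_NUMBER() OVER (PARTITION BY range_id
                                        ORDER BY TO_NUMBER(SUBSTR(actual_nr, 1, 3))) AS grp
              FROM Sample_Data)
     GROUP BY range_id, grp
     ORDER BY range_id, MIN(actual_nr);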

  • Gaps in data

    Hi.  I am trying to save a lot of data at a high frequency:
    36 Analog Channels, 400 samples @ 4kHz per channel, each iteration.
    I am saving the data inside my loop, and this seems to cause a 20 ms gap after each pass through.  Is there a way to use the Write to LVM VI outside the loop?

    Yes.  Producer/consumer architecture.
    Search the forums for those words and you'll find numerous messages on it.

  • IPhoto Bug - Every time try to change date, date moves back 100 years

    You will like this bug. Somehow an incorrect date was entered on the videos in iPhoto. Now, we try to enter a correct date of 2/1/2011 using iPhoto->Photos->Adjust Date Time and the dates just keep going back approximately 100 years. Currently, the date shows June 3, 1221!
    We can't set the date on our videos.
    I just tried again, and now the date shows May 16th, 1153.
    Bummer, huh?

    A friend of mine had the same problem and asked me for a solution. I concluded that when you ask to adjust the date and time, iPhoto calculates the number of seconds by which each picture has to be adjusted. That number of seconds is stored in a 32-bit signed integer variable, which means the maximum number of seconds we can store is 2,147,483,647, which corresponds to 68 years, 35 days, 3 hours, 14 minutes and 7 seconds. If the adjustment you are trying to make is smaller than that, it will work; if it's bigger, the result is uncertain. There is an easy solution to the problem, but it is Apple that has to implement it, by changing the 32-bit variable that stores those seconds to a 64-bit one.
    Therec.
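    A quick arithmetic check of that 68-year figure (a sketch only; any calculator will do, SQL is used here just to match the rest of this page):
    -- 2^31 - 1 = 2,147,483,647 seconds, about 24,855 whole days or roughly 68.05 years,
    -- which matches the "68 years, 35 days, 3 hours, 14 minutes and 7 seconds" above.
    SELECT POWER(2, 31) - 1                              AS max_seconds,
           TRUNC((POWER(2, 31) - 1) / 86400)             AS whole_days,
           ROUND((POWER(2, 31) - 1) / 86400 / 365.25, 2) AS approx_years
      FROM dual;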

  • Bug or feature: APEX_ITEM.DISPLAY_AND_SAVE (1, DATE) is sort by alphanumeric...???

    Hi,
    my select is like this:
    select
         APEX_ITEM.DISPLAY_AND_SAVE (1, STICHTAG) as STICHTAG, --Datatype is date
         APEX_ITEM.DISPLAY_AND_SAVE (2, Name) as Name,
         APEX_ITEM.DISPLAY_AND_SAVE (3, Animal) as animal
    From XYZ;
    If I want to sort ascending/descending by STICHTAG, APEX sorts it alphanumerically, not by the highest or lowest date. Why does APEX do this, and how can I fix it so I can sort by highest or lowest date?
    THX,
    René

    RWErene81 wrote:
    my select is like this:
    select
         APEX_ITEM.DISPLAY_AND_SAVE (1, STICHTAG) as STICHTAG, --Datatype is date
         APEX_ITEM.DISPLAY_AND_SAVE (2, Name) as Name,
         APEX_ITEM.DISPLAY_AND_SAVE (3, Animal) as animal
    From XYZ;
    If I want to sort ascending/descending by STICHTAG, APEX sorts it alphanumerically, not by the highest or lowest date. Why does APEX do this, and how can I fix it so I can sort by highest or lowest date?
    Previously asked and answered: Date format in apex items
    This is not a bug. The content of the column is a character string generated by APEX_ITEM.DISPLAY_AND_SAVE.
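    One common workaround (a sketch only, not from this thread; column names taken from the post above): also select the raw DATE column, and let the report sort on that genuinely date-typed column instead of the generated character string. The hidden/sortable column settings are then adjusted in the report attributes.
    select APEX_ITEM.DISPLAY_AND_SAVE (1, STICHTAG) as STICHTAG_DISPLAY,
           STICHTAG                                 as STICHTAG_SORT,  -- real DATE value, sorts chronologically
           APEX_ITEM.DISPLAY_AND_SAVE (2, Name)     as Name,
           APEX_ITEM.DISPLAY_AND_SAVE (3, Animal)   as Animal
      from XYZ
     order by STICHTAG_SORT;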

  • Bug or DUE attempting navigation to Date field in query

    I am encountering trouble executing a query that involves
    navigating a one-way one-to-one relationship
    (persistent reference to PC object) to use a Date field.
    The query is the following: "period.endingDate <=
    p.endingDate" with a query class of Vote. The right side of the
    query has been varied and varying it yields different errors,
    but the query will not execute. The parameter is defined as:
    "VotingPeriod p". The import string is present and appears
    okay.
    For the current form, the error is:
    javax.jdo.JDOUserException: The given filter/ordering String
    "period.endingDate <= p.endingDate" is not valid.
    Make sure all parentheses are properly matched and that the filter uses
    proper Java syntax.
         at com.solarmetric.kodo.query.FilterParser.evaluate(FilterParser.java:536)
         at com.solarmetric.kodo.query.QueryImpl.getExpression(QueryImpl.java:371)
         at com.solarmetric.kodo.impl.jdbc.runtime.JDBCQuery.getExpression(JDBCQuery.java:49)
         at com.solarmetric.kodo.query.QueryImpl.executeWithMap(QueryImpl.java:312)
         at com.solarmetric.kodo.query.QueryImpl.executeWithArray(QueryImpl.java:393)
         at com.solarmetric.kodo.query.QueryImpl.execute(QueryImpl.java:285)
    The Vote class is defined as:
    public class Vote
    // The pair (topic, period) is unique for all votes
    private Topic topic;
    private VotingPeriod period;
    private int count;
    The VotingPeriod class is defined as:
    public class VotingPeriod
    // the pair (startingDate, endingDate) are unique for all voting periods
    private Date startingDate;
    private Date endingDate;
    If the query is modified to be: "period.endingDate <= d" and
    the parameter is declared as "java.util.Date d", then the error
    is:
    javax.jdo.JDOException: com.ysoft.jdo.book.voting.VotingPeriod
    NestedThrowables:
    java.io.NotSerializableException: com.ysoft.jdo.book.voting.VotingPeriod
         at
    com.solarmetric.kodo.impl.jdbc.schema.dict.GenericDictionary.toSQL(GenericDictionary.java:122)
         at
    com.solarmetric.kodo.impl.jdbc.runtime.JDBCExpressionFactory$Constant.<init>(JDBCExpressionFactory.java:419)
         at
    com.solarmetric.kodo.impl.jdbc.runtime.JDBCExpressionFactory.getConstant(JDBCExpressionFactory.java:213)
         at com.solarmetric.kodo.query.FilterParser.eval(FilterParser.java:562)
         at com.solarmetric.kodo.query.FilterParser.getValue(FilterParser.java:668)
         at com.solarmetric.kodo.query.FilterParser.eval(FilterParser.java:601)
         at com.solarmetric.kodo.query.FilterParser.getExpression(FilterParser.java:677)
         at com.solarmetric.kodo.query.FilterParser.evaluate(FilterParser.java:527)
         at com.solarmetric.kodo.query.QueryImpl.getExpression(QueryImpl.java:371)
         at com.solarmetric.kodo.impl.jdbc.runtime.JDBCQuery.getExpression(JDBCQuery.java:49)
         at com.solarmetric.kodo.query.QueryImpl.executeWithMap(QueryImpl.java:312)
         at com.solarmetric.kodo.query.QueryImpl.executeWithArray(QueryImpl.java:393)
         at com.solarmetric.kodo.query.QueryImpl.execute(QueryImpl.java:285)
    David Ezzio
    Yankee Software

    Hi Patrick,
    Okay, there is a bug. Your parser doesn't know how to handle the ProxyDate class. So if a "Date"
    is passed into the query that came from a persistent object, then a ProxyDate object is passed. If
    the date is a simple "Date" object then all is well. For the simple date, the parser constructs a
    query in nice JDBC escape syntax, but for ProxyDate objects, it generates garbage.
    David

  • SQL Lead & Lag to find gap between dates

    I have a table with two columns, event_start and event_end:
    CREATE TABLE MYBATCHTAB
    (
      EVENT_START  DATE                             NOT NULL,
      EVENT_END    DATE                             NOT NULL
    );
    and my data is:
    Insert into MYBATCHTAB
       (EVENT_START, EVENT_END)
    Values
       (TO_DATE('08/12/2013 22:45:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('08/12/2013 23:55:00', 'MM/DD/YYYY HH24:MI:SS'));
    Insert into MYBATCHTAB
       (EVENT_START, EVENT_END)
    Values
       (TO_DATE('08/12/2013 15:30:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('08/12/2013 17:00:00', 'MM/DD/YYYY HH24:MI:SS'));
    Insert into MYBATCHTAB
       (EVENT_START, EVENT_END)
    Values
       (TO_DATE('08/12/2013 16:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('08/12/2013 17:30:00', 'MM/DD/YYYY HH24:MI:SS'));
    Insert into MYBATCHTAB
       (EVENT_START, EVENT_END)
    Values
       (TO_DATE('08/12/2013 20:00:00', 'MM/DD/YYYY HH24:MI:SS'), TO_DATE('08/12/2013 22:00:00', 'MM/DD/YYYY HH24:MI:SS'));
    COMMIT;
    Event Start            Event End
    08/12/2013 15:30:00    08/12/2013 17:00:00
    08/12/2013 16:00:00    08/12/2013 17:30:00
    08/12/2013 20:00:00    08/12/2013 22:00:00
    08/12/2013 22:45:00    08/12/2013 23:55:00
    and I want to find the first whole start-end period: in this example, start 15:30 - end 17:30 (merging records 1 & 2),
    but not the third one, for example not 15:30 - 22:00 or 15:30 - 23:55, because there are gaps between the end dates.
    How can I do this using LEAD and LAG?
    I'm not sure if this is the best approach.

    Maybe a baby-step solution
    select event_start,event_end
      from (select event_start,
                   case when overlap is not null
                        then case when lead(overlap,1) over (order by event_start) is not null
                                  then lead(event_end,1) over (order by event_start)
                                  when lag(overlap,1) over (order by event_start) is not null
                                  then null
                             end
                        else event_end
                   end event_end
              from (select event_start,event_end,
                           case when lead(event_start,1) over (order by event_start) <= event_end
                                  or lag(event_end,1) over (order by event_start) >= event_start
                                then 'overlap'
                           end overlap
                      from mybatchtab))
     where event_end is not null
     order by event_start;
    EVENT_START            EVENT_END
    08/12/2013 15:30:00    08/12/2013 17:30:00
    08/12/2013 20:00:00    08/12/2013 22:00:00
    08/12/2013 22:45:00    08/12/2013 23:55:00
    or when there can be more than two consecutive overlaps
    select event_start,
           event_end
      from (select case when lag_overlap is null
                        then event_start
                   end event_start,
                   case when coalesce(lead_overlap,lag_overlap) is null
                        then event_end
                        when lag_overlap is null and lead_overlap is not null
                        then lead(event_end) over (order by event_start)
                   end event_end
              from (select event_start,event_end,
                           case when event_start < lag(event_end) over (order by event_start)
                                then 'overlap'
                           end lag_overlap,
                           case when event_end > lead(event_start) over (order by event_start)
                                then 'overlap'
                           end lead_overlap
                      from mybatchtab)
             where lead_overlap is null
                or lag_overlap is null)
     where event_start is not null
       and event_end is not null
     order by event_start;
    Regards
    Etbin
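    For what it's worth, a hedged alternative sketch (not from the thread) that merges any number of consecutive overlaps using the classic "start of group" analytic pattern; the first row of the result is the first whole start-end period asked for above:
    -- A new group starts whenever an event begins after every earlier event has ended;
    -- grouping on the running count of such starts merges each chain of overlaps.
    select min(event_start) as event_start,
           max(event_end)   as event_end
      from (select event_start, event_end,
                   sum(new_grp) over (order by event_start, event_end) as grp
              from (select event_start, event_end,
                           case when event_start >
                                     max(event_end) over (order by event_start, event_end
                                                          rows between unbounded preceding
                                                                   and 1 preceding)
                                then 1 else 0 end as new_grp
                      from mybatchtab))
     group by grp
     order by min(event_start);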
