Solution for Overlapping dates

Hi
I am looking for an easy solution to the problem described below.
I have the following records, grouped by party key:
PartyKey  AccountId  TaxType  DateStart   DateEnd
123       145        100      01/01/2001  12/31/2002
123       145        100      05/01/2001  05/31/2001
123       145        100      01/01/2002  11/30/2002
123       145        100      01/01/2003  12/31/2007
124       111        300      01/01/2001  12/31/2006
125       222        260      04/01/2001  11/30/2001
125       222        260      01/01/2001  12/31/2001
I want to select all the records whose dates overlap with another record for the same party key (and the other key columns), and also list the records that have no overlapping dates.
For example, for party key 123 the first three records have overlapping dates, and for party key 125 both records have overlapping dates.
Party key 124 has no overlapping dates, and the last record of party key 123 does not overlap the first three records.
Can anybody help me build the code? It's very urgent, please.
Thanks
Rajesh

My bad, I had to head out shortly so I rushed and introduced some copy-and-paste errors... This should be what you're after. Hopefully you'll take the time to understand it, since you'll have to support it :)
ME_XE?select
  2     id_internal,cd_type_acct,cd_type_addr,id_acct,  dt_effect, dt_end,
  3     case when
  4        (dt_effect between lead(dt_effect) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) AND lead(dt_end) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) )
  5        OR
  6        (dt_end between lead(dt_effect) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) AND lead(dt_end) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) )
  7        OR
  8        (dt_effect between lag(dt_effect) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) AND lag(dt_end) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) )
  9        OR
10        (dt_end between lag(dt_effect) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) AND lag(dt_end) over (partition by id_internal,cd_type_acct,cd_type_addr,id_acct order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc) )
11    then
12     'Overlap'
13    else
14     'No Overlap'
15    end as overlaps
16  from
17  (
18     select *
19     from MV_ITS_PARTY_ADDRESS_EXCEP
20     order by id_internal,cd_type_acct,cd_type_addr,id_acct, dt_effect asc
21  );
       ID_INTERNAL       CD_TYPE_ACCT       CD_TYPE_ADDR ID_ACCT                              DT_EFFECT  DT_END     OVERLAPS
             93535       300000042829                 10 300                                  10/01/2000 10/31/2000 Overlap
             93535       300000042829                 10 300                                  10/01/2000 10/31/2000 Overlap
             93535       300000042829                 10 300                                  11/01/2000 12/31/2001 No Overlap
       20001115051                  0                 60 0                                    01/01/2004 12/31/2004 No Overlap
       20001115051                  0                 60 0                                    01/01/2005 12/31/2005 No Overlap
       20001115051                  0                 60 0                                    01/01/2006 02/27/2006 Overlap
       20001115051                  0                 60 0                                    01/01/2006 02/27/2006 Overlap
       20001115051                  0                 60 0                                    01/01/2006 02/27/2006 Overlap
       20001115051                  0                 60 0                                    01/01/2006 02/27/2006 Overlap
       20001115051                  0                 60 0                                    01/01/2006 02/27/2006 Overlap
       20001115051                  0                 60 0                                    02/28/2006 12/31/9999 No Overlap
11 rows selected.
Elapsed: 00:00:00.18
ME_XE?
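For reference, here is a minimal EXISTS-based sketch that checks each row against every other row in the same key group, rather than only its immediate neighbours. The table name (party_dates) and column names are only assumptions taken from the sample data in the question; two ranges overlap when each one starts on or before the other one ends.

select p.*,
       case
         when exists (select 1
                        from party_dates q
                       where q.partykey   = p.partykey
                         and q.account_id = p.account_id
                         and q.taxtype    = p.taxtype
                         and q.rowid     <> p.rowid          -- any other row in the same group
                         and q.date_start <= p.date_end      -- the two date ranges overlap
                         and q.date_end   >= p.date_start)
         then 'Overlap'
         else 'No Overlap'
       end as overlaps
  from party_dates p;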
Message was edited by: Tubby

Similar Messages

  • Reg:Efficient solution for a data upload scenario

    Hi All,
            I have the following task.
I need to move data from a legacy system (which generates data only in the form of flat files) to SAP R3 as FB01 journals, and the output file should be generated periodically (daily, weekly, fortnightly, etc.).
    Solution Approaches:
1) Write a BDC program to extract the data.
2) Write an ABAP program to populate an IDoc (if a standard IDoc is available) or generate an outbound proxy (if a standard IDoc is not available) to push the data into SAP XI.
Could anyone tell me which would be the best and most efficient approach for this task? I need your recommendations.
    Thanks in Advance.
    B.Lavanya
    Edited by: Lavanya Balanandham on Mar 31, 2008 2:23 PM

    Hi Lavanya,
Required data from a legacy system (generating data only in the form of flat files) to SAP R3 as FB01 journals - use BDC for this, because it is better for large source files.
The output file should be generated periodically (daily, weekly, fortnightly, etc.) - if this output file contains an acknowledgment for the data uploaded by the above process, create an ABAP report for it and schedule it. But if this output contains other IDoc data which you need to send as a file to a third-party system, then go for SAP XI, provided the IDoc data is not too large. If the IDoc size is huge, just create an ABAP report that writes the data to a file on the application server and FTP the file to the third-party system.
    Regards,
    Rajeev Gupta

  • IP mobility solution for a Data Centers

    Hello guys
    Please give me an advice.
I am trying to work out a networking solution for two geographically separated data centers (DCs).
    Both the data centers will contain a virtualised server infrastructure (VMware).
The main design problem is choosing a good solution for IP mobility and for the DC interconnect. We need the ability to move VMs between the DCs, both for maintenance and for disaster recovery, and users must be connected to the moved VMs seamlessly, without any advanced knowledge of routing. Today the users work in 1100 different offices around Ukraine.
Yes, I know a little about the Cisco OTV, RHI and LISP solutions, but all of these require top-level switches and routers like the Nexus 7000 and Catalyst 6500, and they are very costly.
Is there a cheaper solution for IP mobility?
Maybe the NHRP protocol that is part of DMVPN could be a good solution?
Are there any features in NHRP, or in another protocol, for monitoring the health of a VM and for customising and propagating the routing information when a VM moves?
    Thank you for help.


  • Is Workspace Manager the right solution for history data only

I've been experimenting a bit with WM, just to enable history on all tables in an application we have with about 100 tables.
Everything works fine, except that we have lots of FK constraints that require versioning to be enabled in a particular order.
We know that changes to the DB layout will occur as the app develops with business requirements.
My question is: is WM the right solution here, as I don't see any way to "transport" historical data in the event of a migration from the old DB layout to a new layout (DB refactoring)?
    /christian

    Hi Christian,
    When you execute enableversioning on the tables, you can specify the complete set of tables that you want to version, instead of having to maintain the correct table order.
    There shouldn't be a need to transport the data between tables. We support DDL changes on versioned tables by using the beginDDL/commitDDL procedures. If you do require the transport of data between databases, then note that we only support a full database import/export. The user guide covers both of these topics in greater detail.
    Regards,
    Ben
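A minimal sketch of the calls mentioned above, with hypothetical table names; EnableVersioning accepts a comma-separated list of related tables, and DDL on a versioned table goes through BeginDDL/CommitDDL and the generated _LTS skeleton table, as described in the Workspace Manager guide:

BEGIN
  -- version the related tables as one set, keeping the full modification history
  DBMS_WM.EnableVersioning('customers,orders,order_items', 'VIEW_WO_OVERWRITE');
END;
/

BEGIN
  DBMS_WM.BeginDDL('orders');
END;
/
ALTER TABLE orders_lts ADD (priority NUMBER);  -- DDL is applied to the skeleton table
BEGIN
  DBMS_WM.CommitDDL('orders');
END;
/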

  • Looking for a proper analytical query/solution for below data

    I have data as shown below:
    mob_id     status_code     status_text     media_date
    39585     600     Online     04-Aug-09
    54988     600     Online     05-Aug-09
    39585     600     Online     05-Aug-09
    39585     600     Online     06-Aug-09
    39585     600     Online     07-Aug-09
    39585     600     Online     08-Aug-09
    39585     600     Online     09-Aug-09
    39585     600     Online     10-Aug-09
    39585     600     Online     11-Aug-09
    39585     600     Online     12-Aug-09
    39585     600     Online     13-Aug-09
    39585     600     Online     14-Aug-09
    39585     600     Online     15-Aug-09
    39585     600     Online     16-Aug-09
    39585     700     Deinstall     17-Aug-09
    54988     600     Online     06-Aug-09
    54988     600     Online     07-Aug-09
    54988     600     Online     08-Aug-09
    54988     600     Online     09-Aug-09
    54988     600     Online     10-Aug-09
    54988     600     Online     11-Aug-09
    54988     600     Online     12-Aug-09
    54988     600     Online     13-Aug-09
    54988     600     Online     14-Aug-09
    54988     600     Online     15-Aug-09
    54988     600     Online     16-Aug-09
    39585     600     Online     20-Aug-09
    39585     600     Online     21-Aug-09
    39585     600     Online     22-Aug-09
    39585     600     Online     23-Aug-09
    39585     600     Online     24-Aug-09
    39585     600     Online     25-Aug-09
    39585     700     Deinstall     26-Aug-09
    39585     600     Online     27-Aug-09
    39585     600     Online     28-Aug-09
    39585     600     Online     29-Aug-09
    39585     600     Online     30-Aug-09
    39585     600     Online     31-Aug-09
    39585     600     Online     01-Sep-09
    39585     700     Deinstall     02-Sep-09
    54988     600     Online     17-Aug-09
    54988     600     Online     18-Aug-09
    54988     600     Online     19-Aug-09
    54988     600     Online     20-Aug-09
    54988     600     Online     21-Aug-09
    54988     600     Online     22-Aug-09
    54988     600     Online     23-Aug-09
    54988     600     Online     24-Aug-09
    54988     600     Online     25-Aug-09
    54988     700     Deinstall     26-Aug-09
    69875     600     Online     20-Aug-09
    69875     600     Online     21-Aug-09
    69875     600     Online     22-Aug-09
    69875     600     Online     23-Aug-09
    69875     600     Online     24-Aug-09
    69875     600     Online     25-Aug-09
    69875     600     Online     26-Aug-09
Using the above data I need to produce the result set below. Can anyone help with this?
    occurrnace_seq     mob_id     start_media_date     end_media_date     no_of_days
    1     39585     04-Aug-09     17-Aug-09     13
    2     39585     20-Aug-09     26-Aug-09     6
    3     39585     27-Aug-09     02-Sep-09     6
    1     54988     05-Aug-09     26-Aug-09     21
    1     69875     20-Aug-09          null null
Here start_media_date corresponds to status_code = 600 and end_media_date to status_code = 700.
Note that the same mob_id can start multiple times.
Can anyone help me produce this result using SQL or PL/SQL?
    Many thanks in advance.
    Thanks
    Guttiwas

guttis wrote:
Can I run this query on 70 million records? Does it raise any performance problems? If you have any ideas, please throw out some possible suggestions to protect against such issues.

Well, you can certainly run it on 70 million records. How long it will run depends on your hardware, Oracle and OS settings. That said, there is a simpler solution:
    select  occurrenace_seq,
            mob_id,
            min(case grp when 'start-of-group' then media_date end) start_media_date,
            max(case grp when 'end-of-group' then media_date end) end_media_date,
            max(case grp when 'end-of-group' then media_date end) - min(case grp when 'start-of-group' then media_date end) no_of_days
      from  (
             select  t.*,
                     case
                       when status_text = 'Deinstall' then 'end-of-group'
                       when lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) = 'Deinstall' then 'start-of-group'
                     end grp,
                     sum(case status_text when 'Deinstall' then 1 else 0 end) over(partition by mob_id order by media_date) +
                     case lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) when 'Deinstall' then 1 else 0 end occurrenace_seq
           from  your_table t
        )
  where grp in ('start-of-group','end-of-group')
      group by mob_id,
               occurrenace_seq
      order by mob_id,
               occurrenace_seq
/

With your sample:
    with t as (
               select 39585 mob_id,600 status_code,'Online' status_text, to_date('04-Aug-09','dd-mon-yy') media_date from dual union all
               select 54988,600,'Online',to_date('05-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('05-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('06-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('07-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('08-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('09-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('10-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('11-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('12-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('13-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('14-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('15-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('16-Aug-09','dd-mon-yy') from dual union all
               select 39585,700,'Deinstall', to_date('17-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('06-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('07-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('08-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('09-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('10-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('11-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('12-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('13-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('14-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('15-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('16-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('20-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('21-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('22-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('23-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('24-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('25-Aug-09','dd-mon-yy') from dual union all
               select 39585,700,'Deinstall', to_date('26-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('27-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('28-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('29-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('30-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('31-Aug-09','dd-mon-yy') from dual union all
               select 39585,600,'Online',to_date('01-Sep-09','dd-mon-yy') from dual union all
               select 39585,700,'Deinstall', to_date('02-Sep-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('17-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('18-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('19-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('20-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('21-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('22-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('23-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('24-Aug-09','dd-mon-yy') from dual union all
               select 54988,600,'Online',to_date('25-Aug-09','dd-mon-yy') from dual union all
               select 54988,700,'Deinstall', to_date('26-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('20-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('21-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('22-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('23-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('24-Aug-09','dd-mon-yy') from dual union all
               select 69875,600,'Online',to_date('25-Aug-09','dd-mon-yy') from dual union all
           select 69875,600,'Online',to_date('26-Aug-09','dd-mon-yy') from dual
          )
    select  occurrenace_seq,
            mob_id,
            min(case grp when 'start-of-group' then media_date end) start_media_date,
            max(case grp when 'end-of-group' then media_date end) end_media_date,
            max(case grp when 'end-of-group' then media_date end) - min(case grp when 'start-of-group' then media_date end) no_of_days
      from  (
             select  t.*,
                     case
                       when status_text = 'Deinstall' then 'end-of-group'
                       when lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) = 'Deinstall' then 'start-of-group'
                     end grp,
                     sum(case status_text when 'Deinstall' then 1 else 0 end) over(partition by mob_id order by media_date) +
                     case lag(status_text,1,'Deinstall') over(partition by mob_id order by media_date) when 'Deinstall' then 1 else 0 end occurrenace_seq
            from  t
         )
  where grp in ('start-of-group','end-of-group')
      group by mob_id,
               occurrenace_seq
      order by mob_id,
            occurrenace_seq
/
    OCCURRENACE_SEQ     MOB_ID START_MED END_MEDIA NO_OF_DAYS
                  1      39585 04-AUG-09 17-AUG-09         13
                  2      39585 20-AUG-09 26-AUG-09          6
                  3      39585 27-AUG-09 02-SEP-09          6
                  1      54988 05-AUG-09 26-AUG-09         21
                  1      69875 20-AUG-09
SQL>
SY.

  • Solution for mass data import to VB02

    hello
Do you have any procedure for mass data import into the VB02 exclusion list for the ZSD1 3rd list key
(sales dept / distribution channel / material / customer)?
We don't want to do that manually - more than 600 records.
Do you have any idea how to import it - any easier way?

Hi,
You can import the records with either LSMW or a BAPI, whichever you are more comfortable with.
LSMW has 14 steps, as I hope you know.
Thanks
Surya

  • XML Solutions for Large Data Sets

    Hi,
    I'm working with a large data set (9 million records comprising 36 gigabytes) and am exploring the use of XML with it.
I've experimented with a JDBC app (taken straight from Steve Muench's excellent Oracle_XML_Applications) for writing to CLOBs, but achieve throughputs of much less than 40k/s (the minimum speed required to process the data in < 10 days).
    What kind of throughputs are possible loading XML records from CLOBs into multiple tables (using server-side Java apps)?
    Could anyone comment whether XML is a feasible possibility for this size data set?
    Regards,
    Mike

    Just would like to identify myself (I'm the submitter):
    Michael Driscoll <[email protected]>.

  • Jquery to check for overlapping of dates

    Hi All,
I have a conference room booking calendar on a MOSS 2007 site. I want to check for overlapping dates, i.e. nobody should be able to book the conference room for a time period that has already been booked by someone else. I need to include the jQuery code in the web part.
    Kindly help me on the same.

    Hi,
I think this is possible if you have the necessary datetime fields and if you incorporate some JavaScript:
http://spjsblog.com/2012/02/25/spjs-resource-management-no-more-double-booking/
With some server-side code (which I believe could also be replaced by client-side JavaScript):
http://aleemsheikh.wordpress.com/2011/08/08/warn-users-of-double-booking-in-sharepoint-calendar-in-real-time/
An alternative would be either an event receiver or a workflow:
    https://www.nothingbutsharepoint.com/sites/eusp/Pages/SharePoint-Double-Booking-with-the-Calendar.aspx
    http://sharepoint.stackexchange.com/questions/18086/calendar-items-at-same-time

  • Overlapping date problem, I'm a bit stumped...

    Hi,
    My first post here, woohooo!!
I'm trying to write some SQL for overlapping dates, and I'm hoping someone can point me in the right direction because I'm a bit stuck :(
    Heres the scenario....
Tables are similar to this (although I've changed them for the sake of this example).
    TABLE A, TABLE B, TABLE C look like this
    - EMPLOYEE_NUMBER VARCHAR2
    - START_DATE DATE
    - END_DATE DATE
These three tables contain information on an employee's working periods. The dates in each table can overlap each other.
    For example
    TABLE A
    AAAA, 05-10-1991, 05-10-2020
    AAAA, 05-01-2000, 20-05-2003
    BBBB, 09-12-1999, 02-07-2009
    TABLE B
    AAAA, 01-11-2002, 01-09-2003
    TABLE C
    AAAA, 15-08-2003, 12-05-2004
Here's the tricky bit: there is a hierarchy. Any dates in table C override the dates in tables A and B, and table B overrides any dates in table A. I need a query to break up the dates to show one consecutive timeline for any given employee. The query would obviously need to use all three tables.
    Therefore I would need a query to output the following for Employee AAAA.
    EMPLOYEE, START_DATE, END_DATE
    AAAA, 05-10-1991, 04-01-2000
    AAAA, 05-01-2000, 31-10-2002
    AAAA, 01-11-2002, 14-08-2003
    AAAA, 15-08-2003, 12-05-2004
    AAAA, 13-05-2004, 05-10-2020
I've worked out how to write this using PL/SQL, although it's pretty feral code; I'm hoping there might be a cleaner method using plain SQL. Any ideas, anyone?
    Thanks!
    Edited by: user2322009 on 1/03/2009 21:17

In addition to Avinash's solution, and using the data that you provided:
    with t as (SELECT 'AAAA' emp_id, TO_DATE('05-10-1991','DD-MM-YYYY') start_dt, TO_DATE('05-10-2020','DD-MM-YYYY') end_dt FROM DUAL UNION ALL
        SELECT 'AAAA', TO_DATE('05-01-2000','DD-MM-YYYY'), TO_DATE('20-05-2003','DD-MM-YYYY') FROM DUAL UNION ALL
        SELECT 'AAAA', TO_DATE('01-11-2002','DD-MM-YYYY'), TO_DATE('01-09-2003','DD-MM-YYYY') FROM DUAL UNION ALL
        SELECT 'AAAA', TO_DATE('15-08-2003','DD-MM-YYYY'), TO_DATE('12-05-2004','DD-MM-YYYY') FROM DUAL )
    , t1 as (    select emp_id,max(end_dt) m_dt from t
                group by emp_id)
    , t2 as (select t1.emp_id, max(end_dt) m_dt from t,t1
                 where t.emp_id = t1.emp_id
                 and end_dt <> m_dt
             group by t1.emp_id
            )
SELECT emp_id,start_DT, CASE WHEN start_dt_pr < end_dt THEN start_dt_pr -1 ELSE end_dt END  FROM (
        SELECT emp_id,start_dt,NVL(LEAD(start_dt,1) OVER(ORDER BY start_dt),end_dt)  start_dt_pr, end_dt FROM (
        SELECT emp_id,start_dt,end_dt FROM T ORDER BY start_dt ))
    union all
    select t1.emp_id,t2.m_dt + 1, t1.m_dt from t1,t2
    where t1.emp_id = t2.emp_id;

    See if this helps.
    Avinash's solution was missing one record (the last one) when I used the data you provided.
    The following was added to get the last record. Experts here will correct me if this is not the right way to do it.
    , t1 as (    select emp_id,max(end_dt) m_dt from t
                group by emp_id)
    , t2 as (select t1.emp_id, max(end_dt) m_dt from t,t1
                 where t.emp_id = t1.emp_id
                 and end_dt <> m_dt
                 group by t1.emp_id
                 )

    Alvinder

  • Looking for a query to find first/last dates in overlapping dates...

    Hi,
I'm looking for a query to find the first and last dates in a table containing overlapping dates.
I have a subscription table which has, for each customer, start and end dates for different subscriptions.
I want to know the different date ranges during which there are active subscriptions.
    so if the table has this:
    CustID, Start date, end date
    1, 2008-01-01, 2012-06-06
1, 2009-01-01, 2011-01-01
    1, 2011-01-01, 2013-02-02
    1, 2013-01-01, 2013-08-08
    1, 2014-01-01, 2014-04-04
    I want to produce this result:
    custid, range start, range end
    1, 2008-01-01, 2013-08-08
    1, 2014-01-01, 2014-04-04
    the first row is the range identified from the 4 rows in my subscription table.
    thanks :)

    I think I found it...
    http://stackoverflow.com/questions/5213484/eliminate-and-reduce-overlapping-date-ranges
    let me try this method
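In case it helps, a sketch of that approach using window functions (the table name subscriptions and the column names are assumed from the sample above; it needs a version with windowed aggregate support, e.g. SQL Server 2012 or later):

with flagged as (
  select custid, start_date, end_date,
         case when start_date > max(end_date) over (
                     partition by custid
                     order by start_date, end_date
                     rows between unbounded preceding and 1 preceding)
              then 1 else 0
         end as island_start            -- 1 when a row starts after every earlier range has ended
  from subscriptions
),
grouped as (
  select custid, start_date, end_date,
         sum(island_start) over (
           partition by custid
           order by start_date, end_date
           rows unbounded preceding) as island_id   -- running count groups overlapping rows together
  from flagged
)
select custid,
       min(start_date) as range_start,
       max(end_date)   as range_end
from grouped
group by custid, island_id
order by custid, range_start;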
    Hi,
I'm writing to follow up with you on this post. Thanks for posting a reply to share your workaround. Was the problem resolved by the link above? If you are satisfied with that solution, I'd like to mark this issue as "Answered".
Please also feel free to unmark the issue with any new findings or concerns you may have.
Thanks,
Sofiya Li
TechNet Community Support

  • TS3297 "iTunes was unable to load data class information from Sync Services." What is this solution for the new phone to sync up with my iTunes acct?

    "iTunes was unable to load data class information from Sync Services." What is this solution for this new phone to sync up with my iTunes acct?

    Hi tadhunt,
    Here is an article that directly addresses that error message:
    iTunes for Windows: "Unable to load data class" or "Unable to load provider data" sync services alert
    http://support.apple.com/kb/ts2690
    I hope this helps!
    - Ari

  • Need solution for the following sync error: "iTunes could not sync calendars to the iPad "iPad name" because an error occurred while sending data from the iPad"

    Need solution for the following sync error: "iTunes could not sync calendars to the iPad "iPad name" because an error occurred while sending data from the iPad"

I want to add that I deleted all the old backups and created a new backup without any issues, except for the sync problem.

  • EIS - Is this the solution for getting to Alpha data?

    Is there any other way to get to alpha data(stored in FISERV), through Essbase?

    As Jeff mentioned, unfortunately, the info provided is incorrect. Most external hard drives do not have the ability to act as a host device. Meaning, when you plug your camera into a computer, it generally mounts the camera as a drive (or provides similar functionality). You then use your computer to pull the information from the camera. If you simply connect an external drive directly to the camera (as this solution suggests), there is no way to get the data from the camera to the drive. You can generally daisy chain firewire connections... meaning you can connect your camera to a firewire drive and if the firewire drive has multiple firewire ports, you can then connect the second port to a computer with firewire. This is not an option for you as you already know there is no firewire on the MacBook. There is some misconception that for external drives with multiple interfaces (firewire and USB and or eSATA) that you can connect a camera to the drives firewire interface and then connect the drive to your computer using the USB connection. Unfortunately, you can't daisy chain across different interfaces, the drive can only support data throughput through one interface at a time.
    Still an external drive is good to have for backups, etc... but it is not a solution for your problem.

  • Best Solution for Archiving Table data

    Hi All,
I have a table with a huge amount of data. It is not a partitioned table.
On average, 10,000 records are inserted into this table per day. Now I want to archive (back up) each year's data manually, keep it in a safe location, and then delete those archived rows from the table. Whenever required, it should be easy to import the data back into this table. All of this happens through the application.
One approach in my mind right now is transferring the data from the table to a comma-separated flat file, and whenever required importing it back into the table from the flat file using external tables.
Can anybody suggest the best solution for this?
    Thanks

    The best solution would be partitioning.
    Any other solution requires DML - running DELETE and INSERT transactions to remove a data set and to add a data set (if need be) again.
With partitioning this is achieved (in sub-seconds) using DDL, by exchanging a partition's contents with those of a table. This means that after the archived data has been loaded (SQL*Loader, Import, etc.) into a table (and indexes created), that table (with indexes) is "swapped" into the partitioned table as a partition.
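As a rough illustration of the exchange idea (the table and partition names below are hypothetical):

-- empty staging table with the same structure as the partitioned table
CREATE TABLE txn_archive_2007 AS
  SELECT * FROM transactions WHERE 1 = 0;

-- swap the year's partition with the empty table: a data-dictionary operation, no row movement
ALTER TABLE transactions
  EXCHANGE PARTITION p_2007 WITH TABLE txn_archive_2007
  WITHOUT VALIDATION;

-- txn_archive_2007 now holds the archived rows and can be exported (Data Pump, etc.);
-- the now-empty partition can be dropped, and the same exchange brings data back later.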

  • Can anyone explain the split and overlapping date ranges in PHOTO moments.  I would have expected each date to have its own slot.  What logic is used for deciding which photos to include and where to split them?

iPad Air, iOS 8.1, stock Photos app, 14,000 photos. Overlapping date ranges in Moments. What criteria are used to decide, after import, where photos are included in Moments? I was expecting each day to have its own exclusive slot, for example, not 26 - 27 January 2007 with 8 photos and then 27 January 2007 with 1 photo. Why are there not two separate slots, one for the 26th and one for the 27th? This makes locating photos by day more awkward than it needs to be and seems to follow no logic. It occurs over 150 times and is very confusing. Is this an issue or expected behaviour?

