Using FIFO in SQL code for assigning indicators to data items

Hi All,
We are looking for a solution based on a FIFO algorithm. We have a table containing the data shown in the script in the follow-up post below.
We need to apply FIFO to this table and assign rows to "object" chains based on the following conditions:
1. First, group the rows by the "object" column.
2. Then traverse each group starting from the row with the minimum start time of that object group, e.g. row id 1 for the "19O" object group.
2.1 Assign an "EqpRef" of the form "object" + <an integer value>, where the integer value changes when a start-end chain finishes. The start-end chain is explained in step 2.2.
2.2 Take the "nextstarttime" of the picked row and compare it against the closest "starttime" among the rows whose "start" is the same as the "end" of the picked row. E.g. for object 19O, the "nextstarttime" 0310 is closest to the "starttime" 0355 of row id 2, and row id 2 has "start" AAL, which is the same as the "end" of row id 1.
2.3 We perform this chaining until the chain ends, and allocate the same "EqpRef" to each row in the chain.
The output we need to generate is shown in temp_table_final in the follow-up post below.
Kindly help on same.
Thanks in advance
-Regards
Kumud

Hi,
Please find below a code block showing the input data and the expected output.
--The input data
create table temp_table
(
    row_id int,
    engine_no varchar(20),
    schedule_no varchar(20),
    start_station varchar(20),
    end_station varchar(20),
    startdate datetime,
    enddate datetime,
    starttime datetime,
    endtime datetime,
    record_id int,
    engine_id int,
    Mgt int,
    nextstarttime datetime,
    Schedule_ref varchar(20),
    Engine_Ref varchar(20)
)
GO
insert into temp_table values(1,'19O','101','SGC','IXP','2015/01/01','2015/01/01','00:00:00','01:00:00',1,10,60,'02:00:00',null,null)
insert into temp_table values(2,'19O','102','SGC','IXP','2015/01/01','2015/01/01','00:30:00','01:30:00',2,10,60,'02:30:00',null,null)
insert into temp_table values(3,'19O','103','SGC','IXP','2015/01/01','2015/01/01','02:30:00','03:30:00',3,10,60,'04:30:00',null,null)
insert into temp_table values(4,'19O','104','IXP','DFW','2015/01/01','2015/01/01','03:30:00','04:00:00',4,10,60,'05:00:00',null,null)
insert into temp_table values(5,'19O','105','IXP','DFW','2015/01/01','2015/01/01','04:30:00','05:30:00',5,10,60,'06:30:00',null,null)
insert into temp_table values(6,'19O','106','DFW','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',6,10,60,'07:00:00',null,null)
insert into temp_table values(7,'19O','107','DFW','ABC','2015/01/01','2015/01/01','06:00:00','07:00:00',7,10,60,'08:00:00',null,null)
insert into temp_table values(8,'19O','108','DFW','ABC','2015/01/01','2015/01/01','07:00:00','08:00:00',8,10,60,'09:00:00',null,null)
insert into temp_table values(9,'19O','109','ABC','DEF','2015/01/01','2015/01/01','10:00:00','11:30:00',9,10,60,'12:30:00',null,null)
insert into temp_table values(10,'19O','110','XYZ','BDW','2015/01/01','2015/01/01','13:00:00','15:00:00',10,10,60,'16:00:00',null,null)
insert into temp_table values(1,'319','111','PQR','STU','2015/01/01','2015/01/01','00:00:00','01:00:00',11,11,60,'02:00:00',null,null)
insert into temp_table values(2,'319','211','PQR','STU','2015/01/01','2015/01/01','04:30:00','15:30:00',12,11,60,'16:30:00',null,null)
insert into temp_table values(3,'319','112','STU','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',13,11,60,'07:00:00',null,null)
insert into temp_table values(4,'319','212','STU','DEF','2015/01/01','2015/01/01','06:00:00','07:00:00',14,11,60,'08:00:00',null,null)
insert into temp_table values(5,'319','213','STU','PQR','2015/01/01','2015/01/01','07:00:00','08:00:00',15,11,60,'09:00:00',null,null)
insert into temp_table values(6,'319','118','STU','XYZ','2015/01/01','2015/01/01','10:00:00','11:30:00',16,11,60,'12:30:00',null,null)
insert into temp_table values(7,'319','119','DEF','JKL','2015/01/01','2015/01/01','13:00:00','15:00:00',17,11,60,'16:00:00',null,null)
insert into temp_table values(8,'319','215','ABC','MNO','2015/01/01','2015/01/01','17:00:00','20:00:00',18,11,60,'21:00:00',null,null)
insert into temp_table values(1,'19O','101','SGC','IXP','2015/01/01','2015/01/01','00:00:00','01:00:00',1,10,60,'02:00:00',null,null)
insert into temp_table values(2,'19O','102','SGC','IXP','2015/01/01','2015/01/01','00:30:00','01:30:00',2,10,60,'02:30:00',null,null)
insert into temp_table values(3,'19O','103','SGC','IXP','2015/01/01','2015/01/01','02:30:00','03:30:00',3,10,60,'04:30:00',null,null)
insert into temp_table values(4,'19O','104','IXP','DFW','2015/01/01','2015/01/01','03:30:00','04:00:00',4,10,60,'05:00:00',null,null)
insert into temp_table values(5,'19O','105','IXP','DFW','2015/01/01','2015/01/01','04:30:00','05:30:00',5,10,60,'06:30:00',null,null)
insert into temp_table values(6,'19O','106','DFW','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',6,10,60,'07:00:00',null,null)
insert into temp_table values(8,'19O','108','DFW','ABC','2015/01/01','2015/01/01','07:00:00','08:00:00',8,10,60,'09:00:00',null,null)
insert into temp_table values(9,'19O','109','ABC','DEF','2015/01/01','2015/01/01','10:00:00','11:30:00',9,10,60,'12:30:00',null,null)
insert into temp_table values(1,'319','111','PQR','STU','2015/01/01','2015/01/01','00:00:00','01:00:00',11,11,60,'02:00:00',null,null)
insert into temp_table values(2,'319','211','PQR','STU','2015/01/01','2015/01/01','04:30:00','15:30:00',12,11,60,'16:30:00',null,null)
insert into temp_table values(3,'319','112','STU','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',13,11,60,'07:00:00',null,null)
insert into temp_table values(4,'319','212','STU','DEF','2015/01/01','2015/01/01','06:00:00','07:00:00',14,11,60,'08:00:00',null,null)
insert into temp_table values(5,'319','213','STU','PQR','2015/01/01','2015/01/01','07:00:00','08:00:00',15,11,60,'09:00:00',null,null)
insert into temp_table values(6,'319','118','STU','XYZ','2015/01/01','2015/01/01','10:00:00','11:30:00',16,11,60,'12:30:00',null,null)
insert into temp_table values(7,'319','119','DEF','JKL','2015/01/01','2015/01/01','13:00:00','15:00:00',17,11,60,'16:00:00',null,null)
insert into temp_table values(8,'319','215','ABC','MNO','2015/01/01','2015/01/01','17:00:00','20:00:00',18,11,60,'21:00:00',null,null)
--output should come as the data in temp_table_final
create table temp_table_final
(
    row_id int,
    engine_no varchar(20),
    schedule_no varchar(20),
    start_station varchar(20),
    end_station varchar(20),
    startdate datetime,
    enddate datetime,
    starttime datetime,
    endtime datetime,
    record_id int,
    engine_id int,
    Mgt int,
    nextstarttime datetime,
    Schedule_ref varchar(20),
    Engine_Ref varchar(20)
)
GO
insert into temp_table_final values(1,'19O','101','SGC','IXP','2015/01/01','2015/01/01','00:00:00','01:00:00',1,10,60,'02:00:00','101','19O-1')
insert into temp_table_final values(2,'19O','102','SGC','IXP','2015/01/01','2015/01/01','00:30:00','01:30:00',2,10,60,'02:30:00','102','19O-2')
insert into temp_table_final values(3,'19O','103','SGC','IXP','2015/01/01','2015/01/01','02:30:00','03:30:00',3,10,60,'04:30:00','103','19O-3')
insert into temp_table_final values(4,'19O','104','IXP','DFW','2015/01/01','2015/01/01','03:30:00','04:00:00',4,10,60,'05:00:00','101','19O-1')
insert into temp_table_final values(5,'19O','105','IXP','DFW','2015/01/01','2015/01/01','04:30:00','05:30:00',5,10,60,'06:30:00','102','19O-2')
insert into temp_table_final values(6,'19O','106','DFW','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',6,10,60,'07:00:00','104','19O-1')
insert into temp_table_final values(7,'19O','107','DFW','ABC','2015/01/01','2015/01/01','06:00:00','07:00:00',7,10,60,'08:00:00','107','19O-4')
insert into temp_table_final values(8,'19O','108','DFW','ABC','2015/01/01','2015/01/01','07:00:00','08:00:00',8,10,60,'09:00:00','105','19O-2')
insert into temp_table_final values(9,'19O','109','ABC','DEF','2015/01/01','2015/01/01','10:00:00','11:30:00',9,10,60,'12:30:00','106','19O-1')
insert into temp_table_final values(10,'19O','110','XYZ','BDW','2015/01/01','2015/01/01','13:00:00','15:00:00',10,10,60,'16:00:00','110','19O-5')
insert into temp_table_final values(1,'319','111','PQR','STU','2015/01/01','2015/01/01','00:00:00','01:00:00',11,11,60,'02:00:00','111','319-1')
insert into temp_table_final values(2,'319','211','PQR','STU','2015/01/01','2015/01/01','04:30:00','15:30:00',12,11,60,'16:30:00','211','319-2')
insert into temp_table_final values(3,'319','112','STU','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',13,11,60,'07:00:00','111','319-1')
insert into temp_table_final values(4,'319','212','STU','DEF','2015/01/01','2015/01/01','06:00:00','07:00:00',14,11,60,'08:00:00','212','319-3')
insert into temp_table_final values(5,'319','213','STU','PQR','2015/01/01','2015/01/01','07:00:00','08:00:00',15,11,60,'09:00:00','213','319-4')
insert into temp_table_final values(6,'319','118','STU','XYZ','2015/01/01','2015/01/01','10:00:00','11:30:00',16,11,60,'12:30:00','118','319-5')
insert into temp_table_final values(7,'319','119','DEF','JKL','2015/01/01','2015/01/01','13:00:00','15:00:00',17,11,60,'16:00:00','212','319-3')
insert into temp_table_final values(8,'319','215','ABC','MNO','2015/01/01','2015/01/01','17:00:00','20:00:00',18,11,60,'21:00:00','112','319-1')
insert into temp_table_final values(1,'19O','101','SGC','IXP','2015/01/01','2015/01/01','00:00:00','01:00:00',1,10,60,'02:00:00','101','19O-1')
insert into temp_table_final values(2,'19O','102','SGC','IXP','2015/01/01','2015/01/01','00:30:00','01:30:00',2,10,60,'02:30:00','102','19O-2')
insert into temp_table_final values(3,'19O','103','SGC','IXP','2015/01/01','2015/01/01','02:30:00','03:30:00',3,10,60,'04:30:00','103','19O-3')
insert into temp_table_final values(4,'19O','104','IXP','DFW','2015/01/01','2015/01/01','03:30:00','04:00:00',4,10,60,'05:00:00','101','19O-1')
insert into temp_table_final values(5,'19O','105','IXP','DFW','2015/01/01','2015/01/01','04:30:00','05:30:00',5,10,60,'06:30:00','102','19O-2')
insert into temp_table_final values(6,'19O','106','DFW','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',6,10,60,'07:00:00','104','19O-1')
insert into temp_table_final values(8,'19O','108','DFW','ABC','2015/01/01','2015/01/01','07:00:00','08:00:00',8,10,60,'09:00:00','105','19O-2')
insert into temp_table_final values(9,'19O','109','ABC','DEF','2015/01/01','2015/01/01','10:00:00','11:30:00',9,10,60,'12:30:00','106','19O-1')
insert into temp_table_final values(1,'319','111','PQR','STU','2015/01/01','2015/01/01','00:00:00','01:00:00',11,11,60,'02:00:00','111','319-1')
insert into temp_table_final values(2,'319','211','PQR','STU','2015/01/01','2015/01/01','04:30:00','15:30:00',12,11,60,'16:30:00','211','319-2')
insert into temp_table_final values(3,'319','112','STU','ABC','2015/01/01','2015/01/01','05:00:00','06:00:00',13,11,60,'07:00:00','111','319-1')
insert into temp_table_final values(4,'319','212','STU','DEF','2015/01/01','2015/01/01','06:00:00','07:00:00',14,11,60,'08:00:00','212','319-3')
insert into temp_table_final values(5,'319','213','STU','PQR','2015/01/01','2015/01/01','07:00:00','08:00:00',15,11,60,'09:00:00','213','319-4')
insert into temp_table_final values(6,'319','118','STU','XYZ','2015/01/01','2015/01/01','10:00:00','11:30:00',16,11,60,'12:30:00','118','319-5')
insert into temp_table_final values(7,'319','119','DEF','JKL','2015/01/01','2015/01/01','13:00:00','15:00:00',17,11,60,'16:00:00','212','319-3')
insert into temp_table_final values(8,'319','215','ABC','MNO','2015/01/01','2015/01/01','17:00:00','20:00:00',18,11,60,'21:00:00','112','319-1')
What we are doing here is generating a schedule for train departures.
Here we have to identify the train schedules by building a chain of stations: the end_station of a record for an engine_no should be the start_station of another record for the same engine_no, and that record's starttime should be the one nearest to the nextstarttime at that station.
For example: if we pick the first row, "SGC-IXP", its nextstarttime is "02:00:00". This means the train that departed from SGC will reach IXP and be available for departure from IXP after "02:00:00". So we have to look for the record whose start_station is IXP and whose starttime is nearest to that nextstarttime ("02:00:00"). The next departure in the chain is therefore IXP-DFW, with starttime "03:30:00".
As you can see, we have to assign the schedule_no of the previously considered record to the chained record, so its "schedule_ref" is updated to 101. We also have to assign "engine_no - <integer counter>" to every record of a single chain, which is given in engine_ref.
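For reference, here is a rough, untested T-SQL sketch of the chaining rule described above (not a verified solution; it assumes Schedule_ref and Engine_Ref start out NULL and that rows can be walked per engine_no in starttime order):
-- Untested sketch: walk each engine_no's rows in starttime order; FIFO-style, attach the row
-- to the open chain that has been waiting longest at its start_station, otherwise start a new chain.
declare @row_id int, @engine_no varchar(20), @schedule_no varchar(20),
        @start_station varchar(20), @starttime datetime,
        @chain_no int, @prev_schedule varchar(20)

declare trips cursor local fast_forward for
    select distinct row_id, engine_no, schedule_no, start_station, starttime  -- distinct: the sample data repeats rows
    from temp_table
    order by engine_no, starttime, row_id

open trips
fetch next from trips into @row_id, @engine_no, @schedule_no, @start_station, @starttime

while @@fetch_status = 0
begin
    select @chain_no = null, @prev_schedule = null

    -- Last row of each already-numbered chain that ends where this row starts and whose
    -- engine is already available (nextstarttime <= this starttime); FIFO = oldest waiting first.
    select top (1)
           @chain_no = cast(substring(t.Engine_Ref, charindex('-', t.Engine_Ref) + 1, 10) as int),
           @prev_schedule = t.schedule_no
    from temp_table t
    where t.engine_no = @engine_no
      and t.Engine_Ref is not null
      and t.end_station = @start_station
      and t.nextstarttime <= @starttime
      and not exists (select 1 from temp_table t2
                      where t2.engine_no = t.engine_no
                        and t2.Engine_Ref = t.Engine_Ref
                        and t2.starttime > t.starttime)
    order by t.nextstarttime asc

    if @chain_no is null
    begin
        -- No waiting chain: open a new one; the row keeps its own schedule_no as Schedule_ref.
        select @chain_no = isnull(max(cast(substring(Engine_Ref, charindex('-', Engine_Ref) + 1, 10) as int)), 0) + 1
        from temp_table
        where engine_no = @engine_no and Engine_Ref is not null

        set @prev_schedule = @schedule_no
    end

    update temp_table
    set Schedule_ref = @prev_schedule,
        Engine_Ref   = @engine_no + '-' + cast(@chain_no as varchar(10))
    where row_id = @row_id and engine_no = @engine_no

    fetch next from trips into @row_id, @engine_no, @schedule_no, @start_station, @starttime
end

close trips
deallocate trips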
Regards
Kumud

Similar Messages

  • Need PL/SQL code for this

    Hi,
    I need a PL/SQL code for this one...
    Let me know if something is not clear...
    1) The table CLOB_CLOBJECT_CDA has the columns described below...
    Explaining only those fields which are important in this context
    -- CDA_STEP_ID : Basically a Sequence
    -- CLOBJECT_SOURCE1_ID : Every id has got a set of records
    -- CLOBJECT_SOURCE2_ID : Every id has got a set of records
    -- LVL : There are total 8 levels..
    This is the main aim :
    1) There are total 16 million rows..(limited to 10 rows here)
    2) We need to go through level by level (LVL column) & insert the intersection records (CLOBJECT_SOURCE1_ID intersect CLOBJECT_SOURCE2_ID)
    into another table...but this is how it goes..
    Level (LVL column) 3's basically have CLOBJECT_SOURCE1_ID as level (LVL column) 2 CDA_STEP_ID's..
    (consider the statement --** where CLOBJECT_SOURCE1_ID = 285 which is same as 1st insert statement step id)..
    The above process goes for next levels until 8..(so have to use loops)
    So for ex :
    We go through the first insert statement and insert the intersection records only when both CLOBJECT_SOURCE1_ID & CLOBJECT_SOURCE2_ID have records.
    If either of them has no records, we should skip the corresponding step id when we go to the next levels...
    Let's go through the 1st insert statement...
    -- We have CDA_STEP_ID = 285 & two sources CLOBJECT_SOURCE1_ID as 19 & CLOBJECT_SOURCE2_ID as 74...
    -- We see the table CLOBJECT_COUNTS & check whether we have counts for both 19 & 74 ..(In fact we insert counts into this table only if they have records)
    -- If so, we insert the intersection records into CDA_MRN_RESULTS ( we do have counts for both of them..) with CDA_STEP_ID 285...
    -- Then we insert the step id which is 285 along with the count into CLOBJECT_COUNTS..
    Let's go through another insert statement...
    -- Consider CDA_STEP_ID = 288 which has two sources CLOBJECT_SOURCE1_ID as 19 & CLOBJECT_SOURCE2_ID as 92...
    -- We see the table CLOBJECT_COUNTS & check whether we have counts for both 19 & 92 ..(we have records for 19 but not for 92)
    -- So we should not proceed with this..& also skip all those records (future records with increasing levels..basically level 3's) which have got 288 as CLOBJECT_SOURCE1_ID..
    (As said earlier that the present CDA_STEP_ID will always be CLOBJECT_SOURCE1_ID in the next level)...
    I wrote the following code, which appears after the create & insert statements...
    Here are the create & insert statements:
    create table CLOB_CLOBJECT_CDA
    (
        CDA_STEP_ID           NUMBER,
        CDA_ID                NUMBER,
        CDA_SEQ_NUMBER        NUMBER,
        CLOBJECT_SOURCE1_TYPE VARCHAR2(3000),
        CLOBJECT_SOURCE1_ID   NUMBER,
        CLOBJECT_OPERATOR     VARCHAR2(3000),
        CLOBJECT_SOURCE2_TYPE VARCHAR2(3000),
        CLOBJECT_SOURCE2_ID   NUMBER,
        LVL                   NUMBER
    );
    insert into clob_clobject_cda (CDA_STEP_ID, CDA_ID, CDA_SEQ_NUMBER, CLOBJECT_SOURCE1_TYPE, CLOBJECT_SOURCE1_ID, CLOBJECT_OPERATOR, CLOBJECT_SOURCE2_TYPE, CLOBJECT_SOURCE2_ID, LVL)
    values (285, 285, 1, 'CLOBJECT', 19, 'INTERSECT', 'CLOBJECT', 74, 2);
    insert into clob_clobject_cda (CDA_STEP_ID, CDA_ID, CDA_SEQ_NUMBER, CLOBJECT_SOURCE1_TYPE, CLOBJECT_SOURCE1_ID, CLOBJECT_OPERATOR, CLOBJECT_SOURCE2_TYPE, CLOBJECT_SOURCE2_ID, LVL)
    values (286, 286, 1, 'CLOBJECT', 19, 'INTERSECT', 'CLOBJECT', 75, 2);
    insert into clob_clobject_cda (CDA_STEP_ID, CDA_ID, CDA_SEQ_NUMBER, CLOBJECT_SOURCE1_TYPE, CLOBJECT_SOURCE1_ID, CLOBJECT_OPERATOR, CLOBJECT_SOURCE2_TYPE, CLOBJECT_SOURCE2_ID, LVL)
    values (287, 287, 1, 'CLOBJECT', 19, 'INTERSECT', 'CLOBJECT', 91, 2);
    insert into clob_clobject_cda (CDA_STEP_ID, CDA_ID, CDA_SEQ_NUMBER, CLOBJECT_SOURCE1_TYPE, CLOBJECT_SOURCE1_ID, CLOBJECT_OPERATOR, CLOBJECT_SOURCE2_TYPE, CLOBJECT_SOURCE2_ID, LVL)
    values (288, 288, 1, 'CLOBJECT', 19, 'INTERSECT', 'CLOBJECT', 92, 2);
    insert into clob_clobject_cda (CDA_STEP_ID, CDA_ID, CDA_SEQ_NUMBER, CLOBJECT_SOURCE1_TYPE, CLOBJECT_SOURCE1_ID, CLOBJECT_OPERATOR, CLOBJECT_SOURCE2_TYPE, CLOBJECT_SOURCE2_ID, LVL)
    values (4869, 4869, 1, 'CDA_STEP', 285, 'INTERSECT', 'CLOBJECT', 91, 3);  -- **
    insert into clob_clobject_cda (CDA_STEP_ID, CDA_ID, CDA_SEQ_NUMBER, CLOBJECT_SOURCE1_TYPE, CLOBJECT_SOURCE1_ID, CLOBJECT_OPERATOR, CLOBJECT_SOURCE2_TYPE, CLOBJECT_SOURCE2_ID, LVL)
    values (4870, 4870, 1, 'CDA_STEP', 285, 'INTERSECT', 'CLOBJECT', 92, 3);
    insert into clob_clobject_cda (CDA_STEP_ID, CDA_ID, CDA_SEQ_NUMBER, CLOBJECT_SOURCE1_TYPE, CLOBJECT_SOURCE1_ID, CLOBJECT_OPERATOR, CLOBJECT_SOURCE2_TYPE, CLOBJECT_SOURCE2_ID, LVL)
    values (4871, 4871, 1, 'CDA_STEP', 285, 'INTERSECT', 'CLOBJECT', 93, 3);
    insert into clob_clobject_cda (CDA_STEP_ID, CDA_ID, CDA_SEQ_NUMBER, CLOBJECT_SOURCE1_TYPE, CLOBJECT_SOURCE1_ID, CLOBJECT_OPERATOR, CLOBJECT_SOURCE2_TYPE, CLOBJECT_SOURCE2_ID, LVL)
    values (4880, 4880, 1, 'CDA_STEP', 286, 'INTERSECT', 'CLOBJECT', 91, 3);
    insert into clob_clobject_cda (CDA_STEP_ID, CDA_ID, CDA_SEQ_NUMBER, CLOBJECT_SOURCE1_TYPE, CLOBJECT_SOURCE1_ID, CLOBJECT_OPERATOR, CLOBJECT_SOURCE2_TYPE, CLOBJECT_SOURCE2_ID, LVL)
    values (4881, 4881, 1, 'CDA_STEP', 286, 'INTERSECT', 'CLOBJECT', 92, 3);
    insert into clob_clobject_cda (CDA_STEP_ID, CDA_ID, CDA_SEQ_NUMBER, CLOBJECT_SOURCE1_TYPE, CLOBJECT_SOURCE1_ID, CLOBJECT_OPERATOR, CLOBJECT_SOURCE2_TYPE, CLOBJECT_SOURCE2_ID, LVL)
    values (4882, 4882, 1, 'CDA_STEP', 286, 'INTERSECT', 'CLOBJECT', 93, 3);
    create table CDA_MRN_RESULTS
    (
        CDA_STEP_ID      NUMBER,
        MRN              NUMBER,
        INSERT_DATE_TIME DATE
    );
    insert into cda_mrn_results (CDA_STEP_ID, MRN, INSERT_DATE_TIME)
    values (19, 1, to_date('19-10-2011', 'dd-mm-yyyy'));
    insert into cda_mrn_results (CDA_STEP_ID, MRN, INSERT_DATE_TIME)
    values (19,  2, to_date('19-10-2011', 'dd-mm-yyyy'));
    insert into cda_mrn_results (CDA_STEP_ID, MRN, INSERT_DATE_TIME)
    values (19,  3, to_date('19-10-2011', 'dd-mm-yyyy'));
    insert into cda_mrn_results (CDA_STEP_ID, MRN, INSERT_DATE_TIME)
    values (74,  1, to_date('19-10-2011', 'dd-mm-yyyy'));
    insert into cda_mrn_results (CDA_STEP_ID, MRN, INSERT_DATE_TIME)
    values (74,  2, to_date('19-10-2011', 'dd-mm-yyyy'));
    insert into cda_mrn_results (CDA_STEP_ID, MRN, INSERT_DATE_TIME)
    values (74,  4, to_date('19-10-2011', 'dd-mm-yyyy'));
    insert into cda_mrn_results (CDA_STEP_ID, MRN, INSERT_DATE_TIME)
    values (75,  1, to_date('19-10-2011', 'dd-mm-yyyy'));
    insert into cda_mrn_results (CDA_STEP_ID, MRN, INSERT_DATE_TIME)
    values (75,  2, to_date('19-10-2011', 'dd-mm-yyyy'));
    insert into cda_mrn_results (CDA_STEP_ID, MRN, INSERT_DATE_TIME)
    values (75,  6, to_date('19-10-2011', 'dd-mm-yyyy'));
    insert into cda_mrn_results (CDA_STEP_ID, MRN, INSERT_DATE_TIME)
    values (91,  2, to_date('19-10-2011', 'dd-mm-yyyy'));
    insert into cda_mrn_results (CDA_STEP_ID, MRN, INSERT_DATE_TIME)
    values (91,  3, to_date('19-10-2011', 'dd-mm-yyyy'));
    create table CLOBJECT_COUNTS
    (
        CDA_STEP_ID    NUMBER,
        CLOBJECT_COUNT NUMBER,
        DATE_TIME      DATE
    );
    Insert into CLOBJECT_COUNTS values (19,3, to_date('19-10-2011', 'dd-mm-yyyy'));
    Insert into CLOBJECT_COUNTS values (74,3, to_date('19-10-2011', 'dd-mm-yyyy'));
    Insert into CLOBJECT_COUNTS values (75,3, to_date('19-10-2011', 'dd-mm-yyyy'));
    Insert into CLOBJECT_COUNTS values (91,2, to_date('19-10-2011', 'dd-mm-yyyy'));
    The output goes into two tables...
    CDA_MRN_RESULTS : O/p of intersection records between source1 & source2 id
    CLOBJECT_COUNTS : Step id with counts ...(useful for skipping next level step id's if either of source id has "0" counts)
    Any help is appreciated..
    Thanks..

    I tried to code this, but the looping takes a lot of time. I want to skip rows whose CLOBJECT_SOURCE1_ID or CLOBJECT_SOURCE2_ID is not in the CLOBJECT_COUNTS table as we proceed to the next levels, but I am not sure how to skip those rows.
    declare
      cursor c1 (p_level varchar2) is
        select * from clob_clobject_cda
         where lvl = p_level;
      type v_tt is table of c1%rowtype index by pls_integer;
      l_tt       v_tt;
      v1         number;
      v2         number;
      v_step_id  number;
      v_operator varchar2(100) := '';
    begin
      for i in 2 .. 8 loop
        open c1(i);
        loop
          fetch c1 bulk collect into l_tt limit 500;
          for indx in 1 .. l_tt.count loop
            v1         := l_tt(indx).clobject_source1_id;
            v2         := l_tt(indx).clobject_source2_id;
            v_step_id  := l_tt(indx).cda_step_id;
            v_operator := l_tt(indx).clobject_operator;
            execute immediate
              'insert into cda_mrn_results
                 select /*+ parallel (cm 128) */ distinct ' || v_step_id || ', mrn, trunc(sysdate) dt
                   from cda_mrn_results cm
                  where cda_step_id = ' || v1 || '
                    and cda_step_id in (select cda_step_id from clobject_counts)
                ' || v_operator || '
                 select /*+ parallel (cm 128) */ distinct ' || v_step_id || ', mrn, trunc(sysdate) dt
                   from cda_mrn_results cm
                  where cda_step_id = ' || v2 || '
                    and cda_step_id in (select cda_step_id from clobject_counts)';
            insert /*+ append */ into clobject_counts
              select cda_step_id, count(distinct mrn), insert_date_time dt
                from cda_mrn_results
               where cda_step_id = v_step_id
               group by cda_step_id, insert_date_time;
            commit;
          end loop;
          exit when l_tt.count = 0;
        end loop;
        close c1;
      end loop;
      commit;
    end;

  • How to add a custom PL/SQL code for a button event handler

    Hi All,
    I am a toddler in using Oracle Portal. So please forgive me for my ignorance.
    Q : How do I add a custom PL/SQL code for a button event handler?
    Basically, I would like to write MY PL/SQL function and call it. I could see that we can write "CUSTOM" code as "PL/SQL button event handler" in the form design window. But the question is that it expects only a "call" to procedure. But where do I define the procedure then? If I insert the procedure from the backend, it gets flushed the next time I compile my form.
    Hope I am able to explain my point.
    Thanks in advance,
    Abbas.


  • Can I have a PL/SQL code for LOV

    Hi,
    I apologize for this dumb question. I have been so out of touch with dev (almost 8 years). Plus new to pl/sql.
    I am creating a status report application. On the dashboard, I currently have some metrics (horizontal charts). I wanted to expose these metrics based on filter.
    I had created 2 filters. The first one will identify the type (Week, Month, Qtr or Year) and the 2nd filter would be based on the type. For e.g. if I select Month as type, the 2nd filter should show May-12, April-12 etc. Once I select the 2nd filter, I should use them to show the appropriate metrics. The metric that I currently have would show all the projects that users have worked on during the week/month/qtr or year (depending on the 1st and 2nd filter).
    On the dashboard region, I added a condition to check if the value for both 1st and 2nd filter is not null. This allows me to show the dashboard only if the 2 filters have been selected.
    For the 2nd filter, I need to write a PL/SQL code to show the LOV.
    I am assuming that the pl/sql would return a SQL query. The SQL query will be based on week, month etc. Is that right?
    Thanks
    balaji
    Edited by: user644868 on May 17, 2012 11:29 AM

    rbalaji2026 wrote:
    For the 2nd filter, I need to write a PL/SQL code to show the LOV.
    Doesn't appear necessary. With filter 2 cascading from filter 1, why not:
    select
            /* Week query */
    from
    where
    and     :p1_filter_1 = 'WEEK'       
    union all
    select
            /* Month query */
    from
    where
    and     :p1_filter_1 = 'MONTH'       
    union all
    select
            /* Quarter query */
    from
    where
    and     :p1_filter_1 = 'QUARTER'       
    union all
    select
            /* Year query */
    from
    where
    and     :p1_filter_1 = 'YEAR'       
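    To make that concrete, one branch of such a cascading LOV query could look like the following (purely illustrative table and column names; an APEX LOV query just needs a display column and a return column):
    select to_char(week_start_date, 'DD-MON-YYYY') as display_value,
           to_char(week_start_date, 'IYYY-IW')     as return_value
    from   calendar_weeks
    where  :P1_FILTER_1 = 'WEEK'
    order  by week_start_date desc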

  • Use condition for assigning query to data provider with Javascript

    Hi gurus,
    I want to assign a query dynamically to a data provider when the user activates a tab panel in a Web Template (BI7).
    I can do that easily with the standard function SET_DATA_PROVIDER_PARAMETERS, but that action runs every time the tab panel is activated.
    So I am trying to use JavaScript to assign the query to the data provider only once.
    My problem is that I haven't found a way to get the default query assigned to the data provider.
    Here is an example of what I am trying to do with JavaScript:
    function Load_Query()
    var r=GET_DATA_PROVIDER PARAMETER ("QUERYVIEW_DATA_PROVIDER")
    if (r <>empty)
        executeJS_SET_DATA_PROVIDER_PARAMETERS_R()
    Thank you for help,
    Franck

    Hi Janice,
    You can check the Option Display Variable values only once.
    Then you will see the variables displayed only once, even though they are used in multiple data providers.
    If you want some variable to be data-provider specific, you can create a new variable and add it in the query.
    For example, in your case let us say COMP1 is the variable for company code used in DP1 and DP2.
    If you want different values to be selected for company code in DP1 and DP2, just replace the variable in DP2 with a new variable COMP2.
    Hope this helps.
    Regards.
    Shafi.

  • Sql code for authorizing senior staff after data was inserted in the table

    Dear sirs/madams,
    Sub:
    I am using oracle9i/forms6i/reports6i and developing software for rural bank,
    I need SQL/procedure/function code for senior staff to authorize data after it has been inserted into the table by junior staff.
    Thanks in advance for your kind help.
    R. Chandrasekar
    e-mail: [email protected]

    So basically you want a form (only accessible by senior staff) which reads the data entered by junior staff and, once it has been validated by senior staff,
    does the following in one transaction:
    This is in effect a design question.
    There are two different scenarios I see. Approve transactions in bulk, or approve individual transcations.
    Approve transactions in Bulk:
    Mark a column with approved.
    Move all approved into final table;
    delete all transactions from temp_table which are in final_table;
    commit;
    approve/move individually.
    insert transcation x into final_table where transaction_x in temp_table;
    delete transaction_x from temp_table;
    commit;
    It's basically up to you how you design it.
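    As a rough illustration of the bulk-approval variant (table and column names here are made up; adapt them to your own schema):
    -- Assumes junior staff insert into temp_table and senior staff set approved = 'Y' from a form
    insert into final_table (txn_id, amount, entered_by)
    select txn_id, amount, entered_by
    from   temp_table
    where  approved = 'Y';
    delete from temp_table
    where  approved = 'Y';
    commit;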

  • How to handle SQL code for Daylight Savings for MST Time zone

    Hi,
    1. My time zone is MST. My data is showing differently since daylight saving started in November. Please help me with how to handle this issue.
    2. After Mar 09 2014 daylight saving is going to end. How do I handle this in the SQL code?
    Please answer for the above 2 doubts.
    Thanks in advance.
    Regards,
    LuckyAbdul

    Hi Abdul,
    Daylight saving is basically like switching to another timezone. If your normal time zone is Mountain Standard Time (MST), you will switch to Mountain Daylight Time in the summer.
    If daylight saving or timezones are a concern. It is best to store your dates in a DATETIMEOFFSET(0) column. This data type includes the offset between your selected timezone (MST or MDT) and UTC. The offset between MST and UTC is -7 hours and the offset
    between MDT and UTC is -6 hours.
    Be sure to use datetimeoffset(0) and not just datetimeoffset. datetimeoffset(0) uses 8 bytes to store its data (same as datetime), but datetimeoffset uses 10 bytes. This is especially important if you are going to index this column.
    Also be sure to use a similar data type in your application or the timezone information will be lost. If it is an .Net application you should use The DateTimeOffset type. Most other programming languages have equivalent types.
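    For example (a small illustration with a hypothetical table; the literals just show an MST winter offset of -07:00 and an MDT summer offset of -06:00):
    create table event_log
    (
        event_id   int identity(1,1) primary key,
        event_time datetimeoffset(0) not null   -- local time plus its UTC offset
    );
    insert into event_log (event_time) values ('2014-01-15 08:00:00 -07:00');  -- MST
    insert into event_log (event_time) values ('2014-07-15 08:00:00 -06:00');  -- MDT
    -- Normalize to UTC for comparisons, regardless of daylight saving
    select event_id, event_time, switchoffset(event_time, '+00:00') as event_time_utc
    from event_log;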
    Hope this helps. If you have anymore questions please let me know.
    For more information see:
    http://msdn.microsoft.com/en-us/library/bb630289.aspx
    http://msdn.microsoft.com/en-us/library/ms187819.aspx
    http://msdn.microsoft.com/en-us/library/system.datetimeoffset%28v=vs.110%29.aspx

  • Using CVS in SQL Developer for Data Modeler changes.

    Hi,
    I am fairly new to SQL Developer Data Modeler and associated version control mechanisms.
    I am prototyping the storage of database designs and version control for the same, using the Data Modeler within SQL Developer. I have SQL Developer version 3.1.07.42 and I have also installed the CVS extension.
    I can connect to our CVS server through sspi protocol and external CVS executable and am able to check out modules.
    Below is the scenario where I am facing some issue:
    I open the design from the checked out module and make changes and save it. In the File navigator, I look for the files that have been modified or added newly.
    This behaves rather inconsistently in the sense that even after clicking on refresh button, sometimes it does not get refreshed. Next I try to look for the changes in Pending Changes(CVS) window. According to the other posts, I am supposed to look at the View - Data Modeler - Pending Changes window for data modeler changes but that shows up empty always( I am not sure if it is only tied to Subversion). But I do see the modified files/ files to be added to CVS under Versioning - CVS - Pending Changes window. The issue is that when I click on the refresh button in the window, all the files just vanish and all the counts show 0. Strangely if I go to Tools - Preferences - Versioning - CVS and just click OK, the pending changes window gets populated again( the counts are inconsistent at times).
    I believe this issue is fixed and should work correctly in 3.1.07.42 but it does not seem to be case.
    Also, I am not sure whether I can use this CVS functionality available in SQL Developer for the Data Modeler, or whether I should be using an external client such as WinCvs for check in/check out.
    Please help.
    Thanks

    Hi Joop,
    I think you will find that in Data Modeler's Physical Model tree the same icons are used for temporary Tables and Materialized Views as in SQL Developer.
    David

  • HT1222 Can you use the same authorization code for QT Pro on two computers? Or do you have to buy a code for each computer

    I have two computers a laptop and my Video Editing computer can I use the same activation code on both?

    An Apple ID is a user name you use for everything you do with Apple. Creating an account for an Apple service, such as the iTunes Store or the App Store, creates an Apple ID. Apple ID allows you to access other Apple services.   So you can use the same Apple ID for up to 10 Macs associated with it.
    To create an Apple ID you need to enter  your full name and your primary email address as your Apple ID. This will be used as the contact email address for your account.

  • Before Header pl/sql process to assign values to page items

    How can I assign values to items on a page based on the data structure returned from a PL/SQL procedure within the Before Header process? I am thinking of executing a PL/SQL procedure based on a URL parameter value and then assigning values to page items. Is this really possible using Apex 4.0.2? If yes, how can it be done?
    thanks
    Seetharaman

    The question isn't really clear. What kind of "data structure returned from a pl/sql procedure"? Please be much more specific.
    For a PL/SQL data structure such as a record containing 2 elements (x and y) returned as an out parameter from a procedure to be assigned to 2 page items p1_x and p1_y, all that's required in the Before Header process is something like
    declare
      l_rec rec_type;
    begin
      the_proc(..., p_out => l_rec, ...);
      :p1_x := l_rec.x;
      :p1_y := l_rec.y;
    end;

  • OpenSource code for accessing Life Science data sources?

    We know that many people use Oracle in Life Sciences. Per IDC, Oracle has a 75-80% market share in "Discovery". Oracle would like to help facilitate the exchange of customer code for accessing LS data e.g from GenBank, SwissProt, etc., but Oracle Legal concerns prevent us from posting customer's code on our web site.
    Does anyone have any LS code that is based on Oracle that they are willing to share with others on say SourceForge.net? Is there any location where people are already posting this type of Oracle-based open source code?

    Please go to SourceForge.net at http://sourceforge.net/projects/oraclelifesci/. Subscribe to the mailing list or join the project development. To contribute code you can choose one of the following:
    1. If you would like to use CVS or you need shared development with others, you can use CVS on SourceForge.net to upload your code.
    2. Email your code to [email protected]
    You may also use other facilities that SourceForge has. It only takes a couple of hours in the beginning to get to know SourceForge.net.
    We welcome code in all programming languages (Java, Perl, PL/SQL, C, etc.) for life science applications on the Oracle platform. Thank you.
    Joshua Li , San Diego Supercomputer Center/UCSD

  • BDC code for MB01 Application(Multiple Line Items)

    Hi,
    Can anybody please send me the BDC CODE for MB01 Application(for Multiple Line items).
    Very urgent.
    Thanks And Regards,
    Ajay

    Instead of BDC you can use the following BAPI.
    See the following example:
    BAPI TO Upload Inventory Data
    GMCODE Table T158G - 01 - MB01 - Goods Receipts for Purchase Order
                         02 - MB31 - Goods Receipts for Prod Order
                         03 - MB1A - Goods Issue
                         04 - MB1B - Transfer Posting
                         05 - MB1C - Enter Other Goods Receipt
                         06 - MB11
    Domain: KZBEW - Movement Indicator
         Goods movement w/o reference
    B - Goods movement for purchase order
    F - Goods movement for production order
    L - Goods movement for delivery note
    K - Goods movement for kanban requirement (WM - internal only)
    O - Subsequent adjustment of "material-provided" consumption
    W - Subsequent adjustment of proportion/product unit material
    LOOP AT I_TAB.
          count = sy-tabix.
          SELECT SINGLE * FROM ZMM_GI_WIP
                                    WHERE GI_NO = I_TAB-DOCNO AND
                                    GI_DATE = I_TAB-DOCDT.
          IF SY-SUBRC = 0.
            I_TAB-FLAG = 'C'.
            modify i_tab index count.
            CONCATENATE 'ERROR  GI : ' I_TAB-DOCNO
            '  WAS ALREADY UPLOADED' INTO I_MSG1.
            APPEND I_MSG1.
            CLEAR I_TAB-FLAG.
            CONTINUE.
          ELSE.
            CONCATENATE I_TAB-DOCDT+4(2)
                        I_TAB-DOCDT+6(2)
                        I_TAB-DOCDT+2(2)
                        I_TAB-DOCDT+0(2)
                        INTO G_DATE.
            gmhead-pstng_date = G_DATE.
            gmhead-doc_date = sy-datum.
            gmhead-pr_uname = sy-uname.
    "01 - MB01 - Goods Receipts for Purchase Order
            gmcode-gm_code = '03'.
            refresh itab.
            clear itab.
            SORT I_MAIN BY SOL_DOCNO.
            LOOP AT I_MAIN WHERE DOCNO = I_TAB-DOCNO.
              IF I_MAIN-GI_TXN_TYPE = 'MGI'.
                itab-move_type  = '291'.
              ENDIF.
              itab-mvt_ind    = ' '.
              itab-plant      = I_MAIN-WERKS.
              itab-material   = I_MAIN-MATNR.
              itab-entry_qnt  = I_MAIN-ERFMG.
              itab-stge_loc   = I_MAIN-LGOBE.
              itab-ENTRY_UOM = I_MAIN-ERFME.
              IF I_MAIN-WERKS = 'MR'.
                itab-TR_PART_BA = '11'.
              ELSEIF I_MAIN-WERKS = 'MR'.
                itab-TR_PART_BA = '12'.
              ENDIF.
              append itab.
            ENDLOOP.
            if not itab[] is initial.
              call function 'BAPI_GOODSMVT_CREATE'
                exporting
                  goodsmvt_header       = gmhead
                  goodsmvt_code         = gmcode
                  " testrun             = ' '
                importing
                  goodsmvt_headret      = mthead
                  " materialdocument    =
                  " matdocumentyear     =
                tables
                  goodsmvt_item         = itab
                  " goodsmvt_serialnumber =
                  return                = errmsg.
              clear errflag.
              loop at errmsg.
                if errmsg-type eq 'E'.
                  write:/'Error in function', errmsg-message.
                  errflag = 'X'.
                else.
                  write:/ errmsg-message.
                endif.
                move errmsg-message to i_msg1.
                append i_msg1.
              endloop.
              if errflag is initial.
                call function 'BAPI_TRANSACTION_COMMIT'.
                " exporting
                "   wait   =
                " importing
                "   return =
                commit work and wait.
                if sy-subrc ne 0.
                  write:/ 'Error in updating'.
                  exit.
                 endif.
            endif.
            ENDIF.
          ENDIF.
          wait up to 20 seconds.
        ENDLOOP.

  • HT1222 What is a pass code,for this next up date.never been asked that before

    What's the passcode for this next update? I've never been asked that in the past.

    The passcode you need to enter is the passcode you set on your device. It is the passcode that you enter when you turn on your iPad or awaken it from sleep after an Auto-Lock.
    If you do not know or remember your password and did not store it somewhere then you will have to restore your device.
    http://support.apple.com/kb/HT1212

  • Possible t-codes for changing schedule line dates

    Hi
    I am trying to find out all possible t-codes for changing schedule line dates in a sales order. I found the following t-codes:
    VA02
    CO06
    V_RA
    V_V2
    Experts please tell if I'm missing any t-code
    TIA

    Hi,
    As far as my knowledge goes, you have got all the t-codes. If you find more t-codes, send me a mail.
    Thanking you
    Regards
    Manoj
    ([email protected])

  • Use, purpose, related t-codes for "INVOIC_FI_INBOUND"

    Hi all,
    I am new to BADIs and suddenly I got a requirement to work with this BADI - INVOIC_FI_INBOUND.
    I need to know:
    (1) the use of this BADI
    (2) the purpose of this BADI
    (3) all the t-codes which are used for this BADI, or the respective t-codes which are related to this BADI
    (4) why we are using this BADI
    (5) how to use this BADI
    Thanks in advance,
    Karthik

    So, there's a requirement to use this BADI, and you don't know what the BADI is for, nor why you need to use it. How was the requirement phrased? Did your boss say - Hey, Kalai - your next assignment. Use this BADI.
    Funny, that - because usually your boss, or whoever, will say something like - we need this functionality, you can implement it in this BADI. Then you know the why.
    So, what's it all about? Why do you need this information?
