Informatica failed to update records in target table

Hi,
Recently I converted hourly full load into incremental laod. everything is working good except updating records in target. Informatica not updating any records and I'm trying to understand and troubleshoot the issue. details given below
I ran mapping debugger and came to know Filter transformation is filtering the data instead of passing to next transformation because update flag is X.
ETL_PROC_WID AND LKP_ETL_PROC_WID are same and update flag = X.
I dont understand why ETL_PROC_WID AND LKP_ETL_PROC_WID are same. As per my knowledge Informatica generates new ETL_PROC_WID for every ETL run. If you know anything about it please let me know.
I executed code in SYS_COL_DIFF and UPDATE_FLG manually(on paper for few records) by assuming ETL_PROC_WID AND LKP_ETL_PROC_WID are different and got UPDATE_FLG=U
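For reference, here is roughly the decision I expect, sketched as SQL only for illustration; the names below are placeholders for the mapping's ports, not the actual OOTB expressions:
-- Illustration only, not the real mapping expression: how I expect UPDATE_FLG to be derived
-- when comparing the incoming row with the looked-up target row.
SELECT CASE
         WHEN sys_col_diff = 'Y'
          AND etl_proc_wid <> lkp_etl_proc_wid THEN 'U'   -- columns changed in a new ETL run: update
         ELSE 'X'                                         -- unchanged: the row gets filtered out
       END AS update_flg
FROM   incoming_vs_target;   -- placeholder for the joined incoming/lookup values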
If you know how the OOTB MPLT_GET_ETL_PROC_WID mapplet works, please let me know.
I appreciate your help.
Thanks
Jay.
Edited by: JV123 on Dec 12, 2012 9:29 AM

Welcome to the forum.
You can try your hand at MERGE while performing the INSERT operation.
Something like this:
MERGE INTO Emp_Org eo
USING (
          SELECT Empno, deptno, empname, currenttimestamp FROM Emp
       ) x
ON (eo.Empno = x.Empno)
WHEN NOT MATCHED THEN
     INSERT (
          Empno,
          country,
          Emporg,
          currenttimestamp,
          Targettimestamp
     )
     VALUES (
          x.Empno,
          'USA', /* constant used here */
          'Emporg', /* constant used here */
          x.currenttimestamp,
          SYSDATE /* constant used here */
     );
If you are not OK with the solution, then provide some sample data with the expected output.
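Since the thread is about rows that should be updated as well as inserted, a MATCHED branch can be added too. This is only a sketch; which columns to refresh in the SET list is an assumption, not something stated above:
-- Sketch only: same MERGE with an UPDATE branch so existing rows are refreshed as well.
MERGE INTO Emp_Org eo
USING (SELECT Empno, deptno, empname, currenttimestamp FROM Emp) x
ON (eo.Empno = x.Empno)
WHEN MATCHED THEN
     UPDATE SET eo.currenttimestamp = x.currenttimestamp, /* example columns only */
                eo.Targettimestamp  = SYSDATE
WHEN NOT MATCHED THEN
     INSERT (Empno, country, Emporg, currenttimestamp, Targettimestamp)
     VALUES (x.Empno, 'USA', 'Emporg', x.currenttimestamp, SYSDATE);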

Similar Messages

  • How to update records from an internal table to a 'Z' table?

    Hi Friends,
    How to update records from an internal table to a 'Z' table?
    I have records in an internal table, and I want to update the 'Zmara' table with those records.
    (My internal table and the 'Z' table have the same structure.)
    Thanking you.
    Regards,
    Subash

    Hi,
    LOOP AT <internal table> INTO <workarea>.
      MODIFY <z-table> FROM <workarea>.
      IF sy-subrc = 0.
        COMMIT WORK.
      ELSE.
        ROLLBACK WORK.
      ENDIF.
    ENDLOOP.
    or
    UPDATE <z-table> FROM TABLE <internal table>.
    IF sy-subrc = 0.
      COMMIT WORK.
    ELSE.
      ROLLBACK WORK.
    ENDIF.
    Prabhudas

  • I have a source table with 10 records and a target table with 15 records. Using the Table Comparison transform, how can I delete the unmatched records from the target table?

    I have a source table with 10 records and a target table with 15 records. My question is: using the Table Comparison transform, how do I delete the unmatched records from the target table?

    Hi Kishore,
    First, identify the deleted records by enabling the "Detect deleted rows from comparison table" option in the Table Comparison transform.
    Then use a Map Operation with input row type "delete" and output row type "delete" to delete those records from the target table.
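    For comparison, the same cleanup expressed directly in SQL would be something like the following (the table and key names are placeholders):
    -- Sketch only: remove target rows that no longer exist in the source.
    -- "target_table", "source_table" and "key_col" are placeholder names.
    DELETE FROM target_table t
    WHERE NOT EXISTS (SELECT 1
                      FROM   source_table s
                      WHERE  s.key_col = t.key_col);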

  • Aggregator Transformation without any condition showing zero records in target table

    Hi everyone, I have a source table with 100 records, and I pass all 100 records to an Aggregator transformation. In the Aggregator I do not specify any condition, and I map it directly to the target table. After running this mapping I find 0 records in the target table. Why is this happening? Can anyone explain?

    Magazinweg 7
    Taucherstraße 10
    Taucherstraße 10
    Av. Copacabana, 267
    Strada Provinciale 124
    Fauntleroy Circus
    Av. dos Lusíadas, 23
    Rua da Panificadora, 12
    Av. Inês de Castro, 414
    Avda. Azteca 123
    I have a source table like the above, and I want to strip out the characters and sum up the numbers. I can remove the characters with the reg_replace() function, but I am not able to add up the numbers because they are not of fixed length. My output should be: 71115705396
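    If the goal is just to strip the non-digit characters and total the numbers, a rough SQL sketch would be (the table and column names, addresses/address, are placeholders):
    -- Sketch only: strip non-digits, convert, and total; NULLIF skips rows with no digits at all.
    SELECT SUM(TO_NUMBER(NULLIF(REGEXP_REPLACE(address, '[^0-9]', ''), ''))) AS total
    FROM   addresses;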

  • How to update Records of SAP table from .CSV file

    Hi,
    I have written code that takes data from a comma-delimited CSV file and adds it to an internal table.
    Now I want to update all the fields in the SAP table from the internal table.
    I want to use an UPDATE statement:
    UPDATE <table name> SET <fields to update> WHERE <condition>.
    I don't want to iterate through thousands of records in the SAP table to check the WHERE condition.
    Could you please tell me how to do it?

    Hi. I don't think you can avoid iterating over the internal table.
    You can pre-load all the affected records into another internal table:
    CHECK lt_csv[] IS NOT INITIAL. " important, otherwise the next SELECT would read all records of the table
    SELECT ... INTO TABLE lt_dbitab FOR ALL ENTRIES IN lt_csv WHERE key_fields = lt_csv-key_fields ...
    CHECK sy-subrc EQ 0 AND lt_dbitab[] IS NOT INITIAL.
    Then do an in-memory update of lt_dbitab:
    LOOP AT lt_dbitab ASSIGNING <fs>.
      READ TABLE lt_csv ASSIGNING <fs_csv> WITH KEY ... " lt_csv should be a sorted table with a key, or use BINARY SEARCH
      IF sy-subrc EQ 0.
        "...change the required lt_dbitab fields: <fs>-comp = <fs_csv>-comp...
      ENDIF.
    ENDLOOP.
    And then you can do a mass update:
    UPDATE dbtab FROM TABLE lt_dbitab.
    From a performance point of view, this should be much faster than iterating lt_csv directly and updating every single database record.
    Br
    Bohuslav

  • How to update records in a table based on user selection..

    Hi all,
    This doubt is entirely about the coding logic; I tried a lot but didn't find any solution, so at last I have come to the SDN site.
    Please help.
    The requirement is that I have a table with 6 fields (1 primary key and the others non-key). If the user inputs some values in the fields on the screen, a row is added to the table. Up to this point I have done well, but when the user wants to change some value in an existing row of the table, my program is unable to do so. I couldn't work out the logic: since there are 5 non-key fields, if any one field is modified, the respective row should first be selected based on the user's input and then updated.
    At this point I cannot come up with an approach, as it might take a lot of IF conditions (I guess) to reach that particular row.
    Please help.
    thanks ,
    sekhar

    Hi Sekhar,
    I am afraid the whole design of your program is wrong; let me explain.
    Let us say you have two rows (5 non-key fields each) that the user wants to update, and the data in these five non-key fields is identical; in your program you are generating a number (which is the key) using a number range object. So you will have two entries in the table for the same data.
    On the update page, when the user enters the non-key fields, how will the program (or, for that matter, any one of us) know which record to pick? If you had two identical books and were asked for a book, wouldn't you ask which one of the two is wanted?
    Possible solution: identify a key that maintains the integrity of the data, i.e. a combination of the non-key fields that identifies a unique row, and make those fields key fields in the table.
    A more costly solution (if you do not want to turn those non-key fields into key fields) would be to add a check (using a SELECT statement) to see whether the combination of non-key fields already exists in the Z table before inserting a record.
    If it does, issue a message to the user and just update the values in the table; otherwise insert the record.
    Another solution would be to use the non-key fields to generate a key (using some logic) and use that instead of the number range object.
    regards,
    Chen

  • Need to Insert Records in target table.

    Hi All,
    This is my first thread in this forum.
    My source table name is Emp and Target table name is Emp_Org.
    Given below is the structure of the source table
    Empno(p.key) deptno(p.key) empname currenttimestamp
    Target Table MetaData.
    Empno(p.key) country Emporg currenttimestamp Targettimestamp
    My problem is that my source table has 10,000 records. When I run the job, only 4,000 of the 10,000 records are inserted into the target table. If I run the job again, I need it to continue from record 4,001 onwards, not from the beginning.
    I don't know exactly how to write SQL for the above logic.
    I have no idea how to do this. Any help would be really appreciated.
    Edited by: 896227 on Nov 9, 2011 3:46 AM

    Welcome to the forum.
    You can try your hand at MERGE while performing the INSERT operation.
    Something like this:
    MERGE INTO Emp_Org eo
    USING (
              SELECT Empno, deptno, empname, currenttimestamp FROM Emp
           ) x
    ON (eo.Empno = x.Empno)
    WHEN NOT MATCHED THEN
         INSERT (
              Empno,
              country,
              Emporg,
              currenttimestamp,
              Targettimestamp
         )
         VALUES (
              x.Empno,
              'USA', /* constant used here */
              'Emporg', /* constant used here */
              x.currenttimestamp,
              SYSDATE /* constant used here */
         );
    Because MERGE inserts only the rows whose Empno is not already in Emp_Org, re-running it after a partial load simply picks up the remaining records.
    If you are not OK with the solution, then provide some sample data with the expected output.

  • Create Master-Detail records in target table

    hi,
    I have a source table that contains customer info, and each customer record has three fields representing the customer's IDs
    (identity card, SIN, passport ID). I want to create 3 records in the target table for each customer, each record holding one of the IDs.
    How can I do this with OWB?
    thanks.

    hello
    I believe you need to use the pivot operator.
    Check out this blog:
    https://blogs.oracle.com/warehousebuilder/entry/pivoting_data_in_owb
    rgds
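    To illustrate the reshaping outside OWB, a plain SQL UNPIVOT does the same column-to-row split (the table and column names below are placeholders):
    -- Sketch only: turn the three ID columns into three rows per customer.
    SELECT cust_key, id_type, id_value
    FROM   customers
    UNPIVOT (id_value FOR id_type IN (identity_card AS 'IDENTITY_CARD',
                                      sin           AS 'SIN',
                                      passport_id   AS 'PASSPORT_ID'));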

  • Update records in huge table

    Hi,
    I need to update two fields in a huge table (> 200,000,000 records). I've created two basic update scripts with a WHERE clause. The problem is that there is no index on the fields used in the WHERE clause. How can I solve this? Creating a new index is not an option.
    Another option is to update the whole table (i.e. without a WHERE clause), but I don't know whether that would take a long time, lock records, etc.
    Any suggestions?
    Thanks.
    Ken

    Ken,
    You may be better off reading the Metalink documents. PDML stands for Parallel DML. You can use parallel slaves to get the update done quickly. Obviously this depends on the number of parallel slaves you have and the degree you set.
    Search for PDML on Metalink.
    G
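    A minimal sketch of what a parallel update can look like (the table, columns and degree of 8 are placeholders; test the approach on your own system first):
    -- Sketch only: enable parallel DML for the session, then run the update in parallel.
    ALTER SESSION ENABLE PARALLEL DML;
    UPDATE /*+ PARALLEL(t 8) */ big_table t
    SET    t.col1 = 'NEW_VALUE',
           t.col2 = SYSDATE
    WHERE  t.some_flag = 'Y';   -- full scan, so no index is needed
    COMMIT;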

  • Updating records in a table

    Hi,
    ERP: 11.5.10.2 , Database= 9i
    We have a requirement to update one column of a table for a customized Property Manager module.
    Basically this is old data that cannot be entered from the application side, because the field corresponding to this column is disabled in the application. So the only way is to insert/update directly in the database (using an UPDATE statement).
    There are roughly 1000 to 2000 records; in the same table there is another column that can be used in the WHERE condition for updating the required column.
    Any suggestion on how to achieve this task, or any way to reduce the update time?
    Regards

    [Edit: please mark questions as answered when they are. This helps people know what posts to look at.]
    Test data:
    drop table a;
    create table a(id, NBR) as
    select level, level*1000 from DUAL
    connect by level <= 4;
    drop table B;
    create table B(id, NBR) as
    with MULT as(select level from DUAL connect by level <= 10)
    select id, 9999 from a, MULT;
    Solution:
    merge into B
    using (
      select B.rowid RID,
      a.NBR
      from a,B
      where a.id = B.id
      and a.NBR != B.NBR
    ) U
    on (B.rowid = U.RID)
    when matched then update set nbr = u.nbr;
    40 rows merged.
    Run the MERGE twice and the second time 0 rows will be merged. This confirms that rows are updated only when necessary.
    You could update a join, but only if table A has a primary key or unique index on ID.
    Edited by: Stew Ashton on Oct 29, 2012 10:46 AM
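    For completeness, the "update a join" variant mentioned above would look roughly like this, assuming A.ID has a primary key or unique index so the join view is key-preserved:
    -- Sketch only: updatable join view; requires a unique/primary key on A.ID.
    UPDATE (SELECT b.nbr AS b_nbr, a.nbr AS a_nbr
            FROM   a JOIN b ON a.id = b.id
            WHERE  a.nbr != b.nbr)
    SET    b_nbr = a_nbr;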

  • Update records from a table in correct sequence that look from 2 tables and loop

    Hi!
    My question title is kinda unclear, but here is how it goes.
    I created 2 tables for my BOM (Bill of Materials): BOM_Header and BOM_Detail.
    Below are my sample data.
    BOM_Header
    Product_ID Int
    Product_Name Varchar(50)
    Cost Numeric(18,2)
    Yield Varchar(50)
    Select * from BOM_Header
    1 Choco Cake 850.00 10
    2 Mixed Flour 700.00 30
    3 Choco Syrup 160.00 10
    4 Egg Formula 2150.00 20
    BOM_Detail
    Product_ID int
    ItemNo Int
    ItemName varchar(50)
    Quantity int
    Unit varchar(50)
    ProdCost numeric(18,2)
    Select * from BOM_Detail
    1 2 Mixed Four 10 Grams 15.00
    1 3 Choco Syrup 20 ML 25.00
    1 4 Egg Formula 20 Grams 10.00
    2 101 Flour 5 packs 80.00
    2 4 Egg Formula 5 Grams 60.00
    3 201 Cocoa Power 2 kg 20.00
    3 202 Sugar 2 kg 60.00
    4 301 Whole Egg 10 pcs 85.00
    4 302 EP12 Formula 25 ml 52.00
    My computation is below.
    BOM_Header = a
    BOM_Detail = b
    a.Cost = b.Quantity x b.ProdCost  where a.Product_ID = b.Product_ID
    My problem is how to automatically compute the food cost in sequence, from raw materials up to finished products.
    Based on the data, I need to compute the Egg Formula first, because it is used as a component of Mixed Flour, then compute Mixed Flour and the other components to get the cost of Choco Cake.
    How can I do this automatically in a query, so that it first checks the detail for ingredients within a sub-ingredient and computes those before computing the final cost of the product?
    This is because ingredient costs change most of the time, and I need to recalculate to get the most up-to-date cost.
    Any suggestion is very much appreciated.
    Thank you very much,
    Regem

    >> My question title is kinda unclear but here how it goes. <<
    Then your answers will be unclear, too :(
    You do not know data modeling, so your data types are wrong. What math are you doing with the product_id? None. This is why identifiers are char(n) and not numeric. They should be industry standards if possible; I like the GTIN.
    You do not even know that rows are not records.
    Why is the product name fifty characters? Certainly not research! But if you are an Access programmer using a default value, then you might do this.
    Besides violating ISO-11179 rules, “<vague>_field” makes no sense! It is a quantity that you put in a string.
    CREATE TABLE BOM_Header
    (gtin CHAR(15) NOT NULL PRIMARY KEY,
     product_name VARCHAR(25) NOT NULL,
     unit_cost DECIMAL(18,2) NOT NULL
       CHECK (unit_cost >= 0.00),
     yield_qty INTEGER NOT NULL
       CHECK (yield_qty >= 0));
    >> Any suggestion is very much appreciated. <<
    Get a copy of my book on Trees in SQL and read the chapter on BOM problems. I am not going to try to post a whole chapter and diagrams to answer this. You are doing the wrong things and have done them poorly.
    --CELKO-- Books in Celko Series for Morgan-Kaufmann Publishing: Analytics and OLAP in SQL / Data and Databases: Concepts in Practice Data / Measurements and Standards in SQL SQL for Smarties / SQL Programming Style / SQL Puzzles and Answers / Thinking
    in Sets / Trees and Hierarchies in SQL
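    That said, with the two tables exactly as posted, one common way to roll the cost up is to explode the BOM recursively down to leaf items and sum cost * accumulated quantity per finished product. This is only a sketch, assuming a component is a sub-assembly whenever its ItemNo also appears as a Product_ID in BOM_Detail:
    -- Sketch only: recursive BOM explosion and cost rollup.
    WITH Exploded (Root_ID, ItemNo, Qty, ProdCost) AS (
        SELECT d.Product_ID, d.ItemNo, CAST(d.Quantity AS DECIMAL(18,4)), d.ProdCost
        FROM   BOM_Detail d
        UNION ALL
        SELECT e.Root_ID, d.ItemNo, CAST(e.Qty * d.Quantity AS DECIMAL(18,4)), d.ProdCost
        FROM   Exploded e
        JOIN   BOM_Detail d ON d.Product_ID = e.ItemNo
    )
    SELECT e.Root_ID AS Product_ID,
           SUM(e.Qty * e.ProdCost) AS Rolled_Up_Cost
    FROM   Exploded e
    WHERE  NOT EXISTS (SELECT 1 FROM BOM_Detail d WHERE d.Product_ID = e.ItemNo)  -- leaf items only
    GROUP BY e.Root_ID;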

  • Fail to update record in the RowUpdating event

    If I have understood correctly, the RowUpdating event is raised before the row is sent to the database.
    I want to modify the row passed to this event before it is sent to Oracle, to add traceability info (the user's name and the date of modification).
    No exception is raised, but the modifications I make to the row are not saved in the database.
    You will find my sample code below.
    Thanks
    Benoit FRANC
    Atos Origin Belgium
    Sample code
    First create a table with this script:
    DROP TABLE TEST CASCADE CONSTRAINTS;
    CREATE TABLE TEST (
    ID NUMBER (10) NOT NULL,
    DATA NUMBER,
    STRING VARCHAR2 (50),
    CONSTRAINT PK_TEST
    PRIMARY KEY ( ID ) ) ;
    Manually enter a row with ID = 1, DATA = 1 and STRING = 'A'.
    Imports System.Data
    Imports Oracle.DataAccess.Client
    Private Sub Button1_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button1.Click
    Dim sSql As String
    Dim mConnection As OracleConnection
    mConnection = New OracleConnection("User Id=eperform;Password=eperform;Data Source=tractDev")
    mConnection.Open()
    sSql = "SELECT * FROM TEST WHERE ID = 1"
    Dim oraDA As OracleDataAdapter = New OracleDataAdapter(sSql, mConnection)
    Dim oraCB As OracleCommandBuilder = New OracleCommandBuilder(oraDA)
    Dim oraDS As DataSet = New DataSet()
    oraDA.FillSchema(oraDS, SchemaType.Source, "SourceTable")
    oraDA.Fill(oraDS, "SourceTable")
    oraDS.Tables(0).Rows(0).Item("STRING") = "Hello" & Now 'only to see the change in the db
    Try
    AddHandler oraDA.RowUpdating, AddressOf OnRowUpdating
    oraDA.Update(oraDS, "SourceTable")
    Catch ex As Exception
    MsgBox(ex.Message)
    Debug.WriteLine(ex.Message)
    Debug.WriteLine(ex.StackTrace.ToString)
    End Try
    mConnection.Close()
    mConnection.Dispose()
    End Sub
    Private Sub OnRowUpdating(ByVal sender As Object, ByVal args As OracleRowUpdatingEventArgs)
    Try
    args.Row("DATA") = -1
    Catch ex As Exception
    Debug.WriteLine(ex.Message)
    Debug.WriteLine(ex.StackTrace.ToString)
    End Try
    End Sub

    benoit
    I have tried working with the RowUpdating and RowUpdated event handlers, and they work well. I did not go through your code, but I am pasting the code I tried; the update happens as expected. Hope that helps.
    Jagriti
    PS: This code is not the complete class; I have pasted only the relevant methods.
    Public Function getConnection() As OracleConnection
    'Return connection reference
    Return conn
    End Function
    'This method updates the database using dataAdapter
    Public Function UpdateRecords()
    Try
    sbox.AppendText("Updating the database" + Environment.NewLine)
    sbox.Update()
    'Update the database with the help of dataAdapter update method
    dataAdapter.Update(dset, "COUNTRYTAB")
    sbox.Update()
    Catch ex As Exception
    sbox.AppendText("Error while Updating the database" + Environment.NewLine)
    sbox.AppendText(ex.ToString() + Environment.NewLine)
    sbox.Update()
    End Try
    End Function
    'RowUpdating EventHandler
    Private Sub OnRowUpdating(ByVal sender As Object, ByVal e As OracleRowUpdatingEventArgs)
    sbox.AppendText("** OnRowUpdating() Called **" + Environment.NewLine)
    sbox.Update()
    End Sub
    'RowUpdated EventHandler
    Private Sub OnRowUpdated(ByVal sender As Object, ByVal e As OracleRowUpdatedEventArgs)
    sbox.AppendText("** OnRowUpdated() Called **" + Environment.NewLine)
    sbox.Update()
    End Sub
    'This method adds a RowUpdating event handler if it 'is not added already.
    Public Function AddUpdatingHandler()
    If updatingCount = 0 Then
    'Add the eventHandler
    AddHandler dataAdapter.RowUpdating, AddressOf OnRowUpdating
    updatingCount = 1
    End If
    End Function
    'This method adds a RowUpdated event handler if it is not added already.
    Public Function AddUpdatedHandler()
    If updatedCount = 0 Then
    'Add the eventHandler
    AddHandler dataAdapter.RowUpdated, AddressOf OnRowUpdated
    updatedCount = 1
    End If
    End Function

  • XSU : Updating records in the table

    Hi,
    XSU seems to assume that the data from the XML document always has to be INSERTED into the database. Shouldn't it UPDATE existing records and INSERT only the new records?
    How should this be implemented?

    A future release is planned to offer this so-called "upsert" functionality, but the current release requires you to know whether you want to insert, update, or delete.
    You can use the insert functionality, in combination with an INSTEAD OF INSERT trigger to programmatically handle the "update-if-already-exists" functionality.
    See Example 12-17 on page 465 of "Building Oracle XML Applications" for a concrete example of this hand-coded "upsert" functionality.
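    A rough sketch of that trigger-based "upsert" idea, assuming a simple view over a table with a single key column (all object names here are placeholders, not the book's example):
    -- Sketch only: INSTEAD OF INSERT trigger that updates when the key already exists.
    CREATE OR REPLACE VIEW item_v AS SELECT item_id, item_name FROM item_t;
    CREATE OR REPLACE TRIGGER item_v_ioi
    INSTEAD OF INSERT ON item_v
    FOR EACH ROW
    BEGIN
      UPDATE item_t
      SET    item_name = :NEW.item_name
      WHERE  item_id   = :NEW.item_id;
      IF SQL%ROWCOUNT = 0 THEN
        INSERT INTO item_t (item_id, item_name)
        VALUES (:NEW.item_id, :NEW.item_name);
      END IF;
    END;
    /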

  • Polling for database updates fails to update sequence file/table

    I have a small system to poll for changes to a database table of student details. It consists of:
    a Database adapter, which polls the database for changes, retrieves a changed row, and passes the generated XML to -
    a "receive" router service, which simply passes the retrieved database XML data to -
    an "execute" router service, which transforms the XML and passes the new message to -
    a File adapter, which writes the transformed XML to a file.
    The problem is that the database polling is not updating the sequence recorder. I have tried using a sequence file that stores the person number, a sequence table that stores the person number, and a sequence table that stores the last-updated timestamp. In all cases the database adapter successfully reads the sequence file/table and retrieves the correct row based on the sequence value (I have tested this by manually changing the sequence value), the data is transformed and a correct XML file is created, but the system then fails to update the sequence file/table, so that when the next polling time comes around the very same database row is extracted again and again.
    In the ESB control panel I have one error: "Response payload for operation "receive" is invalid!"
    The Trace is:
    oracle.tip.esb.server.common.exceptions.BusinessEventRejectionException: Response payload for operation "receive" is invalid! at oracle.tip.esb.server.service.EsbRouterSubscription.processEventResponse(Unknown Source) at oracle.tip.esb.server.service.EsbRouterSubscription.onBusinessEvent(Unknown Source) at oracle.tip.esb.server.dispatch.EventDispatcher.executeSubscription(Unknown Source) at oracle.tip.esb.server.dispatch.InitialEventDispatcher.processSubscription(Unknown Source) at oracle.tip.esb.server.dispatch.InitialEventDispatcher.processSubscriptions(Unknown Source) at oracle.tip.esb.server.dispatch.EventDispatcher.dispatchRoutingService(Unknown Source) at oracle.tip.esb.server.dispatch.InitialEventDispatcher.dispatch(Unknown Source) at oracle.tip.esb.server.dispatch.BusinessEvent.raise(Unknown Source) at oracle.tip.esb.utils.EventUtils.raiseBusinessEvent(Unknown Source) at oracle.tip.esb.server.service.impl.inadapter.ESBListenerImpl.processMessage(Unknown Source) at oracle.tip.esb.server.service.impl.inadapter.ESBListenerImpl.onMessage(Unknown Source) at oracle.tip.adapter.fw.jca.messageinflow.MessageEndpointImpl.onMessage(MessageEndpointImpl.java:281) at oracle.tip.adapter.db.InboundWork.onMessageImpl(InboundWork.java:1381) at oracle.tip.adapter.db.InboundWork.onMessage(InboundWork.java:1291) at oracle.tip.adapter.db.InboundWork.transactionalUnit(InboundWork.java:1262) at oracle.tip.adapter.db.InboundWork.runOnce(InboundWork.java:501) at oracle.tip.adapter.db.InboundWork.run(InboundWork.java:401) at oracle.tip.adapter.db.inbound.InboundWorkWrapper.run(InboundWorkWrapper.java:43) at oracle.j2ee.connector.work.WorkWrapper.runTargetWork(WorkWrapper.java:242) at oracle.j2ee.connector.work.WorkWrapper.doWork(WorkWrapper.java:215) at oracle.j2ee.connector.work.WorkWrapper.run(WorkWrapper.java:190) at EDU.oswego.cs.dl.util.concurrent.PooledExecutor$Worker.run(PooledExecutor.java:819) at java.lang.Thread.run(Thread.java:595)
    The payload is:
    <PersonCollection xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns="http://xmlns.oracle.com/pcbpel/adapter/db/top/ISSSimIN">
    <Person>
    <name>Bob</name>
    <surname>Stupid</surname>
    <istatus>2</istatus>
    <active>Active</active>
    <personnumber>3001</personnumber>
    <role>Staff</role>
    <organization>TEX</organization>
    <updateTimestamp>2007-11-29T14:06:55.000+00:00</updateTimestamp>
    </Person>
    </PersonCollection>
    The payload above is the XML output from the Database adapter, the xsd for which was autogenerated by JDeveloper, so I don't understand how it can be invalid.
    I have only been working with ESB for 3 days, but half of that time has now been spent stuck on this issue. Has anyone got any ideas?
    Richard

    You need to be careful. Submit and committing to the database are two different things.
    Submit on a page (and the autoSubmit property) only posts the changes you typed into the field back down to the application server. NOTHING is happening with the database at this point. So the application's internal "cache" of records (using something like ADF BC) is holding the changes, but the database doesn't know anything about them. So if you quit now, no changes will go to the database.
    These changes only get submitted down to the app server when you perform some action, such as pressing a button. Simply navigating between fields does nothing towards the app server. If you add autoSubmit, then the changes in that field are automatically submitted down to the app server when you leave the field (but again, this does not touch the database).
    You have to explicitly add a COMMIT operation to save those changes to the database.
    Maybe if you try in a simple EMP example and send us your findings we can help direct you further.
    Regards
    Grant

  • Help: two SQL queries to load target table

    create table table1
        (code varchar2(10)
         ,mod_time date);
         insert into table1 values('2533',to_date('31-JUL-2012', 'DD-MON-YY'));
         insert into table1 values('2534',to_date('31-JUL-2012', 'DD-MON-YY'));
          insert into table1 values('2535',to_date('01-SEP-2012', 'DD-MON-YY'));
          create table table2
        (code varchar2(10)
         ,ID   NUMBER
         ,TYPE VARCHAR2(3)
         ,mod_time date);
         insert into table2 values('2533',1,'AB',to_date('01-SEP-2012', 'DD-MON-YY'));
         insert into table2 values('2534',1,'CD',to_date('01-SEP-2012', 'DD-MON-YY'));
         create table table3
         (ID   NUMBER
          ,mod_time date);
          insert into table3 values(1,to_date('01-SEP-2012', 'DD-MON-YY'));
          insert into table3 values(2,to_date('01-SEP-2012', 'DD-MON-YY'));
          create table target
          (code varchar2(10)
          ,ID   NUMBER
          ,TYPE VARCHAR2(3)
          ,table1_modtime date
          ,table2_modtime date
          ,table3_modtime date
          ,valid_till  date
          ,valid_from  date
          );
          drop table target;
          Combine all three tables where table1.mod_time <= 31-jul-2012, table2.mod_time <= 31-jul-2012 and table3.mod_time <= 31-jul-2012.
          The initial load to the target should get the information from all the tables with mod_time <= 31-jul-2012.
          I need a SELECT query to produce the 1st-load expected output below.
          valid_till defaults to to_date('31-DEC-2030', 'DD-MON-YY')
          valid_from defaults to to_date('01-01-1995', 'DD-MM-YYYY')
          1st load expected output
          code    id     type  table1_modtime   table2_modtime   table3_modtime   valid_from    valid_till
          2533                 31-JUL-2012                                        01-01-1995    31-DEC-2030
          2534                 31-JUL-2012                                        01-01-1995    31-DEC-2030
          For the second load, first get the query to fetch the values from all the tables where mod_time > 31-jul-2012.
          If the code already exists, then update the existing record's valid_till to 02-jan-2013 (i.e. yesterday)
          and insert a new record with the new values and valid_from set to 03-jan-2013 (i.e. today).
          2nd load expected output
          code    id     type  table1_modtime   table2_modtime   table3_modtime   valid_from    valid_till
          2533                 31-JUL-2012                                        01-01-1995    02-jan-2013  -- updating old record valid_till to yesterday
          2534                 31-JUL-2012                                        01-01-1995    02-jan-2013  -- updating old record valid_till to yesterday
          2533    1      AB    31-JUL-2012      01-SEP-2012      01-SEP-2012      03-jan-2013   31-DEC-2030  -- new record with valid_from today
          2534    1      CD    31-JUL-2012      01-SEP-2012      01-SEP-2012      03-jan-2013   31-DEC-2030  -- new record with valid_from today
          2535                 01-SEP-2012                                        01-01-1995    31-DEC-2030  -- no record exists, so insert new record in target table
      Edited by: choti on Jan 3, 2013 5:01 PM

    Hi ,
    Just check the data you have inserted into the tables, because when I try to join the tables and apply the date constraints, the query returns no rows:
    select t1.code as code, t3.id as id, t2.type as type, t1.mod_time as table1_modtime, t2.mod_time as table2_modtime, t3.mod_time as table3_modtime
    from table1 t1, table2 t2, table3 t3
    where t1.code = t2.code
    and t2.id = t3.id
    and t1.mod_time <= to_date('31-jul-2012', 'DD-MON-YY')
    and t2.mod_time <= to_date('31-jul-2012', 'DD-MON-YY')
    and t3.mod_time <= to_date('31-jul-2012', 'DD-MON-YY');
    This is because for codes 2533 and 2534 in table1 the id entry in table2 is 1, but the date entry in table3 for id 1 is 01-SEP-2012, which falls outside the date range. Kindly confirm this data entry in table2.
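    Also, since the expected first-load output keeps 2533 and 2534 even though their table2/table3 rows are dated after 31-jul-2012, outer joins are probably what is needed. A sketch of the first load only (defaults hard-coded as in the post):
    -- Sketch only: initial load with outer joins so codes load even when table2/table3 have no qualifying rows.
    select t1.code,
           t2.id,
           t2.type,
           t1.mod_time                           as table1_modtime,
           t2.mod_time                           as table2_modtime,
           t3.mod_time                           as table3_modtime,
           to_date('01-01-1995', 'DD-MM-YYYY')   as valid_from,
           to_date('31-DEC-2030', 'DD-MON-YYYY') as valid_till
    from   table1 t1
           left join table2 t2 on t2.code = t1.code
                              and t2.mod_time <= to_date('31-JUL-2012', 'DD-MON-YYYY')
           left join table3 t3 on t3.id = t2.id
                              and t3.mod_time <= to_date('31-JUL-2012', 'DD-MON-YYYY')
    where  t1.mod_time <= to_date('31-JUL-2012', 'DD-MON-YYYY');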
