Updating target table in OWB

I have a target table with columns structure_id, case_num, user_id, user_name, eff_beg_dt, eff_end_dt. Whenever any of the tracked columns (user_id or user_name) changes for a given case_num, I have to close out the current row by updating its eff_end_dt and insert a new row with a new eff_beg_dt.
My source table contains only case_num, user_id, user_name, which I got from a previous process (delta/changed rows only). Now for all these case_nums in the source table I want to update eff_end_dt to the current month's end date and insert a new row with the source table's case_num, user_id, user_name and an eff_beg_dt of next month's start date.
How do I achieve this in OWB? I am very new to OWB, and when I set the loading type to UPDATE I don't know what "Match by constraint" should be.
I have to match case_num in the source to case_num in the target and update eff_end_dt, but case_num is not a unique key.
Can you please walk me through the process?
Thanks

Hi,
As I understand it, the problem is straightforward.
1. First create a mapping where you drag all the fields from the source table to the target table. Set the target table's loading type to UPDATE. The update matching condition should be case_num = case_num from the source table, and the field that needs updating is your eff_end_dt; map it to the appropriate value. Set "Match by constraint" to "No Constraints" and define the matching condition at the target table's field level.
2. Then create a second mapping that does only the inserts. Map all the rows appearing in the source (your delta rows, I think) to the corresponding fields in the target table, with the load type set to INSERT. Derive eff_beg_dt per your logic.
The above two steps can be done within one mapping I believe, by setting the target load order.
Hope this helps
-AP
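For reference, the SQL those two steps effectively generate looks roughly like this. This is only a sketch: the table names come from the question, while the `eff_end_dt IS NULL` filter and the exact date arithmetic are assumptions about how the open row is identified and what "month end" / "month start" mean here.

```sql
-- Step 1 (UPDATE mapping): close out the current row for every changed case_num.
UPDATE target_table t
   SET t.eff_end_dt = LAST_DAY(TRUNC(SYSDATE))        -- current month's end date
 WHERE t.case_num IN (SELECT s.case_num FROM source_table s)
   AND t.eff_end_dt IS NULL;                          -- assumed marker for the open row

-- Step 2 (INSERT mapping): add the new effective-dated row from the delta.
INSERT INTO target_table (case_num, user_id, user_name, eff_beg_dt, eff_end_dt)
SELECT s.case_num, s.user_id, s.user_name,
       ADD_MONTHS(TRUNC(SYSDATE, 'MM'), 1),           -- next month's start date
       NULL
  FROM source_table s;
```

Without some filter like `eff_end_dt IS NULL`, the UPDATE would touch every historical row for a case_num, which matters precisely because case_num is not unique.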

Similar Messages

  • Updating target table with date in owb using mapping

    I want to update the effective begin date (month begin date) when we load the target table. I have a field in the target table, Eff_updated_dt, defined as DATE.
    If I use the Data Generator operator followed by an Expression operator (trunc(sysdate,'mm')), it gives an error saying the Data Generator operator must be connected to a flat file.
    What other operators can I use to populate the column with the month begin date?
    Also, which operator do I use to get a sequence key in the table?
    Thanks

    You can always use a Constant operator for the date field.
    Just create a constant of type DATE, give it the value you want, trunc(sysdate,'mm'), and connect it to your target column Eff_updated_dt.
    There is also a Sequence operator you can use to feed your "sequence key".
    Borkur
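    The two pieces of the answer map to plain SQL like this (the sequence name is hypothetical):

    ```sql
    -- Value for the Constant operator feeding Eff_updated_dt: the month-begin date.
    SELECT TRUNC(SYSDATE, 'MM') AS eff_updated_dt FROM dual;

    -- The Sequence operator wraps an ordinary database sequence (name assumed).
    CREATE SEQUENCE my_target_seq START WITH 1 INCREMENT BY 1;
    SELECT my_target_seq.NEXTVAL FROM dual;
    ```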

  • Updating target tables....

    Hello,
    I am actually in quite a predicament. I need to update a table from a view. The table has 3 columns (col1, col2, col3) that are NULL in all records, while the other columns already contain data. So what I basically do is the following:
    I map the view columns to the table, then click on the table to open the table operator properties and set the loading type to UPDATE. Then, under conditional loading, in the "Target filter for update" I enter the following condition:
    ( INOUTGRP1.col4 IS NOT NULL ), which validates successfully.
    Note that "Match by constraints" is set to NO_CONSTRAINTS.
    After that I click on col1, col2 and col3 respectively and set "Match column when updating row" to Yes.
    Unfortunately, these warnings appear:
    VLD-2753: All mapped attributes are used in matching criteria in 15_REC.
    It will be a meaningless update action if all the attributes that are mapped are used for matching. The update statement will select the rows which satisfy the match condition and update them with the same values. Specify at least one mapped attribute of MY_TABLE to be used for updating by setting Update: Use for Matching to No: ( COL1 COL2 COL3 ).
    VLD-2761: Unable to generate Merge statement.
    Merge statement cannot be generated because column COL4 is used for both matching and update. A matching column cannot be updated in a merge statement.
    So if anyone has any kind of idea, please help as soon as possible, since this is actually urgent.
    Many thanks, and grateful for reading through,
    Hossam

    Hey David,
    Well, the columns that I want to update are col1, col2 and col3. I don't want to match any column between the view and the table; I just want to carry out the update under the condition that table1.col1, table1.col2 and table1.col3 are NULL.

  • Update a table using OWB

    Hi All,
    I have a unique requirement where the data comes from two different sources. To put the cases in perspective:
    - Table A and Table B have common IDs but different information about the same entity.
    - Table A supplies the commanding records going into the target table, and Table B fills in details not available in Table A.
    So in essence I am merging data, which can be accomplished by UPDATE/INSERT. However, I am constrained by the date element: data from Table B merges into the target table differently for a given period of time. So what I have done is create a bunch of mappings just to update the table with the logic for the different years, using the UPDATE operator, and the mapping is taking a long time: just to update 122,360 records it has been running for about 2 hours.
    Any suggestions on how to better approach this?
    Regards,
    Malam

    Hi Malam,
    there is no simple answer to your question. SQL tuning is perhaps the most complex subject in relational databases.
    For an example of successful SQL tuning, look at this thread: Hanging deployment
    Regards,
    Oleg

  • Updating Target Table using DBMS_Scheduler for every 10 seconds

    Hi All,
    I am new to the DBMS_SCHEDULER package.
    Based on the examples, I have written an anonymous block to create a job; it needs a correction in the repeat_interval value.
    I have two tables, emp (source) and sk_emp (target),
    and I want the sk_emp table to be updated every 10 seconds.
    CREATE PROCEDURE sk_insert_records
    IS
    BEGIN
       MERGE INTO sk_emp se
          USING emp e
          ON (e.empno = se.empno)
          WHEN MATCHED THEN
             UPDATE
                SET se.ename = e.ename, se.job = e.job, se.mgr = e.mgr,
                    se.hiredate = e.hiredate, se.sal = e.sal, se.comm = e.comm,
                    se.deptno = e.deptno
                WHERE se.empno = e.empno
          WHEN NOT MATCHED THEN
             INSERT (empno, ename, job, mgr, hiredate, sal, comm, deptno)
             VALUES (e.empno, e.ename, e.job, e.mgr, e.hiredate, e.sal, e.comm,
                     e.deptno);
    END;
    I have written the above procedure, and below is my anonymous block calling dbms_scheduler:
    BEGIN
       DBMS_SCHEDULER.create_job (job_name              => 'empdetails',
                                  job_type                 => 'STORED_PROCEDURE',
                                  job_action               => 'ngshare.sk_insert_records',
                                  number_of_arguments      => 0,
                                  start_date               => TRUNC (SYSDATE),
                                  repeat_interval          => 'FREQ=DAILY;BYSECOND=10',
                                  end_date                 => NULL,
                                  job_class                => 'DEFAULT_JOB_CLASS',
                                  enabled                  => TRUE,
                                  auto_drop                => FALSE,
                               comments                 => 'DONE BY SUNIL');
    END;
    Can you please let me know the solution? If my repeat_interval is correct, please let me know what else needs to be done.
    This is my Version
    Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - 64bit Production
    PL/SQL Release 11.2.0.1.0 - Production
    CORE 11.2.0.1.0 Production
    TNS for Linux: Version 11.2.0.1.0 - Production
    NLSRTL Version 11.2.0.1.0 - Production
    Thanks for your help
    Edited by: NSK2KSN on Jul 26, 2010 11:06 AM

    Hi, it's working. The problem was in repeat_interval; it's working now.
    I changed
      repeat_interval          => 'FREQ=DAILY;BYSECOND=10',
    to
      repeat_interval          => 'FREQ=SECONDLY;BYSECOND=10',
    Thanks,
    Edited by: NSK2KSN on Jul 26, 2010 11:14 AM
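    One caveat worth noting: in Oracle's calendaring syntax, 'FREQ=SECONDLY;BYSECOND=10' fires only when the seconds field equals 10, i.e. once per minute. If the goal is literally every 10 seconds, the interval form would be used instead (a sketch reusing the poster's job and procedure names):

    ```sql
    BEGIN
       DBMS_SCHEDULER.create_job (
          job_name        => 'empdetails',
          job_type        => 'STORED_PROCEDURE',
          job_action      => 'ngshare.sk_insert_records',
          start_date      => SYSTIMESTAMP,
          repeat_interval => 'FREQ=SECONDLY;INTERVAL=10',  -- every 10 seconds
          enabled         => TRUE);
    END;
    /
    ```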

  • How to control unique key target table mapping!

    Hi all,
    Is there any way to map a source table to a target when the source table contains duplicate rows and only one of them is needed in the target? Can OWB insert just one of the duplicate rows, ignore the others according to a unique key, and continue executing the map?
    Thanks in advance!

    If you have duplicates and want to filter them out, try creating a view over the source table that uses the ROW_NUMBER() analytic function.
    Then in your mapping use this view as your source and put a filter on it where the row number = 1. This will select only one row per key.
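    A sketch of such a deduplicating view, assuming a source table with a case_num key (all names here are hypothetical):

    ```sql
    CREATE OR REPLACE VIEW src_dedup_v AS
    SELECT case_num, user_id, user_name
      FROM (SELECT s.case_num, s.user_id, s.user_name,
                   ROW_NUMBER() OVER (PARTITION BY s.case_num   -- the intended unique key
                                      ORDER BY s.user_id) AS rn -- any deterministic order
              FROM source_table s)
     WHERE rn = 1;  -- keep exactly one row per key, ignore the other duplicates
    ```

    Using this view as the mapping's source means the map only ever sees one row per key and can continue executing normally.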

  • In OWB I need to update the target table with same field for match/update

    In OWB I am trying to update the target table where the match and the update are on the same field. Can this be done? I am getting a match/merge error saying you cannot update and match on the same field. But in SQL my statement is:
    Update table
    set irf = 0
    where irf = 1
    and process_id = 'TEST'
    How do I do this in OWB?

    table name is temp
    fields in the table
    field1 number
    field2 varchar2(10)
    field3 date
    Example values in the table:
    0,'TEST',05/29/2009
    9,'TEST',05/29/2009
    0,'TEST1',03/01/2009
    1,'TEST1',03/01/2009
    In the above example I need to update field1 of the first row to 1.
    Update temp
    set field1 = 1
    where field1 = 0
    and field2 = 'TEST'
    When I run this, just one row should be updated, giving:
    1,'TEST',05/29/2009
    9,'TEST',05/29/2009
    0,'TEST1',03/01/2009
    1,'TEST1',03/01/2009
    But when I run my mapping, the second row (the one with 9) also gets updated to 1:
    1,'TEST',05/29/2009
    1,'TEST',05/29/2009
    0,'TEST1',03/01/2009
    1,'TEST1',03/01/2009

  • How do I update/insert into a target table, rows after date X

    Hi all
    I have a mapping from source table A to target table B, with identical table structure.
    Rows in table A are updated and inserted daily. Every week I want to synchronize this with table B.
    I have CREATION_DATE and LAST_UPDATE_DATE on both tables. I want to pass a date parameter X into this mapping which tells it:
    "if CREATION_DATE is past X then insert this row into B; if LAST_UPDATE_DATE is past X then update this row in B"
    Please can you help me work out how to map this correctly as I am new to OWB.
    Many thanks
    Adi

    Hi,
    You can achieve this by -
    1. Create a control table, say Control_Table, with the structure
    (Map_Name, Last_Load_Date). Populate this table with the mappings that synchronize your Table B.
    2. Alter the mapping that loads Table B to use the above control table. To get all the relevant records from Table A, join Table A and Control_Table with the condition
    Control_Table.Map_Name = <mapping name>
    AND ( TableA.Creation_Date > Control_Table.Last_Load_Date
    OR TableA.Last_Update_Date > Control_Table.Last_Load_Date )
    3. Then use UPDATE/INSERT on Table B based on the keys. This takes care of INSERT (if not present) / UPDATE (if the row already exists).
    4. Schedule the mapping to run weekly.
    5. Maintain Control_Table, advancing Last_Load_Date on each run so the mapping picks up only the data that arrived since the last time Table B was synchronized.
    HTH
    Mahesh
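    A sketch of the source query this scheme produces, plus the watermark maintenance from step 5 (the mapping-name literal is hypothetical):

    ```sql
    -- Step 2: pick up only rows created or changed since the last load.
    SELECT a.*
      FROM TableA a
      JOIN Control_Table c
        ON c.Map_Name = 'MAP_LOAD_TABLE_B'       -- hypothetical mapping name
     WHERE a.Creation_Date    > c.Last_Load_Date
        OR a.Last_Update_Date > c.Last_Load_Date;

    -- Step 5: after a successful load, advance the watermark.
    UPDATE Control_Table
       SET Last_Load_Date = SYSDATE
     WHERE Map_Name = 'MAP_LOAD_TABLE_B';
    ```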

  • Does "Target table" and "Error Table" need to be in same DB as OWB's?

    At OWB execution time we get the following error: ORA-00600: internal error code, arguments: [opixrb-3], [1036], ORA-01036: illegal variable name/number.
    OWB seems to successfully log the error rows in the error table when (a) the target table and (b) its error table are in the same database as OWB, but it throws ORA-01036 when they are in a database remote from OWB's.
    Does anyone have any information about this? It would be of great help.
    Edited by: ghiyer on Jul 20, 2009 8:42 AM

    Oracle has come back to us saying that
    "DML error logging feature is not supported for distributed DML."
    Bug 5698887 has more information about this.

  • Merge update source table and delete from target table problem

    Hello Friends, 
    I am a newbie in SQL Server world and I am in a situation where I need to delete the bunch of records from the TARGET table using the values from the SOURCE table. 
    The TARGET table has close to 400 Million records, so I need to delete the records in small batches of about ~10,000 rows.
    I figured out a way to delete in batches by referring to the following 2 posts:
    http://sqlperformance.com/2013/03/io-subsystem/chunk-deletes
    http://dba.stackexchange.com/questions/1750/methods-of-speeding-up-a-huge-delete-from-table-with-no-clauses
    I think my best option to delete and update in one pass is a MERGE statement, so I constructed the following SQL.
    MERGE dbo.table1 AS TARGET
    USING 
    SELECT File_name FROM dbo.table2
    WHERE  FILE_DESC = 'EDI'
    AND [Processed_date] < DATEADD (WEEK, -10, Getdate ()) AS SOURCE
    ON (TARGET.File_name = SOURCE.File_name)
    WHEN MATCHED THEN DELETE (FROM THE TARGET)
    WHEN MATCHED 
        THEN UPDATE SET SOURCE.PROCESS_delete_date = GETDATE()
    But, when executed, it throws following error and I am struggling to figure out what is wrong with the above syntax. 
    Msg 156, Level 15, State 1, Line 3
    Incorrect syntax near the keyword 'SELECT'.
    Msg 156, Level 15, State 1, Line 5
    Incorrect syntax near the keyword 'AS'.
    Can any expert please help a newbie as I learn the new way.
    Thanks a lot.

    Visakh, per the Microsoft documentation we can have more than one MATCHED clause in a MERGE, but we need to add a condition along with the match. Thanks for your prompt response on this query. Your query is logically fine, but when executed it throws the following errors:
    Msg 156, Level 15, State 1, Line 2: Incorrect syntax near the keyword 'DELETE'.
    Msg 102, Level 15, State 1, Line 8: Incorrect syntax near ')'.
    Remember, my server machine is the 2005 version and my work machine is 2012; I don't know why your query is not working.
    MERGE is available only from 2008 onwards
    Yes you're correct but again not more than two  MATCHED clauses even if you specify condition
    see MSDN documentation below
    WHEN MATCHED THEN <merge_matched>
    Specifies that all rows of target_table that match the rows returned by <table_source> ON <merge_search_condition>, and satisfy any additional search condition, are either updated or deleted according to the <merge_matched> clause.
    The MERGE statement can have at most two WHEN MATCHED clauses. If two clauses are specified, then the first clause must be accompanied by an AND <search_condition> clause
    from
    http://msdn.microsoft.com/en-us/library/bb510625.aspx
    Also I guess Composable DML which used is also not present in 2005
    So in your case you can try this instead
    DECLARE @DELETED_FILES table
    (
        File_Name varchar(100)
    );
    DELETE t
    OUTPUT DELETED.File_Name INTO @DELETED_FILES
    FROM dbo.table1 t
    INNER JOIN dbo.table2 s
    ON t.File_name = s.File_name
    WHERE s.FILE_DESC = 'EDI'
    AND s.[Processed_date] < DATEADD (WEEK, -10, Getdate ())
    UPDATE r
    SET r.Process_Delete_Date = GETDATE()
    FROM dbo.table2 r
    INNER JOIN @DELETED_FILES AS p
    ON p.File_Name = r.File_Name
    Visakh

  • How to load  a tables one partition of data in to OWB target table

    Hey ,
    I want to load only one partition's data from the source (terabytes in size) into OWB target tables. I used a filter condition on the partitioning key value, but it takes a very long time to load the data into the target table.
    I feel I should configure the source table as well as the target table and give a HINT in the operator settings. Is that the right way to improve loading performance?
    Can I use the PEL (partition exchange loading) option on the source as well as the target?
    Please advise.
    thanks
    murari


  • Is it possible to run an update via target tables post load command

    Is it possible to run a SQL update statement via the post-load command of a target table?
    What does "post-load command" mean: does it get triggered only after a successful execution of a dataflow (to the target table)?
    Or can it also get triggered even if the dataflow fails?
    This is the SQL I would like to run via post-load, only on dataflow success:
    Update tbl_job_status set end_time = nvl(NULL, sysdate()) where job_id = (select max(job_id) from tbl_job_status where j_dataflowname = 'Chk_Order_Delta')

    I do not believe it will necessarily run if the dataflow fails. If you want your job status table updated no matter what (apart from a complete DS meltdown), you'll want to wrap things in a try-catch to make sure you're handling the update. You can do it in a script, or with a dataflow. A nice way to handle all this sort of operational-metadata tracking is to write yourself a library of custom functions and then consistently call them before and after the jobs, workflows, or other components you want to track. If you've got access to a Rapid Mart, you might look at what they've done there -- nice stuff (although you can't just "borrow" it).

  • Interface performs insert rather then update on target table

    I cannot figure out how to get my interface to update target records, rather than insert new records, when a source value changes.
    I have primary keys defined.

    Turns out one of my primary key columns was being set inside the interface; its value did not exist in the source, so rows never matched and were flagged for insert instead of update.
    Thanks!

  • The size of the target table grows abnormaly

    hi all,
    I am currently using OWB (version 9.2.0.4) to feed some tables.
    We have created a new 9.2.0.5 database for a new data warehouse.
    I have an issue that I really cannot explain about the growing size of the target tables.
    Take the example of a parameter table that contains 4 fields and only 12 rows:
    CREATE TABLE SSD_DIM_ACT_INS
    (
    ID_ACT_INS INTEGER,
    COD_ACT_INS VARCHAR2(10 BYTE),
    LIB_ACT_INS VARCHAR2(80 BYTE),
    CT_ACT_INS VARCHAR2(10 BYTE)
    )
    TABLESPACE IOW_OIN_DAT
    PCTUSED 0
    PCTFREE 10
    INITRANS 1
    MAXTRANS 255
    STORAGE (
    INITIAL 1M
    MINEXTENTS 1
    MAXEXTENTS 2147483645
    PCTINCREASE 0
    BUFFER_POOL DEFAULT
    )
    LOGGING
    NOCACHE
    NOPARALLEL;
    This table is fed by a mapping in which I use the update/insert option, which generates a MERGE.
    First the table is empty; I run the mapping and it adds 14 rows.
    The size of the table is now 5 MB!!
    Then I delete 2 rows by SQL with TOAD
    and run the mapping again. It updates 12 rows and adds 2 rows.
    At this point the size of the table has increased by 2 MB (1 MB per row!!);
    the size of the table is now 7 MB!!
    I do the same again and get a 9 MB table.
    When I delete 2 rows with a SQL statement and recreate them manually, the size of the table does not change.
    When I create a copy of the table with an INSERT ... SELECT statement, the size becomes 1 MB, which is normal.
    Could someone explain to me how this can be possible?
    Is it a problem with the database? With the configuration of OWB?
    What should I check?
    Thank you for your help.

    Hi all
    We have found the reason for the growth.
    Each mapping has a loading HINT which defaults to PARALLEL APPEND. As I understand it, OWB uses this to decide whether an insert allocates new space for the table (direct-path, above the high-water mark) or reuses existing free space.
    We changed each one to PARALLEL NOAPPEND and now it's correct.
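    The difference between the two hints comes down to direct-path versus conventional inserts. A minimal sketch (the staging source name is hypothetical; the target table is the one from the post):

    ```sql
    -- APPEND: direct-path insert. Rows are written above the high-water mark,
    -- so space freed by deletes/updates is never reused and the segment only grows.
    INSERT /*+ APPEND */ INTO SSD_DIM_ACT_INS
    SELECT * FROM stg_dim_act_ins;

    -- NOAPPEND: conventional insert. Free space in existing blocks is reused,
    -- so repeated small loads do not inflate the table.
    INSERT /*+ NOAPPEND */ INTO SSD_DIM_ACT_INS
    SELECT * FROM stg_dim_act_ins;
    ```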

  • How to gather stats on the target table

    Hi
    I am using OWB 10gR2.
    I have created a mapping with a single target table.
    I have checked the mapping configuration 'Analyze Table Statements'.
    I have set target table property 'Statistics Collection' to 'MONITORING'.
    My requirement is to gather stats on the target table, after the target table is loaded/updated.
    According to Oracle's OWB 10gR2 User Document (B28223-03, Page#. 24-5)
    Analyze Table Statements
    If you select this option, Warehouse Builder generates code for analyzing the target
    table after the target is loaded, if the resulting target table is double or half its original
    size.
    My issue is that when my target table's size has not doubled or halved, the target table DOES NOT get analyzed.
    I am looking for a way or settings in OWB 10gR2, to gather stats on my target table no matter its size after the target table is loaded/updated.
    Thanks for your help in advance...
    ~Salil

    Hi
    Unfortunately we have to disable automatic stats gathering on the 10g database.
    My requirement is to extract data from one database, load it into my TEMP tables, process it, and finally load it into my data warehouse tables.
    So I need to make sure my TEMP tables are analyzed after they are truncated, loaded, and subsequently updated, before I can process the data and load it into my data warehouse tables.
    Also, I need to truncate all TEMP tables after the load completes to save space on my target database.
    If we keep automatic stats gathering ON for my target 10g database, it might gather stats for TEMP tables while they are empty.
    Any ideas to overcome this issue is appreciated.
    Thanks
    Salil
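    A common workaround, and one way to meet the requirement above, is to call DBMS_STATS explicitly from a post-mapping process (or a procedure in the process flow), so the tables are analyzed on every load regardless of how much they grew. A sketch; the schema and table names are assumptions:

    ```sql
    BEGIN
       DBMS_STATS.GATHER_TABLE_STATS(
          ownname          => 'MY_SCHEMA',         -- hypothetical schema
          tabname          => 'MY_TEMP_TABLE',     -- hypothetical TEMP table
          estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
          cascade          => TRUE);               -- gather index stats too
    END;
    /
    ```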
