UPSERT feature in ODI

Hi Gurus/Experts,
I have a requirement to apply different transformation logic for a column based on the values of other columns in the target and source tables. If the value already exists, one logic applies; for rows without existing values, a different logic applies. I hope the requirement is clear.
Please let me know how this can be implemented in ODI.
Regards,
Vipin

When you say "apply different transformation logic", do you mean:
if your source row exists on the target, target column A will receive the mapping XXX (and the row will be updated, of course)
if your source row doesn't exist on the target, target column A will receive the mapping YYY (and the row will be inserted, of course)
If so, it will be difficult.
Because if you take, for example, the IKM Incremental Update, the different transformations occur during the insertion into the I$ table (if the mapping is in the "work area" zone) or during the insertion into the C$ table (if the mapping is in the "source" zone). But at that point, ODI has not yet checked whether the data exists on the target. The "Flag rows for update" step is done later and will give you this answer through the IND_UPDATE field.
So you cannot use the IND_UPDATE field in mappings executed in the "work area" or "source" zones.
But... if your transformation can be processed in the "target" zone (that is, it does not depend on source data, only on variables, sequences, or static data), you will be able to do it.
For example, you can write for column A:
CASE WHEN IND_UPDATE = 'I' THEN '#variable1' ELSE '#variable2' END
In this example: XXX = variable2 and YYY = variable1.
Another example:
CASE WHEN IND_UPDATE = 'I' THEN 'data_inserted' ELSE 'data_updated' END
If your transformations need source data, then they must be set in the "work area" or "source" zone. In that case, you won't be able to use the IND_UPDATE field. For your requirement, you should create two interfaces: one for inserted data and one for updated data. Or one interface with two unioned datasets (ODI 11g only).
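As a sketch only (the table names, key, and transform expressions below are all hypothetical, not from this thread), the two-interface alternative is conceptually equivalent to pre-computing an insert/update indicator with an outer join against the target and branching the column mapping on it:

```sql
-- Hedged sketch: decide per row whether it would be inserted or updated
-- by outer-joining the source to the target, then branch column A's logic.
-- source_table, target_table, pk, and the UPPER/LOWER stand-in transforms
-- are placeholders for your real objects and XXX/YYY expressions.
SELECT s.pk,
       CASE
         WHEN t.pk IS NULL THEN UPPER(s.col_a)   -- row absent from target: "YYY" logic
         ELSE LOWER(s.col_a)                     -- row present in target: "XXX" logic
       END AS col_a
FROM   source_table s
LEFT OUTER JOIN target_table t
       ON t.pk = s.pk;
```

This is essentially what the IKM computes later as IND_UPDATE; doing it yourself in a staging view or dataset lets the branch use source data.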

Similar Messages

  • ODI Data Profiling and Data Quality

    Hi experts,
    Searching for ODI features for data profiling and data quality, I found (I think) many... extensions? for the product, which confuse me, because there seem to be at least three different ways to do data profiling and data quality:
    First, I found that ODI has out-of-the-box features for data profiling and data quality but, according to the paper, these features are quite limited.
    The second way I found was the product Oracle Data Profiling and Oracle Data Quality for Oracle Data Integrator 11gR1 (11.1.1.3.0), which is on the ODI download page. According to that page, this product extends the existing inline data quality and data profiling features of ODI.
    Finally, the third way is Oracle Enterprise Data Quality, another product that can be integrated with ODI.
    I don't know if I have understood my alternatives correctly. In fact, I need a general explanation of what ODI offers for data quality and data profiling. Can you help me understand this?
    Many thanks in advance.

    Hi, after the 11.1.1.3 release of ODI, Oracle no longer supports ODP/ODQ, which is a Trillium Software product not owned by Oracle. Oracle recommends using OEDQ for data quality purposes. It is better to spend your time on OEDQ rather than trying to learn and implement ODP/ODQ with ODI.

  • Why should we go for ODI?

    Hi,
    I know Informatica 9.1.0. Now I am learning ODI, so some questions have come up.
    I am working with Hyperion, and ODI is used with Hyperion to fetch data from any source system.
    I have a few questions in my mind related to ODI:
    Why should I go for ODI? Or when should I use ODI?
    What benefits does ODI provide that are not available in other tools?
    Thanks

    It might be worth starting by reading through the features of ODI and the related documentation to understand its strengths: http://www.oracle.com/technetwork/middleware/data-integrator/overview/index.html
    It is Oracle's strategic integration product so if you are working with EPM products then you will find more features than with Informatica.
    I will let someone else provide information on when to use it, because I have been here before on many occasions; whether it is for you can all depend on what products you currently have, what your source/target systems are, and what your objectives are.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • ODI Resources Guidance?

    I find some interesting features in ODI. There were some webcasts on ODI during March 2011 (per a link in this forum).
    Were these webcasts posted as a resource for replay? Where can I find them?
    I found the "Oracle Data Integration 11g: New Features" presentation in the OBE area. It is very useful for learning about the new features.
    Do you have a follow-up presentation on how we could use these features in ELT (ETL) projects?
    Thanks for helping.

    You can find the full archive of that web seminar here:
    http://www.oracle.com/technetwork/testcontent/odi-webcast-archives-086000.html
    Thanks.

  • Reusable in ODI

    Hi All,
    I am using ODI 11g.
    I have created one model and some interfaces in ODI.
    Now I want to create another model and interfaces that are the same as the above but named differently.
    How do I do this? What is the reusability feature in ODI for multiple similar objects?
    Sometimes I have found that if you do this, the previous one changes as well...
    Thanks,
    Lony

    An interface is associated with one set of source tables and one target table.
    That limits its reusability.
    When you say you want to reuse an interface, what exactly do you want? To reuse this interface to load another target table? Or to reuse this interface with the same source tables? Or something else?
    Here are some ways to "reuse" things in ODI:
    ** Set a variable as your datastore resource name. This allows you to use the same interface many times, but with different tables. Of course, these different tables must have the same structure (at least, the fields used in the interface must be the same).
    ** Create user functions: they allow you to reuse complicated mappings.
    ** Customize your LKM, CKM, or IKM: this allows you to reuse the customized rules by using your custom KM in many interfaces.
    ** Use datasets with the union/intersection features in an interface. This allows you to use one single interface with various sets of source tables and various target mappings.
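    To illustrate the user-function idea (the function name and implementation below are made up for the example, not a built-in): you could define a user function CLEAN_STR($(str)) in your project and give it, for the Oracle technology, an implementation such as:

    ```sql
    -- Hypothetical implementation body of a user function CLEAN_STR($(str));
    -- $(str) is the ODI user-function parameter placeholder.
    NVL(TRIM($(str)), 'N/A')
    ```

    Any interface mapping can then call CLEAN_STR(SRC.CUSTOMER_NAME), and ODI expands it to the implementation at code-generation time, so the logic is written once and reused everywhere.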

  • Upgrading ODI 11.5 to 11.6

    Dear All,
    I have installed ODI 11g (11.1.1.5.0) and would like to upgrade it to 11.1.1.6.0 to learn the new features in that release.
    Can anyone tell me the steps/process to do so? How do I apply patches to ODI 11.1.1.5 to make it 11.1.1.6? Where can I get those patches? How do I upgrade the master/work repositories from 11.1.1.5 to 11.1.1.6?
    I could straight away download ODI 11.1.1.6 from the Oracle site and install it, but I would like to learn the upgrade process, so I would request the steps to accomplish the same.
    Thanks and Regards

    This has been discussed before.
    FYI: the patch to upgrade ODI 11.1.1.5 to 11.1.1.6.

  • ODI security

    Hi,
    I have implemented a whole project in ODI 11G.
    Now I want to give the security in this ODI.
    Could you please provide me the step by step procedure?
    Thanks,
    Rubi

    Hi Rubi,
    You are trying to explore the least documented feature of ODI... you need to learn by trying it out.
    ODI security is implemented through PROFILEs given to USERs. This is done via the Security Manager.
    You can assign one or many profiles to a user, then drag and drop objects onto that user.
    At that point it will ask you to grant edit/read privileges; choose whichever is required.
    I am not sure I have expressed this properly, but at least you can try it out for yourself.
    Thanks,
    Sutirtha

  • Need clustering of ODI 10g

    Hi Gurus,
    We have a requirement in which we need to install Clustering (High Availability) for ODI 10g.
    Can anyone please help me regarding this issue ASAP.
    Any Help is Greatly Appreciable.
    Regards,
    Pavan Kumar.

    ODI 11g by itself does not have any failover mechanism; ODI relies on the failover mechanism implemented by the application server under which the ODI J2EE agents run, i.e. when an ODI J2EE agent is deployed on WebLogic, for instance.
    However, we can use the load balancing feature in ODI 10g or 11g to attain pseudo high availability, which can be implemented with the following steps:
    1. On a specific host machine, define a "parent" Agent whose only function is to route ODI Scenario start-up commands to the appropriate "child" Agents.
    2. Define more than one "child" Agent.
    3. For each "child" Agent on the cluster, set the Concurrent Session Number parameter to a large value.
    4. Open the Load Balancing tab of the "parent" Agent and activate the checkbox for the "child" Agents on the cluster nodes (and not for the "parent" Agent itself). Limit the number of sessions on the "parent" Agent to (number of "child" Agents + 2).
    5. Route all ODI Scenario start-up commands and schedules to the "parent" Agent. This Agent will then forward ODI Scenarios to the most suitable "child" Agent.
    Roles of the "parent" Agent:
         The "parent" Agent should be used to dispatch executions to its "child" Agents that are alive and running.
         The "parent" Agent is able to detect that a "child" Agent is no longer running.
         The "parent" Agent is also able to detect a "child" Agent that has just been started.
    Note, however, that once an execution has started on Agent A, if Agent A "dies", the current execution will NOT be moved to another Agent.
    The "dying" execution must be manually pushed back into the queue in order to have the "parent" Agent redistribute it to its still-alive "child" Agents.
    One concern is the "parent" Agent itself:
         If it dies, the already distributed tasks will continue to be executed on the corresponding "child" Agents.
         If the "parent" Agent dies, it should be restarted as soon as possible in order to keep the flow active.
         Given its very important role, it should be placed on a machine with a high uptime coefficient. If the machine stops, the Agent should be restarted ASAP.
    Regards,
    Rickson Lewis
    http://www.linkedin.com/in/rickson

  • Problem with journalizing

    Hi all,
    I was trying to use the Change Data Capture feature in ODI. I was able to start the journal for one of my models, but in the Operator the process failed with the error:
    java.sql.SQLException: ORA-00439: feature not enabled: Streams Capture
    Then I thought the problem might be that the db user did not have the required privileges, so I executed the following PL/SQL block at the SQL prompt, as per the instructions in the Designer (DATA2 is the db user from which the model takes its data):
    BEGIN
        DBMS_STREAMS_AUTH.GRANT_ADMIN_PRIVILEGE(GRANTEE => 'DATA2');
    END;
    /
    The same error still occurred. Then I noticed that in the V$OPTION view the value of the 'Streams Capture' parameter was FALSE. Now I am trying to set 'Streams Capture' to TRUE, but the UPDATE command didn't work; the error I got was:
    ORA-02030: can only select from fixed tables/views
    How do I set the 'Streams Capture' parameter to TRUE?
    And am I on the right track? Please help.
    P.S.: I am using Oracle 10g Express Edition.
    Regards,
    Divya

    I'm not sure that Express Edition has the LogMiner functionality available. I think this may be an Enterprise Edition feature.
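    For reference, the feature set of an edition can only be checked read-only; V$OPTION cannot be updated, which is exactly what ORA-02030 is saying. A quick check looks like this:

    ```sql
    -- Read-only check of the features compiled into this database edition.
    -- 'Streams Capture' shows FALSE on editions (such as XE) that do not
    -- include Streams; the value cannot be changed by an UPDATE.
    SELECT parameter, value
    FROM   v$option
    WHERE  parameter = 'Streams Capture';
    ```

    If the value is FALSE, the fix is to use an edition that licenses the feature (or a JKM that does not rely on Streams), not to alter the view.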

  • Dynamic SQL generation

    Hi All,
    As part of ODI transformations, is it possible to create dynamic SQL at run time, dependent on the incoming data, which in turn would look at some kind of mapping table and create the SQL dynamically?
    Is this feature possible in ODI?
    Thanks in advance for your input.

    Hi Cezar Santos,
    Thanks for the reply. The scenario is given below:
    Logic to dynamically map chartfield to segment.
    A user would have access to a GUI domain value map that would allow them to map BU/Chartfield to Ledger/Segment.
    Business Unit (from SETID, PSFT) | Chartfield (PSFT) | AIA Common Key | Segment (Retail) | Ledger (Retail)
    US001 | ACCOUNT        | 1 | SEGMENT1 | Ledger1
    US001 | DEPARTMENT     | 2 | SEGMENT2 | Ledger1
    US001 | PRODUCT        | 3 | SEGMENT3 | Ledger1
    US001 | OPERATING_UNIT | 4 | SEGMENT4 | Ledger1
    US002 | OPERATING_UNIT | 5 | SEGMENT2 | Ledger2
    US002 | ACCOUNT        | 6 | SEGMENT3 | Ledger2
    US002 | DEPARTMENT     | 7 | SEGMENT4 | Ledger2
    US003 | OPERATING_UNIT | 8 | SEGMENT1 | Ledger3
    The transformation happens inside the XFORM view by using aliases to select the columns from RETAIL_STG. Once the aliases are applied, the view can be mapped one-to-one with the PSFT_STG table.
    Use ODI to select unique ledgers from RETAIL_STG.
    Execute transformation and transportation of data.
    Move to the next ledger and repeat.
    For US001/Ledger1:
    SELECT LEDGER AS BUSINESS_UNIT, SEGMENT1 AS ACCOUNT, SEGMENT2 AS DEPARTMENT, SEGMENT3 AS PRODUCT, SEGMENT4 AS OPERATING_UNIT FROM RETAIL_STG WHERE LEDGER = 'Ledger1'
    For US002/Ledger2:
    SELECT LEDGER AS BUSINESS_UNIT, SEGMENT3 AS ACCOUNT, SEGMENT4 AS DEPARTMENT, ' ' AS PRODUCT, SEGMENT2 AS OPERATING_UNIT FROM RETAIL_STG WHERE LEDGER = 'Ledger2'
    For US003/Ledger3:
    SELECT LEDGER AS BUSINESS_UNIT, ' ' AS ACCOUNT, ' ' AS DEPARTMENT, ' ' AS PRODUCT, SEGMENT1 AS OPERATING_UNIT FROM RETAIL_STG WHERE LEDGER = 'Ledger3'
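    One hedged way to sketch the generation step (the XFORM_MAP table and its columns are assumed for illustration, not from the original post): build each ledger's SELECT list from the mapping table with an aggregation, then execute the produced statement, for example from an ODI procedure step or a refreshing variable:

    ```sql
    -- Hypothetical mapping table XFORM_MAP(LEDGER, SEGMENT_COL, CHARTFIELD),
    -- holding rows like ('Ledger1', 'SEGMENT1', 'ACCOUNT').
    -- Produces one dynamic statement per ledger, e.g. for Ledger1:
    --   SELECT LEDGER AS BUSINESS_UNIT, SEGMENT1 AS ACCOUNT, ... WHERE LEDGER = 'Ledger1'
    SELECT 'SELECT LEDGER AS BUSINESS_UNIT, '
           || LISTAGG(segment_col || ' AS ' || chartfield, ', ')
                WITHIN GROUP (ORDER BY chartfield)
           || ' FROM RETAIL_STG WHERE LEDGER = ''' || ledger || ''''
           AS generated_sql
    FROM   xform_map
    WHERE  ledger = 'Ledger1'
    GROUP  BY ledger;
    ```

    LISTAGG requires Oracle 11gR2 or later; on older databases a custom aggregate or PL/SQL loop would play the same role.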
    Kindly provide your thoughts. If you have questions, please let me know.
    Thanks.

  • Odireadmail errors out

    Hi all,
    We are trying to use the OdiReadMail feature in ODI (10.1.3.6.2) to trigger one of our jobs. I have verified the mail server information, and a new username and password have been created for me. I have logged into the mail account and verified that the account is receiving emails. Every time I use this feature in my package, it errors out.
    OdiReadMail "-MAILHOST=<server name>" "-USER=<user>" "-PASS=<pass>" "-PROTOCOL=pop3" "-FOLDER_OPT=NONE" "-KEEP=NO" "-EXTRACT_MSG=YES" "-EXTRACT_ATT=YES" "-USE_UCASE=NO" "-NOMAIL_ERROR=NO" "-TIMEOUT=10000" "-POLLINT=10000" "-MAX_MSG=1" "-SUBJECT=<subject name>"
    Here is the exact error:
    com.sunopsis.tools.core.exception.SnpsSimpleMessageException: Oops, got exception!null
         at com.sunopsis.dwg.tools.ReadMail.actionExecute(ReadMail.java)
         at com.sunopsis.dwg.function.SnpsFunctionBase.execute(SnpsFunctionBase.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execIntegratedFunction(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlS.treatTaskTrt(SnpSessTaskSqlS.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.k(e.java)
         at com.sunopsis.dwg.cmd.g.A(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    Thanks,
    Sujani


  • How to give dynamically mapping in interface

    Hi all,
    I have a requirement, and for it I am considering various scenarios.
    Could you please tell me whether there is any way to supply source and target datastores and mappings dynamically (at execution time) to interfaces? If that is possible, then I can reuse the job again and again.
    Please give your solutions for this post.
    Thanks,
    Surya.

    Please let me know if I miss the mark here.
    If you are trying to build the 150 interfaces without spending the time to do so manually, use the ODI SDK. Check out David Allan's blog post for examples here https://blogs.oracle.com/warehousebuilder/entry/odi_11g_interface_builder. You can build your interfaces once - without need for dynamic mapping at execution time.
    I'm pretty sure there is no good way in ODI "out of the box" to dynamically map columns at execution time, but I imagine you could use the ODI SDK to write a Java program that does so. Then, use the Open Tools feature in ODI to run the Java program within your ODI processes. I'm not sure of the value of this, as I do not have a full understanding of your requirement.
    I hope this helps. Enjoy!
    Michael R.

  • Do I need to Start Journal after Stream process restarting

    I have a CDC process in place using the JKM with "Oracle Streams Consistent Set". Do I need to start the journal once the database is back up after a shutdown and the Streams process has restarted?

    You may not need to do that. Please refer to support note "What Is The Basic Setup Of The Changed Data Capture (Journalization) Feature In ODI (Doc ID 793424.1)"

  • Cdc- deletion of parent- child records

    Hi,
    I am working with the CDC Consistent Set feature in ODI.
    My scenario: I have a record, say 120 (primary key), in table A (the parent source table), and it is used as a foreign key in table B.
    Both child and parent rows are inserted into the corresponding target tables.
    Now I want to delete this record 120 from the target parent and child tables.
    In the package I arranged the scenarios as follows:
    OdiWaitForLogData -----> source model (with Extend Window and Lock Subscriber options selected) -----> parent pkg scenario -----> child pkg scenario -----> source model (with Unlock Subscriber and Purge Journal options). This works fine for insert and update.
    OdiWaitForLogData -----> source model (with Extend Window and Lock Subscriber options selected) -----> child pkg scenario -----> parent pkg scenario -----> source model (with Unlock Subscriber and Purge Journal options). This works fine for delete.
    Can't I achieve these two in one package?
    Please Guide.
    Regards,
    Chaitanya.

    Hi,
    kev374 wrote:
    Thanks, one question...
    I did a test and it seems the child rows also have to satisfy the parent row's WHERE clause. Take this example:
    EVENT_ID | PARENT_EVENT_ID | CREATED_DATE
    2438 | (null) | April 9 2013
    2439 | 2438 | April 11 2013
    2440 | 2438 | April 11 2013
    select * from EVENTS where CREATED_DATE < sysdate - 9
    start with EVENT_ID = 2438
    connect by PARENT_EVENT_ID = prior EVENT_ID
    So you've changed the condition about only wanting roots and their children, and now you want descendants at all levels.
    This pulls in record #2438 (per the sysdate - 9 condition), but 2439 and 2440 are not connected. Is there a way to suppress the WHERE clause evaluation for the child records? I just want to pull ALL child records associated with the parent and only want to do the date check on the parent.
    Since the roots (the only rows you want to exclude) have LEVEL = 1, you can get the results you requested like this:
    WHERE   created_date  < SYSDATE - 9
    OR      LEVEL         > 1
    However, since you're not ruling out the grandchildren and great-grandchildren any more, why wouldn't you just say:
    SELECT  *
    FROM    events
    WHERE   created_date     < SYSDATE - 9
    OR      parent_event_id  IS NOT NULL;
    CONNECT BY is slow. Don't use it if you don't need it.
    If you cross-reference my original query:
    select * from EVENTS where CREATED_DATE < sysdate - 90 and PARENT_EVENT_ID is null -- all parents
    union
    select * from EVENTS where PARENT_EVENT_ID in (select EVENT_ID from EVENTS where CREATED_DATE < sysdate - 90 and PARENT_EVENT_ID is null) -- include any children of the parents selected above
    The 2nd select does not apply created_date < sysdate - 90 to the children, but rather pulls in all related children :)
    Sorry; my mistake. That's what happens when you don't post sample data and desired results; people can't test their solutions and find mistakes like that.

  • Dynamic setting of error limit

    Hi All
    Is it possible to set the "maximum error limit" of an interface dynamically? The client has a requirement where the error limit will be provided from an external file/table, and the value from the file should be used as a parameter to stop the interface when the errors reach that percentage limit.
    Thanks

    Hi,
    Trying to help!
    I don't know whether there is any such built-in feature in ODI, but such functionality can surely be delivered by adding another option to the CKM, so that it accepts a number as a parameter and aborts when that number is exceeded.
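    A hedged sketch of the check such a customized CKM option could drive (the E$ table name and the parameter table are illustrative only; a real KM step would read the limit through its option value rather than a joined table):

    ```sql
    -- Illustrative abort test: compare the rejected-row count in the error
    -- table against a limit loaded from an external parameter table.
    -- e$_customer and error_limit_param are hypothetical names.
    SELECT CASE
             WHEN err.cnt > lim.max_errors THEN 'ABORT'
             ELSE 'CONTINUE'
           END AS action
    FROM   (SELECT COUNT(*) AS cnt FROM e$_customer)       err,
           (SELECT max_errors FROM error_limit_param)      lim;
    ```

    The KM step would then raise an error (aborting the session) when the result is 'ABORT', which is how the externally supplied limit stops the interface.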
    Regards,
    Amit
