Issue in Target Column mappings in ODI

Hi Gurus,
I am unable to uncheck the Insert and Update checkboxes in the Update section of the target column mappings.
How can we uncheck the Insert and Update checkboxes for the columns that should not be affected in the target datastore?
Thanks
--Madhavi

Hi
My requirement is to perform updates only on some columns, based on the key.
How can this be implemented in the ODI interface mappings?
--Madhavi

Similar Messages

  • How to implement this aggregate logic at target column in ODI interface mappings

    sum(NOTICES) over ( partition by property order by RELAVANTDATE range between interval '30' day preceding and current row)
    How can I implement this aggregate logic at a target column in ODI interface mappings?

    Hi
    If you don't want ODI to treat it as an aggregate, define a user function, for example
    analytic_sum($(value))
    implemented by
    sum($(value))
    After that, replace your
    sum(NOTICES) over ( partition by property order by RELAVANTDATE range between interval '30' day preceding and current row)
    with
    analytic_sum(NOTICES) over ( partition by property order by RELAVANTDATE range between interval '30' day preceding and current row)
    Because ODI does not recognize the user function as an aggregate, it will not generate a GROUP BY for it.
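    As a cross-check of what the analytic computes, here is a minimal sketch using SQLite window functions (the `notices_log` table and its sample data are invented for the example; Oracle's `interval '30' day` is approximated by a numeric 30-day offset on `julianday`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE notices_log (property TEXT, relavantdate TEXT, notices INTEGER);
INSERT INTO notices_log VALUES
  ('P1', '2024-01-01', 2),
  ('P1', '2024-01-15', 3),
  ('P1', '2024-03-01', 5),
  ('P2', '2024-01-10', 7);
""")

# Rolling 30-day sum per property, mirroring the Oracle analytic:
#   sum(NOTICES) over (partition by property order by RELAVANTDATE
#                      range between interval '30' day preceding and current row)
rows = conn.execute("""
SELECT property, relavantdate,
       SUM(notices) OVER (
         PARTITION BY property
         ORDER BY julianday(relavantdate)
         RANGE BETWEEN 30 PRECEDING AND CURRENT ROW
       ) AS rolling_notices
FROM notices_log
ORDER BY property, relavantdate
""").fetchall()
print(rows)
```

    The 2024-03-01 row sums only itself because the previous row falls outside the 30-day window.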

  • One Target column depending on other target column in ODI.

    I have a scenario in which one Target column is depending on other target column in ODI.
    for ex:
    Target Table:
    EDBT EDLN
    sss 1
    sss 2
    abc 1
    sss 3
    In short, EDLN maintains a sequence number for repeating values of EDBT.
    Is there any solution?

    Yes, it's possible, but it is more complex and harder to support.
    How to do this:
    1. Create a view on the target and use it as a lookup (LKP_TARGET) in the interface:
    select EDBT, max(EDLN) MAX_EDLN from Target group by EDBT
    2. Select from the source through a view (VIEW_SOURCE):
    select EDBT, row_number() over (partition by EDBT order by ??date??) RN from Source
    3. In the target expression (Quick-Edit tab):
    EDLN = LKP_TARGET.MAX_EDLN + 1 + VIEW_SOURCE.RN
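    The per-EDBT numbering that the lookup-plus-row_number combination produces can be sketched in a few lines of Python (the function name `assign_sequence` and the `existing_max` parameter are invented for illustration):

```python
from collections import defaultdict

def assign_sequence(edbt_values, existing_max=None):
    """Assign a per-EDBT sequence number (EDLN), continuing from any
    maximum already present in the target (existing_max: EDBT -> max EDLN)."""
    counters = defaultdict(int, existing_max or {})
    result = []
    for edbt in edbt_values:
        counters[edbt] += 1
        result.append((edbt, counters[edbt]))
    return result

print(assign_sequence(["sss", "sss", "abc", "sss"]))
# [('sss', 1), ('sss', 2), ('abc', 1), ('sss', 3)]
```

    This reproduces the example target table: repeating EDBT values get consecutive EDLN values.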

  • ODI error Target Column Accounts: Target column has no data server associat

    Hi John,
    When I tried to map my source file to the Planning target, I got the error below. Please let me know the resolution.
    Target Column Accounts: Target column has no data server associated
    Thanks,
    Sravan

    Hi,
    You have not set the correct context that the datastore is associated with.
    Change the optimization context in the definition tab.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Error calling pl/sql function in target column

    Hi guys,
    I get this error when calling my function in ODI:
    Caused By: java.sql.SQLSyntaxErrorException: ORA-00904: "mySchema"."GET_ODI_DEFAULT_VALUE(1)": invalid identifier ("ongeldige ID" in the Dutch locale) --> 1 is an IN parameter
    while in SQL Developer I get a good result with the following query:
    select mySchema.get_odi_default_value(1) from dual;
    In my target table I call a sequence from mySchema for the primary key, and this works fine.
    I've tried the following syntax in the mapping of my target column:
    - <%=odiRef.getObjectName( "L","GET_ODI_DEFAULT_VALUE(1)", "D" )%>
    - <%=odiRef.getObjectName( "L","GET_ODI_DEFAULT_VALUE", "D" )(1)%>
    Thanks for your advice

    iadgroe wrote:
    how to bring an Oracle function into ODI
    I thought that for objects like sequences and functions you had to use 'odiRef.getObjectName' to get access to them. Am I wrong? Now I'm a little confused.

    Hi,
    Best practice would be to use the getObjectName method: that way you are not hardcoding anything into the interface and are referencing the logical object only, which means you can change the logical object later without having to change the interfaces.
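    Judging from the error, ODI resolved the whole string `GET_ODI_DEFAULT_VALUE(1)` as a single object name. A hedged guess at the fix (untested here): keep the argument list outside the substitution tag so that only the function name is resolved:

```
<%=odiRef.getObjectName("L", "GET_ODI_DEFAULT_VALUE", "D")%>(1)
```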

  • How to migrate OWB mappings in ODI

    Dear All,
    I would appreciate your valuable input on the following points.
    1. How do we deploy to multiple sites in ODI? What is the methodology, or what are the steps? Are there any third-party tools to do this? What are they?
    2. Is there any scripting language in ODI, similar to the OMB metadata scripting language in OWB, which can be used to automate and speed up multi-site deployment?
    3. What is the process to convert OWB mappings to ODI interfaces? Does Oracle provide any tools or methodology for the migration from OWB to ODI? Are there any third-party tools to do this? What are they?
    Thanks and Regards
    Edited by: 910192 on Aug 16, 2012 10:22 PM
    Edited by: 910192 on Aug 16, 2012 11:54 PM

    910192 wrote:
    1. How do we deploy to multiple sites in ODI?
    If you mean databases by 'sites', then you just configure separate physical connections and choose whether to refer to each DB explicitly in your code or to use Contexts to determine which database to use at run time.
    Also, careful consideration / deployment of ODI Agents allows you to run / execute / invoke your code from just about anywhere you want (target database, remote file system, source servers, etc.).
    2. Is there any scripting language in ODI similar to the OMB metadata scripting language in OWB?
    There is an SDK, and Groovy can be used: https://blogs.oracle.com/dataintegration/entry/odi_11g_insight_to_the
    3. What is the process to convert OWB mappings to ODI interfaces?
    Not sure if Oracle has formally released a step-by-step process yet; they promise an upgrade path for OWB users to migrate, and there is a consulting offering for this: http://www.oracle.com/us/products/consulting/resource-library/owb-odi-migration-ds-1367824.pdf
    Also, an Italian company has developed / is developing a migration tool: http://www.owb2odiconverter.com/eng/index.html

  • Need to update target data source in ODI

    Hi All,
    I have 2 tables: 1. Tab_abc(col1,col2,col3,col4,col5) and 2. Tab_xyz(col1,col2,col3).
    Tab_abc already has some records with NULL values in col2 and col3, and Tab_abc.col1 = Tab_xyz.col1.
    I need to update col2 and col3 in Tab_abc.
    I have used IKM: Oracle Incremental Update,
    checked col2 and col3 for update, and made col1 the key.
    But it is giving me some problems.
    Could you please explain in detail how to solve this issue?
    Thanks.

    It seems you are on the right track. On the target table, you have made col1 the key.
    The only issue is that it will update all your target columns (col2, col3) based on the entries of the source columns (col2, col3).
    One prospective solution is to join the target to the source, or simply put a filter to find the rows whose col2 or col3 in the target is null,
    i.e. on the source table put the filter:
    Tab_xyz.col1 in ( select Tab_abc.col1 from Tab_abc where Tab_abc.col2 is null or Tab_abc.col3 is null )
    BUT in this case, both col2 and col3 will get updated if either value is null for a particular row.
    If you want to update only the null-valued columns, then you will need to create an interface based on the following query:
    update Tab_abc t set (col2, col3) = ( select nvl(t.col2, s.col2), nvl(t.col3, s.col3) from Tab_xyz s where s.col1 = t.col1 )
    where t.col2 is null or t.col3 is null;
    For this you need to join the target table with the source table.
    Hope it helps
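    As a sketch of the fill-only-NULL-columns update (tables recreated in SQLite with invented sample data; `COALESCE` plays the role of Oracle's `NVL`):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE tab_abc (col1 INTEGER PRIMARY KEY, col2 TEXT, col3 TEXT);
CREATE TABLE tab_xyz (col1 INTEGER PRIMARY KEY, col2 TEXT, col3 TEXT);
INSERT INTO tab_abc VALUES (1, NULL, NULL), (2, 'kept', NULL), (3, 'a', 'b');
INSERT INTO tab_xyz VALUES (1, 'x2', 'x3'), (2, 'y2', 'y3');
""")

# Fill only the NULL columns in the target from the source, keyed on col1;
# existing non-NULL values are left untouched.
conn.execute("""
UPDATE tab_abc
SET col2 = COALESCE(col2, (SELECT s.col2 FROM tab_xyz s WHERE s.col1 = tab_abc.col1)),
    col3 = COALESCE(col3, (SELECT s.col3 FROM tab_xyz s WHERE s.col1 = tab_abc.col1))
WHERE col2 IS NULL OR col3 IS NULL
""")
result = conn.execute("SELECT * FROM tab_abc ORDER BY col1").fetchall()
print(result)
```

    Row 2 shows the point: its non-NULL col2 value 'kept' survives while only the NULL col3 is filled.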

  • Source Rows to Target Column mapping

    Hi Gurus,
    Please guide me on the issue below.
    Source-side tables:
    CUSTOMER_MASTER (master table) - Customer ID is the primary key
    CUSTOMER_ADDRESS (child table) - multiple rows per customer, one per address type (OFFICE, RESIDENCE, etc.)
    Target table:
    TRG_CUSTOMER - Customer ID is the primary key. This target table has a column for each address type, like OFFICE_ADDRESS and RES_ADDRESS.
    I have to hand off all the rows of CUSTOMER_MASTER to TRG_CUSTOMER; if a customer has multiple addresses in the CUSTOMER_ADDRESS table, I have to transfer those values to the respective target columns.
    Could you please share an idea of how to implement this in the interface?
    Thanks,
    Kumbar

    Hi,
    I am using some of the columns of both tables to map the target table. In the target table we have columns OFFICE_ADDRESS_LINE1 to OFFICE_ADDRESS_LINE3 and HOME_ADDRESS_LINE1 to HOME_ADDRESS_LINE3, but in the source table CUSTOMER_ADDRESS we have one row for each type of address. How do I do this mapping and transformation?
    Please help me.
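    The row-to-column transfer amounts to a pivot. A minimal sketch in SQLite (sample tables and data are invented, with simplified column names; `MAX(CASE ...)` is a portable way to express the pivot):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE customer_master (customer_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE customer_address (customer_id INTEGER, addr_type TEXT, address TEXT);
INSERT INTO customer_master VALUES (1, 'Acme'), (2, 'Beta');
INSERT INTO customer_address VALUES
  (1, 'OFFICE', '1 Main St'),
  (1, 'RESIDENCE', '9 Oak Ave'),
  (2, 'OFFICE', '5 Pine Rd');
""")

# Pivot the address rows into one column per address type;
# each customer keeps exactly one target row.
pivoted = conn.execute("""
SELECT m.customer_id,
       MAX(CASE WHEN a.addr_type = 'OFFICE'    THEN a.address END) AS office_address,
       MAX(CASE WHEN a.addr_type = 'RESIDENCE' THEN a.address END) AS res_address
FROM customer_master m
LEFT JOIN customer_address a ON a.customer_id = m.customer_id
GROUP BY m.customer_id
ORDER BY m.customer_id
""").fetchall()
print(pivoted)
```

    In an interface, each target address column would carry one such CASE expression, with the join grouped on the customer key; a missing address type simply yields NULL.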

  • Logical Column Mappings update

    If I want all my Logical Columns in the Business Model layer to point to a new and different database (e.g: EIMDW) instead of "ORAEIMU", what is the quickest and easiest way to update all the column mappings at once in the Business Model Layer?
    Your feedback will be very much appreciated
    Boniface Ntawutarama

    Hello :)
    Maybe someone else has a better suggestion. But as for our project's case, when we want to point to a different database with EXACT same definition of schema, tables, and fields, we simply re-define the ODBC connection to point to a different Data Source Name (in System DSN) or we edit the actual data source definition in tnsnames.ora. This would mainly affect your physical layer, though, not just your BMM.
    If you wish to use both old and new databases in the repository, I think you would need to import the tables from the new database and change the Sources of each Logical Folder to point to the new tables.
    Hope this helps. Share with us if you find a better solution. :)
    - Cha :)

  • TARGET Column in crs_stat output

    This is what the Oracle doc says about the TARGET column in crs_stat output:
    http://docs.oracle.com/cd/B19306_01/rac.102/b14197/crsref.htm
    The TARGET value shows the state to which Oracle Clusterware attempts to set the resource. If the TARGET value is ONLINE and a cluster node fails, then Oracle Clusterware attempts to restart the application on another node if possible. If there is a condition forcing a resource STATE to be OFFLINE, such as a required resource that is OFFLINE, then the TARGET value remains ONLINE and Oracle Clusterware attempts to start the application or application resource once the condition is corrected.
    I didn't quite understand. So, if
    TARGET = ONLINE, does that mean this particular resource must be up?
    And if
    TARGET = OFFLINE, does that mean this particular resource does not have to be up?
    Question 2:
    Can anyone tell me a scenario, in a standard RAC implementation, where you can find TARGET = OFFLINE?

    GarryB wrote:
    So, if
    TARGET = ONLINE, does that mean this particular resource must be up?
    And if
    TARGET = OFFLINE, does that mean this particular resource does not have to be up?

    This is also the way I understand it.

    Question 2:
    Can anyone tell me a scenario, in a standard RAC implementation, where you can find TARGET = OFFLINE?

    For example in 11.2 RAC:
    - ora.gsd is OFFLINE by default because it is only needed if you use an Oracle 9.2 database, according to http://docs.oracle.com/cd/E11882_01/install.112/e24616/postinst.htm#BABDAFGD.
    - ora.oc4j is OFFLINE because it has not been implemented by Oracle. Example in http://docs.oracle.com/cd/E11882_01/rac.112/e17264/install_rac.htm#TDPRC175.

  • Target field without origin - ODI 11 G

    What is the solution for a target field without an origin (source column)?
    Please help me.

    What do you mean by "without origin"? Are you interested in leaving the target column blank?

  • Column mappings of an LOV

    How can I manipulate the column mappings of an LOV in my trigger code?

    You can use SET_ITEM_PROPERTY to change, at runtime, the LOV you are selecting values from - you can find the details in Forms Help under 'Attaching an LOV programmatically at runtime'.

  • Performance issue due to column formula and filters

    Hi,
    I am facing a strange performance issue with my OBIEE reports. I have two sets of reports, Static and Dynamic. Both run against the same tables. The only difference between these reports is that the Static reports run against all the data for a given aggregation level, e.g. Year, Month, Date and so on, whereas for the Dynamic ones I have range prompts to filter the data. The other difference is that I have a column formula for one of the columns in the Dynamic report, which is simply a Go URL that shows another page with certain parameters.
    The Static report takes around 14-15 seconds, whereas the Dynamic one takes around 3.5 minutes. The amount of data and the range are the same. From the logs I can see that for the Static reports, i.e. the reports without filters, it applies the GROUP BY at the SQL level, whereas it does not do so for the Dynamic one. Is this expected?
    The second issue: even if I remove the filters and just have a report with the column formula in one and no formula in the other, there is a significant time difference in the processing at the Presentation Services layer. Again, this is taken from the log: it takes 8 seconds to get the data from the DB but shows almost 218 seconds as the response time at the Presentation layer.
    Below are conceptual details about table and reports -
    Table 1 (It is date dimension) : Date_Dim
    DateCode Date
    Day Number
    MonthCode Varchar2
    YearCode Varchar2
    Table 2 (It is aggregate table at year level) : Year_Aggr
    DateCode Date (FK to Table1 above)
    Measure1
    Measure2
    Measure3
    Measure4
    Measure5
    Report 1
    Date_Dim.YearCode | Year_Aggr.Measure1 | Year_Aggr.Measure2 | Year_Aggr.Measure3 | Year_Aggr.Measure4
    Report 2
    Dashboard Filter : Dimension1 | Dimension2 | Year Start | Year End |
    Date_Dim.YearCode | Year_Aggr.Measure1 | Year_Aggr.Measure2 | Year_Aggr.Measure3 | Year_Aggr.Measure4
    Column formula for Date_Dim.YearCode is something like :
    '<a href="saw.dll?Dashboard&PortalPath=somepath and parameters  target=_self>'  || Date Dim"."YearCode" || '</a">'
    Filters :
    Dimension1 is prompted...
    Dimension2 is prompted...
    cast("Date Dim"."YearCode" as Int) is greater than or equal to @{Start_Year}
    cast("Date Dim"."YearCode" as Int) is greater than or equal to @{End_Year}
    Note: I need to apply a cast to int as the column is varchar2 (legacy problem).
    How can I fix this? Am I missing something? In the result of report 2, the database SQL doesn't show the year in the WHERE clause, though it is displayed in the logical SQL.
    Let me know if anybody has faced this and fixed it, or can suggest changes to fix it.
    Thanks,
    Ritesh

    Hi Ritesh,
    I think you are right about the root cause of your problem. The first request does the GROUP BY in the database, which returns fewer records to the BI Server for processing. The second request does not do the GROUP BY and sends significantly more records back to the BI Server, forcing it to do the grouping itself. Compound that with the fact that pivot table views are relatively expensive computationally, and that explains the difference in execution times.
    Assuming that the execution time of the first report is satisfactory, I would recommend you try to experiment with a few settings to see if you can get the second report to do the group by in the database.
    Are the two filters identical except for the following conditions?
    cast("Date Dim"."YearCode" as Int) is greater than or equal to @{Start_Year}
    cast("Date Dim"."YearCode" as Int) is greater than or equal to @{End_Year}
    Best regards,
    -Joe

  • Dynamic Column Name in ODI Interface

    Hi Everyone
    I have a requirement to read a set of columns from a source table and move the data to a target table. The set of columns is decided dynamically based on a parameter, say PVV_PREFIX.
    For example, assume the following tables
    Source: Employee (OFFICE_ADDRESS1, OFFICE_ADDRESS2, OFFICE_CITY, HOME_ADDRESS1, HOME_ADDRESS2, HOME_CITY)
    Target: Address(TYPE, ADDRESS1, ADDRESS2, CITY)
    Now, if the scenario is called with PVV_PREFIX=OFFICE, the OFFICE_* columns should be mapped to the target table;
    if it is called with PVV_PREFIX=HOME, the HOME_* columns should be mapped.
    In the actual requirement there are more than 30 such columns. Declaring that many variables is really not practical; we would prefer a #PVV_PREFIX + '_ADDRESS1' kind of approach, but that is not supported in ODI interface mappings.
    Please let me know whether this requirement is feasible via an ODI interface, or whether it can only be done with an ODI procedure. Thank you
    Regards
    Prasad

    Prasad,
    You can do it using a single interface, but you will need to customize the KM.
    Step 1: Create the interface mapping with the appropriate source-to-target mapping.
    Step 2: Customize the KM so that it adds the prefix from PVV_PREFIX. This can be done using an ODI variable or Java.
    Step 3: Execute and test your interface.
    Please let me know if you have any questions about the above approach.
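    What the customized KM would effectively compute is a prefix-driven column mapping. A tiny Python sketch (the function name `build_mapping` and the column lists are invented for illustration):

```python
def build_mapping(prefix, target_columns):
    """Derive a source -> target column mapping from a runtime prefix,
    e.g. OFFICE_ADDRESS1 -> ADDRESS1 when the prefix is 'OFFICE'."""
    return {f"{prefix}_{col}": col for col in target_columns}

office = build_mapping("OFFICE", ["ADDRESS1", "ADDRESS2", "CITY"])
home = build_mapping("HOME", ["ADDRESS1", "ADDRESS2", "CITY"])
print(office)
print(home)
```

    The KM customization would substitute the prefix into the generated column list the same way, instead of declaring one ODI variable per column.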

  • How to load the value in target column?

    Hi
    Source: Oracle
    Target: Oracle
    ODI: 11g
    I have an interface which loads data from a source table into a target. Some of the columns in the target table are automatically mapped to the source table, and some remain unmapped. For the columns that remain unmapped, I want to load values produced by a query. Can anybody tell me where I should put the query whose result becomes the value of the specific column?
    -Thanks,
    Shrinivas

    You can put the query (or subquery) into the mapping field.
    You can also call a function in the mapping field which may be easier in your case.
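    To illustrate where such a query's result ends up, here is a sketch of how a scalar subquery in the mapping field is inlined into the generated load statement (SQLite, with invented table names):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE src (id INTEGER, name TEXT);
CREATE TABLE ref_defaults (key TEXT, val TEXT);
CREATE TABLE tgt (id INTEGER, name TEXT, region TEXT);
INSERT INTO src VALUES (1, 'a'), (2, 'b');
INSERT INTO ref_defaults VALUES ('REGION', 'EMEA');
""")

# The unmapped target column takes its value from a scalar subquery, just as
# a subquery typed into the ODI mapping field is inlined into the load SQL.
conn.execute("""
INSERT INTO tgt (id, name, region)
SELECT id, name, (SELECT val FROM ref_defaults WHERE key = 'REGION')
FROM src
""")
loaded = conn.execute("SELECT * FROM tgt ORDER BY id").fetchall()
print(loaded)
```

    A function call in the mapping field works the same way and is often easier to read than an embedded subquery.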
