Embedded pluggable mappings

I see this defined in the manual but I cannot find any examples of how to create one.
Can someone provide me with more information?
What's the use of embedded pluggable mappings, since they cannot exist outside the surrounding mapping? Given that they do not exist as standalone pluggable mappings, can we share them between mappings that have the same surrounding conditions?

Pluggable mappings are basically reusable bits of code. An example could be an in-line view, which could exist as a pluggable mapping that can then be added (via the Pluggable Mapping operator) to a normal mapping. I also use pluggable mappings to break up a large mapping into smaller, more manageable bits.
They are created via the Pluggable Mappings branch in the Project Explorer.
FYI there are many bugs with pluggable mappings in versions prior to 10.2.0.3, and a few remain in 10.2.0.3.
Si
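To make the "in-line view" analogy above concrete, here is a hedged sketch of the kind of reusable logic a pluggable mapping typically encapsulates (all table and column names are made up for illustration, not from any actual OWB project):

```sql
-- Hypothetical example: derivation logic you would otherwise copy into
-- every mapping, factored out the way a pluggable mapping factors out a
-- group of operators. The inner query plays the role of the pluggable
-- mapping's signature: rows go in, derived attributes come out.
SELECT src.customer_id,
       src.order_total,
       drv.order_band          -- derived attribute, defined once, reused everywhere
FROM   orders src,
       (SELECT order_id,
               CASE
                 WHEN order_total >= 1000 THEN 'LARGE'
                 WHEN order_total >= 100  THEN 'MEDIUM'
                 ELSE 'SMALL'
               END AS order_band
        FROM   orders) drv     -- the "in-line view" = the reusable component
WHERE  src.order_id = drv.order_id;
```

In OWB the inner block would be the pluggable mapping; change the CASE logic once and every mapping that includes it picks up the change on regeneration.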

Similar Messages

  • OWB 10.2 - Pluggable Mappings Implementation: Black-Box, White-Box, ...

    Hi,
    We recently migrated from OWB 10.1 to version 10.2, and for the moment I'm investigating the new feature 'Pluggable Mappings'.
    When looking at the OWB documentation I can find 3 different data transformation logics: black-box, white-box and system-managed.
    But when going to the OWB 10.2 design client I can't find anywhere this property on the pluggable mappings?
    Can anybody help?
    Thanks a lot in advance!
    Bart

    I haven't seen that in the OWB manual but perhaps it just refers to how you can treat a pluggable mapping.
    As a pluggable mapping is in effect a reusable component it can be tested independently.
    Black box testing usually refers to testing with no knowledge of the inner workings of the component, i.e. you test by passing specific inputs and ensuring you get the desired outputs; what the component did in the middle is irrelevant to your test.
    White box testing means that you test every path through the component so it requires a detailed understanding of the inner workings.
    Not sure about system managed, perhaps that just means a system test where you test the whole application.
    Pluggable mappings are useful to split large or complex mappings into smaller more manageable parts or to create a reusable component that you include in several mappings but only need to build/test once.
    Si
    PS. prior to 10.2.0.3 I encountered many bugs with pluggable mappings.

  • Profile mappings - generated pluggable mappings

    When you create a correction mapping using Create Correction in the Data Profile Editor, the mapping contains generated pluggable mappings. Where can you find these mappings?

    I'm not sure where you find the pluggable mapping but if you select the plug in your map and use the 'Visit Child Graph' button from the 'General' toolbar you should be able to see mapping detail.
    Cheers
    Si

  • Pluggable mappings - how to get them to work (OWB 10gR2)

    Hi,
    I'm trying to create some pluggable mappings that I can share between a dimension and fact load. For example, I want to take my incoming data and do some set of operations on it (nothing too complex, all just a basic "expression" operator). I need to exactly duplicate this code between a dimension mapping (to generate dimension values), and a cube load (so OWB can look up the proper surrogate keys).
    Therefore, to make sure the logic stays 100% the same between the dim and fact loads, I'd like to use pluggable mappings. Then if I change the logic for the pluggable mapping, it automatically updates both the dim and fact load (and makes the logic 100% accurate between the dim and fact load ETL jobs)
    The problem I'm running in to is this: everything works 100% ok in the dimension load, where basically the pluggable mapping feeds directly into a single target operator, and that target operator has no other inputs.
    However, in the fact load, I can't get pluggable mappings to work (at all). The only difference is that, in the fact load, I have data from both the pluggable mapping AND from other sources going into a single operator (AGGREGATOR in my case). Those "other sources" are all the other dimensions that don't have pluggable mappings, plus the numeric fact columns.
    When I try to connect the pluggable mapping to the AGGREGATOR, I get the following error message:
    "API8003: Connection target attribute group is already connected to an incompatible data source. Use a Joiner or Set operator to join the upstream data first before connecting it to this operator"
    I'm not sure how to resolve this. A "joiner" won't help - if I have 10 dimensions coming in to an aggregator, and I'm processing 50 million rows, I sure can't try to add suitable join fields to each dimension and then try to join ten 50 million row sets. And a Set operator doesn't help at all either.
    Any way for me to get around this?
    Thanks,
    Scott

    Hi everyone,
    I have attempted to use pluggable mappings because I have a source system with 30 similar tables that need similar transformations (and extraction operations) performed on them. At the moment there is only a business need in our DW environment for 3 of them, and I have created 3 mappings using a common (nested) pluggable mapping. I have found the functionality "flaky". My two specific concerns are that the Design Center sometimes doesn't update the mapping that uses the pluggable mapping (the generated code doesn't include the newly updated PM code). Concern 2 is that sometimes I can't even add PMs to the design canvas. I haven't rigorously attempted to solve these problems yet (because they aren't a priority yet) but I AM going to watch this thread with interest.
    Gene

  • Pluggable Mappings and Input Parameters

    Hi All,
    I have a mapping which receives 2 input parameters. I have created a pluggable mapping to which I would like to pass those same values. Is it possible to pass parameter values into a pluggable mapping such that these values are NOT part of the input signature?
    thanks
    Ok, I have more information. When I deploy my mapping, I found the following code commented out of the main procedure:
    -- register custom input parameters:
    -- Temp fix until selective registration ensures correct handling of complex types
    -- wb_rt_mapaudit_util.register_custom_param(x_audit_id,
    -- '"MY_PARAM"',
    -- TO_CHAR("MY_PARAM"));
    What is the "selective registration" that they mention and how do I do it?
    thanks, again
    Edited by: philip_b on Mar 26, 2010 8:39 AM

    Thanks for responding David. I am using Client: 10.2.0.1.31 and Repo: 10.2.0.1.0
    I was hoping I didn't have to include it in the signature because I didn't want these values to be part of the incoming dataset; that just complicates the mapping logic for me.
    It would make sense to me that the pluggable mapping should have access to the parent mapping's parameter values.
    Why is the parameter registration code commented out inside the generated package? Do you know what it is referring to when it mentions "selective registration"? I can't find any reference to it:-/
    Someone else had the same issue last year on the following thread, but there were no responses to the issue:
    wb_rt_mapaudit_util register_custom_param call commented out in gen'd sql

  • How to use pluggable mappings in OWB10gR2

    Sorry, yet another newbie question... (btw, boy I really wish the User's Guide was a lot more informative...)
    I've created a simple pluggable mapping called "closed_date" that takes a few input data elements (closed date, account status, time since last reported) and does some simple expression logic to come up with a final "closed date" output attribute and an "open account indicator" field.
    Now, when I'm trying to actually use the pluggable mapping inside of an ETL map, I get the following error:
    "API8003: Connection target attribute group is already connected to an incompatible data source. Use either a Joiner or Set operator to join the upstream data first before connecting it to this operator"
    Basically, all of the data is coming out of a big JOINER operator where I join 4 or 5 tables. Then I want some of the attributes to go directly to the target dimension operator, but I also want this pluggable mapping to sit between the JOINER and Dimension to do the translation.
    What am I doing wrong - the expressions in the pluggable mapping are exactly the same as the ones I removed from the process flow (when it worked fine)?
    Thanks,
    Scott

    bump!

  • Pluggable mapping problem

    Hi,
    I have a src table which needs to be mapped to a target through a pluggable mapping that contains expressions to calculate an extra field, and I ran into the following problem: the output signature of the pluggable mapping cannot be combined with the source data, so I used a joiner and it generates something like this:
    SELECT
    "MYTABLE"."THEPK" "THEPK",
    "MYTABLE"."NAME" "NAME",
    "MYTABLE"."STARTDATE" "STARTDATE",
    <function to create ENDDATE field> "ENDDATE"
    FROM
    "MYTABLE" "MYTABLE",
    "MYTABLE" "MYTABLE_1"
    WHERE ....
    I don't want to join them; they must come from the first table, but I cannot combine them without a joiner. It works when I build it without a pluggable mapping, but that's a lot of redundant work! Is there another way to fix this?
    Thank you
    Den
    ps: I am using OWB 10g R2 (the newest version)
    Message was edited by:
    user537381

    Hi,
    See this thread - Pluggable Mappings
    This might be the solution to your problem. IMHO, one should search this forum before posting.
    HTH
    Mahesh

  • Mapping canvas of the mapping editor doesn't display

    I don't want to reinstall OWB. I've tried everything in the Design Center GUI, but the mapping canvas is just hiding somewhere, making me tear my hair out. Strangely, all the pluggable mappings still show their mapping canvas as they always did.
    I can't remember what I did the other day to lose the mapping canvas; every mapping in the same module just doesn't display its canvas, while all other windows (palette, explorer...) show normally.
    I've tried creating a new project and creating a mapping in it. The mapping canvas still doesn't appear.
    Is there any configuration file to set control over the mapping canvas as well as all other GUI elements?
    Any direction, Thanks a million.
    Oracle 10.2.0.1.0
    OWB 10.2.0.1.0
    Work Flow 2.6.4

    I solved it.
    It's in <owb_dir>\owb\bin\admin\MappingEditorLayout.xml.
    I replaced
    <SPLITTER orientation="0">
    </SPLITTER>
    with
    <DOCKABLE tag="ROOT" width="714" height="660" min_width="50" min_height="16" pref_width="714" pref_height="660" collapsed="false"/>.
    Of course, then log back in to the Design Center.
    It works for me.

  • Dimension synchronisation does not seem to work in a mapping

    Hi all,
    While evaluating OWB 10gR2 for our next project, I have come across the following problem.
    There is a very simple dimension, type SCD2, ROLAP, with some hierarchies and levels. It is connected to a flat table. Everything works fine. Then I add a new level, include it in a new hierarchy and bind it to the table, which has been extended by a couple of columns. In the mapping, I do "Synchronise". Validation completes without warning, and deployment works, too.
    But running the mapping generates an error ORA-01400: cannot insert NULL into "DIMENSION"."DIMENSION_KEY". After some analysis of the generated PL/SQL package, I noticed that the CURRVAL is missing in the WHEN NOT MATCHED branch of the new level's MERGE statement:
    WHEN NOT MATCHED THEN
    INSERT
    ("DIMENSION"."LEVEL_ID", -- Surrogate Key
    "DIMENSION"."LEVEL_CODE", -- Business Key
    "DIMENSION"."LEVEL_NAME", -- Short Description
    "DIMENSION"."LEVEL_DESCRIPTION") -- Long Description
    VALUES
    (-1 * ("DIMENSION_SEQ".NEXTVAL + 1)/* MULT_BY_NEG1_1.OUTGRP1.NEG_NEXTVAL */,
    "MERGE_SUBQUERY$1"."LEVEL_CODE$11",
    "MERGE_SUBQUERY$1"."LEVEL_NAME$9",
    "MERGE_SUBQUERY$1"."LEVEL_DESCRIPTION$9")
    This can be seen in the pluggable-mappings as well. DIMENSION_KEY is included in all "old" levels, but not in the new level.
    The only workaround I found was to rebuild the complete dimension from scratch. You can imagine this is not a pleasant task, especially as our dimensions grow in the future.
    The correct code looks like this
    WHEN NOT MATCHED THEN
    INSERT
    ("DIMENSION"."LEVEL_ID", -- Surrogate Key
    "DIMENSION"."LEVEL_CODE", -- Business Key
    "DIMENSION"."LEVEL_NAME", -- Short Description
    "DIMENSION"."LEVEL_DESCRIPTION", -- Long Description
    "DIMENSION"."DIMENSION_KEY")
    VALUES
    (-1 * ("DIMENSION_SEQ".NEXTVAL + 1)/* MULT_BY_NEG1_1.OUTGRP1.NEG_NEXTVAL */,
    "MERGE_SUBQUERY$1"."LEVEL_CODE$11",
    "MERGE_SUBQUERY$1"."LEVEL_NAME$9",
    "MERGE_SUBQUERY$1"."LEVEL_DESCRIPTION$9",
    -1 * ("DIMENSION_SEQ".CURRVAL + 1)/* MULT_BY_NEG1_1.OUTGRP1.NEG_CURRVAL */)
    Anybody with the same problem(s)?
    TIA,
    Slavo

    When you assigned the new level into the hierarchy, did you also remember to turn on the attributes for that level that correspond to the surrogate and business keys?
    Scott

  • Example of a custom field mapping?

    Ok, I admit it, I am struggling here. I have simplified my example from
    what I actually have.
    I have a table that models a flat hierarchy
    ID | START_DATE | END_DATE | CLASSNAME | FIELD1 | FIELD2 | ...
    one of the objects in my hierarchy (CashFlow) has a field that is in fact
    another object called DatePeriod that contains two fields, startDate and
    endDate.
    I understand that what I am trying to do is embed the DatePeriod object
    inside the larger object when it gets persisted.
    I have the following metadata set-up
    <class name="CashFlow" persistence-capable-superclass="InstrumentFlow">
    <extension vendor-name="kodo" key="table" value="INSTRUMENT_FLOW"/>
    <extension vendor-name="kodo" key="pk-column" value="ID"/>
    <extension vendor-name="kodo" key="class-column" value="none"/>
    <extension vendor-name="kodo" key="lock-column" value="none"/>
    <field name="accrualPeriod" embedded="true"/>
    </class>
    and for my DatePeriod object
    <class name="DatePeriod">
    <extension vendor-name="kodo" key="table" value="INSTRUMENT_FLOW"/>
    <extension vendor-name="kodo" key="pk-column" value="ID"/>
    <extension vendor-name="kodo" key="lock-column" value="none"/>
    <extension vendor-name="kodo" key="class-column" value="none"/>
    <field name="startDate">
    <extension vendor-name="kodo" key="data-column" value="START_DATE"/>
    </field>
    <field name="endDate">
    <extension vendor-name="kodo" key="data-column" value="END_DATE"/>
    </field>
    </class>
    Every time I try to fetch a CashFlow object I get an error as KODO is
    trying to select the column 'ACCRUALPERIODX.'
    What am I doing wrong? Do I need to use a custom field mapping? If so
    where is the documentation to help me write a custom field mapping?
    A second question would be: what happens if the DatePeriod object is used
    in a couple of places? I don't want to tie its persistence to the
    INSTRUMENT_FLOW table.
    All help gratefully received
    Cheers
    Matt

    As you suspect, Kodo 2.x does not support embedded class mappings. Kodo
    3.0 will support embedded mappings.
    In the meantime, you can create a custom mapping, but unfortunately our
    documentation for custom mappings is lacking right now. Given how simple
    your DatePeriod object is, you're probably better off with something
    simpler (and as a bonus, less bound to Kodo):
    Just make your DatePeriod class and the field that holds the DatePeriod
    instance non-persistent. In the class that has the (now non-persistent)
    DatePeriod field, add two new persistent fields for the startDate and
    endDate. Then just use internal logic to construct the DatePeriod from
    the startDate and endDate. You can do this using the
    javax.jdo.InstanceCallbacks methods, or just do the logic in your setter
    and getter methods for the DatePeriod.

  • How to use the out parameter of a transformation

    Hi All,
    I have a requirement where I need to move all the transformations from process flows into mappings, so for each transformation I need one map which calls that transformation. One of my transformations has both an input and an output parameter. If I use this transformation in a mapping, how do I use its out parameter? This out parameter needs to be used in other mappings. Can someone please help me?
    Thanks in advance

    Hi,
    I'm not quite sure what you are trying to do.
    What works: connecting the outgroup of a pre- or post-mapping process operator to the mapping output parameter operator.
    What does not work: connecting the outgroup of an operator that can return more than one row (e.g. table operator, filter, joiner, ...) to the mapping output parameter operator. The mapping output parameter just returns "one row", like a PL/SQL function call.
    You cannot pass a "data stream" from one mapping to another. Maybe pluggable mappings are what you are looking for.
    Regards,
    Carsten.

  • Mapping creation best practice

    What is the best practice when designing OWB mappings?
    Is it better to have a smaller number of complex mappings or a larger number of simple mappings, particularly when accessing a remote DB to extract the data?
    A simple mapping might have fewer source tables, while a complex mapping might have more source tables and more expressions.

    If you're an experienced PL/SQL (or other language) developer then you should adopt similar practices when designing OWB mappings i.e. think reusability, modules, efficiency etc. Generally, a single SQL statement is often more efficient than a PL/SQL procedure therefore in a similar manner a single mapping (that results in a single INSERT or MERGE statement) will be more efficient than several mappings inserting to temp tables etc. However, it's often a balance between ease of understanding, performance and complexity.
    Pluggable mappings are a very useful tool to split complex mappings up, these can be 'wrapped' and tested individually, similar to a unit test before testing the parent mapping. These components can then also be used in multiple mappings. I'd only recommend these from 10.2.0.3 onwards though as previous to that I had a lot of issues with synchronisation etc.
    I tend to have one mapping per target and where possible avoid using a mapping to insert to multiple targets (easier to debug).
    From my experience with OWB 10, the code generated is good and reasonably optimised; the main exception that I've come across is when a dimension has multiple levels: OWB will generate a MERGE for each level, which can kill performance.
    Cheers
    Si
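    As a rough illustration of the "single SQL statement vs. procedural code" point above, this is the general shape of the set-based code a single-target mapping collapses to (table and column names are hypothetical, not actual OWB output):

    ```sql
    -- Set-based: one MERGE, processed entirely inside the database engine.
    -- A row-by-row PL/SQL loop doing the same work pays a context switch
    -- per row, which is why one mapping generating one MERGE usually beats
    -- several mappings staging through temp tables.
    MERGE INTO dim_customer tgt
    USING (SELECT customer_id, customer_name
           FROM   stg_customer) src
    ON (tgt.customer_id = src.customer_id)
    WHEN MATCHED THEN
      UPDATE SET tgt.customer_name = src.customer_name
    WHEN NOT MATCHED THEN
      INSERT (tgt.customer_id, tgt.customer_name)
      VALUES (src.customer_id, src.customer_name);
    ```

    The trade-off Si mentions still applies: a statement like this is efficient, but folding too much logic into one mapping can make it hard to understand and debug.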

  • MMM1061: Corrupted array: position mismatch at 0

    Hello
    We got severe errors on Parent Mappings with Pluggable Mappings inside. We can analyze and deploy the Parent Mappings without any error, but we get errors when synchronizing the Parent Mappings.
    1. Error on synchronizing a Parent Mapping (Pluggable Mappings inside)
    Msg Box:
    MMM1061: Corrupted array: position mismatch at 0.
    MMM1061: Corrupted array: position mismatch at 0.
    2. Error on closing the previously synchronized Parent Mapping
    Msg Box:
    INTERNAL ERROR: in MapGenerationLanguageValidator.validate(GENERATION_LANGUAGE, PLSQL)
    3. Error on opening this or any other mapping:
    Msg Box:
    Repository Usage Error: Start nested transaction while already in one. Please contact Oracle Support with the stack trace and details on how to reproduce it.
    All 3 msg boxes have a Detail button with a lot of Java function calls (too much to post it here).
    What we tried so far (none of which helped):
    - Copying an existing Pluggable Mapping and using the copy in the existing Parent Mapping
    - Removing the Pluggable Mapping from the Parent Mapping and then including it again
    - Recreating the Pluggable Mapping from scratch and including it in the Parent Mapping
    - Rebuilding a new Pluggable Mapping from scratch along with a new Parent Mapping using the newly created Pluggable Mapping
    - Making sure all developers accessing the OWB Repository have the same language settings (English)
    The only thing that helped so far was creating a dummy Pluggable Mapping/Parent Mapping NOT using (including) any other existing objects from this Repository (like tables, etc.)
    The sync strategy (merge, update) has no influence on whether we can sync successfully or not.
    My conclusion so far is:
    a) We have a severe internal Repository problem
    or
    b) This OWB version doesn't properly support Pluggable Mappings/Parent Mappings.
    Has anyone already had the same problems? We are working with OWB Client 10.2.0.1.31 and Warehouse Builder Repository 10.2.0.1.0.
    Any hints are welcome.
    Thank you

    Hi,
    I recommend using the latest OWB patch, 10.2.0.4. The first release of 10.2 was quite buggy concerning pluggable mappings; we did not use them in that version.
    Regards,
    Carsten.

  • Sequence in process flow (output from mapping)

    I have a mapping where I am using a sequence operator and creating a record in the target table. I need to pass this value out of the mapping and as input to another mapping, but I am not able to pass the sequence out of the mapping. How can we assign values to mapping output variables? However I do it, it says you cannot assign variables to output parameters from the mapping:
    mapping input parameters and mapping output parameters are intended to be executed before and after the data flow of the mapping and cannot accept inputs from any part of the data flow.
    Any ideas as to how this can be done.
    Can a pluggable mapping be called in a process flow.
    Thanks
    Edited by: user8023060 on Jun 23, 2009 6:08 AM

    Hi,
    write a PL/SQL function that returns seq_XXX.CURRVAL (or, to be on the safe side, select the value from your table).
    Assign that value to an output attribute.
    And no, pluggable mappings can only be included in other mappings; there is no way to execute them directly.
    Regards,
    Carsten.
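    Carsten's suggestion could look something like the sketch below. The function, sequence, and table names are placeholders, not from the original post, and it follows his "safe side" variant of reading the value back from the table rather than relying on CURRVAL:

    ```sql
    -- Hypothetical helper: wrap the value in a function so a post-mapping
    -- process can feed it to the mapping output parameter, which the
    -- process flow then passes to the next mapping as an input parameter.
    CREATE OR REPLACE FUNCTION get_last_seq_value
    RETURN NUMBER
    IS
      v_val NUMBER;
    BEGIN
      -- seq_xxx.CURRVAL only works in a session that has already called
      -- NEXTVAL, so reading the value back from the target table is safer:
      SELECT MAX(the_pk)
      INTO   v_val
      FROM   my_target_table;
      RETURN v_val;
    END get_last_seq_value;
    /
    ```

    Remember the constraint Carsten describes: the output parameter is populated outside the data flow, so the function must be called from a pre/post-mapping process operator, not from the row stream itself.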

  • Problem in Synchronization in OWB 11gR2

    Hi
    While synchronizing one of the objects (i.e. a sequence) of a mapping under the "XYZ" module, OWB hung and after some minutes it closed.
    When I reconnected to the repository, I was surprised to see that the "XYZ" module was not present under My Project.
    Can anyone suggest what to do now?
    Is this some kind of bug in 11gR2?
    -Regards
    Shubham

    We have installed below patches
    10185523 PSE FOR BASE BUG 10270220 ON TOP OF 11.2.0.2 Mega Patch that includes a large number of fixes.
    11817721 PSE FOR BASE BUG 11798880 ON TOP OF 11.2.0.2.1 Fixes error when validating pluggable mappings.
    11821008 PSE FOR BASE BUG 10373424 ON TOP OF 11.2.0.2.0 Fixes a generic bug
    12589448 PSE FOR BASE BUG 12565505 ON TOP OF 11.2.0.2+MP1 Fixes error when generating code from pluggable mappings
    11933912 Bug 11852736 - EXECUTION-DEPLOYMENT JOB LIST EMPTY OR INCOMPLETE WITH 11.2.0.2 CUMULATIVE PATCH Fixes bug when monitoring active jobs
    12365886 OWB MAPPING WITH CASE EXPRESSION IN FRONT OF LOOKUP WON'T GENERATE ANSI OUTER JOIN Fixes bug
    11905191 ENHANCEMENT TO ALLOW CHANGING OF LOC TYPE FROM HPS TO NET CONNECTION Fixes bug
    12344934 REPOSITORY BROWSER SHOWS POOR PERFORMANCE Fixes bug
    12398947 BUGS FIXED 11938478, 11835354 Fixes error with target load order
    12660663 Bug 12612428 - WORKFLOW IMPORT FAILS WITH API0259: THE OBJECT CANNOT BE EDITED IN READ-ONLY MOD
    Bug 12641488 - MDL IMPORT FAILS WITH API0259: THE OBJECT CANNOT BE EDITED IN READ-ONLY MODE! Fixes error "API0259: The object cannot be edited in Read-Only mode!" when importing PF.
    12357673 Bug 11074442 - MDL3001: CONNECTFROM <XXX> NOT FOUND FOR ATTRIBUTE <YYY> Fixes import error from 11g MDL to 11g Env
    Please let me know if any other patch is required in addition to those mentioned above.
