Mapping creation best practice

What is the best practice when designing OWB mappings?
Is it better to have a small number of complex mappings or a larger number of simple mappings, particularly when accessing a remote DB to
extract the data?
A simple mapping here means one with fewer source tables, while a complex mapping is one
with more source tables and more expressions.

If you're an experienced PL/SQL (or other language) developer then you should adopt similar practices when designing OWB mappings, i.e. think reusability, modularity, efficiency etc. Generally, a single SQL statement is often more efficient than a PL/SQL procedure; in the same way, a single mapping (one that results in a single INSERT or MERGE statement) will be more efficient than several mappings inserting into temp tables etc. However, it's often a balance between ease of understanding, performance and complexity.
Pluggable mappings are a very useful tool for splitting complex mappings up; these can be 'wrapped' and tested individually, similar to a unit test, before testing the parent mapping. These components can then also be used in multiple mappings. I'd only recommend them from 10.2.0.3 onwards though, as before that I had a lot of issues with synchronisation etc.
I tend to have one mapping per target and, where possible, avoid using one mapping to insert into multiple targets (it's easier to debug).
From my experience with OWB 10, the code generated is good and reasonably optimised. The main exception I've come across is when a dimension has multiple levels: OWB will generate a MERGE for each level, which can kill performance.
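To illustrate the point above about a single mapping compiling down to one set-based statement, here is a minimal sketch of the kind of single MERGE a one-target mapping generates. All table and column names (stg_orders, dim_customer, etc.) are invented for illustration, not taken from any real schema:

```sql
-- Hypothetical example: one set-based MERGE of the kind OWB generates
-- for a single-target mapping, instead of several mappings staging
-- intermediate results in temp tables.
MERGE INTO dim_customer tgt
USING (
  SELECT s.customer_id,
         s.customer_name,
         UPPER(s.country_code) AS country_code   -- an expression operator
  FROM   stg_orders s
) src
ON (tgt.customer_id = src.customer_id)
WHEN MATCHED THEN
  UPDATE SET tgt.customer_name = src.customer_name,
             tgt.country_code  = src.country_code
WHEN NOT MATCHED THEN
  INSERT (customer_id, customer_name, country_code)
  VALUES (src.customer_id, src.customer_name, src.country_code);
```

The whole load is one round trip through the SQL engine, which is where the efficiency gain over multiple temp-table inserts comes from.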
Cheers
Si

Similar Messages

  • Mapping Content Best Practice

    Anyone got suggestions on best practice for mapping contents...
    a) create a lot of mappings - almost 1 for each table, dim, etc;
    b) put many items into logically grouped mappings (e.g. staging static data extract, staging fact extract, etc)
    c) somewhere in between - most staging stuff gets chucked together, major dimension/fact refreshes have their own mappings, possibly put many related smaller dims (status, etc.) in one mapping, etc.
    I assume the considerations are parallel execution, monitoring, maintenance/clarity, simplifying process flows, etc.
    I suppose I'd go for c), but if anyone can suggest things to avoid that'd be good!

    Hi Chewy,
    Mapping composition is not an exact science, as you state yourself.
    The problem also extends to workflows and workflow subprocess division.
    I've made some core rules for myself, that may work for others too:
    1) If I want to be able to reload a target in case of problems, it should be in a separate mapping.
    2) Generally, each target has its own mapping. If several disconnected flows have the same target, I put them in the same mapping.
    3) If I split a single source into multiple targets, I generally use a single mapping for that.
    4) If flows are complex (have many operators), I never put two or more together. If your 3600x4800px monitor can't display all of your mapping, you're on the wrong track.
    5) If I want to sequence simple flows and don't care about 1) or which one runs first (a 10gR1 caveat), I put them into the same mapping, e.g. when the source cannot handle concurrent queries well.
    When it comes to workflows, things get more complicated.
    In my current project, most source data moves through three modules: staging, ods (relational), and data mart (star schema). I have two requirements for packaging mappings into workflow subprocesses:
    1) There should be a subprocess per source module that transfers all data from the source to the staging area
    2) Each target (dim or cube) in the data mart should have a silo-type subprocess that handles the data from staging through ods into the data mart. Rule 1) from mappings applies here, as well.
    To do this efficiently, you should have a PL/SQL function that checks, for example, whether the source loaded successfully into staging before commencing with target loading.
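    A minimal sketch of such a check function follows. The audit table etl_load_audit and its columns are hypothetical; a real implementation would query your own control table or the OWB runtime audit views:

    ```sql
    -- Sketch only: returns 'OK' if today's staging load for the given
    -- source module completed, 'FAIL' otherwise.  The workflow process
    -- flow can branch on this return value before loading the target.
    CREATE OR REPLACE FUNCTION staging_load_ok (p_source_module IN VARCHAR2)
      RETURN VARCHAR2
    IS
      v_status VARCHAR2(10);
    BEGIN
      SELECT load_status
        INTO v_status
        FROM etl_load_audit
       WHERE source_module = p_source_module
         AND load_date     = TRUNC(SYSDATE);
      RETURN CASE WHEN v_status = 'COMPLETE' THEN 'OK' ELSE 'FAIL' END;
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        RETURN 'FAIL';  -- no audit row means the staging load never ran
    END staging_load_ok;
    /
    ```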
    Regards, Hans Henrik

  • Database creation best practices.

    Hi,
    We are planning to set up a new database, Oracle 10g, on Sun and AIX. It is a data warehouse environment.
    Can anyone please share documents describing the best practices to be followed during database creation/setup? I googled and found some documents but was not satisfied with them, so I thought of posting this query.
    Regards,
    Yoganath.

    YOGANATH wrote:
    Anand,
    Thanks for your quick response. I went through the link, but it seems rather brief. I need a crisp summary document for my presentation, covering:
    1. Initial parameter settings for a data warehouse to start with, like block_size, db_file_multiblock_read_count, parallel servers etc.
    2. Memory parameters, SGA, PGA (say, for a server with 10 GB RAM).
    3. How to split tablespaces, e.g. large vs. small.
    If someone has a crisp outline document covering the above points, I would be grateful.
    Regards,
    Yoganath

    You could fire up dbca, select the 'data warehouse' template, walk through the steps, and at the end do not select 'create a database' but simply select 'create scripts'. Then take a look at the results, especially the initialization file. Since you chose a template instead of 'custom database' you won't get a CREATE DATABASE script, but you should still get some scripts generated that will answer a lot of the questions you pose.
    You could even go so far as to let dbca create the database. Nothing commits you to actually using that DB. Just examine it to see what you got, then delete it.
    Edited by: EdStevens on Feb 10, 2009 10:41 AM
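    As a rough illustration of the kinds of parameters such a generated initialization file touches, here are warehouse-leaning settings expressed as ALTER SYSTEM statements. The values are hypothetical starting points for a dedicated ~10 GB server, not recommendations; they must be tuned and tested against your own workload:

    ```sql
    -- Hypothetical starting values only.  Data warehouses typically
    -- favour a large PGA (sorts, hash joins) and big multiblock reads
    -- for full scans; OLTP-style defaults lean the other way.
    ALTER SYSTEM SET sga_target = 4G SCOPE=SPFILE;
    ALTER SYSTEM SET pga_aggregate_target = 3G SCOPE=SPFILE;
    ALTER SYSTEM SET db_file_multiblock_read_count = 64 SCOPE=SPFILE;
    ALTER SYSTEM SET parallel_max_servers = 16 SCOPE=SPFILE;
    ```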

  • Vendor Creation Best Practices?

    Should anyone in the Purchasing dept. (someone who is able to issue POs) also have the ability to create vendors via MK01 (local) or XK01 (central)? I'm trying to understand the risks when someone who can issue POs can also set up vendors.
    Thanks in advance!
    Best, Michael

    Hi,
    This depends totally on the organisational setup. Each company has its own checks for curtailing malpractice. There are also international standards available for controlling this.
    There is nothing wrong with the same person creating a vendor master and also issuing a PO, as long as the requisite checks and controls (e.g. a release strategy or authorisation control, to name a few) are in place.
    If the organisation and the responsible/competent authorities are satisfied that a foolproof control mechanism is in place to prevent malpractice, it is fine for the same person to create the vendor master as well as the PO (e.g. a small organisation may not have the luxury of keeping two different people for this activity).
    I hope you are getting my point. SAP is just an enabler for the business to run. The checks and controls have to be there anyway.
    Regards,
    Rajeev

  • File Creation - Best Practice

    Hi,
    I need to create a file daily based on a single query. There's no logic needed.
    Should I just spool the query in a unix script and create the file like that or should I use a stored procedure with a cursor, utl_file etc.?
    The first is probably more efficient but is the latter a cleaner and more maintainable solution?

    I'd be in favour of keeping code inside the database as far as possible. I'm not dismissing scripts at all - they have their place - I just prefer to have all code in one place.
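    For the in-database option, a minimal UTL_FILE sketch follows. The directory object DATA_OUT, the file name, and the orders query are all assumptions for illustration; DATA_OUT must already exist and be writable by the schema:

    ```sql
    -- Sketch: write a daily CSV extract entirely inside the database.
    CREATE OR REPLACE PROCEDURE write_daily_extract IS
      v_file UTL_FILE.FILE_TYPE;
    BEGIN
      v_file := UTL_FILE.FOPEN('DATA_OUT',
                               'extract_' || TO_CHAR(SYSDATE, 'YYYYMMDD') || '.csv',
                               'w');
      FOR rec IN (SELECT order_id, order_total FROM orders) LOOP
        UTL_FILE.PUT_LINE(v_file, rec.order_id || ',' || rec.order_total);
      END LOOP;
      UTL_FILE.FCLOSE(v_file);
    EXCEPTION
      WHEN OTHERS THEN
        -- make sure the handle is released before re-raising
        IF UTL_FILE.IS_OPEN(v_file) THEN
          UTL_FILE.FCLOSE(v_file);
        END IF;
        RAISE;
    END write_daily_extract;
    /
    ```

    A procedure like this can then be scheduled with DBMS_JOB or a cron'd sqlplus call, which keeps the extraction logic versioned and visible alongside the rest of the database code.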

  • IS-Retail: Site Creation Best Practice

    Hi !
    I have 3 basic questions on this topic:
    1. In IS-Retail is a Site considered 'config' or 'master data'?
    2. Do all sites need to exist in the golden config client, or is it sufficient to have only the DCs (and not the Stores) in it?
    3. After go-live, is it ok to create new stores directly in production? (I hope the answer to this q is yes!)
    Thanks,
    Anisha.

    Hi, my answers to your questions are as follows:
    1) In IS-Retail a Site is master data, but to create a Site/DC/Store you need to do some config in SPRO, such as:
    number ranges for sites, account groups, DC and site profiles, and the assignment of account groups to them.
    You then create a DC or site using WB01; from that point it is treated as site master data.
    2) Yes, you should have all sites in the golden client (I am not clear on this question).
    3) After go-live it is better to create new stores in development and then export them (using WBTI/WBTE) to production. If any future changes are required on stores, such as assigning/deleting a merchandise category, you can then do so easily and avoid data mismatches in production.
    regards
    satish

  • Java WebDynpro context mapping  best practices

    Hi Friends,
    The data provided in the context of each view controller and the component controller can be maintained in different ways.
    1. Map view controller fields to the component controller only when the data needs to be accessed in both places;
    fields that are not needed in both places are left unmapped.
    Or: what is the advantage of not mapping fields between view controllers and the component controller?
    2.
    Instead of individual value attributes, a value node may be used to group a related set of fields. Is it best practice to group fields into value nodes according to the screen grouping?
    For example, if a screen has three sub-parts, use three value nodes, each containing its own value attributes. Which approach should be considered best practice?
    Thanks!

    <i>1) Advantage of not mapping is performance;</i>
    Very weak argument. There is no significant performance loss when mapping is used (I bet you save less than a percent compared to "direct" access).
    To put it simply: your business data originates in the controller. You have to show it on the view, hence the need for mapping.
    A view may also require certain context nodes just to set up and control UI elements. Declare these nodes directly in the view controller and you'll need no mapping in that case.
    Valery Silaev
    EPAM Systems
    http://www.NetWeaverTeam.com

  • What oracle best practices in mapping budgeting to be implement at item

    Dear Consultants,
    I really need your consultancy here.
    What are the Oracle best practices for mapping budgeting at the item category level or the item level?
    I want to check funds against the encumbrance account at the item level.
    Case:
    I have three item categories:
    One is computer items
    Two is printer items
    Three is food items
    I want to implement my budget at the item category level.
    Example:
    I want my purchase budget for items of the printer category not to exceed 30,000 USD,
    and for items of the food category not to exceed 45,000 USD.
    How do I map this in Oracle Applications?
    The modules implemented at my site are
    (GL, AP, AR, INV, PURCHASING, OM).
    Please give me the Oracle best practice that handles this case.
    Thanks to all of you

    Hi,
    It is really difficult to have budgetary control on inventory items in an Average Costing environment, as you can have only one inventory account at the inventory organization level.
    You have to modify your PO / Requisition Account Generator to populate the encumbrance account in the PO / Requisition based upon item category. Moreover, the "Reverse Encumbrance" flag in your inventory org needs to be unchecked so that the encumbrances are not reversed when the goods are received.
    Gajendra

  • Mapping Best Practice Doubt

    Dear SDN,
    I have a best-practice question.
    For a scenario where one value needs to be mapped to another value, but the conversion is based on certain logic over R/3 data, what is the recommended implementation:
    1.  Use Value Mapping Replication for Mass Data   or
    2.  Use XSLT ABAP Mapping calling an RFC ??
    Best regards,
    Gustavo P.

    Hi,
    I would suggest you use XSLT/ABAP mapping, or
    use the RFC Lookup API, available from SP14 onwards, to call the RFC from your message mapping itself.
    Regards
    Bhavesh

  • SAP Best Practice: Can't open process maps

    Hi Community,
    when I try to open a process map within the SAP Best Practices package I receive this error message (tried various browsers with low security settings):
    Any ideas how to avoid this error message?
    Thanks in advance,
    Frank

    Hey Frank,
    Is my assumption correct that you use either Firefox or Chrome? Please try with IE.
    In case this does not work, please go to:
    https://websmp108.sap-ag.de/~form/handler?_APP=00200682500000002672&_EVENT=DISPLAY&_SCENARIO=&_HIER_KEY=501100035870000006532&_HIER_KEY=601100035870000146943&
    Select the country you are interested in and download.
    Best,
    xiaoma

  • Best practice for confirmation creation?

    Dear Experts,
    I would like to have your opinion on best practices for confirmation creation in the classic scenario. Do most organizations prefer to create confirmations in SRM or in ECC? One advantage of confirmation creation in SRM is that you can use different workflow scenarios, while ECC MM does not support any workflow or release strategy for confirmations/goods receipts.
    Your input on it will be highly appreciated.
    Regards,
    Ranjan

    Hi Ranjan,
    As far as best practice for confirmation creation goes, it should be decided by who the goods recipient is, what access the goods recipient has, and what type of material it is, direct or indirect.
    Ideally, for most indirect procurement, where the requestor is creating a cart for himself or his department and will be receiving the goods/services himself, he/she should do the confirmation in SRM.
    In the case of direct material, the GR definitely needs to be done in ECC.
    Thanks
    Iftekhar Alam

  • Best Practices for ASAP Inputs - As-Is Business Process Mapping

    I am new to the SAP world and my company is in the early phases of implementation. I am trying to prepare the "as-is" business process maps for the Project Preparation and Business Blueprint phases and I am looking for some best practices. I've been told that we don't want them to go too deep, but are there best practices and/or examples that give more information on what we should be capturing and in what format?
    I have searched the forums, WIKI, ASAP documentation, and other areas but have not found much at this level of detail.  I have reviewed the [SAP BPM Methodology|http://wiki.sdn.sap.com/wiki/display/SAPBPX/BPM+Methodology] but again I am looking for more detail if anyone can direct me to that.
    Thank you in advance for any assistance.
    Kevin

    Hello Kevin,
    You can try to prepare a word document for each of your As-Is processes first before going to As-Is Process Design in a flowchart.
    The word document can have 7 sections -
    The first section can include Name of the Process Owner, Designation, Process Responsibility, User Department(s) involved, Module Name and a Document number for reference.
    The second section can include Process Definition details - Name of the major process, Name of the minor process, Name of the sub process and a Process ID for future reference.
    The third section can be titled as Inputs - this contains details of - Input, Vendor for the input, Type of Input (Data / Activity / Process), Category of Input (Vital / Essential / Desirable) and Mode of Information (Hard / Soft Copy).
    The fourth section can be Process Details. Here you can write the process in detail.
    The fifth section to contain outputs of the process, customer to whom these outputs are sent, type of output (report / approval / plan / request / email / fax), Category of Output (Vital / Essential / Desirable) and Mode of Information (Hard / Soft Copy).
    The sixth section can be Issues / Pain Areas in this process - Issue Description, Remarks, Expectations, Priority (High / Medium / Low)
    The seventh section can be expected reports in future out of this process for internal and external reporting.
    Hope this helps your question.

  • Best Practices for Reprocessing or Reexecuting a mapping

    Hi All
    Please,
    Can someone tell me the best practice for executing a mapping again when it fails?
    How does Oracle Warehouse Builder manage the re-execution and reprocessing of a mapping from a source system to staging area tables?
    - Truncate the stage table and start again?
    - Populate an identical error table and reprocess only that table after fixing each extraction mistake? This would avoid a full reload when the number of failed records is considerably lower than the number of successful ones.
    - Does OWB have a log table of error records? Can I process from that table?
    - Any valuable links?
    (The questions apply to mappings between staging area tables and the warehouse tables too.)
    I'm working with DB v9.2.0.5 and OWB 10g.
    Thanks in Advance
    LEONARDO QUINTERO RIPPE
    [email protected]
    Technical Consulting
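    One common shape for the error-table approach raised in this question can be sketched in PL/SQL as below. All table names (stg_orders, tgt_orders, err_orders) are invented; note that from 10gR2 onwards the DML error logging clause (LOG ERRORS INTO) does this declaratively, but on a 9.2 database a manual fallback like this is typical:

    ```sql
    -- Sketch: row-by-row load that diverts failing rows to an error
    -- table for later inspection and reprocessing, so one bad record
    -- does not force a full truncate-and-reload.
    BEGIN
      FOR rec IN (SELECT order_id, order_total FROM stg_orders) LOOP
        BEGIN
          INSERT INTO tgt_orders (order_id, order_total)
          VALUES (rec.order_id, rec.order_total);
        EXCEPTION
          WHEN OTHERS THEN
            -- capture only the failed row; the rest of the load continues
            INSERT INTO err_orders (order_id, order_total, err_msg, err_date)
            VALUES (rec.order_id, rec.order_total, SQLERRM, SYSDATE);
        END;
      END LOOP;
      COMMIT;
    END;
    /
    ```

    After fixing the source data, only err_orders needs to be reprocessed, which matches the second option listed in the question.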

    Roland,
    The doc is linked on [SAP on DB2 for z/OS|SAP on DB2 for z/OS] or access the doc directly [here|https://service.sap.com/~sapidb/011000358700000525542007E] .
    Regards,
    Thomas

  • Best Practices to update Cascading Picklist mapping for Account record type

    1. Most of the existing picklist value names in the parent and related picklists have been modified in the external app master list, so the same needs to be updated in CRMOD.
    2. If we need to update a picklist value, do we need to DISABLE the existing value and CREATE a new one?
    3. Are there any best practices to avoid doing the cascading picklist mapping manually for the Account record type? We have around 500 picklist values to be mapped between the parent and related picklists.
    Thanks!

    Mahesh, I would recommend disabling the existing values and creating new ones. This means manually remapping the cascading picklists.

  • Best practices TopLink Mapping Workbench multi-user + CVS?

    This might be a very important issue, in our decision whether or not to choose TopLink --
    How well is multi-user development and CVS supported when using the TopLink Mapping Workbench? Are there best practices regarding this use case?
    Thanks.

    We have no problem with the workbench and CVS. Only a couple of our developers are responsible for the mappings, so we haven't really run into concurrent edits. It's pure XML, so a decent merge tool with XML support should let you resolve conflicts pretty easily.
