Regarding Performance Tuning of Mapping & Process Flow

Hi,
I have around 60-70 GB of data in the target database.
I need to improve the performance of my mapping and process flow.
I have used a lookup transformation in the mapping.
Please give me some tips for improving the performance of the process.
Thanks,

Please go through a performance tuning book for Oracle 10gR2.
Most importantly, remember that in Oracle 10g the performance of your mappings can be increased manifold by following these steps:
1. Do not design a mapping in which you load a table, then select from that table to load another table, and so on. This is a bad design.
2. Keep mappings as simple as possible. In other words, if a mapping is complicated in terms of joins or other operators, split it into more than one part.
3. Also make sure that all your source tables are analyzed using DBMS_STATS. Ensuring this one single step can make your work much easier.
4. Put indexes where your predicate has very high selectivity. Also keep in mind the column ordering of the index.
5. Use set-based operation, since it is always better to achieve the result with one single query rather than a loop and multiple inserts.
6. Use the APPEND PARALLEL hint while loading the target tables. Direct-path loading minimizes redo and saves time (see the sketch after this list).
7. Recheck your use of performance-intensive operators such as UNION, DISTINCT and AGGREGATION.
8. When using a sequence operator to load a large table, make sure the sequence is cached with a reasonable number of values.
9. When loading large volumes of data, hash joins are the most appropriate choice more often than not, so you can use the USE_HASH hint when selecting from large tables.
10. Filter out as much unneeded data as possible early in the mapping, before doing multiple joins.
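A minimal sketch pulling several of these points together (all table, sequence and column names are hypothetical):
-- 3. Gather optimizer statistics on a source table (cascade => true includes its indexes)
begin
  dbms_stats.gather_table_stats(ownname => user, tabname => 'SRC_ORDERS', cascade => true);
end;
/
-- 8. Make sure the surrogate-key sequence is cached
alter sequence trg_orders_seq cache 1000;
-- 5/6/9/10. One set-based, direct-path, parallel insert with a hash join and an early filter
alter session enable parallel dml;
insert /*+ APPEND PARALLEL(t, 4) */ into trg_orders t (order_key, order_id, cust_name)
select /*+ USE_HASH(o c) */ trg_orders_seq.nextval, o.order_id, c.cust_name
  from src_orders o
  join src_customers c on c.cust_id = o.cust_id
 where o.order_date >= date '2009-01-01';
commit;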
I am sure there are many more ... the above is just a random list that I could remember right now. Please go through the Oracle Performance Tuning Guide and Tom Kyte's Expert One-on-One Oracle. Knowledge of performance tuning grows with experience. I am also learning each day!
Regards
AP

Similar Messages

  • Business Process Maps/ Process Flows for Retail Operations

    Dear All,
               I am looking for Business Process Maps/Process Flows for Retail Operations like forecasting, inter-store transfers, ASN, procurement, allocation, etc.
    Kindly provide me with links, docs, PPTs, etc.
    Appreciation would be in the form of points.
    Regards,
    Jack Silverz

    Hi Jack
    Seeing as you only got one reply, I'm jumping in with my two penn'orth.
    Seldom have I found process content which sufficiently explains both manual and automated processes in a format simple enough to engage discussion (between project teams and business representatives). Is there a way to strike the required balance: simple enough for the sake of engagement, yet rich enough to record SAP dependencies, system requirements and so forth?
    An answer, in my opinion, is [here|https://ecohub.sdn.sap.com/irj/ecohub/solutions/control2007]. (Honest account - I work for Nimbus. But you don't have to take my word for it. Check out our case studies if you like. This approach is being adopted by several strategic SAP accounts, several of which are listed [here|https://ecohub.sdn.sap.com/irj/ecohub/solutions/control2007].)
    What might partly address your requirement is the [APQC Process Classification Framework|http://www.apqc.org/portal/apqc/site/?path=/research/pcf/index.html]. We took this framework as a starting point for building a model company process map. I cannot claim that it is retail specific, but we have quite a few retail customers who have adopted Control 2007, and we have also delivered several supply chain optimization projects.
    If this is of interest, we provide a web-hosted demonstration environment where you can explore the APQC process content I mentioned above. You can contact me through [SAP EcoHub|https://ecohub.sdn.sap.com/irj/ecohub/solutions/control2007] for more details.
    Kind regards,  Nigel

  • Regarding Performance Tuning...

    HI Experts,
    Can you suggest how exactly performance is checked?
    What measures are considered for a performance check?
    What is the ideal result of a performance check for a given program,
    that is, how much of the runtime should be database, ABAP and system
    time in the program?
    I checked my program in transaction SE30, and it shows
    above 90% database time, very little ABAP time and
    almost no system time.
    So please tell me exactly how much it should be.
    Thanks in advance.
    Regard,
    vijay chavan

    Hi
    These are the ways of checking the performance of a report.
    Tools for performance analysis:
    Runtime analysis (transaction SE30)
    SQL Trace (transaction ST05)
    Extended Program Check (SLIN)
    Code Inspector (SCI)
    Runtime analysis (transaction SE30): This transaction gives a complete analysis of an ABAP program with respect to database and non-database processing.
    SQL Trace (transaction ST05): The trace list has many lines that are not related to the SELECT statements in the ABAP program. This is because the execution of any ABAP program requires additional administrative SQL calls. To restrict the list output, use the filter when displaying the trace list.
    The trace list can contain several SQL statements related to a single SELECT statement in the ABAP program. This is because the R/3 Database Interface - a sophisticated component of the R/3 Application Server - maps every Open SQL statement to one or a series of physical database calls and executes them. This mapping, which is crucial to R/3's performance, depends on the particular call and database system. For example, a SELECT-ENDSELECT loop on a particular database table in the ABAP program would be mapped to a PREPARE-OPEN-FETCH sequence of physical calls in an Oracle environment.
    The WHERE clause in the trace list's SQL statement is different from the WHERE clause in the ABAP statement. This is because in an R/3 system a client is a self-contained unit with separate master records and its own set of table data (in commercial, organizational, and technical terms). With ABAP, every Open SQL statement automatically executes within the correct client environment. For this reason, a condition with the actual client code is added to every WHERE clause if a client field is a component of the table being searched.
    To see a statement's execution plan, just position the cursor on the PREPARE statement and choose Explain SQL. The detailed explanation of the execution plan depends on the database system in use.
    Extended Program Check (SLIN)
    This can be called through transaction SE38 or directly through transaction SLIN. It indicates possible problems that may cause performance issues.
    Code Inspector (SCI)
    You can call the Code Inspector from the ABAP Editor (SE38), the Function Builder (SE37), the Class Builder (SE24), or as a separate transaction (SCI).
    The Code Inspector indicates possible problems. However, note that, especially with performance issues, there is no rule without exception. If a program passes an inspection, it does not necessarily mean that the program has no performance problems.
    Runtime analysis (transaction SE30)
    Steps:
    In transaction SE30, fill in the name of the transaction or program which needs to be analyzed for performance tuning.
    For our case, let this be “ZABAP_PERF_TUNING”.
    After giving the required inputs to the program, execute it. After the final output list has been displayed, press the “Back” button.
    On the original SE30 screen, now click the “Analyze” button.
    The percentage for each of the areas ABAP/Database/System shows the share of the total runtime used by that area while running the program. The lower the database load, the faster the program runs.
    SQL Trace (ST05)
    Steps:
    Starting the trace:
    To start a trace, do the following:
    Choose the menu path Test → Performance Trace in the ABAP Workbench, or go to transaction ST05. The initial screen of the test tool appears. In the lower part of the screen, the status of the Performance Trace is displayed. This tells you whether any of the performance traces are switched on, the users for whom they are enabled, and which user switched the trace on.
    Using the selection buttons provided, set which trace functions you wish to switch on (SQL trace, enqueue trace, RFC trace, table buffer trace).
    If you want to switch on the trace under your user name, choose Trace On. If you want to pass values for one or several filter criteria, choose Trace with Filter. Typical filter criteria are: user name, transaction name, process name, and program name.
    Now run the program to be analyzed.
    Stopping the trace:
    To deactivate the trace:
    Choose Test → Performance Trace in the ABAP Workbench. The initial screen of the test tool appears. It contains a status line displaying the traces that are active, the users for whom they are active, and the user who activated them.
    Select the trace functions that you want to switch off.
    Choose Deactivate Trace. If you started the trace yourself, you can switch it off immediately. If the performance trace was started by a different user, a confirmation prompt appears before deactivation.
    Analyzing sample trace data:
    PREPARE: Prepares the OPEN statement for use and determines the access method.
    OPEN: Opens the cursor and specifies the selection result by filling the selection fields with concrete values.
    FETCH: Moves the cursor through the dataset created by the OPEN operation. The array size displayed beside the fetch data means that the system can transfer a maximum package size of 392 records at one time into the buffered area.
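    A rough Oracle-side PL/SQL analogy (not ABAP; the table and column are only illustrative) of the OPEN/FETCH pattern seen in the trace, where the cursor is opened once and rows are then fetched in fixed-size arrays:
    declare
      cursor c is
        select carrid from sflight;                  -- illustrative table/column
      type t_tab is table of sflight.carrid%type;
      v_rows t_tab;
    begin
      open c;                                        -- OPEN: bind values fixed, result set defined
      loop
        fetch c bulk collect into v_rows limit 392;  -- FETCH: an array of up to 392 rows per call
        exit when v_rows.count = 0;
        null;                                        -- process the fetched array here
      end loop;
      close c;
    end;
    /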
    Reward if useful.

  • Function to call Process-flows/mapping from SQL

    Hi All,
    Can any one of you send me the function to call process flows/mappings (e.g. I have 10 mappings/process flows)? It is an urgent requirement pending in my final testing.
    Please help me in this regard.
    Regards

    Hi
    All the information that you need can be found in $ORACLE_HOME/owb/rtp/sql/. Basically, all executions in OWB are invoked through the wb_rt_api_exec package.
    For example:
    declare
      exec_return_code number;
      procesflow       varchar2(255);
      plsqlname        varchar2(255);
    begin
      -- Initialize return code
      exec_return_code := owbrt_sys.wb_rt_api_exec.RESULT_FAILURE;
      -- Run the process flow, then the PL/SQL mapping if the flow succeeded
      procesflow := 'WF_X1';
      plsqlname  := 'MP_X1';
      exec_return_code := owbrt_sys.wb_rt_api_exec.run_task('OWF_LCTN', 'PROCESS', procesflow, ' ');
      if exec_return_code = 1 then
        exec_return_code := owbrt_sys.wb_rt_api_exec.run_task('STAGE_LCTN', 'PLSQL', plsqlname, ' ');
      end if;
    end;
    Regards

  • OWB 10.1 -- How to create a job for a process flow

    Dear All
    I have created a mapping and a process flow. Deployment and generation completed successfully.
    When I execute my process flow, it brings all the data from the source table to the destination warehouse schema table.
    I want to automate this process and run it daily at a given time. I cannot find any control center or other utility in OWB where I can create a job or schedule to achieve this goal. My OWB version is 10.1.
    Regards
    Sayeed

    Follow the steps below to schedule a mapping:
    1. Define the scheduler.
    2. Right-click the Schedule node -> New and define the module, say SCHE_MOD.
    3. Right-click SCHE_MOD -> New and fill in all the details about when to run the schedule, say every Sunday.
    4. Right-click the mapping or process flow you want to schedule and, under Configure -> Referred Calendar, select the schedule name.
    5. Deploy the mapping.
    6. Deploy the schedule, then run the scheduler.
    Then every Sunday the mapping/process flow will be executed. (An alternative using DBMS_SCHEDULER is sketched below.)
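    If the built-in OWB calendar is not an option (for example on OWB 10.1), a hedged alternative is a DBMS_SCHEDULER job that invokes the runtime API directly; the location, process and job names below are hypothetical, and the run_task call mirrors the wb_rt_api_exec example earlier on this page:
    begin
      dbms_scheduler.create_job(
        job_name        => 'RUN_WF_X1_DAILY',       -- hypothetical job name
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'declare rc number; begin rc := owbrt_sys.wb_rt_api_exec.run_task(''OWF_LCTN'',''PROCESS'',''WF_X1'','' ''); end;',
        start_date      => systimestamp,
        repeat_interval => 'FREQ=DAILY; BYHOUR=2',   -- every day at 02:00
        enabled         => true);
    end;
    /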
    (Mark the answer as helpful or correct if it is)
    Cheers
    Nawneet

  • EMail and SQLPLUS activity in process flow

    Hi Guys,
    I am struggling to get the email activity working. I entered all the details, but no email is sent.
    Secondly, I have a simple SQL*Plus script that I pasted into the script property box,
    but the SQL*Plus activity just does not execute:
    truncate table xxx;
    quit
    Any ideas?

    Make sure there are no spaces after you have replaced DISABLED with NATIVE_JAVA.
    Any space after NATIVE_JAVA will stop it from working.
    Can you give details, such as whether you are using the Script or the Parameter List, etc.?
    Did you configure the deployed location for the SQL*Plus activity?
    Regarding your email:
    Is your process flow showing any error regarding the email?
    Create a simple process flow with email activity only.
    Set FROM_EMAIL, TO_EMAIL, SUBJECT, MESSAGE_BODY, SMTP_SERVER and PORT.
    The SMTP_SERVER and PORT values are very important.
    Check the connection to the SMTP server by running the following command at a command prompt:
    telnet smtp_server_name port
    If this works, it means your SMTP server is reachable (a PL/SQL alternative to the telnet check is sketched below). Now just deploy and run your process flow.
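    If telnet is not available on the server, a minimal PL/SQL sketch of the same reachability check (the host name and port are placeholders, and UTL_TCP needs the appropriate privileges):
    declare
      c    utl_tcp.connection;
      resp varchar2(512);
    begin
      -- open a raw TCP connection to the SMTP server and read its greeting banner
      c := utl_tcp.open_connection(remote_host => 'smtp_server_name', remote_port => 25);
      resp := utl_tcp.get_line(c, remove_crlf => true);   -- expect something like "220 ..."
      dbms_output.put_line('SMTP banner: ' || resp);
      utl_tcp.close_connection(c);
    end;
    /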
    If it still doesn't send you an email, check what is in the log.
    The log should show any errors regarding the email, such as ORA-06512: at "SYS.UTL_TCP", line 17, etc. (if any).
    Thanks,
    Sam.

  • MM procurement process flow diagrams

    Hi, can anybody send me the MM procurement process diagrams?
    thanks and regards,
    Sri.,

    Hi All,
    The process flow for SAP MM can be found at the following links:
    http://help.sap.com/printdocu/core/Print46c/en/data/pdf/SVASAPROZ/SVASAPROZ.pdf
    http://sapbrainsonline.com/sap-training/SAP-tutorial-pdf-material_582.html
    http://www.ntu.edu.sg/finance2/purchasing/sap%20procurement%20process%20flow.doc
    Hope this will solve your problem
    Pherasath

  • Manual Commit using Process Flows

    Hi,
    I have 2 mappings, each with the commit control property set to 'Manual'. I created a process flow to run these mappings. Following the instructions in the OWB user guide, I added a single SQLPLUS activity to the process flow and entered the following in the 'SCRIPT' parameter of the SQLPLUS activity:
    DECLARE
      status VARCHAR2(30);
    BEGIN
      SCHEDULER_TESTING.main(status);
      IF status != 'OK' THEN
        ROLLBACK;
      ELSE
        ERROR_LOG_TESTING.main(status);
        IF status != 'OK' THEN
          ROLLBACK;
        ELSE
          COMMIT;
        END IF;
      END IF;
    END;
    When I deploy and run this process flow, it shows the execution as successful (COMPLETE:OK) in the Control Center. But when I check the target tables, no records have been inserted. Also, when I check the execution job report in the Repository Browser, it shows the status as 'Complete:Failure', without displaying any error message.
    There is no problem with the mappings, as executing them independently gives the desired result.
    Any idea what is going wrong?
    Thanks and Regards,
    Amit

    OK, well from the process flow perspective your script exited normally, which is why it shows the SQLPLUS activity as having completed properly. You would need an exit variable in the script that the activity's outgoing transition checks in order for the process flow to know that a failure occurred, because whether the mappings succeed or not is not visible to the process flow from this script.
    But the other problem, of course, is why the script isn't running the mappings properly, and you have no way to determine this without doing some logging from the script.
    Perhaps, as a quick and dirty testing idea, you could create a table proc_log(msg varchar2(500)) and amend your script to:
    DECLARE
      status1 VARCHAR2(30);
      status2 VARCHAR2(30);
    BEGIN
      SCHEDULER_TESTING.main(status1);
      IF status1 != 'OK' THEN
        ROLLBACK;
      ELSE
        ERROR_LOG_TESTING.main(status2);
        IF status2 != 'OK' THEN
          ROLLBACK;
        ELSE
          COMMIT;
        END IF;
      END IF;
      insert into yourSchema.proc_log values ('Status for scheduler_testing: '||status1);
      insert into yourSchema.proc_log values ('Status for error_log_testing: '||status2);
      commit;
    EXCEPTION
      when others then
        rollback;
        insert into yourSchema.proc_log values ('EXCEPTION: '||sqlerrm);
        commit;
    END;
    just to try and see what is happening in there.
    Now, to put an exit variable into your script you would need to do something like:
    variable exec_return_code number;
    DECLARE
      -- we'll use 0 for success and 1 for failure
      returncode number := 0;
      status VARCHAR2(30);
    BEGIN
      SCHEDULER_TESTING.main(status);
      IF status != 'OK' THEN
        returncode := 1;
        ROLLBACK;
      ELSE
        ERROR_LOG_TESTING.main(status);
        IF status != 'OK' THEN
          returncode := 1;
          ROLLBACK;
        ELSE
          COMMIT;
        END IF;
      END IF;
      :exec_return_code := returncode;
    END;
    /
    exit :exec_return_code
    I haven't tried this yet, but it is the only way I can think of to get any sort of success/failure result returned from a SQLPLUS activity to the process flow.
    Cheers,
    Mike

  • Process flow/map performance issues

    We have some issues with our OWB-based application and we're looking to find out if there are different ways we could be using the tool, or features/options we've missed.
    We are trying to maintain a near real time feed of data from a front end system into our warehouse which was built using OWB 10.2.0.3 over a 10.2.0.4 database. The bulk of the application consists of OWB maps with a few hand-written PL/SQL objects, all executed in a series of hierarchical OWB process flows. Maps/transformations are executed either sequentially or in parallel where the referential integrity of the model allows.
    The problem is that we have around 150 tables in the datamart which could potentially require updating on each refresh cycle, although in practice only a few tables have any activity on a typical refresh cycle. The cycle consists of loading data into a set of staging tables, and from there the data is transformed into the main schema, often with multiple maps per target table.
    On every cycle we run hundreds of maps, the vast majority of which process zero rows. Each map runs quickly and efficiently in its own right but collectively they add up to a 5 - 10 min cycle even if there is no data to process.
    There are 2 avenues which we'd like to explore and would be grateful if anyone could provide any pointers/suggestions :-
    1) It appears that each map opens and closes its own database session when it executes. I presume this was done because a single process flow could be constructed with maps executing in different target schemas, but we know that's not the case for us. We'd like to know if there is anyway to configure the database connection at a higher level (eg. process flow) so it opens a connection once and executes each of the maps (database packages) in that one session.
    Our DBAs are experimenting with 'shared server' settings at a database level which may help to some degree but won't be the whole story.
    2) Another option is simply to run fewer maps, e.g. load the staging area as now, collect stats on which staging tables contain new data, and then apply some logic such that subsequent maps only execute if the relevant staging table(s) contain some new data, otherwise bypass that map.
    We tried experimenting with the 'Pre Mapping Process' operator, but essentially that just generates another function call from the map package, so we still have the overhead of opening a database session for each map to run the package. Minimal gain.
    We thought about adding a function call in the process flow before each map and then branching to either execute or bypass the map as appropriate, but the function call still requires opening/closing of a database session each time so, once again, minimal gain.
    What we really want is some way for a map or process flow to perform this check without logging onto the database repeatedly.
    Any ideas on the above, or other potential solutions anyone could suggest, would be greatly appreciated.

    Hi,
    Please see if these documents help.
    Note: 554635.1 - Create Accounting Process Performs Poorly When 100K + Distributions are Passed for an Event
    Note: 954273.1 - Multiple Create Accounting Requests Result In Poor Performance For Online Accruals
    Note: 763500.1 - R12: Performance Issue with Create Accounting
    Note: 733637.1 - R12:Performance Issue When Running Accounting Program Xlaaccup
    Note: 781311.1 - Create Accounting Process Taking A Long Time To Complete After Appying Critical Patches
    Note: 557869.1 - EBS: R12 Oracle Financials Critical Patches
    Regards,
    Hussein

  • Performance issues executing process flows after upgrading db to 10G

    We have installed OWF 2.6.2, and initially our database was at 9.2. Last week we upgraded the database to 10g, and process flow executions are taking a lot longer: from 1 minute to 15 minutes.
    Any ideas anyone what could be the cause of this performance issue?
    Thanks,
    Yanet

    Hi,
    The Oracle 10g database behaves differently with respect to the statistics of tables and indexes. So check these, and check whether the mappings are updating these statistics at the right moments with respect to the ETL process and at the right interval.
    Also, check your generated sources for how statistics are gathered (dbms_stats.gather...). Does an index that might play a vital role in Oracle9i get new statistics, or only the table? Or only the table whose number of rows was doubled by this mapping?
    You can always take matters into your own hands by letting OWB NOT generate the statistics-gathering code, and calling your own procedure in a post-mapping process (see the sketch below).
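    A minimal sketch of such a procedure (the procedure name is hypothetical); cascade => true gathers index statistics along with the table statistics, which addresses the index question above:
    create or replace procedure gather_target_stats(p_table in varchar2) as
    begin
      -- gather table AND index statistics for a mapping's target or staging table
      dbms_stats.gather_table_stats(
        ownname          => user,
        tabname          => p_table,
        estimate_percent => dbms_stats.auto_sample_size,
        cascade          => true);
    end;
    /
    -- e.g. call gather_target_stats('MY_TARGET_TABLE') from the post-mapping process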
    Regards,
    André

  • Invoking a mapping from another DB in a process flow

    Hi experts, has anyone tried to invoke/execute an OWB mapping from another database in a process flow? Is it possible to do that?
    My process flow (in the staging DB) scenario is to execute several OWB mappings in the staging DB, and then the final step is to execute an OWB mapping which is created in the warehouse DB.
    Is that possible?

    Look at this thread
    [Calling WB_RT_API_EXEC.RUN_TASK over database link|http://forums.oracle.com/forums/thread.jspa?threadID=775938]
    Regards,
    Oleg

  • OWB 10gR2 map returning status = Complete and Result = Null in Process Flow

    Hi,
    I have an OWB process flow which invokes an OWB map. There are two outgoing transitions attached to the OWB map: the first is a conditional transition (on SUCCESS) and the second is an unconditional transition.
    I am monitoring the status of the process flow execution in the Oracle Workflow Monitor. After the process flow is initiated, sometimes the OWB map within it shows status = Complete and result = OK in the Workflow Monitor (in the Status tab). But sometimes it shows status = Complete and result = Null, and in that situation the execution follows the unconditional transition path.
    Can anyone tell me what causes the OWB map to return result = NULL? Also, sometimes the map returns result = Force (with the same status = Complete).
    The configuration that I am working with is as below ---
    OWB client version : 10.2.0.1.31
    OWB repository version : 10.2.0.1.0
    Oracle Workflow Version : 2.6.4.0.0
    Database version : 10g Enterprise edition release 10.2.0.1.0
    Regards,
    Swagata

    Hi Manohar,
    1. You need to install Oracle HTTP Server in a separate Oracle home.
    2. Look for the dads.conf file in the installation. Edit the file, copy the sample entry, paste it below and edit the pasted part as follows:
    <Location /pls/<SID>>
    SetHandler pls_handler
    Order deny,allow
    Allow from all
    AllowOverride None
    PlsqlDatabaseConnectString <server name>:<port>:<SID>
    PlsqlAuthenticationMode Basic
    PlsqlDefaultPage wfa_html.home
    PlsqlNLSLanguage "AMERICAN_AMERICA.WE8ISO8859P1"
    </Location>
    3. Stop and start OPMN (Oracle Process Manager and Notification Server).
    4. Use a URL in your internet browser like: <HTTP server URL>/pls/<SID>/wfa_html.home
    5. When prompted for a user ID and password, give the user ID and password of the schema where the Workflow engine is installed.
    Hope this helps,
    Swagata

  • Process flow fails when executing a mapping with an error

    hi all,
    When we execute a mapping through OWB directly, it works fine.
    But when the same mapping is executed through a process flow, it fails with the following error:
    *"Set based mode not supported ORA-06512"*
    The operating mode is "Set based fail over to row based".
    In the mapping we are doing some transformations and writing the data to a flat file.
    Please help in this regard.
    Regards
    ashok

    Hi,
    Change the process flow configuration property OPERATING_MODE for this mapping to ROW_BASED or ROW_BASED_TARGET_ONLY; a mapping that writes to a flat file target cannot run in set-based mode.
    Hope this helps!

  • OWB Process Flow and Mapping Differences

    I have a customer (IHAC) with a mapping which produces two different operations depending on how the mapping is called from the Deployment Manager.
    When you call the mapping as an individual job an INSERT is performed.
    However, when the SAME MAPPING is called as part of a process flow a MERGE is performed instead.
    The test data for each case is exactly the same, and the result set at the end of the mapping, using either method, is the same.
    Has anyone come across this sort of behaviour before?
    I would be grateful to know why this is happening and whether I should be concerned about it, given that the outcome is the same in both cases.
    Thanks in advance
    Chris

    Hi Christopher,
    This is funky. I have not seen this before... The only thing I can think of is that in the process flow someone changed the runtime parameters for the mapping, running it set based while the map itself is configured to run row based...
    Should not happen otherwise...
    Jean-Pierre

  • Skipping mapping execution in process flow

    I have a process flow that calls multiple mappings. Based on some condition I want a mapping not to execute, e.g. when rerunning the process flow after a failure. Currently I keep track of mapping executions and store the status in a control table; when a mapping has completed, it updates the status to completed. When I need to rerun, the SQL of the table joins in the mapping has a WHERE clause that returns 0 rows (mapping_completed = 'N'), so the mapping is executed but no rows are added or updated.
    The flip side of this approach is that when using a complex join, especially one involving views, the SQL takes a long time to run only to return 0 rows. So the logic is fine, but I want to save time by avoiding the execution of the mapping itself.
    I would like to know how others are implementing this scenario.
    Regards
    Sandeep

    Can you not have two different mappings, one for running the first time and one which you can run on failure?
    In your process flow you can have a parameter, e.g. 'F' for failure and 'I' for initial, and based on this condition you can decide which mapping to invoke and the path to be followed (a sketch of such a branching function follows).
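    A minimal sketch of such a branching function (the control table and column names are hypothetical); an outgoing transition or an assign activity in the process flow could evaluate its return value to decide which path to follow:
    create or replace function get_run_mode(p_map_name in varchar2) return varchar2 as
      v_completed varchar2(1);
    begin
      select mapping_completed
        into v_completed
        from map_control                 -- hypothetical control table maintained by the mappings
       where map_name = p_map_name;
      -- 'F' = rerun after failure (skip mappings already completed), 'I' = initial run
      return case when v_completed = 'Y' then 'F' else 'I' end;
    exception
      when no_data_found then
        return 'I';
    end;
    /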
