Insert number of executions vs rows processed

I'm reviewing a conventional load job that was running in my environment and I came across something I couldn't explain and was hoping for some help. Say I run an AWR report, look at a particular insert statement in the "SQL ordered by Executions" section, and check the "Rows per Exec" column. I would expect this value to be one, so that each row inserted increments the execution counter for the same SQL statement. How does Oracle process rows for each execution of the same insert statement? Like I said, I initially thought each row would be an execution. Below is an example from an AWR report; I edited the SQL before posting.
Executions: 33,021,240
Rows Processed: 396,254,880
Rows per Exec: 12.00
Elapsed Time (s): 18,317.08
%CPU: 65.80
%IO: 0.55
SQL Id: 7v0u50ct2tfyn
SQL Module: pmdtm@server123 (TNS V1-V3)
SQL Text: INSERT INTO TABLE1(ID, NAME, DATE) VALUES ( :1, :2, :3)

Even though a statement such as "INSERT INTO TABLE1(ID, NAME, DATE) VALUES ( :1, :2, :3)" implies one-row-at-a-time execution, some drivers (JDBC and others) implement bulk (array) processing for this type of insert, so a single execution can insert many rows. Please check the settings of your driver and, if possible, the actual code that issues the insert.
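For illustration only, here is a minimal JDBC sketch of how a client could produce this pattern (the connection string, credentials and table are hypothetical, and the DATE column from the anonymized statement is replaced with CREATED_AT since DATE is a reserved word): with statement batching, the driver can send all the bound rows in one round trip, so the database records a single execution that processed many rows.

import java.sql.Connection;
import java.sql.Date;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class BatchInsertSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details, for illustration only.
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//server123:1521/ORCL", "app_user", "app_password")) {
            con.setAutoCommit(false);
            String sql = "INSERT INTO TABLE1 (ID, NAME, CREATED_AT) VALUES (?, ?, ?)";
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                for (int i = 1; i <= 12; i++) {
                    ps.setInt(1, i);
                    ps.setString(2, "name_" + i);
                    ps.setDate(3, new Date(System.currentTimeMillis()));
                    ps.addBatch();   // queue the row on the client side
                }
                // One executeBatch() call: the driver can send all 12 bound rows in a
                // single round trip, so AWR can show one execution with 12 rows processed.
                ps.executeBatch();
            }
            con.commit();
        }
    }
}

The same effect can come from other array interfaces (OCI array binds, PL/SQL FORALL), so the client does not have to be a JDBC program for Rows per Exec to exceed one.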
Iordan Iotzov
http://iiotzov.wordpress.com/

Similar Messages

  • Executions/Rows Processed in Statspack

    I'm looking at an insert statement in the "SQL ordered by Executions" section of the Statspack report. I have Executions = 400,626 and Rows Processed = 400,464.
    Does this mean that the application actually issued 400,626 inserts but only 400,464 rows were really inserted?
    Thanks a lot, mj

    This means that some of the inserts failed for some reason - for example duplicate primary key values, check constraint violations, and so on. The problem I'm having is the missing rows in the log/audit table, which should never happen, and I need to discover exactly what is missing. The application obviously ignores the errors. How can I find which inserts are failing and why? Which trace level should I use?
    I also have another insert statement with more rows processed than executions. Does this mean that some executions inserted multiple rows at a time? Can we rely on these numbers 100%?
    Thanks a lot, mj

  • Automatic row processing insert fail

    Hi all,
    Application Express 3.2.1.00.11
    Oracle Database 10g Release 10.2.0.3.0 - 64bit Production
    I am running into a problem where my automatic row processing will update and delete records but will not insert. I get the following error.
    ORA-44003: invalid SQL name
    I disabled all triggers, added the appropriate fields to my form, and manually entered correct data, and I am still having the issue. I can run a trace, but only the DBA has access to retrieve the file and he is gone. Any suggestions on where to start? I checked debug mode and it is definitely hanging on the ARP.
    Cheers,
    Tyson Jouglet

    All,
    Finally fixed it!!! So if you experience this problem, double-check all of your item sources. I was upgrading a date column to a time zone column and subsequently made it three different items. Long story short, I left the original date item on the form and deleted the source column name, but I forgot to change the item source from Database Column to something else. This will not affect how APEX deletes or updates a record, but it will not allow you to insert a new record. I know this is a user error, but maybe this could be added as a validation?
    Cheers,
    Tyson Jouglet

  • Inserts using Automatic Row Processing (DML) should not clear cache

    Hi,
    I am using APEX 4.0.1.00.03.
    I created a form on a table for inserts and updates.
    After I do an insert, the Automatic Row Processing (DML) clears cache for all items on the page.
    But the user wants to see the data he inserted without having to query. How can I achieve this?
    Thanks
    Chandra

    There isn't any "reset page" process on my page.
    --Chandra.

  • Suppress number of rows processed message in msql

    Is it possible to suppress the feedback message that one gets after running a query?
    I've tried the following:
    SQL> set feedback off
    SQL> set heading off
    SQL> @x
    3
    1 row(s) returned
    But still get the message "1 row(s) returned"

    The Bytes value is AVG_ROW_LENGTH * Number of Rows Processed.
    For instance, if you have an EMP table with an AVG_ROW_LENGTH of 37 and the number of rows processed is 14, then Bytes will be 518.

  • ExecuteBatch(): number of successfully updated rows

    Hello everybody:
    Here is a simple but often repeated question in Java forums:
    Requirement:
    1. Read a flat file that has many rows of data.
    2. Parse the data and update the database accordingly.
    3. Find the number of successfully updated rows.
    Approach:
    After reading the file and parsing its data,
    - use PreparedStatement
    - use executeBatch()
    I have read that using executeBatch() is considered inadvisable because its implementation is
    inherently driver-specific. executeBatch() returns an array of update counts.
    Now, can anyone tell me the best way to determine the number of successfully
    (and unsuccessfully) updated rows from this count?
    Is there any other way to achieve the same result without using executeBatch()?
    Can anyone share a snippet of code that achieves this specific functionality?
    [The need is to log the number of unsuccessful attempts along with their
    corresponding rows of data.]
    Thanks & regards,
    Venkat Kosigi

    executeBatch submits a batch of commands to the database for execution and if all commands execute successfully, returns an array of update counts. The int elements of the array that is returned are ordered to correspond to the commands in the batch, which are ordered according to the order in which they were added to the batch. The elements in the array returned by the method executeBatch may be one of the following:
    -- A number greater than or equal to zero indicates that the command was processed successfully and is an update count giving the number of rows in the database that were affected by the command's execution
    -- A value of -2 indicates that the command was processed successfully but that the number of rows affected is unknown
    If one of the commands in a batch update fails to execute properly, this method throws a BatchUpdateException, and a JDBC driver may or may not continue to process the remaining commands in the batch. However, the driver's behavior must be consistent with a particular DBMS, either always continuing to process commands or never continuing to process commands.
    If the driver continues processing after a failure, the array returned by the method BatchUpdateException.getUpdateCounts will contain as many elements as there are commands in the batch, and at least one of the elements will be the following:
    -- A value of -3 indicates that the command failed to execute successfully and occurs only if a driver continues to process commands after a command fails.
    The return values were modified in the Java 2 SDK, Standard Edition, version 1.3 to accommodate the option of continuing to process commands in a batch update after a BatchUpdateException object has been thrown.
    executeBatch throws a BatchUpdateException (a subclass of SQLException) if one of the commands sent to the database fails to execute properly or attempts to return a result set. The BatchUpdateException.getUpdateCounts() method lets you identify the commands that failed, marked by a value of -3.
    -- So, if the call succeeds, look at the array returned by executeBatch: (#values >= 0) + (#values == -2) = successes. If it does not succeed, catch the BatchUpdateException, take the array returned by its getUpdateCounts() method, and look for the positions whose value is -3; you can take the data at those positions in the batch and log it (see the sketch after this reply).
    -- Another way to bulk-load data into the database is to use a bcp command (it's not Java; bcp is an independent command-line tool) that lets you do bulk inserts from a file and specify an error file; bcp will give you back a file containing the lines that were not inserted.
    I hope this helps. ;)
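    As a rough, hedged sketch of that counting logic (illustrative only - the table STAGING_ROWS, its columns, and the String[] row format are assumptions, not anything from the original post), something like this could report successes and log the rows that failed:

    import java.sql.BatchUpdateException;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.sql.Statement;
    import java.util.List;

    public class BatchResultLogger {
        // Inserts the parsed rows in one batch, returns how many succeeded,
        // and logs the data of any row whose command failed.
        static int insertWithLogging(Connection con, List<String[]> rows) throws SQLException {
            String sql = "INSERT INTO STAGING_ROWS (ID, PAYLOAD) VALUES (?, ?)"; // hypothetical table
            try (PreparedStatement ps = con.prepareStatement(sql)) {
                for (String[] row : rows) {
                    ps.setString(1, row[0]);
                    ps.setString(2, row[1]);
                    ps.addBatch();
                }
                try {
                    return countSuccesses(ps.executeBatch());
                } catch (BatchUpdateException e) {
                    int[] counts = e.getUpdateCounts();
                    for (int i = 0; i < counts.length; i++) {
                        // Statement.EXECUTE_FAILED (-3): this command failed - log its data.
                        if (counts[i] == Statement.EXECUTE_FAILED) {
                            System.err.println("Failed row: " + String.join(",", rows.get(i)));
                        }
                    }
                    // If the driver stops at the first failure, counts.length can be shorter
                    // than rows.size(); rows beyond that point were never attempted.
                    return countSuccesses(counts);
                }
            }
        }

        // A value >= 0 is an update count; Statement.SUCCESS_NO_INFO (-2) means the
        // command succeeded but the number of affected rows is unknown.
        private static int countSuccesses(int[] counts) {
            int ok = 0;
            for (int c : counts) {
                if (c >= 0 || c == Statement.SUCCESS_NO_INFO) {
                    ok++;
                }
            }
            return ok;
        }
    }

    Whether the failed rows appear in getUpdateCounts() at all depends on whether your driver continues processing after the first failure, as described above.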

  • Number of currently displayed rows in report with pagination?

    Maybe I'm overcomplicating this, but ...
    In my report I use the HTMLDB_ITEM.CHECKBOX function in order to display a checkbox for each row:
    HTMLDB_ITEM.CHECKBOX(1,"ID",NULL,:P311_ASSIGNED,':') " "
    Some of the checkboxes are selected depending on whether there is an entry for them in a database table.
    As the report returns hundreds of rows, I use pagination (e.g. 15 rows per page).
    After checking/unchecking the checkboxes and submitting the page by clicking the save button, I want to do the following for each row CURRENTLY DISPLAYED on the screen: (1) check whether its checkbox is currently checked or not; (2a) if it is checked and there is not yet an entry, add a row to the DB table; (2b) if it is unchecked and there is an entry in the DB table, remove the row from the DB table.
    How can I, on the server side, find out which rows (row IDs) are currently displayed (using pagination), so that I can check each of the currently displayed row IDs for existence in the database table? Is there an item stored on the server side that tells me the row number of the first row currently displayed? Is there an item I can refer to that tells me the number of rows displayed on one page?
    Your help would be very much appreciated!
    Konrad

    Hi Konrad,
    I will jump in. :-) Just have a few minutes before a meeting.
    I don't think you have to make it that complicated, where you would need to know the current pagination.
    1) Your HTMLDB_ITEM.CHECKBOX call has to use #ROWNUM# as its checked value, just as the row selector would.
    2) You have to create a hidden item where you store your customer id
    3) create another hidden item where you store if the customer has already the assignment
    4) In your process you are now able to loop through your arrays.
    DECLARE
        vRowNumber BINARY_INTEGER;
        vFound BOOLEAN;
    BEGIN
        -- insert new event assignments
        FOR ii IN 1 .. WWV_Flow.g_f01.COUNT -- your checkbox
        LOOP
            vRowNumber := WWV_Flow.g_f01(ii);
            -- no assignment yet?
            IF WWV_Flow.g_f03(vRowNumber) IS NULL -- your hidden field where you store if you have an assmnt
            THEN
                INSERT INTO xxx VALUES (WWV_Flow.g_f02(vRowNumber)); -- your customer id
            END IF;
        END LOOP;
        -- delete old event assignments
        FOR ii IN 1 .. WWV_Flow.g_f03.COUNT -- your hidden field where you store if you have an assmnt
        LOOP
            -- only if the event was already assigned
            IF WWV_Flow.g_f03(ii) IS NOT NULL
            THEN
                vFound := FALSE;
                FOR jj IN 1 .. WWV_Flow.g_f01.COUNT -- your checkbox
                LOOP
                    -- is the event still checked?
                    IF WWV_Flow.g_f01(jj) = ii
                    THEN
                        vFound := TRUE;
                        EXIT;
                    END IF;
                END LOOP;
                IF NOT vFound
                THEN
                    DELETE xxx WHERE CUSTOMER_ID = WWV_Flow.g_f02(ii);
                END IF;
            END IF;
        END LOOP;
    END;
    Haven't tested the code, but I think it should show the idea.
    Hope that helps
    Patrick
    Check out my APEX-blog: http://inside-apex.blogspot.com

  • Processing Static (via automatic row processing) & Dynamic fields

    Hi,
    I have a page that has 2 sections. Section S is statically driven, and I'd like to process it via an Automatic Row Processing (DML) process. Section D is for dynamic fields, which I process via a PL/SQL script.
    I need to process Section D (dynamic) first.
    Now there are 2 things that I'm noticing when I try this. Can someone please confirm?
    - After my Section D process runs, it seems to issue a commit. I know this because I have an error in Section S, yet the values from Section D are committed to the DB. I need to make sure a commit only occurs after all page processes have completed error free.
    - My Automatic Row Processing (DML) process for Section S doesn't seem to work at all. It can't seem to read the values. I know this because I have several columns that are "NOT NULL" and the appropriate error messages are being raised. The Automatic Row Fetch for Section S does work properly.
    For the time being the workaround is writing a process for the entire page that handles both Section S and Section D. The thing is, I thought HTMLDB would be able to help me out a lot with Section S since it has static fields etc.

    Martin - I would try to debug these two processes separately. If the Auto DML process isn't firing, perhaps the button used to submit the page isn't setting the request to one of the standard values recognized by the Auto DML package ('INSERT','CREATE','CREATE_AGAIN','CREATEAGAIN' for inserts and 'SAVE','APPLY CHANGES','UPDATE','UPDATE ROW','CHANGE','APPLY' or like 'APPLY%CHANGES%' for update).
    A commit happens whenever session state is changed, so if your process saves an item value, that would do it. If you think that is not the cause of the commit, let me know the details of the process and I'll take a closer look. There is no way to prevent the commit when session state is updated.
    Scott

  • Without Automatic Row Processing (DML)

    Hi All
    I'm trying to build one page that uses DML statements without the standard Automatic Row Processing.
    What I have:
    One page with:
    region: Report - select empno, ename, sal, deptno from emp;
    region: Form with items: p6_empn, p6_ename, p6_sal, p6_deptno
    Additionally, at the moment I have:
    process CreateProcess:
    begin
    insert into emp (empno, ename, sal, deptno)
    values (emp_seq.nextval, :P6_ENAME, :P6_SAL, :P6_DEPTNO);
    commit;
    end;
    and button CREATE.
    CreateProcess is fired when I push the CREATE button, and the insert statement works correctly. The branch is to the same page (I use a request).
    Now I would like to show update and delete statements.
    In the report, the EMPNO column has the standard Edit button in each row
    (<img src="#IMAGE_PREFIX#edit.gif" alt="Edit">)
    I have process SelectProcess:
    begin
    select ename,sal,deptno
    into :p6_ename, :p6_sal, :p6_deptno
    from emp
    where empno = :p6_empno;
    end;
    It is fired On Load - Before Header.
    When I push Edit on a row, I would like to see the employee's details in the form, but I see nothing. In session state everything is correct - I see :P6_... with the name, sal and so on.
    I don't know how to connect this process with the button and branch.
    Could you help me?

    Hi,
    I have done a quick test for this - [http://htmldb.oracle.com/pls/otn/f?p=25946:1]
    On the Edit page, I have a process called P2_LOAD_DATA that runs "On Load - Before Header", is conditional on P2_EMPNO being NOT NULL, and uses the following code:
    BEGIN
    SELECT EMPNO, ENAME, JOB
    INTO :P2_EMPNO, :P2_ENAME, :P2_JOB
    FROM EMP
    WHERE EMPNO = :P2_EMPNO;
    END;
    I have a button with a Button Name setting of "UPDATE" (the user sees this as Apply Changes).
    I then have a process that runs "On Submit - After Computations and Validations" and is conditional on REQUEST = UPDATE. The code is:
    BEGIN
    UPDATE EMP
    SET ENAME = :P2_ENAME,
    JOB = :P2_JOB
    WHERE EMPNO = :P2_EMPNO;
    END;
    The source settings on P2_EMPNO, P2_ENAME and P2_JOB are as I described in my previous post.
    One thing to check - do you have a "reset page" process? If so, remove it.
    Andy

  • Understanding "automatic row processing"

    Hi,
    I have problems understanding automatic row processing in APEX.
    At first I thought there was a relation between a page process (fetch row, insert, update, ...) and page items / reports.
    Something like: item P1_TESTTEXTBOX (Source = Database Column => "TESTCOLUMN") refers to process "FETCH ROW FROM TESTTABLE".
    But I haven't found any such relation.
    Then I thought there was a "global pool of database columns" (fetched, inserted, updated by page processes) - when a fetch process loads data from a DB table, all columns from this table are stored in a "dictionary" - but this also doesn't seem to be right.
    I did some testing to understand the (row fetch) behavior, but there are still some questions:
    - I created 2 processes (Data Manipulation -> Row Fetch) in "After Header" on 2 different database tables, each with a column "TESTCOLUMN" (for testing). It didn't work properly: it seems that only one fetch process is executed.
    I'm still not quite sure how automatic row processing (fetch, insert, update) works...
    Is there any explanation of this behavior? I haven't found any information that goes into detail.

    OK, some testing later, I have come to the conclusion that automatic row processing works like this:
    When a row (fetch) process starts, ALL bound page items (Source Type = Database Column) try to load their data from the current fetch process.
    I tried to add 2 different regions containing some page items (text boxes) that bind to a database column:
    * Region1 should contain article information (refers to DB table Article)
    * Region2 should contain article detail information (refers to DB table ArticleDetails)
    I am not able to create 2 row fetch processes on 1 page, because I get an error when the first row fetch process starts: "Column [XY] not found in table [YZ]".
    Obviously there is no way to "link" processes to regions, reports or page items.
    Or is it possible to create 2 regions (on 1 page) with automatic row processing that refer to different tables (region1/report1 => DBTable1, region2/report2 => DBTable2)?
    (The only way I see is to create a view that contains these 2 DB tables, so I only need one row fetch process for this page.)
    thanks in advance
    rene...

  • Automatic Row Processing?

    Hi,
    I have a requirement to create forms that insert into multiple tables and return items from multiple tables. What are my options? Can I still use automatic row processing and fetching?
    Thanks in advance.

    Create multiple regions, each based on a different table.

  • OWB11gR2: Mapping execution in a process flow not visible in OWB Browser

    When a mapping is executed inside a process flow, its execution details are not visible in the OWB Repository Browser (Control Center reports) - rows processed, errors, etc. The mapping row is missing from the log, as if it never happened (but it did).
    This auditing information is very important for monitoring reasons (to our customers as well), and I just don't get how this functionality was lost in this version. Another serious bug?

    Hi David,
    I was rather tired and frustrated last evening, so today I noticed some things I didn't yesterday. Your reply gave me a new motivation.
    The conclusion is: a mapping execution in a process flow is logged, but the way activities are displayed in the OWB Browser is now different from previous versions. If I click on 'Execution Job Report' for a process flow, I see all the activities listed except mappings (transformations, assign, file exists, subprocess, etc.). If I want to see a mapping execution row, I must click on a plus (expand) sign.
    This kind of behavior will make processes with a complex hierarchy (we usually have more than 5 levels of subprocesses) rather unwieldy to monitor. In 10gR2, drilling down was accomplished by opening a new browser tab (Execution Job Report link) for each subprocess/mapping activity. Now it all remains on one huge screen (list) that keeps expanding.
    But if that is the new behavior, we shall live with it. If our customers don't like it, they will have to get used to it.
    Thank you for your reply!

  • Automatic Row Processing - Preset a value with a value from another page

    Hi everybody,
    I have created a form on a table with report. I added a dropdown field to a sidebar region on the first page that groups and selects only a couple of these entries (think: only people for the department selected in the dropdown field). Now, when I click the normal Create button, the ID field for the department should be filled automatically with the value from the previous page, where something was already selected in the dropdown field. The ID field is a hidden field on the actual create page. Maybe it is not working because the field is linked to the database column to make the automatic row processing work, or something like that. I tried computations on both pages and setting the values. Nothing seems to do the trick.
    Any help would be greatly appreciated.
    Thanks,
    Henrik

    Hi,
    these are the values I currently see:
    108 14 P14_SELECTEDSS Display as Text (escape special characters, does not save state) No
    108 14 P14_CHARACTERISTIC_ID Hidden and Protected Reset to Null No
    108 14 P14_SHAREDSERVICE_ID Text Field 30 Inserted No
    108 14 P14_CAPTION Text Field No
    108 14 P14_CATEGORY_ID Select List No
    108 14 P14_WEIGHING Text Field No
    108 14 P14_DESCRIPTION Textarea No
    I also have to say that I deleted the computations I created earlier because they did not work and I got the same values for these items. I am trying to (pre)set P14_SHAREDSERVICE_ID to P13_SELSS when I click the Create button on page 13.
    Thanks again,
    Henrik

  • Automatic Row Processing (DML) - Return Key Into Item

    Hello,
    This question is on Apex 4.2:
    I'm displaying the [UNIKEY] column value after the record is inserted into the table using:
    page process > process row table_name > Source: Automatic Row Processing (DML) > Return Key Into Item > "Item Name"
    This works fine when inserting records; my question is why it does not respond on update/delete?
    ϯ LT

    LT
    Check your branches.
    What might be happening is that different branches are followed on insert and on update.
    The branch for the insert either isn't clearing the cache of the page or is setting the item to its own value.
    The branch for the update, on the other hand, is clearing the cache of the page and not setting the item.
    With a delete there is a clear-cache process generated by the wizard. Check whether this process runs only on the delete and not on the update.
    From memory, the process is called something with "reset" in its name.
    If the above doesn't help, try to replicate the problem on apex.oracle.com and give access with a guest developer account so we can have a look.
    Nicolette

  • Automatic row processing query should not update key columns

    We have an APEX application to maintain a table A, which is referenced by a huge table B (several million rows). If a row is updated (by automatic row processing), the following update statement is executed.
    update "MM_DWH"."A" set "ID" = :DML_BV0001,"NAME" = ... where "ID" = :p_rowid
    Due to the foreign key constraint on table B, we have situations where this update statement is blocked (enq: TM - contention) for a long time when there are inserts into table B at the same time - I guess.
    My questions are:
    Why is column ID updated? This does not make sense.
    Is there a way to prevent that behaviour, or do I have to code the update statements manually instead of using the automatic row processing feature?
    Thanks and regards, Maren

    m_eschermann wrote:
    Why is column ID updated, this does not make sense?
    Is there a way to prevent that behaviour or do I have to code the update statements manually instead of using the automatic row processing feature?
    Make sure that you have all the required primary key and foreign key indexes on the two tables - if any are missing, it's possible that each update of A requires a full table scan of B.
    What is column "ID" - the PK of A or the FK to B?
