JOB and INSERT

Hi all again.
I'm practicing jobs, and I have a program that inserts 3 records into a Z table. What I do is:
1. I loop over my internal table and, record by record, I use the INSERT statement. If sy-subrc EQ 0, I COMMIT WORK and WRITE: / 'Record added'; otherwise I ROLLBACK WORK and WRITE: / 'Record not added'.
2. When I execute the job via the SUBMIT statement and then go to transaction SM37, I can't see the spool list. This happens when I try to add records that were added previously; if the records are new, I can see the spool list and have no problems.
Why is this happening? I assume the insert is not possible when the record already exists in the Z table, but I have the ROLLBACK and the WRITE, so why can't I see the output?
Can someone explain this to me and let me know how I can solve it?
Thanks
Gaby
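For reference, a minimal sketch of the record-by-record variant described in step 1 (ztabla_vuelos and ti_vuelos are taken from the later posts; the work area name is an assumption):
DATA wa_vuelo TYPE ztabla_vuelos.          " assumed work area name
LOOP AT ti_vuelos INTO wa_vuelo.
  " A single-record INSERT does not dump on a duplicate key; it just sets sy-subrc = 4.
  INSERT ztabla_vuelos FROM wa_vuelo.
  IF sy-subrc EQ 0.
    COMMIT WORK.
    WRITE: / 'Record added'.
  ELSE.
    ROLLBACK WORK.
    WRITE: / 'Record not added'.
  ENDIF.
ENDLOOP.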

Hi Vikranth,
I tried deleting the ROLLBACK line. When I use the INSERT record by record in a loop it works fine, but when I use this line:
INSERT ztabla_vuelos FROM TABLE ti_vuelos.
and there is a duplicate record, I get a dump. I debugged the code, and when I step onto the INSERT line and press F5, I see the runtime error, which tells me I'm trying to add a duplicate record (which is true), but I don't know how to avoid that. I would like to write something like this:
INSERT ztabla_vuelos FROM TABLE ti_vuelos.
IF sy-subrc EQ 0.
  COMMIT WORK.
  WRITE :/ 'OK '.
ELSE.
  WRITE: / 'failed'.
ENDIF.
But I can't get it to work. How can I solve that?
Thanks
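A minimal sketch of one way to get that behaviour with the array INSERT, assuming the standard ACCEPTING DUPLICATE KEYS addition is acceptable here: duplicate rows are skipped instead of raising the short dump, and sy-subrc comes back as 4.
INSERT ztabla_vuelos FROM TABLE ti_vuelos ACCEPTING DUPLICATE KEYS.
IF sy-subrc EQ 0.
  COMMIT WORK.
  WRITE: / 'OK'.
ELSE.
  ROLLBACK WORK.
  " At least one record already existed and was skipped.
  WRITE: / 'failed'.
ENDIF.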

Similar Messages

  • Read a csv file, fill a dynamic table and insert into a standard table

    Hi everybody,
    I have a problem here and I need your help:
    I have to read a csv file and insert the data of it into a standard table.
    1 - On the parameter screen I have to indicate the standard table and the csv file.
    2 - I need to create a dynamic table, of the same type as the one I choose on the parameter screen.
    3 - Then I need to read the csv and put the data into this dynamic table.
    4 - Later I need to insert the data from the dynamic table into the standard table (the one on the parameter screen).
    How do I do this job? Do you have an example? Thanks.

    Here is an example program which shows how to upload a csv file from the frontend to a dynamic internal table.  You can of course modify this to update your database table.
    report zrich_0002.
    type-pools: slis.
    field-symbols: <dyn_table> type standard table,
                   <dyn_wa>,
                   <dyn_field>.
    data: it_fldcat type lvc_t_fcat,
          wa_it_fldcat type lvc_s_fcat.
    type-pools : abap.
    data: new_table type ref to data,
          new_line  type ref to data.
    data: xcel type table of alsmex_tabline with header line.
    selection-screen begin of block b1 with frame title text .
    parameters: p_file type  rlgrap-filename default 'c:\Test.csv'.
    parameters: p_flds type i.
    selection-screen end of block b1.
    start-of-selection.
    * Add X number of fields to the dynamic itab catalog
      do p_flds times.
        clear wa_it_fldcat.
        wa_it_fldcat-fieldname = sy-index.
        wa_it_fldcat-datatype = 'C'.
        wa_it_fldcat-inttype = 'C'.
        wa_it_fldcat-intlen = 10.
        append wa_it_fldcat to it_fldcat .
      enddo.
    * Create dynamic internal table and assign to FS
      call method cl_alv_table_create=>create_dynamic_table
                   exporting
                      it_fieldcatalog = it_fldcat
                   importing
                      ep_table        = new_table.
      assign new_table->* to <dyn_table>.
    * Create dynamic work area and assign to FS
      create data new_line like line of <dyn_table>.
      assign new_line->* to <dyn_wa>.
    * Upload the excel
      call function 'ALSM_EXCEL_TO_INTERNAL_TABLE'
           exporting
                filename                = p_file
                i_begin_col             = '1'
                i_begin_row             = '1'
                i_end_col               = '200'
                i_end_row               = '5000'
           tables
                intern                  = xcel
           exceptions
                inconsistent_parameters = 1
                upload_ole              = 2
                others                  = 3.
    * Reformat to dynamic internal table
      loop at xcel.
        assign component xcel-col of structure <dyn_wa> to <dyn_field>.
        if sy-subrc = 0.
         <dyn_field> = xcel-value.
        endif.
        at end of row.
          append <dyn_wa> to <dyn_table>.
          clear <dyn_wa>.
        endat.
      endloop.
    * Write out data from table.
      loop at <dyn_table> into <dyn_wa>.
        do.
          assign component  sy-index  of structure <dyn_wa> to <dyn_field>.
          if sy-subrc <> 0.
            exit.
          endif.
          if sy-index = 1.
            write:/ <dyn_field>.
          else.
            write: <dyn_field>.
          endif.
        enddo.
      endloop.
    Regards,
    Rich Heilman

  • SQL Loader and Insert Into Performance Difference

    Hello All,
    I'm in a situation where I need to measure the performance difference between SQL*Loader and INSERT INTO. Say there are 10000 records in a flat file and I want to load them into a staging table.
    I know that if I use PL/SQL UTL_FILE to do this job, performance will degrade (don't ask me why I'm going for UTL_FILE instead of SQL*Loader), but I don't know by how much. Can anybody tell me the performance difference in % (e.g. a 20% decrease) for 10000 records?
    Thanks,
    Kannan.

    Kannan B wrote:
    Do not confuse the topic, as I said I'm not going to use external tables. This post is to discuss the performance difference between SQL*Loader and a simple INSERT statement.
    I don't think people are confusing the topic.
    External tables are a superior means of reading a file as it doesn't require any command line calls or external control files to be set up. All that is needed is a single external table definition created in a similar way to creating any other table (just with the additional external table information obviously). It also eliminates the need to have a 'staging' table on the database to load the data into as the data can just be queried as needed directly from the file, and if the file changes, so does the data seen through the external table automatically without the need to re-run any SQL*Loader process again.
    Who told you not to use External Tables? Do they know what they are talking about? Can they give a valid reason why external tables are not to be used?
    IMO, if you're considering SQL*Loader, you should be considering External tables as a better alternative.
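    To make the external-table suggestion concrete, here is a minimal sketch (the directory object, file name, column list, and the staging_table name are assumptions; adjust them to the actual staging layout):
    -- Assumes a directory object DATA_DIR pointing at the folder holding the flat file,
    -- and a comma-separated file named staging.dat; the two columns are placeholders.
    CREATE TABLE staging_ext (
      id          NUMBER,
      description VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        FIELDS TERMINATED BY ','
        MISSING FIELD VALUES ARE NULL
      )
      LOCATION ('staging.dat')
    )
    REJECT LIMIT UNLIMITED;
    -- The file can then be queried directly, or loaded in one statement with no control file:
    INSERT INTO staging_table SELECT * FROM staging_ext;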

  • Splitting a list (loop and insert)

    I have a text file with about 65000 records to be looped over and inserted into a database. I am using cfhttp to read the list and turn it into a query. I am then using that query to loop over and insert the records into a database.
    However, it's too large and the server is timing out. Is there a way to break the list into 2 pieces and do it in serial?
    I am using:
    <cffunction name="getResidential" access="private" returntype="void" output="false" hint="">
    <cfargument name="ResFile" type="string" required="yes">
    <cfhttp timeout="6600" url="http://www.mywebsite.com/assets/property/#ResFile#" method="GET" name="Property" delimiter="|" textqualifier="" firstrowasheaders="yes" />
    <cfloop query="Property">
    <cfquery name="loopProperty" datasource="bpopros">
    INSERT STATEMENT HERE
    </cfquery>
    </cfloop>
    </cffunction>
    Is it possible to do something like:
    Function 1
    <cfloop from="1" to="40000" index="i">
    </cfloop>
    Function 2
    <cfloop from="40001" to="#Query.RecordCount#" index="i">
    </cfloop>
    Any ideas? Thanks
    Wally Kolcz
    MyNextPet.org
    Founder / Developer
    586.871.4126

    I like your second solution, but wouldn't I need access to the actual web server? I need to do this on hosting. I am downloading the files from Real Comp and then looping the data into my client's database for searching.
    I wanted to just use the files as a flat database and query off of them, but there is a new file every day. Only one update (Sunday) is the master list of 45-60k records. All the rest (Mon-Sat) are just small updates.
    "Dan Bracuk" <[email protected]> wrote in
    message
    news:fdrgde$6qu$[email protected]..
    > It's possible, but it's a better idea to look for ways
    to do it without
    > cold
    > fusion.
    >
    > Food for thought, I have a requirement to move data from
    one db to
    > another. I
    > use Cold Fusion to query the first db, output to text,
    and ftp the text
    > files
    > to another server. I also have a scheduled job that
    looks for these text
    > files
    > and loads them into db2.
    >

  • Rows updated and inserted by  MERGE

    Hi,
    Is there any way I can find out how many rows were updated and how many were inserted by a MERGE statement?
    Thanks in advance

    Something like this ->
    satyaki>
    satyaki>select * from v$version;
    BANNER
    Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Prod
    PL/SQL Release 10.2.0.3.0 - Production
    CORE    10.2.0.3.0      Production
    TNS for 32-bit Windows: Version 10.2.0.3.0 - Production
    NLSRTL Version 10.2.0.3.0 - Production
    Elapsed: 00:00:00.01
    satyaki>
    satyaki>
    satyaki>
    satyaki>desc aud_emp;
    Name                                      Null?    Type
    EMPNO                                              NUMBER(4)
    ENAME                                              VARCHAR2(10)
    JOB                                                VARCHAR2(9)
    MGR                                                NUMBER(4)
    HIREDATE                                           DATE
    SAL                                                NUMBER(7,2)
    COMM                                               NUMBER(7,2)
    DEPTNO                                             NUMBER(2)
    AUDIT_TMP                                          TIMESTAMP(6)
    AUDIT_IP_ADDR                                      VARCHAR2(15)
    OPR_DESC                                           VARCHAR2(30)
    USER_NM                                            VARCHAR2(40)
    REMARKS                                            VARCHAR2(200)
    satyaki>
    satyaki>
    satyaki>
    satyaki>
    satyaki>select * from aud_emp;
    no rows selected
    Elapsed: 00:00:00.04
    satyaki>
    satyaki>
    satyaki>select * from emp;
         EMPNO ENAME      JOB              MGR HIREDATE         SAL       COMM     DEPTNO
          9999 SATYAKI    SLS             7698 02-NOV-08      55000       3455         10
          7777 SOURAV     SLS                  14-SEP-08      45000       3400         10
          7521 WARD       SALESMAN        7698 22-FEB-81       1250        500         30
          7566 JONES      MANAGER         7839 02-APR-81       2975                    20
          7654 MARTIN     SALESMAN        7698 28-SEP-81       1250       1400         30
          7698 BLAKE      MANAGER         7839 01-MAY-81       2850                    30
          7782 CLARK      MANAGER         7839 09-JUN-81       4450                    10
          7788 SCOTT      ANALYST         7566 19-APR-87       3000                    20
          7839 KING       PRESIDENT            17-NOV-81       7000                    10
          7844 TURNER     SALESMAN        7698 08-SEP-81       1500          0         30
          7876 ADAMS      CLERK           7788 23-MAY-87       1100                    20
         EMPNO ENAME      JOB              MGR HIREDATE         SAL       COMM     DEPTNO
          7900 JAMES      CLERK           7698 03-DEC-81        950                    30
          7902 FORD       ANALYST         7566 03-DEC-81       3000                    20
    13 rows selected.
    Elapsed: 00:00:00.01
    satyaki>
    satyaki>select * from e_emp;
         EMPNO ENAME      JOB              MGR HIREDATE         SAL       COMM     DEPTNO
          7566 JONES      MANAGER         7839 02-APR-81    3718.75                    20
          7782 CLARK      MANAGER         7839 09-JUN-81     3062.5                    10
          7788 SCOTT      ANALYST         7566 19-APR-87       3000                    20
          7839 KING       PRESIDENT            17-NOV-81       5000                    10
          7876 ADAMS      CLERK           7788 23-MAY-87       1100                    20
          7902 FORD       ANALYST         7566 03-DEC-81       3000                    20
          7934 MILLER     CLERK           7782 23-JAN-82      11700                    10
    7 rows selected.
    Elapsed: 00:00:00.07
    satyaki>CREATE OR REPLACE TRIGGER trg_aud
    before insert or update or delete on e_emp
    for each row
    declare
         S_IP_ADDR    varchar2(30);
         str          varchar2(320);
         s_empno      varchar2(40);
         s_ename      varchar2(40);
         s_job        varchar2(40);
         s_mgr        varchar2(40);
         s_hrdate     varchar2(40);
         s_sal        varchar2(40);
         s_comm       varchar2(40);
         s_deptno     varchar2(40);
    begin
         SELECT SYS_CONTEXT('USERENV','IP_ADDRESS')
         into S_IP_ADDR
         from dual;
         str:= null;
         if inserting then
             insert into aud_emp values(:new.empno,
                                        :new.ename,
                                        :new.job,
                                        :new.mgr,
                                        :new.hiredate,
                                        :new.sal,
                                        :new.comm,
                                        :new.deptno,
                                        systimestamp,
                                        S_IP_ADDR,
                                        'INSERT',
                                        USER,
                                        NULL);
         elsif updating then
           if :old.empno <> :new.empno then
              s_empno := 'Employee No: '||:old.empno;
           elsif :old.ename <> :new.ename then
              s_ename := 'Employee Name: '||:old.ename;
           elsif :old.job <> :new.job then
              s_job := 'Job: '||:old.job;
           elsif :old.mgr <> :new.mgr then
              s_mgr := 'Mgr: '||:old.mgr;
           elsif :old.hiredate <> :new.hiredate then
              s_hrdate := 'Hire Date: '||:old.hiredate;
           elsif :old.sal <> :new.sal then
              s_sal := 'Salary: '||:old.sal;
           elsif :old.comm <> :new.comm then
              s_comm := 'Commission: '||:old.comm;
           elsif :old.deptno <> :new.deptno then
              s_deptno := 'Department No: '||:old.deptno;
           end if;
           str:= 'Updated Records Details -> '||s_empno||' '||s_ename||' '||s_job||' '||s_mgr||' '||s_hrdate||' '||s_sal||' '||s_comm||' '||s_deptno;
             insert into aud_emp values(:new.empno,
                                        :new.ename,
                                        :new.job,
                                        :new.mgr,
                                        :new.hiredate,
                                        :new.sal,
                                        :new.comm,
                                        :new.deptno,
                                        systimestamp,
                                        S_IP_ADDR,
                                        'UPDATE',
                                        USER,
                                        str);
         elsif deleting then
             insert into aud_emp values(:old.empno,
                                        :old.ename,
                                        :old.job,
                                        :old.mgr,
                                        :old.hiredate,
                                        :old.sal,
                                        :old.comm,
                                        :old.deptno,
                                        systimestamp,
                                        S_IP_ADDR,
                                        'DELETE',
                                        USER,
                                        'Old Records before deletion');
         end if;
    exception
        when others then
          raise_application_error(-20501,'Contact With Your Admin....');
    end;
    Trigger Created.
    Elapsed: 00:00:00.09
    satyaki>ed
    Wrote file afiedt.buf
      1  merge into e_emp o
      2     using emp n
      3     on ( o.empno = n.empno)
      4     when matched then
      5       update set o.ename = n.ename,
      6                  o.job   = n.job,
      7                  o.mgr   = n.mgr,
      8                  o.hiredate = n.hiredate,
      9                  o.sal = n.sal,
    10                  o.comm = n.comm,
    11                  o.deptno = n.deptno
    12     when not matched then
    13       insert( o.empno,
    14               o.ename,
    15               o.job,
    16               o.mgr,
    17               o.hiredate,
    18               o.sal,
    19               o.comm,
    20               o.deptno)
    21       values( n.empno,
    22               n.ename,
    23               n.job,
    24               n.mgr,
    25               n.hiredate,
    26               n.sal,
    27               n.comm,
    28*              n.deptno)
    satyaki>/
    13 rows merged.
    Elapsed: 00:00:03.95
    satyaki>
    satyaki>commit;
    Commit complete.
    Elapsed: 00:00:00.07
    satyaki>
    satyaki>
    satyaki>select * from aud_emp;
         EMPNO ENAME      JOB              MGR HIREDATE         SAL       COMM     DEPTNO AUDIT_TMP                                                                   AUDIT_IP_ADDR   OPR_DESC                       USER_NM                                  REMARKS
          7566 JONES      MANAGER         7839 02-APR-81       2975                    20 31-DEC-08 10.54.06.667000 PM                                                10.23.99.77     UPDATE                         SCOTT                                    Updated Records Details ->      Salary: 3718.75
          7782 CLARK      MANAGER         7839 09-JUN-81       4450                    10 31-DEC-08 10.54.06.686000 PM                                                10.23.99.77     UPDATE                         SCOTT                                    Updated Records Details ->      Salary: 3062.5
          7788 SCOTT      ANALYST         7566 19-APR-87       3000                    20 31-DEC-08 10.54.06.687000 PM                                                10.23.99.77     UPDATE                         SCOTT                                    Updated Records Details ->
          7839 KING       PRESIDENT            17-NOV-81       7000                    10 31-DEC-08 10.54.06.697000 PM                                                10.23.99.77     UPDATE                         SCOTT                                    Updated Records Details ->      Salary: 5000
          7876 ADAMS      CLERK           7788 23-MAY-87       1100                    20 31-DEC-08 10.54.06.698000 PM                                                10.23.99.77     UPDATE                         SCOTT                                    Updated Records Details ->
          7902 FORD       ANALYST         7566 03-DEC-81       3000                    20 31-DEC-08 10.54.06.699000 PM                                                10.23.99.77     UPDATE                         SCOTT                                    Updated Records Details ->
          7844 TURNER     SALESMAN        7698 08-SEP-81       1500          0         30 31-DEC-08 10.54.06.720000 PM                                                10.23.99.77     INSERT                         SCOTT
          7777 SOURAV     SLS                  14-SEP-08      45000       3400         10 31-DEC-08 10.54.07.059000 PM                                                10.23.99.77     INSERT                         SCOTT
          7521 WARD       SALESMAN        7698 22-FEB-81       1250        500         30 31-DEC-08 10.54.07.060000 PM                                                10.23.99.77     INSERT                         SCOTT
          7654 MARTIN     SALESMAN        7698 28-SEP-81       1250       1400         30 31-DEC-08 10.54.07.060000 PM                                                10.23.99.77     INSERT                         SCOTT
          7698 BLAKE      MANAGER         7839 01-MAY-81       2850                    30 31-DEC-08 10.54.07.061000 PM                                                10.23.99.77     INSERT                         SCOTT
         EMPNO ENAME      JOB              MGR HIREDATE         SAL       COMM     DEPTNO AUDIT_TMP                                                                   AUDIT_IP_ADDR   OPR_DESC                       USER_NM                                  REMARKS
          9999 SATYAKI    SLS             7698 02-NOV-08      55000       3455         10 31-DEC-08 10.54.07.061000 PM                                                10.23.99.77     INSERT                         SCOTT
          7900 JAMES      CLERK           7698 03-DEC-81        950                    30 31-DEC-08 10.54.07.062000 PM                                                10.23.99.77     INSERT                         SCOTT
    13 rows selected.
    Elapsed: 00:00:00.22
    satyaki>
    satyaki>set lin 80
    Hope this will help you.
    Regards.
    Satyaki De.
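    One note worth adding: SQL%ROWCOUNT after a MERGE only reports the combined number of rows merged, not the update/insert split, which is why an audit trigger like the one above is needed to separate them. A minimal sketch against the same tables (column list shortened for brevity):
    SET SERVEROUTPUT ON
    BEGIN
      MERGE INTO e_emp o
      USING emp n
      ON (o.empno = n.empno)
      WHEN MATCHED THEN
        UPDATE SET o.ename = n.ename, o.sal = n.sal
      WHEN NOT MATCHED THEN
        INSERT (o.empno, o.ename, o.sal)
        VALUES (n.empno, n.ename, n.sal);
      -- Updates and inserts are counted together; there is no separate breakdown here.
      DBMS_OUTPUT.PUT_LINE('Rows merged: ' || SQL%ROWCOUNT);
    END;
    /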

  • DDL and INSERT scripts of EMPLOYEES, DEPARTMENTS, LOCATIONS, etc.

    Hi all,
    (Where) can I find the DDL and INSERT scripts for the tables
    EMPLOYEES, DEPARTMENTS, LOCATIONS, JOB_HISTORY, JOBS, REGIONS,
    and
    EMP and DEPT tables?
    I could find only descriptions of the tables in http://download.oracle.com/docs/cd/B28359_01/server.111/b28328/scripts.htm#insertedID3, not the complete DDL and INSERT scripts. Please help.

    Hi,
    The scripts (and other files) are on the companion CD. You can download it from www.oracle.com .
    Bartek

  • Data pump for HR Jobs and HR positions for v11i

    I am trying to use Data Pump (v11i/11.5.2) to load the data for HR Jobs and HR Positions. I am loading the HR_PUMP_BATCH_HEADER and HR_PUMP_BATCH_LINES tables with records having the required valid values, but I am getting an error. I have successfully loaded the employee data using the same method before.
    Has anybody done this? Please help.

    I got the text below from Metalink.
    Please see whether it is helpful for you, since I have little knowledge of HRMS.
    The HR_PUMP_BATCH_LINE_USER_KEYS table must be seeded with a value in order for the package that follows to work. In some cases, the user must provide this value.
    I ran into this problem when trying to run the create_job_requirement API through the Data Pump. Within my PL/SQL block, this is how I passed the value to my variable:
    pv_id_flex_num_user_key := sel.sun_job_id||sel.name||sel.job_requirement_id||sel.id_flex_num||':ID FLEX NUM USER KEY';
    I then performed the following after calling the insert_batch_line procedure and passing all the parameter values:
    SELECT hpbl.batch_line_id,
    coj_hr_ci.devl_seq_s.nextval,
    coj_hr_ci.devl_seq_s.nextval
    INTO v_batch_line_id,
    v_sequence1,
    v_sequence2
    FROM hr_pump_batch_lines hpbl
    WHERE hpbl.pval058 = pv_id_flex_num_user_key;
    INSERT INTO hr_pump_batch_line_user_keys(batch_line_id,
    unique_key_id,
    user_key_id,
    user_key_value)
    VALUES(v_batch_line_id,
    50141,
    v_sequence2,
    pv_id_flex_num_user_key);

  • How can we do policy updations and insertion using procedures in insurence?

    How can we do policy updates and insertions using procedures in insurance?
    Please answer this post; I was asked this in an interview. How can we write the code for a procedure that handles policy details in PL/SQL?

    997995 wrote:
    How can we do policy updates and insertions using procedures in insurance? Please answer this post; I was asked this in an interview. How can we write the code for a procedure that handles policy details in PL/SQL?
    You are asking about a specific business area (insurance) and the nuances of that business (insurance policies) while providing no technical details about what is required, on a technical forum. There are many varied businesses in the world, and not everyone here is going to be familiar with the area of business you are referring to.
    As a technical forum, people are here to assist with technical issues, not to try and help you answer questions in an interview for something you clearly know nothing about, so that you can get a job that you won't know how to do.
    If you have a specific technical issue, post appropriate details, including database version, table structures and indexes, example data and expected output etc. as detailed in the FAQ: {message:id=9360002}

  • Data from table in xml Format and Inserting it into  Table

    Hi All
    I have a table where XML data is stored in a LONG column with XML tags. Now I have to read the entire XML column, which contains the XML tags, and insert it into a different table. Can anyone suggest the code?
    Thanks & Regards

    I believe you are on the wrong forum. You want the XML DB forum.
    See:
    XML DB

  • Associative array comparison and INSERT upon IF condition

    Hi Guys,
    I have written this pl sql code to identify non existing sellers and insert their sales channel information into the dimension table (dimensional table update).
    Somehow nothing is inserted, and this script runs for 12+ hours without any result. The SQL autotrace shows no result, and the explain plan button in SQL Developer throws "missing keyword" when clicked. I have no information about what is going on/wrong. Does anyone spot an error?
    UNDEFINE DimSales;
    UNDEFINE FactTable;
    DEFINE DimSales = 'testsales';
    DEFINE FactTable = 'testfact';
    DECLARE
    v_SellerNo VarChar(9);
    v_error_code T_ERRORS.v_error_code%TYPE;
    v_error_message T_ERRORS.v_error_message%TYPE;
    TYPE assoc_array_str_type1 IS TABLE OF VARCHAR2(32) INDEX BY PLS_INTEGER;
         v1 assoc_array_str_type1;
    TYPE assoc_array_str_type2 IS TABLE OF VARCHAR2(32) INDEX BY PLS_INTEGER;
         v2 assoc_array_str_type2;
    BEGIN
    --Collect all distinct SellerNo into associative array (hash table)
    select distinct SellerNo bulk collect into v1 from &FactTable;
    select distinct seller_id bulk collect into v2 from &DimSales;
    v_SellerNo := v1.first;
    loop
    exit when v1 is null;
    --1 Check if v_SellerNo already exists in DIM_Sales (if NOT/FALSE, its a new seller and we can insert all records for that seller
    if (v2.exists(v_SellerNo)=false) THEN
    INSERT INTO &DimSales (K_Sales,REG,BVL,DS, VS,RS,GS,VK)
    (SELECT DISTINCT trim(leading '0' from RS||GS) ,REG BVL,DS,VS,RS,GS,VK from &FactTable where SellerNo =v_SellerNo);
    --ELSE
    end if;
    v_SellerNo := v1.next(v_SellerNo);
    end loop;
    EXCEPTION
    WHEN OTHERS THEN
    ROLLBACK;
    --v_error_code := SQLCODE
    --v_error_message := SQLERRM
    --INSERT INTO t_errors VALUES ( v_error_code, v_error_message);
    END;
    ---------------------------------------------------------------

    Distinct clause requires a sort. Sorts can be very expensive.
    Bulk collects that are not constrained in fetch size, can potentially fetch millions of rows - requiring that data to be wholly read into server memory. I have seen how this can degrade performance so badly that the kernel reboots the server.
    Using PL/SQL loops to process and insert/update/delete data is often problematic due to its row-by-row approach - also called slow-by-slow approach. It is far more scalable letting SQL do the "loop" processing, by using joins, sub-selects and so on.
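    To make that concrete, here is a minimal sketch of the posted logic as a single set-based statement (column names and the &DimSales/&FactTable substitution variables are taken from the posted script; the select list is assumed to line up with the eight insert columns):
    -- One INSERT ... SELECT replaces the associative arrays and the PL/SQL loop entirely:
    INSERT INTO &DimSales (K_Sales, REG, BVL, DS, VS, RS, GS, VK)
    SELECT DISTINCT TRIM(LEADING '0' FROM f.RS || f.GS),
           f.REG, f.BVL, f.DS, f.VS, f.RS, f.GS, f.VK
    FROM   &FactTable f
    WHERE  NOT EXISTS (SELECT 1
                       FROM   &DimSales d
                       WHERE  d.seller_id = f.SellerNo);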
    Where the conditional processing is too complex for SQL to handle, then PL/SQL is obviously an alternative to use. Ideally one should process data sets as opposed to rows in PL/SQL. Reduce context switching by using bulk fetches and bulk binds.
    But PL/SQL cannot execute in parallel as the SQL it fires off can. If after all the optimisation, the PL/SQL process still needs to hit a million rows to process, it will be slow irrespective of how optimal that PL/SQL approach and design - simply because of the number of rows and the processing overheads per row.
    In that case, the PL/SQL code itself need to be parallelised. There are a number of ways to approach this problem - the typical one is to create unique and distinct ranges of rows to process, spawn multiple P/SQL processes, and provide each with a unique range of rows to process. In parallel.
    So you need to look close at what you are trying to achieve, what the workloads are, and how to effectively decrease the workloads and increase the processing time of a workload.
    For example - finding distinct column values. You can pay for that workload when wanting that distinct list. And each time afterward repeat that workload when wanting that distinct list. Or you can pay for that workload up-front with the DML that creates/updates those values - and use (for example) a materialised view to maintain a ready to use distinct list of values.
    Same workload in essence - but paying for it once and up-front, as opposed to each time you execute the code that needs to dynamically build that distinct list.
    Kent Crotty did tests and showed stunning performance improvements with bulk collect and forall, up to 30x faster:
    Bulk processing is not a magical silver bullet. It is a tool. And when correctly used, the tool does exactly what it was designed to do.
    The problem is using a hammer to drive in screws - instead of a screwdriver. There's nothing "stunning" about using a screwdriver. It is all about using the correct tool.
    If the goal of the swap daemon is to free up "idle" chunks of memory, and try to use that memory for things like file cache instead, what does that have to do with bulk processing?
    The swap daemon reads virtual memory pages from swap space into memory, and writes virtual pages from memory to swap space.
    What does it have to do with bulk processing? A bulk fetch reads data from the SGA (buffer cache) into the PGA (private process memory space). The larger the fetch, the more memory is required. If, for example, 50% of server memory is required for a bulk collection that is 2GB in size, then that will force in-use pages from memory to swap space, only to be swapped back again as they are needed, thereby forcing other in-use pages to swap. The swap daemon will consume almost all the CPU time swapping hot pages continually in and out of memory.

  • How to receive parameters from a form and insert values in db using servlet

    Hi friends,
    My first question here.
    I'm trying to create a servlet that takes input from a form, processes it, and inserts it into database.tablename.
    My main issue is that the number of columns is dynamic, so I can't receive the parameters with static code.
    Here is the form code
    <%@page contentType="text/html" pageEncoding="UTF-8"%>
    <!DOCTYPE html>
    <form action="Test" method="post" />
    <table width="90%" border="1" cellspacing="1" cellpadding="1">
    <tr>
    <th scope="col">Student Name</th>
    <th scope="col">RollNo</th>
    <th scope="col">Java</th>
    <th scope="col">J2ee</th>
    </tr>
    <tr>
    <td><input type="text" name="studentname" /></td>
    <td><input type="text" name="rollno" /></td>
    <td><input type="text" name="java" /></td>
    <td><input type="text" name="j2ee" /></td>
    </tr>
    <tr>
    <td><input type="text" name="studentname" /></td>
    <td><input type="text" name="rollno" /></td>
    <td><input type="text" name="java" /></td>
    <td><input type="text" name="j2ee" /></td>
    </tr>
    <tr>
    <td><input type="text" name="studentname" /></td>
    <td><input type="text" name="rollno" /></td>
    <td><input type="text" name="java" /></td>
    <td><input type="text" name="j2ee" /></td>
    </tr>
    </table>
    <input type ="submit" value="insert values in the database now"/>
    </form>
    </html>
    And here is the Servlet
    import javax.servlet.*;
    import javax.servlet.http.*;
    import java.sql.*;
    import java.io.*;
    public class Test extends HttpServlet {
    public void doPost(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
    response.setContentType("text/html");
    PrintWriter out = response.getWriter();
    String title = "Module: Process Result Values from previous page";
    out.println("<html>\n" +
    "<head><title>" + title + "</title></head>\n" +
    "<body>");
              try
                   Class.forName("com.mysql.jdbc.Driver");
                   Connection con = DriverManager.getConnection("jdbc:mysql://localhost:3306/","root", "root");
                   Statement s = con.createStatement();
    ResultSet res = s.executeQuery("SELECT * FROM dbname.tablename");
    ResultSetMetaData rmeta=res.getMetaData();
    int noofcolumns=rmeta.getColumnCount();
                   if(!con.isClosed())
              catch(Exception e)
    System.out.println("Exception"+e);
    out.println(e);
    if(connectionFlag==1)
    out.println("</body></html>");
    doGet(request, response);
    public void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException
    This is the basic code structure, and I'm trying to figure out this special case of handling the multiple parameters:
    1. I don't want to hard-code String[] a = request.getParameterValues("studentname");. The reason is that the number of rows will be dynamic, and I would like to get the column names and then use an array of strings to take the values. But how can I connect these to form a query so that the rows get inserted one after the other?
    Thanks,
    Mashall

    Thank you for your help.
    I'm getting close to it, but this segment of code throws a NullPointerException. I counted the number of rows and columns using the methods getColumnCount() and getColumnName(i).
    ResultSetMetaData md = res.getMetaData(); //Get meta data for the resultset
    int noofcolumns=md.getColumnCount();
    out.println("<br />Number of columns in table (including name and roll number) :::: "+noofcolumns+"<br />");
    for(int i=1;i<=noofcolumns;i++) // access column 1....n
    String columnname = md.getColumnName(i); // get name of column 1....n
    out.println("<br />Current column name ::: "+columnname+"<br />"); //check if the name appear correct.
    String []columndata = request.getParameterValues(columnname); //Get raw value in array
    for(int j=1;j<=rows;j++) // here rows is number of rows in each column
    out.println("To be inserted: <b>"+columndata[j]+"</b> in Row ::: "+j+"<br />");
    //The line above thows null pointer exception.
    }
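    For what it's worth, a minimal sketch of the same loop with the two usual causes of that NullPointerException guarded against: getParameterValues() returns null when no form field matches the column name, and the array it returns is 0-based, unlike the 1-based JDBC column index. The class name is hypothetical; request, md and out are the objects from the posted snippet.
    import java.io.PrintWriter;
    import java.sql.ResultSetMetaData;
    import java.sql.SQLException;
    import javax.servlet.http.HttpServletRequest;

    public class FormColumnPrinter {
        // Prints every submitted value for every column described by the result set metadata.
        public static void printSubmittedValues(HttpServletRequest request, ResultSetMetaData md,
                                                PrintWriter out) throws SQLException {
            for (int i = 1; i <= md.getColumnCount(); i++) {       // JDBC columns are 1-based
                String columnname = md.getColumnName(i);
                String[] columndata = request.getParameterValues(columnname);
                if (columndata == null) {                          // no form field with this name
                    out.println("No values submitted for column " + columnname + "<br />");
                    continue;
                }
                for (int j = 0; j < columndata.length; j++) {      // parameter arrays are 0-based
                    out.println("To be inserted: <b>" + columndata[j] + "</b> in row " + (j + 1) + "<br />");
                }
            }
        }
    }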

  • Problem with overwrite and insert keyboard commands

    I'm a new user, switching from FCP. I edited one project successfully with no trouble a couple of weeks ago. Now I'm in a second project and having a weird difficulty. Using a pretty new Mac 8 core tower.
    The insert and overwrite keyboard commands (comma and period keys) don't work. They still work in the old project but not this one. If I start a new sequence and copy my timeline to it, the commands will work for a while, but ONLY with video, not with the audio tracks. Drag and drop works fine but that really slows me down.
    The clips I'm using are all the same format (H.264 from Canon 5DII and some still photos) as the first project. Only difference in this project is lots of chroma keying and some resizing of background still photos. Nothing unrendered.
    I started a new project just now and imported the problem project into it, started a new sequence, copied old sequence, and I get the same problem. I can drop in video with a keyboard command but not audio. Earlier it was not even doing video. Yesterday when I'd start a new sequence and do the same thing, it would give me video on the keyboard command but then eventually that would stop and it would do nothing. At one point the cursor would move over the width of the clip, as if it had been dropped into the timeline, but then that stopped as well.
    I've trashed preferences and that didn't work. I did that according to a procedure I found online. It said open up PP while holding down the Shift and Option keys. I did that twice. Nothing.
    To summarize: Keyboard commands for overwrite and insert edit do not work in this project but do in an older project.
    Any ideas?
    Thanks.
    Bill Pryor
    [email protected]

    Well, thanks to an email from another person who was having the same problem, it's all solved now.
    Operator error. Dang. Who knew--in FCP you select a track by clicking on the V or A buttons to the far left. In Premiere Pro you have to make sure BOTH the far left button and the one to the right that says Video 1, Audio 1, etc. are also clicked. They BOTH have to be light gray. In my first project, everything was properly activated but in the next project apparently they were not and that carried over to succeeding projects (even though I didn't change anything...at least consciously).
    So all as well now. No glitch, no bug. Just good ol' fashioned operator error. I will wear the dunce hat today.

  • Using servlets to read from text file and insert data

    Hi,
    Can I use a servlet to read from a space-delimited text file on the client computer and use that data to insert into a table in my database? I want to make it easy for my users to upload their data without having to make them use SQL*Loader. If so, can someone give me a hint as to how to get started? I appreciate it.
    Thanks,
    Colby

    Create a page for the user to upload the file to your webserver and send a message (containing the file location) to a server app that will open the file, parse it, and insert it into your database. Make sure you secure the page.
    or
    Have the user paste the file into a simple web form that submits to a servlet that parses the data and inserts it into your db.
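    As a rough sketch of the parse-and-insert step (the table name my_table, its three columns, the class name, and the Reader holding the uploaded file are all assumptions):
    import java.io.BufferedReader;
    import java.io.Reader;
    import java.sql.Connection;
    import java.sql.PreparedStatement;

    public class SpaceDelimitedLoader {
        // Reads space-delimited lines from the uploaded file and batch-inserts them via JDBC.
        public static void load(Reader uploadedFile, Connection con) throws Exception {
            String sql = "INSERT INTO my_table (col1, col2, col3) VALUES (?, ?, ?)";
            try (BufferedReader in = new BufferedReader(uploadedFile);
                 PreparedStatement ps = con.prepareStatement(sql)) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] fields = line.trim().split("\\s+");   // split on runs of whitespace
                    if (fields.length < 3) continue;               // skip blank or short lines
                    ps.setString(1, fields[0]);
                    ps.setString(2, fields[1]);
                    ps.setString(3, fields[2]);
                    ps.addBatch();
                }
                ps.executeBatch();                                 // one round trip for all rows
            }
        }
    }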

  • I have a MacBook pro 10.6.8, 2.66 Ghz 8GB 1067MHz. Please does anybody know of software that can auto send emails to a word/pages doc. and insert in date order into a book file, so that you can open like a book

    I have a MacBook Pro 10.6.8, 2.66 GHz, 8GB memory at 1067MHz, running Snow Leopard. I am going mad trying to organise emails and cross-reference them to file documents, also on the first-model iPad.
    Does anybody know if there is any software, Pages or otherwise, that can copy emails automatically to document files, insert them in date order, and let the file be reviewed like a book by flicking from the bottom right-hand corner, iPad book style?
    Please help me run my contracts easily; it shouldn't be difficult in this day and age.
    I currently use Apple Mail and Microsoft Word but am considering any new software that can organise and integrate mail better. I use photo inserts, scanned documents and Excel spreadsheets.

    No, I don't know about that kind of software. It isn't Pages anyway. You need somekind of database.

  • Triggering Process chain based on execution of R3 job and Process Chain

    Hi All,
    I need your help in arriving at a solution for a requirement.
    The scenario is: I need to trigger process chain B based on the successful execution of process chain A and an R/3 job. Only if both conditions (completion of the R/3 job and of process chain A) are met should process chain B be triggered.
    I know that we can use events to trigger a process chain from an R/3 job, but my case is different: I need to consider the success message from both the process chain and the R/3 job. Is there any way to solve this?
    Please provide me with your valuable inputs.
    Thanks,
    Bindu

    Hi Hima,
    You can use the 'AND' process variant for both; if both are successful, then trigger the process chain.
    Regards,
    Ravi Kanth
    Edited by: Ravi kanth on Apr 30, 2009 3:36 PM
