Fetching 30,000 records on Execute Query hangs the form

Hello
My problem is that when I press the Execute Query button it takes a very long time to execute. Can anyone help me, please?
Would maximizing the buffering size solve the problem? If yes, to what extent?
Regards,
Abdetu...

Abdetu wrote:
Hi,
yes I did; it's much faster in PL/SQL.
Regards,
Abdetu..
PL/SQL is different from SQL; it's a different engine. If you want to compare speed between Forms and raw SQL, you need to use SQL*Plus.
The best way to compare is to check the explain plan.
Open a SQL*Plus session and use the commands below to see the plan and the timing.
SQL*Plus: Release 10.1.0.4.2 - Production on Mon Apr 6 08:37:41 2009
Copyright (c) 1982, 2005, Oracle.  All rights reserved.
Enter user-name: TONY
Enter password:
Connected to:
Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
With the Partitioning, OLAP and Data Mining options
TONY@ORCL>
TONY@ORCL> SET AUTOTRACE TRACEONLY
TONY@ORCL>
TONY@ORCL> SET TIMING ON
TONY@ORCL>
TONY@ORCL>
TONY@ORCL> SELECT * FROM SCOTT.EMP;
15 rows selected.
Elapsed: 00:00:00.01
Execution Plan
   0      SELECT STATEMENT Optimizer=ALL_ROWS (Cost=3 Card=15 Bytes=540)
   1    0   TABLE ACCESS (FULL) OF 'EMP' (TABLE) (Cost=3 Card=15 Bytes=540)
Statistics
          1  recursive calls
          0  db block gets
          7  consistent gets
          0  physical reads
          0  redo size
       1448  bytes sent via SQL*Net to client
        512  bytes received via SQL*Net from client
          2  SQL*Net roundtrips to/from client
          0  sorts (memory)
          0  sorts (disk)
         15  rows processed
TONY@ORCL>
Keep in mind, the user must have the PLUSTRACE role granted to use SET AUTOTRACE.
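If the role is missing, a DBA can create and grant it (this assumes plustrce.sql is in its default location under ORACLE_HOME):
-- run as SYS: plustrce.sql ships with SQL*Plus and creates the PLUSTRACE role
@?/sqlplus/admin/plustrce.sql
GRANT plustrace TO tony;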
Tony

Similar Messages

  • Unable to fetch 50,000 records from SQL using Orchestrator

    Hi Team,
    I have a table in MS SQL which has more than 50,000 records. I am trying to fetch 50,000 records using Orchestrator, but I am unable to fetch them.
    There is no problem with the SQL query, because I can get 40,000 records.
    I am using SCORCH DEV - SQL Integration Pack.
    Regards,
    Soundarajan.

    Hi,
    Thanks for your query.
    I have also used the timeout parameter but it is not working. As you said, I also tried the Query Database activity, which is out of the box...
    Now I can fetch more than 80,000 records, but the output I am getting is not in the format we are looking for.
    Attached the output...
    How do I edit this?
    I tried to write the output to Excel, but all the data sits in the first column.
    Regards,
    Soundarajan.

  • To fetch all active records in Report falling in the Date Range

    Gurus,
    My requirement goes like this. In my BEx report I have an input prompt on ZStart_Date (type Interval, Mandatory).
    There is something called as process which can have start Date and End Date.
    Say
    Process A has Start Date as 01.01.1990 End Date - 31.12.9999 ( ie., ongoing)
    Process B has Start Date as 01.01.1995 End Date - 01.01.2000 ( i.e., Process Dead)
    So when I execute the report with the Start Date prompt as 01.01.2008 to 31.12.2008, it should fetch all active processes which fall in that period, regardless of the process start date; at the same time it should not consider dead processes.
    To help you guys: I have a notion of using a customer exit variable or something of that sort, but I don't know much about these concepts.
    Any pointers would be of GREATEST help.
    Regards,
    Yaseen

    Shalabh,
    To be precise: we term a process ongoing based on the end date of the process.
    Say process X started on 01.01.2000 and ended on 31.12.2005; when I execute the report for the period 01.01.2006 - 31.12.2006, process X is dead according to the reporting period and hence should not appear in the report.
    The report should fetch processes which are ongoing, i.e., their end dates can be 31.12.9999 or can fall within the reporting period.
    You couldn't understand "regardless of the process start date"...??
    I mean: say process A started on 01.01.1995 and its end date is 31.12.9999 (which we term ongoing). So when my reporting period is, say, 01.01.2006 to 31.12.2006, I should fetch this process, and that's why I term it an active process in the reporting period.
    Hope this clarifies.
    Edited by: Yaseen Ahmed on Dec 15, 2008 5:32 PM

  • Question on SFSF Adapter : Delta sync to fetch only changed records.

    Hi All ,
    As per the SAP document on the SFSF communication channel, one of the features supported by the SP00 release of the connectivity add-on 1.0 is:
    1. Delta sync to fetch only changed records
    SFSF Adapter: Delta Sync
    The delta sync feature enables you to fetch only the records that were modified after the last successful data fetch from the SuccessFactors system.
    This increases the efficiency of the query operation.
    So below are my questions about the successful run date feature:
    Does the SAP PI connector internally maintain this last successful run date?
    What is the time zone of the successful run date: is it where the SF data center is located, or where the PI box is located?
    What does "successful run date" mean: the process's successful run date, or just the communication channel's successful run date?
    For delta extraction, is this successful run date maintained internally per interface or per communication channel?
    E.g., if I have 5 delta interfaces, will SAP PI maintain 5 successful run dates with respect to each connector, or with respect to each process?
    For more information please see the URL below.
    http://help.sap.com/saphelp_nw-connectivity-addon100/helpdata/en/1e/22aaf0780d4b78b6f889157f6a8035/frameset.htm
    Regards
    Prabhat Sharma

    Hello all,
    I have the same question: in the "official" documentation (SuccessFactors (SFSF) Adapter for SAP NetWeaver Process Integration) the following is mentioned:
    Delta Sync - The delta sync feature enables you to fetch only the records that were modified after the last successful data fetch from the SuccessFactors system. This increases the efficiency of the query operation.
    Anyway, I can't find any specification on how to use this feature. Do you have any suggestion on how to implement it, or any reference to existing documentation?
    Do I need to write you an email to get the info?
    Regards

  • How to return to the first record of a multiple row cursor (Forms 6i)

    Hi all,
    I had a bit of a search through here, but couldn't quite find the answer I'm after.
    I have a multiple row cursor, which I feed into a multi-row block in Forms 6i. I have the following code:
    OPEN CURSOR;
    LOOP
       FETCH CURSOR INTO :FIELD1, :FIELD2, :FIELD3, :FIELD4;
       ... do other code not related with cursor
       EXIT WHEN CURSOR%NOTFOUND;
       NEXT_RECORD;
    END LOOP;
    Now, I use the Forms built-in NEXT_RECORD to move down through the records on the form and fill in each row from the DB. However, once the block loads (this works correctly), the current item (i.e. where the typing cursor is left) is on the last record.
    Obviously, I need the current item (after loading) to be on the first record. I tried using the Forms built-in FIRST_RECORD after the END LOOP command, but that gives me an ORA-06511 error (an attempt was made to open a cursor that is already open).
    Does anyone know how I might be able to return to the first record of the block correctly?
    Thanks in Advance,
    Chris.

    Ok, I feel like a bit of a dolt.
    I found that all cursors had to be closed before navigating back to the first record.
    Case closed! :)
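    For anyone who finds this later, here is roughly what the fixed trigger logic looks like (the cursor, block, and field names are placeholders):
    OPEN emp_cur;
    LOOP
       FETCH emp_cur INTO :BLK.FIELD1, :BLK.FIELD2;
       EXIT WHEN emp_cur%NOTFOUND;
       -- ... per-row processing ...
       NEXT_RECORD;
    END LOOP;
    CLOSE emp_cur;   -- close every open cursor first
    FIRST_RECORD;    -- now navigating back to the top works without ORA-06511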

  • System hangs while retrieving nearly 80,000 records in HTML output

    Hi Users,
    I am using Oracle BI Publisher Enterprise.
    When I try to retrieve nearly 80,000 records, the system hangs.
    How do I get around that?
    Awaiting your reply, guys.

    That's common; go for a summary report instead.

  • Performance in processing 80,000 records.

    Hi
    I am working on a module where I have to upload a file of 80,000 records, process them, and then upload them to a web service.
    I am uploading the file by simply parsing the request:
    items = upload.parseRequest(request);
    After this I am traversing the entire file line by line, processing individual records with my logic, and then saving them to a Vector.
    In the second servlet I am fetching these records and then uploading them to the web service.
    This process will take some time.
    I am facing a few problems/questions here:
    Question 1:
    After 30 minutes or so, the browser displays "This page cannot be displayed".
    While debugging this code with breakpoints, I noticed that the code is actually still executing when the browser displays the "This page cannot be displayed" message.
    Can I increase a browser setting so that it waits some more time before displaying the above message,
    so that my Java code can complete its execution?
    Question 2 :
    I am using a Vector to store all 80,000 records at one go. Will the use of ArrayList or some other collection type increase performance?
    Question 3 :
    What if I break the vector into parts?
    I.e., instead of keeping 1 single vector of 80,000 records, what if I store 10,000 records each in different vectors and then process them separately?
    Please comment.
    Thanks.

    money321 wrote:
    Question 1:
    After 30 minutes or so, the browser displays "This page cannot be displayed".
    While debugging this code with breakpoints, I noticed that the code is actually still executing when the browser displays the "This page cannot be displayed" message.
    Can I increase a browser setting so that it waits some more time before displaying the above message,
    so that my Java code can complete its execution?
    It is the request timeout; it is a web server setting, not a web browser setting. Even though the request times out, the code should still continue to execute until the process finishes; you just don't get the response in your browser.
    Question 2 :
    I am using a Vector to store all 80,000 records at one go. Will the use of ArrayList or some other collection type increase performance?
    Probably yes, because a Vector is thread-safe while an ArrayList is not. It is a similar situation to StringBuffer/StringBuilder.
    Question 3 :
    What if I break the vector into parts?
    I.e., instead of keeping 1 single vector of 80,000 records, what if I store 10,000 records each in different vectors and then process them separately?
    Wouldn't make much of a difference, I'd say. The biggest performance hit is the web service call, so try to save as much time as you can there. By the way, are you doing one web service call, or 80,000?

  • Unable to INSERT PL/SQL  record with EXECUTE IMMEDIATE

    Hi All,
    I am selecting data from a source table and, after some modification, inserting it into a target table. The source and target table names are available at run time; you can say only the source table structure is fixed.
    I have created a PL/SQL table of the source record type and am inserting record by record into the target table using EXECUTE IMMEDIATE. But I am not able to write
    EXECUTE IMMEDIATE string USING pl_sql_table(index); and I am getting the error
    PLS-00457: expressions have to be of SQL types
    Please see the part of the code below. Is it possible to use FORALL with dynamic SQL, like
    FORALL j IN pl_sql_table.FIRST .. pl_sql_table.COUNT
    EXECUTE IMMEDIATE .... pl_sql_table(j); -- like this?
    Please suggest why I am not able to use a record here. I also want to replace the 'INSERT in a loop' with a single INSERT statement outside the loop, to upload the whole PL/SQL table into the target table in one go.
    Thanks,
    Ravi
    DECLARE
        TYPE rec_tab_CMP IS RECORD (
            model_id          NUMBER(38),
            absolute_rank     NUMBER(5)
        );
        v_rec_tab_CMP  rec_tab_CMP;
        TYPE t_rec_tab_CMP IS TABLE OF v_rec_tab_CMP%TYPE INDEX BY BINARY_INTEGER;
        v_records_CMP               t_rec_tab_CMP;
        rc                          SYS_REFCURSOR;
        v_old_table_name            VARCHAR2(30); -- passed from parameter 
        v_new_table_name            VARCHAR2(30); -- passed from parameter 
        dyn_str                     VARCHAR2(500);
        v_columns_str               VARCHAR2(200) := ' model_id, absolute_rank ';
    BEGIN
           EXECUTE IMMEDIATE 'CREATE TABLE '|| v_new_table_name || ' AS SELECT * FROM ' || v_old_table_name ||' WHERE 1 = 2 ' ;
         OPEN rc FOR 'SELECT '|| v_columns_str ||' FROM '|| v_old_table_name;
         FETCH rc BULK COLLECT INTO v_records_CMP;
         FOR j IN 1..v_records_CMP.COUNT
         LOOP
            v_records_CMP(j).model_id := 1; -- Do something here. This thing can be performed in the SQL stmt directly.
            dyn_str := 'INSERT INTO '|| v_new_table_name ||' ( '|| v_columns_str || ' ) VALUES (:1, :2) ';
            EXECUTE IMMEDIATE dyn_str USING v_records_CMP(j).model_id          ,
                                v_records_CMP(j).absolute_rank     ;
         -- Here in place of two columns I want to use one record like
         -- EXECUTE IMMEDIATE dyn_str USING v_records_CMP(j);
         -- But it is giving me error like
            --          EXECUTE IMMEDIATE dyn_str USING v_records_st(j);
            --   PLS-00457: expressions have to be of SQL types
         END LOOP;
         CLOSE rc;
    END;
    /

    You cannot bind PL/SQL record types to dynamic SQL.
    Possibly you could work around this by declaring a table of records at package specification level, e.g.
    Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    SQL> CREATE PACKAGE package_name
      2  AS
      3     TYPE tt_emp IS TABLE OF emp%ROWTYPE;
      4     t_emp tt_emp;
      5  END package_name;
      6  /
    Package created.
    SQL> CREATE TABLE new_emp
      2  AS
      3     SELECT *
      4     FROM   emp
      5     WHERE  1 = 0;
    Table created.
    SQL> DECLARE
      2     v_table_name user_tables.table_name%TYPE := 'NEW_EMP';
      3  BEGIN
      4     SELECT *
      5     BULK COLLECT INTO package_name.t_emp
      6     FROM   emp;
      7
      8     EXECUTE IMMEDIATE
      9        'BEGIN ' ||
    10        '   FORALL i IN 1 ..package_name.t_emp.COUNT ' ||
    11        '      INSERT INTO ' || v_table_name ||
    12        '      VALUES package_name.t_emp (i); ' ||
    13        'END;';
    14  END;
    15  /
    PL/SQL procedure successfully completed.
    SQL> SELECT empno, ename
      2  FROM   new_emp;
         EMPNO ENAME
          7369 SMITH
          7499 ALLEN
          7521 WARD
          7566 JONES
          7654 MARTIN
          7698 BLAKE
          7782 CLARK
          7788 SCOTT
          7839 KING
          7844 TURNER
          7876 ADAMS
          7900 JAMES
          7902 FORD
          7934 MILLER
    14 rows selected.
    SQL>

  • Performance while uploading 50,000 records

    Hi
    I have to create an application which reads records from a file.
    The records can number up to 50,000.
    These records are then to be processed using a web service.
    Now, I need to design an optimized solution.
    The simple design would be to read all records from the file, store them in the context, and then loop them through the web service model.
    I think there has to be a more optimal solution.
    Even ahead of performance comes a runtime memory issue! (What if it falls short of holding all 50,000 records in the context at the same time?)
    How can I break this application up?
    Thanks


  • Trying to insert 10,000 records, but it stops at 500 records

    I am trying to insert 10,000 records into a 4-column table on Sun Solaris Oracle, from a Visual Basic application, but it stops at 500 records, and when I try to insert record 501 it never succeeds.
    Is there a limitation on inserts in the Oracle database?

    Hi,
    There is no such limitation in Oracle Database. The insertion process goes on, but it looks like it is hanging. You can do one thing to trace what is happening:
    1. Place a progress bar item on your screen.
    2. Set Min = 1.
    3. Set Max = the total number of records in the source table (where the 10,000 records are).
    4. You might have one Do While loop structure inserting records into the target table. Within that loop, increase the value of the progress bar, so that while inserting a record the progress bar value changes.
    That way, you can trace whether the process is running or not.
    I think this will help you trace the process.
    N.Swaminathan

  • Until it reaches 50,000 records, will the request status be yellow?

    Hi all,
    I can see there are 20,000 records in the request, but it is in yellow, so I believe the request will stay yellow until it reaches 50,000 records. Is that right?
    Can anyone please confirm this?
    Thanks
    Pooja

    It's not necessary that it should have 50,000 records. It completely depends on how many records in the tables satisfy the conditions of the data selection in the InfoPackage.
    It will change to green only once it has fetched all the records.

  • Call transaction for less than 10,000 records

    Hi all
    Is CALL TRANSACTION advisable for processing fewer than 10,000 records?
    I want to use CALL TRANSACTION for each record and, if it fails, pass the record to an open BDC session for manual processing by the functional consultants. At the same time, I have to write a report with the successful and errored records. Is my approach the only way? Any other ideas are welcome and appreciated.
    Thanks
    ricky

    It's completely safe.
    Check the return table for all messages of type E, then build your error table from them and pass it to a BDC session, which can be executed from SM35 in foreground mode. But what happens if there are more, say 100 or 200, error records? It's not feasible to run each record in foreground mode. Instead, display the error records in an editable ALV, with the error messages trapped from the CALL TRANSACTION return table, let the consultants correct them on the screen, then download the data and modify it to suit the upload procedure of your report. Then reprocess the file.
    Also, display the audit report based on that table.
    Regards,
    Amit

  • Possible to send 50,000 records at a time using a proxy?

    Hi All,
    I am using a proxy to send data from SAP to PI and then on to the receiver using JMS. Here I have a small issue: is it possible to send 50,000 records at a time using a proxy? If not, please suggest how I can send a bulk of records through the proxy.
    Thanks
    Karthik.

    is it possible to send 50,000 records at a time using a proxy? If not, please suggest how
    I can send a bulk of records through the proxy?
    You can try this in steps... do not go for big-bang testing :) ... check how much your XI system can handle at a time; then you may need to tune the system parameters to accommodate a larger message size. How to do this? Check the document below, section 3.9.2 (Special Cases).
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/2016a0b1-1780-2b10-97bd-be3ac62214c7
    Regards,
    Abhishek.

  • How to fetch a certain number of records at a time

    Hi everyone,
    I don't know how to fetch only some of the records at a time; for some reason there is not enough memory, so I can't fetch all the records at once.
    For example, each time I only want to fetch 10 records. Does anyone know how?
    thanks
    Krussi

    I think you may be getting setFetchSize(int) and setMaxRows(int) confused. I may have added to the confusion with my breaking out of the loop comment.
    setFetchSize(int) gives the driver a hint as to how many rows it should store in memory at a time internally, ie. within the driver. It has no real effect on how you write your program at all. Its only purpose is to limit memory usage by the driver. To limit the rows returned, use setMaxRows.
    For example, assume a result set has 20 rows and you use:
    int count = 0;
    rs.setFetchSize(5);
    while (rs.next()) {
       System.out.println("Rows: " + (++count));
    }
    You still get a count from 1 to 20. However, internally the driver is only holding 5 records in memory at a time; after those 5 are returned it will discard them and get the next 5. So instead of holding 20 rows, it is only holding 5 at a time. If you change rs.setFetchSize(5) to rs.setMaxRows(5), you will only get a count from 1 to 5; the rest are silently discarded and rs.next() returns false.
    Check your documentation to see if your driver supports the method. I believe many drivers don't. If not, I believe there is no way around multiple queries. Even if there is no rowid, as long as you order your queries you should be able to start again where you left off.
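    If your driver does not implement setFetchSize, one common database-side approach is pagination over a stable ORDER BY, re-running the query once per page. This assumes Oracle; the table, column, and bind names are just placeholders:
    SELECT *
    FROM  (SELECT t.*, ROWNUM rn
           FROM  (SELECT * FROM emp ORDER BY empno) t
           WHERE ROWNUM <= :upper)
    WHERE rn > :lower;
    Bind :lower/:upper as 0/10, then 10/20, and so on, to walk the result ten rows at a time.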
    Disclaimer: Both drivers I commonly use do not implement setFetchSize(int). Someone let me know if I've fudged something here.
    Good Luck

  • How can I modify this to process 80,000 records at a time until finished?

    Hello it's me again -
    Without using a rownum limit in my cursor declaration, my record set is around 10 million records. This causes problems in our environment. Is there a way I can loop the following code and only do 80,000 at a time?
    1. Process 80,000.
    2. Process the next 80,000.
    How would I redeclare the cursor in a loop and grab the next 80,000?
    Thanks again
    Steve
    SET SERVEROUTPUT ON
    DECLARE
    CURSOR vt_mlr_cursor IS Select master_key, tn, user2 from vt_mlr Where user2 is not null and rownum < 80001;
    USERFIELD VARCHAR2(100);
    R_count NUMBER := 0;
    Field1 VARCHAR2(20);
    Field2 VARCHAR2(20);
    key VARCHAR2(10);
    phone VARCHAR2(11);
    BEGIN
       FOR vt_mlr_record IN vt_mlr_cursor LOOP
          BEGIN
             key       := vt_mlr_record.master_key;
             phone     := vt_mlr_record.tn;
             USERFIELD := vt_mlr_record.user2;
             Field1    := SUBSTR(vt_mlr_record.user2,12,4);
             Field2    := SUBSTR(vt_mlr_record.user2,28,4);
             UPDATE vt_mlr
             SET    line_medr = Field1,
                    line_aidr = Field2
             WHERE  master_key = vt_mlr_record.master_key;
             R_count := R_count + 1;
          EXCEPTION
             WHEN OTHERS THEN
                INSERT INTO temp_reject (REJECT_KEY, REJECT_TN, REJECT_VALUE)
                VALUES (key, phone, 'USER2 ' || USERFIELD);
                R_count := R_count - 1;
          END;
       END LOOP;
       COMMIT;
    EXCEPTION
       WHEN OTHERS THEN
          DBMS_OUTPUT.PUT_LINE('Error code ' || sqlcode || ' Error desc ' || SUBSTR(sqlerrm,1,200));
    END;

    Add a "last_update" or "modified" column to your table.
    Then do this:
    declare
       cursor vt_mlr_cursor is
          select master_key, tn, user2
          from   vt_mlr
          where  user2 is not null
          and    nvl(modified, 'N') != 'Y'   -- or: last_update < begin_date
          and    rownum < 80001;
       -- begin_date constant date := sysdate;  -- only needed for the last_update variant
       n  pls_integer;
    begin
       loop
          n := 0;
          for r in vt_mlr_cursor loop
             update vt_mlr
             set    line_medr = substr(r.user2, 12, 4),
                    line_aidr = substr(r.user2, 28, 4),
                    modified  = 'Y'          -- or: last_update = sysdate
             where  master_key = r.master_key;
             n := n + 1;
          end loop;
          commit;                            -- one commit per batch of up to 80,000 rows
          exit when n = 0;                   -- nothing left to process: done
       end loop;
    end;
    /
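    If undo space allows, the row-by-row loop can also collapse into a single set-based statement; a rough sketch reusing the same SUBSTR positions as the original code:
    update vt_mlr
    set    line_medr = substr(user2, 12, 4),
           line_aidr = substr(user2, 28, 4)
    where  user2 is not null;
    commit;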
