Problem fetching a large number of records

Hi
I want to fetch a large number of records from a database, and I use a secondary index database to improve performance. For example, my database has 100,000 records and a query fetches 10,000 of them. I position in the secondary (index) database and move through it until I have fetched all of the records that match my condition, but the performance of this loop is terrible.
I know that fetching everything with DB_MULTIPLE would improve performance, but
I read that I cannot use this flag when I use a secondary database as an index.
Please help me and tell me which flag or implementation would fetch all of the information together, so that I can then manage the data in my language.
Thanks a lot,
regards
saeed

Hi Saeed,
Could you post your source code here, compiled and ready to be executed, so we can take a look at the loop section?
You won't be able to do a bulk fetch, that is, retrieval with DB_MULTIPLE, given that the records in the primary are unordered by master (you don't have 40K consecutive records with master='master1'). So the only way to do things in this situation is to position a cursor in the secondary on the first record with the secondary key 'master1', retrieve all the duplicate data items (the primary keys of the records in the primary db) one by one, and do the corresponding gets in the primary database based on the retrieved keys.
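For illustration, a minimal C sketch of that per-record loop, using DBC->c_pget() on the associated secondary (c_pget returns the primary key and the primary data together, which saves the separate get in the primary); handle and key names are placeholders and error handling is omitted:

    #include <string.h>
    #include <db.h>

    /* sdbp: secondary DB handle, already associated with the primary */
    void fetch_by_master(DB *sdbp)
    {
        DBC *dbc;
        DBT skey, pkey, pdata;
        char master[] = "master1";

        sdbp->cursor(sdbp, NULL, &dbc, 0);

        memset(&skey, 0, sizeof(skey));
        memset(&pkey, 0, sizeof(pkey));
        memset(&pdata, 0, sizeof(pdata));
        skey.data = master;
        skey.size = sizeof(master) - 1;  /* assumes keys stored without a trailing NUL */

        /* Position on the first duplicate of 'master1', then walk the rest. */
        if (dbc->c_pget(dbc, &skey, &pkey, &pdata, DB_SET) == 0)
            do {
                /* pdata holds the primary record whose key is in pkey */
            } while (dbc->c_pget(dbc, &skey, &pkey, &pdata, DB_NEXT_DUP) == 0);

        dbc->c_close(dbc);
    }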
There may be another option to take into consideration, though, if you are willing to handle more work in your source code: keep a database that acts as a secondary, in which you update the records manually to mirror the modifications performed in the primary db, without ever associating it with the primary database. This "secondary" would have <master> as key, and <std_id>, <name> (and other fields if you want) as data. Note that for every modification you perform on the std_info database you'll have to perform the corresponding modification on this database as well. You'll then be able to do DBC->c_get() calls on this database with the DB_MULTIPLE flag specified.
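As a hedged sketch of what the bulk fetch on such a manually maintained database could look like (the buffer size, the key, and the lack of error handling are all simplifications):

    #include <stdlib.h>
    #include <string.h>
    #include <db.h>

    /* db: handle of the manually maintained duplicates database (DB_DUP set) */
    void bulk_fetch_master(DB *db)
    {
        DBC *dbc;
        DBT key, data;
        void *p, *retdata;
        u_int32_t retdlen;
        char master[] = "master1";

        db->cursor(db, NULL, &dbc, 0);

        memset(&key, 0, sizeof(key));
        key.data = master;
        key.size = sizeof(master) - 1;

        /* DB_MULTIPLE requires a user-owned buffer, a multiple of 1024 bytes. */
        memset(&data, 0, sizeof(data));
        data.ulen = 1024 * 1024;
        data.data = malloc(data.ulen);
        data.flags = DB_DBT_USERMEM;

        /* First buffer-load of duplicates, then keep going until exhausted. */
        if (dbc->c_get(dbc, &key, &data, DB_SET | DB_MULTIPLE) == 0)
            do {
                for (DB_MULTIPLE_INIT(p, &data);;) {
                    DB_MULTIPLE_NEXT(p, &data, retdata, retdlen);
                    if (p == NULL)
                        break;
                    /* process retdlen bytes at retdata (a primary key) */
                }
            } while (dbc->c_get(dbc, &key, &data,
                                DB_NEXT_DUP | DB_MULTIPLE) == 0);

        free(data.data);
        dbc->c_close(dbc);
    }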
I have another question: is there any way to fetch a record by its record number?
For example, fetch the record located at the third position in my database.

I guess you're referring to logical record numbers, like the relational database's ROWID. Since your databases are organized as BTrees (without the DB_RECNUM flag specified), this is not possible directly. You could do it by iterating through the records with a cursor and stopping on the record whose number is the one you want (using an incrementing counter to keep track of the position). If your databases had been configured for logical record numbers (BTree with DB_RECNUM, Queue or Recno), this would have been possible directly:
http://www.oracle.com/technology/documentation/berkeley-db/db/ref/am_conf/logrec.html
http://www.oracle.com/technology/documentation/berkeley-db/db/ref/am_conf/renumber.html
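For completeness, a minimal sketch of the direct access you'd get with logical record numbers (a BTree opened with DB_RECNUM; placeholder names, no error handling):

    #include <string.h>
    #include <db.h>

    /* dbp: a BTree DB handle that was created with DB_RECNUM set */
    int get_record_by_number(DB *dbp, db_recno_t recno, DBT *data)
    {
        DBT key;

        memset(&key, 0, sizeof(key));
        memset(data, 0, sizeof(*data));
        key.data = &recno;
        key.size = sizeof(recno);

        /* DB_SET_RECNO makes get() treat the key as a record number. */
        return dbp->get(dbp, NULL, &key, data, DB_SET_RECNO);
    }

For example, get_record_by_number(dbp, 3, &data) would return the third record.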
Regards,
Andrei

Similar Messages

  • Lookups with large number of records do not return the page

    Hi,
    I am developing an application using Oracle JHeadstart 10.1.3 Preview Version 10.1.3.0.78
    In my application I created a lookup under Domains and used that lookup for an attribute (the Display Type for this attribute is dropDownList) in a group, to get the translation for this attribute. The group has around 14,800 records and the lookup has around 7,400 records.
    When I try to open this group (Tab), the progress shows that it is progressing but it does not open even after a long time.
    If I change the Display Type for the attribute from dropDownList to textInput then it works fine.
    I have other lookups with a lower number of records; those lookups work fine with the dropDownList display type.
    I only have this kind of problem when a lookup has a large number of records.
    Is there any limitation on the number of records for lookups under Domains?
    How can I solve this?
    I need to translate the attribute (get the description from another table using the code).
    Your help would be appreciated.
    Thanks
    Syed

    We have also faced a similar issue, but for us it was happening when we were using the dropDownList in a table layout, while the same dropDownList was working in form layout. In our case the JVM used to just crash, and after searching these forums we found that it might be related to a JVM issue on Windows XP machines without Service Pack 2.
    Anyway... the workaround that we took to get around the issue is to use an LOV instead of a dropDownList in JHeadstart.
    Hope this helps...
    - rutwik

  • Best way to delete large number of records but not interfere with tlog backups on a schedule

    I've inherited a system with multiple databases, with db and tlog backups that run on schedules. There is a list of tables that need a lot of records purged from them. What would be a good approach for deleting the old records?
    I've been digging through old posts, reading best practices, etc., but I'm still not sure of the best way to attack it.
    Approach #1
    A one-time delete that does everything: delete all the old records, in batches of say 50,000 at a time.
    After each run through all the tables for that DB, execute a tlog backup.
    Approach #2
    Create a job that does a similar process as above, except don't loop; only do the batch once. Have the job scheduled to start, say, on the half hour, assuming the tlog backups run every hour.
    Note:
    Some of these tables (well, most) are going to have relations on them.

    Hi shiftbit,
    According to your description, I have changed the type of this question to a discussion; that way more experts will focus on the issue and assist you. When deleting a large number of records from tables, you can use batched deletions so that the transaction log does not grow and run out of disk space. If you can take the table offline for maintenance, a complete reorganization is always best, because it does the delete and places the table back into a pristine state.
    For more information about deleting a large number of records without affecting the transaction log, see:
    http://www.virtualobjectives.com.au/sqlserver/deleting_records_from_a_large_table.htm
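    For illustration, a minimal T-SQL sketch of the batched delete from Approach #1 (the table name, date column, and cutoff are hypothetical placeholders):

        -- Delete in batches of 50,000 so each transaction stays small and
        -- the scheduled tlog backups can truncate the log between batches.
        DECLARE @rows INT;
        SET @rows = 1;
        WHILE @rows > 0
        BEGIN
            DELETE TOP (50000) FROM dbo.BigTable
            WHERE CreatedDate < '20100101';
            SET @rows = @@ROWCOUNT;
        END;

    Tables referenced by foreign keys need their child rows purged first, or the deletes will fail.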
    Hope it can help.
    Regards,
    Sofiya Li
    TechNet Community Support

  • Change transaction using BDC - problem with large number of lines on screen

    Hi All,
    I am developing a BAPI (using BDC) which creates a quality notification in SAP; the data is entered via a front-end web application. Structure p_qmsm contains 3 task lines for the notification. The code is given below. To avoid the problem of a large number of lines on the screen, the code lines starting with * are used: they page down after every 2 lines are entered and create a new line, so that 2 lines are always pushed up on the screen and there is no problem creating a large number of lines.
    perform bdc_dynpro using 'SAPLIQS0' '7200'.
    perform bdc_field using 'BDC_OKCODE' '=10\TAB11'.
    LOOP AT p_qmsm INTO wa_qmsm.
    *IF wa_qmsm_cntr > 2.
    *  wa_qmsm_cntr = 2.
    *  perform bdc_dynpro  using 'SAPLIQS0' '7204'.
    *  perform bdc_field   using 'BDC_OKCODE' '=PEND'.
    *ENDIF.
    perform bdc_dynpro using 'SAPLIQS0' '7204'.
    perform bdc_field using 'BDC_OKCODE' '/00'.
    CONCATENATE 'VIQMSM-QSMNUM(' wa_qmsm_cntr ')' INTO wm_qmsm_qsmnum.
    CONCATENATE 'VIQMSM-MNGRP(' wa_qmsm_cntr ')' INTO wm_qmsm_mngrp.
    CONCATENATE 'VIQMSM-MNCOD(' wa_qmsm_cntr ')' INTO wm_qmsm_mncod.
    CONCATENATE 'VIQMSM-MATXT(' wa_qmsm_cntr ')' INTO wm_qmsm_matxt.
    perform bdc_field using wm_qmsm_qsmnum wa_qmsm-qsmnum.
    perform bdc_field using wm_qmsm_mngrp wa_qmsm-mngrp.
    perform bdc_field using wm_qmsm_mncod wa_qmsm-mncod.
    perform bdc_field using wm_qmsm_matxt wa_qmsm-matxt.
    wa_qmsm_cntr = wa_qmsm_cntr + 01.
    ENDLOOP.
    CALL TRANSACTION 'IQS2' USING wt_bdc
    MODE 'N' UPDATE 'A'  MESSAGES INTO P_MESSTAB.
    The same code is used in modify mode as well. The web application sends all 3 lines in modify mode even if only a single line was modified, and it has already been decided to send all rows back from the web application to SAP in the same sequence. The code works fine if I comment out the 5 lines starting with *. But in modify mode, how can I ensure that the correct row is modified? And how can I get around the problem of a large number of lines on the screen? Please suggest.

    Hi yogesh,
    how can i ensure that correct row is modified?
    1. For this we need to know two things:
       a) the database table in which the entries are already stored
       b) the sequence in which they are displayed in the transaction.
    2. So before changing any line, we need to compare (the primary key values / important values)
       a) as per the database table and as per the incoming data from web application (using bapi)
      b) if the match is ok, it means that particular row was not modified, else modified.
    how can i achieve problem of large no of lines on screen?
    1. For this I am not sure about the transaction and its screen. Many times, for appending a row on the screen,
      there is a PLUS (+) button on the grid toolbar. So for every entry (regardless of the empty/filled rows already visible on the screen), we should use the + button, and the new row always appears on top, i.e. as row number 1.
    hope this helps.
    regards,
    amit m.

  • Analyze table after inserting a large number of records?

    For performance purposes, is it good practice to execute an 'analyze table' command after inserting a large number of records into a table in Oracle 10g, if a complex query follows the insert?
    For example:
    Insert into foo ...... //Insert one million records to table foo.
    analyze table foo COMPUTE STATISTICS; //analyze table foo
    select * from foo, bar, car...... //Execute a complex query without hints
    //after 1 million records inserted into foo
    Does this strategy help to improve the overall performance?
    Thanks.

    Different execution plans will most frequently occur when the ratio of the number of records in the various tables involved in the select has changed tremendously. This happens above all when 'fact' tables grow and 'lookup' tables stay constant.
    This is why you shouldn't test an application with a small number of 'fact' records.
    This can happen both with analyze table and dbms_stats.
    The advantage of dbms_stats is that it will export the current statistics to a stats table, so you can always revert to them using dbms_stats.import_table_stats.
    You can even overrule individual table and column statistics by artificial values.
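    For illustration, a minimal sketch of that dbms_stats workflow (schema, table, and stats-table names are placeholders):

        -- Back up the current statistics, then gather fresh ones
        BEGIN
          DBMS_STATS.CREATE_STAT_TABLE(ownname => 'SCOTT', stattab => 'STATS_BAK');
          DBMS_STATS.EXPORT_TABLE_STATS(ownname => 'SCOTT', tabname => 'FOO',
                                        stattab => 'STATS_BAK');
          DBMS_STATS.GATHER_TABLE_STATS(ownname => 'SCOTT', tabname => 'FOO');
          -- to revert later:
          -- DBMS_STATS.IMPORT_TABLE_STATS(ownname => 'SCOTT', tabname => 'FOO',
          --                               stattab => 'STATS_BAK');
        END;
        /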
    Hth
    Sybrand Bakker
    Senior Oracle DBA

  • Which is the best way for posting a large number of records?

    I have around 12000 registers to commit to the database.
    Which is the best way of doing it?
    What does it depend on?
    Nowadays I can't commit such a large number of registers. The database seems to hang!!!
    Thanks in advance

    Xavi wrote:
    Nowadays I can't commit such a large number of registers
    It should be possible to insert tens of thousands of rows in a few seconds using an insert statement, even with a complex query such as the all_objects view, and commit at the end.
    SQL> create table t as select * from all_objects where 0 = 1;
    Table created.
    Elapsed: 00:00:00.03
    SQL> insert into t select * from all_objects;
    32151 rows created.
    Elapsed: 00:00:09.01
    SQL> commit;
    Commit complete.
    Elapsed: 00:00:00.00
    I meant RECORDS instead of REGISTERS.
    Maybe that is where you are going wrong; records are for putting on turntables.

  • Fetching the number of records in a multi-record block...

    Hi ,
    In the Forms 10g runtime (and in previous releases too) there is a message showing the number of records fetched/inserted in a multi-record block, such as Record: 5/9. Is it possible to catch these two numbers (I mean 5 and 9, or at least the number of records), not only in query mode but in insert mode as well?
    Many thanks ,
    Simon

    No, you can't capture that text, but you can write your own code to do the same thing.
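    For illustration, one possible sketch in a WHEN-NEW-RECORD-INSTANCE trigger (block and display-item names are hypothetical; note that QUERY_HITS reflects the records retrieved so far rather than the full result set unless a COUNT_QUERY has been processed):

        DECLARE
          cur_rec NUMBER := TO_NUMBER(:SYSTEM.CURSOR_RECORD);
          total   NUMBER;
        BEGIN
          -- Records fetched so far for the block
          total := GET_BLOCK_PROPERTY('EMP_BLOCK', QUERY_HITS);
          -- Mimic the status line, e.g. "Record: 5/9"
          :CONTROL.REC_INDICATOR := cur_rec || '/' || total;
        END;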

  • Problem: fetching more than one record

    In the code below, I am receiving the error "fetch returns more than one record". Is there any way to solve this problem?
    declare
    cursor cur_test is
    select * from test_for_cursor where i=&i for update;
    r test_for_cursor%rowtype;
    r1 test_for_cursor.i%type;
    begin
    open cur_test;
    loop
    fetch cur_test into r;
    exit when cur_test%notfound;
    update test_for_cursor set i=r.i+2 where i=r.i returning i into r1;
    dbms_output.put_line(r1);
    end loop;
    close cur_test;
    end;

    Try this
    DECLARE
       CURSOR cur_test
       IS
          SELECT     *
                FROM test_for_cursor
               WHERE i = &i
          FOR UPDATE;
       r    test_for_cursor%ROWTYPE;
       TYPE r1_typ IS TABLE OF test_for_cursor.i%TYPE;
       r1   r1_typ;
    BEGIN
       OPEN cur_test;
       LOOP
          FETCH cur_test
           INTO r;
          EXIT WHEN cur_test%NOTFOUND;
          UPDATE    test_for_cursor
                SET i = r.i + 2
              WHERE i = r.i
          RETURNING i
          BULK COLLECT INTO r1;
          FOR j IN 1 .. r1.COUNT
          LOOP
             DBMS_OUTPUT.put_line (r1(j));
          END LOOP;
       END LOOP;
       CLOSE cur_test;
    END;

    Some correction.
    Message was edited by:
    michaels

  • Problems syncing large number of photos while choosing all photos option

    I have 17,800 photos and have experienced problems with trying to sync in iTunes since release day 1.
    If I choose to sync using the all photos option, the iPad will reboot during sync and appear to finish. The iPad will then become unresponsive and hang with the Apple logo on screen, and a hard reset will be required.
    Then when trying to launch photo app in iPad it displays "updating library" for a minute or 2 and then photo app crashes back to home screen.
    I have tried many options to solve this, including restoring the iPad software, deleting the photo cache in iTunes, rebuilding the iTunes library, etc.
    The only solution I have found was to break down my 17,800 photos into events and choose to sync events instead of the all photos option; that works, getting all 17,800 photos onto the iPad. The photos take about 5.8 GB of storage.
    There is definitely a problem with iTunes here in syncing a large number of photos using the all photos option, and I have reported this to AppleCare support via two support calls. There does not appear to be any awareness of this issue in tech support's knowledge base, and I am surprised that there are not a large number of incidents reported, as there are a few discussions started here on the issue of photo sync.
    Hopefully this post will help others with the same issue.
    Message was edited by: amfca

    Solved the problem of syncing an iPad with a desktop containing a large photo library (20,000 photos). In summary, the solution was to rename the iPhoto Library, copy it to an external hard drive, and reduce the size of the library on the home (desktop) drive.
    In the home directory, rename "iPhoto Library" to "iPhoto Global" (or any other name you want), and copy into an external hard drive via simple drag and drop method.
    Then, go back to the newly renamed iPhoto Global and rename it again, to iPhoto 2010. Now open iPhoto by holding down the Alt/Option key; this provides the option to choose the iPhoto 2010 library. Open the library, and eliminate every photo taken before 2010. This got us down to a few thousand photos and a much smaller library. Sync this smaller library with the iPad.
    Finally, I suggest downloading a program "iPhoto Library Manager" and using this to organize and access the two libraries you now have (which could be 2, 10 or however many different libraries you want to use.) The iPhoto program doesn't want you to naturally have more than one library, but a download called iPhoto Library Manager allows user to segregate libraries for different purposes (eg. personal, work, 2008, 2009, etc.). Google iPhoto Library Manager, download, and look for links to video tutorials which were very helpful.
    I did not experience any problems w/ iPhoto sequencing so can't address that concern.
    Good luck!

  • CLIENT_TEXT_IO - Hanging on "PUT" for large number of records

    I have successfully used CLIENT_TEXT_IO, but my users have run into an error where the Form hangs and spits out details such as:
    "oracle.forms.net.HTTPNStream.doFlush"
    etc....
    This happens when the number of records in the data block is high (e.g. 70,000 records). So my question is: is there a limit on how many lines you can write to a file?
    I'm just creating a CSV file on the client's machine using CLIENT_TEXT_IO.PUT_LINE. It works fine on, say, a few thousand records, but after that it hangs.
    I'm on Oracle Application Server 10g, Release 9.0.4 on Windows Server 2003, and forms compiled using Oracle Developer Suite 9.0.4.
    Thanks,
    Gio

    Hello,
    When working with huge amounts of data, it is better to generate the file on the application server and then get it back to the client.
    Read this article
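    For illustration, a hedged sketch of that approach (paths are placeholders; WEBUTIL_FILE_TRANSFER is part of WebUtil, which CLIENT_TEXT_IO already requires):

        DECLARE
          f  TEXT_IO.FILE_TYPE;
          ok BOOLEAN;
        BEGIN
          -- Write the CSV on the application server with plain TEXT_IO
          f := TEXT_IO.FOPEN('/tmp/export.csv', 'w');
          -- ... TEXT_IO.PUT_LINE(f, one_csv_line) for each record ...
          TEXT_IO.FCLOSE(f);
          -- Then ship it to the user's machine in a single transfer
          -- (client path first, then server path)
          ok := WEBUTIL_FILE_TRANSFER.AS_TO_CLIENT('C:\temp\export.csv',
                                                   '/tmp/export.csv');
        END;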
    Francois

  • Not receiving email when sending large number of records using a FM?

    Hi,
    I am using the function module SO_DOCUMENT_SEND_API1 to send email.
    When there is a single record, or around 5-6 records, the email arrives successfully.
    But when there are more records, say around 100, the email does not arrive. I checked transaction SOST and the status there is red, with the error message "Internal error: SO_OBJECT_MIME_GET Exception: 2".
    What could be the reason behind this problem?
    I have another problem: my output has over 60 fields, but the email I receive has only around 10 fields. How can I solve this?
    Please help.

    Well... right now I am trying to get only the first 2 fields, but even in this case I am not getting the email when around 15 records are there.
    I am using the code given below, which I found on SDN. In this code, data is selected from EKPO. I tried changing the number of rows selected, and in that case the attachment comes out as desired. But when I use the same code in my program, I do not get the mail, even if there are only 10 records or so.
    *& Report  ZT062108   ALV Header                                    *
    REPORT  zt062108.
    TABLES: ekko.
    PARAMETERS: p_email   TYPE somlreci1-receiver
                                      DEFAULT '<give email here>'.
    TYPES: BEGIN OF t_ekpo,
      ebeln TYPE ekpo-ebeln,
      ebelp TYPE ekpo-ebelp,
      aedat TYPE ekpo-aedat,
      matnr TYPE ekpo-matnr,
    END OF t_ekpo.
    DATA: it_ekpo TYPE STANDARD TABLE OF t_ekpo INITIAL SIZE 0,
          wa_ekpo TYPE t_ekpo.
    TYPES: BEGIN OF t_charekpo,
      ebeln(10) TYPE c,
      ebelp(5)  TYPE c,
      aedat(8)  TYPE c,
      matnr(18) TYPE c,
    END OF t_charekpo.
    DATA: wa_charekpo TYPE t_charekpo.
    DATA:   it_message TYPE STANDARD TABLE OF solisti1 INITIAL SIZE 0
                    WITH HEADER LINE.
    DATA:   it_attach TYPE STANDARD TABLE OF solisti1 INITIAL SIZE 0
                    WITH HEADER LINE.
    DATA:   t_packing_list LIKE sopcklsti1 OCCURS 0 WITH HEADER LINE,
            t_contents LIKE solisti1 OCCURS 0 WITH HEADER LINE,
            t_receivers LIKE somlreci1 OCCURS 0 WITH HEADER LINE,
            t_attachment LIKE solisti1 OCCURS 0 WITH HEADER LINE,
            t_object_header LIKE solisti1 OCCURS 0 WITH HEADER LINE,
            w_cnt TYPE i,
            w_sent_all(1) TYPE c,
            w_doc_data LIKE sodocchgi1,
            gd_error    TYPE sy-subrc,
            gd_reciever TYPE sy-subrc.
    *START_OF_SELECTION
    START-OF-SELECTION.
    *   Retrieve sample data from table ekpo
      PERFORM data_retrieval.
    *   Populate table with details to be entered into .xls file
      PERFORM build_xls_data_table.
    *END-OF-SELECTION
    END-OF-SELECTION.
    * Populate message body text
      perform populate_email_message_body.
    * Send file by email as .xls speadsheet
      PERFORM send_file_as_email_attachment
                                   tables it_message
                                          it_attach
                                    using p_email
    'Example .xls document attachment'
                                          'TXT'
                                          'filename'
                                 changing gd_error
                                          gd_reciever.
    *   Instructs mail send program for SAPCONNECT to send email(rsconn01)
      PERFORM initiate_mail_execute_program.
    *&      Form  DATA_RETRIEVAL
    *       Retrieve data from EKPO table and populate itab it_ekpo
    FORM data_retrieval.
      SELECT ebeln ebelp aedat matnr
       UP TO 1000 ROWS
        FROM ekpo
        INTO TABLE it_ekpo.
    ENDFORM.                    " DATA_RETRIEVAL
    *&      Form  BUILD_XLS_DATA_TABLE
    *       Build data table for .xls document
    FORM build_xls_data_table.
    *  CONSTANTS: con_cret TYPE x VALUE '0D'.  "OK for non Unicode
    *             con_tab TYPE x VALUE '09'.   "OK for non Unicode
    *If you have Unicode check active in program attributes then you will
    *need to declare constants as follows
    *class cl_abap_char_utilities definition load.
    constants:
        con_tab  type c value cl_abap_char_utilities=>HORIZONTAL_TAB,
        con_cret type c value cl_abap_char_utilities=>CR_LF.
      CONCATENATE 'EBELN' 'EBELP' 'AEDAT' 'MATNR'
             INTO it_attach SEPARATED BY con_tab.
      CONCATENATE con_cret it_attach  INTO it_attach.
      APPEND  it_attach.
      LOOP AT it_ekpo INTO wa_charekpo.
        CONCATENATE wa_charekpo-ebeln wa_charekpo-ebelp
                    wa_charekpo-aedat wa_charekpo-matnr
               INTO it_attach SEPARATED BY con_tab.
        CONCATENATE con_cret it_attach  INTO it_attach.
        APPEND  it_attach.
      ENDLOOP.
    ENDFORM.                    " BUILD_XLS_DATA_TABLE
    *&      Form  SEND_FILE_AS_EMAIL_ATTACHMENT
    *       Send email
    FORM send_file_as_email_attachment tables pit_message
                                              pit_attach
                                        using p_email
                                              p_mtitle
                                              p_format
                                              p_filename
                                              p_attdescription
                                              p_sender_address
                                              p_sender_addres_type
                                     changing p_error
                                              p_reciever.
      DATA: ld_error    TYPE sy-subrc,
            ld_reciever TYPE sy-subrc,
            ld_mtitle LIKE sodocchgi1-obj_descr,
            ld_email LIKE  somlreci1-receiver,
            ld_format TYPE  so_obj_tp ,
            ld_attdescription TYPE  so_obj_nam ,
            ld_attfilename TYPE  so_obj_des ,
            ld_sender_address LIKE  soextreci1-receiver,
            ld_sender_address_type LIKE  soextreci1-adr_typ,
            ld_receiver LIKE  sy-subrc.
      ld_email   = p_email.
      ld_mtitle = p_mtitle.
      ld_format              = p_format.
      ld_attdescription      = p_attdescription.
      ld_attfilename         = p_filename.
      ld_sender_address      = p_sender_address.
      ld_sender_address_type = p_sender_addres_type.
    * Fill the document data.
      w_doc_data-doc_size = 1.
    * Populate the subject/generic message attributes
      w_doc_data-obj_langu = sy-langu.
      w_doc_data-obj_name  = 'SAPRPT'.
      w_doc_data-obj_descr = ld_mtitle .
      w_doc_data-sensitivty = 'F'.
    * Fill the document data and get size of attachment
      CLEAR w_doc_data.
    * Count the attachment lines; w_cnt drives the size calculation below
      DESCRIBE TABLE it_attach LINES w_cnt.
      READ TABLE it_attach INDEX w_cnt.
      w_doc_data-doc_size =
         ( w_cnt - 1 ) * 255 + STRLEN( it_attach ).
      w_doc_data-obj_langu  = sy-langu.
      w_doc_data-obj_name   = 'SAPRPT'.
      w_doc_data-obj_descr  = ld_mtitle.
      w_doc_data-sensitivty = 'F'.
      CLEAR t_attachment.
      REFRESH t_attachment.
      t_attachment[] = pit_attach[].
    * Describe the body of the message
      CLEAR t_packing_list.
      REFRESH t_packing_list.
      t_packing_list-transf_bin = space.
      t_packing_list-head_start = 1.
      t_packing_list-head_num = 0.
      t_packing_list-body_start = 1.
      DESCRIBE TABLE it_message LINES t_packing_list-body_num.
      t_packing_list-doc_type = 'RAW'.
      APPEND t_packing_list.
    * Create attachment notification
      t_packing_list-transf_bin = 'X'.
      t_packing_list-head_start = 1.
      t_packing_list-head_num   = 1.
      t_packing_list-body_start = 1.
      DESCRIBE TABLE t_attachment LINES t_packing_list-body_num.
      t_packing_list-doc_type   =  ld_format.
      t_packing_list-obj_descr  =  ld_attdescription.
      t_packing_list-obj_name   =  ld_attfilename.
      t_packing_list-doc_size   =  t_packing_list-body_num * 255.
      APPEND t_packing_list.
    * Add the recipients email address
      CLEAR t_receivers.
      REFRESH t_receivers.
      t_receivers-receiver = ld_email.
      t_receivers-rec_type = 'U'.
      t_receivers-com_type = 'INT'.
      t_receivers-notif_del = 'X'.
      t_receivers-notif_ndel = 'X'.
      APPEND t_receivers.
      CALL FUNCTION 'SO_DOCUMENT_SEND_API1'
           EXPORTING
                document_data              = w_doc_data
                put_in_outbox              = 'X'
                sender_address             = ld_sender_address
                sender_address_type        = ld_sender_address_type
                commit_work                = 'X'
           IMPORTING
                sent_to_all                = w_sent_all
           TABLES
                packing_list               = t_packing_list
                contents_bin               = t_attachment
                contents_txt               = it_message
                receivers                  = t_receivers
           EXCEPTIONS
                too_many_receivers         = 1
                document_not_sent          = 2
                document_type_not_exist    = 3
                operation_no_authorization = 4
                parameter_error            = 5
                x_error                    = 6
                enqueue_error              = 7
                OTHERS                     = 8.
    * Populate zerror return code
      ld_error = sy-subrc.
    * Populate zreceiver return code
      LOOP AT t_receivers.
        ld_receiver = t_receivers-retrn_code.
      ENDLOOP.
    ENDFORM.
    *&      Form  INITIATE_MAIL_EXECUTE_PROGRAM
    *       Instructs mail send program for SAPCONNECT to send email.
    FORM initiate_mail_execute_program.
      WAIT UP TO 2 SECONDS.
      SUBMIT rsconn01 WITH mode = 'INT'
                    WITH output = 'X'
                    AND RETURN.
    ENDFORM.                    " INITIATE_MAIL_EXECUTE_PROGRAM
    *&      Form  POPULATE_EMAIL_MESSAGE_BODY
    *        Populate message body text
    form populate_email_message_body.
      REFRESH it_message.
      it_message = 'Please find attached a list test ekpo records'.
      APPEND it_message.
    endform.                    " POPULATE_EMAIL_MESSAGE_BODY

  • Slow record selection in tableView component with large number of records

    Hi experts,
    we have a Business Server Page (flow logic) with several htmlb:inputField's. As known from SAP standard we would like to offer value helper (F4) to the users for the ease of record selection.
    We use the onValueHelp() method of the inputField to open a extra browser window through JavaScript. In the popup another html-website is called, containing a tableView component with all available records. We use the SINGLESELECT mode for the table view.
    Everything works perfectly and efficiently until the tableView contains too many entries. If the number of possible entries is large, the whole component performs very, very slowly; for example, selecting a record can take more than one minute. Navigation between pages through the buttons at the bottom of the component also takes a lot of time. It seems that the tableView component cannot handle so many entries.
    We tried to switch between stateful and stateless mode, without success. Is there a way to perform the tableView selection without doing a server round trip? Any ideas and comments will be appreciated.
    Best regards,
    Sebastian

    Hi Raja,
    thank you for your hint. I took a look at sbspext_table/TableViewClient.bsp but did not really understand how the JavaScript coding works. Where is the JavaScript code in that example? Which file contains it?
    Meanwhile I implemented another way to avoid the server round trip:
    - Switch the page mode of the popup window to "Stateful"
    - Use the OnInitialization method like OnCreate (as shown in [using OnInitialization like OnCreate])
    - Limit the results of the SELECT statement with UP TO 1000 ROWS (see the sketch below)
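    For illustration, a minimal ABAP sketch of such a limited selection (table and variable names are placeholders):

        DATA lt_result TYPE STANDARD TABLE OF sflight.
        * Fetch at most 1000 rows to keep the tableView responsive
        SELECT * UP TO 1000 ROWS
          FROM sflight
          INTO TABLE lt_result.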
    Best regards,
    Sebastian

  • Processing of large number of records using JDBC Sender Channel

    Hi experts,
    We have a JDBC-to-File scenario where the tables contain about 500K records on average.
    I used multi-mapping to generate a flat file for every 10K records. The problem is that when I start the JDBC sender communication channel, memory usage goes up and the J2EE engine restarts. In the sender channel I set the Disconnect from Database option too. The query is SELECT * from TABLE and the update statement is <TEST>. Please help me work out how to solve this.
    Regards.

    Hi
    Use a query like the ones below to limit the number of rows fetched per poll:
    // Oracle
    select colname from tblname where rownum <= 1000;
    // MSSQL
    select top 1000 colname from tblname;
    Regards
    Ramg

  • Saving of data in a table having large number of records

    Hi,
    I'm working in Forms 6i against a 10g database.
    I have two tables, stock_head and stock_detail.
    The stock_detail table has millions of records.
    stock_detail has 3 database triggers.
    Saving data into these tables is very slow, even after disabling the triggers.
    Can anyone please help me with this?
    How can I improve the performance?
    Please help me...

    As always, the same things apply to this type of question:
    - No exact version numbers are provided
    - The problem description is way too vague to resolve the issue
    - The requestor doesn't read documentation
    - The requestor didn't use online resources, and didn't search this forum
    The central question always is:
    What is it waiting for?
    So you need to run ADDM and/or AWR reports, provided you are properly licensed, or statspack when you don't have a license for AWR/ADDM.
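    For illustration, the typical commands, run from SQL*Plus (the statspack steps assume the PERFSTAT schema has been installed):

        -- With a Diagnostics Pack license: generate an AWR report
        @?/rdbms/admin/awrrpt.sql
        -- Without a license: install statspack, snapshot around the slow save
        @?/rdbms/admin/spcreate.sql
        EXEC statspack.snap;
        -- ...perform the slow save, take a second snapshot, then report:
        EXEC statspack.snap;
        @?/rdbms/admin/spreport.sql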
    Apart from that no help is possible, as the post didn't contain a problem description other than 'It doesn't work, help'
    Sybrand Bakker
    Senior Oracle DBA

  • Large number of records in Sender JDBC

    Hi Guys,
    I have a requirement where a legacy database sends data, close to 4000 records, via a script to an MS Access database.
    Of the four thousand, the script fetches data up to 2 years old, whereas when the interface runs in XI it will run on a daily basis. How can I filter the above in XI? There is not even a timestamp in the antiquated legacy database.
    Also, could you throw some light on how I should get the connection details for connecting XI to MS Access?

    Hi
    Do this with server proxies:
    http://help.sap.com/saphelp_nw04/helpdata/en/2b/f49b21674e8c44940bb3beafd83d5c/content.htm
    /people/siva.maranani/blog/2005/04/03/abap-server-proxies
    /people/ravikumar.allampallam/blog/2005/03/14/abap-proxies-in-xiclient-proxy
    rgds
    srini
