Performance issue while transferring data from one itab to another itab

hi experts,
I have stored all the general material details in one internal table, and the description and valuation details of the material in another internal table, which is a standard table. Now I need to transfer all the data from these two internal tables into one final internal table, but it is taking a lot of time as it has to transfer lakhs (hundreds of thousands) of records.
I have declared the output table as shown below:
DATA:
  t_output TYPE STANDARD TABLE
           OF type_output
           INITIAL SIZE 0
           WITH HEADER LINE.
(According to our standards I have to declare it like this; the two source internal tables are declared similarly to the one above.)
Could somebody suggest how I should proceed with this?
Thanks in advance.
Regards,
Deepu

Have a look at the following article, which you may find useful:
  Improving performance in nested loops: http://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/40729734-3668-2910-deba-fa0e95e2c541
good luck, damian
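
In the spirit of that article, here is a minimal sketch of the usual fix: sort the lookup table once, then read it with BINARY SEARCH inside a single loop instead of nesting two loops. The names t_general, t_valuation, the work areas, and the key field matnr are assumptions; substitute your actual tables, types, and key field.

DATA:
  wa_general TYPE type_general,
  wa_val     TYPE type_valuation,
  wa_output  TYPE type_output.

SORT t_valuation BY matnr.                 " one-time sort of the lookup table

LOOP AT t_general INTO wa_general.
  CLEAR wa_output.
  MOVE-CORRESPONDING wa_general TO wa_output.
  READ TABLE t_valuation INTO wa_val
       WITH KEY matnr = wa_general-matnr
       BINARY SEARCH.                      " O(log n) lookup instead of a nested loop
  IF sy-subrc = 0.
    MOVE-CORRESPONDING wa_val TO wa_output.
  ENDIF.
  APPEND wa_output TO t_output.
ENDLOOP.

This turns an O(n*m) nested scan into roughly O(n log m), which at lakhs of rows is usually the difference between minutes and seconds.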

Similar Messages

  • Fatal error while transferring data from one bb to another

    I was trying to transfer data from one BB to a different BB using the BB Desktop Manager tool. It collected all the data from the first BB fine, and then while transferring data to the new one it said a fatal error occurred, try again. That BB will not even turn on now. No white screen or anything. It's like there is no operating system. I don't know what to do. I tried to connect it and just update the software but that isn't working either. It says it is not responding during initialization! I think I killed it... any ideas what to do???

    You can try to reload the OS in one of the following ways
    http://www.blackberry.com/btsc/KB03485
    http://blackberryfaq.net/index.php/How_do_I_wipe_the_BlackBerry_using_Jl_Cmder%3F

  • BADI for transferring data from one model to another model within a single appset

    Hello Experts,
    My business requirement is to transfer data from one model to another within the same environment, and I did this using the document below as a reference ("How to custom badi for replicating destination app"):
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e04b5d24-085f-2c10-d5a2-c1153a9f9346?QuickLink=index&…
    The document contains a transport request for implementing the BADI, but it supports BPC version 7.0 and we are using BPC 10.0.
    So I made all the compatible changes in the BADI implementation and activated it. I am now testing it in transaction UJKT using the following script.
    The result says the records were successfully written back to application MIS, but when I check, the data has not moved to the target application MIS.
    I am stuck on this in my project work. Please suggest; hoping for a positive reply.
    Script:
    *XDIM_MEMBERSET WM_ACCOUNT = WM_041,
    *XDIM_MEMBERSET WM_UOM_02 = UOM_004
    *XDIM_MEMBERSET WM_UD_2 = WM_07
    *XDIM_MEMBERSET WD_EXT_MAT_GRP =CHALK-PH-I,CHALK-PH-II,CHAVN-PH-I,CHAVN-PH-II,NASHIK-WM,RAJASTHAN-WM,TAMILNADU-WM
    *XDIM_MEMBERSET CATEGORY= Plan
    *XDIM_MEMBERSET AUDITTRAIL=Input
    *XDIM_MEMBERSET P_ENTITY = SIL
    *XDIM_MEMBERSET RPTCURRENCY = LC
    *START_BADI DAPP
       DESTINATION_APP ="MIS"
       RENAME_DIM ="WD_EXT_MAT_GRP= PRODUCT"
       ADD_DIM ="PLANT=NO_PLANT","MIS_ACCOUNTS=CAIN0058040008","COST_CENTER=NO_COST_CENTER","FLOW=Opening","UOM=AMT","CUSTOMER_SALES_2=NO_CUSTOMER","CATEGORY=Plan","AUDITTRAIL=Input"             
       DEBUG = ON
       WRITE = OFF
       QUERY = ON
    *END_BADI
    Please find attached result.
    Regards,
    Dipesh Mudras.

    Hello,
    Here is the manual for copying data between apps (it works with BPC NW 7.5):
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/b0480970-894f-2d10-f9a5-d4b1160be203?quicklink=index&overridelayout=true
    It works for me, but now I need to modify the script logic so that a "property" of the origin dimension is copied to the destination dimension as its "id", as follows:
    //*XDIM_MEMBERSET CATEGORY = %CATEGORY_SET%
    //*XDIM_MEMBERSET TIME = %TIME_SET%
    //*XDIM_MEMBERSET ENTITY = %ENTITY_SET%
    //*XDIM_MEMBERSET RPTCURRENCY =  %RPTCURRENCY_SET%
    *START_BADI FiltroPD
         WRITE = OFF
         APPL = $APPLICATION$
         ADD_DIM = "ORIGEN = APPVENTAS"
         ADD_DIM ="O_COSTE = no_input"
         ADD_DIM="CECO = no_input"
         RENAME_DIM="P_ACCT = RATIOS.P_ACCT "
    *END_BADI

  • Data load failed while loading data from one DSO to another DSO

    Hi,
    The data load failed on SID generation while loading data from the source DSO to the target DSO.
    The following errors occurred:
    Value "External Ref # 2421-0625511EXP  " (HEX 450078007400650072006E0061006C0020005200650066
    Error when assigning SID: Action VAL_SID_CONVERT, InfoObject 0BBP
    So I do not understand why the load succeeded in one DSO (the source) but failed in the other (the target).
    While analyzing, I found that "SIDs Generation upon Activation" is checked in the source DSO but not in the target DSO. Is that the reason it failed?
    Please explain.
    Thanks,
    Sneha

    Hi,
    I assume your data flow is designed so that the first DSO acts as a staging layer: all transformation rules and routines are maintained between the first and second DSO, and SID generation upon activation is set on the second DSO. That way the data in the first DSO is the same as in your source system, since no transformation rules or routines are applied there, which helps avoid data load failures.
    Please analyze the following:
    Have you loaded master data before the transaction data? If not, please do that first.
    Go to the properties of the first DSO and check whether "SID generation upon activation" is maintained there (I guess it may not be).
    Go to the properties of the second DSO and check whether "SID generation upon activation" is maintained there (I expect it is).
    This may be the reason.
    Also check whether any special characters are involved in your transaction data (even a lowercase letter).
    Regards
    BVR

  • Help transferring data from one account to another

    I bought my computer in August 2008 and had previously had an iBook. When transferring my data from my old computer to the new one, I guess I did it wrong, so instead of everything transferring to the same place it created two accounts.
    Is there a way to transfer my music/pics etc. to my main account?
    Also, my iPhone is backed up on the other account because that's where all my music is; is there a way to also keep all this data once I am using only the main account?

    When you use Migration Assistant, it transfers the account from the older machine to the newer one, because the user's name is involved in everything, like iTunes playlists etc.
    Since to run Migration Assistant you need to have gone through setup on the Mac and created the main Admin account, when you migrate, you transfer the other account as well, so you wind up with two accounts on the new machine.
    All you have to do is log into the Admin account of the first user you created, then change the account status of the migrated account to Admin and reboot. Check it out well, then delete the first Admin account at your leisure. However, it's best to leave two accounts on the machine, just in case you're locked out for some reason.
    If you would rather transfer your files from one account to another, use the shared Drop Box.
    If you have three accounts on the machine (the original admin and two migrated accounts), you'll just have to monkey around and clean up the mess.

  • Performance issues while querying data from a table with a large number of records

    Hi all,
    I have performance issues with queries on the mtl_transaction_accounts table, which has around 48,000,000 rows. One of the queries is as below:
    SQL ID: 98pqcjwuhf0y6 Plan Hash: 3227911261
    SELECT SUM (B.BASE_TRANSACTION_VALUE)
    FROM
    MTL_TRANSACTION_ACCOUNTS B , MTL_PARAMETERS A  
    WHERE A.ORGANIZATION_ID =    B.ORGANIZATION_ID 
    AND A.ORGANIZATION_ID =  :b1 
    AND B.REFERENCE_ACCOUNT =    A.MATERIAL_ACCOUNT 
    AND B.TRANSACTION_DATE <=  LAST_DAY (TO_DATE (:b2 ,   'MON-YY' )  )  
    AND B.ACCOUNTING_LINE_TYPE !=  15  
    call     count       cpu    elapsed       disk      query    current        rows
    Parse        1      0.00       0.00          0          0          0           0
    Execute      3      0.02       0.05          0          0          0           0
    Fetch        3    134.74     722.82     847951    1003824          0           2
    total        7    134.76     722.87     847951    1003824          0           2
    Misses in library cache during parse: 1
    Misses in library cache during execute: 2
    Optimizer mode: ALL_ROWS
    Parsing user id: 193  (APPS)
    Number of plan statistics captured: 1
    Rows (1st) Rows (avg) Rows (max)  Row Source Operation
             1          1          1  SORT AGGREGATE (cr=469496 pr=397503 pw=0 time=237575841 us)
        788242     788242     788242   NESTED LOOPS  (cr=469496 pr=397503 pw=0 time=337519154 us cost=644 size=5920 card=160)
             1          1          1    TABLE ACCESS BY INDEX ROWID MTL_PARAMETERS (cr=2 pr=0 pw=0 time=59 us cost=1 size=10 card=1)
             1          1          1     INDEX UNIQUE SCAN MTL_PARAMETERS_U1 (cr=1 pr=0 pw=0 time=40 us cost=0 size=0 card=1)(object id 181399)
        788242     788242     788242    TABLE ACCESS BY INDEX ROWID MTL_TRANSACTION_ACCOUNTS (cr=469494 pr=397503 pw=0 time=336447304 us cost=643 size=4320 card=160)
       8704356    8704356    8704356     INDEX RANGE SCAN MTL_TRANSACTION_ACCOUNTS_N3 (cr=28826 pr=28826 pw=0 time=27109752 us cost=28 size=0 card=7316)(object id 181802)
    Rows     Execution Plan
          0  SELECT STATEMENT   MODE: ALL_ROWS
          1   SORT (AGGREGATE)
    788242    NESTED LOOPS
          1     TABLE ACCESS   MODE: ANALYZED (BY INDEX ROWID) OF
                    'MTL_PARAMETERS' (TABLE)
          1      INDEX   MODE: ANALYZED (UNIQUE SCAN) OF
                     'MTL_PARAMETERS_U1' (INDEX (UNIQUE))
    788242     TABLE ACCESS   MODE: ANALYZED (BY INDEX ROWID) OF
                    'MTL_TRANSACTION_ACCOUNTS' (TABLE)
    8704356      INDEX   MODE: ANALYZED (RANGE SCAN) OF
                     'MTL_TRANSACTION_ACCOUNTS_N3' (INDEX)
    Elapsed times include waiting on following events:
      Event waited on                             Times   Max. Wait  Total Waited
      ----------------------------------------   Waited  ----------  ------------
      row cache lock                                 29        0.00          0.02
      SQL*Net message to client                       2        0.00          0.00
      db file sequential read                    847951        0.40        581.90
      latch: object queue header operation            3        0.00          0.00
      latch: gc element                              14        0.00          0.00
      gc cr grant 2-way                               3        0.00          0.00
      latch: gcs resource hash                        1        0.00          0.00
      SQL*Net message from client                     2        0.00          0.00
      gc current block 3-way                          1        0.00          0.00
    ********************************************************************************
    On a 5-node RAC environment the program completes in 15 hours, whereas on a single-node environment the program completes in 2 hours.
    Is there any way I can improve the performance of this query?
    Regards

    CREATE INDEX mtl_transaction_accounts_n0
      ON mtl_transaction_accounts (
                                   transaction_date
                                 , organization_id
                                 , reference_account
                                 , accounting_line_type
                                  )
    /
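
    If the optimizer does not pick the new index on its own, a hint can force it for testing. This is just the poster's query with a standard Oracle INDEX hint naming the index created above:

    SELECT /*+ INDEX(b mtl_transaction_accounts_n0) */
           SUM(b.base_transaction_value)
      FROM mtl_transaction_accounts b, mtl_parameters a
     WHERE a.organization_id = b.organization_id
       AND a.organization_id = :b1
       AND b.reference_account = a.material_account
       AND b.transaction_date <= LAST_DAY(TO_DATE(:b2, 'MON-YY'))
       AND b.accounting_line_type != 15;

    With the range predicate on transaction_date leading the index, it is also worth comparing a variant that leads with the equality columns (organization_id, reference_account); which column order wins depends on the data distribution.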

  • Performance issue while extracting data from non-APPS schema tables.

    Hi,
    I have developed an ODBC application to extract data from Oracle E-Business Suite 12 tables.
    E.g. I am trying to extract data from the HZ_IMP_PARTIES_INT table of the Receivables application (the table is in the "AR" database schema) using this ODBC application.
    The performance of extraction (i.e. the number of rows extracted per second) is very low if the table belongs to a non-APPS schema (e.g. HZ_IMP_PARTIES_INT belongs to the "AR" database schema).
    Now, if I create the same table (HZ_IMP_PARTIES_INT) with the same data in the "APPS" schema, the performance of extraction improves a lot (i.e. the number of rows extracted per second increases a lot) with the same ODBC application.
    The "APPS" user is used to connect to the database in both scenarios.
    Also note that my ODBC application creates multiple threads, and each thread creates its own connection to the database. Each thread extracts different data using a SQL filter condition.
    So, my question is:
    Is APPS schema optimized for any data extraction?
    I will really appreciate any pointer on this.
    Thanks,
    Rohit.

    Hi,
    Is APPS schema optimized for any data extraction? I would say NO, as data extraction performance should be the same for all schemas. Do you run the "Gather Schema Statistics" concurrent program for ALL schemas on a regular basis?
    Regards,
    Hussein
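
    For reference, gathering statistics for one schema from SQL looks roughly like this. In EBS the supported route is the "Gather Schema Statistics" concurrent program; this sketch just shows the underlying call, with the "AR" schema from the thread as the example:

    BEGIN
      DBMS_STATS.GATHER_SCHEMA_STATS(
        ownname          => 'AR',
        estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,  -- let Oracle pick the sample size
        cascade          => TRUE);                        -- gather index statistics too
    END;
    /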

  • Help transferring data from one computer to another

    My aunt uses Palm Desktop as her calendar but does not have a palm pda or phone.
    Her computer at home recently died but she was able to get the palm installation folder from the old hard drive.
    She has Palm Desktop installed at work but the information in it is totally different than what she had at home.  She wants to import the data from her home installation into the installation she has at work.
    It used to be really easy to transfer settings from one Palm installation to another.  Side note: My wife has a Palm Treo 650 and I recently formatted her computer and tried to copy her old installation folder from a backup to the new install directory.  Well, it didn't work like it used to.  I don't know if the software has changed or what.  But, she didn't get everything.  It's been a while, I can't remember if she didn't get ANY of it, or just some.  I remember at one point it only showed her contacts for A-M...N-Z were just plain gone.  Really strange.
    But, back on topic, I tried creating a new user in Palm Desktop and then copying the files under her old username folder to the new username folder in the Palm install directory. But Palm Desktop doesn't show anything for the new username.
    How should I go about doing this for her?
    Thanks!
    Post relates to: Treo 650 (Alltel)

    The user on her home install was also Trisha.  So, I created a new user on the work computer, "Trisha2".  Then I closed Palm and copied the contents of her backed-up Trisha user folder from home into the new Trisha2 folder on the work install.  Then I started Palm and flipped to the Trisha2 user.  Nothing showed up.  So, I've pretty much done the exact process from the link already.  Which is exactly the way I thought it should work and have had it work in the past.
    I'll try it again without using numbers in the username and see what happens.
    Thanks!
    Post relates to: Treo 650 (Alltel)

  • Transferring data from one table to another table using a key column in SSIS, row by row, dynamically

    Hi All,
    I have a stored procedure (SP) which has an output variable named "ActivityID", which is the key column. In this SP, data is transformed from one table and inserted into another table. I have to execute the SP and insert data row by row
    using the output variable "ActivityID", whose value will keep changing. How can I do it?
    Thanks,
    Kallu

    Value changing on a row-by-row basis? Not quite sure what you mean, but it seems that you want to use the results of an insert into one table as input for another. If so, SSIS is not needed; inside the stored procedure use SQL that does it for
    all records at once, e.g.:
    INSERT INTO dbo.Table2 (A)
    OUTPUT INSERTED.A INTO dbo.MyTable
    SELECT A FROM dbo.Table1;
    Arthur
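
    If the point is to capture a generated key such as ActivityID for every inserted row, the usual T-SQL pattern is OUTPUT into a table variable. A sketch with hypothetical table and column names (ActivityID is assumed to be an IDENTITY column on the target table):

    DECLARE @NewIds TABLE (ActivityID INT);

    INSERT INTO dbo.TargetTable (SomeColumn)
    OUTPUT INSERTED.ActivityID INTO @NewIds (ActivityID)
    SELECT SomeColumn FROM dbo.SourceTable;

    -- @NewIds now holds one ActivityID per inserted row
    SELECT ActivityID FROM @NewIds;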

  • Best method for transferring data from one database to another?

    There is an 8i database I have to deal with on a regular basis. Besides being completely outdated and unsupported, in my opinion the database has not been well-maintained--for a table with 25 million records, should ‘select count(*) from table;’ take 5 minutes to run? I don’t really know, but that seems long to me. Many complex queries (most including only tables with less than a million records) take a ridiculously long time to run, to the point that I can’t even run some of them.
    I am not the DBA; I don't have the authority to fiddle with the database (nor would I feel comfortable doing that), and the powers that be will not put effort into improving the functionality of this database due to an alleged plan to update/replace it within the next year. However, in the meantime, I still have to get data out of this database on a regular basis.
    I have XE 10g installed on my local machine, and I have set up a database link in it to the 8i database. I have found that I can pull in basic data (simple queries) from the 8i database into tables in my XE database (e.g. create table tbl1 as select data from tbl1@8idb) and then query those tables to get the information I need much, much faster (including creation of the tables). While this option does not allow me to create queries/reports that other people can run, it makes work I’m doing only for myself much faster.
    What I’m wondering is, what is the best way to bring the information I need over to my database? I usually don’t need entire tables, and I can probably identify a number of key tables (or parts of tables) I need. What I’ve been doing up until now is writing CREATE TABLE statements on the fly, but then I end up forgetting what all I’ve done, and each time I want up-to-date data, I have to drop the tables and re-create them. It seems to me that there should be an easier way to do this than to copy and paste from a text document into SQL*Plus.
    Does anyone have any suggestions for me on how best to do this?

    Sorry, I guess I posted this in the wrong forum. I re-posted in the database-general forum.
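
    For readers with the same problem: a materialized view over the database link gives the same effect as the poster's CREATE TABLE ... AS SELECT, with a built-in refresh instead of the drop-and-recreate cycle. A sketch, assuming Oracle on the local side; the source table name is the poster's, while the link name old8i and the daily schedule are assumptions:

    CREATE MATERIALIZED VIEW tbl1_local
      BUILD IMMEDIATE
      REFRESH COMPLETE                      -- full re-pull each time; an 8i source is unlikely to support fast refresh
      START WITH SYSDATE NEXT SYSDATE + 1   -- refresh once a day
    AS
      SELECT * FROM tbl1@old8i;

    An on-demand refresh is then EXEC DBMS_MVIEW.REFRESH('TBL1_LOCAL');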

  • Transferring data from one DB to another DB

    Hi Experts,
    In the production database there are 30 tables in different schemas.
    Some tables hold 5 years of data, some 8 years, and some 6 years.
    All of these tables have millions to billions of records.
    Let us assume the tables in the production database are:
    Table1 -- 8 years of data -- No. of records: 3538969000
    Table2 -- 6 years of data -- No. of records: 592844435
    Table3 -- 3 years of data -- No. of records: 33224993
    Table4 -- 4 years of data -- No. of records: 52361756
    Table5 -- 5 years of data -- No. of records: 8948567
    Table15 -- 6 years of data -- No. of records: 308476987
    Now I want to transfer the data of these 15 tables to the test database, based on the following conditions:
    For Table1 I want to transfer 6 years of data to the test database, keep 2 years of data in production, and delete those 6 years of data from the production database.
    For Table2 I want to transfer 4 years of data to the test database, keep 2 years of data in production, and delete those 4 years of data from the production database.
    For Table3 I want to transfer 2 years of data to the test database, keep 1 year of data in production, and delete those 2 years of data from the production database.
    This will be done periodically, i.e. suppose we run the script now and transfer the requested data to the test database; after one year, when the data in production has grown again, we have to run the script again and transfer data based on some conditions, so it should keep working for a long time.
    Please help me with a good and fast process to transfer the data.
    Thanks in advance.

    Thanks for your information.
    Hi,
    We want to make the process dynamic so that if any more tables need to be archived in future, only one script has to be executed.
    I have created the control table below to pass ID, TABLE_NAME, ARCHIVE_TABLE_NAME, and WHERE_CONDITION dynamically.
    ID  TABLE_NAME  ARCHIVE_TABLE_NAME  WHERE_CONDITION
    1   Table1      archive_table1      created_date < ADD_MONTHS(SYSDATE,-24)
    1   Table2      archive_table2      accepted_date < ADD_MONTHS(SYSDATE,-24)
    1   Table3      archive_table3      last_proposal_date < ADD_MONTHS(SYSDATE,-24)
    1   Table4      archive_table4      last_modified_date < ADD_MONTHS(SYSDATE,-24)
    2   Table5      archive_table5      changed_date < ADD_MONTHS(SYSDATE,-24)
    2   Table6      archive_table6      received_date < ADD_MONTHS(SYSDATE,-24)
    3   Table7      archive_table7      send_date < ADD_MONTHS(SYSDATE,-24)
    3   Table8      archive_table8      exact_date < ADD_MONTHS(SYSDATE,-24)
    I want to pass the ID from the job; the procedure should then execute for all the tables with the respective ID.
    This is my job:
    DECLARE
      x NUMBER;
    BEGIN
      SYS.DBMS_JOB.SUBMIT
      ( job       => x
       ,what      => 'APPS.myprocedure(1);'
       ,next_date => TO_DATE('24/12/2012 00:00:00','dd/mm/yyyy hh24:mi:ss')
       ,interval  => 'TRUNC(SYSDATE+1)'
       ,no_parse  => FALSE
      );
      SYS.DBMS_OUTPUT.PUT_LINE('Job Number is: ' || TO_CHAR(x));
      COMMIT;
    END;
    /
    For example, if I pass the ID as '1' through the job, the procedure should execute for:
    1   Table1     archive_table1    created_date < ADD_MONTHS(SYSDATE,-24)
    1   Table2     archive_table2    accepted_date < ADD_MONTHS(SYSDATE,-24)
    1   Table3     archive_table3    last_proposal_date < ADD_MONTHS(SYSDATE,-24)
    i.e.
    INSERT INTO archive_table1
    (SELECT * FROM Table1 WHERE created_date < ADD_MONTHS(SYSDATE,-24));
    DELETE FROM Table1 WHERE created_date < ADD_MONTHS(SYSDATE,-24);
    INSERT INTO archive_table2
    (SELECT * FROM Table2 WHERE accepted_date < ADD_MONTHS(SYSDATE,-24));
    DELETE FROM Table2 WHERE accepted_date < ADD_MONTHS(SYSDATE,-24);
    INSERT INTO archive_table3
    (SELECT * FROM Table3 WHERE last_proposal_date < ADD_MONTHS(SYSDATE,-24));
    DELETE FROM Table3 WHERE last_proposal_date < ADD_MONTHS(SYSDATE,-24);
    If I pass ID as 2 through the job, the procedure should execute for the respective table names and where-conditions.
    I have written the following code, but it's not giving the expected result:
    CREATE OR REPLACE PROCEDURE myprocedure (p_id IN NUMBER)
    IS
       CURSOR c IS
          SELECT id, table_name, archive_table_name, where_condition
            FROM apps_global.control_ram
           WHERE id = p_id
           ORDER BY id, table_name;
    BEGIN
       FOR i IN c
       LOOP
          EXECUTE IMMEDIATE 'INSERT INTO ' || i.archive_table_name ||
                            ' (SELECT * FROM ' || i.table_name ||
                            ' WHERE ' || i.where_condition || ')';
          EXECUTE IMMEDIATE 'DELETE FROM ' || i.table_name ||
                            ' WHERE ' || i.where_condition;
       END LOOP;
    END myprocedure;
    /
    Please help me.
    Thanks in advance.

  • Transferring Data from One table to another

    Hi all,
    Can we transfer data from a cluster table to a Z table (transparent table)?
    If yes, how can we do that?
    Thanks
    Devinder

    Hi,
    Generating a (surrogate) primary key value is usually done using:
    A) an Oracle sequence to generate a number, combined with either
    B1) a reference to the NEXTVAL of the sequence in your insert statements,
    or
    B2) a trigger on your agents table that populates agent_id by referring to the NEXTVAL of the sequence.
    Both approaches have pros and cons; see what best fits your requirement.
    See this explanation (including examples):
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:6575961912937#tom6583848685931
    Also, since you're new to SQL, check the [Oracle docs|http://www.oracle.com/pls/db102/homepage]
    (or http://www.oracle.com/pls/db112/homepage if you're on 11g),
    especially the Concepts and the Fundamentals guides.
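
    A minimal sketch of option B2, using the agents table and agent_id column mentioned above (the sequence and trigger names are assumptions):

    CREATE SEQUENCE agents_seq;

    CREATE OR REPLACE TRIGGER agents_bi
       BEFORE INSERT ON agents
       FOR EACH ROW
    BEGIN
       -- Direct assignment works on 11g and later; on older releases use
       -- SELECT agents_seq.NEXTVAL INTO :NEW.agent_id FROM dual;
       :NEW.agent_id := agents_seq.NEXTVAL;
    END;
    /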

  • Transferring data from one schema to another

    Hi:
    I am looking for an efficient way to transfer all the records from one table in a certain schema to its identical counterpart in another schema. Both schemas are in the same instance. The tables are "identical" in that they have the same columns as metadata. Any suggestions?
    I would like this to be as quick a transfer as possible, so if it would speed things up, I would like to consider deferring the update of the receiving table's indexes until after the records have moved. Is that sort of thing possible? A less attractive approach would be to drop all the indexes, then transfer the records, then recreate all the indexes.
    The receiving table will be truncated just prior to the transfer, so perhaps all forms of journaling/undo can be shut off (there is no need to recover if the transfer fails).
    Thanks for any help

    You could certainly drop and recreate the indexes as another part of the answer, but the fastest transfer would be a direct-path insert:
    insert /*+ append */ into target
      select * from source;
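
    And a sketch of the deferred-index variant the question asks about, with a hypothetical index name. Note that an unusable unique index still blocks DML, so this suits non-unique indexes; otherwise drop and recreate as suggested above:

    ALTER INDEX target_idx1 UNUSABLE;                -- stop maintaining the index
    ALTER SESSION SET skip_unusable_indexes = TRUE;  -- let the insert ignore it

    INSERT /*+ append */ INTO target
      SELECT * FROM source;
    COMMIT;

    ALTER INDEX target_idx1 REBUILD NOLOGGING;       -- rebuild once, after the load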

  • Transferring data from one account to another

    I just discovered that the user files from the second account on my iMac G5 have somehow been transferred into the Trash of my first account (which is my main one). In the second account I was getting the following drop-down message: "the home folder is not accessed in the usual place", and was unable to use it. Can someone please tell me how I can transfer the user files back? Thanks

    If you are suggesting I drop the User folder I have in the Trash into my User folder on the hard disk, surely it will stay in my first account?
    No, definitely not into your own user folder, but into the folder called Users, which also has all the other short names in it.
    The one in the Trash isn't called Users, is it? If so, open it up, just look for the other short-name folder, and move that.

  • Transferring data from one table to another

    I have two tables. Some information exists in one of the tables; this information is defined at the broadest possible level, e.g. level A has one row.
    I have to transfer this information to the other table when asked to by the user, but the problem is that the other table needs to have 64 rows created corresponding to each single row in the first table.
    This needs to be done as quickly as possible, so that the system does not hang and the other processing logic continues.
    Moreover, the first table can have more rows, and each row may have 64 instances created in the other table when asked to by the user.
    Please help if anyone has an answer to this query. Any kind of algorithm will help.

    Walk through the first table and, for each row, make 64 inserts; see the set-based sketch below for an alternative.
    Do it in one transaction (i.e. without autocommit) to speed it up a bit and improve data consistency. Don't forget to commit at the end. If that seems too slow, you can always write a stored procedure that does the same, to avoid the round-trip time for each insert statement if the database is on another machine...
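
    A set-based alternative that creates all 64 rows per source row in one statement, assuming an Oracle database; the table and column names are hypothetical:

    -- CONNECT BY LEVEL generates the numbers 1..64 on the fly.
    INSERT INTO detail_table (key_col, instance_no, payload)
    SELECT f.key_col, g.n, f.payload
      FROM first_table f
      CROSS JOIN (SELECT LEVEL AS n
                    FROM dual
                 CONNECT BY LEVEL <= 64) g;
    COMMIT;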
