Suggestions to improve the performance of updating sales orders

Hi all,
I am uploading data from a local file to update sales orders by changing the partner function,
and I am using 'BAPI_SALESORDER_CHANGE' for the update. It is taking a long time to update the
sales orders. Can anyone suggest how to improve the performance of updating a sales order with a new partner function?
Thanks & Best Regards,
Vishnu.

Run an SE30 (runtime analysis) trace on your program.
Check which action costs the most time.
The result could be --> the update (maybe make only one commit at the end, instead of one commit per BAPI call).
A selection --> maybe change the selection so that it uses an index.
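
For illustration, a minimal ABAP sketch of the single-commit pattern (the upload structure, the field names, and the use of the PARTNERCHANGES table are assumptions for illustration, not details from your program - please verify the BAPIPARNRC structure and the BAPI parameters in your system):

TYPES: BEGIN OF ty_upload,        " assumed layout of the local file
         vbeln TYPE vbeln_va,     " sales order number
         parvw TYPE parvw,        " partner function to change
         kunnr TYPE kunnr,        " new partner number
       END OF ty_upload.

DATA: lt_upload  TYPE STANDARD TABLE OF ty_upload,
      ls_upload  TYPE ty_upload,
      ls_headerx TYPE bapisdh1x,
      lt_partner TYPE STANDARD TABLE OF bapiparnrc,
      ls_partner TYPE bapiparnrc,
      lt_return  TYPE STANDARD TABLE OF bapiret2.

" ... fill lt_upload from the local file, e.g. with GUI_UPLOAD ...

ls_headerx-updateflag = 'U'.                 " tell the BAPI this is a change

LOOP AT lt_upload INTO ls_upload.
  CLEAR: lt_partner, lt_return.

  ls_partner-document   = ls_upload-vbeln.
  ls_partner-itm_number = '000000'.          " header-level partner
  ls_partner-updateflag = 'U'.               " change an existing partner
  ls_partner-partn_role = ls_upload-parvw.
  ls_partner-p_numb_new = ls_upload-kunnr.
  APPEND ls_partner TO lt_partner.

  CALL FUNCTION 'BAPI_SALESORDER_CHANGE'
    EXPORTING
      salesdocument    = ls_upload-vbeln
      order_header_inx = ls_headerx
    TABLES
      return           = lt_return
      partnerchanges   = lt_partner.

  " ... check lt_return for type 'E'/'A' messages per order ...
ENDLOOP.

" One commit for the whole batch instead of one commit per BAPI call.
CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
  EXPORTING
    wait = 'X'.

Note that deferring the commit keeps all changes in one LUW until the end, so whether this is safe depends on your data volume and on locking; measure with SE30 before and after.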

Similar Messages

  • Need suggestions on improving the performance end to end

    I referred to the following links and as many as 25 previous posts before posting this question:
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/10b54994-f569-2a10-ad8f-cf5c68a9447c
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/402fae48-0601-0010-3088-85c46a236f50
    https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/e0068bc1-6f8c-2a10-52bb-c6ee3562feb2
    /people/boris.zarske/blog/2007/06/13/sizing-a-system-for-the-system-landscape-directory-of-sap-netweaver
    I have some queries related to improving the performance of a scenario in my landscape.
    Scenario : IDOC-SOAP (Synchronous with BPM)
    The mapping is simple: a direct mapping of 3 fields.
    It works great, with no problems, in normal cases for a single message transfer.
    But on the initial load (when we put it in a new system), the scenario generates errors if we send more than 23 IDocs at a time.
    I have decided to perform the following to solve it.
    /people/william.li/blog/2008/03/07/step-by-step-guide-in-processing-high-volume-messages-using-pi-71s-message-packaging
    Will it help me, given that it's a synchronous IDoc scenario?
    What can I do to improve the performance of this scenario?
    Please give me good suggestions to improve the performance.
    Please list the points that I have to perform (for a sync scenario with BPM), as I am already confused from watching lots of blog posts.
    Nikhil.

    Do you think that the performance tuning I mentioned in the link will hold good for sync scenarios?
    I don't think so. In this scenario, the async message processing is made to wait.
    from the blog:
    As you can see, for the 1st minute, all the messages are waiting to be processed. After 60 seconds,
    the packages will be created and processed. There is no change to the monitoring of each of the
    individual messages in SXI_MONITOR.
    But if you make a sync scenario wait, then you probably run the risk of blocking the queues.
    If this were going to production, I would be more careful, because firstly it is BPM, then synchronous BPM, and then a processing wait. Normally, do not try to make BPM processing wait - my small suggestion.

  • Does anyone have any suggestions to improve the performance of Firefox for Android (more details)?

    I'm using it on a Samsung Captivate SGH-i897 with Jellybean 2.2 (I'm unable to upgrade the OS at this time). Firefox was recommended by a friend; however, so far I've been very disappointed with it, as it is extremely slow and constantly crashes. I've tried all of the suggestions provided, but there is no improvement. I realize it may not be all that compatible with this otherwise properly working device or OS. I have a Samsung Galaxy SGH-i727 Skyrocket with IC 4.1; however, I've been waiting for a part for it coming from Asia, which seems to be taking forever, so in the meantime I'm stuck with this. I was wondering if you may have any suggestions on how to speed it up and prevent it from crashing so much, other than what's in your help guide, since the changes I've tried make no improvement in its performance. I would very much prefer to use Firefox on this device, as well as on my other device once I repair it, for a number of other reasons; but if I'm unable to improve it, I'll just go back to what I was using. Thank you and be well.
    twich83115
    [ed. removed email]

    Suggestion for improvement:
    I'd like <select size="1" multiple> to provide a dropdown with checkboxes like this:
    http://demos.telerik.com/aspnet-ajax/combobox/examples/functionality/checkboxes/defaultcs.aspx
    Can that be done?
    Thanks

  • How to improve the performance of Swing Application?

    My project is a Swing-based application which uses the Java 1.2.1 API.
    The problem is that my application is very slow and hangs many times, in spite of the system configuration being good (256 MB RAM, 733 MHz processor).
    Do give me some suggestions to improve the performance.
    Regards,
    sudhakar

    The system configuration is more than enough to run Java applications.
    You are probably doing time-consuming operations in the event thread, which blocks it and makes the GUI seem unresponsive. If so, you have a very bad design.
    Use a separate thread for time-consuming operations.

  • Error message when updating Sales Orders

    Hi Experts
    Following an issue over the weekend, when I was forced to perform a hard reset, we are now experiencing an error message when trying to update some sales orders:
    "This entry already exists in the following tables" ADO1 (ODBC - 2035) [Message 131-183]
    It is not happening on all orders - I think only the ones which were on the system prior to the reset.
    We also use webtools and the B1SyncService seems to be causing the SAP B1 system to become unusable - following each attempt to sync there is an error in the event log:
    Event Type:     Error
    Event Source:     B1SynchService
    Event Category:     None
    Event ID:     0
    Date:          29/06/2009
    Time:          13:42:38
    User:          N/A
    Computer:     SQL01
    Description:
    A connection was successfully established with the server, but then an error occurred during the login process. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)
       at netpoint.api.data.DataFunctions.ExecuteScalar(String SQL, String connectionstring)
       at NetPoint.SynchSBO.SBOObjects.SBOUtility.SetCompany(Company TheCompany, SecurityTicket securityTicket)
       at NetPoint.SynchSBO.Synch.SetCompany()
       at NetPoint.SynchSBO.Synch..ctor(SecurityTicket ticket)
       at NetPoint.SynchSBO.Synch..ctor(SecurityTicket ticket, Int32 pricinginterval)
       at NetPoint.SynchService.NPSynchService.Synch(String profile, Mutex mutex)
       at NetPoint.SynchService.NPSynchService.Main(String[] args)
    For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.
    For now I have had to stop the SynchService to prevent B1 from becoming unusable (users are unable to start any AR module), but even after stopping it I still get the error when updating sales orders.
    Regards
    Jon

    Dear Johnny,
    The error reported sounds like DB corruption. I would advise you to log a message with SAP Support, including all the details, in order to get the error investigated. It is also a good idea, and will speed up the process, if you include some screenshots showing exactly the error message.
    Hope my reply helps you to solve the issue.
    Regards,
    Wesley Honorato

  • Help to improve the performance of a procedure.

    Hello everybody,
    First, to introduce myself: my name is Ivan and I recently started learning SQL and PL/SQL, so don't go hard on me. :)
    Now let's jump to the problem. What we have is a table (a big one, but we'll need only a few fields) with some information about calls; it is called table1. There is also another table with exactly the same structure, which is empty, and we have to transfer the records from the first one into it.
    The shorter calls (less than 30 minutes) have segmentID = 'C1'.
    The longer calls (more than 30 minutes) are recorded as more than one record (1 for every 30 minutes). The first record (first 30 minutes of the call) has segmentID = 'C21'. It is the first so we have only one of these for every different call. Then we have the next (middle) parts of the call, which have segmentID = 'C22'. We can have more than 1 middle part and again the maximum minutes in each is 30 minutes. Then we have the last part (again max 30 minutes) with segmentID = 'C23'. As with the first one we can have only one last part.
    So far, so good. Now we need to insert these call records into the second table. The C1 are easy - one record = one call. But the partial ones we need to combine so they become one whole call. This means that we have to take one of the first parts (C21), find if there is a middle part (C22) with the same calling/called numbers and with 30 minutes difference in date/time, then search again if there is another C22 and so on. And last we have to search for the last part of the call (C23). In the course of these searches we sum the duration of each part so we can have the duration of the whole call at the end. Then we are ready to insert it in the new table as a single record, just with new duration.
    But here comes the problem with my code... The table has A LOT of records, and this solution, despite the fact that it works (at least in the tests I've made so far), is REALLY slow.
    As I said, I'm new to PL/SQL and I know this solution is really newbish, but I can't find another way of doing it.
    So I decided to come here and ask you for some tips on how to improve its performance.
    I think you are getting confused already, so I'm just going to put some comments in the code.
    I know it's not a procedure as it stands now, but it will be once I write better code. I don't think that matters for now.
    DECLARE
    CURSOR cur_c21 IS
        select * from table1
        where segmentID = 'C21'
        order by start_date_of_call;     -- start_date_of_call holds the beginning of this part of the call; it's a DATE
    CURSOR cur_c22 IS
        select * from table1
        where segmentID = 'C22'
        order by start_date_of_call;
    CURSOR cur_c22_2 IS
        select * from table1
        where segmentID = 'C22'
        order by start_date_of_call;  
    cursor cur_c23 is
        select * from table1
        where segmentID = 'C23'
        order by start_date_of_call;
    v_temp_rec_c22 cur_c22%ROWTYPE;
    v_dur table1.duration%TYPE;           -- used to store the duration of the whole call; it's a NUMBER
    BEGIN
    insert into table2
    select * from table1 where segmentID = 'C1';     -- insert the calls which are less than 30 minutes long
    -- and here starts the mess
    FOR rec_c21 IN cur_c21 LOOP        -- take the first part of the call
       v_dur := rec_c21.duration;      -- record its duration
       FOR rec_c22 IN cur_c22 LOOP     -- start checking whether there is a middle part for the call
          IF rec_c22.callingnumber = rec_c21.callingnumber AND rec_c22.callednumber = rec_c21.callednumber AND 
            (rec_c22.start_date_of_call - rec_c21.start_date_of_call) = (1/48)                
    /* if the numbers are the same and the date difference is 30 minutes then we have a middle part and we start searching for the next middle. */
          THEN
          v_dur := v_dur + rec_c22.duration;     -- update the running duration
          v_temp_rec_c22 := rec_c22;             -- keep the current record; it is used for the next comparison
             FOR rec_c22_2 in cur_c22_2 LOOP
                IF rec_c22_2.callingnumber = v_temp_rec_c22.callingnumber AND rec_c22_2.callednumber = v_temp_rec_c22.callednumber AND 
                  (rec_c22_2.start_date_of_call - v_temp_rec_c22.start_date_of_call) = (1/48)        
    /* logic is the same as before but comparing with the last value in v_temp...
    And because the data in the cursors is ordered by date in ascending order it's easy to search for another middle parts. */
                THEN
                   v_dur:=v_dur + rec_c22_2.duration;
                   v_temp_rec_c22:=rec_c22_2;
                END IF;
             END LOOP;                     
          END IF;
          EXIT WHEN rec_c22.callingnumber = rec_c21.callingnumber AND rec_c22.callednumber = rec_c21.callednumber AND 
                   (rec_c22.start_date_of_call - rec_c21.start_date_of_call) = (1/48);       
    /* exiting the loop if we have at least one middle part.
    (I couldn't find if there is a way to write this more clean, like exit when (the above if is true) */
       END LOOP;
       FOR rec_c23 IN cur_c23 LOOP             
          IF (rec_c23.callingnumber = rec_c21.callingnumber AND rec_c23.callednumber = rec_c21.callednumber AND
             (rec_c23.start_date_of_call - rec_c21.start_date_of_call) = (1/48)) OR v_dur != rec_c21.duration          
    /* we should always have one last part, so we need this check.
    If we don't have the "v_dur != rec_c21.duration" part it will execute the code inside only if we don't have middle parts
    (yes we can have these situations in calls longer than 30 and less than 60 minutes). */
          THEN
             v_dur:=v_dur + rec_c23.duration;
          rec_c21.duration := v_dur;             -- update the duration
          rec_c21.segmentID := 'C1';
          INSERT INTO table2 VALUES rec_c21;     -- insert the whole call into table2
          END IF;
          EXIT WHEN (rec_c23.callingnumber = rec_c21.callingnumber AND rec_c23.callednumber = rec_c21.callednumber AND
                    (rec_c23.start_date_of_call - rec_c21.start_date_of_call) = (1/48)) OR v_dur != rec_c21.duration;                 
                    -- exit the loop when the last part has been found
       END LOOP;
    END LOOP;
    END;
    I'm using Oracle 11g and version 1.5.5 of SQL Developer.
    It's my first post here, so I hope this is the right sub-forum.
    I tried to explain everything as deeply as possible (sorry if it's too long), and I think the code got somewhat hard to read with all these comments. If you want, I can remove them.
    I know I'm still missing a lot of knowledge, so any help is really appreciated.
    Thank you very much in advance!

    Atiel wrote:
    Thanks for the suggestion, but the thing is that segmentID must stay the same for all. The data in this field just tells us whether this is a record of a complete call (C1) or a partial record of a call (C21, C22, C23). So in table2, as every record will be a complete call, the segmentID must be C1 for all.
    Well, that's not a problem. You just hard-code 'C1' instead of applying the row number as I was doing:
    SQL> ed
    Wrote file afiedt.buf
      1  select 'C1' as segmentid
      2        ,start_date_of_call, duration, callingnumber, callednumber
      3  from (
      4        select distinct
      5               min(start_date_of_call) over (partition by callingnumber, callednumber) as start_date_of_call
      6              ,sum(duration) over (partition by callingnumber, callednumber) as duration
      7              ,callingnumber
      8              ,callednumber
      9        from table1
    10*      )
    SQL> /
    SEGMENTID  START_DATE_OF_CALL     DURATION CALLINGNUMBER   CALLEDNUMBER
    C1         11-MAY-2012 12:13:10 8020557824 1982032041      0631432831624
    C1         15-MAR-2012 09:07:26  269352960 5581790386      0113496771567
    C1         31-JUL-2012 23:20:23  134676480 4799842978      0813391427349
    Another thing is that, as I said above, the actual table has 120 fields. Do I have to list them all manually if I use something similar?
    If that's what you need, then yes, you would have to list them. You only get data if you tell it you want it. ;)
    Of course if you are taking the start_date_of_call, callingnumber and callednumber as the 'key' to the record, then you could join the results of the above back to the original table1 and pull out the rest of the columns that way...
    SQL> select * from table1;
    SEGMENTID  START_DATE_OF_CALL     DURATION CALLINGNUMBER   CALLEDNUMBER          COL1       COL2       COL3
    C1         31-JUL-2012 23:20:23  134676480 4799842978      0813391427349          556         40       5.32
    C21        15-MAR-2012 09:07:26  134676480 5581790386      0113496771567          219        100      10.16
    C23        11-MAY-2012 09:37:26  134676480 5581790386      0113496771567          321         73       2.71
    C21        11-MAY-2012 12:13:10 3892379648 1982032041      0631432831624          959         80       2.87
    C22        11-MAY-2012 12:43:10 3892379648 1982032041      0631432831624          375         57       8.91
    C22        11-MAY-2012 13:13:10  117899264 1982032041      0631432831624          778         27       1.42
    C23        11-MAY-2012 13:43:10  117899264 1982032041      0631432831624          308         97       3.26
    7 rows selected.
    SQL> ed
    Wrote file afiedt.buf
      1  with t2 as (
      2  select 'C1' as segmentid
      3        ,start_date_of_call, duration, callingnumber, callednumber
      4  from (
      5        select distinct
      6               min(start_date_of_call) over (partition by callingnumber, callednumber) as start_date_of_call
      7              ,sum(duration) over (partition by callingnumber, callednumber) as duration
      8              ,callingnumber
      9              ,callednumber
    10        from table1
    11       )
    12  )
    13  --
    14  select t2.segmentid, t2.start_date_of_call, t2.duration, t2.callingnumber, t2.callednumber
    15        ,t1.col1, t1.col2, t1.col3
    16  from   t2
    17         join table1 t1 on (   t1.start_date_of_call = t2.start_date_of_call
    18                           and t1.callingnumber = t2.callingnumber
    19                           and t1.callednumber = t2.callednumber
    20*                          )
    SQL> /
    SEGMENTID  START_DATE_OF_CALL     DURATION CALLINGNUMBER   CALLEDNUMBER          COL1       COL2       COL3
    C1         11-MAY-2012 12:13:10 8020557824 1982032041      0631432831624          959         80       2.87
    C1         15-MAR-2012 09:07:26  269352960 5581790386      0113496771567          219        100      10.16
    C1         31-JUL-2012 23:20:23  134676480 4799842978      0813391427349          556         40       5.32
    SQL>
    Of course this is pulling back the additional columns for the record that matches the start_date_of_call for that calling/called number pair. So if the values differ from row to row within the calling/called number pair, you may need to aggregate them (take the minimum/maximum etc. as required) as part of the first query. If the values are known to be the same across all records in the group, then you can just pick them up from the join to the original table, as I did in the above example (only in my example the data was different across all rows).

  • Need help in improving the performance for the sql query

    Thanks in advance for helping me.
    I was trying to improve the performance of the query below. I tried the following methods: used MERGE instead of UPDATE, used BULK COLLECT / FORALL updates, used the ORDERED hint, and created a temp table and updated the target table from it. None of the methods I used improved performance. The data count updated in the target table is 2 million records, and the target table has 15 million records.
    Any suggestions or solutions for improving performance are appreciated
    SQL query:
    update targettable tt
    set mnop = 'G'
    where (x, y, z) in
        ( select a.x, a.y, a.z
          from table1 a
          where (a.x, a.y, a.z) not in
              ( select b.x, b.y, b.z
                from table2 b
                where 'O'   = b.defg
                and   mnop  = 'P'
                and   hijkl = 'UVW' ) );

    987981 wrote:
    I was trying to improve the performance of the query below. I tried the following methods: used MERGE instead of UPDATE, used BULK COLLECT / FORALL updates, used the ORDERED hint, and created a temp table and updated the target table from it. None of the methods I used improved performance.
    And that meant what? Surely if you spend all that time and effort trying various approaches, it should mean something? Failures are as important teachers as successes. You need to learn from failures too. :-)
    The data count updated in the target table is 2 million records, and the target table has 15 million records.
    Tables have rows btw, not records. Database people tend to get upset when rows are called records, as records exist in files and a database is not a mere collection of records and files.
    The failure to find a single faster method with the approaches you tried points to the fact that you do not know what the actual performance problem is. And without knowing the problem, you still went ahead, guns blazing.
    The very first step in dealing with any software engineering problem, is to identify the problem. Seeing the symptoms (slow performance) is still a long way from problem identification.
    Part of identifying the performance problem, is understanding the workload. Just what does the code task the database to do?
    From your comments, it needs to find 2 million rows out of 15 million rows, change those rows, and then write 2 million rows back to disk.
    That is not a small workload. Simple example: let's say the 2-million-row find costs 1 ms/row and the 2-million-row write also costs 1 ms/row. That is 2,000,000 x 2 ms = 4,000 seconds, or roughly a 66-minute workload. Due to the number of rows, an increase in time per row, either way, will potentially have a 2-million-fold impact.
    So where is the performance problem? Time spent finding the 2 million rows (where other tables need to be read, indexes used, etc.)? Time spent writing the 2 million rows (where triggers need to be fired and indexes maintained)? Both?

  • How to improve the performance of adobe forms

    Hi,
    Please give me some suggestions as to how to improve the performance of an Adobe form.
    Right now, when I'm triggering user events, it works fine for the first 6 or 7 events; from the next
    one on, it hangs.
    I read about the wizard form design approach; how can I use it here?
    Thanks,
    Aravind

    Hi Otto,
    The form is created using HCM Forms and Processes. I'm performing user events in the form.
    User events do a round trip in which form data is sent to the backend SAP system. Processing
    happens on the ABAP side and the result appears on the form. The first 6 or 7 user events work correctly,
    and the result appears on the form. Around the 8th or 9th one, the wait symbol appears and the form is not
    re-rendered. The form is 6 pages long; the issue does not occur with a 1-page form.
    I was reading about ways to improve performance during re-rendering, given below:
    http://www.adobe.com/devnet/livecycle/articles/DynamicInteractiveFormPerformance.pdf
    It talks about the wizard form design approach, but in transaction SFP I am not seeing any kind of wizard.
    Let me know if you need further details.
    Thanks,
    Aravind

  • To improve the performance of the extractor

    Hi Team,
    Currently there is one data load which takes 48 hours to extract data from the R/3 system to BI. It is based on an infoset query.
    The extractor uses the standard logical database PNP; the database driver for this LDB is SAPDBPNP.
    The extractor is based on infotype PA0008.
    One tricky part of this code is that the values are not stored in the PA0008 table; they are dynamically calculated during the infoset extract.
    Could you please provide some input on improving the performance?
    Thanks in advance.
    Sunil Kumar.

    Since the values are dynamically calculated during the infoset extract, it will obviously take longer, because the calculation has to run as many times as there are PERNRs (in IT8).
    If possible, try to move those calculations to the transfer / update rules level.
    OR
    If it is a non-delta datasource, try to add more infopackages (with PERNR selections) to this datasource and run them in parallel in a process chain.

  • How can we improve the performance while fetching data from RESB table.

    Hi All,
    Can anybody suggest the right way to improve the performance while fetching data from the RESB table? Below is the SELECT statement.
    SELECT aufnr posnr roms1 roanz
        INTO (itab-aufnr, itab-pposnr, itab-roms1, itab-roanz)
        FROM resb
        WHERE kdauf  = p_vbeln
        AND   ablad  = itab-sposnr+2.
      APPEND itab.   " presumably each row is appended here; the original post omitted the loop body
    ENDSELECT.
    Here I am using KDAUF and ABLAD in the condition. Can we use a secondary index to improve the performance in this case?
    Regards,
    Himanshu

    Hi,
    Declare an internal table with only those four fields
    and try the code below (here lv_ablad stands for the value you were taking from itab-sposnr+2):
    SELECT aufnr posnr roms1 roanz
      INTO TABLE itab
      FROM resb
      WHERE kdauf = p_vbeln
      AND   ablad = lv_ablad.
    Yes, you can also create a secondary index on KDAUF and ABLAD to improve the performance in this case.
    Regards,
    Anand.
    Reward if it is useful....
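    If the SELECT runs once per row of itab inside a loop, it may also be worth measuring a batched version with FOR ALL ENTRIES. A minimal sketch, assuming a driver table lt_pos that already holds the ABLAD values taken from itab-sposnr+2 (lt_pos and its fields are hypothetical, not from the original post):
    TYPES: BEGIN OF ty_pos,
             ablad TYPE resb-ablad,
           END OF ty_pos.
    TYPES: BEGIN OF ty_resb,
             aufnr TYPE resb-aufnr,
             posnr TYPE resb-posnr,
             roms1 TYPE resb-roms1,
             roanz TYPE resb-roanz,
           END OF ty_resb.
    DATA: lt_pos  TYPE STANDARD TABLE OF ty_pos,
          lt_resb TYPE STANDARD TABLE OF ty_resb.
    " ... fill lt_pos with one ABLAD value per row ...
    IF lt_pos IS NOT INITIAL.   " guard: FOR ALL ENTRIES with an empty table reads everything
      SELECT aufnr posnr roms1 roanz
        INTO TABLE lt_resb      " note: FOR ALL ENTRIES removes duplicate rows from the result
        FROM resb
        FOR ALL ENTRIES IN lt_pos
        WHERE kdauf = p_vbeln
        AND   ablad = lt_pos-ablad.
    ENDIF.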

  • How to improve the Performance of Swing Components

    Hi
    I have developed a GUI framework with Swing components. I observed that when the number of components is large, the GUI is slow. Could anybody suggest ways to improve the performance of Swing components?
    Thanks

    Hi there - I haven't found issues so far with the speed of GUI building; it seems fast compared with .NET applications.
    Are you doing any database querying on form load? This can slow down the loading of the GUI. I use persistence in an application, and before the GUI is shown the persistence unit logs in to the database and registers itself - it takes 2-3 seconds to register and complete whatever it does before the form builds and shows.
    Also check whether you are loading or using any network-based files on form load; this can slow down the building considerably as well. Apart from that, it's hard to say where your issue may lie, other than removing controls and checking build speed until you find the one that's slowing it down (not a good approach, though). I sometimes had to load CSV files from a network drive, and if the file server is slow or the LAN speed isn't great, this can also cause a delay. Hope you find it.

  • How to improve the performance of 11i?

    Dear all,
    We just upgraded from 10.7 to 11i and found that the performance is not acceptable. Currently there are about 120 staff in our company. Can someone kindly suggest some ways to improve the performance?
    Thanks,
    George

    Dear Alan,
    The reports run slowly when more than one hundred users are logged in. Should we tune the database or set up another report server to shorten the report running time?
    Thanks,
    George

  • How to improve the performance of extractor?

    Hi,
    we have an extractor for which data loading is taking too much time!
    It is taking 5 hours for a load that the same extractor in a different source system (different client) finishes within an hour.
    What can one do to improve the performance of the extractor?
    Many thanks in advance!
    Ravi

    Hi,
    Are you sure that both source systems have the same amount of data?
    Also make sure that master data has been uploaded from both systems. It takes more time to upload transaction data if the master data has not yet been uploaded from the same source system.
    Also check SM58 in the source system for any blocked tRFCs. If there are any with text in red, update them manually. A huge number of blocked tRFCs can slow down the upload.
    with rgds,
    Anil Kumar Sharma .P

  • Inner Join. How to improve the performance of inner join query

    How can I improve the performance of this inner join query?
    The query is:
    select f1~ablbelnr
             f1~gernr
             f1~equnr
             f1~zwnummer
             f1~adat
             f1~atim
             f1~v_zwstand
             f1~n_zwstand
             f1~aktiv
             f1~adatsoll
             f1~pruefzahl
             f1~ablstat
             f1~pruefpkt
             f1~popcode
             f1~erdat
             f1~istablart
             f2~anlage
             f2~ablesgr
             f2~abrdats
             f2~ableinh
                from eabl as f1
                inner join eablg as f2
                on f1~ablbelnr = f2~ablbelnr
                into corresponding fields of table it_list
                where f1~ablstat in s_mrstat
                %_HINTS ORACLE 'USE_NL (T_00 T_01) index(T_01 "EABLG~0")'.
    I want to modify the query, since it's taking a lot of time to load the data.
    Please suggest what to change.
    Treat this as very urgent.

    Hi Shyamal,
    In your program, you are using "INTO CORRESPONDING FIELDS OF".
    Try not to use this addition in your SELECT query.
    Instead, just use "INTO TABLE it_list".
    As an example:
    Write a normal query using "into corresponding fields of" in a program. Now go to SE30 (runtime analysis), give the program name and execute it.
    If you click the Analyze button, you can see the analysis for the query. The one shown in a "red" line tells you that you need to look for alternative methods.
    On the other hand, if you use "into table itab", it will give you an entirely different analysis.
    So try not to use "into corresponding fields" in your query.
    Regards,
    SP.
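    For illustration, a sketch of the same query with a target table typed to match the field list, so that plain INTO TABLE can be used (ty_list is an assumed name; with INTO TABLE the component order must match the SELECT list exactly):
    TYPES: BEGIN OF ty_list,         " mirrors the field list of the original query
             ablbelnr  TYPE eabl-ablbelnr,
             gernr     TYPE eabl-gernr,
             equnr     TYPE eabl-equnr,
             zwnummer  TYPE eabl-zwnummer,
             adat      TYPE eabl-adat,
             atim      TYPE eabl-atim,
             v_zwstand TYPE eabl-v_zwstand,
             n_zwstand TYPE eabl-n_zwstand,
             aktiv     TYPE eabl-aktiv,
             adatsoll  TYPE eabl-adatsoll,
             pruefzahl TYPE eabl-pruefzahl,
             ablstat   TYPE eabl-ablstat,
             pruefpkt  TYPE eabl-pruefpkt,
             popcode   TYPE eabl-popcode,
             erdat     TYPE eabl-erdat,
             istablart TYPE eabl-istablart,
             anlage    TYPE eablg-anlage,
             ablesgr   TYPE eablg-ablesgr,
             abrdats   TYPE eablg-abrdats,
             ableinh   TYPE eablg-ableinh,
           END OF ty_list.
    DATA it_list TYPE STANDARD TABLE OF ty_list.
    SELECT f1~ablbelnr f1~gernr f1~equnr f1~zwnummer f1~adat f1~atim
           f1~v_zwstand f1~n_zwstand f1~aktiv f1~adatsoll f1~pruefzahl
           f1~ablstat f1~pruefpkt f1~popcode f1~erdat f1~istablart
           f2~anlage f2~ablesgr f2~abrdats f2~ableinh
      INTO TABLE it_list           " no CORRESPONDING FIELDS overhead
      FROM eabl AS f1
      INNER JOIN eablg AS f2
      ON f1~ablbelnr = f2~ablbelnr
      WHERE f1~ablstat IN s_mrstat.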

  • How can I Improve the Performance using Global Temp Tables?

    Hi,
    Can anyone tell me how I can make use of global temporary tables to improve performance?
    I have a few sample scripts.
    Say I have a view based on some complex query, like:
    CREATE OR REPLACE VIEW Profile_values_view AS
    SELECT d.Profile_option_name, d.Profile_option_id, Profile_option_value,
    u.User_name, Level_id, Level_code
    FROM Profile_definitions d, Profile_values v, Profile_users u
    WHERE d.Profile_option_id = v.Profile_option_id
    AND ((Level_code = 'USER' AND Level_id = U.User_id) OR
    (Level_code = 'DEPARTMENT' AND Level_id = U.Department_id) OR
    (Level_code = 'SITE'))
    AND NOT EXISTS (SELECT 1 FROM PROFILE_VALUES P
    WHERE P.PROFILE_OPTION_ID = V.PROFILE_OPTION_ID
    AND ((Level_code = 'USER' AND
    level_id = u.User_id) OR
    (Level_code = 'DEPARTMENT' AND
    level_id = u.Department_id) OR
    (Level_code = 'SITE'))
    AND INSTR('USERDEPARTMENTSITE', v.Level_code) >
    INSTR('USERDEPARTMENTSITE', p.Level_code));
    Now I have created the global temporary table as:
    CREATE GLOBAL TEMPORARY TABLE Profile_values_temp
    (
    Profile_option_name VARCHAR(60) NOT NULL,
    Profile_option_id NUMBER(4) NOT NULL,
    Profile_option_value VARCHAR2(20) NOT NULL,
    Level_code VARCHAR2(10) ,
    Level_id NUMBER(4) ,
    CONSTRAINT Profile_values_temp_pk
    PRIMARY KEY (Profile_option_id)
    ) ON COMMIT PRESERVE ROWS ORGANIZATION INDEX;
    Now I am inserting the records into the temp table as:
    INSERT INTO Profile_values_temp
    (Profile_option_name, Profile_option_id, Profile_option_value,
    Level_code, Level_id)
    SELECT Profile_option_name, Profile_option_id, Profile_option_value,
    Level_code, Level_id
    FROM Profile_values_view;
    COMMIT;
    Now my doubt is: when do I need to execute the INSERT statement?
    Say the view returns a few million records; loading that much data into the global temporary table takes a lot of time.
    Then what is the use of global temporary tables, and how can I improve performance using them?
    Raj

    Thanks for the response.
    There are 2 or 3 complex views in our database, and there will always be more than 5000 users working on the application, which is an OLTP application. Those complex views are killing the application performance.
    What I felt was that if I create global temporary tables for those views, I will be able to load the roughly one-third million records returned by the views into them and improve the application performance.
    I have created the global temporary tables for 2 views with the option ON COMMIT PRESERVE ROWS, but after I insert the records into the temp table and issue the COMMIT statement, the temp table is getting cleared.
    I was really surprised by this behaviour, as I know that with ON COMMIT PRESERVE ROWS the rows should be retained in the temp table; instead, it is getting cleared.
    Please suggest what to do.
    Raj
