How to improve performance of DSO activation

Hi all,
We have created a DSO which is updated daily from another DSO. The extraction is from the active table of the source DSO, so all records are extracted, in this case roughly 1,000,000 records. (We cannot use a delta update for this particular DSO.)
Because of the large number of records, the activation of the data in the target DSO is quite slow (about 20 minutes).
My question: is there any way we can analyze where exactly the time is lost during activation, and how can we improve the performance? Is it, for example, possible to deactivate the change log?
Note that I am already familiar with RSODSO_SETTINGS, and we do not want to switch to a write-optimized DSO.
Thank you in advance for your help.

Hi
DSO activation can be improved by unchecking the "SIDs Generation upon Activation" checkbox in the DSO settings, as creating SIDs during activation takes a lot of time. We can go ahead with this option only when we don't have any reports built directly on that DSO.
Thanks & Regards
Rohit

Similar Messages

  • How to improve performance of MediaPlayer?

    I tried to use the MediaPlayer with an On2 VP6 FLV movie.
    Showing a video with a resolution of 1024x768 works.
    Showing a video with a resolution of 1280x720 and an average bitrate of 1700 kb/s causes the video signal to lag a couple of seconds behind the audio signal. VLC, Media Player Classic and a couple of other players have no problem with the video. Only the FX MediaPlayer shows poor performance.
    Additionally, mouse events in a second stage (the first stage is used for the video) are not processed in 2 of 3 cases. If the MediaPlayer is switched off, the mouse events work reliably.
    Does somebody know a solution for these problems?
    Cheers
    masim


  • How to improve performance of the attached query

    Hi,
    How can I improve the performance of the query below? Please help; the explain plan is also attached.
    SELECT Camp.Id,
    rCam.AccountKey,
    Camp.Id,
    CamBilling.Cpm,
    CamBilling.Cpc,
    CamBilling.FlatRate,
    Camp.CampaignKey,
    Camp.AccountKey,
    CamBilling.billoncontractedamount,
    (SUM(rCam.Impressions) * 0.001 + SUM(rCam.Clickthrus)) AS GR,
    rCam.AccountKey as AccountKey
    FROM Campaign Camp, rCamSit rCam, CamBilling, Site xSite
    WHERE Camp.AccountKey = rCam.AccountKey
    AND Camp.AvCampaignKey = rCam.AvCampaignKey
    AND Camp.AccountKey = CamBilling.AccountKey
    AND Camp.CampaignKey = CamBilling.CampaignKey
    AND rCam.AccountKey = xSite.AccountKey
    AND rCam.AvSiteKey = xSite.AvSiteKey
    AND rCam.RmWhen BETWEEN to_date('01-01-2009', 'DD-MM-YYYY') and
    to_date('01-01-2011', 'DD-MM-YYYY')
    GROUP By rCam.AccountKey,
    Camp.Id,
    CamBilling.Cpm,
    CamBilling.Cpc,
    CamBilling.FlatRate,
    Camp.CampaignKey,
    Camp.AccountKey,
    CamBilling.billoncontractedamount
    Explain Plan :-
    Description Object_owner Object_name Cost Cardinality Bytes
    SELECT STATEMENT, GOAL = ALL_ROWS 14 1 13
    SORT AGGREGATE 1 13
    VIEW GEMINI_REPORTING 14 1 13
    HASH GROUP BY 14 1 103
    NESTED LOOPS 13 1 103
    HASH JOIN 12 1 85
    TABLE ACCESS BY INDEX ROWID GEMINI_REPORTING RCAMSIT 2 4 100
    NESTED LOOPS 9 5 325
    HASH JOIN 7 1 40
    SORT UNIQUE 2 1 18
    TABLE ACCESS BY INDEX ROWID GEMINI_PRIMARY SITE 2 1 18
    INDEX RANGE SCAN GEMINI_PRIMARY SITE_I0 1 1
    TABLE ACCESS FULL GEMINI_PRIMARY SITE 3 27 594
    INDEX RANGE SCAN GEMINI_REPORTING RCAMSIT_I 1 1 5
    TABLE ACCESS FULL GEMINI_PRIMARY CAMPAIGN 3 127 2540
    TABLE ACCESS BY INDEX ROWID GEMINI_PRIMARY CAMBILLING 1 1 18
    INDEX UNIQUE SCAN GEMINI_PRIMARY CAMBILLING_U1 0 1

    duplicate thread..
    How to improve performance of attached query
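
    Since the thread was closed as a duplicate without a substantive answer, here is the same statement rewritten with ANSI join syntax, purely as a readability sketch: it is logically equivalent and does not by itself change the execution plan. The duplicated Camp.Id and rCam.AccountKey entries in the select list have been dropped.
    SELECT Camp.Id,
           rCam.AccountKey,
           CamBilling.Cpm,
           CamBilling.Cpc,
           CamBilling.FlatRate,
           Camp.CampaignKey,
           Camp.AccountKey,
           CamBilling.billoncontractedamount,
           SUM(rCam.Impressions) * 0.001 + SUM(rCam.Clickthrus) AS GR
    FROM Campaign Camp
    JOIN rCamSit rCam
      ON Camp.AccountKey = rCam.AccountKey
     AND Camp.AvCampaignKey = rCam.AvCampaignKey
    JOIN CamBilling
      ON Camp.AccountKey = CamBilling.AccountKey
     AND Camp.CampaignKey = CamBilling.CampaignKey
    JOIN Site xSite
      ON rCam.AccountKey = xSite.AccountKey
     AND rCam.AvSiteKey = xSite.AvSiteKey
    WHERE rCam.RmWhen BETWEEN TO_DATE('01-01-2009', 'DD-MM-YYYY')
                          AND TO_DATE('01-01-2011', 'DD-MM-YYYY')
    GROUP BY rCam.AccountKey,
             Camp.Id,
             CamBilling.Cpm,
             CamBilling.Cpc,
             CamBilling.FlatRate,
             Camp.CampaignKey,
             Camp.AccountKey,
             CamBilling.billoncontractedamount
    If rCam.RmWhen is selective, an index on rCamSit(RmWhen, AccountKey, AvSiteKey) might help, but whether it does depends on the data volumes and on the plan you actually get.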

  • How to improve performance of query

    Hi all,
    How to improve performance of query.
    please send :
    [email protected]
    thanks in advance
    bhaskar

    hi
    go through the following links for performance
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/cccad390-0201-0010-5093-fd9ec8157802
    http://www.asug.com/client_files/Calendar/Upload/ASUG%205-mar-2004%20BW%20Performance%20PDF.pdf
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/1955ba90-0201-0010-d3aa-8b2a4ef6bbb2

  • How to improve performance of insert statement

    Hi all,
    How can I improve the performance of an insert statement?
    I am inserting 100,000 (1 lakh) records into a table and it takes around 20 minutes.
    Please help.
    Thanks in advance.

    I tried :
    SQL> create table test as select * from dba_objects;
    Table created.
    SQL> delete from test;
    3635 rows deleted.
    SQL> commit;
    Commit complete.
    SQL> select count(*) from dba_extents where segment_name='TEST';
    COUNT(*)
    4
    SQL> insert /*+ APPEND */ into test select * from dba_objects;
    3635 rows created.
    SQL> commit;
    Commit complete.
    SQL> select count(*) from dba_extents where segment_name='TEST';
    COUNT(*)
    6
    Cheers, Bhupinder
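
    Bhupinder's session shows a direct-path insert: the /*+ APPEND */ hint writes new blocks above the table's high-water mark, which is why the extent count grows from 4 to 6 even though the rows had just been deleted. As a minimal sketch of the usual bulk-load pattern (the table names here are placeholders, not from the original post):
    -- Direct-path insert: loads above the high-water mark and, with NOLOGGING,
    -- generates far less redo; the table is locked against other DML until commit,
    -- and the loading session itself cannot read the table again before committing.
    ALTER TABLE big_target NOLOGGING;
    INSERT /*+ APPEND */ INTO big_target
    SELECT * FROM source_table;
    COMMIT;
    Check with your DBA before switching a table to NOLOGGING, since it affects recoverability, and note that APPEND only pays off for genuinely large inserts.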

  • How to improve performance of Siebel Configurator

    Hi All,
    We are using Siebel Configurator to model the item structures. We wrote a few constraint rules on top of that, but when we launch the configurator it takes a long time to open.
    Even without the rules it behaves in the same manner.
    Any inputs on this would be highly appreciated.
    RAM


  • How to Improve performance issue when we are using BRM LDB

    HI All,
    I am facing a performance issue when I retrieve data from BKPF and the corresponding BSEG table. For the fiscal period there are around 6 million (60 lakh) records, and populating the values from the tables into the final internal table takes a very long time.
    When I tried to use the BRM LDB with SAP Query/QuickViewer, it was the same issue.
    Please suggest how I can improve the performance.
    Thanks in advance
    Chakradhar

    Moderator message - Please see Please Read before Posting in the Performance and Tuning Forum before posting - post locked
    Rob

  • How to improve performance of SUM function in inline SQL query

    SELECT NVL(SUM(B1.T_AMOUNT), 0) PAYMENT, B1.ACCOUNT_NUM, B1.BILL_SEQ
    FROM
    (
      SELECT P.T_AMOUNT, P.ACCOUNT_NUM, P.BILL_SEQ
      FROM PAYMENT_DATA_VIEW P
      WHERE TRUNC(P.ACC_PAYMENT_DATE) < '01-JAN-2013'
      AND P.CUSTOMER_NAME = 'XYZ'
      AND P.CLASS_ID IN (-1, 1, 2, 94)
    ) B1
    GROUP BY B1.ACCOUNT_NUM, B1.BILL_SEQ
    Above is the query. If we run the inner query on its own it executes in a few seconds, but when we sum the amounts per ACCOUNT_NUM and BILL_SEQ through the inline view, it takes much longer.
    Note: the inner query returns more than 1,000,000 (10 lakh) rows.
    How can I improve the performance of this query?
    Please suggest.
    Thanks in advance

    a) "Lac" (lakh) is not an international unit, so it is not understood by everyone. This is an international forum, so please use international units.
    b) Please read the FAQ: {message:id=9360002} to learn how to format your question correctly for people to help you.
    c) As your question relates to performance tuning, please also read the two threads linked to in the FAQ: {message:id=9360003} for an idea of what specific information you need to provide for people to help you tune your query.
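
    Two things worth checking in the posted query, offered only as a sketch: TRUNC() applied to ACC_PAYMENT_DATE prevents a normal index on that column from being used, and comparing against the string '01-JAN-2013' relies on implicit date conversion. Because the comparison value is a day boundary, the TRUNC() can be dropped, and the inline view is not needed either:
    SELECT NVL(SUM(P.T_AMOUNT), 0) AS PAYMENT,
           P.ACCOUNT_NUM,
           P.BILL_SEQ
    FROM PAYMENT_DATA_VIEW P
    WHERE P.ACC_PAYMENT_DATE < TO_DATE('01-01-2013', 'DD-MM-YYYY')
      AND P.CUSTOMER_NAME = 'XYZ'
      AND P.CLASS_ID IN (-1, 1, 2, 94)
    GROUP BY P.ACCOUNT_NUM, P.BILL_SEQ
    Whether this actually helps depends on what PAYMENT_DATA_VIEW does underneath; the view definition and the real execution plan would be needed to say more, which is exactly the information the FAQ linked above asks for.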

  • How to improve performance of my query

    Hello Friends,
    Good Morning.
    I have the following query, which never finishes. Can anyone throw some light on how to improve the performance of this query? It is generated by ODI (Oracle Data Integrator 11g).
    The only thing I can add to this query is optimizer hints.
    - issue resolved
    Please advise.
    Thanks / Kumar
    Edited by: kumar73 on May 18, 2012 6:38 AM
    Edited by: kumar73 on May 18, 2012 6:39 AM
    Edited by: kumar73 on May 18, 2012 12:04 PM

    The two DISTINCTs are redundant, since UNION already returns unique records (a set cannot have duplicates).
    Other than that, the query is unformatted and unreadable, and you didn't provide a description of the tables involved.
    Your strategy seems to be maximum help from this forum with minimum effort from yourself, other than hitting copy and paste.
    Sybrand Bakker
    Senior Oracle DBA

  • How to improve performance for Custom Extractor in BI..

    HI all,
    I am new to BI and have been working on it for a couple of weeks. I created a custom extractor (data view) in the source system, and pulling data from it takes a lot of time. Can anyone suggest how to improve the performance of my custom extractor? Please do the needful.
      Thanks and Regards,
    Venugopal..

    Dear Venugopal,
    use transaction ST05 to check whether your SQL statements are optimal and that you do not have redundant database calls. You should use "bulking" as much as possible, which means fetching the required data with one request to the database rather than with multiple requests.
    Use transaction SE30 to check whether you are wasting time in loops and, if so, optimize the algorithm.
    Best Regards,
    Sylvia
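
    To illustrate the "bulking" point in SQL terms (the table and column names below are invented for the example, not taken from the thread): instead of issuing one query per key inside a loop, fetch the whole set in a single statement.
    -- Row-by-row pattern (slow): executed once per order, N round trips
    -- SELECT amount FROM order_items WHERE order_id = :id;
    -- Bulked pattern: one request for the complete set of keys
    SELECT oi.order_id, oi.amount
    FROM order_items oi
    WHERE oi.order_id IN (SELECT r.order_id FROM relevant_orders r)
    The same idea applies inside the extractor's ABAP code: collect the keys first, then read the database once for all of them.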

  • How to improve performance of a game in samsung mobile

    hi,
    I have developed a game which I am trying to port to various handsets. It works fine on Nokia, Sony and Motorola phones, but it gives problems when I run it on the Samsung emulator.
    The game becomes very slow when I run it in the Samsung emulator, and it handles key events very late.
    Can anyone please help with how to improve the speed of the game on Samsung mobiles? It is kind of urgent.
    Thanks

    Hi Raj,
      How is your direct access DataSource built? Is it a standard DataSource or a generic DataSource on a view/table/function module? This strengthens my second point.
    Suppose you built a virtual cube on a direct access DataSource based on the AUFK table, with Order as the primary key (order master data). When you use Order as a selection in a query built on this virtual cube, it retrieves the data faster than firing the query on other selections.
    If your selections are different, you can possibly create a secondary index on the table with the selections used in the query.
    Regards
    vamsi

  • How to improve performance when there are many TextBlocks in ItemsControl items?

       Hi,
       I'm trying to find a way to improve performance for a situation when there is an ItemsControl using UI and Data virtualization and each item on that control has 36 TextBlocks. Basically the item is a single string. There are so many TextBlocks
    to allow assigning different brushes to different parts of the string. Performance of this construction is terrible. I have 37 items visible on the screen and if I try to scroll up or down it scrolls into the black space and then it takes a second or two to
    show the items.
       I tried different things. For example, the most successful performance-wise was to replace the TextBlocks with Borders and then draw bitmaps. In other words, I prepared 127 bitmaps, one for each character (I need ASCII only), and then used those bitmaps to set the Border.Backgrounds. It improved performance about 1.5 - 2 times but consumed much more memory (which is not surprising, of course). The required amount of memory is so big that it throws an OutOfMemoryException on the 512 MB emulator but works on 1 GB. As a result, I don't think it is a good solution.
       Another thing that worked perfectly is to replace the 36 TextBlocks with only 6 TextBlocks. In this case the performance improvement is about 5 - 10 times, but I lose the ability to set different colors to different parts of the string. It seems that the performance degrades dramatically as the number of TextBlocks grows. Is there another technique to draw strings where literally each character can be a different color, with decent performance?
    Thank you
    Alex

       Using Runs inside TextBlocks gives approximately the same improvement as using bitmaps (1.5 - 2 times faster), but it is not even close to the case with just a couple of TextBlocks in the ItemsControl item. Any other ideas?
    Alex

  • How to improve Performance of the Statements.

    Hi,
    I am using Oracle 10g. My problem is that when I execute and fetch records from the database it takes a long time. I have also gathered statistics, but it did not help. What do I have to do to improve the performance of SELECT, INSERT, UPDATE and DELETE statements?
    Does it make any difference that I am using Windows XP with 1 GB RAM on the server machine and Windows XP with 512 MB RAM on the client machine?
    Please give me advice on how to improve performance.
    Thank u...!

    "What and where to change parameters and values?" Well, maybe my previous post was not clear enough, but if you want to keep your job, you shouldn't change anything else in the init parameters, and you shouldn't fall into Compulsive Tuning Disorder.
    Anyone who advises you to change some parameter to some value without any more information shouldn't be listened to.
    Nicolas.
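
    For reference, a minimal sketch of gathering optimizer statistics on one table with DBMS_STATS (schema and table names are placeholders, not from the thread); stale or missing statistics are a common cause of slow statements, and gathering them is far safer than experimenting with init parameters.
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(
        ownname => 'APP_SCHEMA',   -- placeholder schema
        tabname => 'ORDERS',       -- placeholder table
        cascade => TRUE);          -- also gather statistics on the table's indexes
    END;
    /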

  • How to check if a DSO Active Table is being used in any look-up's?

    Hi Gurus,
    We have a DSO that is loaded on a weekly basis, and it has been loaded ever since our project went live. There are no reports built on top of this DSO, and the DSO is not used as a data mart, i.e. it doesn't feed any other DSOs or InfoCubes. This DSO is also not used in any MultiProviders.
    So the only sensible reason for keeping this DSO that we can think of is that its data is being looked up by some other DSOs or InfoCubes at runtime. Is it possible to check whether the active data table of a DSO is being looked up at runtime by some other data loads? I already tried a where-used list on the active data table of the DSO, but it didn't show any results.
    Thanks
    Arvind

    Try the following:
    Check table RSAABAP in SE16; you can search for the technical name of the DSO's active table in the field EDIC.
    You will get all code IDs where this table is used in ABAP coding.
    In RSAROUT you can check which code type it is.
    Now you know, if the DSO is used in any routines.
    Hope this helps.
    Regards
    Tobias
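
    As a rough illustration of the lookup Tobias describes (only a sketch: the DSO name is invented, and the column names are assumptions that should be verified in SE11 before use), the idea is to scan the routine table for the technical name of the active table, which for a DSO ZSALES would be /BIC/AZSALES00:
    -- Hypothetical example: find routine code IDs referencing the active table.
    -- CODEID and the search field are assumptions; check the RSAABAP definition first.
    SELECT CODEID
    FROM RSAABAP
    WHERE EDIC LIKE '%/BIC/AZSALES00%'
    The code IDs returned can then be checked in RSAROUT, as described above, to see which routines they belong to.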
