How do I correct a universe performance issue?

We are trying to complete an upgrade from BO XI R2 to 3.1 SP3.  We have one universe (out of about 30) that is having serious performance issues in 3.1.  The universe is slow to import (it takes less than 5 min in XI R2 and more than 20 min in 3.1) and reports using it take about 5 times longer to run in 3.1 than in XI R2.  I have already worked a ticket with support.  All they provided was an ADAPT about Webi reports running slower in 3.1 than in XI that's supposed to be fixed with FixPack 3.6.  This is rather vague and doesn't really help us get through a production upgrade that's scheduled for May.  I've already checked all the universe settings and Webi report server settings to verify that the options that exist in both XI and 3.1 are the same.  Both versions of the universe are pointed to the same Oracle data source.  I've worked with our DBAs to troubleshoot.  The behavior on the database side is different in 3.1 than in XI, but I don't know how to tweak it.  When a report is run in XI, the database sessions run in parallel.  When the same report is run in 3.1, the sessions are serial at first and eventually go parallel, but are still considerably slower to run the exact same SQL.  Any suggestions would be greatly appreciated.  Thanks!
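(For context on the parallel vs. serial observation: the behaviour can be watched from the database side with a query along the lines below against the standard Oracle dynamic views. This is only a sketch; REPORT_USER is a placeholder for whatever account the BO connections log in as.)

-- Compare requested vs. granted parallel degree for the sessions opened
-- by the BO servers while the slow report runs.
SELECT s.sid, s.program, px.req_degree, px.degree
FROM   v$session    s
JOIN   v$px_session px ON px.sid = s.sid AND px.serial# = s.serial#
WHERE  s.username = 'REPORT_USER';

If the 3.1 sessions request a lower degree (or none at all) for the exact same statement, the difference likely lies in the session settings the new client/middleware establishes rather than in the universe design itself.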

Hi,
   Are you referring to ADAPT01453709?
   From what I have gathered, this is a known issue and it has since been escalated to the SAP developers. There have already been several releases made. The earliest official release to the public will be in FP 3.6 (ETA June 2011).
   You may want to check with the engineer who worked on your current incident to understand exactly what is fixed/addressed in FP 3.6. I suspect the fixes are in the semantic layer, not something you can easily check via the DB by running some SQL script.
Regards,
Ken

Similar Messages

  • How should I report forum performance issues?

    The forums rely heavily on the caching features of browsers to improve the speed of page rendering. Performance of these forums should greatly improve after a few pages because more and more of the images, css and javascript are cached in the browser. As a consequence, when reporting forum performance issues the report should include some information on the state of the browser cache to determine whether the issue is a browser issue or a server issue. Such detailed information is generally not available from just watching the browser screen, but needs to come from specialized tools such as performance monitor plugins and recording proxies.
    The preferred reporting method for performance issues is to use the speed reporting features built into, or available as a plugin for, a browser for both the page you want to report a problem with and several reference pages in the site. Detailed instructions are listed below, separated out for different browsers. If possible, please use Firefox for submitting the report because it provides an export format that can be read back electronically.
    Known performance issues
    The performance issues with any screen with a Rich Text Editor, such as the Reply window and the compose Private Message window, have been acknowledged and improvements are being implemented.
    Mozilla Firefox (preferred)
    Warning: it is currently not recommended to generate a speed report while logged in. The speed report has enough detail for somebody else to hijack your session and impersonate you on the forums. If you really must report while logged in, make sure you log out in your browser after generating the speed report and wait at least 4 hours before posting.
    Install the Firebug plugin
    Install the NetExport 0.6 extension for Firebug
    Enable all Firebug panels
    Switch to the "Net" panel in Firebug
    Click on this link
    Export the data from the Firebug Net panel
    Click on this link
    Export the data from the Firebug Net panel
    Browse to the page where you are experiencing the performance problem.
    Export the data from the Firebug Net panel
    Click on this link
    Export the data from the Firebug Net panel
    Click on this link
    Export the data from the Firebug Net panel
    Browse to the page where you are experiencing the performance problem.
    Export the data from the Firebug Net panel
    When you report a performance problem please attach the 6 exports from the Firebug Net panel and an explanation of how you are experiencing the issues (for instance how much slower it is than normal) and include a description of your internet connection (dial-up, dsl, cable etc.) and the country from where you are connecting. If you have non-standard tweaks to your Firefox configuration (such as pipelining enabled) or are running any plugins please include that information in your report as well.
    Google Chrome
    Open the Developer Tools (Ctrl-Shift-J)
    Navigate to the resources tab
    Enable resource tracking.
    Click on this link
    Export the resource loading data.
    Reset the data by disabling and enabling resource tracking
    Click on this link
    Export the data
    Reset the data by disabling and enabling resource tracking
    Navigate to the page where you experience the performance problem
    Export the data
    Reset the data by disabling and enabling resource tracking
    Click on this link
    Export the data
    Reset the data by disabling and enabling resource tracking
    Click on this link
    Export the data
    Reset the data by disabling and enabling resource tracking
    Navigate to the page where you experience the performance problem
    Export the data
    Since Google Chrome does not have an export format for the Resource Tracking information, best current practice is to take a screenshot and note the hover details for any resource with a tail that is longer than 25% of the total load time. When you report a performance problem please attach the screenshots and an explanation of how you are experiencing the issues (for instance how much slower it is than normal) and include a description of your internet connection (dial-up, dsl, cable etc.) and the country from where you are connecting.
    Apple Safari
    The Apple Safari Web Inspector has a Resources panel similar to the Resources panel in the Google Chrome developer tools. To get there, follow these steps:
    Show the menu bar.
    Go to preferences
    Go to the Advanced Tab
    Check “Show Develop menu in menu bar”.
    From the Develop menu select “Show Web Inspector”.
    Collecting the performance information and exporting works exactly the same as in Google Chrome. Please refer to the instructions for Google Chrome.
    Microsoft Internet Explorer
    IE does not have native features to analyze web traffic. No plugins have been found that produce the required information (please let us know if we missed any). For now, please reproduce the issue with Firefox, Chrome or Safari.
    Please note that due to the reliance on Javascript for the interactive effects, the performance of these forums will be much better on MS IE 8 than on previous versions of MS IE.

    Hi
    It works, check once again...
    regards
    Swami

  • SAP BW OLAP Universe performance issue

    Hi,
    Based on BO recommendation, I created a universe on top of a big BEx query which included all characteristics and key figures from a cube. I then created a Webi report with 5 characteristics and 1 key figure from this universe. I also created the same report in BEx Query Designer (same number of objects). I ran both. The BEx query completed in under a minute and the Webi report took more than 12 minutes to run. I did a bunch of other tests with different combinations of objects and saw a similar differential in query performance.
    I did a trace using 'sm50' and saw the open SQL submitted to BW from BEx was very different from what got submitted from the webi report. Here is what I saw in pseudo code.
    Bex:
    select dimension1.blah, dimension2.blah, dimension3.blah, dimension..... sum(measure)
    from dimension1, dimension2, dimension3, dimension..... factTable
    where dimension1.SID = factTable.d1SID
    and dimension2.SID = factTable.d2SID
    and ......
    and (query filters)
    OLAP Universe
    select dimension1.blah, dimension1.SID
    from dimension1, factTable
    where dimension1.SID = factTable.d1SID
    select dimension2.blah, dimension2.SID
    from dimension2, factTable
    where dimension2.SID = factTable.d2SID
    select dimension3.blah, dimension3.SID
    from dimension3, factTable
    where dimension3.SID = factTable.d3SID
    It seems the OLAP universe was querying the BW cube one dimension at a time and then somehow piecing the result sets together to form the final result set. Dimension tables joining to the fact table without any filter definitely cause performance issues. Besides, I have no idea why the query requests are submitted to the BW system like that.
    I looked in various forums and found no similar issues posted by others. Has anyone had any performance problems with their OLAP universe? Is this a matter of configuration?
    My environment:
    SAP BW 3.5
    BOXI 3.0
    ORACLE DB (version ??)

    Hi,
    You cannot compare a BEx query and a universe query by just comparing the trace.
    A BEx query makes ABAP calls and a universe query makes MDX calls.
    Moreover, there is only one MDX call sent to SAP; what you have noticed is that there is one MDX call to retrieve members for a given characteristic where a prompt has been set on it.
    Last, Web Intelligence consumes only flattened data (row sets) whereas BEx consumes data sets.
    That means that there is a flattening operation between the SAP OLAP engine and the data sent to the Web Intelligence document.
    A fix has been developed for a customer to improve performance of Web Intelligence queries; this fix will be available to all customers in SP2, planned for June 09.
    Here is a brief summary of what the fix provides:
      -  Provide data directly from SAP server to Web Intelligence document. Avoid unnecessary conversion steps: row set --> data set --> row set
      -  Eliminate redundant sort operations
      -  Other optimization operations
    Didier
    Edited by: Didier Mazoue on Nov 20, 2008 8:38 AM

  • How to get around a performance issue when dealing with a lot of data

    Hello All,
    This is an academic question really, I'm not sure what I'm going to do with my issue, but I have some options.  I was wondering if anyone would like to throw in their two cents on what they would do.
    I have a report; the users want to see all agreements and all conditions related to the updating of rebates and the affected invoices. From a technical perspective ENT6038-KONV-KONP-KONA-KNA1.  These are the tables I have to hit.  The problem is that when they retroactively update rebate conditions they can hit thousands of invoices, which blossoms out to thousands of conditions... you see the problem. I simply have too much data to grab; it times out.
    I've tried everything around the code.  If you have a better way to get price conditions and agreement numbers off of thousands of invoices, please let me know what that is.
    I have a couple of options.
    1) Use shared memory to preload the data for the report.  This would work, but I'm not going to know what data needs to be loaded until report run time. They put in a date. I simply can't preload everything. I don't like this option much.
    2) Write a function module to do this work. When the user clicks on the button to get this particular data, it will launch the FM in background and e-mail them the results. As you know, the background job won't time out. So far this is my favored option.
    Any other ideas?
    Oh... nope, BI is not an option, we don't have it. I know, I'm not happy about it. We do have a data warehouse, but the prospect of working with that group makes me wince.

    My two cents - firstly, I totally agree with Derick that it's probably a good idea to go back to the business and justify their requirement in regards to reporting and "whether any user can meaningfully process all those results in an aggregate". But having dealt with customers across industries over a long period of time, it would probably be a bit fanciful to expect them to change their requirements too much, as in my experience they neither understand (too much) technology nor want to hear about technical limitations of a system etc. They want what they want, if possible yesterday!
    So, about dealing with performance issues within ABAP, I'm sure you must already be using efficient programming techniques like hashed internal tables with unique keys, accessing rows of the table using field-symbols and all that, but what I was going to suggest is to look at using [Extracts|http://help.sap.com/saphelp_nw04/helpdata/en/9f/db9ed135c111d1829f0000e829fbfe/content.htm]. I've had to deal with this a couple of times in the past when dealing with massive amounts of data and I found it to be very efficient in regards to performance. A good point to remember when using Extracts, and I quote from SAP Help: "The size of an extract dataset is, in principle, unlimited. Extracts larger than 500KB are stored in operating system files. The practical size of an extract is up to 2GB, as long as there is enough space in the filesystem."
    Hope this helps,
    Cheers,
    Sougata.

  • How did you learn about performance issues?

    I've been thinking about trying to get a LV certification, so I tried out the online test, and I've got a few questions about learning LV on a deeper level.
    One question asked whether it is more efficient to cast each piece of an array as its being built in a for loop, or to cast the entire array at once, after the array has been created.
    Another question asked, if array indexing is enabled on the output of a while loop then is the array resized on every iteration?
    I can make guesses at how LV works, but I really don't know what goes on under the hood. How do you pros know what's really going on with your code?
    Also for anyone out there who has been certified, can you pass the test with just experience, or do you have to undergo some NI training to learn some things? Has LV certification been beneficial to you?
    Thanks,
    John

    Hi John,
    I have mentored 2 1/2 CLA's (the 1/2 is the "Architect Supreme" and she really only needed encouragement).
    Here are my suggestions.
    1) Study for the test you plan to take. Studying above the test level you are after can only confuse things.
    2) If you can get the funding, take the NI courses suggested for that test. Some of the Q's come from the text.
    3) Monitor this Exchange for questions regarding performance and speed.
    4) Use the search box at the top of this page and search for "Greg McKaskle". Some of the best performance info available is in his writings.
    5) Use the "info-LabVIEW" search engine that can be found at
    http://www.searchview.net/
    and search for Greg McKaskle and Rolf Kalbermatter. Again, great information.
    6) Make a habit of using the Task Manager >>> Performance screen while you are developing and testing. It is a good way to find out what things are good and which are bad. You find out subtle things like straight line interpolation on a chart is faster than step interpolation.
    7) Read all of the LV release notes. The questions you posted are answered in those docs and changed in LV 6.0.
    8) Review the LabVIEW ZONE challenges. They are competitions based on performance. Read the reviews.
    9) Read the white papers. Do not miss 144!
    10) Start posting Q's on this Exchange. Let the experts (myself excluded) that prowl this list help out.
    Done for now,
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • Performance issue on Fiscal period.

    HI all,
    I have a multiprovider built on an infoset. The infoset is built on 3 standard ODS (0FIGL_O02, 0PUR_O01, 0PUR_DS03). The user is running the report by Company Code and Fiscal Period.
    The Company Code and Fiscal Period are available only in the FI-GL ODS. The purchasing ODS has only the Fiscal Variant time characteristic. When I am trying to run the report, it's taking an unusually long time to run.
    How should I resolve this performance issue?
    1) Will getting Fiscal Period into the purchasing ODS help improve the performance? If so, can anyone please give me a step-by-step process, as this is very urgent.
    2) Or should I take any other approach to improve the performance? The FI-GL ODS already has secondary indexes on it.
    Please advise.
    Message was edited by:
            sap novice

    Duplicate post:
    Performance issue on AFPO

  • OLAP_TABLE performance issues

    Please advise. OLAP_TABLE performance is slow and a SELECT did not return data after running for > 10 hours.
    ISSUES:
    (1) I started with only one measure and one dimension in the OLAP_TABLE statement and it returned data. However, when I added the remaining measures and dimensions, the statement never returned data. The end user requires all 20 measures and all 7 dimensions. How can I overcome the performance issue and get the data from the cube? Please HELP!
    (2) The cube is compressed. I read articles saying that OLAP_TABLE cannot use the LOOP keyword when the cube is compressed. I am already using the MODEL keyword and the statement did not come back. How can I improve the performance and return data?
    (3) Can I create an MV using OLAP_TABLE in 10g? If not, is there any way to get around it without an MV, because the VIEW is killing the performance so badly (it simply did not return and had to be killed manually)?
    (4) I also used the “AWM plug-in” to create the relational table view. However, all 20 measures and 7 dimensions must be included. It exceeded 4000 characters in a single PL/SQL function parameter and caused the limitmap error. So the “AWM plug-in” did not work for me.
    Appreciate all of your help!
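    Regarding issue (4): one way around the 4000-character parameter limit is the pattern the AWM plug-in itself generates later in this thread, i.e. store the limitmap text in a text variable inside the analytic workspace and pass only a short reference to OLAP_TABLE. A rough sketch mirroring that generated form; the variable name MY_LIMITMAP is made up for illustration and would have to exist in the AW first:
    -- Reference a limitmap stored in an AW text variable instead of
    -- passing the full limitmap text as a literal parameter.
    CREATE OR REPLACE VIEW vw_cube_bi_nrdb_vw_fl AS
    SELECT * FROM TABLE(OLAP_TABLE(
    'bi_nrdb DURATION SESSION',
    '&(MY_LIMITMAP)'));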
    (*) DATA
    1 Fact; 7 Dimensions; 1 cube; 20 measures
    POSITION_FACT - 9,387,384 rows
    DIM_BUSINESS_DAY - 2 rows - 1 hierarchy - 2 levels - 1 attribute
    DIM_INSTRUMENT_TYPE - 16 rows - 1 hierarchy - 2 levels - 1 attribute
    DIM_RISK_TYPE - 21 rows - 1 hierarchy - 2 levels - 1 attribute
    DIM_BOOK - 673 rows - 1 hierarchy - 10 levels - 2 attributes
    DIM_CURVE - 4,869 rows - 1 hierarchy - 6 levels - 1 attribute
    DIM_REFERENCE_ENTITY - 3,756 rows - 1 hierarchy - 7 levels - 3 attributes
    DIM_POSITION - 745,957 rows - 1 hierarchy - 2 levels - 9 attributes
    (*) CUBE CREATED IN AWM:
    fully pre-aggregated;
    used global composites;
    used compression by integer;
    partition by business date;
    took minimum ~ 30 minutes to build the cube.
    ENVIRONMENT:
    (*) Oracle Database 10g Release 2 Patch Set 2 (10.2.0.3.0) 64-bit
    (*) 3 products installed in Oracle Home: Interim patches: 5746153 (OLAP 'A' patch), 5556081, 5557962
    (*) AWM 10.2.0.3.0A
    (*) SQL
    CREATE OR REPLACE VIEW vw_cube_bi_nrdb_vw_fl AS
    SELECT * FROM TABLE(OLAP_TABLE(
    'bi_nrdb DURATION SESSION',
    MEASURE am_value_total AS NUMBER FROM bi_nrdb_vw_total
    MEASURE am_value_03m AS NUMBER FROM bi_nrdb_vw_am_03m
    MEASURE am_value_06m AS NUMBER FROM bi_nrdb_vw_am_06m
    MEASURE am_value_09m AS NUMBER FROM bi_nrdb_vw_am_09m
    MEASURE am_value_01y AS NUMBER FROM bi_nrdb_vw_am_01y
    MEASURE am_value_18m AS NUMBER FROM bi_nrdb_vw_am_18m
    MEASURE am_value_02y AS NUMBER FROM bi_nrdb_vw_am_02y
    MEASURE am_value_03y AS NUMBER FROM bi_nrdb_vw_am_03y
    MEASURE am_value_04y AS NUMBER FROM bi_nrdb_vw_am_04y
    MEASURE am_value_05y AS NUMBER FROM bi_nrdb_vw_am_05y
    MEASURE am_value_06y AS NUMBER FROM bi_nrdb_vw_am_06y
    MEASURE am_value_07y AS NUMBER FROM bi_nrdb_vw_am_07y
    MEASURE am_value_08y AS NUMBER FROM bi_nrdb_vw_am_08y
    MEASURE am_value_09y AS NUMBER FROM bi_nrdb_vw_am_09y
    MEASURE am_value_10y AS NUMBER FROM bi_nrdb_vw_am_10y
    MEASURE am_value_12y AS NUMBER FROM bi_nrdb_vw_am_12y
    MEASURE am_value_15y AS NUMBER FROM bi_nrdb_vw_am_15y
    MEASURE am_value_20y AS NUMBER FROM bi_nrdb_vw_am_20y
    MEASURE am_value_30y AS NUMBER FROM bi_nrdb_vw_am_30y
    MEASURE am_value_40y AS NUMBER FROM bi_nrdb_vw_am_40y
    DIMENSION dim_business FROM dim_business_day WITH
    ATTRIBUTE dt_business FROM dim_business_day_long_description
    DIMENSION dim_risk_type FROM dim_risk_type WITH
    ATTRIBUTE id_risk_type FROM dim_risk_type_long_description
    DIMENSION dim_instrument_type FROM dim_instrument_type WITH
    ATTRIBUTE id_instrument_type FROM dim_instrument_type_long_description
    DIMENSION dim_book FROM dim_book WITH
    ATTRIBUTE nm_dim_book FROM dim_book_long_description
    ATTRIBUTE trader FROM dim_book_trader
    DIMENSION dim_reference_entity FROM dim_reference_entity WITH
    ATTRIBUTE nm_dim_reference_entity FROM dim_reference_entity_long_description
    ATTRIBUTE nm_spn_moody_rating FROM dim_reference_entity_spn_moody_rating
    ATTRIBUTE nm_spn_sp_rating FROM dim_reference_entity_spn_sp_rating
    DIMENSION dim_position FROM dim_position WITH
    ATTRIBUTE id_buysell FROM dim_position_buysell
    ATTRIBUTE id_coupon FROM dim_position_coupon
    ATTRIBUTE id_cusip FROM dim_position_cusip
    ATTRIBUTE id_isin FROM dim_position_isin
    ATTRIBUTE id_instrument_name FROM dim_position_instrument_name
    ATTRIBUTE id_maturity FROM dim_position_maturity
    ATTRIBUTE id_mtm FROM dim_position_mtm
    ATTRIBUTE id_notional FROM dim_position_notional
    MODEL
    DIMENSION BY (dim_business, dt_business, dim_risk_type, id_risk_type, dim_instrument_type, id_instrument_type,
    dim_book, nm_dim_book, trader, dim_reference_entity, nm_dim_reference_entity, nm_spn_moody_rating,
    nm_spn_sp_rating, dim_position, id_buysell, id_coupon, id_cusip, id_isin, id_instrument_name,
    id_maturity, id_mtm, id_notional)
    MEASURES (am_value_total, am_value_03m, am_value_06m, am_value_09m, am_value_01y,
    am_value_18m,am_value_02y, am_value_03y, am_value_04y, am_value_05y,
    am_value_06y, am_value_07y,am_value_08y, am_value_09y, am_value_10y,
    am_value_12y, am_value_15y, am_value_20y, am_value_30y, am_value_40y)
    RULES UPDATE SEQUENTIAL ORDER ();

    (1a) Thank you so much! The SQL “CREATE OR REPLACE VIEW vw_cube_bi_nrdb_vw_fl…” that I provided in my previous post was the OLAP_TABLE statement I ran for > 10 hrs and killed manually. Please advise.
    I have business requirements to display all 7 dimensions and 20 measures for reporting. So I can’t really filter my dimensions much.
    (1b) Separately, I also followed your advice to add a WHERE clause after I created the VIEW vw_cube_bi_nrdb_vw_fl; see the statement below, and I received an error.
    SQL> select * from vw_cube_bi_nrdb_vw_fl
    where dt_business = '06/23/2008'
    and ID_RISK_TYPE ='CR01'
    and ID_INSTRUMENT_TYPE='VANILLA CDS'
    and dim_book='BOOK_63272279'
    AND dim_reference_entity='GFN_0113182'
    and dim_position='3645636' ;
    select * from vw_cube_bi_nrdb_vw_fl
    ERROR at line 1:
    ORA-32638: Non unique addressing in MODEL dimensions
    (4) I received the below error when I ran the “AWM plug-in” to create a VIEW from AWM. I have already unchecked the fields that I do not need, keeping only the measures and dimensions I need. Sorry for the long error log below:
    AUG-21-2008 12:14:47: . Creating view for cube BI_NRDB_AGGR
    AUG-21-2008 12:14:48: ..... generating limitmap for cube
    AUG-21-2008 12:14:48: ... generating limitmap for cube
    AUG-21-2008 12:14:48: ..... mapping table out of date for cube BI_NRDB_AGGR. Updating mapping table.
    AUG-21-2008 12:14:48: ... populating mapping for cube BI_NRDB_AGGR
    AUG-21-2008 12:14:48: ..... clearing mappings for the cube
    AUG-21-2008 12:14:48: ..... collecting metadata for measures in cube
    AUG-21-2008 12:14:48: ..... retrieving dimensions for the cube. Need limitmap for each dimension.
    AUG-21-2008 12:14:48: ... generating limitmap for DIM_BUSINESS_DAY
    AUG-21-2008 12:14:48: ..... mapping table out of date for dimension DIM_BUSINESS_DAY. Updating mapping table.
    AUG-21-2008 12:14:48: ... populating dimension map for DIM_BUSINESS_DAY
    AUG-21-2008 12:14:48: ..... clearing mappings for the dimension
    AUG-21-2008 12:14:48: ..... retrieving physical objects
    AUG-21-2008 12:14:48: ..... checking for value hierarchies
    AUG-21-2008 12:14:48: ..... retrieving label for dimension levels
    AUG-21-2008 12:14:48: ..... populating mapping info for the DIMENSION clause
    AUG-21-2008 12:14:48: ..... populating mapping info for the INHIERARCHY clause
    AUG-21-2008 12:14:48: ..... retrieving hierarchy information
    AUG-21-2008 12:14:48: ..... populating mapping info for the HIERARCHY and FAMILYREL clauses for hierarchy PRIM
    AUG-21-2008 12:14:48: ..... populating mapping info for the ATTRIBUTE clause
    AUG-21-2008 12:14:48: ... completed populating mapping for DIM_BUSINESS_DAY
    AUG-21-2008 12:14:48: ... generating limitmap for DIM_INSTRUMENT_TYPE
    AUG-21-2008 12:14:48: ..... mapping table out of date for dimension DIM_INSTRUMENT_TYPE. Updating mapping table.
    AUG-21-2008 12:14:48: ... populating dimension map for DIM_INSTRUMENT_TYPE
    AUG-21-2008 12:14:48: ..... clearing mappings for the dimension
    AUG-21-2008 12:14:48: ..... retrieving physical objects
    AUG-21-2008 12:14:48: ..... checking for value hierarchies
    AUG-21-2008 12:14:48: ..... retrieving label for dimension levels
    AUG-21-2008 12:14:48: ..... populating mapping info for the DIMENSION clause
    AUG-21-2008 12:14:48: ..... populating mapping info for the INHIERARCHY clause
    AUG-21-2008 12:14:48: ..... retrieving hierarchy information
    AUG-21-2008 12:14:49: ..... populating mapping info for the HIERARCHY and FAMILYREL clauses for hierarchy PRIM
    AUG-21-2008 12:14:49: ..... populating mapping info for the ATTRIBUTE clause
    AUG-21-2008 12:14:49: ... completed populating mapping for DIM_INSTRUMENT_TYPE
    AUG-21-2008 12:14:49: ... generating limitmap for DIM_RISK_TYPE
    AUG-21-2008 12:14:49: ..... mapping table out of date for dimension DIM_RISK_TYPE. Updating mapping table.
    AUG-21-2008 12:14:49: ... populating dimension map for DIM_RISK_TYPE
    AUG-21-2008 12:14:49: ..... clearing mappings for the dimension
    AUG-21-2008 12:14:49: ..... retrieving physical objects
    AUG-21-2008 12:14:49: ..... checking for value hierarchies
    AUG-21-2008 12:14:49: ..... retrieving label for dimension levels
    AUG-21-2008 12:14:49: ..... populating mapping info for the DIMENSION clause
    AUG-21-2008 12:14:49: ..... populating mapping info for the INHIERARCHY clause
    AUG-21-2008 12:14:49: ..... retrieving hierarchy information
    AUG-21-2008 12:14:49: ..... populating mapping info for the HIERARCHY and FAMILYREL clauses for hierarchy PRIM
    AUG-21-2008 12:14:49: ..... populating mapping info for the ATTRIBUTE clause
    AUG-21-2008 12:14:49: ... completed populating mapping for DIM_RISK_TYPE
    AUG-21-2008 12:14:49: ... generating limitmap for DIM_BOOK
    AUG-21-2008 12:14:49: ..... mapping table out of date for dimension DIM_BOOK. Updating mapping table.
    AUG-21-2008 12:14:49: ... populating dimension map for DIM_BOOK
    AUG-21-2008 12:14:49: ..... clearing mappings for the dimension
    AUG-21-2008 12:14:49: ..... retrieving physical objects
    AUG-21-2008 12:14:49: ..... checking for value hierarchies
    AUG-21-2008 12:14:49: ..... retrieving label for dimension levels
    AUG-21-2008 12:14:49: ..... populating mapping info for the DIMENSION clause
    AUG-21-2008 12:14:49: ..... populating mapping info for the INHIERARCHY clause
    AUG-21-2008 12:14:49: ..... retrieving hierarchy information
    AUG-21-2008 12:14:49: ..... populating mapping info for the HIERARCHY and FAMILYREL clauses for hierarchy PRIM
    AUG-21-2008 12:14:49: ..... populating mapping info for the ATTRIBUTE clause
    AUG-21-2008 12:14:50: ... completed populating mapping for DIM_BOOK
    AUG-21-2008 12:14:50: ... generating limitmap for DIM_CURVE
    AUG-21-2008 12:14:50: ..... mapping table out of date for dimension DIM_CURVE. Updating mapping table.
    AUG-21-2008 12:14:50: ... populating dimension map for DIM_CURVE
    AUG-21-2008 12:14:50: ..... clearing mappings for the dimension
    AUG-21-2008 12:14:50: ..... retrieving physical objects
    AUG-21-2008 12:14:50: ..... checking for value hierarchies
    AUG-21-2008 12:14:50: ..... retrieving label for dimension levels
    AUG-21-2008 12:14:50: ..... populating mapping info for the DIMENSION clause
    AUG-21-2008 12:14:50: ..... populating mapping info for the INHIERARCHY clause
    AUG-21-2008 12:14:50: ..... retrieving hierarchy information
    AUG-21-2008 12:14:50: ..... populating mapping info for the HIERARCHY and FAMILYREL clauses for hierarchy PRIM
    AUG-21-2008 12:14:50: ..... populating mapping info for the ATTRIBUTE clause
    AUG-21-2008 12:14:50: ... completed populating mapping for DIM_CURVE
    AUG-21-2008 12:14:50: ... generating limitmap for DIM_REFERENCE_ENTITY
    AUG-21-2008 12:14:50: ..... mapping table out of date for dimension DIM_REFERENCE_ENTITY. Updating mapping table.
    AUG-21-2008 12:14:50: ... populating dimension map for DIM_REFERENCE_ENTITY
    AUG-21-2008 12:14:50: ..... clearing mappings for the dimension
    AUG-21-2008 12:14:50: ..... retrieving physical objects
    AUG-21-2008 12:14:50: ..... checking for value hierarchies
    AUG-21-2008 12:14:50: ..... retrieving label for dimension levels
    AUG-21-2008 12:14:50: ..... populating mapping info for the DIMENSION clause
    AUG-21-2008 12:14:50: ..... populating mapping info for the INHIERARCHY clause
    AUG-21-2008 12:14:50: ..... retrieving hierarchy information
    AUG-21-2008 12:14:50: ..... populating mapping info for the HIERARCHY and FAMILYREL clauses for hierarchy PRIM
    AUG-21-2008 12:14:50: ..... populating mapping info for the ATTRIBUTE clause
    AUG-21-2008 12:14:51: ... completed populating mapping for DIM_REFERENCE_ENTITY
    AUG-21-2008 12:14:51: ... generating limitmap for DIM_POSITION
    AUG-21-2008 12:14:51: ..... mapping table out of date for dimension DIM_POSITION. Updating mapping table.
    AUG-21-2008 12:14:51: ... populating dimension map for DIM_POSITION
    AUG-21-2008 12:14:51: ..... clearing mappings for the dimension
    AUG-21-2008 12:14:51: ..... retrieving physical objects
    AUG-21-2008 12:14:51: ..... checking for value hierarchies
    AUG-21-2008 12:14:51: ..... retrieving label for dimension levels
    AUG-21-2008 12:14:51: ..... populating mapping info for the DIMENSION clause
    AUG-21-2008 12:14:51: ..... populating mapping info for the INHIERARCHY clause
    AUG-21-2008 12:14:51: ..... retrieving hierarchy information
    AUG-21-2008 12:14:51: ..... populating mapping info for the HIERARCHY and FAMILYREL clauses for hierarchy PRIM
    AUG-21-2008 12:14:51: ..... populating mapping info for the ATTRIBUTE clause
    AUG-21-2008 12:14:51: ... completed populating mapping for DIM_POSITION
    AUG-21-2008 12:14:51: ... completed generating limitmap for cube BI_NRDB_AGGR
    AUG-21-2008 12:14:51: ..... assigning limitmap to variable in the AW
    AUG-21-2008 12:14:51: ..... BI_NRDB_AGGR_CUBE_LIMITMAP found. Will update the variable
    AUG-21-2008 12:14:51: ..... defining view BI_DEMO.BI_NRDB_AGGR_CUBEVIEW over the cube
    AUG-21-2008 12:14:51: **
    AUG-21-2008 12:14:51: ** ERROR: View not created.
    AUG-21-2008 12:14:51: ** CAUSE: CREATE VIEW statement failed
    AUG-21-2008 12:14:51: ORA-36804: The OLAP_TABLE function encountered an error while parsing the LIMITMAP.
    AUG-21-2008 12:14:51: *** DEBUG INFORMATION ***
    AUG-21-2008 12:14:51: VIEW CREATION DDL ((truncated after 3900 characters)
    AUG-21-2008 12:14:51: CREATE OR REPLACE VIEW BI_DEMO.BI_NRDB_AGGR_CUBEVIEW AS
    SELECT *
    FROM table(OLAP_TABLE ('BI_DEMO.BI_NRDB duration session',
    '&(BI_NRDB_AGGR_CUBE_LIMITMAP)'))
    MODEL
    DIMENSION BY (
    DIM_BUSINESS_DAY,
    DIM_INSTRUMENT_TYPE,
    DIM_RISK_TYPE,
    DIM_BOOK,
    DIM_CURVE,
    DIM_REFERENCE_ENTITY,
    DIM_POSITION)
    MEASURES (
    DIM_BUSINE_PRIM_PRNT,
    DIM_BUSINE_ALL_BUSINE_LVLDSC,
    DIM_BUSINE_BUSINESS_D_LVLDSC,
    DIM_BUSINE_LDSC,
    DIM_BUSINE_LEVEL,
    DIM_INSTRU_PRIM_PRNT,
    DIM_INSTRU_ALL_INSTRU_LVLDSC,
    DIM_INSTRU_INSTRUMENT_LVLDSC,
    DIM_INSTRU_LDSC,
    DIM_INSTRU_LEVEL,
    DIM_RISK_T_PRIM_PRNT,
    DIM_RISK_T_ALL_RISK_T_LVLDSC,
    DIM_RISK_T_RISK_TYPE_LVLDSC,
    DIM_RISK_T_LDSC,
    DIM_RISK_T_LEVEL,
    DIM_BOOK_PRIM_PRNT,
    DIM_BOOK_ALL_BOOK_LVLDSC,
    DIM_BOOK_SYSTEM_LVLDSC,
    DIM_BOOK_REGION_LVLDSC,
    DIM_BOOK_BUSINESS2_LVLDSC,
    DIM_BOOK_BUSINESS1_LVLDSC,
    DIM_BOOK_BUSINESS_LVLDSC,
    DIM_BOOK_DESK_LVLDSC,
    DIM_BOOK_RISKSTRIPE_LVLDSC,
    DIM_BOOK_SUBRISKSTR_LVLDSC,
    DIM_BOOK_BOOK_LVLDSC,
    DIM_BOOK_TRADER,
    DIM_BOOK_LDSC,
    DIM_BOOK_LEVEL,
    DIM_CURVE_PRIM_PRNT,
    DIM_CURVE_ALL_CURVE_LVLDSC,
    DIM_CURVE_CURRENCY_LVLDSC,
    DIM_CURVE_SENIORITY_LVLDSC,
    DIM_CURVE_CURVE_OWNE_LVLDSC,
    DIM_CURVE_CURVE_NAME_LVLDSC,
    DIM_CURVE_CURVE_LVLDSC,
    DIM_CURVE_LDSC,
    DIM_CURVE_LEVEL,
    DIM_REFERE_ALL_GFN_LVLDSC,
    DIM_REFERE_GFN_INDUST_LVLDSC,
    DIM_REFERE_GFN_COUNTR_LVLDSC,
    DIM_REFERE_GFN_LVLDSC,
    DIM_REFERE_SPN_INDUST_LVLDSC,
    DIM_REFERE_SPN_COUNTR_LVLDSC,
    DIM_REFERE_SPN_LVLDSC,
    DIM_REFERE_LDSC,
    DIM_REFERE_SPN_SP_RATING,
    DIM_REFERE_SPN_MOODY_RATING,
    DIM_REFERE_LEVEL,
    DIM_REFERE_PRIM_PRNT,
    DIM_POSITI_PRIM_PRNT,
    DIM_POSITI_ALL_POSITI_LVLDSC,
    DIM_POSITI_TRADE_ID_LVLDSC,
    DIM_POSITI_COUPON,
    DIM_POSITI_MATURITY,
    DIM_POSITI_INSTRUMENT_NAME,
    DIM_POSITI_NOTIONAL,
    DIM_POSITI_BUYSELL,
    DIM_POSITI_CUSIP,
    DIM_POSITI_ISIN,
    DIM_POSITI_LDSC,
    DIM_POSITI_MTM,
    DIM_POSITI_LEVEL,
    TOTAL,
    AM_40Y,
    AM_30Y,
    AM_20Y,
    AM_15Y,
    AM_12Y,
    AM_10Y,
    AM_09Y,
    AM_08Y,
    AM_07Y,
    AM_06Y,
    AM_05Y,
    AM_04Y,
    AM_03Y,
    AM_02Y,
    AM_18M,
    AM_01Y,
    AM_09M,
    AM_06M,
    AM_03M,
    OLAP_CALC
    ) RULES UPDATE SEQUENTIAL ORDER()
    AUG-21-2008 12:14:51: LIMITMAP (truncated after 3900 characters):
    AUG-21-2008 12:14:51: DIMENSION DIM_BUSINESS_DAY FROM DIM_BUSINESS_DAY WITH -
    HIERARCHY DIM_BUSINE_PRIM_PRNT FROM DIM_BUSINESS_DAY_PARENTREL(DIM_BUSINESS_DAY_HIERLIST \'PRIM\') -
    INHIERARCHY DIM_BUSINESS_DAY_INHIER -
    FAMILYREL DIM_BUSINE_ALL_BUSINE_LVLDSC, -
    DIM_BUSINE_BUSINESS_D_LVLDSC -
    FROM DIM_BUSINESS_DAY_FAMILYREL(DIM_BUSINESS_DAY_LEVELLIST \'ALL_BUSINESS_DAY\'), -
    DIM_BUSINESS_DAY_FAMILYREL(DIM_BUSINESS_DAY_LEVELLIST \'BUSINESS_DAY\') -
    LABEL DIM_BUSINESS_DAY_LONG_DESCRIPTION -
    ATTRIBUTE DIM_BUSINE_LDSC FROM DIM_BUSINESS_DAY_LONG_DESCRIPTION -
    ATTRIBUTE DIM_BUSINE_LEVEL FROM DIM_BUSINESS_DAY_LEVELREL-
    DIMENSION DIM_INSTRUMENT_TYPE FROM DIM_INSTRUMENT_TYPE WITH -
    HIERARCHY DIM_INSTRU_PRIM_PRNT FROM DIM_INSTRUMENT_TYPE_PARENTREL(DIM_INSTRUMENT_TYPE_HIERLIST \'PRIM\') -
    INHIERARCHY DIM_INSTRUMENT_TYPE_INHIER -
    FAMILYREL DIM_INSTRU_ALL_INSTRU_LVLDSC, -
    DIM_INSTRU_INSTRUMENT_LVLDSC -
    FROM DIM_INSTRUMENT_TYPE_FAMILYREL(DIM_INSTRUMENT_TYPE_LEVELLIST \'ALL_INSTRUMENT_TYPE\'), -
    DIM_INSTRUMENT_TYPE_FAMILYREL(DIM_INSTRUMENT_TYPE_LEVELLIST \'INSTRUMENT_TYPE\') -
    LABEL DIM_INSTRUMENT_TYPE_LONG_DESCRIPTION -
    ATTRIBUTE DIM_INSTRU_LDSC FROM DIM_INSTRUMENT_TYPE_LONG_DESCRIPTION -
    ATTRIBUTE DIM_INSTRU_LEVEL FROM DIM_INSTRUMENT_TYPE_LEVELREL-
    DIMENSION DIM_RISK_TYPE FROM DIM_RISK_TYPE WITH -
    HIERARCHY DIM_RISK_T_PRIM_PRNT FROM DIM_RISK_TYPE_PARENTREL(DIM_RISK_TYPE_HIERLIST \'PRIM\') -
    INHIERARCHY DIM_RISK_TYPE_INHIER -
    FAMILYREL DIM_RISK_T_ALL_RISK_T_LVLDSC, -
    DIM_RISK_T_RISK_TYPE_LVLDSC -
    FROM DIM_RISK_TYPE_FAMILYREL(DIM_RISK_TYPE_LEVELLIST \'ALL_RISK_TYPE\'), -
    DIM_RISK_TYPE_FAMILYREL(DIM_RISK_TYPE_LEVELLIST \'RISK_TYPE\') -
    LABEL DIM_RISK_TYPE_LONG_DESCRIPTION -
    ATTRIBUTE DIM_RISK_T_LDSC FROM DIM_RISK_TYPE_LONG_DESCRIPTION -
    ATTRIBUTE DIM_RISK_T_LEVEL FROM DIM_RISK_TYPE_LEVELREL-
    DIMENSION DIM_BOOK FROM DIM_BOOK WITH -
    HIERARCHY DIM_BOOK_PRIM_PRNT FROM DIM_BOOK_PARENTREL(DIM_BOOK_HIERLIST \'PRIM\') -
    INHIERARCHY DIM_BOOK_INHIER -
    FAMILYREL DIM_BOOK_ALL_BOOK_LVLDSC, -
    DIM_BOOK_SYSTEM_LVLDSC, -
    DIM_BOOK_REGION_LVLDSC, -
    DIM_BOOK_BUSINESS2_LVLDSC, -
    DIM_BOOK_BUSINESS1_LVLDSC, -
    DIM_BOOK_BUSINESS_LVLDSC, -
    DIM_BOOK_DESK_LVLDSC, -
    DIM_BOOK_RISKSTRIPE_LVLDSC, -
    DIM_BOOK_SUBRISKSTR_LVLDSC, -
    DIM_BOOK_BOOK_LVLDSC -
    FROM DIM_BOOK_FAMILYREL(DIM_BOOK_LEVELLIST \'ALL_BOOK\'), -
    DIM_BOOK_FAMILYREL(DIM_BOOK_LEVELLIST \'SYSTEM\'), -
    DIM_BOOK_FAMILYREL(DIM_BOOK_LEVELLIST \'REGION\'), -
    DIM_BOOK_FAMILYREL(DIM_BOOK_LEVELLIST \'BUSINESS2\'), -
    DIM_BOOK_FAMILYREL(DIM_BOOK_LEVELLIST \'BUSINESS1\'), -
    DIM_BOOK_FAMILYREL(DIM_BOOK_LEVELLIST \'BUSINESS\'), -
    DIM_BOOK_FAMILYREL(DIM_BOOK_LEVELLIST \'DESK\'), -
    DIM_BOOK_FAMILYREL(DIM_BOOK_LEVELLIST \'RISKSTRIPE\'), -
    DIM_BOOK_FAMILYREL(DIM_BOOK_LEVELLIST \'SUBRISKSTRIPE\'), -
    DIM_BOOK_FAMILYREL(DIM_BOOK_LEVELLIST \'BOOK\') -
    LABEL DIM_BOOK_LONG_DESCRIPTION -
    ATTRIBUTE DIM_BOOK_TRADER FROM DIM_BOOK_TRADER -
    ATTRIBUTE DIM_BOOK_LDSC FROM DIM_BOOK_LONG_DESCRIPTION -
    ATTRIBUTE DIM_BOOK_LEVEL FROM DIM_BOOK_LEVELREL-
    DIMENSION DIM_CURVE FROM DIM_CURVE WITH -
    HIERARCHY DIM_CURVE_PRIM_PRNT FROM DIM_CURVE_PARENTREL(DIM_CURVE_HIERLIST \'PRIM\') -
    INHIERARCHY DIM_CURVE_INHIER -
    FAMILYREL DIM_CURVE_ALL_CURVE_LVLDSC, -
    DIM_CURVE_CURRENCY_LVLDSC, -
    DIM_CURVE_SENIORITY_LVLDSC, -
    DIM_CURVE_CURVE_OWNE_LVLDSC, -
    DIM_CURVE_CURVE_NAME_LVLDSC, -
    DIM_CURVE_CURVE_LVLDSC -
    FROM DIM_CURVE_FAMILYREL(DIM_CURVE_LEVELLIST \'ALL_CURVE\'), -
    DIM_CURVE_FAMILYREL(DIM_CURVE_LEVELLIST \'CURRENCY\'), -
    DIM_CURVE_FAMILYREL(DIM_CURVE_LEVELLIST \'SENIORITY\'), -
    DIM_CURVE_FAMILYREL(DIM_CURVE_LEVELLIST \'CURVE_OWNER\'), -
    DIM_CURVE_FAMILYREL(DIM_CURVE_LE
    AUG-21-2008 12:14:51: **
    AUG-21-2008 12:14:51: ** ERROR: Unable to create view over cube BI_NRDB_AGGR.
    AUG-21-2008 12:14:51: ORA-36804: The OLAP_TABLE function encountered an error while parsing the LIMITMAP.

  • Performance issues in bw

    Hi All,
    What is number range buffering? How is it useful for performance issues? Where is the option available, and what are the navigation steps to set it?
    Thanks in advance.
    Yogeswar

    Hi Yogi,
    A nice weblog by Vikas on number range buffering; please do check it:
    /people/vikash.agrawal/blog/2006/04/05/load-lots-of-data-147faster148-with-buffering-number-range
    Check these links.
    FAQ - The Future of SAP NetWeaver Business Intelligence in the Light of the NetWeaver BI&Business Objects Roadmap
    https://www.sdn.sap.com/irj/sdn/developerareas/bi?rid=/webcontent/uuid/b4674415-0b01-0010-ae81-deb009860b7e [original link is broken]
    following are the links that may help you
    http://help.sap.com/search/highlightContent.jsp
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/com.sap.km.cm.docs/library/biw/s-u/sap%20bw%20business%20planning%20and%20simulation%20-%20how%20to%20guides%20list.htm
    http://help.sap.com/search/highlightContent.jsp
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/cccad390-0201-0010-5093-fd9ec8157802
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/afbad390-0201-0010-daa4-9ef0168d41b6
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/b7bdde90-0201-0010-26b1-dcda5e0b394d
    How to improve performance on the reporting side?
    Query Performance Techniques:
    1. Check Query properties—Use RSRT tcode
    2. Check whether the cube is compressed (see the sketch after this list)
    3. Optimize query definition
    4. Analyze query execution
    5. Check for additional indexes
    6. Archive unwanted data
    7. Check for partitioning options
    8. Check for additional aggregates ( Consider DB ratio and KPI ratio)
    9. Check for parallelization options
    10. Use Nav attributes instead of hierarchies, use free char and filters.
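    Regarding point 2 in the list above: on an Oracle-based BW system a quick way to see whether a cube has been compressed is to compare the row counts of its F and E fact tables. This is only a minimal sketch; the cube name ZSALES is a placeholder, and custom cubes follow the /BIC/F<cube> and /BIC/E<cube> naming convention (content cubes use /BI0/):
    -- A large F table next to a nearly empty E table means the cube's
    -- requests have not been compressed yet.
    SELECT 'F (uncompressed)' AS fact_table, COUNT(*) AS row_cnt FROM "/BIC/FZSALES"
    UNION ALL
    SELECT 'E (compressed)', COUNT(*) FROM "/BIC/EZSALES";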
    Possible causes for the performance :
    A) High Database Runtime
    B) High OLAP Runtime
    C) High Frontend Runtime
    Depending upon your analysis
    A)Strategy - High Database Runtime
    Check if an aggregate is suitable (use "All data" to get the values "selected records to transferred records"; a high number here would be an indicator that an aggregate could improve query performance)
    Check if database statistics are up to date for the Cube/Aggregate; use Tcode RSRV output (use the database check for statistics and indexes) - see the sketch below
    Check if the read mode of the query is unfavourable - Recommended (H)
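    As a companion to the statistics check above, a minimal sketch (Oracle only; the LIKE pattern assumes custom cubes and that you are connected as the SAP schema owner):
    -- Stale or missing optimizer statistics on the fact tables are a common
    -- cause of high database runtime; LAST_ANALYZED shows when they were
    -- last gathered.
    SELECT table_name, num_rows, last_analyzed
    FROM   user_tables
    WHERE  table_name LIKE '/BIC/F%'
    ORDER  BY last_analyzed;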
    B)Strategy - High OLAP Runtime
    Check if a high number of cells is transferred to the OLAP (use "All data" to get the value "No. of Cells")
    a) Use RSRT technical Information to check if any extra OLAP-processing is necessary (Stock Query, Exception Aggregation, Calc. before Aggregation, Virtual Char. Key Figures, Attributes in Calculated Key Figs, Time-dependent Currency Translation) together with a high number of records transferred.
    b) Check if a user exit Usage is involved in the OLAP runtime?
    c) Check if large hierarchies are used and the entry hierarchy level is as deep as possible. This limits the levels of the hierarchy that must be processed.
    C)Strategy - High Frontend Runtime
    1) Check if the frontend PCs are within the recommendation (RAM, CPU MHz)
    2) Check if the bandwidth for WAN connection is sufficient.
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/1e553368-0601-0010-49ab-c429607f3eb3
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/5401ab90-0201-0010-b394-99ffdb15235b
    Check this; you can download a lot of performance material:
    Business Intelligence Performance Tuning [original link is broken]
    and e-learning -> intermediate course and advanced course
    https://www.sdn.sap.com/irj/sdn/developerareas/bi?rid=/webcontent/uuid/fe5b0b5e-0501-0010-cd88-c871915ec3bf [original link is broken]
    e.g
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/media/uuid/10b589ad-0701-0010-0299-e5c282b7aaad
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/media/uuid/d9fd84ad-0701-0010-d9a5-ba726caa585d
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/media/uuid/8e6183ad-0701-0010-e083-9ab1c6afe6f2
    performance tools in bw 3.5
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/media/uuid/07a4f070-0701-0010-3b91-a6bf7644c98f
    (here you can also download the presentation by right-clicking the disk drive icon)
    Check the following links,
    FAQ - The Future of SAP NetWeaver Business Intelligence in the Light of the NetWeaver BI&Business Objects Roadmap
    Business Intelligence Performance Tuning [original link is broken]
    http://help.sap.com/saphelp_nw04/helpdata/en/06/b5f8926ba22b45bc9eaa589f1c835b/content.htm
    Some bw docs/ performance material
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/1955ba90-0201-0010-d3aa-8b2a4ef6bbb2
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/3a699d90-0201-0010-bc99-d5c0e3a2c87b
    https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/4c0ab590-0201-0010-bd9a-8332d8b4f09c
    and don't miss the BW performance knowledge centre; there is e-learning material:
    Business Intelligence Performance Tuning [original link is broken]
    Hope this Helps.
    <removed>
    Regards,
    Ravikanth.

  • Info Package in Process Chain - Performance Issue

    Dear All,
    We have some InfoPackages in a process chain which, when triggered, initially turn red, then yellow again, and after some time red/green. The status keeps changing and finally ends up green. These InfoPackages take 15-30 mins to complete.
    The InfoPackage error message shows "Processing time overdue". The chain runs daily and this issue also occurs on a daily basis.
    Does any one of you have an idea of how to deal with this performance issue?
    Regards.

    Hi
    As you said--
    "initially time out time was set to 10 mins, so it was turning red at 10 mins. Later i changed it to 30 mins, then it is turning red at 30 mins, this info package takes maximum 20 mins to complete.
    thatsy i asked whether there will be any problem if we remove this time out time."
    I guess, its not mandatory to include the timeout value. Just try to execute the IP without entering the timeout value.
    It should work.
    Let me know if this helps.
    Regards
    SaiPrasad

  • How to troubleshoot JVM performance issues

    We recently had a performance issue with our application running on Java 1.4.2 on Windows where the CPU utilization was extremely high. Using the +PrintClassHistogram option I was able to determine that a char array was utilizing more than 500 megabytes of memory.
    My question is, is there another option I can include or process I can follow that would help me determine which object was creating this array?
    I am not a Java programmer so I apologize if this is elementary question or if I posted this in the wrong forum. I am the SA they call when the application is not running correctly. I have been reading this forum and the manuals trying to find something that would assist me. The developers will fix the problem after I pinpoint where the problem is.
    We do not have any type of JVM profiler so I am trying to troubleshoot this issue with thread dumps and JVM options. My assumption is that if a profiler can retrieve the information I should be able to. Is this a correct assumption?
    Lastly, if we were to purchase a profiler does anyone have a recommendation on which one to use? Our latest version of the application utilizes Java 5 so we would need something that would work with both versions of Java if possible.
    Any assistance or recommendations anyone could provide would be greatly appreciated.

    unixmon wrote:
    We recently had a performance issue with our application running on Java 1.4.2 on Windows where the CPU utilization was extremely high. Using the +PrintClassHistogram option I was able to determine that a char array was utilizing more than 500 megabytes of memory.
    My question is, is there another option I can include or process I can follow that would help me determine which object was creating this array?
    I am not a Java programmer so I apologize if this is elementary question or if I posted this in the wrong forum. I am the SA they call when the application is not running correctly. I have been reading this forum and the manuals trying to find something that would assist me. The developers will fix the problem after I pinpoint where the problem is.
    This is a management problem. If you have a performance problem then you should be calling the developers and telling them to figure it out and that they need to fix it. You certainly shouldn't be looking at code or profilers.
    Unless you have plenty of time on your hands and your goal is to become a developer.
    But regardless...
    Buying a profiler is an option but if you go that route then to figure out a solution you will be a java programmer even if that isn't your title.
    You should however figure out how to simulate load on the server, even with a profiler. There are tools that allow you to send things like http requests, web service requests, etc. Simulating load is a good idea even without a profiler. It allows you to estimate future system needs dependent upon future company growth.
    Note that profiling really slows down an application so an app that can handle 10 requests a second might only be able to do 1 a second or less while being profiled.

  • Performance issue at Webi/Universe

    Hi BO Experts,
    I am new to Business objects.
    I have developed a new universe and Webi reports, but I am facing some performance issues on the Webi side. Below I am furnishing my development details; please let me know how I can overcome the performance issues.
    1. Created a query on a multiprovider (the query has nearly 125 objects: InfoObjects (restricted with hierarchies), navigational attributes & restricted key figures).
    2. Developed a new universe on the query.
    3. For performance, restricted some fields in the query filters and also applied conditions at the universe level.
    4. Created a Webi report; it is generating the results for 8 fields (4 characteristics (hierarchy levels) + 4 restricted key figures) and I also used 5 conditions in the Webi report filters.
    Finally, my Webi results match the BW query results, but the Webi report takes much longer (approximately 15 min.) compared with the BW query (approximately 1 min.) and the MDX query (approximately 2 min.). Please let me know if I need to make any changes at the universe, Webi, or BW query level to overcome this performance issue.
    Awaiting your response.
    Regards,
    PRK

    Hi,
    Here are a few steps to improve the performance:
    Try to build aggregates at the cube level for those characteristics which you are using frequently in WEBI reports.
    As you are using a hierarchy, the report performance will never be great.
    I suggest this is best handled at the BI modelling level instead of at the WEBI report level, as in WEBI reports we don't have many options to improve the performance.
    Also, I suggest you look into the query performance at the BEx level first and then use the query for further connections.
    Regards
    Sindhu

How to improve a performance issue when we are using the BRM LDB

    HI All,
    I am facing a performance issue when I am retrieving the data from BKPF and the respective BSEG table... I see that for the fiscal period there are around 60 lakh records, and populating the data from the tables into the final internal table is taking a lot of time.
    When I tried to make use of the BRM LDB with SAP Query/QuickViewer, it's the same issue.
    Please suggest how to improve the performance.
    Thanks in advance
    Chakradhar

    Moderator message - Please see Please Read before Posting in the Performance and Tuning Forum before posting - post locked
    Rob

Every time I boot up I receive the message that "iCal cannot log in to your iCloud account because your password is incorrect…" I do enter it correctly but the same error message comes up the next time the computer boots up.  How do I correct this issue?

    How do I correct the problem that every time I boot up I receive the message that “iCal cannot log in to your iCloud account because your password is incorrect…”? I do enter it correctly but the same error message comes up the next time the computer boots up.  Thanks.

    Hello jwpg,
    Thank you for the details of the issue you are experiencing after restoring an iCloud backup.  I recommend the following articles for an issue like this:
    iCloud: "Incorrect Apple ID or Password" alert when setting up iCloud after upgrading to iOS 7
    http://support.apple.com/kb/TS5207
    iOS 7: If you're asked for the password to your previous Apple ID when signing out of iCloud
    http://support.apple.com/kb/TS5223
    Thank you for using Apple Support Communities.
    Best,
    Sheila M.

  • How to restrict logging on portal (Performance issue)

    Hi guys,
    does anyone know how to restrict the logging of the portal? Even though I made a change in Visual Administrator -> Services -> Log Configurator, setting ALL categories to severity = 'Error', I keep getting 'Info' severity traces logged in defaultTrace.trc (viewed with the Log Viewer). Which in practice means tons of logged trace information! I think this might be one of the reasons we have been experiencing performance issues lately.
    Does anyone know how to restrict this logging?
    Any help greatly appreciated!
    Cheers,
    Frank

    Hi,
    I would not only set the severity of categories to 'Error' but also locations to 'Error'.
    A log controller can be a "category" or a "location". Categories are used for "logging" and represent problem areas such as network, database, security, user interface, etc.; they are identified by a slash-separated ('/') path of names, prefixed with an additional slash. Locations are used for "tracing" and correspond to places in the development components; they are identified by a dot-separated ('.') path of names, which is often the name of the respective Java package.
    Greetings,
    Praveen Gudapati

  • HT3275 Time Machine Error: The backup disk image "Volumes/Data/My Macbook Pro.sparsebundle" is already in use. (How do I correct this issue?)

    Time Machine did not back up:
    Time Machine Error; ("The backup disk image "/Volumes/Data/My MacBook Pro.sparsebundle" is already in use.")
    How do I correct this issue?
    Thanks!

    You need to be careful Bob.. people might take this literally.
    Bob Timmons wrote:
    How do I correct this issue?
    Call the exterminator to get rid of the bugs in Lion and Mountain Lion.
    https://discussions.apple.com/thread/5222487?tstart=0
