Inventory Cube Performance Issue

Hi All,
This is nothing new, but a long-standing issue with this cube. I have customized 0IC_C03 for my requirements and am having serious performance issues. The cube has 0CALDAY, 0MATERIAL and 0PLANT as non-cumulative value parameters, and I have added movement types (temporarily, for validation purposes). But my query always times out unless I specify the material, and there are close to 40K materials being maintained. The values all reconcile between ECC and BI after the data loads. So we are thinking the snapshot approach might help us resolve the performance issues.
Has anybody implemented the snapshot approach for inventory? I know it shifts the burden to data loading, but we would rather deal with that than with the performance issue users hit when they execute the query.
If anybody has done it, could you provide the steps?
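For clarity, here is my understanding of what a snapshot load would compute - a rough illustrative sketch only, where the table and field names are made up and are not the BW-generated objects:

INSERT INTO stock_snapshot (calmonth, material, plant, stock_qty)
SELECT '201112', material, plant,
       SUM(received_qty) - SUM(issued_qty)   -- closing balance per material/plant
FROM   material_movements
WHERE  posting_date <= DATE '2011-12-31'
GROUP  BY material, plant;

Queries would then read these persisted period-end balances directly instead of aggregating every movement at runtime through the non-cumulative key figures.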
Thanks,
Alex.

Hi Jameson - Thanks for your response.
We thought that would be the case. We have raised an SR with Oracle and they are investigating it. We have also sent an EIF file to Oracle for investigation.
Both DBs are in the same environment (AIX 6.1) and the DBAs have confirmed that both have the same system parameters.
Even setting aside the comparison to 11.2.0.1, 11.2.0.3 seems very slow for some reason. Even a simple cube (2 dimensions and 2 measures) with 9K records takes around 15 minutes to refresh, and it takes ages to view the data.
We haven't generated an AWR report yet; we will see if we can do that.
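For reference, the standard way to generate one on 11g is to run the AWR script shipped with the database from SQL*Plus as a privileged user:

@?/rdbms/admin/awrrpt.sql
-- prompts for the report format (html/text), the number of days of
-- snapshots to list, and the begin/end snapshot IDs to compare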
rgds,
Prakash S

Similar Messages

  • ASO Cube Performance Issue

    We have been working on 2 ASO cubes and performance was great. No modifications have been made since, but we are now experiencing performance issues. What could be the possible cause, and how could I resolve this? Thank you.

    'Performance issue' isn't very descriptive - performance of what? Query? Data load? Aggregation? Restructure?
    As a start, has the volume of data (input data cells) been increasing significantly?

  • BW BCS cube (0bcs_vc10) report huge performance issue

    Hi Masters,
    I am working on a solution for a BW report developed on the 0bcs_vc10 virtual cube.
    Some of the queries are taking 15 to 20 minutes to execute.
    This is a huge performance issue. We are on BW 3.5, and the report is developed in BEx and published through the portal. If anyone has faced a similar problem, please advise how you tackled it, and describe the analysis approach you used to resolve it.
    The current service pack levels we are using are:
    SAP_BW 350 0016 SAPKW35016
    FINBASIS 300 0012 SAPK-30012INFINBASIS
    BI_CONT 353 0008 SAPKIBIFP8
    SEM-BW 400 0012 SAPKGS4012
    Best of Luck
    Chris

    Ravi,
    I already did that; it is not helping performance much. Reports are still taking 15 to 20 minutes. I wanted to hear from anybody in this forum who has had the same issue and how they resolved it.
    Regards,
    Chris

  • Cache and performance issue in browsing SSAS cube using Excel for first time

    Hello Group Members,
    I am facing a cache and performance issue the first time I open an SSAS cube connection in Excel (via Data tab -> From Other Sources -> From Analysis Services) after the daily cube refresh. On an end user's system (8 GB RAM), the first attempt takes 10 minutes to open the cube; from the next run onwards it opens quickly, within 10 seconds.
    We have a daily ETL process running on high-end servers. The dedicated SSAS cube server has 8 cores and 64 GB RAM. In total we have 4 cubes, of which 3 get a full refresh and 1 an incremental refresh. After the daily cube refresh, it takes around 10 minutes to open the cube on an end user's system; from then on it opens within about 10 seconds. After the cube refresh, on server systems (16 GB RAM), it takes around 2 minutes to open the cube.
    Is there any way we could reduce the time taken for the first attempt?
    Best Regards, Arka Mitra.

    Thanks Richard and Charlie,
    We have implemented the solutions/suggestions in our DEV environment and have seen a definite improvement. We are waiting for this to be deployed in the UAT environment to measure the actual performance and time improvement when browsing the cube for the first time after the daily cube refresh.
    Guys,
    This is what we have done:
    We have 4 cube databases and each cube db has 1-8 cubes.
    1. We do the daily cube refresh using SQL jobs as follows:
    <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <Parallel>
        <Process xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ddl2="http://schemas.microsoft.com/analysisservices/2003/engine/2" xmlns:ddl2_2="http://schemas.microsoft.com/analysisservices/2003/engine/2/2" xmlns:ddl100_100="http://schemas.microsoft.com/analysisservices/2008/engine/100/100" xmlns:ddl200="http://schemas.microsoft.com/analysisservices/2010/engine/200" xmlns:ddl200_200="http://schemas.microsoft.com/analysisservices/2010/engine/200/200">
          <Object>
            <DatabaseID>FINANCE CUBES</DatabaseID>
          </Object>
          <Type>ProcessFull</Type>
          <WriteBackTableCreation>UseExisting</WriteBackTableCreation>
        </Process>
      </Parallel>
    </Batch>
    2. Next, we create a separate SQL job (Cache Warming - Profitability Analysis) to warm the cache for each individual cube in each cube DB, like:
    CREATE CACHE FOR [Profit Analysis] AS
    {[Measures].members}
    *[TIME].[FINANCIAL QUARTER].[FINANCIAL QUARTER]
    3. Finally, after each cube refresh step, we add a new job step of type T-SQL in which we call these individual cache-warming jobs:
    EXEC dbo.sp_start_job N'Cache Warming - Profit Analysis';
    GO
    I will update the post once I see the actual improvement in the UAT/production environment.
    Best Regards, Arka Mitra.

  • Performance issue in browsing SSAS cube using Excel for first time after cube refresh

    Hello Group Members,
    This is a continuation of my earlier question:
    https://social.msdn.microsoft.com/Forums/en-US/a1e424a2-f102-4165-a597-f464cf03ebb5/cache-and-performance-issue-in-browsing-ssas-cube-using-excel-for-first-time?forum=sqlanalysisservices
    As that thread is marked as answered but my issue is not resolved, I am creating a new thread.
    I am facing a cache and performance issue the first time I open an SSAS cube connection in Excel (via Data tab -> From Other Sources -> From Analysis Services) after the daily cube refresh. On an end user's system (8 GB RAM, but around 4 GB available RAM), the first attempt takes 10 minutes to open the cube; from the next run onwards it opens quickly, within 10 seconds.
    We have a daily ETL process running on high-end servers. The dedicated SSAS cube server has 8 cores and 64 GB RAM. In total we have 4 cube DBs, of which 3 get a full refresh and 1 an incremental refresh. After the daily cube refresh, it takes around 10 minutes to open the cube on an end user's system; from then on it opens within about 10 seconds. After the cube refresh, on server systems (32 GB RAM, around 4 GB available RAM), it takes around 2 minutes to open the cube.
    Is there any way we could reduce the time taken for the first attempt?
    As mentioned in my previous thread, we have already implemented cube cache warming, but there is no improvement.
    Currently, the cumulative size of all 4 cube DBs is more than 9 GB in production, each cube DB has 4 individual cubes on average, and the largest cube DB is 3.5 GB. Now, the question is: how does Excel work with an SSAS cube after the daily cube refresh?
    Does Excel create a cache of the schema and data each time the cube is refreshed, and in doing so need to download the cube schema into Excel's memory? Downloading the schema and data of each cube database from server to client would take significant time, depending on the bandwidth of the network connection.
    Does it depend on the client system's RAM in any way? Today the biggest cube DB is 3.5 GB; tomorrow it will be 5-6 GB. Though the client system has 8 GB RAM, the available or free RAM would be around 4 GB. So what will happen then?
    Best Regards, Arka Mitra.

    Could you run the following two DMV queries, filling in the name of the cube you're connecting to? Then please post back the row count returned by each of them (by copying the results into Excel and counting the rows).
    I want to see if this is an issue I've run across before with thousands of dimension attributes and MDSCHEMA_CUBES performance.
    select [HIERARCHY_UNIQUE_NAME]
    from $system.mdschema_hierarchies
    where CUBE_NAME = 'YourCubeName'
    select [LEVEL_UNIQUE_NAME]
    from $system.mdschema_levels
    where CUBE_NAME = 'YourCubeName'
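    If you want a third data point: the MDSCHEMA_CUBES rowset mentioned above can be queried directly, so you can time it yourself (a quick extra check, same idea as the two queries above):
    select [CUBE_NAME]
    from $system.mdschema_cubes
    where CUBE_NAME = 'YourCubeName'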
    Also, what version of Analysis Services is it? If you connect Object Explorer in Management Studio to SSAS, what's the exact version number it says on the top server node?
    http://artisconsulting.com/Blogs/GregGalloway

  • Inventory 0IC_C03 Cube Transformations Issue in BI 7

    Hi Experts,
    We are installing Inventory Cube 0IC_C03 with the BI 7 dataflow option. The 2LIS_03_BF transformation is in an active state, while the 2LIS_03_BX and 2LIS_03_UM transformations are inactive.
    Also, most of the key fields in the cube are not mapped to InfoSource fields.
    Has anybody implemented Inventory in a BI 7 system and faced this issue? Could you please share how you resolved it?
    We are on BI Content version 703, level 11.
    Regards,
    Raj

    Raj,
    No need to check the start routine code for this.
    Go to the transformation --> change mode --> choose Rule Group (at the top of the window, middle of the screen) --> choose 05 --> display the mapping --> check whether routines are available or not.
    Confirm back whether you are able to see the field routine or not.
    Check doc:  [Rule Groups in Transformation|https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/90754b76-bcf1-2a10-3ba7-b299b2be09f2]

  • Inventory Cube Stock Quantity/Value Issue

    Hi Experts,
    I have an issue with the stock value and quantity in the inventory cube.
    The Issue is:
    1. I loaded 2LIS_03_BX first to initialize the materials' opening balances, then compressed the request with the marker update (checked).
    2. I ran 2LIS_03_BF with an 'initialize without data transfer' InfoPackage and compressed it in the cube.
    3. Then I did a full load of 2LIS_03_BF and compressed it with the marker update checkbox selected.
    The issue I have is this: when the stock quantity/value of a material (the BX snapshot, for example, is taken in Dec 2011) is calculated backwards from issues and receipts via the non-cumulative key figures (e.g. 0VALSTCKVAL/0VALSTCKQTY in the inventory cube), I am not getting a stock/value equal to the value when the business started (for example Dec 2009, the time at which the material's opening stock was counted and entered in the system).
    Example: when I loaded BX (initialize opening stock), suppose Material A had 1000 kg in Dec 2011. When the non-cumulative key figure (0VALSTCKQTY/0VALSTCKVAL) is calculated back for each month or day, the stock at the end of, say, Dec 2009 should be 100, but I am getting more than 100, e.g. 150, even though all the issues, receipts and 0VALSTCKVAL/0VALSTCKQTY values are correct up to Dec 2009.
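    To make the arithmetic concrete (a simplified illustration of the backward calculation, not the exact OLAP internals): the engine starts from the marker and rolls the balance back period by period, e.g.
    stock(Nov 2011) = stock(Dec 2011) - receipts(Dec 2011) + issues(Dec 2011)
    ...
    stock(Dec 2009) = stock(Jan 2010) - receipts(Jan 2010) + issues(Jan 2010)
    So if the marker itself is off by 50 kg (for example, because a historical full load was also compressed with marker update), every derived historic balance comes out off by the same 50 kg, even though all the individual movements are correct.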
    Am I missing any steps? Please let me know your thoughts on this one.
    Regards
    Vamsi

    Hi,
    Please follow the steps mentioned in this link. I think the historical 2LIS_03_BF full load should not be compressed with marker update; only the deltas of BF and UM should update the marker table.
    http://wiki.sdn.sap.com/wiki/display/BI/StepbyStepLoadingDatatoInventory+Cube
    Hope it helps.
    Regards,
    AL

  • Inventory Cube Issue

    Hi Experts,
    We have created a copy of 0IC_C03, with a number of additional fields, as copy cube CIC_C03. We had a lot of data mismatches in the early stages, and now, after a long time, this copy cube is working fine in production. We have not yet compressed the deltas in the cube.
    Later, a field had to be added to the inventory cube, taking data from an ODS. As we did not want to disturb the inventory cube in production, we created one more copy cube: a Z-cube which is a copy of CIC_C03 with the added field. In the update rules we have a routine which fills it with data from the ODS. CIC_C03 acts as the datasource for this Z-cube.
    My question is:
    1) I don't feel that after a month or so we will still require the data in CIC_C03, as the same data is in the Z-cube. Will deleting data from CIC_C03 affect my data?
    2) The initial data in CIC_C03 is compressed, whereas the deltas are not yet compressed. Should the full upload to the Z-cube take place only after compression?
    3) Suppose we delete the data from CIC_C03 after 2 months, once a full upload has been taken up to that point. After the deletion, deltas will still be coming into CIC_C03; can these everyday deltas be transferred to the Z-cube?
    Waiting for reply.
    Regards,
    Vaishnavi.

    Hi,
    Note 637927 says that the key figures which represent the values of stock-in-transit (issue, receipt, current) are deleted from the cube, but the cube still contains the key figures which represent the quantity of stock-in-transit (issue, receipt, current).
    So the deleted KFs are:
    0VALTRANSST - Stock-in-transit value
    0RECCCNSVAL - Stock-in-transit issue value
    0RECTRFSTVA - Stock-in-transit receipt value
    Available KFs are:
    0TRANSSTOCK - Stock in transit quantity
    0RECTRANSST - Receipt quantity: stock in transit
    0ISSTRANSST - Issue quantity: stock in transit
    So to find the value of stock-in-transit, you will have to derive it from the stock-in-transit quantity. The scenario to be used in this calculation is the same as explained in note 589024.
    With rgds,
    Anil Kumar Sharma .P

  • Performance query on 0IC_C03 inventory cube

    Hello,
    I am currently facing performance problems with this cube. The query is on material groups, so the number of rows returned is not too high.
    The cube is compressed. Could aggregates be a solution, or do they not work well on this cube because of the non-cumulative key figures?
    Does anyone have any hints on speeding this cube up? (The only tip I see in the collective note is to always compress.)
    Best regards
    Jørgen

    Hi Ruud,
    Once compression with marker update is done, the latest balances are created automatically for inventory cube 0IC_C03.
    Historic movements are only required to show the stock status for a historic date (e.g. 02-01-2008).
    If users are not interested in checking stock status more than 3 years old, the old data can be removed from the cube using selective deletion.
    Go through the doc: [How To... Handle Inventory Management Scenarios in BW|https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328?overridelayout=true]
    Srini

  • Building a new Cube Vs Restricted Key figure in Query - Performance issue

    Hi,
    I have a requirement to create an OPEX restricted key figure in a query. The problem is that the key figure has to be restricted to about 30 G/L accounts and almost 300 cost centers.
    I do not know whether this might cause a performance issue in the query. At the moment, I am thinking of creating a new OPEX cube, loading only those 30 G/L accounts, 300 cost centers and the amount, and including the OPEX cube in the MultiProvider in order to get the OPEX amount in the report.
    What's the best solution: creating the OPEX restricted key figure or the OPEX cube?
    thanks,
    Bhat

    I think you should go for the cube, as restricted key figures are calculated at OLAP runtime, so they will definitely affect query performance. There are a lot of cost centers to restrict on, so at query runtime it will take a lot of time to fetch the data from the InfoProvider. It is better to create a cube with the restrictions and include it in the MultiProvider; it will definitely save a lot of time during query execution.

  • Cube Refresh Performance Issue

    We are facing a strange performance issue related to a cube refresh. The cube, which used to take 1 hour to refresh, is now taking around 3.5 to 4 hours without any change in the environment. The data it processes is also almost the same as before. Only this cube, out of all the cubes in the workspace, has suffered this performance degradation over time.
    Details of the cube:
    This cube has 7 dimensions and 11 measures (a mix of sum and avg as aggregation operators). No compression. The cube is partitioned (48 partitions). The main source of the data is a materialized view which is partitioned in the same way as the cube.
    Data volume: 2,480,261 records in the source to be processed daily (almost evenly distributed across the partitions).
    The cube is refreshed with the script below:
    DBMS_CUBE.BUILD(<<cube_name>>,'SS',true,5,false,true,false);
    Has anyone faced a similar issue? Please advise on what might be the cause of the performance degradation.
    Environment - Oracle Database 11g Enterprise Edition Release 11.2.0.3.0
    AWM - awm11.2.0.2.0A

    Take a look at DBMS_CUBE.BUILD documentation at http://download.oracle.com/docs/cd/E11882_01/appdev.112/e16760/d_cube.htm#ARPLS218 and DBMS_CUBE_LOG documentation at http://download.oracle.com/docs/cd/E11882_01/appdev.112/e16760/d_cube_log.htm#ARPLS72789
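    As a rough first diagnostic (a sketch only: it assumes the build log table has been created, and the DBMS_CUBE_LOG type constant and column names below are as documented for 11.2, so please verify against the links above), you can look at where the build spends its time:
    -- create the CUBE_BUILD_LOG table once, if it does not already exist
    EXEC DBMS_CUBE_LOG.TABLE_CREATE(DBMS_CUBE_LOG.TYPE_BUILD);
    -- after a slow build, list its steps in order to see which phase is slow
    SELECT time, status, command, build_object
    FROM cube_build_log
    WHERE build_id = (SELECT MAX(build_id) FROM cube_build_log)
    ORDER BY time;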
    You can also search this forum for more questions/examples about DBMS_CUBE.BUILD
    David Greenfield has covered many Cube loading topics in the past on this forum.
    Mapping to Relational tables
    Re: Enabling materialized view for fast refresh method
    DBMS CUBE BUILD
    CUBE_DFLT_PARTITION_LEVEL in 11g?
    Reclaiming space in OLAP 11.1.0.7
    Re: During a cube build how do I use an IN list for dimension hierarchy?

  • Query performance on Inventory Cube

    Hi All,
    I have a query on an inventory cube with non-cumulative key figures; when I run the query with them, it takes 60 to 70 minutes. When I run the same query with the non-cumulatives removed, it displays results in 25 seconds. Is there any way we can improve query performance that is affected by non-cumulative key figures?
    I have checked the performance-related tools: RSRV on the cube and master data shows no errors; in RSRT execute/debug, the query spends most of its time in the data manager; and in ST03, DB and data manager time, as well as unassigned time, are high.
    I know the query takes time because of the non-cumulative key figures, as it has to perform the calculations on the fly, but it is taking a lot more than that. I appreciate your inputs on this query in advance.
    I will reward points.
    Regards
    Satish Reddy

    Hi Anil,
    It's nice to see you. We have compressed the cube with marker update, and we are using only two InfoSources for the cube (BF and UM). As there are 150 queries on that cube, I don't want to build an aggregate especially for this one query. I also tried a DB statistics refresh, there is a process chain to delete and recreate indexes, and I analysed the cube and master data in RSRV etc.; it didn't really help me. Would you please suggest a good solution for this? I appreciate it in advance.
    When I check the application log in the cube's Manage screen, it shows 'Mass Upsert of Markers', so I assume the markers are updated.
    Regards
    Satish Arra.

  • BW on HANA - Inventory Cube Optimized Cube issue

    Hi All,
    We have an inventory cube which works perfectly fine, and we created a HANA-optimized copy of the same cube. We sourced the initialization from the original (non-HANA) cube, loaded the data, and then loaded the rest of the data from the corresponding DSOs (one for 2LIS_03_BF and one for 2LIS_03_UM).
    In order to reconcile them, I opened both in BEx Analyzer. The original works perfectly fine; however, the HANA-optimized one gives me an error when entering the fiscal year variant:
    Error in BW: Time is not consistent.
    Error reading the data of InfoProvider
    Error while reading data; navigation is possible
    I checked in RSDV and the validity dates are fine. In RSRV I get errors when running "Test the partitioning column for InfoCube" and "Consistency of time dimension for InfoCube"; both return issues. If I try to repair the issues, I get the same error message:
    An exception with the type CX_SY_DYNAMIC_OSQL_SEMA
    Message no. RSRV000
    Any clues?

    Hi Busy waiting,
    Please see if the following notes help you solve the issue:
    1999013 - RSRV - Initial key Figure Units in Fact Tables test results in exception
    1953650 - RSRV - Fact tables and dimension tables of InfoCube results in exception error
    BR

  • Inventory cube - non-cumulative key figure values are showing -ve values

    Hi Gurus,
    To improve the performance of inventory cube 0IC_C03, I did the following:
    1) Created a history cube by taking a copy of the actual cube (0IC_C03).
    2) Transferred all four years of data (2007, 2008, 2009, 2010) to the history cube as a backup, in order to do clustering and cube remodelling.
    3) After doing all this, loaded the most recent 3 years of data (2008, 2009, 2010) back into the actual cube and kept one year of data (2007) in the history cube (i.e. maintained only the recent 3 years of data in the actual cube).
    4) Created a MultiProvider that includes the actual and history cubes and rebuilt the existing report on top of the MultiProvider.
    5) After purging one year of data from the actual cube, the stock values in the reports are showing negative values.
    6) To clear that issue I loaded the 2007 data back into the actual cube (so the cube now has all years of data, as before) to avoid the negative stock values, but the stock values are still showing negative.
    How do we solve this issue in the inventory cube?
    How do we eliminate the negative values in the reports, which worked properly before the data purge (removing the first year of data from the actual cube)?

    Hi Prayog, thanks for answering. Yes, I went to the data targets, and the formula is already written like this: IF( Debit/Credit = 'H', Qty in OUn, ( 1- * Qty in OUn ) ) for the Actual Consumption key figure, and IF( Debit/Credit = 'H', Amt. in local curr., ( 1- * Amt. in local curr. ) ) for the Amount.
    As I already said, from one of the InfoSources the data flows through an ODS and then to the cube. So I checked the data in the ODS with the movement type and posting date as in the report: I selected 'Debit/Credit' = H, the movement type and the posting date, but in the ODS output the key figures are not displayed. This is the problem.
    Cheers,
    Hemanth Aluri...

  • Non Cumulative Inventory Cube Remodel

    Hello Gurus,
    I have an existing inventory cube with non-cumulative key figures and a large amount of data. We are using the standard non-cumulative key figures 0TOTALSTCK, 0ISSTOTSTCK and 0RECTOTSTCK.
    Now I have a new requirement to report issues (0ISSTOTSTCK) and receipts (0RECTOTSTCK) in LB only. Right now they are both in the base UOM. Converting them to LB in the query (on the fly) is not an option because of the data volume and report performance.
    I am planning to do this in the backend by adding two more key figures for issues and receipts, into which I will convert the standard ones to LB. I am not going to touch any of the existing non-cumulative key figures, but the overall structure (F table) will change.
    I am concerned about whether this will work. Any risks? Marker updates?
    Thanks.
    Abhijeet

    Please have a look at:
    http://sap.seo-gym.com/inventory.pdf
