WBS Commitments issue 0CO_OM_WBS_7 extractor - 0WBS_C11 cube

Hi All,
We have a problem with the "WBS overview" report (on cube 0WBS_C11) displaying wrong "Commitment" values. The data flow is designed to update from the PSA directly to the cube, i.e. the update rules are built from the "commitments" InfoSource:
0CO_OM_WBS_7 ----> 0WBS_C11 cube directly.
RSA3 (extractor checker) is showing the correct "commitment" values, but the PSA is showing more entries, which is reflected in the reports, since the PSA feeds the cube directly.
For example, for a sample WBS lsap-80751-100010, the RSA3 result is:

WBS element          FY/Period   Amount   Doc. Date   User
lsap-80751-100010    005/10      0        05/05/10    A
lsap-80751-100010    008/10      3155     05/07/10    B

The total value for commitments in RSA3 is 3155, which should be the expected result in the query.
The PSA is displaying 3 entries instead of 2, and the cube therefore summarizes all 3 values below, showing commitment = 6310:

WBS element          FY/Period   Amount   Ref. Doc. No.     Sch. line del. date   Doc. Date   User
lsap-80751-100010    005/10      0        0010131115 (PR)   14/05/10              05/05/10    A
lsap-80751-100010    008/10      3155     4700001009 (PO)   06/08/10              05/07/10    B
lsap-80751-100010    005/10      3155     0010131115 (PR)   14/05/10              05/05/10    C
Could you please advise? We are using the standard extractor and are not sure why the commitment values are displaying incorrectly.

Hi Prabhu,
If I am recalling it correctly, it uses the COOI table for commitment data.
I hope the data issue is resolved. We also tried using delta, but it didn't work, so we use the full load option only.
So for commitment data we placed a DSO between the PSA and the cube, and it needs to be refreshed before the actual full load.
The DSO was used to reconcile the data with the COOI table and also staged many more fields which can be used for future reporting.
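As a reference, a minimal reconciliation check against the COOI table might look like the ABAP sketch below. It is a sketch only: the amount field WKGES (controlling area currency) and the value types 21/22 (purchase requisition / purchase order commitments) are assumptions to verify in your system, and the OBJNR is an example value.

    * Minimal sketch, not production code: sum the commitment line items
    * in COOI for one WBS element and compare the total with the query.
    DATA lv_commitment TYPE cooi-wkges.    " amount field name assumed

    SELECT SUM( wkges )
      FROM cooi
      INTO lv_commitment
      WHERE objnr = 'PR00012345'           " object number of the WBS element (example)
        AND wrttp IN ('21', '22').         " 21 = PReq, 22 = PO commitment (assumption)

    WRITE: / 'Commitment per COOI:', lv_commitment.

If the COOI total matches RSA3 but not the PSA, the duplicates arrive between extraction and staging, which is exactly what an intermediate DSO with overwrite (keyed, say, on the reference document) absorbs.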
Thank you.
Regards,
VB

Similar Messages

  • Standard PS (WBS) Extractor and cube that contains data generated by CJR2

    Hi Experts,
    Can someone recommend a suitable standard extractor for us to extract plan data that was generated in R/3 PS, via transaction code CJR2, as well as a standard PS Cube that stores that data?
    The extractor and cube should be able to capture characteristics and key figures such as sender cost center, activity type, WBS element, activity quantity, total costs, version, posting period, fiscal year, etc.
    I searched through the many standard extractors and cubes, but could not find a suitable one.
    Thanks.

    Hi Ehab,
    Go through this link; on page 7 you can see the PS cubes, ODS objects and standard queries.
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/c04e90b2-f31c-2a10-0595-8409b37914f3
    Thanks
    Sayed

  • Performance issue in browsing SSAS cube using Excel for first time after cube refresh

    Hello Group Members,
    This is a continuation of my earlier question:
    https://social.msdn.microsoft.com/Forums/en-US/a1e424a2-f102-4165-a597-f464cf03ebb5/cache-and-performance-issue-in-browsing-ssas-cube-using-excel-for-first-time?forum=sqlanalysisservices
    As that thread is marked as answered but my issue is not resolved, I am creating a new thread.
    I am facing a cache and performance issue the first time I open a SSAS cube connection using Excel (Data tab -> From Other Sources -> From Analysis Services) after the daily cube refresh. On an end user's system (8 GB RAM, but around 4 GB available), the first attempt takes 10 minutes to open the cube. From the next run onwards, it opens quickly, within 10 secs.
    We have a daily ETL process running on high-end servers. The dedicated SSAS cube server has 8 cores and 64 GB RAM. In total we have 4 cube DBs, of which 3 get a full cube refresh and 1 an incremental refresh. We have seen that after the daily cube refresh it takes 10-odd minutes to open the cube on an end user's system; from the next time onwards it opens really fast, within 10 secs. After the cube refresh, on server systems (32 GB RAM, around 4 GB available), it takes 2-odd minutes to open the cube.
    Is there any way we could reduce the time taken for the first attempt?
    As mentioned in my previous thread, we have already implemented cube cache warming, but there is no improvement.
    Currently, the cumulative size of all 4 cube DBs is more than 9 GB in production, with each cube DB containing 4 individual cubes on average and the largest cube DB being 3.5 GB. Now, the question is: how does Excel work with an SSAS cube after the daily cube refresh?
    Does Excel create a cache of the schema and data each time the cube is refreshed, and in doing so need to download the cube schema into its memory? Downloading the schema and data of each cube database from server to client will take significant time, depending on the bandwidth of the network connection.
    Is it in any way dependent on the client system's RAM? Today the biggest cube DB is 3.5 GB; tomorrow it will be 5-6 GB. And though the client system has 8 GB RAM, the available RAM is around 4 GB. So what will happen then?
    Best Regards, Arka Mitra.

    Could you run the following two DMV queries, filling in the name of the cube you're connecting to? Then please post back the row count returned by each of them (by copying the results into Excel and counting the rows).
    I want to see if this is an issue I've run across before with thousands of dimension attributes and MDSCHEMA_CUBES performance.
    select [HIERARCHY_UNIQUE_NAME]
    from $system.mdschema_hierarchies
    where CUBE_NAME = 'YourCubeName'
    select [LEVEL_UNIQUE_NAME]
    from $system.mdschema_levels
    where CUBE_NAME = 'YourCubeName'
    Also, what version of Analysis Services is it? If you connect Object Explorer in Management Studio to SSAS, what's the exact version number it says on the top server node?
    http://artisconsulting.com/Blogs/GregGalloway

  • Issue with Building OLAP Cubes in Project Server 2010

    Hi
    There is an issue while building OLAP cubes.
    I created an OLAP cube and it built successfully. When I add a resource-level custom field that has lookup table values to the ASSIGNMENT cube, I get a cube failure message.
    I deleted and recreated the custom field and lookup table, but no luck.
    Below is the error message from Manage Queue Jobs:
    General
    CBS message processor failed:
    CBSOlapProcessingFailure (17004) - Failed to process the Analysis Services database <DB NAME> on the 10.3.66.12 server. Error: OLE DB error: OLE DB or ODBC error: 
    Warning: Null value is eliminated by an aggregate or other SET operation.; 01003. Errors in the OLAP storage engine: An error occurred while processing 
    the 'Assignment Timephased' partition of the 'Assignment Timephased' measure group for the 'Assignment Timephased' cube from the <DB NAME> database. 
    Internal error: The operation terminated unsuccessfully. Server:  Details: id='17004' name='CBSOlapProcessingFailure' uid='f2dea43a-eeea-4704-9996-dc0e074cf5c8'
     QueueMessageBody='Setting UID=afb5c521-2669-4242-b9f4-116f892e70f5 
    ASServerName=10.3.66.12 ASDBName=<DB NAME> ASExtraNetAddress= RangeChoice=0 PastNum=1 PastUnit=0 NextNum=1 NextUnit=0 FromDate=02/27/2015 02:10:15 
    ToDate=02/27/2015 02:10:15 HighPriority=True' Error='Failed to process the Analysis Services <DB NAME> on the 10.3.66.12 server. Error:
     OLE DB error: OLE DB or ODBC error: Warning: Null value is eliminated by an aggregate or other SET operation.; 01003. Errors in the OLAP storage engine: An error 
    occurred while processing the 'Assignment Timephased' partition of the 'Assignment Timephased' measure group for the 'Assignment Timephased' cube from the 
    <DB NAME> database. Internal  
    Queue:
    GeneralQueueJobFailed (26000) - CBSRequest.CBSQueueMessage. Details: id='26000' name='GeneralQueueJobFailed' uid='b7162f77-9fb5-49d2-8ff5-8dd63cc1d1d3' 
    JobUID='76837d02-d0c6-4bf8-9628-8cec4d3addd8' ComputerName='WebServer2010' GroupType='CBSRequest' MessageType='CBSQueueMessage' MessageId='2' Stage=''.
     Help me to resolve the issue
    Regards
    Santosh

    Are SQL Server and Analysis Services running on different servers and not on the default ports?
    If yes, then check that the alias name added in Project Server is also added on the Analysis Services server.
    Cheers! Happy troubleshooting !!! Dinesh S. Rai - MSFT Enterprise Project Management Please click Mark As Answer; if a post solves your problem or Vote As Helpful if a post has been useful to you. This can be beneficial to other community members reading
    the thread.

  • Cache and performance issue in browsing SSAS cube using Excel for first time

    Hello Group Members,
    I am facing a cache and performance issue the first time I try to open a SSAS cube connection using Excel (Data tab -> From Other Sources -> From Analysis Services) after the daily cube refresh. On end users' systems (8 GB RAM), the first attempt takes 10 minutes to open the cube. From the next run onwards, it opens quickly, within 10 secs.
    We have a daily ETL process running on high-end servers. The dedicated SSAS cube server has 8 cores and 64 GB RAM. In total we have 4 cubes, of which 3 get a full cube refresh and 1 an incremental refresh. We have seen that after the daily cube refresh it takes 10-odd minutes to open the cube on an end user's system; from the next time onwards it opens really fast, within 10 secs. After the cube refresh, on server systems (16 GB RAM), it takes 2-odd minutes to open the cube.
    Is there any way we could reduce the time taken for the first attempt?
    Best Regards, Arka Mitra.

    Thanks Richard and Charlie,
    We have implemented the solution/suggestions in our DEV environment and have seen a definite improvement. We are waiting for this to be deployed in the UAT environment to note the actual performance and time improvement while browsing the cube for the first time after the daily cube refresh.
    Guys,
    This is what we have done:
    We have 4 cube databases and each cube db has 1-8 cubes.
    1. We are doing daily cube refresh using SQL jobs as follows:
    <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
    <Parallel>
    <Process xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:ddl2="http://schemas.microsoft.com/analysisservices/2003/engine/2" xmlns:ddl2_2="http://schemas.microsoft.com/analysisservices/2003/engine/2/2" xmlns:ddl100_100="http://schemas.microsoft.com/analysisservices/2008/engine/100/100" xmlns:ddl200="http://schemas.microsoft.com/analysisservices/2010/engine/200" xmlns:ddl200_200="http://schemas.microsoft.com/analysisservices/2010/engine/200/200">
    <Object>
    <DatabaseID>FINANCE CUBES</DatabaseID>
    </Object>
    <Type>ProcessFull</Type>
    <WriteBackTableCreation>UseExisting</WriteBackTableCreation>
    </Process>
    </Parallel>
    </Batch>
    2. Next we are creating a separate SQL job (Cache Warming - Profitability Analysis) for cube cache warming for each single cube in each cube db like:
    CREATE CACHE FOR [Profit Analysis] AS
    {[Measures].members}
    *[TIME].[FINANCIAL QUARTER].[FINANCIAL QUARTER]
    3. Finally, after each cube refresh step, we create a new job step of type T-SQL in which we call these individual cache warming jobs:
    EXEC dbo.sp_start_job N'Cache Warming - Profit Analysis';
    GO
    I will update the post after I receive the actual improvement figures from the UAT/production environments.
    Best Regards, Arka Mitra.

  • Issue with Compression of Cube 0IC_C03

    Dear Mates,
    Before initiating this thread, I searched on SDN but could not find something that works in my case.
    Issue: Compression scheduled in a process chain starts, runs for a long time and then gets cancelled, as I can see in Process (right click) >> Display messages.
    Compression had been running fine for the last year; the above behavior started in Nov 2010.
    One area which feels fishy: in November we had a database migration to DB2, but I really could not relate the compression issue to the change of database.
    We are currently working on SAP BW 3.5.
    Please share your comments/suggestions. Below is the log observed after the compression was cancelled.
    Date          Time          Message text
    12/19/2010     03:03:48     Job started
    12/19/2010     03:03:48     Step 001 started (program RSPROCESS, variant &0000000530859, user ID BW_CPIC)
    12/19/2010     03:03:48     Performing check and potential update for status control table
    12/19/2010     03:03:58     FB RSM1_CHECK_DM_GOT_REQUEST called from PRG RSSM_PROCESS_COMPRESS; row 000200
    12/19/2010     03:03:58     Request '758.334'; DTA '0IC_C03'; action 'C'; with dialog 'X'
    12/19/2010     03:03:58     Leave RSM1_CHECK_DM_GOT_REQUEST in row 70; Req_State ''
    12/19/2010     03:03:58     FB RSM1_CHECK_DM_GOT_REQUEST called from PRG RSSM_PROCESS_COMPRESS; row 000200
    12/19/2010     03:03:58     Request '758.348'; DTA '0IC_C03'; action 'C'; with dialog 'X'
    12/19/2010     03:03:58     Leave RSM1_CHECK_DM_GOT_REQUEST in row 70; Req_State ''
    12/19/2010     03:03:58     FB RSM1_CHECK_DM_GOT_REQUEST called from PRG RSSM_PROCESS_COMPRESS; row 000200
    12/19/2010     03:03:58     Request '761.202'; DTA '0IC_C03'; action 'C'; with dialog 'X'
    12/19/2010     03:03:58     Leave RSM1_CHECK_DM_GOT_REQUEST in row 70; Req_State ''
    12/19/2010     03:03:58     FB RSM1_CHECK_DM_GOT_REQUEST called from PRG RSSM_PROCESS_COMPRESS; row 000200
    12/19/2010     03:03:58     Request '763.019'; DTA '0IC_C03'; action 'C'; with dialog 'X'
    12/19/2010     03:03:58     Leave RSM1_CHECK_DM_GOT_REQUEST in row 70; Req_State ''
    12/19/2010     03:03:58     FB RSM1_CHECK_DM_GOT_REQUEST called from PRG RSSM_PROCESS_COMPRESS; row 000200
    12/19/2010     03:03:58     Request '763.397'; DTA '0IC_C03'; action 'C'; with dialog 'X'
    12/19/2010     03:03:58     Leave RSM1_CHECK_DM_GOT_REQUEST in row 70; Req_State ''
    12/19/2010     03:04:06     SQL: 19.12.2010 03:04:06 BW_CPIC
    12/19/2010     03:04:06     INSERT INTO "/BI0/L0IC_C03" ( "SID_0REQUID"
    12/19/2010     03:04:06     ,"SID_0PLANT" ,"SID_0CALDAY_F" ,"SID_0CALDAY_T" )
    12/19/2010     03:04:06     SELECT 2000000000 AS "SID_0REQUID"
    12/19/2010     03:04:06     ,"/BI0/L0IC_C03"."SID_0PLANT" , MIN (
    12/19/2010     03:04:06     "/BI0/L0IC_C03"."SID_0CALDAY_F"  )  AS
    12/19/2010     03:04:06     "SID_0CALDAY_F" , MAX (
    12/19/2010     03:04:06     "/BI0/L0IC_C03"."SID_0CALDAY_T"  )  AS
    12/19/2010     03:04:06     "SID_0CALDAY_T" FROM "/BI0/L0IC_C03" WHERE (
    12/19/2010     03:04:06     "/BI0/L0IC_C03"."SID_0REQUID" BETWEEN 0 AND
    12/19/2010     03:04:06     763397 ) GROUP BY "/BI0/L0IC_C03"."SID_0PLANT"
    12/19/2010     03:04:06     SQL-END: 19.12.2010 03:04:06 00:00:00
    12/19/2010     03:04:06     SQL: 19.12.2010 03:04:06 BW_CPIC
    12/19/2010     03:04:06     INSERT INTO "/BI0/L0IC_C03" ( "SID_0REQUID"
    12/19/2010     03:04:06     ,"SID_0PLANT" ,"SID_0CALDAY_F" ,"SID_0CALDAY_T" )
    12/19/2010     03:04:06     SELECT -1 AS "SID_0REQUID"
    12/19/2010     03:04:06     ,"/BI0/L0IC_C03"."SID_0PLANT" , MIN (
    12/19/2010     03:04:06     "/BI0/L0IC_C03"."SID_0CALDAY_F"  )  AS
    12/19/2010     03:04:06     "SID_0CALDAY_F" , MAX (
    12/19/2010     03:04:06     "/BI0/L0IC_C03"."SID_0CALDAY_T"  )  AS
    12/19/2010     03:04:06     "SID_0CALDAY_T" FROM "/BI0/L0IC_C03" WHERE (
    12/19/2010     03:04:06     "/BI0/L0IC_C03"."SID_0REQUID" BETWEEN 0 AND
    12/19/2010     03:04:06     763397 ) GROUP BY "/BI0/L0IC_C03"."SID_0PLANT"
    12/19/2010     03:04:06     SQL-END: 19.12.2010 03:04:06 00:00:00
    12/19/2010     03:04:07     SQL: 19.12.2010 03:04:07 BW_CPIC
    12/19/2010     03:04:07     TRUNCATE TABLE "/BI0/0100000095"
    12/19/2010     03:04:07     SQL-END: 19.12.2010 03:04:07 00:00:00
    12/19/2010     03:04:12     SQL: 19.12.2010 03:04:12 BW_CPIC
    12/19/2010     03:04:12     TRUNCATE TABLE "/BI0/0100000091"
    12/19/2010     03:04:12     SQL-END: 19.12.2010 03:04:12 00:00:00
    Thanks & Regards
    Sameer

    Hi Zeeshan,
    I have handled inventory scenarios. The total stock quantity is a non-cumulative key figure, with the cumulative key figures issue quantity and receipt quantity as its flows; receipts minus issues gives you the total stock quantity. In the inventory cube you can capture daily movements as well as monthly ones, since you have the calmonth as a time characteristic in the dimensions.
    Regards
    Ram

  • Data syncup issue - Custom extractor for revenue recognition data from ECC into BW

    Hi there,
    We have created a custom extractor on top of a Function Module (FM) in ECC that reads data primarily from VBREVE table. Currently, we have close to 10 million records and full load isn't a preferred option so we built in a delta mechanism using "Created On" (ERDAT) and "Posting Date" (BUDAT). The idea is to do an initial full load and then switch over to delta loads on a nightly basis for data where:
    "Created On" OR "Posting Date" is within last 6 months.
    This will ensure that if any updates are made to existing VBREVE entries, the change will be reflected in the BW DSO, provided the "Created On" or "Posting Date" falls within the last 6 months. The issue comes up if, in ECC, a billing plan is cut down to a shorter term, say from 3 years to 2 years: the entries for the removed year are deleted from the VBREVE table. How can I pick up this change in the BW DSO, given that this data has already been loaded in the past? Additions of entries are fine, but I need to address deletions of entries in the VBREVE table so that the BW DSO reflects them. Any ideas how I can accomplish this? In the example in the screenshots, BW still shows the before image, and I need to get it to sync up with ECC as per the after image.
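    For reference, the delta selection described above might look like the following minimal ABAP sketch (assuming a function module extractor reading VBREVE directly; the fixed 180-day cutoff standing in for "last 6 months" is a simplification):

        * Minimal sketch of the delta selection: pick up VBREVE rows
        * that were created OR posted within roughly the last 6 months.
        DATA: lv_cutoff TYPE sy-datum,
              lt_vbreve TYPE STANDARD TABLE OF vbreve.

        lv_cutoff = sy-datum - 180.

        SELECT * FROM vbreve
          INTO TABLE lt_vbreve
          WHERE erdat >= lv_cutoff
             OR budat >= lv_cutoff.

    Note that a selection like this can only ever return rows that still exist in VBREVE, which is why the deletions described above never show up in the delta.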
    -Rohit

    Dear Rohit,
    The case is complicated; there can be a workaround using the changed-on date from the VBAK table. If the billing plan is changed in a sales order, VBAK will be updated with the changed-on date.
    1) If the billing plan deletion is a very specific case, then, using the change log tables, find out the sales orders for which there has been a billing plan change. Store these sales order details day-wise in a DSO; call it DSO1.
    2) Create a self-transformation on the DSO (DSO2) which you currently load from table VBREVE (DSO2 to DSO2, for loading data from the same DSO). The transformation should have the reverse record mode given as a constant (see the sketch below).
    3) Load DSO2 from DSO2, filling the sales order details from DSO1 into the DTP filter.
    (This will reverse the complete entries in DSO2 for the sales orders for which the billing plan has been changed.)
    4) Now load these sales orders afresh from the source system into DSO2 from table VBREVE.
    You can also use the VBAK changed-on date to trigger this load.
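    A minimal sketch of the reverse record mode from step 2, assuming a BW 7.x end routine in the DSO2-to-DSO2 self-transformation (mapping 0RECORDMODE to the constant 'R' in a rule achieves the same):

        * Set record mode 'R' (reverse image) so the reloaded rows
        * cancel the existing entries for the affected sales orders.
        LOOP AT RESULT_PACKAGE ASSIGNING <RESULT_FIELDS>.
          <RESULT_FIELDS>-recordmode = 'R'.
        ENDLOOP.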
    Hope I have not confused you
    Regards
    Gajesh

  • Issue in integrating Essbase cubes with OBIEE

    Hi
    I am trying to use Essbase cubes as a data source in OBIEE for generating reports, but the issue is that no columns are generated in the fact table of the cube in the BMM layer.
    Outline of cube is
    Revel(cube)
    (Hierarchies)
    Time Time <5> (Label Only)
    Item <54> (Label Only) (Two Pass)
    DepInst <20> (Label Only)
    SFA_Flag <2>
    Deduction_Flag <2>
    Rating_Category <6>
    PD_Band <9>
    Product <17>
    Entity <4>
    CR_Agency <5>
    I am confused about how to generate reports without measures in the fact table.
    Regards
    Sandeep

    Hi Sandeep,
    in that case it's as I thought:
    "Or did you just not specify any measure hierarchy?" You tried this:
    "In BMM layer i made this dimension as fact and tried to create reports but no use" ...which isn't the way. First of all, your cube seems to be built quite bizarrely, since it doesn't even provide a default measure hierarchy, so I'd have your Essbase guys check that.
    As for the OBIEE side: the key is the physical layer. The BMM layer is already too late. In the physical cube object, you must define one of the hierarchies as the measure hierarchy (since your cube doesn't seem to provide one; see above):
    [http://hekatonkheires.blogspot.com/2010/02/obieeessbase-how-to-handle-missing.html]
    Cheers,
    C.

  • How to enhace business content extractor for cube in production

    Hi All,
    I want to run a report based on 0MATERIAL on two cubes, 0PCA_C01 and ZCOPC_C08. I don't have 0MATERIAL in cube 0PCA_C01.
    Both of the cubes are in production. I am trying to enhance DataSource 0EC_PCA_1 (for cube 0PCA_C01) to bring in 0MATERIAL.
    > Is this the right way to do it?
    If it is, do I need to write code in the function module after appending the 0MATERIAL fields to the DataSource? Does anybody have the code, or can you please help me step by step with writing the code in CMOD, and tell me which tables I should read to get 0MATERIAL?
    > Most importantly, does enhancing DataSource 0EC_PCA_1 with 0MATERIAL affect 0MATERIAL in the DataSource for my second cube? (Can the same InfoObject be in two DataSources, since master data is common for all cubes?)
    I want to do this right, as both cubes are in production. Please help me.
    Please also address any issues like data loss because of enhancing a DataSource in production, or anything like that.
    Thanks in advance.
    > Useful inputs will be rewarded.

    Hi Aravind,
    You need a foreign key relationship from the base data to the material to bring the material number in as transaction data. There has to be a link from which you can extract the data. First, see whether this link is available in any other ODS in BW itself. For example, if you have vendor details in your cube, see if you can get the material from the AP ODS; this depends on your implementation.
    If you do not drop the cube load, how will this value be updated in the old data? Assuming you are doing deltas, the new records coming in will have this field filled, but the data that already existed will not. So I think you will need to drop the load.
    I know the amount of data could be huge, but if you have to get the data from R/3 (that is, if the data/link I mentioned above is not available in BW), you have to enhance the DataSource. In that case, you cannot avoid DataSource enhancement (a minimal sketch follows below).
    You can create a ZMATERIAL and load it from 0MATERIAL, but this will not help you: by doing so, you will just be creating another master data object. Even if you include this ZMATERIAL in your cube, since you cannot map it in the update rules, it will not help you. See if you can look up this value from some other ODS.
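    For what it's worth, a minimal sketch of such a DataSource enhancement in the BW user exit EXIT_SAPLRSAP_001 (include ZXRSAU01) is below. Everything specific in it is an assumption: ZZMATNR as the append field on the extract structure, and GLPCA/DOCNR as the lookup table and key; adapt them to your own structures.

        * Minimal sketch, not production code.
        DATA lv_matnr TYPE matnr.
        FIELD-SYMBOLS: <ls_data>  TYPE any,
                       <lv_docnr> TYPE any,
                       <lv_zzmat> TYPE any.

        CASE i_datasource.
          WHEN '0EC_PCA_1'.
            LOOP AT c_t_data ASSIGNING <ls_data>.
              ASSIGN COMPONENT 'DOCNR' OF STRUCTURE <ls_data> TO <lv_docnr>.
              CHECK sy-subrc = 0.
              ASSIGN COMPONENT 'ZZMATNR' OF STRUCTURE <ls_data> TO <lv_zzmat>.
              CHECK sy-subrc = 0.
              SELECT SINGLE matnr FROM glpca INTO lv_matnr
                     WHERE docnr = <lv_docnr>.   " simplified key (assumption)
              IF sy-subrc = 0.
                <lv_zzmat> = lv_matnr.
              ENDIF.
            ENDLOOP.
        ENDCASE.

    Since the exit only fills the appended field of this one DataSource, it does not touch 0MATERIAL in the DataSource of your second cube.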
    Hope this helps.
    Thanks and Regards
    Subray Hegde

  • Sequence issue for Extractor

    Hi All,
    We are facing problems with the delta record sequence in some of our extractors (QM, SD). How do we fix this issue?
    We are on 2004.1 Patch 5, update mode: Queued Delta.
    Thanks
    Niveda

    Hi Niveda,
    Please clarify (and check) where this sequence problem is: on the R/3 side or in BW.
    In BW, if you are using any data mart, then check at the data target as well. In the ODS, check via the keys how the data is coming in.
    My suggestions would be:
    1. If you are sure it is on the R/3 side, then go for writing a start routine and try to sequence the records while extracting them (see the sketch after this list). Say there are 3 records, record 1, record 2 and record 3, and they are in different requests; then you should have these requests in sequence, meaning request 1 for record 1 and so on.
    2. Look for an OSS note; maybe a patch upgrade is needed (I am not sure of this, but you can also write to SAP about it).
    3. Check RODELTAM for the different delta types and how they work, in contrast to your source.
    4. Also check your update rules: is there any additional setting?
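    A minimal sketch of suggestion 1, assuming a BW 3.x update rule start routine; the sort fields are placeholder names for whatever defines the correct order in your extract structure (e.g. document number plus a change counter or timestamp):

        * Sort the incoming package so the chronologically latest image
        * of each document comes last and wins on overwrite in the ODS.
        * DOC_NUMBER and CHANGE_COUNTER are assumed field names.
        SORT DATA_PACKAGE BY doc_number change_counter ASCENDING.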
    Hope this helps.

  • WBS currency issue

    Hi experts,
    Since Thursday 10th we have a very strange issue in our production environment.
    One user modified a WBS element using transaction CJ02.
    The user seems to have modified the currency field with a very strange value: instead of EUR, which was the previous value, the WBS element now contains 31.12.
    The user told me he only wanted to modify the user field USR09 with '31.12.2009'. Maybe he made a mistake; he doesn't know.
    When I use CJ02, the currency field is in display mode.
    When I try to modify the currency on another WBS element which accepts such a modification, I get an error message telling me that the input value 31.12 is not accepted.
    => How can a user enter such a value in a currency field?
    => How can we change the currency value back to EUR (in CJ02 the field is in display mode only)?
    Thank you for your help.
    Regards,
    Philippe

    Not getting any clue. But it is always advisable to share the error/information message number. If you are getting any message number, please share its details along with the description.
    With Regards
    Nitin P.

  • WBS Hierarchy Issue at 2nd, 3rd and 4th Level

    Hi ,
    I am loading planning data from a 3rd party system using UD Connect. The planning data has WBS elements. The planning amount is sometimes posted at the 2nd, 3rd, 4th or 5th level.
    I am using the SAP standard WBS hierarchy for reporting. If the 5th-level node has postings, the standard hierarchy displays perfectly, because the WBS hierarchy is a roll-up of 5th-level postings, and in R/3 postings are always made at the 5th level. But since I am loading the data from a 3rd party system, the 2nd, 3rd and 4th levels also have postings, and those amounts are not added into the standard hierarchy; they are displayed under the unassigned node of the WBS standard hierarchy.
    If the amount is posted at the 5th level, there are no issues and the standard hierarchy works fine, like this:
    WBS
    D-01 ---- 200
    D-01-01-02 ---- 200
    D-01-01-02-03 ---- 200
    D-01-01-02-03-04 ---- 100
    D-01-01-02-03-04 ---- 100
    If the amount is posted at the 2nd or 3rd level:
    WBS
    D-01 ---- 200
    D-01-01-02 ---- 200
    D-01-01-02-03 ---- 200
    D-01-01-02-03-04 ---- 100
    D-01-01-02-03-04 ---- 100
    Unassigned nodes
    d02 ---- 300
    d0505 ---- 400
    r030405 ---- 1000
    Please help me resolve this issue: how can I get the 2nd- and 3rd-level amounts into the standard hierarchy?
    Thanks
    Rohan


  • PA Missing Commitments Issue

    Hi All,
    I have a couple of projects for which the AUD report shows a couple of documents with the funds check "missing commitments" error message.
    I found data fix patch 7662888 on Metalink for this issue, but unfortunately it doesn't work for all projects.
    Can anyone help me fix this issue? Even a data fix will do.
    Thanks,
    Vishwamber Shetty

    Hi Vishwamber,
    The missing commitments error is usually indicative of data corruption. This error occurs when the funds check engine is unable to find the original commitment that it is trying to relieve as it moves the amounts from (say) the commitment bucket to the actuals bucket. Typically this error is seen in cases where, during the course of a funds-check-enabled project, the check boxes in the "Encumbrance" tab of the financial options window were unchecked/checked at some point.
    The first thing we need to do here is see whether the critical patches related to funds check have been applied; these fix a lot of issues in the funds check area.
    I'm not sure of your apps version; however, you can refer to the following notes and see if you have the critical patches applied for your apps version.
    Critical Consolidated Codefix Patch Released For Oracle Project Costing - Funds Check (Doc ID 1430023.1)
    Critical Patch Released For Customers Using Funds Check Functionality in Project Costing (Doc ID 1356459.1)
    In order to verify/correct the data corruption, have GDF patch 13639679 (which supersedes patch 7662888) applied and follow the steps in the readme.
    As always, do the above in a recent clone of production and check whether the issue gets resolved. If the issue still exists, log an SR with Oracle Support.
    Regards,
    Raghavan

  • EVDESCRIPTION Issue from BPC OLAP Cubes universe

    Hi,
    I created a universe from a BPC OLAP Cube.
    It has several dimensions; dimension ProfitCenter has 10 levels (level 1, level 2, etc.).
    Each level has its detail objects, and one of the detail objects at each level is EVDESCRIPTION. I created a Webi report with the object Profitcenter_lev7 and its detail object EVDESCRIPTION (LEV7), and the report was not showing any data for the detail object. I ran it for several objects and their detail object EVDESCRIPTION, and still saw no data. But I cross-checked with the database and my BPC cubes, and the data does exist there. Then I realised it is an issue with the detail objects within the universe, and I changed the SELECT statement code for the detail object EVDESCRIPTION.
    The existing code for EVDESCRIPTION (LEV7) was:
    [ProfitCenter].[H1].[LEV7].[EVDESCRIPTION].[Value]
    and I changed it to:
    <EXPRESSION>[ProfitCenter].CurrentMember.Properties("EVDESCRIPTION")</EXPRESSION>
    Then I created a Webi report with only Profitcenter_lev7 and EVDESCRIPTION (LEV7), and the report showed the code and its description perfectly.
    I changed the code in the same way for all the levels, and a report with only one level object and its EVDESCRIPTION shows the data correctly. The issue arises when I run a report with 2 codes and their descriptions, like:
    Profitcenter 7, EVDESCRIPTION (LEV7), Profitcenter 6, EVDESCRIPTION (LEV6)
    I then expect 4 columns in the report, with code 7 and its description and code 6 and its description.
    But here code 6's description, i.e. EVDESCRIPTION (LEV6), is overwriting level 7's description.
    I expect the data to show:
    code 7   desc 7                      code 6   desc 6
    USA      United states of america   NY       New York
    USA      United states of america   CA       California
    But I am getting:
    code 7   desc 7       code 6   desc 6
    USA      New York     NY       New York
    USA      California   CA       California
    The EVDESCRIPTION (LEV6) values are overwriting the EVDESCRIPTION (LEV7) values.
    I changed the EVDESCRIPTION object code in different ways, like:
    <EXPRESSION>[ProfitCenter].CurrentMember.Properties("EVDESCRIPTION_LEV7")</EXPRESSION>
    or
    <EXPRESSION>[ProfitCenter].CurrentMember.Properties("EVDESCRIPTION7")</EXPRESSION>
    and many more, but then the first problem comes back: no data is retrieved. The only way data shows for the code is:
    <EXPRESSION>[ProfitCenter].CurrentMember.Properties("EVDESCRIPTION")</EXPRESSION>
    but, like I said, when I run a report with 2 codes and descriptions, the lower level's description data overlaps the higher level's.
    Thanks
    Krishna

    Hello
    We got past this issue by creating it as a measure and using it in the query for the dimension.
    We created the measure with the expression "[dim].currentmember.properties("EVDESCRIPTION")".
    It works perfectly.
    Hope this helps.
    Regards
    Perven

  • Issue populating Enhaced Virtual Cube Fields

    Hi Gurus,
    Please help me with the below issue. I searched a lot on SDN before writing this post.
    Architecture:
    I have a standard MultiProvider, 0FIAR_M30. It has a standard cube, 0FIAR_C30, and a virtual cube, 0FIAR_R30.
    Cube 0FIAR_C30 gets data from R/3 DataSource 0FI_AR_3, and cube 0FIAR_R30 gets data from DataSource 0FI_AR_30.
    I enhanced the two DataSources with the fields Plant, Sales Org, Created By and Sales District.
    Data is populating properly in the enhanced fields for the standard cube, but in the case of the virtual cube, data is loaded only up to the PSA, and when I display data for the virtual cube, no data is populated in the enhanced fields.
    Options tried:
    1. I used Z InfoObjects with the standard InfoObjects as templates. I made all the attributes of the Z InfoObjects display attributes wherever they were navigational.
    I used the below post while investigating the issue:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a00ba5ee-9840-2d10-f385-933182d62b12?QuickLink=index&overridelayout=true
    Please provide your suggestions and inputs.
    Thanks
    Sameer

    Hi Sameer,
    We are also facing the same issue.
    Could you please share how you corrected it?
    Thanks,
    Jyothi.
