CO-PC (0COPC_C02) Cube loading

Hi Friends,
Can anybody help me out with this problem?
I want to load data into the CO-PC InfoCube 0COPC_C02.
This InfoCube has no usable predefined DataSource in R/3; one exists, but it delivers 0 records.
So I have to create a DataSource for this InfoCube (0COPC_C02).
How do I select the dimensions, key figures and navigational attributes? Please suggest; it's very urgent.
Cheers
Suresh

Hi,
The following DataSource covers your cube's requirement:
0CO_PC_ACT_1: Actual Costing/Material Ledger - Costs
Check this DataSource on the R/3 side. If it does not fit, create an LIS DataSource on the R/3 side based on it. Replicate that DataSource into BW and run the standard loading process on the BW side. Then, based on your DataSource, model the dimensions, key figures and so on in your cube. Hope this helps you.
If you want to know more about the cube, check the following link:
http://help.sap.com/saphelp_nw70/helpdata/en/23/1569372b2b7d20e10000009b38f842/frameset.htm
If helpful, provide points.
Regards
Harikrishan N

Similar Messages

  • Can we show a user-defined message during cube load in OBIEE

    Hi All,
    We are currently building OBIEE dashboards with an Essbase ASO cube as the data source.
    During the Essbase cube load, Essbase disables all administrative commands. If users try to run the dashboards in that window, OBIEE throws a severe error.
    Can we display a user-defined message such as "Cube is loading. Please wait..." or is there any other alternative?
    The cube load takes 1 hour. Please suggest.
    Thanks,
    SatyaB

    Yes, you can replace the whole OBIEE dashboard with a custom message. Whenever the load happens, trigger or set a flag value to 'Y'. Using guided navigation, when the flag value matches 'Y', run a piece of JavaScript to replace the whole dashboard:
    <html>
    <head>
    <script type="text/javascript">
    var strStatus = "@1";
    if (strStatus == 'Y')
        location.replace('saw.dll?PortalPages&_scid=in6lPJnWWNk&PortalPath=/shared/Framework/_portal/Cube%20Status%20&Page=Cube%20Status');
    </script>
    </head>
    BTW, you should mark your previous post on the implicit fact as answered. Follow https://forums.oracle.com/forums/ann.jspa?annID=939

  • Issue during cube load

    Hi All,
    I am getting the following error during the cube load:
    ***Error Occured in __XML_PARALLEL_LOADER: In __XML_UNIT_LOADER: In __XML_RUN_CC_AUTOSOLVE: Incremental aggregation over the dense DIMENSION OLAP_AGG!SCENARIO is not supported when aggregating a VARIABLE dimensioned by a COMPRESSED COMPOSITE.
    I understand this error comes when we try to aggregate a compressed cube while the dimensions are limited to only some values rather than to all. I faced this issue when manually running the cube's aggmap after limiting the dimensions.
    But this is now happening in the cube load run through the script generated by AWM. I cannot run the cube load through AWM, as this is the UAT environment.
    One more clue: PERMIT READ is implemented and it does not allow some private scenarios to be seen, but this should not cause the issue, as permit_read only gets executed when a user connects in read-only mode.
    Here is the XML load log content:
    13:08:47 ***Error Occured: Failed to Build(Refresh) XPRO_OLAP_AGG.OLAP_AGG Analytic Workspace.
    13:08:46 ***Error Occured in __XML_PARALLEL_LOADER: In __XML_UNIT_LOADER: In __XML_RUN_CC_AUTOSOLVE: Incremental aggregation over the dense DIMENSION OLAP_AGG!SCENARIO is not supported when aggregating a VARIABLE dimensioned by a COMPRESSED COMPOSITE.
    13:08:46 Job #AWXML$_9179_3238 Stopped.
    13:08:44 Job #AWXML$_9179_3237 Stopped.
    13:08:42 Incremental aggregation over the dense DIMENSION OLAP_AGG!SCENARIO is not supported when aggregating a VARIABLE dimensioned by a COMPRESSED COMPOSITE.
    13:08:29 Started Auto Solve for Measures: OPER_CUST_OPER_DATA from Cube OPERATIONAL_CUSTOMER.CUBE. Q2 2009 Partition.
    13:08:29 Finished Load of Measures: OPER_CUST_OPER_DATA from Cube OPERATIONAL_CUSTOMER.CUBE. Q2 2009 Partition. Processed 71359 Records. Rejected 0 Records.
    13:08:21 Started Load of Measures: OPER_CUST_OPER_DATA from Cube OPERATIONAL_CUSTOMER.CUBE. Q2 2009 Partition.
    13:08:21 Started Load of Measures: OPER_CUST_OPER_DATA from Cube OPERATIONAL_CUSTOMER.CUBE. Q1 2009 Partition.
    13:08:21 Started Load of Measures: OPER_CUST_OPER_DATA from Cube OPERATIONAL_CUSTOMER.CUBE. Q1 2008 Partition.
    13:08:20 Attached AW XPRO_OLAP_AGG.OLAP_AGG in MULTI Mode.
    13:08:20 Attached AW XPRO_OLAP_AGG.OLAP_AGG in MULTI Mode.
    13:08:19 Attached AW XPRO_OLAP_AGG.OLAP_AGG in MULTI Mode.
    13:08:17 Running Jobs: AWXML$_9179_3237, AWXML$_9179_3238, AWXML$_9179_3239. Waiting for Free Process...
    13:08:17 Started 3 Finished 0 out of 5 Tasks.
    13:08:17 Running Jobs: AWXML$_9179_3237, AWXML$_9179_3238, AWXML$_9179_3239.
    13:08:17 Started 3 Finished 0 out of 5 Tasks.
    13:08:17 Running Jobs: AWXML$_9179_3237, AWXML$_9179_3238.
    13:08:17 Started 2 Finished 0 out of 5 Tasks.
    13:08:17 Running Jobs: AWXML$_9179_3237.
    13:08:17 Started 1 Finished 0 out of 5 Tasks.
    13:08:17 Starting Parallel Processing.
    13:08:17 Detached AW XPRO_OLAP_AGG.OLAP_AGG.
    13:07:17 Attached AW XPRO_OLAP_AGG.OLAP_AGG in RW Mode.
    13:07:15 Started Build(Refresh) of XPRO_OLAP_AGG.OLAP_AGG Analytic Workspace.
    13:07:14 Job# AWXML$_9179 to Build(Refresh) Analytic Workspace XPRO_OLAP_AGG.OLAP_AGG Submitted to the Queue.
    Any help on this error?
    Thanks
    Brijesh

    In AWM, right-click on the cube to maintain it. Once done, rebuild the cube. If this doesn't work, first save the cube to a template, delete the cube, recreate it from the saved template and maintain it, then restart all services and do a full build.
    Regards
    Ankur Patel
    Edited by: user8337346

  • Incremental Cube Load Error: job slave process terminated

    Hi,
    For performance reasons we switched to incremental cube loading, i.e. only those partitions whose data has been made available are autosolved.
    Sometimes the job submitted in the background terminates, and the reason given in dba_scheduler_job_run_details (a query sketch for pulling these details follows this post) is:
    REASON="Job slave process was terminated"
    There is no definite occurrence pattern for this error.
    The job submitted in the background is killed.
    The last entry displayed in xml_load_log is the start of the auto solve of a partition.
    After this error occurs we have to fully aggregate the cube, which of course autosolves all partitions.
    This error has been very frustrating: we made a lot of package changes as part of a release to production to introduce incremental cube loading, and once that was done we found that the incremental cube load simply terminates while autosolving a partition.
    Can anyone assist, please? It is urgent.
    thank you,
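    A minimal sketch (assuming Python with the cx_Oracle driver and a user that can read the DBA views; connection details and filters are placeholders) of pulling the recorded failure reason from dba_scheduler_job_run_details:
    import cx_Oracle
    # Placeholder connection details; adjust to your environment.
    conn = cx_Oracle.connect("system", "password", "dbhost/orcl")
    cur = conn.cursor()
    # Most recent non-successful scheduler job runs, with the reason Oracle
    # recorded (e.g. REASON="Job slave process was terminated").
    cur.execute("""
        SELECT log_date, job_name, status, additional_info
          FROM dba_scheduler_job_run_details
         WHERE status <> 'SUCCEEDED'   -- add a job_name filter for your load job
         ORDER BY log_date DESC""")
    for log_date, job_name, status, info in cur.fetchmany(10):
        print(log_date, job_name, status, info)
    cur.close()
    conn.close()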

    Hi,
    There is a Metalink note about this issue. Check note 443348.1.
    Thanks
    Brijesh

  • BI 7 real-time cube load behavior changed after transport

    Dears,
      After I transported a real-time cube from the DEV to the QA system, I found that the load behavior of all the real-time cubes changed from 'can be loaded, planning not allowed' to 'can be planned, load not allowed'. What happened? Or am I packaging this type of cube with the wrong steps? Any suggestions are appreciated.
    Best regards,
    Gerald

    Hi,
    I think note 984803 can be useful for you.
    The link is https://websmp207.sap-ag.de/~form/handler?_APP=01100107900000000342&_EVENT=REDIR&_NNUM=984803&_NLANG=E
    Rgs
    Antonino

  • Real-Time i-Cube load behavior changed after transport

    Dears,
      After the transports completed, the real-time InfoCube load behavior changed. All cubes set to 'can be loaded, planning not allowed' were changed to 'can be planned, load not allowed'. What happened? How should I package this type of real-time InfoCube? Any suggestions are appreciated.
    Best regards,
    Gerald

    Hi Gerald,
    You can switch it back to enable loading of data using the "Switch Transactional InfoCube" option you get when you right-click on the InfoCube.

  • Cube Load Times

    Hi there,
    We have one BSO cube implemented on Essbase 7.1.2 and we load data from a SQL Server source every night. My question is about the time taken to load and calculate the cube.
    It started off at around 20 minutes, but in the space of a few months this time has risen a lot. It is now taking up to 2.5 hours to complete the load/calculation process. If we restructure the cube, the next day's load takes around 1 hour.
    My question is basically: can anyone recommend any tips/tricks to optimise the load process? One thing we have noticed is that the data compression on the cube is set to RLE. Is this the right one, or should we be using one of the others?
    Any help appreciated!
    Brian

    With the assumptions that (a) you perform a full calc of the database, and (b) the full calc itself isn't an exceptionally long process, you can typically get a much better overall performance gain by doing the following:
    1) Export the input-level data (level 0 if you have Aggregate Missing Values turned on).
    2) Clear the database.
    3) Load data from your Essbase export.
    4) Load data from your SQL source.
    5) Run your full calc/calc all.
    The reason you can get an overall performance gain is two-fold:
    1) RLE compression tends to cause a lot of fragmentation within the page files as blocks are written and re-written.
    2) The calc engine tends to run faster when it doesn't have to read in the upper-level blocks before calculating them.
    Both of the above are simplified explanations, and results vary greatly based on the outline/hierarchies and the calc required. One thing to note, however, is that if you run any type of input scrubbing/modification script (e.g. allocation, elimination, etc.), the optimum sequence can be quite different from the above (I won't go into the hows and whys here).
    Now on to the other ways to optimize your load. In the Database Administrator's Guide (DBAG) there is a good section on ways to organize a file for optimum loading. Basically, the closer the order of the members appearing in your load file matches the order in which they appear in the outline, the better. For instance, if the columns in the file are dense and the rows are sorted in the same order as the sparse dimensions, you load MUCH faster than if you do the opposite. This can have a big impact on the fragmentation that occurs during a load, enough so that if you can be reasonably sure that the sort of your SQL export is optimal for loading, the above approach (export/clear/load/calc) won't get you any real benefit, at least from the fragmentation aspect. (A small sketch of re-sorting a load file this way follows this post.)
    Of course, the outline itself can be less than optimal, so any fixes you make should start with the outline, then move to the SQL file layout, then the load process. The caution here is to be sure that changes you make to align the outline with your data load don't adversely affect your calc and retrieval performance too much.
    Most of this is covered in the DBAG in greater detail, just not laid out in an order that can be easily followed for your needs. So if you need further details on any of the above, start there.
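    To illustrate the load-file ordering point above, here is a small sketch in plain Python (not an Essbase utility; the file name, column positions and member lists are hypothetical placeholders) that re-sorts a comma-delimited load file so its rows follow the outline order of the sparse dimensions:
    import csv
    # Hypothetical outline order of the sparse dimensions' members.
    SPARSE_ORDER = {
        "Market": ["East", "West", "South", "Central"],
        "Product": ["Colas", "Root Beer", "Cream Soda", "Fruit Soda"],
    }
    SPARSE_COLUMNS = {"Market": 0, "Product": 1}  # column index of each sparse dimension
    def sort_key(row):
        # Rank each row by the outline position of its sparse members so that
        # consecutive rows touch the same data blocks for as long as possible.
        return tuple(SPARSE_ORDER[dim].index(row[col]) for dim, col in SPARSE_COLUMNS.items())
    with open("load_file.txt", newline="") as src, open("load_file_sorted.txt", "w", newline="") as dst:
        rows = list(csv.reader(src))
        csv.writer(dst).writerows(sorted(rows, key=sort_key))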

  • Cube loaded with data, but unavailable in the query

    Hello everyone,
    Can someone help me?
    I created a new CO-PA cube for the year 2010 and loaded it with this year's data. To make sure that the data was correct, I ran a ListCube with all the filters equal to the query, and indeed the data is loaded properly for this year. When I run the query for the years 2008 and 2009 it shows data in my report, but when I run the query for 2010, the query displays an error message saying "no data".
    How is it possible to have the correct data in the cube and yet not be able to view the data in the query?
    Note: The request in the cube is available for reporting.
    Thank you all,
    Greetings,
    Maria João
    30 March 2010

    Hello,
    I was in trouble because I had not selected the key figures in the MultiCube.
    After selecting the key figures, I ran the query and it returned data.
    Thank you for your help.
    Sincerely,
    Maria João

  • Planning Cube Load process

    Hi Gurus,
    How do I load data into DP cubes from a BW 3.5 system?

    If the BW 3.5 system is an external system, then you need to build an InfoCube in the APO data mart and first load the data from the external BW system into APO BW. Then create the planning area, POS, CVCs etc. and load the planning data from the APO InfoCube into the planning area using transaction /SAPAPO/TSCUBE. DP cubes are virtual ones; they have no physical existence. Please check this [DP Process|http://help.sap.com/saphelp_scm50/helpdata/en/8f/9d6937089c2556e10000009b38f889/frameset.htm] and also go through the [Best Practices|http://help.sap.com/bp_scmv250/BBLibrary/HTML/DPL_EN_DE.htm]. Open the link and click on the configuration guide; it will take you through the process step by step.

  • Data Extraction and ODS/Cube loading: New date key field added

    Good morning.
    Your expert advice is required with the following:
    1) A data extract was done previously from a source with a full upload to the ODS and cube. An event is triggered from the source when data is available, and then the process chain first clears all the data in the ODS and cube and then reloads, activates, etc.
    2) In the ODS, the 'forecast period' field has now been moved from the data fields to the key fields, as the user would like to report per period in future. The source will in future only provide the data for a specific period and not all the data as before.
    3) Data must be appended in future.
    4) The current InfoPackage for the ODS is a full upload.
    5) The 'old' data in the ODS and cube must not be deleted, as the source cannot provide it again. They will report on the data per forecast period key in future.
    I am not sure what to do in BW as far as the InfoPackages are concerned, loading the data and updating the cube.
    My questions are:
    Q1) How will I ensure that BW appends the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
    Your assistance will be highly appreciated. Thanks
    Cornelius Faurie

    Hi Cornelius,
    Q1) How will I ensure that BW appends the data for each forecast period to the ODS and cube in future? What do I check in the InfoPackages?
    -->> Try to load the data into the ODS with a full update in overwrite mode, as before (this adds new records and updates previously loaded records with the latest values). Then push the delta from this ODS to the cube; the mechanics are illustrated in the sketch after this post.
    If the existing ODS is loaded in addition mode, introduce one more ODS with the same granularity as the source, load it in overwrite mode (delta if possible, otherwise full), and push only the delta onwards from there.
    Q2) I have now removed the process chain event that used to delete the data in the ODS and cube before reloading it again. Was that the right thing to do?
    --> Yes, that is correct. Otherwise you would lose the historic data.
    Hope it helps
    Srini
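    A schematic sketch in plain Python (not SAP code; the keys, field names and values are made up) of why an overwrite-mode ODS plus a delta push keeps the cube's additive totals correct: the ODS change log emits a before-image and an after-image for every changed record, and the additive cube load nets them out:
    ods, cube, change_log = {}, {}, []
    def load_to_ods_overwrite(records):
        for key, value in records:
            if key in ods:                        # changed record: emit before-image
                change_log.append((key, -ods[key]))
            change_log.append((key, value))       # after-image (or brand-new record)
            ods[key] = value                      # overwrite keeps the latest image
    def push_delta_to_cube():
        while change_log:
            key, delta = change_log.pop(0)
            cube[key] = cube.get(key, 0.0) + delta   # cube updates are additive
    load_to_ods_overwrite([(("DOC1", "2010.01"), 100.0)])
    load_to_ods_overwrite([(("DOC1", "2010.01"), 120.0)])  # corrected value in a later full load
    push_delta_to_cube()
    print(ods, cube)  # both now show 120.0 for ('DOC1', '2010.01')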

  • EIS 7 and EIS 9 Cube Load Issue-Urgent

    Hi,
    I am loading a cube that takes 25 minutes through EIS 7, but it takes more than 2 hours if the cube is loaded from EIS 9.
    Is it a hardware configuration problem?
    Please suggest something.
    TIA

    You have two different installations, one with EIS 7 and one with EIS 9. Are these completely identical models, metaoutlines, servers, applications and databases that show such dramatically different performance? At first guess, anything that differs between the load with EIS 7 and the load with EIS 9 would be a candidate for explaining the performance difference.

  • ODS to CUBE loading - taking too much time

    Hi Experts,
    I am loading data from R/3 (4.7) to BW (3.5).
    I am loading with the option --> PSA and then Data Targets (ODS).
    I have a selection criterion in the InfoPackage when loading from the standard DataSource to the ODS.
    It takes me 20 minutes to load 300K records.
    But from the ODS to the InfoCube (update method: Data Targets Only), it is taking 8 hours.
    The data packet size in the InfoPackage is 20,000 (the same for the ODS and the InfoCube).
    I also tried changing the data packet size, and tried a full load and a load with initialization.
    I tried scheduling it as a background job too.
    I do not have any selection criteria in the InfoPackage from the ODS to the cube.
    Please let me know how I can decrease this loading time from the ODS to the InfoCube.

    Hi,
    To improve the data load performance:
    1. If they are full loads, see whether you can make them delta loads.
    2. Check whether complex routines/transformations are being performed in any layer. In that case, see if you can optimize that code with the help of an ABAPer.
    3. Ensure that you are following the standard procedures in the chain, like deleting indices/secondary indices before loading, etc.
    4. Check whether the system processes are free when this load is running.
    5. Try making the load as parallel as possible if it is currently running serially. Remove the PSA if it is not needed.
    6. When the load does not get processed due to a huge volume of data or a large number of records per data packet, try the following:
    1) Reduce the IDoc size to 8,000 and the number of data packets per IDoc to 10. This can be done in the InfoPackage settings.
    2) Run the load only to the PSA.
    3) Once the load is successful, push the data to the targets.
    In this way you can overcome the issue.
    Also take care of the data packet sizing, the number range buffering, the PSA partition size and the upload sequence, i.e. always load master data first, perform the change run and then run the transaction data loads.
    Check this doc on BW data load performance optimization:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/1955ba90-0201-0010-d3aa-8b2a4ef6bbb2
    BI Performance Tuning
    FAQ - The Future of SAP NetWeaver Business Intelligence in the Light of the NetWeaver BI & Business Objects Roadmap
    Thanks,
    JituK

  • Delta Init DTP DSO -> Cube loads all requests

    Hi,
    I've loaded several full requests into a DSO (DataSource 2LIS_02_ITM).
    I had to split the load into three months per full request because otherwise the data volume was too much for the rollback segment.
    Now I want to load these requests from the DSO to a cube.
    I created a delta DTP and selected the "Get All New Data Request by Request" option.
    The DTP starts as a delta init but selects all open requests at once and crashes with an "ORA-01555: snapshot too old..." error.
    As a workaround I created several full DTPs that each load only part of the data, but I really want to understand why the delta init doesn't work as expected and documented (loading request by request in a new process).
    thanks
    Carsten

    Hemant Khemani wrote:
    > Please check if you have selected the option "Get All New Data Request by Request" in the delta DTP.
    > If you have selected this option, then the delta DTP will select only one request (the oldest not yet updated to the target) from the DSO.
    That is the option I have selected.
    When monitoring the DTP request, I see in the header data that all requests from the DSO are selected at once.

  • Inventory cube load: can I use full load?

    hi,
    I have an inventory cube with non-cumulative key figures. Can I use the full load option, or do I have to load with delta for the non-cumulative KFs?
    Regards,
    Andrzej

    Hi Aksik,
    The load type does not matter for the aggregation: you can load by full or delta.
    The reports then have to be created with this in mind so that the figures come out correctly. Specify the required aggregation in the properties of the key figure in the query. If required, use an ODS in the data flow.
    regards
    Happy Tony

  • Inventory Cube Loading - Questions....

    This is the process I intend to follow to load InfoCube 0IC_C03, along with my questions. I have the "How to Handle Inventory Scenarios" document, so please don't suggest I read that.
    1A) Delete the set-up tables for application 03.
    1B) Lock users.
    1C) Flush LBWQ and delete the entries in RSA7 related to application 03 (optional, only if needed).
    2A) Fill the set-up tables for 2LIS_03_BX. Do a full load into the BW inventory cube with "generate initial status" in the InfoPackage.
    2B) Compress the request with a marker update.
    3A) Fill the set-up table for 2LIS_03_BF with posting date 01/01/2006 to 12/31/2007 (historical) and load to the inventory cube with a full update.
          QUESTION 1: Does this need a marker update?
          QUESTION 2: Is this a good strategy? Do movement documents that old get updated once posted?
          QUESTION 3: Does the posting date restriction depend on how far back in history we want to go and look at stock values?
    3B) Fill the set-up table for 2LIS_03_BF with posting date 01/01/2008 to 9/9/9999 (current) and load to the inventory cube with a delta init with data transfer.
    3C) Compress the load in 3B without a marker update.
    4A) Fill the set-up table for 2LIS_03_UM and load to the inventory cube with a delta init.
    4B) Compress the load in 4A without a marker update.
          QUESTION 4: How should we select the posting date criteria? Do I need to load from 01/01/2006 to 9/9/9999, since that's the range used for BF?
    5) Start the V3 update jobs via job control.
    6) Initiate the subsequent delta loads from BF and UM and compress with a marker update.
    QUESTION 5: Is the sequence of loading BX first, then BF and UM, fixed? Or can I do BF first and then BX and UM?
    QUESTION 6: Any tips on minimizing downtime in this particular scenario? Please don't suggest generic steps. If you can suggest something specific to this situation, that'd be great.
    I hope you can help with the six questions I asked above.
    Regards,
    Anita S.

    Hi Anita,
    Please find my answers below. I have worked enough with this scenario to feel these points are worth considering for your situation.
    3A) Fill the set-up table for 2LIS_03_BF with posting date 01/01/2006 to 12/31/2007 (historical) and load to the inventory cube with a full update.
    QUESTION 1: Does this need a marker update?
    In this step we do not need a marker update while compressing (the marker logic is sketched after this post).
    QUESTION 2: Is this a good strategy? Do movement documents that old get updated once posted?
    I am not able to follow this question clearly.
    QUESTION 3: Does the posting date restriction depend on how far back in history we want to go and look at stock values?
    Yes. We need to start from the latest period and then go as far back as we want to see stock values. This holds true when we are using non-cumulative key figures.
    4B) Compress the load in 4A without a marker update.
    QUESTION 4: How should we select the posting date criteria? Do I need to load from 01/01/2006 to 9/9/9999, since that's the range used for BF?
    There is no need to provide any selection criteria for UM while loading to BW, as this would fetch the same data filled in the setup of revaluations. Unless you are looking for only part of the history, you can fill it for the company code list and bring all the data to BW in a single full load, as the data volume won't be that large compared to BF.
    6) Initiate the subsequent delta loads from BF and UM and compress with a marker update.
    QUESTION 5: Is the sequence of loading BX first, then BF and UM, fixed? Or can I do BF first and then BX and UM?
    The sequence is fixed in terms of compression with marker updates.
    QUESTION 6: Any tips on minimizing downtime in this particular scenario? Please don't suggest generic steps. If you can suggest something specific to this situation, that'd be great.
    Yes. Most of the time-consuming activity is filling the BF history; it is negligible for BX and UM by comparison. Either run multiple selective BF set-up fills based on posting date, depending on the available background processes, or fill the set-up for the open months during the downtime and fill the rest of the history once you unlock the system for postings. The reasoning is that we do not expect any postings into the history.
    Feel free to ask any further questions about this.
    Naveen.A
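    A schematic sketch in plain Python (not SAP code; the values and dates are made up) of the non-cumulative marker logic behind the compression steps above: the BX snapshot sets the marker, historical BF movements are compressed without a marker update because their effect is already contained in the snapshot, and new deltas roll the marker forward:
    marker = 0.0     # reference point (current stock), simplified to one material
    movements = []   # (posting_date, quantity), kept for historical readout
    def load_bx_snapshot(stock_on_snapshot_date):
        global marker
        marker = stock_on_snapshot_date            # compression WITH marker update
    def load_bf_history(historical_movements):
        movements.extend(historical_movements)     # compression WITHOUT marker update:
                                                   # the snapshot already reflects these
    def load_delta(new_movements):
        global marker
        movements.extend(new_movements)
        marker += sum(q for _, q in new_movements) # compression WITH marker update
    def stock_on(date):
        # stock at an earlier date = marker minus all movements posted after that date
        return marker - sum(q for d, q in movements if d > date)
    load_bx_snapshot(100.0)                                          # stock as of 2008-01-01
    load_bf_history([("2006-05-10", 60.0), ("2007-03-02", 40.0)])    # historical receipts
    load_delta([("2008-02-15", -30.0)])                              # new movement after the snapshot
    print(stock_on("2008-01-01"), stock_on("2006-12-31"))            # 100.0 60.0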
