Cube Maintenance

All,
I'm in the process of rebuilding an AW using non-aggregated exports. All the importing into the new AW is complete, so I'm now maintaining the base cubes to aggregate the data. During the dialog in AWM I'm given the option to run in parallel, which I accept, running 8 wide. The first two partitions complete relatively quickly, but then every 10 minutes the XML_LOAD_LOG table gets the following message:
13:10:26 Running Jobs: AWXML$_28702_31138, AWXML$_28702_31139, AWXML$_28702_31140, AWXML$_28702_31141, AWXML$_28702_31142, AWXML$_28702_31143, AWXML$_28702_31144, AWXML$_28702_31146. Waiting for Free Process...
After a number of hours the process fails with a nondescript error message:
03:15:14 ***Error Occured: Failed to Build(Refresh) PNRG4_AW.PNRG Analytic Workspace.
I assume this is because a free process is never obtained (so to speak). The relational side of this database is very busy, as we're mid batch run (the run takes several days). The Unix box is not under any stress, though (swap or CPU). So my question is: what resource might be lacking that prevents this job from grabbing a "Free Process"?
Nick.

Parallel cube builds use the DBMS_SCHEDULER.CREATE_JOB procedure to process each partition. If you specify 5 parallel processes, say, then the code will call DBMS_SCHEDULER.CREATE_JOB 5 times to start the first 5 slave jobs and will then go into a waiting loop. During this loop it repeatedly runs the following SELECT statement, waiting for the slaves to finish:
select job_name, status from all_scheduler_job_run_details where job_name like ...
When a slave finishes, the master immediately starts the next one by calling DBMS_SCHEDULER.CREATE_JOB again.
The "Waiting for Free Process..." message that you see in the load log is a 'heart beat' message to let you know that the master process is in this waiting loop. The message will be sent to the log every 10 minutes.
So in your case the slave processes are still working on the current batch of partitions and the master process is waiting for them to complete. This is not in itself a sign of a problem since partitions frequently take longer than 10 minutes to process, but obviously they are taking longer than you expect. The load log should give you some idea of what the slave processes are doing.
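If you want to see what the slaves are actually doing while the master waits, you can poll the scheduler views yourself. This is a minimal sketch, assuming the AWXML$_28702 job-name prefix shown in the log excerpt above; adjust the LIKE pattern to match your own run.

-- Slave jobs that have already finished (status, start time, duration):
select job_name, status, actual_start_date, run_duration
from   all_scheduler_job_run_details
where  job_name like 'AWXML$_28702%'
order  by actual_start_date;

-- Slave jobs still running:
select job_name, session_id, elapsed_time
from   all_scheduler_running_jobs
where  job_name like 'AWXML$_28702%';

Cross-referencing these session IDs with V$SESSION should show whether the remaining slaves are genuinely working or are stuck waiting on some resource.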

Similar Messages

  • AWM - After successful cube maintenance, data does not load

    Hi All,
    After creating the cube, when I maintain it the data does not load. I went back and looked at the mapping at the cube level and the dimension level, and everything looks fine. Any clues as to why the data does not get loaded? Also, I did not receive any error while maintaining the cube.
    thanks

    Hi,
    I was able to resolve this issue. I had not mapped the cube properly, especially the time dimension; because of this the data did not get loaded into the cube. Remapping fixed the problem.
    Hope this helps other newbies.

  • AWM Cube Maintenance

    We have schema "OOWAAPP".
    In AWM I have created "PEN_NB2_STD" workspace.
    Under that I have created CUBE "ANTAL_PERSONER_CUBE" which is based on View.
    I want to do incremental load by changing value in View condition.
    Could you please give me sample loading script which I can run.
    I tried as below
    BEGIN
    DBMS_CUBE.BUILD('PEN_NB2_STD.ANTAL_PERSONER_CUBE USING (CLEAR, LOAD, SOLVE)','S',false,1,true,true,false);
    END;
    Error:
    ORA-37162: OLAP error
    XOQ-01710: unknown build item "PEN_NB2_STD.ANTAL_PERSONER_CUBE"
    ORA-06512: at "SYS.DBMS_CUBE", line 234
    ORA-06512: at "SYS.DBMS_CUBE", line 287
    ORA-06512: at line 2
    BEGIN
    DBMS_CUBE.BUILD('OOWAAPP.PEN_NB2_STD.ANTAL_PERSONER_CUBE USING (CLEAR, LOAD, SOLVE)','S',false,1,true,true,false);
    END;
    Error:
    ORA-37162: OLAP error
    XOQ-01703: error during parse of build process script: "BUILD OOWAAPP.PEN_NB2_STD.ANTAL_PERSONER_CUBE USING (CLEAR, LOAD, SOLVE)", ""
    ORA-06512: at "SYS.DBMS_CUBE", line 234
    ORA-06512: at "SYS.DBMS_CUBE", line 287

    Take a look at these discussions from David Greenfield:
    Re: Enabling materialized view for fast refresh method
    Mapping to Relational tables
    Load Prune
    If you still have questions, then post again.
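    As a hedged sketch only: the XOQ-01710 error suggests the build item name was not recognized. DBMS_CUBE.BUILD identifies a cube as <owner>.<cube name>, so referencing the cube by its owner (presumably OOWAAPP) rather than by the analytic workspace name (PEN_NB2_STD) may be worth trying; the parameters below are positional, copied from your original call.

    BEGIN
      -- Assumption: the cube ANTAL_PERSONER_CUBE is owned by OOWAAPP and the
      -- AW name is not part of the build item.
      DBMS_CUBE.BUILD('OOWAAPP.ANTAL_PERSONER_CUBE USING (CLEAR, LOAD, SOLVE)',
                      'S', false, 1, true, true, false);
    END;
    /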

  • Performance tuning of cubes in Analytic Workspace Manager 10g

    Hi,
    Can anyone tell me, or suggest, how I should improve the performance of cube maintenance in Analytic Workspace Manager?

    Generate Statspack/AWR reports.
    See "HOW TO: Make a TUNING request":
    https://forums.oracle.com/forums/thread.jspa?threadID=2174552#9360003

  • Inventory Management Extractors via DSO to Cube 0IC_C03

    Dear experts,
    To fulfil the data warehouse requirements of entry, harmonization, and reporting layers, I want to use a data flow for Inventory Management from the extractors 2LIS_03_BX, 2LIS_03_BF and 2LIS_03_UM via one or more DSOs and then on to cube 0IC_C03.
    In this forum there are some open threads about this issue, but none has been resolved! Many hints point to the "How to... Inventory Management" paper; some say "You can use DSOs!" while others say "Don't use DSOs!". So where is the truth? Can anybody provide a really practicable way to implement the above-mentioned data flow?
    I look forward to your suggestions and answers.
    Thanks in advance and regards
    Patrick
    --> The data flow has to be in BW 7.0.

    Hi Patrick,
    Yes indeed there is.
    Using DSOs in the inventory flow is absolutely possible. Here's what you need to know and do:
    1. First, don't be apprehensive about it. Knowledge is power! We know that inventory uses the ABR delta update methodology, and that is supported by DSOs, so there is no doubt about it.
    2. Second, inventory is special because of non-cumulatives and the several rule groups at cube level for BX and BF.
    3. Since you want to use a DSO, and I presume it will be used for staging, use a write-optimized DSO as the first layer. When mapping from the DataSource to this DSO, keep a one-to-one mapping from fields to InfoObjects.
    4. Keep in mind that from the InfoSource to 0IC_C03 there are multiple rule groups in the transformations.
    These rule groups have different KPIs mapped with routines. The point is that from ECC only one field for quantity (BWMNG) and one field for value (BWGEO) arrives, but from the InfoSource to 0IC_C03 the same fields in different rule groups are mapped to different KPIs (stock in transit, vendor consignment, valuated stock, etc.) based on stock category, stock type and BW transaction keys.
    5. So create a write-optimized DSO, map it one-to-one from the DataSource fields, and create/copy transformations from this DSO to the cube, maintaining the same rule groups and the same logic.
    6. Alternatively, you can use a standard DSO, create the rule groups and logic at DSO level, and then map one-to-one from the DSO to the cube.
    Keep in mind that these rule groups and the logic in them must be precisely the same as in the standard flow.
    This should work.
    Debanshu

  • Updating Data from OLAP Cube

    Hi guys,
    I have completed the tutorial "Building OLAP 11g Cubes" (http://www.oracle.com/technology/obe/olap_cube/buildicubes.htm), followed by "Lesson 1: Creating OBIEE Metadata for OLAP 11g Cubes" (http://www.oracle.com/technology/obe/olap_biee/CreateBIEEMetadata.htm). But I notice that whenever new data is added to my database, the data isn't shown on my dashboard unless I go back to Analytic Workspace Manager, redo the step of maintaining the cube, and recreate the OBIEE metadata repository with Oracle BI Administration; only then does the new data show on my dashboard. Is there a way for me to change the settings so that the dashboard shows the data as soon as I update the database?
    Thanks for the help
    Jerald

    Hi,
    Whenever data is added to the relational database, you need to run cube maintenance to load that data into the cube; during the load the materialized views are refreshed to represent the latest dataset. For OBIEE, I think you need to refresh the repository to reflect the changes, but you need not rebuild it. One way to automate the cube-side refresh is sketched after this reply.
    Thanks,
    Brijesh
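    A minimal sketch of automating the cube refresh with a scheduler job, assuming an 11g cube that can be built with DBMS_CUBE.BUILD. The job name, schedule, and cube identifier (MYSCHEMA.MYCUBE) are illustrative assumptions only; substitute your own cube owner and name.

    BEGIN
      -- Hypothetical job: rebuilds the cube every night at 02:00 so the
      -- dashboard picks up new relational data without a manual AWM run.
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'REFRESH_MYCUBE',
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'BEGIN DBMS_CUBE.BUILD(''MYSCHEMA.MYCUBE USING (LOAD, SOLVE)''); END;',
        start_date      => SYSTIMESTAMP,
        repeat_interval => 'FREQ=DAILY;BYHOUR=2',
        enabled         => TRUE);
    END;
    /

    On the OBIEE side, if query caching is enabled you may also need to clear the BI Server cache so the dashboard does not keep serving stale cached results.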

  • Splitting an InfoCube

    Hi,
    "Please consider splitting a large InfoCube into smaller InfoCubes (for example, separated by a time characteristic like 0CALYEAR). You can use MultiProviders as query targets to obtain all data in your query result."
    1) Does the statement above refer to partitioning the cube, or to creating new cubes?
    2) How do I achieve it?
    Regards

    Hi,
    Re: Partitioning dimension
    http://help.sap.com/saphelp_nw04s/helpdata/en/33/dc2038aa3bcd23e10000009b38f8cf/frameset.htm
    Because partitioning is not possible with the MaxDB database, I use the following approach:
    1. Create new cubes based on years, e.g. 2006, 2007.
    2. In cube maintenance, set a constant for the year, e.g. 2006.
    3. In the DTP for each cube, filter on 0CALYEAR, e.g. 2006.
    4. Build a MultiProvider based on these cubes.
    Regards
    Andreas

  • RSA1 - Cube Data

    Is there any function module which can be used to get the details of a cube maintained in RSA1 in APO? The cube gets its data populated from BW.

    Hi Padmanabhan,
    Please try these function modules
    /FRE/FU_BW_LAST_EXTRACTION_GET
    OBJ2_VARALV_EXTRACT_SAVE
    /SAPTRX/BWEX (function group)
    ALV_EXTRACT_LOAD
    REUSE_ALV_EXTRACT_LOAD
    REUSE_ALV_EXTRACT_SAVE_LEVELS
    Please let me know whether your issue is resolved
    Regards
    R. Senthil Mareeswaran.

  • Query execution taking so much time

    Hi
    I have created a query on an Inventory cube (copied from 0IC_C03).
    In cube maintenance I changed 'maintain non-cumulative values' from calday and plant to calday only, in order to get the correct data on a daily basis.
    The query is based on material and vendor.
    It is taking a very long time, around 50 minutes, to execute.
    I think this is because of changing 'maintain non-cumulative values' in cube maintenance, but if I don't do that I can't get the correct values.
    I hope I have explained the problem clearly.
    Can anybody help me with this issue? I would appreciate any help.
    Thanks
    Madhu

    Hi,
    Have you tried other performance tuning activities, such as creating aggregates, checking the indexes on the cube, etc.?
    Also, did you give any filter criteria in the selection screens when running the query? If not, do that as well.
    I suggest you run transaction ST03 in expert mode to check where the time is being taken, and if needed run the query in debug mode using RSRT to see how best you can tune it further.
    Cheers,
    Kedar

  • L fact table

    If you use a non-cumulative key figure, the InfoCube maintains three fact tables: F, E and L.
    What does the L fact table contain?
    If we compress the data in the F table, the data moves to the E table; what then happens to the L fact table?

    Hi msg2dp:
       The "L" Fact Table is the "Validity Table". For InfoCubes with non-cumulative key figures, you need to maintain a validity table. This table specifies the time interval for which the non-cumulatives are valid for a specific characteristic combination.
    For more information take a look at these papers.
    "How to... Handle Inventory Management Scenarios in BW"
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/f83be790-0201-0010-4fb0-98bd7c01e328?quicklink=index&overridelayout=true
    "Non-cumulatives / Stock Handling"
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/93ed1695-0501-0010-b7a9-d4cc4ef26d31?quicklink=index&overridelayout=true
    Regards,
    Francisco Milán.

  • Calculation before aggregation

    Hello BW gurus,
    What does "calculation before aggregation" mean?
    In which situations do we use this setting?
    When can we use this option in modelling, and when in reporting?
    I am completely new to this; please explain it to me with a real-time example.
    Full points assured for the best answer.
    Many thanks in advance,
    Niru.

    Hi,
    An aggregate is a materialized, aggregated view of the data in an InfoCube. In an aggregate, the dataset of an InfoCube is saved redundantly and persistently in a consolidated form into the database.
    How to create an aggregate for the cube (for the example shown below):
    Step 1
    Go to transaction RSA1, right-click on the cube ---> Maintain Aggregates ---> Create by Yourself.
    Step 2
    In the dimension, drag in Region and give the aggregate a name.
    After dragging, in the aggregation selection type you will find *; that means it is assigned to all characteristic values.
    Alternatively, right-click on the * and select a fixed value such as NY (this will aggregate values only for NY).
    In this example I have taken *, i.e. all characteristic values (both NY and CA).
    Step 3
    Now place the cursor on the sigma symbol and activate the aggregate.
    Step 4
    Now load the data into the cube. Before loading it is good practice to switch the aggregate off, and to switch it back on after loading.
    This option is available from the top menu: Aggregate ---> Switch On/Off for Queries; switching it on makes the aggregate available to reporting.
    Doing this improves load performance, and query execution will also be faster.
    Step 5
    Now open the report and execute it.
    The scenario below shows you how to count the results of a calculated key figure in the BEx Query Designer.
    You have loaded the following data into your InfoCube:
    Before Aggregation
    Region | Customer | Sales Volume USD
    NY     | A        | 400
    NY     | B        | 200
    NY     | C        | 50
    CA     | A        | 800
    CA     | C        | 300
    After Aggregation
    Region | Sales Volume USD
    NY     | 650
    CA     | 1100
    Overall result: 1750
    hope this is clear
    Santosh

  • Replacing Oracle DW with BW

    Hello guys,
    I need your expert advice on replacing an Oracle data warehouse with BW. We have three cubes maintained in the Oracle data warehouse, and we are planning to implement a BW warehouse and replace Oracle. Could you please let me know where I should start? The Oracle warehouse is sourced from a non-R/3 system, while BW should be sourced from R/3, as the client is implementing SAP. Should we bring the data from the external source into R/3 before doing anything? Please guide me. Thank you.
    Raj.

    Hi Raj,
    Welcome to SDN!
    I don't think you have to bring all the data into R/3 first.
    You need to create a source system and set up DB Connect between Oracle and BW:
    http://help.sap.com/saphelp_nw04/helpdata/en/58/54f9c1562d104c9465dabd816f3f24/content.htm
    Then choose the relevant tables, generate DataSources on them, replicate these DataSources in BW, create the appropriate InfoObjects and data targets, assign the DataSources to InfoSources, create transfer rules, create update rules for the data targets, create InfoPackages, and finally load the data.
    It's a rather long way to go.
    Best regards,
    Eugene

  • EAS Service is Down suddenly

    Hi All,
    Today the EAS services suddenly went down on the server, and I was unable to access the EAS console.
    But there is no "EAS service stopped" information in any of the log files, and no entry in the Event Viewer either.
    Can you please let me know in which log file the EAS service stopped information is recorded?
    What are the possible reasons for the EAS services stopping suddenly?
    Thanks you very much in advance.
    Thanks,
    Swathi.

    Dear JohnGoodwin,
    Thank you very much for sharing the information.
    Regarding the first day of the EAS-down issue: we have nearly 30 cubes maintained on the server, and the EAS outage was caused by high memory utilization because all the cubes were loaded at that point. We think this was the main reason for the issue on the first day.
    After unloading the cubes, memory utilization dropped and the EAS service started again.
    But with respect to the second and third day issues, I do not understand what the root cause is.
    I have read the information provided at the link below:
    http://docs.oracle.com/cd/E17236_01/epm.1112/epm_install_11121/ch07s01s01.html
    In our case this issue happened on the development server; what steps do I need to follow to find the root cause and resolve the issue?

  • Restricting access to a cube while it is being maintained

    Hi,
    We are trying to restrict access via the Discoverer/Excel add-in to a cube while the cube is being maintained. We were able to achieve this by revoking privileges from certain roles before the start of the cube build.
    I would like to know if there is a better way, or built-in (out-of-the-box) functionality, that restricts access to a cube while it is refreshing. Any help is appreciated.

    Ragnar is correct: the best way to do this is to attach the AW in exclusive mode. You can either do this manually yourself before starting your load job, or automatically by scheduling the job and using multiple processes to load and solve the cube.
    The problem is removing users who are currently viewing data via Excel/Discoverer when the job starts. If you can ensure there will be no users accessing the AW when the job starts, then the exclusive attach mode will prevent any users from attaching the AW during the processing. If you cannot guarantee this, then there is a problem, because the job will fail when it tries to attach the AW in exclusive mode. Obviously you could put this in a loop and wait until a user exits the front-end application and releases the AW. Alternatively, you could write a SQL script to disconnect/kill all sessions accessing the AW (see the sketch at the end of this reply) - not very nice for the users, though, if they are building a report, because they will lose all their unsaved changes.
    When the AW is attached in exclusive mode, the bad news is that Discoverer/Excel will probably generate a nasty Java error message when a user tries to connect.
    Therefore, overall, not an ideal situation, but I cannot think of a really good way to manage this at the moment. Sorry I can't be more helpful.
    Keith Laker
    Oracle EMEA Consulting
    OLAP Blog: http://oracleOLAP.blogspot.com/
    OLAP Wiki: http://wiki.oracle.com/page/Oracle+OLAP+Option
    DM Blog: http://oracledmt.blogspot.com/
    OWB Blog : http://blogs.oracle.com/warehousebuilder/
    OWB Wiki : http://wiki.oracle.com/page/Oracle+Warehouse+Builder
    DW on OTN : http://www.oracle.com/technology/products/bi/db/11g/index.html
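    A minimal sketch of the kill-sessions approach mentioned above, assuming the V$AW_OLAP and DBA_AWS dictionary views are available; the AW owner and name (MYSCHEMA.MYAW) are hypothetical placeholders.

    -- Find sessions that currently have the AW attached.
    SELECT s.sid, s.serial#, s.username
    FROM   v$aw_olap ao
           JOIN dba_aws   aw ON aw.aw_number = ao.aw_number
           JOIN v$session s  ON s.sid        = ao.session_id
    WHERE  aw.owner = 'MYSCHEMA'
    AND    aw.aw_name = 'MYAW';

    -- Then, as a suitably privileged user, for each row returned:
    -- ALTER SYSTEM KILL SESSION '<sid>,<serial#>' IMMEDIATE;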

  • ORA-37999: Serious OLAP error - while running Maintain Cube...

    While running "Maintain cube..." I'm getting error
    "Action BUILDDATABASE failed on object TEST_DB.NPE"
    Below is the stack trace. It states its a "ORA-37999: Serious OLAP error". Am I missing something or should I contact Oracle support like the statement said?
    oracle.AWXML.AWException: Action BUILDDATABASE failed on object TEST_DB.NPE
    at oracle.AWAction.BuildDatabase.Execute(BuildDatabase.java:530)
    at oracle.olap.awm.wizard.awbuild.BuildWizardHelper$1.construct(BuildWizardHelper.java:185)
    at oracle.olap.awm.ui.SwingWorker$2.run(SwingWorker.java:109)
    at java.lang.Thread.run(Unknown Source)
    Caused by: oracle.AWXML.AWException: oracle.express.ExpressServerException
    java.sql.SQLException: ORA-37999: Serious OLAP error: UPDATE. Please contact Oracle Technical Support.
    ORA-03238: unable to extend LOB segment HPMP_WH.SYS_LOB0000043088C00004$$ subpartition SYS_LOB_SUBP73 by 64 in tablespace USERS
    ORA-06512: at "SYS.GENCONNECTIONINTERFACE", line 70
    ORA-06512: at line 1
    at oracle.AWXML.AWConnection.executeCommand(AWConnection.java:279)
    at oracle.AWAction.BuildDatabase.Execute(BuildDatabase.java:513)
    ... 3 more
    Caused by: oracle.express.ExpressServerException
    java.sql.SQLException: ORA-37999: Serious OLAP error: UPDATE. Please contact Oracle Technical Support.
    ORA-03238: unable to extend LOB segment HPMP_WH.SYS_LOB0000043088C00004$$ subpartition SYS_LOB_SUBP73 by 64 in tablespace USERS
    ORA-06512: at "SYS.GENCONNECTIONINTERFACE", line 70
    ORA-06512: at line 1
    at oracle.express.spl.SPLExecutor.executeCommand(SPLExecutor.java:155)
    at oracle.AWXML.AWConnection.executeCommand(AWConnection.java:268)
    ... 4 more

    Well... it was indeed a tablespace error; no need to contact Oracle.
    It worked fine after the DBAs added more space to the tablespace!
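    For reference, the underlying ORA-03238 indicates that the USERS tablespace ran out of room for the AW's LOB segment. A minimal sketch of how a DBA might check and extend it; the datafile path and sizes are illustrative assumptions only.

    -- How much free space is left in the tablespace named in the error?
    SELECT tablespace_name, ROUND(SUM(bytes)/1024/1024) AS free_mb
    FROM   dba_free_space
    WHERE  tablespace_name = 'USERS'
    GROUP BY tablespace_name;

    -- Add space (hypothetical file name and sizes; adjust for your system):
    ALTER TABLESPACE users
      ADD DATAFILE '/u01/oradata/ORCL/users02.dbf'
      SIZE 4G AUTOEXTEND ON NEXT 256M MAXSIZE 32G;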
