Oracle OLAP cube build question

Hello,
     I am trying to build a reasonably large cube (around 100
million rows from the underlying relational fact table). I am using
Oracle 10g Release 2. The cube has 7 dimensions, the largest of which
is TIME (6 years of data with the lowest level day). The cube build
never finishes.
It appears to fail during the "Auto Solve" step. I'm assuming this
means calculating the aggregations for the upper levels of the
hierarchies (although this is not explained in any of the
documentation I have).
I have two questions related to this:
1. Is there a way to keep these aggregations from being performed at
cube build time on dimensions with a value-based hierarchy? I have
already unchecked the one level-based dimension (TIME) on the
"Summarize To" tab in AW Manager.
2. Are there any other tips that might help me get this cube built?
Here is the log from the olapsys.xml_load_log table:
RECORD_ID LOG_DATE AW XML_MESSAGE
1. 09-MAR-06 SYS.AWXML 08:18:51 Started Build(Refresh) of APSHELL Analytic Workspace.
2. 09-MAR-06 SPADMIN.APSHELL 08:18:53 Attached AW APSHELL in RW Mode.
3. 09-MAR-06 SPADMIN.APSHELL 08:18:53 Started Loading Dimensions.
4. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Started Loading Dimension Members.
5. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Started Loading Dimension Members for ACCOUNT.DIMENSION (1 out of 9 Dimensions).
6. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Finished Loading Members for ACCOUNT.DIMENSION. Added: 0. No Longer Present: 0.
7. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Started Loading Dimension Members for CATEGORY.DIMENSION (2 out of 9 Dimensions).
8. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Finished Loading Members for CATEGORY.DIMENSION. Added: 0. No Longer Present: 0.
9. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Started Loading Dimension Members for DATASRC.DIMENSION (3 out of 9 Dimensions).
10. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Finished Loading Members for DATASRC.DIMENSION. Added: 0. No Longer Present: 0.
11. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Started Loading Dimension Members for ENTITY.DIMENSION (4 out of 9 Dimensions).
12. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Finished Loading Members for ENTITY.DIMENSION. Added: 0. No Longer Present: 0.
13. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Started Loading Dimension Members for INPT_CURRENCY.DIMENSION (5 out of 9 Dimensions).
14. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Finished Loading Members for INPT_CURRENCY.DIMENSION. Added: 0. No Longer Present: 0.
15. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Started Loading Dimension Members for INTCO.DIMENSION (6 out of 9 Dimensions).
16. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Finished Loading Members for INTCO.DIMENSION. Added: 0. No Longer Present: 0.
17. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Started Loading Dimension Members for RATE.DIMENSION (7 out of 9 Dimensions).
18. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Finished Loading Members for RATE.DIMENSION. Added: 0. No Longer Present: 0.
19. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Started Loading Dimension Members for RPTCURRENCY.DIMENSION (8 out of 9 Dimensions).
20. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Finished Loading Members for RPTCURRENCY.DIMENSION. Added: 0. No Longer Present: 0.
21. 09-MAR-06 SPADMIN.APSHELL 08:18:54 Started Loading Dimension Members for TIME.DIMENSION (9 out of 9 Dimensions).
22. 09-MAR-06 SPADMIN.APSHELL 08:18:55 Finished Loading Members for TIME.DIMENSION. Added: 0. No Longer Present: 0.
23. 09-MAR-06 SPADMIN.APSHELL 08:18:55 Finished Loading Dimension Members.
24. 09-MAR-06 SPADMIN.APSHELL 08:18:55 Started Loading Hierarchies.
25. 09-MAR-06 SPADMIN.APSHELL 08:18:55 Started Loading Hierarchies for ACCOUNT.DIMENSION (1 out of 9 Dimensions).
26. 09-MAR-06 SPADMIN.APSHELL 08:18:55 Finished Loading Hierarchies for ACCOUNT.DIMENSION. 1 hierarchy(s) ACCOUNT_HIERARCHY Processed.
27. 09-MAR-06 SPADMIN.APSHELL 08:18:55 Started Loading Hierarchies for CATEGORY.DIMENSION (2 out of 9 Dimensions).
28. 09-MAR-06 SPADMIN.APSHELL 08:18:56 Finished Loading Hierarchies for CATEGORY.DIMENSION. 1 hierarchy(s) CATEGORY_HIERARCHY Processed.
29. 09-MAR-06 SPADMIN.APSHELL 08:18:56 Started Loading Hierarchies for DATASRC.DIMENSION (3 out of 9 Dimensions).
30. 09-MAR-06 SPADMIN.APSHELL 08:18:56 Finished Loading Hierarchies for DATASRC.DIMENSION. 1 hierarchy(s) DATASRC_HIER Processed.
31. 09-MAR-06 SPADMIN.APSHELL 08:18:56 Started Loading Hierarchies for ENTITY.DIMENSION (4 out of 9 Dimensions).
32. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Finished Loading Hierarchies for ENTITY.DIMENSION. 2 hierarchy(s) ENTITY_HIERARCHY1, ENTITY_HIERARCHY2 Processed.
34. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Finished Loading Hierarchies for INPT_CURRENCY.DIMENSION. No hierarchy(s) Processed.
36. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Finished Loading Hierarchies for INTCO.DIMENSION. 1 hierarchy(s) INTCO_HIERARCHY Processed.
37. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Started Loading Hierarchies for RATE.DIMENSION (7 out of 9 Dimensions).
38. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Finished Loading Hierarchies for RATE.DIMENSION. No hierarchy(s) Processed.
39. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Started Loading Hierarchies for RPTCURRENCY.DIMENSION (8 out of 9 Dimensions).
40. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Finished Loading Hierarchies for RPTCURRENCY.DIMENSION. No hierarchy(s) Processed.
41. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Started Loading Hierarchies for TIME.DIMENSION (9 out of 9 Dimensions).
42. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Finished Loading Hierarchies for TIME.DIMENSION. 2 hierarchy(s) CALENDAR, FISCAL_CALENDAR Processed.
43. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Finished Loading Hierarchies.
44. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Started Loading Attributes.
45. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Started Loading Attributes for ACCOUNT.DIMENSION (1 out of 9 Dimensions).
46. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Finished Loading Attributes for ACCOUNT.DIMENSION. 6 attribute(s) ACCTYPE, CALC, FORMAT, LONG_DESCRIPTION, RATETYPE, SCALING Processed.
47. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Started Loading Attributes for CATEGORY.DIMENSION (2 out of 9 Dimensions).
48. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Finished Loading Attributes for CATEGORY.DIMENSION. 2 attribute(s) CALC, LONG_DESCRIPTION Processed.
49. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Started Loading Attributes for DATASRC.DIMENSION (3 out of 9 Dimensions).
50. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Finished Loading Attributes for DATASRC.DIMENSION. 3 attribute(s) CURRENCY, INTCO, LONG_DESCRIPTION Processed.
51. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Started Loading Attributes for ENTITY.DIMENSION (4 out of 9 Dimensions).
52. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Finished Loading Attributes for ENTITY.DIMENSION. 3 attribute(s) CALC, CURRENCY, LONG_DESCRIPTION Processed.
53. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Started Loading Attributes for INPT_CURRENCY.DIMENSION (5 out of 9 Dimensions).
54. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Finished Loading Attributes for INPT_CURRENCY.DIMENSION. 2 attribute(s) LONG_DESCRIPTION, REPORTING Processed.
55. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Started Loading Attributes for INTCO.DIMENSION (6 out of 9 Dimensions).
56. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Finished Loading Attributes for INTCO.DIMENSION. 2 attribute(s) ENTITY, LONG_DESCRIPTION Processed.
57. 09-MAR-06 SPADMIN.APSHELL 08:18:57 Started Loading Attributes for RATE.DIMENSION (7 out of 9 Dimensions).
58. 09-MAR-06 SPADMIN.APSHELL 08:18:58 Finished Loading Attributes for RATE.DIMENSION. 1 attribute(s) LONG_DESCRIPTION Processed.
59. 09-MAR-06 SPADMIN.APSHELL 08:18:58 Started Loading Attributes for RPTCURRENCY.DIMENSION (8 out of 9 Dimensions).
60. 09-MAR-06 SPADMIN.APSHELL 08:18:58 Finished Loading Attributes for RPTCURRENCY.DIMENSION. 2 attribute(s) LONG_DESCRIPTION, REPORTING Processed.
61. 09-MAR-06 SPADMIN.APSHELL 08:18:58 Started Loading Attributes for TIME.DIMENSION (9 out of 9 Dimensions).
62. 09-MAR-06 SPADMIN.APSHELL 08:20:26 Finished Loading Attributes for TIME.DIMENSION. 3 attribute(s) END_DATE, LONG_DESCRIPTION, TIME_SPAN Processed.
63. 09-MAR-06 SPADMIN.APSHELL 08:20:26 Finished Loading Attributes.
64. 09-MAR-06 SPADMIN.APSHELL 08:20:26 Finished Loading Dimensions.
65. 09-MAR-06 SPADMIN.APSHELL 08:20:26 Started Updating Partitions.
66. 09-MAR-06 SPADMIN.APSHELL 08:20:26 Finished Updating Partitions.
67. 09-MAR-06 SPADMIN.APSHELL 08:20:40 Started Loading Measures.
68. 09-MAR-06 SPADMIN.APSHELL 08:20:40 Started Load of Measures: SIGNEDDATA from Cube FINANCE.CUBE.
69. 09-MAR-06 SPADMIN.APSHELL 10:54:06 Finished Load of Measures: SIGNEDDATA from Cube FINANCE.CUBE. Processed 100000001 Records. Rejected 0 Records.
70. 09-MAR-06 SPADMIN.APSHELL 10:54:06 Started Auto Solve for Measures: SIGNEDDATA from Cube FINANCE.CUBE.

Hi, I've taken a few minutes to do a quick analysis. I just saw in your post that this isn't "real data", but some type of sample. Here is what I'm seeing. First off, this is the strangest dataset I've ever seen. With the exception of TIME, DATASOURCE, and RPTCURRENCY, every single other dimension is nearly 100% dense. Quite truthfully, in a cube with this many dimensions, I have never seen data be 100% dense like this (usually with this many dimensions it's more around 0.01% dense at most, usually even lower than that). Is it possible that the way you generated the test data caused this to happen?
If so, I would strongly encourage you to go back to your "real" data and run the same queries and post results. I think that "real" data will produce a much different profile than what we're seeing here.
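A quick way to check this yourself is to profile the relational fact table directly. A minimal sketch, assuming hypothetical table and column names (substitute your own fact and dimension tables):

  -- Count loaded rows and the distinct keys actually used per dimension.
  SELECT COUNT(*)                   AS fact_rows,
         COUNT(DISTINCT account_id) AS accounts_used,
         COUNT(DISTINCT entity_id)  AS entities_used,
         COUNT(DISTINCT intco_id)   AS intcos_used,
         COUNT(DISTINCT time_id)    AS days_used
  FROM   finance_fact;

  -- Compare each *_used figure with the member count of the matching
  -- dimension table (SELECT COUNT(*) FROM account_dim, and so on).
  -- Ratios near 100% for most dimensions match the dense profile described
  -- above; real data is usually far sparser.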
If you really do want to try and aggregate this dataset, I'd do the following:
1. Drop any dimension that doesn't add analytic value
Report currency is an obvious choice for this - if every record has exactly the same value, then it adds no additional information (but increases the size of the data)
Also, data source falls into the same category. However, I'd add one more question / comment with data source - even if all 3 values DID show up in the data, does knowing the data source provide any analytical capabilities? I.e. would a business person make a different decision based on whether the data is coming from system A vs. system B vs. system C?
2. Make sure all remaining dimensions except TIME are DENSE, not sparse. I'd probably define the cube with this order:
Account...........dense
Entity..............dense
IntCo...............dense
Category.........dense
Time...............sparse
3. Since time is level based (and sparse), I'd set it to only aggregate at the day and month levels (i.e. let quarter and year be calculated on the fly)
4. Are there really no "levels" in the dimensions like Entity? Usually companies define those with very rigid hierarchies (assuming this means legal entity)
Good luck with loading this cube. Please let us know how "real" this data is. I suspect with that many dimensions that the "real" data will be VERY sparse, not dense like this sample is, in which case some of the sparsity handling functionality would be a huge benefit for you. As it is, with the data being nearly 100% dense, turning on sparsity for any dimension other than TIME probably kills your performance.
Let us know what you think!
Thanks,
Scott

Similar Messages

  • Largest Oracle OLAP cube

    Forum members,
    What is the largest Oracle OLAP cube that is out there in terms of total size, number of dimensions, and metrics? The company I recently joined uses Microsoft SSAS for building cubes pulling data from an Oracle DB. The size of the Microsoft SSAS cube is 18 TB (terabytes). I am encouraging these people to use Oracle OLAP, but there is pushback saying Oracle OLAP won't scale to the extent Microsoft SSAS has.
    I need some real data to defend my case. Has anyone moved from Microsoft SSAS to Oracle OLAP?
    Thanks,
    Logan

    Logan,
    In addition to the link that David provided, some more points are:
    (1). With cube compression, it could be that the same 18 TB cube in MSAS is smaller in OLAP. It's hard to prove without doing a test load.
    (2). Microsoft does have a very feature-rich front-end to develop/build dimensions and cubes for MSAS, compared to Oracle OLAP's AWM.
    Most of the work is handled in relational SQL views before loading dimensions and cubes in Oracle OLAP. That is why there is relatively less functionality in AWM.
    (3). Another thing I have heard about in MSAS is "Scoped Assignments". I am not sure what it is, so I can't say how it can be done in Oracle OLAP. If your customer currently uses this MSAS feature, then you can post a question on this forum providing some details.
    (4). MSAS does provide good Excel reporting. If your client will continue using Excel, then you should look at Simba's Oracle OLAP MDX plugin for Excel. Search this forum and you will find information about it, or go to http://www.simba.com/MDX-Provider-for-Oracle-OLAP.htm
    (5). Writeback is another feature that can be done easily in MSAS. With Oracle OLAP, you will have to come up with a custom (although simple) solution, where the data is written into a table first and then loaded into the Oracle OLAP cubes. It's a very quick process.
    (6). There are more choices of reporting tools than with MSAS. Since Oracle OLAP cubes are queried through SQL SELECT statements, any reporting tool that can generate SELECT statements for an Oracle database can also work with Oracle OLAP (see the sketch at the end of this post).
    (7). Finally, it is much easier to federate (or combine) relational and multi-dimensional data between Oracle OLAP and relational tables, as both are in the same database. So you may not have to store in the OLAP cube all of the data that is currently loaded into the MSAS cube.
    Overall, it's a slightly different way of doing things between OLAP and MSAS. But I am almost certain that with the power and scalability of the Oracle database, Oracle OLAP will perform better than MSAS. You just have to know how to convert those MSAS cubes/dimensions into OLAP cubes/dimensions.
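    To illustrate point (6), here is a minimal sketch. The view and column names below are hypothetical; AWM generates the actual relational views for your cube and dimensions:

      -- Any tool that can issue a plain SELECT against Oracle can query the
      -- cube through its relational views, for example:
      SELECT t.calendar_year,
             p.department,
             f.sales
      FROM   sales_cube_view f
             JOIN time_view    t ON t.dim_key = f.time
             JOIN product_view p ON p.dim_key = f.product
      WHERE  t.level_name = 'CALENDAR_YEAR'
      AND    p.level_name = 'DEPARTMENT';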
    .

  • Can Crystal Connect to Oracle OLAP Cube

    I have CR Professional XI installed.  Does it have the capability to connect to Oracle 10g OLAP Cubes created in Oracle AWM?

    Hi Tommyi,
        For the Oracle OLAP cube, are we referring to Hyperion Essbase?
        If so, then we can create a connection in Crystal, under OLAP, to create an OLAP connection to Essbase in CR.
       Please let me know if this answers your question.
    Best Regards
    Carlos Chen

  • Refresh ORACLE OLAP cube

    I am new to Oracle OLAP programming and just wanted to know about cube data updates. For example, right now I have maintained and processed the cube for 2012 data; 2013 data is ready in the fact table and I need to include it in the cube as well...
    Thanks!

    Are you having any issue loading 2013 data into your cube, using AWM "Maintain Cube" option?
    The default cube loading behavior is to add new data during each cube build and keep existing data (unless it is overridden in a cell).
    Take a look at the SQL script generated by AWM. As long as there is no CLEAR command in it, your 2012 data will not be cleared.
    You can write your own DBMS_CUBE.BUILD statement.
    The DBMS_CUBE.BUILD procedure provides many ways of loading cube data (as well as refreshing dimensions) without using AWM:
    http://docs.oracle.com/cd/E11882_01/appdev.112/e16760/d_cube.htm#CHDFIFIA
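    For example, a minimal sketch (the schema and cube name below are hypothetical; adjust them to your own analytic workspace) that loads the new fact rows and re-solves without clearing existing data:

      BEGIN
        -- LOAD pulls in the newly mapped fact rows and SOLVE re-aggregates;
        -- there is no CLEAR step, so the existing 2012 data stays in place.
        DBMS_CUBE.BUILD('MYSCHEMA.SALES_CUBE USING (LOAD, SOLVE)');
      END;
      /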
    If you search this forum for "DBMS_CUBE" (select Forum = 'OLAP' and Date Range = 'ALL'), you will find a lot of examples of loading cubes and dimensions.
    Some examples are:
    Mapping to Relational tables
    Re: Enabling materialized view for fast refresh method
    DBMS CUBE BUILD
    Since you are new to all this, initially you should use AWM for all maintenance, and use the scripts generated by it.
    Only if you come to a point where AWM cannot be used for something should you write your own DBMS_CUBE.BUILD statement.
    .

  • Project Server 2013 OLAP cube build error

    When I try to build the OLAP cube I receive the below error:
    [2/26/2014 12:36 PM] Failed to build the OLAP cubes. Error: The attempt to build the OLAP database on server SQL_Server_Name failed, the SQL Server Analysis Services Analysis Management Objects (AMO) client software may not be installed on this server, please
    install/update the client on this server and retry. The underlying exception was: Could not load file or assembly 'Microsoft.AnalysisServices, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot
    find the file specified.
    I did some searching and found that the Analysis Management Objects (AMO) client software was not installed on my servers. I downloaded and installed the AMO client software, restarted the Project Server Service application on the 4 servers, and rebooted the
    4 servers, then tried to build the cube, but I got the same error. I checked the assemblies on the 4 servers and they contain the Microsoft.AnalysisServices DLL.
    Any help?
    I read that AMO client software tools must be installed during the farm setup, after installing the SharePoint Prerequisites and before installing SharePoint. Is that correct?

    Hello,
    Even with SQL 2012, the 2008 SQL AMO component is required on the Project Server servers. See the post from Brian for more details:
    http://blogs.msdn.com/b/brismith/archive/2012/11/12/project-server-2013-requirements-to-build-an-olap-cube.aspx
    Paul
    Paul Mather | Twitter |
    http://pwmather.wordpress.com | CPS

  • How to query Oracle OLAP cube?

    Does ODP.NET support querying a cube in Oracle? Does it support the MDX syntax?
    If ODP .NET does not support it, which other library does?
    Thanks,
    Yash

    Simba Technologies provides an MDX provider for Oracle OLAP.
    http://simba.com/evaluate-MDX-Provider-for-Oracle-OLAP.htm
    Watch a video on how to use Microsoft Excel with the MDX provider for Oracle OLAP.
    http://download.oracle.com/otndocs/products/warehouse/olap/videos/excel_olap_demo/Excel_Demo_for_Web.html
    If you have a requirement for specifically using ODP.NET with Oracle OLAP, let me know: alex.keh (at) oracle.com.

  • Olap cube build Error ORA-01403: no data found during DML "SYS.AWXML!R11_AGGREGATE_CONSISTENT"

    Can anyone help decode this error and explain how to resolve it?
    <ERROR>  <![CDATA[37162: ORA-37162: OLAP errorORA-01403: no data foundXOQ-01600: OLAP DML error while executing DML "SYS.AWXML!R11_AGGREGATE_CONSISTENT"ORA-06512: at "SYS.DBMS_CUBE", line 234ORA-06512: at "SYS.DBMS_CUBE", line 316ORA-06512: at line 1]]></ERROR>
    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, Automatic Storage Management, OLAP, Data Mining
    and Real Application Testing options
    Following is the cube_build_log rows when the error occurs
    BUILD_ID  STATUS  COMMAND  BUILD_OBJECT  BUILD_OBJECT_TYPE  OUTPUT_TXT
    9921  FAILED  BUILD    BUILD  <ERROR>  <![CDATA[XOQ-01707: Oracle job "A$9921_JOB$_501362" failed while executing slave build "ORSFHR_PRDBI.RESV_PACE_BI USING (LOAD PRUNE, SOLVE PARALLEL) AS OF SCN 10311831493648" with error "37162: ORA-37162: OLAP errorORA-01403: no data foundXOQ-01600: OLAP DML error while executing DML "SYS.AWXML!R11_AGGREGATE_CONSISTENT"ORA-06512: at "SYS.DBMS_CUBE", line 234ORA-06512: at "SYS.DBMS_CUBE", line 316ORA-06512: at line 1".]]></ERROR>
    9921  COMPLETED  DETACH AW    BUILD 
    9921  STARTED  DETACH AW    BUILD 
    9921  COMPLETED  ROLLBACK    BUILD 
    9921  STARTED  ROLLBACK    BUILD 
    9921  COMPLETED  REATTACH AW RW WAIT    BUILD 
    9921  STARTED  REATTACH AW RW WAIT    BUILD 
    9921  FAILED  SLAVE  RESV_PACE_BI  CUBE  <ERROR>  <![CDATA[37162: ORA-37162: OLAP errorORA-01403: no data foundXOQ-01600: OLAP DML error while executing DML "SYS.AWXML!R11_AGGREGATE_CONSISTENT"ORA-06512: at "SYS.DBMS_CUBE", line 234ORA-06512: at "SYS.DBMS_CUBE", line 316ORA-06512: at line 1]]></ERROR>
    9921  COMPLETED  DETACH AW  RESV_PACE_BI  CUBE 
    9921  FAILED  BUILD    BUILD  <ERROR>  <![CDATA[ORA-01403: no data foundXOQ-01600: OLAP DML error while executing DML "SYS.AWXML!R11_AGGREGATE_CONSISTENT"]]></ERROR>
    9921  STARTED  DETACH AW  RESV_PACE_BI  CUBE 
    9921  FAILED  SOLVE PARALLEL  RESV_PACE_BI  CUBE  <SolveStatistics  IS_EMPTY="no"  CHANGED_VALUES="yes"  CHANGED_RELATIONS="no"/><ERROR>  <![CDATA[ORA-01403: no data foundXOQ-01600: OLAP DML error while executing DML "SYS.AWXML!R11_AGGREGATE_CONSISTENT"]]></ERROR>
    9921  STARTED  SOLVE PARALLEL  RESV_PACE_BI  CUBE 
    9921  COMPLETED  LOAD  RESV_PACE_BI  CUBE  <CubeLoad  LOADED="4114188"  REJECTED="0"/>

    Please enter a service request with regard to this issue with Oracle Support.
    thanks,
    Ken

  • Recommended Hardware Config for huge OLAP Cube build

    Hi David ,
    Can you please provide the recommended hardware configuration for a cube with billions of rows in the underlying fact table? We ran a cube build with 0.1 billion rows and it took around 7 hours to process. What are the key areas for gaining performance, and what CPU, server RAM, and Oracle DB memory would give the most benefit in such a configuration?
    Also, we are on 32-bit Windows 2003 Server. Can we get better results if we switch to 64-bit?
    Thanks in advance,
    DxP.

    Hi!
    Well, I would definitely recommend engaging a consultant, because it sounds like you are short on methodology and experience and are also under time constraints on your project.
    Regarding hardware, unfortunately, I am not able to give you any precise figures because I have no supporting information. What you should bear in mind is that your system must be balanced. You (ideally with a consultant) need to find the right balance between all the factors you consider important while putting together the hardware:
    - cost
    - architecture
    - speed
    - availability
    - load scenarios
    etc...
    Regarding the architecture point, bear in mind that today finding the right balance between processing power and storage processing capacity is a challenge. Pay attention to this.
    Regards,
    Kirill

  • Project Server 2013 - OLAP Cube Build Issues

    Hi,
    I have a Project Server 2013 environment with SQL Server 2012 SP1. My cube built successfully and Report Authors group (including me) were accessing the templates and Data Connections from BI centre successfully.
    I re-built the cube and suddenly none of us can access the templates or data connections; it says 'Access denied'!
    I went to the server hosting Analysis Services and checked the ProjectServerViewOlapDataRole role, and none of the database permission check boxes were ticked, as they had been when I initially built the cube and added the Report Authors group and the account
    that is used by the Secure Store app. Membership has both the account and the group.
    Has anyone come across this issue?
    Thank you,
    SJ

    SJ,
    The access of the BI Center, templates and data connections are handled by SharePoint permissions and are usually inherited from PWA.
    The ProjectServerViewOlapDataRole simply provides the ability to view the data from the OLAP database. It sounds like the issue isn't with the OLAP but rather with PWA itself. An OLAP error would have only precluded you from retrieving data.
    Did someone modify the BI Center site permissions recently such as breaking inheritance?
    Treb Gatte, Project MVP |
    @tgatte | http://AboutMSProject.com

  • Oracle olap cubes

    Hi guys,
    There are many things that I couldn't understand about creating a cube
    for OLAP using Enterprise Manager (or the CWM2 package).
    I need more details about:
    1- Aggregation in a cube
    2- Solve order
    3- The dimension alias for each hierarchy, and what the difference is between it and the dimension
    Also, in aggregation I defined how a measure will be aggregated for each dimension,
    and I don't understand this.
    Can anyone explain these things in more detail?
    Thanks a lot.

    The issue seems to be with the ODBC and got resolved...cheers

  • Oracle OLAP Universe

    Not sure where to post this so apologize if this is not the right place for this post.
    I am at a client site where there is a requirement to build a universe on top of an Oracle OLAP (10g) cube for use with WebI, Crystal, Xcelsius, etc., which they want to begin using. Here are the issues I am facing and need some help with.
    1) They have a view built on top of their OLAP cube that I have been told is what is used for reporting. There are also 3 hierarchies that are maintained in PeopleSoft. They have developed a reporting interface using BO SDK APIs and have their own custom SQL engine built in Java. They use this SQL engine attached to a Crystal template to build their reports (the last part didn't make sense to me, so I am quoting it verbatim as I have limited familiarity). When I try creating a universe using the view mentioned above in Designer, it throws an error with the message E_METADATA_EMPTY_LEVELLIST. When I try using Universe Builder to create the universe, as soon as I right-click on the cube to select 'create universe', it hangs indefinitely. My question is whether this is even possible and, if so, what is the solution to my problem above.
    2) The second issue is that while the facts are in the OLAP cube, they maintain 2-3 hierarchies in PeopleSoft. From what I have gathered by talking to them, they use a DBLINK from the OLAP database to connect to PeopleSoft and join the OLAP view with the PeopleSoft hierarchy when they dynamically build their queries to fetch the data for their reports. If I have to do this in the universe as well, how would I go about doing it, and is it even possible?
    I really need an answer fast. If what I'm trying to do is possible, then I need some help with the 'how', and if the answer is 'no', then I need a proper explanation for that too.
    Appreciate the help !!
    Kartik

    Hi,
    Rather than move all your data to another cube like SSAS as has been suggested, or go back to the relational source, why not just use Voyager to connect directly to the Oracle OLAP cube you have already built?
    Simba's MDX Provider for Oracle OLAP allows Voyager to do this. 
    This would be much simpler than introducing a new tier of server (SSAS), and allows you to take advantage of the performance of the OLAP engine in the Oracle database.
    http://www.simba.com/oracle
    in particular, look at the document that describes how to set this up:
    http://www.simba.com/docs/Using-Simba-MDX-Provider-for-Oracle-OLAP-with-SAP-BusinessObjects-Voyager.pdf
    Mike

  • Will Oracle OLAP handle our case(s).

    We are about to build a cube to roughly handle following dimensions and facts:
    15 dimensions ranging from a couple of members to 40,000+ members.
    A fact table holding 200,000,000+ rows
    So my question is: Does anybody have a sense of whether OLAP has a chance of handling this data? We are pretty certain that the data will be sparse.
    A second item relates to whether Oracle OLAP cubes give us the ability to compute what we refer to as "Industry" data. We serve a number of companies and we compute metrics that apply to their data. In order to allow these companies to see how they are doing against the other companies, we provide the metrics for every other company; these metrics are considered Industry. So my question is: Do OLAP cubes have any structure or mechanism that allows us to compute these metrics within the same cube, or do we have to create a separate cube to hold the Industry metrics?
    Thanks,
    Thomas

    Thomas,
    I cannot advise you for or against based on the small amount of information I have. I will not deny that at 15 dimensions you are at the upper limit of what you can achieve using the current (11.1) OLAP technology, but I have seen cubes of this size built and queried, so I know it is possible.
    The answer would depend on many things: hardware, query tools, expectations for query and build performance, and whether you have a viable alternative technology (which will determine how hard you will work to get past problems). It even depends on your project time frames, since release 11.2 is currently in beta and will, we hope, handle greater volumes of data than 11.1 or 10g.
    One important factor is how you partition your cube. At what level do you load the data (e.g. DAY or WEEK)? What is your partition level (e.g. MONTH or QUARTER)? A partition that loads, say, 10 million rows, is going to be much easier to build than a partition with 50 million rows. To decide this you need to know where your users will typically issue queries, since queries that cross partitions (e.g. ask for data at the YEAR level but are partitioned by WEEK) are significantly slower than those that do not cross partitions (e.g. ask for data at WEEK when you are partitioned by MONTH).
    Cube-based MVs can offer advantages for cubes of this size even if you define a single, 15-dimensional cube. One nice trick is to only aggregate the cube up to the partition level. Suppose, for example, that you load data at the DAY level and partition by QUARTER. Then you would make QUARTER be the top level in your dimension instead of YEAR or ALL_YEARS. The trick is to make YEAR an ordinary attribute of QUARTER so that it appears in the GROUP BY clause of the cube MV. Queries that ask for YEAR will still rewrite against the cube, but the QUARTERS will be summed up to YEAR using standard SQL. The result will generally be faster than asking the cube to do the same calculation. This technique allows you to lower your partition level (so that there are fewer rows per partition) without sacrificing query performance.
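    To make the YEAR-as-attribute trick concrete, here is a hedged sketch (all object names are hypothetical). Because YEAR appears in the GROUP BY clause of the cube MV as an attribute of QUARTER, a YEAR-level query can still rewrite against the cube and the quarters are rolled up with ordinary SQL:

      SELECT t.year_attr, p.product_type, SUM(f.sales) AS sales
      FROM   sales_fact f
             JOIN time_dim    t ON t.day_id     = f.day_id
             JOIN product_dim p ON p.product_id = f.product_id
      GROUP  BY t.year_attr, p.product_type;
      -- With query rewrite enabled, the optimizer can satisfy this from the
      -- QUARTER-level cube MV and sum the four quarters per year in SQL.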
    Cube-based MVs also allow you to raise the lowest level of any dimension (to PRODUCT_TYPE instead of PRODUCT_SKU say). Queries that go against the upper levels (PRODUCT_TYPE and above) will rewrite against the cube, and those that go against the detail level (PRODUCT_SKU) will go direct to the base tables. This compromise is worth making if your users only rarely query against the detail level and are willing to accept slower response times when they do. It can reduce the size of the cube considerably and is definitely worth considering.
    David

  • OLAP cubes from heterogeneous data sources using XML DB

    hi.
    Q:1
    How can we create an OLAP cube (an XML cube) from XML data sources when using XML DB?
    Q:2
    How can we create OLAP cubes from warehouses and flat files, and convert these cubes into XML cubes?
    Q:3
    Is there any other tool (except Analytic Workspace Manager, AWM) which supports the construction of OLAP cubes in XML format from heterogeneous data sources?
    Edited by: user11236392 on Aug 21, 2009 3:50 AM

    Hi Stuart!
    Your understanding is partially correct. I am working on providing an architecture for XOLAP, and XML is one of my data sources.
    The idea is to generate uniform cubes from heterogeneous sources that can be integrated into a global cube. Instead of building Oracle OLAP cubes from all the sources (that work is already done), I want to generate an XCube from XML data sources using XQuery.
    On the other hand, if we have generated the Oracle OLAP cube from other sources like warehouses or flat files, I have to convert these Oracle OLAP cubes into XML cubes for uniformity. In a research paper I found that there is an operator, XCube, embedded in XQuery which converts the multidimensional data (cube) into an XML cube. I am looking for the implementation of this operator in XQuery and how it works.
    I hope you understand my architecture, but if you still have some confusion, kindly give me your mail ID and I will mail you the diagram of my architecture.
    Thanks,
    Saqib

  • Specifying OLAP Cubes On XML Data

    Hi,
    Is there any good example of how OLAP handles XML data?
    Is such management possible in Oracle 9i Release 2?


  • ORACLE olap - dimension with huge number of members.

    Hi Experts,
    I am modeling an Oracle OLAP cube with 10 dimensions. One of the dimensions has a huge number of leaf members (probably 1 billion). How many members can I have in one dimension? Is there any limit on the number of dimension members?
    Thanks and Regards
    Siva

    Hi there,
    I'm not sure of the theoretical limits, but I would never recommend trying to build a dimension with 1 billion members.
    In situations where such granular data exists, it is common to leave the detail in a table and load data into the OLAP cube at a more aggregate level.
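    A minimal sketch of that approach (all names are hypothetical): pre-aggregate the billion-member detail in a view and map the cube to the view, so the cube's grain stays at a manageable level:

      CREATE OR REPLACE VIEW sales_by_customer_segment AS
      SELECT f.time_id,
             f.product_id,
             c.segment_id,                 -- far fewer segments than customers
             SUM(f.amount) AS amount
      FROM   sales_fact_detail f
             JOIN customer_dim c ON c.customer_id = f.customer_id
      GROUP  BY f.time_id, f.product_id, c.segment_id;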
    Which business entity are you trying to model in a 1 billion member dimension?
    Thanks,
    Stuart Bunby
    OLAP Blog: http://oracleOLAP.blogspot.com
    OLAP Wiki: http://wiki.oracle.com/page/Oracle+OLAP+Option
    OLAP on OTN: http://www.oracle.com/technology/products/bi/olap/index.html
    DW on OTN : http://www.oracle.com/technology/products/bi/db/11g/index.html
