Creating a cube using OEM

I have an Oracle 10g database that has 5 tables in it. These tables are linked together through keys. I am curious to know whether I can create a cube with two or more dimensions based on columns from the same table, and also have the measure(s) for that cube come from the same table. Would I need to create a view of that table and use that view as the "fact table"? (The single-table idea is sketched just below.)
It seems like the cubes built in a lot of these samples use dimensions that each come from a separate table. Does it always have to be that way?
Thanks for any input.
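As a minimal sketch of the idea in the question (the table ORDERS and its columns REGION, PRODUCT and AMOUNT are hypothetical, chosen only for illustration): one table can feed both the dimension sources and the "fact table", either directly or through views.

-- Hypothetical single source table
CREATE TABLE orders (
  order_id NUMBER PRIMARY KEY,
  region   VARCHAR2(30),
  product  VARCHAR2(30),
  amount   NUMBER
);

-- One view per dimension, both built on the same table
CREATE OR REPLACE VIEW region_dim_v  AS SELECT DISTINCT region  FROM orders;
CREATE OR REPLACE VIEW product_dim_v AS SELECT DISTINCT product FROM orders;

-- The "fact table" is simply a view of the same table exposing the dimension keys and the measure
CREATE OR REPLACE VIEW orders_fact_v AS
  SELECT region, product, amount FROM orders;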


Similar Messages

  • Error while creating a cube using Cube Builder

We have installed the Cube Builder for SSM 7.0.
While creating the cube I am facing an error; the .TRC file contents are as follows:
    LSS>
    LSS> set control LINKID SSM_CB_EA
    LSS>
    LSS> .... capture cube id in a control var to simplify queries on other tables
    LSS> access lslink
    LSLink> connect SSM_CB_EA
    LSLink>
    LSLink> SELECT ID FROM CPMS_CB_CUBES WHERE NAME = 'CUBEBUILDER1'
    [Microsoft][ODBC SQL Server Driver][SQL Server]Invalid object name 'CPMS_CB_CUBES'.
    SQLSTATE: S0002
    SQL System code: 208
    LSLink> lss create code = 'set control CUBEID ' + ID
    LSLink> output proc setcubeid;PIPADMINDEFAULT over
    LSLink> peek create nohead nonumb
    No Fields currently Selected
    ACC004:
    No Record Has Been Accessed From the Database.
    ACC004:
    No Record Has Been Accessed From the Database.
    CHE FRE
    CHE UPD
    Kindly help me out........

    Hi Martin,
    I believe that the issue you are encountering now is that you're using the wrong set of procedures for accessing MS SQL Server.
In section 5.4.1 (Setting up SAP NetWeaver System database software) of the Strategy Management configuration guide, there is a section which discusses steps that are required depending on whether you are using MaxDB or MS SQL Server as your system database behind NetWeaver CE. My guess is that you have not followed the steps described for SQL Server there. This is simple to do though, because all you need to do is copy the files in the "<install-dir>\SAP\SSM\InternetPub\procs\sqlsrvr_procs" directory into the "<install-dir>\SAP\SSM\InternetPub\procs" directory.
This is required because, when using Cube Builder or Entry and Approval, the PAS component communicates directly with the system database in order to load data from there into PAS. Because MaxDB and SQL Server have different SQL syntax, specifically regarding field concatenation, slightly different SQL is required (roughly the difference sketched at the end of this message). So if you copy the SQL Server-specific procedures across, you should be able to successfully build your model from Cube Builder.
    Of course, having read through your question again I'm not 100% sure that this is what you are hitting.  The problem you're seeing seems to be that, when we internally concatenate some selected fields together, we aren't finding a field called ID.  However, the SQL select being issued doesn't seem to have returned any error messages, so I'm not sure why there is a problem referencing the field.
The only thing that comes to mind is a problem that I ran into at one point with the type of driver I was using to connect to SQL Server. The PAS component of Strategy Management, even though it runs on Windows 2003 Server 64-bit, is still only a 32-bit component. This means that, when configuring the System Data Sources within the ODBC applet in Control Panel, it is critical that the 32-bit version of that ODBC applet be used, as documented in the note at the end of section 5.4.3 in the configuration guide. The reason this is important is that, if you use an OLEDB connection to SQL Server, for some reason the field names are not returned correctly. It is only when using the ODBC driver that the field names are returned correctly.
    So please ensure that you are using the ODBC driver for your Link ID which connects to SQL Server, and not an OLEDB connection.
    Hope this helps,
    Robert
    Edited by: Robert Holland on Dec 19, 2008 3:23 PM
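The concatenation difference Robert mentions is, as far as I understand it, roughly the one below. The table and columns are taken from the trace (CPMS_CB_CUBES, ID, NAME), but the exact SQL the delivered procedures issue is an assumption.

-- ANSI / MaxDB-style string concatenation uses ||
SELECT 'set control CUBEID ' || id
FROM   cpms_cb_cubes
WHERE  name = 'CUBEBUILDER1';

-- SQL Server-style concatenation uses + (with an explicit CAST if id is numeric)
SELECT 'set control CUBEID ' + CAST(id AS VARCHAR(20))
FROM   cpms_cb_cubes
WHERE  name = 'CUBEBUILDER1';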

  • How to create olap cube using Named Query Table in Data source View

I created an OLAP cube using existing tables and it works fine, but when I use a named query table with a relationship to another named query table it does not work. Please give me some deeper clarification on OLAP cubes for a better understanding.
    Thanks

    Hi Pawan,
What do you mean by "It Not Working"? As Kamath said, please post the detailed error message, so that we can make further analysis.
In the Data Source View of a cube, we can define a named query. In a named query, you can specify an SQL expression to select rows and columns returned from one or more tables in one or more data sources. A named query is like any other table in a data source view (DSV) with rows and relationships, except that the named query is based on an expression (a minimal example is sketched at the end of this message).
    Reference:Define Named Queries in a Data Source View (Analysis Services)
    Regards,
    Charlie Liao
    TechNet Community Support
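For illustration only (the table and column names dbo.Customer, dbo.Region, CustomerKey, RegionKey are hypothetical): a named query in a DSV is just a stored SELECT like the one below. Because a named query carries no foreign keys, the relationship to another named query (or table) has to be added manually in the DSV on the matching key columns, and each named query normally needs a logical primary key defined on it.

-- Hypothetical named query "CustomerWithRegion" as it would be stored in the DSV
SELECT c.CustomerKey,
       c.CustomerName,
       r.RegionName
FROM   dbo.Customer c
JOIN   dbo.Region   r ON r.RegionKey = c.RegionKey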

  • Creating a Cube using pure OLAP DML

    This should be quite simple, but neither reading OLAP_*.pdf nor googling helped me, so maybe you can give me a hint where to look:
    I want to create dimensions, populate them, create a cube, and finally populate it, using just OLAP DML (i.e. no DBMS_AW or CWM/CWM2 packages).
    1) How can I execute OLAP DML? Right now I'm using sqlplus, wrapping each line in execute dbms_aw.execute(...);
    There's some better way, right? :)
2) Can someone give me a link to a simple tutorial showing how to create a cube etc.? Right now I'm as far as
    AW CREATE olaptest1
    DEFINE dimfoo DIMENSION TEXT AW olaptest1
    DEFINE dimbar DIMENSION TEXT AW olaptest1
    SQL SELECT DISTINCT(foo) FROM footable ORDER BY 1 INTO :dimfoo
(the latter one fails, telling me that "quux is not a valid olaptest1!dimfoo")
3) Have I understood this correctly: I only need an OLAP Catalog if I want to work with AWM or DBMS_AWM, not if I just use DML?

    I'm not sure exactly why you want to do this.
I guess I didn't explain very clearly :)
We (i.e. the company I work for) create software (in C++) which needs to work with a lot of data that lives in an Oracle DB.
As the analysis done by the software would profit (performance-wise) from the layout of the data in a cube, we want to get the data into a cube.
But since information like the number of dimensions isn't known beforehand (that's something that depends on the site the customer uses the software at, and can change over time), the software has to do the creation of the dimensions and the cube itself.
(For the curious ones: http://www.centerpoint.eu.com/ )
And as the OLAP API isn't available for C++ (at least as far as I can see), I'd really want to do this in PL/SQL.
If you do use standard form, you then allow yourself the option of using the various tools later
That's really not needed :)
You can always build the original AW using AWM, then save the build script it generates, and use that to do subsequent builds.
    Is that build-script in OLAP DML? How do I save it?
BTW, in addition to the SPL INFILE command, there is the DBMS_AW.INFILE PL/SQL routine, which saves you some quotes :-).
    dbms_aw.execute('infile ''work_dir/my.inf''') vs
    dbms_aw.infile('work_dir/my.inf')
    I hate superfluous quotes :)
    Thanks Jim!
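Putting Jim's INFILE tip together with the dimension-population step from the question, a minimal sketch might look like the following. It assumes the objects named above (olaptest1, dimfoo, footable) and a directory alias work_dir; the :APPEND keyword on the embedded SQL fetch is, as far as I understand the OLAP DML SQL command, what allows new values to be added to the dimension instead of raising the "is not a valid olaptest1!dimfoo" error.

" --- contents of work_dir/my.inf (OLAP DML; lines starting with a double quote are comments)
aw attach olaptest1 rw
sql declare c1 cursor for select distinct foo from footable order by 1
sql open c1
" :APPEND adds values that are not yet in the dimension
sql fetch c1 loop into :append dimfoo
sql close c1
sql cleanup
update
commit

-- PL/SQL side: run the file with the routine Jim mentions
BEGIN
  dbms_aw.infile('work_dir/my.inf');
END;
/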

  • How we can update cube using Change Tables as Source

I have created a cube using source tables. I have also created a change table (a 9i feature) based on the source table, which stores new changes to the source table. I want to load the cube using one mapping, from either the source table or the change table, based on the validity of the change table.
Is it possible to do this? Has anybody implemented change tables for incremental loads of a cube?
    Sanjiv Tyagi

    Hi Sanjiv,
    I can think of 2 ways of doing this:
One is using a single mapping with a pre-mapping process to determine whether the change table is valid. If it is, the pre-mapping process creates a view (that is also referenced in the mapping itself) on top of the change table (select * from change_table); if it is not valid, the view is created on the source table (select * from source). After this pre-mapping process the mapping runs on whichever view exists and loads the data (a sketch of such a pre-mapping procedure follows at the end of this message).
The second way is using a process flow and two mappings. In the process flow you have a PL/SQL call that verifies which data to use. If the change table is correct, that returns success and the mapping on the change table is used. If the return is an error, then you use the mapping on the source table.
I have not implemented these solutions; however, I think these are two ways of getting it to behave like you would want to...
    Hope this gives you some ideas,
    Jean-Pierre
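A minimal sketch of the pre-mapping idea above, purely as an illustration: the procedure name, the validity test, and the object names (LOAD_SRC_V, CHG_ORDERS, ORDERS) are hypothetical, and it assumes the change table and the source table expose compatible columns for the mapping.

CREATE OR REPLACE PROCEDURE pre_map_pick_source IS
  l_has_changes NUMBER;
BEGIN
  -- Hypothetical validity test: does the change table hold any rows?
  SELECT COUNT(*) INTO l_has_changes
  FROM   chg_orders
  WHERE  ROWNUM = 1;

  IF l_has_changes > 0 THEN
    -- Point the view the mapping reads from at the change table
    EXECUTE IMMEDIATE 'CREATE OR REPLACE VIEW load_src_v AS SELECT * FROM chg_orders';
  ELSE
    -- Fall back to the full source table
    EXECUTE IMMEDIATE 'CREATE OR REPLACE VIEW load_src_v AS SELECT * FROM orders';
  END IF;
END pre_map_pick_source;
/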

Create and Populate a Hyperion Planning Cube using Hyperion Data Integration Management

    Friends,
I am new to Essbase and have worked extensively in Informatica. Hyperion DIM (the OEM version of Informatica) has been chosen to create and populate a Hyperion Planning system (with an Essbase cube in the backend).
    I am using Hyperion DIM 9.3.
    Can someone let me know (or share a document) how I can do the following
    1) Create a Planning application with a Essbase Cube in the backend using Hyperion Data Integration Management
    2) How to populate the Essbase outline and the actuals Essbase cube with data using DIM.
    Thanks a lot for all help.

    Hi,
    You cannot create planning applications using DIM.
    To load metadata have a look at :- http://www.oracle.com/technology/obe/hyp_fp/DIM_Planning/OBE_Dim_Planning.html
You can refresh the planning database in DIM. To enable the Refresh Database property for a session:
    In Workflow Manager, right-click the session and select Edit.
    Click the Mapping tab.
    Select a Planning target.
    Check the Refresh Database box.
    Ok?
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Create Catalog using OEM

    Hi,
I have a relational database. I have created a schema only with views based on the underlying tables. Now, I want to create the OLAP Catalog using the views.
I do not know how to do this using OEM. If anyone has done this earlier and can provide me that information, that will be of great help.
    thanks,
    Saumen.

Yes, Saumen.
OLAP metadata represents a star or snowflake schema as a logical cube consisting of measures, dimensions, levels, hierarchies, and attributes.
The OLAP Catalog consists of tables that instantiate the logical model and APIs for updating and viewing the model tables. The write APIs are PL/SQL procedures. The read APIs are relational views (queried in the sketch at the end of this message).
The OLAP feature within Oracle Enterprise Manager is a graphical tool for manipulating some of the OLAP Catalog APIs. You can use Enterprise Manager to create and modify OLAP metadata as long as the source data is stored in a star or snowflake schema.
    To create a dimension:
    Navigate to the Database home page and choose Administration.
    On the Database Administration page, choose Dimensions or OLAP Dimensions.
    On the Dimensions page, click Create.
    On the General page of the dimension property sheet, specify a name, schema, and (OLAP) type for the dimension. The type is Time for time dimensions and Normal for all other dimensions.
    On the Levels page, click Add to define each level of the dimension. Levels are containers for the members of the dimension. Each level maps to one or more columns. Each dimension must have at least one level.
    On the Hierarchies page, click Add to define a set of parent-child relationships between the levels of the dimension. Dimensions may have one or more hierarchies.
    On the Attributes page, define the attributes that hold descriptive information about levels of the dimension. Each attribute maps to a column. By default, each OLAP dimension has Long Description and Short Description attributes for each level. Time dimensions have additional default attributes for End Date and Time Span. Specify columns for each of the default attributes. Click Add to define additional attributes.
    On the OLAP Options page, provide OLAP-specific labels and descriptions for the dimension.
    If the dimension maps to a snowflake schema, use the Join Keys page to define the join keys between levels that are mapped to columns in different tables.
    When you have fully specified the dimension, click OK.
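Once OEM has written the metadata, one way to check it is to query the OLAP Catalog read views mentioned above. This is a sketch under the assumption that the CWM2 catalog views ALL_OLAP2_DIMENSIONS and ALL_OLAP2_CUBES are installed (as they normally are when the OLAP Catalog is present) and that they expose the usual OWNER, DIMENSION_NAME and CUBE_NAME columns.

-- List the dimensions and cubes registered in the OLAP Catalog
SELECT owner, dimension_name FROM all_olap2_dimensions;
SELECT owner, cube_name      FROM all_olap2_cubes;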

  • Create SSRS report using DMV for querying SSAS cube.

I am trying to create an SSRS report to find the cube/dimension status (when the cube/dimension was last processed and whether it failed or succeeded). For example, I have the DMV query below for this:
SELECT CUBE_NAME, LAST_DATA_UPDATE FROM $System.MDSCHEMA_CUBES
When I execute the above query in the MDX query window it comes up with results; when I try to create a dataset using the above query in Report Server it comes up with an error.
Error: Please verify that the query is an MDX one and not DMX. (Microsoft.AnalysisServices.Controls)
Can we use DMV queries for creating SSRS reports, and what should the data source be? (One possible workaround is sketched after the reply below.)
    Thank You.
    Praveen

    Hi Praveen,
    Glad to hear that the issue had been solved. Thank you for sharing the useful information.
    Regards,
    Charlie Liao
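Since the thread does not spell out what solved it, here is one commonly used workaround, sketched as an assumption rather than as the fix used here: run the DMV through the relational engine with OPENROWSET against the MSOLAP provider, so the SSRS dataset can use an ordinary SQL Server data source. The server and catalog names are placeholders, and 'Ad Hoc Distributed Queries' must be enabled on the SQL Server instance.

-- Hypothetical: wrap the DMV in OPENROWSET so a plain SQL Server data source can run it
SELECT *
FROM OPENROWSET(
       'MSOLAP',
       'Data Source=localhost;Initial Catalog=MySSASDatabase;',
       'SELECT CUBE_NAME, LAST_DATA_UPDATE FROM $System.MDSCHEMA_CUBES');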

  • Can I create a BI Beans compliant cube using OWB?

Can I create a cube that I can browse using BI Beans through OWB 9.0.4, or are there additional steps that I need to take using other tools such as Enterprise Manager?
Are there any known incompatibilities between OWB 9.0.4 and BI Beans 9.0.3.1?
    I will also pose this question in the BI Beans forum.
    Thanks for any replies.
    Cor

    Hi,
I am trying to build an analytic workspace using the OWB (9.0.4.8) Transfer Bridge and I got a similar error.
None of the view/MV SQL scripts were generated, and the analytic workspace was not created either.
    FYI.
    **! Transfer logging started at Wed May 14 18:07:41 EDT 2003 !**
    OWB Bridge processed arguments
    Default local= en_US
    Exporting project:OM_SAMPLE
    initializing project:OM_SAMPLE
    Initializing module :WH
    Exporting cube:SALES
    Exporting dimension:CHANNELS
    Exporting dimension:COUNTRIES
    Exporting dimension:CUSTOMERS
    Exporting dimension:PRODUCTS
    Exporting mappings
    Exporting table:CHANNELS
    Exporting table:COUNTRIES
    Exporting table:CUSTOMERS
    Exporting table:PRODUCTS
    Exporting table:SALES
    Exporting datatypes
    Exporting project OM_SAMPLE complete.
    setting parameter: olapimp.deploytoaw = Y
    setting parameter: olapimp.awname = OWBTARDEMO
    setting parameter: olapimp.awobjprefix = OWBTAR_
    setting parameter: olapimp.awuser =
    setting parameter: olapimp.createviews = Y
    setting parameter: olapimp.viewprefix = OWBTAR_
    setting parameter: olapimp.viewaccesstype = OLAP
    setting parameter: olapimp.creatematviews = Y
    setting parameter: olapimp.viewscriptdir = /opt/oracle
    setting parameter: olapimp.deploy = N
    setting parameter: olapimp.username = OLAPSYS
    setting parameter: olapimp.password = manager
    setting parameter: olapimp.host = 10.215.79.139
    setting parameter: olapimp.port = 1521
    setting parameter: olapimp.sid = INDEXDB
    setting parameter: olapimp.inputfilename = C:\TEMP\bridges\null-nullMy_Metadata_Transfer1052950061353.XMI
    setting parameter: olapimp.outputfilename = C:\Panneer\owbtardemo.sql
    Loading Metadata
    Loading XMI input file
    processing dim: CHANNELS
    processing level: CHANNELin dimension CHANNELS
    processing level attribute use: CHL_ID in level CHANNEL for level attribute ID
    processing level attribute : ID in level CHANNEL
    processing level attribute use: CHL_LLABEL in level CHANNEL for level attribute LLABEL
    processing level attribute : LLABEL in level CHANNEL
    processing level attribute use: CHL_SLABEL in level CHANNEL for level attribute SLABEL
    processing level attribute : SLABEL in level CHANNEL
    processing level: CLASSin dimension CHANNELS
    processing level attribute use: CLS_ID in level CLASS for level attribute ID
    processing level attribute : ID in level CLASS
    processing level attribute use: CLS_LLABEL in level CLASS for level attribute LLABEL
    processing level attribute : LLABEL in level CLASS
    processing level attribute use: CLS_SLABEL in level CLASS for level attribute SLABEL
    processing level attribute : SLABEL in level CLASS
    processing hierarchy: CHANNEL_HIERARCHY in dimension CHANNELS
    processing dim: COUNTRIES
    processing level: REGIONin dimension COUNTRIES
    processing level attribute use: RGN_ID in level REGION for level attribute ID
    processing level attribute : ID in level REGION
    processing level attribute use: RGN_LLABEL in level REGION for level attribute LLABEL
    processing level attribute : LLABEL in level REGION
    processing level attribute use: RGN_SLABEL in level REGION for level attribute SLABEL
    processing level attribute : SLABEL in level REGION
    processing level: COUNTRYin dimension COUNTRIES
    processing level attribute use: CTY_ID in level COUNTRY for level attribute ID
    processing level attribute : ID in level COUNTRY
    processing level attribute use: CTY_LLABEL in level COUNTRY for level attribute LLABEL
    processing level attribute : LLABEL in level COUNTRY
    processing level attribute use: CTY_SLABEL in level COUNTRY for level attribute SLABEL
    processing level attribute : SLABEL in level COUNTRY
    processing hierarchy: COUNTRY_HIERARCHY in dimension COUNTRIES
    processing dim: CUSTOMERS
    processing level: CUSTOMERin dimension CUSTOMERS
    processing level attribute use: CTR_CREDIT_LIMIT in level CUSTOMER for level attribute CREDIT_LIMIT
    processing level attribute : CREDIT_LIMIT in level CUSTOMER
    processing level attribute use: CTR_EMAIL in level CUSTOMER for level attribute EMAIL
    processing level attribute : EMAIL in level CUSTOMER
    processing level attribute use: CTR_ID in level CUSTOMER for level attribute ID
    processing level attribute : ID in level CUSTOMER
    processing level attribute use: CTR_NAME in level CUSTOMER for level attribute NAME
    processing level attribute : NAME in level CUSTOMER
    processing dim: PRODUCTS
    processing level: PRODUCTin dimension PRODUCTS
    processing level attribute use: PDT_DESCRIPTION in level PRODUCT for level attribute DESCRIPTION
    processing level attribute : DESCRIPTION in level PRODUCT
    processing level attribute use: PDT_ID in level PRODUCT for level attribute ID
    processing level attribute : ID in level PRODUCT
    processing level attribute use: PDT_LIST_PRICE in level PRODUCT for level attribute LIST_PRICE
    processing level attribute : LIST_PRICE in level PRODUCT
    processing level attribute use: PDT_MIN_PRICE in level PRODUCT for level attribute MIN_PRICE
    processing level attribute : MIN_PRICE in level PRODUCT
    processing level attribute use: PDT_NAME in level PRODUCT for level attribute NAME
    processing level attribute : NAME in level PRODUCT
    processing level: CATEGORYin dimension PRODUCTS
    processing level attribute use: CTY_ID in level CATEGORY for level attribute ID
    processing level attribute : ID in level CATEGORY
    processing level attribute use: CTY_LLABEL in level CATEGORY for level attribute LLABEL
    processing level attribute : LLABEL in level CATEGORY
    processing level attribute use: CTY_SLABEL in level CATEGORY for level attribute SLABEL
    processing level attribute : SLABEL in level CATEGORY
    processing hierarchy: PRODUCT_HIERARCHY in dimension PRODUCTS
    processing cube: SALES
    processing classification type is := Warehouse Builder Business Area
    processing catalog name := SALESCOLLECTION ,and description is := null
    processing catalog entry element name := SALES
    processing Cube
    processing catalog entity cube := SALES
    processing measure := COSTS , in a cube := SALES
    processing measure := SALES , in a cube := SALES
    processing catalog entry element name := CHANNELS
    processing catalog entry element name := COUNTRIES
    processing catalog entry element name := CUSTOMERS
    processing catalog entry element name := PRODUCTS
    processing catalog entry element name := CHANNELS
    Class Name CHANNELS is TableImpl@405ffd not supported
    processing catalog entry element name := COUNTRIES
    Class Name COUNTRIES is TableImpl@5e1b8a not supported
    processing catalog entry element name := CUSTOMERS
    Class Name CUSTOMERS is TableImpl@6232b5 not supported
    processing catalog entry element name := PRODUCTS
    Class Name PRODUCTS is TableImpl@6f144c not supported
    processing catalog entry element name := SALES
    Class Name SALES is TableImpl@14013 not supported
    processing classification type is := Dimensional Attribute Descriptor
    Classification type Dimensional Attribute Descriptor is not supported
    closing output file
    closing log stream
    **! Transfer process 2 of 2 completed with status = 0 !**
    **! Transfer logging stopped at Wed May 14 18:07:47 EDT 2003 !**
    But when I ran the "select * from dba_registry" everything seems to be valid.
    CATALOG     Oracle9i Catalog Views     9.2.0.2.0     VALID     24-APR-2003 09:39:24     SYS     SYS     DBMS_REGISTRY_SYS.VALIDATE_CATALOG
    CATPROC     Oracle9i Packages and Types     9.2.0.2.0     VALID     24-APR-2003 09:39:24     SYS     SYS     DBMS_REGISTRY_SYS.VALIDATE_CATPROC
    OWM     Oracle Workspace Manager     9.2.0.1.0     VALID     24-APR-2003 09:39:27     SYS     WMSYS     OWM_VALIDATE
    JAVAVM     JServer JAVA Virtual Machine     9.2.0.2.0     VALID     23-APR-2003 22:19:09     SYS     SYS     [NULL]
    XML     Oracle XDK for Java     9.2.0.2.0     VALID     24-APR-2003 09:39:32     SYS     SYS     XMLVALIDATE
    CATJAVA     Oracle9i Java Packages     9.2.0.2.0     VALID     24-APR-2003 09:39:32     SYS     SYS     DBMS_REGISTRY_SYS.VALIDATE_CATJAVA
    ORDIM     Oracle interMedia     9.2.0.2.0     LOADED     23-APR-2003 23:16:42     SYS     SYS     [NULL]
    SDO     Spatial     9.2.0.2.0     LOADED     23-APR-2003 23:17:06     SYS     MDSYS     [NULL]
    CONTEXT     Oracle Text     9.2.0.2.0     VALID     23-APR-2003 23:17:26     SYS     SYS     [NULL]
    XDB     Oracle XML Database     9.2.0.2.0     VALID     24-APR-2003 09:39:39     SYS     XDB     DBMS_REGXDB.VALIDATEXDB
    WK     Oracle Ultra Search     9.2.0.2.0     VALID     24-APR-2003 09:39:42     SYS     WKSYS     WK_UTIL.VALID
    OLS     Oracle Label Security     9.2.0.2.0     VALID     24-APR-2003 09:39:43     SYS     LBACSYS     LBAC_UTL.VALIDATE
    ODM     Oracle Data Mining     9.2.0.1.0     LOADED     12-MAY-2002 17:59:03     SYS     ODM     [NULL]
    APS     OLAP Analytic Workspace     9.2.0.2.0     LOADED     23-APR-2003 22:49:51     SYS     SYS     [NULL]
    XOQ     Oracle OLAP API     9.2.0.2.0     LOADED     23-APR-2003 22:51:49     SYS     SYS     [NULL]
    AMD     OLAP Catalog     9.2.0.2.0     VALID     02-MAY-2003 15:00:13     SYS     OLAPSYS     CWM2_OLAP_INSTALLER.VALIDATE_CWM2_INSTALL
    Your help is appreciated!
    Thanks
    Panneer

  • Good document for creating cube using AWM wih oracle 10.1.0.2 version

Can anyone provide a good guide for implementing OLAP cubes using Oracle version 10.1.0.2.0 with AWM Release 1?
    Also, please let me know whether I need to apply any patches.
    The demo provided under this link has some differences when trying to implement.
    http://www.oracle.com/technology/products/bi/olap/viewlet/awm10g_viewlet_swf.html
    Your help is appreciated.

Well, if some of the required components were missing, I would imagine you would get an error when trying to connect?
Where is this ASP application actually running?
Is it running on your localhost IIS web server?
If not, any setup on your computer will be irrelevant.
    If it is, you need to install MDAC components, and 2.8 is the latest.
    If running on your local IIS, did you try to create a very basic ASP page which just creates a connection to the database in question? Does that work?
    Do you have ON ERROR statements in your code to bypass any/all SQL errors?
If you create a simple UDL file on your desktop and specify the server/user/password (after selecting the Oracle OLEDB driver), does the connect button work?

  • Error Using OEM to create export files

    Hi
I get the following error whenever I try to use OEM to create export files (log in to the OS as the oracle user -> http://localhost:1158/em -> log in as SYSTEM -> Maintenance -> Export to Export Files, with the oracle user as host credentials):
    >>>
    Validation Error
    Examine and correct the following errors, then retry the operation:
    Error - ERROR: NMO not setuid-root (Unix-only)
    >>>
    From what I've read in this forum and in others is that the problem is often related to the owner/group permissions associated with the bin/nmo and bin/nmb files. However I've tried the posted solutions by changing the owner/permissions as suggested, but this does not help. I get the same error regardless if the files are owned by oracle or by root.
    Here is some more info
    >>>
    [root@rhlinux bin]# id oracle
    uid=501(oracle) gid=502(dba) groups=502(dba)
    [root@rhlinux bin]# ls -ld /app/
    drwxr-xr-x 4 root root 4096 Jul 6 13:36 /app/
    [root@rhlinux bin]# ls -ld /app/oracle/
    drwxrwxr-x 5 oracle dba 4096 Jul 10 10:13 /app/oracle/
    [root@rhlinux bin]# ls -ld /app/oracle/product/
    drwxrwx--- 3 oracle dba 4096 Jul 6 16:04 /app/oracle/product/
    [root@rhlinux bin]# ls -ld /app/oracle/product/v10.2.0/
    drwxr-x--- 58 oracle dba 4096 Jul 11 15:31 /app/oracle/product/v10.2.0/
    [root@rhlinux bin]# ls -ld /app/oracle/product/v10.2.0/bin/
    drwxr-xr-x 2 oracle dba 8192 Jul 20 11:19 /app/oracle/product/v10.2.0/bin/
    [root@rhlinux bin]# ll nm?
    -rwxr-x--- 1 root dba 18462 Jul 6 16:16 nmb
    -rwxr-x--- 1 root dba 19624 Jul 6 16:16 nmo
    >>>
    >>>
    [oracle@rhlinux bin]$ uname -a
    Linux rhlinux 2.6.9-34.0.2.EL #1 Fri Jun 30 10:23:19 EDT 2006 i686 i686 i386 GNU/Linux
    >>>
    >>>
    [oracle@rhlinux bin]$ sqlplus system@dware
    SQL*Plus: Release 10.2.0.1.0 - Production on Thu Jul 20 16:34:16 2006
    Copyright (c) 1982, 2005, Oracle. All rights reserved.
    Enter password:
    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production
    With the Partitioning, OLAP and Data Mining options
    >>>
    >>>
    [oracle@rhlinux bin]$ emctl status agent
    TZ set to Canada/Saskatchewan
    Oracle Enterprise Manager 10g Database Control Release 10.2.0.1.0
    Copyright (c) 1996, 2005 Oracle Corporation. All rights reserved.
    Agent Version : 10.1.0.4.1
    OMS Version : 10.1.0.4.0
    Protocol Version : 10.1.0.2.0
    Agent Home : /app/oracle/product/v10.2.0/rhlinux.swlocal_DWARE
    Agent binaries : /app/oracle/product/v10.2.0
    Agent Process ID : 32098
    Parent Process ID : 30792
    Agent URL : http://rhlinux.swlocal:3938/emd/main
    Started at : 2006-07-20 12:00:05
    Started by user : oracle
    Last Reload : 2006-07-20 12:00:05
    Last successful upload : 2006-07-20 16:27:29
    Total Megabytes of XML files uploaded so far : 1.54
    Number of XML files pending upload : 0
    Size of XML files pending upload(MB) : 0.00
    Available disk space on upload filesystem : 47.83%
    Agent is Running and Ready
    >>>
    >>>
    [oracle@rhlinux bin]$ emctl status dbconsole
    TZ set to Canada/Saskatchewan
    Oracle Enterprise Manager 10g Database Control Release 10.2.0.1.0
    Copyright (c) 1996, 2005 Oracle Corporation. All rights reserved.
    http://rhlinux.swlocal:1158/em/console/aboutApplication
    Oracle Enterprise Manager 10g is running.
    Logs are generated in directory /app/oracle/product/v10.2.0/rhlinux.swlocal_DWARE/sysman/log
    >>>
    Any other ideas what might be causing this problem?
    Thanks and take care,
    Shayne

From what I've read in this forum and in others is that the problem is often related to the owner/group permissions associated with the bin/nmo and bin/nmb files. However I've tried the posted solutions by changing the owner/permissions as suggested, but this does not help. I get the same error regardless if the files are owned by oracle or by root.
    I am not sure which "posted solutions" you had, but this is related to not running root.sh which must be run against the Agent Home.
    1. Stop the Agent (emctl stop agent) while connected to the OS as Oracle user (normally oracle) that installed Grid Control
    2. Connect to the OS as root and run root.sh from Agent ORACLE_HOME (if you do not have root access, your System Admin can do it)
    3. Start the Agent (emctl start agent) while connected to the OS as Oracle user

  • Creating a Physical Standby Database using OEM GC

This paper is about the creation of a physical standby database using OEM GC.
You will see that there is really nothing to it, and using the Data Guard Broker, performing a switchover is as easy as 1, 2, 3.
BTW, this works in both OEM GC 10.2.0.5 and 11.1.
    http://oemgc.wordpress.com/2010/07/19/creating-a-physical-standby-database-using-oem-gc/

    Thanks Rob for sharing this.
    Regards
    Rajesh

  • How to create ssas cube by using MySQL Database

Please tell me the step-by-step process to create an SSAS cube using a MySQL database on my system. I have SQL Server 2008 Enterprise Edition and MySQL 5.0.

There is an OLEDB provider for MySQL which you can get from here: https://cherrycitysoftware.com/ccs/Providers/ProvMySQL.aspx. Otherwise you can also use SSIS to push the data straight into the Analysis Services database without needing to stage it in SQL Server. Also, as you can load data into AS using XMLA, you could write your own loader that extracts data using ODBC and pushes it into AS using XMLA, which is essentially what I suspect SSIS does. However, those latter solutions don't allow you to create a database on top of MySQL, because you need an OLEDB (or .NET) provider for that.
In the simplest case, install the OLEDB provider and then in AS create a Data Source connection using that provider. Once you have done that you should be able to create a Data Source View using that connection, enabling you to import the schema definitions for the tables/views in the MySQL database. From there you build dimensions and cubes etc., about which there is plenty of information on the web.
    http://bi-logger.blogspot.com/

When I create a new cube using another cube as template, which datasource is assigned to the new cube

    Hi,,
When I create a new cube using another cube as a template (copy), which data source is assigned to the new cube? I ask because I am getting all the fields that are in the copied cube, even though I selected only the required fields. Why is this occurring?
    Thanx in advance,
    Ravi

    Hi,
Usually, when you copy the cube, only the data model is copied, meaning all key figures and characteristics, including navigational characteristics.
Maybe after copying you can reassign the characteristics once again in the new cube as per your requirement.
When you create the update rules, you can assign the InfoSource or the original cube as the data source.
Hope that may help...
    Vis

  • Creating a cube with more than one dimension

    I have been able to create a cube with one dimension using our own data. I am able to view data from this cube in Cube Viewer and in a presentation created with BI Beans in JDeveloper.
    However, I have been unsuccessful in doing this when creating a cube with two dimensions. What am I missing?
    I have been using OEM to create the dimensions and cubes, etc. in a 9.2.0.4 database.

    You can use Analytic Workspace Manager 10.1.0.4 to create analytic workspaces (MOLAP) in Oracle OLAP 10.1.0.4. The Model View in this GUI tool utilizes the OLAP AW Java API that was introduced in Oracle OLAP 10g. This Java API fully abstracts the logical dimensional model from physical design.
    OWB Paris, which is currently in beta, likewise uses the OLAP AW API to create AWs. In addition, it can create ROLAP cubes via the OLAP Catalog CWM2 APIs. This is a change in APIs as the previous version used CWM1/Lite like Enterprise Manager. Also, for ROLAP cubes OWB Paris will automatically include MR_REFRESH in its scripts and will call the appropriate DBMS_ODM package in order to create materialized views.
    As for tutorials for creating a star schema, check with the Oracle Warehouse Builder forum.
OWB Forum: Warehouse Builder
