Warehouse Problem !!!

Hi guys, I'm having a huge problem trying to create a new repository with the Warehouse Builder Repository Assistant. It raises an error during installation mentioning a JAVAIDPRIV exception. I have tried many things but still have not resolved it.
I'm trying to set things up so that I can use Warehouse Builder against Oracle 10g Express Edition. If anyone can provide me with some guidance, I would appreciate it.
Thanks very much.
Edited by: user609620 on Oct 4, 2009 5:19 AM

Hi,
you cannot install an OWB repository into Oracle Express Edition. You need a database edition that includes Java in the database, such as Standard Edition or Enterprise Edition.
Regards,
Carsten.
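If you want to confirm whether your database includes the Java option at all, one way is to query the component registry (a minimal sketch against the standard DBA_REGISTRY view; on Express Edition the Java components are simply absent):
    SELECT comp_id, comp_name, status
      FROM dba_registry
     WHERE comp_id IN ('JAVAVM', 'CATJAVA');  -- JServer JVM and Java packages; no rows on XE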

Similar Messages

  • Data warehouse problem plz help

    hi, I have a problem building my first warehouse.
    First of all, I have many operational databases, and I want to build a warehouse that pulls the data from these databases and stores it organized by time.
    Also, how do I connect a VB .NET application to run queries and retrieve data from the data warehouse?
    Is this possible, and can anyone help, please?

    Because of this our server gets shut down automatically
    No. Just because the connection pool got suspended, the server should not go down; there is some other issue which you did not notice. For the data source to function properly, make sure the initial and maximum connection limits are set appropriately (preferably both should be equal), make sure the database is always up and running, and make sure it allows that many connections. Check with the DBA for the database connection limit settings.
    Raise an SR with support if you are not able to figure out the exact issue.
    Regards,
    Anuj
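    A quick way for the DBA to compare the configured connection limits against actual usage is a query along these lines (a minimal sketch against the standard V$RESOURCE_LIMIT view; run it on the database the data source points at):
    SELECT resource_name, current_utilization, max_utilization, limit_value
      FROM v$resource_limit
     WHERE resource_name IN ('processes', 'sessions');
    The pool's maximum connection count should stay comfortably below the SESSIONS limit shown here.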

  • Knowledge warehouse problem

    Hi all,
    I have installed KW 7.0. When I log into Knowledge Workbench, I am able to find all the folders and info objects (PowerPoint presentations).
    When I try to view the data in an info object, I get an error:
    An unexpected error occurred while retrieving info object ' ' for editing: you are not authorized to modify this info object.
    I just want to see the content of the info object, not edit it.
    Please let me know how I can view the data.
    Thanking you,
    With Regards,
    venkat.

  • Data Warehouse Cubes Not Processing

    We are with a customer now who is having data warehouse problems, the main issues being:
    - ETL jobs run fine (except MPSync, which finishes with only 179/180 jobs complete)
    - Cube processes are stuck in a RUNNING loop; they never complete or fail out, and all show a last run time of 1/10 and a next run time of 1/11
    We have scoured the internet for a solution and come across various blogs on fixing the issue. We have tried to manually disable the jobs and then manually process the cubes, restart the DW server, restart SSAS, etc., to no avail.
    The latest solution we tried was to unregister the DW and re-register it; however, when we went to unregister the DW we received the following error:
    "Failed to unregister from the data warehouse"
    Title: Error
    Text: System.ArgumentException: SM data source not found.
    at
    Microsoft.EnterpriseManagement.ServiceManager.UI.Administration.DWRegistration.Unregistration.DWUnregistrationHelper.AcceptChanges(WizardMode wizardMode)
    So our next step was to unregister the data sources and then re-register them individually. Of the two data sources, we were able to unregister and re-register the DW data source (DW_COMPANY01), but when we tried to unregister the Operational data source (COMPANY01) we got the following error:
    Title:  An error was encountered while running the task.
    Text:  Exception has been thrown by the target of an invocation.
    Based on the two errors shown above, I assume we cannot unregister this data source, or the DW as a whole, because it cannot find that operational data source. A couple of things about this data source and the environment to shed some light on the situation:
    The customer upgraded all servers to UR4 around the January time frame
    The customer promoted a new primary MS around January 20th
    Currently, when reviewing the details of the data source, the SDK Server references the computer name of the OLD primary management server
    Looking at the event logs on the DW MS (with SSRS/SSAS), an error shows that the MPSync job failed to finish and references the OLD primary management server
    We are looking for any guidance on this issue. We have taken many steps to troubleshoot the problem, to no avail. Please let me know if you have any questions or need any more information.

    Hi!
    It was probably a problem with the primary keys, because when MPSync does not complete, something will fail in the following ETL jobs.
    The solution would have been here: http://technet.microsoft.com/en-us/library/jj614520.aspx
    At this stage, frankly speaking, if you still have all the data in your CMDB and nothing has been lost to grooming, uninstall all DWH components (DW management server, databases) and reinstall the DW completely.
    R.

  • Index on Partitioned Table with Some ReadOnly Tablespaces

    We have a warehouse with fact tables range partitioned on date - daily partitions, with each month's worth of partitions placed in a specific monthly tablespace. Each month, we set the prior month's tablespace to READ ONLY. So our table ends up having data in both read-only and read-write tablespaces.
    We now have a change we need to make to one of the fact tables - we need to add a new column AND add an index on that column. But because we have partitions in read-only state, Oracle doesn't let us create the index, and it also doesn't let us update the local unique key (unique index).
    Is there a way we can do this without having to put the tablespaces in read-write mode? Just as importantly, what happens when we offline or drop some of the older tablespaces (for archiving purposes)? We need to find a way to add the index on just the read-write partitions.
    Thanks.

    Hi,
    Improvements in Oracle 10g maintain local partitioned indexes when you use partition DDL commands: add partition, split partition, merge partition, move partition.
    ALSO, the associated indexes NO LONGER have to be stored in the same tablespace as the table (i.e. the answer to your question).
    On Oracle 9i: local indexes are recommended on data warehouse platforms; in an OLTP system, global indexes are more common. On a data warehouse, problems can be isolated to one partition, the partitions can be moved or made read-only (like yours), and no local indexes need to be rebuilt.
    Regarding your issue:
    "We now have a change we need to make to one of the fact tables - we need to add a new column AND add an index on that column"
    To maintain the simplicity and functionality of your DW configuration, I think you need to change the tablespaces to read-write, update the objects, then alter them back to read-only.
    fyi
    http://www.oracle.com/technology/deploy/availability/htdocs/online_ops.html
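    One workaround that is often suggested for this situation (a sketch only - the table, index, partition and tablespace names below are illustrative, and behaviour against read-only tablespaces varies by release, so test it first) is to add the column, create the local index UNUSABLE so that no segments are built, and then rebuild only the partitions that sit in read-write tablespaces:
    ALTER TABLE sales_fact ADD (new_col NUMBER);
    CREATE INDEX sales_fact_newcol_ix ON sales_fact (new_col) LOCAL UNUSABLE;   -- no index segments built yet
    ALTER INDEX sales_fact_newcol_ix REBUILD PARTITION p20091001 TABLESPACE ts_200910_idx;  -- repeat per read-write partition
    The older, read-only partitions keep their index partitions marked UNUSABLE until those tablespaces are ever made read-write again.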

  • Problem during  Data Warehouse Loading (from staging table to Cube)

    Hi All,
    I have created a staging module in OWB to load my flat files into my staging tables. I have created a warehouse module to load my staging tables into the dimensions and cube that I have created.
    My scenario:
    I have a temp_table_transaction into which my flat files were loaded. This table was loaded with 168,271,269 records from the flat file.
    I have created a mapping in OWB which takes temp_table_transaction, joins it with other tables, applies some expressions and conversion functions, and fills a new table called stg_tbl_transaction in my staging module. Running this mapping takes 3 hours and 45 minutes with this mapping configuration:
    Default operating mode in the mapping's runtime configuration = Set based
    My dimensions are filled correctly, but I have two problems when I want to transfer my staging table to my cube:
    Problem #1:
    I have created a cube called transaction_cube with OWB, and it generated and deployed correctly.
    I have created a map to fill my cube from the staging table stg_tbl_transaction with its 168,271,268 records, and deployed it to the server (my cube map operating mode is set based).
    But after running this map it had not completed after 9 hours, and I was forced to cancel the run by killing its sessions. I want to know whether this load time is acceptable for this volume of data, or whether we should expect to spend more time. Please let me know if anybody has seen this issue.
    Problem #2:
    To test my map I created a map configured as set based in operating mode, selected stg_tbl_transaction (168,271,268 records) as the source, and created another table to load the data into. I wanted to test how long this simple map should take, but after 5 hours my data had not been loaded into the new table. I want to know where my problem is. Should I have set something in the map configuration, or something else? Please guide me on these problems.
    CONFIGURATION OF MY SERVER:
    I run OWB on two-socket Xeon 5500 series with 192 GB RAM and disks in a RAID 10 array.
    Regards,
    Sahar
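    For loads of this size, the first thing usually checked is whether the set-based insert runs direct-path and in parallel. A minimal hand-written sketch of that kind of statement (the target table name and the degree of parallelism are illustrative assumptions; only stg_tbl_transaction comes from the post) is:
    ALTER SESSION ENABLE PARALLEL DML;
    INSERT /*+ APPEND PARALLEL(t, 8) */ INTO tbl_transaction_fact t   -- hypothetical target table
    SELECT /*+ PARALLEL(s, 8) */ *
      FROM stg_tbl_transaction s;
    COMMIT;
    If the generated statement is not running as a direct-path parallel insert, load times of many hours for ~168 million rows are much more likely.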

    For all of you:
    It is possible to load from an InfoSet to a cube - we did it, and it was OK.
    Data really are loaded from the InfoSet (cube + master data) to the cube.
    When you create a transformation under a cube, the InfoSet is proposed, and it works fine.
    Now the process is no longer operational and I don't understand why.
    Loading from an InfoSet to a cube is possible; I can send you screenshots if you want.
    Christophe

  • Problem in import of MetaData file in Oracle Warehouse Builder 9.2

    My problem relates to Oracle Warehouse Builder. My system's configuration is the following:
    Machine: P4
    Operating System: Windows XP
    Oracle: Version 9.2
    Oracle Warehouse Builder 9.2
    RAM: 1 GB
    SGA Size: approximately 650 MB
    I have 2 metadata files which I have to import into OWB.
    The first file's size is 9 MB. It imported properly in 4 minutes without any error.
    The second file's size is 30 MB. When I start importing this metadata file, it goes up to 35% in the progress bar, but after that it does not show any progress.
    I have left the machine running like this for up to 24 hours, but nothing happened. It still shows only 35%.
    I have checked the log file, trace files and alert log, but there is no error in these files.
    If anybody has a solution, please reply ASAP. I have been stuck on this problem for the last 7-8 days.
    Waiting for a reply.
    Thanks & Regards
    Harvinder Singh

    Thanks for your reply, Igor.
    As posted, the OWB 10.1.0 software is running against a 9.2.0.8.0 database, so OWB still uses DBMS_JOB instead of DBMS_SCHEDULER in this database. In addition, I have already tested scheduling a simple job to run an OS command as suggested in note 365539.1, and job scheduling is working fine.
    There are no invalid objects in this database.
    Yasmin
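    For reference, verifying that DBMS_JOB scheduling works at all can be done with a trivial do-nothing job along these lines (a sketch only; the job body and interval are illustrative assumptions, not taken from note 365539.1):
    DECLARE
      l_job BINARY_INTEGER;
    BEGIN
      DBMS_JOB.SUBMIT(job       => l_job,
                      what      => 'NULL;',              -- do-nothing PL/SQL block
                      next_date => SYSDATE,
                      interval  => 'SYSDATE + 1/1440');  -- reschedule every minute
      COMMIT;
    END;
    /
    If NEXT_DATE in USER_JOBS never advances, check the JOB_QUEUE_PROCESSES parameter before suspecting OWB itself.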

  • Problem of uploading data in Data Warehouse

    I am using Oracle9i Warehouse Builder on Windows 2000 and I have just started working with it. I am facing a problem uploading data from the source schema to the target schema.
    1) Created a source module -- linked to the schema from which I get the data.
    2) Created a warehouse target module -- I created dimensions, facts and a mapping, and I deployed the dimensions, facts and mapping. As a result I got the dimension tables, dimensions, fact tables, facts, mapping package and a TCL script for the mapping. I deployed all of them and registered the TCL script with OEM.
    But the job created in OEM is not running. If it runs, it does not upload any data, and if I try to run the job manually from the backend, it gives me the message that no active job exists.
    Thanks in advance
    Surinder

    I was getting the same error. I don't know if a step failed somewhere in the install or if I missed a manual step, but I just fired up TOAD and manually inserted a record into the WB_RT_JOB table. I set RTJ_JOBNAME = DEFAULT_JOB and RTJ_STATUS = BEGIN. For the rest of the columns I just entered best guesses. I committed, and then my job would run. I guess it just needs a seed record.
    Lewis
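    In SQL terms, the seed record described above would look roughly like this (a sketch only - WB_RT_JOB has further columns whose values Lewis says he guessed, so check the table definition first; this is a workaround, not a documented fix):
    INSERT INTO wb_rt_job (rtj_jobname, rtj_status)   -- only the two columns named in the post
    VALUES ('DEFAULT_JOB', 'BEGIN');
    COMMIT;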

  • Problem in DB Link creation ( Oracle warehouse builder 3i  )

    I am facing a problem with DB link creation.
    Backend: Oracle 8i Server on my machine
    DW software: Oracle Warehouse Builder 3i (client, Repository Assistant, ...)
    Operating system: Windows NT 4, Service Pack 6
    I want to use the scott schema (the default schema provided by Oracle) as my input source.
    How can I create the DB link (for the scott schema)?
    How can I create the DB link (for any other database)?
    Do I need to add anything in the "ODBC Data Source Administration" settings?
    ==================
    Settings done:
    ==================
    DB link name: scott
    Host name: my machine's IP address
    Port number: 1521
    Oracle SID: prashant (my Oracle SID)
    User name: scott
    Password: tiger
    ==================
    Error given:
    ==================
    Testing...
    Failed.
    ORA-02085 Database link %s connects to %s
    *Cause:   a database link connected to a database with a different name.
    The connection is rejected.
    *Action:   create a database link with the same name as the database it
    connects to, or set global_names=false.
    Please change it to false by doing one of the following:
    First option:
    Log in to the database with DBA privileges and use the command:
    alter system set GLOBAL_NAMES = false
    Second option:
    Change GLOBAL_NAMES to false in the database parameter file, init.ora
    ==================
    Options tried:
    ==================
    1. I tried to change GLOBAL_NAMES = false but am still not able to create the DB link.
    2. As per the suggestion of a friend:
    "A file named "Logon.Properties" under the directory $OWB_HOME/wbapp
    in this file please set the property
    OWBSingleUserLockUsage = false"
    I tried the same but it is still not working.
    How should I proceed further?
    I am expecting URGENT FEEDBACK.
    Reply to me at: [email protected]
    From
    Prashant

    I solved the problem.
    The procedure I followed:
    UNINSTALLED THE ORACLE WAREHOUSE BUILDER SOFTWARE.
    Set 'GLOBAL_NAMES = FALSE' in the init.ora file.
    RESTARTED MY MACHINE.
    INSTALLED THE ORACLE WAREHOUSE BUILDER SOFTWARE.
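    For anyone hitting ORA-02085 in a similar setup, the two pieces involved are the GLOBAL_NAMES parameter and the link itself. A minimal sketch (the link name and TNS alias are illustrative; the scott/tiger credentials are from the post):
    ALTER SYSTEM SET global_names = FALSE;   -- or set it in init.ora and restart, as above
    CREATE DATABASE LINK scott_src           -- any name is allowed once global_names is FALSE
      CONNECT TO scott IDENTIFIED BY tiger
      USING 'PRASHANT';                      -- TNS alias pointing at the source database
    SELECT COUNT(*) FROM emp@scott_src;      -- quick test that the link works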

  • Update/Insert Problem with Oracle Warehouse Builder

    Hello,
    I have an update/insert problem with OWB.
    Situation: I have a source table called s_account and a target table called w_account_d. The target table already holds data that was filled through inserts from the source table. Now someone makes changes to the data in the target table. These changes should now be passed back to the source table with an update operation. But exactly here is the problem: I can't map the data back to the source, because that would create a loop.
    My idea was to set a trigger, but I can't find this component in OWB - or is it hidden somewhere?
    I have also seen properties such as CDC or conditional loading in the property inspector of the table, but I have no idea how they work.
    Are there other ways to model this case? Or can anyone explain how I could implement this, possibly with CDC?
    I look forward to your replies :)

    Hi,
    thanks for your answer. I followed your suggestion and set the constraints of both tables directly in the database. Nevertheless it didn't work at first. In the next step I found, by right-clicking a table, the item "Configure" - I went to "Unique Key" --> Creation Method and set the following options: Constraint State = ENABLE, Constraint Validation = VALIDATE. The error message that appeared during deployment has now disappeared. Then I started the job to test whether the insert/update process works correctly. Finally it seems to work - but not really.
    My test scenario:
    1. Load the data from the source table via the staging area to the data warehouse table: check - it works!
    2. Change one data record in the source table.
    3. Load the source table with the changed data record once again to the staging area: check - it works!
    4. Load the new staging area table with the changed data record to the data warehouse table: check - it works! BUT I cannot tell whether it was an insert or an update operation, because in the job execution window of the design client it reports "rows selected 98", while "rows inserted" is empty and "rows updated" is empty. So I think it does not work correctly; in my opinion, if it worked correctly it should show "rows updated" = 1.
    What can still be wrong or forgotten? Any ideas?
    *By the way, don't think that with only 98 rows it doesn't matter whether an update or an insert is performed; this is just an example table - the real tables have millions of records.*
    I look forward to your answers :)
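    For what it's worth, an OWB mapping whose target load type is INSERT/UPDATE is typically generated as a MERGE in set-based mode. A minimal hand-written equivalent for these two tables (only the table names come from the post; the columns and key are illustrative assumptions) looks like:
    MERGE INTO w_account_d t
    USING (SELECT account_id, account_name FROM s_account) s
       ON (t.account_id = s.account_id)             -- the unique/business key
    WHEN MATCHED THEN
      UPDATE SET t.account_name = s.account_name    -- counted under "rows updated"
    WHEN NOT MATCHED THEN
      INSERT (account_id, account_name)
      VALUES (s.account_id, s.account_name);        -- counted under "rows inserted"
    If the audit counts show neither inserts nor updates, it is worth checking which operating mode actually ran and whether the match key defined in the mapping really is the unique key.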

  • I bought my iPhone unlocked from Carphone Warehouse; I had a problem with the phone and Apple swapped it over, but they locked it to T-Mobile!!! What can I do? I have changed contracts and cannot use my phone!

    I bought my iPhone from Carphone Warehouse, SIM-free and with no contract. 5 months down the line I had a problem with the phone and I booked a Genius appointment; they swapped my phone and gave me a new one, but they locked my phone to T-Mobile!!! I no longer use T-Mobile and I have an iPhone which is locked to T-Mobile!!! I bought an unlocked phone and Apple have locked it!!! What can I do?

    In response to this: Carphone Warehouse (and Phones4U) sell what is commonly referred to as "SIM-free" handsets - these are not the same as unlocked. SIM-free handsets will automatically lock to the network of the first SIM card that the phone is activated with, and that is permanent. The only way to unlock the handset would be to go through the network (T-Mobile, I understand) and request that they unlock it for you. More than likely there will be a charge, as you are no longer a T-Mobile customer.
    If you ever want to purchase a new unlocked iPhone, the only place you can buy one is from Apple directly. Anywhere else will most likely sell SIM-free handsets.

  • Data Warehouse Cursor Problem

    I am trying to complete a piece of work for college but am having trouble completing a cursor. The object of this small project is to create a very basic data warehouse from an operational system.
    I have populated all of the dimension tables except one, which is to be populated along with the FACT table. These tables are to be populated with the cursor I am trying to complete.
    I am having difficulty understanding what the first select statement in the cursor does. For the region dimension table, I was asked to create a sequence to use as the primary key (region_id). The region_id in the operational table has different values, e.g. 6000, 6001.
    'dw_op' is the schema of the operational table, which is accessed through the DB link 'q_link'.
    Any thoughts on what is required to complete this cursor would be a big help.
    Here is the incomplete anonymous block and cursor:
    declare
        cursor c_sales is
            select order_line.product_id, order_line.quantity,
                   product.unit_cost, product.unit_price, ord.client_id,
                   ord.sales_rep_id, ord.order_date
              from dw_op.ord@q_link, dw_op.order_line@q_link, dw_op.product@q_link
             where ord.order_id = order_line.order_id
               and order_line.product_id = product.product_id...
        r_id    number;   -- region_id looked up from the region dimension
        s_time  number;   -- surrogate key taken from time_seq
        s_value number;
        s_cost  number;
    begin
        for c_rec in c_sales loop
            select region_id into r_id
              from region
             where region_name =
                   (select region_name
                      from dw_op.sales_region@q_link, dw_op.sales_rep@q_link
                     where sales_region.region_id = sales_rep.region_id
                       and sr_id = c_rec.sales_rep_id);
            select time_seq.nextval into s_time from dual;
            insert into time values (s_time, s_day, s_month, s_year ... );
            s_value := ...  -- how much it costs the company, unit_price * something?
            s_cost := ...   -- times something by the quantity
            -- columns from the cursor row must be referenced through c_rec, not the table aliases
            insert into sale values (c_rec.client_id, c_rec.product_id, c_rec.sales_rep_id,
                                     r_id, s_time, c_rec.quantity,
                                     s_value, s_cost);  -- need to find out how to enter select info into table
        end loop;
    end;

    You may have an IO problem, but you may also have a design/configuration issue. What you are seeing is multiple sessions waiting for the same block: if 20 sessions all request the same block, one will read it from disk (db file scattered read or db file sequential read) and 19 sessions will wait on "read by other session" and then get the block from the cache.
    There does seem to be a very high number of waits for "read by other session" in the database, so you may want to investigate exactly which SQL is waiting on this event and whether you could benefit from either a larger buffer cache or from using the KEEP and RECYCLE pools to manage frequently accessed tables better. Otherwise, investigate the SQL that is performing the most IO and tune it to do less work.
    Chris
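    A starting point for finding which statements those sessions are waiting behind is a query along these lines (a minimal sketch against the standard V$SESSION/V$SQL views; run it a few times while the waits are occurring):
    SELECT sql_id, COUNT(*) AS sessions_waiting
      FROM v$session
     WHERE event = 'read by other session'
       AND state = 'WAITING'
     GROUP BY sql_id
     ORDER BY sessions_waiting DESC;
    SELECT sql_text FROM v$sql WHERE sql_id = :sql_id;  -- then look up the statement text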

  • Problem with Starting with Oracle Warehouse Builder 11g Release 1

    Hello,
    I want to work through the training modules for OWB. I installed Oracle Database 11gR2 and was attempting to load all the demo files from the learning module "Starting with Oracle Warehouse Builder 11g Release 1" (http://st-curriculum.oracle.com/obe/db/11g/r1/owb/owb11g_update_getting_started_intro/lesson1/less1_start.htm).
    I get an error while running the tcl scripts in OMB Plus. Everything works until the last files, in which an object cannot be found. I get the following error:
    OMB02923: Attribute TIMES_CAL_MONTH_CODE of group SALES of operator SALES_OUT does not exist.
    The only reference I can find to this is in the load_sales.tcl file from the zip file, and I have no idea what is going wrong. As a result, 5 mappings are missing from the end result (all the LOAD_.. mappings).
    Can anyone assist me with this problem and provide a solution?
    Thank you in advance

    Additional information after looking at the tcl files and the data that was loaded into OWB.
    The section where the error occurs connects two parts of a mapping, OUTGRP1 and SALES_OUT. SALES_OUT is based upon the cube SALES. This cube contains all the attributes that are being connected except for TIMES_CAL_MONTH_CODE. Based on the code that creates the cube, I have no idea where that attribute is supposed to come from; none of the connected attributes is named explicitly there. The cube also contains several groups, one of which is TIMES. If you open the details of that group, the list of input attributes includes the needed attribute, but it is not listed separately like all the others that are connected in the mapping. Does anybody have an idea how to solve the problem based on this extra information and the code listed below?
    Code connecting outgrp1 and sales_out:
    OMBALTER MAPPING 'LOAD_SALES' \
    ADD CONNECTION FROM ATTRIBUTE 'AMOUNT' OF GROUP 'OUTGRP1' OF OPERATOR 'AGGREGATOR' \
    TO ATTRIBUTE 'AMOUNT' OF GROUP 'SALES' OF OPERATOR 'SALES_OUT'\
    ADD CONNECTION FROM ATTRIBUTE 'COST' OF GROUP 'OUTGRP1' OF OPERATOR 'AGGREGATOR' \
    TO ATTRIBUTE 'COST' OF GROUP 'SALES' OF OPERATOR 'SALES_OUT'\
    ADD CONNECTION FROM ATTRIBUTE 'QUANTITY' OF GROUP 'OUTGRP1' OF OPERATOR 'AGGREGATOR' \
    TO ATTRIBUTE 'QUANTITY' OF GROUP 'SALES' OF OPERATOR 'SALES_OUT' \
    ADD CONNECTION FROM ATTRIBUTE 'CHANNEL_SOURCE_ID' OF GROUP 'OUTGRP1' OF OPERATOR 'AGGREGATOR' \
    TO ATTRIBUTE 'CHANNELS_SOURCE_ID' OF GROUP 'SALES' OF OPERATOR 'SALES_OUT'\
    ADD CONNECTION FROM ATTRIBUTE 'SUBCATEGORY_ID' OF GROUP 'OUTGRP1' OF OPERATOR 'AGGREGATOR' \
    TO ATTRIBUTE 'PROMOTIONS_SOURCE_ID' OF GROUP 'SALES' OF OPERATOR 'SALES_OUT'\
    ADD CONNECTION FROM ATTRIBUTE 'PRODUCT_ID' OF GROUP 'OUTGRP1' OF OPERATOR 'AGGREGATOR' \
    TO ATTRIBUTE 'PRODUCTS_SOURCE_ID' OF GROUP 'SALES' OF OPERATOR 'SALES_OUT'\
    ADD CONNECTION FROM ATTRIBUTE 'CITY_ID' OF GROUP 'OUTGRP1' OF OPERATOR 'AGGREGATOR' \
    TO ATTRIBUTE 'CUSTOMERS_SOURCE_ID' OF GROUP 'SALES' OF OPERATOR 'SALES_OUT' \
    ADD CONNECTION FROM ATTRIBUTE 'FINISH_MONTH' OF GROUP 'OUTGRP1' OF OPERATOR 'AGGREGATOR' \
    TO ATTRIBUTE 'TIMES_CAL_MONTH_CODE' OF GROUP 'SALES' OF OPERATOR 'SALES_OUT'
    Code for creating the cube:
    OMBCREATE CUBE 'SALES'\
    SET PROPERTIES (BUSINESS_NAME, DESCRIPTION, DEPLOYMENT_OPTIONS)\
    VALUES ('Sales Cube', 'Sales Cube', 'Deploy All')
    OMBALTER CUBE 'SALES' ADD MEASURE 'AMOUNT'\
    SET PROPERTIES (DATATYPE,PRECISION,SCALE,BUSINESS_NAME,DESCRIPTION)\
    VALUES ('NUMBER',10,2,'Sales Amount','Sales Amount')
    OMBALTER CUBE 'SALES' ADD MEASURE 'QUANTITY'\
    SET PROPERTIES (DATATYPE,BUSINESS_NAME,DESCRIPTION)\
    VALUES ('NUMBER','Sales Quantity','Sales Quantity')
    OMBALTER CUBE 'SALES' ADD MEASURE 'COST'\
    SET PROPERTIES (DATATYPE,PRECISION,SCALE,BUSINESS_NAME,DESCRIPTION)\
    VALUES ('NUMBER',10,2,'Sales Cost','Sales Cost')
    OMBALTER CUBE 'SALES' ADD DIMENSION_USE 'TIMES'\
    SET REF LEVEL 'MONTH' OF DIMENSION 'TIMES' AT POSITION "1"
    OMBALTER CUBE 'SALES' ADD DIMENSION_USE 'PRODUCTS'\
    SET REF LEVEL 'PRODUCT' OF DIMENSION 'PRODUCTS' AT POSITION "2"
    OMBALTER CUBE 'SALES' ADD DIMENSION_USE 'CHANNELS'\
    SET REF LEVEL 'CHANNEL' OF DIMENSION 'CHANNELS' AT POSITION "3"
    OMBALTER CUBE 'SALES' ADD DIMENSION_USE 'CUSTOMERS'\
    SET REF LEVEL 'CITY' OF DIMENSION 'CUSTOMERS' AT POSITION "4"
    OMBALTER CUBE 'SALES' ADD DIMENSION_USE 'PROMOTIONS'\
    SET REF LEVEL 'SUBCATEGORY' OF DIMENSION 'PROMOTIONS' AT POSITION "5"
    OMBALTER CUBE 'SALES' ADD COMPOSITE_DIMENSION 'SALES_COMP'\
    SET REF DIMENSIONS ('PRODUCTS','CHANNELS','CUSTOMERS','PROMOTIONS');
    OMBALTER CUBE 'SALES' IMPLEMENTED BY SYSTEM
    puts "SALES defined"

  • Oracle Warehouse Builder Problem

    Hello All,
    I have just installed Oracle Warehouse Builder 10g on a Windows 2000 system.
    My database is Oracle 10gR2.
    While connecting with the Repository Assistant, when I provide the sys user/password and other credentials I get the following error:
    INS0003: OWB Repository Installation cannot continue without DBA privileges. Connect as DBA and use option Create a New Warehouse Builder Repository to continue the process.
    The sys user already has sysdba rights. I have tried to log in with another DBA account, but I still get the same error.
    Any help??
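    A quick sanity check before digging further is to confirm which accounts actually hold the DBA role, since the INS0003 message complains about DBA privileges rather than SYSDBA (a minimal sketch against the standard dictionary view, run as a privileged user):
    SELECT grantee, granted_role
      FROM dba_role_privs
     WHERE granted_role = 'DBA';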

    Hello,
    I have found the solution to this problem.
    Metalink note: 332371.1
    There was a compatibility problem between OWB 10.1 and Oracle database release 10.2.
    I installed Oracle 10gR1 and now it is working.

  • Problem of querying a data warehouse

    hi
    I need to create a data warehouse. I used Oracle Warehouse Builder version 10.2, but I ran into problems querying my warehouse, so I used Excel to work around them.
    Can someone help me and give me an alternative? I do not know if there is another version that resolves this problem.
    Thanks

    Can you explain the problem in more detail so we can understand it better?
    What exactly does "interrogation" of the warehouse mean here in OWB terms?
