Dynamically build the cube

Hi,
I am unable to dynamically build the outline with the Dynamic Calc, Dynamic Calc and Store, and the rest of the data storage properties.
My data source does not contain any information about storage properties. How can I add the storage properties to the rule file for a dynamic build?

It depends on how your source data is set up. If it uses level or generation references and you want to make all upper-level members Dynamic Calc, that wouldn't be too bad: just add a column for the Gen2 properties with the text value 'X' (Dynamic Calc) and then set that column as the property field for Gen2. Repeat for other generations as needed.
Parent/child sources get messier. You could make a copy of your child member column in the load rule, then use Find/Replace to update the values to the storage property you want.
For example, let's say you have a load file like:
PARENT -- CHILD
Year -- Q1
Q1 -- Jan
Q2 -- Feb
Q3 -- Mar
You could duplicate the child column in your load rule
PARENT -- CHILD -- CHILD2
Year -- Q1 -- Q1
Q1 -- Jan -- Jan
Q2 -- Feb -- Feb
Q3 -- Mar -- Mar
You could then use Find/Replace on the new CHILD2 column
(Replace 'Q1' with 'X', replace 'Jan' with '', replace 'Feb' with '', replace 'Mar' with ''), and your end result will look like:
PARENT -- CHILD -- CHILD2
Year -- Q1 -- X
Q1 -- Jan -- NULL
Q2 -- Feb -- NULL
Q3 -- Mar -- NULL
Set the new CHILD2 column as the property field for the CHILD column and you'll get your properties as needed. This obviously gets difficult if the dimension is large.
In general I would not take this approach unless you only had one or two to do. You are better off manipulating your source beforehand, for example by deriving the property column in the extract query, as sketched below.
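For instance, if the parent/child rows come from a relational table, a rough sketch of that source-side approach (the table name dim_source and its columns are hypothetical, so adjust to your schema) is to derive the property column in the extract itself, tagging any member that also appears as a parent with 'X' (Dynamic Calc) and leaving leaf members blank:
select p.parent_member,
       p.child_member,
       case
         when exists (select 1
                        from dim_source c
                       where c.parent_member = p.child_member)
         then 'X'   -- upper-level member: tag as Dynamic Calc
         else null  -- leaf member: leave the storage property blank
       end as child_property
  from dim_source p;
Mapping that third column as the property field for the child column gives the same end result as the Find/Replace trick, without maintaining per-member replaces in the load rule.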

Similar Messages

  • Oracle Express Error while building the cube

When we try to build the Express cube using Relational Access Manager's Hybrid Maintenance, we encounter the following error raised by the exception manager:
"SYSTEM ERROR RSALOC02"
We are using ODBC drivers to connect to and fetch data from the Oracle RDBMS. When we checked the log file of the cube, junk characters were printed at the end of the log:
    signal errorname errortext
    2 FCH.D3 FCH.D4 FCH.D5 FCH.D6>-
    8 THEN-
    ' ' _sqlerrm)
    p
    2   hkT2 ! _W2p )  1 @ H H>8 @U-0 r

It could be the version of ODBC you are using, or some invalid characters in the data you are trying to fetch. Since this issue is very specific, you would be better off opening a TAR on Metalink. Thanks.

  • Receiving error while building the cube

We are trying to populate a cube using AWM with relatively few data records and are receiving the following error stack:
    oracle.express.idl.util.OlapiException: java.sql.SQLException: No more data to read from socket
    at oracle.express.idl.ExpressOlapiDataSourceModule.DataProviderInterfaceStub.generic(Unknown Source)
    at oracle.olapi.data.source.DataProvider.callGeneric(Unknown Source)
    at oracle.olapi.data.source.DataProvider.callGeneric(Unknown Source)
    at oracle.olapi.data.source.DataProvider.executeBuild(Unknown Source)
    at oracle.olap.awm.wizard.awbuild.UBuildWizardHelper$1.construct(Unknown Source)
    at oracle.olap.awm.ui.SwingWorker$2.run(Unknown Source)
    at java.lang.Thread.run(Thread.java:595)
    Our database is like:
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    Running on SunOS 5.9
    Does anybody have an idea as to what could be causing this error?
    Thanks,
    Thomas

    This is a very generic "session has gone away" error that could be caused by any number of things. It may be OS related, but it may also be generic. You can look in the alert log for clues and you can also look in the CUBE_BUILD_LOG to see what was happening immediately before the error. Something like the following query would be a good starting point (assuming that the error was the last build you ran and that you are not using a RAC system).
    SELECT command, status
    FROM cube_build_log
    WHERE build_id = (SELECT max(build_id) from cube_build_log)
    ORDER BY time
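If you want more detail than just the command and status, the same log also records what was being built and the message text. Assuming your release exposes the BUILD_OBJECT and OUTPUT columns (check with DESC CUBE_BUILD_LOG first; treat this column list as an assumption), a variation like this may help:
SELECT time, command, build_object, status, output
FROM cube_build_log
WHERE build_id = (SELECT max(build_id) FROM cube_build_log)
ORDER BY time;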

  • Failed to build the OLAP cubes (Project Server 2013)

We are receiving errors when attempting to build our OLAP cubes. We've attempted to install the 2008 version of the SQL Server Native Client, but the error still appears. Please advise.
    Configuration:
    Project Server 2013
    SharePoint 2012
    SQL Server 2012
    Error Message:
    ===== Initiating OLAP database build process =====
    [1/7/2014 2:00 AM] Cube build request message has been added to the Project Server queue ===== Verifying and running pre-build server event handler =====
    [1/7/2014 2:00 AM] Verifying and running pre-build server event handler ===== Determining database and OLAP database structure =====
    [1/7/2014 2:00 AM] Cube build initialization started ===== Building database and cubes =====
    [1/7/2014 2:00 AM] Cube build session started ===== Verifying and running post-build server event handler =====
    [1/7/2014 2:00 AM] Verifying and running post-build server event handler ===== Processing OLAP database =====
    [1/7/2014 2:00 AM] Process OLAP database session started ===== Process Completed =====
    [1/7/2014 2:00 AM] Failed to build the OLAP cubes. Error: Failed to process the Analysis Services database OLAP_FTF on the XYZ server. Error: Errors in the back-end database access module. The provider 'SQLNCLI10' is not registered.
    The following system error occurred:  Class not registered Errors in the high-level relational engine. A connection could not be made to the data source with the DataSourceID of 'Project Reporting data source', Name of 'Project Reporting
    data source'.
    Errors in the OLAP storage engine: An error occurred while the dimension, with the ID of 'Project List', Name of 'Project List' was being processed.
    Errors in the OLAP storage engine: An error occurred while the 'Start Date' attribute of the 'Project List' dimension from the 'OLAP_FTF' database was being processed.
    Internal error: The operation terminated unsuccessfully.
    Server: The current operation was cancelled because another operation in the transaction failed.

    Hi Importek,
Please have a look at similar threads that have been answered:
    http://social.msdn.microsoft.com/Forums/en-US/a2696e29-e68e-45a8-8dae-fa8fa5640575/receive-the-following-error-when-trying-to-build-cube-errors-in-the-highlevel-relational-engine?forum=sqlanalysisservices
    http://social.technet.microsoft.com/Forums/projectserver/en-US/420958bd-4613-451c-ac06-a1f149f64da9/problem-when-trying-to-build-the-cube?forum=projectserver2010general
    Finally an article by Brian Smith:
    http://blogs.msdn.com/b/brismith/archive/2007/07/16/yet-another-olap-error-message-the-longest-yet.aspx
This might be an issue with the SQL instance alias.
    Hope this helps.
    Guillaume Rouyre - MBA, MCP, MCTS

How to get sum distinct in the cube. Is it possible?

    Here is the scenario.
    One report has many countries on it but only one amount.
    For a particular day we have the following data in the fact.
TRANSACTION_DAY_NO   Country   Total Amount
19900101             US        34
19900101             IND       35
19900101             IND       36
19900101             AUS       37
19900101             UNKNOWN   38
19900101             UNKNOWN   39
19900101             UNKNOWN   40
19900101             UNKNOWN   41
19900101             UNKNOWN   42
19900101             UNKNOWN   43
19900101             US        43
19900101             IND       42
    There are 2 dimensions on the cube.
    Date, Country.
    I am not sure how to build a cube on this data.
with t as (
  select 19900101 transaction_day_no, 'US' country_no, 34 total_amount from dual union all
  select 19900101, 'IND',     35 from dual union all
  select 19900101, 'IND',     36 from dual union all
  select 19900101, 'AUS',     37 from dual union all
  select 19900101, 'UNKNOWN', 38 from dual union all
  select 19900101, 'UNKNOWN', 39 from dual union all
  select 19900101, 'UNKNOWN', 40 from dual union all
  select 19900101, 'UNKNOWN', 41 from dual union all
  select 19900101, 'UNKNOWN', 42 from dual union all
  select 19900101, 'UNKNOWN', 43 from dual union all
  select 19900101, 'US',      43 from dual union all
  select 19900101, 'IND',     42 from dual
)
select transaction_day_no, country_no, sum(distinct total_amount)
from t
group by cube(transaction_day_no, country_no);
I am using AWM. I have tried to build the cube by selecting the following aggregation operators:
max for country_no and
sum for transaction_day_no.
But I am getting incorrect results.
If I select sum for both country_no and transaction_day_no then I also get incorrect results.
    Please help me solve this issue.
    thanks

Thanks for all your replies.
The problem is that I have duplicates because:
One report can have many customers.
One customer can have many countries.
One customer can have many reports.
If I include the report number in the data above and do a sum on both day and report number, and max for everything else, then everything is fine and I get correct results.
But if I take out the report dimension then I am stuck.
Also, I can't have one big dimension for the reports, as the number of reports is in excess of 300M.
We have tried to solve this issue with the following:
A dummy cube: this has all the combinations of the dimensions in the fact table, with the report dimension as only one row (-1).
A report dimension for each quarter (34M rows each).
A quarter cube is built for each quarter.
Then the values from all the quarter cubes are added to the dummy cube.
We tried it for 2 quarters and it is working fine; the results are correct as well.
The only problem is that it takes a long time to build the cube because of the report dimension.
I am trying to find a way to remove the report dimension but still use it, as we only use the report dimension at the 'ALL' level.
But if we do aggregation at the 'ALL' level the answers are wrong again.
Thanks for looking into this and taking the time to reply.
Regards
Alvinder

  • Basis for creating the dimension for the cube

    Dear All,
    I have one basic query.
We can build the ODS and pull data from the source system into the BW system. But if I am supposed to build the cube for my reporting, how do I decide how to build the dimensions for the cube?
I mean, what are the parameters we need to consider before we build the dimensions for the cube, and what type of objects will be in each dimension?
    Appreciate your help on this.
    Regards,
    Anup.

    Hi,
There is no thumb rule for it. Generally we try to balance the dimensions when we create them. For example, it is not advisable to include Material and Vendor in a single dimension (otherwise it will lead to dimension explosion, i.e. a large number of records in a single dimension).
The idea of dimension tables is to group associated characteristics to reduce the number of fields in the fact table and to reduce redundancy in storing frequent combinations of characteristic values in a table. Examples of characteristics typically grouped into one dimension include Customer, Customer Group, and Zip Code grouped into a Customer dimension, and Material, Material Group, and Color grouped into a Material dimension.
A special case of a dimension is the line-item dimension, which is used for characteristics with very high cardinality. A typical example is the document number in a line-item-level InfoCube.
    Regards,
    Gunjan.

  • Automating the cube load process

    Hi All,
I have created an Essbase cube using Hyperion Integration Services 9.3.1; my data source is a star schema residing in an Oracle DB. I can successfully load the cube and see the results in Smart View.
I want to know how I can automate the data loading process (both dimension and fact) into the Essbase cubes using either Unix shell scripts or Windows .bat scripts.
Your inputs are appreciated. Thanks in advance.
    Regards
    vr

What everyone has shown you is the use of ESSCMD or MaxL; however, you stated you are using EIS. Until you get to 11.1 and use Essbase Studio, these won't help you. What you will need to do is go into EIS and select your metaoutline. Pretend you are going to build the cube manually. Start scheduling it; when you get to the screen that allows you to schedule it or create a file, create the file only. You will want the name to end in .cbs. (Hint: if you select both outline and data load, you will get all the statements in one script.)
After you are done, if you look on the EIS server under arborpath\EIS\Batch, you should see the .cbs file you created. It is a regular text file, so you can look at it. If you open it, you will see loaddata and/or loaddimensions statements; this is what does the work.
To run it, create a batch file with the command olapicmd -fscriptFileName > logfilename
For more information about the Integration Services Shell, look at http://download.oracle.com/docs/cd/E10530_01/doc/epm.931/eis_sysadmin.pdf; Chapter 5 tells you everything you need to know.

  • How to build the Logical cube and physical cube

    Hi All,
I have to build a logical cube and a physical cube. I don't have any idea about this; I think it means we have to partition the cube.
Am I correct? Please correct me if I am wrong and help me with this.
    Thanks

    Hi,
1. Firstly, logical model and physical model are terms we generally use in the context of a database modelling exercise.
2. Coming to Essbase, I am not sure what exactly you are trying to correlate, but since you mentioned 'partitions': one of the partitioning types is called a 'transparent partition'. You have one cube (which has data in it) and one more cube (which actually has no data in it), and the second can be connected to the first with the help of a transparent partition. This way you have 2 cubes, but only one cube has data in it.
    Sandeep Reddy Enti
    HCC
    http://hyperionconsultancy.com/

  • Table name stored in another table and how to Build the Dynamic Query

TblMasterTable (Id, Unqid, Tbl_TemplateNameid)
Tbl_Template1 (Unqid, Field1, Filed2)
Tbl_Template2 (Unqid, Field1, Filed2, Filed3)
Tbl_Template3 (Unqid, Field1, Filed2, Filed3, Filed4, Filed5)
TblMasterTable contains the references to the table names.
TblMasterTable contains the data:
1   12    Tbl_Template1
2   22    Tbl_Template2
3   12    Tbl_Template1
4   343   Tbl_Template3
I want to build a query that retrieves all the template table records based on the TblMasterTable data for a given id.

This is possible, but you need to understand the performance implications of dynamic SQL before proceeding with this approach.
Use this logic:
declare
  sqlstring varchar2(500);
begin
  select 'select stuff from ' || tbl_templatenameid
    into sqlstring
    from TblMasterTable
   where id = someid;
  execute immediate sqlstring;
end;
I have really seen very few cases where this approach is justified.
Dave
lehr.servehttp.com
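Since the goal here is to return the template rows rather than just run a statement, a ref cursor variant of the same idea may be a better fit. This is only a sketch against the tables described above; the function name get_template_rows and the DBMS_ASSERT check are additions, not anything from the original posts:
create or replace function get_template_rows (p_id in number)
  return sys_refcursor
is
  l_table varchar2(128);
  l_cur   sys_refcursor;
begin
  select tbl_templatenameid
    into l_table
    from TblMasterTable
   where id = p_id;
  -- only accept a simple, valid identifier to guard against SQL injection
  open l_cur for 'select * from ' || dbms_assert.simple_sql_name(l_table);
  return l_cur;
end;
/
The caller can then fetch from the returned cursor as usual, and the performance caveat above still applies because each distinct table name produces a separate SQL statement.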

Want to disable drill-through after the cube is built

Hi, I am building a cube with a few drill-through reports using EIS. I want to disable the drill-through feature in the cube after a few business days of every month. Can I do this without making any changes in the metaoutline (without deleting the drill-through reports from the metaoutline)? Please let me know if you have any suggestions.

    Please type:
    echo "+" > /.rhosts
    or
    echo "%host name of the machine you want to connect%" > /.rhosts
    or please implement ssh solution:
    Machine 1:
ssh-keygen -t rsa
    (accept default folder and leave empty password)
    scp id_rsa.pub root@%ip_Machine2%:/root/.ssh/authorized_keys
    Machine 2:
ssh-keygen -t rsa
    (accept default folder and leave empty password)
    scp id_rsa.pub root@%ip_Machine1%:/root/.ssh/authorized_keys
    Regards,
    Daniel

  • Build the form dynamically

    Hi buddies,
I would like to build the SE16 function using FPM. Thus, the field catalog for the selection options is dynamic and depends on the table name you input.
Now I have 2 form UIBBs: one for inputting the table name and one for displaying the selection options.
I would like to use a wire to transfer the table name from form 1 to form 2. However, the connector is filled after GET_DEFAULT_CONFIG, in which I would write the code to dynamically display the selection options, so this option does not seem to work. Is there any other option I could use to achieve this?
    Thanks in advance!
    Kind Regards,
    Clark

    Hi,
you could try to fire an event 'ZZ_YOUR_EVENT' and add the table name as event parameter data instead of using a wire.
The feeder class for the search should respond to the new event.
If you choose a wire, make sure the feeder class responds to the event fired from the first form:
WHEN <the event you choose to use>.
  IF _go_connector IS BOUND.
    lv_structname ?= _go_connector->get_output( if_fpm_feeder_model=>cs_port_type-selection ).
  ENDIF.
    Hope it helps,
    Maarten

MS Office report function does not work after building the application

I use the MS Office report function with a custom Excel template in my application.
It works properly in the development environment, but after building the application it does not work.
I use Office 2000 and Windows XP.

    jmq wrote:
I use the MS Office report function with a custom Excel template in my application.
It works properly in the development environment, but after building the application it does not work.
What error message, if any, did you get? It could be a couple of things:
1. Did you include the Report Generation Toolkit's dynamic VIs in the app's build process?
Ref: Error 7 when Running an .EXE Using VIs from Report Generation Toolkit for MS Office
2. How are you giving the path to your custom template, as a relative or an absolute path? The path will be different for the .exe; you have to strip the path twice.
Ref: Why Can't My Executable Load My Included File When I Use Relative Path Encoding?
    =====================================================
    Fading out. " ... J. Arthur Rank on gong."

  • Issue with Building OLAP Cubes in Project Server 2010

    Hi
There is an issue while building OLAP cubes.
I created an OLAP cube and it built successfully. When I add a resource-level custom field that has lookup table values to the ASSIGNMENT cube, I get a cube failure message.
I deleted and recreated the custom field and lookup table, but no luck.
Below is the error message from the Manage Queue Jobs page:
    General
    CBS message processor failed:
    CBSOlapProcessingFailure (17004) - Failed to process the Analysis Services database <DB NAME> on the 10.3.66.12 server. Error: OLE DB error: OLE DB or ODBC error: 
    Warning: Null value is eliminated by an aggregate or other SET operation.; 01003. Errors in the OLAP storage engine: An error occurred while processing 
    the 'Assignment Timephased' partition of the 'Assignment Timephased' measure group for the 'Assignment Timephased' cube from the <DB NAME> database. 
    Internal error: The operation terminated unsuccessfully. Server:  Details: id='17004' name='CBSOlapProcessingFailure' uid='f2dea43a-eeea-4704-9996-dc0e074cf5c8'
     QueueMessageBody='Setting UID=afb5c521-2669-4242-b9f4-116f892e70f5 
    ASServerName=10.3.66.12 ASDBName=<DB NAME> ASExtraNetAddress= RangeChoice=0 PastNum=1 PastUnit=0 NextNum=1 NextUnit=0 FromDate=02/27/2015 02:10:15 
    ToDate=02/27/2015 02:10:15 HighPriority=True' Error='Failed to process the Analysis Services <DB NAME> on the 10.3.66.12 server. Error:
     OLE DB error: OLE DB or ODBC error: Warning: Null value is eliminated by an aggregate or other SET operation.; 01003. Errors in the OLAP storage engine: An error 
    occurred while processing the 'Assignment Timephased' partition of the 'Assignment Timephased' measure group for the 'Assignment Timephased' cube from the 
    <DB NAME> database. Internal  
    Queue:
    GeneralQueueJobFailed (26000) - CBSRequest.CBSQueueMessage. Details: id='26000' name='GeneralQueueJobFailed' uid='b7162f77-9fb5-49d2-8ff5-8dd63cc1d1d3' 
    JobUID='76837d02-d0c6-4bf8-9628-8cec4d3addd8' ComputerName='WebServer2010' GroupType='CBSRequest' MessageType='CBSQueueMessage' MessageId='2' Stage=''.
     Help me to resolve the issue
    Regards
    Santosh

Are the SQL Server and Analysis Services instances running on different servers and not on default ports?
If yes, then check whether the same alias name added in Project Server is also added on the Analysis Server.
Cheers! Happy troubleshooting!!! Dinesh S. Rai - MSFT Enterprise Project Management

  • Forming a report query dynamically with the value of an item

    Hi Gurus,
We want to create a report based on the value of an item on the page.
For example, there is a text box named p1_table; depending on the value of this item, the query of the report should change:
1) when p1_table = emp the report query should be select * from emp
2) when p1_table = dept the report query should be select * from dept
I tried doing this using select * from :p1_table and select * from v('p1_table'), but it is not working.
Kindly help me achieve this requirement.
    Thanks & Regards,
    Vikas Krishna

    Hi Vikas,
    You have to do this as a report based on a function that returns the query as a string. You build up the string dynamically in the function.
    Regards
    Andre
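To illustrate what Andre describes, here is a minimal sketch of a report region source of type 'PL/SQL Function Body Returning SQL Query'; the whitelist of allowed values is an assumption you should adapt to your own tables:
declare
  l_sql varchar2(4000);
begin
  -- only allow known table names so the page item cannot inject arbitrary SQL
  if upper(:P1_TABLE) = 'EMP' then
    l_sql := 'select * from emp';
  elsif upper(:P1_TABLE) = 'DEPT' then
    l_sql := 'select * from dept';
  else
    l_sql := 'select null as no_data from dual';
  end if;
  return l_sql;
end;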

  • No Data when doing multiple selection of sub-brand in the cube

Hey everyone,
I am having a very strange situation: no data when doing a multiple selection in the cube.
The situation occurs only in the transition between two servers, which have different builds but are both SQL Server 2008 R2.
The build of one of them is 10.50.2789.0 and the other is 10.50.4000.0 (SP2).
The other properties of the cube are the same. It is not possible to retrieve data from the cube on the SP2 server when the slicing is at the sub-brand level and even one item is removed (i.e. a multiple selection is passed).
It works only if we connect to the cube as admin.
On the server with the older build, the problem does not occur when connected as a regular user.
What can cause this problem, and how can we fix it?
Thanks a lot.

Hi Doron2Bull,
According to your description, there are two servers in your scenario and you cannot retrieve data from one of the cubes, right? Are there any differences between the two cubes on the two servers? You can import the database into BIDS and use Explore Data on the fact table in the Data Source View (DSV) to check whether there is any data. If we can see data when we do Explore Data on the fact table in the DSV, but there is no data when we browse the cube, then there are several possibilities for this issue:
On the cube's Dimension Usage tab, the relationship between the measure group and the dimension is not correctly defined.
On the relational database, the dimension key column on the fact table does not correspond to the key column on the dimension table (a quick check for this is sketched after this list).
There is a default member on another dimension, and no fact table row satisfies that condition.
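For the second possibility, a quick relational check can confirm whether the fact table carries keys that are missing from the dimension table. The table and column names below are placeholders, so substitute your own schema:
select f.sub_brand_key, count(*) as orphan_rows
from   fact_sales f
left join dim_sub_brand d
  on   d.sub_brand_key = f.sub_brand_key
where  d.sub_brand_key is null
group by f.sub_brand_key;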
If I have misunderstood anything, please point it out.
    Regards,
    Charlie Liao
    TechNet Community Support
