Small data mart tools question

Sorry, I'm rather new to OLAP!
We have a small operational system that creates approximately 120K records a year in a single table, with a couple of two-level lookup tables. The time component is stored with the measure, which is already aggregated to the desired "day" granularity.
CategoryLevel1 --> CategoryLevel2 --> Measure with date <-- LocationLevel1 <-- LocationLevel2
We want to target a "lightweight" BI design using Pentaho, Mondrian and Saiku against an Oracle database. If we need another schema, it's fine to have it in the same database.
We are considering simply using materialized views as the fact and dimension tables in place of a separate ETL process, as described here:
http://ww1.ucmss.com/books/LFS/CSREA2006/IKE4645.pdf
Is this a common approach? Are there any drawbacks that are significant for our effort?
Appreciate any insight you can provide.
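For concreteness, here is a minimal sketch of what we have in mind. Every table and column name below is a hypothetical stand-in for our real schema:

-- Hypothetical names throughout; our real schema differs.
-- Dimension built by flattening the two-level category lookup:
CREATE MATERIALIZED VIEW dim_category
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND
AS
SELECT l2.id   AS category_key,
       l1.name AS category_level1,
       l2.name AS category_level2
  FROM category_level2 l2
  JOIN category_level1 l1 ON l1.id = l2.parent_id;

-- Fact at day grain, read straight from the operational table:
CREATE MATERIALIZED VIEW fact_measure
  BUILD IMMEDIATE
  REFRESH COMPLETE ON DEMAND
AS
SELECT m.measure_date AS day_key,
       m.category_id  AS category_key,
       m.location_id  AS location_key,
       m.value        AS measure_value
  FROM operational_measure m;

The Mondrian schema would then reference FACT_MEASURE and DIM_CATEGORY as ordinary tables, and a scheduled DBMS_MVIEW.REFRESH would stand in for an ETL job.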

I am not sure if this will help you, but there is a nice white paper on how the Oracle database OLAP option can be used at http://www.oracle.com/technetwork/database/options/olap/oracle-olap-11gr2-twp-132055.pdf.
Other OLAP collateral can be found at Oracle OLAP.
--Ken Chin

Similar Messages

  • Data Utilization Tool Question

    Does any one know what is the 'Technology' sub-category in the Web & Apps category of the new Data Utilization Tool?
    On my account, one line's usage is comprised of 30% Technology, and Verizon has thus far been unable to be specific about what this means. Because we have seen a tremendous spike in data usage, I'd like to know what function/program to turn off to avoid the excessive data usage, but no one can tell me what that would be. Anyone??

    baylakates, that is a wonderful question. I want to make sure that we get you all the details you are looking for. The Technology category includes applications and websites from technology companies building tethering technologies, and websites offering content for software programmers. So this could be any third-party application downloaded from the app stores.
    KevinR_VZW

  • Help in data marting

    Hi gurus,
    Can you please explain the procedure for data marting and how useful it is?
    Thanks

    Hi,
    Check these links:
    Data Marts
    data mart and open hub service
    Re: Uses of Data Mart
    Few questions
    Search the forums for a few more links.
    With rgds,
    Anil Kumar Sharma .P

  • Best way to capture data every 5 ms (milliseconds) in the .vi diagram when using the "Time Between Points" and "Small Loop Delay" tools?

    - Using LabVIEW version 6.1, is there any way to change the "Time Between Points" indicator from (HH.MM.SS) to only (mm.ss), or perhaps to only (.ss)?
    - I need to set the data sampling rate to capture every 5 milliseconds, but the default is always 20 or greater, even when the "Small Loop Delay" variable is adjusted down.
    Thank you in advance.

    I have no idea what the "Time Between Points" and "Small Loop Delay" tools are. If this is some code you downloaded, you should provide a link to it. And if you want to acquire analog data every 5 milliseconds from a DAQ board, that is possible with just about every DAQ board and is not related to the version of LabVIEW. You simply have to set the sample rate of the DAQ board to 200 samples/sec. If it's digital data, then there will be a problem getting consistent 5 ms data.

  • Table Owners and Users Best Practice for Data Marts

    2 Questions:
    (1) We are developing multiple data marts that share the same instance. We want to deny access to users while tables are being updated. We have one generic user (BI_USER) with read access through one of the popular BI tools. For the current (first) data mart, we denied access by revoking the privilege from BI_USER; however, going forward, the other data marts' tables will be updated on different schedules, and we do not want to deny access to all the data marts at once. What is the best approach?
    (2) What is the best methodology for ownership of tables that are shared across data marts? Can we create one generic ETL_USER to update tables with different owners?
    Thanks,
    Jim Masterson

    If you have to go with generic logins, I would at least have separate generic logins for each data mart.
    Ideally, data loads should be transactional (or nearly transactional), so you don't have to revoke access ever. One of the easier tricks to accomplish this is to load data into a shadow table and then rename the existing table and the shadow table. If you can move the data from the shadow table to the real table in a single transaction, though, that's even better from an availability standpoint.
    If you do have to revoke table access, you would generally want to revoke SELECT access to the particular object from a role while the object is being modified. If this role is then assigned to all the Oracle user accounts, everyone will be prevented from viewing the table. Of course, in this scenario, you would have to teach your users that "table not found" means that the table is being refreshed, which is why the zero downtime approach makes sense.
    You can have generic users that have UPDATE access on a large variety of tables. I would suggest, though, that you have individual user logins to the database and use roles to grant whatever ad-hoc privileges users need. I would then create one account per data mart, with perhaps one additional account for the truly generic tables, to own each data mart's objects. Those owners would grant database privileges to different roles, and you would then grant those roles to different users. That way, Sue in accounting can have SELECT access to portions of one data mart and UPDATE access to another without being granted every privilege under the sun. My hunch is that most users should not be logging in to, let alone modifying, all the data marts, so their privileges should reflect that.
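    To make the shadow-table swap (and the role-based variant) concrete, here is a rough sketch with made-up table and role names; treat it as a starting point, not a finished script:
    -- Illustrative names only. Load into a shadow table first:
    CREATE TABLE sales_fact_shadow AS SELECT * FROM sales_fact WHERE 1 = 0;
    -- ... ETL populates SALES_FACT_SHADOW here ...
    -- Swap by rename (brief window where the old name does not resolve):
    RENAME sales_fact TO sales_fact_old;
    RENAME sales_fact_shadow TO sales_fact;
    -- Or fully transactional (no window, at the cost of more undo/redo):
    DELETE FROM sales_fact;
    INSERT INTO sales_fact SELECT * FROM sales_fact_shadow;
    COMMIT;
    -- Role-based variant, if you must block readers during a refresh:
    REVOKE SELECT ON sales_fact FROM bi_readers;
    -- ... refresh the table here ...
    GRANT SELECT ON sales_fact TO bi_readers;
    One caveat with the rename approach: object grants stay with each physical table, so the newly renamed table needs the same grants the original had, which is another argument for the single-transaction variant.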
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC

  • Difference between Data Mart & Generate Export DataSource

    Hi all,
    I would like one small clarification, please.
    I want to know the difference between Data Mart and Generate Export DataSource.
    To my mind the two are much the same.
    Could somebody explain it in detail? Any help doc you can point me to would be very useful.
    Regards and thanks,
    kgb

    Hi kgb,
    This is my point of view:
    1) Data Mart: in BI, this term identifies a specific area of analysis in your data warehouse. It is generally obtained by aggregating data from other structures, or by combining internal data with external data.
    2) Generate Export DataSource: this is the technical tool for obtaining an extractor to feed other structures, such as a data mart.
    Ciao,
    Riccardo

  • Data mart cube-to-cube copy: records are not matching in target cube

    Hi experts,
    I need help with the questions below on a data mart cube-to-cube copy (8M*).
    It is a BW 3.5 system.
    We have two financial cubes: Cube A1, sourced from the R/3 system (delta update), and Cube B1, sourced from Cube A1 (full update). The two cubes are connected through update rules with one-to-one mapping and no routines. Basis copied the back-end R/3 system from Production to the Quality server approximately two months ago.
    Cube A1, which extracts the delta load from R/3, is loading fine. But for the second cube (extraction from Cube A1), I am not getting the full volume of data; I get only a meagre amount, although the load shows a successful status in the monitor.
    We tried adding selection conditions in the InfoPackage (as in the previous year's loads), but it still fetches the same meagre volume of data.
    To check whether this happens only for that particular cube, we tried other cubes sourced through the Myself system, and those also load meagre data rather than the full set.
    For example: for an employee with 1,000 records available, the system extracts some 200 records seemingly at random.
    Any quick reply will be most helpful. Thanks

    Hi Venkat,
    Did you do any selective deletions in Cube A1?
    First reconcile the data between Cube A1 and Cube B1:
    match the totals of the two cubes.
    Thanks,
    Vijay.

  • Facebook images, profile photos, cover photos are not loading. Only small blue square with question mark. Same thing in Safari, Firefox, Chrome.

    Since July, on my MacBook Pro, Facebook images, profile photos and cover photos are not loading; only a small blue square with a question mark appears. It is the same in Safari, Firefox and Chrome. Everything is up to date, including the latest version of Adobe Flash. Watching YouTube is also a problem: the video skips, freezes for a few seconds and jumps forward. At the same time, on a Toshiba with Windows Vista, under the same account and the same Wi-Fi connection, everything works fine.
    Please help with suggestions to fix the issue.

    It seems the problem is solved, though it might be temporary. Anyway: click on the image and it will open the Facebook photo browser. The image does not load, so right-click on the blue box with the question mark and choose "Save image as". The download starts but fails, since the browser cannot see the image. After this attempt, refresh the page and close all Facebook tabs, then open Facebook again. For some reason, after these odd steps, all images displayed properly. I do not know for how long, but for now it works.

  • RE: DataField, update underlying data via TOOL, Express

    John,
    does it work if you mix the "CopyfromClipboard" method with the "PlaceValueinDisplayedField"?
    If this is not the correct solution to your problem, could you please specify "where" it does not work?
    Thanks a lot indeed.
    Best regards
    /Stefano
    Stefano POGLIANI, Forté Software Consultant
    E-Mail : [email protected] Tel : +33.(0)450201025
    Fax : +33.(0)450200257 Mobile : +33.(6)08431221
    Visit the Forté Web Site : http://www.forte.com/
    Ducunt fata volentem, nolentem trahunt....
    -----Original Message-----
    From: John Hodgson [SMTP:[email protected]]
    Sent: Wednesday, July 02, 1997 8:39 PM
    To: [email protected]
    Subject: DataField, update underlying data via TOOL, Express
    In TOOL code we PasteText() into a DataField, but the underlying data
    object does not get updated until the user interacts with the GUI.
    That causes problems if we need to use the underlying data object's
    value immediately after the paste. How can we:
    force an update of the underlying data object and
    ensure that the update goes through before our method call returns,
    i.e., ensure that if the update is via Forte events, those events
    are handled before returning.
    The context is a calendar lookup button that pastes into an adjoining
    DataField.
    John Hodgson |Descartes Systems Group Inc.|[email protected]
    Systems Engineer|120 Randall Drive |http://www.descartes.com
    |Waterloo, Ontario |Tel.: 519-746-8110 x250
    |CANADA N2V 1C6 |Fax: 519-747-0082

    Well, I think I have answered my own question, but I will leave it here in case anyone else has the same problem.
    As far as I have been able to track down, it all went wrong when I was running through the connection wizard.
    The section titled "Creating the Data Source" describes how to find your database file and create the appropriate connection string. However, my version of VS Express 2010 offered me a prompt saying something like, "Would you like to move this database file into the application directory and change the connection string?" This sounded very sensible to me, so I said yes.
    All proceeded accordingly, and the database file now appeared in the Solution Explorer.
    The app.config file said that the connection string was
    Data Source=|DataDirectory|\Database1.sdf
    I presumed this would be interpreted correctly by the rest of the app, as it was generated by VS.
    But it wasn't, and what I cannot understand is why no error was generated; data still seemed to pull into the bound controls.
    I have been testing it for a while now, and it seems that if I manually override the config file with the actual directory where the file exists, then there is no problem and data is retained in the file.
    This is more of a VB.NET question, but I couldn't find that in the drop-down. I will try to move it there now.
    Thanks, guys, for your patience.
    P.S. RSingh, the code I posted above did come from the SaveItem_Click event handler.

  • Data Warehouse Partitioning question

    Hi All,
    I have a data warehousing partitioning question - I am defining partitions on a fact table in OWB and have range partitioning on a contract number field. Because I am on 10gR2 still, I have to put the contract number field into the fact table from its dimension in order to partition on it.
    The tables look like
    Contract_Dim (dimension_key, contract_no, ...)
    Contract_Fact(Contract_Dim, measure1,measure2, contract_no)
    So my question:
    When querying via reporting tools, my users are specifying contract_no conditions on the dimension object and joining into the contract_fact via the dimension_key->Contract_dim fields.
    I am assuming that the queries will not use partition pruning unless I put the contract_fact.contract_no into the query somehow. Is this true?
    If so, how can I 'hide' that additional step from my end-users? I want them to specify contract numbers on the dimension and have the query optimizer be smart enough to use partition pruning when running the query.
    I hope this makes sense.
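    To show what I am asking with made-up partition bounds, here is roughly the DDL and the two query shapes in question:
    CREATE TABLE contract_fact (
      contract_dim NUMBER NOT NULL,  -- FK to contract_dim.dimension_key
      measure1     NUMBER,
      measure2     NUMBER,
      contract_no  NUMBER NOT NULL   -- denormalized here only so I can partition on it
    )
    PARTITION BY RANGE (contract_no) (
      PARTITION p1 VALUES LESS THAN (100000),
      PARTITION p2 VALUES LESS THAN (200000),
      PARTITION p3 VALUES LESS THAN (MAXVALUE)
    );
    -- Prunes: the predicate is on the fact table's partition key.
    SELECT SUM(f.measure1)
      FROM contract_fact f
     WHERE f.contract_no = 123456;
    -- My assumption: this does NOT prune, because the predicate is only on the dimension.
    SELECT SUM(f.measure1)
      FROM contract_fact f
      JOIN contract_dim d ON d.dimension_key = f.contract_dim
     WHERE d.contract_no = 123456;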
    Thanks,
    Mike

    I am about to start a partitioning program on my dimension / fact tables and was hoping to see some responses to this thread.
    I suggest that you partition the tables on the dimension key, not any attribute. You could partition both fact and dimension tables by the same rule. Hash partitions seem to make sense here, as opposed to range or list partitions.
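    A sketch of what I mean, with an arbitrary partition count; equi-partitioning both tables on the surrogate key also enables partition-wise joins:
    CREATE TABLE contract_dim (
      dimension_key NUMBER PRIMARY KEY,
      contract_no   NUMBER
    )
    PARTITION BY HASH (dimension_key) PARTITIONS 8;
    CREATE TABLE contract_fact (
      contract_dim NUMBER NOT NULL,  -- join key and partition key
      measure1     NUMBER,
      measure2     NUMBER
    )
    PARTITION BY HASH (contract_dim) PARTITIONS 8;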
    tck

  • GRC AC 5.3 SP10: Query Data Mart Tables using MS Access ODBC

    Dear all,
    I have created an ODBC connection from MS Access on my computer to the GRC AC 5.3 SP10 data mart tables according to the guide "Data Mart Reporting with SAP BusinessObjects Access Control 5.3".
    Now I want to run a query on a DM table, e.g. GRC_DM_CC_MGMTTOT, to get information on the AC RAR management reports.
    1. In Access I create a new DB.
    2. File -> External data -> Link Tables: all Oracle tables of the connected GRC schema are listed.
    3. When I select table GRC_DM_CC_MGMTTOT to be linked to my new DB, Access starts linking to a lot of tables like TSTAT_*, TT2C_* and TTEC_* in the Oracle DB! Are these links really required to query table GRC_DM_CC_MGMTTOT?
    These connection attempts keep my computer busy for a very long time, so I have to end Access using the Task Manager.
    Does anybody have experience connecting MS Access to the data mart tables? Unfortunately we have no license for Crystal Reports, so we need to get this running with MS Access.
    Thanks and regards,
    Markus

    Markus,
    Please ensure you are using the correct ODBC driver for your Oracle database. You can verify that with your DBA. I have successfully connected SQL Server, MaxDB and Oracle DBs to MS Access.
    In addition, if there is a significant amount of data in the data mart table, it takes MS Access a long time to download it. E.g. the GRC_DM_SOD_ACT table in my internal system took about 10 hrs to download to Access.
    Please feel free to ping me if you have any specific questions regarding the Data Mart.
    Thanks!
    Ankur
    SAP GRC RIG

  • AC 5.3 - Data Mart - FUNCTID missing in DB_GRC_DM_CC_SOD_ACT table

    Hi all,
    I was successfully able to make a link from GRC AC 5.3 to a Microsoft Access database in order to create custom reports.
    I have run all the necessary jobs!
    Looking at table GRC_DM_CC_SOD_ACT, I found that the FUNCID field is empty. The same field in table GRC_DM_CC_SOD_PRM is correctly filled.
    Do you know if there is a bug around this?
    Thanks
    Andrea Cavalleri
    Aglea

  • SAP GRC AC 5.3 SP09 Data Mart: How to set up this new feature

    Hi there,
    does anyone already have experience setting up the new data mart functionality in AC 5.3 SP09?
    I have read through the data mart config section of the AC 5.3 config guide (pages 55-56 and 321-322), but I don't understand the simplest way this process should work in order to get simple flat files out of AC 5.3.
    We do not use Crystal Reports; we just want flat-file data extracts out of AC 5.3 that we can load into MS Access.
    1. Regarding the creation of a new data source in the Visual Admin: is this required when I just want to get flat files out of AC 5.3 using the data mart jobs?
    We use an Oracle DB in a dual-stack install for AC 5.3. Do I need to install an Oracle JDBC driver to set up the data source?
    2. Where am I supposed to find the data mart flat files that result from the data mart jobs when I do not have a connection set up to any analysis tool's DB? Do I need to define a file location in the data mart jobs in RAR?
    Thanks for any help, and regards,
    Markus

    Dear all,
    I got a reply from SAP on a message regarding this issue, stating that the connection process outlined in the document
    http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/10e245d3-09a5-2c10-3296-f3afe44497e2&overridelayout=true
    can be applied to an Oracle DB as well, if your GRC AC 5.3 data is stored there. The prerequisite is to add the additional data source in your Visual Admin.
    You then fill the data mart via the data mart jobs, and via ODBC for Oracle you should be able to access this data using Crystal Reports or MS Access.
    Regards,
    Markus

  • Data mart data load InfoPackage gets short dumps

    This is related to the solution Vijay provided in the link What is the functionality of 0FYTLFP [Cumulated to Last Fiscal Year/Period]
    I ran into the problem again with a data mart load: I created different initial-load InfoPackages with different data selections and ran them separately, so the init requests got messed up, and whenever I try to create a new InfoPackage I always get short dumps. RSA7 on the BW system itself doesn't show the faulty entry.
    I tried the program RSSM_OLTP_INIT_DELTA_UPDATE you mentioned; it takes three parameters:
    LOGSYS (required)
    DATASOUR (required)
    ALWAYS (not required)
    I fill in LOGSYS with the name of our BW source system from the InfoPackage, and DATASOUR with the DataSource name 80PUR_C01. But nothing happens when I click the execute button!
    Then I tried the other option you suggested, checking the entries in the following three tables:
    ROOSPRMS Control Parameters Per DataSource
    ROOSPRMSC Control Parameter Per DataSource Channel
    ROOSPRMSF Control Parameters Per DataSource
    I found no entry in the first table for DataSource 80PUR_C01, but two entries in each of the second and third tables. Should I go ahead and delete those two entries from each of those two tables?
    Thanks

    Kevin,
    sorry, I didn't quite follow your problem/question, but be careful when you modify the content of these tables!
    There is a high risk of inconsistencies... (Why don't you open an OSS message and ask SAP support about this situation?)
    Hope it helps!
    Bye,
    Roberto

  • Do we require an OLTP DB and Data Mart?

    Our data sources are as follows:
    - An mdb file (downloaded every hour)
    - Multiple xls files (downloaded every week)
    Our aim is to develop a BI solution using BISM, Data Mart, OLAP cubes etc.
    From my understanding, we do not necessarily require an OLTP DB and we can import our data directly into our data mart using SSIS.
    - However, with a data mart, will we be able to display all our data and perform CRUD operations on it at our presentation layer, just as with an OLTP DB? For example, list historical data in table format, which can be updated if needed?
    Thanks.

    Hi DarrenOD,
    It is correct that you do not require an OLTP DB, only the extracts you need. The extracts are usually significantly smaller than the OLTP DB, since you will never analyze every field in the operational system, only a small portion of the source system.
    The traditional data warehouse (DWH) architecture is a staging DB, a DWH DB (plus data marts if needed), and an analytic layer (OLAP / Tabular). There are very specific and good reasons for this. The DWH DB contains all history. Keep in mind that the DWH follows a dimensional model, whereas the OLTP follows a normalized (3NF) model with lots of indexes, foreign keys and table relationships.
    Data marts are created for specific reporting needs that cannot be served directly from the DWH facts. The marts are created from the DWH tables.
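    As a simplified sketch of that dimensional shape (all names invented for the example):
    -- Star schema: one fact table plus flattened dimension tables.
    CREATE TABLE dim_date (
      date_key   INT PRIMARY KEY,  -- e.g. 20240131
      full_date  DATE,
      month_name VARCHAR(20),
      year_no    INT
    );
    CREATE TABLE fact_sales (
      date_key    INT REFERENCES dim_date (date_key),
      product_key INT,
      quantity    INT,
      amount      DECIMAL(12, 2)
    );
    A mart like this is typically loaded append-only by SSIS; row-by-row editing of historical records is really an OLTP-style requirement, so if your users must update data, you may want a small operational schema in front of the mart after all.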
    Hope this helps.
