OLAP is bad

OK, this is not a question but a general comment. I kind of hope that, next to the dumb copy-pasted documentation responses and random links, there is somebody out there who feels my pain and responds...
I really don't want to talk about the engine or the really terrible Java-based development IDE. I want to talk about the actual client.
I work for a bank on a finance technology team (we focus 100% on different financial apps and data marts, some of them 1 TB+). Our users use Excel. You can and probably will say something about BI tools, but... we tried to give them at least three different "cool" BI tools. They don't like them, simply because they are Excel gurus who don't want to even try a new tool and just copy-paste everything back into Excel.
So the natural pick for us was the OLAP Excel add-in... and I've never seen a worse add-in, period. Not only is the tool slow (it now takes at least 7 seconds to open Excel), it crashes a lot, and on top of that the user-friendly screens are simply not there. Not sure who wrote it, but it almost looks like the work of one developer with no feel for actual non-technical Excel users.
Oracle bought Hyperion a while ago; I hope they are going to use the best of the Essbase Excel add-in...
jiri

Hi Stevan,
Best practice is to keep them separate, but it also depends on the data you have.
You can get more information in the threads below:
is keeping your OLAP and OLTP databases on...
SSIS Validation slow. OLTP OLAP running on same server
SQL Sever OLAP and OLTP on the same server with different instances
Thanks
Suhas

Similar Messages

  • Limitations of Discoverer compared to Other Reporting tools

    Hi,
    I need to evaluate the pros and cons of Oracle Discoverer. Can you please tell me the advantages and disadvantages compared to Business Objects, Cognos, or any other reporting tool?
    Thanks, Prasad

    Hey Prasad.
    Now there's an open question (i.e., which is better, Ford or Chevy).
    I'm sure this has been addressed on this forum before, so you might want to search for this subject.
    However, dipping my toes in the pool ...
    1. Oracle is pushing Discoverer as an important tool in their repertoire. Future versions will interface with XML Publisher (now BI Publisher), etc., so it's tightly interwoven with Oracle. BO, Cognos, etc. are third-party tools and therefore not tightly woven in.
    2. If running Oracle Apps, then you can use the BIS views for pre-written views, Disco workbooks, etc. It's simply not practical to try to use BIS without Discoverer.
    3. Newest version of Disco (10g r2) now supports OLAP. Badly ... but it's a start.
    4. Oracle is really pushing the idea of everything being in the database, and therefore they are pushing OLAP and DW stuff in the database along with your relational stuff. In terms of tools, this means they're strongly pushing Disco with OLAP, instead of a customer having to buy hardware, software (i.e. Hyperion, Cognos, etc.), people (DBAs to manage it, software types to program it), etc., and exporting data from the Oracle database to a third-party database to then report on. Again, Disco with OLAP (and someday, if actually possible/practical) mixing your OLTP stuff with your OLAP stuff.
    5. It's been a while since I used BO and Cognos, so I'm not sure if their newest versions have the same idea of three versions (well, two someday) where you create the report with Plus (or Desktop) and allow users to lightly manipulate (but not create) existing reports using a simple HTML browser (Viewer).
    6. Discoverer is a neat name as it sounds like you're stuttering ...
    I'm sure there'll be lots of other reasons stated by others on this board, but if I were betting the farm on where to go, I would go with Discoverer if I had an Oracle installation. It'll cost you - and Oracle is definitely in it to make a buck or two - but if you're in for a penny with Oracle databases, then you're in for a pound, and using third-party tools that may do the same, less, or even a bit more never appeals to me, as Oracle will either catch up and surpass them ... or, let's be realistic, buy them out! Like Microsoft, they've been doing that lately, with a spare few billion dollars falling out of Larry's pockets.
    Russ

  • Import OLAP Metadata in OBIEE 11.1.1.5 failing

    We are in the process of integrating our existing Oracle OLAP infrastructure with OBIEE 11.1.1.5. We are currently on an 11.2.0.1 DB, with our 11g cubes in 11.2.0.0.0 compatibility mode.
    We have two AWs within the same schema that are tightly integrated (e.g. Q12_AW and Q12_HIST_AW). When I import the metadata using OBIEE 11g BI Administrator, I can only see Q12_HIST_AW but not Q12_AW. Also, when I copy Q12_HIST_AW to OBIEE, it errors out. Looking at the Java host logs, it looks like I am getting parse errors, but I am not sure how to investigate why the parser is failing.
    Logs:
    [2011-08-16T10:34:45.168-05:00] [init] [WARNING] [] [saw.init.application] [tid: 14] [ecid: 0000J7JEjD6FS8O6yjMaMG1EIco2000006,0] Bad AW -- Q12_AW.Q12_AW
    [2011-08-16T10:34:45.175-05:00] [init] [WARNING] [] [saw.init.application] [tid: 14] [ecid: 0000J7JEjD6FS8O6yjMaMG1EIco2000006,0] Errors have occurred during xml parse[[
    <Line 1181, Column 26>: Encountered "(" at line 1, column 83.
    Was expecting one of:
    "DIMENSION" ...
    <Line 1210, Column 26>: Encountered "(" at line 1, column 83.
    Was expecting one of:
    "DIMENSION" ...
    <Line 1529, Column 26>: Encountered "(" at line 1, column 58.
    Was expecting one of:
    "DIMENSION" ...
    <Line 1558, Column 26>: Encountered "(" at line 1, column 58.
    Was expecting one of:
    "DIMENSION" ...
    <Line 3025, Column 23>: Encountered "(" at line 1, column 54.
    Was expecting one of:
    "DIMENSION" ...
    <Line 4020, Column 24>: Encountered "(" at line 1, column 81.
    Was expecting one of:
    "DIMENSION" ...
    <Line 9516, Column 24>: Encountered "(" at line 1, column 101.
    Was expecting one of:
    "DIMENSION" ...
         at oracle.olapi.xml.TagHandler.createRootException(Unknown Source)
         at oracle.olapi.xml.TagHandler.getRootException(Unknown Source)
         at oracle.olapi.xml.TagHandler.reportException(Unknown Source)
         at oracle.olapi.xml.TagHandler.processException(Unknown Source)
         at oracle.olapi.metadata.BaseMetadataXMLReader.resolveDeferredProperties(Unknown Source)
         at oracle.olapi.metadata.MetadataXMLReaderMetadataInitialState.exit(Unknown Source)
         at oracle.olapi.metadata.MetadataXMLReaderMetadataInitialState.exit(Unknown Source)
         at oracle.olapi.xml.TagHandler.endElement(Unknown Source)
         at org.xml.sax.helpers.ParserAdapter.endElement(ParserAdapter.java:626)
         at oracle.xml.parser.v2.XMLContentHandler.endElement(XMLContentHandler.java:211)
         at oracle.xml.parser.v2.NonValidatingParser.parseElement(NonValidatingParser.java:1359)
         at oracle.xml.parser.v2.NonValidatingParser.parseRootElement(NonValidatingParser.java:376)
         at oracle.xml.parser.v2.NonValidatingParser.parseDocument(NonValidatingParser.java:322)
         at oracle.xml.parser.v2.XMLParser.parse(XMLParser.java:226)
         at org.xml.sax.helpers.ParserAdapter.parse(ParserAdapter.java:405)
         at oracle.olapi.xml.XMLProcessor.parse(Unknown Source)
         at oracle.olapi.metadata.MetadataFetcher.processXML(Unknown Source)
         at oracle.olapi.metadata.MetadataFetcher.fetchBaseMetadataObjects(Unknown Source)
         at oracle.olapi.metadata.BaseMetadataProvider.fetchMetadataObjects(Unknown Source)
         at oracle.olapi.metadata.MetadataListProperty.getObjects(Unknown Source)
         at oracle.olapi.metadata.BaseMetadataObjectState.getPropertyListValues(Unknown Source)
         at oracle.olapi.metadata.BaseMetadataObject.getPropertyListValues(Unknown Source)
         at oracle.olapi.metadata.mdm.MdmSchema.getCubes(Unknown Source)
         at oracle.olapi.metadata.deployment.AW.getCubes(Unknown Source)
         at oracle.bi.integration.aw.v11g.AW11gUtil.getAWImportInfo(AW11gUtil.java:1035)
         at oracle.bi.integration.aw.v11g.AW11gUtil.getAWImportInfo(AW11gUtil.java:1113)
         at oracle.bi.integration.aw.v11g.service.AW11gService.execute(AW11gService.java:83)
         at oracle.bi.integration.javahost.ServiceRpcCall.processMessageInternal(ServiceRpcCall.java:55)
         at com.siebel.analytics.javahost.AbstractRpcCall.processMessage(AbstractRpcCall.java:251)
         at com.siebel.analytics.javahost.MessageProcessorImpl.processMessage(MessageProcessorImpl.java:193)
         at com.siebel.analytics.javahost.Listener$Job.run(Listener.java:223)
         at com.siebel.analytics.javahost.standalone.SAJobManagerImpl.threadMain(SAJobManagerImpl.java:207)
         at com.siebel.analytics.javahost.standalone.SAJobManagerImpl$1.run(SAJobManagerImpl.java:155)
         at java.lang.Thread.run(Thread.java:662)
    [2011-08-16T10:34:46.359-05:00] [init] [NOTIFICATION] [] [saw.init.application] [tid: 14] [ecid: 0000J7JEjD6FS8O6yjMaMG1EIco2000006,0] Reading AW -- Q12_AW.Q12_HIST_AW
    [2011-08-16T10:34:46.419-05:00] [init] [NOTIFICATION] [] [saw.init.application] [tid: 14] [ecid: 0000J7JEjD6FS8O6yjMaMG1EIco2000006,0] [Thread 21] Service done -- AWImportService11G
    [2011-08-16T10:34:50.149-05:00] [workmanager] [NOTIFICATION] [] [saw.workmanager] [tid: 15] [ecid: 0000J7JElO_FS8O6yjMaMG1EIco2000007,0] Thread started
    [2011-08-16T10:35:22.340-05:00] [init] [NOTIFICATION] [] [saw.init.application] [tid: 14] [ecid: 0000J7JEtF^FS8O6yjMaMG1EIco200000C,0] [Thread 21] calling service -- AWImportService11G
    [2011-08-16T10:35:22.340-05:00] [init] [NOTIFICATION] [] [saw.init.application] [tid: 14] [ecid: 0000J7JEtF^FS8O6yjMaMG1EIco200000C,0] Reading AW UDML -- Q12_HIST_AW
    [2011-08-16T10:35:25.768-05:00] [init] [ERROR] [] [saw.init.application] [tid: 14] [ecid: 0000J7JEtF^FS8O6yjMaMG1EIco200000C,0] Errors have occurred during xml parse[[
    <Line 1181, Column 26>: Encountered "(" at line 1, column 83.
    Was expecting one of:
    "DIMENSION" ...
    <Line 1210, Column 26>: Encountered "(" at line 1, column 83.
    Was expecting one of:
    "DIMENSION" ...
    <Line 9516, Column 24>: Encountered "(" at line 1, column 101.
    Was expecting one of:
    "DIMENSION" ...
         at oracle.olapi.xml.TagHandler.createRootException(Unknown Source)
         at oracle.olapi.xml.TagHandler.getRootException(Unknown Source)
         at oracle.olapi.xml.TagHandler.reportException(Unknown Source)
         at oracle.olapi.xml.TagHandler.processException(Unknown Source)
         at oracle.olapi.metadata.BaseMetadataXMLReader.resolveDeferredProperties(Unknown Source)
         at oracle.olapi.metadata.MetadataXMLReaderMetadataInitialState.exit(Unknown Source)
         at oracle.olapi.metadata.MetadataXMLReaderMetadataInitialState.exit(Unknown Source)
         at oracle.olapi.xml.TagHandler.endElement(Unknown Source)
         at org.xml.sax.helpers.ParserAdapter.endElement(ParserAdapter.java:626)
         at oracle.xml.parser.v2.XMLContentHandler.endElement(XMLContentHandler.java:211)
         at oracle.xml.parser.v2.NonValidatingParser.parseElement(NonValidatingParser.java:1359)
         at oracle.xml.parser.v2.NonValidatingParser.parseRootElement(NonValidatingParser.java:376)
         at oracle.xml.parser.v2.NonValidatingParser.parseDocument(NonValidatingParser.java:322)
         at oracle.xml.parser.v2.XMLParser.parse(XMLParser.java:226)
         at org.xml.sax.helpers.ParserAdapter.parse(ParserAdapter.java:405)
         at oracle.olapi.xml.XMLProcessor.parse(Unknown Source)
         at oracle.olapi.metadata.MetadataFetcher.processXML(Unknown Source)
         at oracle.olapi.metadata.MetadataFetcher.fetchBaseMetadataObjects(Unknown Source)
         at oracle.olapi.metadata.BaseMetadataProvider.fetchMetadataObjects(Unknown Source)
         at oracle.olapi.metadata.MetadataListProperty.getObjects(Unknown Source)
         at oracle.olapi.metadata.BaseMetadataObjectState.getPropertyListValues(Unknown Source)
         at oracle.olapi.metadata.BaseMetadataObject.getPropertyListValues(Unknown Source)
         at oracle.olapi.metadata.mdm.MdmSchema.getCubes(Unknown Source)
         at oracle.olapi.metadata.deployment.AW.getCubes(Unknown Source)
         at oracle.bi.integration.aw.v11g.AW11gUtil.getAWUdml(AW11gUtil.java:914)
         at oracle.bi.integration.aw.v11g.AW11gUtil.getAWUdml(AW11gUtil.java:876)
         at oracle.bi.integration.aw.v11g.service.AW11gService.getAWUdmlObject(AW11gService.java:157)
         at oracle.bi.integration.aw.v11g.service.AW11gService.getAWUdml(AW11gService.java:137)
         at oracle.bi.integration.aw.v11g.service.AW11gService.execute(AW11gService.java:78)
         at oracle.bi.integration.javahost.ServiceRpcCall.processMessageInternal(ServiceRpcCall.java:55)
         at com.siebel.analytics.javahost.AbstractRpcCall.processMessage(AbstractRpcCall.java:251)
         at com.siebel.analytics.javahost.MessageProcessorImpl.processMessage(MessageProcessorImpl.java:193)
         at com.siebel.analytics.javahost.Listener$Job.run(Listener.java:223)
         at com.siebel.analytics.javahost.standalone.SAJobManagerImpl.threadMain(SAJobManagerImpl.java:207)
         at com.siebel.analytics.javahost.standalone.SAJobManagerImpl$1.run(SAJobManagerImpl.java:155)
         at java.lang.Thread.run(Thread.java:662)
    [2011-08-16T10:35:25.784-05:00] [init] [ERROR] [] [saw.init.application] [tid: 14] [ecid: 0000J7JEtF^FS8O6yjMaMG1EIco200000C,0] [Thread 21] Service failed - AWImportService11G. Details:Q12_HIST_AW
    Any help to diagnose the problem is appreciated.
    Swapan.
    Edited by: Swapan on Aug 16, 2011 9:28 AM

    It looks like OBIEE 11.1.1.5 ships with 11.1 jars, and since my DB was running 11.2, I hit this issue. The fix is described below for folks who encounter it.
    The OLAP API jars on the middle tier need to be updated to version 11.2.x. The OLAP API libraries are found in your Oracle Database home: [oracledb home]\product\11.2.0\dbhome_1\olap\api\lib. BI EE provides an 11.1.x version of these files in [obiee home]\Oracle_BI1\bifoundation\javahost\lib\obisintegration\aw\11g. Back up the BI EE version of the OLAP API jars and replace them with the version provided by the database.
    Swapan.
    Edited by: Swapan on Aug 16, 2011 1:33 PM
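    One hedged way to confirm which release the database-side OLAP components are on (and therefore which jar version belongs on the javahost) is to query the component registry; this is a generic sketch, not part of the original fix:
    -- Database-side OLAP component versions; the jars copied to
    -- [obiee home]\Oracle_BI1\bifoundation\javahost\lib\obisintegration\aw\11g
    -- should come from a database home at this release (11.2.x in this case).
    SELECT comp_id, comp_name, version, status
    FROM   dba_registry
    WHERE  comp_id IN ('APS', 'XOQ', 'AMD');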

  • Reporting on customer master data and bad performance: any workaround?

    Hello,
    I've been asked to investigate the bad performance encountered when reporting
    on the master data object zcustomer.
    Basically this master data has a design quite similar to 0customer, and there are 96,000 entries in the master data table.
    A simple query has been developed: the reporting is done on the master data zcustomer and its attributes - no key figures, no calculations, no restrictions...
    Nevertheless, the query cannot be executed: it runs for around 10 minutes in RSRT, then private memory is exhausted and a short dump is generated.
    I tried to build a very simple query on 0customer, this time without the attributes... and it took more than 30 seconds before I got the results.
    I checked the query statistics:
    3.x Analyzer Server: 10 sec
    OLAP: Read Texts: 20 sec
    Why does reporting on this master data take so long, when displaying the content in SAP via "maintain master data" gives an immediate answer?
    Is there any workaround?
    Any help would be really appreciated.
    thank you.
    Raoul

    Hi.
    How much data have you got in the cube?
    If you make no restrictions, you are asking the system to return data for all 96,000 customers. That is one thing that might take some time.
    Also, using the attributes of this customer object, e.g. making selections on them or displaying several of them, means that the system has to run through the 96,000 records of master data to work out what goes where in the report.
    When you display the master data, you are by default displaying just the first 250 or so hits, and you are not joining against any cube or sorting the result set, so that is fast.
    You should add some kind of restriction on things other than zcustomer (time, org. unit, version, etc.) to limit the dataset from the cube, and also a restriction on one of the zcustomer attributes (with an index on it, maybe), and performance should improve.
    br
    Jacob

  • Should I use olap plus package in my data warehousing project?

    I will build a massive data warehouse whose size is over 3 TB,
    and I will provide many analytical functions to users.
    Oracle 11g is my choice, but I don't know whether the OLAP option is suitable for me.
    In my opinion, OLAP is helpful for analysis, but it is hard to refresh the data when a cube is based on a large table.
    Who can help me? Thanks.

    So as I'm attempting to do the first one of these, I'm feeling like I should have gone with iMovie. Why? Because everything except for the green screen can be done so easily, and with professional-looking results, in iMovie. But that just doesn't seem right to me.
    It takes thick skin to be in this business so I'll test that ability of yours now: Your statements imply vast gaps in your skills, knowledge and patience. That's not bad, we all got started someplace, but you're in way over your head.
    We'll help in any way we can. I'll start by telling you to forget the notion that HD is necessary if your release media are iPods and Flash.
    bogiesan

  • Pre-fill the OLAP cache for a query on Data change event  of infoprovider

    Hi Gurus,
    I have to pre-fill the OLAP cache for a query which has bad performance.
    I read a doc, 'Periodic Jobs and Tasks in SAP BW',
    which suggested some steps to do this.
    I have created the BEx Broadcasting setting for scheduling job execution upon data change in the InfoProvider.
    Thereafter the doc says: "an event has to be raised in the process chain which loads the data to this InfoProvider. When the process chain executes the process 'Trigger Event Data Change (for Broadcaster)', an event is raised to inform the Broadcaster that the query can be filled in the OLAP cache."
    How can this be done? Please provide some proper steps.
    Answers are always appreciated.
    Thanks.

    Hi
    You need to create a process chain, or use the existing process chain that loads your current solution; just add the data-change event process type to the process chain, and inside it add the InfoProviders that are going to be affected.
    Once you are done with this, go to the Broadcaster and create a new setting for that query... you will see the option for event data change in the InfoProvider; just choose that and create the settings.
    Hope it helps.

  • Bad Data from Cache after infoprovider update

    Hello,
    We have queries running off of a MultiProvider consisting of 4 cubes. Occasionally, when we run queries after deltas have been loaded into the cubes, we get incomplete or bad data. Deactivating the cache, or deleting the cache and running the query again, always seems to resolve the problem.
    Any thoughts on what the issue might be and how to resolve?
    Thanks
    Stan Pickford

    Stanley,
    after any data change in an InfoCube, table RSDINFOPROVDATA should contain the timestamp. This timestamp is used by the OLAP cache to determine if new data has been loaded or if the cache is still valid.
    I'm not aware of any problems. If it is reproducible and you can't fix it, open a message to SAP Support.
    Regards,
    Marc
    SAP NetWeaver RIG

  • OLTP and OLAP databases on same SQL Server?

    Would you put OLTP and OLAP databases on same SQL Server or separate?
    I realize the ideal would be separate, but that means two expensive licenses instead of one. Most of our OLTP activity happens during the day, while OLAP processing happens at night (so it would be a good use of resources if there were just one server). Also, the disks that hold the files could have different allocations (small chunks for the OLTP disks, large chunks for the OLAP disks). Also, by being on the same database server, the OLTP data is immediately accessible for processing into the warehouse.
    Or is it just a cardinal rule never to mix the two worlds, as they serve completely different purposes?
    Thanks.

    The answer given in the similar thread below should help:
    MSDN Forum:
    Is keeping your OLAP and OLTP databases on the same server considered bad practice?
    -Vaibhav Chaudhari
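    If the two workloads do end up on one instance, one way to keep the nightly OLAP/ETL load from starving daytime OLTP is SQL Server's Resource Governor (an Enterprise Edition feature). The sketch below is only illustrative; the login name, limits, and pool/group names are assumptions, not anything from this thread:
    USE master;
    GO
    -- Cap CPU and memory for the warehouse/ETL workload (limits are illustrative).
    CREATE RESOURCE POOL OlapPool WITH (MAX_CPU_PERCENT = 40, MAX_MEMORY_PERCENT = 40);
    CREATE WORKLOAD GROUP OlapGroup USING OlapPool;
    GO
    -- Classifier routes the (hypothetical) ETL login into the OLAP pool.
    CREATE FUNCTION dbo.rg_classifier() RETURNS sysname WITH SCHEMABINDING
    AS
    BEGIN
        RETURN (CASE WHEN SUSER_SNAME() = N'olap_etl_login'
                     THEN N'OlapGroup' ELSE N'default' END);
    END;
    GO
    ALTER RESOURCE GOVERNOR WITH (CLASSIFIER_FUNCTION = dbo.rg_classifier);
    ALTER RESOURCE GOVERNOR RECONFIGURE;
    Daytime OLTP connections keep the default pool; only the classified ETL/reporting login is capped, so the overnight cube processing cannot monopolize the box.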

  • ORA-01578 block corrupted in OLAP instances

    Hi all,
    We found this error in almost every instance where we have the OLAP option installed:
    ORA-01578: ORACLE data block corrupted (file # 3, block # 1452)
    ORA-01110: data file 3: 'G:\ORADATA\NKDW2\CWMLITE01.DBF'
    ORA-06512: at "OLAPSYS.CWM2_OLAP_METADATA_REFRESH", line 8
    when executing:
    cwm2_OLAP_METADATA_REFRESH.MR_REFRESH()
    The first time, we thought it was due to a "real" block corruption, but when it started to appear
    in other instances (on different physical machines) we began to think it could be a bug.
    Several disk scans in Linux and Windows environments showed that everything was fine, but Oracle
    still reports the block corruption.
    Any ideas? Your comments are welcome.
    Thanks in advance
    aLeX

    Please confirm the corruption.
    select tablespace_name
         , segment_type
         , owner
         , segment_name
    from dba_extents
    where file_id = 3
    and 1452 between block_id and block_id + blocks - 1;
    Since you're running into issues on several machines (in a procedure in a package), it's possible that a piece of the code is corrupted in some way. Can you drop the package and then recreate it? The ?/cwmlite/admin/cwm2mrrf.plb script recreates this bit. If it turns out you have a bad script, we can wrap a new one and send it to you.

  • Third Party Tool using OLE DB for OLAP

    We have built a query in BEx 7.0 Query Designer and released it for access via OLE DB for OLAP to Cognos 8 Report Designer.
    I am trying to see the SQL/MDX code Cognos is sending to BW to retrieve data. Is there a way, on the BW side, to see what BW receives from the Cognos report? Just to clarify, getting this info for the BW query executed without the 3rd-party tool is not useful, because we don't have any issues running the BEx query; it is the 3rd-party tool that gets back bad data - specifically when using a query with a BEx reusable structure.
    I have further details on the design of the query, however I am trying not to create a very long and messy thread with unnecessary info.  Please ask if those details are necessary to understand the issue.

    Here is a little more info on how the query is set up and why I think it is having a problem...
    The query has two structures.
    Rows:
    Reusable Structure (without Key Figures) with 20 rows (selections and formulas) defining each line of a Financial Statement. The structure name is "XYZ Statement".
    Columns:
    Reusable Structure (with Key Figures) with 3 columns (05/2007 Actuals, 05/2007 Plan, Variance). The structure name is "Comparison".
    Free Characteristics:
    GL Account
    Cost Center
    Other Char...
    What Cognos 8 sees in the package:
    In Cognos, the structure "XYZ Statement" appears as a regular dimension, the same way Free Characteristics do. The structure "Comparison" appears as a Key Figure, aka Measure.
    How the Cognos report is built in Report Studio:
    If the report is built WITHOUT using the dimension "XYZ Statement", you will not get any data, regardless of which Free Characteristics you use from the query as a data source; however, it works fine when the "XYZ Statement" dimension is used.
    What I think the problem is:
    I am thinking that all Cognos reports built off this package MUST use the dimension "XYZ Statement" (aka the reusable characteristic structure in BW), otherwise the query does not return any data.
    How I am trying to validate this theory:
    The query is the data source for Cognos. Executing RSRT only gives me details on the query execution from the BW side; since the Cognos report is not returning correct data, I am trying to see what the Cognos report is sending to BW, to understand/validate what I think is the issue. If this is just the way it works, then I want to validate that and move on.

  • Five issues about OLAP that confuse me

    1. Why do we need to map a level twice, in both the dimension and the cube? I know that when mapping in the cube, we only map the lowest level of a hierarchy. And what if the same level has been mapped to a different column of the source table in each place?
    2. We can define multiple hierarchies for a dimension, so why do we need to set one of them as the default hierarchy? In what scenario do we need to define multiple hierarchies?
    3. When creating a cube, there is an option under Aggregation -> Precompute -> Cost-based Aggregation -> Percentage. What is "Percentage" used for? Performance?
    4. When I do some operations in AWM, it sometimes comes up with errors, but it is really hard to understand what they mean. For example, the error message below:
    The transaction is not committable: "An error has occurred on the server
    Error class: Express Failure
    Server error descriptions:
    DPR: Unable to create server cursor, Generic at TxsOqDefinitionManager::generic<CommitRoot>
    INI: XOQ-01600: OLAP DML Error "Analytic workspace object OLAPTRAIN.SALESTRACK!
    SALES_CUBE_PRODUCT_ALERT does not exist." while executing DML "SYS.AWXML!R11_MANAGE_CUBE
    ('SALES_CUBE.CUBE' 'DELETE')", Generic at TxsOqStdFormCommand::execute
    at oracle.olapi.transaction.BaseTransaction.commit(Unknown Source)
    at oracle.olapi.transaction.BaseTransactionProvider.commitCurrentTransaction(Unknown Source)
    at oracle.olap.awm.dataobject.DatabaseDO.commitOLAPI(Unknown Source)
    at oracle.olap.awm.dataobject.aw.WorkspaceDO.commitOLAPI(Unknown Source)
    at oracle.olap.awm.dataobject.olapi.UModelDO.commitOLAPI(Unknown Source)
    at oracle.olap.awm.dataobject.olapi.UModelDO.delete(Unknown Source)
    at oracle.olap.awm.dataobject.olapi.UCubeDO.delete(Unknown Source)
    at oracle.olap.awm.navigator.node.DeleteThread.run(Unknown Source)
    When it says "Analytic workspace object OLAPTRAIN.SALESTRACK!SALES_CUBE_PRODUCT_ALERT does not exist." I can't even find out what SALES_CUBE_PRODUCT_ALERT is or where I can find more detailed information to solve the problem. Is there a manual or reference that Oracle provides to help developers?
    5. Does the order in the join condition matter when we map a cube to its source, like a=b or b=a? And which kind of join is it: LEFT JOIN, RIGHT JOIN, OUTER JOIN, or INNER JOIN?
    Thanks,
    Satine

    Satine wrote:
    Hey Shankar,
    Thanks a lot for your help.
    I was out of the office until this week, so I am only reading your reply now. I think your explanation is very clear, but there is something I am not sure about.
    1.
    Shankar S. wrote:
    However this is quite similar to loading the DAY level in the Cube and marking the DAY and MONTH levels as stored levels; only the relational side does the additional work of precalculating MONTH-level summaries for loading into the MONTH level of the Cube. Quarters get calculated on the basis of the MONTH-level info.
    First, I have to apologise for my bad English, because I don't understand it very well. Could I say that any calculation for the data which will be loaded into the cube is done on the source data side? Like in your example.
    Shankar: Yes, calculations are required on the source side also... specifically, the calculation over the day-level information to get the Max at MONTH level should be done on the source (relational) side.
    Relational:
    TBL1: Fact table containing DAY level data
    TBL2: Fact table containing MONTH level data. This can be a Materialized View with the requisite logic built into the MV definition, or a normal summary/aggregate table loaded via ETL, e.g. a PL/SQL procedure.
    Cube:
    Dimension TIME has levels DAY, MONTH, QUARTER and YEAR. Use Implementation Details tab option of creating dimension members using keys from data source (dont generate surrogate keys in the AW).
    Aggregation: Sum along TIME dimension.
    In Mapping Editor for Cube: use option: Other (not star or snowflake)
    Use both tables TBL1 and TBL2 in the mapping editor and associate/map DAY level with TBL1 and associate/map MONTH level with TBL2.
    Load each day's Daily Total Withdrawal for each Customer/Customer Account at DAY level.
    Load the Maximum (MAX) amongst the day-level Total Withdrawals at MONTH level.
    Does that mean the calculation of "Total Withdrawal for each Customer/Customer Account at DAY level" and the "Maximum (MAX) amongst the day-level Total Withdrawals" are both done on the source data side before loading into the cube? If so, it means we have to balance granularity against loading time while mapping the cube, right?
    Shankar: Yes, it's not optimal, but the requirement/business case is not typical either. We'll be executing two loads (from two fact tables) to load a single cube (a rough relational sketch follows at the end of this message).
    2.
    Shankar S. wrote:
    Cube has aggregation set as SUM along TIME, SUM along CUSTOMER but by loading DAY level info into DAY and MAX(day) level info into MONTH, we achieve something like below which is not possible to do out of the box.
    DAY level info loaded as is, MAX aggregation operator at MONTH level, SUM aggregation operator at QUARTER, YEAR levels of TIME
    SUM along CUSTOMER from Customer Account upwards.
    After we defined the MONTH level as the "Maximum (MAX) amongst the day-level Total Withdrawals", does that mean we can't use the SUM aggregation operator at MONTH level the way we do on QUARTER or YEAR?
    Shankar: As mentioned above, we don't define Max anywhere within the cube (structure or definition). We define SUM along the Time dimension for the Cube, but because data is loaded at two levels using the MAX operator, the cube will not sum the day level to get data for the month level; it will display the loaded values at month level "straight off" from the cube. Quarter and Year levels get calculated as SUM(month) dynamically (if they are not stored levels). If they are stored levels, then the post-load cube aggregation process will calculate and store them.
    NOTE: Post the load (and internal cube aggregation process) the cube appears to be fully solved... all the data is available at all the levels and we just read them off the cube from the appropriate dimension levels using the relevant access mechanism (sql, olap dml, olap api etc.)
    I hope I could express my confusion clearly :).
    Thanks again for your help!
    Satine
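    Following up on point 1 above, here is a rough sketch of what the MONTH-level fact source (TBL2) could look like on the relational side. All table and column names are made up for illustration; the thread itself does not give them:
    -- Hypothetical month-level summary: the MAX of each account's daily total withdrawals.
    CREATE MATERIALIZED VIEW month_withdrawal_mv
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND
    AS
    SELECT t.month_id,
           f.customer_account_id,
           MAX(f.daily_total_withdrawal) AS max_daily_withdrawal
    FROM   day_withdrawal_fact f
           JOIN time_dim t ON t.day_id = f.day_id
    GROUP  BY t.month_id, f.customer_account_id;
    This MV then plays the role of TBL2 in the mapping editor, with month_id mapped to the MONTH level of the TIME dimension and max_daily_withdrawal mapped to the measure.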

  • ORA-39000: bad dump file specification

    Hi,
    I exported the dump file with %U and it was generated successfully with suffixes from 101, 102 ... up to 243.
    But while importing in another environment, it searches for files numbered 01, 02, and so on.
    Could you please assist?
    The actual file names are:
    Dump file set for EXPDPUSER.EXPFULL_NW is:
      +LMDRPRDERDWDB_ASM_DATA_DISK9/asm_dpump/prod_data_expdb_nw101.dmp
      +LMDRPRDERDWDB_ASM_DATA_DISK8/asm_dpump/prod_data_expdb_nw201.dmp
      +LMDRPRDERDWDB_ASM_DATA_DISK9/asm_dpump/prod_data_expdb_nw102.dmp
      +LMDRPRDERDWDB_ASM_DATA_DISK8/asm_dpump/prod_data_expdb_nw202.dmp
      +LMDRPRDERDWDB_ASM_DATA_DISK9/asm_dpump/prod_data_expdb_nw103.dmp
      +LMDRPRDERDWDB_ASM_DATA_DISK8/asm_dpump/prod_data_expdb_nw203.dmp
      +LMDRPRDERDWDB_ASM_DATA_DISK9/asm_dpump/prod_data_expdb_nw104.dmp
      +LMDRPRDERDWDB_ASM_DATA_DISK8/asm_dpump/prod_data_expdb_nw204.dmp
      +LMDRPRDERDWDB_ASM_DATA_DISK9/asm_dpump/prod_data_expdb_nw105.dmp
      +LMDRPRDERDWDB_ASM_DATA_DISK8/asm_dpump/prod_data_expdb_nw242.dmp
      +LMDRPRDERDWDB_ASM_DATA_DISK9/asm_dpump/prod_data_expdb_nw143.dmp
    Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, Automatic Storage Management, OLAP, Data Mining
    and Real Application Testing options
    ORA-39001: invalid argument value
    ORA-39000: bad dump file specification
    ORA-31640: unable to open dump file "+LMDRSITERDB_ASM_DATA1/dpump/PROD_data_expdb_NW01.dmp" for read
    ORA-17503: ksfdopn:2 Failed to open file +LMDRSITERDB_ASM_DATA1/dpump/PROD_data_expdb_NW01.dmp
    ORA-15173: entry 'PROD_data_expdb_NW01.dmp' does not exist in directory 'dpump'
    Thanks,

    Mishra wrote:
    Hi,
    I exported the dump file with %U and it was generated successfully with suffixes from 101, 102 ... up to 243.
    But while importing in another environment, it searches for files numbered 01, 02, and so on.
    Could you please assist.
    Use the SAME format (with the %U) that you used for the export.
    This:
    "+LMDRSITERDB_ASM_DATA1/dpump/PROD_data_expdb_NW01.dmp"
    does not look anything like these:
    "+LMDRPRDERDWDB_ASM_DATA_DISK9/asm_dpump/prod_data_expdb_nw101.dmp"
    Judging by the file list above, the export appears to have used two %U templates (something like prod_data_expdb_nw1%U.dmp and prod_data_expdb_nw2%U.dmp), so the import DUMPFILE specification must use those same templates rather than ...NW%U.dmp.

  • OLAP DML - aw access via AWM

    Hi All,
    Could you please clarify whether I can access, via AWM, an AW created and maintained with OLAP DML?
    I created an AW and some dimensions and cubes inside it using OLAP Worksheet, but I cannot see these objects in AWM.
    Kind Regards, Neelesh

    That is another question. As long as you use AWM to define the dimensions, attributes, measures, and cubes, then you should be fine. And you are correct in saying that the OLAP DML can be used as a calculation language to extend the model defined through AWM. (You can do many other things, too, with this 30-year-old language, but then you really are on your own.)
    There are two basic points of integration between the objects defined in AWM and objects you define yourself using OLAP DML.
    (1) Calculated measures can be directed to FORMULAs that you define. This, in turn, gives you access to PROGRAMs and other structures. There is also a way, not exposed by AWM, to support calculated attributes (in 11.2.0.2).
    (2) Cube Build Scripts can execute OLAP DML commands during a cube build. This gives you a way to link into a forecasting program, for example.
    In the other direction you can use the (new to 11.2.0.2) objorg function in OLAP DML to get access to the structures defined through AWM. (This is better than hardcoding the generated names.)
    So the real issue is how you edit these calculation objects through AWM. One answer, and it is not a bad one, is through the OLAP Worksheet. The EDIT command will bring up a PROGRAM editor, for example.
    For a non AWM answer you could try the old OX program, which is available from here: http://www.oracle.com/technetwork/database/options/olap/olap-downloads-098860.html . This provides a full view of all the objects in an AW. It is quirky, but I like it.

  • Using OLAP DML to maintain measures

    Can anyone point me to documentation on running custom DML programs in maintenance scripts in AWM? I would like to be able, for instance, to output information to the Logs.

    You can run an INSERT statement from OLAP DML to populate any table you want. The only complication is that the rows will only be persisted if the transaction is committed, and you cannot (should not!) commit the transaction during the current build, since that would persist bad data in the case of a failure. A possible solution is to write a PL/SQL procedure that begins an "autonomous transaction" and make it responsible for logging. E.g.
    DECLARE
      PRAGMA AUTONOMOUS_TRANSACTION;
    BEGIN
      INSERT INTO ...
      COMMIT;
    END;
    /
    You would then call your PL/SQL procedure from OLAP DML every time you wanted to log a record.
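    Wrapped in a named procedure, the same idea looks like this; the procedure and table names are hypothetical, just to show the shape:
    -- Hypothetical logging procedure: the COMMIT applies only to the autonomous
    -- transaction, so the cube build's own transaction is left untouched.
    CREATE OR REPLACE PROCEDURE cube_build_log (p_message IN VARCHAR2)
    IS
      PRAGMA AUTONOMOUS_TRANSACTION;
    BEGIN
      INSERT INTO cube_build_log_table (log_time, message)
      VALUES (SYSTIMESTAMP, p_message);
      COMMIT;
    END;
    /
    Each call from the OLAP DML maintenance script then writes one row, whether or not the surrounding build later fails.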

  • EIS - OLAP Model - Dimension Joins

    Hi,
    I know that in the EIS OLAP model we need to build a logical star schema between the dimension and fact tables with some join condition. Now, can you tell me how the join works while building the dimensions? Does EIS really care about the joins between the dimension and fact tables? My understanding is that EIS will load all the members present in the dimension table irrespective of the join clause.
    Pls shed some light on this!
    Thanks!

    You can join tables together with a "bad" JOIN. Your outlines will still build and probably look fine, however, this is what I tend to think of as a "dirty" EIS model. You will run into issues if you try and load data through EIS but you can probably get the outline to build how you want it. I highly recommend that you try and do things the "right" way as it will pay off in the future, but if you have to get it done some other way.... well, I've seen worse. :)
