OWB 10.2 Dimension 'remove' performance

Has anyone had much luck with using the "Remove" option against dimensions for processing deletions? I am using relationally implemented dimensions and cubes.
My first attempt failed because of fact table constraints. I then tried to do a "remove" against the cube with the business key of one of the dimensions. The mapping ran and did nothing at all - apparently, to delete from a fact table, you have to provide every single dimension key, which is annoying.
I then decided to change the FK options to CASCADE (which cannot be done in OWB; I had to alter the constraints outside the tool). This works, but performance is awful.
The deletion code generated against the dimension is odd: it compares on null values as well as on the key I specified, which keeps the optimizer from using indexes well. I created a compound index, which helps a little, but it is still slow.
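The workaround amounts to something like the following sketch (the table, constraint, column and index names are placeholders, not generated OWB code):
alter table sales_fact drop constraint sales_fact_store_fk;
alter table sales_fact add constraint sales_fact_store_fk
  foreign key (store_dim_key) references store_dim (dimension_key)
  on delete cascade;
-- compound index covering the business key plus the surrogate key the generated DELETE compares on
create index store_dim_rm_idx on store_dim (store_code, dimension_key);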
Anyone have better experiences or advice?
Thanks in advance,
Mike

Did you try creating the following indexes in your repository schema?
create index wb_rt_idx_ao_ou on wb_rt_audit_objects (object_uoid) tablespace &index.;
create index wb_rt_idx_ao_pa on wb_rt_audit_objects (parent_audit_object_id) tablespace &index.;
Regards
Matthias

Similar Messages

  • Dimension tabs Performance vs Evaluation order

    Hi All,
    We use planning 9.3.0.1.
    What is the difference between the Performance settings and the Evaluation order? What I know is that if we have to set up an hourglass / hourglass-on-a-stick model, we need to change the order of the dimensions in the Performance settings tab.
    Any information will be very helpful.
    Thanks,
    AD

    Same question as :- Dimension tabs Performance vs Evaluation order
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Retrieval time unchanged even after extra dimensions removed from ASO app

    We have an ASO application on which HFR reports are running. The application has 18 dimensions. The HFR reports were taking a long time to generate, so we created a duplicate of that application and removed 5 dimensions which were not being used. Even after running the reports out of the duplicate application, the generation time has not reduced.
    Please advise.

    Do you know how the report generation time relates to the Essbase query time (check the Essbase application log)? It might be that you're doing a lot of work on the FR side.
    If the problem really is on the ASO side, did the number of input data cells and / or key length change significantly when you removed the dimensions? If not, removing dimensions (which I am assuming were not referenced in the report in the first place) might not get you very much in terms of query performance. Have you created any aggregate views on the ASO cube?
    If you have a particular 'problem' report it's possible to use query tracking to create some aggregations directly to support that one query, particularly if the report always retrieves at the same level in each dimension.

  • Cannot see all the essbase dimensions when performed reverse engg.using ODI

    Hi,
    I performed the reverse engineering process to get the dimensions from Essbase into the Model. The first time, the session showed an error. When I right-clicked the session in the Operator and clicked restart, it ran successfully, but only the Accounts dimension got reversed. The rest of the dimensions are not visible in the Model. Please suggest.

    I have to import my actuals data, which is in an Oracle database, into Planning. As I was told that ODI is not compatible with EPMA, I cannot load data into Planning using ODI. As the data is to be stored in Essbase itself anyway, I want to use ODI to load the data into Essbase. For the metadata upload I am making use of interface tables.
    My problem is:
    1. Do I need to have a staging table to bring my actuals data into first and then connect it to Essbase?
    2. If yes, what should the table format be, i.e. the columns?
    3. Also, the user wants this to be automated, and if there are any changes to the base tables (the tables with actual data in Oracle) they should be reflected in Essbase as well. So is there any lookup functionality in ODI?
    I am new to ODI so I have many doubts. Hope to see a reply soon.

  • Infocube line item dimension and performance optimization

    Hi,
    I remodelled an InfoCube; the line item dimension contains only one characteristic, which is set as a line item dimension.
    Previously the dimension had one characteristic but it wasn't set as a line item dimension.
    When I checked SAP_INFOCUBE_DESIGNS from SE38 it looked OK:
    /SAP/CUBE   /SAP/CUBE3   rows:        8663  ratio:          3  %
    After setting it as a line item dimension, the row count is now negative and it is showing red, which suggests there is a problem with the dimension:
    /SAP/CUBE   /SAP/CUBE3   rows:          1-   ratio:          0  %
    Is this a performance problem, since it is showing red?
    thanks

    Hi,
    No, it's not a performance issue.
    For a dimension to be a line item dimension, the dimension size shouldn't be more than 20% of the fact table size.
    When a dimension is set as a line item dimension, the corresponding SID is placed in the fact table instead of the DIM ID.
    That may be the reason why, when your dimension is not a line item dimension, it shows the number of rows, and when it is made a line item dimension, it shows no rows and the ratio is also null.
    Hope this is clear for you.
    Regards
    Ramsunder

  • OWB Newbie - Joining Dimensions to Fact Tables

    Hello Forum,
    This may seem like a simple question, but the documentation is so lacking that I can't seem to find the answer to how the tool works.
    I am creating a simple data mart starting with a star schema and choosing a deployment option of ROLAP.
    When using the OWB tool to create dimensions and fact tables, do you need to define your primary business key (coming from your source system) in each fact table for each dimension that will join to that fact table? I am assuming yes, at least at the lowest leaf level of the dimension hierarchy, in this case Store_ID. How else would you be able to correctly join a particular sales order record to the store that it was sold in? That is a simple query we know our users will execute for reporting purposes. To make myself clear, here is a simple example:
    Dimension = Store
    Hierarchy = Store ----> Sales Territory ---->Region ----->Country
    Attributes = Store ID (primary business key), Store_Name, Sales_Territory_ID, Sales_Territory_Description, Store_Manager_Name, Store_Location etc.,
    Cube or Fact Table = Sales Orders
    Measures = sale_price, quantity, drop_ship_cost, sales_tax, etc.,
    Do I also need to create an attribute for Store ID so that when I load each sales order record into the fact table its proper Store ID comes along with it?
    I understand how the tool uses the surrogate key and business identifier to create a foreign key look-up from the dimension to the fact table and pull in the correct store description (i.e. name), but unless you have the Store ID as part of the sales record being loaded into the fact table, I don't see how you can traverse the tables and join a sales order to its proper store.
    Can someone who is farther along in the process or has more experience with other components of the tool confirm my assumption or set me straight on how the tool accomplishes this?
    thanks
    monalisa

    Hi Monalisa, I'll reply assuming you're using OWB 10gR2.
    First off, for each of the dimensions, you'll define a business key (as you noted below). Then, when you define the cube object, you'll tell it which dimensions you are using, along with the measures you will be storing.
    When you drop a "cube" object into a map, it will show the measures, but it will replace the dimensions with their natural keys. I.e. in your example below, if your sales order cube had ONLY the store dimension, then the cube object will show up in a mapping with the measures of sales_price, quantity, drop_ship_cost, etc., but instead of showing the dimension for store, it will instead show the natural key Store_ID. When you map your data to load (which should be by store_id), OWB will automagically look up the proper dimension keys and do everything else for you.
    Hope this helps!
    Scott

  • EPMA Validation Error: dimension removed or deleted since first deployment

    Dear experts,
    Our deployment is on EPMA and we were able to successfully deploy the Planning application before. Then we tried to update the dimension by loading UDAs onto members, after which we hit this error at validation:
    Error: The xxxx dimension has been removed or deleted since the first successful deployment of the application.
    We did have the UDA dimension in it before the update, and all dimensions in the application are shared from the library.
    Our application is on 11.1.1.3 and we have tried removing all associations and redoing them, but we still hit this error. As there are many members with UDAs assigned, we wonder if it's possible to solve the error without deleting the UDA dimension and recreating it?
    Any input would be appreciated, please kindly help!
    Thanks!!
    Edited by: 845931 on Apr 5, 2012 4:20 PM

    You will find it in My Oracle Support.
    The contents are as below:
    Applies to:
    Hyperion Planning - Version: 11.1.1.0.00 to 11.1.2.0.00 - Release: 11.1 to 11.1
    Information in this document applies to any platform.
    Goal
    You cannot delete a UDA dimension from EPMA once the application it belongs to has been deployed.
    ERROR
    The 'Colour' dimension has been removed or deleted since the first successful deployment of the application.
    Is there some way to remove the UDA dimension (not just modify its members)? This was possible in the Classic Planning interface but is not when using EPMA.
    Solution
    This is a limitation of EPMA and Planning. There is no mechanism in version 11.1.1.3 and earlier to allow EPMA to remove dimensions from Planning, and because EPMA treats UDAs, Attribute Dimensions, and Smart Lists as dimensions they also cannot be removed from Planning once the application has been deployed for the first time. They can be modified, but not deleted.
    In version 11.1.2 you can delete a UDA dimension in EPMA with no errors but the application will no longer validate (throwing the same error as in 11.1.1.3) and therefore cannot be deployed. However, this does offer an easy way to remove all UDAs assigned to members. If you delete and then re-create a UDA dimension in EPMA the application can be re-deployed again, even if you do not re-create any dimension associations. When you redeploy, all UDAs for this dimension are cleared from Planning and Essbase. This means that although the UDA dimension itself is required in EPMA, there is no trace of it in Planning and Essbase after redeploying.
    HTH-
    Jasmine.

  • Keyword add/remove performance

    I've been a huge critic of Aperture and with good reason, but once you start using Aperture 1.1 it's hard to go back to iPhoto.
    Managing keywords is all messed up but once you figure out how to remove a keyword from a bunch of images, it becomes possible to organize your photos the way you'd like. And the hierarchical keywords are a good idea I think.
    But the performance! Ha! It takes about a minute to remove a keyword from 100 photos. Why? I assume it's because for each image it has to write a plist file, usually around 15-18 KB in size, and then it fsyncs the data. That means the computer stops while the OS writes the data to the disk. There's also a ton of other stuff going on, but I don't see those operations stopping and waiting for data to be stored on the disk, so they are probably not slowing things down too much.
    I assume this is done to keep the data completely consistent, even in the face of a system or application crash. Nevertheless, it's not really possible to write an application this way and have it perform at all reasonably. There are other ways to guarantee data consistency, or at least to enable recovery if the system/app crashes unexpectedly. I'd be happy to make a few suggestions.
    Dual 2.5Ghz Power Mac w/3Gb RAM   Mac OS X (10.4.1)  

    How to add / remove keybindings from a KeyMap?
    Action breakAction = new myAction();
    Keymap map = pane.getKeymap();
    KeyStroke key = KeyStroke.getKeyStroke(KeyEvent.VK_ENTER, 0);
    map.removeKeyStrokeBinding(key);
    map.addActionForKeyStroke(key, breakAction);
    I tried to remove the key binding of the Enter key from a textPane.
    The textPane is connected to an HTMLDocument. Every time I press Enter, a new paragraph is started. This is very irritating! I don't want to start a new paragraph, I just want to go down one line. The strange thing is: when I remove the Enter key from the keymap of the textPane, the <p> tag is left out, so no new paragraph is initiated. But all other text-format tags like <b> and <i> are closed when the Enter key is pressed. Once again: this is very irritating! I don't want to lose the text format when I press Enter. I just want to go down one line! Is it possible?

    I believe you have to get the attribute set of the text on the current line and reapply the attributes to the newline... I agree, it is irritating, but you get used to it if you use the EditorPanes and EditorKits often.
    I will show you how to apply the breakAction in your other post.

  • OWB - making cubes / dimensions available for Disco OLAP

    ** Posted on the OWB forum initially but no replies **
    Hi, can someone explain (or tell where I can find documentation on) deploying cubes and dimensions for viewing in Disco Plus OLAP ?
    I have already derived other unrelated tables/views to business areas and have setup the security on those for Disco with no problem.
    The cubes and dims are defined as ROLAP and there is data in the underlying tables. I have granted SELECT access on the underlying tables to the required users. The dims and cubes have been deployed with the "Deploy All" setting
    If I log on as the eul owner into Disco Plus OLAP I can see the dims, cubes, measures and data with no problem, however if I log on with one of the other users I can log on ok (so no issues there) but have no objects to select from.
    I think the issue is to do with access not being granted to the new users to the relevant OLAP objects but do not know which ones.
    Thanks
    Paul

    Sorry folks, I got mixed up with another thread running somewhere else.
    With ROLAP, all you should need to do is grant SELECT on all the dimension and cube tables. The metadata for these tables is owned by OLAPSYS and stored in various tables starting with CWM_. Assuming your Disco user has been granted the role OLAP_USER (which should happen by default, and you confirmed this in a previous part of this thread), everything should work.
    This might be caused by the CWM catalog not being up to date in terms of all the required metadata. Try connecting as the OLAPSYS user and then execute the following command:
    exec CWM2_OLAP_METADATA_REFRESH.MR_REFRESH
    Once this has finished you might find that you cannot connect to Discoverer. This will be because the sample ROLAP schema owned by SH gets corrupted during installation, and this can cause problems with the OLAP metadata. The only thing to do is to use the OLAP PL/SQL packages to delete the dimensions/cubes owned by SH, then re-run the command to refresh the metadata catalog -> CWM2_OLAP_METADATA_REFRESH.MR_REFRESH
    Security - you need to use the normal database label security with a ROLAP implementation, as the AW PERMIT command is not available with ROLAP.
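    A minimal sketch of the grants and the refresh, assuming a cube table SALES_CUBE_TAB, a dimension table STORE_DIM_TAB and a Discoverer user DISCO_USER in a target schema OWBTARGET (all placeholder names):
    grant select on owbtarget.sales_cube_tab to disco_user;
    grant select on owbtarget.store_dim_tab to disco_user;
    -- then, connected as OLAPSYS, refresh the CWM metadata catalog
    exec CWM2_OLAP_METADATA_REFRESH.MR_REFRESH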
    Keith

  • OWB: how can an automatic update be performed in the staging database using OWB

    I am using OWB ETL to fetch data from a source database and store it in a staging DB.
    In the target table operator I am using the insert operation.
    It is inserting data fine.
    But my requirement is that the target database must be automatically updated with whatever modifications are made to the source database.
    Can you please help me with how to achieve this?
    Thanks alot ...
    k azamtulla khan

    Hi,
    Why do you want to do this using OWB? Is it not easier to create a before/after insert/update trigger on the source table so that your target table is updated automatically? You benefit from using OWB in scenarios where you need to transform and load data and then schedule that process; for the rest of it I would recommend using triggers for row-level activities such as updates.
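    A rough sketch of such a trigger, assuming a source table SRC_ORDERS and a staging table STG_ORDERS keyed on ORDER_ID with an AMOUNT column (all placeholder names):
    create or replace trigger src_orders_to_stg
    after insert or update on src_orders
    for each row
    begin
      -- upsert the changed row into the staging table
      merge into stg_orders t
      using (select :new.order_id as order_id, :new.amount as amount from dual) s
      on (t.order_id = s.order_id)
      when matched then update set t.amount = s.amount
      when not matched then insert (order_id, amount) values (s.order_id, s.amount);
    end;
    /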

  • Editing the period dimension - removing quarters

    Hi,
    By default the Period dimension is composed of 19 stored members (12 months, 4 quarters, YearTotal, BegBalance and Period itself). Is it possible to delete the quarters so that the block size of my application becomes smaller and I only have the YearTotal > Months hierarchy?
    Thank you
    Edited by: Icebergue on 4/Ago/2011 6:33
    Edited by: Icebergue on 4/Ago/2011 6:33

    You can create a custom-type Period dimension and create the 12 members under it. You can't modify an existing application, but you can create a new application with a custom Period dimension.
    But you will still have BegBalance and YearTotal.
    BTW, you can always change the storage property of the Quarters to Dynamic Calc.
    Cheers..!!!
    Edited by: RahulS on Aug 4, 2011 7:26 PM

  • Improve browsing performance of a SSAS Dimension

    Hi,
    I have a huge dimension with almost 20 million rows. Browsing the dimension takes a lot of time. Please help me improve the dimension browsing performance. I tried searching the internet but I was only able to find articles related to improving measure performance, nothing about dimensions.
    Regards,
    Venkata
    Venkata Koppula

    Hi Venkata,
    According to your description, you want to improve the performance of browsing a huge dimension. Right?
    As yger mentioned, we could create a default measure in your cube. Otherwise it will query all measure groups when you pull out your huge dimension. We can change the default measure to an empty hidden field, which will speed up this kind of query. For more information about optimizing performance in dimension design, please refer to the link below:
    SSAS - Best Practices and Performance Optimization
    Regards,
    Simon Hou
    TechNet Community Support

  • Creating A new dimension for a characteristic versus adding in the same dim

    Hi Guys,
    I have a scenario where I have 0Material in a line item dimension in the cube.
    I have to add 0MAT_PLANT, which is compounded to 0PLANT, as we need MRP Controller as one of the navigational attributes. 0PLANT is also available in the cube.
    There are two options for doing this:
    1) Either add it to the 0Material dimension, removing the line item property.
    2) Or create a new dimension for 0MAT_PLANT and make it a line item dimension,
       considering the large volume of material information.
    Which is a better option and why.
    Please advise.
    Many Thanks and Regards,
    KAte

    Hi Kate,
    I'd recommend having a new dimension as a line item dimension for 0MAT_PLANT, purely for performance reasons (near-logarithmic access to the data instead of a full table scan).
    The plant segments in R/3 usually have a lot more records than the general material master does (max: number of plants * number of materials).
    Adding the object to 0MATERIAL means that you have to unassign the line item flag. Usually this leads to increased load and query runtimes.
    hth
    cheers
    sven

  • OWB Corrupts Discoverer EUL

    HI,
    I want to draw some attention to an issue I encountered using OWB 9.2. The issue occurs when defining a dimension without any levels. If one exports that dimension through the Discoverer bridge and imports it into the EUL, the EUL is corrupted. To see it, one needs to do a second import, which will generate an error (cyclic hierarchy, EUL not changed). The mentioned dimension is NOT visible in Discoverer; however, the problem requires either reinstalling the EUL (thus losing all metadata entered and/or changed) or hacking into the EUL to remove the hierarchy which has no levels.
    IMO this is a serious problem which needs to be addressed ASAP; however, raising an iTAR only resulted in a product enhancement request (IMO way too low for such a problem).
    The referenced iTAR ( 3507810.999 ) and product enhancement request ( bug: 3237872) are available for viewing (with proper authorisation).
    How to put it "High" on the list of development?
    Greetings,
    Wilco

    Let me explain,
    A developper designed the dimension and forgot to add levels.
    OWB validated the dimension as OK (This shoul already result in invalid dimension IMO).
    We assigned the dimension to a collection, and exported a complete collection through the discoverer bridge. (6 Dimensions an 3 Facts).
    Export went successfully.
    During Import we got your error also, so we exported again. The then resulting eex imported successfully into OWB.
    The dimension we did not use right away, is was part of the 2nd increment of development, but created allready for future need.
    When we modified the ETL, because of a missing field, we reexported through the discoverer bridge.
    The file we got then we loaded with the following option in Discoverer : refresh the object, preserve display related properties. This reulted in the import not beeing able to finish due to "cyclic hierarchy error", the EUL was not changed. importing the new field was impossible.
    We did not want to create the extra field by hand, due to the fact that Discoverer and OWB use uid's, which would get out of sync, resulting in a almost unmaintainable EUL (we have experienced it allready).
    If of use, following TAR/bug's are related (from Discoverer and OWB side)
    Discoverer
    iTAR 3398979.996
    BUG none
    OWB
    iTAR 3507810.999
    BUG 3237872
    On the Discoverer side they recognize it as a bug but also treat it as an enhancement, while on the side that really causes the problem it is treated as a "would be nice to have" feature (in my interpretation).
    I am even encouraged to start a discussion on this issue to upgrade the bug. In my opinion it is a bug, and not even a small one. I believe that a product should never corrupt the metadata or functionality of another product.
    If one buys a car, and after your wife has driven it and made a little scratch on the paint (and only the paint), I believe nobody would accept it if from that moment on no one could drive the car normally because it is impossible to steer that car around a corner. If one goes to the garage, they will respond with: eeeh, OK, we will mention this to the factory so they can add it as an option. In the meantime buy a new car, and do not make any scratches on the paint.
    (You will have to admit the analogy here.)
    Greetings,
    Wilco

  • Connect a Date Dimension to a cube without relationship

    Hi everybody,
    I would like an answer to one business requirement.
    I created a cube that models the following event: a customer sends a product from an agency to another customer, who receives it in another agency.
    So I have a fact table with only two measures
    Amount
    Count
    which is connected to these dimensions
    Product
    Sending Date
    Receiving Date
    Sender (Customer)
    Receiver (Customer)
    Sender (Agency)
    Receiver (Agency)
    The users would like to analyse the following KPIs, at a specific date:
    Number of transactions sent, the amount
    Number of transactions received, the amount
    Number of transactions pending, the amount
    To answer this business requirement, I have added a new date dimension to the cube with no relationship, so that the user can select a date from this independent dimension and get the different KPIs.
    But I don't get any results.
    Is it a good model? How can I make it possible for the user to use the independent Date dimension to perform analyses of the different KPIs?

    Is it a good model? How can I make it possible for the user to use the independent Date dimension to perform analyses of the different KPIs?
    Hi Meal,
    According to your description, you want to know whether it is possible for the user to use the independent Date dimension to perform analyses of different KPIs, right?
    As per my understanding, we cannot do this without a relationship between the independent Date table and the rest of the model. However, we can add relationships between the added date table and the Sending Date and Receiving Date columns of the fact table. Please refer to the link below to see the details.
    http://msdn.microsoft.com/en-us/library/ms175427.aspx
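    For reference, the "pending at a given date" logic the reports need can be expressed directly against the relational fact table; a sketch with placeholder table and column names:
    select sum(case when sending_date <= :as_of_date then 1 else 0 end) as sent_count,
           sum(case when receiving_date <= :as_of_date then 1 else 0 end) as received_count,
           sum(case when sending_date <= :as_of_date
                     and (receiving_date is null or receiving_date > :as_of_date)
                    then 1 else 0 end) as pending_count
      from transaction_fact;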
    Regards,
    Charlie Liao
    TechNet Community Support
