9i Lite Optimizer

Does anyone know of, or has anyone read or discussed, the nature of the optimizer used by the Oracle Lite database engine?
I had a discussion with someone in Oracle development a few years back in which it was claimed that the 9i Lite kernel used a version of cost-based optimization.
Any comments?
Regards,
RP.

I did recently stumble across an item in the documentation that says you can change the OLite database parameter file to a strict "do it in the order of the FROM clause" optimization, which implies that there is normally some level of optimization going on. However, since I'm not aware of any "Analyze Table" command for OLite, optimization certainly won't be as extensive as with real Oracle.
My guess:
they look at predicates (where x=3),
index availability (trying to avoid full scans),
and probably support only nested loop joins.
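If that guess is right, the order of tables in the FROM clause matters a lot under strict ordering. A hypothetical sketch, reusing the Emp/Dept tables from the hint example further down this page (column names assumed):

    -- Under strict FROM-clause ordering, Emp is scanned first and Dept is
    -- probed once per qualifying Emp row (ideally via an index on Dept.DeptNo).
    SELECT e.Ename, d.Loc
    FROM Emp e, Dept d
    WHERE e.Sal > 50000 AND d.DeptNo = e.DeptNo;

    -- Reversing the FROM clause scans Dept first and probes Emp per Dept row,
    -- which can be far slower if Emp is large and only Emp.Sal is indexed.
    SELECT e.Ename, d.Loc
    FROM Dept d, Emp e
    WHERE e.Sal > 50000 AND d.DeptNo = e.DeptNo;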

Similar Messages

  • The Full Optimization & Lite Optimization Data Manager packages are failing

    Hi,
    The Full Optimization and Lite Optimization Data Manager packages are failing with the following message "An Error occured while querying for the webfolders path".
    Has anyone had a similar issue before? Please let me know how we can rectify it.
    Thanks,
    Vamshi Krishna

    Hi,
    Does the Full Optimize work from the Administration Console directly?
    If it's the case, delete the scheduled package for Full Optimize every night (in both eData -> Package Schedule Status and in the Scheduled Tasks on your server Control Panel -> Scheduled Tasks), and then try to reschedule it from scratch.
    If that doesn't solve your problem, I would check whether there are some "wrong" records in the FACT and FAC2 tables.
    After that, I would also check whether tblAppOptimize has any values other than 0; for all applications you should have a 0 there.
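    A minimal check, assuming the flag column is named Status (an assumption; check your installation's schema):

        -- Any application with a non-zero value here is flagged as mid-optimize
        SELECT * FROM tblAppOptimize WHERE Status <> 0;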
    Hope this will help you.
    Best Regards,
    Patrick

  • Lite Optimization - As User defined Variant

    Team,
    I have the following question.
    I have two BPC planning cubes, say Cube 1 and Cube 2. For Cube 1's Lite Optimization automation I have used the /CPMB/LITE_OPTIMIZATION variant in a process chain as per the requirement (compress up to 3 days ago). For Cube 2 I need a different scenario: compress up to 0 days ago and leave 3 requests.
    For the Cube 2 scenario I can't use the /CPMB/LITE_OPTIMIZATION variant in the process chain, as it is already used in scenario 1. So I tried creating a user-defined variant for lite optimization for Cube 2, but I am having trouble specifying the required tasks in the Advanced data manager package settings for the user-defined task.
    Could you help me out here?

    Hello Ravi
    You need to look into running the program UJD_TEST_PACKAGE in your process chain and then using a variant for that. The variant will allow you to select the model, the package to run, and the variants you require.
    Hope this helps.
    Ian

  • Re: Full Optimize and Lite Optimize Disabled

    Hi,
    We have a BPC App.
    What we need now is for a particular group of users to have no rights to run or trigger Full Optimize or Lite Optimize.
    Can anyone please advise how to disable this functionality or remove authorization for a particular set of users so that they cannot execute any optimization?
    Regards.

    Hi,
    Follow the steps below:
    1. Create a team and assign the users who should have access to Full and Lite optimization
    2. Go to Manage data > Maintain data management > Manage Packages
        --> Select the team which you created in step 1
        --> Select Add package
        --> Choose the process chain /CPMB/LIGHT_OPTIMIZE
        --> Give the package a name and description
        --> Select the task type as User Package
        --> Click Save
    3. Create one more package for Full Optimization, as explained in step 2, under the same team
    Now users who belong to this new team will be able to run both Full and Lite optimizations, provided the Execute task is assigned under their task profile.
    By following the above steps you can assign many packages to a team, and you can modify team access from Manage Team user package access if needed.
    Hope it helps..
    Regards,
    Raju

  • Oracle Lite, optimizer?

    Hi all!
    I'm on Oracle Lite 10.3. What type of optimizer does OLite use?
    I get some mysterious results from a SELECT: if I only change the order of the WHERE conditions, the results are different...

    Hey Anastasia,
    Do you mean for the client or for MGP? I am assuming you want to unnest the subquery within the COMPOSE phase. Correct?
    Here is the only reference to unnesting on the client in the documentation:
    Example 5
    In this example, the "ordered" hint selects the EMP table as the outermost table in the join ordering. The optimizer still attempts to pick the best possible indexes to use for execution. All other optimizations, such as view replacement and subquery unnesting, are still attempted.
    SELECT /*+ ORDERED */ Eno, Ename, Loc FROM Emp, Dept
    WHERE Dept.DeptNo = Emp.DeptNo AND Emp.Sal > 50000;
    But if this is your MGP that you are inquiring about, just alter the publication select statement with:
    SELECT /*+ UNNEST */ * FROM SOMESCHEMA.SOMETABLE WHERE BLAH, BLAH, BLAH

  • Lite & Full optimize doesn't clear WB table

    I think the subject line is self-explanatory. I have tried a few times to optimize (full & lite); nothing seems to help. The WB table still has records. Thanks.
    FYI - I did do a count on the WB table; there are 109k records in it.

    Hi Zack,
    I have never seen this behaviour. Are you sure that no packages/users are working when you run the optimize?
    Maybe a package is running at that moment and adding new records to the WB...
    It is better to schedule the optimize, so please try scheduling a lite optimize without checks every 30 minutes and see if it works (WB totally empty), a full optimize at night to see whether the FAC2 table gets emptied, and a full optimize with all the checks at the weekend.
    Kind regards
    Roberto

  • Execute Optimize Outside of SAP BPC

    Hi Guys,
    I am at a client that would like their internal IT people to be able to optimize applications without having access to the SAP BPC data, for governance reasons. We are having heavy performance issues because the WB tables are growing too quickly.
    Would they be able to use the Admin_Task in their own SSIS package, modify the package to perform a lite optimize, and execute the package via a trigger (when the WB table reaches a certain record count)?
    Would they require admin access to the system for this package to work? Has anyone set up a solution like this before?
    If this would not work, what do you suggest?
    Regards,
    Andries van den Berg

    Andries,
    Why don't you schedule the optimisation?
    SAP provides both DTS and DTSX packages which can be used to optimise the application as you desire. I use them myself. Through the data manager I created a scheduled task to run every 4 hours and run a lite optimization if there are more than a certain threshold of records in the write-back table.
    From your description this would meet your needs. It creates a scheduled task that is run by the server's scheduling process. This task could probably be used by your IT to run the process on an ad hoc basis.
    Doing it this way means that IT does not need to get involved, and it runs as a background process. The run of the optimisation appears in the DTS log within the application where it is set up. A sketch of a threshold-based trigger is below.
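    As a rough illustration (not the SAP-delivered package itself), a SQL Agent job step could check the write-back row count and start a pre-built lite-optimize job. The table name tblFactWBFinance and the job name BPC_LiteOptimize_Finance are assumptions for the example:

        -- Start the lite optimize only when the write-back table passes a threshold
        IF (SELECT COUNT(*) FROM dbo.tblFactWBFinance) > 40000
        BEGIN
            EXEC msdb.dbo.sp_start_job @job_name = N'BPC_LiteOptimize_Finance';
        END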
    Regards,
    Mark

  • Full optimize fails when scheduled

    Hi everyone,
    I have an issue with a full optimize package in SAP BPC 7.0 MS when it is scheduled.
    The package admin_Optimize I use is the standard one. There are no overlapping packages scheduled at the same time.
    When I manually run the package for a Full optimize on one application, it works.
    When I manually run the package for a Lite optimize on one application, it works.
    When I schedule the package for a Lite optimize on one application, it works.
    When I schedule the package for a Full optimize on one application, the package fails and stays stuck. Then the application stays unavailable.
    This issue takes place in the production environment, so I can't run a test during the day, and some people use the application very early in the morning, so they get stuck then.
    Have you ever met this issue? What can I do? What test could I run?
    Best regards,
    Paul

    When I start the full optimize by hand it works fine (but we are supposed to launch it at 4:15 in the morning, so I would prefer to schedule it!).
    Are you receiving any error message, or does the status of the package show that it was not completed with no error message provided?
    The package stays stuck; it never ends, even after 4 hours. We are obliged to kill it.
    Did you check tblLogs during the time the package was running? It can provide very useful information.
    Yes we did; nothing appears in that table... all the other packages are correctly recorded, even the LiteOptimize that runs 30 minutes later. By the way, that Lite optimize doesn't fail.
    Also please check the event viewer on the BPC servers during the period when the full optimize was running.
    No error related to BPC, but I had an error 45 minutes earlier related to the OSoft data manager.

  • Full Optimize problem - BPC 10.1 Classic on HANA

    Hi,
    I am trying to execute a Full Optimize, but it always ends with a warning and starts the Light Optimize instead.
    The model has 18 dimensions, and a very large dimension is in the so-called "Last dimension" position. I'd expect the Full Optimize to reorganise the model.
    ENABLE_FIXED_CUBENAME is not active/set.
    Any ideas what is causing this, or how to force the Full Optimize to be performed?
    Best regards,
    Christoph

    Hi Christoph,
    Do you know the full implications of running a full optimize? The changing of cube names and so on?
    You can set the ENABLE_FIXED_CUBENAME parameter by looking at SAP Note 1689814.
    Once you have fixed the cube name, you can only run lite optimize.
    Andy

  • Lite vs. Incremental vs. Full Optimize?

    Hi all,
    I want to understand some basic things:
    1. What is the difference between processing an application and optimizing it?
    I know that optimization moves data between the FACT, FAC2 and WB tables, but what does processing do then?
    2. What is the difference between lite, incremental and full optimize?
    I think lite moves data from the short-term (WB) table to the FAC2 table, but I am not sure what the other two options do.
    Thank you in advance.

    Optimizing applications
    Optimization cleans up data storage which improves the responsiveness of the system. You should optimize your applications periodically to enhance system performance.
    There is no rule of thumb for how often to run optimizations. The need can vary depending on the characteristics of your hardware environment and your application. However, you can set the system to remind you to optimize the application when the system database grows to a certain size.
    Data storage types
    Optimization options center around three different types of Business Planning and Consolidation data storage:
    Real-time (FactWB table)
    This storage area holds the most recent data sent through BPC for Excel and BPC Web. Periodically clearing real-time data greatly improves the performance of the system. See the Lite Optimization option below.
    Short-term (FAC2 table)
    This is data that is not real-time data, but is also not in long-term storage yet. When you load data via Data Manager, it loads the data to short-term storage so that the loaded data does not affect system performance.
    Long-term (Fact table)
    This is your main data storage. All data eventually resides in long-term storage. Data that is not accessed very often remains in long-term storage so that the system maintains performance.
    Optimization options
    The optimization options interact with the three types of data storage in different ways:
    Lite Optimization
    Clears real-time data storage and moves the data to short-term data storage. This option does not take the system offline, so you can schedule it to run during normal business activity.
    Incremental Optimization
    Clears both real-time and short-term data storage and moves both to long-term data storage. This option takes the system offline, so it is best run at off-peak periods of activity.
    Full Process Optimization
    Clears both real-time and short-term data storage and processes the dimensions. This option takes the system offline and takes longer to run than the Incremental Optimization. It is best scheduled during down-time periods.
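    In relational terms, the first two options are roughly a move-and-clear between the fact tables. A conceptual sketch only, assuming BPC MS naming for an application called Finance (the real packages also reprocess the OLAP partitions):

        -- Lite Optimization, conceptually: move real-time rows to short-term storage
        INSERT INTO dbo.tblFAC2Finance SELECT * FROM dbo.tblFactWBFinance;
        TRUNCATE TABLE dbo.tblFactWBFinance;

        -- Incremental Optimization, conceptually: move both stores to long-term storage
        INSERT INTO dbo.tblFactFinance SELECT * FROM dbo.tblFactWBFinance;
        INSERT INTO dbo.tblFactFinance SELECT * FROM dbo.tblFAC2Finance;
        TRUNCATE TABLE dbo.tblFactWBFinance;
        TRUNCATE TABLE dbo.tblFAC2Finance;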
    In the rare case that you have two application sets on the same server and two applications from different application sets are optimized at the same time, you may receive an error message. To work around this, optimize applications for one application set at a time. It is especially important to schedule optimizations at different times when the applications belong to different application sets on the same server.

  • Read dimension members (master data) into an ABAP internal table in BPC 10.0 NW

    Hi all,
    I managed to read transaction data using the example "replacement for IF_UJ_MODEL~GET_APPL_DATA".
    I am now trying to read members (master data) from a dimension into an ABAP internal table, but I have no idea how.
    Can anyone advise me on how to read members (master data) from a dimension into an ABAP internal table?
    Some sample code would be really appreciated.
    Regards

    Hi Chanaveer,
    UJD_ADMIN_RUN_OPTIMIZE can be used only for executing the FULL & LITE optimize packages.
    Looking at the code of UJD_RUN_OPTIMIZE_PACKAGE, it seems this FM can be used to trigger a process chain from BW.
    Please refer to the link below on SDN showing how to load master data on the fly in SAP BPC 10.
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/2020b522-cdb9-2e10-a1b1-873309454fce?QuickLink=index&overridelayout=true
    Thanks,
    Rohit

  • Problem deleting members in a BPC dimension

    Hi everybody,
    What is the right way to delete a dimension member?
    I've followed these steps:
    - I've done a lite optimize
    - I've cleared all data belonging to this member using the Clear package
    - I've deleted the member in Excel using BPC Admin
    - I've processed the dimension
    During the dimension processing, a warning appears saying the member is in use. But it has no data and it is not referenced by any formula member!
    What's happening?
    Any idea out there?
    Thanks in advance,
    Albert Mas

    Hi everybody,
    I need something that works directly in BPC Admin or BPC Excel.
    I've read that it's possible to create a package that runs the same process chain that BPC Admin uses when it runs 'Lite/Full Optimization'. I would have to include a new line:
    TASK(/CPMB/LIGHT_COMPRESS,ZERO_ELIMINATION_ON,X) for the Lite optimization package
    TASK(/CPMB/FULL_COMPRESS,ZERO_ELIMINATION_ON,X)  for the Full optimization package
    and after executing this package, it would perform a zero elimination on the BPC cube.
    Does somebody know the code that I have to write in the Data Manager package script?
    Does somebody know whether this SAP BPC 7.0 behaviour will be the same in SAP BPC 7.5?
    Thanks in advance,
    Albert Mas

  • Bad Performance on SQL 2005 multiserver installation

    We have upgraded our server environment from a single-server solution with SQL 2000 and are now running a multiserver solution with one DB server on SQL 2005 and two application servers.
    We are now experiencing much slower performance in our new environment on all EVDRE reports. Does anyone know of any issue with a multiserver environment or SQL 2005 that makes reports slower?
    Our more advanced report with "expand by sheet", which took about 1 minute to expand in our old environment, can take up to 10 minutes in the new one.
    Both servers are running BPC 5.1 SP5 patch 1.

    The SQL Server version is very good and the performance problems should not come from there.
    Please make sure you check the SAP_BPC_Performance_Tuning_Guide.doc document to be sure you have the right configuration for OLAP, SQL and the application server.
    Can you please provide information about the configuration you have in this multiserver environment?
    The number of processors and the memory of each server are also important.
    Do you have a maintenance plan for your SQL Server database?
    If you don't, that can again cause big performance problems.
    Please run exec sp_updatestats for your database in Management Studio to be sure that you at least have up-to-date statistics.
    All of these things should be managed carefully to be sure that you get good performance from the system.
    From the application point of view, make sure that you schedule a lite optimize so that you don't have more than 40000 rows in the WB table; a quick check is sketched below.
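    For illustration, the statistics refresh and the write-back check could look like this in Management Studio (the table name tblFactWBFinance is an assumption; BPC MS names the write-back table per application):

        -- Refresh optimizer statistics for every table in the current database
        EXEC sp_updatestats;

        -- Check whether the write-back table has grown past the suggested threshold
        SELECT COUNT(*) AS WBRows FROM dbo.tblFactWBFinance;  -- aim to keep this under 40000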
    Regards
    Sorin Radulescu

  • Understanding the data storage

    Hello,
    In order to understand exactly what happens to the data in the cube and in the relational database, here is a little scenario.
    Could you please tell me whether my understanding is correct or not? Further precisions are welcome:
    If I send an amount (10 for example) to a base-member intersection, the number 10 will be in the WB fact table (supposing the table was empty just before) and in the WB partition.
    Now suppose I change the amount: at the same intersection, I send the number 20. Thus, there will be 2 entries in the WB table, each with a SIGNEDDATA of 10.
    Are there also 2 entries in the WB partition of the cube, or only one with 20?
    After that, I run a lite optimization; now both entries are in the FAC2 table.
    Now if I send the amount 5 (still at the same intersection), there will be an amount of -15 in the WB table. Is that correct?
    At this moment, what is the situation regarding the OLAP WB and FAC2 partitions?
    If I retrieve the amount through a report, I assume that there will be a merge between the fact tables in order to retrieve the right amount (i.e. 5 = 10 + 10 - 15), but I'm not sure.
    Also, as long as the data intersection concerns a base member, is the retrieval made from the relational database? Are the OLAP partitions used when the data we want to retrieve concerns parent members? Is the retrieval made through MDX queries?
    So, is there also a merge between the OLAP partitions in order to retrieve the right data?
    Best regards,
    Lionel

    I think this is a question about the Microsoft OLAP database; I am not sure about other OLAP databases' behavior.
    Simple answers:
    If you send 10 and then 20 to make it 20, you have 2 entries in the WB table. Each entry has 10 for that intersection.
    If you do a lite optimize, your WB table will be empty, and the exact same records will be moved to the FAC2 table.
    If you send 5, your WB table will have 1 entry with -15. The total number of records for that intersection is 3.
    However, the number of records will not change no matter what. The only exception is the data compression during a full optimize.
    That value of 5 from the cube comes from merging the three partitions: FACT, FAC2 and FACTWB. Normally you do not need to care about the tables themselves; SSAS handles it based on the partition scheme.
    Data from the relational tables is correct in some cases, such as the WB partition. But if your WB partition is empty, the data comes from the OLAP partition, no matter whether it is base level or parent level.
    BPC uses some different ways of looking up the relational tables to get data faster, but it is at your own risk if you build something that looks up those tables directly. A sketch of the merge is below.
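    As a conceptual sketch of that merge (table names follow BPC MS conventions for an application called Finance, and the dimension columns ACCOUNT and TIMEID are illustrative; all names are assumptions):

        -- The reported value is the sum of the delta rows across all three stores:
        -- 10 (first send) + 10 (delta to reach 20) + -15 (delta to reach 5) = 5
        SELECT SUM(SIGNEDDATA) AS Amount
        FROM (
            SELECT SIGNEDDATA FROM dbo.tblFactFinance   WHERE ACCOUNT = 'Sales' AND TIMEID = '2012.JAN'
            UNION ALL
            SELECT SIGNEDDATA FROM dbo.tblFAC2Finance   WHERE ACCOUNT = 'Sales' AND TIMEID = '2012.JAN'
            UNION ALL
            SELECT SIGNEDDATA FROM dbo.tblFactWBFinance WHERE ACCOUNT = 'Sales' AND TIMEID = '2012.JAN'
        ) AS f;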

  • Experiences of Partitioning FACT tables

    Running BPC 7.0 SP3 for MS
    We have two very large FACT tables (195 million records and 105 million records), and these are currently growing at a rate of 2m and 5m records per month respectively; we are running an incremental optimize twice per day.
    It has been suggested that we consider partitioning the tables to improve performance, but I have not been able to find any users/customers with experience of doing this.
    Specifically:
    1. Does it improve performance?
    2. What additional complexity does it add to regular maintenance?
    3. Have there been any problems encountered implementing partitioned tables?
    4. It would seem that partitioning based on time would make sense (historic data in one partition, current data in another); HOWEVER, many of our reports pull current year and prior year, so will this cause a reporting issue or degrade report performance?

    I don't know if this is still an issue for you. You ask about FACT table partitioning specifically, but you should be aware that it is possible to partition either the FACT tables or the fact table partition of the cube, or both. We have used (further) partitioning of the fact table partition in the cube with success, and it sounds as if this is what you are really asking about.
    The impacts are on:
    1. Processing time: a full optimize without Compress only processes the partitions that have changed, thereby reducing the run time where there is a lot of unchanged data. You mention that you run incremental updates twice daily; this currently reprocesses the whole database. I would have expected the lite optimize to be more effective, supported by an overnight full optimize if you have an overnight window. You can also run the lite optimize more frequently.
    2. Query time: the filters defined in the partitions provide a more efficient path to data in the reporting processes than the defaults, which have the potential to scan large parts of the database.
    Partitioning is not a panacea. You need to be specific about the areas where you have performance problems and choose the improvement strategy to address them. Looking at the indexing of the database is also an area where you can improve performance significantly.
    If you partition the cube, it is transparent to the usage of the application, from both the user and the admin perspective. The greatest complexity is in the definition of the partitions in the first place, but this is a normal DBA function. The trick is to ensure that the filter statements do not overlap, otherwise you might get a value duplicated in 2 partitions, and to define a catch-all partition to include anything not covered by the specific partitions. You should expect to revisit the partitioning from time to time. It is quite straightforward to repartition; you are not doing anything to the underlying data in the FACT tables.
    Time is a common dimension to partition on, and you may partition at different levels of granularity for different periods, e.g. the current year by quarter or month, prior and future years by year. This reflects where the most frequent updates will be. It is also possible to define partitions based on combinations of dimensions; we use category and time, so that current-year actuals have the most granular partitions and all historic years' budgets go into a single partition. A table-level sketch is below.
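    For the relational side, a time-based layout could be sketched like this in T-SQL (purely illustrative; the function/scheme names and year boundaries are invented, and cube-partition filters are defined separately in SSAS):

        -- Partition function and scheme splitting rows by a year key
        CREATE PARTITION FUNCTION pfByYear (int)
            AS RANGE RIGHT FOR VALUES (2010, 2011, 2012);  -- catch-all buckets fall on both ends

        CREATE PARTITION SCHEME psByYear
            AS PARTITION pfByYear ALL TO ([PRIMARY]);

        -- A fact table created ON psByYear (YearKey) would then be split by year,
        -- e.g. CREATE TABLE dbo.tblFactFinance (..., YearKey int) ON psByYear (YearKey);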
