Loading of time dimension

Hi,
I started thinking about what the best way to load the time dimension is. Until now I have loaded data for 10-15 years in advance into the dimension to be done with it. One downside to this is that when you create or view reports, you might see the year 2015 even though, for obvious reasons, there won't be any data loaded for that year yet. Another thing I started thinking about the other day is how this may affect the performance and the load time of the cube.
If all dimensions are set to not be sparse and you have loaded 10 years that are not yet in use, the size of the cube, from what I understand, will be unnecessarily large. This will probably also have an effect on the load time.
If you set all the dimensions except for the time dimension as sparse, which is how AWM sets them by default, how will this affect the size of the cube?
Anyone have any thoughts around this?
Regards Ragnar

Hi,
I need to load the time dimensions to have data up to 2015. Moreover, we have just regular time dimension tables and no fiscal tables.
We have updated the fiscal data file to include data up to 2015 and updated the DAC parameter Analysis End Date to 31/12/2015. Data is loaded in these tables up to 2015, but the ETL is failing for the analyze W_ETL_RUN_S task (the param-g table contains 256 rows instead of just 1 record).
I checked the OBI guide and found that the parameters $$Start_date and $$End_Date need to be set.
I am not sure if Analysis End Date in DAC -> ETL Preferences is the same as $$End_Date. Also, I could not find a start date parameter; I have checked the mapping parameters as well.
It would be really great if you could tell me the exact steps to follow to extend the data warehouse.

Similar Messages

  • Populating the time dimension in ODI

    I need to populate my time dimension in ODI. I read a solution in this forum suggesting to create a time table/view in the source schema, reverse it in ODI and then use it as a source to populate the time dimension. Is there another way to do this? One way I thought of was to use the ORDERDATE field in my ORDER table (my source table in Oracle) and map it to my time dimension in SQL Server via an interface. But I also have DUEDATE, SHIPDATE and PAYDATE fields in my ORDERS table, and this approach would mean that I have to map them to the time dimension through separate interfaces as well. I have also created a procedure in the source schema (Oracle) and want to use it in ODI to populate the time dimension, but I am not sure if that is possible in ODI. Could anyone help me with this please?
    Regards,
    Neel

    Hi Neelab,
    Sorry for my delay in replying to you, I have had no time over the last few days...
    To get the four distinct keys from your time dimension, just add four instances of the dimension table to the interface, each one joined to one of the columns.
    I believe that you load your time dimension from some table other than PRJ_TBL_TRANSACTION, because you have the HolidayType column in your time dimension...
    A view is one possible solution to load the time table, but it depends on how well the query performs.
    One way to do it in ODI is:
    - Create 4 interfaces, one for each column, loading a single table with a single date column. Don't worry about duplicate values at this stage; you can just use the "IKM Control Append", which performs better, and check the "Distinct" box (Flow tab) on each interface.
    - Create a last interface with this temp table as source and the time dimension as the target table. Here use the "IKM Incremental Update", set the "Update" option to "NO", and check the "Distinct" box.
    As this table will hold only one row per day (a few thousand records for the last 20 years), it will be a small table where you shouldn't have performance problems.
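    In plain SQL terms, the net effect of those interfaces is roughly the following (a sketch only; tmp_dates, time_dim and calendar_date are hypothetical names, while the orders table and its date columns come from the post above):
        -- Consolidate the 4 source date columns into one distinct list of dates.
        INSERT INTO tmp_dates (date_value)
        SELECT orderdate FROM orders WHERE orderdate IS NOT NULL
        UNION
        SELECT duedate   FROM orders WHERE duedate IS NOT NULL
        UNION
        SELECT shipdate  FROM orders WHERE shipdate IS NOT NULL
        UNION
        SELECT paydate   FROM orders WHERE paydate IS NOT NULL;
        -- Then insert only the dates not already present in the time dimension
        -- (what "IKM Incremental Update" with Update = NO effectively does).
        MERGE INTO time_dim d
        USING (SELECT date_value FROM tmp_dates) s
        ON (d.calendar_date = s.date_value)
        WHEN NOT MATCHED THEN INSERT (calendar_date) VALUES (s.date_value);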
    These are some of the possible solutions, but I would like to add another way of thinking about it.
    From the table that you show here, you have a simple time table with no special features, so let me suggest another way.
    - in the current approach you join to the dimension, but you don't get the records that "fail" the join, since they will be excluded if a date does not exist in the time dimension
    My suggestion:
    - Load the time dimension table from your source table
    - as the PK of the time dimension table, use the "Julian day"
    - In the ODI target fact table (datastore), create 4 reference constraints (one per column) to the time dimension
    - In the interface, do not use the dimension as a source; transform the 4 dates to Julian and let the 4 constraints take care of whether they exist or not in the dimension table.
    OR
    - Look for the minimum "possible" date at your company
    - populate your time dimension with every single day from then until a future date (Dec 31 of some future year, for instance)
    - create a process to populate the future dates, executed at an interval that you decide (once a year, once a month, as you wish), depending on how far ahead the dates are populated
    - use the "Julian date" as PK
    - In the interface, just transform any date to a "Julian date"; it will already be in the time dimension, since it is naturally unique
    You could substitute "YYYYMMDD" for the Julian date, as that is a unique value too; a small sketch of this approach follows below.
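    A minimal sketch of this second approach (time_dim, date_key and calendar_date are placeholder names, and the date range is arbitrary):
        -- One row per day from a chosen start date up to a future date.
        -- YYYYMMDD is used as the key here; TO_CHAR(d, 'J') would give the Julian day instead.
        INSERT INTO time_dim (date_key, calendar_date)
        SELECT TO_NUMBER(TO_CHAR(DATE '2000-01-01' + LEVEL - 1, 'YYYYMMDD')),
               DATE '2000-01-01' + LEVEL - 1
        FROM   dual
        CONNECT BY LEVEL <= DATE '2030-12-31' - DATE '2000-01-01' + 1;
        -- In the fact interfaces, each date column is then just transformed with
        -- TO_NUMBER(TO_CHAR(src_date, 'YYYYMMDD')) and the constraints do the rest.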
    I presented 2 ways to be considered; which one to use depends on how important it is for the business to know whether a date was loaded or not.
    Someone could argue that loading only the dates that come from the source, as opposed to pre-loading all dates, helps to find errors from days that should have loaded but failed. But as there are 4 source date columns (and we are talking about just one source table so far), if a loaded date happens to match a date whose load failed, there is no value in using the time dimension dates to analyze this possibility.
    I favour the full time dimension load.
    Does that make sense and/or help you?

  • ERROR while loading time dimension table

    I need to load a time dimension from a CSV file into an Oracle table; while loading I got the following error.
    My source data type is date and the target is date.
    ODI-1226: Step sample day fails after 1 attempt(s).
    ODI-1240: Flow sample day fails while performing a Loading operation. This flow loads target table W_SAMPLE_DATE.
    ODI-1228: Task SrcSet0 (Loading) fails on the target ORACLE connection Target_Oracle.
    Caused By: java.sql.SQLException: ORA-30088: datetime/interval precision is out of range
    The error occurs while creating the C$ table:
    create table WORKSCHEMA.C$_0W_SAMPLE_DATE (
         C3_ROW_WID     NUMBER(10) NULL,
         C1_CALENDAR_DATE     TIMESTAMP() NULL,
         C2_DAY_DT     TIMESTAMP() NULL
    )

    Check the source data and use the correct conversion function, e.g. TO_DATE(SRC.DATE, 'MM/DD/YYYY'); use NVL if required.
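    One likely cause (an assumption worth verifying in your model) is that the CALENDAR_DATE and DAY_DT columns in the ODI datastore have no length/precision defined, so the generated DDL ends up with TIMESTAMP() and an empty precision, which Oracle rejects with ORA-30088. A valid staging DDL would need to look something like this:
        create table WORKSCHEMA.C$_0W_SAMPLE_DATE (
             C3_ROW_WID          NUMBER(10)  NULL,
             C1_CALENDAR_DATE    DATE        NULL,   -- or TIMESTAMP(6)
             C2_DAY_DT           DATE        NULL    -- or TIMESTAMP(6)
        )
        -- and in the mapping, convert the file field explicitly, e.g.
        -- TO_DATE(SRC.CALENDAR_DATE, 'MM/DD/YYYY')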

  • Time Dimension not loaded correctly when imported to 10.2.0.3 database

    Dear All,
    I created a Time dimension using the wizard that contains both Normal and Fiscal hierarchies, in a repository created on a 10.2.0.1 DBMS.
    I exported the project as an MDL file and imported it into a repository on a 10.2.0.3 DBMS... the Time dimension in the new project wasn't loaded correctly:
    the rows of the Day level contain information for the upper fiscal levels' columns, while the columns of the normal hierarchy contain nulls.
    Can you help me solve this problem?
    Best of regards,
    Shaimaa

    Your quote comes from the OBIEE Plug-in user guide (http://www.oracle.com/technetwork/database/options/olap/awm-plugin-user-guide-for-obiee10g-303148.pdf)
    So, yes, you can only use this if you are using database 11.1.0.7 or later and if you are creating an 11g style AW.
    You can map 10g style AWs to OBIEE, but it is a manual and complex task. You would need to start by generating views using (for example) the OLAP View Generator (10.2.0.3) available on http://www.oracle.com/technetwork/database/options/olap/olap-downloads-098860.html . After this you would need to manually create the mapping metadata in OBIEE.

  • How to load time dimension informatica...

    Hi, I'm new to data warehousing... Can anybody tell me how to load a time dimension in Informatica?
    Thanks and Regards,
    jagadish.

    Do you mean Informatica, the ETL tool? If you do then I would post your question on the Informatica website as we typically use Oracle Warehouse Builder and Analytic Workspace Manager here at Oracle.
    If you want to know how to generate a time dimension table, then I would recommend using the Time Dimension Wizard in Warehouse Builder, as this will automatically create the target dimension table and populate all the required levels with the correct data values. There is more information on the OWB OTN Home Page.
    Keith Laker
    Oracle EMEA Consulting
    OLAP Blog: http://oracleOLAP.blogspot.com/
    OLAP Wiki: http://wiki.oracle.com/page/Oracle+OLAP+Option
    DM Blog: http://oracledmt.blogspot.com/
    OWB Blog : http://blogs.oracle.com/warehousebuilder/
    OWB Wiki : http://wiki.oracle.com/page/Oracle+Warehouse+Builder
    DW on OTN : http://www.oracle.com/technology/products/bi/db/11g/index.html

  • Creating Time dimension in BW data model. - like seen in logical data model

    Hello all,
    I have been struggling with this thing and I am looking for some help from anyone on this forum.
    We are trying to create a logical data model of our BW system. We are going live next month with the Student module for universities. We have multiple InfoCubes and DSOs, and since there is so much crossing over between them, most of the reporting is done on InfoSets.
    One of the things we were thinking about: is it possible to create something like a common time dimension table for every InfoProvider? Basically, when we are providing the reports to the end user, can we give them a drop-down menu which offers a time frame for reporting rather than requiring them to select dates?
    Example: can we create something which looks in the drop-down like: current month's data, last month's data, three months ago, four months ago, five months ago, one year ago, two years ago? Can we make data slices like these in our cube and deliver them to the end user?
    We have a few date InfoObjects in our cube, like receipt date, decision date, cancellation date and the like.
    Please let me know if anyone has done anything similar; it would be very helpful.
    Thank you so much in advance.

    If you add your common time dimension to your data model, first identify for each InfoProvider the time against which 'current month' and the other frames should be applied, and map them to your dimension.
    Just a question... are you not using a time dimension in your cubes? Ideally this should be your time dimension linking them all.
    When you use a time dimension with values such as 'current month' and 'current year', you will have to address their historisation as well (because what is the current month now will no longer be current after 2 months).
    So in the data load procedure these values need to change every day (meaning drop and reload),
    with routines to populate these values based on the reporting date; a relational sketch of the idea follows below.
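    SAP BW specifics aside, the relational idea is a daily refresh along these lines (table and column names are made up for illustration):
        -- Re-derive the relative-period label for each calendar month every day,
        -- so 'Current month', 'Last month', etc. stay correct without reloading facts.
        UPDATE time_dim t
        SET    t.relative_period =
               CASE MONTHS_BETWEEN(TRUNC(SYSDATE, 'MM'), TRUNC(t.calendar_date, 'MM'))
                 WHEN 0  THEN 'Current month'
                 WHEN 1  THEN 'Last month'
                 WHEN 3  THEN 'Three months ago'
                 WHEN 12 THEN 'One year ago'
                 WHEN 24 THEN 'Two years ago'
                 ELSE NULL
               END;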
    Edited by: hemant vyas on May 6, 2009 1:56 PM

  • How do I include 'Time' in a time dimension?

    I have a requirement where I need to classify the data in the warehouse based on date and time (for example: between 10 am and 11 am, or between 3 pm and 6 pm).
    The time dimension created by OWB contains only the date part, and when I load the data I use the expression TO_NUMBER(TO_CHAR(input,'YYYYMMDD')) to populate the date.
    But I would like to include the time as well in the expression, e.g. 'YYYYMMDDHHMISS'. Unfortunately the OWB-generated time dimension does not have data in this format.
    How do I do that? Any pointers?
    thanks in advance

    You should rather create 2 dimensions: Time and TimeOfDay. The first one will have all the date values with their characteristics. The second one will hold all the possible HH24:MI:SS values with their characteristics, for example which hour it is and which part of the day it is (Morning, Afternoon, Evening, Midnight, or whether it's lunch time or not), depending on the requirements.
    If you try to combine them both into 1 dimension you will get 86,400 dimension records every day. Your dimension table will probably become bigger than the fact tables.
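    As a rough illustration (hypothetical table and column names, and one row per minute rather than per second to keep it small), the TimeOfDay dimension could be generated like this:
        -- 1,440 rows, one per minute of the day, with hour and part-of-day attributes.
        INSERT INTO time_of_day_dim (time_key, time_label, hour_of_day, day_part)
        SELECT LEVEL - 1,
               TO_CHAR(TRUNC(SYSDATE) + (LEVEL - 1) / 1440, 'HH24:MI'),
               TRUNC((LEVEL - 1) / 60),
               CASE
                 WHEN (LEVEL - 1) / 60 < 6  THEN 'Night'
                 WHEN (LEVEL - 1) / 60 < 12 THEN 'Morning'
                 WHEN (LEVEL - 1) / 60 < 18 THEN 'Afternoon'
                 ELSE 'Evening'
               END
        FROM   dual
        CONNECT BY LEVEL <= 1440;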

  • Modifying Existing Time Dimension

    Hi All,
    Presently I have a Time dimension and a fact table which has a FK to this dimension. The Time Dim has levels down to hourly values.
    Now it's in production and I need to change the Time Dim to go down to minutes, because some reports need minutes.
    1. So what is the best way to go about this?
    2. Is it better to just add an extra field, holding the required date, in the fact table?
    3. How can the existing data be easily ported to the new Time Dim (if it has a minute level)?
    Thanks,
    Justin.

    I am wondering how you implemented the time dimension which only has a level up to hourly values. Do you mean the time dimension only has 24+1 = 25 rows? Usually the time dimension is separate from the date dimension and its lowest level values are minutes or seconds, depending on the business requirements. Therefore the time dimension should have 24X60 + 1 or 24X60X60 +1 rows.
    The solution depends on what you want.
    My suggestion is as follows:
    1. Create a new time dimension with the desired lowest level values (minutes or seconds).
    2. Create or modify the foreign key of the fact table to point to the newly created time dimension.
    3. Modify the ETL mappings.
    4. Option 1: If you want the new functionality to be available for the existing rows in the fact, you have to reload the fact no matter which approach you use.
    Option 2: If you only want the new functionality for the subsequent refresh loads, you can remap the foreign-key values pointing to the old time dimension to the new time dimension, using the rule XX --> XX:00 or XX --> XX:00:00 (where XX is the hour number in the old dimension); a sketch follows below. In the subsequent refresh loads, map the foreign-key values to the actual time.
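    A minimal sketch of that option 2 remapping (assuming surrogate keys in both dimensions; fact_table, old_time_dim, new_time_dim and the column names are placeholders):
        -- Point existing hour-level fact rows at the ':00' member of the new dimension.
        UPDATE fact_table f
        SET    f.time_key = (SELECT n.time_key
                             FROM   old_time_dim o
                             JOIN   new_time_dim n
                                    ON n.time_value = o.hour_value || ':00'
                             WHERE  o.time_key = f.time_key);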
    Maybe someone else can provide better solutions. If so, please let me know.
    Good luck!

  • How to build dynamic time series for the time dimension

    I am planning to build Dynamic Time Series using a rule file instead of doing it manually. Please let me know if there is any property that needs to be assigned to enable the DTS property for the TIME dimension.
    Edited by: 844104 on Mar 14, 2011 3:37 AM

    In the load rule, in the dimension build settings, go to the Dimension Definition tab, choose the time dimension and right-click on it. Select Edit Properties. If you have not done so, set the dimension to be the time dimension. Then go to the Generations/Levels tab and set the generation names you need. For example, if you want YTD, you would set the generation name to Year; if you want QTD, set it to Quarter. You would set the number to the generation number that corresponds to that generation. The DBAG has the list of names for all of the DTS members.

  • How to time dimension

    How do you link a time dimension to the fact table?
    If the time dimension has a sequence as the PK and then the following columns:
    year
    month
    quarter
    day
    time
    etc..
    how is the fact table connected to the time dimension? Should the fact table hold the
    time dimension's primary key as a FK?
    -Jim

    Hi Ragnar
    Because a date is in fact stored as a number, you don't need a surrogate key for your time dimension. You can make the date itself the primary key. Then you just join from the fact to the time dimension on the date.
    Oh yes, you would need to populate the time dimension before loading the fact table. But then this is true of all star schemas. You must populate the dimensions before you populate the facts.
    However, in the case of time, because it never changes over time (pardon the pun), you could load years of dates way out into the future. Then you wouldn't have to worry about it for a long while.
    I have customers using E-Business Suite where the time dimension is fed from GL_PERIODS and BOM_CALENDAR_DATES. Because the periods are entered once per year in most cases, they only need to run the load once per year. Materialized views of time work well too.
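    A minimal sketch of that design (table and column names are placeholders; it assumes the dates are truncated to midnight):
        -- The date itself is the primary key of the time dimension.
        CREATE TABLE time_dim (
          calendar_date  DATE PRIMARY KEY,
          day_name       VARCHAR2(20),
          month_name     VARCHAR2(20),
          year_number    NUMBER(4)
        );
        -- Fact rows carry the same DATE value as a foreign key.
        CREATE TABLE sales_fact (
          order_date  DATE REFERENCES time_dim (calendar_date),
          amount      NUMBER
        );
        -- Reporting is then a plain equi-join on the date.
        SELECT t.month_name, SUM(f.amount)
        FROM   sales_fact f
        JOIN   time_dim t ON t.calendar_date = f.order_date
        GROUP  BY t.month_name;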
    Hope this helps
    Regards
    Michael

  • TIME_DSO_1 attribute in Time Dimension

    Hi,
    We are using Oracle Analytic Workspace Manager version 10.2.0.3.0A for creating a cube.
    We have defined a dimension as "Time" dimension and have also specified the dimension type as "Time".
    This dimension has the levels: Year, Quarter, Month, Week and Day. These levels are for the one and only hierarchy of the Time dimension.
    The attributes of the dimension (applicable to all the levels) are: END_DATE, LONG_DESCRIPTION, SHORT_DESCRIPTION, and TIME_SPAN.
    With this structure, we have a strange observation: during the process of defining this TIME dimension, there were some problems with the machine (it hung), so we had to kill AWM and start it again. When we logged in to AWM again, we could see a new attribute called TIME_DSO_1 that had been defined within the Time dimension by itself.
    Can anyone let us know what this attribute is all about? And can we go ahead and just delete it from our dimension structure without creating problems for ourselves?
    Many thanks in advance for the kind inputs of the forum.
    Regards,
    Piyush

    Hi,
    I can't say that I really understand how your data shows up, but I do see an error in your hierarchy.
    What you should do is split it up into two hierarchies.
    Year-> Quarter-> Month-> Day
    and
    Year-> Week-> Day
    And here is the reason:
    A day can easily have both a week and a month as a parent, but a week cannot have a month as a parent. Since a month can end on any given day of a week, a week could have two parents. If I take next week as an example: Mon, Tue, Wed, Thu and Fri would belong to the month of August, while the rest of the week would belong to September. But the entire week would still be week 35. And it's the same issue with quarters: months can be added up into a quarter, but weeks cannot.
    Now, I'm not sure if this will solve all your problems, but there is another thing you might check, and that is which IDs you populate your member fields with.
    Say that you have used the month numbers (or at least the same ID for every January, and so on) to populate your months. If that is the case, things will also behave extremely weirdly and slowly. What you need to do is to create unique IDs for every month (and every other level) so that you are sure that January 2007 doesn't have any child records that really belong to January 2006.
    As a general rule, the member field needs to contain a value that is completely unique across all levels. The "generate surrogate key" functionality helps you part of the way, as it adds the name of your level in front of the value that you load. But if the values you load are not unique within the level, it won't help you.
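    For example (a sketch with made-up source names), level-prefixed member IDs guarantee that every member is unique across levels and across years:
        -- January 2007 becomes MONTH_2007_01 and can never collide with January 2006,
        -- nor with a day or year member.
        SELECT DISTINCT
               'DAY_'   || TO_CHAR(day_date, 'YYYYMMDD') AS day_member_id,
               'MONTH_' || TO_CHAR(day_date, 'YYYY_MM')  AS month_member_id,
               'YEAR_'  || TO_CHAR(day_date, 'YYYY')     AS year_member_id
        FROM   time_source;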
    Hope this can help you some more... ;)
    regards Ragnar

  • OWB 904 - problems creating time dimension

    I'm using an Oracle9iR2 DB and OWB 904. I don't have much experience with OWB 904 yet, and I have problems with creating the time dimension.
    I tried to work through the example which comes with OWB 904 and did everything as it is written in the readme.txt file:
    1. loaded the time table functions owb_time_seq.sql and owb_time.sql in my runtime user schema
    2. imported the owb_bp_time.mdl file into my design repository, which created the demo project 'OWB_BP'
    The problems which I have with this demo are:
    1.) I get warnings when I validate the mappings (TF_TIME_MAP):
    VLD-1002: Mapping object T_TIME is not bound to a repository object.
    VLD-1004: Column length of L_DAY_NAME is longer than the target column length.
    VLD-1004: Column length of L_MONTH_NAME is longer than the target column length.
    VLD-1004: Column length of L_QUARTER_NAME is longer than the target column length.
    VLD-1004: Column length of L_YEAR_NAME is longer than the target column length.
    VLD-3260: No Output Attribute name is specified. In this case, the attribute’s physical name will be used.
    VLD-1123: Missing location information for Module WAREHOUSE.
    VLD-1115: Commit frequency is defaulted to Bulk Size.
    I know how to solve the errors VLD-1002, VLD-1004, VLD-1123 and VLD-1115, but I have no idea how I should solve VLD-3260 and VLD-1002, because reconcile outbound is not possible for the T_TIME dimension.
    2.) When I try to import the table function
    TIMEDATA (IN VARCHAR2, IN NUMBER) return TABLE
    from the runtime user schema into my design repository I get the error message: Argument Data type is not supported.
    3.) I can't deploy the mapping TF_TIME_MAP.
    Is it possible to use the table function from the demo to create a time dimension and import it into my own project? My aim is to create a bean compliant time dimension and I want to know what I have to bear in mind to accomplish that. The OWB 904 User's Guide didn't give me enough information, therefore I'm asking if you can help me out.
    Thanks in advance,
    Dirk

    Dirk,
    The way you should approach the time dimension is the following:
    - Run the SQL scripts into your target schema (you already did).
    - Import the MDL file (you already did).
    - Copy and paste the times dimension into your own project to be able to use it.
    - Copy and paste the mapping you want to use to your own project.
    - If necessary, modify the times dimension according to your needs; change the mapping accordingly.
    - Open the mapping, do a right mouse click on the time dimension and select 'Reconcile inbound'. Select matching strategy to match by bound name.
    We do not currently support the table function as an object in the metadata repository. I.e. if it exists at runtime then you can call it (as the time dimension load mappings do).
    With the objects in the target schema you should be able to deploy the mapping.
    Thanks,
    Mark.

  • Time Dimension Type allows different values in attributes - Bug or Feature?

    Not sure if this is a bug or a feature, but if one has multiple hierarchies on a Time dimension, you have the ability to specify different values for member attributes in different hierarchies.
    Example.
    Hierarchy A has MIN_ID for its member and uses MIN_END_DATE for its END_DATE
    Hierarchy B has MIN_ID for its member and uses SESS_END_DATE for its END_DATE
    As per this post and David Greenfield's comment:
    Dimension Sort issue when multiple mappings for different hierarchies
    "Are you attempting to map the same attribute, SORT, to different columns in the two hierarchies? Put another way, do you expect the same member to have different values for the attribute in the two different hierarchies? If so, then this is a problem since a member must have the same value for the attribute regardless of the hierarchy."
    Unlike a user dimension, a time dimension appears to allow this and it appears to work as intended. Is the behavior in this case intended to be different between a user and time dimension?

    I think that this is not a bug. There is an incompatibility in the design which prevents you from using the same attribute differently for the two hierarchies.
    NOTE: Unlike parent relationship which depends on <dimension, dimension hierarchy>, Dimension Attribute is dependent on <dimension> alone, not dependent on <dimension, dimension hierarchy> combination. Hence it can only take on 1 value for 1 dimension member.
    I think that the time dimension only appears to allow this. The key thing to check is the Time dimension members which are common to both hierarchies. Only one of the mappings will take effect (usually the hierarchy which is loaded last will remain in the AW and be usable for queries and reports; it will have overwritten the earlier attribute value loaded as per the earlier hierarchy load).
    Visualize a dimension as a long list of members which are built up contiguously on a per-hierarchy, per-level basis using the mapping information saved. Once a member is defined (created) via Hierarchy A, it won't be created again while loading Hierarchy B, but is instead updated or redefined based on Hierarchy B's mapping info.
    Assuming the dimension load attempts to load Hierarchy A first and then Hierarchy B,
    * Dimension load for Hierarchy A will define the various members using MIN_ID and set the END_DATE attribute to value=MIN_END_DATE
    * Dimension load for Hierarchy B will re-define the various members using MIN_ID and re-set or over-write the END_DATE attribute to value=SESS_END_DATE
    * In this case, it looks like all members are common to both hierarchies (as both hierarchies' members are mapped to the same column MIN_ID) and you would end up with END_DATE=SESS_END_DATE.
    Whether all members really are common to both hierarchies depends on the quality of the data in your snowflake/star table: if the parent level for both Hierarchy A and Hierarchy B is set up correctly, then the member sets will be the same (overlapping in whole). If some rows for MIN_ID have the parent column for Hierarchy A set up correctly but the parent column for Hierarchy B is null or an invalid value, then that member will exist in Hierarchy A alone and will retain END_DATE=MIN_END_DATE, as the corresponding update along Hierarchy B would fail due to the hierarchy data-quality issue (the join from the current level to the parent level).
    As regards a solution to your problem, you should not use the same attribute "SORT" for a dual purpose (both hierarchies). Instead, define attributes SORT_A and SORT_B, enable them for Hierarchy A and Hierarchy B respectively, and map/use them appropriately in your reports.
    HTH
    Shankar

  • Primary Key Data type in Time Dimension????

    I have to create a Time dimension with day grain in a data warehouse system, and I don't know what the best data type for the primary key is...
    For example
    1) I could use the NUMBER(8) datatype, so the dates would be: 20050114, 20050115, 20050116... Then in the fact tables I would use the NUMBER(8) datatype for the date fields... But in my reporting tools I would have to apply the to_date function to show the dates in the right format.
    2) Or I could use the DATE datatype, so the dates would be: 01/14/2005, 01/15/2005, 01/16/2005... Then in the fact tables I would use the DATE datatype for the date fields...
    Is a DATE primary key a bad datatype? (Very slow?)
    What is the best primary key data type for a Time dimension?
    Thanks!

    <quote>I have to create a Time dimension with day grain</quote>
    OK.
    <quote>But in my reporting tools I have to put the to_date function to show the dates in the right format</quote>
    Why? ... if you’ve decided to have a Day dimension table what is stopping you from having the day represented as a DATE column in there? (plus all the other "right formats" you may need). The join keys should only be used for … joining.
    <quote>Is a DATE primary key a bad datatype? (Very slow?)</quote>
    No … DATE or NUMBER won’t make any noticeable difference when used as the join key between the time dimension and the fact table.
    Some see advantages in having the DATE FK in the fact table …
    1. One can have range partitioning in the fact using real DATEs
    2. One can get Day-derived info right from the fact table … that is, the join to the Day dimension is not needed
    I don’t (see them as advantages) … for #1, range partitioning by some measure of time is still achievable as long as the PK values on the Day dimension are immutable (as they should) … and I don’t see #2 as an advantage for the DW end-users.
    Personally, I prefer a surrogate numeric PK … but not things like 20050129.
    <William>If you need dates then use dates. They are more robust, you can never accidentally have November 43rd</William>
    Of course this cannot be about the fact table … since there the column is constrained … so this is about accidentally getting "November 43rd, 2004" into the Day dimension when the PK is numeric 20041143; true, but is a DATE PK more robust in this case? … No … one could accidentally insert to_date('29-Jan-2005','DD-Mon-YYYY') and to_date('29-Jan-2005 00:00:01','DD-Mon-YYYY HH24:MI:SS') and that wouldn't be very good, would it? In both cases, one would need something extra to 100% protect the integrity of the Day dimension (mind you, loading the Day dimension probably happens once a year under the supervision of the most technical people on the project).
    There is no best PK data type for the Day dimension (between NUMBER and DATE) … they are both workable solutions ... go with what you’re comfortable.
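    For what it's worth, the dimension can carry both: a sketch (placeholder names) of a Day dimension keyed on a numeric surrogate with the real DATE alongside it:
        CREATE TABLE day_dim (
          day_key     NUMBER(10) PRIMARY KEY,   -- surrogate, deliberately meaningless
          day_date    DATE       NOT NULL UNIQUE,
          day_name    VARCHAR2(20),
          month_name  VARCHAR2(20)
        );
        -- The fact stores day_key; any report that needs a real date joins to day_dim
        -- and uses day_date directly, so no to_date conversion is needed.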

  • Time dimension with Hourly base time periods

    Hi all
    I need to analyze data at the Hour, Day, Month, and Year levels. The data in the fact and dimension tables is at the 'Hour' level with the DATE datatype, such as:
    02-SEP-10 10:00:00 AM
    02-SEP-10 11:00:00 AM
    To use Time-Series type calculations, I understand that I have to create an OLAP dimension of type 'TIME' (and not 'USER') and map it to the populated relational time dimension table.
    1) Can I have the primary key for 'Hour' level as the actual base level value of datatype DATE (eg. 02-SEP-10 10:00:00 AM) ?
    2) For the END_DATE and TIME_SPAN attributes at the 'Hour' level, what should I use?
    The documentation only covers hierarchies whose lowest level is 'Day', for which END_DATE and TIME_SPAN can be set to the actual 'Day' value and 1, respectively.
    3) For the END_DATE and TIME_SPAN attributes at the 'Month' level, do I need to supply the last-date-of-each-month and number-of-days-in-that-month, respectively?
    Please bear in mind that I am relatively new to Oracle OLAP. Any assistance will be appreciated.
    Cheers.

    Thank you Szilard and Adnan for the very prompt and informative responses.
    I managed to follow the advice on the oracleolap.blogspot link and created a time dimension with members at the Hour level loaded into the dimension in character format: TO_CHAR(hour_id, 'DD-MON-YYYY HH24')
    The problem now is that the maintenance (loading) of the dimension is taking an abnormally long time (over 1 hour), as opposed to when the members were being loaded in DATE format (5 minutes). The mapping table only has 10,000 entries.
    Why is there such a big difference? Is it normal? Is there a way to speed up the maintenance time?
    FYI, I have not created any indexes on any of the attributes.
    My platform is:
    11.1.0.7.0 DB
    11.1.0.7.0B Client
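    On the earlier END_DATE / TIME_SPAN question, hour-level mapping rows could be generated along these lines (a sketch only; it assumes TIME_SPAN stays expressed in days, so an hour is 1/24, which is worth confirming against the OLAP documentation for sub-day levels):
        -- One row per hour: character member key, end of the hour, and span in days.
        SELECT TO_CHAR(hour_start, 'DD-MON-YYYY HH24')      AS hour_member,
               hour_start + 1/24                            AS end_date,
               1/24                                         AS time_span
        FROM  (SELECT DATE '2010-09-01' + (LEVEL - 1) / 24  AS hour_start
               FROM   dual
               CONNECT BY LEVEL <= 24 * 365);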
