Loading data at parent levels
I have actuals data available at the day level, but plan data is available only at the month level. Month is level 1, and I'm thinking about loading plan to this level. Are there any problems with doing this? What do I need to be careful about? I will be doing a nightly level 0 export of the data, clearing the database, reloading the export, then loading new data and calculating the cube. Thanks for your help.
Well, there are a few problems to worry about. First, you must ensure that your database is not set to Aggregate Missing Values (a database setting); otherwise your upper-level values will be set to #Missing if all of their children are #Missing.
Second, if you want to write back to level 1 of Time, you cannot make those members Dynamic Calc, which takes away a big optimization tool. It is probably easier (from a cube admin perspective) to either create dummy members to load the plan to, or to choose one month in each quarter in which to input the plan. At the upper levels of Time, the data will look the same as if you had input it at that level.
Regards,
Jade
----------------------------------
Jade Cole
Senior Business Intelligence Consultant
Clarity
[email protected]
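As a rough illustration of the first warning, here is a hedged Python sketch (not Essbase calc script; the member names and the `aggregate` function are invented for illustration) of what happens to a parent-loaded value when the calc rolls children up:

```python
# Simulate a tiny Time hierarchy: Q1 has children Jan, Feb, Mar.
# Plan was loaded directly to Q1; the children hold no data (#Missing -> None).
MISSING = None

def aggregate(parent_value, child_values, aggregate_missing_values):
    """Roll children up into the parent, mimicking the BSO calc behavior."""
    loaded = [v for v in child_values if v is not MISSING]
    if not loaded:
        # No child data: with "Aggregate Missing Values" ON the parent is
        # overwritten with #Missing; with it OFF the loaded value survives.
        return MISSING if aggregate_missing_values else parent_value
    return sum(loaded)

plan_q1 = 3000.0                        # plan input directly at Q1 (level 1)
children = [MISSING, MISSING, MISSING]  # Jan, Feb, Mar hold no plan data

kept = aggregate(plan_q1, children, aggregate_missing_values=False)
wiped = aggregate(plan_q1, children, aggregate_missing_values=True)
```

With the setting off, `kept` keeps the 3000.0 that was loaded at Q1; with it on, `wiped` becomes #Missing, which is exactly the data loss the reply warns about. Note also that if any child does hold data, the roll-up replaces the parent value with the child sum either way.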
Similar Messages
-
Loading data to Parent level using ERPi
Hi All,
We have a scenario where we have to load data from GL_Balance to Planning and HFM at parent level using ERPi. Is it possible to load data at parent level using ERPi?
SVK
FDM does not impose any restrictions on what you can load into HFM; it just respects the load constraints of the target system, i.e. if you can't do something directly in HFM you won't be able to do it via FDM. In HFM, data can only be loaded to base-level members, not parents; parents are calculated and are not stored in the HFM database. Therefore you will not be able to load to parent-level members in HFM. You can sometimes have exceptions to this, but only if the HFM developer has written specific rules to allow it.
-
Is it possible to load data at aggregated level directly
Hi All,
My question may sound vague but still would like to clarify this.
Is it possible to load data at some higher level than the leaf level? The reason for asking this question is that we are facing severe performance issues while loading the cube.
We are trying to reduce the number of members, which in turn can reduce the loading time. In this attempt to reduce the number of members, the client said that there are 3 million leaf members (out of 4.3 million total members) which they do not care to see on reports, but they still want the totals to be correct. I.e., the dimension would be loaded with only 1.3 million (4.3 - 3) members, but the totals would still be correct (including the 3 million leaf members).
Is it possible to have two loads, one at leaf level only and the other at parent levels and above?
DB - 10.2.0.4
Also I want to know when do we use allocmap? how does this work?
Regards,
Brijesh
Edited by: BGaur on Feb 13, 2009 3:33 PM
Hi Carey,
Thanks for your response.
I worked on your suggestion and I was able to load data at a higher level for a value-based hierarchy. But I ran into another problem while doing this.
I am describing this as level-based, but while loading we are converting to a value-based hierarchy.
We have the following levels in the customer dimension, which has two hierarchies:
hier1
lvl1
lvl2
lvl3
lvl4
lvl5
prnt
leaf
hier2
level1
level2
level3
level4
prnt
leaf
So the prnt and leaf levels are common to both hierarchies, but we were facing multipath issues in the second hierarchy, and we worked around them by concatenating the level name.
In an attempt to decrease the number of members in this dimension (currently 4.3 million), the business suggested that there are certain kinds of customers for which they don't want to see data at the leaf and prnt levels; instead, level4 or lvl5 should hold the total for those members. This way we need not load those members, and they make up 2.4 million out of the 4.3 million.
To implement above I did following.
1. Created six hierarchies.
one to have members from level4 till level1
second to have members from lvl5 till lvl1
third to have all members of hier1
fourth to have all members of hier2
fifth will have leaf and prnt of hier1
sixth will have leaf and prnt of hier2
In fifth and sixth hierarchy leaf is common but prnt is different due to the concatenation.
In the cube I am selecting the aggregation hierarchies as the first, second, fifth, and sixth; the third and fourth hierarchies will be used for viewing the data.
The fact will have data corresponding to the leaf level, level4, and lvl5.
The challenge I am facing is that if some fact value is loaded at the leaf level through the relational fact, but no value is loaded for lvl5 or level4, I am not seeing the total value: the leaf value aggregates up to the prnt level, but with no value at level4 or lvl5, that gap is propagated up to lvl1 and level1.
I tried to be as clear as possible, but if there is still any confusion, please update the thread. I understand that the approach is vague, but I am not seeing any other way.
Thanks
Brijesh -
Can we load data for all levels in ASO?
Hi All,
I'm creating a cube in ASO.
Can I load data for all levels in ASO?
We can load data at all levels in BSO, but for ASO I need confirmation.
And one more:
What is the "consider all levels" option in ASO used for? What is its purpose?
Can anyone help? It would be appreciated.
Thanks
In an ASO cube you can only load to level zero.
The "consider all levels" option is used for aggregation hints. It allows you to tell the aggregation optimizer to look at all levels when deciding whether aggregation needs to be done on the dimension -
Can we load data for all levels in ASO cube
Hi All,
Can we load data for all levels of members in an ASO cube in 9.3.1?
Regards
Yes, you can attempt to load data for all levels in an ASO cube in any version; HOWEVER, none of the upper-level data will be there when you look for it. You will get a warning message during the load because ASO cubes don't store data at upper levels. It is the same as loading data into Dynamic Calc members in a BSO cube: it will do the load without complaint, but there will be no data there (at least in ASO you get the warning).
-
Advice on inserting data at parent level
Hi experts,
I would need some advice on how I can enter data at parent level. Since we cannot enter data at parent level directly, I am figuring out different strategies to do so. The one I like the most is to create members representing the parents, where we can input the data. For example:
- 2008.Total (Parent)
+ 2008.Generic (Leaf)
+ 2008.Jan (Leaf)
+ 2008.Feb (Leaf)
+ etc....
I would need to do this in different dimensions in my application. So here are the questions:
1) Is there anything wrong about doing this?
2) Is there a better option? (what is the Best Practice in this case?)
3) In the case of doing this for the TIME dimension, there are some properties whose values are not easy to set for those members (TIMEID, YEAR and LEVEL). I don't really know where and how those properties are used, and the help doesn't help me much (what is the "time format required for Analysis Services"?). So what values have you used for these properties on such members?
Thank you very much,
Rafael
Hi Rafael,
What purpose is your application designed for? In a legal consolidation application, I would never mess with the standard time dimension setup. The setup of the Time.Level and Time.Year properties is very sensitive in the operation of some of the business rules, particularly the opening balances & account transformation rules (where there's a time-period offset functionality). I always try to retain the standard time dimension setup there.
This means, for example, using a separate datasrc member (or members) if the customer insists on having a "13th period" for year-end adjustments. If those adjustments don't go into December along with the standard December values, then the carry-forward opening balances for the following year can be very tricky to calculate.
If your application is purely analytics, or for planning, then you should still understand what business rules features you want to use before deciding the time dimension design. When I've had year-total input data, I either
1.) use a standard 12-month time dimension, and by practice always put total-year data in December. Set up input schedules appropriately, and train people appropriately. Use Dec not Jan, or else the year-total and periodic/YTD computations in the cube won't work correctly. And consider your Account.Acctype carefully.
2.) If the data is for annual values that aren't directly part of the P&L or B/S, and are things like allocation drivers, tax rates, inflation rates, stuff like that -- which doesn't get added up along with the monthly values to come to a total-year "full picture" -- then I follow the structure from the old Outlooksoft demo. Create each of your years with the standard 12 months. Then create
Input.Year
xxxx Input.Quarter
xxxxxxxx 2008.Input
xxxxxxxx 2009.Input
xxxxxxxx 2010.Input
etc.
Then set your YEAR = 2008,LEVEL=MONTH for 2008.Input, etc.
This approach works well in logic *LOOKUPs and also keeps the dimension structure clean & intuitive for end users. If you put your 2008.Input under 2008.Q1 or 2008.Q4, then every time someone lays out or drills down in a report, they will curse the designer, since they now need to remove that extra column of the 2008.Input member.
With the datasrc approach for the "13th month" you can control whether the year-end adjustments are there or not, for those few reports where you'll want to see them. And 99% of the time, your time dimension "looks" normal.
I'm sure there are other approaches beyond what I've discussed, but I've never gotten into trouble using one or both of these two.
Regards,
Tim -
Loading data to Parent in ASO cube
I created an Excel data file to test different load scenarios. One scenario was using a parent level in one of the dimensions. I expected the record to show up in the dropped records file but it doesn't. Does anyone know why it doesn't? Is there any way to tell that the record wasn't loaded? We are on v11.
You get a warning message 1003055: "Aggregate storage applications ignore update to derived cells. [X] cells skipped".
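A hypothetical pre-load check in Python can mimic that behavior (the outline dict and member names are invented; ASO itself does this internally and only reports it via warning 1003055):

```python
# Mimic how ASO treats load records: records targeting non-level-0
# (derived) members are skipped rather than written to a rejects file.
# The outline maps each member to its list of children; an empty list
# means the member is level 0 and therefore loadable.
outline = {
    "Jan": [], "Feb": [], "Mar": [],
    "Q1": ["Jan", "Feb", "Mar"],   # derived: has children, not loadable
}

def split_load_records(records):
    """Return (accepted, skipped): only level-0 targets are accepted."""
    accepted, skipped = [], []
    for member, value in records:
        (skipped if outline.get(member) else accepted).append((member, value))
    return accepted, skipped

records = [("Jan", 100.0), ("Q1", 999.0), ("Feb", 50.0)]
accepted, skipped = split_load_records(records)
```

Here the Q1 record lands in `skipped`, which is why the poster's parent-level record never shows up in the dropped-records file: it is silently ignored as a derived cell, and only the warning's skip count reveals it.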
-
Loading to Parent level Period using multiload template
Hi Experts,
Is it possible to load data to a parent-level period using the FDM multiload template to HFM? Or does the multiload template allow loading only to leaf-level periods?
Thanks
Kannan.
FDM does not impose any restrictions on what you can load into HFM; it just respects the load constraints of the target system, i.e. if you can't do something directly in HFM you won't be able to do it via FDM. In HFM, data can only be loaded to base-level members, not parents; parents are calculated and are not stored in the HFM database. Therefore you will not be able to load to parent-level members in HFM. You can sometimes have exceptions to this, but only if the HFM developer has written specific rules to allow it.
-
Base level entity data is not rolling up to parent level entity for a few accounts
Hi guys,
Can someone please help me? I am not able to consolidate base-level entity data to the parent level for a few accounts.
All the remaining accounts are consolidating from base level to the parent-level entity.
The accounts which are not rolling up to the parent level are a few flow accounts and some dynamic accounts.
Thanks in advance. Please reply at your earliest.
Rama.
Guys, this is HFM consolidation. I hope there are no time balance settings in the account attributes, and that we have not assigned a group label to the parent entity; if the parent is a group label, base entities will not aggregate to it. But here only a few accounts are not aggregating to the parent entity.
Thanks / Rama -
Load cube data at multiple levels
Hi All,
I have a time dimension with following hierarchies.
all_time --> Year --> Quarter --> Month
all_time --> Week
My fact table contains data at week and month level. I mapped my fact table time_key to both levels (Week and Month) in AWM, but when I load data into the cube, AWM loads data only at the week level and ignores all the month fact rows in the fact table. How can I load data at both levels?
OLAP Version: 10.2.0.4
Thanks
Dileep.
I am trying to wrap my mind around your design. It appears that you have a fact table with two levels of granularity. Why not just have one hierarchy with granularity at the week level?
All Time --> Year --> Quarter --> Month --> Week
Do you have some facts that relate only to a month level? If so, can you just relate the month total to one of the weeks in the month? Or is it acceptable to just have data at the month level and eliminate the week level?
If your week and month data are mutually exclusive, you could run into problems. When you look at data rolled up to the year level, for example, it would exclude the data that was inserted at the week level.
When I think of multiple hierarchies of time, I first think of calendar years versus fiscal years. The two hierarchies are completely unique due to a business requirement, yet they are clearly related as measures of time (years, months, etc.). But a fiscal month never rolls up into a calendar quarter, or a calendar day into a fiscal week or year, because the hierarchies are distinct. But the fact table is at only one level of granularity (day or week or month), no matter how it rolls up into the separate hierarchies.
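The mutual-exclusivity problem mentioned above can be sketched in a few lines of Python (the member names and the two parent maps are invented, not AWM structures):

```python
# Facts loaded against week members are invisible to a rollup that walks
# the Year -> Quarter -> Month hierarchy, because weeks live in a
# separate hierarchy under all_time.
month_parent = {"Jan": "2008", "Feb": "2008"}   # month -> year path
week_parent = {"W01": "2008", "W02": "2008"}    # week -> all_time path (unused by the month rollup)

facts = {"Jan": 100.0, "Feb": 200.0, "W01": 40.0}  # mixed-granularity fact rows

def rollup_year_via_months(year):
    """Aggregate only along the month hierarchy, as the cube would."""
    return sum(v for k, v in facts.items() if month_parent.get(k) == year)

year_total = rollup_year_via_months("2008")  # misses the 40.0 loaded at W01
true_total = sum(facts.values())
```

`year_total` comes out 40.0 short of `true_total`: exactly the "data rolled up to the year level excludes the week-level inserts" problem the reply describes.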
At first glance, your two hierarchies look like they should be one. -
WRITE_BACK BADI for Parent Level Data Save
Hi Experts,
I am using the write-back BADI for parent-level data saving in BPC 10, but my requirement is to save data at the parent level for time period 2007.Q1 without breaking it down to the corresponding base-level members.
Is this possible? If yes, please advise.
I would highly appreciate a quick response.
Regards,
Dipesh Mudras
Hi Dipesh,
BPC NetWeaver related questions should be posted in the corresponding forum, cf.
SAP Planning and Consolidation, version for SAP NetWeaver
Regards,
Gregor -
Loading facts at multiple levels
Hi
I have a cube with six dimensions. For one of those six dimensions I get data at all levels, so what I want to do is load data at all levels and choose no aggregation for that particular dimension. I need levels in that dimension because I have other cubes in the same AW for which I get only leaf-level data for all dimensions. How can I load data at all levels for a given dimension?
OLAP version: 11.2.0.3
Thanks
Dileep.
In AWM you should be able to drag the dimension key column from the fact table to multiple levels of the hierarchy.
At the XML level you should see elements in the cube map like this:
<CubeDimensionalityMap
Name="MY_DIM"
Dimensionality="MY_DIM"
MappedDimension="MY_DIM.MY_HIER.LEAF_LEVEL"
Expression="MY_FACT.MY_DIM_COLUMN">
</CubeDimensionalityMap>
The MappedDimension attribute instructs the server to load data at the LEAF_LEVEL only. To get it to load at all levels, you can simply delete the MappedDimension line and recreate the cube. If you then look in the AWM mapping screen, you should see MY_FACT.MY_DIM_COLUMN mapped to all levels within MY_DIM.MY_HIER.
There are a few cases where this will not work, so be warned! Here are some that I can think of.
(1) If you have an MV on the cube
(2) You are partitioning by the dimension
(3) The expression for the CubeDimensionalityMap comes from the dimension table.
Case (3) sometimes happens because users map to a dimension table column instead of to a fact table column when they are on different sides of the table join. For example, if the table join condition is "MY_FACT.MY_DIM_COLUMN = MY_DIM_TABLE.KEY_COLUMN", then some people map to the expression MY_DIM_TABLE.KEY_COLUMN instead of to MY_FACT.MY_DIM_COLUMN. If this is the case, you can just change the expression to point to the fact column. A legitimate case of (3) happens where the fact table contains data at a lower level (e.g. DAY) than what is loaded into the AW (e.g. MONTH). A solution there is to create a VIEW that does the correct join and then map the cube to the VIEW. -
How to load data at non-leaf levels directly?
Hi,
We have a requirement where by Fact data for different scenarios gets loaded at different levels of ragged/recursive hierarchies.
For example, the Actual sales are loaded at Product Leaf level but the Budgets data is available for loading only at Product Group level (which is one level above Product Leaf in Product Hierarchy).
Please let me know if this is possible in Essbase.
This is rather urgent.
Thanks & Regards,
Gurudatta
Yes, as Gary said, performance is one factor.
There are other, less prominent reasons to keep agg missing on, but the overall reason for me is that it just makes more sense, especially when you consider the user operations involved in zooming in on data. When you do a zoom to the bottom level, a well-designed cube with agg missing on will show you that the sum of the parts equals the whole. With agg missing off, this may or may not be the case, and it would NOT be the case if you load at non-leaf members. In that scenario, you end up with a "where did the data go?" issue that adds time to any hunt for answers, as well as the likelihood that more questions will be brought up that ultimately need answering. It's not obvious, but one scenario can be described as follows:
(For convenience, I will use a standard time dimension hierarchy of YEAR|QUARTER|MONTH|DAY to illustrate the issue)
2008 = 12000
2008Q1 = 3000
Jan 2008 = 1000 (input)
1/1/2008 = #Missing
1/2/2008...
Feb 2008 = 1000 (input)
Mar 2008 = 1000 (input)
2008Q2 = 3000
... and so on
In the above example, there are two ways that confusion and errors can occur. The first is that somebody decides to upload some data to the days (level 0) instead of the month, and it doesn't add up to the total for the month. The second is that somebody decides to upload to the quarter, and thus the months no longer add up to the quarter amount. I treat these two conditions separately because in the first, the total is still represented by the amounts loaded to the monthly values, whereas the second case changes the grand total. If you were troubleshooting, you would have to go level by level to see where the "breakdown" occurs by comparing each and every parent value to the sum of its children. It's a valid condition that is almost impossible to find quickly. There is no quick "show me all the inputs" approach, and no easy way to enforce an "input only at these levels" condition (there is, but there isn't, as it's only practical in a perfect environment, and trust me, you won't have it for long even if you could create it to begin with).
So, long story short, data consistency goes out the window in a "load at upper level" cube, and the simple answer of providing an input member for the required generation(s) is much easier and reliable in the long run.
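The level-by-level hunt described above is mechanical enough to sketch. A hypothetical Python check (the hierarchy and numbers are invented for illustration; this is not an Essbase feature) that flags every parent whose stored value disagrees with the sum of its children:

```python
# Flag every parent whose stored value != sum of its children's values,
# which is what a troubleshooter does by hand, level by level.
# Here someone loaded 3500 directly at 2008Q1 while its months sum to 3000.
children_of = {"2008Q1": ["Jan", "Feb", "Mar"]}
stored = {"2008Q1": 3500.0, "Jan": 1000.0, "Feb": 1000.0, "Mar": 1000.0}

def find_breakdowns():
    """Return (parent, stored_value, child_sum) for every mismatch."""
    bad = []
    for parent, kids in children_of.items():
        kid_sum = sum(stored.get(k, 0.0) for k in kids)  # #Missing -> 0
        if stored.get(parent, 0.0) != kid_sum:
            bad.append((parent, stored.get(parent, 0.0), kid_sum))
    return bad
```

In a real cube this walk has to be repeated at every level of every dimension, which is why the reply calls the condition almost impossible to find quickly.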
Of course, there will always be exceptions, I just like to avoid them at all cost on this issue. -
Aggregating data loaded into different hierarchy levels
I have some problems when I try to aggregate a variable called PRUEBA2_IMPORTE dimensioned by a time dimension (parent-child type).
I read the help in the DML Reference of the OLAP Worksheet, and it says the following:
When data is loaded into dimension values that are at different levels of a hierarchy, then you need to be careful about how you set status in the PRECOMPUTE clause in a RELATION statement in your aggregation specification. Suppose that a time dimension has a hierarchy with three levels: months aggregate into quarters, and quarters aggregate into years. Some data is loaded into month dimension values, while other data is loaded into quarter dimension values. For example, Q1 is the parent of January, February, and March. Data for March is loaded into the March dimension value. But the sum of data for January and February is loaded directly into the Q1 dimension value. In fact, the January and February dimension values contain NA values instead of data. Your goal is to add the data in March to the data in Q1. When you attempt to aggregate January, February, and March into Q1, the data in March will simply replace the data in Q1. When this happens, Q1 will only contain the March data instead of the sum of January, February, and March. To aggregate data that is loaded into different levels of a hierarchy, create a valueset for only those dimension values that contain data.
DEFINE all_but_q4 VALUESET time
LIMIT all_but_q4 TO ALL
LIMIT all_but_q4 REMOVE 'Q4'
Within the aggregation specification, use that valueset to specify that the detail-level data should be added to the data that already exists in its parent, Q1, as shown in the following statement.
RELATION time.r PRECOMPUTE (all_but_q4)
How to do it this for more than one dimension?
Above i wrote my case of study:
DEFINE T_TIME DIMENSION TEXT
T_TIME
200401
200402
200403
200404
200405
200406
200407
200408
200409
200410
200411
2004
200412
200501
200502
200503
200504
200505
200506
200507
200508
200509
200510
200511
2005
200512
DEFINE T_TIME_PARENTREL RELATION T_TIME <T_TIME T_TIME_HIERLIST>
-----------T_TIME_HIERLIST-------------
T_TIME H_TIME
200401 2004
200402 2004
200403 2004
200404 2004
200405 2004
200406 2004
200407 2004
200408 2004
200409 2004
200410 2004
200411 2004
2004 NA
200412 2004
200501 2005
200502 2005
200503 2005
200504 2005
200505 2005
200506 2005
200507 2005
200508 2005
200509 2005
200510 2005
200511 2005
2005 NA
200512 2005
DEFINE PRUEBA2_IMPORTE FORMULA DECIMAL <T_TIME>
EQ -
aggregate(this_aw!PRUEBA2_IMPORTE_STORED using this_aw!OBJ262568349 -
COUNTVAR this_aw!PRUEBA2_IMPORTE_COUNTVAR)
T_TIME PRUEBA2_IMPORTE
200401 NA
200402 NA
200403 2,00
200404 2,00
200405 NA
200406 NA
200407 NA
200408 NA
200409 NA
200410 NA
200411 NA
2004 4,00 ---> here it's right, but...
200412 NA
200501 5,00
200502 15,00
200503 NA
200504 NA
200505 NA
200506 NA
200507 NA
200508 NA
200509 NA
200510 NA
200511 NA
2005 10,00 ---> here it must be 30,00, not 10,00
200512 NA
DEFINE PRUEBA2_IMPORTE_STORED VARIABLE DECIMAL <T_TIME>
T_TIME PRUEBA2_IMPORTE_STORED
200401 NA
200402 NA
200403 NA
200404 NA
200405 NA
200406 NA
200407 NA
200408 NA
200409 NA
200410 NA
200411 NA
2004 NA
200412 NA
200501 5,00
200502 15,00
200503 NA
200504 NA
200505 NA
200506 NA
200507 NA
200508 NA
200509 NA
200510 NA
200511 NA
2005 10,00
200512 NA
DEFINE OBJ262568349 AGGMAP
AGGMAP
RELATION this_aw!T_TIME_PARENTREL(this_aw!T_TIME_AGGRHIER_VSET1) PRECOMPUTE(this_aw!T_TIME_AGGRDIM_VSET1) OPERATOR SUM -
args DIVIDEBYZERO YES DECIMALOVERFLOW YES NASKIP YES
AGGINDEX NO
CACHE NONE
END
DEFINE T_TIME_AGGRHIER_VSET1 VALUESET T_TIME_HIERLIST
T_TIME_AGGRHIER_VSET1 = (H_TIME)
DEFINE T_TIME_AGGRDIM_VSET1 VALUESET T_TIME
T_TIME_AGGRDIM_VSET1 = (2005)
Regards,
Mel.
Mel,
There are several different types of "data loaded into different hierarchy levels", and the approach to solving the issue differs depending on the needs of the application.
1. Data is loaded symmetrically at uniform mixed levels. An example would be loading data at "quarter" in historical years but at "month" in the current year; it does /not/ include data loaded at both quarter and month within the same calendar period.
= solved by the setting of status, or in 10.2 or later with the load_status clause of the aggmap.
2. Data is loaded at both a detail level and its ancestor, as in your example case.
= the aggregate command overwrites aggregate values based on the values of the children; this is the only repeatable thing it can do. The recommended way to solve this problem is to create 'self' nodes in the hierarchy representing the data loaded at the aggregate level, each of which is then added as one of the children of the aggregate node. This enables repeatable calculation as well as auditability of the resultant value.
Also note the difference in behavior between the aggregate command and the aggregate function. In your example the aggregate function looks at '2005', finds a value and returns it for a result of 10, the aggregate command would recalculate based on january and february for a result of 20.
To solve your usage case I would suggest a hierarchy that looks more like this:
DEFINE T_TIME_PARENTREL RELATION T_TIME <T_TIME T_TIME_HIERLIST>
-----------T_TIME_HIERLIST-------------
T_TIME H_TIME
200401 2004
200402 2004
200403 2004
200404 2004
200405 2004
200406 2004
200407 2004
200408 2004
200409 2004
200410 2004
200411 2004
200412 2004
2004_SELF 2004
2004 NA
200501 2005
200502 2005
200503 2005
200504 2005
200505 2005
200506 2005
200507 2005
200508 2005
200509 2005
200510 2005
200511 2005
200512 2005
2005_SELF 2005
2005 NA
Resulting in the following cube:
T_TIME PRUEBA2_IMPORTE
200401 NA
200402 NA
200403 2,00
200404 2,00
200405 NA
200406 NA
200407 NA
200408 NA
200409 NA
200410 NA
200411 NA
200412 NA
2004_SELF NA
2004 4,00
200501 5,00
200502 15,00
200503 NA
200504 NA
200505 NA
200506 NA
200507 NA
200508 NA
200509 NA
200510 NA
200511 NA
200512 NA
2005_SELF 10,00
2005 30,00
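The self-node approach can be illustrated outside OLAP DML. A minimal Python sketch (the flat parent map is an invented stand-in for the relation above; the values mirror the PRUEBA2_IMPORTE example, with the 10 formerly stored at 2005 moved to 2005_SELF):

```python
# Data loaded at the aggregate level (2005) is moved to a leaf child
# (2005_SELF), so the aggregate can always be recomputed from children
# and the result is both repeatable and auditable.
parent_of = {"200501": "2005", "200502": "2005", "2005_SELF": "2005"}
loaded = {"200501": 5.0, "200502": 15.0, "2005_SELF": 10.0}

def aggregate_parent(parent):
    """Sum every loaded child of `parent`, including its _SELF leaf."""
    return sum(v for k, v in loaded.items() if parent_of.get(k) == parent)

total_2005 = aggregate_parent("2005")  # 5 + 15 + 10
```

Without the self node, re-running the aggregation would overwrite 2005 with just the child sum of 20; with it, the recomputed total is the expected 30.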
3. Data is loaded at a level based upon another dimension; for example product being loaded at 'UPC' in EMEA, but at 'BRAND' in APAC.
= this can currently only be solved by issuing multiple aggregate commands to aggregate the different regions with different input status, which unfortunately means that it is not compatible with compressed composites. We will likely add better support for this case in future releases.
4. Data is loaded at both an aggregate level and a detail level, but the calculation is more complicated than a simple SUM operator.
= often requires the use of ALLOCATE in order to push the data to the leaves in order to correctly calculate the aggregate values during aggregation. -
Error while loading data after level 17 patch
Hi all,
A week ago we had patch level 17 implemented on our SAP NetWeaver BI 7.0 system. Since then, we cannot load any data to some cubes using update rules. We get "Error when generating the update program" (Message no. RSAU484), status red while loading data.
I tried activating the update rules.
We get the same error when loading data from a DSO into a cube.
The strange thing is that it doesn't happen for all the objects in the system. We have trouble loading deltas from 2LIS_12_VCITM (delivery items), 2LIS_13_VAITM (sales order items), and from a BI Content DSO in asset accounting.
Please, if you have any idea or if this has happened to you, I'd appreciate any suggestion.
This is not a new implementation; these processes have been live for months, and we didn't have any problems before the patch.
thanks,
Andreea
We ran into the same issue (and another one) during our BI 7.0 SPS15/SP17 upgrade.
The solution:
1. Apply 2 notes:
[Note 1146851 - BI 7.0 (SP 18): Error RSAU 484 when generating update|http://service.sap.com/sap/support/notes/1146851]
[Note 1152453 - Termination generating InfoCube write program|http://service.sap.com/sap/support/notes/1152453]
2. Then run program RSDD_RSTMPLWIDTP_GEN_ALL to generate the new programs for InfoCubes from the template RSTMPLWIDTP and clean up the staging table RSUPDINFO, per note 1152453.