2 Digit Year Reveals Inconsistent Results

As I have read what I can about implicit versus explicit dates, I have come across something I find a bit odd.  I've tried to simplify it here in this anonymous block.
declare
  bad_year     varchar2(30);
  good_year    varchar2(30);
  neutral_year varchar2(30);
  testdate     date;
  neutraldate  date;
begin
  testdate := to_date('01-JAN-50', 'DD-MON-YY');
  -- test one
  bad_year := to_char(testdate, 'DD-MON-YYYY');
  dbms_output.put_line(bad_year);
  -- test two
  neutral_year := to_char(testdate, 'DD-MON-YY');
  neutraldate := to_date(neutral_year);
  good_year := to_char(neutraldate, 'DD-MON-YYYY');
  dbms_output.put_line(good_year);
end;
The results:
01-JAN-2050
01-JAN-1950
I would think that unless you are altering your session between tests and changing the NLS_DATE_FORMAT, the results would at least be consistent.
This has truly got me stumped.
Anyone out there understand this?
thank you,
Kristi
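
For reference, the two tests use two different format models. The explicit TO_DATE in test one uses 'YY', which always places the two-digit year in the current century, so '50' becomes 2050. The implicit TO_DATE(neutral_year) in test two falls back on the session's NLS_DATE_FORMAT, which in many environments defaults to 'DD-MON-RR'; the RR model maps 50-99 to the previous century, so the same string comes back as 1950. A minimal sketch of the two models side by side (assuming a current date in the 2000s and a default NLS_DATE_FORMAT of 'DD-MON-RR'):
-- the same two-digit string, interpreted by each format model
select to_char(to_date('01-JAN-50', 'DD-MON-YY'), 'DD-MON-YYYY') as yy_result,  -- 01-JAN-2050
       to_char(to_date('01-JAN-50', 'DD-MON-RR'), 'DD-MON-YYYY') as rr_result   -- 01-JAN-1950
from dual;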

kbarry wrote:
For the record, I never would have blindly started coding implicit date formats without looking into it first.  I believe it is just that I did not see all of the documentation and what I did see, I apparently did not understand as thoroughly as I thought I did.  And I guess I tend to assume that if a language offers constructs with a set of rules behind them, that it is okay to use those constructs so long as you follow the rules.  But more and more, I hear of things that should never be used, such as global variables, and now implicit date formats.  Why offer such language constructs if they should never be used?  I suppose if it was that straight-forward, there would be no fodder for the forums.
Nevertheless, I truly appreciate the prompt answers.
Some constructs seemed like a good idea at the time, but practice has shown them to be not such a good idea after all, like the GOTO statement.
Some constructs have legitimate uses, but are more often abused and used in ways the original designers never intended.
Some constructs (such as the RR date format) are, as I mentioned in an earlier post, emergency solutions to intractable problems - with the understanding by the original designers that the feature should only be used until the underlying problem is fixed.
Those of us who were actually engaged in applications programming in the late 90's are particularly sensitive about dates.  We busted our butts for at least two years ensuring that Y2K passed without a problem.  And exactly to the degree we were successful, people thought then, and still believe, the whole thing was a hoax.  I had to explain it again to an 'unbeliever' just a couple of weeks ago.  Problem is, some of the fixes relied on things like RR date formats and 'moving windows', which got us through, but just kicked the can down the road.  At which point, once again, today's problem - like reformatting a sales report for a VP - became more important than the one we knew was down the road.  And with the passage of time, the one down the road was completely forgotten.
And while I'm ranting on the subject, let me point out for the benefit of those who weren't there that in spite of everything they might have heard, Y2K was not a problem with COBOL.  It was a problem with the design of the data.  COBOL was quite capable of handling 4-digit years, if only the apps designers had allowed for 4 digits in their data.  There are valid, historical reasons why they didn't, but it was not due to a limitation in COBOL or any other language.  The only reason COBOL got so much attention was that it was the dominant language at the time.  <rant off>

Similar Messages

  • Inconsistent results with MDX formula

    Hi. I'm converting a BSO cube to ASO, and it has dynamically calculated formulas that I'm converting to MDX. I have a formula that is supposed to accumulate an account (Order Intake) through the months and years until it gets to the current month of the current year (set by substitution variables) and then just carries that balance forward until the end.
    This is the formula I wrote in MDX.
    IIF( Count( Intersect( {MemberRange([Years].[FY95], [&Auto_CurYr].Lag(1))}, {Years.CurrentMember} ) ) = 1,
    IIF(CurrentMember ([Period]) = [Jan],
    [Order Intake] + ([Contract Value],[Adj],[Years].CurrentMember.PrevMember),     
    [Order Intake] + ([Contract Value],[Period].CurrentMember.PrevMember)
    IIF( CurrentMember ([Years]) = [&Auto_CurYr],
    IIF( CurrentMember ([Period]) = [Jan],
    [Order Intake] + ([Contract Value],[Adj],[Years].CurrentMember.PrevMember),
    IIF( Count( Intersect( {MemberRange([Period].[Feb], [&Auto_CurMoNext_01].Lag(1))}, {Period.CurrentMember} ) ) = 1,
    [Order Intake] + ([Contract Value],[Period].CurrentMember.PrevMember),
    ([Contract Value],[Period].CurrentMember.PrevMember)
    ([Contract Value],[Adj],[Years].CurrentMember.PrevMember) /*This is the statement that evaluates for months and years after the current month and year*/
    The inconsistent results are as follows:
    I have a spreadsheet that has the years and months across the top in columns. The substitution variables are set to FY09 for the year and Oct for the month. The formula works fine until it gets to Jan of FY10, at which point it produces a number out of thin air, and carries that incorrect number through to the end.
    When I put the years and months into my rows, however, and then drill down on the months, I get different results. Not only different, but different results at different times, too. When I first drilled, all results were correct. Now when I drill, it produces a random number in October of FY09 (not entirely random, but actually double what it's supposed to be), then #missing in Nov of FY09, then the correct number thereafter. Same exact data intersection on both spreadsheets, different results. I've retrieved over and over again, and the only time it might change is if I re-drill. I've used both Essbase Add-in and Smart View with consistently inconsistent results.
    Has anyone ever encountered this sort of behavior with an MDX formula?

    Well, I finally got a formula that works. I did end up using a combination of CASE and IIF, but I never did figure out how to deal with summing up ranges of data correctly, accounting for changing substitution variables, so I had to do a lot of hard coding by month. For instance, I couldn't ask it to sum([Order Intake],[Jan],[&Auto_CurYr]:([Order Intake],[&Auto_CurMo],[&Auto_CurYr]). Although it validated fine, when I tried to retrieve it said members were not of the same generation, presumably because my substitution variable could potentially be a non - level 0 month (it worked if I hard coded the end month). Also, I really don't like the MDX version of @LSIBLINGS and @RSIBLINGS.
    But this works.
    CASE
    When Count( Intersect( {MemberRange([Years].[FY95], [&Auto_CurYr].Lag(1))}, {Years.CurrentMember} ) ) = 1
    THEN IIF(CurrentMember ([Period]) = [Jan],
    [Order Intake] + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Feb],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Feb]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Mar],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Mar]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Apr],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Apr]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [May],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[May]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Jun],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Jun]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Jul],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Jul]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Aug],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Aug]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Sep],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Sep]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Oct],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Oct]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Nov],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Nov]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Dec]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember))
    When CurrentMember ([Years]) IS [&Auto_CurYr]
    THEN IIF(CurrentMember ([Period]) = [Jan],
    [Order Intake] + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Feb] AND CONTAINS([Feb], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Feb]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Mar] AND CONTAINS([Mar], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Mar]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Apr] AND CONTAINS([Apr], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Apr]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [May] AND CONTAINS([May], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[May]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Jun] AND CONTAINS([Jun], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Jun]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Jul] AND CONTAINS([Jul], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Jul]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Aug] AND CONTAINS([Aug], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Aug]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Sep] AND CONTAINS([Sep], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Sep]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Oct] AND CONTAINS([Oct], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Oct]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Nov] AND CONTAINS([Nov], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Nov]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[&Auto_CurMo]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember))
    WHEN CONTAINS([Years].CurrentMember, {MemberRange([&Auto_CurYr].Lead(1), [Years].[FY15])})
    THEN ([Contract Value],[Adj],[Years].&Auto_CurYr)
    END
    Thanks for looking at it, Gary, I appreciate it.
    Sabrina

  • Two digit year cutoff in SQL Server 2008 R2

    Hi All,
    We have a 2008R2 setup here and have SSIS jobs importing data from flat files into our database. Some sources only give us 2 digit years unfortunately. 
    Looking at some data, the default 2 digit year cutoff is not really appropriate to our setup; a value of 2080 would be better. However, when checking the official documentation here:
    http://msdn.microsoft.com/en-us/library/ms191004(v=sql.105).aspx
    there is a warning to leave the value alone to maintain backwards compatibility. This warning is NOT present for the newer versions of SQL Server. Is this just a generic warning about the general application landscape, or is there some Microsoft code somewhere within SQL Server/SSIS/SSRS which assumes this default is locked to 2049? In other words, if I understand the application landscape at my office - am I safe to change this? Or is there some underlying code which will break?
    Also, am I right in the assumption that once a 2 digit year is imported into a database table, it's then converted to a 4 digit year, so any existing data is "safe" from this setting being changed? In other words, the DB loses all memory of whether an imported date was originally imported as a 2 digit year?
    Thanks

    Import the string dates into DATE/DATETIME format.
    DATE/DATETIME has 4 digit year.
    You can convert it differently than the automatic 2049 flip year.
    DATE/TIME functions & string conversions:
    http://www.sqlusa.com/bestpractices/datetimeconversion/
    Here is an automatic conversion example.
    DECLARE @StringDate char(8) = '10/23/80';
    DECLARE @Date DATE;
    SELECT @Date=CONVERT(date, @StringDate, 1);
    SELECT @Date;
    -- 1980-10-23
    Kalman Toth Database & OLAP Architect
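    If changing the server-level cutoff is the route taken, the setting itself is an advanced sp_configure option; a minimal sketch, assuming the 2080 value proposed in the question (whether it is safe still depends on the applications that rely on the default):
    -- raise the two digit year cutoff from the default 2049 to 2080
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'two digit year cutoff', 2080;
    RECONFIGURE;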

  • Query resulting inconsistent results

    Hi,
    I'm running Oracle 10.2.0.4 and have a partitioned table.
    When I run the query
    select bse_rem_bse, ben_aly_num from dw_cn2.prs_old
    where bse_rem_bse = 3
    I'm getting records with null data (i.e. bse_rem_bse is 'null') and the correct data (i.e. bse_rem_bse is 3).
    bse_rem_bse is a number type data field.
    Can anyone help me understand the inconsistent result and how to find what's broken in the table?
    Thanks
    Tarun

    Post your table structure along with data type and some sample data here. Also, post your DB version by executing the following query -
    select * from v$version;
    Regards.
    Satyaki De.

  • ADF Faces 10.1.3.0.4 - two-digit-year-start has no effect?

    Hi all,
    I have a problem with entering a date in an af:selectInputDate control: the user enters the date and, on an onblur event, the ADF field automatically reformats it:
    10.10.10 -> 10.10.2010
    10.10.60 -> 10.10.1960
    The user (customer) wants 2 things to change:
    1. if the given two-digit year <= the actual year (06):
    10.10.06 -> 10.10.2006
    otherwise
    10.10.07 -> 10.10.1907
    2. when entering a date, ADF has to accept the date without dots too:
    101006 -> 10.10.2006
    1.: I tried to set the config param <two-digit-year-start>2006</two-digit-year-start> (or with an EL expression) in the adf-faces-config.xml but it has no effect.
    Is it a totally wrong idea or is it a bug?
    Any other idea?
    2.: Is it possible to solve this problem with a built-in control or converter? Or do I have to write my own converter?
    Thanks for any help!

    The exception is gone - I must have had something wrong in my classpath. However, I still see the warnings. Are those warnings expected when using ADF in conjunction with myfaces?

  • Force 2 digit month, 2 digit days and 4 digit year.

    I have an input date format of month/day/year.
    The possible input may be like the following:
      1/1/2006, 1/12/2006,
    10/1/2006, 10/12/2006.
    In summary, the 'day' may be 1 or 2 digits and the 'month' may be 1 or 2 digits.
    How do I make them all a 2-digit 'day', 2-digit 'month' and 4-digit year?  My final goal is to convert it to 'YYYYMMDD'.
    Here is my code:
    TYPES :BEGIN OF type_int_date,
                l_year(4)   TYPE c,
                l_month(2)  TYPE c,
                l_days(2)   TYPE c,
              END OF type_int_date.
    DATA:l_startdate    TYPE string,
         l_month        TYPE string,
         l_days         TYPE string,
         l_year(4)      TYPE c,
         lw_date        TYPE type_int_date,
         l_len TYPE i.
    SPLIT l_startdate AT '/' INTO
              l_month
              l_days
              l_year.
    l_len = strlen(l_month).
        IF l_len = 1.
           CONCATENATE '0' l_month INTO lw_int_date-l_month.
        l_len = strlen(l_day).
        IF l_len = 1.
           CONCATENATE '0' l_days INTO lw_int_date-l_days.
        lw_int_date-l_year = l_year.
        zzstartdate = lw_int_date.
    I have a syntax error on the 'strlen'.  The compiler does not understand the 'strlen'.
    Please help.
    Thanks,
    Helen

    Please implement my sample program above and run it.  Any problems? If so, then I guess that your user date format is set as MM.DD.YYYY, which in this case will not work for you.  Neil has suggested that you translate the / to .  This may work, again, if your user date format is MM.DD.YYYY.
    report zrich_0001.
    data: im_datum(10) type c value '1/1/2006'.
    data: ex_datum type sy-datum.
    translate im_datum using '/.'.
    call function 'CONVERT_DATE_TO_INTERNAL'
         exporting
              date_external = im_datum
         importing
              date_internal = ex_datum.
    write:/ ex_datum.
    Regards,
    Rich Heilman

  • Single digit years (with a space), converts the space to zero ???????

    Date validation accepts single-digit years (with a space) and converts the space to zero. How do we stop it from converting the space into zero?
    Thanks

    Put 'fx' in front of your date format:
    fxDDMONYY
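    For reference, the FX modifier makes the format model an exact match: input that does not line up with the model character for character (such as a space in place of a leading digit) raises an error instead of being silently adjusted. A minimal sketch, with hypothetical values:
    -- exact match: succeeds
    select to_date('01-JAN-06', 'fxDD-MON-YY') from dual;
    -- single-digit year padded with a space: rejected with an error under FX,
    -- whereas the plain 'DD-MON-YY' model would typically accept it
    select to_date('01-JAN- 6', 'fxDD-MON-YY') from dual;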

  • Inconsistent results from rpm

    Hi Oracle Forums,
         I am installing packages required for Oracle Database 11gR2, and am having problems with rpm responses.
         When I query rpm about a package, it tells me that the package is not installed. When I go to install the package, rpm informs me that the package is already installed. An example is shown below:
    [root@OELVM02 Server]# uname -a
    Linux OELVM02.localdomain 2.6.32-300.10.1.el5uek #1 SMP Wed Feb 22 17:37:40 EST 2012 x86_64 x86_64 x86_64 GNU/Linux
    [root@OELVM02 Server]#
    [root@OELVM02 Server]# pwd
    /media/OL5.8 x86_64 dvd 20120229/Server
    [root@OELVM02 Server]#
    [root@OELVM02 Server]# ls -alrt binutils-2.17.50.0.6-20.el5.x86_64.rpm
    -rw-r--r-- 1 root root 3069914 Dec 28  2011 binutils-2.17.50.0.6-20.el5.x86_64.rpm
    [root@OELVM02 Server]#
    [root@OELVM02 Server]#  rpm -q ./binutils-2.17.50.0.6-20.el5.x86_64.rpm
    package ./binutils-2.17.50.0.6-20.el5.x86_64.rpm is not installed
    [root@OELVM02 Server]#
    [root@OELVM02 Server]# rpm -ivh ./binutils-2.17.50.0.6-20.el5.x86_64.rpm
    warning: ./binutils-2.17.50.0.6-20.el5.x86_64.rpm: Header V3 DSA signature: NOKEY, key ID 1e5e0159
    Preparing...                ########################################### [100%]
            package binutils-2.17.50.0.6-20.el5.x86_64 is already installed
    [root@OELVM02 Server]#
    [root@OELVM02 Server]# rpm -q ./binutils-2.17.50.0.6-20.el5.x86_64.rpm
    package ./binutils-2.17.50.0.6-20.el5.x86_64.rpm is not installed
    [root@OELVM02 Server]#
    I do not understand the inconsistent results that rpm is giving me.
         Any help would be greatly appreciated
         Thanks
         Gavin

    Hi Avi,
         Thanks for your quick response!!
         The Oracle documentation requires that both the 32 and 64 bit rpm's be installed for some packages. In the below scenario, is rpm telling me that both 32 and 64 bit packages are installed?
    [root@OELVM02 Server]# uname -a
    Linux OELVM02.localdomain 2.6.32-300.10.1.el5uek #1 SMP Wed Feb 22 17:37:40 EST 2012 x86_64 x86_64 x86_64 GNU/Linux
    [root@OELVM02 Server]#
    [root@OELVM02 Server]# pwd
    /media/OL5.8 x86_64 dvd 20120229/Server
    [root@OELVM02 Server]#
    [root@OELVM02 Server]# ls -alrt *glibc-2*
    -rw-r--r-- 1 root root 1544040 Nov 18  2010 compat-glibc-2.3.4-2.26.x86_64.rpm
    -rw-r--r-- 1 root root 1069214 Nov 18  2010 compat-glibc-2.3.4-2.26.i386.rpm
    -rw-r--r-- 1 root root 5607577 Feb 26  2012 glibc-2.5-81.i686.rpm
    -rw-r--r-- 1 root root 4997627 Feb 26  2012 glibc-2.5-81.x86_64.rpm
    [root@OELVM02 Server]#
    [root@OELVM02 Server]# rpm -q glibc-2.5-81.i686
    glibc-2.5-81
    [root@OELVM02 Server]#
    [root@OELVM02 Server]#  rpm -q glibc-2.5-81.x86_64
    glibc-2.5-81
    [root@OELVM02 Server]#
    [root@OELVM02 Server]#
    Thanks heaps
         Gavin

  • Inconsistent Results Installing WebApps in Console.  The Secret?

    Hi:
    I'm having a difficult time understanding what exactly happens on my server when I use the mydomain->Deployments->Web Applications->Install a New Web Application dialog.
    Sometimes when I upload a .war file, it then displays in the Web Application section of the left pane where certificate and DefaultWebApp are shown. Other times? Nothing.
    I have tried installing my web application in several ways, and the results are not consistent. What settings should be made and where (for example, in files like config.xml) when I install a web application?
    I have also tried using the Configure a new Web Application dialog, and have similarly inconsistent results.
    Sometimes my config.xml gets updated, sometimes not. Sometimes it updates with an ineffective <Application> tag that does not include the <WebAppComponent> tag, and sometimes it works.
    Thanks for any insight. I'm really having a tough time of this.
    Thanks,
    Bill

    Hi.
    There should be no other file updated. You should open a case with support.
    Regards,
    Michael
    bill b3nac wrote:
    Yes, I'm using wls6.1 with sp2, jdk131 on windows 2000. My browser is msie 6.0.26.
    On installing a new web app through the console, is there any file that gets updated that I should keep my eye on other than ./wlserver6.1SP2/config/mydomain/config.xml?
    Thanks.
    Bill
    Michael Young <[email protected]> wrote in message news:<[email protected]>...
    Hi.
    Hmm. I'll presume you are running under wls 6.1. Make sure you are running with the latest service pack - sp2. Also, what platform/jdk are you using?
    If you already are using sp2 and are still seeing this inconsistency I recommend you open a case with support. However, be forewarned that they may not be able to help much if this only occurs randomly or rarely.
    Regards,
    Michael
    Michael Young
    Developer Relations Engineer
    BEA Support

  • Inconsistent results with localtimestamp and current_timestamp

    Running XE on Windows XP with the system timezone set to GMT (rebooted, restarted XE)
    Oracle Database 10g Express Edition Release 10.2.0.1.0 - Product
    PL/SQL Release 10.2.0.1.0 - Production
    CORE     10.2.0.1.0     Production
    TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
    NLSRTL Version 10.2.0.1.0 - Production
    I'm getting incorrect and inconsistent results with current_timestamp and localtimestamp:
    With SQL, localtimestamp computes the wrong offset (appears to use 1987-2006 DST rules):
    select
    dbtimezone
    , sessiontimezone
    , current_timestamp
    , current_timestamp + numtodsinterval(18,'day') as current_timestamp18
    , localtimestamp
    from dual;
    +00:00     
    US/Eastern     
    17-MAR-10 10.27.17.376000000 AM US/EASTERN     
    04-APR-10 10.27.17.376000000 AM US/EASTERN     
    17-MAR-10 09.27.17.376000000 AM
    However, in PL/SQL, both current_timestamp and localtimestamp return the wrong hour value, and adding 18 to current_timestamp shows it is using 1987-2006 DST rules (1st Sunday of April). Note that this happens in straight PL/SQL and in embedded SQL (same results selecting from tables other than DUAL):
    begin
    for r1 in (
    select
    dbtimezone
    , sessiontimezone
    , current_timestamp
    , current_timestamp + numtodsinterval(18,'day') as current_timestamp18
    , localtimestamp
    from dual
    ) loop
    dbms_output.put_line('SQL dbtimezone = ' || r1.dbtimezone);
    dbms_output.put_line('SQL sessiontimezone = ' || r1.sessiontimezone);
    dbms_output.put_line('SQL current_timestamp = ' || r1.current_timestamp);
    dbms_output.put_line('SQL current_timestamp +18 = ' || r1.current_timestamp18);
    dbms_output.put_line('SQL localtimestamp = ' || r1.localtimestamp);
    end loop;
    dbms_output.put_line('dbtimezone = ' || dbtimezone);
    dbms_output.put_line('sessiontimezone = ' || sessiontimezone);
    dbms_output.put_line('systimestamp = ' || systimestamp);
    dbms_output.put_line('current_timestamp = ' || current_timestamp);
    dbms_output.put_line('current_timestamp +18 = ' || (current_timestamp + numtodsinterval(18,'day')));
    dbms_output.put_line('localtimestamp = ' || localtimestamp);
    end;
    SQL dbtimezone = +00:00
    SQL sessiontimezone = US/Eastern
    SQL current_timestamp = 17-MAR-10 09.29.32.784000 AM US/EASTERN
    SQL current_timestamp +18 = 04-APR-10 10.29.32.784000000 AM US/EASTERN
    SQL localtimestamp = 17-MAR-10 09.29.32.784000 AM
    dbtimezone = +00:00
    sessiontimezone = US/Eastern
    systimestamp = 17-MAR-10 02.29.32.784000000 PM +00:00
    current_timestamp = 17-MAR-10 09.29.32.784000000 AM US/EASTERN
    current_timestamp +18 = 04-APR-10 10.29.32.784000000 AM US/EASTERN
    localtimestamp = 17-MAR-10 09.29.32.784000000 AM
    dbtimezone = +00:00
    sessiontimezone = US/Eastern
    systimestamp = 17-MAR-10 02.16.21.366000000 PM +00:00
    current_timestamp = 17-MAR-10 09.16.21.366000000 AM US/EASTERN
    current_timestamp +18 = 04-APR-10 10.16.21.366000000 AM US/EASTERN
    localtimestamp = 17-MAR-10 09.16.21.366000000 AM
    is this a known bug?
    is there a patch or a work-around for XE?
    are other datasbase versions affected?

    Can't patch XE, unfortunately it comes with pre-2007 DST rules.
    There is a Metalink note describing how to fix the DST changes, and while it's not really a "supported" method, neither is XE - if you can get updated timezone files from a later patch set for the same release (10gR2) on the right operating system, then after a shutdown/startup of the database the updated DST rules will be in place. The timezone files are in $ORACLE_HOME/oracore/zoneinfo.
    Another unfortunate point: any values already stored in the database using timestamp with local time zone datatypes for the affected period of the DST changes won't be correct, i.e. there is no 2010-03-14 02:01 (?) but with older timezone rules in place that would be a valid timestamp. The data has to be saved before updating the timezone file, and re-translated to timestamp w/local tz datatypes after the update.
    IMHO storing literal timezone info isn't an ideal practice; let the client settings do the time interpretation. Time is always changing. It's the interpretation of the time that gets changed. From time to time. :(
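    As a side note, the timezone file version an instance is actually using can be checked from SQL before and after swapping the files; the view below exists in 10gR2:
    -- shows the timezone data file and its version currently in use
    select * from v$timezone_file;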

  • Inconsistent results when exporting projects as new libraries

    I am trying to export a project as a new library without the original ("master") images and am getting inconsistent results. Each time I've tried this, I am careful not to check "Copy originals into exported library". Sometimes it works exactly as expected -- i.e., a new library file is created and, when I examine the contents of the new library, there are no master image files. Other times when I examine the new Aperture library file, I find that it contains master image files. Indeed, sometimes, the new library contains more master image files than there are images in the project from which the library was created.
    What am I doing wrong or missing? Is this normal behavior?
    (Aperture 3.4.3)

    After a bit of exploring, here is what I determined:
    1. Most of my projects were created by importing images from a folder without moving the original images (Files> Import> Folder as Projects> Store Files: In their original location). Using this method, the images in the project are all referenced images (i.e. the originals are not moved or copied into the Aperture library). When I later export these projects as described above, the resulting library also does not contain any of the original image files or masters. This is my desired state, and for most of my 2012 projects exactly what happened.
    2. If, however, any of the images in a given library were originally imported so that the master image resided in the Aperture library and that image was subsequently deleted, then the exported library will still contain the masters.
    The solution I found was to open the exported library file and empty the trash (Aperture> Empty Aperture Trash).
    (Of course, the longer term solution within my given workflow is to be careful not to import the masters at the beginning of the process.)
    Hope this helps someone.

  • Inconsistent results with ANSI LEFT JOIN on 9iR2

    Is this a known issue? Is it solved in 10g?
    With the following data setup, I get inconsistent results. It seems to be linked to the combination of using LEFT JOIN with the NULL comparison within the JOIN.
    create table titles (title_id int, title varchar(50));
    insert into titles values (1, 'Red Book');
    insert into titles values (2, 'Yellow Book');
    insert into titles values (3, 'Blue Book');
    insert into titles values (4, 'Orange Book');
    create table sales (stor_id int, title_id int, qty int, email varchar(60));
    insert into sales values (1, 1, 1, '[email protected]');
    insert into sales values (1, 2, 1, '[email protected]');
    insert into sales values (3, 3, 4, null);
    insert into sales values (3, 4, 5, '[email protected]');
    SQL> SELECT titles.title_id, title, qty
    2 FROM titles LEFT OUTER JOIN sales
    3 ON titles.title_id = sales.title_id
    4 AND stor_id = 3
    5 AND sales.email is not null
    6 ;
    TITLE_ID TITLE QTY
    4 Orange Book 5
    3 Blue Book
    1 Red Book
    2 Yellow Book
    SQL>
    SQL> SELECT titles.title_id, title, qty
    2 FROM titles LEFT OUTER JOIN sales
    3 ON titles.title_id = sales.title_id
    4 AND 3 = stor_id
    5 AND sales.email is not null;
    TITLE_ID TITLE QTY
    2 Yellow Book 1
    4 Orange Book 5
    3 Blue Book
    1 Red Book
    It seems to matter what order I specify the operands stor_id = 3, or 3 = stor_id.
    In the older (+) environment, I would understand this, but here? I'm pretty sure most other databases don't care about the order.
    thanks for your insight
    Kevin

    Don't have a 9i around right now to test ... but in 10 ...
    SQL> create table titles (title_id int, title varchar(50));
     
    Table created.
     
    SQL> insert into titles values (1, 'Red Book');
     
    1 row created.
     
    SQL> insert into titles values (2, 'Yellow Book');
     
    1 row created.
     
    SQL> insert into titles values (3, 'Blue Book');
     
    1 row created.
     
    SQL> insert into titles values (4, 'Orange Book');
     
    1 row created.
     
    SQL> create table sales (stor_id int, title_id int, qty int, email varchar(60));
     
    Table created.
     
    SQL> insert into sales values (1, 1, 1, '[email protected]');
     
    1 row created.
     
    SQL> insert into sales values (1, 2, 1, '[email protected]');
     
    1 row created.
     
    SQL> insert into sales values (3, 3, 4, null);
     
    1 row created.
     
    SQL> insert into sales values (3, 4, 5, '[email protected]');
     
    1 row created.
     
    SQL> SELECT titles.title_id, title, qty
      2   FROM titles LEFT OUTER JOIN sales
      3   ON titles.title_id = sales.title_id
      4   AND stor_id = 3
      5   AND sales.email is not null
      6   ;
     
      TITLE_ID TITLE                                                     QTY
             4 Orange Book                                                 5
             3 Blue Book
             1 Red Book
             2 Yellow Book
     
    SQL>
    SQL> SELECT titles.title_id, title, qty
      2   FROM titles LEFT OUTER JOIN sales
      3   ON titles.title_id = sales.title_id
      4   AND 3 = stor_id
      5   AND sales.email is not null;
     
      TITLE_ID TITLE                                                     QTY
             4 Orange Book                                                 5
             3 Blue Book
             1 Red Book
             2 Yellow Book
    SQL> exit
    Disconnected from Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
    With the Partitioning, OLAP and Data Mining options

  • Inconsistent results for SDO_RELATE

    Using SDO_VERSION = 10.2.0.2.0
    I am getting inconsistent results using SDO_RELATE. If I do the whole table (over 200,000 records), one particular record that I know of is skipped while if I pick a smaller range of records ie. 4 records in this case, the record is not skipped. The particular relationship with this record is "touch". Is there any limitation on table size or is this something else? Here is the example :
    -- The column is set to null
    SQL> update nad_als_fixed_stn_10G_HQ set insidecheck = null;
    231484 rows updated.
    -- SDO_RELATE on a few records which actually finds the correct relationship
    SQL> UPDATE nad_als_fixed_stn_10G_HQ C SET C.insidecheck = '1'
    2 WHERE EXISTS (SELECT 1 FROM MetroRegions A, nad_als_fixed_stn_10G_HQ B
    3 WHERE SDO_RELATE(B.location83r, A.geoloc, 'mask=anyinteract') = 'TRUE'
    4 AND C.lic_no = B.lic_no and C.lic_no between 4687157 and 4687223 )
    5 ;
    3 rows updated.
    -- Displays the correct relationship for that record (this is a "touch")
    SQL> select insidecheck from nad_als_fixed_stn_10G_HQ where lic_no = 4687161;
    Inside
    check
    1
    -- Reset the column to null
    SQL> update nad_als_fixed_stn_10G_HQ set insidecheck = null;
    231484 rows updated.
    -- SDO_RELATE on the complete table
    SQL> UPDATE nad_als_fixed_stn_10G_HQ C SET C.insidecheck = '1'
    2 WHERE EXISTS (SELECT 1 FROM MetroRegions A, nad_als_fixed_stn_10G_HQ B
    3 WHERE SDO_RELATE(B.location83r, A.geoloc, 'mask=anyinteract') = 'TRUE'
    4 AND C.lic_no = B.lic_no );
    48488 rows updated.
    -- This particular record which was located correctly earlier appears to be skipped
    SQL> select insidecheck from nad_als_fixed_stn_10G_HQ where lic_no = 4687161;
    Inside
    check
    SQL>
    François Sigouin

    Thanks but it did not solve the problem of inconsistent results. The response time for the first update is much improved though. When I added the hint on the second update (which is about 600 records), it never came back so I tested without it. Any other ideas ?
    François.
    TEST
    SQL> update nad_als_fixed_stn_10G_HQ set insidecheck = null;
    231484 rows updated.
    -- First update with hint, response time is improved but same results obtained
    SQL> UPDATE nad_als_fixed_stn_10G_HQ C SET C.insidecheck = '1'
    2 WHERE EXISTS (SELECT /*+ ORDERED */ 1 FROM MetroRegions A, nad_als_fixed_stn_10G_HQ B
    3 WHERE SDO_RELATE(B.location83r, A.geoloc, 'mask=anyinteract') = 'TRUE'
    4 AND C.lic_no = B.lic_no );
    48488 rows updated.
    SQL> select insidecheck from nad_als_fixed_stn_10G_HQ where lic_no = 4687161;
    Inside
    check
    SQL> update nad_als_fixed_stn_10G_HQ set insidecheck = null;
    231484 rows updated.
    --The second update has to be without the hint otherwise it does not come back.                                                                                                  
    SQL> UPDATE nad_als_fixed_stn_10G_HQ C SET C.insidecheck = '1'
    2 WHERE EXISTS (SELECT 1 FROM MetroRegions A, nad_als_fixed_stn_10G_HQ B
    3 WHERE SDO_RELATE(B.location83r, A.geoloc, 'mask=anyinteract') = 'TRUE'
    4 AND C.lic_no = B.lic_no and C.lic_no between 4687157 and 4687223 )
    5 ;
    3 rows updated.
    SQL> select insidecheck from nad_als_fixed_stn_10G_HQ where lic_no = 4687161;
    Inside
    check
    1

  • Inconsistent results while searching with TREX

    Hi all, I am getting inconsistent results for the same search terms. I am searching for content in a document. One user has read permission on this document and the other doesn't. If I search using the user without read access, no results are displayed. I logged in as a user who has read permission in a different window. A search for the same content displays the document. And now if I search for the same content as the user who doesn't have read permission, it displays the document. Ideally it should not display it. It would be very helpful if somebody can point out what the problem is. Thanks in advance.
    regards
    kranthi

    Hi Kranthi,
    could this be a browser caching or credentials per browser session issue?
    - Do you open the new Window with Ctrl-N?
    - Or do you start a completely new browser (click browser icon a second time)? Does it still happen in that case?
    - Does it also happen, if you completely close the browser in between and then re-open?
    - Does it still happen, if you delete the temporary internet files in between? And/or the cookies?
    Regards,
    Karsten

  • Inconsistent Results from dbms_output.get_lines

    Hi,
    I am getting inconsistent results from using dbms_output.get_lines.
    I'm using get_lines in a procedure A that executes a function B to test if the function returns 0 or > 0 to indicate validity of my data. In that function, I use dbms_output.put_lines to communicate data points that I want to use. My procedure A does a get_lines after executing function B then either logs the lines into a table or sends an email.
    Right now, get_lines is behaving sporadically for me. Sometimes the chararr returns some lines while other times it doesn't. The strange thing is that numlines does return a value, and it's the value that I expected.
    Can someone please help?
    Thanks.

    Use parameters or even global package variables to transport data from one procedure to the other. dbms_output is not meant for this, it will not work.
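    A minimal sketch of that suggestion, with hypothetical names: the function hands its detail back through an OUT parameter, and dbms_output is left for display only.
    create or replace function check_data (p_detail out varchar2)
      return number
    is
    begin
      -- validation logic would go here; 0 = valid, > 0 = invalid
      p_detail := 'detail the caller needs';
      return 0;
    end;
    /
    declare
      v_detail varchar2(4000);
      v_result number;
    begin
      v_result := check_data(v_detail);   -- data travels via the parameter, not the buffer
      dbms_output.put_line(v_result || ': ' || v_detail);
    end;
    /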
