cfheader / cfcontent producing inconsistent results

Hi all,
I am using cfheader to display a dynamically generated report as a Word doc. At least, that is my goal.
I'm using MX 7 on Apache 2.0, with Firefox 2.0.0.16 (which I cannot update) and IE 6 SP3 as browsers.
No matter how I set this up, I don't get the expected results:
<cfheader name "content-disposition" value="attachment;filename=#GetInspection.file_number#.doc">
<cfcontent type="application/vnd.msword">
In Firefox the file always downloads as a .cfm file. I can't change this.
In IE it downloads as a .cfm file, but if I elect to save it I can select Word.
The other issue, in both browsers, is that it will not take on the file name assigned in the cfheader tag. It just uses the template name.
Anyone have a suggestion?
tia
J

The equals sign is missing for the name attribute. I also think you have to include the file attribute in cfcontent.
For example, suppose I have stored the file MyFileName.doc in the web root. When the user opens the page I want the file that is displayed to be named SomeOtherFileName.doc instead. Then some possibilities are:
1) To download and save the file:
<cfset GetInspection.file_number = "SomeOtherFileName">
<cfheader name="Content-Disposition" value="attachment;filename=#GetInspection.file_number#.doc">
<cfcontent type="application/msword" file="C:\ColdFusionCentaur\wwwroot\MyFileName.doc">
2) To open and view the file in the browser:
<cfset GetInspection.file_number = "SomeOtherFileName">
<cfheader name="Content-Disposition" value="inline;filename=#GetInspection.file_number#.doc">
<cfcontent type="application/msword" file="C:\ColdFusionCentaur\wwwroot\MyFileName.doc">
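If the report is generated on the fly rather than stored as a file (as in the original post), the same headers should also work with cfcontent's reset attribute. A minimal sketch, assuming the generated markup is held in a hypothetical variable named reportHtml:
<cfheader name="Content-Disposition" value="attachment;filename=#GetInspection.file_number#.doc">
<!--- reset="yes" discards any page output generated before this point --->
<cfcontent type="application/msword" reset="yes">
<cfoutput>#reportHtml#</cfoutput>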

Similar Messages

  • Save for Web producing inconsistent results

    Hi-
    I am currently working on a printed book that will also be available online. I am a print designer, and am having difficulty saving for the web. I am setting type in Illustrator, and then adding a pixel font for annotations. When saving for the web, I first rasterize the pixel font with anti-alias off (which is working fine), but when I save for the web (PNG-24), the results for the system fonts are mixed; some text is blurry, some text is light, and other text is dark (the pixel font is okay). I am really stuck here. I have tried rasterizing all the text, anti-aliasing the system fonts, creating outlines, but nothing seems to work. Is there anyone out there who knows what I am doing wrong?

    Are you rasterizing your non-pixel-font text with "preserve hinting" in the rasterize settings?
    This topic always spurs a lot of controversy here, mostly because Illustrator does a pretty mediocre job of rasterizing text compared to what Safari, Preview, and Reader 9 can do. There's no one good way to do it in Illustrator.
    Can you post a sample of the results you are getting?

  • Math.round produces inconsistent results

    We've been developing a tax calculator using the Flex 4 SDK in Flash Builder 4, and have experienced difficulties with rounding numbers - for example, the following statement:
         Alert.show(Math.round((8.325 * 100.0)) + ":" + Math.round(832.5));
    Both of the values rounded above should equal 833; however, the first shows 832 while the second shows 833. If both of the expressions inside the Math.round() calls equal 832.5, how can the resulting values differ?

    Thanks, Shongrunden - I found that same link literally a minute or two before I saw your response - after some preliminary testing, it looks like the following functions work well as a workaround:
     * Corrects errors caused by floating point math.
     public function correctFloatingPointError(number:Number, precision:int = 5):Number
     {
          // the default (precision = 5) returns Math.round(100000 * number) / 100000,
          // which should correct very small floating point errors
          var correction:Number = Math.pow(10, precision);
          return Math.round(correction * number) / correction;
     }
     * Tests if two numbers are almost equal.
     public function fuzzyEquals(number1:Number, number2:Number, precision:int = 5):Boolean
     {
          var difference:Number = number1 - number2;
          var range:Number = Math.pow(10, -precision);
          // default check: -0.00001 < difference < 0.00001
          return difference < range && difference > -range;
     }
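    (For context: 8.325 has no exact binary representation, so 8.325 * 100.0 evaluates to roughly 832.4999999999999 in IEEE-754 doubles, which rounds down, while the literal 832.5 rounds up. A hypothetical usage sketch of the functions above:)
         var raw:Number = 8.325 * 100.0;                    // ~832.4999999999999
         trace(Math.round(raw));                            // 832 - the surprising result
         trace(Math.round(correctFloatingPointError(raw))); // 833 - after correction
         trace(fuzzyEquals(raw, 832.5));                    // true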

  • What am I doing wrong with my form - XML schema producing inconsistent results

    I have an interactive dynamic form built in LCD 8.2.  The idea is that users fill in the form to indicate any fields that they want to update on our directory.
    We do not have the necessary plumbing to get it to feed directly to the directory so the workaround is that the form generates an XML file.  These files are then imported into Excel and then that file is submitted as a bulk update request.
    I have a schema that I have embedded in the form using the Data Connection
    <?xml version="1.0" encoding="UTF-8" standalone="no" ?>
    <xsd:schema xmlns:xsd="http://www.w3.org/2001/XMLSchema">
    <xsd:element name="MySite">
      <xsd:complexType>
       <xsd:sequence>
        <xsd:element minOccurs="0" maxOccurs="unbounded" name="general">
         <xsd:complexType>
          <xsd:all>
           <xsd:element minOccurs="0" maxOccurs="1" name="FullName" />
           <xsd:element minOccurs="0" maxOccurs="1" name="StaffNumber" />
           <xsd:element minOccurs="0" maxOccurs="1" name="JobTitle" />
           <xsd:element minOccurs="0" maxOccurs="1" name="Responsibilities" />
           <xsd:element minOccurs="0" maxOccurs="1" name="LineManager" />
           <xsd:element minOccurs="0" maxOccurs="1" name="LMStaffNumber" />
           <xsd:element minOccurs="0" maxOccurs="1" name="AltContact" />
           <xsd:element minOccurs="0" maxOccurs="1" name="AltSE" />
           <xsd:element minOccurs="0" maxOccurs="1" name="Unit" />
           <xsd:element minOccurs="0" maxOccurs="1" name="TelExt" />
           <xsd:element minOccurs="0" maxOccurs="1" name="WorkMobile" />
           <xsd:element minOccurs="0" maxOccurs="1" name="OutHoursTel" />
          </xsd:all>
         </xsd:complexType> 
        </xsd:element>
       </xsd:sequence>
      </xsd:complexType>
    </xsd:element>
    </xsd:schema>
    I use this same schema in Excel and then generate a test file from my form and then import it.  All goes well.
    I then try it again or get someone else to try it and it fails. What am I doing wrong? I thought that embedding the schema into the form would mean that it would become part of the form, but I keep getting an error message saying it cannot find an XML map that corresponds to the data.

    Thanks for your reply.
    The form is fine and no error messages are evident until I attempt to import an xml file into Excel.  The schema is very, very basic and no validation has been applied.
    I've looked at the form today to investigate what the problem may be and have discovered three issues.
    I tried with two different versions of the form to try and work out why one version worked and the other did not.  The only obvious difference between them was that one had been Reader Enabled and it was this form that did not work.
    The non-enabled form produced this output:
    <?xml version="1.0" encoding="UTF-8" ?>
    <MySite>
    <general>
    <FullName>th</FullName>
    </general>
    </MySite>
    The enabled form produced something different.
    <?xml version="1.0" encoding="UTF-8" ?>
    <xfa:data xmlns:xfa="http://www.xfa.org/schema/xfa-data/1.0/">
    <MySite>
    <general>
    <FullName>gt</FullName>
    </general>
    </MySite>
    <MySite />
    </xfa:data>
    Files with this extra line of code would fail to be imported into the Excel file.
    Another issue was that the Reader Enabled form had a completely different file path for the embedded schema.  In the non-enabled version the path pointed to the correct location on my personal drive.  The Reader Enabled version could not find the schema as the path was incorrect and was pointing to a folder on the temporary drive.  At no point had I manually amended this path to point to there!
    The non-enabled form allowed me to import an xml file whereas the enabled form gave me an error message stating that:
    "The document restricts some Acrobat features to allow for extended features in Adobe Reader.  To create a copy of the document that is not restricted (and has no extended features in Adobe Reader), click Save a Copy."
    Are these expected behaviours or is there something wrong?
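    (One possible workaround - an assumption, not something from the thread: strip the xfa:data envelope before importing, since Excel's XML map expects MySite as the root element. A hypothetical XSLT sketch:)
    <?xml version="1.0" encoding="UTF-8"?>
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
        xmlns:xfa="http://www.xfa.org/schema/xfa-data/1.0/">
     <!-- copy only the first (populated) MySite element out of the envelope -->
     <xsl:template match="/xfa:data">
      <xsl:copy-of select="MySite[1]"/>
     </xsl:template>
    </xsl:stylesheet>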

  • Inconsistent results with MDX formula

    Hi. I'm converting a BSO cube to ASO, and it has dynamically calculated formulas that I'm converting to MDX. I have a formula that is supposed to accumulate an account (Order Intake) through the months and years until it gets to the current month of the current year (set by substitution variables) and then just carries that balance forward until the end.
    This is the formula I wrote in MDX.
    IIF( Count( Intersect( {MemberRange([Years].[FY95], [&Auto_CurYr].Lag(1))}, {Years.CurrentMember} ) ) = 1,
     IIF( CurrentMember([Period]) = [Jan],
      [Order Intake] + ([Contract Value],[Adj],[Years].CurrentMember.PrevMember),
      [Order Intake] + ([Contract Value],[Period].CurrentMember.PrevMember) ),
     IIF( CurrentMember([Years]) = [&Auto_CurYr],
      IIF( CurrentMember([Period]) = [Jan],
       [Order Intake] + ([Contract Value],[Adj],[Years].CurrentMember.PrevMember),
       IIF( Count( Intersect( {MemberRange([Period].[Feb], [&Auto_CurMoNext_01].Lag(1))}, {Period.CurrentMember} ) ) = 1,
        [Order Intake] + ([Contract Value],[Period].CurrentMember.PrevMember),
        ([Contract Value],[Period].CurrentMember.PrevMember) ) ),
      ([Contract Value],[Adj],[Years].CurrentMember.PrevMember) ) ) /* This last value evaluates for months and years after the current month and year */
    The inconsistent results are as follows:
    I have a spreadsheet that has the years and months across the top in columns. The substitution variables are set to FY09 for the year and Oct for the month. The formula works fine until it gets to Jan of FY10, at which point it produces a number out of thin air, and carries that incorrect number through to the end.
    When I put the years and months into my rows, however, and then drill down on the months, I get different results. Not only different, but different results at different times, too. When I first drilled, all results were correct. Now when I drill, it produces a random number in October of FY09 (not entirely random, but actually double what it's supposed to be), then #missing in Nov of FY09, then the correct number thereafter. Same exact data intersection on both spreadsheets, different results. I've retrieved over and over again, and the only time it might change is if I re-drill. I've used both Essbase Add-in and Smart View with consistently inconsistent results.
    Has anyone ever encountered this sort of behavior with an MDX formula?

    Well, I finally got a formula that works. I did end up using a combination of CASE and IIF, but I never did figure out how to deal with summing up ranges of data correctly while accounting for changing substitution variables, so I had to do a lot of hard coding by month. For instance, I couldn't ask it to sum(([Order Intake],[Jan],[&Auto_CurYr]):([Order Intake],[&Auto_CurMo],[&Auto_CurYr])). Although it validated fine, when I tried to retrieve it said members were not of the same generation, presumably because my substitution variable could potentially be a non-level-0 month (it worked if I hard coded the end month). Also, I really don't like the MDX version of @LSIBLINGS and @RSIBLINGS.
    But this works.
    CASE
    When Count( Intersect( {MemberRange([Years].[FY95], [&Auto_CurYr].Lag(1))}, {Years.CurrentMember} ) ) = 1
    THEN IIF(CurrentMember ([Period]) = [Jan],
    [Order Intake] + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Feb],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Feb]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Mar],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Mar]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Apr],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Apr]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [May],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[May]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Jun],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Jun]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Jul],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Jul]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Aug],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Aug]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Sep],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Sep]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Oct],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Oct]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Nov],
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Nov]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Dec]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)) )))))))))))
    When CurrentMember ([Years]) IS [&Auto_CurYr]
    THEN IIF(CurrentMember ([Period]) = [Jan],
    [Order Intake] + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Feb] AND CONTAINS([Feb], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Feb]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Mar] AND CONTAINS([Mar], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Mar]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Apr] AND CONTAINS([Apr], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Apr]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [May] AND CONTAINS([May], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[May]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Jun] AND CONTAINS([Jun], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Jun]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Jul] AND CONTAINS([Jul], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Jul]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Aug] AND CONTAINS([Aug], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Aug]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Sep] AND CONTAINS([Sep], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Sep]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Oct] AND CONTAINS([Oct], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Oct]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    IIF(CurrentMember ([Period]) = [Nov] AND CONTAINS([Nov], {MEMBERRANGE([&Auto_CurMoNext_01].FirstSibling, [&Auto_CurMoNext_01].Lag(1))}),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[Nov]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)),
    Sum(CrossJoin({[Order Intake]}, {[Jan]:[&Auto_CurMo]})) + sum(([Order Intake],[YearTotal],[FY95]):([Order Intake],[YearTotal],[Years].CurrentMember.PrevMember)) )))))))))))
    WHEN CONTAINS([Years].CurrentMember, {MemberRange([&Auto_CurYr].Lead(1), [Years].[FY15])})
    THEN ([Contract Value],[Adj],[Years].&Auto_CurYr)
    END
    Thanks for looking at it, Gary, I appreciate it.
    Sabrina
    Edited by: SabrinaD on Nov 18, 2009 2:29 PM
    Edited by: SabrinaD on Nov 18, 2009 2:31 PM
    Edited by: SabrinaD on Nov 18, 2009 2:34 PM
    Edited by: SabrinaD on Nov 18, 2009 2:35 PM

  • Filter expression producing different results after upgrade to 11.1.1.7

    Hello,
    We recently did an upgrade and noticed that on a number of reports where we're using the FILTER expression the numbers are very inflated. Where we are not using the FILTER expression the numbers are as expected. In the example below we ran the 'Bookings' report in 10g and got one number, then ran the same report in 11g (11.1.1.7.0) after the upgrade and got a very different result. The data source is the same database for each environment. Also, when running the physical SQL generated by the 10g and 11g versions of the report, we get the inflated numbers from the 11g SQL. Any ideas on what might be happening or causing the issue?
    10g report: 2016-Q3......Bookings..........72,017
    11g report: 2016-Q3......Bookings..........239,659
    This is the simple FILTER expression that is being used in the column formula on the report itself for this particular scenario which produces different results in 10g and 11g.
    FILTER("Fact - Opportunities"."Won Opportunity Amount" USING ("Opportunity Attributes"."Business Type" = 'New Business'))
    -------------- Physical SQL created by 10g report -------- results as expected --------------------------------------------
    WITH
    SAWITH0 AS (select sum(case when T33142.OPPORTUNITY_STATUS = 'Won-closed' then T33231.USD_LINE_AMOUNT else 0 end ) as c1,
    T28761.QUARTER_YEAR_NAME as c2,
    T28761.QUARTER_RANK as c3
    from
    XXFI.XXFI_GL_FISCAL_MONTHS_V T28761 /* Dim_Periods */ ,
    XXFI.XXFI_OSM_OPPTY_HEADER_ACCUM T33142 /* Fact_Opportunity_Headers(CloseDate) */ ,
    XXFI.XXFI_OSM_OPPTY_LINE_ACCUM T33231 /* Fact_Opportunity_Lines(CloseDate) */
    where ( T28761.PERIOD_NAME = T33142.CLOSE_PERIOD_NAME and T28761.QUARTER_YEAR_NAME = '2012-Q3' and T33142.LEAD_ID = T33231.LEAD_ID and T33231.LINES_BUSINESS_TYPE = 'New Business' and T33142.OPPORTUNITY_STATUS <> 'Duplicate' )
    group by T28761.QUARTER_YEAR_NAME, T28761.QUARTER_RANK)
    select distinct SAWITH0.c2 as c1,
    'Bookings10g' as c2,
    SAWITH0.c1 as c3,
    SAWITH0.c3 as c5,
    SAWITH0.c1 as c7
    from
    SAWITH0
    order by c1, c5
    -------------- Physical SQL created by the same report as above but in 11g (11.1.1.7.0) -------- results much higher --------------------------------------------
    WITH
    SAWITH0 AS (select sum(case when T33142.OPPORTUNITY_STATUS = 'Won-closed' then T33142.TOTAL_OPPORTUNITY_AMOUNT_USD else 0 end ) as c1,
    T28761.QUARTER_YEAR_NAME as c2,
    T28761.QUARTER_RANK as c3
    from
    XXFI.XXFI_GL_FISCAL_MONTHS_V T28761 /* Dim_Periods */ ,
    XXFI.XXFI_OSM_OPPTY_HEADER_ACCUM T33142 /* Fact_Opportunity_Headers(CloseDate) */ ,
    XXFI.XXFI_OSM_OPPTY_LINE_ACCUM T33231 /* Fact_Opportunity_Lines(CloseDate) */
    where ( T28761.PERIOD_NAME = T33142.CLOSE_PERIOD_NAME and T28761.QUARTER_YEAR_NAME = '2012-Q3' and T33142.LEAD_ID = T33231.LEAD_ID and T33231.LINES_BUSINESS_TYPE = 'New Business' and T33142.OPPORTUNITY_STATUS <> 'Duplicate' )
    group by T28761.QUARTER_YEAR_NAME, T28761.QUARTER_RANK),
    SAWITH1 AS (select distinct 0 as c1,
    D1.c2 as c2,
    'Bookings2' as c3,
    D1.c3 as c4,
    D1.c1 as c5
    from
    SAWITH0 D1),
    SAWITH2 AS (select D1.c1 as c1,
    D1.c2 as c2,
    D1.c3 as c3,
    D1.c4 as c4,
    D1.c5 as c5,
    sum(D1.c5) as c6
    from
    SAWITH1 D1
    group by D1.c1, D1.c2, D1.c3, D1.c4, D1.c5)
    select D1.c1 as c1, D1.c2 as c2, D1.c3 as c3, D1.c4 as c4, D1.c5 as c5, D1.c6 as c6 from ( select D1.c1 as c1,
    D1.c2 as c2,
    D1.c3 as c3,
    D1.c4 as c4,
    D1.c5 as c5,
    sum(D1.c6) over () as c6
    from
    SAWITH2 D1
    order by c1, c4, c3 ) D1 where rownum <= 2000001
    Thank you,
    Mike
    Edited by: Mike Jelen on Jun 7, 2013 2:05 PM

    Thank you for the info. They are definitely different values, since one is on the header and the other is on the lines; summing a header-level amount after joining to multiple lines will inflate the total. As the "Won Opportunity" logical column is mapped to multiple LTS, it appears OBI 11g uses a different algorithm than 10g to determine the most efficient table to use in query generation. I'll need to spend some time researching the impact of adding a 'sort' to the LTS. I'm hoping there's a way to get OBI to use logic similar to 10g's when it sets the table priority.
    Thx again,
    Mike

  • SQL Query produces different results when inserting into a table

    I have an SQL query which produces different results when run as a simple query to when it is run as an INSERT INTO table SELECT ...
    The query is:
    SELECT   mhldr.account_number
    ,        NVL(MAX(DECODE(ap.party_sysid, mhldr.party_sysid,ap.empcat_code,NULL)),'UNKNWN') main_borrower_status
    ,        COUNT(1) num_apps
    FROM     app_parties ap
    ,        (
    SELECT   accsta.account_number
    ,        actply.party_sysid
    ,        RANK() OVER (PARTITION BY actply.table_sysid, actply.loanac_latype_code ORDER BY start_date, SYSID) ranking
    FROM     activity_players actply
    ,        account_status accsta
    WHERE    1 = 1
    AND      actply.table_id (+) = 'ACCGRP'
    AND      actply.acttyp_code (+) = 'MHLDRM'
    AND      NVL(actply.loanac_latype_code (+),TO_NUMBER(SUBSTR(accsta.account_number,9,2))) = TO_NUMBER(SUBSTR(accsta.account_number,9,2))
    AND      actply.table_sysid (+) = TO_NUMBER(SUBSTR(accsta.account_number,1,8))
    ) mhldr
    WHERE    1 = 1
    AND      ap.lenapp_account_number (+) = TO_NUMBER(SUBSTR(mhldr.account_number,1,8))
    GROUP BY mhldr.account_number;
    The INSERT INTO code:
    TRUNCATE TABLE applicant_summary;
    INSERT /*+ APPEND */
    INTO     applicant_summary
    (  account_number
    ,  main_borrower_status
    ,  num_apps
    )
    SELECT   mhldr.account_number
    ,        NVL(MAX(DECODE(ap.party_sysid, mhldr.party_sysid,ap.empcat_code,NULL)),'UNKNWN') main_borrower_status
    ,        COUNT(1) num_apps
    FROM     app_parties ap
    ,        (
    SELECT   accsta.account_number
    ,        actply.party_sysid
    ,        RANK() OVER (PARTITION BY actply.table_sysid, actply.loanac_latype_code ORDER BY start_date, SYSID) ranking
    FROM     activity_players actply
    ,        account_status accsta
    WHERE    1 = 1
    AND      actply.table_id (+) = 'ACCGRP'
    AND      actply.acttyp_code (+) = 'MHLDRM'
    AND      NVL(actply.loanac_latype_code (+),TO_NUMBER(SUBSTR(accsta.account_number,9,2))) = TO_NUMBER(SUBSTR(accsta.account_number,9,2))
    AND      actply.table_sysid (+) = TO_NUMBER(SUBSTR(accsta.account_number,1,8))
    ) mhldr
    WHERE    1 = 1
    AND      ap.lenapp_account_number (+) = TO_NUMBER(SUBSTR(mhldr.account_number,1,8))
    GROUP BY mhldr.account_number;
    When run as a query, this code consistently returns 2 for the num_apps field (for a certain group of accounts), but when run as an INSERT INTO command, the num_apps field is logged as 1. I have secured the tables used within the query to ensure that nothing is changing the data in the underlying tables.
    If I run the query as a cursor for loop with an insert into the applicant_summary table within the loop, I get the same results in the table as I get when I run as a stand alone query.
    I would appreciate any suggestions for what could be causing this odd behaviour.
    Cheers,
    Steve
    Oracle database details:
    Oracle Database 10g Release 10.2.0.2.0 - Production
    PL/SQL Release 10.2.0.2.0 - Production
    CORE 10.2.0.2.0 Production
    TNS for 32-bit Windows: Version 10.2.0.2.0 - Production
    NLSRTL Version 10.2.0.2.0 - Production
    Edited by: stevensutcliffe on Oct 10, 2008 5:26 AM
    Edited by: stevensutcliffe on Oct 10, 2008 5:27 AM

    stevensutcliffe wrote:
    Yes, using COUNT(*) gives the same result as COUNT(1).
    I have found another example of this kind of behaviour:
    Running the following INSERT statements produces different values for the total_amount_invested and num_records fields. It appears that adding the additional aggregation (MAX(amount_invested)) is causing problems with the other aggregated values.
    Again, I have ensured that the source data and destination tables are not being accessed / changed by any other processes or users. Is this potentially a bug in Oracle?
    Just as a side note, these are not INSERT statements but CTAS statements.
    The only non-bug explanation for this behaviour would be a potential query rewrite happening only under particular circumstances (but not always) in the lower integrity modes "trusted" or "stale_tolerated". So if you're not aware of any corresponding materialized views, your QUERY_REWRITE_INTEGRITY parameter is set to the default of "enforced" and your explain plan doesn't show any "MAT_VIEW REWRITE ACCESS" lines, I would consider this as a bug.
    Since you're running on 10.2.0.2 it's not unlikely that you hit one of the various "wrong result" bugs that exist(ed) in Oracle. I'm aware of a particular one I've hit in 10.2.0.2 when performing a parallel NESTED LOOP ANTI operation which returned wrong results, but only in parallel execution. Serial execution was showing the correct results.
    If you're performing parallel ddl/dml/query operations, try to do the same in serial execution to check if it is related to the parallel feature.
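    A quick way to test that (a sketch, not from the thread, reusing the statement from the posts above):
    -- force serial execution for the session, then re-run the
    -- INSERT ... SELECT and compare num_apps with the stand-alone query
    ALTER SESSION DISABLE PARALLEL DML;
    ALTER SESSION DISABLE PARALLEL QUERY;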
    You could also test if omitting the "APPEND" hint changes anything, but still these are just workarounds for buggy behaviour.
    I suggest to consider installing the latest patch set 10.2.0.4 but this requires thorough testing because there were (more or less) subtle changes/bugs introduced with [10.2.0.3|http://oracle-randolf.blogspot.com/2008/02/nasty-bug-introduced-with-patch-set.html] and [10.2.0.4|http://oracle-randolf.blogspot.com/2008/04/overview-of-new-and-changed-features-in.html].
    You could also open a SR with Oracle and clarify if there is already a one-off patch available for your 10.2.0.2 platform release. If not it's quite unlikely that you are going to get a backport for 10.2.0.2.
    Regards,
    Randolf
    Oracle related stuff blog:
    http://oracle-randolf.blogspot.com/
    SQLTools++ for Oracle (Open source Oracle GUI for Windows):
    http://www.sqltools-plusplus.org:7676/
    http://sourceforge.net/projects/sqlt-pp/

  • Query returning inconsistent results

    Hi,
    I'm running Oracle 10.2.0.4 and have a partitioned table.
    When I run the query
    select bse_rem_bse, ben_aly_num from dw_cn2.prs_old
    where bse_rem_bse = 3
    I'm getting records with null data (i.e. bse_rem_bse is null) as well as the correct data (i.e. bse_rem_bse is 3).
    bse_rem_bse is a number type data field.
    Can anyone help me understand this inconsistent result,
    and how to find what's broken in the table?
    Thanks
    Tarun

    Post your table structure along with data types and some sample data here. Also, post your DB version by executing the following query -
    select * from v$version;
    Regards.
    Satyaki De.
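    (A possible next step - an assumption, not from the thread: results that differ between index access and a full scan often point to a corrupt index, so cross-checking the two is worth a try. A sketch using the names posted above:)
    -- cross-check table rows against index entries; raises an error on mismatch
    ANALYZE TABLE dw_cn2.prs_old VALIDATE STRUCTURE CASCADE;
    -- compare a forced full scan against the original (possibly index-driven) plan
    SELECT /*+ FULL(p) */ COUNT(*) FROM dw_cn2.prs_old p WHERE bse_rem_bse = 3;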

  • Spatial Queries Not Always Producing Accurate Results

    Hi,
    Spatial queries are not always producing accurate results. Here are the issues. We would appreciate any clarification you could provide to resolve these issues.
    1. When querying for points inside a polygon that is not an MBR (minimum bounded rectangle), some of the coordinates returned are not inside the polygon. It is as though the primary filter is working, but not the secondary filter when using sdo_relate. How can we validate that the spatial query using sdo_relate is using the secondary filter?
    2. SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT returns true when validating geometries even though we find results that are invalid.
    3. Illegal geodetic coordinates can be inserted into a table: latitude > 90.0, latitude < -90.0, longitude > 180.0 or longitude < -180.0.
    4. Querying for coordinates outside the MBR for the world where illegal coordinates existed did NOT return any rows, yet there were coordinates of long, lat: 181,91.
    The following are examples and information relating to the above-referenced points.
    select * from USER_SDO_GEOM_METADATA
    TABLE_NAME      COLUMN_NAME      DIMINFO(SDO_DIMNAME, SDO_LB, SDO_UB, SDO_TOLERANCE)      SRID
    LASTKNOWNPOSITIONS      THE_GEOM SDO_DIM_ARRAY(SDO_DIM_ELEMENT('X', -180, 180, .05), SDO_DIM_ELEMENT('Y', -90, 90, .05))      8307
    POSITIONS     THE_GEOM SDO_DIM_ARRAY(SDO_DIM_ELEMENT('X', -180, 180, .05), SDO_DIM_ELEMENT('Y', -90, 90, .05))      8307
    Example 1: Query for coordinates inside NON-rectangular polygon includes points outside of polygon.
    SELECT l.vesselid, l.latitude, l.longitude, TO_CHAR(l.observationtime,
    'YYYY-MM-DD HH24:MI:SS') as obstime FROM lastknownpositions l where
    SDO_RELATE(l.the_geom,SDO_GEOMETRY(2003, 8307, NULL,
    SDO_ELEM_INFO_ARRAY(1, 1003, 1),
    SDO_ORDINATE_ARRAY(-98.20268,18.05079,-57.30101,18.00705,-57.08229,
    54.66061,-98.59638,32.87842,-98.20268,18.05079)),'mask=inside')='TRUE'
    This query returns the following coordinates that are outside of the polygon:
    vesselid : 1152 obstime : 2005-08-24 06:00:00 long : -82.1 lat : 45.3
    vesselid : 3140 obstime : 2005-08-28 12:00:00 long : -80.6 lat : 44.6
    vesselid : 1253 obstime : 2005-08-22 09:00:00 long : -80.0 lat : 45.3
    Example 2a: Using SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT
    Select areaid, the_geom,
    SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT(the_geom, 0.005) from area where
    areaid=24
    ResultSet:
    AREAID THE_GEOM(SDO_GTYPE, SDO_SRID, SDO_POINT(X, Y, Z), SDO_ELEM_INFO,
    SDO_ORDINATES) SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT(THE_GEOM,0.005)
    24 SDO_GEOMETRY(2003, 8307, NULL, SDO_ELEM_INFO_ARRAY(1, 1003, 1), SDO_ORDINATE_ARRAY(-98.20268, 18.05079, -57.30101, 18.00705, -57.08229, 54.66061, -98.59638, 32.87842, -98.20268, 18.05079)) TRUE
    Example 2b: Using SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT
    Select positionid, vesselid, the_geom,
    SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT(the_geom, 0.005) from positions where vesselid=1152
    ResultSet:
    POSITIONID VESSELID THE_GEOM(SDO_GTYPE, SDO_SRID, SDO_POINT(X, Y, Z),
    SDO_ELEM_INFO, SDO_ORDINATES) DO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT(THE_GEOM,0.005)
    743811 1152 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(-82.1, 45.3, NULL), NULL, NULL) TRUE
    743812 1152 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(-82.1, 45.3, NULL), NULL, NULL) TRUE
    743813 1152 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(-80.2, 42.5, NULL), NULL, NULL) TRUE
    743814 1152 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(-80.2, 42.5, NULL), NULL, NULL) TRUE
    Example 3: Invalid Coordinate values found in POSITIONS table.
    SELECT p.positionid, p.latitude, p.longitude, p.the_geom FROM positions p
    WHERE p.latitude < -180.0
    2 lines from ResultSet:
    POSITIONID LATITUDE LONGITUDE THE_GEOM(SDO_GTYPE, SDO_SRID, SDO_POINT(X, Y, Z), SDO_ELEM_INFO, SDO_ORDINATES)
    714915 -210.85408 -79.74449 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(-79.74449, -210.85408, NULL), NULL, NULL)
    714938 -211.13632 -79.951256 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(-79.951256, -211.13632, NULL), NULL, NULL)
    SELECT p.positionid, p.latitude, p.longitude, p.the_geom FROM positions p
    WHERE p.longitude > 180.0
    3 lines from ResultSet:
    POSITIONID LATITUDE LONGITUDE THE_GEOM(SDO_GTYPE, SDO_SRID, SDO_POINT(X, Y, Z), SDO_ELEM_INFO, SDO_ORDINATES)
    588434 91 181 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(181, 91, NULL), NULL, NULL)
    589493 91 181 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(181, 91, NULL), NULL, NULL)
    589494 91 181 SDO_GEOMETRY(2001, 8307, SDO_POINT_TYPE(181, 91, NULL), NULL, NULL)
    Example 4: Failure to locate illegal coordinates by querying for disjoint coordinates outside of MBR for the world:
    SELECT p.vesselid, p.latitude, p.longitude, p.the_geom,
    TO_CHAR(p.observationtime, 'YYYY-MM-DD HH24:MI:SS') as obstime,
    SDO_GEOM.RELATE(p.the_geom, 'determine',
    SDO_GEOMETRY(2003, 8307, NULL,SDO_ELEM_INFO_ARRAY(1, 1003, 1),
    SDO_ORDINATE_ARRAY(-180.0,-90.0,180.0,-90.0,180.0,90.0,
    -180.0,90.0,-180.0,-90.0)), .005) relationship FROM positions p where
    SDO_GEOM.RELATE(p.the_geom, 'disjoint', SDO_GEOMETRY(2003, 8307,
    NULL,SDO_ELEM_INFO_ARRAY(1, 1003, 1),
    SDO_ORDINATE_ARRAY(-180.0,-90.0,180.0,-90.0,180.0,90.0,-80.0,90.0,
    -180.0,-90.0)),.005)='TRUE'
    no rows selected
    Carol Saah

    Hi Carol,
    1) I think the results are correct. Note in a geodetic coordinate system adjacent points in a linestring or polygon are connected via geodesics. You are probably applying planar thinking to an ellipsoidal problem! I don't have time to do the full analysis right now, but a first guess is that is what is happening.
    2) The query window seems to be valid. I don't think this is a problem.
    3) Oracle will let you insert most anything into a table. In the index, it probably wraps. If you validate, I think the validation routines will tell you it is illegal if you use the signature with diminfo, where the coordinate system bounds are included in the validation.
    4) Your query window is not valid. Your data is not valid. As the previous reply stated, you need to have valid data. If you think in terms of a geodetic coordinate system, you will realize that -180.0,-90.0 and 180.0,-90.0 are really the same point. Also, Oracle has a rule that polygon geometries cannot be greater than half the surface of the Earth.
    Hope this helps.
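    (Following up on point 3, a hypothetical sketch of the diminfo signature, so the coordinate-system bounds from USER_SDO_GEOM_METADATA are enforced - table and column names taken from the post above:)
    SELECT p.positionid,
           SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT(p.the_geom, m.diminfo) AS status
      FROM positions p, user_sdo_geom_metadata m
     WHERE m.table_name = 'POSITIONS'
       AND m.column_name = 'THE_GEOM'
       AND SDO_GEOM.VALIDATE_GEOMETRY_WITH_CONTEXT(p.the_geom, m.diminfo) <> 'TRUE';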

  • Inconsistent results from rpm

    Hi Oracle Forums,
         I am installing packages required for Oracle Database 11gR2, and am having problems with rpm responses.
         When I query rpm about a package, it tells me that the package is not installed. When I go to install the package, rpm informs me that the package is already installed. An example is shown below:
    [root@OELVM02 Server]# uname -a
    Linux OELVM02.localdomain 2.6.32-300.10.1.el5uek #1 SMP Wed Feb 22 17:37:40 EST 2012 x86_64 x86_64 x86_64 GNU/Linux
    [root@OELVM02 Server]#
    [root@OELVM02 Server]# pwd
    /media/OL5.8 x86_64 dvd 20120229/Server
    [root@OELVM02 Server]#
    [root@OELVM02 Server]# ls -alrt binutils-2.17.50.0.6-20.el5.x86_64.rpm
    -rw-r--r-- 1 root root 3069914 Dec 28  2011 binutils-2.17.50.0.6-20.el5.x86_64.rpm
    [root@OELVM02 Server]#
    [root@OELVM02 Server]#  rpm -q ./binutils-2.17.50.0.6-20.el5.x86_64.rpm
    package ./binutils-2.17.50.0.6-20.el5.x86_64.rpm is not installed
    [root@OELVM02 Server]#
    [root@OELVM02 Server]# rpm -ivh ./binutils-2.17.50.0.6-20.el5.x86_64.rpm
    warning: ./binutils-2.17.50.0.6-20.el5.x86_64.rpm: Header V3 DSA signature: NOKEY, key ID 1e5e0159
    Preparing...                ########################################### [100%]
            package binutils-2.17.50.0.6-20.el5.x86_64 is already installed
    [root@OELVM02 Server]#
    [root@OELVM02 Server]# rpm -q ./binutils-2.17.50.0.6-20.el5.x86_64.rpm
    package ./binutils-2.17.50.0.6-20.el5.x86_64.rpm is not installed
    [root@OELVM02 Server]#
    I do not understand the inconsistent results that rpm is giving me.
         Any help would be greatly appreciated
         Thanks
         Gavin

    Hi Avi,
         Thanks for your quick response!!
    The Oracle documentation requires that both the 32- and 64-bit rpms be installed for some packages. In the below scenario, is rpm telling me that both the 32- and 64-bit packages are installed?
    [root@OELVM02 Server]# uname -a
    Linux OELVM02.localdomain 2.6.32-300.10.1.el5uek #1 SMP Wed Feb 22 17:37:40 EST 2012 x86_64 x86_64 x86_64 GNU/Linux
    [root@OELVM02 Server]#
    [root@OELVM02 Server]# pwd
    /media/OL5.8 x86_64 dvd 20120229/Server
    [root@OELVM02 Server]#
    [root@OELVM02 Server]# ls -alrt *glibc-2*
    -rw-r--r-- 1 root root 1544040 Nov 18  2010 compat-glibc-2.3.4-2.26.x86_64.rpm
    -rw-r--r-- 1 root root 1069214 Nov 18  2010 compat-glibc-2.3.4-2.26.i386.rpm
    -rw-r--r-- 1 root root 5607577 Feb 26  2012 glibc-2.5-81.i686.rpm
    -rw-r--r-- 1 root root 4997627 Feb 26  2012 glibc-2.5-81.x86_64.rpm
    [root@OELVM02 Server]#
    [root@OELVM02 Server]# rpm -q glibc-2.5-81.i686
    glibc-2.5-81
    [root@OELVM02 Server]#
    [root@OELVM02 Server]#  rpm -q glibc-2.5-81.x86_64
    glibc-2.5-81
    [root@OELVM02 Server]#
    [root@OELVM02 Server]#
    Thanks heaps
         Gavin
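    (A likely explanation, offered as an assumption since Avi's reply is not shown: rpm -q expects the name of an installed package, while rpm -qp queries an uninstalled package file, which would account for the output above. A sketch:)
    # query the package *file* on the DVD (note -qp, not -q)
    rpm -qp ./binutils-2.17.50.0.6-20.el5.x86_64.rpm
    # query the *installed* package by name
    rpm -q binutils
    # print name and architecture of each installed glibc to confirm
    # that both the i686 and x86_64 packages are present
    rpm -q --queryformat '%{NAME}-%{VERSION}-%{RELEASE}.%{ARCH}\n' glibc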

  • Inconsistent Results Installing WebApps in Console.  The Secret?

    Hi:
    I'm having a difficult time understanding what exactly happens on my
    server when I use the mydomain->Deployments->Web Applications->Install
    a New Web Application dialog.
    Sometimes when I upload a .war file, it then displays in the Web
    Application section of the left pane where certificate and
    DefaultWebApp are shown. Other times? Nothing.
    I have tried installing my web application in several ways, and the
    results are not consistent. What settings should be made and where
    (for example, in files like config.xml) when I install a web
    application?
    I have also tried using the Configure a new Web Application dialog,
    and have similarly inconsistent results.
    Sometimes my config.xml gets updated, sometimes not. Sometimes it
    updates with an ineffective <Application> tag that does not include
    the <WebAppComponent> tag, and sometimes it works.
    Thanks for any insight. I'm really having a tough time of this.
    Thanks,
    Bill

    Hi.
    there should be no other file updated. You should open a case with support.
    Regards,
    Michael
    bill b3nac wrote:
    Yes, I'm using wls6.1 with sp2. jdk131 on windows 2000. my browser is
    msie 6.0.26.
    On installing a new web app through the console, is there any file
    that gets updated that I should keep my eye on other than
    ./wlserver6.1SP2/config/mydomain/config.xml?
    Thanks.
    Bill
    Michael Young <[email protected]> wrote in message news:<[email protected]>...
    Hi.
    Hmm. I'll presume you are running under wls 6.1. Make sure you are
    running with the latest service pack - sp2. Also, what platform/jdk are
    you using?
    If you already are using sp2 and are still seeing this inconsistency I
    recommend you open a case with support. However, be forwarned that they
    may not be able to help much if this only occurs randomly or rarely.
    Regards,
    Michael
    bill b3nac wrote:
    [original post quoted above - snipped]
    Michael Young
    Developer Relations Engineer
    BEA Support
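    (For comparison, this is roughly the shape of the config.xml entry a successful console deployment produces in WLS 6.1 - reconstructed from memory as an assumption; the Name, Path, Targets and URI values are placeholders:)
    <Application Deployed="true" Name="mywebapp"
                 Path="./config/mydomain/applications">
     <WebAppComponent Name="mywebapp" Targets="myserver" URI="mywebapp.war"/>
    </Application>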

  • Inconsistent results with localtimestamp and current_timestamp

    Running XE on Windows XP with the system timezone set to GMT (rebooted, restarted XE):
    Oracle Database 10g Express Edition Release 10.2.0.1.0 - Production
    PL/SQL Release 10.2.0.1.0 - Production
    CORE     10.2.0.1.0     Production
    TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
    NLSRTL Version 10.2.0.1.0 - Production
    I'm getting incorrect and inconsistent results with current_timestamp and localtimestamp:
    With SQL, localtimestamp computes the wrong offset (appears to use 1987-2006 DST rules):
    select
    dbtimezone
    , sessiontimezone
    , current_timestamp
    , current_timestamp + numtodsinterval(18,'day') as current_timestamp18
    , localtimestamp
    from dual;
    +00:00     
    US/Eastern     
    17-MAR-10 10.27.17.376000000 AM US/EASTERN     
    04-APR-10 10.27.17.376000000 AM US/EASTERN     
    17-MAR-10 09.27.17.376000000 AM
    however, in PL/SQL, both current_timestamp and localtimestamp return the wrong hour value, and adding 18 days to current_timestamp shows it is using 1987-2006 DST rules (1st Sunday of April). Note that this happens in straight PL/SQL and in embedded SQL (same results selecting from tables other than DUAL):
    begin
    for r1 in (
    select
    dbtimezone
    , sessiontimezone
    , current_timestamp
    , current_timestamp + numtodsinterval(18,'day') as current_timestamp18
    , localtimestamp
    from dual
    )
    loop
    dbms_output.put_line('SQL dbtimezone = ' || r1.dbtimezone);
    dbms_output.put_line('SQL sessiontimezone = ' || r1.sessiontimezone);
    dbms_output.put_line('SQL current_timestamp = ' || r1.current_timestamp);
    dbms_output.put_line('SQL current_timestamp +18 = ' || r1.current_timestamp18);
    dbms_output.put_line('SQL localtimestamp = ' || r1.localtimestamp);
    end loop;
    dbms_output.put_line('dbtimezone = ' || dbtimezone);
    dbms_output.put_line('sessiontimezone = ' || sessiontimezone);
    dbms_output.put_line('systimestamp = ' || systimestamp);
    dbms_output.put_line('current_timestamp = ' || current_timestamp);
    dbms_output.put_line('current_timestamp +18 = ' || (current_timestamp + numtodsinterval(18,'day')));
    dbms_output.put_line('localtimestamp = ' || localtimestamp);
    end;
    SQL dbtimezone = +00:00
    SQL sessiontimezone = US/Eastern
    SQL current_timestamp = 17-MAR-10 09.29.32.784000 AM US/EASTERN
    SQL current_timestamp +18 = 04-APR-10 10.29.32.784000000 AM US/EASTERN
    SQL localtimestamp = 17-MAR-10 09.29.32.784000 AM
    dbtimezone = +00:00
    sessiontimezone = US/Eastern
    systimestamp = 17-MAR-10 02.29.32.784000000 PM +00:00
    current_timestamp = 17-MAR-10 09.29.32.784000000 AM US/EASTERN
    current_timestamp +18 = 04-APR-10 10.29.32.784000000 AM US/EASTERN
    localtimestamp = 17-MAR-10 09.29.32.784000000 AM
    dbtimezone = +00:00
    sessiontimezone = US/Eastern
    systimestamp = 17-MAR-10 02.16.21.366000000 PM +00:00
    current_timestamp = 17-MAR-10 09.16.21.366000000 AM US/EASTERN
    current_timestamp +18 = 04-APR-10 10.16.21.366000000 AM US/EASTERN
    localtimestamp = 17-MAR-10 09.16.21.366000000 AM
    is this a known bug?
    is there a patch or a work-around for XE?
    are other datasbase versions affected?

    Can't patch XE, unfortunately it comes with pre-2007 DST rules.
    There is a metalink note describing how to fix the DST changes, and while it's not really a "supported" method, neither is XE. If you can get updated timezone files from a later patch set for the same release (10gR2) on the right operating system, then after a shutdown/startup of the database the updated DST rules will be in place. The timezone files are in $ORACLE_HOME/oracore/zoneinfo.
    Another unfortunate point: any values already stored in the database using timestamp with local time zone datatypes for the affected period of the DST changes won't be correct, i.e. there is no 2010-03-14 02:01 (?), but with older timezone rules in place that would be a valid timestamp. The data has to be saved out before updating the timezone file, and re-translated to timestamp w/local tz datatypes after the update.
    IMHO storing literal timezone info isn't an ideal practice; let the client settings do the time interpretation. Time is always changing - it's the interpretation of the time that gets changed, from time to time. :(
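    (One quick check worth adding - a sketch, not from the thread: the database reports which timezone file it is using, so you can confirm the swap took effect after the restart:)
    -- VERSION should change once the updated files in
    -- $ORACLE_HOME/oracore/zoneinfo are picked up
    SELECT * FROM v$timezone_file;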

  • Inconsistent results when exporting projects as new libraries

    I am trying to export a project as a new library without the original ("master") images and am getting inconsistent results. Each time I've tried this, I am careful not to check "Copy originals into exported library". Sometimes it works exactly as expected -- i.e., a new library file is created and, when I examine the contents of the new library, there are no master image files. Other times when I examine the new Aperture library file, I find that it contains master image files. Indeed, sometimes, the new library contains more master image files than there are images in the project from which the library was created.
    What am I doing wrong or missing? Is this normal behavior?
    (Aperture 3.4.3)

    After a bit of exploring, here is what I determined:
    1. Most of my projects were created by importing images from a folder without moving the original images (Files> Import> Folder as Projects> Store Files: In their original location). Using this method, the images in the project are all referenced images (i.e. the originals are not moved or copied into the Aperture library). When I later export these projects as described above, the resulting library also does not contain any of the original image files or masters. This is my desired state, and for most of my 2012 projects that is exactly what happened.
    2. If, however, any of the images in a given library were originally imported so that the master image resided in the Aperture library and that image was subsequently deleted, then the exported library will still contain the masters.
    The solution I found was to open the exported library file and empty the trash (Aperture> Empty Aperture Trash).
    (Of course, the longer term solution within my given workflow is to be careful not to import the masters at the beginning of the process.)
    Hope this helps someone.

  • Inconsistent results with ANSI LEFT JOIN on 9iR2

    Is this a known issue? Is it solved in 10g?
    With the following data setup, I get inconsistent results. It seems to be linked to the combination of using LEFT JOIN with the NULL comparison within the JOIN.
    create table titles (title_id int, title varchar(50));
    insert into titles values (1, 'Red Book');
    insert into titles values (2, 'Yellow Book');
    insert into titles values (3, 'Blue Book');
    insert into titles values (4, 'Orange Book');
    create table sales (stor_id int, title_id int, qty int, email varchar(60));
    insert into sales values (1, 1, 1, '[email protected]');
    insert into sales values (1, 2, 1, '[email protected]');
    insert into sales values (3, 3, 4, null);
    insert into sales values (3, 4, 5, '[email protected]');
    SQL> SELECT titles.title_id, title, qty
    2 FROM titles LEFT OUTER JOIN sales
    3 ON titles.title_id = sales.title_id
    4 AND stor_id = 3
    5 AND sales.email is not null
    6 ;
      TITLE_ID TITLE           QTY
             4 Orange Book       5
             3 Blue Book
             1 Red Book
             2 Yellow Book
    SQL>
    SQL> SELECT titles.title_id, title, qty
    2 FROM titles LEFT OUTER JOIN sales
    3 ON titles.title_id = sales.title_id
    4 AND 3 = stor_id
    5 AND sales.email is not null;
      TITLE_ID TITLE           QTY
             2 Yellow Book       1
             4 Orange Book       5
             3 Blue Book
             1 Red Book
    It seems to matter what order I specify the operands stor_id = 3, or 3 = stor_id.
    In the older (+) environment, I would understand this, but here? I'm pretty sure most other databases don't care about the order.
    thanks for your insight
    Kevin

    Don't have a 9i around right now to test ... but in 10 ...
    SQL> create table titles (title_id int, title varchar(50));
     
    Table created.
     
    SQL> insert into titles values (1, 'Red Book');
     
    1 row created.
     
    SQL> insert into titles values (2, 'Yellow Book');
     
    1 row created.
     
    SQL> insert into titles values (3, 'Blue Book');
     
    1 row created.
     
    SQL> insert into titles values (4, 'Orange Book');
     
    1 row created.
     
    SQL> create table sales (stor_id int, title_id int, qty int, email varchar(60));
     
    Table created.
     
    SQL> insert into sales values (1, 1, 1, '[email protected]');
     
    1 row created.
     
    SQL> insert into sales values (1, 2, 1, '[email protected]');
     
    1 row created.
     
    SQL> insert into sales values (3, 3, 4, null);
     
    1 row created.
     
    SQL> insert into sales values (3, 4, 5, '[email protected]');
     
    1 row created.
     
    SQL> SELECT titles.title_id, title, qty
      2   FROM titles LEFT OUTER JOIN sales
      3   ON titles.title_id = sales.title_id
      4   AND stor_id = 3
      5   AND sales.email is not null
      6   ;
     
      TITLE_ID TITLE                                                     QTY
             4 Orange Book                                                 5
             3 Blue Book
             1 Red Book
             2 Yellow Book
     
    SQL>
    SQL> SELECT titles.title_id, title, qty
      2   FROM titles LEFT OUTER JOIN sales
      3   ON titles.title_id = sales.title_id
      4   AND 3 = stor_id
      5   AND sales.email is not null;
     
      TITLE_ID TITLE                                                     QTY
             4 Orange Book                                                 5
             3 Blue Book
             1 Red Book
             2 Yellow Book
    SQL> exit
    Disconnected from Oracle Database 10g Enterprise Edition Release 10.1.0.2.0 - Production
    With the Partitioning, OLAP and Data Mining options

  • Inconsistent results for SDO_RELATE

    Using SDO_VERSION = 10.2.0.2.0
    I am getting inconsistent results using SDO_RELATE. If I do the whole table (over 200,000 records), one particular record that I know of is skipped, while if I pick a smaller range of records (4 records in this case), the record is not skipped. The particular relationship with this record is "touch". Is there any limitation on table size, or is this something else? Here is the example:
    -- The column is set to null
    SQL> update nad_als_fixed_stn_10G_HQ set insidecheck = null;
    231484 rows updated.
    -- SDO_RELATE on a few records which actually finds the correct relationship
    SQL> UPDATE nad_als_fixed_stn_10G_HQ C SET C.insidecheck = '1'
    2 WHERE EXISTS (SELECT 1 FROM MetroRegions A, nad_als_fixed_stn_10G_HQ B
    3 WHERE SDO_RELATE(B.location83r, A.geoloc, 'mask=anyinteract') = 'TRUE'
    4 AND C.lic_no = B.lic_no and C.lic_no between 4687157 and 4687223 )
    5 ;
    3 rows updated.
    -- Displays the correct relationship for that record (this is a "touch")
    SQL> select insidecheck from nad_als_fixed_stn_10G_HQ where lic_no = 4687161;
    Inside
    check
    1
    -- Reset the column to null
    SQL> update nad_als_fixed_stn_10G_HQ set insidecheck = null;
    231484 rows updated.
    -- SDO_RELATE on the complete table
    SQL> UPDATE nad_als_fixed_stn_10G_HQ C SET C.insidecheck = '1'
    2 WHERE EXISTS (SELECT 1 FROM MetroRegions A, nad_als_fixed_stn_10G_HQ B
    3 WHERE SDO_RELATE(B.location83r, A.geoloc, 'mask=anyinteract') = 'TRUE'
    4 AND C.lic_no = B.lic_no );
    48488 rows updated.
    -- This particular record which was located correctly earlier appears to be skipped
    SQL> select insidecheck from nad_als_fixed_stn_10G_HQ where lic_no = 4687161;
    Inside
    check
    SQL>
    François Sigouin

    Thanks, but it did not solve the problem of inconsistent results. The response time for the first update is much improved, though. When I added the hint on the second update (which is about 600 records), it never came back, so I tested without it. Any other ideas?
    François.
    TEST
    SQL> update nad_als_fixed_stn_10G_HQ set insidecheck = null;
    231484 rows updated.
    -- First update with hint, response time is improved but same results obtained
    SQL> UPDATE nad_als_fixed_stn_10G_HQ C SET C.insidecheck = '1'
    2 WHERE EXISTS (SELECT /*+ ORDERED */ 1 FROM MetroRegions A, nad_als_fixed_stn_10G_HQ B
    3 WHERE SDO_RELATE(B.location83r, A.geoloc, 'mask=anyinteract') = 'TRUE'
    4 AND C.lic_no = B.lic_no );
    48488 rows updated.
    SQL> select insidecheck from nad_als_fixed_stn_10G_HQ where lic_no = 4687161;
    Inside
    check
    SQL> update nad_als_fixed_stn_10G_HQ set insidecheck = null;
    231484 rows updated.
    --The second update has to be without the hint otherwise it does not come back.                                                                                                  
    SQL> UPDATE nad_als_fixed_stn_10G_HQ C SET C.insidecheck = '1'
    2 WHERE EXISTS (SELECT 1 FROM MetroRegions A, nad_als_fixed_stn_10G_HQ B
    3 WHERE SDO_RELATE(B.location83r, A.geoloc, 'mask=anyinteract') = 'TRUE'
    4 AND C.lic_no = B.lic_no and C.lic_no between 4687157 and 4687223 )
    5 ;
    3 rows updated.
    SQL> select insidecheck from nad_als_fixed_stn_10G_HQ where lic_no = 4687161;
    Inside
    check
    1
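    (One more thing that may be worth trying - an assumption, not from the thread: the UPDATE joins the table to itself through alias B, which isn't needed, since the spatial predicate can bind directly to the row being updated. A sketch using the names above:)
    UPDATE nad_als_fixed_stn_10G_HQ c
       SET c.insidecheck = '1'
     WHERE EXISTS (SELECT 1 FROM MetroRegions a
                    WHERE SDO_RELATE(c.location83r, a.geoloc,
                                     'mask=anyinteract') = 'TRUE');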
