IN range comparison

Hi,
How do I check if a value is in a range?
I am testing the 'IN' clause as follows.
My test value is declared as:
lv_current_value TYPE atwrt, with a value of '210'.
My range table is declared as:
RANGES: lr_check_value FOR ausp-atwrt.
and has one table entry:
sign = 'I'
option = 'BT'
low = '200'
high = '400'
I would have expected the following test to come out positive:
IF test_value IN lr_check_value.
  test = positive.
ELSE.
  test = negative.
ENDIF.
Can anyone shed any light?
Thanks a lot

Write the code as follows.
In your code you are not giving any value to test_value; instead lv_current_value is set to '210'. Use one consistent variable name, and remember to APPEND the row to the range table:
*my test_value is declared as:
*lv_current_value TYPE atwrt, with a value of '210'
TABLES: ausp.
DATA: test_value TYPE atwrt VALUE '210'.
RANGES: lr_check_value FOR ausp-atwrt.
lr_check_value-sign   = 'I'.
lr_check_value-option = 'BT'.
lr_check_value-low    = '200'.
lr_check_value-high   = '400'.
APPEND lr_check_value.
IF test_value IN lr_check_value.
  WRITE: / 'positive'.
ELSE.
  WRITE: / 'negative'.
ENDIF.
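For readers outside ABAP, the semantics of `IF value IN range_table` can be sketched in Python. This is a minimal illustration, not ABAP: the dictionary keys mirror the range-table fields from the thread, only the sign 'I' / option 'BT' case is modeled (real range tables also support EQ, CP, GE, sign 'E', and more), and, as in ABAP character fields, the comparison here is on strings, so '210' falls between '200' and '400':

```python
# Minimal sketch of ABAP's "IF value IN range_table" for sign 'I' / option 'BT'.
# A row matches when low <= value <= high (inclusive "between").

def in_range_table(value, range_table):
    """Return True if value matches any inclusive 'I'/'BT' row."""
    for row in range_table:
        if row["sign"] == "I" and row["option"] == "BT":
            if row["low"] <= value <= row["high"]:
                return True
    return False

lr_check_value = [{"sign": "I", "option": "BT", "low": "200", "high": "400"}]
print(in_range_table("210", lr_check_value))  # prints True
```

Note that an empty range table would make the check fail for every value, which is exactly what happens in ABAP if the APPEND is forgotten.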

Similar Messages

  • Date range Comparison to Period

    Hi Guys
I have a selection screen date range. I want to select, from a table, the agreements whose period falls within the selection screen date range. What is the most appropriate way of comparing a period field with a date range?
Thanks in advance
Harkamal

Hi, you can try a function module like this:
MM_ARRANG_SPMON_RANGE
Regards
Prabhu

  • Range comparison

    Hi all,
I'm new to the FDK and am trying to compare two ranges to determine which of them comes first in the document, but I didn't find any function in the FDK for this.
Does anyone know a way to do it?

    mdeg,
    What do you mean by "range"? Do you mean a text range, and you want to know which range occurs before the other in the respective flow? If that's the case, I think you would need to do the following, assuming the ranges start in different paragraphs:
    - Find the text frame for one of the paragraphs, likely with the InTextFrame property of the textrange.beg.obj object
    - Get the parent flow object, likely with the textframe.Flow property
    - Get the first paragraph in the flow, likely with flow.FirstTextFrameInFlow.FirstPgf
    - Iterate over all paragraphs in the flow, likely with the pgf.NextPgfInFlow property
    - The first one that you get to that matches one of your textrange.beg.obj objects means that textrange occurs first
    If the two ranges start in the same paragraph, where textrange1.beg.obj = textrange2.beg.obj, you could just skip all that and compare the offsets.
    If one or more of the ranges are in a table cell, footnote, etc, the whole process becomes significantly more complex. Too much to address here.
    Russ
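The walk Russ describes can be sketched as follows. This is illustrative Python, not runnable FDK code: the `Pgf` class and `next_pgf_in_flow` field are stand-ins for the FDK paragraph objects and the NextPgfInFlow property named above, and the linked list plays the role of the flow:

```python
# Sketch of "iterate the flow from the first paragraph": whichever of the two
# target paragraphs we reach first is the one that occurs earlier in the flow.

class Pgf:
    """Stand-in for an FDK paragraph object (hypothetical, for illustration)."""
    def __init__(self, name):
        self.name = name
        self.next_pgf_in_flow = None  # mirrors the NextPgfInFlow property

def first_of(first_pgf, target_a, target_b):
    """Return 'a' or 'b' for whichever target appears first, else None."""
    pgf = first_pgf
    while pgf is not None:
        if pgf is target_a:
            return "a"
        if pgf is target_b:
            return "b"
        pgf = pgf.next_pgf_in_flow
    return None

# Build a tiny three-paragraph flow: p1 -> p2 -> p3
p1, p2, p3 = Pgf("p1"), Pgf("p2"), Pgf("p3")
p1.next_pgf_in_flow = p2
p2.next_pgf_in_flow = p3
print(first_of(p1, p3, p2))  # prints "b", because p2 precedes p3 in the flow
```

As Russ notes, if both ranges start in the same paragraph you can skip the walk entirely and just compare the begin offsets.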

  • Range comparison using VARCHAR2

    Hi
    How can I achieve
    SELECT * FROM TABLE WHERE 'B' BETWEEN 'A' AND 'C'
    How do I obtain a number representation for a given character string?

    Oracle can do a sort on varchar2 (sorting order depends on NLS_LANG settings), so there is no need to convert:
    SQL> select 'A'
      2  from dual
      3  where 'Y' between 'X' and 'Z'
      4  ;
'A'
A

However, with select ascii('A') from dual; you can get the ASCII number of 'A' (for example).
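The same two ideas carry over to most languages (Python shown here purely as an illustration): string comparison is ordinal, so a BETWEEN-style test needs no numeric conversion, and a character's code is available on demand when you do want a number:

```python
# Strings compare ordinally, mirroring WHERE 'Y' BETWEEN 'X' AND 'Z' in SQL.
print("X" <= "Y" <= "Z")  # prints True

# ord() plays the role of Oracle's ascii(): the code of a single character.
print(ord("A"))  # prints 65
```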

  • Exact date range comparison

    I have payment table like this,
    Date | Store | payment
    01/01/2012 | A | $300 
    01/01/2012 | B | $400 
    01/01/2012 | C | $500 
    01/02/2012 | A | $500
    01/22/2015 | C | $300
    I want to make a report like this,
                A                                         B                          ...   Totals
Current year | Last Year | Difference    Current year | Last Year | Difference
Jan
Feb
March
...
Dec
However, when comparing last year and the current year, I only want to take into account the number of days loaded for the current month. For instance, if current-year sales are only up to date through the 22nd, then for the previous year I want to show January sales only through the 22nd as well.
I know I can hardcode this for previous-year sales by specifying fixed dates, but anytime I try something like CALCULATE(SUM(payment), <formula to find the last date in the current year, shifted back one year with DATEADD()>), it fails with an error like "cannot use formula in calculate filter".
Is there any solution to this?

PaymentYTD := CALCULATE(SUM([Amount]), DATESYTD(Dates[Date], "12/31"))
PaymentPriorYear := CALCULATE([PaymentYTD], SAMEPERIODLASTYEAR(Dates[Date]))
The dates in the Dates table are contiguous; however, there are missing dates in the payment table (as the store was closed for some of the holidays). Below is sample data:
PAYMENT_DATE     | AMOUNT  | STORENO | PAY_DATEKEY
01/02/2014 9:51  | $359.00 | 1 | 01/02/2014
01/02/2014 10:08 | $283.00 | 1 | 01/02/2014
01/02/2014 10:09 | $497.00 | 1 | 01/02/2014
01/02/2014 10:23 | $494.00 | 1 | 01/02/2014
01/02/2014 10:34 | $27.00  | 1 | 01/02/2014
01/02/2014 10:34 | $63.00  | 1 | 01/02/2014
01/02/2014 10:56 | $453.00 | 3 | 01/02/2014
01/02/2014 10:56 | $175.00 | 3 | 01/02/2014
01/02/2014 10:59 | $197.00 | 1 | 01/02/2014
01/02/2014 11:00 | $145.00 | 1 | 01/02/2014
01/02/2014 11:01 | $373.00 | 3 | 01/02/2014
01/02/2014 11:06 | $475.00 | 1 | 01/02/2014
01/02/2014 11:10 | $413.00 | 2 | 01/02/2014
01/02/2014 11:11 | $431.00 | 2 | 01/02/2014
01/02/2014 11:13 | $131.00 | 2 | 01/02/2014
01/02/2014 11:16 | $34.00  | 2 | 01/02/2014
01/02/2014 11:16 | $59.00  | 2 | 01/02/2014
01/02/2014 11:17 | $203.00 | 1 | 01/02/2014
01/02/2014 11:19 | $80.00  | 1 | 01/02/2014
01/02/2014 11:27 | $418.00 | 1 | 01/02/2014
01/02/2014 11:27 | $198.00 | 1 | 01/02/2014
01/02/2014 11:28 | $354.00 | 2 | 01/02/2014
01/02/2014 11:29 | $19.00  | 2 | 01/02/2014
01/02/2014 11:33 | $425.00 | 3 | 01/02/2014
01/02/2014 11:34 | $296.00 | 2 | 01/02/2014
01/02/2014 11:34 | $302.00 | 1 | 01/02/2014
01/02/2014 11:35 | $244.00 | 3 | 01/02/2014
01/02/2014 11:35 | $13.00  | 3 | 01/02/2014
01/02/2014 11:39 | $419.00 | 2 | 01/02/2014
01/02/2014 11:43 | $144.00 | 2 | 01/02/2014
01/02/2014 11:43 | $206.00 | 2 | 01/02/2014
01/02/2014 11:47 | $194.00 | 3 | 01/02/2014
01/02/2014 11:48 | $479.00 | 3 | 01/02/2014
01/02/2014 11:53 | $20.00  | 1 | 01/02/2014
01/02/2014 12:06 | $419.00 | 1 | 01/02/2014
01/02/2014 12:15 | $129.00 | 1 | 01/02/2014
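The day-capping idea behind the question (compare the prior year only through the same day-of-month as the latest loaded date) can be sketched in Python. This is an illustration with made-up data, not the DAX solution: in the model, DATESYTD and SAMEPERIODLASTYEAR do the windowing against the Dates table; the sketch just shows the arithmetic those measures are meant to encode:

```python
# Cap the prior-year window at the same day-of-month as the latest loaded date.
# "payments" is made-up illustrative data, keyed by date.
from datetime import date

payments = {date(2014, 1, 2): 300, date(2014, 1, 20): 400,
            date(2015, 1, 2): 500, date(2015, 1, 15): 250}

latest = max(d for d in payments if d.year == 2015)   # latest loaded 2015 date
prior_cutoff = latest.replace(year=latest.year - 1)   # same day, prior year
# (a Feb 29 latest date would need special handling here)

current_ytd = sum(v for d, v in payments.items()
                  if d.year == 2015 and d <= latest)
prior_same_span = sum(v for d, v in payments.items()
                      if d.year == 2014 and d <= prior_cutoff)
print(current_ytd, prior_same_span)  # prints 750 300
```

Note the 2014-01-20 payment is deliberately excluded from the prior-year total, because the current year has data only through the 15th.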

  • Date range parameters are off by one day

I have a Start Date and End Date parameter in my SSRS report. The results are off by one day. For example, if I enter 4/2/2015 and 4/20/2015, it returns results from 4/1/2015 to 4/19/2015. I think it may be a problem with my date conversion in T-SQL, but I thought someone more experienced than me would have seen this before.
Thanks

Yes.
The safest way to write a date range comparison, if you want results with @EndDate inclusive, is:
WHERE DateField >= @StartDate
AND DateField < DATEADD(dd, 1, @EndDate)
See more details here:
    http://visakhm.blogspot.ae/2012/12/different-ways-to-implement-date-range.html
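The reason the half-open form is safer shows up whenever the column carries a time component: a parameter arrives as midnight on the end date, so `<=` silently drops everything later that day. A small Python illustration of the same comparison logic:

```python
# A DATETIME parameter arrives at midnight on the end date, so "row <= end"
# excludes anything later that day; "row < end + 1 day" keeps the full day.
from datetime import datetime, timedelta

end_date = datetime(2015, 4, 20)            # @EndDate as passed to the query
row = datetime(2015, 4, 20, 14, 30)         # a row from the afternoon of 4/20

print(row <= end_date)                      # prints False (row would be lost)
print(row < end_date + timedelta(days=1))   # prints True  (row is kept)
```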
Visakh

  • Wireless range MB VS. MBP

Does anyone have any insight into whether there is much difference between the two units as far as their ability to receive wireless internet signals? In the past it seemed as though the iBooks got a better signal. I thought I heard somewhere about the PBs doing something to change this.
Rob

The MBP has superior range to the PB due to a relocated antenna (it is now along the hinge of the screen rather than along the side of the display; both have always had the plastic inserts, and range comparisons are inclusive of the presence of those inserts). It is still likely to be somewhat less than the MB, but the difference may not be as significant as it was between the PB and the iBook.

  • CHARS NOT SELECTING IN RANGE

    TABLE:SK_TEMP
    COL1
    K4
    K5
    K6
    K7
    K8
    L4
    L5
    L6
    L7
    L8
    M4
    M5
    M6
When I write the query below:
Requirement 1:
select col1 from sk_temp where col1 between 'K%' and 'L%'
it returns only the records K4 - K8, and not the records L4 - L8.
Requirement 2:
Is there a possibility to get the output below, i.e., the data between K8 and M5:
    K8
    L4
    L5
    L6
    L7
    L8
    M4
    M5
Thanks in advance for your suggestions
Edited by: Sun Vth Oracle on Jan 7, 2011 2:22 AM

Sun Vth Oracle wrote:
select col1 from sk_temp where col1 between 'K%' and 'L%' is returning only records of K4 - K8 and not returning records between L4 - L8

If you look at the ASCII character set (your local one may differ slightly depending on your NLS settings)...
    SQL> with t as (select ascii, decode(ascii,7,null,9,null, ch) as ch, col, rn
      2             from (select rownum-1 as ascii
      3                         ,chr(rownum-1) as ch
      4                         ,trunc((rownum-1)/64) as col
      5                         ,row_number() over (partition by trunc((rownum-1)/64) order by rownum) as rn
      6                   from dual connect by rownum <= 256
      7                  )
      8            )
      9  --
    10  select t1.ascii, t1.ch
    11        ,t2.ascii, t2.ch
    12        ,t3.ascii, t3.ch
    13        ,t4.ascii, t4.ch
    14  from (select * from t where col = 0) t1
    15      ,(select * from t where col = 1) t2
    16      ,(select * from t where col = 2) t3
    17      ,(select * from t where col = 3) t4
    18  where t1.rn = t2.rn
    19  and   t2.rn = t3.rn
    20  and   t3.rn = t4.rn
    21  /
         ASCII C      ASCII C      ASCII C      ASCII C
             0           64 @        128 Ç        192 └
             1 ☺         65 A        129 ü        193 ┴
             2 ☻         66 B        130 é        194 ┬
             3 ♥         67 C        131 â        195 ├
             4 ♦         68 D        132 ä        196 ─
             5 ♣         69 E        133 à        197 ┼
             6 ♠         70 F        134 å        198 ã
             7           71 G        135 ç        199 Ã
             8           72 H        136 ê        200 ╚
             9           73 I        137 ë        201 ╔
            10           74 J        138 è        202 ╩
            11 ♂         75 K        139 ï        203 ╦
            12 ♀         76 L        140 î        204 ╠
            13           77 M        141 ì        205 ═
            14 ♫         78 N        142 Ä        206 ╬
            15 ☼         79 O        143 Å        207 ¤
            16 ►         80 P        144 É        208 ð
            17 ◄         81 Q        145 æ        209 Ð
            18 ↕         82 R        146 Æ        210 Ê
            19 ‼         83 S        147 ô        211 Ë
            20 ¶         84 T        148 ö        212 È
            21 §         85 U        149 ò        213 ı
            22 ▬         86 V        150 û        214 Í
            23 ↨         87 W        151 ù        215 Î
            24 ↑         88 X        152 ÿ        216 Ï
            25 ↓         89 Y        153 Ö        217 ┘
            26 →         90 Z        154 Ü        218 ┌
            27 ←         91 [        155 ø        219 █
            28 ∟         92 \        156 £        220 ▄
            29 ↔         93 ]        157 Ø        221 ¦
            30 ▲         94 ^        158 ×        222 Ì
            31 ▼         95 _        159 ƒ        223 ▀
            32           96 `        160 á        224 Ó
            33 !         97 a        161 í        225 ß
            34 "         98 b        162 ó        226 Ô
            35 #         99 c        163 ú        227 Ò
            36 $        100 d        164 ñ        228 õ
            37 %        101 e        165 Ñ        229 Õ
            38 &        102 f        166 ª        230 µ
            39 '        103 g        167 º        231 þ
            40 (        104 h        168 ¿        232 Þ
            41 )        105 i        169 ®        233 Ú
            42 *        106 j        170 ¬        234 Û
            43 +        107 k        171 ½        235 Ù
            44 ,        108 l        172 ¼        236 ý
            45 -        109 m        173 ¡        237 Ý
            46 .        110 n        174 «        238 ¯
            47 /        111 o        175 »        239 ´
            48 0        112 p        176 ░        240
            49 1        113 q        177 ▒        241 ±
            50 2        114 r        178 ▓        242 ‗
            51 3        115 s        179 │        243 ¾
            52 4        116 t        180 ┤        244 ¶
            53 5        117 u        181 Á        245 §
            54 6        118 v        182         246 ÷
            55 7        119 w        183 À        247 ¸
            56 8        120 x        184 ©        248 °
            57 9        121 y        185 ╣        249 ¨
            58 :        122 z        186 ║        250 ·
            59 ;        123 {        187 ╗        251 ¹
            60 <        124 |        188 ╝        252 ³
            61 =        125 }        189 ¢        253 ²
            62 >        126 ~        190 ¥        254 ■
            63 ?        127 ⌂        191 ┐        255  
    64 rows selected.
    SQL>
The '%' sign is character 37 and the BETWEEN operator is based on these character codes, so your query of:
select col1 from sk_temp where col1 between 'K%' and 'L%'
is saying you want to select any records where col1 is between "K" (75) and "L" (76) in the most significant part and '%' (37) in the least significant part. If you take this in a binary sense, you could look at it as saying you want col1 between (75*256)+37 and (76*256)+37, so let's look at your data to see how that works...
    SQL> ed
    Wrote file afiedt.buf
      1  with t as (select 'K4' as col1 from dual union all
      2             select 'K5' from dual union all
      3             select 'K6' from dual union all
      4             select 'K7' from dual union all
      5             select 'K8' from dual union all
      6             select 'L4' from dual union all
      7             select 'L5' from dual union all
      8             select 'L6' from dual union all
      9             select 'L7' from dual union all
    10             select 'L8' from dual union all
    11             select 'M4' from dual union all
    12             select 'M5' from dual union all
    13             select 'M6' from dual)
    14  --
    15  -- test data
    16  --
    17  select col1
    18        ,dump(col1) as dump_col1
    19        ,(ascii(substr(col1,1,1))*256)+ascii(substr(col1,2,1)) as binary_val
    20        ,(ascii('K')*256)+ascii('%') as "K% val"
    21        ,(ascii('L')*256)+ascii('%') as "L% val"
    22        ,case when col1 between 'K%' and 'L%' then 'Between' else 'Not Between' end as "Between"
    23* from t
    SQL> /
    CO DUMP_COL1                 BINARY_VAL     K% val     L% val Between
    K4 Typ=96 Len=2: 75,52            19252      19237      19493 Between
    K5 Typ=96 Len=2: 75,53            19253      19237      19493 Between
    K6 Typ=96 Len=2: 75,54            19254      19237      19493 Between
    K7 Typ=96 Len=2: 75,55            19255      19237      19493 Between
    K8 Typ=96 Len=2: 75,56            19256      19237      19493 Between
    L4 Typ=96 Len=2: 76,52            19508      19237      19493 Not Between
    L5 Typ=96 Len=2: 76,53            19509      19237      19493 Not Between
    L6 Typ=96 Len=2: 76,54            19510      19237      19493 Not Between
    L7 Typ=96 Len=2: 76,55            19511      19237      19493 Not Between
    L8 Typ=96 Len=2: 76,56            19512      19237      19493 Not Between
    M4 Typ=96 Len=2: 77,52            19764      19237      19493 Not Between
    M5 Typ=96 Len=2: 77,53            19765      19237      19493 Not Between
    M6 Typ=96 Len=2: 77,54            19766      19237      19493 Not Between
    13 rows selected.
    SQL>
As you can see, the value of the strings starting with "L" is actually greater than the value of "L%".
Hopefully this helps to clarify how range comparisons against strings work. Remember, the comparison is done on the underlying character codes, so when you provide strings Oracle compares them in their binary form.
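The heart of the explanation fits in three lines of Python (shown purely as an illustration; Python's string comparison is ordinal, just like the binary-sorted Oracle session above): '%' has code 37 while digits start at code 48, so every "L<digit>" string sorts after 'L%' and falls outside the BETWEEN range.

```python
# '%' (code 37) sorts before every digit ('0' is code 48), so "L4" > "L%",
# which is exactly why BETWEEN 'K%' AND 'L%' keeps K-rows but drops L-rows.
print(ord("%"), ord("4"))        # prints 37 52
print("K%" <= "L4" <= "L%")      # prints False: 'L4' sorts after 'L%'
print("K%" <= "K4" <= "L%")      # prints True:  K-rows still match
```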

  • 802.11n Increased Range - How?

    Hi All,
    All the blurb in the marketplace talks about 802.11n's increased range from using multiple antennas. I would have thought there would have to be an increased power level as well?
    Am I missing something here?
    Also, is the max power setting for any AP 100mW, and does anyone have any test results (let's say in open space) of 802.11b/g/n range comparisons?
    Many thx indeed,
    Ken

    Hi Ken,
    For legacy (a/b/g) clients the increase in range, or increase in the distance before rates shift down, is due to Maximal Ratio Combining (MRC) and ClientLink (legacy beamforming).
    MRC is done on the receiver; hence how legacy clients can benefit. All three antennas receive the signal, which gets combined and adjusted so all are in phase. This effectively boosts the signal (gain).
    With ClientLink, when the client sends a signal, each antenna is going to receive the signal at a slightly different time and therefore slightly out of phase. Say it hits the right antenna, then middle, then left. To get the signal back to the client in phase, we flip the order and send on the last to receive first. So, left then right (middle antenna is Rx only). Again, because the signals will arrive in phase, there will be gain.
    Maximum range isn't increased all that much, but the range until rates shift down is increased - increasing overall cell throughput.
    Check out:
    http://www.cisco.com/en/US/prod/collateral/wireless/ps5678/ps10092/white_paper_c11-516389_ns767_Networking_Solutions_White_Paper.html
    -Matt
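    Matt's point boils down to this: combining copies of the same signal whose noise is independent raises SNR. A rough Python toy model (my own illustration of equal-gain combining after phase alignment, not Cisco's actual MRC weighting) shows the expected 10·log10(3) ≈ 4.8 dB gain from three already-co-phased branches:

    ```python
    import math
    import random

    random.seed(42)
    N = 50_000
    SIGNAL = 1.0       # co-phased signal level after the phase-correction step
    NOISE_SIGMA = 1.0  # independent per-antenna noise

    def snr_db(samples, signal_level):
        # Estimate noise power as the mean squared deviation from the signal level
        noise_power = sum((s - signal_level) ** 2 for s in samples) / len(samples)
        return 10 * math.log10(signal_level ** 2 / noise_power)

    # Three antennas: same (already phase-aligned) signal, independent noise.
    branches = [[SIGNAL + random.gauss(0, NOISE_SIGMA) for _ in range(N)]
                for _ in range(3)]

    single = branches[0]
    combined = [sum(vals) / 3 for vals in zip(*branches)]  # average the branches

    gain = snr_db(combined, SIGNAL) - snr_db(single, SIGNAL)
    print(f"combining gain: {gain:.1f} dB")  # close to 10*log10(3) ~ 4.8 dB
    ```

    The key design point is that the gain comes purely from averaging independent noise; no extra transmit power is involved, which answers Ken's original question.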

  • Can I put a READ function in a TRUE/FALSE case structure?

    Hi,
    I have a vi that operates as controller for a mechanical system of motors.  There are several sensors of various types, that provide input to the vi, including encoders, whose period is being measured.  The attached vi is the encoder period measuring part. It measures 15 periods and tells me the average of the 15 with each loop iteration.
    In my application, the encoder period measurement is not needed unless the vi knows that ALL THE OTHER sensors in the hardware configuration are measuring values within the desired range.  For example, Sensors A, B and C should all measure between 5-10.  If they are all showing values between 5-10, then, we want the encoder value to read and display.  If only one of the sensors, say A, is measuring 12, then, we don't want the period value to READ or be subsequently processed.  So, it is easy to set up the sensors A, B, C, to give a TRUE or FALSE based on whether they are within range or not, and if all 3 are TRUE, then, that value can easily be passed to the case structure that holds my READ function for the period.  Is this a good way to do this?  The goal is to eliminate unnecessary execution time that the period READ function would consume, as well as all subsequent calculations that are performed on the period value collected.  The period data is of no use if any one of the other sensors is not in the correct range.
    If any one of the other sensors is out of range, we want to skip the encoder READ step altogether.  So, it is easy to structure a BOOLEAN in my vi as shown in the attachment.  If I do it this way, will it throw an error, or just skip the READ until the BOOLEAN is true again?  Is there a better way to prevent the READ from happening if one of my other sensor values is not within the correct range?
    Thanks,
    Dave
    Solved!
    Go to Solution.
    Attachments:
    forum JUly 18.vi ‏26 KB

    Sure, you can put a Read inside a case structure. For your application, just check that your sensors are all within range (Comparisons palette), then AND the T/F's and use the result of that for your case structure. (If you are already doing the in-range check, I can't see it here because you have 5 subVIs that I cannot open.)
    Cameron
    To err is human, but to really foul it up requires a computer.
    The optimist believes we are in the best of all possible worlds - the pessimist fears this is true.
    Profanity is the one language all programmers know best.
    An expert is someone who has made all the possible mistakes.
    To learn something about LabVIEW at no extra cost, work the online LabVIEW tutorial(s):
    LabVIEW Unit 1 - Getting Started
    Learn to Use LabVIEW with MyDAQ
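    The check-then-AND-then-case pattern Cameron describes can be sketched in ordinary code. In this Python sketch the sensor names, range limits, and read functions are all hypothetical stand-ins for the subVIs; the point is that the expensive period read simply never executes when any sensor is out of range, and no error is raised:

    ```python
    # Hypothetical stand-ins for the hardware subVIs in the original VI.
    def read_sensor(name):
        return {"A": 7.2, "B": 5.9, "C": 9.4}[name]

    LOW, HIGH = 5.0, 10.0  # assumed acceptable range for sensors A, B, C

    def read_encoder_period():
        # Stands in for the (relatively expensive) 15-period READ + averaging
        return 0.0123

    def maybe_read_period():
        # In-Range check on every sensor, then AND the booleans together
        all_in_range = all(LOW <= read_sensor(s) <= HIGH for s in "ABC")
        if all_in_range:          # TRUE case of the case structure: do the read
            return read_encoder_period()
        return None               # FALSE case: skip the read entirely

    print(maybe_read_period())
    ```

    Skipping the FALSE case is exactly the behavior Dave wants: the read and all downstream processing are bypassed for that iteration, with no error thrown.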

  • ORA-01858: a non-numeric character was found where a numeric was expected

    hi ,
    This is the code that shows the sales rep invoice amount and collected amount, but while running the report through a concurrent program it shows the following error:
    ORA-01858: a non-numeric character was found where a numeric was expected
    WHERE TO_CHAR ( TO_DATE ( PS.GL_DATE , 'DD/MON/YY' ) , 'MON-YYYY' ) BETWEEN TO_CHAR ( TO_DATE ( : ==> P_todate , 'YYYY/MM/DD' ) , 'MON-YYYY' ) AND TO_CHAR ( TO_DATE ( : P_todate , 'YYYY/MM/DD' ) , 'MON-YYYY' ) AND ps.customer_id = cust.custome
    The Actual Code was this
    SELECT SUBSTR(SALES.name,1,50) salesrep_name_inv,
    --ps.CLASS,
    SUM(ABS(ps.acctd_amount_due_remaining)) acctd_amt,
    SUM(ABS(ps.amount_due_remaining)) amt_due_remaining_inv,
    SUM(ABS(ps.amount_adjusted)) amount_adjusted_inv,
    SUM(ABS(ps.amount_applied)) amount_applied_inv,
    SUM(ABS(ps.amount_credited)) amount_credited_inv,
              SALES.salesrep_id,
    NULL "REMARKS"
    -- ps.gl_date gl_date_inv,
    FROM ra_cust_trx_types ctt,
    ra_customers cust,
    ar_payment_schedules ps,
    ra_salesreps SALES,
    ra_site_uses site,
    ra_addresses addr,
    ra_cust_trx_line_gl_dist gld,
    gl_code_combinations c,
    ra_customer_trx ct
    WHERE TO_CHAR(TO_DATE(PS.GL_DATE,'DD/MON/YY'),'MON-YYYY')
    BETWEEN TO_CHAR(TO_DATE(:P_todate,'YYYY/MM/DD'),'MON-YYYY') AND TO_CHAR(TO_DATE(:P_todate,'YYYY/MM/DD'),'MON-YYYY')
    AND ps.customer_id = cust.customer_id
    AND ps.customer_trx_id = ct.customer_trx_id
    AND ps.cust_trx_type_id = ctt.cust_trx_type_id
    AND NVL(ct.primary_salesrep_id, -3) = SALES.salesrep_id
    AND ps.customer_site_use_id+0 = site.site_use_id(+)
    AND site.address_id = addr.address_id(+)
    AND TO_CHAR(TO_DATE(PS.GL_DATE_CLOSED,'DD/MON/YY'),'MON-YYYY')
    BETWEEN TO_CHAR(TO_DATE(:P_todate,'YYYY/MM/DD'),'MON-YYYY') AND TO_CHAR(TO_DATE(:P_todate,'YYYY/MM/DD'),'MON-YYYY')
    --AND    ps.gl_date_closed > TO_DATE(:P_todate,'MON-YYYY')
    AND ct.customer_trx_id = gld.customer_trx_id
    AND gld.account_class = 'REC'
    AND gld.latest_rec_flag = 'Y'
    AND gld.code_combination_id = c.code_combination_id
    AND sales.salesrep_id is not null and sales.name is not null
    -- and ps.payment_schedule_id+0 < 9999
    -- AND SALES.salesrep_id ='1001'
    GROUP BY SALES.name,
    --ps.CLASS,
    SALES.salesrep_id

    So the to_date function accepts a string as input and returns a date. When a date is input instead, it is implicitly converted to the required type of the function parameter, which is a string, so that to_date can convert it back to a date again.
    If you are lucky with the implicit conversion, you get the same date back, if you are not you might get a different date or an error.
    From your query it appears that this conversion from a date, to a string, to a date, and then back to a string using to_char this time, is being done to remove the time or day part of the date. The actual range comparison is being done on strings rather than dates, which is dangerous as strings sort differently than dates.
    In this example if I sort by date, Jan 01 comes between Dec 00 and Feb 01 as you would expect.
    SQL> select * from t order by d;
    D
    12-01-2000
    01-01-2001
    02-01-2001
    When converted to strings, Feb 01 comes between Dec 00 and Jan 01, which is probably not the desired result:
    SQL> select * from t order by to_char(d,'DD-MON-YY');
    D
    12-01-2000
    02-01-2001
    01-01-2001
    If you want to remove the time and day parts of dates you should use the trunc function:
    trunc(d) removes the time, trunc(d,'mm') will remove the days to start of month.
    http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14200/functions201.htm#i79761
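    The string-vs-date sort difference shown above can be reproduced outside Oracle. A small Python sketch (my own illustration) sorts the same three dates once as real dates and once as 'DD-MON-YY' strings, where 'DEC' < 'FEB' < 'JAN' alphabetically:

    ```python
    from datetime import date

    dates = [date(2000, 12, 1), date(2001, 1, 1), date(2001, 2, 1)]

    by_date = sorted(dates)
    # Render each date as an Oracle-style 'DD-MON-YY' string and sort lexically
    by_string = sorted(dates, key=lambda d: d.strftime("%d-%b-%y").upper())

    print([d.isoformat() for d in by_date])    # Dec 00, Jan 01, Feb 01
    print([d.isoformat() for d in by_string])  # Dec 00, Feb 01, Jan 01 - wrong
    ```

    Note that `%b` is locale-dependent; under the default C locale it yields English month abbreviations, matching the Oracle 'MON' mask.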

  • Weird result of oracle query before and after function base index creation

    Hi All,
    Here is the unique situation we are facing after creating just an index.
    The query result before the index and the query result after the index do not match.
    This is a very illogical situation. Shidhar and I have done a lot of R&D and also tried to gather information from Google, but we couldn't decipher the reason for it.
    I am giving you all the details about the query, index and tables with following steps.
    Please let us know if anything is going wrong from our side or is it a bug at oracle level which is a rarest possibility but a possibility.
    Step 1 :- Create table
    create table TEMP_COMP
    (
    ID VARCHAR2(10),
    GROUP_ID VARCHAR2(10),
    TRAN_DATE DATE,
    AMT_1 NUMBER,
    AMT_2 NUMBER,
    AMT_3 NUMBER
    );
    Step 2 :- Insert Sample data
    set feedback off
    set define off
    prompt Deleting TEMP_COMP...
    delete from TEMP_COMP;
    commit;
    prompt Loading TEMP_COMP...
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('01', 'G01', to_date('01-03-2007', 'dd-mm-yyyy'), 1, 11, 111);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('02', 'G01', to_date('02-03-2007', 'dd-mm-yyyy'), 2, 22, 222);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('03', 'G01', to_date('03-03-2007', 'dd-mm-yyyy'), 3, 33, 333);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('04', 'G01', to_date('04-03-2007', 'dd-mm-yyyy'), 4, 44, 444);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('05', 'G01', to_date('05-03-2007', 'dd-mm-yyyy'), 5, 55, 555);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('06', 'G01', to_date('01-03-2008', 'dd-mm-yyyy'), 6, 66, 666);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('07', 'G01', to_date('02-03-2008', 'dd-mm-yyyy'), 7, 77, 777);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('08', 'G01', to_date('03-03-2008', 'dd-mm-yyyy'), 8, 88, 888);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('09', 'G01', to_date('04-03-2008', 'dd-mm-yyyy'), 9, 99, 999);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('10', 'G01', to_date('05-03-2008', 'dd-mm-yyyy'), 10, 100, 1000);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('01', 'G01', to_date('01-03-2007', 'dd-mm-yyyy'), 1, 11, 111);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('02', 'G01', to_date('02-03-2007', 'dd-mm-yyyy'), 2, 22, 222);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('03', 'G01', to_date('03-03-2007', 'dd-mm-yyyy'), 3, 33, 333);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('04', 'G01', to_date('04-03-2007', 'dd-mm-yyyy'), 4, 44, 444);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('05', 'G01', to_date('05-03-2007', 'dd-mm-yyyy'), 5, 55, 555);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('06', 'G01', to_date('01-03-2008', 'dd-mm-yyyy'), 6, 66, 666);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('07', 'G01', to_date('02-03-2008', 'dd-mm-yyyy'), 7, 77, 777);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('08', 'G01', to_date('03-03-2008', 'dd-mm-yyyy'), 8, 88, 888);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('09', 'G01', to_date('04-03-2008', 'dd-mm-yyyy'), 9, 99, 999);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('10', 'G01', to_date('05-03-2008', 'dd-mm-yyyy'), 10, 100, 1000);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('01', 'G02', to_date('01-03-2007', 'dd-mm-yyyy'), 1, 11, 111);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('02', 'G02', to_date('02-03-2007', 'dd-mm-yyyy'), 2, 22, 222);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('03', 'G02', to_date('03-03-2007', 'dd-mm-yyyy'), 3, 33, 333);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('04', 'G02', to_date('04-03-2007', 'dd-mm-yyyy'), 4, 44, 444);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('05', 'G02', to_date('05-03-2007', 'dd-mm-yyyy'), 5, 55, 555);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('06', 'G02', to_date('01-03-2008', 'dd-mm-yyyy'), 6, 66, 666);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('07', 'G02', to_date('02-03-2008', 'dd-mm-yyyy'), 7, 77, 777);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('08', 'G02', to_date('03-03-2008', 'dd-mm-yyyy'), 8, 88, 888);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('09', 'G02', to_date('04-03-2008', 'dd-mm-yyyy'), 9, 99, 999);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('10', 'G02', to_date('05-03-2008', 'dd-mm-yyyy'), 10, 100, 1000);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('01', 'G03', to_date('01-03-2007', 'dd-mm-yyyy'), 1, 11, 111);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('02', 'G03', to_date('02-03-2007', 'dd-mm-yyyy'), 2, 22, 222);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('03', 'G03', to_date('03-03-2007', 'dd-mm-yyyy'), 3, 33, 333);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('04', 'G03', to_date('04-03-2007', 'dd-mm-yyyy'), 4, 44, 444);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('05', 'G03', to_date('05-03-2007', 'dd-mm-yyyy'), 5, 55, 555);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('06', 'G03', to_date('01-03-2008', 'dd-mm-yyyy'), 6, 66, 666);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('07', 'G03', to_date('02-03-2008', 'dd-mm-yyyy'), 7, 77, 777);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('08', 'G03', to_date('03-03-2008', 'dd-mm-yyyy'), 8, 88, 888);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('09', 'G03', to_date('04-03-2008', 'dd-mm-yyyy'), 9, 99, 999);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('10', 'G03', to_date('05-03-2008', 'dd-mm-yyyy'), 10, 100, 1000);
    insert into TEMP_COMP (ID, GROUP_ID, TRAN_DATE, AMT_1, AMT_2, AMT_3)
    values ('11', 'G01', to_date('03-03-2008', 'dd-mm-yyyy'), 100, 200, 300);
    commit;
    prompt 41 records loaded
    set feedback on
    set define on
    prompt Done.
    Step 3 :- Execute the query.
    SELECT GROUP_ID
    , SUM(LAST_YR_REV) as "Year_2007_Amt"
    , SUM(CURR_YR_REV) as "Year_2008_Amt"
    FROM (
    SELECT GROUP_ID,
    CASE WHEN TO_CHAR(TRAN_DATE,'YYYYMM') BETWEEN '200701' AND '200712' THEN SUM(AMT_1) ELSE 0 END AS LAST_YR_REV ,
    CASE WHEN TO_CHAR(TRAN_DATE,'YYYYMM') BETWEEN '200801' AND '200812' THEN SUM(AMT_1) ELSE 0 END AS CURR_YR_REV
    FROM TEMP_COMP t
    WHERE GROUP_ID = 'G01'
    AND TO_CHAR(TRAN_DATE,'YYYYMM') BETWEEN '200601' AND '200912'
    GROUP BY GROUP_ID, TRAN_DATE
    )
    GROUP BY GROUP_ID;
    The result of above query
    GROUP_ID Year_2007_Amt Year_2008_Amt
    G01 30 180
    Step 4 : Create composite index
    create index GROUP_ID_TRAN_DATE_IDX on TEMP_COMP (GROUP_ID, TO_CHAR(TRAN_DATE,'YYYYMM'))
    Step 5 : Execute once again query from step 3.
    SELECT GROUP_ID
    , SUM(LAST_YR_REV) as "Year_2007_Amt"
    , SUM(CURR_YR_REV) as "Year_2008_Amt"
    FROM (
    SELECT GROUP_ID,
    CASE WHEN TO_CHAR(TRAN_DATE,'YYYYMM') BETWEEN '200701' AND '200712' THEN SUM(AMT_1) ELSE 0 END AS LAST_YR_REV ,
    CASE WHEN TO_CHAR(TRAN_DATE,'YYYYMM') BETWEEN '200801' AND '200812' THEN SUM(AMT_1) ELSE 0 END AS CURR_YR_REV
    FROM TEMP_COMP t
    WHERE GROUP_ID = 'G01'
    AND TO_CHAR(TRAN_DATE,'YYYYMM') BETWEEN '200601' AND '200912'
    GROUP BY GROUP_ID, TRAN_DATE
    )
    GROUP BY GROUP_ID;
    The result of above query
    GROUP_ID Year_2007_Amt Year_2008_Amt
    G01 0 210
    Thanks
    Sunil

    I just wanted to make a comment. The predicates in both your queries are flawed I believe. You convert a date column to a character and then you say only pick the converted result between two sets of characters.
    TO_CHAR(TRAN_DATE,'YYYYMM') BETWEEN '200601' AND '200912'
    It should be coded like this for a proper date range comparison:
    TRAN_DATE BETWEEN TO_DATE('200601','YYYYMM') AND TO_DATE('200912','YYYYMM')
    That will eliminate the need for the FBI that you created. You should now be able to create a regular (B*Tree) index on the TRAN_DATE column.
    Also, now that we have the structure of your tables and sample data, what business question are you trying to answer? I think there is a better query that can be written if we know the requirements.
    Hope this helps!
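    The "compare on the bare date column" advice can be illustrated with a small SQLite stand-in (table, column types, and values are simplified for illustration, not the poster's Oracle schema). Because the predicate touches the column directly rather than wrapping it in a function, an ordinary index on the column remains usable:

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE temp_comp (group_id TEXT, tran_date TEXT, amt_1 INTEGER)")
    conn.execute("CREATE INDEX idx_tran_date ON temp_comp (group_id, tran_date)")
    rows = [("G01", "2007-03-01", 1), ("G01", "2008-03-03", 8), ("G02", "2007-03-05", 5)]
    conn.executemany("INSERT INTO temp_comp VALUES (?, ?, ?)", rows)

    # Range filter on the bare date column (ISO-8601 strings sort chronologically),
    # so the index above can satisfy the predicate directly.
    total = conn.execute(
        "SELECT SUM(amt_1) FROM temp_comp "
        "WHERE group_id = 'G01' "
        "AND tran_date >= '2006-01-01' AND tran_date < '2010-01-01'"
    ).fetchone()[0]
    print(total)  # 9
    ```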

  • Date format in MVIEW

    Hi All,
    I have creted the snapshot as
    CREATE SNAPSHOT EMPLOYEE_CRIS_MVIEW
    PCTFREE 10
    PCTUSED 40
    MAXTRANS 255
    TABLESPACE users
    STORAGE (
    INITIAL 40960
    NEXT 73728
    PCTINCREASE 1
    MINEXTENTS 1
    MAXEXTENTS 505
    )
    BUILD IMMEDIATE
    REFRESH ON DEMAND
    WITH ROWID
    AS
    select
    emp_id,
    join_dt,
    emp_stat
    from employee_CRIS;
    The date format for the column join_dt is dd-mon-yyyy.
    But I want to modify the date format for this column as 'MM/DD/YYYY'.
    For that I tried
    ALTER snapshot EMPLOYEE_CRIS_MVIEW modify(TO_CHAR(join_dt,'MM/DD/YYYY'));
    But I got the error
    ERROR at line 1:
    ORA-00902: invalid datatype
    Could you please help me to change the change the date format of the column of the snapshot
    with out dropping the snapshot.
    Please help me.
    Thanks in advance.

    What people are saying is that date columns (of DATE datatype) are stored internally using a fixed internal notation, which is essentially a series of bytes that describe the date.
    e.g.
    SQL> select empno, ename, hiredate, dump(hiredate) as dump_hiredate from emp;
         EMPNO ENAME      HIREDATE            DUMP_HIREDATE
          7369 SMITH      17/12/1980 00:00:00 Typ=12 Len=7: 119,180,12,17,1,1,1
          7499 ALLEN      20/02/1981 00:00:00 Typ=12 Len=7: 119,181,2,20,1,1,1
          7521 WARD       22/02/1981 00:00:00 Typ=12 Len=7: 119,181,2,22,1,1,1
          7566 JONES      02/04/1981 00:00:00 Typ=12 Len=7: 119,181,4,2,1,1,1
          7654 MARTIN     28/09/1981 00:00:00 Typ=12 Len=7: 119,181,9,28,1,1,1
          7698 BLAKE      01/05/1981 00:00:00 Typ=12 Len=7: 119,181,5,1,1,1,1
          7782 CLARK      09/06/1981 00:00:00 Typ=12 Len=7: 119,181,6,9,1,1,1
          7788 SCOTT      19/04/1987 00:00:00 Typ=12 Len=7: 119,187,4,19,1,1,1
          7839 KING       17/11/1981 00:00:00 Typ=12 Len=7: 119,181,11,17,1,1,1
          7844 TURNER     08/09/1981 00:00:00 Typ=12 Len=7: 119,181,9,8,1,1,1
          7876 ADAMS      23/05/1987 00:00:00 Typ=12 Len=7: 119,187,5,23,1,1,1
          7900 JAMES      03/12/1981 00:00:00 Typ=12 Len=7: 119,181,12,3,1,1,1
          7902 FORD       03/12/1981 00:00:00 Typ=12 Len=7: 119,181,12,3,1,1,1
          7934 MILLER     23/01/1982 00:00:00 Typ=12 Len=7: 119,182,1,23,1,1,1
    14 rows selected.
    Here, you can see from the dumped dates that they are all stored as 7 bytes of data, which sort of resemble the date you see on the screen in some way, but not quite. It's an internal format that Oracle understands and uses, and it's important it's stored in this way so that date arithmetic and date range comparisons can be performed easily (and quickly) in queries.
    If we were to alter our sessions date format (the display format for our session only) and query the data again in the same way...
    SQL> alter session set nls_date_format='YYYY-MM-DD';
    Session altered.
    SQL> select empno, ename, hiredate, dump(hiredate) as dump_hiredate from emp;
         EMPNO ENAME      HIREDATE   DUMP_HIREDATE
          7369 SMITH      1980-12-17 Typ=12 Len=7: 119,180,12,17,1,1,1
          7499 ALLEN      1981-02-20 Typ=12 Len=7: 119,181,2,20,1,1,1
          7521 WARD       1981-02-22 Typ=12 Len=7: 119,181,2,22,1,1,1
          7566 JONES      1981-04-02 Typ=12 Len=7: 119,181,4,2,1,1,1
          7654 MARTIN     1981-09-28 Typ=12 Len=7: 119,181,9,28,1,1,1
          7698 BLAKE      1981-05-01 Typ=12 Len=7: 119,181,5,1,1,1,1
          7782 CLARK      1981-06-09 Typ=12 Len=7: 119,181,6,9,1,1,1
          7788 SCOTT      1987-04-19 Typ=12 Len=7: 119,187,4,19,1,1,1
          7839 KING       1981-11-17 Typ=12 Len=7: 119,181,11,17,1,1,1
          7844 TURNER     1981-09-08 Typ=12 Len=7: 119,181,9,8,1,1,1
          7876 ADAMS      1987-05-23 Typ=12 Len=7: 119,187,5,23,1,1,1
          7900 JAMES      1981-12-03 Typ=12 Len=7: 119,181,12,3,1,1,1
          7902 FORD       1981-12-03 Typ=12 Len=7: 119,181,12,3,1,1,1
          7934 MILLER     1982-01-23 Typ=12 Len=7: 119,182,1,23,1,1,1
    14 rows selected.
    ... whilst the hiredate is now showing on the screen in the format we've chosen, the actual internal storage of those dates remains completely unchanged, i.e. we don't have to change the format of the internal storage of dates to make them display differently.
    Likewise you can format dates manually as part of your query using to_char function...
    SQL> select empno, ename, to_char(hiredate,'DD Month YYYY') as hiredate, dump(hiredate) as dump_hiredate from emp;
         EMPNO ENAME      HIREDATE          DUMP_HIREDATE
          7369 SMITH      17 December  1980 Typ=12 Len=7: 119,180,12,17,1,1,1
          7499 ALLEN      20 February  1981 Typ=12 Len=7: 119,181,2,20,1,1,1
          7521 WARD       22 February  1981 Typ=12 Len=7: 119,181,2,22,1,1,1
          7566 JONES      02 April     1981 Typ=12 Len=7: 119,181,4,2,1,1,1
          7654 MARTIN     28 September 1981 Typ=12 Len=7: 119,181,9,28,1,1,1
          7698 BLAKE      01 May       1981 Typ=12 Len=7: 119,181,5,1,1,1,1
          7782 CLARK      09 June      1981 Typ=12 Len=7: 119,181,6,9,1,1,1
          7788 SCOTT      19 April     1987 Typ=12 Len=7: 119,187,4,19,1,1,1
          7839 KING       17 November  1981 Typ=12 Len=7: 119,181,11,17,1,1,1
          7844 TURNER     08 September 1981 Typ=12 Len=7: 119,181,9,8,1,1,1
          7876 ADAMS      23 May       1987 Typ=12 Len=7: 119,187,5,23,1,1,1
          7900 JAMES      03 December  1981 Typ=12 Len=7: 119,181,12,3,1,1,1
          7902 FORD       03 December  1981 Typ=12 Len=7: 119,181,12,3,1,1,1
          7934 MILLER     23 January   1982 Typ=12 Len=7: 119,182,1,23,1,1,1
    14 rows selected.
    SQL>
    Again, the internal date format remains the same.
    It is important that, when you store date information, you store it as the DATE datatype and let Oracle use its internal format, so that it can accurately do the date arithmetic, date range searches and date ordering etc. that everyone likes to do in queries. If you try to store it as VARCHAR2 then not only can information be lost (i.e. is '01/02/2010' representing 1st February 2010 or is it 2nd January 2010?) but you prevent date arithmetic, range searches and ordering in your queries from working correctly (i.e. '03/01/2009' would work out to be a greater date than '01/02/2010'). To prove it...
    SQL> select 'Wrong' from dual where '03/01/2009' > '01/02/2010';
    'WRON
    Wrong
    SQL> ed
    Wrote file afiedt.buf
      1* select 'Wrong' from dual where to_date('03/01/2009','DD/MM/YYYY') > to_date('01/02/2010','DD/MM/YYYY')
    SQL> /
    no rows selected
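    The same "Wrong" demonstration works in any language; here is a Python sketch (my own illustration): comparing the date strings lexically gives the wrong answer, while comparing parsed dates gives the right one:

    ```python
    from datetime import datetime

    a, b = "03/01/2009", "01/02/2010"
    print(a > b)  # True - lexical comparison says 2009 is "greater" than 2010

    # Parse with an explicit DD/MM/YYYY mask, mirroring the to_date calls above
    da = datetime.strptime(a, "%d/%m/%Y")
    db = datetime.strptime(b, "%d/%m/%Y")
    print(da > db)  # False - real dates compare chronologically
    ```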

  • Result of Calculated Field as Criteria?

    I'm using CASE and CAST to convert a field to datetime format so that I can account for null field values
    CASE
         WHEN ISDATE([Date])=0 THEN 01/01/1900
         ELSE
             CAST([Date] as datetime)
    END as [DateConverted]
    How can I use this new [DateConverted] field in the criteria of my query so that I can only get records in a specific date range?

    I prefer using the below approach for date range comparison
    DECLARE @tmp TABLE (Id INT IDENTITY(1,1),[Date] VARCHAR(20))
    INSERT @tmp SELECT 0
    INSERT @tmp SELECT GETDATE()-1
    INSERT @tmp SELECT GETDATE()
    INSERT @tmp SELECT GETDATE()+1
    --SELECT * FROM @tmp
    SELECT * FROM (
    SELECT Id,CASE
    WHEN ISDATE([Date])=0 THEN 01/01/1900
    ELSE CAST([Date] AS DATETIME)
    END [DateConverted]
    FROM @tmp ) tmp
    WHERE [DateConverted] >='1900-01-01'
    AND [DateConverted] <'2014-01-05'
    Reason being there's a small chance of some records getting wrongly excluded/included when you use the BETWEEN clause, especially when the datetime field also stores a time part.
    http://visakhm.blogspot.in/2012/12/different-ways-to-implement-date-range.html
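    Visakh's point about BETWEEN and time parts can be sketched in Python (the timestamps are made up for illustration): an inclusive upper bound of midnight on the last day silently drops that day's later records, while a half-open interval keeps them:

    ```python
    from datetime import datetime

    rows = [datetime(2014, 1, 4, 0, 0), datetime(2014, 1, 4, 15, 30)]

    start = datetime(2014, 1, 1)
    end_inclusive = datetime(2014, 1, 4)  # like BETWEEN ... AND '2014-01-04'
    next_day = datetime(2014, 1, 5)

    between = [r for r in rows if start <= r <= end_inclusive]
    half_open = [r for r in rows if start <= r < next_day]

    print(len(between))    # 1 - the 15:30 record is wrongly dropped
    print(len(half_open))  # 2 - half-open interval keeps the whole of Jan 4
    ```

    This is why the answer filters with `>=` and `<` rather than BETWEEN.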

  • Help with multiple variable

    I'm trying to write a query which pulls shipment information
    from an access database. Then it grabs the weight of each shipments
    and multiplies it by the corresponding rate depending on its weight
    class.
    I'm using the CFIF, CFELSEIF and CFSET tags to try to accomplish this, making it look something like this:
    <CFIF "Shipments.weight" LT 500 >
    <cfset Shipments.rate = 7.5 >
    <CFELSEIF "Shipments.weight" GT 499 LT 1000 >
    <cfset Shipments.rate = 7 >
    <CFELSEIF "Shipments.weight" GT 999 LT 2000 >
    <cfset Shipments.rate = 6.5 >
    <CFELSEIF "Shipments.weight" GT 1999 LT 5000 >
    <cfset Shipments.rate = 5 >
    <CFELSEIF "Shipments.weight" GT 4999 >
    <cfset Shipments.rate = 3.65 >
    </CFIF>
    then in the Output tag I use this CFSCRIPT
    <cfscript>
    WriteOutput(#ACC_Report.weight#/100*#Shipments.rate#);
    </cfscript>
    It runs fine but it only seems to grab either the first or the second set rate variable and multiplies all the weights in the output query by it, instead of logically choosing the weight class. Any help? Perhaps I'm using the wrong logic.
    Thanks in advance

    "There's no such thing as "greater-than-ness" on string data"
    Adam, everything you said is basically true and relevant to the original poster's issue. But this line is wrong, is it not? Isn't comparing one string to another and determining which is greater or lesser the essence of sorting a set of strings into alphabetical order?
    Now, I am not sure I can remember the last time I had to write my own sort routine. But I am pretty sure that when I did, for sorting string data, I continually compared one string to another to see which was greater and|or lesser than the other.
    To reiterate the original post: the person is comparing the string constant "Shipments.weight" to the numbers due to the improper quotes, not the value in the Shipments.weight variable.
    Also, if one cares, the logic could be greatly simplified with some basic boolean logic. The lower range comparisons are unnecessary.
    <CFIF Shipments.weight LT 500 >
    <!--- weight is less than 500 --->
    <cfset Shipments.rate = 7.5 >
    <CFELSEIF Shipments.weight LT 1000 >
    <!--- weight is greater than or equal to 500 since it was not caught in the first branch, but it is also less than 1000 --->
    <cfset Shipments.rate = 7 >
    <CFELSEIF Shipments.weight LT 2000 >
    <!--- weight is greater than or equal to 1000 since it was not caught in any previous branch, but it is also less than 2000 --->
    <cfset Shipments.rate = 6.5 >
    <CFELSEIF Shipments.weight LT 5000 >
    <!--- weight is greater than or equal to 2000 since it was not caught in any previous branch, but it is also less than 5000 --->
    <cfset Shipments.rate = 5 >
    <CFELSE>
    <!--- weight is greater than or equal to 5000; it was not caught in any previous branch --->
    <cfset Shipments.rate = 3.65 >
    </CFIF>
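    The simplified cascade translates directly into any language with if/elif. Here is a Python sketch of the same rate ladder (the rates are copied from the thread; the function name is my own):

    ```python
    def shipment_rate(weight):
        # Same cascade as the CFML above: each branch only needs an upper bound,
        # because earlier branches have already excluded the lower weights.
        if weight < 500:
            return 7.5
        elif weight < 1000:
            return 7.0
        elif weight < 2000:
            return 6.5
        elif weight < 5000:
            return 5.0
        else:
            return 3.65

    for w in (250, 750, 1500, 3000, 8000):
        print(w, shipment_rate(w))
    ```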
