Different results for decimal values between SQL and B1 RecordSet

Hi Experts,
This is really a strange one. I have the query below:
SELECT Amount, TaxPrcnt FROM [@Table] WHERE ORDER# = '20246' AND ItemCode = '112488'
The same query gives me 0.085 for TaxPrcnt if I run it in SQL Server Management Studio, but gives me 8.5 if I run it in the B1 Query Generator, and the same if I use the RecordSet object in the DI API. The actual value in the table is 0.085. Does anyone know why B1 is multiplying the value by 100?

Hi Ronnie,
sp_prepexec is a system stored procedure used to prepare and execute a parameterized SQL statement. Have a look at:
http://jtds.sourceforge.net/apiCursors.html#_sp_prepexec
Basically, it's another way of passing a query to SQL rather than using a stored procedure. This command shouldn't be converting the data in any way, unless there's a specific cast or calculation in the query itself.
Can you give simple details of the SQL table you created and I'll see if I can replicate the issue.
Kind Regards,
Owen
P.S. Sorry about the query, I wrote that down this morning before all my brain cells were awake. Referencing a field by index only works when grouping and sorting.
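One quick way to narrow it down (just a suggestion, reusing the table and key values from your query): compare the column with an explicit string cast. If the cast still shows 0.085 in the B1 Query Generator and in the DI RecordSet, then the 8.5 is only display formatting for a percentage-type field, not a change to the stored value.
SELECT Amount,
       TaxPrcnt,
       CAST(TaxPrcnt AS NVARCHAR(20)) AS TaxPrcntRaw  -- string cast should bypass client-side percent formatting
FROM [@Table] WHERE ORDER# = '20246' AND ItemCode = '112488'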

Similar Messages

  • Different results of same calculation between SQL and PL/SQL

    This SQL statement:
    select 1074 * (4 / 48) from dual;
    gives the result 89.5.
    However this PL/SQL block
    declare
        tmp     NUMBER;
    begin
        SELECT 1074 * (4 / 48) into tmp from dual;
        dbms_output.put_line('Result '||tmp);
    end;
    gives a different result:
    Result 89.49999999999999999999999999999999999996
    If I change and give my variable tmp a precision and scale, say (38,36) then the result is 89.5.
    Edit. I have done this on both 10g (10.2.0.4.0) and 11g (11.1.0.7.0) with the same result in both.
    Edited by: kendenny on Jul 9, 2010 10:19 AM for additional information

    What's your current NUMWIDTH value in SQL*Plus (I'm assuming you are using that as your tool)?
    SQL> set numwidth 50
    SQL> select 1074 * (4 / 48) from dual;
                                           1074*(4/48)
             89.49999999999999999999999999999999999996
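
    For reference, a minimal sketch of the workaround kendenny mentions above (assuming the NUMBER(38,36) precision and scale from the post): giving tmp an explicit scale makes the assignment round the long intermediate result, so the block prints 89.5 as well.
    declare
        tmp     NUMBER(38,36);   -- precision/scale suggested in the post above
    begin
        SELECT 1074 * (4 / 48) into tmp from dual;
        dbms_output.put_line('Result '||tmp);
    end;
    /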

  • Different behavior for Power View between Excel and Power Bi Site

    Hello,
    this problem is driving me crazy...
    I have a PowerPivot data model and created a Power View report that includes three charts for Year, Quarter and Brand, and a matrix that allows me to drill down into the different projects. Each project in the matrix can be clicked to view the project details.
    The strange thing is that in Excel on my desktop and in the Power BI app for Surface RT, when I press a project in the drill down the previous filters remain set, while if I do the same in Power BI Sites the filters are lost. I tried with Chrome and IE, but the behavior is the same. Am I missing something?
    Thank you so much

    Dear Will,
    thank you so much for your reply.
    I'm attaching some screenshots from the Power BI site. I removed the classes and the data, but you should get an idea of the scenario. Actually I'm not setting the Year and Quarter filters as view filters in the Power View report.
    I'm just using cross filtering, and I realized that the behavior is different between Excel on the desktop, the Power BI Surface RT app and the BI sites (with all browsers).
    First image, with 2014 selected (I just clicked on 2014): the data in the matrix has been cleared.
    After I select the drill down:
    Thank you so much for your help, I really appreciate it!
    --silvano

  • Use context to pass value between JPF and Java Control

    hi,
    Can we use a context to pass values between a JPF and a Java Control? It works if I am using InitialContext on the same machine, but I don't think it will work if I put the web server and the application server on different machines. Does anyone have an idea how to do it?
    I also want to get the method name in the onAcquire callback method. Does anyone know how to do this?
    Thanks
    ?:|

    Hi.
    You can use the following options to do this:
    1. Pass your values in the HTTP request.
    example: http://localhost/index.jsp?var1=value1&var2=value2&var3....
    2. You can use an invisible frame (of height or width 0) and keep your variables and values there using JavaScript, but this way you will only be able to check the values on the client side, not on the server. This way, the values of the variables are also not visible to the user in the navigation bar.
    Greetings.

  • Alter script to keep values between 1 and 99

    Hi guys, I currently have a table (for the sake of this let's call it my_table, with a column my_column which is an integer). I want to restrict the values of this column to values between 1 and 99 and was wondering how I would build this constraint using a script.
    Thanks for any help I may receive on this.

    Adding a check constraint should serve the purpose.
    create table t_test (
      col1 integer
    );
    alter table t_test add constraint check_col1_val check (col1 between 1 and 99);
    insert into t_test values(1);
    insert into t_test values(99);
    insert into t_test values(81);
    insert into t_test values(0);
    insert into t_test values(100);
    Running the last two insert statements will give you an error:
    insert into t_test values(0)
    Error report:
    SQL Error: ORA-02290: check constraint (HI_OWNER.CHECK_COL1_VAL) violated
    02290. 00000 -  "check constraint (%s.%s) violated"
    *Cause:    The values being inserted do not satisfy the named check constraint.
    *Action:   do not insert values that violate the constraint.
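    If the table already contains rows, the ALTER will fail when any existing value is outside the range. A possible variation (just a sketch, using the same hypothetical t_test table) is to enforce the rule for new and changed rows only:
    -- checks future inserts/updates but does not validate rows already in the table
    alter table t_test add constraint check_col1_val
      check (col1 between 1 and 99) enable novalidate;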

  • Select just the values between min and max of an accumulated value over day

    Hello Forum,
    a value is accumulated over a day and over a period of time. The next day the value is reset and starts accumulating again:
    with sampledata as (select to_date('09.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union all
                       select to_date('09.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 29 val from dual union all
                       select to_date('09.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 30 val from dual union all
                       select to_date('09.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 45 val from dual union all
                       select to_date('09.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 60 val from dual union all
                       select to_date('09.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 75 val from dual union all
                       select to_date('09.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 95 val from dual union all
                       select to_date('09.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('09.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('09.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('09.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('09.09.2012 17:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('09.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('09.09.2012 23:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 14 val from dual union all
                       select to_date('10.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 34 val from dual union all
                       select to_date('10.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 58 val from dual union all
                       select to_date('10.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 70 val from dual union all
                       select to_date('10.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
                       select to_date('10.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
                       select to_date('10.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
                       select to_date('10.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
                       select to_date('10.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
                       select to_date('10.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
                       select to_date('10.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
                       select to_date('10.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual)
    select   ts, val
    from     sampledata
    order by ts asc;
    How should I change the select statement to skip all rows before the first minimum and the duplicates after the maximum of a day, in order to get a result like this:
    TS     VAL
    09.09.12 06:12     23
    09.09.12 07:12     29
    09.09.12 08:12     30
    09.09.12 09:12     45
    09.09.12 10:12     60
    09.09.12 11:12     75
    09.09.12 12:21     95
    09.09.12 13:21     120
    09.09.12 14:21     142
    10.09.12 06:12     14
    10.09.12 07:12     34
    10.09.12 08:12     58
    10.09.12 09:12     70
    10.09.12 10:12     120
    10.09.12 11:12     142
    10.09.12 12:21     153
    Thank you

    This solution works perfectly when the accumulated value has its low and its high on the same day. But I found out :( that there is also data which has its low yesterday and its high today. For a better understanding of the case: there is a machine which works over 3 shifts with irregular start and end times. For example, shift 1 can start at 5:50 or at 7:15. The accumulated value of the worked time is accumulated separately for each shift. This solution works for shift 1 (approximately between 06:00-14:00) and for shift 2 (approximately between 14:00-22:00), because the low and the high of the accumulated value are on the same day. It does not work for shift 3 (approximately between 22:00-06:00), because the high of the accumulated value is, or can be, on the next day.
    So the thread title should be: "Select just the values between min and max of an accumulated value over the same day (today) or over two successive days (yesterday and today)"
    Sampledata for shift 1 or shift 2:
    {code}
    with sampledata as (select to_date('09.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union all
    select to_date('09.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 29 val from dual union all
    select to_date('09.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 30 val from dual union all
    select to_date('09.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 45 val from dual union all
    select to_date('09.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 60 val from dual union all
    select to_date('09.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 75 val from dual union all
    select to_date('09.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 95 val from dual union all
    select to_date('09.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 120 val from dual union all
    select to_date('09.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('09.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('09.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('09.09.2012 17:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('09.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('09.09.2012 23:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 143 val from dual union all
    select to_date('10.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 144 val from dual union all
    select to_date('10.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 145 val from dual union all
    select to_date('10.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 147 val from dual union all
    select to_date('10.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 148 val from dual union all
    select to_date('10.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
    select to_date('10.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
    select to_date('10.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
    select to_date('10.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
    select to_date('10.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual union all
    select to_date('10.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 153 val from dual)
    , got_analytics AS
    (
        SELECT  ts, val
        ,       MIN (val) OVER ( PARTITION BY TRUNC (ts)
                                 ORDER BY     ts DESC
                               )   AS min_val_after          -- lowest val from this row to the end of its day
        ,       CASE
                    WHEN ROW_NUMBER () OVER ( PARTITION BY TRUNC (ts)
                                              ORDER BY     val
                                              ,            ts
                                            ) = 1
                    THEN -1   -- Impossibly low val
                    ELSE LAG (val) OVER ( PARTITION BY TRUNC (ts)
                                          ORDER BY     ts
                                        )
                END   AS prev_val                            -- previous val within the day (-1 for the day's lowest row)
        ,       MIN (val) OVER (PARTITION BY TRUNC (ts))
                                   AS low_val_today          -- lowest val of the whole day
        ,       NVL ( LAST_VALUE (val) OVER ( ORDER BY ts
                                              RANGE BETWEEN UNBOUNDED PRECEDING
                                                    AND     ts - TRUNC (ts) PRECEDING
                                            )
                    , -1
                    )              AS last_val_yesterday     -- last val on or before this day's midnight (-1 if none)
        FROM    sampledata
    )
    SELECT  ts
    ,       val
    FROM    got_analytics
    WHERE   val <= min_val_after
    AND     val >  prev_val
    AND     (    val >  low_val_today
              OR val != last_val_yesterday
            )
    ORDER BY ts
    {code}
    with the expected results:
    {code}
    1     09.09.2012 06:12:02     23
    2     09.09.2012 07:12:03     29
    3     09.09.2012 08:12:04     30
    4     09.09.2012 09:12:11     45
    5     09.09.2012 10:12:12     60
    6     09.09.2012 11:12:13     75
    7     09.09.2012 12:21:24     95
    8     09.09.2012 13:21:26     120
    9     09.09.2012 14:21:27     142
    10     10.09.2012 06:12:02     143
    11     10.09.2012 07:12:03     144
    12     10.09.2012 08:12:04     145
    13     10.09.2012 09:12:11     146
    14     10.09.2012 10:12:12     147
    15     10.09.2012 11:12:13     148
    16     10.09.2012 12:21:24     153
    {code}
    And the sampledata for shift 3 is:
    {code}
    with sampledata as (select to_date('08.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union ALL
    select to_date('08.09.2012 02:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 45 val from dual union all
    select to_date('08.09.2012 05:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 78 val from dual union all
    select to_date('08.09.2012 06:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 08:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 10:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 12:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 16:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 17:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 19:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 21:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 90 val from dual union all
    select to_date('08.09.2012 22:00:12', 'dd.mm.yyyy hh24:mi:ss') ts, 24 val from dual union all
    select to_date('08.09.2012 22:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 40 val from dual union all
    select to_date('08.09.2012 23:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 68 val from dual union all
    select to_date('09.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 79 val from dual union all
    select to_date('09.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 124 val from dual union all
    select to_date('09.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 125 val from dual union all
    select to_date('09.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 126 val from dual union all
    select to_date('09.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 17:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union all
    select to_date('09.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 127 val from dual union ALL
    select to_date('09.09.2012 22:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 5 val from dual union ALL
    select to_date('09.09.2012 22:51:33', 'dd.mm.yyyy hh24:mi:ss') ts, 23 val from dual union all
    select to_date('09.09.2012 23:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 40 val from dual union all
    select to_date('10.09.2012 00:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 50 val from dual union all
    select to_date('10.09.2012 01:03:08', 'dd.mm.yyyy hh24:mi:ss') ts, 60 val from dual union all
    select to_date('10.09.2012 02:54:11', 'dd.mm.yyyy hh24:mi:ss') ts, 78 val from dual union all
    select to_date('10.09.2012 03:04:08', 'dd.mm.yyyy hh24:mi:ss') ts, 142 val from dual union all
    select to_date('10.09.2012 04:04:19', 'dd.mm.yyyy hh24:mi:ss') ts, 145 val from dual union all
    select to_date('10.09.2012 05:04:20', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 06:12:02', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 07:12:03', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 08:12:04', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 09:12:11', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 10:12:12', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 11:12:13', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 12:21:24', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 13:21:26', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 14:21:27', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 15:21:30', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 16:21:32', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual union all
    select to_date('10.09.2012 21:21:33', 'dd.mm.yyyy hh24:mi:ss') ts, 146 val from dual)
    , got_analytics AS
    (
        SELECT  ts, val
        ,       MIN (val) OVER ( PARTITION BY TRUNC (ts)
                                 ORDER BY     ts DESC
                               )   AS min_val_after
        ,       CASE
                    WHEN ROW_NUMBER () OVER ( PARTITION BY TRUNC (ts)
                                              ORDER BY     val
                                              ,            ts
                                            ) = 1
                    THEN -1   -- Impossibly low val
                    ELSE LAG (val) OVER ( PARTITION BY TRUNC (ts)
                                          ORDER BY     ts
                                        )
                END   AS prev_val
        ,       MIN (val) OVER (PARTITION BY TRUNC (ts))
                                   AS low_val_today
        ,       NVL ( LAST_VALUE (val) OVER ( ORDER BY ts
                                              RANGE BETWEEN UNBOUNDED PRECEDING
                                                    AND     ts - TRUNC (ts) PRECEDING
                                            )
                    , -1
                    )              AS last_val_yesterday
        FROM    sampledata
    )
    SELECT  ts
    ,       val
    FROM    got_analytics
    WHERE   val <= min_val_after
    AND     val >  prev_val
    AND     (    val >  low_val_today
              OR val != last_val_yesterday
            )
    ORDER BY ts
    {code}
    with the unexpected results:
    {code}
    - ts val
    1     08.09.2012 00:04:08     23
    2     08.09.2012 22:12:13     40
    3     08.09.2012 23:21:24     68
    4     09.09.2012 22:21:33     5
    5     09.09.2012 22:51:33     23
    6     09.09.2012 23:21:33     40
    7     10.09.2012 00:04:08     50
    8     10.09.2012 01:03:08     60
    9     10.09.2012 02:54:11     78
    10     10.09.2012 03:04:08     142
    11     10.09.2012 04:04:19     145
    12     10.09.2012 05:04:20     146
    {code}
    The result should be:
    {code}
    - ts val
    1     08.09.2012 00:04:08     23
    2     08.09.2012 02:04:08     45
    3     08.09.2012 05:03:08     78
    4     08.09.2012 06:54:11     90
    5     08.09.2012 22:00:12     24
    6     08.09.2012 22:12:13     40
    7     08.09.2012 23:21:24     68
    8     09.09.2012 01:03:08     79
    9     09.09.2012 02:54:11     124
    10     09.09.2012 03:04:08     125
    11     09.09.2012 04:04:19     126
    12     09.09.2012 05:04:20     127
    13     09.09.2012 22:21:33     5
    14     09.09.2012 22:51:33     23
    15     09.09.2012 23:21:33     40
    16     10.09.2012 00:04:08     50
    17     10.09.2012 01:03:08     60
    18     10.09.2012 02:54:11     78
    19     10.09.2012 03:04:08     142
    20     10.09.2012 04:04:19     145
    21     10.09.2012 05:04:20     146
    {code}
    Thank you for your help!
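    One possible direction for the shift 3 case (an untested sketch, reusing the sampledata and column names from above): instead of partitioning by TRUNC(ts), derive a group id that increases at every reset, i.e. whenever the accumulated value drops below the previous reading, so an accumulation run that crosses midnight stays in one group. The rows carried over from before the sample window would still form their own leading group and may need extra filtering.
    {code}
    -- untested sketch: group by accumulation run instead of by calendar day
    with got_resets as
    (
        select  ts, val
        ,       case when val < lag (val) over (order by ts) then 1 else 0 end  as is_reset
        from    sampledata
    )
    , got_groups as
    (
        select  ts, val
        ,       sum (is_reset) over (order by ts)  as grp   -- group id: increases by 1 at every reset
        from    got_resets
    )
    select  ts, val
    from   (select  ts, val
            ,       lag (val, 1, -1) over (partition by grp order by ts)  as prev_val
            from    got_groups
           )
    where   val > prev_val    -- keep only rows where the accumulated value actually grew
    order by ts;
    {code}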

  • Difference in value Between BW and R/3

    Hi,
    We are in the process of validating reports. We have observed a difference in value for the credit and debit amount fields in BW; the data comes from the standard DataSource 0EN_JVA_2 (Joint Venture: Actual line item using delta extraction) and we have done a full data load. In the DataSource the data comes from a single field that is split based on the debit/credit indicator in R/3.
    I need your valuable input on where the problem could be that causes the difference in values between BW and R/3.
    Regards
    Deva

    Hi,
    Are there any filters in the InfoPackage (IP)?
    Any data packet deletion activity in the start routine?
    Does any update rule for the key figure have a routine that modifies the value?
    Kindly check.
    In case you are comparing the number of records, remember that aggregation always happens.
    If the difference is in the quantity of the key figure values, kindly check the above three.
    Kindly come back to us so that we can suggest further.
    Hope this helps
    Janardhan Kumar

  • I need a table of the different features for Adobe Standard 9, 10 and 11?

    I need a table of the different features for Adobe Standard 9, 10 and 11 so that I can advise my client which version can be packaged and deployed via SCCM. A detailed list of features for each version would be most useful.
    Regards Paul Jenkins

    See the following documents:
    Acrobat Standard 9, X and XI : http://www.adobe.com/products/acrobatstandard/buying-guide-version-comparison.html
    Acrobat Pro 9, X and XI : http://www.adobe.com/products/acrobatpro/buying-guide-version-comparison.html
    Back to version 7: http://prodesigntools.com/version-comparison-differences-between-adobe-acrobat-x-10-9-8-7.html

  • Rebate: Differ "Scale Base Value" between "Statistics" and "Billing Doc"

    Dear SD Expert,
    After running transaction code "VOB3" (report RV15B002), I found that the "Scale Base Value" differs between "Statistics" and "Billing Doc", and the correct value is the "Billing Doc" one. However, the value shown in my rebate condition now is the "Statistics" value. Thus, my question is:
    How can I configure my rebate condition to be based on the "Billing Doc" value instead of "Statistics"? (I don't want to run "Correct backlogs" every time the scale base value changes.) Please feel free to make suggestions.
    Thanks,
    Prach

    Hi,
    This may be useful to you.
    http://help.sap.com/saphelp_47x200/helpdata/en/dd/5612d8545a11d1a7020000e829fd11/content.htm
    Vrajesh

  • Can We use FDM as ETL tool between SQL and Oracle

    I want to use FDM as an ETL tool between SQL Server and Oracle. Is this possible? I didn't find any target adapter for an Oracle database. My source system is SQL Server and my target system is an Oracle database.
    Rahul
    Edited by: user12190125 on Nov 9, 2009 4:23 AM

    Rahul,
    I believe this is possible to do, but not an easy one and there are a few considerations:
    How much data are you processing? FDM has a lot of features which support the business process. While this is great for users and audit trail etc. it slows down performance if you want to process a lot of data. It also depends on the type of mappings you use (Like mappings are slower than explicit mappings).
    How familiar are you with VBScript? There is no explicit target adapter for Oracle, but there is a data mart adapter which can be used for anything. You have to implement everything yourself though, mainly the Export and Load actions. In there you will also have to handle the connections to the MSSQL and Oracle databases.
    Check the data mart adapter and see if you feel comfortable with defining the vb code in there. There are reasons for and against this approach. ODI would probably be the better choice unless you really need to have FDM's process support.
    Regards,
    Matt

  • Activity type is mandatory: enter value between 01 and 07

    Hi guys,
    I would like to ask you for help..
    Recently I updated the App CRM Sales to 2.0.3
    Now, when I try to create an activity/task on iPhone (iOS 6), it shows me this error: "Activity type is mandatory: enter value between 01 and 07".
    Do I need to apply any SAP note?
    Does anybody have any idea?
    Regards,
    Eder.

    Hi Masayuki,
    Thanks again for your help.
    So, when I create an activity/task in the CRM backend I don't have a problem; I can see it on my mobile device (iPhone).
    The problem is when I try to create a new activity in the CRM Sales app (iPhone).
    I think there is some mandatory field in the CRM backend that isn't filled in by the CRM Sales app.
    Thanks,

  • How to get the leading zeros for decimal values?

    Hi,
      How will I get the leading zeros for decimal values? CONVERSION_EXIT_ALPHA_INPUT is not working for this. Right now I am using OVERLAY to get the leading zeros, but I am getting a value like 00013.500, whereas as per my requirement I want to display the value as 0000013.5.
    my code is:
                    overlay w_MetLife_detail-rdempsalary with '000000000'.
                    data: rdempsalary type char9.
    Please help me on this.
    Regards,
    Sujan

    Hi
    For more info,
    The statement UNPACK is based on the fact that the BCD representation of a decimal digit corresponds to the second half-byte of the digit's code in most character representations. This conversion is commonly called "unpacking".
    The statement PACK, which packs, is obsolete and can be replaced by MOVE.
    If destination is specified as an untyped field symbol or an untyped formal parameter and is not flat and not character-type during execution of the statement, an untreatable exception occurs in Unicode programs. In non-Unicode programs, an exception occurs only with deep types, whereas flat types are treated as character-type.
    Example
    After the assignments, char1 and char2 contain the values "123.456" and "0000123456".
    DATA: pack  TYPE p LENGTH 8 DECIMALS 3 VALUE '123.456',
          char1 TYPE c LENGTH 10,
          char2 TYPE c LENGTH 10.
    MOVE   pack TO char1.
    UNPACK pack TO char2.
    Regards

  • GET function fails to get values between 1 and 0

    Hi, I have the following code:
    GET(CONTAVENDAS="PERCVOLUMECORP",FONTEDADO="INPUT",TEMPO="2011.INP",PRODUTO="%SPECIE%_IN",CANAL="NACANAL",ORGCOMERCIAL="NAORGCOM")
    When the value in the PERCVOLUMECORP account is greater than or equal to 1, let's say 80, the GET function returns the correct value, but when I have a value between 0 and 1, for instance 0.25, the GET function returns 0 (zero).
    Any ideas?
    Thanks in advance.
    Leandro
    Brasil

    Here is the entire script
    **SELECT(%ANO_BD_TRAB%,[YEAR],CATEGORIA,"ID='BDTRABALHO'")
    **SELECT(%ANO_BD_TRAB1%,[YEAR]+1,CATEGORIA,"ID='BDTRABALHO'")
    **SELECT(%ANO_BD_TRAB2%,[YEAR]+2,CATEGORIA,"ID='BDTRABALHO'")
    **SELECT(%MES_INCIO%, STARTMTH, CATEGORIA, "ID='BDTRABALHO'")
    **SELECT(%MESES%,[ID],TEMPO,"MONTHNUM>=%MES_INCIO% AND YEAR=%ANO_BD_TRAB% AND LEVEL='MONTH'")
    **SELECT(%MESES1%,[ID],TEMPO,"MONTHNUM>=%MES_INCIO% AND YEAR=%ANO_BD_TRAB1% AND LEVEL='MONTH'")
    **SELECT(%MESES2%,[ID],TEMPO,"MONTHNUM>=%MES_INCIO% AND YEAR=%ANO_BD_TRAB2% AND LEVEL='MONTH'")
    **SELECT(%CONTAS%, [ID], CONTAVENDAS, "TOPDOWNALLOC='Y'")
    *SELECT(%TODOSPROD%, [ID],PRODUTO,"GRUPO='PRODUTO'")
    **SELECT(%SPECIES%, [ID],PRODUTO,"GRUPO='TIPOESPECIE' AND FLAG_TOPDA='N'")
    **SELECT(%SPECIES_IN%, [ID],PRODUTO,"GRUPO='TIPOESPECIE' AND FLAG_TOPDA='Y'")
    *XDIM_MEMBERSET TEMPO = %MESES%,%MESES1%,%MESES2%,%ANO_BD_TRAB%.INP,%ANO_BD_TRAB1%.INP,%ANO_BD_TRAB2%.INP
    *XDIM_MEMBERSET CONTAVENDAS = %CONTAS%,PERCVOLUMECORP
    *XDIM_MEMBERSET PRODUTO = %SPECIES%,%SPECIES_IN%,%TODOSPROD%
    *OLAPLOOKUP VENDAS
      *DIM FONTEDADO="INPUT"
      *DIM TEMPO="2011.INP"
      *DIM PRODUTO="C_IN"
      *DIM CANAL="NACANAL"
      *DIM ORGCOMERCIAL="NAORGCOM"
      *DIM PERCVOL:CONTAVENDAS="PERCVOLUMECORP"
    *ENDLOOKUP
    *FOR %CONTA% = VOLUME
      *FOR %MES% = 2011.JAN
        *FOR %SPECIE% = C AND %SPECIENAME% = CERVEJAS
         *WHEN PRODUTO.SPECIETYPE
         *IS "%SPECIENAME%"
           *WHEN CONTAVENDAS
           *IS "%CONTA%"
             *WHEN TEMPO
             *IS "%MES%"
                    *REC(CONTAVENDAS="%CONTA%",EXPRESSION=LOOKUP(PERCVOL))
             *ENDWHEN
           *ENDWHEN
         *ENDWHEN
        *NEXT
      *NEXT
    *NEXT
    *COMMIT
    Thanks in advance.

  • Passing values between jsp and php

    Hi there, is it possible to pass values between JSP and PHP? I really need to find out the way if there is any. Thanks in advance.
    -azali-

    Yes, there are a few ways to do this.
    1) Think about using Cookies.
    2) Maybe use a Redirect passing the values in the Query string.
    3) Retain the data in a repository in the back end.
    4) Using Hidden fields within your pages.
    I am sure you can use these ideas as a base to develop other methods for passing values back and forth from JSP -> PHP and vice versa.
    -Richard Burton

  • Different ways to establish SSO between Portal and ADP

    Hi,
    We are implementing payroll with the help of ADP.
    Please let me know different ways of establishing SSO between portal  and ADP
    Thanks
    Bala Duvvuri

    You may run into a few issues. SSO with logon tickets is based on accessing web sites in the same domain. So, if the portal is on http://ourportal.company.com, then the web site being accessed needs to have a URL like http://adphosted.company.com. Is the ADP system accessible by a DNS alias that is within company.com? If so, you're OK. If not, then there will be problems.
    The other SSO method is user mapping, but the security implications are not good...
