Query on SC 3.0

Hi,
In Sun Cluster 2.2, I saw the concept of logical hosts being configured with disksets, and we could change the mastery of the disk groups among the nodes using the haswitch command. How does this concept work in Sun Cluster 3.0? I didn't get a clear idea from reading the documentation on the global namespace and global devices.
Please clarify this for me.
Thanks,
babu.

SC 3 is a little different with respect to the global file system. There will be a
primary node, which is analogous to the node that owns the diskset in SC 2.2.
For a global file system, the primary node is the node that reads and writes
directly to the disk; other nodes access the disks via the cluster interconnect.
The path is transparent to the application, though there is a potential
performance impact depending on what you are doing and whether the cache is warmed.
In the case of simple failover services (non-scalable services),
it is worthwhile to make sure the node hosting the resource group
is also the GFS primary. An HAStoragePlus resource in that resource group
will do this work for you.
-- richard

Similar Messages

  • SC3.1, Sol10, SE6920 storage - global disk I/O query

    Hi, if anybody could throw some light on this problem it would be much appreciated!
    We have a cluster which I believe is not behaving as expected when dealing with disk I/O to the global storage (or my understanding of this is incorrect).
    In our certified 3-node (E2900) cluster with fiber-attached 6920 storage (dual FC paths with MPxIO), we do not use SVM for global disk redundancy because there is hardware RAID. We are NOT using device groups or metasets.
    In previous clusters we have always used SVM metasets as device groups to control node ownership of the disks, and I/O from a node not owning the disks went via the interconnects.
    However, in this case I thought that all 3 nodes should have direct I/O to the global storage - is this correct?
    We have seen that write activity from all the nodes is only going through one node - how can this be, and how can we identify the node which has ownership of the global storage?
    Thanks for your time,
    Trevor

    Hi, not sure if this is exactly what you are looking for... the LUNs from the SE6920 are viewed as:
    1 sdp-1:/dev/rdsk/c0t0d0 /dev/did/rdsk/d1
    2 sdp-1:/dev/rdsk/c1t0d0 /dev/did/rdsk/d2
    3 sdp-1:/dev/rdsk/c1t1d0 /dev/did/rdsk/d3
    4 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003714d0 /dev/did/rdsk/d4
    5 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003362d0 /dev/did/rdsk/d5
    6 sdp-1:/dev/rdsk/c5t600015D00005AE00000000000000335Fd0 /dev/did/rdsk/d6
    7 sdp-1:/dev/rdsk/c5t600015D00005AE00000000000000335Cd0 /dev/did/rdsk/d7
    8 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003359d0 /dev/did/rdsk/d8
    9 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003356d0 /dev/did/rdsk/d9
    10 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003353d0 /dev/did/rdsk/d10
    11 sdp-1:/dev/rdsk/c5t600015D00005AE00000000000000370Bd0 /dev/did/rdsk/d11
    12 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003707d0 /dev/did/rdsk/d12
    13 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003703d0 /dev/did/rdsk/d13
    14 sdp-1:/dev/rdsk/c5t600015D00005AE00000000000000370Fd0 /dev/did/rdsk/d14
    15 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003350d0 /dev/did/rdsk/d15
    16 sdp-1:/dev/rdsk/c5t600015D00005AE00000000000000334Dd0 /dev/did/rdsk/d16
    17 sdp-1:/dev/rdsk/c5t600015D00005AE00000000000000334Ad0 /dev/did/rdsk/d17
    18 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003347d0 /dev/did/rdsk/d18
    19 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003344d0 /dev/did/rdsk/d19
    20 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003341d0 /dev/did/rdsk/d20
    21 sdp-1:/dev/rdsk/c5t600015D00005AE00000000000000333Ed0 /dev/did/rdsk/d21
    22 sdp-1:/dev/rdsk/c5t600015D00005AE00000000000000333Bd0 /dev/did/rdsk/d22
    23 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003338d0 /dev/did/rdsk/d23
    24 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003335d0 /dev/did/rdsk/d24
    25 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003332d0 /dev/did/rdsk/d25
    26 sdp-1:/dev/rdsk/c5t600015D00005AE00000000000000332Fd0 /dev/did/rdsk/d26
    27 sdp-1:/dev/rdsk/c5t600015D00005AE00000000000000332Cd0 /dev/did/rdsk/d27
    28 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003329d0 /dev/did/rdsk/d28
    29 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003326d0 /dev/did/rdsk/d29
    30 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003323d0 /dev/did/rdsk/d30
    31 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003320d0 /dev/did/rdsk/d31
    32 sdp-1:/dev/rdsk/c5t600015D00005AE00000000000000331Dd0 /dev/did/rdsk/d32
    33 sdp-1:/dev/rdsk/c5t600015D00005AE00000000000000331Ad0 /dev/did/rdsk/d33
    34 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003317d0 /dev/did/rdsk/d34
    35 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003314d0 /dev/did/rdsk/d35
    36 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003311d0 /dev/did/rdsk/d36
    37 sdp-1:/dev/rdsk/c5t600015D00005AE00000000000000330Ed0 /dev/did/rdsk/d37
    38 sdp-1:/dev/rdsk/c5t600015D00005AE00000000000000330Bd0 /dev/did/rdsk/d38
    39 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003308d0 /dev/did/rdsk/d39
    40 sdp-1:/dev/rdsk/c5t600015D00005AE000000000000003305d0 /dev/did/rdsk/d40
    but the mounted filesystems are:
    /dev/global/dsk/d5s6 /dev/global/rdsk/d5s6 /global/sdp ufs 2 yes global,logging
    When I write to the above filesystem from all 3 nodes, iostat only reports write activity on one of the nodes.

  • Why does the query work in SQL Developer, but not in APEX?

    Hi, guys:
    I have a silly question. I have a complicated query, and I tested it successfully with SQL Developer; the result is correct. However, when I put it into APEX to generate a report, it always reports "no data found". I know the condition related to the "other marks" select list in the WHERE clause causes this problem. It also looks strange: before I add this condition, everything works OK; after I add it, even if I do not choose anything in the "other marks" select list, the other components do not work either, and I always get a "no data found" result. Could anyone help me with this problem? You can also visit our development site at http://lsg-solutions.com:8888/apex/f?p=206 to check the problem.
    Thanks a lot
    Sam
    select distinct 'MAP','Detail',so.doc_number as "DOC Number", so.offender_id as "Offender ID", so.first_name||' '|| so.middle_name||' '||so.last_name as "Offender Name",
    so.date_of_birth as "Date of Birth",
    (select sc1.description from sor_code sc1 where sc1.code_id=so.race) as "Race",
    (select sc2.description from sor_code sc2 where sc2.code_id=so.sex) as "Sex",
    (select sc8.description from sor_code sc8 where sc8.code_id=so.hair_color) as "Hair Color",
    (select sc9.description from sor_code sc9 where sc9.code_id=so.eye_color) as "Eye Color",
    replace(replace(nvl2(sl.address1, sl.address1||' '||sl.address2 ||' '||sl.city ||' '||sl.county||' '||(select sc3.description from sor_code sc3 where sc3.code_id=sl.state)||' '||sl.zip, 'No Known Address'),'#'),',') as "Address",
    replace(replace(nvl2(sl.physical_address1,sl.physical_address1||' '||sl.physical_city ||' '||sl.physical_county||' '||(select sc4.description from sor_code sc4 where sc4.code_id=sl.physical_state)||' '||sl.physical_zip, 'No Known Address'),'#'),',') as "Physical Address",
    sl.status as "Status",
    sl.jurisdiction as "Jurisdiction",
    to_char(sl.ADDRESS_LATITUDE) as "Address Latitude",to_char(sl.address_longitude) as "Address Longitude",
    to_char(sl.physical_address_latitude) as "Physical Latitude",to_char(sl.physical_address_Longitude) as "Physical Longitude",
    decode(rox.habitual, 'Y', 'Habitual', '') as "Habitual",
    decode(rox.aggravated, 'Y', 'Aggravated', '') as "Aggravated",
    rox.status as "Registration Status",
    rox.registration_date as "Registration Date",
    rox.end_registration_date as "End Registration Date"
    from sor_location sl, sor_offender so, registration_offender_xref rox, sor_last_locn_v sllv
    where rox.offender_id=so.offender_id
    and sllv.offender_id(+)=so.offender_id
    and sl.location_id(+)=sllv.location_id
    and rox.status not in ('Merged')
    and rox.reg_type_id=1
    and upper(rox.status)='ACTIVE'
    and nvl(rox.admin_validated, to_date(1,'J'))>=nvl(rox.entry_date, to_date(1,'J'))
    and (((select sc11.description from sor_code sc11 where sc11.code_id=so.race and sc11.description=:P5_SL_RACE) is not null ) or (:P5_SL_RACE is null))
    and (((select sc12.description from sor_code sc12 where sc12.code_id=so.sex and sc12.description=:P5_SL_SEX) is not null ) or (:P5_SL_SEX is null))
    and (((select sc13.description from sor_code sc13 where sc13.code_id=so.hair_color and sc13.description=:P5_SL_HAIR_COLOR) is not null ) or (:P5_SL_HAIR_COLOR is null))
    and (((select sc14.description from sor_code sc14 where sc14.code_id=so.eye_color and sc14.description=:P5_SL_EYE_COLOR) is not null ) or (:P5_SL_EYE_COLOR is null))
    and (( so.offender_id in(select sm.offender_id from sor_code sc15, sor_mark sm, sor_offender so1 where sm.offender_id=so1.offender_id and sc15.code_id=sm.code and sc15.description=:P5_SL_OTHER_MARKS and sm.description is not null)) or (:P5_SL_OTHER_MARKS is null))

    My suggestion would be to put some instrumentation into your query and see what values you are using. Or, even simpler, take out ALL the WHERE clauses you can until data starts to show up, and then add them back in one at a time until you find the culprit.
    My bet would be on any date comparisons you are doing between page items and database columns.
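    For example, a stripped-down test harness (just a sketch - same tables and bind names as the query above, nothing new) could start from the core joins and re-introduce one condition at a time:
    -- run in SQL Workshop / SQL Developer; stub the page-item binds with test values
    -- (NULL simulates "nothing chosen" in a select list)
    select count(*)
    from   sor_location sl, sor_offender so,
           registration_offender_xref rox, sor_last_locn_v sllv
    where  rox.offender_id = so.offender_id
    and    sllv.offender_id(+) = so.offender_id
    and    sl.location_id(+) = sllv.location_id
    and    rox.status not in ('Merged')
    and    rox.reg_type_id = 1
    and    upper(rox.status) = 'ACTIVE'
    -- then add back the suspect condition on its own:
    -- and (( so.offender_id in (select sm.offender_id
    --                           from sor_code sc15, sor_mark sm, sor_offender so1
    --                           where sm.offender_id = so1.offender_id
    --                             and sc15.code_id = sm.code
    --                             and sc15.description = :P5_SL_OTHER_MARKS
    --                             and sm.description is not null))
    --      or (:P5_SL_OTHER_MARKS is null))
    ;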
    Thank you,
    Tony Miller
    Dallas, TX

  • How to hold result of query with too many characters into a cursor?

    Hi, guys:
    Could anyone help me on this issue? I encounter this error: ORA-06502: PL/SQL: numeric or value error: character string buffer too small. The reason is that the returned result is too big to hold in a cursor. I know I should use a CLOB; how can I return the result of a query into a CLOB?
    Here is the code of my function:
    function Find_Near_Offenders(P_f_Resident_Latitude in float, P_f_Resident_Longitude in float, P_n_Radius in number, P_s_User_Group in varchar2, P_b_Found out boolean) return rcur_Offender AS
        rcur_Offender_address rcur_Offender;
      begin
        if P_s_User_Group='Public' then
            open rcur_Offender_address for
              select distinct  so.offender_id as "Offender_ID",  so.first_name||' '|| so.middle_name||' '||so.last_name as "Offender_Name",
              replace(replace(nvl2(sl.address1, sl.address1||' '||sl.address2 ||' '||sl.city ||' '||sl.county||' '||(select sc3.description from sor_code sc3 where sc3.code_id=sl.state)||' '||sl.zip, 'No Known Address'),'#') ,',') as "Address",
              replace(replace(nvl2(sl.physical_address1,sl.physical_address1||' '||sl.physical_city ||' '||sl.physical_county||' '||(select sc4.description from sor_code sc4 where sc4.code_id=sl.physical_state)||' '||sl.physical_zip, 'No Known Address'),'#') ,',')  as "Physical_Address",
              nvl2(sl.ADDRESS_LATITUDE, to_char(sl.ADDRESS_LATITUDE)||','||to_char(sl.address_longitude),'') as "Address_Geocoding",
              nvl2(sl.physical_address_latitude,to_char(sl.physical_address_latitude) ||','||to_char(sl.physical_address_Longitude),'') as "Physical_Geocoding"
              from sor_location sl, sor_offender so, sor_offense sof, registration_offender_xref rox, sor_last_locn_v sllv
              where rox.offender_id=so.offender_id
              and sllv.offender_id(+)=so.offender_id
              and sl.location_id(+)=sllv.location_id
              and sof.offender_id=so.offender_id
              and rox.status not in ('Merged')
              and rox.reg_type_id=1
              and upper(rox.status)='ACTIVE'
              and nvl(rox.admin_validated, to_date(1,'J'))>=nvl(rox.entry_date, to_date(1,'J'))
              --and sl.physical_address_latitude is null
              and sl.ADDRESS_LATITUDE <=to_number(P_f_Resident_Latitude)+0.02*to_number(P_n_Radius) and sl.ADDRESS_LATITUDE>= to_number(P_f_Resident_Latitude)-0.02*to_number(P_n_Radius)
              and sl.address_longitude >=to_number(P_f_Resident_Longitude)-0.02*to_number(P_n_Radius) and  sl.address_longitude<=to_number(P_f_Resident_Longitude)+0.02*to_number(P_n_Radius)
              and sor_google_map_service.Calculate_Distance(P_f_Resident_Latitude, P_f_Resident_Longitude, sl.ADDRESS_LATITUDE, sl.address_longitude)<=P_n_Radius
              union
              select distinct  so.offender_id as "Offender_ID",  so.first_name||' '|| so.middle_name||' '||so.last_name as "Offender_Name",
              replace(replace(nvl2(sl.address1, sl.address1||' '||sl.address2 ||' '||sl.city ||' '||sl.county||' '||(select sc3.description from sor_code sc3 where sc3.code_id=sl.state)||' '||sl.zip, 'No Known Address'),'#') ,',') as "Address",
              replace(replace(nvl2(sl.physical_address1,sl.physical_address1||' '||sl.physical_city ||' '||sl.physical_county||' '||(select sc4.description from sor_code sc4 where sc4.code_id=sl.physical_state)||' '||sl.physical_zip, 'No Known Address'),'#') ,',')  as "Physical_Address",
              nvl2(sl.ADDRESS_LATITUDE, to_char(sl.ADDRESS_LATITUDE)||','||to_char(sl.address_longitude),'') as "Address_Geocoding",
              nvl2(sl.physical_address_latitude,to_char(sl.physical_address_latitude) ||','||to_char(sl.physical_address_Longitude),'') as "Physical_Geocoding"
              from sor_location sl, sor_offender so, sor_offense sof, registration_offender_xref rox, sor_last_locn_v sllv
              where rox.offender_id=so.offender_id
              and sllv.offender_id(+)=so.offender_id
              and sl.location_id(+)=sllv.location_id
              and sof.offender_id=so.offender_id
              and rox.status not in ('Merged')
              and rox.reg_type_id=1
              and upper(rox.status)='ACTIVE'
              and nvl(rox.admin_validated, to_date(1,'J'))>=nvl(rox.entry_date, to_date(1,'J'))
              and sl.physical_address_latitude <=to_number(P_f_Resident_Latitude)+0.02*to_number(P_n_Radius) and sl.physical_address_latitude>= to_number(P_f_Resident_Latitude)-0.02*to_number(P_n_Radius)
              and sl.physical_address_Longitude >=to_number(P_f_Resident_Longitude)-0.02*to_number(P_n_Radius) and  sl.physical_address_Longitude<=to_number(P_f_Resident_Longitude)+0.02*to_number(P_n_Radius)
              and sor_google_map_service.Calculate_Distance(P_f_Resident_Latitude, P_f_Resident_Longitude, sl.physical_address_latitude, sl.physical_address_Longitude)<=P_n_Radius;
            return rcur_Offender_address;
          end if;
      end;
    and my anonymous block is:
    declare
      query_result json_list;
      --list_string varchar2(32000);
      list_string clob;
      l_cursor sor_google_map_service.rcur_Offender;
      b_found boolean;
    begin
        l_cursor:=sor_google_map_service.find_near_offenders(35.5113030,-97.5543081, 3, 'Public', b_found);
        query_result:=json_util_pkg.ref_cursor_to_json(l_cursor);
        list_string:='{"Offenders": '||json_printer.pretty_print_list(query_result)||'}';
        dbms_output.put_line(list_string);
    end;

    lxiscas wrote:
    I checked the PL_JSON, and I found it uses sys_refcursor, which is limited to 32K around.
    That doesn't make sense. A SYS_REFCURSOR has no 32k limit -- the limits are related to the data types used by the SQL statement that the SYS_REFCURSOR points to. If that SQL statement has a CLOB column, there is effectively no limit. If that SQL statement has a VARCHAR2 column, you've got the 4000 byte limit.
    lxiscas wrote:
    and I was using a varchar2 variable in the anonymous block. However, my cursor result is too big to hold in a 32K space.
    Again, this doesn't seem to make sense. There is no such thing as a result that is too big for a cursor.
    lxiscas wrote:
    Do you think I need to change the PL_JSON API myself to ensure it uses CLOB? Or is there any way I can avoid updating the PL_JSON package internally?
    If the PL_JSON package is trying to write more than 32k of data into a VARCHAR2, your options are
    - Modify the package to use a CLOB
    - Find some other package that implements whatever functionality you need
    - Limit the data to 32k
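    As a minimal sketch of the first option (not PL/JSON's actual code - just an illustration of the general CLOB approach), the idea is to accumulate the output with DBMS_LOB instead of VARCHAR2 concatenation:
    declare
      l_buf clob;
    begin
      dbms_lob.createtemporary(l_buf, true);
      for i in 1 .. 10000 loop
        -- append piece by piece; a CLOB is not limited to 32k
        dbms_lob.writeappend(l_buf, 11, 'some text, ');
      end loop;
      dbms_output.put_line('total length: ' || dbms_lob.getlength(l_buf));
      dbms_lob.freetemporary(l_buf);
    end;
    /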
    Justin

  • Need a help in a query

    Table structure is given below …
    Test_role
    SC_CODE     VAR_NAME     FLD_NAME
    SC1     Var1     NBR1
    SC1     Var2     NBR2
    SC2     Var3     NBR1
    SC3     Var1     NBR1
    SC3     Var4     NBR2
    The Test_role table contains codes and their associated variable names like Var1, Var2, Var3, etc. The third column, FLD_NAME, contains the column name of another table (Test_data) which holds the actual value.
    You can see here that the field NBR1 may not always correspond to the same variable; it can vary for different SC_CODEs.
    Test_data
    CUST_ID     SC_CODE     NBR1     NBR2
    1     SC1     10     20
    2     SC2     50     
    3     SC3     70     80
    4     SC1     101     110
    The above table contains CUST_ID, which is mapped to the SC_CODE of the Test_role table.
    Here CUST_ID 1 and 4 are mapped to the same code SC1, but the actual values for cust_id=1 are (10,20) and for cust_id=4 are (101,110).
    I need a resulting table like Test_Result, which takes all the variable names as columns and populates the data accordingly. The resulting table data is given below.
    Test_Result
    CUST_ID     SC_CODE     VAR1     VAR2     VAR3     VAR4
    1     SC1     10     20           
    2     SC2                 50     
    3     SC3     70                 80
    4     SC1     101     110
    Table script is given below:
    TEST_DATA
    create table TEST_DATA (
    CUST_ID NUMBER(10),
    SC_CODE VARCHAR2(50),
    NBR1 NUMBER(5),
    NBR2 NUMBER(5)
    );
    insert into TEST_DATA (CUST_ID, SC_CODE, NBR1, NBR2)
    values (1, 'SC1', 10, 20);
    insert into TEST_DATA (CUST_ID, SC_CODE, NBR1, NBR2)
    values (2, 'SC2', 50, null);
    insert into TEST_DATA (CUST_ID, SC_CODE, NBR1, NBR2)
    values (3, 'SC3', 70, 80);
    insert into TEST_DATA (CUST_ID, SC_CODE, NBR1, NBR2)
    values (4, 'SC1', 10, null);
    TEST_RESULT
    create table TEST_RESULT (
    CUST_ID VARCHAR2(50),
    SC_CODE VARCHAR2(50),
    VAR1 NUMBER(5),
    VAR2 NUMBER(5),
    VAR3 NUMBER(5),
    VAR4 NUMBER(5)
    );
    insert into TEST_RESULT (CUST_ID, SC_CODE, VAR1, VAR2, VAR3, VAR4)
    values ('1', 'SC1', 10, 20, null, null);
    insert into TEST_RESULT (CUST_ID, SC_CODE, VAR1, VAR2, VAR3, VAR4)
    values ('2', 'SC2', null, null, 50, null);
    insert into TEST_RESULT (CUST_ID, SC_CODE, VAR1, VAR2, VAR3, VAR4)
    values ('3', 'SC3', 70, null, null, 80);
    insert into TEST_RESULT (CUST_ID, SC_CODE, VAR1, VAR2, VAR3, VAR4)
    values ('4', 'SC1', 101, 110, null, 80);
    test_role
    create table TEST_ROLE (
    SC_CODE VARCHAR2(50),
    VAR_NAME VARCHAR2(50),
    FLD_NAME VARCHAR2(50)
    );
    insert into TEST_ROLE (SC_CODE, VAR_NAME, FLD_NAME)
    values ('SC1', 'Var1', 'NBR1');
    insert into TEST_ROLE (SC_CODE, VAR_NAME, FLD_NAME)
    values ('SC1', 'Var2', 'NBR2');
    insert into TEST_ROLE (SC_CODE, VAR_NAME, FLD_NAME)
    values ('SC2', 'Var3', 'NBR1');
    insert into TEST_ROLE (SC_CODE, VAR_NAME, FLD_NAME)
    values ('SC3', 'Var1', 'NBR1');
    insert into TEST_ROLE (SC_CODE, VAR_NAME, FLD_NAME)
    values ('SC3', 'Var4', 'NBR2');
    Please help, if it can be done.

    Hi,
    The script below is one example of how to do a pivot:
    --     How to Pivot a Result Set (Display Rows as Columns)
    --     For Oracle 10, and earlier
    --     Actually, this works in any version of Oracle, but the
    --     "SELECT ... PIVOT" feature introduced in Oracle 11
    --     is better.  (See Query 2, below.)
    --     This example uses the scott.emp table.
    --     Given a query that produces three rows for every department,
    --     how can we show the same data in a query that has one row
    --     per department, and three separate columns?
    --     For example, the query below counts the number of employees
    --     in each department that have one of three given jobs:
    PROMPT     ==========  0. Simple COUNT ... GROUP BY  ==========
    SELECT     deptno
    ,     job
    ,     COUNT (*)     AS cnt
    FROM     scott.emp
    WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
    GROUP BY     deptno
    ,          job;
    Output:
        DEPTNO JOB              CNT
            20 CLERK              2
            20 MANAGER            1
            30 CLERK              1
            30 MANAGER            1
            10 CLERK              1
            10 MANAGER            1
            20 ANALYST            2
    PROMPT     ==========  1. Pivot  ==========
    SELECT     deptno
    ,     COUNT (CASE WHEN job = 'ANALYST' THEN 1 END)     AS analyst_cnt
    ,     COUNT (CASE WHEN job = 'CLERK'   THEN 1 END)     AS clerk_cnt
    ,     COUNT (CASE WHEN job = 'MANAGER' THEN 1 END)     AS manager_cnt
    FROM     scott.emp
    WHERE     job     IN ('ANALYST', 'CLERK', 'MANAGER')
    GROUP BY     deptno;
    --     Output:
        DEPTNO ANALYST_CNT  CLERK_CNT MANAGER_CNT
            30           0          1           1
            20           2          2           1
            10           0          1           1
    --     Explanation
    (1) Decide what you want the output to look like.
         (E.g. "I want a row for each department,
         and columns for deptno, analyst_cnt, clerk_cnt and manager_cnt)
    (2) Get a result set where every row identifies which row
         and which column of the output will be affected.
         In the example above, deptno identifies the row, and
         job identifies the column.
         Both deptno and job happened to be in the original table.
         That is not always the case; sometimes you have to
         compute new columns based on the original data.
    (3) Use aggregate functions and CASE (or DECODE) to produce
         the pivoted columns. 
         The CASE statement will pick
         only the rows of raw data that belong in the column.
         If each cell in the output corresponds to (at most)
         one row of input, then you can use MIN or MAX as the
         aggregate function.
         If many rows of input can be reflected in a single cell
         of output, then use SUM, COUNT, AVG, STRAGG, or some other
         aggregate function.
         GROUP BY the column that identifies rows.
    PROMPT     ==========  2. Oracle 11 PIVOT  ==========
    WITH     e     AS
    (     -- Begin sub-query e to SELECT columns for PIVOT
         SELECT     deptno
         ,     job
         FROM     scott.emp
    )     -- End sub-query e to SELECT columns for PIVOT
    SELECT     *
    FROM     e
    PIVOT     (     COUNT (*)
               FOR     job     IN     ( 'ANALYST'     AS analyst
                              , 'CLERK'     AS clerk
                              , 'MANAGER'     AS manager
                              )
          );
    NOTES ON ORACLE 11 PIVOT:
    (1) You must use a sub-query to select the raw columns.
    An in-line view (not shown) is an example of a sub-query.
    (2) GROUP BY is implied for all columns not in the PIVOT clause.
    (3) Column aliases are optional. 
    If "AS analyst" is omitted above, the column will be called 'ANALYST' (single-quotes included).

  • Error while running a query-Input for variable 'Posting Period is invalid

    Hi All,
    NOTE: This error is only cropping up when I input 12 in the posting period variable selection. If I put in any other value from 1-11 I am not getting any errors. Any ideas why this might be happening?
    I am getting the following error when I try and run a query - "Input for variable 'Posting Period (Single entry, mandatory)' is invalid" - On further clicking on this error the message displayed is as follows -
    Diagnosis
    Variable Posting Period (Single Value Entry, Mandatory) is used as a lower limit (X) and an upper limit () in an interval selection. This limit has the value #.
    System Response
    Procedure
    Enter a different value for variable Posting Period (Single Value Entry, Mandatory). If the value of the other limit is determined by another variable, you can change its value also.
    Procedure for System Administration

    OK.
    Well, if the variable is not used in any interval selection, then I would say "something happened to it".
    I would make a copy of the query and run it to check if I get the same problem with period 12.
       -> If not, something is wrong in the original query (you can proceed as below, if changes to original are permitted).
    If so, then try removing the variable completely from the query and hardcode the restriction to 12.
       -> If the problem still persists, I would have to do some thinking.
    If the problem is gone, then add the variable again and check.
       -> If the problem is back, then the variable "is sick". The only quick fix is to build an identical variable and use that one.
    If the problem also happens with the new variable, then it's time to share this experience with someone else and consider raising an OSS message.
    Good luck!
    Jacob
    P.S: what fisc year variant are you using?
    Edited by: Jacob Jansen on Jan 25, 2010 8:36 PM

  • Logical database in adhoc query

    Hello All,
    Can anyone tell me what the logical database is in an ad hoc query?

    Hi
    When you create a query, you have to select an InfoSet. An InfoSet can be considered a source from which data is populated in the query fields.
    InfoSets are created from transaction SQ02.
    There are four methods through which an InfoSet can become a source of data:
    1.  Table join (by joining two or more tables from the Data Dictionary)
         example: joining tables PA0001 and PA0006 on PERNR to get one resultant dataset
    2. Direct read of a basis table (like PA0001 as a source for data in the InfoSet)
    3. Logical database (a pre-written program by SAP that extracts data from clusters and tables, taking care of authorizations and validity periods)
    Example: logical databases PNP, PNPCE (Concurrent Employment), PCH (LDB for Personnel Development data)
    Custom logical DBs can be created in transaction SE36.
    4. Data retrieval by a program (custom code written by ABAP developers which will collect and process data). This program has a corresponding structure in the Data Dictionary, and the fields of this structure are used in the query.
    Reward Points, if helpful.
    Regards
    Waseem Imran

  • Query help on Goods Receipt Query with AP Invoice

    Looking for a little help on a query.  I would like to list all the goods receipts for a given date range and then display the AP Invoice information (if it has been copied to an AP Invoice).  I think my problem is in my WHERE clause; I plagiarized an SAP query that shows GR and AP from a PO as a starting point.  SBO 2005 SP01.  Any help would be greatly appreciated.  Thanks
    SELECT distinct 'GR',
    D0.DocStatus,
    D0.DocNum ,
    D0.DocDate,
    D0.DocDueDate,
    D0.DocTotal,
    'AP',
    I0.DocStatus,
    I0.DocNum ,
    I0.DocDate,
    I0.DocDueDate,
    I0.DocTotal,
    I0.PaidToDate
    FROM
    ((OPDN  D0 inner Join PDN1 D1 on D0.DocEntry = D1.DocEntry)
    full outer join
    (OPCH I0 inner join PCH1 I1 on I0.DocEntry = I1.DocEntry)
    on (I1.BaseType=20 AND D1.DocEntry = I1.BaseEntry AND D1.LineNum=I1.BaseLine))
    WHERE
    (D1.BaseType=22 AND D1.DocDate>='[%0]' AND D1.DocDate<='[%1]')
    OR (I1.BaseType=20 AND I1.BaseEntry IN
    (SELECT Distinct DocEntry
    FROM PDN1 WHERE BaseType=22 AND DocDate>='[%0]' AND DocDate<='[%1]'))

    Hi Dalen ,
    I  believe it is because of the condition
    (D1.BaseType=22 AND D1.DocDate>='%0' AND D1.DocDate<='%1')
    OR (I1.BaseType=20 AND I1.BaseEntry IN
    (SELECT Distinct DocEntry FROM PDN1 WHERE PDN1.BaseType=22 AND DocDate>='%0' AND DocDate<='%1'))
    Try changing
    D1.BaseType=22 OR D1.DocDate>='%0' AND D1.DocDate<='%1
    PDN1.BaseType=22 OR DocDate>='%0' AND DocDate<='%1'))
    Let's see what the result would be - let's have some fun with troubleshooting.
    See what the difference in the result is.
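    Spelled out in full (the same statement as posted above, with only the two date conditions changed as suggested - purely a troubleshooting step, not a final version):
    SELECT distinct 'GR',
    D0.DocStatus, D0.DocNum, D0.DocDate, D0.DocDueDate, D0.DocTotal,
    'AP',
    I0.DocStatus, I0.DocNum, I0.DocDate, I0.DocDueDate, I0.DocTotal, I0.PaidToDate
    FROM
    ((OPDN  D0 inner Join PDN1 D1 on D0.DocEntry = D1.DocEntry)
    full outer join
    (OPCH I0 inner join PCH1 I1 on I0.DocEntry = I1.DocEntry)
    on (I1.BaseType=20 AND D1.DocEntry = I1.BaseEntry AND D1.LineNum=I1.BaseLine))
    WHERE
    (D1.BaseType=22 OR D1.DocDate>='[%0]' AND D1.DocDate<='[%1]')
    OR (I1.BaseType=20 AND I1.BaseEntry IN
    (SELECT Distinct DocEntry
    FROM PDN1 WHERE BaseType=22 OR DocDate>='[%0]' AND DocDate<='[%1]'))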
    Thank you
    Bishal

  • Can you check for data in one table or another but not both in one query?

    I have a situation where I need to link two tables together, but the data may be in another (archive) table, or different records are in both and I want the latest record from either table:
    ACCOUNT
    AccountID     Name   
    123               John Doe
    124               Jane Donaldson           
    125               Harold Douglas    
    MARKETER_ACCOUNT
    Key     AccountID     Marketer    StartDate     EndDate
    1001     123               10526          8/3/2008     9/27/2009
    1017     123               10987          9/28/2009     12/31/4712    (high date ~ which means currently with this marketer)
    1023     124               10541          12/03/2010     12/31/4712
    ARCHIVE
    Key     AccountID     Marketer    StartDate     EndDate
    1015     124               10526          8/3/2008     12/02/2010
    1033     125               10987         01/01/2011     01/31/2012  
    So my query needs to return the following:
    123     John Doe                        10526     8/3/2008     9/27/2009
    124     Jane Donaldson             10541     12/03/2010     12/31/4712     (this is the later of the two records for this account between archive and marketer_account tables)
    125     Harold Douglas               10987          01/01/2011     01/31/2012     (he is only in archive, so get this record)
    I'm unsure how to proceed in one query.  Note that I am reading in possibly multiple accounts at a time and returning a collection back to .net
    open CURSOR_ACCT
              select AccountID
              from
                     ACCOUNT A,
                     MARKETER_ACCOUNT M,
                     ARCHIVE R
               where A.AccountID = nvl((select max(M.EndDate) from Marketer_account M2
                                                    where M2.AccountID = A.AccountID),
                                                      (select max(R.EndDate) from Archive R2
                                                    where R2.AccountID = A.AccountID)
                   and upper(A.Name) like parameter || '%'
    <can you do a NVL like this?   probably not...   I want to be able to get the MAX record for that account off the MarketerACcount table OR the max record for that account off the Archive table, but not both>
    (parameter could be "DO", so I return all names starting with DO...)

    If I understand your description, I would assume that for John Doe we would expect the second row from marketer_account ("high date ~ which means currently with this marketer"). Here is a solution with analytic functions:
    drop table account;
    drop table marketer_account;
    drop table marketer_account_archive;
    create table account (
        id number
      , name varchar2(20)
    );
    insert into account values (123, 'John Doe');
    insert into account values (124, 'Jane Donaldson');
    insert into account values (125, 'Harold Douglas');
    create table marketer_account (
        key number
      , AccountId number
      , MktKey number
      , FromDt date
      , ToDate date
    );
    insert into marketer_account values (1001, 123, 10526, to_date('03.08.2008', 'dd.mm.yyyy'), to_date('27.09.2009', 'dd.mm.yyyy'));
    insert into marketer_account values (1017, 123, 10987, to_date('28.09.2009', 'dd.mm.yyyy'), to_date('31.12.4712', 'dd.mm.yyyy'));
    insert into marketer_account values (1023, 124, 10541, to_date('03.12.2010', 'dd.mm.yyyy'), to_date('31.12.4712', 'dd.mm.yyyy'));
    create table marketer_account_archive (
        key number
      , AccountId number
      , MktKey number
      , FromDt date
      , ToDate date
    );
    insert into marketer_account_archive values (1015, 124, 10526, to_date('03.08.2008', 'dd.mm.yyyy'), to_date('02.12.2010', 'dd.mm.yyyy'));
    insert into marketer_account_archive values (1033, 125, 10987, to_date('01.01.2011', 'dd.mm.yyyy'), to_date('31.01.2012', 'dd.mm.yyyy'));
    select key, AccountId, MktKey, FromDt, ToDate
         , max(FromDt) over(partition by AccountId) max_FromDt
      from marketer_account
    union all
    select key, AccountId, MktKey, FromDt, ToDate
         , max(FromDt) over(partition by AccountId) max_FromDt
      from marketer_account_archive;
    with
    basedata as (
    select key, AccountId, MktKey, FromDt, ToDate
      from marketer_account
    union all
    select key, AccountId, MktKey, FromDt, ToDate
      from marketer_account_archive
    ),
    basedata_with_max_intervals as (
    select key, AccountId, MktKey, FromDt, ToDate
         , row_number() over(partition by AccountId order by FromDt desc) FromDt_Rank
      from basedata
    ),
    filtered_basedata as (
    select key, AccountId, MktKey, FromDt, ToDate from basedata_with_max_intervals where FromDt_Rank = 1
    )
    select a.id
         , a.name
         , b.MktKey
         , b.FromDt
         , b.ToDate
      from account a
      join filtered_basedata b
        on (a.id = b.AccountId);
    ID NAME                     MKTKEY FROMDT     TODATE
    123 John Doe                  10987 28.09.2009 31.12.4712
    124 Jane Donaldson            10541 03.12.2010 31.12.4712
    125 Harold Douglas            10987 01.01.2011 31.01.2012
    If your tables are big it could be necessary to do the filtering (according to your condition) in an early step (the first CTE).
    Regards
    Martin

  • Query help : Query to get values SYSDATE-1 18:00 hrs to SYSDATE 08:00 hrs

    Hi Team
    I want the SQL query to get the data for the following comparison:
    Order Created is a date column, and I want to find all the values from (SYSDATE-1) 18:00 hours to SYSDATE 08:00 hours,
    i.e.
    (SYSDATE-1) 18:00:00 < Order.Created < SYSDATE 08:00:00.
    Regards

    Hi, Rohit,
    942281 wrote:
    If i want the data in the below way i.e.
    from (SYSDATE-1) 18:00 hours to SYSDATE 17:59 hours ---> (SYSDATE-1) 18:00:00 < Order.Created < SYSDATE 07:59:00.
    If you want to include rows from exactly 18:00:00 yesterday (but no earlier), and exclude rows from exactly 08:00:00 today (or later), then use:
    WHERE   ord_dtl.submit_dt  >= TRUNC (SYSDATE) - (6 / 24)
    AND     ord_dtl.submit_dt  <  TRUNC (SYSDATE) + (8 / 24)
    So can i use the below format : -
    ord_dtl.submit_dt BETWEEN trunc(sysdate)-(6/24) and trunc(sysdate)+(7.59/24). Please suggest.
    .59 hours is .59 * 60 * 60 = 2124 seconds (or .59 * 60 = 35.4 minutes), so the last time included in the range above is 07:35:24, not 07:59:59.
    If you really, really want to use BETWEEN (which includes both end points), then you could do it with date arithmetic:
    WHERE   ord_dtl.submit_dt  BETWEEN  TRUNC (SYSDATE) - (6 / 24)
                      AND         TRUNC (SYSDATE) + (8 / 24)
                                               - (1 / (24 * 60 * 60))
    but it would be simpler and less error prone to use INTERVALs, as Karthick suggested earlier:
    WHERE   ord_dtl.submit_dt  BETWEEN  TRUNC (SYSDATE) - INTERVAL '6' HOUR
                      AND         TRUNC (SYSDATE) + INTERVAL '8' HOUR
                                               - INTERVAL '1' SECOND
    Edited by: Frank Kulash on Apr 17, 2013 9:36 AM
    Edited by: Frank Kulash on Apr 17, 2013 11:56 AM
    Changed "- (8 /24)" to "+ (8 /24)" in first code fragment (after Blushadown, below)

  • Query help, subtract two query parts

    Hi,
    I am a beginner in PL/SQL and have a problem I couldn't solve:
    Table (op_list):
    Item     -     Amount -     Status
    Item1     -     10     -     in
    Item2     -     12     -     in
    Item3     -     7     -     in
    Item1     -     2     -     out
    Item2     -     3     -     out
    Item1     -     1     -     dmg
    Item3     -     3     -     out
    Item1     -     2     -     out
    Item2     -     5     -     out
    Item2     -     2     -     in
    Item3     -     1     -     exp
    I would like to get this result from the query (subtracting the amounts of 'out/dmg/exp' from 'in'):
    Item - Amount left
    Item1     -     5
    Item2     -     6
    Item3 -     3
    I wrote code that returns the sum of all incoming items and the sum of all out/dmg/exp items, but I couldn't work out how to subtract one part of the query from the other. Or maybe there is a better way. I'm also worried about what happens if an item has only 'in' rows and no 'out/dmg/exp'.
    select item.name, sum(op_list.item_amount)
    from op_list
    inner join item
    on op_list.item = item.item_id
    where op_list.status = 'in'
    group by item.name
    union
    select item.name, sum(op_list.item_amount)
    from op_list
    inner join item
    on op_list.item = item.item_id
    where op_list.status = 'out'
    or op_list.status = 'dmg'
    or op_list.status = 'exp'
    group by item.name
    Return:
    Item1     -     10      [10 in]
    Item1     -     5     [2+1+2]
    Item2     -     14     [12+2]
    Item3     -     7
    Item3     -     4     [3+1]
    Thanks in advance

    Hi,
    We can also use simple inline views to get what we need.
    select a.item,a.amount-b.amount Balance from
    (select item,sum(amount) Amount from op_list
    where status = 'in'
    group by item) a,
    (select item,sum(amount) Amount from op_list
    where status in ('out','dmg','exp')
    group by item) b
    where
    a.item=b.item
    order by item;
    ITEM       BALANCE
    Item1                      5
    Item2                      6
    Item3                      3
    Regards,
    Prazy
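    A single-pass alternative is also possible (just a sketch, using the same op_list and item tables as above); a conditional SUM also keeps items that have only 'in' rows, which the inner join above would drop:
    select item.name
         , sum(case when op_list.status = 'in'
                    then  op_list.item_amount
                    else -op_list.item_amount
               end) as amount_left
      from op_list
      join item
        on op_list.item = item.item_id
     where op_list.status in ('in', 'out', 'dmg', 'exp')
     group by item.name
     order by item.name;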

  • Query help: query to return column that represents multiple rows

    I have a table with a name and location column. The same name can occur multiple times with any arbitrary location, i.e. duplicates are allowed.
    I need a query to find all names that occur in both of two separate locations.
    For example,
    bob usa
    bob mexico
    dot mexico
    dot europe
    hal usa
    hal europe
    sal usa
    sal mexico
    The query in question, if given the locations usa and mexico, would return bob and sal.
    Thanks for any help or advice,
    -=beeky

    How about this?
    SELECT  NAME
    FROM    <LOCATIONS_TABLE>
    WHERE   LOCATION IN ('usa','mexico')
    GROUP BY NAME
    HAVING COUNT(DISTINCT LOCATION) >= 2
    Results:
    SQL> WITH person_locations AS
      2  (
      3          SELECT 'bob' AS NAME, 'USA' AS LOCATION FROM DUAL UNION ALL
      4          SELECT 'bob' AS NAME, 'Mexico' AS LOCATION FROM DUAL UNION ALL
      5          SELECT 'dot' AS NAME, 'Mexico' AS LOCATION FROM DUAL UNION ALL
      6          SELECT 'dot' AS NAME, 'Europe' AS LOCATION FROM DUAL UNION ALL
      7          SELECT 'hal' AS NAME, 'USA' AS LOCATION FROM DUAL UNION ALL
      8          SELECT 'hal' AS NAME, 'Europe' AS LOCATION FROM DUAL UNION ALL
      9          SELECT 'sal' AS NAME, 'USA' AS LOCATION FROM DUAL UNION ALL
    10          SELECT 'sal' AS NAME, 'Mexico' AS LOCATION FROM DUAL
    11  )
    12  SELECT  NAME
    13  FROM    person_locations
    14  WHERE   LOCATION IN ('USA','Mexico')
    15  GROUP BY NAME
    16  HAVING COUNT(DISTINCT LOCATION) >= 2
    17  /
    NAM
    bob
    sal
    HTH!
    Edited by: Centinul on Oct 15, 2009 2:25 PM
    Added sample results.

  • QUERY HELP!!! trying to create a query

    I'm creating a summary report.
    I have a table with sale dates;
    for example, I have a table tab_1 with a column saleDat containing:
    saleDat
    1923
    1936
    1945
    2003
    2005
    saleDat contains years, and there are some missing years in which no sale
    was made.
    My report has to display years starting from the earliest year,
    so I have to create a query that starts with 1923,
    but the problem is that I also have to include years that are not in the table;
    for example, I have to display the year 1924, which is not in the table,
    so that part of the report has to look like:
    1923 blah blah summary.........
    1924 "
    1925
    1926
    2005
    2006
    up to the current year (2006 may not be in the table, but I have to display it).
    I just need to know the query that can list all the years starting from
    the earliest saleDat to the current year.
    Thanks in advance

    Please write the query in the following form:
    SELECT a.year -- , plus the other columns from your table
    FROM (SELECT (:start_num + rownum) year
    FROM all_tab_columns
    WHERE :start_num + rownum <= :end_num) a,
    tab_1 b
    WHERE a.year = b.saleDat(+);
    Note:
    1) If your start year and end year are 1923 and 2006, then input as below:
    :start_num = 1922
    :end_num = 2006
    2) Since some of the years (1924 etc.) may not be present in your table, you may need to use NVL to print proper indicators.
    3) If you have more than one record in tab_1 for a particular year then group them based year and then use it.
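    An alternative row generator (a sketch, assuming saleDat stores the year as a number) uses CONNECT BY LEVEL instead of counting rows in all_tab_columns:
    SELECT    y.year
    ,         COUNT(b.saleDat)  AS sales_in_year   -- 0 for years with no sale
    FROM     (SELECT 1922 + LEVEL AS year
              FROM   dual
              CONNECT BY 1922 + LEVEL <= EXTRACT(YEAR FROM SYSDATE)) y
    LEFT JOIN tab_1 b
           ON b.saleDat = y.year
    GROUP BY  y.year
    ORDER BY  y.year;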
    Hope this helps.
    - Saumen.

  • IF statement in Query

    Hi
    I have a query / recordset that  would be looking at 12000 rows in a database and 10 different variables and potential filters chosen by end users.
    Should I put 10 wildcard / URL WHERE statements in my recordset query, or should I put IF statements in my query?
    ie.
    If URL colname then ",and BetType= "xzz"" .
    Is that the most efficient way to run my queries?
    I will be running about 10 recordsets on my page all looking at these URL variables, so it will be a busy page.
    If that is the answer, can someone please tell me how to insert the IF statement - I've tried all sorts but it won't work.
    $colname_Recordset4 = "%";
    if (isset($_GET['colname'])) {
      $colname_Recordset4 = $_GET['colname'];
    }
    mysql_select_db($database_racing_analysis, $racing_analysis);
    $query_Recordset4 = sprintf("SELECT BetType, sum(if(season='2006-2007', Bet, 0)) AS '2006-2007',  sum(if(season='2007-2008', Bet, 0)) AS '2007-2008',  sum(if(season='2008-2009', Bet, 0)) AS '2008-2009' FROM dataextract WHERE BetType Like %s and TrackID = 1 and Distance = 1000 and Class = 1 GROUP BY BetType", GetSQLValueString($colname_Recordset4, "text"));
    $Recordset4 = mysql_query($query_Recordset4, $racing_analysis) or die(mysql_error());
    $row_Recordset4 = mysql_fetch_assoc($Recordset4);
    $totalRows_Recordset4 = mysql_num_rows($Recordset4);
    hope someone can help.
    Simon

    That part of the query cross tabs my data into three columns - not intended to confuse the issue. I'd have the same problem with a basic query.
    So Here is a very basic version:
    I want an IF statement to go around: "WHERE Distance = 1000" and "AND Class = 1"
    That, I believe, will reduce the effort on the MySQL server, as it wouldn't be running a bunch of wildcard queries and would only run if there is a URL parameter delivered.
    I just don't get how to put the IF statements in PHP.
    thanks
    $maxRows_Recordset3 = 5;
    $pageNum_Recordset3 = 0;
    if (isset($_GET['pageNum_Recordset3'])) {
      $pageNum_Recordset3 = $_GET['pageNum_Recordset3'];
    }
    $startRow_Recordset3 = $pageNum_Recordset3 * $maxRows_Recordset3;
    mysql_select_db($database_racing_analysis, $racing_analysis);
    $query_Recordset3 = "SELECT * FROM dataextract WHERE Distance = 1000 AND Class = 1";
    $query_limit_Recordset3 = sprintf("%s LIMIT %d, %d", $query_Recordset3, $startRow_Recordset3, $maxRows_Recordset3);
    $Recordset3 = mysql_query($query_limit_Recordset3, $racing_analysis) or die(mysql_error());
    $row_Recordset3 = mysql_fetch_assoc($Recordset3);
    if (isset($_GET['totalRows_Recordset3'])) {
      $totalRows_Recordset3 = $_GET['totalRows_Recordset3'];
    } else {
      $all_Recordset3 = mysql_query($query_Recordset3);
      $totalRows_Recordset3 = mysql_num_rows($all_Recordset3);
    }
    $totalPages_Recordset3 = ceil($totalRows_Recordset3/$maxRows_Recordset3)-1;

  • IF and ABS condition statement in BEX query designer

    Hi,
    I would like to ask the best way for me to produce an acceptable result from an Excel IF and ABS condition statement.
    The condition statement that I have on my Excel file is
    =IF((A2-B2)>0,ABS(A2-B2),0)
    I have tried multiple times to reproduce this in BEx Query Designer; unfortunately I'm getting a bad result or an unacceptable formula.
    Anyone who could help me with my issue?
    Thanks,
    Arnold

    Hi Arnold,
    Thank you,
    Nanda
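    (For what it's worth: since ABS(A2-B2) is only taken when A2-B2 is positive, the Excel formula is equivalent to MAX(A2-B2, 0), which may be easier to model as a single formula than nesting IF and ABS.)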
