Table(cast())

Hi All,
I am using Oracle 9i and I have an Oracle Type which has been created as follows
create or replace TYPE "Dummy" AS VARRAY (500) OF VARCHAR2(50)
It is assigned a set of values as follows:
oracle.sql.ArrayDescriptor desc = oracle.sql.ArrayDescriptor.createDescriptor("Dummy", stmt.getConnection());
oracle.sql.ARRAY newArray;
newArray = new oracle.sql.ARRAY(desc, stmt.getConnection() , filter); //where filter contains the values here
I am trying to execute the following statement but I get an error.
SELECT process_instance_id, workstep_name, status FROM WORKSTEP
WHERE PROCESS_INSTANCE_ID IN table(cast(newarray)))
AND WORKSTEP_NAME = 'X';
When I execute this I get an ORA-00936: missing expression.
Can you pls point out what's wrong here and how I can cast a varray as a table and use it in an "IN" clause?
Thanks in advance.
AD.

user604168 wrote:
Can you pls point out what's wrong here and how I can cast a varray as a table and use it in an "IN" clause?
I'm not sure why you're using a VARRAY; I would personally use a nested table, as I find them more flexible:
create or replace TYPE better_Dummy AS table OF VARCHAR2(50);
/
Also, I would be quite sad to come into an environment and see that someone had created a database object in double quotes to preserve case sensitivity, as you have done (it's really a pain to work with).
That being said, here's the syntax you're looking for.
create or replace TYPE Dummy AS VARRAY (500) OF VARCHAR2(50);
ME_XE?declare
  2    in_dummy  dummy default dummy('a','b');
  3    x         number;
  4  begin
  5
  6    select count(*)
  7    into x
  8    from dual
  9    where 'a' in (select column_value from table(cast(in_dummy as dummy)));
10
11    dbms_output.put_line(x);
12  end;
13  /
1
PL/SQL procedure successfully completed.
Elapsed: 00:00:00.14
ME_XE?
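
A hedged note for the original JDBC case: the statement fails because CAST is missing its AS <type> clause and because the Java variable newArray cannot be referenced by name inside the SQL text - the ARRAY has to be bound to a placeholder (e.g. with OraclePreparedStatement.setARRAY). A minimal sketch of the query text that could be prepared, assuming the type really was created case-sensitively as "Dummy":
SELECT process_instance_id, workstep_name, status
  FROM workstep
 WHERE process_instance_id IN
       (SELECT column_value FROM TABLE(CAST(? AS "Dummy")))
   AND workstep_name = 'X';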

Similar Messages

  • Derive found flag in SQL with where clause using the TABLE(CAST) function

    Dear All,
    Stored procedure listEmployees
    ==========================
    CREATE OR REPLACE TYPE STRING_ARRAY AS VARRAY(8000) OF VARCHAR2(15);
    empIdList STRING_ARRAY
    countriesList STRING_ARRAY
    SELECT EMP_ID, EMP_COUNTRY, EMP_NAME, FOUND_FLAG_
    FROM EMPLOYEE WHERE
    EMP_ID IN
    (SELECT * FROM TABLE(CAST(empIdList AS STRING_ARRAY)))
    AND EMP_COUNTRY IN
    (SELECT * FROM TABLE(CAST(countriesList AS STRING_ARRAY)))
    =================
    I have a stored procedure which lists the employees using above simple query.
    Here I am using table CAST function to find the list of employees in one go
    instead of looping through each and every employee
    Everything fine until requirements forced me to get the FOUND_FLAG as well.
    Now I wanted derive the FOUND_FLAG by using rownum, rowid, decode functions
    but I was not successful
    Can you please suggest if there is any intelligent way to say whether the
    row is found for the given parameters in the where clause?
    If not I may have to loop through each set of empIdList, countriesList
    and find the values individually just to set a flag. In this approach I can’t use
    the TABLE CAST function which is efficient I suppose.
    Note that STRING_ARRAY is a VARRAY. It is very big in size and this procedure
    is supposed to handle large sets of data.
    Thanks In advance
    Regards
    Charan
    Edited by: kmcharan on 03-Dec-2009 09:55

    If your query returns results, you have found them... so your "FOUND" flag might be a constant,...
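    A hedged sketch of what that could look like if a per-row flag is still needed: drive the query from the two collections and outer-join to EMPLOYEE, so rows with no match come back with FOUND_FLAG = 'N'. This assumes, as in the original query, that the two arrays are independent filters rather than positional pairs; for very large arrays the cross join gets expensive, so a positional pairing (if that is the real relationship) would be preferable.
    SELECT ids.column_value AS emp_id,
           cts.column_value AS emp_country,
           e.emp_name,
           CASE WHEN e.emp_id IS NULL THEN 'N' ELSE 'Y' END AS found_flag
      FROM TABLE(CAST(empIdList AS STRING_ARRAY)) ids
     CROSS JOIN TABLE(CAST(countriesList AS STRING_ARRAY)) cts
      LEFT JOIN employee e
        ON  e.emp_id = ids.column_value
        AND e.emp_country = cts.column_value;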

  • Error in table(Cast)

    Hi
    I tried the following code
    could you please resolve the error in this code
    create or replace type numlist as table of number;
    declare
      var numlist;
    begin
    select 1 into var
      from table(cast(var));
    end;
    /
    thanks

    ME_XE?create or replace type numlist as table of number;
      2  /
    Type created.
    Elapsed: 00:00:00.62
    ME_XE?
    ME_XE?declare
      2    var    numlist  := numlist(1,2,3);
      3  begin
      4
      5     for x in
      6     (
      7        select column_value
      8        from table(cast(var as numlist))
      9     )
    10     loop
    11        dbms_output.put_line('value = ' || x.column_value);
    12     end loop;
    13
    14  end;
    15  /
    value = 1
    value = 2
    value = 3
    PL/SQL procedure successfully completed.
    Elapsed: 00:00:00.46
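    A small follow-up, assuming 10g or later: once the collection type exists at SQL level (as above), the CAST is optional and TABLE() can take the variable directly, e.g. inside the block above:
    select column_value from table(var);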

  • TABLE(CAST()) function not returning the correct results in few scenarios.

    I am using TABLE(CAST()) operation in PL/SQL and it is returning me no data.
    Here is what I have done:
    1.     Created Record type
    CREATE OR REPLACE TYPE target_rec AS OBJECT (
    target_id          NUMBER(10),
    target_entity_id   NUMBER(10),
    dd                 CHAR(3),
    fd                 CHAR(3),
    code               NUMBER(10),
    target_pct         NUMBER,
    template_nm        VARCHAR2(50),
    p_symbol           VARCHAR2(10),
    pm_init            VARCHAR2(3),
    target_name        VARCHAR2(20),
    target_type        VARCHAR2(30),
    target_caption     VARCHAR2(30),
    sort_order         NUMBER(4)
    );
    2.     Created Table type
    CREATE OR REPLACE TYPE target_arr AS TABLE OF target_rec
    3.     Created Stored procedure which accepts parameter of type target_arr and runs the Table(Cast()) function on it.
         Following is the simplified form of my procedure.
         PROCEDURE get_target_weights (
              p_in_template_target IN target_arr,
              p_out_count          OUT NUMBER
         )
         IS
         BEGIN
              SELECT count(*) INTO p_out_count
              FROM TABLE(CAST(p_in_template_target AS target_arr)) arr;
         END;
    I am calling get_target_weights from my java code and passing p_in_template_target with 10140 records.
    Scenario 1: If target_pct in the last record is 0, p_out_count returned from the procedure is 0.
    Scenario 2: If target_pct in the last record is any other value(say 0.01), p_out_count returned from the procedure is 10140.
    Please help me understand why the Table(Cast()) is not returning the correct results in Scenario 1. Also adding or deleting any record from the test data returns the correct results (i.e. if keep target_pct in the last record as 0 but add or delete any record).
    Let me know how I can attach the test data I am using to help you debug, as I don't see any Attach File button on the Post Message screen on the forum.

    I am not able to reproduce this problem with a small data set. I can only reproduce with the data having 10140 records.
    I am not sure if this is the memory issue as adding a new record also solves the problem.
    This should not be caused by filling the records incorrectly in Java: for testing purposes I saved the records which I am sending from Java into a table. I updated the stored procedure as well to read the data from the table and then perform the TABLE(CAST()) operation. I am still getting 0 as the output for scenario 1 mentioned in my last mail.
    Here is what I have updated:
    1.     Created the table target_table
    CREATE TABLE target_table (
    target_id          NUMBER(10),
    target_entity_id   NUMBER(10),
    dd                 CHAR(3),
    fd                 CHAR(3),
    code               NUMBER(10),
    target_pct         NUMBER,
    template_nm        VARCHAR2(50),
    p_symbol           VARCHAR2(10),
    pm_init            VARCHAR2(3),
    target_name        VARCHAR2(20),
    target_type        VARCHAR2(30),
    target_caption     VARCHAR2(30),
    sort_order         NUMBER(4)
    );
    2.     Inserted data into the table: the script has around 10140 rows. Please let me know how I can send it to you.
    3.     Updated the procedure to read data from the table and store it into a variable of type target_arr, run the TABLE(CAST()) operation on target_arr and get the count.
    PROCEDURE test_target_weights
    IS
         v_target_rec target_table%ROWTYPE;
         CURSOR wt_cursor IS
              SELECT * FROM target_table;
         v_count NUMBER := 1;
         v_target_arr target_arr := target_arr();
         v_target_arr_rec target_rec;
         v_rec_count NUMBER;
    BEGIN
         OPEN wt_cursor;
         LOOP
              FETCH wt_cursor INTO v_target_rec;  -- fetch data from the table into the local record
              EXIT WHEN wt_cursor%NOTFOUND;
              -- move data into target_arr
              v_target_arr_rec := target_rec(v_target_rec.target_id, v_target_rec.target_entity_id,
                   v_target_rec.dd, v_target_rec.fd, v_target_rec.code, v_target_rec.target_pct,
                   v_target_rec.template_nm, v_target_rec.p_symbol, v_target_rec.pm_init, v_target_rec.target_name,
                   v_target_rec.target_type, v_target_rec.target_caption, v_target_rec.sort_order);
              v_target_arr.extend();
              v_target_arr(v_count) := v_target_arr_rec;
              v_count := v_count + 1;
         END LOOP;
         CLOSE wt_cursor;
         -- run table cast on target_arr
         SELECT count(*) INTO v_rec_count
         FROM TABLE(CAST(v_target_arr AS target_arr)) arr;
         DBMS_OUTPUT.enable;
         DBMS_OUTPUT.PUT_LINE('p_out_count ' || v_rec_count);
         DBMS_OUTPUT.PUT_LINE('v_count ' || v_count);
    END;
    Output is
    p_out_count 0
    v_count 10140
    Expected output
    p_out_count 10140
    v_count 10140

  • Ora-1460 during select using table(cast(type))

    Hi all!
    I've got a problem with a query that fails with an ORA-01460 at random time intervals. This is a 10.2.0.3 database running on Sun OS.
    We've managed to reproduce this error in a controlled way by running an ANALYZE on the table in question at the same time this query runs.
    The error looks like this:
    Unexpected system error, see server log for details. Root message is: org.apache.ojb.broker.PersistenceBrokerSQLException: * SQLException during execution of sql-statement: * sql statement was 'SELECT A0.ID,A0.LOCK_VERSION,A0.CLASS_NAME,A0.DESCRIPTION,A0.NAME,A0.EXTERNAL_ID,A0.ORDER_NUMBER,A0.LEVEL_ID,A0.ROOT_AH_ID,A0.DIFF_END_DATE,A0.DIFF_START_DATE,A0.SUPPORTED_BY_ASS_CAL,A0.CATEGORY_ROLE_ID FROM CATEGORY A0 WHERE A0.ID IN (select /*+ cardinality(1) */ * from table(cast( ems_string_to_table(?) as ems_table_of_number_type ))union select /*+ cardinality(1) */ * from table(cast( ems_string_to_table('') as ems_table_of_number_type)))' * Exception message is [ORA-01460: unimplemented or unreasonable conversion requested ] * Vendor error code [1460] * SQL state code [72000]
    ems_string_to_table is a function that populates a type with an unknown number of values. The length of the string never exceeds 4k, though.
    ems_table_of_number_type is a type defined as "table of number".
    Has anybody seen this error before, or have any idea why this should be?
    Best regards,
    Heyers

    There is an IN-clause constraint, i.e. a maximum number of characters you can pass to Oracle. Please check whether your inner SELECT might be crossing that IN-clause boundary.

  • 10g: parallel pipelined table func. using table(cast(SQL collect.))?

    Hi,
     I am trying to distribute SQL data objects - stored in a SQL data type TABLE OF <object-type> - to multiple (parallel) instances of a table function,
     by passing a CURSOR(...) to the table function, which selects from the SQL TABLE OF storage via "select * from TABLE(CAST(<storage> as <storage-type>))".
     But Oracle always uses only a single table function instance :-(
     whatever hints I provide or settings I use for the parallel table function (parallel_enable ...).
     Could it be that this is due to the fact that my data are not
     globally available, but only in the main session?
     Can someone confirm that it's not possible to start multiple parallel table functions
     for selecting on SQL data type TABLE OF <object> storages?
    Here's an example sqlplus program to show the issue:
    -------------------- snip ---------------------------------------------
    set serveroutput on;
    drop table test_table;
    drop type ton_t;
    drop type test_list;
    drop type test_obj;
     create table test_table (
          a number(19,0),
          b timestamp with time zone,
          c varchar2(256)
     );
     create or replace type test_obj as object(
          a number(19,0),
          b timestamp with time zone,
          c varchar2(256)
     );
    create or replace type test_list as table of test_obj;
    create or replace type ton_t as table of number;
    create or replace package test_pkg
    as
          type test_rec is record (
               a number(19,0),
               b timestamp with time zone,
               c varchar2(256)
          );
          type test_tab is table of test_rec;
         type test_cur is ref cursor return test_rec;
         function TF(mycur test_cur)
    return test_list pipelined
    parallel_enable(partition mycur by hash(a));
    end;
    create or replace package body test_pkg
    as
         function TF(mycur test_cur)
    return test_list pipelined
    parallel_enable(partition mycur by hash(a))
    is
              sid number;
              counter number(19,0) := 0;
              myrec test_rec;
              mytab test_tab;
              mytab2 test_list := test_list();
         begin
              select userenv('SID') into sid from dual;
              dbms_output.put_line('test_pkg.TF( sid => '''|| sid || ''' ): enter');
              loop
                   fetch mycur into myRec;
                   exit when mycur%NOTFOUND;
                   mytab2.extend;
                   mytab2(mytab2.last) := test_obj(myRec.a, myRec.b, myRec.c);
              end loop;
              for i in mytab2.first..mytab2.last loop
                   -- attention: saves own SID in test_obj.a for indication to caller
                   --     how many sids have been involved
                   pipe row(test_obj(sid, mytab2(i).b, mytab2(i).c));
                   counter := counter + 1;
              end loop;
              dbms_output.put_line('test_pkg.TF( sid => '''|| sid || ''' ): exit, piped #' || counter || ' records');
         end;
    end;
    declare
         myList test_list := test_list();
         myList2 test_list := test_list();
         sids ton_t := ton_t();
    begin
         for i in 1..10000 loop
              myList.extend; myList(myList.last) := test_obj(i, sysdate, to_char(i+2));
         end loop;
         -- save into the real table
         insert into test_table select * from table(cast (myList as test_list));
         dbms_output.put_line(chr(10) || 'copy ''mylist'' to ''mylist2'' by streaming via table function...');
         select test_obj(a, b, c) bulk collect into myList2
         from table(test_pkg.TF(CURSOR(select /*+ parallel(tab,10) */ * from table(cast (myList as test_list)) tab)));
         dbms_output.put_line('... saved #' || myList2.count || ' records');
         select distinct(tab.a) bulk collect into sids from table(cast (myList2 as test_list)) tab;
         dbms_output.put_line('worker thread''s sid list:');
         for i in sids.first..sids.last loop
              dbms_output.put_line('sid #' || sids(i));
         end loop;
         dbms_output.put_line(chr(10) || 'copy physical ''test_table'' to ''mylist2'' by streaming via table function:');
         select test_obj(a, b, c) bulk collect into myList2
         from table(test_pkg.TF(CURSOR(select /*+ parallel(tab,10) */ * from test_table tab)));
         dbms_output.put_line('... saved #' || myList2.count || ' records');
         select distinct(tab.a) bulk collect into sids from table(cast (myList2 as test_list)) tab;
         dbms_output.put_line('worker thread''s sid list:');
         for i in sids.first..sids.last loop
              dbms_output.put_line('sid #' || sids(i));
         end loop;
    end;
    -------------------- snap ---------------------------------------------
    Here's the output:
    -------------------- snip ---------------------------------------------
    copy 'mylist' to 'mylist2' by streaming via table function...
    test_pkg.TF( sid => '98' ): enter
    test_pkg.TF( sid => '98' ): exit, piped #10000 records
    ... saved #10000 records
    worker thread's sid list:
    sid #98 -- ONLY A SINGLE SID HERE!
    copy physical 'test_table' to 'mylist2' by streaming via table function:
    ... saved #10000 records
    worker thread's sid list:
    sid #128 -- A LIST OF SIDS HERE!
    sid #141
    sid #85
    sid #125
    sid #254
    sid #101
    sid #124
    sid #109
    sid #142
    sid #92
    PL/SQL procedure successfully completed.
    -------------------- snap ---------------------------------------------
    I posted it to newsgroup comp.databases.oracle.server.
    (summary: "10g: parallel pipelined table functions with cursor selecting from table(cast(SQL collection)) doesn't work ")
     But I didn't get a response.
     There I also wrote some background information about my application:
    -------------------- snip ---------------------------------------------
     My application has a two-step/stage data selection.
     A 1st select for minimal context base data
     - mainly to evaluate the due driving data records.
     And a 2nd select for all the "real" data to process a context
     (joining many more tables here, which I don't want to do for non-due records).
     So it does the stage #1 select first, then the stage #2 select - based on the stage #1 results - next.
     The first implementation of the application did the stage #1 select in the main session of the PL/SQL code.
     And for the stage #2 select there was a dispatch to multiple parallel table functions (in multiple worker sessions) for the "real work".
    That worked.
    However there was a flaw:
     Between records from the stage #1 selection and records from the stage #2 selection there is a 1:n relation (via a key / foreign key relation).
     That means, for 1 resulting record from the stage #1 selection, there are x records from the stage #2 selection.
     That forced me to use "cluster curStage2 by (theKey)",
     because the worker sessions need to evaluate the overall status for a context of 1 record from stage #1 and x records from stage #2
     (so they need to have the x records of stage #2 together).
     This then resulted in a delay in starting up the worker sessions (I didn't find a way to get rid of this).
     So I wanted to shift the invocation of the worker sessions to the stage #1 selection.
     Then I don't need the "cluster curStage2 by (theKey)" anymore!
     But: I also need to do an update of the primary driving data!
     So the stage #1 select is a 'select ... for update ...'.
     But you can't use such a select in a CURSOR for table functions (which I can understand).
     So I have to do my stage #1 selection in two steps:
     1. 'select for update' in the main session and collect the result in a SQL collection.
     2. pass the collected data to the parallel table functions.
     And for 2. I recognized that it doesn't start up multiple parallel table function instances.
     As a work-around
     - if it's just not possible to start multiple parallel pipelined table functions for dispatching from 'select * from TABLE(CAST(... as ...))' -
     I need to select again on the base tables - driven by the SQL collection data.
     But before I do so, I wanted to verify that it's really not possible.
     Maybe I just miss a special Oracle hint or whatever you can get "out of another box" :-)
    -------------------- snap ---------------------------------------------
    - many thanks!
    rgds,
    Frank
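     As a hedged work-around sketch (it only restates the fallback described above, not a confirmed way to parallelize TABLE(CAST(...)) itself): drive the parallel cursor from the physical table - which the second test run shows does get distributed across slaves - and restrict it by the collected keys:
     select test_obj(a, b, c) bulk collect into myList2
     from table(test_pkg.TF(CURSOR(
            select /*+ parallel(tab,10) */ tab.a, tab.b, tab.c
              from test_table tab
             where tab.a in (select t.a from table(cast(myList as test_list)) t))));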


  • Getting SqlException while using TABLE CAST but works fine with order by

    I am using the following query as a sub-query within a query in PL/SQL. This sub-query works fine if I add ORDER BY x1, x2 at the end of the query. Otherwise it gives an SQLException.
    SELECT x1,x2 FROM TABLE( CAST (somelist AS X_ARRAY ))

    Narendra,
    If this question is related to HTML DB, please provide complete context, and show the exact text of error messages and the query itself.
    Scott

  • Table(cast) - invalid datatype problem

    Hi All,
    Basic scenario:
    PACKAGE
    create or replace
    PACKAGE C2_PAYMENT_DOC IS
    TYPE bin_array IS TABLE OF NUMBER
    INDEX BY BINARY_INTEGER;
    FUNCTION test_fun(l_doc_id IN integer) RETURN bin_array;
    PACKAGE_BODY
    create or replace
    package body C2_PAYMENT_DOC as
    FUNCTION test_fun (l_doc_id IN integer) RETURN bin_array IS
    l_gross bin_array;
    begin
    c2_purchase_invoice.get_inv_gross_amount(l_doc_id, l_gross(1));
    return l_gross;
    end;
    END;
    QUERY
    select * from Table(Cast(c2_payment_doc.test_fun(1) As bin_array));
    Result of the query is ORA-00902: invalid datatype
    How can I make the select statement valid?
    Thanks in advance,
    Bartek

    You cannot use local collection types in SQL. You must create type bin_array as a SQL nested table type. Also, depending on the version, you might not need to cast:
    SQL> CREATE OR REPLACE
      2    TYPE bin_array
      3      AS TABLE OF NUMBER
      4  /
    Type created.
    SQL>  create or replace
      2   PACKAGE C2_PAYMENT_DOC IS
      3  FUNCTION test_fun(l_doc_id IN integer) RETURN bin_array;
      4  end;
      5  /
    Package created.
    SQL> create or replace
      2  package body C2_PAYMENT_DOC as
      3 
      4  FUNCTION test_fun (l_doc_id IN integer) RETURN bin_array IS
      5  l_gross bin_array := bin_array();
      6  begin
      7      l_gross.extend;
      8      l_gross(1) := l_doc_id;
      9  return l_gross;
    10  end;
    11  END;
    12  /
    Package body created.
    SQL> select * from Table(Cast(c2_payment_doc.test_fun(1) As bin_array));
    COLUMN_VALUE
               1
    SQL> select * from Table(c2_payment_doc.test_fun(1))
      2  /
    COLUMN_VALUE
               1
    SQL> SY.

  • Table cast PL/SQL: ORA-00902: invalid datatype

    I am getting
    PL/SQL: ORA-00902: invalid datatype
    error in
    OPEN pPymtCur FOR
    SELECT *
    FROM TABLE(CAST( up_gap_tra_reports.myArray AS traArray));
    in my package up_gap_tra_reports.
    CREATE OR REPLACE PACKAGE GAPSDVEL.up_gap_tra_reports
    AS
    TYPE traRecord IS RECORD (
    group1StudEnrol NUMBER(6,1),
    group2StudEnrol NUMBER(6,1),
    pymtAmt gap_payment.NET_AMT%TYPE
    );
    TYPE traArray IS TABLE OF traRecord;
    myArray traArray := traArray() ;
    END up_gap_tra_reports;
    I have already declared the traArray type.
    Please help me to solve this.

    Meghna wrote:
    is there any way to use a pl/sql collection in SQL or a ref cursor without creating it, because I am not able to create a type in the database?
    The only way I am aware of is a pipelined function:
    create or replace
      package pkg1
        is
          type traRecord
            is record(
                      ename emp.ename%type,
                       sal   emp.sal%type
                      );
          TYPE traArray IS TABLE OF traRecord;
          function f1
            return traArray
            pipelined;
    end;
    create or replace
      package body pkg1
        is
        function f1
            return traArray
            pipelined
          is
              v_rec traRecord;
          begin
              v_rec.ename := 'Sam';
              v_rec.sal := 1000;
              pipe row(v_rec);
              v_rec.ename := 'John';
              v_rec.sal := 1500;
              pipe row(v_rec);
              v_rec.ename := 'Mary';
              v_rec.sal := 2000;
              pipe row(v_rec);
              return;
        end;
    end;
     /
     Now you can:
    SQL> select * from table(pkg1.f1)
      2  /
    ENAME             SAL
    Sam              1000
    John             1500
    Mary             2000
    SQL>
    Keep in mind, it will create system-generated types:
    SQL> select type_name from user_types
      2  /
    TYPE_NAME
    SYS_PLSQL_73305_9_1
    SYS_PLSQL_73305_DUMMY_1
    SYS_PLSQL_73305_34_1
    SQL> desc SYS_PLSQL_73305_9_1
    Name                                      Null?    Type
    ENAME                                              VARCHAR2(10)
    SAL                                                NUMBER(7,2)
    SQL> desc SYS_PLSQL_73305_DUMMY_1
    SYS_PLSQL_73305_DUMMY_1 TABLE OF NUMBER
    SQL> desc SYS_PLSQL_73305_34_1
    SYS_PLSQL_73305_34_1 TABLE OF SYS_PLSQL_73305_9_1
    Name                                      Null?    Type
    ENAME                                              VARCHAR2(10)
    SAL                                                NUMBER(7,2)
    SQL> SY.

  • Special Grant to use "Select * from Table(cast..."??

    Hi,
    I've recently created the types and function to use TABLE(CAST(function AS type)). It works fine and gives me the correct result. I've granted execute on the types and on the function to a role that is enabled for a user, but when the user tries to use "select * from table(cast(function as type))", he gets an "ORA-01031: Insufficient Privileges" error message. Is there any other grant that must be given to the role, so that the user can execute the select?
    Thanks in advance!
    Daniel

    Hi Kamal,
    I'm not sure what an anonymous PL/SQL block means. When I (or the user) try to run the select, I enter all the information, i.e., the owners for the type and function: "select * from table(cast(a.my_function(my_argument) as a.my_type))". I'm using SQL*Plus at this time, and I have Oracle 8i.
    I didn't want to explicitly grant execute to the user because that would go against some rules I have to follow... I'll see if I give it a try though!
    Thanks!
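    A hedged note on the usual suspect here: privileges received only through a role are not active inside definer's-rights stored PL/SQL, so if the function (or anything it calls internally) relies on role-granted privileges, ORA-01031 can appear even though the role covers the function itself. A direct grant rules that out (the owner "a" and some_user below are placeholders):
    GRANT EXECUTE ON a.my_type     TO some_user;
    GRANT EXECUTE ON a.my_function TO some_user;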

  • Problems using table (cast as)

    Hi
    I have some code like this:
    declare
    TYPE t_forall_bags IS TABLE OF misbag.bags%ROWTYPE;
    l_forall_bags t_forall_bags := t_forall_bags ();
    begin
    open c2;
    FETCH c2 BULK COLLECT INTO l_forall_bags LIMIT v_array_size;
    if l_forall_bags.COUNT > 0 then
    begin
    merge into misbag.bags dest
    using (select col1,
    col2,
    colx
    from TABLE( cast( l_forall_bags as t_forall_bags ) ) ) src
    on (dest.bag_id = src.bag_id )
    when matched then
    --do update stuff
    when not matched then
    --do insert stuff;
    end;
    end if;
    end;
    On compilation I am getting an ORA-00902 invalid datatype, seemingly on the t_forall_bags inside the cast (as highlighted in bold).
    I thought I had the syntax correct, but maybe not.
    rgds
    Tony

    BluShadow wrote:
    Why are you querying data from the database into a collection (in expensive PGA memory) to then pass that back down to the SQL engine to be treated as a table (and incidentally one without any indexes or the other benefits of a database table)?
    Well, that is a very good question.
    The task is to take a generally smaller number of very recent rows from one table and apply them to a similar table in another schema. This task will run very frequently (i.e. every second or two) so it will generally have a smallish number of rows (i.e. 100-200) each time it runs. Some rows are updates and some rows are inserts.
    If there is a delay in running the task, we don't necessarily want to process all of the outstanding rows in one go, but to take them in chunks until it catches up.
    One way to do this would be to perform multiple queries on the original data to check how many rows were outstanding, then to select which ones were to be merged, then go ahead and do the merge (with both main tables as you propose). This alternate idea (that I was looking at here) was to bulk collect the first N rows from the table into the array (up to the defined limit) and then to merge this list of rows into the destination table. The goal was to perform fewer data accesses and make the process least expensive in I/O. By bulk selecting up to N rows into the array, it was felt that there was less I/O on the source table, and probably the same amount of I/O on the destination table.
    The very first method of writing was to bulk select the first N rows into an array, delete any that already existed in the dest table then to "forall" insert the array contents into the destination table. This seemed to work quite well, we wanted to compare the merge version and see how it compared in speed and I/O usage.
    rgds
    Tony
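    A hedged side note on the ORA-00902 itself, separate from the design discussion: TABLE(CAST(...)) needs a type that exists at SQL level, and a PL/SQL table type declared over misbag.bags%ROWTYPE is not one, so the cast cannot compile. A sketch of the alternative, with placeholder columns since the real BAGS columns aren't shown here:
    create or replace type bags_obj as object (bag_id number, col1 varchar2(50), col2 varchar2(50));
    /
    create or replace type bags_tab as table of bags_obj;
    /
    The cursor would then BULK COLLECT bags_obj(...) constructor calls into a bags_tab variable, and the MERGE's "from TABLE(CAST(l_forall_bags AS bags_tab))" compiles.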

  • Rbo + history data index/no index table (cast ())

    Hi guys,
    I have a couple of questions and maybe you can help me with some advice.
    I have the misfortune to work with a 10g system that runs on the RBO.
    1. I have a lot of tables containing activity per id.
    For example, I have a table like moved_history.
    Let's say I have 100k ids in the system, and each month some ids are moved (let's say 20k rows; an id can be moved multiple times),
    so each month we will add an extra 20k lines to this table. Most of my queries access this type of table with a predicate like entry_date between sysdate and sysdate - 2 months (or 1.5 months or 1 month).
    I have a lot of tables like this: change_history, pay_history, etc.
    The question: is it OK to create an index on entry_date? At the beginning this will not be useful, because it will select the entire table using the index, but after a lot of months this index will start to be usable?! (I am using the RBO and Oracle will use the index no matter what.)
    2. (second question)
    Let's say I have a table called dummy: ~100k records.
    A part of it, ~80k rows, is used multiple times in selects in a PL/SQL process.
    Initially I select with bulk collect and put those 80k (or in other cases 10k or 1k) rows in nested tables.
    After that I use the nested table in SQL, instead of using the table (with table(cast(.....))).
    Is this a correct approach?
    What are the advantages and the disadvantages?
    When using the nested table, I don't have any index on it (although I want to access all rows from that nested table, so a full scan will be best), nor any statistics on it (in case, at some point, I move to the CBO :) ), and I use memory from the PGA.
    When using the table directly, Oracle knows the data, I am using the SGA, and maybe the data from the table is cached, so the access speed will be the same as with the nested tables.
    Please correct me if I am wrong.
    How can I test/benchmark this? What should I look at?
    There is another option: the solution depends on how I will join, access, and select from the tables. And again, how can I test this in a good way?
    thanks guys for your time

    Well it is a bit hard to say. Someone has decided to go with a desupported optimizer
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14211/glossary.htm#sthref1601
    >
    Note:
    This feature has been desupported.
    >
    And since the optimizer is responsible for the efficient execution of queries
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14220/sqlplsql.htm#sthref3528
    >
    All SQL statements use the optimizer, a part of Oracle that determines the most efficient means of accessing the specified data.
    >
    I can only deduce that they do not care about optimization or query efficiency and that unfortunately your hands are pretty much tied.

  • Expression Framework / SELECT * FROM TABLE(CAST(...

    Hello!
    Is it possible to build the following with Toplink Expression Framework?
    Example:
    CREATE TYPE TY_OB_TEST AS OBJECT
    ( SYSTOP_NR NUMBER(5,0),
      SYS_NR NUMBER(5,0),
      IM_SYS_NAME VARCHAR2(80) );
    CREATE TYPE TY_TB_TEST AS TABLE OF TY_OB_TEST;
    Package1.FUNCTION1 returns Type TY_TB_TEST.
    SQL:
    select * from TABLE(CAST(PACKAGE1.FUNCTION1(42) AS TY_TB_TEST));
    thank you!
    Harald.

    Nope, just use SQL.
    - Don

  • For loop or table cast to collection variable

    Hi ,
    I have a collection variable (Table type) which can have at-most 20 variables in it. I want to fetch one particular record.
    Which is more efficient: a FOR loop or a table cast?
    Thanks,

    @Karthick, The requirement is that I have a query (with 2 joins in it) in a procedure which is executed about 50 lakh times, and out of those 50 lakh executions it fetches the same data for, say, 25000 records.
    I mean the query output is the same for 25000 records (not fixed) and then it is the same for the next 25000 records. So I took a collection variable, did a BULK COLLECT into it, and I am trying to process (fetch) the data in memory instead of hitting the query (and the table) again.
    Also, the query takes 0.001 sec per execution but it runs so many times that the procedure takes a long time.

    Oracle performs 2 types of I/O:
    1. Physical I/O
    This happens when oracle picks up data blocks from the data file and puts them in the data buffer (SGA).
    2. Logical I/O
    This happens when oracle picks up a data block from the data buffer (SGA).
    In the data buffer the data is stored on a FIFO basis. So when you hit a table for the first time oracle goes for a physical I/O; the subsequent time it will go for a logical I/O.
    What you are trying to do oracle does it already. You don't have to use a collection. Collection uses expensive private memory (PGA).
    And again the basic question is why are you executing a procedure 50,00,000 times?
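    For the original question (at most ~20 elements, fetch one particular record), a minimal hedged sketch: a plain PL/SQL loop over the collection avoids the PL/SQL-to-SQL context switch that a SELECT ... FROM TABLE(CAST(...)) would add, which is usually the cheaper option at this size (names below are hypothetical):
    declare
      type t_codes is table of varchar2(50);
      l_codes t_codes := t_codes('A', 'B', 'C');
      l_hit   varchar2(50);
    begin
      for i in 1 .. l_codes.count loop
        if l_codes(i) = 'B' then     -- the "one particular record" being looked for
          l_hit := l_codes(i);
          exit;
        end if;
      end loop;
      dbms_output.put_line('found: ' || nvl(l_hit, '<none>'));
    end;
    /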

  • What is this problem? table(cast(array))

    The following type exists in the database:
    "type InNumberTab is table of number;"
    declare
    v_tbseq_trans:= InNumberTab(null);
    select count(*) into v_cont
    from dados_propriedade dp
    where dp.num_pessoa=p_num_pessoa
    group by num_nirf;
    v_tbseq_trans.EXTEND(SQL%ROWCOUNT);
    v_cont:=0;
    for v_cs in
    (select to_number(replace(dp.num_nirf,'-','')) nirf, max(dp.seq_transacao) seq_transacao
    from dados_propriedade dp
    where dp.num_pessoa=p_num_pessoa
    group by dp.num_nirf) loop
    v_tbseq_trans(v_cont):=v_cs.seq_transacao;
    v_cont:=v_cont+1;
    END LOOP;
    open p_cursor_prop for
    select
    from dados_propriedade dp
    where dp.seq_transacao in (SELECT column_value FROM TABLE(CAST(v_tbseq_trans as InNumberTab)));

    I'll try to guess what you mean.
    If you need to use an object or collection type in a SQL query, you should declare it at the SQL (database) level, not in PL/SQL code:
    SQL> declare
      2   type na is table of number;
      3   nat na := na(1,2,3);
      4  begin
      5   for x in (select column_value y from table(cast(nat as na))) loop
      6     dbms_output.put_line(x.y);
      7   end loop;
      8  end;
      9  /
    for x in (select column_value y from table(cast(nat as na))) loop
    ERROR at line 5:
    ORA-06550: line 5, column 56:
    PL/SQL: ORA-00902: invalid datatype
    ORA-06550: line 5, column 11:
    PL/SQL: SQL Statement ignored
    ORA-06550: line 6, column 25:
    PLS-00364: loop index variable 'X' use is invalid
    ORA-06550: line 6, column 4:
    PL/SQL: Statement ignored
    SQL> set serveroutput on
    SQL> create type na is table of number;
      2  /
    Type created.
    SQL> declare
      2   nat na := na(1,2,3);
      3  begin
      4   for x in (select column_value y from table(cast(nat as na))) loop
      5     dbms_output.put_line(x.y);
      6   end loop;
      7  end;
      8  /
    1
    2
    3
    PL/SQL procedure successfully completed.
    Is that your problem?
    Rgds.
