Collection Specialist Name table?

Hi everybody,
Can anyone please tell me in which table the collection specialist's name is stored? I need the name of the specialist, not the collection specialist ID.
In the "assign processor to collection groups" configuration, when I enter a collection specialist ID, the name is populated automatically. I want to know which field it is populated from.
Thank you
Donny

Hi,
Read the specialist ID from table field udmbpsegments-coll_specialist and pass that value to "BAPI_USER_GET_DETAIL" to get the user's name details.
Regards
Rinku Bhowal
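Rinku's two-step lookup (read the specialist ID from the segment data, then call BAPI_USER_GET_DETAIL for the name) can be sketched from outside SAP. This is only an illustration: the connection values are placeholders, `full_name_from_bapi` is a hypothetical helper name, and the RFC call assumes the pyrfc library; the name fields do come back in the BAPI's ADDRESS structure.

```python
def full_name_from_bapi(bapi_result):
    """Pull the user's full name out of a BAPI_USER_GET_DETAIL-style result.

    The ADDRESS structure of BAPI_USER_GET_DETAIL carries the name fields
    (FIRSTNAME, LASTNAME, FULLNAME)."""
    address = bapi_result.get("ADDRESS", {})
    return address.get("FULLNAME") or " ".join(
        part for part in (address.get("FIRSTNAME"), address.get("LASTNAME")) if part
    )

if __name__ == "__main__":
    # With a live system the call would look roughly like (placeholders):
    #   from pyrfc import Connection
    #   conn = Connection(ashost="sap-host", sysnr="00", client="100",
    #                     user="RFC_USER", passwd="secret")
    #   result = conn.call("BAPI_USER_GET_DETAIL", USERNAME=specialist_id)
    # Offline, a mocked result shows the extraction step:
    mocked = {"ADDRESS": {"FIRSTNAME": "Donny", "LASTNAME": "Example",
                          "FULLNAME": "Donny Example"}}
    print(full_name_from_bapi(mocked))
```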

Similar Messages

  • Managing statistics for object collections used as table types in SQL

    Hi All,
    Is there a way to manage statistics for collections used as table types in SQL.
    Below is my test case
    Oracle Version :
    SQL> select * from v$version;
    BANNER
    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    PL/SQL Release 11.2.0.3.0 - Production
    CORE    11.2.0.3.0      Production
    TNS for IBM/AIX RISC System/6000: Version 11.2.0.3.0 - Production
    NLSRTL Version 11.2.0.3.0 - Production
    SQL>
    Original Query:
    SELECT
         9999,
         tbl_typ.FILE_ID,
         tf.FILE_NM ,
         tf.MIME_TYPE ,
         dbms_lob.getlength(tfd.FILE_DATA)
    FROM
         TG_FILE tf,
         TG_FILE_DATA tfd,
          (
               SELECT
                    *
               FROM
                    TABLE(
                         SELECT
                              CAST(TABLE_ESC_ATTACH(OBJ_ESC_ATTACH( 9999, 99991, 'file1.png', NULL, NULL, NULL),
                              OBJ_ESC_ATTACH( 9999, 99992, 'file2.png', NULL, NULL, NULL)) AS TABLE_ESC_ATTACH)
                         FROM
                              dual
                    )
          )     tbl_typ
    WHERE
         tf.FILE_ID     = tfd.FILE_ID
    AND tf.FILE_ID  = tbl_typ.FILE_ID
    AND tfd.FILE_ID = tbl_typ.FILE_ID;
    Elapsed: 00:00:02.90
    Execution Plan
    Plan hash value: 3970072279
    | Id  | Operation                                | Name         | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT                         |              |     1 |   194 |  4567   (2)| 00:00:55 |
    |*  1 |  HASH JOIN                               |              |     1 |   194 |  4567   (2)| 00:00:55 |
    |*  2 |   HASH JOIN                              |              |  8168 |   287K|   695   (3)| 00:00:09 |
    |   3 |    VIEW                                  |              |  8168 |   103K|    29   (0)| 00:00:01 |
    |   4 |     COLLECTION ITERATOR CONSTRUCTOR FETCH|              |  8168 | 16336 |    29   (0)| 00:00:01 |
    |   5 |      FAST DUAL                           |              |     1 |       |     2   (0)| 00:00:01 |
    |   6 |    TABLE ACCESS FULL                     | TG_FILE      |   565K|    12M|   659   (2)| 00:00:08 |
    |   7 |   TABLE ACCESS FULL                      | TG_FILE_DATA |   852K|   128M|  3863   (1)| 00:00:47 |
    Predicate Information (identified by operation id):
       1 - access("TF"."FILE_ID"="TFD"."FILE_ID" AND "TFD"."FILE_ID"="TBL_TYP"."FILE_ID")
       2 - access("TF"."FILE_ID"="TBL_TYP"."FILE_ID")
    Statistics
              7  recursive calls
              0  db block gets
          16783  consistent gets
          16779  physical reads
              0  redo size
            916  bytes sent via SQL*Net to client
            524  bytes received via SQL*Net from client
              2  SQL*Net roundtrips to/from client
              0  sorts (memory)
              0  sorts (disk)
              2  rows processed
    Indexes are present on column FILE_ID in both tables (TG_FILE, TG_FILE_DATA).
    select
         index_name,blevel,leaf_blocks,DISTINCT_KEYS,clustering_factor,num_rows,sample_size
    from
         all_indexes
    where table_name in ('TG_FILE','TG_FILE_DATA');
    INDEX_NAME                     BLEVEL LEAF_BLOCKS DISTINCT_KEYS CLUSTERING_FACTOR     NUM_ROWS SAMPLE_SIZE
    TG_FILE_PK                          2        2160        552842             21401       552842      285428
    TG_FILE_DATA_PK                     2        3544        852297             61437       852297      852297
    Ideally the view should have used a NESTED LOOP, so that the indexes are used, since the number of rows coming from the object collection is only 2.
    But the optimizer takes the default estimate of 8168, leading to a HASH join between the tables and hence FULL TABLE access.
    So my question is: is there any way I can change the statistics while using collections in SQL?
    I could use hints to force the indexes, but I am trying to avoid that for now. Currently the time shown in the explain plan is not accurate.
    Modified query with hints :
    SELECT    
        /*+ index(tf TG_FILE_PK ) index(tfd TG_FILE_DATA_PK) */
        9999,
        tbl_typ.FILE_ID,
        tf.FILE_NM ,
        tf.MIME_TYPE ,
        dbms_lob.getlength(tfd.FILE_DATA)
    FROM
        TG_FILE tf,
        TG_FILE_DATA tfd,
            (
                SELECT
                    *
                FROM
                    TABLE(
                         SELECT
                              CAST(TABLE_ESC_ATTACH(OBJ_ESC_ATTACH( 9999, 99991, 'file1.png', NULL, NULL, NULL),
                              OBJ_ESC_ATTACH( 9999, 99992, 'file2.png', NULL, NULL, NULL)) AS TABLE_ESC_ATTACH)
                         FROM
                              dual
                    )
            ) tbl_typ
    WHERE
        tf.FILE_ID     = tfd.FILE_ID
    AND tf.FILE_ID  = tbl_typ.FILE_ID
    AND tfd.FILE_ID = tbl_typ.FILE_ID;
    Elapsed: 00:00:00.01
    Execution Plan
    Plan hash value: 1670128954
    | Id  | Operation                                 | Name            | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT                          |                 |     1 |   194 | 29978   (1)| 00:06:00 |
    |   1 |  NESTED LOOPS                             |                 |       |       |            |          |
    |   2 |   NESTED LOOPS                            |                 |     1 |   194 | 29978   (1)| 00:06:00 |
    |   3 |    NESTED LOOPS                           |                 |  8168 |  1363K| 16379   (1)| 00:03:17 |
    |   4 |     VIEW                                  |                 |  8168 |   103K|    29   (0)| 00:00:01 |
    |   5 |      COLLECTION ITERATOR CONSTRUCTOR FETCH|                 |  8168 | 16336 |    29   (0)| 00:00:01 |
    |   6 |       FAST DUAL                           |                 |     1 |       |     2   (0)| 00:00:01 |
    |   7 |     TABLE ACCESS BY INDEX ROWID           | TG_FILE_DATA    |     1 |   158 |     2   (0)| 00:00:01 |
    |*  8 |      INDEX UNIQUE SCAN                    | TG_FILE_DATA_PK |     1 |       |     1   (0)| 00:00:01 |
    |*  9 |    INDEX UNIQUE SCAN                      | TG_FILE_PK      |     1 |       |     1   (0)| 00:00:01 |
    |  10 |   TABLE ACCESS BY INDEX ROWID             | TG_FILE         |     1 |    23 |     2   (0)| 00:00:01 |
    Predicate Information (identified by operation id):
       8 - access("TFD"."FILE_ID"="TBL_TYP"."FILE_ID")
       9 - access("TF"."FILE_ID"="TBL_TYP"."FILE_ID")
           filter("TF"."FILE_ID"="TFD"."FILE_ID")
    Statistics
              0  recursive calls
              0  db block gets
             16  consistent gets
              8  physical reads
              0  redo size
            916  bytes sent via SQL*Net to client
            524  bytes received via SQL*Net from client
              2  SQL*Net roundtrips to/from client
              0  sorts (memory)
              0  sorts (disk)
              2  rows processed
    Thanks,
    B

    Thanks Tubby,
    While searching I found that we can use the CARDINALITY hint to set statistics for a TABLE function.
    But I preferred not to mention it, as it is currently an undocumented hint. I now think I should have mentioned it when posting the first time.
    http://www.oracle-developer.net/display.php?id=427
    The document above mentions four ways to set the statistics:
    1) CARDINALITY (undocumented)
    2) OPT_ESTIMATE (undocumented)
    3) DYNAMIC_SAMPLING (documented)
    4) Extensible Optimizer
    I tried the different hints and they work as expected,
    i.e. CARDINALITY and OPT_ESTIMATE take the value set in the hint.
    But the DYNAMIC_SAMPLING hint provides the most accurate estimate of the row count (which is 2 in this particular case).
    With CARDINALITY hint
    SELECT
        /*+ cardinality( e, 5) */*
    FROM
         TABLE(
              SELECT
                   CAST(TABLE_ESC_ATTACH(OBJ_ESC_ATTACH( 9999, 99991, 'file1.png', NULL, NULL, NULL),
                   OBJ_ESC_ATTACH( 9999, 99992, 'file2.png', NULL, NULL, NULL)) AS TABLE_ESC_ATTACH)
              FROM
                   dual
         ) e ;
    Elapsed: 00:00:00.00
    Execution Plan
    Plan hash value: 1467416936
    | Id  | Operation                             | Name | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT                      |      |     5 |    10 |    29   (0)| 00:00:01 |
    |   1 |  COLLECTION ITERATOR CONSTRUCTOR FETCH|      |     5 |    10 |    29   (0)| 00:00:01 |
    |   2 |   FAST DUAL                           |      |     1 |       |     2   (0)| 00:00:01 |
    With OPT_ESTIMATE hint
    SELECT
         /*+ opt_estimate(table, e, scale_rows=0.0006) */*
    FROM
         TABLE(
              SELECT
                   CAST(TABLE_ESC_ATTACH(OBJ_ESC_ATTACH( 9999, 99991, 'file1.png', NULL, NULL, NULL),
                   OBJ_ESC_ATTACH( 9999, 99992, 'file2.png', NULL, NULL, NULL)) AS TABLE_ESC_ATTACH)
              FROM
                   dual
         ) e ;
    Execution Plan
    Plan hash value: 4043204977
    | Id  | Operation                              | Name | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT                       |      |     5 |   485 |    29   (0)| 00:00:01 |
    |   1 |  VIEW                                  |      |     5 |   485 |    29   (0)| 00:00:01 |
    |   2 |   COLLECTION ITERATOR CONSTRUCTOR FETCH|      |     5 |    10 |    29   (0)| 00:00:01 |
    |   3 |    FAST DUAL                           |      |     1 |       |     2   (0)| 00:00:01 |
    With DYNAMIC_SAMPLING hint
    SELECT
        /*+ dynamic_sampling( e, 5) */*
    FROM
         TABLE(
              SELECT
                   CAST(TABLE_ESC_ATTACH(OBJ_ESC_ATTACH( 9999, 99991, 'file1.png', NULL, NULL, NULL),
                   OBJ_ESC_ATTACH( 9999, 99992, 'file2.png', NULL, NULL, NULL)) AS TABLE_ESC_ATTACH)
              FROM
                   dual
         ) e ;
    Elapsed: 00:00:00.00
    Execution Plan
    Plan hash value: 1467416936
    | Id  | Operation                             | Name | Rows  | Bytes | Cost (%CPU)| Time     |
    |   0 | SELECT STATEMENT                      |      |     2 |     4 |    11   (0)| 00:00:01 |
    |   1 |  COLLECTION ITERATOR CONSTRUCTOR FETCH|      |     2 |     4 |    11   (0)| 00:00:01 |
    |   2 |   FAST DUAL                           |      |     1 |       |     2   (0)| 00:00:01 |
    Note
       - dynamic sampling used for this statement (level=2)
    I will be testing the last option, "Extensible Optimizer", and will post my findings here.
    I hope Oracle improves statistics gathering for collections used in DML in future releases, rather than just using the default derived from the block size.
    By the way, do you know why it uses the default based on the block size? Is it because that is the smallest granular unit Oracle provides?
    Regards,
    B

  • Oracle Quality Module - Collection Plan name

    The collection plan name field in the Oracle Quality module's collection plan form (QLTPLMDF) does not accept special characters. Why is this so, and what is the rationale behind it?
    Here is the wording from Oracle Quality user Manual
    Collection plan names can include alphanumeric characters and spaces. Collection
    plan names can have up to 30 characters, and the first 25 characters must be
    distinctive. The only special characters that can be included in a collection plan name
    are the underscore (_) and the single quotation mark (’).
    I am working with a customer who wants to use decimals (example: 35.24.101) so that searching will be easier. Some screens do not allow searching by description, so these need to be in the collection name/code.
    Any solution you can think of?
    Thanks

    Hi,
    When you create the collection plan, you are actually creating a table in the database, named after the plan, with the required collection elements as its columns.
    That is why the special characters you asked about are not allowed.
    thanks,
    Raja

  • Require  field name , Table name

    Hi All,
    For a Z development I need a field name and table name.
    The requirement: for which FG (finished goods) material is this SFG (semi-finished goods) being produced? Suppose A is the FG material,
    and X, Y, Z are the semi-finished materials required to produce A. When creating a production order for X, Y, or Z, it should tell us for which FG it is being produced.
    Please give your input
    Regards
    Anand Srinivasan

    Hi
    I have checked the RESB and AUFM tables before but couldn't get the FG link; an SFG is pegged to the next level only.
    Only if you keep passing up the pegged requirements does the FG link emerge.
    They want to generate this report at the time of production order creation for the SFG.
    The complication is that the same SFG can be used in many FGs.
    Regards
    Anand Srinivasan

  • Taglib problem: Cannot parse custom tag with short name table

    Hello!
    I am having problems deploying a JSP tag in Web AS. The same WAR file works fine on WebSphere and JBoss. SAP Web AS seems to be complaining about the short name in the TLD.
    Can anybody point me to any known Web AS issues with JSP tags?
    Thanks
    [code]
    Application error occurs during processing the request.
    Details: com.sap.engine.services.servlets_jsp.server.exceptions.WebIOException: Internal error while parsing JSP page /usr/sap/J2E/JC00/j2ee/cluster/server0/apps/sap.com/dispear/servlet_jsp/disp/root/test.jsp.
         at com.sap.engine.services.servlets_jsp.server.jsp.JSPParser.parse(JSPParser.java:85)
         at com.sap.engine.services.servlets_jsp.server.servlet.JSPServlet.getClassName(JSPServlet.java:207)
         at com.sap.engine.services.servlets_jsp.server.servlet.JSPServlet.compileAndGetClassName(JSPServlet.java:369)
         at com.sap.engine.services.servlets_jsp.server.servlet.JSPServlet.service(JSPServlet.java:164)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:385)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:263)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:340)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:318)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:821)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:239)
         at com.sap.engine.services.httpserver.server.Client.handle(Client.java:92)
         at com.sap.engine.services.httpserver.server.Processor.request(Processor.java:147)
         at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:37)
         at com.sap.engine.core.cluster.impl6.session.UnorderedChannel$MessageRunner.run(UnorderedChannel.java:71)
         at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
         at java.security.AccessController.doPrivileged(Native Method)
         at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:94)
         at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:162)
    Caused by: com.sap.engine.services.servlets_jsp.lib.jspparser.exceptions.JspParseException: Cannot parse custom tag with short name table.
         at com.sap.engine.services.servlets_jsp.lib.jspparser.syntax.xmlsyntax.CustomJspTag.action(CustomJspTag.java:129)
         at com.sap.engine.services.servlets_jsp.lib.jspparser.syntax.ElementCollection.action(ElementCollection.java:52)
         at com.sap.engine.services.servlets_jsp.server.jsp.JSPParser.initParser(JSPParser.java:307)
         at com.sap.engine.services.servlets_jsp.server.jsp.JSPParser.parse(JSPParser.java:74)
         ... 18 more
    Caused by: com.sap.engine.services.servlets_jsp.lib.jspparser.exceptions.JspParseException: Unknown class name java.lang.Object.
         at com.sap.engine.services.servlets_jsp.lib.jspparser.taglib.TagBeginGenerator.convertString(TagBeginGenerator.java:365)
         at com.sap.engine.services.servlets_jsp.lib.jspparser.taglib.TagBeginGenerator.generateSetters(TagBeginGenerator.java:187)
         at com.sap.engine.services.servlets_jsp.lib.jspparser.taglib.TagBeginGenerator.generateServiceMethodStatements(TagBeginGenerator.java:212)
         at com.sap.engine.services.servlets_jsp.lib.jspparser.taglib.TagBeginGenerator.generate(TagBeginGenerator.java:269)
         at com.sap.engine.services.servlets_jsp.lib.jspparser.syntax.xmlsyntax.CustomJspTag.action(CustomJspTag.java:127)
         ... 21 more
    [/code]

    Hi Ray,
    I am facing a similar kind of issue.
    Can you please help me resolve it?
    Thanks in advance.
    Logs are below (I am using the standard tag library here):
    Caused by: com.sap.engine.services.servlets_jsp.jspparser_api.exception.JspParseException: Cannot parse custom tag with short name [out].
         at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.xmlsyntax.CustomJspTag.action(CustomJspTag.java:183)
         at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.ElementCollection.action(ElementCollection.java:59)
         at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.JspIncludeDirective.action(JspIncludeDirective.java:51)
         at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.ElementCollection.action(ElementCollection.java:59)
         at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.JspElement.customTagAction(JspElement.java:994)
         at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.JspElement.action(JspElement.java:228)
         at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.ElementCollection.action(ElementCollection.java:59)
         at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.ElementCollection.action(ElementCollection.java:69)
         at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.GenerateJavaFile.generateJavaFile(GenerateJavaFile.java:72)
         at com.sap.engine.services.servlets_jsp.server.jsp.JSPProcessor.parse(JSPProcessor.java:270)
         at com.sap.engine.services.servlets_jsp.server.jsp.JSPProcessor.generateJavaFile(JSPProcessor.java:194)
         at com.sap.engine.services.servlets_jsp.server.jsp.JSPProcessor.parse(JSPProcessor.java:126)
         at com.sap.engine.services.servlets_jsp.jspparser_api.JSPChecker.getClassName(JSPChecker.java:319)
         at com.sap.engine.services.servlets_jsp.jspparser_api.JSPChecker.compileAndGetClassName(JSPChecker.java:248)
         at com.sap.engine.services.servlets_jsp.jspparser_api.JSPChecker.getClassNameForProduction(JSPChecker.java:178)
         at com.sap.engine.services.servlets_jsp.jspparser_api.JSPChecker.processJSPRequest(JSPChecker.java:109)
         at com.sap.engine.services.servlets_jsp.jspparser_api.JspParser.generateJspClass(JspParser.java:154)
         at com.sap.engine.services.servlets_jsp.server.servlet.JSPServlet.service(JSPServlet.java:193)
         ... 47 more
    Caused by: com.sap.engine.services.servlets_jsp.jspparser_api.exception.JspParseException: Attribute [value] of [<c:out>] can accept only static values.
         at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.taglib.TagBeginGenerator.calculateAttributeValue(TagBeginGenerator.java:476)
         at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.taglib.TagBeginGenerator.generateSetters(TagBeginGenerator.java:394)
         at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.taglib.TagBeginGenerator.generateServiceMethodStatements(TagBeginGenerator.java:562)
         at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.taglib.TagBeginGenerator.generate(TagBeginGenerator.java:678)
         at com.sap.engine.services.servlets_jsp.jspparser_api.jspparser.syntax.xmlsyntax.CustomJspTag.action(CustomJspTag.java:181)
         ... 64 more
    Regards,
    Sankalp

  • Merging a collection into a table

    I have a table with some blank columns.
    I take all the data from the table into a collection and then process the data to fill those empty columns.
    Now I have the collection with all the valid data.
    Is it possible to cast this same collection to a table and merge it back into the original table?
    I am trying to do so, but there is an error saying local collection types can't be used.
    Any help on this would be appreciated.

    Thanks a lot for your suggestion.
    I would like to give you an elaborate process.
    I have a table ABC. Some columns of that table are empty. I need to get those filled using some other tables.
    Firstly, using a cursor I fetch the records in a collection say V_ABC.
    Now in a FOR loop, I am fetching the missing data from other tables for every record in the collection and setting it in the empty places in the collection V_ABC itself.
    So finally i have the collection V_ABC which has complete data.
    The FOR loop is now ended.
    Now I try to merge this V_ABC into table ABC like this:
    MERGE INTO ABC abc
    USING TABLE(CAST(V_ABC AS TT_ABC)) z
    ON (abc.random_column_name = z.random_column_name)
    WHEN MATCHED THEN
    UPDATE
    SET abc.initially_empty_column = z.initially_empty_column
    The exact error is "PLS-00642: local collection types not allowed in SQL statements".
    I am forced to use PL/SQL because of performance issues.
    Could you suggest an alternate way?
    I guess one way would be to create another empty collection, EXTEND it, and build a replica of V_ABC. Then perhaps it would let me merge, but I am not sure.

  • Parse SQL: How to extract column names, table names from a SQL query

    Hi all,
    I have a requirement wherein a SQL query will be given in a text file.
    The text file has to be read (probably using text_io package), the SQL query needs to be parsed and the column names, table names and where clauses have to be extracted and inserted into separate tables.
    Is there a way to do it using PL/SQL ?
    Is there a PL/SQL script available to parse and extract column names, table names etc ?
    Pls help.
    Regards,
    Sam

    I think I jumped to conclusions too early by saying it is completely possible and straightforward. After reading through your post once more, I realised you are not interested only in the column names, but also the table names and the predicates.
    >
    SQL query needs to be parsed and the column names
    >
    The above is possible and straight forward using the dbms_sql package.
    I am pasting the same information as I did in the other forum.
    Check this link and search for Example 8.
    http://download.oracle.com/docs/cd/B19306_01/appdev.102/b14258/d_sql.htm#sthref6136
    Also check the working example from asktom
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:1035431863958
    >
    table names and where clauses have to be extracted
    >
    Now this is the tricky bit. You can extract the list of tables by getting the sql_id from v$sql and joining it with v$sql_plan. But sometimes you may not get all the tables, because the optimizer may choose to rewrite your query (check this link):
    http://optimizermagic.blogspot.com/2008/06/why-are-some-of-tables-in-my-query.html
    You can get the predicate information from the same view, v$sql_plan, but I will leave that area for you to do some R&D.
    Regards
    Raj
    Edited by: R.Subramanian on Dec 10, 2008 3:14 AM
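    For the simplest statements, the column/table/predicate split can even be approximated with plain text handling. The sketch below is deliberately naive (it only understands a flat `SELECT cols FROM tables [WHERE pred]` shape, with no subqueries or JOIN syntax) and is no substitute for a real parser such as the DBMS_SQL approach linked above:

```python
import re

def split_select(sql):
    """Naively split a flat SELECT into (columns, tables, predicate).

    Raises ValueError for anything it does not recognise."""
    m = re.search(
        r"select\s+(?P<cols>.*?)\s+from\s+(?P<tabs>.*?)"
        r"(?:\s+where\s+(?P<pred>.*?))?\s*;?\s*$",
        sql, re.IGNORECASE | re.DOTALL)
    if not m:
        raise ValueError("not a simple SELECT statement")
    cols = [c.strip() for c in m.group("cols").split(",")]
    tabs = [t.strip() for t in m.group("tabs").split(",")]
    return cols, tabs, (m.group("pred") or "").strip()
```

    For example, `split_select("SELECT a, b FROM t1, t2 WHERE a = b")` returns `(['a', 'b'], ['t1', 't2'], 'a = b')`; anything more complex is exactly why the DBMS_SQL/v$sql_plan route above is the safer one.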

  • Using the collection as a Table

    Hi,
    I want to use a collection as a table. Can anyone give me a code sample for that?
    I receive a collection through an IN parameter and want to join its rows with another table to produce another collection as output. I want that because if I do the operation in a loop, taking the elements one by one and comparing them with the DB table records, performance will degrade.
    Please show me how to use a collection as a table so it can be joined with another table.
    Thanks in advance
    Guljeet

    Something like this probably will do for you.
    SQL> create table t1 as
      2  select 1 col1 from dual union all
      3  select 2 from dual union all
      4  select 3 from dual;
    Table created
    SQL> create or replace type typ_row as object (col1 number, col2 varchar2(10));
      2  /
    Type created
    SQL> create or replace type typ_table as table of typ_row
      2  /
    Type created
    SQL> set serveroutput on
    SQL>
    SQL> declare
      2    tab_test typ_table := typ_table(typ_row(1, 'A'), typ_row(3, 'C'));
      3    cursor cur1 is
      4      select t1.col1, t2.col2
      5        from t1
      6        join table(tab_test) t2 on t1.col1 = t2.col1; --use table() and put your collection in it then join
      7  begin
      8    for rec in cur1 -- sample execution just for displaying the join performed
      9    loop
    10      dbms_output.put_line(rec.col1 || ' ' || rec.col2);
    11    end loop;
    12  end;
    13  /
    1 A
    3 C
    PL/SQL procedure successfully completed
    SQL>
    Or, after creating the structures, simply run something like:
    SQL> select t1.col1, t2.col2
      2        from t1
      3        join table(typ_table(typ_row(1, 'A'), typ_row(3, 'C'))) t2 on t1.col1 = t2.col1;
          COL1 COL2
             1 A
             3 C
    SQL>

  • System Copy -Solman 7.1 Adjust name table phase error

    Dear Experts,
    I am performing a system copy for Solution Manager 7.1 from the production system to the development system. We are encountering an error in the adjust name table phase. Below is the log:
    ERROR in initialization (can't get R/3-version)
    ERROR in initialization (can't get R/3-version)
    D:\usr\sap\\D00\log\SLOG00.LOG: No such file or directory
    rslgwr1(20): rstrbopen cannot open pre-existing SysLog file.
    D:\usr\sap\\D00\log\SLOG00.LOG: No such file or directory
    rslgwr1(11): rstrbopen cannot open SysLog file.
    SysLog:iE1020140515082322000000000000  this TTY           
          :                                                            0000
          :SCSA         4096                                              
    SysLog:hBZA20140515082322000000000000  this TTY           
          :                                                            0000
          :SVERS                                               dblink  1323
    OS: Windows 2008
    Database: Sybase ASE 15.7. And the weird thing is that my SID is different.
    Appreciate your responses
    Thanks
    Asim

    Hi Asim,
    1. I was getting an error in the ABAP load phase, so I merged the TSK and BAK files with R3load and manually set the status to "ign" in the TSK files. Should I change the "ign" status back? Could this be causing the problem in the adjust name table phase?
    Status "ign" will cause the object to be skipped on import into the database. This may lead to inconsistency and loss of data.
    2. We did the ABAP export using sapinst only. Will that cause problems when importing with SWPM?
    Actually, SWPM also contains the sapinst tool. But as I have never used this approach, I would suggest you give it a try and check.
    Otherwise you may need to perform a fresh export and then import using SWPM.
    Hope this helps.
    Regards,
    Deepak Kori

  • Collection Specialist - Collections Profile Tab of Business Partner.

    Hi,
    We are using the SAP FSCM Collections Management module. When I assign the collection profile to the Business Partner on the Collections Profile Tab, the Collection Segment, Collection Group are populated based on the configuration but the Collections Specialist does not get populated.
    Can someone please throw some light on this if they have experienced this issue?
    Thanks & Regards,
    Bhairav Naik.

    Hello Mark, actually I don't think it is possible to transfer the collection-profile data (displayed in BP master data for a BP with the collections management role) from the customer master data, because to my knowledge this kind of information is not available in the customer master.
    A solution could be to transfer all the other available data from the customer data to the BP and, once transferred, update the collection profile manually.
    I have also seen a program (transaction UDM_BP_PROF) to update the collection profile automatically, but I did not manage to make it work on an existing BP.
    What do you think about this approach?

  • Package name table?

    Hi friends, I couldn't find the package name table.
    Help me please.
    Thanks.

    Hi,
    The table is TDEVC ---> Packages.
    For reference, check table TADIR; there is a field called DEVCLASS. See the ENTRY HELP/CHECK tab for that field and you will find the check table TDEVC.
    Hope this would help you.
    Good luck
    Narin.

  • Worklist items distribution to collection specialist

    Dear All,
    Doubt: worklist item distribution method for collection specialists.
    Scenario: specialists are not assigned in the business partner master record; worklist items are distributed by the equal distribution method. The worklist is generated on a daily basis.
    Today I run the UDM_GENWL program and distribute the worklist items equally. After that I reassign some of the worklist items from one specialist to another (i.e. manually), and all the specialists process their items for the day.
    The next day I once again run the UDM_GENWL program and distribute the worklist items equally.
    Question:
    Will the program now assign worklist items to the same specialist who processed them yesterday, or will it allocate them to other specialists?
    (It may go to the same specialist or a different one; since the collection group may have 3 or 4 specialists, the probability of allocation to the same specialist is high.)
    Will it overwrite all the worklist item assignments, including the manually allocated ones, with the equal distribution formula?
    How does the distribution algorithm work?
    Thank you.
    Regards,
    D vasanth

    When the standard worklist generation program runs, it first evaluates the BPs based on collection strategies and then generates the worklist entries. If the BP contains a collection specialist entry, the worklist item gets assigned to that specialist.
    For the rest of the unassigned entries, if you employ the standard distribution procedure of even distribution, they will be distributed to the collection specialist with the smallest number of worklist entries. If the numbers of entries per specialist are the same, the next item is assigned to the collection specialist with the lowest average valuation.
    If you wish to have the same BP assigned to the same collection specialist in every worklist generation run, assign it in the master data itself.
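    The even-distribution rule described above (each unassigned item goes to the specialist with the fewest worklist entries, ties broken by the lowest average valuation) can be sketched as follows. This only illustrates the stated rule; the function and data shapes are made up, not SAP's internals:

```python
def distribute(items, specialists):
    """Assign worklist items (bp, valuation) to specialists.

    Each item goes to the specialist with the fewest entries so far;
    ties are broken by the lowest average valuation."""
    load = {s: [] for s in specialists}

    def avg_valuation(s):
        values = [v for _, v in load[s]]
        return sum(values) / len(values) if values else 0.0

    for item in items:
        # fewest entries first, then lowest average valuation
        target = min(specialists,
                     key=lambda s: (len(load[s]), avg_valuation(s)))
        load[target].append(item)
    return load
```

    Run on four items and two specialists, this alternates assignments so each specialist ends up with two items, which matches the balancing behaviour described.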

  • Collection specialists

    Do collection specialists need to be created as business partners? Under what role?

    You can assign collection specialists (SAP user IDs) at two levels:
    1) at the group level (as mentioned by Mark above), and then during UDM_GENWL the worklist entries are assigned to collection specialists as per the distribution method defined;
    OR
    2) directly in the BP record itself. FYI, this assignment is at the segment level.

  • Comparing String values against a collection of Names in a Hash Table

    Objective:
    To make a script that imports a CSV file containing two values, "Name" and "Price". Ideally this is stored in a hash table with "Name" as the key and "Price" as the value. The second part is importing a second CSV file with a list of names to compare against. If it finds a close match to a key in the hash table, it adds that entry, with its price, to a new array. At the end it sums all the prices and gives a total value.
    The Problem to Solve:
    In the real world people tend not to write names exactly the same way. For example, I am looking at a list of books to buy in an eBay auction. The auction provides a text list of all the book names for sale, and my price guide has the names and dollar values of each book. The way each book is named can differ between what the seller writes and what is in my reference pricing list, e.g. "The Black Sheep" vs "Black Sheep", or "Moby-Dick" vs "Moby Dick".
    I've tried making a script that compares these values with the -like operator and have only had about 70% accuracy. Is there a way to increase that by comparing characters instead of whole-word likeness? I'm not really sure how to solve this, since it's very hard to quality-check the input when you're talking about hundreds of names in the list. Is there a better way to compare values in PowerShell than the -like operator? Do I need to use a database instead of a hash table? A search engine would recognize these variations and still return the desired results, so why not for this kind of application?
    In other words, add a bit more intelligence: it's not a 100% match but 90%, so that is close enough; treat it as a match, add it to the array, add the price, etc.
    I'd be curious about any thoughts on this. Maybe a scripting language with better text matching?

    Have you considered setting up a manual correction process that "learns" as you make corrections, automatically building a lookup table of possible spellings of each name? If you get an exact match, use it. If not, go to the lookup table and see if there has been a previous entry with the same spelling and what it was corrected to.
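    Another option for the "90% is close enough" idea is a similarity ratio rather than a wildcard match. Here is a sketch in Python using the standard-library difflib module (the titles, prices, and threshold are made-up examples; PowerShell can get the same effect via .NET string-distance libraries):

    ```python
    from difflib import SequenceMatcher

    def similarity(a, b):
        # Ratio in [0, 1]; 1.0 means the strings are identical (case-insensitive here).
        return SequenceMatcher(None, a.lower(), b.lower()).ratio()

    # Hypothetical price guide: name -> price.
    price_guide = {"Moby Dick": 12.50, "The Black Sheep": 8.00}

    def best_match(name, guide, threshold=0.85):
        # Return (key, price) of the closest guide entry scoring above the threshold,
        # or None if nothing is similar enough.
        scored = [(similarity(name, key), key) for key in guide]
        score, key = max(scored)
        return (key, guide[key]) if score >= threshold else None

    # "Moby-Dick" differs from "Moby Dick" only by a hyphen, so it scores
    # well above the 0.85 threshold and matches.
    match = best_match("Moby-Dick", price_guide)
    ```

    The threshold is the tuning knob: raising it reduces false matches at the cost of missing more variant spellings, which is exactly the 70%-accuracy trade-off described above.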

  • Assign a collection field name dynamically

    Hi,
    I am new to this forum. Could you please help me out with this?
    I have a main collection that has 20 columns. I need to pass each of these 20 columns to another function to validate some functionality. I am thinking of doing that in another loop: I put all 20 column names in a nested table collection and get the column name from the second loop. Can I combine that with the first collection to get the value of the column with that name?
    I will give an example with 2 loops.
    DECLARE
        v_name VARCHAR2(100);
        v1     VARCHAR2(100);
    BEGIN
        FOR i IN (SELECT * FROM employees)
        LOOP
            FOR j IN (SELECT 'DEPARTMENT_ID' column_name FROM dual)
            LOOP
                v_name := j.column_name;
                v1     := i.v_name;
                -- I need to get the value of "i.department_id" into variable v1,
                -- so I can pass i.department_id (and the other ~20 columns) to
                -- another function. That is the need for the second loop.
                DBMS_OUTPUT.PUT_LINE('V1' || v1);
            END LOOP;
        END LOOP;
    END;
    This code is not compiling. It would be great if you could assist me with this.
    My Oracle version is 11g. If you need any further information, I can provide it.

    The real question is why you want to do this. What problem are you trying to solve?
    Yes, one can create an array/collection SQL projection, as opposed to the typical projection containing scalar (single-column) values. But I would hesitate to recommend this approach, as it has issues (such as the array supporting only a single data type) that need to be taken into consideration.
    Basic example:
    // create a SQL collection/array type
    SQL> create or replace type TStrings is table of varchar2(4000);
      2  /
    Type created.
    // use the array type to construct the SQL projection as an array of
    // column values - as the array is of type string, all columns need to
    // be of type string, or needs to be converted to type string
    SQL> begin
      2          for c in(
      3                  select
      4                          TStrings(
      5                                  to_char(empno),
      6                                  ename,
      7                                  job,
      8                                  to_char(hiredate,'yyyy/mm/dd')
      9                          ) as COL_LIST
    10                  from    emp
    11                  where   rownum <= 2
    12          ) loop
    13                  dbms_output.put_line( '**new row**' );
    14                  for i in 1..c.Col_List.Count loop
    15                          dbms_output.put_line( 'col'||i||'='||c.Col_List(i) );
    16                  end loop;
    17          end loop;
    18  end;
    19  /
    **new row**
    col1=7369
    col2=SMITH
    col3=CLERK
    col4=1980/12/17
    **new row**
    col1=7499
    col2=ALLEN
    col3=SALESMAN
    col4=1981/02/20
    PL/SQL procedure successfully completed.
    SQL>

Maybe you are looking for

  • How do I prevent the use of a digital signature?

    I realize this flies in the face of some core security issues built into PDFs, but here we go... We have a customer for whom we are creating fillable PDF forms.  Due to the nature of the documents, they require handwritten signatures on printed hard

  • Audio Levels Are Different within program

    For some reason when I am editing a multi-track project I seem to get three different levels as I go along.  When I am editing a single file the levels come out as one thing.  When I get the levels good, usually around -4 or so I play them back on th

  • Minimum Order qty in po

    Hi While creating purchase order for a particular material, system should take minimum order quantity. For example, i create po 10ea minimum order quantity 25.then system should automatically should pick 25. 1)pls tell me where to define minimum orde

  • How to convert hex string to time stamp?

    Hello everyone.. I am currently working on a project in which I have to read the data from a unit and display that data using LabVIEW. I am using serial communication for reading the data. The read data is in hex format.  Now, I want to convert this

  • Issues outputting to pdf

    I have created a number of documents in Indesign CS5 and have output them as an interactive pdf using the default setting for this. I have two questions which my client has asked me to resolve and fear that there is nothing I can do as these are pref