Unable to pass comma separated values for in clause

I have the following query. For the :P_LEG_NUM parameter, when I pass values like 1,2,5 as a string I get an "invalid number" error. I have defined an IN clause for it, but it still does not work. For individual values like 2 it works. How can I pass comma separated values for this bind variable?
select trip_number as prl_trip_number,
       flight_number as prl_f_number,
       trip_leg_id as prl_trip_leg_id,
       leg_number as prl_leg_num,
       dicao as prl_dicao,
       etd_zulu as prl_etd_zulu,
       aicao as prl_aicao,
       eta_zulu as prl_eta_zulu,
       to_char(etd_zulu,'DD-Mon-YYYY HH24:MI') as prl_cb_etd,
       to_char(eta_zulu,'DD-Mon-YYYY HH24:MI') as prl_cb_eta,
       diata as prl_diata,
       aiata as prl_aiata,
      (select client_name
       from xxwfs_trip_header_details t_h
       where t_h.trip_number = t_leg.trip_number) as prl_client_name,
      (select to_char((select systimestamp at time zone 'GMT' from dual),'YYYY-MM-DD-HH24MI')
       from dual) as prl_curr_zulu_date
from xxwfs_trip_leg_details t_leg
where 1=1
and t_leg.leg_number in nvl(:P_LEG_NUM,t_leg.leg_number)
and t_leg.trip_number = :P_trip_no

This is the problem known as the 'Varying IN List' issue.
Check this - SQL and PL/SQL FAQ
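A common workaround (a minimal sketch only, assuming :P_LEG_NUM is either NULL or a plain string such as '1,2,5', and that leg_number is numeric; columns trimmed for brevity) is to split the bind variable into rows inside the IN clause instead of comparing against it directly:

-- split '1,2,5' into one row per value, then convert each piece to a number
select trip_number, leg_number
from   xxwfs_trip_leg_details t_leg
where  t_leg.trip_number = :P_trip_no
and    ( :P_LEG_NUM is null
         or t_leg.leg_number in
              ( select to_number(regexp_substr(:P_LEG_NUM, '[^,]+', 1, level))
                from   dual
                connect by level <=
                  length(:P_LEG_NUM) - length(replace(:P_LEG_NUM, ',')) + 1 ) )

The NVL trick in the original query fails because Oracle tries to convert the whole string '1,2,5' to a single number, which raises the "invalid number" error.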

Similar Messages

  • Comma separated values for input and return multiple values

    Hello everyone,
    I have this simple package. Can someone suggest a way to accept multiple empno values as input (comma separated) and return a set of salary values for that set of employee numbers (compatible with lower Oracle versions)? Thanks much!
    CREATE OR REPLACE PACKAGE test_multi IS
      FUNCTION GET_sal(P_empno IN emp.empno%TYPE) RETURN NUMBER;
    END test_multi;
    CREATE OR REPLACE PACKAGE BODY test_multi IS
      FUNCTION GET_sal(P_empno IN emp.empno%TYPE) RETURN NUMBER IS
        V_sal NUMBER(10,2);
        MSG   VARCHAR2(200);
      BEGIN
        SELECT sal
        INTO   V_sal
        FROM   emp
        WHERE  empno = p_empno;
        RETURN V_sal;
      EXCEPTION
        WHEN NO_DATA_FOUND THEN
          DBMS_OUTPUT.PUT_LINE('No data found.');
          IF (V_sal IS NULL OR V_sal = 0) THEN
            V_sal := 0;
          END IF;
          RETURN V_sal;
        WHEN OTHERS THEN
          MSG := SUBSTR(SQLERRM, 1, 70);
          DBMS_OUTPUT.PUT_LINE(MSG);
      END GET_sal;
    END test_multi; -- End package

    A way to do this in 10g or above...
    SQL> ed
    Wrote file afiedt.buf
      1  with e as (select '7499,7698,7654,7902' as enos from dual)
      2  --
      3  select empno, sal
      4  from emp
      5  where empno in (select regexp_substr(enos,'[^,]+',1,rownum)
      6                  from   e
      7*                 connect by rownum <= length(regexp_replace(enos,'[^,]'))+1)
    SQL> /
         EMPNO        SAL
          7902       3000
          7698       2850
          7654       1250
          7499       1600
    SQL>
    As for Oracle 8... well... like Oracle, I no longer use unsupported versions, so I'd recommend you upgrade to something that is supported.
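    If a packaged routine is still wanted rather than a plain query, a rough sketch (the GET_SALS name and the ref-cursor return type are my own assumptions, not part of the original package) could reuse the same split:
    CREATE OR REPLACE FUNCTION get_sals (p_empnos IN VARCHAR2)
      RETURN SYS_REFCURSOR
    IS
      v_rc SYS_REFCURSOR;
    BEGIN
      -- split the comma separated list and use it as an IN-list
      OPEN v_rc FOR
        SELECT empno, sal
        FROM   emp
        WHERE  empno IN
               ( SELECT TO_NUMBER(REGEXP_SUBSTR(p_empnos, '[^,]+', 1, LEVEL))
                 FROM   dual
                 CONNECT BY LEVEL <=
                   LENGTH(p_empnos) - LENGTH(REPLACE(p_empnos, ',')) + 1 );
      RETURN v_rc;
    END get_sals;
    /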

  • How to search for a particular text in comma separated values

    Hi,
    I have one table, for example TB_Fruits.
    It has one column, FruitsName (VARCHAR), in which I store a string of comma separated values.
    Select FruitsName from tb_fruits;
    Result: orange,banana,apple
    Now the issue: if I try to insert any of these fruit names again, the insert should not be allowed.
    Suppose now I try to insert ('grapes,banana')
    or
    ('apple,grapes')
    The values orange,banana,apple can be in any position.
    How do I check whether any of these names already exists in the column FruitsName?
    I cannot use LIKE or INSTR here, because neither the position nor the string is fixed.
    Appreciate any help.

    After doing some searching, I got to know that words of length <= 3 are in the stoplist.
    That's why the value ALL was not being searched in the index.
    After modifying the index this problem is solved.
    CREATE INDEX Fruitsname_idx ON tb_fruits (FruitsName)
      INDEXTYPE IS ctxsys.context
      PARAMETERS('SYNC (ON COMMIT) stoplist ctxsys.empty_stoplist');
    But now there is an issue when a value contains a space.
    I inserted one more row with the value 'FRUITS YELLOW'.
    The index stores two rows for it: one for FRUITS and a second for YELLOW.
    select * from tb_fruits t where contains(t.FruitsName,'FRUITS')>0
    This returns a record, but actually there should be no record, and the insert should be allowed so that I can insert the value FRUITS in another row.
    Any help on how to store a value containing a space as one row in the index?
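    For what it's worth, a plain SQL overlap check is also possible (a sketch only; it pads both sides with delimiters so the position of the fruit does not matter, but it cannot use an index and may be slow on large tables):
    -- :new_val is the candidate string, e.g. 'grapes,banana'
    SELECT COUNT(*) AS clashes
    FROM   tb_fruits t
    WHERE  EXISTS (
             SELECT 1
             FROM   ( SELECT REGEXP_SUBSTR(:new_val, '[^,]+', 1, LEVEL) AS fruit
                      FROM   dual
                      CONNECT BY LEVEL <=
                        LENGTH(:new_val) - LENGTH(REPLACE(:new_val, ',')) + 1 ) n
             WHERE  ',' || t.FruitsName || ',' LIKE '%,' || n.fruit || ',%' );
    If the count is greater than zero, at least one fruit in the new value already exists somewhere in the table.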

  • Custom Document for Comma Separated Values in a JTextField

    Hi,
    I tried to create a custom structured Document implementation for comma separated values (on one line) in a JTextField:
    I want to have one LeafElement for each value (to store as an attribute the Object associated with this value). So, if I plug a DocumentListener to the JTextField, I can easily see which value(s) have been added/removed (see ElementChange).
    I used the PlainDocument implementation as a starting point and replaced the logic related to the '\n' character by the ',' (see #insertUpdate).
    Then I realized that the View must be customized too. So I created a View with a custom #paint method to draw each LeafElement on the same line.
    But, now, I try to fix the ElementChange returned by the #insertUpdate and #removeUpdate methods.
    Indeed, I played with a JTextArea with two lines "Bart\nisa" and plugged a debugging DocumentListener.
    And the event I get when I type 'L' before "isa" is really strange:
    - Two elements removed,
    - Two elements added.
    In PlainDocument#insertUpdate, the code updates the 'offset' but I don't understand the logic behind it.
    Could you explain to me why I get this result?
    Could you give me a hint?
    Best Regards,
    El Barto.

    When you say the offset gets updated, I assume you're talking about this bit:
         int offset = chng.getOffset();
         int length = chng.getLength();
         if (offset > 0) {
           offset -= 1;
           length += 1;
         }
    The result is that, if a character is added at the very beginning or end of a line, the edit gets treated as a multi-line edit, which means the LeafElements on either side of the edit get removed and reconstructed. I discovered this behavior a few years ago, and I never have figured out what purpose it serves. I think you'll just have to try overriding the method to eliminate that behavior, and see what happens.

  • Exporting Metadata (caption information) from JPEGS to a comma separated value (CSV) file

    Here is my dilemma. I am an archivist at an arts organization and we are in the process of digitizing many of our materials to post them on the web and make them available to internet users. One of the principal components of our collection is a large trove of photographs. We have been in the process of digitizing these images and embedding metadata (in the Caption/Description, Author/Photographer and Copyright fields) via Photoshop's File Info command.
    Now I am at a crossroads. We need to extract this metadata and transfer it into a comma separated value form, like an Excel spreadsheet or a FileMakerPro database. I have been told that it is not possible to do this through PhotoShop, that I must run a script through Acrobat or Bridge. I have no clue how to do this. I have been directed to a couple of links.
    First I was directed to this (now dead) link: http://www.barredrocksoftware.com/products.html
    The BSExportMetadata script allegedly exports the metadata from files selected in Adobe's Bridge into a comma separated value (CSV) file suitable for import into Excel, Access and most database programs. It installs as a Bridge menu item, making it simple to use. The Export Metadata script provides you with an easy-to-use wizard allowing you to select associated information about a set of images that you can then export. This script requires Creative Suite 2 (CS2). This script sounds like it does exactly what I want to do, but unfortunately, it no longer exists.
    Then I found this:
    Arnold Dubin, "Script to Export and Import Keywords and Metadata" #13, 8 Aug 2005 7:23 am
    I tried this procedure, but nothing seemed to happen. I also tried to copy the script into the JavaScript action option in Acrobat, but I received a message that the script had an error. It also seems to me that this script does not set up a dumping point, that is, a file into which this information will be exported.
    I am a novice, not a code writer or a programmer/developer. I need a step-by-step explanation of how to implement this filtering of information. We have about 2000 jpeg and tiff files, so I would rather not go through each file and copy and paste this information elsewhere. I need to find out how to create a batch process that will do this procedure for me. Can anyone help?

    Hello -
    Is anyone aware of a tool that will do the above that is available for mac? Everything I've found so far seems to be PC only.
    Any help is appreciated, thanks!

  • Comma Separated Value Taking too Much Time to Execute

    Hi,
    select ES_DIAGNOSIS_CODE DC from ecg_study WHERE PS_PROTOCOL_ID LIKE 'H6L-MC-LFAN'
    The above query returns a comma separated value.
    I am using the query below to split the comma separated value, but it is taking a lot of time to return the data.
    select DC from (
    with t as ( select ES_DIAGNOSIS_CODE DC from ecg_study WHERE PS_PROTOCOL_ID LIKE 'H6L-MC-LFAN' )
    select REGEXP_SUBSTR (DC, '[^,]+', 1, level) DC from t
    connect by level <= length(regexp_replace(DC,'[^,]*'))+1 )
    Please suggest an alternative way to handle this comma separated value.
    Thanks
    Sudhir

    Nikolay Savvinov wrote:
    Hi BluShadow,
    I know that this function is fast with varchar2 strings from several years of using it. With CLOBs one may need something faster, but the OP didn't mention CLOBs.
    Best regards,
    Nikolay
    Just because you perceive it to be fast doesn't mean it's faster than doing it in SQL alone.
    For starters you are context switching from the SQL engine to PL/SQL to call it.
    Then in your code you are doing this...
    select substr(v_str,v_last_break+1, decode(v_nxt_break,0,v_length, v_nxt_break-v_last_break-1)) into v_result from dual;
    which is context switching back from the PL/SQL engine to the SQL engine for each entry in the string.
    Why people do that I don't know... when PL/SQL alone could do it without a context switch e.g.
    v_result := substr(v_str,v_last_break+1, case when v_nxt_break = 0 then v_length else v_nxt_break-v_last_break-1 end);
    So, if you still think it's faster than pure SQL (which is what the OP is using), please go ahead and prove it to us.
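    For reference, a split loop that stays entirely in PL/SQL (a sketch along the lines described above: plain INSTR/SUBSTR, no SELECT ... FROM dual per element, so no context switch) might look like:
    DECLARE
      v_str    VARCHAR2(4000) := 'A,BB,CCC';
      v_start  PLS_INTEGER := 1;
      v_pos    PLS_INTEGER;
      v_result VARCHAR2(4000);
    BEGIN
      LOOP
        v_pos := INSTR(v_str, ',', v_start);
        IF v_pos = 0 THEN
          v_result := SUBSTR(v_str, v_start);               -- last element
        ELSE
          v_result := SUBSTR(v_str, v_start, v_pos - v_start);
        END IF;
        DBMS_OUTPUT.PUT_LINE(v_result);
        EXIT WHEN v_pos = 0;
        v_start := v_pos + 1;
      END LOOP;
    END;
    /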

  • Query on column with comma separated values

    I have a proposed table with unnormalized data like the following:
    ID COLA COLB REFLIST
    21 xxx  zzz  24,25,78,412
    22 xxx  xxx  21
    24 yyy  xxx  912,22
    25 zzz  fff  433,555,22
    .. ...  ...  ...
    There are 200 million rows. There is a maximum of about 10 IDs in the REFLIST, though typically two or three. How could I efficiently query this data on the REFLIST column? e.g. something like:
    SELECT id FROM mytable WHERE :myval in reflist
    Logically there is a many to many relationship between rows in this table. The REFLIST column contains pointers to ID values elsewhere in the table. The data could be normalized so that the relationship keys are in a separate table (in fact this is the current solution that we want to change).
    ID  REF
    21  24
    21  25
    21  78
    21  412
    22  21
    24  912
    ... ...
    The comma separated list seems instinctively like a bad idea, however there are various reasons for proposing it. The main reason is because the source for this data has it structured like the REFLIST example. It is an OLTP-like system rather than a data warehouse. The source code (and edit performance) would benefit greatly from not having to maintain the relationship table as the data changes.
    Going back to querying the REFLIST column, the problem seems to be building an approriate index for the data. The ideas proposed so far are:
    - Make a materialized view that presents the relationships as normalized (e.g. as in the example with ID, REF columns above), then index the plain column - the various methods of writing the view SQL have been widely posted.
    - Use an Oracle Text index (not something I have ever had call to use before).
    Any other ideas? It's Oracle 10.2, though 11g could be possible.
    Thanks
    Jim

    Something like this ?
    This is a test demo on my 11.2.0.1 Windows XP
    SQL> create table test (id number,reflist varchar2(30));
    Table created.
    SQL> insert into test values (21,'24,25,78,412');
    1 row created.
    SQL> insert into test values (22,'21');
    1 row created.
    SQL> insert into test values (24,'912,22');
    1 row created.
    SQL> insert into test values (25,'433,555,22');
    1 row created.
    SQL> select * from test
      2  where
      3  ',' || reflist || ',' like '%,22,%';
            ID REFLIST
            24 912,22
            25 433,555,22
    SQL>
    Source: http://stackoverflow.com/questions/7212282/is-it-possible-to-query-a-comma-separated-column-for-a-specific-value
    Regards
    Girish Sharma
    Edited by: Girish Sharma on Jul 12, 2012 2:31 PM
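    A sketch of the materialized-view idea mentioned in the question (the MV name, the refresh mode and the split limit of 10 entries per REFLIST are assumptions on my part, based on the test table above) could look like this; the normalized REF column can then be indexed and queried directly:
    CREATE MATERIALIZED VIEW mv_test_refs
    REFRESH COMPLETE ON DEMAND
    AS
    SELECT t.id,
           TO_NUMBER(REGEXP_SUBSTR(t.reflist, '[^,]+', 1, n.pos)) AS ref
    FROM   test t
           JOIN ( SELECT LEVEL AS pos FROM dual CONNECT BY LEVEL <= 10 ) n
             ON n.pos <= LENGTH(t.reflist) - LENGTH(REPLACE(t.reflist, ',')) + 1;

    CREATE INDEX mv_test_refs_ref_idx ON mv_test_refs (ref);

    -- then: SELECT id FROM mv_test_refs WHERE ref = :myval;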

  • Enable comma separated values (CSV) output

    I am trying to figure out how to enable comma separated values (CSV) output for a report. Do you have an example or info on how to do that?

    Can you provide an example of how you completed the setup? I am in the same boat but can't find an example on this subject.

  • Passing comma separated string to stored procedure

    Hi,
    There is a thread with the same query that I created earlier, and it was answered. That solution worked when I passed a comma separated string containing IDs. But due to changes in the logic, I have to pass usernames instead of user IDs. I tried to modify the provided solution to work with this.
    Following the link to previous post :
    Re: Passing comma separated string to stored procedure
    ------Package-------
    TYPE refcurQID IS REF CURSOR;
    TYPE refcurPubs IS REF CURSOR;
    PROCEDURE GetAllPersonalQueue (p_user_name IN nvarchar2,
                                   TestQID  OUT Test.refcurQID,
                                   TestPubs OUT Test.refcurPubs);
    ------Package-------
    ------Package Body-------
    PROCEDURE GetAllPersonalQueue (p_user_name IN nvarchar2,
                                   TestQID  OUT Test.refcurQID,
                                   TestPubs OUT Test.refcurPubs) AS
    BEGIN
      OPEN TestQID FOR
        SELECT id FROM cfq WHERE name IN (p_user_name);
      OPEN TestPubs FOR
        SELECT qid FROM queues WHERE qid IN (
          SELECT id FROM cfq WHERE name IN (p_user_name));
    END GetAllPersonalQueue;
    ------Package Body-------
    Thanks in advance
    Aditya

    Hi,
    I modified the query as per the solution provided by isotope, after which the logic changed and I am now passing usernames instead of user IDs in the comma separated string.
    Following is the changed SP, which does not throw any error, but returns no data.
    PROCEDURE GetAllPersonalQueue (p_user_name IN nvarchar2,
                                   TestQID  OUT Test.refcurQID,
                                   TestPubs OUT Test.refcurPubs) IS
      --local variable
      strFilter varchar2(100);
    BEGIN
      OPEN TestQID FOR
        SELECT id, name FROM cfq WHERE name IN (
          SELECT regexp_substr(p_user_name||',','[a-z]+[0-9]+',1,level)
          FROM   dual
          CONNECT BY level <= (SELECT max(length(p_user_name)-length(replace(p_user_name,',')))+1
                               FROM dual));
      OPEN TestPubs FOR
        SELECT qid FROM queues WHERE qid IN (
          SELECT id FROM cfq WHERE name IN (
            SELECT regexp_substr(p_user_name||',','[a-z]+[0-9]+',1,level)
            FROM   dual
            CONNECT BY level <= (SELECT max(length(p_user_name)-length(replace(p_user_name,',')))+1
                                 FROM dual)));
    END GetAllPersonalQueue;
    Edited by: adityapawar on Feb 27, 2009 8:38 AM
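    One possible reason for getting no data back (just a guess from the posted code): the pattern '[a-z]+[0-9]+' only matches lowercase letters followed by digits, so any username that does not fit that shape splits to NULL and the IN-list ends up empty. A more generic split simply takes everything between the commas:
    -- sketch: split on commas regardless of what the usernames look like
    select regexp_substr(p_user_name, '[^,]+', 1, level)
    from   dual
    connect by level <= length(p_user_name) - length(replace(p_user_name, ',')) + 1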

  • Comma Separated Values files ?!!

    Are there special classes for dealing with CSV (comma separated values) files, to use a text file just like a database?
    thanks in advance

    Sorry, it just sounds like a strange thing to do. Normally, you would not use CSV for persistent data storage with a high-level language. In your design, if you wanted to refer to a particular record, you get into all sorts of problems. You could keep a note of where each record is, like what line number it's on, but then you would need some way of modifying this note when you add records. Also, you would have to be sure that no one is accessing the file outside of the program, as then things will not be where the program expects them to be. You could refer to each record using some identifier, but then if you are going to start using all of this, you may as well go the database route. If you are new to Java, it will probably help you learn. I just don't envisage you ever having to use the type of design you are talking of in practice. The closest thing you might ever use it for is a plain text configuration or parameter file for a program.

  • Comparing 2 comma separated value strings

    Hi
    I have the following requirement where I need to compare 2 strings containing comma separated values.
    declare
    v_Str1 varchar2(100) ;
    v_str2 varchar2(100);
    begin
    v_str1 := '123,234,456' ;
    v_str2 := '123,234';
    /* I need to write logic to compare the above 2 strings.
       Could you please give me a hint to achieve this? */
    end;
    Thanks
    Edited by: smile on Mar 13, 2012 8:20 PM

    Try this
    declare
    v_Str1 varchar2(100) ;
    v_str2 varchar2(100);
    begin
    v_str1 := '123,234,456,4364' ;
    v_str2 := '123,234';
    For cur_rec in (
      select REGEXP_SUBSTR (v_str1, '[^,]+', 1, level) output
      from dual
      connect by level <= regexp_count(v_str1,',')+1
      MINUS
      select REGEXP_SUBSTR (v_str2, '[^,]+', 1, level) output
      from dual
      connect by level <= regexp_count(v_str2,',')+1) loop
      DBMS_OUTPUT.PUT_LINE(cur_rec.output);
    End loop;
    End;
    4364
    456
    PL/SQL procedure successfully completed
    Another format
    declare
    v_Str1 varchar2(100) ;
    v_str2 varchar2(100);
    v_Str3 varchar2(100);
    begin
    v_Str1 := '600,100,500,200,300,400' ;
    v_Str2 :='100,200';
    For cur_rec in (
      select REGEXP_SUBSTR (v_str1, '[^,]+', 1, level) output
      from dual
      connect by level <= regexp_count(v_str1,',')+1
      MINUS
      select REGEXP_SUBSTR (v_str2, '[^,]+', 1, level) output
      from dual
      connect by level <= regexp_count(v_str2,',')+1) loop
      v_Str3:=v_Str3||','||cur_rec.output;
    End loop;
      v_Str3:=LTRIM(RTRIM(v_Str3,','),',');
      DBMS_OUTPUT.PUT_LINE(v_Str3);
    End;
    300,400,500,600
    PL/SQL procedure successfully completed
    Edited by: Lokanath Giri on Mar 14, 2012 1:33 PM
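    Note that the MINUS above only reports values that are in v_str1 but not in v_str2. If the comparison has to work in both directions, a sketch with literal strings (swap in your own values) would be:
    with s1 as (select '123,234,456' str from dual),
         s2 as (select '123,234,789' str from dual),
         v1 as (select regexp_substr(str, '[^,]+', 1, level) val
                from   s1 connect by level <= regexp_count(str, ',') + 1),
         v2 as (select regexp_substr(str, '[^,]+', 1, level) val
                from   s2 connect by level <= regexp_count(str, ',') + 1)
    select val from v1
    minus
    select val from v2
    union all
    select val from (select val from v2 minus select val from v1);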

  • Passing different (multi-) values for parameters to drill-through report, based on clicked subtotal in main report

    In Report Builder 3.0, I have made a main report in which the user can filter the underlying dataset using three parameters (all multi-value). The report shows totals grouped by these three parameters, as well as a grand total. What I want is to click
    on a total, which then opens the drill-through report for the corresponding records. I have achieved this for the grand total; the action in the corresponding text box passes all selected values of the parameters to the drill-through report.
    What I cannot figure out is how to make this work correctly when clicking on a subtotal. When I use the same expression as for the grand total, the same values for the parameters are passed, instead of the subset that applies to the corresponding text box in the main report. I expected this to work, because Report Builder does correctly calculate the SUMs for the different levels, even though the expressions are the same.
    My question is: how do I pass different drill-through (multi-) values for parameters, corresponding to the respective subtotals in the main report?
    (FYI: I am using Microsoft SQL Server 2008 R2 and Report Builder 3.0 .)

    Hello Katherine,
    Thanks once more for your quick reply.
    I was aware of the textbox action "Go to report", and how to pass parameters in general. My question concerned how to determine the scope of the passed multi-value parameters (to the values that apply to the respective group/subtotal). The article you linked
    to is informative, but not a solution to my problem.
    A colleague of mine came up with a pragmatic solution: instead of trying to determine the scope of the parameter values, now I "look to the left in the results table". The two screenshots below should illustrate this. (Screenshots are in Dutch. Specific information
    is pixelated.)
    Unfortunately, I am not able to post images. Once my account is verified, I will edit them in. For now, I hope the text speaks for itself enough.
    [Screenshot: Drill-through parameters - 01: Report Builder tablix with subtotals]
    [Screenshot: Drill-through parameters - 02: Textbox properties (selected in screenshot 01) - Action - Go to report]
    The first screenshot shows the tablix in the Report Builder. The second screenshot shows the properties of the textbox selected in the first.
    Notice that I do not pass parameters for the first two columns, but the actual values. I only pass the parameter (containing /all/ user-selected values) for the third column. In the textbox below the selected one, I pass the actual values for the first column,
    and parameters for the last two. In the textbox above the selected one, I pass the actual values for all three columns.
    The only (cosmetic) flaw in this approach is that the drill-through report's list of selected parameters might show values that do not occur in the corresponding part of the results; this happens only for the parameters for which the main report passes all user-selected values rather than the actual values from the results. The results themselves are correct, though.
    If there is a way to directly determine the scope of multi-value parameters for passing to a drill-through report, I would still like know. But for now, this seems to work.

  • HT2486 The selected file does not appear to be a valid comma separated values (csv) file or a valid tab delimited file. Choose a different file.

    The selected file does not appear to be a valid comma separated values (csv) file or a valid tab delimited file. Choose a different file.

    I guess your question is, "what's wrong with the file?"
    You're going to have to figure that out yourself, as we cannot see the file.
    Importing into Address Book requires either a tab-delimited or a comma-delimited file. You can export out of most spreadsheets into a CSV file. However, you need to make sure you clean up the file first. If a field contains commas, the commas will create new fields at each comma. So, some lines will have more fields than others, causing issues like the error you saw.

  • Converting xml data in to comma separated values in bpel

    Is there a way to convert generic XML data to comma separated values in BPEL? I have tried using createDelimitedString but had no luck.
    Please guide me on this issue.
    Thanks!
    Shan
    Edited by: 876372 on Aug 3, 2011 6:58 AM

    Hi,
    Have a look at the link below; it has many examples:
    http://download.oracle.com/docs/cd/B31017_01/integrate.1013/b28994/nfb.htm

  • Rows into comma separated values

    DB version : 11.2
    I would like to get rows as comma separated values
    expected output
    rowvalue1,<space>rowvalue2,<space>rowvalue3,<space>rowvalue4,...
    Example:
    create table test1 (name1 varchar2(10));
    insert into test1 values ('JOHN');
    insert into test1 values ('YING');
    insert into test1 values ('KAREN');
    insert into test1 values ('PEDRO');
    commit;
    SQL> select * from test1;
    NAME1
    JOHN
    YING
    KAREN
    PEDRO
    How can I get this printed as
    JOHN, YING, KAREN, PEDRO

    Assuming you want them in no particular order
    SQL> select listagg(name1, ',') within group (order by rowid) from test1;
    LISTAGG(NAME1,',')WITHINGROUP(
    JOHN,YING,KAREN,PEDRO
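    Since the expected output has a space after each comma, the delimiter just needs to include it (same query, only the separator changes):
    select listagg(name1, ', ') within group (order by rowid) as names
    from   test1;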
