Performance of an SQL query with 16 million rows

Hello,
I have about 16 million records in a project table; let's call it the ENCOUNTER table.
I have an ENCOUNTER_NUMBER to be searched for, which is VARCHAR2(30) and has a unique constraint on it. The results are to be ordered by ENCOUNTER_NUMBER.
The search string the user enters is always treated as a contains search, so a % is prefixed and suffixed to it.
I have created a function-based index on UPPER(ENCOUNTER_NUMBER).
The more matches the search has, the less time it takes. If there are 100 matches or more, the query returns within a second.
If there is only one match, it takes at least 10 seconds.
I am not sure at this stage why there is such a huge difference. Our client would be happy with 5 seconds.
Can anyone help me with this, please, if it is at all possible to make the search quicker?
Any help would be greatly appreciated.
Many thanks
Marilyn

You are seeing differing response times to fetch the first n rows - not all the rows. So if your search is 'less selective', you need to scan fewer rows to find the first n rows meeting your criteria.
For example, if you have a search that 1% of your rows will match, then you probably need to scan about 10,000 rows to find the first 100 matching rows. If you specify a more selective predicate, matching 0.1% of the rows, then you will need to scan about 100,000 rows to find your first 100 matches.
A generic 'contains' query, such as you describe, is difficult to implement with good performance. And it is often the result of someone not giving sufficient thought to the requirements: 'Well, it's too hard to figure out how people really need to use it, so I'll ask for this and I'll be covered regardless of what people really need.'
Can you negotiate/narrow down the requirements, such as:
- allow leading wildcards or trailing wildcards but not both (see the sketch after this list)
- allow/require other search criteria to be specified (not more double-wildcarded criteria, though)
- allow searching specific positions of the column (for example, if positions 4-6 have some specific meaning, then allow a search on that substring)
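
For the first option, here is a minimal sketch of a trailing-wildcard search against the existing function-based index (Oracle syntax; the bind variable :search_text stands in for the user's input and is only illustrative):

-- A trailing-only wildcard can use an index range scan on UPPER(ENCOUNTER_NUMBER);
-- a leading % generally cannot, and forces a scan of the whole index or table.
SELECT *
  FROM encounter
 WHERE UPPER(encounter_number) LIKE UPPER(:search_text) || '%'
 ORDER BY encounter_number;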

Similar Messages

  • Performance tuning of sql query with multiple joins

    My query takes at least half an hour to execute, and the number of records returned is approximately 1 lakh (100,000).
    The structure of the tables is:
    tblSession : ID,option1,option2,option3,option4,option5,option6,option7,option8,option9.
    tblOption : ID, labelID
    tblLabelDetail  : ID, LABELID, text
    option1 to option9 are foreign keys to tblOption.ID
    My query is as below : 
    select 
    session.ID 
    ,session.tstamp
    ,session.score
    ,session.hid1
    ,session.hID2
    ,session.hID3
    ,session.collectionID
    ,session.consumerID
    ,session.langID
    ,cons_cust.text1 as    customCons_text1, 
    cons_cust.text2 as customCons_text2, 
    cons_cust.text3 as customCons_text3,
    cons_cust.text4 as customCons_text4,
    cons_cust.text5 as customCons_text5,
    cons_cust.text6 as customCons_text6,
    cons_cust.text7 as customCons_text7,
    cons_cust.text8 as customCons_text8,
    cons_cust.text9 as customCons_text9,
    ld_cons1.text as customCons_option1GUID, 
    ld_cons2.text as customCons_option2GUID, 
    ld_cons3.text as customCons_option3GUID, 
    ld_cons4.text as customCons_option4GUID ,
    ld_cons5.text as customCons_option5GUID, 
    ld_cons6.text as customCons_option6GUID, 
    ld_cons7.text as customCons_option7GUID, 
    ld_cons8.text as customCons_option8GUID, 
    ld_cons9.text as customCons_option9GUID,
    --session
    session_cust.text1 as  session_cust_text1, 
    session_cust.text2 as session_cust_text2, 
    session_cust.text3 as session_cust_text3,
    session_cust.text4 as session_cust_text4,
    session_cust.text5 as session_cust_text5,
    session_cust.text6 as session_cust_text6,
    session_cust.text7 as session_cust_text7,
    session_cust.text8 as session_cust_text8,
    session_cust.text9 as session_cust_text9,
    ld_sess1.text as session_cust_option1GUID, 
    ld_sess2.text as session_cust_option2GUID, 
    ld_sess3.text as session_cust_option3GUID, 
    ld_sess4.text as session_cust_option4GUID, 
    ld_sess5.text as session_cust_option5GUID, 
    ld_sess6.text as session_cust_option6GUID, 
    ld_sess7.text as session_cust_option7GUID, 
    ld_sess8.text as session_cust_option8GUID, 
    ld_sess9.text as session_cust_option9GUID, 
    session_cust.tStamp1,
    session_cust.tStamp2
    from mvSession session with (noexpand)
    inner join tblCollection c on c.ID=session.collectionID AND c.templateID = 405
    left join tblConsumer cons on cons.ID=session.consumerID and cons.sessionYM between 601 and 1412 and cons.sID=105
    left join vCustomConsumer cons_cust on cons_cust.sessionYM between 601 and 1412 and cons_cust.sID=105 and cons_cust.ID=cons.ID
    left join tbloption o_cons1 on o_cons1.id = cons_cust.option1 and  o_cons1.sid = 105
    left join tblLabelDetail ld_cons1 on ld_cons1.labelID = o_cons1.labelID and ld_cons1.langId = 1 and ld_cons1.eid = 107 
    left join tbloption o_cons2 on o_cons2.id = cons_cust.option2 and  o_cons2.sid = 105
    left join tblLabelDetail ld_cons2 on ld_cons2.labelID = o_cons2.labelID and ld_cons2.langId = 1 and ld_cons2.eid = 107 
    left join tbloption o_cons3 on o_cons3.id = cons_cust.option3 and  o_cons3.sid = 105
    left join tblLabelDetail ld_cons3 on ld_cons3.labelID = o_cons3.labelID and ld_cons3.langId = 1 and ld_cons3.eid = 107 
    left join tbloption o_cons4 on o_cons4.id = cons_cust.option4 and  o_cons4.sid = 105
    left join tblLabelDetail ld_cons4 on ld_cons4.labelID = o_cons4.labelID and ld_cons4.langId = 1 and ld_cons4.eid = 107 
    left join tbloption o_cons5 on o_cons5.id = cons_cust.option5 and  o_cons5.sid = 105
    left join tblLabelDetail ld_cons5 on ld_cons5.labelID = o_cons5.labelID and ld_cons5.langId = 1 and ld_cons5.eid = 107 
    left join tbloption o_cons6 on o_cons6.id = cons_cust.option6 and  o_cons6.sid = 105
    left join tblLabelDetail ld_cons6 on ld_cons6.labelID = o_cons6.labelID and ld_cons6.langId = 1 and ld_cons6.eid = 107 
    left join tbloption o_cons7 on o_cons7.id = cons_cust.option7 and  o_cons7.sid = 105
    left join tblLabelDetail ld_cons7 on ld_cons7.labelID = o_cons7.labelID and ld_cons7.langId = 1 and ld_cons7.eid = 107 
    left join tbloption o_cons8 on o_cons8.id = cons_cust.option8 and  o_cons8.sid = 105
    left join tblLabelDetail ld_cons8 on ld_cons8.labelID = o_cons8.labelID and ld_cons8.langId = 1 and ld_cons8.eid = 107 
    left join tbloption o_cons9 on o_cons9.id = cons_cust.option9 and  o_cons9.sid = 105
    left join tblLabelDetail ld_cons9 on ld_cons9.labelID = o_cons9.labelID and ld_cons9.langId = 1 and ld_cons9.eid = 107 
    left join vCustomSession session_cust on session_cust.sessionYM between 601 and 1412 and session_cust.sID=105 and session_cust.ID=session.ID
    left join tbloption o_sess1 on o_sess1.id = session_cust.option1 and  o_sess1.sid = 105
    left join tblLabelDetail ld_sess1 on ld_sess1.labelID = o_sess1.labelID and ld_sess1.langId = 1 and ld_sess1.eid = 107 
    left join tbloption o_sess2 on o_sess2.id = session_cust.option2 and  o_sess2.sid = 105
    left join tblLabelDetail ld_sess2 on ld_sess2.labelID = o_sess2.labelID and ld_sess2.langId = 1 and ld_sess2.eid = 107 
    left join tbloption o_sess3 on o_sess3.id = session_cust.option3 and  o_sess3.sid = 105
    left join tblLabelDetail ld_sess3 on ld_sess3.labelID = o_sess3.labelID and ld_sess3.langId = 1 and ld_sess3.eid = 107 
    left join tbloption o_sess4 on o_sess4.id = session_cust.option4 and  o_sess4.sid = 105
    left join tblLabelDetail ld_sess4 on ld_sess4.labelID = o_sess4.labelID and ld_sess4.langId = 1 and ld_sess4.eid = 107 
    left join tbloption o_sess5 on o_sess5.id = session_cust.option5 and  o_sess5.sid = 105
    left join tblLabelDetail ld_sess5 on ld_sess5.labelID = o_sess5.labelID and ld_sess5.langId = 1 and ld_sess5.eid = 107 
    left join tbloption o_sess6 on o_sess6.id = session_cust.option6 and  o_sess6.sid = 105
    left join tblLabelDetail ld_sess6 on ld_sess6.labelID = o_sess6.labelID and ld_sess6.langId = 1 and ld_sess6.eid = 107 
    left join tbloption o_sess7 on o_sess7.id = session_cust.option7 and  o_sess7.sid = 105
    left join tblLabelDetail ld_sess7 on ld_sess7.labelID = o_sess7.labelID and ld_sess7.langId = 1 and ld_sess7.eid = 107 
    left join tbloption o_sess8 on o_sess8.id = session_cust.option8 and  o_sess8.sid = 105
    left join tblLabelDetail ld_sess8 on ld_sess8.labelID = o_sess8.labelID and ld_sess8.langId = 1 and ld_sess8.eid = 107 
    left join tbloption o_sess9 on o_sess9.id = session_cust.option9 and  o_sess9.sid = 105
    left join tblLabelDetail ld_sess9 on ld_sess9.labelID = o_sess9.labelID and ld_sess9.langId = 1 and ld_sess9.eid = 107 
    where session.sID=105  and session.tStamp >= 'Sep  1 2014 12:00AM' and session.tStamp < 'Dec 12 2014 12:00AM'   
    order by session.tStamp, session.ID
    Is there a way I can simplify the joins with tbloption and tblLabelDetail and get my output in optimal time?
    Regards 

    I have headed towards another approach, i.e. using UNPIVOT and then PIVOT.
    First I am converting option1-option9 into rows, then doing a PIVOT to get back the same record. But the issue is that when I do the pivot I am getting NULL values.
    My query is:
    select * into #t1 from (
    select ID
    , option1
    , option2
    , option3
    , option4
    , option5
    , option6
    , option7
    , option8
    , option9
    from vCustomConsumer
    where sid=105
    and sessionYM = 1412 
    ) SourceTable
    UNPIVOT
    ( optionID FOR Col IN
        (option1 
    ,option2
    ,option3 
    ,option4 
    ,option5 
    ,option6 
    ,option7 
    ,option8 
    ,option9 )
    ) AS unpvt
    select t.ID,t.optionID,t.col,cast(ld.text as varchar(max)) as text into #t2
    from #t1 t
    left outer join tbloption o on o.ID =  t.optionID
    left outer join tblLabelDetail ld on ld.labelID = o.labelID and ld.langID=1 
    order by t.ID, t.col
    select ID,option1 
    ,option2
    ,[option3]
    ,option4 
    ,option5 
    ,option6 
    ,option7 
    ,option8 
    ,option9
    from (
    select ID,optionID,col,text
    from #t2
    )up
    pivot 
    ( min(text) for col in 
    (option1 
    ,option2
    ,[option3]
    ,option4 
    ,option5 
    ,option6 
    ,option7 
    ,option8 
    ,option9
    )) as pvt
    In my last query, where I am using PIVOT, I am getting NULL values. When I check the data in the temp table #t2, it exists perfectly. But when pivoting, I don't understand why it returns mostly NULL values. I am getting data for only one column in a single row.
    Below are some rows from the result set finally obtained after pivoting:
    ID        option1   option2            option3   option4   option5   option6   option7   option8   option9
    62949026  NULL      0 to 200 seconds   NULL      NULL      NULL      NULL      NULL      NULL      NULL
    62966000  NULL      NULL               4         NULL      NULL      NULL      NULL      NULL      NULL
    62966032  NULL      NULL               4         NULL      NULL      NULL      NULL      NULL      NULL
    63090372  NULL      NULL               NULL      NULL      EN        NULL      NULL      NULL      NULL
    63090375  NULL      NULL               NULL      NULL      EN        NULL      NULL      NULL      NULL
    Thanks,

  • Need help in improving the performance for the sql query

    Thanks in advance for helping me.
    I was trying to improve the performance of the query below. I tried the following methods: used merge instead of update, used bulk collect / FORALL update, used the ordered hint, and created a temp table and updated the target table using the same. None of the methods I used improved the performance. The data count which is updated in the target table is 2 million records and the target table has 15 million records.
    Any suggestions or solutions for improving performance are appreciated
    SQL query:
    update targettable tt
    set mnop = 'G'
    where (x, y, z) in
          ( select a.x, a.y, a.z
              from table1 a
             where (a.x, a.y, a.z) not in
                   ( select b.x, b.y, b.z
                       from table2 b
                      where 'O' = b.defg ) )
      and mnop = 'P'
      and hijkl = 'UVW';
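
    For reference, the MERGE form the poster mentions having tried might look like the sketch below. This is only a guess at the intent: the mnop / hijkl filters are assumed to apply to the target table, and the WHERE clause on WHEN MATCHED requires Oracle 10g or later.
    merge into targettable tt
    using (select a.x, a.y, a.z
             from table1 a
            where (a.x, a.y, a.z) not in (select b.x, b.y, b.z
                                            from table2 b
                                           where 'O' = b.defg)) src
    on (tt.x = src.x and tt.y = src.y and tt.z = src.z)
    when matched then
      update set tt.mnop = 'G'
       where tt.mnop = 'P'
         and tt.hijkl = 'UVW';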

    987981 wrote:
    I was trying to improve the performance of the query below. I tried the following methods: used merge instead of update, used bulk collect / FORALL update, used the ordered hint, and created a temp table and updated the target table using the same. None of the methods I used improved the performance.
    And that meant what? Surely if you spend all that time and effort trying various approaches, it should mean something? Failures are as important teachers as successes. You need to learn from failures too. :-)
    The data count which is updated in the target table is 2 million records and the target table has 15 million records.
    Tables have rows, by the way, not records. Database people tend to get upset when rows are called records, as records exist in files and a database is not a mere collection of records and files.
    The failure to find a single faster method among the approaches you tried points to the fact that you do not know what the actual performance problem is. And without knowing the problem, you still went ahead, guns blazing.
    The very first step in dealing with any software engineering problem is to identify the problem. Seeing the symptoms (slow performance) is still a long way from problem identification.
    Part of identifying the performance problem is understanding the workload. Just what does the code task the database to do?
    From your comments, it needs to find 2 million rows out of 15 million rows, change those rows, and then write 2 million rows back to disk.
    That is not a small workload. Simple example: let's say that finding the 2 million rows costs 1 ms/row and writing the 2 million rows also costs 1 ms/row. That is roughly a 66 minute workload (2,000,000 x 1 ms is about 33 minutes for each side). Due to the number of rows, any increase in time per row, on either side, is multiplied 2 million times.
    So where is the performance problem? Time spent finding the 2 million rows (where other tables need to be read, indexes used, etc.)? Time spent writing the 2 million rows (where triggers need to be fired and indexes maintained)? Both?
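
    One concrete way to start answering those questions is to examine and time the read side on its own; for example, in Oracle (a sketch reusing the table names from the posted query):
    explain plan for
    select a.x, a.y, a.z
      from table1 a
     where (a.x, a.y, a.z) not in (select b.x, b.y, b.z
                                     from table2 b
                                    where 'O' = b.defg);
    select * from table(dbms_xplan.display);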

  • SQL query with Bind variable with slower execution plan

    I have a 'normal' SQL select-insert statement (not using bind variables) and it yields the following execution plan:
    Execution Plan
    0 INSERT STATEMENT Optimizer=CHOOSE (Cost=7 Card=1 Bytes=148)
    1 0 HASH JOIN (Cost=7 Card=1 Bytes=148)
    2 1 TABLE ACCESS (BY INDEX ROWID) OF 'TABLEA' (Cost=4 Card=1 Bytes=100)
    3 2 INDEX (RANGE SCAN) OF 'TABLEA_IDX_2' (NON-UNIQUE) (Cost=3 Card=1)
    4 1 INDEX (FAST FULL SCAN) OF 'TABLEB_IDX_003' (NON-UNIQUE)
    (Cost=2 Card=135 Bytes=6480)
    Statistics
    0 recursive calls
    18 db block gets
    15558 consistent gets
    47 physical reads
    9896 redo size
    423 bytes sent via SQL*Net to client
    1095 bytes received via SQL*Net from client
    3 SQL*Net roundtrips to/from client
    1 sorts (memory)
    0 sorts (disk)
    55 rows processed
    I have the same query but run using bind variables (I tested it with both Oracle Forms and SQL*Plus); it takes considerably longer, with a different execution plan:
    Execution Plan
    0 INSERT STATEMENT Optimizer=CHOOSE (Cost=407 Card=1 Bytes=148)
    1 0 TABLE ACCESS (BY INDEX ROWID) OF 'TABLEA' (Cost=3 Card=1 Bytes=100)
    2 1 NESTED LOOPS (Cost=407 Card=1 Bytes=148)
    3 2 INDEX (FAST FULL SCAN) OF 'TABLEB_IDX_003' (NON-UNIQUE) (Cost=2 Card=135 Bytes=6480)
    4 2 INDEX (RANGE SCAN) OF 'TABLEA_IDX_2' (NON-UNIQUE) (Cost=2 Card=1)
    Statistics
    0 recursive calls
    12 db block gets
    3003199 consistent gets
    54 physical reads
    9448 redo size
    423 bytes sent via SQL*Net to client
    1258 bytes received via SQL*Net from client
    3 SQL*Net roundtrips to/from client
    1 sorts (memory)
    0 sorts (disk)
    55 rows processed
    TABLEA has around 3 million records while TABLEB has 300 records. Is there any way I can improve the speed of the SQL query with bind variables? I have DBA access to the database.
    Regards
    Ivan

    Many thanks for your reply.
    I have already gathered statistics for both TABLEA and TABLEB, as well as all the indexes associated with both tables (using dbms_stats; I am on a 9i db), but not for the indexed columns.
    for table I use:-
    begin
    dbms_stats.gather_table_stats(ownname=> 'IVAN', tabname=> 'TABLEA', partname=> NULL);
    end;
    for index I use:-
    begin
    dbms_stats.gather_index_stats(ownname=> 'IVAN', indname=> 'TABLEB_IDX_003', partname=> NULL);
    end;
    Is it possible to show me a sample of how to collect statistics for index columns?
    regards
    Ivan
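
    As a pointer for the last question above: column-level (histogram) statistics are gathered through the method_opt parameter of dbms_stats.gather_table_stats rather than through a separate index call. A minimal sketch against the poster's table (the method_opt and cascade settings are only examples):
    begin
      dbms_stats.gather_table_stats(
        ownname    => 'IVAN',
        tabname    => 'TABLEA',
        method_opt => 'FOR ALL INDEXED COLUMNS SIZE AUTO',
        cascade    => TRUE);
    end;
    /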

  • [Excel] Running a SQL Query to delete rows

    Hello Experts,
    Background: I am attempting to take a database extract of my company's time keeping system and use it with Power BI tools. Given the file size restrictions within Power BI itself, I need to lower my file size by removing all time logs from inactive employees.
    I have a question regarding whether or not you can use a SQL query to delete rows in Excel. I have roughly 200,000+ rows in my Excel spreadsheet, and I am attempting to delete all rows where an employee is marked inactive. I have attempted to delete these rows by sorting them and doing a bulk delete and clear contents, but it seems to crash my Excel every time. My thought process is that using a query that does a timed delete might put less of a burden on deleting the massive amount of data.
    I would like to use this: DELETE * FROM [Table_(...)_Actual$] WHERE [Current] = "Inactive" (I will add more once I know it is possible to use SQL queries in Excel.)
    Any information on whether or not this is possible would be appreciated.
    Regards,
    Link

    Running a SQL query in Excel is possible; however, the DELETE query is not supported in Excel.
    You are more restricted in deleting Excel data than data from a relational data source. In a relational database, "row" has no meaning or existence apart from "record"; in an Excel worksheet, this is not true. You can delete values in fields (cells). Please see: http://support.microsoft.com/kb/257819/en-us
    One workaround: use an UPDATE query to set the rows to null, then use a SELECT query.
    e.g. 
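    ' cnn below is assumed to be an ADO connection already opened against the workbook (the original post does not show how it is created)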
    SQL = "update [sheet2$A1:B5] set name=null,age=null where name='andy'"
    cnn.Execute SQL
    SQL = "select name,age from [sheet2$A1:B5] where name is not null"
    Wind Zhang
    TechNet Community Support

  • SQL query with JSP and WML-parameters

    Hey,
    Could you help me?
    I'm trying to do the following. WML deck card 1 sends a parameter to the same WML deck's card 'help'. I try to read the parameter with JSP in card 'help' by putting the parameter into a SQL query, but it doesn't work. I can read the parameter with WML in card 'help'. I can also print the value of the parameter with JSP if I generate the WML with JSP.
    /*parameter sending from card 1 to card help*/
    out.println("<go href='#helpcard'>");
    out.println("<setvar name='valittukurssi' value='$(valittukurssi)'/>");
    /*parameter read with WML in card help */
    <p>Valitse kurssi.
    $valittukurssi</p>
    /* parameter read with JSP by generating WML with JSP */
    out.println("<p>$valittukurssi</p>");
    /* SQL query with JSP */
    ResultSet uudettulokset = uusilause.executeQuery("select * from kurssi where lyhenne='$valittukurssi'");
    Thanks,
    Rampe

    Your problem is easy to fix. You're confusing WML variables with JSP variables. See below:
    >
    /*parameter sending from card 1 to card help*/
    out.println("<go href='#helpcard'>");
    out.println("<setvar name='valittukurssi'
    value='$(valittukurssi)'/>");
    Above you set a var that will work on the phone, not in JSP.
    /*parameter read with WML in card help */
    <p>Valitse kurssi.
    $valittukurssi</p>
    Yes, the above does display the parameter, because it is a client-side WML var, but you cannot use this variable in the JSP code (that's why your SQL fails).
    /* parameter read with JSP by generating WML with JSP */
    out.println("<p>$valittukurssi</p>");
    Here's your problem: the above line is EXACTLY the same as the one before it. When the container parses this JSP code it translates the above line to
    <p>$valittukurssi</p> on the WML page, and the CLIENT uses its local variable to display it.
    What you need and want is to have a variable that can be used in JSP code and output to your WML page. Here's how it's done:
    out.println("<go href='#helpcard'>");
    String some_name = "valittukurssi";
    out.println("<setvar name='"+some_name+"'
    value='$("+some_name+")'/>");
    //note that you may have to escape the ( and ) with a \
    //so we displayed the variable above into the WML page, now we can use it in the SQL query:
    /* SQL query with JSP */
    ResultSet uudettulokset =
    uusilause.executeQuery("select * from kurssi where
    lyhenne='"+some_name+"'");//the end of the command is: " ' " ) ;
    Frank Krul
    Got Node?

  • How to compare result from sql query with data written in html input tag?

    How do I compare the result of a SQL query with data written in an HTML input tag? I need to compare the user and password from an HTML form with all users and passwords in the database. How do I do this? Or how do I put the result from the SQL query into an array? Please help me.

    Hi dejani,
    first get the user name and password entered by the user, using
    String sUsername = request.getParameter("name of the textfield");
    String sPassword = request.getParameter("name of the textfield");
    then, after the executeQuery() statement:
    int exist = 0;
    while (rs.next()) {
        String sUserId  = rs.getString("username");
        String sPass_wd = rs.getString("password");
        if (sUserId.equals(sUsername) && sPass_wd.equals(sPassword)) {
            exist = 1;
        }
    }
    if (exist == 1) {
        out.println("user exist");
    } else {
        out.println("not exist");
    }

  • How could I replace hard coded value in my sql query with constant value?

    Hi all,
    Could anyone help me with how to replace a hard-coded value in my SQL query with a constant value that might be predefined?
    PROCEDURE class_by_day_get_bin_data (
         in_report_parameter_id   IN   NUMBER,
         in_site_id               IN   NUMBER,
         in_start_date_time       IN   TIMESTAMP,
         in_end_date_time         IN   TIMESTAMP,
         in_report_level_min      IN   NUMBER,
         in_report_level_max      IN   NUMBER )
    IS
      bin_period_length   NUMBER(6,0); 
    BEGIN
      SELECT MAX(period_length)
         INTO bin_period_length
        FROM bin_data
         JOIN site_to_data_source_lane_v
           ON bin_data.data_source_id = site_to_data_source_lane_v.data_source_id
         JOIN bin_types
           ON bin_types.bin_type = bin_data.bin_type 
       WHERE site_to_data_source_lane_v.site_id = in_site_id
         AND bin_data.start_date_time     >= in_start_date_time - numtodsinterval(1, 'DAY')
         AND bin_data.start_date_time     <  in_end_date_time   + numtodsinterval(1, 'DAY')
         AND bin_data.bin_type            =  2
         AND bin_data.period_length       <= 60;
      --Clear the edr_class_by_day_bin_data temporary table and populate it with the data for the requested
      --report.
      DELETE FROM edr_class_by_day_bin_data;
      INSERT INTO edr_class_by_day_bin_data
        (site_id,
         site_lane_id,
         site_direction_id,
         site_direction_name,
         bin_start_date_time,
         bin_end_date_time,
         bin_id,
         bin_value)
       SELECT site_to_data_source_lane_v.site_id,
             site_to_data_source_lane_v.site_lane_id,
             site_to_data_source_lane_v.site_direction_id,
             site_to_data_source_lane_v.site_direction_name,
             bin_data_set.start_date_time,
             bin_data_set.end_date_time,
             bin_data_value.bin_id,
             bin_data_value.bin_value
        FROM bin_data
        JOIN bin_data_set
          ON bin_data.bin_serial = bin_data_set.bin_serial
        JOIN bin_data_value
          ON bin_data_set.bin_data_set_serial = bin_data_value.bin_data_set_serial
        JOIN site_to_data_source_lane_v
             ON bin_data.data_source_id = site_to_data_source_lane_v.data_source_id
            AND bin_data_set.lane = site_to_data_source_lane_v.data_source_lane_id
        JOIN (
               SELECT CAST(report_parameter_value AS NUMBER) lane_id
                 FROM report_parameters
                WHERE report_parameters.report_parameter_id    = in_report_parameter_id
                  AND report_parameters.report_parameter_group = 'LANE'
                  AND report_parameters.report_parameter_name  = 'LANE'
             ) report_lanes
          ON site_to_data_source_lane_v.site_lane_id = report_lanes.lane_id
        JOIN (
               SELECT CAST(report_parameter_value AS NUMBER) class_id
                 FROM report_parameters
                WHERE report_parameters.report_parameter_id    = in_report_parameter_id
                  AND report_parameters.report_parameter_group = 'CLASS'
                  AND report_parameters.report_parameter_name  = 'CLASS'
             ) report_classes
          ON bin_data_value.bin_id = report_classes.class_id
        JOIN edr_rpt_tmp_inclusion_table
          ON TRUNC(bin_data_set.start_date_time) = TRUNC(edr_rpt_tmp_inclusion_table.date_time)
       WHERE site_to_data_source_lane_v.site_id = in_site_id
         AND bin_data.start_date_time     >= in_start_date_time - numtodsinterval(1, 'DAY')
         AND bin_data.start_date_time     <  in_end_date_time   + numtodsinterval(1, 'DAY')
         AND bin_data_set.start_date_time >= in_start_date_time
         AND bin_data_set.start_date_time <  in_end_date_time
         AND bin_data.bin_type            =  2
         AND bin_data.period_length       =  bin_period_length;
    END class_by_day_get_bin_data;
    In the above code I'm using the hard-coded value 2 for the bin type:
    bin_data.bin_type            =  2
    But I don't want any hard-coded number or string in the query.
    How could I replace it?
    I defined a constant value like below inside my package body, where the actual procedure is. But I'm not sure whether I have to declare it inside the package body or inside the procedure.
    bin_type     CONSTANT NUMBER := 2;
    But it doesn't look for this value, so I'm not able to get the desired value for the report.
    Thanks.
    Edited by: user10641405 on May 29, 2009 1:38 PM

    Declare the constant inside the procedure.
    PROCEDURE class_by_day_get_bin_data(in_report_parameter_id IN NUMBER,
                                        in_site_id             IN NUMBER,
                                        in_start_date_time     IN TIMESTAMP,
                                        in_end_date_time       IN TIMESTAMP,
                                        in_report_level_min    IN NUMBER,
                                        in_report_level_max    IN NUMBER) IS
      bin_period_length NUMBER(6, 0);
      v_bin_type     CONSTANT NUMBER := 2;
    BEGIN
      SELECT MAX(period_length)
        INTO bin_period_length
        FROM bin_data
        JOIN site_to_data_source_lane_v ON bin_data.data_source_id =
                                           site_to_data_source_lane_v.data_source_id
        JOIN bin_types ON bin_types.bin_type = bin_data.bin_type
       WHERE site_to_data_source_lane_v.site_id = in_site_id
         AND bin_data.start_date_time >=
             in_start_date_time - numtodsinterval(1, 'DAY')
         AND bin_data.start_date_time <
             in_end_date_time + numtodsinterval(1, 'DAY')
         AND bin_data.bin_type = v_bin_type
         AND bin_data.period_length <= 60;
      --Clear the edr_class_by_day_bin_data temporary table and populate it with the data for the requested
      --report.
      DELETE FROM edr_class_by_day_bin_data;
      INSERT INTO edr_class_by_day_bin_data
        (site_id,
         site_lane_id,
         site_direction_id,
         site_direction_name,
         bin_start_date_time,
         bin_end_date_time,
         bin_id,
         bin_value)
        SELECT site_to_data_source_lane_v.site_id,
               site_to_data_source_lane_v.site_lane_id,
               site_to_data_source_lane_v.site_direction_id,
               site_to_data_source_lane_v.site_direction_name,
               bin_data_set.start_date_time,
               bin_data_set.end_date_time,
               bin_data_value.bin_id,
               bin_data_value.bin_value
          FROM bin_data
          JOIN bin_data_set ON bin_data.bin_serial = bin_data_set.bin_serial
          JOIN bin_data_value ON bin_data_set.bin_data_set_serial =
                                 bin_data_value.bin_data_set_serial
          JOIN site_to_data_source_lane_v ON bin_data.data_source_id =
                                             site_to_data_source_lane_v.data_source_id
                                         AND bin_data_set.lane =
                                             site_to_data_source_lane_v.data_source_lane_id
          JOIN (SELECT CAST(report_parameter_value AS NUMBER) lane_id
                  FROM report_parameters
                 WHERE report_parameters.report_parameter_id =
                       in_report_parameter_id
                   AND report_parameters.report_parameter_group = 'LANE'
                   AND report_parameters.report_parameter_name = 'LANE') report_lanes ON site_to_data_source_lane_v.site_lane_id =
                                                                                         report_lanes.lane_id
          JOIN (SELECT CAST(report_parameter_value AS NUMBER) class_id
                  FROM report_parameters
                 WHERE report_parameters.report_parameter_id =
                       in_report_parameter_id
                   AND report_parameters.report_parameter_group = 'CLASS'
                   AND report_parameters.report_parameter_name = 'CLASS') report_classes ON bin_data_value.bin_id =
                                                                                            report_classes.class_id
          JOIN edr_rpt_tmp_inclusion_table ON TRUNC(bin_data_set.start_date_time) =
                                              TRUNC(edr_rpt_tmp_inclusion_table.date_time)
         WHERE site_to_data_source_lane_v.site_id = in_site_id
           AND bin_data.start_date_time >=
               in_start_date_time - numtodsinterval(1, 'DAY')
           AND bin_data.start_date_time <
               in_end_date_time + numtodsinterval(1, 'DAY')
           AND bin_data_set.start_date_time >= in_start_date_time
           AND bin_data_set.start_date_time < in_end_date_time
           AND bin_data.bin_type = v_bin_type
           AND bin_data.period_length = bin_period_length;
    END class_by_day_get_bin_data;
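
    If the same literal is needed by more than one procedure, it could also be declared once at package level instead of inside each procedure; a minimal sketch, using a hypothetical separate constants package:
    CREATE OR REPLACE PACKAGE edr_constants AS
      c_bin_type CONSTANT NUMBER := 2;
    END edr_constants;
    /
    -- and then in the queries:
    --   AND bin_data.bin_type = edr_constants.c_bin_type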

  • How to write a SQL query with many parameters in iReport

    Hi,
    I'm a new user of iReport. How do I write a SQL query with many parameters in iReport's report query? I already know how to create a query with one parameter, like: select * from payment where entity=$P{entity}.
    But I don't know how to create the query if there is more than one parameter. I also have parameters such as
    $P{entity}, $P{id}, $P{ic}. Please help me with this.
    thanks

    You are in the wrong place. The ireport support forum may be found here
    http://www.jasperforge.org/index.php?option=com_joomlaboard&Itemid=215&func=showcat&catid=9
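
    For what it is worth, additional parameters are normally just added as further AND conditions using the same $P{} syntax shown in the question; a sketch, assuming the payment table has columns matching the id and ic parameters:
    select *
      from payment
     where entity = $P{entity}
       and id = $P{id}
       and ic = $P{ic}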

  • Need help with SQL Query with Inline View + Group by

    Hello Gurus,
    I would really appreciate your time and effort regarding this query. I have the following data set.
    Reference_No | Check_Number | Check_Date | Description                        | Invoice_Number | Invoice_Type | Paid_Amount | Vendor_Number
    1234567      | 11223        | 7/5/2008   | paid for cleaning                  | 44345563       | I            | 20.00       | 19
    1234567      | 11223        | 7/5/2008   | Adjustment for bad quality         | 44345563       | A            | 10.00       | 19
    7654321      | 11223        | 7/5/2008   | Adjustment from last billing cycle | 23543556       | A            | 50.00       | 19
    4653456      | 11223        | 7/5/2008   | paid for cleaning                  | 35654765       | I            | 30.00       | 19
    I am trying to write a query to aggregate paid_amount based on Reference_No, Check_Number, Payment_Date, Invoice_Number, Invoice_Type, Vendor_Number and display description with Invoice_type 'I' when there are multiple records with the same Reference_No, Check_Number, Payment_Date, Invoice_Number, Invoice_Type, Vendor_Number. When there are no multiple records I want to display the respective Description.
    The query should return the following data set
    Reference_No | Check_Number | Check_Date | Description                        | Invoice_Number | Invoice_Type | Paid_Amount | Vendor_Number
    1234567      | 11223        | 7/5/2008   | paid for cleaning                  | 44345563       | I            | 10.00       | 19
    7654321      | 11223        | 7/5/2008   | Adjustment from last billing cycle | 23543556       | A            | 50.00       | 19
    4653456      | 11223        | 7/5/2008   | paid for cleaning                  | 35654765       | I            | 30.00       | 19
    The following is my query. I am kind of lost.
    select B.Description, A.sequence_id,A.check_date, A.check_number, A.invoice_number, A.amount, A.vendor_number
    from (
    select sequence_id,check_date, check_number, invoice_number, sum(paid_amount) amount, vendor_number
    from INVOICE
    group by sequence_id,check_date, check_number, invoice_number, vendor_number
    ) A, INVOICE B
    where A.sequence_id = B.sequence_id
    Thanks,
    Nick

    It looks like this is a duplicate thread - correct me if I'm wrong in this case ->
    Need help with SQL Query with Inline View + Group by
    Regards.
    Satyaki De.

  • Extract Sql Query with Actual Parameters from Report

    Hi
    I am able to extract the query from a Crystal Report using the following code:
    ReportDocument.ReportClientDocument.RowSetController.GetSqlStatement(new GroupPath, out tmp);
    But the SQL query retrieved comes back in the following format:
    select name , trans_code, account_code from command_query.accounts
    where account_code = {?Command_query_Prompt0}  and effective_date < '{?Command_query_prompt1 }'
    The parameters which I am using in the report are:
    Account_Code and Effective_Date.
    Why does my extracted SQL translate them into {?Command_query_Prompt0} and '{?Command_query_prompt1 }'?
    Is there any way to map this to the actual parameter values ?
    OR
    Can we extract the query after assigning the values ?
    Any help is appreciated ... 
    Thanks
    Sanchet

    hi,
    You can create a nested SQL query with conditional parameters, for example:
    Select Code From OITT Where Code IN (Select ItemCode From OITM
    Where ItemName LIKE '[%0]' + '%%')
    Edited by: Jeyakanthan A on Jun 9, 2009 12:31 PM

  • Hosting company does not support SQL query with OUTFILE clause

    From my MySQL database, I want to allow the user to run a query and produce a CSV / text file of our membership database. Unfortunately, I just found out my hosting company does not support SQL queries with the OUTFILE clause for MySQL databases.
    Are there any other options available to produce a file, besides me running the query in phpMyAdmin and making the file available to users?
    Thanks.  George

    Maybe this external tutorial, Export MySQL data to CSV - PHP, will be of help.
    Cheers,
    Günter

  • SQL Query With Like Operator - Performance is very poor - Oracle Apps Table

    Hi,
    I'm querying one of the Oracle Applications standard tables. The performance is very slow when the LIKE operator is used in the query condition. The query uses an indexed column in the WHERE clause.
    The query is..
    select * from hz_parties
    where upper(party_name) like '%TOY%'
    In the above case, it is not using the index and is doing a full table scan. I have checked the explain plan and the cost is 4496.
    select * from hz_parties
    where upper(party_name) like 'TOY%'
    If I remove the '%' at the beginning of the string, the performance is good and it is using the index. In this case, the cost is 5.
    Any ideas to improve the performance of the above query? I have to retrieve the records whose name contains the string. I have tried hints to force the use of the index, but it is of no use.
    Thanks,
    Rama

    If new indexes are disallowed, not a lot of good ones, no.
    If you know what keyword(s) are going to be searched for, a materialized view might help, but I assume that you're searching based on user input. In that case, you'd have to essentially build your own Text index using materialized views, which will almost certainly be less efficient and require more maintenance than the built-in functionality.
    There may not be much you could do to affect the query plan in a reasonable way. Depending on the size of the table, how much RAM you're willing to throw at the problem, how your system is administered, and what Oracle Apps requires/ prohibits in terms of database configuration, you might be able to force Oracle to cache this table so that your full table scans are at least more efficient.
    Justin
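
    The built-in functionality Justin refers to is Oracle Text. If creating a new index on this table were permitted (which may not be the case for a standard Oracle Applications table), a contains-style search could be served roughly as in this sketch:
    -- leading wildcards may additionally need a wordlist preference with SUBSTRING_INDEX enabled
    CREATE INDEX hz_parties_name_ctx ON hz_parties (party_name)
      INDEXTYPE IS CTXSYS.CONTEXT;
    SELECT *
      FROM hz_parties
     WHERE CONTAINS(party_name, '%TOY%') > 0;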

  • DELETE QUERY FOR A TABLE WITH MILLION ROWS

    Hello,
    I have a requirement where I have to compare 2 tables - both having around a million rows - and delete data based on a single column.
    DELETE FROM TABLE_A WHERE COLUMN_A NOT IN
    (SELECT COLUMN_A FROM TABLE_B)
    COLUMN_A has an index defined on it in both tables. Still, it is taking a long time. What is the best way to achieve this? Any workaround?
    thanks

    How many rows are you deleting from this table? If the percentage is large then the better option is (sketched in SQL below):
    1) Create a new table holding the rows you want to keep, i.e. WHERE COLUMN_A IN
    (SELECT COLUMN_A FROM TABLE_B)
    2) TRUNCATE table_A
    3) Insert into table_A (select * from the new table)
    4) If you have any constraints then maybe they can be disabled.
    thanks
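
    A sketch of steps 1-3 above in SQL, using the table names from the thread (the staging-table name and the APPEND hint are illustrative):
    CREATE TABLE table_a_keep AS
      SELECT * FROM table_a
       WHERE column_a IN (SELECT column_a FROM table_b);
    TRUNCATE TABLE table_a;
    INSERT /*+ APPEND */ INTO table_a
      SELECT * FROM table_a_keep;
    DROP TABLE table_a_keep;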

  • Performance of native SQL query deteriorates

    Dear Experts,
    The performance of my native SQL query is bad. On the database the query takes less than 5 seconds to process. From my ABAP program I get a session timeout dump after 10 minutes. What might be the possible reason?
    Warm Regards,
    Abdullah

    I am not a DBA, but this is a wild guess.
    I have a native SQL query. It was running fine all morning (I transported it to production today). By afternoon the report was not giving any output.
    I went to the MS SQL Query Analyzer and executed the query; it returned the results in less than 5 seconds. The same query, when executed from SAP using native SQL, took more than 10 minutes and gave a dump (time exceeded).
    My database guy asked me to execute the following on the database: DBCC DBREINDEX('tablename')
    The report has been running fine since then. I am still not satisfied that this is the reason the performance is back on track, but yes, the report is running fine again. There seems to be some problem with the indexes.
    I am using the standard classes provided by SAP to execute my query; after execution the result set reference object is closed, and I close the connection.
    The code is as below.
          PERFORM:
            connect               USING con_name con_ref,
            select_into_table     USING con_ref,
            disconnect            USING con_ref.
    *  FORM connect
    *  Connects to the database specified by the logical connection name
    *  P_CON_NAME which is expected to be specified in table DBCON. In case
    *  of success the form returns in P_CON_REF a reference to a connection
    *  object of class CL_SQL_CONNECTION.
    *  --> P_CON_NAME  logical connection name
    *  <-- P_CON_REF   reference to a CL_SQL_CONNECTION object
    FORM connect  USING    p_con_name TYPE dbcon-con_name
                           p_con_ref  TYPE REF TO cl_sql_connection
                           RAISING cx_sql_exception.
    * if CON_NAME is not initial then try to open the connection, otherwise
    * create a connection object representing the default connection.
      IF p_con_name IS INITIAL.
        CREATE OBJECT p_con_ref.
      ELSE.
        p_con_ref = cl_sql_connection=>get_connection( p_con_name ).
      ENDIF.
    ENDFORM.                    " connect
    *  FORM select_into_table
    *  Selects some rows from the test table and fetches the result rows
    *  into an internal table whose row structure corresponds to the
    *  queries select list columns.
    FORM select_into_table
      USING   p_con_ref TYPE REF TO cl_sql_connection
      RAISING cx_sql_exception.
      DATA:
        l_stmt         TYPE string,
        l_stmt_ref     TYPE REF TO cl_sql_statement,
        l_dref         TYPE REF TO data,
        l_res_ref      TYPE REF TO cl_sql_result_set,
    *Data related query
        l_itab         TYPE TABLE OF t_pricing_report,
        l_row_cnt      TYPE i.
    * create the query string
    CONCATENATE
        'select A.SEQ,A.CONDTABLE,A.CONDNAME,A.VKORG,A.VTWEG,A.MATKL,A.MATNR,B.MTEXT,A.VKGRP,A.SGRPNAME,'
        'A.VKBUR,A.SOFFNAME,A.ZSALES,A.SCNTNAME,A.KUNNR,A.SCSTNAME,A.PRBATCH,A.INCO1,'
        'A.INCO2,A.DATAB,A.DATBI,A.KBETR,A.KONWA,A.KOSRT,B.MTART,B.GROES,B.VOLUM,B.EXTWG,B.WRKST,'
        'A.MXWRT,A.GKWRT,'
        'B.PATTERN,B.RIM,B.SERIES,B.SPDINDEX,B.LDINDX,B.MGROUP,B.APPLN,B.SDWALL,B.MGRPTXT'
        'FROM Z_PRICELIST A,Z_MATERIALVIEW B'
        'WHERE A.MANDT = ? AND'
              'B.MANDT = A.MANDT AND'
              'A.MATNR = B.MATNR AND'
              'A.KSCHL = ? AND'
              'A.CONDTABLE LIKE ? AND'
              'A.VKORG LIKE ? AND'
              'A.VTWEG LIKE ? AND'
              'A.MATKL >= ? AND A.MATKL <= ? AND'
              'A.MATNR >= ? AND A.MATNR <= ? AND'
              'A.INCO1 LIKE ? AND'
              'A.INCO2 LIKE ? AND'
              'A.ZSALES >= ? AND A.ZSALES <= ? AND'
              'A.KUNNR  >= ? AND A.KUNNR <= ? AND'
              'A.PRBATCH  >= ? AND A.PRBATCH <= ? AND'
              'A.VKBUR  >= ? AND A.VKBUR <= ? AND'
              'A.VKGRP  >= ? AND A.VKGRP <= ? AND'
              'B.WRKST  >= ? AND B.WRKST <= ? AND'
              'B.MTART  >= ? AND B.MTART <= ? AND'
              '? BETWEEN A.DATAB AND A.DATBI AND'
              'B.GROES LIKE ? AND'
              'B.LDINDX LIKE ? AND'
              'B.SPDINDEX LIKE ? AND'
              'B.RIM LIKE ? AND'
              'B.SERIES LIKE ? AND'
              'B.PATTERN LIKE ? AND'
              'B.MGROUP LIKE ?'
              'order by A.MATNR'
        INTO l_stmt SEPARATED BY space.                         "#EC NOTEXT
    * create a statement object
      l_stmt_ref = p_con_ref->create_statement( ).
    * bind input variables
      GET REFERENCE OF l_col1 INTO l_dref.
      l_stmt_ref->set_param( l_dref ).
    *binding other references here
      GET REFERENCE OF l_col33 INTO l_dref.
      l_stmt_ref->set_param( l_dref ).
    * set the input values and execute the query
      l_col1  = sy-mandt.
    *..Assigning values here
      l_col33 = p_mgroup.
    *  PERFORM trace_2 USING 'EXECUTE_QUERY' l_stmt l_col1 l_col2.
      l_res_ref = l_stmt_ref->execute_query( l_stmt ).
    * set output table
      GET REFERENCE OF l_itab INTO l_dref.
      l_res_ref->set_param_table( l_dref ).
    * get the complete result set
      l_row_cnt = l_res_ref->next_package( ).
    * display the contents of the output table
    *  PERFORM trace_next_package USING l_itab.
    *  PERFORM trace_result USING l_row_cnt 'rows fetched'.
      pricing_report[] = l_itab[].
      free l_itab.
    * don't forget to close the result set object in order to free
    * resources on the database
      l_res_ref->close( ).
    ENDFORM.                    "select_into_table
    *  FORM disconnect
    *  Disconnect from the given connection. In case of the default
    *  connection this can be omitted.
    FORM disconnect
      USING   p_con_ref TYPE REF TO cl_sql_connection
      RAISING cx_sql_exception.
      DATA: l_con_name TYPE dbcon-con_name.
      l_con_name = p_con_ref->get_con_name( ).
      CHECK l_con_name <> cl_sql_connection=>c_default_connection.
    *  PERFORM trace_0 USING 'CLOSE CONNECTION' l_con_name.
      p_con_ref->close( ).
    *  PERFORM trace_result USING l_con_name 'closed'.
    ENDFORM.                    "disconnect
    *  FORM handle_sql_exception
    *  Write appropriate error messages when a SQL exception has occured
    *  -->  P_SQLERR_REF  reference to a CX_SQL_EXCEPTION object
    FORM handle_sql_exception
      USING p_sqlerr_ref TYPE REF TO cx_sql_exception.
      FORMAT COLOR COL_NEGATIVE.
      IF p_sqlerr_ref->db_error = 'X'.
        WRITE: / 'SQL error occured:', p_sqlerr_ref->sql_code,
               / p_sqlerr_ref->sql_message.                     "#EC NOTEXT
      ELSE.
        WRITE:
          / 'Error from DBI (details in dev-trace):',
            p_sqlerr_ref->internal_error.                       "#EC NOTEXT
      ENDIF.
    ENDFORM.                    "handle_sql_exception
