PL/SQL speed issues when using a variable

I have a very strange issue that is causing problems.
I am running Golden connecting to an 11g database.
I created a procedure to insert records into a table based on a query. The source query includes variables that I populate before the insert statement. The problem is that if I use the variable in one very specific WHERE condition, the statement goes from running in less than 1 second to running in 15 minutes. It gets even stranger: a second variable causes no problems at all, and the exact same variable works fine elsewhere in the same statement.
This procedure takes 15 minutes to run.
declare
    v_start_period date;
    v_end_period date;
begin
    select add_months(trunc(sysdate,'mm'), -1), trunc(sysdate,'mm')
    into v_start_period, v_end_period
    from dual;
    insert into RESULTS_TABLE
            (first_audit_date, last_audit_date,
            data_column1, data_column2, data_column3)
        select
            a.first_audit_date, a.last_audit_date,
            b.data_column1, b.data_column2, b.data_column3
        from
            SOURCE_TABLE_1 b,
            (select marker_id, min(action_time) as first_audit_date, max(action_time) as last_audit_date
                from SOURCE_TABLE_2
                where action_time >= v_start_period and action_time < v_end_period
                group by marker_id) a
        where b.marker_id = a.marker_id
            and exists (select 1 from SOURCE_TABLE_2
                where marker_id = b.marker_id
                and action_time >= v_start_period and action_time < v_end_period
                and action_type = 'XYZ');
    commit;
end;
This procedure runs in less than 1 second, yet returns the exact same results.
declare
    v_start_period date;
    v_end_period date;
begin
    select add_months(trunc(sysdate,'mm'), -1), trunc(sysdate,'mm')
    into v_start_period, v_end_period
    from dual;
    insert into RESULTS_TABLE
            (first_audit_date, last_audit_date,
            data_column1, data_column2, data_column3)
        select
            a.first_audit_date, a.last_audit_date,
            b.data_column1, b.data_column2, b.data_column3
        from
            SOURCE_TABLE_1 b,
            (select marker_id, min(action_time) as first_audit_date, max(action_time) as last_audit_date
                from SOURCE_TABLE_2
                where action_time >= v_start_period and action_time < trunc(sysdate,'mm')
                group by marker_id) a
        where b.marker_id = a.marker_id
            and exists (select 1 from SOURCE_TABLE_2
                where marker_id = b.marker_id
                and action_time >= v_start_period and action_time < v_end_period
                and action_type = 'XYZ');
    commit;
end;
The only difference between the two is that I replaced the first v_end_period variable with trunc(sysdate,'mm').
I've been googling for possible solutions and keep running into something called "parameter sniffing". It appears to be a SQL Server issue though, as I cannot find solutions for Oracle. I tried nesting the insert statement inside its own procedure with new variables populated by the old variables, but nothing changed.
Edited by: user_7000017 on Jan 8, 2013 9:45 AM
Edited by: user_7000017 on Jan 8, 2013 9:52 AM Put the code in code tags.

You are not describing procedures. You are listing anonymous PL/SQL blocks.
As for the code: this approach to assigning PL/SQL variables is highly questionable. Since these functions are native PL/SQL functions too, there is no need to parse and execute a SQL cursor, and do context switching, just to assign values to PL/SQL variables.
This is wrong:
select
    add_months(trunc(sysdate,'mm'), -1), trunc(sysdate,'mm')
    into v_start_period, v_end_period
from dual;
This is correct:
v_start_period := add_months(trunc(sysdate,'mm'), -1);
v_end_period := trunc(sysdate,'mm');
Just as you would not use "select 1 into plVariable from dual;" instead of a standard variable assignment statement like "plVariable := 1;".
As for the performance/execution difference: it does not make sense as described. I suggest simplifying the code in order to isolate the problem. For example, take that in-line SQL (aliased as a in the main SQL) and create a testcase where it uses the function, versus a PL/SQL bind variable with the same data type and value as that returned by the function. Examine and compare the execution plans.
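Such a testcase could look something like this (a sketch only: it assumes you have access to V$SQL and DBMS_XPLAN, and it uses SQL*Plus-style substitution for the SQL_ID and child number):
-- Variant 1: the in-line query isolated, with the end of the range as a PL/SQL (bind) variable.
declare
    v_start_period date := add_months(trunc(sysdate,'mm'), -1);
    v_end_period   date := trunc(sysdate,'mm');
    v_cnt          pls_integer;
begin
    select count(*)
      into v_cnt
      from (select marker_id,
                   min(action_time) as first_audit_date,
                   max(action_time) as last_audit_date
              from SOURCE_TABLE_2
             where action_time >= v_start_period
               and action_time <  v_end_period   -- variant 2: replace with trunc(sysdate,'mm')
             group by marker_id);
    dbms_output.put_line(v_cnt);
end;
/
-- Find the two cursors and compare the plans Oracle actually used,
-- including the bind values peeked at hard parse time:
select sql_id, child_number, sql_text
  from v$sql
 where upper(sql_text) like '%SOURCE_TABLE_2%'
   and upper(sql_text) not like '%V$SQL%';

select *
  from table(dbms_xplan.display_cursor('&sql_id', &child_no, 'TYPICAL +PEEKED_BINDS'));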
Increase complexity (if there is no difference) and repeat, until the problem is isolated.
The first step in any troubleshooting is IDENTIFYING the problem. Without knowing what the problem is, how can one fix it?

Similar Messages

  • Oracle SQL Developer issue when using VPN (Cisco)

    All,
    We've got a somewhat obscure issue with SQL Developer connectivity to our databases from a remote employee's laptop.
    The database is hosted inside our firewall and when the employee tries to connect to the database he receives the following error:
    +"Error Encountered: An error was encountered performing the requested operation: The network adaptor could not establish the connection. Vendor Code 20"+
    I believe I may have seen the following error at one point as well...
    Status : Failure -Test failed: IO Error: The Network Adapter could not establish the connection
    Note: He gets this error when logged into our VPN, even though he can connect via SQL*Plus to that same database. He gets the error regardless of whether he uses the tnsnames entry option in SQL Developer or the "Basic" connection type.
    What's interesting is that if he disconnects from the VPN, turns off his wireless connectivity and then hard-wires into our network he can connect, no issue.
    The strangest part for me is that while he cannot connect to the database via SQL Developer, he can connect via SQL*Plus. He can ping the source IP address fine, he can tnsping the database no problem, he can connect via SQL*Plus, he just can't connect via SQL Developer. Since SQL*Plus and tnsping are working, I know he can access the db server via port 1521, so I don't think it's a port issue.
    Another strange thing: When he connects to our network via an "old" VPN (open VPN) he has no issues.
    His machine is Windoz 64 bit (HP laptop) and he's using the latest version of SQL Developer and a newer JVM.
    Also interesting: I also have a Windoz 64 bit laptop (Lenovo) and I do not have the same issues as he does. I however am running an older version of SQL Developer and probably an older JVM.
    I dug around a bit, seems like perhaps this has something to do with IPv6? No idea what to do next except have my network guy look at firewall logs to try to capture where the connectivity is being dropped.
    Any thoughts or suggestions would be appreciated. I've tried all kinds of things (giving him my tnsnames.ora and sqlnet.ora file, manually editing those files, tried IP addresses instead of host names, tried host names with the .doman.org extensions, etc.).
    Again, any suggestions would be appreciated and thank you very much.
    Rich
    Edited by: rmurnane on Oct 23, 2012 10:58 AM

    Since SQL*Plus is installed and works fine you could try to set
    Tools -> Preferences -> Database -> Advanced -> Use OCI/Thick driver
    This should enable SQL Developer to use the same Oracle client SQL*Plus is using to connect to the database.
    Mind that by setting this option you need to have an 11g client (or Instant Client) installed that matches the ojdbc6.jar version used by SQL Developer, so if you are using an older client this may not be a usable solution.

  • Speed issue when using Airport Express

    Hello.
    I have had an ongoing battle recently with Internet download speeds.
    I connect my iMac to my Airport Express.
    iMac has Firewall ON.
    When I connect this way, I get slow connection / download speeds.
    The file I am using in this example is the MS Office 2003 SP3 update.
    I get around 200kbps.
    However....
    When I connect my iMac directly via Ethernet to my Cable Modem, I am getting 1.5mbps download.
    Can anyone shed any light on this and advise how I can correctly configure the Airport to get nearer this speed?
    Many thanks in advance
    Glenn

    Have you repaired Permissions since the last update/upgrade?
    If not, either do that or get Applejack...
    http://www.versiontracker.com/dyn/moreinfo/macosx/19596
    After installing, reboot holding down CMD+s, then when the prompt shows, type in...
    applejack AUTO
    Then let it do all 5 of its things.
    At least it'll eliminate some questions if it doesn't fix it.
    The 5 things it does are...
    Correct any Disk problems.
    Repair Permissions.
    Clear out Cache Files.
    Repair/check several plist files.
    Dump the VM files for a fresh start
    While watching Activity Monitor>Networks, are there dips in the download speed, or just flat slow input?
    How many bars do you get?
    What speed does it say it's connecting at in Network Utility?

  • Issue in using presentation variable as filter condition in the reports

    Hi,
    I have an issue using a presentation variable as a filter condition in my reports; the details are as follows:
    Details :
    We want to implement the Max and Min values through presentation variables only; we do not want to implement them through session variables in this case.
    We have two presentation variables, MIN and MAX, for a quantity column of the report, so that the user can restrict the data for this column to a particular range, i.e. between the Min and the Max. This part has been implemented well. The issue is when the user wants to see the full data: in that case we do not pass any values to these two presentation variables (in other words we are not restricting the report data), and this is when the report throws the error. We want to leave the variables blank in that case, but this gives an error.
    Please suggest how can I overcome this issue.
    Thanks in Advance.
    Regards,
    Praveen

    I think you have to use guided navigation for this. Create two reports: the first is the one you currently have, and the second is the same report with the presentation variable removed from the column formula, i.e. the same report with no aggregation applied.
    Now create a dummy report and make it return a value only when the presentation variable value is not equal to max or min. Guide the report to navigate between the first and second report based on the result of the dummy report.

  • Iteration Speed issue when Indexing 3D array wired to a while-loop

    Brief Description of my Program:
    I am working on the real-time signal generation by using LabView and DAQmx (PCI-6110 card). My vi reads the big data file (typically 8MB txt file containing about 400,000 samples (complex double precision). Then, the signal is pre-processed and I end up with a huge 3D array to feed while-loop (typically 3D array dimension is N x 7 x M where N & M >> 7). Inside the loop, that 3D array is indexed and processed before the results are written to the DAQmx outputs. I have a speed issue when indexing the large 3D array (i.e, 3D array having a large sub-array size). My while-loop could not run fast enough to top-up the DAQmx AO buffer (at the output rate of 96kHz). It could run faster only if I use smaller 3D array (i.e, smaller-sized sub-arrays). I do not quite understand why the size of 3D sub-array affects the rate of looping although I am indexing same sub-array size at each iteration. I really appreciate your comments, advices and helps.
    I include my 3D array format as a picture below.
    Question on LabView:
    How does indexing a 3D array which is wired to the while-loop affect the speed of the loop iteration? I found that large sub-array dimensions in the 3D array slow down the iteration speed, compared to indexing the same size of sub-array from a 3D array with smaller sub-arrays, when performing signal processing inside the while-loop. Why? Is there any other way of designing the LabVIEW program to improve the iteration speed?
    attachment:

    Thank you all for your prompt replies and your interests. I am sorry about my attachment. But, I have now attached a jpg format image file as you suggested.
    I had read the few papers on large data handling such as "LabVIEW Performance and Memory Management". Thus, I had already tried to avoid making unnecessary copies of data and growing arrays in my while-loop. I am not an expert on LabView, so I am not sure if the issues I have are just LabView fundamental limitations or there are any other ways to improve the iteration speed without reducing the input file size and DAQ output rate.
    As you request, I also attach my top-level vi showing essential sections such as while-loop and its indexing. The attached file is as an image jpg format because the actual vi including Sub-VIs are as big as 3MB in total. I hope my attachment would be useful for anyone who would like to reply my question. If anyone would like to see my whole vi & llb files, I would be interesting to send it to you by an e-mail privately and thus please provide your e-mail address.
    The dimension of my 3D array is N x 7 x M (Page x Row x Column), where N represents the number of pages in the 3D array, and M represents the size of the 1D array.  The file I am currently using forms a 3D array of N = 28 and M = 10,731.  Referring to the top-level vi picture I attached, my while-loop indexes each page per iteration and wraps around.  The sub-VI called "channel" inside the while-loop further indexes its input (2D array) into seven 1D arrays for other signal processing.  The output from that "channel" sub-VI is the superposition of those seven arrays.  I hope my explanation is clear.
    Attachment: 3Darray.jpg and MyVi.jpg
    Kind Regards,
    Shein
    Attachments:
    3Darray.jpg ‏30 KB
    MyVI.jpg ‏87 KB

  • Coercion problem when using Shared Variable

    I have a curious coercion problem when using Shared Variables.  I want to share the state of a State Machine, which is an enum saved as a control (typedef) called TYPE State (see attached).  I create a shared variable called State and define it as a Custom Control, using the just-mentioned typedef.  So far, so good.  I've attached three simple VIs -- the first one, Init State, simply wires a constant to the input of the Shared Variable to initialize it -- the wired constant is, of course, defined by the typedef.  However, the Get State and Set State, meant to wire an indicator (for reading the state) or control (for setting it), develop coercion dots when wired into the Shared Variable.  Why?  How do I get rid of the dot?  [I suppose I could abandon my typedef and custom control, but the beauty of typedefs and custom controls is that it "enforces" rules, lets you use enums for clarity, keeps the code "honest", etc. -- I'd hate to give that up just to get rid of a dot!].
    On a related note, the code seems to work.  This is much too simplistic to do anything, but if you open Set State and Get State, set the state to anything, run it (it immediately stops, of course), then run Get State, you'll see the chosen state appear in the indicator.  So it does appear to work.  The "error" (coercion dot) may, I suppose, be a "bug" in Labview because it can't figure out the mapping of the (very simple!) Custom Control, but if so, I hope it gets fixed quickly!
    Bob Schor
    Attachments:
    Coercion Problem1.zip ‏38 KB

    Hello Bob,
    I am also seeing this behavior, I will escalate this question to our LabVIEW developers and post again here no later than next Tuesday, November 27th as National Instruments will be closed for the remainder of this week.
    If this issue does turn into a product suggestion, I suspect the workaround would be to live with the coercion dot for the time being.
    Enjoy the holiday
    Regards,
    Erik J.
    Applications Engineer
    National Instruments

  • JTable text alignment issues when using JPanel as custom TableCellRenderer

    Hi there,
    I'm having some difficulty with text alignment/border issues when using a custom TableCellRenderer. I'm using a JPanel with GroupLayout (although I've also tried others like FlowLayout), and I can't seem to get label text within the JPanel to align properly with the other cells in the table. The text inside my 'panel' cell is shifted downward. If I use the code from the DefaultTableCellRenderer to set the border when the cell receives focus, the problem gets worse as the text shifts when the new border is applied to the panel upon cell selection. Here's an SSCCE to demonstrate:
    import java.awt.Color;
    import java.awt.Component;
    import java.awt.EventQueue;
    import javax.swing.GroupLayout;
    import javax.swing.JFrame;
    import javax.swing.JLabel;
    import javax.swing.JPanel;
    import javax.swing.JTable;
    import javax.swing.border.Border;
    import javax.swing.table.TableCellRenderer;
    import javax.swing.table.TableColumn;
    import sun.swing.DefaultLookup;
    public class TableCellPanelTest extends JFrame {
      private class PanelRenderer extends JPanel implements TableCellRenderer {
        private JLabel label = new JLabel();
        public PanelRenderer() {
          GroupLayout layout = new GroupLayout(this);
          layout.setHorizontalGroup(layout.createParallelGroup().addComponent(label));
          layout.setVerticalGroup(layout.createParallelGroup().addComponent(label));
          setLayout(layout);
        }
        public Component getTableCellRendererComponent(JTable table, Object value, boolean isSelected, boolean hasFocus, int row, int column) {
          if (isSelected) {
            setBackground(table.getSelectionBackground());
          } else {
            setBackground(table.getBackground());
          }
          // Border section taken from DefaultTableCellRenderer
          if (hasFocus) {
            Border border = null;
            if (isSelected) {
              border = DefaultLookup.getBorder(this, ui, "Table.focusSelectedCellHighlightBorder");
            }
            if (border == null) {
              border = DefaultLookup.getBorder(this, ui, "Table.focusCellHighlightBorder");
            }
            setBorder(border);
            if (!isSelected && table.isCellEditable(row, column)) {
              Color col;
              col = DefaultLookup.getColor(this, ui, "Table.focusCellForeground");
              if (col != null) {
                super.setForeground(col);
              }
              col = DefaultLookup.getColor(this, ui, "Table.focusCellBackground");
              if (col != null) {
                super.setBackground(col);
              }
            }
          } else {
            setBorder(null /*getNoFocusBorder()*/);
          }
          // Set up our label
          label.setText(value.toString());
          label.setFont(table.getFont());
          return this;
        }
      }
      public TableCellPanelTest() {
        JTable table = new JTable(new Integer[][]{{1, 2, 3}, {4, 5, 6}}, new String[]{"A", "B", "C"});
        // set up a custom renderer on the first column
        TableColumn firstColumn = table.getColumnModel().getColumn(0);
        firstColumn.setCellRenderer(new PanelRenderer());
        getContentPane().add(table);
        pack();
      }
      public static void main(String[] args) {
        EventQueue.invokeLater(new Runnable() {
          public void run() {
            new TableCellPanelTest().setVisible(true);
          }
        });
      }
    }
    There are basically two problems:
    1) When first run, the text in the custom renderer column is shifted downward slightly.
    2) Once a cell in the column is selected, it shifts down even farther.
    I'd really appreciate any help in figuring out what's up!
    Thanks!

    1) Layout managers need to take the border into account, so the label is placed at (1,1) while labels just start at (0,0) of the cell rect. Also, layout managers tend not to shrink components below their minimum size. Setting the label's minimum size to (0,0) seems to get the same effect in your example. Doing the same for the maximum size helps if you set the row height of the JTable larger. Easier might be to use BorderLayout, which ignores min/max for center (and min/max height for west/east, etc.).
    2) DefaultTableCellRenderer uses a 1px border when the UI's no-focus border is null; you don't set any border in that case.
    3) Include a setDefaultCloseOperation in an SSCCE please. I think I've got a hundred test programs running now :P.

  • What's disadvantages when using bind variables always in java?

    Hello everyone. Could someone tell me what the disadvantages are of using bind variables in Java? I have heard of some cases where they cause problems. Thanks in advance!

    99% of the time, you should be using bind variables. If you have columns which are highly skewed, however, you may want to consider using literals (assuming CURSOR_SHARING=EXACT), since that may allow the CBO to make a better decision.
    If you have an orders table, for example, you may have a status column that specifies whether the order is complete, in transit, or new. If you've been running for a while, 99% of your orders will be complete, so
    SELECT COUNT(*)
      FROM orders
     WHERE status = :1
    should do a full table scan if you specify 'COMPLETE'. If you passed in 'IN TRANSIT', though, an index scan might be more appropriate. If you want to pass in different values and get different query plans, you need to use literals. 99% of the time, though, you want the same plan, so you want to use bind variables.
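    A small way to see this in action (everything below is illustrative: the table, the index, the AMOUNT column and the data are made up, and CURSOR_SHARING=EXACT is assumed):
    create table orders (
        order_id number primary key,
        status   varchar2(20) not null,
        amount   number
    );
    create index orders_status_ix on orders (status);
    -- roughly 99% 'COMPLETE', 1% 'IN TRANSIT'
    insert into orders
    select level,
           case when mod(level, 100) = 0 then 'IN TRANSIT' else 'COMPLETE' end,
           level
      from dual
    connect by level <= 100000;
    commit;
    -- a histogram on STATUS lets the CBO see the skew
    begin
        dbms_stats.gather_table_stats(
            ownname    => user,
            tabname    => 'ORDERS',
            method_opt => 'for columns status size 254');
    end;
    /
    -- with literals the CBO can choose a different plan per value
    explain plan for select sum(amount) from orders where status = 'COMPLETE';
    select * from table(dbms_xplan.display);   -- typically a full table scan
    explain plan for select sum(amount) from orders where status = 'IN TRANSIT';
    select * from table(dbms_xplan.display);   -- typically an index range scan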
    Justin
    Distributed Database Consulting, Inc.
    http://www.ddbcinc.com/askDDBC

  • Does resteasy API have class loader issues when using via OSGi

    Does resteasy API have class loader issues when using via OSGi

    Hi Scott,
    This isn't an answer to your question, but could you tell me which jar files are needed for the packages:
    com.sap.portal.pcm.system.ISystems
    com.sap.portal.pcm.system.ISystem
    and under which path I could find them.
    Thanks
    Regards
    Meesum.

  • Odd issue when using UDT (user defined type)

    Hi all,
    11g.
    I ran into an odd issue when using a UDT. I have these 4 schemas: USER_1, USER_2, USER_A, USER_B.
    Both USER_1 and USER_2 have a UDT (actually a nested table):
    CREATE OR REPLACE TYPE TAB_NUMBERS AS TABLE OF NUMBER(10);
    USER_A has a synonym that points to the type in USER_1:
    create or replace synonym TAB_NUMBERS for USER_1.TAB_NUMBERS;
    USER_B has a synonym that points to the type in USER_2:
    create or replace synonym TAB_NUMBERS for USER_2.TAB_NUMBERS;
    Both USER_A and USER_B have a procedure which uses the synonym:
    CREATE OR REPLACE PROCEDURE proc_test (p1 in tab_numbers)
    IS
    BEGIN
      NULL;
    END;
    And in the C# code:
    OracleConnection conn = new OracleConnection("data source=mh;user id=USER_A;password=...");
    OracleCommand cmd = new OracleCommand();
    cmd.Connection = conn;
    cmd.CommandText = "proc_test";
    cmd.CommandType = CommandType.StoredProcedure;
    OracleParameter op = new OracleParameter();
    op.ParameterName = "p1";
    op.Direction = ParameterDirection.Input;
    op.OracleDbType = OracleDbType.Object;
    op.UdtTypeName = "TAB_NUMBERS";
    Nested_Tab_Mapping_To_Object nt = new Nested_Tab_Mapping_To_Object();
    nt.container = new decimal[] { 1, 2 };
    op.Value = nt;
    ......
    This code works fine, but it raises an error when I change the connection string from USER_A to USER_B. The error says:
    OCI-22303: type ""."TAB_NUMBERS" not found
    Interestingly, if I change op.UdtTypeName = "TAB_NUMBERS"; to op.UdtTypeName = "USER_2.TAB_NUMBERS", the error is gone, and everything works fine.
    Anyone has any clues?
    Thanks in advance.


  • Bluetooth issues when using smartwatch on the z1s

    I'm having some Bluetooth issues when using my smartwatch. I usually use my Z1s with my car radio via Bluetooth to listen to music and make calls. Since I got a Sony smartwatch I always have problems with calls: it looks like the Bluetooth turns off during the call when I'm using the smartwatch. I already tried deleting and pairing the Z1s again, but nothing works. I'm not sure if the Z1s simply does not support the smartwatch and the car radio Bluetooth connections at the same time. However, it works fine when playing music! The problem is just with calls. This never happened before I got the smartwatch.

    I have the Tmobile Z1s and a SW2 and have the same issues with multiple devices:
    1) Bose Soundlink Mini
    2) Sony MBR-100
    3) My car's bluetooth
    4) Ford Sync
    5) Sony SBH20
    6) Motorola S305
    If the phone connects to the A2DP device, it will typically play fine for several minutes. Then audio will drop out. If I do nothing, after several minutes my watch will vibrate to show it has disconnected. Sometimes audio then comes back, and sometimes it doesn't.
    Sometimes my phone will not connect to my car. Then, if I can get it to connect, it will only allow for phone audio, not media audio. 
    This issue has been on (2) Z1s's with the SW2. The first I returned due to touch screen issues, which the second one also has. All of the devices above worked perfectly fine with my Xperia Z, Galaxy S2, HTC One, etc. The Xperia Z and the SW2 never had any problems.
    Honestly, this phone has been incredibly frustrating. If the upcoming updates for either the SW2 or Z1S don't fix this issue, I'm going to return it and get something else.

  • Query don't use the right index when using bind variables

    Hi people !
    I need some help because I have an issue with a query that doesn't use the right index as it should.
    First of all, I have mainly three tables:
    ORDER : table that contains the description of each order (approximately 1 000 000 records)
    ORDER_MVTS : table that contains the tasks made (called movements) to set up each order,
    with the quantity of packages prepared for each product (approximately 10 000 000 records)
    PRODUCT : table that contains the products (approximately 50 000 records)
    When I launch the query with hard-coded values, it brings back the response very fast
    because it uses the right index (ORDER_DHR_VALID), which represents the date and hour of the order
    (with format 'DD/MM/YYYY HH24:MI:SS'). The selectivity of this index is good.
    NB 1: I have to use the trick " >= trunc(date) and < trunc(date) + 1 " to filter on a simple date because
    the index contains hours and minutes (I know it probably wasn't a bright idea at conception time).
    NB 2: The index on ORDER_MVTS.PRODUCT_CODE isn't discriminating enough because there aren't enough different products.
    It's the same for the indexes on CUSTOMER_CODE and MVT_TYPE, so only the index on ORDER.DHR_VALID is good.
    Here is the correct explain plan when I execute the query with hard coded values :
    SELECT SUM(ORDER_MVTS.NB_PACKAGE)
    FROM ORDER_MVTS, PRODUCT, ORDER
    WHERE ORDER.DHR_VALID >= TRUNC(to_date('14/11/2008 10:04:56','DD/MM/YYYY HH24:MI:SS'))
    AND ORDER.DHR_VALID < TRUNC(to_date('14/11/2008 10:04:56','DD/MM/YYYY HH24:MI:SS')) + 1
    AND ORDER_MVTS.MVT_TYPE = 'DELIVERY'
    AND PRODUCT.CODE = ORDER_MVTS.PRODUCT_CODE
    AND ORDER_MVTS.ORDER_CODE = ORDER.CODE
    AND ORDER.CUSTOMER_CODE = 'ADIDAS'
    AND PRODUCT.CODE = 1234
    Rows Row Source Operation
    1 SORT AGGREGATE
    2 NESTED LOOPS
    4 NESTED LOOPS
    2 INDEX UNIQUE SCAN (object id 378548) --> PRODUCT_PK
    4 TABLE ACCESS BY INDEX ROWID ORDER
    777 INDEX RANGE SCAN (object id 378119) --> ORDER_DHR_VALID
    2 TABLE ACCESS BY INDEX ROWID ORDER_MVTS
    30 INDEX RANGE SCAN (object id 377784) --> ORDER_MVTS_ORDER_FK
    Now the problem is when the query is used in a cursor with bind variables.
    It seems like Oracle doesn't use the index on ORDER.DHR_VALID because it can't figure out that it
    actually has to filter on a short period of time (only one day).
    So Oracle uses the index on ORDER_MVTS.PRODUCT_CODE, which isn't a bright idea (it takes 10 seconds instead of just one).
    Here is the bad explain plan :
    Rows Row Source Operation
    1 SORT AGGREGATE
    2 NESTED LOOPS
    722 NESTED LOOPS
    2 INDEX UNIQUE SCAN (object id 378548) --> PRODUCT_PK
    722 TABLE ACCESS BY INDEX ROWID ORDER_MVTS
    1790 INDEX RANGE SCAN (object id 377777) --> ORDER_MVTS_PRODUCT_FK
    2 TABLE ACCESS BY INDEX ROWID ORDER
    1442 INDEX UNIQUE SCAN (object id 378439) --> ORDER_PK
    Now I have found two solutions to this problem :
    1) using a Hint to force the use of index on ORDER.DHR_VALID (with /*+ INDEX(ORDER ORDER_DHR_VALID) */ )
    2) Using Dynamic SQL and keeping the date hard coded (but not the other values except mvt_type)
    For example :
    QUERY :=
    'SELECT SUM(ORDER_MVTS.NB_PACKAGE)
     FROM ORDER_MVTS, PRODUCT, ORDER
     WHERE ORDER.DHR_VALID >= TRUNC(TO_DATE(''' || To_char(P_DTE_VAL,'DD/MM/YYYY') || ''',''DD/MM/YYYY'')) ' ||
    'AND ORDER.DHR_VALID < TRUNC(TO_DATE(''' || To_char(P_DTE_VAL,'DD/MM/YYYY') || ''',''DD/MM/YYYY'')) + 1 ' ||
    'AND ORDER_MVTS.MVT_TYPE = ''DELIVERY''
     AND PRODUCT.CODE = ORDER_MVTS.PRODUCT_CODE
     AND ORDER_MVTS.ORDER_CODE = ORDER.CODE
     AND ORDER.CUSTOMER_CODE = :CUSTOMER
     AND PRODUCT.CODE = :CODE';
    These two solutions work, but number 1 is bad in theory because it uses a hint,
    and number 2 may be difficult to code.
    So my question is: does someone know another solution to force the use of the ORDER_DHR_VALID index that is simple and reliable?
    Thank you very much for your support
    Edited by: remaï on Apr 1, 2009 4:08 PM

    What version of Oracle do you have? The CBO works differently in 9i and 10g.
    Usually the cost-based optimizer does not want to use an index for a range (>, <) condition with bind variables, because the optimizer cannot use statistics to determine selectivity, and the default selectivity for range operators is low.
    (As I remember, the default selectivity for '>' is 5%; you have two conditions, > and <, so the resulting selectivity will be 0.05*0.05 = 0.0025, treated as two independent events, while the selectivity of the other conditions,
    ORDER_MVTS.MVT_TYPE = 'DELIVERY' or ORDER.CUSTOMER_CODE = 'ADIDAS', looks much better to the CBO.)
    The best solution I see is not to use bind variables. Your query looks like a searching query that does not execute very often, so you will not get a performance win from skipping execution plan creation anyway.
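    If you go that route, the statement from option 2 in the original post can still keep binds for everything except the date. A rough sketch only (table and column names as in the original post, parameter values made up), showing how such a statement can be run with EXECUTE IMMEDIATE:
    declare
        p_dte_val  date         := to_date('14/11/2008', 'DD/MM/YYYY');
        p_customer varchar2(30) := 'ADIDAS';
        p_code     number       := 1234;
        v_sql      varchar2(4000);
        v_total    number;
    begin
        -- the date goes in as a literal so the optimizer can judge the one-day range;
        -- CUSTOMER and CODE remain bind placeholders
        v_sql :=
            'SELECT SUM(ORDER_MVTS.NB_PACKAGE)
             FROM ORDER_MVTS, PRODUCT, ORDER
             WHERE ORDER.DHR_VALID >= TO_DATE(''' || to_char(p_dte_val, 'DD/MM/YYYY') || ''',''DD/MM/YYYY'') ' ||
            'AND ORDER.DHR_VALID < TO_DATE(''' || to_char(p_dte_val, 'DD/MM/YYYY') || ''',''DD/MM/YYYY'') + 1 ' ||
            'AND ORDER_MVTS.MVT_TYPE = ''DELIVERY''
             AND PRODUCT.CODE = ORDER_MVTS.PRODUCT_CODE
             AND ORDER_MVTS.ORDER_CODE = ORDER.CODE
             AND ORDER.CUSTOMER_CODE = :customer
             AND PRODUCT.CODE = :code';
        execute immediate v_sql into v_total using p_customer, p_code;   -- binds matched by position
        dbms_output.put_line('Total packages: ' || v_total);
    end;
    /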
    Edited by: JustasVred on Apr 1, 2009 10:10 AM

  • Unresolved column when using presentation variable in Answers filter

    Hi all,
    I'm using a dashboard prompt that sets a presentation variable - pres_year. I'm using the presentation variable in a request's filter.
    In the 'Criteria' tab, in the filter for the request, I set a year column (YEAR) to the presentation variable via Variable Expr and enter the presentation variable name only. End up with:
    YEAR is equal to / is in @{pres_year}
    When using the dashboard, the prompt and request all work fine. The problem is when I'm in the request in Answers and I click on 'Display Results' button. I get the following error:
    'NQSError: 27005 unresolved column pres_year'
    Any suggestions?
    Thanks.

    I got the same error when I used a SQL Expression instead of a Variable Expression in the filter section.
    Make sure that you followed this:
    Filter on year > Add > Variable > Presentation. Then give the presentation variable name (which you defined in the dashboard prompt) in the Variable Expr field and click OK.
    You get the same error when using:
    Filter on year column > Add > SQL Expression > presentation variable. This is not the right place to give the presentation variable.
    The error I got:
    State: HY000. Code: 10058. [NQODBC] [SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. [nQSError: 27005] Unresolved column: "var_Year". (HY000)
    So make sure that you are using the right one.

  • Error when using the variable System::ErrorDescription

    Hi all,
    When I try to use the variable System::ErrorDescription I get the following error:
    The element cannot be found in a collection. This error happens when you try to retrieve an element from a collection on a container during execution of the package and the element is not there.
    Do I need to declare it, or is it a variable that is always available for getting errors from the execution state?

    if this is in a Script Task then:
    - add the variable in the Script task property ReadOnlyVariables list
    - in the Script Code refer to it as
    string ErrorDesc = (string) Dts.Variables["System::ErrorDescription"].Value;
    Jan D'Hondt - SQL server BI development

  • Performance when using bind variables

    I'm trying to show myself that bind variables improve performance (I believe it, I just want to see it).
    I've created a simple table of 100,000 records, each row having a single column of type integer. I populate it with a number between 1 and 100,000.
    Now, with a JAVA program I delete 2,000 of the records by performing a loop and using the loop counter in my where predicate.
    My first JAVA program runs without using bind variables as follows:
    loop
    stmt.executeUpdate("delete from nobind_test where id = " + i);
    end loop
    My second JAVA program uses bind variables as follows:
    pstmt = conn.prepareStatement("delete from bind_test where id = ?");
    loop
    pstmt.setString(1, String.valueOf(i));
    pstmt.executeUpdate();
    end loop;
    Monitoring of v$SQL shows that program one doesn't use bind variables, and program two does use bind variables.
    The trouble is that the program that does not use bind variables runs faster than the bind variable program.
    Can anyone tell me why this would be? Is my test too simple?
    Thanks.

    [email protected] wrote:
    I'm trying to show myself that bind variables improve performance (I believe it, I just want to see it).
    I've created a simple table of 100,000 records each row a single column of type integer. I populate it with a number between 1 and 100,000
    Now, with a JAVA program I delete 2,000 of the records by performing a loop and using the loop counter in my where predicate.
    Monitoring of v$SQL shows that program one doesn't use bind variables, and program two does use bind variables.
    The trouble is that the program that does not use bind variables runs faster than the bind variable program.
    Can anyone tell me why this would be? Is my test too simple?
    The point is that you have to find out where your test is spending most of the time.
    If you've just populated a table with 100,000 records and then start to delete randomly 2,000 of them, the database has to perform a full table scan for each of the records to be deleted.
    So probably most of the time is spent scanning the table over and over again, although most of blocks might already be in your database buffer cache.
    The difference between the hard parse and the soft parse of such a simple statement might be negligible compared to the effort it takes to fulfill each delete execution.
    You might want to change the setup of your test: Add a primary key constraint to your test table and delete the rows using this primary key as predicate. Then the time it takes to locate the row to delete should be negligible compared to the hard parse / soft parse difference.
    You probably need to increase your iteration count, because deleting 2,000 records this way probably takes too little time and introduces measurement issues. Try to delete more rows, then you should be able to spot a significant and constant difference between the two approaches.
    In order to prevent any performance issues from a potentially degenerated index due to numerous DML activities, you could also just change your test case to query for a particular column of the row corresponding to your predicate rather than deleting it.
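    A sketch of that revised comparison done entirely in PL/SQL rather than JDBC (table name and numbers are illustrative; EXECUTE IMMEDIATE is used so that the literal version really produces a new statement text each time; set serveroutput on to see the timings):
    -- primary key, so each delete is a cheap index lookup
    create table bind_test (
        id number constraint bind_test_pk primary key
    );
    insert into bind_test
    select level from dual connect by level <= 100000;
    commit;
    declare
        t0 pls_integer;
    begin
        -- literals: every statement text is unique, so every execution is a hard parse
        t0 := dbms_utility.get_time;
        for i in 1 .. 20000 loop
            execute immediate 'delete from bind_test where id = ' || i;
        end loop;
        dbms_output.put_line('literals: ' || (dbms_utility.get_time - t0) / 100 || ' s');
        rollback;
        -- bind variable: one statement text, one shared cursor
        t0 := dbms_utility.get_time;
        for i in 1 .. 20000 loop
            execute immediate 'delete from bind_test where id = :1' using i;
        end loop;
        dbms_output.put_line('binds: ' || (dbms_utility.get_time - t0) / 100 || ' s');
        rollback;
    end;
    /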
    Regards,
    Randolf
    Oracle related stuff blog:
    http://oracle-randolf.blogspot.com/
    SQLTools++ for Oracle (Open source Oracle GUI for Windows):
    http://www.sqltools-plusplus.org:7676/
    http://sourceforge.net/projects/sqlt-pp/
