SQL report performance problem

I have a SQL classic report in Apex 4.0.2 on database 11.2.0.2.0 with a performance problem.
The report is based on a PL/SQL function returning a query. The query is based on a view and PL/SQL functions. The Apex parsing schema has a SELECT grant on the view only, not on the underlying objects.
The generated query runs in 1-2 seconds in SQL*Plus (logged in as the Apex parsing schema user), but takes many minutes in Apex. By monitoring the database sessions via TOAD, I found that the explain plans in the Apex and SQL*Plus sessions are very different.
The summary:
In SQL*Plus: SELECT STATEMENT ALL_ROWS Cost: 3,695
In Apex: SELECT STATEMENT ALL_ROWS Cost: 3,108,551
What could be the cause of this?
I found a blog and a Metalink note about different explain plans for different users. They suggested setting optimizer_secure_view_merging='FALSE', but that didn't help.

Hmmm, it runs fast again in SQL Workshop. I didn't expect that, because both the application and SQL Workshop use SYS.DBMS_SYS_SQL to parse the query.
Only the explain plan doesn't show anything.
To add: I changed the report source to the query the PL/SQL function would generate, so the SELECTs are the same in SQL Workshop and in the application. Still, in the application it's horribly slow.
So, Apex does do something different in the application compared to SQL Workshop.
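To see what the Apex session is really doing, rather than what EXPLAIN PLAN predicts, one can pull the actual cursor plan from the shared pool while the slow page renders. This is only a sketch; the parsing schema name is an assumption:

```sql
-- Find the slow statement's sql_id while the Apex page is running
SELECT sql_id, child_number, elapsed_time/1e6 AS elapsed_secs, sql_text
FROM   v$sql
WHERE  parsing_schema_name = 'MY_PARSING_SCHEMA'  -- assumption: your Apex parsing schema
ORDER  BY elapsed_time DESC;

-- Show the plan actually used by that cursor
SELECT *
FROM   TABLE(dbms_xplan.display_cursor('&sql_id', NULL, 'ALLSTATS LAST'));
```

('ALLSTATS LAST' only shows actual row counts when statistics collection is enabled; the 'TYPICAL' format works without it.)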
Edited by: InoL on Aug 5, 2011 4:50 PM

Similar Messages

  • Interactive report performance problem over database link - Oracle Gateway

    Hello all;
    This is regarding a thread Interactive report performance problem over database link that was posted by Samo.
The issue that I am facing is that when I use an Oracle function like apex_item.checkbox, the query slows down by 45 seconds.
The query is like this (due to sensitivity issues, I cannot disclose real table names):
    SELECT apex_item.checkbox(1,b.col3)
    , a.col1
    , a.col2
    FROM table_one a
    , table_two b
    WHERE a.col3 = 12345
    AND a.col4 = 100
    AND b.col5 = a.col5
    table_one and table_two are remote tables (non-oracle) which are connected using Oracle Gateway.
Now if I run the above query without the apex_item.checkbox function, the response is less than a second, but with apex_item.checkbox the query runs for more than 30 seconds. I have worked around the issue by creating a collection, but it's not good practice.
I would like to get ideas from people on how to resolve or speed up the query.
Any idea how to use sub-factoring for the above scenario? Or other methods (creating a view or materialized view is not an option)?
    Thank you.
    Shaun S.

    Hi Shaun
    Okay, I have a million questions (could you tell me if both tables are from the same remote source, it looks like they're possibly not?), but let's just try some things first.
By now you should understand the idea of what I termed 'sub-factoring' in a previous post. This is to do with using the WITH blah AS (SELECT... syntax. In most circumstances this 'materialises' the results of the inner select statement, meaning that we 'get' the results and then do something with them afterwards. It's a handy trick when dealing with remote sites, as sometimes you want the remote database to do the work. The reason I ask you to use the MATERIALIZE hint for testing is just to force this; in 99.99% of cases it can be removed later. A WITH clause is also handled differently from an inline view like SELECT * FROM (SELECT..., but the same result can be mimicked with a NO_MERGE hint.
    Looking at your case I would be interested to see what the explain plan and results would be for something like the following two statements (sorry - you're going have to check them, it's late!)
WITH a AS
  (SELECT /*+ MATERIALIZE */ *
   FROM table_one),
b AS
  (SELECT /*+ MATERIALIZE */ *
   FROM table_two),
sourceqry AS
  (SELECT b.col3 x
        , a.col1 y
        , a.col2 z
   FROM a
      , b
   WHERE a.col3 = 12345
   AND   a.col4 = 100
   AND   b.col5 = a.col5)
SELECT apex_item.checkbox(1, x), y, z
FROM sourceqry
WITH a AS
  (SELECT /*+ MATERIALIZE */ *
   FROM table_one),
b AS
  (SELECT /*+ MATERIALIZE */ *
   FROM table_two)
SELECT apex_item.checkbox(1, b.col3), a.col1, a.col2
FROM a
   , b
WHERE a.col3 = 12345
AND   a.col4 = 100
AND   b.col5 = a.col5
If the remote tables are at the same site, then you should have the same results. If they aren't, you should get the same results, but different to the original query.
We aren't being told the real cardinality of the inner selects here, so the explain plan is distorted (this is normal for queries on remote and especially non-Oracle sites). This hinders tuning normally, but I don't think this is your problem at all. How many distinct values do you normally get for the column aliased 'x', and how many rows are normally returned in total? Also, how are you testing response times: in APEX, SQL Developer, Toad, SQL*Plus etc.?
    Sorry for all the questions but it helps to answer the question, if I can.
    Cheers
    Ben
    http://www.munkyben.wordpress.com
    Don't forget to mark replies helpful or correct ;)

  • Sub Report Performance problem

    Our current software configuration is Microsoft SQL Server 2008 R2 (10.0.2531.0) where our data is housed. Our User Interface is Microsoft Visual Studio 2008 (Visual Basic) and Microsoft .NET 3.5 SP1 platform.  We use Crystal Reports 2008 (12.0.0.683) to do our reporting.
We upgraded from Crystal Reports XI and our reports are taking a long time to render. Is there any evidence of other companies experiencing this trouble? Are there any degraded-performance issues when moving from XI to 2008?
In some cases we utilize a small, simple subreport.
    Any help appreciated. Thanks.

    Hi Billy,
Subreports in detail sections are never a good idea. For example, if you have 100 records returned in the main report and that field is your subreport link, it causes the subreport to run 100 times. Your database is being queried 101 times.
    Also, the default time out for database is typically 1 minute, could be you need to verify your report and database so CR can "clean up" any wrong pointers.
    It's always better to do the data collection server side if and when possible, after all that's what they do best.
Not much detail to go on, so I can't say whether it's a report design problem beyond the above or not. It could also be that .NET takes time to load all of the CR assemblies, so the delay depends on how you have your app configured. If you load a dummy report when your app starts up to pre-load the CR runtime, then subsequent reports will be much faster.
    Also check off all of the Verify Database options, you may be telling CR to verify each time which on big databases can cause a delay also.
    Thank you
    Don

  • Report performance problem

    Hi
    My problem is with performance in reports. The reports I use is 'Reports From SQL Query' and I have the following select statement.
    select * from lagffastadr
    where kommun like :bind_kommun
    and adrniva3 like :bind_adrniva3
    and adrnr like :bind_adrnr
    and adrlitt like :bind_adrlit
    and :bind_show = 1
this works fine, but I have one big problem. The user may use the wildcard % to select all, and this gives me a new problem. The table I use contains NULL values, so with % we don't get all the rows, only those that have a value. This can be solved by adding
    where (kommun like :bind_kommun OR kommun IS NULL)
this will give me all the rows. But then it takes a long time to evaluate, since the SQL query doesn't use the index that I have created. Is there another way to solve this, or do I have to change all the NULL values in my database? (I'd rather not.)
    thanks
/Jörgen Swensen
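One common workaround for the NULL-vs-wildcard problem above is a function-based index over NVL, queried with the same expression, so the NULL rows become indexable. A sketch; the index name and the '~' sentinel are assumptions:

```sql
-- Function-based index: rows with NULL kommun get the sentinel value indexed
CREATE INDEX lagffastadr_kommun_fbi
  ON lagffastadr (NVL(kommun, '~'));

-- Query the same expression; the wildcard % now matches the NULL rows too
SELECT *
FROM   lagffastadr
WHERE  NVL(kommun, '~') LIKE :bind_kommun;
```

The same pattern applies to the other three bind columns; the sentinel just needs to be a value that cannot occur in real data.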

    It looks more like a query tuning problem. Please try the SQL, PL/SQL forum.
    You can expect a better answer there.
    Thanx,
    Chetan.

  • Report performance problems on htmldb 1.6

    Hello everybody,
I'm not a guru in either PL/SQL or HTML DB, and I'm having slow performance on two queries that are similar, so I think the root of the problem is the same. Here they are:
This query populates a select list and takes about 2 minutes:
    SELECT FLAG_VERBAL,FLAG_NUMBER
    FROM DISCHARGE_FLAG
    WHERE FLAG_NUMBER IN (SELECT UNIQUE(FLAG) FROM HYDRODATA
WHERE LSFLD_ID IN (SELECT ID FROM DEMO_SIMO_CGMS_UPPDANUBE_ID))
This query is in a report (SQL Query region) and takes about 100 seconds:
    SELECT LSFLD_ID, MOMENT, DISCHARGE, FLAG
    FROM HYDRODATA
    WHERE
    FLAG = to_number(:P6_DISCHARGE_TYPE)
AND LSFLD_ID IN (
  SELECT ID FROM DEMO_SIMO_CGMS_UPPDANUBE_ID)
AND moment >= to_date(:P6_FROM_DATE, 'yyyy-mm-dd')
AND moment <= to_date(:P6_TO_DATE, 'yyyy-mm-dd')
ORDER BY 1, 2
By my testing, it seems that the problem is correlated to the sub-SELECT inside the main SELECT.
If I substitute the sub-SELECT statement (SELECT ID FROM DEMO_SIMO_CGMS_UPPDANUBE_ID) with a list of its constant values, the output of those queries appears in a few seconds.
Another strange thing: if I run the sub-select SELECT ID FROM DEMO_SIMO_CGMS_UPPDANUBE_ID on its own, it also takes only a few seconds!
So why do they need so much time together?
    Maybe the problem is in the table DEMO_SIMO_CGMS_UPPDANUBE_ID? It is populated by a process that:
    deletes its content and
    copies the selected values (by check boxes) of a previous Report :
    DELETE FROM DEMO_SIMO_CGMS_UPPDANUBE_ID;
    FOR I in 1..HTMLDB_APPLICATION.G_F01.COUNT LOOP
        INSERT INTO DEMO_SIMO_CGMS_UPPDANUBE_ID(id) VALUES (HTMLDB_APPLICATION.G_F01(i));
END LOOP;
Thanks in advance! :D

    This is really a SQL question and not a HTML DB question, but I will pose some things for you to consider. Just some high-level tips without getting into the details of your queries.
    1) Time your queries in SQL*Plus. I would guess that they are as slow in SQL*Plus as they are in HTML DB
    2) Keep in mind that use of functions like TO_NUMBER or TO_CHAR or TO_DATE applied to database columns disables the use of indexing on columns.
    3) Look to see if the columns you are joining on are indexed.
    4) Use EXPLAIN PLAN and tracing to see what execution plan Oracle is taking.
    5) Rewrite your queries. Join your three tables together in one SELECT statement instead of using IN's.
    For example change your first query to:
    SELECT FLAG_VERBAL,FLAG_NUMBER
FROM DISCHARGE_FLAG df, HYDRODATA h, DEMO_SIMO_CGMS_UPPDANUBE_ID e
    WHERE df.FLAG_NUMBER = h.FLAG
    AND h.LSFLD_ID = e.ID
    Mike

  • PowerPivot report performance problem on Sharepoint

    Hi everybody,
    I have created a PowerPivot report based on SQL Server tables.
When I use this report on my computer, the response times are really good.
I get data from two different tables which are linked by an integer key in my model.
It means that in my pivot I choose my fields from two different "tables".
Let's say I select 10 fields in one table and 2 in the other one.
When I put it on the SharePoint server, the report is not correctly loaded.
When I put calculated columns into my first table and use lookups to get the 2 fields from my second table, the performance is far better and the report works on SharePoint.
    I don't understand why.
    Any idea?
    Thanks
    Pete
    P.S: I work on PowerPivot for Excel 2010 and  SP 2010 Server

    This is an issue my client is having at the moment. As of right now they are going through a MS ticket to resolve it. Will update when I hear more.
    -Link

  • Interactive report performance problem over database link

    Hi gurus,
I have an interactive report that retrieves values from two joined tables, both on the same remote database. It takes 45 seconds to populate (refresh) a page after issuing a search. If I catch the actual select that is generated by APEX (the one with count(*) over ()...) and run it from a SQL environment, it takes 1 or 2 seconds. What on earth consumes the rest of the time, and how can I speed this up? I would like to avoid creating and maintaining local materialized views if possible.
    Regards
    Samo

    Hi
    APEX normally needs to return the full result set for the purposes of pagination (where you have it set to something along the lines of show x-y of z), changing this or the max row count can affect performance greatly.
    The driving site hint would need to refer to the name of the view, but can be a bit temperamental with this kind of thing. The materialize hint only works for sub-factored queries (in a 'WITH blah AS(SELECT /*+ MATERIALIZE */ * FROM etc. etc.)). They tend to materialize anyway and its best not to use hints like that for production code unless there is absolutely no other option, but just sub factoring without hints can often have a profound effect on performance. For instance
WITH a AS
  (SELECT c1,
          c2,
          c3,
          c4,
          c5
   FROM schema1.view1)
, b AS
  (SELECT c1,
          c2,
          c3
   FROM schema1.view2)
SELECT *
FROM a, b
WHERE a.c5 = b.c3
may produce a different plan, or even sub-factoring just one of the external tables may have an effect.
You need to try things out here, experiment, and keep looking at the execution plans. You should also change Toad's row fetch setting to all 9's to get a real idea of performance.
    Let me know how you get on.
    Cheers
    Ben
    http://www.munkyben.wordpress.com
    Don't forget to mark replies helpful or correct ;)
    Edited by: Munky on Sep 29, 2009 1:41 PM

  • Reports Performance Problem

    I have a report that contains 17 queries, which populate 17 layouts. As you can imagine, it is very slow. It takes about a minute before the report will display on the client PC. In a form which calls this report I have a pl/sql routine that populates some display fields with the same data as the report. This pl/sql trigger returns the data in under a second. What am I doing wrong in the report? How can I improve the performance of the report and thereby shorten the execution time? Please help. Thanks in advance.

    Richard,
    This is difficult to say because a lot depends on the complexity of the queries and the complexity of the layouts. There are some statistics you can gather from Reports by using the trace profile command (see the online help or the following document for syntax). Download the "Tuning Oracle Reports" from the Reports product page here at OTN. Here is the link http://technet.oracle.com/products/reports/pdf/275641.pdf
This will give you some ideas on how to improve the performance of your report. Thanks for using Oracle Reports.
    Regards,
    The Oracle Reports Team.

  • PL/SQL report header problems

    Hi, I have PL/SQL Headings for my reports. Is there a way to use the sorting feature nevertheless?

    Hi John,
    Sorry for not explaining into more details.
I have a PL/SQL heading in a report; in this particular case, I have 2 headers in one column (one of them shared with another column).
When I enable sorting on a normal column (with one header), I see no link, and the sorting image (the arrow) appears outside the report.
So I tried to put the sorting code into the PL/SQL heading code, but it was not possible, as I do not know the region_id.
    Thanks!

  • Sql loader performance problem with xml

    Hi,
I have to load a 400 MB XML file into the free Oracle DB on my local machine.
I have tested a one-record XML and was able to load it successfully, but the 400 MB file has been sitting there for half an hour and hasn't even started.
Is that normal? Is there any chance it will load if I just wait?
Is there any faster solution?
I have created the table below:
CREATE TABLE test_xml
( COL_ID  VARCHAR2(1000),
  IN_FILE XMLTYPE )
XMLTYPE IN_FILE STORE AS CLOB;
and the control file below:
LOAD DATA
CHARACTERSET UTF8
INFILE 'test.xml'
APPEND
INTO TABLE product_xml
( col_id  FILLER CHAR(1000),
  in_file LOBFILE(CONSTANT "test.xml") TERMINATED BY EOF )
Am I doing anything wrong? Thanks for any advice.

    SQL*Loader: Release 11.2.0.2.0 - Production on H. Febr. 11 18:57:09 2013
    Copyright (c) 1982, 2009, Oracle and/or its affiliates. All rights reserved.
    Control File: prodxml.ctl
    Character Set UTF8 specified for all input.
    Data File: test.xml
    Bad File: test.bad
    Discard File: none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 5000
    Bind array: 64 rows, maximum of 256000 bytes
    Continuation: none specified
    Path used: Conventional
    Table PRODUCT_XML, loaded from every logical record.
    Insert option in effect for this table: APPEND
    Column Name Position Len Term Encl Datatype
    COL_ID FIRST 1000 CHARACTER
    (FILLER FIELD)
    IN_FILE DERIVED * EOF CHARACTER
    Static LOBFILE. Filename is bv_test.xml
    Character Set UTF8 specified for all input.
    SQL*Loader-605: Non-data dependent ORACLE error occurred -- load discontinued.
    ORA-01652: unable to extend temp segment by 128 in tablespace TEMP
    Table PRODUCT_XML:
    0 Rows successfully loaded.
    0 Rows not loaded due to data errors.
    0 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.
    Space allocated for bind array: 256 bytes(64 rows)
    Read buffer bytes: 1048576
    Total logical records skipped: 0
    Total logical records rejected: 0
    Total logical records discarded: 0
    Run began on H. Febr. 11 18:57:09 2013
    Run ended on H. Febr. 11 19:20:54 2013
    Elapsed time was: 00:23:45.76
    CPU time was: 00:05:05.50
This is the log.
I have truncated everything; I cannot understand why I am unable to load 400 MB into 4 GB.
Windows, 32-bit, not licensed.
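The ORA-01652 in the log points at the TEMP tablespace running out of space, not the target tablespace. A sketch of how one might check and enlarge it (the file path and sizes are assumptions):

```sql
-- Check the current temp file(s) and whether they can grow
SELECT tablespace_name, file_name, bytes/1024/1024 AS mb, autoextensible
FROM   dba_temp_files;

-- Let the temp file grow as needed (path is an assumption)
ALTER DATABASE TEMPFILE '/u01/oradata/XE/temp01.dbf'
  AUTOEXTEND ON NEXT 64M MAXSIZE 4G;
```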

  • ABAP Report Performance Problem

    Hello Experts,
    Check the folowing code and suggest me a solution.
IF BSIS-BLART = 'KR'.
  SELECT SINGLE LIFNR AS LIFNR2 FROM BSAK INTO CORRESPONDING FIELDS OF ITAB
    WHERE BELNR = ITAB-BELNR AND GJAHR = ITAB-GJAHR.
  SELECT SINGLE NAME1 AS NAME2 FROM LFA1 INTO CORRESPONDING FIELDS OF ITAB
    WHERE LIFNR = ITAB-LIFNR2.
  WRITE: 42 ITAB-LIFNR2, 56 ITAB-NAME2.
ELSEIF BSIS-BLART = 'SA'.
  SELECT SINGLE HKONT FROM BSIS INTO CORRESPONDING FIELDS OF ITAB
    WHERE BELNR = ITAB-BELNR AND GJAHR = ITAB-GJAHR AND BUKRS = ITAB-BUKRS.
  SELECT SINGLE TXT20 FROM SKAT INTO CORRESPONDING FIELDS OF ITAB
    WHERE SAKNR = BSIS-HKONT.
ENDIF.
In the above query, the tables BSAK, BSIS and SKAT contain lakhs and lakhs of records; it gives a timeout dump on the test server. Kindly provide me a solution for the same.
    Regards,
    Senthil Kumar

    Hi,
it's inside a SELECT ... ENDSELECT, like this:
    select * from covp where kokrs eq itab-kokrs and
                                 lednr eq '00' and
                                 objnr eq itab-objnr and
                                 versn eq '000' and
                                 wrttp eq '04' and
                                 gjahr ge fyr.
                                   itab-belnr = covp-refbn.
          if sy-subrc eq 0.
            itab-kstar = covp-kstar.
            itab-budat = covp-budat.
            itab-refbn = covp-refbn.
            itab-gjahr = covp-gjahr.
           itab-ebeln = covp-ebeln.
            itab-ebelp = covp-ebelp.
            itab-awtyp = covp-awtyp.
            itab-mbgbtr = covp-mbgbtr.
    Here the above coding comes.
    endselect.
So, how can I proceed with this?

  • About report performance

    Hi Friends,
I created a report with 45 ref cursors.
All the ref cursors are in a package,
and the package is on the database side.
The report is on the report server.
If I start to run the report through the application,
the report takes about 50% of CPU for around 40 seconds.
Is this a report performance problem?
If I have more ref cursors in a report,
is there any problem with report performance?
Can somebody help me?

    One performance consideration I'd do is try to avoid multiple similar queries or even repeats of the same query.
Is
from invoice
where trunc(invoice_date) between :date1 and :date2
and currency_code = '$' -- sometimes 'euro' and so on
and ISSUE_PLACE = 'xx'
and investor_code = :investor_code;
return(v_comm*5.5137);
in the main query? Can those formulas be included in or replaced by the main query? Are appropriate indexes created for the joins?
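For instance, a per-row commission formula like the one above can often be folded into the main query as a scalar subquery, so it is computed in a single pass. All table and column names here are assumptions, and the half-open date range avoids TRUNC() on the (presumably indexed) date column:

```sql
SELECT m.*,
       (SELECT SUM(i.comm) * 5.5137
        FROM   invoice i
        WHERE  i.invoice_date >= :date1
        AND    i.invoice_date <  :date2 + 1  -- replaces trunc(invoice_date) BETWEEN
        AND    i.currency_code = m.currency_code
        AND    i.issue_place   = m.issue_place
        AND    i.investor_code = m.investor_code) AS commission
FROM   main_table m;
```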

  • Performance problems - query on the copy of a table is faster than the orig

    Hi all,
    I have sql (select) performance problems with a specific table (costs 7800)(Oracle 10.2.0.4.0, Linux).
    So I copied the table 1:1 (same structure, same data, same indexes etc. ) under the same area (database, user, tablespace etc.) and gathered the table_stats with the same parameters. The same query on this copied table is faster and the costs are just 3600.
    Why for gods sake is the query on this new table faster??
    I appreciate any idea.
    Thank you!
    Fibo
    Edited by: user954903 on 13.01.2010 04:23

    Could you please share more information/link which can elaborate the significance of using SHRINK clause.
    If this is so useful and can shrink the unused space , why not my database architect has suggested this :).
Any disadvantage also?
It moves the high-water mark back to the lowest position, so full table scans can work faster in some cases. It can also reduce the number of migrated rows and the number of used blocks.
The disadvantage is that it involves row movement, so operations based on rowid are not reliable during the shrink.
    I think it is even better to stop all operations on the table when shrinking and disable all triggers. Another problem is that this process can take a long time.
    Guru's, please correct me if I'm mistaken.
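A minimal sketch of the shrink sequence being discussed (the table name is an assumption; requires 10g or later and an ASSM tablespace):

```sql
ALTER TABLE my_table ENABLE ROW MOVEMENT;
ALTER TABLE my_table SHRINK SPACE CASCADE;  -- moves rows, then lowers the high-water mark
ALTER TABLE my_table DISABLE ROW MOVEMENT;
```

SHRINK SPACE COMPACT does the row movement without moving the high-water mark, so it can be run first during busy periods and the final SHRINK SPACE done later.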
    Edited by: Oleg Gorskin on 13.01.2010 5:50

  • Performance problem in Zstick report...

    Hi Experts,
I am facing a performance problem in a custom stock report of Material Management.
In this report I fetch all the materials with their batches to get the desired output; at a time this report processes 36,000-plus unique combinations of material and batch.
The report takes around 30 minutes to execute, and it is to be viewed regularly, every 2 hours.
To read the batch characteristic values I am using FM '/SAPMP/CE1_BATCH_GET_DETAIL'.
Is there any way to increase the performance of this report? The output of the report is in ALV.
Can I have a refresh button in the report so that the data gets refreshed automatically without executing it again, or is there any cache memory concept?
Note: I have declared all the itabs with TYPE SORTED, and all the select queries fetch with key and index.
    Thanks
    Rohit Gharwar

    Hello,
SE30 is old. Switch on an ST12 trace while running this program and identify where exactly most of the time is being spent. If you see high CPU time, the problem is with the ABAP code; you can figure out the exact program/function module from the ST12 trace. If you see high database time in ST12, the problem is a database-related issue, so you basically have to analyze the SQL statements from the performance traces in ST12. This should resolve your issue.
    Yours Sincerely
    Dileep

  • PL/SQL Performance problem

    I am facing a performance problem with my current application (PL/SQL packaged procedure)
My application takes data from 4 temporary tables, does a lot of validation, and
puts the data into permanent tables (updates if present, else inserts).
    One of the temporary tables is parent table and can have 0 or more rows in
    the other tables.
    I have analyzed all my tables and indexes and checked all my SQLs
    They all seem to be using the indexes correctly.
    There are 1.6 million records combined in all 4 tables.
    I am using Oracle 8i.
    How do I determine what is causing the problem and which part is taking time.
    Please help.
    The skeleton of the code which we have written looks like this
    MAIN LOOP ( 255308 records)-- Parent temporary table
    -----lots of validation-----
    update permanent_table1
    if sql%rowcount = 0 then
    insert into permanent_table1
    Loop2 (0-5 records)-- child temporary table1
    -----lots of validation-----
    update permanent_table2
    if sql%rowcount = 0 then
    insert into permanent_table2
    end loop2
    Loop3 (0-5 records)-- child temporary table2
    -----lots of validation-----
    update permanent_table3
    if sql%rowcount = 0 then
    insert into permanent_table3
    end loop3
    Loop4 (0-5 records)-- child temporary table3
    -----lots of validation-----
    update permanent_table4
    if sql%rowcount = 0 then
    insert into permanent_table4
    end loop4
    -- COMMIT after every 3000 records
    END MAIN LOOP
    Thanks
    Ashwin N.

Do this instead of ditching the PL/SQL.
    DECLARE
    TYPE NumTab IS TABLE OF NUMBER(4) INDEX BY BINARY_INTEGER;
    TYPE NameTab IS TABLE OF CHAR(15) INDEX BY BINARY_INTEGER;
    pnums NumTab;
    pnames NameTab;
    t1 NUMBER(5);
    t2 NUMBER(5);
    t3 NUMBER(5);
    BEGIN
    FOR j IN 1..5000 LOOP -- load index-by tables
    pnums(j) := j;
    pnames(j) := 'Part No. ' || TO_CHAR(j);
    END LOOP;
    t1 := dbms_utility.get_time;
    FOR i IN 1..5000 LOOP -- use FOR loop
    INSERT INTO parts VALUES (pnums(i), pnames(i));
    END LOOP;
    t2 := dbms_utility.get_time;
    FORALL i IN 1..5000 -- use FORALL statement
    INSERT INTO parts VALUES (pnums(i), pnames(i));
t3 := dbms_utility.get_time;
    dbms_output.put_line('Execution Time (secs)');
    dbms_output.put_line('---------------------');
    dbms_output.put_line('FOR loop: ' || TO_CHAR(t2 - t1));
    dbms_output.put_line('FORALL: ' || TO_CHAR(t3 - t2));
    END;
    Try this link, http://download-west.oracle.com/docs/cd/B10501_01/appdev.920/a96624/05_colls.htm#23723
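Applied to the original update-else-insert pattern, the same bulk idea can be sketched like this. Table and column names are assumptions; SQL%BULK_ROWCOUNT (available in 8i) tells you which updates hit no rows:

```sql
DECLARE
  TYPE num_tab IS TABLE OF NUMBER        INDEX BY BINARY_INTEGER;
  TYPE val_tab IS TABLE OF VARCHAR2(100) INDEX BY BINARY_INTEGER;
  ids  num_tab;
  vals val_tab;
BEGIN
  -- Bulk-fetch the parent temporary table instead of looping row by row
  SELECT id, col1 BULK COLLECT INTO ids, vals FROM temp_parent;

  -- One bulk UPDATE for all rows
  FORALL i IN 1..ids.COUNT
    UPDATE permanent_table1 SET col1 = vals(i) WHERE id = ids(i);

  -- Insert only the rows the UPDATE missed
  FOR i IN 1..ids.COUNT LOOP
    IF SQL%BULK_ROWCOUNT(i) = 0 THEN
      INSERT INTO permanent_table1 (id, col1) VALUES (ids(i), vals(i));
    END IF;
  END LOOP;
  COMMIT;
END;
/
```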
