Problem w/ Oracle temp tables and WL61SP3

I have an odd problem that arises when I use Weblogic 6.1, Oracle, callable statements, and a stored procedure that uses temporary tables. Specifically, I don't get the result set back that I expect.
The stored procedure first populates a temporary table, then joins the temporary table with additional tables to build the final result set, which it returns as a cursor.
I've tested this procedure and it works fine when called from PL/SQL and from JBoss (using Oracle's driver). When I switch to Weblogic 6.1, *using the same database and the same Oracle driver*, the returned result set (cursor) has no records.
I've added additional debugging to the stored procedure and found that it is indeed populating the temporary table, but for some reason the select/join acts as if the temporary table has no records. Similar 'exists' or 'in' clauses likewise do not work.
In playing with the stored procedure I found that removing the join with the
temporary table brings back results. This isn't functionally correct--i.e. the
final result set has too many rows--but it confirmed that the problem lies in
the temporary tables.
Again, we've developed to run on multiple app servers, so we can switch from JBoss to Weblogic simply by running an ant task. The same callable statement executed using the same driver, but from different app servers, leads to different results.
The next step is to try to mess with the 'on commit preserve rows' clause in the
'create global temporary table' section and see if this has any effect. This procedure
doesn't perform any commits, however, so this shouldn't produce any changes.
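For reference, the call from Java looks roughly like the sketch below. The table and procedure names are made up for illustration (they are not our real schema), and depending on the driver version the OracleTypes import may be oracle.jdbc.driver.OracleTypes instead of oracle.jdbc.OracleTypes:

// Illustrative shape of the database side (not the real DDL):
//   CREATE GLOBAL TEMPORARY TABLE tmp_filter (id NUMBER) ON COMMIT DELETE ROWS;
//   PROCEDURE get_report(p_result OUT SYS_REFCURSOR);  -- fills tmp_filter, then opens the join
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.ResultSet;
import oracle.jdbc.OracleTypes;

public class ReportCall {
    public static void run(Connection con) throws Exception {
        // Worth logging on both app servers: with autocommit on, an implicit commit
        // follows each statement, and a commit empties an ON COMMIT DELETE ROWS table.
        System.out.println("autocommit=" + con.getAutoCommit());
        CallableStatement cs = con.prepareCall("{call get_report(?)}");
        cs.registerOutParameter(1, OracleTypes.CURSOR);
        cs.execute();
        ResultSet rs = (ResultSet) cs.getObject(1);
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
        rs.close();
        cs.close();
    }
}

I'll also compare what that autocommit line prints under JBoss and under Weblogic, since an implicit commit after each statement interacts with the 'on commit' clause mentioned above.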
Any suggestions? Thanks in advance.
bill milbratz

william milbratz wrote:
[...] When I switch to Weblogic 6.1, *using the same database and the same Oracle driver*, the returned result set (cursor) has no records.

First, make sure you are using the same driver, by ensuring that the driver you want is in front of all the weblogic classes in the classpath that the server startup script creates for the server. We ship a version of Oracle's thin driver, so it could be picked up instead of yours if the classpath is weblogic-first.
Second, and this is a long shot, are you using our connection pools? I guess not,
but let me know...
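A quick way to verify both of those points from your own code is to log the JDBC metadata and the concrete class of the connection you are handed. This is plain JDBC plus a class-name print, nothing WebLogic-specific:

import java.sql.Connection;
import java.sql.DatabaseMetaData;
import java.sql.SQLException;

public class DriverCheck {
    public static void log(Connection con) throws SQLException {
        DatabaseMetaData md = con.getMetaData();
        // The driver name/version shows which Oracle driver was actually loaded;
        // the connection class name shows whether a pool wrapper sits in the middle.
        System.out.println("Driver:     " + md.getDriverName());
        System.out.println("Version:    " + md.getDriverVersion());
        System.out.println("Connection: " + con.getClass().getName());
    }
}

Run it from the same code path on both servers and compare the output.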
Joe

Similar Messages

  • Temp table, and gather table stats

    One of my developers is generating a report from Oracle. He loads a subset of the data he needs into a temp table, then creates an index on the temp table, and then runs his report from the temp table (which is a lot smaller than the original table).
    My question is: is it necessary to gather table statistics for the temp table, and the index on the temp table, before querying it?

    It depends. Just yesterday I had a very bad experience with stats: one of my tables had NUM_ROWS = 300 while count(*) returned 7 million, on database version 9.2.0.6 (a release with bad optimizer bugs), so queries started breaking with a lot of buffer busy and latch free waits. It took a while to figure out, but after I deleted the stats everything came back under control. My point is that statistics can be both good and bad: once you start collecting them, you should keep an eye on them.
    Thanks.

  • Difference between Temp table and Variable table and which one is better performance wise?

    Hello,
    Could anyone explain the difference between temp tables (#, ##) and table variables (DECLARE @V TABLE (EMP_ID INT))?
    Which one is recommended for better performance?
    Also, is it possible to create CLUSTERED and NONCLUSTERED indexes on a table variable?
    In my case, 1-2 days of transactional data comes to more than 3-4 million rows. I tried both a # temp table and a table variable and found the table variable faster.
    Is that Table variable using Memory or Disk space?
    Thanks Shiven:) If Answer is Helpful, Please Vote

    Check the following link to see the differences between temp tables and table variables: http://sqlwithmanoj.com/2010/05/15/temporary-tables-vs-table-variables/
    Temp tables and table variables both use memory and tempdb in a similar manner; check this blog post: http://sqlwithmanoj.com/2010/07/20/table-variables-are-not-stored-in-memory-but-in-tempdb/
    Performance wise, if you are dealing with millions of records then a temp table is ideal, as you can create explicit indexes on top of it. But if there are fewer records then table variables are well suited.
    On table variables explicit indexes are not allowed; if you define a PK column, a clustered index will be created automatically.
    But it also depends upon the specific scenario you are dealing with; can you share it?
    ~manoj | email: http://scr.im/m22g
    http://sqlwithmanoj.wordpress.com
    MCCA 2011 | My FB Page

  • Query is taking too much time for inserting into a temp table and for spooling

    Hi,
    I am working on a query optimization project where I have found a query that takes a very long time to execute.
    Temp table is defined as follows:
    DECLARE @CastSummary TABLE (CastID INT, SalesOrderID INT, ProductionOrderID INT, Actual FLOAT,
    ProductionOrderNo NVARCHAR(50), SalesOrderNo NVARCHAR(50), Customer NVARCHAR(MAX), Targets FLOAT)
    SELECT
    C.CastID,
    SO.SalesOrderID,
    PO.ProductionOrderID,
    F.CalculatedWeight,
    PO.ProductionOrderNo,
    SO.SalesOrderNo,
    SC.Name,
    SO.OrderQty
    FROM
    CastCast C
    JOIN Sales.Production PO ON PO.ProductionOrderID = C.ProductionOrderID
    join Sales.ProductionDetail d on d.ProductionOrderID = PO.ProductionOrderID
    LEFT JOIN Sales.SalesOrder SO ON d.SalesOrderID = SO.SalesOrderID
    LEFT JOIN FinishedGoods.Equipment F ON F.CastID = C.CastID
    JOIN Sales.Customer SC ON SC.CustomerID = SO.CustomerID
    WHERE
    (C.CreatedDate >= @StartDate AND C.CreatedDate < @EndDate)
    It takes almost 33% for the Table Insert when I insert the data into a temp table and then 67% for spooling. I removed 2 LEFT JOINs from the above query, made them plain JOINs, and tried again. Query execution became a bit faster, but it still needs improvement.
    How can I improve it further? Will it be good enough if I create indexes on the columns of the temp table, or what if I use derived tables? Please suggest.
    -Pep

    How can I improve it further? Will it be good enough if I create indexes on the columns of the temp table, or what if I use derived tables?
    I suggest you start with index tuning. Specifically, make sure the columns specified in the WHERE and JOIN clauses are properly indexed (ideally clustered or covering, and unique when possible). Changing outer joins to inner joins is appropriate
    if you don't need the outer joins in the first place.
    Dan Guzman, SQL Server MVP, http://www.dbdelta.com

  • Insert data from an tabular to a temp table and fetching a columns.

    Hi guys ,
    I am working in APEX 3.2. On one page I have data from various tables displayed in a tabular form. I have to insert the tabular form data into a temp table, then fetch the data from the temp table and insert it into my main table. I think I have to use a cursor to fetch the data from the temp table and insert it into the main table, but I haven't found a good example of doing this. Can anyone help me sort it out?
    Thanks With regards
    Balaji

    Hi,
    Follow this scenario.
    Your Query:
    SELECT t1.col1, t1.col2, t2.col1, t2.col2, t3.col1
    FROM table1 t1, table2 t2, table3 t3
    (where some join conditions);

    On the insert button click, call this process:
    DECLARE
    temp1 VARCHAR2(100);
    temp2 VARCHAR2(100);
    temp3 VARCHAR2(100);
    temp4 VARCHAR2(100);
    temp5 VARCHAR2(100);
    BEGIN
         FOR i IN 1..apex_application.g_f01.COUNT
         LOOP
              temp1    := apex_application.g_f01(i);
              temp2    := apex_application.g_f02(i);
              temp3    := apex_application.g_f03(i);
              temp4    := apex_application.g_f04(i);
              temp5    := apex_application.g_f05(i);
              INSERT INTO table1(col1, col2) VALUES(temp1, temp2);
              INSERT INTO table2(col1, col2) VALUES(temp3, temp4);
              INSERT INTO table3(col1) VALUES(temp5);
         END LOOP;
    END;

    You don't even need temp tables and a cursor to insert into the different tables.
    Thanks,
    Ramesh P.
    *(If this was the correct or a helpful answer, please mark it accordingly.)*

  • Global temp table and edit

    Hi all,
    Can someone tell me why, when I create a GTT and insert data like the following, I get an 'insert 14 rows' message, but when I run a select statement from SQL Workshop I sometimes get the data and sometimes don't? My understanding is that this data is supposed to stay for the duration of my logon session and then get cleaned out when I exit the session.
    I am developing a screen in APEX and will use this temp table for the user to do some editing work. Once the editing is done, I save the data into a static table. Can this be done? So far every attempt to update the temp table results in 0 rows updated, and the temp table reverts back to 0 rows. Can you help me?
    CREATE GLOBAL TEMPORARY TABLE "EMP_SESSION"
    (     "EMPNO" NUMBER NOT NULL ENABLE,
         "ENAME" VARCHAR2(10),
         "JOB" VARCHAR2(9),
         "MGR" NUMBER,
         "HIREDATE" DATE,
         "SAL" NUMBER,
         "COMM" NUMBER,
         "DEPTNO" NUMBER
    ) ON COMMIT PRESERVE ROWS
    insert into emp_session( EMPNO, ENAME, JOB, MGR, HIREDATE, SAL, COMM, DEPTNO)
    select * from emp
    select * from emp_session
    -- sometimes I get 14 rows, sometimes 0 rows
    Thanks.
    Tai

    Tai,
    To say that Apex doesn't support GTT's is not quite correct. In order to understand why it is not working for you and how they may be of use in an Apex application, you have to understand the concept of a session in Apex as opposed to a conventional database session.
    In a conventional database session, as when you are connected with sqlplus then you have what is known as a dedicated session, or a synchronous connection. Temporary objects such as GTTs and packaged variables can persist across calls to the database. A session in Apex however is asynchronous by nature and a connection to the database is done through some sort of a server such as the Oracle HTTP server or the Apex Listener, which in effect maintains a pool of connections to the database and calls by your application aren't guaranteed to get the same connection for each call.
    To get over this, the guys who developed Apex came up with various methods to maintain session state and global objects that are persistent within the context of an Apex session. One of these is Apex collections, which are a device for maintaining collection like (array like) data that is persistent within an Apex session. These are Apex session specific objects in that they are local to the session that creates and maintains them.
    With this knowledge, you can then see why the GTT is not working for you and also how a GTT may be of use in an Apex application, provided you don't expect the data to persist across a call, as in a PL/SQL procedure. You should note though, that unless you are dealing with very large datasets, then a regular Oracle collection is preferable.
    I hope this explains your issue.
    Regards
    Andre

  • Slow Problems with Oracle Forms 10g and Oracle Database 11g

    Hi, I wonder if there is a compatibility problem between Oracle Forms version 10.1.2.0.2 (32-bit) and Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 (64-bit). I ask because my application runs correctly on Oracle Database 10g, and when we migrated the database to Oracle Database 11g, slowness problems appeared.
    Thanks.

    We have the same issue happening with our custom forms and with some of the standard forms in EBSO. So far we have found that the form invoking a view causes ridiculous slowness in opening the form (40 mins). Using a table access has shortened the open time significantly. At this time Oracle DBAs at OOD have no clear idea why it is happening.
    we are on 11.1 database with 11.5 EBSO
    Edited by: user3223867 on Feb 4, 2011 7:55 AM

  • What are these DR$TEMP % tables and can they be deleted?

    We are generating PL/SQL using ODMr and then periodically running the model in batch mode. Out of 26,542 objects in the DMUSER1 schema, 25,574 are DR$TEMP% tables. Should the process be cleaning itself up or is this supposed to be a manual process? Is the cleanup documented somewhere?
    Thanks

    Hi Doug,
    The only DR$ tables/indexes built are the ones generated by the Build, Apply and Test Activities. I confirmed that they are deleted in ODMr 10.2.0.3. As I noted earlier, there was a bug in ODMr 10.2.0.2 which could lead to leakage when deleting Activities. You will have DR$ tables around for existing Activities, so do not delete them without validating that they are no longer part of an existing Activity.
    You can track down the DR$ objects associated to an Activity by viewing the text step in the activity and finding the table generated for the text data. This table will have a text index created on it. The name of that text index is used as a base name for several tables which Oracle text utilizes.
    Again, all of these are deleted when you delete an Activity with ODMr 10.2.0.3.
    Thanks, Mark

  • Problem w/ Oracle Spread Table Control Properties -

    Dear Contributors:
    I'm using Developer 6i. In an attempt to use the Oracle Spread Table Control, I've inserted the right ActiveX control in my form and imported the associated OLE library interfaces. However, when I went to inspect and change properties, two of the tab pages, titled
    1) Special and
    2) User Actions,
    showed an 'MMTX32' Caution Alert with the message:
    "An unsupported operation was attempted."
    Any suggestion friends?

    Yes, it has a Connect method; the problem here is that this control returns an SCODE value, and I don't know where the return variables are specified.
    Database Methods
    SCODE Connect(BSTR* database, BSTR* user, BSTR* password, long options);
         Connect to the given database as the given user. options determines
         the type of database connection. See Appendix A: Database Connection Types
         for more information. If the connection fails, Connect will return S_FALSE

  • Jdeveloper with VPD / FGAC possible ? i.e. oracle portal tables and views

    I am trying to create some view objects based on oracle portals views and tables. However I always get the following error.
    ORA-06510: PL/SQL: unhandled user-defined exception
    ORA-06512: at "PORTAL.WWCTX_SSO", line 1407
    ORA-06510: PL/SQL: unhandled user-defined exception
    ORA-06512: at "PORTAL.WWCTX_SSO", line 1216
    ORA-06502: PL/SQL: numeric or value error
    ORA-06512: at "PORTAL.WWCTX_SSO", line 1469
    ORA-06512: at "PORTAL.WWCTX_API", line 152
    This is because I have not set the context using PL/SQL, i.e.
    portal.wwctx_api_private.set_context(p_user_name => 'PORTAL', p_update_flat => true);
    Is there a way of using portal views in JDeveloper and setting the context first? I am thinking the portal database uses VPD / fine-grained access control.
    Regards
    Orlando

    Hi,
    using ADF BC you can override the prepare session method on the AM to set the context.
    public void prepareSession(Session _session) {
        super.prepareSession(_session);
        // some PL/SQL like
        String appContext = "Begin ctxhrpckg.set_userinfo('" + getApplicationUserName() + "'); END;";
        java.sql.CallableStatement st = null;
        try {
            st = getDBTransaction().createCallableStatement(appContext, 0);
            st.execute();
        } catch (java.sql.SQLException s) {
            throw new oracle.jbo.JboException(s);
        } finally {
            try {
                if (st != null) {
                    st.close();
                }
            } catch (java.sql.SQLException s2) {
            }
        }
    }
    Frank

  • Temp tables and transaction log

    Hi All,
    I am on SQL 2000.
    When I am inserting (or updating or deleting) data to/from temp tables (i.e. # tables), is a transaction log generated for those DML operations?
    The process is: we have a huge input data set to process, so we insert subsets of the input data into a temp table, treat that as our input set, and do the processing in parts. Can I avoid transaction log generation for these intermediate steps?
    Soon, we will be moving to 2008 R2. Are there any features in 2008, which can help me in avoiding this transaction logging?
    Thanks in advance

    Every DML operation is logged in the log file. Is it possible for you to insert the data in smaller chunks?
    http://www.dfarber.com/computer-consulting-blog/2011/1/14/processing-hundreds-of-millions-records-got-much-easier.aspx
    Best Regards,Uri Dimant SQL Server MVP,
    http://sqlblog.com/blogs/uri_dimant/

  • How to store fetched data in a temp table and how can I use it further

    I want to store the SUM value returned by the query below in a temp table, and then use that value in other code. How can I do this?
    SELECT SUM(SIGNEDDATA) 
    FROM FACPLAN
    WHERE TIMEID IN
    (SELECT TIMEID FROM Time 
    WHERE ID IN
    (SELECT CURRENT_MONTH FROM mbrVERSION WHERE CURRENT_MONTH!=''))

    If you want to assign it to a variable:
    DECLARE @SUMAMOUNT INT -- you may change the datatype as required
    Set @SUMAMOUNT = (SELECT SUM(SIGNEDDATA) 
    FROM FACPLAN
    WHERE TIMEID IN
    (SELECT TIMEID FROM Time 
    WHERE ID IN
    (SELECT CURRENT_MONTH FROM mbrVERSION WHERE CURRENT_MONTH!='')))
    And you can use @SUMAMOUNT for further processing
    If you want to store it in a table 
    SELECT SUM(SIGNEDDATA)  as SUMAMOUNT into #Temp
    FROM FACPLAN
    WHERE TIMEID IN
    (SELECT TIMEID FROM Time 
    WHERE ID IN
    (SELECT CURRENT_MONTH FROM mbrVERSION WHERE CURRENT_MONTH!=''))

  • Simple Dir Listing save to Oracle Temp Table

    Hello All, any help appreciated...
    I'm trying to perform the following:
    import java.io.*;
    import java.sql.*;

    public class DirList {
        // public static void getList(String directory)
        public static String getList(String directory) throws SQLException {
            File path = new File(directory);
            String[] list = path.list();
            String element;
            String element2 = "";
            for (int i = 0; i < list.length; i++) {
                element = list[i];
                element2 = element2 + ":" + list[i];
                // #sql {INSERT INTO DIR_LIST (FILENAME)
                //       VALUES (:element)};
            }
            return element2;
        }
    }
    Issues:
    I found the above code snippet from AskTom on Oracle's web site... it is exactly what I want to do, but I can't get it to work/compile...
    1) I cannot get the #sql statement to compile. As a workaround I created the variable String element2, which works, except for large directories... It would be nice if I could insert each element into the temporary table and then use SQL... Am I missing something with the #sql statement? Is it not within java.sql on my system? How do I verify this, and are there workarounds?
    2) the only way I could load this (with #sql commented out) was via loadjava... I tried loading within the database via sqlplus but get the following error...
    ORA-29536: badly formed source: Encountered "<EOF>" at line 1, column 24.
    example:
    CREATE or replace and compile java source
    named "MyTimestampx"
    as
    import java.lang.String;
    import java.sql.Timestamp;
    public class MyTimestampx {
        public static String getTimestamp() {
            return (new Timestamp(System.currentTimeMillis())).toString();
        }
    }
    produces the above error...
    any ideas why it fails to load via the above method?
    thanks for any help.

    I solved this issue, and perhaps this may help others...
    I coded the insert statement a better way (one possible approach is sketched below), and as for the sqlplus EOF error, the answer is:
    You have to use the 8i (or later) sqlplus. The 8.0 sqlplus doesn't understand the "create or replace and compile ..." syntax and prematurely submits the CREATE statement to the database.
    Voilà... my client sqlplus version is 8.0.6...
    Installing 8i...
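    One way to do the insert without the SQLJ #sql syntax is plain JDBC over the server-side default connection. This is only a minimal sketch, not necessarily what was actually used (the DIR_LIST table name comes from the snippet above; everything else is illustrative):

    import java.io.File;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;

    public class DirListJdbc {
        public static void load(String directory) throws SQLException {
            // Inside the database JVM there is already a session;
            // "jdbc:default:connection:" reuses it rather than opening a new one.
            Connection con = DriverManager.getConnection("jdbc:default:connection:");
            PreparedStatement ps = con.prepareStatement(
                "INSERT INTO DIR_LIST (FILENAME) VALUES (?)");
            String[] list = new File(directory).list();
            for (int i = 0; i < list.length; i++) {
                ps.setString(1, list[i]);
                ps.executeUpdate();
            }
            ps.close();
        }
    }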

  • Difference between temp table and CTE performance wise?

    Hi Techies,
    Can anyone explain CTEs and temp tables performance wise? Which is the better object to use when implementing DML operations?
    Thanks in advance.
    Regards
    Cham bee

    Welcome to the world of performance tuning in SQL Server! The standard answer to this kind of question is:
    It depends.
    A CTE is a logical construct, which specifies the logical computation order for the query. The optimizer is free to recast the computation order in such a way that the intermediate result from the CTE never exists during the calculation. Take for instance this query:
    WITH aggr AS (
        SELECT account_no, SUM(amt) AS amt
        FROM   transactions
        GROUP  BY account_no
    )
    SELECT account_no, amt
    FROM   aggr
    WHERE  account_no BETWEEN 199 AND 399
    Transactions is a big table, but there is an index on account_no. In this example, the optimizer will use that index and only compute the total amount for the accounts in the range. If you were to make a temp table of the CTE, SQL Server would have no choice but to scan the entire table.
    But there are also situations when it is better to use a temp table. This is often a good strategy when the CTE appears multiple times in the query. The optimizer is not able to pick a plan where the CTE is computed once, so it may compute the CTE multiple times.
    (To muddle the waters further, the optimizers in some competing products have this capability.)
    Even if the CTE is only referred to once, it may help to materialise the CTE. The temp table has statistics, and those statistics may help the optimizer to compute a better plan for the rest of the query.
    For the case you have at hand, it's a little difficult to tell, because it is not clear to me if the conditions are the same for points 1, 2 and 3 or if they are different. But the second one, removing duplicates, can be quite difficult with a temp table,
    but is fairly simple using a CTE with row_number().
    Erland Sommarskog, SQL Server MVP, [email protected]

  • Trees, temp tables and apex

    Hello,
    Has anyone had any luck building trees that go against temp tables? My tree works great with a regular table but runs flaky when I change the table to a temp table. Is this a limitation with APEX?
    Thanks in advance,
    Sam

    Temporary tables that belong to a database session are not reliably accessible across Application Express page requests. You should look at apex collections for temporary storage that will be persistent for the life of the apex session.
    Scott
