CACHE Oracle Tables

Hello Gurus,
We are building a new application and have identified a few tables that will be accessed very frequently. To decrease I/O we are planning to CACHE these tables, but I am not sure we made the right decision. My question is: what are the things you need to consider before caching Oracle tables?
Any help greatly appreciated. Thanks.
select * from v$version

BANNER
Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production    
PL/SQL Release 11.2.0.3.0 - Production                                          
CORE     11.2.0.3.0     Production                                                        
TNS for IBM/AIX RISC System/6000: Version 11.2.0.3.0 - Production               
NLSRTL Version 11.2.0.3.0 - Production    

OK, so you want to use multiple buffer pools and to put these tables into the keep pool?
Why do you believe that this will improve performance? Oracle's default algorithm for aging out blocks that are seldom used is pretty good for the vast majority of applications. Why do you believe that you can identify what blocks will most benefit from caching better than Oracle? Why do you believe that you wouldn't be better off giving whatever KEEP pool cache size you would allocate to the DEFAULT pool and letting Oracle's cache algorithm cache whatever it determines is appropriate? It is possible that there is something that you know about your application that allows you to make this sort of determination. But in the vast majority of cases I've seen, people who have tried to do so end up hurting performance at least a little because they're forcing Oracle at the margin to age out blocks that it would benefit from caching and to cache blocks that it would benefit from aging out.
Do you understand the maintenance impact of using multiple buffer caches? If you are using a vaguely recent version of Oracle and using any of the automatic memory management features, Oracle does not automatically manage the non-default buffer caches. That increases the probability that using non-default buffer caches is going to create performance problems since humans are much less efficient at recognizing and reacting to changing memory utilization and substantially increases the amount of monitoring and work that the DBAs need to do on the system (which, in turn, increases the risk that they make a mistake).
Justin

Similar Messages

  • Unable to create cache groups from CASE-SENSITIVE Oracle table names

    Hello All
    I have some case-sensitive tables in an Oracle database, and their columns are case-sensitive too. I've tried to cache these tables into TimesTen under a read-only cache group. I think TimesTen cannot find the
    case-sensitive tables, because as soon as I changed the names of the tables the creation succeeded. What can I do to overcome this issue? I don't want to lose the case-sensitive feature. Is it because
    I'm using an old version of TimesTen (11.2.1.4.0)?

    Hi Chris
    Thanks for your answer. I'm using SQL Developer (both graphically and by command) to manage the TimesTen db. When I'm about to select the root table for the cache group I can see the table, but when I
    select it, the caching procedure cannot be completed and it says the table does not have a primary key; you can see below that this is not true and the table has a two-column primary key. When I
    try to create the cache group via a command in the worksheet, the error is "TT5140: could not find HLR.SUBSCRIBER. may not have privileges"
    in Oracle:
    CREATE TABLE "HLR"."Subscriber"
    "SSI" NUMBER(10,0) NOT NULL ENABLE,
    "CCNC" VARCHAR2(50 BYTE) NOT NULL ENABLE,
    "Code" VARCHAR2(128 BYTE) DEFAULT NULL NOT NULL ENABLE,
    "Account" NVARCHAR2(32),
    "Mnemonic" NVARCHAR2(15),
    "Region" NVARCHAR2(32),
    "UserAddress" NVARCHAR2(32),
    "Name" NVARCHAR2(32) NOT NULL ENABLE,
    "VPNCode" NUMBER(10,0),
    "VPNCCNC" VARCHAR2(50 BYTE),
    "SubOrgId" NUMBER(10,0),
    "SubscriberTypeId" NUMBER(2,0) DEFAULT 5 NOT NULL ENABLE,
    "StatusId" NUMBER(2,0) DEFAULT 1 NOT NULL ENABLE,
    "SubscriberClass" NUMBER(2,0),
    "DefinedIpAddressId" NUMBER(10,0),
    CONSTRAINT "Subscriber_PK" PRIMARY KEY ("SSI", "CCNC") USING INDEX PCTFREE 10 INITRANS 2 MAXTRANS 255 COMPUTE STATISTICS STORAGE(INITIAL 65536 NEXT 1048576 MINEXTENTS 1 MAXEXTENTS 2147483645 PCTINCREASE 0 FREELISTS 1 FREELIST GROUPS 1 BUFFER_POOL DEFAULT FLASH_CACHE DEFAULT CELL_FLASH_CACHE DEFAULT) TABLESPACE "USERS" ENABLE,
    CONSTRAINT "FK_DefinedIpAdd_Subscriber" FOREIGN KEY ("DefinedIpAddressId") REFERENCES "HLR"."DefinedIPAddress" ("Id") ENABLE,
    CONSTRAINT "Fk_Status_Subscriber" FOREIGN KEY ("StatusId") REFERENCES "HLR"."Status" ("Id") ENABLE,
    CONSTRAINT "Fk_SubOrg_Subscriber" FOREIGN KEY ("SubOrgId") REFERENCES "HLR"."SubOrganization" ("Id") ENABLE,
    CONSTRAINT "Fk_SubscriberType_Subscriber" FOREIGN KEY ("SubscriberTypeId") REFERENCES "HLR"."SubscriberType" ("Id") ENABLE,
    CONSTRAINT "Fk_VPN_Subscriber" FOREIGN KEY ("VPNCode", "VPNCCNC") REFERENCES "HLR"."VPN" ("SSI", "CCNC") ENABLE
    in TimesTen:
    CREATE READONLY CACHE GROUP "PRO1"
    AUTOREFRESH MODE INCREMENTAL INTERVAL 5 MINUTES
    STATE PAUSED
    FROM "HLR"."Subscriber" (
    "SSI" NUMBER(10,0) NOT NULL ,
    "CCNC" VARCHAR2(50 BYTE) NOT NULL ,
    "Code" VARCHAR2(128 BYTE) NOT NULL ,
    "Account" NVARCHAR2(32),
    "Mnemonic" NVARCHAR2(15),
    "Region" NVARCHAR2(32),
    "UserAddress" NVARCHAR2(32),
    "Name" NVARCHAR2(32) NOT NULL ,
    "VPNCode" NUMBER(10,0),
    "VPNCCNC" VARCHAR2(50 BYTE),
    "SubOrgId" NUMBER(10,0),
    "SubscriberTypeId" NUMBER(2,0) DEFAULT 5 NOT NULL ,
    "StatusId" NUMBER(2,0) DEFAULT 1 NOT NULL ,
    "SubscriberClass" NUMBER(2,0),
    "DefinedIpAddressId" NUMBER(10,0),
    PRIMARY KEY("CCNC","SSI")
    )

  • Why can't changes on cached tables be propagated to the Oracle tables?

    Dear all
    I have an Oracle database whose tables I want to cache in two TimesTen databases running on two different PCs in a local network. There are all kinds of cache groups. When some changes are applied in the
    TimesTen databases, propagation to the Oracle tables fails. This problem occurs on global and SWT cache groups. My guess is that something is wrong with the TimesTen grid.
    Error is: [TimesTen][TimesTen 11.2.1.4.0 ODBC Driver][TimesTen]TT5025: Commit failure in Oracle. Transaction must be rolled back in TimesTen. -- file "sqlAPI.c", lineno 3277, procedure
    "sb_sqlCompileMulti1()"
    OS: windows xp sp3
    Timesten version: 11.2.1.4.0
    My Oracle: 11g
    Regards
    shahrokh
    Edited by: Shahrokh on Jun 13, 2012 8:54 PM

    It is difficult to give you a recommendation without knowing the details.
    If you have some level of node affinity in the TimesTen clients then Grid is the perfect solution. How do you know it's not fast enough without testing it?
    If you only have occasional updates to the data in the TimesTen clients then using Read-Only cache groups with PASSTHROUGH is the perfect solution. Updates happen on the TimesTen connection but are passed through to Oracle where at the next AUTOREFRESH they are propagated from Oracle to both TimesTen clients. But if you have a large volume of updates this is not a practical solution. If you do have a large volume of updates then you could send the updates directly to Oracle but this of course requires the application to have a connection to TimesTen and a connection to Oracle.
    Without more detail I suggest you try these 2 simple approaches to see which one works for you.
    What you are asking for is to replicate a change from a TimesTen database to an Oracle database then to another TimesTen database. This is very difficult to achieve with any replication technology. Others in this forum may have other ideas?
    Tim

  • Loading an XML file of 1GB into an Oracle table

    Could anyone help me out: how can I load an XML file of large size into an Oracle table
    using PL/SQL Developer?
    Thanks

    Hi,
    I am also trying to process a large XML file but am getting the following error. The code works fine with a smaller file. Could you please help me with how I can process and upload an XML file of more than 10MB into an Oracle table? The table structure does not match the XML tags at all. I am trying the following code.
    ERROR at line 1:
    ORA-04031: unable to allocate 2520 bytes of shared memory ("large
    pool","unknown object","session heap","koh-kghu session heap")
    ORA-06512: at "VMI_USER.PKG_VMI_FILE_UPLOAD", line 978
    ORA-04031: unable to allocate 1000 bytes of shared memory ("large
    pool","unknown object","qmxlu subheap","qmemNextBuf:alloc")
    ORA-06512: at line 1
    I am using Oracle 10g and have the following method to process the XML file and upload its contents into the database table conditionally:
    PROCEDURE pr_process_xml_file(pin_xml_file_name IN VARCHAR2)
    IS
      l_bfile    BFILE;
      l_clob     CLOB;
      l_parser   dbms_xmlparser.parser;
      l_doc      dbms_xmldom.domdocument;
      l_nl       dbms_xmldom.domnodelist;
      l_n        dbms_xmldom.domnode;
      l_nl_party dbms_xmldom.domnodelist;
      l_n_party  dbms_xmldom.domnode;
      l_nl_nad   dbms_xmldom.domnodelist;
      l_n_nad    dbms_xmldom.domnode;
      l_nl_line  dbms_xmldom.domnodelist;
      l_n_line   dbms_xmldom.domnode;
    BEGIN
      -- the xml file to process
      l_bfile := BFileName('DATA_DIR', pin_xml_file_name);
      -- create temporary lob
      dbms_lob.createtemporary(l_clob, cache => FALSE);
      -- open the xml file in read-only mode
      dbms_lob.OPEN(l_bfile, dbms_lob.lob_readonly);
      -- load the file contents into the clob
      dbms_lob.loadfromfile(l_clob, l_bfile, dbms_lob.getlength(l_bfile));
      -- close the xml file
      dbms_lob.CLOSE(l_bfile);
      -- create a parser
      l_parser := dbms_xmlparser.newparser;
      -- parse the document and create a new DOM document
      dbms_xmlparser.parseclob(l_parser, l_clob);
      l_doc := dbms_xmlparser.getdocument(l_parser);
      -- free resources associated with the clob and parser now they are no longer required
      dbms_lob.freetemporary(l_clob);
      dbms_xmlparser.freeparser(l_parser);
      -- get a list of all the DOCUMENT nodes in the document using XPath syntax
      l_nl := xslprocessor.selectnodes(xmldom.makenode(l_doc), '/TRADING/DOCUMENT');
      -- loop through the list and create a DOCUMENT record
      FOR cur_doc IN 0 .. xmldom.getlength(l_nl) - 1
      LOOP
        -- here I process the contents of the file tag by tag and load them into the staging table first
        NULL;
      END LOOP;
      -- here I load the data from the staging table into the target oracle table
      xmldom.freedocument(l_doc);
    END;
    This works fine if the size of the xml file is smaller like 1-2 MB but not for large file.
    Please help.
    Thanks - Pawan
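The ORA-04031 above is typical of DOM-based processing: DBMS_XMLDOM materialises the whole document in session memory, so memory use grows with file size. The general fix is to stream the document instead of building a full DOM. As an illustration of the streaming idea only (Python's `iterparse` standing in for a streaming parser; inside the database you would look at XMLTable over an XMLType or a SAX-style loader; the element names are taken from the code above):

```python
import io
import xml.etree.ElementTree as ET

# A tiny stand-in for the multi-gigabyte /TRADING/DOCUMENT file
xml_data = io.BytesIO(b"""<TRADING>
  <DOCUMENT><id>1</id></DOCUMENT>
  <DOCUMENT><id>2</id></DOCUMENT>
</TRADING>""")

count = 0
# iterparse yields each element as soon as it is complete, so memory use
# stays bounded -- unlike building a full DOM of the whole document.
for event, elem in ET.iterparse(xml_data, events=("end",)):
    if elem.tag == "DOCUMENT":
        count += 1    # process one DOCUMENT record here (e.g. insert a staging row)
        elem.clear()  # release the subtree we just handled
print(count)  # 2
```

The key point is the `elem.clear()` call: each processed subtree is released before the next one is parsed, which is exactly what the DOM approach cannot do.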

  • Caching of tables

    Hi All
    I want some info on caching of tables in the shared pool.
    1) How many tables can I or should I cache in the shared pool? You would say it depends on the shared pool size, but can you let me know what percentage of the shared pool?
    2) Also, what tables should I cache in the shared pool?
    --- tables accessed frequently
    --- small tables (what size is suggested?)
    What other considerations do I need to make while caching?
    I have around 20 small tables (not more than 500 rows in each) and around 5 PL/SQL programs, all of which will run at different times.
    But not all of them will access all 20 tables; only 5-7 tables at a time.
    Please help
    Thanks
    Ashwin N.

    Oracle uses an LRU algorithm to determine what data/SQL/PL/SQL should remain in memory for later use. This algorithm works very well in 99% of cases. Only if this algorithm is disturbed, e.g. by big loads, is it sometimes necessary to tell Oracle what should be 'kept' and what not. Usually this is determined as part of a performance tuning phase.
    Therefore, my suggestion would be that in general you don't explicitly keep or recycle objects. Only when your requirements are such that performance becomes an issue if you do nothing should you take a look at which objects should be kept and which ones not. Typically, load/stage tables should not.
    Oh, before this confuses you (like many before you): ALTER TABLE ... CACHE does not KEEP the blocks in memory; rather, it puts the blocks of the table at the most recently used end of the LRU list after a full table scan instead of at the least recently used end. ALTER TABLE ... STORAGE (BUFFER_POOL KEEP) does keep the db blocks in memory, and ALTER TABLE ... STORAGE (BUFFER_POOL RECYCLE) makes sure blocks are aged out immediately.
    Hope this helps,
    L.
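The distinction L. draws (CACHE changes where a full scan's blocks enter the LRU list; KEEP pins a separate pool) can be illustrated with a toy LRU list. This is a simplified sketch, not Oracle's actual touch-count algorithm:

```python
from collections import OrderedDict

class BufferCache:
    """Toy LRU buffer cache: most recently used blocks sit at the right end."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()

    def read(self, block, cached_hint=False):
        if block in self.blocks:
            self.blocks.move_to_end(block)        # hit: promote to the MRU end
            return "hit"
        if len(self.blocks) >= self.capacity:
            self.blocks.popitem(last=False)       # miss: age out the LRU block
        self.blocks[block] = True
        if not cached_hint:
            # blocks from a large full scan enter at the LRU end, so they
            # age out first; with the CACHE attribute they enter at the MRU end
            self.blocks.move_to_end(block, last=False)
        return "miss"

cache = BufferCache(2)
cache.read("keep_me", cached_hint=True)    # enters at the MRU end
cache.read("full_scan_block")              # enters at the LRU end
cache.read("another_block", cached_hint=True)
print("full_scan_block" in cache.blocks)   # False: it aged out first
```

Even in this toy, note that CACHE only changes the insertion point; nothing is pinned, which is exactly why it is so often confused with BUFFER_POOL KEEP.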

  • Aggregate query on global cache group table

    Hi,
    I set up two global cache nodes. As we know, a global cache group is dynamic.
    As I understand it, the cache group can be dynamically loaded by primary key or foreign key.
    There are three records in the Oracle cache table; one record is loaded in node A and the other two records in node B.
    Oracle:
    1 Java
    2 C
    3 Python
    Node A:
    1 Java
    Node B:
    2 C
    3 Python
    If I select count(*) on Node A or Node B, the result is 1 and 2 respectively.
    The questions are:
    How can I get the real count of 3?
    Is it reasonable to do this query on a global cache group table?
    One idea is to create another read-only node for aggregation queries, but it seems weird.
    Thanks very much.
    Regards,
    Nesta
    Edited by: user12240056 on Dec 2, 2009 12:54 AM

    Do you mean something like
    UPDATE sometable SET somecol = somevalue;
    where you are updating all rows (or where you may use a WHERE clause that matches many rows and is not an equality)?
    This is not something you can do in one step with a GLOBAL DYNAMIC cache group. If the number of rows that would be affected is small and you know the keys of every row that must be updated then you could simply execute multiple individual updates. If the number of rows is large or you do not know all the keys in advance then maybe you would adopt the approach of ensuring that all relevant rows are in the local cache grid node already via LOAD CACHE GROUP ... WHERE ... Alternatively, if you do not need Grid functionality you could consider using a single cache with a non-dynamic (explicitly loaded) cache group and just pre-load all the data.
    I would not try to use JTA to update rows in multiple grid nodes in one transaction; it will be slow and you would have to know which rows are located in which nodes...
    Chris

  • Cache agent table update transaction size

    Is there a way to impose a transaction size limit, a "commit every n rows", on readonly cache group updates?
    Specifically for single table cache groups.

    An unexpectedly large number of updates (> 1,000,000 rows) were made to an Oracle table with 89 columns referenced by a readonly cache group. The Cache Agent started an incremental update for this cache group and during the update the datastore ran out of space, so the update was rolled back. All of the update and rollback records went into the lognnn files, using up most of the disk bandwidth. After the rollback completed, the Cache Agent started a refresh for that interval and the same failure/rollback sequence started again. This cache update failure/rollback cycle continued until the datastore-full message was noticed in the log and I was able to pause the automatic refresh of this one table. Then I manually refreshed the cache group with "commit every n rows".

  • Cache 1000 tables at a time?

    hi,
    I want to cache 1000 tables at a time.
    How can I cache them? It could be either AWT or SWT.
    Thanks in advance
    :)

    TimesTen is a database. It supports creation of 1000s of tables. For caching, tables are encapsulated within cache groups. You can create thousands of cache groups. You are not limited to one cache group and one table; that would not be a very useful product :-)
    I would recommend that you read the very good Introduction and Cache User's Guides to be found here: http://docs.oracle.com/cd/E21901_01/welcome.html
    They explain the basic concepts related to using TimesTen as a cache as well as a lot of other more in-depth information. Once you have done that you may then have other questions that the forum can help you with.
    Regards,
    Chris
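At that scale, the cache group DDL is normally generated from the data dictionary rather than written by hand. A sketch of the idea only; the DDL template and the owner/table/column names are assumptions (a real generator would read the column list and primary key from ALL_TAB_COLUMNS / ALL_CONSTRAINTS):

```python
def cache_group_ddl(owner, table, pk_col):
    """Generate a minimal read-only cache group definition for one table."""
    return (
        'CREATE READONLY CACHE GROUP "CG_%s"\n'
        'AUTOREFRESH MODE INCREMENTAL INTERVAL 5 MINUTES\n'
        'FROM "%s"."%s" (\n'
        '  "%s" NUMBER NOT NULL,\n'
        '  PRIMARY KEY ("%s")\n'
        ')' % (table, owner, table, pk_col, pk_col)
    )

# One statement per table; loop over your 1000 table names the same way.
ddl = cache_group_ddl("HLR", "SUBSCRIBER", "SSI")
print(ddl.splitlines()[0])  # CREATE READONLY CACHE GROUP "CG_SUBSCRIBER"
```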

  • Jython error while updating an Oracle table based on a file count

    Hi,
    I have a Jython procedure for counting records in a flat file.
    Here is the code (taken from odiexperts), modified, and I am getting errors; could somebody take a look and let me know what the SQL exception in this code is?
    Command on target: Jython
    Command on source: Oracle (and specified the logical schema)
    Without connecting to the database using the JDBC connection I can see the output successfully, but I want to update the Oracle table with the count. Any help is greatly appreciated.
    ---------------------------------Error-----------------------------
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 45, in ?
    java.sql.SQLException: ORA-00936: missing expression
    ---------------------------------------Code--------------------------------------------------
    import java.sql.Connection
    import java.sql.Statement
    import java.sql.DriverManager
    import java.sql.ResultSet
    import java.sql.ResultSetMetaData
    import os
    import string
    import java.sql as sql
    import java.lang as lang
    import re
    filesrc = open('c:\mm\xyz.csv','r')
    first = filesrc.readline()
    lines = 0
    while first:
        # get the no of lines in the file
        lines += 1
        first = filesrc.readline()
    # print lines
    ## THE ABOVE PART OF THE PROGRAM IS TO COUNT THE NUMBER OF LINES
    ## AND STORE IT INTO THE VARIABLE `LINES`
    def intWithCommas(x):
        if type(x) not in [type(0), type(0L)]:
            raise TypeError("Parameter must be an integer.")
        if x < 0:
            return '-' + intWithCommas(-x)
        result = ''
        while x >= 1000:
            x, r = divmod(x, 1000)
            result = ",%03d%s" % (r, result)
        return "%d%s" % (x, result)
    ## THE ABOVE FUNCTION IS TO DISPLAY THE NUMBERS
    sourceConnection = odiRef.getJDBCConnection("SRC")
    sqlstring = sourceConnection.createStatement()
    sqlstmt="update tab1 set tot_coll_amt = to_number( "#lines ") where load_audit_key=418507"
    sqlstring.executeQuery(sqlstmt)
    sourceConnection.close()
    s0 = ' \n\nThe Number of Lines in the File are ->> '
    s1 = str(intWithCommas(lines))
    s2 = ' \n\nand the First Line of the File is ->> '
    filesrc.seek(0)
    s3 = str(filesrc.readline())
    final = s0 + s1 + s2 + s3
    filesrc.close()
    raise final

    I changed it as you advised, Ankit.
    I am getting the following error now:
    org.apache.bsf.BSFException: exception from Jython:
    Traceback (innermost last):
    File "<string>", line 37, in ?
    java.sql.SQLException: ORA-00911: invalid character
    here is the modified code
    sourceConnection = odiRef.getJDBCConnection("SRC")
    sqlstring = sourceConnection.createStatement()
    sqlstmt="update tab1 set tot_coll_amt = to_number('#lines') where load_audit_key=418507;"
    result=sqlstring.executeUpdate(sqlstmt)
    sourceConnection.close()
    Any ideas
    Edited by: Sunny on Dec 3, 2010 1:04 PM
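Both errors come from the SQL string itself: in the first version `#lines` starts a Jython comment, truncating the statement (hence ORA-00936, missing expression), and in the second the trailing `;` is sent to Oracle as part of the statement text (hence ORA-00911, invalid character). A corrected sketch of the update, using sqlite3 here as a stand-in for the ODI JDBC connection, with the table and column names taken from the thread:

```python
import sqlite3

lines = 42  # stands in for the line count computed earlier in the script

# sqlite3 stands in for odiRef.getJDBCConnection("SRC")
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tab1 (load_audit_key INTEGER, tot_coll_amt INTEGER)")
conn.execute("INSERT INTO tab1 VALUES (418507, 0)")

# Build the SQL by converting the count to a string: no '#' (which starts
# a Jython comment) and no trailing ';' inside the statement text.
sqlstmt = ("update tab1 set tot_coll_amt = " + str(lines) +
           " where load_audit_key = 418507")
conn.execute(sqlstmt)  # executeUpdate(), not executeQuery(), in JDBC terms
conn.commit()

amt = conn.execute("select tot_coll_amt from tab1").fetchone()[0]
print(amt)  # 42
```

In the actual Jython task the same statement string would be passed to `sqlstring.executeUpdate(sqlstmt)`.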

  • Best Practice to fetch SQL Server data and Insert into Oracle Tables

    Hello,
    I want to read SQL Server data every half an hour and write it into Oracle tables (in two different databases). What is the best practice for doing this?
    We do not have any database links from Oracle to SQL Server or vice versa.
    Any help is highly appreciated.
    Thanks

    Well, that's easy:
    use a TimerTask to do the following every half an hour:
    - open a connection to sql server
    - open two connections to the oracle databases
    - for each row you read from the sql server, do the inserts into the oracle databases
    - commit
    - close all connections

  • Insert/delete data from a SAP Z table to an Oracle table and vice versa

    Hi,
    Can you help me write this FM from the SAP side?
    So, I have two tables: ZTABLE in SAP and Oracle table ORAC.
    Let's put four columns in each of them, for example:
    TEL1
    TEL2
    ADRESS
    NAME
    where the TEL field is the primary key from ZTABLE to ORAC...
    (in the FM there should be ABAP code for writing data into ZTABLE after we press some pushbutton made in the SAP Screen Painter...)
    For example, when we write a new record in ZTABLE:
    00
    112233
    Street 4
    Name1
    this data should be inserted into Oracle table ORAC.
    When we write a new record in the Oracle table, for example:
    01
    445566
    New Street
    Name2
    this data should be inserted into ZTABLE.
    Field TEL1 can only have the two values 01 or 02; any other combination is not valid...
    I must have all data from Oracle table ORAC in ZTABLE and vice versa.
    It should be the same scenario for DELETE...
    And this communication should be online between SAP and the table in the Oracle database...
    Can you help me from the SAP side, and give an idea how to configure the Oracle side?
    Thanks a lot,
    Nihad

    I don't know if we can directly connect to an Oracle database (wait for answers from others on this),
    but in XI we have the JDBC adapter to insert and retrieve data.
    So for the outbound from SAP the flow can be something like this (with XI in the landscape):
    1) You have a screen to maintain a new entry / delete an entry.
    2) On save, this record gets saved in or deleted from the Z table in SAP.
    3) In the same screen you can call a proxy class method (generated using the SPROXY transaction) to send the record to XI.
    4) XI formats it and inserts it into the Oracle table.
    Mathews

  • XML file to Oracle Table

    Hello friends,
    Can you please help me with the following requirements?
    I have a xml structure like this
    <?xml version="1.0"?>
    <data>
    <var name="document">
    <string>Sales Order</string>
    </var>
    <var name="results">
    <recordset rowcount="2">
    <field name="sales_num">
    <string>12345</string>
    <string>A0192</string>
    </field>
    <field name="ord_qty">
    <string>10</string>
    <string>50</string>
    </field>
    </recordset>
    </var>
    </data>
    I have to read this xml file and copy the data to the Oracle table
    Sales Table
    CREATE TABLE SALES (
    SALES_NUM VARCHAR2(20 BYTE),
    ORD_QTY NUMBER(4)
    )
    Expected Result
    x. Sales Num Ord Qty
    1. 12345 10
    2. A0192 50
    I tried to follow the approach provided in this link http://www.oracle-base.com/articles/9i/ParseXMLDocuments9i.php . But it doesn't work with the XML structure I have.
    Thanks,
    Mahesh

    Please let me know the solution as well; I need this too.
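The reason the usual row-per-element examples fail here is that each `<field>` element holds a whole column of values, so the data has to be transposed into rows. The transposition is easy to see in a short Python sketch (standing in for the eventual PL/SQL or XMLTable logic; element names are taken from the XML above):

```python
import xml.etree.ElementTree as ET

xml_text = """<?xml version="1.0"?>
<data>
  <var name="document"><string>Sales Order</string></var>
  <var name="results">
    <recordset rowcount="2">
      <field name="sales_num"><string>12345</string><string>A0192</string></field>
      <field name="ord_qty"><string>10</string><string>50</string></field>
    </recordset>
  </var>
</data>"""

root = ET.fromstring(xml_text)
recordset = root.find(".//recordset")

# Each <field> holds one COLUMN of values; zip the columns into rows.
columns = {f.get("name"): [s.text for s in f.findall("string")]
           for f in recordset.findall("field")}
rows = list(zip(columns["sales_num"], columns["ord_qty"]))
print(rows)  # [('12345', '10'), ('A0192', '50')]
```

Each resulting tuple then maps directly onto one INSERT into the SALES table.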

  • Upload data from Excel to an Oracle table

    Hi,
    If I'm a user of an application and I want to upload data from Excel to an Oracle table on a button click, is it possible using SQL*Loader?
    If yes then please clarify how.
    Is it possible from the client end?
    thanks
    kam

    Yes, it is possible using SQL*Loader, external tables and the Oracle export utility, though I didn't try the export utility to load external files.
    SQL*Loader syntax:
    Create a control file. It looks like this:
    {code}
    Load data
    Infile 'source.dat'
    Into Table tablename
    Fields terminated by ',' optionally enclosed by '"'
    (col1, col2, col3)
    {code}
    then use the sqlldr command from your OS:
    {code}
    sqlldr userid/password@sid control=filename.ctl data=source.dat log=logname.log bad=badname.bad discard=discardname.discard
    {code}

  • Upload data from an Excel file to an Oracle table

    Dear All,
    I have to upload data from an Excel file to an Oracle table without using third-party tools and without converting it into a CSV file.
    Could you please tell me how I can do this using PL/SQL or SQL*Loader.
    Thanks in advance.

    As Billy mentioned, use the ODBC interface: the same HS (Heterogeneous Services) is a layer over traditional ODBC to access a non-Oracle database. Here is a link you can try, and come back here if you have any problem:
    http://www.oracle-base.com/articles/9i/HSGenericConnectivity9i.php
    Khurram

  • Fields missing in Oracle tables created by InfoSpokes

    Hi SAP Gurus!
    My problem is as follows:
    We have created InfoSpokes with an Oracle table as the destination.
    Even though the transport to other systems is correct and the InfoSpoke has all the fields, when we execute it, some fields that should appear are missing in the destination (Oracle table).
    Does anyone know what we could do to avoid this problem?
    Thanks in advance!
    Kind regards.

    Hi Raul,
    I was just going through your thread and found that you were able to transfer the data to the Oracle table.
    I am also working on an InfoSpoke where I have to transfer the data from my InfoObject to a table in Oracle.
    For that I have created an InfoSpoke and executed it to get the data into another table in SAP.
    The problem is that I don't know how to move the data from this table in SAP to the Oracle table. Did you use any interface for this, or any kind of ABAP programming?
    Please suggest how this can be implemented.
    I have opened up a thread for this: InfoSpokes
    Reward points will be awarded.
    Best Regards
    Ankit Bhandari
