ORA-31020 when using XML with external DTD or entities

I'd like to parse XML documents against a modular DTD that references other DTDs. This worked fine with Oracle 9i, but after upgrading to 11g, parsing the XML instances fails and DBMS_XMLPARSER.parseClob raises ORA-31020.
The same error occurs even if I simply try to store XML with a reference to an external DTD as xmltype:
SQL> select xmltype('<?xml version="1.0" encoding="iso-8859-1"?><!DOCTYPE ewl-artikel SYSTEM "http://www.foo.com/example.dtd"><test>123</test>') from dual;
ERROR:
ORA-31020: The operation is not allowed, Reason: For security reasons, ftp
and http access over XDB repository is not allowed on server side
ORA-06512: at "SYS.XMLTYPE", line 310
ORA-06512: at line 1
How can I use external DTDs on remote servers in order to parse XML in an 11g database? Any ideas for a workaround? Thanks in advance!

This is my PL/SQL validation procedure:
procedure validatexml (v_id in number default 0) is
  PARSER        DBMS_XMLPARSER.parser;
  DTD_SOURCE    clob;
  DTD_DOCUMENT  dbms_xmldom.DOMDocumentType;
  XML_INSTANCE  xmltype;
BEGIN
  -- load the DTD over HTTP via HTTPURITYPE
  SELECT httpuritype('http://example.foo.de/app1/DTD1.dtd').getclob()
    INTO DTD_SOURCE
    FROM dual;
  -- load the XML instance
  SELECT co_xml INTO XML_INSTANCE FROM tb_xmltab WHERE co_id = v_id;
  -- parse and validate the XML instance against the separately loaded DTD
  PARSER := DBMS_XMLPARSER.newParser;
  DBMS_XMLPARSER.setValidationMode( PARSER, false );
  DBMS_XMLPARSER.parseDTDClob( PARSER, DTD_SOURCE, 'myfirstnode' );
  DTD_DOCUMENT := DBMS_XMLPARSER.getDoctype( PARSER );
  DBMS_XMLPARSER.setValidationMode( PARSER, true );
  DBMS_XMLPARSER.setDoctype( PARSER, DTD_DOCUMENT );
  DBMS_XMLPARSER.parseClob( PARSER, XML_INSTANCE.getclobval() );
  DBMS_XMLPARSER.freeParser( PARSER );
  htp.print('<P>XML instance successfully validated!</P>');
end validatexml;
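
One workaround that is sometimes suggested, sketched here only as an illustration rather than as the official fix: since the DTD is supplied separately via parseDTDClob/setDoctype in the procedure above, the external DOCTYPE reference inside each instance can be stripped before constructing the XMLTYPE, so the parser never tries to fetch the DTD over HTTP. The sample below reuses the document from the failing SELECT and assumes the DOCTYPE declaration has no internal subset.
declare
  -- sample instance from the post; in practice this would come from tb_xmltab
  v_raw clob := '<?xml version="1.0" encoding="iso-8859-1"?>'
             || '<!DOCTYPE ewl-artikel SYSTEM "http://www.foo.com/example.dtd">'
             || '<test>123</test>';
  v_xml xmltype;
begin
  -- drop the external DOCTYPE declaration (assumes no internal subset "[ ... ]")
  v_xml := xmltype(regexp_replace(v_raw, '<!DOCTYPE[^>]*>', ''));
  dbms_output.put_line(v_xml.getstringval());
end;
/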

Similar Messages

  • Need advice on best practice when using TopLink with external transactions

    Hello;
    Our project is trying to switch from TopLink-controlled transactions to external transactions, so that we can perform database operations and JMS operations within a single transaction.
    Some of our team tried out the TopLink support for external transactions and came up with the following initial recommendations.
    Since we are not familiar with external transactions, I would like the members and experts of this forum to comment on whether these recommendations are valid and in line with best practice. And for folks who have done this in their projects, what did you do?
    Any help will be most appreciated.
    Data Access Objects must be enhanced to support reading from a TOPLink unit of work when using an external transaction controller. Developers must consider what impact a global transaction will have on the methods in their data access objects (DAOs).
    The following findSomeObject method is representative of a “finder” in the current implementation of our DAOs. It is not especially designed to execute in the context of a global transaction, nor read from a unit of work.
    public SomeObject findSomeObject(ILoginUser aUser, Expression queryExpression) {
        ClientSession clientSession = getClientSession(aUser);
        SomeObject obj = null;
        try {
            ReadObjectQuery readObjectQuery = new ReadObjectQuery(SomeObject.class);
            readObjectQuery.setSelectionCriteria(queryExpression);
            obj = (SomeObject) clientSession.executeQuery(readObjectQuery);
        } catch (DatabaseException dbe) {
            // throw an appropriate exception
        } finally {
            clientSession.release();
        }
        if (obj == null) {
            // throw an appropriate exception
        }
        return obj;
    }
    However, after making the following changes, the findSomeObject method will read from a unit of work while executing in the context of a global transaction.
    public SomeObject findSomeObject(ILoginUser aUser, Expression queryExpression) {
        Session session = getClientSession(aUser);
        SomeObject obj = null;
        try {
            ReadObjectQuery readObjectQuery = new ReadObjectQuery(SomeObject.class);
            readObjectQuery.setSelectionCriteria(queryExpression);
            if (TransactionController.getInstance().useExternalTransactionControl()) {
                session = session.getActiveUnitOfWork();
                readObjectQuery.conformResultsInUnitOfWork();
            }
            obj = (SomeObject) session.executeQuery(readObjectQuery);
        } catch (DatabaseException dbe) {
            // throw an appropriate exception
        } finally {
            if (TransactionController.getInstance().notUseExternalTransactionControl()) {
                session.release();
            }
        }
        if (obj == null) {
            // throw an appropriate exception
        }
        return obj;
    }
    When getting the TOPLink client session and reading from the unit of work in the context of a global transaction, new objects need to be cached.
    public UnitOfWork getUnitOfWork(ILoginUser aUser) throws DataAccessException {
        ClientSession clientSession = getClientSession(aUser);
        UnitOfWork uow = null;
        if (TransactionController.getInstance().useExternalTransactionControl()) {
            uow = clientSession.getActiveUnitOfWork();
            uow.setShouldNewObjectsBeCached(true);
        } else {
            uow = clientSession.acquireUnitOfWork();
        }
        return uow;
    }

    As it generally is with this sort of question, there is no exact answer.
    The only required update when working with an external transaction is that getActiveUnitOfWork() is called instead of acquireUnitOfWork(); other than that, the semantics of the calls and when you use a UnitOfWork still depend on the requirements of your application. For instance, I noticed that originally the findSomeObject method did not perform a transactional read (no UnitOfWork). Have the requirements for this method changed? If they have not, then there is still no need to perform a transactional read, and the method would not need to change.
    As for the requirement that new objects be cached: this is only required if you are not conforming the transactional queries, and it adds a slight performance boost for find-by-primary-key queries. In order to use this, however, objects must be assigned primary keys by the application before they are registered in the UnitOfWork.
    --Gordon

  • Validating XML with an external DTD without a DOCTYPE specified in the XML

    Hi,
    I am very new to SAX, DOM and related things, but I am really pulling my hair out trying to find a solution. I have searched and found many people asking the same question, but hardly any of the answers were satisfactory.
    My problem is that I have an XML file without a DOCTYPE specified in it, but I have the DTD available on my system.
    I have tried setting MyEntityResolver (which implements EntityResolver) on the DocumentBuilder, but its resolveEntity method is only called when I add a DOCTYPE to the XML (which is not what I want), and not when there is no DOCTYPE in the XML. I have set factory.setValidating(true) and I also have an ErrorHandler in place.
    But why is the EntityResolver not invoked when it is needed most, i.e. when there is no DOCTYPE in the XML? It complains that the DOCTYPE must match root=null, which is obvious because there is no DOCTYPE in the XML.
    The code is as follows; please help me if anyone has any idea about this.
    The main class is:
    public static void main(String[] args) {
        Document document = null;
        ErrorHandler defaultHandler = new MyDefaultHandler();
        String xmlFile = "note.xml";
        try {
            System.out.println("Starting...");
            boolean validXML = true;
            DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
            factory.setNamespaceAware(true);
            factory.setValidating(true);
            ExternalResolver er = new ExternalResolver();
            // addURL just stores the string in a map to be retrieved by resolveEntity
            er.addURL("D:\\SAXnDOM\\SAXProject\\note.dtd");
            DocumentBuilder builder = factory.newDocumentBuilder();
            builder.setEntityResolver(er);
            builder.setErrorHandler(new MyDefaultHandler());
            builder.parse(new File(xmlFile));
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    The resolveEntity method of ExternalResolver is as follows:
    public InputSource resolveEntity(String publicId, String systemId)
            throws SAXException, IOException {
        System.out.println("********resolvedEntity:" + publicId + " and " + systemId + "******");
        if (urlMap != null && urlMap.get(systemId) != null) {
            try {
                return new InputSource(new FileReader(systemId));
            } catch (FileNotFoundException e) {
                System.out.println("[ERROR] Unable to load entity reference: " + systemId);
            }
        }
        return null;
    }

    public void addURL(String filePath) throws MalformedURLException {
        addURL(new File(filePath).toURL());
    }

    public void addURL(URL url) {
        if (urlMap == null) {
            urlMap = new HashMap();
        }
        urlMap.put(url, null);
    }

    It's working... it's working...
    The problem was in resolveEntity; that stupid if condition was removed, like this:
    public InputSource resolveEntity(String publicId, String systemId)
            throws SAXException, IOException {
        System.out.println("********resolvedEntity:" + publicId + " and " + systemId + "******");
        try {
            return new InputSource("D:\\SAXnDOM\\SAXProject\\note.dtd");
        } catch (Exception e) {
            System.out.println("[ERROR] Unable to load entity reference: " + systemId);
        }
        return null;
    }
    The other change (which I didn't like) is that in my XML I had to write a fake DOCTYPE, like this:
    <?xml version="1.0"?>
    <!DOCTYPE note SYSTEM "fakenote.dtd">
    <note>
    <to>Tove</to>
    <from>Jani</from>
    <heading>Reminder</heading>
    <body>Don't forget me this weekend!</body>
    <Prashant>prdfjfdj</Prashant>
    </note>
    In the above XML, fakenote.dtd doesn't exist anywhere; it's just there to bypass the doctype:null error.
    So my guess is that the EntityResolver overrides the DOCTYPE in the XML and applies its own DTD (note.dtd in this case).
    But a new problem comes up: what if I don't want to add any DOCTYPE to the XML, not even a fake one?
    I hope my stupid mistakes will be useful to someone.

  • ORA-03113 when using getSystemResource in java stored proc

    I have a Java stored procedure that validates XML files.
    The XML to validate is stored in a BLOB and the DTDs are loaded into Oracle as Java resources.
    I load the DTDs dynamically through getSystemResource(dtd).
    If I call the procedure, everything runs fine the first time (the DTD is loaded and the XML is validated). If I try to run it a second time (in the same session) I get ORA-03113: end-of-file on communication channel, and the server dumps.
    (I can run "forever" if I replace the getSystemResource call and get the DTD from a file instead.)
    Also, it works all the time when I'm running outside Oracle.
    Is this a known problem, or does someone have a suggestion on what I'm doing wrong?
    System tested:
    8.1.7.x (on AIX, HP, Linux and Win)
    9.2.0.x (on Win)
    Regards,
    Magnus

    Hi Avi,
    Well actually, why do you need to repeatedly reload the DTD, anyway? Isn't it always the same one? So once you've loaded it, you wouldn't need to bother reloading it, would you?
    It's not always the same DTD:
    We have a PL/SQL procedure (XML API) that is fed with different XML by an "external" process. We have to validate each XML against the corresponding DTD to either accept or refuse it.
    To make the XML API flexible and easy to maintain, we want to load all "files" (both Java classes and DTD/XSLT files) for each type of XML into Oracle instead of having some parts stored on the file system and some loaded into Oracle.
    (The "problem" is that we can't demand that the "external" process disconnect/reconnect before each new XML.)
    Otherwise, I would say go with your workaround.
    I think I have to do this.
    (I'm going to have the same problem with the dynamic loading of XSLT files when I transform the incoming XML to our internal XML format.)
    Regards,
    Magnus

  • The preview option in FCK doesn't work when used through an external app

    Hello everyone!
    I have an issue with the FCK editor when using it from an external application.
    We are executing the “WCM_EDIT_DATA_FILE” service to launch the FCK contributor application in a portal (WebCenter) so we can edit data files. The preview option (preview button) does not apply a region template, and the content is shown in the same order as defined in the region definition. Viewing the same data file through Site Studio, I can see that the preview option applies a region template.
    I've compared the links but I am not getting anywhere.
    The links for the preview are:
    Via the portal:
    http://MyServer/vcc/idcplg?IdcService=SS_QD_GET_RENDITION&coreContentOnly=1&dDocName=DF_NEWS_BERLIN_8&dID=765&wcm.contributor.mode=false&IgnoreContributorOnly=true&previewId=1276003853183&WCMPopupId=POPUP09875462424653607178
    Via Site Studio:
    http://Myserver/VCCPOC/index.htm?wcm.contributor.mode=false&IgnoreContributorOnly=true&previewId=1275983448108&WCMPopupId=POPUP07808631203986127290
    I can see that the portal link has the same parameters as the Site Studio link, and some more. I even tried forcing a region template by adding my template as a URL parameter, like this: "&templateDocName=RT_BOUND_DETAIL_VIEW", but without any luck. The same goes for the "view differences" button in the FCK editor.
    Has anyone had this problem?
    Thanks

    Hi Stijn,
    Here is the link to the SS-services:
    http://download.oracle.com/docs/cd/E14571_01/doc.1111/e10615/c11_ss_services.htm#insertedID0
    And here is the link to the CS-services:
    http://download.oracle.com/docs/cd/E10316_01/cs/cs_doc_10/documentation/developer/services_reference_10gr3en.pdf
    The name of the service that creates new data file is CHECKIN_NEW_FORM.
    Saving the file from the FCKEditor makes the browser window "greyed out", so for the time being we are hitting the "go back" button in the browser to return to the portal, as we open the editor in the same window as the portal. However, there must be another solution, as this is far from user friendly!
    Let me try to answer some of your questions:
    - Are you also using the Site Studio publishing functionality (we will not)
    No, we are not either.
    - where will your site design be done (we'll try to do as much as we can in WebCenter to keep it all in one place; we might even just output XML with our templates from SS, and transform it in WebCenter using different "templates")
    We are also planning to have it all in WebCenter. We are using SS for creating region templates and subtemplates used in the portal (WebCenter/ADF) for showing the same data files in different ways. That means that we also have elements, region definitions, placeholders and so on in SS.
    - what are you doing with inline images in the FCKEditor (we want contributors to be able to upload their images easily from the editor, without the need to browse to UCM)
    Well, for the time being the contributors must browse to certain images in the CS, because the images must have a certain height and width; that is our way of making sure the user chooses from the approved images. When it comes to images in the text, the contributors can add as many images as they want that are not from the CS. They can just copy/paste them into the editor from whatever location they want.
    - is it correct that the metadata tab cannot be used in combination with profiles defined in UCM (see other thread WebCenter Content
    I've heard that before, so my guess is that it's true. My question then is: how do I hide the metadata tab if I do not use it?
    - will you be using the ctrl-shift-F5 functionality to edit content, or just create a link to the WCM_EDIT_... service in a popup window?
    We create a link to the WCM_EDIT_... service that opens in the same window as the portal.... but as I mentioned earlier that is not the most user friendly solution so we probably must change that....
    Now back to my problems:
    I know why the previews through the portal and through Site Studio differ.
    The services that are used when previewing content through SS are:
    SS_SET_PREVIEW_ELEMENT_DATA
    SS_GET_PAGE
    LOAD_DOC_ENVIRONMENT
    The services that are used when previewing content through the portal are:
    SS_SET_PREVIEW_ELEMENT_DATA
    SS_QD_GET_RENDITION
    LOAD_DOC_ENVIRONMENT
    The SS_GET_PAGE service actually shows a page in SS, and as soon as there is a region template attached to the page you are able to see the content the way you want. That is not the case when calling the SS_QD_GET_RENDITION service. So my question is whether there is a workaround where I can force a region template to be used in conjunction with the SS_QD_GET_RENDITION service?
    Anyone? All ideas are really appreciated!

  • ORA-32034: unsupported use of WITH clause issue

    Hello all,
    I am facing an issue when I use the WITH clause together with the UNION ALL operator.
    I have created some dummy code to demonstrate the problem.
    My code is:
    with dept_1 as
    (select deptno d1 from detp9 where deptno = 20)
    select empno from emp9 e,dept_d1 where e.empno = dept_1.d1
    UNION ALL
    with dept_1 as
    (select deptno d2 from detp9 where deptno = 30)
    select empno from emp9 e,dept_d2 where e.empno = dept_2.d2.
    When I run this I get the message:
    ORA-32034: unsupported use of WITH clause.
    Please help me solve this issue.
    When I run each query separately, without UNION/UNION ALL, it runs successfully.
    Thanks in advance.

    Well, I don't see anything about these queries that makes sense.
    You are essentially joining emp and dept on EMPNO to DEPTNO ... that doesn't usually make any sense.
    How about you step back from the query, which is almost certainly incorrect, and explain your tables, their data, and what you need as output?
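    For what it's worth, ORA-32034 itself is raised because the WITH keyword is repeated inside a compound query; it may only appear once, in front of the whole statement, declaring every factored subquery that the UNION ALL branches use. A minimal sketch of the legal shape (table and column names are hypothetical, loosely modelled on the post, with the join done on DEPTNO as suggested above):
    -- hypothetical emp9/dept9 tables; the point is the single WITH clause
    -- shared by both UNION ALL branches
    with dept_20 as (select deptno from dept9 where deptno = 20),
         dept_30 as (select deptno from dept9 where deptno = 30)
    select e.empno
    from   emp9 e join dept_20 d on e.deptno = d.deptno
    union all
    select e.empno
    from   emp9 e join dept_30 d on e.deptno = d.deptno;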
    Cheers,

  • Performance problem in Mapping Designer using UDF with external imports

    Hello,
    We have a big performance problem in developing (not in executing) graphical mappings as soon as we use "user defined functions" (UDFs) with import entries referencing JAR files that are imported as "imported archives".
    For example, executing an invoice mapping with a slightly bigger test file in the Mapping Designer takes:
    - after opening, not in change mode: 6 seconds
    - after switching to change mode: 37 seconds (that's clear, now everything is compiled first)
    - after adding "com.seeburger.functions.permstore.CounterFactory;" to the "import" field of one UDF, with no other change: 227 seconds
    - after saving and submitting the change list (no longer in change mode): 6 seconds
    - after switching to change mode again: 227 seconds
    So in change mode, the execution time of a test (and also of watching queues) increases to more than three minutes when using UDFs with imports referencing external JAR files. It doesn't depend on the Seeburger functions (we also use XI for EDIFACT, so we use some Seeburger functions); I can reproduce it with any other JAR file that is used from a UDF.
    Using JDK classes such as "java.text.NumberFormat" in the "Import" field doesn't slow down testing.
    Can anybody reproduce this? We are using XI 3.0 SP19 on an AIX machine, so we also have to use the Java version from IBM.
    cu
    Manfred

    The problem was fixed by an upgrade of the JDK.

  • Error ORA-06502 When using function REPLACE in PL/SQL

    Hi,
    I have a PL/SQL procedure which raises ORA-06502 when using the REPLACE function once the string value gets quite long (I noticed this with a string 9K in length).
    variable var_a is of type CLOB
    and the assignment statement where it gives the error is
    var_a := REPLACE(var_a, '^', ''',''');
    Can anyone please help!
    Thanks

    Even then that shouldn't do so:
    SQL> select overload, position, argument_name, data_type, in_out
      2  from all_arguments
      3  where package_name = 'STANDARD'
      4  and object_name = 'LPAD'
      5  order by 1,2
      6  /
    OVERLOAD   POSITION ARGUMENT_NAME                  DATA_TYPE                      IN_OUT
    1                 0                                VARCHAR2                       OUT
    1                 1 STR1                           VARCHAR2                       IN
    1                 2 LEN                            BINARY_INTEGER                 IN
    1                 3 PAD                            VARCHAR2                       IN
    2                 0                                VARCHAR2                       OUT
    2                 1 STR1                           VARCHAR2                       IN
    2                 2 LEN                            BINARY_INTEGER                 IN
    3                 0                                CLOB                           OUT
    3                 1 STR1                           CLOB                           IN
    3                 2 LEN                            NUMBER                         IN
    3                 3 PAD                            CLOB                           IN
    4                 0                                CLOB                           OUT
    4                 1 STR1                           CLOB                           IN
    4                 2 LEN                            NUMBER                         IN
    I wonder what happened?
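    For the original REPLACE call, one thing worth trying, sketched here only as a guess: if overload resolution is falling back to a VARCHAR2 variant (and therefore hitting VARCHAR2 size limits), passing the search and replacement strings explicitly as CLOBs forces the CLOB overload:
    declare
      var_a clob := rpad('a', 9000, 'x^');  -- hypothetical ~9K test value
    begin
      -- to_clob() on both literals makes the CLOB overload of REPLACE unambiguous
      var_a := replace(var_a, to_clob('^'), to_clob(''','''));
      dbms_output.put_line('result length: ' || length(var_a));
    end;
    /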

  • Kernel Panic when using Garageband with Blue Snowball USB mic

    Hi all,
    I purchased a Mac Pro about 3 weeks ago. I am on the latest updates, including the recent firmware update. However, whenever I use GarageBand with an external USB mic (Blue Snowball), I get a kernel panic, the OS freezes, and I am forced to power on again. It has happened 20 times so far in a few hours.
    I opened GarageBand for the first time today after buying this external mic from the Apple store. I did not use GarageBand without this mic plugged into USB; I only used GarageBand with the mic plugged in, and within a couple of minutes to 5 minutes of using GarageBand the kernel panic occurs. I need your help, please. Here is the message I get after logging in again, which I have reported consistently to Apple. Any immediate help is appreciated. I use iMovie, iPhoto and other apps without any problem. Hard disk verification went OK.
    Sun Apr 6 22:13:35 2008
    panic(cpu 7 caller 0x001A8C8A): Kernel trap at 0x004209d5, type 14=page fault, registers:
    CR0: 0x8001003b, CR2: 0x0000000c, CR3: 0x013fe000, CR4: 0x00000660
    EAX: 0x00000000, EBX: 0x0ea7c3c0, ECX: 0x1059f4f0, EDX: 0x85ccfb8c
    CR2: 0x0000000c, EBP: 0x85ccfb08, ESI: 0x0ea7c3c8, EDI: 0x0ea7c3d0
    EFL: 0x00010202, EIP: 0x004209d5, CS: 0x00000008, DS: 0x11ac0010
    Error code: 0x00000002
    Backtrace, Format - Frame : Return Address (4 potential args on stack)
    0x85ccf8f8 : 0x12b0f7 (0x4581f4 0x85ccf92c 0x133230 0x0)
    0x85ccf948 : 0x1a8c8a (0x461720 0x4209d5 0xe 0x460ed0)
    0x85ccfa28 : 0x19ece5 (0x85ccfa40 0x4256f6 0x85ccfb08 0x4209d5)
    0x85ccfa38 : 0x4209d5 (0xe 0x85cc0048 0x10 0xe920010)
    0x85ccfb08 : 0x42028d (0xea7c3c0 0x85ccfb8c 0x0 0x0)
    0x85ccfb58 : 0x420c0e (0xea8b700 0x420978 0x85ccfb8c 0x0)
    0x85ccfba8 : 0x6055d9 (0xea7c3c0 0x0 0x85ccfc08 0x615b08)
    0x85ccfc08 : 0x609a11 (0xe92a000 0xeed0240 0xf56bff 0x0)
    0x85ccfc68 : 0x86334bde (0x13c56800 0xeed0240 0xf56bff 0x0)
    0x85ccfcd8 : 0x86334ddd (0xfbe4400 0x2 0x1 0x14)
    0x85ccfcf8 : 0x86335056 (0xfbe4400 0x14 0xf7f4a80 0x7c79d6)
    0x85ccfd38 : 0x86335156 (0xfbe4400 0x54 0x0 0xf7f4a80)
    0x85ccfd88 : 0x9f2e64 (0xfbe4400 0x8634d000 0x862c1010 0x110a)
    0x85ccfdf8 : 0x9eec9f (0xfd03600 0x10506588 0x110a 0x3ea740)
    0x85ccfe68 : 0x9ee169 (0x13d7d000 0x110a 0x13c4c940 0x13dbab7c)
    0x85ccfea8 : 0x43c47e (0x13d7d000 0x110a 0x55f 0x1)
    Backtrace continues...
    Kernel loadable modules in backtrace (with dependencies):
    com.apple.driver.AppleUSBAudio(2.5.4b4)@0x86326000->0x8634cfff
    dependency: com.apple.iokit.IOAudioFamily(1.6.4b7)@0x9e4000
    dependency: com.apple.iokit.IOUSBFamily(3.0.8)@0x5ff000
    com.apple.iokit.IOAudioFamily(1.6.4b7)@0x9e4000->0x9fafff
    dependency: com.apple.kext.OSvKernDSPLib(1.1)@0x9e1000
    com.apple.iokit.IOUSBFamily(3.0.8)@0x5ff000->0x626fff
    BSD process name corresponding to current thread: GarageBand
    Mac OS version:
    9C7010
    Kernel version:
    Darwin Kernel Version 9.2.2: Tue Mar 4 21:17:34 PST 2008; root:xnu-1228.4.31~1/RELEASE_I386
    System model name: MacPro3,1 (Mac-F42C88C8)

    I've been getting a number of kernel panics when using USB Mics with Skype, though I have not experienced any crashes when using Garageband.
    The crash report cited the AppleUSBAudio driver as the culprit. Here's the set up:
    MacBookPro Core2 2.66 15"
    Mac OS X 10.5.2
    Blue Snowflake USB Mic, Blue Snowball USB Mic
    Skype (latest rev - 2.7.0.257)
    AppleUSBAudio.kext version 2.5.6b3
    Seems to be a lot of crashiness with USB Audio in 10.5.2 - let's hope we see a patch soon

  • Problem while using XML with Oracle

    I have a problem using XML with oracle applications
    The process I am following
    1. Making a XML string.
    2. Parsing it to get a document
    3. Free the parser using xmlparser.freeparser
    4. Traversing through nodes .
    5. Freeing the document.
    The whole process is executed in batch mode.
    The problem occurs after executing the procedure for 5000 records, and I get the error
    ORA-04031: unable to allocate 4176 bytes of shared memory ("shared pool","unknown object","sga
    heap","library cache")
    Can you please help me overcome this problem?
    It's urgent.
    I have
    Oracle version 8.1.7.0.0
    XML version 1.2
    OS Windows NT
    To resolve the problem I have increase shared memory size and java initialization parameters ,which seems OK
    Looking forward for your answer.

    Hello, Reena
    Your process flow seems to be correct in terms of getting/freeing memory.
    The following error:
    The problem occurs after executing the procedure for 5000 records and I get the error
    ORA-04031: unable to allocate 4176 bytes of shared memory ("shared pool","unknown object","sga
    heap","library cache")
    may be caused by memory leaks in the XDK or by memory fragmentation (due to the get/free memory cycle).
    To find out whether this is a memory leak issue, you could try to monitor V$SGASTAT from one session while running your batch process in another session.
    To prevent fragmentation issues (or lower their impact), try to PIN objects, and adjust java_pool_size and shared_pool_reserved_size.
    Anyway, consult your Oracle DBA.
    As for Oracle version 8.1.7.0.0: I think you should apply a database patch first of all. The latest one (8.1.7.4.x) can be obtained from Metalink.
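    A minimal sketch of the monitoring suggested above, to be run from a second session while the batch executes (the pinning call is optional and assumes the DBMS_SHARED_POOL package has been installed):
    -- watch whether shared pool free memory keeps shrinking (leak) or merely fragments
    select pool, name, bytes
    from   v$sgastat
    where  pool = 'shared pool'
    and    name = 'free memory';
    -- optionally pin heavily used packages to reduce fragmentation, for example:
    -- exec dbms_shared_pool.keep('SYS.STANDARD')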

  • Dreamweaver very slow when opening files with external links

    Is there any way to fix Dreamweaver CS5 (Windows 7, 64bit) when opening files with external references?
    For example, I have a very simple CSS file, and toward the top the file has a line that says:
    @import url("https://fonts.googleapis.com/css?family=Ubuntu:500,700");
    This causes the file to take around 20 seconds to open in DW. If I comment out the line or remove it, the file opens almost instantly. This behaviour is also present in files that contain an SSI.
    DW often crashes before being able to open the file and I need to restart the app.
    Can I stop DW from attempting to load the references? That is obviously what is causing the issue, IMO, and I would love to make DW workable again!

    "Enable Related Files" isn't the fix for this issue. It's slightly different - I wanted to still load related files, but only the ones local to the file I was editing.
    The fix was to use DW's Resolve To IP Address feature. It required adding a registry value, and it fixed the issue straight away.
    This Adobe support doc helped: http://kb2.adobe.com/cps/887/cpsid_88742.html

  • Best practice when using Tangosol with an app server

    Hi,
    I'm wondering what is the best practice when using Tangosol with an app server (Websphere 6.1 in this case). I've been able to set it up using the resource adapter, tried using distributed transactions and it appears to work as expected - I've also been able to see cache data from another app server instance.
    However, it appears that cache data vanishes after a while. I've not yet been able to put my finger on when, but garbage collection is a possibility I've come to suspect.
    Data in the cache survives the removal of the EJB, but somewhere later down the line it appears to vanish. I'm not aware of any expiry settings for the cache that would explain this (to the best of my understanding the default is "no expiry"), so GC came to mind. Would this be the explanation?
    If that would be the explanation, what would be a better way to keep the cache from being subject to GC - to have a "startup class" in the app server that holds on to the cache object, or would there be other ways? Currently the EJB calls getCacheAdapter, so I guess Bad Things may happen when the EJB is removed...
    Best regards,
    /Per

    Hi Gene,
    I found the configuration file embedded in coherence.jar. Am I supposed to replace it and re-package coherence.jar?
    If I put it elsewhere (in the "classpath") - is there a way I can be sure that it has been found by Coherence (like a message in the standard output stream)? My experience with Websphere is that "classpath" is a rather ...vague concept, we use the J2CA adapter which most probably has a different class loader than the EAR that contains the EJB, and I would rather avoid to do a lot of trial/error corrections to a file just to find that it's not actually been used.
    Anyway, at this stage my tests are still focused on distributed transactions/2PC/commit/rollback/recovery, and we're nowhere near 10,000 objects. As a matter of fact, we haven't had more than 1024 objects in these app servers. In the typical scenario where I've seen objects "fade away", there has been only one or two objects in the test data. And they both disappear...
    Still confused,
    /Per

  • Display "blank-out" when using Maya with a GF 7800?

    I have a problem where my LCD display occasionally blinks off for about 5-10 seconds when using Maya with a scene displaying about 100,000 polys. Has anyone else encountered this? It only seems to happen when I am quickly rotating the camera in a grey shaded perspective panel.
    My LCD is a Dell 20" LCD (set at 1600x1200) connected to a Geforce 7800 in a Quad G5 with 4.5GB of ram.
    I have never seen this issue before in Maya (I have been using Maya for years, on other machines), so I suspect it is hardware or driver related?

    OK, I have completed several tests. It definitely seems that my GeForce 7800 has problems when running some applications. I wonder if anyone else has encountered this problem?
    The easiest way to cause the display to momentarily shut off is to use Maya in the full-screen perspective camera window in shaded mode. Proceed to rapidly move a 20,000 poly model (or greater), both zooming and rotating (the polys need to fill the entire screen; shake the mouse). My display (Dell 20" 2001fp) will blink off for a few seconds. Of course, this is not "normal" workflow, but I have this issue happening when working normally; it can sometimes take time to occur.
    I have tried the GeForce 6600 (256MB version) and this does not occur with that video card. I have tried several mice, both wired and wireless, plus changed the Energy Saver options for processor performance (from comments I read on Macintouch.com). The problem is consistent with my GeForce 7800. This problem will also occur in Photoshop CS, by moving images and resizing, but I found it much harder to induce.

  • URGENT : ORA 302000 when using TEXT_IO.fopen

    Hi,
    I get the error ORA 302000 when using the TEXT_IO package. The code I use is:
    new_file := text_io.fopen('c:\text.txt', 'r');
    I don't have the description of this ORA 302000; does anyone have it?

    Hi,
    I know it's been 2 years, but this is still relevant for me.
    I tried the suggested piece of code to trace the error, but it did not bring anything more:
    EXCEPTION
      WHEN OTHERS THEN
        srw.message(2, 'EXCEPTION ' || SQLCODE || ' in common package. Can not open the file ');
        IF SQLCODE = -302000 THEN
          LOOP
            EXIT WHEN TOOL_ERR.NERRORS = 0;
            SRW.MESSAGE(667, TO_CHAR(TOOL_ERR.CODE) || ': ' || TOOL_ERR.MESSAGE);
            TOOL_ERR.POP;
          END LOOP;
        END IF;
        srw.message(3, 'EXCEPTION ' || SQLCODE || ' in com package. Can not open the file ' || I_Desname || ' : ' || SQLERRM);
    Only messages 2 and 3 are displayed in the trace file.
    Any other suggestions?
    Manu
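    For reference, a minimal self-contained sketch of the same call (client-side Forms/Reports TEXT_IO; the path is only an example and must exist and be readable by the process running the report, since an unreadable or missing file is a typical cause of the generic 302000 error from built-ins):
    DECLARE
      new_file  TEXT_IO.FILE_TYPE;
      line_buf  VARCHAR2(32767);
    BEGIN
      -- 'r' = read mode; the file must already exist
      new_file := TEXT_IO.FOPEN('c:\text.txt', 'r');
      TEXT_IO.GET_LINE(new_file, line_buf);
      TEXT_IO.FCLOSE(new_file);
    EXCEPTION
      WHEN OTHERS THEN
        srw.message(1, 'Cannot open/read c:\text.txt: ' || SQLERRM);
    END;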

  • When using iPhone with Ford automobiles' Sync, the connection drops each time the ignition is shut off. In order to restore, the bluetooth must be turned off on the iphone, then restarted. Any suggestions?

    When using iPhone with Ford Automobiles' SYNC, the connection drops each time the ignition is shut off. To restore, you must turn off the bluetooth on iPhone, then restart each time. Any suggestions?

    There is an update available for Ford systems to correct Bluetooth problems. You need to update your system.
