Large XML Document Error

We are using Oracle's XSQL servlet engine 9.0.2.0.0A to query an Oracle Database using xsql:include-owa. This xsql document uses a custom serializer to write the resulting XML to a file on the webserver. We have run into a problem with writing XML documents larger than approximately 7MB. The file is written but just contains a Malformed XML Document error. We have tried tracking down the problem and it seems to be occurring before the custom serializer is given control of the document. Attached below are the xsql document and the Java source of our serializer, in case they are of any help.
-Eric Dalquist
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="simple_transform_xml.xsl" serializer="XMLWriter"?>
<BASE_TAG connection="XXXXXXXX" xmlns:xsql="urn:oracle-xsql">
    <xsql:set-page-param name="save-file" value="XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX/applInfo.xml"/>
    <xsql:include-owa >
        STUDEV.STUADMSDSET.P_GET_RECRUIT_INFO('A','C');
    </xsql:include-owa>
</BASE_TAG>

/* -----------------------------------------
 * Author:        Eric Dalquist
 * E-Mail:        [email protected]
 * Date Created:  10-03-2001
 * Last Modified: 10-03-2001
 * Modified By:   Eric Dalquist
 * Description:   This class writes out the contents of the final document produced by the xsql servlet.
 * -----------------------------------------
 */
package com.mtu.XSQLSerializers;

import java.io.FileOutputStream;

import oracle.xml.xsql.*;
import oracle.xml.parser.v2.XMLDocument;
import org.w3c.dom.Document;

/**
 * <P>
 *    XMLWriter is an implementation of the XSQLDocumentSerializer interface. It is passed a reference to a DOM Document
 *    and the servlet environment. The XMLWriter takes the document and writes it out to the servlet's output stream. There
 *    is also the option that the save-file xsql page parameter can be set to a fully qualified path and file name to write the
 *    document to.
 * </P>
 * <P>
 *    Here is an example of setting the save-file page parameter. This line would be the first tag inside the root xsql servlet tag.
 * </P>
 * <PRE>
 *    &#060;xsql:set-page-param name="save-file" value="/u01/oracle/ias10210/xdk/java/xsql/Repository/Course_List/XMLDocument.xml"/&#062;
 * </PRE>
 * @author  Eric B. Dalquist
 * @version 1.0 10/03/2001
 */
public class XMLWriter implements XSQLDocumentSerializer
{
    /** the header we send if we are returning XML */
    private static final String MIME_XML = "text/xml";

    /** the name of the page parameter we look for to get the path and name to save the XML document to */
    private static final String ATTR_FILE = "save-file";

    /**
     * serialize is called by the servlet engine and is passed the DOM Document result
     * from the XSLT transformation, if there was one, and a reference to the XSQL servlet environment.
     * @param doc The document representation of the XML output of the xsql servlet.
     * @param env A reference to the servlet's environment.
     * @throws Throwable Any unhandled error is handled by the servlet by default.
     */
    public void serialize(Document doc, XSQLPageRequest env) throws Throwable
    {
        System.out.print("start ... ");

        FileOutputStream fos = null;
        String reqType = env.getRequestType();
        String saveFile = env.getParameter(ATTR_FILE);

        // write the document back to the client unless we were invoked from the command line
        if (!reqType.equals("Command Line"))
        {
            env.setContentType(MIME_XML);
            ((XMLDocument)doc).print(env.getOutputStream());
            env.getOutputStream().close();
        }

        // optionally write the document to the file named by the save-file page parameter
        if (saveFile != null && !saveFile.equals("") && saveFile.indexOf("..") == -1 && saveFile.indexOf("~") == -1)
        {
            fos = new FileOutputStream(saveFile);
            ((XMLDocument)doc).print(fos);
            fos.close();
        }

        System.gc();
        System.out.println("end");
    }
}

Technically DBMS_XMLSAVE and its alter ego DBMS_XMLQUERY are not considered part of XML DB. DBMS_XMLSAVE and DBMS_XMLQUERY are legacy Java implementations.
In 9.2.x these routines were replaced by DBMS_XMLSTORE and DBMS_XMLGEN which are written in 'C' and should be much faster in most cases and address a number of limitations inherent in the older implementations. DBMS_XMLSTORE and DBMS_XMLGEN are part of XML DB and the minimum supported release for XML DB is 9.2.0.3.0.
DBMS_XMLSAVE issues are addressed in the TECH/XML forum
http://forums.oracle.com/forums/category.jspa?categoryID=51
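For reference, a minimal sketch of the generation side of the newer API (the EMP query and the EMP_XML_TAB staging table are made up for illustration, not taken from this thread): DBMS_XMLGEN produces the same canonical ROWSET/ROW document that DBMS_XMLQUERY does, but in native code.

DECLARE
  l_ctx DBMS_XMLGEN.ctxHandle;
  l_xml CLOB;
BEGIN
  -- build the canonical <ROWSET>/<ROW> document for an arbitrary query
  l_ctx := DBMS_XMLGEN.newContext('SELECT empno, ename FROM emp');
  l_xml := DBMS_XMLGEN.getXML(l_ctx);
  DBMS_XMLGEN.closeContext(l_ctx);
  -- stash the generated document (EMP_XML_TAB is a hypothetical CLOB table)
  INSERT INTO emp_xml_tab VALUES (l_xml);
  COMMIT;
END;
/

The insert-side counterpart, DBMS_XMLSTORE, is sketched further down, next to the XML2TABLE procedure it would replace.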

Similar Messages

  • Large XML document performance

    We are using XDB 9.2.0.4. I am seeing a severe performance degradation when attempting to extract larger XML documents from XDB (somewhere over 3M). Smaller documents appear to be working fine.
    I have been reading in the forum that the problem I am running into is most likely related to the storage model being used, i.e. there are several repeating elements within the schema.
    I have added xdb:storeVarrayAsTable="true" statement to the schema and re-registered. I can see, based on user_nested_tables, that XDB appears to be storing the repeating elements as nested tables vs varrays.
    The change to the storage model does not seem to have significantly changed the query's performance.
    The schemas I am using can be found at http://www.sasked.gov.sk.ca/xsd/sl/1.x/SLMessage.xsd & http://www.sasked.gov.sk.ca/xsd/sl/1.x/SDSElements.xsd
    The schema documentation can be found at http://www.sasked.gov.sk.ca/sds/xml/SchemaDocumentation/SLMessage.html
    The element /SL_Message/SL_Event/SL_ObjectData/SL_EventObject is the primary repeating element
    I am using a table with an XMLType column
    CREATE TABLE XML_SL_MESSAGE
    (XML_SL_MESSAGE_ID NUMBER(11) NOT NULL
    ,DTE_TIMESTAMP TIMESTAMP DEFAULT SYSTIMESTAMP NOT NULL
    ,ORIGINAL_XML_SL_MESSAGE_ID NUMBER(11)
    ,VALID_SL_MESSAGE_XML sys.XMLType
    ,INVALID_XML CLOB
    ,ERROR_MESSAGE VARCHAR2(4000)
    ) xmltype column valid_sl_message_xml XMLSCHEMA "http://www.sasked.gov.sk.ca/xsd/sl/1.x/SLMessage.xsd" element "SL_Message"
    The SQL I am using is attempting to bring the XMLType back as a clob, the query seems to be intensive in both CPU and I/O. (looks like it is the getClobVal function)
    select xsm.xml_sl_message_id
    ,xsm.dte_timestamp
    ,nvl(xsm.valid_sl_message_xml.getClobVal(),xsm.invalid_xml) as xml_clob
    ,xsm.error_message
    ,xsm.original_xml_sl_message_id
    from xml_sl_message xsm
    where xsm.dte_timestamp > sysdate - 1
    I guess what I am wondering is what are my options ? Changing the storage model ? Applying Indexes ?
    On an unrelated topic, Are there many differences in XDB 9.2.0.5 and 9.2.0.4 ? (I don’t believe 10g will be an option here … yet)
    Thanx in advance
    Trent

    I have applied the 9.2.0.5.0 patches and created the relational table with the following attributes:
    CREATE TABLE XML_SL_MESSAGE
    (XML_SL_MESSAGE_ID NUMBER(11) NOT NULL
    ,DTE_TIMESTAMP TIMESTAMP DEFAULT SYSTIMESTAMP NOT NULL
    ,PSE_SYS_USR_ID NUMBER(11) NOT NULL
    ,ORIGINAL_XML_SL_MESSAGE_ID NUMBER(11)
    ,VALID_SL_MESSAGE_XML sys.XMLType
    ,INVALID_XML CLOB
    ,ERROR_MESSAGE VARCHAR2(4000)
    xmltype column valid_sl_message_xml
    STORE AS OBJECT RELATIONAL
    XMLSCHEMA "http://www.sasked.gov.sk.ca/xsd/sl/1.x/SLMessage.xsd" ELEMENT "SL_Message"
    -- 1:1 ----------------------
    varray valid_sl_message_xml."XMLDATA"."SL_Event"."SL_ObjectData"."SL_EventObject"
    store as table SL_EVENTOBJECT2_TB(
    (constraint SL_EVENTOBJECT2_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 2:2 ----------------------
    varray "SchoolTerm"
    store as table SCHOOLTERM2_TB(
    (constraint SCHOOLTERM2_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 3:2 ----------------------
    varray "SchoolClass"
    store as table SCHOOLCLASS3_TB(
    (constraint SCHOOLCLASS3_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 6:2 ----------------------
    varray "StudentCourseHistory"
    store as table STUDENTCOURSEHISTORY6_TB(
    (constraint STUDENTCOURSEHISTORY6_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 7:2 ----------------------
    varray "StudentSupplementalMark"
    store as table STUDENTSUPPLEMENTALMARK7_TB(
    (constraint STUDENTSUPPLEMENTALMARK7_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 8:2 ----------------------
    varray "StudentClassMark"
    store as table STUDENTCLASSMARK8_TB(
    (constraint STUDENTCLASSMARK8_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 9:2 ----------------------
    varray "StudentExamRegistration"
    store as table STUDENTEXAMREGISTRATION9_TB(
    (constraint STUDENTEXAMREGISTRATION9_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 10:2 ----------------------
    varray "StudentClassEnrollment"
    store as table STUDENTCLASSENROLLMENT10_TB(
    (constraint STUDENTCLASSENROLLMENT10_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 11:2 ----------------------
    varray "StudentPersonal"
    store as table STUDENTPERSONAL11_TB(
    (constraint STUDENTPERSONAL11_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 18:2 ----------------------
    varray "StudentProgramEnrollment"
    store as table STUDENTPROGRAMENROLLMENT18_TB(
    (constraint STUDENTPROGRAMENROLLMENT18_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 19:2 ----------------------
    varray "StudentSchoolEnrollment"
    store as table STUDENTSCHOOLENROLLMENT19_TB(
    (constraint STUDENTSCHOOLENROLLMENT19_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 26:1 ----------------------
    varray valid_sl_message_xml."XMLDATA"."SL_Response"."SL_ObjectData"."SL_EventObject"
    store as table SL_EVENTOBJECT26_TB(
    (constraint SL_EVENTOBJECT26_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 27:2 ----------------------
    varray "SchoolTerm"
    store as table SCHOOLTERM27_TB(
    (constraint SCHOOLTERM27_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 28:2 ----------------------
    varray "SchoolClass"
    store as table SCHOOLCLASS28_TB(
    (constraint SCHOOLCLASS28_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 31:2 ----------------------
    varray "StudentProgramEnrollment"
    store as table STUDENTPROGRAMENROLLMENT31_TB(
    (constraint STUDENTPROGRAMENROLLMENT31_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 32:2 ----------------------
    varray "StudentExamRegistration"
    store as table STUDENTEXAMREGISTRATION32_TB(
    (constraint STUDENTEXAMREGISTRATION32_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 33:2 ----------------------
    varray "StudentClassEnrollment"
    store as table STUDENTCLASSENROLLMENT33_TB(
    (constraint STUDENTCLASSENROLLMENT33_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 34:2 ----------------------
    varray "StudentPersonal"
    store as table STUDENTPERSONAL34_TB(
    (constraint STUDENTPERSONAL34_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 41:2 ----------------------
    varray "StudentSchoolEnrollment"
    store as table STUDENTSCHOOLENROLLMENT41_TB(
    (constraint STUDENTSCHOOLENROLLMENT41_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 48:2 ----------------------
    varray "StudentClassMark"
    store as table STUDENTCLASSMARK48_TB(
    (constraint STUDENTCLASSMARK48_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 49:2 ----------------------
    varray "StudentCourseHistory"
    store as table STUDENTCOURSEHISTORY49_TB(
    (constraint STUDENTCOURSEHISTORY49_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 50:2 ----------------------
    varray "StudentSupplementalMark"
    store as table STUDENTSUPPLEMENTALMARK50_TB(
    (constraint STUDENTSUPPLEMENTALMARK50_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 51:1 ----------------------
    varray valid_sl_message_xml."XMLDATA"."SL_Response"."SL_Ack"."SL_Error"
    store as table SL_ERROR51_TB(
    (constraint SL_ERROR51_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    -- 52:1 ----------------------
    varray valid_sl_message_xml."XMLDATA"."SL_Request"."SL_Query"."SL_QueryObject"
    store as table SL_QUERYOBJECT52_TB(
    (constraint SL_QUERYOBJECT52_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    tablespace data
    ALTER TABLE XML_SL_MESSAGE
    ADD (CONSTRAINT XML_SL_MESSAGE_PK PRIMARY KEY
    (XML_SL_MESSAGE_ID))
    ALTER TABLE XML_SL_MESSAGE
    ADD (CONSTRAINT XMLSLMSG_ORIGINAL_XMLSLMSG_UK UNIQUE
    (ORIGINAL_XML_SL_MESSAGE_ID))
    ALTER TABLE XML_SL_MESSAGE ADD (CONSTRAINT
    XMLSLMSG_SYSUSR_FK FOREIGN KEY
    (PSE_SYS_USR_ID) REFERENCES PSE_SYS_USR
    (PSE_SYS_USR_ID))
    ALTER TABLE XML_SL_MESSAGE ADD (CONSTRAINT
    XMLSLMSG_ORIGINAL_XMLSLMSG_FK FOREIGN KEY
    (ORIGINAL_XML_SL_MESSAGE_ID) REFERENCES XML_SL_MESSAGE
    (XML_SL_MESSAGE_ID))
    -- Create a unique index for the XML Message id
    CREATE UNIQUE INDEX XMLSLMSG_MSGID_UNIQUE ON XML_SL_MESSAGE
    ((substr(extractValue(valid_sl_message_xml,'//SL_MsgId'),1,255)))
    tablespace indx
    COMPUTE STATISTICS
    Here is the nested table structure of the XMLType table created during the schema registration:
    select level
    ,parent_table_column
    from user_nested_tables
    connect by prior table_name = parent_table_name
    start with parent_table_name = 'SL_Message4724_TAB'
    LEVEL PARENT_TABLE_COLUMN
    1 "XMLDATA"."SL_Event"."SL_ObjectData"."SL_EventObject"
    2 SchoolTerm
    2 StudentSchoolEnrollment
    3 "StudentInfo"."Name"
    3 "StudentInfo"."Demographics"."CountryOfCitizenship"
    3 "StudentInfo"."StudentAddress"
    3 "StudentInfo"."PhoneNumber"
    3 "StudentInfo"."Demographics"."Language"
    3 "StudentInfo"."Email"
    2 SchoolClass
    3 "ClassInfo"."DeptAssignedCourseId"
    3 "ClassInfo"."EducatorCertificateNumber"
    2 StudentProgramEnrollment
    2 StudentClassEnrollment
    2 StudentClassMark
    2 StudentCourseHistory
    2 StudentSupplementalMark
    2 StudentExamRegistration
    2 StudentPersonal
    3 "StudentInfo"."Name"
    3 "StudentInfo"."Email"
    3 "StudentInfo"."Demographics"."CountryOfCitizenship"
    3 "StudentInfo"."StudentAddress"
    3 "StudentInfo"."PhoneNumber"
    3 "StudentInfo"."Demographics"."Language"
    1 "XMLDATA"."SL_Request"."SL_Query"."SL_QueryObject"
    1 "XMLDATA"."SL_Response"."SL_ObjectData"."SL_EventObject"
    2 SchoolTerm
    2 SchoolClass
    3 "ClassInfo"."DeptAssignedCourseId"
    3 "ClassInfo"."EducatorCertificateNumber"
    2 StudentProgramEnrollment
    2 StudentClassEnrollment
    2 StudentClassMark
    2 StudentCourseHistory
    2 StudentSupplementalMark
    2 StudentExamRegistration
    2 StudentPersonal
    3 "StudentInfo"."Name"
    3 "StudentInfo"."Email"
    3 "StudentInfo"."Demographics"."Language"
    3 "StudentInfo"."PhoneNumber"
    3 "StudentInfo"."StudentAddress"
    3 "StudentInfo"."Demographics"."CountryOfCitizenship"
    2 StudentSchoolEnrollment
    3 "StudentInfo"."Name"
    3 "StudentInfo"."Demographics"."Language"
    3 "StudentInfo"."PhoneNumber"
    3 "StudentInfo"."StudentAddress"
    3 "StudentInfo"."Demographics"."CountryOfCitizenship"
    3 "StudentInfo"."Email"
    1 "XMLDATA"."SL_Response"."SL_Ack"."SL_Error"
    52 rows selected.
    When I attempt to insert two previously valid XML documents I get a core dump … Here are the insert statements:
    insert into xml_sl_message (
    xml_sl_message_id, dte_timestamp, pse_sys_usr_id, original_xml_sl_message_id, valid_sl_message_xml, invalid_xml,error_message
    select xml_sl_message_id, dte_timestamp, pse_sys_usr_id, original_xml_sl_message_id, xmltype(valid_sl_message_clob), invalid_xml,error_message
    from xml_sl_message_temp where xml_sl_message_id in (5154,5155)
    Here are the details on the exception:
    Oracle9i Enterprise Edition Release 9.2.0.5.0 - 64bit Production
    With the Partitioning, OLAP and Oracle Data Mining options
    JServer Release 9.2.0.5.0 - Production
    ORACLE_HOME = /opt/app/oracle/product/9.2.0
    System name: SunOS
    Node name: *****
    Release: 5.8
    Version: Generic_117000-01
    Machine: sun4u
    Instance name: EDDSDS
    Redo thread mounted by this instance: 1
    Oracle process number: 76
    Unix process pid: 11460, image: oracle@***** (TNS V1-V3)
    *** 2004-06-22 09:24:34.603
    *** SESSION ID:(29.159) 2004-06-22 09:24:34.602
    Exception signal: 11 (SIGSEGV), code: 1 (Address not mapped to object), addr: 0x20, PC: [0x101e41830, 0000000101E41830]
    *** 2004-06-22 09:24:34.606
    ksedmp: internal or fatal error
    ORA-07445: exception encountered: core dump [0000000101E41830] [SIGSEGV] [Address not mapped to object] [0x000000020] [] []
    Current SQL statement for this session:
    insert into xml_sl_message (
    xml_sl_message_id, dte_timestamp, pse_sys_usr_id, original_xml_sl_message_id, valid_sl_message_xml, invalid_xml,error_message
    select xml_sl_message_id, dte_timestamp, pse_sys_usr_id, original_xml_sl_message_id, xmltype(valid_sl_message_clob), invalid_xml,error_message
    from xml_sl_message_temp where xml_sl_message_id in (5154,5155)
    I have tried a couple different storage sections (with different levels of nesting) and still the same problem ..
    Is there something wrong with my storage section ?
    What is the "return as LOCATOR" clause for?
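    For what it is worth, RETURN AS LOCATOR is tacked onto the end of a nested-table storage clause; it makes queries against the parent return a locator for the collection instead of materializing the whole nested table with each row, which can matter when a single document expands into thousands of SL_EventObject rows. A sketch of the syntax, based on the first varray clause above (not verified against 9.2.0.5):

    varray valid_sl_message_xml."XMLDATA"."SL_Event"."SL_ObjectData"."SL_EventObject"
    store as table SL_EVENTOBJECT2_TB(
    (constraint SL_EVENTOBJECT2_PK primary key (NESTED_TABLE_ID,ARRAY_INDEX))
    ) return as locator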

  • Data insertion from large XML document (clob) into relational table very slow

    Hi Everybody!
    I'm working with Oracle 9.2.0.5 on Microsoft Windows Server 2003 Enterprise Edition.
    The server (a test server) is a Pentium 4 2.8 GHz, 1GB of RAM.
    I use a procedure called PARITOP_TRAITERXMLRESULTMASSE to insert the data contained in the pXMLDOC clob parameter into the table pTABLENAME. (You can see the format of the XML document below.) The first step in this procedure is to verify that the XML document is not empty. If it is not, the procedure needs to add a node to the document, in every <ROW> tag. This added node is named "RST_ID"; it is the foreign key of each record. I retrieve the value for each <RST_ID> node from another table into which the data has been previously added (by the calling procedure). When each of the <ROW> elements has been treated, the PARITOP_INSERTXML procedure is called. This procedure uses DBMS_XMLSAVE.INSERTXML to insert the data contained in the XML document into the specified table.
    (Below, you can see the code of my procedures.)
    With this information, can you tell me why this treatment is very very very slow with a large XML document and how I can improve it?
    Thank you for your help!
    Anne-Marie
    CREATE OR REPLACE PROCEDURE "PARITOP_TRAITERXMLRESULTMASSE" (
    pPRC_ID IN PARITOP_PARC.PRC_ID%TYPE,
    pRST_MONDE IN PARITOP_RESULTAT.RST_MONDE%TYPE,
    pXMLDOC IN CLOB,
    pTABLENAME IN VARCHAR2)
    AS
    -- Purpose: insert the content of the XML passed as a parameter (pXMLDOC) into the table passed as a parameter (pTABLENAME).
    -- The table passed as a parameter must have the column "RST_ID" as a foreign key.
    -- (The RST_ID node is therefore added to every XML document. This rst_id is
    -- determined from the PARITOP_RESULTAT table using the pPRC_ID and
    -- pRstMonde supplied as parameters.)
    result_doc CLOB;
    XMLDOMDOC XDB.DBMS_XMLDOM.DOMDOCUMENT;
    NODE_ROWSET DBMS_XMLDOM.DOMNODE;
    NODE_ROW DBMS_XMLDOM.DOMNODE;
    vUE_ID PARITOP_RESULTAT.UE_ID%TYPE;
    vRST_ID PARITOP_RESULTAT.RST_ID%TYPE;
    nodeList DBMS_XMLDOM.DOMNODELIST;
    BEGIN
    BEGIN
    vUE_ID := 0;
    vRST_ID := 0;
    XMLDOMDOC := DBMS_XMLDOM.NEWDOMDOCUMENT(pXMLDOC);
    IF NOT GESTXML_PKG.FN_PARITOP_DOCUMENT_IS_NULL(XMLDOMDOC) THEN
    NODE_ROWSET := DBMS_XMLDOM.item(DBMS_XMLDOM.GETCHILDNODES (DBMS_XMLDOM.MAKENODE(XMLDOMDOC)),0);
    for i in 0..dbms_xmldom.getLength(DBMS_XMLDOM.getchildnodes(NODE_ROWSET))-1 loop
    NODE_ROW := DBMS_XMLDOM.ITEM(DBMS_XMLDOM.GETCHILDNODES(NODE_ROWSET), i) ;
    nodeList := DBMS_XMLDOM.GETELEMENTSBYTAGNAME(DBMS_XMLDOM.makeelement(NODE_ROW) , 'UE_ID');
    IF vUE_ID <> DBMS_XMLDOM.GETNODEVALUE(DBMS_XMLDOM.GETFIRSTCHILD(DBMS_XMLDOM.ITEM(nodeList, 0))) THEN
    vUE_ID := DBMS_XMLDOM.GETNODEVALUE(DBMS_XMLDOM.GETFIRSTCHILD(DBMS_XMLDOM.ITEM(nodeList, 0)));
    --fetch the rst_id
    SELECT RST_ID INTO vRST_ID
    FROM PARITOP_RESULTAT RST
    WHERE RST.PRC_ID = pPRC_ID
    AND RST.UE_ID = vUE_ID
    AND RST.RST_MONDE = pRST_MONDE
    AND RST_A_SUPPRIMER = 0;
    END IF;
    GESTXML_PKG.PARITOP_ADDNODETOROW(XMLDOMDOC, NODE_ROW, 'RST_ID', vRST_ID);
    end loop;
    RESULT_DOC := ' '; --keep this, so that writeToClob does not raise an error.
    dbms_xmldom.writeToClob(DBMS_XMLDOM.MAKENODE(XMLDOMDOC), RESULT_DOC);
    --Insert the XML document into the table "tableName"
    GESTXML_PKG.PARITOP_INSERTXML(RESULT_DOC, pTABLENAME);
    DBMS_XMLDOM.FREEDOCUMENT( XMLDOMDOC);
    end if;
    EXCEPTION
    […exception handling…]
    END;
    END;
    The format of an XML clob is:
    <ROWSET>
    <ROW>
    <PRC_ID>193</PRC_ID>
    <UE_ID>8781</UE_ID>
    <VEN_ID>6223</VEN_ID>
    <RST_MONDE>0</RST_MONDE>
    <CMP_SELMAN>0</CMP_SELMAN>
    <CMP_INDICESELECTION>92.307692307692307</CMP_INDICESELECTION>
    <CMP_PVRES>94900</CMP_PVRES>
    <CMP_PVAJUSTE>72678.017699115066</CMP_PVAJUSTE>
    <CMP_PVAJUSTEMIN>72678.017699115095</CMP_PVAJUSTEMIN>
    <CMP_PVAJUSTEMAX>72678.017699115037</CMP_PVAJUSTEMAX>
    <CMP_PV>148000</CMP_PV>
    <CMP_VALROLE>129400</CMP_VALROLE>
    <CMP_PVRESECART>4790</CMP_PVRESECART>
    <CMP_PVRHAB>101778.01769911509</CMP_PVRHAB>
    <CMP_UTILISE>1</CMP_UTILISE>
    <CMP_TVM>1</CMP_TVM>
    <CMP_PVA>148000</CMP_PVA>
    </ROW>
    <ROW>
    <PRC_ID>193</PRC_ID>
    <UE_ID>8781</UE_ID>
    <VEN_ID>6235</VEN_ID>
    <RST_MONDE>0</RST_MONDE>
    <CMP_SELMAN>0</CMP_SELMAN>
    <CMP_INDICESELECTION>76.92307692307692</CMP_INDICESELECTION>
    <CMP_PVRES>117800</CMP_PVRES>
    <CMP_PVAJUSTE>118080</CMP_PVAJUSTE>
    <CMP_PVAJUSTEMIN>118080</CMP_PVAJUSTEMIN>
    <CMP_PVAJUSTEMAX>118080</CMP_PVAJUSTEMAX>
    <CMP_PV>172000</CMP_PV>
    <CMP_VALROLE>134800</CMP_VALROLE>
    <CMP_PVRESECART>0</CMP_PVRESECART>
    <CMP_PVRHAB>147180</CMP_PVRHAB>
    <CMP_UTILISE>1</CMP_UTILISE>
    <CMP_TVM>1</CMP_TVM>
    <CMP_PVA>172000</CMP_PVA>
    </ROW>
    </ROWSET>
    PARITOP_COMPARABLE TABLE :
    RST_ID NUMBER(10) NOT NULL,
    VEN_ID NUMBER(10) NOT NULL,
    CMP_SELMAN NUMBER(1) NOT NULL,
    CMP_UTILISE NUMBER(1) NOT NULL,
    CMP_INDICESELECTION FLOAT(53) NOT NULL,
    CMP_PVRES FLOAT(53) NULL,
    CMP_PVAJUSTE FLOAT(53) NULL,
    CMP_PVRHAB FLOAT(53) NULL,
    CMP_TVM FLOAT(53) NULL
    PROCEDURE PARITOP_INSERTXML (xmlDoc IN clob, tableName IN VARCHAR2)
    AS
    insCtx DBMS_XMLSave.ctxType;
    rowss number;
    BEGIN
    --inserts the XML fields into the table passed as a parameter.
    --the XML fields only need to have the same names as the table columns
    BEGIN
    insCtx := DBMS_XMLSave.newContext(tableName); -- get context handle
    DBMS_XMLSAVE.SETDATEFORMAT( insCtx, 'yyyy-MM-dd HH:mm:ss');--attention, case sensitive
    DBMS_XMLSAVE.setIgnoreCase(insCtx, 1);
    rowss := DBMS_XMLSAVE.INSERTXML(insCtx , xmlDoc);
    DBMS_XMLSave.closeContext(insCtx);
    EXCEPTION
    […]
    END;
    END;
    PROCEDURE PARITOP_ADDNODETOROW (
    XMLDOMDOC DBMS_XMLDOM.DOMDOCUMENT,
    NODE_ROW dbms_xmldom.DOMNode,
    NOM_NOEUD VARCHAR2,
    VALEUR_NOEUD VARCHAR2)
    AS
    --ADDS A NODE WITH A SINGLE VALUE TO A <ROW> OF AN XML DOCUMENT.
    --MOSTLY USEFUL FOR FOREIGN KEYS
    domElemAInserer DBMS_XMLDOM.DOMELEMENT;
    NODE dbms_xmldom.DOMNode;
    NODE_TMP dbms_xmldom.DOMNode;
    BEGIN
    domElemAInserer := DBMS_XMLDOM.createElement(XMLDOMDOC, NOM_NOEUD) ;
    NODE := DBMS_XMLDOM.MAKENODE(domElemAInserer); --cast
    NODE := DBMS_XMLDOM.APPENDCHILD(NODE_ROW,NODE);
    NODE_TMP := DBMS_XMLDOM.MAKENODE(DBMS_XMLDOM.CREATETEXTNODE(XMLDOMDOC, VALEUR_NOEUD ) );
    NODE := DBMS_XMLDOM.APPENDCHILD(NODE,NODE_TMP );
    END;

  • Bulk Loader Program to load large xml document

    I am looking for a bulk loader database program that will load a very large XML document. The simple bulk loader application available on the Oracle site will not load this document due to its size, which is approximately 20 MB. Please advise asap. Thank you.

    From the above document:
    Storing XML Data Across Tables
    Question
    Can XML-SQL Utility store XML data across tables?
    Answer
    Currently XML-SQL Utility (XSU) can only store to a single table. It maps a canonical representation of an XML document into any table/view. But of course there is a way to store XML with the XSU across tables. One can do this using XSLT to transform any document into multiple documents and insert them separately. Another way is to define views over multiple tables (object views if needed) and then do the inserts ... into the view. If the view is inherently non-updatable (because of complex joins, ...), then one can use INSTEAD-OF triggers over the views to do the inserts.
    -- I've tried this, works fine.
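    A minimal sketch of the view-plus-trigger approach the FAQ describes (the ORDERS/ORDER_ITEMS tables, the view and all column names here are hypothetical, not from this thread); once the trigger exists, XSU or DBMS_XMLSAVE can insert the canonical ROWSET/ROW document into the view as if it were a single table:

    CREATE OR REPLACE VIEW order_full_v AS
      SELECT o.order_id, o.customer, i.item_no, i.qty
      FROM   orders o, order_items i
      WHERE  i.order_id = o.order_id;

    CREATE OR REPLACE TRIGGER order_full_v_ins
      INSTEAD OF INSERT ON order_full_v
      FOR EACH ROW
    BEGIN
      -- split each incoming "flat" row across the two underlying tables
      INSERT INTO orders (order_id, customer)
      SELECT :NEW.order_id, :NEW.customer FROM dual
      WHERE NOT EXISTS (SELECT 1 FROM orders WHERE order_id = :NEW.order_id);

      INSERT INTO order_items (order_id, item_no, qty)
      VALUES (:NEW.order_id, :NEW.item_no, :NEW.qty);
    END;
    /

    DBMS_XMLSAVE.insertXML (or the XSU command line) pointed at ORDER_FULL_V then fires the trigger once per <ROW>.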

  • XMLTYPE as CLOB storage "inserting large xml document in xml type column"

    Hi All,
    I have a table containing an XMLType column (non schema-based).
    I would like to insert a large XML document into it,
    but an exception is thrown --> "string literal too long".
    I tried to use bind variables as a solution ("prepared statements", as I write in Java),
    but it didn't work ... as the XML document is large.
    When I tried to change the column type to CLOB, it worked but without XML validation;
    although the XML type is mapped to a CLOB in storage, the XML type couldn't insert the document.
    If anyone has a solution please tell, I need it urgently.
    thanks,in advance :-)

    Thanks, it was very useful :-)
    But I am not having any success getting the following statement working using a JDBC connection pool rather than a hard-coded URL connection:
    tempClob = CLOB.createTemporary(conn, true, CLOB.DURATION_SESSION);
    It works with:
    "jdbc:oracle:thin:@server:port:dbname"
    Does NOT work with:
    datasource.getConnection()
    If anyone could help...
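    For the database side of the bind/temporary-CLOB route discussed in this thread, here is a minimal PL/SQL sketch (the XML_TAB table and DOC column are hypothetical). The point is that the document travels as a CLOB bind, never as a SQL string literal, so the limit behind "string literal too long" never applies:

    CREATE OR REPLACE PROCEDURE insert_xml_doc (p_doc IN CLOB) IS
    BEGIN
      -- p_doc arrives as a CLOB bind (e.g. a temporary CLOB set with setClob from JDBC),
      -- and the XMLType constructor parses it inside the database
      INSERT INTO xml_tab (doc) VALUES (XMLTYPE(p_doc));
    END;
    /

    From JDBC the call would be a CallableStatement ("begin insert_xml_doc(?); end;") with the temporary CLOB bound to the single parameter.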

  • Loading large XML documents

    Hi everyone again :) Just sitting here trying to load a large XML document (it's ~13Mb). I know that's a massive XML document, but that's the way it is. The problem that I am having is that when I try to load the document I get an out of memory exception.
    Frankly, I'm not surprised, but is there a remedy? Any thoughts/ideas/solutions would be greatly appreciated. :)
    Ben

    Sounds like you are using DOM. The DOM parser you are using must be loading the whole Document tree right away. DOM really eats up memory. There are two possible solutions:
    1. Look into using a SAX parser. I don't know what you are doing with the xml, so I can't say whether or not that will work for you.
    2. Configure the DOM parser to defer loading nodes until they are requested, or if that option is not available with your parser, get a parser that will defer node loading.
    If option 2 sounds like what you need, then I suggest looking into the Apache Xerces parser. I am pretty sure it defers loading. You shouldn't have to change your code to work with the Xerces parser, you just have to make sure you set the proper system properties so that Java will automatically use the Xerces parser.

  • Performance issues with FDK in large XML documents

    In my current project with FrameMaker 8 I'm experiencing severe performance issues with some FDK API calls.
    The documents are about 3-8 MB in size. Formatted, they cover 150-250 pages.
    When importing such an XML document I do some extensive "post-processing" using FDK. This processing happens in Sr_EventHandler() during the SR_EVT_END_READER event. I noticed that some FDK function calls which modify the document's structure, like F_ApiSetAttribute() or F_ApiNewElementInHierarchy(), take several seconds, for the larger documents even minutes, to complete one single function call. I tried to move some of these calls to earlier events, mostly to SR_EVT_END_ELEM. There the calls work without a delay. Unfortunately I can't rewrite the FDK client to move all the calls that are lagging to earlier events.
    Does anybody have a clue why such delays happen, and possibly can make a suggestion, how to solve this issue? Thank you in advance.
    PS: I already thought of splitting such a document into smaller pieces by using the FrameMaker book function. But I don't think the structure of the documents will permit such an automatic split, and it definitely isn't an option to change the document structure (the project is about migrating documents from Interleaf to XML with the constraint of keeping the document layout identical).

    FP_ApplyFormatRules sounds really good--I'll give it a try on Monday. Wonder how I could miss it, as I already tried FP_Reformatting and FP_Displaying to no avail?! By the way, what is actually meant by FP_Reformatting (when I used it I assumed it would do exactly what FP_ApplyFormatRules sounds like it does), or is that one another of Lynne's well-kept secrets?
    Thanks for all the helpful suggestions, guys. On Friday I already had my first improvements in a test version of my client: I did some (not all necessary) structural changes using XSLT pre-processing, and processing went down from 8 hours(!) to 1 hour--Yeappie! I was also playing with the idea of writing a wrapper to F_ApiNewElementInHierarchy() which actually pastes an appropriate element created in a small flow on the reference pages at the intended insertion location. But now, with FP_ApplyFormatRules on the horizon, I'm quite confident to get even the complicated stuff under control, which cannot be handled by the XSLT pre-processing, as it is based on the actual formatting of the document at run-time and cannot be anticipated in pre-processing.
    --Franz

  • JDBC - No 'action' attribute found in XML document - error

    Hi,
    I'm trying to write to SQL Server from a File.
    I successfully read from file, but fail to write.
    <b>My XML is :</b>
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:SD_NEZIGA_OUT_MT xmlns:ns0="ssss.co.il:SD:Office_core_Neziga"><statement2 action="INSERT"><table>Employees</table><access><ID>000009</ID><Name>&#1497;&#1493;&#1504;&#1505;&#1497; &#1512;&#1493;&#1514;&#1497;</Name><Phone>972528288840</Phone><Manager>001037</Manager><DistManager>001037</DistManager><Password>D</Password><UserType>0</UserType><miskalID>0000</miskalID></access></statement2></ns0:SD_NEZIGA_OUT_MT>
    <b>Error from JDBC adapter:</b>
    TransformException error in xml processor class: Error processing request in sax parser: No 'action' attribute found in XML document (attribute "action" missing or wrong XML structure)
    Help me please.
    Best regards, Natalia.

    Hey
    Your XML is not correct; it must be something like this:
    <root>
    <StatementName1>
    <dbTableName action="UPDATE" | "UPDATE_INSERT">
    <table>realDbTableName</table>.....
    </StatementName1>
    If you look at the receiver structure in the blog /people/sap.user72/blog/2005/06/01/file-to-jdbc-adapter-using-sap-xi-30, action is an attribute of TEST and not of STATEMENTNAME; for your structure it is an attribute of statement2.
    Can you send your receiver structure?
    thanx
    ahmad
    Message was edited by:
            Ahmad

  • How to map an large XML document to the XMLType with TopLink in JDev

    Hello!
    We need to map an XML document in the Java String to an XMLType column. If the XML document has less than 4000 characters, we have no problems by using the DirectToField mapping. However, once the XML document has more than 4000 characters, using the DirectToField mapping, we got the error: Exception [TOPLINK-4002] (Oracle TopLink - 10g Release 3 (10.1.3.0.0) (Build 060118)): oracle.toplink.exceptions.DatabaseException
    Internal Exception: java.sql.BatchUpdateException: ORA-01461: can bind a LONG value only for insert into a LONG column
    Then we have tried to use the "Type Conversion" mapping, to map it to the database type (Clob or NClob --oracle.toplink.oraclespecific). we got:
    Exception [TOPLINK-4002] (Oracle TopLink - 10g Release 3 (10.1.3.0.0) (Build 060118)): oracle.toplink.exceptions.DatabaseException
    Internal Exception: java.sql.SQLException: ORA-00932: inconsistent datatypes: expected NUMBER got CLOB
    Error Code: 932
    Any suggestions?
    Thanks in advance!

    Thanks Matt! It works fine by using DirectToXMLTypeMapping.
    However, getting it to work was not smooth. In particular, we use JDev in our project and JDev has not included this feature yet. Therefore, I had to use the TopLink Workbench to create the descriptor file and cut & paste it into the descriptor file generated by JDev. Also, xdb.jar has to be part of the classpath.
    Then I come to a question -- how can we take some features that are only provided by the TopLink Workbench into the JDev environment without having to hack around?
    Thanks,
    s.c.

  • XML document error - ELEARNING TRAINING MATERIAL

    Hi,
    Using SOLAR_LEARNING_MAP I was trying to create training material for a particular process (creating a sales order, as an example). I also attached a SAP Tutor for it. The document displays from the tcode without any issue, but if I click OPEN BROWSER the following error comes up:
    The XML page cannot be displayed
    Cannot view XML input using style sheet. Please correct the error and then click the Refresh button, or try again later.
    XML document must have a top level element. Error processing resource 'http://hostname.domain:8002/sap/bc/solman/defaul...
    Anybody tried this feature ?
    Thanks,
    Murali.

    Since there has been no answer for a long time, I am closing it now.

  • Building XML document error

    Hi.
    I am building an XML document but I am getting this error: Error in building: Not logged in. Why? On my PC it works well, but when I try to run it on the NT server I get the error.
    The class is very simple:
    import org.jdom.Document;
    import org.jdom.Element;
    import org.jdom.JDOMException;
    import org.jdom.input.SAXBuilder;
    import org.xml.sax.InputSource;
    public class ReadXmlDocument {
    private Document objXmlDocument;
    public void setXmlDocument() throws JDOMException {
    if ( getFileName() != null ) {
    SAXBuilder builder = new SAXBuilder();
    objXmlDocument = builder.build( getFileName() );
    } else {
    throw new NullPointerException("File name has not been set");
    }
    }
    }
    Thanks.

    For those that care, I found the solution/workaround.
    This is actually a bug........... http://developer.java.sun.com/developer/bugParade/bugs/4880236.html
    The workaround is listed in the problem statement and is really simple.
    Instead of passing a File into the parse call, pass a FileInputStream.
    CUSTOMER SUBMITTED WORKAROUND :
    /** Since the java 2 sdk java.io.* package is keen on UNC paths the following
    workaround is quite simple and effective*/
    java.io.File fileHandler = new
    java.io.File("\\\\Network_name\\Share_name\\file_name.xml");
    javax.xml.parsers.DocumentBuilderFactory factory =
    DocumentBuilderFactory.newInstance();
    try {
    javax.xml.parsers.DocumentBuilder builder = factory.newDocumentBuilder();
    //Create a new FileInputStream from the file handler
    java.io.FileInputStream FIS = new java.io.FileInputStream(fileHandler);
    //The parse function takes an InputStream as well
    org.w3c.dom.Document xmlDoc = builder.parse(FIS);
    } catch (SAXException sxe) {
    System.err.println("SaxException");
    } catch (ParserConfigurationException pce) {
    System.err.println("ParserConfigurationException");
    } catch (IOException ioe) {
    System.err.println("IOException");
    }
    This worked for me with SAXBuilder too

  • Best parser for handling very large XML  document

    Which is the best parser for reading and extracting information from a very large XML document?

    Any SAX-parser, since DOM would use 6 times as much primary memory as the file-size.
    Xerces SAX-parser is in my experience the fastest.
    Gil

  • Bug in JavaScript code parsing a large XML document returns truncated values

    I believe I am seeing this old bug (https://bugzilla.mozilla.org/show_bug.cgi?id=449219) in Firefox ESR v10.0.3. Could this be a regression or new version of the same bug? Or is the ESR build separate from the normal trunk releases and could this bug not have been patched onto the ESR branch?

    Firefox ESR is based upon the Firefox 10 version with some security patches applied from later Firefox versions, so you shouldn't see bugs that were fixed in Firefox 3 versions.
    Please update to the latest Firefox 10.0.11 ESR version.
    *http://www.mozilla.org/en-US/firefox/organizations/all.html
    *http://www.mozilla.org/en-US/firefox/10.0.11/releasenotes/

  • Passing Large xml (as CLOB) to store proc - fails for ORA-29532

    I am getting the following error when I pass a large XML document. It works up to a limit (I am not sure of the size). Is there any size limitation when using XMLSAVE? Please assist.
    Oracle Version : Oracle 9.2.0.1
    Error:
    ORA-29532: Java call terminated by uncaught Java exception:
    oracle.xml.sql.OracleXMLSQLException: '>' Missing from end tag.
    Procedure:
    CREATE OR REPLACE PROCEDURE XML2TABLE
    (
    in_tx_XML IN CLOB
    ,in_na_Table IN VARCHAR2
    )
    IS
    lv_XML_Context DBMS_XMLSAVE.ctxType;
    lv_Ct_Rows NUMBER(10);
    BEGIN
    EXECUTE IMMEDIATE 'TRUNCATE TABLE '||in_na_Table;
    lv_XML_Context := DBMS_XMLSAVE.NewContext(in_na_Table);
    lv_Ct_Rows := DBMS_XMLSAVE.InsertXML(lv_XML_Context,in_tx_XML);
    DBMS_XMLSAVE.CloseContext(lv_XML_Context);
    END XML2TABLE;
    /

    Technically DBMS_XMLSAVE and its alter ego DBMS_XMLQUERY are not considered part of XML DB. DBMS_XMLSAVE and DBMS_XMLQUERY are legacy Java implementations.
    In 9.2.x these routines were replaced by DBMS_XMLSTORE and DBMS_XMLGEN which are written in 'C' and should be much faster in most cases and address a number of limitations inherent in the older implementations. DBMS_XMLSTORE and DBMS_XMLGEN are part of XML DB and the minimum supported release for XML DB is 9.2.0.3.0.
    DBMS_XMLSAVE issues are addressed in the TECH/XML forum
    http://forums.oracle.com/forums/category.jspa?categoryID=51
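    Following that suggestion, a minimal sketch of XML2TABLE rewritten against DBMS_XMLSTORE (untested here; note that DBMS_XMLSTORE has no setIgnoreCase, so the element names in the incoming ROWSET/ROW document must match the table's column names exactly):

    CREATE OR REPLACE PROCEDURE XML2TABLE
    (
    in_tx_XML IN CLOB
    ,in_na_Table IN VARCHAR2
    )
    IS
      lv_XML_Context DBMS_XMLSTORE.ctxType;
      lv_Ct_Rows     NUMBER(10);
    BEGIN
      EXECUTE IMMEDIATE 'TRUNCATE TABLE '||in_na_Table;
      -- same newContext / insertXML / closeContext pattern as DBMS_XMLSAVE,
      -- but implemented in 'C' inside the database
      lv_XML_Context := DBMS_XMLSTORE.newContext(in_na_Table);
      lv_Ct_Rows     := DBMS_XMLSTORE.insertXML(lv_XML_Context, in_tx_XML);
      DBMS_XMLSTORE.closeContext(lv_XML_Context);
    END XML2TABLE;
    /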

  • Problem Loading Large XML Doc

    I'm running 10.2.0.3 on a Linux box and I'm having problems loading a large XML document (about 100 MB). In the past, I would simply load the XML file into a XMLType column like this:
    INSERT INTO foo VALUES (XMLType(bfilename('XMLDIR', 'test.xml'), nls_charset_id('AL32UTF8')));
    But when I try this with a large file, it runs for 10 minutes and then returns an ORA-03113. I'm assuming the file is just too large for this technique. I spoke to Mark Drake when I was at OpenWorld and he suggested I use Oracle XML DB, so I created and registered a schema and tried using sqlldr to load the doc, but it ran for 2 1/2 hours before returning:
    Parse Error on row 1 in table FOO
    OCI-31038: Invalid integer value: "129"
    I tried simplifying both the XML file and schema to just the following:
    <schedules>
    <s s="2009-09-21T04:00:00" d="21600" p="335975" c="19672"/>
    <s s="2009-09-21T04:00:00" d="21600" p="335975" c="15387"/>
    <s s="2009-09-21T04:00:00" d="25200" p="335975" c="5256"/>
    <s s="2009-09-21T04:00:00" d="86400" p="335975" c="26198">
    <k id="5" v="2009-09-21 09:00:00.000"/>
    <k id="6" v="2009-09-22 03:59:59.000"/>
    <k id="26" v="0.00"/><k id="27" v="US"/>
    </s>
    <s s="2009-09-21T04:00:00" d="21600" p="335975" c="11678"/>
    <s s="2009-09-21T04:00:00" d="21600" p="335975" c="26697"/>
    <s s="2009-09-21T04:00:00" d="21600" p="335975" c="25343"/>
    <s s="2009-09-21T04:00:00" d="21600" p="335975" c="25269"/>
    <s s="2009-09-21T04:00:00" d="86400" p="335975" c="26200">
    <k id="5" v="2009-09-21 09:00:00.000"/>
    <k id="6" v="2009-09-22 03:59:59.000"/>
    <k id="26" v="0.00"/><k id="27" v="US"/>
    </s>
    </schedules>
    <?xml version="1.0" encoding="UTF-8"?>
    <!--W3C Schema generated by XMLSpy v2008 sp1 (http://www.altova.com)-->
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
         xmlns:xdb="http://xmlns.oracle.com/xdb"
        version="1.0"
        xdb:storeVarrayAsTable="true">
         <xs:element name="schedules" xdb:defaultTable="SCHEDULES">
              <xs:complexType xdb:SQLType="SCHEDULES_T">
                   <xs:sequence>
                        <xs:element ref="s" maxOccurs="unbounded"/>
                   </xs:sequence>
              </xs:complexType>
         </xs:element>
         <xs:element name="s">
              <xs:complexType>
                   <xs:choice minOccurs="0">
                        <xs:element ref="f" maxOccurs="unbounded"/>
                        <xs:element ref="k" maxOccurs="unbounded"/>
                   </xs:choice>
                   <xs:attribute name="s" use="required" type="xs:dateTime"/>
                   <xs:attribute name="p" use="required" type="xs:int"/>
                   <xs:attribute name="d" use="required" type="xs:int"/>
                   <xs:attribute name="c" use="required" type="xs:short"/>
              </xs:complexType>
         </xs:element>
         <xs:element name="k">
              <xs:complexType>
                   <xs:attribute name="v" use="required" type="xs:string"/>
                   <xs:attribute name="id" use="required" type="xs:byte"/>
              </xs:complexType>
         </xs:element>
         <xs:element name="f">
              <xs:complexType>
                   <xs:attribute name="id" use="required" type="xs:byte"/>
              </xs:complexType>
         </xs:element>
    </xs:schema>
    Keep in mind both the actual XML file and the corresponding XSD are much more complex, but this particular section is about 70 MB so I wanted to see if I could just load that.  I used the following sqlldr script:
    LOAD DATA
    INFILE *
    INTO TABLE schedules_tmp TRUNCATE
    XMLType(xml_doc)(
         lobfn FILLER CHAR TERMINATED by ',',
         xml_doc LOBFILE(lobfn) TERMINATED BY EOF
    )
    BEGINDATA
    /tmp/schedules.xml
    This worked fine on a small doc - loaded correctly and I could query it fine - but when I tried using the 70 MB file it ran for a couple of hours before dying with a memory problem.
    So what am I doing wrong? Is there a better way to load a large file?
    Thanks for the help.
    Pete
    Edited by: mdrake on Nov 9, 2009 8:46 PM

    Mark:
    Answers to your questions:
    Do you use direct load ? -> Yes. I tried again using UNRECOVERABLE LOAD DATA to try to speed up performance, but it still ran for a couple of hours before dying.
    Which DB Release are you working with ? -> 10.2.0.3
    Can you see if you can upload via FTP ? -> I added noNamespaceSchemaLocation to my XML file and ftp'd it to my XML directory, but it wasn't recognized. Is there something else I have to do?
    The change for unsignedInt should have got rid of the issue with the value 129. Did it? -> I didn't try it again on the whole XML file (I'm just working with the schedules section), so I haven't verified this.
    I'm still stumped as to why sqlldr takes so long. I could write something to parse the XML file into a flat file and then use sqlldr to load it into a relational table, and the load would only take a few minutes. But then I wouldn't be using XML DB which I thought would be faster. What am I doing wrong?
    Pete
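    One variation worth trying for the plain INSERT path (a sketch only, reusing the XMLDIR directory and foo table from the first post): stage the file into a temporary CLOB with DBMS_LOB and construct the XMLType from that, so it at least becomes clear whether the ORA-03113 comes from reading the file or from parsing the XML:

    DECLARE
      l_bfile BFILE := BFILENAME('XMLDIR', 'test.xml');
      l_clob  CLOB;
      l_dest  INTEGER := 1;
      l_src   INTEGER := 1;
      l_lang  INTEGER := 0;
      l_warn  INTEGER;
    BEGIN
      DBMS_LOB.CREATETEMPORARY(l_clob, TRUE);
      DBMS_LOB.OPEN(l_bfile, DBMS_LOB.LOB_READONLY);
      -- stage the whole file into a temporary CLOB, converting from AL32UTF8
      DBMS_LOB.LOADCLOBFROMFILE(l_clob, l_bfile, DBMS_LOB.LOBMAXSIZE,
                                l_dest, l_src,
                                NLS_CHARSET_ID('AL32UTF8'), l_lang, l_warn);
      DBMS_LOB.CLOSE(l_bfile);
      -- parse and insert in a separate step
      INSERT INTO foo VALUES (XMLTYPE(l_clob));
      COMMIT;
      DBMS_LOB.FREETEMPORARY(l_clob);
    END;
    /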
