Inserting a long XML document into XMLType

I'm trying to load the following document into an XMLType column on 10.2. I've tried every example I can find, and I can push the data into CLOBs just fine using the Java workaround (http://www.oracle.com/technology/sample_code/tech/java/codesnippet/xmldb/HowToLoadLargeXML.html).
Can anyone provide a solution, or let me know if there is a limitation, please?
Given the table:
SQL> describe xmltable_1
Name                                      Null?    Type
----------------------------------------- -------- ------------
DOC_ID                                             NUMBER
XML_DATA                                           XMLTYPE
How do I load this data into 'XML_DATA'?
<?xml version="1.0" encoding="UTF-8"?>
<metadata>
<idinfo>
<citation>
<citeinfo>
<origin>Rand McNally and ESRI</origin>
<pubdate>1996</pubdate>
<title>ESRI Cities Geodata Set</title>
<geoform>vector digital data</geoform>
<onlink>\\OIS23\C$\Files\Working\Metadata\world\cities.shp</onlink>
</citeinfo>
</citation>
<descript>
<abstract>World Cities contains locations of major cities around the world. The cities include national capitals for each of the countries in World Countries 1998 as well as major population centers and landmark cities. World Cities was derived from ESRI's ArcWorld database and supplemented with other data from the Rand McNally New International Atlas</abstract>
<purpose>606 points, 4 descriptive fields. Describes major world cities.</purpose>
</descript>
<timeperd>
<timeinfo>
<sngdate>
<caldate>1996</caldate>
</sngdate>
</timeinfo>
<current>publication date</current>
</timeperd>
<status>
<progress>Complete</progress>
<update>None planned</update>
</status>
<spdom>
<bounding>
<westbc>
-165.270004</westbc>
<eastbc>
177.130188</eastbc>
<northbc>
78.199997</northbc>
<southbc>
-53.150002</southbc>
</bounding>
</spdom>
<keywords>
<theme>
<themekt>city</themekt>
<themekey>cities</themekey>
</theme>
</keywords>
<accconst>none</accconst>
<useconst>none</useconst>
<ptcontac>
<cntinfo>
<cntperp>
<cntper>unknown</cntper>
<cntorg>unknown</cntorg>
</cntperp>
<cntpos>unknown</cntpos>
<cntvoice>555-1212</cntvoice>
</cntinfo>
</ptcontac>
<datacred>ESRI</datacred>
<native>Microsoft Windows NT Version 4.0 (Build 1381) Service Pack 6; ESRI ArcCatalog 8.1.0.570</native>
</idinfo>
<dataqual>
<attracc>
<attraccr>no report available</attraccr>
<qattracc>
<attraccv>1000000</attraccv>
<attracce>no report available</attracce>
</qattracc>
</attracc>
<logic>no report available</logic>
<complete>no report available</complete>
<posacc>
<horizpa>
<horizpar>no report available</horizpar>
</horizpa>
<vertacc>
<vertaccr>no report available</vertaccr>
</vertacc>
</posacc>
<lineage>
<srcinfo>
<srccite>
<citeinfo>
<title>ESRI</title>
</citeinfo>
</srccite>
<srcscale>20000000</srcscale>
<typesrc>CD-ROM</typesrc>
<srctime>
<timeinfo>
<sngdate>
<caldate>1996</caldate>
</sngdate>
</timeinfo>
<srccurr>publication date</srccurr>
</srctime>
<srccontr>no report available</srccontr>
</srcinfo>
<procstep>
<procdesc>no report available</procdesc>
<procdate>Unknown</procdate>
</procstep>
</lineage>
</dataqual>
<spdoinfo>
<direct>Vector</direct>
<ptvctinf>
<sdtsterm>
<sdtstype>Entity point</sdtstype>
<ptvctcnt>606</ptvctcnt>
</sdtsterm>
</ptvctinf>
</spdoinfo>
<spref>
<horizsys>
<geograph>
<latres>0.000001</latres>
<longres>0.000001</longres>
<geogunit>Decimal degrees</geogunit>
</geograph>
<geodetic>
<horizdn>North American Datum of 1927</horizdn>
<ellips>Clarke 1866</ellips>
<semiaxis>6378206.400000</semiaxis>
<denflat>294.978698</denflat>
</geodetic>
</horizsys>
</spref>
<eainfo>
<detailed>
<enttyp>
<enttypl>
cities</enttypl>
</enttyp>
<attr>
<attrlabl>FID</attrlabl>
<attrdef>Internal feature number.</attrdef>
<attrdefs>ESRI</attrdefs>
<attrdomv>
<udom>Sequential unique whole numbers that are automatically generated.</udom>
</attrdomv>
</attr>
<attr>
<attrlabl>Shape</attrlabl>
<attrdef>Feature geometry.</attrdef>
<attrdefs>ESRI</attrdefs>
<attrdomv>
<udom>Coordinates defining the features.</udom>
</attrdomv>
</attr>
<attr>
<attrlabl>NAME</attrlabl>
<attrdef>The city name. Spellings are based on Board of Geographic Names standards and commercial atlases.</attrdef>
<attrdefs>ESRI</attrdefs>
</attr>
<attr>
<attrlabl>COUNTRY</attrlabl>
<attrdef>An abbreviated country name.</attrdef>
</attr>
<attr>
<attrlabl>POPULATION</attrlabl>
<attrdef>Total population for the entire metropolitan area. Values are from recent census or estimates.</attrdef>
</attr>
<attr>
<attrlabl>CAPITAL</attrlabl>
<attrdef>Indicates whether a city is a national capital (Y/N).</attrdef>
</attr>
</detailed>
<overview>
<eaover>none</eaover>
<eadetcit>none</eadetcit>
</overview>
</eainfo>
<distinfo>
<stdorder>
<digform>
<digtinfo>
<transize>0.080</transize>
</digtinfo>
</digform>
</stdorder>
</distinfo>
<metainfo>
<metd>20010509</metd>
<metc>
<cntinfo>
<cntorgp>
<cntorg>ESRI</cntorg>
<cntper>unknown</cntper>
</cntorgp>
<cntaddr>
<addrtype>unknown</addrtype>
<city>unknown</city>
<state>unknown</state>
<postal>00000</postal>
</cntaddr>
<cntvoice>555-1212</cntvoice>
</cntinfo>
</metc>
<metstdn>FGDC Content Standards for Digital Geospatial Metadata</metstdn>
<metstdv>FGDC-STD-001-1998</metstdv>
<mettc>local time</mettc>
<metextns>
<onlink>http://www.esri.com/metadata/esriprof80.html</onlink>
<metprof>ESRI Metadata Profile</metprof>
</metextns>
</metainfo>
</metadata>
rtacce>Vertical Positional Accuracy is expressed in meters. Vertical accuracy figures were developed by comparing elevation contour locations on 1:24,000 scale maps to elevation values at the same location within the digital database. Some manual interpolation was necessary to complete this test. The analysis results are expressed as linear error at a 90% confidence interval.</vertacce>
</qvertpa>
</vertacc>
</posacc>
<lineage>
<srcinfo>
<srccite>
<citeinfo>
<origin>National Imagery and Mapping Agency</origin>
<pubdate>1994</pubdate>
<title>Operational Navigational Chart</title>
<geoform>map</geoform>
<pubinfo>
<pubplace>St.Louis, MO</pubplace>
<publish>National Imagery and Mapping Agency</publish>
</pubinfo>
</citeinfo>
</srccite>
<srcscale>1000000</srcscale>
<typesrc>stable-base material</typesrc>
<srctime>
<timeinfo>
<rngdates>
<begdate>1974</begdate>
<enddate>1994</enddate>
</rngdates>
</timeinfo>
<srccurr>Publication dates</srccurr>
</srctime>
<srccitea>ONC</srccitea>
<srccontr>All information found on the source with the exception of aeronautical data</srccontr>
</srcinfo>
<srcinfo>
<srccite>
<citeinfo>
<origin>National Imagery and Mapping Agency</origin>
<pubdate>199406</pubdate>
<title>Digital Aeronautical Flight Information File</title>
<geoform>model</geoform>
<pubinfo>
<pubplace>St. Louis, MO</pubplace>
<publish>National Imagery and Mapping Agency</publish>
</pubinfo>
</citeinfo>
</srccite>
<typesrc>magnetic tape</typesrc>
<srctime>
<timeinfo>
<sngdate>
<caldate>1994</caldate>
</sngdate>
</timeinfo>
<srccurr>Publication date</srccurr>
</srctime>
<srccitea>DAFIF</srccitea>
<srccontr>Airport records (name, International Civil Aviation Organization, position, elevation, and type)</srccontr>
</srcinfo>
<srcinfo>
<srccite>
<citeinfo>
<origin>Defense Mapping Agency</origin>
<pubdate>1994</pubdate>
<title>Jet Navigational Chart</title>
<geoform>map</geoform>
<pubinfo>
<pubplace>St.Louis, MO</pubplace>
<publish>Defense Mapping Agency</publish>
</pubinfo>
</citeinfo>
</srccite>
<srcscale>2,000,000</srcscale>
<typesrc>stable-base material</typesrc>
<srctime>
<timeinfo>
<rngdates>
<begdate>1974</begdate>
<enddate>1991</enddate>
</rngdates>
</timeinfo>
<srccurr>Publication date</srccurr>
</srctime>
<srccitea>JNC</srccitea>
<srccontr>All information found on the source with the exception of aeronautical data. JNCs were used as source for the Antartica region only.</srccontr>
</srcinfo>
<srcinfo>
<srccite>
<citeinfo>
<origin>USGS EROS Data Center</origin>
<pubdate></pubdate>
<title>Advance Very High Resolution Radiometer</title>
<geoform>remote-sensing image</geoform>
<pubinfo>
<pubplace>Sioux Falls, SD</pubplace>
<publish>EROS Data Center</publish>
</pubinfo>
</citeinfo>
</srccite>
<srcscale>1000000</srcscale>
<typesrc>magnetic tape</typesrc>
<srctime>
<timeinfo>
<rngdates>
<begdate>199003</begdate>
<enddate>199011</enddate>
</rngdates>
</timeinfo>
<srccurr>Publication date</srccurr>
</srctime>
<srccitea>AVHRR</srccitea>
<srccontr>6 vegetation types covering the continental US and Canada</srccontr>
</srcinfo>
<procstep>
<procdesc>For the first edition DCW, stable-based positives were produced from the original reproduction negatives (up to 35 per ONC sheet). These were digitized either through a scanning-raster to vector conversion or hand digitized into vector form. The vector data was then tagged with attribute information using ARC-INFO software. Transformation to geographic coordinates was performed using the projection graticules for each sheet. Digital information was edge matched between sheets to create large regional datasets. These were then subdivided into 5 x 5 degree tiles and converted from ARC/INFO to VPF. The data was then pre-mastered for CD-ROM. QC was performed by a separate group for each step in the production process.</procdesc>
<procdate>199112</procdate>
<proccont>
<cntinfo>
<cntorgp>
<cntorg>Environmental Systems Research Institute</cntorg>
</cntorgp>
<cntpos>Applications Division</cntpos>
<cntaddr>
<addrtype>mailing and physical address</addrtype>
<address>380 New York St.</address>
<city>Redlands</city>
<state>CA</state>
<postal>92373</postal>
<country>US</country>
</cntaddr>
<cntvoice>909-793-2853</cntvoice>
<cntfax>909-793-5953</cntfax>
</cntinfo>
</proccont>
</procstep>
<procstep>
<procdate>199404</procdate>
<proccont>
<cntinfo>
<cntorgp>
<cntorg>Geonex</cntorg>
</cntorgp>
<cntpos></cntpos>
<cntaddr>
<addrtype>mailing and physical address</addrtype>
<address>8950 North 9th Ave.</address>
<city>St. Petersburg</city>
<state>FL</state>
<postal>33702</postal>
<country>US</country>
</cntaddr>
<cntvoice>(813)578-0100</cntvoice>
<cntfax>(813)577-6946</cntfax>
</cntinfo>
</proccont>
</procstep>
<procstep>
<procdesc>Transferred digitally directly into the VPF files.</procdesc>
<procdate>199408</procdate>
<proccont>
<cntinfo>
<cntorgp>
<cntorg>Geonex</cntorg>
</cntorgp>
<cntpos></cntpos>
<cntaddr>
<addrtype>mailing and physical address</addrtype>
<address>8950 North 9th Ave.</address>
<city>St. Petersburg</city>
<state>FL</state>
<postal>33702</postal>
<country>US</country>
</cntaddr>
<cntvoice>813-578-0100</cntvoice>
<cntfax>813-577-6946</cntfax>
</cntinfo>
</proccont>
</procstep>
<procstep>
<procdesc>Stable-based positives were produced from the original reproduction negatives (up to 35 per ONC sheet). These were digitized either through a scanning-raster to vector conversion or hand digitized into vector form. The vector data was then tagged with attribute information using ARC-INFO software. Transformation to geographic coordinates was performed using the projection graticules for each sheet. Digital information was edge matched between sheets to create large regional datasets. These were then subdivided into 5 x 5 degree tiles and converted from ARC/INFO to VPF. The data was then pre-mastered for CD-ROM. QC was performed by a separate group for each step in the production process.</procdesc>
<procdate>199112</procdate>
<proccont>
<cntinfo>
<cntorgp>
<cntorg>Environmental Systems Research Institute</cntorg>
</cntorgp>
<cntpos>Applications Division</cntpos>
<cntaddr>
<addrtype>mailing and physical address</addrtype>
<address>380 New York St.</address>
<city>Redlands</city>
<state>CA</state>
<postal>92373</postal>
<country>US</country>
</cntaddr>
<cntvoice>909-793-2853</cntvoice>
<cntfax>909-793-5953</cntfax>
</cntinfo>
</proccont>
</procstep>
<procstep>
<procdesc>Daily AVHRR images were averaged for two week time periods over the entire US growing season. These averaged images, their rates of change, elevation information, and other data were used to produce a single land classification image of the contental US. The VMap-0 data set extended this coverage over the Canadian land mass, however vegetation classification was further subdivided into nine vegetation types.</procdesc>
<procdate>199402</procdate>
<srcprod>EROS data</srcprod>
<proccont>
<cntinfo>
<cntorgp>
<cntorg>USGS Eros Data Center</cntorg>
</cntorgp>
<cntpos></cntpos>
<cntaddr>
<addrtype>mailing and physical address</addrtype>
<address></address>
<city>Sioux Falls</city>
<state>SD</state>
<postal></postal>
<country>US</country>
</cntaddr>
<cntvoice></cntvoice>
<cntfax></cntfax>
</cntinfo>
</proccont>
</procstep>
<procstep>
<procdesc>The Eros data (raster files) were converted to vector polygon, splined (remove stairstepping), thinned (all ploygons under 2km2 were deleted), and tied to existing DCW polygons (water bodies, built-up areas). The resulting file was tiled and converted to a VPF Vegetation coverage for use in the DCW. All processing was performed using ARC-INFO software.</procdesc>
<procdate>199412</procdate>
<srcprod>VMap-0 Vegetation Coverage</srcprod>
<proccont>
<cntinfo>
<cntorgp>
<cntorg>Geonex</cntorg>
</cntorgp>
<cntpos></cntpos>
<cntaddr>
<addrtype>mailing and physical address</addrtype>
<address>8950 North 9th Ave.</address>
<city>St. Petersburg</city>
<state>FL</state>
<postal>33702</postal>
<country>US</country>
</cntaddr>
<cntvoice>813-578-0100</cntvoice>
<cntfax>813-577-6946</cntfax>
</cntinfo>
</proccont>
</procstep>
<procstep>
<procdesc>Data was translated from VPF format to ArcInfo Coverage format. The coverages were then loaded into a seamless ArcSDE layer.</procdesc>
<procdate>02152001</procdate>
<proccont>
<cntinfo>
<cntorgp>
<cntorg>Geodesy Team, Harvard University</cntorg>
</cntorgp>
<cntemail>[email protected]</cntemail>
</cntinfo>
</proccont>
</procstep>
</lineage>
</dataqual>
<spdoinfo>
<direct>Vector</direct>
<ptvctinf>
<sdtsterm>
<sdtstype>Complete chain</sdtstype>
</sdtsterm>
<sdtsterm>
<sdtstype>Label point</sdtstype>
</sdtsterm>
<sdtsterm>
<sdtstype>GT-polygon composed of chains</sdtstype>
</sdtsterm>
<sdtsterm>
<sdtstype>Point</sdtstype>
</sdtsterm>
<vpfterm>
<vpflevel>3</vpflevel>
<vpfinfo>
<vpftype>Node</vpftype>
</vpfinfo>
<vpfinfo>
<vpftype>Edge</vpftype>
</vpfinfo>
<vpfinfo>
<vpftype>Face</vpftype>
</vpfinfo>
</vpfterm>
</ptvctinf>
</spdoinfo>
<spref>
<horizsys>
<geograph>
<latres>0.000000</latres>
<longres>0.000000</longres>
<geogunit>Decimal degrees</geogunit>
</geograph>
<geodetic>
<horizdn>D_WGS_1984</horizdn>
<ellips>WGS_1984</ellips>
<semiaxis>6378137.000000</semiaxis>
<denflat>298.257224</denflat>
</geodetic>
</horizsys>
<vertdef>
<altsys>
<altdatum>Mean Sea Level</altdatum>
<altunits>1.0</altunits>
</altsys>
</vertdef>
</spref>
<eainfo>
<detailed>
<enttyp>
<enttypl>
lc.pat</enttypl>
</enttyp>
<attr>
<attrlabl>FID</attrlabl>
<attrdef>Internal feature number.</attrdef>
<attrdefs>ESRI</attrdefs>
<attrdomv>
<udom>Sequential unique whole numbers that are automatically generated.</udom>
</attrdomv>
</attr>
<attr>
<attrlabl>Shape</attrlabl>
<attrdef>Feature geometry.</attrdef>
<attrdefs>ESRI</attrdefs>
<attrdomv>
<udom>Coordinates defining the features.</udom>
</attrdomv>
</attr>
<attr>
<attrlabl>AREA</attrlabl>
<attrdef>Area of feature in internal units squared.</attrdef>
<attrdefs>ESRI</attrdefs>
<attrdomv>
<udom>Positive real numbers that are automatically generated.</udom>
</attrdomv>
</attr>
<attr>
<attrlabl>PERIMETER</attrlabl>
<attrdef>Perimeter of feature in internal units.</attrdef>
<attrdefs>ESRI</attrdefs>
<attrdomv>
<udom>Positive real numbers that are automatically generated.</udom>
</attrdomv>
</attr>
<attr>
<attrlabl>LC#</attrlabl>
<attrdef>Internal feature number.</attrdef>
<attrdefs>ESRI</attrdefs>
<attrdomv>
<udom>Sequential unique whole numbers that are automatically generated.</udom>
</attrdomv>
</attr>
<attr>
<attrlabl>LC-ID</attrlabl>
<attrdef>User-defined feature number.</attrdef>
<attrdefs>ESRI</attrdefs>
</attr>
<attr>
<attrlabl>LCAREA.AFT_ID</attrlabl>
</attr>
<attr>
<attrlabl>LCPYTYPE</attrlabl>
<attrdef>Land cover poygon type</attrdef>
<attrdefs>NIMA</attrdefs>
<attrdomv>
<edom>
<edomv>1</edomv>
<edomvd>Rice Field</edomvd>
</edom>
<edom>
<edomv>2</edomv>
<edomvd>Cranberry bog</edomvd>
</edom>
<edom>
<edomv>3</edomv>
<edomvd>Cultivated area, garden</edomvd>
</edom>
<edom>
<edomv>4</edomv>
<edomvd>Peat cuttings</edomvd>
</edom>
<edom>
<edomv>5</edomv>
<edomvd>Salt pan</edomvd>
</edom>
<edom>
<edomv>6</edomv>
<edomvd>Fish pond or hatchery</edomvd>
</edom>
<edom>
<edomv>7</edomv>
<edomvd>Quarry, strip mine, mine dump, blasting area</edomvd>
</edom>
<edom>
<edomv>8</edomv>
<edomvd>Oil or gas</edomvd>
</edom>
<edom>
<edomv>10</edomv>
<edomvd>Lava flow</edomvd>
</edom>
<edom>
<edomv>11</edomv>
<edomvd>Distorted surface area</edomvd>
</edom>
<edom>
<edomv>12</edomv>
<edomvd>Unconsolidated material (sand or gravel, glacial moraine)</edomvd>
</edom>
<edom>
<edomv>13</edomv>
<edomvd>Natural landmark area</edomvd>
</edom>
<edom>
<edomv>14</edomv>
<edomvd>Inundated area</edomvd>
</edom>
<edom>
<edomv>15</edomv>
<edomvd>Undifferentiated wetlands</edomvd>
</edom>
<edom>
<edomv>99</edomv>
<edomvd>None</edomvd>
</edom>
</attrdomv>
</attr>
<attr>
<attrlabl>TILE_ID</attrlabl>
<attrdef>VPF Format tile ID</attrdef>
<attrdefs>NIMA</attrdefs>
</attr>
<attr>
<attrlabl>FAC_ID</attrlabl>
</attr>
</detailed>
<overview>
<eaover>The DCW used a product-specific attribute coding system that is composed of TYPE and STATUS designators for area, line, and point features; and LEVEL and SYMBOL designators for text features. The TYPE attribute specifies what the feature is, while the STATUS attribute specifies the current condition of the feature. Some features require both a TYPE and STATUS code to uniquely identify their characteristics. In order to uniquely identify each geographic attribute in the DCW, the TYPE and STATUS attribute code names are preceded by the two letter coverage abbreviation and a two letter abbreviation for the type of graphic primitive present. The DCW Type/Status codes were mapped into the FACC coding scheme. A full description of FACC may be found in Digital Geographic Information Exchange Standard Edition 1.2, January 1994.</eaover>
<eadetcit>Entities (features) and Attributes for DCW are fully described in: Department of Defense, 1992, Military Specification Digital Chart of the World (MIL-D-89009): Philadelphia, Department of Defense, Defense Printing Service Detachment Office.</eadetcit>
</overview>
</eainfo>
<distinfo>
<distrib>
<cntinfo>
<cntorgp>
<cntorg>NIMA</cntorg>
</cntorgp>
<cntpos>ATTN: CC, MS D-16</cntpos>
<cntaddr>
<addrtype>mailing and physical address</addrtype>
<address>6001 MacArthur Blvd.</address>
<city>Bethesda</city>
<state>MD</state>
<postal>20816-5001</postal>
<country>US</country>
</cntaddr>
<cntvoice>301-227-2495</cntvoice>
<cntfax>301-227-2498</cntfax>
</cntinfo>
</distrib>
<distliab>None</distliab>
<stdorder>
<digform>
<digtinfo>
<formname>VPF</formname>
<formverd>19930930</formverd>
<formspec>Military Standard Vector Product Format (MIL-STD-2407). The current version of this document is dated 28 June 1996. Edition 3 of VMap-0 conforms to a previous version of the VPF Standard as noted. Future versions of VMap-0 will conform to the current version of VPF Standard.</formspec>
<transize>0.172</transize>
</digtinfo>
<digtopt>
<offoptn>
<offmedia>CD-ROM</offmedia>
<recfmt>ISO 9660</recfmt>
</offoptn>
</digtopt>
</digform>
<fees>Not Applicable</fees>
</stdorder>
</distinfo>
<distinfo>
<distrib>
<cntinfo>
<cntorgp>
<cntorg>USGS Map Sales</cntorg>
</cntorgp>
<cntaddr>
<addrtype>mailing address</addrtype>
<address>Box 25286</address>
<city>Denver</city>
<state>CO</state>
<postal>80225</postal>
<country>US</country>
</cntaddr>
<cntvoice>303-236-7477</cntvoice>
<cntfax>303-236-1972</cntfax>
</cntinfo>
</distrib>
<distliab>None</distliab>
<stdorder>
<digform>
<digtinfo>
<transize>0.172</transize>
</digtinfo>
</digform>
<fees>$82.50 per four disk set</fees>
<ordering>For General Public: Payment (check, money order, purchase order, or Government account) must accompany order.
Make all drafts payable to Dept. of the Interior- US Geological Survey.
To provide a general idea of content, a sample data set is available from the TMPO Home Page at:</ordering>
</stdorder>
</distinfo>
<distinfo>
<distrib>
<cntinfo>
<cntorgp>
<cntorg>Geodesy Team, Harvard University</cntorg>
<cntper>Geodesy Team</cntper>
</cntorgp>
<cntemail>[email protected]</cntemail>
</cntinfo>
</distrib>
<resdesc>Geodesy layer</resdesc>
<distliab>None</distliab>
<stdorder>
<digform>
<digtinfo>
<formname>SHP</formname>
<transize>0.172</transize>
</digtinfo>
<digtopt>
<onlinopt>
<computer>
<networka>
<networkr>geodesy.harvard.edu</networkr>
</networka>
</computer>
</onlinopt>
</digtopt>
</digform>
<fees>none</fees>
</stdorder>
<availabl>
<timeinfo>
<sngdate>
<caldate>1992</caldate>
</sngdate>
</timeinfo>
</availabl>
</distinfo>
<metainfo>
<metd>20010226</metd>
<metc>
<cntinfo>
<cntorgp>
<cntorg>Geodesy Team</cntorg>
<cntper>REQUIRED: The person responsible for the metadata information.</cntper>
</cntorgp>
<cntvoice>REQUIRED: The telephone number by which individuals can speak to the organization or individual.</cntvoice>
<cntemail>[email protected]</cntemail>
</cntinfo>
</metc>
<metstdn>FGDC Content Standards for Digital Geospatial Metadata</metstdn>
<metstdv>FGDC-STD-001-1998</metstdv>
<mettc>local time</mettc>
<metextns>
<onlink>http://www.esri.com/metadata/esriprof80.html</onlink>
<metprof>ESRI Metadata Profile</metprof>
</metextns>
</metainfo>
</metadata>

Have you tried the directory and bfile methods? Here is the example for that in the Oracle XML Developer's Guide:
CREATE DIRECTORY xmldir AS 'path_to_folder_containing_XML_file';
Example 3-3 Inserting XML Content into an XMLType Table
INSERT INTO mytable2 VALUES (XMLType(bfilename('XMLDIR', 'purchaseOrder.xml'),
nls_charset_id('AL32UTF8')));
1 row created.
The value passed to nls_charset_id() indicates that the encoding for the file to be read is UTF-8.
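Adapted to the table in the question, a hedged sketch (assuming the document above is saved under a name like cities_metadata.xml, chosen here for illustration, in the folder the directory object points at):
INSERT INTO xmltable_1 (doc_id, xml_data)
  VALUES (1, XMLType(bfilename('XMLDIR', 'cities_metadata.xml'),
                     nls_charset_id('AL32UTF8')));
Because the file is read server-side as a BFILE, the usual 4K/32K string-literal limits never come into play.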
ben

Similar Messages

  • How to insert 4K of XML data into XMLType column?

    I use OCCI and our Oracle version is 9.2.0.1.0. I have successfully been able to insert XML data (of any size) into a CLOB column using "insert into...values(.., empty_clob()) returning c1 into :3" and then using Statement::GetClob() to acquire a reference to the internal CLOB and populate it. I cannot seem to do the same when the column is of type XMLType.
    I could not find a single code sample that demonstrates inserting into a table with an XMLType column. Using SetDataBuffer(OCCI_SQLT_STR) with data over 4000 bytes does not work.
    I'd greatly appreciate any feedback.

    Pretty sure this was a bug in the base 9.2 release which was fixed in a patch drop. Try 9.2.0.6 or later.
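    Until a patched release is available, one workaround in the spirit of the CLOB path that already works for the poster is to keep binding a CLOB and let the server do the conversion inside an anonymous block; a hedged sketch against a hypothetical table:
    -- :1 is the CLOB bound from the client (the poster's working approach);
    -- the XMLType() constructor performs the conversion inside the database.
    BEGIN
      INSERT INTO my_xml_tab (c1) VALUES (XMLType(:1));
    END;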

  • Inserting large xml data into xmltype

    Hi all,
    In my project I need to insert very large XML data into an XMLType column.
    My table:
    CREATE TABLE TransDetailstblCLOB ( id number, data_xml XMLType) XmlType data_xml STORE AS CLOB;
    I am using a JDBC approach to insert values. It works fine for data of less than 4000 bytes when using preparedStatement.setString(1, xmlData). As I have to insert large XML data (>4000 bytes), I am now using the preparedStatement.setClob() methods.
    My code works fine for a table whose column is declared explicitly as CLOB. But for TransDetailstblCLOB, where the column is declared as XMLTYPE with CLOB storage, I am getting the error: "ORA-01461: can bind a LONG value only for insert into a LONG column".
    This error means there is a mismatch between my setClob() and the column, which suggests I am not actually storing into a CLOB column.
    I read in Oracle site that
    "When you create an XMLType column without any XML schema specification, a hidden CLOB column is automatically created to store the XML data. The XMLType column itself becomes a virtual column over this hidden CLOB column. It is not possible to directly access the CLOB column; however, you can set the storage characteristics for the column using the XMLType storage clause."
    I don't understand: it's stated here that there is a hidden CLOB column, so why can't I use setClob()? It worked fine for a pure CLOB column (another table), so why does it give such an error for the XMLType table?
    I have been stuck on this for 3 days. Can anyone help me, please?
    My code snippet:
    query = "INSERT INTO po_xml_tab VALUES (?,XMLType(?)) ";
              //query = "INSERT INTO test VALUES (?,?) ";
         // Get the statement Object
         pstmt =(OraclePreparedStatement) conn.prepareStatement(query);
         // pstmt = conn.prepareStatement(query);
         //xmlData="test";
    //      If the temporary CLOB has not yet been created, create new
         temporaryClob = oracle.sql.CLOB.createTemporary(conn, true, CLOB.DURATION_SESSION);
         // Open the temporary CLOB in readwrite mode to enable writing
         temporaryClob.open(CLOB.MODE_READWRITE);
         log.debug("tempClob opened"+"size bef writing data"+"length "+temporaryClob.getLength()+
                   "buffer size "+temporaryClob.getBufferSize()+"chunk size "+temporaryClob.getChunkSize());
         OutputStream out = temporaryClob.getAsciiOutputStream();
         InputStream in = new StringBufferInputStream(xmlData);
    int length = -1;
    int wrote = 0;
    int chunkSize = temporaryClob.getChunkSize();
    chunkSize=xmlData.length();
    byte[] buf = new byte[chunkSize];
    while ((length = in.read(buf)) != -1) {
    out.write(buf, 0, length);
    wrote += length;
    temporaryClob.setBytes(buf);
    log.debug("Wrote lenght"+wrote);
         // Bind this CLOB with the prepared Statement
         pstmt.setInt(1,100);
         pstmt.setStringForClob(2, xmlData);
         int i =pstmt.executeUpdate();
         if (i == 1) {
         log.debug("Record Successfully inserted!");
         }

    Try this; it works in ADOdb:
    declare
      poXML CLOB;
    BEGIN
      poXML := '<OIDS><OID>large text</OID></OIDS>';
      UPDATE a_po_xml_tab SET podoc = XMLType(poXML) WHERE poid = 102;
    END;
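    To verify the update took, a quick hedged check (the table alias is required to call the member function):
    SELECT x.podoc.getClobVal()
    FROM   a_po_xml_tab x
    WHERE  x.poid = 102;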

  • Insert an XSL document into XMLType

    Hi dear reader
    I want to insert an XSL document into an XMLType field, but the length of the XSL is 5900 characters and I have a problem loading it and executing the INSERT statement in the database.
    To work around the load problem, I use a bind variable and try to execute it with EXECUTE IMMEDIATE str,
    and I get the following error:
    ORA-01704: string literal too long
    Please advise.
    thanks

    Create a directory object DIR_TEMP in the database pointing to the physical path where your XML files are available.
    Note that the path must be on the database server, not the client, and the files themselves must also reside on the database server, not the client, for this to work.
    create table xml_table (id number, xml_data xmltype)
    CREATE OR REPLACE FUNCTION getClobDocument
    ( filename  in varchar2,
      charset   in varchar2 default NULL,
      dir       IN VARCHAR2 default 'DIR_TEMP'
    ) RETURN CLOB deterministic is
      file          bfile := bfilename(dir,filename);
      charContent   CLOB := ' ';
      targetFile    bfile;
      lang_ctx      number := DBMS_LOB.default_lang_ctx;
      charset_id    number := 0;
      src_offset    number := 1 ;
      dst_offset    number := 1 ;
      warning       number;
    BEGIN
      IF charset is not null then
          charset_id := NLS_CHARSET_ID(charset);
      end if;
      targetFile := file;
      DBMS_LOB.fileopen(targetFile, DBMS_LOB.file_readonly);
      DBMS_LOB.LOADCLOBFROMFILE(charContent, targetFile,
      DBMS_LOB.getLength(targetFile), src_offset, dst_offset,
      charset_id, lang_ctx,warning);
      DBMS_LOB.fileclose(targetFile);
      return charContent;
    end;
    insert into xml_table values(1, XMLTYPE(GetClobDocument('purchaseorder.xml')))
    /
    The function GetClobDocument returns the content of the XML file as a CLOB, which is converted to XMLType before the insert.
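    A hedged sketch of the directory setup the function assumes (the OS path is illustrative and must exist on the database server; run as a suitably privileged user):
    CREATE OR REPLACE DIRECTORY DIR_TEMP AS '/u01/app/xml_files';
    GRANT READ ON DIRECTORY DIR_TEMP TO loading_user;  -- hypothetical grantee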

  • How to insert document into xmltype column through an http post request with perl

    Oracle 11.2.0.3
    Windows server 2008r2
    Apache tomcat 7.0
    Oracle APEX 4.2.1
    Oracle APEX Listener 2.0
    I would like to insert an XML document into the database through an APEX RESTful web service. The POST to the web service is done with Perl. The following code inserts an empty record into a table with a column of XMLType.
    Perl Code
    use strict;
    use warnings;
    use LWP::UserAgent;
    use HTTP::Headers;
    my $headers = HTTP::Headers->new();
    my $url = "http://host:port/apex/<application_workspace>/<restful service module>/<uri template>/";
    my $sendthis = '<?xml version="1.0" encoding="utf-8"?>
    <students>
    <row>
           <name>Mark</name>
          <age>30</age>
    </row>
    </students>';
    $headers -> header('Content-Type' => 'text/xml; charset=utf-8');
    my $request = HTTP::Request->new('POST', $url, $headers, $sendthis);
    $request-> protocol('HTTP/1.1');
    my $browser = LWP::UserAgent->new();
    my $response = $browser->request($request);
    my $gotthis= $response->content();
    my $the_file_data = $response->content();
    APEX restful service
    Method: POST
    Source type: PL/SQL
    MIME Types allowed: blank
    require secure access: none
    source:
    declare
      doc varchar2(32000);
    begin
      insert into <table_name> (<column_name>)
      values (doc);
      commit;
    end;
    Table code
    create table <tablename>
    (<column_name> XMLType);
    The above code inserts a row with an empty XML column into the table.
    Any ideas why?

    It's a really bad idea to assemble XML using strings and string concatenation in SQL or PL/SQL. First, there is a 4K limit in SQL and a 32K limit in PL/SQL, which means you end up constructing the XML in chunks, adding unnecessary complications. Second, you cannot confirm the XML is valid or well formed using external tools.
    IMHO it makes much more sense to keep the XML content separate from the SQL / PL/SQL code.
    When the XML can be stored on a file system accessible from the database, the files can be loaded into the database using mechanisms like BFILE.
    In cases where the XML must be staged on a remote file system, the files can be loaded into the database using FTP or HTTP, and in cases where this is not an option, SQLLDR.
    -Mark
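    As to why the row arrives empty: the handler above declares doc but never assigns the request body to it. A hedged sketch of a handler that actually reads the body (:body is the bind the APEX Listener exposes for the raw request body; the table and column names are hypothetical):
    declare
      l_doc xmltype;
    begin
      -- :body arrives as a BLOB; convert it, stating the character set explicitly
      l_doc := xmltype(:body, nls_charset_id('AL32UTF8'));
      insert into students_tbl (xml_data) values (l_doc);
      commit;
    end;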

  • Exception when inserting an XML document into the database

    Hello.
    I'm trying to insert the content of an XML document into a database. The problem is the exception that is raised when an FK value in the XML is not present in the database as a PK.
    How can I catch these exceptions and continue inserting the rest of the data?
    Thanks in advance ...

    I really hope for your sake this wasn't a production database. As mentioned in the URL, I fiddled around with this some years ago, and it was once even a supported Metalink workaround (back in my Oracle 7 days); but now I wouldn't be surprised if you have mixed NLS content in your tables/database and have more or less "corrupted" your content.
    The initial issue was probably a client one.
    BTW
    I actually hope it was a production database, maybe then you would learn that way, that you SHOULD NEVER EVER UPDATE DATABASE BASE TABLES directly...
    ...unless (the only exception) you want to learn something about the internals and don't care that your TEST database gets corrupted....
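    Coming back to the question that was actually asked: one standard way (10gR2 onwards) to skip rows that fail the FK check and keep loading is DML error logging. A hedged sketch with hypothetical table names:
    -- create a companion error table once...
    BEGIN
      DBMS_ERRLOG.create_error_log(dml_table_name => 'CHILD_TABLE');
    END;
    /
    -- ...then let the insert continue past constraint violations,
    -- logging the offending rows instead of raising an exception
    INSERT INTO child_table (id, parent_id, payload)
    SELECT id, parent_id, payload FROM staged_xml_rows
    LOG ERRORS INTO err$_child_table ('xml load') REJECT LIMIT UNLIMITED;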

  • Java5 Document into XMLType

    Hi,
    I had a look at the example on OTN and I found how to get an XMLType into a Java 5 Document.
    I want to do exactly the contrary.
    So far, I transform my XML Document into a string
    (The XML data come from a file and are parsed with a JAXP DOM Parser, validated with an XSD, and then quite a lot of operations are performed on the document).
    Then I create an XMLType using that string, to finally insert it into the database.
    I'd like to go faster and to initialize my XMLType directly with the DOM Document
    (to avoid any transformation and any additional parsing from Oracle XMLType)
    So,
    Document ==> XMLType, insert
    should be faster than
    Document ==> String , String ==> XMLType, insert
    Any idea?
    Thanks

    Create a directory object DIR_TEMP in the database pointing to the physical path where your XML files are available.
    Note that the path must be on the database server, not the client, and the files themselves must also reside on the database server, not the client, for this to work.
    create table xml_table (id number, xml_data xmltype)
    CREATE OR REPLACE FUNCTION getClobDocument
    ( filename  in varchar2,
      charset   in varchar2 default NULL,
      dir       IN VARCHAR2 default 'DIR_TEMP'
    ) RETURN CLOB deterministic is
      file          bfile := bfilename(dir,filename);
      charContent   CLOB := ' ';
      targetFile    bfile;
      lang_ctx      number := DBMS_LOB.default_lang_ctx;
      charset_id    number := 0;
      src_offset    number := 1 ;
      dst_offset    number := 1 ;
      warning       number;
    BEGIN
      IF charset is not null then
          charset_id := NLS_CHARSET_ID(charset);
      end if;
      targetFile := file;
      DBMS_LOB.fileopen(targetFile, DBMS_LOB.file_readonly);
      DBMS_LOB.LOADCLOBFROMFILE(charContent, targetFile,
      DBMS_LOB.getLength(targetFile), src_offset, dst_offset,
      charset_id, lang_ctx,warning);
      DBMS_LOB.fileclose(targetFile);
      return charContent;
    end;
    insert into xml_table values(1, XMLTYPE(GetClobDocument('purchaseorder.xml')))
    /
    The function GetClobDocument returns the content of the XML file as a CLOB, which is converted to XMLType before the insert.

  • To load an XML document into 40 tables

    How do I load a large XML document into 40 tables? Most of the examples I see load only one table into the Oracle database.

    From the above document:
    Storing XML Data Across Tables
    Question
    Can the XML-SQL Utility store XML data across tables?
    Answer
    Currently the XML-SQL Utility (XSU) can only store to a single table. It maps a canonical representation of an XML document into any table/view. But of course there is a way to store XML across tables with the XSU. One can use XSLT to transform a document into multiple documents and insert them separately. Another way is to define views over multiple tables (object views if needed) and then do the inserts ... into the view. If the view is inherently non-updatable (because of complex joins, ...), then one can use INSTEAD-OF triggers over the views to do the inserts.
    -- I've tried this, works fine. A sketch of the view-plus-trigger route follows.
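    For the view-based route, a minimal sketch (hypothetical purchase-order header and line tables) of a join view with an INSTEAD OF trigger that fans each canonical row out to both tables:
    CREATE OR REPLACE VIEW po_view AS
      SELECT h.po_id, h.po_date, l.line_no, l.item
      FROM   po_header h JOIN po_line l ON l.po_id = h.po_id;
    CREATE OR REPLACE TRIGGER po_view_ins
      INSTEAD OF INSERT ON po_view
      FOR EACH ROW
    BEGIN
      -- insert the parent only once per document
      MERGE INTO po_header h
      USING dual ON (h.po_id = :new.po_id)
      WHEN NOT MATCHED THEN
        INSERT (po_id, po_date) VALUES (:new.po_id, :new.po_date);
      INSERT INTO po_line (po_id, line_no, item)
      VALUES (:new.po_id, :new.line_no, :new.item);
    END;
    /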

  • Can I append an XML Document into another?

    I have a large XML document which I open into ColdFusion (MX 7), and then a smaller group of files that are opened into their own XML objects. After some customization of XML text values, I want to append the smaller XML documents into the larger document at specific positions within the document tree.
    I have the code that I thought would do this, using an arrayappend function, but what happens is that only the root element of the smaller (inserted) XML document is placed into the main document.
    Is this possible, or do I need to modify the code so that the entire tree of XML data that I am appending is created on the fly?

    Personally I do this with XSLT.
    <cfxml variable="doc1">
    <root>
    <member id="10">
    <name>Ian</name>
    <sex>male</sex>
    </member>
    </root>
    </cfxml>
    <cfxml variable="doc2">
    <root>
    <member id="1">
    <name>Joe</name>
    <sex>male</sex>
    </member>
    <member id="2">
    <name>John</name>
    <sex>male</sex>
    </member>
    <member id="3">
    <name>Sue</name>
    <sex>female</sex>
    </member>
    </root>
    </cfxml>
    <cffile action="write" file="#expandPath("/")#\doc2.xml"
    output="#ToString(doc2)#" nameconflict="overwrite">
    <!--- METHOD ONE using a depreciated and undocumented
    method --->
    <cfset newNode = doc1.root.member.cloneNode(true)>
    <cfset doc2.changeNodeOwner(newNode)>
    <cfset doc2.root.appendChild(newNode)>
    <cfdump var="#doc2#" label="Merged by hidden functions"
    expand="no">
    <!--- METHOD TWO using XSLT --->
    <!--- create an xsl docutment--->
    <cfoutput>
    <cfsavecontent variable="transformer">
    <xsl:stylesheet xmlns:xsl="
    http://www.w3.org/1999/XSL/Transform"
    version="1.0">
    <xsl:output method="xml"/>
    <!--load the merge file -->
    <xsl:variable name="emps"
    select="document('#expandPath("/")#\doc2.xml')"/>
    <!--- this line references and XML file to be combined
    with the main file,
    I wonder if there is a way to use an XML document in memory
    here? --->
    <!-- combine the files -->
    <xsl:template match="/">
    <root>
    <!-- select all the child nodes of the root tag in the
    main file -->
    <xsl:for-each select="root/child::*">
    <!-- copy the member tag -->
    <xsl:copy-of select="."/>
    </xsl:for-each>
    <!-- and all the child nodes of the root tag in the merge
    file -->
    <xsl:for-each select="$emps/root/child::*">
    <!-- copy the member tag -->
    <xsl:copy-of select="."/>
    </xsl:for-each>
    </root>
    </xsl:template>
    </xsl:stylesheet>
    </cfsavecontent>
    </cfoutput>
    <!--- create a combined xml document --->
    <cfset doc2 = XMLparse(XMLtransform(doc1,transformer))>
    <cfdump var="#doc2#" label="Merged by XSLT"
    expand="no">

  • Got error message when storing XML documents into XML DB repository, via WebDAV

    Hi experts,
    I am in Oracle Enterprise Manager 11g 11.2.0.1.0.
    SQL*Plus: Release 11.2.0.1.0 Production on Tue Feb 22 11:40:23 2011
    I got an error message when storing XML documents into the XML DB repository via WebDAV.
    I have successfully registered 5 related schemas and generated 1 table.
    I have inserted 40 .xml files into this auto-generated table.
    Using this data I created a relational view successfully.
    But since I couldn't store the XML documents into the XML DB repository via WebDAV, when I query using the code below:
    SELECT rv.res.getClobVal()
    FROM resource_view rv
    WHERE rv.any_path = '/home/DEV/messages/4fe1-865d-da0db9212f34.xml';
    I got nothing.
    My ftp code is listed below:
    ftp> open localhost 2100
    Connected to I0025B368E2F9.
    220- C0025B368E2F9
    Unauthorised use of this FTP server is prohibited and may be subject to civil and criminal prosecution.
    220 I0025B368E2F9 FTP Server (Oracle XML DB/Oracle Database) ready.
    User (I0025B368E2F9:(none)): fda_xml
    331 pass required for FDA_XML
    Password:
    230 FDA_XML logged in
    ftp> cd /home/DEV/message
    250 CWD Command successful
    ftp> pwd
    257 "/home/DEV/message" is current directory.
    ftp> ls -la
    200 PORT Command successful
    150 ASCII Data Connection
    drw-r--r-- 2 FDA_XML oracle 0 DEC 17 19:19 .
    drw-r--r-- 2 FDA_XML oracle 0 DEC 17 19:19 ..
    226 ASCII Transfer Complete
    ftp: 115 bytes received in 0.00Seconds 115000.00Kbytes/sec.
    250 SET_CHARSET Command Successful
    ftp> put C:\ED\SPL\E_Reon_Data\loaded\4fe1-865d-da0db9212f34.xml
    200 PORT Command successful
    150 ASCII Data Connection
    550- Error Response
    ORA-00600: internal error code, arguments: [qmxConvUnkType], [], [], [], [], [], [], [], [], [], [], []
    550 End Error Response
    ftp: 3394 bytes sent in 0.00Seconds 3394000.00Kbytes/sec.
    I have tried all suggestions from another thread, such as:
    alter system set events ='31150 trace name context forever, level 0x4000'
    SQL> alter system set shared_servers = 1;
    but they failed.
    Is there anyone who can help?
    Thanks.
    Edited by: Cow on Mar 29, 2011 12:58 AM

  • XML document into multiple tables

    How do I insert an XML document into multiple tables? E.g., a purchase order with multiple line items: I have to insert the XML document into the parent as well as the child table, with different sets of columns.

    I created the tables using the create_ch14_tables.sql. I call it using java -classpath .;C:\commerceone\xpc\lib\xmlparserv2.jar;C:\commerceone\xpc\lib\classes12.zip;C:\commerceone\xpc\lib\xsu12.jar XMLLoader -file deptempdepend.xml -connName default -transform deptempdepend.xsl. The code doesn't seem to like the "<xsl:for-each select="Department">" tags. If I remove them, the insDoc.getDocumentElement().getFirstChild() will find the element, but it still doesn't insert anything into the database.
    Thank You,
    Dave

  • How to convert an XML document into an Oracle temporary table

    I am creating an XML document in Java and passing this XML into the Oracle database, and I need to fetch this XML into a result set / rowset.
    Xml Structure
    <Department deptno="100">
    <DeptName>Sports</DeptName>
    <EmployeeList>
    <Employee empno="200"><Ename>John</Ename><Salary>33333</Salary>
    </Employee>
    <Employee empno="300"><Ename>Jack</Ename><Salary>333444</Salary>
    </Employee>
    </EmployeeList>
    </Department>
    I need it in this format:
    Deptno DeptName Empno Ename Salary
    100    Sports   200   John  2500
    100    Sports   300   Jack  3000

    It does depend on your version as odie suggests.
    Here's a way that will work in 10g...
    SQL> ed
    Wrote file afiedt.buf
      1  with t as (select xmltype('<Department deptno="100">
      2  <DeptName>Sports</DeptName>
      3  <EmployeeList>
      4  <Employee empno="200"><Ename>John</Ename><Salary>33333</Salary>
      5  </Employee>
      6  <Employee empno="300"><Ename>Jack</Ename><Salary>333444</Salary>
      7  </Employee>
      8  </EmployeeList>
      9  </Department>
    10  ') as xml from dual)
    11  --
    12  -- End of test data, Use query below
    13  --
    14  select x.deptno, x.deptname
    15        ,y.empno, y.ename, y.salary
    16  from t
    17      ,xmltable('/'
    18                passing t.xml
    19                columns deptno   number       path '/Department/@deptno'
    20                       ,deptname varchar2(10) path '/Department/DeptName'
    21                       ,emps     xmltype      path '/Department/EmployeeList'
    22               ) x
    23      ,xmltable('/EmployeeList/Employee'
    24                passing x.emps
    25                columns empno    number       path '/Employee/@empno'
    26                       ,ename    varchar2(10) path '/Employee/Ename'
    27                       ,salary   number       path '/Employee/Salary'
    28*              ) y
    SQL> /
        DEPTNO DEPTNAME        EMPNO ENAME          SALARY
           100 Sports            200 John            33333
           100 Sports            300 Jack           333444
    SQL>
    If the XML is a string, e.g. a CLOB, then it can easily be converted to XMLTYPE using the XMLTYPE function.
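    For example, if the document is staged in a CLOB column first (hypothetical staging table):
    -- clob_doc holds the serialized XML; XMLTYPE() parses it on the fly
    SELECT XMLTYPE(s.clob_doc) AS xml_doc
    FROM   xml_staging s;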

  • Open .xml documents into InDesign?

    Is there a way to open .xml documents into InDesign, or any Adobe program?
    The goal is to access text from a web site feed and place it in an InDesign doc, to create a print product.

    You can't open XML but you can certainly utilize it.
    Here's a good book you might want to look into: http://amzn.to/i3iWZ5
    Bob

  • Existing XML documents into KM

    Hello Everyone
    Is there a way to load existing XML documents into KM and then maintain them using XML forms built with the Form Builder? Any help is greatly appreciated.
    Thanks
    Swetha

    Hi Renuka
    You can move the XML projects from one server to another. The XML forms you create are stored under the path 'etc\xmlforms' in KM. You can export them and upload them to the target server instead of creating the same forms on the target server from scratch.
    Regards
    Geogi

  • How to FTP an XML document into an XMLType table not created by the FTP user

    Hi,
    1.
    I have a table called SNPLEX_DESIGN, which is created automatically when I register the snplex_design.xsd XML schema (Oracle creates it through the xdb:defaultTable="SNPLEX_DESIGN" attribute), and it is created under the SNPLEX user account.
    2.
    I also created a folder (resource) called /home/SNPLEX/Orders, which is used to hold all the incoming XML documents.
    3.
    I created another user account called SNPLEX_APP, which is the only user account allowed to FTP XML documents into the /home/SNPLEX/Orders folder.
    Issues:
    If I log in as the SNPLEX user, I can FTP an XML document into the folder and the TABLE (the file size = 0). But if I log in as the SNPLEX_APP account, I can only FTP the XML document into the folder; Oracle doesn't store the document in the table (because the file size shows a number).
    I have granted all the ACL privileges on the /home/SNPLEX/Orders folder to SNPLEX_APP through OEM.
    Am I missing anything? Any help will be greatly appreciated. Resolving this issue is very important to us, since we are about to roll the system into production.
    Regards,
    Jinsen

    In order for a registered schema to be available to other users, the schema must be registered as a GLOBAL rather than a LOCAL schema. This is controlled by the third argument passed to registerSchema, and the default is local. Note that you will also need to explicitly grant appropriate permissions on any tables created by the schema registration process to the other users who will be loading or reading data from those tables.
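    A hedged sketch of what that registration and grant might look like (the schema URL, repository path, and grantee are illustrative):
    BEGIN
      DBMS_XMLSCHEMA.registerSchema(
        schemaURL => 'http://example.com/snplex_design.xsd',
        schemaDoc => xdbURIType('/home/SNPLEX/snplex_design.xsd').getClob(),
        local     => FALSE,   -- GLOBAL registration, visible to other users
        genTypes  => TRUE,
        genTables => TRUE);
    END;
    /
    -- then, connected as SNPLEX:
    GRANT SELECT, INSERT ON snplex_design TO snplex_app;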
