Inserting XML document into XDB fails with can't convert to OPAQUE

Hi,
When I try to insert a document using Oracle 9.2.0.5 client software into a 9.2.0.5 database, with the following code:
void insertDocument(String document) throws SQLException {
    // Build an XMLType from the raw document text and bind it to the insert
    XMLType xt = XMLType.createXML(connection, document);
    PreparedStatement ps = connection.prepareStatement("insert into xmldocuments values(?)");
    ps.setObject(1, xt);
    ps.executeUpdate();
}
The setObject call always throws an exception (after a very long time) with:
java.sql.SQLException: Fail to convert to internal representation: OPAQUE()
This also fails when we use the InputStream and org.w3c.dom.Document variants.
We use the OCI driver; otherwise we get errors retrieving the documents from XML DB.
What is going wrong here?

David,
If you search through the historical data on this list, there are previous posts regarding OPAQUE.
These may be useful. Possibly you're reaching a size limit.
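A minimal sketch of one commonly suggested workaround, assuming the failure comes from the client-side conversion of the oracle.xdb.XMLType bind: have the server construct the XMLType from a bound string, so the driver never converts an OPAQUE value. The table name xmldocuments comes from the post above; connection handling is omitted, and very large documents may still need the temporary-CLOB approach sketched further down this page.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class XmlInsert {
    // Bind the raw XML text and let the XMLType() constructor run on the server.
    static void insertDocument(Connection connection, String document) throws SQLException {
        PreparedStatement ps = connection.prepareStatement(
                "insert into xmldocuments values (XMLType(?))");
        try {
            ps.setString(1, document);
            ps.executeUpdate();
        } finally {
            ps.close();
        }
    }
}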

Similar Messages

  • XML document into multiple tables

    How do I insert an XML document into multiple tables? E.g., a purchase order with multiple line items. I have to insert the XML document into the parent as well as the child tables, each with a different set of columns.

    I created the tables using the create_ch14_tables.sql. I call it using java -classpath .;C:\commerceone\xpc\lib\xmlparserv2.jar;C:\commerceone\xpc\lib\classes12.zip;C:\commerceone\xpc\lib\xsu12.jar XMLLoader -file deptempdepend.xml -connName default -transform deptempdepend.xsl. The code doesn't seem to like the "<xsl:for-each select="Department">" tags. If I remove them, the insDoc.getDocumentElement().getFirstChild() will find the element, but it still doesn't insert anything into the database.
    Thank You,
    Dave

  • Inserting a long XML document into XMLType

    I'm trying to load the following document into an XMLType column in 10.2. I've tried every example I can find and can push the data into CLOBs using the Java workaround just fine (http://www.oracle.com/technology/sample_code/tech/java/codesnippet/xmldb/HowToLoadLargeXML.html).
    Can anyone provide a solution or let me know if there is a limitation please?
    Given the table:
    SQL> describe xmltable_1
    Name                          Null?    Type
    DOC_ID                                 NUMBER
    XML_DATA                               XMLTYPE
    How do I load this data into 'XML_DATA'?
    <?xml version="1.0" encoding="UTF-8"?>
    <metadata>
    <idinfo>
    <citation>
    <citeinfo>
    <origin>Rand McNally and ESRI</origin>
    <pubdate>1996</pubdate>
    <title>ESRI Cities Geodata Set</title>
    <geoform>vector digital data</geoform>
    <onlink>\\OIS23\C$\Files\Working\Metadata\world\cities.shp</onlink>
    </citeinfo>
    </citation>
    <descript>
    <abstract>World Cities contains locations of major cities around the world. The cities include national capitals for each of the countries in World Countries 1998 as well as major population centers and landmark cities. World Cities was derived from ESRI's ArcWorld database and supplemented with other data from the Rand McNally New International Atlas</abstract>
    <purpose>606 points, 4 descriptive fields. Describes major world cities.</purpose>
    </descript>
    <timeperd>
    <timeinfo>
    <sngdate>
    <caldate>1996</caldate>
    </sngdate>
    </timeinfo>
    <current>publication date</current>
    </timeperd>
    <status>
    <progress>Complete</progress>
    <update>None planned</update>
    </status>
    <spdom>
    <bounding>
    <westbc>
    -165.270004</westbc>
    <eastbc>
    177.130188</eastbc>
    <northbc>
    78.199997</northbc>
    <southbc>
    -53.150002</southbc>
    </bounding>
    </spdom>
    <keywords>
    <theme>
    <themekt>city</themekt>
    <themekey>cities</themekey>
    </theme>
    </keywords>
    <accconst>none</accconst>
    <useconst>none</useconst>
    <ptcontac>
    <cntinfo>
    <cntperp>
    <cntper>unknown</cntper>
    <cntorg>unknown</cntorg>
    </cntperp>
    <cntpos>unknown</cntpos>
    <cntvoice>555-1212</cntvoice>
    </cntinfo>
    </ptcontac>
    <datacred>ESRI</datacred>
    <native>Microsoft Windows NT Version 4.0 (Build 1381) Service Pack 6; ESRI ArcCatalog 8.1.0.570</native>
    </idinfo>
    <dataqual>
    <attracc>
    <attraccr>no report available</attraccr>
    <qattracc>
    <attraccv>1000000</attraccv>
    <attracce>no report available</attracce>
    </qattracc>
    </attracc>
    <logic>no report available</logic>
    <complete>no report available</complete>
    <posacc>
    <horizpa>
    <horizpar>no report available</horizpar>
    </horizpa>
    <vertacc>
    <vertaccr>no report available</vertaccr>
    </vertacc>
    </posacc>
    <lineage>
    <srcinfo>
    <srccite>
    <citeinfo>
    <title>ESRI</title>
    </citeinfo>
    </srccite>
    <srcscale>20000000</srcscale>
    <typesrc>CD-ROM</typesrc>
    <srctime>
    <timeinfo>
    <sngdate>
    <caldate>1996</caldate>
    </sngdate>
    </timeinfo>
    <srccurr>publication date</srccurr>
    </srctime>
    <srccontr>no report available</srccontr>
    </srcinfo>
    <procstep>
    <procdesc>no report available</procdesc>
    <procdate>Unknown</procdate>
    </procstep>
    </lineage>
    </dataqual>
    <spdoinfo>
    <direct>Vector</direct>
    <ptvctinf>
    <sdtsterm>
    <sdtstype>Entity point</sdtstype>
    <ptvctcnt>606</ptvctcnt>
    </sdtsterm>
    </ptvctinf>
    </spdoinfo>
    <spref>
    <horizsys>
    <geograph>
    <latres>0.000001</latres>
    <longres>0.000001</longres>
    <geogunit>Decimal degrees</geogunit>
    </geograph>
    <geodetic>
    <horizdn>North American Datum of 1927</horizdn>
    <ellips>Clarke 1866</ellips>
    <semiaxis>6378206.400000</semiaxis>
    <denflat>294.978698</denflat>
    </geodetic>
    </horizsys>
    </spref>
    <eainfo>
    <detailed>
    <enttyp>
    <enttypl>
    cities</enttypl>
    </enttyp>
    <attr>
    <attrlabl>FID</attrlabl>
    <attrdef>Internal feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Sequential unique whole numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>Shape</attrlabl>
    <attrdef>Feature geometry.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Coordinates defining the features.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>NAME</attrlabl>
    <attrdef>The city name. Spellings are based on Board of Geographic Names standards and commercial atlases.</attrdef>
    <attrdefs>ESRI</attrdefs>
    </attr>
    <attr>
    <attrlabl>COUNTRY</attrlabl>
    <attrdef>An abbreviated country name.</attrdef>
    </attr>
    <attr>
    <attrlabl>POPULATION</attrlabl>
    <attrdef>Total population for the entire metropolitan area. Values are from recent census or estimates.</attrdef>
    </attr>
    <attr>
    <attrlabl>CAPITAL</attrlabl>
    <attrdef>Indicates whether a city is a national capital (Y/N).</attrdef>
    </attr>
    </detailed>
    <overview>
    <eaover>none</eaover>
    <eadetcit>none</eadetcit>
    </overview>
    </eainfo>
    <distinfo>
    <stdorder>
    <digform>
    <digtinfo>
    <transize>0.080</transize>
    </digtinfo>
    </digform>
    </stdorder>
    </distinfo>
    <metainfo>
    <metd>20010509</metd>
    <metc>
    <cntinfo>
    <cntorgp>
    <cntorg>ESRI</cntorg>
    <cntper>unknown</cntper>
    </cntorgp>
    <cntaddr>
    <addrtype>unknown</addrtype>
    <city>unknown</city>
    <state>unknown</state>
    <postal>00000</postal>
    </cntaddr>
    <cntvoice>555-1212</cntvoice>
    </cntinfo>
    </metc>
    <metstdn>FGDC Content Standards for Digital Geospatial Metadata</metstdn>
    <metstdv>FGDC-STD-001-1998</metstdv>
    <mettc>local time</mettc>
    <metextns>
    <onlink>http://www.esri.com/metadata/esriprof80.html</onlink>
    <metprof>ESRI Metadata Profile</metprof>
    </metextns>
    </metainfo>
    </metadata>
    rtacce>Vertical Positional Accuracy is expressed in meters. Vertical accuracy figures were developed by comparing elevation contour locations on 1:24,000 scale maps to elevation values at the same location within the digital database. Some manual interpolation was necessary to complete this test. The analysis results are expressed as linear error at a 90% confidence interval.</vertacce>
    </qvertpa>
    </vertacc>
    </posacc>
    <lineage>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>National Imagery and Mapping Agency</origin>
    <pubdate>1994</pubdate>
    <title>Operational Navigational Chart</title>
    <geoform>map</geoform>
    <pubinfo>
    <pubplace>St.Louis, MO</pubplace>
    <publish>National Imagery and Mapping Agency</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <srcscale>1000000</srcscale>
    <typesrc>stable-base material</typesrc>
    <srctime>
    <timeinfo>
    <rngdates>
    <begdate>1974</begdate>
    <enddate>1994</enddate>
    </rngdates>
    </timeinfo>
    <srccurr>Publication dates</srccurr>
    </srctime>
    <srccitea>ONC</srccitea>
    <srccontr>All information found on the source with the exception of aeronautical data</srccontr>
    </srcinfo>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>National Imagery and Mapping Agency</origin>
    <pubdate>199406</pubdate>
    <title>Digital Aeronautical Flight Information File</title>
    <geoform>model</geoform>
    <pubinfo>
    <pubplace>St. Louis, MO</pubplace>
    <publish>National Imagery and Mapping Agency</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <typesrc>magnetic tape</typesrc>
    <srctime>
    <timeinfo>
    <sngdate>
    <caldate>1994</caldate>
    </sngdate>
    </timeinfo>
    <srccurr>Publication date</srccurr>
    </srctime>
    <srccitea>DAFIF</srccitea>
    <srccontr>Airport records (name, International Civil Aviation Organization, position, elevation, and type)</srccontr>
    </srcinfo>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>Defense Mapping Agency</origin>
    <pubdate>1994</pubdate>
    <title>Jet Navigational Chart</title>
    <geoform>map</geoform>
    <pubinfo>
    <pubplace>St.Louis, MO</pubplace>
    <publish>Defense Mapping Agency</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <srcscale>2,000,000</srcscale>
    <typesrc>stable-base material</typesrc>
    <srctime>
    <timeinfo>
    <rngdates>
    <begdate>1974</begdate>
    <enddate>1991</enddate>
    </rngdates>
    </timeinfo>
    <srccurr>Publication date</srccurr>
    </srctime>
    <srccitea>JNC</srccitea>
    <srccontr>All information found on the source with the exception of aeronautical data. JNCs were used as source for the Antartica region only.</srccontr>
    </srcinfo>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>USGS EROS Data Center</origin>
    <pubdate></pubdate>
    <title>Advance Very High Resolution Radiometer</title>
    <geoform>remote-sensing image</geoform>
    <pubinfo>
    <pubplace>Sioux Falls, SD</pubplace>
    <publish>EROS Data Center</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <srcscale>1000000</srcscale>
    <typesrc>magnetic tape</typesrc>
    <srctime>
    <timeinfo>
    <rngdates>
    <begdate>199003</begdate>
    <enddate>199011</enddate>
    </rngdates>
    </timeinfo>
    <srccurr>Publication date</srccurr>
    </srctime>
    <srccitea>AVHRR</srccitea>
    <srccontr>6 vegetation types covering the continental US and Canada</srccontr>
    </srcinfo>
    <procstep>
    <procdesc>For the first edition DCW, stable-based positives were produced from the original reproduction negatives (up to 35 per ONC sheet). These were digitized either through a scanning-raster to vector conversion or hand digitized into vector form. The vector data was then tagged with attribute information using ARC-INFO software. Transformation to geographic coordinates was performed using the projection graticules for each sheet. Digital information was edge matched between sheets to create large regional datasets. These were then subdivided into 5 x 5 degree tiles and converted from ARC/INFO to VPF. The data was then pre-mastered for CD-ROM. QC was performed by a separate group for each step in the production process.</procdesc>
    <procdate>199112</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Environmental Systems Research Institute</cntorg>
    </cntorgp>
    <cntpos>Applications Division</cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>380 New York St.</address>
    <city>Redlands</city>
    <state>CA</state>
    <postal>92373</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>909-793-2853</cntvoice>
    <cntfax>909-793-5953</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdate>199404</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geonex</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>8950 North 9th Ave.</address>
    <city>St. Petersburg</city>
    <state>FL</state>
    <postal>33702</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>(813)578-0100</cntvoice>
    <cntfax>(813)577-6946</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Transferred digitally directly into the VPF files.</procdesc>
    <procdate>199408</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geonex</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>8950 North 9th Ave.</address>
    <city>St. Petersburg</city>
    <state>FL</state>
    <postal>33702</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>813-578-0100</cntvoice>
    <cntfax>813-577-6946</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Stable-based positives were produced from the original reproduction negatives (up to 35 per ONC sheet). These were digitized either through a scanning-raster to vector conversion or hand digitized into vector form. The vector data was then tagged with attribute information using ARC-INFO software. Transformation to geographic coordinates was performed using the projection graticules for each sheet. Digital information was edge matched between sheets to create large regional datasets. These were then subdivided into 5 x 5 degree tiles and converted from ARC/INFO to VPF. The data was then pre-mastered for CD-ROM. QC was performed by a separate group for each step in the production process.</procdesc>
    <procdate>199112</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Environmental Systems Research Institute</cntorg>
    </cntorgp>
    <cntpos>Applications Division</cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>380 New York St.</address>
    <city>Redlands</city>
    <state>CA</state>
    <postal>92373</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>909-793-2853</cntvoice>
    <cntfax>909-793-5953</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Daily AVHRR images were averaged for two week time periods over the entire US growing season. These averaged images, their rates of change, elevation information, and other data were used to produce a single land classification image of the contental US. The VMap-0 data set extended this coverage over the Canadian land mass, however vegetation classification was further subdivided into nine vegetation types.</procdesc>
    <procdate>199402</procdate>
    <srcprod>EROS data</srcprod>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>USGS Eros Data Center</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address></address>
    <city>Sioux Falls</city>
    <state>SD</state>
    <postal></postal>
    <country>US</country>
    </cntaddr>
    <cntvoice></cntvoice>
    <cntfax></cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>The Eros data (raster files) were converted to vector polygon, splined (remove stairstepping), thinned (all ploygons under 2km2 were deleted), and tied to existing DCW polygons (water bodies, built-up areas). The resulting file was tiled and converted to a VPF Vegetation coverage for use in the DCW. All processing was performed using ARC-INFO software.</procdesc>
    <procdate>199412</procdate>
    <srcprod>VMap-0 Vegetation Coverage</srcprod>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geonex</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>8950 North 9th Ave.</address>
    <city>St. Petersburg</city>
    <state>FL</state>
    <postal>33702</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>813-578-0100</cntvoice>
    <cntfax>813-577-6946</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Data was translated from VPF format to ArcInfo Coverage format. The coverages were then loaded into a seamless ArcSDE layer.</procdesc>
    <procdate>02152001</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geodesy Team, Harvard University</cntorg>
    </cntorgp>
    <cntemail>[email protected]</cntemail>
    </cntinfo>
    </proccont>
    </procstep>
    </lineage>
    </dataqual>
    <spdoinfo>
    <direct>Vector</direct>
    <ptvctinf>
    <sdtsterm>
    <sdtstype>Complete chain</sdtstype>
    </sdtsterm>
    <sdtsterm>
    <sdtstype>Label point</sdtstype>
    </sdtsterm>
    <sdtsterm>
    <sdtstype>GT-polygon composed of chains</sdtstype>
    </sdtsterm>
    <sdtsterm>
    <sdtstype>Point</sdtstype>
    </sdtsterm>
    <vpfterm>
    <vpflevel>3</vpflevel>
    <vpfinfo>
    <vpftype>Node</vpftype>
    </vpfinfo>
    <vpfinfo>
    <vpftype>Edge</vpftype>
    </vpfinfo>
    <vpfinfo>
    <vpftype>Face</vpftype>
    </vpfinfo>
    </vpfterm>
    </ptvctinf>
    </spdoinfo>
    <spref>
    <horizsys>
    <geograph>
    <latres>0.000000</latres>
    <longres>0.000000</longres>
    <geogunit>Decimal degrees</geogunit>
    </geograph>
    <geodetic>
    <horizdn>D_WGS_1984</horizdn>
    <ellips>WGS_1984</ellips>
    <semiaxis>6378137.000000</semiaxis>
    <denflat>298.257224</denflat>
    </geodetic>
    </horizsys>
    <vertdef>
    <altsys>
    <altdatum>Mean Sea Level</altdatum>
    <altunits>1.0</altunits>
    </altsys>
    </vertdef>
    </spref>
    <eainfo>
    <detailed>
    <enttyp>
    <enttypl>
    lc.pat</enttypl>
    </enttyp>
    <attr>
    <attrlabl>FID</attrlabl>
    <attrdef>Internal feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Sequential unique whole numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>Shape</attrlabl>
    <attrdef>Feature geometry.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Coordinates defining the features.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>AREA</attrlabl>
    <attrdef>Area of feature in internal units squared.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Positive real numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>PERIMETER</attrlabl>
    <attrdef>Perimeter of feature in internal units.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Positive real numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>LC#</attrlabl>
    <attrdef>Internal feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Sequential unique whole numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>LC-ID</attrlabl>
    <attrdef>User-defined feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    </attr>
    <attr>
    <attrlabl>LCAREA.AFT_ID</attrlabl>
    </attr>
    <attr>
    <attrlabl>LCPYTYPE</attrlabl>
    <attrdef>Land cover poygon type</attrdef>
    <attrdefs>NIMA</attrdefs>
    <attrdomv>
    <edom>
    <edomv>1</edomv>
    <edomvd>Rice Field</edomvd>
    </edom>
    <edom>
    <edomv>2</edomv>
    <edomvd>Cranberry bog</edomvd>
    </edom>
    <edom>
    <edomv>3</edomv>
    <edomvd>Cultivated area, garden</edomvd>
    </edom>
    <edom>
    <edomv>4</edomv>
    <edomvd>Peat cuttings</edomvd>
    </edom>
    <edom>
    <edomv>5</edomv>
    <edomvd>Salt pan</edomvd>
    </edom>
    <edom>
    <edomv>6</edomv>
    <edomvd>Fish pond or hatchery</edomvd>
    </edom>
    <edom>
    <edomv>7</edomv>
    <edomvd>Quarry, strip mine, mine dump, blasting area</edomvd>
    </edom>
    <edom>
    <edomv>8</edomv>
    <edomvd>Oil or gas</edomvd>
    </edom>
    <edom>
    <edomv>10</edomv>
    <edomvd>Lava flow</edomvd>
    </edom>
    <edom>
    <edomv>11</edomv>
    <edomvd>Distorted surface area</edomvd>
    </edom>
    <edom>
    <edomv>12</edomv>
    <edomvd>Unconsolidated material (sand or gravel, glacial moraine)</edomvd>
    </edom>
    <edom>
    <edomv>13</edomv>
    <edomvd>Natural landmark area</edomvd>
    </edom>
    <edom>
    <edomv>14</edomv>
    <edomvd>Inundated area</edomvd>
    </edom>
    <edom>
    <edomv>15</edomv>
    <edomvd>Undifferentiated wetlands</edomvd>
    </edom>
    <edom>
    <edomv>99</edomv>
    <edomvd>None</edomvd>
    </edom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>TILE_ID</attrlabl>
    <attrdef>VPF Format tile ID</attrdef>
    <attrdefs>NIMA</attrdefs>
    </attr>
    <attr>
    <attrlabl>FAC_ID</attrlabl>
    </attr>
    </detailed>
    <overview>
    <eaover>The DCW used a product-specific attribute coding system that is composed of TYPE and STATUS designators for area, line, and point features; and LEVEL and SYMBOL designators for text features. The TYPE attribute specifies what the feature is, while the STATUS attribute specifies the current condition of the feature. Some features require both a TYPE and STATUS code to uniquely identify their characteristics. In order to uniquely identify each geographic attribute in the DCW, the TYPE and STATUS attribute code names are preceded by the two letter coverage abbreviation and a two letter abbreviation for the type of graphic primitive present. The DCW Type/Status codes were mapped into the FACC coding scheme. A full description of FACC may be found in Digital Geographic Information Exchange Standard Edition 1.2, January 1994.</eaover>
    <eadetcit>Entities (features) and Attributes for DCW are fully described in: Department of Defense, 1992, Military Specification Digital Chart of the World (MIL-D-89009): Philadelphia, Department of Defense, Defense Printing Service Detachment Office.</eadetcit>
    </overview>
    </eainfo>
    <distinfo>
    <distrib>
    <cntinfo>
    <cntorgp>
    <cntorg>NIMA</cntorg>
    </cntorgp>
    <cntpos>ATTN: CC, MS D-16</cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>6001 MacArthur Blvd.</address>
    <city>Bethesda</city>
    <state>MD</state>
    <postal>20816-5001</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>301-227-2495</cntvoice>
    <cntfax>301-227-2498</cntfax>
    </cntinfo>
    </distrib>
    <distliab>None</distliab>
    <stdorder>
    <digform>
    <digtinfo>
    <formname>VPF</formname>
    <formverd>19930930</formverd>
    <formspec>Military Standard Vector Product Format (MIL-STD-2407). The current version of this document is dated 28 June 1996. Edition 3 of VMap-0 conforms to a previous version of the VPF Standard as noted. Future versions of VMap-0 will conform to the current version of VPF Standard.</formspec>
    <transize>0.172</transize>
    </digtinfo>
    <digtopt>
    <offoptn>
    <offmedia>CD-ROM</offmedia>
    <recfmt>ISO 9660</recfmt>
    </offoptn>
    </digtopt>
    </digform>
    <fees>Not Applicable</fees>
    </stdorder>
    </distinfo>
    <distinfo>
    <distrib>
    <cntinfo>
    <cntorgp>
    <cntorg>USGS Map Sales</cntorg>
    </cntorgp>
    <cntaddr>
    <addrtype>mailing address</addrtype>
    <address>Box 25286</address>
    <city>Denver</city>
    <state>CO</state>
    <postal>80225</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>303-236-7477</cntvoice>
    <cntfax>303-236-1972</cntfax>
    </cntinfo>
    </distrib>
    <distliab>None</distliab>
    <stdorder>
    <digform>
    <digtinfo>
    <transize>0.172</transize>
    </digtinfo>
    </digform>
    <fees>$82.50 per four disk set</fees>
    <ordering>For General Public: Payment (check, money order, purchase order, or Government account) must accompany order.
    Make all drafts payable to Dept. of the Interior- US Geological Survey.
    To provide a general idea of content, a sample data set is available from the TMPO Home Page at:</ordering>
    </stdorder>
    </distinfo>
    <distinfo>
    <distrib>
    <cntinfo>
    <cntorgp>
    <cntorg>Geodesy Team, Harvard University</cntorg>
    <cntper>Geodesy Team</cntper>
    </cntorgp>
    <cntemail>[email protected]</cntemail>
    </cntinfo>
    </distrib>
    <resdesc>Geodesy layer</resdesc>
    <distliab>None</distliab>
    <stdorder>
    <digform>
    <digtinfo>
    <formname>SHP</formname>
    <transize>0.172</transize>
    </digtinfo>
    <digtopt>
    <onlinopt>
    <computer>
    <networka>
    <networkr>geodesy.harvard.edu</networkr>
    </networka>
    </computer>
    </onlinopt>
    </digtopt>
    </digform>
    <fees>none</fees>
    </stdorder>
    <availabl>
    <timeinfo>
    <sngdate>
    <caldate>1992</caldate>
    </sngdate>
    </timeinfo>
    </availabl>
    </distinfo>
    <metainfo>
    <metd>20010226</metd>
    <metc>
    <cntinfo>
    <cntorgp>
    <cntorg>Geodesy Team</cntorg>
    <cntper>REQUIRED: The person responsible for the metadata information.</cntper>
    </cntorgp>
    <cntvoice>REQUIRED: The telephone number by which individuals can speak to the organization or individual.</cntvoice>
    <cntemail>[email protected]</cntemail>
    </cntinfo>
    </metc>
    <metstdn>FGDC Content Standards for Digital Geospatial Metadata</metstdn>
    <metstdv>FGDC-STD-001-1998</metstdv>
    <mettc>local time</mettc>
    <metextns>
    <onlink>http://www.esri.com/metadata/esriprof80.html</onlink>
    <metprof>ESRI Metadata Profile</metprof>
    </metextns>
    </metainfo>
    </metadata>

    Have you tried the directory and BFILE methods? Here is the example for that in the Oracle XML Developer's Guide:
    CREATE DIRECTORY xmldir AS 'path_to_folder_containing_XML_file';
    Example 3-3 Inserting XML Content into an XMLType Table
    INSERT INTO mytable2 VALUES (XMLType(bfilename('XMLDIR', 'purchaseOrder.xml'),
    nls_charset_id('AL32UTF8')));
    1 row created.
    The value passed to nls_charset_id() indicates that the encoding for the file to be read is UTF-8.
    ben
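    If the document lives on the client machine rather than in a server-side directory, the BFILE route above isn't available. A minimal sketch of the client-side alternative (the temporary-CLOB idea behind the HowToLoadLargeXML link in the question), assuming the xmltable_1 table described above; class and path names are illustrative only:
    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import oracle.sql.CLOB;

    public class LoadLargeXml {
        static void load(Connection conn, int docId, String path) throws Exception {
            String xml = new String(Files.readAllBytes(Paths.get(path)), "UTF-8");
            // A temporary CLOB avoids the small-bind limits that bite with plain setString.
            CLOB clob = CLOB.createTemporary(conn, true, CLOB.DURATION_SESSION);
            clob.setString(1L, xml);
            PreparedStatement ps = conn.prepareStatement(
                    "insert into xmltable_1 (doc_id, xml_data) values (?, XMLType(?))");
            try {
                ps.setInt(1, docId);
                ps.setClob(2, clob);        // XMLType() parses the CLOB server-side
                ps.executeUpdate();
            } finally {
                ps.close();
                clob.freeTemporary();       // release the temporary LOB
            }
        }
    }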

  • Can I append an XML Document into another?

    I have a large XML document which I open in ColdFusion (MX 7), and then a smaller group of files that are opened into their own XML objects. After some customization of XML text values, I want to append the smaller XML documents into the larger document at specific positions within the document tree.
    I have code that I thought would do this, using an arrayAppend function, but what happens is that only the root element of the smaller (inserted) XML document is placed into the main document.
    Is this possible, or do I need to modify the code so that the entire tree of XML data that I am appending is created on the fly?

    Personally I do this with XSLT.
    <cfxml variable="doc1">
    <root>
    <member id="10">
    <name>Ian</name>
    <sex>male</sex>
    </member>
    </root>
    </cfxml>
    <cfxml variable="doc2">
    <root>
    <member id="1">
    <name>Joe</name>
    <sex>male</sex>
    </member>
    <member id="2">
    <name>John</name>
    <sex>male</sex>
    </member>
    <member id="3">
    <name>Sue</name>
    <sex>female</sex>
    </member>
    </root>
    </cfxml>
    <cffile action="write" file="#expandPath("/")#\doc2.xml" output="#ToString(doc2)#" nameconflict="overwrite">
    <!--- METHOD ONE using a deprecated and undocumented method --->
    <cfset newNode = doc1.root.member.cloneNode(true)>
    <cfset doc2.changeNodeOwner(newNode)>
    <cfset doc2.root.appendChild(newNode)>
    <cfdump var="#doc2#" label="Merged by hidden functions" expand="no">
    <!--- METHOD TWO using XSLT --->
    <!--- create an xsl document --->
    <cfoutput>
    <cfsavecontent variable="transformer">
    <xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
    <xsl:output method="xml"/>
    <!-- load the merge file -->
    <xsl:variable name="emps" select="document('#expandPath("/")#\doc2.xml')"/>
    <!--- this line references an XML file to be combined with the main file;
    I wonder if there is a way to use an XML document in memory here? --->
    <!-- combine the files -->
    <xsl:template match="/">
    <root>
    <!-- select all the child nodes of the root tag in the main file -->
    <xsl:for-each select="root/child::*">
    <!-- copy the member tag -->
    <xsl:copy-of select="."/>
    </xsl:for-each>
    <!-- and all the child nodes of the root tag in the merge file -->
    <xsl:for-each select="$emps/root/child::*">
    <!-- copy the member tag -->
    <xsl:copy-of select="."/>
    </xsl:for-each>
    </root>
    </xsl:template>
    </xsl:stylesheet>
    </cfsavecontent>
    </cfoutput>
    <!--- create a combined xml document --->
    <cfset doc2 = XMLparse(XMLtransform(doc1,transformer))>
    <cfdump var="#doc2#" label="Merged by XSLT" expand="no">

  • Exception when inserting an XML document into a database

    Hello.
    I'm trying to insert the content of an XML document into a database. The problem is the exception that is raised when a foreign key value in the XML is not present in the database as a primary key.
    How can I catch these exceptions and continue inserting the rest of the data?
    Thanks in advance ...

    I really hope for your sake this wasn't a production database. As mentioned in the URL, some years ago I fiddled around with this, which was btw a supported Metalink workaround ONCE (during my Oracle 7 days), but now I wouldn't be surprised if you have mixed NLS content in your tables/database and have more or less "corrupted" your content.
    The initial issue was probably a client one.
    BTW
    I actually hope it was a production database, because maybe then you would learn that you SHOULD NEVER EVER UPDATE DATABASE BASE TABLES directly...
    ...unless (the only exception) you want to learn something about the internals and don't care that your TEST database gets corrupted....
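    As for the original question (skipping rows whose foreign key has no parent and carrying on), that can be handled in the loading code itself. A minimal sketch, assuming rows are inserted one at a time from Java; the table and column names here are hypothetical:
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.List;

    public class TolerantLoader {
        static void insertAll(Connection conn, List<String[]> rows) throws SQLException {
            // Hypothetical child table; order_id is the FK that may have no parent row.
            PreparedStatement ps = conn.prepareStatement(
                    "insert into order_line (order_id, item) values (?, ?)");
            try {
                for (String[] row : rows) {
                    try {
                        ps.setString(1, row[0]);
                        ps.setString(2, row[1]);
                        ps.executeUpdate();
                    } catch (SQLException e) {
                        // ORA-02291: integrity constraint violated - parent key not found
                        if (e.getErrorCode() == 2291) {
                            System.err.println("Skipping row with missing parent key: " + row[0]);
                        } else {
                            throw e; // anything else is still fatal
                        }
                    }
                }
            } finally {
                ps.close();
            }
        }
    }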

  • Problem in inserting XML document

    Given this XML DTD:
    <!ELEMENT a (b, c)>
    <!ELEMENT b (#PCDATA)>
    <!ELEMENT c (d, e)>
    <!ELEMENT d (#PCDATA)>
    <!ELEMENT e (#PCDATA)>
    I created the schema in Oracle8i as follows:
    create type c_t
    as object (d varchar2(3),
    e varchar2(3));
    create table a (
    b varchar2(3),
    c c_t,
    constraint b_pk primary key (b));
    And the XML document to insert into the DB is:
    <?xml version="1.0"?>
    <ROWSET>
    <ROW>
    <b>"100"</b>
    <c><d>"aaa"</d><e>"abc"</e></c>
    </ROW>
    <ROW>
    <b>"200"</b>
    <c><d>"bbb"</d><e>"def"</e></c>
    </ROW>
    </ROWSET>
    Using Oracle8i's XML-Java library interface,
    I tried to insert the XML document into the DB table, but a Java runtime exception occurred saying "java.lang.NoSuchMethodError: oracle.jdbc.oci8.OCIDBAccess: method initObjectFields(Loracle/jdbc/dbaccess/DBColumn;[BI)V not found".
    How can I solve this problem?

    try calling dbms_xmlgen.setrowsettag(qryctx,'siteMap') ;
    before retrieving the result with DBMS_XMLGEN.getxmltype (qryctx);

  • To load an XML document into 40 tables

    How do I load a large XML document into 40 tables? Most of the examples I see only load one table into the Oracle database.

    From the above document:
    Storing XML Data Across Tables
    Question
    Can the XML-SQL Utility store XML data across tables?
    Answer
    Currently the XML-SQL Utility (XSU) can only store to a single table. It maps a canonical representation of an XML document into any table/view. But of course there is a way to store XML with the XSU across tables. One can do this using XSLT to transform any document into multiple documents and insert them separately. Another way is to define views over multiple tables (object views if needed) and then do the inserts ... into the view. If the view is inherently non-updatable (because of complex joins, ...), then one can use INSTEAD-OF triggers over the views to do the inserts.
    -- I've tried this, works fine.
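    A minimal Java sketch of the view-based approach, using the XML SQL Utility class shipped in xsu12.jar. DEPT_EMP_VIEW and the INSTEAD-OF trigger behind it are assumed to exist; XSU itself only ever sees the single view:
    import java.sql.Connection;
    import oracle.xml.sql.dml.OracleXMLSave;

    public class ViewLoader {
        static void load(Connection conn, String canonicalXml) {
            // XSU maps <ROWSET>/<ROW> elements onto the view's columns; the
            // INSTEAD-OF trigger then distributes each row across the base tables.
            OracleXMLSave save = new OracleXMLSave(conn, "DEPT_EMP_VIEW"); // hypothetical view
            try {
                save.insertXML(canonicalXml);
            } finally {
                save.close();
            }
        }
    }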

  • Got error message when storing XML documents into XML DB repository via WebDAV

    Hi experts,
    I am in Oracle Enterprise Manager 11g 11.2.0.1.0.
    SQL*Plus: Release 11.2.0.1.0 Production on Tue Feb 22 11:40:23 2011
    I got an error message when storing XML documents into the XML DB repository via WebDAV.
    I have successfully registered 5 related schemas and generated 1 table.
    I have inserted 40 .xml files into this auto-generated table.
    Using these data I created a relational view successfully.
    But since I couldn't store XML documents into the XML DB repository via WebDAV,
    when I query using the code below:
    SELECT rv.res.getClobVal()
    FROM resource_view rv
    WHERE rv.any_path = '/home/DEV/messages/4fe1-865d-da0db9212f34.xml';
    I got nothing.
    My FTP session is listed below:
    ftp> open localhost 2100
    Connected to I0025B368E2F9.
    220- C0025B368E2F9
    Unauthorised use of this FTP server is prohibited and may be subject to civil and criminal prosecution.
    220 I0025B368E2F9 FTP Server (Oracle XML DB/Oracle Database) ready.
    User (I0025B368E2F9:(none)): fda_xml
    331 pass required for FDA_XML
    Password:
    230 FDA_XML logged in
    ftp> cd /home/DEV/message
    250 CWD Command successful
    ftp> pwd
    257 "/home/DEV/message" is current directory.
    ftp> ls -la
    200 PORT Command successful
    150 ASCII Data Connection
    drw-r--r-- 2 FDA_XML oracle 0 DEC 17 19:19 .
    drw-r--r-- 2 FDA_XML oracle 0 DEC 17 19:19 ..
    226 ASCII Transfer Complete
    ftp: 115 bytes received in 0.00Seconds 115000.00Kbytes/sec.
    250 SET_CHARSET Command Successful
    ftp> put C:\ED\SPL\E_Reon_Data\loaded\4fe1-865d-da0db9212f34.xml
    200 PORT Command successful
    150 ASCII Data Connection
    550- Error Response
    ORA-00600: internal error code, arguments: [qmxConvUnkType], [], [], [], [], [], [], [], [], [], [], []
    550 End Error Response
    ftp: 3394 bytes sent in 0.00Seconds 3394000.00Kbytes/sec.
    I have tried all suggestions from another thread, such as:
    alter system set events ='31150 trace name context forever, level 0x4000'
    SQL> alter system set shared_servers = 1;
    but they failed.
    Is there anyone who can help?
    Thanks.
    Edited by: Cow on Mar 29, 2011 12:58 AM

  • Adobe Acrobat 9/Win7 - Inserting Word documents into PDF file intermittently leads to missing or garbled text?

    Heya.
    I have a user who is creating a PDF that is essentially compiled by inserting Word documents at the positions she wants inside the PDF.
    The problem is that sometimes this leads to missing or garbled text. For example, 6-8.1 is displayed as 6- .1. Which is just plain weird.
    Other times, the text looks garbled or uneven. She turned off her PC, but I took screenshots of both of these scenarios. I fear that I am not going to be able to retroactively fix the corruption in this document for her.
    She says she's using very standard fonts, like Arial. I asked if she could locate the original Word document that produced the garbled text, just for a sanity check. The problem is that her co-worker may have been the only one with that particular Word document (there are probably hundreds which compile this PDF...) and IF she has it, she might not be able to locate it on her PC TO check what the original text is. Yikes.
    Google-fu turns up very little. I tried the guidance in this document - https://answers.acrobatusers.com/We-combine-PDF-document-ends-corrupt-What-problem-q197619.aspx - and a Save-As, but to no avail.
    This also turns up but is not applicable to Acrobat 9, I believe: http://blogs.adobe.com/acrobatineducation/2010/02/get_rid_of_that_bloat_in_your.html
    She states that she can manually fix the document but after a while it gets corrupted again? But I haven't observed that, and she might have just fed me that to amplify the problem.
    I will be able to post screenshots when she turns on her computer tomorrow.

    Here are the two instances of corruption we have come across.
    http://imgur.com/nPztMYm
    http://imgur.com/wHM12e2
    Edit: A LITTLE MORE INFO. I am able to select the corrupted text and paste it into Notepad with all the characters intact. So the characters are THERE; they are just not being displayed correctly.
    Edit:  They also do not print correctly.  The page prints with characters missing, and she wants to print this document, so that is going to be an issue.

  • Inserting XML content into Database

    Hallo,
    I'm new to Oracle.
    I want to insert XML content into the database. For testing I installed version 10g on a Windows XP computer.
    I read the Oracle XML DB Developer's Guide. On page 3-4 ff. it is explained how I can insert content into the database. That's how I did it (with iSQL*Plus):
    create table example1(key_column varchar2(10) primary key, xml_column xmltype)
    -->works fine
    create table example2 of xmltype
    -->works fine
    Now I made a directory under c:\myXmlFilesForDb in WinXP
    create directory xmldir as 'c:/myXmlFilesForDb in WinXp'
    (also tried: create directory xmldir as 'c:\myXmlFilesForDb in WinXp')
    --> in this directory is a file named: mydokument.xml
    --> works fine
    insert into example2 values(xmltype(bfilename('xmldir','mydokument.xml'), nls_charset_id('AL32UTF8')))
    The following error message is displayed (in German; translated):
    ORA-22285: non-existent directory or file for FILEOPEN operation
    ORA-06512: at "SYS.DBMS_LOB", line 523
    ORA-06512: at "SYS.XMLTYPE", line 287
    ORA-06512: at line 1
    What's wrong? Can anybody help me, please?
    ohhh....
    thank you very much
    cu
    George

    Directory entries are case sensitive.
    Select * from dba_directories to ensure the case is the same as you are using in your insert statement.
    Are you using the same user? If not: grant read on directory directoryname to username;
    Then try to just select:
    select dbms_lob.getLength(BFileName('directoryname','filename.xml'))
    as length
    from dual;

  • Error in inserting XML document

    I'm trying to insert into a table from an HTML form.
    Here's the table I'm trying to insert into
    TABLE employee
    name VARCHAR2(40),
    ssnum NUMBER NOT NULL
    Note: even though SSNUM is a number, I take it as a string and I
    have a before-insert trigger that converts it to a number. This works
    from SQL*Plus if the entry is a number.
    Here's the testxsql.html
    <html>
    <body>
    Input your name and ssnumber ..
    <form action="testxsql.xsql" method="post">
    Name<input type="text" name="name_field" size="30">
    SSNUM<input type = "text" name="ssnum_field" size="10">
    <input type="submit">
    </form>
    </body>
    </html>
    it calls testxsql.xsql which is
    <?xml version="1.0"?>
    <?xml-stylesheet type="text/xsl" href="employee.xsl"?>
    <page connection="demo">
    <insert table="employee" transform="testxsql.xsl"/>
    <content>
    <query tag-case="lower" max-rows="5" rowset-element=""
    row-element="emp" >
    select *
    from employee
    order by ssnum desc
    </query>
    </content>
    </page>
    and testxsql.xsl is
    <?xml version = '1.0'?>
    <ROWSET xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
    <xsl:for-each select="request">
    <ROW>
    <NAME><xsl:value-of select="name_field"/></NAME>
    <SSNUM><xsl:value-of select="ssnum_field"/></SSNUM>
    <SOURCE>User-Submitted</SOURCE>
    </ROW>
    </xsl:for-each>
    </ROWSET>
    I get the following error while trying to insert
    XSQL-015: Error inserting XML Document
    invalid LABEL name in XML doc-
    If I remove the insert tag, the query portion works fine.
    I'm new to XML so I've already pulled all the hair that I can
    muster. Thanks

    Kumar Pandey (guest) wrote:
    : [quotes the original post above in full]
    I'm using apache web and JServ and 8i as db

  • ORA-00600: inserting xml document

    I am getting this error when inserting an XML document into Oracle.
    The schema and sample XML document below reproduce the problem.
    If I move the element named "theID" from the type "BaseObjectType"
    to "NextObjectType" and remove the extension declaration in this type,
    all works OK. The complete error is:
    ORA-00600: internal error code, arguments: [qmxConvUnkType], [], [], [], [],
    Schema:
    <?xml version="1.0" encoding="UTF-8"?>
    <xs:schema
    targetNamespace="http://www.x.com/test"
    xmlns:xs="http://www.w3.org/2001/XMLSchema"
    xmlns:tes="http://www.x.com/test">
    <xs:complexType name="BaseObjectType" abstract="true">
    <xs:sequence>
    <xs:element name="theID" type="xs:string">
    </xs:element>
    </xs:sequence>
    </xs:complexType>
    <xs:complexType name="NextObjectType" abstract="true">
    <xs:complexContent>
    <xs:extension base="tes:BaseObjectType">
    <xs:sequence>
    <xs:element name="title" type="xs:string">
    </xs:element>
    </xs:sequence>
    </xs:extension>
    </xs:complexContent>
    </xs:complexType>
    <xs:complexType name="RBaseType" abstract="true">
    <xs:complexContent>
    <xs:restriction base="tes:NextObjectType">
    <xs:sequence>
    <xs:element name="theID" type="xs:string"/>
    <xs:element name="title" type="xs:string"/>
    </xs:sequence>
    </xs:restriction>
    </xs:complexContent>
    </xs:complexType>
    <xs:complexType name="RType">
    <xs:complexContent>
    <xs:extension base="tes:RBaseType">
    <xs:sequence>
    <xs:element name="theName" type="xs:string">
    </xs:element>     
    </xs:sequence>
    </xs:extension>
    </xs:complexContent>
    </xs:complexType>
    <xs:element name="RObject" type="tes:RType"/>
    </xs:schema>
    Sample xml document:
    <?xml version="1.0" encoding="UTF-8"?>
    <RObject xmlns="http://www.x.com/test"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:noNamespaceSchemaLocation="http://localhost:8080:/test.xsd">
    <theID xmlns="">123</theID>
    <title xmlns="">title</title>
    <theName xmlns="">name</theName>
    </RObject>
    Register Schema call:
    BEGIN
    DBMS_XMLSCHEMA.registerSchema(
    SCHEMAURL => 'http://localhost:8080/test.xsd',
    SCHEMADOC => bfilename('TMP3','test.xsd'),
    LOCAL => TRUE,
    GENTYPES => TRUE,
    GENTABLES => TRUE,
    CSID => nls_charset_id('AL32UTF8'));
    END;
    Insert xml document call:
    insert into "RObject566_TAB" values
    (XMLType(bfilename('TMP3','test.xml'),nls_charset_id('AL32UTF8')));
    Changing NextObjectType to this gets insert to work:
    <xs:complexType name="NextObjectType" abstract="true">
    <xs:sequence>
    <xs:element name="theID" type="xs:string"/>
    <xs:element name="title" type="xs:string"/>
    </xs:sequence>
    </xs:complexType>
    Anyone else seen this?
    jcl.

    BTW: This is with Oracle 10.2.0

  • How to convert an XML document into an Oracle temporary table

    I am creating an XML document in Java and passing this XML into the Oracle database, and I need to fetch this XML into a result set / rowset.
    XML structure:
    <Department deptno="100">
    <DeptName>Sports</DeptName>
    <EmployeeList>
    <Employee empno="200"><Ename>John</Ename><Salary>33333</Salary>
    </Employee>
    <Employee empno="300"><Ename>Jack</Ename><Salary>333444</Salary>
    </Employee>
    </EmployeeList>
    </Department>
    I need the result in this format:
    Deptno DeptName empno Ename Salary
    100 Sports 200 Jhon 2500
    100 Sports 300 Jack 3000

    It does depend on your version as odie suggests.
    Here's a way that will work in 10g...
    SQL> ed
    Wrote file afiedt.buf
      1  with t as (select xmltype('<Department deptno="100">
      2  <DeptName>Sports</DeptName>
      3  <EmployeeList>
      4  <Employee empno="200"><Ename>John</Ename><Salary>33333</Salary>
      5  </Employee>
      6  <Employee empno="300"><Ename>Jack</Ename><Salary>333444</Salary>
      7  </Employee>
      8  </EmployeeList>
      9  </Department>
    10  ') as xml from dual)
    11  --
    12  -- End of test data, Use query below
    13  --
    14  select x.deptno, x.deptname
    15        ,y.empno, y.ename, y.salary
    16  from t
    17      ,xmltable('/'
    18                passing t.xml
    19                columns deptno   number       path '/Department/@deptno'
    20                       ,deptname varchar2(10) path '/Department/DeptName'
    21                       ,emps     xmltype      path '/Department/EmployeeList'
    22               ) x
    23      ,xmltable('/EmployeeList/Employee'
    24                passing x.emps
    25                columns empno    number       path '/Employee/@empno'
    26                       ,ename    varchar2(10) path '/Employee/Ename'
    27                       ,salary   number       path '/Employee/Salary'
    28*              ) y
    SQL> /
        DEPTNO DEPTNAME        EMPNO ENAME          SALARY
           100 Sports            200 John            33333
           100 Sports            300 Jack           333444
    SQL>
    If the XML is a string, e.g. a CLOB, then it can easily be converted to XMLTYPE using the XMLType() constructor.
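    Since the original poster builds the document in Java, here is a minimal sketch of driving that same query from JDBC: the XML is bound as a string and wrapped with XMLType(?) on the server (depending on driver and version, very large documents may need to be staged in a CLOB first):
    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class ShredDepartment {
        static void print(Connection conn, String departmentXml) throws Exception {
            String sql =
                "select x.deptno, x.deptname, y.empno, y.ename, y.salary "
              + "from xmltable('/' passing XMLType(?) "
              + "       columns deptno   number       path '/Department/@deptno', "
              + "               deptname varchar2(10) path '/Department/DeptName', "
              + "               emps     xmltype      path '/Department/EmployeeList') x, "
              + "     xmltable('/EmployeeList/Employee' passing x.emps "
              + "       columns empno    number       path '/Employee/@empno', "
              + "               ename    varchar2(10) path '/Employee/Ename', "
              + "               salary   number       path '/Employee/Salary') y";
            PreparedStatement ps = conn.prepareStatement(sql);
            ps.setString(1, departmentXml);
            ResultSet rs = ps.executeQuery();
            while (rs.next()) {
                System.out.println(rs.getInt("DEPTNO") + " " + rs.getString("DEPTNAME") + " "
                        + rs.getInt("EMPNO") + " " + rs.getString("ENAME") + " " + rs.getInt("SALARY"));
            }
            rs.close();
            ps.close();
        }
    }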

  • How to FTP an XML document into an XMLType table which was not created by the same user

    Hi,
    1.
    I have a table called SNPLEX_DESIGN which is created automatically when I register the snplex_design.xsd XML schema (Oracle creates it through the xdb:defaultTable="SNPLEX_DESIGN" attribute), and it is created using the SNPLEX user account.
    2.
    I also created a folder (resource) called /home/SNPLEX/Orders, which is used to hold all the incoming XML documents.
    3.
    I created another user account called SNPLEX_APP, which is the only user account allowed to FTP XML documents into the /home/SNPLEX/Orders folder.
    Issues:
    If I log in as the SNPLEX user, I can FTP an XML document into the folder and the TABLE (the file size = 0). But if I log in as the SNPLEX_APP user account, I can only FTP the XML document into the folder; Oracle doesn't store the document into the table (because the file size shows a number).
    I have granted all the ACL privileges on the /home/SNPLEX/Orders folder to SNPLEX_APP through OEM.
    Did I miss anything? Any help will be greatly appreciated. Resolving this issue is very important to us, since we are at the stage of rolling the system into production.
    Regards,
    Jinsen

    In order for a registered schema to be available to other users, the schema must be registered as a GLOBAL rather than a LOCAL schema. This is controlled by the third argument passed to registerSchema, and the default is local. Note that you will also need to explicitly grant appropriate permissions on any tables created by the schema registration process to other users who will be loading or reading data from these tables.
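    A minimal sketch of doing that from JDBC, run as the schema-owning account (SNPLEX). The connect string, schema URL, directory, and file names are illustrative; the parameter names match the registerSchema call shown earlier in this thread, with local => FALSE making the registration GLOBAL:
    import java.sql.CallableStatement;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class RegisterGlobalSchema {
        public static void main(String[] args) throws Exception {
            // Hypothetical connect string and credentials.
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:oci:@//localhost:1521/ORCL", "snplex", "password");

            // Register the schema globally so other accounts (e.g. SNPLEX_APP) can use it.
            CallableStatement cs = conn.prepareCall(
                "begin "
              + "  dbms_xmlschema.registerSchema("
              + "    schemaurl => 'http://localhost:8080/snplex_design.xsd',"
              + "    schemadoc => bfilename('XMLDIR', 'snplex_design.xsd'),"
              + "    local     => FALSE,"       // GLOBAL registration
              + "    gentypes  => TRUE,"
              + "    gentables => TRUE,"
              + "    csid      => nls_charset_id('AL32UTF8')); "
              + "end;");
            cs.execute();
            cs.close();

            // Explicitly grant DML on the default table to the account doing the FTP loads.
            Statement st = conn.createStatement();
            st.execute("grant select, insert, update, delete on SNPLEX_DESIGN to SNPLEX_APP");
            st.close();
            conn.close();
        }
    }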

  • Is it possible to insert a document into another?

    I would like to insert a document into another document without cutting and pasting. That is, to "insert" it into the present document. I have been unable to find that option. Is it possible?

    It would seem a logical function, though Pages is in essence a $30.00 program so it's not going to have all the features one might wish. You can submit feedback to Apple on the issue here, though:
    http://www.apple.com/feedback/pages.html
    Regards.
