How do I insert more than 4K of XML data into an XMLType column?

I use OCCI and our Oracle version is 9.2.0.1.0. I have successfully inserted XML data of any size into a CLOB column using "insert into ... values (..., empty_clob()) returning c1 into :3", then using Statement::getClob() to acquire a reference to the internal CLOB and populate it. I cannot do the same when the column is of type XMLType.
I could not find a single code sample that demonstrates inserting into a table with an XMLType column. Using setDataBuffer(OCCI_SQLT_STR) with data over 4000 bytes does not work.
I'd greatly appreciate any feedback.

Pretty sure this was a bug in the base 9.2 release that was fixed in a later patch set. Try 9.2.0.6 or later.
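Once on a patched release, the usual workaround is to bind the document as a CLOB and let the server convert it, since a CLOB bind is not subject to the 4000-byte limit on string binds. A minimal sketch, assuming a hypothetical table xml_tab with an XMLType column c1 (requires an Oracle instance to run):

```sql
-- Bind the document as a CLOB and convert it server-side with XMLType().
-- A CLOB bind has no 4000-byte limit, unlike a string bind.
DECLARE
  v_doc CLOB;
BEGIN
  v_doc := '<root>'
        || RPAD('<item>x</item>', 14000, '<item>x</item>')  -- well over 4K
        || '</root>';
  INSERT INTO xml_tab (c1) VALUES (XMLType(v_doc));
  COMMIT;
END;
/
```

From OCCI, the equivalent is preparing "INSERT INTO xml_tab (c1) VALUES (XMLType(:1))" and binding a Clob with Statement::setClob() instead of using SetDataBuffer().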

Similar Messages

  • Inserting XML data into xmltype column

    Oracle version: 10.1.0.5
    OpenVms Alpha V8.3
1) Tried this and got the error shown below. Removed the charset argument and passed a zero instead; same error.
INSERT INTO xml_demo (xml_data) -- column of xmltype
VALUES (
  xmltype(
    bfilename('XML_DIR', 'MOL.XML'),
    nls_charset_id('AL32UTF8')
  )
);
ORA-22993: specified input amount is greater than actual source amount
ORA-06512: at "SYS.DBMS_LOB", line 637
ORA-06512: at "SYS.XMLTYPE", line 283
ORA-06512: at line 1
2) This PL/SQL block works. However, the maximum RAW size is around 32K, and the file can be around 100K. Maybe I can load it into a table of RAW and somehow concatenate it before inserting. Not sure whether this is possible, but I am sure there must be a simpler way of doing this.
    Subset of the xml file is pasted below.
set serveroutput on size 1000000
DECLARE
  file1  BFILE;
  v_xml  XMLType;
  len1   NUMBER(6);
  v_rec1 RAW(32000);
BEGIN
  file1 := BFILENAME('XML_DIR', 'MOL.XML');
  DBMS_LOB.FILEOPEN(file1, DBMS_LOB.FILE_READONLY);
  len1 := DBMS_LOB.GETLENGTH(file1);
  v_rec1 := DBMS_LOB.SUBSTR(file1, len1, 1);
  v_xml := XMLType(UTL_RAW.CAST_TO_VARCHAR2(v_rec1));
  INSERT INTO xml_demo (xml_data) VALUES (v_xml);
  COMMIT;
  DBMS_LOB.FILECLOSE(file1);
EXCEPTION
  WHEN OTHERS THEN
    DBMS_OUTPUT.PUT_LINE(SQLERRM);
    DBMS_LOB.FILECLOSE(file1);
END;
    <?xml version="1.0" encoding="UTF-8"?>
    <MolDocument DtdVersion="3" DtdRelease="0">
    <DocumentIdentification v="MOL_20100331_1500_1600"/>
    <DocumentVersion v="1"/>
    <DocumentType v="A43"/>
    <SenderIdentification codingScheme="A01" v="17X100Z100Z0001H"/>
    <SenderRole v="A35"/>
    <ReceiverIdentification codingScheme="A01" v="10XFR-RTE------Q"/>
    <ReceiverRole v="A04"/>
    <CreationDateTime v="2010-03-31T14:10:00Z"/>
    <ValidTimeInterval v="2010-03-31T15:00Z/2010-03-31T16:00Z"/>
    <Domain codingScheme="A01" v="10YDOM-1001A001A"/>
    <MolTimeSeries>
    <ContractIdentification v="RTE_20100331_1500_16"/>
    <ResourceProvider codingScheme="A01" v="10XFR-RTE------Q"/>
    <AcquiringArea codingScheme="A01" v="17Y100Z100Z00013"/>
    <ConnectingArea codingScheme="A01" v="10YFR-RTE------C"/>
    <AuctionIdentification v="AUCTION_20100331_1500_1600"/>
    <BusinessType v="A10"/>
    <BidTimeInterval v="2010-03-31T15:00Z/2010-03-31T16:00Z"/>
    <MeasureUnitQuantity v="MAW"/>
    <Currency v="EUR"/>
    <MeasureUnitPrice v="MWH"/>
    <Direction v="A02"/>
    <MinimumActivationQuantity v="50"/>
    <Status v="A06"/>
    <Period>
    <TimeInterval v="2010-03-31T15:00Z/2010-03-31T16:00Z"/>
    <Resolution v="PT60M"/>
    <Interval>
    <Pos v="1"/>
    <Qty v="50"/>
    <EnergyPrice v="50.45"/>
    </Interval>
    </Period>
    </MolTimeSeries>
    </MolDocument>
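To get past the 32K limit of reading the BFILE into a RAW buffer, the whole file can instead be streamed into a temporary CLOB with DBMS_LOB.LOADCLOBFROMFILE and converted from there. A sketch, reusing the XML_DIR directory and xml_demo table from above (untested here; requires an Oracle instance):

```sql
-- Stream the entire BFILE into a temporary CLOB, then convert.
-- DBMS_LOB.LOADCLOBFROMFILE is not limited to 32K, unlike
-- DBMS_LOB.SUBSTR into a RAW buffer.
DECLARE
  file1       BFILE := BFILENAME('XML_DIR', 'MOL.XML');
  v_clob      CLOB;
  dest_offset INTEGER := 1;
  src_offset  INTEGER := 1;
  lang_ctx    INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  warning     INTEGER;
BEGIN
  DBMS_LOB.CREATETEMPORARY(v_clob, TRUE);
  DBMS_LOB.FILEOPEN(file1, DBMS_LOB.FILE_READONLY);
  DBMS_LOB.LOADCLOBFROMFILE(v_clob, file1, DBMS_LOB.GETLENGTH(file1),
                            dest_offset, src_offset,
                            NLS_CHARSET_ID('AL32UTF8'), lang_ctx, warning);
  DBMS_LOB.FILECLOSE(file1);
  INSERT INTO xml_demo (xml_data) VALUES (XMLType(v_clob));
  COMMIT;
  DBMS_LOB.FREETEMPORARY(v_clob);
END;
/
```

The character-set conversion is handled by the bfile_csid argument, which should also avoid the ORA-22993 amount mismatch seen when the charset id was passed directly to the XMLType constructor.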

    Marc
Thanks. I understand what you are saying. I have been copying files in binary mode from NT servers to VMS. I have to get a proper XML file via FTP from the originating system to investigate further.
I have one last item I need help on. If anything looks obvious, let me know:
+1) The xsd definition of Qty (type: QuantityType) and EnergyPrice (type: AmountType)+
                   <xsd:element name="Qty" type="ecc:QuantityType">
                        <xsd:annotation>
                             <xsd:documentation/>
                        </xsd:annotation>
                   </xsd:element>
                   <xsd:element name="EnergyPrice" type="ecc:AmountType" minOccurs="0">
                        <xsd:annotation>
                             <xsd:documentation/>
                        </xsd:annotation>
                   </xsd:element>
    +2) Definition of AmountType and QuantityType in the parent xsd+
         <xsd:complexType name="AmountType">
              <xsd:annotation>
                   <xsd:documentation>
                        <Uid>ET0022</Uid>
                        <Definition>The monetary value of an object</Definition>
                   </xsd:documentation>
              </xsd:annotation>
              <xsd:attribute name="v" use="required">
                   <xsd:simpleType>
                        <xsd:restriction base="xsd:decimal">
                             <xsd:totalDigits value="17"/>
                        </xsd:restriction>
                   </xsd:simpleType>
              </xsd:attribute>
         </xsd:complexType>
         <!--_________________________________________________-->
         <xsd:complexType name="QuantityType">
              <xsd:annotation>
                   <xsd:documentation>
                        <Uid>ET0012</Uid>
                        <Definition>(Synonym "qty") The quantity of an energy product. Positive quantities shall not have a sign.</Definition>
                   </xsd:documentation>
              </xsd:annotation>
              <xsd:attribute name="v" type="xsd:decimal" use="required"/>
         </xsd:complexType>
     <!--_________________________________________________-->
    +3. Data in the XML file+
    <Period>
    <TimeInterval v="2010-03-31T15:00Z/2010-03-31T16:00Z"/>
    <Resolution v="PT60M"/>
    <Interval>
    <Pos v="1"/>
    <Qty v="50"/>
    <EnergyPrice v="50.45"/>
    </Interval>
    +4) When I do the load:+
    the EnergyPrice is saved in the xmltype column as <EnergyPrice v="50"/>
    Losing its decimal value of .45
    +5) When I select as follows:+
    **DEV** SQL>> l
    1 SELECT
    2 EXTRACTVALUE(x2.column_value,'/MolTimeSeries/Period/Interval/EnergyPrice/@v') v1,
    3 EXTRACTVALUE(x2.column_value,'/MolTimeSeries/Period/Interval/EnergyPrice') v2,
    4 EXTRACTVALUE(x2.column_value,'/MolTimeSeries/Period/Interval/Qty') v3
    5 FROM balit_mol_xml x,
    6 TABLE(
    7 XMLSEQUENCE(
    8 EXTRACT(x.xml_payload, '/MolDocument/MolTimeSeries')
    9 )
    10 ) x2
    11* WHERE EXISTSNODE(x.xml_payload,'/MolDocument/DocumentIdentification[@v="MOL_20100331_1500_1600"]') = 1
+6) I get the result:+
50
AmountType479_T(XDB$RAW_LIST_T('1301000000'), 50)
QuantityType471_T(XDB$RAW_LIST_T('1301000000'), 50)
+7) What is XDB$RAW_LIST_T('1301000000')?+
Does that tell you what I am doing wrong?

  • Inserting large xml data into xmltype

    Hi all,
    In my project I need to insert very large XML data into xmltype column.
    My table:
CREATE TABLE TransDetailstblCLOB (id NUMBER, data_xml XMLType) XMLType COLUMN data_xml STORE AS CLOB;
I am using a JDBC approach to insert values. It works fine for data less than 4000 bytes when using preparedStatement.setString(1, xmlData). As I have to insert XML data larger than 4000 bytes, I am now using the preparedStatement.setClob() methods.
My code works fine for a table whose column is declared as CLOB explicitly. But for TransDetailstblCLOB, where the column is declared as XMLTYPE with CLOB storage, I am getting the error: "ORA-01461: can bind a LONG value only for insert into a LONG column".
This error means that there is a mismatch between my setClob() and the column, which would mean I am not storing into a CLOB column.
I read on the Oracle site that:
"When you create an XMLType column without any XML schema specification, a hidden CLOB column is automatically created to store the XML data. The XMLType column itself becomes a virtual column over this hidden CLOB column. It is not possible to directly access the CLOB column; however, you can set the storage characteristics for the column using the XMLType storage clause."
I don't understand: if it is stated here that it is a hidden CLOB column, then why can't I use setClob()? It worked fine for a pure CLOB column (another table), so why is it giving such an error for the XMLTYPE table?
I have been stuck on this for three days. Can anyone help me, please?
    My code snippet:
     query = "INSERT INTO po_xml_tab VALUES (?, XMLType(?))";
     // Get the statement object
     pstmt = (OraclePreparedStatement) conn.prepareStatement(query);
     // Create a temporary CLOB and open it in read/write mode
     temporaryClob = oracle.sql.CLOB.createTemporary(conn, true, CLOB.DURATION_SESSION);
     temporaryClob.open(CLOB.MODE_READWRITE);
     log.debug("tempClob opened; length " + temporaryClob.getLength()
               + ", chunk size " + temporaryClob.getChunkSize());
     // Copy the XML string into the temporary CLOB chunk by chunk
     Writer out = temporaryClob.getCharacterOutputStream();
     Reader in = new StringReader(xmlData);
     char[] buf = new char[temporaryClob.getChunkSize()];
     int length;
     int wrote = 0;
     while ((length = in.read(buf)) != -1) {
          out.write(buf, 0, length);
          wrote += length;
     }
     out.close();
     log.debug("Wrote length " + wrote);
     // Bind the id and the temporary CLOB with the prepared statement
     pstmt.setInt(1, 100);
     pstmt.setClob(2, temporaryClob);
     int i = pstmt.executeUpdate();
     if (i == 1) {
          log.debug("Record successfully inserted!");
     }

try this; it works in ADOdb:
declare
  poXML CLOB;
BEGIN
  poXML := '<OIDS><OID>large text</OID></OIDS>';
  UPDATE a_po_xml_tab SET podoc = XMLType(poXML) WHERE poid = 102;
END;

  • How to insert a very long string into a column of datatype 'LONG'

Can anyone please tell me how I can insert a very long string into a column of datatype LONG?
I get the error ORA-01704: string literal too long when I try to insert the value into the table.
Since it is an old database, I cannot change the datatype of the column. And I see that this column already contains strings which are very long.
I know this can be done using bind variables, but I don't know how to use them in a simple query.
Also, is there any other way to do it?
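For strings up to 32K, a PL/SQL block is enough: every PL/SQL variable is passed to the SQL engine as a bind variable, so the 4000-byte limit on string literals (ORA-01704) does not apply. A sketch, assuming a hypothetical table old_tab with a LONG column long_col:

```sql
-- A PL/SQL variable is sent as a bind, so ORA-01704 does not apply.
DECLARE
  v_text VARCHAR2(32767);
BEGIN
  v_text := RPAD('some very long text ', 20000, 'x');  -- ~20K of data
  INSERT INTO old_tab (long_col) VALUES (v_text);
  COMMIT;
END;
/
```

Beyond 32K, the value has to be streamed from a client API instead, e.g. a JDBC setCharacterStream bind or an OCI piecewise insert.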

Hello,
To preserve formatting in this forum, please enclose your code or output between code tags, like this:
     Your code or output goes here
And when executing your code as a PL/SQL or SQL script, include the following line: set define off;
Regards,
OrionNet

  • Javascript Question - How to Insert a Dynamic/Current Date into the Footer of a Scanned Document

    Hi!
    I am looking for help in finding a Javascript that would allow the insertion of a dynamic/current date into the footer of a scanned document at the time the document is printed.
    I am currently using Adobe Acrobat Professional 8.0 at my work and there has arisen a need to have a dynamic/current date in the footer on scanned documents when they are printed out on different days by different people.
    I am new to the Forum and I am also very new to Javascript and what this entails.
    Thank you in advance for your help and input!
    Tracy

this.addWatermarkFromText({
    cText: util.printd("mmmm dd, yyyy", new Date()),
    nTextAlign: app.constants.align.right,
    nHorizAlign: app.constants.align.right,
    nVertAlign: app.constants.align.bottom,
    nHorizValue: -72, nVertValue: 72
});
Will insert the current Month Day, Year as a text watermark at the bottom of every page of the document, one inch up and one inch in from the right corner.

  • Inserting a long XML document into XMLType

I'm trying to load the following document into an XMLType column in 10.2. I've tried every example I can find, and I can push the data into CLOBs using the Java workaround just fine (http://www.oracle.com/technology/sample_code/tech/java/codesnippet/xmldb/HowToLoadLargeXML.html).
    Can anyone provide a solution or let me know if there is a limitation please?
    Given the table;
    SQL> describe xmltable_1
    Name Null? Type
    DOC_ID NUMBER
    XML_DATA XMLTYPE
    How do I load this data into 'XML_DATA'?
    <?xml version="1.0" encoding="UTF-8"?>
    <metadata>
    <idinfo>
    <citation>
    <citeinfo>
    <origin>Rand McNally and ESRI</origin>
    <pubdate>1996</pubdate>
    <title>ESRI Cities Geodata Set</title>
    <geoform>vector digital data</geoform>
    <onlink>\\OIS23\C$\Files\Working\Metadata\world\cities.shp</onlink>
    </citeinfo>
    </citation>
    <descript>
    <abstract>World Cities contains locations of major cities around the world. The cities include national capitals for each of the countries in World Countries 1998 as well as major population centers and landmark cities. World Cities was derived from ESRI's ArcWorld database and supplemented with other data from the Rand McNally New International Atlas</abstract>
    <purpose>606 points, 4 descriptive fields. Describes major world cities.</purpose>
    </descript>
    <timeperd>
    <timeinfo>
    <sngdate>
    <caldate>1996</caldate>
    </sngdate>
    </timeinfo>
    <current>publication date</current>
    </timeperd>
    <status>
    <progress>Complete</progress>
    <update>None planned</update>
    </status>
    <spdom>
    <bounding>
    <westbc>
    -165.270004</westbc>
    <eastbc>
    177.130188</eastbc>
    <northbc>
    78.199997</northbc>
    <southbc>
    -53.150002</southbc>
    </bounding>
    </spdom>
    <keywords>
    <theme>
    <themekt>city</themekt>
    <themekey>cities</themekey>
    </theme>
    </keywords>
    <accconst>none</accconst>
    <useconst>none</useconst>
    <ptcontac>
    <cntinfo>
    <cntperp>
    <cntper>unknown</cntper>
    <cntorg>unknown</cntorg>
    </cntperp>
    <cntpos>unknown</cntpos>
    <cntvoice>555-1212</cntvoice>
    </cntinfo>
    </ptcontac>
    <datacred>ESRI</datacred>
    <native>Microsoft Windows NT Version 4.0 (Build 1381) Service Pack 6; ESRI ArcCatalog 8.1.0.570</native>
    </idinfo>
    <dataqual>
    <attracc>
    <attraccr>no report available</attraccr>
    <qattracc>
    <attraccv>1000000</attraccv>
    <attracce>no report available</attracce>
    </qattracc>
    </attracc>
    <logic>no report available</logic>
    <complete>no report available</complete>
    <posacc>
    <horizpa>
    <horizpar>no report available</horizpar>
    </horizpa>
    <vertacc>
    <vertaccr>no report available</vertaccr>
    </vertacc>
    </posacc>
    <lineage>
    <srcinfo>
    <srccite>
    <citeinfo>
    <title>ESRI</title>
    </citeinfo>
    </srccite>
    <srcscale>20000000</srcscale>
    <typesrc>CD-ROM</typesrc>
    <srctime>
    <timeinfo>
    <sngdate>
    <caldate>1996</caldate>
    </sngdate>
    </timeinfo>
    <srccurr>publication date</srccurr>
    </srctime>
    <srccontr>no report available</srccontr>
    </srcinfo>
    <procstep>
    <procdesc>no report available</procdesc>
    <procdate>Unknown</procdate>
    </procstep>
    </lineage>
    </dataqual>
    <spdoinfo>
    <direct>Vector</direct>
    <ptvctinf>
    <sdtsterm>
    <sdtstype>Entity point</sdtstype>
    <ptvctcnt>606</ptvctcnt>
    </sdtsterm>
    </ptvctinf>
    </spdoinfo>
    <spref>
    <horizsys>
    <geograph>
    <latres>0.000001</latres>
    <longres>0.000001</longres>
    <geogunit>Decimal degrees</geogunit>
    </geograph>
    <geodetic>
    <horizdn>North American Datum of 1927</horizdn>
    <ellips>Clarke 1866</ellips>
    <semiaxis>6378206.400000</semiaxis>
    <denflat>294.978698</denflat>
    </geodetic>
    </horizsys>
    </spref>
    <eainfo>
    <detailed>
    <enttyp>
    <enttypl>
    cities</enttypl>
    </enttyp>
    <attr>
    <attrlabl>FID</attrlabl>
    <attrdef>Internal feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Sequential unique whole numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>Shape</attrlabl>
    <attrdef>Feature geometry.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Coordinates defining the features.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>NAME</attrlabl>
    <attrdef>The city name. Spellings are based on Board of Geographic Names standards and commercial atlases.</attrdef>
    <attrdefs>ESRI</attrdefs>
    </attr>
    <attr>
    <attrlabl>COUNTRY</attrlabl>
    <attrdef>An abbreviated country name.</attrdef>
    </attr>
    <attr>
    <attrlabl>POPULATION</attrlabl>
    <attrdef>Total population for the entire metropolitan area. Values are from recent census or estimates.</attrdef>
    </attr>
    <attr>
    <attrlabl>CAPITAL</attrlabl>
    <attrdef>Indicates whether a city is a national capital (Y/N).</attrdef>
    </attr>
    </detailed>
    <overview>
    <eaover>none</eaover>
    <eadetcit>none</eadetcit>
    </overview>
    </eainfo>
    <distinfo>
    <stdorder>
    <digform>
    <digtinfo>
    <transize>0.080</transize>
    </digtinfo>
    </digform>
    </stdorder>
    </distinfo>
    <metainfo>
    <metd>20010509</metd>
    <metc>
    <cntinfo>
    <cntorgp>
    <cntorg>ESRI</cntorg>
    <cntper>unknown</cntper>
    </cntorgp>
    <cntaddr>
    <addrtype>unknown</addrtype>
    <city>unknown</city>
    <state>unknown</state>
    <postal>00000</postal>
    </cntaddr>
    <cntvoice>555-1212</cntvoice>
    </cntinfo>
    </metc>
    <metstdn>FGDC Content Standards for Digital Geospatial Metadata</metstdn>
    <metstdv>FGDC-STD-001-1998</metstdv>
    <mettc>local time</mettc>
    <metextns>
    <onlink>http://www.esri.com/metadata/esriprof80.html</onlink>
    <metprof>ESRI Metadata Profile</metprof>
    </metextns>
    </metainfo>
    </metadata>
     <vertacce>Vertical Positional Accuracy is expressed in meters. Vertical accuracy figures were developed by comparing elevation contour locations on 1:24,000 scale maps to elevation values at the same location within the digital database. Some manual interpolation was necessary to complete this test. The analysis results are expressed as linear error at a 90% confidence interval.</vertacce>
    </qvertpa>
    </vertacc>
    </posacc>
    <lineage>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>National Imagery and Mapping Agency</origin>
    <pubdate>1994</pubdate>
    <title>Operational Navigational Chart</title>
    <geoform>map</geoform>
    <pubinfo>
    <pubplace>St.Louis, MO</pubplace>
    <publish>National Imagery and Mapping Agency</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <srcscale>1000000</srcscale>
    <typesrc>stable-base material</typesrc>
    <srctime>
    <timeinfo>
    <rngdates>
    <begdate>1974</begdate>
    <enddate>1994</enddate>
    </rngdates>
    </timeinfo>
    <srccurr>Publication dates</srccurr>
    </srctime>
    <srccitea>ONC</srccitea>
    <srccontr>All information found on the source with the exception of aeronautical data</srccontr>
    </srcinfo>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>National Imagery and Mapping Agency</origin>
    <pubdate>199406</pubdate>
    <title>Digital Aeronautical Flight Information File</title>
    <geoform>model</geoform>
    <pubinfo>
    <pubplace>St. Louis, MO</pubplace>
    <publish>National Imagery and Mapping Agency</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <typesrc>magnetic tape</typesrc>
    <srctime>
    <timeinfo>
    <sngdate>
    <caldate>1994</caldate>
    </sngdate>
    </timeinfo>
    <srccurr>Publication date</srccurr>
    </srctime>
    <srccitea>DAFIF</srccitea>
    <srccontr>Airport records (name, International Civil Aviation Organization, position, elevation, and type)</srccontr>
    </srcinfo>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>Defense Mapping Agency</origin>
    <pubdate>1994</pubdate>
    <title>Jet Navigational Chart</title>
    <geoform>map</geoform>
    <pubinfo>
    <pubplace>St.Louis, MO</pubplace>
    <publish>Defense Mapping Agency</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <srcscale>2,000,000</srcscale>
    <typesrc>stable-base material</typesrc>
    <srctime>
    <timeinfo>
    <rngdates>
    <begdate>1974</begdate>
    <enddate>1991</enddate>
    </rngdates>
    </timeinfo>
    <srccurr>Publication date</srccurr>
    </srctime>
    <srccitea>JNC</srccitea>
    <srccontr>All information found on the source with the exception of aeronautical data. JNCs were used as source for the Antartica region only.</srccontr>
    </srcinfo>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>USGS EROS Data Center</origin>
    <pubdate></pubdate>
    <title>Advance Very High Resolution Radiometer</title>
    <geoform>remote-sensing image</geoform>
    <pubinfo>
    <pubplace>Sioux Falls, SD</pubplace>
    <publish>EROS Data Center</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <srcscale>1000000</srcscale>
    <typesrc>magnetic tape</typesrc>
    <srctime>
    <timeinfo>
    <rngdates>
    <begdate>199003</begdate>
    <enddate>199011</enddate>
    </rngdates>
    </timeinfo>
    <srccurr>Publication date</srccurr>
    </srctime>
    <srccitea>AVHRR</srccitea>
    <srccontr>6 vegetation types covering the continental US and Canada</srccontr>
    </srcinfo>
    <procstep>
    <procdesc>For the first edition DCW, stable-based positives were produced from the original reproduction negatives (up to 35 per ONC sheet). These were digitized either through a scanning-raster to vector conversion or hand digitized into vector form. The vector data was then tagged with attribute information using ARC-INFO software. Transformation to geographic coordinates was performed using the projection graticules for each sheet. Digital information was edge matched between sheets to create large regional datasets. These were then subdivided into 5 x 5 degree tiles and converted from ARC/INFO to VPF. The data was then pre-mastered for CD-ROM. QC was performed by a separate group for each step in the production process.</procdesc>
    <procdate>199112</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Environmental Systems Research Institute</cntorg>
    </cntorgp>
    <cntpos>Applications Division</cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>380 New York St.</address>
    <city>Redlands</city>
    <state>CA</state>
    <postal>92373</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>909-793-2853</cntvoice>
    <cntfax>909-793-5953</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdate>199404</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geonex</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>8950 North 9th Ave.</address>
    <city>St. Petersburg</city>
    <state>FL</state>
    <postal>33702</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>(813)578-0100</cntvoice>
    <cntfax>(813)577-6946</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Transferred digitally directly into the VPF files.</procdesc>
    <procdate>199408</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geonex</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>8950 North 9th Ave.</address>
    <city>St. Petersburg</city>
    <state>FL</state>
    <postal>33702</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>813-578-0100</cntvoice>
    <cntfax>813-577-6946</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Stable-based positives were produced from the original reproduction negatives (up to 35 per ONC sheet). These were digitized either through a scanning-raster to vector conversion or hand digitized into vector form. The vector data was then tagged with attribute information using ARC-INFO software. Transformation to geographic coordinates was performed using the projection graticules for each sheet. Digital information was edge matched between sheets to create large regional datasets. These were then subdivided into 5 x 5 degree tiles and converted from ARC/INFO to VPF. The data was then pre-mastered for CD-ROM. QC was performed by a separate group for each step in the production process.</procdesc>
    <procdate>199112</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Environmental Systems Research Institute</cntorg>
    </cntorgp>
    <cntpos>Applications Division</cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>380 New York St.</address>
    <city>Redlands</city>
    <state>CA</state>
    <postal>92373</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>909-793-2853</cntvoice>
    <cntfax>909-793-5953</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Daily AVHRR images were averaged for two week time periods over the entire US growing season. These averaged images, their rates of change, elevation information, and other data were used to produce a single land classification image of the contental US. The VMap-0 data set extended this coverage over the Canadian land mass, however vegetation classification was further subdivided into nine vegetation types.</procdesc>
    <procdate>199402</procdate>
    <srcprod>EROS data</srcprod>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>USGS Eros Data Center</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address></address>
    <city>Sioux Falls</city>
    <state>SD</state>
    <postal></postal>
    <country>US</country>
    </cntaddr>
    <cntvoice></cntvoice>
    <cntfax></cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>The Eros data (raster files) were converted to vector polygon, splined (remove stairstepping), thinned (all ploygons under 2km2 were deleted), and tied to existing DCW polygons (water bodies, built-up areas). The resulting file was tiled and converted to a VPF Vegetation coverage for use in the DCW. All processing was performed using ARC-INFO software.</procdesc>
    <procdate>199412</procdate>
    <srcprod>VMap-0 Vegetation Coverage</srcprod>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geonex</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>8950 North 9th Ave.</address>
    <city>St. Petersburg</city>
    <state>FL</state>
    <postal>33702</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>813-578-0100</cntvoice>
    <cntfax>813-577-6946</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Data was translated from VPF format to ArcInfo Coverage format. The coverages were then loaded into a seamless ArcSDE layer.</procdesc>
    <procdate>02152001</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geodesy Team, Harvard University</cntorg>
    </cntorgp>
    <cntemail>[email protected]</cntemail>
    </cntinfo>
    </proccont>
    </procstep>
    </lineage>
    </dataqual>
    <spdoinfo>
    <direct>Vector</direct>
    <ptvctinf>
    <sdtsterm>
    <sdtstype>Complete chain</sdtstype>
    </sdtsterm>
    <sdtsterm>
    <sdtstype>Label point</sdtstype>
    </sdtsterm>
    <sdtsterm>
    <sdtstype>GT-polygon composed of chains</sdtstype>
    </sdtsterm>
    <sdtsterm>
    <sdtstype>Point</sdtstype>
    </sdtsterm>
    <vpfterm>
    <vpflevel>3</vpflevel>
    <vpfinfo>
    <vpftype>Node</vpftype>
    </vpfinfo>
    <vpfinfo>
    <vpftype>Edge</vpftype>
    </vpfinfo>
    <vpfinfo>
    <vpftype>Face</vpftype>
    </vpfinfo>
    </vpfterm>
    </ptvctinf>
    </spdoinfo>
    <spref>
    <horizsys>
    <geograph>
    <latres>0.000000</latres>
    <longres>0.000000</longres>
    <geogunit>Decimal degrees</geogunit>
    </geograph>
    <geodetic>
    <horizdn>D_WGS_1984</horizdn>
    <ellips>WGS_1984</ellips>
    <semiaxis>6378137.000000</semiaxis>
    <denflat>298.257224</denflat>
    </geodetic>
    </horizsys>
    <vertdef>
    <altsys>
    <altdatum>Mean Sea Level</altdatum>
    <altunits>1.0</altunits>
    </altsys>
    </vertdef>
    </spref>
    <eainfo>
    <detailed>
    <enttyp>
    <enttypl>
    lc.pat</enttypl>
    </enttyp>
    <attr>
    <attrlabl>FID</attrlabl>
    <attrdef>Internal feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Sequential unique whole numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>Shape</attrlabl>
    <attrdef>Feature geometry.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Coordinates defining the features.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>AREA</attrlabl>
    <attrdef>Area of feature in internal units squared.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Positive real numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>PERIMETER</attrlabl>
    <attrdef>Perimeter of feature in internal units.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Positive real numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>LC#</attrlabl>
    <attrdef>Internal feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Sequential unique whole numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>LC-ID</attrlabl>
    <attrdef>User-defined feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    </attr>
    <attr>
    <attrlabl>LCAREA.AFT_ID</attrlabl>
    </attr>
    <attr>
    <attrlabl>LCPYTYPE</attrlabl>
    <attrdef>Land cover poygon type</attrdef>
    <attrdefs>NIMA</attrdefs>
    <attrdomv>
    <edom>
    <edomv>1</edomv>
    <edomvd>Rice Field</edomvd>
    </edom>
    <edom>
    <edomv>2</edomv>
    <edomvd>Cranberry bog</edomvd>
    </edom>
    <edom>
    <edomv>3</edomv>
    <edomvd>Cultivated area, garden</edomvd>
    </edom>
    <edom>
    <edomv>4</edomv>
    <edomvd>Peat cuttings</edomvd>
    </edom>
    <edom>
    <edomv>5</edomv>
    <edomvd>Salt pan</edomvd>
    </edom>
    <edom>
    <edomv>6</edomv>
    <edomvd>Fish pond or hatchery</edomvd>
    </edom>
    <edom>
    <edomv>7</edomv>
    <edomvd>Quarry, strip mine, mine dump, blasting area</edomvd>
    </edom>
    <edom>
    <edomv>8</edomv>
    <edomvd>Oil or gas</edomvd>
    </edom>
    <edom>
    <edomv>10</edomv>
    <edomvd>Lava flow</edomvd>
    </edom>
    <edom>
    <edomv>11</edomv>
    <edomvd>Distorted surface area</edomvd>
    </edom>
    <edom>
    <edomv>12</edomv>
    <edomvd>Unconsolidated material (sand or gravel, glacial moraine)</edomvd>
    </edom>
    <edom>
    <edomv>13</edomv>
    <edomvd>Natural landmark area</edomvd>
    </edom>
    <edom>
    <edomv>14</edomv>
    <edomvd>Inundated area</edomvd>
    </edom>
    <edom>
    <edomv>15</edomv>
    <edomvd>Undifferentiated wetlands</edomvd>
    </edom>
    <edom>
    <edomv>99</edomv>
    <edomvd>None</edomvd>
    </edom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>TILE_ID</attrlabl>
    <attrdef>VPF Format tile ID</attrdef>
    <attrdefs>NIMA</attrdefs>
    </attr>
    <attr>
    <attrlabl>FAC_ID</attrlabl>
    </attr>
    </detailed>
    <overview>
    <eaover>The DCW used a product-specific attribute coding system that is composed of TYPE and STATUS designators for area, line, and point features; and LEVEL and SYMBOL designators for text features. The TYPE attribute specifies what the feature is, while the STATUS attribute specifies the current condition of the feature. Some features require both a TYPE and STATUS code to uniquely identify their characteristics. In order to uniquely identify each geographic attribute in the DCW, the TYPE and STATUS attribute code names are preceded by the two letter coverage abbreviation and a two letter abbreviation for the type of graphic primitive present. The DCW Type/Status codes were mapped into the FACC coding scheme. A full description of FACC may be found in Digital Geographic Information Exchange Standard Edition 1.2, January 1994.</eaover>
    <eadetcit>Entities (features) and Attributes for DCW are fully described in: Department of Defense, 1992, Military Specification Digital Chart of the World (MIL-D-89009): Philadelphia, Department of Defense, Defense Printing Service Detachment Office.</eadetcit>
    </overview>
    </eainfo>
    <distinfo>
    <distrib>
    <cntinfo>
    <cntorgp>
    <cntorg>NIMA</cntorg>
    </cntorgp>
    <cntpos>ATTN: CC, MS D-16</cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>6001 MacArthur Blvd.</address>
    <city>Bethesda</city>
    <state>MD</state>
    <postal>20816-5001</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>301-227-2495</cntvoice>
    <cntfax>301-227-2498</cntfax>
    </cntinfo>
    </distrib>
    <distliab>None</distliab>
    <stdorder>
    <digform>
    <digtinfo>
    <formname>VPF</formname>
    <formverd>19930930</formverd>
    <formspec>Military Standard Vector Product Format (MIL-STD-2407). The current version of this document is dated 28 June 1996. Edition 3 of VMap-0 conforms to a previous version of the VPF Standard as noted. Future versions of VMap-0 will conform to the current version of VPF Standard.</formspec>
    <transize>0.172</transize>
    </digtinfo>
    <digtopt>
    <offoptn>
    <offmedia>CD-ROM</offmedia>
    <recfmt>ISO 9660</recfmt>
    </offoptn>
    </digtopt>
    </digform>
    <fees>Not Applicable</fees>
    </stdorder>
    </distinfo>
    <distinfo>
    <distrib>
    <cntinfo>
    <cntorgp>
    <cntorg>USGS Map Sales</cntorg>
    </cntorgp>
    <cntaddr>
    <addrtype>mailing address</addrtype>
    <address>Box 25286</address>
    <city>Denver</city>
    <state>CO</state>
    <postal>80225</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>303-236-7477</cntvoice>
    <cntfax>303-236-1972</cntfax>
    </cntinfo>
    </distrib>
    <distliab>None</distliab>
    <stdorder>
    <digform>
    <digtinfo>
    <transize>0.172</transize>
    </digtinfo>
    </digform>
    <fees>$82.50 per four disk set</fees>
    <ordering>For General Public: Payment (check, money order, purchase order, or Government account) must accompany order.
    Make all drafts payable to Dept. of the Interior- US Geological Survey.
    To provide a general idea of content, a sample data set is available from the TMPO Home Page at:</ordering>
    </stdorder>
    </distinfo>
    <distinfo>
    <distrib>
    <cntinfo>
    <cntorgp>
    <cntorg>Geodesy Team, Harvard University</cntorg>
    <cntper>Geodesy Team</cntper>
    </cntorgp>
    <cntemail>[email protected]</cntemail>
    </cntinfo>
    </distrib>
    <resdesc>Geodesy layer</resdesc>
    <distliab>None</distliab>
    <stdorder>
    <digform>
    <digtinfo>
    <formname>SHP</formname>
    <transize>0.172</transize>
    </digtinfo>
    <digtopt>
    <onlinopt>
    <computer>
    <networka>
    <networkr>geodesy.harvard.edu</networkr>
    </networka>
    </computer>
    </onlinopt>
    </digtopt>
    </digform>
    <fees>none</fees>
    </stdorder>
    <availabl>
    <timeinfo>
    <sngdate>
    <caldate>1992</caldate>
    </sngdate>
    </timeinfo>
    </availabl>
    </distinfo>
    <metainfo>
    <metd>20010226</metd>
    <metc>
    <cntinfo>
    <cntorgp>
    <cntorg>Geodesy Team</cntorg>
    <cntper>REQUIRED: The person responsible for the metadata information.</cntper>
    </cntorgp>
    <cntvoice>REQUIRED: The telephone number by which individuals can speak to the organization or individual.</cntvoice>
    <cntemail>[email protected]</cntemail>
    </cntinfo>
    </metc>
    <metstdn>FGDC Content Standards for Digital Geospatial Metadata</metstdn>
    <metstdv>FGDC-STD-001-1998</metstdv>
    <mettc>local time</mettc>
    <metextns>
    <onlink>http://www.esri.com/metadata/esriprof80.html</onlink>
    <metprof>ESRI Metadata Profile</metprof>
    </metextns>
    </metainfo>
    </metadata>

    Have you tried the directory and bfile methods? Here is the example for that in the Oracle XML Developer's Guide:
    CREATE DIRECTORY xmldir AS 'path_to_folder_containing_XML_file';
    Example 3-3 Inserting XML Content into an XMLType Table
    INSERT INTO mytable2 VALUES (XMLType(bfilename('XMLDIR', 'purchaseOrder.xml'),
    nls_charset_id('AL32UTF8')));
    1 row created.
    The value passed to nls_charset_id() indicates that the encoding for the file to be read is UTF-8.
    ben
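The same construct also works when the XMLType is a column rather than an object table. A minimal sketch applying the reply's approach to the table and directory names from the question above (assumes the XML_DIR directory object exists and points at the folder holding MOL.XML):

```sql
-- Sketch only: xml_demo and XML_DIR are the names used in the question.
CREATE TABLE xml_demo (xml_data XMLTYPE);

INSERT INTO xml_demo (xml_data)
VALUES (XMLType(bfilename('XML_DIR', 'MOL.XML'),
                nls_charset_id('AL32UTF8')));
```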

  • Loading data into XMLType column using dbms_xmlsave.insertxml get ORA-29532

    The following simple test case succeeded in 9.2.0.1 but failed in 9.2.0.2.
    CREATE OR REPLACE procedure InsertXML(xmlDoc IN VARCHAR2, tableName IN VARCHAR2) is
    insCtx DBMS_XMLSave.ctxType;
    rows number;
    begin
    insCtx := DBMS_XMLSave.newContext(tableName); -- get the context handle
    rows := DBMS_XMLSave.insertXML(insCtx,xmlDoc); -- this inserts the document
    dbms_output.put_line(to_char(rows) || ' rows inserted');
    DBMS_XMLSave.closeContext(insCtx); -- this closes the handle
    end;
    CREATE TABLE XMLtable
    (column1 xmltype)
    exec insertxml('<?xml version = "1.0"?><ROWSET><ROW><COLUMN1><TEST>HELLO</TEST></COLUMN1></ROW></ROWSET>', 'XMLTABLE');

    Hi,
    For your XML file I think you just need to enclose the XML elements in ROWSET and ROW tags, so the XML should look like:
    <ROWSET>
    <ROW>
    <DEPT>
    </DEPT>
    </ROW>
    </ROWSET>
    and just pass it as a CLOB to the dbms_xmlsave.insertXML proc.
    I hope that works.
    I am also trying to insert an XML file, but with a somewhat more complex structure that has multiple nested elements.
    I am not sure how to transform the external XML file to wrap it in ROWSET/ROW using XSLT. It's mandatory to use ROWSET/ROW tags to be able to insert into Oracle tables, and I am facing this problem right now. I am using object views to accomplish the purpose but still need to figure out how to apply a stylesheet to the incoming XML file.
    If you come to know of any way, please do let me know as well.
    Thanks
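Building on that reply, here is a hedged PL/SQL sketch of the XSLT wrapping idea: a generic stylesheet copies the children of the document root into ROWSET/ROW before the result goes to DBMS_XMLSave. The sample document, the stylesheet, and the target table name DEPT are illustrative, not from the post:

```sql
DECLARE
  ctx  DBMS_XMLSave.ctxType;
  doc  XMLType := XMLType('<DEPT><DEPTNO>10</DEPTNO><DNAME>ACCOUNTING</DNAME></DEPT>');
  -- Generic stylesheet: wrap the root element's children in ROWSET/ROW
  xsl  XMLType := XMLType(
    '<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
       <xsl:template match="/*">
         <ROWSET><ROW><xsl:copy-of select="*"/></ROW></ROWSET>
       </xsl:template>
     </xsl:stylesheet>');
  rows NUMBER;
BEGIN
  ctx  := DBMS_XMLSave.newContext('DEPT');  -- hypothetical target table
  rows := DBMS_XMLSave.insertXML(ctx, doc.transform(xsl).getClobVal());
  DBMS_XMLSave.closeContext(ctx);
  dbms_output.put_line(rows || ' rows inserted');
END;
/
```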

  • How to insert Serialised Object(XML DOM) into Oracle Table(as BLOB or CLOB)

    We need urgent help. How can we insert and retrieve an XML document DOM object into and from an Oracle table? We used a BLOB to insert the DOM object and the insert works fine, but we get an error when we retrieve the object. Could anyone tell us what the exact problem is and how we can resolve it? If anyone knows or has used this kind of operation, please let us know immediately.
    Thanks in advance.

    Please repost your question in the appropriate XML forum, http://forums.oracle.com/forums/index.jsp?cat=51

  • How to display appcmd /config xml data into a single table

    Hi,
    I extracted my IIS sites using appcmd
    appcmd /site /config /xml > c:\temp\iisconfig.xml
    after which I would like to grab the data from this XML and export it into an Excel file
    [xml]$iisconfig = get-content c:\temp\iisconfig.xml
    I would like to get the Site Name, Bindings, Physical Directory Path and AppPool from this xml file and output them to an Excel file
    #Physical Path Directory
    $iisPhysicalPath = $iisconfig.appcmd.site.site.application.VirtualDirectory.PhysicalPath
    #Application Pool
    $iisAppPool = $iisconfig.appcmd.site.site.application.applicationPool
    #Site Details (Site Name, Binding)
    $iisSite = $iisconfig.appcmd.site
    There isn't a single command that outputs all of this into a table. How do I combine all these pieces of code and display the result in a single table?
    Jeron

    Try it like this:
    $sites=[xml](c:\windows\system32\inetsrv\appcmd.exe list site  /xml)
    $sites.appcmd.site
    \_(ツ)_/
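To combine the pieces into one table, one option is to project each SITE node into a custom object and export it. A sketch only: the property names follow the appcmd XML layout the poster navigated above and may need adjusting to the actual file:

```powershell
# Hypothetical property names based on appcmd's /config /xml output
[xml]$iisconfig = Get-Content c:\temp\iisconfig.xml
$iisconfig.appcmd.SITE | ForEach-Object {
    [pscustomobject]@{
        SiteName     = $_.'SITE.NAME'
        Bindings     = $_.bindings
        PhysicalPath = $_.site.application.virtualDirectory.physicalPath
        AppPool      = $_.site.application.applicationPool
    }
} | Export-Csv c:\temp\iissites.csv -NoTypeInformation
```

Drop the Export-Csv stage to see the combined table on screen (or pipe to Format-Table).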

  • Illegal combination when importing xml file into xmltype column

    I have the following control file.
    LOAD DATA
    CHARACTERSET UTF8
    INFILE *
    INTO TABLE IMPORTRAWXML TRUNCATE
    (
    SITEID constant 0
    ,VENDORID constant 17
    ,SITEFORMATID constant 2
    ,"\\plutonium\outcomes\AHA GWTG-Outpatient\Programs\DataTransfer\LoadTest\V17_standard_test.xml" filler char(1000)
    ,RAWDATA LOBFILE ("\\plutonium\outcomes\AHA GWTG-Outpatient\Programs\DataTransfer\LoadTest\V17_standard_test.xml")
    TERMINATED BY EOF
    )
    When I run it using the sqlldr command line I get the following error:
    SQL*Loader-350: Syntax error at line 1.
    Illegal combination of non-alphanumeric characters
    <?xml version="1.0" encoding="ISO-8859-1"?>
    Does anyone have any idea what I am doing wrong here? If I remove the fully resolved path (both the control file and the xml file are in the same folder) it tells me it can't find the file to load.
    HELP!!!!
    Thanks,
    Eva

    evaleah wrote:
    I have made sure all my home settings are correct in my registry editor and they are. Another thing to note is that the control file I am using works 100% perfectly, beautifully, when called from Toad for Oracle. It is when called from the command line utility that it fails. Is there any way to determine what the difference could be?
    So we know that it can work (Toad works), but it doesn't yet work in a "cmd" environment.
    Just as any other program, Toad is also a client and uses NLS and other environment settings. Maybe these are stored in the registry, maybe they are being set by Toad by reading a configuration file while it is started or while running.
    As said, on Windows it's tricky...
    Setting properties in the registry does not mean that they are the same in a "cmd" window, or even set there at all.
    If you execute / run "cmd", then the "set" statement/command will output the environment settings that will be used during the lifetime of that "cmd" session:
    C:\> set
    In my laptop environment (Windows 7, 64-bit) it shows the following:
    C:\>set
    ALLUSERSPROFILE=C:\ProgramData
    APPDATA=C:\Users\marco\AppData\Roaming
    CommonProgramFiles=C:\Program Files\Common Files
    CommonProgramFiles(x86)=C:\Program Files (x86)\Common Files
    CommonProgramW6432=C:\Program Files\Common Files
    COMPUTERNAME=00-00-000
    ComSpec=C:\Windows\system32\cmd.exe
    DEFLOGDIR=C:\ProgramData\McAfee\DesktopProtection
    FP_NO_HOST_CHECK=NO
    HOMEDRIVE=C:
    HOMEPATH=\Users\marco
    LOCALAPPDATA=C:\Users\marco\AppData\Local
    LOGONSERVER=\\AMISNT
    MpConfig_ProductAppDataPath=C:\ProgramData\Microsoft\Windows Defender
    MpConfig_ProductCodeName=AntiSpyware
    MpConfig_ProductPath=C:\Program Files\Windows Defender
    MpConfig_ProductUserAppDataPath=C:\Users\marco\AppData\Local\Microsoft\Windows
    Defender
    MpConfig_ReportingGUID=CA08B82B-EF0A-4107-89D8-ED5BB37E7515
    NUMBER_OF_PROCESSORS=2
    OS=Windows_NT
    Path=C:\oracle\product\11.2.0\dbhome_1\bin;C:\Windows\system32;C:\Windows;C:\Win
    dows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\
    PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC
    PERL5LIB=c:\oracle\product\10.2.0\db_1\perl\5.8.3\lib\MSWin32-x64;c:\oracle\prod
    uct\10.2.0\db_1\perl\5.8.3\lib;c:\oracle\product\10.2.0\db_1\perl\site\5.8.3;c:\
    oracle\product\10.2.0\db_1\perl\site\5.8.3\lib;c:\oracle\product\10.2.0\db_1\sys
    man\admin\scripts;
    PROCESSOR_ARCHITECTURE=AMD64
    PROCESSOR_IDENTIFIER=Intel64 Family 6 Model 23 Stepping 10, GenuineIntel
    PROCESSOR_LEVEL=6
    PROCESSOR_REVISION=170a
    ProgramData=C:\ProgramData
    ProgramFiles=C:\Program Files
    ProgramFiles(x86)=C:\Program Files (x86)
    ProgramW6432=C:\Program Files
    PROMPT=$P$G
    PSModulePath=C:\Windows\system32\WindowsPowerShell\v1.0\Modules\
    PUBLIC=C:\Users\Public
    SESSIONNAME=Console
    SystemDrive=C:
    SystemRoot=C:\Windows
    TEMP=C:\Users\marco\AppData\Local\Temp
    TMP=C:\Users\marco\AppData\Local\Temp
    USERDNSDOMAIN=AMIS
    USERDOMAIN=AMIS
    USERNAME=marco
    USERPROFILE=C:\Users\marco
    VBOX_INSTALL_PATH=C:\Program Files\Sun\VirtualBox\
    VSEDEFLOGDIR=C:\ProgramData\McAfee\DesktopProtection
    windir=C:\Windows
    C:\>
    The only thing that identifies that I have Oracle installed is the %PATH% variable and %PERL5LIB%. From the path setting you can also deduce that I have both Oracle 11 and Oracle 10 software installed. So when I execute "sqlldr", what NLS settings will it use, and which tnsnames.ora alias, for example, to connect to the database...?
    You can you do it and see what happens...
    C:\>sqlldr
    SQL*Loader: Release 11.2.0.1.0 - Production on Wed May 12 20:36:25 2010
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Usage: SQLLDR keyword=value [,keyword=value,...]
    Valid Keywords:
    ...
    So it will pick the first "sqlldr" found in the %PATH% environment setting. But what about NLS...? As said, to be absolutely sure you will have to set it in your environment.
    C:/> set ORACLE_HOME="C:\oracle\product\10.2.0\db_1"
    C:\> set
    Path=C:\oracle\product\11.2.0\dbhome_1\bin;C:\Windows\system32;C:\Windows;C:\Win
    dows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\
    PATHEXT=.COM;.EXE;.BAT;.CMD;.VBS;.VBE;.JS;.JSE;.WSF;.WSH;.MSC
    PERL5LIB=c:\oracle\product\10.2.0\db_1\perl\5.8.3\lib\MSWin32-x64;c:\oracle\prod
    uct\10.2.0\db_1\perl\5.8.3\lib;c:\oracle\product\10.2.0\db_1\perl\site\5.8.3;c:\
    oracle\product\10.2.0\db_1\perl\site\5.8.3\lib;c:\oracle\product\10.2.0\db_1\sys
    man\admin\scripts;
    ORACLE_HOME="C:\oracle\product\10.2.0\db_1"
    C:\> echo %ORACLE_HOME%
    "C:\oracle\product\10.2.0\db_1"Because if I enter "sqlldr" it will pick the executable from the 11.2 install, but the ORACLE_HOME is set to the wrong environment
    Executing sqlldr now will give me an error
    C:\>sqlldr
    Message 2100 not found; No message file for product=RDBMS, facility=ULMessage 21
    00 not found; No message file for product=RDBMS, facility=UL
    C:\>
    Another thing you can notice now is that from that output alone you can't deduce the Oracle "sqlldr" version anymore. Setting ORACLE_HOME to either 10.2 or 11.2 will cause "sqlldr" to execute normally (at least that's how it looks), BUT with the Oracle 10.2 setting it will use the wrong message files etc., not the files the 11.2 "sqlldr" is shipped with. So you can (and most of the time will) get strange errors.
    C:\> set ORACLE_HOME=C:\oracle\product\10.2.0\db_1
    C:\>sqlldr
    SQL*Loader: Release 11.2.0.1.0 - Production on Wed May 12 20:45:00 2010
    C:\>set ORACLE_HOME=C:\oracle\product\11.2.0\dbhome_1
    C:\>sqlldr
    SQL*Loader: Release 11.2.0.1.0 - Production on Wed May 12 20:49:50 2010
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    Usage: SQLLDR keyword=value [,keyword=value,...]
    ...
    In the registry you can find that a lot of those variables are set: ORACLE_HOME, ORACLE_BASE, SQLPATH, maybe TNS_ADMIN, ORACLE_SID, NLS_LANG. You can find the variables in the registry in two places. The most common one is the system-wide environment settings under //HKEY_LOCAL_MACHINE/SOFTWARE/ORACLE and, in my case for 11.2, under //HKEY_LOCAL_MACHINE/SOFTWARE/ORACLE/KEY_OraDb11g_Home1.
    You could override this so it is active only for "my user session", my login session as account "marco", under //HKEY_CURRENT_USER/Software/Oracle, but hardly anyone using Windows does this.
    Besides the "cmd" and registry environments, there is one more place where you can set these parameters. In Windows go to "Start", click "Control Panel", click "System", click the "Advanced" tab and then click the "Environment Variables" button at the bottom. Here you can see the distinction between user and system-wide variables as well, and here you can also set NLS_LANG, ORACLE_HOME, etc. If I am not mistaken, these are the default values used in a fresh "cmd" window session. But they can and will interfere with programs you start by clicking them, for example a Java program like SQL Developer (and/or Toad). If those values are not overruled by the program, for example via its own variables from a config file, those settings from the "System" / Control Panel item will be used. If it is the wrong mix, you will encounter strange issues.
    Setting ORACLE_HOME and ORACLE_BASE affects a lot of derived settings, for example the default place where SQL*Net drivers, tnsnames aliases, etc. are looked up. For example, setting
    C:\> set ORACLE_HOME=C:\oracle\product\10.2.0\db_1
    will result in the tnsnames aliases from
    C:\oracle\product\10.2.0\db_1\NETWORK\admin
    being used, and NOT the ones maybe needed from
    C:\oracle\product\11.2.0\dbhome_1\NETWORK\admin
    You can override this behavior by setting the TNS_ADMIN variable. By default (in its most explicit form, for example on Linux and Unix), the following rule is used by Oracle:
    1) .tnsnames.ora in the home directory of the user
    2) standard default: $ORACLE_HOME/network/admin
    3) behavior can be overruled via setting $TNS_ADMIN
    The fact that you are able to execute it correctly via Toad proves that it can be done. But Toad uses in your session "SQL*Net" or JDBC or ODBC or ADO via its own configuration environment settings, which are being set to which values...?
    Both Oracle and Toad use the OSI model (http://en.wikipedia.org/wiki/OSI_model); both have to follow the same rules. Not that much has changed over the years: character set conversions are done in the "two task common" layer of the data transport on either side (client/server) when data travels between a client and a server. And don't forget a database can also be a client, for instance when database links are being used. This is one of the reasons to read old manuals, because the basics are still perfectly explained there: http://download.oracle.com/docs/cd/A57673_01/DOC/net/doc/NWUS233/ch2.htm#twotask (Oracle 7.3.4 Networking Manual).
    This long story is just to show you that you have more control if you set variables explicitly in a "cmd" window (and/or in a Linux/Unix shell, as long as the session isn't "forked"), BUT you will have to be precise. Check the environment and set those environment variables that are used by "sqlldr" (there are probably more of them than you realized; for example, the SQLPATH variable is the default directory where SQL*Plus looks for and saves its "SQL" and spool "LIS" files).
    HTH

  • How to insert more than 32k xml data into oracle clob column

    How can I insert more than 32K of XML data into an Oracle CLOB column? The XML data is coming from a Java front end.
    If we cannot use a CLOB, then what other options are available?

    Are you facing any issue with my code?
    A string literal size error will come when you try to insert the full xml in string format.
    public static boolean writeCLOBData(String tableName, String id, String columnName, String strContents) throws DataAccessException {
      boolean isUpdated = false;
      Connection connection = null;
      PreparedStatement stmt = null;
      try {
        connection = ConnectionManager.getConnection ();
        String sqlQuery = "UPDATE " + tableName + " SET " + columnName + " = ? WHERE ID = " + id;
        stmt = connection.prepareStatement ( sqlQuery );
        // stream the contents via a Reader instead of embedding a huge string literal
        Reader reader = new StringReader ( strContents );
        stmt.setClob ( 1, reader );
        // executeUpdate returns the number of rows updated
        isUpdated = stmt.executeUpdate () > 0;
      } catch ( SQLException e ) {
        e.printStackTrace ();
      } finally {
        try { if ( stmt != null ) stmt.close (); } catch ( SQLException ignore ) { }
        try { if ( connection != null ) connection.close (); } catch ( SQLException ignore ) { }
      }
      return isUpdated;
    }
    Try this JAVA code.

  • Insert XML Data Into An Existing PDF Form

    I am working on an application, written with XHTML and JavaScript, and running on AIR, so it is a desktop application.
    Users enter data into an XHTML form and upon submission I create an XML file of the data using JavaScript.
    At any later time users will be able to open the XML file from the application. I will use JavaScript, again, to read the XML and fill in an XHTML form. But at this point I will provide a button for users to generate a PDF with the data. I would then like to insert the XML data into the appropriate fields of the existing PDF form. I would like to continue to do this from within the AIR application using JavaScript.
    Is this possible?
    What version of Acrobat would I need as the developer? Professional or Standard or Other or None?
    What version of Acrobat would the end users need? Professional or Standard or Other or None?
    Can this all be done using only JavaScript and the Acrobat SDK? Does using AIR offer any help (all Adobe products)?
    When the end user clicks the generate PDF button would Acrobat have to open? Can this be done without the user seeing Acrobat open? Either way is ok, I would just like the user to see an alert saying that the file has be generated and point them to its location on the local machine. But again, this is not a requirement.
    Thanks in advance.
    Not asking for source code here, just if it's possible. :)
    I'll figure out the how.

    You have two choices
    1) You can use JavaScript in your AIR application to communicate with the JavaScript in the PDF to fill in the form directly. This can all be done inside of your AIR app. This is certainly the best route to go and there is (IIRC) a sample in the AIR SDK.
    2) You can create an XFDF file from your XML data and then have Acrobat/Reader open the XFDF file to fill in the data.
    Both methods will work with Acrobat and Reader - HOWEVER, Reader users won't be able to save the PDF unless it has been Reader Enabled.
    Leonard
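For option 2, an XFDF file is plain XML. A minimal hedged sketch (the field names and the PDF path are made up for illustration; they must match the form's actual field names):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<xfdf xmlns="http://ns.adobe.com/xfdf/" xml:space="preserve">
  <fields>
    <field name="firstName"><value>Jane</value></field>
    <field name="lastName"><value>Doe</value></field>
  </fields>
  <!-- points Acrobat/Reader at the form to fill -->
  <f href="file:///C:/forms/myform.pdf"/>
</xfdf>
```

Opening this file in Acrobat/Reader loads the referenced PDF and populates the listed fields.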

  • How to Dump XML data into table

    Hi All,
    We have one schema on a server.
    We have created the same schema with only table structures on another new server.
    Now we got the data from the original schema as XML files corresponding to each table.
    For example, we have an Employee table newly created on the new server, and an XML file which has the Employee data. We have to dump this XML data into the new EMPLOYEE table.
    We have to dump each of these XML files into its corresponding table.
    How can we achieve this?
    Please help me..
    Thanks in advance

    Look at this example:
    Assume you have the following XML in a tempxml table (tempxml could be an external table):
    create table tempxml (xml xmltype);
    insert into tempxml values (xmltype(
    '<?xml version="1.0"?>
    <ROWSET>
    <ROW>
      <EMPNO>7369</EMPNO>
      <ENAME>SMITH</ENAME>
      <JOB>CLERK</JOB>
      <MGR>7902</MGR>
      <HIREDATE>17-DEC-80</HIREDATE>
      <SAL>800</SAL>
      <DEPTNO>20</DEPTNO>
    </ROW>
    <ROW>
      <EMPNO>7499</EMPNO>
      <ENAME>ALLEN</ENAME>
      <JOB>SALESMAN</JOB>
      <MGR>7698</MGR>
      <HIREDATE>20-FEB-81</HIREDATE>
      <SAL>1600</SAL>
      <COMM>300</COMM>
      <DEPTNO>30</DEPTNO>
    </ROW>
    <ROW>
      <EMPNO>7521</EMPNO>
      <ENAME>WARD</ENAME>
      <JOB>SALESMAN</JOB>
      <MGR>7698</MGR>
      <HIREDATE>22-FEB-81</HIREDATE>
      <SAL>1250</SAL>
      <COMM>500</COMM>
      <DEPTNO>30</DEPTNO>
    </ROW>
    <ROW>
      <EMPNO>7566</EMPNO>
      <ENAME>JONES</ENAME>
      <JOB>MANAGER</JOB>
      <MGR>7839</MGR>
      <HIREDATE>02-APR-81</HIREDATE>
      <SAL>2975</SAL>
      <DEPTNO>20</DEPTNO>
    </ROW>
    <ROW>
      <EMPNO>7654</EMPNO>
      <ENAME>MARTIN</ENAME>
      <JOB>SALESMAN</JOB>
      <MGR>7698</MGR>
      <HIREDATE>28-SEP-81</HIREDATE>
      <SAL>1250</SAL>
      <COMM>1400</COMM>
      <DEPTNO>30</DEPTNO>
    </ROW>
    <ROW>
      <EMPNO>7698</EMPNO>
      <ENAME>BLAKE</ENAME>
      <JOB>MANAGER</JOB>
      <MGR>7839</MGR>
      <HIREDATE>01-MAY-81</HIREDATE>
      <SAL>2850</SAL>
      <DEPTNO>30</DEPTNO>
    </ROW>
    <ROW>
      <EMPNO>7782</EMPNO>
      <ENAME>CLARK</ENAME>
      <JOB>MANAGER</JOB>
      <MGR>7839</MGR>
      <HIREDATE>09-JUN-81</HIREDATE>
      <SAL>2450</SAL>
      <DEPTNO>10</DEPTNO>
    </ROW>
    <ROW>
      <EMPNO>7788</EMPNO>
      <ENAME>SCOTT</ENAME>
      <JOB>ANALYST</JOB>
      <MGR>7566</MGR>
      <HIREDATE>19-APR-87</HIREDATE>
      <SAL>3000</SAL>
      <DEPTNO>20</DEPTNO>
    </ROW>
    <ROW>
      <EMPNO>7839</EMPNO>
      <ENAME>KING</ENAME>
      <JOB>PRESIDENT</JOB>
      <HIREDATE>17-NOV-81</HIREDATE>
      <SAL>5000</SAL>
      <DEPTNO>10</DEPTNO>
    </ROW>
    <ROW>
      <EMPNO>7844</EMPNO>
      <ENAME>TURNER</ENAME>
      <JOB>SALESMAN</JOB>
      <MGR>7698</MGR>
      <HIREDATE>08-SEP-81</HIREDATE>
      <SAL>1500</SAL>
      <COMM>0</COMM>
      <DEPTNO>30</DEPTNO>
    </ROW>
    <ROW>
      <EMPNO>7876</EMPNO>
      <ENAME>ADAMS</ENAME>
      <JOB>CLERK</JOB>
      <MGR>7788</MGR>
      <HIREDATE>23-MAY-87</HIREDATE>
      <SAL>1100</SAL>
      <DEPTNO>20</DEPTNO>
    </ROW>
    <ROW>
      <EMPNO>7900</EMPNO>
      <ENAME>JAMES</ENAME>
      <JOB>CLERK</JOB>
      <MGR>7698</MGR>
      <HIREDATE>03-DEC-81</HIREDATE>
      <SAL>950</SAL>
      <DEPTNO>30</DEPTNO>
    </ROW>
    <ROW>
      <EMPNO>7902</EMPNO>
      <ENAME>FORD</ENAME>
      <JOB>ANALYST</JOB>
      <MGR>7566</MGR>
      <HIREDATE>03-DEC-81</HIREDATE>
      <SAL>3000</SAL>
      <DEPTNO>20</DEPTNO>
    </ROW>
    <ROW>
      <EMPNO>7934</EMPNO>
      <ENAME>MILLER</ENAME>
      <JOB>CLERK</JOB>
      <MGR>7782</MGR>
      <HIREDATE>23-JAN-82</HIREDATE>
      <SAL>1300</SAL>
      <DEPTNO>10</DEPTNO>
    </ROW>
    </ROWSET>'));
    You can insert into an empty EMP2 table this way:
    SQL> insert into emp2
      2  select myemp.empno, myemp.ename, myemp.job, myemp.mgr,
      3         to_date(myemp.hiredate,'dd-mon-yy'), myemp.sal,
      4         myemp.comm, myemp.deptno
      5  from tempxml x, xmltable('/ROWSET/ROW'
      6                           PASSING x.xml
      7                           COLUMNS
      8                             empno number PATH '/ROW/EMPNO',
      9                             ename varchar2(10) PATH '/ROW/ENAME',
    10                             job   varchar2(9) PATH '/ROW/JOB',
    11                             mgr   number PATH '/ROW/MGR',
    12                             hiredate varchar2(9) PATH '/ROW/HIREDATE',
    13                             sal    number PATH '/ROW/SAL',
    14                             comm   number PATH '/ROW/COMM',
    15                             deptno number PATH '/ROW/DEPTNO'
    16                            ) myemp;
    14 rows created.
    SQL> select * from emp2;
         EMPNO ENAME      JOB              MGR HIREDATE         SAL       COMM     DEPTNO
          7369 SMITH      CLERK           7902 17-DEC-80        800                    20
          7499 ALLEN      SALESMAN        7698 20-FEB-81       1600        300         30
          7521 WARD       SALESMAN        7698 22-FEB-81       1250        500         30
          7566 JONES      MANAGER         7839 02-APR-81       2975                    20
          7654 MARTIN     SALESMAN        7698 28-SEP-81       1250       1400         30
          7698 BLAKE      MANAGER         7839 01-MAY-81       2850                    30
          7782 CLARK      MANAGER         7839 09-JUN-81       2450                    10
          7788 SCOTT      ANALYST         7566 19-APR-87       3000                    20
          7839 KING       PRESIDENT            17-NOV-81       5000                    10
          7844 TURNER     SALESMAN        7698 08-SEP-81       1500          0         30
          7876 ADAMS      CLERK           7788 23-MAY-87       1100                    20
          7900 JAMES      CLERK           7698 03-DEC-81        950                    30
          7902 FORD       ANALYST         7566 03-DEC-81       3000                    20
          7934 MILLER     CLERK           7782 23-JAN-82       1300                    10
    14 rows selected.
    Max
    http://oracleitalia.wordpress.com

  • Steps to insert xml data into oracle

    Please give me the next steps to insert XML data into Oracle 9i.
    These are the steps I have done so far:
    1. create a folder in Oracle on port 8080
    2. copy the xsd into the folder
    3. register the schema
    4. What is the next step?
    Thanks
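A hedged sketch of the usual steps after the schema is available: register it with DBMS_XMLSCHEMA, create a schema-based XMLType table, then insert a document. The directory object, file names, and the choice of the "pblKey" element are illustrative assumptions here:

```sql
-- Register the schema (assumes a directory object XML_DIR holding test.xsd)
BEGIN
  DBMS_XMLSCHEMA.registerSchema(
    'http://192.168.1.1:8080/test.xsd',
    bfilename('XML_DIR', 'test.xsd'),
    csid => nls_charset_id('AL32UTF8'));
END;
/

-- Create a table bound to one global element of that schema
CREATE TABLE elements OF XMLTYPE
  XMLSCHEMA "http://192.168.1.1:8080/test.xsd"
  ELEMENT "pblKey";

-- Insert a conforming document (doc1.xml is a made-up file name)
INSERT INTO elements
VALUES (XMLType(bfilename('XML_DIR', 'doc1.xml'),
                nls_charset_id('AL32UTF8')));
```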

    This is my complete XML schema:
    <?xml version = "1.0" encoding = "UTF-8"?>
    <xs:schema xmlns:xs = "http://www.w3.org/2001/XMLSchema">
         <xs:element name = "A3A8Vers" type = "xs:string"/>
         <xs:element name = "F1F5Vers" type = "xs:string">
         </xs:element>
         <xs:element name = "sequence" type = "xs:string">
         </xs:element>
         <xs:element name = "amf" type = "xs:string">
         </xs:element>
         <xs:element name = "trnsKeyNumber" type = "xs:string">
         </xs:element>
         <xs:element name = "mac" type = "xs:string">
         </xs:element>
         <xs:element name = "encryptionKey" type = "xs:string">
         </xs:element>
         <xs:element name = "signature" type = "xs:string">
         </xs:element>
         <xs:element name = "signer">
              <xs:complexType>
                   <xs:sequence>
                        <xs:element ref = "entityNumber"/>
                        <xs:element ref = "keyNumber"/>
                   </xs:sequence>
              </xs:complexType>
         </xs:element>
         <xs:element name = "entityNumber" type = "xs:string">
         </xs:element>
         <xs:element name = "keyNumber" type = "xs:string">
         </xs:element>
         <xs:element name = "pblKey">
              <xs:complexType>
                   <xs:sequence>
                        <xs:element ref = "entityNumber"/>
                        <xs:element ref = "entityRole"/>
                        <xs:element ref = "keyNumber"/>
                        <xs:element ref = "publicKeyVal"/>
                   </xs:sequence>
              </xs:complexType>
         </xs:element>
         <xs:element name = "ntrTime" type = "xs:string">
         </xs:element>
         <xs:element name = "deActionTime" type = "xs:string">
         </xs:element>
         <xs:element name = "actionTime" type = "xs:string">
         </xs:element>
         <xs:element name = "entityRole">
              <xs:complexType>
                   <xs:attribute name = "role" default = "INVALID">
                        <xs:simpleType>
                             <xs:restriction base = "xs:NMTOKEN">
                                  <xs:enumeration value = "TKD"/>
                                  <xs:enumeration value = "SKD"/>
                                  <xs:enumeration value = "INVALID"/>
                             </xs:restriction>
                        </xs:simpleType>
                   </xs:attribute>
              </xs:complexType>
         </xs:element>
         <xs:element name = "publicKeyVal">
              <xs:complexType>
                   <xs:sequence>
                        <xs:element ref = "exponent"/>
                        <xs:element ref = "mod"/>
                   </xs:sequence>
              </xs:complexType>
         </xs:element>
         <xs:element name = "exponent" type = "xs:string">
         </xs:element>
         <xs:element name = "mod" type = "xs:string">
         </xs:element>
         <xs:element name = "encriptionTransKey" type = "xs:string">
         </xs:element>
         <xs:element name = "keyType" type = "xs:string">
         </xs:element>
     </xs:schema>
     I use this command to create the table:
     create table elements of xmltype
     xmlschema "http://192.168.1.1:8080/test.xsd"
     element "publicKey";
     But the result is a table of object type, so I can't use the command "desc <table_name>;". Why is that?
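    A schema-based XMLType table is created as an object table, so SQL*Plus DESCRIBE only reports it as a table of XMLTYPE instead of listing relational columns. A sketch of two ways to inspect it instead (the table name `elements` comes from the statement above; column availability may vary by Oracle release, so treat this as an assumption to verify on your version):

    ```sql
    -- DESCRIBE on an XMLType object table shows only the type.
    -- Check the schema registration via the dictionary instead:
    SELECT table_name, xmlschema, element_name
    FROM   user_xml_tables
    WHERE  table_name = 'ELEMENTS';

    -- Query the stored documents through the object value:
    SELECT OBJECT_VALUE FROM elements;
    ```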

  • How to parse xml data into java component

    Hi everybody,
    I am new to XML and am trying to parse XML data into a Java application.
    Can anybody guide me on how to do it?
    The following is my file.
    //MyLogin.java
    import javax.swing.*;
    import java.awt.*;
    import java.awt.event.*;
    import java.io.*;
    import java.util.*;
    class MyLogin extends JFrame implements ActionListener {
         JFrame loginframe;
         JLabel labelname;
         JLabel labelpassword;
         JTextField textname;
         JPasswordField textpassword;
         JButton okbutton;
         String name = "";
         FileOutputStream out;
         PrintStream p;
         Date date;
         GregorianCalendar gcal;
         GridBagLayout gl;
         GridBagConstraints gbc;

         public MyLogin() {
              loginframe = new JFrame("Login");
              gl = new GridBagLayout();
              gbc = new GridBagConstraints();
              labelname = new JLabel("User");
              labelpassword = new JLabel("Password");
              textname = new JTextField("", 9);
              textpassword = new JPasswordField(5);
              okbutton = new JButton("OK");
              gbc.anchor = GridBagConstraints.NORTHWEST;
              gbc.gridx = 1;
              gbc.gridy = 5;
              gl.setConstraints(labelname, gbc);
              gbc.anchor = GridBagConstraints.NORTHWEST;
              gbc.gridx = 2;
              gbc.gridy = 5;
              gl.setConstraints(textname, gbc);
              gbc.anchor = GridBagConstraints.NORTHWEST;
              gbc.gridx = 1;
              gbc.gridy = 10;
              gl.setConstraints(labelpassword, gbc);
              gbc.anchor = GridBagConstraints.NORTHWEST;
              gbc.gridx = 2;
              gbc.gridy = 10;
              gl.setConstraints(textpassword, gbc);
              gbc.anchor = GridBagConstraints.NORTHWEST;
              gbc.gridx = 1;
              gbc.gridy = 15;
              gl.setConstraints(okbutton, gbc);
              // Add the components to loginframe's own content pane.
              Container contentpane = loginframe.getContentPane();
              contentpane.setLayout(gl);
              contentpane.add(labelname);
              contentpane.add(labelpassword);
              contentpane.add(textname);
              contentpane.add(textpassword);
              contentpane.add(okbutton);
              okbutton.addActionListener(this);
              loginframe.setSize(300, 300);
              loginframe.setVisible(true);
         }

         public static void main(String a[]) {
              new MyLogin();
         }

         public void reset() {
              textname.setText("");
              textpassword.setText("");
         }

         public void run() {
              try {
                   String text = textname.getText();
                   if (text.equals("")) {
                        System.out.println("First Enter a UserName");
                   } else {
                        date = new Date();
                        gcal = new GregorianCalendar();
                        gcal.setTime(date);
                        out = new FileOutputStream("log.txt", true);
                        p = new PrintStream(out);
                        name = textname.getText();
                        String entry = "UserName:- " + name + " Logged in:- " + gcal.get(Calendar.HOUR) + ":" + gcal.get(Calendar.MINUTE) + " Date:- " + gcal.get(Calendar.DATE) + "/" + gcal.get(Calendar.MONTH) + "/" + gcal.get(Calendar.YEAR);
                        p.println(entry);
                        System.out.println("Record Saved");
                        reset();
                        p.close();
                   }
              } catch (IOException e) {
                   System.err.println("Error writing to file");
              }
         }

         public void actionPerformed(ActionEvent ae) {
              String str = ae.getActionCommand();
              if (str.equals("OK")) {
                   run();
                   //loginframe.setDefaultCloseOperation(DISPOSE_ON_CLOSE);
              }
         }
    }

    Hi, thanks for your reply.
    I visited that URL and was able to learn much more about XML.
    So now my requirement is DOM, but I don't know how to code it in my existing file.
    That is, I want to know how to link all my text fields to the XML file.
    Can you please help me out? I am confused.
    Waiting for your reply.
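    Since DOM is the chosen approach, here is a minimal sketch of turning form field values into an XML document with the JAXP DOM API that ships with the JDK (javax.xml.parsers and javax.xml.transform). The element names `login`, `user`, and `password` are made up for illustration; the method returns the XML as a string, and you would point the `StreamResult` at a `FileWriter` to write a file instead:

    ```java
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.transform.OutputKeys;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.dom.DOMSource;
    import javax.xml.transform.stream.StreamResult;
    import java.io.StringWriter;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;

    public class LoginXmlWriter {
        // Build a small DOM document from the two field values and
        // serialize it to a string with the identity Transformer.
        public static String toXml(String user, String password) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();
            Element root = doc.createElement("login");   // hypothetical element names
            doc.appendChild(root);
            Element u = doc.createElement("user");
            u.appendChild(doc.createTextNode(user));
            root.appendChild(u);
            Element p = doc.createElement("password");
            p.appendChild(doc.createTextNode(password));
            root.appendChild(p);

            Transformer t = TransformerFactory.newInstance().newTransformer();
            t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
            StringWriter out = new StringWriter();
            t.transform(new DOMSource(doc), new StreamResult(out));
            return out.toString();
        }

        public static void main(String[] args) throws Exception {
            System.out.println(toXml("scott", "tiger"));
        }
    }
    ```

    In your frame you would call `toXml(textname.getText(), new String(textpassword.getPassword()))` from the OK button's action handler, in place of (or alongside) the plain-text log entry.
    
    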
