Loading existing XML documents into KM

Hello Everyone
Is there a way to load existing XML documents into KM and then maintain them using XML forms built with the Form Builder? Any help is greatly appreciated.
Thanks
Swetha

Hi Swetha
You can move XML projects from one server to another. The XML forms you create are stored under the path 'etc\xmlforms' in KM. You can export them and upload them to the target server instead of creating the same forms on the target server from scratch.
Regards
Geogi

Similar Messages

How to split an XML document into two

I want to create a new document by taking the part of the existing XML document under the node document_text:
vNodeList := xmldom.getElementsByTagName(vdoc, 'document_text');
vNode := xmldom.item(vNodeList, 0); -- the name of this node is exactly what I am looking for
vdoc := xmldom.makeDocument(vNode);
insert into cltab values (empty_clob()) returning cl into cl;
xmldom.writeToClob(vdoc, cl);
But after that I still get the whole document tree, a copy of the whole document.
What am I doing wrong? Which functions should I use for this kind of task, xmldom.cloneNode?
Can you recommend a good description of the xmldom package?
    Thanks,
    Roman
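
One way to pull out just that subtree is sketched below with the newer DBMS_XMLDOM package (9iR2 onwards). This is an untested outline, not a verified answer; the sample document is hypothetical, and in the original question vdoc already exists. The key step is importNode, which makes a deep copy of the selected node inside a brand-new document:
DECLARE
  vdoc    dbms_xmldom.DOMDocument;
  vnewdoc dbms_xmldom.DOMDocument;
  vlist   dbms_xmldom.DOMNodeList;
  vnode   dbms_xmldom.DOMNode;
  vcopy   dbms_xmldom.DOMNode;
  cl      CLOB;
BEGIN
  -- hypothetical source document standing in for the poster's vdoc
  vdoc  := dbms_xmldom.newDOMDocument(xmltype('<doc><document_text>hello</document_text></doc>'));
  vlist := dbms_xmldom.getElementsByTagName(vdoc, 'document_text');
  vnode := dbms_xmldom.item(vlist, 0);
  -- create an empty document and import a deep copy of the subtree into it
  vnewdoc := dbms_xmldom.newDOMDocument;
  vcopy   := dbms_xmldom.importNode(vnewdoc, vnode, true);
  vcopy   := dbms_xmldom.appendChild(dbms_xmldom.makeNode(vnewdoc), vcopy);
  -- serialize only the new document, not the original tree
  dbms_lob.createtemporary(cl, true);
  dbms_xmldom.writeToClob(vnewdoc, cl);
  dbms_output.put_line(dbms_lob.substr(cl, 4000, 1));
  dbms_lob.freetemporary(cl);
END;
/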

If you are using an FI entry (F-43) to generate the invoice, this can be done by giving the same invoice reference in the Inv. Ref. field for the two vendors. This is manual.
Document Split will split the document between two profit centers, not between vendors.

  • To load an XML document into 40 tables

How do I load a large XML document into 40 tables? Most of the examples I see only load one table into the Oracle database.

    From the above document:
    Storing XML Data Across Tables
    Question
    Can XML- SQL Utility store XML data across tables?
    Answer
Currently the XML-SQL Utility (XSU) can only store to a single table. It maps a canonical representation of an XML document into any table/view. But there is of course a way to store XML across tables with the XSU. One can use XSLT to transform a document into multiple documents and insert them separately. Another way is to define views over multiple tables (object views if needed) and then do the inserts into the view. If the view is inherently non-updatable (because of complex joins, etc.), then one can use INSTEAD-OF triggers over the views to do the inserts.
    -- I've tried this, works fine.
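
To make the view-plus-trigger route concrete, here is a minimal sketch with hypothetical DEPT/EMP tables; XSU would then be pointed at the view instead of a base table:
CREATE OR REPLACE VIEW dept_emp_v AS
  SELECT d.deptno, d.dname, e.empno, e.ename
  FROM   dept d JOIN emp e ON e.deptno = d.deptno;

CREATE OR REPLACE TRIGGER dept_emp_v_ins
  INSTEAD OF INSERT ON dept_emp_v
  FOR EACH ROW
BEGIN
  -- insert the parent row once; ignore it if it is already there
  BEGIN
    INSERT INTO dept (deptno, dname) VALUES (:NEW.deptno, :NEW.dname);
  EXCEPTION
    WHEN DUP_VAL_ON_INDEX THEN NULL;
  END;
  -- then the child row
  INSERT INTO emp (empno, ename, deptno) VALUES (:NEW.empno, :NEW.ename, :NEW.deptno);
END;
/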

  • Can I append an XML Document into another?

I have a large XML document which I open in ColdFusion (MX 7), and then a smaller group of files that are opened into their own XML objects. After some customization of XML text values, I want to append the smaller XML documents into the larger document at specific positions within the document tree.
I have the code that I thought would do this, using an arrayAppend function, but what happens is that only the root element of the smaller (inserted) XML document is placed into the main document.
Is this possible, or do I need to modify the code so that the entire tree of XML data that I am appending is created on the fly?

    Personally I do this with XSLT.
    <cfxml variable="doc1">
    <root>
    <member id="10">
    <name>Ian</name>
    <sex>male</sex>
    </member>
    </root>
    </cfxml>
    <cfxml variable="doc2">
    <root>
    <member id="1">
    <name>Joe</name>
    <sex>male</sex>
    </member>
    <member id="2">
    <name>John</name>
    <sex>male</sex>
    </member>
    <member id="3">
    <name>Sue</name>
    <sex>female</sex>
    </member>
    </root>
    </cfxml>
<cffile action="write" file="#expandPath("/")#\doc2.xml" output="#ToString(doc2)#" nameconflict="overwrite">
<!--- METHOD ONE: using a deprecated and undocumented method --->
<cfset newNode = doc1.root.member.cloneNode(true)>
<cfset doc2.changeNodeOwner(newNode)>
<cfset doc2.root.appendChild(newNode)>
<cfdump var="#doc2#" label="Merged by hidden functions" expand="no">
<!--- METHOD TWO: using XSLT --->
<!--- create an XSL document --->
<cfoutput>
<cfsavecontent variable="transformer">
<xsl:stylesheet xmlns:xsl="http://www.w3.org/1999/XSL/Transform" version="1.0">
<xsl:output method="xml"/>
<!-- load the merge file -->
<xsl:variable name="emps" select="document('#expandPath("/")#\doc2.xml')"/>
<!--- this line references an XML file to be combined with the main file;
I wonder if there is a way to use an XML document in memory here? --->
<!-- combine the files -->
<xsl:template match="/">
<root>
<!-- select all the child nodes of the root tag in the main file -->
<xsl:for-each select="root/child::*">
<!-- copy the member tag -->
<xsl:copy-of select="."/>
</xsl:for-each>
<!-- and all the child nodes of the root tag in the merge file -->
<xsl:for-each select="$emps/root/child::*">
<!-- copy the member tag -->
<xsl:copy-of select="."/>
</xsl:for-each>
</root>
</xsl:template>
</xsl:stylesheet>
</cfsavecontent>
</cfoutput>
<!--- create a combined xml document --->
<cfset doc2 = XMLparse(XMLtransform(doc1, transformer))>
<cfdump var="#doc2#" label="Merged by XSLT" expand="no">

Got error message when storing XML documents into XML DB repository, via WebDAV

Hi experts,
I am in Oracle Enterprise Manager 11g 11.2.0.1.0.
SQL*Plus: Release 11.2.0.1.0 Production on Tue Feb 22 11:40:23 2011
I got an error message when storing XML documents into the XML DB repository via WebDAV.
I have successfully registered 5 related schemas and generated 1 table.
I have inserted 40 .xml files into this auto-generated table.
Using these data I created a relational view successfully.
But since I couldn't store XML documents into the XML DB repository via WebDAV, when I query using the code below:
SELECT rv.res.getClobVal()
FROM resource_view rv
WHERE rv.any_path = '/home/DEV/messages/4fe1-865d-da0db9212f34.xml';
I get nothing.
My FTP session is listed below:
    ftp> open localhost 2100
    Connected to I0025B368E2F9.
    220- C0025B368E2F9
    Unauthorised use of this FTP server is prohibited and may be subject to civil and criminal prosecution.
    220 I0025B368E2F9 FTP Server (Oracle XML DB/Oracle Database) ready.
    User (I0025B368E2F9:(none)): fda_xml
    331 pass required for FDA_XML
    Password:
    230 FDA_XML logged in
    ftp> cd /home/DEV/message
    250 CWD Command successful
    ftp> pwd
    257 "/home/DEV/message" is current directory.
    ftp> ls -la
    200 PORT Command successful
    150 ASCII Data Connection
    drw-r--r-- 2 FDA_XML oracle 0 DEC 17 19:19 .
    drw-r--r-- 2 FDA_XML oracle 0 DEC 17 19:19 ..
    226 ASCII Transfer Complete
    ftp: 115 bytes received in 0.00Seconds 115000.00Kbytes/sec.
    250 SET_CHARSET Command Successful
    ftp> put C:\ED\SPL\E_Reon_Data\loaded\4fe1-865d-da0db9212f34.xml
    200 PORT Command successful
    150 ASCII Data Connection
    550- Error Response
    ORA-00600: internal error code, arguments: [qmxConvUnkType], [], [], [], [], [], [], [], [], [], [], []
    550 End Error Response
    ftp: 3394 bytes sent in 0.00Seconds 3394000.00Kbytes/sec.
I have tried all suggestions from another thread, such as:
alter system set events ='31150 trace name context forever, level 0x4000'
SQL> alter system set shared_servers = 1;
but they failed.
Can anyone help?
Thanks.
    Edited by: Cow on Mar 29, 2011 12:58 AM

  • Open .xml documents into InDesign?

    Is there a way to open .xml documents into InDesign, or any Adobe program?
The goal is to access text from a web site feed and place it in an InDesign doc, to create a print product.

    You can't open XML but you can certainly utilize it.
    Here's a good book you might want to look into: http://amzn.to/i3iWZ5
    Bob

  • XML document into multiple tables

How do I insert an XML document into multiple tables? E.g. a purchase order with multiple line items. I have to insert the XML document into the parent as well as the child table, with different sets of columns.

    I created the tables using the create_ch14_tables.sql. I call it using java -classpath .;C:\commerceone\xpc\lib\xmlparserv2.jar;C:\commerceone\xpc\lib\classes12.zip;C:\commerceone\xpc\lib\xsu12.jar XMLLoader -file deptempdepend.xml -connName default -transform deptempdepend.xsl. The code doesn't seem to like the "<xsl:for-each select="Department">" tags. If I remove them, the insDoc.getDocumentElement().getFirstChild() will find the element, but it still doesn't insert anything into the database.
    Thank You,
    Dave
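
If XSU keeps fighting you and you are on 10gR2 or later, a hedged alternative is to shred the document yourself with XMLTABLE, parent table first and then the line items. All table, column and element names below are hypothetical, and :doc stands for a bind bound to the XML text:
-- parent rows
INSERT INTO po_header (po_id, po_date)
SELECT x.po_id, x.po_date
FROM   xmltable('/PurchaseOrder'
                passing xmltype(:doc)
                columns po_id   number       path '@id'
                       ,po_date varchar2(10) path 'OrderDate') x;

-- child rows, keyed back to the parent
INSERT INTO po_line (po_id, line_no, item, qty)
SELECT x.po_id, y.line_no, y.item, y.qty
FROM   xmltable('/PurchaseOrder'
                passing xmltype(:doc)
                columns po_id number  path '@id'
                       ,lines xmltype path 'Lines') x
      ,xmltable('/Lines/Line'
                passing x.lines
                columns line_no number       path '@num'
                       ,item    varchar2(30) path 'Item'
                       ,qty     number       path 'Qty') y;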

Exception when inserting an XML document into a database

    Hello.
I'm trying to insert the content of an XML document into a database. The problem is the exception that is raised when an FK value in the XML is not in the database as a PK.
How can I catch these exceptions and continue inserting the rest of the data?
Thanks in advance ...
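
One way to skip the offending rows and keep loading (10gR2 onwards) is DML error logging. A minimal sketch against a hypothetical CHILD table fed from a hypothetical CHILD_STAGE staging table:
-- one-time setup: create the ERR$_CHILD log table
BEGIN
  dbms_errlog.create_error_log(dml_table_name => 'CHILD');
END;
/

-- rows whose FK parent is missing land in ERR$_CHILD instead of aborting the statement
INSERT INTO child (id, parent_id, payload)
SELECT id, parent_id, payload FROM child_stage
LOG ERRORS INTO err$_child ('xml load') REJECT LIMIT UNLIMITED;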

I really hope for your sake this wasn't a production database. As mentioned in the URL, it has been some years since I fiddled around with this, which was btw a supported Metalink workaround ONCE (during my Oracle 7 days); but now I wouldn't be surprised if you have mixed NLS content in your tables/database and have more or less "corrupted" your content.
The initial issue was probably a client one.
BTW
I actually hope it was a production database; maybe then you would learn that you SHOULD NEVER EVER UPDATE DATABASE BASE TABLES directly...
...unless (the only exception) you want to learn something about the internals and don't care that your TEST database gets corrupted.

How to convert an XML document into an Oracle temporary table

I am creating an XML document in Java and passing this XML to the Oracle database, and I need to fetch this XML into a result set / rowset.
    Xml Structure
    <Department deptno="100">
    <DeptName>Sports</DeptName>
    <EmployeeList>
    <Employee empno="200"><Ename>John</Ename><Salary>33333</Salary>
    </Employee>
    <Employee empno="300"><Ename>Jack</Ename><Salary>333444</Salary>
    </Employee>
    </EmployeeList>
    </Department>
I need it in this format:
Deptno DeptName empno Ename Salary
100 Sports 200 John 2500
100 Sports 300 Jack 3000

    It does depend on your version as odie suggests.
    Here's a way that will work in 10g...
SQL> ed
Wrote file afiedt.buf
with t as (select xmltype('<Department deptno="100">
<DeptName>Sports</DeptName>
<EmployeeList>
<Employee empno="200"><Ename>John</Ename><Salary>33333</Salary>
</Employee>
<Employee empno="300"><Ename>Jack</Ename><Salary>333444</Salary>
</Employee>
</EmployeeList>
</Department>
') as xml from dual)
--
-- End of test data, use query below
--
select x.deptno, x.deptname
      ,y.empno, y.ename, y.salary
from t
    ,xmltable('/'
              passing t.xml
              columns deptno   number       path '/Department/@deptno'
                     ,deptname varchar2(10) path '/Department/DeptName'
                     ,emps     xmltype      path '/Department/EmployeeList'
             ) x
    ,xmltable('/EmployeeList/Employee'
              passing x.emps
              columns empno    number       path '/Employee/@empno'
                     ,ename    varchar2(10) path '/Employee/Ename'
                     ,salary   number       path '/Employee/Salary'
             ) y
SQL> /
    DEPTNO DEPTNAME        EMPNO ENAME          SALARY
       100 Sports            200 John            33333
       100 Sports            300 Jack           333444
If the XML is a string, e.g. a CLOB, then it can easily be converted to XMLTYPE using the XMLTYPE function.
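For example, if the document sits in a CLOB column, the same query could read straight from the table instead of the WITH clause (table and column names here are hypothetical):
select x.deptno, x.deptname
from   my_docs d
      ,xmltable('/Department'
                passing xmltype(d.xml_clob)
                columns deptno   number       path '@deptno'
                       ,deptname varchar2(10) path 'DeptName') x
where  d.id = 1;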

How to FTP an XML document into an XMLType table that was not created by the FTP user

    Hi,
    1.
I have a table called SNPLEX_DESIGN which is created automatically when I register the snplex_design.xsd XML schema (Oracle creates it through the xdb:defaultTable="SNPLEX_DESIGN" attribute), and it is created using the SNPLEX user account.
    2.
I also created a folder (resource) called /home/SNPLEX/Orders, which is used to hold all the incoming XML documents.
    3.
I created another user account called SNPLEX_APP, which is the only user account allowed to FTP XML documents into the /home/SNPLEX/Orders folder.
Issues:
If I log in as the SNPLEX user, I can FTP an XML document into the folder and the TABLE (the file size = 0). But if I log in as the SNPLEX_APP user account, I can only FTP the XML document into the folder; Oracle doesn't store the document into the table (because the file size shows a number).
I have granted all the ACL privileges on the /home/SNPLEX/Orders folder to SNPLEX_APP through OEM.
Did I miss anything? Any help will be greatly appreciated. Resolving this issue is very important to us, since we are about to roll the system into production.
    Regards,
    Jinsen

In order for a registered schema to be available to other users, the schema must be registered as a GLOBAL rather than a LOCAL schema. This is controlled by the third argument passed to registerSchema, and the default is local. Note that you will also need to explicitly grant appropriate permissions on any tables created by the schema registration process to other users who will be loading or reading data from these tables.
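
A hedged sketch of what the global registration could look like with DBMS_XMLSCHEMA; the URL and resource path below are placeholders built from this thread, not verified values:
BEGIN
  dbms_xmlschema.registerSchema(
    schemaurl => 'http://example.com/snplex_design.xsd',
    schemadoc => xdburitype('/home/SNPLEX/snplex_design.xsd').getClob(),
    local     => FALSE,   -- third argument: FALSE = GLOBAL schema
    gentypes  => TRUE,
    gentables => TRUE);
END;
/
-- and the default table still needs explicit grants:
GRANT SELECT, INSERT, UPDATE, DELETE ON snplex.snplex_design TO snplex_app;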

  • Inserting a long XML document into XMLType

I'm trying to load the following document into an XMLType column in 10.2. I've tried every example I can find, and I can push the data into CLOBs using the Java workaround just fine (http://www.oracle.com/technology/sample_code/tech/java/codesnippet/xmldb/HowToLoadLargeXML.html).
    Can anyone provide a solution or let me know if there is a limitation please?
    Given the table;
SQL> describe xmltable_1
 Name                 Null?    Type
 DOC_ID                        NUMBER
 XML_DATA                      XMLTYPE
How do I load this data into 'XML_DATA'?
    <?xml version="1.0" encoding="UTF-8"?>
    <metadata>
    <idinfo>
    <citation>
    <citeinfo>
    <origin>Rand McNally and ESRI</origin>
    <pubdate>1996</pubdate>
    <title>ESRI Cities Geodata Set</title>
    <geoform>vector digital data</geoform>
    <onlink>\\OIS23\C$\Files\Working\Metadata\world\cities.shp</onlink>
    </citeinfo>
    </citation>
    <descript>
    <abstract>World Cities contains locations of major cities around the world. The cities include national capitals for each of the countries in World Countries 1998 as well as major population centers and landmark cities. World Cities was derived from ESRI's ArcWorld database and supplemented with other data from the Rand McNally New International Atlas</abstract>
    <purpose>606 points, 4 descriptive fields. Describes major world cities.</purpose>
    </descript>
    <timeperd>
    <timeinfo>
    <sngdate>
    <caldate>1996</caldate>
    </sngdate>
    </timeinfo>
    <current>publication date</current>
    </timeperd>
    <status>
    <progress>Complete</progress>
    <update>None planned</update>
    </status>
    <spdom>
    <bounding>
    <westbc>
    -165.270004</westbc>
    <eastbc>
    177.130188</eastbc>
    <northbc>
    78.199997</northbc>
    <southbc>
    -53.150002</southbc>
    </bounding>
    </spdom>
    <keywords>
    <theme>
    <themekt>city</themekt>
    <themekey>cities</themekey>
    </theme>
    </keywords>
    <accconst>none</accconst>
    <useconst>none</useconst>
    <ptcontac>
    <cntinfo>
    <cntperp>
    <cntper>unknown</cntper>
    <cntorg>unknown</cntorg>
    </cntperp>
    <cntpos>unknown</cntpos>
    <cntvoice>555-1212</cntvoice>
    </cntinfo>
    </ptcontac>
    <datacred>ESRI</datacred>
    <native>Microsoft Windows NT Version 4.0 (Build 1381) Service Pack 6; ESRI ArcCatalog 8.1.0.570</native>
    </idinfo>
    <dataqual>
    <attracc>
    <attraccr>no report available</attraccr>
    <qattracc>
    <attraccv>1000000</attraccv>
    <attracce>no report available</attracce>
    </qattracc>
    </attracc>
    <logic>no report available</logic>
    <complete>no report available</complete>
    <posacc>
    <horizpa>
    <horizpar>no report available</horizpar>
    </horizpa>
    <vertacc>
    <vertaccr>no report available</vertaccr>
    </vertacc>
    </posacc>
    <lineage>
    <srcinfo>
    <srccite>
    <citeinfo>
    <title>ESRI</title>
    </citeinfo>
    </srccite>
    <srcscale>20000000</srcscale>
    <typesrc>CD-ROM</typesrc>
    <srctime>
    <timeinfo>
    <sngdate>
    <caldate>1996</caldate>
    </sngdate>
    </timeinfo>
    <srccurr>publication date</srccurr>
    </srctime>
    <srccontr>no report available</srccontr>
    </srcinfo>
    <procstep>
    <procdesc>no report available</procdesc>
    <procdate>Unknown</procdate>
    </procstep>
    </lineage>
    </dataqual>
    <spdoinfo>
    <direct>Vector</direct>
    <ptvctinf>
    <sdtsterm>
    <sdtstype>Entity point</sdtstype>
    <ptvctcnt>606</ptvctcnt>
    </sdtsterm>
    </ptvctinf>
    </spdoinfo>
    <spref>
    <horizsys>
    <geograph>
    <latres>0.000001</latres>
    <longres>0.000001</longres>
    <geogunit>Decimal degrees</geogunit>
    </geograph>
    <geodetic>
    <horizdn>North American Datum of 1927</horizdn>
    <ellips>Clarke 1866</ellips>
    <semiaxis>6378206.400000</semiaxis>
    <denflat>294.978698</denflat>
    </geodetic>
    </horizsys>
    </spref>
    <eainfo>
    <detailed>
    <enttyp>
    <enttypl>
    cities</enttypl>
    </enttyp>
    <attr>
    <attrlabl>FID</attrlabl>
    <attrdef>Internal feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Sequential unique whole numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>Shape</attrlabl>
    <attrdef>Feature geometry.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Coordinates defining the features.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>NAME</attrlabl>
    <attrdef>The city name. Spellings are based on Board of Geographic Names standards and commercial atlases.</attrdef>
    <attrdefs>ESRI</attrdefs>
    </attr>
    <attr>
    <attrlabl>COUNTRY</attrlabl>
    <attrdef>An abbreviated country name.</attrdef>
    </attr>
    <attr>
    <attrlabl>POPULATION</attrlabl>
    <attrdef>Total population for the entire metropolitan area. Values are from recent census or estimates.</attrdef>
    </attr>
    <attr>
    <attrlabl>CAPITAL</attrlabl>
    <attrdef>Indicates whether a city is a national capital (Y/N).</attrdef>
    </attr>
    </detailed>
    <overview>
    <eaover>none</eaover>
    <eadetcit>none</eadetcit>
    </overview>
    </eainfo>
    <distinfo>
    <stdorder>
    <digform>
    <digtinfo>
    <transize>0.080</transize>
    </digtinfo>
    </digform>
    </stdorder>
    </distinfo>
    <metainfo>
    <metd>20010509</metd>
    <metc>
    <cntinfo>
    <cntorgp>
    <cntorg>ESRI</cntorg>
    <cntper>unknown</cntper>
    </cntorgp>
    <cntaddr>
    <addrtype>unknown</addrtype>
    <city>unknown</city>
    <state>unknown</state>
    <postal>00000</postal>
    </cntaddr>
    <cntvoice>555-1212</cntvoice>
    </cntinfo>
    </metc>
    <metstdn>FGDC Content Standards for Digital Geospatial Metadata</metstdn>
    <metstdv>FGDC-STD-001-1998</metstdv>
    <mettc>local time</mettc>
    <metextns>
    <onlink>http://www.esri.com/metadata/esriprof80.html</onlink>
    <metprof>ESRI Metadata Profile</metprof>
    </metextns>
    </metainfo>
    </metadata>
<vertacce>Vertical Positional Accuracy is expressed in meters. Vertical accuracy figures were developed by comparing elevation contour locations on 1:24,000 scale maps to elevation values at the same location within the digital database. Some manual interpolation was necessary to complete this test. The analysis results are expressed as linear error at a 90% confidence interval.</vertacce>
    </qvertpa>
    </vertacc>
    </posacc>
    <lineage>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>National Imagery and Mapping Agency</origin>
    <pubdate>1994</pubdate>
    <title>Operational Navigational Chart</title>
    <geoform>map</geoform>
    <pubinfo>
    <pubplace>St.Louis, MO</pubplace>
    <publish>National Imagery and Mapping Agency</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <srcscale>1000000</srcscale>
    <typesrc>stable-base material</typesrc>
    <srctime>
    <timeinfo>
    <rngdates>
    <begdate>1974</begdate>
    <enddate>1994</enddate>
    </rngdates>
    </timeinfo>
    <srccurr>Publication dates</srccurr>
    </srctime>
    <srccitea>ONC</srccitea>
    <srccontr>All information found on the source with the exception of aeronautical data</srccontr>
    </srcinfo>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>National Imagery and Mapping Agency</origin>
    <pubdate>199406</pubdate>
    <title>Digital Aeronautical Flight Information File</title>
    <geoform>model</geoform>
    <pubinfo>
    <pubplace>St. Louis, MO</pubplace>
    <publish>National Imagery and Mapping Agency</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <typesrc>magnetic tape</typesrc>
    <srctime>
    <timeinfo>
    <sngdate>
    <caldate>1994</caldate>
    </sngdate>
    </timeinfo>
    <srccurr>Publication date</srccurr>
    </srctime>
    <srccitea>DAFIF</srccitea>
    <srccontr>Airport records (name, International Civil Aviation Organization, position, elevation, and type)</srccontr>
    </srcinfo>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>Defense Mapping Agency</origin>
    <pubdate>1994</pubdate>
    <title>Jet Navigational Chart</title>
    <geoform>map</geoform>
    <pubinfo>
    <pubplace>St.Louis, MO</pubplace>
    <publish>Defense Mapping Agency</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <srcscale>2,000,000</srcscale>
    <typesrc>stable-base material</typesrc>
    <srctime>
    <timeinfo>
    <rngdates>
    <begdate>1974</begdate>
    <enddate>1991</enddate>
    </rngdates>
    </timeinfo>
    <srccurr>Publication date</srccurr>
    </srctime>
    <srccitea>JNC</srccitea>
    <srccontr>All information found on the source with the exception of aeronautical data. JNCs were used as source for the Antartica region only.</srccontr>
    </srcinfo>
    <srcinfo>
    <srccite>
    <citeinfo>
    <origin>USGS EROS Data Center</origin>
    <pubdate></pubdate>
    <title>Advance Very High Resolution Radiometer</title>
    <geoform>remote-sensing image</geoform>
    <pubinfo>
    <pubplace>Sioux Falls, SD</pubplace>
    <publish>EROS Data Center</publish>
    </pubinfo>
    </citeinfo>
    </srccite>
    <srcscale>1000000</srcscale>
    <typesrc>magnetic tape</typesrc>
    <srctime>
    <timeinfo>
    <rngdates>
    <begdate>199003</begdate>
    <enddate>199011</enddate>
    </rngdates>
    </timeinfo>
    <srccurr>Publication date</srccurr>
    </srctime>
    <srccitea>AVHRR</srccitea>
    <srccontr>6 vegetation types covering the continental US and Canada</srccontr>
    </srcinfo>
    <procstep>
    <procdesc>For the first edition DCW, stable-based positives were produced from the original reproduction negatives (up to 35 per ONC sheet). These were digitized either through a scanning-raster to vector conversion or hand digitized into vector form. The vector data was then tagged with attribute information using ARC-INFO software. Transformation to geographic coordinates was performed using the projection graticules for each sheet. Digital information was edge matched between sheets to create large regional datasets. These were then subdivided into 5 x 5 degree tiles and converted from ARC/INFO to VPF. The data was then pre-mastered for CD-ROM. QC was performed by a separate group for each step in the production process.</procdesc>
    <procdate>199112</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Environmental Systems Research Institute</cntorg>
    </cntorgp>
    <cntpos>Applications Division</cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>380 New York St.</address>
    <city>Redlands</city>
    <state>CA</state>
    <postal>92373</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>909-793-2853</cntvoice>
    <cntfax>909-793-5953</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdate>199404</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geonex</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>8950 North 9th Ave.</address>
    <city>St. Petersburg</city>
    <state>FL</state>
    <postal>33702</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>(813)578-0100</cntvoice>
    <cntfax>(813)577-6946</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Transferred digitally directly into the VPF files.</procdesc>
    <procdate>199408</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geonex</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>8950 North 9th Ave.</address>
    <city>St. Petersburg</city>
    <state>FL</state>
    <postal>33702</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>813-578-0100</cntvoice>
    <cntfax>813-577-6946</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Stable-based positives were produced from the original reproduction negatives (up to 35 per ONC sheet). These were digitized either through a scanning-raster to vector conversion or hand digitized into vector form. The vector data was then tagged with attribute information using ARC-INFO software. Transformation to geographic coordinates was performed using the projection graticules for each sheet. Digital information was edge matched between sheets to create large regional datasets. These were then subdivided into 5 x 5 degree tiles and converted from ARC/INFO to VPF. The data was then pre-mastered for CD-ROM. QC was performed by a separate group for each step in the production process.</procdesc>
    <procdate>199112</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Environmental Systems Research Institute</cntorg>
    </cntorgp>
    <cntpos>Applications Division</cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>380 New York St.</address>
    <city>Redlands</city>
    <state>CA</state>
    <postal>92373</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>909-793-2853</cntvoice>
    <cntfax>909-793-5953</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Daily AVHRR images were averaged for two week time periods over the entire US growing season. These averaged images, their rates of change, elevation information, and other data were used to produce a single land classification image of the contental US. The VMap-0 data set extended this coverage over the Canadian land mass, however vegetation classification was further subdivided into nine vegetation types.</procdesc>
    <procdate>199402</procdate>
    <srcprod>EROS data</srcprod>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>USGS Eros Data Center</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address></address>
    <city>Sioux Falls</city>
    <state>SD</state>
    <postal></postal>
    <country>US</country>
    </cntaddr>
    <cntvoice></cntvoice>
    <cntfax></cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>The Eros data (raster files) were converted to vector polygon, splined (remove stairstepping), thinned (all ploygons under 2km2 were deleted), and tied to existing DCW polygons (water bodies, built-up areas). The resulting file was tiled and converted to a VPF Vegetation coverage for use in the DCW. All processing was performed using ARC-INFO software.</procdesc>
    <procdate>199412</procdate>
    <srcprod>VMap-0 Vegetation Coverage</srcprod>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geonex</cntorg>
    </cntorgp>
    <cntpos></cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>8950 North 9th Ave.</address>
    <city>St. Petersburg</city>
    <state>FL</state>
    <postal>33702</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>813-578-0100</cntvoice>
    <cntfax>813-577-6946</cntfax>
    </cntinfo>
    </proccont>
    </procstep>
    <procstep>
    <procdesc>Data was translated from VPF format to ArcInfo Coverage format. The coverages were then loaded into a seamless ArcSDE layer.</procdesc>
    <procdate>02152001</procdate>
    <proccont>
    <cntinfo>
    <cntorgp>
    <cntorg>Geodesy Team, Harvard University</cntorg>
    </cntorgp>
    <cntemail>[email protected]</cntemail>
    </cntinfo>
    </proccont>
    </procstep>
    </lineage>
    </dataqual>
    <spdoinfo>
    <direct>Vector</direct>
    <ptvctinf>
    <sdtsterm>
    <sdtstype>Complete chain</sdtstype>
    </sdtsterm>
    <sdtsterm>
    <sdtstype>Label point</sdtstype>
    </sdtsterm>
    <sdtsterm>
    <sdtstype>GT-polygon composed of chains</sdtstype>
    </sdtsterm>
    <sdtsterm>
    <sdtstype>Point</sdtstype>
    </sdtsterm>
    <vpfterm>
    <vpflevel>3</vpflevel>
    <vpfinfo>
    <vpftype>Node</vpftype>
    </vpfinfo>
    <vpfinfo>
    <vpftype>Edge</vpftype>
    </vpfinfo>
    <vpfinfo>
    <vpftype>Face</vpftype>
    </vpfinfo>
    </vpfterm>
    </ptvctinf>
    </spdoinfo>
    <spref>
    <horizsys>
    <geograph>
    <latres>0.000000</latres>
    <longres>0.000000</longres>
    <geogunit>Decimal degrees</geogunit>
    </geograph>
    <geodetic>
    <horizdn>D_WGS_1984</horizdn>
    <ellips>WGS_1984</ellips>
    <semiaxis>6378137.000000</semiaxis>
    <denflat>298.257224</denflat>
    </geodetic>
    </horizsys>
    <vertdef>
    <altsys>
    <altdatum>Mean Sea Level</altdatum>
    <altunits>1.0</altunits>
    </altsys>
    </vertdef>
    </spref>
    <eainfo>
    <detailed>
    <enttyp>
    <enttypl>
    lc.pat</enttypl>
    </enttyp>
    <attr>
    <attrlabl>FID</attrlabl>
    <attrdef>Internal feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Sequential unique whole numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>Shape</attrlabl>
    <attrdef>Feature geometry.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Coordinates defining the features.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>AREA</attrlabl>
    <attrdef>Area of feature in internal units squared.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Positive real numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>PERIMETER</attrlabl>
    <attrdef>Perimeter of feature in internal units.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Positive real numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>LC#</attrlabl>
    <attrdef>Internal feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    <attrdomv>
    <udom>Sequential unique whole numbers that are automatically generated.</udom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>LC-ID</attrlabl>
    <attrdef>User-defined feature number.</attrdef>
    <attrdefs>ESRI</attrdefs>
    </attr>
    <attr>
    <attrlabl>LCAREA.AFT_ID</attrlabl>
    </attr>
    <attr>
    <attrlabl>LCPYTYPE</attrlabl>
    <attrdef>Land cover poygon type</attrdef>
    <attrdefs>NIMA</attrdefs>
    <attrdomv>
    <edom>
    <edomv>1</edomv>
    <edomvd>Rice Field</edomvd>
    </edom>
    <edom>
    <edomv>2</edomv>
    <edomvd>Cranberry bog</edomvd>
    </edom>
    <edom>
    <edomv>3</edomv>
    <edomvd>Cultivated area, garden</edomvd>
    </edom>
    <edom>
    <edomv>4</edomv>
    <edomvd>Peat cuttings</edomvd>
    </edom>
    <edom>
    <edomv>5</edomv>
    <edomvd>Salt pan</edomvd>
    </edom>
    <edom>
    <edomv>6</edomv>
    <edomvd>Fish pond or hatchery</edomvd>
    </edom>
    <edom>
    <edomv>7</edomv>
    <edomvd>Quarry, strip mine, mine dump, blasting area</edomvd>
    </edom>
    <edom>
    <edomv>8</edomv>
    <edomvd>Oil or gas</edomvd>
    </edom>
    <edom>
    <edomv>10</edomv>
    <edomvd>Lava flow</edomvd>
    </edom>
    <edom>
    <edomv>11</edomv>
    <edomvd>Distorted surface area</edomvd>
    </edom>
    <edom>
    <edomv>12</edomv>
    <edomvd>Unconsolidated material (sand or gravel, glacial moraine)</edomvd>
    </edom>
    <edom>
    <edomv>13</edomv>
    <edomvd>Natural landmark area</edomvd>
    </edom>
    <edom>
    <edomv>14</edomv>
    <edomvd>Inundated area</edomvd>
    </edom>
    <edom>
    <edomv>15</edomv>
    <edomvd>Undifferentiated wetlands</edomvd>
    </edom>
    <edom>
    <edomv>99</edomv>
    <edomvd>None</edomvd>
    </edom>
    </attrdomv>
    </attr>
    <attr>
    <attrlabl>TILE_ID</attrlabl>
    <attrdef>VPF Format tile ID</attrdef>
    <attrdefs>NIMA</attrdefs>
    </attr>
    <attr>
    <attrlabl>FAC_ID</attrlabl>
    </attr>
    </detailed>
    <overview>
    <eaover>The DCW used a product-specific attribute coding system that is composed of TYPE and STATUS designators for area, line, and point features; and LEVEL and SYMBOL designators for text features. The TYPE attribute specifies what the feature is, while the STATUS attribute specifies the current condition of the feature. Some features require both a TYPE and STATUS code to uniquely identify their characteristics. In order to uniquely identify each geographic attribute in the DCW, the TYPE and STATUS attribute code names are preceded by the two letter coverage abbreviation and a two letter abbreviation for the type of graphic primitive present. The DCW Type/Status codes were mapped into the FACC coding scheme. A full description of FACC may be found in Digital Geographic Information Exchange Standard Edition 1.2, January 1994.</eaover>
    <eadetcit>Entities (features) and Attributes for DCW are fully described in: Department of Defense, 1992, Military Specification Digital Chart of the World (MIL-D-89009): Philadelphia, Department of Defense, Defense Printing Service Detachment Office.</eadetcit>
    </overview>
    </eainfo>
    <distinfo>
    <distrib>
    <cntinfo>
    <cntorgp>
    <cntorg>NIMA</cntorg>
    </cntorgp>
    <cntpos>ATTN: CC, MS D-16</cntpos>
    <cntaddr>
    <addrtype>mailing and physical address</addrtype>
    <address>6001 MacArthur Blvd.</address>
    <city>Bethesda</city>
    <state>MD</state>
    <postal>20816-5001</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>301-227-2495</cntvoice>
    <cntfax>301-227-2498</cntfax>
    </cntinfo>
    </distrib>
    <distliab>None</distliab>
    <stdorder>
    <digform>
    <digtinfo>
    <formname>VPF</formname>
    <formverd>19930930</formverd>
    <formspec>Military Standard Vector Product Format (MIL-STD-2407). The current version of this document is dated 28 June 1996. Edition 3 of VMap-0 conforms to a previous version of the VPF Standard as noted. Future versions of VMap-0 will conform to the current version of VPF Standard.</formspec>
    <transize>0.172</transize>
    </digtinfo>
    <digtopt>
    <offoptn>
    <offmedia>CD-ROM</offmedia>
    <recfmt>ISO 9660</recfmt>
    </offoptn>
    </digtopt>
    </digform>
    <fees>Not Applicable</fees>
    </stdorder>
    </distinfo>
    <distinfo>
    <distrib>
    <cntinfo>
    <cntorgp>
    <cntorg>USGS Map Sales</cntorg>
    </cntorgp>
    <cntaddr>
    <addrtype>mailing address</addrtype>
    <address>Box 25286</address>
    <city>Denver</city>
    <state>CO</state>
    <postal>80225</postal>
    <country>US</country>
    </cntaddr>
    <cntvoice>303-236-7477</cntvoice>
    <cntfax>303-236-1972</cntfax>
    </cntinfo>
    </distrib>
    <distliab>None</distliab>
    <stdorder>
    <digform>
    <digtinfo>
    <transize>0.172</transize>
    </digtinfo>
    </digform>
    <fees>$82.50 per four disk set</fees>
    <ordering>For General Public: Payment (check, money order, purchase order, or Government account) must accompany order.
    Make all drafts payable to Dept. of the Interior- US Geological Survey.
    To provide a general idea of content, a sample data set is available from the TMPO Home Page at:</ordering>
    </stdorder>
    </distinfo>
    <distinfo>
    <distrib>
    <cntinfo>
    <cntorgp>
    <cntorg>Geodesy Team, Harvard University</cntorg>
    <cntper>Geodesy Team</cntper>
    </cntorgp>
    <cntemail>[email protected]</cntemail>
    </cntinfo>
    </distrib>
    <resdesc>Geodesy layer</resdesc>
    <distliab>None</distliab>
    <stdorder>
    <digform>
    <digtinfo>
    <formname>SHP</formname>
    <transize>0.172</transize>
    </digtinfo>
    <digtopt>
    <onlinopt>
    <computer>
    <networka>
    <networkr>geodesy.harvard.edu</networkr>
    </networka>
    </computer>
    </onlinopt>
    </digtopt>
    </digform>
    <fees>none</fees>
    </stdorder>
    <availabl>
    <timeinfo>
    <sngdate>
    <caldate>1992</caldate>
    </sngdate>
    </timeinfo>
    </availabl>
    </distinfo>
    <metainfo>
    <metd>20010226</metd>
    <metc>
    <cntinfo>
    <cntorgp>
    <cntorg>Geodesy Team</cntorg>
    <cntper>REQUIRED: The person responsible for the metadata information.</cntper>
    </cntorgp>
    <cntvoice>REQUIRED: The telephone number by which individuals can speak to the organization or individual.</cntvoice>
    <cntemail>[email protected]</cntemail>
    </cntinfo>
    </metc>
    <metstdn>FGDC Content Standards for Digital Geospatial Metadata</metstdn>
    <metstdv>FGDC-STD-001-1998</metstdv>
    <mettc>local time</mettc>
    <metextns>
    <onlink>http://www.esri.com/metadata/esriprof80.html</onlink>
    <metprof>ESRI Metadata Profile</metprof>
    </metextns>
    </metainfo>
    </metadata>

Have you tried the directory and bfile methods? Here is the example for that in the Oracle XML Developer's Guide:
CREATE DIRECTORY xmldir AS 'path_to_folder_containing_XML_file';
Example 3-3 Inserting XML Content into an XMLType Table
INSERT INTO mytable2 VALUES (XMLType(bfilename('XMLDIR', 'purchaseOrder.xml'), nls_charset_id('AL32UTF8')));
1 row created.
The value passed to nls_charset_id() indicates that the encoding for the file to be read is UTF-8.
    ben

Loading an XML document into tables

    Hi,
    I have created a view for the tables below.
    Table Book with columns:
    - Book_ID
    - Book_Name
    - Ref_To_Price ( -> is a reference to Table Price (Price_ID))
    Table Price with columns:
    - Price_ID
    - Price_DM
    SQL-Syntax:
    "create view Bookprice as select Book_ID, Book_Name, Ref_To_Price, Price_ID, Price_DM from Book, Price where Ref_To_Price = Price_ID; "
    XML-Document:
    <?xml version="1.0" encoding="UTF-16"?>
    <!DOCTYPE ANWENDUNGEN SYSTEM "file:/E:/book.dtd">
    <!-- ?xml-stylesheet href="book.xsl" type="text/xsl"? -->
    <ROOTDOC>
    <ROW>
    <BOOK_ID>66-77</BOOK_ID>
    <BOOK_NAME>JavaScript</BOOK_NAME>
    <REF_TO_PRICE>12</REF_TO_PRICE>
    <PRICE_ID>12</PRICE_ID>
<PRICE_DM>25.50DM</PRICE_DM>
    </ROW>
    </ROOTDOC>
If I use the XML SQL Utility to insert the XML document, the following error message comes up:
    " Exception in thread "main" oracle.xml.sql.OracleXMLSQLException: java.sql.SQLException: ORA-01776: cannot modify more than one base table through a join view "
    Can anyone help me, please ?
    null

    Hi,
This is a classic join view problem, where you cannot update two tables in one shot. The main problem is that your two tables are not normalized correctly. Why can't they be just one table?
OK, if that is not possible, then the best way out is to create a simple INSTEAD OF trigger on the view which will insert correctly, e.g.
CREATE TRIGGER bookprice_tr INSTEAD OF INSERT ON Bookprice
FOR EACH ROW
BEGIN
  INSERT INTO Book VALUES (:NEW.Book_ID, :NEW.Book_Name, :NEW.Ref_To_Price);
  INSERT INTO Price VALUES (:NEW.Price_ID, :NEW.Price_DM);
END;
/
    Hope this helps,
    Murali
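
With the trigger in place, a single insert through the view should populate both base tables; using the values from the sample document:
INSERT INTO Bookprice (Book_ID, Book_Name, Ref_To_Price, Price_ID, Price_DM)
VALUES ('66-77', 'JavaScript', 12, 12, '25.50DM');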

Inserting an XML document into XDB fails with "can't convert to OPAQUE"

    Hi,
When I try to insert a document using Oracle 9.2.0.5 client software into a 9.2.0.5 database, with the following code:
void insertDocument(String document) throws SQLException {
    XMLType xt = XMLType.createXML(connection, document);
    PreparedStatement ps = connection.prepareStatement("insert into xmldocuments values(?)");
    ps.setObject(1, xt);
    ps.executeUpdate();
}
The setObject call always throws an exception (after a very long time) with:
java.sql.SQLException: Fail to convert to internal representation: OPAQUE()
This also fails when we use the InputStream and org.w3c.dom.Document variants.
We use the OCI driver; otherwise we get errors retrieving the documents from XML DB.
    What is going wrong here?

    David,
If you search through the historical data in this list there are previous posts regarding OPAQUE.
These may be useful. Possibly you're reaching the size limit.

Help - Inserting an XML document into an Oracle 8i Column (CLOB Type)

    Hi JavaGurus,
I am looking for simple Java code which will take my XML document as input and insert it into an Oracle 8i database column of type CLOB.
An insert statement won't work, and I cannot use SQL*Loader.
Anyone?
    JK
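
One pattern that gets around the literal-length limit of a plain INSERT is to insert an empty_clob() and then append the XML text through DBMS_LOB. A hedged PL/SQL sketch; the table and column names are hypothetical, and a JDBC version would follow the same insert-then-write shape:
DECLARE
  cl  CLOB;
  buf VARCHAR2(32767) := '<doc><title>test</title></doc>';  -- the XML text; append in chunks if larger
BEGIN
  -- create the row with an empty LOB and grab its locator
  INSERT INTO xml_docs (id, xml_data)
  VALUES (1, empty_clob())
  RETURNING xml_data INTO cl;
  -- write the document into the LOB
  dbms_lob.writeappend(cl, length(buf), buf);
  COMMIT;
END;
/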

    Maybe you can adapt some of the code in Oracle's "LOB Datatype" example, which is a complete working program that stores and retrieves BLOBs and CLOBs.
    http://www.oracle.com/technology/sample_code/tech/java/sqlj_jdbc/files/advanced/advanced.html

  • Loading XML Document into Structured Storage

HI Everybody,
I am confronted with a bug in Oracle which is being fixed, but that will take a long time. I have a relatively complex schema with manifold includes, imports and inheritances, which I can register successfully. The object-relational table is also generated. But loading the instance document with INSERT INTO ... fails due to a severe internal error. I have not yet tried SQL*Loader, however.
Please, does anyone know a safe alternative method to parse and load? Structured storage is mandatory because I need to retrieve individual elements and attributes. Instance documents are big, ca. 20 MB.
    Thanks, regards
    Miklos HERBOLY

HI,
The Oracle version is 11.1.0.7.0.
Metalink found three bugs related to the error ORA-00600: internal error code, arguments: [qmxConvUnkType], [], [], [], [],
The bugs are:
BUG 8644684 ORA-600 [QMXCONVUNKTYPE] DURING INSERT INTO SCHEMA-BASED TABLE
BUG 8671408 INSERT OF XML DOC INTO AN XML SCHEMA STORED AS BINARY XML FAILS WITH ORA-30937
BUG 8683472 SCHEMA EXECUTABLE RETURNS INCORRECT ERRORS DURING VALIDATION AGAINST XML SCHEMA
Well, I still have to look around in the XML DB Forum.
