Loading an XML document into tables
Hi,
I have created a view for the tables below.
Table Book with columns:
- Book_ID
- Book_Name
- Ref_To_Price ( -> is a reference to Table Price (Price_ID))
Table Price with columns:
- Price_ID
- Price_DM
SQL-Syntax:
"create view Bookprice as select Book_ID, Book_Name, Ref_To_Price, Price_ID, Price_DM from Book, Price where Ref_To_Price = Price_ID; "
XML-Document:
<?xml version="1.0" encoding="UTF-16"?>
<!DOCTYPE ANWENDUNGEN SYSTEM "file:/E:/book.dtd">
<!-- ?xml-stylesheet href="book.xsl" type="text/xsl"? -->
<ROOTDOC>
<ROW>
<BOOK_ID>66-77</BOOK_ID>
<BOOK_NAME>JavaScript</BOOK_NAME>
<REF_TO_PRICE>12</REF_TO_PRICE>
<PRICE_ID>12</PRICE_ID>
<PRICE_DM>25.50DM</PRICE_DM>
</ROW>
</ROOTDOC>
When I use the XML SQL Utility to insert the XML document, the following error message comes up:
" Exception in thread "main" oracle.xml.sql.OracleXMLSQLException: java.sql.SQLException: ORA-01776: cannot modify more than one base table through a join view "
Can anyone help me, please ?
Hi,
This is a classic join view problem, where you cannot update two tables in one shot. The main problem is that your two tables are not normalized correctly. Why can't they be in just one table?
OK, if that is not possible, then the best way out is to create a simple INSTEAD OF trigger on the view which will insert correctly,
e.g.
CREATE TRIGGER bookprice_tr INSTEAD OF INSERT ON Bookprice
FOR EACH ROW
BEGIN
  insert into Book
  values (:NEW.Book_id, :NEW.Book_name, :NEW.Ref_to_Price);
  insert into Price
  values (:NEW.Price_ID, :NEW.Price_DM);
END;
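With the trigger in place, a single insert through the join view then populates both base tables. A minimal sketch, using the sample values from the XML document above (the column datatypes are an assumption):

```sql
-- Hypothetical test insert; the INSTEAD OF trigger routes the Book
-- columns to table Book and the Price columns to table Price.
insert into Bookprice (Book_ID, Book_Name, Ref_To_Price, Price_ID, Price_DM)
values ('66-77', 'JavaScript', 12, 12, '25.50DM');
```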
Hope this helps,
Murali
Similar Messages
-
Loading this xml data into tables
Hello,
I am having a problem loading this XML file into tables. The xml file structure is
<FILE>
<ACCESSION>
some ids
<INSTANCE>
some data
<VARIATION>
</VARIATION>
<VARIATION>
</VARIATION> variation gets repeated a number of times
<ASSEMBLY>
</ASSEMBLY>
<ASSEMBLY>
</ASSEMBLY> Assembly gets repeated a number of times.
</INSTANCE>
</ACCESSION>
</FILE>
I created a table which has the structure:
create table accession(
accession_id varchar2(20),
Instance instance_type);
create or replace type instance_type as object
(method varchar2(20),
class varchar2(20),
source varchar2(20),
num_char number(10),
variation variation_type,
assembly assembly_type);
create or replace type variation_type as object
(value varchar2(2),
count number(10),
frequency number(10),
pop_id varchar2(10));
I created a similar type for assembly.
When I load it, I could only store the first variation data but not the subsequent ones. Similarly for assembly I could only store the first data but not the subsequent ones.
Could anyone let me know how I could store this data into tables? I have also included a sample XML file in this message.
Thank You for your help.
Rama.
Here is the sample xml file.
<?xml version="1.0" ?>
<FILE>
<ACCESSION>
<ACCESSION_ID>accid1</ACCESSION_ID>
<INSTANCE>
<METHOD>method1</METHOD>
<CLASS>class1</CLASS>
<SOURCE>source1</SOURCE>
<NUM_CHAR>40</NUM_CHAR>
<VARIATION>
<VALUE>G</VALUE>
<COUNT>5</COUNT>
<FREQUENCY>66</FREQUENCY>
<POP1>pop1</POP1>
<POP2>pop1</POP2>
</VARIATION>
<VARIATION>
<VALUE>C</VALUE>
<COUNT>2</COUNT>
<FREQUENCY>33</FREQUENCY>
<POP_ID1>pop2</POP_ID1>
</VARIATION>
<ASSEMBLY>
<ASSEMBLY_ID>1</ASSEMBLY_ID>
<BEGIN>180</BEGIN>
<END>180</END>
<TYPE>2</TYPE>
<ORI>-</ORI>
<OFFSET>0</OFFSET>
</ASSEMBLY>
<ASSEMBLY>
<ASSEMBLY_ID>2</ASSEMBLY_ID>
<BEGIN>235</BEGIN>
<END>235</END>
<TYPE>2</TYPE>
<ORI>-</ORI>
<OFFSET>0</OFFSET>
</ASSEMBLY>
</INSTANCE>
</ACCESSION>
</FILE>
Hello,
I figured out how to load this XML file by using cast(multiset(
So never mind.
Thank You.
Rama.
-
To load an XML document into 40 tables
How do I load a large XML document into 40 tables? Most of the examples I see only load one table into the Oracle database.
From the above document:
Storing XML Data Across Tables
Question
Can the XML-SQL Utility store XML data across tables?
Answer
Currently the XML-SQL Utility (XSU) can only store to a single table. It maps a canonical representation of an XML document into any table/view. But of course there is a way to store XML with the XSU across tables. One can do this using XSLT to transform any document into multiple documents and insert them separately. Another way is to define views over multiple tables (object views if needed) and then do the inserts ... into the view. If the view is inherently non-updatable (because of complex joins, ...), then one can use INSTEAD-OF triggers over the views to do the inserts.
-- I've tried this, works fine.
-
Load an XML file into table(s)
Hi ,
I have to load data from an XML file into an Oracle DB, but I have never used this kind of process before. The purpose is to use Oracle standard features (stored procedures, functions, APIs) as much as possible.
Can someone explain to me in simple terms how to do it? Thanks in advance for your help.
The XML must not be stored in the database; only the final tables are.
Values can be inserted into, updated in, or deleted from the final tables.
Here are the versions of the tools I am using :
Oracle RDBMS : 10.2.0.4.0
Oracle Applications : 11.5.10.2
Toad : 9.5.0.31
SQL Plus : 8.0.6.0.0
The header of the xsd :
<?xml version="1.0" encoding="windows-1252" ?>
<!-- edited with XMLSPY v2004 rel. 4 U (http://www.xmlspy.com) by erik de bruyn (Graydon) -->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns="http://www.w3schools.com" targetNamespace="http://www.w3schools.com" elementFormDefault="qualified">
An extract of the xml :
<?xml version="1.0" encoding="windows-1252" ?>
<GraydonBeDialogue>
<TransactionCode>RTB</TransactionCode>
<Table ClassTable="Country">
<TableEntry>
<TableLanguage>N</TableLanguage>
<TableCode>AD</TableCode>
<TableValue>Andorra</TableValue>
</TableEntry>
<TableEntry>
<TableLanguage>N</TableLanguage>
<TableCode>AE</TableCode>
<TableValue>Verenigde Arabische Emiraten</TableValue>
</TableEntry>
</Table>
<Table ClassTable="Summons">
<TableEntry>
<TableLanguage>N</TableLanguage>
<TableCode>D</TableCode>
<TableValue>De dagvaarding is het gevolg</TableValue>
</TableEntry>
<TableEntry>
<TableLanguage>N</TableLanguage>
<TableCode>S</TableCode>
<TableValue>Doorgehaald bij de arbeidsrechtbank</TableValue>
</TableEntry>
</Table>
</GraydonBeDialogue>
The result I would have :
Two tables (Country and Summons), each containing 3 columns (TableLanguage, TableCode, TableValue):
Table Country : TableLanguage TableCode TableValue
N AD Andorra
N AE Verenigde Arabische Emiraten
Table Summons : TableLanguage TableCode TableValue
N D De dagvaarding is het gevolg
N S Doorgehaald bij de arbeidsrechtbank
For table Country:
create table Country as
with t as (
select
xmltype (
'<?xml version="1.0" encoding="windows-1252" ?>
<GraydonBeDialogue>
<TransactionCode>RTB</TransactionCode>
<Table ClassTable="Country">
<TableEntry>
<TableLanguage>N</TableLanguage>
<TableCode>AD</TableCode>
<TableValue>Andorra</TableValue>
</TableEntry>
<TableEntry>
<TableLanguage>N</TableLanguage>
<TableCode>AE</TableCode>
<TableValue>Verenigde Arabische Emiraten</TableValue>
</TableEntry>
</Table>
<Table ClassTable="Summons">
<TableEntry>
<TableLanguage>N</TableLanguage>
<TableCode>D</TableCode>
<TableValue>De dagvaarding is het gevolg</TableValue>
</TableEntry>
<TableEntry>
<TableLanguage>N</TableLanguage>
<TableCode>S</TableCode>
<TableValue>Doorgehaald bij de arbeidsrechtbank</TableValue>
</TableEntry>
</Table>
</GraydonBeDialogue>') as xml
from dual)
select x.TableLanguage, x.TableCode, x.TableValue
from t
,xmltable('/GraydonBeDialogue/Table[@ClassTable="Country"]/TableEntry'
passing t.xml
columns TableLanguage varchar2(50) path '/TableEntry/TableLanguage'
, TableCode varchar2(50) path '/TableEntry/TableCode'
, TableValue varchar2(50) path '/TableEntry/TableValue'
) x;
-
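Instead of inlining the document as a string literal, the same query can read the file from disk. A sketch, assuming a directory object XML_DIR has been created for the folder holding the file and that the file is named dialogue.xml (both names are hypothetical):

```sql
create table Country as
with t as (
  select xmltype(bfilename('XML_DIR', 'dialogue.xml'),
                 nls_charset_id('WE8MSWIN1252')) as xml  -- file is windows-1252
  from dual)
select x.TableLanguage, x.TableCode, x.TableValue
from t
    ,xmltable('/GraydonBeDialogue/Table[@ClassTable="Country"]/TableEntry'
              passing t.xml
              columns TableLanguage varchar2(50) path '/TableEntry/TableLanguage'
                    , TableCode varchar2(50) path '/TableEntry/TableCode'
                    , TableValue varchar2(50) path '/TableEntry/TableValue'
             ) x;
```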
Existing XML documents into KM
Hello Everyone
Is there a way to load existing XML documents into KM and then maintain them using XML forms built with the Form Builder? Any help is greatly appreciated.
Thanks
Swetha
Hi Renuka,
You can move the XML projects from one server to another. The XML forms created are stored in the path 'etc\xmlforms' in KM. You can export them and upload them to the target server instead of creating the same forms on the target server from scratch.
Regards
Geogi
-
How to load an XML file into a table
Hi,
I've been working on Oracle for many years, but for the first time I have been asked to load an XML file into a table.
As an example, I found this on the web, but it doesn't work.
Can someone tell me why? I hoped this example could help me.
the file acct.xml is this:
<?xml version="1.0"?>
<ACCOUNT_HEADER_ACK>
<HEADER>
<STATUS_CODE>100</STATUS_CODE>
<STATUS_REMARKS>check</STATUS_REMARKS>
</HEADER>
<DETAILS>
<DETAIL>
<SEGMENT_NUMBER>2</SEGMENT_NUMBER>
<REMARKS>rp polytechnic</REMARKS>
</DETAIL>
<DETAIL>
<SEGMENT_NUMBER>3</SEGMENT_NUMBER>
<REMARKS>rp polytechnic administration</REMARKS>
</DETAIL>
<DETAIL>
<SEGMENT_NUMBER>4</SEGMENT_NUMBER>
<REMARKS>rp polytechnic finance</REMARKS>
</DETAIL>
<DETAIL>
<SEGMENT_NUMBER>5</SEGMENT_NUMBER>
<REMARKS>rp polytechnic logistics</REMARKS>
</DETAIL>
</DETAILS>
<HEADER>
<STATUS_CODE>500</STATUS_CODE>
<STATUS_REMARKS>process exception</STATUS_REMARKS>
</HEADER>
<DETAILS>
<DETAIL>
<SEGMENT_NUMBER>20</SEGMENT_NUMBER>
<REMARKS> base polytechnic</REMARKS>
</DETAIL>
<DETAIL>
<SEGMENT_NUMBER>30</SEGMENT_NUMBER>
</DETAIL>
<DETAIL>
<SEGMENT_NUMBER>40</SEGMENT_NUMBER>
<REMARKS> base polytechnic finance</REMARKS>
</DETAIL>
<DETAIL>
<SEGMENT_NUMBER>50</SEGMENT_NUMBER>
<REMARKS> base polytechnic logistics</REMARKS>
</DETAIL>
</DETAILS>
</ACCOUNT_HEADER_ACK>
For the two tags HEADER and DETAILS I have the table:
create table xxrp_acct_details(
status_code number,
status_remarks varchar2(100),
segment_number number,
remarks varchar2(100)
);
Before that, I created:
create directory test_dir as 'c:\esterno'; -- where I have my acct.xml
And now, can you give me a script for loading the data using XMLTABLE?
I've tried this but it doesn't work:
DECLARE
acct_doc xmltype := xmltype( bfilename('TEST_DIR','acct.xml'), nls_charset_id('AL32UTF8') );
BEGIN
insert into xxrp_acct_details (status_code, status_remarks, segment_number, remarks)
select x1.status_code,
x1.status_remarks,
x2.segment_number,
x2.remarks
from xmltable(
'/ACCOUNT_HEADER_ACK/HEADER'
passing acct_doc
columns header_no for ordinality,
status_code number path 'STATUS_CODE',
status_remarks varchar2(100) path 'STATUS_REMARKS'
) x1,
xmltable(
'$d/ACCOUNT_HEADER_ACK/DETAILS[$hn]/DETAIL'
passing acct_doc as "d",
x1.header_no as "hn"
columns segment_number number path 'SEGMENT_NUMBER',
remarks varchar2(100) path 'REMARKS'
) x2;
END;
This should allow me to get something like this:
select * from xxrp_acct_details;
STATUS_CODE STATUS_REMARKS SEGMENT_NUMBER REMARKS
100 check 2 rp polytechnic
100 check 3 rp polytechnic administration
100 check 4 rp polytechnic finance
100 check 5 rp polytechnic logistics
500 process exception 20 base polytechnic
500 process exception 30
500 process exception 40 base polytechnic finance
500 process exception 50 base polytechnic logistics
but I get:
Error report:
ORA-06550: line 19, column 11:
PL/SQL: ORA-00932: inconsistent datatypes: expected - got NUMBER
ORA-06550: line 4, column 2:
PL/SQL: SQL Statement ignored
06550. 00000 - "line %s, column %s:\n%s"
*Cause: Usually a PL/SQL compilation error.
and if I try to change the script without using the column HEADER_NO to keep track of the header rank inside the document:
DECLARE
acct_doc xmltype := xmltype( bfilename('TEST_DIR','acct.xml'), nls_charset_id('AL32UTF8') );
BEGIN
insert into xxrp_acct_details (status_code, status_remarks, segment_number, remarks)
select x1.status_code,
x1.status_remarks,
x2.segment_number,
x2.remarks
from xmltable(
'/ACCOUNT_HEADER_ACK/HEADER'
passing acct_doc
columns status_code number path 'STATUS_CODE',
status_remarks varchar2(100) path 'STATUS_REMARKS'
) x1,
xmltable(
'/ACCOUNT_HEADER_ACK/DETAILS'
passing acct_doc
columns segment_number number path 'SEGMENT_NUMBER',
remarks varchar2(100) path 'REMARKS'
) x2;
END;
I get this message:
Error report:
ORA-19114: error during parsing the XQuery expression:
ORA-06550: line 1, column 13:
PLS-00201: identifier 'SYS.DBMS_XQUERYINT' must be declared
ORA-06550: line 1, column 7:
PL/SQL: Statement ignored
ORA-06512: at line 4
19114. 00000 - "error during parsing the XQuery expression: %s"
*Cause: An error occurred during the parsing of the XQuery expression.
*Action: Check the detailed error message for the possible causes.
My Oracle version is 10gR2 Express Edition.
I need a script for loading XML files into a table as soon as possible. Please give me a simple example that is easy to understand and that works on 10gR2 Express Edition.
Thanks in advance!
The reason your first SQL statement
select x1.status_code,
x1.status_remarks,
x2.segment_number,
x2.remarks
from xmltable(
'/ACCOUNT_HEADER_ACK/HEADER'
passing acct_doc
columns header_no for ordinality,
status_code number path 'STATUS_CODE',
status_remarks varchar2(100) path 'STATUS_REMARKS'
) x1,
xmltable(
'$d/ACCOUNT_HEADER_ACK/DETAILS[$hn]/DETAIL'
passing acct_doc as "d",
x1.header_no as "hn"
columns segment_number number path 'SEGMENT_NUMBER',
remarks varchar2(100) path 'REMARKS'
) x2
returns the error you noticed
PL/SQL: ORA-00932: inconsistent datatypes: expected - got NUMBER
is because Oracle is expecting XML to be passed in. At the moment I forget if it requires a certain format or not, but it is simply expecting the value to be wrapped in simple XML.
Your query actually runs as is on 11.1 as Oracle changed how that functionality worked when 11.1 was released. Your query runs slowly, but it does run.
As you are dealing with groups, is there any way the input XML can be modified to be like
<ACCOUNT_HEADER_ACK>
<ACCOUNT_GROUP>
<HEADER>....</HEADER>
<DETAILS>....</DETAILS>
</ACCOUNT_GROUP>
<ACCOUNT_GROUP>
<HEADER>....</HEADER>
<DETAILS>....</DETAILS>
</ACCOUNT_GROUP>
</ACCOUNT_HEADER_ACK>
so that it is easier to associate a HEADER/DETAILS combination? If so, it would make parsing the XML much easier.
Assuming the answer is no, here is one hack to accomplish your goal
select x1.status_code,
x1.status_remarks,
x3.segment_number,
x3.remarks
from xmltable(
'/ACCOUNT_HEADER_ACK/HEADER'
passing acct_doc
columns header_no for ordinality,
status_code number path 'STATUS_CODE',
status_remarks varchar2(100) path 'STATUS_REMARKS'
) x1,
xmltable(
'$d/ACCOUNT_HEADER_ACK/DETAILS'
passing acct_doc as "d"
columns detail_no for ordinality,
detail_xml xmltype path 'DETAIL'
) x2,
xmltable(
'DETAIL'
passing x2.detail_xml
columns segment_number number path 'SEGMENT_NUMBER',
remarks varchar2(100) path 'REMARKS') x3
WHERE x1.header_no = x2.detail_no;
This follows the approach you started with. Table x1 creates a row for each HEADER node and table x2 creates a row for each DETAILS node. It assumes there is always a one and only one association between the two. I use table x3, which is joined to x2, to parse the many DETAIL nodes. The WHERE clause then joins each header row to the corresponding details row and produces the eight rows you are seeking.
There is another approach that I know of: using XQuery within the XMLTable. It should require only one XMLTable, but I would have to spend some time coming up with that solution, and I can't recall whether restrictions exist in 10gR2 Express Edition compared to 10.2 Enterprise Edition for XQuery.
-
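For the record, the XQuery variant alluded to above might look something like the sketch below (not verified on 10gR2 Express Edition; the ORA-19114/DBMS_XQUERYINT error earlier in this thread suggests XQuery support may be unavailable there):

```sql
-- Pair each HEADER with the DETAILS element at the same position,
-- using one XMLTable driven by a FLWOR expression.
select x.status_code, x.status_remarks, x.segment_number, x.remarks
from xmltable(
       'for $h at $i in $d/ACCOUNT_HEADER_ACK/HEADER,
            $dt in $d/ACCOUNT_HEADER_ACK/DETAILS[$i]/DETAIL
        return element r {
          $h/STATUS_CODE, $h/STATUS_REMARKS,
          $dt/SEGMENT_NUMBER, $dt/REMARKS
        }'
       passing acct_doc as "d"
       columns status_code number path 'STATUS_CODE',
               status_remarks varchar2(100) path 'STATUS_REMARKS',
               segment_number number path 'SEGMENT_NUMBER',
               remarks varchar2(100) path 'REMARKS'
     ) x;
```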
XML document into multiple tables
How do I insert an XML document into multiple tables? E.g. a purchase order with multiple line items. I have to insert the XML document into the parent as well as the child, with different sets of columns.
I created the tables using the create_ch14_tables.sql. I call it using java -classpath .;C:\commerceone\xpc\lib\xmlparserv2.jar;C:\commerceone\xpc\lib\classes12.zip;C:\commerceone\xpc\lib\xsu12.jar XMLLoader -file deptempdepend.xml -connName default -transform deptempdepend.xsl. The code doesn't seem to like the "<xsl:for-each select="Department">" tags. If I remove them, the insDoc.getDocumentElement().getFirstChild() will find the element, but it still doesn't insert anything into the database.
Thank You,
Dave
-
Loading an XML file into a table without creating a directory
Hi,
I want to load an XML file into a table column, but I should not create a directory on the server, place the XML file there, and give the path in the insert query. Can anybody help me here?
Thanks in advance.
You could write a Java stored procedure that retrieves the file into a CLOB. Wrap that in a function call and use it in your insert statement.
This solution requires read privileges granted by SYS and is therefore only feasible if the top-level directories are known or you get read access to everything.
-
I know Safari 1.3 does not support loading of an XML document, but does Safari 2 or 3 support the XML DOM? I will be really really really disappointed in Apple if they still do not support the XML DOM for parsing with JavaScript. Why is Apple so negligent of JavaScript?
My XML doc loads fine in IE 5.5+ and FF1+, but I cannot get Safari clients to load the document. This is extremely detrimental to my site, considering over 20% of the users are Safari clients. I absolutely need this XML doc to be available to them.
What is the safari specific javascript needed to load an XML document?
Any help is GREATLY appreciated
Mac OS X (10.4.8)
Why does no one know the answer to this question? I've posted in every forum I could find.
This procedure should be very standard and very easy, much like it is in IE/FF. Why does no one know how to do it for Safari?
-
How to convert an XML document into an Oracle temporary table
I am creating an XML document in Java and passing it to an Oracle database, and I need to fetch this XML into a result set/rowset.
Xml Structure
<Department deptno="100">
<DeptName>Sports</DeptName>
<EmployeeList>
<Employee empno="200"><Ename>John</Ename><Salary>33333</Salary>
</Employee>
<Employee empno="300"><Ename>Jack</Ename><Salary>333444</Salary>
</Employee>
</EmployeeList>
</Department>
I need it in this format:
Deptno DeptName empno Ename Salary
100 Sports 200 John 2500
100 Sports 300 Jack 3000
It does depend on your version, as odie suggests.
Here's a way that will work in 10g...
SQL> ed
Wrote file afiedt.buf
1 with t as (select xmltype('<Department deptno="100">
2 <DeptName>Sports</DeptName>
3 <EmployeeList>
4 <Employee empno="200"><Ename>John</Ename><Salary>33333</Salary>
5 </Employee>
6 <Employee empno="300"><Ename>Jack</Ename><Salary>333444</Salary>
7 </Employee>
8 </EmployeeList>
9 </Department>
10 ') as xml from dual)
11 --
12 -- End of test data, Use query below
13 --
14 select x.deptno, x.deptname
15 ,y.empno, y.ename, y.salary
16 from t
17 ,xmltable('/'
18 passing t.xml
19 columns deptno number path '/Department/@deptno'
20 ,deptname varchar2(10) path '/Department/DeptName'
21 ,emps xmltype path '/Department/EmployeeList'
22 ) x
23 ,xmltable('/EmployeeList/Employee'
24 passing x.emps
25 columns empno number path '/Employee/@empno'
26 ,ename varchar2(10) path '/Employee/Ename'
27 ,salary number path '/Employee/Salary'
28* ) y
SQL> /
DEPTNO DEPTNAME EMPNO ENAME SALARY
100 Sports 200 John 33333
100 Sports 300 Jack 333444
SQL>
If the XML is a string, e.g. a CLOB, then it can easily be converted to XMLTYPE using the XMLTYPE function.
-
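For instance, a minimal sketch of that conversion (docs and xml_clob are hypothetical names):

```sql
-- Convert a CLOB column to XMLTYPE on the fly, e.g. to feed XMLTable.
select xmltype(d.xml_clob) as xml_doc
from docs d;
```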
Problem when trying to load an XML document with DTD or XML SCHEMA
Hello
I have tried to load an XML document in Data Services. I created the xsd file and the dtd file (automatically, with the Altova XMLSpy software) to import into SAP Data Services 3.2.
In Data Services I imported the DTD and then read the XML file against it (the xml file is validated against the dtd file), and I could not read the xml correctly, because it tells me that an element named <item> does not exist in the import structure (the dtd), although it is in the xml.
I understand that the document root is the tag CUSTOMER_FULL_2014, and the data flow is called CARGA_XML_CUSTOMER.
It turns out that the <item> element is used to group the repeated xml elements, but it is used at different levels. My guess is that the dtd design is wrong, or there is something I am missing or have stated incorrectly.
Thank you.
xml
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
<?xml version="1.0" encoding="utf-8"?>
<CUSTOMER_FULL_2014>
<item>
<CUST_NO>202222</CUST_NO>
<ADDRESS>
<item>
<SHIP_TO>202222</SHIP_TO>
<NAME1>Henley.</NAME1>
<STREET>Vitacura #40</STREET>
<CITY>LIMA</CITY>
</item>
</ADDRESS>
<EQUIPMENT>
<item>
<EQUI_NO>81623</EQUI_NO>
</item>
<item>
<EQUI_NO>81633</EQUI_NO>
</item>
<item>
<EQUI_NO>81993</EQUI_NO>
</item>
<item>
<EQUI_NO>82003</EQUI_NO>
</item>
<item>
<EQUI_NO>82013</EQUI_NO>
</item>
<item>
<EQUI_NO>82103</EQUI_NO>
</item>
<item>
<EQUI_NO>82113</EQUI_NO>
</item>
<item>
<EQUI_NO>581203</EQUI_NO>
</item>
<item>
<EQUI_NO>900003-EMER</EQUI_NO>
</item>
<item>
<EQUI_NO>9000033-STOCK</EQUI_NO>
</item>
</EQUIPMENT>
<STORAGE_LOC>
<item>
<STOR_LOC_NO>0001</STOR_LOC_NO>
<DESCRIPTION>01 Parts Center</DESCRIPTION>
</item>
<item>
<STOR_LOC_NO>0056</STOR_LOC_NO>
<DESCRIPTION>56 henley</DESCRIPTION>
</item>
</STORAGE_LOC>
</item>
<item>
<CUST_NO>2007933434343</CUST_NO>
<ADDRESS>
<item>
<SHIP_TO>2007933434343</SHIP_TO>
<NAME1>Campos de Almacenaje SA</NAME1>
<STREET>Calacoto2, Calle 1</STREET>
<HOUSE_NO>Piso 1</HOUSE_NO>
<CITY>La Paz</CITY>
</item>
</ADDRESS>
<EQUIPMENT>
<item>
<EQUI_NO>90000-EMER</EQUI_NO>
</item>
<item>
<EQUI_NO>90000333-STOCK</EQUI_NO>
</item>
</EQUIPMENT>
<STORAGE_LOC>
<item>
<STOR_LOC_NO>00012</STOR_LOC_NO>
<DESCRIPTION>01 Parts Center</DESCRIPTION>
</item>
<item>
<STOR_LOC_NO>0056</STOR_LOC_NO>
<DESCRIPTION>56 henley</DESCRIPTION>
</item>
</STORAGE_LOC>
</item>
<item>
<CUST_NO>200801333</CUST_NO>
<ADDRESS>
<item>
<SHIP_TO>200801333</SHIP_TO>
<NAME1>CONSTRUCTORA SA.</NAME1>
<STREET>Ruta Panamericana Km 100</STREET>
<CITY>San Antonio 23</CITY>
</item>
</ADDRESS>
<EQUIPMENT>
<item>
<EQUI_NO>1507933</EQUI_NO>
</item>
<item>
<EQUI_NO>1509733</EQUI_NO>
</item>
<item>
<EQUI_NO>90000-EMER</EQUI_NO>
</item>
<item>
<EQUI_NO>90000333-STOCK</EQUI_NO>
</item>
</EQUIPMENT>
<STORAGE_LOC>
<item>
<STOR_LOC_NO>0001</STOR_LOC_NO>
<DESCRIPTION>01 Parts Center</DESCRIPTION>
</item>
<item>
<STOR_LOC_NO>0056</STOR_LOC_NO>
<DESCRIPTION>56 henley</DESCRIPTION>
</item>
</STORAGE_LOC>
</item>
</CUSTOMER_FULL_2014>
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
DTD FILE CREATED (automatically with XMLSpy)
<?xml version="1.0" encoding="UTF-8"?>
<!-- DTD generado con XMLSpy v2014 rel. 2 (x64) (http://www.altova.com) -->
<!ELEMENT CITY (#PCDATA)>
<!ELEMENT item ((SHIP_TO, NAME1, STREET, HOUSE_NO?, CITY) | (CUST_NO, ADDRESS, EQUIPMENT, STORAGE_LOC) | (STOR_LOC_NO, DESCRIPTION) | EQUI_NO)>
<!ELEMENT NAME1 (#PCDATA)>
<!ELEMENT STREET (#PCDATA)>
<!ELEMENT ADDRESS (item)>
<!ELEMENT CUST_NO (#PCDATA)>
<!ELEMENT EQUI_NO (#PCDATA)>
<!ELEMENT SHIP_TO (#PCDATA)>
<!ELEMENT HOUSE_NO (#PCDATA)>
<!ELEMENT EQUIPMENT (item+)>
<!ELEMENT DESCRIPTION (#PCDATA)>
<!ELEMENT STORAGE_LOC (item+)>
<!ELEMENT STOR_LOC_NO (#PCDATA)>
<!ELEMENT CUSTOMER_FULL_2014 (item+)>
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
The error from the Data Services monitor:
11676 | 5184 | XML-240108 | 11-04-2014 17:34:16 | |Data flow CARGA_XML_CUSTOMER|Reader READ MESSAGE customer OUTPUT(customer)
11676 | 5184 | XML-240108 | 11-04-2014 17:34:16 | An element named <item> present in the XML data input does not exist in the XML format used to set up this XML source in data flow <CARGA_XML_CUSTOMER>. Please validate your XML data.
11676 | 5184 | XML-240307 | 11-04-2014 17:34:16 | |Data flow CARGA_XML_CUSTOMER|Reader READ MESSAGE customer OUTPUT(customer)
11676 | 5184 | XML-240307 | 11-04-2014 17:34:16 | XML parser failed: See previously displayed error message.
The Error from Monitor
The metadata DTD
Thanks
Juan
Hi Juan Juan,
I generated a new XSD file using MS Visual Studio based on your XML file.
The empty spaces in the source are replaced with <Null> in the target table.
Here is the XSD:
<?xml version="1.0" encoding="utf-8"?>
<xs:schema attributeFormDefault="unqualified" elementFormDefault="qualified" xmlns:xs="http://www.w3.org/2001/XMLSchema">
<xs:element name="CUSTOMER_FULL_20140207_033015_001">
<xs:complexType>
<xs:sequence>
<xs:choice maxOccurs="unbounded">
<xs:element name="CUST_NO" type="xs:unsignedInt" />
<xs:element name="ADDRESS">
<xs:complexType>
<xs:sequence>
<xs:element name="SHIP_TO" type="xs:unsignedInt" />
<xs:element name="NAME1" type="xs:string" />
<xs:element name="STREET" type="xs:string" />
<xs:element minOccurs="0" name="HOUSE_NO" type="xs:string" />
<xs:element name="CITY" type="xs:string" />
</xs:sequence>
</xs:complexType>
</xs:element>
<xs:element name="EQUIPMENT">
<xs:complexType>
<xs:sequence>
<xs:element maxOccurs="unbounded" name="EQUI_NO" type="xs:string" />
</xs:sequence>
</xs:complexType>
</xs:element>
<xs:element name="STORAGE_LOC">
<xs:complexType>
<xs:sequence>
<xs:choice maxOccurs="unbounded">
<xs:element name="STOR_LOC_NO" type="xs:unsignedByte" />
<xs:element name="DESCRIPTION" type="xs:string" />
</xs:choice>
</xs:sequence>
</xs:complexType>
</xs:element>
</xs:choice>
</xs:sequence>
</xs:complexType>
</xs:element>
</xs:schema>
For Source XML
<?xml version="1.0" encoding="utf-8"?>
<CUSTOMER_FULL_20140207_033015_001>
<CUST_NO>200530</CUST_NO>
<ADDRESS>
<SHIP_TO>903533</SHIP_TO>
<NAME1>HENLEY - PART MAQUINARIAS S.A.</NAME1>
<STREET>Dean Camilo # 148, San Carlos</STREET>
<CITY>LIMA</CITY>
</ADDRESS>
<EQUIPMENT>
<EQUI_NO>4442</EQUI_NO>
<EQUI_NO>8163</EQUI_NO>
<EQUI_NO>8199</EQUI_NO>
<EQUI_NO>8200</EQUI_NO>
<EQUI_NO>8201</EQUI_NO>
<EQUI_NO>8210</EQUI_NO>
<EQUI_NO>8211</EQUI_NO>
<EQUI_NO>58120</EQUI_NO>
<EQUI_NO>90000-EMERGENCY</EQUI_NO>
<EQUI_NO>90000-STOCK</EQUI_NO>
</EQUIPMENT>
<STORAGE_LOC>
<STOR_LOC_NO>0001</STOR_LOC_NO>
<DESCRIPTION>01 Parts Center</DESCRIPTION>
<STOR_LOC_NO>0056</STOR_LOC_NO>
<DESCRIPTION>56 HEN</DESCRIPTION>
</STORAGE_LOC>
<CUST_NO>200793</CUST_NO>
<ADDRESS>
<SHIP_TO>200793</SHIP_TO>
<NAME1>Minera San Cristobal S.A.</NAME1>
<STREET>Calacoto, Calle 90, Torre 2</STREET>
<HOUSE_NO>Piso 5</HOUSE_NO>
<CITY>La Paz</CITY>
</ADDRESS>
<EQUIPMENT>
<EQUI_NO>90000-EMERGENCY</EQUI_NO>
<EQUI_NO>90000-STOCK</EQUI_NO>
</EQUIPMENT>
<STORAGE_LOC>
<STOR_LOC_NO>0001</STOR_LOC_NO>
<DESCRIPTION>01 Parts Center</DESCRIPTION>
<STOR_LOC_NO>0056</STOR_LOC_NO>
<DESCRIPTION>56 HEN</DESCRIPTION>
</STORAGE_LOC>
<CUST_NO>200801</CUST_NO>
<ADDRESS>
<SHIP_TO>200801</SHIP_TO>
<NAME1>ISEMAR S.A.</NAME1>
<STREET>Ruta Km 28.45</STREET>
<CITY>Don Torcuato Paraguay</CITY>
</ADDRESS>
<EQUIPMENT>
<EQUI_NO>15079</EQUI_NO>
<EQUI_NO>15097</EQUI_NO>
<EQUI_NO>90000-EMERGENCY</EQUI_NO>
<EQUI_NO>90000-STOCK</EQUI_NO>
</EQUIPMENT>
<STORAGE_LOC>
<STOR_LOC_NO>0001</STOR_LOC_NO>
<DESCRIPTION>01 Parts Center</DESCRIPTION>
<STOR_LOC_NO>0056</STOR_LOC_NO>
<DESCRIPTION>56 HEN</DESCRIPTION>
</STORAGE_LOC>
</CUSTOMER_FULL_20140207_033015_001>
Output:
Regards,
Akhileshkiran.
-
Inserting a long XML document into XMLType
I'm trying to load the following document into an XMLType column in 10.2. I've tried every example I can find, and I can push the data into CLOBs using the Java workaround just fine (http://www.oracle.com/technology/sample_code/tech/java/codesnippet/xmldb/HowToLoadLargeXML.html).
Can anyone provide a solution or let me know if there is a limitation please?
Given the table;
SQL> describe xmltable_1
Name Null? Type
DOC_ID NUMBER
XML_DATA XMLTYPE
How do I load this data into 'XML_DATA'?
<?xml version="1.0" encoding="UTF-8"?>
<metadata>
<idinfo>
<citation>
<citeinfo>
<origin>Rand McNally and ESRI</origin>
<pubdate>1996</pubdate>
<title>ESRI Cities Geodata Set</title>
<geoform>vector digital data</geoform>
<onlink>\\OIS23\C$\Files\Working\Metadata\world\cities.shp</onlink>
</citeinfo>
</citation>
<descript>
<abstract>World Cities contains locations of major cities around the world. The cities include national capitals for each of the countries in World Countries 1998 as well as major population centers and landmark cities. World Cities was derived from ESRI's ArcWorld database and supplemented with other data from the Rand McNally New International Atlas</abstract>
<purpose>606 points, 4 descriptive fields. Describes major world cities.</purpose>
</descript>
<timeperd>
<timeinfo>
<sngdate>
<caldate>1996</caldate>
</sngdate>
</timeinfo>
<current>publication date</current>
</timeperd>
<status>
<progress>Complete</progress>
<update>None planned</update>
</status>
<spdom>
<bounding>
<westbc>
-165.270004</westbc>
<eastbc>
177.130188</eastbc>
<northbc>
78.199997</northbc>
<southbc>
-53.150002</southbc>
</bounding>
</spdom>
<keywords>
<theme>
<themekt>city</themekt>
<themekey>cities</themekey>
</theme>
</keywords>
<accconst>none</accconst>
<useconst>none</useconst>
<ptcontac>
<cntinfo>
<cntperp>
<cntper>unknown</cntper>
<cntorg>unknown</cntorg>
</cntperp>
<cntpos>unknown</cntpos>
<cntvoice>555-1212</cntvoice>
</cntinfo>
</ptcontac>
<datacred>ESRI</datacred>
<native>Microsoft Windows NT Version 4.0 (Build 1381) Service Pack 6; ESRI ArcCatalog 8.1.0.570</native>
</idinfo>
<dataqual>
<attracc>
<attraccr>no report available</attraccr>
<qattracc>
<attraccv>1000000</attraccv>
<attracce>no report available</attracce>
</qattracc>
</attracc>
<logic>no report available</logic>
<complete>no report available</complete>
<posacc>
<horizpa>
<horizpar>no report available</horizpar>
</horizpa>
<vertacc>
<vertaccr>no report available</vertaccr>
</vertacc>
</posacc>
<lineage>
<srcinfo>
<srccite>
<citeinfo>
<title>ESRI</title>
</citeinfo>
</srccite>
<srcscale>20000000</srcscale>
<typesrc>CD-ROM</typesrc>
<srctime>
<timeinfo>
<sngdate>
<caldate>1996</caldate>
</sngdate>
</timeinfo>
<srccurr>publication date</srccurr>
</srctime>
<srccontr>no report available</srccontr>
</srcinfo>
<procstep>
<procdesc>no report available</procdesc>
<procdate>Unknown</procdate>
</procstep>
</lineage>
</dataqual>
<spdoinfo>
<direct>Vector</direct>
<ptvctinf>
<sdtsterm>
<sdtstype>Entity point</sdtstype>
<ptvctcnt>606</ptvctcnt>
</sdtsterm>
</ptvctinf>
</spdoinfo>
<spref>
<horizsys>
<geograph>
<latres>0.000001</latres>
<longres>0.000001</longres>
<geogunit>Decimal degrees</geogunit>
</geograph>
<geodetic>
<horizdn>North American Datum of 1927</horizdn>
<ellips>Clarke 1866</ellips>
<semiaxis>6378206.400000</semiaxis>
<denflat>294.978698</denflat>
</geodetic>
</horizsys>
</spref>
<eainfo>
<detailed>
<enttyp>
<enttypl>
cities</enttypl>
</enttyp>
<attr>
<attrlabl>FID</attrlabl>
<attrdef>Internal feature number.</attrdef>
<attrdefs>ESRI</attrdefs>
<attrdomv>
<udom>Sequential unique whole numbers that are automatically generated.</udom>
</attrdomv>
</attr>
<attr>
<attrlabl>Shape</attrlabl>
<attrdef>Feature geometry.</attrdef>
<attrdefs>ESRI</attrdefs>
<attrdomv>
<udom>Coordinates defining the features.</udom>
</attrdomv>
</attr>
<attr>
<attrlabl>NAME</attrlabl>
<attrdef>The city name. Spellings are based on Board of Geographic Names standards and commercial atlases.</attrdef>
<attrdefs>ESRI</attrdefs>
</attr>
<attr>
<attrlabl>COUNTRY</attrlabl>
<attrdef>An abbreviated country name.</attrdef>
</attr>
<attr>
<attrlabl>POPULATION</attrlabl>
<attrdef>Total population for the entire metropolitan area. Values are from recent census or estimates.</attrdef>
</attr>
<attr>
<attrlabl>CAPITAL</attrlabl>
<attrdef>Indicates whether a city is a national capital (Y/N).</attrdef>
</attr>
</detailed>
<overview>
<eaover>none</eaover>
<eadetcit>none</eadetcit>
</overview>
</eainfo>
<distinfo>
<stdorder>
<digform>
<digtinfo>
<transize>0.080</transize>
</digtinfo>
</digform>
</stdorder>
</distinfo>
<metainfo>
<metd>20010509</metd>
<metc>
<cntinfo>
<cntorgp>
<cntorg>ESRI</cntorg>
<cntper>unknown</cntper>
</cntorgp>
<cntaddr>
<addrtype>unknown</addrtype>
<city>unknown</city>
<state>unknown</state>
<postal>00000</postal>
</cntaddr>
<cntvoice>555-1212</cntvoice>
</cntinfo>
</metc>
<metstdn>FGDC Content Standards for Digital Geospatial Metadata</metstdn>
<metstdv>FGDC-STD-001-1998</metstdv>
<mettc>local time</mettc>
<metextns>
<onlink>http://www.esri.com/metadata/esriprof80.html</onlink>
<metprof>ESRI Metadata Profile</metprof>
</metextns>
</metainfo>
</metadata>
<vertacce>Vertical Positional Accuracy is expressed in meters. Vertical accuracy figures were developed by comparing elevation contour locations on 1:24,000 scale maps to elevation values at the same location within the digital database. Some manual interpolation was necessary to complete this test. The analysis results are expressed as linear error at a 90% confidence interval.</vertacce>
</qvertpa>
</vertacc>
</posacc>
<lineage>
<srcinfo>
<srccite>
<citeinfo>
<origin>National Imagery and Mapping Agency</origin>
<pubdate>1994</pubdate>
<title>Operational Navigational Chart</title>
<geoform>map</geoform>
<pubinfo>
<pubplace>St.Louis, MO</pubplace>
<publish>National Imagery and Mapping Agency</publish>
</pubinfo>
</citeinfo>
</srccite>
<srcscale>1000000</srcscale>
<typesrc>stable-base material</typesrc>
<srctime>
<timeinfo>
<rngdates>
<begdate>1974</begdate>
<enddate>1994</enddate>
</rngdates>
</timeinfo>
<srccurr>Publication dates</srccurr>
</srctime>
<srccitea>ONC</srccitea>
<srccontr>All information found on the source with the exception of aeronautical data</srccontr>
</srcinfo>
<srcinfo>
<srccite>
<citeinfo>
<origin>National Imagery and Mapping Agency</origin>
<pubdate>199406</pubdate>
<title>Digital Aeronautical Flight Information File</title>
<geoform>model</geoform>
<pubinfo>
<pubplace>St. Louis, MO</pubplace>
<publish>National Imagery and Mapping Agency</publish>
</pubinfo>
</citeinfo>
</srccite>
<typesrc>magnetic tape</typesrc>
<srctime>
<timeinfo>
<sngdate>
<caldate>1994</caldate>
</sngdate>
</timeinfo>
<srccurr>Publication date</srccurr>
</srctime>
<srccitea>DAFIF</srccitea>
<srccontr>Airport records (name, International Civil Aviation Organization, position, elevation, and type)</srccontr>
</srcinfo>
<srcinfo>
<srccite>
<citeinfo>
<origin>Defense Mapping Agency</origin>
<pubdate>1994</pubdate>
<title>Jet Navigational Chart</title>
<geoform>map</geoform>
<pubinfo>
<pubplace>St.Louis, MO</pubplace>
<publish>Defense Mapping Agency</publish>
</pubinfo>
</citeinfo>
</srccite>
<srcscale>2,000,000</srcscale>
<typesrc>stable-base material</typesrc>
<srctime>
<timeinfo>
<rngdates>
<begdate>1974</begdate>
<enddate>1991</enddate>
</rngdates>
</timeinfo>
<srccurr>Publication date</srccurr>
</srctime>
<srccitea>JNC</srccitea>
<srccontr>All information found on the source with the exception of aeronautical data. JNCs were used as source for the Antarctica region only.</srccontr>
</srcinfo>
<srcinfo>
<srccite>
<citeinfo>
<origin>USGS EROS Data Center</origin>
<pubdate></pubdate>
<title>Advanced Very High Resolution Radiometer</title>
<geoform>remote-sensing image</geoform>
<pubinfo>
<pubplace>Sioux Falls, SD</pubplace>
<publish>EROS Data Center</publish>
</pubinfo>
</citeinfo>
</srccite>
<srcscale>1000000</srcscale>
<typesrc>magnetic tape</typesrc>
<srctime>
<timeinfo>
<rngdates>
<begdate>199003</begdate>
<enddate>199011</enddate>
</rngdates>
</timeinfo>
<srccurr>Publication date</srccurr>
</srctime>
<srccitea>AVHRR</srccitea>
<srccontr>6 vegetation types covering the continental US and Canada</srccontr>
</srcinfo>
<procstep>
<procdesc>For the first edition DCW, stable-based positives were produced from the original reproduction negatives (up to 35 per ONC sheet). These were digitized either through a scanning-raster to vector conversion or hand digitized into vector form. The vector data was then tagged with attribute information using ARC-INFO software. Transformation to geographic coordinates was performed using the projection graticules for each sheet. Digital information was edge matched between sheets to create large regional datasets. These were then subdivided into 5 x 5 degree tiles and converted from ARC/INFO to VPF. The data was then pre-mastered for CD-ROM. QC was performed by a separate group for each step in the production process.</procdesc>
<procdate>199112</procdate>
<proccont>
<cntinfo>
<cntorgp>
<cntorg>Environmental Systems Research Institute</cntorg>
</cntorgp>
<cntpos>Applications Division</cntpos>
<cntaddr>
<addrtype>mailing and physical address</addrtype>
<address>380 New York St.</address>
<city>Redlands</city>
<state>CA</state>
<postal>92373</postal>
<country>US</country>
</cntaddr>
<cntvoice>909-793-2853</cntvoice>
<cntfax>909-793-5953</cntfax>
</cntinfo>
</proccont>
</procstep>
<procstep>
<procdate>199404</procdate>
<proccont>
<cntinfo>
<cntorgp>
<cntorg>Geonex</cntorg>
</cntorgp>
<cntpos></cntpos>
<cntaddr>
<addrtype>mailing and physical address</addrtype>
<address>8950 North 9th Ave.</address>
<city>St. Petersburg</city>
<state>FL</state>
<postal>33702</postal>
<country>US</country>
</cntaddr>
<cntvoice>(813)578-0100</cntvoice>
<cntfax>(813)577-6946</cntfax>
</cntinfo>
</proccont>
</procstep>
<procstep>
<procdesc>Transferred digitally directly into the VPF files.</procdesc>
<procdate>199408</procdate>
<proccont>
<cntinfo>
<cntorgp>
<cntorg>Geonex</cntorg>
</cntorgp>
<cntpos></cntpos>
<cntaddr>
<addrtype>mailing and physical address</addrtype>
<address>8950 North 9th Ave.</address>
<city>St. Petersburg</city>
<state>FL</state>
<postal>33702</postal>
<country>US</country>
</cntaddr>
<cntvoice>813-578-0100</cntvoice>
<cntfax>813-577-6946</cntfax>
</cntinfo>
</proccont>
</procstep>
<procstep>
<procdesc>Stable-based positives were produced from the original reproduction negatives (up to 35 per ONC sheet). These were digitized either through a scanning-raster to vector conversion or hand digitized into vector form. The vector data was then tagged with attribute information using ARC-INFO software. Transformation to geographic coordinates was performed using the projection graticules for each sheet. Digital information was edge matched between sheets to create large regional datasets. These were then subdivided into 5 x 5 degree tiles and converted from ARC/INFO to VPF. The data was then pre-mastered for CD-ROM. QC was performed by a separate group for each step in the production process.</procdesc>
<procdate>199112</procdate>
<proccont>
<cntinfo>
<cntorgp>
<cntorg>Environmental Systems Research Institute</cntorg>
</cntorgp>
<cntpos>Applications Division</cntpos>
<cntaddr>
<addrtype>mailing and physical address</addrtype>
<address>380 New York St.</address>
<city>Redlands</city>
<state>CA</state>
<postal>92373</postal>
<country>US</country>
</cntaddr>
<cntvoice>909-793-2853</cntvoice>
<cntfax>909-793-5953</cntfax>
</cntinfo>
</proccont>
</procstep>
<procstep>
<procdesc>Daily AVHRR images were averaged for two week time periods over the entire US growing season. These averaged images, their rates of change, elevation information, and other data were used to produce a single land classification image of the continental US. The VMap-0 data set extended this coverage over the Canadian land mass; however, vegetation classification was further subdivided into nine vegetation types.</procdesc>
<procdate>199402</procdate>
<srcprod>EROS data</srcprod>
<proccont>
<cntinfo>
<cntorgp>
<cntorg>USGS Eros Data Center</cntorg>
</cntorgp>
<cntpos></cntpos>
<cntaddr>
<addrtype>mailing and physical address</addrtype>
<address></address>
<city>Sioux Falls</city>
<state>SD</state>
<postal></postal>
<country>US</country>
</cntaddr>
<cntvoice></cntvoice>
<cntfax></cntfax>
</cntinfo>
</proccont>
</procstep>
<procstep>
<procdesc>The EROS data (raster files) were converted to vector polygons, splined (to remove stairstepping), thinned (all polygons under 2 km2 were deleted), and tied to existing DCW polygons (water bodies, built-up areas). The resulting file was tiled and converted to a VPF Vegetation coverage for use in the DCW. All processing was performed using ARC-INFO software.</procdesc>
<procdate>199412</procdate>
<srcprod>VMap-0 Vegetation Coverage</srcprod>
<proccont>
<cntinfo>
<cntorgp>
<cntorg>Geonex</cntorg>
</cntorgp>
<cntpos></cntpos>
<cntaddr>
<addrtype>mailing and physical address</addrtype>
<address>8950 North 9th Ave.</address>
<city>St. Petersburg</city>
<state>FL</state>
<postal>33702</postal>
<country>US</country>
</cntaddr>
<cntvoice>813-578-0100</cntvoice>
<cntfax>813-577-6946</cntfax>
</cntinfo>
</proccont>
</procstep>
<procstep>
<procdesc>Data was translated from VPF format to ArcInfo Coverage format. The coverages were then loaded into a seamless ArcSDE layer.</procdesc>
<procdate>02152001</procdate>
<proccont>
<cntinfo>
<cntorgp>
<cntorg>Geodesy Team, Harvard University</cntorg>
</cntorgp>
<cntemail>[email protected]</cntemail>
</cntinfo>
</proccont>
</procstep>
</lineage>
</dataqual>
<spdoinfo>
<direct>Vector</direct>
<ptvctinf>
<sdtsterm>
<sdtstype>Complete chain</sdtstype>
</sdtsterm>
<sdtsterm>
<sdtstype>Label point</sdtstype>
</sdtsterm>
<sdtsterm>
<sdtstype>GT-polygon composed of chains</sdtstype>
</sdtsterm>
<sdtsterm>
<sdtstype>Point</sdtstype>
</sdtsterm>
<vpfterm>
<vpflevel>3</vpflevel>
<vpfinfo>
<vpftype>Node</vpftype>
</vpfinfo>
<vpfinfo>
<vpftype>Edge</vpftype>
</vpfinfo>
<vpfinfo>
<vpftype>Face</vpftype>
</vpfinfo>
</vpfterm>
</ptvctinf>
</spdoinfo>
<spref>
<horizsys>
<geograph>
<latres>0.000000</latres>
<longres>0.000000</longres>
<geogunit>Decimal degrees</geogunit>
</geograph>
<geodetic>
<horizdn>D_WGS_1984</horizdn>
<ellips>WGS_1984</ellips>
<semiaxis>6378137.000000</semiaxis>
<denflat>298.257224</denflat>
</geodetic>
</horizsys>
<vertdef>
<altsys>
<altdatum>Mean Sea Level</altdatum>
<altunits>1.0</altunits>
</altsys>
</vertdef>
</spref>
<eainfo>
<detailed>
<enttyp>
<enttypl>lc.pat</enttypl>
</enttyp>
<attr>
<attrlabl>FID</attrlabl>
<attrdef>Internal feature number.</attrdef>
<attrdefs>ESRI</attrdefs>
<attrdomv>
<udom>Sequential unique whole numbers that are automatically generated.</udom>
</attrdomv>
</attr>
<attr>
<attrlabl>Shape</attrlabl>
<attrdef>Feature geometry.</attrdef>
<attrdefs>ESRI</attrdefs>
<attrdomv>
<udom>Coordinates defining the features.</udom>
</attrdomv>
</attr>
<attr>
<attrlabl>AREA</attrlabl>
<attrdef>Area of feature in internal units squared.</attrdef>
<attrdefs>ESRI</attrdefs>
<attrdomv>
<udom>Positive real numbers that are automatically generated.</udom>
</attrdomv>
</attr>
<attr>
<attrlabl>PERIMETER</attrlabl>
<attrdef>Perimeter of feature in internal units.</attrdef>
<attrdefs>ESRI</attrdefs>
<attrdomv>
<udom>Positive real numbers that are automatically generated.</udom>
</attrdomv>
</attr>
<attr>
<attrlabl>LC#</attrlabl>
<attrdef>Internal feature number.</attrdef>
<attrdefs>ESRI</attrdefs>
<attrdomv>
<udom>Sequential unique whole numbers that are automatically generated.</udom>
</attrdomv>
</attr>
<attr>
<attrlabl>LC-ID</attrlabl>
<attrdef>User-defined feature number.</attrdef>
<attrdefs>ESRI</attrdefs>
</attr>
<attr>
<attrlabl>LCAREA.AFT_ID</attrlabl>
</attr>
<attr>
<attrlabl>LCPYTYPE</attrlabl>
<attrdef>Land cover polygon type</attrdef>
<attrdefs>NIMA</attrdefs>
<attrdomv>
<edom>
<edomv>1</edomv>
<edomvd>Rice Field</edomvd>
</edom>
<edom>
<edomv>2</edomv>
<edomvd>Cranberry bog</edomvd>
</edom>
<edom>
<edomv>3</edomv>
<edomvd>Cultivated area, garden</edomvd>
</edom>
<edom>
<edomv>4</edomv>
<edomvd>Peat cuttings</edomvd>
</edom>
<edom>
<edomv>5</edomv>
<edomvd>Salt pan</edomvd>
</edom>
<edom>
<edomv>6</edomv>
<edomvd>Fish pond or hatchery</edomvd>
</edom>
<edom>
<edomv>7</edomv>
<edomvd>Quarry, strip mine, mine dump, blasting area</edomvd>
</edom>
<edom>
<edomv>8</edomv>
<edomvd>Oil or gas</edomvd>
</edom>
<edom>
<edomv>10</edomv>
<edomvd>Lava flow</edomvd>
</edom>
<edom>
<edomv>11</edomv>
<edomvd>Distorted surface area</edomvd>
</edom>
<edom>
<edomv>12</edomv>
<edomvd>Unconsolidated material (sand or gravel, glacial moraine)</edomvd>
</edom>
<edom>
<edomv>13</edomv>
<edomvd>Natural landmark area</edomvd>
</edom>
<edom>
<edomv>14</edomv>
<edomvd>Inundated area</edomvd>
</edom>
<edom>
<edomv>15</edomv>
<edomvd>Undifferentiated wetlands</edomvd>
</edom>
<edom>
<edomv>99</edomv>
<edomvd>None</edomvd>
</edom>
</attrdomv>
</attr>
<attr>
<attrlabl>TILE_ID</attrlabl>
<attrdef>VPF Format tile ID</attrdef>
<attrdefs>NIMA</attrdefs>
</attr>
<attr>
<attrlabl>FAC_ID</attrlabl>
</attr>
</detailed>
<overview>
<eaover>The DCW used a product-specific attribute coding system that is composed of TYPE and STATUS designators for area, line, and point features; and LEVEL and SYMBOL designators for text features. The TYPE attribute specifies what the feature is, while the STATUS attribute specifies the current condition of the feature. Some features require both a TYPE and STATUS code to uniquely identify their characteristics. In order to uniquely identify each geographic attribute in the DCW, the TYPE and STATUS attribute code names are preceded by the two letter coverage abbreviation and a two letter abbreviation for the type of graphic primitive present. The DCW Type/Status codes were mapped into the FACC coding scheme. A full description of FACC may be found in Digital Geographic Information Exchange Standard Edition 1.2, January 1994.</eaover>
<eadetcit>Entities (features) and Attributes for DCW are fully described in: Department of Defense, 1992, Military Specification Digital Chart of the World (MIL-D-89009): Philadelphia, Department of Defense, Defense Printing Service Detachment Office.</eadetcit>
</overview>
</eainfo>
<distinfo>
<distrib>
<cntinfo>
<cntorgp>
<cntorg>NIMA</cntorg>
</cntorgp>
<cntpos>ATTN: CC, MS D-16</cntpos>
<cntaddr>
<addrtype>mailing and physical address</addrtype>
<address>6001 MacArthur Blvd.</address>
<city>Bethesda</city>
<state>MD</state>
<postal>20816-5001</postal>
<country>US</country>
</cntaddr>
<cntvoice>301-227-2495</cntvoice>
<cntfax>301-227-2498</cntfax>
</cntinfo>
</distrib>
<distliab>None</distliab>
<stdorder>
<digform>
<digtinfo>
<formname>VPF</formname>
<formverd>19930930</formverd>
<formspec>Military Standard Vector Product Format (MIL-STD-2407). The current version of this document is dated 28 June 1996. Edition 3 of VMap-0 conforms to a previous version of the VPF Standard as noted. Future versions of VMap-0 will conform to the current version of VPF Standard.</formspec>
<transize>0.172</transize>
</digtinfo>
<digtopt>
<offoptn>
<offmedia>CD-ROM</offmedia>
<recfmt>ISO 9660</recfmt>
</offoptn>
</digtopt>
</digform>
<fees>Not Applicable</fees>
</stdorder>
</distinfo>
<distinfo>
<distrib>
<cntinfo>
<cntorgp>
<cntorg>USGS Map Sales</cntorg>
</cntorgp>
<cntaddr>
<addrtype>mailing address</addrtype>
<address>Box 25286</address>
<city>Denver</city>
<state>CO</state>
<postal>80225</postal>
<country>US</country>
</cntaddr>
<cntvoice>303-236-7477</cntvoice>
<cntfax>303-236-1972</cntfax>
</cntinfo>
</distrib>
<distliab>None</distliab>
<stdorder>
<digform>
<digtinfo>
<transize>0.172</transize>
</digtinfo>
</digform>
<fees>$82.50 per four disk set</fees>
<ordering>For General Public: Payment (check, money order, purchase order, or Government account) must accompany order.
Make all drafts payable to Dept. of the Interior- US Geological Survey.
To provide a general idea of content, a sample data set is available from the TMPO Home Page at:</ordering>
</stdorder>
</distinfo>
<distinfo>
<distrib>
<cntinfo>
<cntorgp>
<cntorg>Geodesy Team, Harvard University</cntorg>
<cntper>Geodesy Team</cntper>
</cntorgp>
<cntemail>[email protected]</cntemail>
</cntinfo>
</distrib>
<resdesc>Geodesy layer</resdesc>
<distliab>None</distliab>
<stdorder>
<digform>
<digtinfo>
<formname>SHP</formname>
<transize>0.172</transize>
</digtinfo>
<digtopt>
<onlinopt>
<computer>
<networka>
<networkr>geodesy.harvard.edu</networkr>
</networka>
</computer>
</onlinopt>
</digtopt>
</digform>
<fees>none</fees>
</stdorder>
<availabl>
<timeinfo>
<sngdate>
<caldate>1992</caldate>
</sngdate>
</timeinfo>
</availabl>
</distinfo>
<metainfo>
<metd>20010226</metd>
<metc>
<cntinfo>
<cntorgp>
<cntorg>Geodesy Team</cntorg>
<cntper>REQUIRED: The person responsible for the metadata information.</cntper>
</cntorgp>
<cntvoice>REQUIRED: The telephone number by which individuals can speak to the organization or individual.</cntvoice>
<cntemail>[email protected]</cntemail>
</cntinfo>
</metc>
<metstdn>FGDC Content Standards for Digital Geospatial Metadata</metstdn>
<metstdv>FGDC-STD-001-1998</metstdv>
<mettc>local time</mettc>
<metextns>
<onlink>http://www.esri.com/metadata/esriprof80.html</onlink>
<metprof>ESRI Metadata Profile</metprof>
</metextns>
</metainfo>
</metadata>
Have you tried the directory and BFILE methods? Here is the example for that in the Oracle XML Developer's Guide:
CREATE DIRECTORY xmldir AS 'path_to_folder_containing_XML_file';
Example 3-3 Inserting XML Content into an XMLType Table
INSERT INTO mytable2 VALUES (XMLType(bfilename('XMLDIR', 'purchaseOrder.xml'),
nls_charset_id('AL32UTF8')));
1 row created.
The value passed to nls_charset_id() indicates that the encoding for the file to be read is UTF-8.
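As a fuller sketch of the same BFILE route, here is the setup around that insert: the directory object, a read grant, and a check that the row landed. The path, the MYTABLE2 definition, and the loading user SCOTT are assumptions for illustration, not from the original post.

```sql
-- Assumed setup: MYTABLE2 is an XMLType table, SCOTT is the loading user.
CREATE DIRECTORY xmldir AS '/data/xml';        -- path is an assumption
GRANT READ ON DIRECTORY xmldir TO scott;

CREATE TABLE mytable2 OF XMLType;

-- Load the file through a BFILE locator, telling Oracle the file is UTF-8.
INSERT INTO mytable2
  VALUES (XMLType(bfilename('XMLDIR', 'purchaseOrder.xml'),
                  nls_charset_id('AL32UTF8')));
COMMIT;

-- Verify: the OBJECT_VALUE pseudocolumn holds the stored XML document.
SELECT OBJECT_VALUE FROM mytable2;
```

Note that the directory name passed to bfilename() is case-sensitive and must match the directory object name (upper case by default).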
ben -
Got error message when storing XML documents into XML DB repository, via WebDAV
Hi experts,
I am in Oracle Enterprise Manager 11g 11.2.0.1.0.
SQL*Plus: Release 11.2.0.1.0 Production on Tue Feb 22 11:40:23 2011
I got an error message when storing XML documents into the XML DB repository via WebDAV.
I have successfully registered 5 related schemas and generated 1 table.
I have inserted 40 .xml files into this auto-generated table.
Using this data I created a relational view successfully.
But since I couldn't store XML documents into the XML DB repository via WebDAV,
when I query using the code below:
SELECT rv.res.getClobVal()
FROM resource_view rv
WHERE rv.any_path = '/home/DEV/messages/4fe1-865d-da0db9212f34.xml';
I got nothing.
My ftp code is listed below:
ftp> open localhost 2100
Connected to I0025B368E2F9.
220- C0025B368E2F9
Unauthorised use of this FTP server is prohibited and may be subject to civil and criminal prosecution.
220 I0025B368E2F9 FTP Server (Oracle XML DB/Oracle Database) ready.
User (I0025B368E2F9:(none)): fda_xml
331 pass required for FDA_XML
Password:
230 FDA_XML logged in
ftp> cd /home/DEV/message
250 CWD Command successful
ftp> pwd
257 "/home/DEV/message" is current directory.
ftp> ls -la
200 PORT Command successful
150 ASCII Data Connection
drw-r--r-- 2 FDA_XML oracle 0 DEC 17 19:19 .
drw-r--r-- 2 FDA_XML oracle 0 DEC 17 19:19 ..
226 ASCII Transfer Complete
ftp: 115 bytes received in 0.00Seconds 115000.00Kbytes/sec.
250 SET_CHARSET Command Successful
ftp> put C:\ED\SPL\E_Reon_Data\loaded\4fe1-865d-da0db9212f34.xml
200 PORT Command successful
150 ASCII Data Connection
550- Error Response
ORA-00600: internal error code, arguments: [qmxConvUnkType], [], [], [], [], [], [], [], [], [], [], []
550 End Error Response
ftp: 3394 bytes sent in 0.00Seconds 3394000.00Kbytes/sec.
I have tried all suggestions from another thread, such as:
alter system set events ='31150 trace name context forever, level 0x4000'
SQL> alter system set shared_servers = 1;
but failed.
Is there anyone who can help?
Thanks.
Edited by: Cow on Mar 29, 2011 12:58 AM -
Bulk Loader Program to load large XML document
I am looking for a bulk loader database program that will load a very large XML document. The simple bulk loader application available on the Oracle site will not load this document due to its size, which is approximately 20 MB. Please advise ASAP. Thank you.
From the above document:
Storing XML Data Across Tables
Question
Can XML- SQL Utility store XML data across tables?
Answer
Currently the XML-SQL Utility (XSU) can only store to a single table. It maps a canonical representation of an XML document into any table/view. But of course there is a way to store XML with the XSU across tables. One can do this using XSLT to transform any document into multiple documents and insert them separately. Another way is to define views over multiple tables (object views if needed) and then do the inserts ... into the view. If the view is inherently non-updatable (because of complex joins, ...), then one can use INSTEAD-OF triggers over the views to do the inserts.
-- I've tried this, works fine. -
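As a sketch of that INSTEAD-OF trigger route, using the Book/Price join view from the original question (the table, view, and column names are assumptions carried over from that post, not part of this answer):

```sql
-- BOOKPRICE is a join view over BOOK and PRICE; a direct insert raises
-- ORA-01776, so the trigger splits one view row into two base-table inserts.
CREATE OR REPLACE TRIGGER bookprice_ins
  INSTEAD OF INSERT ON bookprice
  FOR EACH ROW
BEGIN
  -- Insert the parent row first so the foreign key on BOOK is satisfied.
  INSERT INTO price (price_id, price_dm)
  VALUES (:NEW.price_id, :NEW.price_dm);

  INSERT INTO book (book_id, book_name, ref_to_price)
  VALUES (:NEW.book_id, :NEW.book_name, :NEW.ref_to_price);
END;
/
```

With the trigger in place, the XSU insert against the view succeeds, because each canonical ROW element now maps to one insert on the view, which the trigger fans out to both base tables.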
How to FTP an XML document into an XMLType which is not created by itself.
Hi,
1.
I have a table called SNPLEX_DESIGN which is created automatically when I register the snplex_design.xsd XML schema (Oracle creates it through the xdb:defaultTable="SNPLEX_DESIGN" attribute), and it is created using the SNPLEX user account.
2.
I also created a folder (resource) called /home/SNPLEX/Orders, which is used to hold all the incoming XML documents.
3.
I created another user account called SNPLEX_APP, which is the only user account allowed to FTP XML documents into the /home/SNPLEX/Orders folder.
Issues:
If I log in as the SNPLEX user, I can FTP an XML document into the folder and the TABLE (the file size = 0). But if I log in as the SNPLEX_APP user account, I can only FTP the XML document into the folder; Oracle doesn't store the document into the table (because the file size shows a number).
I have granted all the ACL privileges on the /home/SNPLEX/Orders folder to SNPLEX_APP through OEM.
Did I miss anything? Any help will be greatly appreciated. Resolving this issue is very important to us, since we are at the stage of rolling the system into production.
Regards,
Jinsen
In order for a registered schema to be available to other users, the schema must be registered as a GLOBAL, rather than a LOCAL, schema. This is controlled by the third argument passed to registerSchema, and the default is local. Note that you will also need to explicitly grant appropriate permissions on any tables created by the schema registration process to other users who will be loading or reading data from these tables.
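A sketch of that registration, run as SNPLEX: the schema URL, the directory holding the .xsd, and the grantee are assumptions based on the names in the question.

```sql
-- Register the schema globally: local => FALSE is the third argument
-- the answer refers to (it defaults to TRUE, i.e. a local schema).
BEGIN
  DBMS_XMLSCHEMA.registerSchema(
    schemaurl => 'http://example.com/snplex_design.xsd',  -- assumed URL
    schemadoc => bfilename('XMLDIR', 'snplex_design.xsd'),
    local     => FALSE);
END;
/

-- Explicitly grant access on the generated default table so the
-- FTP user can actually store documents into it.
GRANT SELECT, INSERT, UPDATE, DELETE ON snplex.snplex_design TO snplex_app;
```

If the schema was already registered locally, it would need to be deleted (DBMS_XMLSCHEMA.deleteSchema) and re-registered globally; re-registration recreates the default table, so existing data should be exported first.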