Parse column with CSV string into table with one row per item
I have a table, ifs_tables (fewer than 100 rows), with two columns: LocalTable and Fields. LocalTable is a table name, and Fields is a comma-delimited list of a subset of columns from that table, e.g. 'Fname,Lname'. It looks like this:
localtable     fields
=============  ==================
customertable  fname,lname
accounttable   type,accountnumber
I want to end up with a new table that has one row per column. It should look like this:
TableName ColumnName
============ ==========
CustomerTable Fname
CustomerTable Lname
AccountTable Type
AccountTable AccountNumber
I tried the code below but have two issues: (1) my query using the SplitFields function gets "Subquery returned more than 1 value"; (2) some of my Fields values have hundreds of columns in the comma-delimited list, which returns "Msg 530, Level 16, State 1, Line 8. The statement terminated. The maximum recursion 100 has been exhausted before statement completion." I tried adding OPTION (MAXRECURSION 0) inside the split function on the SELECT statement that calls the CTE, but the syntax is not allowed there.
Can someone help me to get this sorted out? Thanks
DROP FUNCTION [dbo].[SplitFields]
GO
CREATE FUNCTION [dbo].[SplitFields]
(
    @String NVARCHAR(4000),
    @Delimiter NCHAR(1)
)
RETURNS TABLE
AS
RETURN
WITH Split(stpos, endpos)
AS (
    SELECT 0 AS stpos, CHARINDEX(@Delimiter, @String) AS endpos
    UNION ALL
    SELECT endpos + 1, CHARINDEX(@Delimiter, @String, endpos + 1)
    FROM Split
    WHERE endpos > 0
)
SELECT 'Id'   = ROW_NUMBER() OVER (ORDER BY (SELECT 1)),
       'Data' = SUBSTRING(@String, stpos, COALESCE(NULLIF(endpos, 0), LEN(@String) + 1) - stpos)
FROM Split
-- OPTION (MAXRECURSION 0) is not allowed inside an inline table-valued
-- function; it must go on the outer statement that calls the function.
GO
IF OBJECT_ID('tempdb..#ifs_tables') IS NOT NULL DROP TABLE #ifs_tables
SELECT *
INTO #ifs_tables
FROM (
SELECT 'CustomerTable' , 'Lname,Fname' UNION ALL
SELECT 'AccountTable' , 'Type,AccountNumber'
) d (dLocalTable,dFields)
IF OBJECT_ID('tempdb..#tempFieldsCheck') IS NOT NULL DROP TABLE #tempFieldsCheck
SELECT * INTO #tempFieldsCheck
FROM
( --SELECT dLocaltable, dFields from #ifs_tables
SELECT dLocaltable, (SELECT [Data] FROM dbo.SplitFields(dFields, ',') ) from #ifs_tables
) t (tLocalTable, tfields) -- as Data FROM #ifs_tables
SELECT * FROM #tempFieldsCheck
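For reference, both errors go away if the function's output is joined with CROSS APPLY instead of a scalar subquery, and the MAXRECURSION hint is placed on the outer statement (it cannot appear inside an inline table-valued function). A sketch using the temp tables above:

```sql
-- CROSS APPLY returns one row per split value, so there is no
-- "Subquery returned more than 1 value" error; OPTION (MAXRECURSION 0)
-- on the calling statement lifts the default 100-level recursion limit.
IF OBJECT_ID('tempdb..#tempFieldsCheck') IS NOT NULL DROP TABLE #tempFieldsCheck;

SELECT t.dLocalTable AS TableName, s.[Data] AS ColumnName
INTO #tempFieldsCheck
FROM #ifs_tables t
CROSS APPLY dbo.SplitFields(t.dFields, ',') s
OPTION (MAXRECURSION 0);

SELECT * FROM #tempFieldsCheck;
```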
Try this
DECLARE @DemoTable table
(
    localtable char(100),
    fields varchar(200)
)
INSERT INTO @DemoTable values('customertable','fname,lname')
INSERT INTO @DemoTable values('accounttable','type,accountnumber')
select * from @DemoTable
SELECT A.localtable ,
Split.a.value('.', 'VARCHAR(100)') AS ColumnName
FROM (SELECT localtable,
CAST ('<M>' + REPLACE(fields, ',', '</M><M>') + '</M>' AS XML) AS String
FROM @DemoTable) AS A CROSS APPLY String.nodes ('/M') AS Split(a);
Refer: https://sqlpowershell.wordpress.com/2015/01/09/sql-split-delimited-columns-using-xml-or-udf-function/
CREATE FUNCTION ParseValues
(@String varchar(8000), @Delimiter varchar(10) )
RETURNS @RESULTS TABLE (ID int identity(1,1), Val varchar(8000))
AS
BEGIN
DECLARE @Value varchar(100)
WHILE @String is not null
BEGIN
SELECT @Value = CASE WHEN PATINDEX('%' + @Delimiter + '%', @String) > 0
                     THEN LEFT(@String, PATINDEX('%' + @Delimiter + '%', @String) - 1)
                     ELSE @String END,
       @String = CASE WHEN PATINDEX('%' + @Delimiter + '%', @String) > 0
                      THEN SUBSTRING(@String, PATINDEX('%' + @Delimiter + '%', @String) + LEN(@Delimiter), LEN(@String))
                      ELSE NULL END
INSERT INTO @RESULTS (Val)
SELECT @Value
END
RETURN
END
SELECT localtable ,f.Val
FROM @DemoTable t
CROSS APPLY dbo.ParseValues(t.fields,',')f
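As a side note: on SQL Server 2016 or later, the built-in STRING_SPLIT function avoids a user-defined splitter entirely (though it does not guarantee output order). A sketch against the same @DemoTable:

```sql
-- STRING_SPLIT (SQL Server 2016+) returns one row per delimited value.
SELECT t.localtable AS TableName, s.value AS ColumnName
FROM @DemoTable t
CROSS APPLY STRING_SPLIT(t.fields, ',') s;
```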
--Prashanth
Similar Messages
-
Inserting XML String into Table with help of Stored Proc
I will be getting XML String from JAVA, which I have to insert in Table A, XML String is as follows
<?xml version = '1.0'?>
<TableA>
  <mappings Record="3">
    <MESSAGEID>1</MESSAGEID>
    <MESSAGE>This is available at your address!</MESSAGE>
  </mappings>
  <mappings Record="3">
    <MESSAGEID>2</MESSAGEID>
    <MESSAGE>This isn't available at your address.</MESSAGE>
  </mappings>
</TableA>
Table Structure:
MESSAGEID VARCHAR2(15 BYTE)
MESSAGE VARCHAR2(500 BYTE)
This is the stored procedure which I have written to insert data into TableA, V_MESSAGE will be input parameter for inserting XML String
create or replace procedure AP_DBI_PS_MESSAGE_INSERT
( V_MESSAGE IN VARCHAR2 )  -- note: length constraints are not allowed on parameters
AS
charString varchar2(80);
finalStr varchar2(4000) := null;
rowsp integer;
V_FILEHANDLE UTL_FILE.FILE_TYPE;
begin
-- the name of the table as specified in our DTD
xmlgen.setRowsetTag('TableA');
-- the name of the data set as specified in our DTD
xmlgen.setRowTag('mappings');
-- for getting the output on the screen
dbms_output.enable(1000000);
-- open the XML document in read only mode
v_FileHandle := utl_file.fopen(V_MESSAGE); -- bug: fopen requires (directory, filename, open_mode)
--v_FileHandle := V_MESSAGE;
loop
BEGIN
utl_file.get_line(v_FileHandle, charString);
exception
when no_data_found then
utl_file.fclose(v_FileHandle);
exit;
END;
dbms_output.put_line(charString);
if finalStr is not null then
finalStr := finalStr || charString;
else
finalStr := charString;
end if;
end loop;
-- for inserting the XML data into the table
rowsp := xmlgen.insertXML('ONE.TableA',finalStr);
dbms_output.put_line('INSERT DONE '||TO_CHAR(rowsp));
xmlgen.resetOptions;
end;
Please help!
Edited by: 846857 on Jul 18, 2011 10:55 PM
with t as (select xmltype('<TableA>
<mappings Record="3">
<MessageId>1</MessageId>
<Message> This bundle is available at your address!</Message>
</mappings>
<mappings Record="3">
<MessageId>2</MessageId>
<Message>This isn't available at your address.</Message>
</mappings>
</TableA>') col FROM dual)
--End Of sample data creation with subquery factoring.
--You can use the query from here with your table and column name.
select EXTRACTVALUE(X1.column_value,'/mappings/MessageId') MESSAGEID
,EXTRACTVALUE(X1.column_value,'/mappings/Message') MESSAGE
from t, table(XMLSEQUENCE(extract(t.COL, '/TableA/mappings'))) X1;
The above code works, as I get this result:
MESSAGEID MESSAGE
1 This bundle is available at your address!
2 This isn't available at your address.
Now I want to insert the result into Table A... how do I proceed? Please help.
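One way to finish this off, assuming the target is the TableA described above with MESSAGEID and MESSAGE columns, is a plain INSERT ... SELECT over the same extraction query (a sketch, not tested against your schema):

```sql
INSERT INTO TableA (MESSAGEID, MESSAGE)
SELECT EXTRACTVALUE(X1.column_value, '/mappings/MessageId'),
       EXTRACTVALUE(X1.column_value, '/mappings/Message')
FROM   t, TABLE(XMLSEQUENCE(EXTRACT(t.COL, '/TableA/mappings'))) X1;
```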
Edited by: 846857 on Jul 19, 2011 12:15 AM -
How do I split a single event with many clips into multiples events, one event per date?
I archived the video from my AVCHD camera into a Final Cut video archive. Later I imported this into iMovie. All the clips from this archive (spanning several months) are dumped into a single event. If I recall, older versions of iMovie would import video into separate events for each day. Then I would go through and give each event a meaningful name (e.g., "Mary's Birthday").
This is what I would like. Is there a way to split this very long event into multiple events, one for each day that is present in the event?
Not automatically, but with iMovie 10 you can create new events with any name you like, then move clips into them from other events (just like FCP 10.1). See:
http://help.apple.com/imovie/mac/10.0/#mov1d890f00b
You can also choose to display clips in an event grouped by date.
Geoff -
String into table with separate row and column bytes
Hello, I have a string (coming from an I2C EEPROM and then through RS-232 via an MCU) with this format: byte1 byte2 byte3 byte4 byte5 \r byte1 byte2 byte3 byte4 byte5 \r and so on. The length of the frame isn't known; it depends on how many sensors have passed their limit. I am having problems plotting this information into a table: I don't know how to separate each byte into a column and each packet of five bytes into a row. I attached my VI with an example two-sensor information string; I am only able to plot the first sensor. Any help will be welcome.
Thanks in advance.
Regards.
Attachments:
string_to_table.vi 31 KB
This is a working modification of your code.
Message Edited by EVS on 08-26-2005 09:16 PM
Jack
Win XP
LabVIEW 6.1, 7.0, 7.1, LabWindows/ CVI 7.1
Let us speak Russian
Attachments:
Clip_5.jpg 53 KB -
Stored Procedure with CSV string as parameters
Hi,
I have been tasked with this issue, but I have no real SQL/Oracle experience. My boss advises that the best way to do this would be to pass a CSV string into the stored proc for each parameter,
What I need to do is parse the CSV string and use each value within their as part of the where clause so. If I have two parameter CSV strings A and B. My outcome would be something like:
select * from table where col1 = A1 or A2 or A3 and col2 = B1 or B2 or B3.
The number of values within each CSV string are unknown until they are passed into the stored proc. How would I best go about this?
Thanks,
Darren.
You can use a comma-separated string of values in the where clause of the query this way:
SQL> select * from emp;
EMPNO ENAME JOB MGR HIREDATE SAL COMM DEPTNO
7369 SMITH CLERK 7902 17-DEC-80 800 20
7499 ALLEN SALESMAN 7698 20-FEB-81 1600 300 30
7521 WARD SALESMAN 7698 22-FEB-81 1250 500 30
7566 JONES MANAGER 7839 02-APR-81 2975 20
7654 MARTIN SALESMAN 7698 28-SEP-81 1250 1400 30
7698 BLAKE MANAGER 7839 01-MAY-81 2850 30
7782 CLARK MANAGER 7839 09-JUN-81 2450 10
7788 SCOTT ANALYST 7566 19-APR-87 3000 20
7839 KING PRESIDENT 17-NOV-81 5000 10
7844 TURNER SALESMAN 7698 08-SEP-81 1500 0 30
7876 ADAMS CLERK 7788 23-MAY-87 1100 20
7900 JAMES CLERK 7698 03-DEC-81 950 30
7902 FORD ANALYST 7566 03-DEC-81 3000 20
7934 MILLER CLERK 7782 23-JAN-82 1300 10
14 rows selected.
SQL> select * from emp
2 where ','||&instr||',' like '%,'||empno||',%';
Enter value for instr: '7654,7876,7369'
old 2: where ','||&instr||',' like '%,'||empno||',%'
new 2: where ','||'7654,7876,7369'||',' like '%,'||empno||',%'
EMPNO ENAME JOB MGR HIREDATE SAL COMM DEPTNO
7369 SMITH CLERK 7902 17-DEC-80 800 20
7654 MARTIN SALESMAN 7698 28-SEP-81 1250 1400 30
7876 ADAMS CLERK 7788 23-MAY-87 1100 20
Max
How can I load data into table with SQL*LOADER
How can I load data into a table with SQL*Loader when the column data length is more than 255 bytes? When a column exceeds 255 bytes, the data cannot be inserted into the table by SQL*Loader.
CREATE TABLE A (
A VARCHAR2 ( 10 ) ,
B VARCHAR2 ( 10 ) ,
C VARCHAR2 ( 10 ) ,
E VARCHAR2 ( 2000 ) );
control file:
load data
append into table A
fields terminated by X'09'
(A , B , C , E )
SQL*LOADER command:
sqlldr test/test control=A_ctl.txt data=A.xls log=b.log
datafile:
column E is more than 255bytes
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
Check this out.
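The usual cause here is that SQL*Loader's default field datatype is CHAR(255); declaring an explicit length for the long field in the control file lifts the limit. A sketch of the control file with that one change, using the field names from the example above:

```
load data
append into table A
fields terminated by X'09'
( A, B, C, E CHAR(2000) )
```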
http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch06.htm#1006961 -
Error when attempting to merge into table with RLS - ORA - 28132
hey,
i've encountered an "ORA-28132 - Merge into syntax does not support security policies" when attempting to merge into a table with RLS defined on it.
I looked around and found this discussion on Ask Tom: http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:259415261270
According to the thread, this was a bug in 9i; in 10g MERGE is possible only if the policy returns null, and in 11g it was finally fixed.
*I have been able to merge into tables with RLS in 10gR2 in some cases.*
I would like to know if anyone has more information about this error. When can I merge into tables with RLS?
Thanks, Haki.
Never.
The bug was that it did work sometimes in 10g i.e. it should always raise an error.
It's something that I've always found a bit odd and frustrating - that MERGE and RLS policy should not be compatible.
But bottom line is that you shouldn't be using MERGE on table with RLS policy.
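A common workaround when MERGE can't be used against an RLS-protected table is to split the operation into an explicit UPDATE followed by an INSERT of the rows that didn't match. A sketch with hypothetical table and column names:

```sql
-- Hypothetical tables: target_t(id, val) with an RLS policy, source_t(id, val).
UPDATE target_t t
SET    t.val = (SELECT s.val FROM source_t s WHERE s.id = t.id)
WHERE  EXISTS (SELECT 1 FROM source_t s WHERE s.id = t.id);

INSERT INTO target_t (id, val)
SELECT s.id, s.val
FROM   source_t s
WHERE  NOT EXISTS (SELECT 1 FROM target_t t WHERE t.id = s.id);
```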
Edited by: Dom Brooks on Feb 14, 2011 10:51 AM -
Could not enter feature 000000000000000008BTL with SID 228008 into table
Hi all,
When I am loading master data for 0MAT_UNIT I am getting the error message below.
I tried RSRV, but each attempt only resolves one SID, and the same error appears for other records; correcting the error over and over in RSRV is a time-consuming process.
Can you please suggest how I can solve this issue?
Error is : Could not enter feature 000000000000000008BTL with SID 228008 into table /BI0/SMAT_UNIT
thanks in advance.
For MAT_UNIT, remove the Alpha conversion exit checkbox and load.
This may work -
Problem - insert JSON string into table in CLR function
Hi
I created a CLR function to insert a JSON string into a table.
With this line
mt = JsonConvert.DeserializeObject<MyTable>(jsonStr.ToString());
The class is OK (no error), but when I try to add the assembly in Microsoft SQL Server Management Studio, I get this error:
Assembly 'newtonsoft.json, version=4.5.0.0, culture=neutral, publickeytoken=30ad4fe6b2a6aeed.' was not found in the SQL catalog.
(I have Newtonsoft.Json in the Reference)
Please help !
Thanks
Hi Bob
Could you elaborate a bit more?
I think the code is OK. The problem is that when I deploy, Visual Studio creates/alters the assembly and I get the error:
Error: SQL72014: .Net SqlClient Data Provider: Msg 6503, Level 16, State 12, Line 1 Assembly 'newtonsoft.json, version=6.0.0.0, culture=neutral, publickeytoken=30ad4fe6b2a6aeed.' was not found in the SQL catalog.
ALTER ASSEMBLY [Database1]
FROM 0x4D5A90000300000004000000FFFF0000B800000000000000400000000000000000000000000000000000000000000000000000000000000000000000800000000E1FBA0E00B409CD21B8014CCD21546869732070726F6772616D2063616E6E6F742062652072756E20696E20444F53206D6F64652E0D0D0A2400000000000000504500004C0103000DE411540000000000000000E00002210B010B000012000000060000000000008E3000000020000000400000000000100020000000020000040000000000000006000000000000000080000000020000000000000300608500001000001000000000100000100000000000001000000000000000000000003C3000004F00000000400000A802000000000000000000000000000000000000006000000C000000042F00001C0000000000000000000000000000000000000000000000000000000000000000000000000000000000000000200000080000000000000000000000082000004800000000000000000000002E7465787400000094100000002000000012000000020000000000000000000000000000200000602E72737263000000A8020000004000000004000000140000000000000000000000000000400000402E72656C6F6300000C0000000060000000020000001800000000000000000000000000004
An error occurred while the batch was being executed.
Done building project "Database1.sqlproj" -- FAILED.
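For what it's worth, the Msg 6503 error usually means the referenced assembly has to be registered in the SQL Server catalog before the assembly that depends on it. A sketch (the DLL path is hypothetical; the PERMISSION_SET you need depends on what the library does):

```sql
-- Register the dependency first, then deploy the assembly that references it.
CREATE ASSEMBLY [Newtonsoft.Json]
FROM 'C:\libs\Newtonsoft.Json.dll'   -- hypothetical path to the matching DLL version
WITH PERMISSION_SET = UNSAFE;        -- Json.NET typically needs UNSAFE
GO
```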
This is my FillRow function. Without the bold line, everything is fine. I can create the assembly, then create the SQL function. Done. When I call select from the SQL function, it returns 10 rows as expected.
public static IEnumerable getTable(SqlString jsonStr)
{
    ArrayList resultCollection = new ArrayList();
    MyTable mt;
    //mt = JsonConvert.DeserializeObject<MyTable>(jsonStr.ToString());
    int c = 1;
    while (c < 10)
    {
        mt = new MyTable();
        mt.GlobalId = c.ToString();
        mt.DateSet = "DS=!=" + c.ToString();
        mt.Timestamp = "TS==" + c.ToString();
        mt.PartnerId = "PI==" + c.ToString();
        mt.PartnerUserId = "PUI=" + c.ToString();
        mt.UserIP = "UIP=" + c.ToString();
        mt.UserAgent = "UG=" + c.ToString();
        mt.Referer = "R=" + c.ToString();
        resultCollection.Add(mt);
        c++;
    }
    //resultCollection.Add(mt);
    return resultCollection;
}
-
I have 3 tables - Book, Author, BookAuthorReference
A book can have multiple authors, and when I do straight query I get multiple rows per book
SELECT <columns>
FROM Book b, Author a, BookAuthorReference ba
where ba.BookId = b.BookId and
ba.AuthorId = a.AuthorId
I want to get the results as ONE row per book, and Authors separated by commas, like:
SQL 2008 internals book Paul Randal, Kimberly Tripp, Jonathan K, Joe Sack...something like this
Thank you in advance
This can be done by straying into XML land. The syntax is anything but intuitive, but it works. And moreover, it is guaranteed to work.
SELECT b.Title, substring(a.Authors, 1, len(a.Authors) - 1) AS Authors
FROM Books b
CROSS APPLY (SELECT a.Author + ','
FROM BookAuthorReference ba
JOIN Authors a ON a.AuthorID = ba.AuthorID
WHERE ba.BookID = b.BookID
ORDER BY ba.AuthorNo
FOR XML PATH('')) AS a(Authors)
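On SQL Server 2017 or later, STRING_AGG does the same thing more directly; a sketch using the same hypothetical tables and columns:

```sql
SELECT b.Title,
       STRING_AGG(a.Author, ', ') WITHIN GROUP (ORDER BY ba.AuthorNo) AS Authors
FROM   Books b
JOIN   BookAuthorReference ba ON ba.BookID = b.BookID
JOIN   Authors a ON a.AuthorID = ba.AuthorID
GROUP BY b.Title;
```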
Erland Sommarskog, SQL Server MVP, [email protected] -
Parsing BLOB (CSV file with special characters) into table
Hello everyone,
In my application, user uploads a CSV file (it is stored as BLOB), which is later read and parsed into table. The parsing engine is shown bellow...
The problem is that it won't read national characters such as Ö, Ü, etc.; they simply disappear.
Is there any CSV parser that supports national characters? Or, said in other words - is it possible to read BLOB by characters (where characters can be Ö, Ü etc.)?
Regards,
Adam
/*-----------------------------------------------
| helper function for csv parsing
+-----------------------------------------------*/
FUNCTION hex_to_decimal(p_hex_str in varchar2) return number
--this function is based on one by Connor McDonald
--http://www.jlcomp.demon.co.uk/faq/base_convert.html
is
v_dec number;
v_hex varchar2(16) := '0123456789ABCDEF';
begin
v_dec := 0;
for indx in 1 .. length(p_hex_str) loop
v_dec := v_dec * 16 + instr(v_hex, upper(substr(p_hex_str, indx, 1))) - 1;
end loop;
return v_dec;
end hex_to_decimal;
/*-----------------------------------------------
| csv parsing
+-----------------------------------------------*/
FUNCTION parse_csv_to_imp_table(in_import_id in number) RETURN boolean IS
PRAGMA autonomous_transaction;
v_blob_data BLOB;
n_blob_len NUMBER;
v_entity_name VARCHAR2(100);
n_skip_rows INTEGER;
n_columns INTEGER;
n_col INTEGER := 0;
n_position NUMBER;
v_raw_chunk RAW(10000);
v_char CHAR(1);
c_chunk_len number := 1;
v_line VARCHAR2(32767) := NULL;
n_rows number := 0;
n_temp number;
BEGIN
-- shortened
n_blob_len := dbms_lob.getlength(v_blob_data);
n_position := 1;
-- Read and convert binary to char
WHILE (n_position <= n_blob_len) LOOP
v_raw_chunk := dbms_lob.substr(v_blob_data, c_chunk_len, n_position);
v_char := chr(hex_to_decimal(rawtohex(v_raw_chunk)));
n_temp := ascii(v_char);
n_position := n_position + c_chunk_len;
-- When a whole line is retrieved
IF v_char = CHR(10) THEN
n_rows := n_rows + 1;
if n_rows > n_skip_rows then
-- Shortened
-- Perform some action with the line (store into table etc.)
end if;
-- Clear out
v_line := NULL;
n_col := 0;
ELSIF v_char != chr(10) and v_char != chr(13) THEN
v_line := v_line || v_char;
if v_char = ';' then
n_col := n_col+1;
end if;
END IF;
END LOOP;
COMMIT;
return true;
EXCEPTION
-- some exception handling
END;
Uploading CSV files into LOB columns and then reading them in PL/SQL: it's been covered before (http://forums.oracle.com/forums/thread.jspa?messageID=3454184), e.g. "Re: Reading a Blob (CSV file) and displaying the contents", "Re: Associative Array and Blob", "Number of rows in a clob", doncha know.
Anyway, it would help if you gave us some basic information: database version and NLS settings would seem particularly relevant here.
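As to the character problem itself: one approach that keeps the database character set in the loop is converting the BLOB to a CLOB first with DBMS_LOB.CONVERTTOCLOB, then parsing the CLOB, so multi-byte characters like Ö and Ü survive. A sketch (variable names are hypothetical; v_blob_data is the uploaded BLOB from the question):

```sql
DECLARE
  v_clob     CLOB;
  v_dest_off INTEGER := 1;
  v_src_off  INTEGER := 1;
  v_lang     INTEGER := DBMS_LOB.DEFAULT_LANG_CTX;
  v_warn     INTEGER;
BEGIN
  DBMS_LOB.CREATETEMPORARY(v_clob, TRUE);
  -- Converts using the database character set, preserving national characters.
  DBMS_LOB.CONVERTTOCLOB(v_clob, v_blob_data, DBMS_LOB.LOBMAXSIZE,
                         v_dest_off, v_src_off,
                         DBMS_LOB.DEFAULT_CSID, v_lang, v_warn);
  -- Now parse v_clob with DBMS_LOB.SUBSTR / INSTR instead of raw byte chunks.
END;
```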
Cheers, APC
blog: http://radiofreetooting.blogspot.com -
Sample insert into table with BLOB column
This is my first opportunity to work with BLOB columns. I was wondering if anyone had some sample code that shows the loading of a blob column and inserted into the table with a blob column.
I have to produce a report (including crlf) and place it into a blob column. The user will download the report at a later time.
Any suggestions / code samples are greatly appreciated!!!!
You can enable string binding in TopLink:
login.useStringBinding(int size);
or you could enable binding in general,
login.bindAllParameters(); -
ORA-07445 in the alert log when inserting into table with XMLType column
I'm trying to insert an xml-document into a table with a schema-based XMLType column. When I try to insert a row (using plsql-developer) - oracle is busy for a few seconds and then the connection to oracle is lost.
Below you'll find the following to recreate the problem:
a) contents from the alert log
b) create script for the table
c) the before-insert trigger
d) the xml-schema
e) code for registering the schema
f) the test program
g) platform information
Alert Log:
Fri Aug 17 00:44:11 2007
Errors in file /oracle/app/oracle/product/10.2.0/db_1/admin/dntspilot2/udump/dntspilot2_ora_13807.trc:
ORA-07445: exception encountered: core dump [SIGSEGV] [Address not mapped to object] [475177] [] [] []
Create script for the table:
CREATE TABLE "DNTSB"."SIGNATURETABLE"
( "XML_DOCUMENT" "SYS"."XMLTYPE" ,
"TS" TIMESTAMP (6) WITH TIME ZONE NOT NULL ENABLE
) XMLTYPE COLUMN "XML_DOCUMENT" XMLSCHEMA "http://www.sporfori.fo/schemas/www.w3.org/TR/xmldsig-core/xmldsig-core-schema.xsd" ELEMENT "Object"
ROWDEPENDENCIES ;
Before-insert trigger:
create or replace trigger BIS_SIGNATURETABLE
before insert on signaturetable
for each row
declare
-- local variables here
l_sigtab_rec signaturetable%rowtype;
begin
if (:new.xml_document is not null) then
:new.xml_document.schemavalidate();
end if;
l_sigtab_rec.xml_document := :new.xml_document;
end BIS_SIGNATURETABLE;
XML-Schema (xmldsig-core-schema.xsd):
=====================================================================================
<?xml version="1.0" encoding="utf-8"?>
<!-- Schema for XML Signatures
http://www.w3.org/2000/09/xmldsig#
$Revision: 1.1 $ on $Date: 2002/02/08 20:32:26 $ by $Author: reagle $
Copyright 2001 The Internet Society and W3C (Massachusetts Institute
of Technology, Institut National de Recherche en Informatique et en
Automatique, Keio University). All Rights Reserved.
http://www.w3.org/Consortium/Legal/
This document is governed by the W3C Software License [1] as described
in the FAQ [2].
[1] http://www.w3.org/Consortium/Legal/copyright-software-19980720
[2] http://www.w3.org/Consortium/Legal/IPR-FAQ-20000620.html#DTD
-->
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:ds="http://www.w3.org/2000/09/xmldsig#" xmlns:xdb="http://xmlns.oracle.com/xdb"
targetNamespace="http://www.w3.org/2000/09/xmldsig#" version="0.1" elementFormDefault="qualified">
<!-- Basic Types Defined for Signatures -->
<xs:simpleType name="CryptoBinary">
<xs:restriction base="xs:base64Binary">
</xs:restriction>
</xs:simpleType>
<!-- Start Signature -->
<xs:element name="Signature" type="ds:SignatureType"/>
<xs:complexType name="SignatureType">
<xs:sequence>
<xs:element ref="ds:SignedInfo"/>
<xs:element ref="ds:SignatureValue"/>
<xs:element ref="ds:KeyInfo" minOccurs="0"/>
<xs:element ref="ds:Object" minOccurs="0" maxOccurs="unbounded"/>
</xs:sequence>
<xs:attribute name="Id" type="xs:ID" use="optional"/>
</xs:complexType>
<xs:element name="SignatureValue" type="ds:SignatureValueType"/>
<xs:complexType name="SignatureValueType">
<xs:simpleContent>
<xs:extension base="xs:base64Binary">
<xs:attribute name="Id" type="xs:ID" use="optional"/>
</xs:extension>
</xs:simpleContent>
</xs:complexType>
<!-- Start SignedInfo -->
<xs:element name="SignedInfo" type="ds:SignedInfoType"/>
<xs:complexType name="SignedInfoType">
<xs:sequence>
<xs:element ref="ds:CanonicalizationMethod"/>
<xs:element ref="ds:SignatureMethod"/>
<xs:element ref="ds:Reference" maxOccurs="unbounded"/>
</xs:sequence>
<xs:attribute name="Id" type="xs:ID" use="optional"/>
</xs:complexType>
<xs:element name="CanonicalizationMethod" type="ds:CanonicalizationMethodType"/>
<xs:complexType name="CanonicalizationMethodType" mixed="true">
<xs:sequence>
<xs:any namespace="##any" minOccurs="0" maxOccurs="unbounded"/>
<!-- (0,unbounded) elements from (1,1) namespace -->
</xs:sequence>
<xs:attribute name="Algorithm" type="xs:anyURI" use="required"/>
</xs:complexType>
<xs:element name="SignatureMethod" type="ds:SignatureMethodType"/>
<xs:complexType name="SignatureMethodType" mixed="true">
<xs:sequence>
<xs:element name="HMACOutputLength" minOccurs="0" type="ds:HMACOutputLengthType"/>
<xs:any namespace="##other" minOccurs="0" maxOccurs="unbounded"/>
<!-- (0,unbounded) elements from (1,1) external namespace -->
</xs:sequence>
<xs:attribute name="Algorithm" type="xs:anyURI" use="required"/>
</xs:complexType>
<!-- Start Reference -->
<xs:element name="Reference" type="ds:ReferenceType"/>
<xs:complexType name="ReferenceType">
<xs:sequence>
<xs:element ref="ds:Transforms" minOccurs="0"/>
<xs:element ref="ds:DigestMethod"/>
<xs:element ref="ds:DigestValue"/>
</xs:sequence>
<xs:attribute name="Id" type="xs:ID" use="optional"/>
<xs:attribute name="URI" type="xs:anyURI" use="optional"/>
<xs:attribute name="Type" type="xs:anyURI" use="optional"/>
</xs:complexType>
<xs:element name="Transforms" type="ds:TransformsType"/>
<xs:complexType name="TransformsType">
<xs:sequence>
<xs:element ref="ds:Transform" maxOccurs="unbounded"/>
</xs:sequence>
</xs:complexType>
<xs:element name="Transform" type="ds:TransformType"/>
<xs:complexType name="TransformType" mixed="true">
<xs:choice minOccurs="0" maxOccurs="unbounded">
<xs:any namespace="##other" processContents="lax"/>
<!-- (1,1) elements from (0,unbounded) namespaces -->
<xs:element name="XPath" type="xs:string"/>
</xs:choice>
<xs:attribute name="Algorithm" type="xs:anyURI" use="required"/>
</xs:complexType>
<!-- End Reference -->
<xs:element name="DigestMethod" type="ds:DigestMethodType"/>
<xs:complexType name="DigestMethodType" mixed="true">
<xs:sequence>
<xs:any namespace="##other" processContents="lax" minOccurs="0" maxOccurs="unbounded"/>
</xs:sequence>
<xs:attribute name="Algorithm" type="xs:anyURI" use="required"/>
</xs:complexType>
<xs:element name="DigestValue" type="ds:DigestValueType"/>
<xs:simpleType name="DigestValueType">
<xs:restriction base="xs:base64Binary"/>
</xs:simpleType>
<!-- End SignedInfo -->
<!-- Start KeyInfo -->
<xs:element name="KeyInfo" type="ds:KeyInfoType"/>
<xs:complexType name="KeyInfoType" mixed="true">
<xs:choice maxOccurs="unbounded">
<xs:element ref="ds:KeyName"/>
<xs:element ref="ds:KeyValue"/>
<xs:element ref="ds:RetrievalMethod"/>
<xs:element ref="ds:X509Data"/>
<xs:element ref="ds:PGPData"/>
<xs:element ref="ds:SPKIData"/>
<xs:element ref="ds:MgmtData"/>
<xs:any processContents="lax" namespace="##other"/>
<!-- (1,1) elements from (0,unbounded) namespaces -->
</xs:choice>
<xs:attribute name="Id" type="xs:ID" use="optional"/>
</xs:complexType>
<xs:element name="KeyName" type="xs:string"/>
<xs:element name="MgmtData" type="xs:string"/>
<xs:element name="KeyValue" type="ds:KeyValueType"/>
<xs:complexType name="KeyValueType" mixed="true">
<xs:choice>
<xs:element ref="ds:DSAKeyValue"/>
<xs:element ref="ds:RSAKeyValue"/>
<xs:any namespace="##other" processContents="lax"/>
</xs:choice>
</xs:complexType>
<xs:element name="RetrievalMethod" type="ds:RetrievalMethodType"/>
<xs:complexType name="RetrievalMethodType">
<xs:sequence>
<xs:element ref="ds:Transforms" minOccurs="0"/>
</xs:sequence>
<xs:attribute name="URI" type="xs:anyURI"/>
<xs:attribute name="Type" type="xs:anyURI" use="optional"/>
</xs:complexType>
<!-- Start X509Data -->
<xs:element name="X509Data" type="ds:X509DataType"/>
<xs:complexType name="X509DataType">
<xs:sequence maxOccurs="unbounded">
<xs:choice>
<xs:element name="X509IssuerSerial" type="ds:X509IssuerSerialType"/>
<xs:element name="X509SKI" type="xs:base64Binary"/>
<xs:element name="X509SubjectName" type="xs:string"/>
<xs:element name="X509Certificate" type="xs:base64Binary"/>
<xs:element name="X509CRL" type="xs:base64Binary"/>
<xs:any namespace="##other" processContents="lax"/>
</xs:choice>
</xs:sequence>
</xs:complexType>
<xs:complexType name="X509IssuerSerialType">
<xs:sequence>
<xs:element name="X509IssuerName" type="xs:string"/>
<xs:element name="X509SerialNumber" type="xs:integer"/>
</xs:sequence>
</xs:complexType>
<!-- End X509Data -->
<!-- Begin PGPData -->
<xs:element name="PGPData" type="ds:PGPDataType"/>
<xs:complexType name="PGPDataType">
<xs:choice>
<xs:sequence>
<xs:element name="PGPKeyID" type="xs:base64Binary"/>
<xs:element name="PGPKeyPacket" type="xs:base64Binary" minOccurs="0"/>
<xs:any namespace="##other" processContents="lax" minOccurs="0"
maxOccurs="unbounded"/>
</xs:sequence>
<xs:sequence>
<xs:element name="PGPKeyPacket" type="xs:base64Binary"/>
<xs:any namespace="##other" processContents="lax" minOccurs="0"
maxOccurs="unbounded"/>
</xs:sequence>
</xs:choice>
</xs:complexType>
<!-- End PGPData -->
<!-- Begin SPKIData -->
<xs:element name="SPKIData" type="ds:SPKIDataType"/>
<xs:complexType name="SPKIDataType">
<xs:sequence maxOccurs="unbounded">
<xs:element name="SPKISexp" type="xs:base64Binary"/>
<xs:any namespace="##other" processContents="lax" minOccurs="0"/>
</xs:sequence>
</xs:complexType>
<!-- End SPKIData -->
<!-- End KeyInfo -->
<!-- Start Object (Manifest, SignatureProperty) -->
<xs:element name="Object" type="ds:ObjectType"/>
<xs:complexType name="ObjectType" mixed="true">
<xs:sequence minOccurs="0" maxOccurs="unbounded">
<xs:any namespace="##any" processContents="lax"/>
</xs:sequence>
<xs:attribute name="Id" type="xs:ID" use="optional"/>
<xs:attribute name="MimeType" type="xs:string" use="optional"/> <!-- add a grep facet -->
<xs:attribute name="Encoding" type="xs:anyURI" use="optional"/>
</xs:complexType>
<xs:element name="Manifest" type="ds:ManifestType"/>
<xs:complexType name="ManifestType">
<xs:sequence>
<xs:element ref="ds:Reference" maxOccurs="unbounded"/>
</xs:sequence>
<xs:attribute name="Id" type="xs:ID" use="optional"/>
</xs:complexType>
<xs:element name="SignatureProperties" type="ds:SignaturePropertiesType"/>
<xs:complexType name="SignaturePropertiesType">
<xs:sequence>
<xs:element ref="ds:SignatureProperty" maxOccurs="unbounded"/>
</xs:sequence>
<xs:attribute name="Id" type="xs:ID" use="optional"/>
</xs:complexType>
<xs:element name="SignatureProperty" type="ds:SignaturePropertyType"/>
<xs:complexType name="SignaturePropertyType" mixed="true">
<xs:choice maxOccurs="unbounded">
<xs:any namespace="##other" processContents="lax"/>
<!-- (1,1) elements from (1,unbounded) namespaces -->
</xs:choice>
<xs:attribute name="Target" type="xs:anyURI" use="required"/>
<xs:attribute name="Id" type="xs:ID" use="optional"/>
</xs:complexType>
<!-- End Object (Manifest, SignatureProperty) -->
<!-- Start Algorithm Parameters -->
<xs:simpleType name="HMACOutputLengthType">
<xs:restriction base="xs:integer"/>
</xs:simpleType>
<!-- Start KeyValue Element-types -->
<xs:element name="DSAKeyValue" type="ds:DSAKeyValueType"/>
<xs:complexType name="DSAKeyValueType">
<xs:sequence>
<xs:sequence minOccurs="0">
<xs:element name="P" type="ds:CryptoBinary"/>
<xs:element name="Q" type="ds:CryptoBinary"/>
</xs:sequence>
<xs:element name="G" type="ds:CryptoBinary" minOccurs="0"/>
<xs:element name="Y" type="ds:CryptoBinary"/>
<xs:element name="J" type="ds:CryptoBinary" minOccurs="0"/>
<xs:sequence minOccurs="0">
<xs:element name="Seed" type="ds:CryptoBinary"/>
<xs:element name="PgenCounter" type="ds:CryptoBinary"/>
</xs:sequence>
</xs:sequence>
</xs:complexType>
<xs:element name="RSAKeyValue" type="ds:RSAKeyValueType"/>
<xs:complexType name="RSAKeyValueType">
<xs:sequence>
<xs:element name="Modulus" type="ds:CryptoBinary"/>
<xs:element name="Exponent" type="ds:CryptoBinary"/>
</xs:sequence>
</xs:complexType>
<!-- End KeyValue Element-types -->
<!-- End Signature -->
</xs:schema>
===============================================================================
Code for registering the xml-schema
begin
dbms_xmlschema.deleteSchema('http://xmlns.oracle.com/xdb/schemas/DNTSB/www.sporfori.fo/schemas/www.w3.org/TR/xmldsig-core/xmldsig-core-schema.xsd',
dbms_xmlschema.DELETE_CASCADE_FORCE);
end;
begin
DBMS_XMLSCHEMA.REGISTERURI(
schemaurl => 'http://www.sporfori.fo/schemas/www.w3.org/TR/xmldsig-core/xmldsig-core-schema.xsd',
schemadocuri => 'http://www.sporfori.fo/schemas/www.w3.org/TR/xmldsig-core/xmldsig-core-schema.xsd',
local => TRUE,
gentypes => TRUE,
genbean => FALSE,
gentables => TRUE,
force => FALSE,
owner => 'DNTSB',
options => 0);
end;
Test program
-- Created on 17-07-2006 by EEJ
declare
XML_TEXT3 CLOB := '<Object xmlns="http://www.w3.org/2000/09/xmldsig#">
<SignatureProperties>
<SignatureProperty Target="">
<Timestamp xmlns="http://www.sporfori.fo/schemas/dnts/general/2006/11/14">2007-05-10T12:00:00-05:00</Timestamp>
</SignatureProperty>
</SignatureProperties>
</Object>';
xmldoc xmltype;
begin
xmldoc := xmltype(xml_text3);
insert into signaturetable
(xml_document, ts)
values
(xmldoc, current_timestamp);
end;
/
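To verify that the document really landed in the table, a query along these lines could be run afterwards (a sketch, not from the original post; it assumes signaturetable has the xml_document and ts columns used above, and the namespaces match the test XML):

```sql
-- Extract the Timestamp value from each stored document.
-- The namespace prefixes must map to the same URIs used in the test XML.
SELECT t.ts,
       extractValue(t.xml_document,
                    '/ds:Object/ds:SignatureProperties/ds:SignatureProperty/dnts:Timestamp',
                    'xmlns:ds="http://www.w3.org/2000/09/xmldsig#" xmlns:dnts="http://www.sporfori.fo/schemas/dnts/general/2006/11/14"') AS ts_value
FROM   signaturetable t;
```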
Platform information
Operating system:
-bash-3.00$ uname -a
SunOS dntsdb 5.10 Generic_125101-09 i86pc i386 i86pc
SQL*Plus:
SQL*Plus: Release 10.2.0.3.0 - Production on Fri Aug 17 00:15:13 2007
Copyright (c) 1982, 2006, Oracle. All Rights Reserved.
Enter password:
Connected to:
Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - Production
With the Partitioning and Data Mining options
Kind Regards,
Eyðun

You should report this in a service request on http://metalink.oracle.com.
It is a shame that you put all this effort into describing your problem here, but on the other hand you can now also copy and paste the question to Oracle Support.
Since you are using 10.2.0.3, I am guessing that you have a valid service contract... -
Loading data into table with filename
Hi All,
I am new to ODI. I have a requirement to load a flat file into Oracle, along with the name of the file in one column of the table. That means that if there are 10 rows of data from that file, the additional column "FileName" of the table will have the current file name written 10 times. Please suggest how this can be achieved. I am able to do half of it, but the file name is not populated.
Thanks.

Hi,
What you can probably do is:
1. Create a variable to read the file name. Refer to http://odiexperts.com/?tag=file-variable-odi for creating such a variable.
2. Then in your interface, for the target table column that holds the file name, use '#<variable created in step 1>' and specify it to be executed on the target.
It should do the trick.
Thanks,
Sutirtha -
Loading Data into Table with Complex Transformations
Hello guys,
I am trying to load data into one of the dimension tables. It has quite a few transformations, and I created 6 temp tables:
1. The first has 7 columns and gets 935 rows based on a where condition.
2. The second has 10 columns and also gets 935 rows, but with nulls in it, i.e. column 1 has 500 values, column 2 has 300, etc.
3, 4, 5 and 6 are all like the second table.
At the end, when I try to join all the temp tables on Product_id (which is in each temp table) into the target, I get an error saying the primary key constraint is violated, i.e. non-unique values are being inserted into the Product_id column of the target table, and the job runs for hours.
The main problem is that some of the rows have the same Product_id.
Please help me. I have been trying for a week and I am under a lot of pressure.
Thanks,
Sriks
Edited by: Sriks on Oct 16, 2008 6:43 PM

Hi,
If you are building a warehouse and product_key is your primary key, then each value should appear only once, so you may have to rethink the logic used to get the data. To get past the issue you can disable the constraint and load without it, but I would rather you look at the logic and make sure you have only one row per product_key in the table.
Regards,
Bharath
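Before disabling anything, it is worth finding out which product_ids are duplicated in the staged data. A quick check along these lines (only a sketch; temp_table_1 is a placeholder for one of the six temp tables, or for the full join of them) usually points at the offending join:

```sql
-- Count how many rows each product_id produces in the staged data;
-- anything with cnt > 1 would violate the primary key on the target.
SELECT product_id, COUNT(*) AS cnt
FROM   temp_table_1
GROUP  BY product_id
HAVING COUNT(*) > 1;
```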