EA1 - Clob truncated?
I have a very simple my_log_query table with varchar partition_key, timestamp, varchar identifier, clob querytext.
With SQL Developer 1.2, when SELECTing * FROM my_log_query, I could get the whole clob displayed, and more importantly, when copying or exporting data to the clipboard I would get the whole querytext as it is in the DB.
With EA1, when SELECTing * FROM my_log_query, I can see the word (CLOB) is inserted at the beginning of the display. So far so good.
If I move my mouse cursor over the data, a hint is displayed containing a truncated version of the clob (truncated at the 4000th character). So far so good.
But if I copy (Ctrl-C while the cell is highlighted) or export the data (right-click / export), my clob is truncated at the 4000th character, just like in the hint!
If I "auto-fit on data" the cell several times, it will display the whole clob (although forcing me to scroll), but even then, copying or exporting the data will give a truncated result.
The only workaround seems to be:
- double-click on the cell to edit its contents, then select all and copy.
Would it be possible to have the whole clob's contents on copy and/or export without having to edit the cell?
Would it be possible to have the whole clob's contents on copy and/or export without having to edit the cell? We could, but we aren't inclined to, since a CLOB can be gigabytes in size. The truncation size, though, can be made a preference. Please open a new feature request at the exchange for this, if you like.
We would recommend living with an extra click or two rather than overkill in the preferences.
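Until then, if the full value matters, fetching it programmatically sidesteps the grid limit entirely. Below is a minimal sketch of draining a CLOB through its character stream with plain JDBC; the in-memory javax.sql.rowset.serial.SerialClob merely stands in for a real locator so the snippet can run without a database:

```java
import java.io.Reader;
import java.sql.Clob;
import javax.sql.rowset.serial.SerialClob;

public class ClobReader {
    // Drain a Clob's character stream completely, however large the value is,
    // instead of relying on a UI grid that caps the display at 4000 chars.
    static String readAll(Clob clob) throws Exception {
        StringBuilder sb = new StringBuilder();
        try (Reader r = clob.getCharacterStream()) {
            char[] buf = new char[8192];
            int n;
            while ((n = r.read(buf)) != -1) {
                sb.append(buf, 0, n);
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // SerialClob simulates a querytext value fetched from my_log_query.
        Clob querytext = new SerialClob("select * from some_table".toCharArray());
        System.out.println(readAll(querytext));
    }
}
```

Against a live connection the Clob would come from rs.getClob("querytext") on a SELECT from my_log_query; the read loop is the same.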
Similar Messages
-
When overwriting a CLOB, part of the longer previous value still remains
Hi,
I have a problem with updating CLOB's. Below is how I write them:
String selectStatement = "select description from modules where " +
                         "system = ? and version = ? and module = ? for update";
PreparedStatement selStmt = con.prepareStatement(selectStatement);
selStmt.setString(1, system);
selStmt.setString(2, version);
selStmt.setString(3, module);
ResultSet rset = selStmt.executeQuery();
ResultSetInPool rpool = (ResultSetInPool) rset;
if (rpool.next()) {
    desclob = (CLOB) ((OracleResultSet) rpool.getUnderlyingResultSet()).getCLOB(1);
    // Write character stream (description) to CLOB variable
    writer = ((CLOB) desclob).getCharacterOutputStream();
    writer.write(description.toCharArray());
    writer.flush();
} else {
    throw new NoSuchEntityException("Row does not exist");
}
The problem is that when there is a value already written which is longer than the new one, part of the old value (which has not been overwritten) still remains in the database. Is that normal behavior? Is there any way to address this problem better/faster than updating the row with EMPTY_CLOB() and writing the string again?
Thanks for response in advance,
RAF
Hi Rafal,
If you are using the Oracle9i JDBC drivers, then use the
oracle.sql.CLOB.trim(long newlen) method to clear the contents of the CLOB,
like:
// clear the contents of the clob
clob.trim(0);
Otherwise, if you are using JDK 1.4, use java.sql.Clob.truncate(long len):
clob.truncate(0);
then start streaming the new contents into the clob.
I guess it's the normal behaviour of a CLOB.
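To make the failure mode concrete, here is a small self-contained sketch of the overwrite-then-truncate pattern. It uses the JDK's in-memory javax.sql.rowset.serial.SerialClob in place of a real oracle.sql.CLOB locator (an assumption made so the snippet can run without a database; against Oracle you would trim or truncate the locator itself before streaming):

```java
import java.sql.Clob;
import javax.sql.rowset.serial.SerialClob;

public class ClobOverwrite {
    // Overwrite a Clob in place: setString() alone only writes over the first
    // newValue.length() characters, so the value must be truncated down to the
    // new length afterwards or the old tail survives.
    static String overwrite(Clob clob, String newValue) throws Exception {
        clob.setString(1, newValue);        // positions are 1-based
        clob.truncate(newValue.length());   // discard the leftover tail
        return clob.getSubString(1, (int) clob.length());
    }

    public static void main(String[] args) throws Exception {
        Clob clob = new SerialClob("OLD-LONG-VALUE".toCharArray());
        clob.setString(1, "new");
        // Without the truncate, the old tail survives:
        System.out.println(clob.getSubString(1, (int) clob.length())); // prints "new-LONG-VALUE"
        System.out.println(overwrite(new SerialClob("OLD-LONG-VALUE".toCharArray()), "new")); // prints "new"
    }
}
```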
Regards
Elango. -
ORA-01000 while inserting XML files
Hello,
I'm using the following JDBC code to insert XML files into Oracle 10g Release 2:
PreparedStatement stmt = null;
Connection conn = getConnection();
try {
    stmt = conn.prepareStatement("insert into security values (?)");
} catch (Exception e) {
    System.err.println(e);
}
BufferedInputStream bufferedInputStream;
for (int i = 1; i < 500; i++) {
    String docName = "../../../data/output/security/security" + i + ".xml";
    try {
        FileInputStream fileInputStream = new FileInputStream(docName);
        bufferedInputStream = new BufferedInputStream(fileInputStream, 5000);
        XMLType doc = null;
        doc = XMLType.createXML(conn, bufferedInputStream);
        stmt.setObject(1, doc);
        stmt.executeUpdate();
        doc.close();
        fileInputStream.close();
    } catch (Exception e) {
        System.out.println(i);
        System.err.println(e);
    }
}
conn.commit();
The code works fine when I insert fewer than 340 files. But when I try to insert more than 340 files I get the error:
java.sql.SQLException: ORA-00604: error occurred at recursive SQL level 1
ORA-01000: maximum open cursors exceeded
Where do I open so many cursors? While creating the XMLType?
Thx,
Fabian
I can reproduce this. It looks like a problem with the XMLType.createXML() method. I will confirm and file a bug if necessary. In the meantime, the following works and has the same effect:
package com.oracle.st.xmldb.pm.examples;
import com.oracle.st.xmldb.pm.common.baseApp.BaseApplication;
import java.io.ByteArrayInputStream;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.OutputStreamWriter;
import java.io.Reader;
import java.io.Writer;
import java.sql.SQLException;
import oracle.jdbc.OracleCallableStatement;
import oracle.jdbc.OraclePreparedStatement;
import oracle.jdbc.OracleResultSet;
import oracle.sql.CLOB;
import oracle.xdb.XMLType;
public class BulkInsertXMLType extends BaseApplication {

    public static String TABLE_NAME = "Table";
    public static String SOURCE_FILE = "File";
    public static String ITERATIONS = "Iterations";

    public void doSomething(String[] Args) throws Exception {
        OracleCallableStatement statement = null;
        String statementText;
        XMLType xml;
        CLOB clob = CLOB.createTemporary(getConnection(), true, CLOB.DURATION_SESSION);
        // statementText = "insert into " + this.getSetting(this.TABLE_NAME) + " values(:1)";
        statementText = "insert into " + this.getSetting(this.TABLE_NAME) + " values(xmltype(:1))";
        System.out.println("GetXMLType.doSomething : Driver Type = " + this.getDriver() + ". Statement = " + statementText);
        statement = (OracleCallableStatement) getConnection().prepareCall(statementText);
        for (int i = 0; i < Integer.parseInt(getSetting(this.ITERATIONS)); i++) {
            try {
                InputStream is = new FileInputStream(getSetting(this.SOURCE_FILE));
                // xml = XMLType.createXML(this.getConnection(), is);
                // statement.setObject(1, xml);
                InputStreamReader reader = new InputStreamReader(is);
                Writer writer = clob.setCharacterStream(0);
                char[] buffer = new char[clob.getChunkSize()];
                for (int charsRead = reader.read(buffer);
                     charsRead > -1;
                     charsRead = reader.read(buffer)) {
                    writer.write(buffer, 0, charsRead);
                }
                writer.close();
                reader.close();
                statement.setCLOB(1, clob);
                boolean result = statement.execute();
                is.close();
                clob.truncate(0);
            } catch (SQLException sqle) {
                System.out.println("SQL Exception caught after " + i + " iterations");
                System.out.println(sqle);
                throw sqle;
            }
        }
        CLOB.freeTemporary(clob);
        statement.close();
        getConnection().commit();
        getConnection().close();
    }

    public static void main(String[] args) {
        try {
            BulkInsertXMLType example = new BulkInsertXMLType();
            example.initializeConnection();
            example.doSomething(args);
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
-
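As an aside, the chunked Reader-to-Writer copy at the heart of the workaround above works for any character stream, and can be exercised with in-memory streams, no database needed:

```java
import java.io.Reader;
import java.io.StringReader;
import java.io.StringWriter;
import java.io.Writer;

public class ChunkedCopy {
    // Copy a character stream in fixed-size chunks, the same way a file is
    // streamed into a temporary CLOB one getChunkSize()-sized buffer at a time.
    static long copy(Reader reader, Writer writer, int chunkSize) throws Exception {
        char[] buffer = new char[chunkSize];
        long total = 0;
        for (int charsRead = reader.read(buffer);
             charsRead > -1;
             charsRead = reader.read(buffer)) {
            writer.write(buffer, 0, charsRead);
            total += charsRead;
        }
        writer.flush();
        return total;
    }

    public static void main(String[] args) throws Exception {
        StringWriter out = new StringWriter();
        long n = copy(new StringReader("<doc>payload</doc>"), out, 4);
        System.out.println(n + " chars: " + out); // prints "18 chars: <doc>payload</doc>"
    }
}
```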
Is there a SAX parser with PL/SQL??
Hi All,
I am an Oracle developer and generally want to do everything from PL/SQL.
So, from a PL/SQL procedure I would like to parse an XML document. I have tried hard to find a PL/SQL package which can give me a PL/SQL SAX parser, but could not find one.
All the packages are based on the DOM tree.
Please tell me if I am wrong: is there a SAX parser?
I know that there is only one package, DBMS_XMLSTORE, which is C-based and uses a SAX parser, but it does not have procedures/functions like ".parse()".
Thanks and regards
Here's an example:
SQL> --
SQL> define MODULE=SaxProcessor
SQL> define CLASS=com/oracle/st/xmldb/pm/examples/SaxProcessor
SQL> --
SQL> set define off
SQL> --
SQL> var sourceFileName varchar2(32)
SQL> var targetPath varchar2(1024)
SQL> --
SQL> begin
2 :sourceFileName := 'JavaSource';
3 :targetPath := '/public/' || :sourceFileName || '.java';
4 end;
5 /
PL/SQL procedure successfully completed.
SQL> declare
2 res boolean;
3 javaSource clob :=
4 'package com.oracle.st.xmldb.pm.examples;
5
6
7 import java.io.IOException;
8 import java.io.StringWriter;
9
10 import java.io.Writer;
11
12 import java.sql.DriverManager;
13 import java.sql.SQLException;
14
15 import java.util.Enumeration;
16 import java.util.Hashtable;
17
18 import oracle.jdbc.OracleConnection;
19 import oracle.jdbc.OracleDriver;
20
21 import oracle.jdbc.OraclePreparedStatement;
22
23 import oracle.sql.BFILE;
24
25 import oracle.sql.CLOB;
26
27 import oracle.xml.parser.v2.SAXParser;
28 import oracle.xml.parser.v2.XMLDocument;
29 import oracle.xml.parser.v2.XMLElement;
30
31
32 import org.w3c.dom.Attr;
33 import org.w3c.dom.Element;
34 import org.w3c.dom.Node;
35
36 import org.xml.sax.Attributes;
37 import org.xml.sax.ContentHandler;
38 import org.xml.sax.Locator;
39 import org.xml.sax.SAXException;
40
41 public class SaxProcessor implements ContentHandler {
42
43 public static final boolean DEBUG = true;
44
45 private OracleConnection dbConnection;
46
47 private OraclePreparedStatement insertStatement;
48 private OraclePreparedStatement errorStatement;
49 private CLOB clob;
50
51 private Hashtable namespaceToPrefix = null;
52 private Hashtable prefixToNamespace = null;
53 private String targetElementName = null;
54
55 private XMLDocument currentDocument;
56 private Node currentNode;
57
58 private int documentCount = 0;
59
60 public SaxProcessor() {
61 this.namespaceToPrefix = new Hashtable();
62 this.prefixToNamespace = new Hashtable();
63 }
64
65 private boolean isTargetElement(String elementName) {
66 return ((this.currentDocument == null) &&
67 (elementName.equals(this.targetElementName)));
68 }
69
70 public void startDocument() throws SAXException {
71 }
72
73 public void endDocument() throws SAXException {
74 }
75
76 public void startElement(String namespaceURI, String localName, String elementName, Attributes attrs)
77 throws SAXException
78 {
79 if (DEBUG) {
80 System.out.println("startElement() : URI = " + namespaceURI + ", localName = " + localName + ", elementName = " + elementName);
81 }
82 if (this.currentDocument == null)
83 {
84 if (DEBUG) {
85 System.out.println("startElement() : Checking for start of Fragment.");
86 }
87 if (isTargetElement(localName))
88 {
89 if (DEBUG) {
90 System.out.println("startElement() : Starting New Document");
91 }
92 this.currentDocument = new XMLDocument();
93 this.currentNode = this.currentDocument;
94 XMLElement rootElement = createNewElement(namespaceURI, localName, elementName, attrs);
95 this.currentDocument.appendChild(rootElement);
96 this.currentNode = rootElement;
97 }
98 }
99 else
100 {
101 XMLElement nextElement = createNewElement(namespaceURI, localName, elementName, attrs);
102 this.currentNode.appendChild(nextElement);
103 this.currentNode = nextElement;
104 }
105 }
106
107 public void endElement(String namespaceURI, String localName, String qName)
108 throws SAXException
109 {
110 if (this.currentDocument != null)
111 {
112 if (this.currentNode.equals(this.currentDocument.getDocumentElement()))
113 {
114 try
115 {
116 insertDocument();
117 this.currentDocument = null;
118 }
119 catch (SQLException sqlE)
120 {
121 throw new SAXException(sqlE);
122 }
123 catch (IOException ioe)
124 {
125 throw new SAXException(ioe);
126 }
127 }
128 else
129 {
130 this.currentNode = this.currentNode.getParentNode();
131 }
132 }
133 }
134
135 private XMLElement createNewElement(String namespaceURI, String localName,
136 String elementName, Attributes attrs) {
137 XMLElement newElement = null;
138 if (namespaceURI != null) {
139 if (this.namespaceToPrefix.containsKey(namespaceURI)) {
140 /* Namespace in already in Scope - create Element from Qualified Name */
141 newElement =
142 (XMLElement)this.currentDocument.createElement(elementName);
143 } else {
144 /* Namespace is not already in Scope - create Element with namespace */
145 newElement =
146 (XMLElement) this.currentDocument.createElementNS(namespaceURI,
147 elementName);
148 newElement.setPrefix((String)this.namespaceToPrefix.get(namespaceURI));
149 }
150 } else {
151 newElement =
152 (XMLElement)this.currentDocument.createElement(localName);
153 }
154 addAttributes(newElement, attrs);
155 if (this.currentNode.equals(this.currentDocument)) {
156 addNamespaceDeclarations(newElement);
157 }
158 return newElement;
159 }
160
161 private void addAttributes(Element element, Attributes attrs) {
162 for (int i = 0; i < attrs.getLength(); i++) {
163 if (attrs.getURI(i).equals("http://www.w3.org/2000/xmlns/")) {
164 } else {
165 element.setAttribute(attrs.getQName(i), attrs.getValue(i));
166 }
167 }
168 }
169
170 private void addNamespaceDeclarations(Element element) {
171 Enumeration keys = this.namespaceToPrefix.keys();
172 while (keys.hasMoreElements()) {
173 String namespace = (String)keys.nextElement();
174 String prefix = (String)namespaceToPrefix.get(namespace);
175 Attr attr = null;
176 if (prefix.equals("")) {
177 attr = this.currentDocument.createAttribute("xmlns");
178 attr.setValue(namespace);
179 element.setAttributeNode(attr);
180 } else {
181 if (!prefix.equals(element.getPrefix())) {
182 attr =
183 this.currentDocument.createAttribute("xmlns:" + prefix);
184 attr.setValue(namespace);
185 element.setAttributeNode(attr);
186 }
187 }
188 }
189 }
190
191 public void characters(char[] p0, int p1, int p2) throws SAXException {
192 if (this.currentDocument != null) {
193 StringWriter sw = new StringWriter();
194 sw.write(p0, p1, p2);
195 String value = sw.toString();
196 Node textNode = this.currentDocument.createTextNode(value);
197 this.currentNode.appendChild(textNode);
198 }
199 }
200
201 public void startPrefixMapping(String prefix,
202 String uri) throws SAXException {
203 this.namespaceToPrefix.put(uri, prefix);
204 this.prefixToNamespace.put(prefix, uri);
205 }
206
207 public void endPrefixMapping(String prefix) throws SAXException {
208 Enumeration e = prefixToNamespace.keys();
209 while (e.hasMoreElements()) {
210 String thisPrefix = (String)e.nextElement();
211 if (thisPrefix.equals(prefix)) {
212 String namespace =
213 (String)prefixToNamespace.remove(thisPrefix);
214 namespaceToPrefix.remove(namespace);
215 }
216 }
217 }
218
219 public void ignorableWhitespace(char[] p0, int p1,
220 int p2) throws SAXException {
221 // throw new SAXException ("Un-Implemented Method: ingnoreableWhitespace");
222 }
223
224 public void processingInstruction(String p0,
225 String p1) throws SAXException {
226 throw new SAXException("Un-Implemented Method: processingInstruction");
227 }
228
229 public void setDocumentLocator(Locator p0) {
230 // throw new SAXException ("Un-Implemented Method: setDocumentLocator");
231 }
232
233 public void skippedEntity(String p0) throws SAXException {
234 throw new SAXException("Un-Implemented Method: skippedEntity");
235 }
236
237 public void doParse(BFILE bfile, String targetElement,
238 String targetTable, String errorTable) throws Exception {
239 this.targetElementName = targetElement;
240 String insertStatementText =
241 "insert into " + targetTable + " values (xmlParse(DOCUMENT ? WELLFORMED))";
242 String errorStatementText =
243 "insert into " + errorTable + " values (xmlParse(DOCUMENT ? WELLFORMED))";
244
245 DriverManager.registerDriver(new oracle.jdbc.OracleDriver());
246 OracleDriver ora = new OracleDriver();
247 this.dbConnection = (OracleConnection)ora.defaultConnection();
248
249 this.insertStatement =
250 (OraclePreparedStatement)this.dbConnection.prepareStatement(insertStatementText);
251 this.errorStatement =
252 (OraclePreparedStatement)this.dbConnection.prepareStatement(errorStatementText);
253
254 this.clob =
255 CLOB.createTemporary(this.dbConnection, true, CLOB.DURATION_SESSION);
256
257 SAXParser parser = new SAXParser();
258 parser.setAttribute(SAXParser.STANDALONE, Boolean.valueOf(true));
259 parser.setValidationMode(SAXParser.NONVALIDATING);
260 parser.setContentHandler(this);
261 bfile.openFile();
262 parser.parse(bfile.getBinaryStream());
263 bfile.closeFile();
264 this.insertStatement.close();
265 this.errorStatement.close();
266 }
267
268 private void insertDocument() throws SQLException, IOException {
269 this.clob.truncate(0);
270 Writer out = clob.setCharacterStream(0);
271 this.currentDocument.print(out);
272 out.close();
273
274 this.insertStatement.setClob(1, clob);
275 this.insertStatement.execute();
276
277 this.documentCount++;
278
279 if (DEBUG) {
280 System.out.println("insertDocument() : Document Inserted");
281 }
282 }
283
284 public static int parseBFile(BFILE bfile, String targetElement,
285 String targetTable, String errorTable) throws Exception {
286 try {
287 SaxProcessor processor = new SaxProcessor();
288 processor.doParse(bfile,targetElement,targetTable,errorTable);
289 return processor.documentCount;
290 }
291 catch (Exception e) {
292 e.printStackTrace(System.out);
293 throw e;
294 }
295
296 }
297 }
298 ';
299 begin
300 if dbms_xdb.existsResource(:targetPath) then
301 dbms_xdb.deleteResource(:targetPath);
302 end if;
303 res := dbms_xdb.createResource(:targetPath,javaSource);
304 end;
305 /
Queuing DELETE Event
PL/SQL procedure successfully completed.
SQL> --
SQL> set define on
SQL> --
SQL> create or replace and resolve java source
2 named "&MODULE"
3 using blob
4 (
5 select xdburiType('/public/JavaSource.java').getBlob(nls_charset_id('WE8ISO8859P1'))
6 from dual
7 )
8 /
old 2: named "&MODULE"
new 2: named "SaxProcessor"
Java created.
SQL> show errors
No errors.
SQL> --
SQL> declare
2 shortname varchar2(128);
3 begin
4 shortname := dbms_java.shortname('&CLASS');
5 execute immediate 'grant execute on "' || shortname || '" to public';
6 end;
7 /
old 4: shortname := dbms_java.shortname('&CLASS');
new 4: shortname := dbms_java.shortname('com/oracle/st/xmldb/pm/examples/SaxProcessor');
PL/SQL procedure successfully completed.
SQL> create or replace package SAX_PROCESSOR
2 as
3 procedure PARSE_BFILE(file BFILE, targetElement varchar2, targetTable varchar2, errorTable varchar2);
4 end;
5 /
Package created.
SQL> show errors
No errors.
SQL> --
SQL> create or replace package body SAX_PROCESSOR
2 as
3 --
4 procedure PARSE_BFILE(file BFILE, targetElement varchar2, targetTable varchar2, errorTable varchar2)
5 AS LANGUAGE JAVA
6 NAME 'com.oracle.st.xmldb.pm.examples.SaxProcessor.parseBFile( oracle.sql.BFILE, java.lang.String, java.lang.String, java.lang.String)';
7 end;
8 /
Package body created.
SQL> show errors
No errors.
SQL> --
SQL> drop table PO_TEST
2 /
Table dropped.
SQL> create table PO_TEST of XMLTYPE
2 /
Table created.
SQL> drop table PO_ERROR
2 /
Table dropped.
SQL> create table PO_ERROR of XMLTYPE
2 /
Table created.
SQL> create or replace directory xmldir as 'c:\temp'
2 /
Directory created.
SQL> set serveroutput on
SQL> --
SQL> call SAX_PROCESSOR.PARSE_BFILE(bfilename('XMLDIR','testcase.xml'),'PurchaseOrder','PO_TEST','PO_ERROR')
2 /
call SAX_PROCESSOR.PARSE_BFILE(bfilename('XMLDIR','testcase.xml'),'PurchaseOrder','PO_TEST','PO_ERROR')
ERROR at line 1:
ORA-29549: class XFILES.com/oracle/st/xmldb/pm/examples/SaxProcessor has
changed, Java session state cleared
SQL> call SAX_PROCESSOR.PARSE_BFILE(bfilename('XMLDIR','testcase.xml'),'PurchaseOrder','PO_TEST','PO_ERROR')
2 /
Call completed.
SQL> select count(*) from PO_TEST
2 /
3
SQL> select * from PO_TEST
2 /
<PurchaseOrder xsi:noNamespaceSchemaLocation="http://xfiles:8080/home/SCOTT/poSo
urce/xsd/purchaseOrder.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance
">
<Reference>AMCEWEN-20030409123336271PDT</Reference>
<Actions>
<Action>
<User>KPARTNER</User>
</Action>
</Actions>
<Reject/>
<Requestor>Allan D. McEwen</Requestor>
<User>AMCEWEN</User>
<PurchaseOrder xsi:noNamespaceSchemaLocation="http://xfiles:8080/home/SCOTT/poSo
urce/xsd/purchaseOrder.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance
">
<Reference>SKING-20030409123336321PDT</Reference>
<Actions>
<Action>
<User>SKING</User>
</Action>
</Actions>
<Reject/>
<Requestor>Steven A. King</Requestor>
<User>SKING</User>
<PurchaseOrder xsi:noNamespaceSchemaLocation="http://xfiles:8080/home/SCOTT/poSo
urce/xsd/purchaseOrder.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance
">
<Reference>SMCCAIN-20030409120030451PDT</Reference>
<Actions>
<Action>
<User>SVOLLMAN</User>
</Action>
</Actions>
<Reject/>
<Requestor>Samuel B. McCain</Requestor>
<User>SMCCAIN</User>
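The essence of the Java above, a ContentHandler that treats each occurrence of a target element as the start of one document, can be demonstrated with the SAX parser bundled in the JDK. This simplified sketch only counts fragments rather than rebuilding and inserting them:

```java
import java.io.StringReader;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;

public class FragmentCounter extends DefaultHandler {
    private final String targetElement;
    int documentCount = 0;

    FragmentCounter(String targetElement) {
        this.targetElement = targetElement;
    }

    @Override
    public void startElement(String uri, String localName, String qName, Attributes attrs) {
        // Each occurrence of the target element marks the start of one fragment,
        // mirroring the isTargetElement() check in SaxProcessor.
        if (qName.equals(targetElement)) {
            documentCount++;
        }
    }

    static int countFragments(String xml, String targetElement) throws Exception {
        FragmentCounter handler = new FragmentCounter(targetElement);
        SAXParserFactory.newInstance().newSAXParser()
            .parse(new InputSource(new StringReader(xml)), handler);
        return handler.documentCount;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<Orders><PurchaseOrder/><PurchaseOrder/><PurchaseOrder/></Orders>";
        System.out.println(countFragments(xml, "PurchaseOrder")); // prints 3
    }
}
```

The count of 3 matches the select count(*) result in the session log above.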
SQL> -
Using TRUNCATE to free space used by CLOB/BLOB
Hi ,
Can we free the complete space used by a CLOB or BLOB by issuing a TRUNCATE command on the table containing these large objects?
Sorry about my loose terminology - yes, I did mean a sparse bundle. Yes, if the backup is created on a local disk, it uses ordinary folders. If it is first created over a network, then it creates a sparse bundle. Having created the sparse bundle, you can then connect the disk directly to speed up the initial backup or a major restore, but also connect remotely for routine incremental backups or minor restores.
It sounds like the type of sparse bundle created may depend on circumstances. In my case I am backing up from a laptop onto a partition on my desktop machine running Leopard 10.5.2 and when I grew the partition, although the Time Machine preference pane saw the extra space, when it came to a backup I got an error message saying there was not enough space and reporting the original size. Deleting and starting over again fixed this.
It is possible that in other circumstances a disk attached to a Time Capsule or elsewhere might get a sparse bundle with different parameters.
Incidentally, I tried copying my old backup sparse bundle onto another drive, deleting it and letting Time Machine create a new sparse bundle on my grown partition and copying the contents from the old one into the new one. Time Machine refused to work with it, so I lost my old backups.
What we need is a Time Machine Utility to manipulate these files, copy them, move backups from direct folders to sparse bundles etc. Ideally Apple would produce this, but I would be willing to pay a shareware fee for that. -
When I fetch data I'm trying to access some CLOBs as strings (i.e. when I define them I used SQLT_STR instead of SQLT_CLOB). The problem is that some of CLOBs are over 4000 bytes and 4000 is the size I get from OCI_ATTR_DATA_SIZE. Needless to say some columns get truncated (code 1406).
Now I can detect the truncated data but I can't seem to find a way to get it. Is there a way?
I don't want to but I suppose I could write routines to handle CLOB data using OCILobLocator but I haven't yet found a good example. The one example I have shows this:
OCIDefine* define;
OCILobLocator* lob;
OCIDescriptorAlloc(envhp, &lob, OCI_DTYPE_LOB, 0, 0);
OCIDefineByPos(stmthp, &define, errhp, index, &lob, -1, SQLT_CLOB, ind, len, code, OCI_DEFAULT);
/* later */
OCIStmtFetch2(stmthp, errhp, 100, OCI_FETCH_NEXT, 0, OCI_DEFAULT);
Now I'm used to passing arrays to OCIDefineByPos(), but in this case I only see one OCILobLocator being allocated while 100 rows are fetched. From everything I know this should cause a major problem.
Am I right? Must you iterate through an array of OCILobLocator and initialize each one before passing it to OCIDefineByPos()?
For example:
int i;
OCILobLocator* lob[100];
for (i = 0; i < 100; i++)
    OCIDescriptorAlloc(envhp, &lob[i], OCI_DTYPE_LOB, 0, 0);
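As far as I know, yes: with array fetches each row needs its own initialized descriptor, so you allocate one OCILobLocator per array slot and then read each locator piecewise (e.g. with OCILobRead) until the full value is retrieved. The shape of the piecewise read is sketched below with JDBC types and an in-memory SerialClob for brevity, since the loop logic is the point:

```java
import java.sql.Clob;
import javax.sql.rowset.serial.SerialClob;

public class PiecewiseFetch {
    // Fetch a Clob in bounded pieces, the same idea as reading a LOB locator
    // chunk by chunk instead of relying on one 4000-byte buffer that truncates
    // the value with error 1406 / SQLSTATE 01004.
    static String fetchAll(Clob clob, int pieceSize) throws Exception {
        StringBuilder sb = new StringBuilder();
        long length = clob.length();
        for (long pos = 1; pos <= length; pos += pieceSize) {
            int n = (int) Math.min(pieceSize, length - pos + 1);
            sb.append(clob.getSubString(pos, n));   // positions are 1-based
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        Clob clob = new SerialClob("<a>0123456789</a>".toCharArray());
        System.out.println(fetchAll(clob, 5)); // prints "<a>0123456789</a>"
    }
}
```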
-
Issues with CLOB datatype in 10g (data truncation)
I have a product which runs perfectly fine on Oracle 9i. We are trying to upgrade the product to 10g. Once the database is upgraded and the data is imported, we are facing a few issues with CLOB data type values. We are using the SQLGetData function to fetch data from these CLOB columns. The data is getting truncated when retrieving. Oracle says it's an issue with the ODBC driver which was released with 10g Release 1. I tried to upgrade the ODBC driver, but even after upgrading I am still facing the same issue. I tried upgrading to Oracle 10g Release 2. Still I am facing the same issue. The data stored in the CLOB column is XML data, so after retrieving the truncated data, the XML parser is not able to parse it.
When I trace the error using ODBC Error Tracing, I am getting this error :
[01004] [Oracle][ODBC]String data, right truncated. (0) .
Has anyone faced anything like this? Your inputs will be greatly appreciated.
Thanks in advance.
Hi,
Thanks for the reply. I couldn't quite get what you meant by "referential variable". Could you please elaborate on that?
Thanks in advance. -
Why are results from xmlgen.getxml truncated?
I have a table with 25 rows in it whose full contents display with a SELECT * statement. However, if I wrap this statement in xmlgen.getxml, the result is truncated. I'm sure this is a novice issue, and this novice would appreciate any insight.
Thanks in advance,
Rich
query:
SELECT xmlgen.getxml('SELECT * FROM dbotemp.portfolio') xml from dual
result:
XML
<?xml version = '1.0'?>
<ROWSET>
<ROW num="1">
<PORTFOLIO_ID>187</PORTF
1 row selected.
You get a CLOB, and CLOBs are normally truncated on display.
When you use the statement in SQL*Plus, try "set long 10000"
and you will see the whole result. -
Export XML data from a CLOB field using sql developer
I have a table that contains a CLOB field; each record is a complete XML document in itself. After running the select query I am exporting this data by right-clicking on the result set. While the export completes, the data for each record gets truncated.
When I choose to save the export in loader format I get the complete records, but then n files are created for the n records exported.
Is there a way I can get all the records in a single file without any truncation?
Thanks in advance!
You might try delimited format or csv, with no enclosures if desired.
-
Deleting a row from a table containing CLOB as one of the columns
When I delete a row from a table which contains a CLOB (internal LOB), i.e. a CLOB or BLOB column, will the CLOB data also be deleted? I understand that what is actually stored in the CLOB column is the LOB locator, which points to the actual data.
So, when I delete this row, the locator will be deleted, but will the actual data the locator points to also be deleted? If not, what is the process for deleting the data the locator points to when the row containing the locator is deleted? If this does not happen, the actual data might become orphan data which nobody has access to; does automatic garbage collection occur at frequent intervals to delete unaddressed data residing on the database server?
Thanks in advance for the help; you can email me at [email protected] alternatively.
Regards,
Srinivasa C.
Michael,
Thanks very much for your inputs. Here are the results I got when I tried it the way you explained in your answer. The TRUNCATE command brought the actual size back to normal, but DELETE is not the same, so how can I delete the data that a particular clob locator may point to?
TRUNCATE would delete all the rows of the table, which might not serve my purpose; I would like to delete a row and also its associated clob data from the database! Is there any way to do this?
Is there any limitation on the ool_sample size? I am basically a C++ programmer; I am looking for some function like FREE which would free the memory allocated to the clob once the locator is deleted.
Your help is greatly appreciated - Thanks!
:-) Srini.
==========================
My Results:
==========================
SQL> create table sample (
2 id integer primary key,
3 the_data CLOB default empty_clob() )
4 lob (the_data) store as ool_sample;
Table created.
SQL> select segment_name, round(sum(bytes)/1024, 2) || 'K' as storage_consumed
2 from user_segments
3 where segment_name in ('SAMPLE', 'OOL_SAMPLE')
4 group by segment_name;
SEGMENT_NAME
STORAGE_CONSUMED
OOL_SAMPLE
20K
SAMPLE
10K
SQL> select count(*) from sample;
COUNT(*)
0
SQL> begin
2 for i in 1..1000
3 loop
4 insert into sample values (i, RPAD('some data', 4000) );
5 end loop;
6 end;
7 /
PL/SQL procedure successfully completed.
SQL> select segment_name, round(sum(bytes)/1024, 2) || 'K' as storage_consumed
2 from user_segments
3 where segment_name in ('SAMPLE', 'OOL_SAMPLE')
4 group by segment_name;
SEGMENT_NAME
STORAGE_CONSUMED
OOL_SAMPLE
6420K
SAMPLE
70K
SQL> delete sample;
1000 rows deleted.
SQL> select segment_name, round(sum(bytes)/1024, 2) || 'K' as storage_consumed
2 from user_segments
3 where segment_name in ('SAMPLE', 'OOL_SAMPLE')
4 group by segment_name;
SEGMENT_NAME
STORAGE_CONSUMED
OOL_SAMPLE
6420K
SAMPLE
70K
SQL> commit;
Commit complete.
SQL> select segment_name, round(sum(bytes)/1024, 2) || 'K' as storage_consumed
2 from user_segments
3 where segment_name in ('SAMPLE', 'OOL_SAMPLE')
4 group by segment_name;
SEGMENT_NAME
STORAGE_CONSUMED
OOL_SAMPLE
6420K
SAMPLE
70K
SQL> begin
2 for i in 1..1000
3 loop
4 insert into sample values (i, rpad('some data', 4000));
5 end loop;
6 end;
7 /
PL/SQL procedure successfully completed.
SQL> select segment_name, round(sum(bytes)/1024, 2) || 'K' as storage_consumed
2 from user_segments
3 where segment_name in ('SAMPLE', 'OOL_SAMPLE')
4 group by segment_name;
SEGMENT_NAME
STORAGE_CONSUMED
OOL_SAMPLE
9616K
SAMPLE
70K
SQL> truncate table sample;
Table truncated.
SQL> select segment_name, round(sum(bytes)/1024, 2) || 'K' as storage_consumed
2 from user_segments
3 where segment_name in ('SAMPLE', 'OOL_SAMPLE')
4 group by segment_name;
SEGMENT_NAME
STORAGE_CONSUMED
OOL_SAMPLE
20K
SAMPLE
10K -
Display XML Document from CLOB Column on page
Hi,
I have been reading all the CLOB postings that I can find, but I still cannot get my page to do what I want.
I have a very simple table:
MF_XML_DOCS (DOC_ID NUMBER, DOC_XML CLOB)
I can populate this table OK, but I am having problems getting the content back out. I want a simple page that takes an ID number and displays the XML document for that ID (select doc_xml from mf_xml_docs where doc_id = :P1_DOC_ID). Everything I try either truncates the text (or errors) at 4000 or 32767 characters, or interprets the XML tags as markup and does not display them. I want a simple display of the XML document (and I don't mind whether it is in an 'updateable' field or not):
<Parent>
<name>Dad</name>
<Children>
<Child>
<name>Number 1 Son</name>
</Child>
<Child>
<name>Number 2 Son</name>
</Child>
</Children>
</Parent>
But when I do something that works for large (32767+) documents, all I see is 'Dad Number 1 Son Number 2 Son'.
Help!!
many thanks,
Martin
Have you tried simply outputting it using the htp.p function? For example:
/* ... inside a PL/SQL region */
declare
lclb_output clob;
begin
select doc_xml into lclb_output from mf_xml_docs where doc_id = :P1_DOC_ID;
htp.p(lclb_output); -- you may have to split this into chunks and loop through, depending on how big the clob is
end; -
Web Service Response truncated
Hi out there,
I finally got a web service reference to Amazon's ECS up and running.
But the response from the web service is not put into a collection. If I debug my application I see that Amazon is responding and that the response is truncated.
If I view the collection's session state I get the error message: character buffer too small ...
Is there any possibility to store the response from a web service in a CLOB or XMLType item, so that the parsing can take place?
Regards Markus
I use web services a lot and I sympathize with your issue. I struggle with similar issues, and I found a great utility that will help you confirm whether your web service is returning the data correctly to Flex. I know you said it works in other applications, but who knows if Flex is calling it correctly, etc. This utility has been the most amazing tool in helping me resolve web service issues:
http://www.charlesproxy.com/
Once you can confirm the data being returned is good, you can try several things in Flex. Try changing your result format to object or e4x etc. and see how that plays out. Not sure where you're tapping in to look at your debugger; you might want to catch it right at the result handler, before converting to any collections.
If nothing here helps, maybe post some code to look at. -
CLOB temporary tablespace is not released
In our Java web application, a datasource is used to call an Oracle stored procedure and get data back as a CLOB. The Oracle version is 8.1.7.4. After the database call, the database connection is closed. But I found that the temporary tablespace for the CLOB data is not released. Right now it has become a critical issue on our production database. Does anybody have any clue? Really appreciate it.
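One thing worth checking in the Java layer: with the 8.1.7 Oracle JDBC driver, temporary LOBs created on the server are not necessarily released just because the connection goes back to the pool; they can be freed explicitly. This is a hedged sketch, not the poster's actual code; the procedure name my_proc and the out-parameter position are made up for illustration:

```java
// Fragment assuming the Oracle JDBC driver (oracle.sql.CLOB) is on the
// classpath and `con` is an open java.sql.Connection.
CallableStatement cs = con.prepareCall("{ call my_proc(?) }");
cs.registerOutParameter(1, java.sql.Types.CLOB);
cs.execute();

oracle.sql.CLOB clob = (oracle.sql.CLOB) cs.getClob(1);
String value = clob.getSubString(1, (int) clob.length());

// If the procedure built the CLOB with DBMS_LOB.CREATETEMPORARY, the
// segment lives in the temporary tablespace until it is freed explicitly.
if (clob.isTemporary()) {
    clob.freeTemporary();
}
cs.close();
```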
Team,
We have a table 550 GB in size, and we truncated it. The truncate succeeded, but the space is not released at the OS level. What action can we take to release the space? This table has only one row and contains binary data.
Thanks
PGR
Hello,
Yes, space won't be released immediately. If large extents are in the picture, which I assume is your case, it goes into deferred drop, a background process which will execute after some time (the time may vary). See the link below for details.
As per BOL, if there are more than 128 extents, it goes into deferred drop.
http://msdn.microsoft.com/en-us/library/ms177495.aspx
You should wait for some time and keep checking the free space.
The discussion below will surely help your understanding; see Jonathan's reply:
http://social.msdn.microsoft.com/Forums/en-US/4aa2537e-246b-4bfe-818d-3482531d9149/sql-server-2005-massive-400gb-table-dropped-space-not-released
Please mark this reply as the answer or vote as helpful, as appropriate, to make it useful for other readers.
-
Hi,
I have a CLOB in the table which has huge data in each row. I want to export this table to some other instance (to the client site).
I exported it like below:
exp kvv/kvv@Ora1 buffer=10240000 file=kvvexp.dmp tables=groupdetails log=exp.log
After this I tried to import like below:
imp kvv/kvv@Ora2 buffer=10240000 file=kvvexp.dmp log=imp.log
After this I found no errors in either the export or the import logs.
But in Ora2 (which is the imported instance) I found all the rows, but the CLOB is not fully there; it is truncated, the data is cut off at some point.
Both DB instances (Ora1 and Ora2) are Oracle9i version 9.2.0.1.0, and the settings are also the same.
Why is the full row not exported and imported? Is there any problem in my way of exporting and importing?
Thanks & Regards
Venkata Vara Prasad Kosaraju

Since LOBs are fetched one row at a time, can you remove the buffer parameter in the export options and try again to re-export and import?
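Following that suggestion, the retried commands would look like this (a sketch; the connect strings, table name, and file names are taken from the original post, and only the buffer parameter is dropped):

```shell
# Re-export without the buffer parameter so LOBs are fetched row by row
exp kvv/kvv@Ora1 file=kvvexp.dmp tables=groupdetails log=exp.log

# Re-import on the target instance, again without the buffer parameter
imp kvv/kvv@Ora2 file=kvvexp.dmp log=imp.log
```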
-
Which is better - BLOB, CLOB or XMLTYPE?
Hi All,
We have to store XML in one of the fields in our table.
The XML size is not fixed, but it's not so huge that we should go for external LOBs.
I mean it will always be less than 4 GB.
Our concern here is: does a CLOB allocate exactly as much space as the data present (something like VARCHAR2)?
How is it in the case of BLOB and XMLTYPE as well?
Is there any Oracle documentation or way (code snippet) to verify this?
Thanks in advance.
Avinash

You have asked if a CLOB allocates space like VARCHAR2; yes, it does. Maybe this example will make it clearer:
SQL> TRUNCATE TABLE test_table;
Table truncated.
SQL> insert into test_table values(LPAD('A', 100, 'A'));
1 row created.
SQL> insert into test_table values(LPAD('A', 1000, 'A'));
1 row created.
SQL> insert into test_table values(LPAD('A', 10000, 'A'));
1 row created.
SQL> COMMIT;
Commit complete.
SQL> SELECT LENGTH(col1) FROM test_table;
LENGTH(COL1)
100
1000
4000
SQL>
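Two hedged follow-ups to the example above. First, the 4000 in the LENGTH output is worth noting: the SQL LPAD function returns at most 4000 characters in this Oracle version, so the third row stored 4000 'A's, not 10000. Second, to see how much space a LOB column actually allocates (assuming col1 really is a CLOB and you have access to the usual dictionary views), you can compare logical length with segment size:

```sql
-- logical length of each stored value
SELECT LENGTH(col1), DBMS_LOB.GETLENGTH(col1) FROM test_table;

-- physical bytes allocated to the LOB segment backing col1
SELECT s.segment_name, s.bytes
FROM   user_segments s
WHERE  s.segment_name IN (SELECT l.segment_name
                          FROM   user_lobs l
                          WHERE  l.table_name = 'TEST_TABLE');
```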