Blob Truncated
Hi!
I'm using Informatica PowerCenter 8.1.1 and need to pass an image stored in a BLOB column from my source table to my target table, but the image gets truncated in the target.
I've increased the size of the ports to 24,000,000, and the image is only 348 KB.
I also tested the target table by loading the same file from the same source with a PL/SQL program, and the image loaded fine. However, I don't want to use PL/SQL just to load the images.
If anyone has advice, I would really appreciate a reply.
Regards.
Jorge.
Well, this is really an Informatica question rather than an OBI one, but here's what I came up with: http://technet.informatica.com/forum/enhancements/topic/blob-gets-truncatet-when-bigger-64k
Similar Messages
-
Blob truncated with DbFactory and Bulk insert
Hi,
My platform is a Microsoft Windows Server 2003 R2 Server 5.2 Service Pack 2 (64-bit) with an Oracle Database 11g 11.1.0.6.0.
I use the client Oracle 11g ODAC 11.1.0.7.20.
Some strange behavior happens when using DbFactory and a bulk command with a BLOB column and a parameter larger than 65,536 bytes. Let me explain.
First I create a dummy table in my schema:
create table dummy (a number, b blob)
To use bulk insert we can use code A with the Oracle provider types (this executes successfully):
byte[] b1 = new byte[65530];
byte[] b2 = new byte[65540];
Oracle.DataAccess.Client.OracleConnection conn = new Oracle.DataAccess.Client.OracleConnection("User Id=login;Password=pws;Data Source=orcl;");
OracleCommand cmd = new OracleCommand("insert into dummy values (:p1,:p2)", conn);
cmd.ArrayBindCount = 2;
OracleParameter p1 = new OracleParameter("p1", OracleDbType.Int32);
p1.Direction = ParameterDirection.Input;
p1.Value = new int[] { 1, 2 };
cmd.Parameters.Add(p1);
OracleParameter p2 = new OracleParameter("p2", OracleDbType.Blob);
p2.Direction = ParameterDirection.Input;
p2.Value = new byte[][] { b1, b2 };
cmd.Parameters.Add(p2);
conn.Open(); cmd.ExecuteNonQuery(); conn.Close();
We can write the same thing at an abstract level using DbProviderFactories (code B):
var factory = DbProviderFactories.GetFactory("Oracle.DataAccess.Client");
DbConnection conn = factory.CreateConnection();
conn.ConnectionString = "User Id=login;Password=pws;Data Source=orcl;";
DbCommand cmd = conn.CreateCommand();
cmd.CommandText = "insert into dummy values (:p1,:p2)";
((OracleCommand)cmd).ArrayBindCount = 2;
DbParameter param = cmd.CreateParameter();
param.ParameterName = "p1";
param.DbType = DbType.Int32;
param.Value = new int[] { 3, 4 };
cmd.Parameters.Add(param);
DbParameter param2 = cmd.CreateParameter();
param2.ParameterName = "p2";
param2.DbType = DbType.Binary;
param2.Value = new byte[][] { b1, b2 };
cmd.Parameters.Add(param2);
conn.Open(); cmd.ExecuteNonQuery(); conn.Close();
But this second code doesn't work: the second byte array is truncated to 4 bytes. It looks like a 16-bit length overflow.
When given DbType.Binary, Oracle maps it to OracleDbType.Raw rather than OracleDbType.Blob, so the problem seems to be with the Raw type. BUT if we use the same code without bulk insert, it works! So the problem is somewhere else...
Why use a DbConnection? To be able to switch easily to another database type.
So why use "((OracleCommand)cmd).ArrayBindCount"? To be able to use database-specific functionality.
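The truncation to exactly 4 bytes fits the 16-bit overflow guess above: 65540 is 65536 + 4, so a length stored in an unsigned 16-bit field wraps to 4. This sketch only checks that arithmetic; it is not ODP.NET internals:

```python
def wrapped_length(n: int) -> int:
    """The length as seen through an unsigned 16-bit field (0..65535)."""
    return n & 0xFFFF

print(wrapped_length(65530))  # 65530: b1 fits, unchanged
print(wrapped_length(65540))  # 4: b2 wraps, matching the observed truncation
```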
I can work around the issue by casting the DbParameter to OracleParameter and setting OracleDbType to Blob, but why does the second code fail with bulk insert yet work with a simple query?
BCP and BULK INSERT do not work the way you expect them to. What they do is consume fields in a round-robin fashion: they first look for data for the first field, then for the second field, and so on.
So in your case, they will first read one byte, then 20 bytes, etc., until they have read the two bytes for field 122. At this point they will consume bytes until they find a sequence of carriage return and line feed.
You say that some records in the file are incomplete. Say there are only 60 fields in such a record, and field 61 is four bytes. BCP and BULK INSERT will then read data for field 61 as CR+LF plus the first two bytes of the next row; CR+LF has no special meaning there,
it is just data at that point.
You will have to write a program to parse the file, or use SSIS. BCP and BULK INSERT are not your friends in this case.
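The round-robin consumption described above can be sketched as a toy parser. The field sizes and row layout here are illustrative, not BCP internals: two fixed-size fields followed by a CR+LF-terminated field show how an incomplete row silently swallows the next one.

```python
def parse_rows(data: bytes, fixed_sizes=(1, 2)):
    """Toy round-robin reader: two fixed-size fields, then a field
    terminated by CR+LF, mirroring how BCP consumes fields in order."""
    rows, pos = [], 0
    while pos < len(data):
        fields = []
        for size in fixed_sizes:          # consume fixed fields first
            fields.append(data[pos:pos + size])
            pos += size
        end = data.find(b"\r\n", pos)     # then scan for the terminator
        if end == -1:
            fields.append(data[pos:])
            pos = len(data)
        else:
            fields.append(data[pos:end])
            pos = end + 2
        rows.append(fields)
    return rows

good = b"A" + b"BB" + b"ok\r\n"   # complete row
short = b"C" + b"DD"              # incomplete row: last field and CR+LF missing
tail = b"E" + b"FF" + b"x\r\n"    # the row that follows it
rows = parse_rows(good + short + tail)
# Only two rows come back: the incomplete row's last field has silently
# swallowed the whole next row, up to the next CR+LF.
print(rows)
```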
Erland Sommarskog, SQL Server MVP, [email protected] -
Using TRUNCATE to free space used by CLOB/BLOB
Hi ,
Can we free the complete space used by a CLOB or BLOB by issuing a TRUNCATE command on the table containing these large objects?
Sorry about my loose terminology - yes, I did mean a sparse bundle. Yes, if the backup is created on a local disk, it uses ordinary folders. If it is first created over a network, it creates a sparse bundle. Having created the sparse bundle, you can then connect the disk directly to speed up the initial backup or a major restore, but also connect remotely for routine incremental backups or minor restores.
It sounds like the type of sparse bundle created may depend on circumstances. In my case I am backing up from a laptop onto a partition on my desktop machine running Leopard 10.5.2, and when I grew the partition, although the Time Machine preference pane saw the extra space, when it came to a backup I got an error message saying there was not enough space, still reporting the original size. Deleting the backup and starting over fixed this.
It is possible that in other circumstances a disk attached to a Time Capsule or elsewhere might get a sparse bundle with different parameters.
Incidentally, I tried copying my old backup sparse bundle onto another drive, deleting it and letting Time Machine create a new sparse bundle on my grown partition and copying the contents from the old one into the new one. Time Machine refused to work with it, so I lost my old backups.
What we need is a Time Machine Utility to manipulate these files, copy them, move backups from direct folders to sparse bundles etc. Ideally Apple would produce this, but I would be willing to pay a shareware fee for that. -
Unable to convert BLOB to XML using XMLTYPE
Hello (XML) Experts
I need your help with manipulating a BLOB column containing XML data - I am encountering the following error:
ORA-31011: XML parsing failed
ORA-19202: Error occurred in XML processing
LPX-00200: could not convert from encoding UTF-8 to WINDOWS-1252
Error at line 1
ORA-06512: at "SYS.XMLTYPE", line 283
I am on Windows 7 64 bit, Oracle 11.2.0.3 64 bit and database character set is WE8MSWIN1252, NLS_LANG is set to AMERICAN_AMERICA.AL32UTF8. The BLOB column contains the following XML data:
<?xml version="1.0" encoding="utf-8"?>
<Root CRC="-4065505">
<Header Converted="0">
<Version Type="String" Value="512" />
<Revision Type="String" Value="29" />
<SunSystemsVersion Type="String" Value="" />
<Date Type="String" Value="20080724" />
<Time Type="String" Value="165953" />
<DAG Type="String" Value="" />
<ChkID Type="String" Value="" />
<FormType Type="String" Value="1" />
<DB Type="String" Value="AllBusinessUnits" />
<FuncID Type="String" Value="SOE" />
<Status Type="String" Value="" />
<FileType Type="String" Value="SFL" />
<Descriptions>
<Default Type="String" Value="Sales Order Entry" />
<L01 Type="String" Value="Sales Order Entry" />
<L33 Type="String" Value="Saisie commande client" />
<L34 Type="String" Value="Entrada de órdenes de venta" />
<L39 Type="String" Value="Inserimento ordine di vendita" />
<L49 Type="String" Value="Aufträge erfassen" />
<L55 Type="String" Value="Entrada de pedido de venda" />
<L81 Type="String" Value="受注オーダー入力" />
<L86 Type="String" Value="销售订单录入" />
<L87 Type="String" Value="銷售訂單錄入" />
</Descriptions>
</Header>
<FormDesignerAppVer Type="String" Value="5.1" SFLOnly="1" />
</Root>
I am using the XMLTYPE constructor, passing in the BLOB column and the character set ID of the XML data stored in the BLOB, in order to extract and update a node in the XML as follows:
select xmltype(srce_form_detail,873) from SRCE_FORM
where 873 above corresponds to the UTF-8 encoding of the XML data in the BLOB column, i.e. AL32UTF8, but this results in the above error.
I have also tried converting the BLOB to a CLOB first as below where BLOB2CLOB is a function that converts the BLOB to a CLOB:
select xmltype(BLOB2CLOB(srce_form_detail)).EXTRACT('/Root/Header/DB').getStringVal() XMLSrc from SRCE_FORM;
This results in the following error:
ORA-31011: XML parsing failed
ORA-19202: Error occurred in XML processing
LPX-00210: expected '<' instead of '¿'
Error at line 1
ORA-06512: at "SYS.XMLTYPE", line 272
ORA-06512: at line 1
Looking at the XML in the BLOB, I noticed that it contains a BOM (byte order mark), and this is causing the XML parsing to fail. I don't know how to deal with it, and I don't want to simply SUBSTR it out.
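One way to handle the BOM before parsing, sketched here in Python on raw bytes (the Oracle-side equivalent would skip the first three bytes of the BLOB before conversion; the XML fragment below is a trimmed stand-in for the posted document):

```python
import xml.etree.ElementTree as ET

UTF8_BOM = b"\xef\xbb\xbf"

def parse_blob_xml(raw: bytes) -> ET.Element:
    """Strip a leading UTF-8 BOM, if present, then parse the XML bytes."""
    if raw.startswith(UTF8_BOM):
        raw = raw[len(UTF8_BOM):]
    return ET.fromstring(raw)

blob = UTF8_BOM + (b'<?xml version="1.0" encoding="utf-8"?>'
                   b'<Root><Header><DB Type="String" Value="AllBusinessUnits"/>'
                   b'</Header></Root>')
root = parse_blob_xml(blob)
print(root.find("Header/DB").get("Value"))  # AllBusinessUnits
```

Without the strip, a parser that does not recognize the BOM sees a stray byte before `<` and fails, which is exactly the LPX-00210 "expected '<'" error above.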
What I am trying to achieve is to extract the contents of the DB node in the XML and depending on its value I need to update the 'Value' part of that node. I am stuck at the point of extracting the contents of the DB node.
I hope I have provided enough information and I would appreciate any suggestions on how best to resolve this - my XML knowledge is very limited so I would appreciate any help.
Regards,
Mohinder
Hi Marc,
Thanks for your response.
You are correct that the BLOB contains Japanese and Chinese characters, but I was expecting that the XMLTYPE constructor would convert the character set, albeit with some data loss, i.e. not displaying the Chinese and Japanese characters correctly.
It seems to me that XMLTYPE is not handling/interpreting the BOM contained in the BLOB, since even converting the BLOB to a CLOB results in an error. If I use SUBSTR to skip the BOM and extract the XML from the BLOB, then it works, and as expected the Chinese and Japanese characters are not displayed correctly; they appear as '¿', corresponding to the lines beginning with L81, L86 & L87, see below:
select xmltype(SUBSTR(BLOB2CLOB(srce_form_detail),4)) from SRCE_FORM
<?xml version="1.0" encoding="utf-8"?>
<Root CRC="-4065505">
<Header Converted="0">
<Version Type="String" Value="512" />
<Revision Type="String" Value="29" />
<SunSystemsVersion Type="String" Value="" />
<Date Type="String" Value="20080724" />
<Time Type="String" Value="165953" />
<DAG Type="String" Value="" />
<ChkID Type="String" Value="" />
<FormType Type="String" Value="1" />
<DB Type="String" Value="AllBusinessUnits" />
<FuncID Type="String" Value="SOE" />
<Status Type="String" Value="" />
<FileType Type="String" Value="SFL" />
<Descriptions>
<Default Type="String" Value="Sales Order Entry" />
<L01 Type="String" Value="Sales Order Entry" />
<L33 Type="String" Value="Saisie commande client" />
<L34 Type="String" Value="Entrada de ¿¿rdenes de venta" />
<L39 Type="String" Value="Inserimento ordine di vendita" />
<L49 Type="String" Value="Auftr¿¿ge erfassen" />
<L55 Type="String" Value="Entrada de pedido de venda" />
<L81 Type="String" Value="¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿" />
<L86 Type="String" Value="¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿" />
<L87 Type="String" Value="¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿¿" />
</Descriptions>
</Header>
Can you please let me know how I can extract a binary dump of the BLOB to post on the forum, as I don't know how to do this? Below is a snippet of the hexadecimal dump, including the BOM. I can post the full hexadecimal dump if it helps you reproduce the error.
EFBBBF3C3F786D6C2076657273696F6E3D22312E302220656E636F64696E673D227574662D38223F3E0D0A3C526F6F74204352433D222D34303635353035223E0D0A20203C48656164657220436F6E7665727465643D2230223E0D0A202020203C56657273696F6E20547970653D22537472696E67222056616C75653D2235313222202F3E0D0A202020203C5265766973696F6E20547970653D22537472696E67222056616C75653D22323922202F3E0D0A202020203C53756E53797374656D7356657273696F6E20547970653D22537472696E67222056616C75653D2222202F3E0D0A202020203C4461746520547970653D22537472696E67222056616C75653D22323030383037323422202F3E0D0A202020203C54696D6520547970653D22537472696E67222056616C75653D2231363539353322202F3E0D0A202020203C44414720547970653D22537472696E67222056616C75653D2222202F3E0D0A202020203C43686B494420547970653D22537472696E67222056616C75653D2222202F3E0D0A202020203C466F726D5479706520547970653D22537472696E67222056616C75653D223122202F3E0D0A202020203C444220547970653D22537472696E67222056616C75653D22416C6C427573696E657373556E69747322202F3E0D0A202020203C46756E63494420547970653D22537472696E67222056616C75653D22534F4522202F3E0D0A202020203C53746174757320547970653D22537472696E67222056616C75653D2222202F3E0D0A202020203C46696C655479706520547970653D22537472696E67222056616C75653D2253464C22202F3E0D0A202020203C4465736372697074696F6E733E0D0A2020202020203C44656661756C7420547970653D22537472696E67222056616C75653D2253616C6573204F7264657220456E74727922202F3E0D0A2020202020203C4C303120547970653D22537472696E67222056616C75653D2253616C6573204F7264657220456E74727922202F3E0D0A2020202020203C4C333320547970653D22537472696E67222056616C75653D2253616973696520636F6D6D616E646520636C69656E7422202F3E0D0A2020202020203C4C333420547970653D22537472696E67222056616C75653D22456E747261646120646520C3B37264656E65732064652076656E746122202F3E0D0A2020202020203C4C333920547970653D22537472696E67222056616C75653D22496E736572696D656E746F206F7264696E652064692076656E6469746122202F3E0D0A2020202020203C4C343920547970653D22537472696E67222056616C75653D224175667472C3A4676520657266617373656E22202F3E0D0A2020202020203C4C35
3520547970653D22537472696E67222056616C75653D22456E74726164612064652070656469646F2064652076656E646122202F3E0D0A2020202020203C4C383120547970653D22537472696E67222056616C75653D22E58F97E6B3A8E382AAE383BCE38380E383BCE585A5E58A9B22202F3E0D0A2020202020203C4C383620547970653D22537472696E67222056616C75653D22E99480E594AEE8AEA2E58D95E5BD95E585A522202F3E0D0A2020202020203C4C383720547970653D22537472696E67222056616C75653D22E98AB7E594AEE8A882E596AEE98C84E585A522202F3E0D0A202020203C2F4465736372697074696F6E733E0D0A20203C2F4865616465723E0D0A20203C466F726D3E0D0A202020203C4372656174696F6E4C616E6720547970653D22537472696E67222056616C75653D223031222053464C4F6E6C793D223122202F3E0D0A202020203C416374696F6E733E0D0A2020202020203C5065726D697373696F6E73202F3E0D0A202020203C2F416374696F6E733E0D0A202020203C48656C70202F3E0D0A202020203C466F6E743E0D0A2020202020203C446566466F6E7453697A6520547970653D22496E7465676572222056616C75653D2238222053464C4F6E6C793D223122202F3E0D0A2020202020203C466F6E743E0D0A20202020202020203C4C616E677561676520547970653D22537472696E67222056616C75653D2244656661756C7422202F3E0D0A20202020202020203C466F6E744E616D6520547970653D22537472696E67222056616C75653D224D532053616E7320536572696622202F3E0D0A2020202020203C2F466F6E743E0D0A2020202020203C466F6E743E0D0A20202020202020203C4C616E677561676520547970653D22537472696E67222056616C75653D22383122202F3E0D0A20202020202020203C466F6E744E616D6520547970653D22537472696E67222056616C75653D224D5320554920476F7468696322202F3E0D0A2020202020203C2F466F6E743E0D0A202020203C2F466F6E743E0D0A202020203C436F6E74726F6C733E0D0A2020202020203C436F6E74726F6C3E0D0A20202020202020203C436F6E74726F6C5479706520547970653D22496E746567657222204644496E743D2231222056616C75653D223122202F3E0D0A20202020202020203C446973706C61795479706520547970653D22537472696E6722204644496E743D2230222056616C75653D22466F726D2057696E646F77222053464C4F6E6C793D223122202F3E0D0A20202020202020203C43617074696F6E20547970653D22537472696E6722204644496E743D2230222056616C75653D2253597C3F7C55547C3F7C3F3F3F3F3F3F22202F3E0D
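The posted dump can be checked directly: decoding the first bytes of the hex shows EF BB BF (the UTF-8 BOM) followed immediately by the XML declaration. A quick sketch over just the opening bytes of the dump:

```python
import binascii

# The first bytes of the hexadecimal dump posted above.
dump = ("EFBBBF3C3F786D6C2076657273696F6E3D22312E3022"
        "20656E636F64696E673D227574662D38223F3E")
raw = binascii.unhexlify(dump)
print(raw[:3])                   # b'\xef\xbb\xbf' - the BOM
print(raw[3:].decode("utf-8"))   # <?xml version="1.0" encoding="utf-8"?>
```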
The XML I posted so far is actually truncated, as the full XML is quite big, but I showed the beginning of it since this is the section I believe is not being handled properly. Furthermore, I am able to write the BLOB out to a file successfully using DBMS_LOB and UTL_FILE.PUT_RAW, which handles the BOM without any issues, but what I really need is to read a single node in the XML and update it, preferably using XMLTYPE directly on the BLOB.
I would welcome your suggestions on how best to read a single node and update it when the XML is contained in a BLOB.
Regards,
Mohinder -
Unable to generate XML's for BLOB datatypes from Concurrent program
Hi All,
I have a requirement to print images on an RTF layout. Images are uploaded by the end user through the attachments menu
and are stored in the FND_LOBS table.
For printing BLOB images we need to convert them into CLOBs and generate XML.
I've done the conversion through a function, and calling the function in the select query generates XML when I run it from Toad.
SELECT xmlgen.getXml(
'SELECT file_id,mob_getbase64String(file_data) photo
FROM fnd_lobs
WHERE file_id = 2490481'
,0
) FROM dual;
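For context, mob_getbase64String is presumably a helper that base64-encodes the BLOB so the binary image survives inside XML text; its definition isn't shown, so the name and behavior are assumptions here. The idea in a runnable Python sketch:

```python
import base64

def get_base64_string(file_data: bytes) -> str:
    """Encode binary file data as base64 text, safe to embed in XML."""
    return base64.b64encode(file_data).decode("ascii")

# A tiny fake "image" (the PNG signature) stands in for fnd_lobs.file_data:
photo = get_base64_string(b"\x89PNG\r\n\x1a\n")
xml = f"<ROW><FILE_ID>2490481</FILE_ID><PHOTO>{photo}</PHOTO></ROW>"
print(xml)
```

The RTF template then decodes the PHOTO element back into image bytes at render time.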
But when I registered the same thing as a concurrent program (SQL*Plus), the program runs into an error.
Output file
The XML page cannot be displayed
Cannot view XML input using style sheet. Please correct the error and then click the Refresh button, or try again later.
Invalid at the top level of the document. Error processing resource 'https://dbtdev5i.oracleoutsourcing.com/OA_CGI/FNDWRR.e...
Input truncated to 17 characters
^
Log file
Concurrent Manager encountered an error while running SQL*Plus for your concurrent request 10868311.
Review your concurrent request log and/or report output file for more detailed information.
Can anyone help me with how to get past this error and generate the XML?
Thanks in Advance
Jana
Hi Priya,
I have changed the query and registered it in Apps, and now I am able to generate XML for the BLOB image, and it is getting printed on the template.
DECLARE
v_colb CLOB;
v_query VARCHAR2(1000);
BEGIN
v_query := 'SELECT file_id,mob_getbase64String(file_data) photo
FROM fnd_lobs fl,
fnd_documents_vl fd
WHERE fd.media_id = fl.file_id
AND (fd.end_date_active IS NULL
OR fd.end_date_active > SYSDATE)
AND fd.security_type = 2
AND fd.security_id = fnd_profile.value(''GL_SET_OF_BKS_ID'')';
--FND_FILE.put_line( FND_FILE.LOG,v_query);
v_colb := xmlgen.getxml (v_query, 0);
--DBMS_OUTPUT.put_line (v_query);
FND_FILE.put_line( FND_FILE.OUTPUT,v_colb);
END;
/ -
Recovering Data from a truncated table in OraXE?
A funny thing happened: I truncated a table containing BLOBs and other data, with over 700 entries! Now I want to know whether it is feasible to recover that truncated data. I know it sounds silly, but the storage space didn't shrink after the data was truncated from the table. So, having said that, can anybody explain what happened? Can I retrieve the data I truncated from the table (without any previous backup)?
Message was edited by:
efebo_abel2002
Truncate simply resets the pointer that marks the 'end of data' in a table to the beginning of the table. The storage is still allocated, on the assumption that you want to reuse the table and want to avoid the overhead associated with extending it. The system knows there are never any rows past the 'end of data' and it won't bother looking past that point.
Recovering that data from the actual current database is about as likely as recovering data from the Windows Recycle Bin when you've set its maximum allowable storage to '0'. Technically possible, but it needs someone with a lot of experience and patience ... and absolutely no changes to the database after the truncate. An expensive proposition.
You have, of course, switched to archive log mode and been taking regular backups. (Have you not?) In that case, a point-in-time recovery, to just before the truncate, would be the solution. This brings the database (or at least a tablespace) back to a former 'life', and is quite different from restoring a single table. -
SQL*Loader and "Variable length field was truncated"
Hi,
I'm experiencing this problem using SQL*Loader: Release 8.1.7.0.0
Here is my control file (it's actually split into separate control and data files, but the result is the same)
LOAD DATA
INFILE *
APPEND INTO TABLE test
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
(
first_id,
second_id,
third_id,
language_code,
display_text VARCHAR(2000)
)
begindata
2,1,1,"eng","Type of Investment Account"
The TEST table is defined as:
Name Null? Type
FIRST_ID NOT NULL NUMBER(4)
SECOND_ID NOT NULL NUMBER(4)
THIRD_ID NOT NULL NUMBER(4)
LANGUAGE_CODE NOT NULL CHAR(3)
DISPLAY_TEXT VARCHAR2(2000)
QUESTION_BLOB BLOB
The log file displays:
Record 1: Warning on table "USER"."TEST", column DISPLAY_TEXT
Variable length field was truncated.
And the results of the insert are:
FIRST_ID SECOND_ID THIRD_ID LANGUAGE_CODE DISPLAY_TEXT
2 1 1 eng ype of Investment Account"
The language_code field is imported correctly, but display_text keeps the closing delimiter, and loses the first character of the string. In other words, it is interpreting the enclosing double quote and/or the delimiter, and truncating the first two characters.
I've also tried the following:
LOAD DATA
INFILE *
APPEND INTO TABLE test
FIELDS TERMINATED BY '|'
(
first_id,
second_id,
third_id,
language_code,
display_text VARCHAR(2000)
)
begindata
2|1|1|eng|Type of Investment Account
In this case, display_text is imported as:
pe of Investment Account
In the log file, I get this table which seems odd as well - why is the display_text column shown as having length 2002 when I explicitly set it to 2000?
Column Name Position Len Term Encl Datatype
FIRST_ID FIRST * | O(") CHARACTER
SECOND_ID NEXT * | O(") CHARACTER
THIRD_ID NEXT * | O(") CHARACTER
LANGUAGE_CODE NEXT 3 | O(") CHARACTER
DISPLAY_TEXT NEXT 2002 VARCHAR
Am I missing something totally obvious in my control and data files? I've played with various combinations of delimiters (commas vs '|'), trailing nullcols, optional enclosed etc.
Any help would be greatly appreciated!
Use CHAR instead of VARCHAR:
LOAD DATA
INFILE *
APPEND INTO TABLE test
FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"'
(
first_id,
second_id,
third_id,
language_code,
display_text CHAR(2000)
)
From the docu:
A VARCHAR field is a length-value datatype.
It consists of a binary length subfield followed by a character string of the specified length.
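That docs excerpt explains the symptom: a SQL*Loader VARCHAR reads a binary length subfield first, so plain text loaded as VARCHAR has its leading bytes consumed as a bogus length. A sketch of the length-value layout (the 2-byte little-endian subfield here is an illustration of the general idea, not the exact on-disk format for every platform):

```python
import struct

def pack_varchar(text: bytes) -> bytes:
    """Length-value encoding: a 2-byte length subfield, then the bytes."""
    return struct.pack("<H", len(text)) + text

def read_varchar(data: bytes) -> bytes:
    """Read the length subfield, then exactly that many bytes."""
    (n,) = struct.unpack_from("<H", data, 0)
    return data[2:2 + n]

# Round trip works when the field really is length-prefixed:
assert read_varchar(pack_varchar(b"Type of Investment Account")) == b"Type of Investment Account"

# Feeding plain text to a VARCHAR reader misinterprets the first two
# characters ("Ty") as a length, so leading characters vanish:
print(read_varchar(b"Type of Investment Account"))  # b'pe of Investment Account'
```

This also accounts for the 2002 shown in the log table: the declared 2000 bytes plus the 2-byte length subfield.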
http://download-west.oracle.com/docs/cd/A87860_01/doc/server.817/a76955/ch05.htm#20324 -
Which is better - BLOB, CLOB or XMLTYPE?
Hi All,
We have to store XML in one of the fields in our table.
The XML size is not fixed, but it is not so huge that we should go for external LOBs.
I mean, it will always be less than 4 GB.
Our concern here is: does a CLOB allocate only as much storage as the data actually present (something like VARCHAR2)?
How is it in case of BLOB and XMLTYPE as well?
Is there any Oracle documentation or way (code snippet) to verify this?
Thanks in advnace.
Avinash.
You have asked if CLOB allocates space like VARCHAR2 - yes, it does. Maybe this example will make it clearer:
SQL> TRUNCATE TABLE test_table;
Table truncated.
SQL> insert into test_table values(LPAD('A', 100, 'A'));
1 row created.
SQL> insert into test_table values(LPAD('A', 1000, 'A'));
1 row created.
SQL> insert into test_table values(LPAD('A', 10000, 'A'));
1 row created.
SQL> COMMIT;
Commit complete.
SQL> SELECT LENGTH(col1) FROM test_table;
LENGTH(COL1)
100
1000
4000
SQL> -
String Truncation Exception in SQL Server
I get an error when issuing an updateRow() after updateString("String Field", "Test"). My exception is:
java.sql.SQLException: [Microsoft][ODBC SQL Server Driver]String data
, right truncation
at sun.jdbc.odbc.JdbcOdbcResultSet.setPos(JdbcOdbcResultSet.java:5068)
at sun.jdbc.odbc.JdbcOdbcResultSet.updateRow(JdbcOdbcResultSet.java:4053
at exuberance.remote.JDBCClient.setFieldValue(JDBCClient.java:446)
I believe this occurs because Java stores strings in Unicode (2 bytes per character), but I'm not positive. Any help would be much appreciated.
You are passing a string that is too big.
Possibilities:
- The string you are passing is bigger than the field it is being inserted into.
- The total length of the SQL call exceeds the maximum length limit (8000?, 2174?)
- The string is actually a 'blob' type of string (greater than 8000 on MS SQL?) and you need to use the same sort of handling as for blobs to pass it. -
How much space does a blob use?
How much space does a BLOB use?
Does it allocate any space if it's empty?
It will size appropriately:
TEST.SQL>CREATE TABLE TBLOB
2 (
3 A BLOB
4 ) TABLESPACE _SANITIZED_;
TABLE CREATED.
TEST.SQL>SELECT INITIAL_EXTENT FROM DBA_TABLESPACES WHERE TABLESPACE_NAME='_SANITIZED_';
INITIAL_EXTENT
5242880
TEST.SQL>SELECT * FROM DBA_EXTENTS WHERE SEGMENT_NAME='TBLOB';
OWNER SEGMENT_NAME PARTITION_NAME SEGMENT_TYPE
TABLESPACE_NAME EXTENT_ID FILE_ID BLOCK_ID BYTES BLOCKS RELATIVE_FNO
YJAM TBLOB TABLE
_SANITIZED_ 0 16 1201289 5242880 640 16
TEST.SQL>SELECT * FROM DBA_OBJECTS WHERE CREATED > SYSDATE-1;
OWNER OBJECT_NAME
SUBOBJECT_NAME OBJECT_ID DATA_OBJECT_ID OBJECT_TYPE CREATED LAST_DDL TIMESTAMP STATUS T G S
YJAM SYS_LOB0000920875C00001$$
920876 920876 LOB 14:41:36 14:41:36 2005-11-16:14:41:36 VALID N Y N
YJAM TBLOB
920875 920875 TABLE 14:41:36 14:41:36 2005-11-16:14:41:36 VALID N N N
TEST.SQL>SELECT * FROM DBA_EXTENTS WHERE SEGMENT_NAME='SYS_LOB0000920875C00001$$';
OWNER SEGMENT_NAME PARTITION_NAME SEGMENT_TYPE
TABLESPACE_NAME EXTENT_ID FILE_ID BLOCK_ID BYTES BLOCKS RELATIVE_FNO
YJAM SYS_LOB0000920875C00001$$ LOBSEGMENT
_SANITIZED_ 0 16 1202569 5242880 640 16
TEST.SQL>DECLARE
2 VALINS VARCHAR2(4000);
3 BEGIN
4 VALINS:='1';
5 FOR I IN 1..100000
6 LOOP
7 INSERT INTO TBLOB VALUES (RPAD(VALINS,4000,'0'));
8 END LOOP;
9 END;
10 /
PL/SQL procedure successfully completed.
TEST.SQL>COMMIT;
Commit complete.
TEST.SQL>SELECT * FROM DBA_EXTENTS WHERE SEGMENT_NAME IN ('SYS_LOB0000920875C00001$$','TBLOB');
OWNER SEGMENT_NAME PARTITION_NAME SEGMENT_TYPE
TABLESPACE_NAME EXTENT_ID FILE_ID BLOCK_ID BYTES BLOCKS RELATIVE_FNO
YJAM TBLOB TABLE
_SANITIZED_ 0 16 1201289 5242880 640 16
YJAM TBLOB TABLE
_SANITIZED_ 1 17 1169929 5242880 640 17
YJAM TBLOB TABLE
_SANITIZED_ 2 18 1163529 5242880 640 18
YJAM TBLOB TABLE
_SANITIZED_ 3 19 1176969 5242880 640 19
YJAM TBLOB TABLE
_SANITIZED_ 4 5 379529 5242880 640 5
YJAM TBLOB TABLE
_SANITIZED_ 5 14 375689 5242880 640 14
YJAM TBLOB TABLE
_SANITIZED_ 51 18 1172489 5242880 640 18
YJAM TBLOB TABLE
_SANITIZED_ 52 19 1191689 5242880 640 19
YJAM SYS_LOB0000920875C00001$$ LOBSEGMENT
_SANITIZED_ 0 16 1202569 5242880 640 16
54 rows selected.
TEST.SQL>ANALYZE TABLE TBLOB COMPUTE STATISTICS;
Table analyzed.
TEST.SQL>SELECT * FROM DBA_TABLES WHERE TABLE_NAME='TBLOB';
OWNER TABLE_NAME TABLESPACE_NAME CLUSTER_NAME IOT_NAME PCT_FREE PCT_USED
INI_TRANS MAX_TRANS INITIAL_EXTENT NEXT_EXTENT MIN_EXTENTS MAX_EXTENTS PCT_INCREASE FREELISTS FREELIST_GROUPS LOG B NUM_ROWS BLOCKS EMPTY_BLOCKS AVG_SPACE CHAIN_CNT
AVG_ROW_LEN AVG_SPACE_FREELIST_BLOCKS NUM_FREELIST_BLOCKS DEGREE INSTANCES CACHE TABLE_LO SAMPLE_SIZE LAST_ANA PAR IOT_TYPE T S NES BUFFER_ ROW_MOVE GLO USE
DURATION SKIP_COR MON CLUSTER_OWNER DEPENDEN COMPRESS
YJAM TBLOB _SANITIZED_ 10
1 255 5242880 5242880 1 2147483645 0 YES N 100000 33547 373 1977 0
2042 0 0 1 1 N ENABLED 100000 14:51:22 NO N N NO DEFAULT DISABLED NO NO
DISABLED NO DISABLED DISABLED
TEST.SQL>SELECT BYTES/1024 FROM DBA_SEGMENTS WHERE SEGMENT_NAME='TBLOB';
BYTES/1024
271360
TEST.SQL>TRUNCATE TABLE TBLOB;
Table truncated.
TEST.SQL>BEGIN
2 FOR i IN 1..100000
3 LOOP
4 INSERT INTO TBLOB VALUES ('1');
5 END LOOP;
6 END;
7 /
PL/SQL procedure successfully completed.
TEST.SQL>COMMIT;
Commit complete.
TEST.SQL>ANALYZE TABLE TBLOB COMPUTE STATISTICS;
Table analyzed.
TEST.SQL>SELECT BYTES/1024 FROM DBA_SEGMENTS WHERE SEGMENT_NAME='TBLOB';
BYTES/1024
5120
Note: there is a separate LOB segment for the pointers, which I forgot in my previous post.
HTH,
Yoann.
Message was edited by:
Yoann Mainguy
Hmm, forgot that for a full comparison of the avg linesize:
TEST.SQL>SELECT * FROM DBA_TABLES WHERE TABLE_NAME='TBLOB';
OWNER TABLE_NAME TABLESPACE_NAME CLUSTER_NAME IOT_NAME PCT_FREE PCT_USED
INI_TRANS MAX_TRANS INITIAL_EXTENT NEXT_EXTENT MIN_EXTENTS MAX_EXTENTS PCT_INCREASE FREELISTS FREELIST_GROUPS LOG B NUM_ROWS BLOCKS EMPTY_BLOCKS AVG_SPACE CHAIN_CNT
AVG_ROW_LEN AVG_SPACE_FREELIST_BLOCKS NUM_FREELIST_BLOCKS DEGREE INSTANCES CACHE TABLE_LO SAMPLE_SIZE LAST_ANA PAR IOT_TYPE T S NES BUFFER_ ROW_MOVE GLO USE
DURATION SKIP_COR MON CLUSTER_OWNER DEPENDEN COMPRESS
YJAM TBLOB _SANITIZED_ 10
1 255 5242880 5242880 1 2147483645 0 YES N 100000 628 12 1223 0
41 0 0 1 1 N ENABLED 100000 14:56:56 NO N N NO DEFAULT DISABLED NO NO
DISABLED NO DISABLED DISABLED -
SQL Server 2005 (64 bit) returns truncated text from Oracle 10g (64 bit)
Hi all,
I'm no expert in Oracle so I desperately need your advice.
I'm using SQL Server 2005 (64 bit) and trying to pull data from a Oracle 10g (64 bit) server. I am able to connect to the Oracle server with no problems.
The problem came when I was trying to pull a text column (I think it's BLOB in Oracle) from the Oracle server. The results came back BUT truncated to only 100 characters. (This is a column containing event logs so the text is really really long...)
As you may already know, Microsoft does not provide a driver for this case (which is insanely ridiculous!!!). There is no standard "Microsoft OLE DB Provider for Oracle" that we could use to establish a linked server connection to Oracle. Instead, we had to install the provider supplied by Oracle (Oracle Provider for OLE DB).
I have tried to Google for solutions but didn't find much. Please help! :(
Thanks in advance,
Eleni
If this is a one-time activity, you need to find a migration utility that will do the job for you. SQL Developer would have been an option had your case been the other way round (migrating from SQL Server to Oracle).
Check out tools from SwisSQL
http://www.swissql.com/products/oracle-to-sqlserver/oracle-to-sql-server.html
Microsoft's own SSMA (I think this is what you are using already)
http://www.microsoft.com/sqlserver/2005/en/us/migration-oracle.aspx
Apart from this, if there are any issues with the tools themselves, you need to get in touch with the respective vendors. -
Strange error when downloading blob
ADF 11g
Hello
I have a pdf file uploaded to the DB as a blob column.
I am now trying to download it.
I've tried to convert the Kuba [http://kuba.zilp.pl/?id=1] example from ADF 10 to 11.
I have page showing an af:table.
Clicking on the filename column runs the following :
public BindingContainer getBindings() {
FacesContext fc = FacesContext.getCurrentInstance();
return (BindingContainer)fc.getApplication().evaluateExpressionGet(fc,"#{bindings}", BindingContainer.class);
}
public String onDownLoad() {
BindingContainer bindings = getBindings();
DCIteratorBinding itr = (DCIteratorBinding) bindings.get ("SttAttachedFilesView1Iterator");
Row r = itr.getCurrentRow();
BlobDomain file = (BlobDomain)r.getAttribute("FileContent");
String fileName = (String)r.getAttribute("FileName");
FileOperations.downloadFile( fileName, file);
return null;
}
The actual download code is:
public class FileOperations {

    public FileOperations() {
    }

    public static String getMimeType(String fileName) {
        String mime = null;
        String ext = fileName.toLowerCase();
        if (ext.endsWith(".pdf")) { mime = "application/pdf"; }
        else if (ext.endsWith(".doc")) { mime = "application/msword"; }
        else if (ext.endsWith(".ppt")) { mime = "application/vnd.ms-powerpoint"; }
        else if (ext.endsWith(".rar")) { mime = "application/octet-stream"; }
        else if (ext.endsWith(".zip")) { mime = "application/zip"; }
        return mime;
    }

    public static synchronized void downloadFile(String fileName, BlobDomain blobDomain) {
        FacesContext facesContext = FacesContext.getCurrentInstance();
        ExternalContext extContext = facesContext.getExternalContext();
        Long length = blobDomain.getLength();
        String fileType = getMimeType(fileName);
        HttpServletResponse response = (HttpServletResponse) extContext.getResponse();
        response.setHeader("Content-Disposition", "attachment;filename=\"" + fileName + "\"");
        response.setContentLength((int) length.intValue());
        response.setContentType(fileType);
        try {
            InputStream in = blobDomain.getBinaryStream();
            OutputStream out = response.getOutputStream();
            byte[] buf = new byte[1024];
            int count;
            while ((count = in.read(buf)) >= 0) {
                out.write(buf, 0, count);
            }
            in.close();
            out.flush();
            out.close();
            facesContext.responseComplete();
        } catch (IOException ex) {
            System.out.println(ex.getMessage());
            ex.printStackTrace();
        }
    }
}Whats happening is the first time I click on the filename the file is downloaded from the DB correctly.
If I click a second time I get the following errors :
<25 juil. 2009 12.54. h CEST> <Error> <HTTP> <BEA-101083> <Connection failure.
java.net.ProtocolException: Didn't meet stated Content-Length, wrote: '0' bytes instead of stated: '156751' bytes.
at weblogic.servlet.internal.ServletOutputStreamImpl.ensureContentLength(ServletOutputStreamImpl.java:425)
at weblogic.servlet.internal.ServletResponseImpl.ensureContentLength(ServletResponseImpl.java:1451)
at weblogic.servlet.internal.ServletResponseImpl.send(ServletResponseImpl.java:1494)
at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1437)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
Truncated. see log file for complete stacktrace
>
<25 juil. 2009 12.54. h CEST> <Error> <HTTP> <BEA-101104> <Servlet execution in servlet context "ServletContext@18785780[app:UpLoadDownload module:UpLoadDownload-ViewController-context-root path:/UpLoadDownload-ViewController-context-root spec-version:2.5]" failed, java.net.ProtocolException: Didn't meet stated Content-Length, wrote: '0' bytes instead of stated: '156751' bytes..
java.net.ProtocolException: Didn't meet stated Content-Length, wrote: '0' bytes instead of stated: '156751' bytes.
at weblogic.servlet.internal.ServletOutputStreamImpl.ensureContentLength(ServletOutputStreamImpl.java:425)
at weblogic.servlet.internal.ServletResponseImpl.ensureContentLength(ServletResponseImpl.java:1451)
at weblogic.servlet.internal.ServletResponseImpl.send(ServletResponseImpl.java:1494)
at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1437)
at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
Truncated. see log file for complete stacktrace
>

It's as if the InputStream is empty the second time round, even though the length is correct.
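For reference, the copy loop itself can be written so each stream is closed exactly once, after the loop has drained everything. This in-memory sketch uses ByteArrayInputStream and ByteArrayOutputStream as stand-ins for the BlobDomain stream and the servlet output stream (those stand-ins are my assumption, to keep the sketch runnable without a container):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class StreamCopy {

    // Copies everything from in to out and returns the byte count.
    // Closing the streams is left to the caller.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[1024];
        long total = 0;
        int count;
        while ((count = in.read(buf)) != -1) { // -1 signals end of stream
            out.write(buf, 0, count);
            total += count;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        byte[] src = new byte[156751]; // same size as the failing download
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        // try-with-resources guarantees both streams are closed, even on error
        try (InputStream in = new ByteArrayInputStream(src);
             OutputStream out = sink) {
            long written = copy(in, out);
            System.out.println("wrote " + written + " bytes");
        }
    }
}
```

The key point is that nothing closes `in` inside the loop; if the input stream is closed early, the response body can end up shorter than the declared Content-Length, which is exactly the mismatch WebLogic complains about.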
If I close the application and restart I can download the file again.
I've added an execute button to re-execute the query after the first download but the problem persists.
Anybody got any ideas ?
Regards
Paul

If I add
OperationBinding operation = getBindings().getOperationBinding("Rollback");
operation.execute();

to the end of the onDownLoad method, the error disappears and everything works as expected.
If I replace Rollback with either Commit or Execute the problem continues.
What's going on here? It's as if the blob field isn't really there...
Regards
Paul
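One possible explanation (an assumption on my part, not confirmed anywhere in this thread) is that the BlobDomain hands back a stream that can only be consumed once, so the second click reads from an already-exhausted stream; the Rollback would then force the row, and with it a fresh LOB locator, to be re-fetched. Any plain InputStream shows the same one-shot behaviour:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;

public class OneShotStream {

    // Reads the stream to the end and counts how many bytes were left.
    static int drain(InputStream in) throws IOException {
        byte[] buf = new byte[64];
        int total = 0, count;
        while ((count = in.read(buf)) != -1) {
            total += count;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream("some blob bytes".getBytes());
        System.out.println(drain(in)); // first pass drains everything: 15
        System.out.println(drain(in)); // second pass finds nothing: 0
    }
}
```

If that is the cause, the Rollback "fix" works by discarding the cached row state rather than by undoing any change.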
Bump - because I've been on holiday :-)
Edited by: Paul (MITsa) on 10-Aug-2009 04:20 -
BLOB Columns Inserts/Updates ???
Please let me know how SQL*Loader can update just the BLOB columns without truncate/replace. We want to do incremental updates to the BLOB column. If this is not possible, please let me know any other process that can do this with the best performance; we have a huge amount of data that needs to be processed. Please let me know .......
e-mail from the developer.
Is there something that we or the DBAs can do to improve performance on these types of inserts/updates with BLOBs? Currently we are executing batches of inserts/updates to the 5 IMAGE_* tables, and those are triggering inserts into the IMAGE*_HISTORY tables.
An EXTERNAL TABLE is an available option.
How fast it is depends upon the SQL and design implementation. -
JDBC & BLOBS not adhering to JDBC Interface
It appears that the Oracle JDBC classes do not support the "setBinaryStream" (nor truncate) methods of the Blob class; "getBinaryStream" is supported.
Why do we have to cast the Blob to an oracle.sql.BLOB and use "getBinaryOutputStream" in order to get an output stream? This means that more database-specific encapsulation is required when dealing with blobs.
It seems to be so close. Surely "setBinaryStream" could be implemented on top of BLOB.getBinaryOutputStream.
I'm using the 9i JDBC classes with Oracle 8.1.6. Is there a configuration problem?
Am I missing something? Can I use Blobs and Oracle using the standard JDBC interfaces?
Jason

Hi,
I found the LOB trim method in the classes12.jar distributed with
Oracle 9.2.
Regards
Pete -
Blobs in XML documents.
I'm using an xml file to update the database. Everything runs smoothly until I hit a BLOB field. I'm using xmldom.getnodevalue to get the value associated with the node, and if the node contains a BLOB value, xmldom.getnodevalue gives a truncation error.
The xml looks like this:
<process_code>
<code>0003</code>
<table_name>Test1</table_name>
<column_values>
<id>1234</id>
<blob1>Insert blob here ............... end of blob</blob1>
</column_values>
</process_code>
Is there a way to get the value of the node that contains a BLOB?
Thanks
/kp

Would you provide sample test data and code for this?
Thanks.
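As a general note (not specific to the xmldom package): raw binary data cannot be carried verbatim inside an XML text node, so the usual approach is to base64-encode the BLOB bytes when writing the document and decode them on the way back in. A minimal Java sketch of the round trip, with hypothetical helper names:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class BlobInXml {

    // Encode raw BLOB bytes into a text-safe form for an XML element.
    static String encodeForXml(byte[] blobBytes) {
        return Base64.getEncoder().encodeToString(blobBytes);
    }

    // Decode the element text back into the original bytes.
    static byte[] decodeFromXml(String elementText) {
        return Base64.getDecoder().decode(elementText);
    }

    public static void main(String[] args) {
        byte[] original = "Insert blob here".getBytes(StandardCharsets.UTF_8);
        String xmlSafe = encodeForXml(original);
        System.out.println("<blob1>" + xmlSafe + "</blob1>");
        byte[] roundTripped = decodeFromXml(xmlSafe);
        System.out.println(new String(roundTripped, StandardCharsets.UTF_8));
    }
}
```

With this scheme the node value read back by the XML parser is plain text, so no truncation of binary content can occur at the parsing step; the decode happens afterwards in application code.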