CLOB vs NCLOB
Hi Everyone,
What should be the deciding factor when choosing between a CLOB and an NCLOB?
Thank you
Why read the answer here when you can read it in the documentation (below)?
http://download.oracle.com/docs/cd/B19306_01/server.102/b14200/sql_elements001.htm#sthref172
Edited by: sb92075 on Jun 15, 2010 6:26 PM
Similar Messages
-
CLOB to NCLOB conversion in Oracle 9i
Hi,
I have a table with 8 million records. The table has a CLOB column that has to be converted into an NCLOB column. I have written a procedure for the CLOB-to-NCLOB conversion using cursors and the LIMIT functionality. The procedure is given as follows:
CREATE OR REPLACE PROCEDURE pr_clob_to_nclob IS
  TYPE type_dockey IS TABLE OF inf_doc_store.ds_doc_key%TYPE INDEX BY PLS_INTEGER;
  l_type_dockey type_dockey;
  TYPE type_rowid IS TABLE OF VARCHAR(100) INDEX BY PLS_INTEGER;
  l_type_rowid type_rowid;
  l_numerrors NUMBER := 0;
  e_bulkins_ilchis_exception EXCEPTION;
  e_bulkins_elchis_exception EXCEPTION;
  e_bulkins_ilctxn_exception EXCEPTION;
  e_bulkins_elctxn_exception EXCEPTION;
  l_err_message inf_error_log.error_text%TYPE;
  l_file UTL_FILE.FILE_TYPE;
  CURSOR l_doc_store IS
    SELECT ds_doc_key, rowid
      FROM inf_doc_store
     WHERE ds_doc_content1 IS NULL;
BEGIN
  --OPEN l_doc_store FOR 'select ds_doc_key,rowid from inf_doc_store1';
  OPEN l_doc_store;
  LOOP
    FETCH l_doc_store BULK COLLECT INTO l_type_dockey, l_type_rowid LIMIT 5000;
    BEGIN
      FORALL docidx IN 1 .. l_type_dockey.COUNT SAVE EXCEPTIONS
        UPDATE inf_doc_store
           SET ds_doc_content1 = TO_NCLOB(ds_doc_content)
         WHERE ROWID = l_type_rowid(docidx);
    EXCEPTION
      WHEN OTHERS THEN
        l_numerrors := SQL%BULK_EXCEPTIONS.COUNT;
        FOR idx IN 1 .. l_numerrors LOOP
          l_err_message := 'on bulk insert : Error in Doc key << ' ||
                           l_type_dockey(idx) || ' >> index << ' ||
                           SQL%BULK_EXCEPTIONS(idx).ERROR_INDEX || ' IS ' ||
                           SQLERRM(0 - SQL%BULK_EXCEPTIONS(idx).ERROR_CODE) || ' >>';
          PKG_INF_COMMON.PR_LOG_ERROR('CLOB_TO_NCLOB Migration',
                                      'inf_doc_store',
                                      l_err_message,
                                      'NA');
          PKG_INF_COMMON.PR_UTL_PUTLINE(l_file,
                                        l_err_message,
                                        'NA');
        END LOOP;
        PKG_INF_COMMON.PR_UTL_FCLOSE(l_file, 'inf_doc_store_updation');
        RAISE e_bulkins_ilchis_exception;
    END;
    COMMIT;
    EXIT WHEN l_doc_store%NOTFOUND;
  END LOOP;
  CLOSE l_doc_store;
END pr_clob_to_nclob;
The table in which the clob column is to be converted is inf_doc_header.
The above procedure runs successfully but takes more than 48 hours to convert all the records.
Can anybody suggest a way to optimize the procedure?
Regards,
Siddarth -
Alter table reaching the max number of char 4000, CLOB? NCLOB?
I have a column in a table defined as VARCHAR2(4000).
I try to alter the table
alter table sybaapc
MODIFY SYBAAPC_EXTRACURRICULAR VARCHAR2(5000)
but it gives me this error:
ORA-00910: specified length too long for its datatype
I want the users to be able to enter more information on that column.
It seems that 4000 is the max; we are on Oracle 10g.
Can I change it to CLOB or NCLOB? We already have data in that table.
What are the implications of changing it to CLOB or NCLOB?

That's because the maximum character length for a table column is 4000.
Alternatively you can:
SQL> create table test_1 (col1 VARCHAR2(4000))
2 /
Table created.
SQL> insert into test_1 values(RPAD('*',4000,'*'));
1 row created.
SQL> commit;
Commit complete.
SQL> alter table test_1 modify (col1 clob);
alter table test_1 modify (col1 clob)
ERROR at line 1:
ORA-22858: invalid alteration of datatype
SQL> alter table test_1 add(col2 clob);
Table altered.
SQL> update test_1 set col2 = col1;
1 row updated.
SQL> commit;
Commit complete.
SQL> alter table test_1 drop column col1;
Table altered.
SQL> alter table test_1 rename column col2 to col1;
Table altered.
SQL> desc test_1
Name Null? Type
COL1 CLOB
SQL>
SQL> select * from test_1;
COL1
SQL> Edited by: AP on Aug 24, 2010 7:41 AM -
Hello Everyone,
Before I go to my question let me give you the context. I wanted to upload the description of a set of products with their IDs into my database. Hence I created a table 'demo' with two columns of INT and CLOB datatypes using the following script. *create table demo ( id int primary key, theclob Clob );*
Then I create a directory using the following script, *Create Or Replace Directory MY_FILES as 'C:\path of the folder.......\';*
In the above mentioned directory I create one .txt file for each product with the description of the product. Using the below script I created a procedure to load the contents of the .txt files into my 'demo' table.
CREATE OR REPLACE
PROCEDURE LOAD_A_FILE( P_ID IN NUMBER, P_FILENAME IN VARCHAR2 ) AS
  L_CLOB CLOB;
  L_BFILE BFILE;
BEGIN
  INSERT INTO DEMO VALUES ( P_ID, EMPTY_CLOB() )
  RETURNING THECLOB INTO L_CLOB;
  L_BFILE := BFILENAME( 'MY_FILES', P_FILENAME );
  DBMS_LOB.FILEOPEN( L_BFILE );
  DBMS_LOB.LOADFROMFILE( L_CLOB, L_BFILE,
                         DBMS_LOB.GETLENGTH( L_BFILE ) );
  DBMS_LOB.FILECLOSE( L_BFILE );
END;
After which I called the procedure using, *exec load_a_file(1, 'filename.txt' );*
When I queried the table (select * from demo;) I got the following output, which is all fine.
ID THECLOB
1 "product x is an excellent way to improve your production process and enhance your turnaround time....."
_*QUESTION*_
When I did the exact same thing on my friend's machine and queried the demo table, I got garbage values in the 'theclob' column (as shown below). The only difference is that mine is an Enterprise Edition of Oracle 11.2.0.1 and my friend's is an Express Edition of Oracle 11.2.0.2. Does this have anything to do with the problem?
1 猺⁁摶慮捥搠摡瑡潬汥捴楯渠捡灡扩汩瑩敳㨠扡牣潤攠獣慮湩湧Ⱐ灡湩挠慬敲琬⁷潲欠潲摥爠浡湡来浥湴Ⱐ睩牥汥獳潲浳湤異敲癩獯爠瑩浥湴特⸊潭整⁍潢楬攠坯牫敲㨠周攠浯獴潢畳琠灡捫慧攮⁐牯癩摥猠扵獩湥獳敳⁷楴栠愠捯浰汥瑥汹⁷楲敬敳猠潰敲慴楯湡氠浡湡来浥湴祳瑥洮⁉湣汵摥猠慬氠潦⁃潭整⁔牡捫敲❳敡瑵牥猠灬畳㨠䍡汥湤慲猬畴潭慴敤畳瑯浥爠捯浭畮楣慴楯湳Ⱐ睯牫牤敲⽩湶潩捥⁵灤慴楮朠晲潭⁴桥楥汤Ⱐ睯牫牤敲敱略湣楮本硣敳獩癥瑯瀠瑩浥汥牴猬⁷楲敬敳猠景牭猬⁴畲渭批畲渠癯楣攠湡癩条瑩潮Ⱐ慮搠浯牥⸊ੁ摶慮捥搠坩
2 ≁否吠潦晥牳摶慮捥搠睩牥汥獳潲浳慰慢楬楴礠睩瑨⁃潭整⁅娠䍯浥琬⁔牡捫敲湤⁃潭整⁍潢楬攠坯牫敲ਊ䍯浥琠䕚㨠周攠浯獴潢畳琬潳琠敦晥捴楶攠睥戠扡獥搠䵒䴠慰灬楣慴楯渠楮⁴桥湤畳瑲礮⁃慰慢楬楴楥猠楮捬畤攠䝐匠汯捡瑩潮⁴牡捫楮本⁷楲敬敳猠瑩浥汯捫Ⱐ来漭晥湣楮朠睩瑨汥牴猬灥敤湤瑯瀠瑩浥汥牴猬湤渭摥浡湤爠獣桥摵汥搠牥灯牴楮朮ਊ䍯浥琠呲慣步爺⁁⁰潷敲晵氠捬楥湴ⵢ慳敤⁰污瑦潲洠瑨慴晦敲猠慬氠瑨攠晥慴畲敳映䍯浥琠䕚⁰汵猺⁁摶慮捥搠摡瑡潬汥捴楯渠捡灡扩汩瑩敳㨠扡牣潤攠獣慮湩湧Ⱐ灡湩挠慬敲琬⁷潲欠潲摥爠浡湡来浥湴Ⱐ睩牥汥獳潲浳湤異敲癩獯爠瑩浥湴特⸊潭整⁍潢楬攠坯牫敲㨠周攠浯獴潢畳琠灡捫慧攮⁐牯癩摥猠扵獩湥獳敳⁷楴栠愠捯浰汥瑥汹⁷楲敬敳猠潰敲慴楯湡氠浡湡来浥湴祳瑥洮⁉湣汵摥猠慬氠潦⁃潭整⁔牡捫敲❳敡瑵牥猠灬畳㨠䍡汥湤慲猬畴潭慴敤畳瑯浥爠捯浭畮楣慴楯湳Ⱐ睯牫牤敲⽩湶潩捥⁵灤慴楮朠晲潭⁴桥楥汤Ⱐ睯牫牤敲敱略湣楮本硣敳獩癥瑯瀠瑩浥汥牴猬⁷楲敬敳猠景牭猬⁴畲渭批畲渠癯楣攠湡癩条瑩潮Ⱐ慮搠浯牥⸊ੁ摶慮捥搠坩牥汥獳⁆潲浳㨠呵牮湹⁰慰敲潲洠楮瑯⁷楲敬敳猠捬潮攠潦⁴桥慭攠楮景牭慴楯渠ⴠ湯慴瑥爠桯眠捯浰汩捡瑥搮⁓慶攠瑩浥礠瑲慮獦敲物湧湦潲浡瑩潮慣欠瑯⁴桥晦楣攠睩瑨⁷楲敬敳猠獰敥搮⁓慶攠灡灥爠慮搠敬業楮慴攠摵慬•ഊ
3 ≁䥒呉䵅⁍慮慧敲牯洠䅔♔⁰牯癩摥猠愠浯扩汥灰汩捡瑩潮猠摥獩杮敤⁴漠瑲慣欠扩汬慢汥潵牳⸠⁔桥⁁㑐潬畴楯湳畴潭慴楣慬汹潧⁷楲敬敳猠敭慩氬慬汳Ⱐ慮搠扩汬慢汥癥湴猬獳潣楡瑥猠瑨敭⁷楴栠捬楥湴爠灲潪散琠捯摥猠慮搠摩牥捴猠扩汬慢汥散潲摳⁴漠扩汬楮朠獹獴敭献†周攠呩浥乯瑥潬畴楯湳⁰牯癩摥汩浭敤潷渠數灥物敮捥Ⱐ慬汯睩湧潲牥慴楯渠潦慮畡氠扩汬慢汥癥湴献†周敲攠慲攠瑷漠癥牳楯渠潦⁁㑐湤⁔業敎潴攮ਊ䭥礠䙥慴畲敳㨊⨠䥮捬畤攠捡灴畲攠慤潣楬污扬攠敶敮瑳ਪ⁃慰瑵牥潢楬攠灨潮攠捡汬湤浡楬†慳楬污扬攠敶敮瑳Ⱐਪ⁁扩汩瑹⁴漠慳獩杮楬污扬攠敶敮琠瑯汩敮琠慮搠灲潪散琊⨠䅢楬楴礠瑯敡牣栠慮搠獣牯汬⁴桲潵杨楬污扬攠敶敮瑳Ⱐ潰瑩潮⁴漠楮瑥杲慴攠睩瑨楬汩湧祳瑥浳 ⨠偯瑥湴楡氠扥湥晩瑳湣汵摥湣牥慳敤⁰牯摵捴楶楴礠慮搠牥摵捥搠慤浩湩獴牡瑩癥癥牨敡搠湤湣牥慳敤敶敮略略⁴漠浯牥捣畲慴攠捡灴畲楮朠潦楬污扬攠敶敮瑳•ഊ
4 ≁灲楶慐慹⁁乄⁁灲楶慐慹⁐牯晥獳楯湡氠晲潭⁁否吠瑵牮⁹潵爠浯扩汥敶楣攠楮瑯⁰潲瑡扬攠捲敤楴慲搠瑥牭楮慬⸠坩瑨潭灡瑩扬攠䅔♔浡牴灨潮攬⁁灲楶慐慹爠䅰物癡偡礠偲潦敳獩潮慬潦瑷慲攬湤敲捨慮琠慣捯畮琬⁹潵爠浯扩汥⁷潲武潲捥慮⁰牯捥獳牥摩琠潲敢楴慲搠灡祭敮瑳牯洠瑨攠晩敬搮ਊ䭥礠䙥慴畲敳㨠 ⨠卭慲瑰桯湥ⵢ慳敤潬畴楯渠⁴漠灲潣敳猠捲敤楴慲搠灡祭敮瑳 ⨠䙵汬ⵦ敡瑵牥搠灯楮琭潦慬攠獯汵瑩潮異灯牴楮朠慬氠浡橯爠瑲慮獡捴楯渠瑹灥ਠ⨠卵灰潲瑳牥摩琠慮搠摥扩琠瑲慮獡捴楯湳 ਊ∍
To make sure that the .txt files are accessible in the directory I executed the following script, Host Echo Hello World > C:\...path...\1.Txt
After which I found the contents of the file changed to "Hello World". Later I loaded the .txt file with "Hello World" and queried the table. Still I am getting some garbage value. However since the string "Hello World" is much smaller than the previous contents, the garbage size is also smaller for ID 1. I don't get any errors, but you can see the output as follows.
1 䠀攀氀氀漀 圀漀爀氀搀 ഀ
2 ≁否吠潦晥牳摶慮捥搠睩牥汥獳潲浳慰慢楬楴礠睩瑨⁃潭整⁅娠䍯浥琬⁔牡捫敲湤⁃潭整⁍潢楬攠坯牫敲ਊ䍯浥琠䕚㨠周攠浯獴潢畳琬潳琠敦晥捴楶攠睥戠扡獥搠䵒䴠慰灬楣慴楯渠楮⁴桥湤畳瑲礮⁃慰慢楬楴楥猠楮捬畤攠䝐匠汯捡瑩潮⁴牡捫楮本⁷楲敬敳猠瑩浥汯捫Ⱐ来漭晥湣楮朠睩瑨汥牴猬灥敤湤瑯瀠瑩浥汥牴猬湤渭摥浡湤爠獣桥摵汥搠牥灯牴楮朮ਊ䍯浥琠呲慣步爺⁁⁰潷敲晵氠捬楥湴ⵢ慳敤⁰污瑦潲洠瑨慴晦敲猠慬氠瑨攠晥慴畲敳映䍯浥琠䕚⁰汵猺⁁摶慮捥搠摡瑡潬汥捴楯渠捡灡扩汩瑩敳㨠扡牣潤攠獣慮湩湧Ⱐ灡湩挠慬敲琬⁷潲欠潲摥爠浡湡来浥湴Ⱐ睩牥汥獳潲浳湤異敲癩獯爠瑩浥湴特⸊潭整⁍潢楬攠坯牫敲㨠周攠浯獴潢畳琠灡捫慧攮⁐牯癩摥猠扵獩湥獳敳⁷楴栠愠捯浰汥瑥汹⁷楲敬敳猠潰敲慴楯湡氠浡湡来浥湴祳瑥洮⁉湣汵摥猠慬氠潦⁃潭整⁔牡捫敲❳敡瑵牥猠灬畳㨠䍡汥湤慲猬畴潭慴敤畳瑯浥爠捯浭畮楣慴楯湳Ⱐ睯牫牤敲⽩湶潩捥⁵灤慴楮朠晲潭⁴桥楥汤Ⱐ睯牫牤敲敱略湣楮本硣敳獩癥瑯瀠瑩浥汥牴猬⁷楲敬敳猠景牭猬⁴畲渭批畲渠癯楣攠湡癩条瑩潮Ⱐ慮搠浯牥⸊ੁ摶慮捥搠坩牥汥獳⁆潲浳㨠呵牮湹⁰慰敲潲洠楮瑯⁷楲敬敳猠捬潮攠潦⁴桥慭攠楮景牭慴楯渠ⴠ湯慴瑥爠桯眠捯浰汩捡瑥搮⁓慶攠瑩浥礠瑲慮獦敲物湧湦潲浡瑩潮慣欠瑯⁴桥晦楣攠睩瑨⁷楲敬敳猠獰敥搮⁓慶攠灡灥爠慮搠敬業楮慴攠摵慬•ഊ
3 ≁䥒呉䵅⁍慮慧敲牯洠䅔♔⁰牯癩摥猠愠浯扩汥灰汩捡瑩潮猠摥獩杮敤⁴漠瑲慣欠扩汬慢汥潵牳⸠⁔桥⁁㑐潬畴楯湳畴潭慴楣慬汹潧⁷楲敬敳猠敭慩氬慬汳Ⱐ慮搠扩汬慢汥癥湴猬獳潣楡瑥猠瑨敭⁷楴栠捬楥湴爠灲潪散琠捯摥猠慮搠摩牥捴猠扩汬慢汥散潲摳⁴漠扩汬楮朠獹獴敭献†周攠呩浥乯瑥潬畴楯湳⁰牯癩摥汩浭敤潷渠數灥物敮捥Ⱐ慬汯睩湧潲牥慴楯渠潦慮畡氠扩汬慢汥癥湴献†周敲攠慲攠瑷漠癥牳楯渠潦⁁㑐湤⁔業敎潴攮ਊ䭥礠䙥慴畲敳㨊⨠䥮捬畤攠捡灴畲攠慤潣楬污扬攠敶敮瑳ਪ⁃慰瑵牥潢楬攠灨潮攠捡汬湤浡楬†慳楬污扬攠敶敮瑳Ⱐਪ⁁扩汩瑹⁴漠慳獩杮楬污扬攠敶敮琠瑯汩敮琠慮搠灲潪散琊⨠䅢楬楴礠瑯敡牣栠慮搠獣牯汬⁴桲潵杨楬污扬攠敶敮瑳Ⱐ潰瑩潮⁴漠楮瑥杲慴攠睩瑨楬汩湧祳瑥浳 ⨠偯瑥湴楡氠扥湥晩瑳湣汵摥湣牥慳敤⁰牯摵捴楶楴礠慮搠牥摵捥搠慤浩湩獴牡瑩癥癥牨敡搠湤湣牥慳敤敶敮略略⁴漠浯牥捣畲慴攠捡灴畲楮朠潦楬污扬攠敶敮瑳•ഊ
4 ≁灲楶慐慹⁁乄⁁灲楶慐慹⁐牯晥獳楯湡氠晲潭⁁否吠瑵牮⁹潵爠浯扩汥敶楣攠楮瑯⁰潲瑡扬攠捲敤楴慲搠瑥牭楮慬⸠坩瑨潭灡瑩扬攠䅔♔浡牴灨潮攬⁁灲楶慐慹爠䅰物癡偡礠偲潦敳獩潮慬潦瑷慲攬湤敲捨慮琠慣捯畮琬⁹潵爠浯扩汥⁷潲武潲捥慮⁰牯捥獳牥摩琠潲敢楴慲搠灡祭敮瑳牯洠瑨攠晩敬搮ਊ䭥礠䙥慴畲敳㨠 ⨠卭慲瑰桯湥ⵢ慳敤潬畴楯渠⁴漠灲潣敳猠捲敤楴慲搠灡祭敮瑳 ⨠䙵汬ⵦ敡瑵牥搠灯楮琭潦慬攠獯汵瑩潮異灯牴楮朠慬氠浡橯爠瑲慮獡捴楯渠瑹灥ਠ⨠卵灰潲瑳牥摩琠慮搠摥扩琠瑲慮獡捴楯湳 ਊ∍
Edited by: Arunkumar Gunasekaran on Jan 3, 2013 11:38 AM>
To make sure that the .txt files are accessible in the directory I executed the following script, Host Echo Hello World > C:\...path...\1.Txt
After which I found the contents of the file changed to "Hello World". Later I loaded the .txt file with "Hello World" and queried the table. Still I am getting some garbage value. However since the string "Hello World" is much smaller than the previous contents, the garbage size is also smaller for ID 1. I don't get any errors, but you can see the output as follows.
>
The most common problem I have seen using BFILEs is the character set; BFILEs do NOT handle character set conversion.
That is the main reason I don't recommend using BFILEs for loading data like this. Either SQL*Loader or external tables can do the job and they both handle character set conversions properly.
See the LOADFROMFILE Procedure of DBMS_LOB package in the PL/SQL Language doc
http://docs.oracle.com/cd/B28359_01/appdev.111/b28419/d_lob.htm#i998778
>
Note:
The input BFILE must have been opened prior to using this procedure. No character set conversions are performed implicitly when binary BFILE data is loaded into a CLOB. The BFILE data must already be in the same character set as the CLOB in the database. No error checking is performed to verify this.
Note:
If the character set is varying width, UTF-8 for example, the LOB value is stored in the fixed-width UCS2 format. Therefore, if you are using DBMS_LOB.LOADFROMFILE, the data in the BFILE should be in the UCS2 character set instead of the UTF-8 character set. However, you should use sql*loader instead of LOADFROMFILE to load data into a CLOB or NCLOB because sql*loader will provide the necessary character set conversions.
>
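The garbled NCLOB output shown earlier in this thread is consistent with that note: LOADFROMFILE copies the file's bytes into the LOB verbatim, so a file that is not already in the LOB's character set gets reinterpreted rather than converted. A minimal Python sketch (an illustration of the mis-decoding only, assuming the "Hello World" file was written as little-endian UTF-16, as Windows tools often do):

```python
# "Hello World" saved as little-endian UTF-16, then read back as
# big-endian UTF-16 code units (Oracle's AL16UTF16 is big-endian):
raw = "Hello World".encode("utf-16-le")
garbled = raw.decode("utf-16-be")
print(garbled)  # '䠀攀氀氀漀...' -- the same kind of output shown for ID 1
```

The fix is as described: let SQL*Loader or an external table perform the character set conversion instead of loading raw bytes.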
I suggest you use an external table definition to do this load. You can do an ALTER to change the file name for each load.
See External Tables Concepts in the Utilities doc for the basics
http://docs.oracle.com/cd/B28359_01/server.111/b28319/et_concepts.htm
See Altering External Tables in the DBA doc for detailed information
http://docs.oracle.com/cd/B28359_01/server.111/b28310/tables013.htm
>
DEFAULT DIRECTORY
Changes the default directory specification
ALTER TABLE admin_ext_employees
DEFAULT DIRECTORY admin_dat2_dir;
LOCATION
Allows data sources to be changed without dropping and re-creating the external table metadata
ALTER TABLE admin_ext_employees
LOCATION ('empxt3.txt',
'empxt4.txt');
>
You can also load in parallel if you have licensed that option. -
Problem with Unicode and Oracle NCLOB fields
When I try to INSERT a new (N)CLOB into an Oracle database, all is fine until I use a non-ASCII character, such as an accented roman letter, like the "é" (that's '\u00E9') in "café" or the Euro currency symbol "€" (that's '\u20AC' as a Java character literal, just in case the display is corrupted here too). This doesn't happen with "setString", but does happen when streaming characters to the CLOB; however, as Oracle or the driver refuses strings larger than 4000 characters, and as I need to support all the above symbols (and many more), I'm stuck.
Here's the background to the problem (I've tried to be detailed, after a lot of looking around on the web, I've seen lots of people with similar problems, but no solutions: I've seen and been able to stream ASCII clobs, or add small NCHAR strings, but not stream NCLOBs...).
I'm using Oracle 9.2.0.1.0 with the "thin" JDBC driver, on a Windows box (XP Pro). My database instance is set up with AL32UTF8 as the database encoding, and UTF8 as the national character set.. I've created a simple user/schema, called LOBTEST, in which I created two tables (see below).
The basic problems are :
- with Oracle and JDBC, you can't set the value of a CLOB or NCLOB with PreparedStatement's setString or setCharacterStream methods (as it throws an exception when you send more than 4000 characters)
- with Oracle, you can only have one LONG VARCHAR-type field per table (according to their documentation) and you MUST read all columns in a set order (amongst other limitations).
- with a SQL INSERT command, there's no way to set the value of a parameter that's a CLOB (implementations of the CLOB interface can only be obtained by performing a SELECT.... but obviously, when I'm inserting, the record doesn't exist yet...). Workarounds include (possibly) JDBC 4 (doesn't exist yet...) or doing the following Oracle-specific stuff :
INSERT INTO MyTable (theID,theCLOB) VALUES (1, empty_clob());
SELECT * FROM MyTable WHERE theId = 1;
...and getting the empty CLOB back (via a ResultSet), and populating it. I have a very large application, that's deployed for many of our customers using SapDB and MySQL without a hitch, with "one-step" INSERTS; I can't feasibly change the application into "three-step INSERT-SELECT-UPDATE" just for Oracle, and I shouldn't need to!!!
The final workaround is to use Oracle-specific classes, described in:
http://download-east.oracle.com/otn_hosted_doc/jdeveloper/904preview/jdbc-javadoc/index.html
...such as CLOB (see my example). This works fine until I add some non-ASCII characters, at which point, irrespective of whether the CLOB data is 2 characters or 2 million characters, it throws the same exception:
java.io.IOException: Il n'y a plus de données à lire dans le socket
at oracle.jdbc.dbaccess.DBError.SQLToIOException(DBError.java:716)
at oracle.jdbc.driver.OracleClobWriter.flushBuffer(OracleClobWriter.java:270)
at oracle.jdbc.driver.OracleClobWriter.flush(OracleClobWriter.java:204)
at scratchpad.InsertOracleClobExample.main(InsertOracleClobExample.java:61)

...where the error message in English is "No more data to read from socket". I need the Oracle-specific "setFormOfUse" method to force it to correctly use the encoding of the NCLOB field; without it, even plain ASCII data is rejected with an exception indicating that the character set is inappropriate. With a plain CLOB, I don't need it, but the plain CLOB refuses my non-ASCII data anyway.
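A contributing factor worth noting (my own observation, not from the original post): the 4000 limit interacts with encoding, because non-ASCII characters occupy more than one byte in UTF-8, so the byte length of a string can exceed its character count well before 4000 characters. A small Python illustration:

```python
s = "caf\u00e9 4,90\u20ac TTC"   # "café 4,90€ TTC"
print(len(s))                    # 14 characters
print(len(s.encode("utf-8")))    # 17 bytes: é takes 2 bytes, € takes 3
```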
So, many many thanks in advance for any advice. The remainder of my post includes my code example and a simple SQL script to create the table(s). You can mess around with the source code to test various combinations.
Thanks,
Chris B.
CREATE TABLE NCLOBTEST (
  ID INTEGER NOT NULL,
  SOMESTRING NCLOB,
  PRIMARY KEY (ID)
);

CREATE TABLE CLOBTEST (
  ID INTEGER NOT NULL,
  SOMESTRING CLOB,
  PRIMARY KEY (ID)
);
package scratchpad;
import java.io.Writer;
import java.sql.Connection;
import java.sql.Driver;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.Properties;
import oracle.jdbc.driver.OracleDriver;
import oracle.jdbc.driver.OraclePreparedStatement;
import oracle.sql.CLOB;
public class InsertOracleClobExample
{
    public static void main(String[] args)
    {
        Properties jdbcProperties = new Properties();
        jdbcProperties.setProperty( "user", "LOBTEST" );
        jdbcProperties.setProperty( "password", "LOBTEST" );
        // jdbcProperties.setProperty("oracle.jdbc.defaultNChar","true");
        Driver jdbcDriver = new OracleDriver();
        PreparedStatement pstmt = null;
        Connection connection = null;
        String tableName = "NCLOBTEST";
        CLOB clob = null;
        try
        {
            connection = jdbcDriver.connect("jdbc:oracle:thin:@terre:1521:orcl", jdbcProperties);
            pstmt = connection.prepareStatement("DELETE FROM NCLOBTEST");
            pstmt.executeUpdate();
            pstmt.close();
            pstmt = connection.prepareStatement(
                "INSERT INTO "+tableName+" (ID,SOMESTRING) VALUES (?,?)");
            clob = CLOB.createTemporary(pstmt.getConnection(), true, CLOB.DURATION_SESSION);
            clob.open(CLOB.MODE_READWRITE);
            Writer clobWriter = clob.getCharacterOutputStream();
            clobWriter.write("Café 4,90€ TTC");
            clobWriter.flush();
            clobWriter.close();
            clob.close();
            OraclePreparedStatement opstmt = (OraclePreparedStatement)pstmt;
            opstmt.setInt(1,1);
            opstmt.setFormOfUse(2, OraclePreparedStatement.FORM_NCHAR);
            opstmt.setCLOB(2, clob);
            System.err.println("Rows affected: "+opstmt.executeUpdate());
        }
        catch (Exception sqlex)
        {
            sqlex.printStackTrace();
        }
        finally
        {
            try {
                clob.freeTemporary();
            } catch (SQLException e) {
                System.err.println("Cannot free temporary CLOB: "+e.getMessage());
            }
            try { pstmt.close(); } catch(SQLException sqlex) {}
            try { connection.close(); } catch(SQLException sqlex) {}
        }
    }
}

The solution to this is to use a third-party driver. Oranxo works really well.
- Chris -
Trouble converting MS SQL 7 ntext to 10g CLOBs
Hola -
Using OMWB 10.1.0.2, and I'm having trouble converting SQL 7 NTEXT fields to CLOBs; I either wind up with a single character in the CLOB field (the first character of the SQL field), or all the text is brought over with spaces in between (represented by some odd character I haven't fully checked out yet, but it shows up onscreen as a box).
I'm in the early stages of checking this out, but any general ideas or directions for testing/solving this difficulty would be greatly appreciated. Many thanks.

Hi James,
S a m p l e T e x t
After further investigation, I determined
that these are not spaces but a null character,
ASCII code "0".

This data is in Unicode, so you see a null char after each character in a text editor.
It is normal.
It seems your data was transferred well.
If the original data is not in Unicode, change the database character set to ASCII.
Or please make sure that SQL Server NTEXT was converted to CLOB, not NCLOB. NCLOB is always Unicode.
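The "S a m p l e T e x t" effect is easy to reproduce outside the database. A hedged Python sketch of what a single-byte-per-character viewer does with UTF-16 data:

```python
text = "Sample Text"
raw = text.encode("utf-16-le")  # each character becomes 2 bytes, low byte first
# A viewer that assumes one byte per character renders each NUL byte as a gap:
as_seen = raw.decode("latin-1").replace("\x00", " ")
print(as_seen)  # S a m p l e   T e x t
```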
Best regards, Dmitry Tolpeko
SQLWays - Data, schema, procedures conversion for Oracle, DB2, SQL Server, Informix, Sybase and MySQL
http://www.ispirer.com -
Hi all,
I am trying to insert this data into the CLOB column. I am getting the error - Records rejected.
Please help me in this regard.
Control file:
load data
infile 'c:\test.csv'
truncate
into table test_clob
fields terminated by ","
(
  a,
  b
)
Table Structure:
create table test_clob(a number, b clob);
Test.csv:
1, "Marsh Supermarkets here is the exclusive supermarket partner of Project 18 a program it developed with the Peyton Manning Childrens Hospital and Ball State University to fight childhood obesity.
Project 18 Approved shelf tags identify better-for-you kid-friendly products throughout Marsh stores. Canned fruit frozen entrees vegetables granola bars and salty snacks are among the categories analyzed.
In the granola bar category for instance products can have no more than 35% of calories from total fat and have no more than 10% of calories from saturated and trans fats combined.
Along with shelf-tags at Marsh the initiative includes: Project 18 MVPs local high school students honored for serving as role models for younger children in the community; community wellness events Project 18 walks; a school curriculum; and The Project 18 Mobile Van."
Warm Regards,
Swami

Hi Swami,
You need to modify your control file like this:
load data
infile 'c:\test.csv'
TRUNCATE
into table test_clob
fields terminated by ","
(
  a,
  b CHAR(4000)
)

Reason for specifying CHAR(4000) is as under (from Oracle documentation):
To load internal LOBs (BLOBs, CLOBs, and NCLOBs) or XML columns from a primary datafile, you can use the following standard SQL*Loader formats:
* Predetermined size fields
* Delimited fields
* Length-value pair fields
For more go here: Link: [http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_loading.htm#i1008564]
Thanks,
Ankur -
Loading UTF-8 String into CLOB column
Hello!
I am trying to load UTF-8 encoded strings into a CLOB column in an Oracle 9i database from VB.Net using ODP.Net (9.2.0.414).
The strings are XML snippets (Microsoft WordML to be precise). Each corresponds to a record which already exists in the database, therefore I do an update to add the UTF-8 string.
Some of the XML snippets contain characters which once inserted look like upside down question marks (characters represented by 0x92 and 0x96 for example end up as 0xBF once in the database).
Setting breakpoints in Visual Studio, I can watch the string values in the 'Locals' window and they appear correct (in fact I can copy from the 'Locals' window and using a tool such as TOAD can paste the strings into the database successfully). Pasting through TOAD, the characters are properly represented in the database (ie 0x92 is 0x92).
I've tried a number of approaches with no luck.
Any advice/suggestion are most welcome. Thanks!
Here is my code:
strConnectionString = ConfigurationSettings.AppSettings.Item("ConnectionString")
strComponentsTable = ConfigurationSettings.AppSettings.Item("ComponentsTable")
objConnection = New OracleConnection(strConnectionString)
objCommand = objConnection.CreateCommand()
objCommand.CommandType = CommandType.Text
objCommand.CommandText = "UPDATE " & strComponentsTable & " SET TEMPLATE_COMPONENT_CONTENT = :p_content WHERE TEMPLATE_COMPONENT_ID = :p_id"
objConnection.Open()
For Each strId In objComponents.Keys
strContent = objComponents.Item(strId)
objCommand.Parameters.Clear()
objParameter = objCommand.CreateParameter()
objParameter.ParameterName = "p_content"
objParameter.OracleDbType = OracleDbType.Clob
objParameter.Direction = ParameterDirection.Input
objParameter.Value = strContent
objCommand.Parameters.Add(objParameter)
objParameter = objCommand.CreateParameter()
objParameter.ParameterName = "p_id"
objParameter.OracleDbType = OracleDbType.Int32
objParameter.Direction = ParameterDirection.Input
objParameter.Value = CInt(strId)
objCommand.Parameters.Add(objParameter)
intResult = objCommand.ExecuteNonQuery()
Next

Some further research has revealed the following:
Two of the characters I provided as examples of not being stored properly in the database are (in Unicode) U+2013 and U+2019. These characters, encoded as UTF-8, should each be three bytes (0xE2 80 93 and 0xE2 80 99 respectively). Sent via VB.Net and ODP.Net they both end up in the database as one byte each (0xBF). Copied and pasted via TOAD they end up as one byte each (0x96 and 0x92 respectively).
The NLS settings on the server side are:
NLS_CHARACTERSET = WE8ISO8859P1
NLS_NCHAR_CHARACTERSET = AL16UTF16
I have tried using both CLOB and NCLOB column with the results being identical.
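For what it's worth, the 0xBF bytes are a strong hint: 0xBF is '¿', the substitution character Oracle writes when a character cannot be represented in the database character set, and neither U+2013 nor U+2019 exists in ISO-8859-1 (which WE8ISO8859P1 corresponds to). A quick Python check of representability (an illustration, not the Oracle conversion itself):

```python
# Neither U+2013 (en dash) nor U+2019 (right single quote) maps into
# ISO-8859-1, so a strict conversion fails; Oracle substitutes 0xBF instead.
for ch in ("\u2013", "\u2019"):
    try:
        ch.encode("iso-8859-1")
        print(f"U+{ord(ch):04X}: representable")
    except UnicodeEncodeError:
        print(f"U+{ord(ch):04X}: not representable in ISO-8859-1")
```

Storing these characters losslessly requires a database character set that covers them (AL32UTF8, for example).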
Not sure what else to try... -
Clob data type oracle to oracle issue
Hi Guys
I am unable to load the CLOB datatype and am getting the following error:
"java.lang.NumberFormatException: For input string: "4294967295""
I have also checked for the CLOB datatype in Topology Manager -- Technology -- Oracle -- Datatype,
and it is available.
I have added the following statement in my parameter file:
set ODI_ADDITIONAL_JAVA_OPTIONS=%ODI_ADDITIONAL_JAVA_OPTIONS% " -Doracledatabasemetadata.get_lob_precision=false";
But I don't see any statement like the one below, after which I am supposed to add the above statement:
set ODI_ADDITIONAL_JAVA_OPTIONS="-Djava.security.policy=server.policy";
Please let me know how to make the CLOB datatype work.
Regards,
janakiram

Hi Sutirtha,
Yes, I am able to view source data with the CLOB datatype by right-clicking the source datastore ---> View data.
I have checked metalink
it says
1. Oracle recommends the setting of the "get_lob_precision" flag to FALSE to avoid this message when dealing with LOB family datatypes (CLOB, BLOB, NCLOB, BFILE...).
I have used the following settings:
set ODI_JAVA_OPTIONS="-Djava.security.policy=server.policy"
set ODI_ADDITIONAL_JAVA_OPTIONS=%ODI_ADDITIONAL_JAVA_OPTIONS% "-Doracledatabasemetadata.get_lob_precision=false"
2. Checked the internal ID of the CLOB datatype, and it ends with 999.
But I still have the same problem.
Regards
janakiram -
Hi,
we have this code
DECLARE
  v_cCLOB    CLOB := 'TEST';
  v_cVARCHAR VARCHAR2(500) := 'TEST';
  v_cCHAR    CHAR(500) := 'TEST';
BEGIN
  --Trying to get substring beyond the actual amount of text
  --CLOB TEST
  IF NVL(SUBSTR(v_cCLOB, 376, 1), ' ') = ' ' THEN
    dbms_output.put_line('YES');
  ELSE
    dbms_output.put_line('NO');
  END IF;
  --VARCHAR TEST
  IF NVL(SUBSTR(v_cVARCHAR, 376, 1), ' ') = ' ' THEN
    dbms_output.put_line('YES');
  ELSE
    dbms_output.put_line('NO');
  END IF;
  --CHAR TEST
  IF NVL(SUBSTR(v_cCHAR, 376, 1), ' ') = ' ' THEN
    dbms_output.put_line('YES');
  ELSE
    dbms_output.put_line('NO');
  END IF;
END;
We're expecting to get three YESes as the result;
however, we got NO, YES, YES.
Can we know why we got NO for the CLOB? What is behind the CLOB? Is there any particular value stored there?
thanks

Interesting finding. Per the documentation, SUBSTR should work with CLOBs the same as it does with VARCHAR2:
char can be any of the datatypes CHAR, VARCHAR2, NCHAR, NVARCHAR2, CLOB, or NCLOB. Both position and substring_length must be of datatype NUMBER, or any datatype that can be implicitly converted to NUMBER, and must resolve to an integer. The return value is the same datatype as char. Floating-point numbers passed as arguments to SUBSTR are automatically converted to integers.
Instead, SUBSTR on the CLOB returns an initialized but empty value, EMPTY_CLOB(), instead of NULL:
SQL> declare
v_cclob clob := 'TEST';
v_cvarchar varchar2(500) := 'TEST';
v_cchar char(500) := 'TEST';
begin
--Trying to get substring beyond the actual amount of text
--CLOB TEST
if substr(v_cclob, 376, 1) = empty_clob() then
dbms_output.put_line('YES');
else
dbms_output.put_line('NO');
end if;
--VARCHAR TEST
if nvl(substr(v_cvarchar, 376, 1), ' ') = ' ' then
dbms_output.put_line('YES');
else
dbms_output.put_line('NO');
end if;
--CHAR TEST
if nvl(substr(v_cchar, 376, 1), ' ') = ' ' then
dbms_output.put_line('YES');
else
dbms_output.put_line('NO');
end if;
end;
YES
YES
YES
PL/SQL procedure successfully completed.

This should have been documented, I think. -
Oracle Lite 10.3.0: how to find out the consumed size of a BLOB column
Hi,
we are developing an app that utilizes an Oracle Lite database and so far I have just been unable to find a reasonable way to determine the size of stored binary data.
All the "usual" ways we are aware of from "main" Oracle just don't seem to work with OLite (i.e. SELECT dbms_lob.getlength(file_data) FROM APPS.CSL_LOBS), so how does one find out the size of BLOB objects?
Thanks in advance

Assuming BLOBs are not stored in-line:
SQL> desc pm.print_media
Name Null? Type
PRODUCT_ID NOT NULL NUMBER(6)
AD_ID NOT NULL NUMBER(6)
AD_COMPOSITE BLOB
AD_SOURCETEXT CLOB
AD_FINALTEXT CLOB
AD_FLTEXTN NCLOB
AD_TEXTDOCS_NTAB PM.TEXTDOC_TAB
AD_PHOTO BLOB
AD_GRAPHIC BINARY FILE LOB
AD_HEADER PM.ADHEADER_TYP
SQL> select segment_name,
2 index_name
3 from dba_lobs
4 where owner = 'PM'
5 and table_name = 'PRINT_MEDIA'
6 and column_name = 'AD_COMPOSITE'
7 /
SEGMENT_NAME INDEX_NAME
SYS_LOB0000051988C00003$$ SYS_IL0000051988C00003$$
SQL> select sum(bytes)
2 from dba_segments
3 where owner = 'PM'
4 and segment_name in (
5 'SYS_LOB0000051988C00003$$',
6 'SYS_IL0000051988C00003$$'
7 )
8 /
SUM(BYTES)
262144
SQL>

SY. -
Print PDF rich text editor using BI Publisher
Hi !
I have a CLOB field in the database, and I store data in it using a rich text editor, like an MS Word system. Here, as you know, I can write words in bold, use different kinds of colors, etc., and use all the formatted text the editor allows me. I have no problems once this data is saved into the table and restored again; I can see all the data formatted, and I don't lose any property of the formatted text.
The problem is when I want to print this information the same way I see it in the rich text editor. I use BI Publisher to generate a PDF; when I send this field to print, the PDF that BI Publisher generates prints only text, including the markup characters that indicate the text is formatted. For example, if the field contains "<b>hello</b>" in bold, in the PDF generated by BI Publisher I see "< b > hello < / b >".
What is the way to print the same as is displayed in the rich text editor?
Thanks in advance.
Edited by: Almogaver on 07-mar-2011 14:35
Edited by: Almogaver on 07-mar-2011 14:36
Edited by: Almogaver on 07-mar-2011 14:37

Bump again. I understand how to remove the HTML tags; that's not a problem. The problem is getting the report to print the data from a rich text field stored in a CLOB or NCLOB as formatted text. Is there a "Master Style Template" or "Master Rich Text Field Subtemplate"?
Richard -
Manual Exporting of BLOB specific table to text file
Hi,
Our application has 60,000 records in a table with a BLOB column.
My requirement is to export the entire table's data to text files.
When I tried converting the BLOB to a string and writing it to a file, it took almost 3 minutes for 100 records; at that rate it will take a very long time to export the data from the BLOB table.
I am using the following logic,
byte[] bdata = blob.getBytes(1, (int)blob.length());
String data1 = new String(bdata);
buffer.append(data1);
Can anyone please tell me how I can speed up the operation?
>
Our application is having 60000 record in a BLOB specific table.
My requirement is to export the entire table data to text files .
>
Welcome to the forum!
If you are looking for a pure Java solution you should post this question in the JDBC forum.
https://forums.oracle.com/forums/category.jspa?categoryID=288
Unless your BLOB data is located inline you are going to use Oracle to read 60,000 files, one at a time, and then use Java to write 60,000 files one at a time.
That will be a very slow process.
Also - your Java code is only going to read the LOB locator and inline BLOB data since you are not getting and processing the actual stream.
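The streaming point can be sketched generically. A hedged Python sketch (in-memory streams standing in for the JDBC Blob input stream and the output file; the names are illustrative, not a real Oracle API): copy in fixed-size chunks rather than materializing the whole value the way blob.getBytes(1, (int)blob.length()) does.

```python
import io

def export_lob(src, dst, chunk_size=64 * 1024):
    """Copy a large binary value in bounded-memory chunks."""
    while True:
        buf = src.read(chunk_size)
        if not buf:
            break
        dst.write(buf)

# Usage with in-memory stand-ins for a LOB stream and an output file:
src = io.BytesIO(b"\x01\x02" * 100_000)   # ~200 KB of fake BLOB data
dst = io.BytesIO()
export_lob(src, dst)
print(dst.getvalue() == src.getvalue())   # True
```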
See 'Reading and Writing BLOB, CLOB and NCLOB Data' in the JDBC Dev Guide for details
http://docs.oracle.com/cd/B28359_01/java.111/b31224/oralob.htm#sthref756 -
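The stream-based approach the guide describes can be sketched roughly as below. This is a minimal sketch, not the guide's own code: in a real export the input stream would come from `blob.getBinaryStream()` on the JDBC `Blob`, and the output would be a `FileOutputStream` for the export file; here in-memory streams stand in so the pattern is runnable without a database, and the 8 KB buffer size is an arbitrary choice.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class LobStreamCopy {
    // Copy a stream in fixed-size chunks so the whole LOB is never
    // held in memory at once (unlike blob.getBytes(1, (int) blob.length())).
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[8192];
        long total = 0;
        int n;
        while ((n = in.read(buf)) != -1) {
            out.write(buf, 0, n);
            total += n;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for blob.getBinaryStream(); real code would loop over
        // the ResultSet and open one FileOutputStream per row.
        byte[] data = new byte[100000];
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        long copied = copy(new ByteArrayInputStream(data), out);
        System.out.println(copied);
    }
}
```

The win is constant memory per row regardless of LOB size; the per-row round trips to the database remain, which is why the export is still inherently slow for 60,000 rows.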
Plsql datatype -- URGENT HELP..
Is there a PL/SQL datatype I can declare that can hold a value larger than 32 KB?
Please note that the value does not come from the database;
rather, I will receive it from an OUT variable of an external program called from a PL/SQL stored procedure.
Thanks.
LOB Types
The LOB (large object) datatypes BFILE, BLOB, CLOB, and NCLOB let you store blocks of unstructured data (such as text, graphic images, video clips, and sound waveforms) up to four gigabytes in size. And, they allow efficient, random, piece-wise access to the data.
The LOB types differ from the LONG and LONG RAW types in several ways. For example, LOBs (except NCLOB) can be attributes of an object type, but LONGs cannot. The maximum size of a LOB is four gigabytes, but the maximum size of a LONG is two gigabytes. Also, LOBs support random access to data, but LONGs support only sequential access.
LOB types store lob locators, which point to large objects stored in an external file, in-line (inside the row) or out-of-line (outside the row). Database columns of type BLOB, CLOB, NCLOB, or BFILE store the locators. BLOB, CLOB, and NCLOB data is stored in the database, in or outside the row. BFILE data is stored in operating system files outside the database.
PL/SQL operates on LOBs through the locators. For example, when you select a BLOB column value, only a locator is returned. If you got it during a transaction, the LOB locator includes a transaction ID, so you cannot use it to update that LOB in another transaction. Likewise, you cannot save a LOB locator during one session, then use it in another session.
Starting in Oracle9i, you can also convert CLOBs to CHAR and VARCHAR2 types and vice versa, or BLOBs to RAW and vice versa, which lets you use LOB types in most SQL and PL/SQL statements and functions. To read, write, and do piecewise operations on LOBs, you can use the supplied package DBMS_LOB. For more information, see Oracle9i Application Developer's Guide - Large Objects (LOBs).
BFILE
You use the BFILE datatype to store large binary objects in operating system files outside the database. Every BFILE variable stores a file locator, which points to a large binary file on the server. The locator includes a directory alias, which specifies a full path name (logical path names are not supported).
BFILEs are read-only, so you cannot modify them. The size of a BFILE is system dependent but cannot exceed four gigabytes (2**32 - 1 bytes). Your DBA makes sure that a given BFILE exists and that Oracle has read permissions on it. The underlying operating system maintains file integrity.
BFILEs do not participate in transactions, are not recoverable, and cannot be replicated. The maximum number of open BFILEs is set by the Oracle initialization parameter SESSION_MAX_OPEN_FILES, which is system dependent.
BLOB
You use the BLOB datatype to store large binary objects in the database, in-line or out-of-line. Every BLOB variable stores a locator, which points to a large binary object. The size of a BLOB cannot exceed four gigabytes.
BLOBs participate fully in transactions, are recoverable, and can be replicated. Changes made by package DBMS_LOB can be committed or rolled back. BLOB locators can span transactions (for reads only), but they cannot span sessions.
CLOB
You use the CLOB datatype to store large blocks of character data in the database, in-line or out-of-line. Both fixed-width and variable-width character sets are supported. Every CLOB variable stores a locator, which points to a large block of character data. The size of a CLOB cannot exceed four gigabytes.
CLOBs participate fully in transactions, are recoverable, and can be replicated. Changes made by package DBMS_LOB can be committed or rolled back. CLOB locators can span transactions (for reads only), but they cannot span sessions.
NCLOB
You use the NCLOB datatype to store large blocks of NCHAR data in the database, in-line or out-of-line. Both fixed-width and variable-width character sets are supported. Every NCLOB variable stores a locator, which points to a large block of NCHAR data. The size of an NCLOB cannot exceed four gigabytes.
NCLOBs participate fully in transactions, are recoverable, and can be replicated. Changes made by package DBMS_LOB can be committed or rolled back. NCLOB locators can span transactions (for reads only), but they cannot span sessions. -
DBMS_LOB.GETLENGTH into bytes
I am using Oracle 11.2.0.3, and I am using DBMS_LOB.GETLENGTH to determine the size of a CLOB. Is the value returned by DBMS_LOB.GETLENGTH in bytes?
DBMS_LOB.GETLENGTH returns the length in characters for a CLOB or NCLOB, and in bytes for a BLOB or BFILE. For WE8MSWIN1252 you can try running LENGTHB directly against the CLOB. LENGTHB won't work against multibyte LOBs (in which case you will have to use a custom function like the clob_lengthb shown below). But if it does run, then you know your CLOB is single-byte, and the character length is also the byte length.
SQL> select lengthb(c) from z_t;
LENGTHB(C)
4343
4343
4969
5414
4593
162
6 rows selected
SQL> select lengthb(nc) from z_t;
select lengthb(nc) from z_t
ORA-22998: CLOB or NCLOB in multibyte character set not supported
SQL> select clob_lengthb(nc) from z_t;
CLOB_LENGTHB(NC)
4343
4343
4969
5414
4593
162
6 rows selected
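The distinction the reply relies on, that a character count only equals a byte count for single-byte data, is easy to see on the client side. This small illustration is not Oracle code at all; the strings and the choice of UTF-8 are arbitrary examples of single-byte versus multibyte content:

```java
import java.nio.charset.StandardCharsets;

public class CharVsByteLength {
    public static void main(String[] args) {
        String ascii = "hello";    // single-byte characters only
        String accented = "h\u00e9llo"; // 'é' needs two bytes in UTF-8

        // For pure ASCII, character count == UTF-8 byte count.
        System.out.println(ascii.length() + " "
                + ascii.getBytes(StandardCharsets.UTF_8).length);
        // For multibyte data the byte count is larger, which is why a
        // character-based length cannot be read as a size in bytes.
        System.out.println(accented.length() + " "
                + accented.getBytes(StandardCharsets.UTF_8).length);
    }
}
```

This prints "5 5" for the ASCII string and "5 6" for the accented one: the same reason LENGTHB fails on multibyte LOBs while a character-based length alone cannot tell you the storage size.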