Blob to clob conversion when inserting as select?
I have a table with one row that has a BLOB column. I would like to copy that row to a different table with a CLOB column. I found a discussion (BLOB to CLOB?) where some examples of stored procedures to convert a BLOB to a CLOB are given, but I am unsure what I should use as the input parameter to the blob_to_clob procedures listed in that discussion. Please advise.
Thank you very much for your time.
Daniel
The function by Richard Wiener {message:id=9646697} in the thread simply uses the procedure dbms_lob.converttoclob with a set of parameters defined in the function.
At my work we have created a similar function:
CREATE OR REPLACE function blob2clob (p_blob in blob)
return clob
is
v_clob clob;
amount INTEGER;
dest_offset INTEGER;
src_offset INTEGER;
blob_csid NUMBER;
lang_context INTEGER;
warning INTEGER;
begin
dbms_lob.createtemporary (
v_clob,
false,
dbms_lob.session
);
amount := dbms_lob.lobmaxsize;
dest_offset := 1;
src_offset := 1;
blob_csid := nls_charset_id('AL32UTF8');
lang_context := 0;
dbms_lob.converttoclob(
v_clob,
p_blob,
amount,
dest_offset,
src_offset,
blob_csid,
lang_context,
warning
);
return v_clob;
end blob2clob;
/
The difference here is that Richard uses the default charset; we needed the AL32UTF8 charset. You should use whatever is relevant for your case.
Using either Richard's function or the one above in your insert-as-select is simple:
insert into tab_c (clob_col)
select blob2clob(blob_col) from tab_b;
Similar Messages
-
hi all,
I have a table with 2 columns in my DB (10.2.0.4): one column is VARCHAR2 and the other was BLOB. As requested by the development team, I changed it from BLOB to CLOB. After this conversion the object size increased from 11 MB to 39 MB. My question: is this expected behaviour? If yes, can anyone please explain it?
regards,
CREATE TABLE TABLE_CLOB
(
  RECID     VARCHAR2(255 BYTE),
  XMLRECORD SYS.XMLTYPE
)
TABLESPACE DATATABLESPACE
PCTUSED    0
PCTFREE    10
INITRANS   1
MAXTRANS   255
STORAGE    (
            INITIAL          64K
            MINEXTENTS       1
            MAXEXTENTS       2147483645
            PCTINCREASE      0
            BUFFER_POOL      DEFAULT
           )
LOGGING
NOCOMPRESS
NOCACHE
NOPARALLEL
MONITORING
ENABLE ROW MOVEMENT;
CREATE UNIQUE INDEX TABLE_CLOB_PK ON TABLE_CLOB
(RECID)
LOGGING
TABLESPACE INDEXTABLESPACE
PCTFREE    10
INITRANS   2
MAXTRANS   255
STORAGE    (
            INITIAL          64K
            MINEXTENTS       1
            MAXEXTENTS       2147483645
            PCTINCREASE      0
            BUFFER_POOL      DEFAULT
           )
NOPARALLEL;
ALTER TABLE TABLE_CLOB ADD (
  CONSTRAINT TABLE_CLOB_PK
  PRIMARY KEY (RECID)
  USING INDEX
  TABLESPACE INDEXTABLESPACE
  PCTFREE    10
  INITRANS   2
  MAXTRANS   255
  STORAGE    (
              INITIAL          64K
              MINEXTENTS       1
              MAXEXTENTS       2147483645
              PCTINCREASE      0
             )
);
-
Mutating Problem when insert while selecting data from dual
Hi All,
we have a table:
create table test (
  id   number,
  name varchar2(100)
);
and a before insert trigger:
create or replace trigger test1
before insert on test
for each row
begin
select decode(max(id),null,1,max(id)+1) into :new.id from test;
end;
I am able to insert values by using
"insert into test(name) values('test1')" without any issues.
But when I insert the values with
"insert into test(name) select 'test1' from dual" I am getting the error
ORA-04091: table SCOTT.TEST is mutating, trigger/function may not see it
Could someone please advise why I am getting the error in the second scenario.
Thanks in advance,
Prasad
try
insert into test (name) values ((select 'test1' from dual));
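As an aside, the usual way to avoid reading the mutating table at all is a sequence. A minimal sketch (the sequence name here is assumed, not from the thread):

```sql
create sequence test_seq start with 1;

create or replace trigger test1
before insert on test
for each row
begin
  -- no SELECT from TEST itself, so ORA-04091 cannot occur
  select test_seq.nextval into :new.id from dual;
end;
/
```

With this trigger both the VALUES form and the INSERT ... SELECT form work, since the trigger never queries the table being modified.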
Frank -
ORA-22835: Buffer too small for CLOB to CHAR or BLOB to RAW conversion
Hi all,
the following query:
select to_char(nvl(round(pc.target_cost*xx_primavera.geteurtolvrate,2),amount),'FM999G999G999G999G990D00') detail_amount,
nvl(ct.cost_type, description) detail_description,
tm_desc.memo_id,
primavera_prj_name detail_prj_name,
hp.party_number detail_party_number,
xpid.interface_line_attribute1,
utl_i18n.unescape_reference(replace(regexp_replace(utl_raw.cast_to_varchar2(tm_desc.task_memo), '<[^>]*>'), chr(13)||chr(10))) document_description,
REPLACE(regexp_replace(utl_raw.cast_to_varchar2(tm_id.task_memo), '<[^>]*>'), chr(13)||chr(10)) prim_memo_client_id
from XX_PRIMAVERA_INVOICES_DETAIL xpid
join admuser.xx_ar_hz_parties xahp on xahp.orig_system_bill_customer_id = xpid.orig_system_bill_customer_id
join hz_parties hp on hp.party_id = xahp.party_id
left join admuser.projcost pc on pc.proj_id = xpid.primavera_prj_id and pc.cost_type_id != 29 and xpid.service_code = 8 and pc.task_id = xx_primavera.getTaskId(xpid.primavera_prj_id,'A1020', 'Изготвяне на оферта') and delete_session_id is null
left join admuser.costtype ct on ct.cost_type_id = pc.cost_type_id
left join admuser.taskmemo tm_id on tm_id.proj_id = xpid.primavera_prj_id and tm_id.memo_type_id = 53 and tm_id.task_id = xx_primavera.getTaskId(xpid.primavera_prj_id,'A1020', 'Изготвяне на оферта')
left join admuser.taskmemo tm_desc on tm_desc.proj_id = xpid.primavera_prj_id and tm_desc.memo_type_id = 55 and tm_desc.task_id = xx_primavera.getTaskId(xpid.primavera_prj_id,'A1020', 'Изготвяне на оферта')
where amount != 0
and xpid.interface_line_attribute1 = :ra_ctp_attribute1
ORDER BY xpid.primavera_prj_name, xpid.description;
returns error:
ORA-22835: Buffer too small for CLOB to CHAR or BLOB to RAW conversion (actual: 2371, maximum: 2000)
I found that the error occurs in this row:
utl_i18n.unescape_reference(replace(regexp_replace(utl_raw.cast_to_varchar2(tm_desc.task_memo), '<[^>]*>'), chr(13)||chr(10))) document_description,
and tried to change it to:
utl_i18n.unescape_reference(replace(regexp_replace(utl_raw.cast_to_varchar2(dbms_lob.substr(tm_desc.task_memo,1,2000)), '<[^>]*>'), chr(13)||chr(10))) document_description,
....but it returns no value for that field... Am I using dbms_lob.substr in the wrong place? The column tm_desc.task_memo is of BLOB type.
Any ideas how to cheat it ?
Version: Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - Production
PL/SQL Release 11.1.0.7.0 - Production
"CORE 11.1.0.7.0 Production"
TNS for Linux: Version 11.1.0.7.0 - Production
NLSRTL Version 11.1.0.7.0 - Production
Thanks in advance,
Bahchevanov.
Your second example has the parameters reversed. The amount (length) comes first and then the offset:
DBMS_LOB.SUBSTR (
lob_loc IN BLOB,
amount IN INTEGER := 32767,
offset IN INTEGER := 1)
RETURN RAW;
DBMS_LOB.SUBSTR (
lob_loc IN CLOB CHARACTER SET ANY_CS,
amount IN INTEGER := 32767,
offset IN INTEGER := 1)
RETURN VARCHAR2 CHARACTER SET lob_loc%CHARSET;
DBMS_LOB.SUBSTR (
file_loc IN BFILE,
amount IN INTEGER := 32767,
offset IN INTEGER := 1)
RETURN RAW;
Also, remember that the number of bytes is not necessarily the same as the number of characters, depending on your character set: in a multibyte character set, 2000 characters might occupy 4000 bytes. And you have to make sure the BLOB actually contains character data and not arbitrary binary data.
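With the arguments in the documented order (amount first, then offset), the expression from the question would look like this sketch:

```sql
-- first 2000 bytes of the BLOB, starting at offset 1
utl_i18n.unescape_reference(
  replace(
    regexp_replace(
      utl_raw.cast_to_varchar2(dbms_lob.substr(tm_desc.task_memo, 2000, 1)),
      '<[^>]*>'),
    chr(13)||chr(10))) document_description
```

The original call dbms_lob.substr(tm_desc.task_memo, 1, 2000) asked for a single byte at offset 2000, which is why the field came back effectively empty.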
Post the results of reversing the parameters and using a smaller chunk size. -
Literal too long when inserting 4000 chars to long or clob
How do I avoid the 'literal too long' error when inserting a string of more than 4000 characters into a LONG or CLOB field?
A string literal can only be 4000 characters. To insert into a LOB, you need to use the dbms_lob package.
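A minimal sketch of that approach: build the value in a PL/SQL variable (or bind it from the client) so that no single string literal in the SQL text exceeds 4000 characters. The table and column names here are assumed:

```sql
declare
  l_text clob;
begin
  -- 8000 characters assembled in PL/SQL; no individual literal exceeds 4000
  l_text := rpad('a', 4000, 'a') || rpad('b', 4000, 'b');
  insert into my_table (clob_col) values (l_text);
  commit;
end;
/
```

The 4000-character limit applies only to literals embedded in the SQL statement itself; a bound PL/SQL variable can be far larger, and dbms_lob.append/writeappend can grow the CLOB further in chunks.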
Justin -
ORA-00600 error when inserting NULL in BLOB column
Hi,
I want to insert NULL value into a BLOB column w/o using empty_blob(), but I am getting the following error upon submission (both through program and upon directly executing it from TOAD/SQL*):
java.sql.SQLException: ORA-00600: internal error code, arguments: [kxtotolc_lobopt], [], [], [], [], [], [], []
The query is as follows:
insert into image_rendering r
(r.Version_date_time, r.id, r.Name, r.Type, r.Image_url, r.Image_filesize, r.Html, r.Original_Text, r.Redirect_url, r.Version, r.Rendering_size) values
(sysdate, '1963884', '468x60_1.gif', '0', '225/9-468x60_1.gif', '1471', null, null, null, '1', '30670908')
In the table r.Html, r.Original_Text are blob and clob columns respectively.
If I remove the column r.Html (blob) and its corresponding value the following query executes fine :
insert into image_rendering r
(r.Version_date_time, r.id, r.Name, r.Type, r.Image_url, r.Image_filesize, r.Original_Text, r.Redirect_url, r.Version, r.Rendering_size)
values
(sysdate, '1963884', '468x60_1.gif', '0', '225/9-468x60_1.gif', '1471', null, null, '1', '30670908')
I know I can also write the query using empty_blob(), but I don't want to use empty_blob() since that would involve changing my generic DB classes.
The strange thing is that I created another table involving blob and clob columns through TOAD.
In this table I can insert NULL values in the blob column without any error.
Is there anything that can be done to insert a NULL into my BLOB column? Am I missing anything?
Is there a setting in the database that will allow me to do this? Because one table accepts NULL and the other doesn't. It's strange.
I am using Oracle8i Enterprise Edition Release 8.1.7.0.0
The query doesn't execute through TOAD or SQL*Plus or through the program (I am using the thin Oracle drivers: the usual classes12.zip file).
thanks in advance
- Nilesh
From Metalink:
Oracle 9i Message
~~~~~~~~~~~~~~~~~
Error: ORA-14400 (ORA-14400)
Text: inserted partition key does not map to any partition
Cause: An attempt was made to insert a record into, a Range or Composite
Range object, with a concatenated partition key that is beyond the
concatenated partition bound list of the last partition -OR- An
attempt was made to insert a record into a List object with a
partition key that did not match the literal values specified for
any of the partitions.
Action: Do not insert the key. Or, add a partition capable of accepting
the key, Or add values matching the key to a partition
specification
So check the date.
What is your insert statement?
Anand
Edited by: Anand... on Mar 4, 2009 5:42 PM -
Connection reset when inserting file to BLOB column
Friends,
When inserting a file into a BLOB column, if the file is more than 1KB, I receive the following message: java.sql.SQLException: Io exception: Connection reset
The code is this:
int fileLength = (int)file.length();
System.out.println("File length: "+fileLength);
int cod = (int)(Math.random() * 1000);
String sql = "INSERT INTO BLOB_TABLE VALUES(?,?)";
try {
    FileInputStream fis = new FileInputStream(file);
    PreparedStatement pstmt = connection.prepareStatement(sql);
    pstmt.setInt(1, cod);
    pstmt.setBinaryStream(2, fis, fileLength);
    pstmt.executeUpdate();
    System.out.println("File insert success!");
    connection.close();
} catch (SQLException e) {
    e.printStackTrace();
}
Does anybody know what this can be? My database is Oracle.
Thanks!
When you create objects on the database they are stored in the data dictionary by default in UPPER case.
So in this line:
src_loc bfile := bfilename('example_lob_dir', 'example.gif'); -- source location
you need to reference the name of the directory object in upper case. e.g.
src_loc bfile := bfilename('EXAMPLE_LOB_DIR', 'example.gif'); -- source location
;) -
Can't fetch clob and long in one select/query
I created a nightmare table containing numerous binary data types to test an application I was working on, and believe I have found an undocumented bug in Oracle's JDBC drivers that is preventing me from loading a CLOB and a LONG in a single SQL select statement. I can load the CLOB successfully, but attempting to call ResultSet.get...() for the LONG column always results in
java.sql.SQLException: Stream has already been closed
even when processing the columns in the order of the SELECT statement.
I have demonstrated this behaviour with version 9.2.0.3 of Oracle's JDBC drivers, running against Oracle 9.2.0.2.0.
The following Java example contains SQL code to create and populate a table containing a collection of nasty binary columns, and then Java code that demonstrates the problem.
I would really appreciate any workarounds that allow me to pull this data out of a single query.
import java.sql.*;
/*
 * This class was developed to verify that you can't have a CLOB and a LONG column in the
 * same SQL select statement, and extract both values. Calling get...() for the LONG column
 * always causes 'java.sql.SQLException: Stream has already been closed'.
 */
CREATE TABLE BINARY_COLS_TEST (
  PK INTEGER PRIMARY KEY NOT NULL,
  CLOB_COL CLOB,
  BLOB_COL BLOB,
  RAW_COL RAW(100),
  LONG_COL LONG
);
INSERT INTO BINARY_COLS_TEST (
  PK,
  CLOB_COL,
  BLOB_COL,
  RAW_COL,
  LONG_COL
) VALUES (
  1,
  '-- clob value --',
  HEXTORAW('01020304050607'),
  HEXTORAW('01020304050607'),
  '-- long value --'
);
public class JdbcLongTest
{
    public static void main(String argv[])
        throws Exception
    {
        Driver driver = (Driver)Class.forName("oracle.jdbc.driver.OracleDriver").newInstance();
        DriverManager.registerDriver(driver);
        Connection connection = DriverManager.getConnection(argv[0], argv[1], argv[2]);
        Statement stmt = connection.createStatement();
        ResultSet results = null;
        try
        {
            String query = "SELECT pk, clob_col, blob_col, raw_col, long_col FROM binary_cols_test";
            results = stmt.executeQuery(query);
            while (results.next())
            {
                int pk = results.getInt(1);
                System.out.println("Loaded int");
                Clob clob = results.getClob(2);
                // It doesn't work if you just close the ascii stream.
                // clob.getAsciiStream().close();
                String clobString = clob.getSubString(1, (int)clob.length());
                System.out.println("Loaded CLOB");
                // Streaming not strictly necessary for short values.
                // Blob blob = results.getBlob(3);
                byte blobData[] = results.getBytes(3);
                System.out.println("Loaded BLOB");
                byte rawData[] = results.getBytes(4);
                System.out.println("Loaded RAW");
                byte longData[] = results.getBytes(5);
                System.out.println("Loaded LONG");
            }
        }
        catch (SQLException e)
        {
            e.printStackTrace();
        }
        finally
        {
            results.close();
            stmt.close();
            connection.close();
        }
    }
} // public class JdbcLongTest
The problem is that LONGs are not buffered but are read from the wire in the order defined. The problem is the same as:
rs = stmt.executeQuery("select myLong, myNumber from tab");
while (rs.next()) {
int n = rs.getInt(2);
String s = rs.getString(1);
}
The above will fail for the same reason. When the statement is executed the LONG is not read immediately. It is buffered in the server waiting to be read. When getInt is called the driver reads the bytes of the LONG and throws them away so that it can get to the NUMBER and read it. Then when getString is called the LONG value is gone so you get an exception.
Similar problem here. When the query is executed the CLOB and BLOB locators are read from the wire, but the LONG is buffered in the server waiting to be read. When Clob.getString is called, it has to talk to the server to get the value of the CLOB, so it reads the LONG bytes from the wire and throws them away. That clears the connection so that it can ask the server for the CLOB bytes. When the code reads the LONG value, those bytes are gone so you get an exception.
This is a long standing restriction on using LONG and LONG RAW values and is a result of the network protocol. It is one of the reasons that Oracle deprecates LONGs and recommends using BLOBs and CLOBs instead.
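Given that recommendation, one workaround is to migrate the column itself so the restriction disappears entirely. On 9i and later this is a one-statement change (a sketch against the test table above):

```sql
-- converts the LONG column to a CLOB in place;
-- afterwards the JDBC code can use getClob/getString freely in any order
alter table binary_cols_test modify (long_col clob);
```

If the table cannot be altered, the alternative is to keep the LONG column last in the select list and read every column strictly in select-list order, touching the LONG before any call that has to round-trip to the server (such as Clob.getSubString).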
Douglas -
I wish to migrate records of a column FILETYPE (LONG RAW) of an existing table to a column DOCTYPE (CLOB) of a new table.
But the function to_lob() is unable to carry out the required migration. The error message is: "Inconsistent Data Types expected - binary ....".
Moreover, LONG RAW to BLOB conversion works fine from a simple INSERT statement, but the same fails when the insertion is done using a CURSOR.
Can you guys help me with this?
Oracle Version used - 9i
Regards,
Chinmay <Infocker>
You can transfer data from
LONG to CLOB
LONG RAW to BLOB only..
In Oracle9i -- there is a very cool "alter table t modify long_col CLOB" to do
this as well....
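Note that TO_LOB is legal only in the select list of a direct INSERT ... SELECT (or CREATE TABLE ... AS SELECT); fetching it through a cursor is exactly what raises the inconsistent-datatypes error. A sketch, with the target column assumed to be a BLOB since LONG RAW can only become a BLOB:

```sql
-- one-shot migration; DOC_BLOB is an assumed BLOB column in the new table
insert into new_table (recid, doc_blob)
select recid, to_lob(filetype) from old_table;
```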
you could convert a clob to a blob, or a blob to a clob using utl_raw.cast_to_varchar2/raw and doing it 32k at a time. -
Hi All,
Any help will be much appreciated.
I wrote the following but keep getting this error: ORA-06502: PL/SQL: numeric or value error. I know the flex_ws_api.blob2clobbase64 function is working, as I have tested it outside of the application, but when I try to push the result into a page item I get the error. The function converts a document from a BLOB to a CLOB.
declare
l_blob BLOB;
l_return CLOB;
BEGIN
select blob_content into l_blob
from wwv_flow_files
where name = :P169_FILENAME;
l_return := flex_ws_api.blob2clobbase64(l_blob);
:P169_CLOB_VALUE := l_return;
END;
user10256482 wrote:
:P169_CLOB_VALUE is defined as just a textarea item, and the error I receive is in the application (APEX), so I don't get a line number.
I'm thinking :p169_clob_value was defined with insufficient length and the CLOB is too long to fit into it -
Using blob or clob from db as document
I'm changing a working process to fetch an XDP document from a database rather than fetch from resources:// on the Adobe server. The DB2 database field containing the XDP is a clob data type. We were using blob. The services operations are:
- Foundation/JdbcService/Query Single Row: this fetches the XDP
- Foundation/SetValue/Execute: this converts whatever was fetched into a document variable
- Forms/FormsService/renderPDFForm: this merges the document with XML and produces PDF output
I'm unable to write the database field into a variable due to lack of choices. For instance there is no BLOB or CLOB variable type in the list of available types. When using STRING I get the following error:
Caused by: java.io.NotSerializableException: com.ibm.db2.jcc.b.ub
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1081)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:302)
at com.adobe.idp.dsc.util.CoercionUtil.toString(CoercionUtil.java:498)
When using XML I get the following error:
Caused by: com.adobe.workflow.WorkflowRuntimeException: Invalid location: /process_data/@clob_XDP_string cannot be stored for action instance: -1
at com.adobe.workflow.pat.service.PATExecutionContextImpl.setProcessDataValue(PATExecutionContextImpl.java:701)
When using OBJECT I get the following error:
Caused by: com.adobe.workflow.WorkflowRuntimeException: Invalid location: /process_data/@clob_XDP_string cannot be stored for action instance: -1
at com.adobe.workflow.pat.service.PATExecutionContextImpl.setProcessDataValue(PATExecutionContextImpl.java:701)
Steve,
Going against DB2 doesn't work for me with a document variable type. It gives a coercion error.
I did solve my problem though from the following URL: http://groups.google.com/group/livecycle/browse_thread/thread/6c4b9156b52b71a7
JYates:
You can do this, but you have to use the Execute Script service -- at this time there isn't a deployable component for it.
Use this sort of script in the Execute Script service to read the PDF blob from the database and populate a Document variable.
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.Statement;
import java.sql.ResultSet;
import javax.sql.DataSource;
import javax.naming.InitialContext;
int processId = patExecContext.getProcessDataIntValue("/process_data/@id");
InitialContext context = new InitialContext();
Connection connection = ((DataSource)context.lookup("java:/IDP_DS")).getConnection();
String queryQuery = "select bigdocument, bigstring from tb_pt_workwithxlobs where process_instance_id = ?";
PreparedStatement queryStatement = connection.prepareStatement(queryQuery);
try {
    queryStatement.setInt(1, processId);
    ResultSet results = queryStatement.executeQuery();
    results.next();
    java.sql.Blob documentBlob = results.getBlob(1);
    com.adobe.idp.Document document = new com.adobe.idp.Document(documentBlob.getBinaryStream());
    patExecContext.setProcessDataValue("/process_data/@NewBigDocument", document);
    java.sql.Clob stringClob = results.getClob(2);
    patExecContext.setProcessDataValue("/process_data/@NewBigString", stringClob.getSubString(1L, (int)stringClob.length()));
} catch (Exception ex) {
    ex.printStackTrace();
} finally {
    queryStatement.close();
    connection.close();
}
Can I run a Unix shell when insert some record on a specific table?
Can I run a Unix shell when insert some record on a specific table?
I need to run a Unix shell when a record be insert on a table. Is there a way in order to do that?
THanks,
Carlos.
1. Make a backup of the extproc.c file in the c:\orant\rdbms80\extproc directory.
2. Create a file called extern.c in the c:\orant\rdbms80\extproc directory.
The "extern.c" file :
#include <oci.h>
#define NullValue -1
#include<stdio.h>
#include<string.h>
long __declspec(dllexport) OutputString(context ,
path , path_ind ,
message , message_ind,
filemode , filemode_ind ,
len , len_ind )
char *path;
char *message;
char *filemode;
int len;
OCIExtProcContext *context;
short path_ind;
short message_ind;
short filemode_ind;
short len_ind;
{
FILE *file_handle;
int i ;
char str[3];
int value;
/* Check whether any parameter passing is null */
if (path_ind == OCI_IND_NULL || message_ind == OCI_IND_NULL ||
filemode_ind == OCI_IND_NULL || len_ind == OCI_IND_NULL ) {
text *initial_msg = (text *)"One of the Parameters Has a Null Value!!! ";
text *error_msg;
/* Allocate space for the error message text, and set it up.
We do not have to free this memory - PL/SQL will do that automatically. */
error_msg = OCIExtProcAllocCallMemory(context,
strlen(path) + strlen((char *)initial_msg) + 1);
strcpy((char *)error_msg, (char *)initial_msg);
/*strcat((char *)error_msg, path); */
OCIExtProcRaiseExcpWithMsg(context, 20001, error_msg, 0);
/* OCIExtProcRaiseExcp(context, 6502); */
return 0;
}
/* Open the file for writing. */
file_handle = fopen(path, filemode);
/* Check for success. If not, raise an error. */
if (!file_handle) {
text *initial_msg = (text *)"Cannot Create file ";
text *error_msg;
/* Allocate space for the error message text, and set it up.
We do not have to free this memory - PL/SQL will do that automatically. */
error_msg = OCIExtProcAllocCallMemory(context,
strlen(path) + strlen((char *)initial_msg) + 1);
strcpy((char *)error_msg, (char *)initial_msg);
strcat((char *)error_msg, path);
OCIExtProcRaiseExcpWithMsg(context, 20001, error_msg, 0);
return 0;
}
i = 0;
while (i < len)
{
/* Read the hexadecimal value(1). */
str[0] = message[i];
i++;
/* Read the hexadecimal value(2). */
str[1] = message[i];
/* Convert the first byte to the binary value. */
if (str[0] > 64 && str[0] < 71)
str[0] = str[0] - 55;
else
str[0] = str[0] - 48;
/* Convert the second byte to the binary value. */
if (str[1] > 64 && str[1] < 71)
str[1] = str[1] - 55;
else
str[1] = str[1] - 48;
/* Convert the hex value to binary (first & second byte). */
value = str[0] * 16 + str[1];
/* Write the binary data to the binary file. */
fprintf(file_handle,"%c",value);
i++;
}
/* Output the string followed by a newline. */
/* fwrite(message,len,1,file_handle); */
/* Close the file. */
fclose(file_handle);
return 0;
}
3. Use the make.bat available in the c:\orant\rdbms80\extproc directory. You
need to run vcvars32.bat file before running this batch file. This will
create a dll file.
4. Configure the tnsnames.ora and the listener.ora files.
The tnsnames.ora should contain the following entries.
extproc_connection_data.world =
  (DESCRIPTION =
    (ADDRESS =
      (PROTOCOL = IPC)
      (KEY = ORCL)
    )
    (CONNECT_DATA = (SID = extproc))
  )
The listener.ora should contain the following entries.
# P:\ORANT\NET80\ADMIN\LISTENER.ORA Configuration File:p:\orant\net80\admin\listener.ora
# Generated by Oracle Net8 Assistant
LISTENER8 =
  (ADDRESS = (PROTOCOL = TCP)(HOST = winnt_nsc)(PORT = 1521))
SID_LIST_LISTENER8 =
  (SID_LIST =
    (SID_DESC =
      (GLOBAL_DBNAME = winnt_nsc)
      (SID_NAME = ORCL)
    )
    (SID_DESC =
      (SID_NAME = extproc)
      (PROGRAM = extproc)
    )
  )
5. Login from sqlplus and issue the following statements.
create library externProcedures as 'C:\orant\RDBMS80\EXTPROC\extern.dll';
Create or replace PROCEDURE OutputString(
p_Path IN VARCHAR2,
p_Message IN VARCHAR2,
p_mode in VARCHAR2,
p_NumLines IN BINARY_INTEGER) AS EXTERNAL
LIBRARY externProcedures
NAME "OutputString"
With context
PARAMETERS (CONTEXT,
p_Path STRING,
p_path INDICATOR,
p_Message STRING,
p_message INDICATOR,
p_mode STRING,
p_mode INDICATOR,
p_NumLines INT,
p_numlines INDICATOR);
This is the pl/sql block used to write the contents of the BLOB into a file.
Set serveroutput on before running it.
SQL> desc lob_tab;
Name Null? Type
C1 NUMBER
C2 BLOB
lob_tab is the table which contains the blob data.
declare
i1 blob;
len number;
my_vr raw(10000);
i2 number;
i3 number := 10000;
begin
-- get the blob locator
SELECT c2 INTO i1 FROM lob_tab WHERE c1 = 2;
-- find the length of the blob column
len := DBMS_LOB.GETLENGTH(i1);
dbms_output.put_line('Length of the Column : ' || to_char(len));
-- Read 10000 bytes at a time
i2 := 1;
if len < 10000 then
-- If the col length is < 10000
DBMS_LOB.READ(i1,len,i2,my_vr);
outputstring('p:\bfiles\ravi.bmp',rawtohex(my_vr),'wb',2*len);
-- You have to convert the data to rawtohex format. Directly sending the buffer
-- data will not work
-- That is the reason why we are sending the length as the double the size of the data read
dbms_output.put_line('Read ' || to_char(len) || 'Bytes');
else
-- If the col length is > 10000
DBMS_LOB.READ(i1,i3,i2,my_vr);
outputstring('p:\bfiles\ravi.bmp',rawtohex(my_vr),'wb',2*i3);
dbms_output.put_line('Read ' || to_char(i3) || ' Bytes ');
end if;
i2 := i2 + 10000;
while (i2 < len ) loop
-- loop till entire data is fetched
DBMS_LOB.READ(i1,i3,i2,my_vr);
dbms_output.put_line('Read ' || to_char(i3+i2-1) || ' Bytes ');
outputstring('p:\bfiles\ravi.bmp',rawtohex(my_vr),'ab',2*i3);
i2 := i2 + 10000 ;
end loop;
end; -
I need to validate Input List Of Values Field when Inserted From Bean
I have two view objects, EmployeeView and DepartmentsView.
In EmployeeView I've changed Department_Id to an Input List of Values.
I've created a bean to insert data into the Employee table:
ApplicationModule am = ADFUtils.getApplicationModuleForDataControl("AppModuleDataControl");
ViewObject importedVO = am.findViewObject("EmployeesView");
try {
newEmp.setEmployeeId(new Number("3424"));
newEmp.setFirstName("Test Fname");
newEmp.setLastName("Test Lname");
newEmp.setEmail("[email protected]");
newEmp.setPhoneNumber("4643131345");
newEmp.setJobId("AD_VP");
newEmp.setDepartmentId(new oracle.jbo.domain.Number(999));
} catch (Exception e) {
// TODO: Add catch code
System.out.println("inside Catch");
e.printStackTrace();
}
I know that ADF validates the input list of values when a new value is inserted into it.
when I've enabled the debugger I've found that
<ViewObjectImpl><buildQuery> [604] SELECT DepartmentsEO.DEPARTMENT_ID, DepartmentsEO.DEPARTMENT_NAME, DepartmentsEO.MANAGER_ID, DepartmentsEO.LOCATION_ID, DepartmentsEO.ISDELETED FROM DEPARTMENTS DepartmentsEO WHERE ( ( (DepartmentsEO.DEPARTMENT_ID = :vc_temp_1 ) ) )
<ViewObjectImpl><bindParametersForCollection> [605] Bind params for ViewObject: [com.test.model.views.DepartmentsEOView]AppModule.__LOCAL_VIEW_USAGE_com_test_model_views_EmployeesEOView_DepartmentsView_findByVC_12_LOV_DepartmentId_lov__filterlist__vcr___
<OracleSQLBuilderImpl><bindParamValue> [606] Binding param "vc_temp_1": 999
<ViewObjectImpl><processViewCriteriaForRowMatch> [607] VCs converted to RowMatch: ( (DepartmentId = :vc_temp_1 ) )
<ViewRowImpl><handleListBindingMismatch> [608] No matching row found for list binding:LOV_DepartmentId for ViewRow:oracle.jbo.Key[34235 ]
<ViewRowImpl><handleListBindingMismatch> [609] --- filterList ValueMap key:DepartmentId, value:999
<DCBindingContainer><internalRefreshControl> [610] **** refreshControl() for BindingContainer :com_test_view_testInsertPageDef
<JUCtrlHierNodeBinding><release> [611] released: ROOT node binding:noCtrl_oracle_adfinternal_view_faces_model_binding_FacesCtrlHierNodeBinding_2, value:EmployeesViewIterator
<JUCtrlHierNodeBinding><release> [612] released: ROOT node binding:noCtrl_oracle_adfinternal_view_faces_model_binding_FacesCtrlHierNodeBinding_2, value:EmployeesViewIterator
<DCIteratorBinding><releaseDataInternal> [613] Releasing iterator binding:EmployeesViewIterator
<ApplicationPoolMessageHandler><doPoolMessage> [614] **** PoolMessage REQ ATTACH LWS
<ApplicationPoolMessageHandler><doPoolMessage> [615] **** PoolMessage REQ DETACH LWS
<DCJboDataControl><initializeApplicationModule> [616] (oracle.adf.model.bc4j.DataControlFactoryImpl.SyncMode = Immediate
<DCBindingContainer><internalRefreshControl> [617] **** refreshControl() for BindingContainer :com_test_view_testInsertPageDef
<DCBindingContainerState><validateToken> [618] Process BindingContainer state token(decompressed state):BCST:=0%V%=NEmployeesViewIterator=-D-,
This means that ADF tried to validate the input list of values "Department_Id" and didn't find any match.
I need an exception to be thrown when I try to set Department_ID to an invalid value (a value not in the range), like
newEmp.setDepartmentId(new oracle.jbo.domain.Number(999));
I don't want to validate it manually as I have a lot of fields.
I need an exception to be thrown on
newEmp.setDepartmentId(new oracle.jbo.domain.Number(999));
Thanks Timo
this is the code that I use
ApplicationModule am = ADFUtils.getApplicationModuleForDataControl("AppModuleDataControl");
ViewObject importedVO = am.findViewObject("EmployeesView");
EmployeesEOViewRowImpl newEmp = (EmployeesEOViewRowImpl)importedVO.createRow();
importedVO.insertRow(newEmp);
try {
newEmp.setEmployeeId(new Number("3424"));
newEmp.setFirstName("Test Fname");
newEmp.setLastName("Test Lname");
newEmp.setEmail("[email protected]");
newEmp.setPhoneNumber("4643131345");
newEmp.setJobId("AD_VP");
newEmp.setDepartmentId(new oracle.jbo.domain.Number(999));
} catch (Exception e) {
// TODO: Add catch code
System.out.println("inside Catch");
e.printStackTrace();
}