NCHAR OR NVARCHAR2
Hello,
I use Oracle Developer 6i with an Oracle 9i database, and I have an NVARCHAR2 column. When I write an insert query in Developer, it gives me an error:
<<<<< character set mismatch >>>>>>>
Column in the database:
create table table_2 (
t_id number(3) primary key,
t_name nvarchar2(100)
);
---- insert query ---------
insert into table_2 (
t_id,
t_name
) values (
:t_id,
:t_name
);
Hi,
take a look at the whitepapers on Globalization home page at:
http://www.oracle.com/technology/tech/globalization/index.html
If I remember correctly, you need an "N" in front of the string literal you are inserting, to tell Oracle that the literal is in the national character set.
Monica
Similar Messages
-
Question about NCHAR and NVARCHAR2 in Oracle 8.1.7
I am using Oracle 8.1.7 on Red Hat 7.2. The database encoding is UTF8.
I created a table with a field: col1 nvarchar2(5).
But I can insert only ONE Chinese character into that field.
Isn't it supposed to hold 5?
(I can insert 5 English characters into the field.)
It seems that a Chinese character occupies 3 bytes in UTF8.
Doesn't the size of NCHAR and NVARCHAR2 refer to the size of CHARACTERS not bytes?
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
NLS_CALENDAR GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE AMERICAN
NLS_CHARACTERSET UTF8
NLS_SORT BINARY
NLS_TIME_FORMAT HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZH:TZM
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZH:TZM
NLS_DUAL_CURRENCY $
NLS_NCHAR_CHARACTERSET UTF8
NLS_COMP BINARY

The national character set in 8i was only meant to support a few special Asian character sets. Even though it is set to whatever you set the database to, the NCHAR columns will not work properly for UTF8. You will need to use SQL CHAR fields in order to support UTF8 in 8i, and the semantics are byte. In Oracle9i the national character set exclusively supports Unicode via the character sets UTF8 and AL16UTF16. The default is AL16UTF16, which is UTF-16. The 9i NCHAR fields use character semantics, so field(5) means 5 characters, not bytes. There are a couple of papers on our home page that talk about Unicode support at:
http://technet.oracle.com/tech/globalization/content.html
Oracle Unicode Database Support
Migration to Unicode Datatypes for Multilingual Databases and Applications in Oracle9i -
Euro-sign (and Greek) doesn't work even with nchar/nvarchar2
This is something that has been blocking me for a few days now, and I'm running out of ideas.
Basically, the problem can be summarised as follows:
declare
text nvarchar2(100) := 'Make €€€ fast!';
begin
dbms_output.put_line( text );
end;

And the output (both in SQL Developer and Toad) is:
Make ¿¿¿ fast!

See, I was under the impression that by using NCHAR and NVARCHAR2 you avoid the problems you get with character sets. What I need this for is to check (in PL/SQL) how long a string is in 7-bit units when converted to the GSM 03.38 character set. That character set has 128 characters: mostly Latin characters, a couple of Greek characters that differ from the Latin ones, and some Scandinavian glyphs.
Some 10 other characters, including square brackets and the euro sign, are escaped and take two 7-bit units. So, the above message takes 17 7-bit spaces.
However, if I make a PL/SQL function that defines an NVARCHAR2(128) with the 128 standard characters and another NVARCHAR2(10) for the extended characters like the euro sign (the ones that take two 7-bit units), and do an INSTR() for each character in the source string, the euro sign gets converted to an upside-down question mark. Because the delta (the first Greek character in the GSM 03.38 character set) also becomes an upside-down question mark, the function thinks the euro sign is in fact a delta and assigns it a length of 1.
To try to solve it, I created a table with an nchar(1) for the character and a smallint for the number of units it occupies. The characters are entered correctly, and show as euro signs and Greek letters, but as soon as I do a query, I get the same problem again. The code for the function is below:
function get_gsm_0338_length(
text_content in nvarchar2
) return integer
as
v_offset integer;
v_length integer := 0;
v_char nchar(1);
begin
for i in 1..length(text_content)
loop
v_char := substr( text_content, i, 1 );
select l
into v_offset
from gsm_0338_charset
where ch = v_char;
v_length := v_length + v_offset;
end loop;
return v_length;
exception
when no_data_found then
return length(text_content) * 2;
end get_gsm_0338_length;

Does anybody have any idea how I can get this to work properly?
Thanks,
- Peter

Well, the person there used a VARCHAR2, whereas I'm using an NVARCHAR2. I understand that you need the right code page between the client and the database if you use VARCHAR2, which is exactly why I used NVARCHAR2.
However, if I call the function from /Java/, it does work (I found out just now). But this doesn't explain why SQL Developer and Toad are being difficult, and I'm afraid that, because this function is part of a much bigger application, I'll run into the same problem.
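As an aside, the counting logic Peter's function implements can be sketched client-side. A rough Python equivalent follows; the character tables here are abridged illustrations of the GSM 03.38 alphabet, not the authoritative set:

```python
# Partial GSM 03.38 basic alphabet (1 septet each); control chars omitted.
GSM_BASIC = set(
    "@£$¥èéùìòÇØøÅåΔ_ΦΓΛΩΠΨΣΘΞÆæßÉ !\"#¤%&'()*+,-./0123456789:;<=>?"
    "¡ABCDEFGHIJKLMNOPQRSTUVWXYZÄÖÑܧ¿abcdefghijklmnopqrstuvwxyzäöñüà"
)
# Extension-table characters are escaped and cost 2 septets each.
GSM_EXTENDED = set("^{}\\[~]|€")

def gsm_0338_length(text: str) -> int:
    """Septet count of text, or len(text) * 2 if any character is
    unsupported (mirroring the NO_DATA_FOUND branch of the PL/SQL)."""
    total = 0
    for ch in text:
        if ch in GSM_BASIC:
            total += 1
        elif ch in GSM_EXTENDED:
            total += 2
        else:
            return len(text) * 2
    return total
```

With this sketch, "Make €€€ fast!" comes out to 17 seven-bit units, matching the count in the question.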
- Peter -
Oracle defaultNChar=true SLOW on NCHAR/NVARCHAR2
Hi all,
I am using a JDBC Prepared Statement with a bunch of parameters using setString(pos, value). The underlying columns on the tables are all NCHAR and NVARCHAR2. I have set the Oracle JDBC driver's "defaultNChar=true" so that Oracle DB would always treat my parameters as national language characters. The driver file is "ojdbc6.jar".
My problem: My parametrized query is extremely slow with "defaultNChar=true". But as soon as I set "defaultNChar=false" the query is ultra fast (3 seconds).
Query usage looks like this:
String sql = "INSERT INTO MYTABLE_ERROR(MY_NAME,MY_FLAG,MY_VALUE) "
+ "SELECT ? AS MY_NAME,"
+ "? AS MY_FLAG,v.MY_VALUE"
+ " FROM OTHER_TABLE v"
+ " JOIN ( SELECT * FROM ... iv ... WHERE iv.MY_NAME = ? ) rule1 "
+ " ON v.\"MY_NAME\"=rule1.\"MY_NAME\" AND v.\"MY_VALUE\"=rule1.\"MY_VALUE\""
+ " WHERE rule1.\"MY_NAME\" = ? AND v.\"MY_VALUE\" = ?";
preStatement = conn.prepareStatement (sql);
int count = 1;
for (String p : params) {
    // all three variants are slow with defaultNChar=true:
    //preStatement.setNString (count++, p);
    //preStatement.setObject (count++, p, Types.NVARCHAR);
    preStatement.setString (count++, p);
}
I have been trying to find the root cause of why my prepared statements executed against an "Oracle Database 11g Release 11.2.0.3.0 - 64bit Production" DB are slow with a JDBC driver "Oracle JDBC driver, 11.2.0.3.0". I could not find any clue!
I even got the DB NLS config hoping to find anything, but I am not sure here either:
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
NLS_CALENDAR GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE AMERICAN
NLS_CHARACTERSET AL32UTF8
NLS_SORT BINARY
NLS_TIME_FORMAT HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZR
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR
NLS_DUAL_CURRENCY $
NLS_NCHAR_CHARACTERSET AL16UTF16
NLS_COMP BINARY
Please help!
Thanks,
H.

UPDATE: It looks like the query gets stuck when using "defaultNChar=true". I am seeing this in JConsole:
Total blocked: 1 Total waited: 1
Stack trace:
java.net.SocketInputStream.socketRead0(Native Method)
java.net.SocketInputStream.read(Unknown Source)
java.net.SocketInputStream.read(Unknown Source)
oracle.net.ns.Packet.receive(Packet.java:311)
oracle.net.ns.DataPacket.receive(DataPacket.java:103)
oracle.net.ns.NetInputStream.getNextPacket(NetInputStream.java:312)
oracle.net.ns.NetInputStream.read(NetInputStream.java:257)
oracle.net.ns.NetInputStream.read(NetInputStream.java:182)
oracle.net.ns.NetInputStream.read(NetInputStream.java:99)
oracle.jdbc.driver.T4CSocketInputStreamWrapper.readNextPacket(T4CSocketInputStreamWrapper.java:121)
oracle.jdbc.driver.T4CSocketInputStreamWrapper.read(T4CSocketInputStreamWrapper.java:77)
oracle.jdbc.driver.T4CMAREngine.unmarshalUB1(T4CMAREngine.java:1173)
oracle.jdbc.driver.T4CTTIfun.receive(T4CTTIfun.java:309)
oracle.jdbc.driver.T4CTTIfun.doRPC(T4CTTIfun.java:200)
oracle.jdbc.driver.T4C8Oall.doOALL(T4C8Oall.java:543)
oracle.jdbc.driver.T4CPreparedStatement.doOall8(T4CPreparedStatement.java:238)
oracle.jdbc.driver.T4CPreparedStatement.executeForRows(T4CPreparedStatement.java:1446)
oracle.jdbc.driver.OracleStatement.doExecuteWithTimeout(OracleStatement.java:1757)
oracle.jdbc.driver.OraclePreparedStatement.executeInternal(OraclePreparedStatement.java:4372)
oracle.jdbc.driver.OraclePreparedStatement.executeUpdate(OraclePreparedStatement.java:4539)
- locked oracle.jdbc.driver.T4CConnection@7f2315e5
oracle.jdbc.driver.OraclePreparedStatementWrapper.executeUpdate(OraclePreparedStatementWrapper.java:5577)
com.mycomp.test.DriverTest.fireStatement(DriverTest.java:253) -
NCHAR, NVARCHAR2 with JDBC THICK drivers
Hi,
I am using weblogic Thick jdbc drivers. We have a requirement of storing data
in multiple languages. So we have added two columns in the oracle 9i table with
data type NCHAR and NVARCHAR2.
I tried the code using Oracle jdbc thin drivers. Its working fine.
But when I tried with weblogic thick drivers its not able to read the data...
Its reading null values.
Any suggestions/links/guidelines would of great help.
Thanks & Regards,
Purvesh Vora

Purvesh Vora wrote:
Hi,
I am using weblogic Thick jdbc drivers. We have a requirement of storing data
in multiple languages. So we have added two columns in the oracle 9i table with
data type NCHAR and NVARCHAR2.
I tried the code using Oracle jdbc thin drivers. Its working fine.
But when I tried with weblogic thick drivers its not able to read the data...
Its reading null values.
Any suggestions/links/guidelines would of great help.
Thanks & Regards,
Purvesh Vora

Hi. Our type-2 driver (weblogic.jdbc.oci.Driver) uses Oracle's OCI, which
may need some OCI-specific charset properties and/or environment variables
set for what you need. Please check our docs. I wouldn't expect nulls though,
maybe corrupted data, but not nulls... What happens if you try Oracle's own
thick driver? (all you would need to do is to change their URL).
Joe -
I have recently heard that Oracle is no longer going to support the NCHAR and NVARCHAR2
data types. Can anyone tell me anything about this? I cannot find any documentation for this claim.
Thanks
[email protected]
There is no special external data type for NCHAR or NVARCHAR2 columns. You can use any suitable data type of your choice.
-
Cannot insert Chinese character into nvarchar2 field
I have tested in two environments:
1. Database Character Set: ZHS16CGB231280
National Character Set: AL16UTF8
If the field type in the table is varchar2 or nvarchar2, the provider can read and write Chinese correctly.
2. Database Character Set:WE8MSWIN1252
National Character Set: AL16UTF8
The provider cannot read and write Chinese correctly even when the field type is nvarchar2.
I find that in the second case, both the MS .NET Managed Provider for Oracle and the Oracle Managed Data Provider cannot read and write NCHAR or NVARCHAR2 fields. The data inserted into these fields becomes question marks.
Even if I changed the NLS_LANG registry to SIMPLIFIED CHINESE_CHINA.ZHS16CGB231280, the result is the same.
For the second situation, only after I change the Database Character Set to ZHS16CGB231280 with ALTER DATABASE CHARACTER SET statement, can I insert Chinese correctly.
Does any know why I cannot insert Chinese characters into Unicode fields when the Database Character Set is WE8MSWIN1252? Thanks.
Regards,
Jason

Hi Jason,
First of all, I am not familiar with MS .NET Managed Provider for Oracle or Oracle Managed Data Provider.
How did you insert these Simplified Chinese characters into the NVARCHAR2 column ? Are they hardcoded as string literals as part of the SQL INSERT statement ? If so, this could be because, all SQL statements are converted into the database character set before they are parsed by the SQL engine; hence these Chinese characters would be lost if your db character set was WE8MSWIN1252 but not when it was ZHS16CGB231280.
Two workarounds, both of which involve removing the hardcoded Chinese characters:
1. Rewrite your string literal using the SQL function UNISTR().
2. Use bind variables instead of text literals in the SQL.
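The statement-text conversion Nat describes can be mimicked outside Oracle. In this Python sketch, cp1252 stands in for the WE8MSWIN1252 database character set (an analogy for the lossy recoding, not the actual server code path):

```python
# SQL statement text is converted into the database character set before
# parsing. Windows-1252 has no Chinese characters, so a hardcoded literal
# loses them on the way in, and the NVARCHAR2 column ends up holding
# replacement characters.
literal = "\u4e2d\u6587"  # 中文
recoded = literal.encode("cp1252", errors="replace").decode("cp1252")
assert recoded == "??"  # what actually reaches the column
```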
Thanks
Nat -
Upgrading user tables with NCHAR columns in 10.2.0.1
Hi,
we have upgraded our database to 10.2.0.1 from 8.1.7.4 through DBUA
Before upgradation our database had the WE8ISO8859P1 character set and the WE8ISO8859P1 national character set.
In 10g we have the WE8ISO8859P1 character set and the AL16UTF16 national character set.
Now to upgrade user tables with NCHAR columns, we have to perform the following steps to run the scripts :
SQL> SHUTDOWN IMMEDIATE
SQL> STARTUP RESTRICT
SQL> @ utlnchar.sql
SQL> @ n_switch.sql
SQL> SHUTDOWN IMMEDIATE
SQL> STARTUP
But when I query for the NCHAR, NVARCHAR2, or NCLOB datatypes for verification, the NCHAR columns in the user tables have not been upgraded; they still remain the same.
Kindly suggest for the same.
Regards
Milin

Kindly explain or post the following:
- the 'query' you used after 'upgradation' (a word not occurring in any dictionary and probably specific to the Hindi-English) for 'verification'
- what result you expected
- what 'still remains the same' means
Kindly consider that no one is looking over your shoulder. If you don't plan to post anything other than "It doesn't work", paid support might be a better option for you.
Volunteers like us are not paid to tear the information out of you.
Sybrand Bakker
Senior Oracle DBA -
Character semantics when setting the length of NVARCHAR2 field (Oracle 10g)
SQL> CREATE TABLE viv_naseleni_mesta (
2 naseleno_mjasto_id INTEGER NOT NULL PRIMARY KEY,
3 ime NVARCHAR2(15 CHAR) NOT NULL,
4 oblast_id INTEGER NOT NULL REFERENCES viv_oblasti(oblast_id),
5 grad CHAR NOT NULL CHECK (grad IN (0,1)),
6 de_timestamp TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
7 de_update_time TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
8 de_operator INTEGER DEFAULT NULL,
9 de_versija NUMBER(5,2) DEFAULT NULL
10 );
ime NVARCHAR2(15 CHAR) NOT NULL,
ERROR at line 3:
ORA-00907: missing right parenthesis
It seems that the DB does not accept the length of a field to be given in character
semantics. I have several sources that say it is possible, but is that
only possible in 11g?

According to the docs, NCHAR and NVARCHAR2 types always use character semantics, so specifying CHAR explicitly isn't meaningful (and is not accepted).
http://download.oracle.com/docs/cd/B19306_01/server.102/b14225/ch3globenv.htm#sthref385
Cheers
Sarma. -
I'm trying to wrap my head around multi-byte character set support and I'm having quite the time. Our production database was created before my time, so I don't have a good reason why it was created with the WE8MSWIN1252 character set, but here's our setup.
10.2.0.2 ( upgraded from 8.1.7->9.2.0 ) EE on AIX 5.3
character set WE8MSWIN1252
Ncharset AL16UTF16
I created a test table
create table balvey.tsting (COL1 VARCHAR2(50), COL2 NVARCHAR2(50), id number, descrip varchar2(30));
and inserted some records via different connections from a Windows client, and none of them really gave me what I was expecting. I used SQL Developer connected to a 10.2 db with the AL32UTF8 character set and the following select to get the Greek phi symbol 'Φ', then copy/pasted from there into each insert statement.
From Putty connection on win client to the server.
insert into balvey.tsting values
('.', '.', 1, 'phi aix/slqplus 1252');
insert into balvey.tsting values
(chr(52902), chr(52902), 2, '52902 aix/sqlplus 1252');
From sql*plus on win client.
insert into balvey.tsting values
('F', 'F', 3, 'phi win/sqlplus 1252');
insert into balvey.tsting values
(chr(52902), chr(52902), 4, '52902 win/sqlplus 1252');
From sqldeveloper on win client
insert into balvey.tsting values
('Φ', 'Φ', 5, 'Φ win/sqldev 1252');
insert into balvey.tsting values
(chr(52902), chr(52902), 6, '52902 win/sqldev 1252');
Then selecting back out of the database from each client application I didn't get the Φ from any of the records. It didn't surprise me that I didn't get it from the varchar2 column, but I thought I would have gotten it from the nvarchar2 columns, at least from the sqldev application.
here are the results from selecting back out via each client app.
from aix client
system@JDEDEV> select * from balvey.tsting order by id;
COL1 COL2 ID DESCRIP
. . 1 phi aix/slqplus
¦ ¦ 2 52902 aix/sqlplus
F F 3 phi win/sqlplus
¦ ¦ 4 52902 win/sqlplus
¦ ¦ 5 ¦ win/sqldev
¦ ¦ 6 52902 win/sqldev
6 rows selected.
from sqlplus on win client
SQL> select * from balvey.tsting order by id;
COL1 COL2 ID DESCRIP
. . 1 phi aix/slqplus
¦ ¦ 2 52902 aix/sqlplus
F F 3 phi win/sqlplus
¦ ¦ 4 52902 win/sqlplus
¦ ¦ 5 ¦ win/sqldev
¦ ¦ 6 52902 win/sqldev
from sqldev on win client
select * from balvey.tsting order by id;
. . 1 phi aix/slqplus
¦ ¦ 2 52902 aix/sqlplus
F F 3 phi win/sqlplus
¦ ¦ 4 52902 win/sqlplus
¦ ¦ 5 ¦ win/sqldev
¦ ¦ 6 52902 win/sqldev

Well, I'm not running into ORA- errors or using an 8i client to connect, but I think indirectly you have helped clear up some of the confusion. That ML note points to ML #227330.1, and point #14 in that note is: "14. I'm inserting <special character> in a Nchar or Nvarchar2 col but it comes back as ? or ¿ ...". I wasn't necessarily getting the ? or ¿, but that led me to the suggestion to add the setting to SQL Developer that allows the N flag in the insert statement, like so:
insert into balvey.tsting values
('Φ', N'Φ', 9, 'NΦ win/sqldev');
Which I had already tried, but it didn't work until the setting change. Selecting back out via SQL Developer then does return the Φ from the NVARCHAR2 field. The note also pointed to using SQL*Loader to load from a flat file, since SQL*Plus is not a Unicode application.
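Incidentally, the 52902 used in the CHR() inserts above is just the two UTF-8 bytes of Φ (0xCE 0xA6) read as a single integer, which is why CHR(52902) yields Φ only when the database character set is UTF-8 based. A quick Python check (illustration only):

```python
phi = "\u03a6"  # Greek capital phi, Φ
utf8 = phi.encode("utf-8")
assert utf8 == b"\xce\xa6"
assert utf8[0] * 256 + utf8[1] == 52902  # the CHR() argument above

# WE8MSWIN1252 has no Φ, which is why the 1252 clients lose it:
try:
    phi.encode("cp1252")
    representable = True
except UnicodeEncodeError:
    representable = False
assert not representable
```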
So while I'm still far from understanding all there is to know about character sets, I'm not quite as confused now. Thanks. -
What are the merits of NCHAR and NVARCHAR2 for national languages?
dear all,
When I set a field's datatype to NCHAR or NVARCHAR2 in order to store Simplified Chinese, I found it does not work and I am prompted with "character set does not match", although both client and server character sets have been set to Simplified Chinese.
Yet once I set them back to the normal CHAR or VARCHAR2 datatypes, Simplified Chinese is stored just fine.
So my question is: what are the actual merits of NCHAR and NVARCHAR2, and if there are any, how can I use these two datatypes?
Thanks for your answer.
regards,
fredreick
In Oracle8i, specifying an NCHAR character set allows you to specify an alternate character set from the database character set for use in NCHAR, NVARCHAR2, and NCLOB columns. This is particularly useful for customers using a variable-width multibyte database character set, because NCHAR can support Asian fixed-width multibyte encoding schemes, whereas the database character set cannot. The benefits of using a fixed-width multibyte encoding over a variable-width one are:
• optimized string-processing performance on NCHAR, NVARCHAR2, and NCLOB columns
• ease of programming with a fixed-width multibyte character set as opposed to a variable-width multibyte character set.
The NCHAR datatype has been redefined in Oracle9i to be a Unicode datatype exclusively. You can specify one of the following two Oracle character sets as the national character set:
• AL16UTF16
• UTF8
The recommendation would be to use CHAR and VARCHAR2 for Simplified Chinese support, given the coming change of NCHAR to an exclusively Unicode data type.
Convert number field to nvarchar2
How??? I tried using the to_char and convert functions but I keep getting a 'character set mismatch'...
Try using "TRANSLATE ... USING". It converts text into the specified character set, for conversions between the database character set and the national character set.
The text argument is the expression to be converted.
Specifying the USING CHAR_CS argument converts text into the database character set. The output datatype is VARCHAR2.
Specifying the USING NCHAR_CS argument converts text into the national character set. The output datatype is NVARCHAR2.
This function is similar to the Oracle CONVERT function, but must be used instead of CONVERT if either the input or the output datatype is being used as NCHAR or NVARCHAR2.
Example
The examples below use the following table and table values:
CREATE TABLE t1 (char_col CHAR(20),
nchar_col nchar(20));
INSERT INTO t1
VALUES ('Hi', N'Bye');
SELECT * FROM t1;
CHAR_COL NCHAR_COL
Hi Bye
UPDATE t1 SET
nchar_col = TRANSLATE(char_col USING NCHAR_CS);
UPDATE t1 SET
char_col = TRANSLATE(nchar_col USING CHAR_CS);
SELECT * FROM t1;
CHAR_COL NCHAR_COL
Hi Hi -
hi all
I want to write a SELECT statement in a procedure without an INTO clause.
Is it possible? If yes, how can I do this?
If it's not possible, and we need to return a number of rows from the procedure, how can I do that?
Thanks in advance.

User1728 wrote:
actually I want to return datatable-type data from the procedure

What does "datatable type" mean? "Datatable" is not a type in PL/SQL, so I assume it has some meaning in your client programming language. What Oracle data type are you trying to return?
My guess would be that you are trying to return a REF CURSOR so that you want something like
SQL> create procedure return_rc( p_rc OUT sys_refcursor )
2 as
3 begin
4 open p_rc for select * from emp;
5 end;
6 /
Procedure created.
SQL> variable rc ref cursor;
Usage: VAR[IABLE] [ <variable> [ NUMBER | CHAR | CHAR (n [CHAR|BYTE]) |
VARCHAR2 (n [CHAR|BYTE]) | NCHAR | NCHAR (n) |
NVARCHAR2 (n) | CLOB | NCLOB | REFCURSOR |
BINARY_FLOAT | BINARY_DOUBLE ] ]
SQL> variable rc refcursor;
SQL> exec return_rc( :rc );
PL/SQL procedure successfully completed.
SQL> print rc
EMPNO ENAME JOB MGR HIREDATE SAL COMM
DEPTNO
7369 SMITH CLERK 7902 17-DEC-80 800
20
7499 ALLEN SALESMAN 7698 20-FEB-81 1600 300
30
7521 WARD SALESMAN 7698 22-FEB-81 1250 500
30
EMPNO ENAME JOB MGR HIREDATE SAL COMM
DEPTNO
7566 JONES MANAGER 7839 02-APR-81 2975
20
7654 MARTIN SALESMAN 7698 28-SEP-81 1250 1400
30
<<more data snipped>>

Justin
Using a query with bind variable with columns defined as raw
Hi,
We are on Oracle 10.2.0.4 on Solaris 8. I have a table that has 2 columns defined as RAW(18). I have a query from the front end that queries these two RAW columns using bind variables. The query has a performance issue that I need to reproduce, but my difficulty is how to test the query in SQL*Plus using bind variables (the VARIABLE syntax fails for columns with the RAW datatype).
SQL> DESC TEST
Name Null? Type
ID1 RAW(18)
ID2 RAW(18)
SQL> variable b1 RAW(18);
Usage: VAR[IABLE] [ <variable> [ NUMBER | CHAR | CHAR (n [CHAR|BYTE]) |
VARCHAR2 (n [CHAR|BYTE]) | NCHAR | NCHAR (n) |
NVARCHAR2 (n) | CLOB | NCLOB | REFCURSOR |
BINARY_FLOAT | BINARY_DOUBLE ] ]
The above is the error I get - I can't declare a variable as RAW.
SQL> variable b2 RAW(18);
Usage: VAR[IABLE] [ <variable> [ NUMBER | CHAR | CHAR (n [CHAR|BYTE]) |
VARCHAR2 (n [CHAR|BYTE]) | NCHAR | NCHAR (n) |
NVARCHAR2 (n) | CLOB | NCLOB | REFCURSOR |
BINARY_FLOAT | BINARY_DOUBLE ] ]
SQL> variable b3 RAW(18);
Usage: VAR[IABLE] [ <variable> [ NUMBER | CHAR | CHAR (n [CHAR|BYTE]) |
VARCHAR2 (n [CHAR|BYTE]) | NCHAR | NCHAR (n) |
NVARCHAR2 (n) | CLOB | NCLOB | REFCURSOR |
BINARY_FLOAT | BINARY_DOUBLE ] ]
--now the actual query below
SQL> SELECT * FROM TEST WHERE ID1=:B1 AND ID2 BETWEEN :B2 AND :B3;
SP2-0552: Bind variable "B3" not declared.
(This fails due to the earlier errors.)

Also, this is a third-party app schema, so we don't have the option of modifying the data types of the columns.
Thanks,
Try an anonymous PL/SQL block:
declare
b1 RAW(18);
b2 RAW(18);
b3 RAW(18);
begin
b1:=..;
b2:=..;
b3:=..;
SELECT col1, col2, ..
INTO ...
FROM TEST
WHERE ID1 = b1
AND ID2 BETWEEN b2 AND b3;
end;
/ -
For the last 4 hours I have been trying to resolve the issue below.
Please, please... please help me out.
I can't figure out what the problem is. Everything looks alright, so why does the error occur on a different database?
SQL*Plus: Release 10.1.0.2.0 - Production on Wed Jul 25 11:57:49 2007
Copyright (c) 1982, 2004, Oracle. All rights reserved.
Connected to:
Oracle9i Enterprise Edition Release 9.2.0.6.0 - 64bit Production
With the Partitioning and OLAP options
JServer Release 9.2.0.6.0 - Production
SQL> select * from v$version
2 /
BANNER
Oracle9i Enterprise Edition Release 9.2.0.6.0 - 64bit Production
PL/SQL Release 9.2.0.6.0 - Production
CORE 9.2.0.6.0 Production
TNS for IBM/AIX RISC System/6000: Version 9.2.0.6.0 - Production
NLSRTL Version 9.2.0.6.0 - Production
SQL>
SQL> SELECT * FROM GLOBAL_NAME
2 /
GLOBAL_NAME
PNRDPD
SQL>
SQL> CREATE OR REPLACE PACKAGE ORAPDEV_PKG IS
2 PROCEDURE DUAL_DATA(P_DEPTNO IN NUMBER, RESULTSET IN OUT SYS_REFCURSOR);
3 END ORAPDEV_PKG;
4 /
Package created.
SQL>
SQL>
SQL> CREATE OR REPLACE PACKAGE BODY ORAPDEV_PKG IS
2 PROCEDURE DUAL_DATA(P_DEPTNO IN NUMBER, RESULTSET IN OUT SYS_REFCURSOR) IS
3 BEGIN
4 OPEN RESULTSET FOR 'SELECT EMPNO FROM SCOTT.EMP WHERE DEPTNO = '||P_DEPTNO;
5 END;
6 END ORAPDEV_PKG;
7 /
Package body created.
SQL>
SQL>
SQL> VARIABLE DTA REFCURSOR;
SQL> EXEC ORAPDEV_PKG.DUAL_DATA(10,:DTA);
PL/SQL procedure successfully completed.
SQL> PRINT DTA
EMPNO
7782
7839
7934
SQL>
SQL>
SQL>
SQL> GRANT EXECUTE ON ORAPDEV_PKG TO PUBLIC;
Grant succeeded.
SQL> New Session of Another Database
SQL> CONNECT XXX/XXX@CISDPD
Connected.
SQL> create public database link PNRAPD connect to xxx identified by xxx using 'pnrapd';
Database link created.
SQL> SELECT COUNT(*) FROM ALL_TABLES@PNRAPD
2 /
COUNT(*)
722
SQL>
SQL> CREATE PUBLIC SYNONYM ORAPDEV_PKG FOR ORAPDEV_PKG@PNRAPD;
Synonym created.
SQL>
SQL> VARIABLE DTA REFCURSOR;
SQL> EXEC ORAPDEV_PKG.DUAL_DATA(10,:DTA);
BEGIN ORAPDEV_PKG.DUAL_DATA(10,:DTA); END;
ERROR at line 1:
ORA-01001: invalid cursor
ORA-02063: preceding line from PNRAPD
SQL>
SQL>
SQL> SELECT * FROM GLOBAL_NAME;
GLOBAL_NAME
CISDPD
SQL>
SQL> EXEC ORAPDEV_PKG.DUAL_DATA@pnrapd(10,:DTA);
BEGIN ORAPDEV_PKG.DUAL_DATA@pnrapd(10,:DTA); END;
ERROR at line 1:
ORA-01001: invalid cursor
ORA-02063: preceding line from PNRAPD
SQL> SELECT COUNT(*) FROM ALL_TABLES@PNRAPD
2 /
COUNT(*)
722
SQL>
SQL> select * from v$version
2 /
BANNER
Oracle9i Enterprise Edition Release 9.2.0.7.0 - 64bit Production
PL/SQL Release 9.2.0.7.0 - Production
CORE 9.2.0.7.0 Production
TNS for Linux: Version 9.2.0.7.0 - Production
NLSRTL Version 9.2.0.7.0 - Production
SQL>

Again... :(
how can I resolve this issue ??
PNRAPD DATABASE
CREATE OR REPLACE PACKAGE orapdev_pkg IS
TYPE rcursor IS REF CURSOR;
PROCEDURE dual_data(p_deptno IN NUMBER
,resultset IN OUT rcursor);
END orapdev_pkg;
CREATE OR REPLACE PACKAGE BODY orapdev_pkg IS
PROCEDURE dual_data(p_deptno IN NUMBER
,resultset IN OUT rcursor) IS
BEGIN
OPEN resultset FOR 'SELECT EMPNO FROM SCOTT.EMP WHERE DEPTNO = ' || p_deptno;
END;
END orapdev_pkg;
CISDPD DATABASE
SQL> var dta ORAPDEV_PKG.RCURSOR;
Usage: VAR[IABLE] [ <variable> [ NUMBER | CHAR | CHAR (n [CHAR|BYTE]) |
VARCHAR2 (n CHAR) | NCHAR | NCHAR (n) |
NVARCHAR2 (n) | CLOB | NCLOB | REFCURSOR |
BINARY_FLOAT | BINARY_DOUBLE ] ]
SQL>
SQL>
SQL>
SQL> declare
2 vempno scott.emp.empno@pnrapd%type;
3 DTA ORAPDEV_PKG.RCURSOR;
4 begin
5 ORAPDEV_PKG.DUAL_DATA(10,DTA);
6 LOOP
7 FETCH DTA
8 INTO vempno;
9 EXIT WHEN dta%NOTFOUND;
10 DBMS_OUTPUT.PUT_LINE(vempno);
11 END LOOP;
12 CLOSE dta;
13 end;
14 /
declare
ERROR at line 1:
ORA-01001: invalid cursor
ORA-06512: at line 7

Please help.