Special characters in a field in an Oracle table.
Hi,
I want to find all the special characters in a particular field, say name. How can I do it?
Can you please help me?
Regards
Sridhar
SQL> ed
Wrote file afiedt.buf
1 with t as (select 1 as id, 'Fred' as name from dual union all
2 select 2, 'Bob!' from dual union all
3 select 3, 'Jimmy123' from dual union all
4 select 4, '45Tim' from dual union all
5 select 5, 'Ken5@Thomas.' from dual)
6 -- end of test data
7 select id, regexp_replace(name, '[A-Za-z0-9]') as special_chrs
8* from t
SQL> /
ID SPECIAL_CHRS
1
2 !
3
4
5 @.
SQL>
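Building on the transcript above, a REGEXP_LIKE filter returns only the rows that actually contain a special character; this is a sketch against the same test data:

```sql
-- Keep only rows whose NAME has at least one character
-- outside A-Z, a-z, 0-9 (same test data as above).
with t as (select 1 as id, 'Fred' as name from dual union all
           select 2, 'Bob!' from dual union all
           select 3, 'Jimmy123' from dual union all
           select 4, '45Tim' from dual union all
           select 5, 'Ken5@Thomas.' from dual)
select id, name
from   t
where  regexp_like(name, '[^A-Za-z0-9]');
-- Rows 2 ('Bob!') and 5 ('Ken5@Thomas.') are returned.
```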
Similar Messages
-
Special Characters in CONTAINS section of ORACLE TEXT
Can we use special characters in the CONTAINS section of Oracle Text?
Ex:
Select count(*) from Table_Name where CONTAINS(Column_Name, '%SearchString#%') > 1;
If I introduce characters like @, #, $, ^, &, *, (, ) then I get the error below:
ERROR at line 1:
ORA-29902: error in executing ODCIIndexStart() routine
ORA-20000: Oracle Text error:
DRG-51030: wildcard query expansion resulted in too many terms
Any suggestions, please?
Check this:
http://download-east.oracle.com/docs/cd/B10501_01/text.920/a96518/cqspcl.htm#1360 -
Dynamic SQL Query to Find Special Characters in Table columns
Hi,
I am new to OTN FORUMS.
I am trying to find the columns of a table which have special characters in them.
I am planning on using this query
select 'select INSTR('||column_name||', chr(0))
from '||table_name||' where INSTR('||column_name||', chr(0)) > 0' from user_tab_columns
where table_name = 'Account'
and spool the output to run as a script.
Is this the right way, or do you suggest any modifications to the query?
Thanks in advance.
Hi,
I think your basic approach is right. Since you can't hard-code the table- or column names into the query, you'll need dynamic SQL.
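For the SQL-from-SQL route, here is a hedged sketch of a generator that emits one EXISTS probe per character column. The quoting is fixed relative to the question's attempt, and note that USER_TAB_COLUMNS stores names in uppercase, so 'ACCOUNT' is assumed here:

```sql
-- Emits one probe statement per CHAR/VARCHAR2 column of the
-- table; spool the output and run it as a script. Each probe
-- prints the table/column pair only if some row contains CHR(0).
select 'SELECT '''||table_name||''', '''||column_name||''' FROM dual'
    || ' WHERE EXISTS (SELECT NULL FROM '||table_name
    || ' WHERE INSTR('||column_name||', CHR(0)) > 0);' as probe_sql
from   user_tab_columns
where  table_name = 'ACCOUNT'
and    data_type in ('CHAR', 'VARCHAR2');
```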
Instead of SQL-from-SQL (that is, writing a pure SQL query whose output is SQL code), you could do the whole job in PL/SQL, but I don't see any huge advantage either way.
When you say "special characters", do you really mean "one given special character" (in this case, CHR(0))?
Will you ever want to search for multiple special characters at once?
What if table foo has a column bar, and in 1000 rows of foo, bar contains CHR(0)? Do you want 1000 rows of output, each showing the exact position of the first CHR(0)? If the purpose is to look at these rows later, shouldn't you include the primary key in the output? What if CHR(0) occurs 2 or more times in the same string?
If you'd rather have one row of output, that simply says that the column foo.bar sometimes contains a CHR(0), then you could do something like this:
SELECT 'foo', 'bar'
FROM dual
WHERE EXISTS (
SELECT NULL
FROM foo
WHERE INSTR ( bar
, CHR (0)
) > 0
); -
Question: I am trying to add an emoji from Special Characters to my document, which has tables. When adding it, I cannot see the emoji picture. Can anyone help? Thank you.
Emoji is not supported by the iWork apps.
Send Feedback to Apple via the Pages menu.
Jerry -
How do I export data with special characters from SQL Developer?
Hi.
I'm exporting data from a table, but this table has special characters, specifically accents (á, é, í, ó, ú). My source table has, for example, "QRCN Querétaro, Candiles", but when I export via Tools --> Export DDL (and Data) in SQL Developer, it generates the following script:
Insert into tablexxx(CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20280','QRCN Quer?ro, Candiles');
How can I export my data so that the script is generated correctly?
Insert into tablexxx(CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20280','QRCN Querétaro, Candiles');
Thanks.
Hi sybrand_b,
1. In SQL Developer I select Tools --> Export DDL (and Data).
2. I select the file name, the connection (this is a remote DB), the objects to export (in this case 'Tables and data'), and the table name to export.
3. I run the procedure and it generates the following script:
-- File created - jueves-julio-01-2010
-- DDL for Table TABLEXXX
CREATE TABLE "BOLINF"."TABLEXXX"
( "CADENA" VARCHAR2(50 BYTE),
"NUMERO_FARMACIA" VARCHAR2(50 BYTE),
"SUCURSAL_REFERENCIA" VARCHAR2(200 BYTE)
);
-- DATA FOR TABLE TABLEXXX
-- FILTER = none used
REM INSERTING into TABLEXXX
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20280','QRCN Quer?ro, Candiles');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20281','QRCG Quer?ro, Corregidora');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20282','QRFU');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20283','QRFU');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20284','SAUN San Lu?P, Universidad');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20285','SAEV San Lu?P, Eje Vial');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20286','SALB San Lu?P, Los Bravo');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20287','SAAL San Lu?P, Alvaro Obreg?');
Insert into TABLEXXX (CADENA,NUMERO_FARMACIA,SUCURSAL_REFERENCIA) values ('C002','20288','SACA San Lu? Callej?n de Cod');
4. But my source table has the following data:
Select * from TABLEXXX;
CADENA NUMERO_FARMACIA SUCURSAL_REFERENCIA
C002 20280 QRCN Querétaro, Candiles
C002 20281 QRCG Querétaro, Corregidora
C002 20282 QRFU
C002 20283 QRFU
C002 20284 SAUN San Luís P, Universidad
C002 20285 SAEV San Luís P, Eje Vial
C002 20286 SALB San Luís P, Los Bravo
C002 20287 SAAL San Luís P, Alvaro Obregó
C002 20288 SACA San Luís, Callejón de Cod
5. I have queried the nls_database_parameters view:
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
NLS_CHARACTERSET UTF8
NLS_CALENDAR GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE AMERICAN
NLS_SORT BINARY
NLS_TIME_FORMAT HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZH:TZM
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZH:TZM
NLS_DUAL_CURRENCY $
NLS_COMP BINARY
NLS_NCHAR_CHARACTERSET AL16UTF16
NLS_LENGTH_SEMANTICS BYTE
NLS_NCHAR_CONV_EXCP FALSE
NLS_RDBMS_VERSION 10.2.0.4.0
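Since NLS_CHARACTERSET is UTF8, the accented characters are probably stored correctly and the mangling happens when the export file is written. As a quick sanity check (a sketch using the thread's table; é encoded in UTF8 is the byte pair C3 A9), DUMP shows the raw bytes:

```sql
-- If the hex bytes c3,a9 appear where é should be, the data is
-- intact in the database and only the client/export encoding
-- (NLS_LANG or SQL Developer's file encoding) needs fixing.
select sucursal_referencia,
       dump(sucursal_referencia, 1016) as raw_bytes
from   tablexxx
where  numero_farmacia = '20280';
```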
6. I have checked Regedit --> HKEY_LOCAL_MACHINE --> SOFTWARE --> ORACLE --> ORACLE HOME, and the value is NLS_LANG=AMERICAN_AMERICA.UTF8.
Where should I make a change so that my data exports correctly?
Or is there another way to export my data?
Thanks a lot.
Regards -
Function to insert special characters
I want a procedure or function that can insert a huge set of values (say 1000+) containing special characters from a spreadsheet into a table.
for example:
L&T Infotech
Ana's parlour
Procter & Gamble
ETSA Transmission Corporation T/As Electranet SA
And the problem is where?
You don't know how to work with spreadsheets (assuming Excel) or Oracle?
OK, here is an example.
Table is as follows:
SQL> create table mytable (col1 varchar2(100), col2 varchar2(100));
Table created.
The Excel cell contains the following value:
Latvia's men & women
You need it to insert in the table
The formula for creating the insert is (one example of thousands, of course ;)
=CONCATENATE("INSERT INTO mytable (col1, col2) values ('";SUBSTITUTE(C16;"'";"''");"', 'aaa');")
As a result, in SQL*Plus you need to do the following:
SET SCAN OFF
and then just call script with your calculated excel column values and for particular above mentioned column and table it would be:
SQL> set scan off
SQL> INSERT INTO mytable (col1, col2) values ('Latvia''s men & women', 'aaa');
1 row created.
Gints Plivna
http://www.gplivna.eu -
Problems with Special Characters in DBs...
We have just moved a set of tables from one database on one server to another database on another server. The only problem is that some special characters held in the table data aren't showing up properly in the new database: characters such as a circle with an 'R' in it (the registered-trademark sign), or vowels with umlauts over them.
The database is used by a website, and it is the development version of the website, which uses the new database, that is not displaying the characters properly. The characters are also displayed incorrectly in SQL*Plus 3.3.
If anyone could point out why the new database is not storing the characters correctly, I would be immeasurably grateful.
Thanking you in advance
Ewan Gibb.
Hello, thanks for your reply. I had to rewrite your code a little, and I think your code would not really compile if you tried it yourself :-)
I ended up with something like this:
Our environment is running in 1.4 mode, so I could not use the for-each loop :-(
public String printEntities(String s) {
    char[] sArray = s.toCharArray();
    StringBuffer sb = new StringBuffer();
    for (int i = 0; i < sArray.length; i++) {
        if (sArray[i] > 127) { // non-ASCII: emit a numeric character entity
            sb.append("&#x" + Integer.toHexString(sArray[i]) + ";");
        } else {
            sb.append(sArray[i]);
        }
    }
    return sb.toString();
} -
How to avoid special characters( #) via Utl_file Package
Hello,
I am using UTL_FILE package in order to read the text from a text file and insert this text into a table.
But the text contains special characters like #.
This is the nature of this text data.
For example the O/S text file contain the following text:
TEXT:
Name of the default schema being used in the current schema.
Name of the default schema being used in the current schema.
Name of the default schema being used in the current schema.
Name of the default schema being used in the current schema.
Name of the default schema being used in the current schema.
Name of the default schema being used in the current schema.
How should we avoid reading such a line via UTL_FILE from the text file?
So that the text without these special characters is inserted into the table.
I don't want these special characters to be inserted into the table via UTL_FILE.
What code can I add to my routine below to skip the special character # (or the whole line containing it) when reading the text file?
Thanks
Sharbat
UTL FILE Code:
Declare
l_file_handle UTL_FILE.FILE_TYPE;
l_buffer VARCHAR2(4000);
BEGIN
l_file_handle := UTL_FILE.FOPEN('c:\temp', 'test.txt', 'r', 4000);
loop
UTL_FILE.get_line(l_file_handle,l_buffer);
insert into TEST (text) values(l_buffer);
end loop;
exception
when no_data_found then
UTL_FILE.FCLOSE(l_file_handle);
when others then
if utl_file.is_open(l_file_handle)
then
utl_file.fclose(l_file_handle);
end if;
end;
Hi,
in Forms you can use TEXT_IO for reading text from a file. For questions related to database packages, I suggest posting this question on the database forum here on OTN.
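That said, the skipping can be done directly in the PL/SQL loop. Here is a hedged sketch based on the question's code (same directory, file name, and TEST table as above):

```sql
declare
    l_file_handle  UTL_FILE.FILE_TYPE;
    l_buffer       VARCHAR2(4000);
begin
    l_file_handle := UTL_FILE.FOPEN('c:\temp', 'test.txt', 'r', 4000);
    loop
        UTL_FILE.GET_LINE(l_file_handle, l_buffer);
        -- Skip any line that contains the special character #.
        if INSTR(l_buffer, '#') = 0 then
            insert into TEST (text) values (l_buffer);
        end if;
    end loop;
exception
    when NO_DATA_FOUND then   -- end of file reached
        UTL_FILE.FCLOSE(l_file_handle);
        commit;
    when OTHERS then
        if UTL_FILE.IS_OPEN(l_file_handle) then
            UTL_FILE.FCLOSE(l_file_handle);
        end if;
        raise;
end;
```

To strip only the # characters while keeping the rest of the line, REPLACE(l_buffer, '#') could be used in the INSERT instead of the IF test.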
Frank -
Oracle SQL query for getting specific special characters from a table
Hi all,
This is my table
Table Name- Table1
S.no Name
1 aaaaaaaa
2 a1234sgjghb
3 a@3$%jkhkjn
4 abcd-dfghjik
5 bbvxzckvbzxcv&^%#
6 ashgweqfg/gfjwgefj////
7 sdsaf$([]:'
8 <-fdsjgbdfsg
9 dfgfdgfd"uodf
10 aaaa bbbbz#$
11 cccc dddd-/mnm
The output has to be
S.no Name
3 a@3$%jkhkjn
5 bbvxzckvbzxcv&^%#
7 sdsaf$([]:'
8 <-fdsjgbdfsg
10 aaaa bbbbz#$
It has to return the "Name" values containing special characters, whereas some special characters like -, /, " and space are acceptable.
The Oracle query has to print values having special characters, excluding -, /, " and space.
Can anyone help me with a SQL query for the above?
Thanks in advance.
You can achieve it in multiple ways. Here are a few.
SQL> with t
2 as
3 (
4 select 1 id, 'aaaaaaaa' name from dual union all
5 select 2 id, 'a1234sgjghb' name from dual union all
6 select 3 id, 'a@3$%jkhkjn' name from dual union all
7 select 4 id, 'abcd-dfghjik' name from dual union all
8 select 5 id, 'bbvxzckvbzxcv&^%#' name from dual union all
9 select 6 id, 'ashgweqfg/gfjwgefj////' name from dual union all
10 select 7 id, 'sdsaf$([]:''' name from dual union all
11 select 8 id, '<-fdsjgbdfsg' name from dual union all
12 select 9 id, 'dfgfdgfd"uodf' name from dual union all
13 select 10 id, 'aaaa bbbbz#$' name from dual union all
14 select 11 id, 'cccc dddd-/mnm' name from dual
15 )
16 select *
17 from t
18 where regexp_like(translate(name,'a-/" ','a'), '[^[:alnum:]]');
ID NAME
3 a@3$%jkhkjn
5 bbvxzckvbzxcv&^%#
7 sdsaf$([]:'
8 <-fdsjgbdfsg
10 aaaa bbbbz#$
SQL> with t
2 as
3 (
4 select 1 id, 'aaaaaaaa' name from dual union all
5 select 2 id, 'a1234sgjghb' name from dual union all
6 select 3 id, 'a@3$%jkhkjn' name from dual union all
7 select 4 id, 'abcd-dfghjik' name from dual union all
8 select 5 id, 'bbvxzckvbzxcv&^%#' name from dual union all
9 select 6 id, 'ashgweqfg/gfjwgefj////' name from dual union all
10 select 7 id, 'sdsaf$([]:''' name from dual union all
11 select 8 id, '<-fdsjgbdfsg' name from dual union all
12 select 9 id, 'dfgfdgfd"uodf' name from dual union all
13 select 10 id, 'aaaa bbbbz#$' name from dual union all
14 select 11 id, 'cccc dddd-/mnm' name from dual
15 )
16 select *
17 from t
18 where translate
19 (
20 lower(translate(name,'a-/" ','a'))
21 , '.0123456789abcdefghijklmnopqrstuvwxyz'
22 , '.'
23 ) is not null;
ID NAME
3 a@3$%jkhkjn
5 bbvxzckvbzxcv&^%#
7 sdsaf$([]:'
8 <-fdsjgbdfsg
10 aaaa bbbbz#$
SQL> -
How to insert & # special characters into oracle table?
I have a text value which contains special characters such as & and #.
After I pass it from one page to another,
TEST& becomes "TEST&amp;"
(I put quotes around the text above, otherwise the amp; would be truncated by OTN.)
TEST# becomes TEST (# is truncated).
The value is actually saved in the table like this: "TEST&amp;"
How do I solve this problem?
How do I insert & into the table without the amp;?
Thank you.
Edited by: user628655 on Jul 27, 2009 9:47 AM
Edited by: user628655 on Jul 27, 2009 9:49 AM
Edited by: user628655 on Jul 27, 2009 10:39 AM
Avoid doing that through a link. If this is a page item, then submit the page and redirect to the target page using a branch. On the target page, create a computation to compute the target item using the original page item as the source. If you are talking about a report, use the ID to pass through the link and fetch the text column in an on-load computation.
Denes Kubicek
http://deneskubicek.blogspot.com/
http://www.opal-consulting.de/training
http://apex.oracle.com/pls/otn/f?p=31517:1
-
Hi,
We have a requirement to load data into Oracle. We are creating a CSV file and loading the data into Oracle tables using SQL*Loader scripts. Certain records in the CSV files have special characters, for example 20°/ 60°/ 85°. The data looks fine in the CSV file, but when it is loaded into the Oracle tables it is displayed as 20¿/ 60¿/ 85¿. Please share if you have any info on this.
Character set value:
SELECT value
FROM nls_database_parameters
WHERE parameter ='NLS_CHARACTERSET';
Result: AL32UTF8
Regards
AM
Hi,
Where do you load those files, on the server or on the client? You have to look at the NLS_LANG setting in the OS environment. If it is not properly set, then ASCII is used. So set NLS_LANG properly, or use the CHARACTERSET keyword in the SQL*Loader control file. More information in the manual: http://download.oracle.com/docs/cd/E11882_01/server.112/e16536/ldr_control_file.htm#i1005287
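For the control-file route, here is a hedged sketch that states the data file's character set explicitly (the file, table, and column names are placeholders, not from the thread):

```sql
-- SQL*Loader control file: CHARACTERSET tells the loader how
-- the data file is encoded, so the degree sign survives the load.
LOAD DATA
CHARACTERSET UTF8
INFILE 'data.csv'
APPEND
INTO TABLE my_table
FIELDS TERMINATED BY ','
(col1, col2, col3)
```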
Herald ten Dam
http://htendam.wordpress.com -
Load special characters into Oracle by using Informatica
Hi All,
I'm trying to load data from a flat file into an Oracle database table using Informatica PowerCenter 9.1.0, and I have some special characters in the source file. The data loads successfully, without any errors, but the special characters come out differently: 1) "Planner –" is loaded as "Planner ��������" and 2) "Háiréch" is loaded as "Hair��������ch". When the same flat file is loaded into another flat file, the data is loaded correctly, including the special characters, so I cannot tell whether the problem is with the database or with Informatica.
Source: flat file, comma (',') delimited; the code page is defined as "UTF-8 encoding of Unicode".
Relational connection: I have tried changing the code page while creating the relational connection in Informatica to UTF-8, ISO 8859-1 Western European, MS Windows Latin 1, etc.; it didn't work.
Informatica: the Integration Service and repository code page is UTF8; the data movement code page of the Integration Service was set to UNICODE.
Target: Oracle database; NLS_NCHAR_CHARACTERSET: AL16UTF16; NLS_CHARACTERSET: AL32UTF8.
Thanks
Sai
-
MySql / Oracle special characters
Searched all around, my apologies if this is a redundant thread.
I have an Oracle (10.2.0.3) database set up to connect to a MySQL (4.1) database using HSODBC, with the 3.51 MySQL driver. It has a couple of bugs (distributed transactions, erroneous record counts, etc.), but I know from reading other threads that the fixes are coming. There is one problem I can't seem to get around...
In MySQL, a certain field is defined as CHAR(40). I would expect any content shorter than 40 characters to be padded with spaces. But when I concatenate pipe characters on both sides of the values (MySQL Query Browser), there are no extra spaces. It's behaving more like a VARCHAR on the MySQL end:
[select concat( '|', field, '|') from table]
returns '|value|'
NOT '|value |'
Now, when I pull this data through to Oracle, there are special characters padding the content to the length of the column. When I pull the ASCII code value, they're 0 NUL (null) characters. Every CHAR field in every table that doesn't fill the column is coming across the database link padded with these NUL characters, which look like little hollow squares (in SqlDeveloper and JDeveloper). Since they're not spaces, I can't TRIM them out. I can remove them with TRANSLATE, but I would have to identify every CHAR field in every table and manually code a TRANSLATE into views against the DB_LINK, and then always pull from the views. There must be an easier way to get around this.
Has anybody had a similar experience? Is this the version of the MySql driver (3.51) that I'm using? I see they have a version 5 out now. Is this addressed by a 10.2.0.4 patchset? Is that available yet? (Redhat) Is this fixed with 11g?
Oracle - AL32UTF8, UTF8
MySql - uses both UTF8 and Latin1, but the default on the source table is Latin1 / latin1_swedish_ci
The problem is a MySQL ODBC driver issue.
You have to enable ODBC option 512 (pad char to full length):
Default behaviour of the MySQL ODBC is not to PAD spaces:
SQL> select dump( "col1") from "counter"@mysql;
DUMP("COL1")
Typ=96 Len=20: 72,101,108,108,111,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0
As you can see the char column is padded with 0 instead of spaces (32).
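If changing the driver configuration is not an option, the padding can also be stripped per query. Contrary to the assumption in the question, TRIM and RTRIM accept an arbitrary trim character, so CHR(0) can be trimmed directly (a sketch against the same linked table):

```sql
-- RTRIM with CHR(0) removes the NUL padding without TRANSLATE.
select rtrim("col1", chr(0)) as col1_clean
from   "counter"@mysql;
```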
Add OPTION 512 to the ODBC.INI (if you already have a value set for OPTION, add 512 on top of it to get your combined value). So the ODBC.INI might look like:
[mysql]
Description = MySQL database test
Driver = /usr/lib/libmyodbc3.so
#Driver = MySQL ODBC 3.51 Driver
OPTION = 512
and then MySQL behaves correctly:
SQL> select dump( "col1") from "counter"@mysql;
DUMP("COL1")
Typ=96 Len=20: 72,101,108,108,111,32,32,32,32,32,32,32,32,32,32,32,32,32,32,32 -
Identify special characters in Oracle 9i
Hi,
I want to identify the special characters in a table. Right now I am using Oracle 9i.
Please help us.
You can use the following PL/SQL block for this purpose. It will check whether there is any special character in a field (the item description here) and will display the position and ASCII value of each special character. Later you can write another query (if needed) to remove those special characters.
Modify the query as needed.
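As an aside, Oracle 9i predates the REGEXP functions, but a single TRANSLATE query can still flag the offending rows before any row-by-row inspection. This is a sketch on the thread's table; extend the second character list with any punctuation you consider normal:

```sql
-- Map every "normal" character to nothing; rows where anything
-- is left over contain at least one special character.
select segment1, description
from   mtl_system_items_b
where  translate(description,
                 chr(1) || 'abcdefghijklmnopqrstuvwxyz'
                        || 'ABCDEFGHIJKLMNOPQRSTUVWXYZ'
                        || '0123456789 .,-/',
                 chr(1)) is not null;
```

The PL/SQL block below then reports the exact position and ASCII value of each offending character in those rows.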
declare
l_desc VARCHAR2(90);
l_length NUMBER;
l_cnt NUMBER := 1;
l_char VARCHAR2(20);
l_spc_char NUMBER := 0;
CURSOR c1 is select segment1, description, length(description) length1 from mtl_system_items_b where 1=1 and rownum < 10000 and segment1 = '00000942304A330'
and organization_id = 156;
begin
FOR c_rec IN C1
LOOP
l_cnt := 1;
l_spc_char := 0;
WHILE l_cnt <= c_rec.length1
LOOP
l_char := SUBSTR(c_rec.description,l_cnt,1);
IF (ascii(l_char) < 32 or ascii(l_char) > 126) then
DBMS_OUTPUT.PUT_LINE('Character: '||l_char||' Position: '||l_cnt||' Ascii Value: '||ascii(l_char));
l_spc_char := l_spc_char + 1;
end if;
l_cnt := l_cnt + 1 ;
END LOOP;
IF l_spc_char > 0 THEN
DBMS_OUTPUT.PUT_LINE('Item: '||c_rec.segment1||' Description: '||c_rec.description);
END IF;
END LOOP;
end; -
How to save special characters in Oracle?
Is there any way to enter special characters such as ºC? I am using J2EE and Oracle 9i.
When I try to enter 2ºC, after updating the database it is converted to 2ÂºC when it is displayed in HTML. All special characters are prefixed with Â. Please suggest a way to use special characters with Oracle.
This has nothing to do with NLS_LANGUAGE. In general, character set processing depends on the NLS_LANG setting (which is an OS setting, not an instance initialization parameter) and on the database character set. To understand NLS_LANG see the OTN NLS_LANG FAQ: http://www.oracle.com/technology/tech/globalization/htdocs/nls_lang%20faq.htm.
However, I think that JDBC is an exception and does not use the character set defined by NLS_LANG. See last answer in following discussion:
Re: When is NLS_LANG used ?