Inserting Japanese in Oracle
Hi,
I want to insert Japanese text into a column of an Oracle table. The NLS parameters NLS_LANG, NLS_CHARACTERSET and NLS_NCHAR_CHARACTERSET are AMERICAN, WE8ISO8859P1 and AL16UTF16 respectively.
I am getting the data from MS Excel and inserting it into the table.
The stored data comes back as ??????????.
Can someone please help me with this? It's very urgent.
Hi user635879,
The following helped me insert into an NVARCHAR2 column:
drop table mynvarchar;
create table mynvarchar(id int, mynv nvarchar2(100));
insert into mynvarchar values (1, N'世界您好');
commit;
select * from mynvarchar;
on a WE8ISO8859P1 database (the Chinese characters came back correctly rather than as ????).
-Turloch
from:
Re: SQL*Plus and chinese characters
Posted: Dec 13, 2007 2:43 PM in response to: leonid.pavlov
To insert Chinese characters into NVARCHAR2 columns of a 10.2 or higher database using N'literals' and SQL Developer, go to sqldeveloper\sqldeveloper\bin directory, open the file sqldeveloper.conf and add another AddVMOption line below other such lines:
AddVMOption -Doracle.jdbc.convertNcharLiterals=true
Note, both SQL Developer's JDBC driver and the database must be at least 10.2. Otherwise, you have to use UNISTR or the Table Data Grid.
-- Sergiusz
Similar Messages
-
I am running Oracle 9.2 on a WIN2k m/c.
I need to insert JAPANESE KANJI characters into my tables.
1) I would like to know what settings are required for this.
I will be pulling the data from a remote SQL Server using OWB.
I created a SQL Server Transparent Gateway (tg4msql) to connect to the remote SQL Server.
Current NLS Setting
NLS_LANGUAGE AMERICAN
NLS_TERRITORY AMERICA
NLS_CURRENCY $
NLS_ISO_CURRENCY AMERICA
NLS_NUMERIC_CHARACTERS .,
NLS_CALENDAR GREGORIAN
NLS_DATE_FORMAT DD-MON-RR
NLS_DATE_LANGUAGE AMERICAN
NLS_CHARACTERSET AL32UTF8
NLS_SORT BINARY
NLS_TIME_FORMAT HH.MI.SSXFF AM
NLS_TIMESTAMP_FORMAT DD-MON-RR HH.MI.SSXFF AM
NLS_TIME_TZ_FORMAT HH.MI.SSXFF AM TZR
NLS_TIMESTAMP_TZ_FORMAT DD-MON-RR HH.MI.SSXFF AM TZR
NLS_DUAL_CURRENCY $
NLS_NCHAR_CHARACTERSET AL16UTF16
NLS_COMP BINARY
NLS_LENGTH_SEMANTICS BYTE
NLS_NCHAR_CONV_EXCP FALSE
2) How can I see/verify the inserted data using SQL*Plus?
Need your help in solving the same.
TIA
shankar

Well, I would assume that the first thing required is to run the database in UTF-8, which you are doing (NLS_CHARACTERSET is AL32UTF8), so storing should be no problem. To display the data in SQL*Plus, I suspect you need to set your NLS_LANG to something that can "read" the UTF-8 character set.
I have not looked it up, but I'm sure the RDBMS documentation covers most of these topics extensively. Did you take a look at that doc set?
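As a side note on where the question marks come from: any character that has no representation in an intermediate single-byte character set gets replaced on conversion. A rough stand-in sketch in Python, using the latin-1 codec in place of a single-byte Oracle character set such as WE8ISO8859P1 (illustrative only, not Oracle's NLS layer):

```python
# Japanese input that a Latin-1-style single-byte charset cannot hold.
text = "日本語"

# Characters with no mapping are replaced -- exactly the
# '??????????' the original poster saw in the stored data.
mangled = text.encode("latin-1", errors="replace").decode("latin-1")
print(mangled)  # ???
```

The real conversion happens inside Oracle's character-set conversion, but the replacement behaviour is the same idea: once the data has passed through a charset that cannot hold Kanji, the original characters are gone for good.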
Jean-Pierre -
Oracle error ORA-01461when trying to insert into an ORACLE BLOB field
I am getting Oracle error 'ORA-01461: can bind a LONG value only for insert into a LONG column' when trying to insert into an Oracle BLOB field. The error occurs when inserting a large BLOB (JPG), but not when inserting a small (<1K) picture BLOB (JPG). Any ideas?
BTW, when using a SQL Server datasource using the same code.... everything works with no problems.
ORACLE version is 11.2.0.1
The ORACLE datasource is JDBC using Oracle's JDBC driver ojdbc6.jar v11.2.0.1 (I also have tried ojdbc5.jar v11.2.0.1; ojdbc5.jar v11.2.0.4; and ojdbc6.jar v11.2.0.4 with the same error result.)
Here is my code:
<cfset file_mime = Lcase(Right(postedXMLRoot.objname.XmlText, 3))>
<cfif file_mime EQ 'jpg'><cfset file_mime = 'jpeg'></cfif>
<cfset file_mime = 'data:image/' & file_mime & ';base64,'>
<cfset image64 = ImageReadBase64("#file_mime##postedXMLRoot.objbase64.XmlText#")>
<cfset ramfile = "ram://" & postedXMLRoot.objname.XmlText>
<cfimage action="write" source="#image64#" destination="#ramfile#" overwrite="true">
<cffile action="readbinary" file="#ramfile#" variable="image_bin">
<cffile action="delete" file="#ramfile#">
<cfquery name="InsertImage" datasource="#datasource#">
INSERT INTO test_images
(image_blob)
SELECT
<cfqueryparam value="#image_bin#" cfsqltype="CF_SQL_BLOB">
FROM dual
</cfquery>

Can't you use "alter index <schema.spatial_index_name> rebuild ONLINE"?

Thanks. I could switch to "rebuild ONLINE" and see if that helps. Are there any potential adverse effects going forward, e.g. a significantly longer rebuild than without the ONLINE keyword? I am also wondering whether spatial index operations (index type = DOMAIN) obey all the typical things you'd expect of "regular" indexes, e.g. B-TREE, etc.
-
How to insert data into Oracle db from MySQL db through PHP?
Hi,
I want to insert my MySQL data into Oracle database through PHP.
I can access Mysql database using mysql_conect() & Oracle database using oci_connect() through PHP.
Now how can I insert my data, which is in MySQL, into the Oracle table? Both table structures are exactly the same.
So I can use
insert into Oracle_Table (Oracle_columns....) select * from MySQL_Table;
But again problem is I can't open both connections at the same time.
So has anyone done this before or anyone having any other idea..
Please guide me...

You can do it if you set up an ODBC gateway between MySQL and the Oracle DB. Then you can read directly from the MySQL table using DB links and insert into the Oracle table in one single SQL statement.
Otherwise you need to fetch the data from MySQL into variables in your PHP program and then insert into Oracle after binding the variables to the Oracle insert statement. Hope that is clear enough.
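A minimal sketch of that fetch-then-insert approach, written against Python's DB-API with sqlite3 in-memory databases standing in for MySQL (source) and Oracle (target); the table name and columns are made up, and with a real Oracle client the placeholders would be :1, :2 rather than ?:

```python
import sqlite3

# Stand-ins for the two databases; both connections are open at once
# in one program, which is the point the PHP poster was stuck on.
src = sqlite3.connect(":memory:")  # pretend this is MySQL
dst = sqlite3.connect(":memory:")  # pretend this is Oracle
src.execute("CREATE TABLE t (id INTEGER, name TEXT)")
src.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a"), (2, "b")])
dst.execute("CREATE TABLE t (id INTEGER, name TEXT)")

# Fetch every row from the source, then insert into the target
# with bind variables -- one executemany, no string building.
rows = src.execute("SELECT id, name FROM t").fetchall()
dst.executemany("INSERT INTO t VALUES (?, ?)", rows)
dst.commit()
```

With the real databases you would use the MySQL and Oracle client libraries and their own placeholder styles, but the shape of the program is the same.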
Regards
Prakash -
Inserting variables into oracle
How do you create insert statements in Oracle using variables? It won't accept this statement:
stmt.executeUpdate("INSERT into T_LINKMANAGER(LMID,LMNAME,LMURL,LMFLAG,LMHITS,LMCATEG) VALUES(ct,aname,aurl,'false',0,acat)");
giving a "column not allowed here" error.

If you're new to this, you should ESPECIALLY learn to use bind variables.
There are times when experts know not to use bind variables, but for 90-99% of all SQL, and 99.9% of all inserts, bind variables should be used.
Bind variables are what grownup developers use. They:
1) usually make your code run much faster in the long run, most definitely so on Oracle
2) make your code more secure; search the web for "SQL injection" to see why
3) often make your code easier to code; you don't have to worry about embedded single-quotes in your data breaking your SQL
Where I work, when we interview for new Java developers, if they claim any JDBC knowledge at all, we ask them to write a sample. If they don't use bind variables, they drastically reduce their chances of getting the job.
Back to your original query:
1) when asking for help in a forum it's ALWAYS better to cut and paste the full original error message than to paraphrase it; there are often clues that get omitted in a paraphrase
2) In your SQL, "INSERT into T_LINKMANAGER(LMID,LMNAME,LMURL,LMFLAG,LMHITS,LMCATEG) VALUES(ct,aname,aurl,'false',0,acat)", the names ct, aname, aurl, and acat are being interpreted as Oracle column names.
If those are supposed to be literal values, you must write them as 'ct', etc.
If those are Java String variables that you want to embed in your SQL, you need to do: "INSERT into T_LINKMANAGER(LMID,LMNAME,LMURL,LMFLAG,LMHITS,LMCATEG) VALUES('" + ct + "','" + aname + "','" ...
If there's ever ANY chance that your Java variables will have a ' (single-quote) character in them, then before the SQL statement is built you have to escape the embedded quote (convert ' to '', single-quote single-quote). To avoid this common headache, use bind variables.
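The same point, sketched in Python's DB-API with sqlite3 standing in for Oracle (in JDBC the equivalent is a PreparedStatement with ? placeholders); the table and values here are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t_linkmanager (lmid INTEGER, lmname TEXT)")

# The value contains a single quote, but with bind variables the
# driver handles it: no escaping, no broken SQL, no injection hole.
name = "O'Reilly's link"
conn.execute("INSERT INTO t_linkmanager (lmid, lmname) VALUES (?, ?)",
             (1, name))
stored = conn.execute("SELECT lmname FROM t_linkmanager").fetchone()[0]
```

Had the same value been spliced into the SQL string with concatenation, the embedded quotes would have produced a syntax error, which is exactly the headache the reply describes.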
Actually, for heavily used code, building up a String with "+" is also bad; it generates excessive amounts of intermediate values that have to be garbage collected, thereby slowing your throughput. If you just have to use dynamically built SQL (you should use bind variables instead), then you should build the string in a StringBuffer and convert it to a String at execution. -
Insert data from oracle to sql server
we have running R12.1.1 .
Our company wants to deploy a CRM application with a SQL Server database. We continuously have to insert data (some fields from some tables of the Order Management module) from Oracle into SQL Server.
My Questions are :
1. What are possible methods to insert data from ORACLE -----------------> SQL Server ?
2. What are pros and cons of each method ?
3. What is performance over load of each method ?
PLease share your valuable knowledge / ideas / suggestions
thanks

What is the maximal acceptable latency for data changed in Oracle to arrive in MS SQL?

The data will be updated in CRM after 24 hours.

Does that mean the data replicated from Oracle will be updated in MS SQL and should be replicated back?

I want to know more details about these APIs etc. and the middle layer.

What API? In MS SQL you can use a linked server and write SQL that selects from Oracle tables via that linked server and inserts/updates/deletes into MS SQL. You can organize these SQLs into T-SQL procedures and run them in jobs.
Also you can use MS SQL replication, snapshot or transactional, which is ugly because it places triggers on Oracle tables. I doubt that this will be approved by app vendor.
Also you can use Oracle Streams via Transparent Gateway for MS SQL, which will be quite heavy solution.
Also, if real-time replication is needed, you can use "middle layer" Oracle Golden Gate, which may be quite pricy for 30 tables.
Also, if real-time replication is needed, you can use a third-party middle layer like DataCurrents or others. -
Insert data into oracle table from XML file
I need to insert data into an Oracle table from an XML file.
If anybody has handled this type of scenario, please let me know how to insert data into an Oracle table from an XML file.
Thanks in advance

The XML DB forum provides the best support for XML topics related to Oracle.
Here's the FAQ on that forum:
XML DB FAQ
where there are plenty of examples of shredding XML into Oracle tables and such like. ;) -
Inserting Data from Oracle to SQL Server on the Real Time Basis.
Hi Everyone,
I need to insert data from Oracle into SQL Server on a real-time basis. We have to fetch data from about 20 Oracle tables, each with more than 30 fields, every 15 minutes.
I have created a job using SQL Server Agent by writing insert queries for all the tables, with conditions so that rows already present in SQL Server are not inserted again. Note that this job takes only 1 min to execute.
But this way our SQL Server hangs, which causes problems for the other applications running on the SQL Server.
So I am asking all of you: what is the best way to insert a huge amount of data on a real-time basis?
Thanks in advance.

1) Create a linked server
2) Insert data using OPENQUERY and set up a job in SQL Server Agent
3) Run the job every 15 minutes -
Script request for exporting, importing, inserting entry to Oracle 9i LDAP
Does anyone have scripts for exporting, importing, and inserting entries into an Oracle 9i LDAP server? Thanks for the help.
You can use the ldapsearch utility to generate LDIF files. Perform a search on the node you want to export and specify the output in LDIF format (the -L ldapsearch option). Once you have the LDIF file, you can import it into any LDAPv3-compliant server such as OID. For OID, use the ldapadd/ldapmodify utilities to import the data.
These utilities are present under IAS_HOME/bin or ORACLE_HOME/bin. -
Hi All,
Has anybody hit the same issue as in the following case?
We have a job in SQL Server 2008 that inserts data into Oracle 11g using an OLEDB linked server.
Previously, on versions 9.2.0.8 and 11.2.0.1, the insert speed was very fast.
But after we upgraded Oracle to 11.2.0.2, the insert speed dropped a lot, from about 1 min to 10 min...
Could anybody give any ideas?
Best Regards
ChiaChan

From the 10046 trace file, we found the time is spent on PARSE!
Has anyone hit the same issue on version 11.2.0.2?
Please HELP! HELP! -
Please HELP!!! I try to insert BLOB to Oracle
Please help, I try to insert BLOB to Oracle.
Here is my sample code. Basically what I tried to do is insert an image file into Oracle.
But it did not work. If you have sample code that works, please share it.
Thanks,
Tom
try
out.println ("Done");
dbCon=trans.getConnection();
dbCon.setAutoCommit(false);
stmt1.execute ("update emorytest.PHYSICIANFOTO set FOTO=empty_blob() where name = 'foobar'");
stmt = dbCon.prepareCall("SELECT FOTO from emorytest.PHYSICIANFOTO where name='foobar' for update");
stmt.execute();
rset = (OracleResultSet)stmt.getResultSet();
rset.next();
BLOB bdata = rset.getBLOB("test.jpg");
os = bdata.getBinaryOutputStream();
os.write(bdata);
os.flush();
os.close();
dbCon.commit();
dbCon.close();
catch (Exception ex)
ex.printStackTrace();

Well, the obvious problem is that your getBLOB call ought to access the field by the field name "FOTO", not by a file name. Then you need to open a FileInputStream for the image before copying it to the blob's output stream.
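The copy loop the reply describes (a FileInputStream feeding the BLOB's output stream) is ordinary chunked stream copying. A generic sketch with in-memory stand-ins, not JDBC:

```python
import io

def copy_stream(src, dst, chunk_size=8192):
    """Copy src to dst in fixed-size chunks so that a large image
    never has to sit in memory all at once (sketch, not JDBC)."""
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dst.write(chunk)

# Stand-ins: an in-memory 'file' and an in-memory 'blob output stream'.
image_bytes = b"\xff\xd8" + b"x" * 20000  # fake JPEG payload
image_file = io.BytesIO(image_bytes)
blob_stream = io.BytesIO()
copy_stream(image_file, blob_stream)
```

In the original Java code the same loop would read from the FileInputStream and write into the stream returned by getBinaryOutputStream(), instead of passing the BLOB object to os.write as the posted code does.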
In my opinion, BLOB and CLOB handling in JDBC is a confusing mess. -
How to insert images into an Oracle database
hi,
I have to insert images into an Oracle database.
We have a procedure to insert the images, but how do I execute the procedure?
My image file is on the desktop.
I am using Ubuntu Linux 8.04.
Here is the code of my procedure:
create or replace PROCEDURE INSERT_BLOB(filE_namE IN VARCHAR2,dir_name varchar2)
IS
tmp number;
f_lob bfile;
b_lob blob;
BEGIN
dbms_output.put_line('INSERT BLOB Program Starts');
dbms_output.put_line('--------------------------');
dbms_output.put_line('File Name :'||filE_namE);
dbms_output.put_line('--------------------------');
UPDATE photograph SET image=empty_blob()
WHERE file_name =filE_namE
returning image INTO b_lob;
f_lob := bfilename( 'BIS_IMAGE_WORKSPACE',filE_namE);
dbms_lob.fileopen(f_lob,dbms_lob.file_readonly);
--dbms_lob.loadfromfile(b_lob,f_lob,dbms_lob.getlength(f_lob));
insert into photograph values (111,user,sysdate,b_lob);
dbms_lob.fileclose(f_lob);
dbms_output.put_line('BLOB Successfully Inserted');
dbms_output.put_line('--------------------------');
commit;
dbms_output.put_line('File length is: '||dbms_lob.getlength( f_lob));
dbms_output.put_line('Loaded length is: '||dbms_lob.getlength(b_lob));
dbms_output.put_line('BLOB Committed.Program Ends');
dbms_output.put_line('--------------------------');
END inserT_bloB;
warm regerds
pydiraju
Please solve my problem. Thanks in advance.

Thank you, but I am getting the following errors.
I connected as DBA, created a directory on /home/pydiraju/Desktop/PHOTO_DIR,
and gave all permissions to the scott user.
But it is not working. It gives the following errors:
ERROR at line 1:
ORA-22288: file or LOB operation FILEOPEN failed
Permission denied
ORA-06512: at "SYS.DBMS_LOB", line 523
ORA-06512: at "SCOTT.LOAD_FILE", line 28
ORA-06512: at line 1
Warm regards,
Pydiraju.P
Mobile: +91 - 9912380544 -
Inserting JAPANESE characters in a database
Can somebody please let me know how to insert the JAPANESE (Kanji) Characters into the Database.
Database Version: Oracle 9.2.0.1.0
Paremeters Setting:
NLS_CHARACTERSET - UTF8
NLS_NCHAR_CHARACTERSET - UTF8
Server OS: Win2K
Client OS: Win2K

Not sure what your overall requirements are from an application support standpoint, but a simple way would be to use UNISTR. Here is a description:
UNISTR takes as its argument a string and returns it in the national character set. The national character set of the database can be either AL16UTF16 or UTF8.
UNISTR provides support for Unicode string literals by letting you specify the Unicode encoding value of characters in the string. This is useful, for example, for
inserting data into NCHAR columns.
The Unicode encoding value has the form '\xxxx' where 'xxxx' is the hexadecimal value of a character in UCS-2 encoding format. To include the backslash in
the string itself, precede it with another backslash (\\).
For portability and data preservation, Oracle Corporation recommends that in the UNISTR string argument you specify only ASCII characters and the Unicode
encoding values.
Examples
The following example passes both ASCII characters and Unicode encoding values to the UNISTR function, which returns the string in the national character
set:
SELECT UNISTR('abc\00e5\00f1\00f6') FROM DUAL;
UNISTR
abcåñö -
Error in inserting Japanese characters into XMLDB - Urgent
Hi,
I have set the database character set to UTF-8 and saved the XML file on the server with UTF-8 encoding. The insertion works for English, but if I include Japanese characters I get the error below. We use Oracle 9i for development. Since this is very urgent and critical, kindly help us; we are thankful for the earlier responses to our different queries.
This is XML Schema we registered.
XML SCHEMA:
==========
<?xml version="1.0" encoding="UTF-8"?>
<xs:schema id="Dp_Pref_mst" xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:msdata="urn:schemas-microsoft-com:xml-msdata">
<xs:element name="PrefMaster">
<xs:complexType>
<xs:choice maxOccurs="unbounded">
<xs:element name="StateDetails">
<xs:complexType>
<xs:sequence>
<xs:element name="Pref_code" type="xs:string" minOccurs="0" />
<xs:element name="Pref_desc" type="xs:string" minOccurs="0" />
</xs:sequence>
</xs:complexType>
</xs:element>
</xs:choice>
</xs:complexType>
</xs:element>
</xs:schema>
Schema registration:
===================
begin
dbms_xmlschema.registerSchema( 'sample.xsd', getFileContent('sample.xsd','DP','UTF8'));
end;
Table creation :
===============
Create table states of XMLType XMLSCHEMA "sample.xsd" ELEMENT "PrefMaster";
This is XML we try to insert
XML :
====
<?xml version="1.0" encoding="UTF-8"?>
<PrefMaster xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:noNamespaceSchemaLocation="sample.xsd">
<StateDetails>
<Pref_code>1</Pref_code>
<Pref_desc>å°å·éµå·¥æ</Pref_desc>
</StateDetails>
<StateDetails>
<Pref_code>2</Pref_code>
<Pref_desc>Honshu</Pref_desc>
</StateDetails>
<StateDetails>
<Pref_code>3</Pref_code>
<Pref_desc>Nagasaki</Pref_desc>
</StateDetails>
<StateDetails>
<Pref_code>4</Pref_code>
<Pref_desc>Osaka</Pref_desc>
</StateDetails>
</PrefMaster>
CREATE OR REPLACE FUNCTION getFileContent(filename varchar2,directoryName varchar2 default USER,
charset varchar2 default 'AL32UTF8')
return CLOB
is
fileContent CLOB := NULL;
file bfile := bfilename(directoryName,filename);
dest_offset number := 1;
src_offset number := 1;
lang_context number := 0;
conv_warning number := 0;
begin
DBMS_LOB.createTemporary(fileContent,true,DBMS_LOB.SESSION);
DBMS_LOB.fileopen(file, DBMS_LOB.file_readonly);
DBMS_LOB.loadClobfromFile(
fileContent,
file,
DBMS_LOB.getLength(file),
dest_offset,
src_offset,
nls_charset_id(charset),
lang_context,
conv_warning
);
DBMS_LOB.fileclose(file);
return fileContent;
end;
our insert statement
====================
Insert into states values (XMLType(getFileContent('sample.xml','DP','UTF8')));
The error we are getting
=======================
Error
=====
ERROR at line 1:
ORA-31011: XML parsing failed
ORA-19202: Error occurred in XML processing
LPX-00210: expected '<' instead of '¿'
Error at line 1
ORA-06512: at "SYS.XMLTYPE", line 0
ORA-06512: at line 1
Any help in resolving this would be highly appreciated.
Thank you
Kathiresan.

COMMIT is an SQL command and I don't think it will have any effect on your dataset. I think this should work:
FORM f_write_aufk.
CLEAR: gv_error, gv_reccnt, gv_t_amt, gv_t_qty,
gv_t_docs, gv_previous_rec_id, gv_passcnt.
*--------Open in compress mode ----------------------------------------*
DATA: gc_commit(4) VALUE '1000'.
OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT FILTER
'compress'.
IF sy-subrc = 0.
SELECT aufnr auart autyp ernam aenam aedat ktext
FROM aufk INTO gs_aufk
WHERE aufnr IN s_aufnr.
PERFORM f_transfer_dataset
USING p_f_aufk gs_aufk CHANGING gv_error.
ENDSELECT.
PERFORM f_closedataset USING p_f_aufk CHANGING gv_error.
ENDIF.
ENDFORM. "F_WRITE_AUFK
Rob -
Issue running an insert statement in an Oracle procedure
Hi experts,
I ran an Oracle procedure with an insert statement inside, as:
insert into table1 select....
but I got an error message related to this insert statement: "SQLERRM= ORA-08103: object no longer exists"
I ran this statement separately in Toad: no error message, but also no data resulted from the execution.
Please tell me how to fix this issue.
Many Thanks,
Edited by: 918440 on 27-Jun-2012 8:04 AM

Hi friend,
my insert statement is as follows:
INSERT INTO HIROC_RU_FACT_S
select
pp.policy_fk,
pp.transaction_log_fk,
p.policy_no,
p.policy_type_code,
hiroc_rpt_user.hiroc_get_entity_name(pp.policy_fk,'POLHOLDER') policy_holder,
pp.risk_fk,
r.risk_base_record_fk,
r.entity_fk,
hiroc_sel_entity_risk_name2 (pp.risk_fk,r.entity_fk) risk_name,
substr(trim(nvl(r.county_code_used_to_rate,pth.issue_state_code)),1,2) rating_state_code,
hiroc_get_province_name(substr(trim(nvl(r.county_code_used_to_rate,pth.issue_state_code)),1,2), 'PROVINCE_CODE', 'L') rating_state_name,
hiroc_get_provicne_pol_prefix(substr(trim(nvl(r.county_code_used_to_rate,pth.issue_state_code)),1,2),p.policy_type_code) rating_prov_pol_prefix,
nvl(r.risk_cls_used_to_rate,pth.peer_groups_code) rating_peer_group_code,
hiroc_get_lookup_desc('PEER_GROUP',nvl(r.risk_cls_used_to_rate,pth.peer_groups_code),'L') rating_peer_group_name,
pth.policy_term_history_pk,
pth.term_base_record_fk,
to_char(pth.effective_from_date,'yyyy') term_effective_year,
c.coverage_pk,
c.coverage_base_record_fk,
pc.coverage_code,
c.product_coverage_code,
pc.long_description,
pp.coverage_component_code,
c.effective_from_date,
c.effective_to_date,
cls.coverage_code coverage_class_code,
cls.coverage_long_desc coverage_class_long_desc,
decode(pp.coverage_component_code ,'GROSS',cls.exposure_unit,null) exposure_unit, --hiroc_get_expos_units_by_cov(c.coverage_pk,pc.coverage_code,c.effective_from_date,c.effective_to_date) exposure_unit,
decode(pp.coverage_component_code ,'GROSS',cls.number_of_patient_day,null) number_of_patient_day,
pth.effective_from_date term_eff_from_date,
pth.effective_to_date term_eff_to_date,
pp.premium_amount premium_amount,
(case when (pc.coverage_code in ('CP','MC1','MC2','MC3','MC4','HR','F') or pc.coverage_code like 'ST%') and
pp.coverage_component_code != 'RISKMGMT' then
(nvl(pp.premium_amount,0))
else
0
end) primary_premium,
(hiroc_get_risk_units(hiroc_get_provicne_pol_prefix(substr(trim(nvl(r.county_code_used_to_rate,pth.issue_state_code)),1,2),p.policy_type_code)-- rating_prov_pol_prefix
,nvl(r.risk_cls_used_to_rate,pth.peer_groups_code) -- rating_peer_group_code
,cls.coverage_code --coverage_class_code
,decode(pp.coverage_component_code ,'GROSS',cls.exposure_unit,null)
,pp.premium_amount
,(case when (pc.coverage_code in ('CP','MC1','MC2','MC3','MC4','HR','F') or pc.coverage_code like 'ST%') and
pp.coverage_component_code != 'RISKMGMT' then
(nvl(pp.premium_amount,0))
else
0
end) -- primary_premium
,p.policy_type_code
,trunc(pth.effective_to_date))) risk_units
from proddw_mart.rmv_territory_makeup tm,
proddw_mart.rmv_premium_class_makeup pcm,
proddw_mart.rmv_product_coverage pc,
proddw_mart.rmv_coverage c,
proddw_mart.rmv_risk r,
proddw_mart.rmv_policy_term_history pth,
proddw_mart.rmv_policy p,
proddw_mart.rmv_transaction_log tl,
proddw_mart.rmv_policy_premium pp,
(select /* +rule */
p.policy_no,
p.policy_start_date,
p.policy_end_date,
r.risk_pk,
r.risk_description,
c.coverage_pk,
c.parent_coverage_base_record_fk,
pc.parent_product_covg_code,
pc.coverage_code,
pc.short_description coverage_short_desc,
pc.long_description coverage_long_desc,
c.exposure_unit,
pc.exposure_basis_code,
c.number_of_patient_day,
p.policy_start_date policy_effective_date,
p.policy_end_date policy_expiry_date,
c.effective_from_date,
c.effective_to_date,
to_char(c.effective_from_date,'YYYY') class_eff_year
from proddw_mart.odwr_coverage_only c
,proddw_mart.odwr_product_coverage pc
,proddw_mart.odwr_risk r
,proddw_mart.odwr_policy p
where pc.code = c.product_coverage_code
and pc.parent_product_covg_code is not null -- coverage classes only
and r.risk_pk = c.risk_base_record_fk
and c.accounting_to_date = to_date('1/1/3000','mm/dd/yyyy') -- only open records
and c.base_record_b = 'N'
and p.base_record_b = 'N'
and p.policy_pk = r.policy_fk
and p.accounting_to_date = to_date('1/1/3000','mm/dd/yyyy') -- only open records
group by p.policy_no,
p.policy_start_date,
p.policy_end_date,
r.risk_pk,
r.risk_description,
c.coverage_pk,
c.parent_coverage_base_record_fk,
pc.parent_product_covg_code,
pc.coverage_code,
pc.short_description, -- coverage_short_desc,
pc.long_description, -- coverage_long_desc,
c.exposure_unit,
pc.exposure_basis_code,
c.number_of_patient_day,
p.policy_start_date, -- policy_effective_date,
p.policy_end_date, -- policy_expiry_date,
c.effective_from_date,
c.effective_to_date,
to_char(c.effective_from_date,'YYYY')-- class_eff_year
) cls
where tm.risk_type_code = r.risk_type_code
and tm.county_code = r.county_code_used_to_rate
and tm.effective_from_date <= pp.rate_period_from_date
and tm.effective_to_date > pp.rate_period_from_date
and pcm.practice_state_code (+) = r.practice_state_code
and pcm.risk_class_code (+) = r.risk_cls_used_to_rate
and nvl(pcm.effective_from_date, pp.rate_period_from_date) <= pp.rate_period_from_date
and nvl(pcm.effective_to_date, to_date('01/01/3000','mm/dd/yyyy')) > pp.rate_period_from_date
and pc.code = c.product_coverage_code
and c.base_record_b = 'N'
and ( c.record_mode_code = 'OFFICIAL'
and (c.closing_trans_log_fk is null or
c.closing_trans_log_fk != tl.transaction_log_pk)
or c.record_mode_code = 'TEMP'
and c.transaction_log_fk = tl.transaction_log_pk )
and c.parent_coverage_base_record_fk is null
and c.effective_from_date < c.effective_to_date
and c.effective_from_date <= pp.rate_period_from_date
and c.effective_to_date > pp.rate_period_from_date
and c.accounting_from_date <= tl.accounting_date
and c.accounting_to_date > tl.accounting_date
and c.coverage_base_record_fk=pp.coverage_fk
and r.base_record_b = 'N'
and ( r.record_mode_code = 'OFFICIAL'
and (r.closing_trans_log_fk is null or
r.closing_trans_log_fk != tl.transaction_log_pk)
or r.record_mode_code = 'TEMP'
and r.transaction_log_fk = tl.transaction_log_pk )
and r.effective_from_date < r.effective_to_date
and r.effective_from_date <= pp.rate_period_from_date
and r.effective_to_date > pp.rate_period_from_date
and r.accounting_from_date <= tl.accounting_date
and r.accounting_to_date > tl.accounting_date
and r.risk_base_record_fk = pp.risk_fk
and pth.base_record_b = 'N'
and ( pth.record_mode_code = 'OFFICIAL'
and (pth.closing_trans_log_fk is null or
pth.closing_trans_log_fk != tl.transaction_log_pk)
or pth.record_mode_code = 'TEMP'
and pth.transaction_log_fk = tl.transaction_log_pk )
and pth.accounting_from_date <= tl.accounting_date
and pth.accounting_to_date > tl.accounting_date
and pth.term_base_record_fk = pp.policy_term_fk
and p.policy_pk = pp.policy_fk
and tl.transaction_log_pk = pp.transaction_log_fk
and pp.active_premium_b = 'Y'
and pp.rate_period_type_code in ('CS_PERIOD','SR_PERIOD')
and pp.rate_period_to_date > pp.rate_period_from_date
and tl.accounting_date <= sysdate
and p.policy_cycle_code = 'POLICY'
and substr(p.policy_no,1,1) <> 'Q'
and tl.transaction_log_pk = (select max(pp.transaction_log_fk)
from proddw_mart.rmv_policy_premium pp,proddw_mart.rmv_transaction_log tl2
where pth.term_base_record_fk = pp.policy_term_fk
and pp.transaction_log_fk = tl2.transaction_log_pk
and tl2.accounting_date <= sysdate )
and p.policy_type_code in ('LIABCRIME','MIDWIFE')
and pth.accounting_to_date = to_date('01/01/3000','mm/dd/yyyy') --<<<******* eliminates duplicates
and p.policy_no = cls.policy_no
-- and r.risk_pk = cls.risk_pk
and c.coverage_base_record_fk = cls.parent_coverage_base_record_fk(+)
and cls.effective_from_date < pth.effective_to_date -- from date less than period end date
and cls.effective_to_date > pth.effective_from_date -- to date greater than period start date
and cls.policy_effective_date < pth.effective_to_date -- from date less than period end date
and cls.policy_expiry_date > pth.effective_from_date -- to date greater than period start date
group by pp.policy_fk,
pp.transaction_log_fk,
p.policy_no,
p.policy_type_code,
pp.risk_fk,
r.risk_base_record_fk,
r.entity_fk,
substr(trim(nvl(r.county_code_used_to_rate,pth.issue_state_code)),1,2), -- rating_state_code,
r.county_code_used_to_rate,
pth.issue_state_code,
nvl(r.risk_cls_used_to_rate,pth.peer_groups_code) , -- rating_peer_group_code,
r.risk_cls_used_to_rate,
pth.peer_groups_code,
pth.policy_term_history_pk,
pth.term_base_record_fk,
to_char(pth.effective_from_date,'yyyy'), --term_effective_year,
c.coverage_pk,
c.coverage_base_record_fk,
pc.coverage_code,
c.product_coverage_code,
pc.long_description,
pp.coverage_component_code,
c.effective_from_date,
c.effective_to_date,
cls.coverage_code, -- coverage_class_code,
cls.coverage_long_desc, -- coverage_class_long_desc,
decode(pp.coverage_component_code ,'GROSS',cls.exposure_unit,null),-- exposure_unit,
decode(pp.coverage_component_code ,'GROSS',cls.number_of_patient_day,null), -- number_of_patient_day,
pth.effective_from_date, --term_eff_from_date,
pth.effective_to_date, --, --term_eff_to_date,
pp.premium_amount;

Edited by: BluShadow on 27-Jun-2012 16:12
added {noformat}{noformat} tags for readability. PLEASE READ {message:id=9360002} AS PREVIOUSLY REQUESTED!
Hi, I am using OSB 10.3. Trying to build the application withe the following flow. proxyservice(jms integration)-->businessservice(jms integration)-->MDB When a jms message with some custom headers are posted to a proxyservice, the custom headers are