Loading data into a CLOB column
I need to find out how to load about ten sentences of data into a CLOB column in a database table. I have a PL/SQL procedure that loads data from an XML file into various tables in the database. Recently, we added a column (test_dummy) to one of the tables and defined it as a CLOB. There is a corresponding node (detail_info) in the XML file that maps to this column. I need to figure out how to incorporate this into the PL/SQL procedure so that the data in the XML file for the detail_info node is loaded into "test_dummy". Any ideas?
Take it one step at a time. Use the 'extract' function to pull an XML snippet out of a given XML document. The question couldn't be more vague. Maybe an example would help?
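For instance, a minimal PL/SQL sketch of that suggestion. Only the detail_info node and the test_dummy column come from the original post; the table name, XPath, and key are assumptions for illustration:

```sql
DECLARE
  -- Assume the XML document has already been loaded into an XMLTYPE variable
  v_xml    XMLTYPE := XMLTYPE('<record><detail_info>about ten sentences of text...</detail_info></record>');
  v_detail CLOB;
BEGIN
  -- extract() pulls the node's text out of the document; getClobVal() returns it as a CLOB
  v_detail := v_xml.extract('/record/detail_info/text()').getClobVal();
  -- Load it into the new CLOB column (my_table and the key are hypothetical)
  UPDATE my_table
     SET test_dummy = v_detail
   WHERE id = 1;
END;
/
```

Untested without a database, but it shows the shape: extract the node once, then treat the result as an ordinary CLOB bind.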
Rahul
Similar Messages
-
How to insert more than 32k xml data into oracle clob column
How do I insert more than 32k of XML data into an Oracle CLOB column? The XML data is coming from a Java front end.
If we cannot use a CLOB, what are the different options available?
Are you facing any issue with my code? A string literal size error will come when you try to insert the full XML in string format.
public static boolean writeCLOBData(String tableName, String id, String columnName, String strContents) throws DataAccessException {
    boolean isUpdated = false;
    Connection connection = null;
    try {
        connection = ConnectionManager.getConnection();
        // Bind the CLOB value instead of concatenating it into the SQL text,
        // so the 32k string-literal limit does not apply.
        String sqlQuery = "UPDATE " + tableName + " SET " + columnName + " = ? WHERE ID = ?";
        PreparedStatement stmt = connection.prepareStatement(sqlQuery);
        // Stream the string contents into the CLOB parameter.
        Reader reader = new StringReader(strContents);
        stmt.setClob(1, reader);
        stmt.setString(2, id);
        stmt.executeUpdate();
        stmt.close();
        isUpdated = true;
    } catch (SQLException e) {
        e.printStackTrace();
    } finally {
        // release the connection here if your pool requires it
    }
    return isUpdated;
}
Try this JAVA code. -
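If the insert happens on the database side instead of through JDBC, the same principle applies in PL/SQL: bind the value rather than inlining it as a literal. A minimal sketch (the table and column names are hypothetical):

```sql
-- Hypothetical table: my_table(id NUMBER, xml_col CLOB)
DECLARE
  v_xml CLOB;
BEGIN
  -- A CLOB variable can hold far more than 32k; only string *literals* are limited,
  -- so build or receive the value in the variable and bind it.
  v_xml := TO_CLOB('...large XML document received from the front end...');
  INSERT INTO my_table (id, xml_col) VALUES (1, v_xml);
  COMMIT;
END;
/
```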
How to insert data into a CLOB column
In UIX XML I used a bc4j:textInput control for a CLOB column, and I cannot type any letter in this field. Why?
Regards,
Lucian
Hello again!
Here are my findings (Content is a CLOB column; I use UIX XML + BC4J):
1. If I put:
<inlineMessage prompt="Content" required="no" vAlign="middle" >
<contents>
<bc4j:textInput name="Content" attrName="Content" columns="162" rows="5" />
</contents>
</inlineMessage>
I cannot type any letter in the textInput field!
2. If I put
<inlineMessage prompt="Content" required="no" vAlign="middle" >
<contents>
<bc4j:attrScope name="Content">
<contents>
<textInput name="Content" columns="162" rows="5">
<boundAttribute name="text">
<bc4j:attrProperty name="value"/>
</boundAttribute>
</textInput>
</contents>
</bc4j:attrScope>
</contents>
</inlineMessage>
I can type in the textInput field, but I cannot save the text after a certain number of characters. Also, I cannot see the saved text. I receive:
Servlet error: Renderer failed: java.lang.ArrayIndexOutOfBoundsException: -35
inside the textInput field.
Please help me ...
Regards,
Lucian -
Error while loading data into clob data type.
Hi,
I have created an interface to load data from an Oracle table into another Oracle table. In the target table we have an attribute with the CLOB data type. While loading data into the CLOB field, ODI gave the error below. I use ODI 10.1.3.6.0.
java.lang.NumberFormatException: For input string: "4294967295"
at java.lang.NumberFormatException.forInputString(Unknown Source)
at java.lang.Integer.parseInt(Unknown Source)
at java.lang.Integer.parseInt(Unknown Source)
Let me know if anyone has come across and resolved this kind of issue.
Thanks much,
Nishit Gajjar
Mr. Gajjar,
You didn't mention which KMs you are using.
Have a read of
Re: Facing issues while using BLOB
and
Load BLOB column in Oracle to Image column in MS SQL Server
Try again.
And could you please mark the answers as Correct/Helpful too.
Edited by: actdi on Jan 10, 2012 10:45 AM -
How can I load data into table with SQL*LOADER
How can I load data into a table with SQL*Loader when the column data length is more than 255 bytes? When a column exceeds 255 bytes, the data cannot be inserted into the table by SQL*Loader.
CREATE TABLE A (
A VARCHAR2 ( 10 ) ,
B VARCHAR2 ( 10 ) ,
C VARCHAR2 ( 10 ) ,
E VARCHAR2 ( 2000 ) );
control file:
load data
append into table A
fields terminated by X'09'
(A , B , C , E )
SQL*LOADER command:
sqlldr test/test control=A_ctl.txt data=A.xls log=b.log
datafile:
column E is more than 255 bytes
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
1 1 1 1234567------(more than 255bytes)
Check this out.
http://download-west.oracle.com/docs/cd/B10501_01/server.920/a96652/ch06.htm#1006961 -
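The linked chapter's point, in short: SQL*Loader defaults character fields to CHAR(255), so any field longer than that is rejected unless you declare an explicit length. A sketch of the fix applied to the control file above:

```sql
load data
append into table A
fields terminated by X'09'
( A,
  B,
  C,
  E CHAR(2000)  -- override the default 255-byte CHAR length
)
```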
Greetings.
I have successfully worked out inserting SQL data (2008 R2) into my 2010 SharePoint list (New, Update, Delete) by creating an SSIS Data Flow Task as outlined here:
http://fsugeiger.blogspot.com/2010/01/synchronise-sql-table-with-sharepoint.html
However, the problem I am running into is inserting data into the SharePoint columns that are "Lookup" column types. I verified that all of the values I am copying from SQL into the SharePoint lookup column exist in the custom list it is pointing to. It
is important that this column be a lookup column, as it links to another custom list that has many more columns of related information.
I have read and re-read the SharePoint SSIS Adapters 2011.docx from
http://sqlsrvintegrationsrv.codeplex.com/ and the only section that seems to apply is this:
"Looking Up Values in a SharePoint List
If you have to look up a value in a SharePoint list, you can use the Lookup transformation in your data flow, and use the SharePoint List source to load the lookup table. You may have to add a Derived Column transformation or a Script component that splits
data in the lookup column on the ";#" delimiter to separate the ID value from the description.
If you are replacing values in your data with the values that you look up in the list, then loading the changed data back into SharePoint, you only have to include the ID from the lookup column. SharePoint ignores the description if you include it."
I am not sure if the above statement means that I should be passing the associated IDs rather than the actual data into the SharePoint List destination. If that is the case, that will not really work, as the lookup contains hundreds of rows. Not to mention
I have several of these lookup column types pointing to several different lists.
Any guidance on how I can put data into a SharePoint Lookup column type via a Data Flow Task would be much appreciated.
Thank you.
My errors are:
Error: 0x0 at Data Flow Task, SharePoint List Destination: Error on row ID="1": 0x1 - Unspecified error, such as too many items being updated at once (batch), or an invalid core field value.
Error: 0xC0047062 at Data Flow Task, SharePoint List Destination [1903]: Microsoft.Samples.SqlServer.SSIS.SharePointListAdapters.PipelineProcessException: Errors detected in this component - see SSIS Errors at Microsoft.Samples.SqlServer.SSIS.SharePointListAdapters.SharePointListDestination.ProcessInput(Int32
inputID, PipelineBuffer buffer) at Microsoft.SqlServer.Dts.Pipeline.ManagedComponentHost.HostProcessInput(IDTSManagedComponentWrapper100 wrapper, Int32 inputID, IDTSBuffer100 pDTSBuffer, IntPtr bufferWirePacket)
Error: 0xC0047022 at Data Flow Task, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "SharePoint List Destination" (1903) failed with error code 0x80131500 while processing input "Component Input" (1912). The identified
component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will
I have found a solution to my problem and thought I would share it here in case there are others who are struggling with the above scenario. If you have a better way, I would love to hear about it, since my way is a bit tedious.
In a nutshell, in order to have an SSIS package put data from an OLE DB Source into a SharePoint List Destination Lookup Column, you need to pass the ID of the value that is being looked up, not the value that is in the “master” OLE DB source.
Rough explanation, OLE DB Source value for column “Approp” is “4005” --> SQL matches “4005” with the ID in the new lookup table (“4005” = ID “5” as defined in the SharePoint lookup list) --> “5” gets passed into SharePoint List destination lookup
column --> SharePoint displays “4005” and successfully links to the lookup list.
Funny thing (not really): the error(s) outlined in my original post are not related to getting data into a SharePoint Lookup column, as I am now successful in getting data into the system, but I am still getting the same error(s). I think it has to do
with the ID column in the SharePoint list destination. What I can't seem to figure out is why, since I am not linking any data to that ID column (at least on new records). I am, however, linking it on Update and Delete, and then the errors mentioned above disappear
and things work well.
There are three tasks that need to get done in order to get data from SQL into a SharePoint lookup column assuming you have already set up your SharePoint lookup lists:
1. Create new lookup table(s) in SQL that have the IDs from the SharePoint lookup list and the values coming from the “master” OLE DB Source. You can see the ID column in SharePoint by toggling it on in a view.
2. Create a SQL command that JOINs all the databases and tables so that the ID is passed and not the value into the SharePoint lookup column
3. Change the “Data access mode” to “SQL Command” instead of the “Table or view” in the OLE DB Source and paste your command into the “SQL command text:” area.
Other helpful info is that you may also need to add additional columns in the new lookup tables in SQL for the scenarios when the data is not unique. You can see this two times in my SQL command example for Units and JobTitles:
SELECT
pps.SSNm,
pps.file_updated,
pps.Employee_id,
/* pps.CheckDistNm,*/
Check_Distribution_id = COALESCE( d.ID, 0 ),
pps.Job_nbr,
pps.SeqNm,
pps.action_eff_dt,
Fund_id = COALESCE( f.id, 0 ),
Appropriation_id = COALESCE( ap.id, 0 ),
ActionCode_id = COALESCE( ac.id, 0 ),
SpecNumber_id = COALESCE( jt.ID, 0 ),
pps.Employee_id,
/* pps.Fund,
pps.Approp,
pps.Unit,*/
Unit_id = COALESCE( u.ID, 0 ),
PosNm,
PosCode,
pps.LastName,
pps.FirstName,
pps.MI
FROM
x_PPS.aReportVw.pps_screens_active AS pps
LEFT OUTER JOIN dbo.DistributionNumbers AS d ON
pps.CheckDistNm = d.Check_Distribution
LEFT OUTER JOIN dbo.Units AS u ON
pps.Fund = u.Fund AND
pps.Approp = u.Approp AND
pps.Unit = u.Unit
LEFT OUTER JOIN dbo.Appropriations AS ap ON
pps.Approp = ap.Approp
LEFT OUTER JOIN dbo.Funds AS f ON
pps.Fund = f.Fund
LEFT OUTER JOIN dbo.ActionCodes AS ac ON
pps.ActionCode = ac.ActionCode
LEFT OUTER JOIN dbo.JobTitles AS jt ON
pps.SpecNm = jt.SpecNumber AND
pps.JurisClass = jt.JurisClass -
ORA-22275 inserting into the CLOB column using ODBC input parameters
Hi all,
I'm having a problem with INSERT into a CLOB column via bound input parameters.
After calling SQLExecDirect() I'm getting following error:
[Oracle][ODBC][Ora]ORA-22275: invalid LOB locator specified
Adding defaults to the table definitions does not help. If I embed parameter values into the SQL statement, everything works just fine.
I use Oracle 9.2 with latest Oracle ODBC driver 9.2.0.4 under Windows XP.
Any ideas appreciated...
Vlad
Code looks like this:
SQLBindParameter(hstmt,1,...);
SQLBindParameter(hstmt,2,...);
SQLBindParameter(hstmt,3,...);
SQLExecDirect(hstmt,...);
SQL statement looks like this:
insert into tst_table (id,str_fld,clob_fld1,clob_fld2) values (50, ? , ? , ?)
Table looks like this:
CREATE TABLE tst_table (
id number (10,0) NOT NULL ,
str_fld nvarchar2 (50) NOT NULL ,
clob_fld1 nclob NOT NULL ,
clob_fld2 nclob NOT NULL ,
CONSTRAINT PK_tst_table PRIMARY KEY ( id )
);
I tried to add defaults to the table, but the result is the same:
CREATE TABLE tst_table (
id number (10,0) NOT NULL ,
str_fld nvarchar2 (50) NOT NULL ,
clob_fld1 nclob default EMPTY_CLOB() NOT NULL ,
clob_fld2 nclob default EMPTY_CLOB() NOT NULL ,
CONSTRAINT PK_tst_table PRIMARY KEY ( id )
);
You need to provide the data at execution time (i.e. SQL_LEN_DATA_AT_EXEC(0) in the SQLBindParameter call, followed by a series of SQLPutData calls). If you go to Metalink,
Top Tech Docs | Oracle ODBC Driver | Scripts & Sample Code
has some sample code that shows you how to do this.
Justin -
How to load data into user tables using DIAPIs?
Hi,
I have created an user table using UserTablesMD object.
But I don't know how to load data into this user table. I guess I have to use the UserTable object for that, but I still don't know how to put data into a particular column.
Can somebody please help me with this?
I would appreciate if somebody can share their code in this regard.
Thank you,
Sudha
You can try this code:
Dim lRetCode As Long
Dim userTable As SAPbobsCOM.UserTable
userTable = pCompany.UserTables.Item("My_Table")
'First row in the @My_Table table
userTable.Code = "A1"
userTable.Name = "A.1"
userTable.UserFields.Fields.Item("U_1stF").Value = "First row value"
lRetCode = userTable.Add()
'Second row in the @My_Table table
userTable.Code = "A2"
userTable.Name = "A.2"
userTable.UserFields.Fields.Item("U_1stF").Value = "Second row value"
lRetCode = userTable.Add()
This way I have added 2 lines in my table.
Hope it helps
Trinidad. -
Error when loading data into Planning
Unable to load data into planning 11.1.2.3.200 using ODI 11.1.1.7
Please find the errors from below logs:
INFO [SimpleAsyncTaskExecutor-2]: Oracle Data Integrator Adapter for Hyperion Planning
INFO [SimpleAsyncTaskExecutor-2]: Connecting to planning application [xxxxxxx] on [xxxxxxxxxxx]:[xxxx] using username [admin].
INFO [SimpleAsyncTaskExecutor-2]: Successfully connected to the planning application.
INFO [SimpleAsyncTaskExecutor-2]: The load options for the planning load are
Dimension Name: Account Sort Parent Child : false
Load Order By Input : false
Refresh Database : false
INFO [SimpleAsyncTaskExecutor-2]: Begining the load process.
DEBUG [SimpleAsyncTaskExecutor-2]: Number of columns in the source result set does not match the number of planning target columns.
INFO [SimpleAsyncTaskExecutor-2]: Load type is [Load dimension member].
ERROR [SimpleAsyncTaskExecutor-2]: Record [[A603010, null, null, null, null, null, null, null, null, null, null, null, xxxxx, -100, F3E0,C011,E7172_93275,FY17,Stage 1,Current Service Level,Jul, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null]] was rejected by the Planning Server.
ERROR [SimpleAsyncTaskExecutor-2]: Record [[A601060, null, null, null, null, null, null, null, null, null, null, null, xxxxx, -250, F3E0,C011,E7172_93275,FY17,Stage 1,Current Service Level,Jul, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null, null]] was rejected by the Planning Server.
log.err
Account,Data Load Cube Name,Budget,Point-of-View,Error_Reason
A603010,xxxxx,-100,F3E0,C011,E7172_93275,FY17,Stage 1,Current Service Level,Jul,Cannot load dimension member, error message is: RemoteException occurred in server thread; nested exception is:
java.rmi.UnmarshalException: unrecognized method hash: method not supported by remote object
A601060,xxxxx,-250,F3E0,C011,E7172_93275,FY17,Stage 1,Current Service Level,Jul,Cannot load dimension member, error message is: RemoteException occurred in server thread; nested exception is:
java.rmi.UnmarshalException: unrecognized method hash: method not supported by remote object
FDMEE log:
ERROR [AIF]: Error: No records exist for Period 'Pd 2 - 2014-08-01'
ERROR [AIF]: Error: No records exist for Period 'Pd 3 - 2014-09-01'
FDMEE Logging Level is set to 5
Are you sure that you haven't got Planning 11.1.2.3.500 in your environment? This sounds spookily similar to the issue described in note 1678759.1, which was seen after applying Planning 11.1.2.3.500. If it's definitely Planning 11.1.2.3.500, is there any chance that someone has applied ODI patch 18687916 to the ODI home (C:\Oracle\Middleware\odi)?
If you *are* running Planning 11.1.2.3.500 then you might need to apply patch 18687916 to C:\Oracle\Middleware\odi. You might need note 1683307.1 to take some of the pain out of applying this patch, though.
Given the error that you're seeing, though, there's a mismatch in versions of JAR files between ODI and the Planning server somewhere.
Regards
Craig -
Loading data into existing table
Hi, I have tried to load data into a large table from a CSV file but am not having any success. I have this control file:
LOAD DATA
INFILE 'Book1.xls'
BADFILE 'p_sum_bad.txt'
DISCARDFILE 'p_sum_dis.txt'
APPEND
INTO TABLE p_sum
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
SUMMARY_LEVEL ,
PERIOD_START_TIME ,
BUSY_HOUR ,
OMC ,
INT_ID ,
BTS_ID ,
BTS_INT_ID ,
CELL_GROUP ,
HO_PERIOD_DURATION ,
POWER_PERIOD_DURATION ,
MSC_I_SUCC_HO ,
MSC_I_TCH_TCH ,
MSC_I_SDCCH_TCH ,
MSC_I_SDCCH ,
MSC_I_TCH_TCH_AT ,
MSC_I_SDCCH_TCH_AT ,
MSC_I_SDCCH_AT ,
MSC_I_FAIL_LACK ,
MSC_I_FAIL_CONN ,
MSC_I_FAIL_BSS ,
MSC_I_END_OF_HO ,
MSC_O_SUCC_HO
)
The data is:
2 3-Nov-06 1000033 9 8092220 1440 1440 5411 5374 7 30 5941
2 3-Nov-06 1000033 10 1392190 1440 1440 0 0 0 0 0
2 3-Nov-06 2000413 3 2127446 1440 1440 80 80 0 0 83
2 3-Nov-06 2000413 4 2021248 1140 1440 0 0 0 0 0
2 3-Nov-06 2000413 5 2021252 1080 1440 1 1 0 0 1
2 3-Nov-06 2000413 6 2130163 1440 1440 2200 2193 2 5 2224
2 3-Nov-06 2000413 7 6205155 1020 1440 0 0 0 0 0
2 3-Nov-06 2000413 8 6200768 900 1440 30 30 0 0 31
2 3-Nov-06 2000413 10 2111877 1440 1440 0 0 0 0 0
2 3-Nov-06 1000033 18 1076419 1440 1440 75 73 0 2 79
2 3-Nov-06 1000033 19 8089060 1440 1440 0 0 0 0 0
but when I try to load the data, I get:
Column Name Position Len Term Encl Datatype
SUMMARY_LEVEL FIRST * , O(") CHARACTER
PERIOD_START_TIME NEXT * , O(") CHARACTER
Record 51: Rejected - Error on table OMC.P_SUM_BTS_HO_POWER, column SUMMARY_LEVEL.
ORA-01722: invalid number
I believe the data being loaded has to be NUMBER. Can anyone advise what I need to change to load the data? Thanks
Justin,
Tried that, no luck:
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table P_SUM, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
SUMMARY_LEVEL FIRST * WHT O(") CHARACTER
PERIOD_START_TIME NEXT * WHT O(") CHARACTER
BUSY_HOUR NEXT * WHT O(") CHARACTER
OMC NEXT * WHT O(") CHARACTER
INT_ID NEXT * WHT O(") CHARACTER
BTS_ID NEXT * WHT O(") CHARACTER
BTS_INT_ID NEXT * WHT O(") CHARACTER
CELL_GROUP NEXT * WHT O(") CHARACTER
Record 51: Rejected - Error on table OMC.P_SUM_BTS_HO_POWER, column SUMMARY_LEVEL.
ORA-01722: invalid number
Any other suggestions? -
I am loading data into a table I created, which includes a column "Description" with data type VARCHAR2(1000). When I go to load the data, which is less than 1000 characters, I receive the following error message:
Record 38: Rejected - Error on table SSW_INPUTS, column DESCRIPTION.
Field in data file exceeds maximum length
I have increased the size of the column but that does not seem to fix the error. Does anyone know what this error means? Another thought is that I created the "Description" column too large... which can't be true, because I would have received an error when I created the table. Plus, I already loaded data into a similar table with similar data and had no problems!
Someone please help...
Thank you,
April.
Note that I'm assuming Oracle8(i) behavior. Oracle9 may treat Unicode differently.
Are you inserting Unicode data into the table? Declaring a variable as varchar2(1000) indicates that Oracle should reserve 1000 bytes for data. If you're inserting UTF-8 encoded data, each character may take up to 3 bytes to store. Thus, 334 characters of data could theoretically overflow a varchar2(1000) variable.
Note that UTF-8 is designed so that the most commonly used characters are stored in 1 byte, less commonly used characters are stored in 2 bytes, and the remainder is stored in 3 bytes. On average, this will require less space than the more familiar UCS-2 encoding which stores every character as 2 bytes of data.
Justin -
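Building on Justin's diagnosis: on Oracle9i and later, character-length semantics let you size the column in characters rather than bytes, which sidesteps the UTF-8 overflow entirely. A hedged sketch using the table named in the error message:

```sql
-- 1000 characters, however many bytes each one takes in UTF-8
ALTER TABLE ssw_inputs MODIFY (description VARCHAR2(1000 CHAR));
```

For SQL*Loader itself, the companion fix is declaring the field length in the control file (e.g. DESCRIPTION CHAR(1000)), since its own default field length is 255 bytes.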
Shell Script Programming -- Loading data into table
Hello Gurus
I am using Oracle's SQL*Loader utility to load data into a table. Lately, I got an unlikely scenario wherein I need to process the data file before loading it into the table, and that is where I need help from you guys.
Consider the following data line
"Emp", DOB, Gender, Subject
"1",01/01/1980,"M","Physics:01/05/2010"
"2",01/01/1981,"M","Chemistry:02/05/2010|Maths:02/06/2011"
"3",01/01/1982,"M","Maths:03/05/2010|Physics:06/07/2010|Chemistry:08/09/2011"
"4",01/01/1983,"M","Biology:09/09/2010|English:10/10/2010"Employee - 1 will get loaded as a single record in the table. But I need to put Subject value into two separate fields into table. i.e. Physics into one column and date - 01/05/2010 into separate column.
Here is where the big problem starts.
Employee 2 should get loaded as 2 records into the table. The first record should have Chemistry as the subject and 02/05/2010 as the date, and the next record should have all other fields the same except the subject should be Maths and the date 02/06/2011. The subjects are separated by a pipe "|" in the data file.
Similarly, Employee 3 should get loaded as 3 records: one as Maths, the second as Physics, and the third as Chemistry, along with their respective dates.
I hope I have made my problem clear to everyone.
I am looking to do something in shell scripting such that before finally running the sql*loader script, the above 4 employees have their records repeated as many times as their subject changes.
In summary, 2 problems are described above:
1. Load the subject and date into 2 separate fields in the Oracle table at load time.
2. If multiple subjects exist, load the record as many times as there are subjects for that employee.
Any help would be much appreciated.
Thanks.
Here are some comments. Perl can be a little cryptic, but once you get used to it, it can be pretty powerful.
#!/usr/bin/perl -w
my $line_count = 0;
open FILE, "test_file" or die $!;
# Read each line from the file.
while (my $line = <FILE>) {
    # Print the header if it is the first line.
    if ($line_count == 0) {
        chomp($line);
        print $line . ", Date\n";
        ++$line_count;
        next;
    }
    # Get all the columns (separated by ',') into an array.
    my @columns = split(',', $line);
    # Remove the newline from the fourth column.
    chomp($columns[3]);
    # Read the fields (separated by a pipe) from the fourth column into an array.
    my @subject_and_date = split('\|', $columns[3]);
    # Emit one output line per subject/date pair.
    foreach my $sub_and_date (@subject_and_date) {
        # Print the values of Emp, DOB, and Gender first.
        print $columns[0] . "," . $columns[1] . "," . $columns[2] . ",";
        # Remove all double quotes from the subject and date string.
        $sub_and_date =~ s/"//g;
        # Replace the first ':' with '","' to split subject and date into two fields.
        $sub_and_date =~ s/:/","/;
        print '"' . $sub_and_date . '"' . "\n";
    }
    ++$line_count;
}
close FILE;
Loading Data into Table with Complex Transformations
Hello Guys
I am trying to load data into one of the dimension tables. It has quite a few transformations, and I created 6 temp tables:
1. It has 7 columns and gets 935 rows based on a where condition.
2. It has 10 columns and gets 935 rows, but it has nulls in it, i.e. for column 1 there are 500 values, for column 2 there are 300, etc.
3, 4, 5, 6 are all the same as table 2.
At the end, I am trying to join all the temp tables on Product_id (which is in each temp table) into the target.
I am getting an error saying the primary key constraint is violated, i.e. non-unique values are being inserted into the Product_id column of the target table, and the job runs for hours.
The main problem is that some of the rows have the same Product_id.
Please help me
I have been Trying for 1 week and i am in Full pressure
Thanks
Sriks
Edited by: Sriks on Oct 16, 2008 6:43 PM
Hi,
If you are creating a warehouse and product_key is your PK, then each value should come only once, so you might have to rethink your logic for getting the data. To get over the issue you can disable the constraint and load without it, but I would have you look at the logic and make sure you have only one product_key per row in the table.
Regards
Bharath -
Sql* Loader syntax to load data into a partitioned table
Hi All,
I was trying to load data from a csv file to a partitioned table.
The table name is countries_info
columns :
country_code ,
country_name,
country_language
The column country_code is list partitioned with partition p1 for values 'EN','DE','IN' etc
and partition p2 for 'KR','AR','IT' etc.
I tried to load data into this table, but I was getting an error message (not mapping to the partition key).
I tried this syntax:
load data
infile 'countries.csv'
append
into table countries_info
partition(p1) ,
partition(p2)
fields terminated by ','
country_code ,
country_name,
country_language
What is the correct syntax? I searched a lot but have not been able to find out.
Peeush_Rediff wrote:
Hi All,
I tried to load data into this table, but I was getting some error Message (not mapping to partitioned key).
It's not "some error message"; it's relevant information for resolving problems you encounter while using Oracle.
In your case, although you didn't specify the exact ORA error you received, it sounds like [ORA-14400|http://forums.oracle.com/forums/search.jspa?threadID=&q=ORA-14400&objID=f61&dateRange=all&userID=&numResults=15&rankBy=10001]
What is the correct syntax- I searched a lot but have not been able to find out.
It's not about correct syntax; it's about understanding the message that was trying to tell you that something went wrong.
That message was (I guess) ORA-14400.
So, referring to that ORA message, you will need to add a new partition where the data from your CSV file that currently doesn't map to any of the specified partitions would fit, or add a partition that can accept the range of values for which you received that ORA message.
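Assuming the ORA-14400 diagnosis is right, either of these list-partition changes would give the unmapped rows somewhere to land (the partition names and values here are hypothetical):

```sql
-- Add a partition for the specific country codes that don't map yet...
ALTER TABLE countries_info ADD PARTITION p3 VALUES ('FR', 'ES');
-- ...or add a catch-all DEFAULT partition
ALTER TABLE countries_info ADD PARTITION p_other VALUES (DEFAULT);
```

Note also that the partition(p1), partition(p2) clauses in the control file are unnecessary: with a plain `into table countries_info`, Oracle routes each row to the correct partition automatically.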
Procedure for loading data into gl_interface
hi all
I am new to Oracle Applications. I just want to know how to load data into the gl_interface table using PL/SQL (that is, first loading data into a temporary table and then using a PL/SQL procedure to load it into gl_interface). Can anybody help me out with this by providing the PL/SQL structure for it?
Thanks in advance
Assuming you have data in a datafile and the file is comma delimited. I assume the table has two columns; you can add more columns also.
CREATE OR REPLACE PROCEDURE p10
IS
lv_filehandle UTL_FILE.FILE_TYPE;
lv_newline VARCHAR2(2000); -- Input line
lv_file_dir VARCHAR2(100);
lv_file_name VARCHAR2(100);
lv_col1 VARCHAR2(10);
lv_col2 VARCHAR2(50);
BEGIN
DELETE FROM temp_table;
lv_file_dir := 'p:\temp';
lv_file_name := 'test.dat';
-- Open the data file for reading
lv_filehandle := UTL_FILE.FOPEN (lv_file_dir, lv_file_name, 'r', 32766);
LOOP
BEGIN
UTL_FILE.GET_LINE (lv_filehandle, lv_newline);
EXCEPTION
WHEN NO_DATA_FOUND THEN
EXIT; -- end of file reached
END;
-- First comma-delimited field
lv_col1 := SUBSTR (lv_newline, 1, INSTR (lv_newline, ',') - 1);
-- Second comma-delimited field (everything after the first comma)
lv_col2 := SUBSTR (lv_newline, INSTR (lv_newline, ',') + 1);
INSERT INTO temp_table VALUES (lv_col1, lv_col2);
END LOOP;
COMMIT;
-- Move the staged rows into the production table
INSERT INTO your_production_table
SELECT * FROM temp_table;
COMMIT;
UTL_FILE.FCLOSE (lv_filehandle);
EXCEPTION
WHEN UTL_FILE.INVALID_PATH THEN
RAISE_APPLICATION_ERROR(-20100,'Invalid Path');
WHEN UTL_FILE.INVALID_MODE THEN
RAISE_APPLICATION_ERROR(-20101,'Invalid Mode');
WHEN UTL_FILE.INVALID_OPERATION THEN
RAISE_APPLICATION_ERROR(-20102,'Invalid Operation');
WHEN UTL_FILE.INVALID_FILEHANDLE THEN
RAISE_APPLICATION_ERROR(-20103,'Invalid Filehandle');
WHEN UTL_FILE.READ_ERROR THEN
RAISE_APPLICATION_ERROR(-20105,'Read Error');
WHEN UTL_FILE.INTERNAL_ERROR THEN
RAISE_APPLICATION_ERROR(-20106,'Internal Error');
WHEN OTHERS THEN
IF UTL_FILE.IS_OPEN (lv_filehandle) THEN
UTL_FILE.FCLOSE (lv_filehandle);
END IF;
RAISE;
END p10;
/
Code is not tested.
Hope this helps
Ghulam