Loading data into existing table
Hi, I have tried to load data into a large table from a CSV file, but without success. This is my control file:
LOAD DATA
INFILE 'Book1.xls'
BADFILE 'p_sum_bad.txt'
DISCARDFILE 'p_sum_dis.txt'
APPEND
INTO TABLE p_sum
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
SUMMARY_LEVEL ,
PERIOD_START_TIME ,
BUSY_HOUR ,
OMC ,
INT_ID ,
BTS_ID ,
BTS_INT_ID ,
CELL_GROUP ,
HO_PERIOD_DURATION ,
POWER_PERIOD_DURATION ,
MSC_I_SUCC_HO ,
MSC_I_TCH_TCH ,
MSC_I_SDCCH_TCH ,
MSC_I_SDCCH ,
MSC_I_TCH_TCH_AT ,
MSC_I_SDCCH_TCH_AT ,
MSC_I_SDCCH_AT ,
MSC_I_FAIL_LACK ,
MSC_I_FAIL_CONN ,
MSC_I_FAIL_BSS ,
MSC_I_END_OF_HO ,
MSC_O_SUCC_HO ,
The data is:
2 3-Nov-06 1000033 9 8092220 1440 1440 5411 5374 7 30 5941
2 3-Nov-06 1000033 10 1392190 1440 1440 0 0 0 0 0
2 3-Nov-06 2000413 3 2127446 1440 1440 80 80 0 0 83
2 3-Nov-06 2000413 4 2021248 1140 1440 0 0 0 0 0
2 3-Nov-06 2000413 5 2021252 1080 1440 1 1 0 0 1
2 3-Nov-06 2000413 6 2130163 1440 1440 2200 2193 2 5 2224
2 3-Nov-06 2000413 7 6205155 1020 1440 0 0 0 0 0
2 3-Nov-06 2000413 8 6200768 900 1440 30 30 0 0 31
2 3-Nov-06 2000413 10 2111877 1440 1440 0 0 0 0 0
2 3-Nov-06 1000033 18 1076419 1440 1440 75 73 0 2 79
2 3-Nov-06 1000033 19 8089060 1440 1440 0 0 0 0 0
but when I try to load the data, I get:
Column Name Position Len Term Encl Datatype
SUMMARY_LEVEL FIRST * , O(") CHARACTER
PERIOD_START_TIME NEXT * , O(") CHARACTER
Record 51: Rejected - Error on table OMC.P_SUM_BTS_HO_POWER, column SUMMARY_LEVEL.
ORA-01722: invalid number
I believe the data being loaded has to be NUMBER. Can anyone advise what I need to change to load the data? Thanks
Justin,
Tried that, no luck:
Number to load: ALL
Number to skip: 0
Errors allowed: 50
Bind array: 64 rows, maximum of 256000 bytes
Continuation: none specified
Path used: Conventional
Table P_SUM, loaded from every logical record.
Insert option in effect for this table: APPEND
TRAILING NULLCOLS option in effect
Column Name Position Len Term Encl Datatype
SUMMARY_LEVEL FIRST * WHT O(") CHARACTER
PERIOD_START_TIME NEXT * WHT O(") CHARACTER
BUSY_HOUR NEXT * WHT O(") CHARACTER
OMC NEXT * WHT O(") CHARACTER
INT_ID NEXT * WHT O(") CHARACTER
BTS_ID NEXT * WHT O(") CHARACTER
BTS_INT_ID NEXT * WHT O(") CHARACTER
CELL_GROUP NEXT * WHT O(") CHARACTER
Record 51: Rejected - Error on table OMC.P_SUM_BTS_HO_POWER, column SUMMARY_LEVEL.
ORA-01722: invalid number
Any other suggestions?
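For reference, two things stand out in the original post. First, SQL*Loader reads plain text, so a binary Book1.xls spreadsheet would need to be saved as CSV/text before loading. Second, the sample rows are separated by whitespace, not commas, and PERIOD_START_TIME looks like a date that needs a mask. A sketch of a control file along those lines (the date mask and input filename are assumptions, and the column list is abbreviated as in the post):

```
LOAD DATA
INFILE 'p_sum.csv'
BADFILE 'p_sum_bad.txt'
DISCARDFILE 'p_sum_dis.txt'
APPEND
INTO TABLE p_sum
FIELDS TERMINATED BY WHITESPACE
TRAILING NULLCOLS
(
  SUMMARY_LEVEL,
  PERIOD_START_TIME DATE "DD-Mon-RR",
  BUSY_HOUR,
  ...
)
```

If record 51 still rejects after that, look at that exact record in the bad file; a header, trailer, or blank line in the data would also raise ORA-01722 on a NUMBER column.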
Similar Messages
-
I am loading data into a table I created which includes a column "Description" with data type VARCHAR2(1000). When I go to load the data, which is less than 1000 characters, I receive the following error message:
Record 38: Rejected - Error on table SSW_INPUTS, column DESCRIPTION.
Field in data file exceeds maximum length
I have increased the size of the column, but that does not seem to fix the error. Does anyone know what this error means? Another thought is that I may have created the "Description" column too large... which can't be true, because I would have received an error when I created the table. Plus, I have already loaded data into a similar table with similar data and had no problems!
Someone please help...
Thank you,
April.
Note that I'm assuming Oracle8(i) behavior. Oracle9 may treat Unicode differently.
Are you inserting Unicode data into the table? Declaring a variable as varchar2(1000) indicates that Oracle should reserve 1000 bytes for data. If you're inserting UTF-8 encoded data, each character may take up to 3 bytes to store. Thus, 334 characters of data could theoretically overflow a varchar2(1000) variable.
Note that UTF-8 is designed so that the most commonly used characters are stored in 1 byte, less commonly used characters are stored in 2 bytes, and the remainder is stored in 3 bytes. On average, this will require less space than the more familiar UCS-2 encoding which stores every character as 2 bytes of data.
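If byte expansion is indeed the cause, one common fix on Oracle9i and later (not available under the Oracle8i behavior assumed above) is to declare the column with character rather than byte length semantics. A sketch with an invented table name:

```sql
-- Reserves 1000 characters, however many bytes each character
-- occupies in the database character set (up to 3 in UTF-8).
CREATE TABLE ssw_inputs_demo (
  description VARCHAR2(1000 CHAR)
);
```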
Justin -
Error while loading data into External table from the flat files
HI ,
We have a data load in our project which feeds Oracle external tables with data from flat files (.bcp files) on Unix.
While loading the data, we are encountering the following error.
Error occurred (Error Code: -29913 and Error Message: ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
KUP-04063: un) while loading data into table_ext
Please let us know what needs to be done in this case to solve this problem.
Thanks,
Kartheek
Kartheek,
I used Google (mine still works).... please check those links:
http://oraclequirks.blogspot.com/2008/07/ora-29400-data-cartridge-error.html
http://jonathanlewis.wordpress.com/2011/02/15/ora-29913/
HTH,
Thierry -
How to load data into user tables using DIAPIs?
Hi,
I have created a user table using the UserTablesMD object.
But I don't know how to load data into this user table. I guess I have to use the UserTable object for that, but I still don't know how to put data into a particular column.
Can somebody please help me with this?
I would appreciate if somebody can share their code in this regard.
Thank you,
Sudha
You can try this code:
Dim lRetCode As Long
Dim userTable As SAPbobsCOM.UserTable
userTable = pCompany.UserTables.Item("My_Table")
'First row in the @My_Table table
userTable.Code = "A1"
userTable.Name = "A.1"
userTable.UserFields.Fields.Item("U_1stF").Value = "First row value"
lRetCode = userTable.Add()
'Second row in the @My_Table table
userTable.Code = "A2"
userTable.Name = "A.2"
userTable.UserFields.Fields.Item("U_1stF").Value = "Second row value"
lRetCode = userTable.Add()
This way I have added 2 lines in my table.
Hope it helps
Trinidad. -
Loading data into multiple tables - Bulk collect or regular Fetch
I have a procedure to load data from one source table into eight different destination tables. The 8 tables have some of the columns of the source table with a common key.
I have run into a couple of problems and have a few questions where I would like to seek advice:
1.) Procedure with and without the BULK COLLECT clause took the same time for 100,000 records. I thought I would see improvement in performance when I include BULK COLLECT with LIMIT.
2.) Updating the Load_Flag in source_table happens only for few records and not all. I had expected all records to be updated
3.) Are there other suggestions to improve the performance? or could you provide links to other posts or articles on the web that will help me improve the code?
Notes:
1.) 8 Destination tables have at least 2 Million records each, have multiple indexes and are accessed by application in Production
2.) There is an initial load of 1 Million rows with a subsequent daily load of 10,000 rows. Daily load will have updates for existing rows (not shown in code structure below)
The structure of the procedure is as follows
Declare
TYPE dest_type IS TABLE OF source_table%ROWTYPE;
dest_tab dest_type;
iCount NUMBER := 0;
CURSOR source_cur IS SELECT * FROM source_table FOR UPDATE OF load_flag;
BEGIN
OPEN source_cur;
LOOP
FETCH source_cur -- BULK COLLECT
INTO dest_tab -- LIMIT 1000
EXIT WHEN source_cur%NOTFOUND;
FOR i in dest_tab.FIRST .. dest_tab.LAST LOOP
<Insert into app_tab1 values key, col12, col23, col34 ;>
<Insert into app_tab2 values key, col15, col29, col31 ;>
<Insert into app_tab3 values key, col52, col93, col56 ;>
UPDATE source_table SET load_flag = 'Y' WHERE CURRENT OF source_cur ;
iCount := iCount + 1 ;
IF iCount = 1000 THEN
COMMIT ;
iCount := 0 ;
END IF;
END LOOP;
END LOOP ;
COMMIT ;
END ;
Edited by: user11368240 on Jul 14, 2009 11:08 AM
Assuming you are on 10g or later, the PL/SQL compiler generates the bulk fetch for you automatically, so your code is the same as (untested):
DECLARE
iCount NUMBER := 0;
CURSOR source_cur is select * from source_table FOR UPDATE OF load_flag;
BEGIN
OPEN source_cur;
FOR r IN source_cur
LOOP
<Insert into app_tab1 values key, col12, col23, col34 ;>
<Insert into app_tab2 values key, col15, col29, col31 ;>
<Insert into app_tab3 values key, col52, col93, col56 ;>
UPDATE source_table SET load_flag = 'Y' WHERE CURRENT OF source_cur ;
iCount := iCount + 1 ;
IF iCount = 1000 THEN
COMMIT ;
iCount := 0 ;
END IF;
END LOOP;
COMMIT ;
END ;
However most of the benefit of bulk fetching would come from using the array with a FORALL expression, which the PL/SQL compiler can't automate for you.
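A sketch of that bulk-fetch-plus-FORALL combination, reusing the placeholder names from the post (key, col12, col15, load_flag are assumptions); referencing individual record fields inside FORALL requires 11g or later:

```sql
DECLARE
  CURSOR source_cur IS
    SELECT * FROM source_table WHERE load_flag = 'N';
  TYPE src_tab_t IS TABLE OF source_table%ROWTYPE;
  src_tab src_tab_t;
BEGIN
  OPEN source_cur;
  LOOP
    FETCH source_cur BULK COLLECT INTO src_tab LIMIT 1000;
    EXIT WHEN src_tab.COUNT = 0;
    -- One FORALL per destination table: each runs as a single
    -- bulk-bound INSERT instead of up to 1000 separate statements.
    FORALL i IN 1 .. src_tab.COUNT
      INSERT INTO app_tab1 (key, col12) VALUES (src_tab(i).key, src_tab(i).col12);
    FORALL i IN 1 .. src_tab.COUNT
      INSERT INTO app_tab2 (key, col15) VALUES (src_tab(i).key, src_tab(i).col15);
    -- Flag the batch as loaded, then commit once per batch.
    FORALL i IN 1 .. src_tab.COUNT
      UPDATE source_table SET load_flag = 'Y' WHERE key = src_tab(i).key;
    COMMIT;
  END LOOP;
  CLOSE source_cur;
END;
/
```

Note that the FOR UPDATE / WHERE CURRENT OF pairing is dropped: committing inside the loop would invalidate a FOR UPDATE cursor, so the update keys on the (assumed) primary key instead.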
If you are fetching 1000 rows at a time, purely from a code simplification point of view you could lose iCount and the IF...COMMIT...END IF and just commit each time after looping through the 1000-row array.
However I'm not sure how committing every 1000 rows helps restartability, even if your real code has a WHERE clause in the cursor so that it only selects rows with load_flag = 'N' or whatever. If you are worried that it will roll back all your hard work on failure, why not just commit in your exception handler? -
Loading data into multiple tables from an excel
Can we load data into multiple tables at a time from an Excel file through Utilities? If yes, how? Please help me.
Regards,
Pallavi
I would imagine that the utilities allow you to insert data from a spreadsheet into one and only one table.
You may have to write your own custom data upload using External Tables and a PL/SQL procedure to insert data from 1 spreadsheet into more than 1 table.
If you need any guidance on doing this let me know and I will happily point you in the right direction.
Regards
Duncan -
Loading data into multiple tables using sqlloader
Hi,
I am using SQL*Loader to load the data from a flat file into the database.
my file structure is as below
====================
101,john,mobile@@fax@@home@@office@@email,1234@@3425@@1232@@2345@@[email protected],1234.40
102,smith,mobile@@fax@@home,1234@@345@@234,123.40
103,adams,fax@@mobile@@office@@others,1234@@1233@@1234@@3456,2345.40
In the file, the columns are empno, ename, comm_mode (multiple values terminated by '@@'), comm_no_txt (multiple values terminated by '@@'), and sal.
The comm_mode and comm_no_text values need to be inserted into a separate table (emp_comm) like below:
emp
empno ename sal
101 john 1234.40
102 smith 123.40
103 adams 2345.40
emp_comm
empno comm_mode comm_no_text
101 mobile 1234
101 fax 3425
101 home 1232
101 office 2345
101 email [email protected]
102 mobile 1234
102 fax 345
102 home 234
103 fax 1234
This is how the data needs to be inserted using SQL*Loader.
my table structures
===============
emp
empno number(5)
ename varchar2(15)
sal number(10,2)
emp_comm
empno number(5) reference the empno of the emp table
comm_mode varchar2(10)
Comm_no_text varchar2(35)
Now I want to insert the file data into the specified structures.
Please help me achieve this using SQL*Loader.
(we are not using external tables for this)
Thanks & Regards.
Bala Sake
Edited by: 954925 on Aug 25, 2012 12:24 AM
Please post OS and database details.
You will need to split up the data file in order to load into separate tables. The process is documented here:
http://docs.oracle.com/cd/E11882_01/server.112/e22490/ldr_control_file.htm#autoId72
HTH
Srini -
Loading Data into existing Data Targets from 2 DataSources
Hello:
I want to report on COPA data. I am currently reporting on one operating concern, e.g. 1_CO_PA_0001. Now I want to report on the operating concern 1_CO_PA_0004. For this I created a new DataSource and replicated it into BW.
My question is: can I link the newly created COPA DataSource (1_CO_PA_0004) with the Data Targets (ODS or cube) of 1_CO_PA_0001 and load data? Will I get any errors? Or is this a big blunder I am making?
Can any one explain?
Regards
Lalitha
If you are using Calc Manager rules then have a look at the CalcMgrCmdLineLauncher.cmd utility - http://docs.oracle.com/cd/E17236_01/epm.1112/hp_admin/ch06s09s05.html
If it is EAS rules then you could use the CmdLnLauncher utility to automate running rules.
or you could create calc scripts and automate using Maxl.
Cheers
John
http://john-goodwin.blogspot.com/ -
Shell Script Programming -- Loading data into table
Hello Gurus
I am using Oracle's SQL*Loader utility to load data into a table. Lately, I ran into an unusual scenario where I need to process the data file before loading it into the table, and that is where I need help from you guys.
Consider the following data line
"Emp", DOB, Gender, Subject
"1",01/01/1980,"M","Physics:01/05/2010"
"2",01/01/1981,"M","Chemistry:02/05/2010|Maths:02/06/2011"
"3",01/01/1982,"M","Maths:03/05/2010|Physics:06/07/2010|Chemistry:08/09/2011"
"4",01/01/1983,"M","Biology:09/09/2010|English:10/10/2010"
Employee 1 will be loaded as a single record in the table. But I need to put the Subject value into two separate fields in the table, i.e. Physics into one column and the date 01/05/2010 into a separate column.
Here big problem starts
Employee - 2 Should get loaded as 2 records into the table. The first record should have Chemistry as subject and date as 02/05/2010 and the next record should have all other fields same except the subject should be Maths and date as 02/06/2011. The subjects are separated by a pipe "|" in the data file.
Similarly, Employee 3 should get loaded as 3 records. One as Maths, second as Physics and third as Chemistry along with their respective dates.
I hope I have made my problem clear to everyone.
I am looking to do something in shell scripting such that before finally running the sql*loader script, the above 4 employees have their records repeated as many times as their subject changes.
In summary 2 problems are described above.
1. To load subject and date into 2 separate fields in Oracle table at the time of load.
2. If there exist multiple subjects, then the record is to be loaded as many times as there are subject entries.
Any help would be much appreciated.
Thanks.
Here are some comments. Perl can be a little cryptic, but once you get used to it, it can be pretty powerful.
#!/usr/bin/perl -w
my $line_count = 0;
open FILE, "test_file" or die $!;
# Read each line from the file.
while (my $line = <FILE>) {
    # Print the header if it is the first line.
    if ($line_count == 0) {
        chomp($line);
        print $line . ", Date\n";
        ++$line_count;
        next;
    }
    # Get all the columns (as separated by ',') into an array.
    my @columns = split(',', $line);
    # Remove the newline from the fourth column.
    chomp($columns[3]);
    # Read the fields (separated by pipe) from the fourth column into an array.
    my @subject_and_date = split('\|', $columns[3]);
    # Loop for each subject and date.
    foreach my $sub_and_date (@subject_and_date) {
        # Print value of Emp, DOB, and Gender first.
        print $columns[0] . "," . $columns[1] . "," . $columns[2] . ",";
        # Remove all double quotes from the subject and date string.
        $sub_and_date =~ s/"//g;
        # Replace ':' with '","' so subject and date become two quoted fields.
        $sub_and_date =~ s/:/","/;
        print '"' . $sub_and_date . '"' . "\n";
    }
    ++$line_count;
}
close FILE; -
Sql* Loader syntax to load data into a partitioned table
Hi All,
I was trying to load data from a CSV file into a partitioned table.
The table name is countries_info
columns :
country_code ,
country_name,
country_language
The column country_code is list partitioned, with partition p1 for values 'EN', 'DE', 'IN', etc.,
and partition p2 for 'KR', 'AR', 'IT', etc.
I tried to load data into this table, but I was getting an error message (not mapping to the partition key).
I tried syntax
load data
infile 'countries.csv'
append
into table countries_info
partition(p1) ,
partition(p2)
fields terminated by ','
(
country_code,
country_name,
country_language
)
What is the correct syntax? I searched a lot but have not been able to find out.
Peeush_Rediff wrote:
Hi All,
I tried to load data into this table, but I was getting some error Message (not mapping to partitioned key).
It's not "some error message"; it's relevant information for resolving problems you encounter while using Oracle.
In your case, although you didn't specify the exact ORA error you received, it sounds like [ORA-14400|http://forums.oracle.com/forums/search.jspa?threadID=&q=ORA-14400&objID=f61&dateRange=all&userID=&numResults=15&rankBy=10001]
What is the correct syntax- I searched a lot but have not been able to find out.
It's not about correct syntax; it's about understanding the message that was trying to tell you that something went wrong.
That message was (I guess) ORA-14400.
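For what it's worth, if every row should simply land in whichever partition its COUNTRY_CODE maps to, the PARTITION clauses can be omitted entirely and Oracle will route each row itself; a sketch reusing the filenames and columns from the post (an ORA-14400 would then only occur for values no partition accepts):

```
load data
infile 'countries.csv'
append
into table countries_info
fields terminated by ','
(
  country_code,
  country_name,
  country_language
)
```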
So, referring to that ORA message, you will need to add a new partition where the data from your CSV file that currently doesn't map to any of the specified partitions would fit, or extend a partition definition so that it accepts the values for which you received that ORA message. -
Hi
I have written a SQL*Loader script for loading data into two tables.
The script is working, but the output is not coming out properly.
I want to load data into the first table for lines having 'R' as the first character.
Into the second table I have to load data related to that first line. In case
the first line's data is not proper (i.e. discarded), then the related second-table data
should not be loaded.
But I am getting both rows, though the first table's record is discarded. Please find
the output below.
Any other solution is also welcome... OK... external tables... UTL_FILE.
LOAD DATA
infile "inputFileForRmaReceiptAcknowledgement.dat"
BADFILE 'inputFileForRmaReceiptAcknowledgement.bad'
DISCARDFILE 'inputFileForRmaReceiptAcknowledgement.dsc'
APPEND
INTO TABLE XXGW_RMA_HEDR_RCPTACK_TAB
WHEN (01)='R'
( LINE_TYPE POSITION(1:1) "substr(:Line_Type, 1)",
RMA_ORDER_NO POSITION(2:16) CHAR,
ACKNOWLEDGEMENT_NO POSITION(17:31) CHAR,
ACKNOWLEDGEMENT_DATE POSITION(32:45)
"to_date(substr(:acknowledgement_date,3),'YYMMDDHH24MISS')",
DETAIL_LINE_COUNT POSITION(46:51) INTEGER EXTERNAL,
FLAG CHAR)
INTO TABLE XXGW_RMA_RCPT_ACKLDGMNT_TAB
WHEN (01) = 'D'
( LINE_TYPE POSITION(1:1) "substr(:Line_Type, 1)",
RMA_ORDER_NO POSITION(2:16) CHAR,
RMA_ORDER_LINE POSITION(17:19) INTEGER EXTERNAL,
SERIAL_NUMBER POSITION(20:49) CHAR,
SKU POSITION(50:63) CHAR,
QUANTITY POSITION(64:69) INTEGER EXTERNAL,
WAREHOUSE_CODE POSITION(70:71) CHAR,
WAYBILL_NUMBER POSITION(72:121) CHAR,
COURIER POSITION(122:146) CHAR,
RETURN_DEALER_FLAG POSITION(147:156) CHAR)
inputFileForRmaReceiptAcknowledgement.dat
R12345678901 2345456789123200 21111228241113000002 --- discarded record
D12345678901 00159123687402 45678925803 00000102name
D12345678901 00159143687402 45678925603 00000102name
T000004
Regards
Ar
Please post details of OS and database versions.
Create a foreign key constraint between the detail table and the master table. If a row fails to load into the master table, then the corresponding detail rows will fail the foreign key constraint and will not load.
http://docs.oracle.com/cd/E11882_01/server.112/e25789/datainte.htm#CNCPT1649
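A sketch of that constraint, assuming the header table has (or can be given) a primary key on RMA_ORDER_NO:

```sql
-- Assumed: RMA_ORDER_NO uniquely identifies a header row.
ALTER TABLE XXGW_RMA_HEDR_RCPTACK_TAB
  ADD CONSTRAINT xxgw_rma_hedr_pk PRIMARY KEY (RMA_ORDER_NO);

-- Detail rows then load only when the matching header row loaded.
ALTER TABLE XXGW_RMA_RCPT_ACKLDGMNT_TAB
  ADD CONSTRAINT xxgw_rma_dtl_fk FOREIGN KEY (RMA_ORDER_NO)
  REFERENCES XXGW_RMA_HEDR_RCPTACK_TAB (RMA_ORDER_NO);
```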
HTH
Srini -
FM to upload data into HRP Tables
Hello Guys,
Do we have any FM to load data into HRP tables? I would like to update 1000, 1001, 1002, 1021, 1024, 1025, 1028, 1029, 1032, 1034, 1036, 1063, 5006, 5007, and 5008.
Please let me know if there is any FM to upload, or any other process to upload data.
Thanks,
Ravi
Hi,
Here is a useful forum link to understand and implement the FMs:
Updating HRP tables through abap code- is it correct?
If this helps, please do reward.
Thanks
Narasimha -
Procedure for loading data into gl_interface
hi all
I am new to Oracle Applications... I just want to know how to load data into the gl_interface table using PL/SQL (that is, first loading data into a temporary table and then using a PL/SQL procedure to load it into gl_interface). Can anybody help me out with this by providing the PL/SQL structure for it?
thanks in advance
Assuming you have the data in a datafile and the file is comma delimited. I assume the table has two columns; you can add more columns also.
CREATE OR REPLACE PROCEDURE p10
IS
  lv_filehandle UTL_FILE.FILE_TYPE;
  lv_newline    VARCHAR2(2000); -- Input line
  lv_file_dir   VARCHAR2(100);
  lv_file_name  VARCHAR2(100);
  lv_col1       VARCHAR2(10);
  lv_col2       VARCHAR2(50);
BEGIN
  DELETE FROM temp_table;
  lv_file_dir  := 'p:\temp';
  lv_file_name := 'test.dat';
  -- Open the file for reading and load each comma-delimited line.
  lv_filehandle := UTL_FILE.FOPEN (lv_file_dir, lv_file_name, 'r', 32766);
  LOOP
    BEGIN
      UTL_FILE.GET_LINE (lv_filehandle, lv_newline);
    EXCEPTION
      WHEN NO_DATA_FOUND THEN
        EXIT; -- end of file
    END;
    lv_col1 := SUBSTR(lv_newline, 1, INSTR(lv_newline, ',') - 1);
    lv_col2 := SUBSTR(lv_newline, INSTR(lv_newline, ',') + 1);
    INSERT INTO temp_table VALUES (lv_col1, lv_col2);
  END LOOP;
  COMMIT;
  INSERT INTO your_production_table SELECT * FROM temp_table;
  COMMIT;
  UTL_FILE.FCLOSE (lv_filehandle);
EXCEPTION
  WHEN UTL_FILE.INVALID_PATH THEN
    RAISE_APPLICATION_ERROR(-20100, 'Invalid Path');
  WHEN UTL_FILE.INVALID_MODE THEN
    RAISE_APPLICATION_ERROR(-20101, 'Invalid Mode');
  WHEN UTL_FILE.INVALID_OPERATION THEN
    RAISE_APPLICATION_ERROR(-20102, 'Invalid Operation');
  WHEN UTL_FILE.INVALID_FILEHANDLE THEN
    RAISE_APPLICATION_ERROR(-20103, 'Invalid Filehandle');
  WHEN UTL_FILE.READ_ERROR THEN
    RAISE_APPLICATION_ERROR(-20105, 'Read Error');
  WHEN OTHERS THEN
    UTL_FILE.FCLOSE(lv_filehandle);
    RAISE;
END p10;
/
Code is not tested.
Hope this helps
Ghulam -
OWB11gR2 - simple and easy way to load XML formatted data into db tables?
Hi,
we're currently trying to load table data stored in XML files into our data warehouse using OWB 11gR2.
However, we're finding this is not quite as trivial as loading flat files...
Most postings on this forum point to the blog entry titled "Leveraging XDB", found here (http://blogs.oracle.com/warehousebuilder/2007/09/leveraging_xdb.html).
This blog also references the zip file owb_xml_etl_utils.zip, which seems to have disappeared from its original location and can now be found on SourceForge.
Anyway, the solution described is for OWB 10g, and when trying to import the experts from the zip file etc., we end up not being able to run the "Create ETL from XSD" expert, as the 11gR2 client is different from the 10g client and does not have the Experts menu et al.
Also, this solution was published over 3 years ago, and it seems rather strange that importing XML-formatted data should still be so cumbersome in the newer warehouse builder releases.
The OWB 11gR2 documentation is very sparse (or rather, quite empty) on how to load XML data; all it has is a few lines on "XML Transformations", giving no clue as to how one goes about loading data.
Is this really the state of things? Or are we missing some vital information here?
We'd have thought that with 11g-releases, loading XML-data would be rather simple, quick and painless?
Is there somewhere besides the blog mentioned above where we can find simple and to the point guidelines for OWB 11gR2 on how to load XML-formatted data into Oracle tables?
Regards,
-Haakon-
Yes, it is possible to use SQL*Loader to parse and load XML, but that is not what it was designed for, and so it is not recommended. You also don't need to register a schema just to load/store/parse XML in the DB.
So where does that leave you?
Some options
{thread:id=410714} (see page 2)
{thread:id=1090681}
{thread:id=1070213}
Those talk some about storage options and reading in XML from disk and parsing XML. They should also give you options to consider. Without knowing more about your requirements for the effort, it is difficult to give specific advice. Maybe your 7-8 tables don't exist and so using Object Relational Storage for the XML would be the best solution as you can query/update tables that Oracle creates based off the schema associated to the XML. Maybe an External Table definition works better for reading the XML into the system because this process will happen just once. Maybe using WebDAV makes more sense for loading XML to be parsed (I don't have much experience with this, just know it is possible from what I've read on the forums). Also, your version makes a difference as you have different options available depending upon the version of Oracle.
Hope all that helps as a starter.
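As a concrete starting point for the shredding side of this, a minimal XMLTABLE query (available since 10gR2; the element names here are invented for illustration) turns XML into relational rows that can then be inserted into your 7-8 tables:

```sql
-- Shred a small inline XML document into relational rows.
SELECT x.item_id, x.item_name
FROM XMLTABLE(
       '/items/item'
       PASSING XMLTYPE('<items>
                          <item><id>1</id><name>first</name></item>
                          <item><id>2</id><name>second</name></item>
                        </items>')
       COLUMNS
         item_id   NUMBER       PATH 'id',
         item_name VARCHAR2(30) PATH 'name'
     ) x;
```

The same pattern works with XML read from disk instead of an inline literal.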
Edited by: A_Non on Jul 8, 2010 4:31 PM
For a great example, see the answers by mdrake in {thread:id=1096784} -
Error loading data from bw table into BW system
Dear all,
I have created and filled Z tables in a BW system, and am trying to load data into the same BW system.
What I am facing is an error message saying "Error in module RSQL of the database",
and when I check ST22 for the dump analysis, it says:
Runtime Errors DBIF_RSQL_INVALID_RSQL
Exception CX_SY_OPEN_SQL_DB
How come?
Thanks
Can you try again by activating the generic DataSource in RSO2? It should work.
And also make sure the tables are active.
Ravi