Sql loader loading date
hi all,
How is it possible to insert a date into a table through SQL*Loader?
Likewise if i have table A
(Name varchar2(23),Age int,Dateofbirth date)
when I run SQL*Loader it throws an ORA-01858 error on Dateofbirth.
if i want to insert
NAME=Neerav,
AGE=23,
DATEOFBIRTH=14-08-1985 in table A
How can we insert a date value through SQL*Loader?
Version:Oracle10g
Data file FORMAT:TEST.csv
Thanks in advance,
Neerav
Edited by: Neerav999 on Aug 12, 2009 10:31 PM
Sorry, but I am not your mother & refuse to spoon feed you.
sqlldr needs to be informed exactly how the STRINGS in the csv file are formatted to represent DATE.
Your data does not match the default DATE format.
The answer can be found if/when you are willing to Read The Fine Manual in URL previously posted.
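For anyone landing here later, the usual fix is to spell out the date mask in the control file. A minimal sketch for the data above (the control-file layout and field order are assumptions):

```sql
-- test.ctl - give SQL*Loader an explicit mask matching 14-08-1985
LOAD DATA
INFILE 'TEST.csv'
INTO TABLE A
FIELDS TERMINATED BY ','
(Name,
 Age,
 Dateofbirth DATE "DD-MM-YYYY")
```

Run it with something like `sqlldr scott/tiger control=test.ctl`; without the `DATE "DD-MM-YYYY"` mask, sqlldr falls back to the session's default NLS date format, which is what raises ORA-01858.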
Similar Messages
-
How can I load my data faster? Is there a SQL solution instead of PL/SQL?
11.2.0.2
Solaris 10 sparc
I need to backfill invoices from a customer. The raw data has 3.1 million records. I have used PL/SQL to load these invoices into our system (dev); however, our issue is the amount of time it's taking to run the load, effectively approx 4 hours. (Raw data has been loaded into a staging table.)
My research keeps coming back to one concept: sql is faster than pl/sql. Where I'm stuck is the need to programmatically load the data. The invoice table has a sequence on it (primary key = invoice_id)...the invoice_header and invoice_address tables use the invoice_id as a foreign key. So my script takes advantage of knowing the primary key and uses that on the subsequent inserts to the subordinate invoice_header and invoice_address tables, respectively.
My script is below. What I'm asking is if there are other ideas on the quickest way to load this data...what am I not considering? I have to load the data in dev, qa, then production so the sequences and such change between the environments. I've dummied down the code to protect the customer; syntax and correctness of the code posted here (on the forum) is moot...it's only posted to give the framework for what I currently have.
Any advice would be greatly appreciated; how can I load the data faster knowing that I need to know sequence values for inserts into other tables?
DECLARE
v_inv_id invoice.invoice_id%TYPE;
v_inv_addr_id invoice_address.invoice_address_id%TYPE;
errString invoice_errors.sqlerrmsg%TYPE;
v_guid VARCHAR2 (128);
v_str VARCHAR2 (256);
v_err_loc NUMBER;
v_count NUMBER := 0;
l_start_time NUMBER;
TYPE rec IS RECORD
(
BILLING_TYPE VARCHAR2 (256),
CURRENCY VARCHAR2 (256),
BILLING_DOCUMENT VARCHAR2 (256),
DROP_SHIP_IND VARCHAR2 (256),
TO_PO_NUMBER VARCHAR2 (256),
TO_PURCHASE_ORDER VARCHAR2 (256),
DUE_DATE DATE,
BILL_DATE DATE,
TAX_AMT VARCHAR2 (256),
PAYER_CUSTOMER VARCHAR2 (256),
TO_ACCT_NO VARCHAR2 (256),
BILL_TO_ACCT_NO VARCHAR2 (256),
NET_AMOUNT VARCHAR2 (256),
NET_AMOUNT_CURRENCY VARCHAR2 (256),
ORDER_DT DATE,
TO_CUSTOMER VARCHAR2 (256),
TO_NAME VARCHAR2 (256),
FRANCHISES VARCHAR2 (4000),
UPDT_DT DATE
);
TYPE tab IS TABLE OF rec
INDEX BY BINARY_INTEGER;
pltab tab;
CURSOR c
IS
SELECT billing_type,
currency,
billing_document,
drop_ship_ind,
to_po_number,
to_purchase_order,
due_date,
bill_date,
tax_amt,
payer_customer,
to_acct_no,
bill_to_acct_no,
net_amount,
net_amount_currency,
order_dt,
to_customer,
to_name,
franchises,
updt_dt
FROM BACKFILL_INVOICES;
BEGIN
l_start_time := DBMS_UTILITY.get_time;
OPEN c;
LOOP
FETCH c
BULK COLLECT INTO pltab
LIMIT 1000;
v_err_loc := 1;
FOR i IN 1 .. pltab.COUNT
LOOP
BEGIN
v_inv_id := SEQ_INVOICE_ID.NEXTVAL;
v_guid := 'import' || TO_CHAR (CURRENT_TIMESTAMP, 'hhmissff');
v_str := str_parser (pltab (i).FRANCHISES); --function to string parse - this could be done in advance, yes.
v_err_loc := 2;
v_count := v_count + 1;
INSERT INTO invoice nologging
VALUES (v_inv_id,
pltab (i).BILL_DATE,
v_guid,
'111111',
'NONE',
TO_TIMESTAMP (pltab (i).BILL_DATE),
TO_TIMESTAMP (pltab (i).UPDT_DT),
'READ',
'PAPER',
pltab (i).payer_customer,
v_str,
'111111');
v_err_loc := 3;
INSERT INTO invoice_header nologging
VALUES (v_inv_id,
TRIM (LEADING 0 FROM pltab (i).billing_document), --invoice_num
NULL,
pltab (i).BILL_DATE, --invoice_date
pltab (i).TO_PO_NUMBER,
NULL,
pltab (i).net_amount,
NULL,
pltab (i).tax_amt,
NULL,
NULL,
pltab (i).due_date,
NULL,
NULL,
NULL,
NULL,
NULL,
TO_TIMESTAMP (SYSDATE),
TO_TIMESTAMP (SYSDATE),
PLTAB (I).NET_AMOUNT_CURRENCY,
(SELECT i.bc_value
FROM invsvc_owner.billing_codes i
WHERE i.bc_name = PLTAB (I).BILLING_TYPE),
PLTAB (I).BILL_DATE);
v_err_loc := 4;
INSERT INTO invoice_address nologging
VALUES (invsvc_owner.SEQ_INVOICE_ADDRESS_ID.NEXTVAL,
v_inv_id,
'BLAH INITIAL',
pltab (i).BILL_DATE,
NULL,
pltab (i).to_acct_no,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
SYSTIMESTAMP,
NULL);
v_err_loc := 5;
INSERT INTO invoice_address nologging
VALUES ( SEQ_INVOICE_ADDRESS_ID.NEXTVAL,
v_inv_id,
'BLAH',
pltab (i).BILL_DATE,
NULL,
pltab (i).TO_ACCT_NO,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
SYSTIMESTAMP,
NULL);
v_err_loc := 6;
INSERT INTO invoice_address nologging
VALUES ( SEQ_INVOICE_ADDRESS_ID.NEXTVAL,
v_inv_id,
'BLAH2',
pltab (i).BILL_DATE,
NULL,
pltab (i).TO_CUSTOMER,
pltab (i).to_name,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
SYSTIMESTAMP,
NULL);
v_err_loc := 7;
INSERT INTO invoice_address nologging
VALUES ( SEQ_INVOICE_ADDRESS_ID.NEXTVAL,
v_inv_id,
'BLAH3',
pltab (i).BILL_DATE,
NULL,
'SOME PROPRIETARY DATA',
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
SYSTIMESTAMP,
NULL);
v_err_loc := 8;
INSERT
INTO invoice_event nologging (id,
eid,
root_eid,
invoice_number,
event_type,
event_email_address,
event_ts)
VALUES ( SEQ_INVOICE_EVENT_ID.NEXTVAL,
'111111',
'222222',
TRIM (LEADING 0 FROM pltab (i).billing_document),
'READ',
'some_user@some_company.com',
SYSTIMESTAMP);
v_err_loc := 9;
INSERT INTO backfill_invoice_mapping
VALUES (v_inv_id,
v_guid,
pltab (i).billing_document,
pltab (i).payer_customer,
pltab (i).net_amount);
IF v_count = 10000
THEN
COMMIT;
END IF;
EXCEPTION
WHEN OTHERS
THEN
errString := SQLERRM;
INSERT INTO backfill_invoice_errors
VALUES (
pltab (i).billing_document,
pltab (i).payer_customer,
errString || ' ' || v_err_loc);
COMMIT;
END;
END LOOP;
v_err_loc := 10;
INSERT INTO backfill_invoice_timing
VALUES (
ROUND ( (DBMS_UTILITY.get_time - l_start_time) / 100,
2)
|| ' seconds.',
(SELECT COUNT (1)
FROM backfill_invoice_mapping),
(SELECT COUNT (1)
FROM backfill_invoice_errors),
SYSDATE);
COMMIT;
EXIT WHEN c%NOTFOUND;
END LOOP;
COMMIT;
EXCEPTION
WHEN OTHERS
THEN
errString := SQLERRM;
INSERT INTO backfill_invoice_errors
VALUES (NULL, NULL, errString || ' ' || v_err_loc);
COMMIT;
END;
Hello
You could use insert all in your case and make use of sequence.NEXTVAL and sequence.CURRVAL like so (excuse any typos - I can't test without table definitions). I've done the first 2 tables, so it's just a matter of adding the rest in...
INSERT ALL
INTO invoice
VALUES ( SEQ_INVOICE_ID.NEXTVAL,
BILL_DATE,
my_guid,
'111111',
'NONE',
CAST(BILL_DATE AS TIMESTAMP),
CAST(UPDT_DT AS TIMESTAMP),
'READ',
'PAPER',
payer_customer,
parsed_franchises,
'111111')
INTO invoice_header
VALUES ( SEQ_INVOICE_ID.CURRVAL,
TRIM (LEADING 0 FROM billing_document), --invoice_num
NULL,
BILL_DATE, --invoice_date
TO_PO_NUMBER,
NULL,
net_amount,
NULL,
tax_amt,
NULL,
NULL,
due_date,
NULL,
NULL,
NULL,
NULL,
NULL,
SYSTIMESTAMP,
SYSTIMESTAMP,
NET_AMOUNT_CURRENCY,
bc_value,
BILL_DATE)
SELECT
src.billing_type,
src.currency,
src.billing_document,
src.drop_ship_ind,
src.to_po_number,
src.to_purchase_order,
src.due_date,
src.bill_date,
src.tax_amt,
src.payer_customer,
src.to_acct_no,
src.bill_to_acct_no,
src.net_amount,
src.net_amount_currency,
src.order_dt,
src.to_customer,
src.to_name,
src.franchises,
src.updt_dt,
str_parser (src.FRANCHISES) parsed_franchises,
'import' || TO_CHAR (CURRENT_TIMESTAMP, 'hhmissff') my_guid,
i.bc_value
FROM BACKFILL_INVOICES src,
invsvc_owner.billing_codes i
WHERE i.bc_name = src.BILLING_TYPE;
Some things to note:
1. Don't commit in a loop - you only add to the run time and load on the box ultimately reducing scalability and removing transactional integrity. Commit once at the end of the job.
2. Make sure you specify the list of columns you are inserting into as well as the values or columns you are selecting. This is good practice as it protects your code from compilation issues in the event of new columns being added to tables. Also it makes it very clear what you are inserting where.
3. If you use WHEN OTHERS THEN... to log something, make sure you either rollback or raise the exception. What you have done in your code is say - I don't care what the problem is, just commit whatever has been done. This is not good practice.
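A minimal sketch of point 2 above (the column names here are hypothetical, not the real invoice columns):

```sql
-- Explicitly listing the target columns means this insert still compiles
-- if new nullable columns are later added to the table (names assumed).
INSERT INTO invoice (invoice_id, bill_date, guid)
VALUES (SEQ_INVOICE_ID.NEXTVAL, SYSDATE,
        'import' || TO_CHAR(CURRENT_TIMESTAMP, 'hhmissff'));
```

It also makes the mapping of values to columns obvious to the next person reading the code.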
HTH
David
Edited by: Bravid on Oct 13, 2011 4:35 PM -
SQL Interface - Error in Loading the data from SQL data source
Hello,
We have been using a SQL data source for loading the dimensions and the data for many years. Even on Essbase 11.1.1.0 it's been quite a while (more than one year). For the past few days, we are getting the below error when trying to load the data.
[Mon Jan 10 11:02:56 2011]Local/{App Name}/{DB Name}/{User Id}/Info(1021013)
ODBC Layer Error: [S1000] ==> [[DataDirect][ODBC DB2 Wire Protocol driver][UDB DB2 for Windows, UNIX, and Linux]CURSOR IDENTIFIED IN FETCH OR CLOSE STATEMENT
IS NOT OPEN (DIAG INFO: ).]
[Mon Jan 10 11:02:56 2011]Local/{App Name}/{DB Name}/{User Id}/Info(1021014)
ODBC Layer Error: Native Error code [4294966795]
[Mon Jan 10 11:02:56 2011]Local/{App Name}/{DB Name}/{User Id}/Error(1021001)
Failed to Establish Connection With SQL Database Server. See log for more information
[Mon Jan 10 11:02:56 2011]Local/{App Name}/{DB Name}/{User Id}/Error(1003050)
Data Load Transaction Aborted With Error [7]
[Mon Jan 10 11:02:56 2011]Local/{App Name}///Info(1013214)
Clear Active on User [Olapadm] Instance [1]
Interestingly, after the job fails thru our batch scheduler environment, when I run the same script that's being used in the batch scheduler, the job completes successfully.
Also, this is first time, I saw this kind of error message.
Appreciate any help or any suggestions to find a resolution. Thanks,
Hi Priya,
The reasons may be the file is open, the format/flatfile structure is not correct, the mapping/transfer structure may not be correct, presence of invalid characters/data inconsistency in the file, etc.
Check if the flatfile in .CSV format.
You have to save it in .CSV format for the flatfile loading to work.
Also check the connection issues between source system and BW or sometimes may be due to inactive update rules.
Refer
error 1
Find out the actual reason and let us know.
Hope this helps.
Regards,
Raghu. -
Loading the data from a packed decimal format file using a sql*loader.
Hi ,
In one of the projects I'm working on here, I have to load data into an Oracle table from a file using SQL*Loader, but the problem is that the data file is in packed-decimal format, so please let me know if there is any way to do this. I searched a lot regarding this. If anybody has faced this type of problem, please let me know the steps to solve it.
Thanks in advance ,
Narasingarao.
declare
f utl_file.file_type;
s1 varchar2(200);
s2 varchar2(200);
s3 varchar2(200);
c number := 0;
begin
f := utl_file.fopen('TRY','sample1.txt','R');
utl_file.get_line(f,s1);
utl_file.get_line(f,s2);
utl_file.get_line(f,s3);
insert into sampletable (a,b,c) values (s1,s2,s3);
c := c + 1;
utl_file.fclose(f);
exception
when NO_DATA_FOUND then
if utl_file.is_open(f) then utl_file.fclose(f); end if;
dbms_output.put_line('No. of rows inserted : ' || c);
end;
SY. -
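On the packed-decimal question itself: SQL*Loader can read COBOL COMP-3 fields natively with its DECIMAL datatype, so no pre-conversion is needed. A rough sketch of a control file (the file name, record length, field positions, and precision/scale are all assumptions for illustration):

```sql
-- packed.ctl - reading packed-decimal (COMP-3) fields directly
LOAD DATA
INFILE 'legacy.dat' "fix 20"          -- fixed-length binary records (length assumed)
INTO TABLE target_tab
(id     POSITION(1:4) DECIMAL(7,0),   -- 7 packed digits in 4 bytes
 amount POSITION(5:9) DECIMAL(9,2))   -- 9 digits, 2 implied decimal places, 5 bytes
```

Because the records are binary, the INFILE clause needs the fixed-record-length syntax; a stream-format CSV cannot carry packed decimal.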
Loading the data from a text file to a table using pl/sql
Hi Experts,
I want to load the data from a text (sample1.txt) file to a table using pl/sql
I have used the below pl/sql code
declare
f utl_file.file_type;
s varchar2(200);
c number := 0;
begin
f := utl_file.fopen('TRY','sample1.txt','R');
loop
utl_file.get_line(f,s);
insert into sampletable (a,b,c) values (s,s,s);
c := c + 1;
end loop;
exception
when NO_DATA_FOUND then
utl_file.fclose(f);
dbms_output.put_line('No. of rows inserted : ' || c);
end;
and my sample1.txt file looks like
1
2
3
The data is getting inserted in the below manner:
select * from sampletable;
A B C
1 1 1
2 2 2
3 3 3
I want the data to get inserted as
A B C
1 2 3
The text file that I have is having three lines, and each line's first value should go to each column
Please help...
Thanks
declare
f utl_file.file_type;
s1 varchar2(200);
s2 varchar2(200);
s3 varchar2(200);
c number := 0;
begin
f := utl_file.fopen('TRY','sample1.txt','R');
utl_file.get_line(f,s1);
utl_file.get_line(f,s2);
utl_file.get_line(f,s3);
insert into sampletable (a,b,c) values (s1,s2,s3);
c := c + 1;
utl_file.fclose(f);
exception
when NO_DATA_FOUND then
if utl_file.is_open(f) then utl_file.fclose(f); end if;
dbms_output.put_line('No. of rows inserted : ' || c);
end;
SY. -
How to Load Arabic Data from flat file using SQL Loader ?
Hi All,
We need to load Arabic data from an xls file to Oracle database, Request you to provide a very good note/step to achieve the same.
Below are the database parameters used
NLS_CHARACTERSET AR8ISO8859P6
nls_language american
DB version:-10g release 2
OS: rhel 5
Thanks in advance,
Satish
Try to save your XLS file into CSV format and set either NLS_LANG to the right value or use the SQL*Loader control file parameter CHARACTERSET.
See http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_control_file.htm#i1005287 -
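As a concrete sketch of the CHARACTERSET approach (the file, table, and column names are assumptions):

```sql
-- arabic.ctl - declare the character set the CSV was actually saved in,
-- so SQL*Loader converts it to the database's AR8ISO8859P6 correctly
LOAD DATA
CHARACTERSET AR8MSWIN1256   -- assumed: the encoding the spreadsheet exported
INFILE 'arabic_data.csv'
INTO TABLE emp_ar
FIELDS TERMINATED BY ','
(emp_id, emp_name CHAR(100))
```

The key point is that CHARACTERSET describes the datafile, not the database; SQL*Loader handles the conversion between the two.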
Hello,
EBS version : 11.5.10.2
DB version : 11.2.0.3
OS version : AIX 6.1
As a part of 11.5.10.2 to R12.1.1 upgrade, while applying merged 12.1.1 upgrade driver(u6678700.drv), we got below error :
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
ATTENTION: All workers either have failed or are waiting:
FAILED: file glsupdas.ldt on worker 3.
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
drix10:/fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log>tail -20 adwork003.log
Restarting job that failed and was fixed.
Time when worker restarted job: Wed Aug 07 2013 10:36:14
Loading data using FNDLOAD function.
FNDLOAD APPS/***** 0 Y UPLOAD @SQLGL:patch/115/import/glnlsdas.lct @SQLGL:patch/115/import/US/glsupdas.ldt -
Connecting to APPS......Connected successfully.
Calling FNDLOAD function.
Returned from FNDLOAD function.
Log file: /fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log/US_glsupdas_ldt.log
Error calling FNDLOAD function.
Time when worker failed: Wed Aug 07 2013 10:36:14
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
drix10:/fmstop/r12apps/apps/apps_st/appl/admin/FMSTEST/log>tail -20 US_glsupdas_ldt.log
Current system time is Wed Aug 7 10:36:14 2013
Uploading from the data file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt
Altering database NLS_LANGUAGE environment to AMERICAN
Dumping from LCT/LDT files (/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/glnlsdas.lct(120.0), /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt) to staging tables
Dumping LCT file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/glnlsdas.lct(120.0) into FND_SEED_STAGE_CONFIG
Dumping LDT file /fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/import/US/glsupdas.ldt into FND_SEED_STAGE_ENTITY
Dumped the batch (GL_DEFAS_ACCESS_SETS SUPER_USER_DEFAS , GL_DEFAS_ACCESS_SETS SUPER_USER_DEFAS ) into FND_SEED_STAGE_ENTITY
Uploading from staging tables
Error loading seed data for GL_DEFAS_ACCESS_SETS: DEFINITION_ACCESS_SET = SUPER_USER_DEFAS, ORA-06508: PL/SQL: could not find program unit being called
Concurrent request completed
Current system time is Wed Aug 7 10:36:14 2013
++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
Below is info about file versions and INVALID packages related to GL.
PACKAGE BODY GL_DEFAS_ACCESS_SETS_PKG is invalid with error component 'GL_DEFAS_DBNAME_S' must be declared.
I can see GL_DEFAS_DBNAME_S is a VALID sequence accessible by apps user with or without specifying GL as owner.
SQL> select text from dba_source where name in ('GL_DEFAS_ACCESS_DETAILS_PKG','GL_DEFAS_ACCESS_SETS_PKG') and line=2;
TEXT
/* $Header: glistdds.pls 120.4 2005/05/05 01:23:16 kvora ship $ */
/* $Header: glistddb.pls 120.16 2006/04/10 21:28:48 cma ship $ */
/* $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ */
/* $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ */
SQL> select * from all_objects where object_name in ('GL_DEFAS_ACCESS_DETAILS_PKG','GL_DEFAS_ACCESS_SETS_PKG')
2 ; OWNER OBJECT_NAME SUBOBJECT_NAM OBJECT_ID DATA_OBJECT_ID OBJECT_TYPE CREATED LAST_DDL_TIME TIMESTAMP STATUS T G S NAMESPACE
EDITION_NAME
APPS GL_DEFAS_ACCESS_DETAILS_PKG 1118545 PACKAGE 05-AUG-13 05-AUG-13 2013-08-05:18:54:51 VALID N N N 1
APPS GL_DEFAS_ACCESS_SETS_PKG 1118548 PACKAGE 05-AUG-13 06-AUG-13 2013-08-05:18:54:51 VALID N N N 1
APPS GL_DEFAS_ACCESS_SETS_PKG 1128507 PACKAGE BODY 05-AUG-13 06-AUG-13 2013-08-06:12:56:50 INVALID N N N 2
APPS GL_DEFAS_ACCESS_DETAILS_PKG 1128508 PACKAGE BODY 05-AUG-13 05-AUG-13 2013-08-05:19:43:51 VALID N N N 2
SQL> select * from all_objects where object_name='GL_DEFAS_DBNAME_S'; OWNER OBJECT_NAME SUBOBJECT_NAM OBJECT_ID DATA_OBJECT_ID OBJECT_TYPE CREATED LAST_DDL_TIME TIMESTAMP STATUS T G S NAMESPACE
EDITION_NAME
GL GL_DEFAS_DBNAME_S 1087285 SEQUENCE 05-AUG-13 05-AUG-13 2013-08-05:17:34:43 VALID N N N 1
APPS GL_DEFAS_DBNAME_S 1087299 SYNONYM 05-AUG-13 05-AUG-13 2013-08-05:17:34:43 VALID N N N 1
SQL> conn apps/apps
Connected.
SQL> SELECT OWNER, OBJECT_NAME, OBJECT_TYPE, STATUS
FROM DBA_OBJECTS
WHERE OBJECT_NAME = 'GL_DEFAS_ACCESS_SETS_PKG'; 2 3 OWNER OBJECT_NAME OBJECT_TYPE STATUS
APPS GL_DEFAS_ACCESS_SETS_PKG PACKAGE VALID
APPS GL_DEFAS_ACCESS_SETS_PKG PACKAGE BODY INVALID SQL> ALTER PACKAGE GL_DEFAS_ACCESS_SETS_PKG COMPILE; Warning: Package altered with compilation errors. SQL> show error
No errors.
SQL> ALTER PACKAGE GL_DEFAS_ACCESS_SETS_PKG COMPILE BODY; Warning: Package Body altered with compilation errors. SQL> show error
Errors for PACKAGE BODY GL_DEFAS_ACCESS_SETS_PKG: LINE/COL ERROR
39/17 PLS-00302: component 'GL_DEFAS_DBNAME_S' must be declared
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>cat $GL_TOP/patch/115/sql/glistdab.pls|grep -n GL_DEFAS_DBNAME_S
68: SELECT GL.GL_DEFAS_DBNAME_S.NEXTVAL
81: fnd_message.set_token('SEQUENCE', 'GL_DEFAS_DBNAME_S');
SQL> show user
USER is "APPS"
SQL> SELECT GL.GL_DEFAS_DBNAME_S.NEXTVAL
FROM dual; 2 -- with GL.
NEXTVAL
1002
SQL> SELECT GL_DEFAS_DBNAME_S.NEXTVAL from dual; --without GL. or using synonym.
NEXTVAL
1003
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>strings -a $GL_TOP/patch/115/sql/glistdab.pls|grep '$Header'
REM | $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ |
/* $Header: glistdab.pls 120.5 2006/03/13 19:56:21 cma ship $ */
drix10:/fmstop/r12apps/apps/apps_st/appl/gl/12.0.0/patch/115/odf>strings -a $GL_TOP/patch/115/sql/glistdas.pls |grep '$Header'
REM | $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ |
/* $Header: glistdas.pls 120.4 2005/05/05 01:23:02 kvora ship $ */ -
Does SSIS guarantee that it loads the data into SQL Server in the same order as it is in Excel
Hi,
We are trying to load several Excel files into SQL Server SSIS and we need to save the data in the database in the same order as it is in Excel..
My question is, Does SSIS guarantee that it loads the data into SQL Server in the same order as it is in Excel. If so, we will add a sequence to ensure we have the order.
Please advise.
Thanks & Regards,
Dhanumjay
Thanks for your response.
If it is one file then we can add an index column, but we have to load hundreds of files.
How would adding an index/key column to the table work, unless SSIS picks up and loads the data into the table in the same order as it is in Excel?
Just to explain my question better,
If excel has three rows {a,b},{c,d},{e,f}, does SSIS ensure it loads the data in the same order in a table.
Thanks. -
Error in SQL Statement (ODS data loading Error)
Hello Gurus...
I am trying to load the data from the InfoSource to the ODS. For my ODS key figure Z_IVQUA (Invoice Quantity)
I am getting the error message: data shows 0 from 0 records.
I removed the key field from the ODS and then tried to load the data, but I am getting the same error:
Error in SQL Statement: SAPSQL_INVALID_FIELDNAME F~/BIC/Z_INVQUA
Errors in source system
Please Suggest....
Thanks
Prakash
Hello All...
I'd really appreciate it if you could give me some advice. Thanks.
Prakash
Edited by: Prakash M on Jan 14, 2009 2:14 PM -
Sql loader loading data from legacy to Oracle
Hi
we have a requirement to import data from a legacy system into Oracle AR. We know that we need to put the data file in the BIN folder, but regarding the data file source: if we place it in a Windows folder instead of the BIN directory, will SQL*Loader pick up the file and load the data?
Thanks
Y
Yes,
Refer this
http://www.oracle.com/technology/products/database/utilities/htdocs/sql_loader_overview.html
* Load data across a network. This means that a SQL*Loader client can be run on a different system from the one that is running the SQL*Loader server.
* Load data from multiple datafiles during the same load session
* Load data into multiple tables during the same load session
* Specify the character set of the data
* Selectively load data
* Load data from disk, tape, or named pipe
* Generate sophisticated error reports, which greatly aid troubleshooting
* Load arbitrarily complex object-relational data
* Use either conventional or direct path loading.
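The "load data into multiple tables during the same load session" point can be sketched with WHEN clauses in one control file (all table, file, and field names are assumptions):

```sql
-- multi.ctl - routing records to two tables based on a record-type field
LOAD DATA
INFILE 'ar_data.csv'
INTO TABLE invoices
WHEN rec_type = 'I'
FIELDS TERMINATED BY ','
(rec_type FILLER, inv_no, inv_amount)
INTO TABLE receipts
WHEN rec_type = 'R'
FIELDS TERMINATED BY ','
(rec_type FILLER POSITION(1), rcpt_no, rcpt_amount)  -- POSITION(1) restarts field scanning
```

Note the `POSITION(1)` on the second INTO TABLE clause: with delimited fields, scanning otherwise continues from where the previous clause left off.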
-Arun -
SQL loader load data very slow...
Hi,
On my production server I have an insert performance issue. A regular SQL*Loader job loads a file, and over time it takes longer and longer to insert the data into the database.
For the first 2 or 3 hours one file takes 8 to 10 seconds; after that each file takes 5 minutes.
As per my understanding the OS I/O is very slow. For the first 3 hours the DB buffer cache is free and inserting data into the buffer is normal,
but when the buffer fills up we run into buffer waits and the inserts slow down. If that is right, please tell me how to improve I/O.
Some analysis of my server is shared here:
[root@myserver ~]# iostat
Linux 2.6.18-194.el5 (myserver) 06/01/2012
avg-cpu: %user %nice %system %iowait %steal %idle
3.34 0.00 0.83 6.66 0.00 89.17
Device: tps Blk_read/s Blk_wrtn/s Blk_read Blk_wrtn
sda 107.56 2544.64 3140.34 8084953177 9977627424
sda1 0.00 0.65 0.00 2074066 16
sda2 21.57 220.59 1833.98 700856482 5827014296
sda3 0.00 0.00 0.00 12787 5960
sda4 0.00 0.00 0.00 8 0
sda5 0.69 2.75 15.07 8739194 47874000
sda6 0.05 0.00 0.55 5322 1736264
sda7 0.00 0.00 0.00 2915 16
sda8 0.50 9.03 5.24 28695700 16642584
sda9 0.51 0.36 24.81 1128290 78829224
sda10 0.52 0.00 5.98 9965 19004088
sda11 83.71 2311.26 1254.71 7343426336 3986520976
[root@myserver ~]# hdparm -tT /dev/sda11
/dev/sda11:
Timing cached reads: 10708 MB in 2.00 seconds = 5359.23 MB/sec
Timing buffered disk reads: 540 MB in 3.00 seconds = 179.89 MB/sec
[root@myserver ~]# sar -u -o datafile 1 6
Linux 2.6.18-194.el5 (mca-webreporting2) 06/01/2012
09:57:19 AM CPU %user %nice %system %iowait %steal %idle
09:57:20 AM all 6.97 0.00 1.87 16.31 0.00 74.84
09:57:21 AM all 6.74 0.00 1.25 17.48 0.00 74.53
09:57:22 AM all 7.01 0.00 1.75 16.27 0.00 74.97
09:57:23 AM all 6.75 0.00 1.12 13.88 0.00 78.25
09:57:24 AM all 6.98 0.00 1.37 16.83 0.00 74.81
09:57:25 AM all 6.49 0.00 1.25 14.61 0.00 77.65
Average: all 6.82 0.00 1.44 15.90 0.00 75.84
[root@myserver ~]# sar -u -o datafile 1 6
Linux 2.6.18-194.el5 (mca-webreporting2) 06/01/2012
09:57:19 AM CPU %user %nice %system %iowait %steal %idle
mca-webreporting2;601;2012-05-27 16:30:01 UTC;2.54;1510.94;3581.85;0.00
mca-webreporting2;600;2012-05-27 16:40:01 UTC;2.45;1442.78;3883.47;0.04
mca-webreporting2;599;2012-05-27 16:50:01 UTC;2.44;1466.72;3893.10;0.04
mca-webreporting2;600;2012-05-27 17:00:01 UTC;2.30;1394.43;3546.26;0.00
mca-webreporting2;600;2012-05-27 17:10:01 UTC;3.15;1529.72;3978.27;0.04
mca-webreporting2;601;2012-05-27 17:20:01 UTC;9.83;1268.76;3823.63;0.04
mca-webreporting2;600;2012-05-27 17:30:01 UTC;32.71;1277.93;3495.32;0.00
mca-webreporting2;600;2012-05-27 17:40:01 UTC;1.96;1213.10;3845.75;0.04
mca-webreporting2;600;2012-05-27 17:50:01 UTC;1.89;1247.98;3834.94;0.04
mca-webreporting2;600;2012-05-27 18:00:01 UTC;2.24;1184.72;3486.10;0.00
mca-webreporting2;600;2012-05-27 18:10:01 UTC;18.68;1320.73;4088.14;0.18
mca-webreporting2;600;2012-05-27 18:20:01 UTC;1.82;1137.28;3784.99;0.04
[root@myserver ~]# vmstat
procs -----------memory---------- -swap -----io---- system -----cpu------
r b swpd free buff cache si so bi bo in cs us sy id wa st
0 1 182356 499444 135348 13801492 0 0 3488 247 0 0 5 2 89 4 0
[root@myserver ~]# dstat -D sda
----total-cpu-usage---- dsk/sda -net/total- -paging -system
usr sys idl wai hiq siq| read writ| recv send| in out | int csw
3 1 89 7 0 0|1240k 1544k| 0 0 | 1.9B 1B|2905 6646
8 1 77 14 0 1|4096B 3616k| 433k 2828B| 0 0 |3347 16k
10 2 77 12 0 0| 0 1520k| 466k 1332B| 0 0 |3064 15k
8 2 77 12 0 0| 0 2060k| 395k 1458B| 0 0 |3093 14k
8 1 78 12 0 0| 0 1688k| 428k 1460B| 0 0 |3260 15k
8 1 78 12 0 0| 0 1712k| 461k 1822B| 0 0 |3390 15k
7 1 78 13 0 0|4096B 6372k| 449k 1950B| 0 0 |3322 15k
AWR sheet output
Wait Events
ordered by wait time desc, waits desc (idle events last)
Event Waits %Time -outs Total Wait Time (s) Avg wait (ms) Waits /txn
free buffer waits 1,591,125 99.95 19,814 12 129.53
log file parallel write 31,668 0.00 1,413 45 2.58
buffer busy waits 846 77.07 653 772 0.07
control file parallel write 10,166 0.00 636 63 0.83
log file sync 11,301 0.00 565 50 0.92
write complete waits 218 94.95 208 955 0.02
SQL> select 'free in buffer (NOT_DIRTY)', round(((select count(DIRTY) N_D from v$bh where DIRTY='N')*100)/(select count(*) from v$bh),2)||'%' DIRTY_PERCENT from dual
  2  union
  3  select 'keep in buffer (YES_DIRTY)', round(((select count(DIRTY) N_D from v$bh where DIRTY='Y')*100)/(select count(*) from v$bh),2)||'%' DIRTY_PERCENT from dual;
'FREEINBUFFER(NOT_DIRTY)' DIRTY_PERCENT
free in buffer (NOT_DIRTY) 10.71%
keep in buffer (YES_DIRTY) 89.29%
Rag....
1) Yes, this is a partitioned table with a local partition index on it.
SQL> desc GR_CORE_LOGGING
Name Null? Type
APPLICATIONID VARCHAR2(20)
SERVICEID VARCHAR2(25)
ENTERPRISENAME VARCHAR2(25)
MSISDN VARCHAR2(15)
STATE VARCHAR2(15)
FROMTIME VARCHAR2(25)
TOTIME VARCHAR2(25)
CAMP_ID VARCHAR2(50)
TRANSID VARCHAR2(25)
MSI_INDEX NUMBER
SQL> select index_name,column_name from user_ind_columns where table_name='GR_CORE_LOGGING';
INDEX_NAME
COLUMN_NAME
GR_CORE_LOGGING_IND
MSISDN
2) I tried direct path, but after that I dropped this table, created a new partitioned table again, and built a fresh index; still the same issue.
SQL*LOADER ERROR WHILE LOADING ARABIAN DATA INTO UNICODE DATABSE
Hi,
I was trying to load Arabic data using SQL*Loader, and the datafile is in .CSV format. But I am facing an error, "value too large for column", while loading, and some data is not loaded due to this error. My target database character set is:
Characterset : AL32UTF8
National Character set: AL16UTF16
DB version:-10g release 2
OS:-Cent OS 5.0/redhat linux 5.0
I have specified the characterset AR8MSWIN1256/AR8ISO8859P6/AL32UTF8/UTF8 separately in the sql*loader control file,but getting the same error for all the cases.
I have also created the table with CHAR semantics and have specified the "LENGTH SEMANTICS CHAR" in the sql*loader control file but again same error is coming.
I have also changed the NLS_LANG setting.
What stuns me is that the data I am going to load using SQL*Loader resides in the same database itself. But when I generate a csv of that data and try to load it into the same database and same table structure using SQL*Loader, I get this "value too large for column" error.
What's the problem, basically? Is the datafile problematic, since I am generating the csv programmatically, or is there a problem in my approach to loading Unicode data?
Please help...
Here's what we know from what you've posted:
1. You may be running on an unsupported operating system ... likely not the issue but who knows.
2. You are using some patch level of 10gR2 of the Oracle database but we don't know which one.
3. You've had some kind of error but we have no idea which error or the error message displayed with it.
4. You are loading data into a table but we do not have any DDL so we do not know the data types.
Perhaps you could provide a bit more information.
Perhaps a lot more. <g> -
Sql*loader - load data in table with multiple condition
Hi,
I have Oracle 9i on Sun Solaris and I need to load data into one of the Oracle tables using SQL*Loader, with one column's data depending on a condition.
My table is like:
Load_table
col1 varchar2(10),
col2 varchar2(10),
col3 varchar2(10),
Now i have to load data like:
If col2 = US1 then col3 = 'AA'
If col2 = US2 then col3 = 'BB'
If col2 = US3 then col3 = 'CC'
How can i load this data in table using sql*loader?
Thanks,
Pora
Hi,
it is a half-solution.
You have to:
1. open file
2. take a line
3. split the line into values (using substring to)
4. check condition (01 or 02)
5. do a proper insertion
Good Luck,
Przemek
DECLARE
v_dir VARCHAR2(50) := 'd:/tmp/'; --directory where file is placed
v_file VARCHAR2(50) := 'test.txt'; -- file name
v_fhandle UTL_FILE.FILE_TYPE; ---file handler
v_fline VARCHAR2(906); --file line
v_check VARCHAR2(50);
BEGIN
v_fhandle := UTL_FILE.FOPEN(v_dir, v_file, 'R'); --open file for read only
LOOP -- in the loop
UTL_FILE.GET_LINE( v_fhandle , v_fline); -- get line by line from file
if (substr(v_fline,17,2) = '01') then --check the value
INSERT INTO ... -- Time_in
else
INSERT INTO ... -- Time_out
end if;
END LOOP;
EXCEPTION
WHEN NO_DATA_FOUND
THEN UTL_FILE.FCLOSE( v_fhandle );
END; -
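For the record, SQL*Loader itself can derive the conditional column, so the UTL_FILE workaround above isn't strictly required: a SQL expression can be applied in the control file. A sketch against the Load_table definition from the question (the datafile name is an assumption, and the EXPRESSION keyword requires 9i or later):

```sql
-- cond.ctl - col3 derived from col2 inside the control file
LOAD DATA
INFILE 'load.dat'
INTO TABLE load_table
FIELDS TERMINATED BY ','
(col1,
 col2,
 col3 EXPRESSION "DECODE(:col2, 'US1', 'AA', 'US2', 'BB', 'US3', 'CC')")
```

With EXPRESSION, col3 consumes no field from the datafile; its value is computed entirely from the DECODE on :col2.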
How to load oracle data into SQL SERVER 2000?
how to load oracle data into SQL SERVER 2000.
IS THERE ANY UTILITY AVAILABLE?
Not a concern for an Oracle forum.
Also no need for SHOUTING.
Conventional solutions are
- dump the data to a csv file and load it in Mickeysoft SQL server
- use Oracle Heterogeneous services
- use Mickeysoft DTS
Whatever you prefer.
Sybrand Bakker
Senior Oracle DBA -
Problem: loading SQL Server 'image' data
Source database: SQL Server 2000
OS: Windows 2003 (SP1)
Oracle: 10g (R2)
Datatype Mapping: SQL Server ‘image’ to oracle ‘BLOB’
With the help of OMWB, I created the oracle database schema for a SQL Server DB (offline capture). I have problems when I tried to populate the (destination) database with two tables with ‘image’ datatype:
Frame.img (as on http://www.sdss.org.uk/mybestdr5/en/help/browser/description.asp?n=Frame&t=U)
And SpecObjAll.img (as on http://www.sdss.org.uk/mybestdr5/en/help/browser/description.asp?n=SpecObjAll&t=U)
The part of .ctl files (generated by OMWB) for the two ‘img’ columns is like this:
IMG CHAR(2000000) "HEXTORAW (:IMG)")
I failed to load the data with the sql_load_script.sh script, and the log file is like this:
IMG NEXT ***** CHARACTER
Maximum field length is 2000000
Terminator string : '<ec>'
SQL string for column : "HEXTORAW (:IMG)"
SQL*Loader-309: No SQL string allowed as part of IMG field specification
I tried to removed ‘CHAR(2000000)’ in the .ctl files:
IMG "HEXTORAW (:IMG)")
But this doesn’t work and the log file is like this:
IMG NEXT * CHARACTER
Terminator string : '<ec>'
SQL string for column : "HEXTORAW (:IMG)"
SQL*Loader-309: No SQL string allowed as part of IMG field specification
Any help would be extremely appreciated.
Helen
Hi, you might want to post your question in the General forum.
General Database Discussions
Very few users visit this forum.
Problem: load SQL Server 'varbinary' data
Source database: SQL Server 2000
OS: Windows 2003 (SP1)
Oracle: 10g (R2)
Datatype Mapping: SQL Server ‘varbinary(1000)’ to oracle ‘BLOB’
With the help of OMWB, I created the oracle database schema for a SQL Server DB (offline capture). I have problems when I tried to populate the (destination) database with a table with ‘varbinary(1000)’ datatype:
PlateX.expID (as on http://www.sdss.org.uk/mybestdr5/en/help/browser/description.asp?n=PlateX&t=U) and the with original data is like this: http://dsg.port.ac.uk/~hx/research/sdss/logs/SelectExpidFromPlatex
The PLATEX.ctl file (generated by OMWB) is like this (http://dsg.port.ac.uk/~hx/research/sdss/logs/PLATEX.ctl):
load data
infile 'PLATEX.dat' "str '<er>'"
into table PLATEX
fields terminated by '<ec>'
trailing nullcols
<SKIP>
EXPID)
I failed to load the data with the sql_load_script.sh script, and the log file is like this:
<SKIP>
EXPID NEXT * CHARACTER
Terminator string : '<ec>'
value used for ROWS parameter changed from 64 to 12
Record 1: Rejected - Error on table PLATEX, column EXPID.
Field in data file exceeds maximum length
<SKIP>
Record 60: Rejected - Error on table PLATEX, column EXPID.
Field in data file exceeds maximum length
MAXIMUM ERROR COUNT EXCEEDED - Above statistics reflect partial run.
Table PLATEX:
9 Rows successfully loaded.
51 Rows not loaded due to data errors.
0 Rows not loaded because all WHEN clauses were failed.
0 Rows not loaded because all fields were null.
Space allocated for bind array: 238392 bytes(12 rows)
Read buffer bytes: 1048576
Total logical records skipped: 0
Total logical records read: 60
Total logical records rejected: 51
Total logical records discarded: 0
Any help would be extremely appreciated.
Helen
P.S. Previously, I mapped SQL Server ‘varbinary(1000)’ to Oracle ‘RAW(1000)’ but failed to populate the data (http://dsg.port.ac.uk/~hx/research/sdss/logs/PLATEX.log).

Hi Emile,
regarding extracting data from MSSQL with OWB on a Unix platform (using Generic Connectivity):
Re: SQLServer access from AIX Warehouse builder
Re: OWB on Solaris Connectivity with SQL SERVER on Windows
We want to load approx. 100 million records a day. I've read some articles about this, and the advice was to dump the data from SQL Server to files and load the files with OWB.

100 million records per day is not a problem for daily extraction from MSSQL Server if you have 1-2 hours.
In my opinion, dumping to a text file is bad practice and unnecessary unless the customer has special requirements (for example, for security reasons).

A SQL Server source table and an Oracle target table without any difficult transformations.

In my opinion, the best way to process data from MSSQL is to extract it to a staging area (schema) on the Oracle DB with mappings as simple as possible (ONLY filters, without any joins), and to perform most of the data processing in the staging area or while moving from staging to the DWH.
Also look at the OWB user guide (how to use Generic Connectivity in OWB):
http://download.oracle.com/docs/cd/E11882_01/owb.112/e10582/loading_ms_data.htm#i1064950
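For reference, a minimal sketch of the Generic Connectivity route described above (the DSN mssql_dsn, the HS SID HSMSSQL, the staging table, and the credentials are all placeholders; on 10g the agent is hsodbc, renamed dg4odbc in 11g):

```
-- inithsmssql.ora (HS init file on the Oracle server, pointing at an ODBC DSN):
--   HS_FDS_CONNECT_INFO = mssql_dsn
--   HS_FDS_TRACE_LEVEL  = OFF
--
-- After adding the matching SID to listener.ora and a tnsnames.ora entry:
CREATE DATABASE LINK mssql_link
  CONNECT TO "sqlserver_user" IDENTIFIED BY "password"
  USING 'HSMSSQL';

-- Simple extraction into an Oracle staging table (filters only, no joins):
INSERT /*+ APPEND */ INTO stg_platex
SELECT * FROM "PLATEX"@mssql_link;
COMMIT;
```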
Regards,
Oleg
Edited by: added link to OWB doc with description of using Generic Connectivity
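Back to the "Field in data file exceeds maximum length" rejections in the original post: when a delimited field carries no length, SQL*Loader assumes a 255-byte default, so any longer value is rejected. A sketch of the fix (column list abbreviated; CHAR(2000) assumes varbinary(1000) exported as hex, and the HEXTORAW string is only legal when EXPID is mapped to RAW(1000) rather than BLOB, as in the P.S. above):

```
load data
infile 'PLATEX.dat' "str '<er>'"
into table PLATEX
fields terminated by '<ec>'
trailing nullcols
(
  ...
  EXPID CHAR(2000) "HEXTORAW(:EXPID)"
)
```

With a BLOB target, the SQL string would hit the same SQL*Loader-309 restriction as the ‘image’ thread, so the stage-and-convert approach applies there too.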