Bulk insert task issue
I have a table containing 4 million records, and I want to load the data into a SQL Server table using the Bulk Insert task.
How can I load the data using the Bulk Insert task? The Bulk Insert task supports only text file sources.
Thanks in advance.
If it's a SQL Server table-to-table transfer, you can use a Data Flow Task with an OLE DB Source and an OLE DB Destination. In the OLE DB Destination, use 'Table or view - fast load' as the data access mode.
Also, if the databases are on the same server, you can even use an Execute SQL Task with a statement like
INSERT INTO DestTable
SELECT *
FROM SourceDB.dbo.SourceTable
which will be set-based.
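A hedged variant of the same statement for a 4-million-row load: adding a TABLOCK hint on the destination, which under the simple or bulk-logged recovery model can make the insert minimally logged (table names are the same placeholders as above):

INSERT INTO DestTable WITH (TABLOCK)  -- table lock enables minimal logging under simple/bulk-logged recovery
SELECT *
FROM SourceDB.dbo.SourceTable;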
Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs
Similar Messages
-
If your database is in Full Recovery mode, can you use the Bulk Insert Task to load data?
Yes, of course you can, but don't assume that logging will be minimal. Logging will follow the Full recovery model: everything will be logged. If you are going to use the Bulk Insert task, you can consider switching the recovery model to Bulk-logged, but then you will not have the option to do a point-in-time recovery.
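As a minimal sketch of that switch (the database name and backup path are placeholders, not from this thread):

ALTER DATABASE MyDB SET RECOVERY BULK_LOGGED;
-- ... run the Bulk Insert task / BULK INSERT here ...
ALTER DATABASE MyDB SET RECOVERY FULL;
-- Take a log backup afterwards to restart the log chain for point-in-time recovery
BACKUP LOG MyDB TO DISK = 'C:\Backups\MyDB_log.trn';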
PS: please don't create duplicate threads.
If you read the first Note section in the link below, it clearly states that logging will be full and that you can still use the task:
http://technet.microsoft.com/en-us/library/ms191244(v=sql.105).aspx
Please mark this reply as answer if it solved your issue or vote as helpful if it helped so that other forum members can benefit from it.
My TechNet Wiki Articles -
I am getting the following error after I changed the path in the config file from
\\vs01\d$\\Deployment\Files\temp.txt
to
C:\Deployment\Files\temp.txt
[Bulk Insert Task] Error: An error occurred with the following error message: "Cannot bulk load because the file "C:\Deployment\Files\temp.txt" could not be opened. Operating system error code 3 (The system cannot find the path specified.).".
I think I know what's going on. The Bulk Insert task runs by executing a SQL command (BULK INSERT) internally on the target SQL Server to load the file. This means the service account of the target SQL Server must have permissions on the file you are trying to load. It also means you need to use a UNC path to specify the file location (if the target server is on a different machine).
Also from BOL (see section Usage Considerations - last bullet point)
http://msdn.microsoft.com/en-us/library/ms141239.aspx
* Only members of the sysadmin fixed server role can run a package that contains a Bulk Insert task.
Make sure you take care of this as well.
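For reference, a hedged sketch of the UNC form of the statement the task issues (the share name, file path, and target table here are placeholders):

BULK INSERT dbo.TempStage
FROM '\\fileserver\Deployment\Files\temp.txt'  -- UNC path readable by the target instance's service account
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n');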
HTH
~Mukti
Mukti -
Cannot fetch a row from OLE DB provider "BULK" with bulk insert task
Hi, folks:
I created a simple SSIS package. On the Control Flow, I created a Bulk Insert Task with a destination connection to the local SQL Server, a CSV file from a local folder, and a comma delimiter. When I execute the task, I get this long error message.
[Bulk Insert Task] Error: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)". The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error. The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.".
I got the same error with some additional error details (below). All I had to do to fix the problem was set the Timeout property of the SQL Server Destination to 0.
I was using the following components:
SQL Server 2008
SQL Server Integration Services 10.0
Data Flow Task
OLE DB Source – connecting to Oracle 11i
SQL Server Destination – connecting to the local SQL Server 2008 instance
Full Error Message:
Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80040E14.
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80040E14 Description: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".".
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80040E14 Description: "The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.".
An OLE DB record is available. Source: "Microsoft SQL Server Native Client 10.0" Hresult: 0x80040E14 Description: "The Bulk Insert operation of SQL Server Destination has timed out. Please consider increasing the value of Timeout property on the SQL Server Destination in the dataflow.".
For SQL Server 2005 there is a hot fix available from Microsoft at http://support.microsoft.com/default.aspx/kb/937545 -
Getting random LCK_M_SCH_M on convert and bulk insert task
I started getting random LCK_M_SCH_M locks with huge wait times, which hung my ETL process.
The ssis package runs like this:
I have 4 containers that run in parallel and do the same thing:
-Convert a tab-delimited file from Unicode to UTF-8
-Truncate the table (within a Foreach Loop)
-Bulk insert the data
Also, TransactionOption is set to NotSupported.
What could be causing the lock?
The Foreach Loops do not overlap regarding tables/files.
Do they contend somehow?
Elias
The TRUNCATE TABLE command takes a schema modification (Sch-M) lock, so you will have to avoid running these tasks in parallel.
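A quick way to see that lock, as a hedged sketch (the table name is a placeholder):

BEGIN TRAN;
TRUNCATE TABLE dbo.Stage1;  -- holds a Sch-M lock on the table until commit
-- Any concurrent query against dbo.Stage1 now waits on LCK_M_SCH_M.
SELECT resource_type, request_mode, request_status
FROM sys.dm_tran_locks
WHERE request_session_id = @@SPID;
COMMIT;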
Arthur
MyBlog
Twitter -
Hi,
I am trying to figure out how to fix my problem:
Error: Could not be opened. Operating system error code 5 (Access is denied.)
Process Description:
The target database server resides on a different server on the network.
The SSIS package runs from a remote server.
The SSIS package uses a Foreach Loop Container to loop through a directory and do the bulk inserts.
The SSIS package uses variables to specify the share location of the files, using a UNC path like this:
\\server\files
The service account the database engine is running under has full permission on the share where the files reside.
The Execution Results tab shows the prepared SQL statement for the BULK INSERT, and I can run the exact same bulk insert in SSMS without errors, both from the database server and from the server where the SSIS package is executed.
I am at a dead end, and I don't want to rewrite the package to use a Data Flow Task, because that is not flexible to update when the table's metadata changes.
Below post it has almost the same situation:
https://social.msdn.microsoft.com/Forums/sqlserver/en-US/8de13e74-709a-43a5-8be2-034b764ca44f/problem-with-bulk-insert-task-in-foreach-loop?forum=sqlintegrationservices
Interesting how I fixed the issue: adding the Application Name property to the SQL connection string fixed it. I am not sure why SQL Server wasn't able to open the file remotely without this.
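For illustration, a hedged example of an OLE DB connection string with the Application Name property set (server, database, and application name are placeholders):

Data Source=DBServer;Initial Catalog=StagingDB;Provider=SQLNCLI10.1;Integrated Security=SSPI;Application Name=BulkInsertPackage;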
-
Error while running bulk insert in SSIS package
Hi:
I get an error when I run a bulk insert in an SSIS package.
I have implemented an SSIS package to update master data directly from R/3. R/3 produces the file in a specified format; I take this, insert all the records into a temporary table, then update the mbr table and process the dimension.
This works perfectly well in our development system, where our app server and SQL Server are on the same box. But in QAS the two servers are separate, and when I try to run the SSIS package I get the error below.
We have tested all connections and are able to access the path and file from both the app server and the SQL Server using the shared folder. Our Basis team says that it is a problem with the Bulk Insert task and nothing to do with authorization.
Has anyone experienced this sort of problem in a multi-server environment? Is there another way to load all the data from a file into a bespoke table without using bulk insert?
Thanks,
Subramania
Error----
SSIS package "Package.dtsx" starting.
Error: 0xC002F304 at Insert Data Into Staging Table (Account), Bulk Insert Task: An error occurred with the following error message: "Cannot bulk load because the file "\\msapbpcapq01\dim\entity.csv" could not be opened. Operating system error code 5 (Access is denied.).".
Task failed: Insert Data Into Staging Table (Account)
SSIS package "Package.dtsx" finished: Success.
The program '[2496] Package.dtsx: DTS' has exited with code 0 (0x0).Hi Subramania
From your error:
Error: 0xC002F304 at Insert Data Into Staging Table (Account), Bulk Insert Task: An error occurred with the following error message: "Cannot bulk load because the file "\\msapbpcapq01\dim\entity.csv" could not be opened. Operating system error code 5 (Access is denied.).".
Let's say server A is where the file entity.csv is located.
Please check the Event Viewer -> Security log of server A at the time the SSIS package runs; there should be an entry with a logon failure showing which user was used to access the shared path.
If your two servers are not in a domain, create a user on server A with the same name and password and grant it read access to the shared folder.
The other workaround is to grant read access to Everyone on the shared folder.
Halomoan
Edited by: Halomoan Zhou on Oct 6, 2008 4:23 AM -
Bulk Insert Failure: Unexpected end of file
Hi all,
I have a Bulk Insert task which pulls data from a .csv file into a SQL Server table.
It works fine 99 out of 100 times, but sometimes it throws the following error in production.
"System exception: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information
about the error.Bulk load: An unexpected end of file was encountered in the data file.". "
My client says that there was no change to the file, and it runs fine the next time. It's an error I could not reproduce in development.
I have an interface prior to this bulk insert interface that checks whether the file is completely available (i.e., I try to open the file and check whether it is in use by any other process).
I'm supposed to give an answer for why it fails sometimes. Your thoughts on this?
Rajkumar
The Bulk Insert task is used in an SSIS package. The error posted in my first post is what I got from SSIS.
Why don't you use a Data Flow Task? BULK INSERT is fast, but offers little error control.
BOL quote ( http://msdn.microsoft.com/en-us/library/ms141679.aspx ) : "Error Handling in Data
When a data flow component applies a transformation to column data, extracts data from sources, or loads data into destinations, errors can occur. Errors frequently occur because of unexpected data values. For example, a data conversion fails because a column contains a string instead of a number, an insertion into a database column fails because the data is a date and the column has a numeric data type, or an expression fails to evaluate because a column value is zero, resulting in a mathematical operation that is not valid.
Errors typically fall into one of the following categories:
Data conversion errors, which occur if a conversion results in loss of significant digits, the loss of insignificant digits, and the truncation of strings. Data conversion errors also occur if the requested conversion is not supported.
Expression evaluation errors, which occur if expressions that are evaluated at run time perform invalid operations or become syntactically incorrect because of missing or incorrect data values.
Lookup errors, which occur if a lookup operation fails to locate a match in the lookup table.
Many data flow components support error outputs, which let you control how the component handles row-level errors in both incoming and outgoing data. You specify how the component behaves when truncation or an error occurs by setting options on individual columns in the input or output. For example, you can specify that the component should fail if customer name data is truncated, but ignore errors on another column that contains less important data.
Kalman Toth SQL SERVER & BI TRAINING -
BULK INSERT into View w/ Instead Of Trigger - DML ERROR LOGGING Issue
Oracle 10.2.0.4
I cannot figure out why I cannot get bulk insert errors to aggregate and allow the insert to continue when bulk inserting into a view with an INSTEAD OF trigger. Whether I use the LOG ERRORS clause or SQL%BULK_EXCEPTIONS, the insert works until it hits the first exception and then exits.
Here's what I'm doing:
1. I'm bulk inserting into a view with an INSTEAD OF trigger on it that performs the actual updating on the underlying table. This table is a child table with a foreign key constraint to a reference table containing the primary key. In the INSTEAD OF trigger, it attempts to insert a record into the child table, and I get the following exception: 5:37:55 ORA-02291: integrity constraint (FK_TEST_TABLE) violated - parent key not found, which is expected, but the error should be logged in the table and the rest of the inserts should complete. Instead, the bulk insert exits.
2. If I change this to bulk insert into the underlying table directly, it works, all errors get put into the error logging table and the insert completes all non-exception records.
Here's the "test" procedure I created to test my scenario:
View: V_TEST_TABLE
Underlying Table: TEST_TABLE
PROCEDURE BulkTest
IS
  TYPE remDataType IS TABLE OF v_TEST_TABLE%ROWTYPE INDEX BY BINARY_INTEGER;
  varRemData remDataType;
BEGIN
  -- Pull the candidate rows from the remote table into the collection
  -- (alias r added so the DRIVING_SITE hint resolves to the remote table)
  SELECT /*+ DRIVING_SITE(r) */ *
  BULK COLLECT INTO varRemData
  FROM TEST_TABLE@REMOTE_LINK r
  WHERE effectiveday < TO_DATE('06/16/2012 04', 'mm/dd/yyyy hh24')
    AND terminationday > TO_DATE('06/14/2012 04', 'mm/dd/yyyy hh24');
  BEGIN
    -- Bulk insert through the view; rejected rows should land in the error log table
    FORALL idx IN varRemData.FIRST .. varRemData.LAST
      INSERT INTO v_TEST_TABLE VALUES varRemData(idx)
      LOG ERRORS INTO dbcompare.ERR$_TEST_TABLE ('INSERT') REJECT LIMIT UNLIMITED;
  EXCEPTION
    WHEN OTHERS THEN
      DBMS_OUTPUT.put_line('ErrorCode: ' || SQLCODE);
  END;
  COMMIT;
END;
I've reviewed Oracle's documentation on both DML logging tools and neither has any restrictions (at least that I can see) that would prevent this from working correctly.
Any help would be appreciated....
Thanks,
Steve
Thanks. Obviously this is my first post, and I'm desperate to figure out why this won't work.
The code I sent is only a test proc to troubleshoot the issue. The other version with the debug statement is only there to capture the insert failing and not aggregating the errors; that won't be in the real proc.
Thanks,
Steve -
I'm running SQL Server 2008 R2 and trying to test out bcp in one of our databases. For almost all the tables, the bcp and bulk insert work fine using commands similar to those below. However, on a few tables I am experiencing an issue when trying to bulk insert the data back in.
Here are the details:
This is the bcp command to export out the data (via simple batch file):
1.)
SET OUTPUT=K:\BCP_FIN_Test
SET ERRORLOG=C:\Temp\BCP_Error_Log
SET TIMINGS=C:\Temp\BCP_Timings
bcp "SELECT * FROM FS84RPT.dbo.PS_PO_LINE Inner Join FS84RPT.[dbo].[PS_RECV_LN_ACCTG] on PS_PO_LINE.BUSINESS_UNIT = PS_RECV_LN_ACCTG.BUSINESS_UNIT_PO and PS_PO_LINE.PO_ID= PS_RECV_LN_ACCTG.PO_ID and PS_PO_LINE.LINE_NBR= PS_RECV_LN_ACCTG.LINE_NBR WHERE
PS_RECV_LN_ACCTG.FISCAL_YEAR = '2014' and PS_RECV_LN_ACCTG.ACCOUNTING_PERIOD BETWEEN '9' AND '11' " queryout %OUTPUT%\PS_PO_LINE.txt -e %ERRORLOG%\PS_PO_LINE.err -o %TIMINGS%\PS_PO_LINE.txt -T -N
2.)
BULK INSERT PS_PO_LINE FROM 'K:\BCP_FIN_Test\PS_PO_LINE.txt' WITH (DATAFILETYPE = 'widenative')
Msg 4869, Level 16, State 1, Line 1
The bulk load failed. Unexpected NULL value in data file row 2, column 22. The destination column (CNTRCT_RATE_MULT) is defined as NOT NULL.
Msg 4866, Level 16, State 4, Line 1
The bulk load failed. The column is too long in the data file for row 3, column 22. Verify that the field terminator and row terminator are specified correctly.
Msg 7399, Level 16, State 1, Line 1
The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
Msg 7330, Level 16, State 2, Line 1
Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
I've tried a few different things, including exporting as character and importing with BULK INSERT PS_PO_LINE FROM 'K:\BCP_FIN_Test\PS_PO_LINE.txt' WITH (DATAFILETYPE = 'char').
But no luck.
Appreciate the help.
It seems that the target table does not match your expectations.
Since I don't know exactly what you are doing, I will have to resort to guesses.
I note that your export query goes:
SELECT * FROM FS84RPT.dbo.PS_PO_LINE Inner Join
And then you are importing into a table called PS_PO_LINE as well. But for your operation to make sense, the import table PS_PO_LINE must have not only the columns from PS_PO_LINE but also all the columns from PS_RECV_LN_ACCTG. Maybe your SELECT should read
SELECT PS_PO_LINE.* FROM FS84RPT.dbo.PS_PO_LINE Inner Join
or use an EXISTS clause to apply the PS_RECV_LN_ACCTG filter. (Assuming that the table appears in the query for filtering only.)
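As a hedged sketch of that EXISTS variant, reusing the join columns from the bcp command above (untested against the real schema):

SELECT p.*
FROM FS84RPT.dbo.PS_PO_LINE p
WHERE EXISTS (SELECT 1
              FROM FS84RPT.dbo.PS_RECV_LN_ACCTG r
              WHERE r.BUSINESS_UNIT_PO = p.BUSINESS_UNIT
                AND r.PO_ID = p.PO_ID
                AND r.LINE_NBR = p.LINE_NBR
                AND r.FISCAL_YEAR = '2014'
                AND r.ACCOUNTING_PERIOD BETWEEN '9' AND '11')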
Erland Sommarskog, SQL Server MVP, [email protected] -
Bulk Insert using Script Task in SSIS
Hi guys,
Please, I have about 300 million rows of data I want to load from a remote SQL Server into another SQL Server. The problem is that the data takes forever to load, and I have a deadline.
I was wondering if there is a way I can do a bulk insert for this, or maybe get some code that I can use to load the data faster.
Please, I really need your help, as my job depends on this.
me
CREATE TABLE [dbo].[post_tran](
[post_tran_id] [bigint] NOT NULL,
[post_tran_cust_id] [bigint] NOT NULL,
[settle_entity_id] [dbo].[POST_ID] NULL,
[batch_nr] [int] NULL,
[prev_post_tran_id] [bigint] NULL,
[next_post_tran_id] [bigint] NULL DEFAULT ((0)),
[sink_node_name] [dbo].[POST_NAME] NULL,
[tran_postilion_originated] [dbo].[POST_BOOL] NOT NULL,
[tran_completed] [dbo].[POST_BOOL] NOT NULL,
[message_type] [char](4) NOT NULL,
[tran_type] [char](2) NULL,
[tran_nr] [bigint] NOT NULL,
[system_trace_audit_nr] [char](6) NULL,
[rsp_code_req] [char](2) NULL,
[rsp_code_rsp] [char](2) NULL,
[abort_rsp_code] [char](2) NULL,
[auth_id_rsp] [varchar](10) NULL,
[auth_type] [numeric](1, 0) NULL,
[auth_reason] [numeric](1, 0) NULL,
[retention_data] [varchar](999) NULL,
[acquiring_inst_id_code] [varchar](11) NULL,
[message_reason_code] [char](4) NULL,
[sponsor_bank] [char](8) NULL,
[retrieval_reference_nr] [char](12) NULL,
[datetime_tran_gmt] [datetime] NULL,
[datetime_tran_local] [datetime] NOT NULL,
[datetime_req] [datetime] NOT NULL,
[datetime_rsp] [datetime] NULL,
[realtime_business_date] [datetime] NOT NULL,
[recon_business_date] [datetime] NOT NULL,
[from_account_type] [char](2) NULL,
[to_account_type] [char](2) NULL,
[from_account_id] [varchar](28) NULL,
[to_account_id] [varchar](28) NULL,
[tran_amount_req] [dbo].[POST_MONEY] NULL,
[tran_amount_rsp] [dbo].[POST_MONEY] NULL,
[settle_amount_impact] [dbo].[POST_MONEY] NULL,
[tran_cash_req] [dbo].[POST_MONEY] NULL,
[tran_cash_rsp] [dbo].[POST_MONEY] NULL,
[tran_currency_code] [dbo].[POST_CURRENCY] NULL,
[tran_tran_fee_req] [dbo].[POST_MONEY] NULL,
[tran_tran_fee_rsp] [dbo].[POST_MONEY] NULL,
[tran_tran_fee_currency_code] [dbo].[POST_CURRENCY] NULL,
[tran_proc_fee_req] [dbo].[POST_MONEY] NULL,
[tran_proc_fee_rsp] [dbo].[POST_MONEY] NULL,
[tran_proc_fee_currency_code] [dbo].[POST_CURRENCY] NULL,
[settle_amount_req] [dbo].[POST_MONEY] NULL,
[settle_amount_rsp] [dbo].[POST_MONEY] NULL,
[settle_cash_req] [dbo].[POST_MONEY] NULL,
[settle_cash_rsp] [dbo].[POST_MONEY] NULL,
[settle_tran_fee_req] [dbo].[POST_MONEY] NULL,
[settle_tran_fee_rsp] [dbo].[POST_MONEY] NULL,
[settle_proc_fee_req] [dbo].[POST_MONEY] NULL,
[settle_proc_fee_rsp] [dbo].[POST_MONEY] NULL,
[settle_currency_code] [dbo].[POST_CURRENCY] NULL,
[icc_data_req] [text] NULL,
[icc_data_rsp] [text] NULL,
[pos_entry_mode] [char](3) NULL,
[pos_condition_code] [char](2) NULL,
[additional_rsp_data] [varchar](25) NULL,
[structured_data_req] [text] NULL,
[structured_data_rsp] [text] NULL,
[tran_reversed] [char](1) NULL DEFAULT ((0)),
[prev_tran_approved] [dbo].[POST_BOOL] NULL,
[issuer_network_id] [varchar](11) NULL,
[acquirer_network_id] [varchar](11) NULL,
[extended_tran_type] [char](4) NULL,
[ucaf_data] [varchar](33) NULL,
[from_account_type_qualifier] [char](1) NULL,
[to_account_type_qualifier] [char](1) NULL,
[bank_details] [varchar](31) NULL,
[payee] [char](25) NULL,
[card_verification_result] [char](1) NULL,
[online_system_id] [int] NULL,
[participant_id] [int] NULL,
[opp_participant_id] [int] NULL,
[receiving_inst_id_code] [varchar](11) NULL,
[routing_type] [int] NULL,
[pt_pos_operating_environment] [char](1) NULL,
[pt_pos_card_input_mode] [char](1) NULL,
[pt_pos_cardholder_auth_method] [char](1) NULL,
[pt_pos_pin_capture_ability] [char](1) NULL,
[pt_pos_terminal_operator] [char](1) NULL,
[source_node_key] [varchar](32) NULL,
[proc_online_system_id] [int] NULL,
[from_account_id_cs] [int] NULL,
[to_account_id_cs] [int] NULL,
[pos_geographic_data] [char](5) NULL,
[payer_account_id] [char](5) NULL,
[cvv_available_at_auth] [varchar](30) NULL,
[cvv2_available_at_auth] [varchar](30) NULL
) ON [PRIMARY] TEXTIMAGE_ON [PRIMARY]
GO
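(A hedged aside for the original question of moving roughly 300 million rows quickly: bcp in native format with batching is one common option. Server names, database names, and paths below are placeholders, not from this thread.)

bcp "SELECT * FROM SourceDB.dbo.post_tran" queryout D:\Transfer\post_tran.dat -S SourceServer -T -n
bcp TargetDB.dbo.post_tran in D:\Transfer\post_tran.dat -S TargetServer -T -n -b 100000 -h "TABLOCK"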
me -
Bulk inserts on Solaris slow as compared to windows
Hi Experts,
Looking for tips on troubleshooting bulk inserts on Solaris. I have observed that the same bulk inserts are quite fast on Windows compared to Solaris. Are there any known issues on Solaris?
This is the statement:
I have a 'merge ... insert ...' query which has been executing for a long time, more than 12 hours now:
merge into A DEST using (select * from B SRC) SRC on (SRC.some_ID = DEST.some_ID) when matched then update ... when not matched then insert (...) values (...)
Table A has 600K rows with a unique identifier in the some_ID column; table B has 500K rows with the same some_ID column. The 'merge ... insert' checks whether the some_ID exists: if it does, the update fires; when not matched, the insert fires. In either case it takes a long time to execute.
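(A hedged aside: one way to see where the time goes while the MERGE runs is to query the session's current wait event; the SID below is hypothetical.)

SELECT sid, seq#, event, wait_class_id, wait_class#,
       wait_time, seconds_in_wait, state
FROM   v$session_wait
WHERE  sid = 125;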
Environment:
The version of the database is 10g Standard 10.2.0.3.0 - 64bit Production
OS: Solaris 10, SPARC-Enterprise-T5120
These are the parameters relevant to the optimizer:
SQL>
SQL> show parameter sga_target
NAME TYPE VALUE
sga_target big integer 4G
SQL>
SQL> show parameter optimizer
NAME TYPE VALUE
optimizer_dynamic_sampling integer 2
optimizer_features_enable string 10.2.0.3
optimizer_index_caching integer 0
optimizer_index_cost_adj integer 100
optimizer_mode string ALL_ROWS
optimizer_secure_view_merging boolean TRUE
SQL>
SQL> show parameter db_file_multi
NAME TYPE VALUE
db_file_multiblock_read_count integer 16
SQL>
SQL> show parameter db_block_size
NAME TYPE VALUE
db_block_size integer 8192
SQL>
SQL> show parameter cursor_sharing
NAME TYPE VALUE
cursor_sharing string EXACT
SQL>
SQL> column sname format a20
SQL> column pname format a20
SQL> column pval2 format a20
SQL>
SQL> select sname, pname, pval1, pval2 from sys.aux_stats$;
SNAME PNAME PVAL1 PVAL2
SYSSTATS_INFO STATUS COMPLETED
SYSSTATS_INFO DSTART 07-12-2005 07:13
SYSSTATS_INFO DSTOP 07-12-2005 07:13
SYSSTATS_INFO FLAGS 1
SYSSTATS_MAIN CPUSPEEDNW 452.727273
SYSSTATS_MAIN IOSEEKTIM 10
SYSSTATS_MAIN IOTFRSPEED 4096
SYSSTATS_MAIN SREADTIM
SYSSTATS_MAIN MREADTIM
SYSSTATS_MAIN CPUSPEED
SYSSTATS_MAIN MBRC
SYSSTATS_MAIN MAXTHR
SYSSTATS_MAIN SLAVETHR
13 rows selected.
Following is the error messages being pushed into oracle alert log file:
Thu Dec 10 01:41:13 2009
Thread 1 advanced to log sequence 1991
Current log# 1 seq# 1991 mem# 0: /oracle/oradata/orainstance/redo01.log
Thu Dec 10 04:51:01 2009
Thread 1 advanced to log sequence 1992
Current log# 2 seq# 1992 mem# 0: /oracle/oradata/orainstance/redo02.log
Please provide some tips to troubleshoot the actual issue. Any pointers on db_block_size, SGA, or PGA that could be the reason for this slowness?
Regards,
neuron
SID, SEQ#, EVENT, WAIT_CLASS_ID, WAIT_CLASS#, WAIT_TIME, SECONDS_IN_WAIT, STATE
125 24235 'db file sequential read' 1740759767 8 -1 58608 'WAITED SHORT TIME'
Regarding the disk, I am not sure what needs to be checked; however, from the output of iostat it does not seem to be busy. Check the last three rows: the %b column is negligible:
tty cpu
tin tout us sy wt id
0 320 3 0 0 97
extended device statistics
r/s w/s kr/s kw/s wait actv wsvc_t asvc_t %w %b device
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0 0 ramdisk1
0.0 2.5 0.0 18.0 0.0 0.0 0.0 8.3 0 1 c1t0d0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0 0 c1t1d0
0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0 0 c0t0d0 -
How to debug bulk insert?
I have this code which doesn't cause any error, and actually gives message 'query executed successfully', but it doesn't load any data.
bulk insert [dbo].[SPGT]
from '\\sys.local\london-sql\FTP\20140210_SPGT.SPL'
WITH (
    KEEPNULLS,
    FIRSTROW = 5,
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n'
)
How can I debug the issue, or see what the script is REALLY doing? It's not doing what I think it's doing.
All permissions, rights, etc. are set up correctly. I just ran the code successfully with a .txt file. Maybe it has something to do with the extension...
Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.
Yes, here is the final solution (for the benefit of others who find this anytime in the future).
CREATE TABLE [dbo].[ICM] (
    Date DATETIME,
    Type VARCHAR(MAX),
    Change VARCHAR(MAX),
    SP_ID VARCHAR(MAX),
    Sedol VARCHAR(MAX),
    Cusip VARCHAR(MAX),
    Issue_Name VARCHAR(MAX),
    Cty VARCHAR(MAX),
    PE VARCHAR(MAX),
    Cap_Range VARCHAR(MAX),
    GICS VARCHAR(MAX),
    Curr VARCHAR(MAX),
    Local_Price DECIMAL(19,8),
    Index_Total_Shares DECIMAL(19,8),
    IWF DECIMAL(19,8),
    Index_Curr VARCHAR(MAX),
    Float_MCAP DECIMAL(19,8),
    Total_MCAP DECIMAL(19,8),
    Daily_Price_Rtn DECIMAL(19,8),
    Daily_Total_Rtn DECIMAL(19,8),
    FX_Rate DECIMAL(19,8),
    Growth_Weight DECIMAL(19,8),
    Value_Weight DECIMAL(19,8),
    Bloomberg_ID VARCHAR(MAX),
    RIC VARCHAR(MAX),
    Exchange_Ticker VARCHAR(MAX),
    ISIN VARCHAR(MAX),
    SSB_ID VARCHAR(MAX),
    REIT_Flag VARCHAR(MAX),
    Weight DECIMAL(19,8),
    Shares DECIMAL(19,8)
);

BULK INSERT dbo.ICM
FROM 'C:\Documents and Settings\london\Desktop\ICM.txt'
WITH (
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n'
);
GO
This was a bit confusing at first, because I've never done it before, and also I was getting all kinds of errors, which turned out to be numbers in string fields and strings in number fields. Basically, the data that was given to me was totally screwed up, which compounded the problem exponentially. I finally got the correct data, and I'm all set now.
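For anyone else chasing errors like these, a hedged debugging sketch: BULK INSERT's ERRORFILE option (the error-file path here is a placeholder) writes rejected rows to a file so you can see exactly which values don't fit the column types:

BULK INSERT dbo.ICM
FROM 'C:\Documents and Settings\london\Desktop\ICM.txt'
WITH (
    FIRSTROW = 2,
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    ERRORFILE = 'C:\Temp\ICM_errors.log'  -- rejected rows are written here for inspection
);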
Thanks everyone!
Knowledge is the only thing that I can give you, and still retain, and we are both better off for it. -
SQL Server 2008 - RS - Bulk Insert
I'm trying to import some flat files into SQL Server using the following bulk insert:
CREATE TABLE #temp1 (
    [field1] [varchar](20) NOT NULL,
    [field2] [datetime] NOT NULL,
    [fields3] [varchar](100) NOT NULL
);

SELECT * FROM #temp1;

BULK INSERT #temp1
FROM 'c:\testestes.txt'
WITH (
    FIELDTERMINATOR = ';',
    ROWTERMINATOR = '\n',
    FIRSTROW = 1
);
GO
INSERT INTO dbo.teste1 ( M_nAME, [Date], Notes)
Select RTRIM(LTRIM([field1])), RTRIM(LTRIM([field2])), RTRIM(LTRIM([fields3])) From #temp1
IF EXISTS(SELECT * FROM #temp1) drop table #temp1
And here is an example of my flat file:
TESTES11;19-03-2015 16:03:07
However, some rows have a third column, like this:
TESTES12;27-03-2015 18:03:32;Request timed out.
And I'm having some issues importing the second and third columns into the table that I created (#temp1), because it doesn't let me import the datetime data.
One solution: import each line as a whole into a single staging table column, and process it further from the staging table.
Example of importing an entire line:
http://www.sqlusa.com/bestpractices2005/notepad/
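As a hedged sketch of that whole-line staging approach (the staging table name is a placeholder; this assumes the data contains no tab characters, the default field terminator, so each full line lands in the single column):

CREATE TABLE #staging (line VARCHAR(MAX));

BULK INSERT #staging
FROM 'c:\testestes.txt'
WITH (ROWTERMINATOR = '\n');

-- Then parse the two or three ';'-separated fields from #staging.line in T-SQL.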
Kalman Toth Database & OLAP Architect
SQL Server 2014 Database Design
New Book / Kindle: Beginner Database Design & SQL Programming Using Microsoft SQL Server 2014 -
Number of rows inserted is different in bulk insert using select statement
I am facing a problem with a bulk insert using a SELECT statement.
My SQL statement is like the one below.
strQuery :='INSERT INTO TAB3
(SELECT t1.c1,t2.c2
FROM TAB1 t1, TAB2 t2
WHERE t1.c1 = t2.c1
AND t1.c3 between 10 and 15 AND)' ....... some other conditions.
EXECUTE IMMEDIATE strQuery ;
These SQL statements are inside a procedure. And this procedure is called from C#.
The number of rows returned by the SELECT query is 70.
On the very first call of this procedure, the number of rows inserted using strQuery is 70.
But on the next call (in the same transaction) of the procedure, the number of rows inserted is only 50.
And if we keep calling this procedure, it will sometimes insert 70 rows, sometimes 50, etc. It is showing some inconsistency.
On my initial analysis I found that the default optimizer mode is ALL_ROWS. When I changed the optimizer mode to RULE, the issue does not occur.
Has anybody faced this kind of issue?
Can anyone tell what the reason for this issue might be? Any other workaround for this?
I am using Oracle 10g R2.
Edited by: user13339527 on Jun 29, 2010 3:55 AM
Edited by: user13339527 on Jun 29, 2010 3:56 AM
You have very likely concurrent transactions on the database:
>
By default, Oracle Database permits concurrently running transactions to modify, add, or delete rows in the same table, and in the same data block. Changes made by one transaction are not seen by another concurrent transaction until the transaction that made the changes commits.
>
If you want to make sure that the same query always retrieves the same rows in a given transaction, you need to use transaction isolation level SERIALIZABLE instead of READ COMMITTED, which is the default in Oracle.
Please read http://download.oracle.com/docs/cd/E11882_01/appdev.112/e10471/adfns_sqlproc.htm#ADFNS00204.
You can try to run your test with:
set transaction isolation level serializable;
If the problem is not solved, you need to search possible Oracle bugs on My Oracle Support with keywords like:
wrong results 10.2
Edited by: P. Forstmann on 29 June 2010 13:46