SQL 2012 Trigger bulk insert lock
Hi
I have built a proc that captures some invalid codes, and what I would like to do is use a trigger to send an email every time the table receives data.
;WITH MatchesCTE ( SK_Partial, Matchcode ) AS
(SELECT 1, '1234')
SELECT * INTO match FROM MatchesCTE
ALTER TRIGGER EmailInavildPartials1
ON [dbo].[match]
AFTER INSERT
AS
BEGIN
-- SET NOCOUNT ON added to prevent extra result sets from
-- interfering with SELECT statements.
IF (@@ROWCOUNT > 1)
BEGIN
SET NOCOUNT ON;
--SELECT GETDATE();
EXEC Send_email_Invalid_Partials1
END
END
ALTER PROCEDURE [dbo].[Send_email_Invalid_Partials1]
AS
DECLARE @p_body as nvarchar(max), @p_subject as nvarchar(max)
DECLARE @p_recipients as nvarchar(max), @p_profile_name as nvarchar(max), @p_query as nvarchar(max),
@p_subject_date as nvarchar(max), @fileName as nvarchar(max)
SET @p_profile_name = N'DEV'
SET @p_recipients = N'ROBERT@BLAH;'
Select @p_subject_date = (select DATENAME (weekday, getdate ()))+ ' ' + substring (convert(varchar,getdate(),12), 5,2) +'/' + substring (convert(varchar,getdate(),12), 3,2)+ '/' +
substring (convert(varchar,getdate(),12), 1,2)
SET @filename = 'INVALID MATCH CODES ' + CONVERT(CHAR(8),GETDATE(),112)+ '.csv'
SET @p_subject = @p_subject_date + N' INVALID MATCH CODES'
SET @p_body = 'Please see the invalid Partials List.'
SET @p_query = 'SET NOCOUNT ON;Select * FROM DATABASE..match '
EXEC msdb.dbo.sp_send_dbmail
@profile_name = @p_profile_name,
@recipients = @p_recipients,
@body = @p_body,
@body_format = 'HTML',
@subject = @p_subject,
@query = @p_query,
@attach_query_result_as_file = 1,
@importance = 'High',
@query_attachment_filename =@filename,
@query_result_separator = ',',
@query_result_no_padding = 1
you'll need to change
@p_profile_name
@p_recipients
@p_query
when you then run the following
INSERT INTO match
SELECT 1,'1234'
UNION
SELECT 1, '5678'
It locks. Is there any way around this? Any help would be great.
Thanks
While I can debate the wisdom of this approach in general, as well as the selectivity of your logic with respect to the trigger, the rows inserted, the contents of the entire table - I won't.
The short story is that this approach will not work as written. Your trigger fires within the context of an INSERT (presumably - perhaps a MERGE) statement that inserts some unknown number of rows into the target table. Your trigger logic then executes a procedure if more than one row (but not one or zero rows) is inserted. Your procedure then attempts to execute a query against the same table and dump ALL rows in the table into a file for the email. This "dumping" executes in a different process - sp_send_dbmail runs the attachment query on its own connection, which cannot read the rows still locked by your uncommitted insert - hence the blocking.
Is it correct that your procedure should send the email with an attachment of ALL rows, not just the inserted ones (while ignoring the insertion of just one row)? I can't say; it would be unusual, but stranger things have been done. To get
around the blocking you can use a locking hint as suggested - but that may not be what you want, depending on other factors (is anything else manipulating the contents of the table while this process is running? Are you certain of that answer after evaluating the
range of possibilities?).
Perhaps the best approach is a very different one. It appears that you insert rows into the table for a single purpose, so why not make that insert process the one that does the notification?
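To illustrate that last suggestion, here is a minimal sketch (not from the original reply; the procedure name and insert statement are hypothetical, and it assumes Database Mail is configured and the procedure is not called inside an outer transaction): do the insert and the notification in one procedure instead of a trigger.

```sql
-- Hypothetical sketch: insert + notify in one procedure, no trigger.
-- Because there is no trigger, the INSERT auto-commits before
-- sp_send_dbmail's separate session runs the attachment query,
-- so that session is not blocked.
CREATE PROCEDURE dbo.InsertAndNotifyMatches
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.match (SK_Partial, Matchcode)
    SELECT 1, '1234'
    UNION
    SELECT 1, '5678';

    IF @@ROWCOUNT > 1  -- only notify for multi-row loads, as in the trigger
    BEGIN
        EXEC msdb.dbo.sp_send_dbmail
            @profile_name = N'DEV',
            @recipients   = N'ROBERT@BLAH',
            @subject      = N'INVALID MATCH CODES',
            @body         = N'Please see the invalid Partials list.',
            @query        = N'SET NOCOUNT ON; SELECT * FROM dbo.match;',
            @attach_query_result_as_file = 1,
            @query_result_separator = ',';
    END
END
```

If the caller wraps this in an explicit transaction, the same blocking would return, so it should be called standalone.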
Similar Messages
-
SQL Server 2012 Express bulk insert flat file, 1 million rows with "" as delimiter
Hi,
I wanted to see if anyone can help me out. I am on SQL Server 2012 Express. I cannot use OPENROWSET because my system is x64 and my Microsoft Office suite is x32 (Microsoft.Jet.OLEDB.4.0).
So I used Import wizard and is not working either.
The only thing that let me import this large file, is:
CREATE TABLE #LOADLARGEFLATFILE
(
Column1 varchar(100),
Column2 varchar(100),
Column3 varchar(100),
Column4 nvarchar(max)
);
BULK INSERT #LOADLARGEFLATFILE
FROM 'C:\FolderBigFile\LARGEFLATFILE.txt'
WITH
(
FIRSTROW = 2,
FIELDTERMINATOR = '\t',
ROWTERMINATOR = '\n'
);
The problem with CREATE TABLE and BULK INSERT is that my flat file comes with text qualifiers - "". Is there a way to prevent the quotes "" from loading in the bulk insert? Below is the data.
Column1              Column2   Column3   Column4
"Socket Adapter"     8456AB    $4.25     "Item - Square Drive Socket Adapter | For "
"Butt Splice"        9586CB    $14.51    "Item - Butt Splice"
"Bleach"             6589TE    $27.30    "Item - Bleach | Size - 96 oz. | Container Type"
Ed,
Edwin Lopera

Hi lgnusLumen,
According to your description, you use BULK INSERT to import data from a data file to the SQL table. However, to be usable as a data file for bulk import, a CSV file must comply with the following restrictions:
1. Data fields never contain the field terminator.
2. Either none or all of the values in a data field are enclosed in quotation marks ("").
In your data file, the quotes aren't consistent. If you want to prevent the quotes "" from loading during the bulk insert, I recommend you use the SQL Server Import and Export Wizard, which is available in SQL Server Express; in the column mappings area it will allow you to strip the
double quotes from columns (you can review the following screenshot).
In other SQL Server versions, we can use SQL Server Integration Services (SSIS) to import data from a flat file (.csv) while removing the double quotes. For more information, you can review the following article.
http://www.mssqltips.com/sqlservertip/1316/strip-double-quotes-from-an-import-file-in-integration-services-ssis/
In addition, you can create a function to convert a CSV to a usable format for Bulk Insert. It will replace all field-delimiting commas with a new delimiter. You can then use the new field delimiter instead of a comma. For more information, see:
http://stackoverflow.com/questions/782353/sql-server-bulk-insert-of-csv-file-with-inconsistent-quotes
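As a further workaround (a sketch, not from the original replies): load the quoted values as-is with the BULK INSERT above, then strip the double quotes afterwards with REPLACE, assuming no column legitimately contains an embedded double quote.

```sql
-- Hypothetical follow-up step: remove the text-qualifier quotes after
-- the BULK INSERT into #LOADLARGEFLATFILE, assuming no value should
-- actually contain a double quote.
UPDATE #LOADLARGEFLATFILE
SET Column1 = REPLACE(Column1, '"', ''),
    Column4 = REPLACE(Column4, '"', '');
```

This avoids format files entirely, at the cost of one extra pass over the staging table.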
Regards,
Sofiya Li
TechNet Community Support -
BULK INSERT into View w/ Instead Of Trigger - DML ERROR LOGGING Issue
Oracle 10.2.0.4
I cannot figure out why I cannot get bulk insert errors to aggregate and allow the insert to continue when bulk inserting into a view with an INSTEAD OF trigger. Whether I use the LOG ERRORS clause or SQL%BULK_EXCEPTIONS, the insert works until it hits the first exception and then exits.
Here's what I'm doing:
1. I'm bulk inserting into a view with an INSTEAD OF trigger on it that performs the actual updating on the underlying table. This table is a child table with a foreign key constraint to a reference table containing the primary key. When the INSTEAD OF trigger attempts to insert a record into the child table, I get the following exception: 5:37:55 ORA-02291: integrity constraint (FK_TEST_TABLE) violated - parent key not found, which is expected; but the error should be logged in the error table and the rest of the inserts should complete. Instead the bulk insert exits.
2. If I change this to bulk insert into the underlying table directly, it works, all errors get put into the error logging table and the insert completes all non-exception records.
Here's the "test" procedure I created to test my scenario:
View: V_TEST_TABLE
Underlying Table: TEST_TABLE
PROCEDURE BulkTest
IS
TYPE remDataType IS TABLE of v_TEST_TABLE%ROWTYPE INDEX BY BINARY_INTEGER;
varRemData remDataType;
begin
select /*+ DRIVING_SITE(r)*/ *
BULK COLLECT INTO varRemData
from TEST_TABLE@REMOTE_LINK
where effectiveday < to_date('06/16/2012 04','mm/dd/yyyy hh24')
and terminationday > to_date('06/14/2012 04','mm/dd/yyyy hh24');
BEGIN
FORALL idx IN varRemData.FIRST .. varRemData.LAST
INSERT INTO v_TEST_TABLE VALUES varRemData(idx) LOG ERRORS INTO dbcompare.ERR$_TEST_TABLE ('INSERT') REJECT LIMIT UNLIMITED;
EXCEPTION WHEN others THEN
DBMS_OUTPUT.put_line('ErrorCode: '||SQLCODE);
END;
COMMIT;
end;
I've reviewed Oracle's documentation on both DML logging tools and neither has any restrictions (at least that I can see) that would prevent this from working correctly.
Any help would be appreciated....
Thanks,
Steve

Thanks. Obviously this is my first post; I'm desperate to figure out why this won't work.
The code I sent is only a test proc to try to troubleshoot the issue; the version with the debug statement is only there to capture the insert failing without aggregating the errors - that won't be in the real proc.
Thanks,
Steve -
ODBC, bulk inserts and dynamic SQL
I am writing an application running on Windows NT 4, using the Oracle ODBC driver (8.01.05.00), that inserts many rows at a time (10000+) into an Oracle 8i database.
At present, I am using a stored procedure to insert each row into the database. The stored procedure uses dynamic SQL because I can only determine the table and field names at run time.
Due to the large number of records, it tends to take a while to perform all the inserts. I have tried a number of solutions, such as using batches of SQL statements (e.g. "INSERT...;INSERT...;INSERT..."), but the Oracle ODBC driver only seems to act on the first statement in the batch.
I have also considered the FORALL statement and the SQL*Loader utility.
My problem with FORALL is that I'm not sure it works on dynamic SQL statements, and even if it did, how do I pass an array of statements to the stored procedure?
I ruled out SQL*Loader because I could not find a way to invoke it from an ODBC statement. Secondly, it requires the spawning of a new process.
What I am really after is something similar to the SQL Server (forgive me!) BULK INSERT statement, where you can simply create an input file with all the records you want to insert and pass it along in an ODBC statement such as "BULK INSERT <filename>".
Any ideas??
Hi,
I faced this same situation years ago (Oracle 7.2!) and had the following alternatives.
1) Use a 3rd party tool such as Sagent or CA Info pump (very pricey $$$)
2) Use VisualC++ and OCI to hook into the array insert routines (there are examples of these in the Oracle Home).
3) Use SQL*Loader (the best performance, but no real control of what's happening).
I ended up using (2) and used the Rogue Wave dbtools.h++ library to speed up the development.
These days, I would also suggest you take a look at Perl on NT (www.activestate.com) and the DBlib modules at www.perl.org. I believe they will also do bulk loading.
Your problem is that your program is using Oracle ODBC, when you should be using Oracle OCI for best performance.
null -
How to change Bulk Insert statement from MS SQL to Oracle
Hi All,
Good day. I would like to bulk insert the content of a file into an Oracle db. May I know how to change the below MS SQL syntax to Oracle syntax?
Statement statement = objConnection.createStatement();
statement.execute("BULK INSERT [TBL_MERCHANT] FROM '" + MERCHANT_FILE_DIR + "' WITH ( FIELDTERMINATOR = '~~', ROWTERMINATOR = '##' )");
Thanks in advance.
cs.

Oracle's SQL*Loader utility allows you to insert data from a flat file into database tables.
Go to SQL Loader links on following url to learn more on this utility
http://otn.oracle.com/docs/products/oracle9i/doc_library/release2/server.920/a96652/toc.htm
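For completeness (not from the original reply): besides SQL*Loader, an Oracle external table gives BULK INSERT-like behavior directly in SQL. A sketch, with hypothetical column names, assuming a directory object the database can read and the '~~'/'##' terminators from the question:

```sql
-- Hypothetical sketch: an external table as the Oracle analogue of
-- BULK INSERT. data_dir must be a directory object with read access.
CREATE TABLE tbl_merchant_ext (
    merchant_id   NUMBER,
    merchant_name VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
    TYPE ORACLE_LOADER
    DEFAULT DIRECTORY data_dir
    ACCESS PARAMETERS (
        RECORDS DELIMITED BY '##'
        FIELDS TERMINATED BY '~~'
    )
    LOCATION ('merchant.txt')
);

-- A plain INSERT ... SELECT then replaces the BULK INSERT statement:
INSERT INTO tbl_merchant SELECT * FROM tbl_merchant_ext;
```

The file is parsed at query time, so no separate load step is needed.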
Chandar -
SQL Server 2008 - RS - Loop of multiple Bulk Inserts
Hi,
I want to import multiple flat files to a table on SQL Server 2008 R2. However, I don't have access to Integration Services to use a foreach loop, so I'm doing the process using T-SQL. At the moment I'm manually coding which file to load into the tables. My code is like this:
CREATE TABLE #temporaryTable
(
[column1] varchar(100) NOT NULL,
[column2] varchar(100) NOT NULL
);

BULK INSERT #temporaryTable
FROM 'C:\Teste\testeFile01.txt'
WITH
(
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n',
FIRSTROW = 1
);
GO
BULK INSERT #temporaryTable
FROM 'C:\Teste\testeFile02.txt'
WITH
(
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n',
FIRSTROW = 1
);
GO
INSERT INTO dbo.TESTE (Col_1, Col_2)
SELECT RTRIM(LTRIM([column1])), RTRIM(LTRIM([column2])) FROM #temporaryTable;

IF EXISTS (SELECT * FROM #temporaryTable) DROP TABLE #temporaryTable;
The problem is that I have 20 flat files to insert... Is there a loop solution in T-SQL to insert all the flat files into the same table?
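As an editorial sketch of a T-SQL-only loop (assuming the files follow the C:\Teste\testeFileNN.txt naming pattern from the code above, with NN running 01 to 20), the BULK INSERT can be built dynamically:

```sql
-- Hypothetical sketch: loop over numbered flat files with dynamic SQL.
-- Assumes #temporaryTable already exists in this session and that the
-- file names follow the testeFileNN.txt pattern.
DECLARE @i INT = 1, @sql NVARCHAR(MAX);

WHILE @i <= 20
BEGIN
    SET @sql = N'BULK INSERT #temporaryTable
FROM ''C:\Teste\testeFile' + RIGHT('0' + CAST(@i AS VARCHAR(2)), 2) + N'.txt''
WITH (FIELDTERMINATOR = '';'', ROWTERMINATOR = ''\n'', FIRSTROW = 1);';
    EXEC sp_executesql @sql;
    SET @i += 1;
END
```

A temp table created in the session is visible inside the dynamic SQL, so all 20 files land in the same table.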
Thanks!

Here is a working sample of a PowerShell script I adapted from the internet (I don't have the source handy now).
Import-Module -Name 'SQLPS' -DisableNameChecking
$workdir = "C:\temp\test\"
$svrname = "MC\MySQL2014"
Try
{
    # Change default timeout from 600 to unlimited
    $svr = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $svrname
    $svr.ConnectionContext.StatementTimeout = 0
    $table = "test1.dbo.myRegions"

    # Remove the filename column from the target table if it exists
    $q1 = @"
Use test1;
IF COL_LENGTH('dbo.myRegions','filename') IS NOT NULL
BEGIN
ALTER TABLE test1.dbo.myRegions DROP COLUMN filename;
END
"@
    Invoke-Sqlcmd -ServerInstance $svr.Name -Database master -Query $q1

    $dt = (Get-Date).ToString("yyyMMdd")
    $formatfilename = "$($table)_$($dt).xml"
    $destination_formatfilename = "$($workdir)$($formatfilename)"
    $cmdformatfile = "bcp $table format nul -c -x -f $($destination_formatfilename) -T -t\t -S $($svrname) "
    Invoke-Expression $cmdformatfile

    # Delay 1 second
    Start-Sleep -s 1

    # Add the filename column back to the target table
    $q2 = @"
Alter table test1.dbo.myRegions Add filename varchar(500) Null;
"@
    Invoke-Sqlcmd -ServerInstance $svr.Name -Database master -Query $q2

    $files = Get-ChildItem $workdir
    $items = $files | Where-Object {$_.Extension -eq ".txt"}
    for ($i = 0; $i -lt $items.Count; $i++) {
        $strFileName = $items[$i].Name
        $strFileNameNoExtension = $items[$i].BaseName
        $query = @"
BULK INSERT test1.dbo.myRegions from '$($workdir)$($strFileName)' WITH (FIELDTERMINATOR = '\t', FIRSTROW = 2, FORMATFILE = '$($destination_formatfilename)');
"@
        Invoke-Sqlcmd -ServerInstance $svr.Name -Database master -Query $query -QueryTimeout 65534

        # Delay 10 seconds
        Start-Sleep -s 10

        # Update the filename column
        Invoke-Sqlcmd -ServerInstance $svr.Name -Database master -QueryTimeout 65534 -Query "Update test1.dbo.myRegions SET filename = '$($strFileName)' WHERE filename is null;"

        # Move the uploaded file to the archive
        If ((Test-Path "$($workdir)$($strFileName)") -eq $True) { Move-Item -Path "$($workdir)$($strFileName)" -Destination "$($workdir)Processed\$($strFileNameNoExtension)_$($dt).txt" }
    }
}
Catch [Exception]
{
    Write-Host "--$strFileName " $_.Exception.Message
}
-
SQL Server 2008 - RS -Bulk Insert
I'm trying to import some flat files to SQL using the following bulk insert:
CREATE TABLE #temp1
(
[field1] varchar(20) NOT NULL,
[field2] datetime NOT NULL,
[fields3] varchar(100) NOT NULL
);

BULK INSERT #temp1
FROM 'c:\testestes.txt'
WITH
(
FIELDTERMINATOR = ';',
ROWTERMINATOR = '\n',
FIRSTROW = 1
);
GO
SELECT * FROM #temp1;

INSERT INTO dbo.teste1 (M_nAME, [Date], Notes)
SELECT RTRIM(LTRIM([field1])), RTRIM(LTRIM([field2])), RTRIM(LTRIM([fields3])) FROM #temp1;

IF EXISTS (SELECT * FROM #temp1) DROP TABLE #temp1;
And here is an example of my flat file:
TESTES11;19-03-2015 16:03:07
However, some rows have a third column with this aspect:
TESTES12;27-03-2015 18:03:32;Request timed out.
And I'm having some issues importing the second and third columns into the table that I created (#temp1), because it doesn't allow me to import the datetime data.

One solution: import each line as a whole into a single staging table column, and process it further from the staging table.
Example of importing an entire line:
http://www.sqlusa.com/bestpractices2005/notepad/
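The whole-line staging technique can be sketched as follows (file name from the question; column names hypothetical; assumes every line has at least two ';'-separated fields and a fixed-width 19-character datetime):

```sql
-- Hypothetical sketch of the staging-table approach: load each raw
-- line into one wide column, then split on ';' in T-SQL, so the
-- optional third field no longer breaks the datetime column.
CREATE TABLE #staging (line varchar(4000));

BULK INSERT #staging
FROM 'c:\testestes.txt'
WITH (ROWTERMINATOR = '\n');  -- no field terminator: whole line per row

SELECT
    LEFT(line, CHARINDEX(';', line) - 1)                          AS M_nAME,
    SUBSTRING(line, CHARINDEX(';', line) + 1, 19)                 AS [Date],
    NULLIF(SUBSTRING(line, CHARINDEX(';', line) + 21, 4000), '')  AS Notes
FROM #staging;
-- [Date] is still varchar here; convert with CONVERT(datetime, ..., 105)
-- for the dd-mm-yyyy format if needed.
```

Rows without a third field simply yield a NULL Notes value instead of a conversion error.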
Kalman Toth Database & OLAP Architect
SQL Server 2014 Database Design
New Book / Kindle: Beginner Database Design & SQL Programming Using Microsoft SQL Server 2014 -
First Row Record is not inserted from CSV file while bulk insert in sql server
Hi Everyone,
I have a csv file that needs to be inserted into SQL Server. The csv file format will be like below.
1,Mr,"x,y",4
2,Mr,"a,b",5
3,Ms,"v,b",6
During bulk insert it considers the 2nd column as two values (comma-separated) and makes two entries, so I used FieldTerminator.xml.
Now the fields are entered into the columns correctly. But the problem is, the first row of the csv file is not read into SQL Server. When I removed the terminator, I got all the records, but I must use the above terminator. When
I use it, I don't get the first row record.
Please suggests me some solution.
Thanks,
Selvam

Hi,
I have a csv file (comma(,) delimited) like this which is to be insert to sql server. The format of the file when open in notepad like below:
Id,FirstName,LastName,FullName,Gender
1,xx,yy,"xx,yy",M
2,zz,cc,"zz,cc",F
3,aa,vv,"aa,vv",F
Below is the bulk insert query used to insert the above records:
EXEC('BULK INSERT EmployeeData FROM ''' + @FilePath + ''' WITH
(FORMATFILE = ''d:\FieldTerminator.xml'',
ROWTERMINATOR = ''\n'',
FIRSTROW = 2)')
Here, I have used a format file for the "FullName" field, which has a comma (,) within it. The format file is:
The problem is, it skips the first record (1,xx,yy,"xx,yy",M) when I use the format file. When I remove the format file from the query, it takes all the records, but then the "FullName" field causes a problem because of the comma (,) within the
field. So I must use the format file to handle this. Please suggest why the first record is always skipped when I use the above format file.
If I give FIRSTROW = 1 in the bulk insert, it shows the error "String or binary data would be truncated.
The statement has been terminated." I have checked the datatype lengths.
Please update me the solution.
Regards,
Selvam. M -
Using Bulk insert or SQL Loader in VB6
Hi,
I am quite new to the Oracle world and also to forums, but I am looking for some direction on how to get a dataset of 10000 records into a table the quickest way. I have the recordset in an ADO Recordset (or a text file if that is easier) and I want to insert the records into an empty Oracle table. The problem is - I don't know how.
Situation
The Oracle DB is on another computer I have nothing special installed on the computer running the VB6 application.
Can anyone please provide code example or guidelines...
Regards,
Christian

This may not be "bulk insert" by your definition, but it can transfer data as you want.
A simple VB code for demo purpose:
Dim con As New ADODB.Connection
Dim con2 As New ADODB.Connection
Dim rst As New ADODB.Recordset
Dim rst2 As New ADODB.Recordset
Dim rst3 As New ADODB.Recordset
con.ConnectionString = "Provider=OraOLEDB.Oracle.1;User ID=scott;Password=tiger;Data Source=db_one;"
con.Open
rst.Open "select * from dept", con, adOpenDynamic, adLockOptimistic
'save to a file using ADTG format. You may choose other format.
rst.Save "c:\myfile.txt", adPersistADTG
'dept2 is an empty table with the same table definition as dept. You can create it using SQL*Plus.
'add rows by reading from the saved file.
con2.ConnectionString = "Provider=OraOLEDB.Oracle.1;User ID=xyz;Password=xyz;Data Source=db_two;"
con2.Open
'open the saved file
rst2.Open "c:\myfile.txt"
'rst3 is an empty recordset because dept2 is empty at this time.
rst3.Open "select * from dept2", con2, adOpenDynamic, adLockOptimistic
'adding rows into dept2.
Do Until rst2.EOF
rst3.AddNew Array("deptno", "dname", "loc"), Array(rst2.Fields("deptno"), rst2.Fields("dname"), rst2.Fields("loc"))
rst2.MoveNext
Loop
rst.Close
rst2.Close
rst3.Close
con.Close
con2.Close
Sinclair -
If you have a table with a unique index and ignore_dup_key = on and you INSERT rows into that table with an ORDER BY clause (because you want to control which of the duplicate
key rows gets inserted), the wrong row gets inserted in SQL2012. It works correctly in SQL 2008.
We have recently migrated a database from SQL 2008 to SQL 2012. We do have a few other dbs which are in compatibility mode 100. The above operation works fine
in SQL 2008 dbs but not in SQL 2012.
I've even tried applying the latest patch for SQL 2012 SP2 with CU2. Still the problem exists. I'm going to call MS support, but want to know if anyone has come across this problem?

The MS documentation doesn't guarantee that the first row of the duplicates will always be inserted and the next duplicate row(s) ignored. Where did you find that in the MS documentation? I think you were just lucky that it was always inserting the
first row in SQL 2008 (and ignoring the rest of the duplicates) - I don't think this is guaranteed.
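Given that the IGNORE_DUP_KEY + ORDER BY behavior is not guaranteed, a deterministic alternative can be sketched (table and column names hypothetical): pick the surviving row per key explicitly with ROW_NUMBER() before inserting.

```sql
-- Hypothetical sketch: choose exactly one row per duplicate key
-- deterministically, instead of relying on insert order with
-- IGNORE_DUP_KEY. preference_col encodes which duplicate should win.
;WITH ranked AS (
    SELECT key_col, payload,
           ROW_NUMBER() OVER (PARTITION BY key_col
                              ORDER BY preference_col) AS rn
    FROM dbo.source_table
)
INSERT INTO dbo.target_table (key_col, payload)
SELECT key_col, payload
FROM ranked
WHERE rn = 1;  -- only the preferred duplicate is inserted
```

This removes the dependence on undocumented engine behavior, so it works the same on SQL 2008 and SQL 2012.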
Satish Kartan http://www.sqlfood.com/ -
Dear Experts,
In our SCOM 2007 R2 environment, SQL 2012 DB Engine "Login failed: Account locked out" alerts are not received, but we are receiving the following alerts for the DB instance.
1. Database Backup Failed To Complete
2. Login failed: Password expired
3. Log Backup Failed to Complete
4. Login failed: Password cannot be used at this time
5. Login failed: Password must be changed
6. IS Package Failed.
Why are we not receiving the "Login failed: Account locked out" alerts? Customers are asking for the notification email alert for this rule. I have checked the override settings; everything is enabled by default, same as the above rules.
What can be the issue here ?
Thanks,
Saravana
Saravana Raja

Hi,
Could you please check the Windows security log for (MSSQLSERVER) event ID 18486? The rule should rely on this event.
Regards,
Yan Li
Please remember to mark the replies as answers if they help and unmark them if they provide no help. -
[Forum FAQ] How to use multiple field terminators in BULK INSERT or BCP command line
Introduction
Some people want to know if we can have multiple field terminators in BULK INSERT or BCP commands, and how to implement multiple field terminators in BULK INSERT or BCP commands.
Solution
For character data fields, optional terminating characters allow you to mark the end of each field in a data file with a field terminator, as well as the end of each row with a row terminator. If a terminator character occurs within the data, it is interpreted
as a terminator, not as data, and the data after that character belongs to the next field or record. I have done a test: if you use BULK INSERT or BCP commands and want to set a multi-character field terminator, you can refer to the following commands.
In Windows command line,
bcp <Databasename.schema.tablename> out "<path>" -c -t <field_terminator> -r <row_terminator> -T
For example, you can export data from the Department table with the bcp command and use the comma and colon (,:) together as one field terminator:
bcp AdventureWorks.HumanResources.Department out C:\myDepartment.txt -c -t ,: -r \n -T
The txt file as follows:
However, if you try to bcp with multiple -t options, as in the following command, bcp will still use only the last terminator defined:
bcp AdventureWorks.HumanResources.Department in C:\myDepartment.txt -c -t , -r \n -t: -T
The txt file as follows:
Note that multiple consecutive field terminators mean multiple (empty) fields. In the comma-separated format below,
column1,,column2,,,column3
you might expect only 3 fields (column1, column2 and column3); in fact, after testing, there are 6 fields here. That is the significance of a field terminator (a comma in this case): every terminator ends a field, even an empty one.
Meanwhile, when using BULK INSERT to import the data file into a SQL table, you can only set one terminator (which may consist of multiple characters) in the BULK INSERT statement.
USE <testdatabase>;
GO
BULK INSERT <your table> FROM '<path>'
WITH (
DATAFILETYPE = 'char | native | widechar | widenative',
FIELDTERMINATOR = '<field_terminator>'
);
For example, using BULK INSERT to import the data of C:\myDepartment.txt data file into the DepartmentTest table, the field terminator (,:) must be declared in the statement.
In SQL Server Management Studio Query Editor:
BULK INSERT AdventureWorks.HumanResources.DepartmentTest FROM 'C:\myDepartment.txt'
WITH (
DATAFILETYPE = 'char',
FIELDTERMINATOR = ',:'
);
The new table then contains rows like the following:
We cannot declare multiple FIELDTERMINATOR options (, and :) in the query statement, as in the following format; a duplicate-option error will occur.
In SQL Server Management Studio Query Editor:
BULK INSERT AdventureWorks.HumanResources.DepartmentTest FROM 'C:\myDepartment.txt'
WITH (
DATAFILETYPE = 'char',
FIELDTERMINATOR = ',',
FIELDTERMINATOR = ':'
);
However, if you want to use a data file with fewer or more fields than the table, you can handle it by setting the extra field length to 0 for fewer fields, or by omitting or skipping extra fields during the bulk copy procedure.
More Information
For more information about field terminators, you can review the following articles.
http://technet.microsoft.com/en-us/library/aa196735(v=sql.80).aspx
http://social.technet.microsoft.com/Forums/en-US/d2fa4b1e-3bd4-4379-bc30-389202a99ae2/multiple-field-terminators-in-bulk-insert-or-bcp?forum=sqlgetsta
http://technet.microsoft.com/en-us/library/ms191485.aspx
http://technet.microsoft.com/en-us/library/aa173858(v=sql.80).aspx
http://technet.microsoft.com/en-us/library/aa173842(v=sql.80).aspx
Applies to
SQL Server 2012
SQL Server 2008R2
SQL Server 2005
SQL Server 2000
Please click to vote if the post helps you. This can be beneficial to other community members reading the thread.

Thanks,
Is this a supported scenario, or does it use unsupported features?
For example, can we call exec [ReportServer].dbo.AddEvent @EventType='TimedSubscription', @EventData='b64ce7ec-d598-45cd-bbc2-ea202e0c129d'
in a supported way?
Thanks! Josh -
I don't know if this has been posted before. I have look around and could not find any similar question.
I have written a stored proc that will do a bulk load using the BULK INSERT command, but I am getting error msg 4861. Below is my full error msg and my code. Can someone advise what I am doing wrong? The SQL Server engine is on a totally different server from
my text file, but they are all on the same network.
use test_sp
go
Declare @path nvarchar(max)
declare @str nvarchar(1000)
declare @Fulltblname varchar(1000)
Set @path = '\\myservername\ShareName\Path\FileName.txt'
Set @Fulltblname = 'table1'
--bulk load the table with raw data
Set @str = 'BULK INSERT [dbo].[' + @Fulltblname + ']
FROM ' + char(39) + @path + char(39) + '
WITH
(
FIELDTERMINATOR = ''|'',
FIRSTROW = 1,
ROWTERMINATOR = ''\n'',
MAXERRORS = 0
)'
Exec sp_executesql @str
The errors I'm getting are below:
Msg 4861, Level 16, State 1, Line 1
Cannot bulk load because the file "\\myservername.domainname\ShareName\Path\FileName.txt" could not be opened. Operating system error code 5 (Access is denied.).
Mail queued.

Hi,
Try the links below:
http://blogs.msdn.com/b/dataaccesstechnologies/archive/2012/03/22/10082977.aspx
http://blogs.msdn.com/b/jay_akhawri/archive/2009/02/16/resolving-operating-system-error-code-5-with-bulk-insert-a-different-perspective.aspx
http://stackoverflow.com/questions/14555262/cannot-bulk-load-operating-system-error-code-5-access-is-denied
sathya - www.allaboutmssql.com ** Mark as answered if my post solved your problem and Vote as helpful if my post was useful **. -
SQL merge and after insert or update on ... for each row fires too often?
Hello,
there is a base table which has a companion history table
- let's say USER_DATA & USER_DATA_HIST.
For each update on USER_DATA, the old state of the USER_DATA record has to be recorded in USER_DATA_HIST (insert a new record)
- to have the history of changes to USER_DATA.
The first approach was to do the insert in a row trigger:
trigger user_data_tr_aiu after insert or update on user_data for each row
But the performance was bad, because a bulk update to USER_DATA produced individual inserts per record.
So I tried a trick:
Instead of doing the real insert into USER_DATA_HIST, I collect the USER_DATA_HIST data into a pl/sql collection first,
and later I do a bulk insert for the collection into the USER_DATA_HIST table with a statement trigger:
trigger user_data_tr_ra after insert or update on user_data
But sometimes I notice that the list of entries saved in the pl/sql collection contains more entries than the number of USER_DATA records being updated.
(BTW, for the update I use SQL MERGE, because it's driven by another table.)
As there is a unique tracking_id in the USER_DATA record, I could identify that there are duplicates.
If I sort by tracking_id and remove duplicates, I get exactly the number of records updated by the SQL MERGE.
So how come there are duplicates?
I can try to make a sample 'sqlplus' program, but it will take some time.
But maybe somebody already knows about some issues here(?!)
- many thanks!
best regards,
Frank

Hello
Not sure really. Although it shouldn't take long to do a test case - it only took me 10 mins....
SQL>
SQL> create table USER_DATA
2 ( id number,
3 col1 varchar2(100)
4 )
5 /
Table created.
SQL>
SQL> CREATE TABLE USER_DATA_HIST
2 ( id number,
3 col1 varchar2(100),
4 tmsp timestamp
5 )
6 /
Table created.
SQL>
SQL> CREATE OR REPLACE PACKAGE pkg_audit_user_data
2 IS
3
4 PROCEDURE p_Init;
5
6 PROCEDURE p_Log
7 ( air_UserData IN user_data%ROWTYPE
8 );
9
10 PROCEDURE p_Write;
11 END;
12 /
Package created.
SQL> CREATE OR REPLACE PACKAGE BODY pkg_audit_user_data
2 IS
3
4 TYPE tt_UserData IS TABLE OF user_data_hist%ROWTYPE INDEX BY BINARY_INTEGER;
5
6 pt_UserData tt_UserData;
7
8 PROCEDURE p_Init
9 IS
10
11 BEGIN
12
13
14 IF pt_UserData.COUNT > 0 THEN
15
16 pt_UserData.DELETE;
17
18 END IF;
19
20 END;
21
22 PROCEDURE p_Log
23 ( air_UserData IN user_data%ROWTYPE
24 )
25 IS
26 ln_Idx BINARY_INTEGER;
27
28 BEGIN
29
30 ln_Idx := pt_UserData.COUNT + 1;
31
32 pt_UserData(ln_Idx).id := air_UserData.id;
33 pt_UserData(ln_Idx).col1 := air_UserData.col1;
34 pt_UserData(ln_Idx).tmsp := SYSTIMESTAMP;
35
36 END;
37
38 PROCEDURE p_Write
39 IS
40
41 BEGIN
42
43 FORALL li_Idx IN INDICES OF pt_UserData
44 INSERT
45 INTO
46 user_data_hist
47 VALUES
48 pt_UserData(li_Idx);
49
50 END;
51 END;
52 /
Package body created.
SQL>
SQL> CREATE OR REPLACE TRIGGER preu_s_user_data BEFORE UPDATE ON user_data
2 DECLARE
3
4 BEGIN
5
6 pkg_audit_user_data.p_Init;
7
8 END;
9 /
Trigger created.
SQL> CREATE OR REPLACE TRIGGER preu_r_user_data BEFORE UPDATE ON user_data
2 FOR EACH ROW
3 DECLARE
4
5 lc_Row user_data%ROWTYPE;
6
7 BEGIN
8
9 lc_Row.id := :NEW.id;
10 lc_Row.col1 := :NEW.col1;
11
12 pkg_audit_user_data.p_Log
13 ( lc_Row
14 );
15
16 END;
17 /
Trigger created.
SQL> CREATE OR REPLACE TRIGGER postu_s_user_data AFTER UPDATE ON user_data
2 DECLARE
3
4 BEGIN
5
6 pkg_audit_user_data.p_Write;
7
8 END;
9 /
Trigger created.
SQL>
SQL>
SQL> insert
2 into
3 user_data
4 select
5 rownum,
6 dbms_random.string('u',20)
7 from
8 dual
9 connect by
10 level <=10
11 /
10 rows created.
SQL> select * from user_data
2 /
ID COL1
1 GVZHKXSSJZHUSLLIDQTO
2 QVNXLTGJXFUDUHGYKANI
3 GTVHDCJAXLJFVTFSPFQI
4 CNVEGOTDLZQJJPVUXWYJ
5 FPOTZAWKMWHNOJMMIOKP
6 BZKHAFATQDBUVFBCOSPT
7 LAQAIDVREFJZWIQFUPMP
8 DXFICIPCBCFTPAPKDGZF
9 KKSMMRAQUORRPUBNJFCK
10 GBLTFZJAOPKFZFCQPGYW
10 rows selected.
SQL> select * from user_data_hist
2 /
no rows selected
SQL>
SQL> MERGE
2 INTO
3 user_data a
4 USING
5 ( SELECT
6 rownum + 8 id,
7 dbms_random.string('u',20) col1
8 FROM
9 dual
10 CONNECT BY
11 level <= 10
12 ) b
13 ON (a.id = b.id)
14 WHEN MATCHED THEN
15 UPDATE SET a.col1 = b.col1
16 WHEN NOT MATCHED THEN
17 INSERT(a.id,a.col1)
18 VALUES (b.id,b.col1)
19 /
10 rows merged.
SQL> select * from user_data_hist
2 /
ID COL1 TMSP
9 XGURXHHZGSUKILYQKBNB 05-AUG-11 10.04.15.577989
10 HLVUTUIFBAKGMXBDJTSL 05-AUG-11 10.04.15.578090
SQL> select * from v$version
2 /
BANNER
Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bi
PL/SQL Release 10.2.0.4.0 - Production
CORE 10.2.0.4.0 Production
TNS for Linux: Version 10.2.0.4.0 - Production
NLSRTL Version 10.2.0.4.0 - Production

HTH
David