BCP-style bulk insert from remote C++ ODBC Native client application

I am trying to find documentation or sample code for performing bulk inserts into SQL Server 2012 from a remote client using the ODBC Native Client driver on Linux.  We currently perform INSERT statements on blocks of data, wrapping them in BEGIN/COMMIT,
and achieve roughly half the throughput of bcp reading from a delimited text file.  While there are many web pages talking about bulk inserts via the native driver, this page (http://technet.microsoft.com/en-us/library/ms130792.aspx) seems closest to
what I'm after, but it doesn't go into any detail or give API calls.  The referenced header file is just a bunch of options and constants, so presumably one gains access to the bulk functions via the standard ODBC mechanism; the question is how.
For clarity, I am NOT interested in:
BULK INSERT: requires a server-side data file or a UNC path with appropriate permissions (doesn't work from Linux)
INSERT ... SELECT * FROM OPENROWSET(BULK ...): same problem as above
IRowsetFastLoad: OLE DB, but I need ODBC on Linux.
Basically, I want to emulate BCP.  I don't want to *run* BCP because it requires landing data to disk. 
Thanks
john
John Lilley Chief Architect RedPoint Global Inc.

Other than block inserts within BEGIN/COMMIT transaction blocks or running bcp, is there anything else that can be done on Linux?
No other option from Linux that I am aware of.  The SQL Server Native Client ODBC driver also supports table-valued parameters, which can be used to stream data, but the Linux ODBC driver API doesn't have a way to do that either.  That said, I would
still expect file-based BCP to significantly outperform inserts with large batches.  I've seen a rate of 100K rows/sec with this technique, including the file-creation overhead, but much depends on the particulars of your use case.
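For what it's worth, the closest ODBC-level technique I can point at is parameter-array binding (SQL_ATTR_PARAMSET_SIZE plus SQLBindParameter over arrays), which is standard ODBC 3.x rather than a driver-specific bulk API.  Below is a minimal sketch; the table, columns, and batch size are made-up placeholders, error checking is omitted, and whether the Linux driver turns the array into a genuinely bulk operation (rather than looping internally) is an assumption you would have to verify by measuring.

// Hedged sketch: batched INSERT via ODBC parameter arrays (column-wise binding).
// MyTable, its columns, and the batch size are placeholders, not from this thread.
#include <sql.h>
#include <sqlext.h>
#include <vector>

void insert_batch(SQLHDBC hdbc)
{
    const SQLULEN BATCH = 10000;               // rows sent per SQLExecute
    std::vector<SQLINTEGER> ids(BATCH);
    std::vector<SQLCHAR>    names(BATCH * 51); // 50 chars + terminator per row
    std::vector<SQLLEN>     nameLen(BATCH);
    // ... fill ids, names, and nameLen (byte length of each name) here ...

    SQLHSTMT hstmt;
    SQLAllocHandle(SQL_HANDLE_STMT, hdbc, &hstmt);

    // Tell the driver the bound buffers are arrays of BATCH elements.
    SQLSetStmtAttr(hstmt, SQL_ATTR_PARAM_BIND_TYPE, (SQLPOINTER)SQL_PARAM_BIND_BY_COLUMN, 0);
    SQLSetStmtAttr(hstmt, SQL_ATTR_PARAMSET_SIZE, (SQLPOINTER)BATCH, 0);

    SQLBindParameter(hstmt, 1, SQL_PARAM_INPUT, SQL_C_SLONG, SQL_INTEGER,
                     0, 0, ids.data(), 0, NULL);
    SQLBindParameter(hstmt, 2, SQL_PARAM_INPUT, SQL_C_CHAR, SQL_VARCHAR,
                     50, 0, names.data(), 51, nameLen.data());

    // One prepared statement, executed once per batch of BATCH rows.
    SQLPrepare(hstmt, (SQLCHAR*)"INSERT INTO MyTable (id, name) VALUES (?, ?)", SQL_NTS);
    SQLExecute(hstmt);

    SQLFreeHandle(SQL_HANDLE_STMT, hstmt);
}

Pairing this with autocommit off (SQL_ATTR_AUTOCOMMIT) and an SQLEndTran commit every few batches keeps the transaction overhead down, much like the BEGIN/COMMIT wrapping you already do.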
Consider voting for this on Connect.  BCP is on the roadmap but no date yet: 
https://connect.microsoft.com/SQLServer/SearchResults.aspx?SearchQuery=linux+odbc+bcp
Also, I filed a Connect item for TVP support:
https://connect.microsoft.com/SQLServer/feedback/details/874616/add-tvp-support-to-sql-server-odbc-driver-for-linux
Dan Guzman, SQL Server MVP, http://www.dbdelta.com

Similar Messages

  • BULK INSERT from a text (.csv) file - read only specific columns.

    I am using Microsoft SQL Server 2005, and I need to do a BULK INSERT from a .csv file I just downloaded from PayPal.  I can't edit some of the columns that are given in the report.  I am trying to load specific columns from the file.
    BULK INSERT Orders
    FROM 'C:\Users\*******\Desktop\DownloadURL123.csv'
    WITH (
        FIELDTERMINATOR = ',',
        FIRSTROW = 2,
        ROWTERMINATOR = '\n'
    )
    So where would I specify which column names (from row 1 of the .csv file) map to which specific columns in the table?
    I saw this on one of the sites, which seemed to guide me towards the answer, but I failed.  Here you go, it might help you:
    FORMATFILE [ = 'format_file_path' ]
    Specifies the full path of a format file. A format file describes the data file that contains stored responses created using the bcp utility on the same table or view. The format file should be used in cases in which:
    The data file contains greater or fewer columns than the table or view.
    The columns are in a different order.
    The column delimiters vary.
    There are other changes in the data format. Format files are usually created by using the bcp utility and modified with a text editor as needed. For more information, see bcp Utility.

    Date, Time, Time Zone, Name, Type, Status, Currency, Gross, Fee, Net, From Email Address, To Email Address, Transaction ID, Item Title, Item ID, Buyer ID, Item URL, Closing Date, Reference Txn ID, Receipt ID,
    "04/22/07", "12:00:21", "PDT", "Test", "Payment Received", "Cleared", "USD", "321", "2.32", "3213', "[email protected]", "[email protected]", "", "testing", "392302", "jdal32", "http://ddd.com", "04/22/03", "", "",
    "04/22/07", "12:00:21", "PDT", "Test", "Payment Received", "Cleared", "USD", "321", "2.32", "3213', "[email protected]", "[email protected]", "", "testing", "392932930302", "jejsl32", "http://ddd.com", "04/22/03", "", "",
    Do you need more than 2 rows? I did not include all the columns from the actual csv file, but most of them.  I am planning on loading these specific columns into the first table: date, to email address, transaction ID, item title, item ID, buyer ID, item URL.
    For the other table I don't have any values here because I did not list them, but if you do this for me I could probably figure the other table out.
    Thank you very much.

  • How to get current month from filename and bulk insert from text file into table?

    I set up some dynamic SQL to help me bulk copy data from a text file to a table.  This works fine for files that come in every day; I get the previous day’s data, based on the file name that’s placed
    in the folder.  That’s why I’m using the ‘-1’.  The dates will look like this: '20140131', so I'm using type 112.
    declare @fullpath1 varchar(1000)
    select @fullpath1 = '''\\system.local\ms\london\FTP\' + convert(varchar, getdate()-1, 112) + '_INDEXPRICES_EOM.SPC'''
    declare @cmd1 nvarchar(1000)
    print (@cmd1)
    select @cmd1 = 'bulk insert [dbo].[SB_Monthly] from ' + @fullpath1 + ' with (FIELDTERMINATOR = ''\t'', FIRSTROW = 5, LASTROW = 675, ROWTERMINATOR=''0x0a'')'
    print(@cmd1)
    exec (@cmd1)
    I think the syntax will be somewhat similar to this:
    YEAR(date_column)=YEAR(getdate()) AND MONTH(date_column)=MONTH(getdate())
    I’m not totally sure how to incorporate that into my current syntax.
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

    I tried a couple versions of this.
    Declare @StartDate Date, @EndDate Date
    Select @StartDate = convert(varchar, getdate()-28, 112), @EndDate = convert(varchar, getdate()-1, 112)
    BEGIN
    declare @fullpath1 varchar(1000)
    select @fullpath1 = '''\\ms\london\FTP\' + ''' between ''' + Convert(Varchar(10), @StartDate, 101) + ''' and ''' + Convert(Varchar(10), @EndDate, 101) + '''_SP.SPC'''
    declare @cmd1 nvarchar(1000)
    print (@cmd1)
    select @cmd1 = 'bulk insert [dbo].[SPBMI_Monthly] from ' + @fullpath1 + ' with (FIELDTERMINATOR = ''\t'', FIRSTROW = 5, LASTROW = 675, ROWTERMINATOR=''0x0a'')'
    print(@cmd1)
    exec (@cmd1)
    END
    Here’s the string:
    bulk insert [dbo].[SPBMI_Monthly] from '\\ms\london\FTP\' between '02/03/2014' and '03/02/2014'_SP.SPC' with (FIELDTERMINATOR = '\t', FIRSTROW = 5, LASTROW = 675, ROWTERMINATOR='0x0a')
    The error message I keep getting is:
    Msg 156, Level 15, State 1, Line 1
    Incorrect syntax near the keyword 'between'.
    Msg 319, Level 15, State 1, Line 1
    Incorrect syntax near the keyword 'with'. If this statement is a common table expression, an xmlnamespaces clause or a change tracking context clause, the previous statement must be terminated with a semicolon.
    I feel like I’m already pushing this thing to the limit. 
    Maybe this last part isn’t possible.
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

  • Bulk Insert from SQL Server to Oracle

    I have to load around 20 million rows from a SQL Server table to an Oracle table using a network link.  I wrote the following code using BULK COLLECT, which is working but taking a long time (about 5 hrs).
    I also tried changing the table to parallel degree 8, which didn't help (the Oracle table is also set to NOLOGGING mode).
    Is there any better way to do this? Appreciate any help in this regard.
    Script :
    CREATE OR REPLACE PROCEDURE INSERT_SQLSERVER_TO_ORACLE
    IS
      TYPE v_ARRAY IS TABLE OF TARGET_CUST%ROWTYPE INDEX BY BINARY_INTEGER;
      ins_rows v_ARRAY;
    BEGIN
      DECLARE
        CURSOR REC1 IS
          SELECT COL1, COL2, COL3, COL4 FROM SOURCE_SQLSERVER_CUST;
      BEGIN
        OPEN REC1;
        LOOP
          FETCH REC1 BULK COLLECT INTO ins_rows LIMIT 5000;
          FORALL i IN ins_rows.FIRST..ins_rows.LAST
            INSERT INTO TARGET_CUST VALUES ins_rows(i);
          EXIT WHEN REC1%NOTFOUND;
        END LOOP;
        COMMIT;
        CLOSE REC1;
      END;
    END;
    Thanks in Advance.

    887204 wrote:
    I have to load around 20 million rows from a SQL Server table to an Oracle table using a network link, wrote the following code using BULK COLLECT, which is working but taking a long time (about 5 hrs).
    I would not pull that data via a network link and use standard SQL insert statements.  Bulk processing is meaningless in this context.  It does nothing to increase the performance, as context switching is not the issue.
    The biggest factor is pulling 20 million rows' worth of data via a database link across the network.  This will be slow by its very nature.
    I would use bcp (Bulk Copy export) on SQL-Server to write the data to a CSV file.
    Zip that file. FTP/scp/sftp it to the Oracle server. Unzip it.
    Then do a parallel direct load of the data using SQL*Loader.
    This will be a lot faster than pulling uncompressed data across the network, a couple of rows at a time (together with the numerous moving parts on the Oracle side, which use an HS agent as the interface between SQL-Server and the Oracle database).

  • Oracle connection from remote server where oracle client is installed not happening.

    Hi  ,
    I am facing a problem today: I am not able to reach the Oracle DB from another machine where I installed the Oracle client a few months ago.  It was working until yesterday.
    OS: Windows
    Version: 11G
    Oracle DB TNS:
    # tnsnames.ora Network Configuration File: C:\app\oracluadmin\product\11.2.0\dbhome_1\NETWORK\ADMIN\tnsnames.ora
    # Generated by Oracle configuration tools.
    DFCCDB =
      (DESCRIPTION =
        (ADDRESS_LIST =
          (ADDRESS = (PROTOCOL = TCP)(HOST = 10.199.4.130)(PORT = 1521))
        (CONNECT_DATA =
          (SERVER = DEDICATED)
          (SERVICE_NAME = dfccdb.dfcc.co.in)
    ORACLR_CONNECTION_DATA =
      (DESCRIPTION =
        (ADDRESS_LIST =
          (ADDRESS = (PROTOCOL = IPC)(KEY = EXTPROC1521))
        (CONNECT_DATA =
          (SID = CLRExtProc)
    Output from DB server:
    C:\Users\oracluadmin>tnsping DFCCDB
    TNS Ping Utility for 64-bit Windows: Version 11.2.0.3.0 - Production on 28-JAN-2014 14:51:22
    Copyright (c) 1997, 2011, Oracle.  All rights reserved.
    Used parameter files:
    C:\app\oracluadmin\product\11.2.0\dbhome_1\network\admin\sqlnet.ora
    Used TNSNAMES adapter to resolve the alias
    Attempting to contact (DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)(HOST = 10.199.4.130)(PORT = 1521))) (CONNECT_DATA = (SERVER
    = DEDICATED) (SERVICE_NAME = dfccdb.dfcc.co.in)))
    OK (0 msec)
    C:\Users\oracluadmin>sqlplus SYS AS SYSDBA@DFCCDB
    SQL*Plus: Release 11.2.0.3.0 Production on Tue Jan 28 14:56:15 2014
    Copyright (c) 1982, 2011, Oracle.  All rights reserved.
    Enter password:
    Connected to:
    Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production
    With the Partitioning, OLAP, Data Mining and Real Application Testing options
    SQL>
    Client server TNS:
    DFCCDB =
      (DESCRIPTION =
        (ADDRESS_LIST =
          (ADDRESS = (PROTOCOL = TCP)(HOST = 10.199.4.130)(PORT = 1521))
        (CONNECT_DATA =
          (SERVER = DEDICATED)
          (SERVICE_NAME = dfccdb.dfcc.co.in)
    Output from client server:
    C:\Users\gisadmin>tnsping DFCCDB
    TNS Ping Utility for 64-bit Windows: Version 11.2.0.1.0 - Production on 28-JAN-2014 15:06:18
    Copyright (c) 1997, 2010, Oracle.  All rights reserved.
    Used parameter files:
    C:\app\gisadmin\product\11.2.0\client_1\network\admin\sqlnet.ora
    Used TNSNAMES adapter to resolve the alias
    Attempting to contact (DESCRIPTION = (ADDRESS_LIST = (ADDRESS = (PROTOCOL = TCP)(HOST = 10.199.4.130)(PORT = 1521))) (CONNECT_DATA = (SERVICE_NAME = dfccdb.dfcc.co.in)))
    OK (10 msec)
    C:\Users\gisadmin>sqlplus SYS AS SYSDBA@DFCCDB
    SQL*Plus: Release 11.2.0.1.0 Production on Tue Jan 28 15:06:36 2014
    Copyright (c) 1982, 2010, Oracle.  All rights reserved.
    Enter password:
    ERROR:
    ORA-12560: TNS:protocol adapter error
    Enter user-name:
    Thanks...

    Hi,
    Refer the logs:
    <msg time='2014-02-11T10:43:10.455+05:30' org_id='oracle' comp_id='tnslsnr'
    type='UNKNOWN' level='16' host_id='DFCC-GISDB-01'
    host_addr='10.199.4.58'>
    <txt>11-FEB-2014 10:43:10 * (CONNECT_DATA=(SERVICE_NAME=dfccdb.dfcc.co.in)(CID=(PROGRAM=C:\app\gisadmin\product\11.2.0\client_1\bin\sqlplus.exe)(HOST=DFCC-GISAPP-01)(USER=gisadmin))) * (ADDRESS=(PROTOCOL=tcp)(HOST=10.199.4.55)(PORT=25229)) * establish * dfccdb.dfcc.co.in * 12514
    </txt>
    </msg>
    <msg time='2014-02-11T10:43:10.455+05:30' org_id='oracle' comp_id='tnslsnr'
    type='UNKNOWN' level='16' host_id='DFCC-GISDB-01'
    host_addr='10.199.4.58'>
    <txt>TNS-12514: TNS:listener does not currently know of service requested in connect descriptor
    </txt>
    </msg>
    <msg time='2014-02-11T10:43:58.706+05:30' org_id='oracle' comp_id='tnslsnr'
    type='UNKNOWN' level='16' host_id='DFCC-GISDB-01'
    host_addr='10.199.4.58'>
    <txt>11-FEB-2014 10:43:58 * (CONNECT_DATA=(SERVICE_NAME=dfccdb.dfcc.co.in)(CID=(PROGRAM=C:\app\gisadmin\product\11.2.0\client_1\bin\sqlplus.exe)(HOST=DFCC-GISAPP-01)(USER=gisadmin))) * (ADDRESS=(PROTOCOL=tcp)(HOST=10.199.4.55)(PORT=25230)) * establish * dfccdb.dfcc.co.in * 12514
    </txt>
    </msg>
    <msg time='2014-02-11T10:43:58.722+05:30' org_id='oracle' comp_id='tnslsnr'
    type='UNKNOWN' level='16' host_id='DFCC-GISDB-01'
    host_addr='10.199.4.58'>
    <txt>TNS-12514: TNS:listener does not currently know of service requested in connect descriptor
    </txt>
    </msg>
    <msg time='2014-02-11T10:44:33.120+05:30' org_id='oracle' comp_id='tnslsnr'
    type='UNKNOWN' level='16' host_id='DFCC-GISDB-01'
    host_addr='10.199.4.58'>
    <txt>11-FEB-2014 10:44:33 * (CONNECT_DATA=(SERVICE_NAME=dfccdb.dfcc.co.in)(CID=(PROGRAM=C:\app\gisadmin\product\11.2.0\client_1\bin\sqlplus.exe)(HOST=DFCC-GISAPP-01)(USER=gisadmin))) * (ADDRESS=(PROTOCOL=tcp)(HOST=10.199.4.55)(PORT=25231)) * establish * dfccdb.dfcc.co.in * 12514
    </txt>
    </msg>
    <msg time='2014-02-11T10:44:33.136+05:30' org_id='oracle' comp_id='tnslsnr'
    type='UNKNOWN' level='16' host_id='DFCC-GISDB-01'
    host_addr='10.199.4.58'>
    <txt>TNS-12514: TNS:listener does not currently know of service requested in connect descriptor
    </txt>
    </msg>
    <msg time='2014-02-11T10:45:24.662+05:30' org_id='oracle' comp_id='tnslsnr'
    type='UNKNOWN' level='16' host_id='DFCC-GISDB-01'
    host_addr='10.199.4.58'>
    <txt>11-FEB-2014 10:45:24 * (CONNECT_DATA=(SERVICE_NAME=dfccdb.dfcc.co.in)(CID=(PROGRAM=C:\app\gisadmin\product\11.2.0\client_1\bin\sqlplus.exe)(HOST=DFCC-GISAPP-01)(USER=gisadmin))) * (ADDRESS=(PROTOCOL=tcp)(HOST=10.199.4.55)(PORT=25280)) * establish * dfccdb.dfcc.co.in * 12514
    </txt>
    </msg>
    <msg time='2014-02-11T10:45:24.662+05:30' org_id='oracle' comp_id='tnslsnr'
    type='UNKNOWN' level='16' host_id='DFCC-GISDB-01'
    host_addr='10.199.4.58'>
    <txt>TNS-12514: TNS:listener does not currently know of service requested in connect descriptor
    </txt>
    </msg>
    <msg time='2014-02-11T10:48:00.819+05:30' org_id='oracle' comp_id='tnslsnr'
    type='UNKNOWN' level='16' host_id='DFCC-GISDB-01'
    host_addr='10.199.4.58'>
    <txt>11-FEB-2014 10:48:00 * (CONNECT_DATA=(CID=(PROGRAM=)(HOST=)(USER=oracluadmin))(COMMAND=status)(ARGUMENTS=64)(SERVICE=LISTENER)(VERSION=186647296)) * status * 0
    </txt>
    </msg>
    <msg time='2014-02-11T10:48:14.547+05:30' org_id='oracle' comp_id='tnslsnr'
    type='UNKNOWN' level='16' host_id='DFCC-GISDB-01'
    host_addr='10.199.4.58'>
    <txt>11-FEB-2014 10:48:14 * (CONNECT_DATA=(CID=(PROGRAM=)(HOST=)(USER=oracluadmin))(COMMAND=services)(ARGUMENTS=64)(SERVICE=LISTENER)(VERSION=186647296)) * services * 0
    </txt>
    </msg>

  • Driver's SQLSetConnectAttr failed/SQL Server 2005/ODBC Native Client

    I have an ODBC application that I am writing in VS 2008 C++. I freely admit to not being an ODBC expert. I am having trouble with SetConnectAttr failures. I've got them down to one specific case that I will describe here. The environment is 64-bit Windows
    7 (client) and SQL Server 2005 running on Win Server 2003. I am using a 32-bit ODBC connection. For the most part the application is working: I am successfully using the connection to do multiple INSERTs into multiple tables. I get lots of hits when I
    search on the above error message but they are so "all over the map" (Oracle, packaged applications, specific situations) that I have not been able to find one that seems to apply. Connection Pooling is not turned on in the connection.
    I make the following call
    SQLUINTEGER timeout = 10;   // seconds
    retCode = SQLSetConnectAttr(ConnectionHandle, SQL_ATTR_LOGIN_TIMEOUT, &timeout, 0);
    I receive a zero in retCode.
    However when I subsequently issue the SQLConnect() I receive
    [Microsoft][ODBC Driver Manager] Driver's SQLSetConnectAttr failed
    I was setting three different connection attributes and getting the above message twice, but for debugging purposes I have commented out two of the sets.  The above call is the only attribute I am setting, so it is apparently the one that is failing.  Any
    help would be appreciated.
    Charles

    In case anyone else runs into this, here is the real problem with the original code:
        retCode = SQLSetConnectAttr(ConnectionHandle, SQL_ATTR_LOGIN_TIMEOUT, &timeout, 0);
    The SQL_ATTR_LOGIN_TIMEOUT attribute takes an integer by value, not a pointer to an integer.  This is a confusing aspect of this API.  If you pass a pointer, ODBC is really "seeing" a large integer value, which causes the
    code to effectively behave as though there were an infinite timeout.
    Instead, just pass the value of the timeout:
        retCode = SQLSetConnectAttr(ConnectionHandle, SQL_ATTR_LOGIN_TIMEOUT, (SQLPOINTER) timeout, SQL_IS_UINTEGER);
    Then the timeout value should be respected.  Using SQLSetConnectOption(SQL_LOGIN_TIMEOUT) seems to be equivalent to the above, provided you pass the timeout by value and not via a pointer.
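    To put the corrected call in context, here is a minimal sketch of a full connect sequence with the timeout passed by value; the DSN name and credentials ("MyDsn", "user", "pass") are placeholders, and error handling is reduced to the bare minimum:
    // Hedged sketch: allocate handles, set the login timeout by value, then connect.
    #include <sql.h>
    #include <sqlext.h>

    SQLHDBC connect_with_timeout()
    {
        SQLHENV henv = SQL_NULL_HENV;
        SQLHDBC hdbc = SQL_NULL_HDBC;

        SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &henv);
        SQLSetEnvAttr(henv, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, 0);
        SQLAllocHandle(SQL_HANDLE_DBC, henv, &hdbc);

        // Integer attribute: cast the value itself to SQLPOINTER, no address-of.
        SQLSetConnectAttr(hdbc, SQL_ATTR_LOGIN_TIMEOUT, (SQLPOINTER)10, SQL_IS_UINTEGER);

        SQLRETURN rc = SQLConnect(hdbc,
                                  (SQLCHAR*)"MyDsn", SQL_NTS,
                                  (SQLCHAR*)"user",  SQL_NTS,
                                  (SQLCHAR*)"pass",  SQL_NTS);
        if (!SQL_SUCCEEDED(rc))
        {
            SQLFreeHandle(SQL_HANDLE_DBC, hdbc);
            SQLFreeHandle(SQL_HANDLE_ENV, henv);
            return SQL_NULL_HDBC;
        }
        return hdbc;  // caller disconnects when done (env handle kept for brevity)
    }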

  • Help me plz: Connect to tuxedo from visual basic 6.0 client application

    Hello everyone,
    I have a big problem.
    I am trying to connect my Visual Basic 6.0 application to Tuxedo, but I don't know how to use tpinit and tpcall in the VB environment.
    Here is my code:
    Private Declare Function tpinit Lib "C:\OracleHome\tuxedo11gR1\bin\wtuxws32.dll" _
    (ByVal vlTpInfo As Long) As Integer
    Private Declare Function TpTerm Lib "C:\OracleHome\tuxedo11gR1\bin\wtuxws32.dll" Alias "tpterm" () As Integer
    Private Declare Function TpCall Lib "C:\OracleHome\tuxedo11gR1\bin\wtuxws32.dll" Alias "tpcall" _
    (ByVal vsServiceName As String, ByVal vlBufPtr As Long, ByVal vlBufLen As Long, _
    ByRef rlReplyBufPtr As Long, ByRef rlReplyBufLen As Long, ByVal vlFlags As Long) As Integer
    Private Declare Function TpAlloc Lib "C:\OracleHome\tuxedo11gR1\bin\wtuxws32.dll" Alias "tpalloc" _
    (ByVal vsTpType As String, ByVal vsTpSubType As String, ByVal vlSize As Long) As Long
    Private Declare Sub TpFree Lib "C:\OracleHome\tuxedo11gR1\bin\wtuxws32.dll" Alias "tpfree" (ByVal vlBufPtr As Long)
    Option Explicit
    Type tpinfo_type
    username As String
    cltname As String
    passwd As String
    flags As Long
    datalen As Long
    data As String
    End Type
    Dim tpinfo As tpinfo_type
    tpinfo.username = "cajat05"
    tpinfo.passwd = "cajat05"
    tpinfo.cltname = "VB6"
    tpinfo.flags = 110
    tpinfo.data = "NMLRS 55502022"
    tpinfo.datalen = 14
    Dim ret_init As Integer
    Dim ret_tpalloc As Long
    ret_tpalloc = TpAlloc("tpinfo", "", 9000)
    MsgBox "ret_tpalloc = " & ret_tpalloc
    'ret_init = tpinit(900)
    'MsgBox "ret_init = " & ret_init
    Dim s As String
    If ret_init = 0 Then
    msgbox "good"
    Else
    msgbox "not good"
    End If
    Can anyone send me code they wrote for a similar case?
    Thank you in advance.

    As an alternative you may want to consider writing a "C" DLL that is callable from VB. The DLL would make the Tuxedo "tp" calls and check all the return codes and the functions defined in the DLL would simplify the interface to VB. I have done this successfully in the past.
    Harvey

  • Bulk Insert Issue with BCP

    I'm running SQL Server 2008 R2 and trying to test out bcp in one of our databases.  For almost all the tables, the bcp export and BULK INSERT work fine using commands similar to the ones below.  However, on a few tables I am experiencing an issue when trying to BULK INSERT the data back in.
    Here are the details:
    This is the bcp command to export out the data (via simple batch file):
     1.)
    SET OUTPUT=K:\BCP_FIN_Test
    SET ERRORLOG=C:\Temp\BCP_Error_Log
    SET TIMINGS=C:\Temp\BCP_Timings
    bcp "SELECT * FROM FS84RPT.dbo.PS_PO_LINE Inner Join FS84RPT.[dbo].[PS_RECV_LN_ACCTG] on PS_PO_LINE.BUSINESS_UNIT = PS_RECV_LN_ACCTG.BUSINESS_UNIT_PO and PS_PO_LINE.PO_ID= PS_RECV_LN_ACCTG.PO_ID and PS_PO_LINE.LINE_NBR= PS_RECV_LN_ACCTG.LINE_NBR WHERE
    PS_RECV_LN_ACCTG.FISCAL_YEAR = '2014' and PS_RECV_LN_ACCTG.ACCOUNTING_PERIOD BETWEEN '9' AND '11' " queryout %OUTPUT%\PS_PO_LINE.txt -e %ERRORLOG%\PS_PO_LINE.err -o %TIMINGS%\PS_PO_LINE.txt -T -N
     2.)
    BULK INSERT PS_PO_LINE FROM 'K:\BCP_FIN_Test\PS_PO_LINE.txt' WITH (DATAFILETYPE = 'widenative')
    Msg 4869, Level 16, State 1, Line 1
    The bulk load failed. Unexpected NULL value in data file row 2, column 22. The destination column (CNTRCT_RATE_MULT) is defined as NOT NULL.
    Msg 4866, Level 16, State 4, Line 1
    The bulk load failed. The column is too long in the data file for row 3, column 22. Verify that the field terminator and row terminator are specified correctly.
    Msg 7399, Level 16, State 1, Line 1
    The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
    Msg 7330, Level 16, State 2, Line 1
    Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
    I've tried a few different things including trying to export as character and import as BULK INSERT PS_PO_LINE FROM 'K:\BCP_FIN_Test\PS_PO_LINE.txt' WITH (DATAFILETYPE = 'char')
    But no luck
    Appreciate help

    It seems that the target table does not match your expectations.
    Since I don't know exactly what you are doing, I will have to resort to guesses.
    I note that your export query goes:
      SELECT * FROM FS84RPT.dbo.PS_PO_LINE Inner Join
    And then you are importing into a table called PS_PO_LINE as well.  But for your operation to make sense, the target PS_PO_LINE must have not only the columns from PS_PO_LINE, but also all the columns from PS_RECV_LN_ACCTG.  Maybe your SELECT should read
      SELECT PS_PO_LINE.* FROM FS84RPT.dbo.PS_PO_LINE Inner Join
    or use an EXISTS clause to add the filter on the PS_RECV_LN_ACCTG table.  (Assuming that it appears in the query for filtering only.)
    Erland Sommarskog, SQL Server MVP, [email protected]

  • CUCM 9.1 Bulk Insert Remote Destinations with a +

    If I try a bulk insert of remote destinations that begin with a + sign (e.g. +15552125555) I get the following error:
    "A character to numeric conversion process failed"
    If I create the remote destination manually it works fine, so I know that the string can contain a + sign, and if I export the existing remote destinations the format is identical to my csv.  Is there anything that needs to be done for it to correctly handle the + character?

    CSV file attached. I've removed the identifying information, but this is what I exported from the call manager. If I try to import it back in, it throws that error.

  • How can I debug a Bulk Insert error?

    I'm loading a bunch of files into SQL Server.  All work fine, but one keeps erroring out on me.  All files should be exactly the same in structure; they have different dates and other different financial metrics, but the structure and field
    names should be exactly the same.  Nevertheless, one keeps conking out and throwing this error.
    Msg 4832, Level 16, State 1, Line 1
    Bulk load: An unexpected end of file was encountered in the data file.
    Msg 7399, Level 16, State 1, Line 1
    The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
    Msg 7330, Level 16, State 2, Line 1
    Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
    The ROWTERMINATOR should be CRLF, and when you look at it in Notepad++ that's what it looks like, but it must be something else, because I keep getting errors here.  I tried the good old ROWTERMINATOR='0x0a'.
    That works on all files but one, so there's something funky going on here, and I need to see what SQL Server is really doing.
    Is there some way to print out a log, or look at a log somewhere?
    Thanks!!
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

    The first thing to try is to see if BCP likes the file.  BCP and BULK INSERT adhere to the same spec, but they are different implementations, so there are subtle differences.
    There is an ERRORFILE option, but it helps more when there is bad data.
    You can also use the BATCHSIZE option to see how many records in the file it swallows before things go bad.  FIRSTROW and LASTROW can also help.
    All in all, it can be quite tedious to find that single row where things are different, and where BULK INSERT loses sync entirely.  Keep in mind that it reads fields one by one, and if there is one field terminator too few on a line, it will consume the line
    feed at the end of the line as data.
    Erland Sommarskog, SQL Server MVP, [email protected]

  • Prevent Duplicates from Inserting from File in Stored Procedure

    CREATE TABLE dbo.Logins (
    [ID] numeric(10, 0) NOT NULL Primary Key,
    [Dr_FName] varchar(50) NOT NULL,
    [Dr_LName] varchar(50) NOT NULL,
    [Login] varchar(50) NOT NULL
    )
    GO
    CREATE TABLE [dbo].[CRIS_USER] (
    [id] numeric(10, 0) NOT NULL Primary Key,
    [login_id] varchar(20) NOT NULL
    )
    GO
    CREATE TABLE [dbo].[CRIS_SYSTEM_USER] (
    [id] numeric(10, 0) NOT NULL Primary Key,
    [cris_user_id] numeric(10, 0) NOT NULL
    )
    GO
    INSERT INTO Logins
    (ID, Dr_FName, Dr_LName,Login)
    VALUES(1,'Lisa','Mars','lmars')
    INSERT INTO Logins
    (ID, Dr_FName, Dr_LName,Login)
    VALUES(2,'Becky','Saturn','bsaturn')
    INSERT INTO Logins
    (ID, Dr_FName, Dr_LName,Login)
    VALUES(3,'Mary','Venus','mvenus')
    INSERT INTO CRIS_USER
    (ID,login_id)
    VALUES(10, 'lmars')
    INSERT INTO CRIS_USER
    (ID,login_id)
    VALUES(20, 'bsaturn')
    INSERT INTO CRIS_USER
    (ID,login_id)
    VALUES(30, 'mvenus')
    INSERT INTO CRIS_SYSTEM_USER
    (ID,cris_user_id)
    VALUES(110, 10)
    INSERT INTO CRIS_SYSTEM_USER
    (ID,cris_user_id)
    VALUES(120,20)
    INSERT INTO CRIS_SYSTEM_USER
    (ID,cris_user_id)
    VALUES(130, 30)
    I'm aware that "ID" is a bad column name and that it should not be numeric. The ID columns are incremented by a program. They are not auto incremented. I didn't design it.
    I have a stored procedure that does a bulk insert from a tab delimited file into the three tables:
    CREATE PROCEDURE [dbo].[InsertUserText]
    WITH EXECUTE AS CALLER
    AS
    IF OBJECT_ID('TEMPDB..#LoginTemp') IS NULL
    BEGIN
    CREATE TABLE #LoginTemp(Login varchar(50),Dr_FName varchar(50),Dr_LName varchar(50))
    BULK INSERT #LoginTemp
    FROM 'C:\loginstest\InsertUserText.txt'
    WITH (ROWTERMINATOR ='\n'
    -- New Line Feed (\n) automatically adds Carriage Return (\r)
    ,FIELDTERMINATOR = '\t'
    --delimiter
    ,FIRSTROW=4)
    PRINT 'File data copied to Temp table'
    END
    DECLARE @maxid NUMERIC(10,0)
    DECLARE @maxid2 NUMERIC(10,0)
    DECLARE @maxid3 NUMERIC(10,0)
    BEGIN TRANSACTION
    SELECT @maxid = coalesce(MAX(ID), 0)
    FROM dbo.LOGINS WITH (UPDLOCK)
    INSERT INTO dbo.LOGINS(ID,Dr_FName,Dr_LName,Login)
    SELECT row_number() OVER(ORDER BY (SELECT NULL)) + @maxid,
    Dr_FName,Dr_LName,Login
    FROM #LoginTemp
    WHERE #LoginTemp.Dr_FName is not Null;
    SELECT @maxid3 = coalesce(MAX(id), 0)
    FROM dbo.CRIS_USER WITH (UPDLOCK)
    INSERT INTO dbo.CRIS_USER(id,login_id)
    SELECT row_number() OVER(ORDER BY (SELECT NULL)) + @maxid3,
    Login
    FROM #LoginTemp
    WHERE #LoginTemp.Dr_FName is not Null;
    SELECT @maxid2 = coalesce(MAX(id), 0)
    FROM dbo.CRIS_SYSTEM_USER WITH (UPDLOCK)
    INSERT INTO dbo.CRIS_SYSTEM_USER(id,cris_user_id)
    SELECT row_number() OVER(ORDER BY (SELECT NULL)) + @maxid2,
    + row_number() OVER(ORDER BY (SELECT NULL)) + @maxid3
    FROM #LoginTemp
    WHERE #LoginTemp.Dr_FName is not Null;
    PRINT 'Copied from Temp table to CRIS_USER'
    COMMIT TRANSACTION
    GO
    What suggestions do you have to prevent a duplicate Logins.Login? None of the inserts for all three tables should occur if a login already exists in the Logins table. There should be a message to indicate which Login failed. I haven't yet decided if I want
    all of the logins in the text file to fail if there is a duplicate, or just the one with the duplicate. I'm open to suggestions on that. So far, duplicates only occur when someone forgets to update the tab-delimited file and accidentally runs the procedure
    on an old one. I'm sure I can come up with an if statement that will accomplish this. I could maybe use WHERE EXISTS or WHERE NOT EXISTS. But I know I can get a good solution here.
    I'm also aware that duplicates could be prevented in the table design. Again, I didn't design it.
    I have a tab delimited file created but don't see a way to attach it.
    Thanks for any help.

    Thanks to all three that replied. I meant to mark the question as answered sooner. I've tried all the suggestions on a test system. All will work with maybe some slight variations. Below was my temporary quick fix. I'm working on switching to a permanent
    solution based on the replies.
    This is not the real solution and is not the answer to my question. It's just temporary.
    IF OBJECT_ID('TEMPDB..#LoginTemp') IS NULL
    BEGIN
    CREATE TABLE #LoginTemp(Login varchar(50),Dr_FName varchar(50),Dr_LName varchar(50))
    BULK INSERT #LoginTemp
    FROM 'C:\loginstest\InsertUserText.txt'
    WITH (ROWTERMINATOR ='\n'
    -- New Line Feed (\n) automatically adds Carriage Return (\r)
    ,FIELDTERMINATOR = '\t'
    ,FIRSTROW=4)
    PRINT 'File data copied to Temp table'
    END
    DECLARE @maxid NUMERIC(10,0)
    DECLARE @maxid2 NUMERIC(10,0)
    DECLARE @maxid3 NUMERIC(10,0)
    IF EXISTS(SELECT 'True' FROM Logins L INNER JOIN #LoginTemp LT on L.Login = LT.Login)
    BEGIN
    SELECT 'Duplicate row!'
    END
    ELSE
    BEGIN
    BEGIN TRANSACTION
    SELECT @maxid = coalesce(MAX(ID), 0)
    FROM dbo.LOGINS WITH (UPDLOCK)
    INSERT INTO dbo.LOGINS(ID,Dr_FName,Dr_LName,Login)
    SELECT row_number() OVER(ORDER BY (SELECT NULL)) + @maxid,
    Dr_FName,Dr_LName,Login
    FROM #LoginTemp
    WHERE #LoginTemp.Dr_FName is not Null;
    SELECT @maxid3 = coalesce(MAX(id), 0)
    FROM dbo.CRIS_USER WITH (UPDLOCK)
    INSERT INTO dbo.CRIS_USER(id,login_id)
    SELECT row_number() OVER(ORDER BY (SELECT NULL)) + @maxid3,
    Login
    FROM #LoginTemp
    WHERE #LoginTemp.Dr_FName is not Null;
    SELECT @maxid2 = coalesce(MAX(id), 0)
    FROM dbo.CRIS_SYSTEM_USER WITH (UPDLOCK)
    INSERT INTO dbo.CRIS_SYSTEM_USER(id,cris_user_id)
    SELECT row_number() OVER(ORDER BY (SELECT NULL)) + @maxid2,
    + row_number() OVER(ORDER BY (SELECT NULL)) + @maxid3
    FROM #LoginTemp
    WHERE #LoginTemp.Dr_FName is not Null;
    COMMIT TRANSACTION
    END
    GO

  • Blob truncated with DbFactory and Bulk insert

    Hi,
    My platform is a Microsoft Windows Server 2003 R2 Server 5.2 Service Pack 2 (64-bit) with an Oracle Database 11g 11.1.0.6.0.
    I use the client Oracle 11g ODAC 11.1.0.7.20.
    Some strange behavior happens when using DbFactory and a bulk command with a Blob column and a parameter with a size larger than 65536 bytes.  Let me explain.
    First I create a dummy table in my schema:
    create table dummy (a number, b blob)
    To use bulk insert we can use code A with Oracle objects (executes successfully):
    byte[] b1 = new byte[65530];
    byte[] b2 = new byte[65540];
    Oracle.DataAccess.Client.OracleConnection conn = new Oracle.DataAccess.Client.OracleConnection("User Id=login;Password=pws;Data Source=orcl;");
    OracleCommand cmd = new OracleCommand("insert into dummy values (:p1,:p2)", conn);
    cmd.ArrayBindCount = 2;
    OracleParameter p1 = new OracleParameter("p1", OracleDbType.Int32);
    p1.Direction = ParameterDirection.Input;
    p1.Value = new int[] { 1, 2 };
    cmd.Parameters.Add(p1);
    OracleParameter p2 = new OracleParameter("p2", OracleDbType.Blob);
    p2.Direction = ParameterDirection.Input;
    p2.Value = new byte[][] { b1, b2 };
    cmd.Parameters.Add(p2);
    conn.Open(); cmd.ExecuteNonQuery(); conn.Close();
    We can write the same thing at an abstraction level using DbProviderFactories (code B):
    var factory = DbProviderFactories.GetFactory("Oracle.DataAccess.Client");
    DbConnection conn = factory.CreateConnection();
    conn.ConnectionString = "User Id=login;Password=pws;Data Source=orcl;";
    DbCommand cmd = conn.CreateCommand();
    cmd.CommandText = "insert into dummy values (:p1,:p2)";
    ((OracleCommand)cmd).ArrayBindCount = 2;
    DbParameter param = cmd.CreateParameter();
    param.ParameterName = "p1";
    param.DbType = DbType.Int32;
    param.Value = new int[] { 3, 4 };
    cmd.Parameters.Add(param);
    DbParameter param2 = cmd.CreateParameter();
    param2.ParameterName = "p2";
    param2.DbType = DbType.Binary;
    param2.Value = new byte[][] { b1, b2 };
    cmd.Parameters.Add(param2);
    conn.Open(); cmd.ExecuteNonQuery(); conn.Close();
    But this second code doesn't work: the second byte array is truncated to 4 bytes.  It looks like an Int16 overflow.
    When using DbType.Binary, Oracle maps it to OracleDbType.Raw rather than OracleDbType.Blob, so the problem seems to be with the raw type; BUT if we use the same code without bulk insert, it works!  The problem is somewhere else...
    Why use a DbConnection?  To be able to switch easily to another database type.
    So why use "((OracleCommand)cmd).ArrayBindCount"?  To be able to use specific functionality of each database.
    I can fix the issue by casting the DbParameter to OracleParameter and forcing the OracleDbType to Blob, but why does the second code not work with bulk binding while it works with a simple query?

    BCP and BULK INSERT do not work the way you expect them to.  What they do is consume fields in a round-robin fashion.  That is, they first look for data for the first field, then for the second field, and so on.
    So in your case, they will first read one byte, then 20 bytes, etc. until they have read the two bytes for field 122.  At this point they will consume bytes until they have found a sequence of carriage return and line feed.
    You say that some records in the file are incomplete.  Say that there are only 60 fields in this file.  Field 61 is four bytes.  BCP and BULK INSERT will now read data for field 61 as CR+LF plus the first two bytes of the next row.  CR+LF has no special meaning;
    it is just data at this point.
    You will have to write a program to parse the file, or use SSIS. But BCP and BULK INSERT are not your friends in this case.
    Erland Sommarskog, SQL Server MVP, [email protected]

  • Doing a bulk insert

    Hi,
    I need recommendations for doing a bulk insert from a database into OBPM, creating a new instance (a new BPM object with attributes).  I need to take the attributes from the DB, so I need to read each row, but I can't use a counter.
    Thanks for your help.
    :-D have a nice day

    Read the metalink document 199746.1

  • Bulk copy from a temp table

    My input is from a file. Since I do not have an ETL tool, I am using a stored proc to do the ETL (which also gives me an advantage, I do not have to unload the target table to do the join). So, I dump the file contents into a temp table and use it in proc.
    The query is like
    Insert into <target table1> (Select fields and some transformation from <temp table> where <key> not in target table and <some joins with other tables in database>)
    Like this I have four queries for four target tables.
    The inserts from the temp table into the target table are very slow because the target has a lot of indexes and RI. I cannot drop and recreate the indexes since the application requirements do not give me that liberty.
    My only option is to insert in a temp table similar to the target but without any index/RI/PK and then dump it into a file and then use SQL loader to load the file contents into target table. This is relatively faster but is a very cumbersome route to me.
    Is there any other way to do bulk insert from one table to another table like SQL loader without using a file? Is there anyway to bypass the index update operation without dropping the index?
    My source will be almost 500,000 rows and target is having 9 million rows.

    Posts like this one are better avoided.
    Because
    - You don't post a version
    - You don't post the SQL
    - You don't post the EXPLAIN PLAN
    It is your assertion the INSERT is to blame, yet it can equally be the SELECT statement involved.
    Basically your post boils down to
    'It doesn't work. Please help', without any relevant information.
    I'm saying this because INSERT SELECT is the fastest method available. OK, you could try the APPEND hint, but in that case you would have to rebuild all indices. Something you state you can not do.
    BULK INSERTs will be slower, SQL*Loader will be slower too, as it involves SQLnet. INSERT SELECT is a server side operation.
    And the 'solution' to do this by means of a file... Ahem, let's not talk about it. It just doesn't work.
    Sybrand Bakker
    Senior Oracle DBA

  • Create XML format file in bulk insert with a data file with out delimiter

    Hello
    I have a data file with no delimiter, like below:
    0080970393102312072981103378000004329392643958
    0080970393102312072981103378000004329392643958
    I just know that the first 5 digits in a line are, for example, the "bank ID",
    and the 6th and 7th digits in a line are, for example, the "employee ID".
    Could you help me with how I can create an XML format file?
    Thanks a lot

    This is a fixed file format. We need to know the length of each field before creating the format file. Since you have said the first 5 characters are the bank ID and the 6th to 7th are the employee ID, the XML should look like this:
    <?xml version="1.0"?>
    <BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <RECORD>
      <FIELD ID="1"xsi:type="CharFixed"LENGTH="5"/>
      <FIELD ID="2"xsi:type="CharFixed"LENGTH="2"/>
      <FIELD ID="3" xsi:type="CharFixed" LENGTH="8"/>
      <FIELD ID="4" xsi:type="CharFixed" LENGTH="14"/>
      <FIELD ID="5" xsi:type="CharFixed" LENGTH="14"/>
      <FIELD ID="6" xsi:type="CharFixed" LENGTH="1"/>
    </RECORD>
    <ROW>
      <COLUMNSOURCE="1"NAME="c1"xsi:type="SQLNCHAR"/>
      <COLUMNSOURCE="2"NAME="c2"xsi:type="SQLNCHAR"/>
      <COLUMN SOURCE="3" NAME="c3" xsi:type="SQLCHAR"/>
      <COLUMN SOURCE="4" NAME="c4" xsi:type="SQLINT"
    />
      <COLUMN SOURCE="5" NAME="c5" xsi:type="SQLINT"
    />
    </ROW>
    </BCPFORMAT>
    Note: Similarly, you need to specify the other lengths as well.
    http://stackoverflow.com/questions/10708985/bulk-insert-from-fixed-format-text-file-ignores-rowterminator
    Regards, RSingh
