CUCM 9.1 Bulk Insert Remote Destinations with a +

If I try a bulk insert of remote destinations that begin with a + sign (e.g. +15552125555), I get the following error:
"A character to numeric conversion process failed"
If I create the remote destination manually it works fine, so I know the string can contain a + sign. And if I export the existing remote destinations, the format is identical to my CSV. Is there anything that needs to be done for it to handle the + character correctly?

CSV file attached. I've removed the identifying information, but this is what I exported from the Call Manager. If I try to import it back in, it throws that error.

Similar Messages

  • Bulk Insert Issue with BCP

    I'm running SQL Server 2008 R2 and trying to test out bcp in one of our databases. For almost all the tables, bcp and BULK INSERT work fine using commands similar to those below. However, on a few tables I hit an error when trying to BULK INSERT the data back in.
    Here are the details:
    This is the bcp command to export out the data (via simple batch file):
     1.)
    SET OUTPUT=K:\BCP_FIN_Test
    SET ERRORLOG=C:\Temp\BCP_Error_Log
    SET TIMINGS=C:\Temp\BCP_Timings
    bcp "SELECT * FROM FS84RPT.dbo.PS_PO_LINE Inner Join FS84RPT.[dbo].[PS_RECV_LN_ACCTG] on PS_PO_LINE.BUSINESS_UNIT = PS_RECV_LN_ACCTG.BUSINESS_UNIT_PO and PS_PO_LINE.PO_ID= PS_RECV_LN_ACCTG.PO_ID and PS_PO_LINE.LINE_NBR= PS_RECV_LN_ACCTG.LINE_NBR WHERE
    PS_RECV_LN_ACCTG.FISCAL_YEAR = '2014' and PS_RECV_LN_ACCTG.ACCOUNTING_PERIOD BETWEEN '9' AND '11' " queryout %OUTPUT%\PS_PO_LINE.txt -e %ERRORLOG%\PS_PO_LINE.err -o %TIMINGS%\PS_PO_LINE.txt -T -N
     2.)
    BULK INSERT PS_PO_LINE FROM 'K:\BCP_FIN_Test\PS_PO_LINE.txt' WITH (DATAFILETYPE = 'widenative')
    Msg 4869, Level 16, State 1, Line 1
    The bulk load failed. Unexpected NULL value in data file row 2, column 22. The destination column (CNTRCT_RATE_MULT) is defined as NOT NULL.
    Msg 4866, Level 16, State 4, Line 1
    The bulk load failed. The column is too long in the data file for row 3, column 22. Verify that the field terminator and row terminator are specified correctly.
    Msg 7399, Level 16, State 1, Line 1
    The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
    Msg 7330, Level 16, State 2, Line 1
    Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
    I've tried a few different things including trying to export as character and import as BULK INSERT PS_PO_LINE FROM 'K:\BCP_FIN_Test\PS_PO_LINE.txt' WITH (DATAFILETYPE = 'char')
    But no luck.
    Appreciate the help.

    It seems that the target table does not match your expectations.
    Since I don't know exactly what you are doing, I will have to resort to guesses.
    I note that your export query goes:
      SELECT * FROM FS84RPT.dbo.PS_PO_LINE Inner Join
    And then you are importing into a table called PS_PO_LINE as well. But for your operation to make sense, the import PS_PO_LINE must have not only the columns from PS_PO_LINE but also all the columns from PS_RECV_LN_ACCTG. Maybe your SELECT should read
      SELECT PS_PO_LINE.* FROM FS84RPT.dbo.PS_PO_LINE Inner Join
    or use an EXISTS clause to apply the PS_RECV_LN_ACCTG filter (assuming that table appears in the query for filtering only).
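    Rough sketches of both suggestions, built from the query you posted (untested against your schema):
      -- 1) export only the PS_PO_LINE columns:
      SELECT PS_PO_LINE.*
      FROM   FS84RPT.dbo.PS_PO_LINE
             INNER JOIN FS84RPT.dbo.PS_RECV_LN_ACCTG
               ON  PS_PO_LINE.BUSINESS_UNIT = PS_RECV_LN_ACCTG.BUSINESS_UNIT_PO
               AND PS_PO_LINE.PO_ID         = PS_RECV_LN_ACCTG.PO_ID
               AND PS_PO_LINE.LINE_NBR      = PS_RECV_LN_ACCTG.LINE_NBR
      WHERE  PS_RECV_LN_ACCTG.FISCAL_YEAR = '2014'
        AND  PS_RECV_LN_ACCTG.ACCOUNTING_PERIOD BETWEEN '9' AND '11';

      -- 2) or keep PS_RECV_LN_ACCTG purely as a filter with EXISTS:
      SELECT l.*
      FROM   FS84RPT.dbo.PS_PO_LINE AS l
      WHERE  EXISTS (SELECT 1
                     FROM   FS84RPT.dbo.PS_RECV_LN_ACCTG AS r
                     WHERE  r.BUSINESS_UNIT_PO = l.BUSINESS_UNIT
                       AND  r.PO_ID            = l.PO_ID
                       AND  r.LINE_NBR         = l.LINE_NBR
                       AND  r.FISCAL_YEAR      = '2014'
                       AND  r.ACCOUNTING_PERIOD BETWEEN '9' AND '11');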
    Erland Sommarskog, SQL Server MVP, [email protected]

  • BCP-style bulk insert from remote C++ ODBC Native client application

    I am trying to find documentation or sample code for performing bulk inserts into SQL Server 2012 from a remote client using the ODBC Native Client driver from Linux. We currently perform INSERT statements on blocks of data, wrapping them in BEGIN/COMMIT, and achieve roughly half the throughput of bcp reading from a delimited text file. While there are many web pages talking about bulk inserts via the native driver, this page (http://technet.microsoft.com/en-us/library/ms130792.aspx) seems closest to what I'm after, but it doesn't go into any detail or give API calls. The referenced header file is just a bunch of options and constants, so presumably one gains access to the bulk functions via the standard ODBC mechanism; the question is how.
    For clarity, I am NOT interested in:
    BULK INSERT: because it requires a server-side data file or a UNC path with appropriate permissions (doesn't work from Linux)
    INSERT ... SELECT * FROM OPENROWSET(BULK...): same problem as above
    IRowsetFastload: OLEDB, but I need ODBC on Linux.
    Basically, I want to emulate BCP.  I don't want to *run* BCP because it requires landing data to disk. 
    Thanks
    john
    John Lilley Chief Architect RedPoint Global Inc.

    Other than block inserts within BEGIN/COMMIT transaction blocks or running bcp, is there anything else that can be done on Linux?
    No other option from Linux that I am aware of. The SQL Server Native Client ODBC driver also supports table-valued parameters, which can be used to stream data, but the Linux ODBC driver API doesn't have a way to do that either. That said, I would still expect file-based BCP to significantly outperform inserts with large batches. I've seen a rate of 100K/sec. with this technique, including the file-creation overhead, but much depends on the particulars of your use case.
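    For reference, a minimal T-SQL sketch of the server-side half of a table-valued parameter; the type, procedure, and column names here are hypothetical:
      -- table type that the client binds an array of rows against
      CREATE TYPE dbo.OrderLineType AS TABLE
      (
          OrderId int   NOT NULL,
          LineNbr int   NOT NULL,
          Amount  money NOT NULL
      );
      GO
      CREATE PROCEDURE dbo.InsertOrderLines
          @Lines dbo.OrderLineType READONLY   -- TVP parameters must be READONLY
      AS
          INSERT INTO dbo.OrderLines (OrderId, LineNbr, Amount)
          SELECT OrderId, LineNbr, Amount FROM @Lines;
      GO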
    Consider voting for this on Connect.  BCP is on the roadmap but no date yet: 
    https://connect.microsoft.com/SQLServer/SearchResults.aspx?SearchQuery=linux+odbc+bcp
    Also, I filed a Connect item for TVP support:
    https://connect.microsoft.com/SQLServer/feedback/details/874616/add-tvp-support-to-sql-server-odbc-driver-for-linux
    Dan Guzman, SQL Server MVP, http://www.dbdelta.com

  • Using Bulk operations for INSERT into destination table and delete from src

    Hi,
    Is there any way to expedite the process of data movement?
    I have a source set of tables (with its pk-fk relations) and a destination set of tables.
    Currently my code picks up a single record in a cursor from the parent-most table and then moves the data from the other respective tables. But this is happening one by one... Is there any way I can make this take less time?
    If I use bulk insert and collections, I will not be able to use the DELETE in the same block for the same source record.
    Thanks
    Regards
    Abhivyakti

    Abhivyakti
    I'm not 100% sure how your code flows from what you've stated, but generally you should try to avoid cursor FOR loops and possibly BULK COLLECTing.
    I always follow this sequence in terms of design:
    1. Attempt to use bulk INSERTs, UPDATEs and/or DELETEs first and foremost (include MERGE as well!).
    2. If you cannot possibly do the above, then use BULK COLLECTIONS with a combination of RETURNING INTO and FORALL.
    However, before you follow this method, if you are relatively new to Oracle PL/SQL, share the reason you cannot follow the first method on this forum, and you're bound to find some help with sticking to method one!
    3. If method two is impossible, and there would have to be a seriously good reason for this, then follow the cursor FOR LOOP method.
    You can combine BULK COLLECTs with UPDATEs and DELETEs, but not with INSERTs.
    For examples, see these related threads: "bulk collect into after insert?", "Another simple example of BULK COLLECTING", and "Re: Reading multiple table type objects returned".
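    A minimal PL/SQL sketch of method two for this insert-then-delete case (table and column names are hypothetical): delete from the source with RETURNING BULK COLLECT, then FORALL-insert the captured rows into the destination.
      DECLARE
        TYPE t_ids   IS TABLE OF src_tab.id%TYPE;
        TYPE t_names IS TABLE OF src_tab.name%TYPE;
        v_ids   t_ids;
        v_names t_names;
      BEGIN
        -- delete the source rows and capture their values in one pass
        DELETE FROM src_tab
        RETURNING id, name BULK COLLECT INTO v_ids, v_names;

        -- bulk insert the captured rows into the destination
        FORALL i IN 1 .. v_ids.COUNT
          INSERT INTO dest_tab (id, name) VALUES (v_ids(i), v_names(i));

        COMMIT;
      END;
      /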
    P;

  • Cannot fetch a row from OLE DB provider "BULK" with bulk insert task

    Hi, folks:
    I created a simple SSIS package. On the Control Flow, I created a Bulk Insert Task with a destination connection to the local SQL Server and a CSV file from a local folder, and specified the comma delimiter. Then I executed the task and got this long error message.
    [Bulk Insert Task] Error: An error occurred with the following error message: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.The bulk load failed. The column is too long in the data file for row 1, column 1. Verify that the field terminator and row terminator are specified correctly.".

    I got the same error with some additional error details (below). All I had to do to fix the problem was set the Timeout property of the SQL Server Destination to 0.
    I was using the following components:
    SQL Server 2008
    SQL Server Integration Services 10.0
    Data Flow Task
    OLE DB Source – connecting to Oracle 11i
    SQL Server Destination – connecting to the local SQL Server 2008 instance
    Full Error Message:
    Error: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040E14.
    An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 10.0"  Hresult: 0x80040E14  Description: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".".
    An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 10.0"  Hresult: 0x80040E14  Description: "The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.".
    An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 10.0"  Hresult: 0x80040E14  Description: "The Bulk Insert operation of SQL Server Destination has timed out. Please consider increasing the value of Timeout property on the SQL Server Destination in the dataflow.".
    For SQL Server 2005 there is a hot fix available from Microsoft at http://support.microsoft.com/default.aspx/kb/937545

  • Massive problem with remote destination

    hello all,
    we have massive problems with our remote destinations.
    After I create the remote destination profile and add the line, I create the remote destination for my mobile phone.
    All is OK so far, but once I associate my mobile phone with the line via Line Association, I am no longer able to reach many numbers in my company.
    There is no ringing tone; there is nothing, and after 20-30 seconds I get a busy tone.
    If I delete the Line Association I can reach the numbers normally.
    Perhaps you have an idea. I don't have any anymore.

    Finally we found the problem.
    It was in the CUCM service parameters.
    We changed "Matching Caller ID with Remote Destination" to "Partial Match"; after that everything works fine.

  • Need to increase performance: bulk collect in cursor with limit, and in the for loop inserting into the trigger table

    Hi all,
    I have a performance issue in the code below, where I am trying to insert the data from table_stg into the target_tab and parent_tab tables and then into child tables via a cursor with BULK COLLECT. target_tab and parent_tab are huge tables and have a row-level trigger enabled on them; the trigger is mandatory. This block takes 5000 seconds to execute. My requirement is to reduce that to 5 to 10 minutes.
    Can someone please guide me here? It's a bit urgent. Awaiting your response.
    declare
    vmax_Value NUMBER(5);
      vcnt number(10);
      id_val number(20);
      pc_id number(15);
      vtable_nm VARCHAR2(100);
      vstep_no  VARCHAR2(10);
      vsql_code VARCHAR2(10);
      vsql_errm varchar2(200);
      vtarget_starttime timestamp;
      limit_in number :=10000;
      idx           number(10);
              cursor stg_cursor is
             select
                   DESCRIPTION,
                   SORT_CODE,
                   ACCOUNT_NUMBER,
                     to_number(to_char(CORRESPONDENCE_DATE,'DD')) crr_day,
                     to_char(CORRESPONDENCE_DATE,'MONTH') crr_month,
                     to_number(substr(to_char(CORRESPONDENCE_DATE,'DD-MON-YYYY'),8,4)) crr_year,
                   PARTY_ID,
                   GUID,
                   PAPERLESS_REF_IND,
                   PRODUCT_TYPE,
                   PRODUCT_BRAND,
                   PRODUCT_HELD_ID,
                   NOTIFICATION_PREF,
                   UNREAD_CORRES_PERIOD,
                   EMAIL_ID,
                   MOBILE_NUMBER,
                   TITLE,
                   SURNAME,
                   POSTCODE,
                   EVENT_TYPE,
                   PRIORITY_IND,
                   SUBJECT,
                   EXT_PRD_ID_TX,
                   EXT_PRD_HLD_ID_TX,
                   EXT_SYS_ID,
                   EXT_PTY_ID_TX,
                   ACCOUNT_TYPE_CD,
                   COM_PFR_TYP_TX,
                   COM_PFR_OPT_TX,
                   COM_PFR_RSN_CD
             from  table_stg;
    type rec_type is table of stg_rec_type index by pls_integer;
    v_rt_all_cols rec_type;
    BEGIN
      vstep_no   := '0';
      vmax_value := 0;
      vtarget_starttime := systimestamp;
      id_val    := 0;
      pc_id     := 0;
      success_flag := 0;
              vstep_no  := '1';
              vtable_nm := 'before cursor';
        OPEN stg_cursor;
              vstep_no  := '2';
              vtable_nm := 'After cursor';
       LOOP
              vstep_no  := '3';
              vtable_nm := 'before fetch';
    --loop
        FETCH stg_cursor BULK COLLECT INTO v_rt_all_cols LIMIT limit_in;
                  vstep_no  := '4';
                  vtable_nm := 'after fetch';
    --EXIT WHEN v_rt_all_cols.COUNT = 0;
        EXIT WHEN stg_cursor%NOTFOUND;
    FOR i IN 1 .. v_rt_all_cols.COUNT
      LOOP
       dbms_output.put_line(upper(v_rt_all_cols(i).event_type));
        if (upper(v_rt_all_cols(i).event_type) = upper('System_enforced')) then
                  vstep_no  := '4.1';
                  vtable_nm := 'before seq sel';
              select PC_SEQ.nextval into pc_id from dual;
                  vstep_no  := '4.2';
                  vtable_nm := 'before insert corres';
              INSERT INTO target1_tab
                           (ID,
                            PARTY_ID,
                            PRODUCT_BRAND,
                            SORT_CODE,
                            ACCOUNT_NUMBER,
                            EXT_PRD_ID_TX,         
                            EXT_PRD_HLD_ID_TX,
                            EXT_SYS_ID,
                            EXT_PTY_ID_TX,
                            ACCOUNT_TYPE_CD,
                            COM_PFR_TYP_TX,
                            COM_PFR_OPT_TX,
                            COM_PFR_RSN_CD,
                            status)
             VALUES
                            (pc_id,
                             v_rt_all_cols(i).party_id,
                             decode(v_rt_all_cols(i).product_brand,'LTB',2,'HLX',1,'HAL',1,'BOS',3,'VER',4,0),
                             v_rt_all_cols(i).sort_code,
                             'XXXX'||substr(trim(v_rt_all_cols(i).ACCOUNT_NUMBER),length(trim(v_rt_all_cols(i).ACCOUNT_NUMBER))-3,4),
                             v_rt_all_cols(i).EXT_PRD_ID_TX,
                             v_rt_all_cols(i).EXT_PRD_HLD_ID_TX,
                             v_rt_all_cols(i).EXT_SYS_ID,
                             v_rt_all_cols(i).EXT_PTY_ID_TX,
                             v_rt_all_cols(i).ACCOUNT_TYPE_CD,
                             v_rt_all_cols(i).COM_PFR_TYP_TX,
                             v_rt_all_cols(i).COM_PFR_OPT_TX,
                             v_rt_all_cols(i).COM_PFR_RSN_CD,
                             NULL);
                  vstep_no  := '4.3';
                  vtable_nm := 'after insert corres';
        else
              select COM_SEQ.nextval into id_val from dual;
                  vstep_no  := '6';
                  vtable_nm := 'before insertcomm';
          if (upper(v_rt_all_cols(i).event_type) = upper('REMINDER')) then
                vstep_no  := '6.01';
                  vtable_nm := 'after if insertcomm';
              insert into parent_tab
                 (ID ,
                 CTEM_CODE,
                 CHA_CODE,            
                 CT_CODE,                           
                 CONTACT_POINT_ID,             
                 SOURCE,
                 RECEIVED_DATE,                             
                 SEND_DATE,
                 RETRY_COUNT)
              values
                 (id_val,
                  lower(v_rt_all_cols(i).event_type), 
                  decode(v_rt_all_cols(i).product_brand,'LTB',2,'HLX',1,'HAL',1,'BOS',3,'VER',4,0),
                  'Email',
                  v_rt_all_cols(i).email_id,
                  'IADAREMINDER',
                  systimestamp,
                  systimestamp,
                  0);  
         else
                vstep_no  := '6.02';
                  vtable_nm := 'after else insertcomm';
              insert into parent_tab
                 (ID ,
                 CTEM_CODE,
                 CHA_CODE,            
                 CT_CODE,                           
                 CONTACT_POINT_ID,             
                 SOURCE,
                 RECEIVED_DATE,                             
                 SEND_DATE,
                 RETRY_COUNT)
              values
                 (id_val,
                  lower(v_rt_all_cols(i).event_type), 
                  decode(v_rt_all_cols(i).product_brand,'LTB',2,'HLX',1,'HAL',1,'BOS',3,'VER',4,0),
                  'Email',
                  v_rt_all_cols(i).email_id,
                  'CORRESPONDENCE',
                  systimestamp,
                  systimestamp,
                  0); 
            END if; 
                  vstep_no  := '6.11';
                  vtable_nm := 'before chop';
             if (v_rt_all_cols(i).ACCOUNT_NUMBER is not null) then 
                      v_rt_all_cols(i).ACCOUNT_NUMBER := 'XXXX'||substr(trim(v_rt_all_cols(i).ACCOUNT_NUMBER),length(trim(v_rt_all_cols(i).ACCOUNT_NUMBER))-3,4);
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE)
              values
                (id_val,
                 'IB.Correspondence.AccountNumberMasked',
                 v_rt_all_cols(i).ACCOUNT_NUMBER);
             end if;
                  vstep_no  := '6.1';
                  vtable_nm := 'before stateday';
             if (v_rt_all_cols(i).crr_day is not null) then 
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE)
              values
                (id_val,
                 --'IB.Correspondence.Date.Day',
                 'IB.Crsp.Date.Day',
                 v_rt_all_cols(i).crr_day);
             end if;
                  vstep_no  := '6.2';
                  vtable_nm := 'before statemth';
             if (v_rt_all_cols(i).crr_month is not null) then 
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE)
              values
                (id_val,
                 --'IB.Correspondence.Date.Month',
                 'IB.Crsp.Date.Month',
                 v_rt_all_cols(i).crr_month);
             end if;
                  vstep_no  := '6.3';
                  vtable_nm := 'before stateyear';
             if (v_rt_all_cols(i).crr_year is not null) then 
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE)
              values
                (id_val,
                 --'IB.Correspondence.Date.Year',
                 'IB.Crsp.Date.Year',
                 v_rt_all_cols(i).crr_year);
             end if;
                  vstep_no  := '7';
                  vtable_nm := 'before type';
               if (v_rt_all_cols(i).product_type is not null) then
                  insert into child_tab
                     (COM_ID,                                            
                     KEY,                                                                                                                                        
                     VALUE)
                  values
                    (id_val,
                     'IB.Product.ProductName',
                   v_rt_all_cols(i).product_type);
                end if;
                  vstep_no  := '9';
                  vtable_nm := 'before title';         
              if (trim(v_rt_all_cols(i).title) is not null) then
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE )
              values
                (id_val,
                 'IB.Customer.Title',
                 trim(v_rt_all_cols(i).title));
              end if;
                  vstep_no  := '10';
                  vtable_nm := 'before surname';
              if (v_rt_all_cols(i).surname is not null) then
                insert into child_tab
                   (COM_ID,                                            
                   KEY,                                                                                                                                          
                   VALUE)
                values
                  (id_val,
                  'IB.Customer.LastName',
                  v_rt_all_cols(i).surname);
              end if;
                            vstep_no  := '12';
                            vtable_nm := 'before postcd';
              if (trim(v_rt_all_cols(i).POSTCODE) is not null) then
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE)                              
               values
                (id_val,
                 'IB.Customer.Addr.PostCodeMasked',
                  substr(replace(v_rt_all_cols(i).POSTCODE,' ',''),length(replace(v_rt_all_cols(i).POSTCODE,' ',''))-2,3));
              end if;
                            vstep_no  := '13';
                            vtable_nm := 'before subject';
              if (trim(v_rt_all_cols(i).SUBJECT) is not null) then
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE)                              
               values
                (id_val,
                 'IB.Correspondence.Subject',
                  v_rt_all_cols(i).subject);
              end if;
                            vstep_no  := '14';
                            vtable_nm := 'before inactivity';
              if (trim(v_rt_all_cols(i).UNREAD_CORRES_PERIOD) is null or
                  trim(v_rt_all_cols(i).UNREAD_CORRES_PERIOD) = '3' or
                  trim(v_rt_all_cols(i).UNREAD_CORRES_PERIOD) = '6' or
                  trim(v_rt_all_cols(i).UNREAD_CORRES_PERIOD) = '9') then
              insert into child_tab
                 (COM_ID,                                            
                 KEY,                                                                                                                                            
                 VALUE)                              
               values
                (id_val,
                 'IB.Correspondence.Inactivity',
                  v_rt_all_cols(i).UNREAD_CORRES_PERIOD);
              end if;
                          vstep_no  := '14.1';
                          vtable_nm := 'after notfound';
        end if;
                          vstep_no  := '15';
                          vtable_nm := 'after notfound';
        END LOOP;
        end loop;
                          vstep_no  := '16';
                          vtable_nm := 'before closecur';
        CLOSE stg_cursor;
                          vstep_no  := '17';
                          vtable_nm := 'before commit';
        DELETE FROM table_stg;
      COMMIT;
                          vstep_no  := '18';
                          vtable_nm := 'after commit';
    EXCEPTION
    WHEN OTHERS THEN
      ROLLBACK;
      success_flag := 1;
      vsql_code := SQLCODE;
      vsql_errm := SUBSTR(sqlerrm,1,200);
      error_logging_pkg.inserterrorlog('samp',vsql_code,vsql_errm, vtable_nm,vstep_no);
      RAISE_APPLICATION_ERROR (-20011, 'samp '||vstep_no||' SQLERRM:'||SQLERRM);
    end;
    Thanks

    It's a bit urgent
    NO - it is NOT urgent. Not to us.
    If you have an urgent problem you need to hire a consultant.
    I have a performance issue in the code below,
    Maybe you do and maybe you don't. How are we to really know? You haven't posted ANYTHING indicating that a performance issue exists. Please read the FAQ for how to post a tuning request and the info you need to provide. First and foremost you have to post SOMETHING that actually shows that a performance issue exists. Troubleshooting requires FACTS not just a subjective opinion.
    where I am trying to insert the data from table_stg into the target_tab and parent_tab tables and then into child tables via a cursor with BULK COLLECT. target_tab and parent_tab are huge tables and have a row-level trigger enabled on them; the trigger is mandatory. This block takes 5000 seconds to execute. My requirement is to reduce that to 5 to 10 minutes.
    Personally I think 5000 seconds (about 1 hr 20 minutes) is very fast for processing 800 trillion rows of data into parent and child tables. Why do you think that is slow?
    Your code has several major flaws that need to be corrected before you can even determine what, if anything, needs to be tuned.
    This code has the EXIT statement at the beginning of the loop instead of at the end
        FETCH stg_cursor BULK COLLECT INTO v_rt_all_cols LIMIT limit_in;
                  vstep_no  := '4';
                  vtable_nm := 'after fetch';
    --EXIT WHEN v_rt_all_cols.COUNT = 0;
        EXIT WHEN stg_cursor%NOTFOUND;
    The correct place for the %NOTFOUND test when using BULK COLLECT is at the END of the loop; that is, the last statement in the loop.
    You can use a COUNT test at the start of the loop, but ironically you have commented it out and have now done it wrong. Either move the %NOTFOUND test to the end of the loop, or remove it and uncomment the COUNT test.
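    For illustration, a minimal sketch of the two valid exit patterns with BULK COLLECT ... LIMIT, using the variable names from the posted code:
      LOOP
        FETCH stg_cursor BULK COLLECT INTO v_rt_all_cols LIMIT limit_in;
        EXIT WHEN v_rt_all_cols.COUNT = 0;  -- option 1: COUNT test right after the fetch
        FOR i IN 1 .. v_rt_all_cols.COUNT LOOP
          NULL;  -- process v_rt_all_cols(i) here
        END LOOP;
        -- option 2: instead of the COUNT test above, make this the LAST statement:
        -- EXIT WHEN stg_cursor%NOTFOUND;
      END LOOP;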
    WHEN OTHERS THEN
      ROLLBACK;
    That basically says you don't even care what problem occurs or whether the problem is for a single record of your 10,000 in the collection. You pretty much just throw away any stack trace and substitute your own message.
    Your code also has NO exception handling for any of the individual steps or blocks of code.
    The code you posted also begs the question of why you are using NAME=VALUE pairs for child data rows? Why aren't you using a standard relational table for this data?
    As others have noted, you are using slow-by-slow (row-by-row) processing. Let's assume that PL/SQL, the bulk collect and row-by-row processing are actually necessary.
    Then you should be constructing the parent and child records into collections and then inserting them in BULK using FORALL.
    1. Create a collection for the new parent rows
    2. Create a collection for the new child rows
    3. For each set of LIMIT source row data
      a. empty the parent and child collections
      b. populate those collections with new parent/child data
      c. bulk insert the parent collection into the parent table
      d. bulk insert the child collection into the child table
    And unless you really want to either load EVERYTHING or abandon everything you should use bulk exception handling so that the clean data gets processed and only the dirty data gets rejected.
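    A minimal sketch of steps c and d with bulk exception handling (SAVE EXCEPTIONS); the collections v_ids and v_codes are hypothetical, assumed populated in step b:
      BEGIN
        FORALL i IN 1 .. v_ids.COUNT SAVE EXCEPTIONS
          INSERT INTO parent_tab (id, ctem_code)
          VALUES (v_ids(i), v_codes(i));
      EXCEPTION
        WHEN OTHERS THEN
          IF SQLCODE = -24381 THEN   -- ORA-24381: error(s) in array DML
            FOR j IN 1 .. SQL%BULK_EXCEPTIONS.COUNT LOOP
              DBMS_OUTPUT.put_line('Bad row #'
                || SQL%BULK_EXCEPTIONS(j).ERROR_INDEX
                || ', error ' || SQL%BULK_EXCEPTIONS(j).ERROR_CODE);
            END LOOP;
          ELSE
            RAISE;
          END IF;
      END;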

  • SQL Server 2012 Express bulk insert flat file, 1 million rows with "" as delimiter

    Hi,
    I wanted to see if anyone can help me out. I am on SQL Server 2012 Express. I cannot use OPENROWSET because my system is 64-bit and my Microsoft Office suite is 32-bit (Microsoft.Jet.OLEDB.4.0).
    So I used the Import wizard, and that is not working either.
    The only thing that lets me import this large file is:
    CREATE TABLE #LOADLARGEFLATFILE
    (
    Column1 varchar(100),
    Column2 varchar(100),
    Column3 varchar(100),
    Column4 nvarchar(max)
    )
    BULK INSERT #LOADLARGEFLATFILE
    FROM 'C:\FolderBigFile\LARGEFLATFILE.txt'
    WITH
    (
    FIRSTROW = 2,
    FIELDTERMINATOR = '\t',
    ROWTERMINATOR = '\n'
    )
    The problem with CREATE TABLE and BULK INSERT is that my flat file comes with text qualifiers: "". Is there a way to prevent the quotes from being loaded during the bulk insert? Below is the data.
    Column1             Column2   Column3   Column4
    "Socket Adapter"    8456AB    $4.25     "Item - Square Drive Socket Adapter | For "
    "Butt Splice"       9586CB    $14.51    "Item - Butt Splice"
    "Bleach"            6589TE    $27.30    "Item - Bleach | Size - 96 oz. | Container Type"
    Ed,
    Edwin Lopera
    Ed,
    Edwin Lopera

    Hi lgnusLumen,
    According to your description, you use BULK INSERT to import data from a data file into a SQL table. However, to be usable as a data file for bulk import, a CSV file must comply with the following restrictions:
    1. Data fields never contain the field terminator.
    2. Either none or all of the values in a data field are enclosed in quotation marks ("").
    In your data file the quotes aren't consistent. If you want to prevent the quotes from being loaded during the bulk insert, I recommend the SQL Server Import and Export Wizard, which is available in SQL Server Express; it allows you to strip the double quotes from columns.
    In other SQL Server versions, we can use SQL Server Integration Services (SSIS) to import data from a flat file (.csv) while removing the double quotes. For more information, review the following article.
    http://www.mssqltips.com/sqlservertip/1316/strip-double-quotes-from-an-import-file-in-integration-services-ssis/
    In addition, you can create a function to convert the CSV to a format usable by BULK INSERT. It replaces all field-delimiting commas with a new delimiter, and you can then use the new field delimiter instead of a comma. For more information, see:
    http://stackoverflow.com/questions/782353/sql-server-bulk-insert-of-csv-file-with-inconsistent-quotes
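    As another rough workaround (a staging-table variant not spelled out above): bulk load the rows as-is, then strip the quotes with REPLACE. This sketch reuses the table layout and file path from the question; dbo.TargetTable is a hypothetical final table, and note that REPLACE removes every double quote, including embedded ones.
      CREATE TABLE #staging
      (
      Column1 varchar(100),
      Column2 varchar(100),
      Column3 varchar(100),
      Column4 nvarchar(max)
      )

      BULK INSERT #staging
      FROM 'C:\FolderBigFile\LARGEFLATFILE.txt'
      WITH (FIRSTROW = 2, FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')

      -- strip all double quotes while copying to the real target
      INSERT INTO dbo.TargetTable (Column1, Column2, Column3, Column4)
      SELECT REPLACE(Column1, '"', ''), Column2, Column3, REPLACE(Column4, '"', '')
      FROM #staging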
    Regards,
    Sofiya Li
    TechNet Community Support

  • Blob truncated with DbFactory and Bulk insert

    Hi,
    My platform is Microsoft Windows Server 2003 R2 (Server 5.2) Service Pack 2, 64-bit, with an Oracle Database 11g (11.1.0.6.0).
    I use the Oracle 11g ODAC 11.1.0.7.20 client.
    Some strange behavior appears when using DbFactory and a bulk command with a BLOB column and a parameter larger than 65,536 bytes. Let me explain.
    First I create a dummy table in my schema:
    create table dummy (a number, b blob)
    To use bulk insert we can use code A with Oracle objects (it executes successfully):
    byte[] b1 = new byte[65530];
    byte[] b2 = new byte[65540];
    Oracle.DataAccess.Client.OracleConnection conn = new Oracle.DataAccess.Client.OracleConnection("User Id=login;Password=pws;Data Source=orcl;");
    OracleCommand cmd = new OracleCommand("insert into dummy values (:p1,:p2)", conn);
    cmd.ArrayBindCount = 2;
    OracleParameter p1 = new OracleParameter("p1", OracleDbType.Int32);
    p1.Direction = ParameterDirection.Input;
    p1.Value = new int[] { 1, 2 };
    cmd.Parameters.Add(p1);
    OracleParameter p2 = new OracleParameter("p2", OracleDbType.Blob);
    p2.Direction = ParameterDirection.Input;
    p2.Value = new byte[][] { b1, b2 };
    cmd.Parameters.Add(p2);
    conn.Open(); cmd.ExecuteNonQuery(); conn.Close();
    We can write the same thing at a more abstract level using DbProviderFactories (code B):
    var factory = DbProviderFactories.GetFactory("Oracle.DataAccess.Client");
    DbConnection conn = factory.CreateConnection();
    conn.ConnectionString = "User Id=login;Password=pws;Data Source=orcl;";
    DbCommand cmd = conn.CreateCommand();
    cmd.CommandText = "insert into dummy values (:p1,:p2)";
    ((OracleCommand)cmd).ArrayBindCount = 2;
    DbParameter param = cmd.CreateParameter();
    param.ParameterName = "p1";
    param.DbType = DbType.Int32;
    param.Value = new int[] { 3, 4 };
    cmd.Parameters.Add(param);
    DbParameter param2 = cmd.CreateParameter();
    param2.ParameterName = "p2";
    param2.DbType = DbType.Binary;
    param2.Value = new byte[][] { b1, b2 };
    cmd.Parameters.Add(param2);
    conn.Open(); cmd.ExecuteNonQuery(); conn.Close();
    But this second code doesn't work: the second byte array is truncated to 4 bytes. It looks like an Int16 limit is being exceeded.
    When DbType.Binary is used, Oracle maps it to OracleDbType.Raw, not OracleDbType.Blob, so the problem seems to be with the raw type. BUT if we use the same code without bulk insert, it works! So the problem is somewhere else...
    Why use a DbConnection? To be able to switch easily to another database type.
    So why use "((OracleCommand)cmd).ArrayBindCount"? To be able to use specific functionality of each database.
    I can fix the issue by casting the DbParameter to OracleParameter and setting the OracleDbType to Blob, but why does the second code fail with bulk and work with a simple query?

    BCP and BULK INSERT do not work the way you expect them to. What they do is consume fields in a round-robin fashion. That is, they first look for data for the first field, then for the second field, and so on.
    So in your case, they will first read one byte, then 20 bytes, etc., until they have read the two bytes for field 122. At this point they will consume bytes until they have found a sequence of carriage return and line feed.
    You say that some records in the file are incomplete. Say that there are only 60 fields in such a record. Field 61 is four bytes. BCP and BULK INSERT will now read data for field 61 as CR+LF plus the first two bytes of the next row. CR+LF has no special meaning here; it is just data at this point.
    You will have to write a program to parse the file, or use SSIS. BCP and BULK INSERT are not your friends in this case.
    Erland Sommarskog, SQL Server MVP, [email protected]

  • Help with BULK Insert

    Hello SQL Buddies,
    I am trying to run a BULK insert.
    Here is my code:
    Code start:
    use Lagerliste
    drop table Import
    use Lagerliste
    create table Import
    (
    TOTALQTY float null,
    PARTNO nvarchar (255) null,
    [DESC] nvarchar (255) null,
    CO nvarchar (255) null,
    BIN nvarchar (255) null,
    PRICE float null,
    DISCOUNT nvarchar (255) null,
    LASTPURC nvarchar (255) null,
    AGECODE nvarchar (255) null,
    MANFPART nvarchar (255) null
    )
    use Lagerliste
    bulk insert Import
    from 'D:\FTP\RG\Stockfile.csv'
    with
    (
    formatfile = 'D:\FormatFile\test.xml',
    Errorfile = 'D:\FormatFile\error.txt'
    )
    Code stop..
    My format file code is here:
    Code start:
    <?xml version="1.0"?>
      <BCPFORMAT xmlns = "http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
        <RECORD>
          <FIELD ID="1"    xsi:type="CharTerm"        TERMINATOR= ',"'    />
          <FIELD ID="2"    xsi:type="CharTerm"    MAX_LENGTH="255"    TERMINATOR= '","' />
          <FIELD ID="3"    xsi:type="CharTerm"    MAX_LENGTH="255"    TERMINATOR= '","'    />
          <FIELD ID="4"    xsi:type="CharTerm"    MAX_LENGTH="255"    TERMINATOR= '","'    />
          <FIELD ID="5"    xsi:type="CharTerm"    MAX_LENGTH="255"    TERMINATOR= '",'    />
          <FIELD ID="6"    xsi:type="CharTerm"      TERMINATOR= ',"'    />
          <FIELD ID="7"    xsi:type="CharTerm"    MAX_LENGTH="255"    TERMINATOR= '","'    />
          <FIELD ID="8"    xsi:type="CharTerm"    MAX_LENGTH="255"    TERMINATOR= '","'    />
          <FIELD ID="9"    xsi:type="CharTerm"    MAX_LENGTH="255"    TERMINATOR= '","'    />
          <FIELD ID="10"    xsi:type="CharTerm"    MAX_LENGTH="255"    TERMINATOR= '"'        />
        </RECORD>
        <ROW>
          <COLUMN SOURCE="1" NAME="TOTALQTY"    xsi:type="SQLFLT8"/>
          <COLUMN SOURCE="2" NAME="PARTNO"    xsi:type="SQLNVARCHAR"/>
          <COLUMN SOURCE="3" NAME="DESC"        xsi:type="SQLNVARCHAR"/>
          <COLUMN SOURCE="4" NAME="CO"        xsi:type="SQLNVARCHAR"/>
          <COLUMN SOURCE="5" NAME="BIN"        xsi:type="SQLNVARCHAR"/>
          <COLUMN SOURCE="6" NAME="PRICE"        xsi:type="SQLFLT8"/>
          <COLUMN SOURCE="7" NAME="DISCOUNT"    xsi:type="SQLNVARCHAR"/>
          <COLUMN SOURCE="8" NAME="LASTPURC"    xsi:type="SQLNVARCHAR"/>
          <COLUMN SOURCE="9" NAME="AGECODE"    xsi:type="SQLNVARCHAR"/>
          <COLUMN SOURCE="10" NAME="MANFPART"    xsi:type="SQLNVARCHAR"/>
        </ROW>
      </BCPFORMAT>
    Code stop..
    If I run the code it says:
    Msg 4832, Level 16, State 1, Line 20
    Bulk load: An unexpected end of file was encountered in the data file.
    Msg 7399, Level 16, State 1, Line 20
    The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
    Msg 7330, Level 16, State 2, Line 20
    Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
    And the error file says:
    Row 473629 File Offset 42503226 ErrorFile Offset 0 - HRESULT 0x80004005
    If I then add LASTROW = '473629' to my bulk insert, it all works fine. So does that mean it gets an error at the last line, where there is no more data? This SQL query has to run every day, so I need it to not hit that error.
    Can someone help me?
    Looking forward to an answer; I hope someone has a solution.
    Regards Christian.

    Try the below format file,
    <?xml version="1.0"?>
    <BCPFORMAT xmlns = "http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <RECORD>
    <FIELD ID="1" xsi:type="CharTerm" TERMINATOR= ',"' />
    <FIELD ID="2" xsi:type="CharTerm" MAX_LENGTH="255" TERMINATOR= '","' />
    <FIELD ID="3" xsi:type="CharTerm" MAX_LENGTH="255" TERMINATOR= '","' />
    <FIELD ID="4" xsi:type="CharTerm" MAX_LENGTH="255" TERMINATOR= '","' />
    <FIELD ID="5" xsi:type="CharTerm" MAX_LENGTH="255" TERMINATOR= '",' />
    <FIELD ID="6" xsi:type="CharTerm" TERMINATOR= ',"' />
    <FIELD ID="7" xsi:type="CharTerm" MAX_LENGTH="255" TERMINATOR= '","' />
    <FIELD ID="8" xsi:type="CharTerm" MAX_LENGTH="255" TERMINATOR= '","' />
    <FIELD ID="9" xsi:type="CharTerm" MAX_LENGTH="255" TERMINATOR= '","' />
    <FIELD ID="10" xsi:type="CharTerm" MAX_LENGTH="255" TERMINATOR= '"\r\n' />
    </RECORD>
    <ROW>
    <COLUMN SOURCE="1" NAME="TOTALQTY" xsi:type="SQLFLT8"/>
    <COLUMN SOURCE="2" NAME="PARTNO" xsi:type="SQLNVARCHAR"/>
    <COLUMN SOURCE="3" NAME="DESC" xsi:type="SQLNVARCHAR"/>
    <COLUMN SOURCE="4" NAME="CO" xsi:type="SQLNVARCHAR"/>
    <COLUMN SOURCE="5" NAME="BIN" xsi:type="SQLNVARCHAR"/>
    <COLUMN SOURCE="6" NAME="PRICE" xsi:type="SQLFLT8"/>
    <COLUMN SOURCE="7" NAME="DISCOUNT" xsi:type="SQLNVARCHAR"/>
    <COLUMN SOURCE="8" NAME="LASTPURC" xsi:type="SQLNVARCHAR"/>
    <COLUMN SOURCE="9" NAME="AGECODE" xsi:type="SQLNVARCHAR"/>
    <COLUMN SOURCE="10" NAME="MANFPART" xsi:type="SQLNVARCHAR"/>
    </ROW>
    </BCPFORMAT>
    Refer
    http://stackoverflow.com/questions/8530353/sql-bulk-insert-xml-format-file-with-double-quotes-in-terminator
    http://technet.microsoft.com/en-us/library/ms191234(v=sql.105).aspx
    Regards, RSingh

  • Complex query - improve performance with nested arrays, bulk insert....?

    Hello, I have an extremely complicated query that has a structure similar to:
    Overall Query
    ---SubQueryA
    -------SubQueryB
    ---SubQueryB
    ---SubQueryC
    -------SubQueryA
    The subqueries themselves are slow, and having to run them multiple times is much too slow! Ideally, I would be able to run each subquery once and then reuse the results. I cannot use standard Oracle tables, and I would need to keep the results of the subqueries in memory.
    I was thinking I would write a PL/SQL script that ran the subqueries at the beginning and stored the results in memory. Then, in the overall query, I could loop through my results in memory and join the results of the various subqueries to one another.
    some questions:
    -what is the best data structure to use? I've been looking around, and there are nested arrays, and there's the bulk insert functionality, but I'm not sure which is best to use.
    -the advantage of the method I'm suggesting is that I only have to do each subquery once. But when I start joining the results of the subqueries to one another, will I take a performance hit? Will Oracle not be able to optimize the joins?
    thanks in advance!
    Coop

    "I cannot use standard oracle tables" - What does this mean? If you have subqueries, I assume you have tables to drive them. You're in an Oracle forum, so I assume the tables are Oracle tables.
    If so, you can look into the WITH clause: it can 'cache' the query results for you and reuse them multiple times, and it is also helpful in making large queries with many subqueries more readable.
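    For illustration, a minimal sketch of the WITH clause; the table and column names are hypothetical:
      WITH sub_a AS (
             SELECT dept_id, SUM(sal) AS total_sal
             FROM   emp
             GROUP  BY dept_id
           ),
           sub_b AS (
             SELECT dept_id FROM dept WHERE region = 'WEST'
           )
      SELECT a.dept_id, a.total_sal
      FROM   sub_a a
      JOIN   sub_b b ON b.dept_id = a.dept_id;
      -- sub_a and sub_b are each written once but can be referenced
      -- several times in the main query; the optimizer may materialize them.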

  • Bulk insertion of CTI Route points in CUCM

    Hi team,
    Is there any way we can insert a bulk number of CTI Route Points in Cisco Call Manager? I have seen the option for bulk insertion of phones, but I didn't find any option for creating CTI Route Points in bulk.
    Can someone please help me with this issue?
    Thanks,
    Sasikumar.

    Hi Gergely,
    Thank you for responding.
    --> The CUCM is on version 10.0.
    --> Actually, I am pretty new to CUCM, so I didn't understand what you mean by "look at AXL". Can you please elaborate?
    Thanks in advance,
    Sasikumar.

  • Create XML format file for bulk insert with a data file without delimiter

    Hello
    I have a data file with no delimiter, like below:
    0080970393102312072981103378000004329392643958
    0080970393102312072981103378000004329392643958
    I just know that, for example, the first 5 numbers in a line are the "ID of bank",
    and the 6th and 7th numbers in a line are the "ID of employee".
    Could you help me with how to create an XML format file?
    thanks a lot

    This is a fixed file format. We need to know the length of each field before creating the format file. Given that you said the first 5 characters are the Bank ID and the 6th to 7th the Employee ID, the XML should look like:
    <?xml version="1.0"?>
    <BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <RECORD>
      <FIELD ID="1" xsi:type="CharFixed" LENGTH="5"/>
      <FIELD ID="2" xsi:type="CharFixed" LENGTH="2"/>
      <FIELD ID="3" xsi:type="CharFixed" LENGTH="8"/>
      <FIELD ID="4" xsi:type="CharFixed" LENGTH="14"/>
      <FIELD ID="5" xsi:type="CharFixed" LENGTH="14"/>
      <FIELD ID="6" xsi:type="CharFixed" LENGTH="1"/>
    </RECORD>
    <ROW>
      <COLUMN SOURCE="1" NAME="c1" xsi:type="SQLNCHAR"/>
      <COLUMN SOURCE="2" NAME="c2" xsi:type="SQLNCHAR"/>
      <COLUMN SOURCE="3" NAME="c3" xsi:type="SQLCHAR"/>
      <COLUMN SOURCE="4" NAME="c4" xsi:type="SQLINT"/>
      <COLUMN SOURCE="5" NAME="c5" xsi:type="SQLINT"/>
    </ROW>
    </BCPFORMAT>
    Note: you need to specify the other lengths similarly.
    http://stackoverflow.com/questions/10708985/bulk-insert-from-fixed-format-text-file-ignores-rowterminator
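    For completeness, a hypothetical invocation of such a format file (the table name and paths are placeholders):
      BULK INSERT dbo.BankFile
      FROM 'C:\data\fixedwidth.txt'
      WITH (FORMATFILE = 'C:\data\fixedwidth.xml');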
    Regards, RSingh

  • Remote site with its own CUCM Sub #2 and router: network failed and the site didn't keep its phones active, need some assistance

    I have a Pub and 2 Subs. Sub #1 is with the Pub at the main site and is our TFTP server; Sub #2 is at a remote site with about 100 users, along with a router and 2 PRIs for outbound calls. We had a network failure between the main site and the remote site, and all phones lost their registration with the system until we were able to get the network back up. Currently the network is up in a crippled state on a single T-1 link while we troubleshoot the bigger issue with our 6 Mb pipe; however, Sub #2 and its associated router aren't talking with the Pub or the other Sub. I'm still getting alerts every 30 minutes stating the server is down. I'm sure once the network is corrected this will bring everything back online. My question is how I can prevent this in the future. I need this site to be standalone if the network goes down again. I was told by our vendor that if we had a subscriber at each site then we would need SRST licensing. I know something needs to be configured to make it all work; I'm just not sure what.

    I already have a group at the site, but it includes both the Pub and Sub from the main site as well as the Sub from the remote site. I would only have to remove the Sub from the main site. The problem is I currently have only 1 TFTP server, which runs on Sub #1; should I make Sub #2 a TFTP server as well? And the phones are set up for DHCP at the main site, so I'm going to need a DHCP server set up at the remote site as well, correct?

  • SSIS BULK INSERT using UNC inside of ForEach Loop Container failed: could not be opened. Operating system error code 5 (Access is denied.)

    Hi,
    I am trying to figure out how to fix my problem.
    Error: Could not be opened. Operating system error code 5 (Access is denied.)
    Process description:
    The target database server resides on a different server in the network.
    The SSIS package runs from a remote server.
    The SSIS package uses a ForEach Loop Container to loop through a directory to do the bulk insert.
    The SSIS package uses variables to specify the share location of the files as a UNC path like this:
    \\server\files
    The database service account the database runs under has full permission on the share drive where the files reside.
    The Execution Results tab shows the prepared SQL statement for the BULK INSERT, and I can run exactly the same bulk insert in SSMS without errors, from the database server and from the server where the SSIS package is executed.
    I am at a dead end, and I don't want to re-write the SSIS package to use a Data Flow Task because that is not flexible to update when the metadata of the table changes.
    The post below describes almost the same situation:
    https://social.msdn.microsoft.com/Forums/sqlserver/en-US/8de13e74-709a-43a5-8be2-034b764ca44f/problem-with-bulk-insert-task-in-foreach-loop?forum=sqlintegrationservices

    Interesting how I fixed the issue: adding the Application Name to the SQL OLAP connection string fixed it. I am not sure why SQL Server wasn't able to open the file remotely without this.
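    For illustration only, an OLE DB connection string with the Application Name property set might look like this (the server, database, and application name are placeholders):
      Provider=SQLNCLI10.1;Data Source=MYSERVER;Initial Catalog=MyDb;
      Integrated Security=SSPI;Application Name=MyBulkInsertPackage;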
