Doing a bulk insert

Hi,
I need recommendations for doing a bulk insert from a database into OBPM, creating a new instance (a new BPM object with attributes). I need to take the attribute values from the database, so I have to read each row, but I can't use a counter.
Thanks for your help.
:-D Have a nice day

Read Metalink document 199746.1.

Similar Messages

  • Unique constraint violation error while bulk insert (TimesTen 7.0.1.0.0)

    Hi,
    I am trying to understand why I get the error above when doing bulk inserts into an Oracle database via TimesTen.
    Single inserts of the same data afterwards work perfectly (no record is dropped), and bulk inserts done directly against Oracle with the same data also work perfectly. We tried it with AWT, then with read-only tables and passthrough=2; nothing works. The error occurs even when the table is empty.
    Any suggestions? Is this a bug in TimesTen itself? Do I have to connect directly to the Oracle database for bulk inserts?
    Thanks in advance,
    Rajko Albrecht

    It may be a TT bug, but if so it is not an obvious one, since bulk inserts definitely work in TimesTen...
    Can you please provide:
    1. The schema of the table in question (including any indices)
    2. Details on how you are doing the bulk inserts (C/ODBC program, Java/JDBC program or ...). Actual program source code would be helpful.
    3. A (small) example of the data that you know would give this error.
    Thanks,
    Chris

  • Exception while doing bulk insertion

    Hi,
    I am trying to do a bulk insert of records into a table using my application. I am using a prepared statement to achieve this. I am getting the following exception while doing the bulk insert.
    java.lang.NegativeArraySizeException
    I am using SQL Server driver version 2000.80.380.00 for this. The database type chosen is JDBC-ODBC.
    Your early response is appreciated.
    Regards
    Ramesh

    Looks like one of your arrays has a problem with its size, possibly a negative size!
    It could be a problem...
    somewhere...
    in your application...
    in the code...
    somewhere.
    Possibly at the line number indicated by the exception... just a wild guess!
    Thought about looking for it? That's what I'd do first.
    Or do you expect someone to say, "Ahhhhh, 2000.80.380.00, marvelous plumage, bugger with the bulk inserts"?

  • How can I debug a Bulk Insert error?

    I'm loading a bunch of files into SQL Server. All work fine, but one keeps erroring out on me. All the files should be exactly the same in structure - they have different dates and other financial metrics, but the structure and field names should be exactly the same. Nevertheless, one keeps conking out and throwing this error.
    Msg 4832, Level 16, State 1, Line 1
    Bulk load: An unexpected end of file was encountered in the data file.
    Msg 7399, Level 16, State 1, Line 1
    The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
    Msg 7330, Level 16, State 2, Line 1
    Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
    The ROWTERMINATOR should be CRLF, and when you look at the file in Notepad++ that's what it looks like, but it must be something else, because I keep getting errors here. I tried the good old ROWTERMINATOR='0x0a'.
    That works on all files but one, so there's something funky going on here, and I need to see what SQL Server is really doing.
    Is there some way to print out a log, or look at a log somewhere?
    Thanks!!
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

    The first thing to try is to see if BCP likes the file. BCP and BULK INSERT adhere to the same spec, but they are different implementations, and there are subtle differences.
    There is an ERRORFILE option, but it helps more when there is bad data.
    You can also use the BATCHSIZE option to see how many records in the file it swallows before things go bad. FIRSTROW and LASTROW can also help; see the sketch below.
    All in all, it can be quite tedious to find that single row where things are different - and where BULK INSERT loses sync entirely. Keep in mind that it reads fields one by one, and if there is one field terminator too few on a line, it will consume the line feed at the end of the line as data.
    Erland Sommarskog, SQL Server MVP, [email protected]
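    For what it's worth, a minimal sketch of those options together (the table and file names below are placeholders, not the poster's actual objects):
    BULK INSERT dbo.MyStagingTable
    FROM 'C:\load\problem_file.txt'
    WITH (
        FIELDTERMINATOR = '\t',
        ROWTERMINATOR = '0x0a',
        FIRSTROW = 1,          -- narrow the window row by row...
        LASTROW = 1000,        -- ...to bisect where the file loses sync
        BATCHSIZE = 100,       -- committed batches show how far the load got
        ERRORFILE = 'C:\load\problem_file.err'  -- rejected rows land here
    );
    Halving LASTROW (or raising FIRSTROW) on each run is a crude but effective way to home in on the first malformed row.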

  • [Forum FAQ] How to use multiple field terminators in BULK INSERT or BCP command line

    Introduction
    Some people want to know whether we can have multiple field terminators in BULK INSERT or BCP commands, and how to implement them.
    Solution
    For character data fields, optional terminating characters allow you to mark the end of each field in a data file with a field terminator, as well as the end of each row with a row terminator. If a terminator character occurs within the data, it is interpreted as a terminator, not as data, and the data after that character belongs to the next field or record. I have tested this; if you use BULK INSERT or BCP commands with multiple field terminators, you can refer to the following commands.
    In Windows command line,
    bcp <Databasename.schema.tablename> out "<path>" -c -t <field_terminator> -r <row_terminator> -T
    For example, you can export data from the Department table with bcp command and use the comma and colon (,:) as one field terminator.
    bcp AdventureWorks.HumanResources.Department out C:\myDepartment.txt -c -t ,: -r \n -T
    However, if you try to specify multiple separate field terminators, as in the following command, only the last terminator defined will be used by default.
    bcp AdventureWorks.HumanResources.Department in C:\myDepartment.txt -c -t , -r \n -t: -T
    Note that multiple field terminators imply multiple fields. Consider the comma-separated line below:
    column1,,column2,,,column3
    At first glance this looks like only 3 fields (column1, column2 and column3), but in fact, after testing, there are 6 fields here: every comma terminates a field, so the empty strings between consecutive commas count as fields too. That is the significance of a field terminator (a comma in this case).
    Meanwhile, when using BULK INSERT to import the data file into a SQL table, you can only specify a multi-character sequence as one terminator in the BULK INSERT statement.
    USE <testdatabase>;
    GO
    BULK INSERT <your_table> FROM '<path>'
    WITH (
        DATAFILETYPE = 'char | native | widechar | widenative',
        FIELDTERMINATOR = '<field_terminator>'
    );
    For example, to use BULK INSERT to import the C:\myDepartment.txt data file into the DepartmentTest table, the field terminator (,:) must be declared in the statement.
    In SQL Server Management Studio Query Editor:
    BULK INSERT AdventureWorks.HumanResources.DepartmentTest FROM 'C:\myDepartment.txt'
    WITH (
        DATAFILETYPE = 'char',
        FIELDTERMINATOR = ',:'
    );
    We cannot declare multiple separate field terminators (, and :) in the query statement, as in the following format; a duplicate-option error will occur.
    In SQL Server Management Studio Query Editor:
    BULK INSERT AdventureWorks.HumanResources.DepartmentTest FROM 'C:\myDepartment.txt'
    WITH (
        DATAFILETYPE = 'char',
        FIELDTERMINATOR = ',',
        FIELDTERMINATOR = ':'
    );
    However, if you want to use a data file with fewer or more fields than the table, you can handle it by setting the extra field's length to 0 (for fewer fields) or by omitting or skipping the extra fields (for more fields) during the bulk-copy procedure.
    More Information
    For more information about field terminators, you can review the following articles.
    http://technet.microsoft.com/en-us/library/aa196735(v=sql.80).aspx
    http://social.technet.microsoft.com/Forums/en-US/d2fa4b1e-3bd4-4379-bc30-389202a99ae2/multiple-field-terminators-in-bulk-insert-or-bcp?forum=sqlgetsta
    http://technet.microsoft.com/en-us/library/ms191485.aspx
    http://technet.microsoft.com/en-us/library/aa173858(v=sql.80).aspx
    http://technet.microsoft.com/en-us/library/aa173842(v=sql.80).aspx
    Applies to
    SQL Server 2012
    SQL Server 2008 R2
    SQL Server 2005
    SQL Server 2000

    Thanks,
    Is this a supported scenario, or does it use unsupported features?
    For example, can we call exec [ReportServer].dbo.AddEvent @EventType='TimedSubscription', @EventData='b64ce7ec-d598-45cd-bbc2-ea202e0c129d'
    in a supported way?
    Thanks! Josh

  • Bulk inserts on Solaris slow as compared to Windows

    Hi Experts,
    Looking for tips on troubleshooting bulk inserts on Solaris. I have observed that the same bulk inserts are quite fast on Windows compared to Solaris. Are there known issues on Solaris?
    This is the statement:
    I have a 'merge...insert' query which has been executing for a long time, more than 12 hours now:
    merge into A DEST using (select * from B SRC) SRC on (SRC.some_ID = DEST.some_ID) when matched then update ... when not matched then insert (...) values (...)
    Table A has 600K rows with a unique-identifier column some_ID; Table B has 500K rows with the same some_ID column. The 'merge...insert' checks whether the some_ID exists: if yes, the update fires; when not matched, the insert fires. In either case it takes a long time to execute.
    Environment:
    The version of the database is 10g Standard 10.2.0.3.0 - 64bit Production
    OS: Solaris 10, SPARC-Enterprise-T5120
    These are the parameters relevant to the optimizer:
    SQL>
    SQL> show parameter sga_target
    NAME                                 TYPE                VALUE
    sga_target                           big integer           4G
    SQL>
    SQL>  show parameter optimizer
    NAME                                        TYPE        VALUE
    optimizer_dynamic_sampling       integer        2
    optimizer_features_enable          string         10.2.0.3
    optimizer_index_caching             integer        0
    optimizer_index_cost_adj            integer       100
    optimizer_mode                          string         ALL_ROWS
    optimizer_secure_view_merging   boolean     TRUE
    SQL>
    SQL> show parameter db_file_multi
    NAME                                             TYPE        VALUE
    db_file_multiblock_read_count        integer     16
    SQL>
    SQL> show parameter db_block_size
    NAME                                        TYPE        VALUE
    db_block_size                           integer     8192
    SQL>
    SQL> show parameter cursor_sharing
    NAME                                 TYPE        VALUE
    cursor_sharing                    string      EXACT
    SQL>
    SQL> column sname format a20
    SQL> column pname format a20
    SQL> column pval2 format a20
    SQL>
    SQL> select sname, pname, pval1, pval2 from sys.aux_stats$;
    SNAME                PNAME                     PVAL1               PVAL2
    SYSSTATS_INFO        STATUS                                    COMPLETED
    SYSSTATS_INFO        DSTART                                    07-12-2005 07:13
    SYSSTATS_INFO        DSTOP                                      07-12-2005 07:13
    SYSSTATS_INFO        FLAGS                  1
    SYSSTATS_MAIN        CPUSPEEDNW       452.727273
    SYSSTATS_MAIN        IOSEEKTIM           10
    SYSSTATS_MAIN        IOTFRSPEED         4096
    SYSSTATS_MAIN        SREADTIM
    SYSSTATS_MAIN        MREADTIM
    SYSSTATS_MAIN        CPUSPEED
    SYSSTATS_MAIN        MBRC
    SYSSTATS_MAIN        MAXTHR
    SYSSTATS_MAIN        SLAVETHR
    13 rows selected.
    Following are the error messages being pushed into the Oracle alert log file:
    Thu Dec 10 01:41:13 2009
    Thread 1 advanced to log sequence 1991
      Current log# 1 seq# 1991 mem# 0: /oracle/oradata/orainstance/redo01.log
    Thu Dec 10 04:51:01 2009
    Thread 1 advanced to log sequence 1992
      Current log# 2 seq# 1992 mem# 0: /oracle/oradata/orainstance/redo02.log
    Please provide some tips to troubleshoot the actual issue. Any pointers on db_block_size, SGA, or PGA that could be the reason for this?
    Regards,
    neuron

    SID    SEQ#    EVENT                        WAIT_CLASS_ID  WAIT_CLASS#  WAIT_TIME  SECONDS_IN_WAIT  STATE
    125    24235   'db file sequential read'    1740759767     8            -1         58608            'WAITED SHORT TIME'
    Regarding the disk, I am not sure what needs to be checked; however, from the output of iostat it does not seem to be busy. Check the last three rows - the %b column is negligible:
    tty         cpu
    tin tout  us sy wt id
       0  320   3  0  0 97
                        extended device statistics
        r/s    w/s   kr/s   kw/s wait actv wsvc_t asvc_t  %w  %b device
        0.0    0.0    0.0    0.0  0.0  0.0    0.0    0.0   0   0 ramdisk1
        0.0    2.5    0.0   18.0  0.0  0.0    0.0    8.3   0   1 c1t0d0
        0.0    0.0    0.0    0.0  0.0  0.0    0.0    0.0   0   0 c1t1d0
        0.0    0.0    0.0    0.0  0.0  0.0    0.0    0.0   0   0 c0t0d0

  • BULK INSERT into View w/ Instead Of Trigger - DML ERROR LOGGING Issue

    Oracle 10.2.0.4
    I cannot figure out why I cannot get bulk insert errors to aggregate and let the insert continue when bulk inserting into a view with an Instead of Trigger. Whether I use the LOG ERRORS clause or SQL%BULK_EXCEPTIONS, the insert works until it hits the first exception and then exits.
    Here's what I'm doing:
    1. I'm bulk inserting into a view with an Instead of Trigger on it that performs the actual updating of the underlying table. This table is a child table with a foreign key constraint to a reference table containing the primary key. In the Instead of Trigger, the attempt to insert a record into the child table raises the following exception: ORA-02291: integrity constraint (FK_TEST_TABLE) violated - parent key not found. That is expected, but the error should be logged in the error table and the rest of the inserts should complete. Instead, the bulk insert exits.
    2. If I change this to bulk insert into the underlying table directly, it works: all errors get put into the error logging table and the insert completes all non-exception records.
    Here's the "test" procedure I created to test my scenario:
    View: V_TEST_TABLE
    Underlying Table: TEST_TABLE
    PROCEDURE BulkTest
    IS
      TYPE remDataType IS TABLE OF v_TEST_TABLE%ROWTYPE INDEX BY BINARY_INTEGER;
      varRemData remDataType;
    begin
      select /*+ DRIVING_SITE(r) */ *
      BULK COLLECT INTO varRemData
      from TEST_TABLE@REMOTE_LINK
      where effectiveday < to_date('06/16/2012 04','mm/dd/yyyy hh24')
        and terminationday > to_date('06/14/2012 04','mm/dd/yyyy hh24');
      BEGIN
        FORALL idx IN varRemData.FIRST .. varRemData.LAST
          INSERT INTO v_TEST_TABLE VALUES varRemData(idx)
            LOG ERRORS INTO dbcompare.ERR$_TEST_TABLE ('INSERT') REJECT LIMIT UNLIMITED;
      EXCEPTION WHEN others THEN
        DBMS_OUTPUT.put_line('ErrorCode: '||SQLCODE);
      END;
      COMMIT;
    end;
    I've reviewed Oracle's documentation on both DML logging tools and neither has any restrictions (at least that I can see) that would prevent this from working correctly.
    Any help would be appreciated....
    Thanks,
    Steve

    Thanks - obviously this is my first post, and I'm desperate to figure out why this won't work...
    The code I sent is only a test proc to troubleshoot the issue; the block with the debug statement is only there to capture the insert failing without aggregating the errors, and won't be in the real proc...
    Thanks,
    Steve

  • How to debug bulk insert?

    I have this code, which doesn't cause any error and actually gives the message 'query executed successfully', but it doesn't load any data.
    bulk insert [dbo].[SPGT]
    from '\\sys.local\london-sql\FTP\20140210_SPGT.SPL'
    WITH (
        KEEPNULLS,
        FIRSTROW = 5,
        FIELDTERMINATOR = '\t',
        ROWTERMINATOR = '\n'
    )
    How can I debug the issue, or see what the script is REALLY doing? It's not doing what I think it's doing.
    All permissions, rights, etc. are set up correctly. I just ran the same code successfully with a .txt file. Maybe it has something to do with the extension...
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

    Yes, here is the final solution (for the benefit of others who find this anytime in the future). 
    CREATE TABLE [dbo].[ICM] (
        Date               DATETIME,
        Type               VARCHAR(MAX),
        Change             VARCHAR(MAX),
        SP_ID              VARCHAR(MAX),
        Sedol              VARCHAR(MAX),
        Cusip              VARCHAR(MAX),
        Issue_Name         VARCHAR(MAX),
        Cty                VARCHAR(MAX),
        PE                 VARCHAR(MAX),
        Cap_Range          VARCHAR(MAX),
        GICS               VARCHAR(MAX),
        Curr               VARCHAR(MAX),
        Local_Price        DECIMAL(19,8),
        Index_Total_Shares DECIMAL(19,8),
        IWF                DECIMAL(19,8),
        Index_Curr         VARCHAR(MAX),
        Float_MCAP         DECIMAL(19,8),
        Total_MCAP         DECIMAL(19,8),
        Daily_Price_Rtn    DECIMAL(19,8),
        Daily_Total_Rtn    DECIMAL(19,8),
        FX_Rate            DECIMAL(19,8),
        Growth_Weight      DECIMAL(19,8),
        Value_Weight       DECIMAL(19,8),
        Bloomberg_ID       VARCHAR(MAX),
        RIC                VARCHAR(MAX),
        Exchange_Ticker    VARCHAR(MAX),
        ISIN               VARCHAR(MAX),
        SSB_ID             VARCHAR(MAX),
        REIT_Flag          VARCHAR(MAX),
        Weight             DECIMAL(19,8),
        Shares             DECIMAL(19,8)
    );
    GO
    BULK INSERT dbo.ICM
    FROM 'C:\Documents and Settings\london\Desktop\ICM.txt'
    WITH (
        FIRSTROW = 2,
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\n'
    );
    GO
    This was a bit confusing at first, because I've never done it before, and also, I was getting all kinds of errors, which turned out to be numbers in string fields and strings in number fields. Basically, the data that was given to me was totally screwed up. That compounded the problem exponentially. I finally got the correct data, and I'm all set now.
    Thanks everyone!
    Knowledge is the only thing that I can give you, and still retain, and we are both better off for it.

  • Bulk Insert Issue with BCP

    I'm running SQL Server 2008 R2 and trying to test out bcp in one of our databases. For almost all the tables, the bcp export and bulk insert work fine using commands similar to those below. However, on a few tables I am experiencing an issue when trying to bulk insert the data back in.
    Here are the details:
    This is the bcp command to export the data (via a simple batch file):
     1.)
    SET OUTPUT=K:\BCP_FIN_Test
    SET ERRORLOG=C:\Temp\BCP_Error_Log
    SET TIMINGS=C:\Temp\BCP_Timings
    bcp "SELECT * FROM FS84RPT.dbo.PS_PO_LINE Inner Join FS84RPT.[dbo].[PS_RECV_LN_ACCTG] on PS_PO_LINE.BUSINESS_UNIT = PS_RECV_LN_ACCTG.BUSINESS_UNIT_PO and PS_PO_LINE.PO_ID= PS_RECV_LN_ACCTG.PO_ID and PS_PO_LINE.LINE_NBR= PS_RECV_LN_ACCTG.LINE_NBR WHERE
    PS_RECV_LN_ACCTG.FISCAL_YEAR = '2014' and PS_RECV_LN_ACCTG.ACCOUNTING_PERIOD BETWEEN '9' AND '11' " queryout %OUTPUT%\PS_PO_LINE.txt -e %ERRORLOG%\PS_PO_LINE.err -o %TIMINGS%\PS_PO_LINE.txt -T -N
     2.)
    BULK INSERT PS_PO_LINE FROM 'K:\BCP_FIN_Test\PS_PO_LINE.txt' WITH (DATAFILETYPE = 'widenative')
    Msg 4869, Level 16, State 1, Line 1
    The bulk load failed. Unexpected NULL value in data file row 2, column 22. The destination column (CNTRCT_RATE_MULT) is defined as NOT NULL.
    Msg 4866, Level 16, State 4, Line 1
    The bulk load failed. The column is too long in the data file for row 3, column 22. Verify that the field terminator and row terminator are specified correctly.
    Msg 7399, Level 16, State 1, Line 1
    The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.
    Msg 7330, Level 16, State 2, Line 1
    Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".
    I've tried a few different things, including exporting as character data and importing with BULK INSERT PS_PO_LINE FROM 'K:\BCP_FIN_Test\PS_PO_LINE.txt' WITH (DATAFILETYPE = 'char'),
    but no luck.
    Appreciate the help.

    It seems that the target table does not match your expectations.
    Since I don't know exactly what you are doing, I will have to resort to guesses.
    I note that your export query goes:
      SELECT * FROM FS84RPT.dbo.PS_PO_LINE Inner Join
    And then you are importing into a table called PS_PO_LINE as well. But for your operation to make sense, the import PS_PO_LINE must have not only the columns from PS_PO_LINE, but also all the columns from PS_RECV_LN_ACCTG. Maybe your SELECT should read
      SELECT PS_PO_LINE.* FROM FS84RPT.dbo.PS_PO_LINE Inner Join
    or use an EXISTS clause to apply the PS_RECV_LN_ACCTG filter (assuming that the table appears in the query for filtering only); see the sketch below.
    Erland Sommarskog, SQL Server MVP, [email protected]
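    A minimal sketch of that EXISTS variant (the join columns are taken from the posted query; the rest of the filter is assumed):
    SELECT p.*
    FROM FS84RPT.dbo.PS_PO_LINE AS p
    WHERE EXISTS (SELECT *
                  FROM FS84RPT.dbo.PS_RECV_LN_ACCTG AS r
                  WHERE r.BUSINESS_UNIT_PO = p.BUSINESS_UNIT
                    AND r.PO_ID = p.PO_ID
                    AND r.LINE_NBR = p.LINE_NBR
                    AND r.FISCAL_YEAR = '2014'
                    AND r.ACCOUNTING_PERIOD BETWEEN '9' AND '11')
    This keeps the exported columns identical to PS_PO_LINE, so the import target can be a plain copy of that table.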

  • BULK INSERT

    I don't know if this has been posted before; I have looked around and could not find any similar question.
    I have written a stored proc that will do a bulk load using the BULK INSERT command, but I am getting error msg 4861. Below are my full error message and my code. Can someone advise what I am doing wrong? The SQL Server engine is on a totally different server from my text file, but they are all on the same network.
    use test_sp
    go
    Declare @path nvarchar(max)
    declare @str nvarchar(1000)
    declare @Fulltblname varchar(1000)
    Set @path = '\\myservername\ShareName\Path\FileName.txt'
    Set @Fulltblname = 'table1'
    --bulk load the table with raw data
    Set @str = 'BULK INSERT [dbo].[' + @Fulltblname + ']
    FROM ' + char(39) + @path + char(39) + '
    WITH (
    FIELDTERMINATOR = ''|'',
    FIRSTROW = 1,
    ROWTERMINATOR = ''\n'',
    MAXERRORS = 0
    )'
    Exec sp_executesql @str
    The errors I am getting are below:
    Msg 4861, Level 16, State 1, Line 1
    Cannot bulk load because the file "\\myservername.domainname\ShareName\Path\FileName.txt" could not be opened. Operating system error code 5(Access is denied.).
    Mail queued.

    Hi,
    Try the links below:
    http://blogs.msdn.com/b/dataaccesstechnologies/archive/2012/03/22/10082977.aspx
    http://blogs.msdn.com/b/jay_akhawri/archive/2009/02/16/resolving-operating-system-error-code-5-with-bulk-insert-a-different-perspective.aspx
    http://stackoverflow.com/questions/14555262/cannot-bulk-load-operating-system-error-code-5-access-is-denied
    sathya - www.allaboutmssql.com
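    Error code 5 is a permissions problem: with SQL authentication (or without delegation), BULK INSERT opens the file under the SQL Server service account rather than your own login. One blunt way to check whether the server itself can reach the share - assuming xp_cmdshell is enabled, which it is not by default - is:
    EXEC master..xp_cmdshell 'dir \\myservername\ShareName\Path';
    If that also comes back with access denied, grant the SQL Server service account read permission on both the share and the NTFS folder.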

  • SQL Server 2008 - RS - Loop of multiple Bulk Inserts

    Hi,
    I want to import multiple flat files into a table on SQL Server 2008 R2. However, I don't have access to Integration Services to use a foreach loop, so I'm doing the process in T-SQL. At the moment I manually code which file's data to load into the tables. My code is like this:
    CREATE TABLE #temporaryTable
    (
        [column1] [varchar](100) NOT NULL,
        [column2] [varchar](100) NOT NULL
    )
    BULK INSERT #temporaryTable
    FROM 'C:\Teste\testeFile01.txt'
    WITH (
        FIELDTERMINATOR = ';',
        ROWTERMINATOR = '\n',
        FIRSTROW = 1
    )
    GO
    BULK INSERT #temporaryTable
    FROM 'C:\Teste\testeFile02.txt'
    WITH (
        FIELDTERMINATOR = ';',
        ROWTERMINATOR = '\n',
        FIRSTROW = 1
    )
    GO
    INSERT INTO dbo.TESTE (Col_1, Col_2)
    SELECT RTRIM(LTRIM([column1])), RTRIM(LTRIM([column2])) FROM #temporaryTable
    IF EXISTS (SELECT * FROM #temporaryTable) DROP TABLE #temporaryTable
    The problem is that I have 20 flat files to insert... Is there any loop solution in T-SQL to insert all the flat files into the same table?
    Thanks!

    Here is a working sample PowerShell script I adapted from the internet (I don't have the source handy now); a pure T-SQL alternative follows after it.
    Import-Module -Name 'SQLPS' -DisableNameChecking
    $workdir = "C:\temp\test\"
    $svrname = "MC\MySQL2014"
    Try {
        # Change the default statement timeout from 600 seconds to unlimited
        $svr = New-Object ('Microsoft.SqlServer.Management.Smo.Server') $svrname
        $svr.ConnectionContext.StatementTimeout = 0
        $table = "test1.dbo.myRegions"
        # Remove the filename column from the target table
        $q1 = @"
    Use test1;
    IF COL_LENGTH('dbo.myRegions','filename') IS NOT NULL
    BEGIN
    ALTER TABLE test1.dbo.myRegions DROP COLUMN filename;
    END
    "@
        Invoke-Sqlcmd -ServerInstance $svr.Name -Database master -Query $q1
        $dt = (Get-Date).ToString("yyyMMdd")
        $formatfilename = "$($table)_$($dt).xml"
        $destination_formatfilename = "$($workdir)$($formatfilename)"
        $cmdformatfile = "bcp $table format nul -c -x -f $($destination_formatfilename) -T -t\t -S $($svrname) "
        Invoke-Expression $cmdformatfile
        # Delay 1 second
        Start-Sleep -s 1
        # Add the filename column to the target table
        $q2 = @"
    Alter table test1.dbo.myRegions Add filename varchar(500) Null;
    "@
        Invoke-Sqlcmd -ServerInstance $svr.Name -Database master -Query $q2
        $files = Get-ChildItem $workdir
        $items = $files | Where-Object {$_.Extension -eq ".txt"}
        for ($i = 0; $i -lt $items.Count; $i++) {
            $strFileName = $items[$i].Name
            $strFileNameNoExtension = $items[$i].BaseName
            $query = @"
    BULK INSERT test1.dbo.myRegions from '$($workdir)$($strFileName)' WITH (FIELDTERMINATOR = '\t', FIRSTROW = 2, FORMATFILE = '$($destination_formatfilename)');
    "@
            Invoke-Sqlcmd -ServerInstance $svr.Name -Database master -Query $query -QueryTimeout 65534
            # Delay 10 seconds
            Start-Sleep -s 10
            # Update the filename column
            Invoke-Sqlcmd -ServerInstance $svr.Name -Database master -QueryTimeout 65534 -Query "Update test1.dbo.myRegions SET filename = '$($strFileName)' WHERE filename is null;"
            # Move the uploaded file to the archive folder
            If ((Test-Path "$($workdir)$($strFileName)") -eq $True) { Move-Item -Path "$($workdir)$($strFileName)" -Destination "$($workdir)Processed\$($strFileNameNoExtension)_$($dt).txt" }
        }
    }
    Catch [Exception] {
        Write-Host "--$strFileName " $_.Exception.Message
    }
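    If PowerShell is not available either, a plain T-SQL loop with dynamic SQL can do the same, assuming the files follow a numbered naming pattern like the two shown (testeFile01.txt through testeFile20.txt):
    DECLARE @i INT = 1, @sql NVARCHAR(MAX);
    WHILE @i <= 20
    BEGIN
        -- Build the file name from the loop counter, zero-padded to two digits
        SET @sql = N'BULK INSERT #temporaryTable
    FROM ''C:\Teste\testeFile' + RIGHT('0' + CAST(@i AS VARCHAR(2)), 2) + N'.txt''
    WITH (FIELDTERMINATOR = '';'', ROWTERMINATOR = ''\n'', FIRSTROW = 1);';
        EXEC sp_executesql @sql;
        SET @i += 1;
    END;
    The dynamic batch can still see #temporaryTable because sp_executesql runs in the same session that created it.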

  • Bulk insert for two tables

    Hi All,
    I have a cursor in a PL/SQL block that selects about 100 columns. My requirement is to insert about 60 of those columns into one table and the rest into another. Please let me know how to implement this using bulk inserts.
    Thanks for your help in advance.

    Why not dispense with the CURSOR and instead use a multi-table INSERT...SELECT?
    INSERT ALL
      INTO table1 (col1, col2, ..., col60)
      VALUES (col1, col2, ..., col60)
      INTO table2 (col61,...col100)
      VALUES (col61,...col100)
    SELECT col1,..., col100
      FROM etc.
    where the SELECT is doing the same query as the CURSOR.

  • JDBC Bulk Insert to MS Access

    I am trying to do a bulk insert into an MS Access database from a text file. One of the solutions recommended by bbritta is as follows:
    import java.sql.*;
    public class Test3 {
      public static void main(String[] args) {
        try {
          Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
          String database =
              "jdbc:odbc:Driver={Microsoft Access Driver (*.mdb)};DBQ=C:/DB1.MDB";
          Connection con = DriverManager.getConnection(database, "", "");
          Statement statement = con.createStatement();
          statement.execute("INSERT INTO Table1 SELECT * FROM [Text;Database=C:\\;HDR=YES].[TextFile.txt]");
          statement.close();
          con.close();
        } catch (Exception e) { e.printStackTrace(); }
      }
    }
    Whenever I try to use that approach, I get the error message
    java.sql.SQLException: [Microsoft][ODBC Microsoft Access Driver] Number of query values and destination fields are not the same.
    at sun.jdbc.odbc.JdbcOdbc.createSQLException(JdbcOdbc.java:6958)
    at sun.jdbc.odbc.JdbcOdbc.standardError(JdbcOdbc.java:7115)
    at sun.jdbc.odbc.JdbcOdbc.SQLExecDirect(JdbcOdbc.java:3111)
    at sun.jdbc.odbc.JdbcOdbcStatement.execute(JdbcOdbcStatement.java:338)
    The fields in the Access destination table are exactly the same as in the text file, and I still get the error. I can import the same file into Access manually without any problem.
    I was wondering if someone out there could suggest another approach.

    >
    1) Is there a type-4 JDBC connector available to
    connect directly to MS Access databases and if so
    would it be difficult to implement or migrate to?
    This is important because dbAnywhere does not appear
    to be supported on Windows 2000, which is the
    platform we are migrating to. We need to eliminate
    dbAnywhere if possible.
    By definition, no such driver can exist. A type 4 driver is Java-only and connects directly to the database. Excluding file writes, the only connection method is via sockets, and there is nothing for a socket to connect to in an MS Access database - MS Access doesn't work that way.
    You can look into type 3 drivers; I believe there are a number of them. They use an intermediate server. Search here: http://industry.java.sun.com/products/jdbc/drivers
    You could implement your own using RmiJdbc at http://www.objectweb.org/. However, I personally think that would require a serious, long look at security issues before exposing such a solution to the internet.

  • Bulk Insert Errant File

    Hello Again,
    It's been a troublesome day. I'm trying to import a file with the normal manual import process. Import formats and maps are good. The issue is file-system provisioning and behavior I've not seen before.
    The location is using Bulk Insert.
    When the xxxxxxxxxxxx.tmp file is created, it is being created in C:\Document and Settings\AdminUserID\LOCALSETTINGS\Temp.
    One would first think that the application was created with an app path that does not conform to UNC, but it does conform: \\servername\AppFolders\ApplicationX
    Why would it choose this location to create the temp file? Usually the .tmp file is created right inside the inbox...
    Very confused, appreciate the help.
    -Richard

    We have an Oracle DB.
    In the past I have seen cases where the database service user doesn't have permissions on the file system, and that creates an error.
    However, even in those situations the Bulk Insert *.tmp file is created in the application's Inbox. This is completely different: the .tmp file is being created in the System Admin user's Documents and Settings folder, and I don't even know where it is getting a reference to that folder.

  • Blob truncated with DbFactory and Bulk insert

    Hi,
    My platform is a Microsoft Windows Server 2003 R2 Server 5.2 Service Pack 2 (64-bit) with an Oracle Database 11g 11.1.0.6.0.
    I use the client Oracle 11g ODAC 11.1.0.7.20.
    Some strange behavior appears when using DbProviderFactory and a bulk (array-bound) command with a Blob column and a parameter larger than 65536 bytes. Let me explain.
    First I create a dummy table in my schema:
    create table dummy (a number, b blob)
    To do the bulk insert we can use code A with Oracle objects (executes successfully):
    byte[] b1 = new byte[65530];
    byte[] b2 = new byte[65540];
    Oracle.DataAccess.Client.OracleConnection conn = new Oracle.DataAccess.Client.OracleConnection("User Id=login;Password=pws;Data Source=orcl;");
    OracleCommand cmd = new OracleCommand("insert into dummy values (:p1,:p2)", conn);
    cmd.ArrayBindCount = 2;
    OracleParameter p1 = new OracleParameter("p1", OracleDbType.Int32);
    p1.Direction = ParameterDirection.Input;
    p1.Value = new int[] { 1, 2 };
    cmd.Parameters.Add(p1);
    OracleParameter p2 = new OracleParameter("p2", OracleDbType.Blob);
    p2.Direction = ParameterDirection.Input;
    p2.Value = new byte[][] { b1, b2 };
    cmd.Parameters.Add(p2);
    conn.Open(); cmd.ExecuteNonQuery(); conn.Close();
    We can write the same thing at an abstraction level using the DbProviderFactories (code B):
    var factory = DbProviderFactories.GetFactory("Oracle.DataAccess.Client");
    DbConnection conn = factory.CreateConnection();
    conn.ConnectionString = "User Id=login;Password=pws;Data Source=orcl;";
    DbCommand cmd = conn.CreateCommand();
    cmd.CommandText = "insert into dummy values (:p1,:p2)";
    ((OracleCommand)cmd).ArrayBindCount = 2;
    DbParameter param = cmd.CreateParameter();
    param.ParameterName = "p1";
    param.DbType = DbType.Int32;
    param.Value = new int[] { 3, 4 };
    cmd.Parameters.Add(param);
    DbParameter param2 = cmd.CreateParameter();
    param2.ParameterName = "p2";
    param2.DbType = DbType.Binary;
    param2.Value = new byte[][] { b1, b2 };
    cmd.Parameters.Add(param2);
    conn.Open(); cmd.ExecuteNonQuery(); conn.Close();
    But this second code doesn't work: the second byte array is truncated to 4 bytes. It seems to be an Int16 overflow.
    When DbType.Binary is used, Oracle maps it to OracleDbType.Raw and not OracleDbType.Blob, so the problem seems to be with the Raw type. BUT if we use the same code without bulk insert, it works! So the problem is somewhere else...
    Why use a DbConnection? To be able to switch easily to another database type.
    Then why use "((OracleCommand)cmd).ArrayBindCount"? To be able to use specific functionality of each database.
    I can fix the issue by casting the DbParameter to OracleParameter and setting the OracleDbType to Blob, but why does the second code fail with bulk insert yet work with a simple query?

    BCP and BULK INSERT do not work the way you expect them to. What they do is consume fields in a round-robin fashion: they first look for data for the first field, then for the second field, and so on.
    So in your case, they will first read one byte, then 20 bytes, etc., until they have read the two bytes for field 122. At this point they will consume bytes until they find a sequence of carriage return and line feed.
    You say that some records in the file are incomplete. Say that a row has only 60 fields, and field 61 is four bytes. BCP and BULK INSERT will then read the data for field 61 as CR + LF + the first two bytes of the next row. CR+LF has no special meaning; they are just data at this point. See the tiny illustration below.
    You will have to write a program to parse the file, or use SSIS. BCP and BULK INSERT are not your friends in this case.
    Erland Sommarskog, SQL Server MVP, [email protected]
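    As a tiny illustration of that round-robin behavior (hypothetical table and data file, assuming LF-only line endings for simplicity):
    CREATE TABLE dbo.Demo (a VARCHAR(20), b VARCHAR(20));
    -- C:\demo.txt contains two lines, but line 1 is missing its comma:
    --   1 one
    --   2,two
    BULK INSERT dbo.Demo FROM 'C:\demo.txt'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
    -- Result: a single row with a = '1 one<LF>2' and b = 'two'.
    -- The missing field terminator made the loader swallow the line break as data.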
