Create XML format file for bulk insert with a data file without delimiters

Hello
I have a data file with no delimiter, like below:
0080970393102312072981103378000004329392643958
0080970393102312072981103378000004329392643958
I just know that, for example, the first 5 digits in a line are the "ID of bank",
and the 6th and 7th digits in a line are the "ID of employee".
Could you help me with how to create an XML format file?
Thanks a lot

This is a fixed-width file format. We need to know the length of each field before creating the format file. Since you said the first 5 characters are the Bank ID and the 6th and 7th characters are the Employee ID, the XML format file should look like this:
<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<RECORD>
  <FIELD ID="1" xsi:type="CharFixed" LENGTH="5"/>
  <FIELD ID="2" xsi:type="CharFixed" LENGTH="2"/>
  <FIELD ID="3" xsi:type="CharFixed" LENGTH="8"/>
  <FIELD ID="4" xsi:type="CharFixed" LENGTH="14"/>
  <FIELD ID="5" xsi:type="CharFixed" LENGTH="14"/>
  <FIELD ID="6" xsi:type="CharFixed" LENGTH="1"/>
</RECORD>
<ROW>
  <COLUMN SOURCE="1" NAME="c1" xsi:type="SQLNCHAR"/>
  <COLUMN SOURCE="2" NAME="c2" xsi:type="SQLNCHAR"/>
  <COLUMN SOURCE="3" NAME="c3" xsi:type="SQLCHAR"/>
  <COLUMN SOURCE="4" NAME="c4" xsi:type="SQLINT"/>
  <COLUMN SOURCE="5" NAME="c5" xsi:type="SQLINT"/>
</ROW>
</BCPFORMAT>
Note: Similarly, you need to specify the other field lengths as well.
http://stackoverflow.com/questions/10708985/bulk-insert-from-fixed-format-text-file-ignores-rowterminator
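A minimal sketch of using such a format file with BULK INSERT, assuming the format file is saved as C:\bank.xml, the data file as C:\bank.dat, and a staging table whose columns line up with the mapped fields (all names are placeholders):
-- Hypothetical staging table matching the five mapped columns in the format file
CREATE TABLE dbo.BankData (
    c1 NVARCHAR(5),   -- Bank ID (field 1, 5 characters)
    c2 NVARCHAR(2),   -- Employee ID (field 2, 2 characters)
    c3 CHAR(8),
    c4 INT,           -- matches the SQLINT mapping of field 4
    c5 INT
);
-- Load the fixed-width data file using the XML format file
BULK INSERT dbo.BankData
FROM 'C:\bank.dat'
WITH (FORMATFILE = 'C:\bank.xml');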
Regards, RSingh

Similar Messages

  • Adding Data file to existing primary file group with 1 data file

    Currently our databases are configured to only have 1 data file and 1 log file.  I am looking at adding a 2nd data file to the primary group, which will be on a separate lun.
    Will we benefit from adding the 2nd data file (same size as the 1st data file and the same autogrowth rate), or should we create a new database with 2 data files (equal size and autogrowth rate) and import the data from the database with the single data file?
    Thanks.
    DJ

    Having another data file pointing to a different physical volume
    will give you better performance. Additionally, you should pre-size it (same as the first data file) with the same growth settings (preferably in MB
    instead of percentages).
    It is perfectly fine to add another data file to the PRIMARY filegroup as well; SQL Server will automatically balance the data across the files over time (due to data striping).
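    A minimal sketch of adding that second data file to the PRIMARY filegroup, assuming a database named MyDB and the new LUN mounted at E:\Data (names, size, and growth values are placeholders):
    ALTER DATABASE MyDB
    ADD FILE
    (
        NAME = MyDB_Data2,                     -- logical name (placeholder)
        FILENAME = 'E:\Data\MyDB_Data2.ndf',   -- physical path on the new LUN (placeholder)
        SIZE = 50GB,                           -- pre-size to match the first data file
        FILEGROWTH = 512MB                     -- fixed growth in MB rather than percent
    )
    TO FILEGROUP [PRIMARY];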
    HTH
    Good Luck! Please Mark This As Answer if it solved your issue. Please Vote This As Helpful if it helps to solve your issue

  • Hi, I have a learning HTML file with a data file that includes SWF files. How can I open it on my iPad?

    Hi, I have a learning HTML file with a data file that includes SWF files. How can I open it on my iPad?

    iPads do not support Flash natively.
    Look in the App Store; some browsers do support Flash with limited functionality.

  • I am trying to install Adobe Premiere Elements 9 without success. I successfully installed Photoshop Elements 9 without a problem. The message I am getting is 'Invalid Unicode file - .\Autoplay\LangData\en_US\lang.dat'. Anyone out there who can help pleas

    I am trying to install Adobe Premiere Elements 9 without success. I successfully installed Photoshop Elements 9 without a problem. The message I am getting is 'Invalid Unicode file - .\Autoplay\LangData\en_US\lang.dat'. Anyone out there who can help please?

    Click Setup, not Autoplay.

  • Creating xml format file with double dagger ‡ as delimiter

    Hi,
    My source files are delimited with double dagger (‡) and I am using openrowset to load the files to SQL Server. When I am creating the format file by specifying the delimiter as ‡, the format file created has a different delimiter than the specified one.
    The new delimiter appearing in xml format file is ç.
    Any help?
    TIA
    Nitesh Rai- Please mark the post as answered if it answers your question

    How did you create the format file?
    Try this method and see if it works:
    http://visakhm.blogspot.com/2013/10/generate-format-files-based-on-table.html
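    It may also be worth checking whether the ç is simply the same byte shown under a different code page: 0x87 is ‡ in Windows-1252 but ç in OEM code page 850. If the goal is just to load the files, one alternative sketch is to pass the delimiter straight to BULK INSERT and skip the generated format file (path and table name are placeholders):
    BULK INSERT dbo.Staging
    FROM 'C:\data\source.txt'        -- placeholder path
    WITH (
        FIELDTERMINATOR = '‡',       -- double dagger; byte 0x87 under Windows-1252
        ROWTERMINATOR   = '\n',
        CODEPAGE        = '1252'     -- interpret 0x87 as ‡ rather than OEM ç
    );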
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • First Row Record is not inserted from CSV file while bulk insert in sql server

    Hi Everyone,
    I have a csv file that needs to be inserted into SQL Server. The format of the csv file will be like below:
    1,Mr,"x,y",4
    2,Mr,"a,b",5
    3,Ms,"v,b",6
    During BULK INSERT it considers the 2nd column as two values (comma-separated) and makes two entries, so I used FieldTerminator.xml.
    Now the fields are entered into the columns correctly, but the problem is that the first row of the csv file is not read into SQL Server. When I remove the terminator file, I get all the records, but I must use the terminator file as above, and when I do, I do not get the first row.
    Please suggest a solution.
    Thanks,
    Selvam

    Hi,
    I have a csv file (comma-delimited) which is to be inserted into SQL Server. The format of the file when opened in Notepad is like below:
    Id,FirstName,LastName,FullName,Gender
    1,xx,yy,"xx,yy",M
    2,zz,cc,"zz,cc",F
    3,aa,vv,"aa,vv",F
    Below is the bulk insert query used to insert the above records:
    EXEC('BULK INSERT EmployeeData FROM ''' + @FilePath + ''' WITH
    (FORMATFILE = ''d:\FieldTerminator.xml'',
    ROWTERMINATOR = ''\n'',
    FIRSTROW = 2)')
    Here I have used a format file for the "FullName" field, which has a comma within it. The format file is:
    The problem is that it skips the first record (1,xx,yy,"xx,yy",M) when I use the format file. When I remove the format file from the query, it takes all the records, but the "FullName" field causes a problem because of the comma within the
    field, so I must use the format file to handle this. Please suggest why the first record is always skipped when I use the above format file.
    If I give "FIRSTROW=1" in the bulk insert, it shows the error "String or binary data would be truncated.
    The statement has been terminated." I have checked the data type lengths.
    Please let me know the solution.
    Regards,
    Selvam. M
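    As an aside, on SQL Server 2017 and later the quoted "FullName" field can be handled without a format file at all by using the built-in CSV parsing; a sketch, with the file path as a placeholder:
    -- FORMAT = 'CSV' and FIELDQUOTE require SQL Server 2017 or later
    BULK INSERT EmployeeData
    FROM 'd:\EmployeeData.csv'       -- placeholder path
    WITH (
        FORMAT = 'CSV',
        FIELDQUOTE = '"',            -- "xx,yy" is read as a single value
        FIRSTROW = 2,                -- skip the Id,FirstName,... header line
        ROWTERMINATOR = '\n'
    );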

  • Ignore first row of data file in bulk insert

    Hello
    I use bulk insert to fill a table.
    I use the code below:
    bulk insert dbo.test
    from 'c:\test.txt'
    with(FIRSTROW=2,FORMATFILE='c:\test.xml'
    go
    but the data inserted into the table starts from the 3rd row.
    Could you help me?

    I added that closing parenthesis.
    The format file:
    <?xml version="1.0"?>
    <BCPFORMAT
    xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format"
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
    <RECORD>
    <FIELD ID="1" xsi:type="CharFixed" LENGTH="6"/>
    <FIELD ID="2" xsi:type="CharFixed" LENGTH="2"/>
    <FIELD ID="3" xsi:type="CharFixed" LENGTH="6"/>
    <FIELD ID="4" xsi:type="CharFixed" LENGTH="13"/>
    <FIELD ID="5" xsi:type="CharFixed" LENGTH="13"/>
    <FIELD ID="6" xsi:type="CharTerm" TERMINATOR="\n"/>
    </RECORD>
    <ROW>
    <COLUMN SOURCE="4" NAME="R3" xsi:type="SQLBIGINT"/>
    <COLUMN SOURCE="5" NAME="R4" xsi:type="SQLBIGINT"/>
    <COLUMN SOURCE="1" NAME="R5" xsi:type="SQLVARYCHAR"/>
    <COLUMN SOURCE="3" NAME="R9" xsi:type="SQLVARYCHAR"/>
    <COLUMN SOURCE="2" NAME="R8" xsi:type="SQLVARYCHAR"/>
    </ROW>
    </BCPFORMAT>
    When I use FIRSTROW=2 it works, but it starts from the 3rd row:
    bulk insert dbo.test
    from 'c:\test.txt'
    with(FIRSTROW=2,FORMATFILE='c:\test.xml')
    go
    But when I use FIRSTROW=1:
    bulk insert dbo.test
    from 'c:\test.txt'
    with(FIRSTROW=1,FORMATFILE='c:\test.xml')
    go
    I get this error:
    Bulk load data conversion error (type mismatch or invalid character for the specified codepage)
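    One way to see exactly which rows the engine reads with this format file, before touching dbo.test, is to preview the file through OPENROWSET(BULK); a sketch using the same paths as above:
    -- Varying FIRSTROW here shows which physical line each logical row starts on
    SELECT *
    FROM OPENROWSET(
            BULK 'c:\test.txt',
            FORMATFILE = 'c:\test.xml',
            FIRSTROW = 2
         ) AS src;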

  • Obtaining a rowcount from a csv file before bulk insert

    I have a group of csv files that I am bulk inserting into my SQL table. They are of varying lengths, but the last seven lines of every file are not lines of data that comply with the format of my table, and they cause insert errors. I know I can
    specify a "LASTROW" value for the bulk insert, but my problem is that I need a row count before running the statement, and I can't seem to figure out a way of getting it with how I'm accessing the file. (Example below of what I'm doing without
    checking the number of rows):
    CREATE TABLE ALLFILENAMES(WHICHPATH VARCHAR(255),WHICHFILE varchar(255))
        --some variables
        declare @filename varchar(255),
                @path     varchar(255),
                @sql      varchar(8000),
                @cmd      varchar(1000),
    @lastrow  integer
        --get the list of files to process:
    SET @path = '\\chewbacca\PlantFloorData\RLData\'
        SET @cmd = 'dir ' + @path + '*.csv /b'
    print @path
    print @cmd
        INSERT INTO  ALLFILENAMES(WHICHFILE)
        EXEC Master..xp_cmdShell @cmd
        UPDATE ALLFILENAMES SET WHICHPATH = @path where WHICHPATH is null
        --cursor loop
        declare c1 cursor for SELECT WHICHPATH,WHICHFILE FROM ALLFILENAMES where WHICHFILE like '%.csv%'
        open c1
        fetch next from c1 into @path,@filename
        While @@fetch_status <> -1
          begin
      set @lastrow = 3 -- for now just to make sure logic below works
      print @lastrow
     --bulk insert won't take a variable name, so make a sql and execute it instead:
           set @sql = 'BULK INSERT HMIDATA.dbo.RawData FROM ''' + @path + @filename + ''' '
               + '     WITH ( 
                       FIELDTERMINATOR = '','', 
                       ROWTERMINATOR = ''\n'', 
                       FIRSTROW = 2 ,
       LASTROW = ' + STR(@lastrow) + ')'
        print @sql
        exec (@sql)
          fetch next from c1 into @path,@filename
          end
        close c1
        deallocate c1
        --Extras
        --delete from ALLFILENAMES where WHICHFILE is NULL
        --select * from ALLFILENAMES
        drop table ALLFILENAMES

    You could do a SELECT COUNT(*) FROM OPENROWSET(BULK), but you would need a format file that specifies a single-field record format with \r\n as the terminator. With the COUNT(*) you could determine the correct value for LASTROW.
    You could also load the entire file into a two-column table where one column is an IDENTITY column and the other is the raw data from the line in the file. And then you could crack the good rows.
    ...or you could write a client-side program or a SSIS package for the task.
    Erland Sommarskog, SQL Server MVP, [email protected]
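    A sketch of the COUNT(*) approach, assuming a format file c:\onefield.xml that defines a single CharTerm field terminated by \r\n and mapped to one SQLCHAR column (the file name below is a placeholder):
    DECLARE @lastrow int;
    -- Count every line in the file, then drop the seven trailing non-data lines
    SELECT @lastrow = COUNT(*) - 7
    FROM OPENROWSET(
            BULK '\\chewbacca\PlantFloorData\RLData\example.csv',   -- placeholder file
            FORMATFILE = 'c:\onefield.xml'
         ) AS f;
    -- @lastrow can now be concatenated into the dynamic BULK INSERT as the LASTROW value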

  • Creating XML format

    Hi  All,
    Can anyone please explain what the difference is between creating XML from an internal table using CALL TRANSFORMATION and using the class builder (iXML)?
    With the class builder we have to create the main iXML factory, then create a stream factory and streams, etc.
    Can
    CALL TRANSFORMATION id
      SOURCE source = source
      RESULT XML xml_result.
    be used as a replacement for the above?

    yes

  • How to use BULK INSERT for a data from a cursor?

    Oracle 10g Enterprise Edition.
    I tried to bulk insert data returned from a cursor, but it raises an error:
    PLS-00302: component 'LAST' must be declared
    I need some help with using the bulk INSERT here. Can anyone tell me what error I have made?
    CREATE OR REPLACE PROCEDURE HOT_ADMIN.get_search_keyword_stats_prc
    IS
    CURSOR c_get_scenarios
    IS
    SELECT a.*,ROWNUM rnum
    FROM (
    SELECT TRUNC(r.search_date) sdate,
    r.search_hits hits,
    r.search_type stype,
    r.search_qualification qual,
    r.search_location loc,
    r.search_town stown,
    r.search_postcode pcode,
    r.search_college college,
    r.search_colname colname,
    r.search_text text,
    r.affiliate_id affiliate,
    r.search_study_mode smode,
    r.location_hint hint,
    r.search_posttown ptown,
    COUNT(1) cnt
    FROM w_search_headers r
    WHERE search_text IS NOT NULL
    AND NVL(search_type,' ') <> 'C'
    AND TRUNC(search_date)= TO_DATE(TO_CHAR(SYSDATE-1,'DD-MON-RRRR'))
    GROUP BY TRUNC(r.search_date),
    r.search_hits,
    r.search_type,
    r.search_qualification,
    r.search_location,
    r.search_town,
    r.search_postcode,
    r.search_college,
    r.search_colname,
    r.search_text,
    r.affiliate_id,
    r.search_study_mode,
    r.location_hint,
    r.search_posttown
    ORDER BY cnt desc
    ) a
    WHERE ROWNUM <=1000;
    lc_get_data c_get_scenarios%ROWTYPE;
    BEGIN
    OPEN c_get_scenarios;
    FETCH c_get_scenarios into lc_get_data;
    CLOSE c_get_scenarios;
    FORALL i IN 1..lc_get_data.last
    INSERT INTO W_SEARCH_SCENARIO_STATS VALUES ( i.sdate,
    i.hits,
    i.stype,
    i.qual,
    i.loc,
    i.stown,
    i.pcode,
    i.college,
    i.colname,
    i.text,
    i.affiliate,
    i.smode,
    i.hint,
    i.ptown,
    i.cnt
    COMMIT;
    END;

    This isn't what you asked, but I've generally found it helpful to list the columns in an INSERT statement before the values. It is of course optional, but useful for reference when looking at the statement later.
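    For example, a small sketch using the table from the question (the column names are assumed from the cursor's aliases, and the values are made up):
    -- The explicit column list documents which value goes to which column
    INSERT INTO W_SEARCH_SCENARIO_STATS (hits, stype, qual)
    VALUES (42, 'K', 'DEGREE');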

  • Problems with Merge Data Files

    Created a form with radio buttons and finally have it working on my computer, including the Merge Data Files step creating the report.csv file.  However, when my co-worker creates the report.csv, the columns for the radio buttons are not included.  Had this problem on my computer when I had names for the radio buttons, but fixed it.  The data columns for the radio buttons are not exported on the co-worker's computer.  Both are Pro X version, mine 10.0.0 but my co-worker's is 10.1.7.
    Any suggestions?

    Back up all data.
    Please triple-click anywhere in the line below on this page to select it:
    defaults delete -app Safari WebKitOmitPDFSupport
    Copy the selected text to the Clipboard by pressing the key combination command-C.
    Quit Safari. Launch the built-in Terminal application in any of the following ways:
    ☞ Enter the first few letters of its name into a Spotlight search. Select it in the results (it should be at the top.)
    ☞ In the Finder, select Go ▹ Utilities from the menu bar, or press the key combination shift-command-U. The application is in the folder that opens.
    ☞ Open LaunchPad. Click Utilities, then Terminal in the icon grid.
    Paste into the Terminal window by pressing the key combination command-V. I've tested these instructions only with the Safari web browser. If you use another browser, you may have to press the return key after pasting.
    Wait for a new line ending in a dollar sign (“$”) to appear below what you entered. You can then quit Terminal. Test.

  • How to avoid duplicate data while inserting from sample.dat file to table

    Hi Guys,
    We have an issue with duplicate data in a flat file while loading data from sample.dat into a table. How can we avoid duplicate data using the control file?
    Can any one help me on this.
    Thanks in advance!
    Regards,
    LKR

    No, a control file will not remove duplicate data.
    You would be better off using an external table and then removing the duplicate data with SQL as you query the data to insert it into your destination table.
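    A rough sketch of that approach, assuming an external table ext_sample has already been created over sample.dat and that target_table has the same columns (all names are placeholders):
    -- Only distinct rows from the external table reach the destination
    INSERT INTO target_table (col1, col2, col3)
    SELECT DISTINCT col1, col2, col3
    FROM ext_sample;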

  • Not receiving email with ETP.DAT file

    I have two users that are not receiving the email from the blackberry.net domain that has the ETP.DAT file attached. I set a new activation password on my BlackBerry and received the email. I have tried activating these two a dozen times. I deleted them from the BES and re-entered their BES profiles, but still nothing. I am only having problems with these two. When I activated myself I was able to trace the email through our SMTP server right to my mailbox. We are a Lotus Notes shop. Can anyone think of something I can try?

    1)  Apple ID: All about Apple ID security questions
    2)  See Here... ask to speak with the Account Security Team...
    Apple ID: Contacting Apple for help with Apple ID account security

  • Restore my DB with the data files.

    I needed to re-install Windows 2000, so I lost my Oracle installation, but I still have my data files. I'd like to know if I can get my data back from them.

    Which datafiles do you have? Do you have the control files and the other .dbf files? If you do, you might be able to recover your old database. You have to install exactly the same Oracle and database configuration that you had before, then replace the new database files with your old datafiles and control files. It should come up fine. Give it a shot.

  • TS3276 why is jpeg file lost and replaced with winmail.dat file in mail

    I sent a jpeg file to my hotmail account, which is synchronised in my Mac Mail but the file was lost/converted to winmail.dat attachment, which I couldn't import to iPhoto. How do I import a jpeg image, emailed to me from an external source, to iPhoto?

    I think you need to configure Outlook in Windows to use HTML messages.
    Here is Microsoft's incomprehensible support document on the issue.
    Here is one from about.com that makes sense.
