Flat file connection manager

Hi All,
I'm trying to create a routine file import task using the Flat File Connection Manager. However, the files are always stored in a network folder that requires logging in with another particular account. Is it possible to embed those credentials in the Flat File Connection Manager
to get the files?
Thank you for your help

Hi Schineenee,
Here are some points from another thread which might help you.
First, you can use a File System Task to copy the file, as suggested.
Since it's a remote server, you need a shared folder and the correct permissions to reach it. You can either:
#1 Execute the package with SQL Server Agent:
You need to grant read/write permission to the identity that the SQL Server Agent service runs under; to find that identity, run services.msc and look for SQL Server Agent.
Or, if the account that should access the share is not the Agent's own identity, you can create a proxy. Under Security -> Credentials, create new credentials (username and password), then go to SQL Server Agent -> Proxies -> SSIS Package Execution and create a new proxy that uses those credentials. Then, on the job step itself, set Run as = the proxy.
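If you prefer to script those steps, here is a minimal T-SQL sketch (the credential name, Windows account and password below are placeholders, not values from your environment):
-- Credential holding the Windows account that can reach the share
USE master;
CREATE CREDENTIAL FileShareCredential
    WITH IDENTITY = N'DOMAIN\FileShareUser',    -- placeholder account
         SECRET   = N'StrongPasswordHere';      -- placeholder password
-- SQL Server Agent proxy bound to that credential
USE msdb;
EXEC dbo.sp_add_proxy
    @proxy_name = N'SSIS_FileShare_Proxy',
    @credential_name = N'FileShareCredential',
    @enabled = 1;
-- Allow the proxy to be used for SSIS package execution job steps
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'SSIS_FileShare_Proxy',
    @subsystem_name = N'SSIS';
The job step's Run as can then be set to SSIS_FileShare_Proxy.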
#2 If you need to connect to multiple resources (shared folders), or you don't use a SQL Agent job, you can use the NET USE command:
http://www.microsoft.com/resources/documentation/windows/xp/all/proddocs/en-us/net_use.mspx?mfr=true
Issue the NET USE command before the File System Task.
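If you would rather issue it from T-SQL instead of an Execute Process Task, one sketch is to call xp_cmdshell (which is disabled by default and has its own security implications); the server, share, account and password are placeholders:
-- Map the share under the supplied credentials before the file is copied
EXEC master.dbo.xp_cmdshell 'net use \\ServerName\ShareName StrongPasswordHere /user:DOMAIN\FileShareUser';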
Thanks,
Shashikant

Similar Messages

  • How to remove the date extensions from a filename in SSIS Flat File Connection Manager dynamically at run time

    Hello,
I have to load data from a CSV file into a SQL database. The file is placed into a directory by another program; the file name stays the same, but it has a different extension based on the time of day it is placed in the directory. Since I know the file name
ahead of time, I want to strip off the date/time extension from the file name so that I can load the file using the Flat File Connection Manager. I am trying to use a variable and the expression editor so that I can specify the file name dynamically, but I
don't know how to write the expression correctly. I have a variable 'FileLocation' that holds the folder location where the file will be placed. The file, for example: MyFileName201410231230 (MyFileName is always the same, but the date/time part will be different).
    Thanks,
    jkrish

I don't want to use a ForEach Loop because the files are placed by an FTP process 3 times a day at specific times, for example at 10 AM, 12 PM and 3 PM. These times are pretty much fixed. The file name is the same but the extension will carry that day/time stamp. I
have to load each file only once for a particular reason, meaning I don't want to load the ones I already loaded. I am planning on setting up the SSIS process to load at 10:05, 12:05 and 3:05 daily. The files will be piling up in the folder; as they come in,
I load them. At some point I can remove the old ones so that they won't take up space on the server. In fact, I don't have to keep the old ones at all since they are saved in a different folder anyway, so I can ask the FTP process to
remove the previous one when the new one arrives. So, at any point in time there will be one file, but that file will have a different extension every time.
I am thinking of removing the extension before I load every time. If the file name is 'MyFileNamexxxxxxx.csv', then I want to change it to 'MyFileName.csv' and load it.
    Thanks,
    jkrish
    You WILL need to use it eventually because you need to iterate over each file.
    Renaming is unnecessary as one way or another you will need to put a processed file away.
    And having the file with the original extension intact will also help you troubleshoot.
    Arthur
    MyBlog
    Twitter

  • Is there a way to change fixed width Flat file connection manager to delimited without losing all the column names ?

    Is there a way to change fixed width Flat file connection manager to delimited without losing all the column names ?

Unfortunately not, as it is quite a dramatic change that causes the metadata to be refreshed. I would, though, try opening the package file in a text editor and finding the column names before making the change; then, after you have changed the FF connector, try to reconstitute
/ replace the automatically generated columns with the old names.
    Arthur
    MyBlog
    Twitter

  • Need help ASAP with Data Flow Task Flat File Connection

    Hey there,
    I have a Data Flow Task within a ForEach loop container.  The source of the flow is ADO.NET connection and the destination is a Flat File Connection.  I loop through a collection of strings in the ForEach loop.  Based on the string content,
    I write some data to the same destination file in each iteration overwriting the previous version. 
    I am running into following Errors:
    [Flat File Destination [38]] Warning: The process cannot access the file because it is being used by another process.
    [Flat File Destination [38]] Error: Cannot open the datafile "Example.csv".
    [SSIS.Pipeline] Error: Flat File Destination failed the pre-execute phase and returned error code 0xC020200E.
I know what's happening but I don't know how to fix it.  The first time through the ForEach loop, the destination file is updated.  The second time through is when this error pops up.  I think it's because the first iteration is not closing the destination
file. How do I force a close of the file within the Data Flow Task, or through a subsequent Script Task?
    This works within a SQL 2008 package on one server but not within SQL 2012 package on a different server.
    Any help is greatly appreciated.
    Thanks! 

    Thanks for the response Narsimha.  What do you mean by FELC? 
    First time poster - what is the best way to show the package here?

  • Flat file connection: The file name "\server\share\path\file.txt" specified in the connection was not valid

    I'm trying to execute a SSIS package via SQL agent with a flat file source - however it fails with Code: 0xC001401E The file name "\server\share\path\file.txt" specified in the connection was not valid.
It appears that the problem is with the rights of the user that's running the package (it's a proxy account). If I use a higher-privilege account (domain admin) to run the package, it completes successfully. But this is not a long-term solution, and I can't
see a reason why the user doesn't have rights to the file. The effective permissions of the file and the parent folder both give the user full control, the user has full control over the share as well, and the user can access the file (copy, etc.) outside the SSIS
package.
    Running the package manually via DTExec gives me the same error - I've tried 32 and 64bit versions with the same result. But running as a domain admin works correctly every time.
    I feel like I've been beating my head against a brick wall on this one... Is there some sort of magic permissions, file or otherwise, that are required to use a flat file target in an SSIS package?

    Hi Rossco150,
    I have tried to reproduce the issue in my test environment (Windows Server 2012 R2 + SQL Server 2008 R2), however, everything goes well with the permission settings as you mentioned. In my test, the permissions of the folders are set as follows:
    \\ServerName\Temp  --- Read
    \\ServerName\Temp\Source  --- No access
    \\ServerName\Temp\Source\Flat Files --- Full control
I suspect that the permission settings on your folders are not exactly as you described above. Could you double-check the permission settings on each level of the folder hierarchy? In addition, check the “Execute as user” information in the job history to make
sure the job was indeed running in the proxy security context. Which version of SSIS are you using? If possible, I suggest that you install the latest Service Pack for your SQL Server, or even the latest CU patch.
    Regards,
    Mike Yin
    TechNet Community Support

  • Foreach Loop Container with a Data Flow Task looking for file from Connection Manager

So I have a Data Flow Task within a Foreach Loop Container. The Foreach Loop Container has a variable mapping of User::FileName to pass to the Data Flow Task.
The Data Flow Task has a Flat File Source, since we're looking to process .csv files, and the Flat File Source has a Flat File Connection Manager where I specified the file name when I created it. I thought you needed to do this even though it won't really
be used, since the file name should come from the Foreach Loop Container. But when attempting to execute, it blows up because it seems to be looking for the test file name that I indicated in the Flat File Connection Manager rather than the file
it should be processing from User::FileName in the Foreach Loop Container.
What am I doing wrong here??? I thought you needed to indicate a file name within the Flat File Connection Manager even though it really won't be using it.
    Thanks for your review...I hope I've been clear...and am hopeful for a reply.
    PSULionRP

The Flat File Connection Manager's ConnectionString needs to be set, via a property expression, to reference the variable used in the ForEach Loop (e.g. @[User::FileName]).
    Arthur My Blog

  • SSIS : How to create Column Header dynamically using expression in Flat File Source

    Hi Team,
I need to keep configured header names for the columns. Is there any way to set each column name from an expression, or is there any other way?

Nope.
But you could add a dummy row to your source to include the column headers and then use the column names in the first data row option in the Flat File Connection Manager.
So suppose you have three columns Column1, Column2, Column3 and you want them to appear as ID, Name, Date; then make the source query:
    SELECT 'ID' AS Col1,'Name' AS Col2,'Date' AS Col3, 0 AS ord
    UNION ALL
    SELECT Column1,Column2,Column3,1
    FROM YourTable
    ORDER BY Ord
then choose the column names in the first data row option
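If the table's columns are not already character types, a variant of the above (YourTable and the varchar lengths are placeholders) that casts them so they union cleanly with the literal header row, and keeps the ordering column out of what reaches the flat file:
SELECT Col1, Col2, Col3
FROM (
    SELECT 'ID' AS Col1, 'Name' AS Col2, 'Date' AS Col3, 0 AS Ord
    UNION ALL
    SELECT CAST(Column1 AS varchar(50)),
           CAST(Column2 AS varchar(50)),
           CAST(Column3 AS varchar(50)),
           1
    FROM YourTable
) AS src
ORDER BY src.Ord;   -- Ord sorts the header row first but is not selected, so it never reaches the destination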
    Please Mark This As Answer if it helps to solve the issue Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs

  • [Flat File Destination [220]] Error: Failed to write out column name for column "Column 2"

I am using SSIS to extract fixed-width data into a flat file destination and I keep getting the error below. I have tried almost everything in this forum but still have no solution. Can anyone help me solve this problem?
    [Flat File Destination [220]] Error: Failed to write out column name for column "Column 2".
    [SSIS.Pipeline] Error: component "Flat File Destination" (220) failed the pre-execute phase and returned error code 0xC0202095
    Thanks

    Hi Giss68,
Could you check the Advanced tab of the Flat File Connection Manager to see whether the InputColumnWidth and OutputColumnWidth properties of Column 2 have the same value? Please refer to the following link about the same topic:
    http://stackoverflow.com/questions/10292091/how-do-i-fix-failed-to-write-error-while-exporting-data-to-ragged-right-flat-fil 
    If it doesn’t work, please post the sample data and the advanced settings of Column2 for further analysis.
    Regards,
    Mike Yin
    TechNet Community Support

  • Flat Files - SSIS Handy Hints

After working with flat files (especially text and CSV) for some time in my current role, I have decided to put together this little discussion, which I believe may benefit peers who have not yet come across some of the issues raised in this
piece. It is just a summary of what I think are useful hints for anyone working with CSV or text files. Everything here is purely from my experience and is discussed at the SSIS level, though I believe most
of the actions also apply when dealing with flat files in SSMS.
    Flat files as destination file
Exporting data to a flat file destination is relatively easy and straightforward. There aren't as many issues here as there are when importing data from flat files. However, whatever you do here hugely impacts how easy and straightforward
it will be to import data from the flat file you create at this stage. There are a few things to note. When you open your Flat File Destination component you will see a number of options; below is a discussion of some of them, along with the actions I have
found it useful to take.
    Column names in the first row – If you want the column names to be in the first row on your file make sure that you always check the box with that option. SSIS does not do that for you by default.
    Column delimiter – I always prefer to use Tab {t}. The default delimiter is Comma {,}. The problem that I have found with use of comma delimiters is that if the values have inherent commas within them the system does not always get it right
    in determining where the column ends. As a result you may end up getting some rows with shifted and misplaced values from their original columns owing to the wrong column allocation by the system.
    AlwaysCheckForRowDelimiters – this is a Flat File Connection Manager property. The default setting of this property is True. At one time I found myself having to use this after I faced a huge problem with breaking and misplacement of rows
    in the dataset that I was dealing with. The problem emanated from the values in one of the columns. The offending column had values of varchar datatype which were presented in the form of paragraphs with all sorts of special characters within them, e.g.
    This is ++, an example of what – I mean… the characters ;
    in the dataset which gave me: nearly 100% ++ headaches – looked like {well}; this piece
    OF: example??
You can see from the italicised dummy value above what I mean. Values such as that make the system prematurely break the rows. I don't know why, but the somewhat painful experience I had with this led me to the conclusion that I should not
leave the system to auto-decide where the row ends. As such, when I changed the property
AlwaysCheckForRowDelimiters from True to False, along with the recommendations mentioned in items 1 and 2 above, the breaking and misplacement of rows was solved. By breaking I mean that you will find one row in a table being broken into two
or three separate rows in the flat file, and this is carried over to the new table where that flat file is loaded.
    Addendum
There is an additional option which I have found to work even better if you are experiencing broken rows due to values in the ‘paragraph’ format illustrated above. The option for ‘paragraphed’ values which
I explained earlier works, but not always, as I have realised. If that option does not work, this is what you should do:
When you SELECT the data for export from your table, make your SELECT statement look something like this:
    SELECT
    ColumnName1,
    ColumnName2,
    ColumnName3,
REPLACE(REPLACE(OffendingColumnName,CHAR(10),''),CHAR(13),'')
AS OffendingColumnName,
    ColumnName4
    FROM
    MyTableName
    The REPLACE function gets rid of the breaks on your values. That is, it gets rid of the paragraphs in the values.
    I would suggest use of double dagger column delimiters if using this approach.
    Text or CSV file?? – In my experience going with the text file is always efficient. Besides, some of the things recommended above only work in text file (I suppose so. I stand to be corrected on this). An example of this is column delimiters.
    Item 2 above recommends use of Tab {t} column delimiter whereas in CSV, as the name suggests, the delimiters are commas.
    Flat files as source file
    In my experience, many headaches of working with flat files are seen at importing data from flat files. A few examples of the headaches that I’m talking about are things such as,
    Datatypes and datatype length, if using string
    Shifting and misplacement of column values
    Broken rows, with some pseudo-rows appearing in your import file
    Double quotation marks in your values
Below I will address some of the common things which I have personally experienced and which I hope will be useful to other people. When you open your Flat File Source component you will see a number of options; below is a discussion of some of them, along with the actions I have found it useful to take.
Retain null values from the source as null values in the data flow – this option comes unchecked by default. I always make sure that I check it; I noticed its importance after some of
my rows in the destination table were coming up with shifted and misplaced column values. By shifted and misplaced column values I mean certain values appearing under columns where you do not expect them, showing that the value has
been moved from its original column to another column where it does not belong.
    Text qualifier – the default entry here is <none>. I have found that it is always handy to insert double quotes here (“). This will eliminate any double quotes which the system may have included at the time when the flat file was
    created. This happens when the values in question have commas as part of the characters in them.
    Column delimiter – this solely depends on the column delimiter which was specified at the time when the flat file was created. The system default is Comma {,}. Please note that if the delimiter specified here is different from the one in
    your flat file the system will throw up an error with a message like “An error occurred while skipping data rows”.
    Column names in the first data row – if you want the first row to be column names put a check mark on this option.
    Datatypes and datatypes length
    By default when you import a flat file your datatypes for all the columns come up as varchar (50) in SSIS. More often than not if you leave this default setup your package will fail when you run it. This is because some of the values in some of your columns
    will be more than 50 characters, the default length. The resulting error will be a truncation error. I have found two ways of dealing with this.
    Advanced – This is an option found on the Flat File Source Editor. Once this option is selected on your Flat File Source Editor you will be presented with a list of columns from your flat file. To determine your datatypes and length there
    are two possible things that you can do at this stage.
    Go column by column – going column by column you can manually input your desired datatypes and lengths on the Flat File Source Editor through the Advanced option.
    Suggest types – this is another option under Advanced selection. What this option does is suggest datatypes and lengths for you based on the sample data amount that you mention in the pop-up dialog box. I have noticed that while this is
    a handy functionality, the problem with it is that if some of the values from the non-sampled data have lengths bigger than what the system would have suggested the package will fail with a truncation error.
    View code – this is viewing of XML code. If for example you want all your columns to be of 255 characters length in your landing staging table
    Go to your package name, right click on it and select the option View code from the list presented to you. XML code will then come up.
    Hit Ctrl + F to get a “Find and Replace:” window. On “Find What” type in
    DTS:MaximumWidth="50" and on “Replace with:” type in
    DTS:MaximumWidth="255". Make sure that under “Look in” the selection is
    Current Document.
    Click “Replace All” and all your default column lengths of 50 characters will be changed to 255 characters.
    Once done, save the changes. Close the XML code page. Go to your package GUI designer. You will find that the Flat File Source component at this point will be highlighted with a yellow warning triangle. This is because the metadata definition has changed.
Double-click the Flat File Source component and then click OK. The warning will disappear and you will be set to pull and load your data into your staging database with all columns being varchar(255). If you need to change any columns to specific data types,
you can use a Data Conversion component or a Derived Column component for that purpose, or both, depending on the data types you will be converting to.
    Dynamic Flat File Name and Date
    Please see this blog
    http://www.bidn.com/blogs/mikedavis/ssis/153/using-expression-ssis-to-save-a-file-with-file-name-and-date
There is far more to flat files than can be discussed in one piece.
Any comments, plus additions (and subtractions too), to this piece are welcome.
    Mpumelelo

You could create a WIKI about this subject. Also see
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/f46b14ab-59c4-46c0-b125-6534aa1133e9/ssis-guru-needed-apply-within?forum=sqlintegrationservices
    Please mark the post as answered if it answers your question | My SSIS Blog:
    http://microsoft-ssis.blogspot.com |
    Twitter

  • Flat file format - importing vs typing it out

Hi. I've heard there is a way to import the format of a fixed-width flat file (rather than typing it out in the connection manager). What are the general steps, or where is there a set of instructions online? I generally have a text doc with column
names (lots of them), widths, etc. that I would use for the import, and I could easily transfer them to a spreadsheet if necessary. So far the word "import" is bringing up a lot of unwanted subject matter in my web searches. I am spending a lot
of time typing out flat file formats in connection managers and am thinking it may be worth a look at SSIS's ability to "import" a format. We generally land our data in "untyped" staging tables first, so SSIS could assume everything is string.
I don't really want to read my records as one long string and parse them later.

Thanks. I suppose I can write and share with the community some general T-SQL that generates the following (see the sketch after the XML fragment below), but I would like to know a couple of things.
First, is any old GUID OK in each of the places where a GUID is indicated? I would generate a new one for each GUID.
Second, what is the locale ID?
Of course I would only paste the results into a package (dtsx file) that doesn't reference the columns (beyond the connection stuff) yet.
What gotchas am I looking at if I try this with the intention of pasting the generated code over the top of what SSIS already generated (inside the dtsx file) in a one-column connector created the usual way?
<DTS:ConnectionManagers>
  <DTS:ConnectionManager
      DTS:refId="Package.ConnectionManagers[Flat File Connection Manager]"
      DTS:CreationName="FLATFILE"
      DTS:Description="ZZZZZZ"
      DTS:DTSID="{some guid}"
      DTS:ObjectName="Flat File Connection Manager">
    <DTS:PropertyExpression
        DTS:Name="ConnectionString">@[User::fileName]</DTS:PropertyExpression>
    <DTS:ObjectData>
      <DTS:ConnectionManager
          DTS:Format="FixedWidth"
          DTS:LocaleID="1033"
          DTS:HeaderRowDelimiter="_x000D__x000A_"
          DTS:RowDelimiter=""
          DTS:TextQualifier="_x003C_none_x003E_"
          DTS:CodePage="1252"
          DTS:ConnectionString="c:...">
        <DTS:FlatFileColumns>
          <DTS:FlatFileColumn
              DTS:ColumnDelimiter=""
              DTS:ColumnWidth="2"
              DTS:MaximumWidth="2"
              DTS:DataType="129"
              DTS:ObjectName="XXXXX"
              DTS:DTSID="{some other guid}"
              DTS:CreationName="" />
          <DTS:FlatFileColumn
              DTS:ColumnDelimiter=""
              DTS:ColumnWidth="2"
              DTS:MaximumWidth="2"
              DTS:DataType="129"
              DTS:ObjectName="YYYYYY"
              DTS:DTSID="{some other guid}"
              DTS:CreationName="" />

  • Export to flat file with unicode (chinese) characters??

    I have an SSIS (2013) package that needs to send an email notification with query results (via attachment). I have a data flow task with a source sql query that gathers 2 columns - ID and NAME. ID is varchar, NAME is nvarchar (some records have Chinese symbols).
The destination is a .txt file. When I copy/paste a field with Chinese symbols into a .txt file, it shows up fine, so I know Notepad can handle the characters. I also have the .txt file set to Unicode encoding (I originally tried ANSI as well). When I run the package,
    it says that it successfully moves 50 fields to the flat file. However, when I open the flat file, I see that it shows 10 records. It stops when it gets to the record with a Chinese symbol in it...but doesn't produce an error. My plan was to then take the
    attachment and use it in a Send Mail task. (i have the smtp/mail notification working fine...just need to figure out how to get the data to all export to the flat file correctly)
    Does anyone know how I can get the ID and NAME fields from a query into an email notification (either as an attachment, or even formatted nicely (with tabs) in an email would work).

    Hi,
Based on my test, I can reproduce a similar issue in my environment. When we create the Flat File Connection Manager with the default settings, the data in the flat file stops when it gets to the Chinese characters.
To fix the issue and make the Chinese characters actually get written to the flat file, check the Unicode checkbox to the right of the Locale property in the Flat File Connection Manager. This way, the flat file can display the Chinese
characters.
    Thanks,
Katherine Xiong
    TechNet Community Support

  • Export SQL View to Flat File with UTF-8 Encoding

    I've setup a package in SSIS to export a SQL view to a flat file and it's working fine.  I now need to make that flat file UTF-8 encoded.  The package executes but still shows the files as ANSI encoded.
    My package consists of a Source (SQL View) -> Derived Column (casts the fields to DT_WSTR) -> Destination Flat File (Set to output UTF-8 file).
    I don't get any errors to help me troubleshoot further.  I'm running SQL Server 2005 SP2.

    Unless there is a Byte-Order-Marker (BOM - hex file prefix: EF BB BF) at the beginning of the file, and unless your data contains non-ASCII characters, I'm unsure there is a technical difference in the files, Paul.
    That is, even if the file is "encoded" UTF-8, if your data is only ASCII values (decimal values 0-127, hex 00-7F), UTF-8 doesn't really serve a purpose over ANSI encoding.  Now if you're looking for UTF-8 with specifically the BOM included, and your data is all standard ASCII, the Flat File Connection Manager can't do that, it seems.
    What the flat file connection manager is doing correctly though, is encoding values that are over decimal 127/hex 7F in UTF-8 when the encoding of the connection manager is set to 65001 (UTF-8).
    Example:
    Input data built with a script component as a source (code at the bottom of this post) and with only one WSTR output column hooked to a flat file destination component:
a string containing only decimal value 225 (Latin small letter a with acute - á)
Encoding set to ANSI 1252 looks like:
E1 0D 0A (which is the ANSI encoding of the decimal character value 225 (E1) and a CR-LF (0D 0A))
Encoding set to UTF-8 65001 looks like:
C3 A1 0D 0A (which is the UTF-8 encoding of the decimal character value 225 (C3 A1) and a CR-LF (0D 0A))
    Note that for values over decimal 127, UTF-8 takes at least two bytes and up to four for the remaining values available.
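For reference, a quick way to confirm on the SQL side which character decimal 225 actually is (SQL Server stores N'...' literals as UTF-16LE, which is why the bytes come back as 0xE100):
-- Returns the character, its Unicode code point (225) and its UTF-16LE bytes (0xE100)
SELECT N'á' AS TestChar, UNICODE(N'á') AS CodePoint, CONVERT(varbinary(2), N'á') AS Utf16LeBytes;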
    So, I'm comfortable now, after sitting down and going through this, that the flat file connection manager is working correctly, unless you need a BOM.
Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub CreateNewOutputRows()
        Output0Buffer.AddRow()
        Output0Buffer.col1 = ChrW(225)
    End Sub

End Class
    Phil

  • OWB export flat file with align to right

    Hi All
I want to know if it is possible to configure the metadata of a POSITIONAL FILE (flat file) so that every field is aligned to the right, in a Mapping.
I don't want to use the Oracle RPAD function with an Expression operator.
    Regards


  • Flat File Destination: Broken Rows

I am exporting data to a CSV file. One of the columns has values with such mixed characters that they end up confusing the system and breaking the rows before the real end of the row. What I mean is that on the flat file a single resultant row can be broken into
two or more rows because of the characters in the offending column. I work in the healthcare industry. Please see the example of what I'm talking about below. This is a real value from one of the rows in my table, and this value (amongst many
more under this column) is causing the affected row to be split into two rows, where the letters
OE: come up as a new row in my flat file. I want all of this to be in the same row. My column delimiter is double dagger and my row delimiter is {CR}{LF}.
    Example value:
    Column name: Comments
Value (it contains embedded line breaks; OE: starts a new row):
BIBA: blah blah, accidentally kicked brick wall to lateral aspect to R foot. reports hearing a crack. mobilised after injury. pain ++ swelling + nil obvious deformity. minimal movement of the toes. LAS obs: entonox to good affect RR18 98%air P66reg 114/75 5.3mmols T36.8 pain 8-4/10 PMH: Nil Allergies- nuts&prawns
OE: swelling and pain to lateral and medial malleolus and 5th metatarsals. For xray
    Mpumelelo

    I think I have managed to get a solution to my problem.
Instead of using a CSV file, I have decided to use a text file as my destination flat file format when exporting the data.
I have used Tab {t} as my column delimiter.
I have left the system default {CR}{LF} as the row delimiter.
On my Flat File Connection Manager, I have gone to the properties and changed
AlwaysCheckForRowDelimiters from True to False.
    After a good fight the above worked. No more broken or split rows :).
    If anyone has a better approach or suggestion it will be gladly welcome.
    Many thanks,
    Mpumelelo

  • Flat File Destination Multiple Delimiters

    Hi Guys,
This might be an easy one for you. I need to find out whether the data from a table can be loaded into a text file with multiple delimiters using SSIS. For example, if I have 2 columns of data from a table,
    ID Name
    100 Mark
    I need to load this into a flat file and the output should be like this
    100,@Mark
So basically, I need to have 2 delimiters in the flat file destination, namely "," and "@". How can this be done?
    Thanks in advance.

    Can't you just fill in two delimiters in the Column Delimiter field of the Flat File Connection Manager (never tried it)?
    Alternatives:
-In the source query add a @ to the second column: select column1, '@' + column2 as column2 from yourTable
    -Add a Derived Column with an expression that adds a @ in front of column2: "@" + [column2]
    Please mark the post as answered if it answers your question | My SSIS Blog:
    http://microsoft-ssis.blogspot.com |
    Twitter
