Using DTS Wizard to Import a Flat File containing data like "4124, rue Champenois"

The data below should be one field, but the comma causes it to be split into two fields. The flat file is a comma-delimited CSV, and in the file this field actually has double quotes around the street number and street name:
"2144, rue Champenois"
"340, rue Champs Elysees"
What happens during the conversion with the DTS/Import Wizard is that the street number and street name are split into two separate fields. This breaks the output and causes the SSIS package to fail, so the Flat File Source's data never makes it into the SQL Server database table. I saw a box called Text qualifier, so I put the double quote in there, thinking this would keep the street number and street name together as one field. That didn't work.
When using this Import Wizard with a Flat File Source, how do you keep a string that has double quotes at each end and a comma in the middle together as one field?
Update: I later started over from scratch and entered " (a single double-quote character) in the Text qualifier box, and this time it did work. Why does it work sometimes and not others?

Hi Champenois,
I have tried to reproduce your problem.
1) Exported the "DimCustomer" table in "AdventureWorksDW" to a flat file with the query below.
Query: select top 10 CustomerKey, AddressLine1 from DimCustomer where AddressLine1 like '%,%'
2) The data arrives in the flat file like this (see screenshot).
3) Then I added a Flat File Source and configured it like this (see screenshot).
4) I then went to the "Columns" tab and checked the data; I get only the two expected columns, as shown above.
So please compare your Flat File Connection Manager with the connection settings in the screenshot above.
Please mark this as the answer if it solves your problem.
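If you want to confirm outside the wizard that the double quotes are doing their job, here is a hedged T-SQL sketch; the table name and file path are hypothetical, and the FORMAT and FIELDQUOTE options require SQL Server 2017 or later. FIELDQUOTE plays the same role as the wizard's Text qualifier box:

-- Hypothetical staging table and file path, for illustration only.
CREATE TABLE dbo.AddressStaging (
    CustomerKey  INT,
    AddressLine1 NVARCHAR(120)
);

BULK INSERT dbo.AddressStaging
FROM 'C:\import\addresses.csv'
WITH (
    FORMAT          = 'CSV',  -- RFC 4180 style quoting (SQL Server 2017+)
    FIELDTERMINATOR = ',',
    FIELDQUOTE      = '"'     -- same role as the wizard's Text qualifier box
);

With the qualifier in place, "2144, rue Champenois" loads as a single AddressLine1 value instead of being split at the comma.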

Similar Messages

  • How to load a flat file containing data (separated by commas) into a cube in SAP BI

    hi gurus,
    please help me with this question.
    I have some data in the flat file, for example an address, but with commas in it, e.g.:
    h.no:123,colony,area,hyd-59. I also have other columns in this flat file, like customer name, age, etc. I want to know how to load this data into the InfoCube so that, for the address column, SAP BI treats it as a single column in the generated report.

    Hi Amulya,
    To get reports on all your records, save your flat file in .CSV format: open your flat file and save it as "filename.csv".
    CSV is nothing but comma-separated values; SAP supports two formats, 1) .csv and 2) ASCII, and we prefer the .csv format.
    Check these links for step-by-step modelling:
    http://www.scribd.com/doc/3804698/SAP-BW-StepBystep-From-Fu-Fu
    http://books.google.co.in/books?id=3wBjrMDWescC&pg=PA184&lpg=PA184&dq=stepbystepdocumentationfor+sap-bw&source=bl&ots=RIpT1knFet&sig=_4HOg59Om504Zb9RObF9Ir_oE64&hl=en&sa=X&oi=book_result&resnum=3&ct=result#PPP1,M1
    Hope it helps.
    Thanks,
    Sai Chand.
    Edited by: sai chand on Jan 26, 2009 10:04 PM

  • Importing From Flat File with Dynamic Columns

    HI
    I am using SSIS 2008. I have a folder with four (4) ".txt" files, and each file has 2 columns (ID, NAME). I loaded the 4 files into one destination, but today I received one more ".txt" file that has 3 columns (ID, NAME, JOB). How can I get notified when a new column arrives in the source, and how can I create the extra column in my destination table dynamically? Please help me.

    Hi Sasidhar,
    You need a Script Task that, on each run, reads the column names and the number of columns from the first row of the flat file and stores them in a variable, then creates a staging table dynamically based on that variable, modifies the destination table definition if one or more new columns need to be added, and finally uses the staging table to load the destination table. I am afraid there is no ready-made script for your scenario, and you need some .NET coding experience to achieve your goal. Here is an example you can refer to:
    http://www.citagus.com/citagus/blog/importing-from-flat-file-with-dynamic-columns/
    Regards,
    Mike Yin
    TechNet Community Support
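    As a rough illustration of the "modify the destination table" step only, here is a hedged T-SQL sketch; the table name dbo.Destination and the column JOB are hypothetical, and in practice the column name and type would come from the header the Script Task reads:

    -- Add the new column only if it is not already on the destination table.
    IF COL_LENGTH('dbo.Destination', 'JOB') IS NULL
    BEGIN
        ALTER TABLE dbo.Destination ADD JOB nvarchar(100) NULL;
    END;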

  • Modifying the number of records to skip after importing the flat file

    I imported a flat file and the first row was the column header. I also created an external table for that flat file. The sqlldr is skipping the first record during the load. Is there a way to change this in the flat file module or External table?

    If you marked this row as the header in the sample wizard then you will see the following in the External Table:
    ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    CHARACTERSET WE8MSWIN1252
    STRING SIZES ARE IN BYTES
    NOBADFILE
    NODISCARDFILE
    NOLOGFILE
    SKIP 1
    So the external table is skipping this.
    Now the issue with changing it is interesting because you cannot change this after the sampling... I think this is a bug which I will file.
    Let me know if this answers the question,
    Jean-Pierre
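    If you need to change the skip count after sampling, one possible workaround (a hedged sketch only: the table name is hypothetical, the full generated parameter list should be repeated, and OWB may regenerate the DDL on the next deploy) is to redefine the access parameters directly:

    -- EXT_FLAT_FILE is a placeholder for the generated external table name.
    ALTER TABLE ext_flat_file
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        CHARACTERSET WE8MSWIN1252
        STRING SIZES ARE IN BYTES
        NOBADFILE
        NODISCARDFILE
        NOLOGFILE
        SKIP 0
      );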

  • Import unicode flat file

    Hello ,
    I am trying to import a Unicode (Windows Unicode, not UTF-8) flat file, but the load does not execute successfully.
    There is an error message in Deployment Manager.
    RPE-01013: SQL Loader reported error condition, number -1073741819.
    My environment configuration is as follows:
    Database ver. 9.2
    Database CharacterSet : AL16UTF16
    OWB ver 9.2
    Import CharacterSet : AL16UTF16LE
    Import module : Flat file module.
    Source Flat file Language : Unicode (Windows unicode) Traditional Chinese
    Source Flat file : Export from Windows 2k
    Validate , generate & deploy .... OK
    What is wrong with the above configuration? Can anybody help? Thanks!
    Daniel

    Daniel,
    The language setting in the Windows registry variable NLS_LANG on the machine where SQL*Loader runs must match the language parameter of the database. You should also see more meaningful errors in the SQL*Loader log file (by default $ORACLE_HOME/owb/temp/<SQLLDR_MAP_NAME>.log).
    Nikolai Rochnik
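    To double-check the database side before rerunning the load, a hedged query (any account that can see NLS_DATABASE_PARAMETERS will do):

    -- Show the character set and language settings to compare against the client NLS_LANG.
    SELECT parameter, value
    FROM   nls_database_parameters
    WHERE  parameter IN ('NLS_CHARACTERSET', 'NLS_NCHAR_CHARACTERSET', 'NLS_LANGUAGE');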

  • Extraction, Flat File, Generic Data Source, Delta Initialization

    Extraction, Flat File, Generic Data Source, Delta Initialization
    I have a couple of questions regarding data extraction.
    1. If your data source is a flat file, e.g. an Excel file, I know that you have to create the DataSource on the BW side. How do you upload updates, by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables so that it becomes an SAP source?
    2. Can you please give me an example of a situation where you had to create a generic DataSource? What is the difference between Time Stamp, Calendar Day and Numeric Pointer? Which one is most commonly selected?
    3. I read somewhere that a generic DataSource does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking, please?
    4. What are the InfoPackage steps before, during and after delta initialization? I believe you can do it two ways:
    Full Update -> Initialize Delta Process (without Data Transfer) -> Delta Update, or
    Initialize Delta Process (with Data Transfer) -> Delta Update.
    Am I right? What is the most common method, and why?
    5. If you want to add a field to a DataSource after using it for 6 months, and you want to do it without re-initializing the delta queue, you add the field in RSA6 and then provide info for ABAP to populate the new field (the name of the DataSource, the extract structure, the field added, and the name of the application table which contains the field). How does this work now that there is no setup table, it having been deleted after initialization? How does the delta queue know that it is going to receive data which has been expanded by one field, or does it not need to know at all?
    Thanks!

    Hi,
    1. If your data source is a flat file, e.g. an Excel file, I know that you have to create the DataSource on the BW side. How do you upload updates, by selecting Delta Update when executing the next data load? Do you ever "convert" this Excel file into application tables so that it becomes an SAP source?
    Once you create a DataSource for flat file extraction, it is specific to the file source system, so you cannot change it to an application-table DataSource.
    In the InfoPackage you can change the source to the application server instead of the desktop; there is no need to change the DataSource.
    2. Can you please give me an example of a situation where you had to create a generic DataSource? What is the difference between Time Stamp, Calendar Day and Numeric Pointer? Which one is most commonly selected?
    We go for a generic DataSource when we don't find a suitable standard extractor (for example, if I want sales information together with finance information in one DataSource, there is generally no standard one, so we create a generic DataSource).
    Check the link below for more about generic DataSources:
    http://wiki.sdn.sap.com/wiki/display/BI/Generic+Extraction
    For delta capturing you can use:
    Time Stamp (if the table has a timestamp field, the last changed record in the table is stamped, so it is easy to derive the delta based on the timestamp).
    Calendar Day (if the table does not have a timestamp field, look for a calendar day field, so the delta can be derived from the date on which documents are changed).
    Numeric Pointer (if the table has neither of the above, use this option, where the delta is derived from a steadily increasing numeric value).
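    Purely as an illustration of the timestamp-based delta idea (not something you would hand-code; a real generic delta is configured on the DataSource, and the table and field names here are made up), the selection conceptually looks like this:

    -- Pick up only rows changed since the pointer saved by the previous extraction.
    SELECT *
    FROM   sales_documents
    WHERE  changed_on > :last_delta_timestamp;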
    3. I read somewhere that a generic DataSource does not have a setup table. I thought that you have to have a setup table in order to load transaction data, otherwise you will lock the application tables. Where am I going wrong in my thinking, please?
    With a generic DataSource we extract data directly from the database table, without any interface between the applications/systems, so no setup table is involved.
    4. What are the InfoPackage steps before, during and after delta initialization? I believe you can do it two ways:
    Full Update -> Initialize Delta Process (without Data Transfer) -> Delta Update, or
    Initialize Delta Process (with Data Transfer) -> Delta Update.
    Am I right? What is the most common method, and why?
    Correct.
    5. If you want to add a field to a DataSource after using it for 6 months, and you want to do it without re-initializing the delta queue, you add the field in RSA6 and then provide info for ABAP to populate the new field. How does this work now that there is no setup table, and how does the delta queue know that it is going to receive data which has been expanded by one field?
    Once you add the new field to the structure (DataSource), you get data for it from that date onwards, not historical data, so the setup table is not the issue here (delta records come from the delta queue, not from the setup table).
    If you want historical data for the new field, then you need setup table deletion, a refill, and so on.
    Hope it is clear.
    Regards,
    Satya

  • What are the advantages of IDocs compared to flat files, and how is the data more secure?

    What are the advantages of an IDoc compared to a flat file? How is the data more secure in IDocs compared to flat files?

    Hi Ramana,
    In simple words, the main advantage of an IDoc over a flat file is security.
    Consider this scenario: you have a flat file with all the data on the presentation server. As long as you have the flat file you can modify the data in it, and so can anyone else who manages to access the file.
    One level of higher security is keeping the flat file on the application server. There, most people do not have access to the file, but a super user who does have access can still modify or delete the data.
    So at both of those levels you do not have 100% security.
    That is where IDocs come into the picture. IDocs are simply data carriers; they are generated by a program, not manually, and the data is divided into a number of segments based on your program. So it is not easy to modify the data in an IDoc manually. If any change has to be made, you have to change the data in the application and then update the IDoc, or generate a new IDoc with the corrected data. So in this case not even a super user can manipulate the data directly in the IDoc.
    I think you see what I mean.
    If you find this useful, award the points.
    ~~Guduri

  • Flat File - Delete data

    Hi Experts,
    I have a flat file as a data source. I want to know how to handle deletes with flat files.
    For example, if this is my flat file data:
    Company...Costcenter....Stock
    A......B......10
    M.....Z.......20
    K......W......25
    Company and Costcenter are keys.
    If the above three records were loaded into an ODS last month, and this month I realize that the A..B..10 record is wrong and not needed, how do I delete it from the ODS? Do I need to send some indicator in the flat file for BW to understand that this is a delete entry?
    Another case: if the A..B record changes its value from 10 to 35, I can send the same file or the same record in a different file again and it will overwrite the data, which is OK.
    Can the experts please explain these data load strategies?
    Jason

    Hello,
    For case 1) Do a reverse posting of the record you wish to delete from the ODS, i.e. send A..B..-10.
    Case 2) If you are loading data only up to the ODS, the ODS has an active data table and a change log table.
    The new record A..B..35 will overwrite the existing record A..B..10. If you then do a further update to a cube, the change log will first create a reverse image of the existing record, i.e. A..B..-10, and then write the new record A..B..35.
    So the change log contains:
    A..B..10
    A..B..-10
    A..B..35, and the remaining result is A..B..35.
    This works provided you maintain the delta method in your flat file.
    -- EnjoySAP :-)
    **Award credits if you find the solution. Have a great day.

  • How to convert Flat file(.txt) data to an Idoc format(ORDERS05)

    Hi,
    How do I convert flat file (.txt) data to IDoc format (ORDERS05)? If any FM does this, please let me know.
    thanks in advance,
    Chand
    Moderator message : Duplicate post locked. Read forum rules before posting.
    Edited by: Vinod Kumar on Jul 26, 2011 11:11 AM

    Hi,
            For more information, please check this link.
    http://sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/46759682-0401-0010-1791-bd1972bc0b8a
    Have a look at the FM IDOC_XML_FROM_FILE. Maybe it helps.
    Regards

  • When I see the usage on my iPod touch 4 they have categories for how much space is being used where. What does the "other" category contain? Like what could I delete on my iPod to get rid of the "other" category?

    when I see the usage on my iPod touch 4 they have categories for how much space is being used where. What does the "other" category contain? Like what could I delete on my iPod to get rid of the "other" category?

    What is the Other on my iPhone and How to Remove It
    An "other" larger than about 1 1/2 GB usually indicates that the "other" includes corrupted files.
    First try:
    "First you go settings/general/usage/music/then left swipe over music and press delete and you would think it deletes your music but it doesn't instead it deletes your other but make sure your ipod is connected to a computer while doing this"
    as recommended by:
    How do I get rid of "other" storage: Apple Support Communities
    Next, restoring from backup usually eliminates the corrupted files. However, sometimes restoring to factory settings/new iPod is required.
    To restore from backup see:
    iOS: How to back up
    To restore to factory settings/new iPod see:
    iTunes: Restoring iOS software

  • Text file contains data separated by pipe symbol read the data and saved in

    Hi,
    This is Sreedhar; I am new to Java. My query is: I have a report in text file format that contains data like GLNO, name and amount. All fields are separated by the pipe (|) symbol. I would like to read that
    data and save it into the database. Please, can anyone help me?

    Thanks, Ottobonn.
    Scanner is very useful for string operations like the one in my problem, but when I tried to find java.util.Scanner in J2ME I couldn't find the class :(
    so it seems I can't use that class in J2ME.
    I'm new to Java, so I tried to write a simple method for my problem myself;
    maybe someone can make this class simpler than my code :)
    public class StringPipe {
        public StringPipe(String _msg) {
            String message = _msg;
            while (true) {
                int pipeX = message.indexOf("|");
                String msg;
                if (pipeX == -1) {
                    // no pipe left: the remainder is the last token
                    msg = message;
                } else {
                    // token before the pipe, then drop it (and the pipe) from the message
                    msg = message.substring(0, pipeX);
                    message = message.substring(pipeX + 1, message.length());
                }
                // the string separated by pipe
                System.out.println("msg = " + msg);
                if (pipeX == -1) {
                    break;
                }
            }
        }
    }
    thx...

  • Split flat file column data into multiple columns using ssis

    Hi All, I need some help with SSIS.
    I have a source file with a single column, Column1. I want to split the Column1 data into
    multiple columns wherever there is a semicolon (';'), and there is no fixed
    length between the semicolons. For example:
    Column1:
    John;Sam;Greg;David
    At the destination we have 4 columns, say D1, D2, D3, D4,
    and I want to map
    John -> D1
    Sam -> D2
    Greg -> D3
    David -> D4
    Please, I need it ASAP.
    Thanks in advance,
    RH

    Imports System
    Imports System.Data
    Imports System.Math
    Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
    Imports Microsoft.SqlServer.Dts.Runtime.Wrapper
    Imports System.IO

    Public Class ScriptMain
        Inherits UserComponent

        Private textReader As StreamReader
        Private exportedAddressFile As String

        Public Overrides Sub AcquireConnections(ByVal Transaction As Object)
            ' The flat file connection manager named "Connection" returns the file path as a string.
            Dim connMgr As IDTSConnectionManager90 = Me.Connections.Connection
            exportedAddressFile = CType(connMgr.AcquireConnection(Nothing), String)
        End Sub

        Public Overrides Sub PreExecute()
            MyBase.PreExecute()
            textReader = New StreamReader(exportedAddressFile)
        End Sub

        Public Overrides Sub CreateNewOutputRows()
            Dim nextLine As String
            Dim columns As String()
            Dim cols As String()
            Dim delimiters As Char()

            delimiters = ",".ToCharArray
            nextLine = textReader.ReadLine

            Do While nextLine IsNot Nothing
                ' Split the line into its two file columns: the ID and the semicolon-packed value.
                columns = nextLine.Split(delimiters)
                With Output0Buffer
                    ' Split the second file column on the semicolons.
                    cols = columns(1).Split(";".ToCharArray)
                    .AddRow()
                    .ID = Convert.ToInt32(columns(0))
                    If cols.GetUpperBound(0) >= 0 Then
                        .Col1 = cols(0)
                    End If
                    If cols.GetUpperBound(0) >= 1 Then
                        .Col2 = cols(1)
                    End If
                    If cols.GetUpperBound(0) >= 2 Then
                        .Col3 = cols(2)
                    End If
                    If cols.GetUpperBound(0) >= 3 Then
                        .Col4 = cols(3)
                    End If
                End With
                nextLine = textReader.ReadLine
            Loop
        End Sub

        Public Overrides Sub PostExecute()
            MyBase.PostExecute()
            textReader.Close()
        End Sub
    End Class
    Put this code in your script component. Before that, add 5 columns to the script component output and name them ID, Col1, Col2, Col3 and Col4; ID is of data type int. Also add a flat file connection manager, name it Connection, and point it at the source flat file.
    I am not sure what the delimiter between the 2 columns is in your flat file; I have used a comma, so change it accordingly.
    This is the output I get:
    ID  Col1   Col2   Col3   Col4
    1   john   Greg   David  Sam
    2   tom    tony   NULL   NULL
    3   harry  NULL   NULL   NULL
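    If the packed value always has at most four parts and the values themselves never contain periods, a hedged set-based alternative is to load Column1 as-is and split it in T-SQL afterwards; the staging table and column names below are hypothetical, and with fewer than four parts the PARSENAME results shift towards D3/D4:

    -- PARSENAME reads up to four period-separated parts from right to left,
    -- so swap the semicolons for periods and read parts 4..1.
    SELECT ID,
           PARSENAME(REPLACE(Column1, ';', '.'), 4) AS D1,
           PARSENAME(REPLACE(Column1, ';', '.'), 3) AS D2,
           PARSENAME(REPLACE(Column1, ';', '.'), 2) AS D3,
           PARSENAME(REPLACE(Column1, ';', '.'), 1) AS D4
    FROM   dbo.Staging;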

  • How to import flat file with date in filename on a regular basis

    Hi,
    Using OWB 11gR1
    I have a file that will be delivered to an FTP each night with the date in the filename having the form YYYYMMDD-FILE.txt (ex: 20100326-FILE.txt) that I want to import to an external table.
    Now I've set up the import to the external table but am only able to import files that I specify the name for exactly. I've tried pointing to filenames such as "*-FILE.txt" and "%-FILE.txt" but that only results in errors.
    It must be possible to automatically import files with different filenames but the same structure, mustn't it? If anyone could help me solve this, it would be greatly appreciated.
    Thank you in advance.

    Hi
    For dynamic files you can:
    1. alter the DDL of the external table so that its location points to the file with the changing name;
    2. copy the file to a fixed name before using the external table/maps; or
    3. use the preprocessor to cat/pipe these files for the external table (a sketch follows this reply). See the post here http://blogs.oracle.com/warehousebuilder/2009/06/file_staging_using_external_table_preprocessor.html; it shows gunzip being used, but the preprocessor could simply 'cat' a bunch of files to standard output.
    Cheers
    David
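    A hedged sketch of option 3: the directory objects, script name and column list are all hypothetical, and the PREPROCESSOR clause needs Oracle 11g and execute permission on the script's directory:

    -- cat_files.sh in EXEC_DIR would simply 'cat' /path/to/incoming/*-FILE.txt to stdout;
    -- the location file is passed to the script as $1 and is ignored here.
    CREATE TABLE ext_daily_file (
      id    NUMBER,
      name  VARCHAR2(100)
    )
    ORGANIZATION EXTERNAL (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY data_dir
      ACCESS PARAMETERS (
        RECORDS DELIMITED BY NEWLINE
        PREPROCESSOR exec_dir:'cat_files.sh'
        FIELDS TERMINATED BY ','
      )
      LOCATION ('dummy.txt')
    )
    REJECT LIMIT UNLIMITED;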

  • SSIS - Import Multiple flat files with different metadata

    Hi ,
    I have a set of flat files with different metadata structures that I would like to load into staging tables.
    1) Can we load the flat files into the staging tables without having multiple data flow tasks?
    2) If possible, can we programmatically select the staging table based on the metadata of the flat file and load it?
    Please advise.
    Thanks
    Thiya

    No; in SSIS a data flow task needs to have fixed metadata. So if your file metadata varies, the best option is to use the OPENROWSET syntax to pull the data and populate your staging table. You may also use
    SELECT .. INTO StagingTable ... FROM OPENROWSET (...)
    syntax to create the staging table at runtime based on the file metadata (a hedged example follows below).
    http://sqlmate.wordpress.com/2012/08/09/use-your-text-csv-files-in-your-queries-via-openrowset/
    If you want to do this in SSIS, you need to create the data flow dynamically using a script task and build the metadata yourself; see
    http://www.selectsifiso.net/?p=288
    Please mark this as answer if it helps to solve the issue. Visakh ---------------------------- http://visakhm.blogspot.com/ https://www.facebook.com/VmBlogs
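    A hedged example of the SELECT ... INTO ... FROM OPENROWSET(BULK ...) approach; the file path and format file are hypothetical, each distinct layout needs its own format file, and bulk-load permissions are required:

    -- Creates dbo.Staging_File1 at runtime with the columns described in the format file.
    SELECT *
    INTO   dbo.Staging_File1
    FROM   OPENROWSET(
               BULK 'C:\import\file1.txt',
               FORMATFILE = 'C:\import\file1.fmt',
               FIRSTROW = 2          -- skip the header row, if there is one
           ) AS src;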

  • Can you use a subquery when importing from a file?

    I was wondering if the DBMS_DATAPUMP API allows using a subquery (DATA_FILTER) when executing an import from a file?
    I am unable to limit my data when creating the export file, because my subquery requires a database link, which is not an option on the server I'm working on. So I'm wondering if I can limit my data on the import from a file rather than on the export.
    Thanks for any advice...
    ivalum21

    "I was wondering if the DBMS_DATAPUMP API allows using a subquery (DATA_FILTER) when executing an import from a file?"
    From the documentation I would say yes:
    SUBQUERY:
    Specifies a subquery that is added to the end of the SELECT statement for the table. If you specify a WHERE clause in the subquery, you can restrict the rows that are selected. Specifying an ORDER BY clause orders the rows dumped in the export, which improves performance when migrating from heap-organized tables to index-organized tables.
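    A hedged PL/SQL sketch of applying the SUBQUERY data filter on an import job; the dump file name, directory object, table name and WHERE clause are all hypothetical:

    DECLARE
      h NUMBER;
    BEGIN
      h := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'TABLE');
      DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'exp_orders.dmp',
                             directory => 'DATA_PUMP_DIR');
      -- Restrict the rows loaded for one table; roughly what impdp's QUERY parameter does.
      DBMS_DATAPUMP.DATA_FILTER(handle => h, name => 'SUBQUERY',
                                value => 'WHERE order_date >= DATE ''2010-01-01''',
                                table_name => 'ORDERS');
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.DETACH(h);
    END;
    /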
