Flat file mapping question

Hello
I'm very new to XI. I have a flat file that contains Order No, Product, and Value, and looks something like this:
A,A,1
A,B,1
A,C,1
B,A,1
B,B,2
B,C,3
I need to be able to map it to a target structure that has a header for the Order No and detail records for the Product and Value fields, i.e. it looks a bit like this:
<Order No>
A
<Product> A </Product>
<Value> 1 </Value>
<Product> B </Product>
<Value> 1 </Value>
<Product> C </Product>
<Value> 1 </Value>
</Order No>
<Order No>
B
<Product> A </Product>
<Value> 1 </Value>
<Product> B </Product>
<Value> 2 </Value>
<Product> C </Product>
<Value> 3 </Value>
</Order No>
In my attempts with various contexts etc. I either get just one output message for order A, or a message for every product within an order (so 6 messages). Amateur stuff, I know.....
Thanks for your help
Regards
Robert

Hi,
Please do this mapping:
For the node Order
OrderNumber --> removeContext --> splitByValue:ValueChanged --> collapseContext --> Order
For the node Ordernumber
OrderNumber --> removeContext --> splitByValue:ValueChanged --> collapseContext --> splitByValue:EachValue --> Ordernumber
For the node OrderLines
OrderNumber --> removeContext --> splitByValue:ValueChanged --> OrderLines
For the node Product
Product --> Product
For the node Value
Value --> Value
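The effect of this chain can be sketched outside XI: splitByValue:ValueChanged starts a new context whenever the order number changes between adjacent rows, and each context becomes one Order. A rough stand-alone illustration (plain Java, names made up for the example; this is not XI code):

```java
import java.util.*;

public class OrderGrouping {
    // Group consecutive CSV rows by their first field (the order number),
    // mimicking splitByValue:ValueChanged, which opens a new context
    // whenever the value changes between adjacent rows.
    static List<List<String[]>> groupByOrder(List<String> csvLines) {
        List<List<String[]>> orders = new ArrayList<>();
        String previous = null;
        for (String line : csvLines) {
            String[] f = line.split(",");
            if (!f[0].equals(previous)) {   // value changed: start a new order context
                orders.add(new ArrayList<>());
                previous = f[0];
            }
            orders.get(orders.size() - 1).add(f);
        }
        return orders;
    }

    public static void main(String[] args) {
        List<String> rows = Arrays.asList("A,A,1", "A,B,1", "A,C,1",
                                          "B,A,1", "B,B,2", "B,C,3");
        for (List<String[]> order : groupByOrder(rows)) {
            System.out.println("<OrderNo> " + order.get(0)[0]);
            for (String[] f : order)
                System.out.println("  <Product> " + f[1] + " </Product> <Value> " + f[2] + " </Value>");
            System.out.println("</OrderNo>");
        }
    }
}
```

Note that, like ValueChanged, this groups on *consecutive* changes, so the input must arrive sorted by order number (as in the example above).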
Hope this helps,
Edited by: Mark Dihiansan on Mar 12, 2009 6:04 PM

Similar Messages

  • ORDERS05 IDOC to flat file - mapping question

    I have a requirement to convert an ORDERS05 IDOC to a "flat" structure for one of our vendors.
    When the IDOC has the following data, the mapping works well
    Input
    =====
    ORDERS05
       - E1EDP01
          POSEX        00010
          - E1EDPT1
              -E1EDPT2
                 TDLINE   line 1 text
      - E1EDP01
          POSEX        00020
          - E1EDPT1
              -E1EDPT2
                 TDLINE   line 2 text
    Output
    ======
    GWC
      - GWC08-RECORD
          GWC08-SEQNO      0010
          GWC08-TXT        line 1 text
      - GWC08-RECORD
          GWC08-SEQNO      0020
          GWC08-TXT        line 2 text
    However when the IDOC has the following data
    ORDERS05
      - E1EDP01
          POSEX        00010
      - E1EDP01
          POSEX        00020
          - E1EDPT1
              -E1EDPT2
                 TDLINE   line 2 text
    the mapping yields wrong results as below
    GWC
      - GWC08-RECORD
          GWC08-SEQNO      0010
          GWC08-TXT        line 2 text
    The vendor has a requirement that the GWC08-RECORD should only be sent if TDLINE exists in the IDOC - so that part works. However, Text on Line 2 is being associated with Line 1.
    If I am not mistaken, this has to do with the Input queues for POSEX and that for TDLINE having differing number of entries. However, I have no idea how to go about fixing this
    Appreciate any help on this
    thanks
    anil

    Hi,
    This is possible without UDF:
    Please see mapping below:
    For the node GWC08-RECORD
    TDLINE --> removeContext --> GWC08-RECORD
    For the node GWC08-SEQNO
    TDLINE (set context to E1EDP01) --> exists --> ifWithoutElse --> removeContext --> SplitByValue:EachValue --> GWC08-SEQNO
    POSEX ------------------------------------------>/
    For the node GWC08-TXT
    TDLINE (set context to E1EDP01) --> exists --> ifWithoutElse --> removeContext --> SplitByValue:EachValue --> GWC08-TXT
    TDLINE (set context to E1EDP01)----------------->/
    Hope this helps,
    Regards
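    The key step above can be sketched outside XI (plain Java; the Item class is a made-up stand-in for an E1EDP01 segment, not an XI API): a record is emitted only for items whose TDLINE exists, which is what the exists --> ifWithoutElse step enforces, so line 2's text can no longer be paired with line 1's POSEX.

```java
import java.util.*;

public class TdlineFilter {
    // Made-up stand-in for one E1EDP01 item: POSEX plus an optional TDLINE.
    static class Item {
        final String posex, tdline;   // tdline == null means no E1EDPT2/TDLINE
        Item(String posex, String tdline) { this.posex = posex; this.tdline = tdline; }
    }

    // Emit a (SEQNO, TXT) pair only for items whose TDLINE exists, so the
    // POSEX and TDLINE queues stay aligned item by item.
    static List<String[]> toRecords(List<Item> items) {
        List<String[]> records = new ArrayList<>();
        for (Item it : items)
            if (it.tdline != null)
                records.add(new String[] { it.posex, it.tdline });
        return records;
    }

    public static void main(String[] args) {
        // Second IDoc from the question: item 00010 has no text, 00020 does.
        List<Item> idoc = Arrays.asList(new Item("00010", null),
                                        new Item("00020", "line 2 text"));
        for (String[] r : toRecords(idoc))
            System.out.println(r[0] + " -> " + r[1]);
    }
}
```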

  • Flat File  mapping issue

    Hello All,
    I am trying to do an extract using the flat file method in BI 7.0. I have some 150 fields in my CSV file, but I want only about 10, which are scattered around the CSV file. When I use Read Preview Data in the Preview tab I see incorrect data there, all under one field, even though in the Extraction tab I set ; as the data separator (and also tried checking the HEX box) and " as the escape sign (tried HEX for this as well). For rows to ignore I have 1.
    One thing I would like to know is how the BI InfoObject will know the position of the flat file field in the CSV file and where it will be mapped. I know it can be mapped in the Transformations, but that is from the flat file DataSource; I am asking about the CSV file.
    Please help me out and tell me what I am doing incorrectly.
    Thanks for your help. Points will be assigned.

    hi,
    use , and ; as the escape signs.
    The system takes care of it when you specify the path and name of the file and the format as CSV.
    The system always checks the one-to-one mapping of the flat file fields with the InfoObjects in the DataSource.
    Options for you:
    1. Arrange the necessary fields in the flat file so that they map exactly to the InfoObjects, then start loading.
    2. Keep the file as is, load it with the scattered fields, and in the transformation map only the required fields.
    The second option unnecessarily consumes more memory.
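    As an illustration of option 2, picking a handful of scattered columns out of a wide delimited row can be sketched like this (plain Java; the column indices and the sample row are made up for the example):

```java
import java.util.*;

public class CsvColumnPick {
    // Keep only the wanted column positions from a semicolon-separated line.
    static List<String> pick(String line, int[] wanted) {
        String[] fields = line.split(";", -1);  // -1 keeps trailing empty fields
        List<String> out = new ArrayList<>();
        for (int i : wanted)
            out.add(fields[i]);
        return out;
    }

    public static void main(String[] args) {
        String row = "1000;;EUR;;;42;;;DE";
        // Indices 0, 2, 5, 8 stand in for the ~10 fields needed out of 150.
        System.out.println(pick(row, new int[] { 0, 2, 5, 8 }));  // [1000, EUR, 42, DE]
    }
}
```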
    For BI 7.0, follow the step-by-step directions below to load data from flat files:
    Uploading of master data
    Log on to your SAP system.
    Transaction code RSA1 leads you to Modelling.
    1. Creation of Info Objects
    • In left panel select info object
    • Create info area
    • Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    • Create new characteristics and key figures under respective catalogs according to the project requirement
    • Create required info objects and Activate.
    2. Creation of Data Source
    • In the left panel select data sources
    • Create application component(AC)
    • Right click AC and create datasource
    • Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
    • In general tab give short, medium, and long description.
    • In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
    • In proposal tab load example data and verify it.
    • In the Fields tab you can give the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, since the system maps them automatically. If you do not fill the Fields tab, you have to map manually during the transformation in the InfoProviders.
    • Activate data source and read preview data under preview tab.
    • Create an InfoPackage by right-clicking the data source, and in the Schedule tab click Start to load the data to the PSA. (Make sure the flat file is closed during loading.)
    3. Creation of data targets
    • In left panel select info provider
    • Select created info area and right click to select Insert Characteristics as info provider
    • Select required info object ( Ex : Employee ID)
    • Under that info object select attributes
    • Right click on attributes and select create transformation.
    • In source of transformation , select object type( data source) and specify its name and source system Note: Source system will be a temporary folder or package into which data is getting stored
    • Activate created transformation
    • Create Data transfer process (DTP) by right clicking the master data attributes
    • In extraction tab specify extraction mode ( full)
    • In update tab specify error handling ( request green)
    • Activate DTP and in execute tab click execute button to load data in data targets.
    4. Monitor
    Right-click the data target, select Manage, and in the Contents tab choose Contents to view the loaded data. Alternatively, the monitor icon can be used.
    BW 7.0
    Uploading of Transaction data
    Log on to your SAP system.
    Transaction code RSA1 leads you to Modelling.
    5. Creation of Info Objects
    • In left panel select info object
    • Create info area
    • Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    • Create new characteristics and key figures under respective catalogs according to the project requirement
    • Create required info objects and Activate.
    6. Creation of Data Source
    • In the left panel select data sources
    • Create application component(AC)
    • Right click AC and create datasource
    • Specify data source name, source system, and data type ( Transaction data )
    • In general tab give short, medium, and long description.
    • In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
    • In proposal tab load example data and verify it.
    • In the Fields tab you can give the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, since the system maps them automatically. If you do not fill the Fields tab, you have to map manually during the transformation in the InfoProviders.
    • Activate data source and read preview data under preview tab.
    • Create an InfoPackage by right-clicking the data source, and in the Schedule tab click Start to load the data to the PSA. (Make sure the flat file is closed during loading.)
    7. Creation of data targets
    • In left panel select info provider
    • Select created info area and right click to create ODS( Data store object ) or Cube.
    • Specify a name for the ODS or Cube and click Create
    • From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
    • Click Activate.
    • Right click on ODS or Cube and select create transformation.
    • In source of transformation , select object type( data source) and specify its name and source system Note: Source system will be a temporary folder or package into which data is getting stored
    • Activate created transformation
    • Create a Data Transfer Process (DTP) by right-clicking the ODS or Cube
    • In extraction tab specify extraction mode ( full)
    • In update tab specify error handling ( request green)
    • Activate DTP and in execute tab click execute button to load data in data targets.
    8. Monitor
    Right-click the data target, select Manage, and in the Contents tab choose Contents to view the loaded data. An ODS has two tables, a new table and an active table; to move data from the new table to the active table, activate it after selecting the loaded data. Alternatively, the monitor icon can be used.
    Ramesh

  • Error in flat file mapping

    I am using a flat file as source, and when I try to validate the mapping I get the following error:
    No data file specified
    I tried to configure the mapping but did not find any option asking for a source file name.
    Can you please help me?
    Kiran

    Hi Kiran
    As far as I remember, when you configure the map, the map configuration property inspector has a tree on the left-hand side. If you navigate to the flat file operator and right-click on that node, you should be able to add a data file. The properties for the data file are shown on the right-hand side when it is selected in the tree.
    Cheers
    David

  • OWB FLAT FILE MAPPING NOT ABLE TO DEPLOY

    Hi,
    I've recently started studying data warehousing and exploring the Warehouse Builder version 3i.
    I've created a simple mapping from a CSV file to a database table, i.e. my source is a flat file and my target is a database table. The mapping validates successfully and the .ctl script is also generated. But the 'Deploy' and 'Run' buttons on the 'Transformations' tab of the 'Generation Results' window are disabled (greyed out), and hence I cannot deploy the generated script.
    Please do suggest the cause and solution for the same.
    Regards,
    Barkha

    Many Thanks to Winfred Deering from Oracle who replied:
    "This is the expected behavior for flat file mappings. You can register the tcl script with Oracle Enterprise Manager and run it from there. Also you can save the control file and execute via sql loader. "
    Thanks and Regards,
    Barkha

  • Flat file mapping problem.

    Hi,
    I've just created a file mapping, and I'm trying to split a flat file field into 3 subfields for an XML record, but I get the following message:
    Before the process runs, I tested the message mapping and the interface mapping, and the log said that the mapping ended successfully. Have you ever seen this problem before?
    I appreciate your help, thanks.
    Marzolla.

    Hi Jorge,
              Check your XML input. Try to put the same xml instance in the message mapping transformation and test it out.
    Regards,
    Dhana

  • Idoc to flat file mapping using XSLT

    Hi,
    I am using XSLT mapping. My requirement is a mapping between an IDoc and a flat file (XML to text). As I do not want to use FCC, I have opted for XSLT mapping. Please let me know of any article that would be helpful for this.
    regards,
    Meenakshi

    Hi Meenakshi,
    Two things:
    1. Achieving this functionality using XSLT is very difficult.
    2. You may not be able to find a document on SDN that directly covers converting IDoc XML to a flat file with XSLT. Try Google.
    I found one link; maybe you can get some ideas from it:
    http://www.stylusstudio.com/SSDN/default.asp?action=9&read=6453&fid=48
    Also, if you have an XSLT editor like XMLSpy or Stylus Studio, creating your specific XSLT will be much simpler.
    Regards
    Suraj

  • Extended Analytics Flat File extract question

    I have modified the Extended Analytics SDK application as such:
    HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_FLATFILE
    instead of
    HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_STANDARD
    I am trying to figure out where the Flat file extract is placed once the Extract is complete. I have verified that I am connecting to HFM and the log file indicates all the steps in the application have completed successfully.
    Where does the FLATFILE get saved/output when using the API?
    Ultimate goal is to create a flat file through an automated process which can be picked up by a third party application.
    thanks for your help,
    Chris

    Never mind. I found the location on the server.

  • Flat File Mapping -- Looping Scenario

    Hi All,
    I have a flat file structure like below:
    My input flat file is like below:
    H,Header,,,
    O,Address,,,
    L,1,100,,
    L,2,400,,
    SL,2,225,,
    SL,2,175,,
    SL (ScheduledLineRecord) is an optional segment; a LineLoop may or may not contain it. The actual quantity from the L (DetailLine) segment gets split across the SL records, i.e. in the above example the 400 in L got split into 225 and 175 in SL.
    On the destination side I have a line loop. I want to pass the quantity (the 3rd element of the L/SL segment) into one particular field. The logic is: if the LineLoop has SL, pass the quantity from the SL segments to that field; otherwise pass the quantity from the L segment.
    So with the input shown above, the required field should repeat 3 times and contain these values:
    <LineRecord>
        <xyz>100</xyz>
    </LineRecord>
    <LineRecord>
        <xyz>225</xyz>
    </LineRecord>
    <LineRecord>
        <xyz>175</xyz>
    </LineRecord>
    I want to avoid using custom XSLT. Can we do it without custom XSLT? How?
    Thanks, Girish R. Patil.

    You can achieve it using custom XSLT or inline XSLT in a Scripting functoid.
    The script below will achieve the required results:
    <xsl:for-each select="LineLoop/LineRecord">
      <xsl:variable name="LRlineNumber" select="LineNumber/text()"/>
      <xsl:choose>
        <xsl:when test="count(../ScheduledLineRecord[LineNumber=$LRlineNumber]) > 0">
          <xsl:for-each select="../ScheduledLineRecord[LineNumber=$LRlineNumber]">
            <LineRecord>
              <xyz>
                <xsl:value-of select="Quantity/text()"/>
              </xyz>
            </LineRecord>
          </xsl:for-each>
        </xsl:when>
        <xsl:otherwise>
          <LineRecord>
            <xyz>
              <xsl:value-of select="Quantity/text()"/>
            </xyz>
          </LineRecord>
        </xsl:otherwise>
      </xsl:choose>
    </xsl:for-each>
    You may need to tweak it if there is a namespace, or if elements or attributes are qualified in the actual schema.
    Regards
    When you see answers and helpful posts, please click Vote As Helpful, Propose As Answer, and/or Mark As Answer.

  • Streams question / file mapping question

    1. I read that I have to close all streams I use. But, on the other hand, in the topic on layering streams (in the same books or tutorials) I see a piece of code like this:
    DataInputStream dis = new DataInputStream(new BufferedInputStream(new FileInputStream("file.txt")));
    It's obvious that I can explicitly close only the dis stream object. What about the rest created along the way?
    2. I have this code snippet:
    import java.io.*;
    import java.nio.*;
    import java.nio.channels.*;
    public class Main {
        public static void main(String[] args) {
            try {
                FileChannel channel = new FileInputStream("c:\file.txt").getChannel();
                long size = channel.size();
                ByteBuffer buf = channel.map(FileChannel.MapMode.READ_WRITE, 0, size);
                for (long i = 0; i < size; ++i) {
                    buf.putChar('a');
                }
            } catch (IOException exc) {
            }
        }
    }
    When the program terminates, the file is not updated. Why is that?
    I am using Win XP.
    Thanks!

    2. OK, it generated an exception because the path was wrong (forgot about the double backslash). Here is the code:
    package Main;
    import java.io.*;
    import java.nio.*;
    import java.nio.channels.*;
    public class Main {
        public static void main(String[] args) {
            try {
                FileChannel channel = new FileInputStream("c:\\file.txt").getChannel();
                long size = channel.size();
                ByteBuffer buf = channel.map(FileChannel.MapMode.READ_WRITE, 0, size);
                for (long i = 0; i < size; ++i) {
                    buf.putChar('a');
                }
            } catch (IOException exc) {
                System.out.println("Fuck it, an exception!");
                exc.printStackTrace();
            }
        }
    }
    Now it generates a NonWritableChannelException at the line:
    ByteBuffer buf = channel.map(FileChannel.MapMode.READ_WRITE, 0, size);
    I just don't get what's wrong with the code.
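    For what it's worth: a FileChannel obtained from a FileInputStream is read-only, so mapping it with MapMode.READ_WRITE throws NonWritableChannelException. A RandomAccessFile opened in "rw" mode yields a writable channel. A sketch under that assumption (the loop is also changed to stop before overflowing the buffer, since putChar writes two bytes per call):

```java
import java.io.*;
import java.nio.*;
import java.nio.channels.*;

public class MapWrite {
    // Fill an existing file with 'a' characters through a writable mapped buffer.
    static void fillWithA(String path) throws IOException {
        try (RandomAccessFile raf = new RandomAccessFile(path, "rw");
             FileChannel channel = raf.getChannel()) {
            MappedByteBuffer buf = channel.map(FileChannel.MapMode.READ_WRITE, 0, channel.size());
            while (buf.remaining() >= 2)   // putChar writes 2 bytes, not 1
                buf.putChar('a');
            buf.force();                   // flush mapped changes to disk
        }
    }

    public static void main(String[] args) throws IOException {
        fillWithA("c:\\file.txt");         // assumes the file already exists
    }
}
```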

  • Flat file to target table map in 11G

    Hi,
    I am having many problems with my 10G R1 map that has been migrated to 11G.
    All my 10G map does is map from a flat file to a stage table. It also had one constant operator, and I am getting an error because of the constant that I set up in the map.
    When I validate this constant I get the error message:
    API8534:Validation no supported for language SQLLOADER,property Expression for SAASD_MAP
    -- Generator Version : 11.1.0.7.0
    -- Created Date : Mon May 11 21:16:25 CDT 2009
    -- Modified Date : Mon May 11 21:16:25 CDT 2009
    -- Created By : OWB_WUSER
    -- Modified By : OWB_WUSER
    -- Generated Object Type : SQL*Loader Control File
    -- Generated Object Name : "SADAD_MAP"
    -- Copyright © 2000, 2007, Oracle. All rights reserved.
    OPTIONS (BINDSIZE=50000,ERRORS=50,ROWS=200,READSIZE=65536)
    LOAD DATA
    CHARACTERSET WE8MSWIN1252
    INFILE '{{ETL_FILE_LOC.RootPath}}{{}}dgp.dat'
    BADFILE '{{ETL_FILE_LOC.RootPath}}{{}}dgp.bad'
    DISCARDFILE '{{ETL_FILE_LOC.RootPath}}{{}}dgp.dis'
    CONCATENATE 1
    INTO TABLE "STAGING"."DGP_STAGG"
    APPEND
    REENABLE DISABLED_CONSTRAINTS
    WHEN (1:2) = 'EM'
    "PHONE_COUNTRY" CONSTANT ''Asia'',
    "MRI" POSITION(1:2) CHAR(2) ,
    If you note, 11G somehow does not allow me to have a constant operator in flat file maps, and the constant for the PHONE_COUNTRY column is enclosed in doubled single quotes instead of single quotes [should be 'Asia' and not ''Asia''].
    Can anyone tell me why this is happening? I also have a problem with the READBUFFERS property, which in 10G used to default to 4 but was ignored when generating the .ctl; in 11G this property creates a READBUFFERS statement in the .ctl. Any help with this would be great.
    Edited by: user591315 on May 12, 2009 7:39 AM

    > ... the constant for PHONE_COUNTRY column is enclosed in double quotes instead of single quotes.
    Hi,
    Please open the same mapping in 10g R2; it is the same. It uses double quotes around the column name.
    By the way, execute the mapping and then check whether you are getting the desired result.

  • Transfer data from Result Set (Execute SQL Task) into Flat File

    Hi to all,
    My problem is this:
     -) I have a stored procedure which uses a temporary table joined to another "real" table in the select statement. In particular, the stored procedure looks like this:
    create table #tmp
    (
        col1 varchar(20),
        col2 varchar(50)
    )
    --PUT SOME VALUE IN #TMP
    insert into #tmp
    values ('abc','xyz')
    --SELECT SOME VALUE
    select rt.*
    from realTable rt, #tmp
    where rt.col1 = #tmp.col1
     -) I cannot modify the stored procedure because I'm not an admin of the database.
     -) I have the requirement of transferring the data from the stored procedure's select statement into a flat file.
     -) THE PROBLEM is that if I use an OLE DB Source task within a Data Flow Task, I cannot map columns from the OLE DB source to the flat file destination. The reason is that on the Columns page of the OLE DB Source task, SSIS does not retrieve any columns when a temporary table is used, because SSIS cannot retrieve metadata for temporary tables.
     -) One possible solution is to use an Execute SQL Task to invoke the stored procedure and store its result in a result set through an Object-type user variable.
     -) The problem now is: how to transfer the data from the result set to a flat file?
     -) Reading this forum, the solution looks to be a Script Task that transfers the data from the result set to the flat file.
    QUESTIONS: How do I transfer data from the result set to a flat file using a Script Task? (I'm not an expert in VB.NET.) Is it really possible? P.S.: The number of rows returned by the stored procedure is very high!
    thanks in advance.

    Hi Visakh16,
    thanks for the response.
    Yours is a good idea, but I do not have permission to use DDL statements on the database in the production environment.

  • Validation error using flat files..VLD-2407

    Trying to build a simple flat file mapping to test out file locations and permissions at a new client, and getting the following error: VLD-2407: SQL*Loader map must have a source file. OWB 10.2.0.3. I've added a filename everywhere I could find in the mapping configuration screens. Any thoughts?
    Thanks

    It can happen if your location for deployment is different from that of development and you have forgotten to make sure the source directory and file exist in the deployment location.
    Regards
    azaman

  • ORDERS05-To-File Mapping the qualifiers

    Hi,
    I have a scenario of mapping the IDoc ORDERS05 to a file.
    Can anybody tell me how to map the qualifiers in the target file structure for any one qualifier?
    If anybody has a mapping program for ORDERS05-to-flat-file mapping, can you send the info?

    Hi
    You should have been given the mapping specification sheet.
    Actually, the QUALF number will be unique for each and every segment; some segments occur multiple times, but the QUALF number should be different.
    I have done ORDERS05-to-file mapping, but can you explain your problem precisely?
    regards,
    aravindh.

  • Loading flat files from a folder

    hi,
    I'm new to OWB and want to know if there is a way to import flat files into OWB directly from a folder, that is, automatically, without having to import them one at a time.
    thanks.

    Hi Francesco
    So the UI lets you import one file at a time, or say that this file is the same as that file (if it has the same number of columns). You can also configure an external table or flat file mapping to add many data files to be read for the one external table or flat file mapping.
    If you want to import the metadata into OWB by pointing at a directory and importing all of the files there automatically then you would have to script this up. I have written scripts in the past that read metadata from files (such as for fixed length files for example the Oracle Sales Analyzer/OSA product has such files) and generate flat file definitions and external tables. The metadata files describe the fixed length fields, both the positional and datatype information. With CSVs if the files have a header row with a name for the column, you could script some of this or default the names, but then the datatypes would also have to be inferred by analyzing or from some metadata that can be added.
    Cheers
    David
