Load flat file data into ODS in BI

Dear Gurus,
How do I load flat file data into an ODS? Please share an article if you have one.
Thanks in advance,
Venkadesh

Please search the forums before posting:
https://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/60debae1-84dd-2b10-e7bf-bdedf1eabdf9
http://wiki.sdn.sap.com/wiki/display/BI/Beginner+Section

Similar Messages

  • Interface used in loading flat file data into BW

    What is the interface used when loading flat files into the BW system?

    Hi, the flat file load is its own interface. It leverages the S-API (Service API), the standard interface used to load data into the BI system.
    Start from this point:
    http://help.sap.com/saphelp_nw2004s/helpdata/en/43/03450525ee517be10000000a1553f6/frameset.htm
    Thanks for any points you choose to assign.
    Regards -
    Ron Silberstein
    SAP

  • How to store flat file data in a custom table?

    Hi,
    I am working on an inbound interface. Can anyone tell me how to store flat file data in a custom table? What is the procedure?
    Regards,
    Sujan

    Hi,
    You can use the function module F4_FILENAME to pick the file from the front end, then the function module WS_UPLOAD to load it into an internal table:

    PARAMETERS p_file LIKE rlgrap-filename.

    * Internal table that receives the raw file lines
    DATA: BEGIN OF it_line OCCURS 0,
            line(255) TYPE c,
          END OF it_line.
    * Work area for your custom table (ZTAB is a placeholder name)
    DATA itab TYPE ztab.

    AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
      CALL FUNCTION 'F4_FILENAME'     "F4 help to pick the file
        EXPORTING
          field_name = 'P_FILE'
        IMPORTING
          file_name  = p_file.

    START-OF-SELECTION.
      CALL FUNCTION 'WS_UPLOAD'       "upload the file into the internal table
        EXPORTING
          filename = p_file
          filetype = 'ASC'
        TABLES
          data_tab = it_line.

    * Loop at it_line, splitting each row into the fields of your custom table
      LOOP AT it_line.
        SPLIT it_line-line AT ',' INTO itab-name itab-surname.
    *   ... fill any remaining fields of itab here ...
      ENDLOOP.
    Then you can insert the values into your table from the itab work area.
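    A minimal sketch of that final step, assuming ZTAB is the (hypothetical) name of the custom table with fields NAME and SURNAME, and reusing it_line and itab from the snippet above:

    * Sketch only: collect the split rows, then write them in one statement.
      DATA lt_rows TYPE TABLE OF ztab.

      LOOP AT it_line.
        SPLIT it_line-line AT ',' INTO itab-name itab-surname.
        APPEND itab TO lt_rows.
      ENDLOOP.

      MODIFY ztab FROM TABLE lt_rows.    "inserts new rows, updates existing ones
      IF sy-subrc = 0.
        COMMIT WORK.
      ENDIF.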
    regards
    Isaac Prince

  • Converting Flat File data into XML

    Hi Experts,
    Consider the message type of the SENDER system and flat file data
    <dt_sender>
    <root>
    <header1>   0..1
        <f1>
        <f2>
        <f3>
    <header2>   0..1
        <f4>
        <f5>
        <f6>
    <item>        1..unbounded
        <f7>
        <f8>
        <f9>
        <f10>
        <f11>
        <f12>
    </item>
    abc     def     ghi     jkl     mno     pqr
    123     123     123     123     123     123
    456     456      456     456     456     456
    How do I convert the flat file data into the following XML? Please note that each field value is separated by a TAB delimiter. What parameters should be used?
    <root>
        <Header1>
            <f1>abc</f1>
            <f2>def</f2>
            <f3>ghi</f3>
        </Header1>
        <Header2>
            <f4>jkl</f4>
            <f5>mno</f5>
            <f6>pqr</f6>
        </Header2>
        <item>
            <f7>123</f7>
            <f8>123</f8>
            <f9>123</f9>
            <f10>123</f10>
            <f11>123</f11>
            <f12>123</f12>
            <f7>456</f7>
            <f8>456</f8>
            <f9>456</f9>
            <f10>456</f10>
            <f11>456</f11>
            <f12>456</f12>
        </item>
    </root>
    points will be given to the correct answers
    Thanks in advance.
    Faisal
    Edited by: Abdul Faisal on Feb 29, 2008 5:53 AM

    Faisal,
    When you read a file with multiple recordset structures, each record in the txt file should have a header value from which you can identify which segment it should go to, and you identify it by using the keyFieldValue in the file adapter.
    <root>
    <header1> 0..1
    <f1>
    <f2>
    <f3>
    <header2> 0..1
    <f4>
    <f5>
    <f6>
    <item> 1..unbounded
    <f7>
    <f8>
    <f9>
    <f10>
    <f11>
    <f12>
    </item>
    for this input file
    abc def ghi jkl mno pqr
    123 123 123 123 123 123
    456 456 456 456 456 456
    abc def ghi can be read into header1 using the key field value, but with the same file adapter you cannot put the remaining fields (jkl mno pqr) into header2.
    Otherwise, you should read the whole row abc def ghi jkl mno pqr into a single field and write a UDF to split the data into header1 and header2.
    You have to take care of the item records in the same way.
    If your input file is something like this:
    abc def ghi
    jkl mno pqr
    123 123 123 123 123 123
    456 456 456 456 456 456
    abc identifies header1, jkl identifies header2, and so on.
    Read the whole line into a single field and write a UDF to split it into header1 and header2, and do the same for item.

  • How to convert flat file data into SAP tables?

    How do I upload flat file data into an SAP table? Before the upload there is also mapping for some fields. Can anyone give me the steps for the upload and the mapping?

    Hi,
    See the sample code:

    REPORT zmmupload.

    * Internal table for the upload data
    DATA: i_mara LIKE mara OCCURS 0 WITH HEADER LINE.

    PARAMETERS: p_file LIKE ibipparms-path.          " Filename

    * F4 value help for the file name
    AT SELECTION-SCREEN ON VALUE-REQUEST FOR p_file.
      CALL FUNCTION 'F4_FILENAME'
        EXPORTING
          program_name  = syst-cprog
          dynpro_number = syst-dynnr
        IMPORTING
          file_name     = p_file.

    START-OF-SELECTION.
    * Upload the file into the internal table
      CALL FUNCTION 'UPLOAD'
        EXPORTING
          filename                = p_file
          filetype                = 'DAT'
        TABLES
          data_tab                = i_mara
        EXCEPTIONS
          conversion_error        = 1
          invalid_table_width     = 2
          invalid_type            = 3
          no_batch                = 4
          unknown_error           = 5
          gui_refuse_filetransfer = 6
          OTHERS                  = 7.
      IF sy-subrc <> 0.
        MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
      ENDIF.

    * Update the database table from the internal table
      MODIFY mara FROM TABLE i_mara.
    Regards
    Anji.
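    For reference, 'UPLOAD' and 'WS_UPLOAD' are obsolete on newer releases. A minimal sketch using GUI_UPLOAD instead (not part of the original sample; it assumes the same i_mara internal table and a tab-delimited file):

      DATA lv_file TYPE string.
      lv_file = p_file.

      CALL FUNCTION 'GUI_UPLOAD'
        EXPORTING
          filename            = lv_file
          filetype            = 'ASC'
          has_field_separator = 'X'      "columns in the file are tab-separated
        TABLES
          data_tab            = i_mara
        EXCEPTIONS
          file_open_error     = 1
          file_read_error     = 2
          OTHERS              = 3.
      IF sy-subrc <> 0.
        MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
      ENDIF.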

  • Errors when loading flat file data

    We just tested loading a very simple flat file with only two lines, and the two lines of data in the preview of the InfoSource are correct. But when we run the InfoPackage to load the data, the monitor of the InfoPackage shows the following errors:
    Error getting SID for ODS object ZDM_SUBS
    Activation of data records from ODS object ZDM_SUBS terminated
    Error when assigning SID (details in long text)
    Error when assigning SID (details in long text)
    Error when assigning SID (details in long text)
    Error when assigning SID (details in long text)
    Error when assigning SID (details in long text)
    Error when assigning SID (details in long text)
    Error when assigning SID (details in long text)
    Error when assigning SID (details in long text)
    Value 'Bottom' (hex. '0042006F00740074006F006D') of characteristic ZRATEPLN contains invalid characters
    Value 'Dealer' (hex. '004400650061006C00650072') of characteristic ZCHANNEL contains invalid characters
    Value 'Bottom' (hex. '0042006F00740074006F006D') of characteristic ZRATEPLN contains invalid characters
    Value '19884/' of characteristic 0DATE is not a number with 000008 spaces
    Value '/19812' of characteristic 0DATE is not a number with 000008 spaces
    Value '19884/' of characteristic 0DATE is not a number with 000008 spaces
    In the flat file (an Excel sheet saved as a CSV file), each row has two date fields, start_date and end_date, in MM/DD/YYYY format, and in the transfer rule we convert the date format from MM/DD/YYYY to YYYYMMDD, which is required by the DATS InfoObject type in BW. If you need the Excel sheet of data in order to answer our questions about the above errors, you can give us your e-mail address and we can send you the simple two-row Excel file.
    Thanks!

    Hi Kevin,
    1. You can use lowercase letters in the values of your characteristics provided you have checked the 'Lowercase letters' checkbox on the General tab page of the Create Characteristic screen. But when you do so, no master data tables, text tables, or another level of attributes underneath are allowed.
                            OR
    Use only uppercase letters in your characteristic values, leaving the above-mentioned checkbox unchecked.
    2. The date format in the CSV file should be YYYYMMDD, i.e. exactly 8 characters. I guess there is something strange in your "calendar days", since I could not find 8 characters irrespective of the order. Do not forget the leading zeroes.
    Hope this works.
    Reward if it is helpful.
    Regards,
    Balaji
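    A minimal sketch of such a date conversion in a BW 3.x transfer rule routine (a sketch only, assuming the source field arrives as a zero-padded MM/DD/YYYY character value; TRAN_STRUCTURE-START_DATE is used as an illustrative field name):

    * Sketch only: convert MM/DD/YYYY from the file into the YYYYMMDD
    * format expected by a DATS InfoObject such as 0DATE.
      DATA: lv_in(10) TYPE c,
            lv_mm(2)  TYPE c,
            lv_dd(2)  TYPE c,
            lv_yy(4)  TYPE c.

      lv_in = tran_structure-start_date.
      SPLIT lv_in AT '/' INTO lv_mm lv_dd lv_yy.
      CONCATENATE lv_yy lv_mm lv_dd INTO result.   "RESULT is the routine's return value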

  • Most efficient way to load XML file data into tables

    I have a complex XML file running into MBs. I want to load its data into 7-8 tables.
    Which way will be better:
    1) Use SQL*Loader to load directly into the 7-8 tables by modifying the control file.
    Is this really possible and feasible? I am not even sure about it.
    2) Load the data as XMLType in a table and register it, then extract from there to load into the various tables.
    Please help. I have to find the most efficient way of doing it.
    Regards,
    Sudhir

    Yes, it is possible to use SQL*Loader to parse and load XML, but that is not what it was designed for, so it is not recommended. You also don't need to register a schema just to load, store, or parse XML in the DB.
    So where does that leave you?
    Some options
    {thread:id=410714} (see page 2)
    {thread:id=1090681}
    {thread:id=1070213}
    Those threads talk about storage options, reading XML in from disk, and parsing XML, and they should give you options to consider. Without knowing more about your requirements for the effort, it is difficult to give specific advice. Maybe your 7-8 tables don't exist yet, so using Object Relational Storage for the XML would be the best solution, as you can query/update the tables that Oracle creates based on the schema associated with the XML. Maybe an External Table definition works better for reading the XML into the system because this process will happen just once. Maybe using WebDAV makes more sense for loading the XML to be parsed (I don't have much experience with this, I just know it is possible from what I've read on the forums). Also, your version makes a difference, as you have different options available depending upon the version of Oracle.
    Hope all that helps as a starter.
    Edited by: A_Non on Jul 8, 2010 4:31 PM
    For a great example, see the answers by mdrake in {thread:id=1096784}

  • Loading wave file data into array

    Greetings
    For starters, I'm still learning Java, so forgive me if this is a trivial question. I've been using the documentation on the Java Sound API as a guide. What I am trying to do is load a wav file's data into a byte array; eventually I will need it in an int array so I can pass the data into a demodulation algorithm and display it as an image. This wave file is a recorded signal from a NOAA satellite. Here is my code thus far:
    public boolean loadFile(File soundFile) {
        int totalFramesRead = 0;
        boolean result = false;
        try {
            AudioInputStream ais = AudioSystem.getAudioInputStream(soundFile);
            int bytesPerFrame = ais.getFormat().getFrameSize();
            // Set an arbitrary buffer size of 1024 frames.
            int numBytes = 1024 * bytesPerFrame;
            byte[] audioByte = new byte[numBytes];
            try {
                int numBytesRead;
                int numFramesRead;
                // Read numBytes bytes at a time until the end of the stream.
                while ((numBytesRead = ais.read(audioByte)) != -1) {
                    // Calculate the number of frames actually read.
                    numFramesRead = numBytesRead / bytesPerFrame;
                    totalFramesRead += numFramesRead;
                }
                result = true;
            } catch (Exception ex) {
                JOptionPane.showMessageDialog(desk, " Error Loading File", "Error",
                        JOptionPane.WARNING_MESSAGE);
                ais.close();
                throw ex;
            }
        } catch (Exception ex) {
            JOptionPane.showMessageDialog(desk, " Error Loading File", "Error",
                    JOptionPane.WARNING_MESSAGE);
        }
        return result;
    }

    You can find the Oracle documentation for sqlldr (11g) here:
    http://docs.oracle.com/cd/B28359_01/server.111/b28319/ldr_concepts.htm#SUTIL003
    If you have questions about the details, you should post them in the dedicated SQL*Loader forum:
    Export/Import/SQL Loader & External Tables
    hm

  • First Loading of Master data into ODS

    Hello BW gurus,
    I have read the below mentioned sentence in a material stating that
    'By using flexible updating, it is possible to write master data from different sources into a consolidated object (master data ODS object) before this is stored in individual master data tables.'
    1) Why is master data from different sources written into an ODS first, before it is stored in the individual master data tables?
    2) Is all master data from source systems first loaded into an ODS before it is stored in the individual master data tables?

    hi,
    1) Why is master data from different sources written into an ODS first, before it is stored in the individual master data tables?
    Ans - In real-time scenarios you may have to load data from various source systems (like SAP, non-SAP, flat file); more than one R/3 source system is also possible. Hence, to consolidate the master data per source system, we load the master data into an ODS first. But this is not always the case; a few master data objects, such as batch number, delivery number and invoice number, can also be loaded directly into the master data tables using direct update.
    2) Is all master data from source systems first loaded into an ODS before it is stored in the individual master data tables?
    Not mandatory; it depends on the requirement.
    Ramesh

  • Loading flat files into multiple tables using OWB

    Hi,
    How do I implement the following SQL*Loader logic in OWB?
    LOAD DATA
    INFILE 'myfile.txt'
    BADFILE 'myfile.bad'
    DISCARDFILE 'myfile.dsc'
    APPEND
    INTO TABLE TABLE_A WHEN (1:1) = 'A'
      (Col1 POSITION(1:1) CHAR,
       Col2 POSITION(2:5) CHAR)
    INTO TABLE TABLE_B WHEN (1:1) = 'B'
      (Col1 POSITION(1:1) CHAR,
       Col2 POSITION(2:20) INTEGER EXTERNAL)
    INTO TABLE TABLE_C WHEN (1:1) = 'C'
      (Col1 POSITION(1:1) CHAR,
       Col2 POSITION(2:20) INTEGER EXTERNAL)
    I am using the 10g version of OWB. I tried using the Splitter operator.
    I am getting the following error when I use the Splitter:
    An invalid combination of operators prevents the generation of code in a single implementation language (PL/SQL code, or SQL*Loader code, or ABAP code). For example, you may have included a SQL*Loader only operator such as the Data Generator in a mapping with a PL/SQL implementation type. If you designed the mapping to generate PL/SQL code, an invalid combination or sequence of operators prevents the generation of PL/SQL code in any of the operating modes(set based, row based, row based target only). If the mapping contains an operator that generates only PL/SQL output, all downstream dataflow operators must also be implementable by PL/SQL. You can use SQL operators in such a mapping only after loading the PL/SQL output to a target. Detail is as follows:
    PL/SQL set based operating mode: Operator trailer_source_txt does not support SQL generation.
    PL/SQL row based operating mode: Operator trailer_source_txt does not support SQL and PL/SQL generation.
    PL/SQL row based (target only) operating mode: Operator trailer_source_txt does not support SQL and PL/SQL generation.
    Both SQL and PL/SQL handlers are not supported by trailer_source_txt as output
    SQL*Loader: Operator SPLIT does not support SQL*Loader generation.
    ABAP: Operator trailer_source_txt does not support ABAP generation.
    Thanks in advance,
    VInay

    Hi
    The Splitter can be used in PL/SQL mappings, but if you use a flat file in a mapping, it will be a SQL*Loader mapping. So I suggest you create one mapping which loads your flat file into a table, and a second PL/SQL mapping which loads the data from that table into the three tables with the Splitter. Create two mappings.
    Or you can use an external table in a mapping with a Splitter.
    Ott Karesz
    http://www.trendo-kft.hu

  • Loading flat file data using FDMEE with a one-to-many mapping

    Hi All,
    I need to load data from a flat file to a Hyperion Planning application using FDMEE with a one-to-many mapping.
    For example, the data file has 2 records:
    Acc Actual Version1 Scene1 1000
    Acc Actual Version1 Scene2 2000
    Now the target application has 5 dimensions and the data needs to be loaded as:
    acc Actual Version1 entity1 Prod2 1000
    Acc Actual Version1 Entity2 Prod2 2000
    Please suggest
    Regards
    Anubhav

    From your example I don't see the one-to-many mapping requirement. You have one source data line that maps to a single target intersection. Where is the one-to-many mapping requirement in your example?

  • Is it possible to load flat file data residing on a local machine via BODS 4.2, or should the flat file reside on the BODS server?

    Hi All,
    I have a requirement to load data from flat files stored in a particular location on a machine that has the BODS client installation.
    I am able to create a flat file format by browsing to the file location and giving the file name. I can also view the file data.
    But when I execute the job, it fails with the error "Cannot open file <D:/BODS_flatfiles/result.txt>. Check its path and permissions".
    Please let me know whether it is possible to load data from a flat file on a local machine that has the BODS 4.2 client installed, or whether the flat file should reside in a path on the BODS job server.
    Thanks,
    Deepa

    Hi Deepa,
    If you found a solution, then please mark the answer as correct and close the thread!
    Thanks,
    Swapnil

  • Moving flat file data into a nested xml hierarchy

    Post Author: jlpete72
    CA Forum: Data Integration
    I am working to build data flows which move data from fixed-length flat files into nested hierarchical XML files. I think I have the transforms mapped correctly, but there must be something I'm missing. The data flows validate with no errors, but when executed it appears that they are getting stuck in a loop. My flat source files have around 3000 rows, but the data flows show millions of records being processed.
    I have only one schema in the source and a variable number of schemas in the target XML files. I am using a Query transform and mapping all of the output schema levels in the target XML's From tabs.
    I have even tried creating separate database tables for each target schema, and I still get a runaway process. When there are more than two schemas in the target XML file, the process eventually consumes all of the server's memory and crashes the system.
    I'm not sure if I need to be using a different transform or what, but I am hoping that someone can shed some light on what is happening here.
    Thanks, Jerry

    Post Author: jlpete72
    CA Forum: Data Integration
    Thanks for the response. I didn't have a choice but to enter the From statement, but I haven't yet seen any results from entering a Where clause. In my case there is only one input file/schema, so the Where clause would be something like "cust_inv_1.SldFmCustCd = cust_inv_1.SldFmCustCd".
    I would agree that it seems as though I am generating a Cartesian product. In the circumstance of having only one input schema mapping to multiple output schemas, I'm not seeing how to properly restrict the rows.
    I have been going through the manuals looking for clues, but have not found anything that helped yet.
    Any help and/or ideas would be appreciated.
    Jerry

  • Runtime error while loading a flat file into an ODS

    Hi Friends,
    I have 11 records in a flat file and I am loading it from my PC. In the preview I can see the data.
    But when I load it, it gives me a short dump; the request status is amber (yellow).
    Please help me, as this is an urgent support issue.
    The ABAP Runtime Error is
    Runtime Error          CALL_FUNCTION_CONFLICT_TYPE
    Except.                CX_SY_DYN_CALL_ILLEGAL_TYPE
    Date and Time          02.01.2008 13:55:38       
    What happened?
        Error in the ABAP application program.
        The current ABAP program "GP44UKXZB8WE74CAKI38CZ0B586" had to be terminated
        because one of the statements could not be executed.
        This is probably due to an error in the ABAP program.
        A function module was called incorrectly.
    What can you do?
        Print out the error message (using the "Print" function)
        and make a note of the actions and input that caused the error.
        To resolve the problem, contact your SAP system administrator.
        You can use transaction ST22 (ABAP Dump Analysis) to view and administer
        termination messages, especially those beyond their normal deletion date.
        This is especially useful if you want to keep a particular message.

    Hi Sudhakar,
    This seems to be a type conflict error. It occurs when you assign, for example, a CHAR field to a NUMC field. So I suggest you check your mappings and see if there is any type mismatch in the InfoObjects. Hope it helps.
    Thanks and Regards
    Subray Hegde

  • CONVERSION_EXIT_CUNIT error occurred while loading flat file data

    Hi
    I am trying to load flat file data into an ODS and I am getting an error like "Error Conversion CUNIT".
    We are also using 0UNIT in the ODS, for which CUNIT is the conversion routine.
    Can you please suggest why I am getting this error?

    Hi Sunil,
    Please check whether you are loading the flat file data from the application server or from the client workstation.
    If you are loading from the client workstation, you may face this type of problem.
    Also check whether the format of the file has changed,
    and delete any trailing spaces at the end of the file.
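    For reference, the CUNIT conversion routine on 0UNIT converts the external unit in the file into the internal unit key maintained in table T006. A minimal sketch of that conversion (illustration only, using an assumed external value 'KG'):

    * Sketch only: convert an external unit of measure from the file into
    * the internal key expected by 0UNIT. If the unit is not maintained in
    * T006, the exception corresponds to the CUNIT error seen during the load.
      DATA: lv_unit_ext(3) TYPE c VALUE 'KG',
            lv_unit_int    TYPE msehi.

      CALL FUNCTION 'CONVERSION_EXIT_CUNIT_INPUT'
        EXPORTING
          input          = lv_unit_ext
          language       = sy-langu
        IMPORTING
          output         = lv_unit_int
        EXCEPTIONS
          unit_not_found = 1
          OTHERS         = 2.
      IF sy-subrc <> 0.
        WRITE: / 'Unit', lv_unit_ext, 'is not a valid unit of measure.'.
      ENDIF.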
