Errors in flat file extraction

Experts, please answer:
What are the possible errors while extracting data from a flat file into SAP BW 3.5?
Thanks in advance,
ramakrishna

Hi,
Possible errors:
You have to select the correct file type (e.g. CSV) in the InfoPackage while loading; a wrong selection causes an error.
The path of the file may be wrong.
If your file is open while loading, you will also get an error.
If the DataSource structure is different from the flat file structure, you will also get an error when you schedule the load.
Hope it helps you.
Reg
Pra

Similar Messages

  • Oracle API for Extended Analytics Flat File extract issue

    I have modified the Extended Analytics SDK application as such:
    HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_FLATFILE
    instead of
    HFMCONSTANTSLib.EA_EXTRACT_TYPE_FLAGS.EA_EXTRACT_TYPE_STANDARD
    I am trying to figure out where the Flat file extract is placed once the Extract is complete. I have verified that I am connecting to HFM and the log file indicates all the steps in the application have completed successfully.
    Where does the FLATFILE get saved/output when using the API?
    Ultimate goal is to create a flat file through an automated process which can be picked up by a third party application.
    thanks for your help,
    Chris

    Never mind. I found the location on the server.

  • Extended Analytics Flat File extract in ANSI

    Hi,
    Version: EPM 11.1.2.1
    The Flat file extract using Extended Analytics from HFM is exported in UNICODE format. Is there any option to get the export in ANSI code format?
    Thanks,
    user8783298

    Hi
    In the latest version the data from HFM can currently be exported only in Unicode format; a defect has been filed with Oracle for this.
    At present the only workaround is to change the file encoding after it has been extracted, using third-party software.
    For example, the extracted file can be opened in Windows Notepad and the desired encoding can be selected when the file is saved using Save As.
    Hope this helps.
    thank you
    Regards,
    Mahe

  • Changing the file name in flat file extraction

    Hi,
    Currently I am using flat file extraction for my forecast data, and I am sending the file through the application server.
    I have created the directory successfully, and every morning I receive a file through the FTP server with a name like 20060903.csv; the name is based on one field in my flat file data, e.g. /interface/asf/20060903.csv.
    In the middle of the month we have a cut-off date, which varies for each month. At that point the file name on the FTP server changes, and a file with a different name, e.g. 20061002.csv, will exist on the application server.
    Now in the InfoPackage I also need to set the deletion settings, i.e. if the file name is the same, delete the previous requests. I could achieve this if I could get the file name changed.
    Let's say I am not changing the file name: how do I set the deletion condition so that it does not delete when the field (scenario) changes, i.e. from 20061002 to 20061101? I should have only one file for 20061002, one file for 20061101, and so on; if the scenario is the same, it should delete.
    Anyone kindly advise. Very urgent and critical.
    Tks & regards,
    Bhuvana.

    Hi Bhuvana,
    Try the following ABAP code in a routine under the External data tab of the InfoPackage.
    DATA: BEGIN OF i_req OCCURS 0,
            rnr LIKE rsiccont-rnr,
          END OF i_req.
    " Get the most recent request loaded into the data target
    " (replace <datatargetname> with the technical name of your data target).
    SELECT * FROM rsiccont UP TO 1 ROWS
             WHERE icube = <datatargetname>
             ORDER BY timestamp DESCENDING.
      i_req-rnr = rsiccont-rnr.
      APPEND i_req.
      CLEAR i_req.
    ENDSELECT.
    LOOP AT i_req.
      " Delete the request only if it was loaded from the same file name.
      SELECT SINGLE * FROM rsseldone WHERE rnr EQ i_req-rnr
                                       AND filename = p_filename.
      IF sy-subrc = 0.
        CALL FUNCTION 'RSSM_DELETE_REQUEST'
          EXPORTING
            request                    = i_req-rnr
            infocube                   = <datatargetname>
          EXCEPTIONS
            request_not_in_cube        = 1
            infocube_not_found         = 2
            request_already_aggregated = 3
            request_already_comdensed  = 4
            no_enqueue_possible        = 5
            OTHERS                     = 6.
        IF sy-subrc <> 0.
          MESSAGE ID sy-msgid TYPE 'I' NUMBER sy-msgno
                  WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
        ELSE.
          MESSAGE i799(rsm1) WITH i_req-rnr 'deleted'.
        ENDIF.
      ENDIF.
    ENDLOOP.
    Let me know if you run into any problems with this logic.
    regards,
    Raju

  • Encountering problem in Flat File Extraction

    Flat file extraction: the settings I maintain in the DataSource "Fields" tab are conversion routine "ALPHA" and format "External". But when I load the data, a record that is maintained in lower case in the flat file is not converted to upper case. If I try without the ALPHA conversion, the lower-case data is converted to the SAP-internal format (upper case). Could you please help me fix the problem when I use both ALPHA and External together?

    Hi, did you enable the "Lowercase letters" checkbox at the InfoObject level for the objects for which you want lower-case values? Check the help.sap.com documentation as well.
    Goodday

  • Delta upload for flat file extraction

    Hello everyone,
    Is it possible to initialize a delta update for a flat file extraction? If yes, please explain how to do it.

    Hi Norton,
    For a flat file DataSource the upload is always FULL; a true delta is not supported.
    If you need to extract delta-like records from a flat file, please refer to the following document:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/a0b3f0e2-832d-2c10-1ca9-d909ca40b54e?QuickLink=index&overridelayout=true&43581033152710
    We can also write a routine at InfoPackage level, for example to pick up only the newest file; see the sketch below.
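    A minimal sketch of such an InfoPackage routine (External data tab), assuming the daily files are named by date under an illustrative path; the FORM skeleton and the P_FILENAME / P_SUBRC parameters are generated by BW, so only the body is shown:
    " Hedged sketch: build today's file name so that each load picks up
    " only the newest file (a pseudo-delta); path and naming are assumptions.
    DATA lv_file TYPE string.
    CONCATENATE '/interface/data/' sy-datum '.csv' INTO lv_file.
    p_filename = lv_file.   " generated CHANGING parameter of the routine
    p_subrc    = 0.         " 0 = file name determined successfully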
    Regards,
    Harish.

  • Bdc - errors in flat file

    hi,
    In BDC, after uploading data from a flat file, how do I find out whether there are any errors in the flat file before starting the session?

    Hi,
    You have to create internal tables from the check tables for all the mandatory fields and validate the file records against them. See the following code; it is an example for transaction XK01 (the types TY_MESG and TY_ERROR used below are assumed to be declared elsewhere in the program, with TY_MESG containing a MESG field).
    TYPES:BEGIN OF TY_XK01,
          LIFNR(10)      TYPE  C ,          "VENDOR'S ACCOUNT NUMBER
          BUKRS(4)       TYPE  C ,          "COMPANY CODE
          EKORG(4)       TYPE  C ,          "PURCHASING ORGANISATION
          KTOKK(4)       TYPE  C ,          "VENDOR ACCOUNT GROUP
          NAME1(35)      TYPE  C ,                              "NAME1
          SORTL(10)      TYPE  C ,          "SORT FIELD
          STRAS(35)      TYPE  C ,          "STREET
          PSTLZ(10)      TYPE  C ,          "POSTAL CODE
          ORT01(35)      TYPE  C ,          "CITY
          LAND1(3)       TYPE  C ,          "COUNTRY KEY
          REGIO(3)       TYPE  C ,          "REGION
          TIME_ZONE(6)   TYPE  C ,          "ADDRESS TIME ZONE
          LANGU(1)       TYPE  C ,          "LANGUAGE KEY
          TELF1(16)      TYPE  C ,          "TELEPHONE NUMBER
          TELFX(31)      TYPE  C ,          "FAX NUMBER
          SMTP_ADDR(241) TYPE  C ,          "E-MAIL ADDRESS
          URI_SCREEN(132) TYPE  C ,     "UNIFORM RESOURCE LOCATOR
          AKONT(10)      TYPE  C ,          "RECONCILITATION ACCOUNT
          ZUAWA(3)       TYPE  C ,          "KEY FOR SORTING ACCORDING TO ASSIGNMENT NUMBERS
          MINDK(3)       TYPE  C ,          "MINORITY INDICATORS
          ALTKN(10)      TYPE  C ,          "PREVIOUS MASTER RECORD NUMBER
          ZTERM(4)       TYPE  C ,          "TERMS OF PAYMENT KEY
          ZWELS(10)      TYPE  C ,          "LIST OF THE PAYMENT METHODS
          WAERS(5)       TYPE  C ,          "PURCHASE ORDER CURRENCY
          END OF TY_XK01,
    BEGIN OF TY_LIFNR,
         LIFNR(10)  TYPE  C ,
         END OF TY_LIFNR,
         BEGIN OF TY_BUKRS,
         BUKRS(4)  TYPE  C ,
         END OF TY_BUKRS,
         BEGIN OF TY_EKORG,
         EKORG(4)  TYPE  C ,
         END OF TY_EKORG,
         BEGIN OF TY_KTOKK,
         KTOKK(4)  TYPE  C ,
         END OF TY_KTOKK,
         BEGIN OF TY_LAND1,
         LAND1(3)  TYPE  C ,
         END OF TY_LAND1,
         BEGIN OF TY_LANGU,
         LANGU(1)  TYPE  C ,
         END OF TY_LANGU,
         BEGIN OF TY_AKONT,
         AKONT(10) TYPE  C ,
         END OF TY_AKONT,
         BEGIN OF TY_ZUAWA,
         ZUAWA(3)  TYPE  C ,
         END OF TY_ZUAWA,
         BEGIN OF TY_MINDK,
         MINDK(3) TYPE  C ,
         END OF TY_MINDK,
         BEGIN OF TY_WAERS,
         WAERS(5)  TYPE  C ,
         END OF TY_WAERS.
    DATA : I_XK01     TYPE TABLE OF  TY_XK01,     "FOR HOLDING DATA FROM FLAT FILE
           I_SUCCMESG TYPE TABLE OF  TY_MESG,     "FOR SUCCESS RECORDS DETAILS
           I_ERRMESG  TYPE TABLE OF  TY_MESG,     "FOR ERROR RECORDS DETAILS
           I_ERROR TYPE TABLE OF TY_ERROR,
    I_LIFNR TYPE TABLE OF TY_LIFNR,
    I_BUKRS TYPE TABLE OF TY_BUKRS,
    I_EKORG TYPE TABLE OF TY_EKORG,
    I_KTOKK TYPE TABLE OF TY_KTOKK,
    I_LAND1 TYPE TABLE OF TY_LAND1,
    I_LANGU TYPE TABLE OF TY_LANGU,
    I_AKONT TYPE TABLE OF TY_AKONT,
    I_ZUAWA TYPE TABLE OF TY_ZUAWA,
    I_MINDK TYPE TABLE OF TY_MINDK,
    I_WAERS TYPE TABLE OF TY_WAERS,
    I_FINALMESG TYPE TABLE OF TY_ERROR,
    I_MESG TYPE TABLE OF TY_MESG.
    *& WORK AREA DECLARATION
    DATA:  WA_XK01     TYPE TY_XK01,               "FOR HOLDING DATA FROM FLAT FILE
           WA_SUCCMESG TYPE TY_MESG,               "FOR SUCCESS RECORDS DETAILS
           WA_ERRMESG  TYPE TY_MESG,               "FOR ERROR RECORDS DETAILS
           WA_ERROR TYPE TY_ERROR,
    WA_LIFNR TYPE TY_LIFNR,
    WA_BUKRS TYPE TY_BUKRS,
    WA_EKORG TYPE TY_EKORG,
    WA_KTOKK TYPE TY_KTOKK,
    WA_LAND1 TYPE TY_LAND1,
    WA_LANGU TYPE TY_LANGU,
    WA_AKONT TYPE TY_AKONT,
    WA_ZUAWA TYPE TY_ZUAWA,
    WA_MINDK TYPE TY_MINDK,
    WA_WAERS TYPE TY_WAERS,
    WA_MESG TYPE TY_MESG,
    WA_FINALMESG TYPE TY_ERROR.
    INITIALIZATION.
    SELECT LIFNR FROM LFA1 INTO TABLE I_LIFNR.
      SELECT BUKRS FROM T001 INTO TABLE I_BUKRS .
      SELECT EKORG FROM T024E INTO TABLE I_EKORG .
      SELECT KTOKK FROM T077K INTO TABLE I_KTOKK .
      SELECT LAND1 FROM T005 INTO TABLE I_LAND1.
      SELECT SPRAS FROM T002 INTO TABLE I_LANGU.
      SELECT SAKNR FROM SKA1 INTO TABLE I_AKONT .
      SELECT ZUAWA FROM TZUN INTO TABLE I_ZUAWA .
      SELECT MINDK FROM T059M INTO TABLE I_MINDK .
      SELECT WAERS FROM TCURC INTO TABLE I_WAERS.
    LOOP AT I_XK01 INTO WA_XK01.
          TRANSLATE WA_XK01-LIFNR TO UPPER CASE.
          READ TABLE I_LIFNR INTO WA_LIFNR WITH KEY LIFNR = WA_XK01-LIFNR.
          IF SY-SUBRC = 0 .
            CONCATENATE WA_MESG-MESG TEXT-100 WA_XK01-LIFNR TEXT-101  ' , ' INTO WA_MESG-MESG.
          ENDIF.
          TRANSLATE WA_XK01-BUKRS TO UPPER CASE.
          READ TABLE I_BUKRS INTO WA_BUKRS WITH KEY BUKRS = WA_XK01-BUKRS.
          IF SY-SUBRC <> 0 .
            CONCATENATE WA_MESG-MESG  TEXT-102 WA_XK01-BUKRS TEXT-103  ' , ' INTO WA_MESG-MESG.
          ENDIF.
          TRANSLATE WA_XK01-EKORG TO UPPER CASE.
          READ TABLE I_EKORG INTO WA_EKORG WITH KEY EKORG = WA_XK01-EKORG.
          IF SY-SUBRC <> 0 .
            CONCATENATE WA_MESG-MESG TEXT-104    WA_XK01-EKORG  TEXT-103  ',' INTO WA_MESG-MESG.
          ENDIF.
          TRANSLATE WA_XK01-KTOKK TO UPPER CASE.
          READ TABLE I_KTOKK INTO WA_KTOKK WITH KEY KTOKK = WA_XK01-KTOKK.
          IF SY-SUBRC <> 0 .
            CONCATENATE WA_MESG-MESG TEXT-105   WA_XK01-KTOKK  TEXT-103   ',' INTO WA_MESG-MESG.
          ENDIF.
          TRANSLATE WA_XK01-LAND1 TO UPPER CASE.
          READ TABLE I_LAND1 INTO WA_LAND1 WITH KEY LAND1 = WA_XK01-LAND1.
          IF SY-SUBRC <> 0 .
            CONCATENATE WA_MESG-MESG TEXT-106   WA_XK01-LAND1  TEXT-103   ',' INTO WA_MESG-MESG.
          ENDIF.
          TRANSLATE WA_XK01-LANGU TO UPPER CASE.
          READ TABLE I_LANGU INTO WA_LANGU WITH KEY LANGU = WA_XK01-LANGU.
          IF SY-SUBRC <> 0 .
            CONCATENATE WA_MESG-MESG TEXT-107  WA_XK01-LANGU  TEXT-103   ',' INTO WA_MESG-MESG.
          ENDIF.
          READ TABLE I_AKONT INTO WA_AKONT WITH KEY AKONT = WA_XK01-AKONT.
          IF SY-SUBRC <> 0 .
            CONCATENATE WA_MESG-MESG TEXT-108   WA_XK01-AKONT  TEXT-103  ',' INTO WA_MESG-MESG.
          ENDIF.
          READ TABLE I_ZUAWA INTO WA_ZUAWA WITH KEY ZUAWA = WA_XK01-ZUAWA.
          IF SY-SUBRC <> 0 .
            CONCATENATE WA_MESG-MESG TEXT-109    WA_XK01-ZUAWA  TEXT-103   ',' INTO WA_MESG-MESG.
          ENDIF.
          READ TABLE I_MINDK INTO WA_MINDK WITH KEY MINDK = WA_XK01-MINDK.
          IF SY-SUBRC <> 0 .
            CONCATENATE WA_MESG-MESG TEXT-110    WA_XK01-MINDK  TEXT-103   ',' INTO WA_MESG-MESG.
          ENDIF.
          TRANSLATE WA_XK01-WAERS TO UPPER CASE.
          READ TABLE I_WAERS INTO WA_WAERS WITH KEY WAERS = WA_XK01-WAERS.
          IF SY-SUBRC <> 0 .
            CONCATENATE WA_MESG-MESG TEXT-111   WA_XK01-WAERS  TEXT-103   ',' INTO WA_MESG-MESG.
          ENDIF.
      APPEND wa_mesg TO i_errmesg.   " collect the error details for this record
      CLEAR wa_mesg.
    endloop.
    The error messages are built up in the work area with the CONCATENATE statements and then appended to the error table.
    Hope this solves the problem.
    Reward points if helpful.
    Thanks and Regards,
    Narayana.
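    The validation above assumes that I_XK01 has already been filled from the flat file. A minimal sketch of that upload step, assuming a comma-separated file on the presentation server (the file path and the SPLIT field order are illustrative):
    " Hedged sketch: read the CSV into I_XK01 before the validation loop.
    DATA: lt_raw TYPE TABLE OF string,
          lv_raw TYPE string.

    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename = 'C:\vendors.csv'      " illustrative path
        filetype = 'ASC'
      TABLES
        data_tab = lt_raw
      EXCEPTIONS
        OTHERS   = 1.
    IF sy-subrc <> 0.
      MESSAGE 'Error uploading the flat file' TYPE 'E'.
    ENDIF.

    LOOP AT lt_raw INTO lv_raw.
      " Field order follows the TY_XK01 declaration above.
      SPLIT lv_raw AT ',' INTO wa_xk01-lifnr wa_xk01-bukrs wa_xk01-ekorg
                               wa_xk01-ktokk wa_xk01-name1 wa_xk01-sortl
                               wa_xk01-stras wa_xk01-pstlz wa_xk01-ort01
                               wa_xk01-land1 wa_xk01-regio wa_xk01-time_zone
                               wa_xk01-langu wa_xk01-telf1 wa_xk01-telfx
                               wa_xk01-smtp_addr wa_xk01-uri_screen
                               wa_xk01-akont wa_xk01-zuawa wa_xk01-mindk
                               wa_xk01-altkn wa_xk01-zterm wa_xk01-zwels
                               wa_xk01-waers.
      APPEND wa_xk01 TO i_xk01.
      CLEAR wa_xk01.
    ENDLOOP.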

  • Hierarchy flat file extraction

    Hi experts,
    Can anyone guide me through flat file hierarchy extraction, step by step, if possible with screenshots?
    You can send it to [email protected]
    regards,
    Rambo.

    Hi,
    Flat file hierarchy extraction follows the same procedure as a normal flat file load, but the file structure itself can be complex and differs from that of normal flat files (see the example layout after the links below).
    Take a look at the threads below for more details :
    Hierarchy Flat file
    Hierarchy from flat file
    Program to load a flat file in a Hierarchy
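    For illustration, one commonly used layout for a BW 3.x hierarchy flat file is sketched below. This is only an assumed example: the exact columns are determined by the transfer structure proposed by your hierarchy InfoSource, so check that first. Text nodes use 0HIER_NODE; leaf rows use the base InfoObject and the characteristic value as the node name.
    NODEID;INFOOBJECT;NODENAME;LINK;PARENTID;LANGU;TXTSH;TXTMD;TXTLG
    1;0HIER_NODE;REGIONS;;0;EN;Regions;Regions;All regions
    2;0HIER_NODE;NORTH;;1;EN;North;North;Northern region
    3;0COSTCENTER;0000001000;;2;EN;;;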
    Cheers,
    Kedar

  • Generic and Flat file extraction

    Hi experts,
    Can anybody share a real-time scenario for generic extraction and flat file extraction, with examples?
    Thanks and regards,
    satya

    Hi,
    Generic extraction means you create a generic DataSource in R/3 (based on a table, view, function module or InfoSet) and extract the data from that source system through this generic DataSource.
    Flat file extraction means you do not connect to an R/3 source system; the source system is your PC (or the application server). Here you design an InfoSource based on the fields/columns of your flat file, map the corresponding InfoObjects to them and define the update rules, and the data is loaded from the file as the source.
    The two are different in nature and usage.
    Hope this helps.
    Assign points if useful.
    cheers,
    Pattan.

  • Omit the Open Hub control file 'S_*' for flat file extracts

    Hi Folks,
    a quick question: is it somehow possible to omit the control file generation for flat file extracts?
    We have some Unix scripts running that get confused by the S_* files, and we were wondering if we can switch the creation of those files off.
    Thanks and best regards,
    Axel

    Hi,
    Regarding the claim that the S_ (structure) file does not reflect the proper structure, i.e. that it lists all fields sorted alphabetically by field name instead of in the order they appear in the output file:
    as far as I know and have checked, this is not the case. The S_ file has the fields in the order of the InfoSpoke object sequence (as in the source structure on the transformation tab).
    I would suggest you check it again.
    Derya

  • InfoSpoke Flat File Extract to Logical Filename

    I'm trying to extract data from an ODS to a flat file. So far, I've found that the InfoSpoke must write to the application server for large data volume. Also, in order for the InfoSpoke to transport properly, I must use logical filenames. I've attempted to implement the custom class and append structure as defined in the SAP document "How To... Extract Data with OPEN HUB to a Customer Defined Logical Filename". I'm getting an error when attempting to import the included transports (custom class code). It appears to be a syntax error. Has anyone encountered this, and, if so, how did you fix it?

    Hello.
    I'm getting a syntax error as well. I did not import the transport but applied the notes from the appendix manually. When I modified the method GET_OBJECT_REF_INT in class CL_RSB_DEST as below, I get a syntax error on the CREATE OBJECT statement:
        when rsbo_c_desttype_int-file_applsrv.
    *{   REPLACE        &$&$&$&$                                          1
    *\      data: l_r_file_applsrv type ref to cl_rsb_file_applsrv.
          data: l_r_file_applsrv type ref to zcl_rsb_file_logical.
    *}   REPLACE
          create object l_r_file_applsrv
            exporting i_dest    = n_dest
                      i_objvers = i_objvers
    The syntax error points to class CL_RSB_DEST, method GET_OBJECT_REF_INT:
    The obligatory parameter "I_S_VDEST" had no value assigned to it.

  • How to use 'flat file extraction' option in Infospoke

    Hello Gurus,
    We are trying to extract BW data with an InfoSpoke to a 3rd-party ETL tool (DataStage). On the current BW 3.5 release the customer is on SP11, and we are not able to extract any data using the 3rd-party-system option in the InfoSpoke. We found out from SAP that, because we are getting the error "3rd party system is not set" in the InfoSpoke, we have to upgrade the SP level from 11 to either 16 or 17. The customer we are dealing with is not going to do that until July 2007.
    So now we have a second option: write a flat file on the BW application server and FTP it to DataStage. Is this a workable option?
    When using that option in the InfoSpoke destination tab, we can see two choices. One is to use the File option and write the data into the standard directory DIR_HOME. I tried that, but I can't see the file and the complete data in the standard directory. Can anyone explain why, and what the procedure is?
    The second choice is to use a logical file name. I have never done that before. Can anyone explain the step-by-step process or provide a detailed document and/or link?
    Since we are at a critical stage of the timeline, I would appreciate a fast response.
    I will make sure to give reward points to each and every helpful answer.
    thanks in advance,
    Maulesh

    Hi Shashi,
    First, let me clarify one thing: I am not saying '3rd party tool' is an option in the InfoSpoke. I know the InfoSpoke provides only two destination options (DB table and file). But when you use the DB table option, you check '3rd party destination', and that is where you define an RFC destination for the 3rd-party tool. When you execute the InfoSpoke, it extracts the data and loads it into the defined destination. Since we are on SP11 of BW 3.5, we get the error '3rd party destination is not set' when executing an InfoSpoke with the DB table option. In the Marketplace I found an OSS note saying that we need an SP upgrade (up to 17).
    As I mentioned earlier, the client doesn't want to upgrade until next year, and we are at a critical point where we have to decide whether to hold this project until the SP upgrade happens or find an alternative approach to extract the data from BW (somehow) and load it into the Teradata database via the DataStage BW pack (3rd-party ETL).
    So we might be able to use the second option in the InfoSpoke destination, which is the flat file load.
    Now, when I use the File option with 'Application server' checked, the data is written to the default server directory on BW, which is DIR_HOME. I can see the extracted files in that directory in transaction AL11, but I can't see all of the data. Do you think I have to request an increase in a size limit? If I can see all the data, how can I FTP it from the BW server? Do I have to work with SAP Basis or create a Unix script?
    When I use a logical file name, I guess I have to follow what you described in your response, correct? Can I use an existing logical path and file name? Do you have a how-to document on creating the YEAR and PER FM? Is it possible that Basis can do this work for us? What happens after we assign a logical path and logical file name in the destination tab of the InfoSpoke? What is the use of a logical file versus a file? Overall, I am looking for the process of extracting the data from the InfoSpoke using either a file or a logical file, all the way to loading it into a 3rd-party database.
    Looking forward to your response!
    thanks,
    Maulesh
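    As background on the 'logical file versus file' question above: a logical file name is maintained in transaction FILE and is resolved to a physical path at runtime, so paths are not hard-coded in the InfoSpoke. A minimal sketch of how such a name is resolved in ABAP (the logical file name Z_INFOSPOKE_OUT is a made-up example):
    " Hedged sketch: resolve a logical file name to a physical path.
    DATA lv_physical TYPE string.

    CALL FUNCTION 'FILE_GET_NAME'
      EXPORTING
        logical_filename = 'Z_INFOSPOKE_OUT'   " hypothetical, defined in transaction FILE
      IMPORTING
        file_name        = lv_physical
      EXCEPTIONS
        file_not_found   = 1
        OTHERS           = 2.
    IF sy-subrc = 0.
      WRITE: / 'Physical file:', lv_physical.
    ENDIF.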

  • BI 7.0 Flat file extraction

    Hi all....
    For getting acquainted with BI 7.0 I am trying to extract data from a flat file.
    1) Everything is fine; I created DTPs at the cube level as well as at the DataSource level.
    2) I have executed/triggered the InfoPackage to schedule the data load.
    3) The request status is green in the InfoCube, and it is also available for reporting.
    4) When I checked the monitor I could see the overall status green, yet I could find no data in the InfoCube.
    I have done the transformations and DTPs correctly, so why can't I see the data in the InfoCube, and why are all requests green and available for reporting?
    Thnx in advance
    Reg
    Ram

    hi..
    follow these steps for easy data load from flat file..
    Uploading of master data
    Log on to your SAP BW system.
    Transaction code RSA1 takes you to the Modelling view.
    1. Creation of Info Objects
    • In left panel select info object
    • Create info area
    • Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    • Create new characteristics and key figures under respective catalogs according to the project requirement
    • Create required info objects and Activate.
    2. Creation of Data Source
    • In the left panel select data sources
    • Create application component(AC)
    • Right click AC and create datasource
    • Specify data source name, source system, and data type ( master data attributes, text, hierarchies)
    • In general tab give short, medium, and long description.
    • In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
    • In proposal tab load example data and verify it.
    • In the Fields tab you can enter the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, because the system maps them automatically. If you do not map them in the Fields tab, you have to map them manually in the transformation of the InfoProvider.
    • Activate data source and read preview data under preview tab.
    • Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load the data to the PSA (make sure the flat file is closed during loading).
    3. Creation of data targets
    • In left panel select info provider
    • Select created info area and right click to select Insert Characteristics as info provider
    • Select required info object ( Ex : Employee ID)
    • Under that info object select attributes
    • Right click on attributes and select create transformation.
    • In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: for a flat file load this is the file source system under which the DataSource was created.
    • Activate created transformation
    • Create Data transfer process (DTP) by right clicking the master data attributes
    • In extraction tab specify extraction mode ( full)
    • In update tab specify error handling ( request green)
    • Activate DTP and in execute tab click execute button to load data in data targets.
    4. Monitor
    Right-click the data target, select Manage, and in the Contents tab choose Contents to view the loaded data. Alternatively, the monitor icon can be used.
    BW 7.0
    Uploading of Transaction data
    Log on to your SAP BW system.
    Transaction code RSA1 takes you to the Modelling view.
    5. Creation of Info Objects
    • In left panel select info object
    • Create info area
    • Create info object catalog ( characteristics & Key figures ) by right clicking the created info area
    • Create new characteristics and key figures under respective catalogs according to the project requirement
    • Create required info objects and Activate.
    6. Creation of Data Source
    • In the left panel select data sources
    • Create application component(AC)
    • Right click AC and create datasource
    • Specify data source name, source system, and data type ( Transaction data )
    • In general tab give short, medium, and long description.
    • In extraction tab specify file path, header rows to be ignored, data format(csv) and data separator( , )
    • In proposal tab load example data and verify it.
    • In the Fields tab you can enter the technical names of the InfoObjects in the template; then you do not have to map them during the transformation, because the system maps them automatically. If you do not map them in the Fields tab, you have to map them manually in the transformation of the InfoProvider.
    • Activate data source and read preview data under preview tab.
    • Create an InfoPackage by right-clicking the DataSource, and in the Schedule tab click Start to load the data to the PSA (make sure the flat file is closed during loading).
    7. Creation of data targets
    • In left panel select info provider
    • Select created info area and right click to create ODS( Data store object ) or Cube.
    • Specify a name for the ODS or cube and click Create.
    • From the template window select the required characteristics and key figures and drag and drop it into the DATA FIELD and KEY FIELDS
    • Click Activate.
    • Right click on ODS or Cube and select create transformation.
    • In the source of the transformation, select the object type (DataSource) and specify its name and source system. Note: for a flat file load this is the file source system under which the DataSource was created.
    • Activate created transformation
    • Create a data transfer process (DTP) by right-clicking the ODS or cube.
    • In extraction tab specify extraction mode ( full)
    • In update tab specify error handling ( request green)
    • Activate DTP and in execute tab click execute button to load data in data targets.
    8. Monitor
    Right-click the data target, select Manage, and in the Contents tab choose Contents to view the loaded data. An ODS has two tables, the new data table and the active table; to move data from the new table to the active table you have to activate the request after selecting the loaded data. Alternatively, the monitor icon can be used.
    hope it helps...
    all the best..

  • Flat file extraction

    hi,
    I know the procedure to extract from a flat file, but my question is how to extract data from a flat file every month without entering the path every time. I mean, how do I extract different files, e.g. the September file, the October file, and so on? Can anyone explain how to program this so that all the files are picked up without entering the file path each time?

    Hi,
    If you want to automate your flat file data loads, you must keep the files on the application server and write ABAP code at the InfoPackage level to determine the file name.
    For example, if you receive files every day and the file names end with the current date, like xxxx0311208.csv, you can read the system date, store it in an ABAP variable and search for the matching file under the given server path; once the routine finds the specified file on the server, the InfoPackage will trigger and load that file (see the sketch below).
    In a similar way you can write the code for your monthly requirement.
    Regards
    Charan
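    A minimal sketch of such an InfoPackage routine body, assuming the files sit under an illustrative path /interface/asf/ and end with the current date; EPS_GET_DIRECTORY_LISTING is used to scan the directory, and the path and naming pattern are assumptions, not part of the original post. The FORM skeleton and the P_FILENAME / P_SUBRC parameters are generated by BW.
    " Hedged sketch: find today's file on the application server and
    " hand its full path to the InfoPackage via P_FILENAME.
    DATA: lt_files TYPE TABLE OF epsfili,
          ls_file  TYPE epsfili,
          lv_mask  TYPE epsf-epsfilnam.

    CONCATENATE '*' sy-datum '.csv' INTO lv_mask.

    CALL FUNCTION 'EPS_GET_DIRECTORY_LISTING'
      EXPORTING
        dir_name               = '/interface/asf'
        file_mask              = lv_mask
      TABLES
        dir_list               = lt_files
      EXCEPTIONS
        invalid_eps_subdir     = 1
        sapgparam_failed       = 2
        build_directory_failed = 3
        no_authorization       = 4
        read_directory_failed  = 5
        too_many_read_errors   = 6
        empty_directory_list   = 7
        OTHERS                 = 8.

    IF sy-subrc = 0.
      READ TABLE lt_files INTO ls_file INDEX 1.
      CONCATENATE '/interface/asf/' ls_file-name INTO p_filename.
      p_subrc = 0.
    ELSE.
      p_subrc = 4.   " no matching file found: stop the load
    ENDIF.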
