Data load from flat file through data synchronization

Hi,
Can anyone please help me out with a problem? I am doing a data load in my Planning application through a flat file and using data synchronization for it. How can I specify, during my data synchronization mapping, to add values from the load file that are at the same intersection instead of overriding them?
For e.g., the load file has the following data:
Entity Period Year Version Scenario Account Value
HO_ATR Jan FY09 Current CurrentScenario PAT 1000
HO_ATR Jan FY09 Current CurrentScenario PAT 2000
The value at the intersection HO_ATR->Jan->FY09->Current->CurrentScenario->PAT should be 3000.
Is there any possibility? I don't want to give users rights to the Admin Console for loading data.

Hi Manmit,
First, let us know whether you are on BW 3.5 or 7.0.
In either case, just try including the fields X, Y, Date, Qty, etc. in the DataSource with their respective length specifications.
While loading the data using the InfoPackage, set the file format to Fixed length in your InfoPackage.
This will populate the values into the respective fields.
Prathish

Similar Messages

  • Extracting data from Essbase & loading into flat file through ODI

    Hi,
    I want to extract data from Essbase and load it into a flat file through ODI (for extraction from Essbase I'm using a report script), and I'm using these KMs: LKM Hyperion Essbase data to SQL, IKM SQL to FILE Append, and for reversing, RKM Hyperion Essbase. All the mappings have been done and the interface has been made. But when I execute the interface it throws the error below:
    ODI-1217: Session ESS_FILEI (114001) fails with return code 7000.
    ODI-1226: Step ESS_FILEI fails after 1 attempt(s).
    ODI-1240: Flow ESS_FILEI fails while performing a Loading operation. This flow loads target table ESS_FILE.
    ODI-1228: Task SrcSet0 (Loading) fails on the target FILE connection FILE_PS_ODI.
    Caused By: java.sql.SQLException: ODI-40417: An IOException was caught while creating the file saying The system cannot find the path specified
    at com.sunopsis.jdbc.driver.file.impl.commands.CommandCreateTable.execute(CommandCreateTable.java:62)
    at com.sunopsis.jdbc.driver.file.CommandExecutor.executeCommand(CommandExecutor.java:33)
    at com.sunopsis.jdbc.driver.file.FilePreparedStatement.execute(FilePreparedStatement.java:178)
    at oracle.odi.runtime.agent.execution.sql.SQLCommand.execute(SQLCommand.java:163)
    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:102)
    at oracle.odi.runtime.agent.execution.sql.SQLExecutor.execute(SQLExecutor.java:1)
    at oracle.odi.runtime.agent.execution.TaskExecutionHandler.handleTask(TaskExecutionHandler.java:50)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.processTask(SnpSessTaskSql.java:2906)
    at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java:2609)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatAttachedTasks(SnpSessStep.java:537)
    at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java:453)
    at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java:1740)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$2.doAction(StartSessRequestProcessor.java:338)
    at oracle.odi.core.persistence.dwgobject.DwgObjectTemplate.execute(DwgObjectTemplate.java:214)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.doProcessStartSessTask(StartSessRequestProcessor.java:272)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor.access$0(StartSessRequestProcessor.java:263)
    at oracle.odi.runtime.agent.processor.impl.StartSessRequestProcessor$StartSessTask.doExecute(StartSessRequestProcessor.java:822)
    at oracle.odi.runtime.agent.processor.task.AgentTask.execute(AgentTask.java:123)
    at oracle.odi.runtime.agent.support.DefaultAgentTaskExecutor$2.run(DefaultAgentTaskExecutor.java:82)
    at java.lang.Thread.run(Thread.java:619)
    Please let me know what I'm missing and how I can resolve this error.
    Thanks

    It seems that you are trying to use the file as your staging area. The Hyperion LKM extracts Essbase data into a DB staging area, which can then be used by your file IKM to load it into the file.
    You need to use a RDBMS for your staging area.

  • Loading of flat file (csv) into PSA – no data loaded

    Hi BW-gurus,
    We have an issue regarding loading a flat file (csv) into PSA using an InfoPackage (BI 7.0).
    The InfoPackage has been used for a while. Previously, consultants with the SAP_ALL profile ran the InfoPackage. Now we want a few super users to run the InfoPackage.
    We have created a role for the super users, including authorization objects:
    Data Warehousing objects: S_RS_ADMWB
    Activity: 03, 16, 23, 63, 66
    Data Warehousing Workbench obj: INFOAREA, INFOOBJECT, INFOPACKAG, MONITOR, SOURCESYS, WORKBENCH
    Data Warehousing Workbench - datasource (version > BW 3.x): S_RS_DS
    Activity: All
    Datasource: All
    Subobject for New DataSource: All
    Sourcesystem: FILE
    Data Warehousing Workbench - infosource (flex update): S_RS_ISOUR
    Activity: Display, Maintain, Request
    Application Component: All
    InfoSource: All
    InfoSource Subobject: All values
    As mentioned, the infopackage in question has been used by consultants with the SAP_ALL profile for some time and has been working just fine. When the super users with the new role execute the infopackage, the records are found, but not loaded into PSA. The load seems to be stuck, but no error message occurs. The file we are trying to load contains only 15 records.
    Details monitor:
    Overall status: Missing messages or warnings (yellow)
    Requests (messages): Everything ok (green)
      ->  Data request arranged (green)
      ->  Confirmed with: OK (green)
    Extraction (messages): Errors occurred (yellow)
      ->  Data request received (green)
      -> Data selection scheduled (green)
      -> 15 Records sent (0 Records received) (yellow)
      -> Data selection ended (green)
    Transfer (IDocs and TRFC): Missing messages (yellow)
    Processing (data packet):  Warnings received (yellow)
      -> Data package 1 (? Records): Missing messages (yellow)
         -> Inbound processing (0 records): Missing messages (yellow)
         -> Update PSA (0 Records posted): Missing messages (yellow)
         -> Processing end: Missing messages (yellow)
    Have we forgotten something? Any assistance will be highly appreciated!
    Cheers,
    Anne Therese S. Johannessen

    Hi,
    Try using transaction ST01 to trace the authorizations of the upload performed with SAP_ALL.
    Then enhance the profile for the super user accordingly.
    Best regards
    Matthias

  • Cube creation & Data loading from Flat file

    Hi All,
    I am new to BI 7 and am trying to create a cube and load data from a flat file.
    I successfully created the InfoSource and the cube (I used the InfoSource as a template for the cube),
    but got stuck at that point.
    I need help on how to create transfer rules/update rules and then load data into the cube.
    Thanks,
    Praveen.

    Hi
    Right-click on the InfoSource -> Additional Functions -> Create Transfer Rules.
    Now, in the window, insert the fields you want to load from the flat file -> activate it.
    Now right-click on the cube -> Additional Functions -> Create Update Rules -> activate it.
    Click on the small arrow on the left, and when you reach the last node (DS),
    right-click on it -> Create InfoPackage -> External Data tab -> give your flat file path and select CSV format -> Schedule tab -> click on Start.
    Hope it helps.
    cheers.

  • CUNIT error in data loading from flat file after r/3 extraction

    Hi all
    After R/3 business content extraction, when I load data from a flat file to the InfoCube, I get a conversion exit CUNIT error. What might be the reason? The data in the flat file's 0UNIT column is accurate, and the mapping rules are also correct, but I still get the CUNIT error.

    Check your unit: whether you are loading amounts or quantities, what mapping you have, and what you are loading from the flat files.
    BK

  • Data loading from flat file to cube using bw3.5

    Hi Experts,
    Kindly give me the detailed steps, with screens, for data loading from a flat file to a cube using BW 3.5.
    Please.

    Hi ,
    Procedure
    You are in the Data Warehousing Workbench in the DataSource tree.
           1.      Select the application components in which you want to create the DataSource and choose Create DataSource.
           2.      On the next screen, enter a technical name for the DataSource, select the type of DataSource and choose Copy.
    The DataSource maintenance screen appears.
           3.      Go to the General tab page.
                                a.      Enter descriptions for the DataSource (short, medium, long).
                                b.      As required, specify whether the DataSource builds an initial non-cumulative and can return duplicate data records within a request.
                                c.      Specify whether you want to generate the PSA for the DataSource in the character format. If the PSA is not typed it is not generated in a typed structure but is generated with character-like fields of type CHAR only.
    Use this option if conversion during loading causes problems, for example, because there is no appropriate conversion routine, or if the source cannot guarantee that data is loaded with the correct data type.
    In this case, after you have activated the DataSource you can load data into the PSA and correct it there.
           4.      Go to the Extraction tab page.
                                a.      Define the delta process for the DataSource.
                                b.      Specify whether you want the DataSource to support direct access to data.
                                c.      Real-time data acquisition is not supported for data transfer from files.
                                d.      Select the adapter for the data transfer. You can load text files or binary files from your local work station or from the application server.
    Text-type files only contain characters that can be displayed and read as text. CSV and ASCII files are examples of text files. For CSV files you have to specify a character that separates the individual field values. In BI, you have to specify this separator character and an escape character which specifies this character as a component of the value if required. After specifying these characters, you have to use them in the file. ASCII files contain data in a specified length. The defined field length in the file must be the same as the assigned field in BI.
    Binary files contain data in the form of Bytes. A file of this type can contain any type of Byte value, including Bytes that cannot be displayed or read as text. In this case, the field values in the file have to be the same as the internal format of the assigned field in BI.
    Choose Properties if you want to display the general adapter properties.
                                e.      Select the path to the file that you want to load or enter the name of the file directly, for example C:/Daten/US/Kosten97.csv.
    You can also create a routine that determines the name of your file (a minimal sketch of such a routine is shown after this procedure). If you do not create a routine to determine the name of the file, the system reads the file name directly from the File Name field.
                                  f.      Depending on the adapter and the file to be loaded, make further settings.
    ■       For binary files:
    Specify the character record settings for the data that you want to transfer.
    ■       Text-type files:
    Specify how many rows in your file are header rows and can therefore be ignored when the data is transferred.
    Specify the character record settings for the data that you want to transfer.
    For ASCII files:
    If you are loading data from an ASCII file, the data is requested with a fixed data record length.
    For CSV files:
    If you are loading data from an Excel CSV file, specify the data separator and the escape character.
    Specify the separator that your file uses to divide the fields in the Data Separator field.
    If the data separator character is a part of the value, the file indicates this by enclosing the value in particular start and end characters. Enter these start and end characters in the Escape Characters field.
    You chose the ; character as the data separator. However, your file contains the value 12;45 for a field. If you set " as the escape character, the value in the file must be "12;45" so that 12;45 is loaded into BI. The complete value that you want to transfer has to be enclosed by the escape characters.
    If the escape characters do not enclose the value but are used within the value, the system interprets the escape characters as a normal part of the value. If you have specified " as the escape character, the value 12"45 is transferred as 12"45 and 12"45" is transferred as 12"45".
    In a text editor (for example, Notepad) check the data separator and the escape character currently being used in the file. These depend on the country version of the file you used.
    Note that if you do not specify an escape character, the space character is interpreted as the escape character. We recommend that you use a different character as the escape character.
    If you select the Hex indicator, you can specify the data separator and the escape character in hexadecimal format. When you enter a character for the data separator and the escape character, these are displayed as hexadecimal code after the entries have been checked. A two character entry for a data separator or an escape sign is always interpreted as a hexadecimal entry.
                                g.      Make the settings for the number format (thousand separator and character used to represent a decimal point), as required.
                                h.      Make the settings for currency conversion, as required.
                                  i.      Make any further settings that are dependent on your selection, as required.
           5.      Go to the Proposal tab page.
    This tab page is only relevant for CSV files. For files in different formats, define the field list on the Fields tab page.
    Here you create a proposal for the field list of the DataSource based on the sample data from your CSV file.
                                a.      Specify the number of data records that you want to load and choose Upload Sample Data.
    The data is displayed in the upper area of the tab page in the format of your file.
    The system displays the proposal for the field list in the lower area of the tab page.
                                b.      In the table of proposed fields, use Copy to Field List to select the fields you want to copy to the field list of the DataSource. All fields are selected by default.
           6.      Go to the Fields tab page.
    Here you edit the fields that you transferred to the field list of the DataSource from the Proposal tab page. If you did not transfer the field list from a proposal, you can define the fields of the DataSource here.
                                a.      To define a field, choose Insert Row and specify a field name.
                                b.      Under Transfer, specify the decision-relevant DataSource fields that you want to be available for extraction and transferred to BI.
                                c.      Instead of generating a proposal for the field list, you can enter InfoObjects to define the fields of the DataSource. Under Template InfoObject, specify InfoObjects for the fields in BI. This allows you to transfer the technical properties of the InfoObjects into the DataSource field.
    Entering InfoObjects here does not equate to assigning them to DataSource fields. Assignments are made in the transformation. When you define the transformation, the system proposes the InfoObjects you entered here as InfoObjects that you might want to assign to a field.
                                d.      Change the data type of the field if required.
                                e.      Specify the key fields of the DataSource.
    These fields are generated as a secondary index in the PSA. This is important in ensuring good performance for data transfer process selections, in particular with semantic grouping.
                                  f.      Specify whether lowercase is supported.
                                g.      Specify whether the source provides the data in the internal or external format.
                                h.      If you choose the external format, ensure that the output length of the field (external length) is correct. Change the entries, as required.
                                  i.      If required, specify a conversion routine that converts data from an external format into an internal format.
                                  j.      Select the fields that you want to be able to set selection criteria for when scheduling a data request using an InfoPackage. Data for this type of field is transferred in accordance with the selection criteria specified in the InfoPackage.
                                k.      Choose the selection options (such as EQ, BT) that you want to be available for selection in the InfoPackage.
                                  l.      Under Field Type, specify whether the data to be selected is language-dependent or time-dependent, as required.
           7.      Check, save and activate the DataSource.
           8.      Go to the Preview tab page.
    If you select Read Preview Data, the number of data records you specified in your field selection is displayed in a preview.
    This function allows you to check whether the data formats and data are correct.
    For More Info:  http://help.sap.com/saphelp_nw70/helpdata/EN/43/01ed2fe3811a77e10000000a422035/content.htm
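    As a hedged illustration of the file name routine mentioned in step 4e, here is a minimal sketch. The form signature matches the flat-file file name routine shown in the "Loading Multiple Flat File" reply further down this page; the exact generated signature can differ by release, and the date-based directory and naming pattern are assumptions for illustration only, not SAP-delivered code.
    form compute_flat_file_filename
      using    p_infopackage  type rslogdpid
               p_datasource   type rsoltpsourcer
               p_logsys       type rsslogsys
      changing p_filename     type RSFILENM
               p_subrc        like sy-subrc.
    * Build a date-dependent file name, e.g. C:/Daten/US/Kosten_20091231.csv
    * (directory and naming pattern are assumptions; adjust to your landscape).
      data: v_date(8) type c.
      v_date = sy-datum.
      concatenate 'C:/Daten/US/Kosten_' v_date '.csv' into p_filename.
      p_subrc = 0.
    endform.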

  • Loading Multiple Flat File around 80+

    Hi,
    Loading multiple flat files into the BI system.
    My issue: I have a scenario of loading multiple (80+) flat files into the BI system every month, and my client wants it done through a process chain or automation, without manual intervention.
    Can anyone suggest how I can achieve this?
    Regards,
    Prabhakar.

    Hi All,
    We have developed logic to upload multiple flat files (.CSV) at a time. Please find the logic for that routine below.
    * program filename_routine.
    * Global code
    * $$ begin of global - insert your declaration only below this line  -
    * Enter here global variables and type declarations
    * as well as additional form routines, which you may call from the
    * main routine COMPUTE_FLAT_FILE_FILENAME below
    data : v_filename type string.
    data : v_foldername type string.
    data : v_uploadfile type string.
    data:  v_download_file type string.
    data : v_ext(4) type c.
    data : v_FILE_EXISTS type c.
    data : v_file type DXFILE-FILENAME.
    data : v_count(3) type c.
    data : v_length type i.
    TYPES: BEGIN OF  TY_file,
            LINE(900),
          END OF  TY_file.
    DATA: GIT_file TYPE TABLE OF TY_file.
    * $$ end of global - insert your declaration only before this line   -
    form compute_flat_file_filename
      using    p_infopackage  type rslogdpid
               p_datasource   type rsoltpsourcer
               p_logsys       type rsslogsys
      changing p_filename     type RSFILENM
               p_subrc        like sy-subrc.
    * $$ begin of routine - insert your code only below this line        -
    * This routine will be called by the adapter,
    * when the infopackage is executed.
      v_count = 1.
    *---- As per the folder location of the files & system we have to change
    *the path of the folder in the below variable i.e., 'c:\............
      v_foldername =
      'C:\Documents and Settings\Prabhakar-HP\Desktop\test\'.
    *---- As per the prefix of the files like 'CSI_01_..' we have to change the
    *prefix in the below variable i.e., 'CSI_01_..', if required,
    *or if you want to change the prefix to some other value.
      v_filename = 'CSI_01_'.
      v_ext = '.csv'.
      condense v_count.
      concatenate   v_foldername
                    v_filename
                    v_count
                    v_ext       into v_uploadfile.
      v_file = v_uploadfile.
      CALL FUNCTION 'DX_FILE_EXISTENCE_CHECK'
        EXPORTING
          FILENAME       = v_file
          PC             = 'X'
        IMPORTING
          FILE_EXISTS    = v_FILE_EXISTS
        EXCEPTIONS
          RFC_ERROR      = 1
          FRONTEND_ERROR = 2
          NO_AUTHORITY   = 3
          OTHERS         = 4.
      IF SY-SUBRC <> 0.
        MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
        WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
      ENDIF.
      while v_FILE_EXISTS = 'X'.
        v_file = v_uploadfile.
        CALL FUNCTION 'DX_FILE_EXISTENCE_CHECK'
          EXPORTING
            FILENAME       = v_file
            PC             = 'X'
          IMPORTING
            FILE_EXISTS    = v_FILE_EXISTS
          EXCEPTIONS
            RFC_ERROR      = 1
            FRONTEND_ERROR = 2
            NO_AUTHORITY   = 3
            OTHERS         = 4.
        IF SY-SUBRC <> 0.
          MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
          WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
        ENDIF.
        if v_file_exists = 'X'.
          CALL FUNCTION 'GUI_UPLOAD'
            EXPORTING
              FILENAME                = v_uploadfile
              FILETYPE                = 'ASC'
              HAS_FIELD_SEPARATOR     = 'X'
            TABLES
              DATA_TAB                = GIT_file
            EXCEPTIONS
              FILE_OPEN_ERROR         = 1
              FILE_READ_ERROR         = 2
              NO_BATCH                = 3
              GUI_REFUSE_FILETRANSFER = 4
              INVALID_TYPE            = 5
              NO_AUTHORITY            = 6
              UNKNOWN_ERROR           = 7
              BAD_DATA_FORMAT         = 8
              HEADER_NOT_ALLOWED      = 9
              SEPARATOR_NOT_ALLOWED   = 10
              HEADER_TOO_LONG         = 11
              UNKNOWN_DP_ERROR        = 12
              ACCESS_DENIED           = 13
              DP_OUT_OF_MEMORY        = 14
              DISK_FULL               = 15
              DP_TIMEOUT              = 16
              OTHERS                  = 17.
          IF SY-SUBRC <> 0.
           MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
                   WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
          elseif sy-subrc = 0.
    **----- As per the requirement, if you want to change the download file
    *path, change the path in the below variable i.e.,
    *'c:\.......
    v_download_file =
    'C:\Documents and Settings\Prabhakar-HP\Desktop\test\satheesh.CSV'.
            CALL FUNCTION 'GUI_DOWNLOAD'
              EXPORTING
                FILENAME                = v_download_file
                FILETYPE                = 'ASC'
                APPEND                  = 'X'
              TABLES
                DATA_TAB                = git_file
              EXCEPTIONS
                FILE_WRITE_ERROR        = 1
                NO_BATCH                = 2
                GUI_REFUSE_FILETRANSFER = 3
                INVALID_TYPE            = 4
                NO_AUTHORITY            = 5
                UNKNOWN_ERROR           = 6
                HEADER_NOT_ALLOWED      = 7
                SEPARATOR_NOT_ALLOWED   = 8
                FILESIZE_NOT_ALLOWED    = 9
                HEADER_TOO_LONG         = 10
                DP_ERROR_CREATE         = 11
                DP_ERROR_SEND           = 12
                DP_ERROR_WRITE          = 13
                UNKNOWN_DP_ERROR        = 14
                ACCESS_DENIED           = 15
                DP_OUT_OF_MEMORY        = 16
                DISK_FULL               = 17
                DP_TIMEOUT              = 18
                FILE_NOT_FOUND          = 19
                DATAPROVIDER_EXCEPTION  = 20
                CONTROL_FLUSH_ERROR     = 21
                OTHERS                  = 22.
            IF SY-SUBRC <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
            ENDIF.
            refresh git_file.
          ENDIF.
        endif.
        if v_file_exists = 'X'.
          if p_subrc = 0.
            v_count = v_count + 1.
            condense v_count.
            concatenate
                 v_foldername
                 v_filename
                 v_count
                 v_ext into v_uploadfile.
          endif.
        else.
        endif.
      endwhile.
      v_uploadfile = v_download_file.
      p_filename = v_uploadfile.
      p_subrc = 0.
    * $$ end of routine - insert your code only before this line         -
    endform.
    Regards,
    Prabhakar.

  • Load a flat file into BW-BPS using SAP GUI

    Hi,
    We are using BW-BPS version 3.5. I implemented the how-to guide "How to load a flat file into BW-BPS using SAP GUI" successfully, without any errors.
    I included three InfoObjects in the text file: cost element, posting period, and amount. The same three InfoObjects are included in the file structure in the global data, as specified in the how-to document.
    The flat file format is like this
    Costelmnt      Postingperiod         Amount   
    XXXXX             #      
    XXXXX             1                          100
    XXXXX             2                           800
    XXXXX             3                           700
    XXXXX             4                           500
    XXXXX             5                           300
    XXXXX             6                           200
    XXXXX             7                           270
    XXXXX             8                           120
    XXXXX             9                           145 
    XXXXX            10                           340
    XXXXX            11                           147
    XXXXX            12                           900 
    I successfully loaded the above flat file into the BPS cube, and it displayed in the layout as well.
    But users are requesting to load the flat file in the format below:
    Costelmnt        Annual(PP=#)   Jan(PP=1)   Feb(PP=2) ........................................Dec(PP=12)  
    XXXXX              Blank                       100           800                                                   900
    Is it possible to load a flat file like this?
    They want to load a single row instead of 13 rows for each cost element.
    How can this be done? Please suggest if anybody has come across this requirement.
    In the InfoCube we have only one InfoObject 0FISCPER3 (posting period) and one 0AMOUNT (amount).
    Do we need 13 InfoObjects, one for each posting period and amount?
    Is there any possibility that we can implement a user exit like the ones we use in BEx queries?
    Please share your ideas on this.
    Thanks in advance
    Best regards
    SS

    Hi,
    There are two ways to do this.
    One is to change the structure of the cube to have 12 key figures for the 12 posting periods.
    Another way is to write an ABAP function module that fetches the values from each record based on the posting period and stores them in the cube for the corresponding characteristic (a minimal sketch of this idea follows below). This way, you don't have to change the structure of the cube.
    If this particular cube is not used anywhere else, I would suggest changing the structure itself.
    Hope this helps.
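    A minimal sketch of the second option, assuming the wide file row has already been read into a work area; every structure, field, and table name here is hypothetical, and the annual (#) column is left out for brevity:
    * Hypothetical structures: ty_wide mirrors one wide file row,
    * ty_long mirrors the cube view (one posting period per record).
    types: begin of ty_wide,
             costelmnt(10) type c,
             m01 type p decimals 2,
             m02 type p decimals 2,
             m03 type p decimals 2,
             m04 type p decimals 2,
             m05 type p decimals 2,
             m06 type p decimals 2,
             m07 type p decimals 2,
             m08 type p decimals 2,
             m09 type p decimals 2,
             m10 type p decimals 2,
             m11 type p decimals 2,
             m12 type p decimals 2,
           end of ty_wide,
           begin of ty_long,
             costelmnt(10) type c,
             fiscper3(3)   type n,
             amount        type p decimals 2,
           end of ty_long.
    data: ls_wide type ty_wide,
          ls_long type ty_long,
          lt_long type standard table of ty_long,
          lv_comp type i.
    field-symbols: <lv_amount> type any.
    * Split one wide record into twelve period records (001..012).
    do 12 times.
      lv_comp = sy-index + 1.                "component 1 is the cost element
      assign component lv_comp of structure ls_wide to <lv_amount>.
      ls_long-costelmnt = ls_wide-costelmnt.
      ls_long-fiscper3  = sy-index.
      ls_long-amount    = <lv_amount>.
      append ls_long to lt_long.
    enddo.
    Each pass of the loop picks the next monthly component via ASSIGN COMPONENT, so the existing 0FISCPER3/0AMOUNT model stays unchanged and only the transfer logic grows.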

  • Loading Multiple Flat File in to BI System with one InfoPackage

    Greetings,
    I have developed a routine to load multiple flat files in one InfoPackage, but it doesn't work. Only the last file is loaded. Who can help me?
    Loading 'R:\Verzeichnis \ing_wan_b_002.fall.csv' to 'R:\Verzeichnis \ing_wan_b_240*.fall.csv'
    Routine:
    DATA: VerzUndDateiname TYPE string value 'R:\Verzeichnis \ing_wan_b_000.fall.csv',
          VerzUndDateinameKomplett TYPE string,
          count TYPE i value 2,
          countN(3) TYPE n.
    WHILE count <= 240.
      countN = count.
      VerzUndDateinameKomplett = VerzUndDateiname.
      REPLACE '000' WITH countN
                    INTO VerzUndDateinameKomplett.
      p_filename = VerzUndDateinameKomplett.
      count = count + 1.
    ENDWHILE.
    Best Regards
    Jens
    Edited by: JB6493 on May 18, 2009 1:03 PM
    Edited by: JB6493 on May 18, 2009 1:07 PM

    Hello Jens,
    you have to process the InfoPackage 239 times. Your routine is executed once during one processing of the InfoPackage; it runs the WHILE statement and ends up with the last filename after 239 iterations.
    Either try to concatenate the files you want to load (if possible) and load them in one go (see the hedged sketch below), or try to use a process chain to run the InfoPackage 239 times.
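    For the "one go" option, a rough sketch of what merging the files into a single load file could look like is below. It assumes the 002..240 numbering from your post and that the files are reachable from the application server; the directory /usr/sap/trans/ is only an illustrative choice, not tested code.
    REPORT z_concat_flat_files.
    * Hedged sketch only: merge ing_wan_b_002 .. ing_wan_b_240 into one
    * file that a single InfoPackage can then load.
    DATA: lv_source   TYPE string,
          lv_target   TYPE string VALUE '/usr/sap/trans/ing_wan_b_all.fall.csv',
          lv_line     TYPE string,
          lv_count(3) TYPE n.
    OPEN DATASET lv_target FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
    DO 239 TIMES.
      lv_count = sy-index + 1.                   "file numbers 002 .. 240
      CONCATENATE '/usr/sap/trans/ing_wan_b_' lv_count '.fall.csv'
             INTO lv_source.
      OPEN DATASET lv_source FOR INPUT IN TEXT MODE ENCODING DEFAULT.
      IF sy-subrc = 0.
        DO.
          READ DATASET lv_source INTO lv_line.
          IF sy-subrc <> 0.
            EXIT.
          ENDIF.
          TRANSFER lv_line TO lv_target.         "append record to merged file
        ENDDO.
        CLOSE DATASET lv_source.
      ENDIF.
    ENDDO.
    CLOSE DATASET lv_target.
    The merged file can then be loaded by the existing InfoPackage pointing at the single target file.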
    Kind regards,
    Christoph

  • Error when loading a flat file

    Hello,
    I am trying to load a flat file into our BW for the first time.  I keep getting the error:
    Error 1 when loading external data
    Message no. RSAR234
    I have searched SDN and looked at the OSS notes and have tried several suggestions, but I still cannot get the file to load. We are currently on version 3.5.
    My transfer structure and flat file do match.
    My transfer structure for IO_MAT_ATTR has a transfer method of PSA and is laid out as follows:
    IO_MAT       Material Number     IO_MAT
    IO_MATNM     Material Name       IO_MATNM
    IO_MAT is: CHAR 15
    IO_MATNM is: CHAR 30
    My flat file is saved as a CSV and is as follows:
    MATONE,TEA
    MATTWO,COFFEE
    My file is saved on my local PC and is not open when I try to load it.  When I attempt to preview the file in my infopackage, I get the same error there as well.
    Any suggestions would be greatly appreciated.
    Thanks
    Charla

    Hi Charla,
    Error 1 means that the system is unable to access the file. Make sure that the path is correct and also check if the data separator is correctly defined in the "external tab" of the infopackage.
    I guess the settings in the "external tab" of the infopackage have some issues.
    Bye
    Dinesh

  • Unable to load the flat file  from client  work station

    Hi,
    I am trying to load a flat file (.CSV file) from my desktop (client workstation) and am getting the following error:
    An upload from the client workstation in the background is not possible
    Message no. RSM860
    Diagnosis
    You cannot load data from the client workstation in the background.
    Procedure
    Transfer your data to the application server and load it from there.
    I received an .XLS file, converted it to a .CSV file, saved it on my desktop, and am trying to load it.
    Please help me with how to go about this.
    Thanks in advance.
    Christy.

    Hi All,
    Again, I have tried to load the flat file from the client workstation with direct loading. I got the following errors:
    Error 1:
    Record 990: Contents '50,000' from Field /BIC/ZPLQTY_B Not Convertible in Type QUAN -> Long Tex
    I received many errors like this.
    Error 2:
    Error in an arithmetic operation in record 259
    Please help me load the flat file successfully.
    If I have to save the flat file on the application server, how do I do that? Please provide step-by-step instructions.
    Thanks
    Christy

  • Tracking history while loading from flat file

    Dear Experts
    I have a scenario where I have to load data from a flat file at regular intervals and also want to track the changes made to the data.
    The data will look like this; it keeps changing, and the key is P1,A1,C1.
    DAY 1-- P1,A1,C1,100,120,100
    DAY 2-- P1,A1,C1,125,123,190
    DAY 3-- P1,A1,C1,134,111,135
    DAY 4-- P1,A1,C1,888,234,129
    I am planning to load the data into an ODS and then into an InfoCube for reporting. What will be the result, and how do I track history, e.g. if I want to see what the data was on Day 1 as well as the current data for P1,A1,C1?
    Just to mention: as I am loading from a flat file, I am not mapping RECORDMODE in the ODS from the flat file.
    Thanks and regards
    Neel
    Message was edited by:
            Neel Kamal

    Hi
    You don't mention your BI release level, so I will assume you are on the current release, SAP NetWeaver BI 2004s.
    Consider loading to a write-optimized DataStore object to store the data. That way, you automatically will have a unique technical key for each record, and it will be kept for historical purposes. In addition, load to a standard DataStore object, which will track the changes in the change log. Then load the cube from the change log (this will avoid your summarization concern), as the changes will be updates (after images) in the standard DataStore object.
    Thanks for any points you choose to assign
    Best Regards -
    Ron Silberstein
    SAP

  • ABAP Code for loading the flat files kept at FTP locations

    Hi all,
    I need to automate the process of data loading via flat files with the help of process chains, for which I require the following help:
    1. What steps should I follow so that whenever a new file arrives in the directory at the FTP location, the process chain for the data load is started? (Note: there is no fixed time for when the new file will arrive.)
    2. Every time, the new file will have a different name; in that case, how do we fetch the new file that is required for the data load?
    Thanks and Regards,
    Neha

    Hi Neha,
    I have not worked on this or written this code myself; my onsite team did it recently.
    Try to find more information on this function module: 'DX_GET_PHYSICAL_FILENAME'.
    The code is below; write it in the InfoPackage file name routine.
    DATA: v_filename LIKE dxfile-filename.
    * Below function module generates the filename with the complete path
    * based on the app server attached to the Central Instance.
    * The logical mapping Z_INBOUND_EBIZ_PAYMENTS is maintained in the
    * transaction "FILE".
      CALL FUNCTION 'DX_GET_PHYSICAL_FILENAME'
        EXPORTING
          i_filename             = 'Z_INBOUND_EBIZ_PAYMENTS'
        IMPORTING
          o_filename             = v_filename
        EXCEPTIONS
          not_registered         = 1
          logical_filename_error = 2
          OTHERS                 = 3.
      IF sy-subrc = 0.
        p_filename = v_filename.
        p_subrc = 0.
      ELSE.
        p_subrc = 4.
      ENDIF.
    Regards,
    San!

  • Errors when trying to load a flat file

    When I try to load a flat file (I have also manually created a test .csv file with one record), I am able to load to PSA, but when I try to load to the object I get the following error message (below). How can I fix this? Thanks
    The test record is:
    1460;MAMO;UK - Banbury;MAMO;Other;FCJ
    Diagnosis
        Data record 1 & with the key '1460MAMO &' is invalid in value 'Other &' of the attribute/characteristic WINVCAT &.
    System response
        The system has recognized that the value mentioned above is invalid, and has processed this general error message. A subsequent message may give you more information on the error. This message refers to the same value, even though it does not state this explicitly.
    Procedure
        If this message appears during a data load, maintain the attribute in the PSA maintenance screens. If this message appears in the master data maintenance screens, leave the transaction and call it again. This allows you to maintain your master data.

    Hi
    Go to transaction RSKC and check whether & is available or not; if it is not there, please enter the & value and execute.
    Also go to the PSA and change the record (delete the gap before &), then save and reload it.
    Thanks
    Teja
    Message was edited by: Teja badugu

  • Error loading a flat file.

    Hi,
    I am getting the following error while loading a flat file (in Preview):
    Error 8 when compiling the upload program: row 227, message: Data type /BIC/CCHBZFI_PREV_YR was found in a newe
    Please help.

    Hi,
    Check whether the file is open; close the file and try to pull it once again.
    Are the fields matching the DataSource? Check it once; it should be in sync with the master data fields.
    It may also be a problem with the data: I think some extra characters or lowercase values in the flat file are not in the permitted character set. This can be solved via RSKC.
