Extracting from a flat file for testing in place of ECC (R/3)

Hi Experts,
Please help me out and give me your suggestions.
We are creating a custom table in ECC; it is still in progress.
In the meantime, I would like to create the InfoObjects, transformation, DTP and cube, and build queries, loading the data from a flat file. Later on I would connect the whole provider to the table.
Please tell me whether there is any problem with this approach, or share any better ideas.
Thanks in advance.

Loading from a flat file for testing has worked well for me. We recently wanted to test an ODS object, but I didn't want to create a custom extractor until we knew whether the ODS object would work for us. So under my InfoSource I created a quick InfoPackage that pulled some test data from a flat file. We've loaded test data into cubes before. The only problem I ever had was when I used Excel to create the flat file: Excel wants to truncate leading zeros. 0CALMONTH, for example, needs to be loaded as 072007, not 72007.
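If you can't fix the value in the file itself, you can re-pad it in the transfer/update rule. A minimal ABAP sketch (variable names invented; assumes a 6-character month value):

    " Restore the leading zero that Excel stripped from a 0CALMONTH value.
    DATA: lv_raw      TYPE c LENGTH 6 VALUE '72007',
          lv_calmonth TYPE n LENGTH 6.
    " Assigning a character field to a numeric-text (type N) field keeps
    " only the digits, right-justifies them and pads with zeros: '072007'.
    lv_calmonth = lv_raw.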

Similar Messages

  • How to load Material from Flat File and convert to SAP Format

    Hi
I am loading 0MATERIAL values from a flat file for mapping purposes. The format of the material in the flat file is "7704132". Within the system, I need to compare this value with the 0MATERIAL values from the incoming data and update the corresponding 0MATERIAL in the records. For this purpose, I created dummy materials using 0MATERIAL as a template and tried to load the data. I am getting the error: Version '7704132' is not valid (message RSDMD 194). Can anyone please let me know how to overcome this issue? Should I include a routine at the DataSource or rules level? I am on BI 7.0.
    Thanks.

    Hi,
Use the function module CONVERSION_EXIT_ALPHA_INPUT to convert the value into the internal format. Use this FM in the transformation (field mapping).
Search the forum for CONVERSION_EXIT_ALPHA_INPUT for more information.
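For example, a field routine in the transformation could look like this minimal sketch (the source field name MATERIAL is illustrative):
    " Convert the external value '7704132' into the internal, zero-padded
    " format expected by 0MATERIAL ('000000000007704132').
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
      EXPORTING
        input  = SOURCE_FIELDS-material
      IMPORTING
        output = RESULT.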
    Regards,
    Anil Kumar Sharma .P

  • How often do real-time projects extract data from flat files and process them

    I am going through the BODS Data Integrator and trying to understand the demand for ETL services that extract data from a flat file. Is that really important in real-time jobs?
    Thank you very much for the helpful info.

    Hi,
    As per the inputs you gave, I started loading data from the flat file.
    I tried to load 28 files, of which 24 loaded successfully. For the other 4 I got these error messages:
    1) Error 'Enter period in the format __.YYYY...' at conversion exit CONVERSION_EXIT_PERI6_INPUT (field CALMONTH record 1, value DUMYTRA)
    Message no. RSDS012
    2)  a) Error 'The argument '1,008.00' cannot be interpreted as a number' on assignment field QUANT_B record 11714 value 1,008.00
    Message no. RSDS013
       b) Error 'The argument '1,110.00' cannot be interpreted as a number' on assignment field QUANT_B record 15374 value 1,110.00
    Message no. RSDS013
    3) a) Error 'The argument '1,140.00' cannot be interpreted as a number' on assignment field QUANT_B record 1647 value 1,140.00
    Message no. RSDS013
       b) Error 'The argument '2,028.00' cannot be interpreted as a number' on assignment field QUANT_B record 4625 value 2,028.00
    Message no. RSDS013
    4) Error 'The argument '1,151.00' cannot be interpreted as a number' on assignment field QUANT_B record 7808 value 1,151.00
    Message no. RSDS013
    I'm unable to trace what exactly the error is.
    I checked these values in the files and they look fine.
    Can anybody please guide me on this issue?
    With Regards,
    Pradeep.B

  • Wrong century for dates from a flat file

    I am in the process of uploading data from a flat file (a CSV) into Oracle via an external table.
    All the dates end up with a year of 20xx. My understanding was that 51-99 would be prefixed with 19 (such as 1951-1999) and 00-50
    would be prefixed with 20 (such as 2004...).
    The column below (startdate) is a timestamp.
    Is my understanding wrong?
    SQL> select startdate , to_date(startdate , 'dd/mm/yyyy' ) newdate from tab;
    NEW_DATE STARTDATE
    09/18/2090 18-SEP-90 12.00.00.000000000 AM
    BANNER
    Oracle Database 11g Enterprise Edition Release 11.1.0.7.0 - 64bit Production
    PL/SQL Release 11.1.0.7.0 - Production
    CORE 11.1.0.7.0 Production
    TNS for 64-bit Windows: Version 11.1.0.7.0 - Production
    NLSRTL Version 11.1.0.7.0 - Production
    SQL> show parameter nls
    NAME TYPE VALUE
    nls_calendar string
    nls_comp string BINARY
    nls_currency string
    nls_date_format string
    nls_date_language string
    nls_dual_currency string
    nls_iso_currency string
    nls_language string AMERICAN
    nls_length_semantics string BYTE
    nls_nchar_conv_excp string FALSE
    nls_numeric_characters string
    nls_sort string
    nls_territory string AMERICA
    nls_time_format string
    nls_time_tz_format string
    nls_timestamp_format string
    nls_timestamp_tz_format string
    SQL>
    SQL>

    Offense? None taken...
    I literally typed the SQL (I did not copy and paste the SQL or result sets), and I did use to_char in my original SQL. I don't want to expose the real table; that's why I gave a made-up test case. If you look carefully at the SQL and the corresponding result sets, you will notice it.
    Here is the SQL used (of course, I have changed the table name)...
    select to_char(startdate , 'mm/dd/yyyy') sdate, startdate from t
    SDATE STARTDATE
    09/18/2090 18-SEP-90 12.00.00.000000 AM
    The flat file had the value 091890 as the data.
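
    That is exactly what the YY format mask does: it puts the two-digit year into the current century. To get the 51-99 -> 19xx windowing you describe, the mask has to be RR, not YY. A quick sketch against DUAL (illustrative only):
    SQL> select to_char(to_date('091890','mmddyy'),'mm/dd/yyyy') yy_mask, to_char(to_date('091890','mmddrr'),'mm/dd/yyyy') rr_mask from dual;
    YY_MASK RR_MASK
    09/18/2090 09/18/1990
    So the external table's date mask (or the load's TO_DATE call) needs RR rather than YY for two-digit years.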

  • Extracting a flat file from an Oracle table

    I have moved the knowledge module IKM SQL to File Append from the metadata into my project folder.
    But when I create an interface mapping the Oracle table to a flat file on a different Unix server, the drop-down menu shows only IKM SQL to SQL and IKM SQL Control Append. It does not show the IKM SQL to File Append knowledge module option.
    What should I do to extract a flat file from an Oracle table?
    Thanks
    Hima
    Overstock.com

    The IKMs available in the drop-down menu depend on the target technology.
    A question: in this interface, is your target table a file?

  • CUNIT error in data loading from flat file after R/3 extraction

    Hi all,
    After the R/3 Business Content extraction, when I load data from a flat file to the InfoCube, I get a conversion exit CUNIT error. What might be the reason? The data in the flat file's 0UNIT column is accurate and the mapping rules are also correct, but I still get the CUNIT error.

    Check your unit handling: if you are loading amounts or quantities, check what mapping you have and what unit values you are actually loading from the flat file.
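    If the file carries external unit codes, one option is to convert them explicitly in the transfer/field routine. A minimal sketch (field names invented):
    " Convert an external unit such as 'PC' into the internal format ('ST').
    CALL FUNCTION 'CONVERSION_EXIT_CUNIT_INPUT'
      EXPORTING
        input          = SOURCE_FIELDS-unit
        language       = sy-langu
      IMPORTING
        output         = RESULT
      EXCEPTIONS
        unit_not_found = 1.
    IF sy-subrc <> 0.
      " Unknown unit in the file: handle or skip the record here.
    ENDIF.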
    BK

  • Data Source creation for Master Data from Flat File to BW

    Hi,
    I need to upload master data from a flat file. Can anybody describe the steps, right from creating the DataSource up to loading into the master data InfoObject?
    Does anybody have a document on this?
    Regards,
    Chakri.

    Hi,
    This is the procedure:
    1. Create the master data InfoObject, with or without attributes.
    2. Create an InfoSource
        a) with flexible update
             or
        b) with direct update.
    3. Create transfer rules, assign the names of the master data and attributes on the "Transfer rules" tab, and transfer them to the communication structure.
    4. Create the flat file with the same structure as the communication structure (see the sample after this list).
    5. If you chose direct update, create an InfoPackage, assign the name of the flat file, and schedule it.
    6. If you chose flexible update, create update rules, assigning the name of the InfoSource, and then schedule the load by creating an InfoPackage.
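    For step 4, as a rough illustration only (InfoObject, attributes and values invented), a CSV matching a communication structure of material plus two attributes might look like:
    MATERIAL;MATL_TYPE;DIVISION
    ABC100;FERT;01
    ABC200;HAWA;02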
    Hope this helps. If you still have problems, let me know.
    Follow this link also.
    http://help.sap.com/saphelp_nw2004s/helpdata/en/b2/e50138fede083de10000009b38f8cf/frameset.htm
    Assign points if helpful.
    Vinod.

  • How to design a flat file for loading an attribute dimension in a Planning application

    Dear Gurus,
    I have a requirement to extract attribute dimensions from an Essbase application and load them into another Planning application. I have a dimension called Program and two attribute dimensions, Sales Manager and Accounts Manager, associated with the Program dimension in the Essbase application. I will extract these dimensions using the Essbase Outline Extractor. After extracting the attribute dimensions, I have to load them into the Planning application using the Outline Load Utility. Kindly guide me on how to design the flat file for loading attribute dimensions into a Planning application.
    Thanks and Regards
    SC

    You could dig through the docs and try to figure out the file format manually, or you could do this the easy way.  Simply use the Outline Load Utility to export your attribute dimension from Planning.  The export file format is the same as the import file format.  You might have to manually add a couple of test members to your attribute dimension so that your export file has some content.  Then simply update the file you exported, and import it.
    (I am assuming you have already manually created the Attribute dimension in Planning, and that you simply need to add members to it.)
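    For what it's worth, the attribute-dimension load files I have seen are plain CSVs with a header naming the attribute dimension plus a Parent column, roughly like the sketch below. The member names here are invented, and the export you generate is the authoritative format, so verify against it:
    Sales Manager, Parent
    SM_East, Sales Manager
    SM_West, Sales Manager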
    Hope this helps,
    - Jake

  • Error while loading table from flat file (.csv)

    I have a flat file which I am loading into a target table in Oracle Warehouse Builder. It uses SQL*Loader internally to load the data from the flat file, and I am facing an issue. Please see the following error (an extract from the generated error log):
    SQL*Loader-500: Unable to open file (D:\MY CURRENT PROJECTS\GEIP-IHSS-Santa Clara\CDI-OWB\Source_Systems\Acquisition.csv)
    SQL*Loader-552: insufficient privilege to open file
    SQL*Loader-509: System error: The data is invalid.
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    I believe this is related to a SQL*Loader error.
    Actually, the flat file resides on my own system (D:\MY CURRENT PROJECTS\GEIP-IHSS-Santa Clara\CDI-OWB\Source_Systems\Acquisition.csv), and I am connecting to an Oracle server.
    Please suggest:
    Is it required to place the flat file on the Oracle server system?
    Regards,
    Ashoka BL

    Hi
    I am getting an error as well which is similar to that described above except that I get
    SQL*Loader-500: Unable to open file (/u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.csv)
    SQL*Loader-553: file not found
    SQL*Loader-509: System error: The system cannot find the file specified.
    SQL*Loader-2026: the load was aborted because SQL Loader cannot continue.
    The difference is that Ashoka was getting
    SQL*Loader-552: insufficient privilege to open file
    and I get
    SQL*Loader-553: file not found
    The initial thought was that the file does not exist in the specified directory, or that I had spelt the filename incorrectly, but this has been checked and double-checked. The Unix directory also has read and write permissions.
    Also in the error message is
    Control File: C:\u21\oracle\owb_staging\WHITEST\source_depot\INV_LOAD_LABEL_INVENTORY.ctl
    Character Set WE8MSWIN1252 specified for all input.
    Data File: /u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.csv
    Bad File: C:\u21\oracle\owb_staging\WHITEST\source_depot\Durham_Inventory_Labels.bad
    As can be seen from the above, it seems to be trying to create the .ctl and .bad files on my C drive instead of on the server in the same directory as the .csv file. The location is registered to the server directory /u21/oracle/owb_staging/WHITEST/source_depot.
    I am at a loss, as this works fine in development and I have just promoted all the development work to a systest environment using OMBPlus.
    The directory structure in development is the same as in systest, except that the data file is /u21/oracle/owb_staging/WHITED/source_depot/Durham_Inventory_Labels.csv, and everything works fine there: the .ctl and .bad files are created in the same directory and the data successfully loads into an Oracle table.
    Have I missed a setting in OWB during the promotion to systest, or is there something wrong in the way the repository in the systest database is set up?
    The systest and development databases are on the same box.
    Any help would be much appreciated
    Thanks
    Edwin
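
    One general point for both errors: SQL*Loader resolves every path (control, data, bad, log) on the machine where sqlldr itself runs, not inside the database. As a hedged sketch (credentials invented, paths taken from the post), a fully server-side invocation would look like:
    sqlldr userid=owb_user/secret control=/u21/oracle/owb_staging/WHITEST/source_depot/INV_LOAD_LABEL_INVENTORY.ctl data=/u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.csv bad=/u21/oracle/owb_staging/WHITEST/source_depot/Durham_Inventory_Labels.bad log=/u21/oracle/owb_staging/WHITEST/source_depot/load.log
    If the control and bad files resolve to C:\ paths, the load is being run from the client, which would explain both the Windows-style paths and the file-not-found error.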

  • Upload PO data into SRM system from flat file

    Hi all,
    I need to create a conversion program to upload open purchase order data from a flat file into the system.
    I am trying to create the PO using the BAPI BAPI_POEC_CREATE but am getting an error.
    Could someone give me details of the parameters that need to be passed to the BAPI?
    Thanks in advance
    Sharad

    Sharad,
    Not the very best piece of code, but should be helpful.
    REPORT  zkb_po_create.
    DATA: ls_po_header   TYPE bapi_po_header_c.
    DATA: ls_e_po_header TYPE bapi_po_header_d.
    DATA: ls_po_items    TYPE bapi_po_item_c.
    DATA: ls_po_accass   TYPE bapi_acc_c.
    DATA: ls_po_partner  TYPE bapi_bup_c.
    DATA: ls_po_orgdata  TYPE bapi_org_c.
    DATA: ls_return      TYPE bapiret2.
    DATA: lt_po_items   TYPE TABLE OF bapi_po_item_c.
    DATA: lt_po_accass  TYPE TABLE OF bapi_acc_c.
    DATA: lt_po_partner TYPE TABLE OF bapi_bup_c.
    DATA: lt_po_orgdata TYPE TABLE OF bapi_org_c.
    DATA: lt_return     TYPE TABLE OF bapiret2.
    * Header Details
    ls_po_header-businessprocess = 1.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
      EXPORTING
        input  = ls_po_header-businessprocess
      IMPORTING
        output = ls_po_header-businessprocess.
    ls_po_header-process_type = 'EC'.
    ls_po_header-doc_date = sy-datum.
    ls_po_header-description = 'Test for BAPI_POEC_CREATE'.
    ls_po_header-logsys_fi = 'Backend'.
    ls_po_header-co_code = '1000'.
    ls_po_header-currency = 'GBP'.
    * Item Details
    ls_po_items-item_guid    = 2.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
      EXPORTING
        input  = ls_po_items-item_guid
      IMPORTING
        output = ls_po_items-item_guid.
    ls_po_items-parent        = ls_po_header-businessprocess.
    ls_po_items-product_guid  = '4678E74FFFC380AD000000000A8E035B'.
    ls_po_items-product_id    = '400030'.
    ls_po_items-product_type  = '01'.
    ls_po_items-category_guid = '4627B461073F40FC000000000A8E035B'.
    ls_po_items-category_id   = '1.04.0500'.
    ls_po_items-quantity      = 10.
    ls_po_items-deliv_date   = sy-datum + 10.
    ls_po_items-price = '25'.
    APPEND ls_po_items TO lt_po_items.
    * Account Assignment
    ls_po_accass-guid = 3.
    CALL FUNCTION 'CONVERSION_EXIT_ALPHA_INPUT'
      EXPORTING
        input  = ls_po_accass-guid
      IMPORTING
        output = ls_po_accass-guid.
    ls_po_accass-parent_guid = ls_po_items-item_guid.
    ls_po_accass-distr_perc  = 100.
    ls_po_accass-g_l_acct    = '<gl acc>'.
    ls_po_accass-cost_ctr    = '<cost centre>'.
    ls_po_accass-co_area     = '<Ctrl area>'.
    APPEND ls_po_accass TO lt_po_accass.
    * Partner Functions
    ls_po_partner-partner_fct  = '00000019'.
    ls_po_partner-partner      = 'Vendor'.
    ls_po_partner-parent_guid  = ls_po_items-item_guid.
    APPEND ls_po_partner TO lt_po_partner.
    ls_po_partner-partner_fct  = '00000016'.
    ls_po_partner-partner      = 'Requester'.
    ls_po_partner-parent_guid  = ls_po_items-item_guid.
    APPEND ls_po_partner TO lt_po_partner.
    ls_po_partner-partner_fct  = '00000020'.
    ls_po_partner-partner      = 'Recipient'.
    ls_po_partner-parent_guid  = ls_po_items-item_guid.
    APPEND ls_po_partner TO lt_po_partner.
    ls_po_partner-partner_fct  = '00000075'.
    ls_po_partner-partner      = 'Location'.
    ls_po_partner-parent_guid  = ls_po_items-item_guid.
    APPEND ls_po_partner TO lt_po_partner.
    ls_po_orgdata-proc_org_ot = 'O'.
    ls_po_orgdata-proc_org_id = 'Pur Org'.
    ls_po_orgdata-proc_group_ot = 'O'.
    ls_po_orgdata-proc_group_id = 'Pur Group'.
    ls_po_orgdata-parent_guid = ls_po_items-item_guid.
    APPEND ls_po_orgdata TO lt_po_orgdata.
    CALL FUNCTION 'BAPI_POEC_CREATE'
      EXPORTING
        i_po_header  = ls_po_header
      IMPORTING
        e_po_header  = ls_e_po_header
      TABLES
        i_po_items   = lt_po_items
        i_po_accass  = lt_po_accass
        i_po_partner = lt_po_partner
        i_po_orgdata = lt_po_orgdata
        return       = lt_return.
    READ TABLE lt_return INTO ls_return WITH KEY type = 'E'.
    IF sy-subrc NE 0.
      COMMIT WORK AND WAIT.
      WRITE:/ ls_e_po_header-doc_number, ': created successfully'.
    ELSE.
      WRITE:/ 'The following errors occurred during PO creation.'.
      LOOP AT lt_return INTO ls_return.
        WRITE:/ ls_return-message.
      ENDLOOP.
    ENDIF.
    Regards, Kathirvel

  • Data update from flat file to InfoCube via ODS

    Hi,
    I have to load transaction data from 4 tables into 4 ODS objects, and the data from these 4 ODS objects will be loaded into 1 InfoCube. Please tell me the best method: should I use 4 InfoSources for the 4 ODS objects, or can I define all the InfoObjects in one InfoSource and use the same InfoSource for all the ODS objects?
    Thank you

    Hello Ganga,
    How are you?
    First, could you tell us whether the loads are from flat files or from tables?
    Your subject line and your message contradict each other.
    If you use flat files, then better to go for 4 InfoSources; 1 ODS is enough. Then update the cube.
    If you are going to extract from tables, you have to create the DataSource using "Extraction from Views"; for this you should have common key fields in all 4 tables. In BW, the DataSource-to-InfoSource mapping should be 1 to 1.
    Best Regards....
    Sankar Kumar
    +91 98403 47141

  • Data loading from flat file to cube using BW 3.5

    Hi Experts,
    Kindly give me the detailed steps, with screenshots, for loading data from a flat file to a cube using BW 3.5.
    Please.

    Hi,
    Procedure
    You are in the Data Warehousing Workbench in the DataSource tree.
           1.      Select the application components in which you want to create the DataSource and choose Create DataSource.
           2.      On the next screen, enter a technical name for the DataSource, select the type of DataSource and choose Copy.
    The DataSource maintenance screen appears.
           3.      Go to the General tab page.
                                a.      Enter descriptions for the DataSource (short, medium, long).
                                b.      As required, specify whether the DataSource builds an initial non-cumulative and can return duplicate data records within a request.
                                c.      Specify whether you want to generate the PSA for the DataSource in the character format. If the PSA is not typed it is not generated in a typed structure but is generated with character-like fields of type CHAR only.
    Use this option if conversion during loading causes problems, for example, because there is no appropriate conversion routine, or if the source cannot guarantee that data is loaded with the correct data type.
    In this case, after you have activated the DataSource you can load data into the PSA and correct it there.
           4.      Go to the Extraction tab page.
                                a.      Define the delta process for the DataSource.
                                b.      Specify whether you want the DataSource to support direct access to data.
                                c.      Real-time data acquisition is not supported for data transfer from files.
                                d.      Select the adapter for the data transfer. You can load text files or binary files from your local work station or from the application server.
    Text-type files only contain characters that can be displayed and read as text. CSV and ASCII files are examples of text files. For CSV files you have to specify a character that separates the individual field values. In BI, you have to specify this separator character and an escape character which specifies this character as a component of the value if required. After specifying these characters, you have to use them in the file. ASCII files contain data in a specified length. The defined field length in the file must be the same as the assigned field in BI.
    Binary files contain data in the form of Bytes. A file of this type can contain any type of Byte value, including Bytes that cannot be displayed or read as text. In this case, the field values in the file have to be the same as the internal format of the assigned field in BI.
    Choose Properties if you want to display the general adapter properties.
                                e.      Select the path to the file that you want to load or enter the name of the file directly, for example C:/Daten/US/Kosten97.csv.
    You can also create a routine that determines the name of your file. If you do not create a routine to determine the name of the file, the system reads the file name directly from the File Name field.
                                  f.      Depending on the adapter and the file to be loaded, make further settings.
    ■       For binary files:
    Specify the character record settings for the data that you want to transfer.
    ■       Text-type files:
    Specify how many rows in your file are header rows and can therefore be ignored when the data is transferred.
    Specify the character record settings for the data that you want to transfer.
    For ASCII files:
    If you are loading data from an ASCII file, the data is requested with a fixed data record length.
    For CSV files:
    If you are loading data from an Excel CSV file, specify the data separator and the escape character.
    Specify the separator that your file uses to divide the fields in the Data Separator field.
    If the data separator character is part of the value, the file indicates this by enclosing the value in particular start and end characters. Enter these start and end characters in the Escape Characters field.
    You chose the ';' character as the data separator. However, your file contains the value 12;45 for a field. If you set " as the escape character, the value in the file must be "12;45" so that 12;45 is loaded into BI. The complete value that you want to transfer has to be enclosed by the escape characters.
    If the escape characters do not enclose the value but are used within it, the system interprets them as a normal part of the value. If you have specified " as the escape character, the value 12"45 is transferred as 12"45, and 12"45" is transferred as 12"45". (A small sample file illustrating these settings appears at the end of this post.)
    In a text editor (for example, Notepad) check the data separator and the escape character currently being used in the file. These depend on the country version of the file you used.
    Note that if you do not specify an escape character, the space character is interpreted as the escape character. We recommend that you use a different character as the escape character.
    If you select the Hex indicator, you can specify the data separator and the escape character in hexadecimal format. When you enter a character for the data separator and the escape character, these are displayed as hexadecimal code after the entries have been checked. A two character entry for a data separator or an escape sign is always interpreted as a hexadecimal entry.
                                g.      Make the settings for the number format (thousand separator and character used to represent a decimal point), as required.
                                h.      Make the settings for currency conversion, as required.
                                  i.      Make any further settings that are dependent on your selection, as required.
           5.      Go to the Proposal tab page.
    This tab page is only relevant for CSV files. For files in different formats, define the field list on the Fields tab page.
    Here you create a proposal for the field list of the DataSource based on the sample data from your CSV file.
                                a.      Specify the number of data records that you want to load and choose Upload Sample Data.
    The data is displayed in the upper area of the tab page in the format of your file.
    The system displays the proposal for the field list in the lower area of the tab page.
                                b.      In the table of proposed fields, use Copy to Field List to select the fields you want to copy to the field list of the DataSource. All fields are selected by default.
           6.      Go to the Fields tab page.
    Here you edit the fields that you transferred to the field list of the DataSource from the Proposal tab page. If you did not transfer the field list from a proposal, you can define the fields of the DataSource here.
                                a.      To define a field, choose Insert Row and specify a field name.
                                b.      Under Transfer, specify the decision-relevant DataSource fields that you want to be available for extraction and transferred to BI.
                                c.      Instead of generating a proposal for the field list, you can enter InfoObjects to define the fields of the DataSource. Under Template InfoObject, specify InfoObjects for the fields in BI. This allows you to transfer the technical properties of the InfoObjects into the DataSource field.
    Entering InfoObjects here does not equate to assigning them to DataSource fields. Assignments are made in the transformation. When you define the transformation, the system proposes the InfoObjects you entered here as InfoObjects that you might want to assign to a field.
                                d.      Change the data type of the field if required.
                                e.      Specify the key fields of the DataSource.
    These fields are generated as a secondary index in the PSA. This is important in ensuring good performance for data transfer process selections, in particular with semantic grouping.
                                  f.      Specify whether lowercase is supported.
                                g.      Specify whether the source provides the data in the internal or external format.
                                h.      If you choose the external format, ensure that the output length of the field (external length) is correct. Change the entries, as required.
                                  i.      If required, specify a conversion routine that converts data from an external format into an internal format.
                                  j.      Select the fields that you want to be able to set selection criteria for when scheduling a data request using an InfoPackage. Data for this type of field is transferred in accordance with the selection criteria specified in the InfoPackage.
                                k.      Choose the selection options (such as EQ, BT) that you want to be available for selection in the InfoPackage.
                                  l.      Under Field Type, specify whether the data to be selected is language-dependent or time-dependent, as required.
           7.      Check, save and activate the DataSource.
           8.      Go to the Preview tab page.
    If you select Read Preview Data, the number of data records you specified in your field selection is displayed in a preview.
    This function allows you to check whether the data formats and data are correct.
    For More Info:  http://help.sap.com/saphelp_nw70/helpdata/EN/43/01ed2fe3811a77e10000000a422035/content.htm
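    To make the separator and escape settings concrete, here is a tiny sample CSV (values invented) matching the description in step 4f: one header row, ';' as the data separator, '"' as the escape character:
    CUSTOMER;CITY;AMOUNT
    1000;Hamburg;250.00
    1001;"Oak Ridge; TN";300.00
    The quoted value keeps its embedded ';' because the escape characters enclose the complete value.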

  • Extracting into Flat Files Using OCI or Pro*C Programs

    Data extraction into flat files from a database using OCI or Pro*C programs - please provide me with sample code. It is urgent. Thank you in advance.

    This problem is very simple to solve. Simply use Pro*C, issue a SQL select into a host variable, then use the Unix "printf" to output the result. An alternative is to use the provided sqlplus utility to grab the data in a script, disabling headers, etc.
    Sadly, this area is a huge, basic hole in the Oracle product offering. While they have an import utility, there is no export utility, except for one that produces binary files that are unusable outside Oracle. Every other RDBMS I've seen has this. In Informix, you can say something like "export to <filename> select * from <table>", but that syntax is not offered by Oracle.
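    For the sqlplus route, a minimal script sketch (table and columns invented) would be:
    set heading off feedback off pagesize 0 linesize 500 trimspool on
    spool /tmp/emp.csv
    select empno || ',' || ename || ',' || sal from emp;
    spool off
    Each row comes out as one comma-separated line, which is usually enough for a quick flat-file extract.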

  • How to upload schedule lines from flat files to SAP

    Dear all,
    I want to upload schedule lines from flat files into SAP schedule lines. The flat files have 15 schedule lines each and the data is organized by date, and the fields available in the flat files are more than what fits on the SAP screen. We have more than 6 items and 15 schedule lines each, which is about 90 data records to upload for one customer every 15 days.
    How can this be done? Is there any direct way on the functional side, without the help of any ABAP? My user will do it himself, so he needs a permanent solution.
    With regards,
    Subrat

    Hi Subrat,
    You can upload the data (either master or transaction data) with the help of LSMW. All you need to do is work through LSMW; within it you can use batch input recording, a BAPI, or an IDoc. Here I am sending the LSMW notes; go through them and do the work.
    Once you create the LSMW project, you can either ask the user for the data or explain the program to the user, who can then run the flat file to upload the data.
    If you require LSMW material, just send me a blank mail. My mail id is [email protected]
    Reward if helpful.
    Regards,
    Praveen Kumar.D

  • Short dump while reading a currency field from flat file into internal table

    Hi,
    I am getting a short dump reporting a number conversion error while reading a currency value into a field of an internal table from a fixed-length flat file.
    Do I need to use a string variable to get the value from the flat file, or what?
    Please suggest.

    Santosh,
    Thanks for your inputs,
    but my internal table field is of type DEC(5,2), and I gather it needs to be of type 'C'. Can you suggest something?
    Example:
    MOVE wa_temp-infile_string+106(8) TO wa_item-qt_percent.
    This didn't work,
    so I tried moving into a separate variable:
    MOVE wa_temp-infile_string+106(8) TO v_percent.
    and then writing it across:
    WRITE v_percent TO wa_item-qt_percent.
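
    The usual fix is to read the substring into a character field first, clean it up, and only then assign it to the packed field. A minimal sketch (offsets taken from your post; exception handling assumed to fit your release):
    DATA lv_raw TYPE c LENGTH 8.
    lv_raw = wa_temp-infile_string+106(8).
    " Strip thousand separators and blanks before the numeric move.
    REPLACE ALL OCCURRENCES OF ',' IN lv_raw WITH ''.
    CONDENSE lv_raw NO-GAPS.
    TRY.
        " A character-to-packed assignment performs the number conversion.
        wa_item-qt_percent = lv_raw.
      CATCH cx_sy_conversion_no_number.
        " Invalid value in the file; handle or skip the record here.
    ENDTRY.
    Note that WRITE ... TO goes the other way (it formats an internal value for output), which is why it did not help here.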
