Problem uploading data from a .CSV file

Hello,
I have a requirement in which I need to upload a .csv file. I tried using GUI_UPLOAD and TEXT_CONVERT_CSV_TO_SAP, but the values end up appended as a single value separated by commas.
For example, if my test.csv file has 5 columns and 10 rows, then after GUI_UPLOAD and TEXT_CONVERT_CSV_TO_SAP I get 10 rows but only 1 column filled: each row contains all 5 values separated by commas, and the other 4 columns of the internal table are not populated.
Can anyone help me?
Thanks and Regards,
Gautham

Hi,
use the function module 'ALSM_EXCEL_TO_INTERNAL_TABLE'.
REPORT  ZSR_BDC_XL
        NO STANDARD PAGE HEADING LINE-SIZE 255.
TABLES : LFA1,RF02K.
DATA : BEGIN OF ITAB OCCURS 0,
       LIFNR LIKE RF02K-LIFNR,
       KTOKK LIKE RF02K-KTOKK,
       NAME1 LIKE LFA1-NAME1,
       SORTL LIKE LFA1-SORTL,
       LAND1 LIKE LFA1-LAND1,
       SPRAS LIKE LFA1-SPRAS,
       END OF ITAB.
DATA : ITAB1 LIKE ALSMEX_TABLINE OCCURS 0 WITH HEADER LINE.
DATA : B1 TYPE I VALUE 1,
       C1 TYPE I VALUE 1,
       B2 TYPE I VALUE 10,
       C2 TYPE I VALUE 99.
INCLUDE BDCRECX1.
START-OF-SELECTION.
  CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
    EXPORTING
      FILENAME                = 'C:\xl1.csv'
      I_BEGIN_COL             = B1
      I_BEGIN_ROW             = C1
      I_END_COL               = B2
      I_END_ROW               = C2
    TABLES
      INTERN                  = ITAB1
    EXCEPTIONS
      INCONSISTENT_PARAMETERS = 1
      UPLOAD_OLE              = 2
      OTHERS                  = 3.
  IF SY-SUBRC <> 0.
    MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
            WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
  ENDIF.
  PERFORM  ORGANIZE_UPLOADED_DATA.
  PERFORM OPEN_GROUP.
  LOOP AT ITAB.
    PERFORM BDC_DYNPRO      USING 'SAPMF02K' '0100'.
    PERFORM BDC_FIELD       USING 'BDC_CURSOR'
                                  'RF02K-KTOKK'.
    PERFORM BDC_FIELD       USING 'BDC_OKCODE'
                                  '/00'.
    PERFORM BDC_FIELD       USING 'RF02K-LIFNR'
                                  ITAB-LIFNR.
    PERFORM BDC_FIELD       USING 'RF02K-KTOKK'
                                  ITAB-KTOKK.
    PERFORM BDC_DYNPRO      USING 'SAPMF02K' '0110'.
    PERFORM BDC_FIELD       USING 'BDC_CURSOR'
                                  'LFA1-SPRAS'.
    PERFORM BDC_FIELD       USING 'BDC_OKCODE'
                                  '/00'.
    PERFORM BDC_FIELD       USING 'LFA1-NAME1'
                                  ITAB-NAME1.
    PERFORM BDC_FIELD       USING 'LFA1-SORTL'
                                  ITAB-SORTL.
    PERFORM BDC_FIELD       USING 'LFA1-LAND1'
                                  ITAB-LAND1.
    PERFORM BDC_FIELD       USING 'LFA1-SPRAS'
                                  ITAB-SPRAS.
    PERFORM BDC_DYNPRO      USING 'SAPMF02K' '0120'.
    PERFORM BDC_FIELD       USING 'BDC_CURSOR'
                                  'LFA1-KUNNR'.
    PERFORM BDC_FIELD       USING 'BDC_OKCODE'
                                  '/00'.
    PERFORM BDC_DYNPRO      USING 'SAPMF02K' '0130'.
    PERFORM BDC_FIELD       USING 'BDC_CURSOR'
                                  'LFBK-BANKS(01)'.
    PERFORM BDC_FIELD       USING 'BDC_OKCODE'
                                  '=ENTR'.
    PERFORM BDC_DYNPRO      USING 'SAPLSPO1' '0300'.
    PERFORM BDC_FIELD       USING 'BDC_OKCODE'
                                  '=YES'.
    PERFORM BDC_TRANSACTION USING 'XK01'.
  ENDLOOP.
* Close the BDC session only after all records have been processed
  PERFORM CLOSE_GROUP.
FORM ORGANIZE_UPLOADED_DATA .
  SORT  ITAB1  BY  ROW
                   COL.
  LOOP  AT  ITAB1.
    CASE  ITAB1-COL.
      WHEN  1.
        ITAB-LIFNR = ITAB1-VALUE.
      WHEN  2.
        ITAB-KTOKK = ITAB1-VALUE.
      WHEN  3.
        ITAB-NAME1 = ITAB1-VALUE.
      WHEN  4.
        ITAB-SORTL = ITAB1-VALUE.
      WHEN  5.
        ITAB-LAND1 = ITAB1-VALUE.
      WHEN  6.
        ITAB-SPRAS = ITAB1-VALUE.
    ENDCASE.
    AT END OF ROW.
      APPEND ITAB.
      CLEAR ITAB.
    ENDAT.
  ENDLOOP.
ENDFORM.                    " ORGANIZE_UPLOADED_DATA
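Note: ALSM_EXCEL_TO_INTERNAL_TABLE reads the file through OLE, so Microsoft Excel must be installed on the frontend and the file should really be an Excel workbook. For a plain .csv file, an alternative that addresses the original question directly is to upload the file as text lines with GUI_UPLOAD and SPLIT each line at the comma. A minimal sketch, assuming the same six columns as ITAB above and no quoted fields that themselves contain commas:

DATA : LT_RAW  TYPE TABLE OF STRING,
       LV_LINE TYPE STRING.

CALL FUNCTION 'GUI_UPLOAD'
  EXPORTING
    FILENAME = 'C:\test.csv'
    FILETYPE = 'ASC'
  TABLES
    DATA_TAB = LT_RAW
  EXCEPTIONS
    OTHERS   = 1.
IF SY-SUBRC <> 0.
  MESSAGE 'File upload failed' TYPE 'E'.
ENDIF.

LOOP AT LT_RAW INTO LV_LINE.
* Split each CSV line into the individual columns of the work area
  SPLIT LV_LINE AT ','
        INTO ITAB-LIFNR ITAB-KTOKK ITAB-NAME1
             ITAB-SORTL ITAB-LAND1 ITAB-SPRAS.
  APPEND ITAB.
  CLEAR ITAB.
ENDLOOP.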
Please reward points if helpful.

Similar Messages

  • Error while uploading data from a flat file to the hierarchy

    Hi guys,
    after I upload data from a flat file to the hierarchy, I get an error message "Please select a valid info object". I am loading data using the PSA and have activated all external characteristics, but I still get the problem. Some help on this please.
    regards
    Sri

    There is no relation between the InfoObject name in the flat file and the InfoObject name on the BW side.
    Please check the objects in BW, their lengths and types, and check whether your flat file has the same types.
    Then check the sequence of the objects in the transfer rules and activate them.
    There you go.

  • Uploading Data from a Flat File

    Hi
    I am trying to upload data from a flat file to the MDS. I have a few questions on the process.
    a) XI would be the interface, and one end would be a file adapter with the flat file format. On the other end, which interface should I use - ABA Business Partner In or MDM Business Partner In? I do not understand the differences between them and where each one should be used.
    b) At the moment, I want to map the data to the standard object type, Business Partner BUS1006. This also has a staging area already defined in the system. Can I view the data which is imported into the staging area? How so?
    c) The structure (or data) that I wish to upload has few fields and does not map to the BP structure easily. In such a scenario, does it make sense to
    (i) create a new object type, or
    (ii) create a new Business Partner type with the appropriate fields only?
    What I need to know is whether either of these options is feasible and what the pros & cons of doing so are, in terms of effort, skillset & interaction with SAP Development.
    Kindly do reply if you have any answers to these questions.
    Regards,
    Gaurav

    Hi Markus,
    Thanks for your inputs.
    I have tried uploading some data from a flat file to the Business Partner object type BUS1006. I initially got some ABAP parsing errors on the MDM side, but after correcting that, I find my message triggers a short dump: the method SET_OBJECTKEYS does not find any keys in the BP structure that has been created.
    Q1 - How do I get around this problem? Is it necessary for me to specify keys in the PartyID node of the interface ABABusinessPartnerIn, or is it something else?
    Q2 - How is this general process supposed to work? I would assume that for staging, I would get incomplete master data or data from flat files, which need not necessarily contain keys. The aim is to use the matching strategies in the CI to identify duplicates and consolidate them.
    Thanks in advance for your reply.
    Regards,
    Gaurav

  • Import data from excel/csv file in web dynpro

    Hi All,
    I need to populate a WD table by first importing an Excel/CSV file through a Web Dynpro screen and then reading through the file. I am using the FileUpload element from NW04s.
    How can I read/import data from an Excel/CSV file into a Web Dynpro table context?
    Any help is appreciated.
    Thanks a lot
    Aakash

    Hi,
    Here are the basic steps needed to read data from an Excel spreadsheet using the Java Excel API (jExcel API).
    jExcel API can read a spreadsheet from a file stored on the local file system or from an input stream. Ideally, the following should be the steps while reading:
    Create a workbook from a file on the local file system, as illustrated in the following code fragment:
              import java.io.File;
              import java.util.Date;
              import jxl.*;
             Workbook workbook = Workbook.getWorkbook(new File("test.xls"));
    On getting access to the workbook, one can use the following code piece to access individual sheets. These are zero indexed - the first sheet being 0, the second sheet being 1, and so on. (You can also use the API to retrieve a sheet by name.)
              Sheet sheet = workbook.getSheet(0);
    After getting the sheet, you can retrieve the cell's contents as a string by using the convenience method getContents(). In the example code below, A1 is a text cell, B2 is a numerical value and C2 is a date. The contents of these cells may be accessed as follows:
    Cell a1 = sheet.getCell(0,0);
    Cell b2 = sheet.getCell(1,1);
    Cell c2 = sheet.getCell(2,1);
    String stringA1 = a1.getContents();
    String stringB2 = b2.getContents();
    String stringC2 = c2.getContents();
    // perform operations on the strings
    However, in case we need to access the cell's contents as the exact data type, i.e. as a numerical value or as a date, then the retrieved Cell must be cast to the correct type and the appropriate methods called. The code piece given below illustrates how JExcelApi may be used to retrieve a genuine java double and java.util.Date object from an Excel spreadsheet. For completeness the label is also cast to its correct type. The code snippet also illustrates how to verify that a cell is of the expected type - this can be useful when validating that the spreadsheet contains the correct datatypes.
      String stringA1 = null;
      double numberB2 = 0;
      Date dateC2 = null;
      Cell a1 = sheet.getCell(0,0);
      Cell b2 = sheet.getCell(1,1);
      Cell c2 = sheet.getCell(2,1);
      if (a1.getType() == CellType.LABEL) {
          LabelCell lc = (LabelCell) a1;
          stringA1 = lc.getString();
      }
      if (b2.getType() == CellType.NUMBER) {
          NumberCell nc = (NumberCell) b2;
          numberB2 = nc.getValue();
      }
      if (c2.getType() == CellType.DATE) {
          DateCell dc = (DateCell) c2;
          dateC2 = dc.getDate();
      }
      // operate on the dates and doubles
    It is recommended to use the close() method (as in the code piece below) when you are done with processing all the cells. This frees up any allocated memory used when reading spreadsheets and is particularly important when reading large spreadsheets.
              // Finished - close the workbook and free up memory
              workbook.close();
    The API class files are available in 'jxl.jar', which is available for download.
    Regards
    Raghu

  • How can I import data from a csv file into databse using utl_file?

    Hi,
    I have two machines (OS is Windows and database is Oracle 10g) that are not connected to each other; both have the same database schema but the data is all different.
    Now on one machine, I want to take a dump of all the tables into csv files, e.g. if my table name is test then the exported file is test.csv, and if the table name is sample then the csv file name is sample.csv, and so on.
    Then I want to import the data from these csv files into the tables on the second machine. If I have 50 such csv files, the data should be written to 50 tables.
    I am new to this. Could anyone please let me know how I can import the data back into the tables? I can't use SQL*Loader as I have to satisfy a few conditions while loading the data into the tables. I am stuck and not able to proceed.
    Please let me know how I can do this.
    Thanks,
    Shilpi

    Why do you want to export into a .csv file? Why not use export/import? What is your Oracle version?
    Read http://www.oracle-base.com/articles/10g/oracle-data-pump-10g.php
    Regards
    Biju

  • I need to import data from a CSV file to an Oracle table

    I need to import data from a CSV file to an Oracle table. I'd prefer to use either SQL Developer or SQL Plus code.
    As an example, my target database is HH910TS2, server is ADDb0001, my dB login is em/em, the Oracle table is AE1 and the CSV file is AECSV.
    Any ideas / help ?

    And just for clarity, it's good to get your head around some basic concepts...
    user635625 wrote:
    I need to import data from a CSV file to an Oracle table. I'd prefer to use either SQL Developer or SQL Plus code.
    SQL Developer is a GUI front end that submits code to the database and displays the results. It does not have any code of its own (although it may have some "commands" that are SQL Developer specific).
    SQL*Plus is another front end (character based rather than GUI) that submits code to the database and displays the results. It also does not have code of its own, although there are SQL*Plus commands for use only in the SQL*Plus environment.
    The "code" that you are referring to is either SQL or PL/SQL, so you shouldn't limit yourself to thinking it has to be for SQL Developer or SQL*Plus. There are many front-end tools that can all deal with the same SQL and/or PL/SQL code. Focus on the SQL and/or PL/SQL side of your coding and don't concern yourself with the limitations of the tool you are using. We often see people on here who don't recognise these differences and then ask why their code isn't working when they've put SQL*Plus commands inside their PL/SQL code. ;)

  • ODI Error when Loading data from a .csv file to Planning

    Hello,
    I am trying to load data from a csv file to Planning using ODI 10.1.3.6 and I am facing this particular error. I am using the Sunopsis Memory Engine as the staging area.
    7000 : null : java.sql.SQLException: Invalid COL ALIAS "DEFAULT C12_ALIAS__DEFAULT" for column "ALIAS:"
    java.sql.SQLException: Invalid COL ALIAS "DEFAULT C12_ALIAS__DEFAULT" for column "ALIAS:"
         at com.sunopsis.jdbc.driver.file.bb.b(bb.java)
         at com.sunopsis.jdbc.driver.file.bb.a(bb.java)
         at com.sunopsis.jdbc.driver.file.w.b(w.java)
         at com.sunopsis.jdbc.driver.file.w.executeQuery(w.java)
         at com.sunopsis.sql.SnpsQuery.executeQuery(SnpsQuery.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.execCollOrders(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTaskTrt(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSqlC.treatTaskTrt(SnpSessTaskSqlC.java)
         at com.sunopsis.dwg.dbobj.SnpSessTaskSql.treatTask(SnpSessTaskSql.java)
         at com.sunopsis.dwg.dbobj.SnpSessStep.treatSessStep(SnpSessStep.java)
         at com.sunopsis.dwg.dbobj.SnpSession.treatSession(SnpSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandSession.treatCommand(DwgCommandSession.java)
         at com.sunopsis.dwg.cmd.DwgCommandBase.execute(DwgCommandBase.java)
         at com.sunopsis.dwg.cmd.e.k(e.java)
         at com.sunopsis.dwg.cmd.g.A(g.java)
         at com.sunopsis.dwg.cmd.e.run(e.java)
         at java.lang.Thread.run(Unknown Source)
    Code from Operator:
    select     Account     C1_ACCOUNT,
         Parent     C2_PARENT,
         Alias: Default     C12_ALIAS__DEFAULT,
         Data Storage     C3_DATA_STORAGE,
         Two Pass Calculation     C9_TWO_PASS_CALCULATION,
         Account Type     C6_ACCOUNT_TYPE,
         Time Balance     C14_TIME_BALANCE,
         Data Type     C5_DATA_TYPE,
         Variance Reporting     C10_VARIANCE_REPORTING,
         Source Plan Type     C13_SOURCE_PLAN_TYPE,
         Plan Type (FinStmt)     C7_PLAN_TYPE__FINSTMT_,
         Aggregation (FinStmt)     C8_AGGREGATION__FINSTMT_,
         Plan Type (WFP)     C15_PLAN_TYPE__WFP_,
         Aggregation (WFP)     C4_AGGREGATION__WFP_,
         Formula     C11_FORMULA
    from      TABLE
    /*$$SNPS_START_KEYSNP$CRDWG_TABLESNP$CRTABLE_NAME=Account.csvSNP$CRLOAD_FILE=Y:/1 Metadata/Account//Account.csvSNP$CRFILE_FORMAT=DSNP$CRFILE_SEP_FIELD=2CSNP$CRFILE_SEP_LINE=0D0ASNP$CRFILE_FIRST_ROW=1SNP$CRFILE_ENC_FIELD=SNP$CRFILE_DEC_SEP=SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=AccountSNP$CRTYPE_NAME=STRINGSNP$CRORDER=1SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=ParentSNP$CRTYPE_NAME=STRINGSNP$CRORDER=2SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Alias: DefaultSNP$CRTYPE_NAME=STRINGSNP$CRORDER=3SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Data StorageSNP$CRTYPE_NAME=STRINGSNP$CRORDER=4SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Two Pass CalculationSNP$CRTYPE_NAME=STRINGSNP$CRORDER=5SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Account TypeSNP$CRTYPE_NAME=STRINGSNP$CRORDER=6SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Time BalanceSNP$CRTYPE_NAME=STRINGSNP$CRORDER=7SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Data TypeSNP$CRTYPE_NAME=STRINGSNP$CRORDER=8SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Variance ReportingSNP$CRTYPE_NAME=STRINGSNP$CRORDER=9SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Source Plan TypeSNP$CRTYPE_NAME=STRINGSNP$CRORDER=10SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Plan Type (FinStmt)SNP$CRTYPE_NAME=STRINGSNP$CRORDER=11SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Aggregation (FinStmt)SNP$CRTYPE_NAME=STRINGSNP$CRORDER=12SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Plan Type (WFP)SNP$CRTYPE_NAME=STRINGSNP$CRORDER=13SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=Aggregation (WFP)SNP$CRTYPE_NAME=STRINGSNP$CRORDER=14SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CRSNP$CRDWG_COLSNP$CRCOL_NAME=FormulaSNP$CRTYPE_NAME=STRINGSNP$CRORDER=15SNP$CRLENGTH=150SNP$CRPRECISION=150SNP$CR$$SNPS_END_KEY*/
    insert into "C$_0Account"
         C1_ACCOUNT,
         C2_PARENT,
         C12_ALIAS__DEFAULT,
         C3_DATA_STORAGE,
         C9_TWO_PASS_CALCULATION,
         C6_ACCOUNT_TYPE,
         C14_TIME_BALANCE,
         C5_DATA_TYPE,
         C10_VARIANCE_REPORTING,
         C13_SOURCE_PLAN_TYPE,
         C7_PLAN_TYPE__FINSTMT_,
         C8_AGGREGATION__FINSTMT_,
         C15_PLAN_TYPE__WFP_,
         C4_AGGREGATION__WFP_,
         C11_FORMULA
    values
         :C1_ACCOUNT,
         :C2_PARENT,
         :C12_ALIAS__DEFAULT,
         :C3_DATA_STORAGE,
         :C9_TWO_PASS_CALCULATION,
         :C6_ACCOUNT_TYPE,
         :C14_TIME_BALANCE,
         :C5_DATA_TYPE,
         :C10_VARIANCE_REPORTING,
         :C13_SOURCE_PLAN_TYPE,
         :C7_PLAN_TYPE__FINSTMT_,
         :C8_AGGREGATION__FINSTMT_,
         :C15_PLAN_TYPE__WFP_,
         :C4_AGGREGATION__WFP_,
         :C11_FORMULA
    Thanks in advance!

    Right-clicking "data" on the model tab can you see the data?
    In your code there's written:
    P$CRLOAD_FILE=Y:/1 Metadata/Account//Account.csv
    Is it right the double slash before the file name?

  • Need help in loading data from a csv file to a table in Oracle DB

    Hi,
    I am using sqlplus as a client to connect to the server.
    I am trying to load data from a csv file from a client to a table in Oracle DB server.
    I am trying to use the below command
    " LOAD DATA INFILE 'test.csv' INTO TABLE testTable FIELDS TERMINATED BY ',' (test1,test2,testc3,testc4); "
    But I encountered the following error:
    "SP2-0042: unknown command "load data" - rest of line ignored."
    Thanks in advance.
    SB

    Hey Frostmann,
    That was a nice post....
    I changed my mind to use an 'Insert into' statement from a shell script.
    I created a DB table named test and I tried using the below shell script to insert a row in the table.
    sqlplus test/test@test <<ENDOFSQL
    INSERT INTO test VALUES('test1',123,'test2','test3');
    exit
    ENDOFSQL
    A row is successfully inserted into the table when I run the script manually, but it does not insert rows when the cron job is scheduled.
    Could you please help me with this?
    Thanks in advance.
    SB

  • Writing data from a CSV file in to table

    Hi All,
    I have a CSV (Comma Separated Values) file and I want to write its data into a table.
    How can I get data from a CSV file?

    As Karthick suggested, you can use External Tables or the SQL*Loader functionality.
    You can check this link for an example on SQL*Loader: http://surachartopun.com/2007/10/example-sql-loader-some-data-into.html

  • BAPI to upload data from a flat file to VA01

    Hi guys,
    I have a requirement wherein I need to upload data from a flat file to VA01. Please tell me how I should go about this.
    Thanks and regards,
    Frank.

    Hi
    I posted this code previously as well:
    *   Include           YCL_CREATE_SALES_DOCU                        *
    *     Form  salesdocu
    *     This subroutine is used to create a sales order
    *     -->P_HEADER           Document header data
    *     -->P_HEADERX          Checkbox for header data
    *     -->P_ITEM             Item data
    *     -->P_ITEMX            Item data checkboxes
    *     -->P_LT_SCHEDULES_IN  Schedule line data
    *     -->P_LT_SCHEDULES_INX Checkbox schedule line data
    *     -->P_PARTNER  text    Document partner
    *     <--P_w_vbeln  text    Sales document number
    DATA:
      lfs_return like line of t_return.
    FORM create_sales_document changing P_HEADER  like fs_header
                                       P_HEADERX like fs_headerx
                                       Pt_ITEM   like t_item[]
                                       Pt_ITEMX  like t_itemx[]
                                       P_LT_SCHEDULES_IN  like t_schedules_in[]
                                       P_LT_SCHEDULES_INX like t_schedules_inx[]
                                       Pt_PARTNER  like t_partner[]
                                       P_w_vbeln  like w_vbeln.
    * This perform is used to fill the required data for sales order creation
      perform sales_fill_data changing p_header
                                       p_headerx
                                       pt_item
                                       pt_itemx
                                       p_lt_schedules_in
                                       p_lt_schedules_inx
                                       pt_partner.
    * Function module to create the sales and distribution document
      perform sales_order_creation using p_header
                                         p_headerx
                                         pt_item
                                         pt_itemx
                                         p_lt_schedules_in
                                         p_lt_schedules_inx
                                         pt_partner.
      perform return_check using p_w_vbeln .
    ENDFORM.                                 " salesdocu
    *     Form  commit_work
    *     To execute an external commit
    FORM commit_work .
      CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
        EXPORTING
          WAIT = c_x.
    ENDFORM.                                 " Commit_work
    * Include ycl_sales_order_header         " To fill header data and item data
    INCLUDE ycl_sales_order_header.
    *     Form  return_check
    *     To validate the sales order creation
    FORM return_check using pr_vbeln type vbeln.
    if pr_vbeln is initial.
        LOOP AT t_return into lfs_return .
          WRITE / lfs_return-message.
          clear lfs_return.
        ENDLOOP.                             " Loop at return
      else.
        perform commit_work.                 " External Commit
        Refresh t_return.
        fs_disp-text = text-003.
        fs_disp-number = pr_vbeln.
        append fs_disp to it_disp.
      if p_del eq c_x or p_torder eq c_x or
        p_pgi eq c_x or p_bill eq c_x.
        perform delivery_creation.           " Delivery order creation
        endif.                               " If p_del eq 'X'......
      endif.                                 " If p_w_vbeln is initial
    ENDFORM.                                 " Return_check
    *&      Form  sales_order_creation
    *       text
    *      -->P_P_HEADER  text
    *      -->P_P_HEADERX  text
    *      -->P_PT_ITEM  text
    *      -->P_PT_ITEMX  text
    *      -->P_P_LT_SCHEDULES_IN  text
    *      -->P_P_LT_SCHEDULES_INX  text
    *      -->P_PT_PARTNER  text
    FORM sales_order_creation  USING    P_P_HEADER like fs_header
                                        P_P_HEADERX like fs_headerx
                                        P_PT_ITEM like t_item[]
                                        P_PT_ITEMX like t_itemx[]
                                        P_P_LT_SCHEDULES_IN like t_schedules_in[]
                                        P_P_LT_SCHEDULES_INX like t_schedules_inx[]
                                        P_PT_PARTNER like t_partner[].
        CALL FUNCTION 'BAPI_SALESDOCU_CREATEFROMDATA1'
        EXPORTING
          sales_header_in     = p_p_header
          sales_header_inx    = p_p_headerx
        IMPORTING
          salesdocument_ex    = w_vbeln
        TABLES
          return              = t_return
          sales_items_in      = p_pt_item
          sales_items_inx     = p_pt_itemx
          sales_schedules_in  = p_p_lt_schedules_in
          sales_schedules_inx = p_p_lt_schedules_inx
          sales_partners      = p_pt_partner.
    ENDFORM.                    " sales_order_creation
    Please reward points if this was useful.
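    The include above shows the BAPI call itself but not the flat-file upload. Below is a minimal sketch of reading the flat file into an internal table with GUI_UPLOAD before filling the BAPI structures - the file name, the TYPES definition and its field names are assumptions for illustration, not part of the original include:
    TYPES : BEGIN OF ty_file,
              kunnr      TYPE kunnr,      " sold-to party
              matnr      TYPE matnr,      " material
              kwmeng(17) TYPE c,          " order quantity as text from the file
            END OF ty_file.
    DATA : lt_file TYPE TABLE OF ty_file,
           ls_file TYPE ty_file.
    CALL FUNCTION 'GUI_UPLOAD'
      EXPORTING
        filename            = 'C:\va01_orders.txt'
        filetype            = 'ASC'
        has_field_separator = 'X'         " tab-delimited flat file
      TABLES
        data_tab            = lt_file
      EXCEPTIONS
        OTHERS              = 1.
    LOOP AT lt_file INTO ls_file.
*     Map each file row into fs_header / t_item / t_partner here,
*     then create the order via the perform routines shown above.
    ENDLOOP.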

  • Error while uploading data from a flat file

    Hi All,
    I am trying to load data from a flat file to an ODS. The flat file contains a field, say FLAT01, of numerical type (type Decimal, length 18, decimals 2). I have created an InfoObject, say ZIO10, with type CURR and currency field 0CURRENCY. In the transfer rules, I have assigned a constant for 0CURRENCY as the flat file doesn't have any currency field.
    The problem is that the flat file doesn't have any value in that field, FLAT01, for some records, in fact most of the records. When I try to load it into BW from the application server, it throws the error "Contents from field ZIO10 cannot be converted in type CURR ->longtext".
    I have debugged the transfer rules and found that the empty value is being represented as '#', and when it tries to copy it into the CURR type field, it throws the error.
    Can someone please let me know how to solve this? Why is it taking '#' for an empty value? Can't we change that?
    Any help would be highly appreciated.
    Best Regards,
    James.
    Message was edited by:
            James Helsinki

    Hi SS,
    I am sorry that I was not clear with my explanation. I have created ZIO10 as a keyfigure with type AMOUNT and the currency as 0CURRENCY.
    There is no currency coming in from the flat file so I used a constant in transfer rules to fill 0CURRENCY.
    The field which is coming as '#' is FLAT01 which is assigned to ZIO10.
    Best Regards,
    James.
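    A common way around this is a transfer-rule routine on ZIO10 that turns the '#' placeholder (or an empty value) into zero before the move to the AMOUNT key figure, instead of assigning the file field directly. A minimal sketch of such a routine body - the transfer-structure field name FLAT01 is taken from the thread, everything else is an assumption:
*   Transfer-rule routine for ZIO10 (sketch): treat '#' or an empty
*   FLAT01 value as zero so the CURR conversion no longer fails.
    IF TRAN_STRUCTURE-FLAT01 IS INITIAL
       OR TRAN_STRUCTURE-FLAT01 = '#'.
      RESULT = 0.
    ELSE.
      RESULT = TRAN_STRUCTURE-FLAT01.
    ENDIF.
    RETURNCODE = 0.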

  • Uploading data from a xlsx file into an Oracle table

    Hi All,
    I want to know what would be the best approach and tool to upload the data from an xlsx (excel) sheet into an Oracle table.
    Can I use 'sqlldr'?
    ~Parag

    Parag Kalra wrote:
    What if the excel sheet is exported from some other third-party database and not explicitly generated on Windows? I know that if we can export it to Excel we can also export it to csv, but what if that option is not available either? Also, what do I need to take into account when I export an xlsx file to a csv file? I hope that under no circumstances there would be data loss.
    How about, instead of these "what ifs" and "I hopes", you tell us exactly what you have to work with and describe the real business problem. If you ask the wrong questions, you get the wrong answers.
    You "hope" that there would be no data loss? Do you really think Excel would survive as a product, much less be the overwhelmingly dominant product of its type, if it screwed up something that fundamental? The only thing that will be lost is the formatting meta-data.

  • PROBLEM LOADING DATA FROM A TEXT FILE.

    Hi,
    I'm having a problem loading my csv file into the database. I'm using Oracle Database 10g for Linux. I'm on p. 228 of the book. This is what my csv file looks like:
    db_name     db_version     host_id
    db10     9.2.0.7     1
    db11     10.2.0.1     1
    db12     10.2.0.1     1
    db13     9.2.0.7     1
    db14     10.2.0.1     1
    db15     9.2.0.7     1
    I loaded this data into an existing table called DATABASES, loaded from tab-delimited data. The FILE CHARACTER SET is UNICODE UTF-8. Then I browsed for the name of the csv file to be uploaded. It looked like this:
    File Name      F23757437/db2.csv     Reupload File      
    Separator      
    Optionally Enclosed By      
         First row contains column names.
    File Character Set      
    I clicked NEXT, and this is what it looked like:
    Schema:      HANDSONXE06
    Table Name:      DATABASES
    Define Column Mapping      
    Column Names     %
    Format     
    Upload     yes
    Row 1     "db10" "9.2.0.7" 1
    Row 2     "db11" "10.2.0.1" 1
    Row 3     "db12" "10.2.0.1" 1
    Row 4     "db13" "9.2.0.7" 1
    Row 5     "db14" "10.2.0.1" 1
    Row 6     "db15" "9.2.0.7" 1
    I clicked LOAD and this was the result:
    * There are NOT NULL columns in HANDSONXE06.DATABASES. Select to upload the data without an error.
    Schema
    Down
    Table Name
    Down
    File Details
    Down
    Column Mapping
    Load Data      
    Schema:      HANDSONXE06
    Table Name:      DATABASES
    Define Column Mapping      
    Column Names     COLUMN_NAMES
    Format     FORMAT
    Upload     UPLOAD
    Row 1     "db10" "9.2.0.7" 1
    Row 2     "db11" "10.2.0.1" 1
    Row 3     "db12" "10.2.0.1" 1
    Row 4     "db13" "9.2.0.7" 1
    Row 5     "db14" "10.2.0.1" 1
    Row 6     "db15" "9.2.0.7" 1
    I was really wondering what was wrong. The error message said there are NOT NULL columns in HANDSONXE06.DATABASES, and I don't know how to fix it. What do I need to change to load the data without an error? It has really confused me - why do I get this error? Please help me; I cannot go forward because of this.
    THANKS,
    JOCELYN

    I'm not certain of the utility you are using to load the data; however, I completed the following test using SQL*Loader to insert the data into my table. Your process should work similarly if the trigger and sequence are created for the table you are loading.
    SQL> create table load_tbl
      2  (db_id number(3) not null,
      3   db_name varchar2(100) not null,
      4   db_version varchar2(25),
      5   host_id number(3) not null)
      6  /
    Table created.
    SQL> desc load_tbl
    Name                                      Null?    Type
    DB_ID                                     NOT NULL NUMBER(3)
    DB_NAME                                   NOT NULL VARCHAR2(100)
    DB_VERSION                                         VARCHAR2(25)
    HOST_ID                                   NOT NULL NUMBER(3)
    SQL> create sequence db_id_seq;
    Sequence created.
    SQL> create or replace trigger db_id_trig
      2  before insert on load_tbl
      3  for each row
      4  when (new.db_id is null)
      5  begin
      6    select db_id_seq.nextval into :new.db_id from dual;
      7  end;
      8  /
    Trigger created.
    The contents of the data file, control file and log file are below for the load into load_tbl.
    C:\>sqlldr userid=username/password@db control=db_id_load.ctl log=db_id_load.log
    SQL*Loader: Release 9.2.0.6.0 - Production on Thu Jan 18 17:21:47 2007
    Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
    Commit point reached - logical record count 6
    C:\>
    SQL> select * from load_tbl
      2  /
         DB_ID DB_NAME              DB_VERSION                   HOST_ID
             1 db10                 9.2.0.7                            1
             2 db11                 10.2.0.1                           1
             3 db12                 10.2.0.1                           1
             4 db13                 9.2.0.7                            1
             5 db14                 10.2.0.1                           1
             6 db15                 9.2.0.7                            1
    6 rows selected.
    SQL>
    Data File:
    "db10" "9.2.0.7" 1
    "db11" "10.2.0.1" 1
    "db12" "10.2.0.1" 1
    "db13" "9.2.0.7" 1
    "db14" "10.2.0.1" 1
    "db15" "9.2.0.7" 1
    Control File:
    LOAD DATA
    INFILE "C:\db_id_load.dat"
    APPEND INTO TABLE load_tbl
    FIELDS TERMINATED BY WHITESPACE OPTIONALLY ENCLOSED BY '"'
    TRAILING NULLCOLS
    (db_name CHAR,
    db_version CHAR,
    host_id "TO_NUMBER(:host_id,'99999999999')")
    Log File:
    SQL*Loader: Release 9.2.0.6.0 - Production on Thu Jan 18 17:21:47 2007
    Copyright (c) 1982, 2002, Oracle Corporation.  All rights reserved.
    Control File:   db_id_load.ctl
    Data File:      C:\db_id_load.dat
      Bad File:     db_id_load.bad
      Discard File:  none specified
    (Allow all discards)
    Number to load: ALL
    Number to skip: 0
    Errors allowed: 50
    Bind array:     64 rows, maximum of 256000 bytes
    Continuation:    none specified
    Path used:      Conventional
    Table LOAD_TBL, loaded from every logical record.
    Insert option in effect for this table: APPEND
    TRAILING NULLCOLS option in effect
       Column Name                  Position   Len  Term Encl Datatype
    DB_NAME                             FIRST     *  WHT O(") CHARACTER           
    DB_VERSION                           NEXT     *  WHT O(") CHARACTER           
    HOST_ID                              NEXT     *  WHT O(") CHARACTER           
        SQL string for column : "TO_NUMBER(:host_id,'99999999999')"
    Table LOAD_TBL:
      6 Rows successfully loaded.
      0 Rows not loaded due to data errors.
      0 Rows not loaded because all WHEN clauses were failed.
      0 Rows not loaded because all fields were null.
    Space allocated for bind array:                  49536 bytes(64 rows)
    Read   buffer bytes: 1048576
    Total logical records skipped:          0
    Total logical records read:             6
    Total logical records rejected:         0
    Total logical records discarded:        0
    Run began on Thu Jan 18 17:21:47 2007
    Run ended on Thu Jan 18 17:21:47 2007
    Elapsed time was:     00:00:00.39
    CPU time was:         00:00:00.13

  • Read data from a CSV file

    hi,
    I've managed to make a CSV file which includes logged data of how much my NXT car is turning left (L), turning right (R) or going straight (S). The format is direction/timer value (L/1922 means it went left for 1922 ms).
    Where I got stuck is: how can I read those time and direction values from the csv file in such a way that they can be used as the motor move input?
    Actually, what I'm trying to do is several VIs which function like the record/play action in the NXT Toolkit.
    (I've attached the VI and the csv file; the file extension is CSV but it is not written in csv format, it's basically a plain txt file.)
    Message Edited by szemawy on 05-23-2008 10:34 AM
    Attachments:
    read_deneme.vi ‏10 KB
    testus.csv ‏1 KB

    After I get the data in 1-D array format I cannot convert it to a string so that I can send it via Write BT Message. I tried Flatten To String but it didn't work out.
    [Never mind the mess, just see the part where the 1-D string array is being converted to a string.]
    Attachments:
    read_test2.JPG ‏184 KB
    read_test2_fp.JPG ‏132 KB

  • Uploading data from an excel file to ztable

    Hi,
    I tried to upload an Excel file to a Z table using the FM 'ALSM_EXCEL_TO_INTERNAL_TABLE'. In the output I can see only 3 rows, and the date and time fields are not displayed properly. What could be the reason? Can anyone give me a suggestion?
    Regards,
    Hema

    Hi,
    Please check my code.
    *                      Data Definitions                              *
    * Internal tables
    DATA : t_zrb_hdr TYPE STANDARD TABLE OF ZRB_HDR WITH HEADER LINE,
           t_field_catalog  TYPE slis_t_fieldcat_alv.
    * Declaration of the internal table with header line comprising the uploaded data
    DATA: BEGIN OF t_file_upload OCCURS 0.
    INCLUDE STRUCTURE ALSMEX_TABLINE. " Rows for Table with Excel Data
    DATA: END OF t_file_upload.
    form UPLOAD_EXCEL_FILE .
    CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
      EXPORTING
        filename                      = p_file
        i_begin_col                   = '1'
        i_begin_row                   = '2'
        i_end_col                     = '10'
        i_end_row                     = '250'
      tables
        intern                        = t_file_upload
      EXCEPTIONS
        INCONSISTENT_PARAMETERS       = 1
        UPLOAD_OLE                    = 2
        OTHERS                        = 3.
    IF sy-subrc <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    ENDFORM.                    " UPLOAD_EXCEL_FILE
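    Regarding the date and time columns: ALSM_EXCEL_TO_INTERNAL_TABLE returns every cell as a character value in the VALUE field of ALSMEX_TABLINE, so dates arrive in whatever display format the spreadsheet uses and have to be converted explicitly before being moved into a DATS field. A minimal sketch, assuming the cells come through as DD.MM.YYYY (adjust the SPLIT to whatever format your file really uses):
    DATA : lv_value    TYPE alsmex_tabline-value,
           lv_dd(2)    TYPE c,
           lv_mm(2)    TYPE c,
           lv_yyyy(4)  TYPE c,
           lv_char8(8) TYPE c,
           lv_date     TYPE sy-datum.
*   Convert an external date such as '31.12.2007' to internal YYYYMMDD
    lv_value = t_file_upload-value.
    SPLIT lv_value AT '.' INTO lv_dd lv_mm lv_yyyy.
    CONCATENATE lv_yyyy lv_mm lv_dd INTO lv_char8.
    lv_date = lv_char8.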
