834 Benefit Extracts - Error in Extract Definition Upload from Data File

Hi all,
I am trying to upload the Extract Definition data file for the 834 Benefit Extract in order to create the 834 Benefit Extract file layout in the system.
The upload fails with the errors mentioned below, and they give no clue about how to get through the upload process.
I am providing the error information that I got from the log file.
Upload from stage tables
Error loading seed data for EXT_WHERE_CLAUSE8: FILE_NAME = ALB_ANSI-834 Full Profile, RECORD_NAME = ALB SE - Transaction Set Trailer, DATA_ELEMT_NAME = ALB SE01_Total, PARENT_RECORD_NAME = ALB SE - Transaction Set Trailer, PARENT_ELEMENT_NAME = , DATA_ELMT_NAME = ALB SE01_Total, COND_DATA_ELMT_NAME = ALB SE01_REF, ORA-01403: no data found
Start of log messages from FND_FILE
WARNING : Element ALB INS04 Decode Value 14 not uploaded
WARNING : Element ALB INS04 Decode Value 03 not uploaded
WARNING : Element ALB INS04 Decode Value 01 not uploaded
WARNING : Element ALB INS04 Decode Value 07 not uploaded
WARNING : Element ALB INS04 Decode Value 04 not uploaded
WARNING : Element ALB INS08 Decode Value FT not uploaded
WARNING : Element ALB INS08 Decode Value TE not uploaded
WARNING : Element ALB INS08 Decode Value RT not uploaded
WARNING : Element ALB DMG05 Decode Value O not uploaded
WARNING : Element ALB DMG05 Decode Value J not uploaded
WARNING : Element ALB DMG05 Decode Value H not uploaded
WARNING : Element ALB DMG05 Decode Value I not uploaded
WARNING : Element ALB DMG05 Decode Value B not uploaded
WARNING : Element ALB DMG05 Decode Value A not uploaded
WARNING : Element ALB HD03 Decode Value PPO not uploaded
WARNING : Element ALB HD03 Decode Value PPO not uploaded
WARNING : Element ALB HD03 Decode Value PPO not uploaded
WARNING : Element ALB HD05 Decode Value SPO not uploaded
WARNING : Element ALB HD05 Decode Value ESP not uploaded
WARNING : Element ALB HD05 Decode Value EMP not uploaded
WARNING : Element ALB HD05 Decode Value E1D not uploaded
WARNING : Element ALB HD05 Decode Value FAM not uploaded
WARNING : Element ALB HD01_TERM Decode Value 024 not uploaded
WARNING : Element ALB HD01_TERM Decode Value 024 not uploaded
End of log messages from FND_FILE
thanks & regards,
V.Leelaprasath.
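For anyone hitting the same ORA-01403 during the seed data load: the error text suggests the loader could not resolve the conditional data element 'ALB SE01_REF' referenced by the where clause of record 'ALB SE - Transaction Set Trailer'. Below is a minimal diagnostic sketch; the table names BEN_EXT_DATA_ELMT and BEN_EXT_RCD and the NAME column are assumptions about the Advanced Benefits data model and should be verified before use.

-- Hedged diagnostic sketch: check that the elements named in the error exist in the target instance.
-- Table and column names are assumptions, not verified against the upload code.
SELECT name
  FROM ben_ext_data_elmt
 WHERE name IN ('ALB SE01_Total', 'ALB SE01_REF');

SELECT name
  FROM ben_ext_rcd
 WHERE name = 'ALB SE - Transaction Set Trailer';

-- If the conditional element 'ALB SE01_REF' returns no rows, the where-clause row cannot be
-- linked to it and the loader raises ORA-01403 (no data found).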

Hi
Call PERFORM open_group only after the ENDDO, once the whole file has been read into the internal table:
open dataset v_data for input in text mode encoding default.
do.
  read dataset v_data into itab.
  if sy-subrc <> 0.
    exit.               "end of file reached
  endif.
  append itab.
  clear itab.
enddo.
close dataset v_data.
perform open_group.
loop at itab.
  perform bdc_dynpro using 'SAPLMGMM' '0060'.
  "... bdc_field / further bdc_dynpro calls and bdc_transaction go here
endloop.
Reward points if useful
Regards
Anji

Similar Messages

  • Error in running Extract Definition Upload from Data File concurrent.

    Hi all,
    I am trying to upload the 834 Extract Layout from the data file by running the concurrent program Extract Definition Upload from Data File.
    After running this concurrent program I do get the Extract Layout definition, with the record layouts and the data elements within it.
    But some record layouts come across with changes at their repeating level.
    Please suggest how I can keep the same repeating levels for the record layouts when I move the 834 Benefit Extract layout definition
    from one instance to another.

    Hi,
    We have exactly the same error in iBolt. The error happens intermittently and we cannot find the reason.
    iBolt was upgraded to version 3.1 SP1 and a fix was developed by support. After installation we worked hard for two days testing all the customer's flows without hitting the error.
    iBolt runs as a service, and when this error occurs the service stops and must be restarted.
    In our flows we delete the observer.dll file just before the rest of the flow runs, but that doesn't solve the problem.
    Deleting the directory can be a temporary workaround, but the error comes back another day.
    Each time the SBO client or DTW is started, the %temp%\SMS_OBJ_DLL directory is created...
    If you just delete the observer.dll file and leave the rest of the directory as it is, the file is re-created when you start and log on to SBO. I would say this file is a copy of another file, observer_800178.dll (800178 depends on the version of SBO you have; 800178 corresponds to SBO 2007 A PL 42).
    Do you have any more information since your post was sent?
    Thanks in advance for your help.
    Best regards.

  • Internal Server Error  when trying to upload and manage files

    The instructor was experiencing this difficulty on Tuesday 1/13, after the notice was closed. I just
    went into their site again (1/14/09, 2:48 pm), launched iTunes U, and received the same error after clicking 'Upload And Manage Files' on their iTunes U page.
    I was able to reproduce the same error on our iTunes U site.
    https://deimos.apple.com/WebObjects/Core.woa/EditFiles/indiana.edu

    I have only found two that work correctly. They were created the same way and permissions are the same. I don't know why one works over the other.
    /Indiana University/Project Sites/Oncourse Team - FAIL
    /Indiana University/Project Sites/SyllabusTest - Success
    /Indiana University/Project Sites/CHEN-TES 101 01 - FAIL
    /Indiana University/Project Sites/UITS Podcast Team - FAIL
    /Indiana University/Project Sites/ePortfolio Playgroun - FAIL
    /Indiana University/Project Sites/Testong - FAIL
    /Indiana University/SP08 Spring Semester/SP08 IN UITS PRAC 14431 - FAIL
    /Indiana University/SP08 Spring Semester/SP08 IN UITS PRAC 14431 - FAIL
    /Indiana University/SP08 Spring Semester/SP08 OC DEV C201 DEV2 - FAIL
    /Indiana University/SP08 Spring Semester/SP08 IN UITS PRAC 14438 - FAIL
    /Indiana University/SP08 Spring Semester/SP08 IN UITS PRAC 14552 - success
    /Indiana University/SP08 Spring Semester/SP08 IN UITS PRAC 14392 - FAIL

  • E-commerce Gateway - Extract POs automatically on to the ECG data file

    We use EBS 12.0.6, ECG and EDI.
    Currently, we transmit POs to suppliers via EDI by MANUALLY running the "OUT: Purchase Order (850/ORDERS)" ECG program.
    Is it possible to AUTOMATICALLY extract the PO information into the data file without running the above program by hand? Is there any setup to do this? Please advise. Thanks.

    I guess no setup is available for that. If you don't want to launch it manually, try launching the concurrent program from the approval workflow, or via a trigger (a rough sketch follows after this reply).
    Regards,
    Praveen
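    For illustration, a minimal sketch of Praveen's suggestion, assuming the extract is submitted with FND_REQUEST.SUBMIT_REQUEST from a procedure called at PO approval. The application short name 'EC', the concurrent program short name 'ECEPOO' and the argument list are placeholders; take the real values from the concurrent program definition of "OUT: Purchase Order (850/ORDERS)".

    -- Hedged sketch: submit the ECG outbound PO extract programmatically instead of from the SRS form.
    DECLARE
       l_request_id NUMBER;
    BEGIN
       -- initialize the applications context with a valid user/responsibility
       fnd_global.apps_initialize(user_id => 1234, resp_id => 50583, resp_appl_id => 201);
       l_request_id := fnd_request.submit_request(
                          application => 'EC',      -- e-Commerce Gateway (assumed short name)
                          program     => 'ECEPOO',  -- OUT: Purchase Order (850) (assumed short name)
                          description => NULL,
                          start_time  => NULL,
                          sub_request => FALSE,
                          argument1   => 'YOUR_TP_CODE');  -- placeholder parameter
       COMMIT;  -- the request row must be committed before a concurrent manager picks it up
       IF l_request_id = 0 THEN
          dbms_output.put_line('Submission failed: ' || fnd_message.get);
       ELSE
          dbms_output.put_line('Submitted request ' || l_request_id);
       END IF;
    END;
    /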

  • Extracted from data files

    Hello All,
    DB version: 9i Rel 2
    Which Oracle command line tools allow data to be extracted from data files that are NOT ATTACHED to a database?
    DN

    Oracle Support has a tool called DUL (Data UnLoader) to extract data from data files. They charge a hefty rate for the service as well. There are other groups that have such a tool, and they also charge for the service.

  • Unable to upload a .dat file from pl/sql

    Hi,
    I have written a stub in PL/SQL to upload a data file. The DB version I am using is 11g (11.1.0.7). The data file consists of 94,000 rows.
    SYS is able to upload only 8,000 rows at a time. I tried to upload it from the DB server with serveroutput set to off. I generated a log file for this and I get "ORA-06502: PL/SQL: numeric or value error: character string buffer too small". Kindly help with this issue. Are there any parameters in the DB that restrict the data upload into a table based on rows/blocks at a time?
    My code is as follows:
    DECLARE
         f           UTL_FILE.file_type;
         l           VARCHAR2(30000);
         l_rec          tmp_bic%ROWTYPE;
         split          DBMS_SQL.number_table;
         val          VARCHAR2(4500);
         s          NUMBER;
         e          NUMBER;
    BEGIN
         -- Pls look at some sample lines in the file (FI.Dat)
         -- The following lines are to split the line to fields. If the position is changing,
         -- you may need to change the below accordingly.
         split( split.COUNT + 1 ) := 4;
         split( split.COUNT + 1 ) := 15;
         split( split.COUNT + 1 ) := 120;
         split( split.COUNT + 1 ) := 190;
         split( split.COUNT + 1 ) := 225;
         split( split.COUNT + 1 ) := 289;
         split( split.COUNT + 1 ) := 324;
         split( split.COUNT + 1 ) := 359;
         split( split.COUNT + 1 ) := 394;
         split( split.COUNT + 1 ) := 464;
         split( split.COUNT + 1 ) := 569;
         split( split.COUNT + 1 ) := 639;
         split( split.COUNT + 1 ) := 674;
         split( split.COUNT + 1 ) := 779;
         delete tmp_bic;
         commit;
         -- If the file name or Path are different, the following line needs to be edited accordingly
         f := utl_file.fopen('/tmp','FI.dat','r',32767);
         BEGIN
              LOOP
                   utl_file.get_line(f,l);
                   l_rec := NULL;               
                   FOR i IN split.FIRST .. split.LAST
                   LOOP
                        IF i = split.FIRST
                        THEN
                             s := 1;
                        ELSE
                             s := split(i-1);
                        END IF;
                        IF i = split.LAST
                        THEN
                             e := 500;
                        ELSE
                             e := split(i)-1;
                        END IF;
                        val := SUBSTR(l,s,(e-s+1));
                        val := LTRIM(RTRIM(val));
                        --dbms_output.put_line(i||' - '||s||' - '||e);                    
                        IF i = 1     THEN     l_rec.col1 := val;                    
                        ELSIF i=2     THEN     l_rec.col2 := val;
                        ELSIF i=3     THEN     l_rec.col3 := val;
                        ELSIF i=4     THEN     l_rec.col4 := val;
                        ELSIF i=5     THEN     l_rec.col5 := val;
                        ELSIF i=6     THEN     l_rec.col6 := val;
                        ELSIF i=7     THEN     l_rec.col7 := val;
                        ELSIF i=8     THEN     l_rec.col8 := val;
                        ELSIF i=9     THEN     l_rec.col9 := val;
                        ELSIF i=10     THEN     l_rec.col10 := val;
                        ELSIF i=11     THEN     l_rec.col11 := val;
                        ELSIF i=12     THEN     l_rec.col12 := val;
                        ELSIF i=13     THEN     l_rec.col13 := val;
                        ELSIF i=14     THEN     l_rec.col14 := val;
                        ELSIF i=15     THEN     l_rec.col15 := val;
                        END IF;
                   END LOOP;
                   INSERT INTO tmp_bic VALUES l_rec;
                   commit;
                   --exit;
              END LOOP;
         EXCEPTION
              WHEN NO_DATA_FOUND
              THEN
                   NULL;
         END;
         utl_file.fclose(f);
    EXCEPTION
         WHEN OTHERS
         THEN
              DBMS_OUTPUT.put_line('Error '||SQLERRM);          
              IF utl_file.is_open(f)
              THEN
                   utl_file.fclose(f);
              END IF;
    END;
    insert into glo_bic_bkp select a.*,sysdate from glo_bic a;
    commit;
     BEGIN
          for i IN (
               SELECT     rowid rid,
                    bic_code
               FROM      glo_bic )
          loop
               for j IN (
                    select substr(col3,1,35) name,
                    substr(col4,1,35) addr1,
                    substr(col5,1,35) addr2,
                    substr(col8||' '||col9,1,35) addr3
                    from tmp_bic
                    where col2 = i.bic_code )
               loop
                   update     glo_bic
                   set     name = j.name,
                        address1 = j.addr1,
                        address2 = j.addr2,
                        address3 = j.addr3,
                         mod_no = mod_no+1
                   WHERE     rowid = i.rid;
                   commit;
              end loop;
         end loop;
    END;
     insert into glo_bic (
          BIC_CODE,
          NAME,
          CUSTOMER_NO,
          SK_ARRANGEMENT,
          MOD_NO,
          ADDRESS1,
          ADDRESS2,
          ADDRESS3,
          UPLOAD_FLAG,
          UPLOAD_UPDATE )
    select     col2,
         substr(col3,1,35),
         null,
         'N',
         1,          
         substr(col4,1,35),
         substr(col5,1,35),substr(col8||' '||col9,1,35),
         'N',
         null     
    from     tmp_bic
     where      not exists (select 1 from glo_bic where bic_code = col2);
     commit;
    Edited by: tyro_vins on Sep 24, 2009 10:34 PM

    The error sounds like one of the fields is too big for its column. Maybe you can look at the data in a text editor. A quick way to compare the parsed field widths against the column sizes is sketched below.
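    A minimal diagnostic sketch, assuming the staging table is TMP_BIC in the current schema: list each column's declared size and compare it with the slice width (e - s + 1) computed from the split(...) positions in the code above. Any slice wider than its target column will raise ORA-06502 when it is assigned into the %ROWTYPE record.

    -- Hedged diagnostic: print the declared length of each TMP_BIC column.
    SET SERVEROUTPUT ON
    BEGIN
      FOR c IN (SELECT column_name, data_type, data_length
                  FROM user_tab_columns
                 WHERE table_name = 'TMP_BIC'
                 ORDER BY column_id)
      LOOP
        DBMS_OUTPUT.put_line(RPAD(c.column_name, 30) || c.data_type || '(' || c.data_length || ')');
      END LOOP;
    END;
    /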

  • When I try to install Itunes, I get this error: (translated from Norwegian) There was a network error while trying to read from the file: C: \ windows \ installer \ iTunes.msi I have tried many times to reinstall, but nothing helps, please help me.

    When I try to install Itunes, I get this error: (translated from Norwegian) There was a network error while trying to read from the file: C: \ windows \ installer \ iTunes.msi I have tried many times to reinstall, but nothing helps, please help me.

    (1) Download the Windows Installer CleanUp utility installer file (msicuu2.exe) from the following Major Geeks page (use one of the links under the "DOWNLOAD LOCATIONS" thingy on the Major Geeks page):
    http://majorgeeks.com/download.php?det=4459
    (2) Doubleclick the msicuu2.exe file and follow the prompts to install the Windows Installer CleanUp utility. (If you're on a Windows Vista or Windows 7 system and you get a Code 800A0046 error message when doubleclicking the msicuu2.exe file, try instead right-clicking on the msicuu2.exe file and selecting "Run as administrator".)
    (3) In your Start menu click All Programs and then click Windows Install Clean Up. The Windows Installer CleanUp utility window appears, listing software that is currently installed on your computer.
    (4) In the list of programs that appears in CleanUp, select any iTunes entries and click "Remove", as per the following screenshot:
    (5) Quit out of CleanUp, restart the PC and try another iTunes install. Does it go through properly this time?

  • Getting error "Cannot create a BACPAC from a file that does not contain exported data." from SqlManagementClient.Dac.ImportAsync

    We're trying to import a dacpac to Azure via the new SqlManagementClient DacOperations ImportAsync API, and I get an exception with the error: "Cannot create a BACPAC from a file that does not contain exported data."
    This same dacpac imports fine using an alternate but less friendly API from SQL Server's tooling. We'd like to use the new management SDK instead for various reasons.

    Hi Kyle A Wilt,
    I am trying to involve someone more familiar with this topic for a further look at this issue. Some delay might be expected while the job is transferred. Your patience is greatly appreciated.
    Thank you for your understanding and support.
    Regards,
    Sofiya Li
    TechNet Community Support

  • Error during BAPI while uploading Material data through MM01

    Hi all,
    I am facing a problem when uploading material data through BAPI in MM01. I am attaching the code and the error below.
    In the debugger, however, all the values are being stored correctly.
    REPORT  ZFINISHED_MAT.
    Data: Begin of legacy_data occurs 0,
         MATNR LIKE MARA-MATNR,
         MBRSH LIKE MARA-MBRSH,            "Industry Sector
         MTART LIKE MARA-MTART,            "Matl Type
         WERKS LIKE MARD-WERKS,            "Plant
         LGORT LIKE MARD-LGORT,            "Storage location
         VKORG LIKE MVKE-VKORG,
         VTWEG LIKE MVKE-VTWEG,
         MAKTX LIKE MAKT-MAKTX,             "Matl Desc.
         MEINS LIKE MARA-MEINS,             "Base UOM
         MATKL LIKE MARA-MATKL,             "Matl.Grp
    *     BISMT LIKE MARA-BISMT,
         SPART LIKE MARA-SPART,             "Division
    *     BRGEW LIKE MARA-BRGEW,             "Gross weight
         GROES LIKE MARA-GROES,
         FERTH LIKE MARA-FERTH,
         ZEINR LIKE MARA-ZEINR,
         TAXKM1 LIKE MLAN-TAXM1,
         TAXKM2 LIKE MLAN-TAXM2,
         TAXKM3 LIKE MLAN-TAXM3,
         TAXKM4 LIKE MLAN-TAXM4,
         KTGRM LIKE MVKE-KTGRM,
    *     GEWEI LIKE MARA-GEWEI,             "Weight unit
    *     NTGEW LIKE MARA-NTGEW,             "Net weight
    *     KLART LIKE RMCLF-KLART,
         MTVFP LIKE MARC-MTVFP,             "Availibility Check
    *     XGCHP LIKE MARA-XGCHP,
         XCHPF LIKE MARA-XCHPF,             "Batch Management
         TRAGR LIKE MARA-TRAGR,
         LADGR TYPE MARC-LADGR,
         VPRSV LIKE MBEW-VPRSV,            "Price Control
         VERPR LIKE MBEW-VERPR,
    *     SPRAS LIKE MAKT-SPRAS,
      END OF LEGACY_DATA.
    DATA: BEGIN OF IT_MAKT OCCURS 0.
    INCLUDE STRUCTURE BAPI_MAKT.
    DATA: END OF IT_MAKT.
    *--- BAPI structures
    DATA: BAPI_HEAD LIKE BAPIMATHEAD, " Header Segment with Control Information
    BAPI_MAKT LIKE BAPI_MAKT, " Material Description
    BAPI_MARA1 LIKE BAPI_MARA, " Client Data
    BAPI_MARAX LIKE BAPI_MARAX, " Checkbox Structure for BAPI_MARA
    BAPI_MARD1 LIKE BAPI_MARD,
    BAPI_MARDX1 LIKE BAPI_MARDX, " Checkbox Structure for BAPI_MARD
    BAPI_MARC1 LIKE BAPI_MARC, " Plant View
    BAPI_MARCX LIKE BAPI_MARCX, " Checkbox Structure for BAPI_MARC
    BAPI_MVKE1 LIKE BAPI_MVKE,
    BAPI_MVKEX1 LIKE BAPI_MVKEX, " Checkbox Structure for BAPI_MVKE
    BAPI_MLAN1 LIKE BAPI_MLAN,
    BAPI_MLANX1 LIKE bapi_mlan1, " Checkbox Structure for BAPI_MLAN
    BAPI_MBEW1 LIKE BAPI_MBEW, " Accounting View
    BAPI_MBEWX LIKE BAPI_MBEWX, " Checkbox Structure for BAPI_MBEW
    BAPI_RETURN LIKE BAPIRET2. " Return Parameter
    *              $PARAMETERS DECLARATION$
    SELECTION-SCREEN BEGIN OF BLOCK B11
                              WITH FRAME TITLE TEXT-001.
    PARAMETERS: P_FILE LIKE RLGRAP-FILENAME. " DEFAULT 'C:\TEST1.XLS'.
    SELECTION-SCREEN SKIP.
    SELECTION-SCREEN ULINE.
    SELECTION-SCREEN END OF BLOCK B11 .
    *              $DATA DECLARATION$
    *DATA : BDC_DATA LIKE STANDARD TABLE OF BDCDATA WITH HEADER LINE.
    DATA : IT_EXCEL TYPE STANDARD TABLE OF  ALSMEX_TABLINE INITIAL SIZE 0 WITH HEADER LINE,
            IT_EXCEL_DUMMY TYPE ALSMEX_TABLINE.
    DATA : MESSTAB LIKE BDCMSGCOLL OCCURS 0 WITH HEADER LINE.
    DATA : L_MSTRING(480).
    DATA :L_SUBRC LIKE SY-SUBRC.
    DATA: V_FILE TYPE STRING.
    *              $AT-SELECTON SCREEN DECLARATION$
    AT SELECTION-SCREEN ON P_FILE.
      IF P_FILE IS INITIAL.
        MESSAGE E398(00) WITH 'FILE NAME NEEDS TO BE SPECIFIED'.
      ENDIF.
    AT SELECTION-SCREEN ON VALUE-REQUEST FOR P_FILE.
      CALL FUNCTION 'F4_FILENAME'
       EXPORTING
         PROGRAM_NAME        = SYST-CPROG
    *   DYNPRO_NUMBER       = SYST-DYNNR
         FIELD_NAME          = 'P_FILE'
       IMPORTING
          FILE_NAME           = P_FILE.
    start-of-selection.
    perform data_fetch_to_xls.
    perform insertion.
    *&      Form  data_fetch_to_xls
    *       text
    *  -->  p1        text
    *  <--  p2        text
    FORM data_fetch_to_xls .
    CALL FUNCTION 'ALSM_EXCEL_TO_INTERNAL_TABLE'
        EXPORTING
          FILENAME                = P_FILE
          I_BEGIN_COL             = 1
          I_BEGIN_ROW             = 2
          I_END_COL               = 25
          I_END_ROW               = 2
        TABLES
          INTERN                  = IT_EXCEL
        EXCEPTIONS
          INCONSISTENT_PARAMETERS = 1
          UPLOAD_OLE              = 2
          OTHERS                  = 3.
      IF SY-SUBRC <> 0.
    * MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
    *         WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
        WRITE: /'ERROR UPLOADING XLS FILE FROM PRESENTATION SERVER !' ,
               /'RETURN CODE : ', SY-SUBRC.
      ELSE.
    *************NOW FILL DATA FROM EXCEL INTO FINAL LEGACY DATA ITAB----LEGACY_DATA***************
        IF NOT IT_EXCEL[] IS INITIAL.
          CLEAR LEGACY_DATA.
          REFRESH LEGACY_DATA[].
          LOOP AT IT_EXCEL.
            IT_EXCEL_DUMMY = IT_EXCEL.
            AT NEW COL.
              CASE IT_EXCEL_DUMMY-COL.
                WHEN 1.
                  LEGACY_DATA-MATNR = IT_EXCEL_DUMMY-VALUE(18).
                WHEN 2.
                  LEGACY_DATA-MBRSH = IT_EXCEL_DUMMY-VALUE(1).
                WHEN 3.
                  LEGACY_DATA-MTART = IT_EXCEL_DUMMY-VALUE(4).
                WHEN 4.
                  LEGACY_DATA-WERKS = IT_EXCEL_DUMMY-VALUE(4).
                WHEN 5.
                  LEGACY_DATA-LGORT = IT_EXCEL_DUMMY-VALUE(4).
                WHEN 6.
                  LEGACY_DATA-VKORG = IT_EXCEL_DUMMY-VALUE(4).
                 WHEN 7.
                  LEGACY_DATA-VTWEG = IT_EXCEL_DUMMY-VALUE(2).
                WHEN 8.
                  LEGACY_DATA-MAKTX = IT_EXCEL_DUMMY-VALUE(40).
                WHEN 9.
                  LEGACY_DATA-MEINS = IT_EXCEL_DUMMY-VALUE(3).
                WHEN 10.
                  LEGACY_DATA-MATKL = IT_EXCEL_DUMMY-VALUE(9).
                WHEN 11.
                  LEGACY_DATA-SPART = IT_EXCEL_DUMMY-VALUE(2).
                WHEN 12.
                  LEGACY_DATA-GROES = IT_EXCEL_DUMMY-VALUE(32).
                WHEN 13.
                  LEGACY_DATA-FERTH = IT_EXCEL_DUMMY-VALUE(18).
                WHEN 14.
                  LEGACY_DATA-ZEINR = IT_EXCEL_DUMMY-VALUE(22).
                 WHEN 15.
                  LEGACY_DATA-TAXKM1 = IT_EXCEL_DUMMY-VALUE(1).
                 WHEN 16.
                  LEGACY_DATA-TAXKM2 = IT_EXCEL_DUMMY-VALUE(1).
                 WHEN 17.
                  LEGACY_DATA-TAXKM3 = IT_EXCEL_DUMMY-VALUE(1).
                 WHEN 18.
                  LEGACY_DATA-TAXKM4 = IT_EXCEL_DUMMY-VALUE(1).
                 WHEN 19.
                  LEGACY_DATA-KTGRM = IT_EXCEL_DUMMY-VALUE(2).
                WHEN 20.
                  LEGACY_DATA-MTVFP = IT_EXCEL_DUMMY-VALUE(2).
                 WHEN 21.
                  LEGACY_DATA-XCHPF = IT_EXCEL_DUMMY-VALUE(1).
                WHEN 22.
                  LEGACY_DATA-TRAGR = IT_EXCEL_DUMMY-VALUE(4).
                WHEN 23.
                  LEGACY_DATA-LADGR = IT_EXCEL_DUMMY-VALUE(4).
                WHEN 24.
                  LEGACY_DATA-VPRSV = IT_EXCEL_DUMMY-VALUE(1).
                WHEN 25.
                  LEGACY_DATA-VERPR = IT_EXCEL_DUMMY-VALUE(14).
                  APPEND LEGACY_DATA.
                  CLEAR LEGACY_DATA.
              ENDCASE.
            ENDAT.
            AT END OF ROW.
            ENDAT.
          ENDLOOP.
        ENDIF.
      ENDIF.
    ENDFORM.                    " data_fetch_to_xls
    *&      Form  insertion
    *       text
    *  -->  p1        text
    *  <--  p2        text
    FORM insertion .
    LOOP AT legacy_data.
    * Header
    BAPI_HEAD-MATERIAL = legacy_data-MATNR.
    BAPI_HEAD-IND_SECTOR = legacy_data-MBRSH.
    BAPI_HEAD-MATL_TYPE = legacy_data-MTART.
    BAPI_HEAD-BASIC_VIEW = 'X'.
    BAPI_HEAD-SALES_VIEW = 'X'.
    BAPI_HEAD-STORAGE_VIEW = 'X'.
    *BAPI_HEAD-PURCHASE_VIEW = 'X'.
    BAPI_HEAD-ACCOUNT_VIEW = 'X'.
    * Material Description
    REFRESH IT_MAKT.
    *IT_MAKT-LANGU = legacy_data-SPRAS.
    IT_MAKT-MATL_DESC = legacy_data-MAKTX.
    APPEND IT_MAKT.
    BAPI_MARD1-PLANT = legacy_data-WERKS.
    BAPI_MARD1-STGE_LOC = legacy_data-LGORT.
    BAPI_MARDX1-PLANT = legacy_data-WERKS.
    BAPI_MARDX1-STGE_LOC = legacy_data-LGORT.
    ** Client Data - Basic
    BAPI_MARA1-MATL_GROUP = legacy_data-MATKL.
    *bapi_mara1-OLD_MAT_NO = legacy_data-bismt.
    BAPI_MARA1-BASE_UOM = legacy_data-MEINS.
    BAPI_MARA1-PROD_MEMO = LEGACY_DATA-FERTH.
    BAPI_MARA1-SIZE_DIM = LEGACY_DATA-GROES.
    BAPI_MARA1-DOCUMENT = LEGACY_DATA-ZEINR.
    BAPI_MARA1-BATCH_MGMT = LEGACY_DATA-XCHPF.
    *BAPI_MARA1-UNIT_OF_WT = legacy_data-GEWEI.
    BAPI_MARA1-TRANS_GRP = legacy_data-TRAGR.
    BAPI_MARA1-DIVISION = legacy_data-SPART.
    BAPI_MARAX-MATL_GROUP = 'X'.
    *BAPI_MARAX-OLD_MAT_NO = 'X'.
    BAPI_MARAX-BASE_UOM = 'X'.
    BAPI_MARAX-PROD_MEMO = 'X'.
    BAPI_MARAX-SIZE_DIM = 'X'.
    BAPI_MARAX-DOCUMENT = 'X'.
    BAPI_MARAX-BATCH_MGMT = 'X'.
    *BAPI_MARAX-UNIT_OF_WT = 'X'.
    BAPI_MARAX-TRANS_GRP = 'X'.
    BAPI_MARAX-DIVISION = 'X'.
    *SALES
    BAPI_MVKE1-SALES_ORG = legacy_data-VKORG.
    BAPI_MVKE1-DISTR_CHAN = legacy_data-VTWEG.
    *BAPI_MVKE1-DELYG_PLNT = legacy_data-DWERK.
    BAPI_MVKE1-ACCT_ASSGT = legacy_data-KTGRM.
    BAPI_MVKEX1-SALES_ORG = legacy_data-VKORG.
    BAPI_MVKEX1-DISTR_CHAN = legacy_data-VTWEG.
    *BAPI_MVKEX1-DELYG_PLNT = 'X'.
    BAPI_MVKEX1-ACCT_ASSGT = 'X'.
    ** Plant - Purchasing
    BAPI_MARC1-PLANT = legacy_data-WERKS.
    BAPI_MARC1-LOADINGGRP = legacy_data-LADGR.
    BAPI_MARC1-AVAILCHECK = legacy_data-MTVFP.
    *BAPI_MARC1-MRP_GROUP =  legacy_data-disgr.
    BAPI_MARCX-PLANT = legacy_data-WERKS.
    BAPI_MARCX-LOADINGGRP = 'X'.
    BAPI_MARCX-AVAILCHECK = 'X'.
    *BAPI_MARCX-MRP_GROUP =  'X'.
    * Accounting
    BAPI_MBEW1-VAL_AREA = legacy_data-WERKS.
    BAPI_MBEW1-PRICE_CTRL = legacy_data-VPRSV.
    BAPI_MBEW1-STD_PRICE =  legacy_data-VERPR.
    *BAPI_MBEW1-VAL_CLASS =  legacy_data-BKLAS.
    *BAPI_MBEW1-STD_PRICE = legacy_data-STPRS.
    *BAPI_MBEW1-PRICE_UNIT = legacy_data-PEINH.
    BAPI_MBEWX-VAL_AREA = legacy_data-WERKS.
    BAPI_MBEWX-PRICE_CTRL = 'X'.
    BAPI_MBEWX-STD_PRICE =  'X'.
    *BAPI_MBEWX-VAL_CLASS =  'X'.
    * TAX JURISDICTION CODE
    BAPI_MLAN1-TAXCLASS_1 = LEGACY_DATA-TAXKM1.
    BAPI_MLAN1-TAXCLASS_2 = LEGACY_DATA-TAXKM2.
    BAPI_MLAN1-TAXCLASS_3 = LEGACY_DATA-TAXKM3.
    BAPI_MLAN1-TAXCLASS_4 = LEGACY_DATA-TAXKM4.
     * set the checkbox structure, not the data structure filled just above
     BAPI_MLANX1-TAXCLASS_1 = 'X'.
     BAPI_MLANX1-TAXCLASS_2 = 'X'.
     BAPI_MLANX1-TAXCLASS_3 = 'X'.
     BAPI_MLANX1-TAXCLASS_4 = 'X'.
    *--- BAPI to create material
    call function 'BAPI_MATERIAL_SAVEDATA'
    exporting
    HEADDATA = BAPI_HEAD
    CLIENTDATA = BAPI_MARA1
    CLIENTDATAX = BAPI_MARAX
    *PLANTDATA = BAPI_MARC1
    *PLANTDATAX = BAPI_MARCX
    * FORECASTPARAMETERS =
    * FORECASTPARAMETERSX =
    * PLANNINGDATA =
    * PLANNINGDATAX =
    * STORAGELOCATIONDATA =
    * STORAGELOCATIONDATAX =
    VALUATIONDATA = BAPI_MBEW1
    VALUATIONDATAX = BAPI_MBEWX
    * WAREHOUSENUMBERDATA =
    * WAREHOUSENUMBERDATAX =
    * SALESDATA = BAPI_MVKE1
    * SALESDATAX = BAPI_MVKEX
    * STORAGETYPEDATA =
    * STORAGETYPEDATAX =
    IMPORTING
    RETURN = BAPI_RETURN
    TABLES
    MATERIALDESCRIPTION = IT_MAKT
    * UNITSOFMEASURE =
    * UNITSOFMEASUREX =
    * INTERNATIONALARTNOS =
    * MATERIALLONGTEXT =
    * TAXCLASSIFICATIONS =
    * RETURNMESSAGES =
    * PRTDATA =
    * PRTDATAX =
    * EXTENSIONIN =
     * EXTENSIONINX =
         .
    IF BAPI_RETURN-TYPE = 'E'.
    WRITE:/ 'Error:' ,BAPI_RETURN-MESSAGE ,'for material:' ,legacy_data-maTNR.
    ELSEIF BAPI_RETURN-TYPE = 'S'.
    WRITE: 'Successfully created material' ,legacy_data-maTNR.
    ENDIF.
    ENDLOOP.
    ENDFORM.                    " insertion
    The error: "The field MARC-MTVFP/BAPI_MARC-AVAILCHECK is defined as a required field; it does not contain an entry."
    Any suggestion will be very helpful.
    Kind Regards,
    Edited by: Prasenjit Sengupta on Nov 16, 2009 9:11 AM

    Hi Prasenjit,
    Did you get any solution for this? I am getting the same error from the BAPI when updating the MRP controller in the MARC table.
    Regards,
    Suruchi

  • How to extract/export SD billing invoices to flat data file

    Hi,
    We are using SAP R/3 4.5b and we have to extract billing invoices to a flat data file such as CSV or XML. How do we do it?
    Thanks

    Hi Gyan Der,
    There is another simple method to extract the SD Billing values from SAP.
    You can find the SD billing data in the following tables:
    VBRK - Header Data
    VBRP - Item Details
    You can create a report on these 2 tables in transaction SQVI.
    Check the following link for help on creating QuickViews:
    http://help.sap.com/saphelp_nw04/helpdata/en/d1/44f2b5c7f411d296080000e82de14a/frameset.htm
    In SQVI, you can create a table join between the above 2 tables and extract the required output (conceptually, the join sketched after this reply).
    The report output can be transferred to an Excel file or a text file.
    hope this helps!
    best regards,
    Thangesh
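    For illustration, a hedged SQL sketch of the header/item join that such a QuickView performs. Only VBELN as the join key is taken from the reply above; the other column names (FKDAT, POSNR, MATNR, NETWR) are assumptions and should be confirmed in SE11 before building the query.

    -- Illustrative sketch of the VBRK/VBRP join behind an SQVI QuickView.
    -- Column names other than VBELN are assumptions.
    SELECT h.vbeln  AS billing_doc,
           h.fkdat  AS billing_date,
           i.posnr  AS item,
           i.matnr  AS material,
           i.netwr  AS net_value
      FROM vbrk h
      JOIN vbrp i
        ON i.vbeln = h.vbeln
     ORDER BY h.vbeln, i.posnr;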

  • Error occurred in the data uploading from Flat File in BPC NW

    Hi,
    I am working on a migration project from BPC MS to NW.
    As part of this I am loading data from a flat file into BPC NW. One error occurred in this process: record duplication.
    Of the roughly 17,000 records in total, about 7,000 were rejected because of duplication.
    The Information about Package Log
    /CPMB/MODIFY completed in 0 seconds
    /CPMB/CONVERT completed in 2 seconds
    /CPMB/LOAD completed in 7 seconds
    /CPMB/CLEAR completed in 0 seconds
    [Selection]
    FILE= DATAMANAGER\DATAFILES\Aprilmayjun_2011_Budget_V1.CSV
    TRANSFORMATION= DATAMANAGER\TRANSFORMATIONFILES\ZAPRMAYJUN_2011BUDGET.xls
    CLEARDATA= No
    RUNLOGIC= Yes
    CHECKLCK= No
    [Messages]
    Task name CONVERT:
    No 1 Round:
    Record count: 17064
    Accept count: 17064
    Reject count: 0
    Skip count: 0
    Task name LOAD:
    Reject count: 7230
    Submit count: 9834
    Application: CorpBudget Package status: WARNING
    Could you help me with this at the earliest?
    Thanks and Regards
    Krishna

    Hi,
    You cannot load the duplicated records with the standard import package. You need to create a new package linked to the /CPMB/APPEND process chain and run the import with that; it accepts the duplicate entries as well. It looks like you need to load all the records, including the duplicate ones.
    So, in the Excel client, go to Manage Data -> Maintain Data Management -> Organize Package List, add a new package, and link it to the standard BPC process chain /CPMB/APPEND.
    Thanks,
    Sreeni

  • Error in Shopping Cart Upload from Portal Using Excel

    Dear All,
    I am facing a problem when I try to upload an Excel file to create a shopping cart from the Portal. To do this I created a transaction in SRM R/3 and published the tcode on the Portal using a Transaction iView. But when I try to upload an Excel file through that transaction, I get the exception UPLOAD_OLE. For uploading the Excel file we copied the FM ALSM_EXCEL_TO_INTERNAL_TABLE; it works fine in the SRM R/3 environment, but the issue appears on the Portal. Please help us with this.
    Regards
    Shankar

    Dear Poster,
    As no response has been provided to the thread in some time I must assume the issue is resolved, if the question is still valid please create a new thread rephrasing the query and providing as much data as possible to promote response from the community.
    Best Regards,
    SDN SRM Moderation Team

  • Error while creating a report from personal files.

    Hi All,
    We are trying to insert a report from an Excel file (environment: BO XI R3 Desktop Intelligence).
    It pops up with the error "Too many data to display".
    Is there any data limit, data size limit, or number-of-rows limit?
    Thanks in advance...

    Hi Rachna,
    Can you try one thing: restrict the values to fewer than 16,000 and check the behavior.
    What is the Excel version - is it Excel 2007?
    Are you trying to use an Excel sheet that has more than 256 columns?
    Regards
    Kultar

  • Va31 shedule line agreement data upload from flat file

    Hi ABAPers,
    I have to upload some data (VA31) from a flat file to my database (scheduling agreement data). I am using a user exit for it, but I can't work out which user exit serves the purpose or where to check it. I tried SDTRM001, MEETA001 and the V45A series, but it is not working. I set breakpoints in these user exits, but execution does not stop at them.
    Can anyone help me find which user exit will work in this case?
    Thanks in Advance
    Annu

    Hi Prash,
    Check these posts:
    Re: Increasing the length of Infoobject from 60 to 240 characters
    Re: InfoObject > 60
    Bye
    Dinesh

  • ERROR LOADING bcp -w option generated data files

    Hi,
    I have to migrate data from SQL Server to Oracle.
    The unload script is this:
    bcp "CLEPSIDRA.dbo.InversionesAceptadas" out "[CLEPSIDRA].[dbo].[InversionesAceptadas].dat" -q -c -t "<EOFD>" -r "<EORD>" -Usa -Pas -STIEPO
    When I execute the load script, the data is loaded ok.
    But I have a field that contains characters like á, é, ñ, ... so I changed my unload script to:
    bcp "CLEPSIDRA.dbo.InversionesAceptadas" out "[CLEPSIDRA].[dbo].[InversionesAceptadas].dat" -q -w -t "<EOFD>" -r "<EORD>" -Usa -Pas -STIEPO
    If I execute the new unload script, the generated data file is OK, with the characters like á, é, ñ...
    But when I try to execute the load script, I get errors like 'field is not a valid number'.
    However, if I open the first (-c generated) data file, replace its content with the content of the second (-w generated) data file, and save it, the load script works OK!
    How can I solve this?
    Thanks

    Hi user616069,
    There is a preference in SQLDeveloper for encoding.
    Tools->Preferences->Environment->Encoding
    Do you have a reproducible test case you can give us a URL to, including for example a single fictional record?
    -Turloch
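    As background: bcp -c writes the data file in the client code page, while bcp -w writes UTF-16LE, so a load script generated for the -c file will misread the -w file. A hedged sketch of how the SQL*Loader control file might be adjusted for the UTF-16 extract follows; the table name and column list are placeholders, and the real control file generated for the migration should be edited rather than replaced.

    -- Hedged sketch only; table and column names are placeholders.
    LOAD DATA
      CHARACTERSET UTF16          -- the bcp -w file is UTF-16, not the client code page
      BYTEORDER LITTLE ENDIAN     -- bcp on Windows writes little-endian UTF-16
      INFILE 'InversionesAceptadas.dat'  -- the custom row terminator <EORD> still has to be
                                         -- declared (e.g. via the "STR ..." clause), as in the -c control file
      INTO TABLE inversiones_aceptadas
      FIELDS TERMINATED BY '<EOFD>'
      TRAILING NULLCOLS
      ( campo_texto,
        campo_numero )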
