BDC with many records?

Hello,
I have 10,000 material master records to be uploaded via BDC. I did the coding part and it is all working fine; I checked it with a sample of 2 records. For the same code, I have now been instructed to upload the 10,000 records using SM35. What do I have to do in this case? I want to know how to configure SM35 so that I can trigger the session at the time I am instructed to run the program.
Thanks

Hi balu,
You can use transaction code SM35 to run and manage batch input sessions.
You can schedule the SAP standard program RSBDCSUB to execute your session.
See this link:
http://e-mory.blogspot.com/2007/06/sap-batch-input-session-sm35.html
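If your program currently uses CALL TRANSACTION, the usual approach for a large load is to write the same BDCDATA into a batch input session instead, and then process that session in SM35 (or schedule RSBDCSUB as a background job in SM36 so it runs at the required time). A minimal sketch, assuming your existing BDCDATA table is it_bdcdata and your screen/field mapping routine is fill_bdcdata; the session name ZMM01_LOAD and the table it_records are made-up names:

DATA: it_bdcdata LIKE bdcdata OCCURS 0 WITH HEADER LINE.

* Open the session that will appear in SM35
CALL FUNCTION 'BDC_OPEN_GROUP'
  EXPORTING
    client = sy-mandt
    group  = 'ZMM01_LOAD'    " session name shown in SM35
    user   = sy-uname
    keep   = 'X'.            " keep the session after processing

LOOP AT it_records.          " your 10000 material records
  REFRESH it_bdcdata.
  PERFORM fill_bdcdata.      " your existing BDCDATA build-up per material
* Add one MM01 transaction to the session
  CALL FUNCTION 'BDC_INSERT'
    EXPORTING
      tcode     = 'MM01'
    TABLES
      dynprotab = it_bdcdata.
ENDLOOP.

* Close the session; it is now ready for processing in SM35
CALL FUNCTION 'BDC_CLOSE_GROUP'.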
thanks
karthik

Similar Messages

  • Account document parking in BDC with multiple records

    Hi,
    I am writing a BDC program to park a document (not post it) using transaction 'FBV1'. In the file, we can have multiple items for posting key 40 and a single item for posting key 50, and vice versa. We can also have multiple items for both posting keys 40 and 50. But the sum of the amounts of all records for posting keys 40 and 50 must be '0' (debits and credits must balance). All these multiple records together create one accounting document.
    I have created the recording and written the BDC program, but I think I am missing the grouping of records for posting keys 40 and 50 into a single document number. Can you please guide me on this? If anyone has encountered such a problem, please guide me; if you have code for this BDC, please send it.

    Hi amrita,
    I do not understand what you mean by grouping of records. Can you please explain?
    Still, for parking documents, the recommended solution is to use the standard batch input program RFBIBL00. In its input file, each BBKPF header record starts a new document, and all the BBSEG item records that follow it (until the next BBKPF) belong to that document - that is how your posting key 40 and 50 items get grouped into one parked document.
    Here is sample pseudo-code:
    *bgr00
      wa_bgr00-stype   = 0.
      CONCATENATE  'FBV1' sy-uzeit+0(2) '_' sy-uzeit+4(2)
                  INTO wa_bgr00-group.
      wa_bgr00-mandt   = sy-mandt.
      wa_bgr00-usnam   = sy-uname.
      wa_bgr00-start   = sy-datum.
      wa_bgr00-xkeep   = space.
      wa_bgr00-nodata  = '/'.
      MOVE wa_bgr00 TO wa_file.
      APPEND wa_file TO it_file.
      v_empty_indicator = wa_bgr00-nodata .
    *bbkpf
      PERFORM build_data CHANGING wa_bbkpf.
      wa_bbkpf-stype = 1.
      wa_bbkpf-tcode = 'FBV1'.
      wa_bbkpf-bldat = wa_docheader-doc_date.
      wa_bbkpf-blart = wa_docheader-doc_type.
      wa_bbkpf-bukrs = wa_docheader-comp_code.
      wa_bbkpf-budat = wa_docheader-pstng_date.
      wa_bbkpf-monat = wa_docheader-fis_period.
      wa_bbkpf-waers = 'USD'.
      wa_bbkpf-xblnr = wa_docheader-ref_doc_no.
      wa_bbkpf-bktxt = wa_docheader-header_txt.
      MOVE wa_bbkpf TO wa_file.
      APPEND wa_file TO it_file.
    *bbseg
      LOOP AT it_accntgl INTO wa_accntgl.
        CLEAR wa_bbseg.
        PERFORM build_data CHANGING wa_bbseg.
        READ TABLE it_curramount INTO wa_curramount INDEX wa_accntgl-itemno_acc.
        wa_bbseg-stype = '2'.
        wa_bbseg-tbnam = 'BBSEG'.
        wa_bbseg-zuonr = wa_accntgl-alloc_nmbr.
        wa_bbseg-newbk = wa_accntgl-comp_code.
        wa_bbseg-wrbtr = wa_curramount-amt_doccur.
        wa_bbseg-kostl = wa_accntgl-costcenter.
        wa_bbseg-aufnr = wa_accntgl-orderid.
        wa_bbseg-hkont = wa_accntgl-gl_account.
        wa_bbseg-prctr = wa_accntgl-profit_ctr.
        wa_bbseg-projk = wa_accntgl-wbs_element.
        IF wa_curramount-amt_doccur LT 0.
          wa_bbseg-newbs  = '40'.
        ELSE.
          wa_bbseg-newbs  = '50'.
        ENDIF.
        MOVE wa_bbseg TO wa_file.
        APPEND wa_file TO it_file.
      ENDLOOP.
      DATA: l_filepath TYPE eseftappl.
      CONCATENATE '/tmp/FBV1_load' sy-uzeit '.txt' INTO l_filepath.
    *write app file for rfbibl00
      PERFORM write_data_appl USING it_file l_filepath.
    *Submit the data to post the document
      SUBMIT rfbibl00 WITH ds_name   = l_filepath
                      WITH fl_check  = ' '
                      WITH callmode  = 'C'
                      WITH xinf      = 'X'
                      AND RETURN.
    form build_data  changing pv_header  TYPE any.
    DATA:
        v_field(3)   TYPE n.
      FIELD-SYMBOLS <fs_field_value> TYPE ANY.
      v_field = 1.
      DO.
        ASSIGN COMPONENT v_field  OF STRUCTURE pv_header TO <fs_field_value>.
        IF sy-subrc = 0.
    *     if the field is empty then fill it with the nodata indicator
          IF <fs_field_value> IS INITIAL.
            <fs_field_value> = v_empty_indicator.
          ENDIF.
          v_field = v_field + 1.
        ELSE.
          EXIT.
        ENDIF.
      ENDDO.
    endform.                    " build_data
    form write_data_appl   USING    pv_file      TYPE ty_t_filedata
                                    pv_file_path TYPE eseftappl.
      DATA:
           wa_file   TYPE ty_filedata,
           wa_return TYPE bapiret2,
           v_msg  TYPE string.
    *Open the application server file
    *and write the data
      OPEN DATASET pv_file_path FOR OUTPUT
                   IN TEXT MODE ENCODING NON-UNICODE
                   MESSAGE v_msg.
      IF sy-subrc = 0.
    *   write the file to the application server
        LOOP AT pv_file INTO wa_file.
          TRANSFER wa_file TO pv_file_path.
        ENDLOOP.
        CLOSE DATASET pv_file_path.
      ELSE.
        MESSAGE e000(00) WITH v_msg.
      ENDIF.
    endform.                    " write_data_appl
    Thanks
    Romit

  • How can I set limitations on how many records a report can return

    I have a report on the web using Oracle Reports Builder, and I have the client enter date parameters for the report they want.
    With different date ranges for different clients, a different number of records is returned. Because of the time it can take the report to return, I want to limit the number of records the report can return.
    How can I go about doing that? I don't want to limit with date parameters, because dates won't really work for me. I need to limit how many records can be returned. If it exceeds 10,000 records, I want the client either to refine the date range or to schedule the report to run later (meaning we will still run that report). So if the count was over 10,000, I would show two check boxes: do you want to refine your date range, or schedule the job to run later?
    Can anyone help me with this? How would I go about it?

    To know if the report is going to return more than 10,000 records, you first have to run the query with a 'select count(1) from ... where ...' (with the same from and where clauses as your normal query). Since this takes about the same time as running your report, I wonder if you really gain anything (although formatting may take some time too).
    You may simplify the select count(1) query by omitting all the lookup tables that are only needed for formatting. That way your query may run a lot faster. You can put this in your after-parameter-form trigger.

  • Fetching many records all at once is no faster than fetching one at a time

    Hello,
    I am having a problem getting NI-Scope to perform adequately for my application.  I am sorry for the long post, but I have been going around and around with an NI engineer through email and I need some other input.
    I have the following software and equipment:
    LabView 8.5
    NI-Scope 3.4
    PXI-1033 chassis
    PXI-5105 digitizer card
    DELL Latitude D830 notebook computer with 4 GB RAM.
    I tested the transfer speed of my connection to the PXI-1033 chassis using the niScope Stream to Memory Maximum Transfer Rate.vi found here:
    http://zone.ni.com/devzone/cda/epd/p/id/5273.  The result was 101 MB/s.
    I am trying to set up a system whereby I can press the start button and acquire short waveforms which are individually triggered.  I wish to acquire these individually triggered waveforms indefinitely.  Furthermore, I wish to maximize the rate at which the triggers occur.  In the limiting case where I acquire records of one sample, the record size in memory is 512 bytes (using the formula to calculate 'Allocated Onboard Memory per Record' found in the NI PXI/PCI-5105 Specifications under the heading 'Waveform Specifications', pg. 16).  The PXI-5105 trigger re-arms in about 2 microseconds (500kHz), so to trigger at that rate indefinitely I would need a transfer speed of at least 256 MB/s.  So clearly, in this case the limiting factor for increasing the trigger rate while still being able to acquire indefinitely is the rate at which I transfer records from memory to my PC.
    To maximize my record transfer rate, I should transfer many records at once using the Multi Fetch VI, as opposed to the theoretically slower method of transferring one at a time.  To compare the rates at which I can transfer records using the all-at-once and one-at-a-time methods, I modified the niScope EX Timestamps.vi to let me choose between these transfer methods by changing the constant wired to the Fetch Number of Records property node to either -1 or 1 respectively.  I also added a loop that ensures that all records are acquired before I begin the transfer, so that acquisition and trigger rates do not interfere with measuring the record transfer rate.  This modified VI is attached to this post.
    I have the following results for acquiring 10k records.  My measurements are done using the Profile Performance and Memory Tool.
    I am using a 250kHz analog pulse source.
    Fetching 10000 records 1 record at a time, the niScope Multi Fetch Cluster takes a total time of 1546.9 milliseconds, or 155 microseconds per record.
    Fetching 10000 records at once, the niScope Multi Fetch Cluster takes a total time of 1703.1 milliseconds, or 170 microseconds per record.
    I have tried this for larger and smaller total numbers of records, and the transfer time is always around 170 microseconds per record regardless of whether I transfer one at a time or all at once.  But with a 100MB/s link and a 512 byte record size, the fetch speed should approach 5 microseconds per record as you increase the number of records fetched at once.
    With this, my application will be limited to a trigger rate of 5kHz for running indefinitely, though it should be capable of closer to a 200kHz trigger rate for extended periods of time.  I have a feeling that I am missing something simple or am just confused about how the Fetch functions should work. Please enlighten me.
    Attachments:
    Timestamps.vi ‏73 KB

    Hi ESD
    Your numbers for testing the PXI bandwidth look good.  A value of approximately 100MB/s is reasonable when pulling data across the PXI bus continuously in larger chunks.  This may decrease a little when working with MXI in comparison to using an embedded PXI controller.  I expect you were using the streaming example "niScope Stream to Memory Maximum Transfer Rate.vi" found here: http://zone.ni.com/devzone/cda/epd/p/id/5273.
    Acquiring multiple triggered records is a little different.  There are a few techniques that will help to make sure that you are able to fetch your data fast enough to keep up with the acquired data or the desired reference trigger rate.  You are certainly correct that it is more efficient to transfer larger amounts of data at once, instead of small amounts of data more frequently, as the overhead due to DMA transfers becomes significant.
    The trend you saw, that fetching fewer records was more efficient, sounded odd, so I ran your example and tracked down what was causing it.  I believe it is actually the for loop that you had in your acquisition loop.
    I made a few modifications to the application to display the total fetch time to acquire 10000 records.  The best fetch time is when all records are pulled in at once.  I left your code in the application but temporarily disabled the for loop to show the fetch performance.  I also added a loop to ramp the fetch number up and graph the fetch times.  I will attach the modified application as well as the fetch results I saw on my system for reference.  When the for loop is enabled, the performance was worst at 1-record fetches; the fetch time dipped around 500 records/fetch and began to ramp up again as the records/fetch increased to 10000.
    Note I am using the 2D I16 fetch as it is more efficient to keep the data unscaled.  I have also added an option to use immediate triggering - this is just because I was not near my hardware to physically connect a signal so I used the trigger holdoff property to simulate a given trigger rate.
    Hope this helps.  I was working in LabVIEW 8.5, if you are working with an earlier version let me know.
    Attachments:
    RecordFetchingTest.vi ‏143 KB
    FetchTrend.JPG ‏37 KB

  • Problem in BDC with ( MM01-Dynamic views )

    HI
    I am working with BDC on material master (MM01).
    While creating a material, there is an option of selecting different views based on the material type.
    But while recording a BDC, I can only record it for a particular material type and can only select the views associated with that material type.
    Is there any solution to create a BDC program which can accept a new material based on any material type, with the views chosen by the user in the input file?
    For that, I think I need to create dynamic views and the screen elements for input associated with those views.
    After scanning some sites and forums for this particular problem, I got some help on this:
    SELECTION_VIEWS_FIND should be very useful in this case.
    There are three function modules involved in total:
    a) T130M_SINGLE_READ
    Pass TCODE = MM01, and get the T130M values.
    b) BILDSEQUENZ_IDENTIFY
    Pass KZRFB = 'X', TCODE_REF = T130M-TRREF and get the BILDSEQUENZ values.
    c) SELECTION_VIEWS_FIND
    Pass BILDSEQUENZ and T130M-PSTAT.
    This function module then holds the values of all the views and the next screen numbers.
    But I am unable to interpret the solution. Can somebody elaborate on this, or provide some other solution?
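    For reference, the chain described above would look roughly like this. This is a sketch only: the call pattern and the parameter names TCODE, KZRFB, TCODE_REF, BILDSEQUENZ and PSTAT come from the hints quoted above, but the receiving variable types and the remaining parameter names are my assumptions, so verify the exact signatures in SE37 before using it:
    DATA: wt130m         LIKE t130m,
          bildsequenz(2) TYPE c,                                 " assumed type
          it_views       LIKE mbildtab OCCURS 0 WITH HEADER LINE. " assumed type
    * a) Read the screen control data for the transaction
    CALL FUNCTION 'T130M_SINGLE_READ'
      EXPORTING
        tcode  = 'MM01'
      IMPORTING
        wt130m = wt130m.
    * b) Identify the screen sequence
    CALL FUNCTION 'BILDSEQUENZ_IDENTIFY'
      EXPORTING
        kzrfb       = 'X'
        tcode_ref   = wt130m-trref
      IMPORTING
        bildsequenz = bildsequenz.
    * c) Find the selectable views and their next screen numbers
    CALL FUNCTION 'SELECTION_VIEWS_FIND'
      EXPORTING
        bildsequenz  = bildsequenz
        pflegestatus = wt130m-pstat      " parameter name assumed
      TABLES
        bildtab      = it_views.         " parameter name assumed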
    Thanks in Advance
    Help Will be appreciated.

    Yes, I agree with Christian. You are going to want to use the BAPI or the underlying function module. It works well and isn't really that complex. Here is a sample program; here the function module is being used to change data, but it can also be used to create - you will have to change the tcode from MM02 to MM01.
    report zrich_0001.
    data: i_mara type table of mara_ueb with header line.
    data: i_marc type table of marc_ueb with header line.
    data: i_marcx type table of marc with header line.
    parameters: p_matnr type mara-matnr,
                p_plifz type marc-plifz.
    start-of-selection.
    * Get all plants for that material
      select * into corresponding fields of table i_marcx
            from marc where matnr = p_matnr.
    * Now maintain data for each plant
      loop at i_marcx.
    * Reset I_Mara to new material number
        clear i_mara. refresh i_mara.
        select * into corresponding fields of table i_mara
              from mara where matnr = p_matnr.
        read table i_mara index 1.
        i_mara-tcode = 'MM02'.
        i_mara-tranc = '0000000001'.
        modify i_mara index 1.
    * Get Plant Data
        clear i_marc. refresh i_marc.
        select * into corresponding fields of table i_marc
              from marc where matnr = i_marcx-matnr
                          and werks = i_marcx-werks.
        read table i_marc index 1.
        i_marc-plifz  = p_plifz.        " Plan Del time
        i_marc-tranc = '0000000001'.
        modify i_marc index 1.
    * Maintain material
        perform material_maintain_dark.
      endloop.
    *  MATERIAL_MAINTAIN_DARK
    form MATERIAL_MAINTAIN_DARK.
    * Variable for "Maintain_Material_Dark" Function Module
      data: numerror like tbist-numerror.
      data: last_matnr type mara-matnr.
      data: i_delfields type table of mfieldres with header line.
      data: i_errors    type table of merrdat   with header line.
      call function 'MATERIAL_MAINTAIN_DARK'
             exporting
                  sperrmodus                = ' '
                  kz_prf                    = 'W'
                  max_errors                = ' '
                  p_kz_no_warn              = 'X'
                  kz_verw                   = 'X'
                  kz_aend                   = 'X'
                  kz_dispo                  = 'X'
                  kz_test                   = ' '
                  flag_muss_pruefen         = ' '
                  call_mode                 = 'ACT'
             importing
                  number_errors_transaction = numerror
                  matnr_last     = last_matnr
             tables
                 amara_ueb      = i_mara    "Basic Data
    *             amakt_ueb      = i_makt    "Descriptions
                 amarc_ueb      = i_marc    "Plant
    *             amard_ueb      = i_mard    "Storage Location
    *            AMFHM_UEB      = I_MFHM    "Production Tools
    *             amarm_ueb      = i_marm    "Units of Measure
    *            AMEA1_UEB      = I_MEA1    "Internal Management -  EANs
    *             ambew_ueb      = i_mbew    "Accounting/Costing
    *             asteu_ueb      = i_steu    "Tax Data
    *             astmm_ueb      = i_steumm  "Tax Data
    *            AMLGN_UEB      = I_MLGN    "Warehouse Data
    *            AMLGT_UEB      = I_MLGT    "Storage Type Data
    *            AMPGD_UEB      = I_MPGD    "Change Documents
    *            AMPOP_UEB      = I_MPOP    "Forecast Parameters
    *            AMVEG_UEB      = I_MVEG    "Total Consumption Data
    *            AMVEU_UEB      = I_MVEU    "Unplanned Consumption Data
    *             amvke_ueb      = i_mvke    "Sales Data
    *             altx1_ueb      = i_ltx1    "Sales Text
    *            AMPRW_UEB      = I_MPRW    "Forecast Values
                 amfieldres     = i_delfields
                 amerrdat       = i_errors
             exceptions
                  kstatus_empty             = 01
                  tkstatus_empty            = 02
                  t130m_error               = 03
                  internal_error            = 04
                  update_error              = 05
                  too_many_errors           = 06.
      if sy-subrc <> 0.
        rollback work.
        call function 'DEQUEUE_ALL'.
      else.
        commit work and wait.
        call function 'DEQUEUE_ALL'.
      endif.
      clear: i_mara, i_marc.
      refresh: i_mara, i_marc.
    endform.
    Regards,
    Rich Heilman

  • Is there any limit on how many records a cursor can hold?

    Hi Everyone,
    This is Amit here. I want to know whether there is any limit on how many records a cursor can hold.
    I have a program in which I am creating a cursor and passing it to another procedure as an input parameter. But the count of the cursor query is more than 15 lakh (1.5 million) rows. The program is running forever.
    Just wanted to know whether the huge data is the problem.
    Thanks ....
    Regards,
    Amit

    user13079404 wrote:
    > Just wanted to know whether the huge data is the problem.
    What do you think? How long does your code typically need to wait for the data to leave the magnetic platter of the hard disk, travel across wires and into the memory buffer of your application - for a single row?
    Now multiply that waiting-for-I/O time by a million - for a million rows. Or by a billion, for a billion rows.
    Is "huge data" a problem? Not really - it simply needs more work to get that amount of data from disk. More work means slower performance. It is that simple.
    Which is why the row-by-row approach used by many developers is wrong. You do not pull a million rows from disk and process them in PL/SQL or Java or .Net. Heck, you do not even pull 10,000 rows like that.
    The correct approach is to think in data sets and use SQL to do that processing for you - and only return the bare minimum of data to the application layer. Maximize SQL. Minimize PL/SQL and Java and .Net.

  • Working with many-to-many relationships

    I am about to start my first experience with many-to-many relationships, using PHP and MySQL.
    My project is to create an events registration form. As you know, there can be many events and many participants attending many events.
    I am not sure how to structure my database and tables, so I will first show you what I have:
    create table programs (
    program_id int not null primary key auto_increment,
    program_name varchar(100) not null,
    program_date varchar(25) not null,
    program_time varchar(25) not null,
    program_coordinator varchar(100),
    program_seats int not null
    );
    create table participants (
    participant_id int not null primary key auto_increment,
    participant_name varchar(100) not null,
    participant_phone varchar(12) not null
    );
    I know that I need a middle table to join the two.
    create table programs_participants (
    program_id int references programs(program_id),
    participant_id int references participants(participant_id),
    primary key (program_id, participant_id)
    );
    My problem is: how do I submit to both the participants AND the programs_participants tables together? Or is this not possible? The participants are not already in the database when we register them. We enter their personal info and select their desired events from checkboxes on the same page.
    Thanks for your help.

    > My problem is: how do I submit to both the participants AND the
    > programs_participants tables together? Or is this not possible? The
    > participants are not already in the database when we register them.
    > We enter their personal info and select their desired events from
    > checkboxes on the same page.
    What you need to do is a multi-step insert.
    First, you insert the new participant into the participants table, then use the @@identity command to get the unique ID of that newly entered record.
    Then you can take that unique ID to build the entry for the programs_participants table.
    If you use a stored procedure, you should be able to do all of that with only one call to the DB from your application.
    _Darrel

  • Mapping with repeating records

    Hello,
    I have one schema which has two child records; the second child has many child fields with repeating records. But when I map it further, I get that second record only once, even when it occurs multiple times.
    Please help me as soon as possible.
    Thanks in advance

    Nilesh,
    You have two options:
    Option 1:
    In the schemas, set the "Max Occurs" property of the repeating child record (the repeating record under the second child record) to * (or unbounded). Ensure that this property (Max Occurs = *) is set for the repeating records in both the source and destination schemas. This way, that particular child record will repeat. This option gives more control over which child record you want to set as repeatable.
    Option 2:
    Set the "Group Max Occurs" property of the second child record (the one which has the repeating records under it) to * (or unbounded). This way, all the child records of this second child will be set as repeating records.
    Also ensure that the repeating records (not just the child elements, but the repeating records themselves) are mapped from source to destination.
    If this answers your question, please mark it accordingly. If this post is helpful, please vote it as helpful.

  • Report timing out, "Too many records"

    I am using CR XI R2 with a universe that I created, to run an inventory report. The report works fine in the Designer, but when I run it from the CR Server it times out (very quickly). If I put filters on and only pull a subset of the data, the report works fine. The issue is that there should not be too many records. Where can I check to make sure that no limitations are set and that I can get all of the records? I have set the Universe parameters not to limit the number of records or the execution time. Is there anywhere else I can check, or any other ideas?
    Thanks

    What viewer do you use?
    If the viewer is ADHTML, you are using the RAS (Report Application Server). In that case, you need to set the number of (db) records to read in the RAS properties on the CMC.
    If the viewer is DHTML, Java or ActiveX, you are using the CR Page Server. You will need to set the number of db records to read in the Crystal Reports Page Server properties on the CMC.

  • BDC with ME51!!

    Hello,
    I have created a BDC for ME51 and I am facing a problem with it: in the item data, the second record is getting overwritten by the third record's data. Please give me a solution. The code goes as below; tell me where I am going wrong.
    REPORT Z_BDC_ME51_CALLTRANSACTION NO STANDARD PAGE HEADING LINE-SIZE 255.
    TABLES: EBAN,RM06B.
    *include bdcrecx1.
    DATA:   T_BDCDATA LIKE BDCDATA OCCURS 0 WITH HEADER LINE.
    DATA:   MESSTAB LIKE BDCMSGCOLL OCCURS 0 WITH HEADER LINE .
    DATA : BEGIN OF T_UPLOAD OCCURS 0 ,
            BSART LIKE EBAN-BSART ,
            EEIND LIKE RM06B-EEIND ,
            LPEIN LIKE RM06B-LPEIN ,
            WERKS LIKE EBAN-WERKS ,
            LGORT LIKE EBAN-LGORT ,
            EKGRP LIKE EBAN-EKGRP ,
            MATKL LIKE EBAN-MATKL ,
            MATNR LIKE EBAN-MATNR ,
            MENGE(10) ,
           END OF T_UPLOAD .
    DATA : BEGIN OF T_HEADER OCCURS 0 ,
            BSART LIKE EBAN-BSART ,
            EEIND LIKE RM06B-EEIND ,
            LPEIN LIKE RM06B-LPEIN ,
            WERKS LIKE EBAN-WERKS ,
            LGORT LIKE EBAN-LGORT ,
            EKGRP LIKE EBAN-EKGRP ,
            MATKL LIKE EBAN-MATKL ,
           END OF T_HEADER ,
           BEGIN OF T_ITEM OCCURS 0 ,
            BSART LIKE EBAN-BSART ,
            MATNR LIKE EBAN-MATNR ,
            MENGE(10) ,
           END OF T_ITEM  ,
           V1 TYPE SY-TABIX .
    DATA : P_FLNAME LIKE RLGRAP-FILENAME VALUE 'D:\TEST_ME51.TXT' .
    CALL FUNCTION 'UPLOAD'
      EXPORTING
    *   CODEPAGE                      = ' '
        FILENAME                      = 'D:\TEST_ME51.txt'
        FILETYPE                      = 'DAT'
    *   ITEM                          = ' '
    *   FILEMASK_MASK                 = ' '
    *   FILEMASK_TEXT                 = ' '
    *   FILETYPE_NO_CHANGE            = ' '
    *   FILEMASK_ALL                  = ' '
    *   FILETYPE_NO_SHOW              = ' '
    *   LINE_EXIT                     = ' '
    *   USER_FORM                     = ' '
    *   USER_PROG                     = ' '
    *   SILENT                        = 'S'
    * IMPORTING
    *   FILESIZE                      =
    *   CANCEL                        =
    *   ACT_FILENAME                  =
    *   ACT_FILETYPE                  =
      TABLES
        DATA_TAB                      = T_UPLOAD
      EXCEPTIONS
        CONVERSION_ERROR              = 1
        INVALID_TABLE_WIDTH           = 2
        INVALID_TYPE                  = 3
        NO_BATCH                      = 4
        UNKNOWN_ERROR                 = 5
        GUI_REFUSE_FILETRANSFER       = 6
        OTHERS                        = 7.
    IF SY-SUBRC <> 0.
      MESSAGE ID SY-MSGID TYPE SY-MSGTY NUMBER SY-MSGNO
              WITH SY-MSGV1 SY-MSGV2 SY-MSGV3 SY-MSGV4.
    ENDIF.
    START-OF-SELECTION.
    DATA : V_BSART LIKE EBAN-BSART .
    LOOP AT T_UPLOAD.
       MOVE :
            T_UPLOAD-BSART TO T_HEADER-BSART,
            T_UPLOAD-EEIND TO T_HEADER-EEIND,
            T_UPLOAD-LPEIN TO T_HEADER-LPEIN,
            T_UPLOAD-WERKS TO T_HEADER-WERKS,
            T_UPLOAD-LGORT TO T_HEADER-LGORT,
            T_UPLOAD-EKGRP TO T_HEADER-EKGRP,
            T_UPLOAD-MATKL TO T_HEADER-MATKL.
       APPEND T_HEADER.
       V_BSART = T_UPLOAD-BSART.
       LOOP AT T_UPLOAD WHERE BSART = V_BSART.
         MOVE :
             T_UPLOAD-BSART TO T_ITEM-BSART ,
             T_UPLOAD-MENGE TO T_ITEM-MENGE,
             T_UPLOAD-MATNR TO T_ITEM-MATNR.
         APPEND T_ITEM.
         DELETE T_UPLOAD.
      ENDLOOP.
    ENDLOOP.
    LOOP AT T_HEADER .
    CLEAR T_BDCDATA .
    REFRESH T_BDCDATA .
    PERFORM BDC_DYNPRO      USING 'SAPMM06B' '0100'.
    *PERFORM BDC_FIELD       USING 'BDC_CURSOR' 'RM06B-EEIND'.
    PERFORM BDC_FIELD       USING 'BDC_OKCODE' '/00'.
    PERFORM BDC_FIELD       USING 'EBAN-BSART' T_HEADER-BSART.
    PERFORM BDC_FIELD       USING 'RM06B-EEIND' T_HEADER-EEIND.
    PERFORM BDC_FIELD       USING 'RM06B-LPEIN' T_HEADER-LPEIN.
    PERFORM BDC_FIELD       USING 'EBAN-WERKS' T_HEADER-WERKS.
    PERFORM BDC_FIELD       USING 'EBAN-LGORT' T_HEADER-LGORT.
    PERFORM BDC_FIELD       USING 'EBAN-EKGRP' T_HEADER-EKGRP.
    PERFORM BDC_FIELD       USING 'EBAN-MATKL' T_HEADER-MATKL.
    *PERFORM BDC_FIELD       USING 'BDC_CURSOR' 'EBAN-MENGE(01)'.
    PERFORM BDC_FIELD       USING 'BDC_OKCODE' '/00'.
    LOOP AT T_ITEM WHERE BSART EQ T_HEADER-BSART .
      IF SY-TABIX EQ 1.
        PERFORM BDC_DYNPRO      USING 'SAPMM06B' '0106'.
        PERFORM BDC_FIELD       USING 'EBAN-MATNR(01)' T_ITEM-MATNR.
       PERFORM BDC_FIELD       USING 'EBAN-TXZ01(01)' ''.
        PERFORM BDC_FIELD       USING 'EBAN-MENGE(01)' T_ITEM-MENGE.
        PERFORM BDC_FIELD       USING 'BDC_OKCODE' '/00'.
        PERFORM BDC_DYNPRO      USING 'SAPMM06B' '0102'.
        PERFORM BDC_FIELD       USING 'BDC_OKCODE' '/00'.
       PERFORM BDC_DYNPRO      USING 'SAPMM06B' '0106'.
       PERFORM BDC_FIELD       USING 'BDC_OKCODE' '/00'.
        ELSE .
        PERFORM BDC_DYNPRO      USING 'SAPMM06B' '0106'.
        PERFORM BDC_FIELD       USING 'EBAN-MATNR(02)' T_ITEM-MATNR.
       PERFORM BDC_FIELD       USING 'EBAN-TXZ01(01)' ''.
        PERFORM BDC_FIELD       USING 'EBAN-MENGE(02)' T_ITEM-MENGE.
        PERFORM BDC_FIELD       USING 'BDC_OKCODE' '/00'.
        PERFORM BDC_DYNPRO      USING 'SAPMM06B' '0102'.
        PERFORM BDC_FIELD       USING 'BDC_OKCODE' '/00'.
       PERFORM BDC_DYNPRO      USING 'SAPMM06B' '0106'.
       PERFORM BDC_FIELD       USING 'EBAN-MA
        ENDIF .
    ENDLOOP .
    PERFORM BDC_FIELD       USING 'BDC_OKCODE' '=BU'.
    *PERFORM BDC_FIELD       USING 'RM06B-BNFPO' '10'.
    CALL TRANSACTION 'ME51' USING T_BDCDATA
                                  MODE 'A'
                                  UPDATE 'S'
                                  MESSAGES INTO MESSTAB.
    ENDLOOP.
    *       Start new screen                                              *
    FORM BDC_DYNPRO USING PROGRAM DYNPRO.
      CLEAR T_BDCDATA.
      T_BDCDATA-PROGRAM  = PROGRAM.
      T_BDCDATA-DYNPRO   = DYNPRO.
      T_BDCDATA-DYNBEGIN = 'X'.
      APPEND T_BDCDATA.
    ENDFORM.
    *       Insert field                                                  *
    FORM BDC_FIELD USING FNAM FVAL.
      IF FVAL <> SPACE.
        CLEAR T_BDCDATA.
        T_BDCDATA-FNAM = FNAM.
        T_BDCDATA-FVAL = FVAL.
        APPEND T_BDCDATA.
      ENDIF.
    ENDFORM.
    I am using BSART, EEIND, LPEIN, WERKS, LGORT, EKGRP, MATKL, MATNR and MENGE in the flat file.

    The number in the brackets indicates the number of the line item, right?
    So, if you are not incrementing that number, you are always overwriting your 2nd line item with the next line item.
    So, as the loop goes, you should also increment that number, so that when the BDC runs the materials are attached to different line numbers.
    Outside the loop:
    DATA : counter(2) TYPE n VALUE '00',
           fieldmatnr LIKE bdcdata-fnam,
           fieldmenge LIKE bdcdata-fnam.
    Inside the loop:
    counter = counter + 1.
    CONCATENATE 'EBAN-MATNR(' counter ')' INTO fieldmatnr.
    PERFORM BDC_FIELD USING fieldmatnr T_ITEM-MATNR.
    CONCATENATE 'EBAN-MENGE(' counter ')' INTO fieldmenge.
    PERFORM BDC_FIELD USING fieldmenge T_ITEM-MENGE.
    The above changes should solve your problem.
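    With the dynamic field name, the IF/ELSE on SY-TABIX in your item loop is no longer needed; one branch can fill every row. A sketch, reusing the declarations above and ignoring screen paging (for long item lists you may still need an OK-code to scroll to a new page):
    LOOP AT T_ITEM WHERE BSART EQ T_HEADER-BSART.
      counter = counter + 1.
    * Build the indexed field names for this row
      CONCATENATE 'EBAN-MATNR(' counter ')' INTO fieldmatnr.
      CONCATENATE 'EBAN-MENGE(' counter ')' INTO fieldmenge.
      PERFORM BDC_FIELD USING fieldmatnr T_ITEM-MATNR.
      PERFORM BDC_FIELD USING fieldmenge T_ITEM-MENGE.
    ENDLOOP.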
    Regards,
    Ravi
    Note - Please mark all the helpful answers.

  • At first record of inner loop, want to know how many records exist

    How can I know the number of records in INTBSEG for each record of int_maintable, before running the inner loop with that WHERE condition on INTBSEG?
    Please see below:
    LOOP AT int_maintable.
      Loop at INTBSEG where bukrs = int_maintable-bukrs
                                     AND  belnr = int_maintable-belnr
                                     AND  gjahr = int_maintable-gjahr.
    (At the point where sy-tabix of INTBSEG = 1, I want to know how many records exist in INTBSEG for this WHERE condition: bukrs = int_maintable-bukrs AND belnr = int_maintable-belnr AND gjahr = int_maintable-gjahr. That is, I want the count for the first record itself of INTBSEG.)
      Endloop.
    ENDLOOP.

    Hi,
    Try this:
    DATA: INTBSEG_TMP LIKE INTBSEG OCCURS 0 WITH HEADER LINE.
    DATA: V_LINES TYPE SYTFILL.
    LOOP AT int_maintable.
      CLEAR: V_LINES.
      INTBSEG_TMP[] = INTBSEG[].
    * Keep only the rows that match the current int_maintable key
      DELETE INTBSEG_TMP WHERE bukrs <> int_maintable-bukrs
                            OR belnr <> int_maintable-belnr
                            OR gjahr <> int_maintable-gjahr.
      DESCRIBE TABLE INTBSEG_TMP.
      V_LINES = SY-TFILL.
      LOOP AT INTBSEG WHERE bukrs = int_maintable-bukrs
                        AND belnr = int_maintable-belnr
                        AND gjahr = int_maintable-gjahr.
    *   Use the field V_LINES to get the number of rows for that combination
      ENDLOOP.
    ENDLOOP.
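    An alternative sketch that avoids copying the whole table: count the matching rows directly before the inner loop (this assumes INTBSEG is a standard table with a header line, as in your code):
    DATA: V_LINES TYPE I.
    LOOP AT int_maintable.
      CLEAR V_LINES.
    * Count the matching rows without transporting any data
      LOOP AT INTBSEG TRANSPORTING NO FIELDS
           WHERE bukrs = int_maintable-bukrs
             AND belnr = int_maintable-belnr
             AND gjahr = int_maintable-gjahr.
        V_LINES = V_LINES + 1.
      ENDLOOP.
      LOOP AT INTBSEG WHERE bukrs = int_maintable-bukrs
                        AND belnr = int_maintable-belnr
                        AND gjahr = int_maintable-gjahr.
    *   V_LINES already holds the count here, even at sy-tabix = 1
      ENDLOOP.
    ENDLOOP.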
    Thanks,
    Naren

  • I want to display many records in the same JSP page

    Hi,
    I want to display many records within the same JSP page, providing next, previous, first and last navigation.
    Please give me a clear idea of how to do that.
    Note: only using servlets, JSP, JDBC and JavaScript.

    I believe that this is the fourth time this question has been asked by the same person
    http://forum.java.sun.com/thread.jspa?threadID=720977&messageID=4159465#4159465
    http://forum.java.sun.com/thread.jspa?threadID=720945&messageID=4159338#4159338
    http://forum.java.sun.com/thread.jspa?threadID=720919&tstart=0

  • How many records can the JDBC adapter obtain in one polling?

    Hello everybody,
    I need to build an interface between a legacy system and SAP ECC. The legacy system has a DB, so I use the JDBC adapter (sender) and send the information to SAP ECC with a proxy. I need to activate the polling option on my JDBC sender adapter. I read a table with a lot of records, and I need to know how many records the JDBC adapter supports when the polling is executed, because it is necessary to read all records from the table and change the status of the processed field.
    Is it possible to get all the records from that table in one polling interval (approx. 50,000 records)? Or do I need to do the polling in blocks of records until all records in the table are finished? For the second option, I have no idea how to do it.
    Regards,
    Vicman

    Hi again!
    I am still working on this, but I have a question: is it possible to use a stored procedure in the JDBC adapter? Is that supported (like PL/SQL)? I was working on the query below, but I don't know if it works, or where I need to put it - in the Query SQL Statement field, the Update SQL Statement field, or both - and how. (As far as I know, the usual JDBC sender pattern is a SELECT in the Query SQL Statement plus a matching UPDATE of the same rows in the Update SQL Statement, which the adapter runs in one transaction, so a cursor may not be needed at all.)
    DECLARE c_cursor CURSOR FOR
    SELECT * FROM tablename
    WHERE processed = 0
    OPEN c_cursor
    FETCH NEXT FROM c_cursor
    WHILE @@FETCH_STATUS = 0
    BEGIN
       -- update only the row the cursor is currently positioned on
       UPDATE tablename SET processed = 1
       WHERE CURRENT OF c_cursor
       FETCH NEXT FROM c_cursor
    END
    CLOSE c_cursor
    DEALLOCATE c_cursor
    Regards,

  • Report execution time and how many records will be returned before hitting the "run" option?

    Post Author: Prasad15
    CA Forum: WebIntelligence Reporting
    Is there any way to know how long the report will take to execute and how many records will be returned, before hitting the "run" option?
    Regards
    Prasad

    To know if the report is going to return more than 10,000 records, you first have to run the query with a 'select count(1) from ... where ...' (with the same from and where clauses as your normal query). Since this takes about the same time as running your report, I wonder if you really gain anything (although formatting may take some time too).
    You may simplify the select count(1) query by omitting all the lookup tables that are only needed for formatting. That way your query may run a lot faster. You can put this in your after-parameter-form trigger.

  • BDC session : Error records

    I used a BAPI function module. I handled the error messages and displayed them as output in ALV format, but then the requirement changed: it now has to create a session with those error records. Is it possible?
    Please help.
    Thank you

    Hi Pydi Reddy,
    A session to be created with error records? I couldn't understand - why do they want to create a session with error records? Do they mean that you have to create a BDC for the same t-code, and once the BAPI fails to update the records, the BDC should be used for those error records?
    Ask them what exactly they mean by scheduling a session with error records.
    Thanks & Regards,
    Faheem.
