Data package/container size

We are using MI and I have a question about data package sizes and container sizes.
If I use data packaging and the size is very small, I read at help.sap.com that MI sends 20 containers.
But how much data is sent by MI, and what is the size of one container?
I ask because we have a problem with the sizes of the data packages/containers.
So the main question is: what is the size of a container that is sent from MI to the middleware (SAP WAS)?
How can I see the data I sent from MI to the middleware (SAP WAS), and how can I see whether the packages are full or not?
I hope someone can help me. Thanks in advance.
Kind regards,
Bart Elshout

Hi Thomas,
Do you know anything about a "fragment bit" that is set when sending data from the SAP MI client to SAP WAS (middleware)?
The data is sent with the "Don't Fragment" bit set, and one of our switches/routers or something else in the network has to fragment the data, but because the bit is set it cannot. So the MI client receives an error message (Unexpected end of file from server).
We want to set something in the SAP MI client which says this data packet may be fragmented, so it can pass the network switches/routers/etc.
Can someone help us with this problem?
Thanks in advance and kind regards,
Bart Elshout

Similar Messages

  • Increase data package size

    Hi,
    I'm using a flat file datasource to load data from the BW server into a cube.  When I load a file containing 9000 records, the data package size is 1000.  I'm trying to improve the load time and would like to increase the data package size to 2000.
    I made an entry to specify 2000 records for my flat file source in SBIW -> Maintain Control Parameters for Data Transfer.  My file still loads with 1000 records in the data package. 
    I've also tried changing the setting on the infopackage itself, but it keeps telling me the max is 1000.
    Any suggestions?
    Thanks

    The settings you maintained in SBIW are for other BI systems, not for the current system; you maintain them in SBIW when you exchange data between your current BI system and another BI system.
    If you want to change the setting for the current system, you need to maintain it in SPRO, not SBIW. I can't tell you the exact navigation; it will be under "Links to Other Systems".
    If you are on BI 7.0, follow the navigation below:
    SPRO --> F5 --> SAP NetWeaver --> BI --> Links to Other Systems --> Maintain Control Parameters for Data Transfer.
    Let me know if you need any further information.
    Nagesh Ganisetti.

  • Data package size

    What is the basic difference between RSCUSTV6 and SBIW -> General settings -> Maintain Control Parameters in relation to modifying the data package settings?

    Hi,
    Just see the help on
    Maintain Control Parameters for Data Transfer:
    1. Source System
    Enter the logical system of your source client and assign the control parameters you selected to it.
    You can find further information on the source client in the source system by choosing the path
    Tools -> Administration -> Management -> Client Maintenance.
    2. Maximum Size of the Data Package
    When you transfer data into BW, the individual data records are sent in packages of variable size. You can use these parameters to control how large a typical data packet like this is.
    If no entry is maintained, the data is transferred with the default setting of 10,000 kBytes per data packet. The memory requirement depends not only on the data package settings, but also on the size of the transfer structure and the memory requirement of the relevant extractor.
    3. Maximum Number of Rows in a Data Package
    With large data packages, the memory requirement mainly depends on the number of data records that are transferred with the package. Use this parameter to control the maximum number of data records a data package should contain.
    By default a maximum of 100,000 records are transferred per data package.
    The maximum main memory requirement per data package is approximately 2 × Max. Rows × 1,000 bytes.
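    For example, with the default of 100,000 rows this works out to roughly 2 × 100,000 × 1,000 bytes, i.e. about 200 MB of main memory per data package.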
    4. Frequency
    The specified frequency determines after how many data IDocs an Info IDoc is sent, i.e. how many data IDocs an Info IDoc describes.
    Frequency 1 is set by default. This means that an Info IDoc follows every data IDoc. In general, you should select a frequency between 5 and 10, but no higher than 20.
    The bigger the data IDoc packets, the lower the frequency should be; this way, during an upload you obtain information on the progress of the data load at relatively short intervals.
    With the help of every Info IDoc, you can check the BW monitor to see if there are any errors in the loading process. If there are none, then the traffic light in the monitor will be green. The Info IDocs contain information such as whether the respective data IDocs were uploaded correctly.
    5. Maximum number of parallel processes for the data transfer
    An entry in this field is only relevant from release 3.1I onwards.
    Enter a number larger than 0. The maximum number of parallel processes is set by default at 2. The ideal parameter selection depends on the configuration of the application server, which you use for transferring data.
    6. Background job target system
    Enter the name of the application server on which the extraction job is to be processed.
    To determine the name of the application server, choose
    Tools -> Administration -> Monitor -> System monitoring -> Server. The name of the application server is displayed in the column Computer.
    7. Maximum Number of Data Packages in a Delta Request
    With this parameter, you can restrict the number of data packages in a delta request or in the repetition of a delta request.
    Only use this restriction when you expect delta requests with a very high data volume, so that, despite sufficiently large data package sizes, more than 1000 data packages can result in a request.
    With an initial value, or when the value is 0, there is no restriction; only a value larger than 0 restricts the number of data packages. For reasons of consistency, this number is generally not adhered to exactly: depending on how much the data is compressed in the qRFC queue, the actual number can deviate from the given limit by up to 100.
    RSA6:
    Used to change the data packet size.
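    These control parameters are stored in table ROIDOCPRMS on the source system. To quickly check what is currently maintained there, a small report along these lines lists the entries (a minimal sketch; the report name is made up):
    REPORT z_show_roidocprms.
    DATA: lt_prms TYPE STANDARD TABLE OF roidocprms,
          ls_prms TYPE roidocprms.
    * Control parameters maintained via SBIW land in table ROIDOCPRMS
    SELECT * FROM roidocprms INTO TABLE lt_prms.
    LOOP AT lt_prms INTO ls_prms.
      WRITE: / ls_prms-slogsys,  "logical source system
               ls_prms-maxsize,  "max. package size in kByte
               ls_prms-maxlines, "max. rows per package
               ls_prms-statfrqu, "Info IDoc frequency
               ls_prms-maxprocs. "max. parallel processes
    ENDLOOP.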
    Thanks
    Reddy
    Edited by: Surendra Reddy on Mar 12, 2010 6:27 AM

  • HT4623 I want to purchase an iPad 32GB in the larger size, but have read many bad reviews saying the wifi does not stay connected, and I don't want to have to buy another data package as I already own an iPhone that I pay data for; what are my concerns?

    I want to purchase an iPad 32GB in the larger size, but have read many bad reviews saying they have problems staying connected to wifi. I don't want to have to buy a data package, so I only want the wifi one. I already pay Apple for my iPhone data and I can't afford more money. Why are there so many bad reviews, and are the newer ones that much better? According to many reviews they are not. Please help.

    I already pay apple for my iphone data
    You pay your carrier for data, not Apple.
    Why are there so many bad reviews
    Where? Regardless, there are always going to be a handful of people having some sort of issue with any given device, but I would not therefore assume that you would be one of them.

  • Maximum package size for data packages was exceeded and Process terminated

    Hello Guru,
    When I execute the process chain I get the message "Maximum package size for data packages was exceeded" and the process terminates. Can anybody help me with how to proceed in this case?
    Thanks & Regards,
    Suresh.

    Hi,
    When the load is not getting processed due to a huge volume of data, or too many records per data packet, please try the options below.
    1) Reduce the IDoc size to 8000 and the number of data packets per IDoc to 10. This can be done in the InfoPackage settings.
    2) Run the load only to the PSA.
    3) Once the load is successful, push the data on to the targets.
    In this way you can overcome this issue.
    You can also try RSCUSTV* (where * is an integer) to change data load settings.
    To change the data package size for extraction, use transaction RSCUSTV6.
    To change the data package size for uploads from an R/3 system, set the value in R/3 Customizing: transaction SBIW --> General settings --> Maintain Control Parameters for Data Transfer (source system specific).
    Hope this helps.
    Thanks,
    JituK

  • "Maximum package size for data packages was exceded".

    Hi,
    We are getting the below error.
    "Maximum package size for data packages was exceded".
    In our scenario we are loading the data product key wise (which is a semantic key as well) to the DSO thro’ a start routine.
    The logic in the start routine is such a way that it calculates the unique product counts , product key wise. Hence we are trying to
    group  the product key thro’ semantic groups.
    Ex: In this example the product counts should be A = 1, B = 2, C = 1:
    Product Key | Products
    A           | 1000100
    B           | 2000100
    C           | 3000100
    B           | 2000300
    C           | 3000100
    For some product keys the data is so huge that we cannot load it, and we get the error.
    Please suggest any alternate way to handle this through code or by introducing another flow.
    Regards,
    Barla

    Hi,
    You can work around this by temporarily opening up the system setting for the data package size. As shown below, we created two programs, one to open the settings before the load and one to close (restore) them afterwards; both write the control parameters to table ROIDOCPRMS (the settings otherwise maintained via SBIW). The report names are just examples, and 'SYSTEM_CLIENT' is a placeholder for the logical system name of the source client.
    1 - open program:
    REPORT z_open_pack_settings.
    DATA: z_roidocprms TYPE STANDARD TABLE OF roidocprms,
          wa           TYPE roidocprms.
    wa-slogsys  = 'SYSTEM_CLIENT'.   "logical system of the source client
    wa-maxsize  = '50000'.
    wa-statfrqu = '10'.
    wa-maxprocs = '6'.
    wa-maxlines = '50000'.
    INSERT wa INTO TABLE z_roidocprms.
    MODIFY roidocprms FROM TABLE z_roidocprms.
    2 - close program (same structure; put your normal values back here):
    REPORT z_close_pack_settings.
    DATA: z_roidocprms TYPE STANDARD TABLE OF roidocprms,
          wa           TYPE roidocprms.
    wa-slogsys  = 'SYSTEM_CLIENT'.
    wa-maxsize  = '50000'.   "restore your standard value here
    wa-statfrqu = '10'.
    wa-maxprocs = '6'.
    wa-maxlines = '50000'.
    INSERT wa INTO TABLE z_roidocprms.
    MODIFY roidocprms FROM TABLE z_roidocprms.
    Maintain the data load InfoPackage settings accordingly, and build the process chain as:
    1 - open program
    2 - data load InfoPackage
    3 - close program
    This might fix the problem.
    Regards,
    Polu.

  • Maximum package size for data packages was exceeded?

    Hi Experts,
    I am facing the problem "Maximum package size for data packages was exceeded" when I try to load. I even tried reducing the data packet size and changing the DTP setting in Extraction to "Get All New Data Request by Request", but the same error still occurs. Can you please shed some light on this?
    Thanks,
    Krishna

    You can refer to the below OSS note:
    Note 1144332 - Consulting note: Message RSBK 250: Package size exceeded
    And other related notes: 352038, 417307
    Hope this helps.
    Murali

  • Restrict Data Package Size

    hi all,
    I have created a generic DataSource which extracts data using a function module.
    I want to restrict my data by packet size. How can I do this?
    My FM is:
    FUNCTION zstock_requirement.
    *"*"Local interface:
    *"  IMPORTING
    *"     VALUE(I_DSOURCE) TYPE  RSISOURCE OPTIONAL
    *"     VALUE(I_REQUNR) TYPE  RSREQUNR
    *"     VALUE(I_MAXSIZE) TYPE  RSMAXSIZE DEFAULT 1000
    *"     VALUE(I_INITFLAG) TYPE  RSINITFLG OPTIONAL
    *"     VALUE(I_UPDMODE) TYPE  RSUPDMODE OPTIONAL
    *"     VALUE(I_DATAPAKID) TYPE  RSDATAPID DEFAULT 50000
    *"     VALUE(I_READ_ONLY) TYPE  SBIW_BOOL DEFAULT SBIW_C_FALSE
    *"  TABLES
    *"      I_T_SELECT TYPE  SBIWA_T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SBIWA_T_FIELDS OPTIONAL
    *"      E_T_DATA STRUCTURE  MDEZ OPTIONAL
    *"  EXCEPTIONS
    *"      NO_MORE_DATA
    *"      ERROR_PASSED_TO_MESS_HANDLER

    * Interface buffer, package counter and cursor (template STATICS)
      STATICS: s_s_if TYPE srsc_s_if_simple,
    * counter
               s_counter_datapakid LIKE sy-tabix,
    * cursor
               s_cursor TYPE cursor.

      DATA: lt_mdezx TYPE TABLE OF mdez,
            la_mdezx LIKE LINE OF lt_mdezx.

    * Declarations for the material/plant selection (added here; types assumed)
      TYPES: BEGIN OF ty_matnr,
               matnr TYPE mara-matnr,
             END OF ty_matnr,
             BEGIN OF ty_input,
               matnr TYPE marc-matnr,
               werks TYPE marc-werks,
             END OF ty_input.
      DATA: i_matnr  TYPE STANDARD TABLE OF ty_matnr,
            i_input  TYPE STANDARD TABLE OF ty_input,
            wa_input TYPE ty_input.

    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls)?
      IF i_initflag = sbiwa_c_flag_on.
    * Initialization: check input parameters,
    *                 buffer input parameters,
    *                 prepare data selection
    * Check DataSource validity
        CASE i_dsource.
          WHEN 'ZSTOCK_REQUIREMENT1'.
          WHEN OTHERS.
            IF 1 = 2. MESSAGE e009(r3). ENDIF.
    * This is a typical log call; write every error message like this
    * (log_write is the logging macro from the SAPI extractor template)
            log_write 'E'                  "message type
                      'R3'                 "message class
                      '009'                "message number
                      i_dsource            "message variable 1
                      ' '.                 "message variable 2
            RAISE error_passed_to_mess_handler.
        ENDCASE.
        APPEND LINES OF i_t_select TO s_s_if-t_select.
    * Fill parameter buffer for data extraction calls
        s_s_if-requnr  = i_requnr.
        s_s_if-dsource = i_dsource.
        s_s_if-maxsize = i_maxsize.
    * Fill field list table for an optimized select statement
    * (in case there is no 1:1 relation between InfoSource fields
    * and database table fields this may be far from being trivial)
        APPEND LINES OF i_t_fields TO s_s_if-t_fields.
      ELSE.                 "Initialization mode or data extraction?
    * Data transfer: all data is delivered with the first package here
        IF s_counter_datapakid = 0.
          SELECT matnr FROM mara INTO TABLE i_matnr
            WHERE mtart IN ('ZFIN', 'ZRAW', 'ZSMI').
          IF NOT i_matnr IS INITIAL.
            SELECT matnr werks FROM marc INTO TABLE i_input
              FOR ALL ENTRIES IN i_matnr
              WHERE matnr = i_matnr-matnr.
          ENDIF.
          CLEAR wa_input.
          LOOP AT i_input INTO wa_input.
            CALL FUNCTION 'MD_STOCK_REQUIREMENTS_LIST_API'
              EXPORTING
    *           plscn                    =
                matnr                    = wa_input-matnr
                werks                    = wa_input-werks
    *           berid                    =
    *           ergbz                    =
    *           afibz                    =
    *           inper                    =
    *           display_list_mdpsx       =
    *           display_list_mdezx       =
    *           display_list_mdsux       =
    *           nobuf                    =
    *         IMPORTING
    *           e_mt61d                  =
    *           e_mdkp                   =
    *           e_cm61m                  =
    *           e_mdsta                  =
              TABLES
    *           mdpsx                    =
                mdezx                    = lt_mdezx
    *           mdsux                    =
              EXCEPTIONS
                material_plant_not_found = 1
                plant_not_found          = 2
                OTHERS                   = 3.
            IF sy-subrc <> 0.
              MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
                WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
            ENDIF.
            LOOP AT lt_mdezx INTO la_mdezx.
              APPEND la_mdezx TO e_t_data.
            ENDLOOP.
          ENDLOOP.
          s_counter_datapakid = s_counter_datapakid + 1.
        ELSE.
    * No more data after the first package: stop the SAPI from calling again
    * (added; without this the extraction would never terminate)
          RAISE no_more_data.
        ENDIF.
      ENDIF.              "Initialization mode or data extraction?
    ENDFUNCTION.

    Hi,
    One question: why do you want to restrict the data package size at all? If the DataSource is delta-enabled, you would run an initial load once and then only delta loads.
    Implementing data package handling, with cursor and fetch statements, in a function module is quite complex.
    Thanks,
    Debasish
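    For reference, the cursor-and-fetch pattern Debasish mentions is the one from the standard extractor template (function module RSAX_BIW_GET_DATA_SIMPLE): open a database cursor on the first call and fetch at most I_MAXSIZE rows on each call. A minimal sketch of the data-transfer branch, assuming a plain database selection (the MARC fields are only illustrative and must match your extract structure):
    * First call: open a cursor for the selection
    IF s_counter_datapakid = 0.
      OPEN CURSOR WITH HOLD s_cursor FOR
        SELECT matnr werks FROM marc.
    ENDIF.
    * Every call: fetch at most I_MAXSIZE records into the output table
    FETCH NEXT CURSOR s_cursor
      APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
      PACKAGE SIZE s_s_if-maxsize.
    IF sy-subrc <> 0.
    * Selection exhausted: close the cursor and end the extraction
      CLOSE CURSOR s_cursor.
      RAISE no_more_data.
    ENDIF.
    s_counter_datapakid = s_counter_datapakid + 1.
    Each call of the function module then returns at most I_MAXSIZE records, which is exactly how the SAPI enforces the data package size.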

  • Determining the size of the data package

    Hi all,
    In SAP NW BW 7.x with DTPs there is a simple way to set the size of the data packages (option: Package Size).
    How do you determine the package size in SAP BW 3.x, where there is no Package Size option? Maybe it is possible with a function module (DataSource in SAP R/3)? Or something else? I have a problem with that.
    Thanks for your response.

    In a start routine or field routine you can read the number of records in the source package like below:
    DATA: WA_SIZE TYPE SY-TFILL.
    DESCRIBE TABLE SOURCE_PACKAGE LINES WA_SIZE.
    WA_SIZE then holds the size, i.e. the number of entries in the package.
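    On newer ABAP releases, the built-in function lines( ) returns the same value in a single expression:
    WA_SIZE = lines( SOURCE_PACKAGE ).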
    Tx,
    SB

  • How do we control the data package size that comes into the DSO?

    Hi experts,
    I have this scenario:
    Initial information (numbers are not real):
    I have 10 contracts in CRM (One Order documents).
    Each contract when extracted becomes 50 records.
    Running BW 3.x
    (1) Now I start the data extraction in BW; I will receive 5 packets, split as follows:
    DP1: 100 records (contract 1 and 2)
    DP2: 100 records (contract 3 and 4)
    DP3: 50 records (contract 5)
    These records are stored in the PSA.
    (2) Then, it seems the system keeps the same package size and sends these DPs to the DSO as follows:
    DP1 -> 100 records -> DSO
    DP2 -> 100 records -> DSO
    DP3 -> 50 records -> DSO
    What I want:
    I have a special case and I want to be able to do the following, starting from (2).
    Instead of sending
    DP1 -> 100 records -> DSO
    DP2 -> 100 records -> DSO
    DP3 -> 50 records -> DSO
    I want to send:
    DP1 -> 10 records -> DSO
    DP2 -> 10 records -> DSO
    DP3 -> 10 records -> DSO
    ...
    DP25 -> 10 records -> DSO
    Do I have control over the data package size (number of records)?
    Can the DPs between DataSource <-> DSO be different from the ones between SourceSystem <-> DataSource?
    Can I even go further and do some kind of selection, to be able to send like the following:
    DP1 -> all records from item 01 to 10 of contract 1 -> DSO
    DP2 -> all records from item 11 to 20 of contract 1 -> DSO
    DP3 -> all records from item 01 to 10 of contract 2 -> DSO
    DP4 -> all records from item 11 to 20 of contract 2 -> DSO
    DPn -> all records from item 11 to 20 of contract 10 -> DSO
    Thanks!

    Hi,
    If you are using an InfoPackage, try the setting in the InfoPackage itself: in the Scheduler menu at the top, choose "DataS. Default Data Transfer", where you can change the package size of the data.
    If you are using a DTP, you can specify the package size on the Extraction tab.
    Hope this helps for you.
    Thanks,
    Arun

  • Impact of Changing Data Package Size with DTP

    Hi All,
    We have a delta DTP loading data from a DSO to an InfoCube. The default data package size of the DTP is 50,000 records.
    Due to the huge number of records, the internal table memory space is exhausted and the data load fails.
    We then changed the data package size to 10,000, and the load executes successfully.
    The DTP with a package size of 50,000 ran for 40 minutes and failed, while the DTP with a package size of 10,000 took 15 minutes (for the same amount of data).
    Please find my questions below:
    Why does a DTP with a bigger packet size run longer than a DTP with a smaller packet size?
    Also, by reducing the standard data package size from 50,000 to 10,000, will we impact any other data loads?
    Thanks

    Hi Sri,
    If your DTP is taking more time, check your transformation:
    1. Transformations with routines always take more time, so if you want to reduce the execution time the routines should be optimized for performance.
    2. Also check whether you have filters at DTP level; filters can make the DTP take a long time. If the same data is filtered at routine level instead, it takes much less time.
    3. If you cannot change the routine, you can set semantic keys on your DTP. The package data will be sorted by the semantic keys, which may help the routine process it faster.
    4. Your routine is failing due to internal table memory space, so check whether a SELECT statement in a routine is missing a FOR ALL ENTRIES IN RESULT_PACKAGE or SOURCE_PACKAGE clause; using it will reduce the record count.
    5. Wherever possible, delete duplicate records, and if possible filter out useless data in the start routine itself.
    6. Refresh internal tables once their data is no longer needed. If your tables are global, the data is kept across routine calls, so refreshing them helps to reduce memory usage.
    7. The maximum memory that can be occupied by an internal table (including its internal administration) is 2 gigabytes; a more realistic figure is up to 500 megabytes.
    8. Also check the number of jobs running at that time. If lots of jobs are active at the same time, less memory is available and the DTP may fail.
    Why does a DTP with a bigger packet size run longer than a DTP with a smaller packet size?
    Start and end routines work at package level, so the routine runs for each package one by one. By default, packages contain data sorted by the (non-unique) characteristic keys of the source or target; by setting semantic keys you can change this order. A package with more data therefore takes more time to process than a package with less data.
    By reducing the standard data package size from 50,000 to 10,000, will we impact any other data loads?
    It will only impact the running of that load. However, if lots of other loads run simultaneously, the server can allocate more space to them. So before reducing the package size, check whether it actually helps routine performance (start and end routines) or just adds overhead.
    Hope these points are helpful.
    Regards,
    Jaya Tiwari

  • Issue in Update routine due to Data Package

    We have a peculiar situation.
    The scenario:
    We have to load data from ODS1 to ODS2.
    The data package size is 9980 when transferring data from ODS1 to ODS2.
    In the update rule we have some calculations, and we rank the records based on them.
    The ODS key for both ODS1 and ODS2 is the same: Delivery Number, Delivery Item and Source System.
    For example, a delivery number has 12 delivery items, and these items end up in different data packages, namely data package 1 and data package 4.
    So instead of the ranks being 1 to 10, they are calculated as 1 to 5 for the first set of items and 1 to 5 again for the second.
    But what we require is ranks 1 to 10.
    This is due to the fact that the items are in different data packages; the ABAP routine itself works fine, the data package split is the problem.
    Can anybody suggest an alternative solution to this issue?
    Thanks in advance for your assistance.

    CODE FOR INTER-DATA-PACKAGE TREATMENT
    PROGRAM UPDATE_ROUTINE.
    *$$ begin of global - insert your declaration only below this line  -
    * TABLES: ...
    * DATA:   ...
    DATA: v_packet_nbr TYPE i VALUE 1.
    DATA:
      g_requnr  TYPE rsrequnr.
    DATA:
      l_is        TYPE string VALUE 'G_S_IS-RECNO',
      l_requnr    TYPE string VALUE 'G_S_MINFO-REQUNR'.
    FIELD-SYMBOLS: <g_f1> TYPE ANY,
                   <g_requnr> TYPE ANY.
    TYPES:
      BEGIN OF global_data_package.
            INCLUDE STRUCTURE /bic/cs8ydbim001.
    TYPES: recno   LIKE sy-tabix,
      END OF global_data_package.
    DATA lt_data_package_collect TYPE STANDARD TABLE OF global_data_package.
    DATA ls_datapack TYPE global_data_package.
    * Data package enhancement declaration
    TYPES: BEGIN OF datapak.
            INCLUDE STRUCTURE /bic/cs8ydbim001.
    TYPES: END OF datapak.
    DATA: datapak1 TYPE STANDARD TABLE OF datapak,
          wa_datapak1 LIKE LINE OF datapak1.
    * Declarations for the business rules implementation
    TYPES : BEGIN OF ty_ydbsdppx.
            INCLUDE STRUCTURE /bic/aydbsdppx00.
    TYPES: END OF ty_ydbsdppx.
    DATA : it_ydbsdppx TYPE STANDARD TABLE OF ty_ydbsdppx WITH HEADER LINE,
           wa_ydbsdppx TYPE ty_ydbsdppx,
           temp TYPE /bic/aydbim00100-price,
           lv_tabix TYPE sy-tabix.
    *$$ end of global - insert your declaration only before this line   -
    * The following definition is new in BW 3.x
    TYPES:
      BEGIN OF DATA_PACKAGE_STRUCTURE.
         INCLUDE STRUCTURE /BIC/CS8YDBIM001.
    TYPES:
         RECNO   LIKE sy-tabix,
      END OF DATA_PACKAGE_STRUCTURE.
    DATA:
      DATA_PACKAGE TYPE STANDARD TABLE OF DATA_PACKAGE_STRUCTURE
           WITH HEADER LINE
           WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 0.
    FORM startup
      TABLES   MONITOR STRUCTURE RSMONITOR "user defined monitoring
               MONITOR_RECNO STRUCTURE RSMONITORS "monitoring with record number
               DATA_PACKAGE STRUCTURE DATA_PACKAGE
      USING    RECORD_ALL LIKE SY-TABIX
               SOURCE_SYSTEM LIKE RSUPDSIMULH-LOGSYS
      CHANGING ABORT LIKE SY-SUBRC. "set ABORT <> 0 to cancel update
    *$$ begin of routine - insert your code only below this line        -
    * Fill the internal tables "MONITOR" and/or "MONITOR_RECNO"
    * to make monitor entries.
      TABLES: rsmonfact.
      TYPES:
        BEGIN OF ls_rsmonfact,
          dp_nr TYPE rsmonfact-dp_nr,
        END OF ls_rsmonfact.
      DATA: k TYPE i,
            v_lines_1 TYPE i,
            v_lines_2 TYPE i,
            v_packet_max TYPE i.
    * Declaration of internal tables
      DATA: it_rsmonfact TYPE STANDARD TABLE OF ls_rsmonfact.
    ****************** INTER-PACKAGE COLLECTION TREATMENT ******************
      ASSIGN (l_requnr) TO <g_requnr>.
    * Determine the total number of data packages in this request
      SELECT dp_nr FROM rsmonfact
        INTO TABLE it_rsmonfact
        WHERE rnr = <g_requnr>.
      DESCRIBE TABLE it_rsmonfact LINES v_packet_max.
      IF v_packet_nbr < v_packet_max.
    * Not the last package yet: collect it and hand over an empty package
        APPEND LINES OF DATA_PACKAGE[] TO lt_data_package_collect[].
        CLEAR: DATA_PACKAGE.
        REFRESH DATA_PACKAGE.
        v_packet_nbr = v_packet_nbr + 1.
        CLEAR: MONITOR[], MONITOR.
        MONITOR-msgid = '00'.
        MONITOR-msgty = 'I'.
        MONITOR-msgno = '398'.
        MONITOR-msgv1 = 'All data packages have been gathered in one. '.
        MONITOR-msgv2 = 'The last DATA_PACKAGE contains all records.'.
        APPEND MONITOR.
      ELSE.
    * Last data package => perform the business rules.
        IF v_packet_max > 1.
          APPEND LINES OF DATA_PACKAGE[] TO lt_data_package_collect[].
          CLEAR: DATA_PACKAGE[], DATA_PACKAGE.
          k = 1.
    * Put all collected packages back into DATA_PACKAGE, renumbering recno.
          LOOP AT lt_data_package_collect INTO ls_datapack.
            ls_datapack-recno = k.
            APPEND ls_datapack TO DATA_PACKAGE.
            k = k + 1.
          ENDLOOP.
          CLEAR : lt_data_package_collect.
          REFRESH : lt_data_package_collect.
        ENDIF.
    * Sort the global data package and keep only the first occurrence of
    * each record
        SORT DATA_PACKAGE BY material plant calmonth.
        DELETE ADJACENT DUPLICATES FROM DATA_PACKAGE
              COMPARING material plant calyear.
        SELECT * FROM /bic/aydbsdppx00
            INTO TABLE it_ydbsdppx
            FOR ALL ENTRIES IN DATA_PACKAGE
              WHERE material = DATA_PACKAGE-material
                AND plant    = DATA_PACKAGE-plant
                AND calyear  = DATA_PACKAGE-calyear.
    * Enhance DATA_PACKAGE with the target's additional fields.
        LOOP AT DATA_PACKAGE.
          CLEAR : wa_datapak1, wa_ydbsdppx.
          MOVE-CORRESPONDING DATA_PACKAGE TO wa_datapak1.
          READ TABLE it_ydbsdppx INTO wa_ydbsdppx
            WITH KEY material = DATA_PACKAGE-material
                        plant = DATA_PACKAGE-plant
                      calyear = DATA_PACKAGE-calyear.
          IF sy-subrc NE 0.       "new product price
            APPEND wa_datapak1 TO datapak1.
          ELSE.                   "a product price already exists
    * Keep the oldest one (for each year), or overwrite the price if same month
            IF wa_ydbsdppx-calmonth GE DATA_PACKAGE-calmonth.
              APPEND wa_datapak1 TO datapak1.
            ENDIF.
          ENDIF.
        ENDLOOP.
      ENDIF.
    * If ABORT is not equal to zero, the update process will be cancelled
      ABORT = 0.
    *$$ end of routine - insert your code only before this line         -
    ENDFORM.
    Edited by: mansi dandavate on Jun 17, 2010 12:32 PM

  • Data package is missing in the return structure

    Hi BW Folks,
    I have an issue with ODS activation. While activating the data in the ODS object I get the following error message:
    Activation of data records from ODS object XXXX terminated.
    Data package XXXXX contains errors with status 9 in table 'XX', but this data package is missing in the return structure.
    In detail: the data package is entered in the return structure as incorrect.
    Can anyone provide me with a solution? Thanks in advance. Have a nice time!
    Regards,
    Nani.

    Hi,
    Check these links:
    Re: Status 9 error when activating an ODS in a Process Chain
    ODS activation error - status 9
    Error while data loading - terminated with Status 9
    Hope it helps.
    Regards,
    CK
    Assign points if useful.

  • DTP does not fetch all records from Source, fetches only records in First Data Package.

    Fellas,
    I have a scenario in my BW system where I pull data from a source using a direct access DTP (it does not extract from the PSA; it extracts directly from the source).
    The source is a table in the Oracle DB; using a DataSource and a direct access DTP, I pull data from this table into my BW InfoCube.
    The DTP's package size is set to 100,000, and whenever this load is triggered, the data records from the source table are fetched in various data packages. This has been working fine and works fine now as well.
    But very rarely the DTP fetches 100,000 records in the first data package and fails to pull the remaining data records from the source.
    It ends with the message "No more data records found", even though we have records waiting to be pulled. The DTP step in the process chain does not even fail; it continues to the next step with a green status.
    Have you faced a similar situation in any of your systems? What is the cause? How can this be fixed?
    Thanks in advance for your help.
    Cheers
    Shiva

    Hello Raman & KV,
    Thanks for your suggestions.
    Unfortunately, I cannot implement any of them, because I am not allowed to change the DTP settings.
    So I am working on finding the root cause of this issue and came across SAP Note 1506944 - Only one package is always extracted during direct access, which says this is a program error.
    Hence, I am checking further with SAP and will share their insights once I hear back from them.
    Cheers
    Shiva

  • Same set of Records not in the same Data package of the extractor

    Hi All,
    I have the following scenario: while extracting records from ECC, based on some condition, I want to add more records to the extraction. To be more clear, based on some condition I want to add additional lines of data with APPEND c_t_data.
    For example:
    I have a set of records with the same company code, the same contract, the same delivery leg and different pricing legs.
    If the delivery leg and pricing leg are both 1, then I want to add one line of record.
    There will be several records with the same company code, contract, delivery leg and pricing leg. In the extraction logic I extract with i_t_data[] = c_t_data[], then sort by company code, contract, delivery leg and pricing leg, then DELETE ADJACENT DUPLICATES to get one record; based on this record, with some condition, I populate the new line of record that the business needs.
    My concern is:
    if the same set of records overshoots the data package size, how do I handle this? Is there any option?
    My data package size is 50,000. Suppose a matching set of records, i.e. the same company code, contract, delivery leg and pricing leg, starts at the 49,999th record, and there are 10 records with these characteristics: the extraction will then happen in 2 data packages, the delete-duplicates step runs per package, and the logic above goes wrong. How can I handle this scenario? Would a delta-enabled function module help me tackle this? I want to do it only in the extraction, as a DataSource enhancement.
    Anil.
    Edited by: Anil on Aug 29, 2010 5:56 AM

    Hi,
    You will have to enhance the DataSource.
    Please follow the link below; you can write your logic to add the additional records in the CASE statement for your DataSource:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c035c402-3d1a-2d10-4380-af8f26b5026f?quicklink=index&overridelayout=true
    Hope this will solve your issue.
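    As an illustration of that CASE statement, here is a minimal sketch of the transactional-data customer exit (EXIT_SAPLRSAP_001 of enhancement RSAP0001, include ZXRSAU01). The DataSource name, extract structure and field names below are hypothetical and must be replaced by your own objects:
    CASE i_datasource.
      WHEN 'ZCONTRACT_ITEMS'.                "hypothetical DataSource
        DATA: lt_data TYPE STANDARD TABLE OF zoxext0001, "hypothetical extract structure
              ls_data TYPE zoxext0001.
        lt_data[] = c_t_data[].
    *   Keep one record per company code / contract / delivery leg / pricing leg
        SORT lt_data BY bukrs contract del_leg price_leg.   "hypothetical fields
        DELETE ADJACENT DUPLICATES FROM lt_data
          COMPARING bukrs contract del_leg price_leg.
    *   Append the additional line the business needs
        LOOP AT lt_data INTO ls_data WHERE del_leg = 1 AND price_leg = 1.
          APPEND ls_data TO c_t_data.
        ENDLOOP.
    ENDCASE.
    Note that the exit is called once per data package, so this sketch alone does not solve the package-boundary problem described in the question.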
