Impact of Changing Data Package Size with DTP

Hi All,
We have a delta DTP that loads data from a DSO to an InfoCube. The default data package size of the DTP is 50,000 records.
Because of the huge volume of data, the internal table memory is exhausted and the data load fails.
We then changed the data package size to 10,000, and the data load ran successfully.
The DTP with a package size of 50,000 ran for 40 minutes and failed, while the DTP with a package size of 10,000 finished in 15 minutes (for the same amount of data).
Please find below my questions:
Why does a DTP with a bigger package size run longer than a DTP with a smaller package size?
Also, if we reduce the standard data package size from 50,000 to 10,000, will it impact any other data loading?
Thanks

Hi Sri,
If your DTP is taking more time, check your transformation.
1. Transformations with routines always take more time, so if you want to reduce the execution time, the routines should be optimized for performance.
2. Also check whether you have filters at DTP level. Filters can make the DTP run longer; if the same data is filtered in a routine instead, it often takes much less time.
3. If you cannot change the routine, you can set semantic keys in your DTP. The package data is then grouped and sorted by the semantic keys, which may help the routine process each package faster.
4. Your routine fails because of internal table memory, so check whether there is a SELECT statement in the routine without a FOR ALL ENTRIES IN RESULT_PACKAGE or SOURCE_PACKAGE clause. Using FOR ALL ENTRIES reduces the number of records read (see the sketch after this list).
5. Wherever possible, delete duplicate records, and filter useless data in the start routine itself.
6. Refresh internal tables once their data is no longer needed. If your tables are global, their contents stay allocated across every routine, so refreshing them helps to reduce the memory footprint.
7. The maximum memory that an internal table (including its internal administration) can occupy is 2 gigabytes; a more realistic figure is up to 500 megabytes.
8. Also check the number of jobs running at that time. If many jobs are active at the same time, less memory is available and the DTP may fail.
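To make points 4 to 6 concrete, here is a minimal sketch of a BI 7.x start routine body. It is only an illustration with made-up names: the lookup table /bic/azsales_o0100 and the fields doc_number and comp_code are placeholders, not objects from this thread.

* Sketch of a start routine body; replace the placeholder names.
    DATA: lt_lookup TYPE STANDARD TABLE OF /bic/azsales_o0100.

* Point 4: read only what the current package needs - FOR ALL ENTRIES
* keeps the lookup internal table small.
    IF source_package IS NOT INITIAL.
      SELECT * FROM /bic/azsales_o0100
             INTO TABLE lt_lookup
             FOR ALL ENTRIES IN source_package
             WHERE doc_number = source_package-doc_number.
      SORT lt_lookup BY doc_number.
    ENDIF.

* Point 5: drop records that will never be updated before they reach
* the end routine.
    DELETE source_package WHERE comp_code IS INITIAL.

* ... lookups and derivations on SOURCE_PACKAGE go here ...

* Point 6: release the memory once the package has been processed,
* especially if the table is declared globally.
    FREE lt_lookup.

Keeping such lookups scoped to the current package is usually what decides whether a 50,000-record package still fits into memory.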
Why does a DTP with a bigger package size run longer than a DTP with a smaller package size?
Start and end routines work at package level, so the routine runs for each package one after the other. By default a package contains data sorted by keys (non-unique keys, i.e. characteristics, of the source or target), and by setting semantic keys you can change this grouping. A package with more data therefore takes more time to process than a package with less data.
By reducing the standard data package size from 50,000 to 10,000, will it impact any other data loading?
It only affects the run of that load. However, if many other loads are running at the same time, the server can allocate more memory to them. So before reducing the package size, check whether it actually improves the routine performance (start and end) or just increases overhead.
Hope these points will be helpful .
Regards,
Jaya Tiwari

Similar Messages

  • Data Package Issue in DTP

    Hi gurus,
    My data flow is: DataSource -> InfoSource -> write-optimized DSO with semantic key.
    In the source I have 10 records, of which 7 are duplicates (the same set of keys).
    I reduced the DTP data package size from 50,000 to 5.
    When I executed the DTP, I got 2 data packages: the first data package contained all 7 records with the same set of keys, and the second data package contained the remaining records.
    My doubt is: I defined the data package size as "5", so how can the first data package hold 7 records instead of 5?
    Thanks in advance !

    Hi ,
    It is because of the semantic key setting that you have maintained. Data records that have the same key are combined into a single data package. This setting is only relevant for DataStore objects with data fields that are overwritten.
    Semantic groups define how the data packages that are read from the source (DataSource or InfoProvider) are built.
    This setting also defines the key fields for the error stack. By defining the key for the error stack, you ensure that the data can be updated in the target in the correct order once the incorrect data records have been corrected.
    Hope it helps .
    Thanks
    Kamal Mehta

  • How do we control the data package size that comes into the DSO?

    Hi experts,
    I have this scenario:
    Initial information (numbers are not real):
    I have 10 contracts in CRM (one order documents)
    Each contract when extracted becomes 50 records.
    Running BW 3.x
    (1) Now I start the data extraction in BW and receive data packages split like the following:
    DP1: 100 records (contract 1 and 2)
    DP2: 100 records (contract 3 and 4)
    DP3: 50 records (contract 5)
    These records are stored in the PSA.
    (2) Then it seems the system keeps the same package size and sends these DPs to the DSO as follows:
    DP1 -> 100 records -> DSO
    DP2 -> 100 records -> DSO
    DP3 -> 50 records -> DSO
    What i want:
    I have a special case and i want to be able to do the following starting from (2).
    Instead of sending
    DP1 -> 100 records -> DSO
    DP2 -> 100 records -> DSO
    DP3 -> 50 records -> DSO
    I want to send:
    DP1 -> 10 records -> DSO
    DP2 -> 10 records -> DSO
    DP3 -> 10 records -> DSO
    ...
    DP25 -> 10 records -> DSO
    Do I have control over the data package size (number of records)?
    Can the DPs between DataSource <-> DSO be different from the ones between SourceSystem <-> DataSource?
    Can I even go further and apply some kind of selection, so that I can send the following:
    DP1 -> all records from item 01 to 10 of contract 1 -> DSO
    DP2 -> all records from item 11 to 20 of contract 1 -> DSO
    DP3 -> all records from item 01 to 10 of contract 2 -> DSO
    DP4 -> all records from item 11 to 20 of contract 2 -> DSO
    DPn -> all records from item 11 to 20 of contract 10 -> DSO
    Thanks!

    Hi,
    If you are using an InfoPackage, try the setting in the InfoPackage itself: in the Scheduler menu at the top,
    choose "DataS. Default Data Transfer", where you can change the package size of the data.
    If you are using a DTP, you can specify the package size on the Extraction tab.
    Hope this helps.
    Thanks,
    Arun

  • Increase data package size

    Hi,
    I'm using a flat file datasource to load data from the BW server into a cube.  When I load a file containing 9000 records, the data package size is 1000.  I'm trying to improve the load time and would like to increase the data package size to 2000.
    I made an entry to specify 2000 records for my flat file source in SBIW -> Maintain Control Parameters for Data Transfer.  My file still loads with 1000 records in the data package. 
    I've also tried changing the setting on the infopackage itself, but it keeps telling me the max is 1000.
    Any suggestions?
    Thanks

    The settings you maintain in SBIW are for another BI system, not for the current system; you maintain them there when you exchange data between your current BI system and another BI system.
    If you want to change the settings for the current system, you need to maintain them in SPRO, not SBIW, under "Links to Other Systems".
    If you are on BI 7.0, follow this navigation:
    SPRO --> F5 --> SAP NetWeaver --> BI --> Links to Other Systems --> Maintain Control Parameters for Data Transfer.
    Let me know if you need any information.
    Nagesh Ganisetti.

  • Data package size

    What is the basic difference between RSCUSTV6 and SBIW -> General Settings -> Maintain Control Parameters for Data Transfer with respect to modifying the data package settings?

    Hi,
    Just see the help on
    Maintain Control Parameters for Data Transfer:
    1. Source System
    Enter the logical system of your source client and assign the control parameters you selected to it.
    You can find further information on the source client in the source system by choosing the path
    Tools -> Administration -> Management -> Client Maintenance.
    2. Maximum Size of the Data Package
    When you transfer data into BW, the individual data records are sent in packages of variable size. You can use these parameters to control how large a typical data packet like this is.
    If no entry was maintained then the data is transferred with a default setting of 10,000 kBytes per data packet. The memory requirement not only depends on the settings of the data package, but also on the size of the transfer structure and the memory requirement of the relevant extractor.
    3. Maximum Number of Rows in a Data Package
    With large data packages, the memory requirement mainly depends on the number of data records that are transferred in the package. Using this parameter you control the maximum number of data records that the data package should contain.
    By default a maximum of 100,000 records are transferred per data package.
    The maximum main memory requirement per data package is approximately 2 x 'Max. Rows' x 1000 bytes, i.e. roughly 200 MB for the default of 100,000 rows.
    4. Frequency
    The specified frequency determines after how many data IDocs an Info IDoc is sent, i.e. how many data IDocs one Info IDoc describes.
    Frequency 1 is set by default. This means that an Info IDoc follows every data IDoc. In general, you should select a frequency between 5 and 10, but no higher than 20.
    The bigger the data IDoc packages, the lower the frequency should be set. In this way, during an upload you obtain information about the data load at relatively short intervals.
    With the help of every Info IDoc, you can check the BW monitor to see if there are any errors in the loading process. If there are none, then the traffic light in the monitor will be green. The Info IDocs contain information such as whether the respective data IDocs were uploaded correctly.
    5. Maximum number of parallel processes for the data transfer
    An entry in this field is only relevant from release 3.1I onwards.
    Enter a number larger than 0. The maximum number of parallel processes is set by default at 2. The ideal parameter selection depends on the configuration of the application server, which you use for transferring data.
    6. Background job target system
    Enter the name of the application server on which the extraction job is to be processed.
    To determine the name of the application server, choose
    Tools -> Administration -> Monitor -> System monitoring -> Server. The name of the application server is displayed in the column Computer.
    7. Maximum Number of Data Packages in a Delta Request
    With this parameter, you can restrict the number of data packages in a delta request or in the repetition of a delta request.
    Only use this restriction when you expect delta requests with a very high data volume, so that, despite sufficiently large data package sizes, more than 1000 data packages can result in a request.
    With an initial value or when the value is 0, there is no restriction. Only a value larger than 0 leads to a restriction in the number of data packages. For reasons of consistency, this number is not generally exactly adhered to. The actual restriction can, depending on how much the data is compressed in the qRFC queue, deviate from the given limit by up to 100.
    RSA6:
    Used to change the Datapacket Size.
    Thanks
    Reddy
    Edited by: Surendra Reddy on Mar 12, 2010 6:27 AM

  • Data Package size will be determined dynamically.

    Dear SDNers,
    For some DTPs in my projects I have seen that the data package size in the DTP is determined dynamically. How do we get this?
    I am getting this message on the DTP -> Extraction tab:
    "The package size corresponds to the package size in the source.
    It is determined dynamically at runtime."
    Thanks,
    Swathi

    Hello,
    You would get this when semantic keys are not defined in the DTP.
    Regards..
    Balaji

  • Restrict Data Package Size

    hi all,
    I have created a generic DataSource which extracts data using a function module.
    I want to restrict my data by packet size. How can I do this?
    my FM is
    FUNCTION zstock_requirement.
    ""Local interface:
    *"  IMPORTING
    *"     VALUE(I_DSOURCE) TYPE  RSISOURCE OPTIONAL
    *"     VALUE(I_REQUNR) TYPE  RSREQUNR
    *"     VALUE(I_MAXSIZE) TYPE  RSMAXSIZE DEFAULT 1000
    *"     VALUE(I_INITFLAG) TYPE  RSINITFLG OPTIONAL
    *"     VALUE(I_UPDMODE) TYPE  RSUPDMODE OPTIONAL
    *"     VALUE(I_DATAPAKID) TYPE  RSDATAPID DEFAULT 50000
    *"     VALUE(I_READ_ONLY) TYPE  SBIW_BOOL DEFAULT SBIW_C_FALSE
    *"  TABLES
    *"      I_T_SELECT TYPE  SBIWA_T_SELECT OPTIONAL
    *"      I_T_FIELDS TYPE  SBIWA_T_FIELDS OPTIONAL
    *"      E_T_DATA STRUCTURE  MDEZ OPTIONAL
    * Maximum number of lines for DB table
      STATICS: s_s_if TYPE srsc_s_if_simple,
    * counter
               s_counter_datapakid LIKE sy-tabix,
    * cursor
               s_cursor TYPE cursor.
    * Local work data for the material/plant selection. These declarations are
    * assumed here; in the original post they were probably part of the
    * function group's TOP include.
      TYPES: BEGIN OF ty_matnr,
               matnr TYPE mara-matnr,
             END OF ty_matnr,
             BEGIN OF ty_input,
               matnr TYPE marc-matnr,
               werks TYPE marc-werks,
             END OF ty_input.
      DATA: lt_mdezx TYPE TABLE OF mdez,
            la_mdezx LIKE LINE OF lt_mdezx,
            i_matnr  TYPE TABLE OF ty_matnr,
            i_input  TYPE TABLE OF ty_input,
            wa_input TYPE ty_input.
    * Initialization mode (first call by SAPI) or data transfer mode
    * (following calls)?
      IF i_initflag = sbiwa_c_flag_on.
    * Initialization: check input parameters
    *                 buffer input parameters
    *                 prepare data selection
    * Check DataSource validity
        CASE i_dsource.
          WHEN 'ZSTOCK_REQUIREMENT1'.
          WHEN OTHERS.
            IF 1 = 2. MESSAGE e009(r3). ENDIF.
    * This is a typical log call (log_write is a macro from the standard
    * extractor template). Please write every error message like this:
            log_write 'E'                  "message type
                      'R3'                 "message class
                      '009'                "message number
                      i_dsource   "message variable 1
                      ' '.                 "message variable 2
            RAISE error_passed_to_mess_handler.
        ENDCASE.
        APPEND LINES OF i_t_select TO s_s_if-t_select.
    * Fill parameter buffer for data extraction calls
        s_s_if-requnr    = i_requnr.
        s_s_if-dsource = i_dsource.
        s_s_if-maxsize   = i_maxsize.
    * Fill field list table for an optimized select statement
    * (in case there is no 1:1 relation between InfoSource fields
    * and database table fields this may be far from being trivial)
        APPEND LINES OF i_t_fields TO s_s_if-t_fields.
      ELSE.                 "Initialization mode or data extraction ?
    * Data transfer: First Call      OPEN CURSOR + FETCH
    *                Following Calls FETCH only
    * First data package -> OPEN CURSOR
        IF s_counter_datapakid = 0.
          SELECT matnr FROM mara INTO TABLE i_matnr
                 WHERE mtart IN ('ZFIN', 'ZRAW', 'ZSMI').
          IF NOT i_matnr IS INITIAL.
            SELECT matnr werks FROM marc INTO TABLE i_input
                   FOR ALL ENTRIES IN i_matnr
                   WHERE matnr = i_matnr-matnr.
          ENDIF.
          CLEAR wa_input.
          LOOP AT i_input INTO wa_input.
    * Make sure the result table is empty before each API call.
            REFRESH lt_mdezx.
            CALL FUNCTION 'MD_STOCK_REQUIREMENTS_LIST_API'
              EXPORTING
    *           plscn                    =
                matnr                    = wa_input-matnr
                werks                    = wa_input-werks
    *           berid                    =
    *           ergbz                    =
    *           afibz                    =
    *           inper                    =
    *           display_list_mdpsx       =
    *           display_list_mdezx       =
    *           display_list_mdsux       =
    *           nobuf                    =
    *         IMPORTING
    *           e_mt61d                  =
    *           e_mdkp                   =
    *           e_cm61m                  =
    *           e_mdsta                  =
              TABLES
    *           mdpsx                    =
                mdezx                    = lt_mdezx
    *           mdsux                    =
              EXCEPTIONS
                material_plant_not_found = 1
                plant_not_found          = 2
                OTHERS                   = 3.
            IF sy-subrc <> 0.
              MESSAGE i000(abc) WITH 'information'.
    *         MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
    *                 WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
            ENDIF.
            LOOP AT lt_mdezx INTO la_mdezx.
              APPEND la_mdezx TO e_t_data.
            ENDLOOP.
          ENDLOOP.
          s_counter_datapakid = s_counter_datapakid + 1.
        ELSE.
    * All data was already returned with the first package, so end the
    * extraction here (this assumes the NO_MORE_DATA exception of the
    * standard extractor interface is defined for this function module).
          RAISE no_more_data.
        ENDIF.
      ENDIF.              "Initialization mode or data extraction ?
    ENDFUNCTION.

    Hi,
    One thing first: why do you want to restrict the data packages at all? If the DataSource is delta-enabled, you would run an init load and then delta loads anyway.
    It is quite complex to implement data packaging, i.e. the cursor and FETCH handling, in a function module.
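    For reference, the standard generic-extractor template (function module RSAX_BIW_GET_DATA_SIMPLE) shows how packaging is normally done when the data comes straight from one database table, which is exactly what the MD_STOCK_REQUIREMENTS_LIST_API scenario does not do, hence the complexity. A minimal sketch of that data-transfer branch, assuming the usual S_CURSOR / S_COUNTER_DATAPAKID statics and the NO_MORE_DATA exception, and with a placeholder SELECT:

    * Sketch only - the table and WHERE clause are placeholders.
        IF s_counter_datapakid = 0.
    * First call: open a database cursor for the requested data.
          OPEN CURSOR WITH HOLD s_cursor FOR
            SELECT * FROM mara
                   WHERE mtart IN ('ZFIN', 'ZRAW', 'ZSMI').
        ENDIF.
    * Every call: return at most I_MAXSIZE rows - this is what actually
    * restricts the package size.
        FETCH NEXT CURSOR s_cursor
              APPENDING CORRESPONDING FIELDS OF TABLE e_t_data
              PACKAGE SIZE s_s_if-maxsize.
        IF sy-subrc <> 0.
          CLOSE CURSOR s_cursor.
          RAISE no_more_data.
        ENDIF.
        s_counter_datapakid = s_counter_datapakid + 1.

    Since the stock/requirements data here comes from an API call per material/plant rather than from one table, you would have to build the packaging yourself, e.g. keep the selected MARC keys in a STATICS table and process only roughly I_MAXSIZE worth of them per call, which is why this is more work than usual.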
    Thanks,
    Debasish

  • BODI Change Data Capture DB400 with Attunity

    Is it really necessary to use the Attunity connector for capturing changes from DB400 with BO Data Integrator XI 3.1?
    I read the "Data Services Technical Manuals" document, and in the section on techniques for capturing changed data, only the Attunity connector is mentioned for DB400. Is there another connector that is free, or is Attunity really necessary?
    Thks
    SVidal

    There is no free CDC connector, i.e. none that is capable of reading the DB2 transaction logs and telling you exactly which row was touched.
    However, if you use CDC as a synonym for any delta-loading technique, just go for DB2 Connect and implement the delta logic yourself, e.g. where change_date >= $start_date.

  • Change data package

    Hi, I want to change the package assignment of a particular PSA DataSource. Can you please tell me the transaction code and what to do next to change the package to $TMP?

    Hi Srikar,
    Go to the Transport Connection in RSA1 and collect your DataSource. You will see a button in the menu bar that says Package; click on it and you will be led to the screen where you can change the package. There set it to $TMP.
    Regards.
    Message was edited by:
            Dash

  • Determine the count of data-packages in a dtp request.

    Hello,
    in a start routine I need to know the count (=highest number) of all data-packages in the current DTP-Request.
    I have had a look at table RSTSODSREQUESTPG; it contains exactly the fields I need,
    but I do not find every DTP request in this table.
    It looks to me as if this table is only filled in BW 3.5 but not in BI 7.0, or there is another reason why not every DTP request is stored in this table.
    Is there a new table to get the information ?
    Thanks
    Armin

    Hello,
    In the table RSBKDATAPAKID I found the field DATAPAKID, which is what I was looking for.
    But the field REQUID is only a 6-byte number, like 123456.
    In the start routine I only have a 30-byte field, like DTPR_4B34567890123456789012345.
    In the Administrator Workbench I can see that both numbers refer to the same request.
    Is there a mapping table?
    In the table RSDDSTATDTP the field INSTANCE seems to hold the 30-byte DTPR number,
    and a field DATAPAKID is also in this table.
    That looks good; I will try out whether it works.
    Thanks
    Armin
    Edited by: Armin Batzelt on Sep 17, 2008 4:44 PM
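    Going only by the fields named above (INSTANCE and DATAPAKID in RSDDSTATDTP), a lookup in a 7.x start routine might be sketched as below. This is unverified: statistics records are typically written only as the packages are processed, so during the load this may not yet show the final count, and whether RSDDSTATDTP is filled at all depends on the DTP statistics settings.

    * Rough sketch, based only on the fields described above - not verified.
    * REQUEST is the importing parameter of the start routine (the
    * 30-character DTPR_... value).
        DATA: lv_max_pakid TYPE rsdatapid.

        SELECT MAX( datapakid ) FROM rsddstatdtp
               INTO lv_max_pakid
               WHERE instance = request.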

  • Data package size-LAN and WAN

    HI Experts,
    Could anybody give explanation/Document for the  below query?
    When transferring data from R/3 to BW, how is the data package size determined? Would the size of the data packages be different over LAN and WAN? Why?
    Thanks
    Pradeep

    In transaction SBIW -> General Settings -> Maintain Control Parameters for Data Transfer (in the OLTP System), you can see and edit the default values for the source system.
    If you display the infopackage in the BW system and click on the menu option Scheduler -> DataS. Default Data Transfer, you will be able to edit the settings for the infopackage and also see what's the default configuration for the source system.
    You may also consult the following SAP Notes for more information (including exceptions):
    [417307 - Extractor package size: Collective note for applications.|https://websmp107.sap-ag.de/sap/support/notes/417307]
    [409641 - Examples of packet size dependency on ROIDOCPRMS.|https://websmp107.sap-ag.de/sap/support/notes/409641]
    I don't see what difference it could have over LAN or WAN, though...

  • Impact of changing data type of a field in Database table

    Hi All,
    I need to change the data type of a field (which is not a key field and has no dependency on any other field) from NUMC to CHAR while keeping the same length. Please let me know if there will be any impact in doing this. I believe the following things need to be taken care of:
    - Take backup of entire data in the table before doing the change
    - After changing the data type, I need to set the option 'Save Data' and then 'Activate and Adjust Database' in 'Database Utility'.
    - Use 'Where Used List' to check the related objects.
    Please let me know if there is any impact, or anything else that needs to be taken care of apart from the above.
    Thanks in advance.
    Regards
    Vidhya.

    Hi,
    If the length stays the same, there would be no impact.
    You just need to adjust the database table in the Database Utility (SE14).
    Also check the table maintenance generator; if the change is not reflected there, you need to delete it and create it again.
    reward if useful,
    teja

  • Changing Date and Time with dng files

    My camera settings for date and time were wrong.
    So some of my files in Lightroom 3 have the wrong dates.
    This morning I downloaded Photoinfo 2.0.1 but it doesn't seem to want to accept "dng" files to change the info.
    It will accept "jpeg" files no problem.
    Can anyone help me with this.
    Thanks

    Open the Catalog Settings panel (Lightroom menu on Mac, Edit menu on Windows) and configure the Metadata > EXIF options. Then select your images and choose Save Metadata to File (Ctrl/Cmd+S) from the Metadata menu (the same menu as Edit Capture Time). This will save the date/time changes back into the file rather than just the catalog.

  • Data Package Requred with 3G iPad?

    If I purchase a 3g iPad, is the data plan required? Or can I add that at a future date? I would like the flexibility of having 3g, but do not currently need it. Is the plan required at time of purchase??

    The 3G iPad does not require a data plan - or even a SIM - to work. I did the same (albeit with a 1st generation iPad), i.e. I bought it, used it for a while, and only then settled on a mobile provider and data plan.

  • Impact on changing DATE

    Gurus,
    Our production database server is currently set to the wrong time (7 minutes ahead). We are planning to sync it with the network time server. Can you tell me what the implications of doing this are? I am assuming this has to be done while the DB is down.
    Can someone point me to a note reference/websites etc?
    Thanks,
    Arun

    Check this Note from Oracle
    Because Oracle tracks the sequence of events in the database using the System
    Commit Number (SCN), changing the system clock for daylight savings time will
    have no effect on database operation. The only point where the time change
    can have potentially harmful effect is during time-based recovery.
    Time based recovery requires checking of the actual time the transaction was
    recorded in the logfile. Every log record has a time stamp associated with
    it. If the system manager for some reason changes the system clock, Oracle
    Support recommends shutting down the database and taking a cold backup ( or a
    hot backup if preferred). If for some reason a dba has to go back to a backup
    which was taken prior to the system clock change and rollforward, recovery
    works just fine except for time based recovery (Note that time based recovery
    works fine if the system clock is moved forward in time). When the system
    clock is changed backwards, it is possible that there could be two redo records
    with the same time stamp. If time based recovery is done in this scenario,
    since Oracle applies only redo entries that were written prior to a specified
    time, recovery stops when it finds the first redo record which has that
    specified time.
