Infopackage size

Hi,
In an InfoPackage, under Scheduler -> DataS. Default Data Transfer, we set the data packet size and the number of packets per info IDoc. If we take the default settings (data packet size = 1000, number of packets per info IDoc = 10) and we have some 10 records to be transferred, does that mean these 10 records get split into 10 packets, each packet containing 1 record? Please correct me if I am wrong...
What exactly is an info IDoc? How is it related to data transfer?
Regards,
Ram.

Hi Ram!
All is explained in the OSS Notes...anyway:
You can look up the TRANSFER STRUCTURE SIZE (extract structure) in table ROOSOURCE in the active version of the DataSource and determine its size via SE11 (DDIC) -> Utilities -> Runtime object -> Table length.
The system default values of 10,000 or 100,000 are valid for the parameters MAXSIZE and MAXLINES (see the F1 help for the corresponding fields in ROIDOCPRMS). You can overwrite these parameters in table ROIDOCPRMS on a system-specific basis via "Maintain General Settings for Extractors". In addition, there is the option of overriding these values per load: in the scheduler (InfoPackage -> Scheduler) you can only reduce the MAXSIZE.
The advantage of doing the maintenance via the scheduler is that the values are InfoSource-specific.
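To answer Ram's original question with the splitting logic above: records are grouped into packets up to the MAXSIZE/MAXLINES limit, so 10 records with a packet size of 1000 go into a single packet, not 10 packets. As I understand it, an info IDoc is a status IDoc that accompanies the data packets and reports load progress, one per N data packets. A rough sketch of the arithmetic (a simplified Python model, treating the packet size as a line limit; real packets are also bounded by MAXSIZE in kByte):

```python
import math

def split_into_packets(num_records: int, max_lines: int, packets_per_info_idoc: int):
    """Rough model: records are grouped into packets of up to max_lines,
    and one info (status) IDoc is sent per packets_per_info_idoc packets."""
    num_packets = math.ceil(num_records / max_lines) if num_records else 0
    num_info_idocs = math.ceil(num_packets / packets_per_info_idoc)
    return num_packets, num_info_idocs

# Ram's case: 10 records, packet size 1000, 10 packets per info IDoc
print(split_into_packets(10, 1000, 10))  # -> (1, 1): one packet, one info IDoc
```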
Hope now is clearer...
Bye,
Roberto

Similar Messages

  • Optimize HR extrators

    Hi,
We are currently activating the business content for HR Payroll (4.7) in BW (3.5). Having extracted 3 periods of data (26 in all), we are estimating the load time for 1 year to be around 3 months, which is unacceptable.
Has anyone else been faced with this situation, and were you able to significantly reduce the loading times (close to 75%) by customizing the extractor or applying notes?
    Thanks,
    Eric Dupont

    hi Eric,
    welcome to SDN ...
first make sure whether the performance problem is on the R/3 side - does a test extraction with RSA3 also take a long time?
- check the HR support package level applied in your R/3, and also the PI level; some extractions improve with program corrections, e.g. OSS note 889456 - Performance improvements for the 0HR_PY_PP_1 InfoSource, and 756333 - DS 0HR_PY_PP_1: Mass data problem / memory overflow
- general OSS note on extraction and loading: 567747 - Composite note BW 3.x performance: Extraction & loading
    I.    Extraction from the OLTP
    Note 417307: extractor packet size: Collective note for applications
- check if any steps can be taken to improve BW data loading performance, e.g. data transfer control parameters, InfoPackage size ...
    https://media.sdn.sap.com/asug/biti03/411.htm
    bw performance tuning centre
    Business Intelligence Performance Tuning [original link is broken]
also you can try debugging in R/3 with RSA3 with a trace on (ST05) and check whether retrieving from any table takes a long time; an additional index can be created on that table in consultation with your DBA/Basis team.
    hope this helps.

  • Change the packet size at infopackage level

    Hi, Experts:
All loads (full or delta) in the system split data into small packets except for one full load, which puts almost 2 million records into 1 packet. It overflows the memory and fails.
    I found the following settings for this datasource when click from menu "Scheduler"->"DataS. Default Data Target":
    "Maximum size of a data packet in kByte" = 20000
    "Number of Data Packets per info-IDoc" = 10
    "Number of Data Packages per Delta Request" = 0
    "Update Method" - full Upload
    It seems to me that the load should split data into small packets.
    Can someone please tell me why only this load still puts all 2 million records in one big packet?
    Thanks,
    Jenny

    Some update on more information:
The DataSource is 0FI_GL_8, a standard SAP DataSource for transaction data. I think it is an old DataSource because I cannot find any online documentation for its details on the SAP Help website. I was told that this DataSource is not delta-capable; that is why a full load is used.
    I also tried to reduce the package size settings for this datasource from "Scheduler"->"DataS. Default Data Target". But whatever I changed did not affect the way that this load is putting all records in one big data package.
    I assume the set up in Transaction RSCUSTV6 is for every infopackage across the whole system unless setup differently at infopackage level? I checked in RSCUSTV6, it is set up as the following:
    FrequencyStatus-IDOC = 10
    Package size = 30000
    Partition size = 1.000.000
    I also checked in ECC system with transaction SBIW for the control parameters for data transfer from the source system. It has the following set up for the source system:
    Max. (kB) = 20000
    Max. lines = 0
    Frequency = 10
    Max. Proc. = 3
    Max. DPs = 0
    From all the checked results, I still don't get why every other datasource loading can split data into small data packages, but only this one datasource can not. And why whatever I changed from the infopackage to reduce the size did not affect how it is splitting the data package.
    Any more inputs?
    Thanks,
    Jenny

  • InfoPackage packet size

    Hi all,
    I've seen several threads regarding the customizing of the InfoPackage packet size. Basically, it says that I have to change it in "InfoPackage under Scheduler->Data Default Transfer" but I can't see this option there.
    Is it only for the 3.5 version ? Does it work with a loading through a PSA ?
    Thx in advance...

These settings apply to the creation of data packets. Say you had 50,000 records coming in 50 packets with default settings of 20000 and 10. In that case there would have been 5 info IDocs.
Now if the settings are changed to, say, 10000 and 5, the number of data packets would become 100 and the info IDocs 20 (one for every 5 data packets).
The lower the size of the data packets, the higher the number of packets. All records still get extracted.
Update rules run for one data packet at a time. If you are getting memory or performance problems in the update rules, reducing the size of the data packet is an option, since it lowers the number of records processed per run of the update rules.
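The arithmetic in the example above can be sketched as follows (a simplified Python model that counts in records per packet; in the example, the 20000-kByte setting happens to yield about 1,000 records per packet, and real boundaries also depend on the record width):

```python
import math

def packets_and_idocs(total_records: int, records_per_packet: int, packets_per_idoc: int):
    """Model: number of packets = records / packet size (rounded up),
    info IDocs = one per packets_per_idoc data packets (rounded up)."""
    packets = math.ceil(total_records / records_per_packet)
    info_idocs = math.ceil(packets / packets_per_idoc)
    return packets, info_idocs

# The example's starting point: 50,000 records in 50 packets, 10 packets per info IDoc
print(packets_and_idocs(50_000, 1_000, 10))  # -> (50, 5)
# Halving packet size and IDoc frequency: packets double, info IDocs go from 5 to 20
print(packets_and_idocs(50_000, 500, 5))     # -> (100, 20)
```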

  • InfoPackages Packet Size TRANSPORT

    Thanks for all the people who answered my questions about the customizing of the InfoPackages Packet Size.
    I have another one for you : is this modification transportable ? Do I just have to transport the InfoPackage that I modified, or do I have to transport something else ? Or can I just change it directly in production (I don't really like that option)

Once you have transported an InfoPackage, it will prompt for a transport request the next time it is modified. Most likely the InfoPackage is currently in $TMP.
Go to the transport connection, select the InfoPackage there, change the development class, and transport it.
InfoPackages must be changeable in the production system, otherwise they cannot be scheduled. You can make these settings in the production system directly. If the production system is closed, then open InfoPackages for change via the transport connection --> Object Changeability.

  • Regrd the data packet size

    Hi
    i have two questions
1) where can i set the data packet size in BW and R/3, and also in BI - how and where can we set it?
2) by what method/logic do we select the key fields in a DSO?
ex: i have 5 tables in the source and each table has primary keys; now how do we know which primary keys should be kept in the KEY FIELDS folder in the DSO?
    full points will be assigned.

    HI,
Data package settings for the data to be extracted from R/3 to BI can be made through:
1) SBIW --> General Settings --> Maintain Control Parameters for Data Transfer
    These settings are common for all InfoPackages which extract data from R/3.
2) If you want to make settings relevant to a specific InfoPackage:
RSA1 --> click on the specific InfoPackage --> Scheduler (in the menu bar) --> DataS. Default Data Transfer.
3) And if you want to make DSO package settings:
go to transaction RSODSO_SETTINGS.
Here you can make package settings for DSO activation, parameters for SID generation, etc.
The selection of key fields depends upon the requirement:
based on the key fields, the corresponding data fields are overwritten or added.
    Regards,
    Chaitanya.

  • How do I change packet size for a single load

    All,
        I need to change the packet size for a single load I'm doing.  How do I set that?
        Thanks.
    Dave

    Hi..........
For this you have to go to the InfoPackage scheduler.
If you are in a process chain: right-click on the InfoPackage >> Maintain Variant; this brings you to the InfoPackage scheduler. In the Scheduler menu at the top, select DataS. Default Data Transfer. There may be both a delta InfoPackage and a full-upload InfoPackage for the same DataSource, so check whether your InfoPackage is full or delta and change the packet size for the matching upload mode.
If the InfoPackage is not part of any process chain, go to RSA1, find the InfoPackage, and double-click on it; the InfoPackage scheduler screen will open.
You can make global settings for all DataSources in transaction SBIW, in the area General Settings >> Maintain Control Parameters for Data Transfer. The parameters that you maintain here are valid as global standard values, and thus apply to all DataSources.
Hope this helps.
    Regards,
    Debjani........
    Edited by: Debjani  Mukherjee on Nov 13, 2008 2:19 PM

  • Number of Lines limited in InfoPackage Selection?

    Hello Experts!
I have 2 InfoPackages in order to load data from ODS1 into ODS2. Selection is made on customer number - the exact criteria are built via an ABAP routine in the InfoPackage. (InfoPackage 1 has all customers meeting a certain criterion - InfoPackage 2 has all the ranges in between the customer numbers of InfoPackage 1.)
Therefore the load of both InfoPackages should load all data - but the problem is, I only get a small part of the data. In ODS1 there are some 2,800,000 entries, and the load of the 2 InfoPackages into ODS2 only selects about 30,000 entries.
Since the selection criteria in monitoring seem fine - do you have any idea where the problem comes from? Is there a maximum number of lines for the selection in an InfoPackage?
    Thanks for your help!
    Angelika

    Hi Angelika,
check out the monitor --> in the request tree, check the numbers coming into the update rules and coming out of the update rules; maybe you are losing some records there. The only thing that I know about the size of selections is that you will get a dump if you have too many entries in a select-options table (and the lines in an InfoPackage are nothing else).
Another reason might be a missing alpha conversion of the entries in the InfoPackage.
    kind regards
    Siggi

  • First package size is bigger then others

    Hi,
I am using a DataSource based on a function module to extract data from R/3 to BW.
The extraction pulls the data correctly and everything is fine, except that the first data package is always about half the size of the whole load.
So if the total number of records in one of the deltas is 800,000, then the first data package will be of size 400,000, while the rest of the packages are of the equal, small size defined in the system settings.
I tried changing the settings in the InfoPackage, but it has no effect on this first data package and the issue remains.
We are using the same code as the standard SAP function module, and we pass the parameter MAXSIZE when opening the cursor on the desired table.
Can anyone point out how to reduce the size of the first data package?
    Thanks for the help
    Ajeet

    Dear Siggi,
You are right, we are pulling a lot of data during datapackid = 0.
I just wanted to know if we can do something so that the data is distributed to the other data packages equally.
    Thanks for your answer
    Ajeet

  • Transport the configuration in Infopackage

    Dear experts,
I have a question regarding the configuration in the InfoPackage. I would like to define the maximum size of a data packet for certain InfoPackages. But it seems that these settings are not transported from dev to test.
Is that true, or is there a problem in my transport order?
    Thanks in advance!

Yes, the data packet size settings of the InfoPackage do not get transported. You will have to make these settings in each of the systems.
    Regards

  • Info on package size

    Hi,
I would like to understand what this setting in the InfoPackage means:
Maximum size of a data packet in kByte = 50000?
How do I determine what package size is going to come from the source system? Is there any documentation or note that says what a suitable package size is?
Number of data packets per Info-IDoc = 10
How do records get divided per package? On what logic?
    I extracted the data into PSA.
    DP 1 = 8000 records
    DP2 = 9500
    DP3 = 8400.
The number of records is not the same in all the packages. How does it work?
    Thanks
    Annie

    The individual records are sent in packages of varying sizes in the data transfer to the Business Information Warehouse. Using these parameters you determine the maximum size of such a package and therefore how much of the main memory may be used for the creation of the data package.
    SAP recommends a data package size between 10 and 50 MB.
if you transfer a lot of fields, you can transfer fewer lines per package...
if you transfer fewer fields, you can transfer more lines...
    more info :
    http://help.sap.com/saphelp_nw70/helpdata/en/51/85d6cf842825469a51b9a666442339/content.htm
    M.
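Since the kByte limit caps memory use rather than a record count, the number of records per packet depends on the width of the transfer structure, which is why Annie's packages vary slightly in size. A back-of-the-envelope estimate (a Python sketch; the 512-byte record width is a made-up value for illustration, and 1 kByte is taken as 1024 bytes):

```python
def max_records_per_packet(maxsize_kbyte: int, record_bytes: int) -> int:
    """Estimate how many records fit in one data packet, given the
    MAXSIZE limit (in kByte) and the transfer structure length (in bytes)."""
    return (maxsize_kbyte * 1024) // record_bytes

# MAXSIZE = 50,000 kByte with a hypothetical 512-byte transfer structure
print(max_records_per_packet(50_000, 512))  # -> 100000 records per packet
```

A wider transfer structure (more fields) lowers this ceiling, matching the "more fields, fewer lines" rule of thumb above.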

  • Infopackage takes longer than usual

    Hi
I've got an InfoPackage that normally takes about 7 minutes to load.
But intermittently (e.g. once a week), the load takes more than
1 hour.
    May I know what is the possible cause for such problems?
    Rgds
    Rusyinni

1. Limit the number of data packages created for each load to between 1000 and 1500, as described in the SAP OSS note below:
892513 - Consulting: Performance: Loading data, no of pkg, req size
Note: 1000-1500 is the sum of data packages over all data targets, i.e. with three targets, 350-500 packages each.
How to limit the number of data packages for the delta load:
go to transaction RSA1, select your dedicated InfoPackage, then menu --> DataS. Default Data Transfer --> enter the maximum number of data packages to be created for each delta load.
2. Global level: use transaction RSCUSTV6 to change the configuration settings for all InfoPackages globally:
(i) Frequency with which status IDocs are sent: this determines how many data IDocs are described by one info IDoc - in simple terms, after how many data IDocs one info IDoc is sent. The larger the packet size for a data IDoc, the smaller you should set the frequency. By doing this, you make it possible to get information on the respective data load status at relatively short intervals when uploading data. You should choose a frequency between 5 and 10, but not greater than 20.
(ii) Data packet size (number of data records per package): the basic setting should be between 5000 and 20000, depending on how many data records you want to load.
Recommendation: you should not divide the quantity of data into too large a number of packets, since this reduces performance when uploading. The number of data packets should not be more than 100 per loading process.
(iii) ROIDOCPRMS: in order for the RSCUSTV6 settings to take effect, you also need to make the related settings in ROIDOCPRMS in the source system.
3. Increase the parallelization of the InfoPackage if possible:
go to transaction SBIW and increase the number of work processes which can be used.
Please also read OSS note:
595251 - Performance of data mart: Deactivating tRFC sending in BW
4. Our InfoPackage loads involve three message types: RSINFO, RSREQ, RSSEND, etc. There is a standard program, RSEOUT00, in the source system that is used to push IDocs to the OS layer. Therefore create variants for the above message types for this program in the source system, and also create a background job to run it every 30 minutes. You should not increase the frequency beyond this.
    Hope this helps .
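The guidelines quoted above (note 892513's 1000-1500 packages per load, and the 100-packets-per-process recommendation) can be turned into a quick sanity check for choosing a packet size. A sketch in Python; the 2,000,000-record delta is a hypothetical figure:

```python
import math

def needed_packet_size(total_records: int, max_packets: int) -> int:
    """Smallest records-per-packet value that keeps a load within
    a recommended packet-count ceiling."""
    return math.ceil(total_records / max_packets)

# Hypothetical delta of 2,000,000 records, targeting at most 100 packets
size = needed_packet_size(2_000_000, 100)
print(size)                         # -> 20000 records per packet
print(math.ceil(2_000_000 / size))  # -> 100 packets
```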

  • Problems for Infopackage setting for Hierarchy Load

    Hello Masters,
I have a problem editing the settings for a hierarchy InfoPackage. I was asked to reset the processing mode of the InfoPackage to load directly into the InfoObject, instead of into PSA and then into subsequent data targets.
But I am unable to edit the setting because everything is greyed out. The only option selected is "Only PSA, and then Update in Subsequent Data Targets".
My question: is it a mandatory setting for a hierarchy InfoPackage to update into PSA and then into data targets?
I was able to edit the settings for attribute and text InfoPackages, even though those are in process chains.
Could anyone please help - is it possible to edit the hierarchy InfoPackage instead of creating a new one?
    Thank you
    Regards
    Raghu

    Hi,
I have checked the transfer rules; the transfer method used is PSA. I have to reset this InfoPackage to load directly into the InfoObject, because this process chain runs every four hours and it grows the PSA size enormously.
Hence I was asked to reset the setting for the InfoPackage, but it is greyed out.
    Thank you,
    Regards
    Raghu
    Edited by: raghuram alamuri on Jul 16, 2009 3:29 PM

  • Setting up datapackage size

    Gurus,
In which places can we set the data package size? I have set the data package size in RSCUSTV6, but smaller package sizes are still being used in extraction and processing. Also, if we set different numbers in different places, which has higher priority when deciding the package size? Appreciate your help.
    Thanks,
    Venkat

    Hi,
Generally, the maximum size of a data packet in kByte is limited to 20000 by default.
Setting for one DataSource:
this setting can be changed in the InfoPackage.
Go to the Scheduler menu -> DataS. Default Data Transfer -> column Maximum Size, and enter the size you want for one data package.
Setting for all the DataSources in the system:
same as mentioned by Jorge.
    let me know if you have any doubt
    Regards,
    Viren
    Edited by: Viren Devi on Apr 2, 2009 12:07 AM

  • About maxlines for infopackage

Hi, I changed MAXLINES for the InfoPackage in the OLTP via transaction SBIW and then created the InfoPackage in SAP BI 7, but the parameters I set are not reflected. Why?
    Any idea ?

    Hi
The settings are for extraction. For example, if you set the control parameter for the data package to 20,000 KB, then during extraction each data package will be limited to a size of 20,000 KB; this also reduces the number of data packages by avoiding very small ones. When you schedule your InfoPackage, each data package will then contain up to 20,000 KB worth of data, and the request will contain more than one data package if there is data beyond 20,000 KB.
Each client of a system is a separate source system to BW, and in order to extract the data of all clients to BW, each client has to be added as a source system. To control the data packages you then also need to maintain the general settings for each source system, and so for each client.
These settings are separate from the InfoPackage settings.
    Hope it helps and clear
