Max Package Size - Uses

Hi,
Currently our CRM middleware has the MAX_PACKAGE_SIZE set to 50 in the table SMOFPARSFA (transaction R3AC6).
If we increase the size to 500, will it help reduce the day-to-day volume of messages created in the outbound queues for mobile clients?
thanks

Hello Mahesh,
MAX_PACKAGE_SIZE settings are relevant only for initial loads and not for deltas.
You may refer to SAP Note 874446 for more info on MAX_PACKAGE_SIZE.
Increasing it up to 1000 will definitely improve initial load performance.
Thanks,
Rohit

Similar Messages

  • I'm using mac OSX 10.9.5 to write to NAS, I need to decrease the SMB max MTU size used by the OSX

    I'm using mac OSX 10.9.5 to write to NAS, I need to decrease the SMB max MTU size used by the OSX

    By default, the OS X Mavericks 10.9.5 network MTU is set to 1500. If you run the following against your SMB server with an MTU of 1500, you will likely receive "Message too long", and the SMB share takes a very long while to mount. This might be the contributing factor that has you believe it is an SMB MTU configuration issue.
    ping -c 1 -D -s 1500 smbserver
    If you dial the network MTU down in a custom setting, the ping is immediate and the SMB share mounts briskly. I tried values in small decrements until the "Message too long" error ceased.
    ping -c 1 -D -s 1472 smbserver
    I have done some searching and I do not see how, in Apple's current SMB implementation, one can configure /etc/nsmb.conf to reduce the SMB MTU size. See man nsmb.conf.
    More on this subject.

  • SharePoint - Error_1_Error occurred in deployment step 'Add Solution': Timeout expired. The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was rea

    Hi,
    I am Shanmugavel, SharePoint developer, 
    I am facing the below SharePoint 2013 deployment issue while deploying using VS2012.
    If I deploy the same WSP or an existing WSP (last build) using direct PowerShell deployment, the solution is added properly, but the same timeout exception occurs while activating the features. Please find the error below.
    I tried the following activities:
    1. Restarted my dev server and DB server.
    2. Tried the same solution on a different server.
    3. Tried an existing WSP file (last build version).
    4. Deactivated all the features, including the project's active deployment configuration, but I am still facing the same issue.
    I hope this is not a code-level issue, because my code has not even started running; the problem occurs before that.
    Please help, anyone. I have been stuck on this for the last two days.

    What you need to understand is that the installation of a WSP does not do much. It just makes sure that your solution files are deployed to the SharePoint farm.
    Next comes the point when you activate the features. That is when the code you have written to "activate" certain features of your custom solution actually runs.
    Regarding the error you are getting, it typically means that you have more connections open to a SQL database (the default limit is, I guess, 100) than you are allowed.
    If you have a custom database and you are opening a connection, make sure you close it as well.
    Look at the similar discussion here:
    The timeout period elapsed prior to obtaining a connection from the pool. This may have occurred because all pooled connections were in use and max pool size was reached
    I would further suggest looking at the ULS logs to get better insight.
    Manas Bhardwaj's Stream : www.manasbhardwaj.net

  • Select command using package size

    Hi all,
    I have to fetch matching records by comparing the records from two database tables, for which I am using:
    select * from dbtab1 into table  itab1 package size n
    select * from dbtab2 into itab2
    where field = itab1-field.
    endselect.
    The outer SELECT loop is executed once; the second time around it gives a dump.
    Please help.
    regards,
    Gayatri

    Hi Gayatri,
    You can try the following code:
    DATA: l_itab1 LIKE itab1 OCCURS 0 WITH HEADER LINE.
    First select the data with the first SELECT statement, then run the second SELECT outside the first SELECT ... ENDSELECT loop.
    SELECT *
        FROM dbtab1
        INTO TABLE itab1
        PACKAGE SIZE n.
    ENDSELECT.
    IF NOT itab1[] IS INITIAL.
      SORT itab1 BY field.
      l_itab1[] = itab1[].
      SELECT *
        FROM dbtab2
        INTO TABLE itab2
        FOR ALL ENTRIES IN l_itab1
        WHERE field = l_itab1-field.
    ENDIF.
    Hope this helps you.
    Regards,
    Anil
    Edited by: Anil Salekar on Feb 23, 2009 11:45 AM

  • I'm using DVD Studio pro to burn a dvd but it won't let me cause I keep getting a "Formatting Failed" error message.  It says "Formatting was not successful. Layer 0 exceeds max layer size allowed. Choose a suitable marker location." What does this mean?

    I'm using DVD Studio pro to burn a dvd but it won't let me cause I keep getting a "Formatting Failed" error message.  It says "Formatting was not successful. Layer 0 exceeds max layer size allowed. Choose a suitable marker location that will support this condition." What does this mean?
    Kris

    It means your file is too large to fit on a single-layer disc.
    Recompress to keep the overall size (audio, video and menus) below 4.5 GB.
    x

  • Shipping Options Packaging Type ? Can I customize my options for package sizes?  Tied with UPS but use own package sizing.

    My shipping department uses their own packaging for our products.  Currently I cannot find an option in the shipping options to tie in my package sizes to the shipping packaging options.

  • Min max lot size time based for use with SNP and PPDS

    Hi all, is there any way to set up time-based min and max lot sizes? I.e. we want a small max lot size for the first 3 months of our plan (which starts life in SNP, then converts to PPDS and into blocks for block planning), but for months 4 to 36 we want to leave large max lot sizes, as there is no need for the small max lot size in that part of the horizon.
    As far as I can see there is only a material/plant lot size and max lot size, with no way to have a different setting in a different time period.
    Thanks
    j

    Hi John,
    As you know, in the product master the lot size maintenance is time-independent, so that obviously cannot be used for your scenario. As per my understanding, to meet this with standard functionality you can maintain multiple product-specific t-lanes (for STRs) and PDSs (for planned orders) with the required lot size ranges and validity dates (for the short- or long-term horizon). But again, since the validity of t-lanes and PDSs does not automatically roll forward, updating the validities will be a challenge.
    The other option could be to enhance the heuristic functionality at the lot size selection step while creating orders.
    Regards,
    Umesh

  • Problem increasing heap size using Java Control Panel

    I am running a web-based bioinformatics package which uses a Java applet as the client application. It mostly works fine, but some large clustering processes don't complete, they just cycle endlessly. The user manual advises increasing heap space "by entering the following command in the Java Control Panel or Java Plug-in dialog: -Xms128M -Xmx256M." I can open the Java Control Panel OK and enter the necessary command in the Java Runtime Parameters box, but after clicking OK then Apply, the Runtime Parameters box is empty again and there is no increase in the heap size allocation. In other words, the change doesn't take effect.
    I have already tried: adjusting one parameter at a time (max heap size first, then min); changing uppercase to lowercase M (-Xmx256m); restarting the browser after the change; rebooting my PC - none of these has helped.
    Technical details: it's Java 6 standard edition, build 1.6.0_10-rc-b28; browser is Internet Explorer 7; operating system is Vista (Service Pack 1); PC is a 2.4 GHz processor with 2 GB RAM (so I have plenty available to allocate to Java). I have local admin privilege on my PC.
    Help would be much appreciated, as I really need to complete these cluster analyses. I'm a biologist not a developer, so ideally I need a solution that doesn't involve major programming, though I'm OK with registry edits and use of the command line. Thanks in advance.

    The failure to retain the parameters is a bug, per hdong in this [Java.net thread|http://forums.java.net/jive/thread.jspa?threadID=44540&tstart=0] dated August 14:
    "I did a little more research and found out that the problem is only with "-rc" or "-beta" versions of the plugin. When 1.6.0_10 is finally released, the version string will be "1.6.0_10". The plugin works fine to set/get JVM parameters when the version string is "1.6.0_10".
    I am surprised that this issue was not discovered earlier. It was probably because the internal test binary uses "1.6.0_10" as the version string.
    Yeah, it is a bug that will affect many, many developers. It will be fixed as soon as possible."

  • Increase data package size

    Hi,
    I'm using a flat file datasource to load data from the BW server into a cube.  When I load a file containing 9000 records, the data package size is 1000.  I'm trying to improve the load time and would like to increase the data package size to 2000.
    I made an entry to specify 2000 records for my flat file source in SBIW -> Maintain Control Parameters for Data Transfer.  My file still loads with 1000 records in the data package. 
    I've also tried changing the setting on the infopackage itself, but it keeps telling me the max is 1000.
    Any suggestions?
    Thanks

    The settings you maintained in SBIW are for other BI systems, not for the current system. When you exchange data between your current BI system and another BI system, you need to maintain them in SBIW.
    If you want to change it for the current system, you need to maintain it in SPRO, not SBIW. I can't tell you the exact navigation; it will be under "Links to Other Systems".
    If you are on BI 7.0, follow the navigation below:
    SPRO --> F5 --> SAP NetWeaver --> BI --> Links to Other Systems --> Maintain Control Parameters for Data Transfer.
    Let me know if you need any information.
    Nagesh Ganisetti.

  • Data package size

    What is the basic difference between RSCUSTV6 and SBIW -> General Settings -> Maintain Control Parameters with regard to modifying the data package size?

    Hi,
    Just see the help on
    Maintain Control Parameters for Data Transfer:
    1. Source System
    Enter the logical system of your source client and assign the control parameters you selected to it.
    You can find further information on the source client in the source system by choosing the path
    Tools -> Administration -> Management -> Client Maintenance.
    2. Maximum Size of the Data Package
    When you transfer data into BW, the individual data records are sent in packages of variable size. You can use these parameters to control how large a typical data packet like this is.
    If no entry was maintained then the data is transferred with a default setting of 10,000 kBytes per data packet. The memory requirement not only depends on the settings of the data package, but also on the size of the transfer structure and the memory requirement of the relevant extractor.
    3. Maximum Number of Rows in a Data Package
    With large data packages, the memory requirement mainly depends on the number of data records that are transferred with this package. Using this parameter you control the maximum number of data records that the data package should contain.
    By default a maximum of 100,000 records are transferred per data package.
    The maximum main memory requirement per data package is approximately 2 × Max. Rows × 1000 bytes.
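    With the default maximum of 100,000 rows, for example, this works out to roughly 2 × 100,000 × 1,000 bytes, i.e. about 200 MB of main memory per data package.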
    4. Frequency
    The specified frequency determines the number of data IDocs after which an Info IDoc is sent, i.e. how many data IDocs an Info IDoc describes.
    Frequency 1 is set by default. This means that an Info IDoc follows every data IDoc. In general, you should select a frequency between 5 and 10, but no higher than 20.
    The bigger the data IDoc packets, the lower the frequency setting should be. In this way, when you upload, you obtain information on the progress of the data load in relatively short spans of time.
    With the help of every Info IDoc, you can check the BW monitor to see if there are any errors in the loading process. If there are none, then the traffic light in the monitor will be green. The Info IDocs contain information such as whether the respective data IDocs were uploaded correctly.
    5. Maximum number of parallel processes for the data transfer
    An entry in this field is only relevant from release 3.1I onwards.
    Enter a number larger than 0. The maximum number of parallel processes is set by default at 2. The ideal parameter selection depends on the configuration of the application server, which you use for transferring data.
    6. Background job target system
    Enter the name of the application server on which the extraction job is to be processed.
    To determine the name of the application server, choose
    Tools -> Administration -> Monitor -> System monitoring -> Server. The name of the application server is displayed in the column Computer.
    7. Maximum Number of Data Packages in a Delta Request
    With this parameter, you can restrict the number of data packages in a delta request or in the repetition of a delta request.
    Only use this restriction when you expect delta requests with a very high data volume, so that, despite sufficiently large data package sizes, more than 1000 data packages can result in a request.
    With an initial value or a value of 0, there is no restriction. Only a value larger than 0 leads to a restriction in the number of data packages. For reasons of consistency, this number is not generally adhered to exactly. The actual restriction can, depending on how much the data is compressed in the qRFC queue, deviate from the given limit by up to 100.
    RSA6:
    Used to change the Datapacket Size.
    Thanks
    Reddy
    Edited by: Surendra Reddy on Mar 12, 2010 6:27 AM

  • Will it improve performance if PACKAGE SIZE added to a select query?

    Hi Gurus
    While doing performance optimization I could see that the code splits the selected BKPF entries into packages of 500 and then selects from the BSEG cluster table. In effect the remaining tables are also hit based on the number of entries (packages) in the selected BKPF internal table.
    Can't we use PACKAGE SIZE 500 in the SELECT statement instead of this splitting and hitting the tables in a DO loop?
    E.g.:
          SELECT x  v z
            FROM bseg
            APPENDING TABLE lt_bseg
            PACKAGE SIZE 500
            FOR ALL ENTRIES IN lt_coep
            WHERE bukrs = lt_coep-bukrs
            AND   belnr = lt_coep-belnr_fi
            AND   gjahr = lt_coep-gjahr.
    When I read the keyword documentation I saw a note saying: "If the addition PACKAGE SIZE is specified together with FOR ALL ENTRIES, it is not passed to the database system, but is applied to the result set on the application server, after all selected rows have been read." I did not understand the meaning of this.
    So I am confused whether I can go ahead with this approach or stay with the existing BKPF split logic and select from BSEG and the other tables in a DO loop.
    Can you please help me << removed >>.
    Thanks & Regards,
    Shinimol.
    Edited by: Rob Burbank on Sep 23, 2010 9:09 AM

    Hi,
    regarding your second doubt:
    > Second doubt is on the note I saw in the keyword documentation: "If the addition PACKAGE SIZE is specified together with FOR ALL ENTRIES, it is not passed to the database system, but is applied to the result set on the application server, after all selected rows have been read." I couldn't understand this. When I debugged the select with F5 it was selecting 500 rows each time until it reached the end.
    > So what does it mean, can we use this option or should we not use it?
    The FAE does your select in pieces and collects the result in an intermediate buffer in the database interface, where it also eliminates the duplicates. AFTER this it gives the number of rows specified with PACKAGE SIZE back to your ABAP program. It simply means that the PACKAGE SIZE addition has no effect on the memory consumption of the intermediate buffer in the database interface.
    Kind regards,
    Hermann
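    To make Hermann's point concrete, here is a minimal sketch (not from this thread) of the manual split approach: package the driver table yourself and drop PACKAGE SIZE from the FOR ALL ENTRIES SELECT, so each SELECT only ever works on one slice of the driver table. The names lt_coep, lt_bseg and the field list x v z are taken from the question's snippet; lt_coep_part, lv_from, lv_to, lv_lines and the 500-row slice are illustrative assumptions.
    DATA: lt_coep_part LIKE lt_coep,
          lv_from      TYPE i VALUE 1,
          lv_to        TYPE i,
          lv_lines     TYPE i.
    lv_lines = lines( lt_coep ).
    WHILE lv_from <= lv_lines.
      " determine the next slice of up to 500 driver rows (names are illustrative)
      lv_to = lv_from + 499.
      IF lv_to > lv_lines.
        lv_to = lv_lines.
      ENDIF.
      CLEAR lt_coep_part.
      APPEND LINES OF lt_coep FROM lv_from TO lv_to TO lt_coep_part.
      IF NOT lt_coep_part[] IS INITIAL.
        " FOR ALL ENTRIES without PACKAGE SIZE, driven by the current slice only
        SELECT x v z
          FROM bseg
          APPENDING TABLE lt_bseg
          FOR ALL ENTRIES IN lt_coep_part
          WHERE bukrs = lt_coep_part-bukrs
            AND belnr = lt_coep_part-belnr_fi
            AND gjahr = lt_coep_part-gjahr.
      ENDIF.
      lv_from = lv_to + 1.
    ENDWHILE.
    Note that, unlike a single FOR ALL ENTRIES over the whole driver table, duplicates are only eliminated within each slice here, so deduplicate lt_coep (or lt_bseg afterwards) if that matters for your logic.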

  • "max-pool-size"   what is it good for?

    SCreator simple CRUD use:
    After a while I get:
    " Error in allocating a connection. Cause: In-use connections equal max-pool-size and expired max-wait-time. Cannot allocate more connection"
    Which is odd, because it's just me using the server/database. It looks like every time I run a test, another connection is lost.
    Do I have to restart the server? Is there a way to say "it's only me, reuse a single connection"?
    Why does "connection pooling" make life harder?
    Can I turn it off?
    cheers
    cts

    I got the same error in my JSC project. I searched for a few days and found the solution: I had made a mistake in my page navigation. I forgot a slash in <to-view-id>.
    A bad example:
    <navigation-rule>
      <from-view-id>/*</from-view-id>
      <navigation-case>
        <from-outcome>page13</from-outcome>
        <to-view-id>page13.jsp</to-view-id>
      </navigation-case>
    </navigation-rule>
    A good example:
    <navigation-rule>
      <from-view-id>/*</from-view-id>
      <navigation-case>
        <from-outcome>page13</from-outcome>
        <to-view-id>/page13.jsp</to-view-id>
      </navigation-case>
    </navigation-rule>
    With this mistake, afterRenderedResponse() was never called, and the ResultRowSet was never closed.
    Korbben.

  • Need info about max HDD size available for Satellite Pro M30-813

    Hello,
    The following question is mainly to be addressed to authorized Toshiba support personnel. What exactly is the limitation of a maximum size of an internal HDD that I could use with my Satellite Pro M30-813?
    Recently, I bought and installed a Seagate 160 GB SATA drive, onto which I successfully installed WXP Pro and have been running it for quite a while with no problems. Recently, I was copying a large amount of data from an external hard drive to my new internal disk, and as the files were being copied, with about 50 GB of free space left, I experienced Windows "delayed write failed" errors and a massive partition failure with no possibility of recovering the data. The system would no longer boot and the whole MBR was damaged. As a result, I have lost all data on my new disk.
    Although I realize that Toshiba is not responsible for additional hardware that I use with my laptop and that is not officially supported by Toshiba, I am certain that as an end user of a Toshiba product I have the right to know the max HDD size limitation for my notebook model. Therefore, I request a Toshiba technical support representative to give me a straight official answer to my question.
    Thank you in advance,
    Andrejs
    (You may also contact me privately at my e-mail address)

    Hi Andrew
    > The following question is mainly to be addressed to authorized Toshiba support personnel
    I think you are in the wrong area if you are looking for an answer from authorized Toshiba support.
    This is a Toshiba user-to-user forum! Here you will meet Toshiba notebook owners and enthusiasts who share knowledge and try to solve problems, but nobody from Toshiba :(
    I can share my experience with the Satellite M30 and its HDD upgrade possibilities.
    To my knowledge the Satellite M30 supports 40GB, 60GB and 80GB HDDs for sure.
    In my opinion you could use a 100GB HDD, but bigger HDDs will not run and function correctly.
    So switch to a smaller HDD size and enjoy the notebook!
    I've googled a little bit and found compatible HDDs and their part numbers:
    HITACHI GBC000Z810 -> 80GB
    HITACHI GBC00014810 -> 80GB
    TOSHIBA HDD2188B -> 80GB
    HITACHI G8C0000Z610 -> 60GB
    HITACHI G8BC00013610 -> 60GB
    TOSHIBA HDD2183 -> 60GB
    TOSHIBA HDD2184 -> 60GB
    I hope this could help you a little bit!
    Best regards

  • Package Size parameter in Partner Profile

    Hello guys,
    I'd need your advice. We have 2 systems (SAP and non-SAP) and we need to connect them by means of the HR-PDC interface (IDoc communication is supported). We can create a new partner profile in SAP and add a few IDoc types as inbound parameters of the partner profile. It is then possible to define a maximum package size for several IDoc types. I suppose that the non-SAP system reads the partner profile and the "Package Size" parameter for the particular IDoc type before it sends IDocs to the SAP system, so the IDoc size is limited by the parameter.
    And now our problem: we need to insert XI between these 2 systems. XI should only forward the IDocs and nothing else. How can I set up the limitation of the IDoc size? Do you think the following scenario could work?
    I would define a partner profile in XI the same way as in the previous case in SAP. XI is configured to process IDocs in the Integration Server by means of the IDoc adapter. My expectation is that the non-SAP system would read the partner profile in XI (including the "Package Size" parameter) and then send the IDocs to XI. XI would then process the received IDocs in the IDoc adapter.
    How can the "Package Size" parameter be useful for inbound IDocs?
    Best Regards,
    Zbynek

    Hello ,
                Check the blog below
    /people/michal.krawczyk2/blog/2007/12/02/xipi-sender-idoc-adapter-packaging
    Rajesh

  • Need you help !!! -- How to set PACKAGE SIZE in CURSOR?

    Dear experts:
    Does anybody know how to set the package size if you use cursor in your abap program?
    My scenario is to select all the data from the MARA table, in which the data volume is very big, and then process it with some specific logic. When I try to select everything into one internal table, or use a CURSOR without packages, my ABAP program aborts after running for around 1 hour. So I was wondering whether I can use a cursor, set the PACKAGE SIZE to something like 10,000, and process the data in batches.
    Here is my sample code:
    OPEN CURSOR CUR_MARA_TBL FOR
      SELECT * FROM MARA.
    DO.
      FETCH NEXT CURSOR CUR_MARA_TBL INTO CORRESPONDING FIELDS OF WA_MARA_TBL.
      IF sy-subrc = 0.
        ......Detail logic......
      ENDIF.
    ENDDO.
    I would appreciate if anybody can let me know how to set the package size based on the sample code above, or any webpage/article about that.
    Any reply would be appreciated and thank you all in advance!
    Best Regards.
    Tim
    Moderator message: please study F1 help for FETCH statement.
    locked by: Thomas Zloch on Sep 8, 2010 4:10 PM
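    For reference, here is a minimal sketch of what the FETCH documentation the moderator points to describes: PACKAGE SIZE is an addition of the FETCH statement itself, which fills an internal table with up to n rows per loop pass and sets sy-subrc <> 0 once no rows are left. This is not from the thread; the table name lt_mara, the cursor name and the 10,000-row package size are placeholders.
    DATA: cur_mara_tbl TYPE cursor,
          lt_mara      TYPE STANDARD TABLE OF mara.
    OPEN CURSOR cur_mara_tbl FOR
      SELECT * FROM mara.
    DO.
      FETCH NEXT CURSOR cur_mara_tbl
        INTO TABLE lt_mara
        PACKAGE SIZE 10000.
      IF sy-subrc <> 0.
        EXIT.
      ENDIF.
      " ...process the current package of up to 10,000 MARA rows here...
    ENDDO.
    CLOSE CURSOR cur_mara_tbl.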

    The Spark List's layout property has the requestedRowCount and requestedMinRowCount properties.
    You can either set a new layout property on your List like this:
    <s:List>
      <s:layout>
        <s:VerticalLayout requestedMinRowCount="1" />
      </s:layout>
    </s:List>
    or reskin the List and modify the VerticalLayout in your copy of ListSkin.
    Jason
