Determining support package size

Hi,
I want to apply new Support Packages in my BI system. Our system is low on memory. How do you determine how much space a support package will take in the system?

Hi Priya,
The support package size is mentioned in the Service Marketplace itself; you can already see the size of the SAR file while downloading it.
You can plan according to that.
Regards,
Raja. G

Similar Messages

  • How to determine package size dynamically for select - endselect statement in ABAP

    Hi All,
    I was using SELECT - ENDSELECT with PACKAGE SIZE, but I have hard-coded the package size. Can anyone tell me how to determine the package size dynamically so that I can use it in parallel processing?

    Hi,
    If you are looking to determine or set the package size dynamically, use the code below for reference.
    DATA: itab TYPE TABLE OF spfli,
          wa   LIKE LINE OF itab,
          n    TYPE i.
    n = 10.   " package size determined at runtime, e.g. from a parameter
    " The block between SELECT and ENDSELECT is executed once per package of n rows.
    SELECT carrid connid
      FROM spfli
      INTO CORRESPONDING FIELDS OF TABLE itab
      PACKAGE SIZE n.
      LOOP AT itab INTO wa.
        WRITE: / wa-carrid, wa-connid.
      ENDLOOP.
    ENDSELECT.
    If you are concerned about performance, you can also use OPEN CURSOR with FETCH NEXT CURSOR ... PACKAGE SIZE and CLOSE CURSOR instead (a minimal sketch follows below).
    Let me know if this helped.
    Regards,
    Zuber
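    For reference, a minimal sketch of the cursor-based alternative mentioned above (the table SPFLI and the package size of 1000 are only illustrative):
    DATA: itab TYPE TABLE OF spfli,
          cur  TYPE cursor.
    " Open a database cursor and fetch the result set in packages of 1000 rows.
    OPEN CURSOR cur FOR
      SELECT carrid connid FROM spfli.
    DO.
      FETCH NEXT CURSOR cur
        INTO CORRESPONDING FIELDS OF TABLE itab
        PACKAGE SIZE 1000.
      IF sy-subrc <> 0.     " no more rows
        EXIT.
      ENDIF.
      " ... process the current package in itab here ...
    ENDDO.
    CLOSE CURSOR cur.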

  • Support package SAPKB46C57 - package info shows blank fields and zero size

    Hello everybody,
    We have a problem with SAP 4.6C, and we found that note 1100728 can help us. All the corrections are contained in Support Package SAPKB46C57, but we are not able to download it. When we check the package info, we receive blank fields and a size of zero for Support Package SAPKB46C57.
    Does anybody have the same experience?
    SAP support is very slow in solving this issue.
    Does anybody know how we can implement the corrections from note 1100728?
    Thank you very much for your answers.
    Best Regards
    Ing. Lukas Jarmar

    When I compiled the package, it compiled with the following error.
    Warning: Package Body created with compilation errors.
    SQL> show error
    Errors for PACKAGE BODY STATSPACK:
    LINE/COL ERROR
    2045/3 PLS-00201: identifier 'SYS.DBMS_SHARED_POOL' must be declared
    2045/3 PL/SQL: Statement ignored
    SQL>

  • Data package size-LAN and WAN

    HI Experts,
    Could anybody give explanation/Document for the  below query?
    When transferring data from R/3 to BW, how is the data package size determined? Would the size of the data packets be different over LAN and WAN? Why?
    Thanks
    Pradeep

    In transaction SBIW -> General Settings -> Maintain Control Parameters for Data Transfer (in the OLTP System), you can see and edit the default values for the source system.
    If you display the infopackage in the BW system and click on the menu option Scheduler -> DataS. Default Data Transfer, you will be able to edit the settings for the infopackage and also see what's the default configuration for the source system.
    You may also consult the following SAP Notes for more information (including exceptions):
    [417307 - Extractor package size: Collective note for applications.|https://websmp107.sap-ag.de/sap/support/notes/417307]
    [409641 - Examples of packet size dependency on ROIDOCPRMS.|https://websmp107.sap-ag.de/sap/support/notes/409641]
    I don't see what difference LAN or WAN would make, though...
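    If you just want to see which values are currently maintained for a source system, the SBIW control parameters are stored in table ROIDOCPRMS (the table named in the second note above). A minimal sketch, assuming the field names SLOGSYS, MAXSIZE and MAXLINES:
    " List the data transfer control parameters per logical source system.
    DATA lt_prms TYPE TABLE OF roidocprms.
    FIELD-SYMBOLS <ls_prms> TYPE roidocprms.
    SELECT * FROM roidocprms INTO TABLE lt_prms.
    LOOP AT lt_prms ASSIGNING <ls_prms>.
      WRITE: / <ls_prms>-slogsys,    " logical source system
               <ls_prms>-maxsize,    " max. packet size in kByte
               <ls_prms>-maxlines.   " max. number of rows per packet
    ENDLOOP.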

  • Issue when number of records in a DSO exceeds DTP package size

    Hi all,
    I'm having a strange problem with the package size of my DTPs.
    I move data from one DSO to another while performing some transformations in an expert routine. The DTP has a package size of 6 million records.
    When the number of records in the first DSO exceeds the package size, some of the records do not seem to be processed properly by the transformation. This forces me to choose a very large package size so that the ABAP code in the transformation is applied to all the records of the source DSO.
    I can't understand why this is happening, because the package size is only supposed to determine the number of records processed in a single step and nothing else.
    Am I right?
    Thanks

    My tip would also be the summary item

  • How do you determine the optimal size for Mozilla Firefox?

    How do you determine the optimal size for cache in Mozilla
    Firefox? I am using Firefox 7.0.1 on a 64-bit Windows 7 Ultimate operating system with 3GB RAM and 300 GB hard drive, but I have other computers running Windows XP. If the answer doesn't apply to all current versions of Firefox on all supported Windows operating systems, please explain the differences. Is there a formula for calculating the best cache size?

    I found that the best idea is to let Firefox decide that itself.

  • Package Size parameter in Partner Profile

    Hello guys,
    I'd need your advice. We have two systems (SAP and non-SAP) and we need to connect them by means of the HR-PDC interface (IDoc communication is supported). We can create a new Partner Profile in SAP and add a few IDoc types as Inbound Parameters of the partner profile. It is then possible to define a maximum package size for several IDoc types. I suppose that the non-SAP system reads the Partner Profile and the "Package Size" parameter for the particular IDoc type before it sends an IDoc to the SAP system, and that the IDoc's size is limited by this parameter.
    And now our problem: we need to insert XI between these two systems. XI should only forward IDocs and nothing else. How could I set up the limitation of the IDoc's size? Do you think the following scenario could work?
    I would define the Partner Profile in XI the same way as in the previous case in SAP. XI is configured to process IDocs in the Integration Server by means of the IDoc adapter. My expectation is that the non-SAP system would read the partner profile in XI (including the "Package Size" parameter) and then send the IDoc to XI. XI would then process the received IDocs in the IDoc adapter.
    How can the "Package Size" parameter be useful for inbound IDocs?
    Best Regards,
    Zbynek

    Hello,
    Check the blog below:
    /people/michal.krawczyk2/blog/2007/12/02/xipi-sender-idoc-adapter-packaging
    Rajesh

  • Post Processing Steps after Support Package Stack Application

    I'm curious if anyone has any guidelines for post-processing (or pre-processing) steps when applying Support Package Stacks to their Development Infrastructure (Development Workplace and the central NWDI server).  We have just upgraded a couple of developers' local engines and Developer Studio from SP stack 14 to SP stack 19 and are experiencing some problems.  We also applied the J2EE stack and the appropriate SCA files to the NWDI server.
    After the support packs it looks like our DTR files are gone (when reimporting the configuration via Developer Studio the SCs are there, but there are no DCs inside them).  Additionally, it looks like we have to manually reimport the newest versions of SAP_BUILDT, SAP_JTECHS, and SAP-JEE.  Another issue is that old local Web Dynpro DCs now complain about class path problems and different file versions.  We followed the documentation for applying SP19 exactly as described for a Java Development usage type.  Does anyone have a checklist of steps to perform after applying the support packs?

    I think I'm missing something.  Now I see the code and DC's inside the DTR.  However, when I try to import into NWDS no DC's come in (the software components exist, but no DC's in them).  Additionally, the CBS web ui reports that my software components do not contain any DC's even though I see them in the DTR.  What things can I look at to determine what I'm missing here?
    Thought I'd add some more info. After applying the support packs, we imported the new SAP_BUILDT, SAP_JTECHS, and SAP-JEE SCAs into our track, as we required some functionality from the newer build SCA.  We also reimported our old archives back into the system by manually checking them in, assuming this would fix the problem of not seeing the source in the NWDS or the DTR.  After the import, the CBS no longer sees our custom DCs, but the DTR does (both in the active and inactive workspaces).  When importing the development configuration into the NWDS, our custom DCs no longer appear, but SAP's standard SCAs do.
    Message was edited by:
            Eric Vota

  • PACKAGE SIZE n in SELECT query

    Hi,
    When using the PACKAGE SIZE n option with SELECT queries, how do you determine the best/optimum value of n? Especially when querying tables like EKPO, EKKO, etc.
    Regards,
    Anand.

    > When using PACKAGE SIZE n option with SELECT queries, how to determine the best/optimum value of n ?
    The PACKAGE SIZE option of the SELECT specifies how many rows are returned in one chunk.
    According to the ABAP documentation, it is best used with an internal table:
    DATA: itab TYPE STANDARD TABLE OF scarr
               WITH NON-UNIQUE DEFAULT KEY INITIAL SIZE 10.
    FIELD-SYMBOLS: <fs> TYPE scarr.
    " Each pass of the SELECT ... ENDSELECT loop fills itab with up to 20 rows.
    SELECT * INTO TABLE itab PACKAGE SIZE 20 FROM scarr.
      LOOP AT itab ASSIGNING <fs>.
        WRITE: / <fs>-carrid, <fs>-carrname.
      ENDLOOP.
    ENDSELECT.
    But basically, your application's requirements determine the best value for n:
    if you want to keep the number of database round trips low, choose a high value for n;
    if you don't want a lot of data in memory at once, choose a lower value (see the sizing sketch below).
    You can also use the UP TO n ROWS addition in the SELECT to limit the total number of rows fetched from the database.
    thomas
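    To make that trade-off concrete, a small sizing sketch (the 2 MB budget and the assumed row width are made-up numbers, not recommendations):
    " Hypothetical: derive the package size n from a per-package memory budget.
    CONSTANTS lc_budget_bytes TYPE i VALUE 2000000.   " allow roughly 2 MB per package
    DATA: lv_row_bytes TYPE i VALUE 200,              " assumed average row width in bytes
          lv_n         TYPE i,
          lt_ekpo      TYPE TABLE OF ekpo.
    lv_n = lc_budget_bytes / lv_row_bytes.            " here: 10,000 rows per package
    SELECT * FROM ekpo
      INTO TABLE lt_ekpo
      PACKAGE SIZE lv_n.
      " ... process one package of lt_ekpo here ...
    ENDSELECT.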

  • How to specify PACKAGE SIZE for RFC_READ_TABLE from PyRFC?

    I'm trying to use PyRFC to extract large tables via RFC_READ_TABLE (due to an uncooperative/unsupportive basis team).
    I know that RFC_READ_TABLE supports calling it with PACKAGE SIZE since ERPConnect does it by default.
    In Python for Basis (Part 1), I learned how to specify an Open SQL WHERE clause using the OPTIONS parameter:
    result = connector.call('RFC_READ_TABLE',
                           QUERY_TABLE=tablename,
                           DELIMITER=delimiter,
                           OPTIONS = [{'TEXT':where_clause}])
    Is there a way to specify PACKAGE SIZE in this case as well, and if so, how?
    Thanks

    Hi,
    the package size works, but you need to call the function module multiple times in a loop. Here's an excerpt of the attached script:
    recordcounter = 1          # needs to be > 0 to get the while loop going
    iteration = 0              # the number of times the function module got called
    while recordcounter > 0:   # as long as the function module call returns rows...
        tempresult = self.call('RFC_READ_TABLE',
                               ROWSKIPS=iteration * fetchsize,  # number of rows to skip
                               **parameters)                    # remaining parameters come from the surrounding script
        iteration = iteration + 1     # increase the iteration counter
        data = tempresult['DATA']     # the returned rows
        if len(data) > 0:             # have there been any rows?
            for row in data:          # do something with each row here
                pass
        else:
            recordcounter = 0         # end the while loop
    The script is far from perfect: it doesn't deal with RAW fields, and the data types it returns are all strings. It does what I needed it to do, but it may not be sufficient for what you are trying to achieve.
    Best regards
    Lars

  • Info on package size

    Hi,
    I would like to understand what this setting in the InfoPackage means:
    Maximum size of a data packet in kByte = 50000
    How do I determine the package size that is going to come from the source system? Is there any documentation or note that says what a suitable package size is?
    Number of data packets per Info-IDoc = 10
    How do the records get divided per package? On what logic?
    I extracted the data into PSA.
    DP 1 = 8000 records
    DP2 = 9500
    DP3 = 8400.
    The number of records is not the same in all the packages. How does this work?
    Thanks
    Annie

    The individual records are sent in packages of varying size when data is transferred to the Business Information Warehouse. Using these parameters you determine the maximum size of such a package, and therefore how much main memory may be used to create the data package.
    SAP recommends a data package size between 10 and 50 MB.
    If you transfer a lot of fields, fewer lines fit into a package; if you transfer fewer fields, more lines fit (a rough illustration follows below).
    More info:
    http://help.sap.com/saphelp_nw70/helpdata/en/51/85d6cf842825469a51b9a666442339/content.htm
    M.
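    As a rough illustration of the fields-vs-lines point (the packet size and the record widths below are assumptions, not recommendations):
    " Illustrative only: rough estimate of rows per data packet.
    DATA: lv_packet_kb TYPE i VALUE 50000,   " 'Maximum size of a data packet in kByte'
          lv_row_bytes TYPE i,
          lv_rows      TYPE i.
    lv_row_bytes = 1000.                               " narrow transfer structure
    lv_rows = lv_packet_kb * 1024 / lv_row_bytes.      " roughly 51,000 rows per packet
    lv_row_bytes = 2000.                               " wider transfer structure
    lv_rows = lv_packet_kb * 1024 / lv_row_bytes.      " roughly 25,000 rows per packet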

  • Performance: how to determine the package size during a select

    Hi,
    When you do a SELECT using PACKAGE SIZE, how do you determine the value of the package size?
    Thanks for your help
    Marie

    Hi Marie,
    1. A SELECT with PACKAGE SIZE is used when the number of records is very high and we don't want to fetch all the records in one shot.
    2. In that case, we fetch the records in bunches / packages.
    3. Just copy and paste the following to get a taste of it.
    REPORT abc.
    DATA: t001 LIKE TABLE OF t001 WITH HEADER LINE,
          ctr  TYPE i.
    PARAMETERS: a TYPE c.   " dummy parameter, just to get a selection screen
    START-OF-SELECTION.
      " Each pass of the SELECT ... ENDSELECT loop delivers at most 5 rows.
      SELECT * FROM t001
        INTO TABLE t001
        PACKAGE SIZE 5.
        ctr = ctr + 1.
        WRITE: / '---- Loop Pass #', ctr.
        LOOP AT t001.
          WRITE: / t001-bukrs, t001-butxt.
        ENDLOOP.
      ENDSELECT.
    regards,
    amit m.

  • SPAM support packages error

    Hi,
    I've tried to upgrade CRM 5.0 SR1 to SR2. During the operation I got an error in the phase TP_IMPORT; the error was a buffer inconsistency. It was impossible to reset the queue during this phase, so I deleted the entries in the database tables PAT01, PAT02, and PAT03. Now the queue is empty, but I get an internal SPAM check error when I try to reload the package.
    In the background, the job RDDGENOL (created by DDIC during the support package import) is still running; it triggered RAISE_EXCEPTION and a timeout, it locks tables needed for the support package, and I can't stop it.
    Furthermore, a dialog process is running together with a background process waiting for RFC, which is also impossible to cancel.
    Any help would be appreciated.

    Hi people,
    We have checked the status of the support package in transaction SPAM: the traffic light is still red and it is stuck in the phase RUN_SPAU_?. Even after restarting the import it shows that the clean-up phase is still open, and it ends with the same error mentioned above.
    The support package is SAPKB70014, size 116266 MB. After restarting the import, the following error message comes:
    The import was stopped, since an error occurred during the phase
    RUN_SPAU_?, which the Support Package Manager is unable to resolve
    without your input.
    After you have corrected the cause of the error, continue with the
    import by choosing Support Package -> Import queue from the initial
    screen of the Support Package Manager.
    The following details help you to analyze the problem:
         -   Error in phase: RUN_SPAU_?
         -   Reason for error: TP_CANNOT_CONNECT_SYSTEM
         -   Return code: 0255
         -   Error message: Could not connect to semaphore server
    We run all the phases in background, and even after entering the support package number in transaction SPAU it shows
    "No objects within selected filter parameters".
    Q1. Any idea what went wrong and how to fix it?
    Q2. Is there any problem with running all four phases of the support package import in background, or should they be run in dialog?
    The latest tp and R3trans have been applied.

  • Determining active package in exit variable function module

    Hi all.
    I'm writing an exit variable function module that will return different values for different packages in a level, based on a custom table to map the area / level / package to a key, then look up the variable values for that key in another table. 
    The idea is to get around the BPS constraint of choosing the variable in levels rather than in packages.  I'm trying to reduce the number of levels needed.
    Function module 'UPC_FW_GET_ACTIVE_OBJECTS' works to show the active package when the variable is being resolved in BPS0 (e.g. click on package then click on layout or other parameter group), but it does not work for executing global planning sequences.  In that case the active object is the global sequence.
    Any other ideas about how to determine the package that is active when the variable code is run? 
    Or any ideas about how to tell which global sequence step is being run so that the package can be determined from UPF_BSTEPS?
    Any ideas would be appreciated.
    Thanks!  Aron
    Message was edited by:
            Aron Weaver

    There seems not to be any available standard code providing package ID to call the exit variable function module, so it looks like either a code modification or a custom copy of UPC_BUNDLE_EXECUTE and several function modules will be necessary.  The objective of either approach would be to pass the package ID along to the plan level read and variable resolution portion of the code, then include package as an imported parameter for the exit variable function module.
    SAP development and support declined to assist, giving us the infamous standard "consulting required" response.  I've seen something like this done previously, but that client was matching the packages up with layouts in folders, and used a custom method in a standard class reading folder information to identify the package.  There doesn't seem to be anything equivalent for global planning sequences.

  • Data package size

    What is the basic difference between RSCUSTV6 and SBIW -> General Settings -> Maintain Control Parameters with regard to modifying the data package settings?

    Hi,
    Just see the help on
    Maintain Control Parameters for Data Transfer:
    1. Source System
    Enter the logical system of your source client and assign the control parameters you selected to it.
    You can find further information on the source client in the source system by choosing the path
    Tools -> Administration -> Management -> Client Maintenance.
    2. Maximum Size of the Data Package
    When you transfer data into BW, the individual data records are sent in packages of variable size. You can use these parameters to control how large a typical data packet like this is.
    If no entry was maintained then the data is transferred with a default setting of 10,000 kBytes per data packet. The memory requirement not only depends on the settings of the data package, but also on the size of the transfer structure and the memory requirement of the relevant extractor.
    3. Maximum Number of Rows in a Data Package
    With large data packages, the memory requirement mainly depends on the number of data records that are transferred with the package. Using this parameter you control the maximum number of data records that the data package should contain.
    By default a maximum of 100,000 records are transferred per data package.
    The maximum main memory requirement per data package is approximately 2 x 'Max. Rows' x 1000 bytes (a worked example follows at the end of this reply).
    4. Frequency
    The specified frequency determines after how many data IDocs an Info IDoc is sent, in other words, how many data IDocs an Info IDoc describes.
    Frequency 1 is set by default. This means that an Info IDoc follows every data IDoc. In general, you should select a frequency between 5 and 10, but no higher than 20.
    The bigger the data IDoc packets, the lower the frequency setting should be. This way, during the upload you obtain information on the respective data load in relatively short intervals.
    With the help of every Info IDoc, you can check the BW monitor to see if there are any errors in the loading process. If there are none, then the traffic light in the monitor will be green. The Info IDocs contain information such as whether the respective data IDocs were uploaded correctly.
    5. Maximum number of parallel processes for the data transfer
    An entry in this field is only relevant from release 3.1I onwards.
    Enter a number larger than 0. The maximum number of parallel processes is set by default at 2. The ideal parameter selection depends on the configuration of the application server, which you use for transferring data.
    6. Background job target system
    Enter the name of the application server on which the extraction job is to be processed.
    To determine the name of the application server, choose
    Tools -> Administration -> Monitor -> System monitoring -> Server. The name of the application server is displayed in the column Computer.
    7. Maximum Number of Data Packages in a Delta Request
    With this parameter, you can restrict the number of data packages in a delta request or in the repetition of a delta request.
    Only use this restriction when you expect delta requests with a very high data volume, so that, despite sufficiently large data package sizes, more than 1000 data packages can result in a request.
    With an initial value or when the value is 0, there is no restriction. Only a value larger than 0 leads to a restriction in the number of data packages. For reasons of consistency, this number is not generally exactly adhered to. The actual restriction can, depending on how much the data is compressed in the qRFC queue, deviate from the given limit by up to 100.
    RSA6:
    Used to change the Datapacket Size.
    Thanks
    Reddy
    Edited by: Surendra Reddy on Mar 12, 2010 6:27 AM
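    A worked example for the memory estimate in point 3 above (using the default of 100,000 rows; this is only the rule of thumb quoted in the text, not a measured value):
    " Rule of thumb from point 3: memory per data package = 2 * max. rows * 1000 bytes.
    DATA: lv_max_rows TYPE i VALUE 100000,
          lv_bytes    TYPE i.
    lv_bytes = 2 * lv_max_rows * 1000.   " = 200,000,000 bytes, i.e. roughly 200 MB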
