Long runtime for CU50

Hi there, is there any way we can update the statistics for table CABN? We encounter long runtimes when executing transaction code CU50, and we found that the process keeps accessing the CABN table, which contains more than 10k characteristic records. Thanks

If you are running on IBM i (i5/OS, OS/400), there is no need to update statistics for a database table, because the database does that automatically.
If you have a slow transaction, you can analyze it with transaction ST05 and then use the Explain function on the longest-running statement. Within the Explain, there is a function "Index advised" that might help in your case.
Kind regards,
Christian Bartels.

Similar Messages

  • TDMS Data transfer Step : Long runtime for table GEOLOC and GEOOBJR

    Hi Forum,
    The data transfer for tables GEOLOC and GEOOBJR is taking too long (almost 3 days for 1.7 million records). There are no scrambling rules applied to these tables; in fact, since I am using TDMS for the first time, I am not using any scrambling rules to start with.
    Also, there are 30 processes which have gone into error. How do I run those erroneous jobs again?
    Any help is greatly appreciated.
    Regards,
    Anup

    Thanks Harmeet,
    I changed the write type for those activities and re-executed them, and they completed successfully.
    Now the data transfer is complete, but I see a difference in the number of records for these two tables (GEOLOC and GEOOBJR).
    Can you please let me know what might be the reason?
    Regards,
    Anup

  • Long runtime for loading PCA data into PCA ODS

    hi all,
    I hope you can help me with this problem:
    I have been trying to load the PCA ODS (standard) from R/3, but it is taking too long, so I tried to limit the size by using key fields, with no luck. I have also tried to load to PSA, and even that is taking long for around 53,440 records. I am unable to figure out the problem; I have checked all the transfer rules and update rules, and everything is fine.
    Can anyone please help me?
    Thanks a lot in advance

    Hi there,
    I don't know if you're loading line items, but the following recommendation from SAP shortened our delta loads from 1.5-2 hours down to 2 minutes:
    To optimize the performance of data procurement when DataSource 0EC_PCA_3 accesses line item table GLPCA in the OLTP, you should set up an additional secondary index. It should contain the following fields:
    - RCLNT
    - CPUDT
    - CPUTM
    Kind regards
    /martin
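    The effect of such a secondary index can be sketched outside SAP. Below is a minimal SQLite illustration (the table and field names only mimic GLPCA and the advised index; they are stand-ins, not the actual SAP DDL): with the index in place, the optimizer can satisfy the delta-selection predicate on client and entry date without a full table scan.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Stand-in for the GLPCA line-item table with the three advised
# fields (client, entry date, entry time) plus a payload column.
cur.execute("CREATE TABLE glpca (rclnt TEXT, cpudt TEXT, cputm TEXT, amount REAL)")

# Secondary index matching the delta-selection predicate.
cur.execute("CREATE INDEX glpca_delta ON glpca (rclnt, cpudt, cputm)")

# The delta extractor filters on client and entry date/time; the query
# plan should report that the index is used for this predicate.
plan = cur.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT amount FROM glpca "
    "WHERE rclnt = '100' AND cpudt >= '20240101'"
).fetchall()
print(plan)  # plan detail mentions e.g. 'USING INDEX glpca_delta'
```

    The same reasoning is what the "Index advised" function in the ST05 Explain surfaces: an index whose leading columns match the WHERE clause of the slow statement.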

  • Very long runtime for export

    Hi Guru:
    We are using the Send button to export the SPM data to SAP DSE service.
    What we found is that the functionality is extremely slow: for the same amount of data, we could export it in a couple of hours with BW Open Hub, but with the SPM function it takes 1 day.
    Is there any way to improve the performance, any tips or tricks?
    Also, the Send function exports all data; is there any program that could export the file for just one object?
    Thanks.
    Eric

    HI Ines
    The jload process needs more heap memory.
    Restart jload with more heap memory; use 1.5 GB or 2 GB.
    Note 1173398 - jload export fails with OutOfMemoryException
    Also check
    Note 1120872 - Hom./Het.System Copy SAP NetWeaver 7.0 SR3
    Error during export/import with jload
    Symptom:
    Jload aborts with error: SEVERE: java.lang.OutOfMemoryError
    Solution:
    5. Stop SAPinst.
    6. In the control.xml file in the installation directory, search for
                  "jload call args".
    7. Replace "-Xmx512m" by "-Xmx1024m" in the following lines:
                  var jdkArgs = installer.onOS400() ? ["-showversion", "-Xms256m", "-Xmx1536m"] : ["-showversion", "-Xmx512m"];
                  switch( new SystemMgt().getSystemInfo().getOSFAName() ) {
    8. Save your entries and restart SAPinst.
    Best of luck
    Award points if this post is helpful
    Regards
    dEE
    Edited by: Deep Kwatra on Nov 18, 2008 4:43 PM

  • Long runtimes due to P to BP integration

    Hi all,
    The folks from my project are wondering if any of the experts out there have faced the following issue before. We have raised an OSS note for it but have yet to receive any concrete solution from SAP. As such, we are exploring other avenues of resolving this matter.
    Currently, we are facing an issue where a standard infotype BADi is causing extremely long runtimes for programs that update certain affected infotypes. The BADi name is HR_INTEGRATION_TO_BP and SAP recommends that it should be activated when E-Recruitment is implemented. A fairly detailed technical description is provided as follows.
    1. Within IN_UPDATE method of the BADi, a function module, HCM_P_BP_INTEGRATION is called to create linkages between a person object and business partner object.
    2. Function module RH_ALEOX_BUPA_WRITE_CP will be called within HCM_P_BP_INTEGRATION to perform the database updates.
    3. Inside RH_ALEOX_BUPA_WRITE_CP, there are several subroutines of interest, such as CP_BP_UPDATE_SMTP_BPS and CP_BP_UPDATE_FAX_BPS. These subroutines are structured similarly and will call function module BUPA_CENTRAL_EXPL_SAVE_HR to create database entries.
    4. In BUPA_CENTRAL_EXPL_SAVE_HR, subroutine ADDRESS_DATA_SAVE_ES_NOUPDTASK will call function module, BUP_MEMORY_PREPARE_FOR_UPD_ADR, which is where the problem begins.
    5. BUP_MEMORY_PREPARE_FOR_UPD_ADR contains 2 subroutines, PREPARE_BUT020 and PREPARE_BUT021. Both contain similar code in which a LOOP is performed on a global internal table (GT_BUT020_MEM_SORT/GT_BUT021_MEM_SORT) and entries are appended to another global internal table (GT_BUT020_MEM/GT_BUT021_MEM). These tables (GT_BUT020_MEM/GT_BUT021_MEM) are later used for updates to database tables BUT020 and BUT021_FS. However, we noticed that these 2 tables are not cleared after the database update, which results in an ever-increasing number of entries being written to the database, even though many of them may already have been updated.
    If any of you are interested in seeing if this issue affects you, simply run a program that will update either infotype 0000, 0001, 0002, 0006 subty 1, 0009 subty 0 or 0105 subty (0001, 0005, 0010 or 0020) to replicate this scenario if E-recruitment is implemented in your system. Not many infotype updates are required to see the issue, just 2 are enough to tell if the tables in point 5 are being cleared. (We have observed that this issue occurs during the creation of a new personnel number (and hence a new business partner). For existing personnel numbers, the same code is executed but the internal tables in point 5 are not populated.)
    System details: SAP ECC 6.0 (Support package: SAPKA70021) with E-Recruitment (Support package: SAPK-60017INERECRUIT) implemented.
    Thanks for reading.
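    The effect described in point 5 can be sketched in a few lines of Python (the names are illustrative, not the ABAP internals): a global buffer that is appended to before every save, but never cleared afterwards, re-writes all previously saved entries on each subsequent update, so the database work grows with every new personnel number.

```python
# Illustrative sketch of the pattern in point 5: a global buffer
# (like GT_BUT020_MEM) filled before each save but never cleared.
pending_updates = []  # global buffer, survives across save calls

def prepare_for_update(new_entries, clear_after_save):
    pending_updates.extend(new_entries)   # like PREPARE_BUT020 appending
    written = list(pending_updates)       # the "database update" writes the whole buffer
    if clear_after_save:
        pending_updates.clear()           # the step missing in the buggy flow
    return written

# Buggy flow: each save re-writes everything accumulated so far.
first = prepare_for_update(["P0001"], clear_after_save=False)
second = prepare_for_update(["P0002"], clear_after_save=False)
print(len(first), len(second))  # prints: 1 2  -> P0001 is written twice
```

    With clear_after_save=True each call writes only its own entries, which is the behavior one would expect from the buffer handling described above.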

    Hi Annabelle,
    We have a similar setup, but are on SAPK-60406INERECRUIT.  Although the issue does not always occur, we do have a case where the error ADDRESS_DATA_SAVE_ES is thrown.
    Did you ever resolve your issue?  Hoping that solution can help guide me.
    Thanks
    Shane

  • Middleware- Taking long time for generation of Runtime objects- SMOGTOTAL

    Hi Experts,
    I am doing middleware settings for connecting CRM 2007 with R/3 4.7.
    When I generate all the required objects (replication objects, publications, ...) using transaction code SMOGTOTAL, the system takes a very long time. Generally it takes 4 to 6 hours, but in our case it has already taken more than 36 hours and is still running.
    Can anybody tell me what I need to do to make the generation process faster?
    Regards
    Nadh

    What I read in the best practice:
    It is not required for a new installation.
    Typically this activity has already been executed during the system installation or upgrade.
    Use transaction SMOGLASTLOG to check whether an initial generation has already been executed; in that case you can skip this activity.
    I checked transaction SMOGLASTLOG, and in our case the initial generation had not yet been executed. I also couldn't continue with the next steps.
    That's why I started the job; it finally finished after 104 hours.
    Thanks for your fast reply.
    Jasper.

  • Long time for activation

    Hi ,
    May I know the reason why:
    1) Whenever I activate a DataSource object or InfoCubes, it takes a very long time.
    2) Even deleting a DTP or a DataSource object is very, very slow.
    Generally it did not take this much time in 3.x (not even a quarter of this).
    Thanks in advance
    Anan

    After some time it throws a dump with the following message:
    How to correct the error
    Programs with long runtime should generally be started as background jobs. If this is not possible, you can increase the system profile parameter "rdisp/max_wprun_time".
    Depending on the cause of the error, you may have to take one of the following measures:
    - Endless loop: Correct program;
    - Dataset resulting from database access is too large: Instead of "SELECT * ... ENDSELECT", use "SELECT * INTO internal table" (for example);
    - Database has unsuitable index: Check index generation.
    If the error occurs in a non-modified SAP program, you may be able to find an interim solution in an SAP Note. If you have access to SAP Notes, carry out a search with the following keywords:
    "TIME_OUT" " "
    "RADBTDLT" or "RADBTDL0"
    "TTYP_DEL"
    If you cannot solve the problem yourself and want to send an error notification to SAP, include the following information:
    1. The description of the current problem (short dump). To save the description, choose "System -> List -> Save -> Local File (Unconverted)".
    2. The corresponding system log. Display the system log by calling transaction SM21, restrict the time interval to 10 minutes before and five minutes after the short dump, then choose "System -> List -> Save -> Local File (Unconverted)".
    3. If the problem occurs in a program of your own or a modified SAP program: the source code of the program. In the editor, choose "Utilities -> More Utilities -> Upload/Download -> Download".
    4. Details about the conditions under which the error occurred, or which actions and input led to the error.
    Thanks
    Anan
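    The dump's advice about large result sets applies beyond ABAP: fetching a big result row by row (the SELECT * ... ENDSELECT pattern) costs one fetch call per row, while an array fetch into an internal table (SELECT ... INTO TABLE) takes one call for the whole set. A small SQLite sketch of the two patterns (the table and data are made up purely for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (id INTEGER, val TEXT)")
con.executemany("INSERT INTO t VALUES (?, ?)",
                [(i, f"row{i}") for i in range(1000)])

# Row-by-row, like SELECT * ... ENDSELECT: one fetch call per row,
# which is what keeps long-running dialog work processes busy.
cur = con.execute("SELECT id, val FROM t")
rows_loop = []
while (row := cur.fetchone()) is not None:
    rows_loop.append(row)

# Array fetch, like SELECT * INTO TABLE itab: one call for the
# whole result set, avoiding the per-row overhead.
rows_bulk = con.execute("SELECT id, val FROM t").fetchall()

print(len(rows_bulk))  # prints: 1000
```

    Both variants return the same rows; the difference is purely in how many round trips the loop makes, which is exactly what the dump text is warning about.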

  • Reduce size of Air runtime for mobile apps

    Flash & AIR are fantastic for creating Android apps,
    but most users don't have the AIR runtime package on their phones, so we must embed it when publishing.
    Unfortunately, the Adobe AIR runtime is about 8 MB! With a slow internet connection it takes a long time to download the apps, or even to transfer them via Bluetooth.
    Is there any chance of reducing the size of the Adobe AIR runtime?
    Or of creating various versions of the AIR runtime? For example, in most projects we don't need to play video or sound...

    This has been asked before several times. I doubt you will get a procedure from NI on how to do this since the run-time engine is intended to be a collection of libraries to be used by various types of LabVIEW apps.
    That said, it doesn't mean that users haven't tried to figure out how to do it. Note: I do not recommend this practice, nor can I speak to its effectiveness, since I've never tried it.

  • CDB Upgrade 4.0 - 5.0: Long Runtime

    Hello all,
    We are in the middle of a CRM upgrade from 4.0 -> 7.0 and are currently doing the CDB upgrade from 4.0 -> 5.0. As part of the segment download, I am downloading CAPGEN_OBJECT_WRITE, and it has created a few lakh entries in SMQ2.
    The system has been processing those entries for the last 3 days, and although the entries are being processed, we cannot afford such a long runtime during go-live. Did I miss something?
    Have you ever faced such scenario? Appreciate your valuable feedback on this.
    Thanks in advance,
    Regards
    Pijush

    Hi William,
    COBRAS has its limitations when it comes to internet subscribers -
    as noted in the link: Internet subscribers, Bridge, AMIS, SMTP users and such will not be included.
    http://www.ciscounitytools.com/Applications/General/COBRAS/Help/COBRAS.htm
    You might try using Subscriber Information Dump (under Tools Depot > Administration Tools > Subscriber Information Dump) to export and import to the new Unity server.
    Rick Mai

  • Web Application Designer 7 - Long Runtime

    Hi,
    I'm working in a BI 7 environment, and to fulfil the users' requirements we have developed a web template containing almost 30 queries.
    We are facing very long runtimes for that report on the web. After analysing with BI Statistics, we found that DB and OLAP are not taking very long to run; it is the front end (the web template) that causes the delay. Another observation is that most of the time is consumed while the web template is being loaded/initialised; once it is loaded, flipping between different tabs (reports) doesn't take much time.
    My questions are:
    What can I do to reduce the web template initialisation/loading time?
    Is there any way I can get the time taken by the front end in statistics? (Currently we can get DB and OLAP time through the BI Statistics cube and treat the remaining time as front-end time, because the standard BI Statistics cube is unable to capture front-end time when the report runs in a browser.)
    What technical processes are involved when information moves back from the DB to the browser?
    Your earliest help would be highly appreciated. Please let me know if you require any further information.
    Regards,
    Shabbar
    0044 (0) 7856 048 843

    Hi,
    It asks you for a login to the Portal because the output of web templates can be viewed only through the Enterprise Portal. This is perfectly normal. The BI-EP configuration should be correct, and you need a login ID and password for the Portal.
    For using WAD and designing the front end, go through the link below; it should help you.
    http://help.sap.com/saphelp_nw70/helpdata/en/b2/e50138fede083de10000009b38f8cf/frameset.htm

  • BPS0 - very long runtime

    Hi gurus,
    During manual planning in BPS0, long runtimes occur.
    FOX formulas are used.
    There is a lot of data selected, but the business needs it.
    Memory is OK, as I can see in ST02 - 10-15% of resources are usually used; there are no dumps, but runtimes are very long.
    I have examined the hardware, system, and DB with different methods, and found nothing unusual.
    Could you please give me more advice on how I can do extra checks of the system (preferably from a Basis point of view)?
    BW 3.1. - patch 22
    SEM-BW 3.5 - patch 18
    Thanks in advance
    Elena

    Hello Elena,
    you need to take a structured approach. "Examining" things is fine but usually does not lead to results quickly.
    Performance tuning works best as follows:
    1) Check statistics or run a trace
    2) Find the slowest part
    3) Make this part run faster (better, eliminate it)
    4) Back to #1 until it is fast enough
    For the first round, use the BPS statistics. They will tell you if BW data selection or BPS functions are the slowest part.
    If BW is the problem, use aggregates and do all the things to speed up BW (see course BW360).
    If BPS is the problem, check the webinar I did earlier this year: https://www.sdn.sap.com/irj/sdn/webinar?rid=/webcontent/uuid/2ad07de4-0601-0010-a58c-96b6685298f9 [original link is broken]
    Also the BPS performance guide is a must read: https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/7c85d590-0201-0010-20b5-f9d0aa10c53f
    Next, would be SQL trace and ABAP performance trace (ST05, SE30). Check the traces for any custom coding or custom tables at the top of the runtime measurements.
    Finally, you can often see from the program names in the ABAP runtime trace, which components in BPS are the slowest. See if you can match this to the configuration that's used in BPS (variables, characteristic relationships, data slices, etc).
    Regards
    Marc
    SAP NetWeaver RIG

  • Long runtimes while performing CCR

    Hello All,
    After running the delta report job, we found some inconsistencies for stocks. When we try to delete the entries or push them to APO (after performing the iteration), they are neither deleted nor pushed, and the runtimes are long. We don't see this issue for any elements other than stocks. Please let me know why this might be happening, and whether there is any way to rectify this stock inconsistency between ECC and APO.
    Thanks
    Uday

    Uday,
    I had one experience several years back with long CCR runtimes for Stock elements that might apply to you.
    For CCR, you have 6 categories of stocks to check.  If any of these stock category elements is not actually contained in any of your integration models, the CCR search can take a long time searching through ALL integration models trying to find a 'hit'.
    There are two possible solutions.  Ensure that you ONLY select CCR stock types that are contained in your CFM1 integration models.  If possible, deselect the CCR stock types that have no actual stocks within the integration models (where such stocks do not actually exist in ECC).  If this does not meet your business requirement, then try performing your CCR ONLY on the integration model(s) that contain the stock entries.  Do not leave the CCR field "Model Name" blank.
    With respect to the stock inconsistencies, 'how bad is it'? It is common to have one or two stock inconsistencies every day if you have hundreds of thousands of stock elements to keep in sync. The most common reason I see for excessive stock entries in CCR is improperly coded enhancements.
    Best Regards,
    DB49

  • The iPod could not be synced because this computer is no longer authorized for purchased items that are on this iPod.

    Full message that pops up
    "The iPod could not be synced because this computer is no longer authorized for purchased items that are on this iPod.
    To authorize this computer for items purchased from iTunes Store, choose Store > Authorize This Computer."
    I am unable to sync my music, applications, movies, etc. to my iPod touch. The only way I am able to bypass the message that pops up when I try to sync (the one in the title) is by clicking "Apply" instead of "Sync". To do this, I make changes to Apps, Music, or Movies (...), and the buttons "Revert" and "Apply" replace the button "Sync".
    My problem is most likely not due to using many accounts, because I have only one and it is the only one I use. I do not share accounts with my family members. Still, I have tried everything here, and nothing has worked for me. Since I don't often listen to podcasts, I have deleted them from my iPod. I have no idea why I am unable to sync, and no one seems to have the same problem as me.
    Thank you for your help in advance.
    Details
    iPod touch, 5th generation
    iOS 8.1.2
    iTunes 12.0.1.26
    Windows 8
    Purchases
    Pre-ordered album "Froot" purchased from the account that is authorized to my computer
    Podcast "Not Too Deep with Grace Helbig" also purchased from the account that is authorized to my computer

    Try backing up to iTunes and restoring from that backup

  • THE IPHONE CANNOT BE SYNCED BECAUSE THIS COMPUTER IS NO LONGER AUTHORIZED FOR PURCHASED ITEMS THAT ARE ON THIS PHONE

    Hi, I have a MacBook Pro (late 2011) running Yosemite and an iPhone 5s running iOS 8.1.2, and I never had a problem until tonight. I wanted to sync my iPhone to add new music, but upon syncing I keep getting a pop-up message saying "THE IPHONE "  " COULD NOT BE SYNCED BECAUSE THIS COMPUTER IS NO LONGER AUTHORIZED FOR PURCHASED ITEMS ON THIS PHONE", which tells me to go to the Store tab at the top and authorize this computer; every time I do, it says the computer is already authorized. I will also mention that my iTunes account is the only account used on my Mac, and it is the only account I've used on my iPhone. I've synced them plenty of times with no problems and have never come across this message before, and now my iPhone won't sync. I have been online for an hour looking for solutions, but I haven't found any that worked. I've rebooted both my Mac and iPhone with no success. Now, for some reason, when I plug my iPhone in it does not even charge, but my Mac reads it and it shows up in iTunes, so I tried syncing it one more time and it just gets stuck on step 1. I must also state that it did successfully back up, but it absolutely will not sync. Can anybody please help me out without resetting my iPhone? Thanks

    Thanks Gail, but I've tried that, as well as logging out of and back into my accounts on my iPhone and Mac, and I still get that message. I actually found a workaround: I unchecked my synced music playlist and clicked Sync to remove it, then re-checked my playlist and synced, and it synced normally and I got all of my new songs. I wanted to test it, so I clicked Sync again, and the message popped up again until I repeated the un-syncing and re-syncing. I guess I will keep doing that in the future; it's annoying, but it's better than resetting my iPhone.

  • IPad could not be Synced because it is no longer authorized for purchased items that are on iPad

    I recently purchased an iPad Air and synced it with my computer. 14 days later I tried to sync it and got the message "The iPad could not be synced because this computer is no longer authorized for purchased items that are on this iPad." The Apple IDs are the same, and only 2 devices are authorized. The message says to go to the Store and authorize the computer; when I do, I get a message that it is already authorized.

    The problem appeared to be a pending request on the iPad for the new iCloud option to share the Keychain. I cleared that up, and that appears to have fixed the issue. Thanks for your help.
