Long runtime in table HRP1001

Hi,
We recently completed a Unicode conversion of our ECC 6 system. Since the conversion, one custom (customer-developed) program is taking far too long to execute. It worked fine when the system was non-Unicode.
An SQL trace shows that the program spends most of its time fetching records from table HRP1001. We have checked the program and the indexes on the table; they are the same as in our other non-Unicode systems, where the program runs fine.
Kindly advise.
Regards,
Prasad

Hi,
Please work with the BASIS team to update the optimizer statistics for these tables. This will help the database optimizer make better use of the indexes.
Log on as ora<sid> and run:
brconnect -u / -f stats -f allsel,collect,keep -t hrp1000
brconnect -u / -f stats -f allsel,collect,keep -t hrp1001
Thanks,
Puneet
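
In addition to refreshing the statistics, it may also be worth checking that the custom program's SELECT on HRP1001 supplies the leading key fields (OTYPE, OBJID, PLVAR, RSIGN, RELAT, ...) so that the primary key or a suitable index can be used at all. A minimal, purely illustrative ABAP sketch of such a selective access (the object ID and relationship values are placeholders, not taken from the actual program):

DATA lt_1001 TYPE STANDARD TABLE OF hrp1001.

" Selective access: the leading key fields of HRP1001 are all supplied,
" so the database can use the primary key or a secondary index efficiently.
SELECT * FROM hrp1001
  INTO TABLE lt_1001
  WHERE otype = 'O'          " organizational unit (placeholder)
    AND objid = '50000123'   " placeholder object ID
    AND plvar = '01'         " active plan version
    AND rsign = 'B'
    AND relat = '012'
    AND begda <= sy-datum
    AND endda >= sy-datum.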

Similar Messages

  • TDMS Data Transfer Step: Long runtime for tables GEOLOC and GEOOBJR

    Hi Forum,
    The data transfer for tables GEOLOC and GEOOBJR is taking too long (almost 3 days for 1.7 million records). There are no scrambling rules applied to these tables; in fact, since I am using TDMS for the first time, I am not using any scrambling rules at all to start with.
    Also, 30 processes have gone into error. How can I rerun those erroneous jobs?
    Any help is greatly appreciated.
    Regards,
    Anup

    Thanks Harmeet,
    I changed the write type for those activities and re-executed them, and they completed successfully.
    Now the data transfer is complete, but I see a difference in the number of records for these two tables (GEOLOC and GEOOBJR).
    Can you please let me know what might be the reason?
    Regards,
    Anup

  • Long runtime report SMIGR_CREATE_DDL

    Hi SAP Experts.
    I am migrating an SAP ERP 6.0 SR3 system from 32-bit to x64 using a system copy (export/import), but the report SMIGR_CREATE_DDL has a very long runtime and does not finish.
    How can I solve this problem?
    Best regards.
    Luis Gomez.

    Hi
    As far as I know, the report is primarily needed only on BI systems. As long as you don't have partitioned tables or bitmap indexes, you don't have to run SMIGR_CREATE_DDL; you will only end up with an empty directory.
    But to troubleshoot your problem, can you please tell us which database/version you have? Can you see which SQL statements are running?
    Best regards
    Michael
    Edit: I just tested the report on an ERP 6.0 system (on Oracle 10.2.0.2); it took ~2 hours to run and the output was empty.

  • Long runtimes due to P to BP integration

    Hi all,
    The folks on my project are wondering whether any of the experts out there have faced the following issue before. We have raised an OSS message for it but have yet to receive a concrete solution from SAP, so we are exploring other avenues of resolving this matter.
    Currently, we are facing an issue where a standard infotype BAdI is causing extremely long runtimes for programs that update certain affected infotypes. The BAdI name is HR_INTEGRATION_TO_BP, and SAP recommends that it be activated when E-Recruitment is implemented. A fairly detailed technical description follows.
    1. Within the IN_UPDATE method of the BAdI, function module HCM_P_BP_INTEGRATION is called to create linkages between a person object and a business partner object.
    2. Function module RH_ALEOX_BUPA_WRITE_CP will be called within HCM_P_BP_INTEGRATION to perform the database updates.
    3. Inside RH_ALEOX_BUPA_WRITE_CP, there are several subroutines of interest, such as CP_BP_UPDATE_SMTP_BPS and CP_BP_UPDATE_FAX_BPS. These subroutines are structured similarly and will call function module BUPA_CENTRAL_EXPL_SAVE_HR to create database entries.
    4. In BUPA_CENTRAL_EXPL_SAVE_HR, subroutine ADDRESS_DATA_SAVE_ES_NOUPDTASK calls function module BUP_MEMORY_PREPARE_FOR_UPD_ADR, which is where the problem begins.
    5. BUP_MEMORY_PREPARE_FOR_UPD_ADR contains two subroutines, PREPARE_BUT020 and PREPARE_BUT021. Both contain similar code in which a LOOP is performed on a global internal table (GT_BUT020_MEM_SORT/GT_BUT021_MEM_SORT) and entries are appended to another global internal table (GT_BUT020_MEM/GT_BUT021_MEM). These tables (GT_BUT020_MEM/GT_BUT021_MEM) are used later on for updates to database tables BUT020 and BUT021_FS. However, we noticed that these two tables are not cleared after the database update, which results in an ever-increasing number of entries being written to the database, even though many of them may already have been updated.
    If any of you are interested in seeing whether this issue affects you, simply run a program that updates infotype 0000, 0001, 0002, 0006 subtype 1, 0009 subtype 0 or 0105 subtype (0001, 0005, 0010 or 0020) to replicate this scenario, provided E-Recruitment is implemented in your system. Not many infotype updates are required to see the issue; two are enough to tell whether the tables in point 5 are being cleared. (We have observed that this issue occurs during the creation of a new personnel number, and hence a new business partner. For existing personnel numbers, the same code is executed, but the internal tables in point 5 are not populated.)
    System details: SAP ECC 6.0 (Support package: SAPKA70021) with E-Recruitment (Support package: SAPK-60017INERECRUIT) implemented.
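
    For illustration only, the pattern described in point 5 looks roughly like the simplified sketch below (the table line types and names are shortened placeholders, not the actual SAP code); the missing CLEAR after the database update is what makes the buffer grow with every call:

    " Simplified illustration of the suspected pattern - not the original SAP code.
    DATA: gt_but020_mem      TYPE STANDARD TABLE OF but020,
          gt_but020_mem_sort TYPE STANDARD TABLE OF but020.

    FIELD-SYMBOLS: <ls_mem_sort> TYPE but020.

    " PREPARE_BUT020 (simplified): copy the sorted buffer into the update buffer.
    LOOP AT gt_but020_mem_sort ASSIGNING <ls_mem_sort>.
      APPEND <ls_mem_sort> TO gt_but020_mem.
    ENDLOOP.

    " Later, the database is updated from GT_BUT020_MEM.
    MODIFY but020 FROM TABLE gt_but020_mem.

    " Observed problem: the buffer is not cleared afterwards, e.g. there is no
    "   CLEAR gt_but020_mem.
    " so every subsequent call re-processes all previously updated entries as well.
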
    Thanks for reading.

    Hi Annabelle,
    We have a similar setup, but are on SAPK-60406INERECRUIT.  Although the issue does not always occur, we do have a case where the error ADDRESS_DATA_SAVE_ES is thrown.
    Did you ever resolve your issue?  Hoping that solution can help guide me.
    Thanks
    Shane

  • BPS0 - very long runtime

    Hi gurus,
    During manual planning in BPS0, very long runtimes occur.
    FOX formulas are used.
    A lot of data is selected, but that is required by the business.
    Memory looks fine in ST02 (usually only 10-15% of the resources are used) and there are no dumps, but the runtime is very long.
    I have examined the hardware, the system, and the database with different methods and found nothing unusual.
    Could you please give me more advice on how to do extra checks on the system (preferably from a Basis point of view)?
    BW 3.1. - patch 22
    SEM-BW 3.5 - patch 18
    Thanks in advance
    Elena

    Hello Elena,
    You need to take a structured approach. "Examining" things is fine but usually does not lead to results quickly.
    Performance tuning works best as follows:
    1) Check statistics or run a trace
    2) Find the slowest part
    3) Make this part run faster (better, eliminate it)
    4) Back to #1 until it is fast enough
    For the first round, use the BPS statistics. They will tell you if BW data selection or BPS functions are the slowest part.
    If BW is the problem, use aggregates and do all the things to speed up BW (see course BW360).
    If BPS is the problem, check the webinar I did earlier this year: https://www.sdn.sap.com/irj/sdn/webinar?rid=/webcontent/uuid/2ad07de4-0601-0010-a58c-96b6685298f9 [original link is broken]
    Also the BPS performance guide is a must read: https://www.sdn.sap.com/irj/servlet/prt/portal/prtroot/docs/library/uuid/7c85d590-0201-0010-20b5-f9d0aa10c53f
    Next, would be SQL trace and ABAP performance trace (ST05, SE30). Check the traces for any custom coding or custom tables at the top of the runtime measurements.
    Finally, you can often see from the program names in the ABAP runtime trace, which components in BPS are the slowest. See if you can match this to the configuration that's used in BPS (variables, characteristic relationships, data slices, etc).
    Regards
    Marc
    SAP NetWeaver RIG

  • Long runtime for CU50

    Hi there, is there any way we can update the statistics for table CABN? We encounter a long runtime when executing transaction CU50, and we found that the process keeps accessing table CABN, which contains more than 10k characteristic records. Thanks

    If you are running on IBM i (i5/OS, OS/400), there is no need to update statistics for a database table, because that is done automatically by the database.
    If you have a slow transaction, you can analyze it through transaction ST05 and then use the Explain function on the longest running statement. Within the Explain, there is a function "Index advised", that might help in your case.
    Kind regards,
    Christian Bartels.

  • The data in table HRP1001 is inconsistent with transaction PPOMA_CRM

    Hi gurus:
    I created a standalone structure model in SAP CRM for a service scenario.
    I have an organizational unit that has two positions; one is a leader position and the other is a normal position.
    We are using function RH_STRUC_GET with relation "012" to look for the leader position of an organizational unit in HRP1001.
    We realized that HRP1001 contains a lot of positions assigned with relation "012", as if the organizational unit had only leader positions. This is incorrect, because it is not what I created in transaction PPOMA_CRM; as I said above, every organizational unit has two positions, one leader position and one normal position.
    Do you know if there is a report that corrects the table, or is this standard behavior?
    Thanks in advance.
    Susana

    Hi:
    When you copy an organizational unit or a position and then move it to another hierarchy, table HRP1001 keeps the relationship data of the original unit or position it was copied from.
    We solved the problem by deleting all of these records, and in the future we will not create objects as copies of others.
    Susana
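
    For reference, here is a rough, purely illustrative sketch of how the unexpected "012" records of one organizational unit could be listed for review before such a cleanup; the report name and the selection parameter are hypothetical:

    REPORT z_check_hrp1001_012.

    PARAMETERS p_orgid TYPE hrp1001-objid.   " org unit to check (hypothetical parameter)

    DATA lt_leader TYPE STANDARD TABLE OF hrp1001.
    FIELD-SYMBOLS <ls_leader> TYPE hrp1001.

    " List all currently valid 'is managed by' (relation 012) entries of the org unit,
    " including any that were carried over when objects were created as copies.
    SELECT * FROM hrp1001
      INTO TABLE lt_leader
      WHERE plvar = '01'          " plan version used in PPOMA_CRM
        AND otype = 'O'
        AND objid = p_orgid
        AND rsign = 'B'
        AND relat = '012'
        AND begda <= sy-datum
        AND endda >= sy-datum.

    LOOP AT lt_leader ASSIGNING <ls_leader>.
      WRITE: / <ls_leader>-sobid, <ls_leader>-begda, <ls_leader>-endda.
    ENDLOOP.

    " Expectation: exactly one valid leader position per org unit; any additional
    " rows are candidates for the cleanup described above.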

  • Update table HRP1001. Change the Position and the Org Unit.

    Hi,
    I am new to SAP HR and have been given some work in this area. The requirement is to update the relationship table HRP1001, replacing the old manager's position and org unit with the new manager's position and org unit.
    I have somehow found a way to get the position and the org unit of the old manager. At the moment I am replacing them with the new details using a direct UPDATE on HRP1001, but this is an old approach and not the recommended way of achieving this task.
    Can anyone please suggest a function module or a BAPI to update table HRP1001 with the new manager's position and org unit?
    Thanks...!!!
    Regards,
    Deepak.

    Hi Kalpesh,
    I am using FM 'BAPI_HRMASTER_SAVE_REPL_MULT' but it is still not updating. I am trying to change the position and the org unit but have not been successful yet.
    I first used the operator 'I' in field BAPIHROBJ-OPERATOR and it failed; then I tried 'U' for update and it still failed. Can you please let me know which other operations are possible, with their character values, apart from 'I' and 'U'? There is no fixed value set on the domain, as it is just CHAR1.
    I have now learned that I have to terminate the old relationship by changing its ENDDA and create a new one with tomorrow's date as the BEGDA.
    Can you please suggest something?
    Regards,
    Deepak.
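
    Since the approach described above is to delimit the old relationship and create a new one, here is a heavily hedged sketch of that pattern. It uses RH_CUT_INFTY to delimit and RH_INSERT_INFTY to insert; the function module parameter names are quoted from memory and should be verified in SE37, the relationship '002' and the object types are only placeholders for whatever actually links your objects, and the lv_* variables are hypothetical:

    DATA: lt_p1001   TYPE STANDARD TABLE OF p1001,
          ls_p1001   TYPE p1001,
          lv_old_obj TYPE hrp1001-objid,   " source object of the old relationship (to be filled)
          lv_new_tgt TYPE hrp1001-sobid.   " target object of the new relationship (to be filled)

    " 1) Read the currently open relationship record(s) that should be terminated.
    SELECT * FROM hrp1001
      INTO CORRESPONDING FIELDS OF TABLE lt_p1001
      WHERE plvar = '01'
        AND otype = 'S'            " placeholder source object type
        AND objid = lv_old_obj
        AND rsign = 'A'
        AND relat = '002'          " placeholder relationship
        AND endda = '99991231'.

    " 2) Delimit them: set ENDDA to today (parameter names from memory - verify in SE37).
    CALL FUNCTION 'RH_CUT_INFTY'
      EXPORTING
        gdate  = sy-datum
        vtask  = 'D'               " direct update
      TABLES
        innnn  = lt_p1001
      EXCEPTIONS
        OTHERS = 1.

    " 3) Create the new relationship, valid from tomorrow.
    CLEAR: ls_p1001, lt_p1001.
    ls_p1001-plvar = '01'.
    ls_p1001-otype = 'S'.          " placeholder source object type
    ls_p1001-objid = lv_old_obj.
    ls_p1001-rsign = 'A'.
    ls_p1001-relat = '002'.        " placeholder relationship
    ls_p1001-istat = '1'.
    ls_p1001-begda = sy-datum + 1.
    ls_p1001-endda = '99991231'.
    ls_p1001-sclas = 'S'.          " placeholder target object type
    ls_p1001-sobid = lv_new_tgt.
    APPEND ls_p1001 TO lt_p1001.

    CALL FUNCTION 'RH_INSERT_INFTY'
      EXPORTING
        vtask  = 'D'
      TABLES
        innnn  = lt_p1001
      EXCEPTIONS
        OTHERS = 1.

    " Depending on VTASK, RH_UPDATE_DATABASE and/or an explicit COMMIT WORK may still be required.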

  • How to find the long/raw datatype tables

    Hi all,
    I want to find the tables with LONG/RAW datatype columns in an Oracle database.
    Please provide the query.
    Oracle version: 10gR2
    Platform: HP-UX

    Hi,
    Is this what you are looking for?
    SELECT
          TABLE_NAME,
          COLUMN_NAME,
          OWNER
    FROM
          DBA_TAB_COLUMNS
    WHERE
          DATA_TYPE IN ('LONG', 'RAW', 'LONG RAW');  -- 'LONG RAW' included as well, since such columns are usually of interest too
    Regards

  • How to append data at runtime to a table in a MIDlet

    Hi Friends,
    I have two queries.
    1st: How can I append data at runtime to a table in a MIDlet (like on the web), where the data comes from a database?
    2nd: The requirement is that the first row of the table should contain headings such as StartDate, EndDate, Resources and Status, and from the second row onwards the columns should hold the runtime data (just like any normal table - I hope you get my point).
    Please reply as early as possible.
    Waiting for your reply.
    Best regards
    karan

    Presently you cannot use AJAX-style techniques in J2ME.
    If you want to achieve this functionality, it is better to look at articles written on creating custom items in J2ME. That will help you achieve the kind of requirement you are expecting.
    ~Mohan

  • Update table HRP1001

    Hello,
    My requirement is to update table HRP1001. In some cases I need to terminate a relationship, i.e. change the ENDDA to the current date, and in other cases create a new relationship between an employee, a position and an org unit, i.e. insert a few new records into table HRP1001.
    I have tried 'RH_UPDATE_INFTY' but without success. I supplied only two parameters: VTASK = 'D'/'S'/'V' and the INNNN table.
    Can anyone please tell me why this happens? It returns SY-SUBRC 0, and I then call FM RH_UPDATE_DATABASE to commit the changes, but still there is no result.
    I also tried HR_INFOTYPE_OPERATION, but I guess that is not meant for infotype 1001, as it is the relationships infotype.
    I tried the BAPI 'BAPI_HRMASTER_SAVE_REPL_MULT', but still no result.
    Can anyone please guide me as to which FM I should use to terminate a relationship, i.e. change the ENDDA from 31/12/9999 to the current date, and also insert new records to create new relationships valid from SY-DATUM + 1 to 31/12/9999?
    Your help on this would be highly appreciated.
    Regards,
    Deepak.

    Please have a look at the thread below; I think you already have, but it is worth another look:
    [BAPI to maintain Infotype]

  • Regarding table HRP1001

    Hi guys,
    I need to know what is what in table HRP1001.
    1) There are a number of fields in this table, but my interest is in OBJID and SOBID.
    As far as I know, this is a relationship table and it defines relations between objects.
    I understand that if OTYPE = 'P', then OBJID is the personnel number.
    But the problem is with SOBID. What does SOBID mean if SCLAS = 'P'?
    Could someone explain what this table means and how the objects are related?
    2) A related issue:
    "Find the employee's position ID using relationship B 008 in HRP1001, from the Person object to the Position object."
    What would the selection look like?
    Thanks in advance
    Abhi..

    Hi Abhi,
    1.
    HRP1001: relationships table. Relationships between different objects are stored in this table.
    HRP1000: objects table, which stores the objects themselves, for example:
    Job,
    Position,
    Person,
    Organizational unit,
    Personnel subarea,
    Employee subgroup,
    Employee group,
    Company code,
    Business area.
    2. Example:
    Say you know an employee's PERNR and you want to get that employee's manager.
    a.
    Get the organizational unit from table PA0001.
    b.
    Based on this organizational unit object, get the position that manages the organization, with a select like:
    SELECT SINGLE sobid
      FROM hrp1001
      INTO l_sobid
      WHERE otype = 'O'
        AND objid = pa0001-orgeh
        AND plvar = '01'
        AND rsign = 'B'
        AND relat = '012'.
    This query uses the relationship between POSITION and ORGANIZATIONAL UNIT.
    c.
    After getting the manager's position, use table HRP1001 again to get the manager's PERNR:
    SELECT SINGLE sobid
      FROM hrp1001
      INTO l_sobid
      WHERE otype = 'S'
        AND objid = l_sobid
        AND plvar = '01'
        AND rsign = 'A'
        AND relat = '008'.
    This query uses the relationship between POSITION and PERSON (here l_sobid first holds the position from step b and is then overwritten with the person's ID).
    I hope you can understand.
    Thanks,
    Venkat.O
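
    For question 2 above (Person to Position via relationship B 008), a selection along the following lines is commonly used; treat it as a sketch and adjust the plan version and date restriction to your system (lv_pernr and lv_position are illustrative variables):

    DATA: lv_pernr    TYPE hrp1001-objid,   " personnel number as OM object ID
          lv_position TYPE hrp1001-sobid.   " related position

    SELECT SINGLE sobid
      FROM hrp1001
      INTO lv_position
      WHERE otype = 'P'            " person
        AND objid = lv_pernr
        AND plvar = '01'
        AND rsign = 'B'
        AND relat = '008'          " holder relationship
        AND sclas = 'S'            " related object is a position
        AND begda <= sy-datum
        AND endda >= sy-datum.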

  • Gather table stats taking longer for Large tables

    Version: 11.2
    I've noticed that gathering stats (using dbms_stats.gather_table_stats) takes longer for large tables.
    Since the row count needs to be calculated, a big table's stats collection would understandably take somewhat longer (it runs a SELECT COUNT(*) internally).
    But for a non-partitioned table with 3 million rows, should it really take 12 minutes to collect the stats? Apart from the row count and index information, what other information is gathered by gather table stats?
    Does table size actually matter for stats collection?

    Max wrote:
    Version: 11.2
    I've noticed that gathering stats (using dbms_stats.gather_table_stats) takes longer for large tables.
    Since the row count needs to be calculated, a big table's stats collection would understandably take somewhat longer (it runs a SELECT COUNT(*) internally).
    But for a non-partitioned table with 3 million rows, should it really take 12 minutes to collect the stats? Apart from the row count and index information, what other information is gathered by gather table stats?
    09:40:05 SQL> desc user_tables
    Name                            Null?    Type
    TABLE_NAME                       NOT NULL VARCHAR2(30)
    TABLESPACE_NAME                        VARCHAR2(30)
    CLUSTER_NAME                             VARCHAR2(30)
    IOT_NAME                             VARCHAR2(30)
    STATUS                              VARCHAR2(8)
    PCT_FREE                             NUMBER
    PCT_USED                             NUMBER
    INI_TRANS                             NUMBER
    MAX_TRANS                             NUMBER
    INITIAL_EXTENT                         NUMBER
    NEXT_EXTENT                             NUMBER
    MIN_EXTENTS                             NUMBER
    MAX_EXTENTS                             NUMBER
    PCT_INCREASE                             NUMBER
    FREELISTS                             NUMBER
    FREELIST_GROUPS                        NUMBER
    LOGGING                             VARCHAR2(3)
    BACKED_UP                             VARCHAR2(1)
    NUM_ROWS                             NUMBER
    BLOCKS                              NUMBER
    EMPTY_BLOCKS                             NUMBER
    AVG_SPACE                             NUMBER
    CHAIN_CNT                             NUMBER
    AVG_ROW_LEN                             NUMBER
    AVG_SPACE_FREELIST_BLOCKS                   NUMBER
    NUM_FREELIST_BLOCKS                        NUMBER
    DEGREE                              VARCHAR2(10)
    INSTANCES                             VARCHAR2(10)
    CACHE                                  VARCHAR2(5)
    TABLE_LOCK                             VARCHAR2(8)
    SAMPLE_SIZE                             NUMBER
    LAST_ANALYZED                             DATE
    PARTITIONED                             VARCHAR2(3)
    IOT_TYPE                             VARCHAR2(12)
    TEMPORARY                             VARCHAR2(1)
    SECONDARY                             VARCHAR2(1)
    NESTED                              VARCHAR2(3)
    BUFFER_POOL                             VARCHAR2(7)
    FLASH_CACHE                             VARCHAR2(7)
    CELL_FLASH_CACHE                        VARCHAR2(7)
    ROW_MOVEMENT                             VARCHAR2(8)
    GLOBAL_STATS                             VARCHAR2(3)
    USER_STATS                             VARCHAR2(3)
    DURATION                             VARCHAR2(15)
    SKIP_CORRUPT                             VARCHAR2(8)
    MONITORING                             VARCHAR2(3)
    CLUSTER_OWNER                             VARCHAR2(30)
    DEPENDENCIES                             VARCHAR2(8)
    COMPRESSION                             VARCHAR2(8)
    COMPRESS_FOR                             VARCHAR2(12)
    DROPPED                             VARCHAR2(3)
    READ_ONLY                             VARCHAR2(3)
    SEGMENT_CREATED                        VARCHAR2(3)
    RESULT_CACHE                             VARCHAR2(7)
    09:40:10 SQL>
    > Does table size actually matter for stats collection?
    Yes.
    Handle:     Max
    Status Level:     Newbie
    Registered:     Nov 10, 2008
    Total Posts:     155
    Total Questions:     80 (49 unresolved)
    why so many unanswered questions?

  • CDB Upgrade 4.0 - 5.0: Long Runtime

    Hello all,
    We are in the middle of a CRM upgrade from 4.0 to 7.0 and are currently doing the CDB upgrade from 4.0 to 5.0. As part of the segment download, I am downloading CAPGEN_OBJECT_WRITE, and it has created a few lakh (several hundred thousand) entries in SMQ2.
    The system has been processing those entries for the last 3 days, and although they are being processed, we cannot afford such a long runtime during go-live. Did I miss something?
    Have you ever faced such a scenario? I would appreciate your valuable feedback on this.
    Thanks in advance,
    Regards
    Pijush

    Hi William,
    COBRAS has its limitations when it comes to Internet subscribers,
    as noted in the link: Internet subscribers, Bridge, AMIS, SMTP users and such will not be included.
    http://www.ciscounitytools.com/Applications/General/COBRAS/Help/COBRAS.htm
    You might try using the Subscriber Information Dump under Tools Depot > Administration Tools > Subscriber Information Dump, then export from the old server and import into the new Unity server.
    Rick Mai

  • Web Application Designer 7 - Long Runtime

    Hi,
    I'm working in a BI 7 environment, and to fulfil the users' requirements we have developed a web template containing almost 30 queries.
    We are facing a very long runtime for that report on the web. After analysing the BI statistics, we found that the DB and OLAP parts do not take very long to run; it is the front end (the web template) that causes the delay. Another observation is that most of the time is consumed while the web template is being loaded/initialized; once it is loaded, flipping between the different tabs (reports) doesn't take much time.
    My questions are:
    What can I do to reduce the web template initialization/loading time?
    Is there any way I can get the time taken by the front end in the statistics? (Currently we get the DB and OLAP times from the BI statistics cube and treat the remaining time as front-end time, because the standard BI statistics cube cannot capture front-end time when the report runs in a browser.)
    What technical processes are involved when information moves from the DB back to the browser?
    Your earliest help would be highly appreciated. Please let me know if you require any further information.
    Regards,
    Shabbar
    0044 (0) 7856 048 843

    Hi,
    It asks you to log in to the Portal because the output of web templates can be viewed only through the Enterprise Portal. This is perfectly normal. The BI-EP configuration should be set up properly, and you need a login ID and password for the Portal.
    For using WAD and designing the front end, go through the link below. It should help you.
    http://help.sap.com/saphelp_nw70/helpdata/en/b2/e50138fede083de10000009b38f8cf/frameset.htm
