ODSI 10gR3 audit logs common/time question

Hi
With ODSI 10gR3, we are investigating delays in the processing of some DB2 inserts.
The inserts occur daily, but the problem happens maybe once a week.
A review of the audit log during a problem occurrence shows the following:
common/time is taking 33 seconds:
common/time {
timestamp: Mon Mar 01 10:21:36 PST 2010
duration: 33323 }
with compile time being ~14 seconds and insert time being ~4 seconds.
Is it possible that things such as a full GC occurring can impact this time?
We increased the tx timeout to 120 seconds to avoid the timeout, but would like to investigate this further.
Thanks Much for any info
Best
####<Mar 1, 2010 10:22:09 AM PST> <Info> <ODSI> <qa-sc-eibapp02.corp.test.com> <ds_ms2> <[ACTIVE] ExecuteThread: '1' for queue: 'weblogic.kernel.Default (self-tuning)'> <service.uateibsads> <> <> <1267467729490> <BEA-000000> <ClientDataspace> <DSPAuditEvent timestamp: Mon Mar 01 10:21:36 PST 2010 severity: FAILURE id: ClientDataspace:48:Mon Mar 01 10:21:36 PST 2010 {
common/application {
user: service.uateibsads
name: ClientDataspace
eventkind: update
server: ds_ms2
query/performance {
compiletime: 14869
update/relational {
source: CBS_DB2_DS
sql: INSERT INTO "S109935E"."C@CR538"."C@CUNEW01" ("C@STATUS", "C@SEQNBR", "CUBK", "CUNBR", "CUSTAT",
"CUALT", "CUNA1", "CUNA2", "CUNA3", "CUNA4", "CUNA5", "CUNA6", "CUZIP", "CUZIP2", "CUZIP3",
"CUZIP4", "CUSHRT", "CUSSNR@", "CUSSTY", "CUCLPH", "CUHMPH", "CUBUPH", "CUPOFF", "CUSOFF",
"CUPOF1", "CUPOF2", "CUOPDT", "CUTYPE", "CUTYP", "CUSIC", "CUSEX", "CURACE", "CUOWN", "CUYREM",
"CUINC", "CUSRIN", "CUBDTE", "CUDEP", "CUCTC", "CUCTCT", "CUCIRA", "CUMNBR", "CUNTID", "CUUSR1",
"CUCLNK", "CUUSR3", "CUCDCH", "CUCDCN", "CUCDCD", "CUCMCH", "CUCMNR", "CUCMCD", "CUCVSH",
"CUCVCN", "CUCVCD", "CUCATH", "CUCATN", "CUCATD", "CUCLNG", "CUCCCD", "CULGLR", "CUCWHP",
"CUCPSP", "CUCTXN", "CUCPRF", "CUSHKY", "CUITLD", "CUPSTL", "CUACOM", "CUBRCH", "CUMIDT@",
"CUMRTS", "CUMAIL", "CUSOLI", "CUSOCI", "CUCPNA", "CUBPNA", "CUPERS", "CUSALU", "CUFAX",
"CUTELX", "CUTXAN", "CUDOCF", "CUDCDT", "CUTINU", "CUTADT@", "CUWPRT", "CUCECD", "CUCELM",
"CUEXTF", "CUMTND", "CUCNCD", "CUMARK", "CUEMPL", "CUINQ", "CUMNT", "CUCENS", "CUCODT", "CUDEDT",
"CUACCD", "CUBYR1", "CUBYR2", "CUPREF", "CUJBDT", "CUJDDT", "CUEMA1", "CUEMA2", "CUOPT",
"CUOPTD", "CUSPFG", "CUOENTTYP", "CUCENTTYP", "CUAMDT", "CUESDT", "CUENA1", "CUENA2", "CUENA3",
"CUENA4", "CUENA5", "CUENA6", "CUEPST", "CUEZIP", "CUEZIS", "CUEZP3", "CUEZP4", "CUAAPL",
"CUAAKY", "CUAREC", "CUASTA", "CUANA1", "CUANA2", "CUANA3", "CUANA4", "CUANA5", "CUANA6",
"CUAZP1", "CUAZP2", "CUAZP3", "CUAZP4", "CUAPSD", "CUASTR", "CUASTP", "CUASTS", "CUAFLG",
"CUARFG", "CUCLS", "CURISK", "CURDT1", "CURSK2", "CURDT2", "CUCRLN", "CUCRDT", "CUCRFR",
"CUCRND", "CUCRPR", "CUFSDT", "CUFSFR", "CUFSND", "CUSALE", "CUCSTS", "CUNETI", "CUPRJI",
"CUASST", "CUCURA", "CUCASH", "CUACCR", "CUMKTS", "CUREAL", "CULIFE", "CUINVN", "CUFIXA",
"CULIAB", "CUCURL", "CULTRM", "CUNETW", "CUDIRL", "CUINDL", "CUDIRT", "CUINDT", "CUREDB",
"CULCRO", "CUOTHA", "CUOTHL", "CU5WHP", "CUIWHY", "CUWHEX", "CUFILL", "CUREC1", "CUSTAD",
"CUFRN1", "CUCHIB", "CUCHID", "CUCLOB", "CUCLOD", "CUCCDD", "CUCDD1", "CUCDD2", "CUCHD1",
"CUCHD2", "CUCHP1", "CUCHP2", "CUCHP3", "CUCHP4", "CUCOL1", "CUCOL2", "CUCOL3", "CUCOL4",
"CUCCDT", "CUCCYD", "CUCTYD", "CUCIDB", "CUCDIR", "CUCIND", "CUCSEC", "CUCUNS", "CUCILD",
"CUCOPN", "CUCTOD", "CUCNON", "CUCHGO", "CUCRNB", "CUCQAG", "CUCQHU", "CUCQHI", "CUCQLO",
"CUCQDD", "CUCQDT", "CUDPD1", "CUDPD2", "CUDPD3", "CUDPDP", "CUDPD", "CUIPD", "CUUPD", "CUCAGY",
"CUCIAM", "CUCIAY", "CUCPAM", "CUCPAY", "CUCLTC", "CUCLSO", "CUUCMO", "CULCCA", "CUDRSD",
"CUFACS", "CUINDS", "CUIPDS", "CUSECS", "CUUNSS", "CUDPDS", "CUUCMS", "CUBDIR", "CUBCMO",
"CUTHR1", "CUTHR2", "CUTHR3", "CUTHR4", "CUTHR5", "CUFIV1", "CUFIV2", "CUFIV3", "CUFIV4",
"CUFIV5", "CUTEN1", "CUTEN2", "CUTEN3", "CUTEN4", "CUTEN5", "CUTWN1", "CUTWN2", "CUTWN3",
"CUTWN4", "CUTWN5", "CUTHI1", "CUTHI2", "CUTHI3", "CUTHI4", "CUTHI5", "CUSEV1@", "CUSEV2",
"CUSEV3", "CUSEV4", "CUSEV5")
VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?,
rowsmodified: 1
time: 3655
common/application {
exception: Transaction timed out after 29 seconds
BEA1-000021E917F59C34B15A
update/service {
procedure: InsertNewCIF
arity: 1
dataservice: ld:CoreBankingSystem/LogicalServices/CreateNewCIF.ds
script: declare namespace ns0="ld:CoreBankingSystem/LogicalServices/CreateNewCIF";
declare namespace ns1="http://www.test.com/schemas/client/cbs/logical";
declare variable $__fparam0 as element(ns1:NewCIF)* external;
{ return value ns0:InsertNewCIF($__fparam0); }
common/time {
timestamp: Mon Mar 01 10:21:36 PST 2010
duration: 33323 }
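As a quick sanity check on the numbers in this audit record: the overall common/time duration is not fully explained by the compile and insert timings, which points at a sizable chunk of time spent elsewhere (all values below are copied from the log):

```python
# Timings copied from the audit record above, in milliseconds.
total_ms = 33323    # common/time duration
compile_ms = 14869  # query/performance compiletime
insert_ms = 3655    # update/relational time

# Time the audit does not attribute to query compilation or the DB2 insert.
unaccounted_ms = total_ms - compile_ms - insert_ms
print(unaccounted_ms / 1000.0)  # -> 14.799 (seconds)
```

That unaccounted ~15 seconds is what the answers below speculate about (first-use initialization, locks, or GC).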

1) Is it possible that there is a database lock preventing the insert from being committed?
2) What does the audit look like for a successful update?
3) Notice that the "compile time" is non-zero. This indicates that the plan was not cached, likely because it was the first time this was executed after the server was started. So not only do you have the extra query compilation time, there would also be time for loading classes and other initialization (but 12 seconds of loading and initialization seems like a lot). Given that increasing the tx timeout to 120 seconds solves the problem (it does solve the problem, doesn't it?), I would say that this is the issue.
4) Given that you just started the server (right? see (3)), it's not likely this is due to GC. But you could enable verbose GC logging to see.
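If you do enable verbose GC, one quick way to spot long pauses is to scan the resulting log for pause durations. A minimal Python sketch, assuming the classic HotSpot "-verbose:gc" full-GC line format (the exact format varies by JVM version and flags, so the regex and the sample line here are assumptions, not ODSI output):

```python
import re

# Minimal sketch: scan HotSpot -verbose:gc output for long full-GC pauses.
# The line format below is an assumption; it varies by JVM version and flags.
sample = "[Full GC 786432K->123456K(1048576K), 12.3456789 secs]"

def long_pauses(lines, threshold_secs=5.0):
    pat = re.compile(r"Full GC.*?,\s*([\d.]+)\s*secs")
    pauses = []
    for line in lines:
        m = pat.search(line)
        if m and float(m.group(1)) >= threshold_secs:
            pauses.append(float(m.group(1)))
    return pauses

print(long_pauses([sample]))  # -> [12.3456789]
```

A pause anywhere near the 29-second transaction timeout would show up immediately in such a scan.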

Similar Messages

  • Audit log trimmer timer job

Hi,
What does the 'Audit log trimmer' timer job do?
I set the site collection audit settings to trim the audit data after 7 days.
Before deleting the data, does this job generate an Excel report and save it in a document library?
adil

The job 'Trims audit trail entries from site collections.'
It runs by default every month, which means you need to adjust the schedule so that it runs weekly so that it'll pick up your accelerated audit rules.
This is the job that exports the data to Excel and cleans up the entries from the database.

  • Audit Log Trimming Timer Job stuck at "pausing" status

    Hi,
    We have a SharePoint 2010 farm and our Audit table is growing rapidly. I checked our "Audit log Trimming" timer job and it has been stuck at "pausing" status for more than a month. Any advice to resolve this issue would be great.
    Thanks,
    norasampang

    Hi Trevor,
Do you think the reason the timer job is failing is that the audit log table is big and the audit timer job times out? I saw your reply at this post, where you mentioned:
"It may be timing out. Have you executed it manually to see if it runs without errors?"
Can you please explain in more detail what you meant by that? I was thinking of trying to trim the audit log using this script in small batches. Can you please let me know if this script seems right?
    $site = Get-SPSite -Identity http://sharepointsite.com
    $date = Get-Date
    $date = $date.AddDays(-1021)
    $site.Audit.DeleteEntries($date) 
At first I would like to delete all data older than 1021 days and eventually get rid of the other logs in smaller chunks. Any advice and suggestions would be highly appreciated.
    Thanks,
    norasampang

  • When will audit on "CREATE SESSION" generate audit log?

    Hi,
When we set up audit on "CREATE SESSION" for all users with access, does it mean there will be an audit log every time some user is granted "CREATE SESSION", or does it mean that there will be an audit log every time a user connects? (I'm assuming "CREATE SESSION" is internally executed every time a user logs into the database.)
    The command would be something like: audit CREATE SESSION by access
    Thanks for the help.

    Hozy wrote:
    Thanks Aman.
I read somewhere that it is a good practice to enable audit on "CREATE SESSION" for all users by access, and the next question that came to my mind is: if we do this, when will the audit log get generated?
To enable auditing you have to set the audit_trail initialization parameter; please refer to
http://www.oracle-base.com/articles/10g/Auditing_10gR2.php
If it is every time a user connects, then it will be a performance overhead.
No, but after some days you can delete old records from sys.aud$.

  • "logon time" between USR41 and security audit log

    Dear colleagues,
    I got a following question from customer for security audit reason.
> Are the 'Logon date' and 'Logon time' values stored in table USR41 exactly the same as the
> logon history in the Security Audit Log (Tr-cd: SM20)?
Table USR41 saves the 'logon date' and 'logon time' when a user logs on to the SAP System from SAP GUI.
The Security Audit Log (Tr-cd: SM20) can also save a user's logon history;
the security audit log is recorded at the time the user logs on.
I tried to check the SAP GUI logon program SAPMSYST in several ways; however,
I could not check it because the program is protected even against read access.
I want to know about the specification of the "logon time" in USR41 versus the security audit log,
or about how to look into the program SAPMSYST and debug it.
    Thank you.
    Best Regards.

    Hi,
    If you configure Security Audit you can achieve your goals...
1-Audit which employees access the screens, tables, data, etc.
    Answer : Option 1 & 3
    2-Audit all changes by all users to the data
    Answer : Option 1 & 3
    3-Keep the data up to one month
    Answer: No such settings, but you can define maximum log size.
    4-Log retention period can be defined.
    Answer: No !.. but you can define maximum log size.
    SM19/SM20 Options:
    1-Dialog logon
    You can check how many users logged in and at what time
    2-RFC login/call
    Same as above you can check RFC logins
    3-Transaction/report start
    You can see which report or transaction are executed and at what time
(It will help you to analyse unauthorized data changes. Transactions/reports can give you an idea of what data has been changed, so you can see who changed the data.)
    4-User master change
    (You can see user master changes log with this option)
    5-System/Other events
    (System error can be logged using this option)
Hope that clears things up...
    Regards.
    Rajesh Narkhede

  • Feature Question - Audit Logging

    Hello forum. Sorry if this is the wrong thread for this topic, but I was wondering if audit logging is a Standard Edition feature or if it's Enterprise only.
I can't find anything specifically saying that it is or isn't, and it's not enabled on my database, so other than trying to enable it and getting an error message (or not), I can't find an answer.
    Sorry if this is a noob question, but I couldn't find anything about whether RMAN backup encryption was a Standard or Enterprise feature either, and it bit me in the ass when I found out it was Enterprise only.
    Thanks in advance...

    My apologies if you think I wasn't thorough enough in my own research before asking the question. I did do my own research using oracle's docs and google, and my final resource was the 10g Oracle Database Security Guide, which mentioned nothing about which versions supported which features.
    I did post my original question yesterday morning and received no response. As you said, this is a volunteer forum, and I'm grateful for the assistance, but as sad as you think it is that I asked for an answer within a time frame, I was left with no choice. The question was posed to me yesterday morning and I needed to return an answer this afternoon.
    In any event, thank you for the information.

  • How to change the Start and End Date and time in the Audit log?

    Installed C2S BM39SP1. It works.
    Go to: https://bmserver:8009
    Open: VPN Monitor | Audit log information.
    Problem - I cannot change the Date and time in the Audit Log Start (End).
    How can I do this?
    How can I get an everyday stat log:
    login ; date_time_login ; date_time_logout ; bytes_in ; bytes_out
    Serg

    Serg,
    It appears that in the past few days you have not received a response to your
    posting. That concerns us, and has triggered this automated reply.
    Has your problem been resolved? If not, you might try one of the following options:
    - Visit http://support.novell.com and search the knowledgebase and/or check all
    the other self support options and support programs available.
    - You could also try posting your message again. Make sure it is posted in the
    correct newsgroup. (http://support.novell.com/forums)
    Be sure to read the forum FAQ about what to expect in the way of responses:
    http://support.novell.com/forums/faq_general.html
    If this is a reply to a duplicate posting, please ignore and accept our apologies
    and rest assured we will issue a stern reprimand to our posting bot.
    Good luck!
    Your Novell Product Support Forums Team
    http://support.novell.com/forums/

  • Question: Best method for mounting drives at log-in time?

    I would like to know what others consider the best method for mounting drives at login time. I can see a few methods, such as start-up items on the client, start-up items on the server for managed users, and possibly a start-up script. One wrinkle in the scenario is that users can log in directly to the server, so the method should allow that to happen gracefully. Thanks in advance for your help.

    Hi Bigsky,
    You are asking some really fundamental questions that require quite a lot of explanation. Luckily Apple has some great documentation on their server software.
    You will be able to find your answers here by digging in a bit:
    http://www.apple.com/server/documentation/
    Good Luck!
    Dual 2.0Ghz G5   Mac OS X (10.4.3)  

  • Oblix audit logs to track last login time in Sun DS

    Hi,
    I would like to use oblix audit logs to track last login time in Sun DS.
    Is there a straightforward procedure to do that other than parsing the logs and using custom script to update Sun DS.
    Please advise.
    Thanks.

    Hi,
    In OAM you can define your own plugins to run during the authentication (you include them in the relevant authentication schemes) - you could write one that updates the user profile of the logged-in user. You would be pretty much on your own, though, all that OAM would give you is the DN of the logged in user. You would need to include libraries that connect to ldap (or maybe the plugin could send a web service call) and perform the necessary attribute updates. Authn plugins are documented in the Developer Guide: http://docs.oracle.com/cd/E15217_01/doc.1014/e12491/authnapi.htm#BABDABCG (actually that's for 10.1.4.3).
    Regards,
    Colin

  • Report RPUAUD00 Infotype audit log with date & time

    Hi all,
    I want to know from where the time is captured when we execute Report RPUAUD00 Infotype audit log?
    I want to know from which fields or tables or structures?
    We want to use the same logic in a custom report.
    Your help appreciated.
    Thanks & regards,
    Vikas

    Hi,
    Infotype logs are maintained in the PCL4 cluster. You can use function modules HR_INFOTYPE_LOG_GET_LIST
    and HR_INFOTYPE_LOG_GET_DETAIL to get infotype logs.
    Thanks,
    Aravind

  • I answered my security questions wrong 3 times. Please help me

    I answered my security questions wrong 3 times. Please help me.

    Hey annamyle91,
    Thanks for the question. If you are having issues with the security questions associated with your Apple ID, follow these steps:
    If you forgot the answers to your Apple ID security questions
    http://support.apple.com/kb/HT6170
    Reset your security questions
    1. Go to My Apple ID (appleid.apple.com).
    2. Select “Manage your Apple ID” and sign in.
    3. Select “Password and Security” on the left side of the page.
    4. If you have only one security question, you can change the question and answer now.
    5. If you have more than one security question:
              - Select “Send reset security info email to [your rescue email address].” If you don't see this link or don't have access to your rescue address, contact Apple Support as described in the next section.
              - Your rescue address will receive a reset email from Apple. Follow its instructions to reset your security questions and set up new questions and answers.
    After resetting your security questions, consider turning on two-step verification. With two-step verification, you don't need security questions to secure your account or verify your identity.
    If you can't reset your security questions
    Contact Apple Support in either of these circumstances:
              - You don't see the link to send a reset email, which means you don't have a rescue address.
              - You see the link to send a reset email, but you don't have access to email at the rescue address.
    A temporary support PIN isn't usually required, but Apple may ask you to generate a PIN if your identity needs to be verified.
    Thanks,
    Matt M.

  • SGD 4.6 audit logging question

    When using audit logging and filters, how often is a new log file automatically created? I'm interested in preserving the audit data and need to know when the file gets recreated.
    Thanks.

    By default, the logs are archived once a week via a crontab entry (Sunday, 0300, by default) - this option is set at install time, can be changed by running tarantella setup, or you can alter the crontab entry directly.
    See "7.2.3.6 Automatic Log Archives" in the admin guide - http://docs.oracle.com/cd/E26362_01/E26354/html/sgd-installation.html#sgd-installation-full-restore and
    http://docs.oracle.com/cd/E26362_01/E26354/html/tta-archive.html for a description of the archive command.
    The archive command closes out the current log files, compresses/summarizes them, stores them in a numbered sub-directory under the "logs" directory, and opens new ones. Once closed out, you can no longer "query" the logfiles from the command line. Note that there is a limited number of archives maintained - 7 sets, I believe, is the default.
    Have a look at the "archive" command under .../bin/scripts to see how it works.

  • Issue with Audit Log report in SharePoint 2010

    I have enabled the REPORTING feature at the site collection level and configured the site collection audit settings. I tried to generate audit log reports, but most of the time it just keeps on processing; it never reaches the "report generated successfully" message. How can I overcome this issue?

    I'm facing the same issue, even when I try to generate a report for a limited period (5 days) for a particular event (e.g. delete or restore items, or edit items).
    I think the reference below may help you solve your issue:
    http://sharepoint.stackexchange.com/questions/17151/how-often-should-the-auditing-log-be-cleared-to-not-affect-performance
    Sekar - Our life is short, so help others to grow

  • BOE XI 3.1 Removing Audit log files

    Hi there experts,
    we have an issue with our production BOE install (3.1 SP7) whereby we have over 39,000 audit log files awaiting processing in the BOE_HOME/auditing folder. These audit files were generated a few months back when we had an issue with the system whereby thousands of scheduled events were created; we are not sure how. The removal of these events has had a knock-on effect in that we have too many audit files to process, i.e. the system just can't process them all quickly enough.
    So my question is: can we just remove these audit files from the auditing directory with no knock-on effects? We don't need them loaded into the audit database anyway, as they are all multiples of the same event.
    As an aside, when we upgraded from SP3 to SP7 the problem went away, i.e. no new audit files for these delete events are being generated. We have still to establish how/why these audit events were created, but for the time being we just want to be able to remove them. Unfortunately, as it's a production system, we don't want to just take a chance and remove them without some advice first.
    thanks in advance
    Scott

    Is your auditing running now, or still pending? Can you check in the Audit DB what max(audit_timestamp) is? This will tell you when the most recent activity happened.
    Deleting the audit files will not harm your BO system; you just will not be able to see auditing details for that period.
    Are the new audit files being processed, or do you still see files created in the auditing folder without being processed?
    If an audit file's size shows 0 KB, it means it was processed.
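Following the observation above that a 0 KB audit file has already been processed, here is a small sketch to list the files still awaiting processing. The folder in the demo stands in for BOE_HOME/auditing, and the size heuristic is taken from the reply above, not from BOE documentation:

```python
import os
import tempfile

def pending_audit_files(folder):
    # Files with nonzero size have not been processed yet (per the note above).
    return sorted(
        name for name in os.listdir(folder)
        if os.path.getsize(os.path.join(folder, name)) > 0
    )

# Demo with a temporary directory standing in for BOE_HOME/auditing.
demo = tempfile.mkdtemp()
open(os.path.join(demo, "processed.aud"), "w").close()  # 0 KB: already processed
with open(os.path.join(demo, "pending.aud"), "w") as f:
    f.write("event data")                               # nonzero: still pending
print(pending_audit_files(demo))  # -> ['pending.aud']
```

Running this periodically would show whether the backlog of 39,000 files is actually shrinking.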

  • ABAP-HR real-time questions

    Hi friends,
    Kindly send me ABAP-HR real-time questions to my mail: [email protected]
    Thanks & Regards,
    babasish

    Hi
    Logical database
    A logical database is a special ABAP/4 program which combines the contents of certain database tables. Using logical databases facilitates the process of reading database tables.
    HR Logical Database is PNP
    Main Functions of the logical database PNP:
    Standard Selection screen
    Data Retrieval
    Authorization check 
    To use logical database PNP in your program, specify in your program attributes.
    Standard Selection Screen
    Date selection
    Date selection delimits the time period for which data is evaluated. GET PERNR retrieves all records of the relevant infotypes from the database.  When you enter a date selection period, the PROVIDE loop retrieves the infotype records whose validity period overlaps with at least one day of this period.
    Person selection
    Person selection is the 'true' selection of choosing a group of employees for whom the report is to run.
    Sorting Data
    · The standard sort sequence lists personnel numbers in ascending order.
    · SORT function allows you to sort the report data otherwise. All the sorting fields are from infotype 0001.
    Report Class
    · You can suppress input fields which are not used on the selection screen by assigning a report class to your program.
    · If SAP standard delivered report classes do not satisfy your requirements, you can create your own report class through the IMG.
    Data Retrieval from LDB
    1. Create data structures for infotypes.
        INFOTYPES: 0001, "ORG ASSIGNMENT
                            0002, "PERSONAL DATA
                            0008. "BASIC PAY
    2. Fill data structures with the infotype records.
        Start-of-selection.
             GET PERNR.
        End-of-selection.
        Read Master Data
    Infotype structures (after GET PERNR) are internal tables loaded with data.
    The infotype records (selected within the period) are processed sequentially by the PROVIDE - ENDPROVIDE loop.
              GET PERNR.
                 PROVIDE * FROM Pnnnn BETWEEN PN/BEGDA AND PN/ENDDA
                        If Pnnnn-XXXX = ' '. write:/ Pnnnn-XXXX. endif.
                 ENDPROVIDE.
    Period-Related Data
    All infotype records are time stamped.
    IT0006 (Address infotype)
    01/01/1990   12/31/9999  present
              Which record is read depends on the date selection period specified on the
              selection screen (PN/BEGDA to PN/ENDDA).
    Current Data
    IT0006 Address  -  01/01/1990 12/31/9999   present
    RP-PROVIDE-FROM-LAST retrieves the record which is valid in the data selection period.
    For example, pn/begda = '19990930'    pn/endda = '99991231'
    IT0006 subtype 1 is resident address
    RP-PROVIDE-FROM-LAST P0006 1 PN/BEGDA PN/ENDDA.
    Process Infotypes
    RMAC Modules - An RMAC module, also referred to as a macro, is a special construct of ABAP/4 code. Normally, the program code of these modules is stored in table 'TRMAC'. The table key combines the program code under a given name. Macros can also be defined in programs. An RMAC defined in TRMAC can be used in all reports. When an RMAC is changed, the report has to be regenerated manually to reflect the change.
    Reading Infotypes - by using RMAC (macro) RP-READ-INFOTYPE
              REPORT ZHR00001.
              INFOTYPE: 0002.
              PARAMETERS: PERNR LIKE P0002-PERNR.
              RP-READ-INFOTYPE PERNR 0002 P0002 .
              PROVIDE * FROM P0002
                  if ... then ...endif.
              ENDPROVIDE.
    Changing Infotypes - by using RMAC (macro) RP-READ-INFOTYPE. 
    · Three steps are involved in changing infotypes:
    1. Select the infotype records to be changed;
    2. Make the required changes and store the records in an alternative table;
    3. Save this table to the database;
    The RP-UPDATE macro updates the database. The parameters of this macro are the OLD internal table containing the unchanged records and the NEW internal table containing the changed records. You cannot create or delete data. Only modification is possible.
    INFOTYPES: Pnnnn NAME OLD,
    Pnnnn NAME NEW.
    GET PERNR.
        PROVIDE * FROM OLD
               WHERE .... = ... "Change old record
               *Save old record in alternate table
               NEW = OLD.
        ENDPROVIDE.
        RP-UPDATE OLD NEW. "Update changed record
    Infotype with repeat structures
    · How to identify repeat structures.
    a. On infotype entry screen, data is entered in table form.
        IT0005, IT0008, IT0041, etc.
    b. In the infotype structure, fields are grouped by the same name followed by sequence number.
        P0005-UARnn P0005-UANnn P0005-UBEnn
        P0005-UENnn P0005-UABnn
    Repeat Structures
    · Data is entered on the infotype screen in table format but stored on the database in a linear  
      structure.
    · Each row of the table is stored in the same record on the database.
    · When evaluating a repeat structure, you must define the starting point, the increment and the
      work area which contains the complete field group definition.
    Repeat Structures Evaluation (I)
    · To evaluate the repeat structures
       a. Define work area.
           The work area is a field string. Its structure is identical to that of the field group.
       b. Use a DO LOOP to divide the repeat structure into segments and make it available for  
           processing in the work area, one field group (block) at a time.
    Repeat Structures Evaluation(II)
    Define work area
    DATA: BEGIN OF VACATION,
                  UAR LIKE P0005-UAR01, "Leave type
                  UAN LIKE P0005-UAN01, "Leave entitlement
                  UBE LIKE P0005-UBE01, "Start date
                  UEN LIKE P0005-UEN01, "End date
                  UAB LIKE P0005-UAB01, "Leave accounted
               END OF VACATION.
    GET PERNR.
         RP-PROVIDE-FROM-LAST P0005 SPACE PN/BEGDA PN/ENDDA.
         DO 6 TIMES VARYING VACATION
                 FROM P0005-UAR01 "Starting point
                     NEXT P0005-UAR02. "Increment
                 If p0005-xyz then ... endif.
          ENDDO.
    Processing 'Time Data'.
    · Dependence of time data on validity period
    · Importing time data
    · Processing time data using internal tables
    Time Data and Validity Period
    · Time data always applies to a specific validity period.
    · The validity periods of different types of time data are not always the same as the date selection period specified in the selection screen.
    [Diagram: a leave record's validity period may extend beyond the date selection period.]
    · PROVIDE in this case is therefore not used for time infotypes.
    Importing Time Data
    · GET PERNR reads all time infotypes from the lowest to highest system data, not only those within the date selection period.
    · To prevent memory overload, add MODE N to the infotype declaration. This prevents the logical database from importing all data into infotype tables at GET PERNR.
    · Use macro RP-READ-ALL-TIME-ITY to fill infotype table.
    INFOTYPES: 2001 MODE N.
    GET PERNR.
        RP-READ-ALL-TIME-ITY PN/BEGDA PN/ENDDA.
        LOOP AT P2001.
             If P2001-XYZ = ' '. A=B. Endif.
        ENDLOOP.
    Processing Time Data
    · Once data is imported into infotype tables, you can use an internal table to process the data of interest.
    DATA: BEGIN OF ITAB OCCURS 0,
                  BUKRS LIKE P0001-BUKRS, "COMPANY
                  WERKS LIKE P0001-WERKS, "PERSONNEL AREA
                  AWART LIKE P2001-AWART, "ABS./ATTEND. TYPE
                  ASWTG LIKE P2001-ASWTG, "ABS./ATTEND. DAYS
               END OF ITAB.
    GET PERNR.
    RP-PROVIDE-FROM-LAST P0001 SPACE PN/BEGDA PN/ENDDA.
    CLEAR ITAB.
    ITAB-BUKRS = P0001-BUKRS. ITAB-WERKS = P0001-WERKS.
    RP-READ-ALL-TIME-ITY PN/BEGDA PN/ENDDA.
    LOOP AT P2001.
          ITAB-AWART = P2001-AWART. ITAB-ASWTG = P2001-ASWTG.
          COLLECT ITAB. (OR: APPEND ITAB.)
    ENDLOOP.
    Database Tables in HR
    ·  Personnel Administration (PA) - master and time data infotype tables (transparent tables).
       PAnnnn: e.g. PA0001 for infotype 0001
    ·  Personnel Development (PD) - Org Unit, Job, Position, etc. (transparent tables).
       HRPnnnn: e.g. HRP1000 for infotype 1000
    ·  Time/Travel expense/Payroll/Applicant Tracking data/HR work areas/Documents (cluster
       tables): PCLn, e.g. PCL2 for time/payroll results.
    Cluster Table
    · Cluster tables combine the data from several tables with identical (or almost identical) keys
      into one physical record on the database.
    · Data is written to a database in compressed form.
    · Retrieval of data is very fast if the primary key is known.
    · Cluster tables are defined in the data dictionary as transparent tables.
    · External programs can NOT interpret the data in a cluster table.
    · Special language elements EXPORT TO DATABASE, IMPORT TO DATABASE and DELETE
      FROM DATABASE are used to process data in the cluster tables.
    PCL1 - Database for HR work area;
    PCL2 - Accounting Results (time, travel expense and payroll);
    PCL3 - Applicant tracking data;
    PCL4 - Documents, Payroll year-end Tax data
    Database Tables PCLn
    · PCLn database tables are divided into subareas known as data clusters.
    · Data Clusters are identified by a two-character code. e.g RU for US payroll result, B2 for
      time evaluation result...
    · Each HR subarea has its own cluster.
    · Each subarea has its own key.
    Database Table PCL1
    · The database table PCL1 contains the following data areas:
      B1 time events/PDC
      G1 group incentive wages
      L1 individual incentive wages
      PC personal calendar
      TE travel expenses/payroll results
      TS travel expenses/master data
      TX infotype texts
      ZI PDC interface -> cost account
    Database Table PCL2
    · The database table PCL2 contains the following data areas:
      B2 time accounting results
      CD cluster directory of the CD manager
      PS generated schemas
      PT texts for generated schemas
      RX payroll accounting results/international
      Rn payroll accounting results/country-specific ( n = HR country indicator )
      ZL personal work schedule
    Database Table PCL3
    · The database table PCL3 contains the following data areas:
      AP action log / time schedule
      TY texts for applicant data infotypes
    Data Management of PCLn
    · The ABAP commands IMPORT and EXPORT are used for management of read/write to
      database tables PCLn.
    · A unique key has to be used when reading data from or writing data to the PCLn.
      Field Name KEY Length Text
      MANDT X 3 Client
      RELID X 2 Relation ID (RU,B2..)
      SRTFD X 40 Work Area Key
      SRTF2 X 4 Sort key for dup. key
    Cluster Definition
    · The data definition of a work area for PCLn is specified in separate programs that comply
      with fixed naming conventions.
    · They are defined as INCLUDE programs (RPCnxxy0). The following naming convention applies:
       n = 1 or 2 (PCL1 or PCL2)
       xx = relation ID (e.g. RX)
       y = 0 for international clusters, or the country indicator (from table T500L) for
           country-specific clusters
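    · As an illustration of the convention (the exact include names vary by release, so treat the
      name below as a hypothetical example):
    * RPC2RX00 would decompose as:
    *   n  = 2  -> cluster database PCL2
    *   xx = RX -> international payroll results
    *   y  = 0  -> international cluster
    * A country-specific cluster would replace y with the
    * country indicator from table T500L.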
    Exporting Data (I)
    · The EXPORT command writes one or more data objects to cluster xy under the key xy-KEY.
    · The cluster definition is integrated with the INCLUDE statement.
    REPORT ZHREXPRT.
    TABLES: PCLn.
    INCLUDE: RPCnxxy0. "Cluster definition
     * Fill the cluster key
     xy-KEY-field = .
     * Fill the data object, then export the record
     EXPORT TABLE1 TO DATABASE PCLn(xy) ID xy-KEY.
     IF SY-SUBRC EQ 0.
       WRITE: / 'Update successful'.
     ENDIF.
    Exporting Data (II)
     · Data can also be exported using the macro RP-EXP-Cn-xy.
     · When data records are exported using the macro, they are not written to the database but to a
       main memory buffer.
     · To save the data, call the PREPARE_UPDATE routine with the USING parameter 'V'.
     REPORT ZHREXPRT.
     *Buffer definition
     INCLUDE RPPPXD00.
     DATA: BEGIN OF COMMON PART 'BUFFER'.
     INCLUDE RPPPXD10.
     DATA: END OF COMMON PART 'BUFFER'.
     *Export data to the buffer
     RP-EXP-Cn-xy.
     IF SY-SUBRC EQ 0.
         PERFORM PREPARE_UPDATE USING 'V'.
     ENDIF.
     *Buffer management routines
     INCLUDE RPPPXM00.
    Importing Data (I)
    · The IMPORT command causes data objects with the specified key values to be read from
       PCLn.
    · If the import is successful, SY-SUBRC is 0; if not, it is 4.
    REPORT RPIMPORT.
    TABLES: PCLn.
    INCLUDE RPCnxxy0. "Cluster definition
     * Fill the cluster key
     * Import the record
     IMPORT TABLE1 FROM DATABASE PCLn(xy) ID xy-KEY.
     IF SY-SUBRC EQ 0.
     * Display the data object
     ENDIF.
     Importing Data (II)
     · Import data using the macro RP-IMP-Cn-xy.
     · Check the return code SY-SUBRC: 0 means the import was successful, 4 indicates an error.
     · The buffer management routines in include RPPPXM00 are required.
    REPORT RPIMPORT.
    *Buffer definition
    INCLUDE RPPPXD00.
    DATA: BEGIN OF COMMON PART 'BUFFER'.
    INCLUDE RPPPXD10.
    DATA: END OF COMMON PART 'BUFFER'.
    *import data to buffer
    RP-IMP-Cn-xy.
    *Buffer management routines
    INCLUDE RPPPXM00.
    Cluster Authorization
     · A plain EXPORT/IMPORT statement does not check for cluster authorization.
     · When you EXPORT/IMPORT via the buffer, the buffer management routines check for cluster
       authorization.
    Payroll Results (I)
     · Payroll results are stored in cluster Rn of PCL2 as field strings and internal tables, where
       n is the country identifier.
    · Standard reports read the results from cluster Rn. Report RPCLSTRn lists all payroll results;
      report RPCEDTn0 lists the results on a payroll form.
    Payroll Results (II)
     · The cluster definition of payroll results is stored in two INCLUDE reports:
      include: rpc2rx09. "Definition Cluster Ru (I)
      include: rpc2ruu0. "Definition Cluster Ru (II)
     · The first INCLUDE defines the country-independent part; the second INCLUDE defines the
       country-specific part (US).
    · The cluster key is stored in the field string RX-KEY.
    Payroll Results (III)
     · All the field strings and internal tables stored in PCL2 are defined in the ABAP/4
       Dictionary. This allows you to use the same structures in different definitions and
       nonetheless maintain data consistency.
     · The structures used for cluster definitions comply with the naming convention PCnnn, where
       'nnn' can be any combination of alphanumeric characters.
    *Key definition
    DATA: BEGIN OF RX-KEY.
         INCLUDE STRUCTURE PC200.
    DATA: END OF RX-KEY.
    *Payroll directory
    DATA: BEGIN OF RGDIR OCCURS 100.
         INCLUDE STRUCTURE PC261.
    DATA: END OF RGDIR.
    Payroll Cluster Directory
     · To read payroll results, you need two keys: PERNR and SEQNO.
     · You can get SEQNO by importing the cluster directory (CD) first.
     REPORT ZHRIMPRT.
     TABLES: PERNR, PCL1, PCL2.
     INCLUDE: rpc2cd09. "Definition cluster CD
     PARAMETERS: PERSON LIKE PERNR-PERNR.
     RP-INIT-BUFFER.
     *Import the cluster directory
     CD-KEY-PERNR = PERSON.
     RP-IMP-C2-CD.
     CHECK SY-SUBRC = 0.
     LOOP AT RGDIR.
        RX-KEY-PERNR = PERSON.
        UNPACK RGDIR-SEQNR TO RX-KEY-SEQNO.
     *Import payroll data from PCL2
        RP-IMP-C2-RU.
     ENDLOOP.
     INCLUDE RPPPXM00. "PCL1/PCL2 buffer handling
    Function Module (I)
      CD_EVALUATION_PERIODS
     · After importing the payroll directory, it is up to the programmer to decide which record to
       read.
    · Each payroll result has a status.
      'P' - previous result
      'A' - current (actual) result
      'O' - old result
     · The function module CD_EVALUATION_PERIODS reconstructs the payroll result status of a period
       as it was when that payroll was originally run, and selects all the relevant periods to be
       evaluated.
    Function Module (II)
    CD_EVALUATION_PERIODS
    call function 'CD_EVALUATION_PERIODS'
         exporting
              bonus_date = ref_periods-bondt
              inper_modif = pn-permo
              inper = ref_periods-inper
              pay_type = ref_periods-payty
              pay_ident = ref_periods-payid
         tables
              rgdir = rgdir
              evpdir = evp
              iabkrs = pnpabkrs
         exceptions
              no_record_found = 1.
    Authorization Check
       Authorization for Persons
    ·  In the authorization check for persons, the system determines whether the user has the 
       authorizations required for the organizational features of the employees selected with
       GET PERNR.
     ·  Employees for whom the user has no authorization are skipped and appear in a list at the end
        of the report.
    ·  Authorization object: 'HR: Master data'
    Authorization for Data
    · In the authorization check for data, the system determines whether the user is authorized to
      read the infotypes specified in the report.
    · If the authorization for a particular infotype is missing, the evaluation is terminated and an error
      message is displayed.
    Deactivating the Authorization Check
    · In certain reports, it may be useful to deactivate the authorization check in order to improve
      performance. (e.g. when running payroll)
    · You can store this information in the object 'HR: Reporting'.
     These are the main areas they ask questions on.
