10g parameters for Task Audit System 9

Does anyone have suggestions on how to reduce USER I/O waits on the TASK_AUDIT table during a metadata load on 10g?
The ADDM report suggests segment tuning and application tuning (application tuning is not possible since it is HFM code). Looking at a relatively large application, the database improvement would be about 10 minutes if we could minimize the USER I/O waits for this table.
If you do not have a database recommendation, does anyone know whether I can stop task audit logging until the metadata load completes?
Thanks,

We back up this table daily to a versioning table and then truncate it. The table stays much smaller, so the performance impact is much lower.
Do you need to keep the whole thing?
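The archive-and-truncate housekeeping suggested above can be sketched against any SQL backend. Below is a minimal illustration using Python's built-in sqlite3; the table and column names are hypothetical, and SQLite has no TRUNCATE, so DELETE stands in for it:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical audit table and its archive twin.
cur.execute("CREATE TABLE task_audit (activity_code INTEGER, username TEXT)")
cur.execute("CREATE TABLE task_audit_archive (activity_code INTEGER, username TEXT)")
cur.executemany("INSERT INTO task_audit VALUES (?, ?)",
                [(21, "admin"), (1, "admin"), (29, "jsmith")])

# Daily housekeeping: copy everything to the archive, then empty the live table.
cur.execute("INSERT INTO task_audit_archive SELECT * FROM task_audit")
cur.execute("DELETE FROM task_audit")  # TRUNCATE TABLE task_audit on Oracle/SQL Server
conn.commit()

print(cur.execute("SELECT COUNT(*) FROM task_audit").fetchone()[0])          # 0
print(cur.execute("SELECT COUNT(*) FROM task_audit_archive").fetchone()[0])  # 3
```

Keeping the live table small is what recovers the I/O time; the archive table can grow freely since HFM never reads it.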

Similar Messages

  • DB2 parameters for SAP Unicode system

    Hi,
    Can anybody guide us on specific and important DB2 9.7 parameters for our Unicode SAP system?
    The parameters in note 1329179 have already been taken into consideration.
    But if anybody has experience setting particular parameters, please let us know.
    Best Regards,
    DVRK

    Hi,
    All the parameters have already been mentioned in the note. There are no other parameters that need to be considered.
    Thanks
    Sunny

  • Basis Parameters for CRM

    Hi,
    Are there any performance-related / recommended parameters for CRM 4 systems?
    I can't find a specific note from SAP, as there is for BW and R/3.
    Many thanks,
    AB

    Hello Andrey,
    I think SAP will provide you with these parameters during the GoingLive check.
    Regards
    Gregor

  • Analysing Task Audit, Data Audit and Process Flow History

    Hi,
    Internal Audit dept has requested a bunch of information that we need to compile from the Task Audit, Data Audit and Process Flow History logs. We do have all the info available, but not in a format that allows proper "reporting" of log information. What is the best way to handle HFM logs so that we can quickly filter and export the required audit information?
    We do have housekeeping in place, so the logs are partly "live" db tables and partly purged tables that were exported to Excel to archive the historical log info.
    Many Thanks.

    I thought I posted this Friday, but I just noticed I never hit the 'Post Message Button', ha ha.
    The info below will help you translate some of the information in the tables, etc. You could report on it from the audit tables directly or move them to another appropriate data table for analysis later. The consensus, though I disagree, is that you will suffer performance issues if your audit tables get too big, so you want to move them periodically. You can do this using a scheduled task, a manual process, etc.
    I personally just dump it to another table and report on it from there. As mentioned above, you'll need to translate some of the information as it is not 'human readable' in the database.
    For instance, if I wanted to pull Metadata Load, Rules Load and Member List Load, you could run a query like this. (NOTE: strAppName should be equal to the name of your application .... )
    The main tricks to know, at least for the task audit table, are figuring out how to convert the times and determining which activity code corresponds to which user-friendly name.
    -- Declare working variables --
    declare @dtStartDate as nvarchar(20)
    declare @dtEndDate as nvarchar(20)
    declare @strAppName as nvarchar(20)
    declare @strSQL as nvarchar(4000)
    -- Initialize working variables --
    set @dtStartDate = '1/1/2012'
    set @dtEndDate = '8/31/2012'
    set @strAppName = 'YourAppNameHere'
    --Get Rules Load, Metadata, Member List
    set @strSQL = '
    select sUserName as "User", ''Rules Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (1)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Metadata Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (21)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + '''
    union all
    select sUserName as "User", ''Memberlist Load'' as Activity, cast(StartTime-2 as smalldatetime) as "Time Start",
          cast(EndTime-2 as smalldatetime) as ''Time End'', ServerName, strDescription, strModuleName
       from ' + @strAppName + '_task_audit ta, hsv_activity_users au
       where au.lUserID = ta.ActivityUserID and activitycode in (23)
            and cast(StartTime-2 as smalldatetime) between ''' + @dtStartDate + ''' and ''' + @dtEndDate + ''''
    exec sp_executesql @strSQL
    Regarding activity codes, here's a quick breakdown on those ....
    ActivityID     ActivityName
    0     Idle
    1     Rules Load
    2     Rules Scan
    3     Rules Extract
    4     Consolidation
    5     Chart Logic
    6     Translation
    7     Custom Logic
    8     Allocate
    9     Data Load
    10     Data Extract
    11     Data Extract via HAL
    12     Data Entry
    13     Data Retrieval
    14     Data Clear
    15     Data Copy
    16     Journal Entry
    17     Journal Retrieval
    18     Journal Posting
    19     Journal Unposting
    20     Journal Template Entry
    21     Metadata Load
    22     Metadata Extract
    23     Member List Load
    24     Member List Scan
    25     Member List Extract
    26     Security Load
    27     Security Scan
    28     Security Extract
    29     Logon
    30     Logon Failure
    31     Logoff
    32     External
    33     Metadata Scan
    34     Data Scan
    35     Extended Analytics Export
    36     Extended Analytics Schema Delete
    37     Transactions Load
    38     Transactions Extract
    39     Document Attachments
    40     Document Detachments
    41     Create Transactions
    42     Edit Transactions
    43     Delete Transactions
    44     Post Transactions
    45     Unpost Transactions
    46     Delete Invalid Records
    47     Data Audit Purged
    48     Task Audit Purged
    49     Post All Transactions
    50     Unpost All Transactions
    51     Delete All Transactions
    52     Unmatch All Transactions
    53     Auto Match by ID
    54     Auto Match by Account
    55     Intercompany Matching Report by ID
    56     Intercompany Matching Report by Acct
    57     Intercompany Transaction Report
    58     Manual Match
    59     Unmatch Selected
    60     Manage IC Periods
    61     Lock/Unlock IC Entities
    62     Manage IC Reason Codes
    63     Null
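The two tricks mentioned above (time conversion and activity-code translation) can be checked outside the database. A small Python sketch, assuming StartTime/EndTime are OLE automation dates (fractional days since 1899-12-30), which would explain why the query subtracts 2 before casting to SQL Server's 1900-01-01-based smalldatetime; the sample row is made up:

```python
from datetime import datetime, timedelta

# A few of the task-audit activity codes from the table above.
ACTIVITY_NAMES = {0: "Idle", 1: "Rules Load", 9: "Data Load",
                  21: "Metadata Load", 23: "Member List Load", 29: "Logon"}

# Assumption: times are stored as OLE automation dates, i.e. fractional
# days since 1899-12-30 (two days before SQL Server's datetime epoch of
# 1900-01-01, hence the "- 2" in the query).
OLE_EPOCH = datetime(1899, 12, 30)

def ole_to_datetime(ole_days: float) -> datetime:
    """Convert an OLE automation date to a Python datetime."""
    return OLE_EPOCH + timedelta(days=ole_days)

# Decode one hypothetical audit row: (activitycode, StartTime).
row = (21, 41152.5)
print(ACTIVITY_NAMES[row[0]], ole_to_datetime(row[1]))
```

The same dictionary approach replaces the three hard-coded `activitycode in (...)` branches if you want one query covering every activity.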

  • Cpio syntax in 10g Release 1 (10.1.0.3) Patch Set 1 for AIX-Based Systems

    Oracle® Database Patch Set Notes
    10g Release 1 (10.1.0.3) Patch Set 1 for AIX-Based Systems
    Download and Extract the Installation Software
    To download and extract the patch set installation software:
    1. Download the p3761843_10103_AIX64-5L.zip patch set installation archive to a directory that is not the Oracle home directory or under the Oracle home directory.
    2. Enter the following commands to unzip and extract the installation files:
    $ unzip p3761843_10103_AIX64-5L.zip
    $ cpio -idcv p3761843_10103_AIX64-5L.cpio
    Of course cpio -i expects standard input, so there is a missing <:
    cpio -idcv < p3761843_10103_AIX64-5L.cpio
    Best Regards
    Laurent Schneider

    Hi Laurent,
    Apologies for the delay responding to your feedback.
    The Document to which you refer does not appear to be listed on the pages my group maintains at: http://www.oracle.com/technology/documentation/index.html
    We are not actually part of the OTN group.
    Therefore, please try the Members Feedback forum instead at: Community Feedback (No Product Questions)
    Thanks and regards,
    Les

  • ESS Leave request Visual Parameters for the Tasks

    Dear All,
    I have a basic doubt about the ESS Leave Request workflow process.
    We have used WS9100004 and TS12300097; for these, the visual parameters are
    Visual Parameters        Visual Parameter Value
    APPLICATION                     LeaveRequestApprover
    PACKAGE                    com.sap.xss.hr.lea.appl
    For Task TS12300116
    Visual Parameters            Visual Parameter Value
    APPLICATION                    com.sap.xss.hr.lea.appl
    PACKAGE                    LeaveRequest
    In the same way, I have to use another workflow for cancelling the leave request.
    The workflow number is WS12400005, in which I have used
    tasks TS12400028, TS04200011, TS04200007, TS01000162 and TS04200008.
    For the above tasks, what visual parameters do I need to maintain?
    Visual Parameters     Visual Parameter Value
    APPLICATION              ?
    PACKAGE              ?
    Please advise me on this.
    Regards,
    Kishorekumar.R

    Hi Siddharth Rajora,
         With your help I was able to resolve the issue. Thanks for the information.
    Regards,
    kishore.

  • 10g: Cube build performance: Waiting for tasks to finish

    I had a question related to cube build performance. We see the following in OLAPSYS.XML_LOAD_LOG prior to completion of parallel processing. The loading of records happens in 2-3 minutes, but then the "Started 6 Finished 5 out of 6 Tasks" stage takes almost 30-40 minutes to complete. At this point it invokes the xmlloader and xml_parallel_loader in the background. Any insights as to what these steps specifically do (solving of measures etc.)? Also, what do we need to modify, in terms of XML tags or anything else, to improve these timings?
    14:10:56 Started 6 Finished 5 out of 6 Tasks.
    14:10:56 Running Jobs: AWXML$_4361_77872. Waiting for Tasks to Finish...
    14:01:04 Started Auto Solve for Measures: TOT_AMT from Cube CR_BASE_OSUC.CUBE. 2008 Partition.
    14:01:04 Finished Load of Measures: TOT_AMT from Cube CR_BASE_OSUC.CUBE. 2008 Partition. Processed 189616 Records. Rejected 0 Records.
    Thanks,
    Sudip

    When we checked the scheduler jobs we saw xmlparallel_loader. In terms of v$session_wait we saw db file sequential read waits. Following is a snapshot from the "finished updating partitions" stage to the start of the parallel load. Please advise.
    XML_MESSAGE
    14:49:12 Completed Build(Refresh) of AWM_CR2.CRICUBE Analytic Workspace.
    14:49:12 Finished Parallel Processing.
    14:49:10 Finished Auto Solve for Measures: TOT_AMT from Cube CR_BASE_OSUC.CUBE. 2008 Partition.
    14:40:59 Started 6 Finished 5 out of 6 Tasks.
    14:40:59 Running Jobs: AWXML$_4361_77872. Waiting for Tasks to Finish...
    14:30:58 Started 6 Finished 5 out of 6 Tasks.
    14:30:58 Running Jobs: AWXML$_4361_77872. Waiting for Tasks to Finish...
    14:20:57 Running Jobs: AWXML$_4361_77872. Waiting for Tasks to Finish...
    14:20:57 Started 6 Finished 5 out of 6 Tasks.
    14:10:56 Running Jobs: AWXML$_4361_77872. Waiting for Tasks to Finish...
    14:10:56 Started 6 Finished 5 out of 6 Tasks.
    14:01:04 Started Auto Solve for Measures: TOT_AMT from Cube CR_BASE_OSUC.CUBE. 2008 Partition.
    14:01:04 Finished Load of Measures: TOT_AMT from Cube CR_BASE_OSUC.CUBE. 2008 Partition. Processed 189616 Records. Rejected 0 Records.
    14:01:02 Finished Auto Solve for Measures: TOT_AMT from Cube CR_BASE_FIN.CUBE. 2008 Partition.
    14:01:00 Finished Load of Measures: TOT_AMT from Cube CR_BASE_FIN.CUBE. 2008 Partition. Processed 1494 Records. Rejected 0 Records.
    14:01:00 Finished Auto Solve for Measures: TOT_AMT from Cube CR_CURR1_OSUC.CUBE. 200901 Partition.
    14:01:00 Started Auto Solve for Measures: TOT_AMT from Cube CR_BASE_FIN.CUBE. 2008 Partition.
    14:01:00 Finished Auto Solve for Measures: TOT_AMT from Cube CR_CURR1_ENR.CUBE. 200901 Partition.
    14:01:00 Finished Auto Solve for Measures: TOT_AMT from Cube CR_CURR1_FIN.CUBE. 200901 Partition.
    14:01:00 Finished Auto Solve for Measures: TOT_AMT from Cube CR_BASE_ENR.CUBE. 2008 Partition.
    14:00:58 Finished Load of Measures: TOT_AMT from Cube CR_CURR1_FIN.CUBE. 200901 Partition. Processed 1494 Records. Rejected 0 Records.
    14:00:58 Started Auto Solve for Measures: TOT_AMT from Cube CR_CURR1_FIN.CUBE. 200901 Partition.
    14:00:58 Finished Load of Measures: TOT_AMT from Cube CR_BASE_ENR.CUBE. 2008 Partition. Processed 1577 Records. Rejected 0 Records.
    14:00:58 Started Load of Measures: TOT_AMT from Cube CR_CURR1_OSUC.CUBE. 200901 Partition.
    14:00:58 Finished Load of Measures: TOT_AMT from Cube CR_CURR1_OSUC.CUBE. 200901 Partition. Processed 4521 Records. Rejected 0 Records.
    14:00:58 Started Auto Solve for Measures: TOT_AMT from Cube CR_CURR1_OSUC.CUBE. 200901 Partition.
    14:00:58 Started Load of Measures: TOT_AMT from Cube CR_CURR1_FIN.CUBE. 200901 Partition.
    14:00:58 Started Load of Measures: TOT_AMT from Cube CR_BASE_ENR.CUBE. 2008 Partition.
    14:00:58 Started Auto Solve for Measures: TOT_AMT from Cube CR_CURR1_ENR.CUBE. 200901 Partition.
    14:00:58 Finished Load of Measures: TOT_AMT from Cube CR_CURR1_ENR.CUBE. 200901 Partition. Processed 1577 Records. Rejected 0 Records.
    14:00:58 Started Load of Measures: TOT_AMT from Cube CR_CURR1_ENR.CUBE. 200901 Partition.
    14:00:58 Started Auto Solve for Measures: TOT_AMT from Cube CR_BASE_ENR.CUBE. 2008 Partition.
    14:00:58 Started Load of Measures: TOT_AMT from Cube CR_BASE_FIN.CUBE. 2008 Partition.
    14:00:58 Started Load of Measures: TOT_AMT from Cube CR_BASE_OSUC.CUBE. 2008 Partition.
    14:00:57 Attached AW AWM_CR2.CRICUBE in MULTI Mode.
    14:00:57 Attached AW AWM_CR2.CRICUBE in MULTI Mode.
    14:00:57 Attached AW AWM_CR2.CRICUBE in MULTI Mode.
    14:00:57 Attached AW AWM_CR2.CRICUBE in MULTI Mode.
    14:00:57 Attached AW AWM_CR2.CRICUBE in MULTI Mode.
    14:00:57 Attached AW AWM_CR2.CRICUBE in MULTI Mode.
    14:00:55 Running Jobs: AWXML$_4361_77870, AWXML$_4361_77871, AWXML$_4361_77872, AWXML$_4361_77873.
    14:00:55 Started 4 Finished 0 out of 6 Tasks.
    14:00:55 Running Jobs: AWXML$_4361_77870, AWXML$_4361_77871, AWXML$_4361_77872, AWXML$_4361_77873, AWXML$_4361_77874.
    14:00:55 Started 6 Finished 0 out of 6 Tasks.
    14:00:55 Running Jobs: AWXML$_4361_77870, AWXML$_4361_77871, AWXML$_4361_77872, AWXML$_4361_77873, AWXML$_4361_77874, AWXML$_4361_77875.
    14:00:55 Started 6 Finished 0 out of 6 Tasks.
    14:00:55 Running Jobs: AWXML$_4361_77870, AWXML$_4361_77871, AWXML$_4361_77872, AWXML$_4361_77873, AWXML$_4361_77874, AWXML$_4361_77875. Waiting for Tasks to Finish...
    14:00:55 Running Jobs: AWXML$_4361_77870, AWXML$_4361_77871, AWXML$_4361_77872.
    14:00:55 Started 3 Finished 0 out of 6 Tasks.
    14:00:55 Running Jobs: AWXML$_4361_77870, AWXML$_4361_77871.
    14:00:55 Started 2 Finished 0 out of 6 Tasks.
    14:00:55 Running Jobs: AWXML$_4361_77870.
    14:00:55 Started 1 Finished 0 out of 6 Tasks.
    14:00:55 Starting Parallel Processing.
    14:00:55 Detached AW AWM_CR2.CRICUBE.
    14:00:55 Started 5 Finished 0 out of 6 Tasks.
    14:00:22 Finished Updating Partitions.

  • How to implement an audit system to track ADF applications DML activity?

    We have implemented a complete audit system for one of our databases in order to keep history for every table and every value that has been modified.
    The solution that we currently have can be split into two discrete parts:
    1. Keeping a record of all connections to the db account
    This is achieved via a table ‘user_sessions’ into which we record data for every session in the database with the help of on-logon and on-logoff triggers and some PL/SQL procedures:
    Column name        |  Explanation
    -------------------|-------------------------------------------
    US_ID              | PK, based on a sequence
    SESSION_ID         | sys_context('USERENV' ,'SESSIONID')  
    USER_NAME          | sys_context('USERENV' ,'OS_USER')
    LOGON_TIME         | when the on-logon trigger fires
    LOGOFF_TIME        | when the on-logoff trigger fires
    USER_SCHEMA        | sys_context('USERENV' ,'SESSION_USER')
    IP_ADDRESS         | sys_context('USERENV' ,'IP_ADDRESS')
    us_id |session_id |user_name|user_schema|ip_address|logon_time               |logoff_time
    560066|8498062    |BOB      |ABD        |1.1.1.2   |14-SEP-06 03.51.52.000000|14-SEP-06 03.52.30.000000
    560065|8498061    |ALICE    |ABC        |1.1.1.1   |14-SEP-06 02.45.31.000000|14-SEP-06 04.22.43.000000
    2. Keeping the history of every change of data made by a given user
    For every table in the account there is a corresponding history table with all of the columns of the original table plus columns to denote the type of the operation (Insert, Delete, Update), start and end time of validity for this record (createtime, retiretime) and us_id (which points to the user_sessions table).
    The original table has triggers, which fire if there is an insert, update or delete and they insert the data into the corresponding history table. For every record inserted into a history table the us_id taken from the user_sessions table is recorded as well, allowing us to determine who has modified what data via the combination of these two tables.
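The retire-then-insert mechanics described above can be prototyped outside the database. A minimal in-memory Python sketch of the same logic (names mirror the TASKS/TASKS_HIST example below; the us_id values are made up):

```python
from datetime import datetime, timezone

# In-memory sketch of the trigger logic: every INSERT/UPDATE/DELETE on the
# live table writes a history row, and the previous current row for that
# key is "retired" by stamping its retiretime.
history = []  # rows: dict(taskname, description, optype, createtime, retiretime, us_id)

def _retire(taskname, now):
    """Close out any still-open history row for this task."""
    for row in history:
        if row["taskname"] == taskname and row["retiretime"] is None:
            row["retiretime"] = now

def record(taskname, description, optype, us_id):
    """Equivalent of the AFTER INSERT/UPDATE/DELETE trigger body."""
    now = datetime.now(timezone.utc)
    _retire(taskname, now)
    history.append({"taskname": taskname, "description": description,
                    "optype": optype, "createtime": now,
                    "retiretime": None, "us_id": us_id})

record("T1", "first version", "I", us_id=560065)   # initial insert
record("T1", "second version", "U", us_id=560066)  # later update

current = [r for r in history if r["retiretime"] is None]
print(len(history), len(current))  # 2 1
```

At any point, rows with retiretime IS NULL are the current state, and joining us_id back to user_sessions tells you who made each change, which is exactly the property the connection pool breaks.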
    Below is an example of a table TASKS, the history related triggers and the history table TASKS_HIST.
    At the moment we are developing new applications by using ADF. Since there is an Application Module Pool and Database Connection Pool implemented for the ADF, one connection to the database could be used by several users at different moments of time. In that case the history records will point to a database session logged into the user_sessions table, but we will not know who actually modified the data.
    Could you please give us a suggestion: how can we know, at any moment in time, which of our users (currently making use of an ADF application) is using a given database connection?
    By way of an example of the problem we are facing, here is how we solved the same problem posed by the use of Oracle Forms applications.
    When the user starts to work with a given Forms application, the user_sessions table would attempt to record the relevant information about the user, but since the db session was created by the application server, it would in actual fact record the username and IP address of the application server itself.
    The problem was easy to solve due to the fact that there is no connection pooling and when a user opens their browser to work with Forms applications, a db connection is opened for the duration of their session (until they close their browser window).
    In that case, the moment when the user is authenticated (they log in), there is a PL/SQL procedure called from the login Form, which updates the record in the user_sessions table with the real login name and ip address of the user.
    Example of a table and its ‘shadow’ history table
    CREATE TABLE TASKS (
         TASKNAME     VARCHAR2(40),
         DESCRIPTION  VARCHAR2(80)
    );
    ALTER TABLE TASKS ADD (
         CONSTRAINT TASKS_PK PRIMARY KEY (TASKNAME));
    CREATE OR REPLACE TRIGGER TASKS_HISTSTMP
    BEFORE INSERT OR UPDATE OR DELETE ON TASKS
       BEGIN
         HISTORY.SET_OPERATION_TIME('TASKS');
       EXCEPTION
         WHEN OTHERS THEN
           ERROR.REPORT_AND_GO;
    END TASKS_HISTSTMP;
    CREATE OR REPLACE TRIGGER TASKS_WHIST
      AFTER INSERT OR UPDATE OR DELETE ON TASKS
      FOR EACH ROW
      BEGIN
    CASE
          WHEN INSERTING THEN
            UPDATE TASKS_HIST
               SET retiretime = HISTORY.GET_OPERATION_TIME
             WHERE createtime = (SELECT MAX(createtime)
                                   FROM TASKS_HIST
                                  WHERE retiretime IS NULL AND TASKNAME=:NEW.TASKNAME)
               AND retiretime IS NULL AND TASKNAME=:NEW.TASKNAME;
            INSERT INTO TASKS_HIST (TASKNAME      ,DESCRIPTION      ,optype
                                    ,createtime                    
                                    ,us_id)
                   VALUES          (:NEW.TASKNAME ,:NEW.DESCRIPTION ,'I'
                                    ,HISTORY.GET_OPERATION_TIME    
                                    ,USER_SESSION.GET_USER_SESSIONS_ID);
          WHEN UPDATING THEN
            UPDATE TASKS_HIST
               SET retiretime = HISTORY.GET_OPERATION_TIME
             WHERE createtime = (SELECT MAX(createtime)
                                   FROM TASKS_HIST
                                  WHERE TASKNAME=:OLD.TASKNAME) 
               AND TASKNAME=:OLD.TASKNAME;
            INSERT INTO TASKS_HIST (TASKNAME      ,DESCRIPTION      ,optype
                                    ,createtime
                                    ,us_id)
                   VALUES          (:NEW.TASKNAME ,:NEW.DESCRIPTION ,'U'
                                    ,HISTORY.GET_OPERATION_TIME
                                    ,USER_SESSION.GET_USER_SESSIONS_ID);
          ELSE
            UPDATE TASKS_HIST
               SET retiretime = HISTORY.GET_OPERATION_TIME
             WHERE createtime = (SELECT MAX(createtime)
                                   FROM TASKS_HIST
                                  WHERE TASKNAME=:OLD.TASKNAME) 
               AND TASKNAME=:OLD.TASKNAME;
            INSERT INTO TASKS_HIST (TASKNAME      ,DESCRIPTION      ,optype
                                    ,createtime
                                    ,us_id)
                   VALUES          (:OLD.TASKNAME ,:OLD.DESCRIPTION ,'D'
                                    ,HISTORY.GET_OPERATION_TIME
                                    ,USER_SESSION.GET_USER_SESSIONS_ID);
        END CASE;
      EXCEPTION
        WHEN OTHERS THEN
          ERROR.REPORT_AND_GO;
    END TASKS_WHIST;
    CREATE TABLE TASKS_HIST (
         TASKNAME       VARCHAR2(40),
         DESCRIPTION    VARCHAR2(80),
         OPTYPE         VARCHAR2(1),
         CREATETIME     TIMESTAMP(6),
         RETIRETIME     TIMESTAMP(6),
         US_ID          NUMBER
    );
    ALTER TABLE TASKS_HIST ADD (
         CONSTRAINT TASKS_HIST_PK PRIMARY KEY (TASKNAME, CREATETIME)
           );

    Frank,
    Thanks for your reply.
    I checked the site that you mentioned.
    I tried the sample demo with the bundle. The sample worked.
    But it needed to be started separately from the application.
    I do not know how to build a help system into the existing web application developed with JDeveloper (it has two projects: model and user-view-control; it is deployed on Oracle Application Server).
    Could you help me step by step to build the help system?

  • Export Task Audit Not Working

    Hello,
    We are on HFM 11.1.2.1.103 and I am getting an error message when trying to export a user's activity from the "Task Audit" menu.  When I click on export, a new tab comes up with the URL http://server:port/hfm/Administration/TaskAuditExport.asp and then I get an error message:
    An error occurred on the server when processing the URL. Please contact the
    system administrator.
    If you are the system administrator please click here to find out more
    about this error.
    And when I click on "here" it takes me to this URL: "Running Classic ASP Applications on IIS 7 and IIS 8: The Official Microsoft IIS Site".
    Any thoughts on how I resolve this would be much appreciated.
    Thank you,
    Jason

    SDM,
    Thank you for the response.  This is helpful and I was able to extract the data, but this utility doesn't give you the ability to focus on one user's activity as you can from the web.  Why is it not recommended to use the web, and is there a way to get that working properly?
    The reason the utility is not working for my purpose is that I want to view a few users' activities over the last year, and when I run the utility it gives me everyone and extracts the data across numerous files, which is not particularly easy to work with.
    Thanks.

  • Can we use replication using Oracle 10g Streams for the EBS R12 database?

    Hi,
    We are using EBS 12.0.6 and database 10.2.0.3 on Linux 32-bit/64-bit OS (multi-node env).
    We want to decrease the load of custom reports on the actual production system, so we need a reports server. In my past experience I have used Oracle Streams for replication, both simple schema-based replication and tablespace replication.
    So please educate me: can I use Oracle 10g Streams for EBS schemas like APPS, GL, INV, PO, ONT, ODM, AR, AP, CM etc.?
    Please provide the related env docs and road map.
    Alternatively, is there another way to create the reports server?
    Regards,
    Fayaz Ahmed

    Hi,
    We are not in a position to upgrade EBS from 12.0.6 to 12.1, because this is a lengthy task; we have already tested it technically in a test env, but functional testing is pending.
    We need an immediate solution if possible, because this is production.
    So please suggest another approach.
    regards

  • Maintaining distribution model for ALE audit.

    hi,
    I have to create a background job in the receiver system to periodically send the audit data back to the sender system. To do this I need to maintain a distribution model for ALE audit, but I don't have the model view XIDEMO in BD64. How do I implement this? Also, in SM59, while maintaining the RFC destination, it says U6D_700 no longer exists. Please suggest how to get this resolved.
    Regards
    Akash SInha

    Hello,
    You can create your own model view in BD64. In the model view, add message type ALEAUD with the sender as your system name and the logical system of your receiver.
    Also configure the Partner profile in WE20, with outbound parameter ALEAUD and IDoc Type ALEAUD01.
    Schedule the program RBDSTATE with the required parameters.
    BR/Yogesh

  • Connections to CMS/AUDIT system Databases lost

    We are often losing the connection to the CMS/AUDIT system databases, and therefore
    users cannot log into InfoView.
    The workaround is to restart the Server Intelligence Agent (sia.exe).
    Server Details: we have BusinessObjects Edge Standard XI 3.0
    Application Server: MS Windows Server 2003 Standard Edition SP2
    Database Server: MS Windows Server 2003 Standard Edition SP2
    There are multiple errors in our event viewer showing these errors.
    They have been happening for the last week.
    Not sure why this is happening or how to reproduce it. Does anyone know how to fix this?
    Event viewer > application details:
    Event Type:      Error
    Event Source:   BusinessObjects_CMS
    Event Category:            Database
    Description:
    BusinessObjects Enterprise CMS: All connections to CMS system database "BO_CMS" have been lost. The CMS is entering the "Waiting For Resources" state. Reason: [Microsoft][ODBC SQL Server Driver]Communication link failure
    Description:
    BusinessObjects Enterprise CMS: All connections to CMS audit database "BO_AUDIT" have been lost. The CMS can no longer serve as Auditor. Reason: [Microsoft][ODBC SQL Server Driver]Communication link failure

    no network problems
    thanks
    I found this in the SAP SMP. I have also asked them and am waiting for their reply to see if the fix below is suitable:
    SAP Knowledge Base Article
    Symptom
      CMS restarts and throws Database access error. Reason [Microsoft][ODBC SQL Server Driver]Communication link failure in the event viewer.
      BusinessObjects_CMS,Warning,Database ,33300,N/A,GADEVBO1,Database access error. Reason [Microsoft][ODBC SQL Server Driver]Communication link failure.
      BusinessObjects_CMS,Warning,Database ,33301,N/A,GADEVBO1,Retryable error returned from CMS database. Will retry the request 3 time(s).
      BusinessObjects_CMS,Warning,Database ,33300,N/A,GADEVBO1,Database access error. Reason [Microsoft][ODBC SQL Server Driver][DBNETLIB]ConnectionWrite
    Cause
      The error occurs because Microsoft Windows Server 2003 Service Pack 1 (SP1) and Windows Server 2003 Service Pack 2 (SP2) implement a security feature that limits the queue for concurrent TCP/IP connections to the server. This feature helps prevent denial of service attacks.
      Under heavy load conditions, the TCP/IP protocol in Windows Server 2003 SP1 and in Windows Server 2003 SP2 may incorrectly identify valid TCP/IP connections as a denial of service attack.
    Resolution
    WARNING *** The following resolution involves editing the registry. Using the Registry Editor incorrectly can cause serious problems. Use the Registry Editor at your own risk. Refer to Note 1323322 for more information.
    To resolve this issue, turn off this new functionality in Windows Server 2003 SP1 or in Windows Server 2003 SP2 by adding the SynAttackProtect entry to the following registry key on the computer that is running Microsoft SQL Server that houses your BizTalk Server databases.
    HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters
    Set the SynAttackProtect entry to a DWORD value of 00000000.
    To do this, follow these steps:
    1. Click Start, click Run, type regedit, and then click OK.
    2. Locate and then click the following registry key:
      HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters
    3. On the Edit menu, point to New, and then click DWORD Value.
    4. Type SynAttackProtect, and then press ENTER.
    5. On the Edit menu, click Modify.
    6. In the Value data box, type 00000000. Click OK.
    7. Quit Registry Editor.
    Note To complete this registry change, you must restart the computer that is running SQL Server.
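The seven GUI steps above amount to a single registry value; importing a .reg file like this sketch sets the same thing (the restart is still required):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters]
"SynAttackProtect"=dword:00000000
```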
    See Also
    For detailed information, go to the following Microsoft website:
    http://support.microsoft.com/default.aspx?scid=kb;en-us;899599
    Header Data
    References
    Product
    1375277 - CMS lost the SQLServer 2000/2005 communication and restarts intermittently
    Version 1 Validity: 12.08.2009 - active Language English
    Released on 07.10.2009 15:35:25
    Release status Released to Customer
    Component BI-BIP-ADM User & server configuration, InfoView refresh, user rights
    Priority Normal
    Category Problem
    Database MSSQLSRV 2005
    Operating System WIN 2003
    Type ID Title
    Business Objects Notes 1323322 Editing the Windows Registry - Warning
    Product Product Version
    SAP BusinessObjects Business Intelligence platform (formerly SB BOBJ ENTERPRISE XI R2

  • Query on HFM Task Audit

    Hi,
    We have a requirement to check the user activities related to changes made in dataforms/data grids (or specifically changes made to items in Manage Documents)
    Are these activities logged in the task audit? If so could you please let me know the activity code.
    Thanks in advance,
    Aparna

    In your question you use the term "activities", but you have not articulated which ones you need to capture. HFM's task audit does not capture every task that is available in the system, so you should look through the listing of activities that it does capture. Opening and entering data into a web form, for example, is not captured in Task Audit.
    If instead you mean changes made to data via a web form, then you are asking about Data Audit, which is very different. Data Audit depends on your own enablement of Account/Scenario combinations in metadata. Each account that is enabled for Data Audit, for the relevant scenarios that have been enabled, will be included in the data audit report any time a user changes data for that account.
    --Chris

  • HFM Task Audit Logs

    Hi All,
    HFM version 9.3.1. When you click Administration > Task Audit, there are a bunch of tasks which can be audited. I am trying to pull the last logon information from this, but apparently the file is too large to export.
    Where are these logs stored? Are they in SQL or on the app server somewhere? Does anyone ever clear these logs? Will NOT doing so affect performance in any way?
    Thanks,
    Greg

    Greg,
    Logs are stored in the HFM relational database. The table name is <AppName>TaskAudit where <AppName> is the name of the HFM app.
    By default, the data in this table is not trimmed and it can grow quite large.
    From an overall systems level, I doubt the size of this table is going to negatively impact system performance unless you are trying to work with that table directly. ;)
    It is not uncommon for people to write a script (run as a scheduled task) that executes a SQL query to clean up the table, keeping only critical pieces of information such as system loads, etc.
    The tricky fields in the table are ActivityCode and the time fields. ActivityCode correlates with the activity (i.e. Logon, Consolidation, etc.). It is not something that is advertised, as far as I know; however, I created a list by analyzing the program. The list follows at the bottom of this post.
    The times are a bit tricky, as you need to convert them so that they are "human readable". In SQL, I use the following statement to convert them to a readable date/time: cast(endtime-2 as smalldatetime)
    Hope this helps.
    Activity Codes
    ActivityID     ActivityName
    0     Idle
    1     Rules Load
    2     Rules Scan
    3     Rules Extract
    4     Consolidation
    5     Chart Logic
    6     Translation
    7     Custom Logic
    8     Allocate
    9     Data Load
    10     Data Extract
    11     Data Extract via HAL
    12     Data Entry
    13     Data Retrieval
    14     Data Clear
    15     Data Copy
    16     Journal Entry
    17     Journal Retrieval
    18     Journal Posting
    19     Journal Unposting
    20     Journal Template Entry
    21     Metadata Load
    22     Metadata Extract
    23     Member List Load
    24     Member List Scan
    25     Member List Extract
    26     Security Load
    27     Security Scan
    28     Security Extract
    29     Logon
    30     Logon Failure
    31     Logoff
    32     External
    33     Metadata Scan
    34     Data Scan
    35     Extended Analytics Export
    36     Extended Analytics Schema Delete
    37     Transactions Load
    38     Transactions Extract
    39     Document Attachments
    40     Document Detachments
    41     Create Transactions
    42     Edit Transactions
    43     Delete Transactions
    44     Post Transactions
    45     Unpost Transactions
    46     Delete Invalid Records
    47     Data Audit Purged
    48     Task Audit Purged
    49     Post All Transactions
    50     Unpost All Transactions
    51     Delete All Transactions
    52     Unmatch All Transactions
    53     Auto Match by ID
    54     Auto Match by Account
    55     Intercompany Matching Report by ID
    56     Intercompany Matching Report by Acct
    57     Intercompany Transaction Report
    58     Manual Match
    59     Unmatch Selected
    60     Manage IC Periods
    61     Lock/Unlock IC Entities
    62     Manage IC Reason Codes
    63     Null
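    Putting Jon's pointers together, a minimal sketch of a last-logon query and a cleanup statement. This assumes SQL Server and a hypothetical audit table named HFMAPP_TASK_AUDIT (the exact name — <AppName>TaskAudit vs. <AppName>_TASK_AUDIT — varies, so check your schema), and uses the "-2" day offset mentioned above for making the float times readable:

    ```sql
    -- Last logon per user: ActivityCode 29 = Logon (from the list above).
    -- HFMAPP_TASK_AUDIT is a placeholder; substitute your app's audit table.
    SELECT ActivityUserID,
           MAX(CAST(StartTime - 2 AS SMALLDATETIME)) AS LastLogon
    FROM   HFMAPP_TASK_AUDIT
    WHERE  ActivityCode = 29
    GROUP  BY ActivityUserID
    ORDER  BY LastLogon DESC;

    -- Example cleanup for a scheduled task: purge rows older than 90 days,
    -- keeping "critical" activities such as Metadata Load (21) and Security Load (26).
    DELETE FROM HFMAPP_TASK_AUDIT
    WHERE  CAST(StartTime - 2 AS SMALLDATETIME) < DATEADD(DAY, -90, GETDATE())
      AND  ActivityCode NOT IN (21, 26);
    ```

    Validate the date offset against known activity before relying on it; another reply in this thread reports the conversion behaving differently by environment.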

  • HFM TASK AUDIT TABLE

    We are using HFM System 9 and we notice that the size of the task audit table has grown beyond reasonable limits. We would like to take a backup of the data before deleting the records. Is this possible?
    Is deletion of records recommended, or truncation of the table? How do we take a backup of the table, and can we make HFM web retrieve from this table when needed?
    Does anyone know how to convert the float type columns (starttime, endtime) to a date datatype?
    Thanks in advance

    Here is the query I use against the task audit table to get my users by day. Our DBA came up with the code to convert the starttime field using CAST(CONVERT(VARCHAR, ...)):
    SELECT COUNT(DISTINCT ActivityUserID) AS NumUsers
         , CAST(CONVERT(VARCHAR, CAST(starttime AS DATETIME), 106) AS DATETIME) AS [DAY]
    FROM dbo.LAUREATE_TASK_AUDIT
    WHERE CAST(CONVERT(VARCHAR, CAST(starttime AS DATETIME), 106) AS DATETIME) >= '8/1/2009'
    GROUP BY CAST(CONVERT(VARCHAR, CAST(starttime AS DATETIME), 106) AS DATETIME)
    ORDER BY CAST(CONVERT(VARCHAR, CAST(starttime AS DATETIME), 106) AS DATETIME)
    What I noticed with our database (SQL 2000 SP4, HFM 9.2.1) is that the date conversion is off by 2 days, so user activity for today shows as 2 days in the future when I query the table. I logged this with Oracle but got no resolution. For what I need, it works fine.
    We have made a backup of the table in the past and truncated the existing table; your SQL DBA can do that easily. Then yes, you can query either table as needed.
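    For the backup-then-truncate approach, a sketch of what the DBA would run (SQL Server syntax; the backup table name is a hypothetical placeholder):

    ```sql
    -- SELECT ... INTO creates a new backup table with the same columns,
    -- then TRUNCATE empties the live audit table quickly without logging each row delete.
    SELECT *
    INTO   LAUREATE_TASK_AUDIT_BKP    -- hypothetical backup table name
    FROM   dbo.LAUREATE_TASK_AUDIT;

    TRUNCATE TABLE dbo.LAUREATE_TASK_AUDIT;
    ```

    HFM itself keeps writing to (and reading from) the live table; only your own reporting queries would hit the backup copy.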
