Removing DB Analyzer logs

On our live liveCache server the DB Analyzer logs have not been deleted since 2008, and we need to keep only the last 30 days of logs. I have used the automatic removal option for the "Bottlenecks" logs:
LC10 > Problem Analysis > Performance > DB Analyzer > Bottlenecks, then "Administration of performance data" from the "Go to" menu,
and for the Expert Analysis logs:
LC10 > Problem Analysis > Performance > DB Analyzer > Expert Analysis > Edit > Administration of performance data.
I have set 30 days for the LC host and 4 weeks for the DB.
This works for the "Bottlenecks" logs but not for the "Expert Analysis" logs. Is there any other option to delete the Expert Analysis logs?

Hello,
Please review SAP Notes 530394, 1389225 and 945757.
-> What is the SAP Basis SP level on your system?
   What is the liveCache version on your system?
-> Did you create an SAP message for this issue?
Thank you & best regards, Natalia Khlopina
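
If the automatic removal still does not clean up the Expert Analysis files, the Database Analyzer's dated protocol directories can also be pruned at the OS level. A minimal sketch, assuming a Windows liveCache host and a hypothetical rundirectory (the Database Analyzer writes its protocols into YYYYMMDD subdirectories); keep the current day's directory and test with -WhatIf first:

# Sketch: prune DB Analyzer protocol directories (named YYYYMMDD) older than 30 days
$analyzerDir = 'C:\sapdb\data\wrk\LCA\analyzer'   # hypothetical path - use your rundirectory
Get-ChildItem -Path $analyzerDir -Directory |
    Where-Object { $_.Name -match '^\d{8}$' -and
                   [datetime]::ParseExact($_.Name, 'yyyyMMdd', $null) -lt (Get-Date).AddDays(-30) } |
    Remove-Item -Recurse -WhatIf    # drop -WhatIf once the selection looks right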

Similar Messages

  • Error in analyzer log file (/sapdb/data/wrk/ACP/analyzer/DBAN.err)

    Hello All,
    I am getting the following error messages in the analyzer log file (/sapdb/data/wrk/ACP/analyzer/DBAN.err).
    The details are as follows:
    =====================================================
    2006-07-24 08:55:59
    ERROR 5: Cannot execute SQL statement.
    [MySQL MaxDB][LIBSQLOD SO][MaxDB] General error;-4008 POS(1) Unknown user name/password combination
    SELECT YEAR(NOW()),MONTH(NOW()),DAY(NOW()),HOUR(NOW()),MINUTE(NOW()),SECOND(NOW()) FROM DUAL
    2006-07-26 12:15:39
    ERROR 20: Database Analyzer not active in directory "/sapdb/data/wrk/ACP/analyzer".
    2006-08-03 12:33:08
    ERROR 5: Cannot execute SQL statement.
    [MySQL MaxDB][LIBSQLOD SO][MaxDB] Communication link failure;-709 CONNECT: (database not running: no request pipe)
    SELECT YEAR(NOW()),MONTH(NOW()),DAY(NOW()),HOUR(NOW()),MINUTE(NOW()),SECOND(NOW()) FROM DUAL
    =====================================================
    Can you please tell me what this means for my database?
    The main problem I am facing is that I am not able to start my SAP application. When I issue startsap from the <SID>adm login, I get an error message saying it cannot connect to the database, although the database is already up and running.
    Please help!
    Regards,
    Premkishan chourasia

    Hi,
    well, error -4008 means that the user/password combination the DB Analyzer uses to access the database is incorrect. The DB Analyzer issues its SQL commands as the SYSDBA user.
    Do you know the user/password combination of your SYSDBA user?
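    To verify those credentials outside the DB Analyzer, a dbmcli call along these lines can be used - a sketch only; the DBM operator control,control and the SYSDBA name SUPERDBA are common defaults and may differ on your system:
    # Try an SQL connect as SYSDBA; failing with -4008 here confirms a wrong user/password
    $db  = 'ACP'                            # database name, taken from the log path above
    $pwd = Read-Host "Password for SUPERDBA"
    & dbmcli -d $db -u 'control,control' sql_connect "SUPERDBA,$pwd"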
    Regards,
    Roland

  • Error in DB analyzer logs

    Hi,
    My SCM 5.0 system is running on Oracle 10g. I checked the LC and found the errors below in the DB Analyzer logs:
    W3  11523 primary key range accesses, selectivity 0.01%: 140299528 rows read, 12593 rows qualified
          CON: PKeyRgSel < 0.3
          VAL: 0.01      < 0.3
    W3  Selects and fetches selectivity 0.21%: 197822 selects and fetches, 71523587 rows read, 152554 rows qualified
          CON: SelFetSel < 0.3
          VAL: 0.21      < 0.3
    * W3  76125 primary key range accesses, selectivity 0.11%: 71444927 rows read, 76461 rows qualified
          CON: PKeyRgSel < 0.3
          VAL: 0.11      < 0.3
    W2  Number of symbol resolutions to create call stacks: 234
          CON: SymbolResolutions > 0
          VAL: 234               > 0
    LC version: X64/HPUX 7.6.03 Build 012-123-169-237
    As I am new to MaxDB, I need expert advice on the above issues.

    Hello,
    You got WARNING messages in the DB Analyzer protocol. Those are NOT errors.
    You used the DB Analyzer to find bottlenecks in liveCache; it is the performance analysis tool for the database.
    1. In general, the MaxDB library has explanations of the DB Analyzer warning messages:
    http://maxdb.sap.com/doc/7_7/default.htm -> Database Analyzer
    In the Database Analyzer messages section, the "Optimizer Strategies" and "Selects and Fetches" documents give the "User Response" for warnings like:
    W3  11523 primary key range accesses, selectivity 0.01%...
    W3  Selects and fetches selectivity 0.21%...
    W3  76125 primary key range accesses, selectivity 0.11%...
    => Find out which liveCache application scenario was running at that time.
    Repeat this application scenario and create an SQL trace. Find the statement that causes the warning.
    2. If you cannot find the reason for these warnings on your system => create an SAP message so SAP can help you with this issue.
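    For orientation, the percentage in these warnings is simply rows qualified divided by rows read; a quick check of the first W3 above:
    # Selectivity = qualified rows / rows read; W3 fires because it is below 0.3%
    $read = 140299528; $qualified = 12593
    '{0:P2}' -f ($qualified / $read)    # ~0.01 % - only about 1 of every 11,000 rows read qualifies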
    Thank you and best regards, Natalia Khlopina

  • Data Guard archive log removal

    Hi,
    I am using 9i Data Guard. I am trying to set up an automatic procedure to remove archive logs on the standby site once they have been applied, but apart from manual removal/deletion there is no option for such an automatic procedure in the Oracle Data Guard settings.
    Does anyone have a solution for this?
    Thanks

    user3076922 wrote:
    Hi,
    The standby database is configured with the broker and applies redo in real time; however, I want to change this to archive log apply mode without losing the broker configuration. Is that possible? If the broker cannot be used with archive log apply, can I remove the broker and use Data Guard to set up the standby with archive log apply?
    Regards

    Hi,
    I think mseberg answered correctly: you can enable/disable log apply by changing the state of the standby database with DGMGRL, as mseberg wrote.
    Alternatively, you can disable recovery of the standby database with the following statement from SQL*Plus:
    SQL> alter database recover managed standby database cancel;
    Regards,
    Mahir M. Quluzade
    www.mahir-quluzade.com

  • Removing unwanted logs

    Hi All
    We have SAP 4.7. The problem is that our E drive is full - there is no space left on it, so we are not able to take a backup. Kindly suggest how to remove unwanted logs on the E drive to free up space.
    Regards
    Selvan

    You will get a good idea if you follow http://help.sap.com/saphelp_nw70/helpdata/en/24/b884388b81ea55e10000009b38f842/frameset.htm
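    Before deleting anything, it is worth seeing what actually consumes the space. A read-only PowerShell sketch that lists the 20 largest files on the drive:
    # List the 20 largest files on E:\ (read-only; nothing is deleted)
    Get-ChildItem -Path 'E:\' -Recurse -File -ErrorAction SilentlyContinue |
        Sort-Object Length -Descending |
        Select-Object -First 20 FullName, @{ n = 'SizeMB'; e = { [math]::Round($_.Length / 1MB, 1) } }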
    cheers,
    -Sunil

  • Search for records in the event viewer after the last run (not the entire event log), remove duplicates - output logon types for users of a specific OU

    Hi,
    The following code works perfectly for me and gives me a list of users in a specific OU and their respective logon types:
    $logFile = 'c:\test\test.txt'
    $_myOU = "OU=ABC,dc=contosso,DC=com"
    # LogonType as per technet
    $_logontype = @{
        2  = "Interactive"
        3  = "Network"
        4  = "Batch"
        5  = "Service"
        7  = "Unlock"
        8  = "NetworkCleartext"
        9  = "NewCredentials"
        10 = "RemoteInteractive"
        11 = "CachedInteractive"
    }
    Get-WinEvent -FilterXml "<QueryList><Query Id=""0"" Path=""Security""><Select Path=""Security"">*[System[(EventID=4624)]]</Select><Suppress Path=""Security"">*[EventData[Data[@Name=""SubjectLogonId""]=""0x0"" or Data[@Name=""TargetDomainName""]=""NT AUTHORITY"" or Data[@Name=""TargetDomainName""]=""Window Manager""]]</Suppress></Query></QueryList>" -ComputerName "XYZ" | ForEach-Object {
        # TargetUserSid
        $_cur_OU = ([ADSI]"LDAP://<SID=$(($_.Properties[4]).Value.Value)>").distinguishedName
        If ($_cur_OU -like "*$_myOU") {
            $_cur_OU
            # LogonType
            $_logontype[ [int] $_.Properties[8].Value ]
            # Time created
            $_.TimeCreated
            $_.Properties[18].Value
        }
    } >> $logFile
    I am able to pipe the results to a file; however, I would like to convert the output to CSV/HTML, and when I try the ConvertTo-Html cmdlet it mangles certain values. Also:
    a) I would like to remove duplicate entries within a single run of the script.
    b) When the script runs, it should only search records created after the last run, not records we have already looked at.
    Please help!

    If you just want to look at the new events since the last run, I suggest recording the EventRecordID of the last event you parsed and using it as a reference in your filter. For example:
    <QueryList>
      <Query Id="0" Path="Security">
        <Select Path="Security">*[System[(EventID=4624 and
    EventRecordID>46452302)]]</Select>
        <Suppress Path="Security">*[EventData[Data[@Name="SubjectLogonId"]="0x0" or Data[@Name="TargetDomainName"]="NT AUTHORITY" or Data[@Name="TargetDomainName"]="Window Manager"]]</Suppress>
      </Query>
    </QueryList>
    This is the logic the Server Manager of Windows Server 2012 uses to save time, CPU and bandwidth. The problem is how to get that number and provide it to your next run. You can store it in a file and read it at the beginning; if it is not found, you can go through the whole event list.
    Let's say you store it in a simple text file, ref.txt:
    1234
    At the beginning, just read it:
    Try {
        $_intMyRef = [int] (Get-Content .\ref.txt)
    }
    Catch {
        Write-Host "The reference EventRecordID cannot be found." -ForegroundColor Red
        $_intMyRef = 0
    }
    This is a very lazy check - you could do proper parsing, etc.; it's the quick and dirty way. If I can read the file and parse it as an integer, I use it; otherwise I set it to 0, meaning I'll collect all events.
    Then include it in your filter. Your Get-WinEvent call becomes:
    Get-WinEvent -FilterXml "<QueryList><Query Id=""0"" Path=""Security""><Select Path=""Security"">*[System[(EventID=4624 and EventRecordID&gt;$_intMyRef)]]</Select><Suppress Path=""Security"">*[EventData[Data[@Name=""SubjectLogonId""]=""0x0"" or Data[@Name=""TargetDomainName""]=""NT AUTHORITY"" or Data[@Name=""TargetDomainName""]=""Window Manager""]]</Suppress></Query></QueryList>"
    At the end of your script, store the last value you got in your ref.txt file. You can, for example, capture it in the loop:
    $Result += $LogonRecord
    $_intLastId = $Event.RecordId
    And at the end:
    Write-Output $_intLastId | Out-File .\ref.txt
    The next time you run it, it only scans the delta. Note that I prefer this over a date filter, in case the machine was inactive for a long time or there is a time sync issue, which can sometimes mess up date-based filters.
    If you do want date filtering, do it at the Get-WinEvent level, not in Where-Object. For a local query it doesn't change much, but on a remote system the filter runs on the remote side, so you save time and resources on your side. For example, for the last 30 days with the FilterXml parameter, you can use:
    <QueryList>
    <Query Id="0" Path="Security">
    <Select Path="Security">*[System[TimeCreated[timediff(@SystemTime) &lt;= 2592000000]]]</Select>
    </Query>
    </QueryList>
    Then you can combine them, etc.
    PS: I used the confusing underscores because I like them ;)
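    On the CSV/HTML part of the question: ConvertTo-Html and Export-Csv operate on objects, not on the bare strings the loop writes out, so one approach is to emit one object per event and export once at the end. A sketch reusing the $_logontype table and property indexes from the question above; $xmlFilter stands for the query string already shown, and TargetUserName at Properties[5] is an assumption to verify:
    $Result = Get-WinEvent -FilterXml $xmlFilter -ComputerName 'XYZ' | ForEach-Object {
        [pscustomobject]@{
            TargetUser = $_.Properties[5].Value                     # TargetUserName (verify the index)
            LogonType  = $_logontype[[int]$_.Properties[8].Value]
            Created    = $_.TimeCreated
        }
    }
    # a) drop duplicate rows, then export once; combine with the EventRecordID delta filter above for b)
    $Result | Sort-Object TargetUser, LogonType, Created -Unique |
        Export-Csv -Path 'c:\test\test.csv' -NoTypeInformation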

  • Open Source Log Analyzer Project

    Hi people,
    Is there an open source project that analyzes logs from a database? I have a table (a log table in syslog message format) and I need to analyze it with a web-based project. Do you know any open source project that does this? Thanks

    Huh? How is this question related to JSF?
    Anyway, is Google installed on your machine? After feeding it the topic title "Open Source Log Analyzer Project", it told me something about AWStats, SourceTree and so on. More than that I (and Google) can't help.
    You could also consider writing one yourself with the help of smart coding and nice APIs like JFreeChart.

  • Log files can't be removed automatically in HA environment

    Hi BDB experts,
    I am writing a DB HA application based on BDB version 4.6.21. Two daemons run on two machines: one as master, which reads and writes the DB, and one as client/backup, which only reads the DB. One thread in the master daemon runs a checkpoint every second, dbenv->txn_checkpoint(dbenv, 1, 1, 0), and dbenv->log_archive(dbenv, NULL, DB_ARCH_REMOVE) is called after each checkpoint. The env was opened with the flags DB_CREATE | DB_INIT_TXN | DB_INIT_LOCK | DB_INIT_LOG | DB_REGISTER | DB_RECOVER | DB_INIT_MPOOL | DB_THREAD | DB_INIT_REP; the autoremove flag was set via envp->set_flags(uid_dbs.envp, DB_LOG_AUTOREMOVE, 1) before opening the env.
    I found the thread https://forums.oracle.com/message/10945602#10945602, which discusses a non-HA environment, and I tested my code in a non-HA env without DB_INIT_REP; there it worked. However, in the HA env those log files are never removed. Could you help with this issue? Does the client need to run checkpoints? Might this be a BDB bug?
    Thanks,
    Min

    One thread in the master daemon runs a checkpoint every second, dbenv->txn_checkpoint(dbenv, 1, 1, 0), and dbenv->log_archive(dbenv, NULL, DB_ARCH_REMOVE) is called after each checkpoint. The env was opened with the flags DB_CREATE | DB_INIT_TXN | DB_INIT_LOCK | DB_INIT_LOG | DB_REGISTER | DB_RECOVER | DB_INIT_MPOOL | DB_THREAD | DB_INIT_REP; the autoremove flag was set via envp->set_flags(uid_dbs.envp, DB_LOG_AUTOREMOVE, 1) before opening the env.
    I am not saying that this is causing a problem, but doing the DB_ENV->log_archive(DB_ARCH_REMOVE) in your thread and setting DB_ENV->set_flags(DB_LOG_AUTOREMOVE) is redundant. In your thread, you control the timing. The DB_ENV->set_flags(DB_LOG_AUTOREMOVE) option checks for and removes unneeded log files when we create a new log file.
    Did you see in the documentation for DB_ENV->set_flags(DB_LOG_AUTOREMOVE) that we don't recommend doing automatic log file removal with replication? Although this warning is not repeated in DB_ENV->log_archive(DB_ARCH_REMOVE), it also applies to this option. You should reconsider using this option, particularly if it is possible that your client could go down for a long time.
    But this is only a warning, and automatic log removal should work. My first thought is to ask whether your client has recently gone through a sync. Internally, we block archiving on the master during some parts of a client sync to improve the chances that we keep all logs needed by the syncing client; we block archiving for up to 30 seconds after the client sync.
    I found the thread https://forums.oracle.com/message/10945602#10945602, which discusses a non-HA environment, and I tested my code in a non-HA env without DB_INIT_REP; there it worked. However, in the HA env those log files are never removed.
    This thread is discussing a different issue. The reason for our warning in BDB 4.6 against using automatic log removal with replication is that it doesn't take into account all the sites in your replication group, so we could remove a log from the master that a client still needs.
    We added replication group-aware automatic log removal in BDB 5.3 Replication Manager, and this discussion is about a change of behavior from this addition. With this addition, we no longer need to recommend against using automatic log removal with replication in BDB 5.3 and later releases.
    Could you help with this issue? Does the client need to run checkpoints? Might this be a BDB bug?
    I'm not sure the client needs to run its own checkpoints because it performs checkpoints when it receives checkpoint log records from the master.
    But none of the log removal options on the master do anything to remove logs on the client. You will need to take separate steps to archive logs on the client and on the master.
    Paula Bingham
    Oracle

  • ACS Tacacs administration report Log Analyzer

    The logs in ACS are in .csv format. My system is generating huge logs because more than 1000 devices are configured in ACS. Are there any tools available to analyze the TACACS+ administration logs?
    Regards
    Hitesh Vinzoda

    Hi Hitesh,
    The only option you have is to download the .csv files and import them into spreadsheets using a popular spreadsheet application. You can also use a third-party reporting tool to manage report data; for example, aaa-reports! by Extraxi supports ACS.
    To download a CSV report:
    =========================
    # Click Reports and Activity.
    # Click the CSV report filename that you want to download.
    # In the right pane of the browser, click Download.
    You can then easily analyze the logs in Microsoft Excel.
    How to filter and analyze logs ( with Regular Expression Syntax Definitions):
    ========================================
    http://www.cisco.com/en/US/partner/docs/net_mgmt/cisco_secure_access_control_server_for_windows/4.2/user/guide/LgsRpts.html#wp632961
    For downloading third party application
    http://www.extraxi.com/
    For more info, you can download the user guide:
    http://www.extraxi.com/PDFs/aaa-reports%20sales%20proposal%20-%20customer.pdf
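    If Excel becomes unwieldy at this volume, the downloaded CSV can also be summarized directly with PowerShell. A minimal sketch; the file name and the 'User-Name' column are assumptions to check against the actual header row:
    # Top 10 users by number of TACACS+ administration records
    Import-Csv 'C:\acs\Tacacs_Administration.csv' |
        Group-Object 'User-Name' |
        Sort-Object Count -Descending |
        Select-Object -First 10 Count, Name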
    HTH
    Regards,
    JK

  • Log Analyzer

    Hi Archers,
    I am looking for a tool, command, etc. that can help me analyze log information. Furthermore, I also want to know who executed which commands on the servers, when they ran those commands, and so on.
    If I want to limit certain users to running a specific set of commands, what can I do? sudo only limits certain users to running commands with root permission.
    Any help will be appreciated.
    Cheers

    super is worth looking at for restricting users to certain commands, and it offers the logging you are looking for:
    http://www.ucolick.org/~will/#super
    I don't know much about log analysers though I'm afraid.
    I forgot to add that it's in the AUR.

  • Reading log file and calculating time between

    If someone could help me with this one, I would be very grateful.
    I have a log file and I need to search for strings that contain a start time and an end time (e.g. <time="11:10:58.000+000">). Once I have these two values, I need to measure the time elapsed between them (from start to end).

    $Path="C:\Times.log"
    remove-item $Path
    Add-Content $Path '<time="11:10:58.000+000">'
    Add-Content $Path '<time="12:10:58.000+000">'
    Add-Content $Path '<time="13:10:58.000+000">'
    Add-Content $Path '<time="15:13:38.000+000">'
    Add-Content $Path '<time="16:10:58.000+000">'
    Add-Content $Path '<time="17:08:28.000+000">'
    $File=Get-Content $Path
    $StartTime=$Null
    $EndTime=$Null
    $ElapsedTime = $Null
    ForEach ($Line in $File)
    If ($Line.Contains("time="))
    $Position = $Line.IndexOf("time=")
    $TimeStr =$Line.SubString($Position+6,8)
    IF ($StartTime -EQ $Null)
    $StartTime = $TimeStr -As [System.TimeSpan]
    Else
    $EndTime = $TimeStr -As [System.TimeSpan]
    $ElapsedTime = $EndTime.Subtract($StartTime)
    "StartTime=$StartTime EndTime=$EndTime ElapsedTime=$ElapsedTime"
    $StartTime = $Null
    This gives the following output:
    StartTime=11:10:58 EndTime=12:10:58 ElapsedTime=01:00:00
    StartTime=13:10:58 EndTime=15:13:38 ElapsedTime=02:02:40
    StartTime=16:10:58 EndTime=17:08:28 ElapsedTime=00:57:30
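    An alternative sketch for the same task: pull every time value with a regex and pair them up. It reuses $Path from the script above and assumes, likewise, that start and end times simply alternate in the file:
    $times = Select-String -Path $Path -Pattern 'time="(\d{2}:\d{2}:\d{2})' |
        ForEach-Object { [TimeSpan]$_.Matches[0].Groups[1].Value }
    for ($i = 0; $i + 1 -lt $times.Count; $i += 2) {
        "StartTime=$($times[$i]) EndTime=$($times[$i+1]) ElapsedTime=$($times[$i+1] - $times[$i])"
    }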

  • Reading log file

    Hi all,
    I want to view a particular log file. Is there a transaction to view log files? Do I need Basis rights for that?

    $Path="C:\Times.log"
    remove-item $Path
    Add-Content $Path '<time="11:10:58.000+000">'
    Add-Content $Path '<time="12:10:58.000+000">'
    Add-Content $Path '<time="13:10:58.000+000">'
    Add-Content $Path '<time="15:13:38.000+000">'
    Add-Content $Path '<time="16:10:58.000+000">'
    Add-Content $Path '<time="17:08:28.000+000">'
    $File=Get-Content $Path
    $StartTime=$Null
    $EndTime=$Null
    $ElapsedTime = $Null
    ForEach ($Line in $File)
    If ($Line.Contains("time="))
    $Position = $Line.IndexOf("time=")
    $TimeStr =$Line.SubString($Position+6,8)
    IF ($StartTime -EQ $Null)
    $StartTime = $TimeStr -As [System.TimeSpan]
    Else
    $EndTime = $TimeStr -As [System.TimeSpan]
    $ElapsedTime = $EndTime.Subtract($StartTime)
    "StartTime=$StartTime EndTime=$EndTime ElapsedTime=$ElapsedTime"
    $StartTime = $Null
    Gives this output
    StartTime=11:10:58 EndTime=12:10:58 ElapsedTime=01:00:00
    StartTime=13:10:58 EndTime=15:13:38 ElapsedTime=02:02:40
    StartTime=16:10:58 EndTime=17:08:28 ElapsedTime=00:57:30

  • Issue with status of data information in Bex analyzer report

    Hi BI gurus,
    One of the queries is showing an older date for the "Status of Data" information in the BEx Analyzer report. I tried to correct it in BEx Analyzer by removing the existing text information element and adding a new text element in the BEx Analyzer designer for the query, but that did not work, as changes made to the query through BEx Analyzer are only saved as a local workbook rather than being reflected in the query. Please suggest some options to resolve this issue and any ideas for correcting the "Status of Data" in BEx Query Designer.

    Hi Aditya
    This is a common problem users face when reporting on a MultiProvider.
    In my project, what I did to overcome this was to run a fake DTP to the cube whose status is causing the problem.
    For example, if under the MultiProvider I have a planning cube that is only updated monthly while all the actuals cubes are updated daily, I create a DTP on the plan cube with an impossible selection condition (like fiscal year 2099). This brings 0 records into the planning cube (and thereby does not affect the data) but updates the last load time.
    Regards
    Anindya

  • Java.io.UTFDataFormatException using WS-I Analyzer

    Hi
    I'm trying to use the WS-I Analyzer on the HTTP Analyzer logs. If the HTTP Analyzer logs contain accented characters, I get a message box with a NullPointerException. If I check the JDeveloper console I see:
    java.io.UTFDataFormatException: Codificación UTF8 no válida.
    Any idea how to fix this?
    Regards,
    Néstor Boscán

    Hi,
    I've got the same error, although the XML seems to be valid UTF-8.
    I'm also interested in any ideas regarding this.
    (Btw, the exception message is "Invalid UTF-8 encoding" in English.)
    (I've only found this post related to this exception at Sun Java Forum, but it doesn't help.)
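    To check whether a suspect log/XML file really is valid UTF-8, a small PowerShell sketch like this can help (the path is hypothetical):
    # Decode with a strict UTF-8 decoder that throws on invalid byte sequences
    $bytes = [System.IO.File]::ReadAllBytes('C:\temp\http-analyzer-log.xml')
    $strictUtf8 = New-Object System.Text.UTF8Encoding($false, $true)
    try   { [void]$strictUtf8.GetString($bytes); 'Valid UTF-8' }
    catch { "Invalid UTF-8: $($_.Exception.Message)" }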
    Thanks,
    Patrik

  • How to reduce "Wait for Log Writer"

    Hi,
    in a production system using MaxDB 7.6.03.07 I checked the following log activity counters:
    Log Pages Written: 32,039
    Wait for Log Writer: 31,530
    The docs explain that "Wait for Log Writer" indicates how often it was necessary to wait for a log entry to be written.
    What steps should I follow to reduce this?
    thanks for any help
    Clóvis

    Hi,
    when the log I/O queue is full, all user tasks that want to insert entries into the log I/O queue have to wait until the queued log entries have been written to the log volume - they are waiting for the log writer.
    First, you should check the size of the LOG_IO_QUEUE parameter and think about increasing its value.
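    For scale, relating the two counters from the question shows how severe the queuing is - a quick check:
    # Waits for the log writer per log page written, from the counters above
    31530 / 32039    # ~0.98 -> nearly every log page write involved a waiting user task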
    Second, check the write I/O time for the log -> use the DB Analyzer and activate time measurement via the command x_cons <DBNAME> time enable.
    You will then get the write I/O times for the log in the DB Analyzer log files (expert).
    You will find more information about MaxDB logging and performance analysis on maxdb.sap.com -> [training material|http://maxdb.sap.com/training], chapter Logging and Performance Analysis.
    Regards, Christiane
