Query on available.log file process list

Hi gurus,
I am aware that available.log, located in the work directory, is the file where we can see the status of the monitored processes for an instance.
I also suspect it is not 100% reliable.
I would like to know which processes are monitored for available.log in a Solaris environment.
The processes in the monitored list are being stopped/started frequently, and I would like to check which processes those are.
Thanks,
Sowmya

Hi Sunny,
Thanks for your info.
This file is being updated frequently.
Does that mean the processes are being restarted frequently?
Please find the details from the log:
Available   29.07.2010 14:45:22 - 29.07.2010 14:45:22
Unavailable 29.07.2010 14:45:27 - 29.07.2010 14:45:27
Available   29.07.2010 14:46:23 - 29.07.2010 14:46:23
Unavailable 29.07.2010 14:46:27 - 29.07.2010 14:46:27
Available   29.07.2010 14:47:23 - 29.07.2010 14:47:23
Unavailable 29.07.2010 14:47:27 - 29.07.2010 14:47:27
Available   29.07.2010 14:48:23 - 29.07.2010 14:48:23
Unavailable 29.07.2010 14:48:27 - 29.07.2010 14:48:27
Available   29.07.2010 14:49:23 - 29.07.2010 14:49:23
Unavailable 29.07.2010 14:49:27 - 29.07.2010 14:49:27
Available   29.07.2010 14:50:23 - 29.07.2010 14:50:23
Unavailable 29.07.2010 14:50:27 - 29.07.2010 14:50:27
Available   29.07.2010 14:51:23 - 29.07.2010 14:51:23
Unavailable 29.07.2010 14:51:27 - 29.07.2010 14:51:27
Available   29.07.2010 14:52:23 - 29.07.2010 14:52:23
Unavailable 29.07.2010 14:52:27 - 29.07.2010 14:52:27
Available   29.07.2010 14:53:23 - 29.07.2010 14:53:23
Unavailable 29.07.2010 14:53:27 - 29.07.2010 14:53:27
But the server is up and running fine.
Is this file 100% reliable?
Or could there be a bug somewhere, for example a kernel bug that is updating the file incorrectly?
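In the meantime, one way to quantify the flapping is to parse available.log yourself and count the state transitions. A minimal sketch in Python (the line format is taken from the log excerpt above; the file path and any threshold logic are up to you):

```python
from collections import Counter

def summarize_availability(lines):
    """Count Available/Unavailable entries and the number of state flips."""
    counts = Counter()
    prev_state = None
    flips = 0
    for line in lines:
        parts = line.split()
        if not parts or parts[0] not in ("Available", "Unavailable"):
            continue  # skip blank or unrelated lines
        state = parts[0]
        counts[state] += 1
        if prev_state is not None and state != prev_state:
            flips += 1
        prev_state = state
    return counts, flips

sample = [
    "Available   29.07.2010 14:45:22 - 29.07.2010 14:45:22",
    "Unavailable 29.07.2010 14:45:27 - 29.07.2010 14:45:27",
    "Available   29.07.2010 14:46:23 - 29.07.2010 14:46:23",
]
counts, flips = summarize_availability(sample)
print(counts["Available"], counts["Unavailable"], flips)  # 2 1 2
```

Running this over the real file (e.g. the lines from open("/usr/sap/<SID>/DVEBMGS00/work/available.log")) tells you how often the state actually flips, which you can then compare against what the monitored processes are really doing.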

Similar Messages

  • Unable to view SQL Request in Log files

    Hi Folks,
    I am facing an issue: I am unable to view the physical query generated in the log files in Presentation Services.
    Below is the logical SQL request that was generated, but I want to view the exact physical query, i.e. the SQL that actually hits the database.
    Please guide me to resolve this issue; I suspect it is because of the initialization blocks I created, which are preventing the SQL request from being shown.
    -------------------- SQL Request:
    set variable LOGLEVEL = 7;SELECT "- Policy Effective-Start Date"."Start Quarter" saw_0, "- Insurance Policy Facts".Revenue saw_1, "- Insurance Policy Facts"."# Insurance Policies" saw_2, "Insurance Policy".Status saw_3, "Insurance Policy".Type saw_4 FROM "Insurance Policies" WHERE ("Insurance Policy".Type = 'Policy') AND ("- Policy Effective-Start Date"."Start Julian Day Number" BETWEEN VALUEOF(CURRENT_JULIAN_DAY)-365 AND VALUEOF("CURRENT_JULIAN_DAY")) ORDER BY saw_0, saw_3, saw_4
    /* QUERY_SRC_CD='rawSQL' */
    Regards
    Dj

    There is no Enterprise Edition of SSMS. There is SSMS Basic and SSMS Complete. Prior to 2012 SP1, only SSMS Basic was available with Express Edition, but as of 2012 SP1 Express also offers SSMS Complete. SSMS Complete is selected by default when you install
    SSMS (unless you are prior to 2012 SP1 and are using Express, of course).
    However, even SSMS Basic *should* show Agent, assuming you have permissions for that. This is hearsay, but from trusted sources. Here is what to do:
    Check what is installed on the machine from where you are running SSMS. You can do that using SQL Server Installation Center - see this blog post: http://sqlblog.com/blogs/tibor_karaszi/archive/2011/02/10/what-does-this-express-edition-look-like-anyhow.aspx
     (towards the end).
    On that machine, try both the problematic account and an account that is sysadmin. Does the sysadmin account see Agent? If so, you know permissions aren't granted properly. If not, then you know the tool is the problem.
    Also try the problematic account from a machine where you know you see Agent normally. Again, this will help you assess whether the problem is the tool (SSMS) or permissions for the account.
    Tibor Karaszi, SQL Server MVP |
    web | blog

  • Available.log reports "Unavailable"

    We recently upgraded from BW3.5 to BI7.0.  The available.log file reports the ABAP system as "Unavailable" even though it is truly available.  The timestamp changes every minute, as expected. Interestingly, our PROD CI is the only system that reports "Available"...the PROD App Servers, DEV, and QA systems all report "Unavailable".
    Does anyone know what process is writing to this "available.log"?
    bdeadm> tail -5 /usr/sap/BDE/DVEBMGS00/work/available.log
    Unavailable 20.05.2008 15:45:53 - 26.06.2008 15:13:30
    Unavailable 26.06.2008 18:45:46 - 21.07.2008 06:35:15
    Unavailable 21.07.2008 06:39:32 - 21.07.2008 14:34:38
    Unavailable 21.07.2008 14:37:02 - 24.07.2008 12:07:05
    Unavailable 24.07.2008 12:09:06 - 08.08.2008 12:34:42
    bdeadm>
    -Greg

    I have a similar issue too, have a look :
    Unavailable 21.07.2010 11:12:55 - 21.07.2010 11:12:55
    Available   21.07.2010 11:13:50 - 21.07.2010 11:13:50
    Unavailable 21.07.2010 11:13:55 - 21.07.2010 11:13:55
    Available   21.07.2010 11:14:50 - 21.07.2010 11:14:50
    Unavailable 21.07.2010 11:14:55 - 21.07.2010 11:14:55
    Available   21.07.2010 11:15:50 - 21.07.2010 11:15:50
    Unavailable 21.07.2010 11:15:55 - 21.07.2010 11:15:55
    Available   21.07.2010 11:16:50 - 21.07.2010 11:16:50
    Unavailable 21.07.2010 11:16:55 - 21.07.2010 11:16:55
    Available   21.07.2010 11:17:50 - 21.07.2010 11:17:50
    Unavailable 21.07.2010 11:17:55 - 21.07.2010 11:17:55
    Available   21.07.2010 11:18:50 - 21.07.2010 11:18:50
    Unavailable 21.07.2010 11:18:55 - 21.07.2010 11:18:55
    Available   21.07.2010 11:19:50 - 21.07.2010 11:19:50
    Unavailable 21.07.2010 11:19:55 - 21.07.2010 11:19:55
    Available   21.07.2010 11:20:50 - 21.07.2010 11:20:50
    Unavailable 21.07.2010 11:20:55 - 21.07.2010 11:20:55
    Available   21.07.2010 11:21:50 - 21.07.2010 11:21:50
    Unavailable 21.07.2010 11:21:55 - 21.07.2010 11:21:55
    Available   21.07.2010 11:22:50 - 21.07.2010 11:22:50
    Unavailable 21.07.2010 11:22:55 - 21.07.2010 11:22:55
    Got Available/Unavailable several times per minute. Any ideas?

  • Logical sql in log file.

    Can someone please tell me how to see the complete SQL query in the log file? If I run the same query again, the SQL is not produced. I looked in the server log file and also in the Manage Sessions log; it just says all columns from 'Subject Area'. I want to see all the joins and filters as well. How can I see the complete SQL even for repeated queries? I have set my logging level to 2.

    http://lmgtfy.com/?q=obiee+disable+query+caching
    http://catb.org/esr/faqs/smart-questions.html#homework

  • How to know the history of shrinking log files in mssql

    hello,
    In my SAP system someone shrank the log file from 100 GB to 5 GB. How can we check when this
    was done?
    Regards,
    ARNS.

    Hi,
    Did you check the log file in the SAP directory? There will be an entry of who changed the size and when.
    Also:
    Go to the screen where you usually change the log file size. In that particular field press F1 and go to the technical settings screen. Get the program name, table name and field name.
    Now, using SE11, try to open the table and check whether a changed-by value is there for that table.
    Also open the program and debug at the change-log-file process block; you can see in which table it updates the changes.
    There is a point of caution in this case.
    The size of the application server's System Log is determined by the
    following SAP profile parameters. Once the current System Log reaches
    the maximum file size, it gets moved to the old_file and a new
    System Log file gets created. The number of past days' messages in the
    System Log depends on the amount/activity of System Log messages and the
    max file size. Once messages get rolled off the current and old files,
    they are no longer retrievable.
    rslg/local/file /usr/sap/<SID>/D*/log/SLOG<SYSNO>
    rslg/local/old_file /usr/sap/<SID>/D*/log/SLOGO<SYSNO>
    rslg/max_diskspace/local 1000000
    rslg/central/file /usr/sap/<SID>/SYS/global/SLOGJ
    rslg/central/old_file /usr/sap/<SID>/SYS/global/SLOGJO
    rslg/max_diskspace/central 4000000
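    The rotation behaviour described above (current log moved to the old_file once the size limit is reached, fresh log created) can be sketched in Python for illustration; the file names and threshold are placeholders standing in for the rslg/* parameters:

    ```python
    import os
    import shutil

    def rotate_syslog(current, old, max_bytes):
        """Mimic the described System Log rotation: when the current log
        reaches max_bytes, it becomes the old file and a new empty log is
        created. Messages rolled off both files are no longer retrievable."""
        if os.path.exists(current) and os.path.getsize(current) >= max_bytes:
            shutil.move(current, old)   # current log becomes the old_file
            open(current, "w").close()  # a new System Log file gets created
            return True                 # rotated
        return False                    # still under the limit
    ```

    This is only a model of the mechanism, not SAP's implementation, but it makes clear why history beyond two files' worth of messages is gone for good.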

  • VO ExecuteQuery in diagnostic log file.

    Hi, I am using JDeveloper 11.1.2.0.0.
    I have developed a simple ADF application using a view-based VO with a ViewCriteria in it, and I am currently focusing on tuning the application so that the VO executes in minimal time.
    Now, while running the application with logging enabled at the finest level, I want to see when and where the VO query got executed.
    While analysing the diagnostic log file I can see two query statements for each action corresponding to an individual VO:
    one mere SELECT statement, and the other a SELECT statement with a WHERE clause.
    Although it is expected that only the SELECT statement with the WHERE clause will be executed, how can we confirm the execution of the query from the diagnostic log file (is there any keyword in the log file confirming the execution of the VO, like 'executeQuery')?
    While debugging I noticed that the VO iterator is executed several times, based on the different refresh conditions given in the pageDef.
    In this scenario, how can we differentiate whether the VO is querying the DB or caching the data?
    Can anyone please help me identify the executeQuery statement of the VO query from the log file?
    Any suggestions or information pointers will be helpful.
    Thanks and Regards,
    M A P.

    SOA 11g diagnostic logging can be configured within the Fusion Middleware Control GUI app ( "soa-infra" -> Logs -> Log Configuration ).
    OSB servers have a logging config file:
    {domain_home}/config/fmwconfig/servers/{managed_server}/logging.xml
    with default settings:
    <property name='path' value='${domain.home}/servers/${weblogic.Name}/logs/${weblogic.Name}-diagnostic.log'/>
    <property name='maxFileSize' value='10485760'/>
    <property name='maxLogSize' value='104857600'/>
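    To double-check those values on a server, the property elements can be read with a small script; a sketch in Python using the property names quoted above (the wrapper element here is a stand-in for the real logging.xml root, which you would load from the path shown):

    ```python
    import xml.etree.ElementTree as ET

    # Stand-in for the quoted logging.xml fragment; the real file has its
    # own root element and lives under {domain_home}/config/fmwconfig/...
    snippet = """
    <logging_configuration>
      <property name='path' value='${domain.home}/servers/${weblogic.Name}/logs/${weblogic.Name}-diagnostic.log'/>
      <property name='maxFileSize' value='10485760'/>
      <property name='maxLogSize' value='104857600'/>
    </logging_configuration>
    """

    def read_log_properties(xml_text):
        """Collect name/value pairs from all <property> elements."""
        root = ET.fromstring(xml_text)
        return {p.get("name"): p.get("value") for p in root.iter("property")}

    props = read_log_properties(snippet)
    print(props["maxFileSize"])  # 10485760
    ```

    So with the defaults, each diagnostic log file rolls at 10 MB and the set of rolled files is capped at 100 MB total.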

  • Creating Log files.

    Hello friends,
    I am developing a program which works with 4 different text files.
    3 of them for reading and 1 for writing and reading.
    Here's a skeleton:
    open file1 for reading.
    for each element in file1 do
          record the element in the log file
          open file2 for reading
          open file3 for writing
          for every element in file2 do
                process all elements in file2 for one element of file1
                generating an element for file3
                write out the processed elements to file3.
           end for // here we have file3 ready for the next step
    end for
    open file3 for reading
    open file4 for reading
    for each element in file3 do
          process result1 and store it in log file
          process result2 and store it in log file
          for each element in file4 do
                 process result3 and store it in log file
          end for
    end for
    Can anyone suggest how to use the Logger to log all the results?
    Also, how should I open the files for reading and writing?
    I am new to working with files in Java, and reading about the different I/O streams is driving me nuts at the moment.
    Thank you in advance.

    Thanks Keith for the useful links.
    I had to do a bit of reading about working with files in Java;
    here is what I have gotten to so far:
    try {
         // OPEN PERMUTATION FILE FOR READING (generated in the previous step)
         // A FileReader wrapped in a BufferedReader is enough here; the
         // FileInputStream/DataInputStream layers are not needed.
         BufferedReader pbr = new BufferedReader(new FileReader("permutations.txt"));
         // CREATE A TEMP FILE FOR OUTPUT
         // (a temp file because it is only needed for one permutation,
         // instead of writing to a file and clearing it again for the next one)
         File tempoutput = File.createTempFile("output", ".txt");
         String permstr;   // PERMUTATION string
         String ipstr;     // INPUT string
         String opstr;     // OUTPUT string
         while ((permstr = pbr.readLine()) != null) {   // for all strings in the permutation file
              StringTokenizer permst = new StringTokenizer(permstr);
              while (permst.hasMoreTokens()) {
                   // convert the permutation string to a char array
                   String perm = permst.nextToken();
                   char[] permArray = perm.toCharArray();
                   // open the output file for writing (append mode)
                   BufferedWriter out = new BufferedWriter(new FileWriter(tempoutput, true));
                   // open the input file (original dictionary) for reading;
                   // it must be reopened for each permutation, otherwise it
                   // is exhausted after the first pass
                   BufferedReader ibr = new BufferedReader(new FileReader("sample.txt"));
                   while ((ipstr = ibr.readLine()) != null) {   // for all words in the input file
                        StringTokenizer ipst = new StringTokenizer(ipstr);
                        while (ipst.hasMoreTokens()) {
                             // convert the word into a char array
                             String ip = ipst.nextToken();
                             char[] ipArray = ip.toCharArray();
                             // the output array must be allocated before use
                             char[] opArray = new char[perm.length()];
                             for (int i = 0; i <= perm.length() - 1; i++) {
                                  // permutation element i gives the input index j
                                  // (assuming the permutation is written as digits)
                                  int j = permArray[i] - '0';
                                  // generate output as you go: output[i] = input[j]
                                  opArray[i] = ipArray[j];
                             }
                             // convert the output char array to a string and
                             // write the permuted word to the output file
                             opstr = String.valueOf(opArray);
                             out.write("\n" + opstr);
                        }
                   }
                   ibr.close();
                   out.flush();
                   out.close();   // close once per permutation, not per word
                   // OPEN OUTPUT FILE FOR READING
                   // RUN SOME PROGRAMS AND GENERATE SOME NUMBERS
                   // STORE THE RESULTS (PREFERABLY IN A LOG FILE)
                   // I AM STILL WORKING ON HOW TO DO THIS
                   tempoutput.delete();   // delete the temp output file
              }
         }   // end reading the permutation file
         // CLOSE ALL STREAMS
         pbr.close();
    } catch (IOException e) {
         System.err.println("Error: " + e.getMessage());
    }
    I am currently debugging this piece of code.
    Can anyone point out any flaws/design errors here?
    Any suggestions to make it better/efficient?
    Greatly appreciated as always.

  • Amending script to read list of computers, run script and output to log file

    Hello all,
    I have cobbled together a script that runs and does what I want. Now I would like to amend it to read a list of computers rather than using the input box it currently uses for strComputer. If a computer doesn't respond to a ping, log that; if it does,
    continue with the script, and when it is complete, log a success or failure. I have just started scripting and would really appreciate some help on this one, thanks. I created the script to fix an SCCM updates issue and failing task sequences,
    so it may prove useful to others.
    There are message-box entries that can be removed; they were originally in there for the user running the script.
    'setting objects
    Dim objnet, objFSO, objshell
    Dim objFile, strLine, intResult
    Set objnet = CreateObject("wscript.network")
    Set objFSO = CreateObject("scripting.filesystemobject")
    Set objshell = CreateObject("wscript.shell")
    strfile = "c:\wuafix\wuafix.vbs"
    strUser = "domain\user"
    strPassword = "password"
    'getting server name or IP address
    strComputer=InputBox("Enter the IP or computer name of the remote machine on which to repair the WUA agent:", "Starting WUA Fix")
    'check to see if the server can be reached
    Dim strPingResults
    Set pingExec = objshell.Exec("ping -n 3 -w 2000 " & strComputer) 'send 3 echo requests, waiting 2secs each
    strPingResults = LCase(pingExec.StdOut.ReadAll)
    If Not InStr(strPingResults, "reply from")>0 Then
    WScript.Echo strComputer & " did not respond to ping."
    WScript.Quit
    End If
    'Check if source file exists
    If Not objFSO.FileExists(strFile) Then
    WScript.Echo "The source file does not exist"
    WScript.Quit
    End If
    MsgBox "The WUA Fix is in process. Please wait.", 64, "Script Message"
    'mapping drive to remote machine
    If objFSO.DriveExists("Z:") Then
    objnet.RemoveNetworkDrive "Z:","True","True"
    End If
    objnet.MapNetworkDrive "Z:", "\\" & strComputer & "\c$", True
    'creating folder for install exe on remote machine
    If (objFSO.FolderExists("Z:\wuafix\") = False) Then
    objFSO.CreateFolder "Z:\wuafix"
    End If
    'copying vbs to remote machine
    objFSO.CopyFile strFile, "Z:\wuafix\wuafix.vbs"
    'set command line executable to run a silent install remotely
    strInstaller1 = "cscript.exe c:\wuafix\wuafix.vbs"
    'strInstaller2 = "c:\wuafix\wuafix.vbs"
    strExec = "c:\pstools\PsExec.exe "
    'objshell.Run strExec & " \\" & strComputer & strInstaller1
    On Error Resume Next
    result = objshell.Run(strExec & " \\" & strComputer & " " & strInstaller1)
    If Err.Number = 0 Then
    WScript.Echo "PsExec running WUA fix remotely"
    Else MsgBox Err.Number
    MsgBox result
    End If
    Set objWMIService = GetObject("winmgmts:" _
    & "{impersonationLevel=impersonate}!\\" & strComputer & "\root\cimv2")
    Set colLoggedEvents = objWMIService.ExecQuery _
    ("SELECT * FROM Win32_NTLogEvent WHERE Logfile = 'Application' AND " _
    & "EventCode = '4'")
    Wscript.Echo "Event Viewer checked and Fix Applied:" & colLoggedEvents.Count
    MsgBox "Removing mapped drive Please wait.", 64, "Script Message"
    If objFSO.DriveExists("Z:") Then
    objnet.RemoveNetworkDrive "Z:","True","True"
    End If
    MsgBox "The WUA Fix has been applied.", 64, "Script Message"
    wscript.quit
    Any help appreciated and explanations on the process would be great as I would like to learn the process involved, which is difficult when working during the day.
    many thanks

    Hi Bill,
    Long story short: I have approx. 2800 clients with an old entry in WMI for updates that the SCCM client cannot clear or run because the updates do not exist anymore, so the client will not run updates or use a task sequence. My script fixes this
    and does a couple of other things. I have found another way to do this by running a different script that uses WMI to call a cscript function that uses the wuafix.vbs copied to the machine. I am also changing the echo entries to output to a log
    file instead, so that I can track which clients have run the fix and which haven't.
    If you have any suggestions then please let me know, nothing nefarious :)
    many thanks
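    The "read a list of computers" amendment boils down to: for each host, ping; log and skip if unreachable; otherwise run the fix and log the outcome. The shape can be sketched like this in Python (the host list, ping wrapper and run_fix are placeholders for your own pieces, e.g. a real ping/PsExec call):

    ```python
    def process_hosts(hosts, ping, run_fix, log):
        """For each host: skip and log if it doesn't answer a ping,
        otherwise run the fix and log success or failure."""
        for host in hosts:
            if not ping(host):
                log(f"{host}: no ping response, skipped")
                continue
            ok = run_fix(host)
            log(f"{host}: fix {'succeeded' if ok else 'FAILED'}")

    # Example wiring with stand-ins (swap in real ping/fix calls):
    results = []
    process_hosts(
        ["pc1", "pc2"],
        ping=lambda h: h != "pc2",   # pretend pc2 is offline
        run_fix=lambda h: True,
        log=results.append,
    )
    print(results)  # ['pc1: fix succeeded', 'pc2: no ping response, skipped']
    ```

    Pointing log at a function that appends to a file gives you the per-client tracking described above; the same loop translates directly back into VBScript with FileSystemObject's OpenTextFile for both the host list and the log.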

  • Log file analysis - BPELFault - process instance id

    Hello,
    I'd like to ask whether there is any way to log the BPEL process instance id into the log files when a BPELFault is thrown.
    For example in our production logs there are faults like this:
    com.oracle.bpel.client.BPELFault: faultName: {{http://schemas.xmlsoap.org/ws/2003/03/business-process/}selectionFailure}
    messageType: {}
    parts: {{summary=<summary>XPath query string returns zero node.
    According to BPEL4WS spec 1.1 section 14.3, The assign activity &amp;lt;to&amp;gt; part query should not return zero node.
    Please check the BPEL source at line number "2109" and verify the &amp;lt;to&amp;gt; part xpath query.
    Possible reasons behind this problems are: some xml elements/attributes are optional or the xml data is invalid according to XML Schema.
    To verify whether XML data received by a process is valid, user can turn on validateXML switch at the domain administration page.
    </summary>
    But there is no information about the process and instance id in which it occurred.
    There are many processes and instances on our prod server, so it is impossible to identify in which process and in which instance the error occurred.
    Is there any way to configure/implement this so that the log file will also contain the instance ID?
    For example
    com.oracle.bpel.client.BPELFault: faultName: {{http://schemas.xmlsoap.org/ws/2003/03/business-process/}selectionFailure}
    messageType: {}
    parts: {{....}}
    instance id: {12345678}
    thank you very much.
    Roman

    Hi Roman,
    The best way is to create your own fault-handling framework: create a BPEL process that stores all the fault values in a table in the DB, and call this BPEL process from your catch blocks. In this way you can easily keep track of the errors and the position where each error occurred. Also, since everything is in a table, you can generate reports for your production system.
    Regards
    Sahil
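    The table-based approach described above amounts to recording the process name, instance id and fault details as one row per fault; a minimal sketch of such a fault log using SQLite (the table and column names are illustrative, not from any Oracle product):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")  # stand-in for the real fault table's DB
    conn.execute("""CREATE TABLE bpel_faults (
        process_name TEXT, instance_id TEXT, fault_name TEXT, summary TEXT)""")

    def record_fault(process_name, instance_id, fault_name, summary):
        # Called from the catch block with the current instance's details,
        # so a production log entry can be matched back to its instance.
        conn.execute("INSERT INTO bpel_faults VALUES (?, ?, ?, ?)",
                     (process_name, instance_id, fault_name, summary))
        conn.commit()

    record_fault("InsurancePolicyProcess", "12345678",
                 "selectionFailure", "XPath query string returns zero node.")
    row = conn.execute(
        "SELECT instance_id, fault_name FROM bpel_faults").fetchone()
    print(row)  # ('12345678', 'selectionFailure')
    ```

    With every catch block feeding such a table, the instance id is always available alongside the fault, and reporting becomes a simple query.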

  • Get Total DB size , Total DB free space , Total Data & Log File Sizes and Total Data & Log File free Sizes from a list of server

    How can I get the total DB size, total DB free space, total data and log file sizes, and total data and log file free space for SQL Server from a list of servers?

    Hi Shivanq,
    To get a list of databases, their sizes, and the space available in each on the local SQL instance, run:
    dir SQLSERVER:\SQL\localhost\default\databases | Select Name, Size, SpaceAvailable | ft -auto
    This article is also helpful for you to get DB and Log File size information:
    Checking Database Space With PowerShell
    I hope this helps.

  • Location of query log files in OBIEE 11g (version 11.1.1.5)

    Hi,
    I would like to know the location of the query log files in OBIEE 11g (version 11.1.1.5).

    Hi,
    Log Files in OBIEE 11g
    Login to the URL http://server.domain:7001/em and navigate to:
    Farm_bifoundation_domain-> Business Intelligence-> coreapplications-> Diagnostics-> Log Messages
    You will find the available files:
    Presentation Services Log
    Server Log
    Scheduler Log
    JavaHost Log
    Cluster Controller Log
    Action Services Log
    Security Services Log
    Administrator Services Log
    However, you can also review them directly on the hard disk.
    The log files for OBIEE components are under <OBIEE_HOME>/instances/instance1/diagnostics/logs.
    Specific log files and their locations are listed in the following table:
    Log                                                  Location
    Installation log                                     <OBIEE_HOME>/logs
    nqquery log                                          <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
    nqserver log                                         <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
    servername_NQSAdminTool log                          <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
    servername_NQSUDMLExec log                           <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
    servername_obieerpdmigrateutil log (Migration log)   <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
    sawlog0 log (Presentation)                           <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIPresentationServicesComponent/coreapplication_obips1
    jh log (Java Host)                                   <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIJavaHostComponent/coreapplication_obijh
    webcatupgrade log (Web Catalog Upgrade)              <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIPresentationServicesComponent/coreapplication_obips1
    nqscheduler log (Agents)                             <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBISchedulerComponent/coreapplication_obisch1
    nqcluster log                                        <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIClusterControllerComponent/coreapplication_obiccs1
    ODBC log                                             <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIODBCComponent/coreapplication_obips1
    opmn log                                             <OBIEE_HOME>/instances/instance1/diagnostics/logs/OPMN/opmn
    debug log                                            <OBIEE_HOME>/instances/instance1/diagnostics/logs/OPMN/opmn
    logquery log                                         <OBIEE_HOME>/instances/instance1/diagnostics/logs/OPMN/opmn
    service log                                          <OBIEE_HOME>/instances/instance1/diagnostics/logs/OPMN/opmn
    opmn out                                             <OBIEE_HOME>/instances/instance1/diagnostics/logs/OPMN/opmn
    Upgrade Assistant log                                <OBIEE_HOME>/Oracle_BI1/upgrade/logs
    Regards
    MuRam
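    Once nqquery.log has been located, the physical SQL sent to the database can be pulled out of it. The sketch below assumes the common nqquery.log layout where each physical query follows a "Sending query to database" marker line; the sample text is made up, and the marker wording may vary by version and logging level, so adjust it to what your log actually contains.

    ```python
    # Minimal sample in the spirit of nqquery.log; the marker and delimiter
    # conventions are assumptions and may differ in your environment.
    SAMPLE_LOG = """\
    -------------------- SQL Request:
    SELECT "Time"."Year" FROM "Sales"
    -------------------- Sending query to database named OracleDB (id: <<1001>>):
    select t1.year_id from times t1
    +++Administrator:2a0000:2a0001:----2010/08/20 02:37:00
    """

    def physical_queries(log_text):
        """Return the physical SQL statements following each 'Sending query' marker."""
        queries = []
        lines = log_text.splitlines()
        for i, line in enumerate(lines):
            if "Sending query to database" in line:
                body = []
                for nxt in lines[i + 1:]:
                    # A new section starts with a dashed rule or a +++ session stamp.
                    if nxt.startswith("----") or nxt.startswith("+++"):
                        break
                    body.append(nxt)
                queries.append("\n".join(body).strip())
        return queries

    print(physical_queries(SAMPLE_LOG))
    ```

    Running this against the real file (for example the one under coreapplication_obis1 above) gives you just the physical SQL requests without paging through the rest of the log.
    
    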

  • ATSServer fatal exception- runaway syslogd process & massive asl.log file

    Hi All,
    (in lieu of a bug report)
    My MacBook has recently started to go unresponsive (beachball of death), with only cursor movement and the clock indicating it wasn't frozen (Cmd-Opt-Esc didn't work either), forcing me to hard restart. When this happens Activity Monitor shows syslogd hogging all the CPU, and inactive memory climbs until it maxes out after which the fans kick in and soon after the computer goes unresponsive. I checked Console and found a 1GB+ asl.log file (which has now mutated into a new file for each instance), with pages of this message (all same time stamp, 3 examples listed):
    [Time 2007.11.16 03:39:43 UTC] [Facility user] [Sender /System/Library/Frameworks/ApplicationServices.framework/Frameworks/ATS.framework/Support/ATSServer] [PID -1] [Message FOExceptionMainHandler caught a fatal exception at 0x00018d03;] [Level 3] [UID -2] [GID -2] [Host oli-studholmes-computer]
    [Time 2007.11.16 03:26:10 UTC] [Facility user] [Sender /System/Library/Frameworks/ApplicationServices.framework/Frameworks/ATS.framework/Support/ATSServer] [PID -1] [Message FOExceptionMainHandler caught a fatal exception at 0x00018d03;] [Level 3] [UID -2] [GID -2] [Host oli-studholmes-computer]
    [Time 2007.11.20 07:19:21 UTC] [Facility user] [Sender /System/Library/Frameworks/ApplicationServices.framework/Frameworks/ATS.framework/Support/ATSServer] [PID -1] [Message FOExceptionMainHandler caught a fatal exception at 0x00018d03;] [Level 3] [UID -2] [GID -2] [Host oli-studholmes-computer]
    My only workaround is to force-kill the syslogd process. Since this happened I've updated from Mac OS X 10.4.10 to 10.4.11, but the problem hasn't stopped. I haven't recently installed any other software.
    I first noticed this trying to open a file containing Japanese text in TextMate, which led me to this page:
    http://d.hatena.ne.jp/hetima/20061102/1162435711#c1184639718
    (basically others having the same problem, using the same workaround, and the last comment saying it also happens in other text editors and looks like a system bug).
    At this time I did a Disk Utility repair which found errors and repaired them (although I had to do the repair via Target Disk Mode as I couldn't boot from the Tiger install DVD and even the verify process would fail due to an underlying process (i think it was diskutil-can't remember) hanging). Also because there was the suggestion of a font problem I checked my fonts with Font Book (all ok), and cleared font caches with Font Finangler (this was all pre-10.4.11 update).
    I'm going to go to the Apple store once I back everything up. I'd be interested if anyone else has seen this, and if so if it was related to Japanese fonts or not.
    peace
    PS how do I prevent Jive from turning all those nice square brackets into links I wonder? the pretag in square brackets doesn't do it, and [code] isn't enabled...
    PPS this link to a [Mac OS X 10.3 ATS Services bug report|http://daringfireball.net/2005/03/fontcaches_gonewild] might also be relevant
    Message was edited by: Oli Studholme (adding PSs)
    Message was edited by: Oli Studholme

    I have a problem with Safari just quitting on me: the Safari window closes, and a new window comes up saying Safari quit unexpectedly and asks if I want to report it. I repaired permissions, a few things came up, and I ran the repair. It seems to have cleared up the problem.
    Thankyou very much.

  • Query log file location?

    Is a log file created when a query has completed execution? If yes, please tell me its location.

    Turn timing on before executing the query in SQL*Plus, like:
    SQL> set timing on
    SQL> select 1 from dual;
             1
    ----------
             1
    Elapsed: 00:00:00.00
    To turn tracing on, just do the following:
    SQL> SET AUTOTRACE TRACEONLY
    SQL> SELECT 1 FROM DUAL;
    Elapsed: 00:00:00.01
    Execution Plan
    0 SELECT STATEMENT Optimizer=ALL_ROWS (Cost=2 Card=1)
    1 0 FAST DUAL (Cost=2 Card=1)
    Statistics
    1 recursive calls
    0 db block gets
    0 consistent gets
    0 physical reads
    0 redo size
    419 bytes sent via SQL*Net to client
    508 bytes received via SQL*Net from client
    2 SQL*Net roundtrips to/from client
    0 sorts (memory)
    0 sorts (disk)
    1 rows processed
    SQL>
    Regards
    Edited by: Virendra.k.Yadav on Aug 20, 2010 2:37 AM

  • How do I setup RMAN not to delete archive log files on the source database so GoldenGate can process DDL/DML changes?

    I want to set up RMAN not to delete any archive log files that will be used by GoldenGate. Once GoldenGate is done with an archive log file, the file can be backed up and deleted by RMAN. It's my understanding that I can issue the command "REGISTER EXTRACT <ext_name>, LOGRETENTION" to enable this functionality. Is this the only thing I need to do to enable it?

    Hello,
    Yes, this is the right way when using classic capture.
    Using the command REGISTER EXTRACT extract_name LOGRETENTION
    creates an (artificial) Oracle Streams capture group that prevents RMAN from deleting archives that are still pending processing by the GoldenGate capture process.
    You can see this integration by running SELECT * FROM DBA_CAPTURE; after executing the register command.
    Then, when RMAN tries to delete an archive file that is still pending processing by GG, this warning appears in the RMAN logs:
    Error:     RMAN 8317 (RMAN-08317 RMAN-8317)
    Text:     WARNING: archived log not deleted, needed for standby or upstream capture process.
    This is a good manageability feature; I believe it is new in GG 11.1.
    Tip: to avoid RMAN backing up an archive that is pending processing multiple times, there is an option: BACKUP ARCHIVELOG ALL NOT BACKED UP 1 TIMES.
    If you remove a capture process that is registered with the database, you need to use this command to remove the Streams capture group:
    UNREGISTER EXTRACT extract_name LOGRETENTION;
    Then, if you query DBA_CAPTURE, the artificial Streams group is deleted.
    I hope this helps.
    Regards
    Arturo

  • (Cisco Historical Reporting / HRC ) All available connections to database server are in use by other client machines. Please try again later and check the log file for error 5054

    Hi All,
    I am getting the error message "All available connections to database server are in use by other client machines. Please try again later and check the log file for error 5054" when trying to log into HRC (this user has reporting capabilities). I checked the log files, and this is what I found out:
    The log file stated that there were ongoing connections from HRC to the CCX (I am sure there isn't any active login to HRC).
    || When I tried to log in, the following error was displayed because the maximum number of connections was reached for the server. We can see that a total of 5 connections have been configured. ||
    1: 6/20/2014 9:13:49 AM %CHC-LOG_SUBFAC-3-UNK:Current number of connections (5) from historical Clients/Scheduler to 'CRA_DATABASE' database exceeded the maximum number of possible connections (5).Check with your administrator about changing this limit on server (wfengine.properties), however this might impact server performance.
    || Below we can see all 5 connections being used up . ||
    2: 6/20/2014 9:13:49 AM %CHC-LOG_SUBFAC-3-UNK:[DB Connections From Clients (count=5)]|[(#1) 'username'='uccxhrc','hostname'='3SK5FS1.ucsfmedicalcenter.org']|[(#2) 'username'='uccxhrc','hostname'='PFS-HHXDGX1.ucsfmedicalcenter.org']|[(#3) 'username'='uccxhrc','hostname'='PFS-HHXDGX1.ucsfmedicalcenter.org']|[(#4) 'username'='uccxhrc','hostname'='PFS-HHXDGX1.ucsfmedicalcenter.org']|[(#5) 'username'='uccxhrc','hostname'='47BMMM1.ucsfmedicalcenter.org']
    || Once the maximum number of connection was reached it threw an error . ||
    3: 6/20/2014 9:13:49 AM %CHC-LOG_SUBFAC-3-UNK:Number of max connection to 'CRA_DATABASE' database was reached! Connection could not be established.
    4: 6/20/2014 9:13:49 AM %CHC-LOG_SUBFAC-3-UNK:Database connection to 'CRA_DATABASE' failed due to (All available connections to database server are in use by other client machines. Please try again later and check the log file for error 5054.)
    Current exact UCCX Version 9.0.2.11001-24
    Current CUCM Version 8.6.2.23900-10
    Business impact  Not Critical
    Exact error message  All available connections to database server are in use by other client machines. Please try again later and check the log file for error 5054
    What is the OS version of the PC you are running, and is it a physical machine or a virtual machine that is running the HRC client?
    OS version: Windows 7 Home Premium 64-bit, and it's a physical machine.
    The Max DB Connections for Report Client Sessions is set to 5 for each server (there are two servers). The number of HR Sessions is set to 10.
    I wanted to know if there is a way to find the HRC sessions that are active now and terminate one, several, or all of those sessions from the server end.
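    As a side note, the %CHC-LOG connection line quoted above already identifies which client machines hold the sessions. A quick, illustrative way to tally them (this only parses the log line format shown in the post; it does not terminate anything on the server):

    ```python
    import re
    from collections import Counter

    # The connection list from the log line quoted in the post (abbreviated).
    line = ("[DB Connections From Clients (count=5)]|"
            "[(#1) 'username'='uccxhrc','hostname'='3SK5FS1.ucsfmedicalcenter.org']|"
            "[(#2) 'username'='uccxhrc','hostname'='PFS-HHXDGX1.ucsfmedicalcenter.org']|"
            "[(#3) 'username'='uccxhrc','hostname'='PFS-HHXDGX1.ucsfmedicalcenter.org']|"
            "[(#4) 'username'='uccxhrc','hostname'='PFS-HHXDGX1.ucsfmedicalcenter.org']|"
            "[(#5) 'username'='uccxhrc','hostname'='47BMMM1.ucsfmedicalcenter.org']")

    # Pull every hostname out of the connection list and count per machine.
    hosts = re.findall(r"'hostname'='([^']+)'", line)
    print(Counter(hosts))
    ```

    In the quoted line, one machine (PFS-HHXDGX1) holds three of the five slots, which points at where the stale sessions live even before anything is done server-side.
    
    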

    We have had this "PRX5" problem with Exchange 2013 since the RTM version.  We recently applied CU3, and it did not correct the problem.  We have seen this problem on every Exchange 2013 we manage.  They are all installations where all roles
    are installed on the same Windows server, and in our case, they are all Windows virtual machines using Windows 2012 Hyper-V.
    We have tried all the "this fixed it for me" solutions regarding DNS, network cards, host file entries and so forth.  None of those "solutions" made any difference whatsoever.  The occurrence of the temporary error PRX5 seems totally random. 
    About 2 out of 20 incoming mail test by Microsoft Connectivity Analyzer fail with this PRX5 error.
    Most people don't ever notice the issue because remote mail servers retry the connection later.  However, telephone voice mail systems that forward voice message files to email, or other such applications such as your scanner, often don't retry and
    simply fail.  Our phone system actually disables all further attempts to send voice mail to a particular user if the PRX5 error is returned when the email is sent by the phone system.
    Is Microsoft totally oblivious to this problem?
    PRX5 is a serious issue that needs an Exchange team resolution, or at least an acknowledgement that the problem actually does exist and has negative consequences for proper mail flow.
    JSB
