Specific log file

I want to see which log file records specific information, such as outline changes or rejected records during a data load.
For example: which log file records when the outline structure is modified?

All you need to know is in the section "Using Essbase Logs": http://download.oracle.com/docs/cd/E12825_01/epm.111/esb_dbag/dlogs.htm#dlogs1026780
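In case it helps while reading that section: outline changes are recorded in the outline change log once it is enabled, and records rejected during a data load go to an error file (dataload.err by default, or a file you name in the load). The setting and file names below are from memory of the DBAG, so verify them against the linked section; enabling the outline change log is a one-line essbase.cfg entry:

    OUTLINECHANGELOG TRUE

The resulting dbname.olg file in the database directory then records each outline structure modification.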
Cheers
John
http://john-goodwin.blogspot.com/

Similar Messages

  • Database-Specific Log Files?

    Hi,
    Assuming that you are using an environment, is there some way to specify a database-specific (rather than environment-specific) location for log files? Or, put another way, can you have log files created for each database rather than for each environment?
    Thanks,
    -- [email protected]
    Andrew Bell
    Iowa State University

    Hi Andrew,
    Could you let us know what goal you want to achieve by using per-database log files?
    You could open a single database per environment, thus having the log files associated with a single database, but this is not a scalable solution at all. Or, you could create your own logging mechanism, which might be quite complicated, as it will affect related operations such as checkpoints, recovery, and so on.
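    To illustrate the one-database-per-environment workaround, here is a minimal sketch using the Berkeley DB Java Edition API (an assumption, since the product edition isn't stated; the directory layout and names are also made up for the example). Each environment keeps its own log files under its own home directory:

        import java.io.File;
        import com.sleepycat.je.Database;
        import com.sleepycat.je.DatabaseConfig;
        import com.sleepycat.je.Environment;
        import com.sleepycat.je.EnvironmentConfig;

        public class PerDatabaseEnv {
            // Opens one environment per database, so the .jdb log files under
            // /data/envs/<name> belong to exactly one database. The home
            // directory must already exist.
            public static Database open(String name) {
                EnvironmentConfig envCfg = new EnvironmentConfig();
                envCfg.setAllowCreate(true);
                Environment env = new Environment(new File("/data/envs/" + name), envCfg);

                DatabaseConfig dbCfg = new DatabaseConfig();
                dbCfg.setAllowCreate(true);
                return env.openDatabase(null, name, dbCfg);
            }
        }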
    Regards,
    Andrei

  • Log4j - display logger.debug(...) from class to only a specific log file

    I want logger.debug statements from one class (org.myorg.MyClass) to go to only one log file (logY.log, which is appender A1) and not to show in logX.log. With the configuration below, logger.debug from org.myorg.MyClass now shows in both logX.log and logY.log.
    What is the least amount of change (this is maintenance code) that can be done in the following log4j.properties to show logger.debug("aaa") from org.myorg.MyClass only in logY.log (appender A1) and not in logX.log?
    Following is log4j.properties:
    log4j.rootLogger = DEBUG, stdout
    log4j.appender.stdout=org.apache.log4j.RollingFileAppender
    log4j.appender.stdout.File=${pathVariable}/logX.log
    log4j.logger.org.myorg.MyClass=DEBUG, A1
    log4j.appender.A1=org.apache.log4j.RollingFileAppender
    log4j.appender.A1.File= ${pathVariable}/logY.log
    ...

    What have you done so far?
    What kind of approach are you going to take (aside from asking others to do it all for you)?
    Have you written the algorithm in pseudo-code? If so, can we see it? We might be able to give you a few pointers.
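    For reference, the usual least-change fix in log4j 1.x is a single added line: turn off additivity for that logger, so its events go only to appender A1 and stop propagating to the root logger's stdout appender (this assumes the appender layouts elided above are configured elsewhere):

        log4j.additivity.org.myorg.MyClass=false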

  • Problem with logging in log files.

    Hi,
    Ours is a client/server application with many clients.
    Each client has a separate web page.
    When a client downloads any file from their site, the download is logged (client name, time, file name, etc.) to a common log file (access.log) and to a client-specific log file (access.client_name).
    Logging to the common file (access.log) works well.
    Logging to the client-specific log file works for some clients and not for others.
    For example, there is a client called 'candy'.
    Sometimes entries are written to its log file, access.candy, and sometimes they are not.
    Can you tell me what the problem is?
    If you want more information regarding my problem, I will send it.
    Please give me some solution.
    Thank you.

    Third Party Client: ((__null) != m_lock && 0 == (*__error())) Can't create semaphore lock
    There seems to be something wrong with the handling (m_lock) of the semaphore mechanism. It appears to be a program issue, either on your local machine or on a remote web site's page(s).
    'semaphore' Apple definition (quotation from ADC):
    A programming technique for coordinating activities in which multiple processes compete for the same kernel resources. Semaphores are commonly used to share a common memory space and to share access to files. Semaphores are one of the techniques for interprocess communication in BSD.
    In short, it is a flag used to end one task/thread reliably before another task/thread starts; a synchronization mechanism among cooperating threads/tasks. (You might need some understanding of the basic concepts of locks and semaphores.)
    I would temporarily uninstall any suspect applications to see whether the erratic events still appear in Console. Perhaps the VLC player?
    Fumiaki
    Tokyo

  • Location of query log files in OBIEE 11g (version 11.1.1.5)

    Hi,
    I wish to know the location of query log files in OBIEE 11g (version 11.1.1.5).

    Hi,
    Log Files in OBIEE 11g
    Log in to the URL http://server.domain:7001/em and navigate to:
    Farm_bifoundation_domain -> Business Intelligence -> coreapplication -> Diagnostics -> Log Messages
    You will find the available files:
    Presentation Services Log
    Server Log
    Scheduler Log
    JavaHost Log
    Cluster Controller Log
    Action Services Log
    Security Services Log
    Administrator Services Log
    However, you can also review them directly on the hard disk.
    The log files for OBIEE components are under <OBIEE_HOME>/instances/instance1/diagnostics/logs.
    Specific log files and their locations:
    Installation log: <OBIEE_HOME>/logs
    nqquery log: <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
    nqserver log: <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
    servername_NQSAdminTool log: <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
    servername_NQSUDMLExec log: <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
    servername_obieerpdmigrateutil log (migration log): <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIServerComponent/coreapplication_obis1
    sawlog0 log (Presentation Services): <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIPresentationServicesComponent/coreapplication_obips1
    jh log (JavaHost): <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIJavaHostComponent/coreapplication_obijh
    webcatupgrade log (Web Catalog upgrade): <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIPresentationServicesComponent/coreapplication_obips1
    nqscheduler log (Agents): <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBISchedulerComponent/coreapplication_obisch1
    nqcluster log: <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIClusterControllerComponent/coreapplication_obiccs1
    ODBC log: <OBIEE_HOME>/instances/instance1/diagnostics/logs/OracleBIODBCComponent/coreapplication_obips1
    opmn log: <OBIEE_HOME>/instances/instance1/diagnostics/logs/OPMN/opmn
    debug log: <OBIEE_HOME>/instances/instance1/diagnostics/logs/OPMN/opmn
    logquery log: <OBIEE_HOME>/instances/instance1/diagnostics/logs/OPMN/opmn
    service log: <OBIEE_HOME>/instances/instance1/diagnostics/logs/OPMN/opmn
    opmn out: <OBIEE_HOME>/instances/instance1/diagnostics/logs/OPMN/opmn
    Upgrade Assistant log: <OBIEE_HOME>/Oracle_BI1/upgrade/logs
    Regards
    MuRam

  • Listener log file

    My database version is 11g Release 11.1.0.6.0.
    I need my listener to write to a specific log file. I've tried different things, including setting the following parameters in listener.ora and restarting the listener:
    LOG_DIRECTORY_LISTENER = /mydesireddirectory
    LOG_FILE_LISTENER =/mydesireddirectory/listener.log
    No matter what I do, it puts the log file in:
    /ORACLE_HOME/log/diag/tnslsnr/hostname/listener/alert/log.xml
    Can anyone tell me how to get around this, please? I don't want an XML listener log and I don't want it in that directory.
    thank you.

    Hi,
    In Oracle 11g you have the automatic diagnostic repository (ADR).
    The automatic diagnostic repository is a system-wide central repository for tracing and logging: a file-based hierarchical datastore for diagnostic information, including network tracing and logging information.
    DIAG_ADR_ENABLED_listener_name
    The DIAG_ADR_ENABLED_listener_name parameter indicates whether ADR tracing is enabled. By default it is on, so the listener writes only to the diag folder.
    LOG_DIRECTORY_listener_name and LOG_FILE_listener_name are non-ADR parameters; both are ignored while the parameter above is set to on. To disable ADR, set
    DIAG_ADR_ENABLED_listener_name = off, and only then will the non-ADR parameters take effect.
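    For a listener named LISTENER, the listener.ora entries would look roughly like this (a sketch assembled from the parameters above; note that LOG_FILE_listener_name takes just a file name, while the directory comes from LOG_DIRECTORY_listener_name; reload or restart the listener afterwards):

        DIAG_ADR_ENABLED_LISTENER = OFF
        LOG_DIRECTORY_LISTENER = /mydesireddirectory
        LOG_FILE_LISTENER = listener.log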
    Regards
    Anurag

  • How to Create a batch file to display and count specific words in log file

    Hi All,
    I have a requirement for a program that will go through a log file and look for the following keyword:
    Unexpected Essbase error
    It should also count the number of times the word "error" appears in the log file.
    You may use a batch file or a Perl script to complete this task.
    E.g., for the given log file it should flag yes, the keyword "Unexpected Essbase error" was found, and report that the word "error" occurs 9 times.
    Please help me understand how to achieve the above requirement.
    Also, please let me know what Perl scripting is.
    Thanks in Advance
    Regards,
    SM

    Sorry, but it sounds like you have been asked to do something and you have pasted the requirement on the forum. Have you done any research to find out which scripting language you are going to use, or tried to find examples? There are so many different examples and so much help on the internet; it just takes a little time and investment.
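    That said, as a starting point once you have done that reading, here is a minimal sketch of the scanning logic in Java (the file path and keywords are placeholders; the same logic ports directly to Perl or to a findstr-based batch file):

        import java.io.BufferedReader;
        import java.io.FileReader;

        public class LogScan {
            public static void main(String[] args) throws Exception {
                // path of the log file to scan (placeholder)
                String logFile = args.length > 0 ? args[0] : "application.log";
                boolean flagged = false; // "Unexpected Essbase error" seen?
                int errorCount = 0;      // occurrences of the word "error"
                try (BufferedReader in = new BufferedReader(new FileReader(logFile))) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        if (line.contains("Unexpected Essbase error")) {
                            flagged = true;
                        }
                        // count every occurrence of "error", case-insensitive
                        String lower = line.toLowerCase();
                        for (int i = lower.indexOf("error"); i >= 0; i = lower.indexOf("error", i + 1)) {
                            errorCount++;
                        }
                    }
                }
                System.out.println("Unexpected Essbase error found: " + (flagged ? "yes" : "no"));
                System.out.println("\"error\" occurs " + errorCount + " times");
            }
        }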
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Help with Script created to check log files.

    Hi,
    I have a program we use in our organization on multiple workstations that connects to an MS SQL 2005 database on a virtual Microsoft 2008 R2 server. The program is quite old and was designed in the days when serial connections were the most efficient means of connecting to a device. If for any reason the network, the virtual server, or the SAN the virtual server runs on reaches roughly 25% utilization or higher, the program on the workstations times out from the SQL database and drops its connection completely, rendering it useless. The program does not have the smarts to resync itself to the SQL database; it just sits there with "connection failed" until human interaction. A simple restart of the program reconnects it to the SQL database without any issues. This is fine when staff are onsite, but the program also runs out of hours when the site is unmanned.
    The utilization of the server environment is more than sufficient; if anything it has double the recommended resources for the program. I am in regular contact with the support for the program and it is a known issue for them, which I believe they have no desire to fix in the near future.
    I wish to create a simple script that checks the log files on each workstation or server the program runs on and emails me if a specific word shows up in that log file. The word only appears when a connection failure has occurred.
    After the email is sent, I want the script to close the program and reopen it to resync the connection.
    I will schedule the script to run every 15 minutes.
    I posted this about a month ago, but I went on holidays over Christmas and the post died from my lack of response.
    Below is what I have so far. I have only completed the monitoring of the log file and the email portion. I had some help from a guy on this forum to get the script to where it is now. I know basic to intermediate scripting, so excuse my crudity, if any.
    The program is called "wasteman2G" and the log file is located at \\servername\WasteMan2G\Config\DCS\DCS_IN\alert.txt.
    I would like to get the email side of this script working first and then move on to restarting the program.
    At the moment I am not receiving an error from the script; it runs but doesn't do what it should.
    Could someone please help?
    Const strMailto = "[email protected]"
    Const strMailFrom = "[email protected]"
    Const strSMTPServer = "mrc1tpv002.XXXX.local"
    Const FileToRead = "\\Mrctpv005\WasteMan2G\Config\DCS\DCS_IN\alert.txt"

    arrTextToScanFor = Array("SVR2006","SVR2008")

    Set WshShell = WScript.CreateObject("WScript.Shell")
    Set objFSO = WScript.CreateObject("Scripting.FileSystemObject")
    Set oFile = objFSO.GetFile(FileToRead)

    ' The RDScripts registry keys do not exist on the first run; RegRead would
    ' then raise an error and abort the script, so trap it and fall back to a
    ' full scan from line 0.
    On Error Resume Next
    dLastCreateDate = CDate(WshShell.RegRead("HKLM\Software\RDScripts\CheckTXTFile\CreateDate"))
    If Err.Number = 0 And oFile.DateCreated = dLastCreateDate Then
        intStartAtLine = CInt(WshShell.RegRead("HKLM\Software\RDScripts\CheckTXTFile\LastLineChecked"))
    Else
        intStartAtLine = 0
    End If
    Err.Clear
    On Error Goto 0

    ' Scan the file from the last checked line, collecting matching lines
    i = 0
    Set objTextFile = oFile.OpenAsTextStream()
    Do While Not objTextFile.AtEndOfStream
        If i < intStartAtLine Then
            objTextFile.SkipLine
        Else
            strNextLine = objTextFile.ReadLine()
            For Each strItem In arrTextToScanFor
                If InStr(LCase(strNextLine), LCase(strItem)) Then
                    strResults = strNextLine & vbCrLf & strResults
                End If
            Next
        End If
        i = i + 1
    Loop
    objTextFile.Close

    ' Remember how far we got for the next run
    WshShell.RegWrite "HKLM\Software\RDScripts\CheckTXTFile\FileChecked", FileToRead, "REG_SZ"
    WshShell.RegWrite "HKLM\Software\RDScripts\CheckTXTFile\CreateDate", oFile.DateCreated, "REG_SZ"
    WshShell.RegWrite "HKLM\Software\RDScripts\CheckTXTFile\LastLineChecked", i, "REG_DWORD"
    WshShell.RegWrite "HKLM\Software\RDScripts\CheckTXTFile\LastScanned", Now, "REG_SZ"

    If strResults <> "" Then
        SendCDOMail strMailFrom, strMailto, "VPN Logfile scan alert", strResults, "", "", strSMTPServer
    End If

    Function SendCDOMail(strFrom, strSendTo, strSubject, strMessage, strUser, strPassword, strSMTP)
        With CreateObject("CDO.Message")
            .Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/sendusing") = 2
            .Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/smtpserver") = strSMTP
            ' smtpauthenticate = 1 requests basic authentication; with empty
            ' user/password the send may fail silently - use 0 if the SMTP
            ' server accepts anonymous relay
            .Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/smtpauthenticate") = 1
            .Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/sendusername") = strUser
            .Configuration.Fields.Item("http://schemas.microsoft.com/cdo/configuration/sendpassword") = strPassword
            .Configuration.Fields.Update
            .From = strFrom
            .To = strSendTo
            .Subject = strSubject
            .TextBody = strMessage
            On Error Resume Next
            .Send
            If Err.Number <> 0 Then
                WScript.Echo "SendMail Failed: " & Err.Description
            End If
        End With
    End Function

    Thank you for that link, it did help quite a bit. What I wanted was to move it to cscript so I could run the wscript.echo on the command line. It all took too long and I found a way to complete it via batch. I do have a problem with my script though, and you might be able to help.
    What I am doing is searching the log file, finding the specific words, then outputting them to an email. I haven't used bmail before, so that's probably my problem, but I'm using bmail to send me the results.
    Then I'm clearing the log file so the next day it is empty, so that when I search it every 15 minutes it's clean, and only when an error occurs will it email me.
    Could you help me send the output via email using bmail or blat?
    @echo off
    echo Wasteman Logfile checker
    echo Created by: Reece Vellios
    echo Date: 08/01/2014
    findstr "SRV2006 & SRV2008" \\Mrctpv005\WasteMan2G\Config\DCS\DCS_IN\Alert.Txt > c:\log4mail.txt
    if %errorlevel%==0 C:\Documents and Settings\rvellios\Desktop\DCS Checker\bmail.exe -s mrc1tpv002.xxx.local -t [email protected] -f [email protected] -h -a "Process Dump" -m c:\log4mail.txt -c
    for %%G in (\\Mrctpv005\WasteMan2G\Config\DCS\DCS_IN\Alert.Txt) do (copy /Y nul "%%G")
    This the working script without bmail
    @echo off
    echo Wasteman Logfile checker
    echo Created by: Reece Vellios
    echo Date: 08/01/2014
    findstr "SRV2006 & SRV2008" \\Mrctpv005\WasteMan2G\Config\DCS\DCS_IN\Alert.Txt > C:\log4mail.txt
    if %errorlevel%==0 (echo Connection error)
    for %%G in (\\Mrctpv005\WasteMan2G\Config\DCS\DCS_IN\Alert.Txt) do (copy /Y nul "%%G")
    I need to make this happen:
    If findstr finds a match (%errorlevel%==0), output c:\log4mail.txt via SMTP email using bmail.

  • Display string from a log file on a Dashboard?

    Hello everyone,
    I need a little guidance on how to solve this problem in SCOM 2012 R2.
    I need SCOM to do this:
    Read a log file on a specific application server (say its hostname is server21)
    Look for the text: Total number of users: 15
    Display the text “Users on Server21 = 15” on a SCOM dashboard (or something to that effect)
    Sounds simple enough? Well, I am stumped :-(
    I have read and followed tutorials on the internet telling me how to create rules and alerts. However, I don't want alerts; I want this data simply shown on a dashboard.
    How can I do that?
    PS: I hear that I may have to create my own management pack with only server21 in it. Is that right?
    -Rajeev rajdude.com

    Thanks for the tip. The PowerShell grid widget is pretty powerful. However, it will take me quite some time and effort to write a PS script which can extract the exact info I need from that application's log file. I was hoping SCOM's own log parsing capabilities would do it.
    By the way, yes, we can create an MP and have only one server in it. We can use the (free) MPAuthor to make an MP with only one server in it. Here is a video showing how to do it:
    http://www.silect.com/static/mpauthor/MP_Author_Creating_a_New_Single_Server_Application_MP.mp4
    I tried doing what I want using MPAuthor, but it again boiled down to me writing a PS script which can extract the exact info I need from that application's log file.
    -Rajeev rajdude.com

  • Java.util.logging: write to one log file from many application (classes)

    I have a menu app used to launch many applications, all running in the same JVM, and I want to add logging to them using java.util.logging.
    The intention is to redirect the logging info to a specific file within the menu app, with all logging from all applications written to the same file. Finally, if needed (but I don't think it is), I will include code to write logging to a specific file per app (class). The latter is probably not necessary, because there are tools to analyse the log files that allow filtering on specific classes.
    The applications are in their own packages/jars and contain the following logging code:
        // Redirect error output
        try {
            myHandler = new FileHandler("myLogging.xml", 1000000, 2);
        } catch (IOException e) {
            System.out.println("Could not create file. Using the console handler");
        }
        myLogger.addHandler(myHandler);
        myLogger.info("Our first logging message");
        myLogger.severe("Something terrible happened");
    When I launch the menu application, it writes info to "myLogging.xml.0", but when I launch an application, the app writes info to "myLogging.xml.0.1".
    I already tried leaving out the creation of a new FileHandler (the try/catch block in the code above) but it doesn't help.
    Is it possible to write log info to the same specific file?

    You would have to open/close it somehow at every write from different processes.
    But personally, I would prefer different file names over your forced merging.
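    Since everything here runs in one JVM, though, a single FileHandler installed on a shared parent logger avoids the problem: each FileHandler locks its file, which is why the second handler fell back to the unique name "myLogging.xml.0.1". A minimal sketch (the "menuapp" logger namespace is an assumption; the file pattern is the one from the post):

        import java.io.IOException;
        import java.util.logging.FileHandler;
        import java.util.logging.Logger;

        public class SharedLogging {
            // One parent logger owns the single FileHandler for every app.
            private static final Logger PARENT = Logger.getLogger("menuapp");

            public static void init() throws IOException {
                PARENT.addHandler(new FileHandler("myLogging.xml", 1000000, 2));
            }

            public static Logger getLogger(String appName) {
                // Child loggers ("menuapp.app1", ...) publish through the
                // parent's handler, so all apps share one file.
                return Logger.getLogger("menuapp." + appName);
            }

            public static void main(String[] args) throws IOException {
                init();
                getLogger("app1").info("Our first logging message");     // -> myLogging.xml.0
                getLogger("app2").severe("Something terrible happened"); // same file
            }
        }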

  • How can I search for details in job log files

    Hi,
    I'm looking for a specific entry in the job log. I don't know when it was written (other than the date), so to find the log without trial and error I need a specific time to open the correct one in IDC.
    The entry was written by the modifyADSuser pass and it would have a userID tag in the log file, but there are many hundreds a day for me to hunt through. If I could find where Identity Center pulls the log files from, I could use either a SQL select (if it's held in the database) or a text search (if it's held in a folder) to zero in on the correct log file. Does anyone know where the information shown in the IDC job logs is stored?
    Thanks,
    Pete

    Thanks for the response. I checked MC_LOGS and that looks to be the same detail that is available in the management console, basically the rows displayed in the job log. Do you know the table relationships beyond MC_LOGS: what's the table name for the data (even if encrypted) that details each pass, etc.?
    Thanks,
    Pete

  • Rule created to monitor a single line entries in a text.log file does not work

    Hi All,
    I have this strange issue. I created a script which generates a .log file, and I have configured a rule to monitor it. Whenever the .log is altered, the alert does not come at all in SCOM 2012 R2.
    I want this alert to be raised when one specific line in the file is altered from LISTENING to NOT LISTENING.
    I have configured it; it triggered an alert the first time and then never triggered again.
    I created this rule disabled and overrode the value to true only for the MS acting as the watcher for this log file.
    The log file is generated on the local drive of the MS itself.
    I changed the log watcher to a different server and also pointed the application data source at a network location when the watcher was changed, so it could pull the log accordingly.
    I tried using both the local path where the log is located and the equivalent network location; neither helped.
    C:\Port_checker is the directory where the .log file is located; there is only one log file present.
    I also changed parameters such as "Contains" and "Wildcard matches", but nothing worked.
    The SCOM Action account has Full permissions on all servers over the entire forest itself.
    Target used to create this rule is "Windows server operating system"
    Can anyone help me, please?
    Gautam.75801

    Since you have a script that updates a file line from "LISTENING" to "NOT LISTENING", you might want to try configuring a two-state script unit monitor rather than a rule. Your script just needs to check the content of the log file, say every 5 minutes, generate an alert when it matches "NOT LISTENING", and clear the alert when it changes back to "LISTENING".
    http://www.systemcentercentral.com/wp-content/uploads/2009/04/HOW-TO_2-state_ScriptMonitor.pdf
    Cheers,
    Martin
    Blog:
    http://sustaslog.wordpress.com 
    Note: Posts are provided “AS IS” without warranty of any kind, either expressed or implied, including but not limited to the implied warranties of merchantability and/or fitness for a particular purpose.

  • Collecting Log Files

    When providing feedback around reported issues, we may request that you access and share certain log files from your computer so we can better understand the operation of the Creative Cloud app on your system.
    Below is information about various log files, their locations, and when they are created.
    Note: (Windows) If you are unable to locate a specific folder, it may be necessary to turn on the option to show hidden folders:
    http://helpx.adobe.com/x-productkb/global/show-hidden-files-folders-extensions.html
    Note: (Mac) The user's Library folder is hidden starting with 10.7; see http://helpx.adobe.com/x-productkb/global/access-hidden-user-library-files.html
    ~ on Mac denotes the logged-in user's directory. The Library folder under it is hidden. The easiest way to access it is to use the "Go to Folder" option from the "Go" menu of Finder and type the path exactly as written, starting from the ~ symbol.
    Please always include these logs - PDApp.log and AMT3.log from this location:
        Win7/8 : %TEMP%
        Mac : ~/Library/Logs/
    For issues with installation - include these log files:
        Win: <drive>:\Program Files\Common Files\Adobe\Installers
        Mac: /Library/Logs/Adobe/Installers
    For issues with updates
        Win7/8: <%localappdata%>\Adobe\AAMUpdater\1.0
        Mac: ~/Library/Application Support/Adobe/AAMUpdater/1.0
    Please zip this complete 1.0 folder to share
    For issues with activation or licensing
    Please zip the entire SLCache folder and all files in it into a single zip, and attach it to the bug:
        Windows 32 bit: <drive>:\Program Files\Common Files\Adobe
        Windows 64 bit: <drive>:\Program Files (x86)\Common Files\Adobe
        Mac: ~/Library/Application Support/Adobe
    For issues with Adobe Application Manager and downloading
        Windows: %temp%[AdobeDownloads]\
        Mac: ~/Library/Logs/Adobe/AdobeDownloads/
    For issues related to file sync and Typekit
    Mac:
    The log file can be found here:
    <Mac Hard Drive>/Users/<username>/Library/Application Support/Adobe/CloudSync/CoreSync-YYYY-MM-DD.log
    (where YYYY-MM-DD indicate the date of the last log)
    Please see the note at the top about accessing the user Library folder.
    Windows:
    The log file can be found here:
    C:\Users\<username>\AppData\Roaming\CloudSync\CoreSync-YYYY-MM-DD.log
    (where YYYY-MM-DD indicate the date of the last log)

    I am able to run the commands individually as the oracle user, but when I run the "raccheck" script it's not working.
    oracle and grid user IDs:
    oracle@hublpr1:/admintmp/ORACLE/RAC-Check$ id oracle
    uid=501(oracle) gid=501(oinstall) groups=501(oinstall),502(dba),504(asmdba),506(oper)
    oracle@hublpr1:/admintmp/ORACLE/RAC-Check$
    oracle@hublpr1:/admintmp/ORACLE/RAC-Check$ id grid
    uid=502(grid) gid=501(oinstall) groups=501(oinstall),503(asmadmin),504(asmdba),505(asmoper)
    oracle@hublpr1:/admintmp/ORACLE/RAC-Check$
    I am able to run crsctl or any other GI command as the oracle user, but the "raccheck" script is not working.
    oracle@hublpr1:/admintmp/ORACLE/RAC-Check$ echo $CRS_HOME
    /app/grid/product/11.2.0.3
    oracle@hublpr1:/admintmp/ORACLE/RAC-Check$ $CRS_HOME/bin/crsctl query crs activeversion
    Oracle Clusterware active version on the cluster is [11.2.0.3.0]
    oracle@hublpr1:/admintmp/ORACLE/RAC-Check$ $CRS_HOME/bin/crsctl query crs softwareversion
    Oracle Clusterware version on node [hublpr1] is [11.2.0.3.0]
    oracle@hublpr1:/admintmp/ORACLE/RAC-Check$ $CRS_HOME/bin/crsctl check crs
    CRS-4638: Oracle High Availability Services is online
    CRS-4537: Cluster Ready Services is online
    CRS-4529: Cluster Synchronization Services is online
    CRS-4533: Event Manager is online
    oracle@hublpr1:/admintmp/ORACLE/RAC-Check$ cat /proc/cpuinfo
    processor       : 0
    vendor_id       : GenuineIntel
    cpu family      : 6
    model           : 44
    ...................................................

  • How to configure log files in SQL 2012?

    I'm installing a SharePoint solution using Microsoft SQL 2012, and I have limited knowledge of installing SQL. I simply run the wizard and create the DB for SharePoint.
    Are there any links/materials available demonstrating how to correctly configure the log files, step by step, on a specific partition during the SQL installation for SharePoint 2013?
    thanks, 

    Hello,
    SQL Server database log files benefit from RAID 1 and RAID 10 configurations. RAID 10 is recommended for both data files and log files.
    To control the growth of transaction log files, please back them up regularly. The following article explains in detail why:
    http://technet.microsoft.com/en-us/library/ms175495.aspx
    Monitor the growth of the log files using Performance Monitor and the SQLServer:Databases -> Log Growths counter. Adjust the size of the log files until Log Growths is constantly zero.
    Hope this helps.
    Regards,
    Alberto Morillo
    SQLCoffee.com

  • Moving the log file of a publisher database SQL Server 2008

    There are many threads on this here, most of them not at all helpful and some of them wrong; thus a fresh post.
    This post regards SQL Server 2008 (10.0.5841).
    The PUBLISHER database's primary log file, which is currently 3 blocks and not extendable, must be moved as the LUN is going away.
    The database has several TB of data and a large number of push transactional replications, as well as a couple of bi-directional replications.
    While the primary log file is active, it is almost never (if ever) used, due to its small fixed size.
    We are in the 20,000 TPS range at peak (according to perfmon). This is a non-trivial installation.
    This means that:
    - backup/restore is not even a remotely viable option (it never is in the real world);
    - downtime minimization is critical, measured in minutes or less;
    - dismantling and recreating the replications is doable, but I have to say I have zero trust in the script writer to generate accurate scripts. Many of these replications were originally set up in older versions of SQL Server and have come along for the ride as upgrades have occurred. I consider scripting everything and dismantling the whole lot pretty high risk. In any case, I do not want to have to reinitialize any replications, as this takes, effectively, an eternity.
    Possible solution:
    The only option I can think of is to wind down everything, such that there are zero outstanding uncommitted transactions, then detach the database, delete the offending log file, and reattach using the CREATE DATABASE xyz ... FOR ATTACH_REBUILD_LOG option.
    This should, if I have understood things correctly, cause SQL Server to recreate the default log file in the same directory as the .mdf file. I am not sure what will happen to the secondary log file, which is not moving anywhere at this point.
    The hard bit is ensuring that every transaction in the active log files has been replicated before shutdown. This is probably doable, though I do not know how to manually flush any leftover transactions to replication. I expect that if I shut down all "real" activity and wait for a certain amount of time, eventually all the replications will show "No replicated transactions are available", and then I would be good to go.
    Hillary, if you happen to be there, comments appreciated.

    Hi Philip,
    you should try the long-ago-suggested way of stopping replication, restoring the db, and renaming or detach/attach:
    http://social.msdn.microsoft.com/Forums/sqlserver/en-US/6731803b-3efa-4820-a303-4ffb7edf154a/detaching-a-replicated-database?forum=sqlreplication
    Thanks
    Saurabh Sinha
    http://saurabhsinhainblogs.blogspot.in/
    Please click the Mark as answer button and vote as helpful
    if this reply solves your problem
    I do not wish to be rude, but which part of the OP didn't you understand?
    Specifically the bit about 20,000 transactions a second and a database size of several TB. Do you have any concept whatsoever of what this means? I will answer for you: "no, you are clueless", as your answer clearly shows.
    Stop wasting bandwidth by proposing pointless and wrong solutions which indicate that you did not read the OP, or do you just do this to generate points?
    Also, you clearly failed to notice that I was on the thread to which you referred, and I had some pointed comments to make. This thread was an attempt to garner some input for an alternative proposal.

Maybe you are looking for

  • PC Suite: Image Store problem.

    Hey there, I'm using: Nokia PC Suite 7.0.9.2 Windows Vista SP1 Bluetooth Connection Nokia E71 The problem I'm having is that each time I go to sync my photos, I get the error: "Error communicating with the phone". I've tried using the cable connectio

  • Acrobat 9 Pro - Kerning Issue

    I just loaded Acrobat 9 Pro on several machines with an XP operating system. If you use the Text Edit function it will create numerous spaces between letters (Kerning) throught the document. The changes appear to be random and makes the document unus

  • Will I lose existing music when downloading itunes 11?

    If I download Itunes 11.1 from the apple website, will I lose all my existing music?

  • Netflix logon screen security issue

    I have scoured the net for an answer to this. I have 3 PC's with WMC/7. I use an HD Homerun Prime. Yesterday WMC would not play Netflix, reporting; The security certificate presented by this website is not secure. Security certificate problems may in

  • Installing Camera Raw update for PSE 9.0.3 fails

    I have been trying to update either manually or with the update function the camera raw plugin for Photoshop Elements 9 to the newest version. It keeps saying that the update has failed. I know I don't have the newest version because I can't open my