Log files/troubleshooting performance data collection

Hello,
I'm trying to use MAP 9.0.
When doing performance data collection, I'm getting errors. Is there a log file or event log that captures why the errors are occurring?
One posting said to look in bin\log, but there seems to be no log directory under BIN in this version.
Thank you, 
Mustafa Hamid, System Center Consultant

Hi Mark,
There's no CLEANER_ADJUST_UTILIZATION in EnvironmentConfig for BDB JE 5.0.43, which I'm currently using. I also tried
   envConfig.setConfigParam("je.cleaner.adjustUtilization",
          "false");
but it fails to start up with the error below:
Caused by: java.lang.IllegalArgumentException: je.cleaner.adjustUtilization is not a valid BDBJE environment parameter
    at com.sleepycat.je.dbi.DbConfigManager.setConfigParam(DbConfigManager.java:412) ~[je-5.0.43.jar:5.0.43]
    at com.sleepycat.je.EnvironmentConfig.setConfigParam(EnvironmentConfig.java:3153) ~[je-5.0.43.jar:5.0.43]
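For anyone hitting the same wall, a defensive sketch of my own (not from this thread): attempt the setting and catch the IllegalArgumentException shown in the stack trace, so a JE version that doesn't recognize the parameter still starts with its defaults.
    import com.sleepycat.je.EnvironmentConfig;
    public class JeConfigProbe {
        public static void main(String[] args) {
            EnvironmentConfig envConfig = new EnvironmentConfig();
            try {
                // Rejected with IllegalArgumentException on JE 5.0.43, per the trace above.
                envConfig.setConfigParam("je.cleaner.adjustUtilization", "false");
                System.out.println("Parameter accepted by this JE version.");
            } catch (IllegalArgumentException e) {
                // DbConfigManager.setConfigParam throws this for unknown parameter names.
                System.out.println("Parameter not supported here; using defaults: " + e.getMessage());
            }
        }
    }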

Similar Messages

  • Datalogging with options to retrieve subset of log file based on date/time

    I would like to thank this forum for useful advice so far in completing my LabVIEW software.
    I have a data logging challenge. I am supposed to log about 30 parameters every 5 seconds. Some of these parameters are digital (ON/OFF), some are speed values (rpm), and others are percentages (%). It should be possible in the future to do a histogram or bar chart plot of some of the parameters for a specific period (say, the last 5 minutes of a certain day): in effect, to extract a segment of the total log file.
    My challenge is this: if I use a text file, like the one in the attached VI, can it provide the functionality of retrieving data (while the VI is running) from the log file based on a certain date/time range, on demand?
    The format in the text file is close to what I require, since it lists the time in one column and the other parameters in other columns to enable future histogram generation.
    Thanks a lot, friends.
    Attachments:
    writer.vi ‏19 KB
    time.txt ‏1 KB

    Hey maxidivine,
    I've been playing around with your code and found that performing the search you require could be quite demanding on system resources when scaled to the size of your application. I shall try to find a way to perform the search using .txt files, but there are other options available. I recommend the use of TDMS files, as the file format is a very efficient, manageable method of data logging. The TDMS file format is designed to write and read measured data at very high speed while maintaining a hierarchical system of descriptive information.
    Traditionally, TDMS was a National Instruments-only file format: you could only read it using our products (LabVIEW/CVI/DIAdem). However, thanks to the popularity of the format, a bolt-on is now available for Excel, which allows you to open .tdms files directly in Excel (see link).
    National Instruments Technical Data Management Overview
    http://zone.ni.com/devzone/cda/tut/p/id/3676
    Introduction to LabVIEW TDM Streaming Vis
    http://zone.ni.com/devzone/cda/tut/p/id/3539
    VI-Based API for Writing TDMS Files
    http://zone.ni.com/devzone/cda/tut/p/id/6471
    TDM Excel Add-In Tool for Microsoft Excel User Guide
    http://zone.ni.com/devzone/cda/tut/p/id/4906
    TDM Excel Add-In for Microsoft Excel Download
    http://zone.ni.com/devzone/cda/epd/p/id/2944
    Troubleshooting the TDM Excel Add-In for Microsoft Excel 2000-2003
    http://zone.ni.com/devzone/cda/tut/p/id/5874
    Examples of the use of the TDMS API ship with LabVIEW. You will find them in Help > Find Examples > Fundamentals > File Input and Output. For your application, I would recommend the "Cont Acq&Graph Voltage - Write Data to File (TDMS).vi".
    Furthermore, if you require some help with DIAdem, I would recommend clicking "Getting Started" on the DIAdem splash screen. This opens a manual which discusses everything from data analysis to report generation. Also, if you have DIAdem 11 or above, there are tutorial videos which install with DIAdem. These are useful little tutorials covering all the DIAdem fundamentals. You can access them by selecting a particular palette tab (e.g. Report, View, Analysis, etc.) and then clicking the tutorial button (shown as a film strip with a question mark) at the top of the group view.
    Here are some more helpful DIAdem related resources for future reference.
    Report Gen in DIAdem...
    http://zone.ni.com/devzone/cda/tut/p/id/7379
    DataPlugins: Supported Data Formats (ni.com/dataplugins)
    http://zone.ni.com/devzone/cda/tut/p/id/4065
    Hope this is helpful
    Philip
    Applications Engineer
    National Instruments
    UK Branch
    ===If this fixes your problem, mark as solution!===

  • Performance data collection issue

    Hi,
    We are using SCOM 2007 R2. We have some servers that are not collecting performance data. These servers are up and running fine and generating alerts (monitoring is working fine). Can anyone please suggest a workaround for this?
    Thanks & Regards,
    Padmaja M.

    Try to clear the management server health service cache by stopping the System Center Management service, renaming the Health Service State folder, and starting the service again.
    Juke Chou
    TechNet Community Support

  • Log NC based on data collection

    Is it possible to trigger the logging of an NC based on a data collection value being outside the acceptable range?
    For example, the acceptable range for the data collection is a number less than 6; if the user enters 7, I would like to log an NC that says the data collection is out of range.

    To summarize:
    What I'm taking away from this is that it is the best practice to have only one parameter per DC group if you intend to trigger the automatic logging of an NC when that group "fails." The one parameter in the DC group MUST have a min/max value assigned and a fail is triggered when the operator enters a value outside of that range.  The NC is logged using the value assigned to the LOGNC_ID_ON_GROUP_FAILURE parameter in activity maintenance.
    If there are multiple parameters in the DC group, they all have to have a min/max value assigned and ALL of the responses have to be out of range in order to fail the SFC.
    I cannot have a DC group that contains parameters of multiple types and expect an NC to be logged based on an incorrect answer (for one question or multiple.)
    I cannot expect an NC to be logged based on an incorrect answer of one question, if the rest of the questions in the DC group are answered "correctly."
    Sound correct?

  • Problem with the log file path of the VeriStand Data Logging Control

    Hello everyone,
    My problem is that I am using a computer as a gateway on the network. The gateway is connected to the PXI for real-time acquisition. I have another computer connected to the gateway to read the data from the PXI. I cannot record to my local hard drive using the VeriStand Data Logging Control on that second computer; however, it can record to a hard drive located on the network. Also, I have no problem recording when the computer is the gateway itself.
    Regards,
    Kamal Bouamran

    Apologies for Google translate...
    Am I correct in assuming that you have a Logging Control connected to a remote gateway running on another computer and you want to access the log file on your local computer?

  • How to perform Data Collection on single SFC with QTY = 1 with material lot size 1?

    Dear experts,
    We are working with SFC qty > 1 on a relaxed routing. At a given operation we want to collect data on a single quantity; i.e., the SFC qty at the operation where the collection will happen will be 1. The corresponding material lot size is, for example, 10. The operator must be able to collect data on the SFC with qty = 1 multiple times until the quantities are consumed. He must also be able to collect other values on the remaining quantities at the same operation, with the same DC group or other DC groups. How many times the data must be collected depends on the shop order build quantity. The data may be collected several times, but not more often than the build qty. In other words, some specific data will be collected on part of the quantity of a product, while other data will be collected against the remaining quantity. The data collection must also be done in a serialized manner.
    Here's what we have set up so far:
    1) 3 DC groups, each DC group has 3 data fields.
    2) Each data field has the following restrictions:  Required Data Entries = 0 and Optional Data Entries = 1
    3) All DC groups are attached on the same combination of operation\material\routing
    4) we are using relaxed routing
    Process description:
    The operator must be able to collect any data field on a single product. To do that, he will enter the operation where the data collections are attached, enter the SFC with qty = 1, and then run the data collection after selecting the appropriate DC group and entering the needed information. The operator will then complete the SFC with qty = 1.
    The operator will pick the next product, select the same SFC, enter qty = 1, and collect another value against this product.
    Problem is:
    Once the first collection is done on a given SFC with entered qty = 1, the system does not allow the operator to do further collections on the same SFC, whether with qty = 1 or any other quantity. He cannot select any DC group from the DC group list. We also tried the table selection menu on the DC group list, but nothing can be selected.
    So we tried to play around with the DC group definitions as follows:
    A) We set Required Data Entries = 0 and Optional Data Entries = 10. Still, the operator was not able to select any DC group after collecting data the first time. We tried reopening the POD and the list again, but we get the same blocking behavior.
    B) We set Required Data Entries = 10 and Optional Data Entries = 1. The operator was able to select the DC group after collecting data the first time, BUT the operator must enter the data fields 10 times against one SFC quantity, which is not what we want. Besides, again, he cannot collect other information on the remaining quantities at the same operation.
    C) There is an option to serialize the SFC before it reaches the operation where the collection happens, then merge after completion. Automation would be needed here, hence customization. We are strongly avoiding customization now, since we expect data collection to work well on single quantities even when the main SFC has qty > 1.
    Questions:
    1) Are we missing any kind of further configuration\setup?
    2) Or does the current system design not allow collecting data on single quantities of an SFC whose main quantity is greater than 1?
    3) Looking at this link, Approaches to Collection of Data - SAP Manufacturing Execution (SAP ME) - SAP Library, nothing is mentioned about the same SFC number with multiple quantities.
    We are using SAP ME 15.0.3.0.
    Thanks in advance for your help
    Ali

    Ali
    to collect data for the same SFC multiple times, your system rule "Allow Multiple Data Collection" needs to be set to true for the site.
    Stuart

  • Search log file line entry (date and string)

    I use PowerShell 4 and I need to search a text file for a line that contains both a date like '2014-02-12' AND a text string like 'execution successful'; the line contains other things as well, but those are the important parts. I am a bit lost: I managed to parse it with select-string and split commands, but the code comes out quite ugly and long. Can I just use a regex, something like this:
     select-string -Pattern '(2014-02-12)(execution successful)' -Path $myfile
    I want a single-line output, and then I'll make an IF statement that checks whether it exists; if it does not, I create an email alert.

    Hi,
    Here's one possibility:
    $strFound = $false
    $dateStr = '2014-02-12'
    $searchStr = 'execution successful'
    Get-Content .\inputFile.txt | ForEach {
        if ($_.Contains($dateStr) -and $_.Contains($searchStr)) { Write-Host "Found - $_" ; $strFound = $true }
    }
    # Check after the loop has seen every line; Send-MailMessage still needs -To, -From, -Subject and -SmtpServer arguments.
    If ($strFound -eq $false) { Send-MailMessage }
    Don't retire TechNet! -
    (Don't give up yet - 12,575+ strong and growing)

  • At least one of the input binary log files contain fewer than two data samples.

    Hi Guys,
    I encountered this error when I stopped my defined collection set and tried to see the related report in Performance Monitor. Searching on the Internet suggests that a fast start/stop may end without enough counter data being collected. However, that does not seem to be the case in my scenario: I set the sample interval to 15 seconds (the default) and ran the collection set for more than 1.5 days.
    By the way, I monitor a Windows Server 2008 SP1 server from a remote Windows 7 SP1 machine; both of them are 64-bit. Only general performance counters are added, such as CPU- and disk-related ones. I am looking forward to your help. Thanks in advance.
    P.S. I forgot to say, I set up the same data collector on the local computer (the Windows 7 box), and this does not happen there. So I believe I must have some configuration issue with remote monitoring.
    Please mark replies as helpful or answers if they are helpful; doing so can help others encountering a similar issue.

    Hi,
    It works locally but not remotely. I monitor Windows Server 2008 R2 SP1 (Enterprise, 64-bit) remotely from Windows 7 SP1 (Ultimate, 64-bit). I start PerfMon using a domain account which is a local administrator on both boxes. I get a .cab file
    for each data collection set, sized from 1 KB to 3 KB. The sample interval is the default 15 seconds, and I believe the data collection set was started for more than 15 seconds or even hours. Here are the steps to reproduce this issue:
    On Win7 box, open performance monitor (PerfMon)
    Connect to the remote WinSrv box
    Create a data collection set, and add the following counters using the default 15-second sample interval
     Processor(*)\% Privileged Time
     Processor(*)\% Processor Time 
    Save and Start data collection set
    After more than 15 seconds or even hours, I get the above error message.
    Thanks in advance.
    Please mark replies as helpful or answers if they are helpful; doing so can help others encountering a similar issue.

  • May [date]IMAPMailboxSyncEngine.log files be deleted without losing emails?

    Problem description:
    [date]IMAPMailboxSyncServices.log creates large logs almost every day. May the log files with older dates be deleted without affecting the email files?
    EtreCheck version: 2.2 (132)
    Report generated 4/26/15, 8:50 PM
    Download EtreCheck from http://etresoft.com/etrecheck
    Click the [Click for support] links for help with non-Apple products.
    Click the [Click for details] links for more information about that line.
    Click the [Click to remove] links for help removing adware.
    Hardware Information: ℹ️
        MacBook Pro (15-inch, Early 2011) (Technical Specifications)
        MacBook Pro - model: MacBookPro8,2
        1 2.2 GHz Intel Core i7 CPU: 4-core
        8 GB RAM Upgradeable
            BANK 0/DIMM0
                4 GB DDR3 1333 MHz ok
            BANK 1/DIMM0
                4 GB DDR3 1333 MHz ok
        Bluetooth: Old - Handoff/Airdrop2 not supported
        Wireless:  en1: 802.11 a/b/g/n
        Battery: Health = Normal - Cycle count = 448 - SN = D86130303H6DGDLAG
    Video Information: ℹ️
        Intel HD Graphics 3000 - VRAM: 512 MB
        AMD Radeon HD 6750M - VRAM: 1024 MB
            Color LCD 1680 x 1050
    System Software: ℹ️
        OS X 10.10.2 (14C1514) - Time since boot: 1:48:27
    Disk Information: ℹ️
        M4-CT512M4SSD2 disk0 : (512.11 GB)
            EFI (disk0s1) <not mounted> : 210 MB
            Recovery HD (disk0s3) <not mounted>  [Recovery]: 650 MB
            SSD 512 (disk1) /  [Startup]: 510.88 GB (10.86 GB free) (Low!)
                Core Storage: disk0s2 511.25 GB Online
        MATSHITADVD-R   UJ-898 
    USB Information: ℹ️
        SanDisk ImageMate 8 in 1
        Apple Inc. BRCM2070 Hub
            Apple Inc. Bluetooth USB Host Controller
        Apple Inc. Apple Internal Keyboard / Trackpad
        Apple Inc. FaceTime HD Camera (Built-in)
        Apple Computer, Inc. IR Receiver
    Thunderbolt Information: ℹ️
        Apple Inc. thunderbolt_bus
    Gatekeeper: ℹ️
        Anywhere
    Kernel Extensions: ℹ️
            /Applications/Parallels Access.app
        [loaded]    com.parallels.virtualsound (1.0.36 36 - SDK 10.6) [Click for support]
            /Applications/Parallels Desktop.app
        [not loaded]    com.parallels.kext.hypervisor (10.2.0 28956 - SDK 10.7) [Click for support]
        [not loaded]    com.parallels.kext.netbridge (10.2.0 28956 - SDK 10.7) [Click for support]
        [not loaded]    com.parallels.kext.usbconnect (10.2.0 28956 - SDK 10.7) [Click for support]
        [not loaded]    com.parallels.kext.vnic (10.2.0 28956 - SDK 10.7) [Click for support]
            /Library/Extensions
        [loaded]    com.sophos.kext.sav (9.2.50 - SDK 10.8) [Click for support]
        [loaded]    com.sophos.nke.swi (9.2.50 - SDK 10.8) [Click for support]
            /System/Library/Extensions
        [not loaded]    com.Belcarra.iokit.USBLAN_netpart (2.0.2) [Click for support]
        [not loaded]    com.Belcarra.iokit.USBLAN_usbpart (2.0.2) [Click for support]
        [not loaded]    com.RemoteControl.USBLAN.usbpart (2.0.6) [Click for support]
        [not loaded]    com.logmein.driver.LogMeInSoundDriver (1.0.0) [Click for support]
        [not loaded]    com.rogueamoeba.InstantOn (6.0.3 - SDK 10.6) [Click for support]
        [not loaded]    com.wdc.driver.1394HP (1.0.9) [Click for support]
        [not loaded]    com.wdc.driver.USBHP (1.0.11) [Click for support]
            /System/Library/Extensions/Belcarra.USBLAN_netpart.kext/Contents/PlugIns
        [not loaded]    com.belcarra.iokit.netpart.panther (1.6.1) [Click for support]
            /System/Library/Extensions/Belcarra.USBLAN_usbpart.kext/Contents/PlugIns
        [not loaded]    com.belcarra.iokit.usbpart.panther (1.6.1) [Click for support]
            /System/Library/Extensions/InstantOn.kext/Contents/PlugIns
        [not loaded]    com.rogueamoeba.InstantOnCore (6.0.3 - SDK 10.6) [Click for support]
            /System/Library/Extensions/RemoteControl.USBLAN_usbpart.kext/Contents/PlugIns
        [not loaded]    com.RemoteControl.USBLAN.panther (1.6.1) [Click for support]
    Launch Agents: ℹ️
        [not loaded]    com.adobe.AAM.Updater-1.0.plist [Click for support]
        [loaded]    com.adobe.CS5ServiceManager.plist [Click for support]
        [running]    com.epson.epw.agent.plist [Click for support]
        [failed]    com.epson.eventmanager.agent.plist [Click for support] [Click for details]
        [loaded]    com.google.keystone.agent.plist [Click for support]
        [running]    com.kodak.BonjourAgent.plist [Click for support]
        [running]    com.logmein.logmeingui.plist [Click for support]
        [not loaded]    com.logmein.logmeinguiagent.plist [Click for support]
        [not loaded]    com.logmein.logmeinguiagentatlogin.plist [Click for support]
        [failed]    com.opendns.osx.DNSCryptMenuBar.plist [Click for support] [Click for details]
        [loaded]    com.oracle.java.Java-Updater.plist [Click for support]
        [running]    com.parallels.mobile.prl_deskctl_agent.launchagent.plist [Click for support]
        [running]    com.sophos.uiserver.plist [Click for support]
        [running]    com.trusteer.rapport.rapportd.plist [Click for support]
        [loaded]    org.macosforge.xquartz.startx.plist [Click for support]
    Launch Daemons: ℹ️
        [loaded]    com.adobe.fpsaud.plist [Click for support]
        [not loaded]    com.adobe.SwitchBoard.plist [Click for support]
        [loaded]    com.google.keystone.daemon.plist [Click for support]
        [not loaded]    com.logmein.logmeinserver.plist [Click for support]
        [loaded]    com.microsoft.office.licensing.helper.plist [Click for support]
        [running]    com.opendns.osx.DNSCryptConfigUpdater.plist [Click for support]
        [loaded]    com.oracle.java.Helper-Tool.plist [Click for support]
        [loaded]    com.oracle.java.JavaUpdateHelper.plist [Click for support]
        [running]    com.parallels.mobile.dispatcher.launchdaemon.plist [Click for support]
        [loaded]    com.parallels.mobile.kextloader.launchdaemon.plist [Click for support]
        [running]    com.sophos.common.servicemanager.plist [Click for support]
        [running]    com.trusteer.rooks.rooksd.plist [Click for support]
        [loaded]    org.cindori.AuthHelper.plist [Click for support]
        [loaded]    org.macosforge.xquartz.privileged_startx.plist [Click for support]
    User Launch Agents: ℹ️
        [loaded]    com.adobe.AAM.Updater-1.0.plist [Click for support]
        [loaded]    com.adobe.ARM.[...].plist [Click for support]
        [failed]    com.citrixonline.GoToMeeting.G2MUpdate.plist [Click for support] [Click for details]
        [running]    com.google.Chrome.framework.plist [Click for support]
        [loaded]    com.kodak.KODAK AiO Firmware Updater.plist [Click for support]
        [loaded]    com.kodak.KODAK AiO Software Updater.plist [Click for support]
        [running]    com.microsoft.LaunchAgent.SyncServicesAgent.plist [Click for support]
        [running]    com.nds.pcshow.plist [Click for support]
        [loaded]    com.nds.pcshow.uninstall.plist [Click for support]
        [running]    com.parallels.mobile.startgui.launchagent.plist [Click for support]
        [loaded]    uk.co.markallan.clamxav.clamscan.plist [Click for support]
        [loaded]    uk.co.markallan.clamxav.freshclam.plist [Click for support]
    User Login Items: ℹ️
        Garmin Lifetime Map Updater    Application  (/Applications/Garmin Lifetime Map Updater.app)
        iTunesHelper    Application Hidden (/Applications/iTunes.app/Contents/MacOS/iTunesHelper.app)
        Dropbox    Application  (/Applications/Dropbox.app)
        ClipMenu    Application  (/Applications/ClipMenu.app)
        ClamXav Sentry    UNKNOWN  (missing value)
        TuneupMyMac    UNKNOWN  (missing value)
        ClamXav    Application  (/Applications/ClamXav.app)
        ClamXav Sentry    Application  (/Applications/ClamXav.app/Contents/Resources/ClamXav Sentry.app)
    Internet Plug-ins: ℹ️
        JavaAppletPlugin: Version: Java 8 Update 31 Check version
        LogitechDeviceDetection: Version: 1.0.0.76 - SDK 10.7 [Click for support]
        LogMeInSafari64: Version: 1.0.730 [Click for support]
        o1dbrowserplugin: Version: 5.41.0.0 - SDK 10.8 [Click for support]
        Default Browser: Version: 600 - SDK 10.10
        Flip4Mac WMV Plugin: Version: 3.2.0.16   - SDK 10.8 [Click for support]
        AdobePDFViewerNPAPI: Version: 10.1.13 [Click for support]
        FlashPlayer-10.6: Version: 17.0.0.169 - SDK 10.6 [Click for support]
        LogMeIn: Version: 1.0.730 [Click for support]
        Silverlight: Version: 5.1.30514.0 - SDK 10.6 [Click for support]
        Flash Player: Version: 17.0.0.169 - SDK 10.6 [Click for support]
        LogMeInSafari32: Version: 1.0.730 [Click for support]
        googletalkbrowserplugin: Version: 5.41.0.0 - SDK 10.8 [Click for support]
        QuickTime Plugin: Version: 7.7.3
        AdobePDFViewer: Version: 10.1.13 [Click for support]
        CANONiMAGEGATEWAYDL: Version: 3.0.0.2 [Click for support]
        CouponPrinter-FireFox_v2: Version: Version 1.1.7 - SDK 10.5 [Click for support]
        SharePointBrowserPlugin: Version: Unknown
        DirectorShockwave: Version: 12.1.2r152 - SDK 10.6 [Click for support]
    User internet Plug-ins: ℹ️
        CitrixOnlineWebDeploymentPlugin: Version: 1.0.105 [Click for support]
        WebEx64: Version: 1.0 - SDK 10.5 [Click for support]
        Google Earth Web Plug-in: Version: 7.1 [Click for support]
    Safari Extensions: ℹ️
        Open in Internet Explorer
        Searchme  Adware! [Click to remove]
        Slick Savings  Adware! [Click to remove]
        Amazon Shopping Assistant  Adware! [Click to remove]
        Ebay Shopping Assistant  Adware! [Click to remove]
    3rd Party Preference Panes: ℹ️
        DNSCrypt  [Click for support]
        Flash Player  [Click for support]
        Flip4Mac WMV  [Click for support]
        Growl  [Click for support]
        Java  [Click for support]
        MacFUSE  [Click for support]
        Trusteer Endpoint Protection  [Click for support]
    Time Machine: ℹ️
        Mobile backups: OFF
        Auto backup: NO - Auto backup turned off
        Volumes being backed up:
        Destinations:
            Alvin Lundgren's Time Capsu [Network]
            Total size: 0 B
            Total number of backups: 0
            Oldest backup: -
            Last backup: -
            Size of backup disk: Excellent
                Backup size 0 B > (Disk size 0 B X 3)
    Top Processes by CPU: ℹ️
            13%    WindowServer
             3%    iPhoto
             2%    Google Chrome Helper(40)
             2%    com.apple.WebKit.Plugin.64
             2%    Mail
    Top Processes by Memory: ℹ️
        1.92 GB    Google Chrome Helper(40)
        737 MB    kernel_task
        295 MB    clamd
        270 MB    Disk Inventory X
        262 MB    com.apple.WebKit.WebContent(5)
    Virtual Memory Information: ℹ️
        55 MB    Free RAM
        7.94 GB    Used RAM
        40 MB    Swap Used
    Diagnostics Information: ℹ️
        Apr 26, 2015, 07:13:08 PM    /Library/Logs/DiagnosticReports/Finder_2015-04-26-191308_[redacted].cpu_resource.diag [Click for details]
        Apr 26, 2015, 07:02:09 PM    /Users/[redacted]/Library/Logs/DiagnosticReports/DNSCrypt-Menubar_2015-04-26-190209_[redacted].crash
        Apr 26, 2015, 07:00:45 PM    Self test - passed
        Apr 25, 2015, 06:58:38 AM    /Library/Logs/DiagnosticReports/LegacyFileVaultMessageTracer_2015-04-25-065838_[redacted].crash

    When you have a question, it's best just to ask, without posting reams of irrelevant information that no one asked for, especially if it comes from "etrecheck."
    From the Mail menu bar, select
              Window ▹ Connection Doctor
    In the window that opens, uncheck the box marked
              Log Connection Activity

  • Bad date recorded by AccessServer in Audit Log File

    Hi all,
    I have installed OAM and configured the audit log file for the Access Server:
    Access System Configuration >> Access Server Configuration >> set "Audit to File" to ON
    The log is recorded OK, but when comparing the date written in the log file with the OS date, there is a 6-hour difference.
    LOG FILE
    01/28/2009 *00:18:07* -0500 - AUTHZ_SUCCESS - GET - AccessServer - 192.168.3.105 - sec.biosnettcs.com/access/oblix/lang/en-us/msgctlg.js - cn=orcladmin,cn=Users,dc=biosnettcs,dc=com - 00:18:07 - http - AccessGate - - 2
    OS date
    # date
    mar ene 27 *18:18:15 CST* 2009
    # date -u
    mié ene 28 *00:18:23 UTC* 2009
    As we can see in these lines, the audit log records the date in UTC, but I need it in the timezone set in the OS.
    How can I do this (print the date in the audit log file with the same timezone set by the OS)?
    Thanks in advance,
    Julio

    Answering my own question:
    There is no way to set the date/time format to anything other than UTC for the OAM component logs.
    See note 742777.1 for in-depth information.
    Julio.
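    Since the format can't be changed at the source, one workaround (my own sketch, not something from note 742777.1) is to convert the UTC timestamps into the OS timezone when reading the log. It assumes Java 8+ (java.time) and the MM/dd/yyyy HH:mm:ss Z prefix shown in the sample line above; the zone name is a guess at the poster's CST.
    import java.time.OffsetDateTime;
    import java.time.ZoneId;
    import java.time.format.DateTimeFormatter;
    public class AuditTimeConverter {
        // Matches the timestamp prefix of the audit line quoted above.
        private static final DateTimeFormatter LOG_FORMAT =
                DateTimeFormatter.ofPattern("MM/dd/yyyy HH:mm:ss Z");
        public static void main(String[] args) {
            String stamp = "01/28/2009 00:18:07 -0500"; // taken from the sample line
            OffsetDateTime parsed = OffsetDateTime.parse(stamp, LOG_FORMAT);
            // Example zone (an assumption); substitute the zone your OS is set to.
            System.out.println(parsed.atZoneSameInstant(ZoneId.of("America/Mexico_City")));
            // Or simply use whatever the JVM reports as the default zone.
            System.out.println(parsed.atZoneSameInstant(ZoneId.systemDefault()));
        }
    }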

  • Data Services Designer 14 - Large Log files

    Hello,
    we're running several jobs with the Data Services Designer 14, all works fine.
    But today a problem occurred:
    after finishing a big job, the Designer on one client produced a very large (8 GB) log file in the Data Services Designer folder.
    Is it possible to delete these log files automatically, or to restrict the maximum size of the log files created by the Designer?
    What's the best way?
    Thanks!

    You can set the log files to be deleted automatically based on a number of days.
    I have done this in XI 3.2, but as per the documentation, this is how it can be done in DS 14.0.
    In DS 14.0, this is handled in CMC.
    1. Log into the Central Management Console (CMC) as a user with administrative rights to the Data Services application.
    2. Go to the "Applications" management area of the CMC. The "Applications" dialog box appears.
    3. Right-click the Data Services application and select Settings. The "Settings" dialog box appears.
    4. In the Job Server Log Retention Period box, enter the number of days that you want to retain the following:
    • Historical batch job error, trace, and monitor logs
    • Current service provider trace and error logs
    • Current and historical Access Server logs
    The software deletes all log files beyond this period. For example:
    • If you enter 1, then the software displays the logs for today only. After 12:00 AM, these logs clear and the software begins saving logs for the next day.
    • If you enter 0, then no logs are maintained.
    • If you enter -1, then no logs are deleted.
    Regards,
    Suneer.

  • How to configure log files in SQL 2012?

    I'm installing a SharePoint solution using Microsoft SQL 2012, and I have limited knowledge of installing SQL. I simply ran the wizard and created the DB for SharePoint.
    Are there any links/materials available demonstrating how to correctly configure the log files, step by step, on a specific partition during the SQL installation for SharePoint 2013?
    thanks, 

    Hello,
    SQL Server database log files benefit from RAID 1 and RAID 10 configurations. RAID 10 is recommended for both data files and log files.
    To control the growth of transaction log files, please back them up regularly. The following article explains in detail why:
    http://technet.microsoft.com/en-us/library/ms175495.aspx
    Monitor the growth of the log files using Performance Monitor and the SQLServer:Databases --> Log Growths counter. Adjust the size of the log files until Log Growths stays constantly at zero.
    Hope this helps.
    Regards,
    Alberto Morillo
    SQLCoffee.com
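    If you'd rather poll that counter programmatically than watch Performance Monitor, here is a sketch of my own (not from the article): it assumes the Microsoft JDBC driver on the classpath, a placeholder connection string, and permission to read the sys.dm_os_performance_counters DMV.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;
    public class LogGrowthCheck {
        public static void main(String[] args) throws Exception {
            // Placeholder connection string; adjust server, database and credentials.
            String url = "jdbc:sqlserver://localhost;databaseName=master;user=sa;password=***";
            // Same counter the reply recommends: SQLServer:Databases -> Log Growths.
            String sql = "SELECT instance_name, cntr_value"
                       + " FROM sys.dm_os_performance_counters"
                       + " WHERE counter_name = 'Log Growths' AND cntr_value > 0";
            try (Connection con = DriverManager.getConnection(url);
                 Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery(sql)) {
                while (rs.next()) {
                    // Any nonzero value means that database's log had to grow at runtime;
                    // pre-size the log until this stays at zero, as advised above.
                    System.out.println(rs.getString("instance_name").trim()
                            + ": " + rs.getLong("cntr_value") + " growth(s)");
                }
            }
        }
    }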

  • How to manually write a log file when transforming XSLT by using Transformer?

    I want to ask the experts whether there is any way to write information (such as the template name or the number of templates used) into a log file while performing a transformation with javax.xml.transform.Transformer.
    Below is my sample code
    import javax.xml.transform.Result;
    import javax.xml.transform.Source;
    import javax.xml.transform.Transformer;
    // declare and assign value of transform, source, and result
    transformer.transform(source, result);
    Thanks in advance
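    The reply below turned out to be about a different product, so here is a sketch of my own for the question as asked: wrap the transform() call with java.util.logging writes and attach an ErrorListener so processor warnings and errors land in the same file. Plain JAXP does not expose per-template tracing; that needs a processor-specific hook (e.g. Xalan's trace listener), so this only captures what the Transformer itself reports. File names are examples.
    import java.util.logging.FileHandler;
    import java.util.logging.Logger;
    import java.util.logging.SimpleFormatter;
    import javax.xml.transform.ErrorListener;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerException;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;
    public class LoggingTransform {
        public static void main(String[] args) throws Exception {
            // Write log records to transform.log in the working directory.
            Logger log = Logger.getLogger("xslt");
            FileHandler handler = new FileHandler("transform.log");
            handler.setFormatter(new SimpleFormatter());
            log.addHandler(handler);
            Transformer transformer = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource("stylesheet.xsl"));
            // Route processor warnings/errors into the same log file.
            transformer.setErrorListener(new ErrorListener() {
                public void warning(TransformerException e) { log.warning(e.getMessageAndLocation()); }
                public void error(TransformerException e) { log.severe(e.getMessageAndLocation()); }
                public void fatalError(TransformerException e) throws TransformerException {
                    log.severe(e.getMessageAndLocation());
                    throw e;
                }
            });
            log.info("transform started");
            transformer.transform(new StreamSource("input.xml"), new StreamResult("output.xml"));
            log.info("transform finished");
        }
    }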

    I think it will be from FDM; if I remember correctly, FDM generates a text file in the background and then loads the data into Essbase using that text file.
    The codes on each line are standard Essbase-generated codes which relate to an operation, e.g. 1013162 = Received Command [Calculate] from user [%s] using [%s].
    If you search the web you will be able to find a full list of the codes in numerous locations.
    Cheers
    John
    http://john-goodwin.blogspot.com/

  • Data Collection

    Hi Experts,
    I am facing an issue with Data Collection errors in a decentralized environment.
    Earlier, Data Collection (Standard Collection) was working normally, but recently I have been facing the issue below.
    If I launch Standard Collection, it errors out. The log file message (in the Data Pull program) is: Staging tables' data is not collected. Please submit the request Planning ODS Load.
    To resolve the above, I tried the two methods below.
    1. First launch Planning Data Collection - Purge Staging Tables, then launch Standard Collection; but this also fails, with an error in the program "APS Collections Miscellaneous Task". Error message: FDPSTP failed due to ORA-06512: at "APPS.MSC_UTIL", line 1192.
    2. Launch Planning Data Collection - Purge Staging Tables, then Data Pull (manual launch), then Planning ODS Load (manual launch). Same error message.
    In both cases I am getting a negative result.
    So, what setup is missing? Does this need a profile option? How can I solve this?
    I'd appreciate an early answer.
    Regards,
    Ramesh

    Hi
    The package is valid (I used the suggested query: select object_name, object_type, status from all_objects where object_name = 'MSC_UTIL').
    The data collection issue is still not resolved. Below is what I have tried.
    Try 1:
    1. I launched "Planning Data Collection - Purge Staging Tables" with Validation = No (later I also tried Validation = Yes), then
    2. I launched Standard Collection (as standard way)
    Result:
    Planning ODS Load errors out; the log file shows the error below:
    Exceptions posted by this request
    Concurrent Request for "Planning ODS Load" has completed with Error.
    Try 2:
    1. I launched "Planning Data Collection - Purge Staging Tables" with Validation = No (later I also tried Validation = Yes), then
    2. I launched "Planning Data Pull" manually (Completed Normal) then
    3. I launched "Planning ODS Load" manually
    Result:
    Planning ODS Load errors out because of the following.
    I am getting the error message below in the "Generate Trading Partner Keys" concurrent program log file.
    Starts 27-FEB-2011 16:20:16
    ORACLE error 6512 in FDPSTP
    Cause: FDPSTP failed due to ORA-06512: at "APPS.MSC_CL_SETUP_ODS_LOAD", line 5094
    ORA-06512: at "APPS.MSC_CL_COLLECTION", line 6862
    ORA-06512: at line 1
    I'd appreciate an early reply.
    Regards,
    Ramesh

  • How to create an ERP Forms trace (FRD) log file - R11.5.X

    Product: AOL
    Date written: 2005-04-27
    How to create an ERP Forms trace (FRD) log file - R11.5.X
    ==============================================
    PURPOSE
    How to create an ERP Forms trace (FRD) log file - R11.5.X
    Explanation
    FRD logging is a way to trace errors at the form level.
    The log file shows the data input/output at the form level.
    This note explains how to create the log file on R11.5.X versions.
    1. For the ERP US (English) version
    http://serverhostname:port/dev60cgi/f60cgi?&config=DBSID&record=collect&log=<writable directory>/form_trc.txt&lang=US
    If you enter the arguments as above when first connecting to ERP, everything is recorded in the form_trc.txt file.
    To keep the file size down, go straight to reproducing the problem if possible. Watch the form trace file on the server with tail -f; once you reach the point where the problem occurs, stop working in the ERP screen and examine the form_trc.txt file on the server.
    An example with the actual parameters applied is shown below.
    Change the server host name, port, etc. to match your environment and enter them exactly when connecting to ERP.
    Example:
    http://ERP.oracle.co.kr:8000/dev60cgi/f60cgi?&config=PROD&record=collect&log=/u01/app/oraERP/temp/form_trc.txt&lang=US
    ===================================================================================================
    2. For the ERP NLS (Korean) version
    => This method can be used when a form error occurs only in the Korean environment and the FRD log file must be created in Korean mode.
    http://{hostname:port}/dev60cgi/f60cgi?config=PROD&record=collect&log=/u01/app/oraERP/temp/form_trc.txt&lang=KO&env=NLS_LANG='korean_korea.{characterset}'
    (Check the currently configured character set and enter it.)
    For example, if the character set is KOREAN_KOREA.KO16KSC5601, it would look like this:
    http://ERP.oracle.co.kr:8000/dev60cgi/f60cgi?&config=PROD&record=collect&log=/u01/app/oraERP/temp/form_trc.txt&lang=KO&env=NLS_LANG='KOREAN_KOREA.KO16KSC5601'
    Everything else is the same as the US version; just take care to enter the "lang" and "env" parameters correctly.

    Hi Hamish,
    If you managed to solve this issue, please indicate how, because I'm facing exactly the same...
    Thanks & Regards,
    Philippe
