Streaming log file analyzer?

Hi there,
We host several FLV files with Akamai and recently enabled
the log file delivery service, so we now have Akamai-generated log
files in W3C format. I assumed I could use WebTrends to
analyze these, but after looking at them briefly, they show
different events like play, stop, seek, etc., and I don't know
whether WebTrends can process all of that.
Our most basic requirement is to see how many times each
video was viewed. If we could also get more detailed analysis, such as
video X being viewed for 2:00 on average while video Y only gets
viewed for 20 seconds, that would be great as well.
Does anyone have any suggestions for the best software to
analyze these files?
Thanks,
Matt
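
For the basic view-count requirement, the W3C format is plain enough that a small one-off program can give you numbers before committing to a product. A minimal Java sketch: the column positions and the "play" event name below are placeholders for illustration, so check them against the #Fields header line of your actual Akamai logs.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.HashMap;
    import java.util.Map;

    public class ViewCounter {
      public static void main(String[] args) throws IOException {
        final int URI_COL = 4;   // hypothetical position of the stream/file name
        final int EVENT_COL = 7; // hypothetical position of play/stop/seek
        Map<String, Integer> plays = new HashMap<String, Integer>();
        BufferedReader in = new BufferedReader(new FileReader(args[0]));
        for (String line; (line = in.readLine()) != null; ) {
          if (line.startsWith("#")) continue;          // skip W3C directive lines
          String[] f = line.split("\\s+");
          if (f.length <= Math.max(URI_COL, EVENT_COL)) continue;
          if ("play".equalsIgnoreCase(f[EVENT_COL])) { // count each play as a view
            Integer n = plays.get(f[URI_COL]);
            plays.put(f[URI_COL], n == null ? 1 : n + 1);
          }
        }
        in.close();
        for (Map.Entry<String, Integer> e : plays.entrySet())
          System.out.println(e.getKey() + "\t" + e.getValue());
      }
    }

Average view duration works the same way if the logs carry a per-session duration or transfer-time field: sum it per URI and divide by the play count.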

Immediate AI:
0. Check the log file autogrowth settings too: confirm they are practically sensible and that the disk still has free space.
1. If the disk where you keep the log file is full, add a second log file on another disk (via the database properties page) where you plan to keep log files, in case you can't afford to take the database down. Once that is done, you can truncate data out of the log file and remove the extra file if this was a first-time issue. If it happens now and then, review capacity.
2. You can consider shrinking the log files when no backup is running and no maintenance job (rebuild/reorganize indexes, update statistics) is executing, as those will block it. If the database is small, copying files from production to DR is not latency-prone, and the shrink is not happening, then you can try changing the recovery model, shrinking, and reconfiguring log shipping after reverting the recovery model.
3. Also check whether someone mistakenly placed some old files on the disk and forgot to remove them, which can cause disk-full issues.
4. For a permanent solution, monitor the environment for capacity and allocate adequate space for the log file disks. Also consider tuning the log backup frequency from the default to suit your environment.
Santosh Singh

Similar Messages

  • Error in analyzer log file (/sapdb/data/wrk/ACP/analyzer/DBAN.err)

    Hello All,
    I am getting the following error message in the analyzer log file (/sapdb/data/wrk/ACP/analyzer/DBAN.err).
    The details are as follows:
    =====================================================
    2006-07-24 08:55:59
    ERROR 5: Cannot execute SQL statement.
    [MySQL MaxDB][LIBSQLOD SO][MaxDB] General error;-4008 POS(1) Unknown user name/password combination
    SELECT YEAR(NOW()),MONTH(NOW()),DAY(NOW()),HOUR(NOW()),MINUTE(NOW()),SECOND(NOW()) FROM DUAL
    2006-07-26 12:15:39
    ERROR 20: Database Analyzer not active in directory "/sapdb/data/wrk/ACP/analyzer".
    2006-08-03 12:33:08
    ERROR 5: Cannot execute SQL statement.
    [MySQL MaxDB][LIBSQLOD SO][MaxDB] Communication link failure;-709 CONNECT: (database not running: no request pipe)
    SELECT YEAR(NOW()),MONTH(NOW()),DAY(NOW()),HOUR(NOW()),MINUTE(NOW()),SECOND(NOW()) FROM DUAL
    =====================================================
    Can you please tell me what this means for my database?
    The main problem I am facing is that I am not able to start my SAP application. When I issue startsap from the <SID>adm login, I get an error message saying it is not able to connect to the database, although the database is already up and running.
    Please help me!
    Regards,
    Premkishan chourasia

    Hi,
    well, the error -4008 denotes that the user/password combination used by the DB Analyzer for accessing the DB is incorrect. The DB Analyzer tries to issue SQL commands as the SYSDBA user.
    Do you know the user/password combination of your SYSDBA user?
    Regards,
    Roland
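
    If you want to verify the combination outside the DB Analyzer, a quick JDBC test running the very statement the analyzer failed on will confirm whether -4008 is really a credential problem. A minimal sketch, assuming the MaxDB JDBC driver is on the classpath (the driver class name and URL format below are from memory of the MaxDB driver, so double-check them against your installation):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class SysdbaCheck {
      public static void main(String[] args) throws Exception {
        Class.forName("com.sap.dbtech.jdbc.DriverSapDB");
        // host, database name, SYSDBA user and password from the command line
        Connection con = DriverManager.getConnection(
            "jdbc:sapdb://" + args[0] + "/" + args[1], args[2], args[3]);
        Statement st = con.createStatement();
        // the exact statement the DB Analyzer failed on
        ResultSet rs = st.executeQuery(
            "SELECT YEAR(NOW()),MONTH(NOW()),DAY(NOW()),HOUR(NOW())," +
            "MINUTE(NOW()),SECOND(NOW()) FROM DUAL");
        if (rs.next())
          System.out.println("Connected fine: " + rs.getInt(1) + "-"
              + rs.getInt(2) + "-" + rs.getInt(3));
        con.close();
      }
    }

    If this fails with the same -4008, the stored credentials are wrong, and resetting the SYSDBA password (or re-entering it where the DB Analyzer is configured) is the fix.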

  • Inconsistencies between Analyzer Server Console and stdout.log file

    Hi,
    In the stdout.log file of the application server there is a record: "Setting Current User Count To: 2 Users. Maximum Concurrent Licensed User Count Is: 10 Users." So 2 licences are used, but checking the Analyzer Server Console there is only one user connected.
    After restarting the computer, the Analyzer Server Console and stdout.log user counts are synchronized. I don't know why, but after some time these two figures are not synchronized anymore.
    My problem: I have to report the number of user licences in use, and I am reading the info from stdout.log. But something is not correct; it looks like stdout.log doesn't show correct values?
    Do I need to specify some setting, or is there a bug in the code?
    My system:
    Hyperion Analytic Server version 7.1.0
    Hyperion Analyzer Server version 7.0.1.8.01830
    IBM DB2 Workgroup Edition version 8 fixpack 9
    Tomcat version 4
    Windows 2003 Server

    Hi grofaty,
    We use 7.0.0.0.01472 and I have experienced the same behaviour:
    the Analyzer Server Console shows one more session than stdout.log.
    If this difference of 1 is a static value, then you can treat it as a systematic bug and base your licence counting on it.
    But again, the Analyzer Server Console is not as good as it should be for productive usage, because all the information is only logged online until the next application restart. E.g. it is not helpful for user-tracking purposes. Do you use stdout.log in such a way, or have an idea how to grep measures for session logging analysis:
    - Session ID
    - User ID
    - Client ID
    - Total Number of Requests
    - Average Response (sec)
    - Login Time
    - Number of concurrent sessions
    ?
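
    If you do keep reading the count from stdout.log, extracting the two numbers is a one-regex job. A minimal Java sketch matching the message format quoted above (only that quoted wording is assumed; adjust the pattern if your version logs it differently):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class LicenseCount {
      public static void main(String[] args) throws Exception {
        Pattern p = Pattern.compile(
            "Setting Current User Count To: (\\d+) Users\\. " +
            "Maximum Concurrent Licensed User Count Is: (\\d+) Users\\.");
        int current = -1, max = -1;
        BufferedReader in = new BufferedReader(new FileReader(args[0]));
        for (String line; (line = in.readLine()) != null; ) {
          Matcher m = p.matcher(line);
          if (m.find()) {                 // keep the last occurrence in the file
            current = Integer.parseInt(m.group(1));
            max = Integer.parseInt(m.group(2));
          }
        }
        in.close();
        System.out.println("current=" + current + ", licensed max=" + max);
      }
    }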

  • Streaming XML to a log file

    Hello, I am investigating streaming XML into a log file for my system, so that I can tag various elements to allow efficient reporting on errors, user events, login states, etc.
    So far it has been unclear to me whether I can do this. I do not wish to build up a DOM or hold any information in memory, as I want it written to the file straight away. Also, is it possible to read the file at the same time as it is being written?
    Can anyone give me some guidance on whether it is possible to stream XML in this way from a singleton object in my application on the app server? I am looking at either JDOM or JAXB; is this wise?
    Any help would be invaluable...
    cheers
    Richard
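
    Streaming without a DOM is exactly what the StAX API does (javax.xml.stream, standard since Java 6; the Woodstox library provided it earlier): events are written straight to the underlying stream, and a flush() after each event puts them on disk immediately. A minimal sketch (element names made up for illustration):

    import java.io.FileWriter;
    import javax.xml.stream.XMLOutputFactory;
    import javax.xml.stream.XMLStreamWriter;

    public class XmlEventLog {
      public static void main(String[] args) throws Exception {
        XMLStreamWriter w = XMLOutputFactory.newInstance()
            .createXMLStreamWriter(new FileWriter("events.xml"));
        w.writeStartDocument();
        w.writeStartElement("log");      // root element stays open while logging
        for (int i = 0; i < 3; i++) {
          w.writeStartElement("event");
          w.writeAttribute("type", "login");
          w.writeCharacters("user" + i);
          w.writeEndElement();
          w.flush();                     // on disk immediately, nothing held in memory
        }
        w.writeEndElement();
        w.writeEndDocument();
        w.close();
      }
    }

    One caveat on reading while writing: until the root element is closed the file is not a well-formed document, so a concurrent reader has to parse it as a fragment or tolerate the missing end tag.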

    akpattuka, having tried your runtime parameters with plugin 1.4.2, I can tell you that they do not work here either. I have even tried setting -Djavaplugin.trace.option=ext as suggested by the online help (not with numbers). Nothing. What does work for me is the following: while running your applet, open the console window. Hit 'o' to enable logging, hit '5' to set the trace level. Hit the refresh button inside your browser. This will create a trace file and a log file.
    HTH, Markus

  • (Cisco Historical Reporting / HRC ) All available connections to database server are in use by other client machines. Please try again later and check the log file for error 5054

    Hi All,
    I am getting an error message "All available connections to database server are in use by other client machines. Please try again later and check the log file for error 5054" when trying to log into HRC (this user has reporting capabilities). I checked the log files, and this is what I found out:
    The log file stated that there were ongoing connections of HRC with the CCX (I am sure there isn't any active login to HRC).
    || When you tried to login the following error was being displayed because the maximum number of connections were reached for the server .  We can see that a total number of 5 connections have been configured . ||
    1: 6/20/2014 9:13:49 AM %CHC-LOG_SUBFAC-3-UNK:Current number of connections (5) from historical Clients/Scheduler to 'CRA_DATABASE' database exceeded the maximum number of possible connections (5).Check with your administrator about changing this limit on server (wfengine.properties), however this might impact server performance.
    || Below we can see all 5 connections being used up . ||
    2: 6/20/2014 9:13:49 AM %CHC-LOG_SUBFAC-3-UNK:[DB Connections From Clients (count=5)]|[(#1) 'username'='uccxhrc','hostname'='3SK5FS1.ucsfmedicalcenter.org']|[(#2) 'username'='uccxhrc','hostname'='PFS-HHXDGX1.ucsfmedicalcenter.org']|[(#3) 'username'='uccxhrc','hostname'='PFS-HHXDGX1.ucsfmedicalcenter.org']|[(#4) 'username'='uccxhrc','hostname'='PFS-HHXDGX1.ucsfmedicalcenter.org']|[(#5) 'username'='uccxhrc','hostname'='47BMMM1.ucsfmedicalcenter.org']
    || Once the maximum number of connection was reached it threw an error . ||
    3: 6/20/2014 9:13:49 AM %CHC-LOG_SUBFAC-3-UNK:Number of max connection to 'CRA_DATABASE' database was reached! Connection could not be established.
    4: 6/20/2014 9:13:49 AM %CHC-LOG_SUBFAC-3-UNK:Database connection to 'CRA_DATABASE' failed due to (All available connections to database server are in use by other client machines. Please try again later and check the log file for error 5054.)
    Current exact UCCX Version 9.0.2.11001-24
    Current CUCM Version 8.6.2.23900-10
    Business impact  Not Critical
    Exact error message  All available connections to database server are in use by other client machines. Please try again later and check the log file for error 5054
    What is the OS version of the PC that is running the HRC client, and is it a physical or a virtual machine?
    OS version: Windows 7 Home Premium 64-bit, and it's a physical machine.
    The Max DB Connections for Report Client Sessions is set to 5 for each server (there are two servers). The number of HR Sessions is set to 10.
    I wanted to know if there is a way to find the HRC sessions that are active now, and terminate one, several, or all of those sessions from the server end?

    We have had this "PRX5" problem with Exchange 2013 since the RTM version. We recently applied CU3, and it did not correct the problem. We have seen this problem on every Exchange 2013 we manage. They are all installations where all roles are installed on the same Windows server, and in our case, they are all Windows virtual machines using Windows 2012 Hyper-V.
    We have tried all the "this fixed it for me" solutions regarding DNS, network cards, host file entries and so forth. None of those "solutions" made any difference whatsoever. The occurrence of the temporary error PRX5 seems totally random.
    About 2 out of 20 incoming mail tests by the Microsoft Connectivity Analyzer fail with this PRX5 error.
    Most people never notice the issue because remote mail servers retry the connection later. However, telephone voice mail systems that forward voice message files to email, or other applications such as a scanner, often don't retry and simply fail. Our phone system actually disables all further attempts to send voice mail to a particular user if the PRX5 error is returned when the email is sent by the phone system.
    Is Microsoft totally oblivious to this problem?
    PRX5 is a serious issue that needs an Exchange team resolution, or at least an acknowledgement that the problem actually does exist and has negative consequences for proper mail flow.
    JSB

  • MDIS failed to generate the Log file!!!

    Hello All,
    Having an issue where MDIS is not generating the log file.
    The scenario is something like this:
    The files are getting archived and the records are not flowing into MDM.
    The Basis team says:
    2014-06-30T14:11:33.339,47083231971072,24,"[MDS=sapdpm1 Repository=REAL_ESTATE ClientSystem=MDM_REAL_ESTATE Port=Building]: Nigerian Building updates part 2 - SLKDDY.txt is empty, the file will be skipped
    But the source file had data; it was not empty (a bit strange!).
    Also, it's not generating the log to analyze.
    Regards,
    Girish

    Hi Shenoy,
    Let me explain the scenario:
    The user uploads the file through the Portal, and the records reach MDM through FTP. The issue is: when I tried to import through the Import Manager it worked, and when I manually pushed the file through FileZilla FTP it also worked.
    But when we upload the file through the Portal, the file lands in the Archive and this message is generated:
    2014-06-30T14:11:33.339,47083231971072,24,"[MDS=sapdpm1 Repository=REAL_ESTATE ClientSystem=MDM_REAL_ESTATE Port=Building]: Nigerian Building updates part 2 - SLKDDY.txt is empty, the file will be skipped
    But the file does have data.
    Regards,
    Girish

  • Reg:printing contents to log file

    Hello all, I am creating a debug log file for my program. Debug output is based on a level that is taken as input from the user. When I try to print the contents to a file, gen.out, nothing gets printed; the file is empty. Can someone please help me? There is no error thrown by the program.
    // my main program
    public class ff {
      public static void main(String[] args) {
        // debug level from the user (the original read args[1], which fails
        // when the program is started with a single argument)
        int dbgLevel = Integer.parseInt(args[0]);
        String path = "gen.out";
        debug.setLevel(dbgLevel);
        try {
          debug.setLogFile(path);
        } catch (Exception e1) {
          e1.printStackTrace();
        }
        debug.msg(0, "Printing contents to a file");
        if (debug.allowed(0)) {
          debug.msg(0, "1. printing the contents");
        }
      }
    }

    // debug class
    import java.io.FileWriter;
    import java.io.IOException;

    public class debug {
      private static int ourLevel = 0;
      private static FileWriter c_writer;
      private static final String k_nl = System.getProperty("line.separator");

      public static void msg(int reqLevel, String debugMessage) {
        if (reqLevel <= ourLevel) {
          String s = "DBG(" + reqLevel + ") " + getTS() + " " + debugMessage + k_nl;
          if (c_writer == null) {
            // no log file configured yet: fall back to stderr
            System.err.print(s);
            System.err.flush();
          } else {
            try {
              c_writer.write(s);
              c_writer.flush(); // without this flush the buffered text is never
                                // written out and gen.out stays empty
            } catch (IOException e) {
              e.printStackTrace();
            }
          }
        }
      } // msg()

      public static void setLogFile(String path) throws IOException {
        c_writer = new FileWriter(path);
      }

      public static void setLevel(int l) {
        ourLevel = l;
      }

      public static boolean allowed(int reqLevel) {
        return reqLevel <= ourLevel;
      }

      private static String getTS() {
        // simple timestamp for each entry; the original getTS() was not shown
        return new java.util.Date().toString();
      }
    }
    The file is getting created but it's empty. What am I doing wrong here?

    Thanks mkoryak, got it: I was flushing the wrong stream.
    Thanks for the help.

  • Change block size for several log-files simultaneously?

    Hi,
    I'm using SignalExpress to record and analyze data.
    Sometimes I want to analyze the recorded data both for a short period of time and for longer time.
    (Imagine creating an average of every second first and then an average of every 10 seconds)
    Then I need to change all the log files, and also the specific parts of each log file. See attachment.
    I sometimes have up to 1000 log files containing signals from 4 different modules; that makes 4000 adjustments to change from block size 10000 to block size 1000.
    Is there any way to adjust the block size of all the log files at once?
    Many thanks!
    Anders Hansson
    Engineer
    Attachments:
    NI.JPG ‏95 KB

    Hi,
    Isn't anyone else interested in a solution for this operation?
    I reported this to the NI feedback service and they advised me to post the request here to get a quicker reply.
    So...
    Best regards
    Ingenjör Hansson

  • Lots of Bonjour Service errors 100 in my windows 7 log file

    Hi there!
    I have two Windows 7 PCs, and both of them have a *lot* of error 100 entries in the log files.
    Most of them look like "Warning: Removed port mapping request 0101497C Prot 2 Int 9970 TTL 7200 duplicates existing port mapping request 0068EADC Prot 2 Int 9970 TTL 7200", where the 7200 is sometimes a 0.
    I tried to open port 5353 in my Norton Internet Security, but after downloading BonjourPSsetup.exe, running it and rebooting, the problem persists.
    I use Bonjour for ServeToMe, to stream to my iPad 1.
    Why are there so many errors, and how do I fix them and prevent them from happening again?
    Kind regards,
    Eelco

    I just google with "ORA-44003: invalid SQL name" and found below link:
    http://www.oracle-base.com/articles/10g/dbms_assert_10gR2.php
    SIMPLE_SQL_NAME
    The SIMPLE_SQL_NAME function checks that the input string conforms to the basic characteristics of a simple SQL name:
    The first character of the name is alphabetic.
    The name only contains alphanumeric characters or the characters "_", "$" and "#".
    Quoted names must be enclosed by double quotes and may contain any characters, including quotes, provided they are represented by two quotes in a row ("").
    The function ignores leading and trailing white spaces.
    The length of the input string is not validated.
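
    Those rules translate directly into a pair of patterns if you want to pre-validate names on the client side. A rough Java illustration of the rules as listed above (this mimics, and is not, Oracle's actual DBMS_ASSERT implementation):

    import java.util.regex.Pattern;

    public class SimpleSqlName {
      // unquoted: alphabetic first character, then alphanumerics, "_", "$", "#"
      private static final Pattern PLAIN =
          Pattern.compile("[A-Za-z][A-Za-z0-9_$#]*");
      // quoted: any characters, with embedded quotes doubled
      private static final Pattern QUOTED =
          Pattern.compile("\"([^\"]|\"\")*\"");

      public static boolean isSimpleSqlName(String s) {
        String t = s.trim(); // leading/trailing white space is ignored
        return PLAIN.matcher(t).matches() || QUOTED.matcher(t).matches();
      }

      public static void main(String[] args) {
        System.out.println(isSimpleSqlName("MY_TABLE$1"));           // true
        System.out.println(isSimpleSqlName("\"odd \"\"name\"\"\"")); // true
        System.out.println(isSimpleSqlName("1BAD"));                 // false
      }
    }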

  • How to identify exception stack traces in  log files programmatically

    Hello,
    I would like to develop a utility program which audits the command line or log files and detects exceptions.
    Do you know any good way of identifying exception stack traces from a stream?
    Are you aware of any existing tool that does something similar?
    Thank you in advance,
    Kostas
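
    There is no standard JDK facility for this, but Java stack traces are regular enough that a line-oriented scanner gets quite far. A minimal sketch, assuming frames follow the usual "at package.Class.method(File.java:NN)" shape (the two regexes are heuristics, not a complete grammar for every frame format the JVM can emit):

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;
    import java.util.regex.Pattern;

    public class StackTraceScanner {
      // "at com.foo.Bar.baz(Bar.java:42)" style frame lines
      private static final Pattern FRAME =
          Pattern.compile("^\\s+at\\s+[\\w$.]+\\(.*\\)\\s*$");
      // "com.foo.SomeException: message" or "Caused by: ..." header lines
      private static final Pattern HEADER =
          Pattern.compile("^(Caused by: )?[\\w$.]+(Exception|Error)(: .*)?$");

      public static void main(String[] args) throws IOException {
        BufferedReader in = new BufferedReader(new FileReader(args[0]));
        boolean inTrace = false;
        for (String line; (line = in.readLine()) != null; ) {
          if (HEADER.matcher(line).matches()) {
            inTrace = true;
            System.out.println("--- exception: " + line);
          } else if (inTrace && FRAME.matcher(line).matches()) {
            System.out.println("    " + line.trim());
          } else {
            inTrace = false; // any other line ends the current trace
          }
        }
        in.close();
      }
    }

    The same two patterns work against System.in if you pipe a live log through the program instead of passing a file name.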

    I tried to copy/paste the entire log file into the console, but it seems that copy/paste functionality is not included in the console pane.
    Regardless of that problem, is there any way to do this programmatically from my application?

  • I wonder what is included in the log files?

    My application generates 1 GB of log files per day. When I use LogMiner to
    analyze the log files, I find that 2,000 rows in the v$logmnr_contents view
    were performed by my application user, and 55,000 rows were performed by the
    Oracle sys user. Of those 55,000 rows, in 5,000 I can see SQL statements
    (redo, undo), and in 50,000 I can't see SQL statements; they are identified
    as "unsupported statement" or null.
    I wonder, if my application generates 2,000 rows, why does Oracle generate
    25 times as many? Can I tune things so that the log file is minimized?
    With best regard!
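
    Before trying to tune anything, it helps to see where the other 50,000 rows come from by grouping v$logmnr_contents by user and operation. A sketch via JDBC, assuming a LogMiner session has already been started on the same connection with DBMS_LOGMNR.ADD_LOGFILE/START_LOGMNR, since the view is only visible to the mining session (USERNAME, SEG_OWNER and OPERATION are standard columns of that view):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class RedoBreakdown {
      public static void main(String[] args) throws Exception {
        // JDBC URL, user and password supplied on the command line
        Connection con = DriverManager.getConnection(args[0], args[1], args[2]);
        Statement st = con.createStatement();
        ResultSet rs = st.executeQuery(
            "SELECT username, seg_owner, operation, COUNT(*) " +
            "FROM v$logmnr_contents " +
            "GROUP BY username, seg_owner, operation " +
            "ORDER BY COUNT(*) DESC");
        while (rs.next())
          System.out.println(rs.getString(1) + " " + rs.getString(2) + " "
              + rs.getString(3) + " " + rs.getLong(4));
        con.close();
      }
    }

    Typically much of the SYS volume is internal housekeeping (undo, index and dictionary maintenance) that LogMiner reports as unsupported; it is a byproduct of your workload rather than something to be tuned away directly.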

    Your problem may be related to the '''BLNGBAR''' shown at the end of your User Agent String. This answer from '''SafeBrowser''' may help -
    http://support.mozilla.org/en-US/questions/900875

  • SNTP log file on cRIO9024

    Hello,
    I have a question regarding the file "ts_sntp_log.txt" that is written on the internal disc of the cRIO when the device is synchronized to an NTP server. I have performed a measurement where the network latency is relevant to know. I am now trying to analyze the log file in Matlab.
    This is the only info I can find regarding the time format on the file:
    http://digital.ni.com/public.nsf/allkb/F2B057C72B537EA2862572D100646D43
    and
    http://www.faqs.org/rfcs/rfc2030.html
    The timestamps in my Tdms file are correct, so I suppose that the cRIO managed to synchronize to the NTP server during the measurement.
    This is a few lines from my log file
    2eb2c57243acbbd3: 98999630.450878, 000008199a840185, 2eb2c5724049c687, 2eb2c572404be4a6, 000008199a876693
    2eb2c5803c13647d: 98999633.822424, 0000081afc93ea5f, 2eb2c58038b10bbc, 2eb2c58038b32a62, 0000081afc976f60
    2eb2c58e347e54d9: 98999644.899869, 0000081c5ea3cdb7, 2eb2c58e3120db8b, 2eb2c58e312365e0, 0000081c5ea85bd3
    As I have interpreted the reference documents, the first number is the time on the cRIO formatted so that the first 8 digits are the number of whole seconds starting from 1900-01-01. My measurement was carried out in 2010 so this number should be
    (2011-1900)*365*24*3600 = 3.500496e+009
    I use the following Matlab lines to convert the hex-number into whole seconds
    ts = '2eb2c57243acbbd3';
    sec_from_1900 = hex2dec(['00000000',ts(1:8)])
    which turns out to be 783468043, i.e., approx 25 years.
    Am I doing something wrong, or is the log file corrupt? If it is corrupt, how can the timing info in my Tdms file still be correct?
    I would be very grateful for some help on this.
    Regards // Johan
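
    For reference, the RFC 2030 format mentioned above is a 64-bit fixed-point number: the first 32 bits are whole seconds since 1900-01-01 00:00 UTC and the last 32 bits are the binary fraction of a second. A minimal decoding sketch in Java (the 1900 epoch is the RFC's; whether the cRIO log actually uses that epoch is exactly the open question here):

    import java.time.Instant;
    import java.time.OffsetDateTime;
    import java.time.ZoneOffset;

    public class NtpStamp {
      // seconds between the NTP epoch (1900-01-01) and the Unix epoch (1970-01-01)
      private static final long NTP_TO_UNIX = 2208988800L;

      public static void main(String[] args) {
        String ts = "2eb2c57243acbbd3";          // one stamp from the log above
        long seconds = Long.parseLong(ts.substring(0, 8), 16);
        double subSec = Long.parseLong(ts.substring(8), 16) / 4294967296.0; // /2^32
        Instant t = Instant.ofEpochSecond(seconds - NTP_TO_UNIX,
                                          (long) (subSec * 1e9));
        System.out.println(seconds + " s since 1900 -> "
            + OffsetDateTime.ofInstant(t, ZoneOffset.UTC));
      }
    }

    Decoded this way the sample stamps land in the 1920s, which matches the "approx 25 years" above and means the field cannot be seconds since 1900 for a 2010 measurement. One assumption worth checking with NI: LabVIEW's own timestamps count from a 1904-01-01 epoch, and a different epoch or offset in this log would explain the discrepancy.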

    Hello again everyone,
    Is there really nobody out there who has some information on this? I still believe that the hexadecimal numbers in the file ts_sntp_log.txt are correct, since the timestamps written in my TDMS files are correct. I cannot understand the format in the log file though, and the support people at NI Sweden have not been able to help me either. I now get the year 1999.
    Does anyone know where to find the exact algorithm for converting timestamps (in seconds) in LabVIEW into yy-mm-dd HH:MM:SS.FFFFFF format?
    I attach my files and would be grateful if somebody could have a look at them. Unfortunately the upload function rejected Matlab files, so I had to change the file extension to .txt. Just change the following:
       replaceinfile.txt --> replaceinfile.m 
       import_ts_sntp_log.txt --> import_ts_sntp_log.m
    and it should work fine.
    Regards
    Johan
    Attachments:
    ts_sntp_log.txt ‏157 KB
    replaceinfile.txt ‏3 KB
    import_ts_sntp_log.txt ‏3 KB

  • Log file sync waits - COMMIT class - Pattern observed

    I am running a performance test with 1 vUser for 24 hrs. Each time the user logs in, it logs in to my application as a different customer. All customers go through the same screens and perform the same operations.
    When I analyze the % Total CPU time, % User CPU and % Wait CPU, I see a strange pattern or behavior. % User CPU is pretty much flat for the entire duration of the test. However, % Total CPU and % Wait CPU have the same pattern as described below:
    - First 3 hrs 15 mins - high % Total CPU and % Wait CPU
    - Next 1 hr 15 mins - low % Total CPU and % Wait CPU
    - Next 3 hrs 15 mins - same as first 3 hrs 15 mins.
    - Next 1 hr 15 mins - same as the earlier 1 hr 15 mins
    ... and so on.
    I generated the awrddrpt - between two awrrpt - each of 1 hr duration. One of the 1 hr duration falls within the 3 hrs 15 mins time slice, and another 1 hr falls within the 1 hr 15 min time slice.
    I see that there is
    --> 40% difference in the wait time for log file sync events - 40% less in the 1 hr duration from 1 hr 15 mins slice compared to the larger slice, even though the # of log file sync events in both the 1 hr durations is close to same.
    Please note that this is the same test from which the two durations were compared. Same environment, same load (the load profile in awrddrpt shows no difference), so why do I see a difference in Wait CPU, higher in one time slice (3 hr 15 min) than in the other (1 hr 15 min)?
    log file sync shows in Top 5 timed events in both time slices. In the 1 hr duration from longer time slice, the # of log file sync events is 13,222 and in the 1 hr duration from the shorter time slice it is 13,278.
    The log file sync event Wait Time(s) in the 1 hr duration in the longer time slice is 154.2s and that in the 1 hr duration in the shorter time slice is 10.3s.
    Why do we see this difference, when the load profile is the same in both time slices?
    user586033

    You are either bored or suffer from Compulsive Tuning Disorder.
    It can be a challenge to solve a problem that only exists between your ears.
    Post the results from the SQL below:
    SELECT sql_id,
           SUM(time_waited) / 1000000
    FROM   v$active_session_history
    WHERE  sample_time > SYSDATE - 1 / 24
           AND time_waited > 0
    GROUP  BY sql_id
    ORDER  BY 2 DESC;

  • Render log file has been written to Drive :\u(200101)

    Hi
    I am setting up an ADS server (NW2004s).
    When I test using FP_PDF_TEST_00, it returns me the version information
    (Version Information: 705.20051005114147.242570)
    But when I test FP_TEST_00, I get the error:
    The render log file has been written to <Drive>:\u(200101)
    Now on the server I went to <drive>:\usr\sap\<SID>\SYS\global\AdobeDocumentServices\renderErrorLog\errorFiles
    I checked the error file; it has the message "No Error Reported".
    If there are no errors, then why is the program FP_TEST_00 not displaying the result in PDF format?
    Thanks for your help
    Regards
    sm

    Hi
    Thanks Ramakrishna.
    Now I could get the detailed trace. It is given below; please let me know what the issue could be.
    Begin trace
    Adobe Document Services Wed Nov 29 10:33:23 CST 2006 Trace Results:
             String: rp.script.uri = sap/SAPPDFDocument.rps
             String: Trace = 4
             String: Username = ABC123
             Stream: PDFDocument
             Stream: XFD
          IIOP service is running
          Created a Data Manager for this request
          Locating stream: PDFDocument and loading into a DOM
          PdfDocument bytes = 725
          Processing PDFDocument instructions
          Parsing the request
          Checking request semantics
    Request initialization (including DOM construction and validation) processing execution time = 16 ms.
    Action: Identity
             <Identity><Application> = SAFP
             <Identity><Form> = FP_TEST_00
             <Identity><SID> = AAA
             <Identity><Client> = 110
             <Identity><IsProductive> = 0
    Action: Renderer
    Using Template Template (xdp):
    Source = URL
    Name = dest:FP_ICF_DATA_AAA//sap/bc/fp/form/layout/FP_TEST_00.XDP?fp-language=DE
    CacheInfo =
          Setting PDFDynamic to default 0 for this PDF.
             Stream = PDFOut
             Input node: <Render><Output> = pdf
             Data: XFD
             No custom config file found.
             Using standard config file: AdobeDocumentServices\lib\xfa.xci
             Config setup completed.
    Processing exception during a "Renderer" operation.
    Request start time: Wed Nov 29 10:33:23 CST 2006
    com.adobe.ProcessingError: File not found on: URL location: dest:FP_ICF_DATA_AAA//sap/bc/fp/form/layout/FP_TEST_00.XDP?fp-language=DE
    Exception Stack Trace:
    com.adobe.ProcessingError: File not found on: URL location: dest:FP_ICF_DATA_AAA//sap/bc/fp/form/layout/FP_TEST_00.XDP?fp-language=DE
         at com.adobe.ads.util.FileUtils.getPDFDataFromInput(FileUtils.java:210)
         at com.adobe.ads.request.Template.setPDFData(Template.java:222)
         at com.adobe.ads.request.Renderer.initializeTemplates(Renderer.java:468)
         at com.adobe.ads.request.Renderer.execute(Renderer.java:395)
         at com.adobe.BaseADSRequest.doWork(BaseADSRequest.java:111)
         at com.adobe.AdobeDocumentServicesWorker.processRender(AdobeDocumentServicesWorker.java:1142)
         at com.adobe.AdobeDocumentServicesWorker.execute(AdobeDocumentServicesWorker.java:598)
         at com.adobe.AdobeDocumentServicesEJB.processRequest(AdobeDocumentServicesEJB.java:130)
         at com.adobe.AdobeDocumentServicesEJB.rpData(AdobeDocumentServicesEJB.java:108)
         at com.adobe.AdobeDocumentServicesLocalLocalObjectImpl0.rpData(AdobeDocumentServicesLocalLocalObjectImpl0.java:120)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:324)
         at com.sap.engine.services.webservices.runtime.EJBImplementationContainer.invokeMethod(EJBImplementationContainer.java:126)
         at com.sap.engine.services.webservices.runtime.RuntimeProcessor.process(RuntimeProcessor.java:157)
         at com.sap.engine.services.webservices.runtime.RuntimeProcessor.process(RuntimeProcessor.java:79)
         at com.sap.engine.services.webservices.runtime.servlet.ServletDispatcherImpl.doPost(ServletDispatcherImpl.java:92)
         at SoapServlet.doPost(SoapServlet.java:51)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:760)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:853)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.runServlet(HttpHandlerImpl.java:390)
         at com.sap.engine.services.servlets_jsp.server.HttpHandlerImpl.handleRequest(HttpHandlerImpl.java:264)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:347)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.startServlet(RequestAnalizer.java:325)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.invokeWebContainer(RequestAnalizer.java:887)
         at com.sap.engine.services.httpserver.server.RequestAnalizer.handle(RequestAnalizer.java:241)
         at com.sap.engine.services.httpserver.server.Client.handle(Client.java:92)
         at com.sap.engine.services.httpserver.server.Processor.request(Processor.java:148)
         at com.sap.engine.core.service630.context.cluster.session.ApplicationSessionMessageListener.process(ApplicationSessionMessageListener.java:33)
         at com.sap.engine.core.cluster.impl6.session.MessageRunner.run(MessageRunner.java:41)
         at com.sap.engine.core.thread.impl3.ActionObject.run(ActionObject.java:37)
         at java.security.AccessController.doPrivileged(Native Method)
         at com.sap.engine.core.thread.impl3.SingleThread.execute(SingleThread.java:100)
         at com.sap.engine.core.thread.impl3.SingleThread.run(SingleThread.java:170)
    Caused by: java.io.FileNotFoundException: Destination exception while retrieving destination
         at com.adobe.ads.util.FileUtils.getURLStream(FileUtils.java:452)
         at com.adobe.ads.util.FileUtils.getPDFDataFromURL(FileUtils.java:572)
         at com.adobe.ads.util.FileUtils.getPDFDataFromInput(FileUtils.java:200)
         ... 34 more
    Caused by: com.sap.security.core.server.destinations.api.DestinationException: [destination_0001] The properties for destination FP_ICF_DATA_AAA of type HTTP could not be located.
         at com.sap.security.core.server.destinations.service.DestinationServiceImpl.getDestination(DestinationServiceImpl.java:166)
         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
         at java.lang.reflect.Method.invoke(Method.java:324)
         at com.sap.engine.services.rmi_p4.reflect.LocalInvocationHandler.invokeInternal(LocalInvocationHandler.java:77)
         at com.sap.engine.services.rmi_p4.reflect.AbstractInvocationHandler.invoke(AbstractInvocationHandler.java:53)
         at $Proxy103.getDestination(Unknown Source)
         at com.adobe.ads.util.FileUtils.getURLStream(FileUtils.java:444)
         ... 36 more
    Caused by: com.sap.exception.io.SAPIOException: <Localization failed: ResourceBundle='com.sap.exception.io.IOResourceBundle', ID='No such destination FP_ICF_DATA_AAA of type HTTP exists ', Arguments: []> : Can't find resource for bundle java.util.PropertyResourceBundle, key No such destination FP_ICF_DATA_AAA of type HTTP exists
         at com.sap.security.core.server.destinations.io.ConfigurationManagerStorage.get(ConfigurationManagerStorage.java:123)
         at com.sap.security.core.server.destinations.service.DestinationServiceImpl.getDestination(DestinationServiceImpl.java:136)
         ... 44 more
    Merging log data with log.pdf
             Merge result: Success with information
             Successfully merged data into dataless log.pdf.
    Action: Render Error Log
             Embedding file/stream: pdfDocument.xml using ID: pdf doc MimeType:  Description: Document Services Control Stream into PDF: C:\WINDOWS\Temp\adobewa_GAD_29997650\DM-1666978463314757527.dir\DM-6806111980545081106.tmp
    Instantiate PDFMM CORBA connection execution time = 0 ms.
    Error embedding request inputs in error log.:
    java.lang.NullPointerException
    Thanks,
    SM

  • Any software/program that can read audit log files

    Hi,
    Currently I am searching for a program/tool that can read audit log files and format them into a readable form. Does anyone know of any on the market, or any open-source program?
    Thank You.

    Not sure what you mean by "audit log".
    Anyway, Pete Finnigan's tools page has only one thing that might be what you're looking for: LMON, which runs on BSD, Solaris and Linux. As he's the go-to guy for Oracle security, the chances of there being a good free log analyzer tool that he hasn't heard of are slight.
    Cheers, APC
