Is there a Log Miner View in 10.2 GRID

Hello,
I am trying to locate a LogMiner tool in Grid Control 10.2. Where can I find this tool in Grid Control? Thanks.

Not yet I guess.

Similar Messages

  • Is there a way to view archive logs

    Hi, is there a way to view the contents of the archive logs, which have the ".arc" extension?

    Hi,
    This link is useful for you:
    http://www.oracle.com/technology/deploy/availability/htdocs/LogMinerOverview.htm
    Cheers
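    In case a concrete starting point helps, here is a minimal LogMiner sketch for reading one archived log. The file name is only a placeholder, and it assumes the online catalog can be used as the dictionary (i.e. the source database is open and the log comes from it):
    BEGIN
      -- Register the archived log to mine (placeholder path - substitute a real .arc file).
      DBMS_LOGMNR.ADD_LOGFILE(LOGFILENAME => '/u01/arch/1_123_456789.arc',
                              OPTIONS     => DBMS_LOGMNR.NEW);
      -- Start the LogMiner session, reading the dictionary from the online catalog.
      DBMS_LOGMNR.START_LOGMNR(OPTIONS => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);
    END;
    /
    -- The reconstructed statements are then queryable:
    SELECT scn, sql_redo, sql_undo FROM v$logmnr_contents;
    -- Close the session when finished:
    EXEC DBMS_LOGMNR.END_LOGMNR;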

  • Are there any redo log reader tools other than LogMiner?

    HI all,
    Are there any third-party redo log reader tools other than LogMiner?
    thanks
    V

    OH MY! THAT WAS RIGHT IN FRONT OF ME THE WHOLE TIME!!
    THANKS A LOT!!

  • Is there a way to view when gift cards were added so you can see what they were spent on?

    Is there a way to view when gift cards were added so you can see what they were spent on?

    No, you can't view a list of gift card redemptions; you can only view your account's purchase history. Log into your account in iTunes on your computer via the Store > View Account menu option. You should then see a Purchase History section with a 'see all' link to the right of it; click on that and you should see a list of your purchases. If you select a link in that history, it should show how each purchase was charged, e.g. mine show 'Store Credit' as I only use gift cards.

  • Is there a way to view messages received on another phone in my plan?

    Is there a way to view messages received / sent on another phone in my plan? 

    There are ways. The easiest would be to look at the phone.
    Another way would be if Verizon Messages were installed on the phone in question and you could log into the MyVerizon account for the line in question to view SMS/MMS messages to/from that phone. Of course any messages sent via iMessage would not show up in Verizon Messages as they are not SMS/MMS messages. For those, you could possibly log into the Apple ID account on another Apple device to view those, but then SMS/MMS messages would not show up there.
    This is why I prefer to disable iMessage on my iPhone: I want all my messages sent and received via the same messaging platform so that I don't have to look in multiple places to view and manage them.

  • Log miner doesn't show all transactions on a table

    I'm playing a little with LogMiner on Oracle 11gR2 on a 32-bit CentOS Linux install, but it looks like it's not showing me all DML on my test table. Am I doing something wrong?
    Hi, here's my test case:
    - Session #1, create table and insert first row:
    SQL> create table stolf.test_table (
    col1 number,
    col2 varchar(10),
    col3 varchar(10),
    col4 varchar(10));
    2 3 4 5
    Table created.
    SQL> insert into stolf.test_table (col1, col2, col3, col4) values ( 0, 20100305, 0, 0);
    1 row created.
    SQL> commit;
    SQL> select t.ora_rowscn, t.* from stolf.test_table t;
    ORA_ROWSCN COL1 COL2 COL3 COL4
    1363624 0 20100305 0 0
    - Execute shell script to insert a thousand lines into table:
    for i in `seq 1 1000`; do
    sqlplus -S stolf/<passwd><<-EOF
    insert into stolf.test_table (col1, col2, col3, col4) values ( ${i}, 20100429, ${i}, ${i} );
    commit;
    EOF
    done
    - Session #1, switch logfiles:
    SQL> alter system switch logfile;
    System altered.
    SQL> alter system switch logfile;
    System altered.
    SQL> alter system switch logfile;
    System altered.
    - Session #2, start LogMiner with CONTINUOUS_MINE on, STARTSCN = the first row's ORA_ROWSCN, ENDSCN = right now. The select on V$LOGMNR_CONTENTS should return at least a thousand rows, but it returns only three rows instead:
    BEGIN
    SYS.DBMS_LOGMNR.START_LOGMNR(STARTSCN=>1363624, ENDSCN=>timestamp_to_scn(sysdate), OPTIONS => sys.DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG + sys.DBMS_LOGMNR.COMMITTED_DATA_ONLY + SYS.DBMS_LOGMNR.CONTINUOUS_MINE);
    END;
    /
    SQL> select SCN, SQL_REDO, SQL_UNDO FROM V$LOGMNR_CONTENTS where SQL_REDO IS NOT NULL AND seg_owner = 'STOLF';
    SCN
    SQL_REDO
    SQL_UNDO
    1365941
    insert into "STOLF"."TEST_TABLE"("COL1","COL2","COL3","COL4") values ('378','20100429','378','378');
    delete from "STOLF"."TEST_TABLE" where "COL1" = '378' and "COL2" = '20100429' and "COL3" = '378' and "COL4" = '378' and ROWID = 'AAASOHAAEAAAATfAAB';
    1367335
    insert into "STOLF"."TEST_TABLE"("COL1","COL2","COL3","COL4") values ('608','20100429','608','608');
    delete from "STOLF"."TEST_TABLE" where "COL1" = '608' and "COL2" = '20100429' and "COL3" = '608' and "COL4" = '608' and ROWID = 'AAASOHAAEAAAATfAAm';
    1368832
    insert into "STOLF"."TEST_TABLE"("COL1","COL2","COL3","COL4") values ('849','20100429','849','849');
    delete from "STOLF"."TEST_TABLE" where "COL1" = '849' and "COL2" = '20100429' and "COL3" = '849' and "COL4" = '849' and ROWID = 'AAASOHAAEAAAATbAAA';+

    Enable supplemental logging.
    Please see below,
    SQL> shut immediate
    Database closed.
    Database dismounted.
    ORACLE instance shut down.
    SQL> startup mount;
    ORACLE instance started.
    Total System Global Area  422670336 bytes
    Fixed Size                  1300352 bytes
    Variable Size             306186368 bytes
    Database Buffers          109051904 bytes
    Redo Buffers                6131712 bytes
    Database mounted.
    SQL> alter database archivelog;
    Database altered.
    SQL> alter database open;
    Database altered.
    SQL> alter system checkpoint;
    System altered.
    SQL> drop table test_Table purge;
    Table dropped.
    SQL> create table test_table(
      2  col1 number,
    col2 varchar(10),
    col3 varchar(10),
    col4 varchar(10));  3    4    5
    Table created.
    SQL> insert into test_table (col1, col2, col3, col4) values ( 0, 20100305, 0, 0);
    1 row created.
    SQL> commit;
    Commit complete.
    SQL> select t.ora_rowscn, t.* from test_table t;
    ORA_ROWSCN       COL1 COL2       COL3       COL4
       1132572          0 20100305   0          0
    SQL> for i in 1..1000 loop
    SP2-0734: unknown command beginning "for i in 1..." - rest of line ignored.
    SQL> begin
      2  for i in 1..1000 loop
      3  insert into test_table values(i,20100429,i,i);
      4  end loop; commit;
      5  end;
      6  /
    PL/SQL procedure successfully completed.
    SQL> alter system switch logfile;
    System altered.
    SQL> /
    SQL> select * from V$version;
    BANNER
    --------------------------------------------------------------------------------
    Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
    PL/SQL Release 11.1.0.6.0 - Production
    CORE    11.1.0.6.0      Production
    TNS for Linux: Version 11.1.0.6.0 - Production
    NLSRTL Version 11.1.0.6.0 - Production
    In the second session,
    SQL> l
      1  select SCN, SQL_REDO, SQL_UNDO FROM V$LOGMNR_CONTENTS where SQL_REDO IS NOT NULL
      2* and seg_owner='SYS' and table_name='TEST_TABLE'
    SQL> /
           SCN SQL_REDO / SQL_UNDO
    ---------- --------------------------------------------------------------------------------
       1132607 insert into "SYS"."TEST_TABLE"("COL1","COL2","COL3","COL4") values ('2','20100429','2','2');
               delete from "SYS"."TEST_TABLE" where "COL1" = '2' and "COL2" = '20100429' and "COL3" = '2' and "COL4" = '2' and ROWID = 'AAASPKAABAAAVpSAAC';
       1132607 insert into "SYS"."TEST_TABLE"("COL1","COL2","COL3","COL4") values ('3','20100429','3','3');
               delete from "SYS"."TEST_TABLE" where "COL1" = '3' and "COL2" = '20100429' and "COL3" = '3' and "COL4" = '3' and ROWID = 'AAASPKAABAAAVpSAAD';
       1132607 insert into "SYS"."TEST_TABLE"("COL1","COL2","COL3","COL4") values ('4','20100429','4','4');
    <<trimming the output>>
       1132607 insert into "SYS"."TEST_TABLE"("COL1","COL2","COL3","COL4") values ('998','20100429','998','998');
               delete from "SYS"."TEST_TABLE" where "COL1" = '998' and "COL2" = '20100429' and "COL3" = '998' and "COL4" = '998' and ROWID = 'AAASPKAABAAAVpVACV';
       1132607 insert into "SYS"."TEST_TABLE"("COL1","COL2","COL3","COL4") values ('999','20100429','999','999');
               delete from "SYS"."TEST_TABLE" where "COL1" = '999' and "COL2" = '20100429' and "COL3" = '999' and "COL4" = '999' and ROWID = 'AAASPKAABAAAVpVACW';
       1132607 insert into "SYS"."TEST_TABLE"("COL1","COL2","COL3","COL4") values ('1000','20100429','1000','1000');
               delete from "SYS"."TEST_TABLE" where "COL1" = '1000' and "COL2" = '20100429' and "COL3" = '1000' and "COL4" = '1000' and ROWID = 'AAASPKAABAAAVpVACX';
    1000 rows selected.
    HTH
    Aman....
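    For completeness, the supplemental logging mentioned above is typically enabled at the database level with:
    SQL> alter database add supplemental log data;
    and can be verified with:
    SQL> select supplemental_log_data_min from v$database;
    Without at least minimal supplemental logging, LogMiner may not be able to reconstruct every row-level change from the redo.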

  • Large number of objects - log miner scalability?

    We have been consolidating several departmental databases into one big RAC database. Moreover, in test databases we are cloning test cells (for example, an application schema is getting cloned hundreds of times so that our users may test independently of each other).
    So, our acceptance test database now has about 500,000 objects in it. We have production databases with over 2 million objects in them.
    We are using Streams. At this time we're using a local capture, but our architecture aims to use downstream capture soon... We are concerned about the resources required for the LogMiner data dictionary build.
    We are currently not using DBMS_LOGMNR_D.BUILD directly, but rather indirectly through DBMS_STREAMS_ADM.ADD_TABLE_RULE. We only want to replicate about 30 tables.
    We are surprised to find that LogMiner always builds a complete data dictionary covering every object in the database (tables, partitions, columns, users, and so on).
    Apparently there is no way to create a partial data dictionary, even by using DBMS_LOGMNR_D.BUILD directly...
    Lately, it took more than 2 hours just to build the LogMiner data dictionary on a busy system! And we ended up with an ORA-01280 error, so we had to start all over again...
    We just increased our redo log size recently; I haven't had a chance to test after the change. Our redo logs were only 4MB, and we increased them to 64MB to reduce checkpoint activity. This will probably help...
    Has anybody else encountered a slow LogMiner dictionary build?
    Any advice?
    Thank you in advance.
    Jocelyn

    Hello Jocelyn,
    In a Streams environment, the LogMiner dictionary build is done using the DBMS_CAPTURE_ADM.BUILD procedure; you should not be using DBMS_LOGMNR_D.BUILD for this.
    In a Streams environment, DBMS_STREAMS_ADM.ADD_TABLE_RULE dumps the dictionary only the first time you call it, because the capture process does not exist yet; it is created on that first call, along with the dictionary dump. The LogMiner dictionary holds information about all the objects (tables, partitions, columns, users, etc.), so the time the dump takes depends on the number of objects in the database: if the number of objects is very high, the data dictionary itself will be big.
    Your redo log size of 64MB is too small for a production system; you should consider a redo log size of at least 200MB.
    You can perform a complete LogMiner dictionary build using DBMS_CAPTURE_ADM.BUILD and then create a capture process using the FIRST_SCN returned by the BUILD procedure.
    Let me know if you have more doubts.
    Thanks,
    Rijesh
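    If it helps, here is a minimal sketch of that approach (the queue and capture process names are assumptions, not taken from this thread):
    DECLARE
      v_first_scn NUMBER;
    BEGIN
      -- Write the LogMiner dictionary into the redo stream and capture the SCN it returns.
      DBMS_CAPTURE_ADM.BUILD(first_scn => v_first_scn);
      DBMS_OUTPUT.PUT_LINE('Dictionary build FIRST_SCN: ' || v_first_scn);
      -- Create the capture process explicitly, anchored at that dictionary build.
      DBMS_CAPTURE_ADM.CREATE_CAPTURE(
        queue_name   => 'STRMADMIN.STREAMS_QUEUE',  -- assumed queue name
        capture_name => 'CAPTURE_TABLES',           -- assumed capture process name
        first_scn    => v_first_scn);
    END;
    /
    Table rules can then be added to this capture process with DBMS_STREAMS_ADM.ADD_TABLE_RULE as before; since the capture process already exists, that call will not trigger another dictionary dump.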

  • Log miner

    begin dbmn_logmnr.start_logmnr(starttime => '01-oct-2011 00:00:00',endtime =>'21-feb-2012 00:00:00',options => dbms_logmnr.dict_from_online_catalog+dbms_logmnr.continuous_mine);
    ERROR at line 1:
    ORA-06550: line 1, column 7:
    PLS-00201: identifier 'DBMN_LOGMNR.START_LOGMNR' must be declared
    ORA-06550: line 1, column 7:
    PL/SQL: Statement ignored
    How do I fix this problem?
    I need to start LogMiner.

    915855 wrote:
    begin dbmn_logmnr.start_logmnr(starttime => '01-oct-2011 00:00:00',endtime =>'21-feb-2012 00:00:00',options => dbms_logmnr.dict_from_online_catalog+dbms_logmnr.continuous_mine);
    ERROR at line 1:
    ORA-06550: line 1, column 7:
    PLS-00201: identifier 'DBMN_LOGMNR.START_LOGMNR' must be declared
    ORA-06550: line 1, column 7:
    PL/SQL: Statement ignored
    How do I fix this problem?
    I need to start LogMiner.
    Please read the documentation; there is a complete chapter on how to use LogMiner:
    http://docs.oracle.com/cd/E11882_01/server.112/e22490/logminer.htm#SUTIL019
    And as mentioned by Vivek, check the spelling of the package name.
    Aman....
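    For reference, a minimal corrected sketch: the package is DBMS_LOGMNR (not DBMN_LOGMNR), and passing explicit DATE values via TO_DATE avoids depending on the session's NLS_DATE_FORMAT. The options are kept as in the original post:
    BEGIN
      DBMS_LOGMNR.START_LOGMNR(
        STARTTIME => TO_DATE('01-oct-2011 00:00:00', 'DD-MON-YYYY HH24:MI:SS'),
        ENDTIME   => TO_DATE('21-feb-2012 00:00:00', 'DD-MON-YYYY HH24:MI:SS'),
        OPTIONS   => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG + DBMS_LOGMNR.CONTINUOUS_MINE);
    END;
    /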

  • Excessive disk usage when I drag the log file viewer window (why)?

    When I drag the Log File Viewer window in Gnome, I get huge amounts of hard disk usage and the hard drive makes a loud rumbling noise. This happens only while dragging the Log File Viewer window and no other windows (that I've noticed so far).
    Why is this happening?

    Elements11DRC
    What version of Premiere Elements are you working with and on what computer operating system is it running?
    Can we assume by your selected ID, that the program is Premiere Elements 11?
    Pending further details, I will assume that you are working with Premiere Elements 11 on Windows 7, 8, or 8.1 64 bit.
    Where is this "My Videos" Folder - on a DVD disc being used as a DataDisc for video storage purposes?
    If so, Add Media/DVD Camera or Computer Drive/Video Importer and from there automatically into the project in Project Assets as well as on the Timeline.
    If your "My Videos" Folder is a folder on the computer hard drive, then Add Media/Files and Folders to get the video into Project Assets from where you drag the video to the Timeline.
    Now for the video that you are trying to import... what are its properties?
    Video Compression
    Audio Compression
    Frame Size
    Frame Rate
    Interlaced or Progressive
    File Extension
    Pixel Aspect Ratio
    Probably answered the easiest by knowing the brand/model/settings of the camera that recorded the video.
    Prime interest, that video compression. It could be MotionJPEG which can be problematic for Premiere Elements. It could be AVCHD.avi which cannot be imported.
    We can go into greater detail on your project details once we rule in or out any of the factors mentioned above.
    By the way, what is the destination for this project....burn to disc DVD or Blu-ray...export to file saved to the computer hard drive...other?
    More later.
    Thanks.
    ATR

  • Log access (viewing / display) to sensitive infotypes, e.g. IT0008

    Hi all,
    I am currently working in a customer project, the requirements are to log access (viewing / display) to sensitive infotypes, e.g. IT0008 when accessing through e.g. transaction PA20.
    I know about the Audit Trail and the logging functionality at the database table level, but as far as I know these only log changes to data and/or deletion of data (can they also be configured to log display/view access as well?).
    For this customer project I am looking for a solution to log simple access like viewing/display of e.g. IT0008 through PA20 with time, userid and terminal. Is this supported with SAP HCM? How can this be achieved?
    I know probably the simplest solution would be just to restrict user privileges properly. But the client definitely wants this logging feature (log viewing/display access to e.g. IT0008).
    Thx for your help, Stefan

    Dear Manoj,
    thanks for your answer. Indeed increasing db table size could be a problem.
    I am still wondering if there is not a standard SAP solution for this problem. Is it not a common customer request to have infotype access (display access) logged?
    I understand that for revision purposes it is necessary to have a change log on infotype data; however, in my understanding there must also be a need on the customer side to log display access on sensitive infotypes, to prevent misuse of data or at least set up a higher hurdle against it.
    Any more insights on this?
    Thanks again, Stefan

  • Is there an alternative to view Pictures from Aperture "Projects"/Folders

    I am considering the purchase of a new ATV to display pictures from my Aperture library on the HD TV.
    1)
    Is there a way to view the pictures using the full 1080 HD capacity of the TV, or will this always be limited to 720 no matter what I do? It is not bad at 720 on a 42 inch screen, but I would like to maximise the resolution, especially as we now have a 46" screen. If not, is there an alternative to ATV (to view the Aperture Library - I don't want to mess around with exporting etc. if I can avoid it).
    2) If I connect a HDD to the USB port, can I then Synchronise Aperture pictures to the ATV HDD as I used to with my old ATV (as an option to view pictures and other stuff without the Computer connected)?
    Any other words of wisdom? Is the Mac mini an option (without a computer screen - i.e. a replacement for the ATV)? If so, would it give 1080 through the HDMI cable, would it allow the ATV remote to be used with it, and would it be seen by iTunes as a device to sync with?
    (We don't have our old kit anymore in case anyone was wondering)

    van D wrote:
    1)Is there a way to view the pictures using the full 1080 HD capacity of the TV or will this always be limited to 720 no matter what I do?
    AppleTV 2 only outputs 720p - whether this is a hardware or software limitation I don't know.
    Apple might be able to address this in a future update, but if it was capable overall I think it would have been an option from the start.
    2) If I connect a HDD to the USB port, can I then Synchronise Aperture pictures to the ATV HDD as I used to with my old ATV (as an option to view pictures and other stuff without the Computer connected)?
    No, the USB port is for restoring via iTunes only - you can't officially connect a drive to it. The AppleTV 2 also has only 8GB of solid-state storage, shared between all functions.
    The Mac mini is certainly capable of outputting 1080p over HDMI. I have last year's model hooked up to a 1080p TV, though more often than not the HDMI lead is disconnected in favour of another device and I manage it remotely.
    With Home Sharing you could easily transfer content to Mac Mini from another computer running iTunes.
    Remote should work with Mini, but not sure what app you'd be controlling for the slideshows or whether they'd respond to the remote.
    Lion killed Front Row support.
    AC

  • Is there a log I can check for internal messages not being received?

    We are using sendmail in an internal VB script. My boss is not getting emails sent to her from this script.
    Is there a log I can view (similar to the GWIA log for external messages) that will tell me if the Groupwise system even tried to process it?
    Because I am using a script and not the client, I cannot check the sent status. (I think...)

    On 6/23/2011 4:46 AM, Jason wrote:
    > We are using sendmail in an internal VB script. My boss is not getting
    > emails sent to her from this script.
    > Is there a log I can view (similar to the GWIA log for external
    > messages) that will tell me if the Groupwise system even tried to
    > process it?
    > Because I am using a script and not the client, I cannot check the sent
    > status. (I think...)
    First of all, you'd be better off learning the pure Object API rather than
    shelling out.
    Second, if you were using the Object API it would generate a sent item; I
    think sendmail avoids this.
    Third, the only other way to trace it is to put the POA on verbose logging
    and see what happens. That is obviously easier to interpret off hours.

  • I would like to set up my email on the Apple TV as well as have my husband's email on there so we can view both sets of photos and videos - it is already set up in his name - how do I add my name so as to view my photo library from all of my devices?

    I would like to set up my email on the Apple TV as well as have my husband's email on there so we can view both sets of photos and videos - it is already set up in his name - how do I add my name so as to view my photo library and songs from MY phone?

    This is not a reply - I asked the question - still trying to learn how all this works - someone please HELP ME

  • Is there any provision to view the selected record using SYS_REFCURSOR?

    hi friends ,
    I was using SQL Server; now I am shifting to Oracle, so we are converting the SQL Server stored procedures to Oracle stored procedures. I have given the structure of a procedure below. If possible, I want to see the output of the select statement in the TOAD editor. If anybody knows how, please help me.
    CREATE OR REPLACE PROCEDURE PS_AON (
       P_STATUS OUT VARCHAR2,
       P_CUR    OUT SYS_REFCURSOR )
    AS
    BEGIN
       OPEN P_CUR FOR
          select column1, column2, column3 from table1;
    EXCEPTION
       WHEN OTHERS THEN
          P_STATUS := SQLERRM;
    END;
    This is one of the model stored procedures I am using. The editor I am using is TOAD 7.3.0 with Oracle 9i. Is there any provision to view the selected records by running this procedure in the TOAD editor?
    thanks & regards

    (assuming you have a relatively recent version of TOAD).
    Write a small block to call the procedure (or use Toad's 'execute procedure' option) as in the example below. Note the ':' in front of 'v_cur_out'. When you run the block, TOAD will prompt you for a value / datatype for 'v_cur_out'. Ignore the value, set the datatype to 'Cursor' and click OK. The resultset (if any) will be displayed in the Data Grid window below.
    DECLARE
       v_status VARCHAR2 (32767);
    BEGIN
       ps_aon (v_status, :v_cur_out);
       DBMS_OUTPUT.PUT_LINE ('v_status => ' || v_status);
    END;
    /
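    If you are working in SQL*Plus rather than TOAD, a bind variable of type REFCURSOR does the same job (a small sketch, assuming the PS_AON procedure above):
    SQL> VARIABLE v_status VARCHAR2(4000)
    SQL> VARIABLE v_cur REFCURSOR
    SQL> EXEC ps_aon(:v_status, :v_cur);
    SQL> PRINT v_cur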

  • My App Store ID is my Apple ID and password, which works when purchasing. But when I am updating from the Update option it asks for a password. The user ID I see there is not mine, it's different. How do I change that to my Apple ID? Please help me resolve .....

    My App Store ID is my Apple ID and password, which works fine when purchasing. But when I am updating from the Update option it asks for a password, and the user ID I see there is not mine; it's different. How do I change that to my Apple ID? Please help me resolve this issue. I have tried resetting it but nothing... Either I'm doing something wrong or....

    I believe the issue is with the Apple ID that was used to purchase the app. If you download an app that was purchased under a different Apple ID, then all updates will also be linked to the original purchaser's Apple ID. Your Apple ID is the same ID as your iTunes, iCloud, etc. accounts; some folks use a different ID for the different Apple sites, but there is no need for that - one ID works for all Apple sites. If someone else buys an app using their ID and you download that app onto your device, then when that app requires an update it will ask for the purchaser's Apple ID. This happens a lot when folks sell their iPad or give it to someone else and leave some purchased (or free) apps on the iOS device. You cannot change the original ID the app was purchased under. A suggestion: if someone else has an app that you like but do not want to pay for, either use their ID or in the future have them gift the app to you.
