Checkpointing - control file contents question

Some clarification is needed if possible...
When you commit a transaction:
- the commit SCN is recorded in the ITL of the data block and in the undo segment header
- LGWR records the commit SCN (for all data blocks involved) to the redo log
Checkpoint Event
- (3 seconds or possibly less pass by) CKPT wakes up and signals DBWn to write dirty (modified and committed) blocks to disk
- CKPT records the checkpoint SCN in the control file (data file and redo thread sections) and in the data file headers (a task of the checkpoint that occurs at a log switch)
- the checkpoint position in the redo log is moved forward
Control file contents question:
When LGWR writes the commit SCN to the redo log, who writes the SCN to the control file? LGWR or CKPT?
Also, when is the redo thread SCN written?
Matt

Matt,
This is my understanding of the stuff. Feel free to correct me.
The checkpoint SCN, as I mentioned in my last reply, marks the point up to which the data has been "checkpointed" to the datafiles. In the case of a crash, this marker tells Oracle where recovery of the datafile must start and how far into the redo stream it has to go. It is recorded only in the datafile headers and in the control file; it does not get recorded in the redo log file/stream.
I mentioned the checkpoint queue in my reply too. Although I couldn't find a reference that directly links it to the checkpoint SCN, I believe my theory is at least partially correct. The incremental checkpoint determines how much redo needs to be applied to a datafile if the database is closed without a proper checkpoint, and this position is maintained in the datafile header itself in the form of the checkpoint SCN. When it does not match the control file checkpoint SCN, which is never lower, recovery is reported.
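A quick way to see both markers side by side, if it helps - just a sketch, and after a clean checkpoint the values normally match:
-- checkpoint SCN recorded in the control file for the whole database
SELECT checkpoint_change# FROM v$database;
-- per-datafile checkpoint SCN as the control file sees it
SELECT file#, checkpoint_change# FROM v$datafile;
-- per-datafile checkpoint SCN read from the datafile headers themselves
SELECT file#, checkpoint_change# FROM v$datafile_header;
-- force a full checkpoint and re-run the queries to watch the SCNs advance
ALTER SYSTEM CHECKPOINT;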
I hope this is somewhat correct. Do let me know your views too.
Cheers
Aman....

Similar Messages

  • Rman causing control file contention

    Hi,
    May I ask for your opinions and advice regarding control file contention issues?
    Last week, while RMAN was running, it caused a locking issue with a session on another node, and based on the ASH/AWR reports there was control file contention.
    I'm in need of your advice and opinions.
    Thanks.

    Hi,
    I have too little experience with RAC, so you better wait for some answers from experts on that matter.
    Nevertheless, my thoughts are as follows:
    * You have enqueue contention, and this should be taken care of by the GES part of your CRS layer: maybe a place to check if you can trace back any problems.
    * I thought that version management should be: CRS version >= ASM version >= DB version ... and that is not your situation, since your CRS version seems to be 10g while your database runs 11g.
    Maybe you could also mention if you use ASM, and where your control files are located.
    Again, wait for experienced answers...
    HTH,
    Thierry

  • Sql loader control file question.

    I have a text file (t.txt) that contains record types AAA and AAB, used to load fixed-width data into a table (t) with columns AAA_NO, AAA_TYPE, and AAB_DESC.
    Control file (control_t) contents:
    load data
    infile '/path/t.txt'
    insert into table t
    when (1:3) = 'AAA'
    (AAA_NO position (4:14) CHAR,
     AAA_TYPE position (15:27) CHAR)
    It works perfectly, but I need to add another set of data from the same t.txt file with record type AAB. I attempted to add this to the same control file:
    into table t
    when (1:3) = 'AAB'
    (AAB_DESC position (28:128) CHAR)
    Naturally, it fails. How would I include the additional record type data in the same table after AAA_NO and AAA_TYPE have already been inserted? Do I need to include AAA_NO in the second insert (AAB_DESC)? Should I create another temp table to store only the AAA_NO and AAB_DESC and then insert that data into table t after the loader is done? Or can this be completed in the same control file?

    Thanks again for the assistance; this is a tough one to fix. I am new to SQL*Loader.
    The temp table creation is causing some serious errors, so I am back to trying to get SQL*Loader to do the job. The apt.txt file contains records where each row starts with either 'APT' or 'ATT'. Here are the details of what I am trying to do.
    ctl file:
    load data
    infile '/path/apt.txt'
    insert
    into table t_hld
    when (1:3) = 'APT'
    (apt_no position (4:14) CHAR,
     apt_type position (15:27) CHAR,
     apt_id position (28:31) CHAR)
    The next section is the problem, where I am inserting apt_sked into the same table t_hld as above, because it has a different record qualifier: ATT instead of APT.
    insert
    into table t_hld
    when (1:3) = 'ATT'
    (apt_no position (4:14) CHAR,
     apt_sked position (16:126) CHAR)
    The fixed positions of the data are working: I can insert the apt_sked data into another temp table instead of t_hld and it works. It only fails when I attempt to place the ATT apt_sked data into the t_hld table after the APT data has been loaded into it. I tried APPEND instead of INSERT, but that does not work.
    The APT_NOs of the data are all the same; they are the qualifier for the records (a primary key attribute, although I have not created the constraint since it is a temp table concept).
    I am stuck trying to get the data into the t_hld table. Everything works when I do not try to put the ATT apt_sked data into t_hld, and placing the ATT apt_sked data into a different temp table works perfectly, but I can't find a way to update t_hld from that temp table without errors. So I am trying to go back to SQL*Loader to get this done. Any thoughts or questions?
    Thanks a billion!
    Shawn
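    For what it's worth, a single control file can usually drive both record types with two INTO TABLE ... WHEN clauses; here is a rough sketch built from the positions quoted above (untested, so treat it as a starting point rather than the definitive fix):
    load data
    infile '/path/apt.txt'
    append
    into table t_hld
    when (1:3) = 'APT'
    (apt_no   position (4:14)   CHAR,
     apt_type position (15:27)  CHAR,
     apt_id   position (28:31)  CHAR)
    into table t_hld
    when (1:3) = 'ATT'
    (apt_no   position (4:14)   CHAR,
     apt_sked position (16:126) CHAR)
    Note that each input line still becomes its own row in t_hld; if the ATT values have to land on the same row as the matching APT record, that merge still has to happen afterwards (for example with an UPDATE keyed on apt_no).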

  • Hi Question about Sender File Content Conversion

    Hi All,
    I have a flat file that I need to convert to the XML format below using File Content Conversion. Can anybody help me with example File Content Conversion parameters?
    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:MT_FileSend xmlns:ns0="http://test">
       <Recordset>
          <Contact>
             <Name>ABC</Name>
             <Number>123</Number>
             <Address>
                <HouseNumber>246789</HouseNumber>
                <StreetNumber>100</StreetNumber>
                <Phone>
                   <Mobile>90000000</Mobile>
                   <LandLine>12345678</LandLine>
                </Phone>
             </Address>
             <Email>
                <Office></Office>
                <Personal></Personal>
             </Email>
          </Contact>
       </Recordset>
    </ns0:MT_FileSend>

    Hi Sudheer reddy,
            Good question.
    File content conversion parameters are:
    1) Document Name: this is simply the message type name on the file side: MT_FileSend
    2) Document Namespace: the namespace of the message type.
    3) Document Offset: the number of lines at the start of the document to ignore.
    4) Recordset Name: the recordset name: Recordset
    5) Recordset Namespace: if it differs from the message type namespace, specify it; otherwise leave it blank.
    6) Recordset Structure: the structure name(s).
    7) Recordset Sequence: the sequence of the recordset, Ascending or Descending.
    8) Recordsets per Message: defines the number of recordsets per message.
    9) Key Field Name: used to identify the record type within the document.
    10) Key Field Type: String (case-sensitive).
    Plus the per-structure parameters:
    fieldValues
    fieldSeparator
    endSeparator
    Note: please search SDN for file content conversion; many blogs are available and they are easy to follow.
    Regards,
    Sateesh

  • Question on File Content Conversion

    Hi all,
    I went through this blog (/people/venkat.donela/blog/2005/03/02/introduction-to-simplefile-xi-filescenario-and-complete-walk-through-for-starterspart1) for File Content Conversion and I have a question regarding it. Is it always necessary to have the data type structure as File, Record, Row and then the fields? Is it always necessary to maintain this level of hierarchy for File Content Conversion to work?

    Hi Sonia,
    For FCC on the sender or receiver side, if you want to process delimited text / CSV files,
    use the parameters below.
    On the sender side you would use the Document Name, Namespace, Recordset Name, Record Structure Name (and occurrences), and the processing parameters.
    On the receiver side: the Record Structure and the processing parameters.
    See the links below for more details:
    /people/michal.krawczyk2/blog/2004/12/15/how-to-send-a-flat-file-with-fixed-lengths-to-xi-30-using-a-central-file-adapter
    /people/jeyakumar.muthu2/blog/2005/11/29/file-content-conversion-for-unequal-number-of-columns
    /people/anish.abraham2/blog/2005/06/08/content-conversion-patternrandom-content-in-input-file
    /people/harrison.holland5/blog/2006/12/20/xi-configuration-for-mdm-integration--sample-scenario
    /people/shabarish.vijayakumar/blog/2006/04/03/xi-in-the-role-of-a-ftp
    /people/prateek.shah/blog/2005/06/14/file-to-r3-via-abap-proxy
    /people/mickael.huchet/blog/2006/09/18/xipi-how-to-exclude-files-in-a-sender-file-adapter
    Regards
    Chilla

  • Flat File and Control Files Questions

    Greetings,
    I've worked with Oracle for about 10 years, but have little experience with using sql-loader.
    I have data from Visual FoxPro tables going into Oracle 10g via a Perl script. I am having issues and therefore have a couple questions.
    1) If the data from my foxpro table is basically everything in the table as in 'Select * from table-name', does the control file have to list every column that is in the FoxPro table?
    -- I have a case where a FoxPro table has 15 columns but we are trying to upload only 10 columns. The script is dynamic. It selects * from each FoxPro table and creates a Flat File for each on the fly. Then sql-loader uploads the data to Oracle. The Flat File for this one table has data from all 15 columns, but the Control File only lists 10 of the columns to be uploaded into Oracle.
    2) Do the column names in the control file 'have' to match both the column names in the FoxPro table and the Oracle table, or only the Oracle table?

    YankeeFan wrote:
    1) If the data from my foxpro table is basically everything in the table as in 'Select * from table-name', does the control file have to list every column that is in the FoxPro table?
    Yes - use the FILLER spec to ignore the columns you do not care about - http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/ldr_field_list.htm#sthref946
    2) Do the column names in the control file 'have' to match both the column names in the FoxPro table and the Oracle table, or only the Oracle table?
    Only the Oracle table.
    HTH
    Srini
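    As an illustration of the FILLER idea above, a control file fragment for a comma-separated flat file where some of the 15 columns should be skipped might look like this (the column and table names are made up for the example):
    load data
    infile '/path/foxpro_extract.dat'
    append
    into table target_table
    fields terminated by ','
    (col1,
     col2,
     skip_col3 FILLER,   -- present in the flat file, never loaded
     col4,
     skip_col5 FILLER,
     col6)
    The field names that are actually loaded only have to match the Oracle table; the FILLER fields can be called anything.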

  • A question about restoring from cold backup(control file backup not clear)

    Hi,
    I had another question about restoring a cold backup. My database is in noarchivelog mode, and after taking a consistent cold backup, all I need to do is restore the backup, right? The reason I ask is that when I back up my control file to trace, I see statements like this:
    -- Commands to re-create incarnation table
    -- Below log names MUST be changed to existing filenames on
    -- disk. Any one log file from each branch can be used to
    -- re-create incarnation records.
    -- ALTER DATABASE REGISTER LOGFILE '/uo1/app1/arch1_1_647102958.dbf';
    -- Recovery is required if any of the datafiles are restored backups,
    -- or if the last shutdown was not normal or immediate.
    RECOVER DATABASE
    -- Database can now be opened normally.
    ALTER DATABASE OPEN;
    My database is in noarchivelog mode, so I don't know why these statements (registering the logfile) are in the backup of the control file. When I restore the cold backup of this database, will it still work correctly? (There are no archived log files; I have only the control, redo, and data files in the cold backup.)
    thanks
    Nirav
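    For a consistent cold backup of a NOARCHIVELOG database the restore really is just putting every file back and starting up; the RECOVER / REGISTER LOGFILE lines in the trace script only matter when the control file is being recreated after an inconsistent shutdown or when recovery is genuinely needed. A minimal sketch (paths are examples only):
    SHUTDOWN IMMEDIATE
    -- at the OS level, copy the backed-up control files, datafiles and
    -- online redo logs back to their original locations, for example:
    --   cp /backup/day1/* /u01/oradata/mydb/
    STARTUP
    -- no RECOVER step is needed because the cold backup was taken consistent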

    Thanks for your inputs! It is most useful to me.
    Regards
    Nirav

  • How to check the contents of Control file, Log file & Parameter file

    Can anybody tell me how I can check the contents of the control file, the log files, and the parameter file?
    Arif

    OK ...
    Parameter file:
    It will normally be in the $ORACLE_HOME/dbs directory. It could be an init{sid}.ora or an spfile{sid}.ora ... do not edit an SPFILE, as you could corrupt it.
    http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14231/create.htm#sthref434
    If it's an init{sid}.ora, use any editor to see the actual content.
    If it's an spfile{sid}.ora, use one of V$PARAMETER, V$PARAMETER2, V$SYSTEM_PARAMETER, V$SYSTEM_PARAMETER2 as appropriate (read the Reference manual to get an idea about when each is appropriate).
    Control File:
    You should not edit a control file. It consists of many areas, as described here: http://download-east.oracle.com/docs/cd/B19306_01/server.102/b14220/physical.htm#i10135 and each area is interrogated by V$ views as given in the Reference manual. The following V$ views all deal with some aspect of the control files or provide information about their contents:
    V$ARCHIVE_DEST
    V$ARCHIVED_LOG
    V$BACKUP_ARCHIVELOG_DETAILS
    V$BACKUP_CONTROLFILE_DETAILS
    V$BACKUP_CONTROLFILE_SUMMARY
    V$BACKUP_COPY_DETAILS
    V$BACKUP_COPY_SUMMARY
    V$BACKUP_CORRUPTION
    V$BACKUP_DATAFILE
    V$BACKUP_DATAFILE_DETAILS
    V$BACKUP_FILES
    V$BACKUP_PIECE
    V$BACKUP_REDOLOG
    V$BACKUP_SET
    V$BACKUP_SET_DETAILS
    V$BACKUP_SPFILE
    V$CONTROLFILE
    V$CONTROLFILE_RECORD_SECTION
    V$COPY_CORRUPTION
    V$DATABASE
    V$DATABASE_INCARNATION
    V$DATAFILE
    V$DATAFILE_COPY
    V$DATAFILE_HEADER
    V$DELETED_OBJECT
    V$LOCK
    V$LOG
    V$LOG_HISTORY
    V$LOGHIST
    V$OBSOLETE_BACKUP_FILES
    V$OFFLINE_RANGE
    V$PROXY_COPY_DETAILS
    V$PROXY_COPY_SUMMARY
    V$PROXY_DATAFILE
    V$RMAN_BACKUP_JOB_DETAILS
    V$RMAN_BACKUP_SUBJOB_DETAILS
    V$RMAN_CONFIGURATION
    V$SESSION
    V$TABLESPACE
    V$THREAD
    V$UNUSABLE_BACKUPFILE_DETAILS
    Log Files:
    The contents of the log files are viewed using LogMiner - read Chapter 11 of the Oracle Utilities manual.
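    To make that concrete, a few hedged examples (the redo log path below is only a placeholder):
    -- parameter values currently in effect (works for pfile or spfile)
    SELECT name, value FROM v$parameter ORDER BY name;
    -- control file locations and record sections
    SELECT name FROM v$controlfile;
    SELECT type, records_total, records_used FROM v$controlfile_record_section;
    -- redo log contents via LogMiner, using the online catalog as dictionary
    EXECUTE DBMS_LOGMNR.ADD_LOGFILE('/u01/oradata/mydb/redo01.log', DBMS_LOGMNR.NEW);
    EXECUTE DBMS_LOGMNR.START_LOGMNR(OPTIONS => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);
    SELECT scn, operation, sql_redo FROM v$logmnr_contents WHERE ROWNUM <= 20;
    EXECUTE DBMS_LOGMNR.END_LOGMNR;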

  • Question about File Content Conversion and parent-child relationships...

    Hello!
         I have read probably every blog, article and SAP Help document on the topic, but I am stuck on this one.  I am trying to convert a General Ledger flat file to an IDoc using the classic file --> IDoc scenario.  The setup is done and working, but the IDocs are formatted incorrectly and I believe at least part of the reason is how I am converting the file content. 
    The root of my problem is that the flat file has a parent-child relationship between the document header and the document item and I want to maintain that since the IDoc type (FIDCCP01) has the same structure in the BKPF and BSEG segments.
    Here is the flat (non-XML) file layout that is coming into the file adapter:
    FileHeader
    DocumentHeader
    DocumentItem
    DocumentHeader
    DocumentItem
    and so on, until the number of documents is complete.
    I would really like the content to be converted so that the line items stay under their parent document headers like this:
    <FileHeader></FileHeader>
    <DocumentHeader>
       <ItemHeader>
       </ItemHeader>
    </DocumentHeader>
    <DocumentHeader>
       <ItemHeader>
      </ItemHeader>
    </DocumentHeader>
    But I keep getting this, where it lists the document headers first (one after another), and then all of the line items after the document headers like this:
    <FileHeader></FileHeader>
    <DocumentHeader></DocumentHeader>
    <DocumentHeader></DocumentHeader>
    <DocumentHeader></DocumentHeader>
    <ItemHeader></ItemHeader>
    Is it possible to maintain that parent-child relationship from the flat file and pass it over to the XML?
    Thanks,
    John

    Hi,
    Check some links on FCC.
    /people/venkat.donela/blog/2005/03/02/introduction-to-simplefile-xi-filescenario-and-complete-walk-through-for-starterspart1
    /people/venkat.donela/blog/2005/03/03/introduction-to-simple-file-xi-filescenario-and-complete-walk-through-for-starterspart2
    /people/arpit.seth/blog/2005/06/02/file-receiver-with-content-conversion
    /people/anish.abraham2/blog/2005/06/08/content-conversion-patternrandom-content-in-input-file
    /people/shabarish.vijayakumar/blog/2005/08/17/nab-the-tab-file-adapter
    /people/venkat.donela/blog/2005/06/08/how-to-send-a-flat-file-with-various-field-lengths-and-variable-substructures-to-xi-30
    /people/jeyakumar.muthu2/blog/2005/11/29/file-content-conversion-for-unequal-number-of-columns
    /people/shabarish.vijayakumar/blog/2006/02/27/content-conversion-the-key-field-problem
    /people/michal.krawczyk2/blog/2004/12/15/how-to-send-a-flat-file-with-fixed-lengths-to-xi-30-using-a-central-file-adapter
    /people/arpit.seth/blog/2005/06/02/file-receiver-with-content-conversion
    http://help.sap.com/saphelp_nw04/helpdata/en/d2/bab440c97f3716e10000000a155106/content.htm
    Regards,
    Phani
    Reward points if Helpful

  • Sender Adapter File Content Conversion - question

    Hi!
    Could you help me create an XML file from a CSV?
    I have a file like:
    12345#254#9765#89654
    55#9066#77127#47
    And i need file like:
    <dealings>
            <deal>
                    <field1>12345</field1>
                    <field2>254</field2>
                    <field3>9765</field3>
                    <field4>89654</field4>
            </deal>
            <deal>
                    <field1>55</field1>
                    <field2>9066</field2>
                    <field3>77127</field3>
                    <field4>47</field4>
            </deal>
    </dealings>
    I don't have key fields - all the field values are arbitrary, and the lengths of the fields vary.
    How should I set the content conversion parameters?
    Thank you.
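    Not having key fields is fine here, because every line has the same structure. Something along these lines should get close to the XML you describe; the structure name "deal" and document name "dealings" are assumptions taken from your target XML, so adjust them to your actual message type:
    Document Name: dealings
    Document Namespace: (your message type namespace)
    Recordset Structure: deal,*
    Recordset Sequence: Ascending
    deal.fieldNames: field1,field2,field3,field4
    deal.fieldSeparator: #
    deal.endSeparator: 'nl'
    ignoreRecordsetName: true
    Setting ignoreRecordsetName to true drops the <Recordset> wrapper so that <deal> sits directly under the root element.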

    Hi
    File Content parameters for the Sender Adapter:
    http://help.sap.com/saphelp_nw04/helpdata/en/2c/181077dd7d6b4ea6a8029b20bf7e55/content.htm
    File content conversion sites
    http://help.sap.com/saphelp_nw04/helpdata/en/d2/bab440c97f3716e10000000a155106/content.htm
    Please see the below links for file content conversion:
    File Content Conversion for Unequal Number of Columns
    Content Conversion (Pattern/Random content in input file)
    XI Configuration for MDM Integration - Sample Scenario
    XI in the role of a FTP
    Introduction to simple (File-XI-File) scenario and complete walk through for starters (Part 1)
    Introduction to simple (File-XI-File) scenario and complete walk through for starters (Part 2)
    File Receiver with Content Conversion
    NAB the TAB (File Adapter)
    How to send a flat file with various field lengths and variable substructures to XI 3.0
    Content Conversion (The Key Field Problem)
    cheers

  • Missing Control File question

    I run the database in archivelog mode. I take online back-ups and occasional full backups.
    I take a full cold backup on day1 and on day 5 i take a hot backup including the control file.
    After this i create a couple of tablespaces, and usual work but don't take backup.
    On day 10 I lose my control files. Let's say all of them.
    Can I restore my DB as it was on day 10? If yes, how?
    TIA
    Naveen
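    Since the database runs in archivelog mode, losing every control file on day 10 is survivable as long as the datafiles and redo are intact. One hedged outline: restore the day-5 backup control file (or rebuild it with CREATE CONTROLFILE, listing every current datafile including the tablespaces added after day 5), recover using the backup control file, and open with RESETLOGS:
    STARTUP NOMOUNT
    -- restore the backed-up control file to all control file locations, then:
    ALTER DATABASE MOUNT;
    RECOVER DATABASE USING BACKUP CONTROLFILE UNTIL CANCEL;
    -- apply the archived logs (and the current online log if still available), then CANCEL
    ALTER DATABASE OPEN RESETLOGS;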

    Hi..
    AFAIK, there is no such level of logging in the alert log. From the Oracle documentation:
    Each database also has an alert.log. The alert log of a database is a chronological log of messages and errors, including the following:
    * All internal errors (ORA-600), block corruption errors (ORA-1578), and deadlock errors (ORA-60) that occur.
    * Administrative operations, such as the SQL statements CREATE/ALTER/DROP DATABASE/TABLESPACE and the Enterprise Manager or SQL*Plus statements STARTUP, SHUTDOWN, ARCHIVE LOG, and RECOVER.
    * Several messages and errors relating to the functions of shared server and dispatcher processes.
    * Errors during the automatic refresh of a materialized view.
    [http://download.oracle.com/docs/cd/B19306_01/server.102/b14220/process.htm#sthref1633]
    Anand

  • Some question about control file

    Hello,
    ALTER DATABASE BACKUP CONTROLFILE TO TRACE
    - In my case I have three control files. Will the above command generate the SQL command to recreate all three control files?
    - Is there any way to find out which trace file the SQL script will be written to? My user (udump) and background (bdump) dump directories contain many trace files and it is hard to find. Also, is it safe to delete all these trace files?
    - RMAN keeps all of its data (required for recovery) in the control file. Using the above command I didn't find anything about that data in the trace file, or maybe I missed it.
    Windows 2000, Oracle 9i
    Thank you!

    Hi,
    To get the file, use this select in the session where you executed the backup to trace:
    SELECT
         'The file you search is: ' ||
         DIR.VALUE || DECODE(SUBSTR(DIR.VALUE,1,1),'/','/','\') || '*' || SPID.VALUE || '*' "Info"
    FROM
         (SELECT VALUE
          FROM V$PARAMETER
          WHERE NAME='user_dump_dest'
         ) DIR,
         (SELECT SPID VALUE
          FROM V$PROCESS
          WHERE ADDR=(SELECT PADDR
                      FROM V$SESSION
                      WHERE SID=(SELECT DISTINCT SID
                                 FROM V$MYSTAT))
         ) SPID;
    You can cat/type this file directly. The file name returned uses wildcards (*) purposely.
    I read something about the RECOVERY CATALOG and found that it can be created in the same database or in a different database.
    Yes.
    If it's created in the same database and this is lost, won't the RECOVERY CATALOG be lost too?
    Yes, it will. That's why we create the RECOVERY CATALOG in a dedicated database (for example the one in which EM Grid Control / OMS has its repository).
    If it's created in a different database, do I need a RECOVERY CATALOG for this database too?
    Well, you can do it like that. I don't do it, though. My RMAN RECOVERY CATALOG database is backed up online via the standard archivelog method (with my dynamic backup script), not using RMAN.
    Regards,
    Yoann.
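    Depending on the version you can also skip the trace-file hunt entirely: later releases accept a target file name directly, and a tracefile identifier makes the generated file easy to spot. Both are shown below as a sketch; the path is a placeholder:
    -- name the output file yourself (supported in 10g and later, possibly earlier)
    ALTER DATABASE BACKUP CONTROLFILE TO TRACE AS '/tmp/recreate_controlfile.sql';
    -- or tag the session so the udump file name contains a recognisable string
    ALTER SESSION SET TRACEFILE_IDENTIFIER = 'ctlfile_trace';
    ALTER DATABASE BACKUP CONTROLFILE TO TRACE;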

  • Question regarding File Content Conversion

    Dear SAP experts,
    I need you expert advise regarding this.
    I have here my sample structure,
    CSV ---> this is the message type of the message
       '-- ROOT --> this is the root node
               '-- FieldA --> has a value of "Filename"
               '-- Field B
                      '-- subfield1 --> has a value of "A"
                      '-- subfield2 --> has a value of "B"
                   '-- subfield3 --> has a value of "C"
    I want to output this flatfile,
    subfield1;subfield2;subfield3
    A;B;C
    the output flatfile must have Header lines (subfield1;subfield2;subfield3)
    What would be the correct parameters in my File Content Conversion so that the value "Filename" is not included in the output flat file?
    Recordset Structure: ?
    Kindly advise.
    Thank you very much!
    Fred

    Refer to the FCC links below.
    Sender:
    /people/venkat.donela/blog/2005/03/02/introduction-to-simplefile-xi-filescenario-and-complete-walk-through-for-starterspart1
    Key value:
    /people/venkat.donela/blog/2005/06/08/how-to-send-a-flat-file-with-various-field-lengths-and-variable-substructures-to-xi-30
    /people/anish.abraham2/blog/2005/06/08/content-conversion-patternrandom-content-in-input-file
    /people/shabarish.vijayakumar/blog/2005/08/17/nab-the-tab-file-adapter
    /people/jeyakumar.muthu2/blog/2005/11/29/file-content-conversion-for-unequal-number-of-columns
    /people/shabarish.vijayakumar/blog/2006/02/27/content-conversion-the-key-field-problem
    /people/michal.krawczyk2/blog/2004/12/15/how-to-send-a-flat-file-with-fixed-lengths-to-xi-30-using-a-central-file-adapter
    Receiver FCC (no need for an endSeparator):
    /people/shabarish.vijayakumar/blog/2007/08/03/file-adapter-receiver--are-we-really-sure-about-the-concepts
    Receiver:
    /people/arpit.seth/blog/2005/06/02/file-receiver-with-content-conversion

  • Question about control files.

    Hi.
    For example, I have one control file on storage A and one on storage B. Storage B was placed offline for some reason. Will the control file recover automatically when storage B is brought back online, or does it have to be repaired manually?
    Thanks.
    Edited by: Web on Mar 6, 2012 6:04 PM

    Web wrote:
    Will the control file recover automatically when storage B is brought back online, or does it have to be repaired manually?
    No ... it can be recovered manually. Either you create another controlfile, or you move the controlfile location to other online storage (using multiplexing) ...
    --neeraj
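    In practice, when one multiplexed copy has gone stale because its storage was offline, the usual manual repair is simply to overwrite it with a current copy while the instance is down (a sketch; the file names are examples):
    SHUTDOWN IMMEDIATE
    -- at the OS level, copy the up-to-date control file over the stale copy:
    --   cp /storageA/control01.ctl /storageB/control02.ctl
    STARTUP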

  • In case of Control File Failure, Create Control File cmd how get scn?

    The following lines I picked from:
    http://asktom.oracle.com/pls/asktom/f?p=100:11:0::::P11_QUESTION_ID:5033895918209
    ======================================================
    1. We can use 'alter database rename file' at the mount stage to rename any datafile. Or is it not possible to rename the system datafile like this? Why?
    2. What happens to the SCN information in the controlfile when a controlfile is recreated? How will the database sync the SCN with that of the datafiles?
    If I issue a 'backup controlfile to <file>' at 8 am and then restore that controlfile binary backup at 10 am and try to open the database, it will give me a control file old error. I understand that it is because the SCN is not in sync. But if I issue a 'backup controlfile to trace' at 8 am and use that script to recreate a new controlfile at 10 am, why don't I get the error? Where does it get the SCN information then?
    So what is the use of taking a binary copy of the controlfile? It looks like having a 'backup controlfile to trace' script is better than a binary backup. Do you agree? Why/why not?
    Followup August 16, 2002 - 2pm US/Eastern:
    1) you could but I just always did it with the create controlfile statement.
    When moving system -- I do it that way
    When moving ANY OTHER tablespace -- I just offline it, move the files, rename the files, and online it.
    2) it just happens.
    The control file you create will read the files to figure out what is up.
    I agree, I've never used a binary controlfile backup myself.
    =========================================================
    My question - in point 2 above, where does it get the SCN information, and how does the control file sync the SCN with the datafiles?

    1. The CREATE CONTROLFILE reads SCNs from the datafiles. If the database was last shut down cleanly, all the datafiles are "non-fuzzy" and have the same SCN (as of the shutdown checkpoint). If the database or some of the files are from a hot backup, you cannot open the database because the SCN of some files is older (lower) than that of the others -- that is why a RECOVER (DATABASE or DATAFILE) is required.
    See http://web.singnet.com.sg/~hkchital/Incomplete_Recovery_with_BackupControlfile.doc
    2. I'm not sure I agree with Tom Kyte's response
    "I agree, I've never used a binary controlfile backup myself. "
    to the question
    "So what is the use of taking a binary copy of the controlfile. Looks like having a 'backup controlfile to trace' script is better than a binary backup. Do you agree? Why/whynot?"
    If you have lost your database (storage/filesystem failure) and all your datafiles are gone, you cannot simply do a CREATE CONTROLFILE from a trace, because the CREATE CONTROLFILE has to read and verify all the datafiles specified in the CREATE statement. If you have an RMAN repository, you can use that to restore your database files, but otherwise the RMAN information about backups and backupsets is only in the binary controlfile.
    That is why it is important to take binary controlfile backups either manually or
    using RMAN or using CONFIGURE CONTROLFILE AUTOBACKUP ON.
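    For reference, the script that BACKUP CONTROLFILE TO TRACE generates looks roughly like the sketch below (database name, paths and sizes are placeholders). It is while processing the DATAFILE list that Oracle reads each file header and picks up the checkpoint SCNs discussed above:
    CREATE CONTROLFILE REUSE DATABASE "ORCL" NORESETLOGS ARCHIVELOG
        MAXLOGFILES 16
        MAXDATAFILES 100
    LOGFILE
      GROUP 1 '/u01/oradata/orcl/redo01.log' SIZE 50M,
      GROUP 2 '/u01/oradata/orcl/redo02.log' SIZE 50M
    DATAFILE
      '/u01/oradata/orcl/system01.dbf',
      '/u01/oradata/orcl/users01.dbf'
    CHARACTER SET AL32UTF8;
    -- the SCNs come from the datafile headers read here; if they do not all match,
    -- media recovery is required before the database can be opened
    RECOVER DATABASE;
    ALTER DATABASE OPEN;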
