Recovering from an archive gap
Hi All,
Using Oracle 11gR2 on RHEL 5.6. My primary and standby have different locations for datafiles and redo log files.
Since the standby has a big gap behind the primary due to a network outage, I plan to do an incremental roll-forward.
I have previously done a roll-forward where the file locations were the same, with no issues.
I intend to follow the steps in http://docs.oracle.com/cd/B19306_01/server.102/b14239/scenarios.htm#CIHIAADC.
Now, as per the Oracle reference above, why do I need to do the following? I have already used db_file_name_convert and log_file_name_convert in my pfile to create the Data Guard configuration.
1. Remove all online logs/standby logs in the standby directories
2. On the standby, clear all standby redo logs
If I do need to do this, will the above files be created automatically when I start the MRP?
Please advise.
mseberg wrote:
OK.
First off, you have a setup problem if your archive logs are no longer on your primary. If RMAN removed them, you should configure its deletion policy to APPLIED ON STANDBY:
RMAN> configure archivelog deletion policy to applied on standby;
Do I have to set this on both the Primary and the Standby?
I answered your question because I have dealt with some large gaps; however, I have not done what you are trying. My friend CKPT has this excellent document:
RMAN Incremental Backups to Roll Forward a Physical Standby Database
http://www.oracle-ckpt.com/category/dataguard/page/6/
I have checked the link earlier; it doesn't cover my situation. My file locations/mount points are different.
If my MRP is stopped and I delete my standby and online redo logs on the standby, will they be created automatically on the standby, based on the restored control file, when I start managed recovery? This step is as per my first link.
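For what it's worth, on standbys the clear-and-recreate step is usually done with ALTER DATABASE CLEAR LOGFILE GROUP, which re-creates the files rather than leaving you to delete them at the OS level. A minimal sketch (the group numbers are placeholders; check V$LOGFILE and V$STANDBY_LOG on your standby first):

```sql
-- On the standby, with managed recovery stopped:
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE CANCEL;

-- Clear each online and standby redo log group; CLEAR LOGFILE
-- re-creates the files on disk at the paths recorded in the
-- control file:
ALTER DATABASE CLEAR LOGFILE GROUP 1;
ALTER DATABASE CLEAR LOGFILE GROUP 2;
ALTER DATABASE CLEAR LOGFILE GROUP 3;

-- Restart managed recovery afterwards:
ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT FROM SESSION;
```

Because the standby control file carries the converted file names (via log_file_name_convert), the cleared groups should be re-created automatically, provided the target directories exist.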
Similar Messages
-
Standby archive gap error
I have created a standby database on the same site where my primary
database is running. Everything works fine, but the problem I am facing
is that I am not able to recover my standby database, although the log
transport service is archiving to both directories (primary and standby).
When I query archive log list on the primary database:
Primary database
Database log mode Archive Mode
Automatic archival Enabled
Archive destination d:\oracle\admin\db01\arch
Oldest online log sequence 8
Next log sequence to archive 10
Current log sequence 10
Standby Database
Database log mode Archive Mode
Automatic archival Enabled
Archive destination d:\oracle\admin\stdby\arch1
Oldest online log sequence 5
Next log sequence to archive 5
Current log sequence 10
Why is this archive gap being generated on the standby database?
And how do I recover my standby database up to my primary database?
Please don't just post a link; I've already gone through them.
Please solve it; the help would be highly appreciated.
Hi,
The standby database doesn't generate archive logs; it only reapplies the archive logs sent by the primary database (into a second archive destination).
In the init.ora of your standby database:
Are the standby database parameters standby_archive_dest and log_archive_dest_1 equal?
The standby_archive_dest parameter (on the standby database) must match the log_archive_dest_2 parameter (on the primary database); is that the case?
Is your standby database in managed recovery mode?
Is the standby database listener started?
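A quick way to compare these settings from SQL*Plus (a sketch; run each part on the side indicated):

```sql
-- On the standby:
SHOW PARAMETER standby_archive_dest
SHOW PARAMETER log_archive_dest_1

-- On the primary:
SHOW PARAMETER log_archive_dest_2

-- On the standby, to confirm managed recovery (MRP0) is running:
SELECT process, status, sequence# FROM v$managed_standby;
```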
Nicolas. -
How to recover database without archive files
Hello,
I have a test database without archive files and I need to know how to recover it without using archive files.
What is the SQL command to recover a database without archives?
Thanks in advance for your help.
Regards,
Carles
There is quite a bit of information missing from this posting, and as a support person I would like to find out more when I see postings like this:
1. This person wants to "test" his backup and recovery in NOARCHIVELOG mode.
2. He/she did not say whether the database had already crashed or had a problem.
3. He/she did not give any indication of the kind of scenario he wants to recover from.
4. He/she did not mention the state of the database at the time of the posting.
5. He/she did not mention the release of the Oracle Server or the platform.
6. He/she did not mention the type of backup already in place (although a full COLD backup was mentioned later). How "genuine" is that backup?
7. What actions have already been taken to attempt recovery?
In my support experience, I normally go through a list of questions with the user/client to get the full picture of the current status of the database before giving answers that could worsen the situation, especially in a disaster recovery scenario.
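To illustrate just the scenario the poster seems to be in (NOARCHIVELOG mode with a full cold backup): without archives there is no roll-forward, so recovery is essentially a restore. A sketch, with the restore step itself done at the operating-system level:

```sql
SHUTDOWN IMMEDIATE
-- (restore ALL datafiles, control files and online redo logs
--  from the cold backup at the operating-system level)
STARTUP MOUNT
-- If the online redo logs were restored with the backup:
ALTER DATABASE OPEN;
-- If the online redo logs were NOT part of the backup:
-- RECOVER DATABASE UNTIL CANCEL;   -- type CANCEL immediately
-- ALTER DATABASE OPEN RESETLOGS;
```

Any changes made after that cold backup are lost; that is the inherent trade-off of NOARCHIVELOG mode.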
Personally, I would suggest you have a look at the list above. One of the most important items is number 3 (the recovery scenario), then numbers 4, 6 and 7. Number 7 will determine the situation of your database. So, when you ask for SQL commands to use, there are different commands for different kinds of recovery depending on what has gone wrong or what kind of file is damaged (datafiles, logfiles, control files, one tablespace, one table, entire database gone, etc.). -
Logical repositories for Document Archiving
Hi.
We are changing all our content repositories to Logical Repositories and it has worked 100% for data archiving objects. (No more re-customizing after a copy) But this does not seem to work for Document archiving.
OAC3, field Cont.Rep.ID only sees selective repositories. Investigation showed that table TOAAR only caters for 2 CHAR on the ARCHIV_ID, which is fine in our case, because the logical repositories have been created as such. TOAAR however does not contain the logical repository entries.
Please advise how we can proceed to also make use of logical repositories for Document archiving.
Thanks
Come on guys, there surely must be somebody who knows something about the use of logical repositories with document archiving.
If somebody can shed some light, it will be highly appreciated.
Thanks -
How to recover a lost archive log file?
How do I recover a lost archive log file? Do I need to open the database with RESETLOGS after recovery?
No.
I think he might rewrite the question in his own words.
I guess in the event of archive logs lost during database recovery, you have to open with RESETLOGS and say goodbye to some of the data. -
A question on different options for data archival and compression
Hi Experts,
I have a production database of about 5 terabytes and about 50 GB in development/QA. I am on Oracle 11.2.0.3 on Linux. We have RMAN backups configured. I have a question about data archival strategy: to keep the OLTP database size optimal, what options can be suggested for data archival?
1) Should each table have an archival strategy?
2) What is the best way to archive data; should it be sent to a separate archival database?
In our environment, an archival strategy is defined for only about 15 tables. For these tables, we copy their data every night to a separate schema meant to store this archived data, and then transfer it eventually to a different archival database. For all other tables, there is no archival strategy.
What are the different options and best practices that can be reviewed to put a practical data archival strategy in place? I will be most thankful for your inputs.
Also, what are the different data compression strategies? For example, we have about 25 tables that are read-only. Should they be compressed using the default Oracle 9i basic compression (ALTER TABLE ... COMPRESS)?
Thanks,
OrauserN
You are using 11g, and in 11g you can compress read-only as well as read-write tables; both are candidates for compression. This will save space and could increase performance, but always test it first. This was not an option in 10g. Read the following docs:
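A minimal sketch of the two variants (the table names are placeholders; note that OLTP compression requires the separately licensed Advanced Compression option):

```sql
-- Basic compression: compresses only direct-path loads,
-- so it suits read-only / bulk-loaded tables.
ALTER TABLE sales_history COMPRESS;
-- Rebuild the segment so existing rows are actually compressed:
ALTER TABLE sales_history MOVE COMPRESS;

-- OLTP compression (11g Advanced Compression option): also
-- compresses blocks touched by conventional INSERT/UPDATE.
ALTER TABLE orders COMPRESS FOR OLTP;
```

After a MOVE, remember that indexes on the table become unusable and must be rebuilt.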
http://www.oracle.com/technetwork/database/storage/advanced-compression-whitepaper-130502.pdf
http://www.oracle.com/technetwork/database/options/compression/faq-092157.html -
Two entries for each archive log in v$archived_log
Hi,
I have noticed that there are two entries for each archive log. Why is this so?
I ran the following command:
==================
set pages 300
set lines 120
ALTER SESSION SET nls_date_format='DD-MON-YYYY HH24:MI:SS';
SELECT sequence#, first_time, next_time
FROM v$archived_log
ORDER BY sequence#;
==================
output is as follows.
==================
1436 24-FEB-2012 00:04:09 24-FEB-2012 08:24:21
1436 24-FEB-2012 00:04:09 24-FEB-2012 08:24:21
1437 24-FEB-2012 08:24:21 24-FEB-2012 15:45:01
1437 24-FEB-2012 08:24:21 24-FEB-2012 15:45:01
1438 24-FEB-2012 15:45:01 24-FEB-2012 15:45:04
1438 24-FEB-2012 15:45:01 24-FEB-2012 15:45:04
1439 24-FEB-2012 15:45:04 24-FEB-2012 15:45:57
1439 24-FEB-2012 15:45:04 24-FEB-2012 15:45:57
1440 24-FEB-2012 15:45:57 24-FEB-2012 17:26:41
1440 24-FEB-2012 15:45:57 24-FEB-2012 17:26:41
1441 24-FEB-2012 17:26:41 24-FEB-2012 18:40:07
1441 24-FEB-2012 17:26:41 24-FEB-2012 18:40:07
1442 24-FEB-2012 18:40:07 24-FEB-2012 19:36:17
1442 24-FEB-2012 18:40:07 24-FEB-2012 19:36:17
1443 24-FEB-2012 19:36:17 24-FEB-2012 19:36:18
1443 24-FEB-2012 19:36:17 24-FEB-2012 19:36:18
==================
Regards
DBA.
I have noticed that there are two entries for each archive log. Why is this so?
Mseberg already mentioned it; in a little more detail:
Check the NAME column in v$archived_log.
One entry refers to the local destination, LOG_ARCHIVE_DEST_1.
The other entry refers to your standby/DR location, but it shows only the service descriptor instead of a full archive file name.
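To see the duplication per destination at a glance, something like this works (a sketch):

```sql
SELECT dest_id, COUNT(*) AS logs
FROM   v$archived_log
GROUP  BY dest_id
ORDER  BY dest_id;
```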
select dest_id,name from v$archived_log where name is not null and completion_time like '%24%FEB%'
DEST_ID NAME
1 +ORAARCHIVE/prod1/archivelogs/arch_0001_0671689302_0000240097.arc
2 (DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=sldb1srv)(POR
T=9101)))(CONNECT_DATA=(SERVICE_NAME=prod_sldb1srv_XPT)(INSTANCE_N
AME=prod1)(SERVER=dedicated)))
Edited by: CKPT on Feb 24, 2012 8:26 PM -
Archive gap between primary and standby
Hi,
I've a production environment with 2node RAC with ASM as primary and standalone standby with datafiles stored on the filesystem.
On the standby side there is one archive gap, and the standby is not applying the log even after the archivelog arrives.
How to overcome it?
Thanks
Hello;
Depending upon the query you are using, "Real time apply" might show as one log behind. Is this possible?
Example from mine :
STANDBY SEQUENCE# APPLIED COMPLETIO
STANDBY2 10711 YES 31-MAY-12
STANDBY2 10712 YES 31-MAY-12
STANDBY2 10713 YES 31-MAY-12
STANDBY2 10714 YES 31-MAY-12
STANDBY2 10715 YES 31-MAY-12
STANDBY2 10716 YES 31-MAY-12
STANDBY2 10717 YES 31-MAY-12
STANDBY2 10718 YES 31-MAY-12
STANDBY2 10719 YES 31-MAY-12
STANDBY2 10720 YES 31-MAY-12
STANDBY2 10721 YES 31-MAY-12
STANDBY2 10722 YES 31-MAY-12
STANDBY2 10723 YES 31-MAY-12
STANDBY2 10724 YES 31-MAY-12
STANDBY2 10725 NO 01-JUN-12
So sequence 10725 is still in progress, so it shows 'NO'.
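For reference, output in that shape typically comes from a query along these lines (a sketch; the literal standby name and the dest_id are assumptions):

```sql
SELECT 'STANDBY2' AS standby,
       sequence#,
       applied,
       completion_time
FROM   v$archived_log
WHERE  dest_id = 2
ORDER  BY sequence#;
```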
Can you post the query you are using?
Best Regards
mseberg
Edited by: mseberg on Jun 14, 2012 7:28 AM -
Recovering documents from archiving
Hi,
Please guide me on how we can recover documents from archiving,
e.g. documents like consumption billing documents and invoicing documents.
Thanks and Regards
Hello,
Please see the following link, which describes the process of retrieving archived files:
http://help.sap.com/saphelp_nw70/helpdata/en/8d/3e4f9f462a11d189000000e8323d3a/frameset.htm
I hope this helps.
With thanks
Olivia -
Regarding Residenct Time for Z archiving Object
Hello experts,
For standard archiving objects we have application-specific Customizing, where we define the residence time. But in our case we are using Z archiving objects, and we don't have application-specific Customizing. Could someone tell me where exactly we have to define the residence time?
Regards
RR
Hi RR,
Not all standard archiving objects have the feature of application-specific Customizing to define the residence time. For some objects (co_ml_idx, fi_sl_data, rl_ta, to quote a few examples), we have to limit/control the residence time using the archive variants (the variant for the write job).
For Z objects you could follow the same approach: control the residence time using the selection variant for the write job. An essential selection variable for doing this will be a date field (in the form of date / year / period etc.) in the archive selection variant.
Hope this answers your question.
Thanks,
Naveen -
There is still a purchase order commitment for ORD X for CO_ORDER archiving object
Dear Experts,
I am facing an issue with the CO_ORDER archiving object. I am getting errors like
" there is still a purchase order commitment for ORD 200060" and
"order 200060 still has at least one purchase order that you need to do".
Screenshot attached for reference.
So please help me to solve this issue.
Thanks in advance,
Ningappa S M
Hello Manju Sir,
The issue with setting the deletion indicator is now resolved: I set residence time 1 to 0, tried to archive, and it succeeded. But I am facing the errors "there is still a purchase order commitment", "there is still a purchase requisition assigned to the order" and
"still has at least one purchase order that you need to delete" etc. for all orders when I set residence time 1 to 39 months (even when I set residence time 1 to zero I get the same error messages) and residence time 2 to 0. For these orders the deletion flag itself is not getting set, so please help me with this.
Thanks,
Ningappa -
Recommendation for storing/archiving AVCHD material
I'm sure this isn't a new question but after a bit of searching I still wasn't able to find it on this forum.
I have one of the newish Sony consumer cameras with flash memory. I use Log and Transfer in FCE to import the footage and that's all good. However I have noticed that when I use the Sony software that came with the camera to import the footage (Windows only) the files end up being a lot smaller. Unfortunately I can't use these files with FCE and there are plenty of posts on this forum detailing why not.
My question is: what is the "best practice" method for storing/archiving AVCHD footage? If I keep the .MOV files that FCE creates, then I will be storing 4 or 5 times as much data as the .MTS (and related) files created by the Sony software supplied with the camera. So should I be importing the footage twice, once into FCE and once in Windows, and archiving the smaller (Windows/Sony) files? If I do this, then how can I get FCE to recognise them if I ever need the source files again? Do I have to transfer them back to the camera or use a third-party product to convert them? What do others do?
I back up the entire original AVCHD folder off the camera onto another hard drive, and store and maintain it like an iTunes library of MP3 files. You can rename the top-level AVCHD folder as long as all the subfolders remain intact. Those files are about 1/10th the size of the ingested footage and a lot more manageable. In fact, I never actually Log and Transfer from the camera itself; I just copy the files to backup immediately, then mount whatever folder applies.
I also keep a final uncompressed QuickTime movie of each "finished" project. Eventually, I toss the ingested footage for older projects. This means it would be somewhat difficult to go back and rebuild an older project, but I'm personally more of a "done and next project" person; others might not be.
AVCHD footage takes up no more disk space than HDV once it's imported; it all gets converted to the much larger Apple Intermediate Codec for editing. Your decision is essentially whether you want to back up tapes or back up electronic files of your original media, and how long you want to keep the bloated AIC media on your drive for continual tweaking down the road. Either way, you'll likely want a couple of large hard drives.
-
Regarding sending certificate to the third party for data archiving.
Hi,
I am stuck at one point in SAP. The steps I have followed are:
1. go to transaction code "oac0" (Display content repositories).
2. double click to any archive.
3. Here we can see a "Send Certificate", button.
By pressing this button, an HTTP request is fired over the network, addressed to the content server configured for that archive. Together with it, a certificate is also sent to the server outside SAP.
Generally we do validation of the certificate outside SAP in our application, e.g. in our JAVA code, using iaik jars (security jars).
Now I need to validate this certificate inside SAP itself, prior to it being sent over the network from SAP.
Is this possible in any way?
Mridul,
Depending on your ERP version you can use STRUST to import the certificate into the ABAP system.
Regards,
Salvatore Castro
Sr. Solution Architect -
Need help for new message type for PO archiving
Hello All
I have configured a new message type, ZARR, for the PO archiving process. After configuring all the steps, this condition is not getting picked up by itself, and after maintaining it manually it gives a red flag without any error message. Here I am listing all the steps that I followed for the configuration. Please help me: am I missing something, or is any parameter wrong?
Steps for new Message Type for PO Archiving
1. Defined new Table 513 Purch.Org./Plant.
2. Defined new access sequence Z513 Access Sequence for Archiving.
Following are parameters to the Access Sequence
Access Sequence Number = 21
Table = 513
Description = Purch.Org./Plant
Requirement = 101
Exclusive = Yes (box checked)
3. Maintain Output Type ZARR for PO
General Data
Access Sequence = Z513
Access to condition (Checked)
Multiple Issuing (Checked)
Program = FM06AEND
FORM Routine = CHANGE_FLAG
Default Values
Dispatch Time = Send Immediately (when saving application)
Transmission Medium = Print Out
Storage System
Storage Mode = Archive Only
Document Type = ARCHIVE
Mail Title and texts
Language = EN
Title = Archiving for Legal Requirement
Processing Routines
Transmission Medium = 1 (Print Out)
Program = SAPFM06P
Form Routine = ENTRY_NEU
Form = ZJ_9H_MEDRUCK
Partner Roles
Medium = Print Out
Funct = VN
Name = Vendor
4. Fine-Tuned Control: Purchase Order
Oprat. = 1
CType = ZARR
Name = Testing for Archivin
Short Text = New
Update Print Related Data = Yes (checked)
Oprat. = 2
CType = ZARR
Name = Testing for Archivin
Short Text = Change
Update Print Related Data = Yes (checked)
5. Maintain Message Determination Schema: Purchase Order
Procedure RMBEF1
Step = 100
Cntr = 1
CTyp = ZARR
Description = Testing for Archivin
Requirement = 101
6. Assign Schema to Purchase Order
Procedure RMBEF1
7. Define Partner Roles for Purchase Order
Out. = ZARR
Med = 1 (Printout)
Funct = VN
Name = Testing for Archivin
Name = Vendor
8. Test Condition Maintained in MN04 (Master Data)
Purch. Org. = 0001
Plant = 24
Partner Funct = VN
Medium = 1 (Printout)
Date/Time = 4 Send immediately (when saving application)
Output Device = HUL1
Storage Mode = Archiving only
Thanks
Ankit
The problem is the Exclusive = Yes indicator in the access sequence.
Your other message type has this exclusive indicator as well,
so if the first message is found, no other message will be determined.
To allow several messages in one PO, you must not set this exclusive indicator. -
Best Practice for Data Archiving
Hi All,
I have a query in SAP: when will the business do data archiving in their R/3 systems?
On what basis will they do data archiving?
Is there any minimum period for which data needs to be retained before it is archived?
Before doing the data archiving, what will they do?
Regards
Srini
Hi Srini,
1. SAP suggests implementing a data archiving strategy as early as possible to manage database growth.
However, people usually think of archiving only when they face problems like large data volumes, slow system response times, performance issues, etc.
2. There is a proper way to implement data archiving: the database has to be analyzed first to identify the top DB tables and archiving objects.
3. Based on the DB analysis, the data archiving plan has to be implemented according to the data management guide.
4. There is a minimum period, known as the residence time, that has to elapse before any data can be archived. Once a document is business-complete and has served its minimum required period in the database, it can be archived.
5. Before going for data archiving there are many steps to be followed, like analysis, configuration, etc., which you can see in detail at the link below:
http://help.sap.com/saphelp_47x200/helpdata/en/2e/9396345788c131e10000009b38f83b/frameset.htm
Let me know if this helps you.
-Supriya
Edited by: Supriya Shrivastava on May 4, 2009 10:38 AM