Data Guard: redo apply vs. real-time apply?

Hi all-
Anyone know what the difference is between redo apply (i.e., ALTER DATABASE RECOVER MANAGED STANDBY DATABASE DISCONNECT;) and real-time apply (i.e., ALTER DATABASE RECOVER MANAGED STANDBY DATABASE USING CURRENT LOGFILE DISCONNECT;)? I'm using maximum protection mode (log_archive_dest includes LGWR SYNC AFFIRM), and I'm not seeing any difference between redo apply and real-time apply. In either case, if I update the primary, the transaction gets recorded in both the primary and standby online redo logs. Any thoughts?
Thanks!
Rob

Thanks... it appears, however, that max availability and max protection are not (necessarily) determined by which of the ALTER DATABASE RECOVER... commands (above) is specified:
SQL> alter database recover managed standby database cancel;
Database altered.
SQL> alter database recover managed standby database using current logfile disconnect;
Database altered.
SQL> select protection_mode, protection_level from v$database;
PROTECTION_MODE      PROTECTION_LEVEL
-------------------- --------------------
MAXIMUM PROTECTION   MAXIMUM PROTECTION
SQL> alter database recover managed standby database cancel;
Database altered.
SQL> alter database recover managed standby database disconnect;
Database altered.
SQL> select protection_mode, protection_level from v$database;
PROTECTION_MODE      PROTECTION_LEVEL
-------------------- --------------------
MAXIMUM PROTECTION   MAXIMUM PROTECTION
SQL>
I specified the protection level from the primary when I first set data guard up, with "alter database set standby database to maximize protection," so maybe the two commands are synonymous but there for backward compatibility?
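For what it's worth, the difference should only be in when the standby applies the redo (as it arrives in the standby redo logs with real-time apply, versus only after the log is archived with plain redo apply), not in how the redo is shipped, which would explain why the protection mode looks the same either way. One place the difference should be visible is the RECOVERY_MODE column of V$ARCHIVE_DEST_STATUS on the standby, which the docs say reports MANAGED REAL TIME APPLY when USING CURRENT LOGFILE is in effect and plain MANAGED otherwise (a minimal check; which DEST_ID is relevant depends on the configuration):
SQL> select dest_id, recovery_mode from v$archive_dest_status;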
Rob

Similar Messages

  • Data Guard - Redo log files

    Hi,
    I have set up Data Guard and everything is fine; archives are being transferred to the standby.
    During configuration I also created standby redo log groups 4, 5, and 6 and copied them to the standby.
    But with real-time apply, the standby does not seem to be using standby redo log groups 4, 5, and 6; when I query v$log it only shows groups 1, 2, and 3.
    It should be using the standby redo logs for maximum availability.
    Please help.
    Thanks in advance.

    There was a similar question here just a few days ago:
    Data Guard - redo log files
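    One thing worth checking (a minimal sketch; the group numbers are simply the ones mentioned above): standby redo logs are not listed in v$log on the standby; they appear in v$standby_log, where STATUS shows ACTIVE for a group that is currently receiving redo. So a query like the following should show whether groups 4, 5, and 6 are actually being used:
    SQL> select group#, thread#, sequence#, status from v$standby_log;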

  • How can I get updated data in sub VI in real time?

    Hello.
    I'm making an application using LabVIEW 7.0.
    What I want to do is acquire data while changing some parameters; I get the data each time
    I change the parameters. I can do this in a sub VI by inserting an array component into a 'for' loop.
    As the loop iterates, the parameter sets are changed, and new data is inserted into the array by 'Replace Array Subset'.
    New data fills the array with each iteration.
    However, I can't bring this result array into my main.vi. The sub VI gives me the result array only when
    its processing is done. I want to display the data while the parameter changes are still under
    way.
    Is there any solution to get the output of a sub VI while it is not finished yet?

    Hi Joewun:
    Yes, you can update that data in real time.
    It's done using references and property nodes.
    You can create a reference to the array in your main VI.
    In your SubVI, create a new input for an array reference, insert a Property Node into the For loop where you update the value, and select the Value property.
    This way, when the array updates in the SubVI, you will see the changes in your main VI.
    Attached image:
    Left: Main VI
    Right: SubVI.
    Aitortxo.
    Attachments:
    Update Array Data from SubVI.png ‏54 KB

  • Continuous data acquisition and analysis in real time

    Hi all,
    This is a VI for the continuous acquisition of an ECG signal. As far as I understand, the DAQmx Analog Read VI needs to be placed inside a while loop so it can acquire data continuously, and I need to perform filtering and analysis of the waveform in real time. The way I set up the block diagram means that the data stays in the while loop, and as far as I know the data will be transferred out through the data tunnels only once the loop finishes executing; clearly this is not real-time data processing.
    The only way I can think of to fix this problem is to place another while loop around the filtering-stage VIs and use some sort of shift register to pass the data to the second loop. My question is whether or not this would introduce some sort of delay, and whether or not this would be considered real-time processing. Would it be better to place all the VIs (acquisition and filtering) inside one while loop, or is this bad programming practice? Another function I need is to save the data in a file, but only when the user wants to do so.
    Any advice would be appreciated.

    You have two options:
    A.  As you mentioned, you can place code inside your current while loop to do the processing.  If you are clever, you won't need to place another while loop inside your existing one (nested loops).  But that totally depends on the type of processing you are doing.
    B.  Create a second parallel loop to do the processing.  This would decouple the processes to ensure that the processing does not hinder your acquisition.  See here for more info.
    Your choice really depends on the processing that you plan to perform.  If it is processor-intensive, it might introduce delays as you mentioned.  
    I would recommend you first try placing everything in the first loop and see if your DAQ buffer overflows (you can monitor the buffer backlog while it's running). If so, then you should decouple the processes into separate loops.
    Whether or not "this would be considered to be real-time processing" is a loaded question. Most people on these forums will say that your system will NEVER be real-time because you are using a desktop PC to perform the processing (note: I am assuming this is code running on a laptop or desktop?). It is not a deterministic system, and your data is already "old" by the time it exits your DAQ buffer. But the answer to your question really depends on how you define "real-time processing". Many laypeople will define it as the processing of "live data"... but what is "live data"?

  • BAM data object lookup fields for real-time data

    I have two tables T1 and T2 and associated data objects DO1 and DO2. T2 has two columns, EventId and AgencyId, which form a composite primary key. I am creating an Updating Ordered List report to show data from both tables T1 and T2. How can I use data object lookups to achieve this? Is this possible, given that the data feed will be real-time and I don't know whether T1 or T2 data will arrive earlier? I need to show all data in T1 and the relevant Status column data from T2 based on EventId and AgencyId. Please suggest whether lookup fields will help achieve my use case in a real-time scenario.
    Thanks

    Is there a working example of how the lookup fields work for BAM data objects?
    Thanks

  • Data Mining model usage in real time application

    Does Oracle have a product similar to Intelligent Miner for Scoring from IBM? Or are there any technical white papers on how to use analytical models, exported via PMML, in an Oracle data warehouse environment? I read on the Data Mining Group web pages that Oracle is a member there, but I could not find any documentation on how to use analytical models exported with PMML in an Oracle environment.

    Another way to do this is to specify your shared variable to be a custom control and create a control that simply has a 2-D array in it.  I've attached a zipped LabVIEW 8.2 project that shows both methods.  Enjoy!
    Becky
    Becky Linton
    Field Engineer - Michigan
    National Instruments
    Attachments:
    2DArraySharedVariable.zip ‏110 KB

  • Data guard performance problem (rac to single instance)

    I have a table that holds GPS data, and the GPS table has a lot of rows, about 5 million.
    I am using RAC (2 nodes, 11gR2). The standby database is a single-instance Data Guard standby,
    and the standby machine's CPU is slower than the RAC machines'; the RAC nodes have 15k rpm disks, the standby has 7200 rpm disks.
    So I don't want the GPS table on the Data Guard system; I don't want the GPS table's DML commands (delete, insert) to run there. I think that may improve performance.
    Is this possible? What is your advice?
    Any feedback makes me happy.
    Best regards

    It's not possible with Data Guard, but you can use Streams or GoldenGate for this purpose. Have a look at the Data Guard performance tuning guides; maybe there is something you can fix in the configuration to make it faster.
    [Data Guard Redo Apply and Media Recovery Best Practices|http://www.oracle.com/technetwork/database/features/availability/maa-wp-10grecoverybestpractices-129577.pdf]
    [Redo Transport and Network Best Practices|http://www.oracle.com/technetwork/database/features/availability/maa-wp-11gr1-activedataguard-1-128199.pdf]
    I don't know of an 11g version of these docs, but they should still help.
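    If you do go the Streams route, a rough sketch of excluding the GPS table from capture (the owner, table, capture, and queue names below are placeholders, not taken from your setup) would be to add it to the capture process's negative rule set:
    BEGIN
      DBMS_STREAMS_ADM.ADD_TABLE_RULES(
        table_name     => 'APP.GPS_DATA',            -- placeholder owner.table
        streams_type   => 'capture',
        streams_name   => 'capture_stream',          -- placeholder capture process name
        queue_name     => 'strmadmin.streams_queue', -- placeholder queue
        include_dml    => TRUE,
        include_ddl    => TRUE,
        inclusion_rule => FALSE);  -- FALSE adds the rules to the negative rule set (exclude)
    END;
    /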

  • Can streams and data guard run together

    I need to set up a staging database to run streams from.
    Can I use data guard from the production database to the staging database and
    streams on top of that?
    Has anyone done this?

    Correction:
    Data Guard Redo Apply can be used to provide high availability/disaster recovery for a Streams database. For best practices when using Data Guard Redo Apply with Streams, see http://www.oracle.com/technology/deploy/availability/pdf/MAA_WP_10gDataGuardRoleTransitionsStreams.pdf
    Data Guard SQL Apply cannot be used to provide high availability/disaster recovery for a Streams database.

  • What is the best way to store data on a network hard drive using CompactRIO RTOS and LabVIEW Real-Time?

    Hi!
    I'm starting a project in which I have a low-rate stream of data to read in a real-time environment. I need to store this data on a network hard disk without any PC running a standard OS; I just have the CompactRIO RTOS. How can I send this data to the network drive? Is it possible to just "write" the data as I would to a standard file in LabVIEW?
    Thanks for any help!!
    Il Conte
    dr. Valentino Tontodonato

    Il Conte,
    you have to keep in mind that normally the RT OS does not map drives other than the Compact Flash that it has onboard (C:\). There are exceptions such as
    -cFP-20xx which may have additional Flash Drives which can be addressed as D:\ Drive
    -CVS systems with IEEE-1394 interface can write/read to Firewire external Harddrives
    -PXI Controllers booted from a Floppy disk may map the floppy drive as A:\
    One solution to your needs may be to write data to files locally on your onboard CompactFlash and then transfer these files to a network location using FTP, provided the network drive you are pointing to supports FTP.
    Let us know if you need any more help with this,
    AlessioD
    National Instruments

  • Real-Time Data Acquisition

    What is real-time data acquisition, what are real-time queries and daemon updates, and how do we extract and load data?
    Please explain in detail.
    regards
    GURU

    Hi,
    Real-Time Data Acquisition –BI 2004s
    Real-time data acquisition supports tactical decision-making. It also supports operational reporting by allowing you to send data to the delta queue or PSA table in real-time.
    You might have complex reports in your BI system that help in making decisions on the basis of data from your transactional system. Sometimes (quarter close, month end, year end...) a single change in the transactional data can change your decision, so it is very important that each record of the company's transactional data is reflected in the BI system at the same time as it is updated in the transactional system.
    Using the new Real-Time Data Acquisition (RDA) functionality of the NetWeaver BI 2004s system, we can now load transactional data into the SAP BI system every single minute. If your business demands real-time data in SAP BI, you should start exploring RDA.
    The source system for RDA can be an SAP system or any non-SAP system. SAP provides most of the standard DataSources as real-time enabled.
    The other alternative for RDA is Web services; even though Web services are intended for non-SAP systems, for testing purposes I am implementing a Web service (RFC) in an SAP source system here.
    An example would be a production line where the business wants information about defective products in real time so that production can be stopped before more defective goods are produced.
    In the source system, the BI Service API must be at least version Plug-In-Basis 2005.1, or for 4.6C source systems, Plug-In 2004.1 SP10.
    Real-Time Data Acquisition -BI@2004s
    http://help.sap.com/saphelp_nw2004s/helpdata/en/42/f80a3f6a983ee4e10000000a1553f7/content.htm
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/230d95df-0801-0010-4abb-ace1b3d197fd
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3db14666-0901-0010-99bd-c14a93493e9c
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/3cf6a212-0b01-0010-8e8b-fc3dc8e0f5f7
    http://help.sap.com/saphelp_nw04s/helpdata/en/52/777e403566c65de10000000a155106/content.htm
    https://www.sdn.sap.com/irj/sdn/webinar?rid=/library/uuid/230d95df-0801-0010-4abb-ace1b3d197fd
    Thanks,
    JituK

  • Data Guard with Oracle 9i and 10g -- have you done it?

    We have an ERP system (JDEdwards, db size approx 600 GB) and we'd like to copy the data in this system to a database on another computer. We'd use the remote database for reporting, so we need it to be available. We are looking at using Data Guard running in Logical Standby mode.
    The ERP database is Oracle 9i, and we are considering having the Logical Standby database be 10g. Have you done this and if so, what was your experience?
    Here's why we want to do this. There are many tables in the ERP database that contain descending indexes. We can't change this, and it is not supported with Data Guard under Oracle 9.2. Tables with descending indexes do not copy to the logical standby database.
    We were told that this is corrected in a newer release of Oracle, and we were wondering if a 10g database would properly copy tables with descending indexes.
    Thanks in advance for any experiences you can share.
    Best Regards,
    Mike

    What you are trying to do might not be possible, because when you create a logical standby you have to create a physical standby first and convert it, and the primary and standby have to be the same version.
    In the future it might be possible, because 10g will support rolling upgrades of logical standbys.
    However, even if it is possible, you would have to go through a lot of pain to set it up and maintain it, because Oracle doesn't support that setup.
    What you could try are Streams and replication; I would say Streams is the more interesting one, because Oracle says: "Oracle Streams and Oracle Data Guard (including Data Guard SQL Apply) are independent features based on some common underlying infrastructure and technology."
    http://www.oracle.com/technology/deploy/availability/htdocs/DataGuardStreams.html
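    Before deciding, it may also be worth checking on the primary which tables SQL Apply would skip; DBA_LOGSTDBY_UNSUPPORTED lists tables with unsupported data types and attributes (whether the tables with descending indexes appear there may depend on the release, so treat this as a rough check only):
    SQL> select distinct owner, table_name from dba_logstdby_unsupported order by owner, table_name;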

  • Streams Plus Data Guard

    Hi,
    Looking at http://otn.oracle.com/deploy/availability/htdocs/DataGuardStreams.html I see that Data Guard and Streams can be deployed on the same primary database.
    Has anyone done this and if so are there any pitfalls.
    Also, how much of the Streams environment is propagated by Data Guard, and how much needs to be re-instantiated once the standby is brought into the primary role after switchover/failover? This is for Oracle 9.
    Any help/hints/pointers much appreciated.

    Graham,
    Streams can co-exist with Data Guard in physical mode, but Data Guard SQL Apply mode and Streams are not supported on the same database.
    I have not implemented such a setup. You can review the "Streams High Availability Environments" chapter in the Oracle9i Streams Release 2 (9.2) [Part Number A96571-02] documentation for details on what to do when a failover occurs. This chapter also provides advice on when to use Streams and when to use Data Guard.
    Note: The Streams manual was updated with the 9.2.0.2 release, so check the part number of the book you reference. If it doesn't match the one listed previously, view the more current documentation on TechNet.

  • Producer/consumer architecture over network variables with Real-Time target.

    Hi all,
    I am maintaining a producer/consumer data acquisition program to be deployed on a real-time target. The main code is deployed to and run on the real-time target during experiments, but I was having trouble because the program was originally designed to write all experimental data to disk on the real-time target during acquisition, which puts the whole experiment at the mercy of the hard drive. I am now trying to rework the code so that the host takes care of logging, so that my time-critical loops don't have to wait.
    I am currently using LabVIEW 8.5
    I have two questions:
    First, how can I programmatically call the data-logging subvi on the host so that it runs in parallel with the main vi which runs the experiment and collects data on the real-time? I have attached the test code that I have been working with to figure this out, but it does not run the logging vi continuously in the background. I am aware that there is better functionality for this in newer versions of LabVIEW, but I would prefer not to upgrade unless there is no other option. I would like to be able to run my data-generating vi and have it start the data logging remotely.
    Second, is there a way in the host VI to read values off the network variable using an event structure rather than polling it for updates?
    Any help would be sincerely appreciated!
    Attachments:
    testRemoteLogging.zip ‏124 KB

    VI server
    Mark the target VI as served on the machine on which it will execute and use VI server Call by reference to invoke the served VI.
    This used to be taught as THE way to communicate synchronously with an RT app.
    Ben
    Ben Rayner
    I am currently active on.. MainStream Preppers
    Rayner's Ridge is under construction

  • Real time transaction response - unable to locate the root element

    Hi There
    Has anyone integrated webMethods with Data Services 3.2 for real time transactions?
    We are trying to create a real time dataflow, initiated by webMethods.  We are sending the request to the Access Server which runs our
    real time dataflow and does process the request.  However, our return response is null.  No data in the response back to webMethods.  We have
    tested this in soapUI, and our transaction does work.  We get the correct response back. 
    So my question is has any experienced this issue with webMethods?  Any suggestions or ideas?
    Here is the error
    ===========================================
    Path name: 
         PortType: Real-time_Services
         Operation: Realtime_address_cleanse
    WSDL code:  S-9043
    [ISS.0092.9043] Schema Error: 1
         pathName: null
         errorCode: CONV-004
         errorMessage: [ISC.0082.9104] DataOut - unable to locate the root element
         identifier: null
    ===========================================
    PS: XMLSpy does not work either.
    Please let me know
    Thanks
    Kurt

    The problem was the generated test SOAP message from NetBeans. When built, the interface had more levels than described below. I've also had similar issues caused by bad namespaces - either not defined properly or not picking up the default namespace.

  • Replication / Duplicating a Production Client in Real Time

    Hi Guys,
    Please, I have a mandate: we want to replicate our production client (client A) into another client (client Z) in real time, such that every transaction carried out in production client A is replicated immediately in client Z. This became necessary so that, should client A go down for any reason, client Z can take over immediately.
    How do we go about this?
    What technology do we use? ALE?
    Please a detailed step-by-step configuration will be appreciated.

    @Soumyaprakash: yes.
    @Breakpoint: Yes... that is the whole idea, a disaster recovery plan. To minimize data loss, we want production client A to replicate into backup production client Z in real time.
    I have been assigned the task, hence I would appreciate detailed technical guidance on how to go about this, not a high-level description.
    I was thinking of configuring ALE, but I want to know whether data will be replicated in real time, i.e., for example, if a customer posts a transaction in client A, the same will be replicated in client Z at the same time.
    Suggestions please.
