Best way to synchronize DR standby database

Hi,
My setup consists of three database installations in three data centers, running Oracle 11g Release 2 Enterprise Edition on AIX 6.1.
The first data center will host the primary database, while the second data center will act as a hot standby site, hosting the physical standby database in a Data Guard configuration. This second data center is just a few hundred meters from the primary site and is connected with dark fiber.
The third site is a DR site located about 300 km from the first two centers, with a WAN link of around 2 Mbps. This third site should act as a disaster recovery site in case a natural calamity destroys the first two sites.
We purchased full Oracle licenses for the first two sites. However, for the third site, we're only allowed to open the database for up to 10 days in a year. My question is: how can we keep this third database in sync with the first two? Can we use full backups to recover the database periodically, say every month? Or is there another option using archived logs, considering that the database can only be open for a maximum of 10 days in a year?
Both the primary and the DR sites have a Tivoli server installed, complete with tape libraries, and we're supposed to take backups even at the DR site.
Regards,
dula

I think your potential solutions depend in part on the meaning of 'open' and on what kind of database setup you have at the third site. If the third site is a standby database and 'open' means in use for any purpose other than recovery mode, then you should be able to start it and apply the logs so that the database is ready should the need arise.
If 'open' means the database is started and active, then all you can really do is install Oracle at the third site and leave it shut down. In this case what I would do is use the RMAN DUPLICATE command to build the initial database, start it, test it, and shut it down. Then you transfer your primary database RMAN backup to the site every day, along with copies of the archived redo logs. In the event of a primary/secondary site failure you just need to run an RMAN restore of the newest backup set, then recover using the copied archived redo logs, and you are up and running.
There will be a time delay equal to the time it takes to run the restore and recovery, but you can set up the remote disaster site, test it, and be pretty sure it will work without using up your 10-day limit.
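To make that restore-and-recover step concrete, here is a minimal RMAN sketch of what might run at the DR site once the day's backup pieces and archived log copies have been transferred. The command file, all paths, and the file names are illustrative assumptions, not taken from the thread:

# restore_dr.rman -- run as: rman target / cmdfile=restore_dr.rman
STARTUP NOMOUNT;
# restore the control file from one of the copied backup pieces (piece name is hypothetical)
RESTORE CONTROLFILE FROM '/backup/from_primary/ctl_backup.bkp';
ALTER DATABASE MOUNT;
# register every copied backup piece and archived log with RMAN
CATALOG START WITH '/backup/from_primary/' NOPROMPT;
RESTORE DATABASE;
# applies all cataloged archived logs; it stops with an "archived log not found" style error
# when it runs out of logs, which is expected when recovering from transported copies
RECOVER DATABASE;
ALTER DATABASE OPEN RESETLOGS;

Everything up to the final OPEN RESETLOGS leaves the database mounted but not open, so periodic rehearsals of the restore and recovery need not consume the 10-day allowance; whether a mounted instance counts against that license term is something to confirm with Oracle.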
An alternate license arrangement might be to run RAC at the two local data centers and have a standby database at the remote site.
HTH -- Mark D Powell --

Similar Messages

  • What is the best way to import a full database?

    Hello,
    Can anyone tell me what is the best way to import a full database called test into an existing database called DEV1?
    When importing into an existing database, do you have to drop the existing users? Say the pinfo and tinfo schemas are already there: do I have to drop them and recreate them, or how does it work when you import a full database?
    Could you please give step-by-step instructions....
    Thanks a lot...

    Nayab,
    http://youngcow.net/doc/oracle10g/backup.102/b14191/rcmdupdb005.htm
    A suggestion: please don't use external sites which host Oracle docs, since there can be no assurance that they update their content with the latest corrections. You can see the updated part number on the actual doc site from Oracle:
    http://download.oracle.com/docs/cd/B19306_01/backup.102/b14191/rcmdupdb.htm#i1009381
    Aman....
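    If plain Data Pump (rather than the RMAN duplication described in the linked doc) is acceptable, a minimal command-line sketch of a full export from test and import into DEV1 could look like the following; the directory object, dump file name and account are assumptions for illustration, and TABLE_EXISTS_ACTION=REPLACE means existing tables in schemas such as pinfo and tinfo are replaced rather than having to be dropped first (other pre-existing objects are reported and skipped):

    expdp system@test full=y directory=DATA_PUMP_DIR dumpfile=test_full.dmp logfile=test_full_exp.log
    impdp system@DEV1 full=y directory=DATA_PUMP_DIR dumpfile=test_full.dmp logfile=dev1_full_imp.log table_exists_action=replace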

  • Best way to load initial TimesTen database

    I have a customer that wants to use TimesTen as a pure in-memory database. This IMDB has about 65 tables, some with upwards of 6 million rows. What is the best way to load this data? There is no Cache Connect option being used. I am thinking insert is the only option here. Are there any other options?
    Thanks

    You can also use the TimesTen ttbulkcp command line utility, this tool is similar to SQL*Loader except it handles both import and export of data.
    For example, the following command loads the rows listed in file foo.dump into a table called foo in database mydb, placing any error messages into the file foo.err.
    ttbulkcp -i -e foo.err dsn=mydb foo foo.dump
    For more information on the ttbulkcp utility you can refer to the Oracle TimesTen API Reference Guide.

  • Best way to deploy a new database

    What is the best way to deploy a database for a user base that mostly doesn't understand how to use SQL-based DB products, or has only some understanding?
    I'm currently working on a setup utility for my desktop application, which uses MySQL. Right now I'm at a design issue where I'm not sure how to get the database deployed.
    I have a creation script for deploying the database, but I'm not sure whether to create a default user, assign the rights to the user, or make the user customizable [which starts to branch off way too much]. The desktop application does have an option of using an already deployed database elsewhere or creating one locally.
    Does anyone have suggestions for deploying databases for desktop applications? I know that Derby is a great solution for this; however, it is not nearly powerful enough to handle what I need a database for [a large number of transactions and comparisons, really quickly]. I have also been unable to find information on this.

    are you talking about creating another copy of your existing DB on the same server ??
    (Just wanna confirm as your last line seems to contradict with this?!?)
    2 ways:
    Go for RESTORE database with the Source option pointing to one of the existing databases.
    Go for COPY DATABASE (useful when the copy of the db is to be put in another server maybe..)
    Note that you would need backup of the existing DB to proceed..
    Thanks, Jay <If the post was helpful mark as 'Helpful' and if the post answered your query, mark as 'Answered'>

  • Best way to check whether the database is demo or sys?

    Hi Gurus,
    What's the best way to check whether the installed PeopleSoft database is a demo or a sys one?
    Thanks for the help!
    Regards,
    Anoop

    There is nothing set by default.
    However, if it has been configured properly by the administrator after db creation, through the menu Peopletools>Utilities>Administration>Peopletools Options, the following query should return the type of the database :
    select systemtype from psoptions;
    Otherwise the following could help to understand what database you are on.
    From an HRMS 9.1 DMO database:
    SQL> select count(*) from ps_employees;
      COUNT(*)
    ----------
          2792
    From an HRMS 9.1 SYS database:
    SQL> select count(*) from ps_employees;
      COUNT(*)
    ----------
             0
    Nicolas.

  • Best Way to Drop a 10g Database

    hi experts,
    This is 10g on Windows.
    I have 3 10g databases on this server and I need to drop and recreate 1 of the databases.
    What is the best way to get the cleanest, most thorough deletion?
    I'm thinking of doing:
    shutdown immediate;
    startup mount exclusive restrict;
    drop database;
    is there a better option?
    Thanks, John

    No.
    Though the "EXCLUSIVE" keyword is no longer required ... at least in 11gR1 and perhaps not in your version either.

  • Oracle 10.1, Whats the best way to load XML in database?

    Hi All,
    I am a typical Oracle developer. I know some Java and some XML technologies, but not an expert.
    I will be receiving XML files from some system, which will be,
    - of reasonable size, like 2 to 15 MB
    - of reasonable complexity: the root element has children, grandchildren and great-grandchildren, with attributes and all
    - Every day it needs to be loaded into the Oracle database, in relational format
    - we need not update the XML data, only put the XML data into relational tables
    My questions are,
    - With Oracle 10.1 XML DB, what is the best way to load this XML file into relational Oracle tables?
    - What would the steps in this process be?
    - In the documentation, I was lost and was not able to decide on anything concrete
    - If I write a pure Java program with the SAX API and load the data into the Oracle database in the same program, is that a good idea?
    - Is there any pure Oracle-based way to do this?
    - If I want to avoid using a DOM parser, as it would need a larger JAVA_POOL_SIZE, what would the approach be?
    Please help.
    Thanks

    Many customers solve this problem by registering an XML Schema that corresponds to their XML and then creating relational views over the XML that allow them to access the content in a relational manner. They then use insert-as-select operations on the relational views to transfer data from the XML into relational tables where necessary. There are a large number of threads in this forum with detailed examples of how this can be done. Most of the customers who have adopted this approach have found that it is the least complex approach in terms of code to be developed and maintained, and it offers acceptable performance.
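    A rough SQL sketch of that pattern (register the schema, create a schema-based XMLType table, expose a relational view, then insert-as-select). Every object name, the schema URL, the directory object and the XPath expressions are invented for illustration; extractValue is available in 10.1:

    -- register the XML schema (names are hypothetical)
    BEGIN
      DBMS_XMLSCHEMA.registerSchema(
        schemaurl => 'http://example.com/po.xsd',
        schemadoc => bfilename('XML_DIR', 'po.xsd'),
        csid      => nls_charset_id('AL32UTF8'));
    END;
    /

    -- schema-based XMLType table that the daily files are inserted into
    CREATE TABLE po_xml OF XMLTYPE
      XMLSCHEMA "http://example.com/po.xsd" ELEMENT "PurchaseOrder";

    -- relational view over the XML content
    CREATE OR REPLACE VIEW po_rel AS
    SELECT extractValue(OBJECT_VALUE, '/PurchaseOrder/Reference')  AS reference,
           extractValue(OBJECT_VALUE, '/PurchaseOrder/Requestor')  AS requestor,
           extractValue(OBJECT_VALUE, '/PurchaseOrder/CostCenter') AS cost_center
      FROM po_xml;

    -- transfer into an ordinary relational table
    INSERT INTO po_target (reference, requestor, cost_center)
    SELECT reference, requestor, cost_center FROM po_rel;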

  • Best Way to Replicate Azure SQL Databases to Lower Environments

    I have XML files delivered to my server where they are parsed into reference data and written to a database (Premium tier). I want to have that database sync to other databases (Basic tier) so that my non-production environments can use the same reference data.
    I tried Data Sync and it seems incredibly slow. Is Azure Data Sync the best way? What are my other options? I don't really want to change my parser to write to 3 different databases each time they receive an updated XML file, but I suppose that is an option.

    Greg,
    Data Sync is one of the options, but I wouldn't recommend it, as the Data Sync service is going to be deprecated in the near future. I would urge you to go through the options around geo-replication. There are 3 versions of geo-replication, and I believe active geo-replication would suit your requirement; however, the copy of the database which is kept in sync will also have to be in the same service tier (Basic is not possible). With the current Azure offering, it is not possible to have a synced copy of a database with different SLOs. I would also recommend you open a support incident with Microsoft to understand the different options for geo-replication. Throughout my answer I was keeping DR (disaster recovery) in mind; if I am mistaken, please let me know.
    -Karthik Krishnamurthy (SQK Azure KKB)

  • Best way to transfer a 10g database from HP9000 to Linux Redhat?

    What is the best way to transfer a 10g database from HP9000 to Linux Redhat?

    Hi Bill,
    What is the best way to transfer a 10g database from HP9000 to Linux Redhat?
    Define "best"? There are many choices, each with their own benefits . . .
    Fastest?
    If you are on an SMP server, parallel CTAS over a database link can move large amounts of tables, fast:
    http://www.dba-oracle.com/t_create_table_select_ctas.htm
    I've done 100 gig per hour . . .
    Easiest?
    If you are a beginner, Data Pump is good, and I have some tips on doing it quickly:
    http://www.dba-oracle.com/oracle_tips_load_speed.htm
    Also, make sure to check the Linux kernel settings. I query http://www.tpc.org and search for the server type . . .
    The full disclosure reports show optimal kernel settings.
    Finally, don't forget to set direct I/O in Linux:
    http://www.dba-oracle.com/t_linux_disk_i_o.htm
    Hope this helps . . .
    Donald K. Burleson
    Oracle Press author
    Author of "Oracle Tuning: The Definitive Reference" http://www.rampant-books.com/book_2005_1_awr_proactive_tuning.htm

  • Best Way To Synchronize Two Folders Via FTP?

    What's the best way to keep two folders in sync via FTP? Also, why is this site turning German? Will there still be a website in English?

    Allan wrote:
    solarwind wrote:What's the best way to keep two folders in sync via FTP?
    sitecopy in AUR may be helpful (http://aur.archlinux.org/packages.php?ID=11319).  You may also want to look at conduit and multisync-gui in AUR.
    solarwind wrote:Also, why is this site turning German? Will there still be a website in English?
    I hear archlinux.de will be in English as a replacement...
    Thanks for that!
    Also, I sure hope there will be an English site. There are many Germans who know English, but not many non-Germans know German. It was a stupid move to switch to German as an official language.

  • The best way to populate a secondary database

    I'm trying to create a secondary database over an existing primary database. I've looked over the GSG and Collections docs and haven't found an example that explicitly populates a secondary database.
    The one thing I did find was setAutoPopulate(true) on the SecondaryConfig.
    Is this the only way to get a secondary database populated from a primary? Or is there another way to achieve this?
    Thanks

    However, after primary and secondary are in sync, going forward, I'm unsure of the mechanics of how to automagically ensure that updates to primary db are reflected in secondary db.
    I'm sorry, I misunderstood your question earlier.
    Does JE take care of updating secondary db in such cases (provided both DBs are open)? In other words, if I have a Map open on the primary and do a put(), I can turn around and query the secondary (with apt key) and I should be able to retrieve the record I just put into the primary?
    Yes, JE maintains the secondaries automatically. The only requirement is that you always keep the secondary open while writing to the primary. JE uses your SecondaryKeyCreator implementation (you pass this object to SecondaryConfig.setKeyCreator when opening the secondary) to extract the secondary keys from the primary record, and automatically insert, update and delete records in the secondary databases as necessary.
    For the base API and collections API, JE does not persistently store the association between primaries and secondaries, so you must always open your secondary databases explicitly after opening your primary databases. For the DPL API (persist package), JE maintains the relationship persistently, so you don't have to always open the secondary indexes explicitly.
    I couldn't find an example illustrating this (nice) feature - hence the questions.
    For the collections API (I see you're using the collections API):
    http://www.oracle.com/technology/documentation/berkeley-db/je/collections/tutorial/UsingSecondaries.html
    In the examples directory:
    examples/collections/* -- all but the basic example use secondaries
    Mark
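    A short Java sketch of the base-API mechanics described above: open the primary, open the secondary with a SecondaryKeyCreator, and JE then maintains the index on every write. The environment path, database names and the key-extraction logic are illustrative assumptions (note that the population flag on SecondaryConfig is setAllowPopulate):

    import java.io.File;
    import com.sleepycat.je.*;

    public class SecondaryExample {
        public static void main(String[] args) throws Exception {
            EnvironmentConfig envCfg = new EnvironmentConfig();
            envCfg.setAllowCreate(true);
            Environment env = new Environment(new File("/tmp/je-env"), envCfg);

            DatabaseConfig dbCfg = new DatabaseConfig();
            dbCfg.setAllowCreate(true);
            Database primary = env.openDatabase(null, "primaryDb", dbCfg);

            SecondaryConfig secCfg = new SecondaryConfig();
            secCfg.setAllowCreate(true);
            secCfg.setSortedDuplicates(true);
            secCfg.setAllowPopulate(true);   // build the index from existing primary records
            secCfg.setKeyCreator(new SecondaryKeyCreator() {
                public boolean createSecondaryKey(SecondaryDatabase sec, DatabaseEntry key,
                                                  DatabaseEntry data, DatabaseEntry result) {
                    // illustrative: index on the first 4 bytes of the primary record's data
                    result.setData(data.getData(), 0, 4);
                    return true;
                }
            });
            SecondaryDatabase secondary =
                env.openSecondaryDatabase(null, "secondaryDb", primary, secCfg);

            // while the secondary is open, every write to the primary updates the index automatically
            primary.put(null, new DatabaseEntry("k1".getBytes()),
                              new DatabaseEntry("abcd rest of record".getBytes()));

            secondary.close();
            primary.close();
            env.close();
        }
    }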

  • Best way to have a 'standby' apps node

    Hi,
    We're in the process of discussing several options for creating a standby application node in another data center. For the database we've decided to use Data Guard. But we're divided on the Application node.
    I'm of the view that we should have a separate, independent server at the DR site that has the same installation and file setup as the primary site. Whenever we patch, we can run rsync to propagate the changes to the DR site. This way, we have two independent servers, and we can even patch on the primary and delay the propagation to the standby until we're satisfied everything is OK.
    The other colleagues think we can use IBM PowerHA to make a cluster between the two nodes and have a virtual IP through which the clients connect. The two nodes would have shared file systems. If we patch the primary, we don't need to patch the 'standby' since it's a shared file-system cluster. But during patching there's no 'freezing' window, so the only fallback option is the backup.
    Based on this, which option would work correctly? The environment is EBS Release 12.1.1 and Oracle 10g Release 2 on IBM AIX 6.1.
    dula

    Hi,
    In the apps tier, there is no certified/supported method for a standby server like Data Guard in the DB tier.
    The best solution is to perform a clone of the PROD instance every day and copy the files to the target instance or some other location,
    because the application tier files are instance-specific and maintained by AutoConfig, so the cluster and the other ideas will give errors when you try to bring up the standby instance and during AD sessions like patching and running adadmin.
    If you are going to keep the same hostname and the same file structure, you can try simply copying the files, or automatic copying of files through the cluster.
    -Rathina
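    A minimal sketch of the rsync-based propagation proposed in the question, assuming the DR node is reachable over SSH and uses the same file-system layout; the hostname and the environment variables standing in for the real paths are illustrative assumptions:

    # run on the primary apps node only after the patching has been verified
    rsync -az --delete $APPL_TOP/ drappsnode:$APPL_TOP/
    rsync -az --delete $COMMON_TOP/ drappsnode:$COMMON_TOP/
    # the application-tier ORACLE_HOMEs would need the same treatment, and AutoConfig
    # should then be run on the DR node so its instance-specific files match that host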

  • How is the best way to connect to the Database ?

    I just have a question regarding to the connection to the Oracle DB.
    Every time I create a new JSP I am writing java code such as:
    Class.forName("oracle.jdbc.driver.OracleDriver").newInstance();
    conn = DriverManager.getConnection("jdbc:oracle:thin:@host:1521:DB",
    "user",
    "pwd");
    Is there a way I can reuse the connection that we create on JDeveloper on the tab Connections, under the Database node ?
    Because I would like to further centralize the way I connect to the DB.
    Thanks!
    Giovani

    That is a nice solution, but it only works if you use the embedded OC4J; if not, you must define data sources, maybe in standalone OC4J, OAS, JBoss, etc.
    This is because the embedded OC4J automatically creates data sources when it sees database connections created through the Connections tab.
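    A minimal Java sketch of the data-source lookup the reply refers to, once a data source has been defined in the container; the JNDI name jdbc/MyDS is an illustrative assumption:

    import java.sql.Connection;
    import javax.naming.InitialContext;
    import javax.sql.DataSource;

    public class DataSourceLookup {
        // fetch a pooled connection from the container-managed data source
        // instead of calling DriverManager in every JSP
        public static Connection getConnection() throws Exception {
            InitialContext ctx = new InitialContext();
            DataSource ds = (DataSource) ctx.lookup("jdbc/MyDS");
            return ds.getConnection();
        }
    }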

  • Best way to synchronize in a Singleton

    I have this class:
    import java.util.ArrayList;
    import java.util.List;
    public class SynchIssues {
      private List values = new ArrayList();
      private static final SynchIssues singleton = new SynchIssues();
      public static SynchIssues getInstance() {
        return singleton;
      }
      public void addValue(String value) {
        synchronized (singleton) {
          values.add(value);
        }
      }
      public String getValue(int i) {
        synchronized (singleton) {
          return (String) values.get(i);
        }
      }
    }
    and I synchronize on the singleton itself, but a colleague uses a different approach:
    import java.util.ArrayList;
    import java.util.List;
    public class SynchIssues2 {
      private List values = new ArrayList();
      private static final SynchIssues2 singleton = new SynchIssues2();
      public static SynchIssues2 getInstance() {
        return singleton;
      }
      public synchronized void addValue(String value) {
        values.add(value);
      }
      public synchronized String getValue(int i) {
        return (String) values.get(i);
      }
    }
    He synchronizes on the methods instead of on the singleton.
    Which approach is best?
    Are both approaches thread safe?
    Do the different approaches risk deadlocking when different threads access the classes?

    What do you need to do exactly? What does "the most thread-safe" mean?
    If I read correctly your code samples, the fact that the "shared object" is a singleton is not relevant. Your point seems to be more about a generic issue of "synchronizing access to a shared collection".
    Without resorting to bytecode examination, your code samples look equivalent as far as individual add() and get() operations are concerned: each operation is atomic, and only one thread is actually reading/modifying the collection at a given time. In this regard each of these operations can be called "thread-safe".
    However, this gives absolutely no guarantee about several operations executed in sequence: two threads executing such a sequence would probably see their get() and add() calls interleaved. This may or may not cause a problem depending on your application (what you do with the data in the collection, and in particular how you react to reaching the end of the list).
    If you want this kind of guarantee of per-sequence atomicity (a sequence of, e.g., get() is executed without any other thread modifying the collection in between), you have to make sure each client synchronizes on the collection (or on the singleton, if it's the only wrapper) around the whole block of the sequence:
    SynchIssueX singleton = SynchIssueX.getInstance();
    // make sure no other thread may read any value before I've added them all:
    synchronized (singleton) {
      singleton.addValue(value1);
      singleton.addValue(value2);
      singleton.addValue(value3);
    }

  • Best way to synchronize multiple FPGAs

    I have multiple PXI-7833R FPGAs and I need all of the AIs to be sampled at the same times (across all FPGAs). As I sample all of the individual AI channels, I buffer the data (write to DMA), scan it and check for a user defined trigger in a different loop. Once I discover this in one channel, I save the data from all FPGAs. In terms of synchronizing the sampling, I had begun, from one FPGA, to send a signal over the PXI trigger line to tell the others to sample, but I assume this does not guarantee synchronization. If I base all of the separate FPGA VIs off of the PXI clock, how do I synchronize the loops to sample at the same clock times?
    Thanks
    Solved!
    Go to Solution.

    Hi,
    Well, if it's not very time-critical you could pass messages through the host, that's right.
    Another way, which is quite hard to implement though, would be to use the other available PXI trigger lines to send messages directly from one FPGA to the other. You would need something like handshaking: a kind of master who directs which slave is allowed to send, a kind of clock for synchronization, and so on.
    I cannot give you detailed information, since I have never done it myself, but I know of other projects where this works quite well.
    Maybe another forums user can give you some better advice.
    Thanks,
    Christian
