Best Strategy for Oracle Replication: Data Guard/Streams/Mat Views...

We have a database in Atlanta (ATL) and a national office database in DC. The primary workhorse is the ATL DB. The DC office needs to be able to review the most current ATL data and make small changes to status codes and comment fields (not necessarily on the database, but it can be if that is the best way). In the case of a network failure or loss of communication with ATL, the DC office wants to be able to keep working on the most current data as of the failure, and then resync or send those changes back to the ATL DB when the network is restored.
I am interested in the strategies you would take to accomplish these goals.
We will be using Oracle 10.2
Thanks in Advance

Sounds like a typical Streams replication use case. Data Guard is not quite in the picture here, since a physical standby is read-only and could not accept the DC-side status and comment updates.
If your remote office only needs to replicate a few tables from ATL, setting up MView-based (materialized view) replication seems a good idea. An updatable materialized view stays usable during an outage and pushes queued local changes back to the master on the next refresh, which matches your resync requirement.
More information here:
Oracle® Database Advanced Replication
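
As a rough sketch of what the DC side could look like with an updatable MView (the link name, account, table ORDERS, and refresh interval below are all hypothetical; a full setup also needs the MView registered in a materialized view group via DBMS_REPCAT so that DC changes are pushed back to ATL rather than overwritten):

-- On ATL (master): enable fast refresh of the remote copy
CREATE MATERIALIZED VIEW LOG ON orders;

-- On DC: link back to the ATL master (hypothetical connection details)
CREATE DATABASE LINK atl_link CONNECT TO repadmin IDENTIFIED BY secret USING 'ATL';

-- Updatable MView: stays readable and writable even while the link is down
CREATE MATERIALIZED VIEW orders_mv
  REFRESH FAST
  FOR UPDATE
  AS SELECT * FROM orders@atl_link;

-- Refresh group: on each refresh, queued DC changes are pushed to ATL,
-- then master changes are pulled down
BEGIN
  DBMS_REFRESH.MAKE(
    name      => 'dc_grp',
    list      => 'orders_mv',
    next_date => SYSDATE,
    interval  => 'SYSDATE + 1/24');  -- hourly while the network is up
END;
/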

Similar Messages

  • Long-term retention backup strategy for Oracle 10g

    Hello,
    I need to design an archivelog, level 0, and level 1 long-term retention backup strategy for Oracle 10g. It must be based on RMAN with tapes.
    Could somebody tell me the best option for configuring long-term retention backups in Oracle? Is "CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 1825 DAYS;" possible?
    Regards and thanks in advance

    Hello;
    RECOVERY WINDOW takes an integer, so yes, it's possible. Does it make sense, though?
    Later: My bad, the gavinsoorma link below is Oracle 11 only. Getting harder and harder to think about 10. I don't think I would use RECOVERY WINDOW.
    I would let the tape backup system handle the retention of your rman backups.
    Using crosscheck and delete expired commands you can keep the catalog current with the available backups on tape.
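    For example (a sketch; repeat for each device type you actually use):
    RMAN> CROSSCHECK BACKUP;                 # flag backup pieces no longer on tape as EXPIRED
    RMAN> CROSSCHECK ARCHIVELOG ALL;
    RMAN> DELETE EXPIRED BACKUP;             # purge those records from the RMAN repository
    RMAN> DELETE EXPIRED ARCHIVELOG ALL;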
    Oracle 10
    Keeping a Long-Term Backup: Example
    http://docs.oracle.com/cd/B19306_01/backup.102/b14191/rcmbackp.htm#i1006840
    http://web.njit.edu/info/oracle/DOC/backup.102/b14191/advmaint005.htm
    Oracle 11
    If you want to keep 5 years' worth of backups, I might just use KEEP with a specific date via the UNTIL TIME clause:
    keep until time 'sysdate+1825'
    More info:
    RMAN KEEP FOREVER, KEEP UNTIL TIME and FORCE commands
    http://gavinsoorma.com/2010/04/rman-keep-forever-keep-until-time-and-force-commands/
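    Spelled out, the 11g command that link describes would look something like this (a sketch; the tag is made up, and on 10.2 you would also need a LOGS or NOLOGS clause and a destination outside the flash recovery area):
    RMAN> BACKUP DATABASE
            KEEP UNTIL TIME 'SYSDATE+1825'
            TAG 'five_year_keep';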
    Best Regards
    mseberg
    Edited by: mseberg on Dec 4, 2012 7:20 AM

  • Best strategy for variable aggregate custom component in dataTable

    Hey group, I've got a question.
    I'd like to write a custom component to display a series of editable Things in a datatable, but the structure of each Thing will vary depending on what type of Thing it is. So, some Things will display radio button groups (with each radio button selecting a small set of additional input elements, so we have a vertical array of radio buttons and, beside each radio button, a small number of additional input elements), some will display text-entry fields, and so on.
    I'm wondering what the best strategy for tackling this sort of thing is. I'm sort of thinking I'll need to do something like dynamically add to the component tree in my custom component's encodeBegin(), and purge the extra (sub-) components in encodeEnd().
    Decoding will be a bit of a challenge, maybe.
    Or do I simply instantiate (via constructor calls, not createComponent()) the components I want and explicitly call their encode*() and decode() methods, without adding them to the view tree?
    To add to the fun of all this, I'm only just learning Faces (having gone through the Dudney book, Mastering JSF, and writing some simpler custom components) and I don't have experience with anything other than plain vanilla JSP. (No EJB, no Struts, no Tapestry, no spiffy VisualDevStudioWysiwyg++ [bah humbug, I'm an emacs user]). I'm using JSP 2.0, JSF 1.1_01, JBoss 4.0.1 and JDK 1.4.2. No, I won't upgrade to 1.5 (yet).
    Any hints, pointers to good sample code? I've looked at some of the sample code that came with the RI and I've tried to navigate the JSF Blueprints stuff, but I haven't really found anything on aggregating components into a custom component. Did I miss something obvious?
    If this isn't a good question, please let me know how I can sharpen it up a bit.
    Thanks.
    John.

    Hi,
    We're doing something very similar. I had a look at the Tomahawk Date component, and it seems to dynamically create InputText components in encodeEnd(). However, it doesn't decode these directly (it only expects a single textual value). I expect you may have to check the request yourself in decode().
    Other ideas would be appreciated, though - I'm still new to JSF.

  • What is the best platform for Oracle Server?

    What is the best platform for Oracle Server?
    What are your criteria?
    Thanks,
    Somsak B.

    I believe that Oracle is developed and coded on a Solaris platform and then ported to other operating systems. If that is the case, you can assume that Solaris is the best platform, or at least the safest in terms of performance and risk.
    If anyone knows anything to the contrary, please correct me.

  • What's best strategy for dealing with 40+ hours of footage

    We have been editing a documentary with 45+ hours of footage and presently have captured roughly 230 GB. Needless to say, it's a lot of files. What's the best strategy for dealing with so much captured footage? It's almost impossible to remember it all, and labeling it while logging seems inadequate, as it's difficult to actually read comments in dozens and dozens of folders.
    Just looking for suggestions on how to deal with this problem for this and future projects.
    G5 Dual Core 2.3   Mac OS X (10.4.6)   2.5 g ram, 2 internal sata 2 250gb

    Ditto, ditto, ditto on all of the previous posts. I've done four long-form documentaries.
    First I listen to all the sound bites and digitize only the ones that I think I will need. I will take in much more than I use, but I like to transcribe bites from the non-linear timeline. It's easier for me.
    I had so many interviews in the last doc that I gave each interviewee a bin. You must decide how you want to organize the sound bites. Do you want a bin for each interviewee, or do you want to do it by subject? That will depend on your documentary and subject matter.
    I then have b-roll bins. Sometimes I base them on location and sometimes I base them on subject matter. This last time I based them on location, because I would have a good idea of what was in each bin by remembering where and when it was shot.
    Perhaps you weren't at the shoot and don't have this advantage. It's crucial that you organize your b-roll bins in a way that makes sense to you.
    I then have music bins and bins for my voice-over.
    Many folks recommend that you work in small sequences and nest. This is a good idea for long-form stuff. That way you don't get lost in the timeline.
    I also make a "used" bin. Once I've used a shot, I pull it out of its bin and put it "away." That keeps me from repeatedly looking at footage that I've already used.
    The previous posts are right. If you've digitized 45 hours of footage, you've put in too much. It's time to start deleting some media. Remember that when you hit the edit suite, you should be on the downhill slide. You should have a script and a clear idea of where you're going.
    I don't have enough fingers to count the number of times producers have walked into my edit suite with a bunch of raw tape and told me they "want to make something cool." They generally have no idea where they're going and end up wondering why the process is so hard.
    Refine your story and base your clip selections on that story.
    Good luck
    Dual 2 GHz Power Mac G5   Mac OS X (10.4.8)  

  • Best strategy for migrating GL 6 DW CS4?

    First I gotta say that -- as a decade-long user of several Adobe products -- Adobe has really screwed over long-time users of Adobe GoLive. After a week of messing around with something that should be a slam dunk (considering the substantial amount of $ I've paid to Adobe over the years for multiple copies of GoLive 5, 6 and CS2), I can tell you Adobe's GL > DW migration utility ONLY works for sites created FROM SCRATCH in GL CS2 (they must have the web-content, web-data folder structure). This means that Adobe's migration utility only works for maybe 10% - 15% of the GoLive sites out there, and about 10% - 15% of Adobe's loyal GoLive customers, and it particularly screws over Adobe's longtime GoLive customers. Sweet! (Just like Adobe screwed over users of PhotoStyler -- which was a better image editor than Photoshop, BTW.) Obviously, I would walk away from Adobe and make sure I never ever paid them another cent, but the Republican-Democrat "free market economy" has made sure I don't have that option.
    So I've gotta make Dreamweaver work, which means I've gotta migrate several large sites (the biggest has 5,000 static pages and is about 2 gigs in size) from GoLive 6 (or GoLive CS2 with the older folder structure) to Dreamweaver CS4.
    Which brings me to my question -- what's the best strategy for doing this? Adobe's migration utility is a bad joke, so here's the alternative, real-world plan I've come up with. I'd appreciate any suggestions or comments...
    1) create copies of all my GL components in the content folders for my GL site (not in the GoLive components folder)
    2) replace the content of all GoLive components (in the GoLive components folder) with unique character strings, so that instead of a header with images and text, my old header would look something like xxxyyyzzz9
    3) create a new folder called astoni in the root of my hard drive. Copy all my GoLive web site content (HTML, images, SWF, etc.) into astoni in exactly the structure it was in with GL
    4) create a new Dreamweaver site by defining astoni as the local location for my site, astoni\images as the location for images, etc.
    5) use Dreamweaver and open the newly defined astoni site. Then open each of the GoLive components I copied into a content-level folder in astoni, and drag each into the Dreamweaver Assets/Library pane, in order to create library items just like my old GoLive components
    6) use Dreamweaver to search & replace the unique text strings like xxxyyyzzz9 with the content of my new DW library items
    7) refresh site view to make all the links hook up...
    Thanks for your help. Hope this discussion helps others too...

    Instead of ragging on people who are familiar with DW and Site Development, you should be gracious and accept the practical advice you've been given.  A "best strategy" would be to read some tutorials and learn how to work with HTML, CSS and server-side technologies. Without some basic code skills, you're going to find DW an uphill, if not impossible battle.
    Frankly, I don't have free time to hand-hold someone through the excruciating process of migrating a 5,000 page static site from GoLive  to DW. And I doubt anyone else in this forum has either.  We're not Adobe employees.  We don't get paid to participate here.  We are all product users JUST LIKE YOU.
    I'm sorry you're frustrated. I'm also sorry for your clients. But the problem you have now isn't Adobe's fault. It's yours, for not keeping up with server-side technologies or handing off these huge static sites to more capable web developers. I'm not saying you need to buy anyone's books, but they are good resources for people willing to learn new things.
    That said, maybe you should stick with GoLive.  The software doesn't have an expiration date on it and will continue working long into the future.  If you're happy using GL, keep using it to maintain your legacy sites. At the same time learn X/HTML, CSS & PHP or ASP.  Use DW CS4 for the creation of new projects.
    FREE Tutorial Links:
    HTML & CSS Tutorials - http://w3schools.com/
    From Tables to CSS Web Design Part 1 -
    http://www.adobe.com/devnet/dreamweaver/articles/table_to_css_pt1.html
    From Tables to CSS Web Design Part 2 -
    http://www.adobe.com/devnet/dreamweaver/articles/table_to_css_pt2.html
    Taking a Fireworks (or Photoshop) comp to a CSS-based layout in DW -
    http://www.adobe.com/devnet/fireworks/articles/web_standards_layouts_pt1.html
    Creating your first website in DW CS4 -
    http://www.adobe.com/devnet/dreamweaver/articles/first_cs4_website_pt1.html
    Guidance on when to use DW Templates, Library Items and SSIs -
    http://www.adobe.com/devnet/dreamweaver/articles/ssi_lbi_template.html
    Best of luck,
    Nancy O.
    Alt-Web Design & Publishing
    Web | Graphics | Print | Media  Specialists
    www.alt-web.com/
    www.twitter.com/altweb
    www.alt-web.blogspot.com

  • Best tool for Oracle Fusion.

    I have not explored this, but does Selenium recognize Oracle Fusion-related objects (fields in the web pages)?
    Or is OATS (with the ADF Accelerator) the best option?
    Thanks,

    Hi,
    I feel OATS is the best fit for Oracle Fusion Applications; it comes with built-in ADF Accelerators.
    Thanks
    -POPS

  • Best strategy for burning a dual-layer DVD

    I've got a project that won't fit on a single-layer DVD, so I've got it set up for dual layer and "best quality," which results in a 7.38 GB disk image. However, when burning the disk image, iDVD 8 warned me that some utilities may not be able to burn a reliable dual-layer copy and that I should use iDVD instead. Does this include Disk Utility?
    I always use Disk Utility to make copies, and iDVD took almost 7 hours to burn the disk image; I can't afford to wait that long for each copy. So what's the best strategy for burning dual-layer DVDs?

    Subsequent copies burn in far less time immediately after the first, provided you don't quit iDVD.
    If you know ahead of time that you'll need more than a few copies, I'd recommend burning from a disc image on the desktop and reducing the burn speed (4x or lower). I prefer to use Roxio Toast myself, but others have had success with Apple's Disk Utility as well.

  • How to get best material for Oracle 10G PL/SQL

    Please help me out in getting the best material for Oracle 10g PL/SQL from the Oracle site.
    Thanks in Advance

    There's no special book or material related to Oracle 10g PL/SQL. Can you explain what you want?
    Please refer to this page for more information:
    http://www.oracle.com/technology/tech/pl_sql/index.html
    Edited by: Kamran Agayev A. on Aug 14, 2009 11:41 AM

  • Absolute Best Strategy for Cropping

    What's the absolute best strategy for high-precision cropping when time is not an issue and you want "perfect" quality?
    What's the absolute best strategy for low-precision cropping when time is an issue but standards are still high?
    There's gotta be different strategies for doing this, and I was curious if we could choose a winner.
    Thanks,

    Heh, Marian, that about sums it up. 
    In all seriousness... How does "precision" connect with "cropping"? I've always thought of "cropping" as mostly based on an artistic judgment, not something done for precision. Perhaps you could describe your photography and what you expect from your workflow?
    Are you cropping photos of documents?  People?  Stars, nebulae, and galaxies?
    Is this "precision" you speak of an attempt to express "minimal loss of pixel data"?  If that's the case, get the most powerful computer you can and always work at upsampled resolutions.  And don't resample during cropping.
    One last comment:  Consider employing the crop tool in the Camera Raw dialog.  This stores metadata telling future conversions how to crop the image, but the original raw data is still all intact.  Note that you can open a whole set of images in Camera Raw, do exposure and cropping changes, and click [Done] to save that metadata.
    -Noel

  • Best Strategy for Integrating Crystal/Business Objects with OpenACS Environment

    Post Author: devashanti
    CA Forum: Deployment
    I'm working for a client that uses AOL Server and OpenACS for their web services/applications. I need suggestions on the best strategy for integrating a reporting solution using Business Objects XI. Ideally I'd like to send an API call from our web application's GUI to the Crystal API, passing report parameter values into specific reports (I can get it down to a single integer value being passed); if that's not possible, I need a way to move the user into a reporting module seamlessly, from the end-user perspective. We are using an Oracle backend database. I'm experienced with creating stored procedures and packages for reporting purposes.
    Although I have many years of experience integrating the Crystal active X controls into n-tier client server applications, the past few years I have had little opportunity to work with Business Objects and the newer versions of Crystal or web based solutions with Crystal Reports. I signed up to try out crystalreports.com, but I doubt my client will find this solution acceptable for security reasons as the reports are for an online invoicing system we are developing. However we can set up a reports server in-house if necessary, so it gives me some testing ground.
    Can anyone provide suggestions for a doable strategy?

    Please post this query to the Business Objects Enterprise Administration forum:
    BI Platform
    That forum is monitored by qualified technicians and you will get a faster response there. Also, all BOE queries remain in one place and thus can be easily searched in one place.
    Thank you for your understanding,
    Ludek

  • Best strategy for data definition handling while using OCCI

    Hi, the subject says it all.
    Assuming OCCI is used for data manipulation (e.g., object navigation), what's the best strategy for handling data definitions in the same context?
    I mean primarily dynamic creation of tables and columns.
    Apparent choices are:
    - SQL from OCCI.
    - use OCI in parallel.
    Did I miss anything? Thanks for any suggestions. A sketch of the first option follows.
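    For the "SQL from OCCI" route, one approach (a sketch; the procedure, parameters, and names are hypothetical) is to wrap the DDL in a stored procedure using native dynamic SQL, then call that procedure from OCCI like any other statement:
    CREATE OR REPLACE PROCEDURE add_column(
      p_table  IN VARCHAR2,
      p_column IN VARCHAR2,
      p_type   IN VARCHAR2)
    AS
    BEGIN
      -- DBMS_ASSERT validates the identifiers before they reach the DDL
      EXECUTE IMMEDIATE 'ALTER TABLE '
        || DBMS_ASSERT.SIMPLE_SQL_NAME(p_table)
        || ' ADD ' || DBMS_ASSERT.SIMPLE_SQL_NAME(p_column)
        || ' ' || p_type;
    END;
    /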

    Agreeing with Kappy that your "secondary" backups should be made with a different app. You can use Disk Utility as he details, or a "cloning" app such as CarbonCopyCloner or SuperDuper that can update the clone periodically, rather than erasing and re-copying from scratch.
    CarbonCopyCloner (http://www.bombich.com) is donationware; SuperDuper (http://www.shirt-pocket.com/SuperDuper/SuperDuperDescription.html) has a free version, but you need the paid one (about $30) to do updates instead of full replacements, or scheduling.
    If you do decide to make duplicate TM backups, you can do that. Just tell TM when you want to change disks (thus it's a good idea to give them different names). But there's no reason to erase and do full backups; after the first full backup to each drive, Time Machine will back up only the changes made since the last backup to that disk. Each is completely independent.

  • Strategy for ensuring replication is complete before initiating a query from the source system

    Hi,
    I am using HANA in a side-car scenario, with reports that run in SAP ECC being accelerated by querying replicated tables in SAP HANA instead. This works well; however, I don't have a good mechanism to validate, before running a report, whether the underlying data has already been replicated to HANA or is still queued up.
    Often users would want to run the reports soon after large data changes have been made in the source tables. It is unknown, based on the overall workload, how long it might take for the most recently written records to get replicated to the corresponding HANA table.
    What is a good-practice approach to handle this? I have seen some separate threads on doing record counts between HANA and ECC. I think that is not a good idea at all. Firstly, for large tables, the time overhead of doing the record count is very large; in the time it takes for me to query the record count in ECC, the HANA report could have run 10 times over. But more importantly, for very large source tables, I may have opted to replicate only recent data, leaving old historical data in ECC un-replicated to HANA.
    I know this is not a new problem. SAP must have already addressed it a number of ways for their own delivered application accelerators. The COPA accelerator for instance must be doing something along these lines. Possibly querying most recent records in ECC and comparing them to the most recent records in HANA for the same tables might be a way to go.
    Does anyone else have insights into how to best approach this? Does SLT expose a mechanism to check whether replication is completed for any given table?
    thanks,
    Nitin Goel

    Hi Nitin,
    SLT can never confirm whether replication is completed for any given table. Replication is a continuous process: if records are present in the logging table in ECC, they will be replicated via SLT.
    So the best way to check replication completion is (see the SQL sketch below):
    1. Go to SE16 (ECC) and enter the logging table whose replication you want to check (copy the logging table name from LTRC). If the number of entries is 0, then nothing is left in the logging table.
    You can also verify in LTRC (SLT), under Expert Functions, that replication is working fine.
    2. Then go to SE16 (ECC) and check the number of entries in the original table.
    3. Check in HANA: the number of entries should be the same as in the source table.
    It takes a minute to verify each table's replication, not more than that.
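    As a rough SQL illustration of those checks (the names below are hypothetical: actual SLT logging table names, e.g. /1CADMC/00000123, come from the LTRC configuration, and VBAK stands in for your replicated table):
    -- ECC side: an empty logging table means nothing is still queued
    SELECT COUNT(*) FROM "/1CADMC/00000123";
    -- ECC side: row count of the source table
    SELECT COUNT(*) FROM vbak;
    -- HANA side: should match the source count once replication has caught up
    SELECT COUNT(*) FROM "ECC_SCHEMA"."VBAK";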
    Regards,
    Joydeep.

  • Best strategy for flash access to an image in a protected folder

    Hello all,
    I'm looking for a good strategy for my SWF to be able to load images in a password-protected directory. Since Flash can't access a file behind a protected folder, I figure I'll use PHP to get the image and pass it to Flash.
    What's the best way to get an image from PHP to Flash? I can't just send a path to the file and have Flash execute a Loader() to retrieve it, since the Loader won't be able to get behind the protected folder.
    Is it best to have PHP send the file binary to Flash, which Flash will receive via URLLoaderDataFormat.BINARY? Is it best to have PHP, at runtime, move the requested file from the protected folder to an unprotected one and delete it after Flash loads it? Or maybe it's best to leave the image folder unprotected and let an .htaccess redirect restrict outside access to the folder?
    Suggestions?
    TIA,
    michael

    The built-in Firefox Sync service allows you to store your settings, including bookmarks, on a Mozilla server. This isn't accessible as a file like Google Drive; it can only be used by connecting another copy of Firefox to your Sync account. You can read more in the article "How do I set up Firefox Sync?" (I haven't tried it myself since it changed recently.)
    The cross-browser Xmarks service is more focused: it just does bookmarks. http://www.xmarks.com/ (I've never tried this one either)
    If you prefer working with files and you have a Dropbox or similar account, you could in theory back up your Firefox profile folder, or more particularly, the bookmarkbackups subfolder. I haven't tried that myself. To find the folder:
    Open your current Firefox settings (AKA Firefox profile) folder using either
    * "3-bar" menu button > "?" button > Troubleshooting Information
    * Help menu > Troubleshooting Information
    In the first table on the page, click the "Show Folder" button
    A window should launch showing your currently active settings files. The bookmarkbackups folder should contain dated .json files which can be used to restore your bookmarks to another copy of Firefox.
    Alternately, someone might have created an add-on for saving Firefox bookmarks to the cloud another way, but I haven't heard of one.

  • What is the best strategy for RMAN backup configuration?

    Hi all,
    My database size is 50 GB. I want to take a weekly full backup plus incremental backups, without a recovery catalog, using the following commands.
    Weekly full database backup:
    run {
      backup as compressed backupset
        incremental level 0
        device type disk
        tag 'weekly_database'
        format '/sw/weekly_database_%d_t%t_c%c_s%s_p%p'
        database;
    }
    I want to configure RMAN with the following strategy:
    CONFIGURE RETENTION POLICY TO REDUNDANCY window of 14 days.
    CONFIGURE BACKUP OPTIMIZATION OFF;
    CONFIGURE CONTROLFILE AUTOBACKUP ON;
    CONFIGURE CONTROLFILE AUTOBACKUP FORMAT FOR DEVICE TYPE DISK TO '/sw/autocfile_%F';
    with the rest left at defaults, plus:
    SQL> alter system set control_file_record_keep_time=15 days;
    OS is AIX 6, for two databases, 10gR2 and 11gR2.
    What is the best configuration strategy for RMAN backup, and should I back up with or without a recovery catalog?
    Edited by: afzal on Feb 26, 2011 1:45 AM

    For just two databases, there really wouldn't be a need for a recovery catalog. You can still restore/recover without a controlfile and without a recovery catalog.
    From this:
    afzal wrote:
    CONFIGURE RETENTION POLICY TO REDUNDANCY window of 14 days.
    I am assuming you want to keep two weeks' worth of backups, therefore these:
    alter system set control_file_record_keep_time=15 days;
    CONFIGURE RETENTION POLICY TO REDUNDANCY window of 14 days
    should be:
    RMAN> sql 'alter system set control_file_record_keep_time=22';
    RMAN> CONFIGURE RETENTION POLICY TO RECOVERY WINDOW OF 14 DAYS;
    The 22 gives you an extra layer of protection for cases when a problem occurs with a backup and you want to ensure that backup records don't get aged out of the controlfile before the retention window ends.
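    To pair with the weekly level 0 above, a daily incremental in the same style might look like this (a sketch; the tag and format strings are made up to mirror the weekly example, and CUMULATIVE means each level 1 backs up all changes since the last level 0):
    run {
      backup as compressed backupset
        incremental level 1 cumulative
        device type disk
        tag 'daily_database'
        format '/sw/daily_database_%d_t%t_c%c_s%s_p%p'
        database;
    }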
