Best practice: managing FRA

I was not sure this was the most appropriate forum for this question; if not, please feel free to make an alternative suggestion.
For those of us who run multiple databases on a box with shared disk for FRA, I am finding the extra layer of ASM and db_recovery_file_dest_size to be a minor inconvenience. The Best Practice white papers I have found so far say that you should use db_recovery_file_dest_size, but they do not specify how you should set it. Currently, we have been setting db_recovery_file_dest_size rather small, as the databases so far are small and even at 3x the database size, the parameter is still significantly smaller than the total disk available in that diskgroup.
So, my question: is there any downside to setting db_recovery_file_dest_size equal to the total size of the FRA diskgroup for all databases? Obviously, this means the free space in the diskgroup may be consumed even though db_recovery_file_dest_size is not yet full (as reflected in the instance's V$RECOVERY_FILE_DEST). But is that really a big deal? Can we not simply monitor the FRA diskgroup, which we have to do anyway? This eliminates the need to worry about an additional level of disk management. I like to keep things simple.
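For what it's worth, monitoring both levels side by side is just a couple of queries; a minimal sketch, assuming the diskgroup is named FRA (the name and the GB arithmetic are illustrative):

-- Free space at the diskgroup level (run against the ASM instance)
SELECT name, total_mb, free_mb
FROM v$asm_diskgroup
WHERE name = 'FRA';

-- Quota and usage at the instance level
SELECT name,
       space_limit/1024/1024/1024       AS limit_gb,
       space_used/1024/1024/1024        AS used_gb,
       space_reclaimable/1024/1024/1024 AS reclaimable_gb
FROM v$recovery_file_dest;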
The question is also relevant to folks using other forms of volume management (yes, I know, ASM is "not a volume manager"), but it seems germane to the ASM forum because most of the articles I have read and the DBAs I have talked to are using ASM for FRA.
Most importantly, what ramifications does "over-sizing" db_recovery_file_dest_size have, aside from the scenario above?
TIA

As a general rule, the larger the flash recovery area (db_recovery_file_dest_size), the more useful it becomes. Ideally, the flash recovery area should be large enough to hold a copy of all of your datafiles and control files, the online redo logs, and the archived redo log files needed to recover your database using the datafile backups kept under your retention policy.
Setting the size of DB_RECOVERY_FILE_DEST_SIZE must be based on the following factors (see the sketch after this list):
1) your flashback retention target,
2) which files you are storing in the flash recovery area, and
3) if that includes backups, the retention policy for them, or how often you move them to tape.
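For illustration, the settings themselves are just initialization parameters; the values below are hypothetical and would be sized according to the factors above:

-- Hypothetical values; size these to your retention target and file mix
ALTER SYSTEM SET db_recovery_file_dest_size = 50G SCOPE=BOTH;
ALTER SYSTEM SET db_recovery_file_dest = '+FRA' SCOPE=BOTH;
ALTER SYSTEM SET db_flashback_retention_target = 1440 SCOPE=BOTH; -- in minutes, i.e. one day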
Setting it much larger than, or equal to, your FRA disk group does not in itself cause any hidden overhead.
But there are reasons why Oracle lets you define a disk limit, which is the amount of space that Oracle can use for the flash recovery area out of your FRA disk group.
1) A disk limit lets you use the remaining disk space for other purposes, rather than dedicating the complete disk group to the flash recovery area.
2) Oracle does not delete eligible files from the flash recovery area until the space must be reclaimed for some other purpose. So even if your database is only 5GB and your retention target is small, a much larger db_recovery_file_dest_size means the area will simply keep filling.
3) Say, in my case, I have one FRA disk group of 150GB shared by 3 different databases. Based on the nature and criticality of each database, I have different size requirements for the flash recovery area, so I use varying db_recovery_file_dest_size values (30GB, 50GB, 70GB respectively) to meet each database's retention target and the kinds of files and backups I want to store in its FRA.
Oracle's internal space management mechanism for the flash recovery area is designed in such a way that if you set db_recovery_file_dest_size and DB_FLASHBACK_RETENTION_TARGET to optimal values, you won't need any further administration or management. If a flash recovery area is configured, the database uses an internal algorithm to delete files from it that are no longer needed because they are redundant, orphaned, and so forth. The backups with status OBSOLETE form a subset of the files deemed eligible for deletion by the disk quota rules. When space is required in the flash recovery area, the following files are deleted (a query sketch follows the list):
a) Any backups which have become obsolete per the retention policy.
b) Any files in the flash recovery area which have already been backed up to a tertiary device such as tape.
c) Flashback logs, which may be deleted to make space available for other required files.
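You can watch this mechanism at work rather than manage it; a minimal sketch (the view is v$recovery_area_usage on 11g and later, v$flash_recovery_area_usage on 10g):

-- What the FRA holds, and how much of it Oracle can age out on demand
SELECT file_type, percent_space_used, percent_space_reclaimable, number_of_files
FROM v$flash_recovery_area_usage;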
NOTE: If your FRA is 100GB and 3 databases have their DB_RECOVERY_FILE_DEST set to it, then logically the total of db_recovery_file_dest_size across those 3 databases should not exceed 100GB, even though in practice Oracle lets you exceed this limit.
Hope this helps.

Similar Messages

  • Best practice - Manage background job users

    Gate Keepers and Key Masters,
    What is the best way to manage users who run background jobs?
    For example, currently we have a special system user with SAP_ALL that is only used to schedule jobs, and we manage who has authorization to schedule jobs.
    We are told that this is not the best way to go about it and that we have to remove SAP_ALL from that user. I don't see a very good way to eliminate the SAP_ALL profile, short of analyzing every single batch job that is already scheduled and creating or assigning existing roles for each job or step. Even that doesn't guarantee that the authorizations given to my batch user would be enough to run any jobs that may be scheduled in the future.
    Can you give me any pointers on how to address this problem?
    Thanks
    Matt

    Hello,
    as a matter of fact, the cleanest way is to give the background job user only the authorizations for the programs and steps it has to perform.
    Usually auditors allow keeping SAP_ALL for system users.
    However, a work-around could be the creation of a special role containing authorization to do "almost everything". You should run transaction PFCG, enter the name of the role, save, then go straight to the tab "Authorizations" and press the button "Change authorization data". In the pop-up screen "Choose template" choose "Do not select templates"; then follow the menu path "Edit --> Insert authorization(s) --> Full authorization". Then go to the push-button "Organizational levels" and press the push-button "Full authorization". If you want, you can refine this role by removing some critical authorization objects such as, for instance, S_USER_*, or others like that. Then you can assign this role to the background user.
    Hope this is useful.
    Best regards,
    Andrea

  • Best practices managing multiple iPhoto libraries

    I think that my challenge in managing multiple iPhoto libraries is shared by a lot of amateur photographers.  My wife and I both have Apple MacBook Airs and we both take and upload photos.  About every three months I merge the two iPhoto libraries with iPhoto Library Manager.  Then I take the output of the merge and replace both MacBook iPhoto libraries.
    Thinking out loud, would it be easier and less labor-intensive to have my wife and me share the same library?  I remember reading a caution regarding possible iPhoto file corruption when sharing.
    Any suggestions?

    Thinking out loud, would it be easier and less labor-intensive to have my wife and me share the same library?  I remember reading a caution regarding possible iPhoto file corruption when sharing.
    The alternative would be to keep separate photo libraries - one for your wife and one for you.
    I found it much easier to have separate photo libraries and just share the best photos with the other family members, after they have been tagged and edited. Export them and give them to the other family members to import into their libraries. Or transfer them using a shared photo stream.
    This avoids fighting about which photos to keep and how to adjust the photos.

  • CUWL Best Practice (Managing)

    We recently upgraded from 4.1.3 to CUCM 7.1.3(a) and also migrated from DLU to CUWL licensing. We purchased 650 Standard Users, 100 Pro Users, and 100 public/analog. My question is: how do we manage/differentiate between Standard and Pro users in CUCM? I understand the difference between the two, but I am confused about how to manage them going forward. Any help would be appreciated.
    Thanks,
    David

    David,
    With CUCM 6x/7x the CUCM license service is not CUWL-aware.  Meaning, it is still operating on the DLU model.  So, there is no way to manage this through the CUCM.  With these versions you need to track the CUWL assignment using an internal process.  Or you could try to come up with a way to add a "self documenting" config in CUCM, such as creating custom User Groups like "CUWL-Std-User" and assigning those to user IDs as you add them to the system (or retro-assigning to existing ones).  Then you can use dependency records or a SQL query to dump the members of the particular CUWL group (see the sketch below).  Most likely that won't map out well for you, particularly with CUWL-entry (i.e. analog devices).  But it's just a thought.
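    For example, once a "CUWL-Std-User" group exists, something along these lines from the CUCM platform CLI could dump its members. This is only a sketch; the table and column names (enduser, enduserdirgroupmap, dirgroup) are assumptions from memory and should be verified against your CUCM version's data dictionary:
    run sql select e.userid from enduser e, enduserdirgroupmap m, dirgroup d where m.fkenduser = e.pkid and m.fkdirgroup = d.pkid and d.name = 'CUWL-Std-User'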
    I have tracked this with spreadsheets in the past.  Annoying, but when left with little option you make do.
    Now, with 8.x I am not sure if the licensing model for CUWL is enforced (i.e. the CUCM is CUWL-aware).  I thought that this was a feature for the 8.5 release but I may be mistaken.  Maybe someone in this forum can provide the roadmap with a little more precision.
    HTH.
    Regards,
    Bill
    Please remember to rate helpful posts.

  • What is a best practice for managing a large amount of ever-changing hyperlinks?

    I am moving an 80+ page printed catalog online. We need to add hyperlinks to our Learning Management System courses at each reference to a class - there are hundreds of them. I'm having difficulty understanding what the best practice is for getting consistent results when I need to go back and edit (which we will have to do regularly).
    These seem like my options:
    Link the actual text - sometimes when I go back to edit the link I can't find it in InDesign but can see it's there when I open up the PDF in Acrobat
    Draw an invisible box over the text and link it - this seems to work better but seems like an extra step
    Do all of the linking in Acrobat
    Am I missing anything?
    Here is the document in case anyone wants to see it so far. For the links that are in there, I used a combination of adding the links in InDesign then perfecting them using Acrobat (removing additional links or correcting others that I couldn't see in InDesign). This part of the process gives me anxiety each month we have to make edits. Nothing seems consistent. Maybe I'm missing something obvious?

    What exactly needs to be edited - the hyperlink, the content, or something else?

  • Best practices for managing Movies (iPhoto, iMovie) to IPhone

    I am looking for some basic recommendations on best practices for managing the syncing of movies to my iPhone. Most of my movies come either from a digital camcorder into iMovie or from a digital camera into iPhoto.
    Issues:
    1. If I do an export or a share from iPhoto, iMovie, or QuickTime, what formats should I select? I've seen 3gp and m4v.
    2. When I add a movie to iTunes, where is it stored? I've seen some folder locations like iMovie Sharing/iTunes. Can I copy movies directly there, or should I always add to the library in iTunes?
    3. If I want to get a DVD I own into a format for the iPhone, how might I do that?
    Any other recommendations on best practices are welcome.
    Thanks
    mek

    1. If you type "iphone" or "ipod" into the help feature in iMovie, it will tell you how.
    "If you want to download and view one of your iMovie projects to your iPod or iPhone, you first need to send it to iTunes. When you send your project to iTunes, iMovie allows you to create one or more movies of different sizes, depending on the size of the original media that’s in your project. The medium-sized movie is best for viewing on your iPod or iPhone."
    2. Mine appear under "movies" which is where imovie put them automatically.
    3. If you mean movies purchased on DVD, then copying them is illegal and cannot be discussed here.
    From the terms of use of this forum:
    "Keep within the Law
    No material may be submitted that is intended to promote or commit an illegal act.
    Do not submit software or descriptions of processes that break or otherwise ‘work around’ digital rights management software or hardware. This includes conversations about ‘ripping’ DVDs or working around FairPlay software used on the iTunes Store."

  • SAP Best Practice Integrated with Solution manager

    We have a server on which we installed the SAP Best Practices baseline package, and we have Solution Manager 7.01 SP 25.
    We maintained the logical port, but when we try to check connectivity to Solution Manager we get the following error:
    Connectivity check to sap solution manager system not successful
    Message no. /SMB/BB_INSTALLER375
    Can anyone guide us on how to solve the problem, and also tell us whether there is another way to upload the solution defined in the Best Practices solution builder into SAP Solution Manager as a template project?
    Thanks,
    Heba Hesham

    Hi,
    Patches for SAPGUI 7.10 can be found at the following location:
    http://service.sap.com/patches
    -> Entry by Application Group -> SAP Frontend Components
    -> SAP GUI FOR WINDOWS -> SAP GUI FOR WINDOWS 7.10 CORE
    -> SAP GUI FOR WINDOWS 7.10 CORE -> Win 32
    -> gui710_2-10002995.exe

  • Basic Strategy / Best Practices for System Monitoring with Solution Manager

    I am very new to SAP and the Basis group at my company. I will be working on a project to identify the best practices of System and Service level monitoring using Solution Manager. I have read a good amount about SAP Solution Manager and the concept of monitoring but need to begin mapping out a monitoring strategy.
    We currently utilize the RZ20 transaction and basic CCMS monitors, such as watching for update errors, availability, short dumps, etc. What else should be monitored in order to proactively find possible issues? Are there any best practices you all have found when implementing monitoring for new solutions added to the SAP landscape... what are common things we would want to monitor over, say, ERP, CRM, SRM, etc.?
    Thanks in advance for any comments or suggestions!

    Hi Mike,
    Did you try the following link ?
    If not, it may be useful to some extent:
    http://service.sap.com/bestpractices
    ---> Cross-Industry Packages ---> Best Practices for Solution Management
    You have quite a few documents there - those on BPM may also cover Solution Monitoring aspects.
    Best regards,
    Srini

  • Manage vendor bank account in company code level - Best practice

    hi,
    we have several company codes on our client, and we are thinking about how to manage the vendor bank accounts: at company code level or at central level.
    Our problem with company code level is that vendor bank accounts are defined at central level, so technically this is not possible with standard SAP. If anyone can tell me that it is possible, I will be very happy to know.
    At central level, our problem is with the business process, because we do not have one (functional) person who can be responsible for all the company codes; basically, every company should be responsible for its own vendors.
    Please advise what the best practice is to manage vendor bank accounts in an environment with several company codes.
    regards,
    meir

    hi Raghu,
    I know that, but this is not my question.
    Which company code will be responsible for entering the bank account in the vendor master data? As I said, vendor bank accounts are managed at central level and not at company code level; this is our problem, and I would like to know how other companies in the same situation handle this issue.
    Also, after we have the bank accounts in the vendor master data, what happens if one vendor has several bank accounts, one for every company code? How can the payment program F110 know automatically which bank account to take? (I know there is an option to use Partner bank type in the vendor master data and to enter this PBT when posting the vendor's invoice, but this is not good, because we do not want the accountant to be responsible for deciding from which bank the vendor gets paid.)
    regards,
    meir

  • JSF - Best Practice For Using Managed Bean

    I want to discuss what the best practice is for managed bean usage, especially using session scope or request scope to build database-driven pages.
    ---- Session Bean ----
    - In the book Core JavaServer Faces, the author mentioned that in most cases a session-scoped bean should be used, unless the processing is passed on to another handler. Since JSF can store the state on the client side, I think storing everything in the session is not a big memory concern (can some expert confirm this is true?). Session objects are easy to manage and state can be shared across pages. It can make programming easy.
    In the case of a page bound to a result set, the bean usually holds a java.util.List for the result, which is initialized in the constructor by querying the database first. However, this approach has a problem: when the user navigates to another page and comes back, the data is not refreshed. You can of course solve this by issuing the query every time in your getXXX method, but you need to be very careful that you don't bind the XXX property too many times. When querying in getXXX, setXXX is also tricky, as you don't have a member to set. You usually don't want to persist the result-set changes in setXXX, as the changes may not be final; instead, you want to handle them in the action listener (like a save(ActionEvent)).
    I would be glad to see your thoughts on this.
    --- Request Bean ---
    A request bean is initialized every time a request is made. It sometimes drove me nuts, because JSF seems not to be very consistent in updating model values. Suppose you have a page showing a parent plus a list of child records from the database, and you also allow the user to change the children directly. If I bind the parent to a bean called #{Parent}, bind the children to an ADF table (value="#{Parent.children}" var="rowValue"), and set Parent to request scope, the setChildren method is never called when I submit the form. Not sure if this is just ADF or a JSF problem. But if you change the bean to session scope, everything works fine.
    I believe JSF doesn't update the bindings for all component attributes; it only updates the input components' value bindings. Can someone please verify this is true?
    In many cases, I found request beans very hard to work with if there are lots of updates. (I had lots of trouble updating the binding values for rendered attributes.)
    However, request beans work fine for read-only pages and simple bound forms. They definitely free up memory quicker than session beans.
    ----- any comments or opinions are welcome!!! ------

    I think it should be either Option 2 or Option 3.
    Option 2 would be necessary if the bean data depends on some request parameters.
    (Example: Getting customer bean for a particular customer id)
    Otherwise Option 3 seems the reasonable approach.
    But, I am also pondering on this issue. The above are just my initial thoughts.

  • Best Practice for Managing Cookies in an Enterprise Environment

    We are upgrading to IE11 for our enterprise. One member of the team wants to set a group policy that will delete all cookies every time the user exits IE11. We have some websites that users access that use cookies to track progress in training, but those cookies are deleted when the user closes the browser. What is the business best practice regarding deleting all history, temporary internet files and, especially, cookies when closing a browser?
    If you can point me to a white paper on this topic, that would be helpful.
    Thanks
    Bill

    Hi,
    Regarding cookie settings, we could manage IE privacy settings using Administrative templates for IE 11:
    Administrative templates and Internet Explorer 11
    Delete and manage cookies
    The Administrative templates for IE 11, we could download from here:
    Administrative Templates for Internet Explorer 11
    Hope this may help
    Best regards
    Michael Shao
    TechNet Community Support

  • Best Practice: Configuring Windows Azure Management Services

    I have 3 Websites, 1 Blob Storage account, and 1 SQL Server that I would like to configure for basic stability and performance monitoring. I know I can set up alerts through Management Services based on various metrics. My question is, can someone give me a recommended set of metrics that are good baselines?
    It is nice that Azure is so customizable, but frankly I have no idea how much CPU Time in milliseconds over a given evaluation window is appropriate. Or how many Http Server Errors? More than 0 seems bad, no? Wouldn't I want to know of any/all errors?
    So if anyone has some "best practice" metrics for me, that would be really helpful.
    Thanks.

    Hi,
      >> can someone give me a recommended set of metrics that are good baselines?
    Actually, many metrics depend on your scenario. For instance, if there are a lot of concurrent requests, or if a single request is expected to take some heavy computation, then a high CPU usage is expected; thus it is difficult to give you a specific number.
    In general, you may want the CPU usage of a web server to be as high as possible (an idle CPU costs money but does not provide valuable results), yet low enough that additional concurrent requests can be served without too much delay. In Windows Azure, you may want to set up auto-scaling so that if CPU usage is high enough during a period, you create a new instance, and if CPU usage is low enough during a period, you remove an instance. You may also want to use response time in addition to CPU to decide whether you need to add or remove an instance.
      >> Or how many Http Server Errors? More than 0 seems bad, no? Wouldn't I want to know of any/all errors?
    As for server errors, in general you want to be notified of all errors (> 0), since they're unexpected and need to be investigated. But if in your scenario you expect a certain level of server errors, then it is fine to use a larger threshold.
    Best Regards,
    Ming Xu

  • Best Practice for Managing a BPC Environment?

    My company is currently running a BPC 5.1 MS environment and will soon be upgrading to version 7.0 MS.  I was wondering if there is a white paper or some guidance that anyone can give with regard to the best practice for managing a BPC environment.  Which brings to light several questions in my mind:
    1. Which department(s) in a company should "own" the BPC application?
    2. If both, what's SAP's recommendation for segregation of duties?
    3. What roles should exist within our company to manage BPC?
    4. What type(s) of change control is SAP's "Best Practice"?
    We are currently evaluating the best way to manage the system across multiple departments, however there is no real business ownership in the system, which seems to be counter to the reason for having BPC as a solution in the first place.
    Any guidance on this would be very much appreciated.


  • How to handle multiple site-to-site IPsec VPNs on an ASA; any best practice to manage multiple IPsec VPN configurations

    How do you handle multiple site-to-site IPsec VPNs on an ASA? Are there any best practices for managing multiple IPsec VPN configurations, both before version 8.3 and after (8.3, 8.4, 9.x)?

    Hi,
    To my understanding you should be able to attach the same crypto map to the other "outside" interface, or perhaps alternatively create a new crypto map that you attach only to your new "outside" interface.
    Also, I think you will probably need to route the remote peer IP of the VPN connection towards the gateway IP address of that new "outside" interface, and also the remote network found behind the VPN connection.
    If you attempt to use a VPN Client connection instead of a L2L VPN connection on the new "outside" interface, then you will run into routing problems, as naturally you can't have 2 default routes active at the same time (a default route would be required on the new "outside" interface if the VPN Client was used, since you DON'T KNOW from where the VPN Clients are connecting to your ASA).
    Hope this helps
    - Jouni

  • Best practice for data migration install v1.40 - Error 2732 Directory Manager

    Hi
    I'm attempting to install SAP Best Practices for Data Migration 1.40 on Windows Server 2008 R2 (64-bit).
    Prerequisite error
    Installation program stops with missing file error
    The following file was not found
    ... \migration\InstallationWizard\BusinessObjects Data Services\setup.exe
    The file is necessary for successful installation. Please connect to internet or refer to Quick Guide (available on SAP note 1527151) for information regarding the above file.
    Windows installer log displays
    Error 2732 Directory Manager not initialized
    SAP note 1527151 does not exist or is internal.
    Any help appreciated on what the root cause of the error is, as the file does not exist in that folder in the installation zip file.
    The other prerequisite, .NET 3.5.1, is already met.
    The patch has been available since 20.11.2011, so I presume it is a good installation set.
    Thanks,
    Alan

    Hi Alan,
    There are details on Data Migration v1.4 installations on the SAP website and marketplace. The link below should guide you to the right place. It has a PowerPoint presentation and other useful links as well.
    http://help.sap.com/saap/sap_bp/DMS_V140/DMS_US/html/index.htm
    Arun
