Best practice for making a mirror of the system disk

I've got an HP ML115 machine that I'm installing Solaris 10 on. It has four disks: a 160GB (boot drive), a 300GB spare, and two 1.5TB drives. I've made a mirrored pool of the two 1.5TB drives. I'd like to create a mirror of the 160GB boot drive (I realise I'll lose some space). However, I'm unsure of how to do this. Attempting to create a mirror (zpool create -f bootmirror c0t0d0 c0t1d0) tells me that the partitions are mounted.
Any ideas on what the best way to achieve this is, please?

What filesystem type is root currently, UFS or ZFS?
If it's currently UFS, you will have to use SVM for mirroring; a UFS root can't be converted to a ZFS root.
If it's already a ZFS root, just not mirrored, it should be simple: just add the second disk to the pool to "promote" it to a mirror.
You do that with a zpool attach command.
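A rough sketch of the sequence for a ZFS root on Solaris 10 x86 (the pool name rpool, the slice numbers and the device names below are assumptions based on the disks you mention, so adjust them to your layout):

# Copy the boot disk's SMI label / slice table onto the second disk
prtvtoc /dev/rdsk/c0t0d0s2 | fmthard -s - /dev/rdsk/c0t1d0s2
# Attach the new slice to the root pool, turning the single disk into a mirror
zpool attach rpool c0t0d0s0 c0t1d0s0
# Wait for the resilver to finish
zpool status rpool
# On x86, install GRUB on the second disk so it can boot on its own
installgrub /boot/grub/stage1 /boot/grub/stage2 /dev/rdsk/c0t1d0s0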

Similar Messages

  • Best practice for putting binary data on the NMR

    Hi,
    We're creating a component that will consume messages off the NMR, encode them, and subsequently put them back on the NMR. What's the best practice for sending binary data over the NMR?
    1. setContent()?
    2. addAttachment()?
    3. setProperty()?
    If NormalizedMessage.setContent() is the desired approach, then how can you accomplish that?
    Thanks,
    Bruce

    setContent() is used only for XML messages. The recommended way to accommodate binary data is to use addAttachment().

  • Best practices for apps integration with third party systems ?

    Hi all
    I would like to know if there is any document, from Oracle or of your own, regarding best practices for apps integration with third party systems.
    For example, say a customization in a given module (e.g. Payables) needs to provide data to a third party system. Consider the following:
    Outbound interface:
    1) Should the third party system be given direct access to the Oracle database, to read a particular payments table/view and look for data?
    2) Or should Oracle create a file for the third party system, so that it can read it and do what it needs to do?
    Inbound:
    1) Should the third party log in directly and insert data into the tables which hold response data?
    2) Or again, should the third party create a file that Oracle Apps picks up for further processing?
    Again, there could be lots of company-specific scenarios, such as whether it has to be real time or not, etc.
    How do companies make sure third party systems are not directly dipping into other systems (Oracle Apps or others), so that certain integration best practices are followed?
    How does enterprise architecture play a role in this? Can we apply SOA standards? Should we use request/reply with Tibco, etc.?
    Many Oracle Apps customizations more or less interact directly with third party systems by including code to log in to the respective third party systems, and vice versa.
    Let me know if you have done this differently; that would help the Oracle Apps community.
    thanks
    rrb.

    You want to send an IDoc to a third party (non-SAP) system.
    What kind of system is it? Can it handle HTTP requests,
    or can it handle a web service?
    Which version of R/3 are you using?
    What is the mechanism the receiving system has to receive the data?
    Regards
    Raja

  • Best Practice for Securing Web Services in the BPEL Workflow

    What is the best practice for securing web services which are part of a larger service (a business process) and are defined through BPEL?
    They are all deployed on the same oracle application server.
    Defining agent for each?
    Gateway for all?
    BPEL security extension?
    The top level service, which is defined as the business process, is itself secured through OWSM with usernames and passwords, but what is the best practice for securing each of the low-level services?
    Regards
    Farbod

    It doesn't matter whether the service is invoked as part of your larger process or not; if it is performing any business-critical operation then it should be secured.
    The idea of SOA / designing services is to have the services available so that they can be orchestrated as part of any other business process.
    Today you may have secured your parent services, and tomorrow you could come up with a new service which uses one of the existing lower-level services.
    If all the services are in one application server, you can make the configuration/development environment a lot easier by securing them using the Gateway.
    The typical problem with any gateway architecture is that the service is available without any security enforcement when accessed directly.
    You can enforce rules at your network layer to allow access to the app server only from the Gateway.
    When you have the liberty to use OWSM or any other WS-Security product, I would stay away from any extensions. Two things to consider:
    The next BPEL developer in your project may not be aware of the security extensions.
    Centralizing security enforcement keeps your development and security operations loosely coupled and addresses scalability.
    Thanks
    Ram

  • Kernel: PANIC! -- best practice for backup and recovery when modifying system?

    I installed NVidia drivers on my OL6.6 system at home and something went bad with one of the libraries. On reboot, the kernel would panic and I couldn't get back into the system to fix anything. I ended up re-installing the OS to recover my system.
    What would be some best practices for backing up the system when making a change, and then recovering if this happens again?
    Would LVM snapshots be a good option? Can I recover a snapshot from a rescue boot?
    EX: File system snapshots with LVM | Ars Technica -- scroll down to the section discussing LVM.
    Any pointers to documentation would be welcome as well. I'm just not sure how to revert the kernel or the system when an installation goes bad like this.
    Thanks for your attention.

    There is often a common misconception: a snapshot is not a backup. A snapshot and the original it was taken from initially share the same data blocks. An LVM snapshot is a general purpose solution which can be used, for example, to quickly create a snapshot prior to a system upgrade; then, if you are satisfied with the result, you delete the snapshot.
    The advantage of a snapshot is that it can be used on a live filesystem or volume while changes are written to the snapshot volume. Hence it is called "copy on write" (COW), or copy on change if you want. This is necessary for system integrity: it provides a consistent state of all data at a certain point in time while still allowing changes, for example so a filesystem backup can be performed. A snapshot is no substitute for disaster recovery in case you lose your storage media. A snapshot only takes seconds, and initially does not copy or back up any data until data changes. It is therefore important to delete the snapshot when it is no longer required, in order to prevent duplication of data and restore filesystem performance.
    LVM was never a great thing under Linux and can cause serious I/O performance bottlenecks. If snapshot or COW technology suits your purpose, I suggest you look into Btrfs, which is a modern filesystem built into the latest Oracle UEK kernel. Btrfs employs the idea of subvolumes and is much more efficient than LVM because it can operate on files or directories, whereas LVM operates on the whole logical volume.
    Keep in mind, however, that you cannot use LVM or Btrfs for the boot partition, because the GRUB boot loader, which loads the Linux kernel, cannot deal with LVM or Btrfs before the kernel is loaded (a catch-22).
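    A minimal sketch of the snapshot-before-change workflow described above (the volume group name vg_ol6, the logical volume name lv_root and the snapshot size are assumptions, so substitute your own names):

    # Take a snapshot of the root LV before touching the drivers
    lvcreate --size 5G --snapshot --name root_pre_change /dev/vg_ol6/lv_root
    # ... install the NVidia driver, reboot, test ...
    # Happy with the result: drop the snapshot to reclaim space and I/O performance
    lvremove /dev/vg_ol6/root_pre_change
    # Kernel panics instead: boot rescue media, activate the VG and merge the
    # snapshot back, which rolls the origin LV back to its pre-change state
    vgchange -ay vg_ol6
    lvconvert --merge /dev/vg_ol6/root_pre_change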
    I think the following is an interesting and fun to read introduction explaining basic concepts:
    http://events.linuxfoundation.org/sites/events/files/slides/Btrfs_1.pdf

  • Best practices for making the end result web help printable

    Hi all, using TCS3 Win 7 64 bit.  All patched and up to date.
    I was wondering what the best practices are for the following scenario:
    I am authoring in Frame, link by reference into RH.
    I use Frame to generate PDFs and RH to generate webhelp.
    I have tons of conditional text which ultimately produces four separate versions of the PDFs as well as the online help - I handle these conditions in FM and pull them into RH.
    I use a css on all pages of my RH to make it 'look' right.
    We now need to add the ability for end users to print the webhelp - beyond just CTRL+P, because a) that cuts off the larger images and b) it doesn't show the header, footer, logo, date, etc. (stuff that is in the master pages in FM).
    My thought is doing the following:
    Adding four sentences (one for each condition) in the FM book on the first page. Each one would be coded for audience A, B, C, or D (each of which require separate PDFs) as well as coded with ONLINE so that they don't show up in my printed PDFs that I generate out of Frame. Once the PDFs are generated, I would add a hyperlink in RH (manually) to each sentence and link the associated PDF (this seems to add the PDF file to the baggage files in RH). Then when I generate my RH webhelp, it would show the link, with the PDF, correctly based on the condition of the user looking at the help.
    My questions are as follows:
    1- This seems more complicated than it needs to be. Is it?
    2- I would have to manually update every single hyperlink each time I update my FM book, because I am single-sourcing out of Frame and I am unable (as far as I can tell) to link a PDF within the Frame doc. I update the entire book (over 1500 pages) once every 6 weeks, so while this wouldn't be a common occurrence, it would happen regularly, and it would be manual (as far as I can tell).
    3- Eventually, I would have countless PDFs inside RH. I assume this will eventually impact performance, so this also doesn't seem ideal.
    If anyone has thoughts/suggestions on a simpler or better way to do this, I'd certainly appreciate it. I have watched the Adobe TV tutorial on adding a master page, but that seems to remove the ability to use a css across all my topics, and it also requires the manual addition of a hyperlink to the PDF file, which is what I am proposing above anyway (so I am not sure of the benefit).
    Thanks in advance,
    Adriana

    Anything other than CTRL + P is going to create a lot of work so perhaps I can comment on what you see as drawbacks to that.
    a)that cuts off the larger images and b)it doesn't show header, footer,
    logo, date, etc. (stuff that is in the master pages in FM).
    Larger images.
    I simply make a point of keeping my image sizes down to a size that works. It's not a problem for me but that doesn't mean it will work for you. Here all I am doing is suggesting you review how big a problem that would be.
    Master Page Details
    I have to preface this with the statement that I don't work with FM. The details you refer to print when they are in RoboHelp master pages. Perhaps one of the FM users here can comment on how to get FM master pages to come through.
    See www.grainge.org for RoboHelp and Authoring tips
    @petergrainge

  • Best practice for making changes to Oracle apps business views and BAs/fold

    HI
    The Oracle BI solution comes with pre-defined Business Views (database views) and Business Areas and folders. If we want to customize those database views or BAs and folders, what is the best practice in order to avoid losing the changes during upgrades?
    For example, the Oracle out-of-box Order Management BA that we are using heavily needs some additional fields added to the Order Header and Order Lines folders, and we also want to add some custom folders to this BA.
    If we make the changes to the database views behind this BA, would they be lost during an upgrade, or do we have to copy (duplicate) those views, update them, and create a custom BA and folders against those views?
    Thanks

    Hi,
    If you are adding new folders then just add them to the Oracle Business Area. The business area is just a collection of folders. If the business area was changed in an upgrade the new folder would not be deleted.
    If you want to add fields to the existing folders/views then you have 2 options. Add the field to the defining base view (these are the views beginning OEBV and OEFV) and then regenerate the business views. This may be overwritten if the view is upgraded, but this is unlikely.
    Alternatively, copy the view to create a new version and then map the old folder to the new view and refresh. You may need to re-map the folder if the folder is upgraded, but at least you have a single folder used by both Oracle and custom reports.
    Rod West

  • The best practice for a data mart to a different BW system

    Hi All,
    Could you suggest what I should do in this case?
    I have two SAP BW systems, e.g. BW A and BW B.
    I want to transfer data from an InfoCube in BW A into an InfoCube in BW B.
    The things that I did:
    1. 'Generate Export Data Sources' for the InfoCube in BW A.
    2. Replicate the source system in BW B. In BW B, it will show the DataSource from the BW A InfoCube.
    3. In SAP BW B, I create an InfoPackage, and then I can fetch data from SAP BW A.
    What I want to ask:
    1. Can I make this automatic? At the moment, every time I want to fetch data from the InfoCube in SAP BW A, I must run the InfoPackage in SAP BW B. What's the best practice?
    2. Could RDA make it automatic? Automatic in my case means that every time there is new or updated data in the SAP BW A InfoCube, I don't have to run the InfoPackage in SAP BW B;
    SAP BW B will automatically fetch the data from the BW A InfoCube.
    If yes, could you give me a step-by-step guide on how to use RDA to solve my case?
    I really need your guidance.
    Thanks,
    Best regards,
    Daniel N.

    Hi Daniel,
    You can create a process chain to load your cube in BW A. Similarly, create a process chain in your BW B system to load its cube.
    Now in system BW A you create a process chain to load your cube, and after that you can automatically trigger the process chain in BW B. You can use the Remote Chain option for this.
    This will trigger the chain automatically in the remote system.
    Regards,
    Mansi

  • Best Practice for making material obsolete

    Hi Experts,
    Could anyone please advise me on the best practice to make a material obsolete? Is there a best practice, or if there isn't one, what are the different ways it can be made obsolete?
    Please advise me on this.
    Thanking you in advance
    Regards,
    Gopalakrishnan.S

    Archiving is a process, rather than a transaction. You have to take care that you fulfill local laws and retain your data for x years for internal and external auditors.
    Archiving requires customizing: you have to tell SAP where your archive is, what name it has, how big it can be, and when (by defining a retention period) a document can be archived.
    You have to talk with your business about how long they want the data kept in the production system (this depends on how long they need to run reports on this data).
    You have to define who can access the archived data and how this data is accessed.
    Who runs archiving?
    Tough question. The data owner is the business, so they should run it.
    But archiving just fragments your tables,
    so many companies have a person who is full-time responsible for archiving and does it for all business units and organisations.
    Archiving is not something that can be done "on the fly"; you will encounter all kinds of errors, and processes that are not designed well will certainly create a lot of problems in archiving and need to be reworked.
    I had an archiving project with 50 days of project time and could develop guidelines for archiving about 30 different objects. There are 100 more objects possible, and we have still not archived master data like materials, vendors and customers because of the many dependencies and objects that need to be archived prior to them.

  • Best practice for making a report of 10,000 to 20,000 rows (OBIEE 10.3.4.1)

    My scenario is like this:
    Hi, I have two fact tables, Fact 1 and Fact 2, and dimension tables D1, D2, D3, D4, plus D1.1 and D1.2. The relations in the data model are as follows.
    NOTE: D1.1 and D1.2 are derived from D1, so D1 might be a snowflake.
    D1 1:M > Fact 1, D1 1:M > Fact 2; D2 1:M > Fact 1, D2 1:M > Fact 2; D3 1:M > Fact 1, D3 1:M > Fact 2; D4 1:M > Fact 1, D4 1:M > Fact 2.
    From D1 there is a child level like this: D1 1:M > D1.1, D1.1 1:M > D1.2, and D1.2 1:M > D4.
    Please help me model this for a report of 10,000 rows, and also let me know for which tables I need to enable caching.
    PS: There shouldn't be performance issues, so please help me in modeling this.
    Thanks in Advance for the Experts who are helping me for a while.

    Shouldn't be much of a problem with just that many rows...
    Model it along the lines of this thread: Re: URGENT MODELING SNOW FLAKE SCHEMA
    There are various ways of handling performance issues, if any, in OBIEE.
    Go for a caching strategy for the complete warehouse, and make sure to purge the cache after every data load. If you have aggregate calculations at a higher level, you can also go for aggregated tables in OBIEE for better performance.
    http://www.rittmanmead.com/2007/10/using-the-obiee-aggregate-persistence-wizard/
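    If you want to script that post-load purge, one common approach (a sketch only; the DSN name, credentials and script path are assumptions) is to call the SAPurgeAllCache() ODBC procedure through nqcmd:

    # purge_cache.sql contains the single line:  Call SAPurgeAllCache();
    nqcmd -d AnalyticsWeb -u Administrator -p <password> -s /path/to/purge_cache.sql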
    Hope this is clear... Go ahead with the actual implementation and let us know in case you encounter any major issues.
    Cheers

  • Best practice for R12 upgrade middle of the period

    Does anyone know of where I can find an Oracle documented best practice on when to perform the 12.1.1 upgrade? We are considering if we can upgrade in the middle of a period or if we should wait until the period close.

    Best practice is to read the name of a forum before you post into it.
    This forum is titled: "Oracle Database General Questions."
    My suspicion is your question relates to EBS.

  • Best practices for making space on hard drive?

    My relatively trusty ol' 466 MHz G4 is bogging down. I am showing 9.77 GB out of 28.6 GB capacity. I have the Adobe CS programs and do a lot of Photoshop work. I decided to clean house and dump as much stuff as I can, including the older programs (Adobe Design Suite, Virtual PC, etc.). I don't use these programs anymore, and have the original CDs, so I assume there is no reason to take up space with them. My questions are:
    1- Is there a general rule for optimal functioning of a Mac in terms of the % of HD space that should be available?
    2- What is the difference between regular 'Empty Trash' and 'Secure Empty Trash'?
    3- I bought the LaCie for backing up the hard drives and files. I am using iMSafe, but am a little spooked by the warnings that "removing obsolete files from the destination could result in data loss". Any suggestions?
    PowerMac G4 Digital Audio 466 mhz   Mac OS X (10.3.9)   896 MB SDRAM, 30GB HD, 75 GB additional int. HD, and a 120 GB LaCie external HD

    10-15% free HD space. So having 25-30% free should not pose a problem?
    That should be fine for now, but you know how quickly space can disappear, and the less space you have the slower the system will become.
    Paranoid that something trashed will be found? Or that something trashed will be lost, then needed?
    Something trashed will be found. If you're using your computer in an everyday way and you delete something you later decide you really want, even using the most sophisticated data recovery programs you can only find stuff that hasn't already been written over.
    How does a backup program (such as iMSafe) decide which files are "obsolete"? I hardly know myself! So how do I know if it is OK to trust the program to decide?
    Personally, I don't like the fact that it does this, but I believe it is simply comparing the original with the copy. If you've deleted the original, then it wants to know if the backup file is now obsolete.
    I would suggest you consider something easier to use. Have a look at ChronoSync and see what you think.

  • Best practice for upgrading task definition in production system

    If I try and update a task definition with task instances running I get the following error:
    Task definition 'My Task - Add User' may not be modified while there are active task instances
    Is there a best practice for handling this? I tried to force an update through the console, but that didn't work. I tried editing the task from the debug page and got the same error.

    The best way for upgrade purposes is to use the rename function of the TaskDefinition from the lh command line utility.
    It basically renames all current task instances that carry the TaskDefinition name. You can then alter the existing TaskDefinition and upload it into Identity Manager.

  • Best Practices for creating reports/Dashboards from BW systems

    Hi Gurus,
    What are the best practices for creating BO Dashboards / Xcelsius from BW systems?
    Prasad

    You can use the BICS connector that leverages BW queries directly.  It is listed in the Connection Manager as "SAP NetWeaver BW Connection".  You will need both the ABAP and Java stack and SSO configured between the two.  You will also need to have SAP GUI and BEx installed on the machine you are doing development on.  Note that dashboards using this connection can only be hosted in NW Portal for the time being until the next release of BI 4.x platform.
    Here are some links on getting started with the BICS connector:
    [Building Fast and Efficient Dashboards with BW and Xcelsius|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/d0ab8cce-1851-2d10-d5be-b5147a651c58]
    [Requirements for BICS|http://wiki.sdn.sap.com/wiki/display/BOBJ/prerequisitestoXcelsiusandSAPNetWeaverBW+Connection]

  • What is the best practice for a PXI controller: connecting to the company network and installing antivirus? A special subnet?

    I need your suggestions and common practices. 

    Hello TomMex,
    Thanks for posting. If what you are looking for are suggestions for how to use your PXI controller in regards to some of the issues you mentioned, then here are my suggestions. For networking purposes, you can consider your PXI controller the same as any other computer; you should be able to connect it to your network just fine and it will be able to see other computers and devices that are on the same subnet. Antivirus software in general should be fine for your system until you want to install new NI software, at which point you may want to disable it to avoid issues during installation. Does this answer your question? Let me know, thanks!
    Regards,
    Joe S.
