Best practice for exporting the same disk to multiple guests?

I've got a T4-4 with two service domains. Each service domain has access to the same pool of SAN disks.
The primary service domain has vds primary-vds0.
The secondary service domain has vds secondary-vds0.
So far I've successfully built a single backend service for each SAN disk on each service domain and used the mpgroup option to enable multipathing, i.e.:
ldm add-vds primary-vds0 primary
ldm add-vds secondary-vds0 secondary
ldm add-vdsdev mpgroup=target1 /dev/dsk/c0t6target1 target1@primary-vds0
ldm add-vdsdev mpgroup=target1 /dev/dsk/c0t6target1 target1@secondary-vds0
ldm add-vdisk id=0 bootdisk target1@primary-vds0 guest_a
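To sanity-check that both paths landed in the same mpgroup and that the guest's vdisk picked it up, I've been using the ldm list commands (a quick sketch - exact output columns vary with the Logical Domains Manager version):
ldm list-services primary      # primary-vds0 and the volumes it exports
ldm list-services secondary    # secondary-vds0 and its volumes
ldm list -o disk guest_a       # the guest's vdisks and their mpgroup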
Now, I have database datafile LUNs that I need to present to two guests at once, since they're using Veritas clustering. As I understand it, what I have to present to each guest is a unique volume (volume@vds) export, and I can do it one of two ways:
Build multiple exports of the same disk on the same vds:
ldm add-vds primary-vds0 primary
ldm add-vds secondary-vds0 secondary
ldm add-vdsdev mpgroup=db1a /dev/dsk/c0t6target5 db1_server_a@primary-vds0
ldm add-vdsdev mpgroup=db1a /dev/dsk/c0t6target5 db1_server_a@secondary-vds0
ldm add-vdsdev mpgroup=db1b /dev/dsk/c0t6target5 db1_server_b@primary-vds0
ldm add-vdsdev mpgroup=db1b /dev/dsk/c0t6target5 db1_server_b@secondary-vds0
ldm add-vdisk id=6 database_disk db1_server_a@primary-vds0 guest_a
ldm add-vdisk id=6 database_disk db1_server_b@primary-vds0 guest_b
Build a separate vds for each guest domain, and then add the vdsdevs to each vds:
ldm add-vds primary-vds-server_a primary
ldm add-vds primary-vds-server_b primary
ldm add-vds secondary-vds-server_a secondary
ldm add-vds secondary-vds-server_b secondary
ldm add-vdsdev mpgroup=db1a /dev/dsk/c0t6target5 db_disk@primary-vds-server_a
ldm add-vdsdev mpgroup=db1a /dev/dsk/c0t6target5 db_disk@secondary-vds-server_a
ldm add-vdsdev mpgroup=db1b /dev/dsk/c0t6target5 db_disk@primary-vds-server_b
ldm add-vdsdev mpgroup=db1b /dev/dsk/c0t6target5 db_disk@secondary-vds-server_b
ldm add-vdisk id=6 database_disk db_disk@primary-vds-server_a guest_a
ldm add-vdisk id=6 database_disk db_disk@primary-vds-server_b guest_b
The end result is the same, but is there any advantage (performance, configuration management, whatever) to doing it one way versus the other?
Thanks!
Tim Metzinger


Similar Messages

  • Best Practices for Export

    I have recently begun working with a few AIC-encoded home movie files in FCPX. My goal is to compress them using H.264 for viewing on computer screens. I had a few questions about the best practices for exporting these files, as I haven't worked with editing software in quite some time.
    1) Is it always recommended that I encode my video in the same resolution as its source? For example, some of my video was shot at 1440x1080, which I can only assume is anamorphic. I originally tried to export at 1920x1080 but then changed my mind, as I assumed the 1440x1080 would just stretch naturally. Does this sound right?
    2) FCPX is telling me that a few of my files are in 1080i. I'd like to encode them in 1080p, as it tends to look better on computer screens. In FCPX, is it as simple as dragging my interlaced footage into a progressive timeline and then exporting? I've heard about checking the "de-interlace" box under clip settings and then doubling the frame rate, but that seemed to make my video look worse.
    3) I've heard that it might be better practice to export my projects as master files and then encode H.264 in Compressor. Is there any truth to this? Might it be better for the interlaced-to-progressive conversion as well?
    Any assistance is greatly appreciated.

    1) Yes. 1440x1080 anamorphic will display as 1920x1080.
    2) Put everything in a 1080p project.
    3) Compressor will give you more options for control. The H.264 export from FCPX uses a very high data rate and makes large files.

  • Best practice for exporting a dps folio so a third party can work on it?

    Hi All,
    Keen for some thoughts on the best practice for exporting a DPS folio, indd files, links and all, so a third party can carry on the work. Is there a better alternative to packaging each individual indd file and sharing the folio through Adobe?
    I know there have been similar questions here in the past, but the last (that I've found) was updated mid-2011, and I was wondering if there have been any improvements to this seemingly backwards workflow since then.
    Thanks in advance!

    Nothing better than packaging them and using Dropbox to share. I caution you on one thing as far as packaging:
    http://indesignsecrets.com/file-packaging-feature-can-cause-problems-in-dps-workflows.php

  • Best Practice for enhancing the SAP delivered standard WD ABAP application

    Hi,
    I am new to Web Dynpro ABAP.
    I need to enhance a SAP-delivered standard Web Dynpro component (a complex component with business objects & POWL).
    Kindly let me know the best practice for enhancing the standard WD ABAP component, from the two options below:
    1) copy and create a "Z" version of the component and make the changes there, or
    2) enhance the standard component directly, without making a "Z" copy.
    Regards,
    NS

    Hi NS,
    If it is a standard component, it's better to enhance the component rather than copying it into a Z component.
    If there is any issue within the standard component, SAP supports it through notes and OSS messages. If it is a Z component, SAP doesn't support it.
    If there is an upgrade of business packages, the changes will be made to the standard component but not to the Z components, where we could miss them.
    Further, since a standard component might be used in many places, reflecting a change everywhere would be difficult if it is a Z component.
    Regards,
    Harsha

  • Best practice for exporting from iMovie '08 to iDVD

    I am looking to find out the best practice for exporting from iMovie '08 to iDVD. I have read the other postings that give the basic how-to (export to the Media Browser, then select the video in iDVD). However, my question is a little more technical. I have 1080i HD projects, and I am interested in burning them to DVD in the best possible quality. What setting should I be using when I publish to the Media Browser?
    I am wondering about quality loss due to more than one conversion/compression. I suspect this occurs when I export to the Media Browser; if I am not mistaken, iMovie uses something like H.264 for this. Then, when I run iDVD, I suspect it does another conversion/compression, I think to get to MPEG-2. Not only could this result in a loss of quality, but it will also take extra time. I am interested to know what others think about this.
    Finally, I am looking to create DVDs for a lot of video. I am wondering if there are any USB or FireWire hardware devices out there that could speed up the compression. I use the Elgato Turbo.264 when I want to encode to H.264, but I wonder if there is something similar for DVD creation.
    Thanks in advance.

    The standards for video DVD are 720x480, usually MPEG-2 encoded,
    so your HiDef project HAS to be 'downsampled' somehow.
    I would export with QuickTime/Apple Intermediate Codec, which is the format your project is already in, so you avoid any useless in-between encoding.
    iDVD will 'swallow' this huge export file - don't mind: iDVD cares about length, not size.
    iDVD will then convert it into DVD standards.
    You can 'raise' quality by keeping projects under 60 min - this automatically sets iDVD to the highest technically possible bitrate.
    Hint: judge picture quality on a DVD player + TV, not on your computer (DVDs are meant for TV delivery).

  • What is the best practice for using the Calendar control with the Dispatcher?

    It seems as if the Dispatcher is restricting access to the Query Builder (/bin/querybuilder.json) as a best practice regarding security.  However, the Calendar relies on this endpoint to build the events for the calendar.  On Author / Publish this works fine but once we place the Dispatcher in front, the Calendar no longer works.  We've noticed the same behavior on the Geometrixx site.
    What is the best practice for using the Calendar control with Dispatcher?
    Thanks in advance.
    Scott

    Not sure what exactly you are asking but Muse handles the different orientations nicely without having to do anything.
    Example: http://www.cariboowoodshop.com/wood-shop.html
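    For the Dispatcher question itself, a common workaround is to explicitly allow the Query Builder endpoint in the dispatcher.any /filter section (a sketch only - the rule number is arbitrary and the glob should be tightened before use, since re-exposing query endpoints has security implications):
    /filter {
      # ...existing deny/allow rules...
      # allow the Query Builder endpoint the Calendar control calls
      /0101 { /type "allow" /glob "* /bin/querybuilder.json*" }
    }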

  • Best practices for using the knowledge directory

    Anyone know when it is best to store docs in the Knowledge Directory versus Collab? They are both searchable, but I guess you can publish from the Publisher to the KD. Anyone have any best practices for using the KD or setting up taxonomies in the KD?


  • Best practices for using the 'cost details' fields

    Hi
    Please could you advise us on the best practices for using the 'cost details' field within Pricing. Currently I cannot find a way to surface the individual Cost Details fields within the Next Generation UI, even with the 'display both cost and price' box ticked. It seems that these get surfaced when the Next Generation UI is turned off, but I cannot find them when it is turned on. We can see the 'Pricing Summary' field, but this does not fulfill our needs, as some of our services have both recurring and one-off costs.
    Attached are some screenshots to further explain the situation.
    Many thanks,
    Richard Thornton

    Hi Richard,
    If you need to configure dynamic pricing that may vary by tenant, and/or if you want to set up cost drivers that are service item attributes, you should configure Billing Tables in the Demand Management module in 10.0.
    The cost detail functionality in 9.4 will likely be merged with the new pricing feature in 10.0. The current plan is not to bring cost detail into the Service Catalog module.

  • Best practice for partitioning 300 GB disk

    Hi,
    I would like to seek advice on how I should partition a 300 GB disk on Solaris 8.x - what would be the optimal size for each partition?
    The system will be used internally for running web/application servers and database servers.
    Thanks in advance for your help.

    There is no "best practice" regardles of what others might say. I depends entirely on how you plan on using and maintaining the system. I have run into too many situations where fine-grained file system sizing bit the admins in the backside. For example, I've run into some that assumed that /var is only going to be for logging and printing, so they made it nice and small. What they didn't realize is that patch and package information is also stored in /var. So, when they attempted to install the R&S cluster, they couldn't because they couldn't put the patch info into /var.
    I've also run into other problems where a temp/export system that was mounted on a root-level directory. They made the assumption that "Oh, well, it's root. It can be tiny since /usr and /opt have their own partitions." The file system didn't mount properly, so any scratch files in that directory that were created went to the root file system and filled it up.
    You can never have a file system that's too big, but you can always have a file system that's too small.
    I will recommend the following, however:
    * /var is the most volatile directory and should be on its own several-GB partition to account for patches, packages, and logs.
    * You should have another partition as big as your system RAM, and assign that partition as a system/core dump device for system crashes.
    * /usr, or whatever file system it's on, must be big enough to assume that it will be loaded with FOSS/Sunfreeware tools, even if at this point you have no plans on installing them. I try to make mine 5-6 GB or more.
    * If this is a single-disk system, do not use any kind of parallel access structure, like what Oracle prefers, as it will most likely degrade system performance. A single disk can only service one I/O call at a time, obviously.
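    For illustration only, a layout along those lines for a 300 GB disk might look like this (slice sizes are assumptions to adapt, not a recommendation):
    s0  /              20 GB
    s1  swap           8 GB
    s2  backup         (whole disk by convention - leave unassigned)
    s3  /var           10 GB  (patches, packages, logs)
    s4  dump device    sized to match system RAM
    s5  /usr           10 GB or more (FOSS/Sunfreeware tools)
    s7  /export/home   remainder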
    Again, there is no "best practice" for this. It's all based on what you plan on doing with it, what applications you plan on using, and how you plan on using it. There is nothing that anyone here can tell you that will be 100% applicable to your situation.

  • BEST PRACTICE TO PARTITION THE HARD DISK

    Can someone please guide me on THE BEST PRACTICE TO PARTITION THE HARD DISK FOR 10G R2 on operating system HP-UX 11?
    Thanks,
    Amol
    Message was edited by:
    user620887

    I/O speed is a basic function of the number of disk controllers available to read and write, the physical speed of the disks, the size of the I/O pipe(s) between SAN and server, the size of the SAN cache, and so on.
    Oracle recommends SAME - Stripe And Mirror Everything. This comes in RAID 10 and RAID 01 flavours. Ideally you want multiple fibre channels between the server and the SAN. Ideally you want the LUNs from the SAN to be seen as raw devices by the server, and to use these raw devices as ASM devices - running ASM as the volume manager. Etc.
    Performance is not achieved by just partitioning. Or just more memory. Or just a faster CPU. Performance planning and scalability encompass the complete system. All parts. Not just a single aspect like partitioning.
    Especially not partitioning, as a partition is simply a "logical reference" to a "piece" of the disk. I/O performance has very little to do with how many pieces you split a single disk into. That is the management part. It is far more important how you stripe, and whether you use RAID 5 instead of a RAID 1 flavour, etc.
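    For example, with ASM the striping and mirroring happen at the disk group level rather than in disk partitions (a sketch - the device paths and redundancy level are assumptions):
    -- create an ASM disk group over two raw LUNs; ASM stripes extents across
    -- them, and NORMAL redundancy mirrors between the two failure groups
    CREATE DISKGROUP data NORMAL REDUNDANCY
      FAILGROUP fg1 DISK '/dev/rdsk/c2t0d1'
      FAILGROUP fg2 DISK '/dev/rdsk/c2t0d2';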
    So I'm not sure why you are all uppercase about partitioning....

  • Best Practices for Exporting Files??

    I'm new to Premiere (coming from FCP). I used Premiere months ago to compress some ProRes files to H.264 files for the web. I sent the files through Media Encoder and everything seemed fine. However, I realized after several weeks that the audio in all of the files was a few frames out of sync. Having not been a Premiere user at the time, I did not do much research and decided to just use MPEG Streamclip from then on.
    Now that I'm learning how to use Premiere, I looked up the issue on the forums and found that many people have had similar issues with the audio being out of sync after exporting. However, there are tons of different scenarios in which it seems to be occurring. The one common variable that I've noticed (among many of the threads, but not all) is that many of the people are exporting to a QuickTime format.
    While I don't remember all the details of my export and sequence settings from my issue months ago (so I don't want to address that specific case), I am curious what some "best practices" are when exporting from Premiere Pro. Is there any advantage/disadvantage to using AME rather than exporting directly from Premiere Pro? In general, I will just be exporting H.264 files for the web, MPEG-2 for DVD, and ProRes 422 for After Effects (or sometimes to bring into MPEG Streamclip).
    I shoot almost entirely in AVCHD, usually at 1080p 30fps. I'm running CS5 on a MacBook Pro 15" 2.0 Quad Core i7 with 8 GB RAM.
    While the question may seem broad, my main concern is avoiding out-of-sync audio. But I also just want to know of any important details to keep in mind to prevent other issues.
    Thanks,
    Mike

    > I'm running CS5...
    What specific version? We're up to 5.0.4 now.
    There have been bug fixes for audio/video synch in the updates. One of the fixes was for a bug in the conforming of audio and indexing of MPEG files, so you need to delete your media cache files and let Premiere Pro create new ones for this fix to take effect.
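    If it helps, on a Mac the cache lives under ~/Library/Application Support/Adobe/Common/ by default (the location is shown in Preferences > Media), so with Premiere Pro closed, something like this clears it (a sketch assuming the default location):
    # quit Premiere Pro first; it rebuilds these caches on the next launch
    rm -rf "$HOME/Library/Application Support/Adobe/Common/Media Cache Files"
    rm -rf "$HOME/Library/Application Support/Adobe/Common/Media Cache"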

  • What are the best practices for using the enhancement framework?

    Hello enhancement framework experts,
    Recently, my company upgraded to SAP NW 7.1 EhP6.  This presents us with the capability to use the enhancement framework.
    A couple of senior programmers were asked to deliver a guideline for use of the framework.  They published the following statement:
    "SAP does not guarantee the validity of the enhancement points in future releases/versions. As a result, any implemented enhancement points may require significant work during upgrades. So, enhancement points should essentially be used as an alternative to core modifications, which is a rare scenario.".
    I am looking for confirmation or contradiction to the statement  "SAP does not guarantee the validity of enhancement points in future releases/versions..." .  Is this a true statement for both implicit and explicit enhancement points?
    Is the impact of activated explicit and implicit enhancements much greater to an SAP upgrade than BAdi's and user exits?
    Is there any SAP published guidelines/best practices for use of the enhancement framework?
    Thank you,
    Kimberly
    Edited by: Kimberly Carmack on Aug 11, 2011 5:31 PM

    Found an article that answers this question quite well:
    [How to Get the Most From the Enhancement and Switch Framework as a Customer or Partner - Tips from the Experts|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/c0f0373e-a915-2e10-6e88-d4de0c725ab3]
    Thank you Thomas Weiss!

  • Best practices for customizing the standard OBIA metadata repository (RPD)

    Hello
    Is there a best-practices document published by Oracle or a partner that talks about best practices for customizing the OBIA out-of-the-box RPD? I am specifically looking for guidance around:
    1. adding new objects to the physical layer, or modifying an existing table definition to add more columns
    2. building new logical columns in the BMM layer
    3. modifying the existing subject areas.
    Thanks

    There is a very good presentation by Rittman Mead on extending and customizing BI Applications. Refer to this link (http://www.rittmanmead.com/files/OOW2008%20-%20Extending%20and%20Customizing%20the%20BI%20Apps%20Data%20Warehouse.pdf).
    Thanks,
    -Amith.

  • Best Practices for Accessing the Configuration data Modelled as XML File in

    Hi,
    I referred to a couple of blog posts/forum threads on how to model and access configuration data as XML inside OSB.
    One of the easiest ways is described in
    Re: OSB: What is best practice for reading configuration information
    Another could be
    uploading the XML data as an .xq file (creating an .xq file and copy-pasting all the configuration XML into it).
    I need expert answers for the following.
    1] I have an .xsd file representing the configuration data. The structure of the XSD is:
    <FrameworkConfig>
      <Config type="common" key="someKey">propertyvalue</Config>
    </FrameworkConfig>
    2] As my project moves from one environment to another, the property value will change according to the environment.
    For Dev:
    <FrameworkConfig>
      <Config type="common" key="someKey">propertyvalue_Dev</Config>
    </FrameworkConfig>
    For Stage:
    <FrameworkConfig>
      <Config type="common" key="someKey">propertyvalue_Stage</Config>
    </FrameworkConfig>
    3] Let's say I create the following folder structure to store the configuration file specific to the dev/stage/prod instance:
    OSB Project Folder
    |--- Dev
    |      |-- Dev_Config_file.xml
    |--- Stage
    |      |-- Stage_Config_file.xml
    |--- Prod
    |      |-- Prod_Config_file.xml
    4] I need a way to load these property files as an XML element/variable inside the OSB message flow. I can't use the XPath function fn:doc("URL") because I don't know the exact path of the XML on the deployed server.
    5] I also need to look up/model a value that specifies the current server type (Dev/Stage/Prod) on which the OSB message flow is running - say, some construct that acts as a global configuration and is accessible inside the OSB message flow. If I get the value "Dev" for the global variable, I will load the XML config file under the Dev directory at runtime, containing the key-value pairs for the Dev environment.
    6] This thread Re: OSB: What is best practice for reading configuration information
    suggests designing a web application that serves the XML file over HTTP and reading the contents into a variable (which in turn can be used in the OSB message flow). Can we address this problem without creating the extra project and adding the dependencies? I read about the configuration-file approach too, but the sample configuration file doesn't show an entry for an .xml file as a resource.
    Hope I am clear. I really appreciate your comments and suggestions.
    Sushil
    Edited by: Sushil Deshpande on Jan 24, 2011 10:56 AM

    If you can enforce some sort of naming convention for the transport endpoint of this proxy service across the environments, where the environment name is part of the endpoint, you may be able to retrieve it from $inbound in the message pipeline.
    E.g. http://osb_host/service/prod/service1 ==> prod and http://osb_host/service/stage/service1 ==> stage; then $inbound/ctx:transport/ctx:uri will give you /service/prod/service1 or /service/stage/service1, and by applying appropriate XPath functions you will be able to extract the environment name.
    Check this link for details on $inbound/ctx:transport: http://download.oracle.com/docs/cd/E13159_01/osb/docs10gr3/userguide/context.html#wp1080822
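    A minimal sketch of that extraction (XQuery, assuming the ctx namespace is already bound in the pipeline and the /service/<env>/<name> convention above):
    (: the relative URI OSB reports for the request, e.g. /service/stage/service1 :)
    let $uri := $inbound/ctx:transport/ctx:uri/text()
    (: tokenizing on "/" yields ("", "service", "stage", "service1"), so item 3 is the environment :)
    return fn:tokenize($uri, "/")[3]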

  • Best practice for replicating the IDM (OIM 11.1.1.5.0) environment

    Dear,
    I need to replicate the IDM production environment to build the IDM test environment. I have OIM 11.1.1.5.0 in production with lots of custom code.
    I have two approaches:
    Approach 1:
    Manually deploy the code through JDeveloper and export and import all the artifacts. The issue is that this will require a lot of time and resolving dependencies.
    Approach 2:
    Export the OIM, MDS, and SOA schemas and import them into the new IDM test DB environment.
    Could you please suggest which is the best practice, and whether you have some pointers to achieve the same?
    Appreciate your help.
    Thanks,

    Follow your build document for the same steps you used to build production.
    You should know where all your code is. You can use the Deployment Manager to export your configurations. Export customized files from MDS. Just follow the process again, and you will have a clean instance not containing production data.
    It only takes a lot of time if your client is lacking documentation or if you're not familiar with all the parts of the environment. What's 2-3 hours compared to all the issues you will run into if you copy databases or import/export schemas?
    -Kevin
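    A sketch of the MDS export step mentioned above (WLST - the connection details and the application/server names are assumptions; check your own deployment):
    # connect WLST to the admin server of the OIM domain
    connect('weblogic', '<password>', 't3://adminhost:7001')
    # export customized metadata documents from MDS to a local directory
    exportMetadata(application='OIMMetadata', server='oim_server1', toLocation='/tmp/oim_mds_export')
    disconnect()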
