Best Practice for using Aperture with an external HD

Greetings, I currently have a fairly large Aperture library, about 210 GB; it's managed by Aperture and lives on my internal MacBook Pro hard drive.  Not good, I know.  I just received my new MBP with Thunderbolt and also purchased the new Promise Pegasus 12 TB RAID storage device.  My question to the community is this: what is the best way to set up Aperture to take advantage of the external RAID storage while still keeping some photos on my internal drive, so I can work when away from the Pegasus device?
Any thoughts are appreciated.
Regards,
Scott 


Similar Messages

  • Best Practices for VidConf with external parties

    We currently use Sony Video conference units for our end units. On the backend, we use a Cisco 3515 for multipoint conferences.
    We now need to be able to do video conferences with external parties using our MCU.
    Since there are many ways of doing this, are there solutions that work better than others? Basically I need to publish a doc that says 'here is what you will need. This port open on your firewall, a public IP address that is not NATd....'
    Any help would be appreciated.

    David,
    Unfortunately, Cisco has not (currently) chosen to implement H.460.x in their H.323 infrastructure solutions, so a Cisco GK could not be used in combination with a Tandberg SBC (session border controller). However, we currently use Cisco gatekeepers for everything but firewall traversal. For traversal, the Tandberg GK functions as the inside portion of the traversal solution and proxies for all my codecs (a mix of six different Tandberg and Polycom product lines)--even those that are H.460-capable. Combined with the SBC is a "public" GK, a Cisco MCM (25xx router) sitting on the public side of the firewall and neighbored to the session border controller.
    The "public" GK is for the entities who write custom policies in their firewalls or set their codecs on the public side of their security. If they are not H.460 compliant, they have to register with a GK; they can't register directly with the SBC. Hence the public GK, which is simply neighbored to the SBC.
    I would encourage you to closely look at the Polycom V2IU as well. It has come a LONG way since it was introduced a couple of years ago.
    Personally, I still don't feel like the V2IU has as much flexibility nor does it implement a dialing methodology best suited for converging networks. It is an ALG (App Layer GW) that has been tweaked to support H.323 traversal, so I don't think it will ever truly match the Tandberg Expressway solution apples-apples, but it is dramatically less expensive and thus worth considering.
    We tested both the Tandberg and Polycom traversal solutions with our internal CAC (call admission and control infrastructure), which is made up of multiple Cisco GK products in a fully meshed neighboring scenario prior to purchase of a traversal solution, and the Cisco products interoperated with both traversal solutions.
    Cisco did propose a Layer 3 solution, but we felt it best to pursue something based within the H.323 umbrella standard.
    If you want to talk more, please email me at [email protected] w/ direct contact info and I'll be happy to assist you in any way I can.
    Greg

  • What is the best practice for using the Calendar control with the Dispatcher?

    It seems as if the Dispatcher is restricting access to the Query Builder (/bin/querybuilder.json) as a best practice regarding security.  However, the Calendar relies on this endpoint to build the events for the calendar.  On Author / Publish this works fine but once we place the Dispatcher in front, the Calendar no longer works.  We've noticed the same behavior on the Geometrixx site.
    What is the best practice for using the Calendar control with Dispatcher?
    Thanks in advance.
    Scott


  • What are the best practices for using the enhancement framework?

    Hello enhancement framework experts,
    Recently, my company upgraded to SAP NW 7.1 EhP6.  This presents us with the capability to use the enhancement framework.
    A couple of senior programmers were asked to deliver a guideline for use of the framework.  They published the following statement:
    "SAP does not guarantee the validity of the enhancement points in future releases/versions. As a result, any implemented enhancement points may require significant work during upgrades. So, enhancement points should essentially be used as an alternative to core modifications, which is a rare scenario.".
    I am looking for confirmation or contradiction to the statement  "SAP does not guarantee the validity of enhancement points in future releases/versions..." .  Is this a true statement for both implicit and explicit enhancement points?
    Is the impact of activated explicit and implicit enhancements much greater to an SAP upgrade than BAdi's and user exits?
    Is there any SAP published guidelines/best practices for use of the enhancement framework?
    Thank you,
    Kimberly
    Edited by: Kimberly Carmack on Aug 11, 2011 5:31 PM

    Found an article that answers this question quite well:
    [How to Get the Most From the Enhancement and Switch Framework as a Customer or Partner - Tips from the Experts|http://www.sdn.sap.com/irj/scn/index?rid=/library/uuid/c0f0373e-a915-2e10-6e88-d4de0c725ab3]
    Thank you Thomas Weiss!

  • Best practice for use of spatial operators

    Hi All,
    I'm trying to build a .NET toolkit to interact with Oracle's spatial operators. The most common use of this toolkit will be to find results which are within a given geometry - for example, select parish boundaries within a county.
    Our boundary data is highly detailed, commonly containing upwards of 50,000 vertices for a county-sized polygon.
    I've currently been experimenting with queries such as:
    select a.*
    from
    uk_ward a,
    uk_county b
    where
    UPPER(b.name) = 'DORSET COUNTY' and
    sdo_relate(a.geoloc, b.geoloc, 'mask=coveredby+inside') = 'TRUE';
    However, the speed is unacceptable, especially as most implementations of the toolkit will be web-based. The query above takes around a minute to return.
    Any comments or thoughts on the best practice for use of Oracle spatial in this way will be warmly welcomed. I'm looking for a solution which is as quick and efficient as possible.

    Thanks again for the reply... the query currently takes just under 90 seconds to return. Here are the results from the execution plan run in SQL*Plus:
    Elapsed: 00:01:24.81
    Execution Plan
    Plan hash value: 598052089
    | Id | Operation | Name | Rows | Bytes | Cost (%CPU)| Time |
    | 0 | SELECT STATEMENT | | 156 | 46956 | 76 (0)| 00:00:01 |
    | 1 | NESTED LOOPS | | 156 | 46956 | 76 (0)| 00:00:01 |
    |* 2 | TABLE ACCESS FULL | UK_COUNTY | 2 | 262 | 5 (0)| 00:00:01 |
    | 3 | TABLE ACCESS BY INDEX ROWID| UK_WARD | 75 | 12750 | 76 (0)| 00:00:01 |
    |* 4 | DOMAIN INDEX | UK_WARD_SX | | | | |
    Predicate Information (identified by operation id):
    2 - filter(UPPER("B"."NAME")='DORSET COUNTY')
    4 - access("MDSYS"."SDO_INT2_RELATE"("A"."GEOLOC","B"."GEOLOC",'mask=coveredby+inside')='TRUE')
    Statistics
    20431 recursive calls
    60 db block gets
    22432 consistent gets
    1156 physical reads
    0 redo size
    2998369 bytes sent via SQL*Net to client
    1158 bytes received via SQL*Net from client
    17 SQL*Net roundtrips to/from client
    452 sorts (memory)
    0 sorts (disk)
    125 rows processed
    The wards table has 7545 rows, the county table has 207.
    We are currently on release 10.2.0.3.
    All I want to do with this is generate results which fall in a particular geometry. Most of my testing has been successful; I just seem to run into issues when querying against a county-sized polygon - I guess due to the number of vertices.
    Also looking through the forums now for tuning topics...
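    For reference, here is how the same query might be issued from application code with a bind variable (sketched in Java/JDBC; the poster's .NET toolkit would use the equivalent ODP.NET calls, and the connection details here are placeholders). The ORDERED hint is a commonly suggested safeguard for sdo_relate queries to keep the single county row as the driving table; the plan above shows the optimizer already chose that order, so treat this as a sketch of the pattern rather than a verified fix:

    import java.sql.*;

    public class WardsInCounty {
        public static void main(String[] args) throws SQLException {
            // Placeholder connection details.
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password")) {
                // ORDERED keeps uk_county (listed first) as the driving table,
                // so the uk_ward spatial domain index is probed only once.
                String sql = "select /*+ ORDERED */ a.* "
                           + "from uk_county b, uk_ward a "
                           + "where UPPER(b.name) = ? "
                           + "and sdo_relate(a.geoloc, b.geoloc, 'mask=coveredby+inside') = 'TRUE'";
                try (PreparedStatement ps = conn.prepareStatement(sql)) {
                    ps.setString(1, "DORSET COUNTY");
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            // Assumes uk_ward has a "name" column.
                            System.out.println(rs.getString("name"));
                        }
                    }
                }
            }
        }
    }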

  • Best practice for dealing with Recordsets

    Hi all,
    I'm wondering what is best practice for dealing with data retrieved via JDBC as ResultSets without involving third-party products such as Hibernate etc. I've been told NOT to use ResultSets throughout my applications, since they hold resources and are expensive. I'm wondering which collection type is best to convert ResultSets into. The apps I'm building are web-based, using JSPs as the presentation layer, plus beans and servlets.
    Many thanks
    Erik

    There is no requirement that DAOs have a direct mapping to database tables. One of the advantages of the DAO pattern is that the business layer isn't directly aware of the persistence layer. If the joined data is used in the business code as if it were an unnormalized table, then you might want to provide a DAO for the joined data. If the joined data provides a subsidiary object within some particular object, you might add the access method to the DAO for the outer object.
    eg:
    In a user permissioning system where:
    1 user has many userRoles
    1 role has many userRoles
    1 role has many rolePermissions
    1 permission has many rolePermissions
    ie. there is a many to many relationship between users and roles, and between roles and permissions.
    The administrator needs to be able to add and delete permissions for roles and roles for users, so the CRUD for the rolePermissions table is probably most useful in the RoleDAO, and the CRUD for the userRoles table in the UserDAO. DAOs can also call each other.
    During operation the system needs to be able to get all permissions for a user at login, so the UserDAO should provide a readPermissions method that does a rather complex join across the user, userRole, rolePermission and permission tables.
    Note that if the system I just described were done with LDAP, a hierarchical database or an object database, the userRoles and rolePermissions tables wouldn't even exist; these are RDBMS artifacts, since relational databases don't understand many-to-many relationships. This is a good reason to avoid providing DAOs that give access to those tables.
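    As a minimal sketch of that readPermissions method, here is one way to copy the joined rows into a plain java.util.List so no ResultSet escapes the DAO (table and column names are invented for the example):

    import java.sql.*;
    import java.util.ArrayList;
    import java.util.List;

    public class UserDAO {
        private final Connection conn;

        public UserDAO(Connection conn) {
            this.conn = conn;
        }

        // Joins user -> userRole -> rolePermission -> permission, and copies
        // the rows into a List so the ResultSet is closed before returning.
        public List<String> readPermissions(long userId) throws SQLException {
            String sql = "select distinct p.name "
                       + "from users u "
                       + "join user_roles ur on ur.user_id = u.id "
                       + "join role_permissions rp on rp.role_id = ur.role_id "
                       + "join permissions p on p.id = rp.permission_id "
                       + "where u.id = ?";
            List<String> permissions = new ArrayList<>();
            try (PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setLong(1, userId);
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        permissions.add(rs.getString("name"));
                    }
                }
            }
            return permissions;
        }
    }

    Copying into a List (or a small value object per row) up front also answers Erik's question: the ResultSet and its underlying resources are released immediately, and the web tier only ever sees plain collections.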

  • Best practices for using the knowledge directory

    Anyone know when it is best to store docs in the Knowledge Directory versus Collab? They are both searchable, but I guess you can publish from the Publisher to the KD. Anyone have any best practices for using the KD or setting up taxonomies in the KD?


  • Best practices for using the 'cost details' fields

    Hi
    Please could you advise us on the best practices for using the 'cost details' field within Pricing. Currently I cannot find a way to surface the individual Cost Details fields within the Next Generation UI, even with the tick box for 'display both cost and price' ticked. It seems that these get surfaced when the Next Generation UI is turned off, but I cannot find them when it is turned on. We can see the 'Pricing Summary' field, but this does not fulfill our needs, as some of our services have both recurring and one-off costs.
    Attached are some screenshots to further explain the situation.
    Many thanks,
    Richard Thornton

    Hi Richard,
    If you need to configure dynamic pricing that may vary by tenant and/or if you want to set up cost drivers that are service item attributes, you should configure Billing Tables in the Demand Management module in 10.0. 
    The cost detail functionality in 9.4 will likely be merged with the new pricing feature in 10.0.  The current plan is not to bring cost detail into the Service Catalog module.

  • Best practice for using messaging in medium to large cluster

    What is the best practice for using messaging in a medium to large cluster, in a system where all the clients need to receive all the messages and some of the messages can be really big (a few megabytes and maybe more)?
    I will be glad to hear any suggestion or to learn from others experience.
    Shimi

    publish/subscribe, right?
    lots of subscribers, big messages == lots of network traffic.
    it's a wide open question, no?
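    To make the publish/subscribe suggestion concrete, here is a bare-bones JMS topic subscriber (the JNDI names are placeholders; with multi-megabyte payloads you would normally also consider splitting or streaming the messages rather than sending them whole):

    import javax.jms.*;
    import javax.naming.InitialContext;

    public class ClusterSubscriber {
        public static void main(String[] args) throws Exception {
            InitialContext ctx = new InitialContext(); // JNDI config assumed
            ConnectionFactory factory =
                    (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
            Topic topic = (Topic) ctx.lookup("jms/ClusterMessages");

            Connection connection = factory.createConnection();
            Session session =
                    connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageConsumer consumer = session.createConsumer(topic);

            // Every subscriber receives every message published to the topic,
            // which is exactly why big messages multiply network traffic.
            consumer.setMessageListener(message -> {
                try {
                    if (message instanceof BytesMessage) {
                        System.out.println("received "
                                + ((BytesMessage) message).getBodyLength() + " bytes");
                    }
                } catch (JMSException e) {
                    e.printStackTrace();
                }
            });
            connection.start();
        }
    }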

  • Best Practice for using multiple models

    Hi Buddies,
     Can you tell me the best practices for using multiple models in a single WD application?
    I am using 3 RFCs in a single application for my function. Each time, I am importing that RFC model under WD ---> Models, and I did the model binding separately to the Component Controller. Is this the right way to implement multiple models in a single application?

    It very much depends on your design, but one RFC per model is definitely a no-no.
    Refer to this document to understand how you should use the model in the most efficient way:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/705f2b2e-e77d-2b10-de8a-95f37f4c7022?quicklink=events&overridelayout=true
    Thanks
    Prashant

  • Best Practice for Using Static Data in PDPs or Project Plan

    Hi There,
    I want to make custom reports using PDPs & Project Plan data.
    What is the Best Practice for using "Static/Random Data" (which is not available in MS Project 2013 columns) in PDPs & MS Project 2013?
    Should I add that data in Custom Field (in MS Project 2013) or make PDPs?
    Thanks,
    EPM Consultant
    Noman Sohail

    Hi Dale,
    I have a Project Level custom field "Supervisor Name" that is used for Project Information.
    For the purpose of viewing that Project Level custom field's data in Project views, I have made a Task Level custom field "SupName" and used the formula:
    [SupName] = [Supervisor Name]
    That shows Supervisor Name in Schedule.aspx.
    ============
    Question: I want that Project Level custom field "Supervisor Name" in
    My Work views (Tasks.aspx).
    The field is enabled in Tasks.aspx BUT the data is not present / the column is blank.
    How can I get the data into the "My Work" views?
    Noman Sohail

  • Best Practices for Using JSF with AJAX - BluePrints OR Ajax4Jsf ?

    I am a newbie to AJAX4JSF. I think it provides Rapid Application Development (RAD) just by using tags like a4j:, without the need to develop complex JSF custom components as shown in the BluePrints catalog
    https://bpcatalog.dev.java.net/ajax/jsf-ajax/
    I understand the purpose of developing JSF custom components as reusable pieces for use with AJAX. But it's complex and requires a lot of coding, i.e. PhaseListeners and managed beans. There should be an easier way to do this, especially since our project needs a RAD tool like AJAX4JSF.
    Any suggestions will be highly appreciated
    Regards
    Bansi

    Bansi, you are trying to compare oranges to apples. The BluePrints catalog is a historical retrospective on what people thought about AJAXifying JSF in the past. Currently, the playground has moved to the jsf-extension project. Look for DynaFaces there.

  • Best Practices for Using Photoshop (and Computing in General)

    I've been seeing some threads that lead me to realize that not everyone knows the best practices for doing Photoshop work on a computer, or for conscientious computing in general.  I thought it might be a good idea for those of us with some experience to contribute and discuss best practices for making the Photoshop and computing experience more reliable and enjoyable.
    It'd be great if everyone would contribute their ideas, and especially their personal experience.
    Here are some of my thoughts on data integrity (this shouldn't be the only subject of this thread):
    Consider paying more for good hardware. Computers have almost become commodities, and price shopping abounds, but there are some areas where spending a few dollars more can be beneficial.  For example, the difference in price between a top-of-the-line high performance enterprise class hard drive and the cheapest model around with, say, a 1 TB capacity is less than a hundred bucks!  Disk drives do fail!  They're not all created equal.  What would it cost you in aggravation and time to lose your data?  Imagine it happening at the worst possible time, because that's exactly when failures occur.
    Use an Uninterruptible Power Supply (UPS).  Unexpected power outages are TERRIBLE for both computer software and hardware.  Lost files and burned-out hardware are a possibility.  A UPS that will power the computer and monitor can be found at the local high tech store and doesn't cost much.  The modern ones will even communicate with the computer via USB to perform an orderly shutdown if the power failure goes on too long for the batteries to keep going.  Again, how much is it worth to you to avoid a computer outage and loss of data?
    Work locally, copy files elsewhere.  Photoshop likes to be run on files on the local hard drive(s).  If you are working in an environment where you have networking, rather than opening a file right off the network, then saving it back there, consider copying the file to your local hard drive then working on it there.  This way an unexpected network outage or error won't cause you to lose work.
    Never save over your original files.  You may have a library of original images you have captured with your camera or created.  Sometimes these are in formats that can be re-saved.  If you're going to work on one of those files (e.g., to prepare it for some use, such as printing), and it's a file type that can be overwritten (e.g., JPEG), as soon as you open the file save the document in another location, e.g., in Photoshop .psd format.
    Save your master files in several places.  While you are working in Photoshop, especially if you've done a lot of work on one document, remember to save your work regularly, and you may want to save it in several different places (or copy the file after you have saved it to a backup folder, or save it in a version management system).  Things can go wrong and it's nice to be able to go back to a prior saved version without losing too much work.
    Make Backups.  Back up your computer files, including your Photoshop work, ideally to external media.  Windows now ships with a quite good backup system, and external USB drives with surprisingly high capacity (e.g., Western Digital MyBook) are very inexpensive.  The external drives aren't that fast, but a backup you've set up to run late at night can finish by morning, and if/when you have a failure or loss of data, you can restore from it.  And if you're really concerned with backup integrity, you can unplug an external drive and take it to another location.
    This stuff is kind of "motherhood and apple pie" but it's worth getting the word out I think.
    Your ideas?
    -Noel

    APC Back-UPS XS 1300.  $169.99 at Best Buy.
    Our power outages here are usually only a few seconds; this should give my server about 20 or 25 minutes run-time.
    I'm setting up the PowerChute software now to shut down the computer when 5 minutes of power is left.  The load with the monitor sleeping is 171 watts.
    This has surge protection and other nice features as well.
    -Noel

  • Best Practice for Use of ABAP in Customizing SRM and/or CRM

    I was wondering if there is a document that defines best practices for the use of ABAP with the installation and customization of SRM and/or CRM, such as the amount of ABAP coding typically required, and best practices around the use of ABAP for customization and configuration.
    Thanks.

    Hi, Johnson
    Sorry, please don't mind, but you are not in the right place to ask a question like this.
    Please read "The Forum Rules of Engagement" before posting!
    Thanks and Regards,
    Faisal

  • JSF - Best Practice For Using Managed Bean

    I want to discuss what is the best practice for managed bean usage, especially using session scope or request scope to build database-driven pages.
    ---- Session Bean ----
    - In the book Core JavaServer Faces, the author mentioned that in most cases a session bean should be used, unless the processing is passed on to another handler. Since JSF can store state on the client side, I think storing everything in the session is not a big memory concern (can some expert confirm this is true?). Session objects are easy to manage and state can be shared across pages. It can make programming easy.
    In the case of a page bound to a result set, the bean usually holds a java.util.List object for the result, which is initialized in the constructor by querying the database first. However, this approach has a problem: when the user navigates to another page and comes back, the data is not refreshed. You can of course solve the problem by issuing the query every time in your getXXX method, but you need to be very careful that you don't bind this XXX property too many times. In the case of querying in getXXX, setXXX is also tricky, as you don't have a member to set. You usually don't want to persist the result set changes in setXXX, as the changes may not be final; instead, you want to handle them in an action listener (like save(actionEvent)).
    I would be glad to see your thoughts on this.
    --- Request Bean ---
    A request bean is initialized every time a request is made. It sometimes drove me nuts, because JSF seems not to be very consistent in updating model values. Suppose you have a page showing a parent with a list of children records from the database, and you also allow the user to change the children directly. If I bind the parent to a bean called #{Parent} and bind the children to an ADF table (value="#{Parent.children}" var="rowValue"), and I set Parent as request scope, the setChildren method is never called when I submit the form. Not sure if this is just ADF or a JSF problem. But if you change the bean to session scope, everything works fine.
    I believe JSF doesn't update the bindings for all component attributes. It only updates the input components' value bindings. Someone please verify this is true.
    In many cases, I found a request bean very hard to work with if there are lots of updates. (I have had lots of trouble updating the binding values for rendered attributes.)
    However, a request bean works fine for read-only pages and simple bound forms. It definitely frees up memory quicker than a session bean.
    ----- any comments or opinions are welcome!!! ------
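    To illustrate the session-scope trade-off discussed above, here is a hypothetical session-scoped bean that caches the query result and re-queries only via an explicit refresh action, rather than hitting the database in every getXXX call (all names are invented):

    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical bean, registered in faces-config.xml with session scope.
    public class OrderListBean {
        private List<String> orders; // cached result of the last query

        // Value binding: #{orderListBean.orders}. Queries once, then serves
        // the cached list, so repeated getter calls during rendering are cheap.
        public List<String> getOrders() {
            if (orders == null) {
                orders = loadOrders();
            }
            return orders;
        }

        // Action binding for a "Refresh" button: drops the cache so the next
        // render re-queries, avoiding stale data after navigating away and back.
        public String refresh() {
            orders = null;
            return null; // stay on the current view
        }

        private List<String> loadOrders() {
            // Stand-in for the real JDBC/DAO call.
            List<String> result = new ArrayList<>();
            result.add("order-1001");
            return result;
        }
    }

    A button bound to #{orderListBean.refresh} then gives the user explicit control over when the stale-data window closes.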

    I think it should be either Option 2 or Option 3.
    Option 2 would be necessary if the bean data depends on some request parameters.
    (Example: Getting customer bean for a particular customer id)
    Otherwise Option 3 seems the reasonable approach.
    But, I am also pondering on this issue. The above are just my initial thoughts.
