Best Practice for Attaching Metadata, Organizing & Archiving Photos

I am about to start the daunting task of organizing and archiving ALL of my photos so that I can access them from an external hard drive (or two) when I need them. I want to understand the best way to do this before I start, so that thousands of hours later I don't realize that I have overlooked something critical. I have already spent hours trying to find this information online, without satisfactory results. I have both iPhoto ('09 v8.1.2) and Aperture 3, which I am just learning to use. I want to back up to my WD My Book. I have already transferred some pictures and realized that, though I had them nicely organized in Albums in iPhoto, that organization doesn't transfer over to the My Book, so the file names are unrecognizable as any type of descriptor. I believe that I need to assign some metadata to each photo so that I can do searches to find particular people or subjects when needed.
Here are all of the things I want/need to know:
1. Which Mac program would be best for attaching metadata or adjusting file names: iPhoto or Aperture 3?
2. What is the best way to attach metadata to pictures so that the information is permanent? Tagging (allowing multiple search options)? Batch Name Change (allowing limited search capabilities)? Other?
a. If I TAG all of my photos and then try to use that information outside of iPhoto or Aperture 3 (from the external HD), will it still be accessible for searches?
3. After attaching all of this information, what do I need to know about transferring the photos to an external HD? Are there multiple copies of the same photo (original, edited) and if so, are both transferred? Are they both deleted when I move the pictures to the trash?
4. Is there anything else that I need to know??
Your expertise would be greatly appreciated.
Thank you.

You are trying to defeat the photo management of iPhoto - my suggestion is to learn iPhoto and use it, or choose another program - iPhoto does not work well unless it manages the entire library and you access it using the tools built into the OS - see the forum discussion of the many correct and safe ways to access your iPhoto library photos.
Aperture may or may not be a better choice - try the Aperture forum for detailed info.
Backing up iPhoto is trivial - simply drag the iPhoto library, intact as a single entity, to another hard drive - or use an incremental backup program to do it as changes are made - I use Time Machine to one backup drive and SuperDuper to another.
I believe that I need to assign some metadata to each photo so that I can do searches to find particular people or subjects when needed.
This is an integral part of iPhoto and one of its strengths
1. Which Mac program would be best for attaching metadata or adjusting file names: iPhoto or Aperture 3?
iPhoto does a great job of this - file names are only adjusted when you export, and there is no need for them to be adjusted within the iPhoto library, since you should never go into it directly and never need to see the files directly
2. What is the best way to attach metadata to pictures so that the information is permanent?
Exporting from iPhoto
Tagging (allowing multiple search options)?
Using iPhoto and the OS tools
Batch Name Change (allowing limited search capabilities)?
Done during export if desired
Other?
Depends
a. If I TAG all of my photos and then try to use that information outside of iPhoto or Aperture 3 (from the external HD), will it still be accessible for searches?
In iPhoto, yes - so long as you are not going directly into the iPhoto library but are using the provided tools
3. After attaching all of this information, what do I need to know about transferring the photos to an external HD?
Simply drag the iPhoto library intact as a single entity to the EHD - the database is a single entity and it will be the same there (so long as the EHD is formatted Mac OS Extended (Journaled))
Are there multiple copies of the same photo (original, edited) and if so, are both transferred?
Yes, and yes.
Are they both deleted when I move the pictures to the trash?
You do not move photos to the trash - you select them in iPhoto and press the delete key, then empty the iPhoto trash (iPhoto menu ==> Empty iPhoto Trash)
4. Is there anything else that I need to know??
Lots - if you are going to use iPhoto, stop thinking about managing files and start thinking about using a SQL database, which is what iPhoto is - otherwise you need to use a different program
LN

Similar Messages

  • Best Practice for Initial Load Data

    Dear Experts,
    I would like to know the best practices, or the factors to consider, when performing an initial load.
    For example:
    1) requirements from business stakeholders for data analysis
    2) age of data needed to meet tactical reporting
    3) data dependencies across SAP modules
    4) Is there any best practice for loading master data?

    Hi,
    Check these links on master data loading:
    http://searchsap.techtarget.com/guide/allInOne/category/0,296296,sid21_tax305408,00.html
    http://datasolutions.searchdatamanagement.com/document;102048/datamgmt-abstract.htm
    Regards,
    Shikha

  • Best practice for putting binary data on the NMR

    Hi,
    We're creating a component that will consume messages off the NMR, encode them, and subsequently put them back on the NMR. What's the best practice for sending binary data over the NMR?
    1. setContent()?
    2. addAttachment()?
    3. setProperty()?
    If NormalizedMessage.setContent() is the desired approach, then how can you accomplish that?
    Thanks,
    Bruce

    setContent() is used only for XML messages. The recommended way to accommodate binary data is to use addAttachment().
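    For illustration, here is a minimal Java sketch of the addAttachment() route, using the standard JBI NormalizedMessage API and JavaMail's ByteArrayDataSource. The helper name and the "payload" attachment id are our own choices for the example, not part of the spec:

    import javax.activation.DataHandler;
    import javax.jbi.messaging.MessagingException;
    import javax.jbi.messaging.NormalizedMessage;
    import javax.mail.util.ByteArrayDataSource;

    public final class BinaryOnNmr {

        // Attach a binary payload to a normalized message rather than
        // forcing it into the XML content that setContent() expects.
        public static void attachBinary(NormalizedMessage msg, byte[] payload)
                throws MessagingException {
            ByteArrayDataSource source =
                    new ByteArrayDataSource(payload, "application/octet-stream");
            msg.addAttachment("payload", new DataHandler(source));
            // The consumer retrieves it by the same id:
            // DataHandler dh = msg.getAttachment("payload");
        }
    }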

  • Best Practice for Using Static Data in PDPs or Project Plan

    Hi There,
    I want to make custom reports using PDPs & Project Plan data.
    What is the Best Practice for using "Static/Random Data" (which is not available in MS Project 2013 columns) in PDPs & MS Project 2013?
    Should I add that data in Custom Field (in MS Project 2013) or make PDPs?
    Thanks,
    EPM Consultant
    Noman Sohail

    Hi Dale,
    I have a project-level custom field "Supervisor Name" that is used for Project Information.
    For the purpose of viewing that project-level custom field's data in project views, I have made a task-level custom field "SupName" and used the formula:
    [SupName] = [Supervisor Name]
    That shows the supervisor name in Schedule.aspx.
    ============
    Question: I want that project-level custom field "Supervisor Name" in My Work views (Tasks.aspx). The field is enabled in Tasks.aspx, but the data is not present (blank column). How can I get the data into My Work views?
    Noman Sohail

  • Best Practice for disparately sized data

    2 questions in about 20 minutes!
    We have a cache which holds approx 80K objects that expire after 24 hours. It's a rolling population, so the number of objects is fairly static. We're on a 64-node cluster with high units set, giving ample space. But the data has a wide size range, from a few bytes to 30 MB and everywhere in between. This causes some very hot nodes.
    Is there a best practice for handling a wide range of object size in a single cache, or can we do anything on input to spread the load more evenly?
    Or does none of this make any sense at all?
    Cheers
    A

    Hi A,
    It depends... if there is a relationship between keys and sizes, e.g. if this or that part of the key means that the size of the value will be big, then you can implement a key partitioning strategy, possibly together with key association on the key, in a way that evenly spreads the large entries across the partitions (and have enough partitions).
    Unfortunately, you would likely not get a totally even distribution across nodes, because you have a fairly small number of entries compared to the square of the number of nodes (btw, which version of Coherence are you using?)...
    Best regards,
    Robert
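    As a rough illustration of the key association idea above, here is a minimal Java sketch of a cache key that carries a size hint and implements Coherence's KeyAssociation interface, so that large values fan out across many association groups instead of piling up. The class name, the size-bucket encoding, and the modulus are assumptions for the example, not Coherence requirements:

    import com.tangosol.net.cache.KeyAssociation;
    import java.io.Serializable;

    // Cache key carrying a size hint. Coherence places entries that share
    // an associated key in the same partition, so returning many distinct
    // associated keys for large values spreads them across partitions.
    public class SizedKey implements KeyAssociation, Serializable {

        private final String id;       // natural identifier of the cached object
        private final int sizeBucket;  // e.g. 0 = small, 1 = medium, 2 = large

        public SizedKey(String id, int sizeBucket) {
            this.id = id;
            this.sizeBucket = sizeBucket;
        }

        @Override
        public Object getAssociatedKey() {
            // Derive the association from the bucket plus a slice of the id
            // hash, so "large" entries fan out over many partitions rather
            // than colliding on one.
            return sizeBucket + ":" + Math.floorMod(id.hashCode(), 257);
        }

        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof SizedKey)) return false;
            SizedKey other = (SizedKey) o;
            return sizeBucket == other.sizeBucket && id.equals(other.id);
        }

        @Override
        public int hashCode() {
            return 31 * id.hashCode() + sizeBucket;
        }
    }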

  • Best practice for retraction of data from BPC 10.0 to General Ledger

    Hi All,
    I have the requirement to retract data from BPC 10.0 to General Ledger.
    What is the best practice for doing this?
    I have read the "How To... Retract data from BPC 7.5 NetWeaver to SAP ERP Cost Center Accounting, Part I", but it covers transferring data to Cost Center Accounting. Will the General Ledger be populated automatically?
    What is your opinion on this?
    Best regards,
    JA

    Hi Rich,
    In BPC 10 NW, the data entered in input forms has to be retracted to ECC.
    For this retraction we are following the link below:
    http://www.sdn.sap.com/irj/scn/go/portal/prtroot/docs/library/uuid/c029accf-6d95-2d10-79aa-cc291180bf73?QuickLink=index&overridelayout=true&59180354379512
    In this document, I am unable to get the details of the class ZCL_BPC_RTRCT_SUPERCLASS. As it is a Z class, I am unable to create it in my system.
    Could you please help me?

  • Best practices for submitting CF data to an AJAX page?

    Hi everyone,
    I've got a project I'm working on for work and have hit a little problem.
    I am extracting data from my database, and after each piece of data (just numbers, usually 10 chunks of numbers) I tack a "|" onto the end. Then I output the data to the page. Back on my AJAX-enabled page, I get the responseText from that page and split it up using JavaScript and the pre-inserted "|".
    This seems to work fine, but it is quite messy. Also, it would really be nice to be able to do sorting and various other operations on the data with JavaScript instead of having to rely on CF's icky code logic.
    Can someone please enlighten me as to best practices for this type of thing? I suspect that I'll probably be using XML somehow, but I'd like your opinion.
    Thanks!

    Check out the Samples and Documentation portions of Adobe's Spry website for client-side use of JSON with Spry:
    http://labs.adobe.com/technologies/spry/home.html
    Here is a link to Adobe's Spry forums:
    http://www.adobe.com/cfusion/webforums/forum/categories.cfm?forumid=72&catid=602
    If you are using CF8, you can use the SerializeJSON function to convert a variable to JSON. You might also be interested in the cfsprydataset tag. CF8 documentation:
    http://livedocs.adobe.com/coldfusion/8/htmldocs/
    If you are using a previous version of CF, there is third-party JSON support; you can find links at http://json.org.
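    To make the design choice concrete: the win over a hand-rolled "|"-delimited string is that the client can parse, sort, and filter the data natively. Here is a small sketch of the same serialization idea that SerializeJSON provides in CF8, written in Java with the Jackson library purely for illustration (the library choice and class name are our assumptions, not part of the thread):

    import com.fasterxml.jackson.databind.ObjectMapper;
    import java.util.Arrays;
    import java.util.List;

    public class JsonOverPipes {
        public static void main(String[] args) throws Exception {
            // Sample row data that would otherwise be joined with "|".
            List<Integer> rows = Arrays.asList(42, 17, 93);

            // Serialize once on the server...
            String json = new ObjectMapper().writeValueAsString(rows);
            System.out.println(json); // prints [42,17,93]

            // ...and the AJAX page consumes responseText with JSON.parse()
            // instead of splitting on a hand-inserted delimiter.
        }
    }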

  • Best Practice for Placing meta tags in HEAD of document

    Good afternoon,
    We are utilizing Oracle UCM's ability to dynamically generate pages based on page templates, region templates, and contributor data files that contributors may edit and publish. We have a specific need to include meta tags in the head section of each page that are custom to the data present in each contributor data file. Including the meta tags in the region template that pertains to a section of our website does not work, as it does not include a <head> section. Nor can I place the meta tags in the page template, as the data in each meta tag is specific to the metadata surrounding each contributor data file.
    My question is this: Is there a best practice, or Oracle-supported method of accomplishing this task? I have to believe there is some way to accomplish this, as the need for meta tags for social media sites and search engines is very common.
    Thank you very much for your time,
    Josh

    Thanks for the reply, Jiri.
    I believe that the answer is 'yes' to both of your questions above. I do have a question about the formatting of the meta tag itself, as my main question is how to extract the needed information from a particular CDF so that it is displayed in the <head> section located in the page template.
    I have a Region Definition for the content section of my page with a 'Title' field, which a contributor can fill in, save, and create a Contributor Data File with their own specified title. Let's also say that my page template, which formats the entire page, looks like this (rough):
    <html>
      <head>
      <title>Web page</title>
      </head>
      <body>
      <!-- placeholder for header section -->
      <!-- placeholder for content section -->
      <!-- placeholder for footer section -->
      </body>
    </html>
    What would my <meta> tag look like for the following:
      <meta property="og:title" content="Title of the content section"/>
    Let's name our templates/definitions:
    Page Template: PT_META
    Region Template: RT_META
    Region Definition: RD_META
    Contributor Data File: CDF_META
    In other words, how do I extract the information included in the Contributor Data File (as well as its metadata) for use in the Page Template? Is there some sort of Idoc Script call that can be used?
    Thank you so much,
    Josh

  • Best Practices for Loading Master Data via a Process Chain

    Currently, we load attributes, text, and hierarchies before loading the transactional data. We have one meta chain. Loading the master data takes more than 2 hours. Most of the master data loads are full loads. We've noticed that a lot of the master data, especially text, has not changed or has changed very little since we implemented 18 months ago. Is there a precedent or best practice to follow, such as removing these processes from the chain? If so, how often should the master data load be run? We would really like to reduce the master data loading time. Is there any documentation that I can refer to? What are other organizations doing to reduce the time it takes to load master data?
    Thanks!
    Debby

    Hi Debby,
    I assume you're loading master data from a BI system? The forums here are related to SAP NetWeaver MDM, so maybe you should ask this question in a BI forum.
    Nevertheless, if your data doesn't change that much, maybe you could use a delta mechanism for extraction? This would send only the changed records, not all the unchanged ones every time. But this depends on your master data and of course on your extractors.
    Cheers
    Michael

  • Best Practice for Flat File Data Uploaded by Users

    Hi,
    I have the following scenario:
    1. Users would like to upload data from flat files and subsequently view their reports.
    2. The SAP BW support team would not be involved in the data upload process.
    3. Users would not go to RSA1 and use InfoPackages & DTPs. Hence, another mechanism for data upload is required.
    4. Users consist of two groups, external and internal. External users would not have access to the SAP system; however, access via a portal is acceptable.
    What are the best practices we should adopt for this scenario?
    Thanks!

    Hi,
    I can share what we do in our project.
    We receive the files from the web onto the application server, in a path dedicated to this process. The file placed on the server has a naming convention based on your project; you can name it. Every day a file with the same name, but different data, is placed on the server. The path in the InfoPackage is fixed to that location on the server. The process chain then triggers and loads the data from that fixed path on the application server. After the load completes, a copy of the file is taken as a backup and the file is deleted from that path.
    This happens every day.
    Rgds
    SVU123

  • Best practice for exposing internal data to external world?

    Currently we have our Internet server sitting in our corporate DMZ, taking website and web service requests from the outside world. Class libraries with compiled connection strings exist on that server. That server then has a connection through the firewall to the database server. I'm told that this is no longer the secure/recommended best practice.
    I'm told to consider having that Internet server make requests not of the database server, but rather of a layer in between (application server, intranet server, whatever) that has those same Web UI methods exposed, and then THAT server (being inside the firewall) connects to the database server.
    Is this the current recommended best practice for having external users interact with internal data? It seems like lots of hoops: an outside person's app queries Web UI methods on the Internet server, which in turn queries the same (duplicated) method on the intranet server, which then talks to the database.
    I'm just trying to determine the simplest practice, but also what is appropriately secure for our ASP.NET applications and services.
    Thanks.

    IMO this has little to do with SOA and everything to do with DMZs. What you're trying to stop is the same comm protocol accessing the database as accessed the web site. As long as you fulfil that, great. WCF can help here because it helps with configuring the transport of calls. Another mechanism is to use identities, but IMO it's easier to use firewalls and transports.
    http://pauliom.wordpress.com
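    To picture the recommended layering in code, here is a minimal sketch of the DMZ side, written in Java purely for illustration (the thread's own context is ASP.NET, and the endpoint URL, class, and method names are hypothetical). The point is that the Internet-facing server holds no connection string; it calls an internal service that owns the database access:

    import java.io.IOException;
    import java.io.InputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    // Runs on the Internet-facing (DMZ) server. It never talks to the
    // database directly; it calls an intranet service through the inner
    // firewall, and that service holds the database credentials.
    public final class OrdersClient {

        // Hypothetical intranet endpoint, reachable only from the DMZ host.
        private static final String INTERNAL_API = "http://app.internal.example/orders/";

        public static String fetchOrder(String orderId) throws IOException {
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(INTERNAL_API + orderId).openConnection();
            conn.setRequestMethod("GET");
            try (InputStream in = conn.getInputStream()) {
                return new String(in.readAllBytes(), StandardCharsets.UTF_8);
            } finally {
                conn.disconnect();
            }
        }
    }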

  • Best practices for attaching iSCSI devices

    Hello all,
    My environment: Nexus 5010 with 2148T FEX switches in the racks.
    Can (or should) I use the 2148T as the switch between clustered servers and the iSCSI SAN?  The SAN is a Dell MD3000i connected to Dell R710 servers, Win Ent 2008 R2 Server.
    Plans are to use two VLANs so I have dual data paths to the SAN. Being new to the Nexus, I'm not sure what the best practice/configuration is for that.
    Any suggestions on reading materials or best practice configurations would be appreciated!
    Thanks...
    Ted

    We recently implemented Nexus 7010s, 5020s, and 2248s in our data center. Now we would like to harden them from a security perspective; are there any best practices available for hardening Nexus devices?
    Hi Carl,
    I don't think a specific hardening document has been released, but you can refer to the generic IOS-based hardening guide and check which of its recommendations are supported on NX-OS:
    http://www.cisco.com/en/US/tech/tk648/tk361/technologies_tech_note09186a0080120f48.shtml
    Hope this helps!
    Ganesh.H
    Remember to rate helpful posts.

  • Best practice for migrating eLearning data from Sol Mgr 3 to 4?

    Greetings,
    What is the recommended method for moving eLearning data when migrating from Solution Manager 3 to version 4?
    Thanks in advance,
         Ken Henderson

    948115 wrote:
    Dear All,
    This is Priya.
    We are using ODI version 11.1.1.6.
    In my ODI project, we have separate installations for Dev, Test, and Prod, i.e. the master repositories are not common between the three. Now my code is ready in Dev. The Test environment has just been installed with ODI, and master and work repositories have been created. That's it.
    Now, I need to know and understand the simplest and best way to import the code from Dev and migrate it to the Test environment. Can someone describe this as a step-by-step procedure in 5-6 lines?
    If this is the first time you are moving to QA, it is better to export/import the complete work repositories. If it is not the first time, then create scenarios of the specific packages and export/import them to QA. In the case of scenarios you need not bother about models/datastores. Keep in mind that the logical schema names should be the same in QA as used in your Dev.
    Some questions on the current state:
    1. Do the IDs of the master and work repositories in Dev and Test need to be the same?
    They should be different.
    2. I usually see in the export file a repository ID of 999 and fail to understand what it is exactly. None of my master or work repositories are named with that ID.
    It is required to ensure object uniqueness across several work repositories. For more understanding you can refer to:
    http://docs.oracle.com/cd/E14571_01/integrate.1111/e12643/export_import.htm
    http://odiexperts.com/odi-internal-id/
    3. Logical architecture objects and contexts do not have an export option. What is the suitable alternative for this?
    If you are exporting the topology then you will get the logical connection and context details. If you are not exporting the topology then you need to manually create the context and the other physical/logical connections.
    Thanks,
    Priya

  • Best practice for implementing META tags for content items?

    Hello,
    The portal site where I manage our content (www.sers.state.pa.us) runs on the following WebCenter products:
    WebCenter Interaction 10.3.0.1
    WebCenter Publisher 6.5
    WebCenter Studio 2.2 MP1
    Content Service 10gR3
    The agency I work for is one of many for the commonwealth of PA which use this product suite, and I'm encountering some confusion about how to apply META tags to the content items for our site so we can have effective search results. According to the W3C site's explanation of META tag standards (http://www.w3schools.com/tags/tag_meta.asp), the tags for description, keywords, etc. should be within the head region of the HTML document. However, with how the WebCenter suite's configuration is set up, the head section of the HTML is closed off by the end of the template code for a common header portlet. I was advised to add fields for these meta values to our presentation and data entry templates for content; however, since the tags are then placed within the body section of the HTML, they fail to have any positive impact on the search results. Instead, when many of our content items are searched for, the description in the search results shows only text displayed in the header and left navigation of our template, which come early in the body section of the HTML.
    Please advise as to possible method(s) that would be best to implement usage of META tags so we can get our pages containing content to come up in search results with this relevant data.
    Thanks in advance,
    Brian

    If I remember right, the index server will capture meta tags even if they are not in the <head> section. It is not well-formed HTML, but I think I remember that we created meta tags down in the body section and the index server still picked them up. You might try this and see if it still works; I believe it worked in 10gR3. Let me know your results.

  • Best Practice for Keeping Imported Music Organized? - Help Please

    I download tunes and am looking for the best way to keep iTunes up to date with new downloads (as opposed to CD rips). Am I best to have all files in one overall folder and then just use iTunes to import the files from whatever folder they reside in (after the download) to the copy in iTunes and delete the original? To date I've been keeping them scattered across drives and just using iTunes as the pointer to the original, but I'm now losing track of what has been imported and what hasn't. Perhaps the above is a better way, and if so, is it best to delete iTunes, reinstall with an empty database, import all, and then delete from the original file locations? Help and thanks in advance!
    Porsche216

    I import similar to the way you described: import, then delete the original. I keep a folder just for importing. iTunes has a function for consolidating the library; it will copy files from the multiple locations to the iTunes library location. You would then have to delete the originals to free up space.
