Best approach for using Faces with growing children?

Hi all,
I'm a recent Aperture user with young kids (four-year-old twins), and I'm wondering how best to use Faces to identify the kids' faces.  I started working with about six months' worth of recent photos in Aperture (3.2.3) before importing my full iPhoto library.  Faces did quite a good job of identifying the kids in that sample.  I've just imported all of my iPhoto library, which includes photos back to when the kids were first born.  Faces is now making suggestions that seem pretty reasonable, but far from perfect.  I suspect that if I go through the process of training Faces to do a better job with the "baby faces", its performance would improve on the old photos I just added, but is that a bad idea?  I'm afraid that training Faces to recognize the baby version of someone will "broaden" the definition it's using, making recognition less accurate for new photos I add.  I could tell Faces that the baby versions are different people, but that might be worse -- then I'd have two very similar face profiles competing to "claim" new faces.  Does anyone have any experience that might help?
Thanks,
Brad

Let me start by stating:
- I don't know
- I don't think you'll get any help from Apple.
That said, here's what I suggest.  The Faces parameters are biometric.  The human head changes the least of any body part over the course of a life.  Still, there is bound to be an age before which Faces identification works less well because the data is "smeared"; after that age, Faces identification should work with its usual level of accuracy.
I would, for the present, ignore that.  Identify all Faces you have.  If Faces identification is sub-optimal, pick an age that you think corresponds to what I've laid out above, and create a new Face for all pictures of the individual prior to that age.  At that point you'll have two "Faces" for each individual: let's say "Robin Infant" and "Robin (post-infant)".
While Aperture makes it easy to combine Faces (drag-drop in Faces View), I don't know an easy way to split named Faces.  It's easy enough to group the Images you want (filter for "Face is ... " and for "Date is before ... ").  From there, you will have to rename the Faces one-by-one.  This goes quickly by pasting the name in the name field.
My guess is that the identification algorithm rejects the data from included faces that are outliers.  IOW, I don't think you can train Faces to be sloppy.
Let us know what you find out.

Similar Messages

  • What's best approach for a person with dual office?

    Hi,
    I have a UC560 system to deploy. There are 3 users in the company who also work from home. For home use I have ordered SPA525G2 phones, which have SSL VPN capabilities, but when they come into the office they will use a 7945 phone in their dedicated cube. The office phone would not be shared.
    What's the best way to configure this setup? I know extension mobility works for a hoteling setup, but this is different. Would it be possible to assign the same primary extension to 2 different phones without an overlay configuration?
    What's the best approach here?
    Thanks in advance,
    Sam

  • Best practice for using Muse with Lightroom?

    I'm creating a photography website. I use Lightroom to manage my photographs, and I keep the pictures for my site in Collections.
    In Muse, I'd like to have ALT text descriptions (or something similar) of each picture, so that search engines can see what's on my site.
    Is there a way to embed the descriptions into the image info in Lightroom, and have it export in a way that Muse picks it up?
    My problem is this: I've been creating ALT tags in my Muse slideshows (by right clicking on each image and selecting "Edit image properties..."). This works, but I lose my ALT tag if I export the images from Lightroom using any naming convention that doesn't maintain a consistent relationship between image and name. This makes it very difficult to manage my exports from Lightroom, especially if I want to add images or change the order of images.
    At the risk of providing too much detail, the reason I rename images on export is this: In Lightroom, my images are named with my client's last name and a sequence number. When I export them for use on my website, I want them to have a different name, primarily to protect my clients' privacy, but also to allow me to organize them easily in my slideshows. So for example Jones-103.dng gets exported as Checkerbox-Wedding-Photography-[nnn].jpg, where [nnn] is a new sequence number.
    Can anyone tell me a better way to manage this workflow?

    publish/subscribe, right?
    lots of subscribers, big messages == lots of network traffic.
    it's a wide open question, no?
    %

  • Best Practices for Using JSF with AJAX - BluePrints OR Ajax4Jsf ?

    I am a newbie to AJAX4JSF. I think it provides Rapid Application Development (RAD) just by using tags like a4j:, without the need to develop complex JSF custom components as shown in the BluePrints Catalog
    https://bpcatalog.dev.java.net/ajax/jsf-ajax/
    I understand the purpose of developing JSF custom components as reusable pieces for use with AJAX, but it's complex and requires a lot of coding, i.e. PhaseListeners and managed beans. There should be an easier way to do this, especially since our project needs a RAD tool like AJAX4JSF.
    Any suggestions will be highly appreciated
    Regards
    Bansi

    Bansi, you are trying to compare apples to oranges. The BluePrints catalog is a historical retrospective of how people thought about AJAXifying JSF in the past. Currently, the playground has moved to the jsf-extension project. Look for DynaFaces there.

  • Best Approach for Use Case

    Experts,
    I am creating a small POC for a search engine. I am thinking about the best way to achieve the scenario below.
    1) Assume that, like in Google, I am entering some data in a text box. I want the area below the text box to show matching records (leaving aside where the records are fetched from - DB or flat files) based on the user's input, and to change as and when new characters are entered or deleted.
    I guess contextual events combined with regions will help me here, but I need to know if I am thinking in the right direction.
    2) My other page has a registration form that logically contains 3 sections, say Emp, Dept, and Country. All the EOs have validations defined on them. Now the user can enter only Emp details and proceed, or only Dept details and proceed. In such a case, how do I skip the other sections' validations?
    I guess subforms can help me with this.
    Please advise.
    Jdev 11.1.1.4 and beyond.

    Hi,
    You did not get the article. It explains the wizard-type interface with Next functionality, where validation is not fired at first but only at commit time.
    "When I commit, how do I make sure that only transaction-level validation is fired for the tab/section of one entity only?" That article doesn't meet your use case. SkipValidation will only skip validation when you need to navigate (along with immediate="false", the default), but it won't allow you to commit unless and until your EO validates.
    "I guess SkipValidation set to Custom would help me." Wrong.
    "Let's say I have a page with 3 tabs, or a single page with 3 sections, each showing data from a different EO. Each EO has validations defined on it." In that case you need to suppress the validation of that EO using a flag workaround, as I explained in my previous post: just set the flag to 'DRAFT' to suppress the validation of that EO before commit. Using it in an entity-level script validator would be better.
    Regards,
    Edited by: Santosh Vaza on Jun 29, 2012 10:51 AM

  • Help! Best setup for using A3 with a Drobo?

    A little over a month ago, I decided to get "serious" with my photography hobby and plunked down several hundred bucks for a Drobo (4-bay, FireWire800), a couple 1TB drives to go in it, and a copy of Aperture 3.
    It took a couple tries (and a couple days) to import my large iPhoto library and get everything up and running. I can see that Aperture has the potential to be a VERY useful tool... its feature set seems like exactly the type of photo manager I need/want.
    Unfortunately, Aperture has been very frustrating to use so far. It's just not very responsive... the "lag" is killing me! Sometimes it's EXTREMELY slow, sometimes it's kinda slow, but it's never fast. There's always a delay/lag any time I make an adjustment/edit, skip to the next photo, or import, and ESPECIALLY when I am deleting photos. Just deleting one 8MB JPEG can take anywhere from 10 seconds to 10 minutes.
    I'm wondering if the slow performance issue is because my library and masters are stored on the Drobo (which uses a RAID-like storage system spread across multiple drives). Or do I just not have my preferences set up right? Or is there a corrupt file slowing things down? I don't know....
    Anybody out there successfully using Aperture 3 in conjunction with a DROBO? If so, tell me your secret! How do you have your library/masters set up?
    I'd appreciate any suggestions and advice!
    Here are my specs:
    - Aperture version 3.0.2
    - Drobo connected via FW800, appx 300GB used, 650GB avail
    - Model Name: iMac
    - Model Identifier: iMac8,1
    - Processor Name: Intel Core 2 Duo
    - Processor Speed: 3.06 GHz
    - Number Of Processors: 1
    - Total Number Of Cores: 2
    - L2 Cache: 6 MB
    - Memory: 4 GB
    - Dual monitor setup
    - Besides the Drobo, another 300GB external is attached via FW
    Thanks

    First let me say I'm a little on the paranoid side when it comes to losing my work, so my backup process will be overkill for most.
    I put all my images and my Aperture library on a large internal drive. All my images are referenced. I also put one Aperture vault on the same drive and another vault elsewhere. I allow Time Machine to back up to a partition on the Drobo (2nd gen) via FW800. I also have an external eSATA swappable bay that I use for full image backups of the Photo HD using SuperDuper. I keep one swappable image of each critical HD in fireproof storage and rotate them periodically. I also have a DVD archive of all my images that is kept in a different location.
    Having all the images online in the same box makes access fast.
    Backing up to the Drobo is not real fast, but I manually start Time Machine and go do something else. The backup on the Drobo is primarily so I can go back in time and locate an image I deleted without realizing I still needed it. The speed of the Drobo is not a problem in this situation.
    The Superduper backup is very fast as it only updates the new or updated files.
    When I started I had 3 and later 4 500GB drives in the Drobo. I recently swapped them for 1.5TB drives. When I pulled a 500GB drive and put in a 1.5TB, the Drobo took about 4 days to rebuild the data. The second and third swaps took just as long. The fourth took only a few minutes. Between each swap I checked, and the Drobo (as advertised) recovered all the data.
    If you can't keep all your images on an internal HD, you might consider keeping your Aperture library and recent images on the internal drive and use Aperture to migrate them to the Drobo as they age. You could also keep one of your vaults on the Drobo. This approach would require you to use referenced images.

  • Best Approach for using Data Pump

    Hi,
    I configured a new database and set it up with schemas that I imported from another production database. Now, before this database becomes the new production database, I need to re-import the schemas so that the data is up to date.
    Is there a way to use Data Pump so that I don't have to drop all the schemas first? Can I just export the schemas and somehow just overwrite what's in there already?
    Thanks,
    Nora

    Hi, you can use the NETWORK_LINK parameter to import data directly from the remote database.
    http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_import.htm#i1007380
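    A minimal sketch of such an import, run from the shell against the target database (the link name prod_link, the schema list, and the connect string are placeholders for your environment):
        impdp system@newdb \
            NETWORK_LINK=prod_link \
            SCHEMAS=hr,sales \
            TABLE_EXISTS_ACTION=REPLACE \
            NOLOGFILE=Y
    TABLE_EXISTS_ACTION=REPLACE makes Data Pump drop and re-create tables that already exist in the target, so you don't have to drop the schemas first (objects that exist only in the target are left untouched). With NETWORK_LINK the rows are pulled straight across the database link, so no dump file is needed.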
    Regards.

  • Which Framework is Best Suited for Using AJAX with Java?

    Also, what are the differences between AJAX and JavaScript, other than XMLHttpRequest objects? I would be glad for any help or guidance given.


  • Best approach for uploading document using custom web part-Client OM or REST API

    Hi,
    I am using a custom upload visual web part for uploading documents into my document library with a lot of metadata.
    These columns include single line of text, drop-down list, lookup, and managed metadata (taxonomy) columns.
    So I would like to know which is the best approach for uploading.
    Currently I am trying to use the traditional SSOM (server object model). I would like to know which is the best approach for uploading files into document libraries.
    I have hundreds of sub-sites with 30+ document libraries within those sub-sites. Currently it takes a few minutes to upload the files in my dev environment; I'm just wondering what will happen when the number of sub-sites reaches a hundred!
    I am looking at this from a performance perspective.
    My thought process is:
    1) Implement Client OM
    2) REST API
    Has anyone tried these approaches before, and which approach provides better performance?
    If anyone has sample source code or links, please share them.
    Also, are there any restrictions on the size of the file uploaded?
    Any suggestions are appreciated!

    Try below:
    http://blogs.msdn.com/b/sridhara/archive/2010/03/12/uploading-files-using-client-object-model-in-sharepoint-2010.aspx
    http://stackoverflow.com/questions/9847935/upload-a-document-to-a-sharepoint-list-from-client-side-object-model
    http://www.codeproject.com/Articles/103503/How-to-upload-download-a-document-in-SharePoint
    // Requires: using Microsoft.SharePoint.Client; (CSOM assemblies Microsoft.SharePoint.Client.dll and .Runtime.dll)
    public void UploadDocument(string siteURL, string documentListName,
        string documentListURL, string documentName, byte[] documentStream)
    {
        using (ClientContext clientContext = new ClientContext(siteURL))
        {
            // Get the document library by title
            List documentsList = clientContext.Web.Lists.GetByTitle(documentListName);
            var fileCreationInformation = new FileCreationInformation();
            // Assign the file content from the byte[] documentStream
            fileCreationInformation.Content = documentStream;
            // Allow overwrite of an existing document
            fileCreationInformation.Overwrite = true;
            // Upload URL
            fileCreationInformation.Url = siteURL + documentListURL + documentName;
            Microsoft.SharePoint.Client.File uploadFile =
                documentsList.RootFolder.Files.Add(fileCreationInformation);
            // Update the metadata for a field named "DocType"
            uploadFile.ListItemAllFields["DocType"] = "Favourites";
            uploadFile.ListItemAllFields.Update();
            clientContext.ExecuteQuery();
        }
    }
    If this helped you resolve your issue, please mark it Answered

  • What is the best practice for using the Calendar control with the Dispatcher?

    It seems as if the Dispatcher is restricting access to the Query Builder (/bin/querybuilder.json) as a best practice regarding security.  However, the Calendar relies on this endpoint to build the events for the calendar.  On Author / Publish this works fine but once we place the Dispatcher in front, the Calendar no longer works.  We've noticed the same behavior on the Geometrixx site.
    What is the best practice for using the Calendar control with Dispatcher?
    Thanks in advance.
    Scott
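    For what it's worth, the usual fix is an explicit allow rule for the Query Builder endpoint in the /filter section of dispatcher.any. The rule below is only a sketch (the rule number /0150 is a placeholder, and older Dispatcher versions use /glob rather than /url matching); keep in mind that exposing /bin/querybuilder.json through the Dispatcher widens the attack surface, which is exactly why it is blocked in the recommended configuration, so consider restricting or caching those requests as well:
        /filter
          {
          # ... existing deny/allow rules ...
          /0150 { /type "allow" /url "/bin/querybuilder.json*" }
          }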

    Not sure what exactly you are asking but Muse handles the different orientations nicely without having to do anything.
    Example: http://www.cariboowoodshop.com/wood-shop.html

  • HT1229 What is the best method for using iPhoto with an external hard drive for greater capacity?

    What is the best method for using iPhoto with an external hard drive for greater capacity?

    Moving the iPhoto library is safe and simple: quit iPhoto and drag the iPhoto library, intact as a single entity, to the external drive. Then hold down the Option key and launch iPhoto, using the "Select Library" option to point to the new location on the external drive. Fully test it, and then trash the old library on the internal drive (test one more time prior to emptying the trash).
    Also be sure that the external drive is formatted Mac OS Extended (Journaled) - iPhoto does not work with drives in other formats - and that it is always available prior to launching iPhoto.
    And back up soon and often: having your iPhoto library on an external drive is not a backup, and if you are using Time Machine you need to check that TM is backing up your external drive.
    LN

  • What's the best approach for handling about 1300 connections in Oracle?

    What's the best approach for handling about 1300 connections in Oracle 9i/10g through a Java application?
    1. Using separate schemas for the various types of users. (We can store only the relevant data in a particular schema, so the number of records per table can be reduced by replicating tables, but we would have to maintain all the data in another schema as well. We would then need to update two schemas in a given session, because we would maintain a separate schema for each user and another schema for all the data, and that may cause update problems.)
    OR
    2. Using a single schema for all users.
    Note: all users may access the same tables, and there may be many more records than in the previous case.
    Which is the better option?
    Please share your valuable ideas.

    It is true, but "I want a solution from you all" is like asking me to tell you how to fix my friend's car.

  • What are the best approaches for mapping re-start in OWB?

    What are the best approaches for mapping re-start in OWB?
    We are using OWB repository 10.2.0.1.0 and OWB client 10.2.0.1.31. The Oracle version is 10 G (10.2.0.3.0). OWB is installed on Linux.
    We have a number of mappings. We built process flows for the mappings as well.
    I would like to know the best approaches to incorporate restart options in our process, i.e. handling the failure of a mapping in a process flow.
    How do we recycle failed rows?
    Are there any built-in features/best approaches in OWB to implement the above?
    Do the runtime audit tables help us to build a restart process?
    If not, do we need to maintain our own (custom) tables to hold such data?
    How have our forum members handled the above situations?
    Any idea ?
    Thanks in advance.
    RI

    Hi RI,
    "How many mappings (range) do you have in a process flow?" Several hundred (100-300 mappings).
    "If we have three mappings (e.g. m1, m2, m3) in a process flow, what will happen if m2 fails?" Suppose the mappings are connected sequentially (m1 -> m2 -> m3). When m2 fails, the process flow is suspended (the transition to m3 will not be performed). You should remove the cause of the error (modify the mapping and redeploy, correct the data, etc.) and then repeat the m2 mapping execution from the Workflow monitor: open the diagram with the process flow, select mapping m2, click the Expedite button, and choose the Repeat option.
    "On restart, will it run m1 again and then m2 and so on, or will it restart at row 1 of m2?" You can specify the restart point. "At row 1 of m2" - I don't understand what you mean; all mappings run in set-based mode, so in case of an error all table updates are rolled back (there are several exceptions - for example, multiple target tables in a mapping without correlated commit, or an error in a post-mapping process - so you must carefully analyze the results of an error).
    "What will happen if m3 fails?" The process is suspended and you can restart execution from m3.
    By running without failover and with the maximum number of errors set to 0, you reduce the failed rows to be recycled to zero (0). These settings guarantee only two possible results for a mapping - SUCCESS or ERROR.
    "What is the impact if we have a large volume of data?" In my opinion, for large volumes set-based mode is the preferred data processing mode.
    With this mode you have the full range of enterprise features of the Oracle database - parallel query, parallel DML, nologging, etc.
    Oleg

  • What's Best Approach for Multitrack Classical Music?

    Can someone suggest the best approach for recording classical musicians onto
    four tracks? In this scenario, they play until they make a mistake on, say,
    measure 24, stop, then (take 2) go back to measure 20 and play until the next
    rough spot, and so on. Ultimately there may be 15 takes that all need to be
    trimmed and stitched together.
    In the old (tape) days, this was pretty basic editing. I would use a blade and block
    to cut out all the bad stuff on the multitrack tape, then I could mix. But how do I
    do this in Audition? (I use version 1.5.)
    I can't do the cuts in Edit View because the tracks would get out of sync.
    Assuming all the takes are in one session, in multitrack view, this most basic of
    functions seems to elude me. What am I missing?

    Al the Drifter wrote:
    If you follow Steve's advice, and after doing the edits you discover
    that one instrument should come up 1db, you are screwed.
    I could be wrong about this in the classical music environment,
    where things are not close-mic'ed but if I am, I am confident Steve
    will correct me.  Ha.
    You always run the risk of small changes between takes - and that's where Audition 3 and the new improved crossfades score rather heavily. You won't notice 1dB on a single instrument across a fade though - it's hard to spot this as a jump, even, unless it's on pure tone. No, I very rarely close-mic stuff at all, although I did with a clavichord recently - it's seriously too quiet to mic any other way.
    jaypea500 wrote:
     when recording classical music, any engineer worth anything has the mix down pat as it's being recorded. 
    That's the way they used to work, certainly - but not nowadays, especially if it's done on location, which most classical recording is. What's more likely to happen is that you'd use decent mic preamps feeding straight into a multitrack, or even some software on a laptop. I generally record like that - but I also feed the multitrack outputs to a Yamaha mixer via ADAT, do a mix on that and record it back to a spare multitrack pair. I don't actually need to do that - but having a mix available from the multitrack that's pretty much there is good as far as being able to play back takes to conductors is concerned.
    Of course, one of the other reasons that classical sessions recorded on location aren't mixed on the spot is that the monitoring conditions are invariably far from ideal, and I'd have it that no engineer worth anything would ever risk a final mix done on location.
    But I only get paid to do all of this on a regular basis, so what would I know? Must be something though - my customers come back for more...

  • Best approach for building dialogs based on Java Beans

    I have a large number of Java Beans, each with several properties. These represent all the "data" in our system. We will now build a new GUI for the system, and I intend to reuse the beans as far as possible. My idea is to automatically generate the configuration dialogs for each bean using the java.beans package.
    What is the best approach for achieving this? Should I use PropertyEditors, or should I make my own dialog generator using the Introspector class, or are there any other suitable solutions?
    All suggestions and tips are very welcome.
    Thanks!
    Erik

    Definitely, it is better for you to use JTable. Why not try it?
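    To make the Introspector idea concrete, here is a minimal sketch (BeanFormBuilder and buildForm are made-up names for illustration) that walks a bean's properties and lays out a label/text-field pair for each read-write property. A real dialog generator would still need type-specific editors (PropertyEditors, checkboxes for booleans, etc.) and code to write the edited values back to the bean.
    import java.awt.GridLayout;
    import java.beans.BeanInfo;
    import java.beans.Introspector;
    import java.beans.PropertyDescriptor;
    import javax.swing.JLabel;
    import javax.swing.JPanel;
    import javax.swing.JTextField;

    public class BeanFormBuilder {
        // Builds a two-column form with one row per readable and writable property.
        public static JPanel buildForm(Object bean) throws Exception {
            JPanel panel = new JPanel(new GridLayout(0, 2, 4, 4));
            BeanInfo info = Introspector.getBeanInfo(bean.getClass(), Object.class);
            for (PropertyDescriptor pd : info.getPropertyDescriptors()) {
                if (pd.getReadMethod() == null || pd.getWriteMethod() == null) {
                    continue; // skip read-only or write-only properties
                }
                panel.add(new JLabel(pd.getDisplayName()));
                Object value = pd.getReadMethod().invoke(bean);
                panel.add(new JTextField(value == null ? "" : value.toString()));
            }
            return panel;
        }
    }
    Dropping such a panel into a JDialog (or, as suggested above, into rows of a JTable) gives you one generated editor per bean class without hand-writing each dialog.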
