Best way to handle mobile accounts with large iTunes/photo libraries

What is the best way to handle mobile accounts without syncing the iTunes/photo libraries?
I have a Time Capsule, so I can move the iTunes and photo libraries for each user if need be.
Thanks!

Hi,
I've done a great deal of work with mobile accounts in Snow Leopard and I'm now having a "play" with Lion. To be honest, you have to sit down and think about why you need mobile accounts.
If your user only uses one computer, then you're safer having a local account backed up by a network Time Machine; this avoids the many woes that the server's FileSyncAgent brings to the table.
If your users are going to be accessing multiple computers on the network and leaving the network, then a mobile account is good for providing a uniform user experience and access to files. However, your users will have to choose whether they want their iPhoto libraries on one local machine (backed up by Time Machine) or hosted on the server and left out of the mobile home sync schedule (by adding ~/Pictures to the excluded items in the home sync settings).
With the latter, users will be able to access their iPhoto libraries from any computer while they are within the network (as the library is accessed from the user's server home folder).
With the first option, the user would have their iPhoto library on one computer (say, the laptop they use the most) but would not be able to access it from other computers they log on to.
iPhoto libraries are a pain, and I'm working hard to come up with a workaround. If your users moved over to Aperture, you could include the Aperture library as part of the home sync, thanks to Deepport (http://deepport.net/archives/os-x-portable-home-directories-and-syncing-flaw-with-bundles/)
He does suggest that the same would work with iPhoto libraries, but it doesn't, for a number of mysterious reasons regarding how the OS recognizes the iPhoto bundle (it does so differently compared to Aperture).
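For what it's worth, the exclusion list can also be set outside the Accounts preference pane by editing the com.apple.homeSync preferences, which are an ordinary plist. The sketch below uses Python's plistlib; the "ExcludedItems" key name and the plist location are assumptions on my part, so verify them against your OS version before touching a live account.

```python
# Sketch: add ~/Pictures to a home-sync exclusion list by editing a
# com.apple.homeSync-style preference plist directly. NOTE: the
# "ExcludedItems" key name is an assumption -- verify it on your system.
import plistlib
from pathlib import Path

def add_sync_exclusion(plist_path, item="~/Pictures"):
    path = Path(plist_path)
    if path.exists():
        with open(path, "rb") as f:
            prefs = plistlib.load(f)
    else:
        prefs = {}
    excluded = prefs.setdefault("ExcludedItems", [])
    if item not in excluded:          # keep the list free of duplicates
        excluded.append(item)
    with open(path, "wb") as f:
        plistlib.dump(prefs, f)
    return prefs

# Write to a scratch copy rather than the live preferences:
prefs = add_sync_exclusion("/tmp/com.apple.homeSync.plist")
```

Running it twice is safe; the item is only appended once.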
Hope this helps...

Similar Messages

  • (workflow question) - What is the best way to handle audio in a large Premiere project?

    Hey all,
    This probably applies to any version of Premiere, but just in case: I use CS4 (Master Collection).
    I am wrestling in my brain about the best way to handle audio in my project to cut down on the time I am working on it.
    This project I just finished was a 10 minute video for a customer shot on miniDV (HVX-200) cut down from 3 hours of tape.
    I edited my whole project down to what looked good, and then I decided I needed to clean up all the audio using Soundbooth. So I had to go in clip by clip, using the Edit in Soundbooth --> Render and Replace method on every clip. I couldn't find a way to batch edit any audio in Soundbooth.
    For every clip, I performed similar actions---
    1) both tracks of audio were recorded with 2 different microphones (2 mono tracks), so I needed only audio from 1 track - I used SB to cut and paste the good track over the other track.
    2) amplified the audio
    3) cleaned up the background noise with the noise filter
    I am sure there has to be a better workflow option than what I just did (going clip by clip). Can someone give me some advice on how best to handle audio in a situation like this?
    Should I have just rendered out new audio for the whole tape I was using, and then edit from that?
    Should I have rendered out the audio after I edited the clips into one long track and performed the actions I needed on it? Or something entirely different? It was a very slow, tedious process.
    Thanks,
    Aza

    Hi, Aza.
    Given that my background is audio and I'm just coming into the brave new world of visual bits and bytes, I would second Hunt's recommendation regarding exporting the entire video's audio as one wav file, working on it, and then reimporting. I do this as one of the last stages, when I know I have the editing done, with an ear towards consistency from beginning to end.
    One of the benefits of this approach is that you can manage all audio in the same context. For example, if you want to normalize, compress or limit your audio, doing it a clip at a time will make it difficult for you to match levels consistently or find a compression setting that works smoothly across the board. It's likely that there will instead be subtle or obvious differences between each clip you worked on.
    When all your audio is in one file you can, for instance, look at the entire waveform, see that limiting to -6 dB would trim off most of the unnecessary peaks, trim it down, and then normalize it all. You may still have to do some tweaking here and there, but it gets you much farther down the road, much more easily. The same goes for reverb, EQ, or other effects where you want the same feel throughout the entire video.
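    To make the level-matching argument concrete, here's a toy sketch in plain Python (not a Soundbooth or Premiere API; the sample values are made up) of why limiting to a ceiling and then normalizing works best on the whole file at once:

```python
# Toy illustration: hard-limit peaks to a ceiling, then normalize the
# whole file with ONE uniform gain, so relative levels between sections
# stay consistent -- the point of working on a single exported file.
def db_to_amp(db):
    return 10 ** (db / 20.0)

def limit_then_normalize(samples, ceiling_db=-6.0, target_db=0.0):
    ceiling = db_to_amp(ceiling_db)
    limited = [max(-ceiling, min(ceiling, s)) for s in samples]
    peak = max(abs(s) for s in limited) or 1.0   # guard against silence
    gain = db_to_amp(target_db) / peak           # one gain for everything
    return [s * gain for s in limited]

out = limit_then_normalize([0.1, -0.9, 0.4])
```

    A real limiter uses lookahead and soft knees rather than a hard clip; this only shows the bookkeeping.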
    Hope this helps,
    Chris

  • Best way to handle two tracks with subtitles?

    Greetings All,
    I'm starting a new thread from a previous discussion on mixed 16:9 and 4:3 footage since I've hit a new, somewhat separate issue. Mixing footage requires the use of two tracks so that proper display modes can be set for each aspect ratio. However, I will ultimately have two subtitle tracks as well on this project. I'm not too concerned about switching between the two subtitle streams; that seems straightforward enough. But I am at a loss as to how to allow two tracks to use the same subtitle stream.
    This is not like a multi-angle situation where two tracks are actually referencing the exact same subtitles. The subtitles for the 16:9 clips will be different from the subtitles for the 4:3 clips. If I were using one track, I would be all set, as I could just lay the clips out one after the other and then drop the subtitles in sequentially as well.
    But with two tracks, clips on each track occupy the same timecode at the same time, so I'm unsure how to program things so that the proper subtitles play at all times. Am I making sense?
    Just to simplify things: if I have one 16:9 clip on track 1 and a 4:3 clip on track 2 and they each have associated English subtitles that must reside in stream 1 (and also Spanish subs in stream 2), as far as I can tell they overlap timecode which creates a problem for subtitles.
    Do I need to use scripting? Any pointers in the right direction would be greatly appreciated!
    Thanks in advance!
    Todd Nocera

    Todd, are we to understand that you have the subtitle files already? That is, files complete with timecode?
    If your question is about how to make sure that, once the user has selected English subtitles prior to (or during) playback of Track A, Track B also plays English subtitles, that's part of the DVD video spec. Subtitles are a system stream so once a stream has been activated, that stream should continue to play for all tracks. Unless there is something odd going on.
    But, again, please detail what exactly the issue is.

  • OIM 9031: Best way to handle application/test accounts on the target system.

    Hi Guys,
    I am wondering what the best way is to handle application accounts created in the target system. I.e., my target system is Active Directory, and on non-trusted reconciliation I also fetch application/test accounts which are not going to match any existing user, but which should be captured for reporting or any future action.
    Any input or idea is most welcome !!
    Cheers,
    Ankit

    There are basically two approaches to handle service accounts.
    Either you model them as a free-standing RO, very similar to a normal AD account, or you use the built-in "service account" and associate the account with an already existing AD RO instance. I haven't used the "service account" approach in any customer project yet, so I can't really comment on the details of that approach (hopefully someone else will be able to do that).
    Are you sure that you have service accounts in AD that you can't attribute to a specific user? Most organizations require service accounts to be linked to a user or a group of users so that the need for the account to continue existing can be verified by a human. Having live accounts in your AD that no one can say what they do or why they exist is normally a very scary thought for most organizations.
    Hope this helps
    /M

  • Best way to handle medical collections?

    Hello all, My husband and I are new to the "rebuilding your credit" world. We're coming back from a foreclosure and actively trying to rebuild. I have 4 active collections for 3 creditors listed on my EQ/TU report, totaling $399. I can pay them in full right now, but I don't want to call the creditors to pay them if it's not going to be removed from my reports properly. Some of these are old (2011). I also have 3 collections "removed" from my TU/EQ report since March; will paying these affect what is listed on my reports?
    What is the best way to handle these accounts to ensure they are removed from my reports completely?    

    Contact each creditor and politely request a Pay for Deletion: an agreement that they will delete the entry from your reports in exchange for full payment. Get this in writing. If that doesn't work (try a few times), pay it, then try the goodwill route: write letters requesting early deletion of the entries from your reports. Even though negative items will fall off your report in 7 years (so the 2011 collection will fall off in 2018), you will still have to report that you have unpaid debt on future mortgage applications, etc. And on a manual review, paid collections look better than unpaid ones. I would not advise leaving them unpaid.

  • What is the best way to handle very large images in Captivate?

    I am just not sure the best way to handle very large electrical drawings.
    Any suggestions?
    Thanks
    Tricia

    Is converting the colorspace asking for trouble?  Very possibly!  If you were to do that to a PDF that was going to be used in the print industry, they'd shoot you!  On the other hand, if the PDF was going online or on a mobile device – they might not care.   And if the PDF complies with one of the ISO subset standards, such as PDF/X or PDF/A, then you have other rules in play.  In general, such things are a user preference/setting/choice.
    On the larger question – there are MANY MANY ways to approach PDF optimization.  Compression of image data is just one of them.   And then within that single category, as you can see, there are various approaches to the problem.  If you extend your investigation to other tools such as PDF Enhancer, you'd see even other ways to do this as well.
    As with the first comment, there is no "always right" answer. It's entirely dependent on the user's use case for the PDF, the requirements of any additional standards, and the user's needs.

  • What is the best way to reconcile GL account postings with payroll results?

    Hi experts
    What is the best way to reconcile GL account postings with payroll results?
    Is it running the Wage Type Reporter?
    Or is there any other way? I have variances between WTR and GL postings for a few employees.
    Regards

    Hi,
    Match the Payroll Journal (PC00_M40_LJN - Payroll Journal) with the Wage Type Reporter (PC00_M99_CWTR - Wage Type Reporter).
    Finally, verify the posting amount (PCP0) against the Payroll Journal (PC00_M40_LJN - Payroll Journal).
    This should solve your issue...
    Regards,
    Veeram

  • Best way to handle large files in FCE HD and iDVD.

    Hi everyone,
    I have just finished working on a holiday movie that my octogenarian parents took. They presented me with about 100 minutes of raw footage that I have managed to edit down to 64 minutes. They have viewed the final version, which I recorded back to tape for them. They now want to know if I can put it onto a DVD for them as well. The problem is that the FCE HD file is 13 GB.
    So here is my question.
    What is the best way to handle this problem?
    I have spoken to a friend of mine who is a professional editor. She said to reduce the movie duration to about 15 minutes because it's probably too long and boring (rather hurtful, really). Anyway, that is out of the question as far as my oldies are concerned.
    I have seen info on Toast 8 that mentions a "Fit to DVD" process that purports to "squash" 9 GB of movie onto a 4.7 GB disc. I can't find out whether it will also put 13 GB onto a dual-layer 8.5 GB disc.
    Do I have to split the movie into two parts and make two dual-layer DVDs? If so, I have to ask: how come "Titanic", at 3+ hours, fits on one disc?
    Have I asked too many questions?

    Take a deep breath. Relax. All is fine.
    iDVD does not look at the size of your video file; it looks at the length. iDVD can accommodate up to 2 hours of movie.
    iDVD gives you different options depending on the length of your movie. Although I won't agree with your friend about reducing the length of your movie to 15 minutes, if you could trim out a few minutes to get it under an hour, the Best Performance setting in iDVD (though the new version may have renamed it) gives you the best quality. Still, any iDVD setting will give you good quality, even at 64 minutes.
    In FCE, export as QuickTime Movie, NOT any flavour of QuickTime Conversion. Select chapter markers if you have them. If everything is on one system, uncheck the Make Movie Self-Contained button. Drop the QT file into iDVD.

  • Best practice converting local laptop accounts to Mobile Accounts with PHD

    Hi,
    What is the best practice to convert local laptop users (with different UIDs than their network accounts) to mobile accounts? Especially when the local directory should not be synced in whole (just Documents and Library). Client and server are 10.5; network accounts are on NFS.
    I tried creating the mobile account with a minimal network directory (Library etc.) and then moving the original folders into place, but this didn't work out (the sync info was overwritten somewhere...).
    Christian

    I think your best bet is to copy the home folder off the laptop to the user share on the server. Then, with WGM, create the same user and apply all permissions of the network user to the copied folder.
    Once you have that, create your settings for the PHD and then go to the laptop. There you will set up the laptop and bind it to the directory, then have that user log in (you might want to do this on a LAN, not AirPort), and it will move all the data across to that laptop. Since the network user (the same as the local one) owns that folder, everything should work. If the password is the same, then OS X should fix the login and keychain passwords, so saved forms or email passwords would show up.
    I did this same thing for 20 OS 10.4 client laptops. It took me a while to get all of this in place, but it will spare you the running around...
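    A rough sketch of that first copy step, with Python's shutil standing in for whatever you'd actually use (ditto, rsync); the paths and username here are hypothetical, and ownership still has to be fixed on the server afterwards:

```python
# Sketch: stage a local home folder onto the server share under the
# network user's name. Paths and username are placeholders; ownership
# must still be corrected afterwards (e.g. chown -R on the server) so
# the network UID owns every file.
import shutil
from pathlib import Path

def stage_home(local_home, server_share, username):
    dest = Path(server_share) / username
    shutil.copytree(local_home, dest, symlinks=True)  # preserve symlinks
    return dest

# Hypothetical invocation:
# stage_home("/Users/jdoe", "/Volumes/UserShare", "jdoe")
```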
    hope that helps

  • Best way of handling large amounts of data movement

    Hi
    I'd like to know the best way to handle data in the following scenario:
    1. We have to create Medical and Rx claims tables for 36 months of data, about 150 million records each (months 1, 2, 3, 4, ..., 34, 35, 36).
    2. We have to add the delta of month 2 to the 36-month baseline. But the application requirement is ONLY 36 months, even though the current size is then 37 months.
    3. Similarly, in the 3rd month we will have 38 months, and in the 4th month, 39 months.
    4. At the end of the 4th month, how can I delete the first three months of data from the claim files without affecting the performance of what is a 24x7 online system?
    5. Is there a way to create partitions of 3 months each that can then be deleted (delete partition number 1)? If this is possible, what kind of maintenance activity needs to be done after deleting a partition?
    6. Is there any better way of doing the above scenario? What other options do I have?
    7. My goal is to eliminate the initial months' data from the system, as the requirement is ONLY 36 months of data.
    Thanks in advance for your suggestion
    sekhar

    Hi,
    You should use table partitioning to keep your data in monthly partitions. Search on table partitioning for detailed examples.
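    For the rolling-window part of the question (points 4 and 7), the bookkeeping is simple once partitions are monthly. A small Python sketch that names which month-partitions fall outside the trailing 36-month window; the actual DROP PARTITION DDL depends on your database:

```python
# Given the current month number and a trailing window, return the
# month-numbered partitions that can be dropped. Months are numbered
# 1, 2, 3, ... as in the scenario above.
def months_to_drop(current, window=36, oldest=1):
    cutoff = current - window + 1      # first month that must be kept
    return list(range(oldest, cutoff)) if cutoff > oldest else []

# At the end of the 4th month (month 39 loaded, 36 required),
# months 1-3 fall outside the window:
drop = months_to_drop(39)
```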
    Regards

  • Best way to handle session timeout

    Hello All,
    oracle 11g, Apex ver 3.1.2
    I am a bit confused about the session handling mechanism for users.
    Which is the best way to handle sessions for users: programmatically, or at the DBA admin level?
    What are the pros and cons of going the DBA level versus the programmatic level?
    I'd like to have some information on hand beforehand, for justification.
    thanks/kumar


  • Best way to handle tcMultipleMatchFoundException

    Can anyone tell me the best way to handle tcMultipleMatchFoundException during reconciliation?
    One way which I know of is to manually correct the data. Apart from that, is there any other way?
    Thanks,
    Venkatesh.


  • Deleted items reappear on mobile account with syncing?

    On a mobile account with syncing, deleted items reappear after syncing. One can manually do a full sync, then delete stuff, then do a full manual sync again, and the deleted items are back.
    This is on a Leopard 10.5.6 client and Leopard 10.5.6 server.
    Open to ideas on what I ought to look at.
    Best Wishes,
    Paul

    Paul,
    I'm happy I'm not alone (sorry..)
    I have exactly the same problem, although I'm using Linux server, not OSX.
    It all worked nicely until 10.5.6 upgrade, after that I'm having lots of home sync problems, including:
    1. Locally deleted items reappear after sync.
    2. A lot of sync conflicts, especially when sync cannot resolve the latest file or directory version between the mobile and network copies (and the mobile copy will always be the latest one).
    3. Huge syncs even if no data has been modified, i.e.:
    I'm syncing everything on login and logout; background sync is disabled.
    I log in and then straight away log out, so practically no data has been modified, but the sync may show me tens of GB being transferred.
    Now, this is weird: I've done tests on a freshly created mobile account with approx. 50 MB of data. Basically, I logged in and out repeatedly, sometimes modifying small files. Some of the syncs showed a transfer of 60 MB! That's 10 MB more than the size of the home directory!
    I've looked through release notes for 10.5.6 and some sync issues were 'fixed'. I'm wondering if other ones were introduced...
    As I've said, it all worked perfectly until the latest update - I have many machines behaving in the same, bad way.
    Perhaps someone has a solution?
    Thanks,
    Pawel

  • Best way to fill a datagrid with A LOT of data?

    What's the best way to fill a datagrid with A LOT of data? I'm talking about something like 10,000 rows. Can the datagrid handle it? No screen is large enough to show 10,000 records... is there a way to load 50, and then when a user scrolls down it loads the next 50, etc.?

    Right. It's not recommended that you load 10,000 rows into the datagrid at once. Using the data management service with paging enabled is a much better solution. See the Configuring the Data Management Service chapter in the Flex 2 Developer's Guide for more info.
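    The paging idea is independent of Flex: the client requests one page at a time and the server returns a slice plus enough metadata to know when to fetch more. A generic sketch (this is not the Flex Data Management Service API, just the shape of the idea):

```python
# Generic server-side paging: return one slice of the full record set
# plus metadata the grid can use to request the next page on scroll.
def fetch_page(records, page, page_size=50):
    start = page * page_size
    rows = records[start:start + page_size]
    return {
        "rows": rows,
        "page": page,
        "total": len(records),
        "has_more": start + page_size < len(records),
    }

rows = list(range(10_000))          # stand-in for 10,000 datagrid rows
first = fetch_page(rows, page=0)    # first 50 rows, has_more is True
```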

  • Best way to handle duplicate headings stemming from linked TOC book?

    What's the best way to handle duplicate topic titles stemming from TOC books that contain links to a topic that you want to have appear in the body? The problem I've had for years now is that the TOC generates one heading, and the topic itself generates another. This results in duplicate headings in the printed output.
    I have a large ~2500-topic project I have to print every release, and to date we've been handling it with post-build Word macros, but that's not 100% effective, so we're looking to fix this issue on the source side of the fence. On some of our smaller projects, we've actually marked the heading in the topic itself with the Online CBT, and that seems to work. We're thinking of doing the same to our huge project unless there's a better way of handling this. Any suggestions?

    See the tip immediately above this link. http://www.grainge.org/pages/authoring/printing/rh9_printing.htm#wizard_page3
    The alternative is to remove the topic from the print layout so that it only generates by virtue of its link to the book.
    See www.grainge.org for RoboHelp and Authoring tips
    @petergrainge
