Multi-user, Multi-mac, single-storage best practices?

I wouldn't share the MacBook Pro, so my wife finally replaced her PC with a new iMac. We want to store our big music collection in one place (either on the iMac or on an external USB disk). Both machines presently use Wi-Fi connectivity through our older round AirPort Extreme, though I'd consider upgrading if AirPort Disk sharing would make this simple. We also presently use an AirPort Express to play music from the laptop to our home audio system and will continue to use the laptop for this.

Presently we each have one library for laptop/iGadgets. Ideally we could share the library files across machines (something akin to an NFS/Celerra mount in the Unix world) so that we don't have to add music more than once per person, and I could recover laptop disk space. Is it possible to point multiple machines at the same library xml/itl files, or at least sync them somehow (maybe .Mac) to both machines, and how would one configure that? My knowledge of Mac networking is very small, but I'm tech-savvy in the Win/Unix world.

Is the network latency prohibitively slow, particularly when pulling files over Wi-Fi from a remote disk and playing back remotely to the AirPort Express? We don't want playback to stop every 5 seconds to buffer. I welcome suggestions for the best way to proceed. Thanks in advance.

dead thread

Similar Messages

  • Windows 2012 R2 File Server Cluster Storage best practice

    Hi Team,
I am designing a solution for 1,700 VDI users. I will use a Microsoft Windows 2012 R2 file server cluster to host their profile data, using Group Policy for folder redirection.
I am looking for best practices to define storage disk size for user profile data. I am considering a single 30 TB disk to host the profile data, spread across two disk enclosures.
Please let me know if a single 30 TB disk can become a bottleneck for active user profile data.
I have writable SSD disks in the storage array with FC connectivity.
    Thanks
    Ravi

Check this TechEd session, the Windows Server 2012 VDI Deployment Guide (pages 8-9), and this article.
General considerations during volume size planning:
Consider how long it will take if you ever have to run chkdsk. Chkdsk has undergone significant improvements in 2012 R2, but it will still take a long time to run against a 30 TB volume. That's downtime.
Consider how volume size affects your RPO, RTO, DR, and SLA. It will take a long time to back up or restore a 30 TB volume.
Any operation on a 30 TB volume, such as a snapshot, will pose performance and additional disk-space challenges.
For these reasons many IT pros choose to keep volume size under 2 TB. In your case, you could use 15 x 2 TB volumes instead of a single 30 TB volume.
Sam Boutros, Senior Consultant, Software Logic, KOP, PA
http://superwidgets.wordpress.com

  • Seeking Mac OS X "Best Practices"

I am looking into a support role within a company that widely uses Apple gear. They seem a bit discombobulated (disorganized, inconsistent versions, unreliable backups, etc.).
Is there a concise best-practices document for Apple hardware and the various OS offerings? Web searches aren't providing what I'm looking for.
    Thanks up front!
    Mtn

    It probably depends on what you want to do to get the company’s computers organized. You might try looking at Apple’s manuals:
    http://support.apple.com/manuals/
    You can also search Apple’s Knowledge Base for known issues with solutions from Apple.
    If you have specific questions about how to update the Mac OS on the computers or making bootable backups, or advantages and limitations of specific applications like Time Machine, you can search this forum for a wealth of information. Look to the Top Users (in the sidebar to the right of this) for the most experienced and reliable solutions.
    The most general advice includes:
use Disk Utility from the install disk to Repair Disk and Repair Permissions;
or use a commercial disk repair application (DiskWarrior and TechTool Pro are most commonly recommended here);
use System Profiler to check hardware components (TechTool Pro also does this);
make sure there are no hard drive errors before updating the OS;
make regular, automatically scheduled bootable backups (commonly recommended here are the free Carbon Copy Cloner and SuperDuper, but there are other commercial backup programs that can be purchased);
set up Time Machine to do incremental backups.
    I’m sure there are many more, but that might get you started.

  • New mac - what is best practice for accounts?

I am about to get a new Mac (iMac G5), and would like it to work well (i.e., file transfer and backup to/from my existing PowerBook).
Is there a best practice for account setup? Should I use the same accounts between the two, or can I set up a new account on the new Mac?
Related to this: what will keychain sync give me? Does that only work with the same accounts on two Macs?
    thanks
    John

With Tiger there is a Migration Assistant that will move everything over from your PowerBook to your new iMac G5. All you need is a FireWire cable; when prompted during first startup, select Migration Assistant and connect the two computers. You will need to boot your PowerBook holding down the 'T' key (target disk mode) before you connect the two together. Good luck, Jack.

  • Servlet - xml data storage best practice

Hello - I am creating a webapp that is a combination of servlets and JSP. The app will access, store, and manipulate data in an XML file. I hope to deploy and distribute the webapp as a war file. I have been told that it is a bad idea to assume that the XML file, if included in a directory of the war file, will be writable, as the servlet spec does not guarantee that wars are "exploded" into real file space. For that matter, it does not guarantee that the file space is writable at all.
So, what is the best idea for the placement of this XML file? Should I have users create a home directory for the XML file to sit in, so it can be guaranteed to be writable? And, if so, how should I configure the webapp so that it will know where this file is kept?
    Any advice would be gratefully welcomed...
    Paul Phillips

Great question, but I need to take it a little further.
First of all, my advice is to use an independent home directory for the XML file that can be located via a properties file or the like.
This will make life easier when deploying to a server such as JBoss (with Catalina/Tomcat) which doesn't extract the war file into a directory. In that case you would need to access your XML file while it resides inside the war file. I haven't tried this (it sounds painful), but I suspect there may be security/access problems when trying to get a FileOutputStream on a file inside the war.
Anyway, I recommend the independent directory away from the hustle and bustle of the servers' directories. Having said that, I have a question in return: where do you put a newly created (on-the-fly) JSP that you want accessed via your webapp?
In Tomcat it's easy: just put it in the tomcat/webapps/myapp directory, but this can't be done for JBoss with integrated Tomcat (jboss-3.0.0RC1_tomcat-4.0.3).
Anyone got any ideas on that one?

  • Populating users and groups - design considerations/best practice

We are currently running a 4.5 Portal in production. We are doing requirements/design for the 5.0 upgrade.
We currently have a stored procedure that assigns users to the appropriate groups based on domain and role info from an ERP database, after they are imported and synched by the authentication source.
We need to migrate this functionality to the 5.0 portal. We are debating whether to provide it via a custom Profile Web Service. It was recommended during ADC and other presentations that we stay away from using the security/membership tables in the database directly and use the EDK/PRC instead.
Please advise on the best way to approach this issue (with details). We need to finalize the best approach ASAP.
    Thanks.
    Vanita

So the best way to do this is to write a custom Authentication Web Service. Database customizations can do much more damage, and the EDK/PRC/API are designed to prevent inconsistencies and problems.
Along those lines, they also make it really easy to rationalize data from multiple backend systems into the organization you'd like for your portal. For example, you could write a custom authentication source that connects to your NT domain to get all the users and groups, then connects to your ERP system and does the same work your stored procedure would do. It can then present this information to the portal in the way the portal expects, and let the portal maintain its own database and information store.
Another solution is to write an External Operation that encapsulates the logic in your stored procedure but uses the PRC/Server API to manipulate users and group memberships. I suggest you use the PRC interface, since the Server API may change in subtle ways from release to release and is not as well documented.
Either of these solutions would be easier to maintain in the long term than a database stored procedure.
    Hope this helps,
    -Akash

  • Runtime image storage best practice ?

    Hello,
I have a question regarding the best place to store images that will be loaded at runtime. I understand the concept of having an assets folder within the project and keeping certain images as part of the project itself, but what about storing images that are dynamic, in that they are not available at authoring time but are still loaded at runtime?
The specific implementation: I have an application that is configured by the user, and I want users to be able to assign their own images as icons on buttons (while still assigning a default icon in case the image they've assigned is not found or does not meet the size requirements, etc.). So where would be the best place to store images like this? There are a couple of other places in my project where I'll allow the user to place their own logos (such as a control bar area) or other graphics within the context of the UI, so the question is not specific to buttons and icons.
    I hope my question makes sense, but I can be more specific if need be. Thanks in advance for your time.

You could use the resource bundling mechanism for this, depending on how many users you will have, because this approach requires compiled resource modules to be loaded at runtime. For each custom set of assets for one of your users you would invoke the mxmlc compiler to build their own custom resource module, which you can then load at runtime to override all same-named resources already used in the application.
All you have to worry about is that your resource names match and the resource bundle names are equal too.
If you are interested, dig into the ResourceManager class and the resource bundling mechanism.

  • SQL2008R2 cluster w/ several Named Instances -- shared storage best practice?

Planning a four-node SQL 2008 R2 cluster with three named instances. Each named instance requires exclusive use of a drive letter on shared storage. Does the named instance need all its files (data, logs, tempdb) on that exclusive drive? Or can it use a drive shared by all 3 instances, e.g. U:\SQLBackup\<instance-name>\...?
    Thanks,
    Bob
     

You will need at least one drive for each instance, plus one for the cluster quorum (unless you go for a file share).
My recommendation would be:
Instance1
E:\SQLDataFiles
F:\SQLLogFiles
G:\SQLTempFiles
Instance2
H:\SQLDataFiles
I:\SQLLogFiles
J:\SQLTempFiles
And so on. If you are concerned that you might run out of drive letters, you could use a single drive letter per instance and then attach the 3 drives as mount points inside this drive. That way you will save 2 letters per instance.
As for just using one single drive per instance with all 3 kinds of files: don't go there. The performance gain of splitting them into 3 drives as laid out above is at least 50% in my experience. Remember also to format the SQL drives with an NTFS block size of 64K.
    Regards
    Rasmus Glibstrup, SQLGuy
    http://blog.sqlguy.dk

  • BPC 7M SP6 - best practice for multi server setup

    Experts,
    We are considering purchasing new hardware for our BPC 7M implementation. My question is what is the recommended or best practice setup for SQL and Analysis Services? Should they be on the same server or each on a dedicated server?
    The hardware we're looking at would have 4 dual core processors and 32 GB RAM in a x64 base. Would this adequately support both services?
    Our primary application cube is just under 2GB and appset database is about 12 GB. We have over 1400 users and a concurrency count of 250 users. We'll have 5 app/web servers to handle this concurrency.
    Please let me know if I am missing information to be able to answer this question.
    Thank you,
    Hitesh

I don't think there's really a preference on that point. As long as it's 64-bit, the servers scale well (CPU, RAM), so SQL and SSAS can be on the same server. But it is important to look beyond CPU and RAM and make sure there are no other bottlenecks, such as storage (best practice is to split the database files across several disks, and of course to have the logs on disks used only for the logs). The memory allocation in SQL and OLAP should also be adjusted so that each has enough memory at all times.
Another point to consider is high availability. Clustering is quite common on that tier, and you could consider having the active node for SQL on one server and the active node for OLAP (SSAS) on the other. It costs more in SQL licensing, but you get to fully utilize both servers, at the cost of degraded performance in the event of a failover.
    Bruno
    Edited by: Bruno Ranchy on Jul 3, 2010 9:13 AM

  • Webchannel b2b Accesed from CRM Java multi language best practice?

    Hi,
We are accessing Webchannel from the CRM Portal, and we have a requirement to access Webchannel b2b in the same language as the portal user.
Is there a best practice for this? I've seen it done by creating a URL iView with Spanish and English URLs and a parameter called language, and also with one iView for English and one for Spanish, pointing to a system for English and a system for Spanish respectively.
Is there another, or a recommended, way to do this?
Hope somebody else has solved this.
Thanks in advance!
    Kind Regards,
    Gerardo J

My concern would be how to pass the parameters (for language or country) from the Portal to the Webchannel b2b; otherwise you end up having a URL iView for each different locale. Can we pass such information programmatically from portal to Webchannel, say through HTTP headers or cookies? That would be seamless.
But to answer your question: yes, a URL ISA iView is pretty common if you have only a few languages. More importantly, if you want deep integration with the portal, you must use the ISA iView configuration provided by SAP; then this is the only way. See [Note 1021959 - Portal settings for ISA iViews|https://service.sap.com/sap/support/notes/1021959] for details of the available features.

  • Best Practices for user ORACLE

    Hello,
I have a few Linux servers with an ORACLE user.
All the DBAs in the team connect to and work on the servers as the ORACLE user; they don't have separate accounts.
I created an account for each DBA and would like them to use it.
The problem is that I don't want to lock the ORACLE account, since I need it for installations, upgrades, etc., but I also don't want the DBA team to connect and work as the ORACLE user.
What is the best practice for such a case?
    Thanks

To install databases you don't need access to the ORACLE account.
Also, installing a 'few databases every month' is fundamentally wrong, as your server will run out of resources; Oracle can host multiple schemas in one database.
"One reason for example is that we have many shell scripts owned by the ORACLE user, and only the ORACLE user has the privilege to execute them."
Database Control in 10g and higher makes such scripts obsolete. Also, as long as you don't grant write access to the dba group, there is nothing wrong with granting execute access.
You now have a hybrid situation: they are allowed to interactively screw up 'your' databases, yet they aren't allowed to run 'your' scripts.
Your security 'model' is in urgent need of revision!
    Sybrand Bakker
    Senior Oracle DBA

  • Best practice for sudo

    Hi
I am trying to install SAP as a sudo user.
What is the best practice for setting up sudo?
I am getting this error when I run sapinst via sudo:
    Output of /usr/sap/SPD/SYS/exe/run/brconnect -u / -c -o summary -f stats -o SAPIAL -t all -m +I -s P10 -f allsel,collect,method,precision,space,keep -p 2 is written to the logfile brconnect.log.
    WARNING    2011-03-10 10:04:01.192
               CJSlibModule::writeWarning_impl()
    Execution of the command "/usr/sap/SPD/SYS/exe/run/brconnect -u / -c -o summary -f stats -o SAPIAL -t all -m +I -s P10 -f allsel,collect,method,precision,space,keep -p 2" finished with return code 3. Output:
    BR0801I BRCONNECT 7.00 (32)
    BR0805I Start of BRCONNECT processing: cefkfypa.sta 2011-03-10 10.03.58
    BR0484I BRCONNECT log file: /oracle/SPD/sapcheck/cefkfypa.sta
    BR0280I BRCONNECT time stamp: 2011-03-10 10.04.01
    BR0301E SQL error -1017 at location db_connect-2, SQL statement:
    'CONNECT /'
    ORA-01017: invalid username/password; logon denied
    BR0310E Connect to database instance SPD failed
    BR0806I End of BRCONNECT processing: cefkfypa.sta 2011-03-10 10.04.01
    BR0280I BRCONNECT time stamp: 2011-03-10 10.04.01
    BR0804I BRCONNECT terminated with errors

    Hello,
    I'd ask in the Windows forum on Microsoft Community.
    Karl
    My Blog:http://unlockpowershell.wordpress.com
    My Book:Windows PowerShell 2.0 Bible
    My E-mail: -join ('6F6C646B61726C40686F746D61696C2E636F6D'-split'(?<=\G.{2})'|%{if($_){[char][int]"0x$_"}})

  • RAID Level Configuration Best Practices

Hi guys,
We are building a new virtual environment for SQL Server and have to define the RAID level configuration for the SQL Server setup.
Please share your thoughts on RAID configuration for SQL data, log, tempdb, and backup files:
SQL data files -->
SQL log files -->
Tempdb data -->
Tempdb log -->
Backup files -->
Any other configuration best practices are more than welcome, such as memory settings at the OS level and LUN settings.
Also, best practices to configure SQL Server in Hyper-V with clustering.
Thank you
Aim To Inspire Rather to Teach - A.Shah

    Hi,
If you can shed some bucks, you should go for RAID 10 for all files. Also, as a best practice, keeping database log and data files on different physical drives gives optimum performance. Tempdb can be placed with the data files or on a different drive depending on usage; it's always good to use a dedicated drive for tempdb.
For memory settings, please refer to this link for setting max server memory.
You should monitor SQL Server memory usage using the counters below, taken from this link:
SQLServer:Buffer Manager--Buffer cache hit ratio (BCHR): If your BCHR is high (90 to 100), it points to the fact that you don't have memory pressure. Keep in mind that if somebody runs a query which requests a large number of pages, BCHR might momentarily come down to 60 or 70, maybe less, but that does not mean memory pressure; it means the query requires a lot of memory and will take it. After that query completes you will see BCHR rising again.
SQLServer:Buffer Manager--Page life expectancy (PLE): PLE shows how long a page remains in the buffer pool; the longer it stays, the better. It's a common misconception to take 300 as a baseline for PLE, but it is not: I read in Jonathan Kehayias' book (Troubleshooting SQL Server) that this value was a baseline when SQL Server 2000 was current and the most RAM one would see was 4-6 GB. Now, with 200 GB of RAM in the picture, this value is not correct. He also gave a (tentative) formula for calculating it: take the base counter value of 300 presented by most resources, then determine a multiple of this value based on the configured buffer cache size, which is the 'max server memory' sp_configure option in SQL Server, divided by 4 GB. So, for a server with 32 GB allocated to the buffer pool, the PLE value should be at least (32/4)*300 = 2400. So far this has worked well for me, so I would recommend you use it.
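That rule of thumb is easy to mechanize. A minimal sketch (the function name is mine; the 300-per-4 GB ratio is Kehayias' tentative baseline, not a hard limit):

```python
def ple_baseline(max_server_memory_gb):
    """Tentative PLE floor: 300 seconds for every 4 GB of
    buffer pool ('max server memory'), per Kehayias' rule."""
    return (max_server_memory_gb / 4.0) * 300

# A 32 GB buffer pool should sustain a PLE of at least
# (32/4)*300 = 2400 seconds.
```

Compare the live Page life expectancy counter against this floor rather than the flat 300.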
SQLServer:Buffer Manager--Checkpoint pages/sec: This counter is important for understanding memory pressure, because if the buffer cache is low, lots of new pages need to be brought into and flushed out of the buffer pool; under load the checkpoint's work increases and it starts flushing out dirty pages very frequently. If this counter is high, your SQL Server buffer pool is not able to cope with incoming requests, and you need to increase it by increasing buffer pool memory, or by increasing physical RAM and then making adequate changes to the buffer pool size. Technically this value should be low; on a perfmon line graph it should stay near the baseline for a stable system.
SQLServer:Buffer Manager--Free pages: This value should not be low; you always want to see a high value for it.
SQLServer:Memory Manager--Memory grants pending: If you see memory grants pending, your server is facing a SQL Server memory crunch and increasing memory would be a good idea. For memory grants, please read this article:
http://blogs.msdn.com/b/sqlqueryprocessing/archive/2010/02/16/understanding-sql-server-memory-grant.aspx
SQLServer:Memory Manager--Target server memory: This is the amount of memory SQL Server is trying to acquire.
SQLServer:Memory Manager--Total server memory: This is the memory SQL Server has currently acquired.
For other settings I would suggest discussing with your vendor; storage questions IMO should be directed to the vendor.
These would surely be good reads:
SAN storage best practices for SQL Server
SQLCAT best practices for SQL Server storage

  • Best practices TopLink Mapping Workbench multi-user + CVS?

This might be a very important issue in our decision whether or not to choose TopLink:
How well are multi-user development and CVS supported when using the TopLink Mapping Workbench? Are there best practices for this use case?
    Thanks.

We have no problem with the Workbench and CVS. Only a couple of our developers are responsible for the mappings, so we haven't really run into concurrent edits. It's pure XML, so a decent merge tool with XML support should let you resolve conflicts pretty easily.

  • Is there a best practice for multi location server setups including mac mail server?

    Afternoon all,
Last year I set up a client with Snow Leopard Server, including hosting his mail on the server with the Mac mail server and calendaring. He now has plans to open other sites with the same setup; how can this be done using Mac OS X Server? The implementation of a new server on the new site would be straightforward. My concerns/questions are:
1. How would the two servers communicate?
    a.) Do they need to communicate?
2. How will mail across the two sites work?
    a.) How will DNS work for email internally?
    b.) How will DNS work for email externally?
3. How will calendaring work across the two sites?
Is Mac OS X Server the best platform for moving ahead with this type of business need?
    Any help or direction would be greatly appreciated.
    Anthony

    Camelot,
Many thanks for the speedy reply. Your comments are very helpful; if I may, I will give you some more information and some further questions.
The offices will be from 5 to 25 miles apart; the new office, and the ones that follow, will be considered branches of the main office. For example, the company name is Sunflower and it serves area 1, then the new office will serve area 2, and so on. So in theory I could leave the main server domain and mail MX record as sunflower and then add further servers as 2.sunflower.com, 3.sunflower.com as domains and MX records? This would provide unique mail records and users within the organisation, such as [email protected] and [email protected], which doesn't look too good; I would prefer all users to be name@sunflower. How can this be achieved?
With regard to user activity, on the whole users will be fixed, but there will be floaters, such as managers and the owners, who may at times move from one office to the other and would benefit from logging into any machine and getting their profile.
I have thought about VPNs, as I have achieved this type of setup with Microsoft server products, though I have found speed issues in some cases. How can this be achieved using OS X? Are there any how-to docs around? In the Microsoft setup I achieved this using Netgear VPN firewalls, which of course adds an overhead; can this be achieved using OS X's own VPN software?
So ultimately I would prefer to have the one domain without subdomains, "sunflower.com", and for users to be able to log in to their profiles from either location. The main location will remain the head office and primary location, and the new offices will be satellites.
I hope that covers all of your other questions; again, many thanks for your comments and time.
Kind Regards
Anthony
