Best Practice: Team Management

Hi all,
I have to reconfigure a server (now 10.4.8) which has been running unattended for quite some time. I've been reading a lot but still haven't found the information I need.
When I took it over, all users were working with local accounts with identical usernames and an identical primary group. They also all connect to the server with one shared account whose primary group is different from staff.
There is a sharepoint 'office' with all the projects in it.
ACLs have been turned off.
I have set up some network users which (by default) are all members of the group 'staff'.
Now I want all the new network users (the group 'staff') to get read/write access to the already existing files in 'office'. I also want all members of 'staff' to get read/write access to newly created files. The "old" shared user must still keep read/write access to all existing and newly created files.
Which steps do I need to take to cover all these needs?
Since it's a production server I don't want to mess around, and I can't restart the server too often.
Any help will be appreciated.
Best regards
Martin
iMac 20'' (Intel), G4 Sawtooth, iBook G4 800   Mac OS X (10.4)  

If you're just going to use POSIX permissions, you should add your "common" user to the group staff, then set the permissions of the 'office' sharepoint to 2775 (or 2770 if you don't want users other than staff to be able to read the files), i.e. execute "sudo chmod 277x office" in Terminal, where x is 5 or 0 depending on what you want. To ensure that other users in staff can read and write new files, you'll either want to change the umask (globally or just for Finder), or use some cron trickery to chmod g+w the files regularly. These links should help you with the umask:
http://www.macosxhints.com/article.php?story=20061103144038651
http://www.macosxhints.com/article.php?story=20031211073631814
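Putting the above together, a minimal one-time setup might look like the sketch below. The real sharepoint path is an assumption (the default here is a scratch directory so you can dry-run it safely); test against a copy before touching the production share.

```shell
#!/bin/sh
# One-time permissions setup for a group-writable sharepoint (sketch).
# Point SHARE at the real sharepoint (e.g. /Volumes/Data/office);
# the default below is a scratch directory with demo contents.
umask 022
SHARE="${SHARE:-/tmp/office-demo}"
mkdir -p "$SHARE/projectA"
touch "$SHARE/projectA/report.txt"

# chgrp -R staff "$SHARE"   # uncomment once SHARE points at the real share

# The setgid bit (the leading 2) makes new items inherit the folder's group;
# 775 = rwx for owner and group, r-x for others (use 2770 to lock others out).
chmod 2775 "$SHARE"

# Existing files get group read/write; existing subfolders get
# group rwx plus the setgid bit as well.
find "$SHARE" -type f -exec chmod g+rw {} +
find "$SHARE" -type d -exec chmod g+rwxs {} +
```

The setgid bit keeps the group right for newly created files, but whether they come out group-writable still depends on each user's umask, which is what the two hints above address.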

Similar Messages

  • Best practices for managing Movies (iPhoto, iMovie) to IPhone

    I am looking for some basic recommendations / best practices on managing the syncing of movies to my iPhone. Most of my movies either come from a digital camcorder into iMovie or from a digital camera into iPhoto.
    Issues:
    1. If I do an export or a share from iPhoto, iMovie, or QuickTime, which format should I select? I've seen 3gp and m4v.
    2. When I add a movie to iTunes, where is it stored? I've seen some folder locations like iMovie Sharing/iTunes. Can I copy them directly there, or should I always add to the library in iTunes?
    3. If I want to get a DVD I own into a format for the iPhone, how might I do that?
    Any other recommendations on best practices are welcome.
    Thanks
    mek

    1. If you type "iphone" or "ipod" into the help feature in iMovie, it will tell you how.
    "If you want to download and view one of your iMovie projects to your iPod or iPhone, you first need to send it to iTunes. When you send your project to iTunes, iMovie allows you to create one or more movies of different sizes, depending on the size of the original media that’s in your project. The medium-sized movie is best for viewing on your iPod or iPhone."
    2. Mine appear under "Movies", which is where iMovie put them automatically.
    3. If you mean movies purchased on DVD, then copying them is illegal and cannot be discussed here.
    From the terms of use of this forum:
    "Keep within the Law
    No material may be submitted that is intended to promote or commit an illegal act.
    Do not submit software or descriptions of processes that break or otherwise ‘work around’ digital rights management software or hardware. This includes conversations about ‘ripping’ DVDs or working around FairPlay software used on the iTunes Store."

  • Best Practice for Managing a BPC Environment?

    My company is currently running a BPC 5.1 MS environment and will soon be upgrading to version 7.0 MS. I was wondering if there is a white paper or some guidance that anyone can give with regard to the best practice for managing a BPC environment. Which brings to light several questions in my mind:
    1. Which department(s) in a company should "own" the BPC application?
    2. If both, what's SAP's recommendation for segregation of duties?
    3. What roles should exist within our company to manage BPC?
    4. What type(s) of change control is SAP's "Best Practice"?
    We are currently evaluating the best way to manage the system across multiple departments; however, there is no real business ownership in the system, which seems to be counter to the reason for having BPC as a solution in the first place.
    Any guidance on this would be very much appreciated.


  • What's the best practice to manage the page file?

    We have one Hyper-V server running Windows 2012 R2 with 128 GB RAM and two drives (C and D). It is set up with "Automatically manage paging file size for all drives". What's the best practice to manage the page file?
    Bob Lin, MCSE & CNE Networking, Internet, Routing, VPN Networking, Internet, Routing, VPN Troubleshooting on http://www.ChicagoTech.net How to Install and Configure Windows, VMware, Virtualization and Cisco on http://www.HowToNetworking.com

    For Hyper-V systems, my general recommendation is to set the page file to 1-4 GB. This allows for a mini-dump should something happen. 99.99% of the time, Microsoft will be able to figure out the cause of the problem from the mini-dump. It does not make sense on a Hyper-V system to set aside enough space to capture all the memory on the system, because only a very small portion of that memory is used by the parent partition. Most of the memory is under the control of the individual VMs.
    Yes, I had one of the Hyper-V product group tell me that I should let Windows manage it. A couple of times I saw space on my system disk disappear because the algorithm decided it wanted all the space for the page file, which made it so I couldn't patch my systems. I went back in and set the page file to 1-4 GB and have not had any issues since.
    . : | : . : | : . tim

  • General Discussion - Best practice to manage Process order

    Hi Experts,
    What is the best practice to manage process orders?
    1. Quantity change - I can make quantity adjustments in R/3 and APO.
    2. Source change - I can make a version change from the order header. I can also make a source change in APO by selecting a different PPM. Which is the best option?
    3. Re-read master data - is the best practice to read master data from R/3 or APO?
    I feel that for all the above scenarios, process orders should always be managed in R/3. But I'm still wondering why we have the same flexibility in APO too?
    Can

    Hello,
    we are just migrating from 4.6c to ECC 6.0 and I have a couple of workflows to adapt.
    For background steps I defined, in the corresponding BOR methods, an exception to be fired when no result is available (e.g. no mail address available). Normally, I defined them as temporary errors.
    I activated in the WI outcome section the line for this exception, and so the workflow processed this branch when the exception appeared. It worked fine.
    Now, in ECC 6.0, the same workflow gets stuck in the WI. The exception is fired (I can see it in the log as "Error message"), but the WI is still in status "in process". It doesn't continue with the error outcome branch.
    Is this a new logic in ECC 6.0? Do you have any idea what to do? I used this logic some dozen times in different methods and workflows, and it gives me a headache if I have to change everything ...
    Thank you!
    Best regards,
    Thomas

  • What are best practices for managing my iphone from both work and home computers?

    What are best practices for managing my iphone from both work and home computers?

    Sync iPod/iPad/iPhone with two computers
    Although it isn't possible to sync an Apple device with two different libraries it is possible to sync with the same logical library from multiple computers. Each library has an internal ID and when iTunes connects to your iPod/iPad/iPhone it compares the local ID with the one the device normally syncs with. If they are the same you can go ahead and sync...
    I have my library cloned to a small 1 TB USB drive which I can take between home & work. At either location I use SyncToy 2.1 to update the local copy with the external drive. Mac users should be able to find similar tools. I can open either of the local libraries or the one on the external drive and update the media content of my iPhone. The slight exception is Photos, which normally connects to a specific folder on a specific machine, although that can easily be remapped to the current library if you create a "Photos" folder inside the iTunes Media folder so that syncing the iTunes folders keeps this up to date as well. I periodically sweep my library for new files & orphans with iTunes Folder Watch, just in case I make changes at one location but then overwrite the library with a newer copy from the other. Again, Mac users should be able to find similar tools.
    As long as your media is organised within an iTunes Music or iTunes Media folder, in turn held inside the main iTunes folder that has your library files (whether or not you let iTunes keep the media folder organised), each library can access items at the same relative path from the library folder, so the library can be at different drives/paths on different machines. This solution ensures I always have adequate backups of my library and I can update my devices whenever I can connect to the same build of iTunes.
    When working with an iPhone, earlier builds of iTunes would remove any file not physically present in the local library, even if there was an entry for it, making manual management practically redundant on the iPhone. This behaviour has been changed, but it will still only permit manual management with a library that has the correct internal ID. If you don't want to sync your library between machines on a regular basis, just copy the iTunes Library.itl file from the current "home" machine to any other you want to use, then clean out the library entries and import the local content you have on that box.
    tt2

  • Best Practice for Managing Cookies in an Enterprise Environment

    We are upgrading to IE11 for our enterprise. One member of the team wants to set a group policy that will delete all cookies every time the user exits IE11. We have some websites that users access that use cookies to track progress in training, but those cookies are deleted when the user closes the browser. What is the business best practice regarding deleting all history, temporary internet files and, especially, cookies when closing a browser?
    If you can point me to a white paper on this topic, that would be helpful.
    Thanks
    Bill

    Hi,
    Regarding cookie settings, you can manage IE privacy settings using the Administrative Templates for IE 11:
    Administrative templates and Internet Explorer 11
    Delete and manage cookies
    The Administrative Templates for IE 11 can be downloaded from here:
    Administrative Templates for Internet Explorer 11
    Hope this helps
    Best regards
    Michael Shao
    TechNet Community Support

  • Best practices to manage Materials+Vendors in an SRM-MDM Respository?

    Hi Gurus,
    I have a functional question about how to manage the master data for "Materials" and "Vendors" in an SRM-MDM Catalog (Repository) scenario. MDM 7.1, SRM 7.0, SRM-MDM Catalog (Repository) 7.0.
    My concern is that this kind of repository has approx. 32 fields, and the majority of the fields reference material information, with only a few vendor fields.
    The big question is how to load or model the information in the repository?
    Which are the best practices?
    a) Manage the materials in the main table of the repository and then add another main table to maintain the vendor data?
    b) Manage the materials and the vendors in different repositories?
    c) Manage the materials & vendors in the same main table in one repository?
    I know that part of the solution depends on the SRM team's requirements, but I would like to know what the best practices are from the MDM side.
    Thanks in advance.
    JP

    Hey JP,
    A couple of questions for you.
    Do you have the Material and Vendor Master in SRM, in ECC, or in BOTH?
    What will be the scenario: Consolidation, Catalogue Management or CMDM?
    What will be the POC for master data?
    Cheers,
    Rajesh

  • What is a best practice for managing a large amount of ever-changing hyperlinks?

    I am moving an 80+ page printed catalog online. We need to add hyperlinks to our Learning Management System courses to each reference of a class - there are 100s of them. I'm having difficulty understanding what the best practice is for consistent results when I need to go back and edit (which we will have to do regularly).
    These seem like my options:
    Link the actual text - sometimes when I go back to edit the link I can't find it in InDesign but can see it's there when I open up the PDF in Acrobat
    Draw an invisible box over the text and link it - this seems to work better but seems like an extra step
    Do all of the linking in Acrobat
    Am I missing anything?
    Here is the document in case anyone wants to see it so far. For the links that are in there, I used a combination of adding the links in InDesign then perfecting them using Acrobat (removing additional links or correcting others that I couldn't see in InDesign). This part of the process gives me anxiety each month we have to make edits. Nothing seems consistent. Maybe I'm missing something obvious?

    What exactly needs to be edited: the hyperlink, the content, or something else?

  • Best practice for management of Portal personalisation

    Hi
    I've been working in our sandpit client figuring out all portal personalisation and backend/frontend customisation for ESS (with some great help from SDN). However I'm not clear on what best practice is when doing this for 'real'.
    I've read in places that objects to be personalised should be copied first. Elsewhere I've read about creating a new portal desktop for changes to the default framework page, etc.
    What I'm not clear on is the strategy for how you manage and control all the personalisation changes.  For example, if you copy an iview and personalise where do you copy it to? Do you create "Z" versions of all the folders of the standard content or something else?
    Do you copy everything at the lowest possible object level and copy or highest?
    Implications of applying support or enhancement packs?
    We're going pretty vanilla to begin with, in that there will just be a single ESS role, so things should be fairly simple from that point of view, but I've trawled all around SDN without being able to find clear guidance on the overall best practice process for personalisation.
    Can anyone help?
    Regards
    Phil
    Edited by: Phil Martin on Feb 15, 2010 6:47 PM

    Hi Phil,
    To keep things organized I would create a new folder (so don't use the desktop one). The new folder should sit under Portal Content. In that folder create a folder for ESS, then inside that one for iViews (e.g. COMPANY_XYZ >>> ESS >>> iViews), then copy via delta link the iViews you want from the standard SAP folders.
    Then you will have to create another folder for your own roles (e.g. ESS >>> Roles). Inside this folder create a new role and add your iViews (again via delta link). Or you could just copy one of the standard SAP roles into this folder and not bother with copying the iViews. What you do depends on how many changes you intend to make, sometimes if the changes are only small it is easier to copy the entire SAP role (via delta link) and make your changes.
    You should end up with something like this in your PCD:
    Portal Content
         COMPANY_XYZ
              ESS
                   iViews
                        iView1
                        iView2 etc..
                   Roles
                        Role1
                        Role2 etc...
    Then don't forget to assign your new roles to your users or groups.
    Hope that makes sense.
    BRgds,
    Simon
    Edited by: Simon Kemp on Feb 16, 2010 3:56 PM

  • Advice re best practice for managing the scan listener logs and list logs

    Hi friends,
    I've just started a job as a RAC dba administrator for some big 24*7 systems, I've never worked with clusterware and RAC.
    2 Space problems
    1) Very large listener_scan2.log in /u01/11.2.0/grid/log/diag/tnslsnr/<server name>/listener_scan2/trace folder
    2) Heaps of log_nnn.xml files in /u01/11.2.0/grid/log/diag/tnslsnr/<server name>/listener_scan2/alert folder (4Gb used up)
    I'd welcome advice on the best way to manage these in the short term (i.e. delete manually) and the recommended practice and safest way longer term (ADRCI maybe; not sure how it works with scan listeners).
    Welcome advice and commands that could be used to safely clean these up and put a robust mechanism in place for logfile management in RAC and CLusterware systems.
    Finally, should I be checking the log files in /u01/11.2.0/grid/log/diag/tnslsnr/<server name>/listener_scan2/alert regularly?
    My experience with listener logs is that they are only looked at when there are major connectivity issues and on the whole are ignored.
    Thanks for your help,
    Cheers, Rob

    Have you had any issues that require them for investigative purposes? If not, just remove them. Are the logs required for some sort of audit process? If yes, gzip them to a location where you can use your OS tape backup policies to retain them for n-days. Once you remove an active file, it should recreate the file and continue without interruption.
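    If a simple retention scheme is enough, a housekeeping sketch along those lines could be scheduled from cron. The real directory would be the listener_scan2/alert path from the post; the retention periods are assumptions, and the default below runs against a scratch directory so it can be tried safely first.

```shell
#!/bin/sh
# Housekeeping sketch for a listener alert directory full of log_nnn.xml files.
# Point ALERT_DIR at the real path, e.g.
#   /u01/11.2.0/grid/log/diag/tnslsnr/<server name>/listener_scan2/alert
# The default is a scratch directory with demo files.
ALERT_DIR="${ALERT_DIR:-/tmp/listener_scan2_alert_demo}"
mkdir -p "$ALERT_DIR"

# Demo files: one "old" rotated log and one fresh one.
touch -d '10 days ago' "$ALERT_DIR/log_1.xml"
touch "$ALERT_DIR/log_2.xml"

# Compress rotated alert XML logs older than 7 days ...
find "$ALERT_DIR" -name 'log_*.xml' -mtime +7 -exec gzip -f {} \;
# ... and delete compressed copies older than 30 days.
find "$ALERT_DIR" -name 'log_*.xml.gz' -mtime +30 -delete
```

    The large listener_scan2.log itself is safe to truncate or archive while the listener is running, since the listener just keeps appending; gzip to an archive location first if you need it for audit.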

  • What is the best practice to manage versions in XI?

    Hi!
    Is there any <b>good</b> "best practice" way to manage versions in XI?
    I have a challenging scenario with many legacy systems and many interfaces per legacy system.
    Should I put all the different interfaces for one legacy system under one namespace in the Integration Repository, or should I make a separate namespace for each interface?
    Or are there other approaches I should consider to get an environment that is easily maintained?
    br.samuli

    Hi,
    In our project we have defined our own naming conventions/namespaces.
    For instance, we have agreed that we will group all interfaces (from all offices worldwide and different projects) into one single product version.
    This global custom product will in turn be divided into different SWC (Software Components) and SWCV (Software Component versions).
    So for each business scenario we will create separate SWC's and when necessary create new versions of these SWC's.
    This means that each SWC should contain all the required objects to support a complete integration scenario, i.e. inbound/outbound interfaces, data types, message types, business scenarios, etc.
    Regards,
    Rob.

  • Best Practice to Manage Subscriptions for EA Enterprise Agreement Customer

    Hi Team,
    We as a company have an EA subscription for our future product release(s), so we want to understand how we should manage subscriptions for our different projects with different environments.
    I have visited a few important links like
    http://blog.thavo.com/2014/05/where-to-start-with-microsoft-azure.html
    http://blog.kloud.com.au/2013/07/30/good-practices-for-managing-windows-azure-subscriptions/
    We use all possible combinations: SQL Azure, Azure Network, Azure Websites, Cloud Services & Traffic Manager.
    We also like to use slots & swapping in Azure Websites, and staging & production in the case of Cloud Services.
    Based on the above links, I understood that we should have a different subscription for each type of environment; e.g. if my project name is ToDoApp, then I should have two subscriptions:
    1) ToDoApp Staging    2) ToDoApp Production.
    My question here is: if we follow that practice, how could we utilize the swapping feature of Azure? It works only within a specific subscription, so if I deploy the application to my ToDoApp Staging environment, then after testing I would have to re-deploy the same on Production (swapping can't work, as we have a different subscription per environment).
    Please suggest something where our deployments would be minimized and we can have per-subscription resource utilization reports together with the swapping functionality.
    Regards, Brijesh Shah

    Hi Jambor & all,
    You are correct. From separate subscriptions, we can get all the billing & resource utilization information easily. But at the same time, my concern is that we might lose the core Azure feature of swapping.
    An Azure Website (or Cloud Service) has a feature to test the deployment before moving to production: we deploy to a staging slot and, once we are done, we can swap it.
    But this way we have to keep all those instances (whether for testing/staging or production) in a single subscription.
    Moreover, within a single subscription it has to be within one instance (one Cloud Service or one Azure Website); only then can you swap.
    So, don't you think that would be a hard limitation?
    Regards, Brijesh Shah

  • Best Practice for managing variables for multiple environments

    I am very new to Java Web Dynpro and have a question concerning our deployments to Sandbox, Development, QA, and Production environments.
    What is the 'best practice' people use so that if you have information specific to each environment, you don't hard-code it in your Java Web Dynpro code?
    I could put the value in a properties file, but how do I make that vary? Otherwise I'd still have to make a change to the properties file for each environment and re-deploy. I know there are some configurations on the Portal, but I am not sure if that will work in my instance.
    For example, I have a URL that varies based on my environment. I don't want to hard-code and re-compile for each environment. I'd prefer to get that information on the fly by knowing which environment I'm running in and loading the appropriate URL.
    So far the only thing I've found that comes close to telling me where I'm running is a parameter map, but the 'key' in the map is the URL, not the value, and I suspect there's a cleaner way to get something like that. I used Eclipse's autosense in NetWeaver to discover some of the things available in my web context.
    Here's the code I used to get that map:
    TaskBinder.getCurrentTask().getWebContextAdapter().getRequestParameterMap();
    In the forum is an example that gets the IP address of the site you're serving from. It sounds like it is going to be, or has been, deprecated (it worked on my system just now), and I would really rather have something like the DNS name, not something like an IP that could change.
    Here's that code:
    String remoteHost = TaskBinder.getCurrentTask().getWebContextAdapter().getHttpServletRequest().getRemoteHost();
    Thanks in advance for any clues you can throw my way -
    Greg

    Hi Greg:
    I suggest that you check the "Software Change Management Guide"; in this guide you can find an explanation of the best practices for working with a development infrastructure.
    This is the link:
    http://help.sap.com/saphelp_erp2005/helpdata/en/83/74c4ce0ed93b4abc6144aafaa1130f/frameset.htm
    Now, if you want to get the IP of your server or the name of your site, you can do the following:
    HttpServletRequest request = ((IWebContextAdapter) WDWebContextAdapter.getWebContextAdapter()).getHttpServletRequest();
    String server_name = request.getServerName();
    String remote_address = request.getRemoteAddr();
    String remote_host = request.getRemoteHost();
    You just need to add servlet.jar in your project properties > Build Path > Libraries.
    Good Luck
    Josué Cruz
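    The underlying idea, deriving environment-specific values from the server's identity instead of hard-coding them, is not Web Dynpro specific. A shell sketch of the same hostname-based lookup is below; the hostname patterns and URLs are entirely hypothetical placeholders.

```shell
#!/bin/sh
# Map a server hostname to its environment's service URL (sketch).
# Hostname patterns and URLs are made-up placeholders.
service_url_for() {
  case "$1" in
    *sandbox*) echo "http://sandbox.example.com/service" ;;
    *dev*)     echo "http://dev.example.com/service" ;;
    *qa*)      echo "http://qa.example.com/service" ;;
    *)         echo "http://prod.example.com/service" ;;
  esac
}

# Resolve the URL for whatever machine we are running on.
service_url_for "$(hostname)"
```

    The same case-style dispatch is what request.getServerName() enables on the Java side: one build artifact, with the environment-specific value picked at runtime.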

  • JSF best practice for managed bean population

    hi;
    Consider a simple session-scoped managed bean with one attribute which is calculated dynamically.
    There seem to be 3 ways of populating that attribute (that I could think of):
    1. have the bean's get<attribute> method get the data and populate it
    2. have the constructor of the bean get the data and populate it
    3. have an action get the data and set it from the action code (maybe use a valueBinding to get the bean, if the bean which needs population is not the same bean that holds the action).
    Option 1 seems OK for situations where I need my data to be calculated for every valueRef for that attribute, but it is overhead for data that need not be recalculated each time.
    Option 2 will ensure that the data is calculated only once per session, but it still looks kind of strange to do it that way.
    With option 3, it seems you should populate the dynamic content of the next page, which is intuitive, but some places in the page might be common content with dynamic data, and this has nothing to do with navigation and the next page to be displayed.
    Is there a best practice for how to populate a managed bean's data?
    Do different cases fit different ways?
    Any thoughts are appreciated.

    I think it should be either Option 2 or Option 3.
    Option 2 would be necessary if the bean data depends on some request parameters.
    (Example: Getting customer bean for a particular customer id)
    Otherwise Option 3 seems the reasonable approach.
    But, I am also pondering on this issue. The above are just my initial thoughts.
