Job (C) use best practices

Experts,
This question is in regard to best practices/common ways that various companies ensure the proper use of the Job (C) object in HCM systems.  For example, if there are certain jobs in the system that should only be assigned to a position if that position is in certain areas of the business (i.e. it belongs to specific organizational areas), how is this type of restriction maintained?  Is it simply through business processes? Is there a way/relationship that can be assigned? Are there typical customizations and/or processes that are followed?
I'm looking to begin organizing jobs into job families, and I'm currently trying to determine and maintain the underlying organization of our company's jobs in order to ensure this process is functional.
Any insight, thoughts, or advice would be greatly appreciated.
Best regards,
Joe

Hi Joe,
You can embed the business area info into the job description, and this would be part of a best practice.
What I mean is this:
Say, for example, that in your company you have 4 managers:
HR Manager
IT Manager
Procurement Manager
Production Manager
Then, as part of SAP best practice, you will have 4 positions (1 position per person).
My advice is that you should also have 4 jobs describing those positions.
Then, in order to group all managers, you may have one job family "Managers" and assign all four jobs to that family.
This way you can report on all managers as well as on area-specific managers (e.g. the HR Manager); the sketch below makes the grouping concrete.
As far as I know, there is no standard relationship that holds business area info.
For further info check table T778V via SM31.
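To make the job-family grouping concrete, here is a small sketch in plain Java (nothing SAP-specific; the class name and data are made up for illustration) of how one family rolls up area-specific jobs for reporting:
import java.util.List;
import java.util.Map;

public class JobFamilyReport {
    public static void main(String[] args) {
        // One job family groups several area-specific jobs (hypothetical data)
        Map<String, List<String>> jobFamilies = Map.of(
            "Managers", List.of("HR Manager", "IT Manager",
                                "Procurement Manager", "Production Manager"));

        // Report all managers via the family...
        jobFamilies.get("Managers").forEach(System.out::println);

        // ...or only the area-specific ones, e.g. the HR manager
        jobFamilies.get("Managers").stream()
            .filter(job -> job.startsWith("HR"))
            .forEach(System.out::println);
    }
}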
Regards,
Dilek

Similar Messages

  • Use Best Practice to Import Master Data

    Hi,
    I am an SAP beginner; I'd be glad if someone can guide me on my issue. How can I use best practices to import 1000+ material master records into the SAP system? I have already prepared the data in an Excel spreadsheet. Can anyone guide me on the steps?
    Thanks.

    Hi,
    LSMW is a very good tool for master data upload. The tool is very rich in features but is also complex. Being a beginner, you should check with a consultant to learn how you can use LSMW to upload your 1000+ records. The tool itself is quite intuitive: after entering the LSMW transaction you create the project, subproject and the object you are going to work on uploading. When you enter the next screen you see several radio buttons; typically every upload requires all the features behind those radio buttons, in that same sequence. It is not possible to give the details of each of these radio buttons in this forum, so please get a consultant's help in your vicinity.
    Thanks
    Bala

  • How to use best practices installation tool?

    Hello!
    Can anyone share some useful links/docs that can guide me on how to use the Best Practices installation tool (/SMB/BBI)?
    Any responses will be awarded.
    Regards,
    Samson

    Hi,
    will you please share the same ?
    thanks in advance

  • Implementing a "login" using best practices

    I have a little bit of time now for my project where I'd like to refactor it a bit and take the opportunity to learn about best practices to use with JSP/Servlets, but I'm having some trouble thinking about what goes where and how to organize things.
    Here's my current login functionality. I have not separated my "business logic" from my "presentation logic", as you can see in this simple starting example.
    index.html:
    <html>
    <body>
    <form action="login.jsp" method="post">
        <h1>Please Login</h1>
        User Name:    <input type="text" name="login"><br>
        Password: <input type="password" name="password"><br>
        <input type="submit" value="Login">
    </form>
    </body>
    </html>
    login.jsp:
    <jsp:useBean id="db" type="database.DatabaseContainer" scope="session"/>
    <%
    if (session.getAttribute("authorized") == null
            || session.getAttribute("authorized").equals("no")
            || request.getParameter("login") != null) {
        String login = request.getParameter("login");
        String password = request.getParameter("password");
        if (login != null && db.checkLogin(login, password)) {
            // Valid login
            session.setAttribute("authorized", "yes");
            session.setAttribute("user", login);
        } else {
            // Invalid login
            session.setAttribute("authorized", "no");
            %><jsp:forward page="index.html"/><%
        }
    } else if (session.getAttribute("authorized").equals("no")) {
        // Not logged in (e.g. a refresh); send back to the login form
        %><jsp:forward page="index.html"/><%
    } else {
        System.out.println("Other");
    }
    %>
    <html>
    <body>
    <h1>Welcome <%= " "+session.getAttribute("user") %></h1>
    <!-- links to other JSPs are here -->
    </body>
    </html>
    What should I be doing instead? Should I make the form action a servlet rather than a JSP? I don't want to be writing HTML in my servlets, though. Do I do the authentication in a servlet that I make the form action, and then have the servlet forward to some standard HTML page?

    OK, so I'm starting things off simply by just converting what I have to use better practices. For now I just want to get the basic flow of how I transition from page to servlet to page.
    Here's my index.html page:
    <html>
    <body>
    <form action="login" method="post">
        <h1>Please Login</h1>
        Phone Number:    <input type="text" name="login"><br>
        Password: <input type="password" name="password"><br>
        <input type="submit" value="Login">
    </form>
    </body>
    </html>
    I have a mapping that says login goes to LoginServlet, which is here:
    import java.io.IOException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.servlet.http.HttpSession;
    import db.DatabaseContainer;

    public class LoginServlet extends HttpServlet {
        public void doGet(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            HttpSession session = request.getSession();
            if (session.getAttribute("authorized") == null
                    || session.getAttribute("authorized").equals("no")
                    || request.getParameter("login") != null) {
                String login = request.getParameter("login");
                String password = request.getParameter("password");
                DatabaseContainer db = (DatabaseContainer) session.getAttribute("db");
                if (login != null && db.checkLogin(login, password)) {
                    // Valid login
                    session.setAttribute("authorized", "yes");
                    session.setAttribute("user", login);
                    // forward to home page
                } else {
                    // Invalid login
                    session.setAttribute("authorized", "no");
                    // forward back to login page
                }
            } else if (session.getAttribute("authorized").equals("no")) {
                // forward back to login page
            }
        }

        public void doPost(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            doGet(request, response);
        }
    }
    If I'm not logged in, I want to simply forward back to the login page for now. If I am logged in, I want to forward to my home page. If my home page is a simple HTML page, though, then what's to stop a person from just typing in the home page URL and getting to it? I would think it would need to be a JSP page, but then the JSP page would have to have code in it to check if the user was logged in, and then I'd be back to where I was before.
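    A common answer to the "what stops someone from just typing in the home page URL" question is a servlet Filter that checks the session before any protected page is served, so the pages themselves need no auth code. Here is a minimal sketch, assuming the same "authorized"/"yes" session convention used above (the class name, and index.html as the login page, are illustrative, not from the original post):
    import java.io.IOException;
    import javax.servlet.Filter;
    import javax.servlet.FilterChain;
    import javax.servlet.FilterConfig;
    import javax.servlet.ServletException;
    import javax.servlet.ServletRequest;
    import javax.servlet.ServletResponse;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.servlet.http.HttpSession;

    public class AuthFilter implements Filter {
        public void init(FilterConfig config) throws ServletException {}

        public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
                throws IOException, ServletException {
            HttpServletRequest request = (HttpServletRequest) req;
            HttpServletResponse response = (HttpServletResponse) res;
            // Don't create a session just to check it; no session means not logged in
            HttpSession session = request.getSession(false);
            if (session != null && "yes".equals(session.getAttribute("authorized"))) {
                chain.doFilter(req, res);  // authorized: let the request through
            } else {
                // Not logged in: bounce back to the login form
                response.sendRedirect(request.getContextPath() + "/index.html");
            }
        }

        public void destroy() {}
    }
    The filter would then be mapped to the protected paths (e.g. /home/*) with <filter> and <filter-mapping> entries in web.xml, so the home page can stay a plain page with no embedded login check.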

  • Using Best Practices personalization tool

    hello,
    We wish to use the Best Practices personalization tool for customer-specific data for the Baseline Package.
    I do not understand from the documentation whether it is possible to use it after installing the Baseline, or whether it has to be done simultaneously (meaning the personalized data has to be ready in the files before starting the Baseline installation)?
    Thank you
    Michal

    Hi,
    Please Ref:
    http://help.sap.com/bp_bl603/BL_IN/html/index.htm
    Your personalized files have to be done before implementation, as you will be using the files during the installation process.
    The XML file and the TXT files you create from the personalization tool are used to upload the scenario to the system; otherwise it will upload the defaults.
    Also refer to note 1226570 (here I am referring to IN); you can check the same for other country versions as well.
    Thanks & Regards,
    Balaji.S

  • VC table use, best practices?

    Hi,
    I'm updating a table in the back end with an RFC. I would like to send only the rows I've modified or added on the VC client to the RFC and not the whole table. Is this possible?

    Hey Joel,
    Add a condition (say, a checkbox) for changing a table row (the user needs to select which rows he is modifying).
    While sending the values to the RFC, set the guard condition to be valid only when the checkbox is selected.
    Regards,
    Pradeep

  • Best practice for migrating data tables- please comment.

    I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
    Instead of the DBAs just using a tool to migrate the data they are insistent that I save and provide scripts for every single commit, in proper order, necessary to both build the table and insert the data from ground zero.
    I am very unaccustomed to this kind of environment and it seems much riskier for me to try and rebuild the objects from scratch when I already have a perfect, tested, ready model.
    They also require extensive documentation where every step is recorded in a document and use that for the deployment.
    I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
    Please comment on your view of this practice. Thanks!

    >
    Please comment on your view of this practice. Thanks!
    >
    Sounds like the DBAs are using best practices to get the job done. Congratulations to them!
    >
    I have 5 new tables seeded with data that need to be promoted from a development to a production environment.
    Instead of the DBAs just using a tool to migrate the data they are insistent that I save and provide scripts for every single commit, in proper order, necessary to both build the table and insert the data from ground zero.
    >
    The process you describe is what I would expect, and require, in any well-run environment.
    >
    I am very unaccustomed to this kind of environment and it seems much riskier for me to try and rebuild the objects from scratch when I already have a perfect, tested, ready model.
    >
    Nobody cares if it is riskier for you. The production environment is sacred. Any and all risk to it must be reduced to a minimum at all cost. In my opinion a DBA should NEVER move ANYTHING from a development environment directly to a production environment. NEVER.
    Development environments are sandboxes. They are often not backed up. You or anyone else could easily modify tables or data with no controls in place. Anything done in a DEV environment is assumed to be incomplete, insecure, disposable and unvetted.
    If you are doing development and don't have scripts to rebuild your objects from scratch then you are doing it wrong. You should ALWAYS have your own backup copies of DDL in case anything happens (and it does) to the development environment. By 'have your own' I mean there should be copies in a version control system or central repository where your teammates can get their hands on them if you are not available.
    As for data - I agree with what others have said. Further - ALL data in a dev environment is assumed to be dev data and not production data. In all environments I have worked in ALL production data must be validated and approved by the business. That means every piece of data in lookup tables, fact tables, dimension tables, etc. Only computed data, such as might be in a data warehouse system generated by an ETL process might be exempt; but the process that creates that data is not exempt - that process and ultimately the data - must be signed off on by the business.
    And the business generally has no access to, or control of, a development environment. That means using a TEST or QA environment for the business users to test and validate.
    >
    They also require extensive documentation where every step is recorded in a document and use that for the deployment.
    I believe their rationale is they don't want to rely on backups but instead want to rely on a document that specifies each step to recreate.
    >
    Absolutely! That's how professional deployments are performed. Deployment documents are prepared and submitted for sign-off by each of the affected groups. Those groups can include security, DBA, business user, IT and even legal. The deployment documents always include recovery steps so that if something goes wrong or the deployment can't proceed, there is a documented procedure for restoring the system to a valid working state.
    The deployments I participate in have representatives from each of those groups in the room or on a conference call as each step of the deployment is performed. Your 5 tables may be used by stored procedures, views or other code that has to be deployed as part of the same process. Each step of the deployment has to be performed in the correct order. If something goes wrong, the responsible party is responsible for assisting in the retry or recovery of their component.
    It is absolutely vital to have a known, secure, repeatable process for deployments. There are no shortcuts. I agree, for a simple 5 new table and small amount of data scenario it may seem like overkill.
    But, despite what you say, it simply cannot be that easy, for one simple reason: adding 5 tables with data to a production system has no business impact or utility at all unless there is some code, process or application somewhere that accesses those tables and data. Your post didn't mention what changes are being made to actually USE what you are adding.

  • SAP SCM and SAP APO: Best practices, tips and recommendations

    Hi,
    I have been gathering useful information about SAP SCM and SAP APO (e.g., advanced supply chain planning, master data and transaction data for advanced planning, demand planning, cross-plant planning, production planning and detailed scheduling, deployment, global available-to-promise (global ATP), CIF (core interface), SAP APO DP planning tools (macros, statistical forecasting, lifecycle planning, data realignment, data upload into the planning area, mass processing – background jobs, process chains, aggregation and disaggregation), and PP/DS heuristics for production planning).
    I am especially interested about best practices, tips and recommendations for using and developing SAP SCM and SAP APO. For example, [CIF Tips and Tricks Version 3.1.1|https://service.sap.com/form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700006480652001E] and [CIF Tips and Tricks Version 4.0|https://service.sap.com/form/sapnet?_FRAME=CONTAINER&_OBJECT=011000358700000596412005E] contain pretty useful knowledge about CIF.
    If you know any useful best practices, tips and recommendations for using and developing SAP SCM and SAP APO, I would appreciate if you could share those assets with me.
    Thanks in advance of your help.
    Regards,
    Jarmo Tuominen

    Hi Jarmo,
    Apart from what DB has suggested, you should give the following a good read:
    -Consulting Notes (use the application component filters in search notes)
    -Collective Notes (similar to the one above)
    -Release Notes
    -Release Restrictions
    -If $$ permit, subscribe to www.scmexpertonline.com. Good perspective on concepts around SAP SCM.
    -There are a couple of blogs (e.g. www.apolemia.com), but all lack breadth; some cover a few topics in depth.
    -The "Articles" section on this site (not all are classified well; look in ECC Ops, Mfg, SCM, Logistics etc.)
    -Service.sap.com - check the solution details overview in the Knowledge Exchange tab. There are product presentations and collaterals for every release. Good breadth but no depth.
    -Building Blocks - available for all application areas. This is limited to vanilla configuration, just making a process work and nothing more than that.
    -Get the book "Sales and Operations Planning with SAP APO" from SAP Press. It's got plenty of easy-to-follow material, good perspective and lots of screenshots to make life easier.
    -help.sap.com - the last thing most refer to after all "handy" options (incl. this forum) are exhausted. Nevertheless, it is the superset of all "secondary" documents. But the maze of hyperlinks that starts at APO might lead you to something like an XML schema.
    Key Tip: Appreciate that SAP SCM is largely driven by connected execution systems (SAP ECC/ERP). So the best place to start is a good overview of the ERP OPS solution, at least at a significant level of depth. Check the document "ERP ops architecture overview" on the SDN wiki.
    I have a good collection of documents, though many I haven't read myself. If you need them, let me know.
    Regards,
    Loknath

  • Best practice in implementation of SEM-CPM

    Does someone have experience implementing SEM-CPM using Best Practices? If so, does it reduce implementation time?

    We should be able to adopt the best practices when the software finally gets integrated into NetWeaver.
    Ravi Thothadri

  • What is best practice for dealing with Engineering Spare Parts?

    Hello All,
    I am after some advice regarding the process for handling engineering spare parts in PM. (We run ECC 5)
    Our current process is as follows:
    All materials are set up as HIBE's
    Each material is batch managed
    The Batch field is used for the Bin location
    We are now looking to roll out PM to a site that has in excess of 50,000 spare parts and want to make sure we use best practice for handling them. We are now considering using a basic WM setup to handle the movement of parts.
    Please can you provide me with some feedback on what you feel the best practice is for dealing with these parts?
    We are looking to set up a solution that will allow us to generate pick lists etc. and implement a scanning solution to move parts in and out of stores.
    Regards
    Chris

    Hi,
    I hope all 50,000 spare parts are maintained as stock items.
    1. Based on the usage of those spare parts, try to define safety stock and set MRP to "Reorder Point Planning" (see the sketch below). This way you can avoid petty cash purchases.
    2. By keeping the spare parts (at least the critical components) in stock, planned as well as unplanned maintenance will not get delayed.
    3. By doing goods issue against a reservation, quantities can be tracked against the order and the equipment.
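    Not SAP functionality, but to make the reorder-point idea in point 1 concrete, here is a rough sketch in plain Java (all names and figures are hypothetical, not taken from this thread):
    public class ReorderPointCheck {
        public static void main(String[] args) {
            // Hypothetical consumption figures for one spare part
            double avgDailyUsage = 2.0;   // units consumed per day
            double leadTimeDays  = 10.0;  // days to replenish from the vendor
            double safetyStock   = 5.0;   // buffer against demand spikes

            // Classic reorder point: expected demand during lead time plus buffer
            double reorderPoint = avgDailyUsage * leadTimeDays + safetyStock;

            double currentStock = 20.0;
            if (currentStock <= reorderPoint) {
                System.out.println("At/below reorder point (" + reorderPoint
                        + "): MRP would create a procurement proposal.");
            } else {
                System.out.println("Stock sufficient; no replenishment needed.");
            }
        }
    }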
    As this question is MM & WM related, those forums can give better clarity on this.
    Regards,
    Maheswaran.

  • Best Practices v3.31 - SAP InfoSet Query connection

    Hi,
    I have a problem adapting a Crystal Report from Best Practices for Business Intelligence v3.31 to the SAP system. The report "Cost Analysis Planned vs. Actual Order Costs.rpt" uses the SAP InfoSet Query "CO_OM_OP_20_Q1". This InfoSet Query is working fine in SAP. My SAP user has access to the following user groups:
    - /SREP/IS_UG
    - /KYK/IS_UG
    - ZBPBI131_USR
    The Controlling Area in SAP is '1000'. Crystal Reports generates this error message:
    - Failed to retrieve data from the database.
    - Database Connector Error: "Controlling area does not exist"
    - Database Connector Error: 'RFC_CLOSED'
    But InfoSet Query "CO_OM_CA_20_Q1" has no problem with Controlling Area '1000' in Crystal Reports and is working fine in SAP!
    Can somebody help?
    Thanks in advance.
    Peter

    Hello Peter,
    I'm using Best Practices for BI v1.31, and it also has the report you are talking about.
    I face the same issue when trying to adapt it to my ERP, but if I run it in the SAP GUI, it runs smoothly without any issue.
    Please advise,
    Thanks in advance,
    Carlos Henrique Matos da Silva - SAP BusinessObjects BI - Porto Alegre/Brazil.

  • SAP Best Practice for Chemicals in an oil refinery

    Does anyone have experience in implementing the SAP Best Practice for Chemicals in an oil refinery that will also use IS-OIL?
    What would be the pros and cons?
    What is the implementation price of considering a best practice solution for a large complex organization?
    Oded Dagan
    Oil Refineries LTD.
    Israel

    Hi Oded,
    I don't know of any Best Practice Chemicals implementation in a refinery so far, but you can use Best Practice Chemicals with SAP IS-Oil for the non-IS-Oil functionality as well. Best Practice Chemicals gives you benefits within standard business processes; for the IS-Oil business processes you have to consider the traditional implementation methods.
    Best Practice Chemicals gives you a broad business process set of the kind usually used in chemical corporations. If you can cover 50% of your needed business processes out of Best Practice Chemicals, you save approx. 50% implementation time, and it is not only implementation: you save a lot in documentation and training material as well. Most of our Best Practice Chemicals implementations used 60-80% of Best Practice Chemicals. At a large corporation the percentage of standard ERP processes is normally smaller, because of other additionally needed SAP solutions, e.g. APO, SRM, CRM etc.
    Regards, Rainer

  • Storage Server 2012 best practices? Newbie to larger storage systems.

    I have many years managing and planning smaller Windows server environments; however, my non-profit has recently purchased two StoreEasy 1630 servers and we would like to set them up using best practices for networking and Windows storage technologies. The main goal is to build an infrastructure so we can provide SMB/CIFS services across our campus network to our 500+ end-user workstations, taking into account redundancy, backup and room for growth. The following describes our environment and vision. Any thoughts / guidance / white papers / directions would be appreciated.
    Networking
    The server closets all have Cisco 1000T switching equipment. What type of networking is desired/required? Do we need switch-hardware-based LACP, or will the Windows 2012 NIC teaming options be sufficient across the four 1000T ports on the StoreEasy?
    NAS Enclosures
    There are 2 StoreEasy 1630 Windows Storage servers. One in Brooklyn and the other in Manhattan.
    Hard Disk Configuration
    Each of the StoreEasy servers has 14 3TB drives, for a total raw storage capacity of 42TB. By default the StoreEasy servers were configured with 2 RAID 6 arrays and 1 hot-standby disk in the first bay. One RAID 6 array is made up of disks 2-8 and presents two logical drives to the storage server: a 99.99GB OS partition and a 13,872.32GB NTFS D: drive. The second RAID 6 array resides on disks 9-14 and is partitioned as one 11,177.83GB NTFS drive.
    Storage Pooling
    In our deployment we would like to build in room for growth by implementing storage pooling that can later be increased in size when we add additional disk enclosures to the rack. Do we want to create VHDX files on top of the logical NTFS drives? When physical disk enclosures, with disks, are added to the rack and present a logical drive to the OS, would we just create additional VHDX files on the expansion enclosures and add them to the storage pool? If we do use VHDX virtual disks, what size virtual hard disks should we make? Is there a max capacity, e.g. 64TB? Please let us know what the best approach for storage pooling will be for our environment.
    Windows Sharing
    We were thinking that we would create a single share granting all users within the AD FullOrganization user group read/write permission. Then within this share we were thinking of using NTFS permissions to create subfolders with different permissions for each departmental group and subgroup. Is this the correct approach, or do you suggest a different one?
    DFS
    In order to provide high availability and redundancy we would like to use DFS Replication on shared folders to mirror storage01, located in our Brooklyn server closet, and storage02, located in our Manhattan server closet. Presently there is a 10TB DFS Replication limit in Windows 2012. Is this replication limit per share, or for the total of all files under DFS? We have been informed that HP will provide an upgrade to 2012 R2 Storage Server when it becomes available. In the meantime, how should we design our storage and replication strategy around the limits?
    Backup Strategy
    I read that Windows Server Backup can only back up disks up to 2TB in size. We were thinking that we would like our 2 current StoreEasy servers to back up to each other (to an unreplicated portion of the disk space) nightly until we can purchase a third system for backup. What is the best approach for backup? Should we use Windows Server Backup to capture the data volumes, or should we use third-party backup software?

    Hi,
    Sorry for the delay in replying.
    I'll try to reply to each of your questions. For the first one, however, you may want to post to the Network forum for further information, or contact your device provider (HP) to see if there is any recommendation.
    For Storage Pooling:
    From your description, you would like to create VHDX files on the RAID 6 disks for future growth. That is fine and, as you said, it is limited to 64TB. See:
    Hyper-V Virtual Hard Disk Format Overview
    http://technet.microsoft.com/en-us/library/hh831446.aspx
    Another possible solution is using Storage Spaces - a new feature in Windows Server 2012. See:
    Storage Spaces Overview
    http://technet.microsoft.com/en-us/library/hh831739.aspx
    It can add hard disks to a storage pool and create virtual disks from the pool. You can add disks to the pool later and create new virtual disks if needed.
    For Windows Sharing
    Generally we will have different shared folders later. Creating all shares in one root folder sounds good, but in practice we may not be able to accomplish that, so it depends on the actual environment.
    For DFS replication limitation
    I assume the 10TB limitation comes from this link:
    http://blogs.technet.com/b/csstwplatform/archive/2009/10/20/what-is-dfs-maximum-size-limit.aspx
    I contacted the DFSR department about the limitation. Actually DFS-R can replicate more data than that and does not have an exact limit. As you can see, the article was created in 2009.
    For Backup
    As you said, there is a backup limitation (2TB for a single backup). So if that cannot meet your requirement, you will need to find a third-party solution.
    Backup limitation
    http://technet.microsoft.com/en-us/library/cc772523.aspx
    If you have any feedback on our support, please send to [email protected]

  • Best practices for deployment from Dev /Staging /Production in SharePoint ?

    Hi All,
    What are the best practices to deploy a SharePoint portal across dev / staging / production?
    I have a custom solution deployed using a WSP file, but I have also made some changes using SharePoint Designer, such as Designer workflows, master pages, etc.
    How can I deploy my document libraries and lists from dev to prod using best practices?
    Thanks
    Balaji More

    Hi,
    According to your post, my understanding is that you want to know the best practices for deploying a SharePoint portal across different SharePoint environments.
    If the site does not exist on the production server, we can save the site from the development server and then import it to the production server.
    But if the site already exists on the production server, we should follow these steps to add just the taxonomy and content types to the production server:
    1. Save the site from Dev as a template.
    2. Import the template as a solution in Visual Studio.
    3. Remove unnecessary items from the solution (please pay close attention here: if a content type/list... in the solution also exists in the production site, it will replace the same object in production after deployment).
    4. Package the solution.
    5. Deploy the solution in production.
    For more details, please see:
    http://ahmedmadany.wordpress.com/2012/12/30/importing-sharepoint-solution-package-wsp-into-visual-studio-2010/
    There is a similar thread for your reference.
    http://social.technet.microsoft.com/Forums/en-US/7dcf61a8-1af2-4f83-a04c-ff6c439e8268/best-practices-guide-for-deploying-sharepoint-2010-from-dev-to-test-to-production?forum=sharepointgeneralprevious
    Thanks & Regards,
    Jason
    Jason Guo
    TechNet Community Support

  • Is it a best practice to have a template with one master page?

    I am a newbie FrameMaker 11 writer and am cleaning up some unorganized books. Should I copy one set of master pages to all files in the book? Currently my TOC and certain other files have unique master pages. I would like to set up our books using best practices and would like input from the community. Thanks.

    There are two schools of thought on this. The specific sub-template approach or the "kitchen sink" approach.
    In the "kitchen sink" (i.e. everything, including the...) approach, the FM template is loaded with everything required for the project in a single file. It's simple to deploy, import it to all files and you're good to go. However, the author may have to deal with all sorts of superfluous tags and page layouts in some specific file types, like the cover pages, TOC, Index and other generated files. The onus is on the author to select the correct items to use from the multitude of choices.
    The sub-template approach is a modular approach where one creates the various components in separate template files, e.g. paragraph and character tags, tables, page layouts, etc., and combines them to create specific templates for the various book components. These component-combined templates have only the minimum that is required for each type of document component. This is a lego-like approach and it provides more flexibility (IMHO) with modifying, updating and creating new templates. It is easier (perhaps less intimidating would be a better term) for the author to use, as their choices are much more limited in any given context. However, they do have to apply the correct templates to the specific book components.
    In all cases, you need to document the usage of all components in the template(s), so authors will know the intent of each and every tag, table, style, page layout, etc.
