Backup Strategy for EP7.0

Dear All,
Can anybody help me in designing a backup strategy for EP7.0?
The system is mostly used to publish BI web reports.
Thank you,
Regards,
Venkat

Hi
        This link is perfect for you:
http://help.sap.com/saphelp_nw04s/helpdata/en/f7/290995b9864dcfbe10ad70ddfb5d85/frameset.htm
SAP Note Number: 779708
Cheers.
Please award points for helpful answers.

Similar Messages

  • Back up strategy for java systems (EP7.0)

    Hi Gurus
    We are using Windows Server 2003 Enterprise and SQL Server 2005 for our ECC 6.0 system (dual stack), and one more Enterprise Portal 7.0 on a different box with the same OS and DB.
    As I am new to MS SQL, can anybody tell me how to proceed with the backup policy, especially for the Java systems?
    This is what I am planning:
    Whole database: full, offline, monthly
    /usr/sap/<SID>: full, online, daily
    Transaction logs: every 4 hours
    Anyway, the exact frequency does not matter, but are the above steps sufficient to recover, or do we need to include more in the backup?
    Kindly suggest.
    Thanks in advance.
    Regards
    BALAJI
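    For the SQL Server side of a plan like this, the scheduled jobs ultimately run T-SQL BACKUP statements. Below is a minimal dry-run sketch that only prints the statements each job would execute; the database name (SID) and file paths are placeholders, not taken from the thread, and the scheduling itself would live in SQL Server Agent or the Windows task scheduler:

    ```shell
    #!/bin/sh
    # Dry-run generator for the T-SQL each scheduled backup job would run.
    # Note: SQL Server full backups are online operations, so the monthly
    # full backup does not strictly require downtime.

    full_backup() { printf "BACKUP DATABASE [%s] TO DISK = N'%s'\n" "$1" "$2"; }
    log_backup()  { printf "BACKUP LOG [%s] TO DISK = N'%s'\n" "$1" "$2"; }

    # Monthly: full database backup
    full_backup SID 'E:\backup\SID_full.bak'
    # Every 4 hours: transaction log backup (requires the FULL recovery model)
    log_backup SID 'E:\backup\SID_log.trn'
    ```

    The /usr/sap/<SID> filesystem backup would be handled separately by the file-level backup tool.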

    Hi Bala,
    Do have a look at these threads; while they may not answer your question directly, they seem to have some good inputs:
    1 - How to backup MSSQL  database in ECC6
    2 - https://cw.sdn.sap.com/docs/DOC-40515;jsessionid=1B4BAE46AE116F95C178F7D34A259E6F
    In one of my assignments, where the setup was much like the one you mention, we did the following:
    Bi-weekly:
    Windows filesystem backup
    Whole database - offline backup
    Everyday:
    <SID> backup - online
    Apart from that, we also included plans for backing up the UME data store (it was a directory) and TREX.
    Good Luck!!!
    GLM

  • Apps Backing up Strategy

    We are currently in the process of developing a new back up strategy for Oracle Application .
    What I understand currently is
    - RMAN can be used to back up the dbf files.
    - The application tier does not need to be backed up unless there is patching or customization. The probability of this seems low in production. However, we may need to back up the concurrent manager log/output files.
    For the application tier, the strategy seems to be to copy the relevant directories to tape or a disk drive, maybe on a daily basis. This is a hot backup, and hence there might be data loss. Is this correct?
    But there are some directories which cannot be copied hot. Which are those? Is there any solution apart from going offline for the least amount of time?
    So, what is a good strategy for backing up the application tier?

    Hi,
    What I understand currently is
    - RMAN can be used to back up the dbf files.
    - The application tier does not need to be backed up unless there is patching or customization. However, we may need to back up the concurrent manager log/output files.
    Correct.
    In the application tier, the strategy seems to be to copy the relevant directories to tape or disk, maybe on a daily basis. This is a hot backup, and hence there might be data loss. Is this correct?
    There will be no data loss on the application side, as all the data is stored in the database. If you have no customization or patching on a daily basis, then you only need to back up the log files and the concurrent request log/output files.
    But there are some directories which cannot be copied hot. Which are those? Is there any solution apart from going offline for the least amount of time?
    What is the OS? If you are on Linux/Unix, you should be able to copy the files even while they are in use, but on Windows this is not possible.
    So what is a good strategy for backing up the application tier?
    I believe having a weekly offline backup should be enough. For the daily changes, consider taking a backup of the log/output files mentioned above.
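    The daily application-tier piece described here, archiving the concurrent manager log/output files, might be scripted roughly as follows. This is a sketch only; the $APPLCSF-style paths in the comments are assumptions to verify on your own install, and the database is assumed to be protected separately (e.g. via RMAN):

    ```shell
    #!/bin/sh
    # Archive one directory into a dated tar file under a backup location.
    set -eu

    backup_dir() {
        src=$1; dest=$2
        stamp=$(date +%Y%m%d)
        mkdir -p "$dest"
        # -C keeps the archive paths relative to the parent of $src
        tar -cf "$dest/$(basename "$src")_$stamp.tar" \
            -C "$(dirname "$src")" "$(basename "$src")"
    }

    # Placeholder paths; on EBS the concurrent manager log/output files
    # usually live under $APPLCSF/log and $APPLCSF/out:
    # backup_dir "$APPLCSF/log" /backup/appltier
    # backup_dir "$APPLCSF/out" /backup/appltier
    ```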
    Please search this forum for similar topics, and you will find many hits.
    Regards,
    Hussein

  • Go Live Strategy for CO/MM

    Hi,
    Please suggest the go-live strategy for CO/MM, especially from the product costing point of view.
    I have an issue: should the material prices of FG and SFG be uploaded along with the quantities?
    1. Should these prices be maintained in the material master before executing the standard cost estimate?
    2. What would happen if standard prices are maintained (based on legacy stock valuation) and the standard cost estimate is run thereafter? Would the difference be recorded as a material revaluation?
    3. Suppose go-live is on the 1st; the stock valuation of the 31st would be available only on the 3rd. In such cases I would have to execute the standard cost estimate anyway, to ensure production/despatches are not stopped on the 1st and 2nd. How would the FG/SFG value as per the initial upload then match the material master?
    Please suggest the implications as well as the best practice.
    Regards

    Hi Swapnik,
    Your understanding is correct. Please go through the following.
    There are two strategies for this:
    1. The one I explained previously: in this case, you need to ask the core team to estimate the expected price difference in SAP and make a suitable JV to neutralise the balances at the time of upload itself. Literally, once you have released the cost estimates for the products and the inventory quantity balances are available in Excel (or the legacy system), you will definitely know what the difference between the legacy inventory value and the SAP inventory value is going to be. The business will even have the option to reconsider their quantity structure and other inputs.
    This is advisable when the core team has very systematic data and a clear approach to the standard costs of the products.
    2. The second one is your strategy:
    Upload the inventory with values, then run the standard cost estimate. In this method, you should be sure how much price difference is going to hit your P&L. You can even change the sequence of steps you mentioned, as follows:
    1. Upload material (input) quantities and values.
    2. Upload the stock of FG and SFG with values - this should tally with the GL balance upload.
    3. Run and release the standard cost estimate for SFG and FG. The difference is posted to the material revaluation account.
    Whether to follow method 1 or 2, you are the best person to decide depending on your client. It is an open secret that they will do a lot of manipulation of the data so that they do not face any problems when they get into SAP. Method 1 is more suitable for such instances.
    Hope you got it; revert back for further explanation.
    Srikanth Munnaluri
    Edited by: Srikanth Munnaluri on Apr 18, 2009 12:16 AM

  • What is your strategy for form validation when using MVC pattern?

    This is more of a general discussion topic and will not necessarily have a correct answer. I'm using some of the Flex validator components in order to do form validation, but it seems I'm always coming back to the same issue, which is that in the world of Flex, validation needs to be put in the view components since in order to show error messages you need to set the source property of the validator to an instance of a view component. This again in my case seems to lead to me duplicating the code for setting up my Validators into several views. But, in terms of the MVC pattern, I always thought that data validation should happen in the model, since whether or not a piece of data is valid might be depending on business rules, which again should be stored in the model. Also, this way you'd only need to write the validation rules once for all fields that contain the same type of information in your application.
    So my question is, what strategies do you use when validating data and using an MVC framework? Do you create all the validators in the views and just duplicate the validator if the exact same rules are needed in some other view, or do you store the validators in the model and somehow reference them from the views, changing the source properties as needed? Or do you use some completely different strategy for validating forms and showing error messages to the user?

    Thanks for your answer, JoshBeall. Just to clarify, you would basically create a subclass of e.g. TextInput and add the validation rules to that? Then you'd use your subclass when you need a textinput with validation?
    Anyway, I ended up building sort of my own validation framework. Because the other issue I had with the standard validation was that it relies on inheritance instead of composition. Say I needed a TextInput to both check that it doesn't contain an empty string or just space characters, is between 4 and 100 characters long, and follows a certain pattern (e.g. allows only alphanumerical characters). With the Flex built in validators I would have to create a subclass or my own validator in order to meet all the requirements and if at some point I need another configuration (say just a length and pattern restriction) I would have to create another subclass which duplicates most of the rules, or I would have to build a lot of flags and conditional statements into that one subclass. With the framework I created I can just string together different rules using composition, and the filter classes themselves can be kept very simple since they only need to handle a single condition (check the string length for instance). E.g. below is the rule for my username:
    library["user_name"] = new EmptyStringFilter( new StringLengthFilter(4,255, new RegExpFilter(/^[a-z0-9\-@\._]+$/i) ) );
    <code>library</code> is a Dictionary that contains all my validation rules, and which resides in the model in a ValidationManager class. The framework calls a method <code>validate</code> on the stored filter references which goes through all the filters, the first filter to fail returns an error message and the validation fails:
    (library["user_name"] as IValidationFilter).validate("testuser");
    I only need to setup the rule once for each property I want to validate, regardless where in the app the validation needs to happen. The biggest plus of course that I can be sure the same rules are applied every time I need to validate e.g. a username.
    The second part of the framework basically relies on Chris Callendar's great ErrorTipManager class and a custom subclass of spark.components.Panel (in my case it seemed like the reasonable place to put the code needed, although perhaps extending Form would be even better). ErrorTipManager allows you to force open a error tooltip on a target component easily. The subclass I've created basically allows me to just extend the class whenever I need a form and pass in an array of inputs that I want to validate in the creationComplete handler:
    validatableInputs = [{source:productName, validateAs:"product_name"},
                         {source:unitWeight, validateAs:"unit_weight", dataField:"value"},
                         {source:unitsPerBox, validateAs:"units_per_box", dataField:"value"},
                         {source:producer, validateAs:"producer"}];
    The final step is to add a focusOut handler on the inputs that I want to validate if I want the validation to happen right away. The handler just calls a validateForm method, which in turn iterates through each of the inputs in the validatableInputs array, passing a reference of the input to a suitable validation rule in the model (a reference to the model has been injected into the view for this).
    Having written this down I could probably improve the View side of things a bit, remove the dependency on the Panel component and make the API easier (have the framework wire up more of the boilerplate like adding listeners etc). But for now the code does what it needs to.

  • Patch strategy for ECC6 in SAP NW2004s environment

    Hi Experts,
    Our SAP NW04s landscape consists of ECC6, BI 7.0, EP 7.0, SRM, XI, and Solution Manager systems. We are drafting a patch upgrade strategy for the ECC6 system and have two options:
    Option 1: Apply patches only in ECC6 and not in the other components.
    Option 2: Patch ECC6 together with the other components like SRM, BI, EP, etc.
    Option 2 will make things complicated, but for Option 1 the two concerns/questions below were raised:
    1. Option 1 will mean doing UAT twice for business processes which have a tight integration with ECC6; for example, UAT for data extraction needs to be done after applying patches in both the ECC6 and the BI systems. This will require arranging more user resources.
    2. Due to the tight integration between ECC6 and the other components, applying patches only in ECC6 might leave other systems like BI, EP, SRM, XI, etc. at a lower patch level, and they might not be able to connect to or work properly with ECC6.
    Can somebody share how they manage ECC6 patch upgrades in a big NW2004s environment with the other components present? Are the ECC6 patches done standalone, with other systems like EP and BI following their own patching schedules, or is it recommended to patch ECC6, BI, and EP together?
    Are there any best practice recommendations from SAP on this? Can somebody share the blueprint or any strategy document they are using?
    Thanks,
    Abhishek

    Hi Abhishek,
    For the reasons you outline in 2, it is recommended to patch entire business scenarios (i.e. business processes that are integrated across systems) together.  Rather than go into a full discussion here, I recommend you see my paper [How To Design a SAP NetWeaver - Based System Landscape |https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/50a9952d-15cc-2a10-84a9-fd9184f35366].
    Best Regards,
    Matt

  • "Best" Back Up Strategy

    According to Apple, I have a corrupt Leopard install and I need to back up my Mac and clean-install Leopard again. I had originally intended to use Time Machine for my backups, and since I also have an AirPort Extreme, I wanted to use an HD connected to it for my backups. Unfortunately, this won't work; the HD must be connected directly to the Mac.
    So, I am looking for the best backup strategy that will:
    1) Let me back up all my info to an external drive, so I will have everything I need to get my Mac back into the same shape it was in after I reinstall Leopard, i.e. everything on it, including mail, iTunes, photos, etc. I can connect directly to a drive for this part.
    2) After this, I want to back up two MacBook Pros regularly to the external drive connected to the AirPort Extreme. I would like to be able to access this backup in a native format, i.e. find a file and open it from the backup without having to restore it or any other gymnastics. I don't know if Time Machine is the best solution; I am looking for suggestions.
    Is this possible?
    Thanks for your help!

    Hi,
    1. Get the biggest external drive you can afford. You can't have too much storage space.
    2. Use Carbon Copy Cloner from here...
    http://www.bombich.com/software/ccc.html
    That will clone the entire content of your computer on to the external.
    3. Time Machine won't work in this scenario. As I understand it, wireless backups were removed from the list of goodies in Leopard at the last moment. It may become available in later releases of Leopard. It may not.
    Regards
    Ian

  • Best strategy for flash access to an image in a protected folder

    hello all,
    i'm looking for a good strategy for my swf to be able to load images in a password-protected directory. since flash can't access a file behind a protected folder, i figure i'll use php to get the image and pass it to flash.
    how best to get an image from php to flash, since i can't just send a path to the file and have flash execute a Loader() to retrieve it (the Loader won't be able to get behind the protected folder)?
    is it best to have php send the file binary to flash, which flash will receive via URLLoaderDataFormat.BINARY? is it best to, at runtime, have php move the requested file from the protected folder to an unprotected folder and, after flash loads it, delete it from the unprotected folder? or maybe it's best to leave the image folder unprotected and let an .htaccess redirect restrict outside access to the folder?
    suggestions?
    tia,
    michael

    The built-in Firefox Sync service allows you to store your settings, including bookmarks, on a Mozilla server. This isn't accessible as a file like Google Drive; it can only be used by connecting another copy of Firefox to your Sync account. You can read more about this here: [[How do I set up Firefox Sync?]] (I haven't tried it myself since it changed recently.)
    The cross-browser Xmarks service is more focused: it just does bookmarks. http://www.xmarks.com/ (I've never tried this one either)
    If you prefer working with files and you have a Dropbox or similar account, you could in theory back up your Firefox profile folder, or more particularly, the bookmarkbackups subfolder. I haven't tried that myself. To find the folder:
    Open your current Firefox settings (AKA Firefox profile) folder using either
    * "3-bar" menu button > "?" button > Troubleshooting Information
    * Help menu > Troubleshooting Information
    In the first table on the page, click the "Show Folder" button
    A window should launch showing your currently active settings files. The bookmarkbackups folder should contain dated .json files which can be used to restore your bookmarks to another copy of Firefox.
    Alternately, someone might have created an add-on for saving Firefox bookmarks to the cloud another way, but I haven't heard of one.

  • What is the backup strategy for your rpool?

    Hi all,
    May I know what your backup strategy for rpool is? For example, how do you integrate it with NetBackup or any other backup software?
    Is it to use zfs send to write a data stream to a file, and then back up that file using the backup software?
    Aside from the above method, can we also use the UFS-style method?
         backup software to back up the entire /, /var, etc., then:
         1. re install the OS
         2. install the backup client
         3. create a zpool (rpool-restore)  using a 2nd disk,
         4. mount the new zpool  (rpool-restore) to /restore
         5. restore all the file in  /restore
         6. install boot blk,
         7. boot from 2nd disk
    Any more idea?
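    The "zfs send to a file, then back up that file" approach asked about above can be sketched as below. Pool, snapshot, and path names are placeholders, and the commands are printed rather than executed so they can be reviewed before being run on a real system:

    ```shell
    #!/bin/sh
    # Dry-run sketch: snapshot rpool, stream it to a flat file, and let
    # the backup software (NetBackup, Networker, TSM, ...) pick up that
    # file like any other. Swap the echo in run() for "$@"/eval to
    # actually execute the commands.
    run() { echo "+ $*"; }

    snap="rpool@backup-$(date +%Y%m%d)"
    run zfs snapshot -r "$snap"
    run "zfs send -R $snap > /backup/rpool.zfs"
    # Restore side (matches steps 3-5 above): receive the stream into
    # the pool created on the second disk.
    run "zfs receive -F rpool-restore < /backup/rpool.zfs"
    ```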

    Hi Willy,
    According to the Flash Archive limitations stated on the Oracle website, it will not work if the OS has a child zone, right?
    http://docs.oracle.com/cd/E19253-01/821-0436/6nlg6fi8u/index.html
    I am thinking of doing it the traditional way.
    Use backup software, for example Networker, TSM, or NetBackup, to back up the following:
    /rpool
    /export/
    /export/home
         1. re install the OS
         2. install the backup client
         3. create a zpool (rpool-restore)  using a 2nd disk,
         4. mount the new zpool  (rpool-restore) to /restore
         5. restore all the file in  /restore
         6. install boot blk,
         7. boot from 2nd disk
    Will it still work?

  • Define Derivation Strategy for Control Objects - Value Not Transport

    I created rule values in Define Derivation Strategy for Control Objects (SPRO). Then I performed the steps below to transport the request to another client.
    1. Select Derivation Strategy name, Table View > Transport
    2. Select Request Number
    3. Click Include in Request, then save
    4. Go to the Steps in Logical Order, Extras > Transport
    5. Select Request Number and back to SPRO screen
    After SCC1, only the derivation strategy name appears; the rule values (Steps in Logical Order) do not.
    How can I transport the rule values?
    Regards
    Ton

    Hi,
    When you are in the derivation transaction and select to transport it, the system asks you whether you want the values to be transported as well. If you choose 'yes' your change request will include this information.
    Regards,
    Eli

  • Good backup strategy for a mailserver?

    Hi,
    What would be a good backup strategy for a mailserver and what software would you use. I've got Retrospect right now ...
    Thanks a lot.
    Jerome

    Yeah, ptero, I know. I'm using mailbfr and it seems to work quite well.
    I didn't express myself clearly in this thread; I was in fact looking for the best way to back up a whole Mac OS X Server whose primary function is being a mail server.
    I'm backing up the mailbfr backups via Retrospect on another server, and I'm using Carbon Copy Cloner from time to time to make an image of the whole server. But is that a good idea? I've got a neat program on my Windows servers which makes 'continuous' incremental images of the whole disk while the servers are in use. Is there anything like that for the Mac? Or how do you back up the system configuration of your Mac servers?
    But perhaps this is the wrong subforum to ask questions like this?
    Thanks a lot anyway!

  • Strategy for making a pop-up slideshow?

    I want to be able to tap a photo and then go to a full-page (or nearly) slideshow (and then back to the original page). I'm wondering if anyone has found an interesting strategy for this -- for a slideshow that doesn't become a page you can navigate to.
    The easy ways to do it are to have an in-between page as the slideshow, or a page at the end of the article as the slideshow. But what I really want is a slideshow function that doesn't "count" as a page. DPS doesn't let you hide things very well! Doing a single image pop-up is easy with an MSO, but I don't see anywhere to take it from there, since I can't put an HTML overlay into that MSO...
    If anyone has a slideshow idea to share, that would be appreciated.
    Thanks
    David

    How are your HTML skills?
    You could create a lightbox in Dreamweaver or even use Muse.
    Bob

  • Back Up Server - for 10g Database

    Dear Gurus,
    I have the following requirement:
    I have a production database running on a Dell server with Windows 2003. I want to create a backup server for it, and it needs to be updated daily in the evening. So I have written a batch file as follows:
    echo "Backup of 10g Database User"
    del d:\10gDB_Backup\*.dmp
    del d:\10gDB_Backup\*.log
    del s:\*.dmp
    rem %DATE% may contain "/" and spaces, which make poor file names;
    rem replace them before using it in the dump file name.
    set STAMP=%DATE:/=-%
    set STAMP=%STAMP: =_%
    exp system/oracle file=D:\10gDB_Backup\10gDB%STAMP%.dmp full=Y log=D:\10gDB_Backup\10gDB.log
    echo "Finished Backup"
    echo "Copying the 10g Database dump to the Backup Server"
    copy D:\10gDB_Backup\*.dmp s:\
    echo "Finished Copying the 10g Database to Backup Server"
    Now, from the above file you can see I am using the exp utility, taking a full backup of the database and copying it to the S: drive, which is one of the drives I have mapped over the network to my backup server. On my backup server I have installed the same Oracle 10g version as on production, with the same SID. After doing the export, can I run an import on the backup server using the system user to load all the data from the export file?
    Will it work? Also, I have to perform this exp/imp automatically every evening, so what will happen to the data which is already on the backup server: will all the users' data be overwritten, or what will happen? I would be thankful if anyone could explain and guide me in performing the above-mentioned task.
    Note: DB size is 450 MB.
    Regards
    Kiran Rana

    If you want to use an export as a supplement to a viable backup mechanism, the least you could do is use datapump. However, if this is all that you use, like damorgan says, you are a disaster waiting to happen.
    Physical backups are backups of the physical files used in storing and recovering your database, such as datafiles, control files, and archived redo logs. Ultimately, every physical backup is a copy of files storing database information to some other location, whether on disk or some offline storage such as tape.
    Logical backups contain logical data (for example, tables or stored procedures) exported from a database with an Oracle export utility and stored in a binary file, for later re-importing into a database using the corresponding Oracle import utility.
    Physical backups are the foundation of any sound backup and recovery strategy. Logical backups are a useful supplement to physical backups in many circumstances but are not sufficient protection against data loss without physical backups.
    Unless otherwise specified, the term "backup" as used in the backup and recovery documentation refers to physical backups, and to back up part or all of your database is to take some kind of physical backup. The focus in the backup and recovery documentation set is almost exclusively on physical backups.
    http://download.oracle.com/docs/cd/B19306_01/backup.102/b14192/intro001.htm#sthref10
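    Since the reply recommends Data Pump over the legacy exp used in the question's batch file, here is a dry-run sketch showing the equivalent expdp call. Credentials, the directory object, and file names are placeholders, and either tool produces only a logical backup, not a substitute for physical (RMAN) backups:

    ```shell
    #!/bin/sh
    # Print the legacy export from the question next to its Data Pump
    # equivalent, without executing either. Swap the echo in run() for
    # "$@" to actually run the commands.
    run() { echo "+ $*"; }

    # Legacy client-side export (what the batch file runs):
    run exp system/oracle file=full.dmp full=y log=exp.log
    # Data Pump equivalent: server-side, restartable, parallel-capable
    run expdp system/oracle directory=DATA_PUMP_DIR \
        dumpfile=full_%U.dmp logfile=expdp_full.log full=y parallel=2
    ```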

  • Need more advice on overall back up strategy

    Based on my previous post, "Need advice on setting up a portable external hard drive", I set up a portable backup strategy for the iBook G4 consisting of a 250 GB hard drive in a firewire enclosure which contains two bootable clones. In a similar fashion, I set up a 320 GB drive in a firewire enclosure which also has two bootable clones and also an empty partition for the MBP.
    Also, I have a 160 GB La Cie firewire desktop drive that was purchased in November of 2006 which has Time Machine and some other backups on it, some of which are unique.
    Since I had to dig pretty deep into the piggy bank to buy the drives and enclosures, I really don't want to buy any more hardware for a while. I also didn't expect I would have to worry about archiving anything for a while since I thought I had lots of capacity, so I hadn't intended to post this question for a while. However, circumstances have changed rather suddenly.
    After getting the logic board reballed and replacing the original 30 GB hard drive with the 120 GB drive, I decided to give the iBook to my rather cyberphobic Significant Other, along with a red iPod, a DynoScope, and the 250 GB backup drive as a first computer, little realizing what was about to happen. Even after 27 years of marriage, someone can still surprise you.
    My SO started digitizing our music collection, and once all the CD's were put into iTunes, started in on the vinyl collection. The iPod is already full, and both cyberphobia and hard drive space are rapidly evaporating. The vinyl files seem to be much bigger than the CD files. The 120 GB internal drive and 250 GB external drive that seemed so huge is now beginning to look too small. The overall strategy is still fine, but I'm about 60 GB's away from having to archive some of the music collection to free up more hard drive space both on the internal drive and on the backup clones. On general principles, I would rather do the archive on a third drive, and this drive does not need to be portable.
    I do have quite a bit of excess capacity on my MBP and MBP backup drive, and perhaps this could be utilized in the short term. I could use the La Cie, but the problem is that an archive would likely use up most of the free space remaining.
    So what I probably need advice on is what to plan on setting up that will accommodate both computers. The iBook is running Tiger and has iLife 06 on it. The MBP has Leopard and is running iLife 08. I already know that if a photo is in iPhoto 08 it cannot be put into iPhoto 06. I don't know how it works with iTunes--both computers seem to have the same version of iTunes--v. 7.6.1. So I don't know if the iBook needs to have a separate drive or if it could use the same drive as the MBP.
    Also, I am wondering if there is any way of finding out about impending failure of an external hard drive since S.M.A.R.T. status is not supported. I can verify the disk using Disk Utility, but was wondering if there was anything else to do other than pay attention to how old it is and how noisy it is. I imagine it is still true that over time, all hard drives will fail.
    Would setting up a RAID be something to consider? Does that give some protection against the failure of individual hard drives? (I don't know much about RAIDS.)
    I apologize for the length of this post. I wanted to try and include all the relevant information about my situation. There seem to be so many possible options that I am in a real quandary about just what to do, both for the short term and for the longer term. But I would like to have a sensible long term plan worked out in advance, so that when the time does come to buy more hardware, I will have a clear idea of what to get and how best to set it up. Any advice and help will be greatly appreciated.
    Thanks in advance!

    Thanks for responding!
    You are right about the AIFF format. I will have to burn the DVD's on my MBP since the iBook can only burn CD's, but I don't see any problem about doing this except for a question about DVD's.
    I have read several times about CD's degrading over time so that the data is lost and the CD's just turn into little gold frisbees. I think this is mostly true of the super cheap store brands, and I don't know if it's also true of DVD's.
    So I am wondering about the projected life span of whatever DVD's I choose to burn onto, and also, how best to store them. I have a stack of Verbatim DVD's which are supposed to be a good brand, and I have a much smaller stack of Archival Gold (by Delkin) which are supposed to be good for 100 years. 100 years is probably in excess of what we really need, but I would like to think in 10 or 20 or 30 years, the DVD's would still be good. If not, then I just need to know how often to reburn them.
    I tend to worry about the longevity of digital media. I had burned the iPhoto library onto inexpensive CD's, and used these to put it on the iBook. On every disc, there were unreadable files. The iBook can't read one of our purchased CD's, even though we have played it many times elsewhere. We'll try it on the MBP and another external CD drive that we have to see if we can digitize it.
    I think the DVD's for the large vinyl files are a great idea and I will plan to start doing this at some point soon. I will also burn either CD's or a DVD of anything which is an only copy. Right now there aren't many of those, but of course that could change and probably will.
    I can see we need to sort a few things out, but I think the road ahead is much clearer now. In addition to the original CDs and DVDs that I plan to burn, I will probably still keep redundant copies of the entire iTunes library on a couple of different hard drives, since I have a lot of capacity open on the MBP. The music itself will be well backed up with discs, but since it has taken so many hours to put the music into the library, and will take many hours more, I also want to be sure to back up those many hours of effort.
    Thank you very much for all your help. I want to keep this topic open a bit longer in case my SO has any further questions, but I think my concerns are pretty well answered.
    Many thanks again!

  • Start up disk is full, and need new storage and back up strategy

    I am maxed out on both my iMac and MacBook Pro and recognise I need to introduce a new external hard drive into the equation.
    I currently have an Apple Time Capsule with a 2 TB drive, which backs up the 600 GB iMac and the 250 GB MacBook.
    I would like to understand whether I could point applications such as iTunes, iPhoto and iMovie towards an external drive (as long as it is always connected to the machine, or perhaps to the network so all machines have access to it), and hopefully I would also be able to access my iTunes library (on the external drive) via an Apple TV without any fuss.
    I would also like to use my Time Capsule to back up both the iMac and MacBook internal drives, as well as the data on the external drives that belongs to the respective computers, e.g. the iTunes library etc.
    Is this at all possible?
    If so, which external hard drive solution should I use (I have seen some mention Synology, but I looked at the website and there are so many models that I am not sure what I would need)?
    And how would I go about lifting and shifting the data, redirecting applications to read and write to the external drives, and getting the Time Capsule to back up as above, and possibly even having an additional backup strategy that allows me to take regular copies of all hard drives and keep the content in an off-site location for data security, possibly in a bootable format?
    All help and advice welcome, please.

    You should never, EVER let a computer hard drive get completely full, EVER!
    With Macs and OS X, you shouldn't let the hard drive get below about 15 GB of free space.
    If it does, it's time for some hard drive housecleaning.
    Follow some of my tips for cleaning out, deleting and archiving data from your Mac's internal hard drive.
    Have you emptied your iMac's Trash icon in the Dock?
    If you use iPhoto, iPhoto has its own trash that needs to be emptied, also.
    If you store images in other locations other than iPhoto, then you will have to weed through these to determine what to archive and what to delete.
    If you use Apple Mail app, Apple Mail also has its own trash area that needs to be emptied, too!
    Delete any old or no longer needed emails and/or archive to disc, flash drives or external hard drive, older emails you want to save.
    Other things you can do to gain space.
    Once you have around 15 GBs regained, do a search, download and install OmniDisk Sweeper.
    This app will help you locate files that you can move/archive and/or delete from your system.
    STAY AWAY FROM DELETING ANY FILES FROM OS X SYSTEM FOLDER!
    Look through your Documents folder and delete any type of old useless type files like "Read Me" type files.
    Again, archive to disc, flash drives, ext. hard drives or delete any old documents you no longer use or immediately need.
    Look in your Applications folder, if you have applications you haven't used in a long time, if the app doesn't have a dedicated uninstaller, then you can simply drag it into the OS X Trash icon. IF the application has an uninstaller app, then use it to completely delete the app from your Mac.
    Download an app called OnyX for your version of OS X.
    When you install and launch it, let it do its initial automatic tests, then go to the cleaning and maintenance tabs and run the maintenance tabs that let OnyX clean out all web browser cache files, web browser histories, system cache files, delete old error log files.
    Typically, the iTunes and iPhoto libraries are the biggest users of HD space.
    Move these files/data off of your internal drive to the external hard drive, then delete them from the internal hard drive.
    If you have any other large folders of personal data or projects, these should also be archived or moved to optical discs, flash drives or the external hard drive, and then deleted off your internal hard drive.
    Good Luck!
