Advice on Synchronization Strategy

Hi,
I'm trying to devise a plan for synchronizing 2 databases. The problem is that one database is located on a server in one building, and the other database is on a different network in a classified building -- and the computers cannot make connections to anything outside the building or the network.
There are 3 schemas which need to be synchronized between the databases so that all of the table data is up-to-date in both databases.
I thought about exporting the schemas, putting them on CD, and having somebody in the classified building drop the tables in the schemas and import the new data. But the tables will get extremely large, so this won't be efficient.
Are there any other ways of synchronizing databases without having to link them in some way? Can I do something with the redo logs, maybe? Do any tools exist that will generate some sort of script that I can put on a CD for somebody in the other building?
Thanks in advance,
Nora

First, find out the total file size for the 3 schemas by exporting them into one file.
Based on that size, decide how many files to create given the OS maximum file size limit (and make sure each individual file can be copied onto a CD).
From Oracle8i onward, the export utility supports multiple output files. This feature enables large exports to be divided into files whose sizes will not exceed any operating system limits (the FILESIZE= parameter). When importing from a multi-file export, you must provide the same filenames in the same sequence in the FILE= parameter. Look at this example:
exp userid/password FILE=D:\F1.dmp,E:\F2.dmp FILESIZE=10m LOG=scott.log
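For the import on the other side, the same files must be listed in the same order. A minimal sketch of the matching import (FULL=Y and the log name are illustrative; your FROMUSER/TOUSER options may differ):
imp userid/password FILE=D:\F1.dmp,E:\F2.dmp FILESIZE=10m FULL=Y LOG=scott_imp.log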
hope this helps

Similar Messages

  • How can I create an EntityStore for a Database?  Advice on DB strategy?

    Hello All,
    How can I create a com.sleepycat.persist.EntityStore for a com.sleepycat.db.Database? Advice on DB strategy?
    I'm looking to create an application prototype that creates a database dynamically and can delete it programmatically. Is openDatabase/removeDatabase what I'd want to use for that?
    So far, the Direct Persistence Layer is an amazing piece of technology and has the potential to make my team's application faster than we ever thought possible, so kudos to the author.
    I've built a high-security application using the DPL and a single database for hundreds of millions of users. I'd like to attempt to create a database for each user for security and manageability purposes. Our mission statement doesn't allow us to execute queries for sensitive data across more than one user account at a time, so I don't get any benefit from having every user's objects in a single database. I'm investigating whether giving each user their own database will speed up insert times and SecondaryIndex queries. It'll certainly be more secure, as each database will be encrypted with a unique password.
    My design is that I have an app with dozens of com.sleepycat.persist.model.Entity beans which I persist and query on their Primary and secondary indexes. I've been accessing things through an EntityStore as illustrated in the Getting Started Guide. Can I access a database created with openDatabase through an EntityStore and com.sleepycat.persist.PrimaryIndex?
    Thanks in Advance,
    Steven
    Harvard Children's Hospital Informatics Program

    I closed this as I found another way to solve my issue.
    Thanks,
    Steven

  • HR data synchronization strategy to CRM

    Dear all experts,
    I hope this question has never been asked before. We are already synchronizing our HR master data (org structure, positions) from the HR client to the CRM client using ALE.
    The synchronization has been running for 14 months already.
    I'd like to know: what is the best practice for running synchronization of HR data to CRM?
    Currently we synchronize our HR data to CRM daily, using reporting period ALL in PFAL.
    But the consequence is that a very large number of IDocs are created. We want to reduce that.
    Why do we do it that way? Because our HR data can change dynamically and daily: updates can be back-dated and hires can occur on any date. If we synchronized less often than daily (weekly, for example), a salesperson might not yet exist in the CRM client when needed.
    Why do we use reporting period ALL? Because if we do not use that selection (using today or this month instead), the links between objects might be broken or inconsistent, since not all links are sent, only the links/objects in the restricted period.
    Any suggestions or experience from other clients regarding synchronization strategy? Is our selection correct?
    Don't worry about points.

    Hi Kittu,
    Please surf through the link below:
    https://www.sdn.sap.com/irj/sdn/go/portal/prtroot/docs/library/uuid/84dd0c97-0901-0010-4ab2-9849fba57e31
    Best Regards,
    Pratik Patel
    Reward with Points!

  • Need Advice on Backup strategy

    Hi,
    Database version: 10.2.0
    Database size: 60 GB
    Daily updates: approximately 2 GB
    I need some advice on my backup strategy.
    My backup strategy:
    Date 1 (monthly): full backup (incremental level 0).
    Dates 2 to 6: daily incremental level 2 backups.
    Date 7: weekly incremental level 1 backup.
    Dates 8 to 13: daily incremental level 2 backups.
    Date 14: weekly incremental level 1 backup.
    Dates 15 to 21: daily incremental level 2 backups.
    Date 22: weekly incremental level 1 backup.
    Dates 23 to 28: daily incremental level 2 backups.
    Date 29: weekly incremental level 1 backup.
    My constraint is that I have only a 70 GB disk to store the backups, which is only enough for the full backup. So what I am doing is: I keep the full backup on my disk, transfer the incremental level 1 (weekly) backup to tape, and delete the daily level 2 backups after taking that weekly backup.
    Is my strategy correct, or should I do anything differently? Please give me some advice.
    TIA,

    This backup strategy doesn't look bad. However, the 70 GB of space will be a pain in the neck; I suggest you increase storage. Have you considered the archived logs too? The average 2 GB daily transaction rate will increase storage requirements sooner or later, and with them the backup storage requirements.
    ~ Madrid
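    A side note on the levels: from 10g onward, RMAN documents only incremental levels 0 and 1, so the daily "level 2" backups above effectively behave as level 1 differentials. A minimal RMAN sketch of the same scheme under that assumption (run from the RMAN prompt; the scheduling itself is left to your job system):
    BACKUP INCREMENTAL LEVEL 0 DATABASE PLUS ARCHIVELOG;   # monthly, date 1
    BACKUP INCREMENTAL LEVEL 1 CUMULATIVE DATABASE;        # weekly: all changes since the level 0
    BACKUP INCREMENTAL LEVEL 1 DATABASE;                   # daily: changes since the last level 1 or 0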

  • Advice on backup strategy.

    I have a small network of less than 10 Macs, all on ethernet connected to a centrally located switch. We have a Snow Leopard Server for storing and sharing about 300 GB of documents among the users, and I currently have it set up to let all the Macs back up to an external drive connected to it and enabled for Time Machine. We have a fiberoptic line connecting our building to another building about 500 m away. Initially, this line was used to access a router, but now we are going to move the router into our building. However, the fiber line will still be available, and we would like to use it to send an offsite backup over. What should we do?
    We could purchase a Time Capsule and put it in the other building, but would we then disable Time Machine on the server and have all the clients AND the server back up over the fiber line to the Time Capsule? Can I back up the Time Machine backups to the Time Capsule? What I mean is: leave all the clients backing up to the server and then back up the server to a remote Time Capsule? Would that even work? If the main building burned down, could I restore the Time Machine backups to a drive and then from there restore them to a client?
    Any advice will be appreciated.

    Larry Jorgenson wrote:
    We could purchase a Time Capsule and put it in the other building, but would we then disable Time Machine on the server and have all the clients AND the server back up over the fiber line to the Time Capsule?
    Yes, that's an option (probably the best).
    Can I back up the Time Machine backups to the Time Capsule? What I mean is: leave all the clients backing up to the server and then back up the server to a remote Time Capsule?
    I'm not familiar enough with the Server product, but I don't think so.  The client backups will be on separate sparse bundle disk images, and the client version of OSX won't back up a disk image that's mounted.  Plus, the client version of OSX won't back up Time Machine backups -- they're excluded automatically.
    I've never been able to find anything definitive in the Server manual. Now I see the large Snow Leopard Server manual seems to have disappeared, so I don't even know where to tell you to look. And Lion Server may be different, anyway.
    You might want to try the Server forums, at: https://discussions.apple.com/community/servers_enterprise_software

  • Process synchronization strategy

    Trying not to reinvent the wheel...
    I have started redesigning an old data acquisition/processing application using as much parallelism as possible.
    UI/DAQ/Data Saving/Data Processing/Data Display are all handled in separate loops or VIs which communicate and exchange data via queues or notifiers. This works fine so far.
    Then I realized that the way I provided the data processed by Data Processing (DP) to Display (D) tasks was not going to work in all circumstances.
    I use a notifier, so that several D tasks can access the data and such that there is not much memory cost (no queue) if the D tasks can't keep up. If data is generated faster than the D tasks can get it, then they miss some but get a chance to catch up later or at least get some data every now and then. That was what I had in mind originally.
    Then I realized that I may want to display all processed data (for instance offline). This architecture won't work. If I were to keep the data notifier approach, I would need a way to tell the DP task to send data to the notifier only when all D tasks are done with their previous chunk of data.
    What is the best way to achieve that? I need to be able to change the number of clients (D tasks) dynamically, so I thought Rendezvous might work. But then, looking at the example "Wait for All Notifications Demo.vi", it seemed that this might be a more natural approach, though a bit tedious to handle since the notifiers would be created (and destroyed) dynamically in separate VIs...
    The point is that I want DP to know how many clients it is supposed to wait for (at all times) before sending a new notification (with new data).
    Thanks for your feedback.
    X.

    Hi Daklu,
    My original idea was to deal with either a case where all clients are D's or with a case where all are D*'s, but upon reading your answer, I am thinking of including that third case where both types can exist... But that would be for a "live mode" situation, not an offline one, which is the one I was discussing (in other words, a situation where all clients are D*'s).
    As far as I can see, the solutions you offer don't solve the problem I foresee in the offline mode (where I "replay" saved data). If data processing is much faster than data display (and again, "display" is a simplification, there will be some post-processing involved), then both solutions will create massive queues (or queues of events?) and that's precisely what I want to avoid. I'd like to have DP know when all its clients are ready for more and only then "output" data (or process some more if need be).
    I can think of "classic" ways of letting the clients register with DP, send messages when they are done or when they quit, etc., but I can't really find a way to avoid having DP stuck in a polling loop, waiting for these communications to be over before going for a new round of data. Maybe the overhead of a polling loop is not that much of a deal in offline mode, after all? I just wanted to make sure I am not missing a neat LabVIEW feature I don't know about before resorting to an old-fashioned approach from my limited set of skills.
    In particular, I think it would be nice if the notifier could report how many registered clients have received the last notification: either as a "Get #Clients Having Received Last Notification" function (which would still require polling to make sure all clients have received it, so not really what I am looking for), or better, as a "Wait for All Registered Clients to Receive Last Notification" function (with timeout).
    Thanks for reading.
    X.

  • OWB 10gR2 - Implementation Strategy

    I am trying to find the best strategy to install and configure an OWB environment for the following scenario, on one Linux server machine:
    There are three Oracle databases on this machine:
    database 1 = OWB repository owned by an owb_owner schema. The repository will contain all of the design metadata (a source Oracle module with source tables, a target "Oracle module A" with tables, sequences, and mappings, and a target "Oracle module B" with tables, sequences, and mappings). The ETL process is set up to transform data in two stages: (1) source to target A, (2) target A to target B.
    database 2 = will have the target schema Target_A. The contents of "Oracle Module A" from the OWB repository in database 1 need to be deployed to this "target_A" schema.
    database 3 = will have the target schema Target_B. The contents of "Oracle Module B" from the OWB repository in database 1 need to be deployed to this "target_B" schema.
    Do I need to have an OWB repository installed in database 2 and database 3, and have the Control Center Service running in each of them, to facilitate execution of the ETL process?
    Deployment of the objects will be launched from a Windows workstation via the client Design Center / Control Center Manager. Do I need to have the Control Center Service running on the client machine to facilitate deployment? If so, would it facilitate deployment of the objects to the above two target schemas?
    The intent is to have a process flow in the OWB metadata repository of database 1 streamline the complete ETL process, tied to a location owned by the OWF_MGR schema in database 1.
    Please advise which implementation strategy will work best for this scenario. I read the strategy presented in http://download.oracle.com/docs/cd/B31080_01/doc/install.102/b28224.pdf, but I need more clarity on the available options.

    Hi.
    You will need a unified repository in each database. The unified repository in OWB 10gR2 contains both the objects of the design repository (your mappings, tables, etc.) and the control center repository (runtime and deployment information, such as execution times of mappings). In previous versions of OWB they were separate.
    Database 1: Install the unified repository and just use it as your design repository. You will use this when you log in via Designer.
    Database 2: Install unified repository together with Control Center Service and use as control center repository.
    Database 3: Install unified repository together with Control Center Service and use as control center repository.
    While it is possible to have the Control Center Service just run on a client machine, I ran into all sorts of problems in the past with this configuration; e.g., when you log off your client machine, it shuts down the Control Center Service. From my experience this setup is not very stable.
    In OWB Designer you can then set up locations and configurations to deploy your mappings to database 2 or database 3.
    You will need owf_mgr on both database 2 and database 3, and you need to deploy your process flows to both databases. What you can't do is run a single process flow from, say, database 1 or database 2 across databases 2 and 3.
    Can you explain in some more detail why you have a requirement to transform data across two target servers?
    Regards
    Uli

  • How to stop hard disk noise on T61

    Hello everyone,
    I have a T61 and everything works well except the hard disk, which is constantly working. The result is a gentle noise at the bottom right of the laptop, always present. I used to have a T42 which, when configured on XP to turn off the hard disk (through power management), turned it off completely. I could then read or write long open documents on a completely silent laptop (I used tpfancontrol for controlling the fan, but that is not the problem here) until I saved the document or explicitly triggered the HDD (a completely silent laptop is a delight!).
    The noise of the present T61 is bearable, a kind of whisper, but once it stops it gets really better for your ears. My problem is that the power manager (set to turn off the disk after 3 minutes) works, but something is constantly accessing the hard disk, which prevents the power manager from turning the HDD off. So sometimes the HDD stops after 3 minutes, and a short time later, say 15 seconds, it turns on again because "something" used the HDD.
    The worst part is that, most of the time, the HDD is triggered before the 3 minutes are up, so it keeps running constantly!
    I can see this happening because the HDD green light flashes.
    This is always the case, with or without internet, AC or DC.
    I tried to stop some services (including indexing), but I stopped looking because of the number of services to check!
    How do I find out which service/program is writing to the HDD at the specific time I notice it, or keeps it working, so that I can possibly disable it?
    This is maybe an XP SP3 issue, I think, but I believe many here have noticed this constant activity on Lenovo's T61?
    Any advice on a possible strategy to find out who writes what on the HDD, or another solution?
    I really want to fix this problem because the T61 as it is now is really good (with only minor problems),
    and I think many of us would benefit from a solution.
    xtex
    T61, UZ2AQGE 2.5 Ghz, XP Pro SP3
    T400s, NSF20UK, 2.4 GHz, SSD 128 GB, Vista B

    I would say it's nearly impossible on newer systems to keep the hard drive off for more than a few seconds.
    Many users have tried, but none had success.
    Another, slightly dirty, possibility is to take a piece of rubber or something similar and put it under the palm rest where the HDD is.
    But beware: there is a little hole in the HDD that has to remain open to get some air.
    This dirty trick will lower the permanent sound of your HDD.
    Greets.

  • See the user's hostname when doing transactions at any given time.

    Hi SAP Experts,
    I want to see the hostname the user was working from when executing a transaction on a particular date and month. Currently the Security Audit is not active. Can I see this? I can see some data in transaction ST03N (going back 1 month), but I can't see the hostname detail.
    Do I need to enable the Security Audit (SM19) if I want to track user activity? Is it necessary for security reasons?
    Please advise on a strategy for scheduling the Security Audit, because if it is activated, storage will fill up quickly.
    Thanks,
    Marjan

    Hi Marjan,
    Without enabling the Security Audit, you can't find the required details like hostname/terminal, user, tcode/report... Use SM19 to enable the audit functionality and check the information in SM20.
    Yes, storage will definitely fill up if you activate this functionality, but it will help you a lot.
    If you are using a production server, you should enable it for security and audit purposes.
    You can activate it for development, quality, and test systems if required; otherwise there is no need.
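    A hedged aside on the storage concern: the audit log is governed by the rsau/* profile parameters, so its disk usage can be capped. A sketch (values are illustrative and release-dependent; verify them for your kernel):
    rsau/enable = 1                      # activate the security audit log at startup
    rsau/selection_slots = 3             # number of filter slots available in SM19
    rsau/max_diskspace/local = 300M      # cap on the size of the local audit file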
    Ask me if you have any queries..
    Regards
    Santosh K

  • Migrating System Center Databases from cluster to Standalone

    Hi
    We have been advised to migrate all our System Center SQL DBs (mainly ConfigMgr, OpsMgr, ServiceMgr, and Scorch), which currently run on mostly 4-node SQL clusters (shared with various other business services, which impacts System Center performance, as memory and CPU constantly hit above 90%), to a standalone SQL environment.
    OS - Windows Server 2008 R2 / Windows Server 2012
    SQL - SQL Server 2008 R2 / SQL Server 2012
    Can someone advise on the steps or a strategy to execute this?
    regards
    Guru 
    Gururaj Pai

    Thanks Uri.
    For the SCCM and SCOM SQL instances, the SQL version is SQL Server 2008 R2.
    We have a 4-node Windows cluster with 3 SQL clustered instances, configured with the capacity below:
    VM Node | OS Version | Disk | Memory | CPU | SQL Cluster Instances
    Node1 | Windows Server 2008 R2 SP1 | 100 GB | 32 GB | 4-core Intel Xeon X5650 2.67 GHz | SCCM (CAS DB), SCOM (OpsMgr DB), Solarwinds (Network Monitoring DB)
    Node2 | Windows Server 2008 R2 SP1 | 100 GB | 32 GB | 4-core Intel Xeon E5-2680 2.7 GHz | -
    Node3 | Windows Server 2008 R2 SP1 | 100 GB | 36 GB | 4-core Intel Xeon X5650 2.67 GHz | -
    Node4 | Windows Server 2008 R2 SP1 | 100 GB | 32 GB | 4-core Intel Xeon E5-2680 2.7 GHz | -
    We have one more 4-node Windows cluster where 2 other SQL instances for SCCM are running, with the capacity below:
    VM Node | OS Version | Disk | Memory | CPU | SQL Cluster Instances
    Node1 | Windows Server 2008 R2 SP1 | 150 GB | 16 GB | 4-core Intel Xeon X5650 2.67 GHz | SCCM1 (PrimarySite DB1), SCCM2 (PrimarySite DB2)
    Node2 | Windows Server 2008 R2 SP1 | 150 GB | 16 GB | 4-core Intel Xeon X5650 2.67 GHz | -
    Node3 | Windows Server 2008 R2 SP1 | 150 GB | 16 GB | 4-core Intel Xeon X5650 2.67 GHz | -
    Node4 | Windows Server 2008 R2 SP1 | 150 GB | 16 GB | 4-core Intel Xeon E5-2660 2.2 GHz | -
    Is this the right SQL architecture for System Center?
    How do you optimize this? I ran both queries and they returned zero values.
    The DR plan is to implement a backup strategy with SCDPM and keep backup copies in the Azure cloud. In addition, we will follow the DR plans for each System Center product.
    regards
    Guru
    Gururaj Pai
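    For the move of each database itself, the usual pattern is a COPY_ONLY backup on the clustered instance followed by a restore WITH MOVE on the standalone server; a minimal sketch (instance names, database name, logical file names, and paths are placeholders, not values from this thread):
    sqlcmd -S OLDCLUSTER\SCCM -Q "BACKUP DATABASE CM_P01 TO DISK = '\\fileshare\mig\CM_P01.bak' WITH COPY_ONLY"
    sqlcmd -S NEWSQL01 -Q "RESTORE DATABASE CM_P01 FROM DISK = '\\fileshare\mig\CM_P01.bak' WITH MOVE 'CM_P01' TO 'D:\Data\CM_P01.mdf', MOVE 'CM_P01_log' TO 'E:\Logs\CM_P01_log.ldf'"
    Each System Center product then has its own supported procedure for repointing the site/management servers to the new SQL host.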

  • MPEG2 Muxed - No Audio

    I just bought a Sony Handycam DCR-SR80 hard-drive-based camera. The file format it records in is muxed MPEG-2. Everything is fine when you record and play it back in every app besides QuickTime, iMovie, and iTunes. Obviously everything is centered around QuickTime. My question is: after laying down 800 dollars for this camera, will QuickTime and all my other apps support it without me having to use other apps to convert the files? Is there a way I can get iMovie alone to work? Thanks

    Question: since hard drive based camcorders seem to be the future, anyone have any insight as to whether and when Apple will upgrade its apps (QT, iMovie, Final Cut Express, etc.) to integrate the new technology? Is there any way to import audio with my MPEG2 files into iMovie/iDVD or Final Cut Express?
    This is probably a "not gonna happen" type of thing. Basically, there are two forces working against it.
    The first is the fact that the MPEG-2 format is, to be frank, somewhat old and not worth the time and trouble when newer, more efficient formats like MPEG-4/AVC (H.264) are available for handling HD content at MPEG-2 data rates.
    The second is the difference in technology involved. Basically, audio and video content can be synchronized either in terms of time or space. QT employs "temporal" relationships for its "frame-to-frame" synchronization strategy, while both "muxed" and interleaved files employ "spatial" relationships. For instance, MPEG-1/MPEG-2 audio/video data is stored in alternating "blocks" of audio and video data in a single stream or track. As such, it remains synchronized because one form of data cannot physically "overrun" or "outrun" the other. AVI files, while assigning audio and video data to separate streams, physically interleave them in order to maintain synchronization. (Think of this as two separate tracks with "meshed cogs" which force the relative speed of one track to remain constant in comparison to the other.) Lastly, you have QT synchronization, which is based on an arbitrary unit of time used as a "frame of reference" for synchronization.
    This approach offers both advantages and disadvantages. For instance, QT technology does not require that every audio or video frame of data be played. This means the QT structure can accommodate a wider range of platform CPU power with older, less CPU-intensive formats. (I.e., an older platform simply drops more and more frames during playback until the structure finally decides it can no longer keep up.) On the other hand, any reliance on a time reference means the playback software is very sensitive to the loss or absence of timing reference data. (Ever wonder about those "timecode breaks" and why part of an MPEG-2 file appears to be missing, or why QT won't load a file when it can't find a proper file termination?)
    I guess you might simply say that Apple is betting that time-reference-based file formats will dominate the future of multimedia, and they are less than willing to support what they consider outmoded and/or hybrid technologies. (Consider how long it took for public opinion to get Apple to even support "muxed" MPEG-2 video with an add-on component.)
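    For what it's worth, until the Apple apps handle muxed MPEG-2 directly, converting outside QuickTime is the usual workaround. A hedged illustration using the free ffmpeg tool (not mentioned in this thread; the filenames are placeholders), re-encoding the muxed file to DV, which iMovie imports with audio intact:
    ffmpeg -i clip.mpg -target ntsc-dv clip.dv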

  • Seeking strategy advice after acquiring the OCA

    Dear all,
    I'm seeking advice from anyone willing to share a strategy on how to get a job after the OCA is acquired. The situation is this: I was in the networking industry with 4 years of experience, but that was cut short because of the changing trend in the economy, and I decided to focus my career on Oracle-related jobs. I just completed my first 1Z0-051 exam and am preparing for my next exam to get the OCA.
    My concern is that I'm not that young (I'm 36) and don't have any experience in the database world whatsoever, even though I have a background in computer and electrical engineering. What is the best strategy to start my career in the industry as soon as the certification is acquired?
    best regards,
    Val

    jgarry wrote:
    As one who has seen network admins thrown into DBA work and fail miserably, I'd say your best option might be a smaller company that requires a spread of skills among a few people. That way you can impress them with your previous skills while developing your current ones. My impression is there are a lot of companies that have a few people in IT and those inevitably don't have a DBA but need the work done, and may not even know it. The problem with DBA work from a business perspective is that it is overhead. The problem from a skills perspective is that it requires quite a different view of solving problems (as well as being able to hold multiple views of a problem), and the real issues vary quite a bit from what is taught in the OCA. So if you are good at switching hats at a moment's notice, you might make the transition successfully. Some of the best DBAs I've run across came from math or EE backgrounds, and I think that is because they are taught to transform complex problems into solvable problems. Certification (especially network certs) emphasizes memorizing tasks, which is simply the wrong approach for solving complex problems. And that's why the OCA has such a bad rap. Junior operational DBAs usually have boring repetitive tasks, but there is a lot more to DBA work than that.
    How to find such companies? That's a toughie. But keep at it; dealing with massive rejection is a marketing skill, which also helps in DBA work (in which, by its nature, most success is invisible), and with a crashed-but-improving economy full of companies that have lost good people, eventually you'll get something. If you are a mid-30's female, you have an advantage, believe it or don't (can't tell from Valerie, of course). I personally dislike LinkedIn, but it sure seems to be on a roll.
    Would you be kind enough to publish the names of such people who worked in marketing before and then worked as a DBA with only an OCA? My dear friend, a word without evidence has no value. These days I find that OCA-only candidates are not recruited by reputable companies; I have even found DBAs ineligible for BPO work, as they are not certified for BPO. I assure the OP that vendors will ship some job links after completion of your OCA along with your success kits; a few days back I got such a call. A DBA must know about everything: a 500 internal server error, a network connection failure, a database that may be down, performance issues on the storage side too; PL/SQL is related to SQL tuning, which is a major part of performance tuning, and disk array concepts are required as well. So my opinion is very simple: first grab a job, then learn day by day; certification is useless unless it is supported by practical experience. And always think 100 times before switching your domain. Even after gaining experience, you may find that certification has become even more useless. As a DBA, the alert log will be your friend, and Google and the Oracle docs will be your teachers.
    kind regards

  • Need more advice on overall back up strategy

    Based on my previous post, "Need advice on setting up a portable external hard drive", I set up a portable backup strategy for the iBook G4 consisting of a 250 GB hard drive in a firewire enclosure which contains two bootable clones. In a similar fashion, I set up a 320 GB drive in a firewire enclosure which also has two bootable clones and also an empty partition for the MBP.
    Also, I have a 160 GB La Cie firewire desktop drive that was purchased in November of 2006 which has Time Machine and some other backups on it, some of which are unique.
    Since I had to dig pretty deep into the piggy bank to buy the drives and enclosures, I really don't want to buy any more hardware for a while. I also didn't expect I would have to worry about archiving anything for a while since I thought I had lots of capacity, so I hadn't intended to post this question for a while. However, circumstances have changed rather suddenly.
    After getting the logic board reballed and replacing the original 30 GB hard drive with the 120 GB drive, I decided to give the iBook to my rather cyberphobic Significant Other, along with a red iPod, a DynoScope, and the 250 GB backup drive as a first computer, little realizing what was about to happen. Even after 27 years of marriage, someone can still surprise you.
    My SO started digitizing our music collection, and once all the CD's were put into iTunes, started in on the vinyl collection. The iPod is already full, and both the cyberphobia and the hard drive space are rapidly evaporating. The vinyl files seem to be much bigger than the CD files. The 120 GB internal drive and 250 GB external drive that seemed so huge are now beginning to look too small. The overall strategy is still fine, but I'm about 60 GB away from having to archive some of the music collection to free up hard drive space, both on the internal drive and on the backup clones. On general principles, I would rather do the archive on a third drive, and this drive does not need to be portable.
    I do have quite a bit of excess capacity on my MBP and MBP backup drive, and perhaps this could be utilized in the short term. I could use the La Cie, but the problem is that an archive would likely use up most of the free space remaining.
    So what I probably need advice on is what to plan on setting up that will accommodate both computers. The iBook is running Tiger and has iLife 06 on it. The MBP has Leopard and is running iLife 08. I already know that if a photo is in iPhoto 08 it cannot be put into iPhoto 06. I don't know how it works with iTunes--both computers seem to have the same version of iTunes--v. 7.6.1. So I don't know if the iBook needs to have a separate drive or if it could use the same drive as the MBP.
    Also, I am wondering if there is any way of finding out about impending failure of an external hard drive since S.M.A.R.T. status is not supported. I can verify the disk using Disk Utility, but was wondering if there was anything else to do other than pay attention to how old it is and how noisy it is. I imagine it is still true that over time, all hard drives will fail.
    Would setting up a RAID be something to consider? Does that give some protection against the failure of individual hard drives? (I don't know much about RAIDs.)
    I apologize for the length of this post. I wanted to try and include all the relevant information about my situation. There seem to be so many possible options that I am in a real quandary about just what to do, both for the short term and for the longer term. But I would like to have a sensible long term plan worked out in advance, so that when the time does come to buy more hardware, I will have a clear idea of what to get and how best to set it up. Any advice and help will be greatly appreciated.
    Thanks in advance!

    Thanks for responding!
    You are right about the AIFF format. I will have to burn the DVD's on my MBP since the iBook can only burn CD's, but I don't see any problem doing this, except for a question about DVD's.
    I have read several times about CD's degrading over time so that the data is lost and the CD's just turn into little gold frisbees. I think this is mostly true of the super cheap store brands, and I don't know if it's also true of DVD's.
    So I am wondering about the projected life span of whatever DVD's I choose to burn onto, and also, how best to store them. I have a stack of Verbatim DVD's which are supposed to be a good brand, and I have a much smaller stack of Archival Gold (by Delkin) which are supposed to be good for 100 years. 100 years is probably in excess of what we really need, but I would like to think in 10 or 20 or 30 years, the DVD's would still be good. If not, then I just need to know how often to reburn them.
    I tend to worry about the longevity of digital media. I had burned the iPhoto library onto inexpensive CD's, and used these to put it on the iBook. On every disc, there were unreadable files. The iBook can't read one of our purchased CD's, even though we have played it many times elsewhere. We'll try it on the MBP and another external CD drive that we have to see if we can digitize it.
    I think the DVD's for the large vinyl files are a great idea and I will plan to start doing this at some point soon. I will also burn either CD's or a DVD of anything which is an only copy. Right now there aren't many of those, but of course that could change and probably will.
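    For what it's worth, discs can also be burned from the command line with the built-in hdiutil; a small sketch with placeholder paths (the folder and image names are illustrative):
    hdiutil makehybrid -iso -joliet -o vinyl_archive.iso ~/Music/VinylRips
    hdiutil burn vinyl_archive.iso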
    I can see we need to sort a few things out, but I think the road ahead is much clearer now. In addition to the original CD's and DVD's that I plan to burn, I will probably still keep redundant copies of the entire iTunes library on a couple of different hard drives, since I have a lot of capacity open on the MBP. The music itself will be well backed up with discs, but since it has taken so many hours so far to put the music into the library, and will take many hours more, I also want to be sure to back up those many hours of effort.
    Thank you very much for all your help. I want to keep this topic open a bit longer in case my SO has any further questions, but I think my concerns are pretty well answered.
    Many thanks again!

  • Advice on a migration - or networking - strategy ?

    Hello. I am trying to map a logical strategy on how to do the following.
    I have a recording studio with an iMac 24 running 10.6. It has all of my must-have apps and internet connectivity.
    I want to have identical capabilities on a separate iMac located in another room.
    I bought a used iMac which came loaded with 10.6 and a base set of apps/features.
    It seems I have a couple of options... some form of networking where the added computer somehow accesses the 'main' computer, or running Migration Assistant and transferring the identity and contents of the first computer over to the newly bought computer. I'm looking for the simplest and easiest solution.
    All computers have ethernet and/or wireless access to my AirPort Extreme/router connected to a cable modem.
    I have never really messed with networking per se. This second machine would easily be in range for a wi-fi web connection.
    It's been quite a while since I've needed to deal with Migration Assistant, and I don't want to wind up doing something irreversible. If I were to connect the newly purchased iMac to the existing computer, do I have the option to selectively choose what gets carried over and what doesn't? If I don't want to clutter it with a lot of stuff that doesn't need to be on the target machine, just the must-have stuff, like my recording apps, web access settings, and the Thunderbird email app and settings, and want to leave out excess things it really wouldn't require, would Migration Assistant allow me to itemize things that way, i.e. customize the transfer?
    Is there any reason I would have to "wipe" the OS on the target drive to do this, or can I get away with just leaving the OS that's already installed on it as-is before I run Migration Assistant? I'd hate to get into some weird permissions issues that might be introduced... I never know quite how to resolve those situations when they've happened in the past.
    What I simply want to come away with is another Mac with pretty much all the same apps and internet/email access as my other machine... If it were simple enough to use networking to make this 2nd machine capable of accessing/using all the main iMac's features and internet/email connection, I would consider that route, but it seems more complicated...
    Sorry, but I'm only an IT guy when I am forced to make changes, and I just want to do it right 'the first time', so I appreciate any step-by-step on how to get this done in a sensible and straightforward way... and thanks very much!
    Mike

    Hi Mike,
    Best/easiest way is with the old one in Target mode...
    http://support.apple.com/kb/HT1661
    Use MA on the new one to migrate everything; there are no detailed choices, so it's best to get it all and remove what you don't need later.
    Other way would be the new one in Target mode...
    http://support.apple.com/kb/HT1661
    Then clone the old one to the new one for identical everything.
    Get Carbon Copy Cloner to make an exact copy of your old HD to the new one...
    http://www.bombich.com/software/ccc.html
    Or SuperDuper...
    http://www.shirt-pocket.com/SuperDuper/

  • Advice needed for backup strategy for office

    My office is switching from PC to Mac (yay!) and I'm in charge of setting up the system. We'll be using 4 iMacs and a Mac Mini Server with ethernet connections. I'd like to ask what is recommended for backing up these computers.
    My thought would be to back up the iMacs separately from the server - is this definitely the way to go?
    I'm thinking to get a 3TB Time Capsule for the 4 iMacs so that Time Capsule will do automatic backups.
    - Is it OK to have 4 iMacs backup to one Time Capsule like this?
    As for the Mac Mini server, I'm not sure how to best back it up.
    Since it has two 1TB drives, I'm thinking of using one drive for data and having it automatically backup to the other internal drive. Any thoughts on this idea?
    Or, is it possible to connect an external USB drive to the Time Capsule's USB port and have Time Machine back up the server to the external drive?
    Any advice is greatly appreciated!

    The TC is really a home solution.
    For business I think you should consider something extra.
    Although 4 computers and a server is not a big setup, it is worthwhile doing a more professional backup and using something more professional than Time Machine.
    For instance..
    http://www.retrospect.com/au/products/mac
    I have not used it.. but I see it recommended in business / larger installs. It is not cheap but well worth considering.
    I would not use the internal disk of the server for backup.
    As for the Mac Mini server, I'm not sure how to best back it up.
    Since it has two 1TB drives, I'm thinking of using one drive for data and having it automatically backup to the other internal drive. Any thoughts on this idea?
    You can RAID the two disks if you don't need the capacity.
    But back up to a plugged-in USB drive.. USB3 drives are cheap and speedy. Much more reliable than using network drives.
    RAID is for integrity of the working data.. back up to separate changeover disks.
    So weekly you change over the USB backup drive and store the other one at home.
    I'm thinking to get a 3TB Time Capsule for the 4 iMacs so that Time Capsule will do automatic backups.
    - Is it OK to have 4 iMacs backup to one Time Capsule like this?
    You can do this.. it is fine to have 4 Macs back up to it. But if you are not using its wireless router, it is a waste.. you can just back up over the network to the server.. TM works fine over a network.
    You can buy another 3TB USB drive for backing up the clients.. normally, though, you want all the working data on the server and back that up. You are thinking of a more peer-to-peer model.. once you introduce the server, you can work on a more server-client model where the current files are held on the server.
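    A hedged footnote: on OS X Lion and later, the Time Machine destination can also be set from the command line once the server share is mounted; a sketch with a placeholder volume name:
    sudo tmutil setdestination /Volumes/Backups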
