Pitfalls to avoid?

I have purchased an Airport Extreme (802.11g) refurb from the Apple Store and should receive it this week. I want to set up a wireless network for our Verizon DSL (currently an Ethernet-wired network) at home with my Pismo and a PC laptop running Windows 2000 Pro. Unfortunately we will be connecting at 802.11b until we upgrade machines at a later time. I would also like advice on whether I should upgrade to 10.4.9, as there seem to be several problems posted about dropped connections since the upgrade. I have never had a problem with a system upgrade in the past. Also, should I upgrade any of the Airport software or firmware if needed? Any advice will be gratefully welcomed. Thanks!
Pismo Powerbook   Mac OS X (10.4.8)   640 MB RAM, 60 GB HD (7200 RPM), Airport Card

Welcome to the discussions.
If you maintain your machine as set out in this link,
http://discussions.apple.com/thread.jspa?threadID=122021
Then you should rarely have problems. It is always a good idea to do all of the system updates and upgrades for your software as well. I find that if you follow the instructions, you will rarely have problems of any kind.
Normally the problems you read about are from users that do not perform these tasks.
I also recommend you reconsider the g refurb, because the new n model is compatible with a/b/g/n, so it will cover any older machines you may have. If you are going to update your computers later, they will more than likely be n and will work with the newer n routers.
It is always better to move forward in technology than to go backwards.
Just my opinion.
Don

Similar Messages

  • NEW TO iPod...Need assistance on pitfalls to avoid before I buy!

    I am contemplating purchasing the 160GB iPod Classic and am an active Windows XP user on a regular PC. However, I have seen multiple posts/news articles about the new iPod Classic not syncing with the new iTunes version 9...
    I would hate to purchase such a great product only to find out that I am unable to upload my current music files to the iPod (basic WMA or MP3 files), or encounter other unforeseen issues.
    Can anyone please provide any insight into what I should be aware of before making this purchase? THANKS!!

    I think issues with iTunes should be the least of your worries, as those can always be fixed with firmware updates. I would be more worried about hard drive failure with the Classic.
    Just keep in mind that any hard-drive-based iPod like the Classic has a good chance of crashing or failing on you if you don't take proper care of it. Unlike flash-based iPods like the Nano or the iPod Touch, you have to be wary about taking the Classic out on jogs or any activity that involves a lot of movement and shock for your iPod. Also, if you drop the iPod enough times or expose it to extreme heat/cold, you are probably going to get some type of hardware failure with the Classic. I would consider getting the 64GB iPod Touch instead. It does cost more money, but I feel it's more reliable, and I think in the long run the Touch will have a longer lifespan than the Classic and will still be relevant in the near future. I've heard reports of iPod Classics lasting only about a year or even less before experiencing some type of problem, but perhaps that's only a small percentage of people. Judging from past experience, I used to have a 20GB hard-drive-based iPod that performed well for about 10 months before dying out, and I very rarely ever mishandled it.
    But if you must get the Classic, then invest some money in a good case or protector and be sensible about how you handle and use it. Otherwise, I do think it is a great product for people with large libraries.

  • ERPi Initialize Source System is not extracting any chart of accounts.

    Hi Gurus,
    I have an issue while Initializing Source system in ERPi. Any help fixing this issue is greatly appreciated.
    The issue is, when I initialize the source system in ERPi, it kicks off only one scenario in ODI. And when I browse to Process Detail to check the status, it still shows as running.
    But I don't see any scenarios in ODI in a running state.
    So, obviously, we can guess it didn't get far enough to show the chart of accounts.
    So here my questions are:
    1. I believe there should be multiple scenarios that should have been kicked off in ODI. Why has only one scenario been kicked off?
    2. And how long should the Source Initialization process take?
    Version of software using (ERPi- v.11.1.2.2) and ODI (v.11.1.1.3) Source System (EBS v.11i)
    Any help is greatly appreciated.
    Thank you,
    Mike.

    Totally agree with you.
    Found this out after digging more into the issue. Finally got past it.
    Yes, I imported the Master and Work repositories which we get as part of the FDM installation.
    Surprising but true that ODI v.11g is only supported for Hyperion systems greater than v.11.1.2.1.501.
    Also, by any chance do you know of any knowledge base or forums where we can find good material for ERPi implementation, like tips or pitfalls to avoid in an ERPi implementation?
    I know this is one of the good places.
    Found that the admin guide doesn't fully cover everything.
    Thank you,
    Mike.

  • How much will more RAM speed up my iBook?

    I am looking to upgrade my RAM.
    I know that speed is a multi-factor question. I need to know what's important and what pitfalls to avoid.
    I have a 12" iBook G4, 1.2GHz, 512MB RAM (256 card, 256 internal). I recently installed a 120GB Western Digital "Scorpio" 5400RPM 9.5mm SuperSlim notebook drive with 8MB data buffer.
    I am choosing between a 512 or 1g RAM upgrade. My questions are:
    1) How much will this speed up my computer? 2) Is the 1g really twice as fast? 3) with my specs?
    I read a post where some poor cat had upgraded RAM, but didn't feel the difference in iPhoto.
    My specs read pc2100 (pc2700 compatible). Will my specific machine work faster with pc2700? Comptick (which has an awesome battery price) has a 1g RAM upgrade for $189, which is pc2100. Apple's $300 1g is pc2700.
    What do I need to know about the bus rate so that I don't slow things up there?

    1) How much will this speed up my computer?
    There is no way we can quantify this for you. It depends on your usage pattern (i.e. what application you use, how many applications open at a time, etc.).
    2) Is the 1g really twice as fast?
    Most certainly the answer is no. If lack of RAM was causing delays in processing, adding more RAM will alleviate those delays and make your computing experience smoother and more enjoyable. But changing your RAM from 768 MB to 1.25 GB won't double the speed of your iBook.
    3) with my specs?
    ?? What is the question?
    I read a post where some poor cat had upgraded RAM, but didn't feel the difference in iPhoto.
    For the most part, iPhoto is disk intensive not RAM intensive. Therefore upgrading RAM will only improve certain aspects of iPhoto (such as photo editing).
    Will my specific machine work faster with pc2700?
    No, the PC2700 will operate at PC2100 speeds.
    Current RAM prices from Mac knowledgeable vendors offering lifetime warranties can be found at RamSeeker.com and DealRam.com.
    What do I need to know about the bus rate so that I don't slow things up there?
    Nothing. Just buy PC2100 or PC2700 RAM and relax.

  • Hard drive may be on its way out - what now?

    I have a CoreDuo 1.83GHz white iMac, in the past month or so the hard drive has begun acting up (most recently permissions going haywire, losing printer drivers and unable to add any), Disk Utility has reported problems which it has managed to fix (booting from the install disc) but two problems in about a month does not sound good.
    Is there a free tool for checking the hardware health / status of my hard drive? Something that I can run on the iMac as-is, non-destructive. I am pretty low on internal HD space anyway so an upgrade wouldn't go amiss.
    I'm led to believe that these iMacs are tricky to get into for maintenance, however I am competent with a screwdriver and computer innards so would like to give it a go. Are there any good tips or pitfalls to avoid when doing this? I saw a guide online for upgrading the CPU to a Core2Duo, I could use this for the part about getting into the machine.
    http://www.maclife.com/article/createupgrade_your_imac_to_a_core_2_duoprocessor
    Thanks for your advice!
    PS Just out of interest, if I were to upgrade to a Core2Duo what would be the latest version of the chip that I could fit? In the guide they fit a T5600 but that was written over three years ago.
    Message was edited by: Jonathan Mortimer

    {quote:}The thing is that I really like my white iMac, I'm not at all keen on the shiny metal look of the new ones (especially not the gloss screen, that's just wrong and impractical for me){quote}
    I know exactly what you mean, that's why I'm still using my 17" iMac with a connected 21.5 LG display.
    {quote:}Hmm. I have freed up some space so I now have 51GB free, however I am a little puzzled as to where all of the remaining space has gone. It is a 160GB hard drive (152GB), my home folder uses about 73GB, that leaves about 80GB! I know OS X only uses about 6~8GB so where is the rest of the free space?{quote}
    The drive could be somewhat fragmented because of iMovie; you might want to clone the internal HD to an external hard drive and then back, using the Restore feature in Disk Utility or SuperDuper. Although, I advise using a second dedicated external HD (FireWire 400 is faster than USB for cloning OS X), so that you have a second untouched backup on your 1TB in case something goes wrong. If you decide to swap the internal HD for, say, a 320 or 500GB drive, then a second external HD or USB/FireWire enclosure for your OEM 160GB drive will also come in very handy for restoring OS X to the new drive, and later for backup or storage.
    In short: I never trust or rely on just one drive, internal or external, with any of my data, and always keep it on no less than two drives....
    I will help some folks that want to swap out their internal HD, and some that have already made a mess inside their iMac. But the two biggest problems are: (1) the white Intel iMacs are most difficult to open and work in without damaging other components in the process; (2) internally they tend to run hot, and at 4 years old the display connections, display cables and pretty much everything else are very brittle and easy to damage.

  • Is it common to keep a music library on an external drive now?

    I did this about five years ago, when it was relatively new, and stopped because it was sometimes wonky. Given the popularity of the MacBook Air, I'm thinking that perhaps the wrinkles might be ironed out by now? I have a very large music library (200 GB) and would love to move it to our Time Capsule. Does this make sense with a fleet of devices (iPad, iPod Touch, iPod Classic, Nano) and one main computer (MBP) on a very fast WiFi LAN? One more snowflake detail: our Time Capsule began to exhibit the odd "connection has been interrupted" error when I tried my Calibre library on it. If you recommend going forward, I'll just switch back if this becomes an issue. I mention it in case someone reading this knows of a workaround.
    I appreciate any help.

    Brilliant; thank you! I hadn't heard about locking in the IP but it makes perfect sense. Should be easy to do in Airport Utility, too.
    Only the first part of my question remains. Does keeping a music library on an external drive work well now? Are there any pitfalls to avoid? Should I consider breaking up my collection into two libraries: one for audiobooks and the other for music?

  • Matching across multiple character sets

    Would like to know whether anyone has attempted matching across multiple character sets, for example between English and Japanese: what are the pitfalls to avoid, what are the best practices, and what would you like to see from an application/tools perspective as an ideal solution. Thanks.

    If you upgrade to Logic Pro, you'll get WaveBurner as part of the package which helps you do this, including tweaking your pauses between tracks, fades etc.
    If you have Toast, you can do it there too.
    If you don't have any 3rd. party software, the work around would be to assemble all your songs in order, end to end in a new Logic file, and listen to all your tracks and adjust the relative levels between songs, then bounce out the individual tracks which have volume changes with their new volume settings. Finally you could then use any burning app such as [SimplyBurns|http://bit.ly/c1oglP] to create CDs or bounce them out in Logic with the additional .mp3 option.
    Obviously it's important to listen to your material in order, in context, as some songs will be at the wrong subjective level depending on the tracks either side in the placement. This isn't really important in digital distribution where your material probably won't be listened to as a whole, but as individual downloads.

  • Unable to View Rendered Stills in Canvas / in QT

    Hello,
    I've read through a variety of related threads on this and the usual RGB / Alpha type tricks have no effect. I'm basically using the Motion panel to create a Ken Burns type effect, and have keyframed everything out and dropped the images into the timeline without any trouble. To that point I can scrub through the clips in the Viewer and see my effects.
    As soon as I render them (tried Animation, Graphics, MPEG-4, and H.264, which doesn't render at all) I can no longer see the effect I made in the Viewer, and when I play the clip in the timeline the border of the image I'm using shows up in the Canvas (scaling as it goes), but the viewer area itself is empty (black).
    My sequence settings are 24fps, square pixels (using PSDs but tried JPEG also), at a custom size. I have a video card that's capable of viewing rendered video on screen (Radeon 9800 Pro), and the RT Extreme menu shows "Based Layer Only" unchecked and Full Quality checked on the Tape setting.
    I don't know though that this is a viewer problem (codec seems more likely) because when I output the rendered clip via QT, the resulting video file is also solid black.
    Any advice on the best sequence and viewer/canvas settings to use for a Ken Burns-type slideshow would be appreciated. I've checked the manual pages on working with still images, and though they touch on image size and other items, they don't recommend any optimal sequence settings or point out pitfalls to avoid for this sort of problem.
    Thanks in advance.

    Not sure if you are trying to lead me to an answer you already know, but if you do have a good idea I'd appreciate any details. I'm on the clock with this project so I don't have a lot of time for trial-and-error type experimentation.
    Not at all, "it doesn't work" is not helpful information.
    I provided everything we know about such situations: Read the manual, keep it simple, know how to walk before running with scissors, use conventional settings before trying weird stuff. If you don't have time to experiment, you don't have the luxury to try new stuff you don't already know how to do.
    For instance: the Viewer is not where you will see effects from the timeline. That's what you need the Canvas for. If you're editing your effects in the Viewer, they are being applied to the clips in the Browser, not to the copies of the clips in the timeline. But, like I said, if we were sitting at the same machine, this might be insanely easy to fix. As it is, I don't know what to tell you.
    We'd love to help.
    bogiesan

  • Labview resizing control.....

    Hello All,
    I am in the middle of developing an application that will be bundled with our product.  I have developed many such applications using visual studio, however this is my first adventure using Labview.
    As many of you know, one of the most common variables in distributing an application is the end-user screen resolution.  With the relatively recent explosion of differing screen resolutions and aspect ratios, making applications resolution-independent is more important than ever.  Along with this, giving the customer the ability to resize the application at will is a huge benefit.  On many occasions I have walked into a customer's site to see my application shrunk down to a corner of the window, where the user can still see it, almost like an indicator.
    For visual studio, I had purchased a third party control.  Plunk the control onto the window (it became invisible at run time) and all of the controls, fonts, etc resized as the screen was resized.  When visual studio went to .Net, this particular control no longer worked.  I had to evaluate 5 different controls to get one that even came close to working properly, and even then I had to work with support and beta test many versions of that control to get the bugs out. 
    Now I know that Labview has the panel resize options, etc and I have played around with them.  However, as most of you know, many problems still exist, such as the fonts not resizing, and multiple window changes cause control distortion, even when brought back to the same panel size.
    So, what I am wondering is this:  A) Does anyone make a control that I can purchase that does a good job in this regard, or B) if not, how about opening a discussion on how to implement a control to do just this?
    If I need to, as I foresee doing more applications with Labview, I could tackle this on my own, but as I am new to LV my learning curve may be steep, so I'd like to draw on the abundance of talent and experience I've read on this board.
    I was thinking of a control that would enumerate all of the controls, labels, etc. on the panel.  During the development cycle it would store all of the control / font sizes (internally? to a file?) as the developer intended the panel to appear.  At execution time (after it is built), each time the panel is resized, the control would set the heights/widths and font sizes as needed.  Why store the development-time sizes?  To eliminate the errors that creep up when sizing the panel up and down multiple times.  I assume this occurs in LV since original size is not tracked and pixel displacement is non-uniform across the panel, and the moving / scaling is non-integer in nature but gets applied to integer properties.  For example, when resizing from the lower right corner to, say, half the width / height, controls in that corner move a lot while controls in the upper left only move slightly, yet all controls are scaled in size the same amount.
    I would love it if someone could dope-slap me upside the head and say "just do this".  If not, what are some of the methods I should be looking at, and pitfalls to avoid?
    Thanks for your input!  
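    Since LabVIEW block diagrams can't be pasted as text, here is a rough Python sketch of the scheme described above (all names are hypothetical): snapshot every control's design-time geometry once, then always rescale from those stored originals, so integer rounding never accumulates across repeated resizes.

```python
def record_geometry(controls):
    """Snapshot design-time (x, y, w, h) for each control, keyed by name.

    In LabVIEW this would be read once from the controls' position/bounds
    properties; here 'controls' is just a hypothetical name -> tuple map.
    """
    return dict(controls)


def rescale(original, design_size, new_size):
    """Scale every control from its ORIGINAL geometry, not its current one,
    so repeated resizes never accumulate integer-rounding errors."""
    sx = new_size[0] / design_size[0]
    sy = new_size[1] / design_size[1]
    return {
        name: (round(x * sx), round(y * sy), round(w * sx), round(h * sy))
        for name, (x, y, w, h) in original.items()
    }
```

    The key point is that `rescale` never reads the controls' current, already-rounded sizes; in LabVIEW terms, the stored geometries would come from property nodes read once at startup.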

    tartan5 wrote:
    Wouldn't life be grand if every control included a set of read-only properties, say XOrigin, YOrigin, XSize, and YSize that were updated as you move the component around during development, but was fixed as you run the compiled program.  This would at least give you an absolute reference to use as you resize the panel, and eliminate the truncation errors we now see.
    You could do that using tags if that is what is needed.
    One interesting control I tested actually increased performance by grabbing an image of the window when the resize handle was moved, placing the image over the window, and resizing the image as the corner was dragged.  Then when the corner was released, it updated all of the controls / fonts as required, then hid the image.  It actually worked amazingly well versus resizing the controls in real time... That might be very doable in LabVIEW: show a picturebox, have it size to the window, etc.
    It probably would not be needed. You can just defer the panel updates until after you're done with your resizing. I was thinking the performance issue would be in going over all the controls and changing their size, but there is probably not going to be one.
    On another note, is it possible to recurse through an xcontrol and get all of the components at run time?  I have just finished developing my first xcontrol (so it's fresh on my mind) but I'm not sure how the xcontrol is viewed by the environment (ie if all the sub-components can be reference / obtained)?
    No idea. I use LV 7.0, which does not have them. My guess would be that you can't, at least in the earlier versions. Maybe the newer ones allow this, but I would doubt that. You might need to write a resize method (or ability or whatever they're called) for the control. That is one example of how writing a generic framework would be complicated.
    On your first point, are you suggesting that the resizer control would add the tags to each enumerated control itself?  That's almost as good as my wish above (LOL)....
    Well, I wouldn't call it a control. It would be a special resize handling (TM) process, but yes, that is exactly what I'm suggesting. See attached for a simple example.
    P.S. No, I don't know how to access these in LV 8.x.
    Try to take over the world!
    Attachments:
    Pos Tags.vi ‏36 KB

  • Fault tolerant, highly available BOSE XI R2 questions

    Post Author: waynemr
    CA Forum: Deployment
    I am designing a set of BOSE XI R2 deployment proposals for a customer, and I had a couple of questions about clustering. I understand that I can use traditional Windows clustering to set up an active/passive cluster for the input/output file repositories - so that if one server goes down, the other can seamlessly pick up where the other left off. On this Windows-based active/passive cluster, can I install other BOSE services and will they be redundant, or will they also be active/passive? For example: server A is active and has the input/output file repository services and the Page Server. Server B is passive and also has the input/output file repository services and the Page Server. Can the Page Server on B be actively used as a redundant Page Server for the entire BOSE deployment? (Probably not, but I am trying to check just to make sure.) If I wanted to make the most fault-tolerant deployment possible, I think I would need to:
    - Set up two hardware load-balanced web front-end servers
    - Set up two servers for a clustered CMS
    - Set up two web application servers (hardware load-balanced, or can BOSE do that load-balancing?)
    - Set up two Windows-clustered servers for the input/output file repositories
    - Set up two servers to provide pairs of all of the remaining BOSE services (job servers, page servers, WebI, etc.)
    - Set up the CMS, auditing, and report databases on a cluster of some form (MS SQL or Oracle)
    So 10 servers - 2 Windows 2003 Enterprise and 8 Windows 2003 Standard boxes, not including the database environment. Thanks!

    Post Author: jsanzone
    CA Forum: Deployment
    Wayne,
    I hate to beat the old drum, and no I don't work for BusinessObjects education services, but all of your questions and notions of a concept of operations in regards to redundancy/load balancing are easily answered by digesting the special BO course "SA310R2" (BusinessObjects Enterprise XI R1/R2 Administering Servers - Windows).  This course fully covers the topics of master/slave operations, BO's own load balancing operations within its application, and pitfalls to avoid.  Without attending this course, I for one would not have properly understood the BusinessObjects approach and would've been headed on a collision course with disaster in setting up a multi-server environment.
    Best wishes-- John.

  • Using Word wisely as a linked document for RH HTML 9

    I am new to application help authoring and a total newbie to RoboHelp.  I am writing a help project with RoboHelp HTML 9 for a brand-new software application.  Virtually no content already exists in any sort of document, so I have the opportunity/challenge of starting from scratch.  Since a handful of users will have authority to update content (but little bandwidth to learn much about RH), I was planning to write everything in Word 2010 and then link to my project.  What I'd like to know as I get started is:  Does this make sense as an approach (as opposed to, say, authoring in RH and then using print output for training and other doc requirements)?  What are the pitfalls to avoid?  What should I do/avoid in the Word doc to make the connection to RH 9 go as smoothly as possible?  Are certain layouts more congenial to linking with Word than others?
    I realize this is several questions--I'd be happy to get a recommendation on a helpful source of info or guidance on making these sorts of macro decisions.  I self-educated using an "Essentials of RH" workbook and have studied the User's Guide and forums to create a prototype set of help topics for my client, but I haven't been able to find a good source of general info that addresses "why do this vs. that?" or "if I had had the luxury of being in control of the content from the get-go, I'd certainly have done X, Y, and Z."  I am grateful (in advance) for any help you more-experienced community members can suggest.
    Regards,
    Amanda A

    Turning off smart quotes will only eliminate them in future usage, but do nothing to the occurrences in existing docs.
    You should therefore also do a Find & Replace (double quotes in each box, then another pass for single quotes) to "clear the palate," so to speak.
    Good luck,
    Leon
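    Turning that Find & Replace into a script may help if many linked Word documents need the same cleanup. A minimal Python sketch, assuming the four standard Unicode smart-quote code points are the only culprits:

```python
# Map the four common smart-quote characters to their straight equivalents.
SMART_QUOTES = {
    "\u201c": '"',  # left double quotation mark
    "\u201d": '"',  # right double quotation mark
    "\u2018": "'",  # left single quotation mark
    "\u2019": "'",  # right single quotation mark
}

def straighten_quotes(text):
    """Replace curly quotes with straight ones, leaving everything else alone."""
    for smart, straight in SMART_QUOTES.items():
        text = text.replace(smart, straight)
    return text
```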

  • Database and labview 8.5

    I am looking to do some database code for an application in LabVIEW 8.5 and Windows XP (maybe Vista as well).  I haven't done much database work with LabVIEW in a few years and am looking for any pitfalls to avoid.  I need to choose the right DBMS access (MySQL, ...) and also to know if there are known issues with the database toolkit.  Has anyone had any problems or success using 8.5 and databases?  I will be writing moderate amounts of data to the database, updating fields several times per minute, and the system will run mostly continuously, 24/7.
    Paul 
    Paul Falkenstein
    Coleman Technologies Inc.
    CLA, CPI, AIA-Vision
    Labview 4.0- 2013, RT, Vision, FPGA
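    As an illustration of the write pattern described above (small batches committed several times per minute, running 24/7), here is a minimal sketch in Python with sqlite3. This is not the LabVIEW Database Connectivity Toolkit, and the schema is hypothetical; it just shows the same transactional idea: batch the inserts and commit once per batch, so the always-on logger never holds a long-running transaction.

```python
import sqlite3

def log_samples(conn, samples):
    """Insert a batch of (timestamp, value) rows in one short transaction."""
    with conn:  # commits on success, rolls back on error
        conn.executemany(
            "INSERT INTO readings (ts, value) VALUES (?, ?)", samples)

# Hypothetical schema; a real deployment would use the DBMS chosen above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (ts REAL, value REAL)")
log_samples(conn, [(0.0, 1.5), (1.0, 1.7)])
```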

    Hi Paul,
    You should be able to use the Database Connectivity Toolkit with LabVIEW 8.5 just fine.  There is a list of supported databases here so you should be able to select the database you want.
    Donovan

  • SQL 7 to Oracle migration

    Is there any information available on a strategy for designing a DB schema that will be initially deployed on SQL Server 7, but will eventually be scaled to an Oracle server? I want to know some pitfalls to avoid so that the eventual migration proceeds as smoothly as possible.
    TIA,
    Michael Chapman

    Hi Michael,
    We don't have such a document. If it's scalability you want then I would design with Oracle from the start. We have had a lot of people requesting the Workbench for SQL Server 7.0 migrations.
    Regards
    John

  • Aperture: Migration to new Mac (Referenced Library)

    Apologies if this has been covered before but I've used numerous searches and don't seem to be hitting the right combination of terms to bring up what I want.
    I currently have a flagging 2007 MacBook (OS 10.6.8; 2.16 GHz, 2 GB RAM and only 7GB free of the 160 GB HDD). Aperture is struggling.  Time to upgrade.
    My Aperture Library is currently on the Mac and is 'Managed'.  I have contemplated moving the 50GB or so of Aperture Library to an external HDD and going 'Referenced', mainly because I still have some images I need to work on (even though Aperture is for obvious reasons ponderously slow with frequent SBOD on this machine) until I decide what to upgrade to (Macbook 15" or iMac with more bangs for the buck) and wait for the latest refresh of the line that I choose.
    Upon getting the new machine I plan to use the Migration Assistant to help with app/doc/settings transfer but what about Aperture?  I am not sure if it's best to:
    1) Get the new Mac now, migrate everything across (including Aperture and its Managed library) THEN move the Aperture library off the internal HDD to an external and going Referenced, or;
    2) Go Referenced now.  In which case, when I eventually do migrate Aperture to the new machine, will it automatically 'point' to the correct location of the external HDD referenced library when what is left of Aperture copies across, or is there an easier (or indeed more convoluted) process I will have to go through if I switch to Referenced before getting the new Mac and migrating? 
    Except of course with the new Mac the HDD will be so much bigger, so there may actually be no need to go Referenced, at least yet.  Try as I might, save for HDD space I don't see that many benefits to Referenced.
    On the new Mac front, while I like laptops, I find that the iPad and this Mac do most of what I want (e.g. surfing, mailing and running the odd few apps).  While a new MBP would be appreciated, part of me still thinks that the more-bang-for-the-buck iMac is the better investment.  The only thing I MAY need to do is upload the occasional photo shoot on the move (by creating a new project), which, if stripped back to basics, this Mac might still be OK for until I get back home and move the project to the iMac, relocating the masters to the referenced external HDD after.
    Any help appreciated.

    Hi,
    some considerations you may want to keep in mind. There is no definitive answer for the perfect library setup - it will depend on the size of your Aperture library, the amount of available disk space, your workflow, and your backup strategy.
    I currently have a flagging 2007 MacBook (OS 10.6.8; 2.16 GHz, 2 GB RAM and only 7GB free of the 160 GB HDD). Aperture is struggling.  Time to upgrade.
    On that machine you really need to relocate your master image files to an external drive or free up disk space in another way. With only 4% of free space on the system drive, even a newer Mac will be very slow. Try to keep 20% to 30% of your system volume free.
    My Aperture Library is currently on the Mac and is 'Managed'.  I have contemplated moving the 50GB or so of Aperture Library to an external HDD and going 'Referenced', mainly because I still have some images I need to work on (even though Aperture is for obvious reasons ponderously slow with frequent SBOD on this machine) until I decide what to upgrade to (Macbook 15" or iMac with more bangs for the buck) and wait for the latest refresh of the line that I choose.
    For best performance the Aperture library should reside on your fastest drive, usually the system drive. If you want to go referenced, relocate the masters but keep the library on the internal drive. Only if you have a very fast connection to your second drive, or two internal drives, may it be advantageous to move the whole library to the other volume.
    Managed, referenced, or mixed?
    Managed: A managed library is easier to handle, as long as it is reasonably small. With a 50 GB Aperture library you can continue with a managed library, as soon as you have more disk space available. The advantage of "Managed" is that you do not have to keep track of your masters on your own, and that they will be included in the vaults. You will need an incremental backup scheme that looks inside the library package, however - like Time Machine - otherwise you will need to back up the whole library over and over again, even if you only changed one single image.
    Referenced: If your Library gets larger, and you have several hundreds of GB, then a managed library becomes a nuisance and it is time to go referenced. Very large libraries are difficult to move or copy  between disks; It will be wasteful to have several vaults, for each vault will include the same masters over and over again.
    Mixed: The Aperture library on the system drive, most of the masters on an external (or second internal) volume. This setup is perfect for laptops with limited space on the internal drive, but it will require that you have a well ordered strategy where to keep your masters, since Aperture will not manage them for you. There are two pitfalls to avoid: Accidentally deleting or modifying masters from the Finder, or accidentally relocating them to a place where you store other images that are not your masters. When you have several similar images in the same folder, it can be very hard to tell which image is the master that you need to keep and which is a redundant copy.
    The "mixed" setup is great if you are on the road (but it will put more strain on your memory or your master-management skills) - you still have your Aperture library with you and the master image files you are currently working on, but not the bulk of your masters. If you create high-quality previews, you probably will not even notice that most of your master image files are still at home.
    Upon getting the new machine I plan to use the Migration Assistant to help with app/doc/settings transfer but what about Aperture?  I am not sure if it's best to:
    1) Get the new Mac now, migrate everything across (including Aperture and its Managed library), THEN move the Aperture library off the internal HDD to an external one and go Referenced, or;
    2) Go Referenced now.  In which case, when I eventually migrate Aperture to the new machine, will it automatically 'point' to the correct location of the referenced library on the external HDD when what is left of Aperture copies across, or is there an easier (or indeed more convoluted) process I will have to go through if I switch to Referenced before getting the new Mac and migrating?
    Except of course, with the new Mac the HDD will be so much bigger, so there may actually be no need to go Referenced, at least yet.  Try as I might, save for HDD space I don't see that many benefits to Referenced.
    From my experience, it is less troublesome to migrate a managed library with Migration Assistant. If parts of your Library are referenced, and you migrate the referenced masters as well, you may need to reconnect them, unless you only have to plug in the volume with referenced masters. Then Aperture should reference them correctly without extra trouble.
    Try as I might, save for HDD space I don't see that many benefits to Referenced
    Then stick to the managed setup until your library really becomes huge.
    On the new Mac front, while I like laptops, I find that the iPad and this Mac do most of what I want (e.g. surfing, mailing and running the odd few apps).  While a new MBP would be appreciated, part of me still thinks that the more-bang-for-the-buck iMac is the better investment.  The only thing I MAY need to do is upload the occasional photo shoot on the move (by creating a new project) which, if stripped back to basics, this Mac might still be OK for until I get back home and move the project to the iMac, relocating the masters to the referenced external HDD after.
    Any help appreciated.
    I am still waiting for my iPad to be delivered - right now I take a MBP on the road. For the new shoots I create a new Aperture library, do most of the tagging while I still remember how the images have been taken, and when back home I import the new project into my main library.
    Regards
    Léonie

  • BLURRY FINAL PRODUCT!!

    Very VERY FRUSTRATED.  Just spent hours and hours putting together a slide show using Premiere Elements 12 - and the final product (after rendering) turned out VERY BLURRY!  I don't understand how this happened.  All the pictures and slides added were crystal clear until rendering.  And watching the final DVD was a total disappointment.  I am not happy that we spent all this money on a program that is not simple to use.  The free program I used to use worked better than this.  Please help me fix this slideshow.

    jenn
    Thanks for the reply. I will post information in whatever format you feel that you best understand so that we can get you the Premiere Elements end product that you seek. From what you have written, you want your slideshow in DVD-VIDEO format on a DVD disc. That is the format that will play back on the TV DVD player. Whether that DVD-VIDEO is standard 4:3 or widescreen 16:9 will depend on how you set up the Premiere Elements project that produced it.
    This all leads us back to what project settings you used to produce the DVD-VIDEO on DVD disc, and how you did or did not match up the properties of the source media with the project preset (also known as the project settings). This is not complicated, but it does require detailed consideration so that you get a decent end product.
    Also, in your end product, is it your source videos or your photos that show as blurry, or both? Does anything display without blur?
    You have said that your source video media are coming from
    GoPro camera
    iPhone
    Nikon Digital Camera
    Kodak (underwater)
    Each may be giving you source video in a different format (sizing et al). A Premiere Elements project has only one project preset. So if you are importing different formats into that one Timeline, you need to set priorities: pick your primary source, set the project preset for that, and then fit all the rest as best you can after you import them.
    Is it possible for you to round up the brand/model/settings for each of the above sources of your video in this project? There are all sorts of pitfalls to avoid so that you are not left with burn failures, blurry images, black borders, etc. iPhone video may present with a variable frame rate and not let you import it, or may give you out-of-sync audio. Photos grossly oversized for the project can cause failures and crashes. The above is my overview of the situation.
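    One quick way to round up those details, assuming you can install the free ffprobe tool (part of ffmpeg), is to print each clip's resolution and frame rate from the command line. The file names here are placeholders; substitute your own clips:

```shell
# Print resolution and frame rate of each source clip so you can pick
# a project preset that matches your primary footage.
# File names are placeholders - substitute your own clips.
for f in gopro_clip.mp4 iphone_clip.mov; do
  echo "== $f"
  ffprobe -v error -select_streams v:0 \
    -show_entries stream=width,height,r_frame_rate \
    -of default=noprint_wrappers=1 "$f"
done
```

    If avg_frame_rate (also available via -show_entries) differs noticeably from r_frame_rate, the clip likely has a variable frame rate - the iPhone symptom mentioned above.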
    The answers will be in the details. And there are many that cannot be overlooked, especially when you come with "BLURRY FINAL PRODUCT" as your thread title.
    Please think about the above. Please try to give us the details of the source media going into the project that will produce your DVD. Then we will help you put all that information together (step by step) for a Premiere Elements project to produce the DVD-VIDEO on DVD disc.
    Thanks.
    ATR
