Live! 5.1 internal pins

I need a guide (I lost my manual) so I can install my front speaker and microphone connections using the internal pins. If someone could help me out I would appreciate it, especially if the guide explains how to install surround sound speakers too.

smash59 wrote:
I, too, am interested in connecting the front panel headphone and microphone jacks on my PC case to my Sound Blaster Live! 5.1 sound card. I looked at the pinout given in the knowledge base and it seems to indicate that there are no pins available to connect my case audio jacks. Am I missing something? I don't really want to make a crappy patch cord to plug in on the back of the computer and then run it through a hole in the case to connect to the front panel leads... yuck.
Shannon
I too would like to connect to the front panel of my computer from my CB0224 SoundBlaster Live! 5.1 Digital. The KB listing of model numbers leads me to believe that my card has these connectors, BUT I don't see any unlabeled connectors, except one white one with ten or eleven pins in one row.

Similar Messages

  • Setting up an Internal Live Update Server for Symantec.

    I am trying to set up a LiveUpdate server internally and cannot get the thing to work. Does anyone know where the update files have to be on the server? I have them in the root of the webserver folder. Also, does anyone know if it requires a certain port open, or what the syntax is for the web address? I know it is somewhat working, as I can see my Mac client and send it preferences, but I cannot get it to run an AV scan or a LiveUpdate. Any help is appreciated.

    I have tried to tinker with the python scripts from the OSX installer.
    http://vimeo.com/57973290
    Adobe needs to release some documentation, though. Plenty of Enterprise deployments have Linux backend servers. I want to set up AUSST on Linux to match my Reposado (https://github.com/wdas/reposado) Apple Update Server.
    Good luck!
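    A generic sanity check, independent of LiveUpdate itself: serve the folder that holds the update files over plain HTTP and confirm a client can fetch one of them by URL. A minimal sketch (the path, port, and hostname here are made up for illustration):
    cd /path/to/update/files
    python -m SimpleHTTPServer 8000
    curl -I http://yourserver.example.internal:8000/somefile.zip
    If curl returns a 200 for a file you know is there, the hosting and port are fine and the problem is on the client-configuration side (host address/port syntax); if not, fix the web server first.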

  • How small can I make the Boot Camp partition on my MacBook Pro internal drive?

    It seems that in order to get Windows 7 onto my old unsupported MacPro 1,1, I first need to install it from this mid-2009 MacBook Pro. I want to do the install onto an external FireWire 800 hard drive, which I have formatted, and I have downloaded and installed the Windows support folder using the Boot Camp installer. But the installer wants to grab 20 GB of my internal drive for the Win 7 install. What is the minimum that I must have on the internal drive to install and use Win 7 on the external drive?
    Is there another way to get Win 7 onto my MacPro 1,1 where it can live on an internal drive of its own? I do not have to have it running on this MBP 5,3 other than to get it running (64-bit mode) on the old tower. In addition to the MBP, I have at hand the external drive, a blank 16 GB thumb drive, and a FireWire 800 external DVD drive that I use mostly as a burner on the MacPro, as it is faster than the internal SuperDrive it came with.

    I assume that is possible, so then I can remove the Boot Camp partition from my internal drive and boot to the FireWire 800 drive. Or do I need to have an SSD inside the MacBook Pro first?
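    On the minimum-size question, you can ask the command-line side of Disk Utility directly what the boot volume can shrink to. A sketch (disk0s2 is the usual device name for the boot volume, but confirm yours with the first command):
    diskutil list
    diskutil resizeVolume disk0s2 limits
    The second command prints the minimum and maximum sizes the volume can be live-resized to, which tells you how much space can actually be carved off the internal drive.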

  • FCP4.5HD capture live with Canopus...workflow

    Here's the skinny,
    We do a live show that is presented in multiple formats:
    -Webcast live (media player)
    -Internal archived DVDs (iDVD/DVDSP)
    -Website archived streaming (real player)
    -Public Access broadcast (from DVD)
    -Hoping to elevate to PBS distribution (currently shooting MiniDV, wanting to bump up to DVCPro)
    Currently the workflow includes live cut w/Videonics MXProDV switcher that goes analog out to webcast server, and records analog out to MiniDV tape (Sony GV-D900 NTSC Walkman).
    I then capture recorded MiniDV tape into FCP4.5HD on our Dual 2.5 G5...internal 250GB HD...FCP is on the int. 250GB system drive. I then recut, title and cleanup the show for archiving and Public Access distribution via DVD, as well as export QTConversion to RealMedia for web distribution.
    I'd like to save 'ingest time' by including a 'live capture' into FCP using capture now through a Canopus A/D converter. (have not been successful getting FCP to recognize the FW output of the Videonics, so would need to go analog into FCP)
    Questions...
    1. Is the Canopus ADVC 110 going to give me what I need to handle the analog out of Videonics to capture into FCP on an external Maxtor 1Touch 300 GB drive?
    2. Should I also incorporate an additional PCI/FW card/bus to make certain I don't drop frames?
    3. Would I be better off replacing my 250 GB internal 'media' drive with a larger internal...say 500 GB?...and then...
    4. Will FCP be able to live capture directly to that (still using the Canopus, but without needing the second FW bus)?
    5. Would I need a better Canopus or AJA card to handle this workflow?
    6. If/when we can bump up to DVCPro as our tape format, how would that affect any of the above mentioned hardware considerations?
    This and other forums have brought me this far; I'm just hoping for real-world practical experience/advice as to the best workflow possible.
    I've suggested we get a much better Deck and could use other 'improvements' as well, but we are a non-profit, and funds are limited...however, all suggestions are welcome.
    System specs listed below. (though I had to drop our QT version to 6.5 from 7 to do RealPlayer conversions, and am waiting on a new Pro key)
    Thanks,
    Kevan
    Dual 2.5/1GBS SDRam/2x250IntHD/ATY-R360/FCPHD4.5/QTP7.0.2   Mac OS X (10.3.9)   DVDSP2/LiveType1.2/MPEG StrmClip/PeakXprs/RealPlyr/Cleaner6/Soundtrack/Soundsoap2

    just marking this as answered to get it out of "My questions" box.
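    For reference, the data-rate arithmetic for DV (standard numbers, not specific to this rig): DV video is a constant 25 Mbit/s, roughly 3.6 MB/s once audio and overhead are included, which works out to about 13 GB per hour of footage. FireWire 400 is good for up to 50 MB/s in theory, so a single FW bus has ample headroom for one DV stream; dropped frames during capture usually point at the capture drive or other traffic sharing the bus rather than at the bus's raw speed.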

  • HT4859 How do I use the hard disc to store data

    I find that my hard disc is huge but the Mac itself has a memory of 4 GB. I am new to Macs so this might be pretty basic... I used to just move items between hard discs on my PC. Is there any way to do this on the iMac?

    Nothing is held in RAM in the way you mean. We are dealing with three things here:
    1. Your Mac's Random Access Memory - this is only a working area, holding applications while they are running and files only while you are working with them. As it's not permanent, you must save files to permanent storage.
    2. Your Mac's internal hard disk - usually called 'Macintosh HD' or something like that.
    3. It sounds as if you also have an external or secondary hard disk.
    System files and applications will live on the internal hard disk, which is what you (usually) boot up from. Downloads go, basically, where you tell them to - the default is the Downloads folder in your Home Folder but you can perfectly well designate the external hard disk instead. You can save files to either disk, and you can copy files between them.

  • Can you help me solve my Leopard Server VPN madness?

    Hello all,
    I've been having a devil of a time getting Leopard Server's VPN service to work "properly". None of this is mission critical, as it's simply on a home system I'm using as a nat/dns/dhcp/firewall/mail/web server for my Comcast line (with a static IP). But it is frustrating, because I currently have a 10.4.11 Server fulfilling the same role, so it seems like Leopard should be able to be made to work. I'm gonna go step by step here with my install process in the hopes that if I'm doing something wrong someone will be kind enough to catch it. Thanks for bearing with me.
    I've installed Leopard Server 10.5 (Mirror Door G4, FYI) with the built-in ethernet connected to my Comcast router (with a static external IP). Immediately after 10.5 installs I restart and update everything to 10.5.2, then I install a Sonnet Gigabit NIC and its drivers, and assign it 192.168.3.1, where it will live as my internal router, server, etc. I turn on DNS and set up an internal ".lan" zone that resolves to 192.168.3.1. Pop into Terminal and confirm that rDNS is in fact working; it is. And check that "changeip -checkhostname" resolves itself correctly (to the external IP).
    Next, turn on the NAT service and run the gateway setup assistant. After a reboot I quickly check that my internal clients with static IPs (192.168.3.10, .20, etc.) are working and pulling DNS OK; they are. Jump into the Firewall, and for the moment just open it wide up by accepting all connections. At various times during testing I've configured the firewall to exactly match my 10.4 Server firewall, but for the time being I can just leave it open. I create a Firewall group to cover my 192.168.3.x internal network, and another covering 192.168.3.60/29 for the VPN service I'll set up in a sec. Jump over to the DHCP service, where by default gateway setup creates a 192.168.1.x DHCP zone. I delete that and create a new 192.168.3.x zone covering .50-.59. Turn DHCP on and confirm it's working; good, it is.
    Now, here is where the VPN fun begins. The last service I turn on is the VPN service (I've alternatively tried letting Gateway Setup activate it, and just doing it myself, with this same result). I configure it to accept L2TP at 192.168.3.60 - .63. Like I said this is a home server, so I don't need a lot of VPN connections. Finally, when I test the VPN from a 10.5.2 client (a MacBook coming in off a neighbor's open wireless network with a 10.0.0.x address) I am able to connect, and I can see/ping/mount/share screen on the server. I can also ping the attached VPN client at 192.168.3.60 from the server. However, I cannot ping or see (in ARD) any other machines on the internal network from the attached VPN client. Likewise, from one of the internal systems, say my Mac mini at 192.168.3.10, I cannot ping the attached VPN client at 192.168.3.60. Out of curiosity I've tried doing a rDNS lookup while attached to the VPN, and the client isn't able to resolve any of the internal DNS entries.
    So, what gives? As I've mentioned I have exactly this same setup working just fine with Tiger Server. Same NAT, same Firewall, same DNS, and same L2TP VPN setup. For the life of me though, I cannot get attached VPN clients to see the internal network when I put Leopard Server in place. Clearly the internal DNS isn't working for attached VPN clients, although I'm not certain if that is a cause or a symptom. I've setup a network routing definition for the internal private network, which didn't help. I also tried setting up PPTP instead of L2TP, and had the same problem.
    Is anyone having similar problems with Leopard Server's VPN service? If not, could someone hit me with the clue stick and set me right? As I said, in the grand scheme of things this isn't a big deal for me. But, it's just frustrating that I can get so close to updating my home server and just fall short.
    Thanks!

    Your post actually contains the Key to solve the problem and there is not really a big need for going all the way to use the Property List Editor to fork around /etc/ipfilter/ipaddressgroups.plist.
    There has been much written on this problem, but basically most of it is trial and error (that includes my own findings in this post, too). Still, I think I can further narrow down what CAUSES this problem and how to fix it.
    First off, we are talking about a combination of NAT (Network Address Translation - bridging an Internet connection on an external network card over to an internal network card), Firewall (which is needed in OS X to have NAT working, because the Firewall "helps" NAT by doing its job), DHCP (for providing dynamic IP addresses to clients on the internal network - don't confuse things: DHCP is not providing this service to the VPN clients, that is done by the VPN server), and - last but not least - VPN to provide access not only to the server but to any machine on the internal network over the outside network card (aka giving remote clients a chance to connect to the local network over the public Internet in a safe and nice way).
    OK. The short story: you can do it ALL in Mac OS X 10.5's Server Admin tool. If it fails it is nearly always the Firewall!
    You can check if this is the case for your setup by temporarily opening the Firewall up to not block any traffic: in Server Admin, click on Firewall -> Settings -> Services -> Edit Service for: any, click "Allow all traffic from 'any'", and save it (and to be 100% sure, stop and restart the firewall). If your clients can NOW connect at least to the server, it was the firewall. Now don't forget to switch off allowing all traffic from any, or you will be left with an open-door server ready for anybody to explore.
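    (A quick way to see what the GUI actually did: the Server Admin firewall pane is a frontend to ipfw, so running
    sudo ipfw list
    in Terminal prints the live rule set; with "allow all" enabled you should see the permissive rule near the top. The command is standard on 10.5, though the rule numbers it prints will vary from setup to setup.)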
    Now what goes wrong in the first place? It appears that the GSA (Gateway Setup Assistant) that is "hidden away" in the NAT settings does something awfully wrong. It will set up all the address groups in the firewall: the any group will remain as it usually is, another one defining the internal network, and one called VPN-net for VPN.
    What it DOES do wrong here (I am no firewall expert, this is purely trial and error, so please anybody do explain!) is to give the VPN-net exactly the same address range as the internal network. And here seems to be the overall problem.
    When Twintails wrote to add 192.168.3.60/27 as address range for VPN, I realized what he/she did. Writing 192.168.3.60/27 effectively narrows down the address range starting at 192.168.3.33 up to 192.168.3.62. There are millions of subnetmask calculators out on the net, give it a try e.g. here: http://www.subnet-calculator.com/
    So, I looked at what range of addresses will actually be given out by the VPN server to VPN clients upon connection. Of course you need to make sure that this address range is NOT given out by your DHCP server.
    In my setup, the server is 192.168.1.1 and the DHCP server provides addresses from 192.168.1.10 up to 192.168.1.127 (I start with 10 because I have some static addresses for special purposes from 192.168.1.2 to 192.168.1.9). So this means anything above 192.168.1.127 is potentially "free" for my VPN connections.
    Next I used the subnet mask calculator to find a narrow address group that matched my purposes. I found 192.168.1.192/26, which effectively gives me a range from 192.168.1.192 to 192.168.1.255 (which is in fact more than I have clients connecting from externally!).
    I went to the Server Admin Tool, and clicked Firewall -> Settings -> Address Group and edited the VPN-net one. First I deleted what was in "Addresses in group" and entered from scratch 192.168.1.192/26. Next - just to make certain because basically this is what Twintails had in his/her post by saying to add a name String with exactly the same information - I overwrote VPN-net by 192.168.1.192/26 and saved. (I THINK that this last step might not really be needed, but I haven't tried).
    Next click Save. (Basically it should already work, but I always want to be extra sure, so I stopped and immediately thereafter started the firewall again to be 100% certain all new rules are now active.)
    And now: it works! Clients can access the server AND the entire local network from remote using VPN.
    One last comment: I have the feeling that (although less safe and less advanced technologically) PPTP works much better for us than L2TP. So I have switched off L2TP support altogether because it simply NEVER really worked. We are using Mac OS X 10.4 and 10.5 to connect to the 10.5 server using this setup.
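    If you want to double-check a candidate VPN range without a web calculator, Python 3.3+ ships an ipaddress module that prints the boundaries for you (the range below is the one from this thread; substitute your own):
    python3 -c "import ipaddress; n = ipaddress.ip_network('192.168.1.192/26'); print(n.network_address, n.broadcast_address, n.num_addresses)"
    That prints 192.168.1.192 192.168.1.255 64: the /26 spans the 64 addresses from .192 through .255, matching the range used above, and it makes it easy to confirm the block does not overlap your DHCP pool.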

  • How do I share an aperture library with multiple users?

    I have Aperture on a MacBook Pro that is my wife's primary computer. I purchased an iMac for our family and to get our pictures onto a static desktop. I unintentionally migrated her as a user to the family iMac. This is fine, as my primary goal was to get the pictures over. However, I want her to have her own view and another for myself, while both of us still have the same view of Aperture/our pics. Is there a way to do this?
    Thank you.

    Like Frank said, Aperture is a single-user app. Multi-user setups are all workarounds that can have dire consequences. Best is if you think single user and plan your setup accordingly.
    You should have one primary Referenced-Masters Library containing references to all Masters. Avoid using a Managed-Masters Library for many reasons (but that is another topic). The Library should live on an internal drive and no attempt should be made to network it (because Aperture is a single-user app).
    • One option (recommended) is to maintain the primary Library on computer A and periodically make copies of the primary Library for read-only usage by computer B (like Frank said, read-only usage cannot be set so it needs to be workflow based). Computer B can also have its own fully separate read and write Library, the contents of which are periodically added to the primary computer A Library.
    • Another option (not recommended) is to have one Library and move it around depending upon which computer is using it.
    • Another option is to maintain the Library on computer A and export Projects or Albums as needed for use by computer B.
    Original images should be backed up prior to being imported into Aperture or any other images app. Note that use of a Referenced-Masters Library makes all the options above more feasible because the Library stays a manageable size, whereas with a Managed-Masters Library the Library invariably grows to an uncivilized size.
    Set Preview size to be the size of the largest display among the multiple computers.
    HTH
    -Allen Wicks

  • Can't boot Arch from external HDD

    Hi, well first off this is not my first Arch Linux installation; I've been using Arch for a little over a year now (coming from Slack). But this is my first attempt to have Arch on an external HDD.
    Alrighty, the situation is as follows: a rather new computer (supports booting from USB devices, and the BIOS is set to boot from removable devices first) with a built-in HDD and an external HDD that's connected to that computer via USB.
    What I did: I connected the external HDD to the computer and booted off a 2008.06 Overlord core CD.
    Arch-Live recognized the internal HDD as sda and the external HDD as sdb.
    I partitioned the external HDD using cfdisk and ran mkfs.ext3 on it (I didn't use any switches with that).
    I started the installer, set my mount points (I should mention I'm not gonna use a swap partition here), installed the packages, well just went through the installation routine and installed grub in the MBR of sdb (the external HDD).
    Then I rebooted. And this is what doesn't work: when I boot that computer with the external HDD connected, the computer completely hangs right before grub would come up. It freezes completely, ctrl+alt+del doesn't work, and I need to use the power switch to reboot the computer.
    So I put the Arch CD back in the CD drive, trying to boot into my Arch system on the external HDD, and started typing:
    root (hd1,0)
    kernel /boot/vmlinuz26 root=/dev/sdb1 ro vga=773
    initrd /boot/kernel26.img
    When I try to boot off that, grub tells me that there is no such device as hd1,0. The funny thing is that the auto-completion in grub works for the kernel /boot/vmlinuz26, but it doesn't for the device /dev/sdb1; in fact even root=/de<tab> returns an "unrecognized string" message.
    So this is what I did, and I can't boot off my external HDD; neither can I boot into my system on the external HDD from the CD-ROM.
    What am I missing here?

    The hard drive you boot from is in my experience always hd0. Could this mean your external disc is hd2?
    Have you tried chainloading from the installer cd to your external disc?
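    Two things worth trying from the grub shell on the install CD (grub legacy syntax; the drive number is whatever grub reports, not necessarily hd1):
    grub> find /boot/vmlinuz26
    grub> root (hdX,0)
    grub> kernel /boot/vmlinuz26 root=/dev/sdb1 ro vga=773
    grub> initrd /boot/kernel26.img
    grub> boot
    find searches every drive grub can see for that file and prints the (hdX,Y) it lives on, which settles the hd0/hd1/hd2 question; substitute that value for hdX above. To chainload the MBR you installed grub to instead, try rootnoverify (hdX) followed by chainloader +1 and boot. Also note that grub passes root=/dev/sdb1 to the kernel verbatim and never resolves it itself, so the failed tab completion on that argument is harmless.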

  • External drive not recognized (v.1.5.6)

    After flawless performance for the last year, Aperture has lately stopped managing referenced files very well. My image library is far too big to live on my internal drive, so referenced images are essential for my workflow.
    It began with the "image loading" bug that's discussed in other threads, but now it's failing to recognize my external FireWire drive -- the drive is mounted and accessible from the desktop, but shown as offline by the app, or curiously shows some albums online and some off in the same project referenced to the same drive.
    I've tried using the utility to 'manage referenced files', but the button that should facilitate the mounting of an offline drive has no effect.

    Hello, go2post.  
    Thank you for visiting Apple Support Communities.  
    I understand that your external hard drive is no longer being recognized by your iMac.  Here are a couple of articles that will get you started troubleshooting this issue.  
    OS X Yosemite: Check your device’s USB connection
    OS X Yosemite: If a USB device isn’t working
    Cheers, 
    Jason H.  

  • How to use itunes with music all on external drive?

    I have a PC laptop with iTunes installed. I have a LOT of music, all organized neatly on an external drive.
    The iTunes folders obviously live on the internal drive, but the music is all on the external (not enough room on the internal).
    I can dump the folder from the external drive onto iTunes, and all the music is there - great.
    But I don't ALWAYS have that drive connected, and sometimes I open iTunes when it's not - for my phone sync or to buy something from the iTunes store.
    I have done this for years, and only in more recent years has this become a problem: when I connect the drive, iTunes has lost all the links to the music on it. I go into "preferences/advanced" and tell it to look at the folder on the external drive, and it looks like it's re-establishing the links - but it still does not. This only used to happen rarely, and I would have to clear iTunes and then drop the folder again - but now it does it every time that I re-attach the drive after I have used iTunes while it was NOT attached.
    I have to clear all my music and then drop the folder again, which takes a while - I have tens of thousands of songs.
    Please help!
    Thanks!

    Try the instructions for consolidating your library from the links below and see if that helps. I have noticed that, especially with a huge library, it gets hard for iTunes to keep information about different locations unless you have the external drive plugged in permanently.
    http://support.apple.com/kb/PH1459
    http://support.apple.com/kb/ht1449

  • Advice needed for provider hosted web application - authentication and access to SharePoint document library

    I haven't done SharePoint 2013 development with claims so I apologize in advance if my assumptions and questions are way out in left field.
    I'm trying to understand SharePoint 2013 claims authentication for a scenario that involves:
    A SharePoint provider-hosted (web forms) app that will pull information and assets (e.g. PDFs) from SharePoint into the web page.
    It will be a VS 2012 solution with the asp.net.identity feature.
    Security will be set for internal users, federated external users, and forms-based external users.  Based on their security and (claim type) role, it will define what information and assets can be retrieved from SharePoint.
    I have looked through MSDN and other sources to understand.
    This one helped with my understanding: Federated Identity for Web Applications. I assumed that the general concept could be applied to forms-based identity for non-federated external users.
    What I have now:
    A VS 2012 web forms application set up as provider-hosted, with the asp.net.identity feature and its required membership tables.
    I can create new users and associate claims to the new user.
    I can log in with a user from the membership tables and it will take me to a default.aspx page.  I have added code to it that displays the claims associated to a user.
    For POC purposes I'd like to retrieve documents that are associated to this user from the default.aspx page.
    This is where I am having trouble: is my understanding correct?
    Internal users
    Since they are internal on the network, I am assuming that they would already have access to SharePoint and would already be configured for what documents they have available to them.
    Federated external users & Forms authentication external users
    It seems to me that the authentication for external users is separate from the SharePoint authentication process.
    Changes to the configuration settings are necessary in SharePoint, IIS, and the web application.
    I believe this is what I read.
    Claims processes (e.g. mappings) need to be set up in SharePoint.
    As long as external users are authenticated then things are OK, because they would have claims associated to the user and the configuration in SharePoint takes care of the rest.
    This statement bothers me because I think it's wrong.
    So basically I'm stuck on whether my understanding is correct: once a user is authenticated, either by federated identity or asp.net.identity authentication, it should go to the provider-hosted default.aspx page, because the claim is authenticated and that means it should have access to it and to the SharePoint document library based on some claim property. I could then write the calls to retrieve from a document library, and SharePoint will know, based on some claim property, that the logged-in user can only access certain documents.
    It just sounds too good to be true, and I suspect I'm missing something in the thought process.
    Thanks in advance for taking the time to read.
    greenwasabi

    Hi GreenWasabi,
    I agree this is an interesting topic to discuss.
    As you can check from the article, you may also look at this example from CodePlex: http://claimsid.codeplex.com/
    Thinking about this topic, it looks like an environment with multiple realms.
    From what you describe, it is correct that all the authentication is based on the provider. So, for example, if I have a Windows Live ID and an internal ID, then when I log in with the Windows Live ID, it will be authenticated by the Windows Live ID server.
    Here is the example for the web service:
    http://claimsid.codeplex.com/wikipage?title=Federated%20Identity%20for%20Web%20Services&referringTitle=Home
    As far as I know, if you are using federation, I am not quite sure that you will need to go to the provider page literally; perhaps you can check this example using Azure:
    http://social.technet.microsoft.com/wiki/contents/articles/22309.integrating-windows-live-id-google-and-facebook-accounts-with-sharepoint-2013-white-paper.aspx
    Regards,
    Aries
    Microsoft Online Community Support
    Please remember to click “Mark as Answer” on the post that helps you, and to click “Unmark as Answer” if a marked post does not actually answer your question. This can be beneficial to other community members reading the thread.

  • I try to download OS X Mountain Lion and keep getting an error saying, "The product distribution file could not be verified. It may be damaged or was not signed." Please help!


    If you have Intego VirusBarrier X6 installed
    1. Open /Applications/VirusBarrier X6.
    2. Click on the "Surf" button.
    3. Click on the "Ad Banner Filter" tab.
    4. Click "OFF" next to "Banner advertisement filter".
    5. Start the computer in Safe Boot, then restart the computer into normal mode.
    If you don't have Intego VirusBarrier installed
    1. Verify that the date and time is correct on the computer.
    2. Safe Boot once, then restart normally, then try downloading again.
    3. Try with a different user account if the issue persists
    If the issue persists
    Install Lion onto an empty partition (internal or external).
    Note: you can use Disk Utility to live-partition the internal drive and create a second partition large enough to install Lion.
    Start from the clean installation of Lion.
    Download Mountain Lion from the Mac App Store. Note that the original startup disk can be set as the destination for Mountain Lion during the installation process.

  • Backing up Aperture to External Hard Drive

    Can anyone help please? I want to use an external hard drive to store my photos on. I use Aperture 2 to download and adjust my photos on my MacBook, but obviously I don't want to use up all the hard drive on my MacBook. I want to use the external drive to work with my Aperture images.
    Can this be done, and if so how do I do it? I already have one external drive for Time Machine, but that does not seem to have saved any of my images.

    IMO what is generally most useful, to help keep drives underfilled and fast (drives slow as they fill), is to manage by reference ("referenced images") as in the workflow outline below, where Master images can live anywhere.
    What this means is that the Library that manages the show and keeps any edits you make lives on your internal hard drive for best performance. To avoid overfilling (more than ~70% full) your internal drive, Master images live on external hard drives and are Referenced by the Library during editing. By generating screen-resolution-sized Previews you can still look at full screen images when the external hard drives are not connected.
    I feel pretty strongly that card-to-Aperture or even camera-to-Aperture handling of original images puts originals at unnecessary risk. I suggest this workflow, first using the Finder (not Aperture) to copy images from CF card to computer hard drive:
    • Remove the CF card from the camera and insert it into a CF card reader. Faster readers and faster cards are preferable, and Firewire is much preferable to USB2.
    • Finder-copy images from CF to a labeled folder on the intended permanent Masters location hard drive.
    • Eject CF.
    • Burn backup hard drive or DVD copies of the original images (optional recommended backup step).
    • Eject backup hard drive(s) or DVDs (optional recommended backup step).
    • From within Aperture, import images from the hard drive folder into Aperture selecting "Store files in their current location." This is called "referenced images."
    • Review pix for completeness (e.g. a 500-pic shoot has 500 valid images showing); a quick file-count check is sketched below.
    • Reformat CF in camera, and archive DVDs of originals off site.
    Note that the "eject" steps above are important in order to avoid mistakenly working on removable media.
    It works equally well to allow Aperture to keep Masters in the Library itself ("managed images"), but at the cost of a rapidly growing Library size that ultimately outgrows the fast capacity of a single hard drive. The difference is that with managed images Vault backups include the Masters, while with referenced images Masters are maintained outside of Aperture and are not backed up in Vaults.
    The Aperture Help Menu has more on this if you search on managed images and on referenced images.
    -Allen Wicks
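    One low-tech addition to the "review pix for completeness" step above: count the files on the card and in the destination folder before reformatting (these paths are purely illustrative):
    ls /Volumes/CF_CARD/DCIM/100CANON | wc -l
    ls /Volumes/Masters/2008-06-shoot | wc -l
    Matching counts show the Finder copy is at least complete; they say nothing about corruption, which is what the redundant backup copies are for.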

  • Batch Resizing within Aperture?

    Greetings-
    I shoot 400-600 images a day, thus I'm beginning to run out of hard drive space. I don't want to create a new vault of pictures on an external drive.  I'm willing to reduce the overall size of my photos in order to create some space on my drive.
    I'm at a loss how to batch resize groups of images in Aperture. It was an easy function in iPhoto.

    Spuds'n'Surf wrote:
    I don't want to skip a step, lose sharpness, image cropping/adjustments, or mess up my project/folder/album file structure in this process.
    What's the best way to do this and can it be done safely en masse? (We're talking about 120,000+ images)
    You just perfectly described a Referenced-Masters workflow. It works great for 200k+ images: the Library with its Previews lives on the internal drive and is always accessible; Masters live on external drives. The Library is backed up via Vaults, and originals are backed up to redundant locations using the Finder before import into Aperture.
    Personally I have images managed on the internal SSD until editing is complete then convert to Referenced-Masters.
    IMO referenced Masters make far more sense than building huge managed-Masters Libraries.
    • Hard disk speed. Drives slow as they fill so making a drive more full (which managed Masters always does) will slow down drive operation.
    • Database size. Larger databases are by definition more prone to "issues" than smaller databases are.
    • Vaults. Larger Library means larger Vaults, and Vaults are an incremental repetitive backup process, so again larger Vaults are by definition more prone to "issues" than smaller Vaults are. One-time backup of Referenced Masters (each file small, unlike a huge managed-Masters DB) is neither incremental nor ongoing; which is by definition a more stable process.
    Managed-Masters Libraries can work, but they cannot avoid the basic database physics.
    HTH
    -Allen

  • iPhoto library is tooooo big and klunky - help!

    My iPhoto library package is 99.18 gb, for about 42,000 images. It's just too slow to load or work with, and it slows everything else down. Plugging in my iPhone means waiting 5 or more minutes for iPhoto to get going and read the photos on the phone before I can do anything else. Same for offloading photos from my camera.
    I use Blurb to make photobooks which requires the iPhoto library to live on the internal HD so I can't move this off to an external drive.
    I am concerned about how to get this under control, speed things back up to normal and keep everything working well. I am very worried about any corruption of the metadata. I used to have an automator db backup written by someone on this forum (Old Toad? Apologies for forgetting who made the script)... originally that worked but it no longer performs the db backup.
    Any suggestions from you iPhoto powerusers? I'm at a loss.
    Thanks for any advice.

    I've got a 320 GB HD with 35 GB free, and I think it's 2 GB of RAM (at work now and can't remember exactly...)
    Well that's plenty of disk space.
    For metadata backups in iPhoto '09, where does that get stored?
    Same as in every version of iPhoto - in the main Database file. Added metadata only gets written to the file on export.
    As for Blurb, I used to have my iPhoto library on an external drive but Blurb has a functionality that opens the library from within their bookmaking software, which is great, but if the library is on an external drive it can't find it. No option to 'point' to it either. That may have changed in more recent iterations of their software, but I am not aware of that so far.
    I would certainly check for an update. If they are doing this they are not using an approved API for integrating with iPhoto, and they should. The location of the Library makes not a whit of difference to any other app and shouldn't to these guys either.
    Regards
    TD

Maybe you are looking for

  • How to call PL/SQL function from JSP ?

    We have the JSP application developed using JDeveloper 3.0. I am trying to call a PL/SQL DB function. I'm trying to use the ApplicationModule method: .getTransaction().executeCommand(sCommand). The problem is that I cannot get the function r

  • Calling Javascript function from PL/SQL Process

    I am new to APEX and Javascript so please forgive my question. I have a process on a page which successfully runs a few procedures etc., but now, as part of this process, I want to call a Javascript function I have typed into the HTML Header. My questio

  • How to add image in the background of textarea?

    Hello friends, I am confused about sticking a PNG image as the background in the textArea, which has a black background by default. If anybody has a solution then please help me out... thanks, kuldeep

  • I do not want to use Preview - how do I replace it with Adobe

    When I receive a PDF as an attachment in Mail, I cannot read the contents - it is all gobbledygook. I want to make Adobe my default. I have used File > Get Info, then selected a PDF file on my desktop and selected Open With Adobe for all files. Howev

  • Help!! How do you build the SWF files?

    Hi, since I am new to Flex & Flex Builder 2, I have tried to recreate an empty SWF file, similar to the sample file "EmptySwf.swf". However, when I replace the default EmptySwf.swf with mine, I keep getting: TypeError: Error #1034: Type Coercion fail