Does Aperture Benefit from RAID?

I was just thinking about getting a RAID-5 external enclosure that would offer high performance and protection from a single drive failure. I would use this new drive for Final Cut Pro, music files, and photos (Aperture)...
So, my question here is: does Aperture significantly benefit from the speed of a RAID? Please comment on the speed benefit you have witnessed going from a single (high-speed SATA) drive to a RAID.
Thanks,
Robert

Complete Newbie wrote:
Aperture doesn't appear to me to be disk-bound. You get redundancy (and hence more reliability) from a RAID-5 system, but you won't see much increase in the speed of the app. Aperture's problem is that a single 12 Megapixel image takes ~ 100MB of RAM...
Working on a properly configured Mac Pro platform, the workflows of apps handling large quantities of images often are "disk bound." Fast drives and various RAID array configurations can be very beneficial. Optimizing each drive setup, however, is very individual and indeed a challenge.
Also, RAM usage IMO is not a problem; RAM is cheap. Any new MP setup should have at least 8 GB RAM (US$450) added to it, using DIMMs of at least 2 GB each.
-Allen Wicks

Similar Messages

  • Does cs4 benefit from 64bit OS ?

    hi there,
    since cs4 doesn't work on my current machine (intel p4 3ghz, 2gb ram, winxp) i'll upgrade to a new one (probably core2duo 3ghz, 4gb ram).
    my question: should i use windows vista 64bit? does cs4 benefit from a 64bit system? or would vista just eat additional resources?
    thanks

    Well, most of CS4 (which suite are you getting?) should work on your current machine, albeit not as 'fast' as on a modern Core 2 Duo/Quad or AMD equivalent. I'd be tempted to advise you to double your memory; however, as you're upgrading to a new system, there's no point upgrading your old PC.
    CS4 will benefit from Vista 64 bit, mainly in the area of Photoshop! PS now comes in 2 flavours, a 32 bit version and a 64 bit one. Only Windows users benefit from a native 64 bit version of PS, by the way!
    The other programs will not benefit much, if at all, from Vista 64 bit right now.
    Yes, Vista DOES take up considerably more resources than XP. That said, most modern PCs can handle Vista's extra demands with ease.
    Remember how many complained a few years ago about XP being a 'resource hog' with its 'huge' 1GB disk space and 512MB memory demands....
    Ian

  • Does Compressor benefit from a RAM upgrade?

    I have a dual 2.0 G5 cluster node Xserve that I want to use to do full-time Compressor tasks; mostly MPEG-2 and h.264 transcodes. It will have an attached XRAID as well. If I max out the RAM will I notice a significant decrease in render times on this machine?

    I doubt it. Compression is mostly reliant on processor speeds.

  • Why does Aperture 3 not process RAW files from the Leica D-Lux 5?

    Hello:
    The purchase of my Leica D-Lux 5 came with LR 3. All raw files can be processed in that application. However, I also have Aperture 3 and it will not process RAW files from the D-Lux 5. I hope that Apple will soon address this inconvenience. Has anyone else had the same problem?
    Hoping for a solution; thanks, John Basso. Mac OS X 10.6.7

    I can tell you that it does process them from the Panasonic LX5 - I have one. The files are probably very similar, with slightly different processing metadata and a different tag. It is possible to change the tag in the RAW file prior to import so that it "fools" Aperture into thinking the RAW is from a camera that is supported. I have done this with other cameras where the RAW files were really the same but the camera was not yet "officially supported" at the time I bought it.
    I vaguely remember doing it for the E-P2, or was it the E-PL1? I don't remember, but there are plenty of tools that will allow this - Google is your friend.
    Just make sure that your install of Ap3/OS X DOES actually have support for the LX5... It might work, but I have not tried it.
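    The retagging step above can be sketched with exiftool. This is a hedged sketch, not the poster's exact procedure: the "-Model=" and "-overwrite_original" flags are standard exiftool syntax, but the exact model string Aperture matches on ("DMC-LX5" for the Panasonic LX5) is an assumption - compare against a RAW file from a supported camera first, and always work on a copy of your original.

```python
# Hypothetical sketch: build the exiftool command line that overwrites
# the EXIF camera model tag so Aperture treats the RAW as coming from a
# supported camera. The model string "DMC-LX5" is an assumption.

def build_retag_command(raw_path, new_model="DMC-LX5"):
    """Return the exiftool argument list that rewrites the Model tag."""
    return [
        "exiftool",
        f"-Model={new_model}",    # overwrite the camera model tag
        "-overwrite_original",    # don't leave a *_original backup file
        raw_path,
    ]

# To actually run it (requires exiftool on your PATH), something like:
# import subprocess
# subprocess.run(build_retag_command("copy_of_my_raw.RWL"), check=True)
```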
    RB

  • How does the production system benefit from user-exits?

    How does the production system benefit from user-exits?

    and it is not the production system that benefits. It's the company and the people working with SAP that benefit from the user exits, which allow SAP to be tailored to company-specific situations.

  • Why does aperture convert my raw files to jpegs when I import them from iPhoto?

    Why does aperture convert my raw files to jpegs when I import them from iPhoto?

    It doesn't.
    When you import Raw to iPhoto, the app automatically makes a JPEG preview of the Raw. When you import from iPhoto, Aperture then brings over both versions.
    Regards
    TD

  • HT4007 Does Aperture support images and videos from the new 3D cameras?

    Does Aperture support images and videos from the new 3D cameras?

    If the camera is listed here (OS X Lion: Supported digital camera RAW formats) or here (Mac OS X v10.6: Supported digital camera RAW formats), then the raw image file is supported by Aperture.
    If the camera is not listed, then, as Keith wrote, if it produces JPEGs they can be imported into Aperture.
    If you told us what camera you were interested in it might help.
    regards

  • I am trying to import an Aperture library from a MacBook to a new iMac; the import window sees the external hard drive but does not see the library. I can open the library as a referenced library but cannot import onto the hard drive as managed files. Tried to do as a backup t

    I am trying to import an Aperture library from a MacBook to a new iMac; the import window sees the external hard drive but does not see the library. I can open the library as a referenced library but cannot import onto the hard drive as managed files.

    I may be misreading what you are trying to do, but you don't import libraries via the import window. You use File->Import->Library...
    If this doesn't resolve your problem post back with more detail of what you are doing and what is happening.
    regards

  • Why does Aperture (OS X 10.7.5) deleting an image from an album sometimes delete the master in the project?

    On occasion, when you delete from an album, Aperture deletes the masters and all versions. This absolutely should not happen. If you create an album of selected images and, when you have finished, delete the album as no longer needed, it deletes certain masters connected to that album as well. At the moment you can make an album from selected images but never dare delete it, as you have no idea what you will lose!
    Getting fed up with Aperture!!

    Well, if we can catch you in a welcoming mood, let us see if we can't pro-actively correct some other user errors, just in case.
    A simple place to begin is my concise guide to the parts of Aperture.
    Here are some other resources of note.
    Aperture is a fantastic program.  It is, however, almost certainly not, at heart, like any other program you've ever used.  Understanding the differences will certainly keep you from getting fed up with the program.

  • Does Aperture need to be open to import from photo stream?

    Hi,
    My photostream is linked to Aperture.  Does Aperture need to be open to import from photo stream or will it still import while it's not running?
    Thanks!
    Mavericks 10.9
    Aperture 3.5

    Might seem silly, but is there any way to request a daemon be created to handle this magically in the background? Or open and push iPhoto to the background so I don't know it's running?

  • Can't boot from Raid 0

    I have a pair of Raptor 74 HDs in a striped array. The OS (10.4.8) is currently on a Raptor 150. I cloned the boot OS to the raid array with Superduper (make bootable option set). The clone appeared to go fine but it won't boot.
    Have I missed an option somewhere?
    Thanks
    Mike

    Are you kidding me landrefl?
    No bootcamp
    I can understand this one although I would have thought the performance gains of RAID would far outweigh the 50€ or so you'd need to buy a separate drive for Boot Camp.
    No EFI updates
    So what you're suggesting is that Mike, and anyone else reading this, forgo RAID, something you benefit from every minute your Mac Pro is turned on, for something you might do for maybe 10 minutes a couple of times a year? That's like saying modern watches are bad because you need to change their batteries.
    Again, for the small price of an additional drive to have a spare single drive boot you'd be mad to give up RAID for this reason.
    Performance hit as Raid is done by the computer CPUs, not the disk controller
    As far as I know, and I could be wrong on this, the Mac Pro actually utilises the RAID 0 and 1 support that is provided, in hardware, by the Intel 5000X chipset.
    Even if I'm wrong on this, the additional performance gained by going RAID far outweighs any potential CPU usage it might incur. Let's face it, if your drives are working hard enough to incur a noticeable CPU penalty, I'd rather have the greater drive performance any day, as the drives will be your bottleneck and not any CPU usage that might result.
    For data protection, your best bet is still doing backups
    So how is that different from a single drive? Last time I checked, when a drive failed the result was the same, RAID or not. While RAID 0 does increase the chance of a hardware failure, it does not change the chance of a software issue such as drive catalogue corruption, nor does it change the outcome in those instances.
    Regular backups are recommended with or without RAID.

  • How do I know if my MBP will benefit from heatsink paste reapplication?

    Dear Mac Users
    I know the theme of hot MacBook Pro's has been done to death, but I would like to add my two pennyworth with respect to the specific question: How do I know if my MBP will benefit from reapplication of the heatsink compound? I have trawled numerous very long threads here and not found a satisfactory answer. The reapplication of the heatsink paste is a bit of a schlep and not without some risk of breaking it, so it would be good to know if the benefit is worth the risk.
    One common question is: "Are my MBP temperatures unusual?". A useful mac temperature database can be found at the following link:
    http://www.intelmactemp.com/list
    Consulting this list should set some parameter space for many users - there does seem to be considerable variability - suggesting highly variable quality of heat sink paste application by Apple. In my particular case I have an early 2011 MBP (i7, quad core, 2.2GHz). This certainly ran very hot. I even had a burn on my thigh from it, which took a while to heal. I know, they are notebooks, not laptops, but still . . . . To put very hot into perspective, the CPU was sitting at over 60C at idle, and the whole case would be hot to the touch, such that resting your hands on it was uncomfortable. And yes, the computer was idling; there were no rogue processes chewing up cycles. At full load, the CPU would run at 90C and the whole thing would get very hot. I avoided running major projects or rendering jobs on it to preserve it from heat fatigue - the warranty is now up.
    My first attempt at reducing the temperatures, was to clean it. The clue was in the USB/Firewire etc ports. All were very fluffy, which suggested that there was a lot of crud in the heatsink fins. Fixing this was simple; adopt anti-static precautions; remove the back panel; blow air into the exhaust vents at the base of the screen. Numerous large fluff bunnies popped out of the fans and elsewhere. Alternatively, you can unscrew the two fans and partially lift them out (without removing the power cables to the motherboard). Major fluff on the heatsink fins can then be tweezered out, prior to blowing it through, to ensure all debris is removed. I did this subsequently on my Core 2 Duo MBP. After defluffing the idle i7 CPU temperatures dropped from 60C to 43C, but perhaps of more importance was that the whole case was much cooler and the keyboard temperature was close to ambient - much nicer to use.
    So that fixed the major heat problem. However, the question remained, would I see any further improvements by stripping the thing down and reapplying the heatsink paste? What I was looking for in this forum was a rule of thumb based on temperature measurements to help me make an informed decision on whether to strip the thing down. As my machine was out of warranty, I wasn't worried about voiding that. Also, where I am there is zero support (Africa), so Genius Bar or similar was not an option. But jiggering a 14-month-old (very expensive) laptop was not something I wanted to do for thrills. The data in many of the posts did not provide any answers. In the end I made some measurements and decided to do the fix. The step by step instructions I followed were very clear and can be found for many Macs/models at:
    http://www.ifixit.com/Device/Mac
    This guide made life a lot less stressful, as pulling a connector the wrong way could trash the motherboard, and those connectors are so very tiny and fragile. It is very important to take it slow and follow each step to the letter (and not skip steps inadvertently). My before and after heatsink paste fix temperature data are shown below. The idle temps are improved, while the full load temps look like a marginal improvement. However, CPU temperature is not the only story. Certainly after the fix, the whole case is generally cooler, so the heat extraction system is working better. Also, the CPU before and after temps are quite similar at full load, but that assumes the CPU is running at the same power. I do not know enough about the i7 architecture to say if the before speed was being limited by heat, while in the after case the turbo-boost was able to run out to the maximum. Unfortunately, I didn't do any CPU performance tests. So, based on temperatures alone, it looks like a marginal improvement, but it is certainly making a difference to case temperatures.
    Note all temperatures were measured at an ambient temperature of 20C on a flat, hard surface using Marcel Bresink's Temperature Monitor. Fan speed was measured using the Fan Control software - set to allow firmware control at idle.
    Apple original heat sink paste (big dollop of grey crud and lots of extrusion at the sides):
    Idle:      CPU 43C | GPU 38C | Heat Sink 2 36C | Heat Sink 3 35C | Fans 2000rpm
    Full load: CPU 89C | GPU 57C | Heat Sink 2 52C | Heat Sink 3 49C | Fans 6200rpm
    (Full load = all cores maxed with a BOINC distributed computing project: Rosetta - the project, not the Apple technology.)
    Arctic Silver heat sink paste:
    Idle:      CPU 38C | GPU 34C | Heat Sink 2 33C | Heat Sink 3 32C | Fans 2000rpm
    Full load: CPU 86C | GPU 59C | Heat Sink 2 50C | Heat Sink 3 50C | Fans 6200rpm
    Back to my original question: How do I know if my MBP will benefit from reapplication of the heatsink compound? I think temperatures alone will not give a good answer - mainly because the CPU probably has variable output, due to turbo boost. One thing I did notice is that the temperature response of the CPU when it is switched to full load varies according to the quality of the heatsink paste. With the original Apple paste, when I switched the CPU to full load (from idle) the temperature of the CPU went up to its maximum almost instantaneously and stayed at that level, i.e. the graph of temperature vs time was a step function. This suggests that the CPU may be controlling the temperature by throttling itself; otherwise, as the heatsink warmed up, the CPU should also rise in temperature - mine stayed bang on 90C and fluctuated only by a degree either side. After reapplication of the heat sink paste, the CPU has a much larger effective thermal mass and consequently heats up much more slowly. Turning the CPU up to maximum from idle resulted in the temperature climbing to a maximum over about 3-4s. After peaking at around 92C it dropped back, as the fans kicked in, to around 86C.
    So perhaps one way of assessing the quality of your heatsink paste is to ramp the CPU to maximum from idle and look at the shape of the temperature profile. A step function suggests a lousy job and benefit may be had from reapplication. A more gently sloping profile, followed by a dip due to the fans kicking in, may suggest you are in good shape. Obviously, this presupposes that your Mac temperatures are not insanely hot to start with (i.e. high 90sC+). In which case, if defluffing doesn't do the job, then new heatsink paste is almost certainly required.
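    The step-vs-ramp diagnostic above can be sketched in a few lines of Python. This is a rough illustration, not a calibrated tool: the 1 s threshold and 90% rise fraction are assumptions I've picked to match the behaviour described, and `samples` would come from whatever temperature monitor you use.

```python
# Classify the CPU temperature rise after switching from idle to full
# load: a near-instant jump ("step") suggests poor paste contact and
# thermal throttling; a gradual climb ("ramp") suggests the heatsink is
# actually soaking up the heat.

def classify_rise(samples, step_threshold_s=1.0, frac=0.9):
    """samples: list of (seconds_since_load_start, temp_C) pairs."""
    t0, start = samples[0]
    peak = max(temp for _, temp in samples)
    target = start + frac * (peak - start)   # temp marking "most of the rise"
    for t, temp in samples:
        if temp >= target:
            return "step" if (t - t0) <= step_threshold_s else "ramp"
    return "ramp"

# Old Apple paste: idles at 43C, pinned at 90C almost immediately.
print(classify_rise([(0, 43), (0.5, 90), (1, 90), (5, 90)]))                  # step
# Fresh paste: climbs over ~4s, then dips as the fans spin up.
print(classify_rise([(0, 38), (1, 60), (2, 75), (3, 86), (4, 92), (8, 86)]))  # ramp
```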
    Regards, BB

    I'm sorry but this is too funny to pass up. 

  • How can I migrate an Aperture library from an external HDD with Mac OS Extended (Journaled, Encrypted) format?

    How can I migrate an Aperture library from an external HDD with Mac OS Extended (Journaled, Encrypted) format?
    I used to store my pictures on an external hard drive using the latest version of Aperture. Now I tried migrating my Aperture library to Photos. However, after a short moment an error message popped up telling me that "Photos was unable to make a copy of your library before preparing it. Photos does not have the necessary permissions...".
    My external HDD doesn't need any permission repairs, nor is the system prevented from reading from or writing to it.
    Thanks for any advice in advance!
    Gohtac

    My guess is that the encryption is the problem for the new app.

  • RE : Who would benefit from Forte?

    RE : Jerry Fatcheric's message about "Who would benefit from Forte?"
    With regard to the point mentioned in the attached message from Jerry Fatcheric below, I would like to illustrate my point. I implemented, in both Visual Basic and Delphi, the example mentioned in the attached message: a browser application with the capability to browse thousands of records, with the initial screenful needing to come ASAP. It took me less than 2 minutes to implement this in VB (I timed it). I just threw a "remote data" control and a "DBGrid" control on a form, set a few properties, and wrote a "select *" SQL query specifying that only 30 records be returned at a time. For a table with 4K records, the first 30 came in and got displayed in less than 2 seconds. In Delphi, the response was even better and the whole 4K of records could be retrieved in less than 4 seconds. (Yes, less than 4 seconds for retrieving 4000 records from a DB2/NT database running on a remote machine.) Even I could not believe the performance of Delphi, which I haven't used that much. These tools are THE fastest way to get data from a database server to a Windows client. They will perform better any day than FORTE. One of the problems I came across with FORTE in situations like this was that data movement across nodes is very costly. In one of our applications, since we stored the data as objects, in a similar situation to the one you have mentioned, the performance of moving a lot of data from the server to the client was not very good, and in consultation with FORTE technical support we had to convert the data in the objects to scalars (a delimited string), move it across nodes, and convert the data back to objects at the client. Performance increase: 40 secs. vs. 120 secs. earlier.
    About my background: I have worked about 8 years in application development and for the past 4 years have been working in a client-server environment. Being a consultant, I have used many tools, including FORTE for one year, to provide my clients with the most bang for their buck, which to me is the topmost priority as a consultant. I do not decide for my clients what technology they should use, but I do evaluate the various options they have and recommend more than one solution, listing the advantages and disadvantages.
    I am currently working on a solution for a client with a customer service application that has around 50 users now, scaling up to 100 users in the future. The best solution that I could come up with was a logical 3-tier, with the presentation and the business layer running on NT Workstation (client) and the database on NT Server (server). With all the processing on a powerful and healthy (not "fat") client, the system, I feel, can scale very well. For a 500-user system, you literally have 500 application servers (physically on the client machines) being served by one data server. Having a physical middle tier between the client and the data server would not, I feel, help the data server, at least in our situation. Almost everything that the middle tier could do to reduce the load on the data server can be handled by the "business layer" running on the client machine. It does mean that each user connects to the database directly, so in the case of 500 users there are 500 connections to the database, but lately, with today's sophisticated DBMSs, this is no longer an issue. The DBMS can manage this many users very economically (read the benchmark about SQL Server with 5000, yes 5K, users at "www.microsoft.com/sql") and almost as well as a middle tier. It is fault tolerant - nothing can bring down the system except a client failure, a data server failure, or a network failure, the same failure points as an N-tier solution, unless you are replicating or duplicating the database. In our solution, our application is as scalable as the database is, and the databases available today are very scalable if you look at the current database technology offerings.
    As you may have guessed, the abovementioned solution is cheaper, with a very fast "time to market," than a FORTE solution (we started this about 6 months back and have been in production for the past month). It may not have all the features that FORTE offers, but for our purposes, and I feel in similar applications, what we got was what we needed. By no means is this going to meet all information technology needs for everyone, and in many situations I believe FORTE would be better suited than any other tool.
    I still use FORTE and would continue to do so for some of the solutions that we develop, but I do not think that one should be using FORTE for "any development that is bigger than a breadbox," as Mr. Fatcheric suggests in the attached message, simply because if I do that, then I think that in some cases I would be selling the user a tank when the user just needs a rifle.
    I consider giving my clients the most value for their money in getting their solution developed. I would suggest FORTE to my clients when I think they need it, but would definitely suggest another solution if I think they can get their solution developed, and get more value for their money, using some other tool. Toward this end I would like to find out what kind of solutions people are developing and what kind of performance they are getting, especially on the Windows platform.
    Any information about the benefits (actual benefits) you are getting from FORTE would be highly appreciated, and would let a lot of us decide when to use FORTE and when not to, to meet our and our clients' ever-changing information technology needs.
    - Ari Singh
    [email protected]
    Ari Singh wrote a provocative piece questioning the benefits of Forte in "Windows only", non-large-scale applications. Rather than get into a large philosophical discussion, I would like to illustrate my point with an example taken from a current Forte project.
    First, my background: 10+ years in client-server applications. Worked for several years at Oracle and have experience with Sybase. Worked extensively with a 2-tiered C/S product (Uniface) and write C and C++. NOT a Windows expert.
    In our current application, the requirement is to allow the user to browse literally thousands of records on the Windows client. There will never be lots of users doing this, but the ones that do must have reasonable performance. Our initial tests indicated that if we simply had the server pump the data to the client, we would have significant performance problems and face memory limitations on the PC. So we utilized Forte's N-tiered capabilities. When the user starts a query (using dynamic SQL with user-controlled WHERE and ORDER BY), we start an asynchronous retrieval on the server, with the data cached in an anchored object on the server. When the query has found the first THIRTY (30) records (2 screens' worth), it posts an event to the client and the client requests the first thirty. The retrieval process continues independently while the user browses data on the client. Not until the user scrolls down far enough does the client again request more data. If the user quits the screen or starts a new query, the first one is cancelled. Otherwise, the query runs to completion on the server.
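    The paging pattern described above can be sketched in plain Python. This is a minimal illustration, not Forte code: `PagedQuery`, the 30-record `PAGE` size, and the `range()` record source are stand-ins I've invented for the anchored server object, the screenful, and the database cursor.

```python
import threading

PAGE = 30  # one "screenful", matching the 30-record pages described above

class PagedQuery:
    """Server-side cache that signals the client when the first page is ready."""

    def __init__(self, source):
        self._source = source                  # stands in for the DB cursor
        self._cache = []                       # the anchored server-side object
        self._lock = threading.Lock()
        self._first_page = threading.Event()   # "first 30 records ready" event
        self._cancel = threading.Event()
        threading.Thread(target=self._retrieve, daemon=True).start()

    def _retrieve(self):
        for record in self._source:
            if self._cancel.is_set():          # user quit or started a new query
                return
            with self._lock:
                self._cache.append(record)
                if len(self._cache) == PAGE:
                    self._first_page.set()     # wake the client early
        self._first_page.set()                 # result sets smaller than one page

    def page(self, n):
        """Client side: block only until the first page exists, then slice."""
        self._first_page.wait()
        with self._lock:
            return self._cache[n * PAGE:(n + 1) * PAGE]

    def cancel(self):
        self._cancel.set()

q = PagedQuery(range(4000))
print(len(q.page(0)))  # 30 -- the first screenful, while retrieval continues
q.cancel()             # user quit or started a new query
```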
    This approach gives us 3-5 second response time regardless of the size
    of the query result set. It minimizes the data on the client (moving
    us toward a thin client). The kicker is that with the help of Martha
    Lyman from Forte, we developed this technique in about 4 hours! Add
    to this all the standard inheritance, OO stuff, partitioning,
    customized monitoring, etc, etc, and IT IS MY OPINION that Forte
    is a GOOD tool for any development that is bigger than a breadbox
    and worth the $$$. And that's the way it is.... SO there...
    Jerry Fatcheric
    Relational Options, Inc.
    Florham Park, New Jersey
    201-301-0200
    201-301-00377 (FAX)
    [email protected]

    RE : Jerry Fatcheric's message about "Who would benefit from Forte?"
    With regards the point mentioned in the attached message from Jerry
    Fatcheric below, I would like to illustrate my point. I implemented in both
    Visual Basic and Delphi, the example that is mentioned in the attached
    message, about a browser application, having the capability to browse
    thousands of records with the inital screenful needing to come ASAP. It took
    me less than 2 minutes to implement this in VB (I timed it). Just threw a
    "remote data" control and a "DBGrid" control on a form, set a few properties
    and wrote a "select *" sql specifying that only 30 records be returned at a
    time. For a table with 4K records, the first 30 came in and got displayed in
    less than 2 seconds. In Delphi, the response was even better and whole 4K of
    record could be retrieved in less than 4 second. (Yes less than 4 seconds
    for retrieving 4000 records from a DB2/NT database running on a remote
    machine). Even I could not believe the performance of Delphi which I haven't
    used that much. These tools are THE fastest way to get the data from a
    database server to a windows client. These will perform any day better than
    FORTE. One of the problem that I came across FORTE in one of situations like
    this was data movement across nodes is very costly. In one of our
    applications, since we stored the data as objects, in a similar situation as
    you have mentioned, the performance of moving a lot of data form the server
    to the client was not very good and in consulation with FORTE technical
    support we had to convert the data in objects to scalar (delimited string),
    move across node, and convert the data back to object at a client.
    Performance increase - 40 secs. vs 120 secs. earlier.
    About my background. I have worked about 8 years in application development
    and for the past 4 years have been working in a client server environment.
    Being a consultant, I have used many tools, including FORTE for one year, to
    provide my clients with the most bang for their buck, which to me is the
    topmost priority as a Consultant. I do not decide for my clients what
    technology they should use but sure evaluate the various options they have
    and recommend more than one solutions, listing the advantages and
    disadvantages.
    Currently working on coming up with a solution for a client with a customer
    service application need with around 50 users now, scaling up to 100 users
    in the future. The best solution that I could come up with was a logical
    3-tier with the presentation and the business layer running on NT
    workstation (client) and the database on NT server (server). With all the
    processing on a powerful and healthy (not "fat") client the system, I feel
    can scale very well. For a 500 user system, you literally have 500
    application server (physically on the client machine) being served by one
    data server. To the data server, having a physical middle tier between the
    client and the data server, I feel would not help, at least in our
    situation. Almost everything that the middle tier could do to reduce the
    load on the data server can be handled by the "business layer" running on
    the client machine. It does mean that each user connects to the database
    directly so in a case of 500 user, there are 500 connections to the database
    but lately with the sophisticated DBMS, this is no longer an issue. The DBMS
    can manage this many user very economically (read the benchmark about SQL
    server with 5000, yes 5k user at "www.microsoft.com/sql") and almost as well
    as a middle tier. It is fault tolerant - nothing can bring down the system
    except a client failure, the data server failure or a network failure, the
    same failure points as a N-Tier solution unless you are replicating or
    duplicating the database. In our solution our application is as scaleable as
    the database is, and the databases available today are very scaleable if you
    look at the current database technology offerings.
    As you may have guessed the abovementioned solution is cheaper with a very
    fast "time to market" than a forte solution (we started this about 6 months
    back and are in production for the past 1 month). This may not have all the
    features that FORTE offers, but for our purposes and I feel in similar
    applications, what we got was what we needed. By no means, this is going to
    meet all information tecnology needs for everyone and in many situations I
    believe FORTE would be well suited than any other tool.
    I still use FORTE can would continue to do so for some of the solutions that
    we develop, but I do not think that one shoud be using FORTE for "any
    development that is bigger than a breadbox" as Mr. Fatcheric suggests in the
    attached message, simply because if I do that, than I think that in some
    cases I would be selling the user a tank when the user just needs a rifle.
    I consider giving my clients the most value for their money in getting this
    solution developed. I would suggest my clients FORTE when I think they needs
    them but would definitely suggest another solution if I think that they can
    get their solution developed and get more value for their money using some
    other tool. Towards this end I would like to find out what kind of solutions
    people are developing and what kind of performance they are getting
    specially related to Windows platform.
    Any information about the benefits (actual benefits) you are getting from
    FORTE would be highly appreciated which would let a lot of us decide when to
    use FORTE and when not to use FORTE to meet ours and our clients'
    everchanging information technology needs.
    - Ari Singh
    [email protected]
    Ari Singh wrote a provocative piece questioning the benefits of Forte
    in "Windows only", non-large scale applications. Rather than get into
a large philosophical discussion, I would like to illustrate my point
    with an example taken from a current Forte project.
First, my background: 10+ years in client/server applications. Worked
for several years at Oracle and have experience with Sybase. Worked
extensively with a two-tiered client/server product (Uniface) and write C
and C++. NOT a Windows expert.
    In our current application, the requirement is to allow the user to
    browse literally thousands of records on the Windows Client. There will
    never be lots of users doing this, but the ones that do must have
    reasonable performance. Our initial tests indicated that if we simply
    had the server pump the data to the client, we would have significant
performance problems and face memory limitations on the PC. So we
utilized Forte's N-tiered capabilities. When the user starts a query
(using dynamic SQL with a user-controlled WHERE and ORDER BY), we start
an asynchronous retrieval on the server, with the data cached in an
anchored object on the server. When the query has found the first
THIRTY (30) records (two screens' worth), it posts an event to the client
and the client requests the first thirty. The retrieval process continues
    independently while the user can browse data on the client. Not until
    the user scrolls down far enough does the client again request more
    data. If the user quits from the screen or starts a new query, the
    first one is cancelled. Otherwise, the query runs to completion on the
    server.
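[Editor's note: the incremental-retrieval pattern Jerry describes is not Forte-specific and can be sketched in any language with threads. Below is a minimal Python sketch under stated assumptions: the class `PagedQuery`, the method names, and the 30-row page size are illustrative inventions, not Forte code. A worker thread fills a server-side cache, signals once the first page is ready, and the client pulls later pages only as the user scrolls.]

```python
import threading

PAGE_SIZE = 30  # two screens' worth, as in the original post


class PagedQuery:
    """Server-side cache for an asynchronous query result.

    A worker thread fills the cache row by row; the client is signalled
    once the first page is ready and then pulls further pages on demand.
    """

    def __init__(self, row_source):
        self._rows = []                       # the "anchored object" holding fetched rows
        self._cond = threading.Condition()
        self._done = False
        self._cancelled = False
        self._first_page = threading.Event()  # the event posted to the client
        self._worker = threading.Thread(
            target=self._fetch, args=(row_source,), daemon=True)
        self._worker.start()

    def _fetch(self, row_source):
        for row in row_source:                # e.g. a database cursor
            with self._cond:
                if self._cancelled:
                    return
                self._rows.append(row)
                self._cond.notify_all()
                if len(self._rows) == PAGE_SIZE:
                    self._first_page.set()    # client can now show two screens
        with self._cond:
            self._done = True
            self._cond.notify_all()
            self._first_page.set()            # short result sets still signal

    def get_page(self, page_no):
        """Block until the requested page is cached, then return it."""
        self._first_page.wait()
        start, end = page_no * PAGE_SIZE, (page_no + 1) * PAGE_SIZE
        with self._cond:
            while len(self._rows) < end and not self._done:
                self._cond.wait()
            return self._rows[start:end]

    def cancel(self):
        """Called when the user quits the screen or starts a new query."""
        with self._cond:
            self._cancelled = True
```

In a real deployment `cancel()` would also close the underlying database cursor; here it only stops caching further rows.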
    This approach gives us 3-5 second response time regardless of the size
    of the query result set. It minimizes the data on the client (moving
    us toward a thin client). The kicker is that with the help of Martha
    Lyman from Forte, we developed this technique in about 4 hours! Add
    to this all the standard inheritance, OO stuff, partitioning,
    customized monitoring, etc, etc, and IT IS MY OPINION that Forte
    is a GOOD tool for any development that is bigger than a breadbox
    and worth the $$$. And that's the way it is.... SO there...
    Jerry Fatcheric
    Relational Options, Inc.
    Florham Park, New Jersey
    201-301-0200
    201-301-00377 (FAX)
    [email protected]

  • Importing into Aperture 3 from iPhoto as referenced files

Apologies if this has been asked before, but I can't find anything on this topic...
    I've just upgraded to Aperture 3. In Aperture 2, my way of working was to store RAW images in iPhoto and import as referenced files into Aperture. This was very straightforward. Now, I have 2 problems:
    1 - I can't see how to import referenced images from iPhoto. If I use the import command, my iPhoto images don't show up in the list of locations to import from; if I use the iPhoto browser in Aperture and drag from that, it imports the file into the Aperture library rather than importing as a referenced file...
    2 - If I import using the iPhoto browser, not only does it import into the Aperture library, it imports the jpeg preview rather than the RAW file.
    The only way I can see to get the RAW file into Aperture is to find it in the Finder then import into the Aperture library. This seems excessively long-winded and also duplicates the file, using up extra disk space.
    I'm using Aperture 3.0.1 and iPhoto 7.1.5
    Thanks in advance for your input

I have not imported an iPhoto library because my workflow is to use Aperture for storing and archiving Raw files and iPhoto for exported jpegs.
However, the instructions in the Aperture 3 PDF Manual are as follows:
To import your iPhoto library:
1. Choose File > Import > iPhoto Library.
2. Select the iPhoto library in the dialog that appears.
3. Choose a location for the imported images by doing one of the following:
    • To store imported masters in the Aperture library: Choose “In the Aperture Library” from the Store Files pop-up menu.
    • To import the files as referenced images stored in their current location on your hard disk: Choose “In their current location” from the Store Files pop-up menu.
Tip: Choosing "In their current location" is recommended. When you choose this option, Aperture refers to the original files in their current location and does not have to duplicate the files, which would double the disk space needed to store them. For more information about referenced images, see What Are Managed Images and Referenced Images?
    • To store imported masters as referenced images in the Pictures folder on your hard disk: Choose Pictures from the Store Files pop-up menu.
    • To store imported masters as referenced images in a location other than the Pictures folder: Choose “Choose” from the Store Files pop-up menu and select a folder. Choose “No folder” from the Subfolders pop-up menu to specify that the files be stored as separate, individual files in the selected folder. You can also specify that Aperture create a hierarchy of subfolders with specific folder names to hold your files.
    =====
    This extract is from page 169. Perhaps it clarifies and identifies a step you have omitted.
    AW.
