Adjustments usage

I have been looking for documentation on the best way to use the tools in the Adjustments inspector/HUD, but have not found much so far. Can anyone point me to something with more detail? Thanks.

The only ones I have seen will simply monitor the power. Some will notify you or sound an alarm if it gets hot or the battery is low, etc., but I don't know of any that will actually change settings. I use Timeriffic to help manage mine. It allows you to set times to do things (mine automatically goes into airplane mode from midnight to 6 am). You can also set areas to trigger Wi-Fi (for example, establish your "home" or "office", and when you get close enough to the router it will automatically turn on Wi-Fi, which helps).

Similar Messages

  • High memory usage problem

    Firefox regularly creeps up to about 350 MB of memory usage. It runs poorly, crashes, etc. I read some other posts about how to reduce its memory usage via about:config tweaks. I made adjustments to browser.cache.disk.capacity (was set to over 1,000,000, reduced to 50,000) and config.trim_on_minimize. However, memory usage has not decreased. I also see browser.cache.disk.smart_size_cached_value set to 1048576. I changed this to 50,000 as well, but it reverts to the original number after every restart.
    I have 3 tabs open right now (Hotmail, Google, and this page) and memory is at 242,000.
    I should also note that I recently reformatted my hard drive. Firefox has almost no extensions or plug-ins installed (other than things like Flash and Silverlight). I'm running Windows 7 64-bit and have 6 GB of RAM.
    Any suggestions for alleviating this problem would be greatly appreciated.
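    One general way to keep about:config changes from reverting across restarts is a user.js file in the Firefox profile folder; prefs listed there are reapplied at every startup. The smart_size value in particular reverts because Firefox recomputes it while smart sizing is enabled, so - as a sketch, not advice from this thread - browser.cache.disk.smart_size.enabled would also need to be false for a manual capacity to stick:

```javascript
// user.js - placed in the Firefox profile folder; these prefs are
// reapplied on every startup, so manually set values no longer revert.
user_pref("browser.cache.disk.smart_size.enabled", false); // stop auto-sizing the disk cache
user_pref("browser.cache.disk.capacity", 50000);           // cache size in KB
user_pref("config.trim_on_minimize", true);                // Windows: release memory when minimized
```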

    235 to 285 MB seems to be a rather low notification threshold on a PC that has 3GB installed, not even 10% of RAM being used and that message is triggered.
    That's a new feature in AVG, and I suspect it might need a little tweaking. Unfortunately AVG didn't seem to provide any user adjustments into that feature, so the best thing to do is turn it off if that warning message appears too often.
    Here's an AVG support thread where the user says he gets that message with IE, Chrome, and Firefox:
    http://forums.avg.com/us-en/avg-forums?sec=thread&act=show&id=180124

  • How to blank the screen during music playback with Apple TV

    How do I select a blank screen as a screen saver when listening to music? I would like to save energy and minimize TV screen usage. I don't want to show pictures or anything. My screen is currently on all day as I listen to music. Any adjustment I make on my Samsung TV while Apple TV is playing stops Apple TV. I have an Apple TV 3.

    Like I said, the TV would need to be on unless connected via optical; that is not software related. Not a lot of energy would be saved by having a blank screen. The majority use the screensaver to showcase their pictures, and there wouldn't be a lot of call for having a blank screen.
    If you really feel like you want to you can provide feedback directly
    https://www.apple.com/feedback/

  • Lightroom has a hard time with local adjustments:

    Local adjustments: Spotting / Cloning / Healing Brush, Adjustment Brush, Graduated Filter
    I posted  the piece below as a response to thaehn’s post “Lightroom freezes on Dust Spot Removal” but I thought the topic is important enough to warrant its own post.
    This in advance: I'm running Win XP SP4 with all available patches; have 4 GB of RAM; have a dual-core processor at 3 GHz.
    Apart from my drive C, I have two internal hard drives in RAID 0 configuration (thus they show up as one drive). I also have an external hard drive that's in RAID 1 ("mirroring") and is connected by FireWire 400.
    Yes, LR "freezes" when using the dust spot tool and also when using the adjustment brush or - but to a lesser degree - when using the grad filter.
    My observation though is that it's not a true "freezing", because if I wait long enough LR starts working again. So when using these tools, LR gets extremely busy and just doesn't have enough resources left to respond to further commands right away - but it "stores" the commands and acts on them eventually.
    If I turn on my task manager I observe the following when using these tools: the CPU gets extremely busy (i.e. the green bar is at 100%) while RAM usage is high but not at the maximum. The "freezes" occur when, and for as long as, the CPU works at 100%. If the CPU is below 100%, LR responds sluggishly, and the response is normal again once the CPU usage is eventually back at 0%.
    Also, in my observation, this behaviour of LR is compounding, i.e. the wait times for my CPU to get back to 0 increase the longer I work with these tools or the more photos I work on. Restarting LR does not change this: the behaviour establishes itself again when using the tools. It is as if LR needs to do some work in the background, and this background work piles up the longer I work in LR.
    I am not a software or hardware specialist, so the following observations are somewhat subjective and not really quantifiable.
    In Photoshop I observed a similar behaviour (although never as bad) when working on files of 500 MB and above. It was particularly bad when working on photos residing on my external RAID 1 hard drive. I had a hunch that it might have to do with the reading and writing from a drive that a) is external and b) is in RAID 1. It is known that RAID 1 has slower writing (and also reading) speeds than RAID 0 (or no RAID at all). So in Photoshop I managed to cut down on the unresponsiveness by moving large files to my internal RAID 0 hard drive and working from there.
    From this experience I come to the assumption that some of the "busy-ness" in LR (or of the CPU) is due to reading/writing to/from the hard drive.
    To test this assumption for LR, I moved some photos from my external (RAID 1) to my internal (RAID 0) hard drive and did some "intensive" work: first some "fancy" lens correction, then a few grad filters, and a "final touch" with the adjustment brush. I found that LR could handle these tasks somewhat better when the photos were on the internal drive - so some of the "sluggishness" or "freezing" is due to the reading/writing from/to the hard drive.
    Actually I cannot explain why this should be so: in my naivete I thought LR writes everything only into the catalog (?), which - by the way - is on my internal RAID 0 drive.
    I think it would be good if Adobe could look into this and maybe give us some recommendations, just as they do for Photoshop by saying that the scratch disk should be on a different drive than Photoshop.
    But it seems also that local adjustments - spotting, adjustment brush, etc. - are a very high "number-crunching" business for LR, independent of where the photo is located on the drive. (Interestingly the grad filter is not as bad as the adjustment brush.) The clone tool in Photoshop hardly registers on my CPU, and if at all only with a very short spike, while the clone tool in LR sends the CPU through the roof for 10 to 30 seconds - even when the photo in question is on the internal drive.
    I'm sure lots of people will now tell me that this is due to the difference between pixel-based adjustments and non-pixel-based ones. OK, I acknowledge that. But still ...
    It seems some guys are still in denial about this (or maybe they have two quad processors and 16 GB of RAM):
    But it's a fact: LR has a hard time with localized adjustments. There must be hundreds of complaints to that effect by now, and that can't be dismissed by blaming it on the hardware specs alone.
    I just hope Adobe is working on that.
    LR3 is a fantastic program that I would not want to work without. Often I have to work on a lot of photos with a very tight deadline. Without LR it would be impossible.
    Thank you,
    ErnstK

    You are right Samoreen.
    I posted my above message in July 2010 and this topic has been re-opened by Adobe under the thread Lightroom 3.3 Performance Feedback
    I think it would be best if everybody would post their concerns only under this thread.
    Otherwise we'd have two different threads for the same issue.
    So let's close this one here.
    WW

  • Local adjustments slow on Vista 64 (LR 2.1 - not a Nvidia problem)

    Hi,
    I have been using LR 2.0/2.1 since the release of 2.0 and recently migrated my PC to Vista 64-bit. My hardware is 2 years old but nevertheless quite up to the task - P4 dual-core 3.2 GHz CPU with 4 GB of RAM and SATA RAID with an XFX GeForce 6800XT card (with 512 MB of video RAM).
    It is a very fast system in general, but I do have some problems in LR 2.1 with the local adjustment brushes. I know it's a well-discussed topic, however I feel that in my case it is sufficiently different to warrant a new topic. I've done all the Nvidia driver optimisations - setting all the driver properties to max performance for LR. This did make a difference initially, but as brush strokes are added the lags get worse, and once I have about 8-10 strokes or more it gets really unusable - very slow to catch up, with delays of as much as a minute or two.
    I don't run any antivirus or other scanning utilities other than the bog-standard Vista 64-bit services (all the potential scanning services like disk indexing and Defender scanning have been disabled for the purpose of the performance testing - nothing made a difference). What is curious, though, is that a few people who have the brush working fast were saying that the CPU was not topping out, and in my case I noticed that both cores were used quite a lot when the delays started to occur (with the number of strokes around 8 or 10 and more). The combined CPU usage went up to 90% (so both cores were used extensively). I have a relatively medium catalog - around 10000 photos.
    Setting the automask on/off does not make a lot of difference - with the automask off it is only more responsive for a couple of further strokes and then slows down again.
    I have sort of exhausted my options of what to investigate further and would appreciate any constructive comments on where to look and what to look for (please, no Adobe blaming here). I would also appreciate some Adobe engineers' comments on the CPU usage of the adjustment brush, and possibly also insights into how I can look at what's going on.
    Thanks,
    Alex

    I've contributed elsewhere but I'll just add another voice to this thread. Same cumulative slowdown here with a very similar system. I don't know if you have noticed, but it's not only the brush strokes that slow down to treacle speed. If you get fed up waiting for a response and start clicking buttons, you can find them reacting sequentially for up to a minute. It's like watching a very slow recorded tutorial. It's the same for adjustment sliders once the slowdown is under way.
    Also I find that if you move the brush too quickly it will abbreviate the curve you create into a series of angles. I postulate that, in the case of my Intuos tablet at least, the program has a problem responding to the data rate from the tablet. An example is if I tap the pen in the image, lift, and go quickly to make a menu selection. The brush will continue to draw after the pen has lifted, along the line of cursor movement, and I have to go back and delete the stroke.
    Brian

  • What is Dynamic Usage Variance in ML?

    Hi Experts,
    What is Dynamic Usage Variance in ML? How does it effect the Periodic Unit Price calculated at the end of the period?
    Regards
    Srikanth Munnaluri

    Hi,
    Distribution of usage variances typically comes into play in scenarios like a physical inventory check and its adjustments, in order to derive actual costing.
    Physical inventory documents are generally posted to adjust the physical inventory against the inventory as per SAP. These total adjustments have to be distributed based on the quantity consumed on each order. Of course, this scenario is project specific.
    This ensures that the differences are automatically considered when the periodic unit price is determined in the actual costing run. These adjustments will revaluate the consumption.
    This dynamic usage variance  is applicable both for the inventory and for activity rates.
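    As a toy illustration of the pro-rata idea described above (not SAP code - the function and order names are made up), a physical inventory difference can be split across orders in proportion to the quantity each order consumed:

```javascript
// Hypothetical sketch: distribute a total inventory difference across
// orders pro rata by consumed quantity, which is conceptually what the
// ML usage variance distribution does before the actual costing run.
function distributeDifference(totalDiff, consumedByOrder) {
  const totalConsumed = Object.values(consumedByOrder).reduce((a, b) => a + b, 0);
  const result = {};
  for (const [order, qty] of Object.entries(consumedByOrder)) {
    result[order] = totalDiff * (qty / totalConsumed); // each order's share of the difference
  }
  return result;
}
```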
    You have to maintain the following configuration:
    Controlling > Product Cost Controlling > Actual Costing > Activate Distribution of Consumption Differences
    Once these settings are in place, you have to run some or all of the following:
    CKMDUVACT - Distribution of Activity Differences
    CKMDUVMAT - Distribution of Inventory Differences
    CKMDUVREC - Enter Actual Activity Quantities
    If you want to view the differences before executing the above transactions, you can do so using
    CKMDUVSHOW - Display Inventory and Activity Differences
    Regards
    Sudhakar Reddy
    Edited by: Sudhakar Reddy K.V. on Sep 14, 2009 2:11 PM

  • CC&B and Meter Data Management Integration Usage Requests

    Hello.
    We are investigating the following issue: trying to create a Bill in CC&B while getting usage data from Meter Data Management using the Usage Requests engine. The help pages of CC&B describe the big picture of Usage Requests and give a configuration overview, so this information is not enough for correct and complete adjustments. There is no information about the outbound and inbound XML message formats.
    The CC&B help contains a link to the document Oracle Utilities Customer Care and Billing - Meter Data Management Integration Implementation Guide, but our attempts to find this document were unsuccessful.
    We would be glad to get any information about Usage Requests from CC&B to Meter Data Management, or a link to the document mentioned above.
    Thanks in advance!
    P.S. Our current setup of Usage Requests produces the following log during bill segment generation (we are using JMS queues and XSL transformation of outbound and inbound messages, so MDM sends a response, but something goes wrong in further processing on the CC&B side). As a result we get a bill segment in Error state with the remark "Usage Request for Bill Segment is not found.".
    Fragment from weblogic_current.log:
    SYSUSER - 318312-3794-1 2011-01-19 14:40:55,547 [Remote JVM:1 Thread 3] WARN (host.sql.CobolSQLParamMetaData) COBOL set the null indicator to false for SQL bind parameter xWIN-START-DT, but it sent a null value ' '; binding null
    SYSUSER - 318312-3794-1 2011-01-19 14:40:55,554 [Remote JVM:1 Thread 3] INFO (COBOL.CMPBBLLP) Invoking CMPBBILG
    SYSUSER - 318312-3794-1 2011-01-19 14:40:55,554 [Remote JVM:1 Thread 3] INFO (COBOL.CMPBBILG) CMPBBILG validate
    SYSUSER - 318312-3794-1 2011-01-19 14:40:55,572 [Remote JVM:1 Thread 3] INFO (COBOL.CMPBBILG) Get next Acct sa
    SYSUSER - 318312-3794-1 2011-01-19 14:40:55,579 [Remote JVM:1 Thread 3] INFO (COBOL.CMPBBILG) Acct Sa more
    SYSUSER - 318312-3794-1 2011-01-19 14:40:57,394 [Parent Reader:Thread-54] INFO (api.maintenanceObject.MaintenanceObjectLogHelper) (Server Message)
         Category: 11002
         Number: 12150
         Call Sequence:
         Program Name: Usage_CHandler
         Text: Transitioned from Pending to Awaiting Data Sync.
         Description:
         Table: null
         Field: null
    SYSUSER - 318312-3794-1 2011-01-19 14:40:58,089 [Parent Reader:Thread-54] INFO (api.maintenanceObject.MaintenanceObjectLogHelper) (Server Message)
         Category: 11002
         Number: 12150
         Call Sequence:
         Program Name: Usage_CHandler
         Text: Transitioned from Awaiting Data Sync to Send Request.
         Description:
         Table: null
         Field: null
    SYSUSER - 318312-3794-1 2011-01-19 14:41:03,720 [Parent Reader:Thread-54] INFO (support.context.CacheManager) Registering cache 'XAIOptionCache'
    SYSUSER - 318312-3794-1 2011-01-19 14:41:06,920 [Parent Reader:Thread-54] INFO (domain.integration.RealtimeOutboundMessage) sending Realtime Outbound message <?xml version="1.0" encoding="UTF-8"?><SOAP-ENV:Envelope xmlns:SOAP-ENV="urn:schemas-xmlsoap-org:envelope"><SOAP-ENV:Body><CM_UsageCalculation transactionType="READ" dateTimeTagFormat="CdxDateTime"><rate>ERES-1</rate><saId>123456789</saId><usageId>954617571747</usageId><scalarProcessing><billingOption>Y</billingOption><startDateTime>2010-06-09-12.00.00</startDateTime><endDateTime>2010-06-10-12.00.00</endDateTime></scalarProcessing></CM_UsageCalculation></SOAP-ENV:Body></SOAP-ENV:Envelope>
    SYSUSER - 318312-3794-1 2011-01-19 14:41:22,493 [Parent Reader:Thread-54] INFO (domain.integration.RealtimeOutboundMessage) Raw Response from External System is ID:325496.1295437282326.0
    SYSUSER - 318312-3794-1 2011-01-19 14:41:23,966 [Parent Reader:Thread-54] INFO (support.schema.BusinessObjectInfo) BusinessObject C1-NonCyclicalUsgReqOutMsg: Skipping audit calls while performing DEL on entity OutboundMessage_Id(899647019991), as there were no auditable changes
    SYSUSER - 318312-3794-1 2011-01-19 14:41:24,655 [Parent Reader:Thread-54] INFO (api.maintenanceObject.MaintenanceObjectLogHelper) (Server Message)
         Category: 11002
         Number: 12150
         Call Sequence:
         Program Name: Usage_CHandler
         Text: Transitioned from Send Request to Awaiting Bill Determinants.
         Description:
         Table: null
         Field: null
    SYSUSER - 318312-3794-1 2011-01-19 14:41:24,939 [Parent Reader:Thread-54] INFO (api.maintenanceObject.MaintenanceObjectLogHelper) (Server Message)
         Category: 11002
         Number: 12150
         Call Sequence:
         Program Name: Usage_CHandler
         Text: Transitioned from Awaiting Bill Determinants to Bill Determinants Received.
         Description:
         Table: null
         Field: null
    SYSUSER - 318312-3794-1 2011-01-19 14:41:25,892 [Remote JVM:1 Thread 3] WARN (host.sql.CobolSQLParamMetaData) COBOL set the null indicator to false for SQL bind parameter xWIN-START-DT, but it sent a null value ' '; binding null
    SYSUSER - 318312-3794-1 2011-01-19 14:41:25,932 [Remote JVM:1 Thread 3] INFO (COBOL.CMPBBILG) Get next Acct sa
    Edited by: Anton on 19.01.2011 1:56

    Did you find the document Oracle Utilities Customer Care and Billing - Meter Data Management Integration Implementation Guide?
    I searched on eDelivery and Oracle Support, but cannot find the document.

  • Are laggy adjustments/rendering camera specific?

    The thread on thumbnails got me to thinking, some people report snappy adjustments, quick rendering and great performance with A3. Others (like myself) experience slowdowns on some photos after applying several layers of edits to a photo ("processing" thumbnails, slow rendering of adjustment changes etc.)
    Just a few things that tend to take a long time to render an updated screen view for me (i.e. I get a low-quality jaggy preview and it takes up to 20 seconds to "sharpen up" and show the changes):
    . ANY changes to the Raw Fine Tuning brick, especially in 100% view
    . some brushes take a long time to render changes (skin soften and sharpening to name a couple) again, worse in 100% view
    . noise reduction (worse in 100%)
    . Edge Sharpening (worse in 100%)
    . rendering a proper image after zooming 100% into a photo after it's been adjusted moderately with the above adjustments, or cropped/straightened. Also slow to render if scrolling around in 100% view.
    Notice that the worst wait times tend to appear in 100% view, though zoomed out sometimes I still get the "loading" pinwheel. These are adjustments that I prefer (justifiably so) to do in 100% view but the lag is just terrible sometimes. Some photos are worse than others, and it tends to be worse as the adjustments pile up.
    My equipment is listed in the sig, my MBPro tends to be a bit faster than the iMac. Before you ask, I've done all the standard performance enhancing things (previews settings, faces off, defrag etc etc etc).
    I am using Canon 7D full size raw files (18mp, ~25MB files). A couple people in the other thread mentioned this sort of lackluster performance with their 7D files, and I've seen people with 5DMk2's complain as well. I wonder if Aperture has a hard time dealing with large raw files, specifically Canon raw files.
    So informal poll, three part question:
    (1) Are you experiencing slow downs in rendering of adjustments on your photos, especially at 100% view?
    (2) What type of file from what camera are you using (i.e. large .jpg from Nikon D70, full size raw from Canon 1dMk4 etc)
    (3) What is your computer set up?
    I wonder if this is a trend, and am just curious to see if my wild theory has any merit.
    I see no reason why my two machines run LR and PShop flawlessly and with instant graphical feedback of ALL adjustments, while A3 lags and chews on the adjustment for up to half a minute before showing me what my sharpening move did to the photo at 100%.
    If it is some sort of trend, maybe it's fixable in a software update that might be forthcoming. I really, REALLY like Aperture 3, it's pretty much exactly what I want/need for a photo management/editing program (save for the lack of output sharpening, but that's another thread for another day :)) I just wish it was as snappy as the Adobe products with my 7D files.

    I certainly think that the size of the file is a factor. My 5D MkII files (>20 MB) are slower than files from my S90 (around 10 MB), but on my system configuration perfectly manageable and usable.
    I think the file size, GPU memory/speed and memory in general are all factors that interact together to create what some see as lags and slow downs.
    Sharpening and noise reduction is a fairly intensive algorithm to apply in real time ( I know, I have written code to do it in non real time and it's a lot of pixel sampling and comparing, computing ).
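    As a rough illustration of that cost (purely a sketch, not Aperture's or Lightroom's actual pipeline), even the simplest 3x3 sharpen convolution reads nine neighbours for every output pixel, so an 18 MP image means well over a hundred million samples per pass:

```javascript
// Naive 3x3 sharpen convolution over a grayscale image stored as a flat
// array. Illustrative only: every output pixel touches 9 neighbours, which
// is why stacking such adjustments in real time gets CPU-heavy fast.
function sharpen(pixels, width, height) {
  const kernel = [0, -1, 0, -1, 5, -1, 0, -1, 0]; // classic sharpen kernel (sums to 1)
  const out = new Float64Array(pixels.length);
  for (let y = 1; y < height - 1; y++) {
    for (let x = 1; x < width - 1; x++) {
      let acc = 0;
      for (let ky = -1; ky <= 1; ky++) {
        for (let kx = -1; kx <= 1; kx++) {
          acc += pixels[(y + ky) * width + (x + kx)] * kernel[(ky + 1) * 3 + (kx + 1)];
        }
      }
      out[y * width + x] = acc;
    }
  }
  return out;
}
```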
    I have conducted extensive testing with Aperture 3.0 and, for that matter, Lightroom as well (lots of people point to LR as being much better, and I'm fed up with the generalization). The reality is, both programs struggle to keep up with the real-time application of localized brush strokes on large files, especially when you have lots of adjustments applied. I have some interesting data comparing these things and might publish it one of these days.
    I think Apple needs to be realistic with the minimum system requirements to fully run all of the features of programs like AP3. Sure, you can apply adjustments to a small JPEG when you only have 1 GB of RAM or 128 MB of GPU RAM etc., but it's common sense that at some point, when your files are 20, 50, 100 MB, you are going to have an issue when you are attempting to overlay multiple computations on an image in real time.
    I also think Apple has not yet optimized the code in this area. I have carefully monitored the number of processor threads, memory usage and CPU cycles when some of these things are being applied and, while I am not privy at all to anything under the hood with Aperture, as a fairly seasoned software developer of real time automation software, I think they have some tweaking to do, that will make this better for more people across a range of hardware and image sizes. At least I hope so! It will be interesting to see what the next update brings.
    AP3 is really a very, very powerful tool and I liken it to the first and early releases of Final Cut in the video world. I remember lots and lots of people complaining about performance with Final Cut - mostly it was the system setup, limited resources in the machine, and not yet knowing the best workflow. I see the same trend on some of these threads with AP3.
    No doubt in my mind Apple could have done a much better job of quality control here and I fear they bowed to pressures to release something early. For some the out of the box experience has been good for others not so much. I have no doubt they will get it right, but it doesn't help people having issues at the moment.
    Just my 2-cents.

  • Best Plug-in/External Editor for Aperture Local Adjustments

    I've been playing with trial versions of Nik's plug-in suite, Lightzone, and I have a version of CS3.
    From the standpoint of making local adjustments (hopefully something we will see in the next version of Aperture), which do you feel is the easiest to use without compromising results?
    I sorta lean towards Nik's plug-in suite, especially Viveza, but would like to hear from those of you who have had some serious usage of any of these tools, as with the trial period and being busy, there's only so much time I can spend on it. The Nik suite is being bundled at a discount, and there's also a discount on Lightzone as an external editor.
    Thanks,
    Larry

    CS3 is a great external editor. If you want something that is really easy and quick to use, the Nik software is amazing. The three tools that I use a lot from Nik (I use them directly from Aperture and in Photoshop on smart objects) are Viveza, Silver Efex Pro, and Sharpener.
    RB
    Ps. I have a couple of reviews on my site just look up the product name [http://photo.rwboyer.com]

  • Usage of Department/Staff switch in Organizational Unit

    Hi All,
    Just would like to know the detail usage of the Department/Staff switch for the Organizational Unit. Currently neither the Department nor the Staff switch is selected for all the organizational units created. What would be the effect and impact if I set the Department switch for the organizational unit?
    Thanks a lot,
    Francis

    Hi Francis,
    An organizational unit does not have to be marked with a staff flag. The flag does, however, influence the presentation of positions and organizational units in the graphic. Positions or organizational units marked with a staff flag are shown in the graphic next to their respective superior nodes. Without a staff flag they are shown under the superior node.
    As for the department flag, marking an org unit is optional and relates to integration.
    If integration with Payroll Accounting is active, certain records are written from Personnel Management to Master Data. If you flag organizational units as departments, only the marked units are written to Master Data.
    Marking units as departments makes sense when your organizational structure includes organizational units that are not actual departments (for example, a special project team).
    NOTE: In order for the department marks to be recognized, you must also make adjustments in Customizing.
    Department Switch in T77S0 (PPABT PPABT):
        o   0 = causes the organizational unit directly above a position to be read
        o   1 = causes the organizational unit with a department flag (infotype 1003 "Department/Staff") above a position to be read
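    As a hypothetical sketch of that lookup (illustrative only, not SAP code - the function and field names are made up), the two switch values differ in how far up the hierarchy the search climbs:

```javascript
// Hypothetical sketch of the PPABT switch logic described above: with
// switch 0 take the org unit directly above a position; with switch 1
// keep climbing until a unit flagged as a department is found.
function findOrgUnit(position, ppabt) {
  let unit = position.parent;          // org unit directly above the position
  if (ppabt === 0) return unit;
  while (unit && !unit.isDepartment) { // switch 1: climb to the department flag
    unit = unit.parent;
  }
  return unit;
}
```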
    Hope this helps.
    Regards,
    Arpita

  • Lightroom LR4 - Silly RAM usage!!

    Hi there
    I'm stunned at the amount of RAM usage by Lightroom. I have 6 GB of RAM, using LR4 in 64-bit Win 7.
    This is what happened when I opened Lightroom to edit a shoot of JPGs, all of which were only about 3 MB each (I wasn't even editing 30 MB RAW files from my DSLRs).
    Figures from the Task Manager performance pane:
    No Lightroom: 0.99 GB
    Launch Lightroom and catalog: 1.53 GB
    Select next pic: 2.00 GB
    Select next pic (no editing yet): 2.20 GB
    After editing two pics: 3.13 GB
    After editing the 2nd pic in Nik Software Silver Efex Pro 2 (edit copy with Lightroom adjustments) and returning to Lightroom: 2.85 GB
    but ....
    Very soon, after only a few pics, the RAM usage is up to 5.5GB
    This seems very disproportionate. Why, for example, would simply selecting another 3 MB JPG eat up another 200 MB of RAM? And why does only a bit of editing on this 3 MB pic in the Develop module eat up another 900 MB of RAM?
    The other highly annoying issue is that once Lightroom has decided to eat up nearly all my available RAM after editing only a few 3 MB JPGs, things slow down rapidly. Selecting another pic means everything locks up while there is vicious disk activity (presumably swapping to the swap file, because Lightroom needs another 200 MB of RAM for just one 3 MB JPG).
    It's not much fun having to quit LR every few images just to 'reset' the amount of RAM.

    OK, I seem to be getting somewhere now.
    As stated in the OP, I work in 64-bit Win 7. I have a Win 7 32-bit partition on the same machine that I use for various things and also for testing Lightroom updates without upsetting my main LR work partition.
    Earlier tonight I installed 4.1 RC2 on my Win 7 32-bit partition and have been working on a shoot. Interestingly, the RAM usage seems much better managed. Instead of progressively creeping up to a point of ARRGHHH!!!, with RC2 memory is released when switching images. Great news!! Now bearing in mind I still need to test this in my 64-bit partition, and I will install RC2 once confident it won't fall right over, but positive so far.
    ALSO, what I've just started testing is the negative cache disablement as described by Adobe employee Julie Kmoch here:
    http://feedback.photoshop.com/photoshop_family/topics/lr4_0_reacts_extremely_slow
    who said:
    Here's an update on what changed in RC2 relative to Develop performance.
    For starters, in RC1 we experimented with turning off sharpening while sliders were moving. We got a fair amount of negative feedback on that, and have reverted that behavior. Instead, we moved more of the rendering to a background thread, which keeps the sliders moving smoothly. One caveat with the behavior in RC2 is that those background renders can pile up if you're moving fast. The final 4.1 release will do a better job of trimming these when possible.
    I worked with a number of you who volunteered to try an experiment a couple weeks ago that turned off something we call negative caching. The results were mixed; a couple people said it was a clear improvement, others didn't see a benefit. But I'll offer it here in case it helps others who find performance starts out reasonably then suddenly, consistently goes south until a restart.
    What the following will do is turn off a cache which saves some of your most recent work done in Develop such that if you revisit a recently touched image, it loads faster. However, if our calculations are off, this cache can sometimes get too big and cause ACR to use virtual memory instead of RAM.
    To try it:
    1. Create a text file called “config.lua” and put the following line in it: AgNegativeCache.enabled = false
    2. Launch Lightroom, go to the Preferences dialog, Presets tab, and hit “Show Lightroom Presets Folder”.
    3. Close the Preferences dialog, quit Lightroom.
    4. Drop the config.lua file you created into the Lightroom folder that was opened in step 2 (do not put it in one of the preset subfolders). So the path to the file will look like this:
    Mac: /Users/[yourname]/Library/Application Support/Adobe/Lightroom/config.lua
    Win:  C:\Users\[yourname]\AppData\Roaming\Adobe\Lightroom\config.lua
    You’ll know this switch to turn the cache off is working if you see the “Loading...” symbol even when you revisit the previously edited image in Develop. (When caching is on, you can often revisit a recently edited photo without the Loading warning showing.)
    I'll report back later.

  • Eliminating cellular data usage

    Now that AT&T is obviously trying to get us to adopt low use plans (and then be able to ding us with overage charges) I want to take control of my cellular data usage.
    Even if I'm on a wifi network, after I reset my Settings > General > Usage > Cellular Network Data stats to zero, I find that I'm still sending and receiving cellular data.
    Any ideas on how to determine what rogue apps are using my now precious data traffic?
    What I really want is to be able to tell my iPhone not to use ANY cellular connections unless I specifically ask it to. That way I don't get random chatter sucking up my data when I'm away from wifi.

    Thanks for the info! That's what I was hoping to hear.
    Personally, it's not worth it to give up my unlimited plan to save 5 bucks a month.
    But my wife never goes over 100 Meg a month so we're definitely dropping her to the 200 Meg plan ($15 IS worth it.)
    More important though is the fact that I'll be giving my daughter my 3GS when I get my iPhone 4 this month, and I want better control of data usage now that she'll be on a limited plan.
    I do think that Apple has to provide much better tools for controlling Cellular data usage. What you describe is a nice start, but the iPhone could be MUCH smarter about making quick adjustments more obvious and easy.
    One obvious solution would be to allow quick toggles to be exposed as one-click apps instead of requiring that we have to drill down through multiple screens in the Settings app.
    One click to turn cellular on/off.
    One click to turn wifi on/off.
    etc.
    The sad part is that these are all do-able by third-party developers now, but Apple for some reason doesn't allow it. (Apps that do these things have been rejected.)
    What's up with that?

  • Problem with Firefox and very heavy memory usage

    For several releases now, Firefox has been particularly heavy on memory usage. With the most recent version, a single browser instance with only one tab consumes more memory than any other application running on my Windows PC. The memory footprint grows significantly as I open additional tabs, reaching almost 1GB with 7 or 8 tabs open. This is true both with no extensions or plugins and with my usual set (Firebug, Firecookie, FireShot, XMarks). Right now, with 2 tabs, memory is at 217,128K and climbing; CPU is between 0.2 and 1.5%.
    I have read dozens of threads providing "helpful" suggestions, and tried any that seemed reasonable. But, like most others who experience Firefox's memory problems, I've found that none of them address the issue.
    Firefox is an excellent tool for web developers, and I rely on it heavily, but have now resorted to using Chrome as the default and only open Firefox when I must, in order to test or debug a page.
    Is there no hope of resolving this problem? So far, judging from other similar threads, the response has been to deny any responsibility and blame extensions and/or plugins. This is neither helpful nor accurate. Will Mozilla accept ownership of this problem and try to address it properly, or must we continue to suffer for your failings?


  • Problem with scanning and memory usage

    I'm running PS CS3 on Vista Home Premium, 1.86Ghz Intel core 2 processor, and 4GB RAM.
    I realise Vista only sees 3.3GB of this RAM, and I know Vista uses about 1GB all the time.
    Question:
    While running PS, and only PS, with no files open, I have 2GB of RAM free; why will PS not let me scan a file that it says will take up 300MB?
    200MB is about the limit it will let me scan, but even then the actual end product ends up being less than 100MB (around 70MB in most cases). I'm using a Dell AIO A920 with the latest drivers etc., and PS is set to use all available RAM.
    Not only will it not let me scan: once a file I've opened has used up "x" amount of RAM, that "x" amount of RAM STILL remains unavailable even if I then close the file. This means if I scan something, I have to save it, close PS, then open it again before I can scan anything else.
    Surely this isn't normal. Or am I being stupid and missing something obvious?
    I've also monitored the memory usage during scanning using Task Manager and various other tools; it hardly goes up at all, then shoots up to 70-80% once the ~70MB file is loaded. Something is off, because if PS's estimate were accurate I'd effectively only have 1GB of RAM, and running Vista would be nearly impossible.
    It's not a Vista thing either as I had this problem when I had XP. In fact it was worse then, I could hardly scan anything, had to be very low resolution.
    Thanks in advance for any help

    At 55%, it's still 1.6GB... there shouldn't be a problem scanning something that it says will take up 300MB but actually only takes up 70MB.
    And no, it obviously isn't releasing the memory when other applications need it, because I have to close PS before it will release it. Yes, it probably is supposed to release it, but it isn't.
    Thank you for your answer (even if it did appear to me to be a bit rude/shouty; perhaps something more polite than "Wrong!" next time), but I'm sitting at my computer and can see what is using how much memory and when, and you can't.

  • Problem with JTree and memory usage

    I have a problem with JTree when memory usage exceeds the physical memory (I have 512MB).
    I use JTree to display very large data about the organizational structure of a big company. It works fine until memory usage exceeds the physical memory; then some of the nodes are not visible.
    I hope somebody has an idea about this problem.
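    A common way to keep JTree's memory footprint bounded for a very large organizational tree is to build nodes lazily: only materialize a node's children when the user expands it, instead of loading the whole company structure up front. Below is a minimal sketch along those lines, using a placeholder child and a TreeWillExpandListener; the fetchChildren helper and the node names are hypothetical stand-ins for the real data source.

```java
import java.util.List;

import javax.swing.JTree;
import javax.swing.event.TreeExpansionEvent;
import javax.swing.event.TreeWillExpandListener;
import javax.swing.tree.DefaultMutableTreeNode;
import javax.swing.tree.DefaultTreeModel;

public class LazyTreeDemo {

    // A node with a single "Loading..." placeholder child, so the expand
    // handle is shown before the real children have been fetched.
    static DefaultMutableTreeNode makeLazyNode(String name) {
        DefaultMutableTreeNode node = new DefaultMutableTreeNode(name);
        node.add(new DefaultMutableTreeNode("Loading..."));
        return node;
    }

    // Hypothetical data source -- replace with the real lookup into the
    // organization data (database, file, etc.).
    static List<String> fetchChildren(String parent) {
        return List.of(parent + "/unit-1", parent + "/unit-2");
    }

    public static void main(String[] args) {
        DefaultMutableTreeNode root = makeLazyNode("Company");
        DefaultTreeModel model = new DefaultTreeModel(root);
        JTree tree = new JTree(model);

        tree.addTreeWillExpandListener(new TreeWillExpandListener() {
            @Override
            public void treeWillExpand(TreeExpansionEvent e) {
                DefaultMutableTreeNode node =
                        (DefaultMutableTreeNode) e.getPath().getLastPathComponent();
                // On first expansion, swap the placeholder for real children.
                if (node.getChildCount() == 1
                        && "Loading...".equals(node.getFirstChild().toString())) {
                    node.removeAllChildren();
                    for (String child : fetchChildren(node.toString())) {
                        node.add(makeLazyNode(child));
                    }
                    model.nodeStructureChanged(node);
                }
            }

            @Override
            public void treeWillCollapse(TreeExpansionEvent e) {
                // Optionally drop children here to reclaim memory on collapse.
            }
        });
    }
}
```

    With this approach only the expanded portion of the tree lives in memory, which should also avoid the invisible-node symptom, since that sounds like the JVM running out of heap while holding the entire structure at once.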

