Memory issue with 'Merge to HDR' in CS4

Hi All
I use Lightroom 2.3 and have recently tried to use the 'Edit in' function to 'Merge to HDR in Photoshop' with just 3 Raw files.
PS opens fine and the three images appear as separate layers. The 'Merge to HDR' window then opens, allowing me to set the white point preview etc., but on trying to complete the operation I get the following error message:
"The operation could not be completed.
Not enough storage is available to complete this operation"
I am using:
- Windows Vista 32-bit with 3 GB of RAM
- Intel Core 2 Quad CPU Q6600 @ 2.4 GHz
- NVIDIA GeForce 9600GS (7.15.11.7490 driver)
Under 'Preferences' > 'Performance' I have 'Memory Usage' set to 1298 MB (79%), and the scratch disk is set to the internal HD, which shows 502.86 GB free. (I also have two external USB 2.0 HDs with 880 and 909 GB free respectively. I have tried every permutation of using these, combined or individually, as the scratch disk, but to no avail.)
The really annoying/bizarre aspect of this is that if I open Windows Task Manager and watch Photoshop's memory usage, it peaks and stays at 1,247,704K for about 1 to 1.5 minutes, during which I continue to get the above error message if I try to complete the merge process.
However, if I wait and keep monitoring, the memory usage starts to drop after that period to 1,003,612K, after which the merge completes with no problem. (Photoshop's memory usage then drops to 770,384K.)
Whilst I appreciate that my PC is no Ferrari, having to wait over a minute seems more than a little odd. Have I got something set wrong?
All advice gratefully received.
Many thanks!

Hi Chris/Zeno
Many thanks for the responses...
After more experimentation, I think I have solved this. If I set all three HDs as scratch disks (which totals 2285GB!) and switch 'Enable OpenGL Drawing' off, I can merge as many raw files as I like with no time delay.
Even with OpenGL Drawing switched off, if I have only the internal HD set as the scratch disk, I still get the above error message, which I would not have expected to see with 500+GB of free space.
Anyway, it works now...
Cheers
JJN

Similar Messages

  • Issue with Merge

    Hi all,
    Facing an issue with a MERGE statement. I am getting source rows like:
    Table a: (Source)
    Id Vald_date Server_serial system_id
    1 11/oct/10 2.5 Real Id
    1 13/oct/10 2.5
    table b: (Target)
    Id Vald_date Server_serial system_id
    1 13/oct/10 2.5
    I am using the below MERGE statement:
    MERGE INTO b
    USING ( SELECT DISTINCT id,vald_date,server_serial,system_id
    from b
    ON ( a.Id = b.ID)
    when matched then
    update set
    b.Vald_date = a.Vald_date,
    b.server_serial = a. server_serial ,
    b.system_id = a.system_id
    when not matched then
    INSERT values(a.id,a.Vald_date,a. server_serial ,a.system_id
    When I run the statement the first time, the target table is updated from source table a and SQL%ROWCOUNT returns 2 rows.
    The second time I run the same statement, it throws the error 'ORA-30926: unable to get a stable set of rows in the source tables'.
    Please tell me why this is happening; I'm not sure how it is reading and processing the rows.
    Table scripts:
    create table a
    (id NUMBER,
    vald_date DATE,
    server_serial NUMBER,
    system_id varchar2(10)
    );
    Insert into a values(1,TO_DATE('11/OCT/10','DD/MON/YY'),2.5,'Real Id');
    Insert into a values(1,TO_DATE('13/OCT/10','DD/MON/YY'),2.5,NULL);
    create table b
    (id NUMBER,
    vald_date DATE,
    server_serial NUMBER,
    system_id varchar2(10)
    );
    Insert into b values(1,TO_DATE('13/OCT/10','DD/MON/YY'),2.5,'Real Id');
    Oracle Version:
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
    PL/SQL Release 10.2.0.4.0 - Production
    CORE     10.2.0.4.0     Production
    TNS for IBM/AIX RISC System/6000: Version 10.2.0.4.0 - Production
    NLSRTL Version 10.2.0.4.0 - Production
    Your help would be appreciated !!
    Regards,
    Vissu......

    As @Dombrooks already said, there is a problem with the data. Next time, please post a test case the way I have shown below; it would be much appreciated.
    There are quite a few bugs in your code. Until you sort out the row source, I don't think you will be able to proceed further with the MERGE statement.
    SQL> drop table a;
    Table dropped.
    SQL> drop table b;
    Table dropped.
    SQL> create table a
      2  (id NUMBER,
      3  vald_date DATE,
      4  server_serial NUMBER,
      5  system_id varchar2(10)
      6  )
      7  /
    Table created.
    SQL> Insert into a values(1,TO_DATE('11/OCT/10','DD/MON/YY'),2.5,'Real Id')
      2  /
    1 row created.
    SQL> Insert into a values(1,TO_DATE('13/OCT/10','DD/MON/YY'),2.5,NULL)
      2  /
    1 row created.
    SQL> create table b
      2  (id NUMBER,
      3  vald_date DATE,
      4  server_serial NUMBER,
      5  system_id varchar2(10)
      6  )
      7  /
    Table created.
    SQL> Insert into a values(1,TO_DATE('13/OCT/10','DD/MON/YY'),2.5,'Real Id')
      2  /
    1 row created.
    SQL>
    SQL> MERGE INTO b
      2  USING ( SELECT DISTINCT id,vald_date,server_serial,system_id
      3  from b
      4  )
      5  ON ( a.Id = b.ID)
      6  when matched then
      7  update set
      8  b.Vald_date = a.Vald_date,
      9  b.server_serial = a. server_serial ,
    10  b.system_id = a.system_id
    11  when matched then
    12  INSERT values(a.id,a.Vald_date,a. server_serial ,a.system_id
    13  );
    when matched then
    ERROR at line 11:
    ORA-00905: missing keyword
    SQL>
    SQL> ed
    Wrote file afiedt.buf
      1  MERGE INTO b
      2  USING ( SELECT DISTINCT id,vald_date,server_serial,system_id
      3  from b
      4  )
      5  ON ( a.Id = b.ID)
      6  when matched then
      7  update set
      8  b.Vald_date = a.Vald_date,
      9  b.server_serial = a. server_serial ,
    10  b.system_id = a.system_id
    11  when not matched then
    12  INSERT values(a.id,a.Vald_date,a. server_serial ,a.system_id
    13* )
    SQL> /
    ON ( a.Id = b.ID)
    ERROR at line 5:
    ORA-00904: "A"."ID": invalid identifier
    SQL> ed
    Wrote file afiedt.buf
      1  MERGE INTO b
      2  USING ( SELECT DISTINCT id,vald_date,server_serial,system_id
      3  from a
      4  ) a
      5  ON ( a.Id = b.ID)
      6  when matched then
      7  update set
      8  b.Vald_date = a.Vald_date,
      9  b.server_serial = a. server_serial ,
    10  b.system_id = a.system_id
    11  when not matched then
    12  INSERT values(a.id,a.Vald_date,a. server_serial ,a.system_id
    13* )
    SQL> /
    3 rows merged.
    SQL> select * from a;
            ID VALD_DATE SERVER_SERIAL SYSTEM_ID
             1 11-OCT-10           2.5 Real Id
             1 13-OCT-10           2.5
             1 13-OCT-10           2.5 Real Id
    SQL> select * from b;
            ID VALD_DATE SERVER_SERIAL SYSTEM_ID
             1 13-OCT-10           2.5 Real Id
             1 13-OCT-10           2.5
             1 11-OCT-10           2.5 Real Id
    SQL> MERGE INTO b
      2  USING ( SELECT DISTINCT id,vald_date,server_serial,system_id
      3  from a
      4  ) a
      5  ON ( a.Id = b.ID)
      6  when matched then
      7  update set
      8  b.Vald_date = a.Vald_date,
      9  b.server_serial = a. server_serial ,
    10  b.system_id = a.system_id
    11  when not matched then
    12  INSERT values(a.id,a.Vald_date,a. server_serial ,a.system_id
    13  )
    14  /
    MERGE INTO b
    ERROR at line 1:
    ORA-30926: unable to get a stable set of rows in the source tables
    SQL> select * from v$version;
    BANNER
    Oracle Database 10g Enterprise Edition Release 10.2.0.4.0 - 64bit
    PL/SQL Release 10.2.0.4.0 - Production
    CORE    10.2.0.4.0      Production
    TNS for 64-bit Windows: Version 10.2.0.4.0 - Production
    NLSRTL Version 10.2.0.4.0 - Production
    Regards,
    Raj
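    Just to close the loop (this addition is mine, not from the thread): ORA-30926 is raised here because the source still returns two different rows for id = 1 (11-OCT and 13-OCT), so the same target row would be updated twice; DISTINCT cannot help because the rows genuinely differ. Below is a minimal sketch of one way to stabilise the row source, assuming (my assumption, not something stated in the thread) that the latest vald_date per id should win; adjust the ORDER BY to whatever rule actually applies to your data.
    -- Reduce the source to exactly one row per join key before merging.
    MERGE INTO b
    USING ( SELECT id, vald_date, server_serial, system_id
            FROM ( SELECT a.*,
                          ROW_NUMBER() OVER (PARTITION BY id
                                             ORDER BY vald_date DESC) rn
                   FROM a )
            WHERE rn = 1 ) a
    ON ( a.id = b.id )
    WHEN MATCHED THEN
      UPDATE SET b.vald_date     = a.vald_date,
                 b.server_serial = a.server_serial,
                 b.system_id     = a.system_id
    WHEN NOT MATCHED THEN
      INSERT VALUES ( a.id, a.vald_date, a.server_serial, a.system_id );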

  • Has anyone experienced iMac memory issues with OneDrive (previously SkyDrive)? It uses 5GB of my 8GB of memory!

    I recently started experiencing memory issues with my iMac due to OneDrive using up to 5 GB of my iMac's memory (8 GB total). Every time I start up my iMac, Activity Monitor shows OneDrive using around 4-5 GB of memory for no apparent reason. OneDrive previously seemed to work fine; I've only noticed this for about two weeks. It's really annoying, as the iMac freezes up regularly and I need OneDrive to synchronise my files across different devices.
    Has anyone experienced this too and are there any known fixes?

    Yes, this is happening to me too, since the last upgrade. Eventually Mac OS starts insisting I "Force Quit" all my applications because I have run out of application memory.
    OneDrive doesn't appear in that list because it's not an application... but Activity Monitor clearly shows it's the culprit, taking up 5GB+ of RAM.
    I've raised the issue on Microsoft's forums:
    http://answers.microsoft.com/en-us/onedrive/forum/sdperformance-sdother/onedrive-for-macos-memory-leak/fe04cf60-949f-47b6-b886-10539b5c5cc3?tm=1397636232934

  • Merge to HDR in CS4 from LR2.1 not working

    I searched for this problem and did not find anything. I have created several virtual copies and varied the exposure from -2 to +2 EV in LR2.1. Then, with all copies selected, I chose Edit in CS4 > Merge to HDR. When it goes into CS4, only one of the images is used, repeated as many times as the number of copies selected in LR. Am I doing something wrong?

    Unfortunately I can't answer your question; I can only pass on what the Adobe programmers have said about not being able to use a single raw image for an HDR. I can also tell you two things:
    1) A true HDR is about dynamic range over the entire image. A fake HDR is more about how to have, for example, a proper sky with a proper foreground, not about bringing out every possible bit of detail in both of those areas.
    2) If you look at a true HDR image that has been processed correctly and compare it to a fake HDR, there is a rather large difference between the images.
    I will also tell you that both forms, real and fake, have their uses, and that in my opinion, by and large, I tend to like the fake ones better, because the image still looks like a photo, whereas nearly 99.9% of the true HDR images I have seen have looked more like paintings. I think this is for two reasons. 1) Most people don't process true HDRs very well. 2) For as long as there has been photography, human beings have been conditioned to images that have areas so light or so dark that we see little or no detail in them; HDR is specifically about having all areas show detail that can be clearly seen. No real darks, no real lights, just a perfectly even exposure. To our minds this isn't a photograph; it is more painting or artwork than photo. This isn't to say that I haven't seen a true HDR that I liked. I have; it just doesn't happen often, mostly, I suspect, due to poor processing, but even the ones with what I would consider proper processing (no halos or overly vivid colors) still look more painting than photo.
    In the end it all comes down to creativity and to each artist's tastes. If you like the "fake" HDR then by all means use it. I do. I have a technique that I use that I have not seen mentioned anywhere else, and I will share it. I will do a video, because these things are so visual that explaining them in text just doesn't work well in my opinion. I will post a link to the video in a few days. Maybe someone will like it.
    Robert

  • Memory issue with the Podcasts app

    Hi,
    I have several issues with the Podcasts app:
    - I cannot sync podcasts that I have downloaded on my laptop using iTunes to the iPhone,
    - when I download some podcasts directly on the iPhone, I cannot play them; I still have to stream them,
    - I have huge memory issues, and I think the downloaded versions of the podcasts might be somewhere on the iPhone, but I cannot access/delete them (out of 16 GB, I have more than 4.5 GB used for 'Other', which is not related to anything...).
    I have tried deleting/reinstalling the app, but this does not change anything...
    Any help?
    Regards

    Quote
    I have an MSI 975X Platinum rev 2 motherboard with an E6300 processor overclocked to 2.24 GHz.
    If your CPU operates @2.24 GHz, it means that you increased FSB frequency to about 348 MHz.
    To make sure there is no misunderstanding here, let me point out the following:
    Memory frequency is ALWAYS linked to FSB frequency by the FSB/DRAM ratio you can select in the BIOS. If you select 1:1, for example, the BIOS will show that the memory frequency is 533 MHz. This value, however, only applies to the default FSB clock speed of 266 MHz:
    266 MHz x 1 = 266 MHz (or 533 MHz effective DDR2-frequency).
    If your system is overclocked, what counts is only the FSB/DRAM ratio, not the memory speed displayed in the BIOS. That means, if you set the FSB/DRAM ratio to 1:1 and the FSB frequency to 348 MHz, your RAM will operate at:
    348 MHz x 1 = 348 MHz (or 696 MHz effective DDR2-frequency).
    The main question is:
    What are you talking about when you say you can't make your RAM operate @800 MHz?
    The BIOS does not offer a proper divider to get your RAM to 800 MHz at a 348 MHz FSB clock to begin with.
    You have the following choices if you overclock your system:
    FSB=300 MHz + FSB/DRAM ratio = 1:1.33
    300 MHz x 1.33 ~ 400 MHz (or 800 MHz effective DDR2-frequency)
    FSB=320 MHz + FSB/DRAM ratio = 1:1.25
    320 MHz x 1.25 = 400 MHz (or 800 MHz effective DDR2-frequency)
    FSB=400 MHz + FSB/DRAM ratio = 1:1
    400 MHz x 1 = 400 MHz (or 800 MHz effective DDR2-frequency)
    Use CPU-Z to monitor the DRAM frequency that is actually set if you are overclocking.

  • IDCS4 V6.0 memory issue with preflight

    When I create a custom profile for preflight and run it, I encounter a memory issue. The hard disk starts running indefinitely and after a while InDesign crashes. I have tried, step by step, to make the profile less demanding (I should try the reverse approach), but instead of crashing, InDesign eventually sends a memory error message and can crash later. Apple's Activity Monitor shows that sometimes InDesign requires all the available memory (1.5 GB).
    Have any of you ever tried to set up a custom preflight profile? The basic one is really too tolerant.
    Thank you.
    I'm on an iMac 3.06 GHz with 2 GB.

    I've spent quite a bit of time working with custom preflight profiles on my MacBook Pro, 2 GB memory, Mac OS X 10.5.5. I have never run into a memory error; in fact, I've never run into a memory error in InDesign at all. Memory errors can occur because of defective fonts. Have you tried with different fonts? Have you tried on a different computer?
    Yes, the [Basic] preflight checks only for missing fonts, missing or modified graphics, and overset text. You need to work with custom preflight profiles depending on your particular workflow.

  • Issues with buttons in interactive pdf (CS4)

    Hi there, I'm working on an interactive PDF that is going to be used cross-platform. I'm working with ID CS4 (6.0.6) and Acrobat CS4 (9.4.4). I have some issues with the buttons I've created that I need to solve. The buttons are contours made in AI and pasted into ID. When done, I exported my PDF from ID, checked it in Acrobat and added a few links using Acrobat, as one can't use the command 'go to specific page' in ID itself (yes, this has to be a non-Flash PDF).
    When testing in Acrobat I noticed 2 things I need to solve.
    First: sometimes a button doesn't work properly the first time but later on it does, or the other way around.
    Second: when viewing on a PC using Acrobat Reader 9.4, some of the buttons show little dots in a square marking the active field when hovering, even though I've set border colour and fill colour to none on all buttons.
    How can I fix these issues? I couldn't find anything in the properties dialog box. And one final question: from which versions of Adobe Reader (Mac and PC) are interactive PDFs supported?
    TIA

    Thanks George. You're correct, most of the actions were set straight in ID. The only reason I go into Acrobat is to hook up the buttons that need to go to a specific page rather than to the next page. Since my version of Acrobat is not English, I had to google English manuals to get the exact English phrases. Alas, I can't hand you a copy of the document either, since it holds some sensitive information that I'm not allowed to share.
    Instead of posting all the settings I made, I looked more carefully at the buttons that have multiple actions. In Acrobat you have a bit more control over the order of the actions, so I guess this is where we can straighten out the hiccups, right? I have a few buttons that have actions to go to the next/previous page and to toggle the show/hide of a few items on the pages. In my case most of the buttons were set to first do the show/hide and then go to the next page, rather than first go to the previous/next page and then do the show/hide actions. Could this be the cause of the stuttering of some of the buttons?
    And lastly, right now the ID document has 4 layers. I could reduce that to 2. Would that be of any (additional) help? Thanks!

  • Out of memory issues with PSE 8

    I am using PSE 8 on a Dell desktop computer with an Intel Core i7 CPU 920 at 2.67 GHz and 8 GB of RAM. My operating system is Windows 7, 64-bit.
    My problem is that I get out-of-memory or insufficient-RAM messages in the PSE Editor with some PSE tools when my resource utilization reaches 37 to 38%. In other words, even though my computer is telling me I have almost 4 GB of memory left, PSE is saying it does not have enough memory to complete the operation. It looks to me as if PSE is only using 4 GB of my 8 GB of RAM. Is this true, and what do I need to do to allow PSE to utilize all of my available RAM?

    Thanks, that does answer what the problem is, but it is not necessarily a solution. I like working with 8 to 10 pictures (files) in the editor tray at a time. I make whatever changes are needed to each and then group 4 or 5 into an 8.5 x 11 collage. Each picture in the collage is a separate layer, and each separate picture may have multiple layers of its own. I print the collage on 8.5 x 11 photo paper and then put the page in a photo album. I like the pictures in different sizes, orientations and sometimes shapes, so the album and multiple-picture options offered in PSE are not much help. My process eats a lot of memory, which I mistakenly thought my 8 GB of RAM would solve.
    Anyway, now that I know the limitations, I can adjust the process to avoid the memory issue and hope that a future version of Elements will accommodate 64-bit.
    I am wondering whether I need to look at other programs, or whether I am missing a PSE function that would make my chore easier.

  • Setting a fixed and consistent exposure level with merge to HDR Pro?

    I do panoramas with sets of bracketed HDR images. From LR I select my stack and then merge to HDR Pro in CC. CC then produces a screen with a white point preview slider or the option to tone in ACR. Since I want to tone in LR, I have tried to set the white point consistently for each HDR output in the panorama set, but I cannot get consistent exposure levels. I then try to adjust them all the same in LR, but sometimes the differences result in undesirable effects in the stitched panorama output.
    How can I get a 32-bit HDR from PS CC back to LR and have each stack come back in with exactly the same exposure, without me trying to guess with a slider?
    I shoot all the images for the panorama with identical settings: same brackets, f-stop, shutter speed, ISO, etc. I should be able to get them all processed into 32-bit HDRs and looking the same without manual adjustment guesses.
    Thanks.

    I'll tell you my workflow for panoramas such as this, starting from bracketed stacks of images.
    1. I select all images, go into Develop and turn on Auto Sync. Turn on "Remove Chromatic Aberration". Do not turn on "Enable Profile Corrections". Set a custom white balance to make sure all images are white-balanced the same, and optimize sharpening and noise reduction. Basically you want all images to be developed exactly the same, and fairly conservatively at that.
    2. Export all images to 16-bit tiff in prophotoRGB or adobeRGB (depending on how colorful they are)
    3. At this point one can add a logo to the nadir shot or do it at the end using the workflow I described above.
    4. Load all images into hugin. Generate control points. Optimize positions - first just positions, then positions and view. Level the panorama in the GL preview and make sure the projection is set to equirectangular and the view to 360x180, then optimize view, barrel and positions and finally "everything without translation". During all these steps remove errant control points by going into the control point list and deleting the ones with the largest deviation, which are usually bad control points on clouds or other moving things. Also make sure the pano stays correctly leveled.
    5. Mask the tripod out of the images that have it in them using the mask tools in hugin. Mask feet and stuff out of the nadir shot.
    6. Do a photometric optimization with high dynamic range, fixed exposure.
    7. Go to stitcher, set the optimum canvas size, uncheck "exposure corrected, low dynamic range", Check A. "exposure fused from any arrangement", B. "High dynamic range->tiff", or C. "blended layers of similar exposure, without exposure correction" depending on what you want to do.
    8. Stitch
    If you did A, you will get a tone-mapped image that you can work up further in Lightroom or Photoshop. That image will be tone-mapped respecting the wraparound on all sides, so it won't have weird seams in a spherical projection. If you did B, you get a 32-bit TIFF that will work great in Lightroom and, as long as you don't use highlights, shadows, clarity and such, will remain good. If you did C, you can load the multiple panorama layers you get into HDR Pro in Photoshop, and load the result into Lightroom.

  • Is anyone else having memory issues with Safari after updating to 10.4.7?

    Safari appears to be using too much memory:
    345 Safari xxxxxx 0.10 9 317.80 MB 703.39 MB Intel
    I usually keep the program open and the computer running; however, it gets like this in under a day. This only started to happen since I updated to 10.4.7.
    Using Mac OS X 10.4.7 (8J2135)
    I am curious whether anyone else is having this issue or whether anyone knows a workaround.

    I believe that I have found the issue. It appears to be the Flip4Mac Universal beta. It had worked fine under 10.4.6 as far as memory goes, but it must have an issue with 10.4.7. I uninstalled Flip4Mac and memory usage went back to the normal range:
    234 Safari xxxxxx 0.00 6 60.63 MB 394.54 MB Intel

  • Weird issue with merge using JPA...?

    Hi all,
    I have the following code snippet (HsTrans has a many-to-one relation with Hss; Hss is the parent):
    HsTrans hsTrans = new HsTrans();
    Hss hss = em.find(Hss.class, studyId);
    hsTrans.setStudy(studyId);
    hsTrans.setState(state.toString());
    hsTrans.setTimestamp(timeOfState);
    hss.getHsTrans().add(hsTrans); // add child to parent
    hsTrans.setHss(hss);           // link child to parent
    em.merge(hss);                 // does not work - throws a unique key violation error
    Whereas if I replace that with this:
    Hss hss2 = hsStudyTrans.getHsStudy();
    em.merge(hss2); // works fine
    Does anyone know if this is a JPA bug or some other issue?
    Thx
    VR

    RainaV wrote:
    Does anyone know if this is a JPA bug or some other issue?
    JPA is a specification, so it has no bugs (only design flaws). If there is a bug somewhere, it is in the persistence provider you are using, but most likely it is just you not understanding how merge and transaction management in general work. I've written dozens of applications now that use JPA, and I think I have really needed merge a grand total of once.
    In the snippet you provide, you are apparently making the 'hss' object managed through the em.find() method. This means that any changes you make to it will be made persistent as soon as the transaction is committed. You don't need to call merge() at all; you only call that on entities that are detached.
    Now for the actual problem of the unique key violation: I don't even see you persisting hsTrans. Change the code to this:
    HsTrans hsTrans = new HsTrans();
    Hss hss=em.find(Hss.class, studyId);
    hsTrans.setStudy(studyId);
    hsTrans.setState(state.toString());
    hsTrans.setTimestamp(timeOfState);
    hsTrans.setHss(hss); // set managed hss object
    em.persist(hsTrans); // persist hsTrans and make it managed
    hss.getHsTrans().add(hsTrans); // add the new managed hsTrans to the hss mapped collection
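    For completeness, here is a minimal sketch (mine, not from the thread) of how that corrected snippet could sit inside an explicit, resource-local transaction. The entity classes come from the snippet above; the persistence-unit name, the literal study id and the state value are placeholders, and in a container-managed (JTA) transaction the begin/commit calls would not be needed:
    import javax.persistence.EntityManager;
    import javax.persistence.EntityManagerFactory;
    import javax.persistence.Persistence;

    public class HsTransDemo {
        public static void main(String[] args) {
            // "example-pu" is a placeholder persistence-unit name.
            EntityManagerFactory emf = Persistence.createEntityManagerFactory("example-pu");
            EntityManager em = emf.createEntityManager();
            try {
                em.getTransaction().begin();

                Hss hss = em.find(Hss.class, 1L);   // managed parent; 1L stands in for studyId
                HsTrans hsTrans = new HsTrans();    // new, not yet managed
                hsTrans.setState("SOME_STATE");     // placeholder for state.toString()
                hsTrans.setHss(hss);                // set the owning side of the relation
                em.persist(hsTrans);                // make the child managed; no merge() needed
                hss.getHsTrans().add(hsTrans);      // keep the inverse collection in sync

                em.getTransaction().commit();       // changes to managed entities are flushed here
            } finally {
                em.close();
                emf.close();
            }
        }
    }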

  • Issue with Merging two files in BPM

    Hi,
    I need to merge two files (balance and transaction) with a correlation defined on ID, Date and Account number.
    Sometimes, when there are no transaction records, the balance file row will carry the number "0":
    Balance file:
    MDk;1728;175;02.09.11;781961.09;0.00;0.00;781961.09;;;;;;;;;0
    MDk;8574;175;02.09.11;4462;1112;104098800;104102150;;;;;;;;;2
    From the above file, there are two accounts:
    MDk;1728;  --- with zero transaction records
    MDk;8574;  --- with two transaction records
    Transaction file:
    MDk;8574;175;02.09.11;;DEBIT;;;;;-1112;;0;02.09.11;;;;20555;;;037;
    MDk;8574;175;02.09.11;;CREDIT;;;;;104098800;;0;02.09.11;;;;;;;099;
    We are using a correlation to merge the files, using the fields (MDk;8574;175;02.09.11).
    Now the issue is that the BPM is not working because the correlation does not match: the balance file contains a row (with zero transaction records) that is not present in the transaction file.
    I have to ignore the first record in the balance file, as it contains 0 transaction data, which means there are no records for this account in the transaction file.
    How can I remove those records before reaching the merge condition? Is there anything I can do in the balance file adapter?
    Any suggestions please?
    Thanks
    Deepthi

    Hi Ramesh,
    This is the problem at the first step of the BPM, where we receive the files:
    Start --> FORK (Rec1 & Rec2) --> TransforMap (Merge_to_targetfile) --> SendtoReceiver --> END
    It is failing at step 1 (FORK), where the files do not match according to the correlation condition we set, i.e. ID, Date, Account number.
    As the transaction file doesn't contain the record "MDk;1728;175;02.09.11" that is present in the balance file, the correlation does not match. Hence it fails; it does not even reach the map.
    As the correlation is mandatory to receive two matching files, it is failing here.
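    Just as an illustration (my own rough sketch, not part of the thread): one option is to pre-filter the balance file before it reaches the BPM, for example in a separate step or adapter module, dropping every row whose last semicolon-separated field - the transaction count in the sample rows above - is "0". The file names are placeholders:
    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.FileWriter;
    import java.io.IOException;
    import java.io.PrintWriter;

    public class BalanceFileFilter {
        public static void main(String[] args) throws IOException {
            try (BufferedReader in = new BufferedReader(new FileReader("balance.txt"));
                 PrintWriter out = new PrintWriter(new FileWriter("balance_filtered.txt"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    String[] fields = line.split(";", -1);        // -1 keeps trailing empty fields
                    String txnCount = fields[fields.length - 1].trim();
                    if (!"0".equals(txnCount)) {                  // keep only rows that have transactions
                        out.println(line);
                    }
                }
            }
        }
    }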

  • Memory issues with Oracle BPM 10gR3 application

    Hello,
    We have been running a load test (100 concurrent users) against our web application, which is developed using Oracle BPM 10gR3, and we are seeing stuck threads while rendering the workspace page in the JSF API (method createAndMaybeStoreManagedBeans). I have copied one of the stuck thread traces below. When we looked at the heap, it was full, and GC is not releasing memory. From the analysis, I found that the requests are stuck due to the lack of memory. I also went through the forums and found that the Oracle 10.3 workspace is a memory hog.
    Can anyone suggest recommended settings for the workspace?
    We don't have a cluster set up yet and are planning to set one up. Are there any limitations on the user load on the workspace per node?
    Please let me know if anyone has had the same issue and resolved it.
    Overview of the application:
    Most of the web application runs on global interactive activities with screen flows. The process instance size is small. The engine and workspace are deployed on the same WebLogic instance.
    "[STUCK] ExecuteThread: '167' for queue: 'weblogic.kernel.Default (self-tuning)'" daemon prio=3 tid=0x068bfc00 nid=0x8f8 waiting for monitor entry [0x4ac7d000]
    java.lang.Thread.State: BLOCKED (on object monitor)
         at com.sun.faces.application.ApplicationAssociate.createAndMaybeStoreManagedBeans(ApplicationAssociate.java:242)
         - waiting to lock <0x7e7df518> (a com.sun.faces.application.ApplicationAssociate)
         at com.sun.faces.el.VariableResolverImpl.resolveVariable(VariableResolverImpl.java:78)
         at fuego.workspace.application.WorkspaceVariableResolver.resolveVariable(WorkspaceVariableResolver.java:83)
         at com.sun.facelets.el.LegacyELContext$LegacyELResolver.getValue(LegacyELContext.java:134)
         at com.sun.el.parser.AstIdentifier.getValue(AstIdentifier.java:68)
         at com.sun.el.parser.AstEmpty.getValue(AstEmpty.java:49)
         at com.sun.el.parser.AstOr.getValue(AstOr.java:41)
         at com.sun.el.parser.AstAnd.getValue(AstAnd.java:41)
         at com.sun.el.ValueExpressionImpl.getValue(ValueExpressionImpl.java:192)
         at com.sun.facelets.el.TagValueExpression.getValue(TagValueExpression.java:71)
         at com.sun.facelets.el.LegacyValueBinding.getValue(LegacyValueBinding.java:56)
         at javax.faces.component.UIComponentBase.isRendered(UIComponentBase.java:307)
         at com.sun.faces.renderkit.html_basic.HtmlBasicRenderer.getChildren(HtmlBasicRenderer.java:460)
         at com.sun.faces.renderkit.html_basic.HtmlBasicRenderer.encodeRecursive(HtmlBasicRenderer.java:437)
         at com.sun.faces.renderkit.html_basic.HtmlBasicRenderer.encodeRecursive(HtmlBasicRenderer.java:440)
         at com.sun.faces.renderkit.html_basic.HtmlBasicRenderer.encodeRecursive(HtmlBasicRenderer.java:440)
         at com.sun.faces.renderkit.html_basic.TableRenderer.encodeChildren(TableRenderer.java:257)
         at javax.faces.component.UIComponentBase.encodeChildren(UIComponentBase.java:693)
         at com.bea.opencontrols.faces.JSFUtility.renderComponent(JSFUtility.java:150)
         at com.bea.opencontrols.faces.JSFUtility.renderChildren(JSFUtility.java:126)
         at com.bea.opencontrols.faces.JSFUtility.renderComponent(JSFUtility.java:154)
         at com.bea.opencontrols.faces.JSFUtility.renderChildren(JSFUtility.java:126)
         at com.bea.opencontrols.faces.JSFUtility.renderComponent(JSFUtility.java:154)
         at com.bea.opencontrols.faces.JSFUtility.renderChildren(JSFUtility.java:126)
         at com.bea.opencontrols.faces.JSFUtility.renderComponent(JSFUtility.java:154)
         at com.bea.opencontrols.faces.JSFUtility.renderChildren(JSFUtility.java:126)
         at com.bea.opencontrols.faces.JSFUtility.renderComponent(JSFUtility.java:154)
         at com.bea.opencontrols.faces.JSFUtility.renderChildren(JSFUtility.java:126)
         at com.bea.opencontrols.faces.JSFUtility.renderComponent(JSFUtility.java:154)
         at com.bea.opencontrols.faces.JSFUtility.renderChildren(JSFUtility.java:126)
         at com.bea.opencontrols.faces.JSFUtility.renderComponent(JSFUtility.java:154)
         at com.bea.opencontrols.faces.JSFUtility.renderChildren(JSFUtility.java:126)
         at com.bea.opencontrols.faces.JSFUtility.renderComponent(JSFUtility.java:154)
         at com.bea.opencontrols.faces.JSFUtility.renderChildren(JSFUtility.java:126)
         at com.bea.opencontrols.ajax.XPRefreshRenderer.RenderContents(XPRefreshRenderer.java:69)
         at com.bea.opencontrols.XPRenderer.encodeChildren(XPRenderer.java:190)
         at javax.faces.component.UIComponentBase.encodeChildren(UIComponentBase.java:693)
         at com.sun.faces.renderkit.html_basic.HtmlBasicRenderer.encodeRecursive(HtmlBasicRenderer.java:435)
         at com.sun.faces.renderkit.html_basic.HtmlBasicRenderer.encodeRecursive(HtmlBasicRenderer.java:440)
         at com.sun.faces.renderkit.html_basic.HtmlBasicRenderer.encodeRecursive(HtmlBasicRenderer.java:440)
         at com.sun.faces.renderkit.html_basic.HtmlBasicRenderer.encodeRecursive(HtmlBasicRenderer.java:440)
         at com.sun.faces.renderkit.html_basic.HtmlBasicRenderer.encodeRecursive(HtmlBasicRenderer.java:440)
         at com.sun.faces.renderkit.html_basic.HtmlBasicRenderer.encodeRecursive(HtmlBasicRenderer.java:440)
         at com.sun.faces.renderkit.html_basic.HtmlBasicRenderer.encodeRecursive(HtmlBasicRenderer.java:440)
         at com.sun.faces.renderkit.html_basic.GroupRenderer.encodeChildren(GroupRenderer.java:130)
         at javax.faces.component.UIComponentBase.encodeChildren(UIComponentBase.java:693)
         at com.bea.opencontrols.faces.JSFUtility.renderComponent(JSFUtility.java:150)
         at com.bea.opencontrols.faces.JSFUtility.renderChildren(JSFUtility.java:126)
         at com.bea.opencontrols.faces.JSFUtility.renderComponent(JSFUtility.java:154)
         at com.bea.opencontrols.faces.JSFUtility.renderChildren(JSFUtility.java:126)
         at com.bea.opencontrols.ajax.XPRefreshRenderer.RenderContents(XPRefreshRenderer.java:69)
         at com.bea.opencontrols.XPRenderer.encodeChildren(XPRenderer.java:190)
         at javax.faces.component.UIComponentBase.encodeChildren(UIComponentBase.java:693)
         at com.sun.facelets.tag.jsf.ComponentSupport.encodeRecursive(ComponentSupport.java:244)
         at com.sun.facelets.tag.jsf.ComponentSupport.encodeRecursive(ComponentSupport.java:249)
         at com.sun.facelets.tag.jsf.ComponentSupport.encodeRecursive(ComponentSupport.java:249)
         at com.sun.facelets.FaceletViewHandler.renderView(FaceletViewHandler.java:573)
         at com.sun.faces.lifecycle.RenderResponsePhase.execute(RenderResponsePhase.java:87)
         at fuego.workspace.faces.lifecycle.LifecycleImpl.phase(LifecycleImpl.java:132)
         at fuego.workspace.faces.lifecycle.LifecycleImpl.render(LifecycleImpl.java:76)
         at javax.faces.webapp.FacesServlet.service(FacesServlet.java:198)
         at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
         at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:292)
         at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
         at fuego.web.filter.NoCacheNoStoreFilter.doFilter(NoCacheNoStoreFilter.java:39)
         at fuego.web.filter.BaseFilter.doFilter(BaseFilter.java:63)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
         at fuego.web.filter.SingleThreadPerSessionFilter.doFilter(SingleThreadPerSessionFilter.java:64)
         at fuego.web.filter.BaseFilter.doFilter(BaseFilter.java:63)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
         at fuego.web.filter.CharsetFilter.doFilter(CharsetFilter.java:48)
         at fuego.web.filter.BaseFilter.doFilter(BaseFilter.java:63)
         at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3496)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(Unknown Source)
         at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2180)
         at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2086)
         at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1406)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)

    Pradeep, it's 4GB. The workspace and engine are running on the same JVM.

  • Issues with merging layers with layer styles

    Just downloaded CS6 Photoshop off the Creative Cloud and the merge layer command does not work. What is happening is that there is one layer with a layer style, then a shape layer, and when I hit Cmd+E it merges the layers but the layer style will not merge with it. I was chatting with support and everything they recommended did not work. It works on all my co-workers' computers and we are all on the cloud. I tried renaming the preferences folder and the settings folder, re-installing, and holding down Shift+Option+Cmd on start-up. I also created a test account, but that turned out to be a major inconvenience: once logged into the account there is nothing there. The suggestions just kept getting more ridiculous as the chat proceeded. Nothing.
    Has anyone experienced this issue?
    Does anyone have a solution where I don't have to create a whole new admin account just to see if CS6 Photoshop works correctly?
    P.S.
    Adobe customer portal is about worthless.

    Get Rich or Die Tryin wrote:
    Just downloaded CS6 Photoshop off the Creative Cloud and the merge layer command does not work. [...]
    There are two merge-layers type commands: one is Merge Down, which involves only two layers, and the other is Merge Visible. In my opinion there is a problem with the way Merge Down is implemented. When you use Merge Down, if the bottom layer in the merge has a layer style, the merged layer also has this layer style attached. This means that pixels from the layer merged into it will be affected by this layer style, and the result may well not look like the composite looked before the merge. This also happens in CS5. Merge Visible works the correct way in my view: as the merge progresses, the top two layers are merged by first applying any layer masks and layer styles and then combining the two layers, so the resulting layer has no layer masks and no layer style. This result is then merged with the next lower layer, again with its masks and style applied first. The end result is a single layer without mask or style that looks like the original composite view. If you are having the problem with Cmd/Ctrl+E and the result has a layer style, try turning the layer style visibility off and see if that helps. If it does, back up to before the merge down and either use Merge Visible or first rasterize the bottom layer to be merged.

  • Memory Issues with D: Drive on my Satellite L500

    This question is regarding my Satellite L500!
    Right, so my D: drive has 148GB of space and currently has just over 3GB left. I keep getting told to clear it down, but when looking through the files there is 7GB taken by "HDD Recovery" and somewhere around 80GB taken up by "Windows Image Backup". This obviously doesn't come to more than 87GB, yet I can't find any more files that are taking up space.
    My question is why has it become so full? I do a backup every Sunday, and have had the laptop less than a year.
    Any help would be appreciated; if you need any more info let me know and I'll tell you ^^
    Thanks guys,
    George

    > My question is why has it become so full? I do a backup every Sunday, and have had the laptop less than a year
    You said you back up the data every Sunday, and that 80GB is used by "Windows Image Backup".
    This means that the backups you create every Sunday are taking up the space on the HDD...
    By the way: HDD Recovery is the Toshiba HDD image which allows you to recover the notebook without the use of a recovery disk. But I still recommend creating the recovery disk using the preinstalled tool called Toshiba Recovery Disk Creator.
    This is important in case something goes wrong with the HDD!
