Pictures with a large amount of white

I am digitising the botanical paintings of my wife and have some issues getting the white background of the paintings white instead of light grey. I use the X-Rite ColorChecker Passport to get accurate colors, and I guess I should use the exposure correction to make the whites white, but I am not sure in which order to do things. Should I first adjust the exposure to get the background white instead of light grey and then do the white balance plus color profile adjustment, or do the white balance first, then the color, and then the exposure (or any other possible order)?

I like your advice, trshaner: that there is actually something to do/get right before the LR stage, in camera...
To give my two pennies' worth on the opening question of the best sequence of adjustments, if still needed inside LR:
Technically Lee Jay is right: looking only at the final slider settings, it is irrelevant in which order they were set.
But process-wise you usually react to the intermediate result of the first adjustments to decide what to do with the other sliders, so a different sequence can give you different feedback and lead you to different settings.
Good advice: adjust Exposure first, to just before clipping or even a bit beyond in the case of specular highlights; holding down ALT gives a clear indication of when clipping starts. Then raise Blacks until the darkest blacks just start to clip. If the image was under- or overexposed, it is best to deal with White Balance only then, not as the very first step.
Then, if needed, Recovery & Fill Light, and finally fine-tune Brightness & Contrast.
Adobe's chosen sequence of the panels and the sliders therein is an indicator of the usual best practice.
You could also auto-develop and start manual adjustment from there. I often prefer "Auto Exposure & Blacks" or "Auto Exposure & Blacks & Contrast" to auto-everything, including Brightness & Contrast.

Similar Messages

  • How to use a functional global with a large number of variables?

    Hi all,
    I'm currently developing a LabVIEW program which controls and acquires data from a device. Up to now I have used global variables (very convenient for experimental parameters), but my program has become too large and I have too many "global" variables to keep using a global.vi.
    I'm wondering if a functional global can help me. If someone has an example of how to use a functional global with a large number of variables...
    thanks

    I agree with Ben.
    Using queues is better than a number of Globals or Action Engines (Functional Globals).  If you need to pass data to a sub-VI, you can simply wire the queue out to a queue control of the sub-VI, or a reference.  See the attached example of passing data to a sub-VI using a queue.  The example is an extremely simple, undocumented tidbit of code that sends the loop count to the consumer VI (a text-based sketch of the same pattern follows after the attachments).
    Run the main program called QueueProducer.vi.
    R
    Attachments:
    QueueProducer.vi 18 KB
    QueueConsumer.vi 14 KB
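    The attachments are LabVIEW VIs, so the code itself cannot be shown inline. As a rough, language-neutral illustration of the same producer/consumer pattern (the producer puts its loop count into a queue and the consumer reads from it, with the queue reference passed in rather than data shared via globals), here is a minimal Python sketch; all names are invented for illustration only.

        # Minimal producer/consumer sketch (text-based analogue of the attached VIs).
        import queue
        import threading

        def producer(q, count):
            # Send the loop count to the consumer, as in the attached example.
            for i in range(count):
                q.put(i)
            q.put(None)                 # sentinel: tell the consumer to stop

        def consumer(q):
            while True:
                item = q.get()
                if item is None:        # sentinel received, shut down cleanly
                    break
                print("received loop count:", item)

        q = queue.Queue()
        t = threading.Thread(target=consumer, args=(q,))
        t.start()
        producer(q, 10)
        t.join()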

  • FaceTime with a large number of connections

    I just found out that when you FaceTime with someone,
    the browsing speed on other devices goes extremely slow. I checked the connections in my router settings: my 4S holds 130+ connections when FaceTime is on and 30+ when it's off, which degrades the bandwidth of the other devices.
    Anyone have this issue? Any suggestions?

    Hello,
    1. I want the report to just fetch the right number of rows to display the current page
    In SSRS 2008 R2 and later versions, reports are processed and rendered page by page as the user interactively reads through the report. The amount of data on each page influences the rendering time for that page.
    That means that when you preview the report, all data will be retrieved from the Oracle database, but it will only process and render the data on the first page of the report, and then process and render the next page when you choose to preview it.
    In order to meet your requirement, you can place a page break at the end of a specified number of rows. For example, add a group in the report and group by the following expression. When a page break is defined for the group, this expression results in a page break every 20 rows (see the small illustration at the end of this reply):
    =Ceiling(RowNumber(Nothing)/20)
    2. Is it possible that the report service is changing my query to use the rownum column to reduce the amount of fetched data?
    No. Reporting Services cannot modify the dataset query, but you can filter report data before or after it is retrieved for a report dataset: change the query for each dataset to filter data before it is retrieved, or create filter expressions in the report to filter data after it is retrieved.
    Reference: http://technet.microsoft.com/en-us/library/dd239395.aspx
    Regards,
    Fanny Liu
    TechNet Community Support
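    As a small illustration of the grouping expression above: Ceiling(RowNumber(Nothing)/20) assigns rows 1-20 to group 1, rows 21-40 to group 2, and so on, and each new group starts a new page. The same arithmetic sketched in Python, purely for illustration; the report itself keeps using the SSRS expression.

        # What =Ceiling(RowNumber(Nothing)/20) computes: rows 1-20 -> 1, 21-40 -> 2, ...
        import math

        def page_group(row_number, rows_per_page=20):
            return math.ceil(row_number / rows_per_page)

        for r in (1, 20, 21, 40, 41):
            print(r, "->", page_group(r))   # 1->1, 20->1, 21->2, 40->2, 41->3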

  • What is the best way to work with a large amount of data?

    Assume you have a big text file that you need to store and work with; what would be the best way to do that?
    I will be adding / removing a lot of elements so I really want to stay away from array resizing overhead.
    I don't want to step on any toes here, but would binary trees / linked lists in C++ be a better solution?

    Assume you have a big text file that you need to store and work with, what would be the best way to do that?
    Parse it into a database, then use SQL to manipulate it.
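    A minimal sketch of that approach in Python, assuming a whitespace-delimited text file with a name and a numeric value per line; the file name and columns here are made up for illustration.

        # Parse the text file into SQLite once, then add/remove/query with SQL
        # instead of resizing in-memory arrays.
        import sqlite3

        conn = sqlite3.connect("bigfile.db")
        conn.execute("CREATE TABLE IF NOT EXISTS items (name TEXT, value REAL)")

        with open("bigfile.txt") as f:          # hypothetical input file
            rows = (line.split() for line in f if line.strip())
            conn.executemany("INSERT INTO items VALUES (?, ?)", rows)
        conn.commit()

        # Adding and removing elements is now a single SQL statement, with no
        # array-resizing overhead in your own code.
        conn.execute("DELETE FROM items WHERE value < 0")
        for name, value in conn.execute(
                "SELECT name, value FROM items ORDER BY value DESC LIMIT 10"):
            print(name, value)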

  • A few problems with Leopard running on my White iMac...

    I have only found 2 problems so far with Leopard running on my iMac. First off, the most annoying one:
    Sometimes, especially when switching users, the screen turns to the "gray screen of death" -- you know, the one with the request to restart your computer in like 4 languages and a huge picture of a power button. Anyway, it is getting quite annoying and I wish I could make it stop. It has also happened when I have been moving large amounts of data (large pictures, movies, etc.).
    And now, the second problem. I was just looking up the memory left on my hard drive when I noticed something I hadn't before. My 160 GB hard drive had 50 GB left. Not too strange there. But when I checked through all the things (apps, users, system, library, developer, etc.) in it (I simply did Apple-I and checked the GB each of them took up) and did a quick add-up on my calculator widget, everything on the hard drive added up to 50 GB. So I have 50 GB free, 50 GB used, and a 160 GB hard drive. How does that work? Do I need to do something to get that extra 60 GB back or is it just gone?

    STEVE!!! wrote:
    Is there any way just to show hidden folders like on a PC?
    There's probably a way to coerce Finder.app into showing everything
    (all is possible in unix) -- but I don't know the OS-X magic words.
    It told me that improper use of the "sudo" command could lead to data loss
    and I don't want to risk doing anything stupid (like a misspelling) that
    could lead to my hard drive being wrecked.
    Good advice. IMO, the 'du' (disk usage) command is safe, because it's
    'read only' -- but I understand that you'd be hesitant to trust your data
    to advice from an anonymous voice from "The Internets." I can't/won't
    argue with common sense.
    You can run the same command without the 'sudo' prefix. It might
    miss a few things and/or spit some 'permission denied' messages,
    but the results will at least be a lower-bound of disk usage.
    ...in the beginning was the command line,
    Looby
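    If the Terminal still feels risky, the same lower-bound estimate can be made with a small read-only Python script: it only reads file sizes and silently skips anything it is not allowed to see. (Note that starting at "/" also walks any mounted volumes, such as a Time Machine disk.)

        # Read-only disk-usage estimate, roughly what 'du' reports without sudo.
        import os

        total = 0
        for root, dirs, files in os.walk("/", onerror=lambda e: None):
            for name in files:
                try:
                    total += os.lstat(os.path.join(root, name)).st_size
                except OSError:
                    pass                      # permission denied etc. -- just skip
        print("visible usage: %.1f GB" % (total / 1e9))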

  • Large Amount of Data in JSF

    Hello,
    I am using the Table Group component for displaying data in my application designed in Java Studio Creator.
    I have enabled paging on the component. I use CachedRowSet on the bean backing the page for getting the data. This works very well at the moment in my development environment, where I am testing with a small amount of data.
    I was wondering how this component performs with very large amounts of data (>75,000 rows). I noticed that there is a button available for users to retrieve all the rows. Apart from that case, when viewing in paged mode, does the component get all the results from the database every time?
    Which component would be best suited for displaying large amounts of data in a table format?
    Thanks In Advance!!

    Thanks for your reply. The table control that I use does have paging as a feature and I have enabled it. It still takes time to load the data initially.
    I wonder if it has to do with the logic of paging. How do you specify which set of 20 records to extract from SQL?
    Thanks for your help!!
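    The usual approach is to ask the database for one page at a time instead of fetching everything and paging in the component. Below is a minimal sketch using Python's DB-API against SQLite, with made-up table and column names; the exact paging syntax varies by database (Oracle uses ROWNUM or OFFSET ... FETCH, SQL Server uses OFFSET ... FETCH), but the idea is the same.

        # Fetch only the requested page of 20 rows from the database.
        import sqlite3

        PAGE_SIZE = 20

        def fetch_page(conn, page_number):
            offset = (page_number - 1) * PAGE_SIZE
            return conn.execute(
                "SELECT id, name FROM records ORDER BY id LIMIT ? OFFSET ?",
                (PAGE_SIZE, offset),
            ).fetchall()

        conn = sqlite3.connect("example.db")    # hypothetical database
        print(fetch_page(conn, 3))              # rows 41-60 only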

  • Looking for ideas for transferring large amounts of data between systems

    Hello,
    I am looking for ideas based on best practices for transferring Large Amounts of Data in and out of a Netweaver based application.
    We have a new system we are developing in Netweaver that will utilize both the Java and ABAP stack, and will require integration with other SAP and 3rd Party Systems. It is a standalone product that doesn't share any form of data store with other systems.
    We need to be able to support 10s of millions of records of tabular data coming in and out of our system.
    Since we need to integrate with so many different systems, we are planning to use RFC as our primary interface in and out of the system. As it turns out, RFC does not deal well with this amount of data being pushed through a single call.
    We have considered a number of possible ideas, however we are not very happy with any of them. I would like to see what the community has done in the past to solve problems like this as well as how SAP currently solves this problem in other applications like XI, BI, ERP, etc.
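    Whatever transport is chosen in the end, one common pattern, independent of the RFC specifics, is to avoid pushing the whole table through a single call: split the records into bounded packages, send them in a loop, and commit or reassemble per package on the receiving side. A minimal, transport-agnostic Python sketch; the send function is only a placeholder.

        # Transport-agnostic chunking: send tens of millions of records in
        # bounded packages instead of one huge call.
        from itertools import islice

        CHUNK_SIZE = 10_000

        def chunks(records, size=CHUNK_SIZE):
            it = iter(records)
            while True:
                batch = list(islice(it, size))
                if not batch:
                    return
                yield batch

        def send_package(batch):
            # placeholder for the real call (RFC, file transfer + trigger, web service, ...)
            print("sending", len(batch), "records")

        all_records = ({"id": i} for i in range(100_000))   # stand-in data source
        for batch in chunks(all_records):
            send_package(batch)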

    Primoz wrote: Do you use KDE (Dolphin) 4.6 RC or 4.5?
    Also, I've noticed that if I move / copy things with Dolphin they're substantially slower than if I use cp/mv. But cp/mv works fine for me...
    Also run Dolphin from a terminal to try and see what the problem is.
    Hope that helps at least a bit.
    Could you explain why Dolphin should be slower? I'm not attacking you, I'm just asking.
    Because I thought that Dolphin is just a "little" wrapper around the cp/mv/cd/ls applications/commands.

  • L3 topology diagram with a large number of networks

    I am working on redesigning a network that is using a single class B network for their LAN workstations and servers. I proposed that they separate the two and put all of their servers into one class C network and the workstations in another class C. They stated that they wanted 14 separate networks for the workstations, a different one for each section of the building, and 28 for the servers. The reason for the 28 is that they didn't keep track of how they used their address space and they do not want to renumber servers because of firewall rules. I have 50 networks hanging off of the collapsed core switch "WS-C6513-E", and finding room for all of these networks is quite difficult. Does anyone have any experience with creating network topology diagrams with a large number of networks hanging off of one device? I have attached a doc of what I have done.

    Brandon
    Sorry, you did say that in the last bit of your first post.
    Last time I was doing topology diagrams I believe Visio had a pop-up function where you could click over an object, but obviously if you are printing them out that does no good.
    Unless you can summarise them into one entry with a note of how many subnets are in use within the summary, I can't think of a way to do it. You could match the VLAN number to the subnet (which you have partly done) and then just list the VLAN numbers with the network prefix and note that the subnet is derived from the VLAN number (a small script that generates such a list is sketched below).
    Jon
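    If you go the "derive the subnet from the VLAN number" route, the table for the diagram notes can even be generated rather than typed by hand. A small Python sketch, assuming a made-up 10.0.<vlan>.0/24 addressing scheme; substitute whatever convention is actually in use.

        # Generate a VLAN -> subnet table for the diagram notes.
        import ipaddress

        vlans = {10: "Workstations - Floor 1", 11: "Workstations - Floor 2",
                 101: "Servers - DMZ"}          # example VLAN IDs and labels

        for vlan_id, label in sorted(vlans.items()):
            subnet = ipaddress.ip_network(f"10.0.{vlan_id}.0/24")
            print(f"VLAN {vlan_id:4d}  {subnet}  {label}")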

  • Best way to pass large amounts of data to subroutines?

    I'm writing a program with a large amount of data, around 900 variables.  What is the best way for me to pass parts of this data to different subroutines?  I have a main loop on a PXI RT Controller that is controlling hydraulic cylinders and the loop needs to be 1ms or better.  How on earth should I pass 900 variables through a loop and keep it at 1ms?  One large cluster??  Several smaller clusters??  Local Variables?  Global Variables??  Help me please!!!

    My suggestion, similar to Altenbach and Matt above, is to use a Functional Global Variable (FGV) and use a 1D array of 900 values to store the data in the FGV. You can retrieve individual data items from the FGV by passing in the index of the desired variable and the FGV returns the value from the array. Instead of passing in an index you could also use a TypeDef'd Enum with all of your variables as element of the Enum, which will allow you to place the Enum constant on the diagram and make selecting variables, as well as reading the diagram, simpler.
    My group is developing a LabVIEW component/example code with this functionality that we plan to publish on DevZone in a month or two.
    The attached RTF file shows the core piece of this implementation. This VI of course is non-reentrant. The Init case could be changed to allocate the internal 1D array as part of this VI rather than passing it from another VI. (A rough text-based sketch of the same idea follows after the attachment.)
    Message Edited by Christian L on 01-31-2007 12:00 PM
    Christian Loew, CLA
    Principal Systems Engineer, National Instruments
    Please tip your answer providers with kudos.
    Any attached Code is provided As Is. It has not been tested or validated as a product, for use in a deployed application or system,
    or for use in hazardous environments. You assume all risks for use of the Code and use of the Code is subject
    to the Sample Code License Terms which can be found at: http://ni.com/samplecodelicense
    Attachments:
    CVT_Double_MemBlock.rtf 309 KB
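    The attachment is an RTF screenshot of LabVIEW code, so here is a rough text-based analogue of the same idea in Python: one shared store holding a fixed-size array, addressed by a named index (the enum) rather than a magic number, with a lock standing in for the VI's non-reentrancy. All names are invented for illustration.

        # Functional-global-style store: one shared array, accessed by named index.
        from enum import IntEnum
        from threading import Lock

        class Var(IntEnum):              # stand-ins for the TypeDef'd enum elements
            CYLINDER_1_PRESSURE = 0
            CYLINDER_1_POSITION = 1
            PUMP_SPEED = 2
            # ... up to 900 entries

        class VariableStore:
            def __init__(self, size=900):
                self._data = [0.0] * size
                self._lock = Lock()      # plays the role of the VI's non-reentrancy

            def write(self, index, value):
                with self._lock:
                    self._data[index] = value

            def read(self, index):
                with self._lock:
                    return self._data[index]

        store = VariableStore()
        store.write(Var.PUMP_SPEED, 1500.0)
        print(store.read(Var.PUMP_SPEED))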

  • My iMac running 10.7.5 crashes when I copy and paste large amounts of information, like a picture.

    My iMac running 10.7.5 crashes when I store a large amount of data, like when I copy and paste a picture. It has also started being painfully slow to open the first page in Safari.  Here is some information a program gathered on my computer.   -Thanks!
    Problem description:
    iMac 10.7.5.  Crashes when copy & paste large amounts and slow to first open first web page then back to normal speed
    EtreCheck version: 2.1.8 (121)
    Report generated April 13, 2015 3:05:27 PM EDT
    Download EtreCheck from http://etresoft.com/etrecheck
    Click the [Click for support] links for help with non-Apple products.
    Click the [Click for details] links for more information about that line.
    Hardware Information: ℹ️
        iMac (27-inch, Late 2009) (Technical Specifications)
        iMac - model: iMac10,1
        1 3.33 GHz Intel Core 2 Duo CPU: 2-core
        8 GB RAM
            BANK 0/DIMM0
                2 GB DDR3 1067 MHz ok
            BANK 1/DIMM0
                2 GB DDR3 1067 MHz ok
            BANK 0/DIMM1
                2 GB DDR3 1067 MHz ok
            BANK 1/DIMM1
                2 GB DDR3 1067 MHz ok
        Bluetooth: Old - Handoff/Airdrop2 not supported
        Wireless: Unknown
    Video Information: ℹ️
        ATI Radeon HD 4670 - VRAM: 256 MB
            iMac 2560 x 1440
    System Software: ℹ️
        Mac OS X 10.7.5 (11G63) - Time since boot: 14 days 7:47:18
    Disk Information: ℹ️
        ST31000528AS disk0 : (1 TB)
            disk0s1 (disk0s1) <not mounted> : 210 MB
            Macintosh HD (disk0s2) / : 999.35 GB (777.11 GB free)
            Recovery HD (disk0s3) <not mounted>  [Recovery]: 650 MB
        OPTIARC DVD RW AD-5680H 
    USB Information: ℹ️
        Sunplus Innovation Technology. USB to Serial-ATA bridge 1 TB
            disk2s1 (disk2s1) <not mounted> : 210 MB
            Time Machine Backups (disk2s2) /Volumes/Time Machine Backups : 999.86 GB (143.50 GB free)
        Apple Inc. Built-in iSight
        Apple Internal Memory Card Reader
        Apple Computer, Inc. IR Receiver
        HP Deskjet 9800
        Apple Inc. BRCM2046 Hub
            Apple Inc. Bluetooth USB Host Controller
    Kernel Extensions: ℹ️
            /Library/Extensions
        [loaded]    org.virtualbox.kext.VBoxDrv (4.2.18) [Click for support]
        [loaded]    org.virtualbox.kext.VBoxNetAdp (4.2.18) [Click for support]
        [loaded]    org.virtualbox.kext.VBoxNetFlt (4.2.18) [Click for support]
        [loaded]    org.virtualbox.kext.VBoxUSB (4.2.18) [Click for support]
            /Library/Parallels/Parallels Service.app
        [loaded]    com.parallels.kext.prl_hid_hook (7.0 15107.796624) [Click for support]
        [loaded]    com.parallels.kext.prl_hypervisor (7.0 15107.796624) [Click for support]
        [loaded]    com.parallels.kext.prl_netbridge (7.0 15107.796624) [Click for support]
        [loaded]    com.parallels.kext.prl_vnic (7.0 15107.796624) [Click for support]
            /System/Library/Extensions
        [loaded]    com.parallels.kext.prl_usb_connect (7.0 15107.796624) [Click for support]
    Startup Items: ℹ️
        HP IO: Path: /Library/StartupItems/HP IO
        ParallelsTransporter: Path: /Library/StartupItems/ParallelsTransporter
        VirtualBox: Path: /Library/StartupItems/VirtualBox
        Startup items are obsolete in OS X Yosemite
    Launch Agents: ℹ️
        [running]    com.hp.devicemonitor.plist [Click for support]
        [loaded]    com.hp.help.tocgenerator.plist [Click for support]
        [loaded]    com.parallels.desktop.launch.plist [Click for support]
        [loaded]    com.parallels.DesktopControlAgent.plist [Click for support]
        [running]    com.parallels.vm.prl_pcproxy.plist [Click for support]
    Launch Daemons: ℹ️
        [loaded]    com.adobe.fpsaud.plist [Click for support]
        [loaded]    com.hikvision.iVMS-4200.plist [Click for support]
        [loaded]    com.microsoft.office.licensing.helper.plist [Click for support]
        [running]    com.parallels.desktop.launchdaemon.plist [Click for support]
    User Launch Agents: ℹ️
        [loaded]    com.adobe.ARM.[...].plist [Click for support]
        [running]    com.akamai.single-user-client.plist [Click for support]
        [loaded]    com.google.keystone.agent.plist [Click for support]
        [not loaded]    org.virtualbox.vboxwebsrv.plist [Click for support]
    User Login Items: ℹ️
        iTunesHelper    UNKNOWN  (missing value)
        GrowlHelperApp    Application  (/Users/[redacted]/Library/PreferencePanes/Growl.prefPane/Contents/Resources/GrowlHelperApp.app)
        Microsoft AU Daemon    Application  (/Applications/Microsoft AutoUpdate.app/Contents/MacOS/Microsoft AU Daemon.app)
        Air Mouse Server    UNKNOWN  (missing value)
        CrossOver CD Helper    UNKNOWN  (missing value)
        Pages    UNKNOWN  (missing value)
        Dropbox    Application  (/Applications/Dropbox.app)
        Wondershare Helper Compact    Application  (/Users/[redacted]/Library/Application Support/Helper/Wondershare Helper Compact.app)
        AdobeResourceSynchronizer    Application Hidden (/Applications/Adobe Reader.app/Contents/Support/AdobeResourceSynchronizer.app)
        HP Scheduler    Application  (/Library/Application Support/Hewlett-Packard/Software Update/HP Scheduler.app)
    Internet Plug-ins: ℹ️
        JavaAppletPlugin: Version: 14.9.0 - SDK 10.7 Check version
        FlashPlayer-10.6: Version: 16.0.0.305 - SDK 10.6 [Click for support]
        QuickTime Plugin: Version: 7.7.1
        AdobePDFViewerNPAPI: Version: 11.0.10 - SDK 10.6 [Click for support]
        Flash Player: Version: 16.0.0.305 - SDK 10.6 Outdated! Update
        AdobePDFViewer: Version: 11.0.10 - SDK 10.6 [Click for support]
        SharePointBrowserPlugin: Version: 14.4.8 - SDK 10.6 [Click for support]
        Google Earth Web Plug-in: Version: 6.0 [Click for support]
        Silverlight: Version: 4.0.60531.0 [Click for support]
        iPhotoPhotocast: Version: 7.0
    Safari Extensions: ℹ️
        1-ClickWeather
        Reload Button
    3rd Party Preference Panes: ℹ️
        Akamai NetSession Preferences  [Click for support]
        Flash Player  [Click for support]
        Growl  [Click for support]
        MacFUSE  [Click for support]
    Time Machine: ℹ️
        Time Machine not configured!
    Top Processes by CPU: ℹ️
             2%    WindowServer
             1%    prl_disp_service
             0%    fontd
             0%    AdobeReader
             0%    ODSAgent
    Top Processes by Memory: ℹ️
        120 MB    mds
        112 MB    AdobeReader
        94 MB    WindowServer
        94 MB    Finder
        69 MB    loginwindow
    Virtual Memory Information: ℹ️
        5.44 GB    Free RAM
        1.78 GB    Active RAM
        464 MB    Inactive RAM
        901 MB    Wired RAM
        1.64 GB    Page-ins
        0 B    Page-outs

    I believe that insufficient RAM may be the source of some of your problems. With somewhere between 4 and 8 GB of RAM you will experience smoother computing; 3 GB doesn't seem right, so you might want to learn more by going to this site:
    http://www.crucial.com/store/drammemory.aspx
    I don't know what's happening with your optical drive, but it seems you use it quite a bit. In that case, look into a lens cleaner for your machine; it's inexpensive and works quite well.
    I hope you'll post here with your results!

  • When I click on almost any picture, in any album, to enlarge it, the pic starts to enlarge and then all I get is a black screen with a large exclamation point in the center.  What have I done to my iPhoto program, and how do I fix it?

    Recently, whenever I click on almost any of my pictures, in any album, in order to enlarge that photo, it starts to enlarge but then,
    very quickly I get a black screen with a large white exclamation mark in the middle.  And no full sized photo. Boo!
    What did I do to my iPhoto program, and how can I fix it?
    Thanks for any suggestions.
    Mike H.

    I'm sorry.  Too old to understand this stuff.  I'm sure this seems really stupid to you.  But I have no idea how one
    would "right click on the Original iPhoto Library" (I know, I know, "as stated").
    Do I right click on the symbol in the dock?  Or open it first and then...  I can't find a way to right click on the word iPhoto in the top bar.
    Or should I open the program you had me download?  There I can find something called "original iPhoto library".
    When I "right click" on that I get some options.  None of them are "original folders".  There's "reveal library in finder", "duplicate library" and "sort libraries".  I have no idea if any of these are what you want, or if any of these might change something.
    When I clicked on "reveal library in finder" earlier on it showed me the copy of the photo library that I made a while ago, sitting right where I know it is on my desktop.  And if I click on that symbol it shows me all the pictures again.  Just like in my regular iPhoto files - all the thumbs are there, but most can't be opened to full sized.
    Are we making progress here?  I'm sure this is frustrating to you.  This is just pictures of family.  Even the thumbs are nice to look at.  But this is nothing critical - like work related or anything.
    Thanks.
    Mike

  • Why, each time I try to save my day's work, version WITH mark-ups and version w/o, does it say "you have placed a large amount of text on the clipboard, do you want to access this?" What is the clipboard and how can I just save my work in its proper file?

    I don't get this clipboard thing! I have created two files, one for the MS with mark-ups, and one for the unmarked version, each living in its own spot in my "house." It seems to be merging the two versions and I have to go in and re-paste, and then it always says "you have placed a large amount of text on the clipboard, do you want to be able to access this?" and it prompts me to say no, but I am afraid I'll lose my day's work so I just tap cancel. I highlight the day's revision and select "copy" to paste into the unmarked doc before I save the marked-up or working doc. It is when I try to close that one that the clipboard issue pops up. What can you tell me about saving a doc in two places and how NOT to get it on the clipboard? Thanks! I am not super computer savvy so layperson's language is much appreciated!

    Are you using Microsoft Word?  Microsoft thinks the users are idiots. They put up a lot of pointless messages that annoy & worry users.  I have seen this message from Microsoft Word.  It's annoying.
    As BDaqua points out...
    When you copy information via edit > copy (command + c) or edit > cut (command + x), you place the information on the clipboard. When you paste information, edit > paste (command + v), you copy information from the clipboard to your data file.
    If you edit > cut (command + x) and you do not paste the information and you quit Word, you could be losing information.  Microsoft is very worried about this. When you quit Word, Microsoft checks if there is information on the clipboard and, if so, puts out this message.
    You should be saving your work more than once a day. I'd save every 5 minutes.  command + s does a save.
    Robert

  • Error generating reports with a large amount of data using OBIR

    Hi all,
    We have integrated OBIR (Oracle BI Reporting) with OIM (Oracle Identity Management) to generate custom reports. Some of the custom reports contain a large amount of data (approx. 80-90K rows with 7-8 columns) and the queries for these reports primarily use the audit tables and resource form tables. When we try to generate a report as HTML it works fine, with the report rendering directly on the console, but when we try to generate and save the same report as PDF or Excel it fails with the following error.
    [120509_133712190][][STATEMENT] Generating page [1314]
    [120509_133712193][][STATEMENT] Phase2 time used: 3ms
    [120509_133712193][][STATEMENT] Total time used: 41269ms for processing XSL-FO
    [120509_133712846][oracle.apps.xdo.common.font.FontFactory][STATEMENT] type1.Helvetica closed.
    [120509_133712846][oracle.apps.xdo.common.font.FontFactory][STATEMENT] type1.Times-Roman closed.
    [120509_133712848][][PROCEDURE] FO+Gen time used: 41924 msecs
    [120509_133712848][oracle.apps.xdo.template.FOProcessor][STATEMENT] clearInputs(Object) is called.
    [120509_133712850][oracle.apps.xdo.template.FOProcessor][STATEMENT] clearInputs(Object) done. All inputs are cleared.
    [120509_133712850][oracle.apps.xdo.template.FOProcessor][STATEMENT] End Memory: max=496MB, total=496MB, free=121MB
    [120509_133818606][][EXCEPTION] java.net.SocketException: Socket closed
    at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:99)
    at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
    at weblogic.servlet.internal.ChunkOutput.writeChunkTransfer(ChunkOutput.java:525)
    at weblogic.servlet.internal.ChunkOutput.writeChunks(ChunkOutput.java:504)
    at weblogic.servlet.internal.ChunkOutput.flush(ChunkOutput.java:382)
    at weblogic.servlet.internal.ChunkOutput.checkForFlush(ChunkOutput.java:469)
    at weblogic.servlet.internal.ChunkOutput.write(ChunkOutput.java:304)
    at weblogic.servlet.internal.ChunkOutputWrapper.write(ChunkOutputWrapper.java:139)
    at weblogic.servlet.internal.ServletOutputStreamImpl.write(ServletOutputStreamImpl.java:169)
    at java.io.BufferedOutputStream.write(BufferedOutputStream.java:105)
    at oracle.apps.xdo.servlet.util.IOUtil.readWrite(IOUtil.java:47)
    at oracle.apps.xdo.servlet.CoreProcessor.process(CoreProcessor.java:280)
    at oracle.apps.xdo.servlet.CoreProcessor.generateDocument(CoreProcessor.java:82)
    at oracle.apps.xdo.servlet.ReportImpl.renderBodyHTTP(ReportImpl.java:562)
    at oracle.apps.xdo.servlet.ReportImpl.renderReportBodyHTTP(ReportImpl.java:265)
    at oracle.apps.xdo.servlet.XDOServlet.writeReport(XDOServlet.java:270)
    at oracle.apps.xdo.servlet.XDOServlet.writeReport(XDOServlet.java:250)
    at oracle.apps.xdo.servlet.XDOServlet.doGet(XDOServlet.java:178)
    at oracle.apps.xdo.servlet.XDOServlet.doPost(XDOServlet.java:201)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
    at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
    at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
    at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:292)
    at weblogic.servlet.internal.TailFilter.doFilter(TailFilter.java:26)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
    at oracle.apps.xdo.servlet.security.SecurityFilter.doFilter(SecurityFilter.java:97)
    at weblogic.servlet.internal.FilterChainImpl.doFilter(FilterChainImpl.java:42)
    at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3496)
    at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
    at weblogic.security.service.SecurityManager.runAs(Unknown Source)
    at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2180)
    at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2086)
    at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1406)
    at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
    at weblogic.work.ExecuteThread.run(ExecuteThread.java:173)
    It seems we face this issue when the query processing takes some time. Do I need to perform any additional configuration to generate such reports?

    java.net.SocketException: Socket closed
         at java.net.SocketOutputStream.socketWrite(SocketOutputStream.java:99)
         at java.net.SocketOutputStream.write(SocketOutputStream.java:136)
         at weblogic.servlet.internal.ChunkOutput.writeChunkTransfer(ChunkOutput.java:525)
         at weblogic.servlet.internal.ChunkOutput.writeChunks(ChunkOutput.java:504)
         at weblogic.servlet.internal.ChunkOutput.flush(ChunkOutput.java:382)
         at weblogic.servlet.internal.CharsetChunkOutput.flush(CharsetChunkOutput.java:249)
         at weblogic.servlet.internal.ChunkOutput.checkForFlush(ChunkOutput.java:469)
         at weblogic.servlet.internal.CharsetChunkOutput.implWrite(CharsetChunkOutput.java:396)
         at weblogic.servlet.internal.CharsetChunkOutput.write(CharsetChunkOutput.java:198)
         at weblogic.servlet.internal.ChunkOutputWrapper.write(ChunkOutputWrapper.java:139)
         at weblogic.servlet.internal.ServletOutputStreamImpl.write(ServletOutputStreamImpl.java:169)
         at com.tej.systemi.util.AroundData.copyStream(AroundData.java:311)
         at com.tej.systemi.client.servlet.servant.Newdownloadsingle.producePageData(Newdownloadsingle.java:108)
         at com.tej.systemi.client.servlet.servant.BaseViewController.serve(BaseViewController.java:542)
         at com.tej.systemi.client.servlet.FrontController.doRequest(FrontController.java:226)
         at com.tej.systemi.client.servlet.FrontController.doPost(FrontController.java:128)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
         at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
         at weblogic.servlet.internal.StubSecurityHelper$ServletServiceAction.run(StubSecurityHelper.java:227)
         at weblogic.servlet.internal.StubSecurityHelper.invokeServlet(StubSecurityHelper.java:125)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:292)
         at weblogic.servlet.internal.ServletStubImpl.execute(ServletStubImpl.java:175)
         at weblogic.servlet.internal.WebAppServletContext$ServletInvocationAction.run(WebAppServletContext.java:3498)
         at weblogic.security.acl.internal.AuthenticatedSubject.doAs(AuthenticatedSubject.java:321)
         at weblogic.security.service.SecurityManager.runAs(Unknown Source)
         at weblogic.servlet.internal.WebAppServletContext.securedExecute(WebAppServletContext.java:2180)
         at weblogic.servlet.internal.WebAppServletContext.execute(WebAppServletContext.java:2086)
         at weblogic.servlet.internal.ServletRequestImpl.run(ServletRequestImpl.java:1406)
         at weblogic.work.ExecuteThread.execute(ExecuteThread.java:201)
         at weblogic.work.ExecuteThread.run(ExecuteThread.java:17
    (Please help find a solution to this issue; it is in production and we need it ASAP.)
    Thanks in Advance
    Edited by: 909601 on Jan 23, 2012 2:05 AM

  • With journaling, I have found that my computer is saving a large amount of data, logs of all the changes I make to files; how can I clean up these logs?

    With journaling, I have found that my computer is saving a large amount of data, logs of all the changes I make to files; how can I clean up these logs?
    For example, in Notes, I have written three notes; however, if I click on 'All On My Mac' in the sidebar, I see about 10 different versions of each note I make; it saves a version every time I add or delete a sentence.
    I also noticed that when I write an email, Mail saves about 10 or more draft versions before the final is sent.
    I understand that all this journaling provides a level of security and prevents data loss, but I was wondering, is there a function to clean up journal logs once in a while?
    Thanks
    Roz

  • How do I pause an iCloud restore for app with large amounts of data?

    I am using an iPhone app which holds 10 GB of data (media files).
    Unfortunately, although all data was backed up, my iPhone 4 was faulty and needed to be replaced with a new handset. On restore, the 10 GB of data takes a very long time to restore over Wi-Fi. If I am interrupted (I reached the halfway point during the night) and go to work or take the dog for a walk, I end up on 3G for a short period of time.
    The next time I am in a Wi-Fi zone, the app starts restoring again right from the beginning.
    How does anyone restore an app with large amounts of data or pause a restore?

    You can use classifications but there is no auto feature to archive like that on web apps.
    In terms of the blog, like I have said to everyone who has posted about blog preview images:
    http://www.prettypollution.com.au/business-catalyst-blog
    Just one example of an image at the start of a blog post rendering out; not hard at all.
