Dual monitors for Lightroom

How can I use a 2006 iMac as a second monitor for a 2007 iMac? The Early 2006 runs OS X 10.6.8; the Mid 2007 runs OS X 10.9.1.


Similar Messages

  • Dual Monitors and Lightroom

    The FAQ on Lightroom Extra says:
    Dual Monitor support - what does LR support?
    You can only use a second monitor in the Slideshow module.
    You can work around the single screen interface limitation by dragging the right Panels over to the second monitor, but that is it for now.
    Does this mean it should be possible to drag the panels to the second monitor in all other modes? I have all my tools, etc., on a second monitor in PS by dragging them, but I can't do this with Lightroom. Have I misunderstood this FAQ, or is there a setting I am missing?

    Well, I would say that dual monitor users are a minority rather than a majority, so I can't really knock Adobe too hard for not having detachable tool bars.
    But then again... PS2 and PS Elements both have this feature, so it is a bit disappointing not to have dual monitor support like this.
    I do run dual monitors, but the monitors are not identical and do not run at the same resolution. I suspect that this is not an uncommon setup, so simply dragging the window over won't work for me.
    I would take points off for this omission, but not that many. I am much more upset about writing information into JPEGs ;)

  • Dual Monitor Issues

    Hi, I'm new to the forums.
    My company just bought a Lenovo 3000 J110 from TigerDirect, and I am trying to install an extra video card in the PCI-e slot. It's an NVIDIA GeForce NX7300LE. It was the cheapest I could find locally on short notice.
    Every time I put in the video card (or a very old PCI video card I finally found on the shelves), I keep getting beeping noises. I replaced the standard 250W PSU with a 380W one I also found at work. I still get the same problem. I've checked the BIOS video setup, which is set to the following:
       Select Active Video: Automatic
       DVMT 4.0 Mode: Auto
    originally this was setup for the onboard video card.
    I can't seem to figure out why this is not working. I tested the motherboard and it works fine (I have a few PCI Ethernet cards lying around; popped those in and got no problems). The beeping still happens even on the old video card (which has no 3D acceleration). The PSU is perfectly fine and boots up Windows perfectly, so I know it's not that either. I've been looking around boards all day trying to find a fix.
    The machine runs perfectly fine; I upgraded the 512 MB by adding 1 GB of RAM, and it runs XP Pro smoothly. I just need dual monitors for a bunch of applications I have to keep up, running, and monitoring for most of the day.
    Any suggestions to correct this problem?

    Yep, it's installed in the correct slot (#17); I checked the guide and it was installed correctly.
    I was looking over the specs of the NX7300LE TD256EH and can't find anything saying PCI-e x16; it just says PCI-e certified. After doing some more scrounging around (basically Google searches), I have located a few things saying this card is a PCI-e x16 card. The following link is the same make and model of the graphics card I have. The interface information is near the bottom of the page.
    http://www.cuttingedgecomputers.ca/shopexd.asp?id=2318
    Interface
    PCI-E 16X
    VGA, DVI-I, S-Video/RCA
    Before I left work last night, I figured the 380W power supply was just not big enough, so I found a 480W one and slapped that in. Still the beep warning, and no booting into Windows (or anything on the screen at all).

  • Advice needed on monitor for Photoshop and Lightroom use

    Hi. I am a serious amateur photographer wishing to move to the next level and sell some of my work.
    I just had a custom PC built to work with the new copies of Photoshop CS5 and Lightroom 3 I bought (lots of RAM and HD space, SSD, etc.).
    The last piece of my system is to purchase a monitor. I want to be somewhere in the better-than-Best-Buy but less-than-NEC/Eizo range in price, or between $500 and $900. I have worked with cameras since the early '80s and moved to digital several years ago, but the only post processing I have done is with Photoshop Elements. I would be doing mostly prints to sell but also need to have a web site to do so. I will also use the PC for daily net surfing, but I do not game or watch a lot of video on the PC.
    Being really new to this whole process I have a few questions.
    The first thing I need to decide is whether I need to look for a wide gamut display or not.
    I realize the whole chain must be 10 bit (Adobe - OS - graphics driver - graphics card - display port).
    I have Adobe Photoshop CS5 and Lightroom 3, Win7 64 bit, Zotac ZT 50701 10M video card (which uses GeForce GTX 560  fermi and an nvidia chipset. It does have displayport). I am having a hard time determining whether my video card actually supports wide gamut (10bit).
    Standard vs wide gamut? Is wide gamut important enough to deal with the issues it brings (calibration, and viewing things outside PS, LR, and other color-managed apps, which appear to be rare)? Is sRGB good enough for most prints (I don't do fine art, mostly nature and portraits, but am starting to do some HDR work)? If wide gamut is the way to go I have no problem with that and have the time to learn about calibration, color management, etc. But I also want to make sure the juice is worth the squeeze.
    24" vs 27"? Is there any advantage to one or the other when editing photos?
    IPS vs PLS? I realize they are similar but are there differences worth noting?
    Glossy vs Matte Anti-Glare? seems to be a lot of comments regarding the anti-glare coating, mostly poor. Yet I can see issues using a glossy screen in my study with a window to my back.
    One manufacturer vs another? I realize Eizo, NEC and LaCie are at the top of the heap. But with my budget, after upgrading my pc and camera equipment, I can't make that work now. So I need to choose from the next group down (Dell, HP, Samsung, Asus...)
    One or two monitors? It looks like many (mid-grade) wide gamut monitors do a lousy job of displaying anything but color managed sites. Is that necessarily true of all the mid-grades? Or can some be used for graphics but as well for routine net surfing, MS Office, etc...? Or am I better off getting two monitors, one for graphics and one for the rest? That would pretty much limit me to 24" or less given my budget (used to using a Dell 21" TN monitor that oddly crapped out just as my new pc was done).
    The more I read reviews the more confusing it gets. There seems to be a difference of opinion even among pros on whether to go wide gamut or stick with an easier sRGB. Realizing a standard gamut monitor would be cheaper, I do want to make the right decision up front, given my budget.
    The one thing I have found astounding is that there is nowhere to actually see many of the monitors I am considering. We live in Nashville TN but my wife is from Atlanta Ga so we drove there a few weeks ago to visit family and for me to visit monitor shops. Even the largest ones there (Fry's and Microcenter) had minimal IPS monitors, a few Dells and HP's. The knowledge of their sales folks was so poor I finally gave up. Felt bad about this until I posted this on another board and got a reply from a guy in LA (second largest city in the US) that he wanted to see a particular monitor and there was no place even there to do so.
    Anyhow, here is what I have considered:
    24" Wide Gamut: Dell U2410 and Asus PA246Q. Dells appear to be good IF you get a good one. The Asus appears to be a clone of the Dell that gets a lot of good press.
    27" Wide Gamut: Dell U2711 that also gets a ton of good reviews
    24" Standard Gamut: Dell U2412 and HP ZR2440.
    27" Standard Gamut: Samsung S27A850D and Apple Cinema- The Samsung uses PLS technology versus IPS while the Apple is a glossy screen that will work with a pc.
    Sorry for the long post. Any comments are greatly appreciated.

    dkg62 wrote:
    I realize the whole chain must be 10 bit
    Not trying to talk you out of setting up a 10 bit pipeline, but it's still not very mature, and it really isn't a necessity to get a good editing experience.
    Personally I find advantage in using two 4:3 ratio monitors for Photoshop work.  All my panels are on the right monitor, while pretty much the entire left one shows the Photoshop main window and the working canvas space.  My desktop is 3200 x 1200 pixels overall, and I find having the panels remain visible all the time is important.
    Regarding whether a wide gamut is important...  Will you be printing to devices that deliver a wide gamut?  What other things will you be doing with your system?
    It's not a no-brainer whether a wider gamut monitor is always "better" for everything, since it can accentuate the differences between the output from color-managed and non-color-managed applications, and it's definitely true that not everything is color-managed.  With a monitor that's close to sRGB, for example, you might find Internet Explorer output acceptable, while using a wide gamut monitor will result in garishly oversaturated IE displays.  On the other hand, Firefox (with a settings tweak) seems to get color management right, so there is an alternative.
    I think, as John has implied above, you should work to get your head completely around how color-management works, soup to nuts.  If you don't, there will always be things that are a mystery or which surprise you at the wrong times.  Being able to order a print and have it come back with the expected color can be very important, as you might imagine.
    -Noel

  • Need recommendation on new monitor for CS6 and Lightroom 5

    I would like to upgrade my monitor for photo editing using CS6 and Lightroom 5.  I have spent a lot of time reviewing several monitors and am just as confused as when I started.  I am looking for a monitor in the $500 to $900 range but want to make an intelligent decision.  HELP!
    JAG-EVV

    If you have a subscription there should be no problem.  You should be able to install on two machines. If you try to activate a third machine, I believe the activation server will deactivate all your activations and then activate the current machine. You can then re-activate one of your other machines.
    If you have a perpetual licence and have two activations, you will need Adobe Customer Support to assist you.

  • How to adjust hot corners for dual monitors?

    Anyone know how to adjust the hot corners so that I can use all 4 corners of my primary monitor and not my second?
    Right now I have to move the cursor all the way across both monitors to the bottom right corner of my second monitor to activate it. In Lion I believe they had a dual monitor adjustment to fix this problem.
    Any ideas in Mountain Lion?

    Actually, on top of that, is there also a way to make the menu bar appear on both screens? Having a window in a different monitor from the menu bar is a bit inefficient.

  • How to display a dual monitor on my MacBook Pro?

    Hi, recently I bought a 22-inch LCD monitor and I would like to make a dual monitor setup with my laptop. How do I do that? I have tried many ways but it still does not display. Please help, in both Windows and OS X. Thanks.

    I set mine up in OS X and it works fine in Windoz - both under Parallels and with CrossOver.
    I don't understand why yours doesn't work - unless you are using dual boot. If so, have you gone into the Windoz Control Panel and set up the monitor, or installed the driver that came with the monitor? Windows now says that drivers aren't needed, but many manufacturers include them as they do most of the work for you.
    Hope this is of some help.
    rhys

  • GPU notes for Lightroom CC (2015)

    Hi everyone,
    I wanted to share some additional information regarding GPU support in Lr CC.
    Lr can now use graphics processors (GPUs) to accelerate interactive image editing in Develop. A big reason that we started here is the recent development and increased availability of high-res displays, such as 4K and 5K monitors. To give you some numbers: a standard HD screen is 2 megapixels (MP), a MacBook Retina Pro 15" is 5 MP, a 4K display is 8 MP, and a 5K display is a whopping 15 MP. This means on a 4K display we need to render and display 4 times as many pixels as on a standard HD display. Using the GPU can provide a significant speedup (10x or more) on high-res displays. The bigger the screen, the bigger the win.
    For example, on my test system with a 4K display, adjusting the White Balance and Exposure sliders in Lightroom 5.7 (without GPU support) is about 5 frames/second -- manageable, but choppy and hard to control. The same sliders in Lightroom 6.0 now run smoothly at 60 FPS.
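    The pixel counts quoted above can be checked with simple arithmetic. A small illustrative sketch (the display names and resolutions below are common values, not figures taken from the post):

```python
# Megapixel math behind "the bigger the screen, the bigger the win".
resolutions = {
    'HD (1920x1080)': (1920, 1080),
    'MacBook Pro Retina 15" (2880x1800)': (2880, 1800),
    '4K (3840x2160)': (3840, 2160),
    '5K (5120x2880)': (5120, 2880),
}

def megapixels(size):
    width, height = size
    return width * height / 1_000_000

for name, size in resolutions.items():
    print(f"{name}: {megapixels(size):.1f} MP")
# HD: 2.1 MP, Retina: 5.2 MP, 4K: 8.3 MP, 5K: 14.7 MP --
# roughly the rounded 2/5/8/15 MP figures cited above.
```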
    So why doesn't everything feel faster?
    Well, GPUs can be enormously helpful in speeding up many tasks. But they're complex and involve some tradeoffs, which I'd like to take a moment to explain.
    First, rewriting software to take full advantage of GPUs is a lot of work and takes time. Especially for software like Lightroom, which offers a rich feature set developed over many years and release versions. So the first tradeoff is that, for this particular version of Lightroom, we weren't able to take advantage of the GPU to speed up everything. Given our limited time, we needed to pick and choose specific areas of Lightroom to optimize. The area that I started with was interactive image editing in Develop, and even then, I didn't manage to speed up everything yet (more on this later).
    Second, GPUs are marvelous at high-speed computation, but there's some overhead. For example, it takes time to get data from the main processor (CPU) over to the GPU. In the case of high-res images and big screens, that can take a LOT of time. This means that some operations may actually take longer when using the GPU, such as the time to load the full-resolution image, and the time to switch from one image to another.
    Third, GPUs aren't best for everything. For example, decompressing sequential bits of data from a file -- like most raw files, for instance -- sees little to no benefit from a GPU implementation.
    Fourth, Lightroom has a sophisticated raw processing pipeline (such as tone mapping HDR images with Highlights and Shadows), and running this efficiently on a GPU requires a fairly powerful GPU. Cards that may work in the Photoshop app itself may not necessarily work with Lightroom. While cards that are 4 to 5 years old may technically work, they may provide little to no benefit over the regular CPU when processing images in Lr, and in some cases may be slower. Higher-end GPUs from the last 2 to 3 years should work better.
    So let's clear up what's currently GPU accelerated in Lr CC and what's not:
    First of all, Develop is the only module that currently has any GPU acceleration. This means that other functions and modules, such as Library, Export, and Quick Develop, do not use the GPU (performance should be the same for those functions regardless of whether you have GPU enabled or disabled in the prefs).
    Within Develop, most image editing controls have full GPU acceleration, including the basic and tone panel, panning and zooming, crop and straighten, lens corrections, gradients, and radial filter. Some controls, such as local brush adjustments and spot clone/heal, do not -- at least, not yet.
    While the above description may be disappointing to some of you, let's be clear: This is the beginning of the GPU story for Lightroom, not the end. The vision here is to expand our use of the GPU and other technologies over time to improve performance. I know that many photographers have been asking us for improved performance for a long time, and we're trying to respond to that. Please understand this is a big step in that direction, but it's just the first step. The rest of it will take some time.
    Summary:
    1. GPU support is currently available in Develop only.
    2. Most (but not all) Develop controls benefit from GPU acceleration.
    3. Using the GPU involves some overhead (there's no free lunch). This may make some operations take longer, such as image-to-image switching or zooming to 1:1. Newer GPUs and computer systems minimize this overhead.
    4. The GPU performance improvement in Develop is more noticeable on higher-resolution displays such as 4K. The bigger the display, the bigger the win.
    5. Prefer newer GPUs (faster models within the last 3 years). Lightroom may technically work on older GPUs (4 to 5 years old) but likely will not benefit much. At least 1 GB of GPU memory. 2 GB is better.
    6. We're currently investigating using GPUs and other technologies to improve performance in Develop and other areas of the app going forward.
    The above notes also apply to Camera Raw 9.0 for Photoshop/Bridge CC.
    Eric Chan
    Camera Raw Engineer

    I posted the following information on the Luminous Landscape forum (GPU used in Develop but not Library?) in response to comments you made there.
    I am very puzzled by the extremely blurry image in the second screen capture when the GPU is enabled.
    OS X (10.9.5)
    Hardware configuration:
       MacPro (late 2013)
       AMD FirePro D300 2048 MB
       Apple Cinema Display 1920 x 1200
       16 GB RAM
       1 TB SSD
    Test file:  Nikon D800 NEF, 50 MB
    (0)  open the Develop module
    (1)  select a different NEF file and zoom to 1:1
    (2)  clear the ACR cache
    (3)  select the test file
    (4)  take 3 screenshots to illustrate the 3 display states (the first one is hard to capture)
    (5)  select another image
    (6)  same 3 states are present
    (7)  return to the test file and the same 3 display states are present
       Why isn’t the ACR cache coming into play in step 7?
    If I repeat this process with the GPU disabled the image is displayed without the intermediate states.
    I have attached the 3 screenshots mentioned in step (4).

  • I'm getting error messages when I download updates for Lightroom and Flash for the Mac; can't figure out how to troubleshoot

    I'm getting error messages when I download updates for Lightroom and Flash for the Mac. I can't figure out how to troubleshoot.

    It sounds as if your browser and some of your apps are not color-managed.  This is pretty typical.  Even IE9 is only partially color-managed.
    You can expect color-managed and non-color-managed applications to show you different things with the same images.  How different will depend upon how different your monitor color profile is from the image's color profile.
    For web publication and most general use, experts usually advise saving images with the sRGB profile.  If such images, saved through the Save for Web & Devices function, look different to you than you expect, it may be that your input images have previously been saved in another color space.
    You should really try to get your head around color-management by reading more on it.  It can seem baffling and it's difficult to understand without some background.  A quick web search turns up many overviews.  Beware, though, even people writing articles sometimes don't fully understand it.
    -Noel

  • How to calibrate a monitor for Win 7 Pro?

    Thanks for any suggestions.
    Hersch Pilloff          [email protected]

    Some devices will calibrate/profile only monitors, some (more expensive ones) will also profile printers.  Profiling printers is not so important to most people. 
    Older devices (e.g. Eye One Display 2, Spyder 2) are not suitable for wide-gamut displays. 
    There are no special requirements for Lightroom or Windows 7. 
    Google for reviews such as http://www.northlight-images.co.uk/reviews.html#Monitor_profiling

  • Colors wrong on monitor in Lightroom 3

    Everything was working OK until I updated my motherboard - Gigabyte motherboards with Intel chipsets, both before and after. The new board has an Intel HD Graphics 3000 chipset, with the latest driver installed.  The monitor is a Samsung SyncMaster, also with the latest driver installed. Windows XP SP3. Display setting: 32 bit at 1680 x 1050 (to match the screen resolution).  Spyder3Express calibration used - latest version.
    The colors are wrong on the monitor in Lightroom 3.5.  The colors are OK in Windows Viewer or Canon ZoomBrowser EX.  I can even export from Lightroom to a jpg and get the same colors in the jpg as I did with an export before the upgrade.  The color change in Lightroom is applicable to CR2, jpg and dng files imported both before and after the motherboard upgrade.
    I've updated all drivers to latest versions and uninstalled/reinstalled the Spyder calibration software, recalibrated the monitor and rebooted.  The color problem still persists in Lightroom.
    I suspect a bad color profile is being loaded into Lightroom but do not know how to determine the profile loaded or even where the profiles are loaded from.  Any help in troubleshooting would be appreciated.

    Just want to clarify a point here that many people miss, because it's not obvious. Monitor calibration and profiling are separate processes, although both are usually performed at the same time and therefore often lumped together in the word "calibration".
    You see a color shift when you boot up the system, right? Many people assume that's the monitor profile, but actually it's only a small part of it. That's just a basic correction of white point and gamma curves, and this is the calibration part. Since it's loaded into the video card (or monitor) it affects everything, system-wide.
    But the monitor profile is more complex. It is a full and complete description of the monitor's behaviour in three-dimensional color space, in its calibrated state. The precision is much higher. Calibration can't differentiate between a wide gamut and a standard gamut monitor for instance, but the profile will, because it pinpoints the position of the three primaries. Just to give you an idea of the difference.
    So if the profile is bad Lightroom will get the wrong picture and be thrown off. The other applications live in blissful ignorance and for them it's business as usual.

  • Good 17-19" color monitor for under $200?

    I'm looking for a decent color monitor -- in terms of color fidelity and contrast ratio -- for under $200, preferably $150 or less.  Is that possible?  My current monitor makes pictures appear too dark when editing, even with the brightness turned all the way up.  When I edit photos to make them look good on the screen, they appear washed out elsewhere (i.e., on other screens or when printed).
    I'm obviously not a pro looking for top of the line stuff -- just a decent 17-19 inch monitor for editing photos in Lightroom.  Thanks.

    While a better monitor is always, well... better, I'm not sure that's going to solve your problem. The issue appears to be one of color management, particularly monitor calibration. I would strongly advise investing $25 in a one-month membership at lynda.com. There you can check out some excellent video training on this subject (and many others).
    My experience is that to get anywhere close to matching prints to what you see on screen, be prepared to spend a couple of hundred bucks on calibration software and hardware. I opened this can of worms a couple of years back and it took a while to get my head around the concepts of color spaces, gamuts, soft proofing, hardware profiles, etc. But it was worth it in the end. I'm certainly no expert, but I have a much clearer idea of what's going on with my prints color-wise. Screen-to-print is still not an absolutely perfect match (is anyone's?), but close enough for my purposes (weddings, events, portrait photography).
    I looked into getting a better monitor, but my research tells me that if I'm paying much less than $1500, the result may not be much better than what I've got (an old ViewSonic GS771, a snip at $20!!).
    Seriously, lynda.com, you can't go wrong. Best money I ever spent.

  • 10 bit colour display path for lightroom 3??

    I have been following an interesting forum topic on the PS forum here about 10 bit monitor support in PS CS5.
    For some background:
    10 bits/pixel is becoming more prevalent with high-end displays these days (it provides 1.07 billion color possibilities instead of the 16.7 million colors available with the regular 8 bit display path). Getting a fully working 10 bit path from the app to the display appears to be more of a challenge though, but a challenge which appears to have been overcome now in PS CS5. I'm wondering if moves are afoot for Lightroom to also support the 10 bit color path??
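    The color counts quoted above follow directly from the per-channel bit depths. A small illustrative calculation (not code from any Adobe product):

```python
# Total distinct colors for a given bit depth per channel:
# each of R, G, and B gets 2**bits levels, so the total is the cube.
def colors(bits_per_channel):
    return (2 ** bits_per_channel) ** 3

print(f"8-bit:  {colors(8):,} colors")   # 16,777,216   (~16.7 million)
print(f"10-bit: {colors(10):,} colors")  # 1,073,741,824 (~1.07 billion)
```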
    This is my take for what you need for PS CS5 to display 10 bit colour in windows (sorry, I'm not a Mac person so couldn't comment):
    Suitable graphics adaptor - Some ATI FirePro models and nVidia Quadro models support 10 bit color (cheapest is nVidia Quadro FX 580) - the availability/use of the displayport interface is most often the tell tale pointer as to whether the adaptor supports this. Only more recent driver packs provide 10bit API support for apps to utilise, so latest drivers are often required.
    Displayport connection between graphics adaptor and monitor (DVI only supports 8 bit color)
    A monitor which supports 10bit color - (High end NEC models (PA241W/PA271W models and new Spectraview Reference models in UK) now support this as does the HP DreamColor plus some Eizo and Lacie models to mention a few)
    Enable 10bit color support in the Graphics adaptor control panel - this option generally only appears when all of the above is in place, and when enabled, provides the display path via OpenGL, for apps to utilise if they are coded for such (this option is known as 'Enable 10-bit pixel format support' for ATI, and 'Deep color for 3D applications' for nVidia)
    Enable OpenGL hardware support (under Preferences, Performance,  'Enable OpenGL Drawing' in PS CS5)
    So will this be supported in Lightroom 3 before long? Or perhaps it is already??
    Anyone know this?
    Thanks, Paul
    p.s. I realise the above chain of requirements sounds complex, but I got it working this week for a new PC build and it was actually quite straightforward.

    martin-s wrote:
    What would be the advantage of this?
    If you google around for how many colours the human eye is able to distinguish, you'll find scientific research quoting numbers around 10,000,000.
    Horribly false.
    The prevalent sRGB color space is covered fairly well by the 8 bits per channel depth most commonly used ("32 bit" or "True Color" in Windows), and that is about 16.7 million colors.  While bit depth and color space are different things, once you go to the wider gamuts of other color spaces such as Adobe RGB without using more bits, you will start to see banding in similarly colored areas (blue skies are a good example) because there are not enough digital steps to make smooth transitions as the color gradually changes.  Take my word for it (I've seen it in many of my own pictures) or do more reading for more details if it's not clear, but the key points follow.
    First go to:
    http://en.wikipedia.org/wiki/XvYCC
    and read about the xvYCC color space.  In their reference number 2 down toward the bottom of the web page you find a link to:
    http://data.memberclicks.com/site/hopa/2007_xvYCC_forHPA_FINAL.pdf
    which is a pdf of a presentation discussing extended gamut color spaces.  Check out slide 3 which is at the bottom left of page 1 to see examples of many things in the real world that are quite visible to humans but outside of sRGB.
    Next check out slide 19 at the bottom left of the 5th page to see just how pathetic sRGB is at covering human vision.  The small pictures on the right in this slide show the coverage of the various color spaces of the "Munsell Color Cascade".  The gray areas indicate colors that each space cannot cover, and you can see sRGB has huge blotches of gray.
    Munsell did some of the very early definitive lab testing of human color perception.  You can start reading about him and his color system here:
    http://en.wikipedia.org/wiki/Munsell_color_system
    and continue reading other links found on that page for more info about the extensive lab testing of human color perception.
    Looking back at slide 4 on page 1 of the presentation linked above, you will find an odd-looking colored region that is curved around the top.
    That colored region is the estimated range of standard human vision.  There is a version of that which has the sRGB color space shown on it at:
    http://en.wikipedia.org/wiki/File:Cie_Chart_with_sRGB_gamut_by_spigget.png
    And that large view also gives you a much better idea of just how pathetically inadequate the sRGB color space is at covering human vision.  Again, bit depth is not color space, but 8 bits per channel is pretty much maxed out trying to represent the sRGB color space without having to fake colors within its triangle.
    There are probably cases where banding can be seen even in purely sRGB images with only 8 bits per channel.  And every color that you see in that large region with the curve at the top (in the picture shown in the last link above) that lies outside the sRGB triangle has to be "clipped", meaning converted into some color that does lie within the space.  That means you never see any actual colors that fall outside the triangle, even if 8 bits per channel could have covered them somehow using some other color space.  They're simply lost.
    If you do try to use 8 bits for wider gamut color spaces, you end up with the banding I mention above because there are too many shades of color to be represented by 256 steps for each of red, green, and blue.  Thus the need for 10 bits or more per channel.
    I hope this helps to dispel any myths about the limitations of human vision falling within what 8 bits per channel or sRGB can provide.
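    The step-size arithmetic behind the banding argument above can be sketched quickly (an illustrative calculation, not taken from the post):

```python
# Smallest representable change in a 0.0-1.0 channel value at a given depth.
# Stretching the same number of steps over a wider gamut makes each step a
# bigger perceptual jump, which is where visible banding comes from.
def smallest_step(bits_per_channel):
    levels = 2 ** bits_per_channel - 1  # 255 steps at 8 bits, 1023 at 10
    return 1 / levels

step_8 = smallest_step(8)
step_10 = smallest_step(10)
print(f"8-bit step:  {step_8:.6f}")
print(f"10-bit step: {step_10:.6f}")
print(f"ratio: {step_8 / step_10:.2f}")  # ~4.01, i.e. 10 bits is ~4x finer
```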

  • Duel Monitors Mac Mini

    I am getting a Mac mini and want to hook up duel monitors on it. Each monitor has a single VGA, DVI, and HDMI port. For the first one I will use the HDMI to DVI adapter, and for the second I would like to use the Mini DisplayPort (aka Thunderbolt) to DVI adapter. I heard some people saying it won't work for a second monitor because it's passive and not active, and stuff like that. How true is that?

    FYI: It's, "dual," not, "duel."
    A "duel" is something 2 people do with pistols.

  • Is it wise to keep the Nikon camera files "DSC's"  after downloading them and converting to DNG files via Adobe converter for lightroom use. In other words do the DNG files have all the raw data I would ever need in processing or should I save the camera'

    Is it wise to keep the Nikon camera files "DSC's"  after downloading them and converting to DNG files via Adobe converter for lightroom use. In other words do the DNG files have all the raw data I would ever need in processing or should I save the camera's DSC files?

    DNG files do not contain some metadata supplied by the camera, which can be used by the manufacturer's software. Thus, if you don't keep the original Raw photo, you will lose this information.
    If you're 1000% sure you're never going to use the manufacturer's software, then this isn't a problem. But who can be sure what software you will be using 10 years from now?
