Lightroom, Color Temperature, Kelvin

Is there any way to have Lightroom show the color temperature in Kelvin
instead of what it is using? Or does someone have a chart that equates the LR
numbers to degrees Kelvin?
I wish Adobe would stick with standards on such things. It is bad enough
that when you use the white balance eyedropper, the grid enlargement you see
when it is over the image shows the RGB values as a percentage instead of
what Photoshop shows you. It is very hard to get close to 128 128 128 when
you're using a percentage. Again, Adobe started the use of certain methods in
Photoshop, so I don't know why they can't be consistent.
Robert
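For anyone translating between the two readouts by hand, the arithmetic is simple. This is only the naive 8-bit conversion, a sketch for reference; Lightroom's readout actually comes from its own internal working space, so values will not always line up this neatly:

```python
def level_to_pct(level: int) -> float:
    """Map an 8-bit level (0-255, Photoshop style) to a percentage readout."""
    return level / 255.0 * 100.0

def pct_to_level(pct: float) -> int:
    """Map a percentage readout back to the nearest 8-bit level."""
    return round(pct / 100.0 * 255.0)

# Mid grey 128 128 128 corresponds to roughly 50.2% in a percentage readout.
print(round(level_to_pct(128), 1))  # 50.2
print(pct_to_level(50.2))           # 128
```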

Jim,
My understanding is that an absolute K whitepoint does not strictly make any sense for relative-colour JPEGs. This has been hashed out, ad nauseam, on other threads.
That is, the controls provided are intended to be /correct/, and not just convenient.
I don't have a dog in this fight, but I prefer correct interfaces myself.
I'm actually having trouble coming up with a case where an absolute Kelvin setting is of much use, anyway. I suppose if you know that 5300K was likely the temperature of the colour on a particular shoot it might be useful. But, if you already know that why not set the camera to the same value? If one needed absolute K (say, for legal or scientific reasons) I suspect one might be shooting raw, and using a pre-set whitepoint.
At any rate, my advice is to post this as a feature request, which (I understand) will probably be interpreted to mean, "provide access to absolute K values for whitepoint for file formats that can only represent relative temperature, quantizing and approximating in some cases to get as close as we can, within some arbitrary limits."
I'd have to go look at the other thread and do some more research, but this, I think, is the crux of the matter. Misunderstandings and errors on this matter are, of course, all mine.

Similar Messages

  • Color temperature setting on camera and in Lightroom do not match

    I found that Lightroom shows a different value of color temperature than was set on the camera body when the photograph was taken.
    For example, I set the color temp. to 3700K for the given scene according to what the external color meter was showing (Sekonic Color Meter C-500 Prodigi Color). It was 3700K, no color correction.
    When I imported this Raw file into Lightroom, the reading of color temp. was 3550 and -10 to green.
    When I import the same file into Capture NX2, the temperature reading in the software is the same as was set on the camera: 3700, 0, 0. Why the difference? Do I have some hidden setting somewhere in Lightroom that adjusts the CT during import?

    >Hi Lee Jay, so if I understand you well, I should ignore the number. But then how does LR know what the neutrals are? Is it calculating something like average tonality and then setting the white balance so that this average is neutral gray, like some simple cameras do?
    Regardless of what LR does, it would be nice to see what the camera was set to when the shot was taken, at least for reference purposes.
    You see a different number because LR uses a different color temperature model than your camera. There are different color models in use and unfortunately there is no standard, so I guess the ACR developers simply chose the one they liked best. In the Lightroom model a certain color temperature has a different meaning than the same temperature in another model. The calculation between the two is trivial. So when you see value A in Lightroom, it is exactly equivalent to value B in your camera. No analysis of your image happens; it is just a simple transform of your camera's measured temp and tint values into the system used in Lightroom.
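    To make the "simple transform" idea concrete, here is a toy sketch. It assumes, purely for illustration, that the two models differ by a fixed mired (micro-reciprocal degree) shift; the real ACR mapping goes through the camera profile and chromaticity coordinates, so treat this as an analogy, not the actual algorithm:

```python
def kelvin_to_mired(k: float) -> float:
    return 1e6 / k

def mired_to_kelvin(m: float) -> float:
    return 1e6 / m

def translate(k_camera: float, mired_offset: float) -> float:
    """Map a camera-model temperature into the other model by a fixed
    mired shift (the offset would be derived from known matched pairs)."""
    return mired_to_kelvin(kelvin_to_mired(k_camera) + mired_offset)

# The pair reported in this thread (camera 3700K -> LR 3550K) implies a
# shift of about +11.4 mired in this toy model:
offset = kelvin_to_mired(3550) - kelvin_to_mired(3700)
print(round(offset, 1))                 # 11.4
print(round(translate(3700, offset)))   # 3550
```

    The point of working in mired rather than kelvin is that a fixed mired shift corresponds roughly to a fixed perceptual color change, which is why the numeric gap between models grows at higher temperatures.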

  • White Balance color temperature for custom temperature shots

    Hello,
    I would like to understand why LR, and Photoshop too, always show the wrong color temperature "As Shot".
    Let's say I shoot at 6000K, set manually (Canon 40D, RAW).
    Canon itself says it's 6000K in the slide info, fine.
    Both Adobe apps say it's 5750K!
    What's going on there?
    Second question - I know the light was EXACTLY 6000K, but the slide has some green in it. Picking white balance in LR gives me 5900K (which is close to the truth) and +31 Tint, which makes slide neutral.
    Of course I can correct the tint manually before the shot with WB Shift in the camera, so the question is: what is the best way - to find a more or less appropriate WB shift in the camera, or to use WB correction in LR after all?
    Thanks
    UPD. My suspicion is that Adobe doesn't read the custom temperature from the RAW file; it's just guessing! Is that right?

    The white balance value you see in Lightroom is profile specific. It reads your custom white balance and translates it so that neutrals stay neutral, which is exactly what you want. If it didn't do this, your neutrals would shift in color.
    The best way to get a correct white balance is to shoot a white balance (grey) card or use one of those filters on your camera to measure it. You can either set it in camera from the grey card or do it with the dropper in Lightroom and sync over the series. The effect will be the same.
    P.S. unless you shoot a white balance card, you do not know that the color temp of your light is exactly 6000K. That is quite impossible to know. It might be approximately that under certain standardized conditions but you cannot be sure, no matter what you're told by the manufacturer.
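    A minimal sketch of the grey-card idea, assuming simple linear RGB values (real converters apply the gains to raw sensor data before the camera profile is applied, but the principle is the same):

```python
def wb_gains_from_gray(patch):
    """Per-channel multipliers that make the sampled grey patch neutral,
    normalised so green stays at 1.0 (a common raw-converter convention)."""
    r, g, b = patch
    return (g / r, 1.0, g / b)

def apply_gains(pixel, gains):
    """Apply per-channel white balance multipliers to one pixel."""
    return tuple(c * k for c, k in zip(pixel, gains))

patch = (180.0, 200.0, 220.0)  # a slightly blue grey-card sample
gains = wb_gains_from_gray(patch)
balanced = apply_gains(patch, gains)
print(tuple(round(c) for c in balanced))  # (200, 200, 200)
```

    Applying the same gains to every pixel in the series is what syncing the dropper setting in Lightroom effectively does.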

  • Camera Raw Color Temperature Issue

    Help me verify color temperature issue please.
    I shoot with Canon EOS 5D.
    I set white balance mode to K - Manual Kelvin temperature - and set the value to 6000K.
    So my raw files should have this setting - color temperature 6000K.
    Canon ZoomBrowser EX shows me the value - 6000K.
    Nevertheless I see a different reading in the Adobe Camera Raw converter (as shot):
    Photoshop CS2 Camera Raw CT=5600K Tint=+3 (! as Shot !)
    Why?

    Ramon seemed to have the answer in hand, then G Sch above chimed in with some random comment about coordinate systems. Weirdly, Ramon then agreed with G Sch's nonsense and thanked him for it. Suddenly the thread has suffered an ineluctable defenestration.
    Is:
    - camera maker's control for XXXX Kelvin wrong?
    - Adobe's control for XXXX Kelvin wrong?
    - the use of the designator "K" in these contexts wrong, as it implies physics reference for the measure while the camera and ACR just do their own thing?
    By the logic used in this thread, 1/250sec shutter doesn't have anything to do with a time standard, nor does F4 mean an aperture, it's just a coordinate in a locally defined system, la la la. So why bother to even code it in EXIF? What's the point of providing a control in terms of K if K isn't normalized?
    The question was answered in Ramon's first post: ACR doesn't read the 5D white balance metadata. The camera K setting is used for in-camera processing and by Canon utilities. But note that the raw data are white-balance agnostic, while white balance results are subject to a camera profile, which may differ between the OEM and ACR, and at that point there is room for discrepancy in the interpretation of color. Which one is right? I can't say. It's important to realize the results for a given K setting may differ between the OEM and ACR because of this. Contrary to what G Sch writes above, the same K setting ought to give the same results if a "Kelvin" setting is to have real meaning, but that seems to be impracticable if the developers don't agree on the characterization of the gear.

  • FR: color temperature adjustment layer

    I know there are 5 million methods to adjust colors in PS, but I would really prefer a simple (and almost scientific) method like the color temperature adjustment in ACR (with the green-magenta color cast adjustment).
    Ideally, PS would read the current color temperature from the EXIF data (and adjust to any adjustment layers below), and give the option to pick a neutral pixel (and display the color shift based on that). in case there is no original color temp available, the grey pixel picker could be used to establish a base value, and give the user at least relative color shift values (or the user could enter the values by hand from ACR).
    The reason for this is that most (pro) photographers probably know the color temp of their environments (lights, ambient, gels, etc.), or can read those values from ACR, so adjusting values in kelvin would make things much more precise than the ambiguous values of, e.g., the color balance adjustment.

    What I was asking for is a different method to achieve the same results that are possible with levels/curves/color balance etc. without going into Lab mode. (btw, the lack of a grey picker and the zoned approach of color balance isn't ideal sometimes)
    thank you for reminding me that I need to brush up on my color theory. your reply has a tone of "why are you asking this, there are already tools for this" which I find a bit too conservative, when my intention is to make PS even more user friendly... if we went with your paradigm, we wouldn't need levels, because everybody could just use curves, and yet both are very popular and often used based on the user's preferences.
    what I am asking for is bringing the white balance settings from ACR into an adjustment layer, with all its implications.
    it would enable users to specifically balance images with mixed light sources (with the proposed color temp/WB adjustment layer and its mask) by entering values in kelvin, and not some arbitrary numbers from 0-1 or 0-255. it would be so much easier to balance tungsten to daylight by adding 2000 kelvin to the image, or to control a straw gel that was added somewhere in the scene. I can imagine so many applications for this adjustment layer...

  • Adjust color temperature how?

    When shooting a play with tungsten lighting I set the camcorder to custom white balance but forgot to adjust the color temperature and the video now has a reddish cast.
    If it were a still photo, I could open it in Adobe raw/Lightroom and adjust the color temperature slider until the color cast disappears.
    Is there an equivalent function in premiere pro cs4?
    There is a color balance video effect, but it is overkill. I don't need to adjust the color in highlights, midtones, or shadows. All I need is to change the color temperature. Any ideas?
    I'm using premiere pro cs4. I also have photoshop cs4.

    However I also worry this feature applies different correction at different times; whereas I want a constant correction throughout the video, so I can verify it looks ok without having to view the entire video.
    You are correct. It is the same for about every "Auto Effect." They only work well, where nothing has changed, i.e. the lighting, the exposure, etc., so about the only way to really use this is in a very controlled studio, or properly lit location shot. That is not what one usually is working with.
    If Fast Color is not working for you, explore Three-Way Color Corrector - more control.
    There are also 3rd party plug-ins, like Colorista, that many favor over the CC Effects in PrPro: http://www.redgiantsoftware.com/products/all/magic-bullet-colorista-II/
    Good luck,
    Hunt

  • Metadata for color temperature

    I can't find a metadata entry for Kelvin temperature, in IPTC or EXIF. If I can have the timezone why can't I have the color temperature?
    Kelvin temp is in the metadata, because I can see it in EOS Utilities.
    Can I create a custom metadata slot and somehow grab it?
    Thanks.
    Brent

    It's probably one of the proprietary Canon data fields/tags (in Makernotes). The closest standard EXIF field is White Balance, but that's only a preset code, not the actual color temperature.
    That's also why new cameras have to be "reverse engineered" to decode their RAW file format for use in Aperture or other RAW processors.
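    If you just need to see the value, the exiftool utility already decodes the Canon maker notes. A quick sketch (the filename is hypothetical, and the tag name is Canon-specific; other makes use different maker-note tags):

```shell
# Standard EXIF only stores a preset code (Auto, Daylight, Manual, ...):
exiftool -EXIF:WhiteBalance IMG_0001.CR2

# The actual Kelvin value lives in the proprietary maker-note area:
exiftool -Canon:ColorTemperature IMG_0001.CR2
```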

  • Color temperature adjustment

    Hello,
    A really good feature would be a color temperature adjustment for the Adjustment Brush and Graduated Filter, to be able to set a correct white balance through the whole picture. This could be a scenario: a sunset with shadows (from trees) partially falling on some people. Also, to have a White Balance Selector for those two adjustments.
    Thanks.

  • - Lightroom Color Management Hints & Tips -

    Summary
    If you have a profiled monitor and you find that Lightroom 2.1 renders the image
    very differently from the way Photoshop renders it, or that the Library and Slideshow modules render the image
    very differently from the way it is rendered by the Develop module, chances are that this can be solved by re-profiling your monitor and saving the new profile as a matrix-based profile rather than a LUT-based profile.
    The full article
    Read the full article at: http://photo.bragit.com/LightroomColorManagement.shtml, which describes the background, the problem, the solution and the results. There are also some hints on the use of test patterns, choice of gamma, color temperature and luminance.
    I am sure many people may have opinions on these issues, so please run any discussions about the article in this forum.

    To Richard Waters:
    For normal mid-tone images (excluding shadows) viewed at 1:1, there should be no (significant) differences between Development and Library modules (and Photoshop). If you do see significant differences, there is something wrong with the calibration.
    As for Photoshop vs Lightroom: Photoshop is better for printing because it has a proofing system. What one can do is open the image in Photoshop (with Lightroom adjustments), then do the proofing, and perhaps some extra adjustments to compensate for the paper, and then print the result either from Photoshop or from Lightroom. Printing from Lightroom has the advantage that it does the resampling and sharpening automatically.
    The choice of gamma when profiling is not very critical. 2.2 is reasonably okay (and the most common), although the sRGB gamma (if you have the choice) may be more optimal, especially for deep shadows. Color management works so that, in principle, if the bit depth from the graphics card to the monitor were infinite, it would compensate for whatever gamma you choose. Thus, in principle, you could choose any arbitrary gamma, and the image would look and print exactly the same. The only reasons to choose a "suitable" gamma are: (1) the bit depth is limited to 8 bits, which makes it necessary to use a "reasonable" gamma so as to avoid banding and posterization; (2) when viewing images from the internet that are not tagged with a profile, or using a lousy browser that does not understand CM, the choice of gamma is critical since it directly affects the contrast of that image.
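    The 8-bit banding point can be illustrated with a small self-contained count of how many distinct 8-bit codes survive in the deep shadows under different encoding gammas (an illustrative sketch, not a profiling calculation):

```python
def shadow_codes(gamma: float, shadow_cut: float = 0.01,
                 samples: int = 4096) -> int:
    """Count the distinct 8-bit codes available for linear values below
    `shadow_cut` after encoding with the given gamma."""
    codes = set()
    for i in range(samples):
        linear = i / (samples - 1) * shadow_cut
        codes.add(round((linear ** (1.0 / gamma)) * 255))
    return len(codes)

# Gamma 1.0 wastes codes: the darkest 1% of the linear range collapses
# into only a handful of 8-bit values, while gamma 2.2 spreads far more
# codes into the shadows - hence less banding there.
print(shadow_codes(1.0))
print(shadow_codes(2.2))
```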

  • Lightroom color problem...

    I have calibrated my Dell monitor with EyeOne Display 2. The profile created by the EyeOne Display 2 is in the standard location "C:\WINDOWS\system32\spool\drivers\color". This profile has been set as the default profile for the monitor.
    I edit RAW pictures in LR. Once I am satisfied with the colors, I export the pictures to JPEG using the sRGB profile and, separately, the "EyeOneXXX.icc" profile created with the EyeOne Display 2.
    The pictures exported with "EyeOneXXX.icc", when viewed in a color-management-enabled Firefox browser, look identical to what I see in Lightroom. The pictures exported with sRGB have a little less saturation when viewed in the color-management-enabled Firefox browser than they do in Lightroom. What is the reason for this difference?
    How should I usually export, i.e., should I export to sRGB or to the monitor calibration profile, when I need to send the pictures to print at Costco?
    Also, Costco has their printer profiles online. Should we use them to soft proof, or should we use them when exporting to JPEGs in LR?
    Thanks
    Chandra
    Software: LightRoom 2.3
    OS: Windows Vista Home Premium
    Monitor: Dell (Connects HP laptop to this monitor during editing)
    Calibration: EyeOne Display 2
    Camera : Nikon D90
    Lens: Nikon 18-105mm that comes with Nikon D90

    What do you mean by "I should use standard color profile SRGB"? Do you mean that I should use sRGB as the color profile in LR when exporting to JPEG?
    I calibrated my 19 inch Dell monitor connected to my laptop using EyeOne Display 2. The options I used while calibrating are "Native White, Gamma 2.2". My monitor color temperature is set to 5000K. The profile created by calibration is used as the default color management profile for the monitor.
    Also, why is the following problem happening?
    I usually edit RAW photos in LR on a calibrated monitor, as per the info given in the signature. After editing, I exported photos using sRGB and the "EyeOneXXX.icc", i.e. calibrated monitor, profile.
    I printed the pictures at Costco. The photos with sRGB have a reddish cast. The photos exported using the monitor profile are less saturated than the photos I see in Lightroom.
    If I load the photo exported using the monitor profile in CS3 and select "View -> Proof Setup -> Windows RGB", the picture shown in CS3 under these conditions is similar to the prints from Costco, i.e. a lot undersaturated. If I select "View -> Proof Setup -> Monitor RGB", the picture looks like it does in LR.
    If I load the photo exported using the sRGB profile in CS3 and select "View -> Proof Setup -> Windows RGB", the photo looks like it does in LR. If I select "View -> Proof Setup -> Monitor RGB", the picture looks like the prints from Costco, i.e. a lot oversaturated.
    In CS3 on the monitor the photos under the following two conditions are identical:-
    1. Photo exported using monitor profile and the proof setup option in CS3 is "View -> Proof Setup -> Monitor RGB"
    2. Photo exported using SRGB profile and the proof setup option in CS3 is "View -> Proof Setup -> Windows RGB"
    I even soft proofed with the Costco printer profiles. The corresponding photos viewed under the Costco print profile are almost identical to the photos seen under options 1 and 2 described above.
    I basically want the photos to look like what I see in LR. From the above information, what could be the problem I am having, i.e. why do my printed photos not look like they do in LR?
    Thanks
    Chandra

  • How can I correct for 2 different color temperatures in one image?

    Hi,
    In indoor architectural photography one may encounter mixed color temperatures in one image. If one could use gels and lamps to replace incoming natural daylight and force a uniform color temperature (for instance tungsten), the problem would go away. Yet sometimes this is not possible or practical, and I was exploring what could be accomplished within Photoshop. Here is my proposal (or the beginning of it):
    shoot RAW
    generate 2 versions of the image, each corrected for one of the color temperatures present (say WB dropper in "should be neutral" area where daylight dominates and another WB dropper in "should be neutral" area where tungsten dominates)
    open them in Photoshop and move one of them as a layer on top of the other version
    Try to blend the two layers
    in one example, I used Hue blending mode and the photo fixed itself!! However this is not generalizable to other images
    I also tried BlendIf for harder images but with no success
    I presume a selection is then a must, which brings up the question: HOW CAN I SELECT DEPENDING ON COLOR TEMPERATURE? I mean, the layer where WB = daylight looks orange-casted in the areas where the main light source is tungsten, while the layer where WB = tungsten looks blue-casted in the areas lit mainly by sunlight. I must believe there should exist a best approach to select, say, the blue-casted areas or the orange-casted areas, and that approach should be simple if it somehow takes into consideration the predominant cast. I have tried Select Color Range, but not very successfully. I just feel there must be a simpler way, taking advantage of the casts that are present.
    Masks also seem useful.
    All feedback will be greatly appreciated!
    Thanks,
    Juan
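    One way to sketch the "select by cast" idea above in code is a soft mask rather than a hard selection. This is pure toy Python on single pixels; the r-minus-b "warmth" measure is an assumption standing in for a proper chromaticity-based estimate, and real code would operate on whole image arrays:

```python
def warmth(pixel):
    """Crude cast estimate for a pixel (r, g, b) in 0..1: orange-cast
    (tungsten-lit) pixels score high, neutral or blue-cast ones score 0."""
    r, g, b = pixel
    return min(max(r - b, 0.0), 1.0)

def blend(daylight_px, tungsten_px):
    """Blend the two white-balanced renderings, weighting by how
    tungsten-looking the pixel is in the daylight-balanced layer."""
    w = warmth(daylight_px)
    return tuple(d * (1.0 - w) + t * w
                 for d, t in zip(daylight_px, tungsten_px))

# A tungsten-lit pixel looks orange in the daylight rendering, so it
# leans toward the tungsten-balanced version:
print(blend((0.9, 0.6, 0.3), (0.7, 0.7, 0.7)))
# A neutral daylight-lit pixel keeps the daylight rendering unchanged:
print(blend((0.5, 0.5, 0.5), (0.3, 0.4, 0.8)))
```

    Because the mask is continuous, the transition zones where the two light sources mix get a gradual blend instead of a hard selection edge.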

  • Color Temperature changes when playing video via Dell 2713H with iMac

    Hi people,
    I'm having major issues with the above. This happens specifically when video is played, be it via YouTube, through VLC, or even XBMC. The color temperature of the video just shifts to a cooler blue and then after a while goes back to normal. This does not happen on the monitor of the iMac itself. Otherwise, viewing still photos or browsing the web works okay. It is really very irritating, as you don't expect this to be happening.
    I've read that this could be an issue affecting MacBooks with the ability to switch between integrated and dedicated video cards, but I can't seem to find anything suggesting a fix for iMacs. Just so you guys know, it occurs on my MacBook Pro as well.
    I'm running Mountain Lion 10.8.2 on the iMac and connecting the Dell U2713H via a DisplayPort to Mini DisplayPort cable supplied by Dell. The iMac is a late 2009 model.
    If anyone could offer any insights, i'll really appreciate it.
    Thanks
    J

    I found this in another thread and it fixed the same problem for me. Apparently it's a Dell "feature": it detects video and changes the color settings for you. On the monitor, just disable Display Settings > Smart Video Enhance.
    https://discussions.apple.com/message/25116072#25116072

  • Camera Raw 5.5 VS Adobe Lightroom (color correction)

    Hi, does somebody know whether, setting aside Photoshop's advantage of layers, Lightroom's color correction controls are superior to the Camera Raw interface's correction controls in PS? I mean, for color correction purposes, is Lightroom better than PS's Camera Raw interface? Because to me both sets of controls seem to be pretty much the same thing. Does anyone know something about it?
    Thank you in advance !

    RicardoAngelo wrote:
    I mean, for color correction purposes, is Lightroom better than PS's Camera Raw interface? Because to me both sets of controls seem to be pretty much the same thing. Does anyone know something about it?
    Both Camera Raw (most recent version) and Lightroom (most recent version) share the EXACT same processing pipeline...however, there are subtle differences in usability. For example, ACR 5.6 has a Point Curves Editor...Lightroom doesn't. Lightroom has a powerful capability to control output size and resolution which Camera Raw doesn't have...
    Bottom line: they are two horses of a different color, but each is capable of performing the same tricks... so use whichever app allows you to accomplish what you need to do in the shortest and easiest process...

  • iPhone 3Gs display color temperature is warmer, or yellow-tinted

    All greetings!
    My new iPhone 3Gs has a display with a much warmer, or yellow-tinted, color temperature compared to my iPhone 3G.
    Is it a defect of my iPhone 3Gs?
    Here you can see it: http://farm3.static.flickr.com/2415/2660565874_ef8a841a20.jpg

    I recently bought a 3Gs, and compared to my old 3G there's a huge difference! My 3Gs looks really yellow! I went to the Apple Store at the Florida Mall to verify whether this was normal, and brought my old 3G with me so they could see the difference. I made the appointment, and when the so-called "Apple Genius" saw my phone he noticed the difference and took it to the back of the store to change it. He came back in 3 minutes claiming that he had changed my phone. When I looked at it: same yellow display. I told the guy that the display looked the same, but he never gave me any other option. On my way out of the store I checked the info on the phone, but it was the same IMEI as the one on the box. How could this be, if they supposedly changed the phone? I asked to speak to a manager and explained my situation. The "genius" who "helped me" told the manager that all he did was change the display, and since I didn't like it, he went back and put the display that came with the phone back in... At this point I got really upset, and I asked the manager to bring the "genius" so I could ask why he lied about the situation. The manager said she would bring the guy so we could settle this. They left me waiting in the store for 45 minutes and neither of them showed up... Great Apple Store service... Yeah right!! F#$%^ them!

  • Iphone 5S warmer color temperature (yellow hue)

    Hi guys,
    Yes, another yellowgate thread, I am afraid...
    I've compared my iPhone 5 with the iPhone 5S. Both devices set to full brightness, both running iOS 7... and it's quite clear that my iPhone 5S has a much warmer color temperature when compared with the iPhone 5. The iPhone 5 renders white as brilliant white, whereas the iPhone 5S seems to have a yellow hue to it... whites look yellow / washed out. This is even more noticeable due to iOS 7, which is primarily an OS with a white background / theme throughout.
    Question being: is this "the norm" for the iPhone 5S, or a manufacturing defect? Has anyone else who compared the 5 to the 5S side by side seen the same issue?
    It's either a defect, or Apple has changed the default color temperature for the iPhone 5S... anyone got anything to back this up? I thought these devices had exactly the same LCD panel in them?
    I dunno if I can be bothered with the whole returns process if all the others out there are the same...
    Side by side - all on full brightness...
    from left to right
    iphone 4s
    iphone 5
    iphone 5S

    Hi James!
    I have another problem.
    For the last two days, the right side of my screen (iPhone 5s gold 64GB) has turned yellow under normal use.
    If I touch the aluminium case I feel that the temperature has increased a lot, and at the same time I see a stain on the right side of my screen.
    My iPhone is out of warranty.
    Maybe your problem is similar to mine?
