Setting my monitor's color temperature

Simple question... How would I set the color temperature of my display from 9300K to 6500K, absent the usual hardware menu? Are there any software utilities that let me set the color temperature?
Edit: my graphics hardware is i945GM, not anything from nVidia, for what it's worth.
(And apparently gamma corrections don't have any bearing on color temperature. Oy veh.)
Last edited by Gullible Jones (2007-07-06 21:18:52)
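An aside on the gamma point: a single gamma exponent indeed doesn't move the white point, but scaling the red, green and blue channels separately does, and that is essentially what later software-only tools (redshift, sct and friends) do through the video LUT. Below is a minimal, hedged sketch of the idea in Python: the Kelvin-to-RGB numbers come from a widely used blackbody curve fit, and feeding the resulting gains to xgamma or xrandr --gamma is only a rough approximation of a real white-point correction, not a calibrated one.

    # kelvin_gains.py - rough per-channel gains for a target colour temperature.
    # Uses a common blackbody curve fit (roughly valid 1000K-40000K).  The gains
    # can be handed to "xgamma -rgamma R -ggamma G -bgamma B" or
    # "xrandr --output <name> --gamma R:G:B" as a crude white-point shift.
    import math
    import sys

    def kelvin_to_rgb(kelvin):
        """Approximate RGB (0-255 per channel) of a blackbody at `kelvin`."""
        t = kelvin / 100.0
        if t <= 66:
            r = 255.0
            g = 99.4708025861 * math.log(t) - 161.1195681661
        else:
            r = 329.698727446 * (t - 60) ** -0.1332047592
            g = 288.1221695283 * (t - 60) ** -0.0755148492
        if t >= 66:
            b = 255.0
        elif t <= 19:
            b = 0.0
        else:
            b = 138.5177312231 * math.log(t - 10) - 305.0447927307
        return tuple(max(0.0, min(255.0, v)) for v in (r, g, b))

    if __name__ == "__main__":
        target = float(sys.argv[1]) if len(sys.argv) > 1 else 6500.0
        r, g, b = kelvin_to_rgb(target)
        m = max(r, g, b)                      # only ever dim channels, never boost
        gains = (r / m, g / m, b / m)
        print("per-channel gains for %.0fK: %.3f : %.3f : %.3f" % ((target,) + gains))
        print("e.g.  xgamma -rgamma %.3f -ggamma %.3f -bgamma %.3f" % gains)

On an i945GM-era X11 setup, xgamma is the tool most likely to be present; newer setups can write the LUT directly through XRandR, which is what the dedicated tools do.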

Hmm. Found something interesting here:
3) Accessing the display's DDC/CI channel, to adjust the screen
    controls (brightness, contrast, color temperature etc.) automatically.
    This seems to be a lost cause at the moment under X11. While the DDC
    channel is often used in starting the X11 server to read the EDID,
    it is not made available to applications that need/want to adjust
    display settings. There is a project that goes in this direction
    somewhat (ddccontrol), but it is not properly integrated
    into X11 - the application drawing the test colors should be
    able to access the same display.screen's DDC/CI channel, not
    rely on the user to make the association, and not have to incorporate
    all sorts of device dependent access code (that's what an X11 display
    driver is meant to be shielding the application from!)
    This could be a low level function that allows the application to
    read and write DDC messages, or it could be a higher level
    function that allows "brightness", "contrast" etc. to be
    read and set (encapsulating any display differences).
Hmm, EDID... completely forgot about the EDID settings. I'll try messing around with that and the DDC stuff, I think. Mayhap there's something going wrong with settings autodetection or whatever.
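If you do go digging in the EDID, the base block stores the panel's default white-point chromaticity, which can be turned into an approximate correlated color temperature. Here is a small sketch, assuming a Linux kernel recent enough to expose the raw EDID under /sys/class/drm (older setups will need another way to dump the EDID); the conversion uses McCamy's approximation, so treat the Kelvin figure as a ballpark.

    # edid_whitepoint.py - read each connected display's default white point from
    # its EDID and estimate the correlated colour temperature (CCT).
    # Assumes the kernel exposes the raw EDID at /sys/class/drm/<connector>/edid.
    import glob

    def white_point(edid):
        """Decode the 10-bit CIE xy white point stored in EDID bytes 26, 33, 34."""
        lo = edid[26]                                   # low bits: Bx By Wx Wy
        wx = ((edid[33] << 2) | ((lo >> 2) & 0x03)) / 1024.0
        wy = ((edid[34] << 2) | (lo & 0x03)) / 1024.0
        return wx, wy

    def mccamy_cct(x, y):
        """McCamy's cubic approximation of CCT from CIE 1931 xy chromaticity."""
        n = (x - 0.3320) / (0.1858 - y)
        return 449.0 * n ** 3 + 3525.0 * n ** 2 + 6823.3 * n + 5520.33

    for path in glob.glob("/sys/class/drm/*/edid"):
        with open(path, "rb") as f:
            edid = f.read(128)
        if len(edid) < 128:                 # connector present, nothing attached
            continue
        x, y = white_point(edid)
        print("%-40s white x=%.3f y=%.3f  ~%.0fK" % (path, x, y, mccamy_cct(x, y)))

Keep in mind this is the white point the panel advertises for its default mode, not necessarily what the 9300K or 6500K OSD preset is actually producing.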

Similar Messages

  • Color temperature setting on camera and in Lightroom do not match

    I found that Lightroom shows a different color temperature value than was set on the camera body when the photograph was taken.
    For example, I set the color temperature to 3700K for the given scene, according to what the external color meter was showing (Sekonic Color Meter C-500 Prodigi Color). It was 3700K, no color correction.
    When I imported this raw file into Lightroom, the color temperature reading was 3550 and -10 toward green.
    When I import the same file into Capture NX2, the temperature reading in the software is the same as was set on the camera: 3700, 0, 0. Why the difference? Do I have some hidden setting somewhere in Lightroom that adjusts the color temperature during import?

    >Hi Lee Jay, so if I understand you correctly, I should ignore the number. But then how does LR know what the neutrals are? Is it calculating something like an average tonality and then setting the white balance so that this average is neutral gray, like some simple cameras do?
    Regardless of what LR does, it would be nice to see what the camera was set to when the shot was taken, at least for reference purposes.
    You see a different number because LR uses a different color temperature model than your camera. There are different color models in use and there is no standard unfortunately so I guess that the ACR developers simply chose one that they liked best. In the Lightroom model a certain color temperature has a different meaning than the same temperature in another model. The calculation between the two is trivial. So when you see value A in Lightroom it is exactly equivalent to value B in your camera. No analysis of your image happens, it is just doing a simple transform of your camera's measured temp and tint values into the system used in Lightroom.
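    One way to put that 3700K-versus-3550K gap in perspective is the mired scale (one million divided by the Kelvin value), on which white-balance shifts behave roughly uniformly. A quick back-of-the-envelope check in Python, using the numbers from the post above:

        # Compare the two readings on the mired scale (1e6 / Kelvin), which is the
        # scale on which white-balance shifts are roughly perceptually uniform.
        def mired(kelvin):
            return 1e6 / kelvin

        camera_k, lightroom_k = 3700.0, 3550.0      # values from the post above
        print("camera:    %.0fK = %.1f mired" % (camera_k, mired(camera_k)))
        print("Lightroom: %.0fK = %.1f mired" % (lightroom_k, mired(lightroom_k)))
        print("difference: %.1f mired" % (mired(lightroom_k) - mired(camera_k)))

    About 11 mired is a small shift by photographic standards, consistent with the two programs describing the same white point through slightly different models.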

  • Monitor color temperature is switching after launching a CC product

    Hello,
    I'm running OS X 10.8.5 on a MacBook Pro.
    I didn't change anything and have never played with color profiles on my computer.
    Since this morning, when I launch Photoshop CC or Illustrator CC, the color temperature automatically switches to a cold white while the splash screen is showing.
    Anyone have an idea about this issue?
    Regards.

    I got an error in the ICC profile.

  • Strange color temperature when using iSight

    The image generated by my iSight always tends to have a "yellow" tone to it. Can the color temperature be adjusted somewhere? I compared images obtained under the same lighting by colleagues and the color temp with their Macbooks is more blue/white. It is frustrating to always look jaundiced!

    MacPatrik wrote:
    The image generated by my iSight always tends to have a "yellow" tone to it. Can the color temperature be adjusted somewhere? I compared images obtained under the same lighting by colleagues and the color temp with their Macbooks is more blue/white. It is frustrating to always look jaundiced!
    iSight color is set automatically by the app that runs it and is affected by room lighting and even reflections from your monitor, room decor, floor, and wall coverings.
    I can suggest two ways to change the color of your images:
    (1) Use only one color of light in the room, and adjust it by changing lamps if necessary. See Apple's article at http://support.apple.com/kb/HT3097
    This single room light color includes reflections from your monitor. I change my System Preferences > Desktop & Screen Saver > Desktop to use one of the Solid Gray colors rather than the standard screen color that is installed with Mac OS. The solid, neutral nature of gray greatly reduces the yellow color shift that is automatically applied to correct for a blue or purple desktop. The difference is increasingly notable as room light is reduced.
    Tungsten light is very orange-yellow. If your only light source is tungsten and you are still seeing an image that is too yellow, try changing the lamp bulb to one of the daylight-colored fluorescent replacement lamps. After testing four different brands and colors, I found one that works in my room.
    (2) You can take manual control if you add iGlasses2: http://www.ecamm.com/mac/iglasses/
    You can try before you buy to see if it does what you want.
    (10.6 users note that iGlasses is NOT yet compatible.)
    EZ Jim
    Mac Pro Quad Core (Early 2009) 2.93Ghz w/Mac OS X (10.6.1)  MacBook Pro (13 inch, Mid 2009) 2.26GHz (10.6.1)
    G5DP1.8GHz (10.5.8) G4 PowerBook 1.67GHz (10.4.11) iBookSE 366MHz (10.3.9) LED Cinema Display External iSight

  • How can I correct for 2 different color temperatures in one image?

    Hi,
    In indoor architectural photography one may encounter mixed color temperatures in one image. If one could use gels and lamps to replace incoming natural daylight and force a uniform color temperature (for instance tungsten), then the problem would go away. Yet sometimes this is not possible or practical, and I was exploring what could be accomplished within Photoshop. Here is my proposal (or the beginning of one):
    shoot RAW
    generate 2 versions of the image, each corrected for one of the color temperatures present (say WB dropper in "should be neutral" area where daylight dominates and another WB dropper in "should be neutral" area where tungsten dominates)
    open them in Photoshop and move one of them as a layer on top of the other version
    Try to blend the two layers
    In one example I used the Hue blending mode and the photo fixed itself!! However, this is not generalizable to other images.
    I also tried Blend If for harder images, but with no success.
    I presume a selection is then a must, which brings up the question: HOW CAN I SELECT DEPENDING ON COLOR TEMPERATURE? I mean, the layer where WB = daylight looks orange-cast in the areas where the main light source is tungsten, while the layer where WB = tungsten looks blue-cast in the areas lit mainly by sunlight. I have to believe there is a good approach to select, say, the blue-cast areas or the orange-cast areas, and that approach should be simple if it somehow takes the predominant cast into consideration. I have tried Select > Color Range, but not very successfully. I just feel there must be a simpler way, taking advantage of the existing casts (see the sketch after this post).
    Masks also seem useful.
    All feedback will be greatly appreciated!
    Thanks,
    Juan
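    A sketch of that selection idea outside Photoshop, just to show the math: build the blend mask from a warm-versus-cool measure of one rendering (here simply red minus blue; the a*/b* channels of Lab would work too) and use it to mix the two white-balanced versions. File names and percentile thresholds below are placeholders, not a recipe:

        # Blend two renderings of the same raw file - one white-balanced for daylight,
        # one for tungsten - using a mask derived from how warm each pixel is.
        # File names are placeholders; assumes 8-bit RGB exports of the two versions.
        import numpy as np
        from PIL import Image

        daylight = np.asarray(Image.open("wb_daylight.png").convert("RGB"), dtype=np.float32) / 255.0
        tungsten = np.asarray(Image.open("wb_tungsten.png").convert("RGB"), dtype=np.float32) / 255.0

        # Warmth measure on the daylight-balanced layer: tungsten-lit areas read
        # strongly orange there, so red minus blue is high exactly where we want
        # the tungsten-balanced layer to win.
        warmth = daylight[..., 0] - daylight[..., 2]

        # Stretch to 0..1; in practice you would also blur this mask to soften seams.
        lo, hi = np.percentile(warmth, [5, 95])
        mask = np.clip((warmth - lo) / (hi - lo + 1e-6), 0.0, 1.0)[..., None]

        blended = mask * tungsten + (1.0 - mask) * daylight
        Image.fromarray((np.clip(blended, 0.0, 1.0) * 255).astype(np.uint8)).save("blended.png")

    Inside Photoshop, one equivalent move is to build the layer mask from a channel difference (for example Apply Image, subtracting the blue channel from the red) instead of hand-painting the selection.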


  • Color Temperature changes when playing video via Dell 2713H with iMac

    Hi people,
    I'm having major issues with the above. This happens specifically when video is played, whether via YouTube, VLC, or even XBMC. The color temperature of the video just shifts to a cooler blue and then, after a while, goes back to normal. This does not happen on the iMac's built-in display. Otherwise, viewing still photos or browsing the web works fine. It is really very irritating, as you don't expect this to be happening.
    I've read that this could be an issue affecting MacBooks with the ability to switch between integrated and dedicated video cards, but I can't seem to find anything suggesting a fix for iMacs. Just so you guys know, it occurs on my MacBook Pro as well.
    I'm running Mountain Lion 10.8.2 on the iMac and connecting the Dell U2713H via a DisplayPort to Mini DisplayPort cable supplied by Dell. The iMac is a late 2009 model.
    If anyone could offer any insights, I'll really appreciate it.
    Thanks
    J

    I found this in another thread and it fixed the same problem for me. Apparently it's a Dell "feature": it detects video and changes the color settings for you. On the monitor, just disable Display Settings > Smart Video Enhance.
    https://discussions.apple.com/message/25116072#25116072

  • How do I set external monitor to work with my G4 iMac?

    I just bought a 22" Proview LCD monitor to use with my iMac because its built-in monitor is pretty much gone. It did not work for six months, then came on for one day and went out again.
    Long story short, since I've hooked up the new monitor, when I print from iPhoto the pictures come out too dark and the color is not what it used to be.
    How do I fix it so my monitor's calibration matches the original? Or whatever I need to do to fix it.
    I print a lot of pictures and I need this to work correctly.
    Thanks.

    Snoot, welcome. Your new monitor just needs to be calibrated. From the Finder, under the Apple menu, scroll down to System Preferences, click the Desktop & Screen Saver icon, and set the desktop picture to a medium gray. You can change this back when you are done. Now go back by clicking Show All and click the Displays icon, then choose Display in the upper tabs and set your preferred resolution. Set your monitor brightness to about medium (if there is a control for it) and under Colors choose Millions. Now choose Color from the upper tab and select Calibrate. Choose Expert Mode at the introduction and follow the on-screen instructions. After making all of the hue, value, and saturation adjustments you must save the settings back to the display profile.
    Also, some applications have monitor selection in their preferences, so just make sure that you choose your current display profile for that particular application.
    Joe
    Mac Pro 2.66 Ghz   Mac OS X (10.4.10)   Users (RAID 0), PM G4 (10.3.9, PM 6500 (10.2.8)

  • DUAL MONITOR BUG: One Picture – One Monitor – Different Colors!

    There have been several threads discussing that Photoshop CS4 and OS X 10.6 show wrong colors in dual-monitor setups.
    The first one to prove this with a screen capture was jb510-LJ0JJQ in August 2009: http://forums.adobe.com/message/2207312#2207312
    Eight months and three OS updates later the problem is still there and makes professional work with more than one monitor a nuisance.
    Here's another actual example: A picture and its 100% identical copy showing different colors on one screen!
    Here's how to repeat it – try this at home ;-)
    1) In the Finder, make a copy of an RGB picture.
    2) Open one of the two identical pics in photoshop.
    3) Drag it to the other monitor.
    When releasing the mouse button you may see the bug already: the colors will "switch".
    4) To verify this, go to System Preferences > Displays and change the primary monitor (by dragging the menu bar to the other one)
    5) Now open the identical copy you made in step 1) and see the difference!
    Here's a synopsis of suggested workarounds:
    1) Disable OpenGL (which seems to make no difference on my setup).
    2) Click "desaturate monitor colors" but set desaturation to 0 (this prevents the visible switching in step 3, but only if OpenGL is ON, and it also leads to a wrong display in steps 4 and 5).
    3) Make the hardware-calibrated monitor your default monitor and strictly avoid changes or dragging pictures to the other monitor (which seems to be the only reliable method).
    4) Some guys even force both monitors to use the profile of their hardware-calibrated monitor (using ColorSync) to make sure they see what they should see on at least that one monitor. What a stone-age workaround for the world's leading imaging program!
    Considering that reliable colors in professional (= multi-monitor) settings should be the most basic function of Photoshop, this major malfunction should be solved in CS5! Otherwise we might have to live with it for some more years...
    Have Adobe and Apple acknowledged this bug, and is it on their agenda? Will it be gone with OS X 10.6.3? Can anyone explain the reasons for this bad bug? Is there an "officially" recommended workaround?
    Thanks, regards,
    Eric

    There have been several threads dealing with this malfunction. But most faded out in silence or acceptance of the workarounds.
    http://forums.adobe.com/message/2207312#2207312
    http://forums.adobe.com/thread/521483
    http://forums.adobe.com/message/2272192#2272192
    http://forums.adobe.com/thread/311761?start=0&tstart=0
    http://forums.adobe.com/thread/618927?tstart=60
    http://forums.adobe.com/thread/603992?tstart=90
    http://www.wetcanvas.com/forums/showthread.php?t=552977
    http://www.hilfdirselbst.ch/gforum/gforum.cgi?post=368889?search_string=zwei%20monitor#368889
    But man, this is no minor bug! Consistent colors are by far the most important basic function of Photoshop. If you cannot rely on the colors in Photoshop, you might as well use Preview and Word for print production ;-)
    Carson, how did Apple react to your complaint? What do you mean by "they don't get it"? Do they not see it? Or can't they get it right?
    Shall we open a thread on the Apple forums to point out that it's not the problem of a single user?

  • White Balance color temperature for custom temperature shots

    Hello,
    I would like to understand why LR, and Photoshop too, always show the wrong color temperature "As Shot".
    Let's say I shoot at 6000K, set manually (Canon 40D, RAW).
    Canon itself says it's 6000K in the slide info, fine.
    Both Adobe apps say it's 5750K!
    What's going on there?
    Second question: I know the light was EXACTLY 6000K, but the slide has some green in it. Picking the white balance in LR gives me 5900K (which is close to the truth) and +31 Tint, which makes the slide neutral.
    Of course I can correct the tint before the shot with WB Shift in the camera, so the question is: what is the best way - to find a more or less appropriate WB shift in the camera, or to use WB correction in LR after all?
    Thanks
    UPD: My suspicion is that Adobe doesn't read the custom temperature from the RAW file and is just estimating it. Is that right?

    The white balance value you see in Lightroom is profile-specific. It reads your custom white balance and translates it so that neutrals stay neutral, which is exactly what you want. If it didn't do this, your neutrals would shift in color.
    The best way to get a correct white balance is to shoot a white balance (grey) card or use one of those filters on your camera to measure it. You can either set it in camera from the grey card or do it with the dropper in Lightroom and sync over the series. The effect will be the same.
    P.S. unless you shoot a white balance card, you do not know that the color temp of your light is exactly 6000K. That is quite impossible to know. It might be approximately that under certain standardized conditions but you cannot be sure, no matter what you're told by the manufacturer.
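    For what it's worth, the gray-card dropper is doing nothing more exotic than scaling two of the channels so the card reads equal in all three, which is why it lands on the same neutral no matter what Kelvin number the slider then reports. A tiny sketch with made-up sample values:

        # White balance from a gray-card sample: scale channels so the card reads
        # neutral.  The triple below is a made-up reading from the unbalanced image.
        card_r, card_g, card_b = 182.0, 160.0, 121.0     # warm, tungsten-ish cast

        # Gains that neutralise the card, keeping green as the reference channel.
        gain_r, gain_g, gain_b = card_g / card_r, 1.0, card_g / card_b

        def balance(r, g, b):
            return r * gain_r, g * gain_g, b * gain_b

        print("gains: %.3f %.3f %.3f" % (gain_r, gain_g, gain_b))
        print("card after balancing:", balance(card_r, card_g, card_b))   # equal channels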

  • How do you set up a custom color for the standard screen mode

    This new colour scheme does not have enough contrast to be readable, so I was pleased to find the custom screen mode. However, there seems to be no way of setting up a custom color for this. In addition, the icons are so small they are unreadable. Is there any way of making them bigger? I have a 20" and a 22" monitor working together, but they are still not clear enough to use.

    Hi there
    Are you referring to the tool icons in Photoshop CS6? There are a few options you can change to help make the interface more readable, but unfortunately, there is not an option to enlarge the tool icons.
    It sounds like you may have discovered this already, but if you go to Photoshop > Preferences > Interface, you can change the color of the Photoshop CS6 interface.
    You can also enlarge the UI text size in this same window, in the Text preferences box.
    You can also make the size of your layers thumbnails larger. You can do this by clicking on the small menu in the upper right hand corner of the Layers Panel and selecting Panel Options.
    Here you can change the size of the layers thumbnails.
    I hope this helps you out a bit!

  • Calibrate 2nd monitor for coloring

    I have a 23-inch older Apple Cinema Display and I've calibrated it with a Spyder2express, but it does not seem to always be consistent. I don't need a professional setup; I just want a roughly calibrated, consistent monitor that I can trust to stay the same. This project will mostly be delivered on Blu-ray and DVD. I was wondering if maybe, since the Apple monitor is older, it has issues with consistency, after it warms up or anything. Would I be better off with an LCD TV as a reference monitor that I calibrate with the Spyder, since that's where the final project will be seen? Or a newer HD monitor? I know some of the non-Apple ones are getting cheaper.
    Any input? Thanks

    A search of this forum will reveal a large number of threads, the gist of which reduce to:
    The Spyder2Express is a calibration tool for photographers and consequently not appropriate for video work, and the Apple Cinema Display is one of the worst possible choices for grading digital media. Older ones even more so, but you already know that; this is pretty much universally true of all computer graphics displays.
    COLOR is tailored for broadcast use with CCIR709 guidelines as its preset, and unchangeable, output standard for Quicktime workflows. Unless you are working under Snow Leopard you will experience gamma consistency problems between COLOR and all the other Final Cut Studio applications. There are innumerable threads covering that issue.
    You will likely not be able to calibrate a regular LCD monitor with the Spyder since it cannot operate at the low levels generated at broadcast standard biases. In other words it will not be able to do a "black balance", and it is unlikely that you will be able to set your monitor to the recommended brightness used by broadcasters, so a "white balance" is probably out of reach as well. Nearly all consumer "TVs" employ "dynamic contrast" to sweeten distributed media, so they are disqualified for professional use.
    jPo

  • Iphone 5S warmer color temperature (yellow hue)

    Hi guys,
    Yes another yellowgate thread I am afraid...
    I've compared my iPhone 5 with the iPhone 5S. Both devices set to full brightness, both running iOS 7, and it's quite clear that my iPhone 5S has a much warmer color temperature than the iPhone 5. The iPhone 5 produces white as a brilliant white, whereas the iPhone 5S seems to have a yellow hue to it; whites look yellow / washed out. This is even more noticeable with iOS 7, which is primarily an OS with a white background / theme throughout.
    The question being: is this "the norm" for the iPhone 5S, or a manufacturing defect? Has anyone else compared the 5 to the 5S side by side and seen the same issue?
    It's either a defect or Apple has changed the default color temperature for the iPhone 5S... anyone got anything to back this up? I thought these devices had exactly the same LCD panel in them?
    I don't know if I can be bothered with the whole returns process if all the others out there are the same...
    Side by side - all on full brightness...
    from left to right
    iphone 4s
    iphone 5
    iphone 5S

    Hi James!
    I have another problem.
    For the past two days, the right side of my screen (iPhone 5S gold 64 GB) has been turning yellow under normal use.
    If I touch the aluminium case I can feel that the temperature increases a lot, and at the same time I see a stain on the right side of my screen.
    My iPhone is out of warranty.
    Maybe your problem is similar to mine?

  • Camera Raw Color Temperature Issue

    Help me verify color temperature issue please.
    I shoot with Canon EOS 5D.
    I set the white balance mode to K - manual Kelvin temperature - and set the value to 6000K.
    So my raw files should have this setting - color temperature 6000K.
    Canon ZoomBrowser EX shows me that value - 6000K.
    Nevertheless, I see a different reading in the Adobe Camera Raw converter (As Shot).
    Photoshop CS2 Camera Raw CT=5600K Tint=+3 (! as Shot !)
    Why?

    Ramon seemed to have the answer in hand, then G Sch above chimed in with some random comment about coordinate systems. Weirdly, Ramon then agreed with G Sch's nonsense and thanked him for it. Suddenly the thread has suffered an ineluctable defenestration.
    Is:
    - camera maker's control for XXXX Kelvin wrong?
    - Adobe's control for XXXX Kelvin wrong?
    - the use of the designator "K" in these contexts wrong, as it implies physics reference for the measure while the camera and ACR just do their own thing?
    By the logic used in this thread, 1/250sec shutter doesn't have anything to do with a time standard, nor does F4 mean an aperture, it's just a coordinate in a locally defined system, la la la. So why bother to even code it in EXIF? What's the point of providing a control in terms of K if K isn't normalized?
    The question was answered in Ramon's first post: ACR doesn't read the 5D's white balance metadata. The camera's K setting is used for in-camera processing and by Canon utilities. But note that the raw data are white-balance agnostic, while white-balance results are subject to a camera profile that may differ between the OEM software and ACR, and that is where there is room for discrepancy in the interpretation of color. Which one is right? I can't say. It's important to realize that the results for a given K setting may differ between the OEM software and ACR because of this. Contrary to what G Sch writes above, the same K setting ought to give the same results if a "Kelvin" setting is to have real meaning, but that seems to be impracticable if the developers don't agree on the characterization of the gear.

  • Lightroom, Color Temperature, Kelvin

    Is there any way to have Lightroom show the color temperature in Kelvin instead of what it is using? Or does someone have a chart that equates the LR numbers to degrees Kelvin?
    I wish Adobe would stick with standards for such things. It is bad enough that when you use the white balance eyedropper, the grid enlargement you see when it is over the image shows the RGB values as percentages instead of what Photoshop shows you. It is very hard to get close to 128 128 128 when you're using a percentage. Again, Adobe started the use of certain methods in Photoshop, so I don't know why they can't be consistent.
    Robert
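    On the percentage readout specifically: treating the percentage as a plain fraction of the 0-255 range, the conversion is trivial (this says nothing about color management, only about the arithmetic of the readout):

        # Convert between a percentage readout and Photoshop-style 0-255 values.
        def pct_to_8bit(p):
            return p / 100.0 * 255.0

        def to_pct(v):
            return v / 255.0 * 100.0

        print("128 128 128 is about %.1f%% per channel" % to_pct(128))      # ~50.2%
        print("50%% per channel is %.1f in 8-bit terms" % pct_to_8bit(50))  # 127.5

    So aiming for roughly 50% on all three channels is the same exercise as aiming for 128 128 128.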

    Jim,
    My understanding is that an absolute K whitepoint does not strictly make any sense for relative-colour JPEGs. This has been hashed out, ad nauseam, on other threads.
    That is, the controls provided are intended to be /correct/, and not just convenient.
    I don't have a dog in this fight, but I prefer correct interfaces myself.
    I'm actually having trouble coming up with a case where an absolute Kelvin setting is of much use, anyway. I suppose if you know that 5300K was likely the temperature of the colour on a particular shoot it might be useful. But, if you already know that why not set the camera to the same value? If one needed absolute K (say, for legal or scientific reasons) I suspect one might be shooting raw, and using a pre-set whitepoint.
    At any rate, my advice is to post this as a feature request, which (I understand) will probably be interpreted to mean, "provide access to absolute K values for whitepoint for file formats that can only represent relative temperature, quantizing and approximating in some cases to get as close as we can, within some arbitrary limits."
    I'd have to go look at the other thread and do some more research, but this, I think, is the crux of the matter. Misunderstandings and errors on this matter are, of course, all mine.

  • Adjusting the color temperature of footage from a D7?

    Someone is going to give me footage shot with a Canon D7 and he told me he didn't set the White Balance, but instead changed the color temperature. I have next to zero experience with DSLR video. What does that mean in terms of fixing the WB in FCP?
    Thanks.

    Fixing the WB will depend on how much it is off. If you push the correction too far you will introduce grain.
    Transcode the footage to ProRes 422 through Log & Transfer. See Shane Ross' tutorial for tapeless workflows: http://library.creativecow.net/ross_shane/tapeless-workflow_fcp-7/1
    If you need to do a lot of correction use Color instead of FCP for best results.
    Also see this: http://library.creativecow.net/harrington_richard/final_cut_white_balance/1
