Monitor settings - RGB or sRGB or ?

Hi,
With two new Dell UltraSharp U2415 monitors, what settings should I use on Windows 8.1 Pro, and should they be applied on the physical monitors, on the GPU, or on both?
Currently, the monitors themselves are set to RGB, and on the PC, under the display settings, I have set the ICC profile to "DELL U2415 Color Profile, D6500".
I also have an NVIDIA GeForce GTX 760.
Moreover, should I run some kind of display test to verify color accuracy? If so, is there any freeware you can suggest?
Thank you

Okay, I put my Windows Vista Business hard drive back in my Mac Pro and booted off it.
I installed X-Rite Eye-One Match 3.6.2, connected my Eye-One Display 2, profiled my monitor with a custom ICC profile, and rebooted.
Now I am seeing exactly the same Safari behavior I see on OS X 10.6 using this hardware (and on my other Mac Pros running profiled 30" Apple displays): a slight shift in the untagged sRGB rollover.
It looks like Safari for Windows is now defaulting untagged sRGB to my custom monitor profile, the same as my OS X machines, and as I expected it to work on the PC.
+++++
Previously, my Windows system was using whatever profile my NEC 2490WUXi set up by default; I had a feeling setting a custom profile would provide a clue.
Seeing is believing...
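If you want to sanity-check which profile Windows actually handed out and what it calls itself internally, a few lines of Python with Pillow's ImageCms module will read the profile's description. A minimal sketch, assuming a profile file in the standard Windows color directory; the exact .icm filename below is a guess, so substitute whatever you find there:

from PIL import ImageCms

# Windows stores installed display profiles in this directory; the
# .icm filename below is hypothetical -- check the folder for yours.
path = r"C:\Windows\System32\spool\drivers\color\DELL_U2415.icm"

profile = ImageCms.getOpenProfile(path)
print(ImageCms.getProfileDescription(profile))  # internal name, e.g. "DELL U2415 Color Profile, D6500"
print(ImageCms.getProfileInfo(profile))         # fuller dump: model, copyright, etc.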

Similar Messages

  • ProPhoto RGB Conversions to Adobe RGB or sRGB

    With Lightroom using ProPhoto RGB with a 1.8 gamma, what happens to the image on the monitor when converting to either Adobe RGB or sRGB whose gamma is 2.2? Are the image colors simply remapped?

    Simple answer: yes... _IF_ you have an accurate profile of your display, Lightroom will use it to correctly display the image on screen. That's a double-edged sword, though: if you DON'T have an accurate profile of your display, the image on screen won't be accurate.
    And yeah, forget about any relationship between a "working space" (an image's RGB specs) and your display's "profile". There is none, other than the fact that a color-managed app will display the image's color space correctly given an accurate display profile.
    Did I mention it's really important to have an accurate display profile?
    :~)
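    To make the "simply remapped" part concrete: below is a minimal sketch of a profile-to-profile conversion using Python and Pillow's ImageCms module (LittleCMS underneath). The input filename and the ProPhoto profile path are assumptions; the rendering intent is where the policy for out-of-gamut colors is chosen, and the 1.8-vs-2.2 gamma difference is handled by the profiles' tone curves as part of the same transform, not as a separate step.

    from PIL import Image, ImageCms

    im = Image.open("photo.tif")  # hypothetical: a ProPhoto RGB-tagged image

    prophoto = ImageCms.getOpenProfile("ProPhotoRGB.icm")  # hypothetical path
    srgb = ImageCms.createProfile("sRGB")                  # built in-memory

    # Perceptual intent squeezes out-of-gamut colors in smoothly;
    # relative colorimetric would clip them to the gamut boundary instead.
    converted = ImageCms.profileToProfile(
        im, prophoto, srgb,
        renderingIntent=ImageCms.INTENT_PERCEPTUAL,
    )
    converted.save("photo_srgb.tif")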

  • I can convert almost anything multiple times at the same time, including CMYK to RGB (or sRGB), but not RGB to CMYK, and not brightness/contrast. How?

    I can convert almost anything multiple times at the same time, including CMYK to RGB (or sRGB), but not RGB to CMYK, and not brightness/contrast. How?

    If you want a relevant answer you may have to elaborate on what you actually mean. Posting screenshots might help, too.
    In any case, multiple conversions of an image are not advisable in general.

  • Converting RGB images (sRGB or Adobe RGB) to 709 color space.

    I'm trying to determine the correct way to convert RGB images (sRGB or Adobe RGB) to the Rec. 709 color space. I can't just use the "Convert to Profile" function to do this, because it does not produce results that fall within the 16-235 range that Rec. 709 dictates. I've read that you can simply use the "Levels" adjustment and change the output levels to 16-235. While this would clearly compress the luminance to the correct range, I'm not entirely clear whether the end result would be a proper conversion (i.e. whether color and gamma, for example, would be technically correct).
    I noticed that converting the profile to "HDTV (Rec. 709)" does alter the image, so I'm wondering what the result would be if I did both this AND used the levels control to compress the output range to 16 to 235.
    Thanks for any feedback on this.

    (1) http://en.wikipedia.org/wiki/Rec._709
    (2) http://en.wikipedia.org/wiki/Rec._601
    The transfer functions for Rec. 709 (1) refer to the range [0..1], or for 8 bits per pixel [0..255].
    It seems that the clipping to black=16 and white=235 has to happen after the application of the transfer function. If this is true, then we don't have a level compression but a level clipping at both ends, just as for Rec. 601 (2).
    The ICC profile HDTV (Rec. 709) in Photoshop contains the primaries and the white point (both the same as in sRGB) and the transfer functions for [0..1], coded by a high-resolution LUT, as found by Profile Inspector. There is no clipping.
    By the way, that's the internal profile name; I don't know the file name of the profile.
    Softproofing, source in sRGB, target HDTV(Rec.709), without clipping:
    With numbers not preserved: no change of the appearance, as expected.
    With numbers preserved: shows the effect of different effective gammas. 
    Your questions are very clear, and I'm not sure whether my comments help. The information on the internet is not convincing.
    Best regards --Gernot Hoffmann
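    For reference, the two operations under discussion can be written down directly. A small numpy sketch, using the Rec. 709 OETF constants from (1): first the transfer function over [0..1], then a scale (not a clip) of the signal into the 8-bit range 16-235, which is what the Levels output sliders do:

    import numpy as np

    def rec709_oetf(linear):
        # Rec. 709 transfer function: linear light [0..1] -> signal V [0..1]
        linear = np.asarray(linear, dtype=float)
        return np.where(linear < 0.018,
                        4.5 * linear,
                        1.099 * np.power(linear, 0.45) - 0.099)

    def full_to_legal(v):
        # Scale signal [0..1] into the 8-bit legal range 16..235
        return np.round(16 + 219 * v).astype(np.uint8)

    ramp = np.linspace(0.0, 1.0, 5)
    print(full_to_legal(rec709_oetf(ramp)))  # [ 16 ... 235]

    Whether a proper conversion wants that scaling at all, rather than leaving the clipping to the video pipeline, is exactly the open question above.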

  • Where are monitor settings stored besides krandrrc

    Hi.
    I would like to ask for help. I recently ran into an interesting issue with the KDE monitor system settings. KDE System Settings writes one config file, ~/.kde4/share/config/krandrrc, and when you use the KDE monitor settings, the configuration (resolution, refresh rate, orientation when two LCD displays are connected, and so on) is stored there.
    I have a dual monitor setup, and I found this issue through a mistake. I chose to disable my second screen and hit save, which is what I wanted. Then, by mistake, I disabled my first, primary screen, and that choice was saved on the spot (I thought that if I didn't accept the settings they would be restored, like in Windows, but they were not). So I ended up with both screens disabled: after each boot I log in, my picture goes off, and my LCD monitor shuts itself off.
    I tried deleting the krandrrc file in ~/.kde4/share/config, but it didn't make a difference. Is there a different config file for monitor settings in KDE? How can I restore it? Is there a way? I did recover once by chance (I logged in blind, hit Alt+F2, typed "terminal", and then ran xrandr --auto, but this method needs luck, because you can't see anything with the screen disabled). I also deleted xorg.conf, but on Arch it isn't actually needed because of udev. I tried logging into a tty with Alt+F1, but running xrandr --auto there makes no difference; it says it can't connect to the X session, so it only works from within the GUI.
    Could somebody help? Maybe you have a fix for it?
    Thanks.
    Last edited by firekage (2014-10-17 11:57:36)

    Hi firekage,
    I came across your post while trying to fix a similar issue on my laptop.  I don't have a krandrrc file.  However, I found that if I deleted the files in my ~/.kde4/share/apps/kscreen directory I was able to "start over" and configure my displays again.
    Hope that helps you out.
    --Ted
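    If anyone else lands here: a slightly safer variant of Ted's fix is to move the kscreen directory aside instead of deleting it, then log out and back in:
    mv ~/.kde4/share/apps/kscreen ~/.kde4/share/apps/kscreen.bak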

  • Monitor settings incorrect and can't get to system prefs to change them

    Hi,
    I was experimenting with screen resolution settings and somehow ended up with one that doesn't let me get to any Finder controls or to the Apple menu. So I can't use my computer.
    Help!

    Thanks BD. And thanks for the welcome.
    I couldn't get it to start in safe mode no matter when I pressed and released the Shift key. I realized that the computer is behaving as if it has 2 monitors, which is what it had when I made the mess. I tried Detect Displays, to no avail. BUT I logged out and signed in as another user, and: no problem! Thus, the monitor settings must somehow be a function of a user's prefs. The upshot is that I can now use the computer, albeit as a different user. A potential problem may be that the original (screwed-up monitor) user is the administrator. My problem is no longer so critical, though, and I really appreciate the help.
    TB

  • Monitor settings lost when switching user accounts.

    Hi! I just recently noticed that every time I switch user accounts, my monitor settings get corrupted.
    There are 2 user accounts on my iBook. One account is used for work, while the other one is used for home purposes. The work account has a specific monitor setting which is calibrated to our workstations. The other account has default settings.
    My question is: when I switch users, the settings for my work account get corrupted. I also can't change the settings without having to restart the iBook.
    I already repaired permissions and zapped the PRAM. The same thing still happens.
    Thanks in advance!

    Hello Chuck!
    I've had this problem too. Like you, I managed to navigate to do a restart, but nothing else would make the mouse cursor reappear.
    Bruce, I wasn't using Biggy, PinPoint or any of the other mouse visibility things, simply the magnification provided by System Preferences.
    I put it down to a fault in 10.4.4 (a very buggy release) and haven't yet been able to reproduce it on 10.4.5.
    I suggest you send a fault report to Apple ([email protected]), explaining all the circumstances. Try always undoing the cursor zoom before switching users (yes, that's a pain too!!!). And if I manage to reproduce the problem, I promise I'll send in a fault report too!
    HTH
    Archie

  • iWeb auto-adjust for different computer monitor settings

    Nobody has the same display settings on their monitor, so how can I get iWeb to auto-adjust to different computer monitor settings?

    Are you referring to screen size? If so, many prefer to have their browser window set to the size they want, unlike on PCs, where the browser usually fills the whole screen, blocking out the desktop.
    Why not set your site width to a size that can be used by all: desktops, laptops and mobile devices. Apple recommends a 980 pixel width for that.
    There was a post about this some time ago with, if I recall correctly, some code to do that. I think Cyclosaurus posted it, but I am not sure.
    I know it's a personal preference, but when I land on a site that forces my browser to fill the screen, I leave and don't go back.
    OT

  • Where are KDE Battery Monitor settings?

    I previously used Debian 7 Stable with KDE 4.8. Back in those days, I could right-click on the battery widget and would see options for "Power settings" and "Battery Monitor settings", where the latter covered only settings concerning the widget itself, i.e. showing charge information, the shortcut key, etc.
    Now that I'm on Arch with KDE 4.12, when I right-click on the battery monitor, the only entry is "Battery Monitor Settings", but when I click it, I am taken to the menu for what was previously referred to as "Power Settings", that is, system-wide things like screen brightness, etc.
    Where can I find settings for the Battery Widget itself? For example, I'm trying to get it to show the estimated remaining time when clicked, and I can't find a place to do so. It's gotta be somewhere, right?

    So I tried that, but ran into a few snags -
    1. In that fix, they say to add the string showRemainingTime=true under another line about showMultipleBatteries. Unfortunately, since there's evidently no menu anymore, I can't enable showing multiple batteries, so I'm not 100% sure I added it in the correct location.
    2. After I added that line, it still is not working. For reference, here is my ~/.kde4/share/config/plasma-desktop-appletsrc file:
    http://pastebin.com/SiKRS1dh
    The line I added is on line number 238.
    Any thoughts?
    Last edited by TheGuyWithTheFace (2014-02-13 18:44:10)
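    For reference, the group that line should end up in looks roughly like the sketch below. The containment and applet numbers here are hypothetical and will differ in your plasma-desktop-appletsrc; match them to the applet whose plugin entry says battery:
    [Containments][1][Applets][5][Configuration][General]
    showRemainingTime=true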

  • Gnome Crashes on Login after messing with dual monitor settings

    GNOME displays the sad monitor with the error message "..something has gone wrong.." and forces me to log out. I only started to experience this after messing with dual monitor settings in the NVIDIA settings. The dual setup (TwinView) worked, but once I rebooted, I couldn't get it going for a while.
    Here's my xorg.conf
    # nvidia-xconfig: X configuration file generated by nvidia-xconfig
    # nvidia-xconfig:  version 285.05.09  ([email protected])  Fri Sep 23 19:18:19 PDT 2011
    Section "ServerLayout"
        Identifier     "Layout0"
        Screen      0  "Screen0"
        InputDevice    "Keyboard0" "CoreKeyboard"
        InputDevice    "Mouse0" "CorePointer"
    EndSection
    Section "Files"
    EndSection
    Section "InputDevice"
        # generated from default
        Identifier     "Mouse0"
        Driver         "mouse"
        Option         "Protocol" "auto"
        Option         "Device" "/dev/psaux"
        Option         "Emulate3Buttons" "no"
        Option         "ZAxisMapping" "4 5"
    EndSection
    Section "InputDevice"
        # generated from default
        Identifier     "Keyboard0"
        Driver         "kbd"
    EndSection
    Section "Monitor"
        Identifier     "Monitor0"
        VendorName     "Unknown"
        ModelName      "Unknown"
        HorizSync       28.0 - 33.0
        VertRefresh     43.0 - 72.0
        Option         "DPMS"
    EndSection
    Section "Device"
        Identifier     "Device0"
        Driver         "nvidia"
        VendorName     "NVIDIA Corporation"
    EndSection
    Section "Screen"
        Identifier     "Screen0"
        Device         "Device0"
        Monitor        "Monitor0"
        DefaultDepth    24
       #Option         "TwinView" "True"
       #Option         "MetaModes" "nvidia-auto-select, nvidia-auto-select"
        SubSection     "Display"
            Depth       24
        EndSubSection
    EndSection
    I commented out the two lines below to get into GNOME again, but logging in still feels shaky and unstable. Sometimes I will get the error, and after a logout and login, it'll work.
      #Option         "TwinView" "True"
       #Option         "MetaModes" "nvidia-auto-select, nvidia-auto-select"
    Extra notes: I have another desktop session with awesome installed (uninstalled now) and could log into that with no problem. Awesome was giving me trouble, so I uninstalled it. I have 2 GNOME desktop sessions: my regular GNOME session (the one that crashes), and another with no window manager (which never crashed on startup).
    I installed awesome around the same time as configuring the dual monitor setup.
    Thanks.

    Update.
    The crashes are to be expected now. I'm circumventing the whole thing by logging into my other GNOME session without a window manager, logging out, and then logging back into my regular GNOME session with Metacity.
    I've noticed the tty sessions (not sure what they're called) are strange. Apparently my session runs on tty8, but when I try logging in it crashes me, and tty7 gives me a blank screen with a blinking underscore. The other ttys work normally. Not sure if this has anything to do with it.
    Is there a way to just reset the Gnome settings?
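    One hedged suggestion, since this started with display configuration: GNOME keeps its own monitor layout in ~/.config/monitors.xml, separate from xorg.conf, and moving that file aside should make GNOME redetect the displays on the next login:
    mv ~/.config/monitors.xml ~/.config/monitors.xml.bak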

  • A question about monitor settings and printing?

    I have a new PC with an HD monitor. It didn't come with an HDMI cable, but a VGA one. I recently got the HDMI cable and noticed that, comparing the monitor's default settings, the darks were darker over VGA than over HDMI (the colour depth looked best with VGA). I'm using PE9 and need to make an image to have printed at the print shop. I would love to use the HDMI cable, but I'm concerned that if I do (and I change the monitor's default settings for contrast, hue, etc. to compensate), the image will have been edited one way here and look totally different on the print shop's monitor...
    ... So, how can I be sure it will come back from the printers looking the same as when I edited it?
    Should I give up on the HDMI cable? I also have PrE 9 for video editing, so I'm a bit confused as to what is best to do...
    Thanks.

    There are no free calibration tools. Like most things, you get what you pay for. However, the low-end versions are quite good, so take a look at the X-Rite Huey.
    This will get your display calibrated, which ensures you see the best rendering of the image on screen. It does not guarantee good print matching, since that also requires a colour-managed print process and good viewing conditions.
    Colour settings in Elements are misleading. They apply to new documents being created in Elements, or to images being opened that don't contain a colour profile. Virtually all JPEG images from cameras have a profile tagged in them. So choose any setting but the one that says 'No Colour Management'. The best default, in my opinion, is 'Optimise for Screen'.
    Colin
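    If you want to verify Colin's point that your camera JPEGs carry an embedded profile, a few lines of Python with Pillow will show it ("photo.jpg" is a placeholder for one of your own files):

    import io
    from PIL import Image, ImageCms

    im = Image.open("photo.jpg")  # hypothetical: any camera JPEG
    icc_bytes = im.info.get("icc_profile")

    if icc_bytes:
        profile = ImageCms.getOpenProfile(io.BytesIO(icc_bytes))
        print("Embedded profile:", ImageCms.getProfileDescription(profile))
    else:
        print("No embedded profile; Elements falls back to its colour settings.")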

  • Make Nvidia-Settings monitor settings & PowerMizer stick

    If I open the nvidia-settings program, I can configure the OpenGL and antialiasing settings however I wish, quit, and they will permanently stick through every restart of the computer or X server.
    However, the "X Server Display Configuration" settings and the "PowerMizer" settings, although they do apply for my current session after I change them, are reset when X11 restarts.
    I want to set my two displays to clone and PowerMizer to performance mode.
    I have tried both the "Save to X Configuration File" button in the display section (saving it to a file in xorg.conf.d/) and going to the nvidia-settings Configuration page and using "Save Current Configuration" with the default path/filename for my user (rather than root).
    Nothing I've tried seems to work. What am I doing wrong, or just not doing?
    My window manager is a plain openbox-session.
    Last edited by rabcor (2013-10-11 01:03:09)

    rabcor wrote:
    nvidia-settings -l
    seems to do precisely nothing, and to my understanding, when the drivers load (when X runs) they should load ~/.nvidia-settings-rc, which is created when I use the "Save Current Configuration" button, if one exists (if not, I assume it just uses some default settings). Also, to my understanding, "nvidia-settings -l" is just meant to load exactly that file again.
    Whether it actually does that successfully or not I have not tested, but the file does not include the monitor settings when I save it, and I assume it doesn't contain the PowerMizer settings either, since those don't stick either.
    I seem to require an xorg.conf or an xorg.conf.d/ file with at least these contents to run X with the nvidia drivers. Last I checked, anyway, which was probably a few months back.
    Section "Device"
        Identifier     "Whatever"
        Driver         "nvidia"
    EndSection
    For me .nvidia-settings-rc wasn't loaded by default. I had to write it into my .xinitrc, too.
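    Concretely, the .xinitrc addition looks like the sketch below; --load-config-only is the long form of the -l flag quoted above, and the exec line assumes the plain openbox-session mentioned earlier:

    # ~/.xinitrc
    nvidia-settings --load-config-only &   # reapply ~/.nvidia-settings-rc
    exec openbox-session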

  • What are the monitoring settings?

    what are the monitoring settings?

    Hi Reddy, which monitoring setting are you referring to?
    Generally we don't need to maintain any settings for monitoring.
    Please let me know which setting you are talking about.

  • Someone reversed the monitor settings -- help

    A student reversed the monitor settings so the screen shows black with silver letters/graphics. I think they hit 3 keys that do this. Does anyone know the keys that will let me reset the monitor back to the original settings?

    Go to System Preferences -> Universal Access (under the System section). Click the "Switch to Black on White" button. The three keys were most likely Control-Option-Command-8, which toggles this same setting.
    Patrick

  • Why does Lightroom (and Photoshop) use AdobeRGB and/or ProPhoto RGB as default color spaces, when most monitors are standard gamut (sRGB) and cannot display the benefits of those wider gamuts?

    I've asked this in a couple of other places online as I try to wrap my head around color management, but the answer continues to elude me. That, or I've had it explained and I just didn't comprehend. So I continue. My confusion is this: everywhere, it seems, experts and gurus and teachers and generally good, kind people of knowledge claim the benefits (in most instances, though not all) of working in AdobeRGB and ProPhoto RGB. And yet nobody seems to mention that the majority of people, including presumably many of those championing the wider gamut color spaces, are working on standard gamut displays. And to my mind, this is a huge oversight.
    What it means is, at best, those working this way are seeing nothing different from photos edited/output in sRGB, because [fortunately] the photos they took didn't include colors that exceeded sRGB's real estate. But at worst, they're editing blind, and probably messing up their work. That landscape they shot with all those lush greens that sRGB can't handle? Well, if they're working in AdobeRGB on a standard gamut display, they can't see those greens either. So, as I understand it, the color-managed software is going to algorithmically rein in that wild green and bring it down to sRGB's turf (and this, I believe, is where relative and perceptual rendering intents come into play), and give them the best approximation within the display's gamut capabilities.
    But now this person is editing thinking they're in AdobeRGB, thinking that green is AdobeRGB's green, but it's not. So any changes they make to this image, they're making to an image that's displaying to their eyes as sRGB, even if the color space is, technically, AdobeRGB. So they save and output this image as an AdobeRGB file, unaware that they altered it while seeing inaccurate color. The person who opens this file on a wide gamut monitor, in the appropriate (wide gamut) color space, is now going to see this image "accurately" for the first time. Only it was edited by someone who hadn't seen it accurately. So who knows what it looks like. And if the person who edited it were there, they'd be like, "wait, that's not what I sent you!"
    Am I wrong? I feel like I'm in the Twilight Zone. I shoot everything RAW, and I someday would love to see these photos opened up in a nice, big color space. And since they're RAW, I will, and probably not too far in the future. But right now I export everything to sRGB, because, internet standards aside, I don't know anybody who I'd share my photos with who has a wide gamut monitor. I mean, as far as I know, most standard gamut monitors can't even display 100% of sRGB! I just bought a really nice QHD display marketed toward design and photography professionals, and I don't think it's 100%. I thought of getting the wide gamut version, but was advised to stay away because so much of my day-to-day usage would be with things that didn't utilize those gamuts, and generally speaking, my colors would be off. So I went with the standard gamut, like 99% of everybody else.
    So what should I do? As it is, I have my Photoshop color space set to sRGB. I just read that Lightroom as its default uses ProPhoto in the Develop module, and AdobeRGB in the Library (for previews and such).
    Thanks for any help!
    Michael

    Okay. Going bigger is better, do so when you can (in 16-bit). Darn, those TIFs are big though. So, ideally, one really doesn't want to take the picture to Photoshop until one has to, right? Because as long as it's in LR, it's going to be a comparatively small file (a dozen or two MBs vs say 150 as a TIF). And doesn't LR's develop module use the same 'engine' or something, as ACR plug-in? So if your adjustments are basic, able to be done in either LR Develop, or PS ACR, all things being equal, choose to stay in LR?
    ssprengel Apr 28, 2015 9:40 PM
    PS RGB Workspace:  ProPhotoRGB and I convert any 8-bit documents to 16-bit before doing any adjustments.
    Why does one convert 8-bit pics to 16-bit? Not sure if this is an apt comparison, but it seems to me that that's kind of like upscaling, in video. Which I've always taken to mean adding redundant information to a file so that it 'fits' the larger canvas, but to no material improvement. In the case of video, I think I'd rather watch a 1080p movie on an HD (1080) screen (here I go again with my pixel-to-pixel prejudice), than watch a 1080p movie on a 4K TV, upscaled. But I'm ready to be wrong here, too. Maybe there would be no discernible difference? Maybe even though the source material were 1080p, I could still sit closer to the 4K TV, because of the smaller and more densely packed array of pixels. Or maybe I only get that benefit when it's a 4K picture on a 4K screen? Anyway, this is probably a different can of worms. I'm assuming that in the case of photo editing, converting from 8 to 16-bit allows one more room to work before bad things start to happen?
    I'm recent to Lightroom and still in the process of organizing from Aperture. Being forced to "this is your life" through all the years (I don't recommend!), I realize probably all of my pictures older than 7 years ago are jpeg, and probably low-fi at that. I'm wondering how I should handle them, if and when I do. I'm noting your settings, ssprengel.
    ssprengel Apr 28, 2015 9:40 PM
    I save my PS intermediate or final master copy of my work as a 16-bit TIF still in the ProPhotoRGB, and only when I'm ready to share the image do I convert to sRGB then 8-bits, in that order, then do File / Save As: Format=JPG.
    Part of the same question, I guess - why convert back to 8-bits? Is it for the recipient?  Do some machines not read 16-bit? Something else?
    For those of you working in these larger color spaces and not working with a wide gamut display, I'd love to know if there are any reasons you choose not to. Because I guess my biggest concern in all of this has been tied to what we're potentially losing by not seeing the breadth of the color space we work in represented while making value adjustments to our images. Based on what several have said here, it seems that the instances when our displays are unable to represent something as intended are infrequent, and when they do arise, they're usually not extreme.
    Simon G E Garrett Apr 29, 2015 4:57 AM
    With 8 bits, there are 256 possible values.  If you use those 8 bits to cover a wider range of colours, then the difference between two adjacent values - between 100 and 101, say - is a larger difference in colour.  With ProPhoto RGB in 8-bits there is a chance that this is visible, so a smooth colour wedge might look like a staircase.  Hence ProPhoto RGB files might need to be kept as 16-bit TIFs, which of course are much, much bigger than 8-bit jpegs.
    Over the course of my 'studies' I came across a side-by-side comparison of either two color spaces and how they handled value gradations, or 8-bit vs 16-bit in the same color space. One was a very smooth gradient, and the other was more like a series of columns, or as you say, a staircase. Maybe it was comparing sRGB with AdobeRGB, both as 8-bit. And how they handled the same "section" of value change. They're both working with 256 choices, right? So there might be some instances where, in 8-bit, the (numerically) same segment of values is smoother in sRGB than in AdobeRGB, no? Because of the example Simon illustrated above?
    Oh, also -- in my Lumix LX100 the options for color space are sRGB or AdobeRGB. Am I correct to say that when I'm shooting RAW, these are irrelevant or ignored? I know there are instances (certain camera effects) where the camera forces the shot as a jpeg, and usually in that instance I believe it will be forced sRGB.
    Thanks again. I think it's time to change some settings..
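    Simon's staircase point is easy to demonstrate numerically. A small numpy sketch (not Lightroom's actual pipeline, just the arithmetic): darken a smooth ramp, brighten it back, and count how many distinct 8-bit output levels survive when the edit is done at 8-bit versus 16-bit precision:

    import numpy as np

    ramp = np.linspace(0.0, 1.0, 10000)  # an ideal smooth gradient

    def edit_at(bits):
        # Quantize to the working precision, darken 4x, brighten back,
        # then write out as 8-bit, the way a final JPEG would be.
        levels = 2 ** bits - 1
        x = np.round(ramp * levels) / levels      # storage precision
        x = np.round(x * 0.25 * levels) / levels  # a destructive darken
        x = np.clip(x * 4.0, 0.0, 1.0)            # brighten back
        return np.round(x * 255).astype(np.uint8)

    print(len(np.unique(edit_at(8))))   # ~65: three-quarters of the levels are gone
    print(len(np.unique(edit_at(16))))  # 256: the full 8-bit range survives

    The gaps left by the 8-bit pass are the staircase; at 16-bit the same edit comes through intact once it's downsampled to 8-bit at the very end.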
