Unable to get correct gamma curve. Any help would be greatly appreciated. #79
Update: I think I've managed to figure out where the problem lies. Displaycal/novideo is not reading the correct
Edit: If I clamp to sRGB before profiling, the gamma curve applies correctly, but the delta E's are not good.
This is only necessary if you have installed another profile by mistake (a custom display profile), since sRGB should be the default for your OS. It's okay to do it just to make sure. You can also inspect the gamma curves loaded on your GPU to check that no calibration curves are being applied. And you don't need to keep DisplayCal's profile loader in the background; you can close it and disable its auto-start.

I'm not sure about the origin of your problem, but keep in mind you're working with an 8-bit display, so this may be related to rounding errors somehow. You can check whether you get better results using dithering (8-bit temporal, for example). Sometimes you need to restart for the dithering to take effect.
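As a rough, self-contained illustration of the rounding-error idea above (this is not code from novideo_srgb or DisplayCal, and real GPU dithering is spatial/temporal rather than random), the Python sketch below quantizes an ideal gamma-correction ramp to 8 bits, once with plain rounding and once by averaging many randomly dithered frames:

```python
import numpy as np

# Ideal 1D gamma-correction ramp (values the GPU would like to output),
# quantized to 8 bits with plain rounding vs. averaged random dithering.
x = np.linspace(0, 1, 4096)
target = x ** (1 / 2.2)                      # ideal LUT output in [0, 1]

plain = np.round(target * 255) / 255         # straight 8-bit rounding

rng = np.random.default_rng(0)
# Averaging many dithered frames mimics what temporal dithering achieves:
# each frame rounds up or down at random, and the average approaches the
# ideal value even though every single frame is still 8-bit.
frames = [np.floor(target * 255 + rng.random(x.size)) / 255 for _ in range(64)]
dithered = np.mean(frames, axis=0)

print("max error, plain 8-bit rounding:", np.abs(plain - target).max())
print("max error, dithered average   :", np.abs(dithered - target).max())
```

The averaged dithered ramp typically lands noticeably closer to the ideal values than plain rounding does, which is the same reason an 8-bit pipeline without dithering can visibly bend a measured gamma curve.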
Yeah, I see. Windows Update automatically installed a Dell D6500 ICC profile on the system a while back, and when I removed it, I got a good gamma curve result, same as when I first tried your method of installing sRGB as the default via Displaycal. Unfortunately, after either restarting the PC or changing some settings in Nvidia, it stopped working again for the next profiling attempts (I think DisplayCal's profile loader may have been causing some issues, e.g. after turning on G-Sync it would refresh the screen or something). I then removed the DisplayCal profile loader from startup and quit the app, and it seemed to work again (and I could change settings in Nvidia freely while still getting good results).

The next day, I decided to go into "Color Management" and reset the settings, hoping to get rid of any profiles being loaded, just in case. After that it stopped working again, sigh. I've also toggled "Override to reference mode" in "Adjust desktop color settings" on and off. Could that have had a negative impact on profiling?

The strange thing is that I've tested 2 monitors and 2 colorimeters, and also tried using my laptop instead of the desktop (which I reinstalled, first with Win 11 and now Win 10), and they all have the same problem with the end of the gamma curve. So, logic dictates that it must be a user error or something to do with Displaycal.
How would I go about doing this? Just testing it with verification in Displaycal? And how would I know from the results whether any gamma curves are being loaded?
The main screen, the AW2721D, is actually 10-bit, but only when using 144 Hz. I use it at 240 Hz, 8-bit, instead. I don't know if that makes any difference. I'll try your tip of using dithering while calibrating tonight and see if that has any impact. Thank you so much for answering my call for help, by the way. I really appreciate it.
Hi, it's been a while since I actually worked on this code, but I'll try to help anyway:
This just means that the profile needs to either be a 3D LUT profile (which is the default) or a matrix profile with black point compensation disabled (with the latter being preferred). Otherwise, if you create a matrix profile with black point compensation enabled, the curves in the profile will just start at 0 as if the display had true blacks, which means novideo_srgb can't calculate an accurate gamma curve with the desired input/output offset. The settings you posted all look correct to me. You could also try enabling "Override to reference mode" in the NVIDIA control panel, which will prevent any potentially loaded VCGT or other color settings from interfering and should, afaik (despite their own documentation saying otherwise), still allow the novideo_srgb calibration to work.

As for troubleshooting, you can try disabling novideo_srgb, unchecking "Use simulation profile as display profile" (leaving everything else as in your screenshot), and then running the measurement report. If the gamma looks right on that, then that would mean the profile is good and the gamma deviations when using novideo_srgb come either from something in my code (bugs or misunderstandings about how the GPU pipeline works), or from some interaction between the novideo_srgb calibration and how ArgyllCMS uses the VCGT to simulate a higher bit depth when measuring patches, or maybe (lack of) dithering, etc.

Since I don't have access to any actual documentation on how the GPU applies this transform and how it dithers, or at least something like a signal analyzer to verify the actual GPU output, it's really hard for me to say whether my code works as it should. You could maybe try doing one or more of these:
Also, can you post the measurement report HTML file? It's hard to tell from just the gamma plot how far off it really is, especially considering that the gamma value is less meaningful near white.
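On the point about gamma being less meaningful near white: the effective gamma reported for a patch is essentially log(Y/Y_white) / log(x), and since log(x) approaches 0 as the input approaches white, even tiny luminance errors swing the reported gamma wildly at the top end. Here is a quick sketch with made-up numbers (a hypothetical display tracking gamma 2.2 exactly, measured with a 0.5% luminance error), not taken from any of the reports in this thread:

```python
import math

# Hypothetical display that tracks gamma 2.2 perfectly, with every patch
# measured 0.5% too bright; white is normalized to 1.0.
for x in (0.25, 0.50, 0.90, 0.97):
    y_true = x ** 2.2
    y_meas = y_true * 1.005                      # +0.5% measurement error
    gamma_eff = math.log(y_meas) / math.log(x)   # effective gamma per patch
    print(f"input {x:.2f}: effective gamma {gamma_eff:.3f}")
```

At 25% input the reported gamma stays around 2.20, while at 97% the same 0.5% error already drags it to roughly 2.04, so a drift at the very top of a verification plot says much less than a drift in the midtones.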
In order to check if there are calibration curves loaded on the GPU, you only need to search for and open the "Curves" program. It should be installed on your PC since it comes along with DisplayCal; just search for it on Windows. You can also use the "Profile info" program to check the currently installed profile.
Hi. Thanks for reaching out to help!
Hmm, I've tried that several times actually, but with the same result sadly.
With the newly created profile selected as preset, I assume? I'll upload the results of these verifications as soon as I can tomorrow.
I ran some quick profiles before going to work and actually managed to get a good result (as far as I can tell), with the gamma curve applying as I think it should. It was the third attempt, and like @Tallulah88 suggested, I tried profiling with dithering enabled in Novideo_sRGB, using SpatialDynamic2x2 (the dithering option others have been recommending). I also forgot to change the profile type to Curves + Matrix, so I used Xyzlut + Matrix this time instead, but was still using the same 265 greyscale test chart. Unfortunately I didn't have time to test more, so I don't know which of the two changes made the difference, or whether it will work the next time I profile, but it was probably the dithering, seeing as I've tried Xyzlut several times before with lackluster results.

I uploaded three different verifications here: https://drive.google.com/drive/folders/13QwVUBf-J-PjeRAgZRuQlrfT3o6tWXk4?usp=sharing (I couldn't figure out how to add HTML files to the comment). One of them shows how it looks when the profiling goes bad, and the others are sRGB and Rec709 verifications of the profile that seems to be working correctly. I will run more tests tomorrow, trying out what you've suggested!
Thanks for the suggestion @Tallulah88! I'll check this out and see if it can shine some light on what's going on.
Update: After running some more profiles, I've figured out that the deciding factor for whether the gamma applies correctly or not is having dithering enabled when creating the profile in Displaycal. I used Curves + Matrix, and this time also enabled "Override to reference mode" in the Nvidia settings. This is on the second monitor, the AOC 24G2U, which has the exact same problem as the AW2721D, so the testing should be equally valid.

- Verification of how the gamma looks when not using Novideo and no simulation profile in Displaycal:
- Novideo enabled, with no dithering used in creation of the profile nor before running the verification:
- Novideo enabled, with no dithering used in creation of the profile, but with dithering enabled before running the verification:
- Lastly, verification of the successful attempt, with dithering (SpatialDynamic 2x2) enabled when creating the profile and when testing:

Posting another link here to the verification tests and images of Curves before profiling:
Thank you for the extensive testing! I added a note in the README with a link to your last comment.
Glad that we seem to have found the culprit in the end, and that the testing might help somebody else in the future. Huge shoutout to @Tallulah88 for the tip about dithering. It's strange that this "problem" doesn't seem to affect most other people though. At least that's the impression I get from reading through the other threads.

I don't know if you already did, but if you do have some time in the near future, could you maybe take a look at the verifications in the link I posted and see if the final result looks like it's 100% working as intended? As far as I can tell it does, but then again I'm a layman in the world of monitor calibration. And thanks for making this tool, btw. Very much appreciated!
It looks great to me, perfect results! Years ago I used to worry a lot about having the perfect calibration, but eventually you realize that the majority of the public consumes content on uncalibrated screens and it doesn't matter that much. Likewise, most independent artists who produce content at home also tend not to have good color management. In fact, now with the arrival of many P3 screens that are used without color management, all of this is becoming more complicated and "subjective."

Anyway, what I want to say is that, unless for some reason you need a workflow of maximum precision, I recommend not worrying too much about having such a "perfect" calibration. And even if you did need that level of perfection, factors such as the precision of the spectral correction or the uniformity of the screen would limit you anyway.
Hehe, good to hear. Thanks for giving me the peace of mind! And you're right, of course. It isn't really worth obsessing over it too much, as you can never be sure it's gonna be 100% correct anyway. It's just a little bit of an OCD tendency of mine to want to get it "just right".
I don't mean to spam this thread with comments, but I've got one final question for you guys. Hope you don't mind! Would there be any point in creating a 3dlut for Reshade and using that on top of Novideo-sRGB in fullscreen games? I am using "Override to reference" mode, which states that "color adjustments from the OS or other adjustment applications are ignored and rendered pixels are processed through the GPU pipeline". Does that mean that games in fullscreen make use of the novideo VCGT? If there is any point in making such a 3dlut, would the process of doing so be to:
I believe novideo_sRGB worked great on all my games. You can make a profile with a very different whitepoint and use it to check whether novideo_sRGB is working when you open a game in fullscreen mode. Or just change the gamma target to a very different value when you apply the profile and check whether you can see the difference in the game.
Thank you, I'll try that 👍
But how did you enable dithering with no profile loaded in novideo? For me the "clamped" tickbox is greyed out when I have the mode set to EDID since my monitor is already natively sRGB only. Or does "dithering" not require "clamped" to be enabled, and actually always enable dithering at the driver level as long as dithering is set to "enabled"?
I can't understand this either.
You don't have to clamp for the dithering to be enabled. As long as you enable it in the advanced tab, it should be applied.
Hi. So I've been trying to get my monitors calibrated again using Displaycal and Novideo_srgb. I've followed the advice in the README as well as some of the comments in #18 (comment) and #32 (comment).
Correct Gamma curve Dell AW2721D:
Incorrect Gamma curve Dell AW2721D:
For some strange reason, I can't get the gamma curve to apply correctly. After hundreds of attempts, I've only been able to get it right once per monitor, but then the next time I try to calibrate/profile again, the gamma is wrong. If someone could take a look at the pictures I've included and tell me if there's anything I could be doing wrong, I would be so incredibly thankful. For reference, this is what I've been doing:
In Displaycal I set Settings to "Default: (gamma 2.2)", choose the "LCD PFS Phosphor 94% P3" correction, and "Fullrange RGB 0-255".
In Calibration, I set everything to "as measured" except whitepoint, which is set to "6500k daylight" (I've also tried "as measured").
In Profiling, I set the profile type to "Curves + Matrix", leave "Black point compensation" unticked, and use the "265 greyscale" test chart.
After DisplayCal finishes, I choose "Don't install profile" and then open NoVideo_Srgb.
In Novideo_Srgb I choose "use icc profile" and check "Calibrate gamma to relative 2.2" with "0% black output offset" and then clamp.
In Displaycal I then go to "Verification" and choose "Rec 709", "Use simulation profile as display" and "Custom Gamma: 2.2 Relative 0% Black output offset".
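For context on the "relative 2.2" / "0% black output offset" target used in the steps above, here is a hedged sketch of the general idea (not novideo_srgb's actual code, and the luminance numbers are invented): with 100% black output offset the display's black level is simply added on top of a pure power curve, while with 0% output offset the black is absorbed by a BT.1886-style input offset, so the curve still passes through the measured black and white points.

```python
# Hedged sketch: gamma-2.2 target curves for a display with an invented
# black level of 0.15 cd/m^2 and white level of 120 cd/m^2.
Yw, Yb, g = 120.0, 0.15, 2.2

def target_100_output_offset(x):
    # 100% black output offset: black is added on top of the power curve.
    return Yb + (Yw - Yb) * x ** g

def target_0_output_offset(x):
    # 0% black output offset: BT.1886-style input offset, solved so the
    # curve hits (0, Yb) and (1, Yw) without a flat pedestal at black.
    a = (Yw ** (1 / g) - Yb ** (1 / g)) ** g
    b = Yb ** (1 / g) / (Yw ** (1 / g) - Yb ** (1 / g))
    return a * (x + b) ** g

for x in (0.0, 0.1, 0.5, 1.0):
    print(f"{x:.1f}  {target_100_output_offset(x):8.3f}  {target_0_output_offset(x):8.3f}")
```

Either way, solving for these targets requires the profile to report the real black level, which is why black point compensation has to stay disabled, as discussed elsewhere in this thread: with compensation enabled, Yb would effectively be reported as 0 and the 0%-offset form could not be solved for the actual display.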
I've checked to see if any Eco or Dynamic brightness setting is turned on in the OSD, but it's not. I've also turned off G-sync/Free-sync as well as the Overdrive/Fast input options on the AOC 24G2U. Not to mention, I've tried 2 separate colorimeters, one being a standard X-Rite ColorMunki and the other a ColorChecker Plus.
These are my Nvidia and Color management settings:
If anyone can identify what I'm doing wrong, please tell me.