HDR mode black level is way off!

Something I've just tried: Nvidia has a new feature in its drivers, an enhanced colour accuracy mode.

By default it is set to Enhanced. I've just tried ticking the box that says Reference Mode and applying that, and it seems to make the blacks look much better.

I've only had a chance to look around V's apartment but her computer monitor looks blacker, as do other areas.
Post automatically merged:

I should probably mention, it's in the "Adjust desktop colour settings" page of the Nvidia Control Panel.

I just stumbled onto your post when scouring the web for info on fixing this damn black crush, not just in Cyberpunk but in Division 2 and other HDR games. You may be on to something with this. I just briefly tested it on my AW2721D in Division 2's White House area, stepping into the dark spots by the Recalibration desk. On my AW3821DW, there would've been total black crush on my character's back. With your recommendation to switch NVCP to Reference mode, I could make out the back of my jacket and some details. That's a big improvement. I'm going to test it more extensively tonight on the ultrawide and see if it really helped.


I'm interested in your opinion though... do you feel Reference helped or worsened colour accuracy? Because before, I was using "Accurate". Not sure if it defaults to that, or if it kicked in when I calibrated my SDR ICC profiles.



This guy's video just confirmed my initial theory: the devs were working in the wrong gamma range to begin with when they made the game. Look at the end of his video, when he's in the apartment in SDR mode. Even in SDR, nothing in the scene that's supposed to be black touches, or comes anywhere near, "0" on the spectrum. That tells me the developers' monitors were incorrectly calibrated to begin with. My guess is that their monitors were calibrated around gamma 3 or 4, which crushes everything from roughly 20-30% IRE down to the same level as 0% IRE. So they lit their entire game using 20 or 30% IRE as black, as opposed to 0% IRE. When the game reaches consumers, whose monitors are set to a gamma of 2.2 or somewhere close to 2, that 20-30% IRE is actually dark grey, so everything looks washed out. This is worse than I thought. If this is right, their SMPTE curve looks more like a horizontal line than a proper gamma curve.
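For anyone who wants to sanity-check the gamma theory above, here's a rough sketch of the math behind it. This is my own illustration of how display gamma maps signal level to luminance, not anything pulled from the game or the driver:

```python
# A display's output is roughly: displayed luminance = signal ** gamma,
# with both signal and luminance normalised to 0..1.

def displayed_luminance(signal: float, gamma: float) -> float:
    """Relative luminance (0..1) a display produces for a 0..1 signal."""
    return signal ** gamma

# A ~25% signal that artists on a hypothetical gamma-4 display would see
# as essentially black:
authoring_view = displayed_luminance(0.25, 4.0)  # ~0.004, effectively black
# The same signal on a typical consumer gamma-2.2 display:
consumer_view = displayed_luminance(0.25, 2.2)   # ~0.047, a visible dark grey

print(f"gamma 4.0: {authoring_view:.4f}")
print(f"gamma 2.2: {consumer_view:.4f}")
```

So content lit to look black on a very high-gamma display would indeed come out as dark grey, i.e. washed out, on everyone else's 2.2-ish screens, which matches what the theory predicts.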

The only other explanation is that they intentionally authored the game with this muddy, washed-out look on a standard gamma-2.2 monitor. Given their widely criticized PR at this point, I honestly wouldn't be surprised if they came out and said it's intentional. But who in their right mind would do that? What kind of art director would sign off on a look like this in a game is beyond me.

All in all, the problem goes way beyond HDR. First, the HDR mode is not properly tone mapped, and no proper black reference image is included. Then we find out that even in SDR mode, the game is using the wrong signal level as black.

I've already submitted a ticket to refund my game. This is just not worth my time and money.

You're doing the Lord's work with this post. That explains so much. The sad part is you pretty much have to play this game with HDR on. I tried it without and it looked so...bizarre is the only way to describe it. Terrible would be the only other way I'd describe it. I think it's because I started out playing with HDR10 sRGB. It really changes the lighting in this game for the better on top of the neon light bloom. That being said, even in HDR10 sRGB, it still can't help the washed out look, bland texture quality and horrible tonemapping.
 

I'm not sure exactly what was going on with that; I think there were several issues at once, mostly on the driver side.

A few days later, Nvidia released a hotfix to correct black levels in HDR mode across the board. The fix should be included in their latest drivers.

I can say with confidence that with the latest drivers I am getting very close to accurate colours, with the driver set to Enhanced (Reference box unchecked).

When I tell MadVR that my monitor is calibrated, so it knows to expect a calibrated monitor when playing video, everything looks nicely balanced. I take this as a good sign that everything is in order.
 

I noticed better whites as soon as I switched it to "Reference" from "Accurate", but after further testing, my colours do feel slightly muted in Division 2. I haven't even gotten around to setting up MadVR for it, other than choosing the pixel shader settings. It seems like just telling MadVR your display is calibrated isn't going to work right, because once you engage HDR mode in Windows, it stops using your calibrated SDR ICC profile and switches to your panel's default HDR profile/settings, and you have to turn HDR mode on for HDR videos to work. People who know more about this than me seem to advise against trying to calibrate your HDR mode. I've yet to try it, but I'm considering giving it a shot. I wish there was an easier way to tell the panel to "use this ICC profile for SDR and this other ICC profile for HDR."

I'm probably going to try just making a 3DLUT with my i1Display Pro Plus and DisplayCAL and using that in MadVR. DisplayCAL even has options specific to MadVR. I think it should be easy enough to set up a BT.2020 3DLUT for it, and it should be way more accurate. It's pretty much necessary, since I don't think a normal SDR ICC profile covers the BT.2020 gamut. I mostly just see people targeting the 2.2 gamma curve, since that's very close to sRGB. I'll dick around with it some more.
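For what it's worth, a 3DLUT is conceptually just a lookup cube that maps input RGB to corrected RGB; DisplayCAL measures your panel and fills in the corrections, and MadVR applies the cube per pixel. Here's a toy sketch of the idea, an identity LUT with a made-up gamma tweak baked in. This is purely illustrative and has nothing to do with DisplayCAL's actual file format:

```python
import numpy as np

SIZE = 17  # real LUTs for MadVR are usually much denser, e.g. 65^3 or 256^3

# Identity grid: lut[r, g, b] = (r, g, b), with all channels in 0..1
grid = np.linspace(0.0, 1.0, SIZE)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
lut = np.stack([r, g, b], axis=-1)

# Bake a correction into the cube, e.g. shift effective gamma from 2.4 to 2.2
# (a made-up correction for illustration; a real LUT holds measured data)
lut = lut ** (2.2 / 2.4)

def apply_lut(rgb: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Nearest-neighbour lookup; real renderers interpolate trilinearly."""
    n = lut.shape[0]
    idx = np.clip(np.round(rgb * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]

out = apply_lut(np.array([0.5, 0.5, 0.5]), lut)  # mid-grey gets lifted slightly
```

The appeal over an ICC profile is that the cube corrects every RGB combination independently, so it can handle a wide gamut like BT.2020 where a simple per-channel curve can't.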