[PC] HDR10+ GAMING works only via HDMI 2.1 - a false positive!

Hi guys :)

Just for fun, I switched to the HDMI port on my RTX 4080 while playing The First Descendant and - surprise, surprise - my monitor (Samsung Odyssey Neo G8, current officially recommended firmware 1011.3) switched to the HDR10+ GAMING mode (as presented in the Support/Information menu) for the first time ever! It was dark as hell, but hell yeah, I had found a new feature to play with!

But wait a moment - I remembered that this feature also existed in Cyberpunk 2077, so why did I have no recollection of that mode ever being reported by my monitor? Because I do remember switching the HDR10+ GAMING mode on there.

So, I have just reinstalled Cyberpunk 2077 - this time it is version 2.13 - and started the game, reverting to the DisplayPort connection.

And again - surprise, surprise - the HDR10+ GAMING switch in the Cyberpunk 2077 VIDEO options does NOTHING when set to ON while using DisplayPort 1.4 on the RTX 4080 card.

How do I know that:
+ the screen does not immediately get darker
+ the monitor doesn't report being in HDR10+ GAMING mode!

So, I think that we have a false positive here - can you check that on your side as well?

Requirements for testing:
+ Windows 11 24H2+
+ RTX 4000 card, driver version 566.14+
+ HDR functionality turned on in Windows 11
+ Cyberpunk 2077 version 2.13+ installed
+ HDR10+ GAMING compatible monitor
+ HDMI 2.1 port on the RTX 4000 card connected to the monitor using proper HDMI 2.1 cable (supporting 4K/120Hz/HDR)
+ the monitor must be set to 120Hz only (at least this is what I do on my 240Hz monitor; just limiting the refresh rate in Windows 11 was not enough for me)

But right after I switched to the HDMI port on the RTX 4080, the HDR10+ GAMING setting in Cyberpunk 2077 started to work properly, that is:
+ the screen got darker/brighter when toggling the HDR10+ GAMING switch on and off
+ the monitor switched between the HDR10+ GAMING and HDR modes on the fly, respectively

Was that behavior overlooked because almost no one uses an HDMI cable with the RTX 4000 series? Especially since you are limited to 120Hz on the HDMI 2.1 port if you want to use VRR over HDMI.

The difference in presentation when using the HDR10+ GAMING mode in Cyberpunk 2077 is night and day - go check it for yourselves :)
 
From the description, I believe what may be happening is that you've switched to the HDMI port that Windows associates with HDR functionality. According to everything I can find, the 4080 only has one HDMI 2.1 port, and the rest are DisplayPorts. Hence, you must be switching between the HDMI on the actual GPU and an HDMI on the motherboard or a secondary card connected to the mobo.

Now it gets fun.

Depending on the priority the GPU drivers have been assigned under Windows, it may be failing to recognize HDR functionality through the mobo. That should be changeable in either Windows Display Settings or the Nvidia Control Panel by forcing it to use the output you want.

The next layer is whether your monitor is capable of communicating correctly with the mobo/GPU, depending on which one is active. The drivers and firmware on the GPU seem to be interpreting the calls correctly, giving you at least access to HDR functionality within the game, but the mobo connections do not seem to be recognizing it. That may require a firmware update / BIOS flash for the mobo to get it working on that end. But there are still a lot of questions about the system config before bothering with something like that. (Not hard, but not yet necessary, so I wouldn't personally do a BIOS flash unless I absolutely knew that this was the likely issue. Too much potential to introduce other wonkiness into the system, and the flash can't be easily undone.)

Whatever the case, it sounds like a system-specific issue. My recommendation would be to just use what works, and get the HDR customized the way you like through the GPU HDMI port, as that seems to be recognized/working.
 
@SigilFey
Maybe I was not precise enough, but I was switching between the ports on the same RTX 4080 card! It has three DPs and one HDMI 2.1 port.

So, my earlier recollection (from several months ago?) of the HDR10+ GAMING mode not working in CP 2077 comes from when I was using DisplayPort 1.4 on the RTX 4080 card. By "not working" I mean that I was able to turn it on and off without seeing any change in presentation parameters or seeing the HDR10+ GAMING mode reported by the monitor.

Accidentally, just for fun, I switched (a few days ago) from one of the three DisplayPorts of the RTX 4080 card to the HDMI 2.1 port of the same card. And that move solved the issue with the non-working HDR10+ GAMING mode in CP 2077 2.13 for me - the presentation changed accordingly based on the HDR10+ GAMING switch, and the respective mode (HDR10+ GAMING or HDR) was properly reported by the monitor (Neo G8).

Are you able to test such configuration?

Do you want some extra tests to be performed on my side? Do you have any additional questions regarding my setup?
 

BR.
Krzysztof Gras.
I'm not able to test on my end, specifically. My monitor is an IPS model with V-Sync/FreeSync and does not support hardware HDR. It's a bit before that tech was common.

Regardless of any informational display/menu function bugs, what about the actual HDR functionality? If on, would it be clearly on in-game? What about in-game reporting -- did the game seem to detect that it was on and functional, or were the settings options greyed out?

As for other tests...I guess the best thing I can think of is, "Knock yourself out with the fiddling!" :D It definitely sounds as if it's very system specific to your rig, so a "personality quirk" of your system. It's possible that it simply wants the HDR to be pipelined through whatever chipset is managing it via the HDMI on the 4080. Frankly, for as finicky of an issue as this is likely to be, I'd recommend again taking the path of least resistance. If there's no reason you can't use the HDMI port, just use that and forget the display ports altogether.

If you do want to fiddle, what about other games? Does HDR work correctly if activated through the DisplayPort channels?

Another thing to look at -- the actual cable. For example, if using DisplayPort 1.4, the cable may only be rated for 1.2, creating an issue. It should say on the cable what it's actually rated for. (Here, though, we're already hitting the limit of my knowledge of how HDR works, as I'm not sure whether HDMI or DisplayPort would be better. I'd think that HDMI would be the way to go, in general, as both sound and video are being piped through a much more robust channel. As far as I know, any DisplayPort is going to allow for only static HDR, whereas HDMI can dynamically alter the settings on the fly. If the program or the drivers are calling for the HDR to be adjusted in-game, that could be what's causing it to fail on the DisplayPort.)
 
@SigilFey
Plain HDR works perfectly in my setup in several games. You know, the Samsung Odyssey Neo G8 is known to be quite HDR capable and is considered one of the best REAL HDR monitors on the market.

You may take a look at this beast here: https://www.rtings.com/monitor/reviews/samsung/odyssey-neo-g8-s32bg85

Yes, plain "old" ;) HDR functionality works properly in Cyberpunk 2077 v2.13 - the game properly detects the HDR environment and so on.

Regarding my system being very specific, I fully agree with you. Having an HDR10+ GAMING compatible monitor is rare, and having an Nvidia RTX 4080 card connected to the monitor via its HDMI port is rare as well.

My current hypothesis, for now, is that the HDR10+ GAMING functionality either didn't work properly from the start when it was implemented or became broken at some point. The game properly DETECTS whether HDR10+ GAMING is supported by the monitor via DisplayPort 1.4 and, if yes, the HDR10+ GAMING switch is presented. However, due to an unknown error, the INITIALISATION of the HDR10+ GAMING mode via DisplayPort 1.4 fails. At the same time, both DETECTION and INITIALISATION of the HDR10+ GAMING mode work properly via the HDMI 2.1 connection.
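The DETECTION vs INITIALISATION split above can be sketched in pseudocode. Everything here (class and method names, the HDMI-only condition) is my own hypothetical illustration of the hypothesis, not anything taken from the game or NVIDIA's API:

```python
# Hypothetical sketch: a capability check can succeed on both connectors
# while the actual mode switch only goes through on one of them.

class Hdr10PlusGamingPath:
    def __init__(self, connector, monitor_supports_hdr10_plus):
        self.connector = connector            # "DisplayPort 1.4" or "HDMI 2.1"
        self.monitor_supports = monitor_supports_hdr10_plus

    def detect(self):
        # Capability is read from the display's metadata, so this succeeds
        # on both connectors -> the in-game switch is shown either way.
        return self.monitor_supports

    def initialise(self):
        # Hypothesis: the actual mode switch only engages over HDMI
        # (NVIDIA's stated limitation), so DisplayPort silently stays
        # in plain HDR even with the switch set to ON.
        return self.detect() and self.connector == "HDMI 2.1"

for conn in ("DisplayPort 1.4", "HDMI 2.1"):
    path = Hdr10PlusGamingPath(conn, monitor_supports_hdr10_plus=True)
    print(conn, "-> switch shown:", path.detect(),
          "| mode engages:", path.initialise())
```

If something like this is what happens, the switch being visible on DisplayPort would indeed be a false positive: detection and initialisation are separate steps, and only the first one works there.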

I have found the following info from Samsung:

It suggests that it was known that the HDR10+ GAMING mode was not working in previous firmware versions of (among others) the Samsung Odyssey Neo G8 (G85NB), and that starting from firmware version 1009 it should work via DisplayPort as well. I have a later firmware installed (1011.3, as currently recommended by Samsung), so maybe it is a regression in the firmware code.

I fully agree that this is only a very rare corner case, with a known workaround (the HDMI 2.1 route) :)

However, CD Projekt and Samsung are proud of the HDR10+ Gaming functionality, so it would be great to make it easily usable :)

 
I fully agree that this is only a very rare corner case, with a known workaround (the HDMI 2.1 route) :)
Yup, firmware was where I was leaning. As it stands, you've already covered that base. HDMI should be superior performance, overall, though. I'd stick with that.

I'm not too well versed in how all the HDR methodologies work with various setups, so I'm not sure I can offer much else on the DisplayPort thing. Whatever the case, things are mostly built around HDMI for the present. Reading up a bit more on it: while DisplayPort tech is newer, it's also proprietary...and it seems to come with some interesting limitations compared to HDMI. Faster for higher resolutions, due to the increased bandwidth, but simultaneously not able to do audio and incapable of dynamic HDR. Odd choices.

HDMI for the win. For now, at least.
 
I have one suggestion for what might cause it: is the DP cable of good enough quality to support DP 1.4a? I have had issues with DP cables before, so I generally use the one that came with the monitor or buy a really expensive one -.- they should not be too long either, which is kinda lol. DP 1.4a should keep up with HDMI 2.1, but I don't have an HDR10+ supporting monitor so I can't test. I bought a huge monitor instead :D

Edit: Nvm, HDMI 2.1 has a higher data rate - could this be why it doesn't work on DP? The specs for HDR10+ only mention HDMI too, but I can't find any specs for HDR10+ GAMING.
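For what it's worth, here is a rough back-of-envelope comparison of the two links. These are my own approximate numbers based on the widely published payload rates, counting active pixels only and ignoring blanking and protocol overhead:

```python
# Rough link-budget sketch: uncompressed active-pixel data rate vs.
# approximate effective payload of DP 1.4 (HBR3) and HDMI 2.1 (FRL).

def active_pixel_gbps(width, height, refresh_hz, bits_per_component, components=3):
    """Uncompressed active-pixel data rate in Gbit/s (no blanking, no overhead)."""
    return width * height * refresh_hz * bits_per_component * components / 1e9

# Effective payload after line coding (approximate, commonly cited figures):
DP_1_4_GBPS = 32.4 * 8 / 10      # HBR3: 4 lanes x 8.1 Gbps, 8b/10b  -> 25.92
HDMI_2_1_GBPS = 48.0 * 16 / 18   # FRL:  4 lanes x 12 Gbps, 16b/18b -> ~42.67

for hz in (120, 240):
    need = active_pixel_gbps(3840, 2160, hz, 10)  # 4K, 10-bit HDR, RGB
    print(f"4K/{hz}Hz/10-bit needs ~{need:.1f} Gbps | "
          f"DP 1.4: {'fits' if need <= DP_1_4_GBPS else 'needs DSC'} | "
          f"HDMI 2.1: {'fits' if need <= HDMI_2_1_GBPS else 'needs DSC'}")
```

By this rough math, 4K/120Hz/10-bit (~29.9 Gbps) already exceeds what DP 1.4 can carry uncompressed, while HDMI 2.1 handles it with headroom; at 240Hz both links would need DSC. So the raw data rates differ, but bandwidth alone doesn't obviously explain an HDR10+ signaling difference.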
 
Quick update on the matter:

I have just changed the monitor from the older Samsung Odyssey Neo G8, where the HDR10+ Gaming feature (and entire HDR configuration) seems to be rather a late afterthought, to the Samsung Odyssey OLED G8 S32DG80, where HDR10+ Gaming mode is even separately configured aside from the plain HDR.

And I can confirm that, again, the HDR10+ GAMING switch in the Cyberpunk 2077 VIDEO options does NOTHING when set to ON while using DisplayPort 1.4 on the RTX 4080 card.

As before, when using the HDMI 2.1 port on the RTX 4080 card, the HDR10+ GAMING setting in Cyberpunk 2077 started to work properly, that is:
+ the monitor switched between the HDR10+ GAMING and HDR modes on the fly, as reported by the monitor.

It is in line with the NVIDIA statement here:


where an NVIDIA representative clearly states that the HDR10+ Gaming feature is only supported via the HDMI port.

So, it looks like it is just a long-standing bug to present the HDR10+ GAMING switch in Cyberpunk 2077 when DisplayPort 1.4 is used.
 
And I can confirm that, again, the HDR10+ GAMING switch in the Cyberpunk 2077 VIDEO options does NOTHING when set to ON while using DisplayPort 1.4 on the RTX 4080 card.
Cool. Can you send this in as an official Support Ticket if you haven't done so already? (You can add it as a follow-up if you already have an open ticket on the issue.)
 
Our technical team informed us that Nvidia currently only supports HDR10+ GAMING over the HDMI connector.

So I think it's a bandwidth issue.

Later: As far as I know, no. The HDR10+ standard is for DP 2.0+. You can use the HDMI output.
 