So I thought this might be interesting. I'm still testing the Omen X 35 monitor, and it's actually quite good: great response times, no ghosting, and, once adjusted, very good colors. Two things, however. First, the monitor can display a range of dark tones that even my old Dell U2415 could not, so what used to be a black blob is now a gradient, but in low-quality content those dark tones show up as compression artifacts. If I "crush" the blacks a bit with gamma and brightness, it looks like my old IPS and most of the artifacts are gone.

The other thing is that some games show gradient banding, for instance in fog, smoke and so on. Examples include using a flashlight in dark games like Soma or The Vanishing of Ethan Carter, and the (IMO) ugly vignetting effect in The Witcher 3. What I did not know is that some games actually implement dithering to reduce banding, creating much smoother gradients, while others (I suppose including the aforementioned games) do not, and have more or less noticeable banding (more so on a high-contrast panel). Dithering can also be implemented at the driver level, and AMD does so, which makes banding essentially disappear in games on Radeon/Vega. Nvidia, for some obscure reason, does NOT support driver-level dithering in Windows, so the banding is noticeable on my GTX 970... but, and here comes the fun part, Nvidia DOES support dithering in Linux! Games with banding in Windows have no banding in Linux; they look fantastic!
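For anyone curious why dithering kills banding: when a smooth gradient is forced into a limited number of output levels, nearby shades collapse into the same step and you see bands. Dithering adds a tiny bit of noise before quantizing, so neighboring pixels (or successive frames) flip between adjacent steps and the eye averages them into the in-between shade. Here's a minimal Python sketch of the idea; the level count and the random-noise approach are just illustrative assumptions, not how any particular game or driver actually does it:

```python
import random

def quantize(v, levels=32):
    # Plain quantization: snap a 0.0-1.0 value to one of `levels` steps.
    # This is what produces visible banding in smooth gradients.
    step = 1.0 / (levels - 1)
    return round(v / step) * step

def quantize_dithered(v, levels=32):
    # Add up to +/- half a step of random noise before quantizing.
    # A pixel near a step boundary now lands on either adjacent level,
    # and the average over many pixels/frames recovers the true shade.
    step = 1.0 / (levels - 1)
    noisy = min(max(v + (random.random() - 0.5) * step, 0.0), 1.0)
    return round(noisy / step) * step

# A smooth 256-shade gradient (think fog) collapses to just 32 bands:
gradient = [i / 255 for i in range(256)]
banded = sorted(set(quantize(v) for v in gradient))
print(len(banded))  # 32 distinct output levels

# With dithering, averaging many samples tracks the original value:
random.seed(0)
avg = sum(quantize_dithered(0.5) for _ in range(500)) / 500
print(abs(avg - 0.5) < 0.01)  # True
```

Real implementations typically use an ordered (Bayer) or error-diffusion pattern rather than pure random noise, but the principle is the same: trade invisible high-frequency noise for the very visible low-frequency bands.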
There you have it. These monitors (and even Nvidia hardware, for that matter) are better supported on Linux, for some strange reason.