Nvidia and CD Projekt Red Must be losing their minds

I have a pretty powerful system: a Falcon Northwest Tiki with a Titan X, a Devil's Canyon i7-4790K, 16GB of RAM, and an SSD, paired with an Acer XB270HU 144Hz G-Sync IPS 1440p monitor. I have never seen a game where the more I raise the settings, the worse it looks, including when using Nvidia DSR to downsample from 4K (though I prefer 1440p for the FPS boost).

The latest Nvidia profile just downloaded. It wants me to lower ambient occlusion to SSAO, turn off anti-aliasing, lower the detail level to High, reduce the foliage visibility range to Medium, reduce the grass density to Medium, lower the number of background characters to Medium, turn off Nvidia HairWorks, keep the resolution at 1440p (instead of using DSR 4K), lower the shadow quality to Medium, lower the texture quality to High, and lower the water quality to High. With those settings, my FPS is a steady 80+. BUT WHY WOULD THEY RECOMMEND THAT??? It's almost like they're saying: return your Titan X for a 770 or something. Who cares if the game looks much worse, at least you have 80+ FPS, which isn't really necessary. Are they brain dead??? Are they even talking to CD Projekt Red?

Then there is the other side of the spectrum. If I turn everything to Ultra at 1440p, including HairWorks, I still get good FPS, between 40 and 60, but the sharpening makes trees/foliage look weird, and things like pebbles, straw rooftops, etc. look grainy whenever you move while facing them. That forces me to turn sharpening down from High to Low and lower the texture detail from Ultra to High. That mostly gets rid of those symptoms (not 100%, but good enough) without making the game world look horrible.

I've never seen a game where, with such a high-end card, both the Nvidia GeForce optimal settings and the in-game settings force you to lower so many things just to have the game look somewhat normal. I've seen things like this with SLI when games get released, but not for a single card. Hopefully they will get their act together, start talking to each other, and come out with settings that make sense and let you leverage the great PC you may have.


That doesn't sound right. I'm running a 780 and my recommended settings are HBAO+ with ultra/high everything.

It doesn't sound right to me either, but that is exactly what GeForce Experience is recommending. I couldn't believe it when I saw SSAO instead of HBAO+. You think maybe it's a bug in their recommended settings for just the Titan X? lol. Anyone else have a Titan X to validate this?
 
Never use presets, profiles or Nvidia GeForce Experience.

This.

Always made a mess of my games. Assassin's Creed: Unity came out and GeForce said my rig wasn't capable of even running Unity. Note the rig in my signature line...

Running Unity on ultra everything never gave me a lick of trouble. GeForce fail.
 
Never use presets, profiles or Nvidia GeForce Experience.

Why? You would think that experts are testing in the background to allow for optimal utilization based on your card, CPU, drive, and memory. Why even have them recommend anything if there weren't a logical purpose?
 
This.
The optimal settings they suggest are total bullcrap for me, and not only for The Witcher.

...but I'm also trying to figure out why having sharpening and texture detail on the highest setting would make the game look so much worse and grainy on certain objects like stones, foliage, and straw rooftops.
 
Maybe this will help someone who finds this thread later, but my GeForce Experience is recommending nearly everything off and all the sliders at Low. Not once have I actually used the recommendations from GeForce Experience in any game; I just test things out myself. In every game I can set the graphics quality much higher than what it suggests.

I have a GTX 750 Ti, and I understand it's a pile of garbage compared to what is out there now, but I can put everything in Graphics and Post-processing on Ultra or High (except HairWorks) and get an easy 35 FPS at worst and a very stable 45 FPS average. Understandably, in many other games 45 FPS could be said to look terrible, but not in this game. It still feels like 60 FPS even though it isn't, whereas many other games running at 45 FPS would have a slight strobe effect.

Realizing how well optimized this game is, I just capped it at 30 FPS and it still looks just as good as 45 FPS, with no strobe effect. Why not save some processing power? =)
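The reason a cap saves power is that the render loop just sleeps out whatever is left of each frame's ~33 ms budget instead of immediately starting another frame nobody will see. A rough Python sketch of the pacing idea (obviously not the game's actual limiter, just the concept):

```python
import time

TARGET_FPS = 30
FRAME_TIME = 1.0 / TARGET_FPS  # ~33.3 ms budget per frame

def render_frame():
    """Stand-in for the game's real render work."""
    time.sleep(0.010)  # pretend a frame takes 10 ms to draw

next_deadline = time.perf_counter()
for _ in range(90):  # ~3 seconds at 30 FPS
    render_frame()
    next_deadline += FRAME_TIME
    sleep_for = next_deadline - time.perf_counter()
    if sleep_for > 0:
        time.sleep(sleep_for)  # idle out the leftover budget instead of drawing extra frames
    else:
        next_deadline = time.perf_counter()  # running behind: reset so we don't spiral
```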

Well, anyway, this was slightly off-topic, but there may be others who have had this experience as well.
 
GeForce 970 4GB

Plays at 50-60 FPS on Ultra at 1920x1080, VSync on, HBAO+, Sharpness Low (because it looks better than High).
HairWorks off.
 
Why? You would think that experts are testing in the background to allow for optimal utilization based on your card, CPU, drive, and memory. Why even have them recommend anything if there weren't a logical purpose?

For two reasons.

1) Sadly, those ready-made settings are wrong...every time in every game. I don't know why.
2) We are PC gamers. We choose our own settings. It is a part of our experience.

---------- Updated at 06:59 PM ----------

...but I'm also trying to figure out why having sharpening and texture detail on the highest setting would make the game look so much worse and grainy on certain objects like stones, foliage, and straw rooftops.

Well, that is what sharpening does: it amplifies fine, high-frequency detail, and texture grain is high-frequency detail too.
You have to set it to Low, or off.

Higher on a specific setting doesn't always mean "better".
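If it helps to see the mechanism: most sharpening filters are some variant of an unsharp mask (I don't know exactly which filter CDPR uses), which adds back a scaled copy of the image's high-frequency detail. Since noise lives in those same high frequencies, a high amount multiplies the grain right along with the edges. A toy Python sketch:

```python
import numpy as np

def unsharp_mask(img: np.ndarray, amount: float) -> np.ndarray:
    """Sharpen by adding back the difference between the image and a blur."""
    # cheap 3x3 box blur via shifted averages (edges clamped)
    pad = np.pad(img, 1, mode="edge")
    blur = sum(pad[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)) / 9.0
    detail = img - blur                       # high-frequency component
    return np.clip(img + amount * detail, 0.0, 1.0)

# A flat grey patch with faint noise (think gravel or thatch at a distance):
# low sharpening barely changes it, high sharpening turns the noise into grain.
rng = np.random.default_rng(0)
patch = 0.5 + 0.02 * rng.standard_normal((8, 8))
low, high = unsharp_mask(patch, 0.3), unsharp_mask(patch, 2.0)
print(f"noise std: raw {patch.std():.3f}, low {low.std():.3f}, high {high.std():.3f}")
```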
 
...but I'm also trying to figure out why having sharpening and texture detail on the highest setting would make the game look so much worse and grainy on certain objects like stones, foliage, and straw rooftops.

Look at the link Avidya.454 posted. It's the MipMapBias issue. I lowered mine manually but kept my settings at Ultra.
 
Keithianw, try using this fix for the texture flickering problem:
http://www.geforce.com/whats-new/gu...-tweaking-guide#mipmap-bias-config-file-tweak

Basically you can turn Ultra textures back on while getting rid of the annoying flickering.

Thank you, I appreciate that. I read that article; some aspects were a bit confusing to me. Basically it's saying that the more you lower the MipMapBias (towards -1 to -2), the better distant foliage/trees will look, but at the expense of shimmering (the graininess I described). The higher the value, the less shimmering, but then things don't look as good in the distance and nearby textures won't be as sharp. So they kept tweaking that number between -0.4 and -1 to find an ideal balance for 1080p versus 4K. Unfortunately, they didn't give a recommendation for 1440p :(... so I guess I will just play with numbers in that range until I find a decent balance.

What still confuses me is why there aren't two separate settings for distance quality and close-up quality, so that one isn't a sacrifice for the other. I just want everything to look good lol.
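For anyone else on 1440p: the tweak itself is just an edit to user.settings in Documents\The Witcher 3 (back the file up first). Going by the guide's anchor the key is called MipMapBias under [Rendering], though I'd double-check the exact name in your own file. Something like this, with -0.7 as a starting point splitting the difference between their 1080p and 4K suggestions:

```
[Rendering]
MipMapBias=-0.7
```

Zero is the default; more negative sharpens the distance but brings back the shimmer, so nudge it within that -0.4 to -1 range.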

---------- Updated at 08:00 PM ----------

GeForce 970 4GB

Plays 50-60 FPS on Ultra 1920x1080, VSync ON and HBAO+, Sharpness Low (Because it's better looking than High).
Hairworks OFF.

So you are not getting the shimmering or graininess when moving while looking at pebbles or straw rooftops with Texture Detail on Ultra? Or are you just not noticing it because it doesn't bother you? That is the whole point of the Nvidia article about the MipMapBias. It's odd that you wouldn't be experiencing it, since for me at least lowering Sharpness is unrelated to that symptom; it's tied more to Texture Detail.
 
This.
Optimal settings they suggest is total bullcrap for me and not only for Witcher.

Day 1, 90% of the people having really bad issues were using Nvidia's dumb presets; whoever came up with those should be fired.

If you have a 970 or 980, try my magic config; over 300 people with 970s said it fixed most of their problems.
Link in sig.
 
Day 1, 90% of the people having really bad issues were using Nvidia's dumb presets; whoever came up with those should be fired.

To be honest, I'm wondering if I should just delete GeForce Experience. I thought it would be such a convenient tool, but I find it distracting at best. I do like that it warns me about new drivers :).
 