Witcher 3 Graphics

It is not just "preference". In some settings, an overly vibrant world doesn't connect you to the world as it should. It depends primarily on artistic style. If, for example, you go for a realistic style, going too colorful means the two things don't mesh well; a colorful look is more appropriate in a fantasy setting. So the two things are associated: colorful visuals often go together with a cartoonish style, and given that almost all games nowadays are colorful, that means they also look cartoonish.

That's perfectly fine; the problem arises, imo, when ALL games (again, no matter the context) share that cartoonish style. This is definitely tied to demographics and the console market as well, but as I said before, not all gamers are children, and I would sometimes like games with a more adult setting to have a more adult style.

Yeah, I guessed this was a casualty of the downgrade. We got:

Nvidia demo
Trailer with the wild hunt.
Gameplay demo.
Downgrade
Downgrade
Cartoon
Downgrade
Ponies

Once you have really basic geometry per object, realism gets harder to pull off, so it went stylized, then cartoon.

In my country, at least, The Witcher 3 is rated 18+, so in theory this game is not intended for children anyway. Maybe if you make little pony games, some platform-vendor feedback is unavoidable.
 
Once you have really basic geometry per object, realism gets harder to pull off, so it went stylized, then cartoon.

Yes, sure; stylized geometry forces you in one direction only. There's simply no way to do a proper realistic style with poor geometry.
 
@kenjigreat You really shouldn't say you would pay for a patch; that's just encouraging bad practices, and I think it's even worse than agreeing with the paid-mods fiasco Steam just went through. These things should be free since we've already supported the company; in CDPR's own words, they owe us!

I think you're highly mistaken. When I say I'd pay $10 to see the one on top, it would not be a 'patch' that would bring back the 'look' of the picture on top, where Geralt, Yennefer, and Triss are on the boat in the Sword of Destiny trailer.

It would literally be a whole reskin pack, since the character models are vastly different. People have paid more for reskins.
 
The EGX Wild Hunt trailer was labeled as in-game footage, or gameplay footage.
 
Wow, and people are surprised at the way the industry has evolved over the past few years...

There are people who spend thousands of dollars on their PC setup and pirate video games. Fun fact. I'd personally rather put my money where my mouth is.

---------- Updated at 11:51 AM ----------

Lol what do you want them to say ? They got flak for the downgrade, now leave it be. You can't really expect them to say "OK GUYS WE DONE DID IT. WE DONE DOWNGRADED IT FROM THE 35 MINUTE DEMO"

:p

I expect an official reply when this thread hits 1000 pages; if they don't, then MGS V will be my GOTY.

They've done so much PR backpedaling that I honestly don't care whether they think or admit there was a 'downgrade'.

If they 'fix' the game to come close to the 35-minute gameplay demo and release the REDkit, we'd all be much happier.

Actions speak louder than words.
 
Okay, so my earlier theory about the cutscene lighting was very incorrect.

Another solution seems to be leaving it off in Novigrad (and any other place I haven't been to where it's largely worse) and leaving it on everywhere else. I just rode into Velen and didn't notice any especially jarring transition.

Does anyone know better than me whether this doesn't work?
 
The EGX Wild Hunt trailer was labeled as in-game footage, or gameplay footage.

Honestly, I think I'm going to treat the words 'Alpha Gameplay' and 'Gameplay' in trailers as 'Target Render'.

After MGS V all bets are off.

Although I hope Naughty Dog does a good job.

Note: I don't mean Witcher 3 is a horrible-looking game or technically horrendous, though the bugs and LOD issues do affect the visuals tremendously. It just doesn't look like the trailers; it's especially far from the SoD trailer.
 
As promised, I've started doing some digging and found something you all might be interested in, if you didn't already know about it. Browse to bin > config and open performance.xml. In there you will find what appear, at face value, to be rather suspect performance-nerfing variables: if you look at entries such as the one for a GTX 780 Ti and compare it to the GTX 980's, it looks like some deliberate hobbling has been done to older cards to make the new Nvidia cards look better. This theory would also tie in quite neatly with many Nvidia owners of cards like the 780 complaining of bad performance in Witcher 3.

Of course, this is just a face-value observation; it may be something else entirely. There's no conspiracy theory being created here, just initial findings from a bit of exploratory work.
 

Please don't make this into a performance thread. As an SLI 980 user, I can say Nvidia is a POS: driver 353.06, which is the Kepler fix, messes with SLI scaling on Maxwell.

Honestly, that's all I can say.
 
Lmao, you're hilarious; those are the preset recommended settings for each GPU...

And I love how you act like you're the first one to look at the game's config files.
 

Don't be terrible; someone's interested. I just went and tried some more INI settings, related to the plumbing and not the graphics... and... mind blown. Are they making games for 486s?

This is my post-install config anyway:
- PhysX to CPU
- force triple buffering and maximum pre-rendered frames

\The Witcher 3 Wild Hunt\bin\config\base\resources.ini
FileQueueSizeGame = 250
FileQueueSizeLoading = 700
MaxRequests = 45 [50 is max, and with 50 or over you might have crashes]

\The Witcher 3 Wild Hunt\bin\config\base\Gc.ini
; Upper limit for amount of memory used by the objects that will automatically trigger the GC
ObjectMemoryTrigger = 512

I guess it's fun to do this stuff yourself, but at the same time, no!!

I hadn't done the stuff in resources.ini before; now I'm smooth at 40-50 fps at 2560x1600, everything maxed, no HairWorks, on 1.04 with a 780 Ti. Someone really needs to get this to benchmarking sites; the difference is very significant, and it has nothing to do with GPU performance.
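For anyone who'd rather script these tweaks than hand-edit the files, here's a minimal sketch (a hypothetical helper, not from the thread) that rewrites `Key = Value` lines in an ini-style file. The install path and the assumption that these files use that simple line format are mine; back up the originals first.

```python
# Hypothetical helper: rewrite "Key = Value" lines in a Witcher 3 ini file.
# Assumes the simple line format shown in the post above.
import re
from pathlib import Path

def patch_ini(path, overrides):
    """Replace matching 'Key = Value' lines in an ini-style file, in place.

    Keys not already present in the file are left alone (not appended).
    """
    text = Path(path).read_text()
    for key, value in overrides.items():
        # (?m) makes ^/$ match per line; only existing keys are rewritten.
        text = re.sub(rf"(?m)^{re.escape(key)}\s*=.*$", f"{key} = {value}", text)
    Path(path).write_text(text)

# Assumed install location; adjust to your own.
GAME_DIR = Path(r"C:\GOG Games\The Witcher 3 Wild Hunt")
# patch_ini(GAME_DIR / "bin/config/base/resources.ini",
#           {"FileQueueSizeGame": 250, "FileQueueSizeLoading": 700,
#            "MaxRequests": 45})  # 50 is max; 50+ risked crashes for the poster
```

Leaving unknown keys untouched (rather than appending them) keeps the script from inserting settings the engine might not expect.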
 
Currently working on deserializing the .env files. Thank god Sir_Kane already decompiled the game's source code! ;) Meaning, soon we will be able to directly manipulate the values of the variables stored in the .env files!

Are you aware of the SSAO problems on grass and in dark places, KNG? Is it something you intend to fix, or is that simply out of your reach without the proper toolset? I'm asking because I really like the new color tone, but right now I only have the choice between no AO (which looks like something is missing) and HBAO (which tanks my performance way too much in some scenes, thanks Nvidia!).

Would be really, really great if you could get SSAO up to snuff.
 
Another pathetic and senseless Nvidia hate train?
CDPR are responsible for the implementation of Nvidia technology in their game.
My performance does not tank significantly with HBAO+ compared to SSAO, and every single other game I've played has been consistent with that.
 

Nvidia did screw up the 353.06 drivers for Maxwell SLI. Pointless? Nope.
 
HBAO (which tanks my performance way too much in some scenes, thanks Nvidia!)

Are you positive it's SSAO? HBAO on my system was one of those constant 2-4 fps hits, like FXAA. Another one of those subtly-worth-it effects... yeah, I doubt it's going to cause fps spikes. Have you set PhysX to your CPU and done all the other necessary post-install steps? See my previous post.
 
Nvidia did screw up the 353.06 drivers for Maxwell SLI. Pointless? Nope.
Oh, did they?
I run two 980s in SLI; my performance on the latest driver is IDENTICAL to the previous one, and I am not getting any issues.

But you're telling me that they screwed it up?

Lol ok.

---------- Updated at 01:35 PM ----------

Know what I hate the most?
People spreading misinformation out of their own ignorance and incompetence.

The performance.xml file assigns the default graphics preset for each of the GPUs listed, which gets loaded automatically for people who don't want to touch any settings. It doesn't have "performance nerfing variables which if you look at entries such as for a GTX780Ti and compare it to the GTX980".

<!--Geforce GTX 780 Ti-->
<device vid="0x10DE" did="0x100a" preset="2" />

<!--GeForce GTX 980-->
<device vid="0x10DE" did="0x13c0" preset="3" />

The 2 and 3 indicate the default graphics preset made for each GPU; that's what THEY FOUND to be optimal. It's the preset the game loads automatically when it detects your GPU in the menu. You can just change it!

It doesn't gimp anything, and it doesn't prevent anything on the other GPUs. Some other GPUs have a line fpslimit="30"; that's the fps limit for cutscenes, and it doesn't do anything harmful. I've tried playing with and without it, and it makes absolutely zero difference.
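To illustrate, the two entries quoted above really are just a lookup from PCI vendor/device ID to a preset index (0x10DE is Nvidia's vendor ID). A quick sketch parsing that exact snippet; note the `<devices>` root element here is my own wrapper, and the real file's surrounding layout may differ:

```python
# Sketch: performance.xml maps (vendor ID, device ID) -> default preset index.
import xml.etree.ElementTree as ET

SNIPPET = """
<devices>
    <!--Geforce GTX 780 Ti-->
    <device vid="0x10DE" did="0x100a" preset="2" />
    <!--GeForce GTX 980-->
    <device vid="0x10DE" did="0x13c0" preset="3" />
</devices>
"""

# Build the lookup table; XML comments are skipped by the parser.
presets = {
    (dev.get("vid"), dev.get("did")): int(dev.get("preset"))
    for dev in ET.fromstring(SNIPPET).iter("device")
}

print(presets[("0x10DE", "0x100a")])  # 780 Ti -> preset 2
print(presets[("0x10DE", "0x13c0")])  # 980    -> preset 3
```

In other words, the file only picks which menu preset is pre-selected for a detected GPU; changing the preset in the menu overrides it entirely.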
 
My performance does not tank significantly with HBAO+ compared to SSAO and every single other game i've played has been consistent with that.
"performance with my 980 sli rig is fine, hence everything is fine, its your imagination!"

I'm on AMD, not Nvidia, and HBAO+ tanks my framerates in any game it is implemented in. At least AMD cards can run HBAO+ at all, but it's not as smooth as it is on Nvidia cards.


Another pathetic and senseless Nvidia hate train?

(Not even) one sentence = a hate train? Are you still so touched that I disliked your (choice of) preset, or why else are you getting so offensive? If you have nothing constructive to add to the problem I've mentioned, don't post a response.

Are you positive it's SSAO? HBAO on my system was one of those constant 2-4 fps hits, like FXAA.

100% positive that it's HBAO+ causing these (performance) issues for me. Not only does it easily cost twice as much performance as SSAO, it also mucks up my frametimes considerably.
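Frametime complaints like this are easy to quantify if you log per-frame times (e.g. with a FRAPS-style frametimes dump). A small illustrative sketch with made-up numbers, showing how average fps can hide the spikes that a high-percentile frame time exposes:

```python
# Sketch: average fps vs. 99th-percentile frame time from a frametime log.
import statistics

def frametime_report(frametimes_ms):
    """Return (average fps, 99th-percentile frame time in ms)."""
    avg_fps = round(1000.0 / statistics.mean(frametimes_ms), 1)
    # Simple nearest-rank percentile over the sorted samples.
    p99 = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99) - 1]
    return avg_fps, p99

# Made-up data: steady 16.7 ms frames vs. the same run with a few 50 ms hitches.
smooth = [16.7] * 100
spiky = [16.7] * 97 + [50.0] * 3

print(frametime_report(smooth))  # steady run
print(frametime_report(spiky))   # similar average fps, but p99 jumps to 50 ms
```

The spiky run loses only a few fps on average, yet its 99th-percentile frame time triples; that gap is exactly the "mucked up frametimes" feeling.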
 
"performance with my 980 sli rig is fine, hence everything is fine, its your imagination!"

Im on AMD, not Nvidia and HBAO+ tanks my framerates in any game it is implemented in. At least AMD cards can run HBAO+ at all, but its not as smooth as it is on Nvidia cards.




(not even) 1 sentence = hate train? Are you still so touched that I disliked your (choice of) preset or why do you get so offensive? If you have nothing constructive to add to the problem I've mentioned, don't post a response.



100% positive that its HBAO+ causing these (performance) issues for me. Not only does it easily cost twice as much performance as SSAO does, it also mucks up my frametimes considerably.
You realize that your first criticism of me was EXACTLY what you did.
I didn't claim that everyone else had the same results as me; your mileage may vary. Is that not understandable?
You hadn't even said you were on AMD; blame yourself for not providing enough information to everyone else here when making performance complaints.

In Watch Dogs, AMD's performance was nearly identical to Nvidia's, even when using the highest HBAO+ setting.

Want to point fingers? Point them at the developers.

Sarcastic comments such as "Thanks Nvidia!" following some sort of jab at GameWorks very often lead to senseless hate trains. I've been seeing people saying they'll pirate, or refuse to purchase, every game involved with GameWorks, and we're not talking about one or two guys from time to time; it's a recurrent theme.
 
If you think GameWorks is a good thing for PC gamers in the long run, you are completely and sadly mistaken, but you are entitled to your opinion, of course. Then again, maybe all that Nvidia hate as of late has some justification to it?

That being said, I really don't care to take this further into a circle-jerk discussion that in the end leads exactly nowhere except to another forum ban, so live long and prosper.

Oh, and should you call me pathetic and/or senseless again for having different opinions about hardware manufacturers, I will happily report you to the mods. :sad:
 