Performance issues with Witcher 3 ver 4.0

I just tried the new TW3 update (4.0, GOG version) in DX12 mode (on Linux, Wine 8.0-rc1 + vkd3d-proton). Performance is much worse than in the previous version: the drop is huge, with no visible graphics improvement to show for it.

Is the experience on Windows similarly bad, with such a huge performance drop? I haven't enabled any ray tracing.

GPU: AMD RX 6800 XT.

The previous version easily hit 180 fps (capped at that) at 2560x1440. The new version often drops to 40 fps, for example in the Kaer Morhen tutorial scene.
 
Yes, the performance difference is very big. I have a 4090 and a 10600K, and play at 3840x1600. Maxing out all the visual settings, I get around 50-60 fps in Toussaint. Fortunately I can turn on DLSS 3 and frame generation, which helps tremendously; with that I get high framerates and smooth gameplay again. But yeah, this is a very demanding version. Maybe some optimisation is possible with patching?

Btw, I strongly disagree that the graphics don't look better. IMO they look A LOT better: better vegetation, better lighting, ray tracing, better reflections, better textures, etc.
 
There is definitely a performance issue, even with all the settings turned back to pre-next-gen levels. I'm also noticing that the GPU is maxing out no matter the load. Some flag forces the GPU into full power and doesn't let it scale its frequency down based on load. Thermals therefore keep climbing until you hit the limit and the GPU forces a mandatory downclock, at which point performance becomes unstable as the clock bounces up and down. You see the same issue in Witcher 2 with the Vulkan conversion mod, which does the same thing.

FSR2 doesn't seem to be working properly either. Yes, it visually changes things, but performance-wise it should be changing a lot and it isn't. You get lower visual quality for no real performance gain, unlike other FSR implementations, where the performance gain is substantial.

Dynamic scaling? It's not really scaling right now. What are the thresholds and conditions for it to kick in? I was able to get some performance back by scaling the game manually, but it doesn't look as good as it did pre-next-gen when doing this. Not sure why that is yet; I'll test more later once my game stops crashing.

Just tested the DX11 version with the added AA and Ultra+. It runs really well and looks good.

Here is DX12 vs DX11. Yes, there is a visual bug in DX12 that I posted earlier, but notice the frequency and thermals in the bottom left? DX12 is running much higher, while DX11 sits comfortably with plenty of headroom left for mods or higher settings, or, in actual gameplay terms, for handling combat with spell effects.
 

Attachments

  • witcher_3_bug_2_dx11.png (4.7 MB)
  • witcher_3_bug_2_dx12.png (4.4 MB)
What I notice is that the GPU isn't fully utilized for me, so obviously it performs worse than before, when I was getting 100% GPU utilization. So the issue isn't simply heavier graphics, but some kind of bad optimization.
 
Yeah, the FSR mojo is not working, methinks. It should make a noticeable difference, but doesn't. Especially the micro-stutters etc. remain the same.
 
Even on an RTX 4090 at full settings on a 2K screen, the game still stutters and freezes in some places, which is weird. A performance fix patch is truly needed; many of my friends complain about it, and so do people on Reddit, on these forums, everywhere.
 
Yeah, the FSR mojo is not working, methinks. It should make a noticeable difference, but doesn't. Especially the micro-stutters etc. remain the same.
Micro-stutters are an old trait of vanilla. To fix: turn off in-game vsync and force it on via your GPU driver instead. Use something like RivaTuner/Afterburner to cap your fps at 60. Then go into your Documents folder, open the settings file, and change the foliage distance to 1. You can also lower some of the other grass/foliage options to medium in the in-game settings, but if you like the high density and the newer settings, just go into the INI file and change only the foliage distance.
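For reference, a minimal sketch of the INI edit described above. The file lives in your Documents folder; the section and key names below are from memory and may differ in your copy, so match them against whatever foliage-distance key your own user.settings actually contains:

```ini
; Documents\The Witcher 3\user.settings
; Section/key names are illustrative; verify against your own file.
[Foliage]
FoliageDistanceScale=1
```

The in-game slider writes to this same file, so back it up before editing.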
 
The vkd3d-proton developers commented that it's not really the wrapper, but rather:

it's cpu-bound because they just derped all their d3d12 code into one thread

So maybe there is a chance they'll fix it, but don't hold your breath. I think CDPR outsourced this version.
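That diagnosis matches the utilization numbers reported upthread: if every D3D12 command list is recorded and submitted from one thread, frame time is capped by that thread no matter how fast the GPU is. A toy model with made-up numbers (not measurements from the game):

```python
def frame_time_ms(cpu_record_ms, gpu_render_ms, record_threads):
    """Frame time when CPU-side command recording is spread across
    `record_threads` threads: the frame can't finish faster than the
    slower of CPU submission and GPU rendering."""
    return max(cpu_record_ms / record_threads, gpu_render_ms)

# Hypothetical frame: 20 ms of total recording work, 6 ms of GPU time.
print(frame_time_ms(20, 6, 1))  # one thread: 20 ms/frame (~50 fps), GPU mostly idle
print(frame_time_ms(20, 6, 4))  # four threads: 6 ms/frame, now GPU-bound
```

On this model a faster GPU changes nothing in the single-threaded case, which is exactly the "beefy GPU stuck at 50% usage" symptom people are reporting.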
 
Yeah, and the constant crashes in DX12 make this game pretty much unplayable at the moment. I have faith that CDPR will come through, though, and the overall improvement with the next-gen patch is really big.

(Windows 11, 4090, 10600K)
 
Hello everyone! I too am having horrible performance with DX12 on a beefy machine (specs below) with update 4.0, so I am coming over from Reddit with a fully detailed post covering my tests, theories, and full breakdown. Here we go.

TL;DR: The game's DX12 implementation is 100% broken, causing awful performance and scaling along with stutters and graphical bugs. The game also seems to utilize only 2 CPU cores, which starves every GPU (with the exception of 40-series cards with DLSS 3.0 and some 4K setups) no matter what you do. The DX11 implementation works much better, but you lose DLSS/FSR/RTX, the selling points of the update. Lastly, Alex from Digital Foundry is going to do a video and has already tweeted that the patch definitely has issues, with the CPU being a problem.
I also linked some tweets/articles/videos at the bottom confirming the issues and proving it's not just ranting from a handful of people.

My original Reddit post is here: https://www.reddit.com/r/Witcher3/comments/zllc1l
To begin, I believe I have narrowed down the two biggest issues.

First: the game seems to use only two of my 5900X's 12 CPU cores (Cyberpunk also originally had this issue, to a lesser degree). I've never seen the game's CPU usage above 6%, and I commonly see it at 1-2%. The underutilized CPU means the game is starving our GPUs, causing them to be underutilized too. For GPU usage I've seen as low as 45% and as high as 85%, but never anything close to 100% unless playing on DX11. In other words, the game's poor CPU optimization keeps our beefy GPUs from being fully used, killing performance while we still have hardware headroom left.
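As a sanity check on those utilization figures (my arithmetic, not the poster's): a 5900X exposes 24 logical threads, so one or two fully saturated threads would read as single-digit overall CPU usage, consistent with the 1-6% reported:

```python
def overall_cpu_pct(saturated_threads, logical_cpus):
    """Overall CPU usage reported when `saturated_threads` threads sit
    at 100% and the remaining logical CPUs are idle."""
    return round(100 * saturated_threads / logical_cpus, 1)

# Ryzen 9 5900X: 12 cores, 24 logical threads (SMT)
print(overall_cpu_pct(1, 24))  # 4.2 -> one saturated thread
print(overall_cpu_pct(2, 24))  # 8.3 -> two saturated threads
```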

This is not just a me-issue; several reports now claim the same thing, and I've done the testing and posted the screenshot in the Reddit post linked above, so I believe this is what's likely happening.

Secondly: the game's DX12 implementation is straight-up borked on PC, and I'm even reading reports of the same thing on next-gen Xbox, which also uses DX12. This is causing horrible scaling of settings, lower-than-expected performance, stutters, and possibly the graphical glitches/crashing.

How did I come to this theoretical conclusion?

To start, my rig: Asus Crosshair VIII X570 (latest BIOS, 4201) with a Ryzen 9 5900X slightly OC'd to 4.95 GHz, a 360mm AIO cooler, an Asus ROG Strix RTX 3080 with a slight factory OC, 32 GB of 3200 MHz RAM, and the game installed to a 1 TB Samsung 970 Evo NVMe; also the latest Nvidia drivers (527.56) and the latest version of Windows 11 (Windows 11 Home 22H2, installed 10/19/2022, OS build 22621.963, Windows Feature Experience Pack 1000.22638.1000.0). Note that this Windows 11 install is less than 2 months old and was a full fresh install, not an upgrade from Windows 10, and the game is freshly installed and never modded, due to my recent wipe when moving to Windows 11. Hardware-accelerated GPU scheduling is on.

Now some background: I run all my games more than fine (including reasonable RTX) and could not complain, but for the sake of keeping it short we'll stick to OG 1.32 Witcher 3 and Cyberpunk 2077. This same rig did 1440p maxed at 140+ fps in Witcher 3, HairWorks and everything.
Cyberpunk 2077: all settings maxed minus the two psycho options, all RTX on, 1440p DLSS Balanced, locked 60 from day one through the most recent patch.

OK, next-gen Witcher 3. To break it down: with all settings at their lowest possible on DX12 at 1440p with DLSS Performance, I get 80-100ish fps or less. Going back to my background info: this same machine could easily do 1440p maxed at 140+ fps on 1.32, but now, at all lowest settings, 1440p DLSS Performance (or TAAU with resolution scaling), I get 80-100 fps...

I CANNOT make this up. I am somehow getting A LOT less fps at all-low settings on DX12 than on maxed-out OG 1.32 with the same rig...
If I turn on all RTX with all other settings low and DLSS Ultra Performance, I get 35-50 fps!
All lowest settings, 1440p DLSS Ultra Performance with only RTGI: around 42-55 fps!
Game maxed out at 1440p DLSS Quality: 32-40 fps! Whatever I try, I cannot reach 60 fps with anything RTX enabled.
Try to get 60 fps with no RTX at Ultra/Ultra+? Can't do it. Basically it's all medium-to-low now, no RTX, DLSS Performance, to net 60-75 fps with sub-60 fps 1% lows and insane input delay. Anything else I try gets me under 60, as low as 32 when running maxed, on my MONSTER rig (only 1.78% of 30+ million Steam users have an RTX 3080), the same rig that runs Cyberpunk 2077 Ultra with RTX at 1440p DLSS Balanced at a locked 60 fps. Make it make sense, please!

Now, we can prove it's bad DX12 by switching to DX11 and maxing it out. Obviously there's no DLSS or RTX, but fully maxed (so the new Ultra+, TAAU, no scaling, native 1440p) I am now getting 100-110ish fps. Still not as much as maxed 1.32, but remember: Ultra+ settings, new screen-space reflections, TAAU, UHD textures, etc.

This FPS on DX11 makes more sense than all-low, no-RTX DX12 only reaching up to 100 fps while hovering around 80-90... If you match the game's settings as closely as possible to OG 1.32 on DX11, you get almost the same performance (probably around a 10-15% loss) despite the higher-quality graphics settings that can't be matched. I was getting around 110-135ish fps, so not as much as OG 1.32 but close enough; the scaling works and it all makes sense.
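A quick check on that 10-15% figure using the midpoints of the ranges reported above (my arithmetic, not the poster's):

```python
def pct_loss(old_fps, new_fps):
    """Percentage framerate drop going from old_fps to new_fps."""
    return round(100 * (1 - new_fps / old_fps), 1)

# ~140 fps maxed on 1.32 vs ~122.5 fps (midpoint of 110-135) on DX11 Ultra+
print(pct_loss(140, 122.5))  # 12.5, inside the estimated 10-15% range
```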

So DX12 is borked. Scale everything down to low and you get far less fps than maxed-out DX11 or maxed-out OG 1.32; max out DX12 and the game is unplayable, getting half the FPS of maxed Cyberpunk 2077...

Lastly, on performance: the game refuses to fully utilize the GPU and CPU. Maxed DX12 with DLSS Quality, I see 80-85% GPU usage; maxed DX12 with DLSS Ultra Performance, 50-55ish%; maxed DX12 with TAAU, 45-50ish%; lowest settings DX12, 40-45%. All while performance was, at its worst, 30ish fps and, at its best, 100 fps, yet I still had hardware left to use...?

DX11 has no such issues: 99-100% usage, the game scales, the fps scales. It's just DX12.

At this point I am just waiting for Alex from Digital Foundry's full tech review to confirm or elaborate on my findings. I've learned most of what I know from him over the years, and he's the wizard when it comes to stuff like this.
The devs have already tweeted that they are investigating the issues, so the game is shelved for me until a patch is released.

For clarification, I'm not mad, I'm just sad. I was really looking forward to this; it's my favorite game of all time, and I have a beefy enough machine that I should be able to do mostly-maxed 1440p DLSS Balanced at 60, like in Cyberpunk...

Thank you all for the read, and for those who aren't having issues and are enjoying the game: good luck on the path!
Sources and good reads:
Alex's tweet (from Digital Foundry):
John's tweet (from Digital Foundry):
CDPR tweet:
Marcin's tweet (CDPR Global Community Manager):
Videos and 3rd-party benchmarks showing exactly what I am describing:

Reports of issues:
https://www.rockpapershotgun.com/th...update-is-borked-so-heres-how-to-roll-it-back
https://www.gamesradar.com/witcher-3-next-gen-update-pc-players-say-it-runs-terribly/
https://kotaku.com/witcher-3-next-gen-update-pc-gaming-ps5-xbox-series-1849893251/amp
https://www.pastemagazine.com/games/witcher-3-update-pc-issues/
https://www.dsogaming.com/news/the-...-another-cyberpunk-2077-buggy-mess-at-launch/
 
Very good points. That's my conclusion as well: the game doesn't saturate the GPU in DX12 mode, which is why performance is so much worse than in DX11.

I just hope CDPR (or rather the developers they outsourced the game to; it wasn't CDPR's own studio, if I understand correctly) can actually fix this mess. I don't mind waiting as long as it's a proper fix.
 
I just tried the new TW3 update (4.0, GOG version) in DX12 mode (on Linux, Wine 8.0-rc1 + vkd3d-proton). Performance is much worse than in the previous version: the drop is huge, with no visible graphics improvement to show for it.

Is the experience on Windows similarly bad, with such a huge performance drop? I haven't enabled any ray tracing.

GPU: AMD RX 6800 XT.

The previous version easily hit 180 fps (capped at that) at 2560x1440. The new version often drops to 40 fps, for example in the Kaer Morhen tutorial scene.
The trees and the plants started flashing whenever I move.

 
I just had to fight the Noon Wraith at the abandoned village twice; a second pair of wraiths sprang up immediately after the mission was finished! It's not a graphics issue, though.
 
RTX 4000 users have such an enormous edge in this version that I suspect Nvidia is kinda behind a lot of this crap. There is no reason a 4080 is perfect while a 3090 Ti is struggling like crazy. Well, the reason is frame generation, but a 3090 Ti shouldn't be cut to useless. No way.
 
RTX 4000 users have such an enormous edge in this version that I suspect Nvidia is kinda behind a lot of this crap. There is no reason a 4080 is perfect while a 3090 Ti is struggling like crazy. Well, the reason is frame generation, but a 3090 Ti shouldn't be cut to useless. No way.
Any 3000-series card with the right resolution/settings shouldn't be cut to useless*
 
My issues with the latest update on PC:
- I can't view my FPS with RivaTuner Statistics because it doesn't display over the game, while it used to work fine before.
- I can't use Photo Mode. When I go to key bindings, I see "NONE" assigned to it, while captures I saw on the net suggest it should be set to "U".
- High settings with ray tracing work all right (though obviously slower and choppy), but some shadows are grainy and pixelly, like grass shadows.

My GPU is an MSI RTX 3060 Ti.
Before downloading the update I uninstalled all mods with Mod Manager and deleted all mod files that were in the Steam game folder.
I also downloaded GeForce Game Ready Driver v527.56, released on 12/08/2022.
 

Attachments

  • no photo mode.jpg (357.7 KB)
  • The Witcher 3 Screenshot 2022.12.16 - 18.23.27.51.png (4.9 MB)
- I can't use Photo Mode. When I go to key bindings, I see "NONE" assigned to it, while captures I saw on the net suggest it should be set to "U".
Maybe you should check that, just in case:
 
Maybe you should check that, just in case:
Oh thanks, will try that!
EDIT: It worked perfectly! You're the best!
 
RTX 4000 users have such an enormous edge in this version that I suspect Nvidia is kinda behind a lot of this crap. There is no reason a 4080 is perfect while a 3090 Ti is struggling like crazy. Well, the reason is frame generation, but a 3090 Ti shouldn't be cut to useless. No way.

I suspect so too. There is an intertwining of interests between the studios that develop games and Nvidia. There seems to be a tendency to push gamers into buying the new RTX 4000 series. In other words, it's always about $$$$$.
 
I suspect so too. There is an intertwining of interests between the studios that develop games and Nvidia. There seems to be a tendency to push gamers into buying the new RTX 4000 series. In other words, it's always about $$$$$.
When I look at the hugely expanded render distance of vegetation with increased grass density on Ultra+, and combine that with lots of ray tracing on top of an older engine (probably being pushed to its limit), I understand why the Next Gen version is much more demanding.

Hopefully CDPR can do some optimizing, but I don't want to lose any of the upgraded graphics either; W3 in DX12 looks absolutely brilliant. Ultra+ could at least be treated as a next-gen GPU option (like the ultra settings for Kingdom Come: Deliverance).

However, DX12 is useless right now. DX11 works fine, but the game crashes almost instantly in DX12 (I have a 4090, everything maxed out). First and foremost, CDPR has to fix DX12 stability.

A brand-new New Game+ run is the worst. Just running around rural areas with an older save, there is less crashing; crashes are more frequent in heavily populated areas like Novigrad and Oxenfurt.

Any 3000-series card with the right resolution/settings shouldn't be cut to useless*
That is true, but this could be about an older engine being pushed beyond what it can do efficiently, I dunno.

I went from a 3090 Suprim X to a 4090 Suprim X, and the performance difference is massive. The 4090 eats the most demanding games (like RDR2 maxed out at 3840x1600) for breakfast, and with DLSS 3 frame generation the performance uplift is nuts (DLSS 3 really works as promised, unless the original framerate gets too low, at which point latency becomes an issue). Maybe Ultra+ is reserved for next-gen graphics cards. CDPR obviously still has to sort out the huge framerate drops and stuttering.

I'm also a bit disappointed that some of the most obvious visual bugs and annoyances from the standard W3 version (like certain ugly low-res meshes, the ground shifting while walking, boats dipping under and over the water in lakes, etc.) haven't been fixed, from what I've seen so far. I assume a rather small team worked on this, but I'd much rather have paid for a properly polished update, as this is one of the best games ever made. Sadly, another not-great decision from CDPR here.
 