Witcher 3 Next Gen update - Performance

Hello. Consider this a technical report on performance and a few bugs:

System:
AMD Threadripper 1900X (8 cores at 4 GHz)
32 GB RAM at 3200 MHz
AMD Radeon VII, 16 GB VRAM (drivers from August 2022)
Windows 10 x64, latest patches
4K display at 120 Hz

Recommendation:
- Fullscreen, with FreeSync on and unlimited FPS, is smoothest

Borderless or windowed mode adds an additional graphics buffer, and the Windows desktop is already VSync-ed and frame-limited on top of the in-game settings.
You may also turn VSync off, as it should not have any effect when FreeSync/G-Sync is in use.

Borderless/windowed mode, or the frame limiter turned on, causes microstuttering.

Notes:
  • The engine uses about 5 GB of RAM and 9 GB of VRAM
  • Shader caching causes stutter when an object is loaded for the first time (especially in Toussaint?)
  • The old AA methods did not add much benefit on a 4K display in the old engine; FSR actually helps a lot, so I keep it enabled.
  • Ray tracing is not available on the Radeon VII.
  • You can enable NVIDIA HairWorks with the AMD Radeon VII. AMD had a technology of its own quite a while ago; HairWorks does not eat away any performance and actually seems to run better.
  • Priscilla's Song (The Wolven Storm) is now perfectly smooth in the cutscene

Performance on the highest settings is almost identical between the old and new versions of the game on the hardware above. The scene in Toussaint with Geralt and Ciri under the tree reached 62-64 fps on the old engine; now it's 66-68 fps.

Everything is on Ultra+. AA is off on the older engine; FSR on Auto is used on the newer one. RT is not available and thus off.

Bugs:
- The menu appears to be "auto-refreshing". It's very annoying when scrolling a long list of saves and it "clicks".
This happens until a new save is created.

- Try to import a save from The Witcher 2: it pops up a message several times saying that the newer version of the game is no longer compatible with old saves.

- Lightning in the battle with Eredin (on the Naglfar) can give green or purple hues. That indicates a shared bug: in one case G returns 255 with the other channels at 0, and in the other case R and B return 255, as illustrated below.
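To illustrate, these are the two channel combinations that would produce those hues (the exact values are my guess from the description):

```python
# Hypothetical reconstruction of the two broken lightning colors:
# G clamped to 255 with R/B at 0 gives pure green; R and B clamped
# to 255 with G at 0 gives magenta/purple.
cases = {"green": (0, 255, 0), "purple": (255, 0, 255)}
for name, (r, g, b) in cases.items():
    print(f"{name}: #{r:02X}{g:02X}{b:02X}")
```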
 
Hello everyone! I too am having horrible performance with DX12 on a beefy machine (specs below) with update 4.0, so I am coming over from Reddit with a full, detailed post on my tests/theories/breakdown. Here we go.

TL;DR: The game's DX12 implementation is 100% broken, causing awful performance/scaling along with stutters and graphical bugs. The game also seems to utilize only 2 CPU cores, which is starving every GPU (with the exception of 40-series cards with DLSS 3.0 and some 4K setups) no matter what you do. The DX11 implementation works much better, but you lose DLSS/FSR/RTX, the selling points of the update. Lastly, Alex from Digital Foundry is going to do a video and has already tweeted that the patch definitely has issues, with the CPU being a problem.
I also linked some tweets/articles/videos at the bottom confirming the issues and proving it is not just ranting or a handful of people.

My original Reddit post is here: https://www.reddit.com/r/Witcher3/comments/zllc1l
So, to begin, I believe I have narrowed down the two biggest issues.

First: the game seems to only use two of my 5900X's 12 CPU cores (Cyberpunk also originally had this issue, to a lesser degree). I've never seen the game's CPU usage above 6%, and I commonly see it at 1-2%. The underutilized CPU means the game is starving our GPUs, causing them to also be underutilized. For GPU usage I've seen as low as 45% and as high as 85%, but never anything close to 100% unless playing on DX11. In other words, the game's poor CPU optimization is keeping our beefy GPUs from being fully used, killing our performance while we still have hardware headroom left.

This is not just a me issue; several reports are now claiming the same thing, and I've done the testing and posted the screenshots in the Reddit post linked above, so I believe this is what's likely happening. A quick way to check it yourself is sketched below.
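Here is a minimal sketch of that per-core check (assuming Python with the psutil package installed); HWiNFO or Task Manager's per-core view shows the same thing:

```python
import psutil  # pip install psutil

# Sample per-core CPU utilization once per second while the game runs.
# A game bound to two threads shows two cores pegged high and the rest
# near idle, even though the total usage number looks tiny.
for _ in range(10):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    busy = [f"core{i}: {p:.0f}%" for i, p in enumerate(per_core) if p > 50]
    print(f"total {sum(per_core) / len(per_core):.1f}% | hot: {busy or 'none'}")
```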

Secondly: the game's DX12 implementation is straight-up borked on PC, and I'm even reading reports of the same thing on next-gen Xbox, as it also uses DX12. This is causing horrible scaling of settings, lower-than-expected performance, stutters, and possibly the graphical glitches/crashing.

How did I come to this theoretical conclusion?

To start, my rig: Asus Crosshair VIII X570 (latest BIOS 4201) with a Ryzen 9 5900X slightly overclocked to 4.95 GHz, a 360 AIO cooler, an Asus ROG Strix RTX 3080 with a slight factory OC, 32 GB of 3200 MHz RAM, and the game installed to a 1 TB Samsung 970 Evo NVMe, plus the latest NVIDIA drivers (527.56) and the latest version of Windows 11 (Windows 11 Home, version 22H2, installed on 10/19/2022, OS build 22621.963, Windows Feature Experience Pack 1000.22638.1000.0). Note that this Windows 11 install is less than 2 months old and was a full fresh install, not an upgrade from a previous Windows 10 install, and the game is freshly installed, not previously modded, due to my recent wipe moving to Windows 11. Hardware-accelerated GPU scheduling is also on.

Now, background info: I run all my games more than fine (including reasonable RTX) and could not complain, but for the sake of keeping it short we'll use OG 1.32 Witcher 3 and Cyberpunk 2077. This same rig did 1440p max at 140+ fps in Witcher 3, HairWorks and everything.
Cyberpunk 2077 with all settings maxed minus the two Psycho options, all RTX on, at 1440p DLSS Balanced: a locked 60 from day one to the most recent patch.

OK, next-gen Witcher 3. To break this one down: if I set the lowest settings possible on DX12 at 1440p with DLSS Performance, I get 80-100ish fps or less. Again, going back to my background info, this same machine could do 1440p maxed at 140+ fps easily on 1.32, but now, on all lowest settings at 1440p with DLSS Performance or TAAU with resolution scaling, I get 80-100 fps...

I CANNOT make this up: I am somehow getting A LOT less fps at all-low on DX12 than on maxed OG 1.32, on the same rig...
If I turn on all RTX with all other settings low and DLSS Ultra Performance, I get 35-50 fps!
Turn it to all lowest settings, 1440p DLSS Ultra Performance, with only RTGI: around 42-55 fps!
Max the game out with 1440p DLSS Quality: 32-40 fps! Whatever I try, I cannot reach 60 fps with anything RTX enabled.
Trying to get 60 fps without RTX on Ultra/Ultra+ also fails. Basically it's all medium-to-low now, no RTX, with DLSS Performance, to net 60-75 fps with sub-60 fps 1% lows and insane input delay. Anything else I try gets me under 60, down to as low as 32 when running maxed, on my MONSTER rig (only 1.78% of 30+ million Steam users have an RTX 3080), the same rig that runs Cyberpunk 2077 Ultra with RTX at 1440p DLSS Balanced at a locked 60 fps. Make it make sense, please!

Now, we can prove it's bad DX12 by switching to DX11 and maxing it out. Obviously there is no DLSS or RTX, but maxed out (so the new Ultra+) with TAAU, no scaling, at native 1440p, I am now getting 100-110ish fps. Still not as much as maxed 1.32, but remember: Ultra+ settings, new screen-space reflections, TAAU, UHD textures, etc.

This fps on DX11 makes more sense than all-low, no-RTX DX12 reaching only up to 100 fps but hovering around 80-90... If you match the game's settings as closely as possible to OG 1.32 on DX11, you get almost the same performance (probably around a 10-15% loss) despite the higher-quality graphics settings that can't be matched. I was getting around 110-135ish fps, so not as much as OG 1.32, but close enough; the scaling works, and it all makes sense.

So DX12 is borked. If you scale all the way down to low, you get a lot less fps than max DX11 and max OG 1.32; max out DX12 and the game is unplayable, getting half the fps of Cyberpunk 2077 maxed...

Lastly, on performance: the game refuses to utilize the GPU and CPU in full. Maxed DX12 with DLSS Quality I see 80-85% GPU usage; maxed DX12 with DLSS Ultra Performance, 50-55ish%; maxed DX12 with TAAU, 45-50ish%; lowest settings DX12, 40-45%. All the while, performance was around 30 fps at its worst and 100 fps at its best, yet I still had hardware left to use...?

DX11: no issues, 99-100% usage, and the fps scales with settings. It's just DX12.
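If you want to log GPU utilization and power draw alongside this, here is a minimal sketch (assuming an NVIDIA card with nvidia-smi on the PATH):

```python
import subprocess
import time

# Poll nvidia-smi once per second. Utilization well below ~95% together
# with low power draw, while fps is still short of target, suggests the
# GPU is being starved by the CPU rather than being the bottleneck.
QUERY = ["nvidia-smi",
         "--query-gpu=utilization.gpu,power.draw",
         "--format=csv,noheader"]
for _ in range(10):
    print(subprocess.check_output(QUERY, text=True).strip())
    time.sleep(1.0)
```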

At this point I am just waiting for Alex from Digital Foundry's full tech review to highlight, confirm, or elaborate on my findings. I've learned most of what I know from him over the years, and he's the wizard when it comes to stuff like this.
The devs have already tweeted that they are investigating the issues, so the game will be shelved for me until a patch is released.

For clarification, I'm not mad, I'm just sad. I was really looking forward to this; it's my favorite game of all time, and I have a beefy enough machine that I should be able to do mostly-maxed 1440p DLSS Balanced at 60 fps, like Cyberpunk...

Thank you all for the read, and for those who aren't having issues and are enjoying the game: good luck on the path!
Sources and good reads:
Alex's tweet (Digital Foundry):
John's tweet (Digital Foundry):
CDPR tweet:
Marcin's tweet (CDPR Global Community Manager):
Videos and 3rd party benchmarks show exactly what I am saying here:

Reports of issues:
https://www.rockpapershotgun.com/th...update-is-borked-so-heres-how-to-roll-it-back
https://www.gamesradar.com/witcher-3-next-gen-update-pc-players-say-it-runs-terribly/
https://kotaku.com/witcher-3-next-gen-update-pc-gaming-ps5-xbox-series-1849893251/amp
https://www.pastemagazine.com/games/witcher-3-update-pc-issues/
https://www.dsogaming.com/news/the-...-another-cyberpunk-2077-buggy-mess-at-launch/
 
Hi. What's the power consumption of the CPU and GPU?

Sometimes, when both are really powerful, the engine does not need more than 2-3 percent of the CPU's performance. As a result, Windows will keep the CPU cores in lower power states, limiting performance. The same can happen for the GPU.

Check with some tools; if the power consumption is very low, close to idle, that could be the problem.
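A rough sketch of such a check (assuming Python with psutil; HWiNFO gives the same readings with more detail):

```python
import psutil  # pip install psutil

# Compare the current core clock to the rated maximum while the game
# is running. A current frequency far below the boost clock hints that
# the cores are being held in a low power state.
freq = psutil.cpu_freq()
print(f"current {freq.current:.0f} MHz / max {freq.max:.0f} MHz")
```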

Also try fullscreen without a frame limit and without VSync.
 
The GPU at peak wattage was hitting around 330 watts. Also, I was already running fullscreen with VSync off, as I have G-Sync and the frame rate unlocked.
The game is 100% CPU-bound, which is causing the GPU to be starved and underutilized. You can prove this by changing any graphics setting and getting the same fps as before, depending on your setup.

I also used HWiNFO to validate what was being used: the GPU never hit full usage, and only two CPU cores were being hammered, the rest fully idle.

If this is a Windows issue, that still does not rule out it being the game's fault, as no other game or application on my system has this issue, including this game running in DX11; only this game in DX12 sees the issue.
 
Look for the pixel format setting and ensure it's set to RGB 4:4:4.

I had a bug where it was set to YCbCr for some reason, which basically added another conversion on output, taxing the GPU a lot.
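For context, a rough sketch of what that extra per-pixel conversion computes (using the standard BT.709 full-range coefficients; the driver has to apply this to every pixel of every frame):

```python
def rgb_to_ycbcr_bt709(r, g, b):
    """Per-pixel RGB -> YCbCr conversion (BT.709, full range, 8-bit)."""
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    cb = (b - y) / 1.8556 + 128
    cr = (r - y) / 1.5748 + 128
    return y, cb, cr

print(rgb_to_ycbcr_bt709(255, 0, 255))  # the "purple" case from earlier
```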
 
RTX 3080 here (5900X), running at 4K with a mix of Ultra RT and Ultra+ RT. For me, turning on DLSS actually lowers the fps I am getting: 20-45 fps average in Toussaint. Seems bizarre, but I am guessing I should be getting much higher fps with DLSS on, not lower.
NOTE: This is a completely fresh install of the game and the latest NVIDIA driver (clean install with DDU, etc.)
 