The Witcher 3 - Version 4.0 - Low performance in DirectX 12, even without using Ray Tracing.

Even using FSR, the performance is still bad, worse than in DirectX 11. Certain parts of the game use the video card less than expected, resulting in low FPS. There is also occasional stuttering.

DirectX 12 mode has much higher latency than DirectX 11, even with Nvidia Reflex enabled, which DX11 does not support.
 
Even using FSR, the performance is still bad, worse than in DirectX 11. Certain parts of the game use the video card less than expected, resulting in low FPS. There is also occasional stuttering.

DirectX 12 mode has much higher latency than DirectX 11, even with Nvidia Reflex enabled, which DX11 does not support.

I'm very curious about other people's experiences with this, so I'd like to ask: try turning the "Grass Density" setting down to its lowest, then report back. Does it improve performance for you as much as it did for me?
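
If you want to flip the setting quickly between tests without going through the menus each time, here is a rough sketch that might work; it assumes, based on old Witcher 3 tweak guides and not verified against 4.0, that the slider maps to a GrassDensity key in user.settings, and the low value used is just a guess:

```python
# Hypothetical helper: rewrite the GrassDensity value in user.settings.
# Assumes the key exists in the file (common in pre-4.0 tweak guides);
# back the file up before trying this.
from pathlib import Path
import re

# Default Windows location; adjust if your Documents folder is elsewhere.
settings = Path.home() / "Documents" / "The Witcher 3" / "user.settings"

def set_grass_density(value: int) -> None:
    text = settings.read_text()
    new_text, count = re.subn(
        r"(?m)^GrassDensity=\d+$", f"GrassDensity={value}", text)
    if count == 0:
        raise RuntimeError("GrassDensity key not found; 4.0 may differ")
    settings.write_text(new_text)

set_grass_density(400)  # example low value; the preset-to-number mapping is a guess
```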
 
I'm very curious about other people's experiences with this, so I'd like to ask: try turning the "Grass Density" setting down to its lowest, then report back. Does it improve performance for you as much as it did for me?
I found that I have FPS drops regardless of the graphics quality level (I tried low, medium, and high settings). The performance is worse than it was in 1.32.
Also, I can see my laptop has started to heat up. In 1.32 the median system temperature was 65-75 °C. In 4.0, with the same settings in DX11, I now get 75-90 °C.
 
Oof, that's rough :\ I can't really say whether the performance is better or worse for me, since I have a very old PC anyway (it always struggled to even reach 60 fps to begin with, hahah), but yeah. I can turn some options down and it'll literally go from 40 to 50 fps, which is a huge boost for me. Ultimately, the game still looks pretty much the same to me, so it definitely doesn't perform much worse than it did before, at least not for me personally. I never ran a check on temperatures, but what you said doesn't sound that good either.

Let's just hope they're actually aware (like they said) and will work on fixing things.
 
Hello everyone! I too am having horrible performance with DX12 on a beefy machine (specs below) with update 4.0, so I am coming over from Reddit with a full, detailed post of my tests, theories, and breakdown. Here we go.

TLDR: The game's DX12 implementation is 100% broken, causing awful performance and scaling along with stutters and graphical bugs. The game also seems to utilize only 2 CPU cores, which starves every GPU no matter what you do (with the exception of 40-series cards using DLSS 3.0 frame generation and some 4K setups). The DX11 implementation works much better, but you lose DLSS/FSR/RTX, the selling points of the update. Lastly, Alex from Digital Foundry is going to do a video and has already tweeted that the patch definitely has issues, with the CPU being a problem.
I also linked some tweets/articles/videos at the bottom confirming the issues and proving it is not just ranting from a handful of people.

My original Reddit post is here: https://www.reddit.com/r/Witcher3/comments/zllc1l
To begin, I believe I have narrowed down the two biggest issues.

First: the game seems to use only two of my 5900X's 12 CPU cores (Cyberpunk originally had this issue too, to a lesser degree). I've never seen the game's CPU usage above 6%, and I commonly see it at 1-2%. An underutilized CPU means the game is starving our GPUs, causing them to be underutilized as well. For GPU usage I've seen as low as 45% and as high as 85%, but never anything close to 100% unless playing on DX11. In other words, the game's poor CPU optimization keeps our beefy GPUs from being fully used, killing performance while we still have hardware headroom left.
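
To put rough numbers on that claim: a 5900X exposes 24 logical processors (12 cores with SMT), so each fully loaded thread shows up as only about 4% in a whole-CPU usage readout. A quick sketch of that arithmetic:

```python
# Back-of-the-envelope check: how many busy threads does a given
# whole-CPU usage reading imply on a 12-core/24-thread Ryzen 9 5900X?
LOGICAL_PROCESSORS = 24  # 12 cores x 2 SMT threads

def busy_threads(total_cpu_percent: float) -> float:
    """Convert an aggregate CPU-usage percentage into the equivalent
    number of fully loaded logical processors."""
    return total_cpu_percent * LOGICAL_PROCESSORS / 100

for reading in (1, 2, 6):
    print(f"{reading}% total usage ~= {busy_threads(reading):.1f} busy threads")
# 6% total usage ~= 1.4 busy threads, i.e. the game saturating only
# one or two threads while the rest of the CPU sits idle.
```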

This is not just a me issue; several reports are now claiming the same thing, and I've done the testing and posted the screenshots in the Reddit post linked above, so I believe this is what's likely happening.

Secondly: the game's DX12 implementation is straight-up borked on PC, and I'm even reading reports of the same thing for the next-gen Xbox, as it also uses DX12. This is causing horrible scaling of settings, lower-than-expected performance, stutters, and possibly the graphical glitches and crashing.

How did I come to this theoretical conclusion?

To start, my rig:
  • Asus Crosshair VIII X570 (latest BIOS, 4201)
  • Ryzen 9 5900X with a slight OC to 4.95 GHz, 360mm AIO cooler
  • Asus ROG Strix RTX 3080 with a slight factory OC
  • 32 GB of 3200 MHz RAM
  • Game installed to a 1 TB Samsung 970 Evo NVMe
  • Latest Nvidia drivers (527.56)
  • Latest Windows 11 (Home, Version 22H2, installed 10/19/2022, OS build 22621.963, Windows Feature Experience Pack 1000.22638.1000.0)
Note: this Windows 11 install is less than 2 months old and was a full fresh install, not an upgrade from a previous Windows 10 install, and the game is freshly installed, not previously modded, due to my recent wipe when moving to Windows 11. Hardware-accelerated GPU scheduling is on.

Now for background info: I run all my games more than fine (including with reasonable RTX) and could not complain, but for the sake of keeping it short we'll use OG 1.32 Witcher 3 and Cyberpunk 2077. This same rig did 1440p maxed at 140+ fps in Witcher 3, and we are talking HairWorks and everything.
Cyberpunk 2077: all settings maxed minus the two Psycho options, all RTX on, at 1440p DLSS Balanced, locked at 60 from day one to the most recent patch.

OK, next-gen Witcher 3. To break this down: if I use the lowest settings possible on DX12 at 1440p DLSS Performance, I get 80-100ish fps or less. Again, going back to my background info, this same machine could easily do 1440p maxed at 140+ fps on 1.32, but now at all-lowest settings, 1440p, DLSS Performance or TAAU with resolution scaling, I get 80-100 fps.....

I CANNOT make this up: I am seriously, somehow, getting A LOT less fps at all-low settings on DX12 than on maxed-out OG 1.32 with the same rig....
If I turn on all RTX with all other settings low, at DLSS Ultra Performance, I get 35-50 fps!
Turn it to all-lowest settings, 1440p DLSS Ultra Performance with only RTGI, and I get around 42-55 fps!
MAX the game out with 1440p DLSS Quality: 32-40 fps! Whatever I try, I cannot reach 60 fps with any RTX enabled.
Trying for 60 fps with no RTX at Ultra/Ultra+ doesn't work either. Basically, it's all medium-to-low now, no RTX, DLSS Performance, to net 60-75 fps with sub-60 fps 1% lows and insane input delay. Anything else I try gets me under 60, down to as low as 32 when running maxed, on my MONSTER rig (only 1.78% of 30+ million Steam users have an RTX 3080), the same rig that runs Cyberpunk 2077 Ultra with RTX at 1440p DLSS Balanced at a locked 60 fps. Make it make sense, please!

Now, we can prove it's bad DX12 by switching to DX11 and maxing it out. Obviously there is no DLSS or RTX, but maxed out (so the new Ultra+) with TAAU, no scaling, at native 1440p, I am now getting 100-110ish fps. Still not as much as maxed 1.32, but remember: Ultra+ settings, the new screen-space reflections, TAAU, the UHD textures, etc.

This FPS on DX11 makes far more sense than all-low, no-RTX DX12 reaching only up to 100 fps while hovering around 80-90. If you match the game's settings as closely as possible to OG 1.32 on DX11, you get almost the same performance (probably around a 10-15% loss) despite the higher-quality graphics settings that can't be matched exactly. I was getting around 110-135ish fps, so not as much as OG 1.32, but close enough; the scaling works, and this all makes sense.

So DX12 is borked. Scale everything down to low and you get a lot less fps than maxed DX11 and maxed OG 1.32; max out DX12 and the game is unplayable, getting half the FPS of maxed-out Cyberpunk 2077.....

Lastly, on performance: the game refuses to fully utilize the GPU and CPU. Maxed DX12 with DLSS Quality I see 80-85% GPU usage; maxed DX12 with DLSS Ultra Performance, 50-55ish%; maxed DX12 with TAAU, 45-50ish%; lowest settings DX12, 40-45%. All the while, performance was at its worst 30ish fps and at its best 100 fps, yet I still had hardware left to use...?

DX11: no issues, 99-100% usage; the game scales and the fps scales. It's just DX12.
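
For anyone who wants to check their own GPU utilization the same way, here is a rough sketch of a logger; it assumes an Nvidia card with nvidia-smi available on the PATH, and the one-second interval and printed format are just my choices:

```python
# Rough GPU-utilization logger: polls nvidia-smi once per second while
# you play, so you can see whether the game ever gets close to 100% GPU.
# Stop it with Ctrl+C.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=utilization.gpu,utilization.memory",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    gpu_pct, mem_pct = out.split(", ")
    print(f"GPU {gpu_pct}% | memory controller {mem_pct}%")
    time.sleep(1)
```

Run it in a terminal and alt-tab into the game; an overlay like MSI Afterburner will show you the same thing if you prefer.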

At this point I am just waiting for Alex from Digital Foundry's full tech review to confirm or elaborate on my findings. I've learned most of what I know from him over the years, and he's the wizard when it comes to stuff like this.
The devs have already tweeted that they are investigating the issues, so the game will be shelved for me until a patch is released.

For clarification, I'm not mad, I'm just sad. I was really looking forward to this; it's my favorite game of all time, and I have a beefy enough machine that I should be able to do mostly-maxed 1440p DLSS Balanced at 60 fps, like in Cyberpunk…..

Thank you all for the read, and for those who aren't having issues and are enjoying the game: good luck on the path!
Sources and good reads:
Alex's tweet (Digital Foundry):
John's tweet (Digital Foundry):
CDPR tweet:
Marcin's tweet (CDPR Global Community Manager):
Videos and third-party benchmarks showing exactly what I am saying here:

Reports of issues:
https://www.rockpapershotgun.com/th...update-is-borked-so-heres-how-to-roll-it-back
https://www.gamesradar.com/witcher-3-next-gen-update-pc-players-say-it-runs-terribly/
https://kotaku.com/witcher-3-next-gen-update-pc-gaming-ps5-xbox-series-1849893251/amp
https://www.pastemagazine.com/games/witcher-3-update-pc-issues/
https://www.dsogaming.com/news/the-...-another-cyberpunk-2077-buggy-mess-at-launch/
 
Even using FSR, the performance is still bad, worse than in DirectX 11. Certain parts of the game use the video card less than expected, resulting in low FPS. There is also occasional stuttering.

DirectX 12 mode has much higher latency than DirectX 11, even with Nvidia Reflex enabled, which DX11 does not support.
Post automatically merged:

Isn't DirectX 12 supposed to be the king of CPU performance? Hahaha
 
[screenshot attachment: 1671265715187.png]

DX12 with no RT is hilarious 😂
My CPU should probably be the bottleneck, since I have an older one, an i7-5820K, but it isn't.
Comparing it to CP77 is night and day, though: CP77 totally hammers my CPU to 100%. That's a true bottleneck right there.
And CP77 also has way better performance 💀
This is clearly an optimization issue.
 
It's getting more and more obvious that CDPR and Nvidia cooked this up, because with an RTX 4000 the performance is through the roof with frame generation. I'm getting close to 300 fps at 3440x1440 without RT, but with DLSS at Ultra Performance. A bit silly to gate out customers who don't have 4000-series cards.
 
It's getting more and more obvious that CDPR and Nvidia cooked this up, because with an RTX 4000 the performance is through the roof with frame generation. I'm getting close to 300 fps at 3440x1440 without RT, but with DLSS at Ultra Performance. A bit silly to gate out customers who don't have 4000-series cards.
It is called "next gen" for a reason. I have my game on EA/Origin, so I can't yet see how well my Ryzen 9 7900X plus Radeon 6800 will handle it. I really hope CDPR just gives us a copy on GOG so I can be done with EA, outside of going back for old BioWare games from time to time.
 
Did they update the game today? Because I am getting some crazy performance now that I did not get a few hours ago. This morning I got 100ish fps at 3440x1440, DLSS Quality, at Ultra+ and with RT. Now I get 160 fps on the exact same settings. I have rebooted and checked everything.
 
When can we expect the first hotfix??? I actually don't use any of the new GFX stuff; I just want my old performance back. I'm only playing because of the mods, the new quest, the fixes, and the loot.
Post automatically merged:

Did they update the game today? Because I am getting some crazy performance now that I did not get a few hours ago. This morning I got 100ish fps at 3440x1440, DLSS Quality, at Ultra+ and with RT. Now I get 160 fps on the exact same settings. I have rebooted and checked everything.
They didn't as far as I know.
Post automatically merged:

View attachment 11334928
DX12 with no RT is hilarious 😂
My CPU should probably be the bottleneck, since I have an older one, an i7-5820K, but it isn't.
Comparing it to CP77 is night and day, though: CP77 totally hammers my CPU to 100%. That's a true bottleneck right there.
And CP77 also has way better performance 💀
This is clearly an optimization issue.
Of course it is. This game worked totally fine on Ultra with my i7 3770K with an RX 580 and an RX 570 8GB.
 
I'm kinda feeling like I'm a bit in the minority here, but my PC specs are as follows:
  • AMD Radeon RX 480
  • AMD Ryzen 5 1500X
  • A measly 6 TB HDD (which is not an SSD, which in turn means it's slow, I know)
And my only goal, with any game EVER, is to a) get it to run at 1080p (because that's where my monitor caps out anyway... and I personally don't need anything higher), and b) have it run at as many FPS as I can squeeze out of it... as long as it stays somewhere above 30 :D

So far, with The Witcher 3 and my crappy hardware, I've been able to do just that :) even with the next-gen update, where many have said it ruined their performance... I am still getting (at the very least) a consistent 32-35 FPS in stressful situations, and much more whenever the game eases up.

I guess at the end of the day it comes down to personal preference and tweaking. I never believed for a second that simply hitting "ULTRA+" and calling it a day would please anyone, not after this game has been out for over 7 years, given all the mods.
 
I'm kinda feeling like I'm a bit in the minority here, but my PC specs are as follows:
  • AMD Radeon RX 480
  • AMD Ryzen 5 1500X
  • A measly 6 TB HDD (which is not an SSD, which in turn means it's slow, I know)
And my only goal, with any game EVER, is to a) get it to run at 1080p (because that's where my monitor caps out anyway... and I personally don't need anything higher), and b) have it run at as many FPS as I can squeeze out of it... as long as it stays somewhere above 30 :D

So far, with The Witcher 3 and my crappy hardware, I've been able to do just that :) even with the next-gen update, where many have said it ruined their performance... I am still getting (at the very least) a consistent 32-35 FPS in stressful situations, and much more whenever the game eases up.

I guess at the end of the day it comes down to personal preference and tweaking. I never believed for a second that simply hitting "ULTRA+" and calling it a day would please anyone, not after this game has been out for over 7 years, given all the mods.
You are not alone.

My specs are: Gigabyte Z77, i7 3770K, 16 GB DDR3 RAM, Sapphire RX 570 8GB OC. Before the patch I had a steady 60 fps, capped!, on high/ultra and even with HairWorks on Geralt! Novigrad was fluid. 1080p, of course.
Now on DX11, with the same or even worse settings, I get something like 35-60 depending on where I am. I'm asking myself, WTF is going on, did I dream that 60 fps? That's how badly this patch screwed the performance.
DX12 is just a lag fest, no need to even try, and I have no reason to anyway since I can't use ray tracing etc.

With the same specs, only with my friend's Asus Strix RX 580 instead, I had fluid performance on the older version, like 80-100 fps...

So this is their fault. I'm not even using anything new that they introduced in the patch. Yes, they integrated mods like 4K textures and some effects, but I used those mods before and didn't have FPS drops. 8 GB of VRAM and 16 GB of RAM is enough for that and shouldn't cause any FPS drop.
 
Holy shit, I thought I was going crazy. I was averaging 65 fps at ultra, so I decided to test it at low and medium, and lo and behold, I was still getting roughly 65 frames. Something is very wrong with the graphics in this update.
 