Performance issues with Witcher 3 ver 4.0

The problem with DLSS 2 on RTX 30xx cards is that it doesn't work at all. There is no internal image scaling, which is why there is no speed benefit from enabling it. The problem may be the Nvidia driver, which may have locked this feature to favor the RTX 40xx series with DLSS 3 instead.
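For context, here's roughly what DLSS 2 should be doing internally if it were working: a quick Python sketch of the per-mode render resolutions at 4K. The scale factors are the commonly cited ones, so treat them as approximations. With this few pixels rendered natively, a working DLSS should show an obvious FPS gain unless something else is the bottleneck:

```python
# Approximate internal render resolution per DLSS 2 mode. Scale factors are
# the commonly cited ones (Quality ~0.667, Balanced ~0.58, Performance 0.5,
# Ultra Performance ~0.333) -- treat them as ballpark values.
DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    """Resolution DLSS should render at before upscaling to the output size."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    share = 100 * (w * h) / (3840 * 2160)
    print(f"4K {mode:>17}: {w}x{h}  (~{share:.0f}% of native pixels)")
```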
 
Yes, the performance difference is very big. I have a 4090 and a 10600K, and play at 3840x1600. Maxing out all the visual settings, I get around 50-60 FPS in Toussaint. Fortunately I can turn on DLSS 3 and frame generation, which helps tremendously; now I get high framerates and smooth gameplay again. But yeah, this is a very demanding version. Maybe some optimisations can be done via patching?

Btw, I strongly disagree that the graphics don't look better. IMO they look A LOT better: better vegetation, better lighting, ray tracing, better reflections, better textures, etc.


I think you made a bit of a judgement leap from 'low frame rate' to 'demanding' very quickly. The problem is not that this game is demanding. The problem is that it doesn't demand much, and still runs terribly.

When I run CP2077, my GPU is pulling 450+ watts and humming along nicely with good framerates, because it is doing the work it is being asked to do. With Witcher 3 NG, however, it barely pulls 350 watts and runs horribly. Which essentially means the GPU is not even being asked to do that much work by the game.

If it were demanding, I would understand. The problem is that it is not demanding; it is just bad at running.

For a game to use 25% less power than CP2077 and run 30% worse is not "demanding". It simply means that it is a hot pile of you-know-what.
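If anyone wants to reproduce this comparison on their own rig, here's a rough Python sketch that samples power draw and GPU utilization via nvidia-smi once a second while the game runs (Nvidia cards only; assumes nvidia-smi is on your PATH). A card sitting far under its power limit at modest utilization is exactly the not-being-asked-to-work signature described above:

```python
import subprocess
import time

# Query power draw and utilization once per second for ~30 seconds.
QUERY = ["nvidia-smi",
         "--query-gpu=power.draw,utilization.gpu",
         "--format=csv,noheader,nounits"]

for _ in range(30):
    out = subprocess.check_output(QUERY, text=True).strip()
    power_w, util_pct = [v.strip() for v in out.split(",")]
    print(f"power: {power_w} W   GPU utilization: {util_pct}%")
    time.sleep(1.0)
```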
 
Could you add the same screenshots for DX12 TAAU with RT off, DX12 DLSS Quality with RT off, and DX12 DLSS Balanced with RT on?

[Screenshot: W3 dx12 t q b.png]
 
Depends on your hardware. If you have a high-end card, this game will not saturate the GPU and you still get poor performance with RT and DLSS 2. The reason is that they didn't natively port to DX12: the game uses a DX12 wrapper instead, which puts a ton of overhead on the CPU. The CPU is the bottleneck. This is why someone with a 4090 vs a 3090, using DLSS 2 and the same CPU, will get the same performance in most situations in this game.

The 4090 can pull away using that gimmick DLSS 3, but DLSS 3 is only viable when you can already push a lot of frames. It introduces latency penalties despite generating more frames, and the quality of the generated frames is poor when scrutinized. The issues with DLSS 3 are less apparent when you are going from 100-plus FPS to something even higher; using DLSS 3 to go from 60-70 FPS to somewhere higher is probably not ideal.

I don't have a 4090 yet, but I currently have a 3090. I will be seeing the difference, or lack thereof, in this game soon, as I have a 4090 on order. I will try out DLSS 3 for myself, but honestly Hardware Unboxed and other tech YouTubers haven't been impressed with the technology outside the scenario I described, and this game does not fit that scenario because everyone gets CPU-capped performance using DLSS 2 with RT.
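To put the latency argument in rough numbers, here's an illustrative Python back-of-the-envelope. It's a simplification that ignores the rest of the pipeline and Reflex, so treat the absolute figures loosely: frame generation roughly doubles what the display shows, but responsiveness still tracks the base framerate plus roughly one held-back frame, which is why it feels fine above 100 base FPS and not at 60-70:

```python
def frame_gen_feel(base_fps: float) -> None:
    base_ms = 1000 / base_fps            # time per really-rendered frame
    displayed_fps = base_fps * 2         # idealized: every other frame generated
    latency_ms = base_ms * 2             # ~1 base frame + ~1 held-back frame, best case
    print(f"base {base_fps:3.0f} fps -> displays ~{displayed_fps:.0f} fps, "
          f"input latency ~{latency_ms:.0f} ms (vs ~{base_ms:.0f} ms without FG)")

for fps in (30, 60, 120):
    frame_gen_feel(fps)
```

At a 120 FPS base the added delay is a few milliseconds; at 30-60 it's a very noticeable chunk.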


I have a Ryzen 3600 with an RX 6700 XT and 16 GB of RAM. Right now I'm playing the DX11 version with Ultra+ settings (except for crowd density, as I don't like bumping into people while moving through the city; they change direction in front of me quite annoyingly). I'm also using an FPS lock of 70 FPS in my graphics card settings, as there is no point generating more frames anyway, and it keeps my card from getting too hot and noisy. When that lock is removed, my card is constantly at 95-99% and reaches 70-75 degrees with the fan running at almost 2000 RPM, which is not something I enjoy, especially since I usually play at night or early in the morning.
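The arithmetic behind why a cap tames heat and noise is simple. A tiny Python sketch with made-up numbers (the 85 FPS uncapped figure is purely hypothetical) showing how much of each frame interval the GPU can idle under a cap:

```python
def cap_headroom(cap_fps: float, uncapped_fps: float) -> None:
    budget_ms = 1000 / cap_fps           # time the cap allows per frame
    work_ms = 1000 / uncapped_fps        # time the GPU actually needs per frame
    idle_pct = max(0.0, 100 * (1 - work_ms / budget_ms))
    print(f"capped at {cap_fps:.0f} fps: {budget_ms:.1f} ms budget, "
          f"~{work_ms:.1f} ms of work -> GPU idles ~{idle_pct:.0f}% of each frame")

cap_headroom(70, 85)   # the 85 fps uncapped figure is hypothetical
```

Every millisecond the GPU spends idle per frame is power it doesn't draw and heat the fans don't have to move.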
 
I have a Ryzen 3600 with an RX 6700 XT and 16 GB of RAM. Right now I'm playing the DX11 version with Ultra+ settings (except for crowd density, as I don't like bumping into people while moving through the city; they change direction in front of me quite annoyingly). I'm also using an FPS lock of 70 FPS in my graphics card settings, as there is no point generating more frames anyway, and it keeps my card from getting too hot and noisy. When that lock is removed, my card is constantly at 95-99% and reaches 70-75 degrees with the fan running at almost 2000 RPM, which is not something I enjoy, especially since I usually play at night or early in the morning.
Newer games are definitely more demanding on the GPU, generating more heat. That's normal. I remember Days Gone heated up my 3090 like crazy. The exception is the 4090, which keeps cool no matter what game you throw at it.
 
AMD Ryzen 5 5600X
32 GB RAM
Zotac RTX 3060

It is not possible to play with DX12; ray tracing and DLSS cause big performance issues.
Even without the new features, DX12 does not run well.
The world map stutters, and even when frames are above 30, everything stutters.

I have now switched to DX11 with the following settings:
FullScreen
VSync ON
2560 x 1440
Maximum Frames Per second: 144 fps

TAAU
Sharpening: low
Screen Space Ambient Occlusion ON
Screen Space Reflections High (Low will not be saved, so I switch to Low manually)
Motion Blur OFF
Blur OFF
Bloom OFF
Depth of Field ON
Chromatic Aberration OFF
Vignetting OFF
Light Shafts OFF
Camera Lens Effects OFF
HairWorks ON, 4x AA, High

Characters HIGH
Shadows HIGH
Terrain HIGH
Water Quality HIGH
Foliage HIGH
Grass HIGH
Texture HIGH
Detail Level HIGH

World MAP is not stuttering, with these settings in DX11 it is fine.

Standing still in Novigrad in clear, sunny weather: 75 FPS (limit 144?)
Standing still outside Heatherton on the map, clear sunny weather: 75 FPS (limit 144?)
Moving: FPS drops to between 65 and 75

Novigrad at night with rain and wind:
Moving: drops down to 50 FPS, especially near fire.
Fighting also drops the framerate.
Minimal stuttering, but playable.

- rain and wind generally drop frame rates, in Novigrad and outside
- swamps in combination with dust, rain, wind and fighting drop down to 50 FPS, and then stuttering starts

- a clear night or clear sun, in Novigrad or outside, are the best-running conditions
- 75 FPS with minimal drops

Clear night in Novigrad
75 FPS while standing still.
GPU load 75% to 100%.
CPU max 40% in Novigrad.
3.3 GB GPU RAM used.
10.5 GB RAM used.

Clear night in Heatherton with wind
75 FPS while standing still.
GPU always at 100% load.
CPU max 40%, but turning the camera or switching weapons pushes it up to 80-100% for a brief second.
3 GB GPU RAM used.
10.5 GB RAM used.

Going from Border Post to Novigrad at night with wind
75 FPS while standing still.
GPU always at 100% load.
CPU 40%, but goes to 80-100% while switching the camera or turning around while fighting.
3.1 GB GPU RAM used.
10.6 GB RAM used.

Going from Border Post to Novigrad at night with wind and heavy rain
66 FPS while standing still, so there is a big drop here with rain.
It no longer "feels" perfect, but it is playable.
GPU always at 100% load.
CPU 40%, same as above; 80-100% while turning around.
3.1 GB GPU RAM used.
10.5 GB RAM used (up to 10.7 before Portside Gate, looking at the whole town from outside).
(Up to 10.9 GB in Novigrad; frames drop down to 50 FPS here while moving.)

All in all, with my settings it is a perfectly playable game with some improvements still to be made.
It looks very clear and sharp, without too many effects; it looks natural.

All combinations of rain, fire, fog, fighting and wind should be improved in the future.

I do not understand why GPU RAM and system RAM usage are so low.

Hope this helps.
 
I get terrible performance as well. At 4K with DLSS Performance, settings on High, RT on except for RT shadows, I get 40-50 FPS, plus random slowdowns to 10-20 FPS after playing a while. Light flickering across the whole scene also occurs randomly.

RTX 3080, AMD 5800X, 32 GB RAM.
 
I have a Ryzen 3600 with an RX 6700 XT and 16 GB of RAM. Right now I'm playing the DX11 version with Ultra+ settings (except for crowd density, as I don't like bumping into people while moving through the city; they change direction in front of me quite annoyingly). I'm also using an FPS lock of 70 FPS in my graphics card settings, as there is no point generating more frames anyway, and it keeps my card from getting too hot and noisy. When that lock is removed, my card is constantly at 95-99% and reaches 70-75 degrees with the fan running at almost 2000 RPM, which is not something I enjoy, especially since I usually play at night or early in the morning.
Not sure if you are using FSR, but that might bring your GPU utilization down. I cannot get the 3090 to saturate using DLSS 2 with this game while I get 40-70 FPS. Turning DLSS off makes no FPS difference for me, especially in a city area. The game is CPU-bound, so the only thing DLSS is doing is making my GPU work less, because upscaling from a lower resolution takes less horsepower than natively rendering 4K. A 4090 would not help without DLSS 3; it would be waiting on the CPU just the same as the 3090 is. DLSS 3 is not really waiting on the CPU: it uses data from previous frames to create new ones, and the CPU has nothing to do with creating those "fake frames". This is why the 4090 with DLSS 3 is not as impacted by the CPU bottlenecking, I believe. I am not an engineer at Nvidia, but this is roughly how I understand the technology in plain language.

This game really pegs a limited number of cores; it doesn't seem able to use many cores well. The DX12 wrapper business and the failure to spread the load across many cores are both contributing to the performance problems.
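An easy way to confirm this on your own machine is to watch per-core load while playing. A small Python sketch using psutil (pip install psutil); one or two cores pinned near 100% while the rest sit idle is the classic main-thread bottleneck signature:

```python
import psutil  # pip install psutil

for _ in range(10):  # ten 1-second samples while the game runs
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    hot = [i for i, p in enumerate(per_core) if p > 90]
    print(" ".join(f"{p:5.1f}" for p in per_core), f"| cores >90%: {hot}")
```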

Newer games are definitely more demanding on the GPU, generating more heat. That's normal. I remember Days Gone heated up my 3090 like crazy. The exception is the 4090, which keeps cool no matter what game you throw at it.
4090s have way overdesigned coolers. The 3090 Ti drew comparable wattage, and that cooler was enough. I have no idea why the board partners made a bunch of 4-slot coolers. So annoyingly huge. Say goodbye to your other PCIe slots, except maybe the bottom one, which is never a full-bandwidth slot. That's if you can even fit the thing in your case.
 
Not sure if you are using FSR, but that might bring your GPU utilization down. I cannot get the 3090 to saturate using DLSS 2 with this game while I get 40-70 FPS. Turning DLSS off makes no FPS difference for me, especially in a city area. The game is CPU-bound, so the only thing DLSS is doing is making my GPU work less, because upscaling from a lower resolution takes less horsepower than natively rendering 4K. A 4090 would not help without DLSS 3; it would be waiting on the CPU just the same as the 3090 is. DLSS 3 is not really waiting on the CPU: it uses data from previous frames to create new ones, and the CPU has nothing to do with creating those "fake frames". This is why the 4090 with DLSS 3 is not as impacted by the CPU bottlenecking, I believe. I am not an engineer at Nvidia, but this is roughly how I understand the technology in plain language.

This game really pegs a limited number of cores; it doesn't seem able to use many cores well. The DX12 wrapper business and the failure to spread the load across many cores are both contributing to the performance problems.


4090s have way overdesigned coolers. The 3090 Ti drew comparable wattage, and that cooler was enough. I have no idea why the board partners made a bunch of 4-slot coolers. So annoyingly huge. Say goodbye to your other PCIe slots, except maybe the bottom one, which is never a full-bandwidth slot. That's if you can even fit the thing in your case.

Actually, TAAU seems to utilize my graphics card less than FSR+DSR. It's very, very strange...
 
Could you add the same screenshots for DX12 TAAU with RT off, DX12 DLSS Quality with RT off, and DX12 DLSS Balanced with RT on?
Thank you! It looks like, because the game client is CPU-bound, DLSS does nothing at all, and the performance impact of the NG update is huge:
Classic W3 at DX11 Ultra runs with around 60% more FPS than NG at DX11 Ultra+.
Classic W3 at DX11 Ultra runs with around 135% more FPS than NG at DX12 Ultra+ with RT off.
Worst update I have ever seen...
 
Thank you! It looks like, because the game client is CPU-bound, DLSS does nothing at all, and the performance impact of the NG update is huge:
Classic W3 at DX11 Ultra runs with around 60% more FPS than NG at DX11 Ultra+.
Classic W3 at DX11 Ultra runs with around 135% more FPS than NG at DX12 Ultra+ with RT off.
Worst update I have ever seen...
No problem

I have been experimenting (with DX11).
A number of NG's Ultra settings (e.g. grass density, foliage) are set higher than Classic's.
IIRC Classic Ultra Grass Density is the same as NG Medium Grass Density.
NG High Foliage is the same as Classic Ultra.
This explains *some* of the difference in FPS.
Similarly, NG uses slightly higher MeshLOD distances (e.g. 1.5 and 1.3 vs 1 and 1 in Classic); however, to actually make use of these values in Classic you need a mod like Increased Draw Distance, as it cannot be done through user.settings alone.
This again explains *some* of the FPS difference.
The issue of lower GPU usage, and thus lower FPS, remains however.
I have also noticed that NG essentially uses higher-quality shadows for NPCs, similar to a mod like "Lip Movement and HiRes Shadows" (just the shadow part).
This explains a larger part of the FPS difference, but yet again you are left with GPU under-utilization compared to modded Classic.

After creating some extra settings for grass, foliage, shadows etc., above the Ultra+ settings, I found something VERY ODD.
In the first screenshot (NG DX11) I have Grass Density set to Ultra+ (which is 4750).
As you can see, GPU utilisation is low.
However, when I raise Grass Density to 6000 (second screenshot, NG DX11) I get MUCH HIGHER GPU utilisation and MUCH HIGHER FPS (50% more, at least in this location).
It doesn't make sense to me. It's like Grass Density at 6000 gives the game a jolt or something :/


EDIT: Turns out it's not a jolt; GPU usage is just tanking after altering the settings. It goes up again if I set things back to how they were when the game was loaded (or if I reload the save).
Dunno if this is default behaviour; it's probably a result of adding custom settings beyond Ultra+.
Still odd, but not as interesting.
[Screenshots: witcher3_2022_12_21_20_15_36_414.png, witcher3_2022_12_21_20_15_51_584.png]
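For anyone who wants to experiment past the menu caps the way I did, here's a minimal Python sketch of scripting the user.settings change. Two assumptions to verify on your own install: that the file lives at Documents\The Witcher 3\user.settings, and that GrassDensity sits in the [Rendering] section. Back the file up first and run this with the game closed, since the game rewrites the file:

```python
from configparser import ConfigParser
from pathlib import Path

# Assumed default location -- adjust for your install, and back up first.
settings = Path.home() / "Documents" / "The Witcher 3" / "user.settings"

cfg = ConfigParser(strict=False)   # tolerate duplicate sections/keys
cfg.optionxform = str              # keep the game's key capitalization intact
cfg.read(settings)

if not cfg.has_section("Rendering"):
    raise SystemExit("No [Rendering] section found -- file layout differs?")
cfg.set("Rendering", "GrassDensity", "6000")

# Write back without spaces around '=', matching the file's key=value style.
with settings.open("w") as f:
    cfg.write(f, space_around_delimiters=False)
```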
 
No problem

I have been experimenting (with DX11).
A number of NG's Ultra settings (e.g. grass density, foliage) are set higher than Classic's.
IIRC Classic Ultra Grass Density is the same as NG Medium Grass Density.
NG High Foliage is the same as Classic Ultra.
This explains *some* of the difference in FPS.
Similarly, NG uses slightly higher MeshLOD distances (e.g. 1.5 and 1.3 vs 1 and 1 in Classic); however, to actually make use of these values in Classic you need a mod like Increased Draw Distance, as it cannot be done through user.settings alone.
This again explains *some* of the FPS difference.
The issue of lower GPU usage, and thus lower FPS, remains however.
I have also noticed that NG essentially uses higher-quality shadows for NPCs, similar to a mod like "Lip Movement and HiRes Shadows" (just the shadow part).
This explains a larger part of the FPS difference, but yet again you are left with GPU under-utilization compared to modded Classic.

After creating some extra settings for grass, foliage, shadows etc., above the Ultra+ settings, I found something VERY ODD.
In the first screenshot (NG DX11) I have Grass Density set to Ultra+ (which is 4750).
As you can see, GPU utilisation is low.
However, when I raise Grass Density to 6000 (second screenshot, NG DX11) I get MUCH HIGHER GPU utilisation and MUCH HIGHER FPS (50% more, at least in this location).
It doesn't make sense to me. It's like Grass Density at 6000 gives the game a jolt or something :/


EDIT: Turns out it's not a jolt; GPU usage is just tanking after altering the settings. It goes up again if I set things back to how they were when the game was loaded (or if I reload the save).
Dunno if this is default behaviour; it's probably a result of adding custom settings beyond Ultra+.
Still odd, but not as interesting.
Nice observations, but you made a mistake with Grass Density: by default in NG, 6000 is the Uber+ value and 4750 is Uber.

I made a comparison of the GPU impact of the Grass Density setting on my config: RTX 3060 mobile, 5800HS, 32 GB RAM.
On the Low grass setting I got around 50% more FPS compared to Ultra+ in very grassy locations; Medium gives ~40% more FPS, High ~25%, and Uber 10-13%.
For me it still looks good even when set to Low, and on Medium it's almost as good as Ultra+.

If Grass Density on Medium is the same as Ultra on Classic W3, then together with the other upgraded graphics options in NG that sort of explains the performance difference between Classic and NG.
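To turn those percentages into concrete framerates (the 40 FPS Ultra+ baseline is made up purely for illustration; the gains are the ones I measured, with 12% as the midpoint of the 10-13% Uber range):

```python
baseline_ultra_plus = 40.0   # hypothetical Ultra+ FPS in a grassy spot
gains_vs_ultra_plus = {"Uber": 0.12, "High": 0.25, "Medium": 0.40, "Low": 0.50}

print(f"Ultra+ : {baseline_ultra_plus:.0f} FPS (baseline)")
for setting, gain in gains_vs_ultra_plus.items():
    print(f"{setting:7}: {baseline_ultra_plus * (1 + gain):.0f} FPS (+{gain:.0%})")
```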
 

Attachments: two sets of DX11 grass density comparison screenshots (Low, Medium, High, Ultra, Ultra+).
I'm using a 6800 XT with a Ryzen 7 5800X and I'm noticing issues as well at 1440p.

FSR looks great for me, with the exception that Geralt in the Equipment screen is low resolution (he's normal with FSR off), and the minimap is low resolution too.

Another issue I'm noticing with the latest drivers: with FSR on, my GPU only runs at 60% utilization, yet it runs at 90-100% in shop and other menus.

With FSR on Quality I get fewer frames at max settings (no ray tracing) than I do on the default max settings.

Clearly the game still has performance issues that CDPR needs to work on -- but I suppose this could be a driver issue too, although I'm using the latest Adrenalin software and am not running Radeon Boost or anything along those lines.
 
It is in this game. It is CPU-bottlenecked; that was my point. The game in DX12 mode with RT is horribly unoptimized. Yes, the 4090 should crush a 3090 in any game at 4K even without DLSS 3, but it doesn't here. I've seen someone benchmark this game with a lesser CPU than mine but with a 4090, and they get fewer frames than I do with a 3090. The 4090's power is completely wasted in this game, and the only thing that separates it is DLSS 3.
Remember that the RTX 4090 is twice as fast as the RTX 3090 - it is not only down to DLSS 3.0.
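Twice the raw speed only matters if the GPU is the thing being waited on, though. A toy model (illustrative numbers, not a real profiler): a frame ships when the slower of the CPU and GPU sides finishes, so frame time is roughly max(cpu_ms, gpu_ms):

```python
def fps(cpu_ms: float, gpu_ms: float) -> float:
    # A frame ships only when the slower of the two sides is done.
    return 1000 / max(cpu_ms, gpu_ms)

cpu_ms = 16.0   # say one engine thread caps the game at ~62 FPS
for card, gpu_ms in (("3090-ish", 14.0), ("4090-ish (~2x faster GPU)", 7.0)):
    print(f"{card:26}: {fps(cpu_ms, gpu_ms):.0f} FPS")
```

With a 16 ms CPU-side cost, both cards land at the same ~62 FPS, which matches what people are reporting in this game.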
 
The latest patch, released two hours ago, brought improvements on Steam.
See my last post above for hardware details etc.

- the combination of wind, rain, night, fire and fight animations is improved: much better FPS, no more big FPS drops
- less stuttering
- 75 FPS nearly stable

- fight animations with lots of monsters or enemies are improved, fewer FPS drops

Good work.
 
The latest patch, released two hours ago, brought improvements on Steam.
See my last post above for hardware details etc.

- the combination of wind, rain, night, fire and fight animations is improved: much better FPS, no more big FPS drops
- less stuttering
- 75 FPS nearly stable

- fight animations with lots of monsters or enemies are improved, fewer FPS drops

Good work.
Yeah, that was quick, which is greatly appreciated.

Just finished White Orchard, and have started in Velen. Seriously, this is the most beautiful game I've ever played. The Next Gen version on PC, with everything maxed out, looks amazing.

However, with this latest patch the rain droplet screen effect is active even though I've turned off camera lens effects. Guess I have to use a mod now that removes the effect (who wants these rain and mud lens effects anyway?).
 
The 4.0 patch seems to have severely reduced performance with RT enabled, and the latest hotfix released today (12-22-2022) has made it worse. With RT enabled, even with just the "global illumination" setting on and all other RT settings disabled, I'm only getting about 40 FPS. As soon as I turn RT off completely, I get 100+ FPS. This is with every other option maxed at Ultra+, DLSS on the Quality setting, and sharpening on High, running at 4K (3840x2160) in fullscreen mode. I'm using an RTX 3090 with an AMD 7950X CPU. I have attached my dxdiag report in case it's helpful to somebody.
 

Attachment: DxDiag.txt (130.4 KB).
Does anyone have an issue with high GPU usage after Alt+Tabbing to the desktop in DX11? For me, after Alt+Tab the GPU draws about 50 W more and runs ~10°C hotter than the in-game values.
 
One thing I noticed while waiting for the hotfix on PC: I still have one issue that was not fixed by the last two hotfixes. After playing for a while, I suddenly get an FPS drop from 40 to 20, and it keeps getting lower by the minute after that. I'm playing on a 3080 and an i9. Restarting the game, or even going to the main menu and reloading the last save, fixes the issue, but it is still very annoying. Before the hotfixes it happened after about 30 minutes; now I can play up to an hour before this weird FPS drop, but it completely kills the flow of the game. It seems like something is overloading, because as soon as I see the FPS drop I can also see my GPU usage go from 90+% to a constant 99%. I know this is going to be a hard issue to fix because it only appears after such a long play session, but I still hope the devs manage to fix it eventually.
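In case it helps anyone narrow this down, here's a rough Python logger I'd run in the background: it samples GPU utilization and VRAM through nvidia-smi into a CSV, so the moment the drop happens can be matched afterwards against a climbing value, which would be a leak-like signature (Nvidia cards only; assumes nvidia-smi is on your PATH):

```python
import csv
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=utilization.gpu,memory.used",
         "--format=csv,noheader,nounits"]

with open("w3_session_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_s", "gpu_util_pct", "vram_used_mib"])
    start = time.time()
    while True:                 # stop with Ctrl+C once the drop has happened
        out = subprocess.check_output(QUERY, text=True).strip()
        util, mem = [v.strip() for v in out.split(",")]
        writer.writerow([round(time.time() - start), util, mem])
        f.flush()               # keep the file current even if the loop is killed
        time.sleep(5.0)
```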
 