Performance issues with Witcher 3 ver 4.0

FSR is AMD technology, not NVIDIA's. I believe that to get proper scaling and performance on NVIDIA cards you need to use DLSS.
FSR is broken, just a blurry mess right now; it needs a hotfix.

Even on DX11 the performance difference is huge, regardless of settings.
Not sure an extra few props and some distant terrain lighting (or fog extension) is worth nearly halving the framerate. Ditto playing on the lowest settings possible and still losing frames...
Something under the hood is broken. GPU, VRAM, RAM usages are all weird.
Comparison 1.32 Ultra (with HD Reworked mod) vs 4.00 Ultra vs 4.00 Low screens
Great example. Something is not right, because the FPS drop on DX11 is far too big for no apparent reason.
 
The game appears to be CPU limited. DLSS 2 hardly makes a difference, and DLSS 3 is just faking performance with generated frames that most likely look like crap (from what I have seen, DLSS 3 creates a lot of noticeable artifacts, which may be where some of the visual glitches people are noticing come from). This is why the 4000 series can survive how obviously un-optimized this game is. They just chucked the game in a DX12 wrapper instead of porting it natively (which seems really lazy, especially given two years). The CPU gets pegged with all the overhead and the GPU goes under-utilized because the CPU can't keep up. I can't recall ever seeing a game that pegs the CPU the way this one does now; people are posting videos of the latest CPUs being maxed out by it.
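If anyone wants to sanity-check the bottleneck on their own machine, here is a rough Python sketch that logs the busiest CPU core against GPU utilization while the game runs. It assumes an NVIDIA card and the psutil and nvidia-ml-py (pynvml) packages; treat it as a quick diagnostic, not anything official:

```python
# Rough bottleneck check: log the busiest CPU core vs GPU utilization once per second
# while the game runs. Assumes an NVIDIA GPU plus the 'psutil' and 'nvidia-ml-py' packages.
import psutil
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        per_core = psutil.cpu_percent(interval=1, percpu=True)  # blocks ~1s, returns % per core
        util = pynvml.nvmlDeviceGetUtilizationRates(gpu)
        print(f"busiest CPU core {max(per_core):5.1f}%  |  GPU {util.gpu:3d}%")
        # A core pinned near 100% while the GPU sits well below ~95% points to a CPU bottleneck.
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```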

Geralt also has a terrible movement delay on PC. From a standstill he will not move immediately; you have to wait for the camera to move into position before he starts moving. It is so noticeable that, IMO, it makes the game unplayable for me even if I can scrape together 60 fps with a god-level PC.
 
I was hoping the hotfix update would have fixed my grass. I am grateful I can meditate again.

(Screenshot attached: screenshot_93115.png)
 
The fog around the orphanage in the swamp clips through the water in an ugly way (no ray tracing enabled).

Specs, if it helps:

Ryzen 7 3700X
RTX 3070
32 GB 3200 MHz RAM
 

Attachments: Inkedscreenshot_181223.jpg, Inkedscreenshot_181227.jpg
The hotfix released today unfortunately does not fix the movement delay Geralt has in certain situations. I also did not notice an increase in FPS, playing on a 3090 with DLSS 2 set to Quality.
 
I thought I was going crazy about Geralt's movement at first; I'm also getting that delay.
There is a growing topic on the Steam forums regarding this issue: https://steamcommunity.com/app/292030/discussions/0/

It seems like a camera script is delaying Geralt's movement. We don't need the camera to auto-center (or whatever it is trying to do) before he moves. I have played with every camera option there is (including auto-center) and none of them fix the problem. It is not input latency.
 

There appears to have been a small performance bump with today's hotfix. This is potentially good news for future patches bringing better performance! Same max settings, 1440p, DLSS Quality, Ryzen 9 5900X / RTX 3080, 32 GB 3200 MHz RAM and a Samsung 970 NVMe. GPU utilization is roughly 10% higher on average.

Original post pre-patch: https://www.reddit.com/r/Witcher3/comments/zllc1l. To be clear, the game still needs work, as performance still isn't where it should be given the hardware. The GPU still isn't being fully saturated and the CPU also still mostly sits idle, but overall it seems more stable, with fewer crashes, around 5-10% better performance and around 10% more GPU usage than before!

(Screenshots attached: Pre-Patch.png, Post-Patch.png)
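For anyone who wants numbers instead of eyeballing a "small bump", you can capture frame times (PresentMon, CapFrameX, etc.) before and after a patch and compare averages and 1% lows. Rough Python sketch below; the MsBetweenPresents column name and the pre_patch.csv / post_patch.csv file names are just my assumptions for a PresentMon-style export, so adjust them for your capture tool:

```python
# Compare average FPS and 1% lows from two frame-time captures (PresentMon-style CSVs).
# 'MsBetweenPresents' and the file names are assumptions; rename to match your capture tool.
import csv
import statistics

def fps_stats(path):
    with open(path, newline="") as f:
        frame_ms = sorted(float(row["MsBetweenPresents"]) for row in csv.DictReader(f))
    avg_fps = 1000.0 / statistics.mean(frame_ms)
    slowest = frame_ms[int(len(frame_ms) * 0.99):]   # the slowest 1% of frames
    low_1pct_fps = 1000.0 / statistics.mean(slowest)
    return avg_fps, low_1pct_fps

for label, path in (("pre-patch", "pre_patch.csv"), ("post-patch", "post_patch.csv")):
    avg, low = fps_stats(path)
    print(f"{label}: {avg:.1f} fps average, {low:.1f} fps 1% low")
```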
 

Here's the specific thread I've been following: https://steamcommunity.com/app/292030/discussions/0/3731826212313771663/?ctp=4

This movement delay and the music disappearing after doing the Netflix quest are the two most annoying bugs I've encountered in 4.0.0; hopefully both get resolved quickly whenever CDPR puts out the first big patch.
 
With the patch it seems slightly better. One thing I noticed with my RTX 3080 (4K, RT, settings a mix of Ultra and Ultra+): 35-50 fps.
After a few minutes of playing, I have already maxed out my VRAM and my system RAM keeps climbing. Previously, with the old version plus all the texture mods and around 50 other mods, I might peak at around 13-14 GB of total system usage. With this new next-gen version, my system RAM keeps climbing to over 20 GB. That is 7 or 8 GB higher than before, and even way higher than Cyberpunk at 4K RT settings.

Is the VRAM overflowing into system RAM and causing the card to slow down immensely? When the system RAM climbs, I easily drop 10-15 fps overall. We all know how much slower a GPU runs if it has to use system RAM instead of VRAM exclusively.

Any thoughts?
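One way to test the spill-over theory is to log dedicated VRAM next to system RAM once a second while playing and see whether RAM only starts climbing after VRAM hits the cap. Rough sketch, same NVIDIA-only pynvml/psutil assumptions as earlier in the thread:

```python
# Log dedicated VRAM vs system RAM once per second to spot VRAM filling up and spilling over.
# NVIDIA-only; needs the 'psutil' and 'nvidia-ml-py' (pynvml) packages.
import time
import psutil
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)
GIB = 1024 ** 3

try:
    while True:
        vram = pynvml.nvmlDeviceGetMemoryInfo(gpu)   # .used / .total in bytes
        ram = psutil.virtual_memory()                # .used / .total in bytes
        print(f"VRAM {vram.used / GIB:4.1f}/{vram.total / GIB:.1f} GiB  |  "
              f"RAM {ram.used / GIB:5.1f}/{ram.total / GIB:.1f} GiB")
        # VRAM pinned at the cap while RAM keeps growing makes overflow a likely suspect.
        time.sleep(1)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```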
 
Still sometimes seeing only 75 to 80% GPU utilization. My i7-9700K should normally handle more than 40 to 50 FPS.
Using Ultra Performance DLSS on my 3080 and still not hitting 50 FPS sucks.
 
Sadly, I think I am just going to downgrade my copy of The Witcher 3 and mod it. I've been waiting two years to play this game with next-gen quality, and the way this was handled is sloppy. I don't think they can fix these issues quickly because of the shortcuts they took on the PC version. It's a very flawed way of doing things, with ridiculous CPU overhead that will always destroy performance. This needs to be properly ported to DX12, not basically emulated.
 
Could you add the same screenshots from DX12 with TAAU and RT off, DX12 with DLSS Quality and RT off, and DX12 with RT and DLSS Balanced?
 
Hey! I read about all the issues after the initial next-gen update, so I decided to wait for a fix before trying it. I just fired up the game for the first time today, disabled all my old mods, and it started right up. I have a GTX 1070, and before launching I went into the graphics settings and turned all ray tracing off. After I entered the game, performance was chugtastic, so bad in fact that I didn't even bother looking at the frame rate. I went back into the graphics settings and full RT had re-enabled itself. Turned it off and the game was smooth as butter. Just a suggestion: make sure RT hasn't re-enabled itself if you want it off. If you do want it on, maybe turn RT completely off for now and try the game again. It ran great for me in DX12 with my 1070 after I did that.
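If you want to double-check outside the game, the next-gen build keeps its DX12 graphics options in a plain-text settings file under Documents (dx12user.settings on my install). I'm not sure of the exact key names, so this rough sketch just prints any line that looks ray-tracing related so you can confirm RT really stayed off; the path and the "raytrac" match are my assumptions:

```python
# Print any ray-tracing related lines from the next-gen DX12 settings file so you can
# confirm RT stayed off. Both the path and the 'raytrac' substring match are assumptions;
# adjust them to whatever your install actually uses.
from pathlib import Path

settings = Path.home() / "Documents" / "The Witcher 3" / "dx12user.settings"

for line in settings.read_text(encoding="utf-8", errors="ignore").splitlines():
    if "raytrac" in line.lower():
        print(line.strip())
```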
 
The low GPU utilization issue is still there after the hotfix. It is slightly better in the open field, where utilization can sometimes reach 85%, but in the city it is still only 50 to 60% on my PC.
 
Strange how I have the completely opposite experience... I have high GPU utilization, while my CPU is sitting at 8-10% the whole time...
 
Depends on your hardware. If you have a high-end card, this game will not saturate the GPU and you still get poor performance with RT and DLSS 2. The reason is they didn't natively port to DX12, and the CPU has a ton of overhead running through a DX12 wrapper instead. The CPU is the bottleneck. This is why someone with a 4090 and someone with a 3090, using DLSS 2 and the same CPU, will get the same performance in most situations in this game. The 4090 can only pull away using that gimmick, DLSS 3. DLSS 3 is only viable when you can already push a lot of frames. It introduces latency penalties despite generating more frames, and the quality of the generated frames is poor when scrutinized, but the issues with DLSS 3 are less apparent when you are going from something like 100+ fps to something even higher. Using DLSS 3 to get from 60-70 fps to somewhere higher is probably not ideal. I don't have a 4090 yet, but I currently have a 3090, and I will be seeing the difference (or lack thereof) for this game soon, since I have a 4090 on order. I will try out DLSS 3 for myself, but honestly Hardware Unboxed and other tech YouTubers haven't really been impressed with the technology unless it's used as I described, and this game does not fit that scenario because everyone gets CPU-capped performance using DLSS 2 with RT.
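Back-of-the-envelope on why frame generation helps less in a CPU-capped game: assuming input latency still roughly tracks the frames that are actually rendered rather than the displayed count, doubling 45 fps feels very different from doubling 120 fps. Illustrative numbers only:

```python
# Illustrative arithmetic only: frame generation roughly doubles *displayed* frames,
# but responsiveness is assumed here to track the frames actually rendered.
def frame_gen_estimate(rendered_fps):
    displayed_fps = rendered_fps * 2              # one generated frame per rendered frame
    real_frametime_ms = 1000.0 / rendered_fps     # latency scales with this, not the counter
    return displayed_fps, real_frametime_ms

for fps in (45, 120):
    shown, frametime = frame_gen_estimate(fps)
    print(f"{fps} fps rendered -> ~{shown} fps displayed, ~{frametime:.0f} ms per real frame")
# 45 fps rendered shows "90 fps" on the counter but still feels like ~22 ms frames;
# at 120 fps rendered the extra smoothness costs far less in relative responsiveness.
```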
 