Extremely bad performance on Next Gen Update!

Was really excited for the next gen update to play the new, improved Witcher 3 for the third time on my badass new PC (Ryzen 9 5900X + 32 GB RAM + RTX 3070).
I thought there was no way in hell this release wasn't going to work flawlessly after the Cyberpunk 2077 fiasco. CDPR surely have learned their lesson.
But to my surprise, that is not the case.
This release is rushed again and of course not a good one.

It seems game releases these days are indeed a joke. It doesn't matter that one has a three-thousand-dollar gaming PC; one cannot play the thing when it is released.

I get ~10 FPS in the game at 4K DLSS Performance, RT on (except RT shadows, which are off), with an RTX 3070. Hairworks off. The rest of the graphical settings are on Ultra (not Ultra+), except the last two, which are on Ultra+.
I get weird artifacts on the screen. The image doesn't fill the whole screen but looks kind of cropped.

Q: Anyone got or knows of a fix or workaround?
 

Attachments

  • rsz_screenshot_2022-12-14_151807.png
  • rsz_1screenshot_2022-12-14_152108.png
RTX 3080, 3440x1440. 30-40 FPS with drops... but GPU load always hovers between 70-85%.
Also, cutscenes don't display full screen at ultrawide resolution.
CDPR, please fix it. Such an atmospheric game to play at ultrawide... but black-barred cutscenes are killing that atmosphere...
 
Well, to be frank, it was a free update. However, I know what you are dealing with: in some cutscenes a quarter of the screen goes black for a second and then returns to normal. I notice frame rate drops in some areas, well beyond normal, given that I run a 2060 Super and a Ryzen 7 2700X, so it should run at 60 FPS very smoothly with the update on Ultra. Hopefully CDPR will have a hotfix soon.
 
It's weird to say this, but Cyberpunk 2077 at 4K DLSS Quality with RT now runs much better (stutter-free) on my PC than Witcher 3 at 4K DLSS Performance, which runs very poorly and stuttery with weird graphical artifacts.
Witcher 3 is a game from 2015, while Cyberpunk 2077 is a game from 2020.
This is clearly not optimized and done well.
It should not run worse than Cyberpunk on the same hardware.
 
What I notice is that the GPU isn't fully utilized for me, so general performance is bad. RX 6800 XT, Linux, Wine 8.0-rc1 + vkd3d-proton (latest master).

I'm not using any upscaling or ray tracing. I think something is wrong with the game itself, optimization-wise. It's not saturating the GPU properly.
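For anyone wanting to see that under-utilization in numbers on the amdgpu driver, here is a rough sketch that polls the kernel's busy counter while the game runs (it assumes card0 is the discrete GPU; the index can differ on multi-GPU systems):

```python
# Poll amdgpu's GPU utilization counter once per second. If this sits
# well below ~95% while the game is rendering, the GPU is not being fed
# properly and the bottleneck is elsewhere.
import time
from pathlib import Path

BUSY_FILE = Path("/sys/class/drm/card0/device/gpu_busy_percent")  # amdgpu only

while True:
    print(f"GPU busy: {BUSY_FILE.read_text().strip()}%")
    time.sleep(1)
```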
Post automatically merged:


The upgrade was not natively designed for DX12. It's running on a DX12 wrapper.
That might explain why performance is bad, yeah. But it would also mean there is very little chance of it ever being fixed.

I was able to start the game on Linux, though, unlike what that reviewer said, but performance is worse than before.
Post automatically merged:

Wow, I just tried the DX11 variant of the new version - performance is so much better!
 

The upgrade was not natively designed for DX12. It's running on a DX12 wrapper.
Exactly, and that is what causes so much CPU/GPU overhead plus extra memory consumption. For me it's a disaster. I wasn't expecting 144 FPS on my 3080 + 10700K, but also not 30 (!!!!!) FPS in denser forests with a couple of enemies on screen... Come on... And this at 1440p + DLSS Quality.

I'm glad this update was free, but it's clear to me they focused on the console release this time.
 
I get ~10 FPS in the game at 4K DLSS Performance, RT on (except RT shadows, which are off), with an RTX 3070. Hairworks off. The rest of the graphical settings are on Ultra (not Ultra+), except the last two, which are on Ultra+.
I get weird artifacts on the screen. The image doesn't fill the whole screen but looks kind of cropped.

Q: Anyone got or knows of a fix or workaround?
Why are you expecting a 3070 to run 4k raytracing? Nvidia themselves told you it wouldn't do that.

There are problems with the (completely free) update, notably memory leaks slowing the game to single digit frame rates after fast travel and some serious texture flicker problems, but there does still need to be some realism about raytracing and the hardware people are expecting to do it. The 3070 is not a 4K raytracing card. It is barely a 1440p raytracing card.
 
My conclusion - just play the update in DX11 mode with Ultra+ settings and forget about the DX12 one until it gets fixed (if ever). The DX12 version is just badly designed and not optimized to use the GPU fully, which results in poor performance.
 
Guys, you can run this game in DX11 just by using the witcher3.exe in the x64 folder instead of the x64_dx12 folder. Put your Vulkan wrapper DLLs in the x64 folder, make a shortcut for the witcher3.exe in that directory, move it to the desktop, and run it from there. Or, if you need Steam, try to point Steam at the EXE in the x64 folder.
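Roughly, the launch step looks like this (just a sketch - the install path below is made up, so adjust it to your own; it simply starts the DX11 binary with the x64 folder as the working directory so any wrapper DLLs placed there get picked up):

```python
# Hypothetical install path - change it to wherever your copy lives.
import subprocess
from pathlib import Path

GAME_X64 = Path(r"C:\Games\The Witcher 3\bin\x64")  # DX11 binary folder (not x64_dx12)

# Start witcher3.exe with cwd set to the x64 folder so the game loads
# any wrapper DLLs dropped next to the executable.
subprocess.run([str(GAME_X64 / "witcher3.exe")], cwd=GAME_X64, check=False)
```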

With RT off, TAAU, everything else set to Ultra+ and DLDSR enabled, I'm running the game on a 5800X and an RTX 3070 Ti @ 1440p and averaging 115 FPS with worst-case lows in the 90s. With only RTSSGI enabled and DLSS Balanced it drops down to a 56 FPS average with lows down to 40. However, if I enable RTSSGI from GeForce Experience, which doesn't even use the RT cores or DLSS (and I have DLDSR enabled), I average 75 FPS with lows still above 60, so I just locked the game to 60 FPS and it's smooth and stutter-free.

BTW, that joker is wrong about it being CPU bottlenecked. If it were actually CPU bottlenecked, you'd see one or more cores hitting 100% for significant amounts of time. 69% usage on one CPU core while the GPU is at 99% usage is NOT a CPU bottleneck. Yes, there is a problem; yes, using a wrapper isn't the way to go (but really no different from using a Vulkan wrapper on DX11 or DX12); but he is wrong about the CPU bottleneck part.
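For anyone who wants to check this on their own machine, a rough per-core monitor sketch is below (it assumes the third-party psutil package; run it alongside the game and watch whether any single core actually stays pinned):

```python
# Rough per-core CPU monitor for sanity-checking a "CPU bottleneck" claim.
# If no single core sits near 100% for long stretches while FPS is low,
# a classic single-thread CPU bottleneck is unlikely.
import psutil  # third-party: pip install psutil

while True:
    # Blocks for ~1 s and returns one utilization value per logical core.
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(f"hottest core: {max(per_core):5.1f}%   all cores: {per_core}")
```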
 
Guys, you can run this game in DX11 just by using the witcher3.exe in the x64 folder instead of the x64_dx12 folder... Or, if you need Steam, try to point Steam at the EXE in the x64 folder.

I'm using the GOG version on Linux with my own script, so I didn't check that, but I thought Steam already has some kind of launcher selection for the DX11 or DX12 binaries in this release.
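For anyone wanting to do something similar, a launch script along those lines looks roughly like this (only a sketch, not the exact script - the wine prefix and install paths are made-up placeholders, and it assumes wine plus vkd3d-proton are already set up in that prefix):

```python
# Rough GOG-on-Linux launcher sketch. Paths are placeholders; adjust
# WINEPREFIX and the install directory to your own setup.
import os
import subprocess
from pathlib import Path

env = dict(os.environ)
env["WINEPREFIX"] = str(Path.home() / "Games/witcher3-prefix")

# Point at the DX11 binary folder; use bin/x64_dx12 for the DX12 build.
game_dir = Path.home() / "Games/The Witcher 3/bin/x64"
subprocess.run(["wine", "witcher3.exe"], cwd=game_dir, env=env, check=False)
```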
Post automatically merged:

BTW, that joker is wrong about it being CPU bottlenecked. If it were actually CPU bottlenecked, you'd see one or more cores hitting 100% for significant amounts of time...

Yeah, I also noticed that the CPU wasn't maxed out for me. I think something is just not parallelized properly to saturate the GPU, unlike the DX11-only version.
 
Hmm, this game worked flawlessly on a 1070 Ti at 1080p without any crashes during hundreds of play hours... now the next gen version crashed twice in the first 10 minutes, the environment looks blurry in motion (no matter whether I switch the blur option on or off), and the whole game feels weird, worse. At this moment I'm reverting to the previous version (via the GOG launcher) and will very likely stay with it. I looked forward to this update, but it seems it is aimed at users with high-end PCs and big(ger) resolutions.
 
Hmm, this game worked flawlessly on a 1070 Ti without any crashes during hundreds of play hours... now the next gen version crashed twice in the first 10 minutes, the environment looks blurry in motion, and the whole game feels weird, worse...
I agree with this. At just over 1080p (1920x1200) on an RTX 3070, things just feel a bit, I don't know, "off". I also walked into the Passiflora and could barely see, so extreme was the lighting falloff between lit and unlit areas (and that's not a gamma issue). I love that they've done this, but I don't think it's to my taste. That said, they do seem to have turned down the absurdly over-the-top colour grade in Blood and Wine a little bit, which used to make reflections on some armour turn off-puttingly green and make Witcher Senses nigh-on impossible to see. So that's a plus.
 
Why are you expecting a 3070 to run 4K raytracing? Nvidia themselves told you it wouldn't do that... The 3070 is not a 4K raytracing card. It is barely a 1440p raytracing card.
Firstly,
I run the game not at native 4K with full RT but at 4K DLSS Performance with RT (2 out of 3 options). So it is not really 4K; DLSS is upscaling (at Performance mode, a 4K output is rendered internally at 1920x1080).

Secondly,
I can run Cyberpunk 2077 at 4K DLSS Quality (not Performance) with RT (2 out of 3) at 30-45 FPS.

I don't need 60 FPS. I play most single-player games with a gamepad, not a keyboard.

Q: Considering Cyberpunk 2077 is a 2020 game and Witcher 3 is a 2015 one, I figured I should have at least similar performance, no?
Q: A 2020 game with RT should be harder to run than a 2015 game with RT, not the other way around, no?
 
You call that extremely bad performance? Hold my beer ;)

20221214_182816.jpg
(sorry, there was only garbage on the screenshot, so I took a picture.)
Yup, 2-4 FPS. That's an i7-13700K, RTX 3080 12 GB, 1440p, Uber preset, RT all on, HairWorks off.

To be fair, though, it works decently when it doesn't bug out. Here's the same place when it works properly:
obraz.png
When it works, it runs at 50-70 FPS at these settings, mostly around and a little above 60.

The problem is that it sometimes drops to these unplayable framerates for no apparent reason. It's usually in the 10-20 range, but it sometimes drops down to 2 FPS, just like in the image above. The issue won't go away until I restart the game. Sometimes I have to move to a different place and save again (I've got a save in White Orchard that triggers the issue pretty reliably).

The framerate issues are always accompanied by graphical glitches (wide red borders around textures, geometry going haywire, lighting disappearing, etc).

Works fine with RT disabled, though.
 
You call that extremely bad performance? Hold my beer ;) Yup, 2-4 FPS... When it works, it runs at 50-70 FPS at these settings... The framerate issues are always accompanied by graphical glitches... Works fine with RT disabled, though.
I did not play enough. Maybe if I had, I would have arrived at 0 FPS.
Imagine paying hard-earned money (thousands of dollars on PCs with overpriced components and hundreds of dollars on games) just to be greeted with 3 or 10 FPS.
The feeling is simply one of disgust.
 
From what I can see from my tests, DLSS seems to be broken. Changing from Quality to Balanced and then to Performance does not increase my FPS.

This is with RT on, on a rig with a 3080, a Ryzen 5900X and 32 GB RAM.

I hope that CDPR reads this and fixes DLSS.
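If anyone wants to quantify that rather than eyeball it, one rough way is to capture a frametime log per DLSS preset and compare the averages. A sketch is below; it assumes a PresentMon-style CSV with a column of milliseconds between presents, so rename the column constant to match whatever your capture tool writes:

```python
# Compare average FPS across frametime captures, one CSV per DLSS preset.
# FRAMETIME_COL is an assumption (PresentMon-style); check your CSV header.
import csv

FRAMETIME_COL = "msBetweenPresents"

def avg_fps(csv_path: str) -> float:
    with open(csv_path, newline="") as f:
        times_ms = [float(row[FRAMETIME_COL]) for row in csv.DictReader(f)]
    return 1000.0 * len(times_ms) / sum(times_ms)

# Hypothetical capture filenames - one recording per preset.
for preset, path in {"Quality": "dlss_quality.csv",
                     "Balanced": "dlss_balanced.csv",
                     "Performance": "dlss_performance.csv"}.items():
    print(f"{preset:12s} {avg_fps(path):6.1f} FPS")
```

If the three averages come out essentially identical while the GPU is the limiting factor, that does point at the upscaler not actually lowering the internal resolution.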
 