AMD and Nvidia Ray Tracing Needs Fixing in The Witcher 3 Version 4.0

I'm starting to think VRAM and RAM in general can be an issue. I see over 20 GB of VRAM usage and up to 25 GB of RAM usage, and that's only a few minutes into a session. I'm thinking anyone with 16 GB of RAM might have issues, especially with a regular HDD. The same goes for VRAM on the GPU: resolution and ray tracing are super taxing on VRAM. My 4090 has plenty of VRAM, but it still gets filled. That's fine for me, of course, but less good for those with much less VRAM and RAM.
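If you want to keep an eye on this yourself, here's a minimal monitoring sketch in Python (an illustrative helper of my own, assuming the third-party psutil package is installed and Nvidia's nvidia-smi tool is on your PATH); run it in a terminal while playing and watch the numbers climb:

```python
# Log system RAM and GPU VRAM usage once per second.
# Assumes `pip install psutil` and that nvidia-smi is on PATH.
import subprocess
import time

import psutil

def vram_used_mib() -> int:
    """Query used VRAM in MiB from the first GPU via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.splitlines()[0])

while True:
    ram = psutil.virtual_memory()
    print(f"RAM: {ram.used / 2**30:.1f} GiB used ({ram.percent}%) | "
          f"VRAM: {vram_used_mib()} MiB")
    time.sleep(1)
```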
 
Hello everyone! I too am having horrible performance with DX12 on a beefy machine (specs below) after update 4.0, so I'm coming over from Reddit with a full, detailed post covering my tests, theories, and a complete breakdown. Here we go.

TL;DR: The game's DX12 implementation is 100% broken, causing awful performance and scaling along with stutters and graphical bugs. The game also seems to utilize only two CPU cores, which leaves every GPU starved (with the exception of 40-series cards using DLSS 3 frame generation, and some 4K setups) no matter what you do. The DX11 implementation works much better, but you lose DLSS/FSR/RT, the selling points of the update. Lastly, Alex from Digital Foundry is going to do a video and has already tweeted that the patch definitely has issues, with the CPU being a problem.
I also linked some tweets/articles/videos at the bottom confirming the issues and showing this isn't just ranting from a handful of people.

My original Reddit post is here: https://www.reddit.com/r/Witcher3/comments/zllc1l
So to begin, I believe I've narrowed down the two biggest issues.

First: the game seems to use only two of my 5900X's 12 cores (Cyberpunk also had this issue originally, to a lesser degree). I've never seen the game's CPU usage above 6%, and I commonly see it at 1-2%. The underutilized CPU means the game is starving our GPUs, causing them to be underutilized too. For GPU usage I've seen as low as 45% and as high as 85%, but never anything close to 100% unless playing on DX11. In other words, the game's poor CPU optimization keeps our beefy GPUs from being fully used, killing performance while we still have hardware headroom left.
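The math backs this up: the 5900X exposes 24 hardware threads, so two fully loaded threads read as roughly 2/24 ≈ 8% on an overall-usage graph, which is exactly how a hard CPU bottleneck can hide behind a "1-6% CPU" number. If anyone wants to verify the per-core picture on their own machine, here's a small Python sketch (my own illustrative helper, assuming the third-party psutil package is installed); run it alongside the game and a two-thread bottleneck shows up immediately:

```python
# Per-core CPU monitor: overall usage can look tiny while one or two
# logical cores are pegged. Assumes `pip install psutil`.
import psutil

while True:
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # % per logical core
    overall = sum(per_core) / len(per_core)                   # what usage graphs show
    pegged = [f"core{i}: {p:.0f}%" for i, p in enumerate(per_core) if p > 80]
    print(f"overall {overall:5.1f}% | saturated: {', '.join(pegged) or 'none'}")
```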

This isn't just a me issue; several reports now claim the same thing, and I've done the testing and posted a screenshot in the Reddit post linked above, so I believe this is what's likely happening.

Secondly: the game's DX12 implementation is straight-up borked on PC, and I'm even reading reports of the same thing on the current-gen Xbox, since it also uses DX12. This is causing horrible scaling of settings, lower-than-expected performance, stutters, and possibly the graphical glitches and crashing.

How did I come to this theoretical conclusion?

To start, my rig: Asus Crosshair VIII X570 (latest BIOS 4201), Ryzen 9 5900X with a slight OC to 4.95 GHz under a 360 mm AIO cooler, Asus ROG Strix RTX 3080 with a slight factory OC, 32 GB of 3200 MHz RAM, and the game installed on a 1 TB Samsung 970 Evo NVMe drive. I'm on the latest Nvidia drivers (527.56) and the latest Windows 11 (Home, version 22H2, installed 10/19/2022, OS build 22621.963, Windows Feature Experience Pack 1000.22638.1000.0). Note that this Windows 11 install is less than two months old and was a full fresh install, not an upgrade from Windows 10, and the game is freshly installed and was never modded, thanks to my recent wipe when moving to Windows 11. Hardware-accelerated GPU scheduling is also on.

Now some background: I run all my games more than fine (including with reasonable RT) and have no complaints, but to keep it short we'll use OG 1.32 Witcher 3 and Cyberpunk 2077 as reference points. This same rig did 1440p maxed at 140+ fps in Witcher 3 1.32, HairWorks and everything.
In Cyberpunk 2077, with all settings maxed minus the two Psycho options and all RT on, at 1440p with DLSS Balanced, it held a locked 60 from day one through the most recent patch.

OK, next-gen Witcher 3. To break this down: if I use the lowest possible settings on DX12 at 1440p with DLSS Performance, I get 80-100ish fps or less. Going back to my background info, this same machine could easily do 1440p maxed at 140+ fps on 1.32, but now, at all-lowest settings, 1440p, DLSS Performance (or TAAU with resolution scaling), I get 80-100 fps...

I cannot make this up: I am somehow getting a lot less fps at all-low settings on DX12 than on maxed-out 1.32 with the same rig...
If I turn on all RT with every other setting at low and DLSS Ultra Performance, I get 35-50 fps!
At all-lowest settings, 1440p, DLSS Ultra Performance, with only RTGI enabled, I get around 42-55 fps!
Max the game out at 1440p with DLSS Quality: 32-40 fps! Whatever I try, I can't reach 60 fps with any RT enabled.
I can't get 60 fps at Ultra/Ultra+ without RT either. It basically takes all medium-to-low settings, no RT, and DLSS Performance to net 60-75 fps, and even then with sub-60 1% lows and insane input delay. Anything else I try gets me under 60, down to as low as 32 when running maxed, on my monster rig (only 1.78% of 30+ million Steam users have an RTX 3080), the same rig that runs Cyberpunk 2077 at Ultra with RT, 1440p DLSS Balanced, at a locked 60 fps. Make it make sense, please!

Now we can show it's bad DX12 by switching to DX11 and maxing everything out. Obviously there's no DLSS or RT, but with everything maxed, meaning the new Ultra+ settings and TAAU with no scaling at native 1440p, I'm now getting 100-110ish fps. Still not as much as maxed 1.32, but remember: Ultra+ settings, the new screen-space reflections, TAAU, the UHD textures, etc.

That DX11 fps makes far more sense than all-low, no-RT DX12 topping out at 100 fps but hovering around 80-90. If you match the game's settings as closely as possible to OG 1.32 on DX11, you get almost the same performance (probably around a 10-15% loss) despite the higher-quality graphics options that can't be matched; I was getting around 110-135ish fps. Not as much as OG 1.32, but close enough, the scaling works, and it all makes sense.

So DX12 is borked. Scale everything down to low and you still get far fewer fps than maxed DX11 or maxed OG 1.32; max out DX12 and the game is unplayable, getting half the fps of Cyberpunk 2077 maxed...

Lastly on performance: the game refuses to fully utilize the GPU and CPU. GPU usage I measured:
- Maxed DX12, DLSS Quality: 80-85%
- Maxed DX12, DLSS Ultra Performance: 50-55ish%
- Maxed DX12, TAAU: 45-50ish%
- Lowest settings, DX12: 40-45%
All the while, performance ranged from 30ish fps at worst to 100 fps at best, yet I still had hardware left to use...?

DX11 has no such issues: 99-100% usage, and both the game and the fps scale properly. It's just DX12.

At this point I'm just waiting for Alex from Digital Foundry's full tech review to confirm or elaborate on my findings. I've learned most of what I know from him over the years, and he's the wizard when it comes to stuff like this.
The devs have already tweeted that they're investigating the issues, so the game is shelved for me until a patch is released.

For clarification, I'm not mad, just sad. I was really looking forward to this; it's my favorite game of all time, and I have a beefy enough machine that I should be able to run it mostly maxed at 1440p, DLSS Balanced, 60 fps, like Cyberpunk.

Thank you all for the read, and for those who aren't having issues and are enjoying the game: good luck on the path!
Sources and good reads:
Alex's tweet (Digital Foundry):
John's tweet (Digital Foundry):
CDPR tweet:
Marcin's tweet (CDPR Global Community Manager):
Videos and third-party benchmarks showing exactly what I'm describing:

Reports of issues:
https://www.rockpapershotgun.com/th...update-is-borked-so-heres-how-to-roll-it-back
https://www.gamesradar.com/witcher-3-next-gen-update-pc-players-say-it-runs-terribly/
https://kotaku.com/witcher-3-next-gen-update-pc-gaming-ps5-xbox-series-1849893251/amp
https://www.pastemagazine.com/games/witcher-3-update-pc-issues/
https://www.dsogaming.com/news/the-...-another-cyberpunk-2077-buggy-mess-at-launch/
 
Yeah, I saw on Reddit today, on your post, that one douchebag was claiming you must have an RTX 4090 to run this update properly. :ROFLMAO: I totally agree with you, and like I said in my previous comments, something is definitely wrong with DX12 and this update performance-wise. I also have a 5900X, but paired with a 3080 Ti, and at 1080p in DX12 mode my game stutters a lot.
Thank you for your deep analysis; let's hope they fix this.
 
I feel like a broken record, but I LOVE this game so much and want to see it fixed!! So I will continue to make as much noise as possible in hopes that my testing and theories help the devs! :))
 
BRAVO! I love how detailed this post is. Check out mine in White Orchard: 99% GPU and 18% CPU usage at 1440p, AMD FSR 2 set to Quality, dynamic scaling off, settings maxed out, and RT off.
 

Attachments

  • 1440P AMD FSR 2 MAXED SETTINGS DX12 RT OFF.png (5.7 MB)
You're better off than I've been :/ but White Orchard runs leaps and bounds better than the full open world; Novigrad, Skellige, and Toussaint in particular destroy fps, like I wrote above.
 
I haven't gotten there yet, but when I do I'll report back with another screenshot. Nonetheless, I really do hope they fix the DX12 and RT issues rather than pretend it's an isolated issue with Intel GPUs.
 
It's not so much a performance issue in my case; with an RTX 4090 and a 13900K the game runs pretty well with all the next-gen effects turned on. It's not a racing game, so 50-60 fps is fine for me. The problem is that the ray tracing effects are applied incorrectly. The HairWorks effect also has strange anomalies on Geralt, for example: his hair and beard turn from white to black/gray.

Turning off the ray-traced effects or returning to the original version makes no sense; that would just be playing the game I already played years ago.

It is a next-gen update, so if you have a CPU and an Nvidia GPU from years ago, you don't get to play the game with everything set to max.
 
Bullsh*t. The game uses CPU resources very poorly; it basically uses only two threads. It runs decently exclusively through brute force and Nvidia's latest trickery, which is frame generation. Please keep that kind of elitist speech to yourself, lol.
 
Found settings that give me 60 fps everywhere except Novigrad (which still drops to 45 on average due to CPU hammering) while keeping RT enabled, and with perfect frame pacing to boot, yay. And it looks great on my 1080p plasma.


Specs: 5800X3D and RTX 3080 Ti, both at stock.


First, disable vsync and the frame limit in-game, then enable vsync in the Nvidia Control Panel and set a 60 fps limit there as well (see the sketch at the end of this post for why a steady external cap helps frame pacing).

Then start the game and, in the menu, disable HairWorks and RT shadows (they flicker and pop in), leaving RTGI, RTAO, and RT reflections enabled. Resolution 1440p, DLSS Performance.

Detail settings:



With these settings, the game is 100% smooth and well frame-paced everywhere except Novigrad, at least from what I tested.
And I love how it looks.
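For anyone wondering why moving the cap out of the game helps pacing: a frame limiter's whole job is to present frames on a fixed cadence (16.67 ms at 60 fps) instead of whenever the engine happens to finish one. Here's an illustrative Python sketch of that general idea (not the driver's or the game's actual implementation; render() is just a stand-in for one frame's work):

```python
# Toy frame limiter: hold every frame to a fixed 16.67 ms slot so
# frame times stay even regardless of how fast rendering finishes.
import time

TARGET = 1.0 / 60.0  # seconds per frame at 60 fps

def render():
    """Stand-in for one frame's worth of work (variable cost)."""
    time.sleep(0.005)

prev = time.perf_counter()
deadline = prev
for frame in range(10):
    render()
    deadline += TARGET
    remaining = deadline - time.perf_counter()
    if remaining > 0:
        time.sleep(remaining)           # wait out the rest of this frame's slot
    else:
        deadline = time.perf_counter()  # fell behind; restart the cadence
    now = time.perf_counter()
    print(f"frame {frame}: {1000 * (now - prev):.2f} ms")
    prev = now
```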
 
Thanks to you, I finally got RTGI to work. This screenshot is DX12 with RTGI on at 1080p, AMD FSR 2 set to Ultra Performance, medium settings, dynamic resolution on, hair and all extras disabled. Frame times are terrible and so is the performance, despite AMD FSR 2 essentially doing nothing to improve it.
 

Attachments

  • DX12 RTGI ON 1080P MEDIUM SETTINGS AMD FSR 2 ULTRA PERFORMANCE.png (4.4 MB)
This actually helped me a lot. Thank you!
 
Man, those frame times make me seasick and nauseous. I had frame times like that too when I first installed this yesterday, before I found out that HAGS together with frame generation did wonders.
 
Unfortunately, HAGS did nothing for me, but then again, frame generation is an Nvidia feature. And the frame times are indeed sickening; even Cyberpunk 2077 at launch didn't run this horribly with RT on.
 
On top of all the other RT issues, the ray tracing only knows about the sky as a light source. Flames, candles, magic... nothing else contributes as a light source to RT, be it for shadows or global illumination. So in reality this should be performing much better, lol, since it doesn't have to worry about any lights other than the sky! But nah. The reason is that they didn't take the time to go back through the game and fix up every light source to work with RT. They basically just slapped RT onto the sky lighting and called it a day.
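In renderer terms (a schematic sketch of the general technique, not CDPR's actual code): an RT lighting pass typically loops over the light sources that have been registered with it and traces shadow rays toward each one, so an emitter that was never registered simply doesn't exist for RT, no matter how bright it looks on screen:

```python
# Schematic RT direct-lighting loop (illustrative only): a point only
# receives light from sources the RT pass was told about.
from dataclasses import dataclass

@dataclass
class Light:
    name: str
    intensity: float

def trace_shadow_ray(light: Light) -> bool:
    """Stub ray query; assume unoccluded for the illustration."""
    return False

def shade_point(registered_lights: list[Light]) -> float:
    total = 0.0
    for light in registered_lights:
        if not trace_shadow_ray(light):  # visible -> contributes
            total += light.intensity
    return total

# If only the sky is registered, candles and flames never light anything:
print(shade_point([Light("sky", 1.0)]))                        # 1.0
print(shade_point([Light("sky", 1.0), Light("candle", 0.3)]))  # 1.3
```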
 
The whole game needs a performance fix, not just ray tracing. The game runs worse than 1.32 even on DX11.
That's because even DX11 got the Ultra+ settings; the entire game was updated with much larger textures, so it's more demanding on hardware than before.
 
Any news about a release date for the first performance patch?

They needed a year of hard work to mess up a perfectly working game by adding five mods or so and RT (after two years of experience fixing it in Cyberjoke 2077). Give them time.
 
First things first, thanks to CDPR for this great and free update. I've started playing the game for the fourth time (~150 hours per playthrough) and I already know there will be a fifth. :) Unfortunately, I've run into a few bugs, and I know I'm not alone. Some are small, but there are also a few big ones. However, I'm pretty sure they can be smoothed out.

This is apparently a common problem with RT: the GPU is not fully saturated in every situation (70-90% utilization), even though the fps sits around 40-50. Sometimes (for example at the Bloody Baron's court) the fps drops in the middle of playing and won't rise above ~55 fps even if I set everything to minimum or off, set DLSS to Ultra Performance, and turn off all RT settings except RTGI. Restarting the game or disabling RTGI helps. With just RTGI on (the other RT options off), the fps is also quite low (30-50) even at 99% utilization, and the game micro-stutters constantly.

Today the game developed a strange RT problem: it crashed every time I loaded with RT global illumination on, or turned it on after loading. Verifying the game files didn't help. Then I turned on all the RT options, and the game started normally!

While playing with RT on, the utilization of the first four threads of the CPU (AMD 3950X) jumps between 20% and 100%. Is this normal? The rest of the threads are fairly evenly loaded: half at about 10-30% and half near 0%.

One small note: I don't know if this is a bug or just a low-resolution asset. In the inventory menu, Geralt's character is displayed at low resolution (maybe HD or 720p) with no anti-aliasing. It looks pretty bad on my 4K monitor. Graphics settings have no effect on it.

Machine specs:
MSI RTX 3090 Suprim
AMD Ryzen 9 3950X
64 GB of RAM
Samsung SSD 980 PRO 1TB
4K display
Win 10
 

Attachments

  • Witcher3 cpu utilization with RT on.jpg (528.2 KB)