Witcherpunk3077 & the new Buggy Next Gen Update

symbi

Forum regular
It's kinda sad to see all the threads on Steam/Reddit and how this update is reminding us of the debacle with Cyberpunk.
The "free" update is a disaster.
Performance on different PCs is hitting low FPS and laggy gameplay, and some people can't even start the game.

Who is in control of this? Why are you guys not testing your updates with QA and then working on the problems before you release unfinished products... again?
I don't get it. Why is this happening... again?

Also, the White Orchard "lost innkeeper" bug is not fixed.

I'm out of words. I love this franchise, but how you get things done is truly unbelievable. CDPR needs a change, a BIG change.
 
This game had some microstutters/hitches before on newer GPU architectures due to shader compilation on the fly. The game used to run smooth and stable after playing for a while, once most shaders were cached. What surprises me is that, seven years after the original release, the developers of this update apparently didn't have the ability to make shaders compile asynchronously, or to give us an option to compile the shaders before starting to play, whether in the menu or even in the first loading screen, like some other games do: Forza, HZD, Detroit: Become Human, etc.

Nobody likes to play games with microstutters just because someone didn't want to take the time to implement better techniques and optimizations.
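For anyone wondering what "compile the shaders before starting to play" means in practice, here is a minimal sketch, not CDPR's actual code: the engine hands a list of shader variants to background jobs during a loading screen instead of compiling each one the first time it appears on screen. CompileShader and the variant names are hypothetical stand-ins.

Code:
#include <future>
#include <string>
#include <vector>

struct CompiledShader { std::string name; /* bytecode would live here */ };

// Hypothetical engine hook: compiles one shader variant into the driver cache.
CompiledShader CompileShader(const std::string& variant) {
    return CompiledShader{variant};
}

// Kick off compilation of every known variant up front, on background threads,
// so the hitch happens behind a loading screen rather than mid-combat.
std::vector<std::future<CompiledShader>> PrecompileAll(
        const std::vector<std::string>& variants) {
    std::vector<std::future<CompiledShader>> jobs;
    jobs.reserve(variants.size());
    for (const auto& v : variants) {
        // std::launch::async forces real background execution; a production
        // engine would feed these into its own job system instead.
        jobs.push_back(std::async(std::launch::async, CompileShader, v));
    }
    return jobs;
}

int main() {
    auto jobs = PrecompileAll({"geralt_hair", "water_ssr", "foliage_wind"});
    for (auto& j : jobs) j.get();  // wait on the loading screen, not in gameplay
}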
 
Funny, I remember the old CD Projekt Red forum from 2016, when people were capable of going into the so-called "graphics settings" and tweaking the game a tiny little bit, which takes a few seconds, and aaaw, it runs perfectly fine afterwards, even on the oldest computers! And the rest of all the discussions were about story, art, fantasy, how awesome CD Projekt Red is and how awesome Witcher 3 is.

Since 2018, when the biggest downward spiral in gaming ever began, everyone is only capable of repeating the words "FPS", "Multiplayer" and "Fortnite", and yelling at the developers that their mediocre PC is not capable of running a game in 12K at 166 FPS:
"baaadly optimized!!! Stupid developers, dumbest developers 4ever, not capable of OPTIMIZING the game!"

Hint: that's why there are so-called "graphics settings" in video games. Have been for 32 years. ;)

Otherwise ---------------------> random console at Amazon | BUY |
Enough FPS guaranteed, because NOTHING ELSE seems to be of importance in gaming anymore since 2018, like for example graphics quality, story, emotions, gameplay, etc., but only WHERE IS MULTIPLAYER???!!! and FPS FPS FPS FPS. How many FPS here, how many FPS there, and half of the screen running FPS and frametime counters all the time.
 
Hello everyone! I too am having horrible performance with DX12 on a beefy machine (specs below) with update 4.0, so I am coming over from Reddit with a fully detailed post on my tests/theories/full breakdown. So here we go.

TL;DR: The game's DX12 implementation is 100% broken, causing awful performance/scaling along with stutters and graphical bugs. The game also seems to utilize only 2 CPU cores, which is starving every GPU (with the exception of 40-series cards using DLSS 3.0 and some 4K setups) no matter what you do. The DX11 implementation works much better, but you lose DLSS/FSR/RTX, the selling points of the update. Lastly, Alex from Digital Foundry is going to do a video and has already tweeted that the patch definitely has issues, with the CPU being a problem.
I also linked some tweets/articles/videos at the bottom confirming the issues and proving it is not just ranting or a handful of people.

My original Reddit post here


So, to begin: I believe I have narrowed down the two biggest issues.

First: the game seems to use only two of my 5900X's 12 CPU cores (Cyberpunk also originally had this issue, to a lesser degree). I've never seen the game's CPU usage above 6%, and I commonly see it at 1-2%. What the underutilized CPU means is that the game is starving our GPUs, causing them to also be underutilized. For GPU usage I've seen as low as 45% and as high as 85%, but never anything close to 100% unless playing on DX11. In other words, the game's poor CPU optimization means our beefy GPUs are never fully used, killing our performance while we still have headroom left on our hardware.

This is not just a me issue; several reports are now claiming the same thing, and I've done the testing and posted the screenshots in the Reddit post linked above, so I believe this is what's likely happening.

Secondly: the game's DX12 implementation is straight-up borked on PC, and I'm even reading reports of the same thing on next-gen Xbox, as it also uses DX12. This is causing horrible scaling of settings, lower-than-expected performance, stutters, and possibly the graphical glitches/crashing.
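If GPU work really is being recorded on only one or two threads, that would explain both symptoms at once: DX12 was built so that a frame's command lists can be recorded on many cores in parallel and submitted in one batch. Below is a minimal sketch of that pattern; the D3D12 calls are real, but Draw, RecordDraw and the worker split are hypothetical stand-ins, not the game's actual renderer.

Code:
#include <d3d12.h>
#include <thread>
#include <vector>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

struct Draw { /* mesh, material, transform ... */ };
void RecordDraw(ID3D12GraphicsCommandList* cl, const Draw& d) { /* set PSO, issue draw */ }

void SubmitFrame(ID3D12Device* device, ID3D12CommandQueue* queue,
                 const std::vector<Draw>& draws, unsigned workers) {
    std::vector<ComPtr<ID3D12CommandAllocator>> allocs(workers);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(workers);
    std::vector<std::thread> threads;

    for (unsigned i = 0; i < workers; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocs[i].Get(), nullptr, IID_PPV_ARGS(&lists[i]));
        // Each worker records an even slice of the frame's draws in parallel.
        threads.emplace_back([&, i] {
            for (size_t d = i; d < draws.size(); d += workers)
                RecordDraw(lists[i].Get(), draws[d]);
            lists[i]->Close();
        });
    }
    for (auto& t : threads) t.join();

    // One batched submission. If all recording happens on a single thread
    // instead, that thread caps the frame rate and the GPU sits underfed:
    // exactly the "2 cores busy, GPU at 50%" pattern described above.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}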

How did I come to this theoretical conclusion?

To start, my rig: Asus Crosshair VIII X570 (latest BIOS, 4201) with a Ryzen 9 5900X slightly OC'd to 4.95 GHz, a 360 mm AIO cooler, an Asus ROG Strix RTX 3080 with a slight factory OC, 32 GB of 3200 MHz RAM, and the game installed on a 1 TB Samsung 970 Evo NVMe, plus the latest Nvidia drivers (527.56) and the latest version of Windows 11 (Windows 11 Home Version 22H2, installed 10/19/2022, OS build 22621.963, Windows Feature Experience Pack 1000.22638.1000.0). Note: this install of Windows 11 is less than 2 months old and was a full fresh install, not an upgrade from a previous Windows 10 install, and the game is freshly installed, not previously modded, due to my recent wipe when moving to Windows 11. Hardware-accelerated GPU scheduling is also on.

Now, some background info: I run all my games more than fine (including reasonable RTX) and can't complain, but for the sake of keeping it short we'll use OG 1.32 Witcher 3 and Cyberpunk 2077. This same rig did 1440p max at 140+ FPS in Witcher 3, HairWorks and everything.
Cyberpunk 2077: all settings maxed minus the two Psycho options, all RTX on, at 1440p DLSS Balanced, locked at 60 from day one to the most recent patch.

OK, next-gen Witcher 3. To break this one down: if I set everything to the lowest possible settings on DX12, 1440p DLSS Performance, I get 80-100ish FPS or less. Again, going back to my background info, this same machine could do 1440p maxed at 140+ FPS easily with 1.32, but now on all lowest settings, 1440p DLSS Performance or TAAU with resolution scaling, I get 80-100 FPS.....

I CANNOT make this up. I am somehow getting A LOT less FPS at all-low on DX12 than on OG 1.32 maxed, with the same rig....
If I turn on all RTX with all other settings low, DLSS Ultra Performance, I get 35-50 FPS!
Turn it to all lowest settings, 1440p DLSS Ultra Performance with only RTGI, and I get around 42-55 FPS!
Max the game out at 1440p DLSS Quality: 32-40 FPS! Whatever I try, I can't reach 60 FPS with any RTX enabled.
Try to get 60 FPS with no RTX but Ultra to Ultra+: can't do it either. Basically it's all medium-to-low settings now, no RTX, DLSS Performance, to net 60-75 FPS with sub-60 1% lows and insane input delay. Anything else I try gets me under 60, down to as low as 32 when running maxed, on my MONSTER rig (only 1.78% of 30+ million Steam users have an RTX 3080), the same rig that runs Cyberpunk 2077 Ultra with RTX at 1440p DLSS Balanced at a locked 60 FPS. Make it make sense, please!

Now, we can prove it's bad DX12 by switching to DX11 and maxing it out. Obviously there is no DLSS or RTX, but maxed, so the new Ultra+ settings and TAAU with no scaling at native 1440p, I am now getting 100-110ish FPS. Still not as much as maxed 1.32, but remember: Ultra+ settings, the new screen-space reflections, TAAU, UHD textures, etc.

This FPS on DX11 makes more sense than all-low, no-RTX DX12 getting only up to 100 FPS but hovering around 80-90........ If you match the game's settings as closely as possible to OG 1.32 on DX11, you get almost the same performance (probably around a 10-15% loss) despite the higher-quality graphics settings that can't be matched. I was getting around 110-135ish FPS, so not as much as OG 1.32, but close enough; the scaling is working, and this all makes sense.

So DX12 is borked. Scale all the way down to low and you get far less FPS than max DX11 and max OG 1.32; max out DX12 and the game is unplayable, getting half the FPS of Cyberpunk 2077 maxed.....

Lastly, on performance: the game refuses to fully utilize the GPU and CPU. Maxed DX12 with DLSS Quality I see 80-85% GPU usage; maxed DX12 with DLSS Ultra Performance, 50-55ish%; maxed DX12 with TAAU, 45-50ish%; lowest settings DX12, 40-45%. All the while, performance was at worst 30ish FPS and at best 100 FPS, yet I still had hardware left to use....?

DX11: no issues, 99-100% usage, the game scales, the FPS scales. It's just DX12.
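For readers who want to reproduce these utilization readings without an on-screen overlay, a minimal sketch using NVIDIA's NVML library (assumed available with the driver/CUDA toolkit; link against nvml) could look like this. It is just one possible way to capture the data, not necessarily how the numbers above were gathered.

Code:
#include <nvml.h>
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    // NVML is the same interface tools like nvidia-smi use.
    if (nvmlInit_v2() != NVML_SUCCESS) return 1;

    nvmlDevice_t dev;
    if (nvmlDeviceGetHandleByIndex_v2(0, &dev) != NVML_SUCCESS) return 1;

    for (int i = 0; i < 60; ++i) {  // one reading per second for a minute
        nvmlUtilization_t util;
        if (nvmlDeviceGetUtilizationRates(dev, &util) == NVML_SUCCESS)
            // GPU well under 100% while FPS is low points at a CPU-side bottleneck.
            std::printf("GPU %u%%  memory controller %u%%\n", util.gpu, util.memory);
        std::this_thread::sleep_for(std::chrono::seconds(1));
    }

    nvmlShutdown();
}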

At this point I am just waiting for Alex from Digital Foundry's full tech review to highlight and confirm, or elaborate on, my findings. I've learned most of what I know from him over the years, and he's the wizard when it comes to stuff like this.
The devs have already tweeted that they are investigating the issues, so the game will be shelved for me until a patch is released.

For clarification, I'm not mad, I'm just sad. I was really looking forward to this; it's my favorite game of all time, and I have a beefy enough machine that I should be able to do mostly maxed 1440p DLSS Balanced at 60, like Cyberpunk…..

Thank you all for the read, and for those who aren't having issues and are enjoying the game: good luck on the path!
Sources and good reads:
Alex’s tweet (From Digital Foundry):
https://twitter.com/i/web/status/1603041215221534721
John’s tweet (From Digital Foundry):
https://twitter.com/i/web/status/1603085310522527748
CDPR tweet:
https://twitter.com/i/web/status/1603046409619513345
Marcin’s tweet (CDPR Global Community manager):
https://twitter.com/i/web/status/1603046848742187008

Videos and 3rd party benchmarks show exactly what I am saying here:



youtu.be: The Witcher 3 Next Gen Update | Ultra+ Setting With RTX Ultra | 3080Ti Benchmark! 1080p
youtu.be: RTX 3080 | THE WITCHER 3 Next Gen Update | Ray Tracing | DLSS | Ultra+
youtu.be: The Witcher 3's next-gen PC update is disappointing

Reports of issues:
https://www.rockpapershotgun.com/th...update-is-borked-so-heres-how-to-roll-it-back
https://www.gamesradar.com/witcher-3-next-gen-update-pc-players-say-it-runs-terribly/
https://kotaku.com/witcher-3-next-gen-update-pc-gaming-ps5-xbox-series-1849893251/amp
https://www.pastemagazine.com/games/witcher-3-update-pc-issues/
https://www.dsogaming.com/news/the-...-another-cyberpunk-2077-buggy-mess-at-launch/
 
It seems the console is the best answer, at least for now.
I purchased the PS5 Complete Edition, which is 15 times more expensive than the XSX version.
 
I wasn't really seeing any problem with CPU usage in White Orchard, but once I hit Velen I started getting 1 or 2 threads maxing out at 100% while the other 14 threads were running at 0% to 12%. Then it would kind of kick out of that and all the threads would be doing something again, so it was definitely bottlenecking. Since the in-game RT wasn't working correctly, I was using the RTSSGI Freestyle filter in GeForce Experience, but I had to turn that off in Velen in order to stay locked at 60 FPS. I don't use an overlay for monitoring; I have a 7" LCD that monitors my system, and it looks like this:

[Attachment: Sensor Panel.jpg]


Unfortunately I don't have an easy way to screenshot or video-record it during gameplay, but this is what it looks like when I watch a YouTube video in Chrome.
 
Totally agreed on this. I might have an old CPU, a 5820K, but I'm seeing my cores on vacation while one or two other cores are for some reason working hard. Actually, it's just a single core most of the time that's doing the job.
I should be seeing my CPU bottlenecked, not sleeping cores 😂
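For anyone who wants to verify the "one hot thread" pattern without a hardware panel, a minimal Win32 sketch like the one below samples each game thread's CPU time twice, one second apart, and prints the busiest ones. Run it as, say, threadwatch.exe <pid> with the game's process ID; the tool name and the 5% threshold are arbitrary choices, not an established utility.

Code:
#include <windows.h>
#include <tlhelp32.h>
#include <cstdio>
#include <cstdlib>
#include <map>

// Total kernel + user CPU time of one thread, in 100 ns units.
static ULONGLONG ThreadCpu100ns(DWORD tid) {
    HANDLE h = OpenThread(THREAD_QUERY_LIMITED_INFORMATION, FALSE, tid);
    if (!h) return 0;
    FILETIME create, exit_, kernel, user;
    ULONGLONG total = 0;
    if (GetThreadTimes(h, &create, &exit_, &kernel, &user)) {
        ULARGE_INTEGER k, u;
        k.LowPart = kernel.dwLowDateTime; k.HighPart = kernel.dwHighDateTime;
        u.LowPart = user.dwLowDateTime;   u.HighPart = user.dwHighDateTime;
        total = k.QuadPart + u.QuadPart;
    }
    CloseHandle(h);
    return total;
}

// Snapshot CPU time for every thread belonging to the target process.
static std::map<DWORD, ULONGLONG> Sample(DWORD pid) {
    std::map<DWORD, ULONGLONG> out;
    HANDLE snap = CreateToolhelp32Snapshot(TH32CS_SNAPTHREAD, 0);
    THREADENTRY32 te{}; te.dwSize = sizeof(te);
    if (Thread32First(snap, &te)) {
        do {
            if (te.th32OwnerProcessID == pid)
                out[te.th32ThreadID] = ThreadCpu100ns(te.th32ThreadID);
        } while (Thread32Next(snap, &te));
    }
    CloseHandle(snap);
    return out;
}

int main(int argc, char** argv) {
    DWORD pid = (argc > 1) ? static_cast<DWORD>(std::atoi(argv[1])) : 0;
    auto before = Sample(pid);
    Sleep(1000);
    auto after = Sample(pid);
    for (auto& [tid, t1] : after) {
        ULONGLONG t0 = before.count(tid) ? before[tid] : 0;
        double pct = (t1 - t0) / 100000.0;  // 100 ns units over 1 s -> % of one core
        if (pct > 5.0) std::printf("thread %lu: %.0f%% of one core\n", tid, pct);
    }
}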
 
"baaadly optimized!!! Stupid developers, dumbest developers 4ever, not capable of OPTIMIZING the game!"

Hint: That´s why there are so called "graphics settings" in videogames. Since 32 years. ;)

Otherwise ---------------------> random console at Amazon | BUY |
Enough FPS guaranteed, because NOTHING ELSE seems to be of importance anyway in gaming anymore since 2018 like for example graphics quality story emotions gameplay etc. but only WHERE IS MULTIPLAYER???!!! and FPS FPS FPS FPS FPS FPS FPS FPS PPS. How many FPS here how many FPS there, and half of the screen running some FPS and frametime counters all the time.
The game is still good; it's still good old Witcher 3, and nobody is suddenly forgetting that it had amazingly well-done storytelling and great graphics. What people are complaining about is the fact that now, on PC, instead of performing better, it introduces stutters and MASSIVE FPS cuts with ray tracing, not to mention there are people saying the game doesn't launch AT ALL on DX12.

Being mad that a game that works fine on a platform (which some people spent thousands of dollars on) suddenly doesn't work because of a free update seems reasonable.
 
Funny, I remember the old CD Projekt Red forum from 2016, when people were capable of going into the so-called "graphics settings"... [...]
How this comment can exist after the fiasco that was Cyberpunk 2077 is beyond me. :facepalm:
That game was a literal mess, fighting Fallout 76 for the gold medal as one of the worst game releases ever; together they were the best source of ridicule and memes on the internet.

Sir, Witcher 3 Next Gen does not run OK on high-end PCs, PCs with a 12-core/24-thread Ryzen 9, an RTX 3070, even a 4090.
We have a stuttery mess, crashes, sudden drops in performance down to 4-10 FPS, graphical artifacts, and DX12 and RT that are not well optimized.
They enticed old and new gamers to replay or play the game with the new, improved next-gen mode (a few mods, DX12 and RT), then once again released an unfinished, unoptimized product, on PC at least (I don't know about consoles).
I, for example, am happy with a reliable 30 FPS, stutter-free with good frametimes and no crashes, at 4K DLSS Performance with RT.

[Attachment: Screenshot 2022-12-18 105344.jpg]


 