Massive performance regression on Vega, looking for answers. (Fixed in Patch 2.11!)

Hey all, first post here. Sorry if it's long-winded.

Purchased the game at launch. Back then I had a Ryzen 2700, 16GB of 3466MT/s RAM, and a Vega 64 @ 1080p, all overclocked. This was enough to play the game comfortably at 60fps with optimized settings. No complaints, considering the age of those parts. Played for around 30 hours, then stopped. Bugs were ruining the experience for me, so I thought I'd wait a year before playing again.

I ended up waiting for two years instead. Go figure.

During the wait, I upgraded some components. I now have a 5600X, 32GB of 3800MT/s CL14 RAM, and the same Vega 64 GPU, again all overclocked. The Vega is even running the exact same settings as back then. No doubt the PC as a whole is faster: the 5600X is rapid compared to the 2700, and the RAM was a massive upgrade in speed, timings, and the fact that it's now a dual-rank config. Granted, I should be GPU limited pretty much *everywhere* in Cyberpunk with a Vega, so I wasn't expecting a drastically different experience - though I was hoping some optimisations had been made during the wait.

So imagine my surprise upon returning to find that I can no longer run the game acceptably. Seriously. Even on minimum settings at 1080p, I'll be dropping below 60fps constantly in many areas - areas that I distinctly remember being fine previously, like the Nomad intro location.

I downloaded the same build of the game that I played back then - 1.04 - and compared it to the current build, 1.63. As the original build lacks a benchmark, I had to test it myself. Same location, same save, same graphics settings (at least, they were SET the same - you can see some visual differences in the following images). I also gathered data from HWiNFO64 while taking these screenshots.

Build 1.04: old.png

Build 1.63: new.png

As you can see, there's a severe performance regression here. Also, look how much GPU power draw has dropped (GPU PPT in HWiNFO). 168w in the old build is around what I'd expect from a high load with these GPU settings (160-200w is a "high" power draw at this voltage). 129w, on the other hand, is far too low for a game like Cyberpunk, even though GPU utilization remains at 100%. Something is wrong here for sure. No way such a massive drop can be intentional; I refuse to believe it.
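
If anyone wants to sanity-check this with more than a single OSD snapshot: HWiNFO64 can log its sensors to CSV, and a rough Python sketch like the one below averages the relevant columns over a run for each build. The column names and file names are just placeholders - rename them to whatever your sensor layout actually exports (e.g. GPU PPT and whichever framerate counter you have).

```python
# Rough sketch: average GPU power and framerate from two HWiNFO64 CSV sensor logs.
# Column names below are assumptions - rename them to match your own sensor layout.
import csv
import statistics

POWER_COL = "GPU PPT [W]"     # assumed column name
FPS_COL = "Framerate [FPS]"   # assumed column name

def column_average(path: str, column: str) -> float:
    """Average a numeric sensor column from an HWiNFO CSV log."""
    values = []
    # latin-1 never fails to decode, which helps with non-UTF-8 unit symbols in the log
    with open(path, newline="", encoding="latin-1") as f:
        for row in csv.DictReader(f):
            cell = (row.get(column) or "").strip()
            try:
                values.append(float(cell))
            except ValueError:
                continue  # skip blank samples and repeated header/footer rows
    if not values:
        raise SystemExit(f"no numeric samples found for {column!r} in {path}")
    return statistics.mean(values)

# log_104.csv / log_163.csv are hypothetical file names for the two test runs
for build, log in (("1.04", "log_104.csv"), ("1.63", "log_163.csv")):
    print(f"Build {build}: {column_average(log, POWER_COL):.0f} W avg, "
          f"{column_average(log, FPS_COL):.1f} fps avg")
```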

Also, I'll save some suggestions by saying it's *not* driver related (I tested the drivers from the game's launch ages ago - same shit), my Windows install is clean, and my overclocks are stable - everything else that runs on this system does so perfectly fine. I've also reinstalled the game more than once, tested various settings, pretty much everything I can think of that's actually within my power to change. No dice.

I also saw that Phantom Liberty's specs differ quite drastically from the current ones. The old "Recommended" GPUs are now the MINIMUM spec for PL, which actually seems somewhat in line with what I'm experiencing - the GTX 1060 and RX 580 struggle quite a bit with this game, and I can't imagine how they managed to land on the recommended spec to begin with. The NEW recommended GPUs - 2060 SUPER and 5700 XT - seem MUCH more in line with reality.

Found another post regarding huge performance loss on Vega - onto something? https://forums.cdprojektred.com/ind...e-performance-since-1-6-patch-a-bug.11113883/
 
It's easy: 1.5 and 1.6 made major changes to the render pipeline. They even changed some of the presets to higher settings, I think. This game has become harder to run since then; I myself saw a degradation in performance with pretty much every major patch. At launch I managed 60 fps with balanced DLSS at maxed settings on my 3090. It kept going down every patch though, and soon I was on Performance to keep my 60 fps. The specs will, like you said, increase even more in PL.
 
Changes to backend stuff are all well and good, but I'd expect a visual improvement to offset the insane cost of that. Which, as you can see by those screenshots, is missing. Shadow distance and *some* LOD are actually superior in the old screenshot, even. Other than that, and the tonemapping, it's virtually identical.

I appreciate the response, but I refuse to believe a FORTY-FOUR PERCENT decrease in performance is somehow "intended". That's a load of bullshit. I waited for the game to get better, and now I can't even play it at acceptable performance. Even minimum settings with FSR2 Quality isn't enough.
 
It might not be intended but rather a result of the changes. Also, it might be worse on your GPU than it is on many others. They redid quite a lot with 1.5, so it's not impossible that you lost most of the performance there. Are you using FSR, btw?
 
No FSR2, because it looks horrid at 1080p - not enough data there for it to work well. Also, performance with it enabled still ain't good enough.
As it currently stands, looks like I won't be playing the game at all. Cool. Love it.
 
Haven't really tested FSR2 enough to know, since I use DLSS, but even that looks kinda bad at 1080p tbh. Thought it might be the reason for the missing power usage - it generally lowers power draw but keeps utilization high if the CPU can keep up. Other than that, not sure what else to do except lowering settings and using FSR to regain the lost fps. Oh, you should give CDPR a notice too if it is a bug, if you have not already done so.

 
No, the "missing power usage" isn't coming from using FSR2, because as I said, *I'm not even using it to begin with*. On either screenshot. Low power draw with MAX utilization - to me - is indicative of a problem.

And as I also already said, lowering settings isn't good enough, because the game still performs poorly. So I'll say it again.

Even if I were to use MINIMUM settings, at 1080p, WITH FSR2, performance still wouldn't even be CLOSE to what it used to be. I would still have shit performance in this absolute middle-of-nowhere location, never mind actually IN Night City.

Whatever has gone on here to make it this bad, it's a HUGE misstep.
 

As it currently stands, looks like I won't be playing the game at all. Cool. Love it.

I might not be playing much either, seeing as my PC won't be supported after the expansion is out. I'm hoping to save up for a new one, but no idea if I'll be ready by the time it's out. And to be honest, it runs weirdly lately: even with the small number of mods I use, the game crashes or hangs more than it did before.

It's strange, because when this game came out and so many were giving it crap and talking about waiting a year or two until it was patched up enough, I was addicted and played it a lot. At the moment, though, I struggle to be bothered, and I feel like I'm in the "wait until it's patched up enough" camp instead, seeing as the expansion will probably need a fair amount of work after it too, and I might need a new PC anyway.

It also seems odd to be changing requirements so much for an expansion to the game and shutting players out in the same month that Starfield is releasing. I'm thinking a lot of people will be moving over to playing that for at least a while and Baldur's Gate 3 is also out shortly.

I really want to return to playing this game regularly, but I'm not sure if that'll happen - or if it does, it might take a while.
 
It also seems odd to be changing requirements so much for an expansion to the game and shutting players out in the same month that Starfield is releasing. I'm thinking a lot of people will be moving over to playing that for at least a while and Baldur's Gate 3 is also out shortly.
I kinda agree with you, but it's been coming for a while. I've seen more and more problems with crashes and so on on the forums too, and sometimes you're just playing at settings too high for your hardware and the game can't run properly. Some mods add some more load too, btw. It sucks if you can't afford an upgrade, but sadly that's just the way it is. The new system requirements I would not trust until I see some independent tests tbh - they seem way too high on average.
 
Figured I'd amend this post since the issue is now fixed.

As of Patch 2.11, performance on Vega GPUs has improved drastically. CDPR actually mentioned Vega specifically in the patch notes:
  • Implemented a fix that improves performance, especially on AMD RX Vega GPUs.
I can say with 100% certainty that the game has never run better on my Vega 64, even compared to launch. I still can't believe how long this went unresolved - I ended up doing a full playthrough on Patches 1.63-2.0 with horrendous performance, often dipping below 30fps - but I'm glad it's resolved now... though arguably too late.
 