Performance Issues / Poor Optimization

So CDPR released expanded system requirements shortly before release. For the ray-tracing (RT) minimum, they list an RTX 2060 GPU.

I have an RTX 2060 Super, a Ryzen 5 3600XT, and 16 GB of RAM. On the RT Medium preset at 1080p I regularly get less than 30 FPS.

In 2020, under 60 FPS is poor. Under 30 FPS is unacceptable.

Where did these system requirements come from? I understand the game is currently unoptimised/full of bugs, but did they just select hardware that sounded reasonable without any testing whatsoever?

Between the bugs, the performance issues, and the cut content...the only award I would give this game is disappointment of the year. Although I suppose that's still quite an achievement given everything else 2020 has produced.

Is there confirmation anywhere of what the targeted frame rate is?
 
So CDPR released expanded system requirements shortly before release. For the ray-tracing (RT) minimum, they list an RTX 2060 GPU.

I have an RTX 2060 Super, a Ryzen 5 3600XT, and 16 GB of RAM. On the RT Medium preset at 1080p I regularly get less than 30 FPS.

In 2020, under 60 FPS is poor. Under 30 FPS is unacceptable.

Where did these system requirements come from?
It's sponsored by Nvidia.
 
So you're deflecting the blame to Nvidia? I guess it's also Microsoft and Sony's fault that the game doesn't run well on base consoles?
Basically yes. Sony and Microsoft could have gone for a PCIe solution in order to make the GPUs upgradeable...
The 2060 Super should be the bare minimum requirement for RT, not the 2060.
 
Basically yes. Sony and Microsoft could have gone for a PCIe solution in order to make the GPUs upgradeable...
The 2060 Super should be the bare minimum requirement for RT, not the 2060.

Absolutely shocked. Imagine blaming the hardware, which manages to run other AAA games released this year perfectly fine.

People like you are why CDPR gets away with pulling this stuff. The 2060 Super is still not good enough for the RT minimum. The game is unoptimized garbage; it needed at least another year in the oven.
 
Because countless similar threads are popping up, this thread is for all performance- and optimization-related issues, so it's easier to keep an overview.
Given that most people show up with old budget processors that they bought because someone told them they were "enough" at the time, I'm not surprised that the increased load overwhelms their systems.

Ryzen Zen+ was never that great at gaming. A 2700X bottlenecked a GTX 1080 Ti at 4K. It wasn't horrible; I managed to get one to play games like The Witcher 3 at Ultra 60 FPS, but it was giving everything it had to do so. The problem for Zen+ is that, because it was barely able to keep up with the GTX 1080 Ti at 4K, it buckles under the load at lower resolutions. That means that in newer games the GTX 1080 Ti can't handle at 4K, the processor cripples the machine at lower resolutions. This is not the fault of the game developers.

This game has a ways to go to iron out the glitches, but the bones are solid. With the PS5 and Xbox Series X/S released, the bar has been raised. This and Red Dead Redemption 2 require serious hardware to pull off the High and Ultra settings.

It runs passably on my undervolted Core i9-10900K at its daily speed of 5 GHz on all cores and an RTX 3070 with a mild overclock. I get about 45 to 65 FPS at 1440p with Ultra ray tracing, Balanced DLSS, and generally Ultra settings. I would love to have an RTX 3090 or RX 6900 XT, but they are not in the budget right now. In a year or so I'll upgrade to the next generation of graphics cards, which will hopefully be on par with those and run a solid 60 FPS at max settings.
 
The game is badly optimized. Without ray tracing it should be a piece of cake to get 60 FPS at Ultra on 1080p.
The game has good graphics, but it isn't so absolutely out of this world that you can't get 60 FPS at Ultra settings without ray tracing at 1080p. That's absurd.
 
Denial is a powerful thing but it doesn't change the truth.

Bruh, at least try to know what you're talking about before spouting that much BS.

The CPU carries basically NO LOAD at 4K. Only a very bad CPU could bottleneck a 1080 Ti at 4K. Now, I doubt you know what a bottleneck is, because if you did, you would know that a 1080 Ti at 99% usage at 4K resolution is not being bottlenecked.

Now, see this video I'm linking here:

The 1080 Ti is always at 99% at 4K resolution, while the 2700X is barely getting used (around 30-40 percent). If it were bottlenecking the GPU, the 1080 Ti would be at around 70% and the CPU at 70-80% or so, which is clearly not the case here.

Sorry for this slightly off-topic response, but I felt I needed to address it.
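A quick way to check this on your own machine while the game is running is to poll GPU utilization with nvidia-smi. A minimal sketch in Python, assuming an NVIDIA card with nvidia-smi on the PATH:

import subprocess
import time

# Poll GPU utilization once per second for ~30 seconds while the game runs.
# A GPU pinned near 99% points at the GPU as the limiter; low GPU usage
# together with low FPS points back at the CPU or something else.
QUERY = ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"]

for _ in range(30):
    util = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
    print(f"GPU utilization: {util}%")
    time.sleep(1)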
 
Has there been any progress on performance besides the AMD fix? I don't have issues until I get into town, which is densely populated. I mean, a 3080 and a 3600X should get better than 45 FPS in town, right? The biggest problem I see is that GPU utilization is only around 40% when in town. Outside of town or inside buildings, GPU usage can hit 95%, though. What gives?

Moving around Night City is apparently quite CPU-intensive, especially if you are travelling at high speed in a vehicle.

Also: your CPU runs at 92 degrees Celsius? Methinks you need to improve your cooling before your CPU melts.
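One way to see whether a few pegged cores are holding the GPU back is to log per-core CPU usage while driving through town. A minimal sketch in Python, assuming the third-party psutil package is installed:

import psutil  # third-party package: pip install psutil

# Sample per-core CPU usage once per second for ~30 seconds while playing.
# Overall CPU usage can look low even when one or two cores are pegged,
# which is enough to hold the GPU back in CPU-heavy areas of the city.
for _ in range(30):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)
    print(f"busiest core: {max(per_core):5.1f}%   all cores: {per_core}")

If one or two cores sit near 100% while the overall figure stays low, the CPU is the limiter even though the summary graph in Task Manager suggests otherwise.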
 
Bruh, at least try to know what you're talking about before spouting that much BS.

The CPU carries basically NO LOAD at 4K. Only a very bad CPU could bottleneck a 1080 Ti at 4K. Now, I doubt you know what a bottleneck is, because if you did, you would know that a 1080 Ti at 99% usage at 4K resolution is not being bottlenecked.

Now, see this video I'm linking here:

The 1080 Ti is always at 99% at 4K resolution, while the 2700X is barely getting used (around 30-40 percent). If it were bottlenecking the GPU, the 1080 Ti would be at around 70% and the CPU at 70-80% or so, which is clearly not the case here.

Sorry for this slightly off-topic response, but I felt I needed to address it.

Edit: Please keep in mind I am talking about this game's performance estimates, not how the 2700X paired with a GTX 1080 Ti handles Tomb Raider or other older games.

While the CPU has almost no effect at 4K, that doesn't mean there is none. What it means is that you are frame-rate limited no matter what, so crank the graphics settings up until you max out the graphics card paired with the processor.

I'm going to try not to be insulting, because I think you simply misunderstood my previous post. The 2700X/GTX 1080 Ti combination isn't going to buckle at 4K Ultra because of the 2700X. It's going to buckle because of the GTX 1080 Ti. It's a great card, but it is getting old.

That said, as the resolution drops the bottleneck will shift to the 2700X, which is why at lower resolutions the machine will struggle more and more. It's pretty significant at 1080p and a slight issue at 1440p. Even when the processor was new it was struggling. It's been a few years, the performance bar has risen, and Zen+ is getting long in the tooth. My suggestion, if you want more performance for a reasonable amount of money and have a decent X470 or maybe B450 board, is to update the firmware and upgrade to a Ryzen 3000-series processor.

Your example shows a bunch of old games. I know that old games play OK; that is not comparable to new games like Cyberpunk. Here is a link going over results for Red Dead Redemption 2, another very demanding game: Red Dead Redemption 2 benchmarks.
 
Firstly, some tech specs:

i7-6700K
RTX 2070 Super
32 GB RAM

This is clearly not an issue with my rig, even if the 6700K is now starting to look a little long in the tooth, for reasons that will become clear.

Ok, so this is a weird one.

I pre-ordered the game and installed it on launch day last week, only to find that it appeared to be stuck at about 10 FPS. It launched fine, 200+ FPS on the first splash screen, but then plunged down to 10 FPS on the 'Red Engine' screen and stayed around there, regardless of settings or anything else. The game was flatly refusing to use my graphics card to anything approaching its full capacity; it seemed locked at 30% usage.

EXCEPT when I talked to Bug during the VR training module. Literally only while I was stood on the platform and talking to her, though: if I ran onto the platform before she appeared, it ran at 10 FPS. If I got off the platform while talking to her, it fell back down to 10 FPS. If I was stood on the platform while Bug was present, though: 80 FPS. The moment training started, back down to 10 FPS we went. I could even see this on the monitor: 30% GPU usage until I stepped onto the platform, then it rocketed up to 80-90%, then straight back down to 30% the moment I left that platform.

I spent most of the following day messing about with it (updating drivers, tweaking settings in the Nvidia control panel, etc.) and eventually used Microsoft's Windows 10 Update Assistant to force my PC onto 20H2. I didn't think this had a cat in hell's chance of making a difference, but to my surprise, it worked.

I launched the game and was then able to spend the following 3 or 4 days playing at 80+ FPS on Ultra settings (without ray tracing switched on, though I'm fairly confident I'd still be getting ~40 or so with it active). This should mean we can dispense with anyone proposing that a 6700K is somehow too old or inferior for playing the game, or that a 2070 won't handle it: it's been handling it just fine, and two of those days were on the latest hotfix, too, so it's not like Cyberpunk has changed since then.

Then, on Sunday, a few more Windows updates installed overnight: KB4586876 and KB4580419. These are cumulative updates for .NET.

Monday morning, Cyberpunk is back down to 10 FPS. Removing those two Windows updates does not appear to restore my previously decent frame rates.

No other game has had any negative impact from these updates, just Cyberpunk. I know I'm not the only person having this problem, either; I've seen a few threads about it on Reddit and Steam, and have seen mention of what sounds similar here.
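For anyone who wants to confirm whether those two updates are present before trying a rollback, here is a rough sketch in Python that shells out to PowerShell's Get-HotFix. Caveats: .NET cumulative updates do not always appear in this list, and uninstalling needs an elevated prompt.

import subprocess

# List installed hotfix IDs via PowerShell's Get-HotFix and check for the
# two updates mentioned above. Removal itself would be attempted with, e.g.,
#   wusa.exe /uninstall /kb:4586876
# from an elevated command prompt.
SUSPECT_KBS = {"KB4586876", "KB4580419"}

result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-HotFix | Select-Object -ExpandProperty HotFixID"],
    capture_output=True, text=True,
)
installed = set(result.stdout.split())
for kb in sorted(SUSPECT_KBS):
    print(f"{kb}: {'installed' if kb in installed else 'not listed'}")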
 
Edit: Please keep in mind I am talking about this game's performance estimates, not how the 2700X paired with a GTX 1080 Ti handles Tomb Raider or other older games.

While the CPU has almost no effect at 4K, that doesn't mean there is none. What it means is that you are frame-rate limited no matter what, so crank the graphics settings up until you max out the graphics card paired with the processor.

I'm going to try not to be insulting, because I think you simply misunderstood my previous post. The 2700X/GTX 1080 Ti combination isn't going to buckle at 4K Ultra because of the 2700X. It's going to buckle because of the GTX 1080 Ti. It's a great card, but it is getting old.

That said, as the resolution drops the bottleneck will shift to the 2700X, which is why at lower resolutions the machine will struggle more and more. It's pretty significant at 1080p and a slight issue at 1440p. Even when the processor was new it was struggling. It's been a few years, the performance bar has risen, and Zen+ is getting long in the tooth. My suggestion, if you want more performance for a reasonable amount of money and have a decent X470 or maybe B450 board, is to update the firmware and upgrade to a Ryzen 3000-series processor.

Your example shows a bunch of old games. I know that old games play OK; that is not comparable to new games like Cyberpunk. Here is a link going over results for Red Dead Redemption 2, another very demanding game: Red Dead Redemption 2 benchmarks.

I'm sorry if I misunderstood you, but you literally said: "A 2700X bottlenecked a GTX 1080 Ti at 4K." How was I supposed to tell you were talking about the 1080 Ti bottlenecking itself? Which is quite obvious, considering it's indeed an old GPU. On top of that you added "Ryzen Zen+ was never that great at gaming," which is true at lower resolutions but not at 4K. All those factors, combined with the fact that you never mentioned you were talking specifically about Cyberpunk 2077, made me a bit confused.

However, the games in the video are pretty old, I guess, though some of them are fairly recent, and I don't think that changes much regarding bottlenecking. As long as games don't use more CPU cores, 4K will always be GPU territory, and the CPU won't do much.

Besides, I was only talking about 4K. 1440p and 1080p are a totally different story, as bottlenecking is much more noticeable at those resolutions. The RDR2 benchmarks you linked are the perfect example of that, as some beastly GPUs are getting bottlenecked at 1080p. But they don't show any 4K graphs where the CPU is bottlenecking the GPU. Why? Because even in RDR2, CPU bottlenecking doesn't exist at 4K.

Regardless of that disagreement about bottlenecking at 4K resolution, I agree with the other points you made. But my position on the matter won't change unless we get games that actually use more cores.

EDIT: I'm one of those who actually think RDR2 is not a very well-optimized game. The game is beautiful, don't get me wrong, but not reaching 60 FPS at 4K with max settings on a 3080 tells me it's heavy on resources for not that much to show for it. Death Stranding is the kind of game that is godly optimized; in my opinion, RDR2 is not.
 
This might help people with AMD CPUs

This made a big difference for me (Ryzen 2600). Why couldn't CD Projekt Red have done this themselves? Out of all the bugs and glitches, this is my biggest gripe with the game, because it seems like it would be an easy fix.
 
I've done both fixes: the .exe hex fix (I used the EB version, with EB 30 33 C9 B8 01 00 00 00 0F A2 8B C8 C1 F9 08) and the memory_pool_budgets.csv edit, where I used half of my system memory, the full GPU memory, and roughly twice the recommended values for the rest (a sketch of the .exe patch follows the pool values below):
PoolRoot
PoolCPU 16GB
PoolGPU 8GB
PoolFlexible -1
PoolDefault 1KB
PoolLegacyOperator 2MB
PoolFrame 64MB
PoolDoubleBufferedFrame 64MB
PoolEngine 900MB
PoolRefCount 32MB
PoolDebug 900MB
PoolBacked 900MB
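For reference, a minimal sketch of that .exe hex edit in Python. Assumptions: the unpatched executable contains the same byte sequence with 75 as its first byte instead of EB (that is the commonly shared form of this fix), and the GOG install path shown in the guide further down. Back the file up first, and keep in mind that official patches may remove the need for, or break, this edit.

from pathlib import Path

# Hypothetical patcher for the hex edit described above. Assumes the
# unpatched Cyberpunk2077.exe contains the sequence below with 0x75 as its
# first byte instead of 0xEB. Keep a backup; run at your own risk.
EXE = Path(r"C:\Program Files (x86)\GOG Galaxy\Games\Cyberpunk 2077\bin\x64\Cyberpunk2077.exe")
OLD = bytes.fromhex("75 30 33 C9 B8 01 00 00 00 0F A2 8B C8 C1 F9 08")
NEW = bytes.fromhex("EB 30 33 C9 B8 01 00 00 00 0F A2 8B C8 C1 F9 08")

data = EXE.read_bytes()
if NEW in data:
    print("Already patched.")
elif OLD in data:
    EXE.with_suffix(".bak").write_bytes(data)       # backup copy next to the exe
    EXE.write_bytes(data.replace(OLD, NEW, 1))      # patch the first occurrence only
    print("Patched.")
else:
    print("Byte sequence not found; this executable version may differ.")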

Furthermore, I've applied the recommended preset from the "update your drivers" thread.

(I'm willing to update this; let me know if you've got settings that work out well.)

Links:
Thread: Performance Issues
Game Crashes - Support
Contact Technical Support
  1. Install Cyberpunk 2077 on an SSD (you can find a 128 GB one second-hand for $15)
  2. If you want to play on Ultra:
    1. skip Ray Tracing for now. A patch might be on its way.
    2. Set DLSS to OFF, quality, or balanced. Don't use AUTO and don't use one of the performance modes.
    3. If you set resolution to 1080p or lower, set DLSS to OFF, otherwise it looks blurry.
    4. Only use the DLSS performance modes for 4K and 8K on big monitors / TVs.
  3. Make sure Vsync is disabled
  4. Set the power plan of your PC to High Performance
    1. Search and open Control panel
    2. Click Power options
    3. Set it to High Performance or AMD Ryzen High Performance
  5. Enable real full-screen mode (this can lower FPS, so please compare at Medium settings without ray tracing)
    1. go to C:\Program Files (x86)\GOG Galaxy\Games\Cyberpunk 2077\bin\x64, or
      1. Open Steam, right-click on Cyberpunk 2077, click Properties
      2. go to the "Local Files" tab and click browse local files
    2. right click on Cyberpunk2077.exe, click on properties and Compatibility
    3. Disable full-screen optimizations
    4. go to the in-game settings and choose full-screen mode
  6. Use MSI Afterburner to make sure your GPU fans are spinning
    1. enable "Apply at Windows Startup"
    2. Force fan speed update on each period (this can increase CPU usage) if the fan curve cannot be applied. Make sure to set the curve for both fans, and also activate auto mode for both fans. Keeping your GPU cool is necessary to avoid crashing and throttling.
  7. optimize RAM timings
    1. Download DRAM calculator
    2. Follow Steve's guide
    3. Go to your UEFI / BIOS and set timings (like in that video, set the command rate to 1, or 1T)
  8. CPU Priority
    1. In Windows, after launching Cyberpunk, tab out and open Task Manager.
    2. Go to details tab, right click on Cyberpunk2077.exe, set priority to high
    3. Windows won't save this setting, so you'll want to do it every time you start playing. (It doesn't do much, though, and it can also cause crashing, but it might give you some extra performance; see the small automation sketch at the end of this post.)
  9. Optimize CPU performance (Ryzen)
    1. Go to Bios, there you have two ways of overclocking:
      1. Manually overclock your CPU by setting a multiplier and adjusting the voltage. Precision Boost Overdrive needs to be turned OFF.
      2. Alternatively, enable Precision Boost Overdrive and SMT without setting a multiplier; the CPU will overclock automatically.
    2. Adjust CPU fan curve to keep temps below 85°C. Also check Chipset and System Fan curve.
  10. Check for driver updates:
    Chipset ; GPU ; Windows ; Monitor
  11. Enable Gsync or Freesync and optimize other gpu settings
    1. Go to your gpu control panel
    2. (NVIDIA) click Manage 3d settings and go to Program settings, select or add Cyberpunk2077.exe
      1. Monitor Technology: G-sync
      2. Preferred refresh rate: Highest available
      3. Power management mode: Prefer maximum performance
      4. Low latency Mode: On (Don't set it to Ultra)
      5. Texture filtering: Quality, performance or high performance
      6. Make sure Vsync is set to application controlled

And oh boy, my Ryzen 3 3100 and the RTX 3060 came alive: no lag, no slowdown, no CTDs. I'm literally crying for joy. This is on the Ultra preset with 100 FOV at 1080p, without DLSS and RT.
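For step 8 (CPU priority), here is a small sketch in Python that sets the priority automatically instead of going through Task Manager each time. It assumes the third-party psutil package, a Windows machine, and that the game is already running; the same caveat about possible crashes applies.

import psutil  # third-party package: pip install psutil

# Find the running Cyberpunk2077.exe process and raise it to High priority,
# mirroring the manual Task Manager step described in item 8.
for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "Cyberpunk2077.exe":
        proc.nice(psutil.HIGH_PRIORITY_CLASS)   # Windows-only priority constant
        print(f"Set High priority on PID {proc.pid}")
        break
else:
    print("Cyberpunk2077.exe not found; start the game first.")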

 
I'm sorry if I misunderstood you, but you literally said: "A 2700X bottlenecked a GTX 1080 Ti at 4K." How was I supposed to tell you were talking about the 1080 Ti bottlenecking itself? Which is quite obvious, considering it's indeed an old GPU. On top of that you added "Ryzen Zen+ was never that great at gaming," which is true at lower resolutions but not at 4K. All those factors, combined with the fact that you never mentioned you were talking specifically about Cyberpunk 2077, made me a bit confused.

However, the games in the video are pretty old, I guess, though some of them are fairly recent, and I don't think that changes much regarding bottlenecking. As long as games don't use more CPU cores, 4K will always be GPU territory, and the CPU won't do much.

Besides, I was only talking about 4K. 1440p and 1080p are a totally different story, as bottlenecking is much more noticeable at those resolutions. The RDR2 benchmarks you linked are the perfect example of that, as some beastly GPUs are getting bottlenecked at 1080p. But they don't show any 4K graphs where the CPU is bottlenecking the GPU. Why? Because even in RDR2, CPU bottlenecking doesn't exist at 4K.

Regardless of that disagreement about bottlenecking at 4K resolution, I agree with the other points you made. But my position on the matter won't change unless we get games that actually use more cores.

EDIT: I'm one of those who actually think RDR2 is not a very well-optimized game. The game is beautiful, don't get me wrong, but not reaching 60 FPS at 4K with max settings on a 3080 tells me it's heavy on resources for not that much to show for it. Death Stranding is the kind of game that is godly optimized; in my opinion, RDR2 is not.

The thing is, the 2700X does slightly bottleneck a GTX 1080 Ti at 4K. I saw it myself with my 2700X at 4.1 GHz and an MSI GTX 1080 Ti Gaming X running on a Crosshair VII Hero Wi-Fi with 32 GB of DDR4-3200 RAM. It isn't as noticeable, and in older games it was not relevant to 4K Ultra 60 FPS. If you look at the benchmarks, though, you can clearly see that even at 4K an 8700K got slightly higher frame rates out of the GTX 1080 Ti than the 2700X did. Like I said, it wasn't enough to be a deal-breaker at the time.

Compared to Intel's processors of the time, the Ryzen Zen+ processors were always struggling to keep up. The way I look at it: if the processor was struggling with bottlenecking issues in games while it was still in production, how will it handle these new games now? If it was bottlenecking a GTX 1080 Ti, how will it handle newer, faster graphics cards? How much performance will be left on the table because of the processor? That doesn't mean the computer won't see a performance gain from an upgraded graphics card, just that the gain won't be 100% of what the card could deliver with a slightly faster CPU.

I did mention that I used my Ryzen system to play The Witcher 3 at 4K Ultra 60 FPS; sorry about the confusion. What Zen+ will forever be known for is AMD's continued pressure on Intel, forcing Intel to sink or swim. It's why the 10900K and the 5950X now exist, trading blows and giving us PC gamers an incredible return on the money, compared to the stagnant days of incremental 4C/8T performance increases, when Intel sat on improvements and released them bit by bit to harvest money for as long as possible.

It is impossible for me to say whether RDR2 is optimized or not without looking at the code. It could simply be that the effects the engine uses are not fully hardware-accelerated. In addition, the RTX 3080 is limited to 10 GB of video memory, and with all the effects in RDR2 you can easily blow past that at 4K, which means performance takes an enormous hit. If NVIDIA had not been so skimpy with the VRAM, both the 3080 and the 3070 would do better at higher resolutions.
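On the VRAM point, a quick way to see how close the card is to its limit while the game is running is to query nvidia-smi. A minimal sketch in Python (NVIDIA cards only, nvidia-smi on the PATH):

import subprocess

# Read used and total VRAM for the first GPU, in MiB, from nvidia-smi.
out = subprocess.run(
    ["nvidia-smi", "--query-gpu=memory.used,memory.total",
     "--format=csv,noheader,nounits"],
    capture_output=True, text=True,
).stdout
used, total = (int(x) for x in out.splitlines()[0].split(","))
print(f"VRAM: {used} MiB of {total} MiB in use ({used / total:.0%})")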
 