Low FPS no matter what the settings are

Hello,

I seem to only be getting just above 60 fps no matter what the settings are set to in game. I can have them set to Ultra and get anywhere from 59-65 fps, or have them on Low and get the same performance. I don't believe this is a Vsync issue, although I could be wrong, but I would think that if it were Vsync related I would never go over the 60 fps mark at all. My system specs are below; let me know if anyone has any suggestions.



I've noticed CPU utilization is only around 20-30 percent and GPU utilization is anywhere between 50-80 percent (all of which seems a bit low to me).



System Specs:

GPU: RTX 3090 Strix
CPU: AMD Ryzen 9 5950X
RAM: 32 GB 3200 MHz DDR4

Also worth noting: I'm running the game at 1440p/144 Hz, and my monitor is a Dell S2716DG.

Something else of note: I've tried completely uninstalling and reinstalling my video drivers to see if that would make a difference; it did not.

Thanks!
 
This appears to be a VERY common problem. In my case both my CPU and GPU throttle down whenever they reach 35-40 FPS, as if there's an FPS cap, even though I have no FPS cap enabled. I tested by turning off DLSS and putting my settings to Ultra, and sure enough my GPU throttled up to 100% at about 35 FPS. If I turn my settings down to Low, my GPU throttles down to 30% and runs at about 35 FPS. However, it never fully throttles up, so my game drops from 35 FPS all the way down to 25 before the GPU throttles up to 60% to get back to 35 FPS. This entire time my CPU sits around 40% usage regardless.

It's clear that something is targeting 35-40 FPS for no apparent reason. I tried deleting my user settings file, I uninstalled my GPU drivers and did a clean install, and I rolled back my GPU drivers. I tested a thousand different settings, always getting the exact same result, with the GPU throttling down to reach it. When I started playing I didn't have this problem: my game ran at 60 FPS on Ultra with RT, and 90-100 on Ultra with RT off.

To be clear, there is no bottlenecking. I get 35 FPS on Ultra RT at 1080p, I get 35 FPS on Ultra RT at 4K, and I also get 35 FPS on Low with RT off at 1080p. This has been driving me crazy.

Specs:
Ryzen 7 1800X, RTX 2080, 32 GB DDR4-3200 memory.
 
I have the same problem, just with different hardware and different FPS: mine is stuck at 13-15 fps no matter which settings I use.
I've also tried everything: reinstalling Windows, reinstalling the graphics driver, reinstalling the game (even to different locations, HDD and SSD). But when I travel very far from the city, right to the edge of the map, I get 50-65 fps, even at Ultra settings at 1080p. When I walk step by step back toward the city the FPS drops, and when I take a few steps back out the higher FPS returns. It only takes a few character steps either way.

My Specs are:
Ryzen 5 2600 @ stock clocks
RX Vega 56 @ stock clocks (factory OC clocks of the MSI Air Boost edition)
16 GB G.Skill Trident Z DDR4 3200 MHz @ 2933 MHz
Mainboard: MSI X370 Gaming Pro Carbon
 
I seem to have found a solution. I had a very similar issue where one day I booted up and couldn't get above 20 fps. I tried a few things with no luck, but then I thought to try launching the game without the RED launcher: I went into the game folders, found the game executable, and launched it directly, bypassing the RED launcher. Now my game is back to normal FPS and smooth, so it appears to be an issue with their launcher.
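
If you want to script the bypass, here's a rough sketch in Python (the install path below is only an example, point it at wherever your copy actually lives):

```python
# Rough sketch: launch Cyberpunk2077.exe directly, skipping the RED launcher.
# The install path is just an example -- adjust it to your own install folder.
import subprocess
from pathlib import Path

game_exe = Path(r"C:\Games\Cyberpunk 2077\bin\x64\Cyberpunk2077.exe")  # example path

if game_exe.exists():
    # Start the game from its own folder so it finds its data files.
    subprocess.Popen([str(game_exe)], cwd=game_exe.parent)
else:
    print(f"Executable not found at {game_exe}, check your install folder.")
```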
 
Hello!

In my experience, what's causing low framerates on AMD CPUs is the fact that the game is not properly utilizing all cores.

I have an AMD 5800X and an RTX 3080 and with the Ultra Preset I can't keep my GPU usage at 99% while driving.

I'm also using MSI Afterburner to monitor my CPU (and all its cores), GPU (and its VRAM), and RAM usage, and can confirm that the game is not maxing out all cores as it does on Intel CPUs.

AFAIK, the dev team mentioned in the notes for the patch that was supposed to address this issue that they worked with AMD on a fix, so I'm not really sure whether this is an actual technical issue that can be "fixed" or just the way the game will always run on AMD CPUs due to how it's coded.

Hope this info helps :)
 
While that is true in some cases, it also depends on what resolution you play at, among other things. A lower resolution generally puts more stress on your CPU, since the GPU can render more frames (CPU bottleneck). It even affects DLSS scaling, since you're technically just lowering the render resolution and upscaling it to whatever resolution you're gaming at. 1440p and 1080p are more affected by the CPU bottleneck than 4K, for example, since 4K is generally GPU bound.

I generally see 30-50% CPU usage and 97-99% GPU usage at 4K with DLSS Performance (1080p render resolution), getting around 60-70 fps; if I disable DLSS I see 20% CPU usage and 20 fps. No game that I know of uses this many cores (24/32 threads), but you will sooner or later hit a bottleneck. Intel could have some advantage here, since they generally have fewer cores but higher clock speeds.
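
To make the render-resolution point concrete, here's a rough sketch of the math (the per-axis scale factors are the commonly cited ones for each DLSS mode, so treat them as approximate):

```python
# Rough sketch: internal render resolution for each DLSS mode.
# Scale factors are the commonly cited per-axis values; treat them as approximate.
DLSS_SCALE = {
    "Quality": 0.67,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.33,
}

def render_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 4K output with DLSS Performance renders internally at roughly 1080p,
# which is why the CPU load looks like 1080p even though the output is 4K.
print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080)
print(render_resolution(2560, 1440, "Quality"))      # (~1715, ~965)
```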
 
I agree with what you said about the resolution and CPU usage, but the issue is that on Intel CPUs the game is actually utilizing all cores and on AMD CPUs it is not.

This could happen for a variety of reasons, and I personally believe it's due to how the game was originally coded.

Check out my screenshot, I just took it to demonstrate what I'm talking about.

Although I've capped my CPU to 4.2 GHz, the game is not utilizing all CPU cores as you can see.

Oh BTW, I'm playing at 1440P with the Ultra RT preset.
 

Attachments

  • Cyberpunk 2077 Screenshot 2021.08.25 - 22.18.27.09.jpg

DC9V

Although I've capped my CPU to 4.2 GHz, the game is not utilizing all CPU cores as you can see.
Your 5800X has 8 cores; what you see is the result of SMT. Also, what you want to look at is the per-thread CPU usage.
Here's what 16 cores look like, without SMT:
(FCLK 1800, PBO enabled, XMP profile enabled, everything else on AUTO)

CPU usage Ryzen 5950X.png

BTW, I don't recommend capping the 5800X at 4.2 GHz; it can boost up to 4.7 GHz. 70°C seems a bit low.
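
If you want to log per-thread usage outside of Afterburner, a quick sketch using Python's psutil (assuming you have Python and the package installed) is enough:

```python
# Quick sketch: print per-logical-CPU (per-thread) usage while the game runs.
# Requires the psutil package (pip install psutil); run it alongside the game.
import psutil

print(f"{psutil.cpu_count(logical=False)} cores / {psutil.cpu_count(logical=True)} threads")

for _ in range(10):                                    # sample ten times, once per second
    per_thread = psutil.cpu_percent(interval=1.0, percpu=True)
    print(" ".join(f"{p:5.1f}" for p in per_thread))   # one column per logical CPU
```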
 
Hello!

Thank you for your recommendations, but I would like to mention that I'm only keeping my CPU capped at 4.2 GHz because I don't currently have a good cooler, and I want to keep my temps low until I get water cooling or an excellent air cooler.

As for your comments about thread usage, what you're seeing in my screenshot is exactly that, the usage of my CPU, thread by thread (8 cores/16 threads). SMT (HT on Intel CPUs) is enabled and correctly utilized in other games, but as you can see in the screenshot, it's not working correctly in Cyberpunk.
 
Yes, most games never use more than 6-8 threads. This is why most gaming PCs rarely have more than an 8-core CPU in them; you won't be using it to its fullest extent anyway when gaming. The load is always shifting around between threads, varying which thread is used and by how much.


That's me testing DLSS and CPU/GPU usage and power draw.
 
Hey again,

Although I appreciate you sharing your video and explaining that most games use a certain number of cores, I'm not really sure what your point is.
Additionally, you appear to be GPU bound at all DLSS settings, so showing your CPU usage isn't very meaningful under those circumstances.

In this particular case, the main reason I decided to post in this forum is that I believe CP2077 should properly utilize all cores and threads on AMD CPUs as it does on Intel CPUs. This would help me, as well as many other players who have older AMD CPUs.
 

DC9V

Hello!

Thank you for your recommendations, but I would like to mention that I'm only keeping my CPU capped at 4.2 GHz because I don't currently have a good cooler, and I want to keep my temps low until I get water cooling or an excellent air cooler.
Hm... did you set your temp limit to 70°C? Because that might be the reason you don't see full CPU utilization. 75-80°C on average while gaming would still be fine, and spikes up to 90°C are totally fine, too.

And how did you "cap" your CPU exactly? Because using the multiplier in the BIOS is not capping, that's manual overclocking, and it shouldn't be combined with PBO.

As for your comments about thread usage, what you're seeing in my screenshot is exactly that, the usage of my CPU, thread by thread (8 cores/16 threads).
Sorry, I meant the maximum value of all current core-per-thread utilizations.
Although I appreciate you [Notserious80] sharing your video and explaining that most games use a certain number of cores, I'm not really sure what your point is.
Additionally, you appear to be GPU bound at all DLSS settings, so showing your CPU usage isn't very meaningful under those circumstances.
DLSS and FidelityFX FSR upscale from a lower render resolution, which pushes the framerate up and therefore puts relatively more load on the CPU. You can clearly see that in his video.
In this particular case, the main reason I decided to post in this forum is that I believe CP2077 should properly utilize all cores and threads on AMD CPUs as it does on Intel CPUs. This would help me, as well as many other players who have older AMD CPUs.
I would be more worried about your low GPU usage; 72% seems pretty low. At 1440p with the Ultra RT preset you should be GPU bound for the most part (even with a last-gen CPU like the 3800X). EDIT: It seems like GPU usage is significantly lower when ray-traced shadows are enabled, and I acknowledge that the SMT mod seems to fix that.
 
I too would love for the game to use all my 24 threads, but if the engine is built to use 8, it's just going to use 8. I've only found one video of someone filming CP2077 with an overlay on an Intel CPU, and there the load seemed to be spread more evenly (about 50% usage across all threads), but it was still only using 50% of the CPU overall. AMD seems to mostly use the proper cores (the non-SMT ones), but I think the limit is more or less hardcoded into the engine, since it can't just open up new threads for extra work. You could try the CyberEngineTweaks mod, which still includes the "SMT fix" that CDPR says is already in the game, to see if it changes your usage in some way.

I've also noticed that Ryzen seems to win CPU benchmarks in this game, which is odd if Intel has better CPU utilization.
 
Hey again DC9V,

Regarding my temps, I capped the frequency of my CPU by setting the multiplier to 42 (42 × 100 MHz = 4,200 MHz). Oh, and I forgot to mention that this is not affecting CPU utilization across threads; I've tested the game without the cap and it does the same.
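
Just to spell out the arithmetic (assuming the stock 100 MHz reference clock):

```python
# The arithmetic behind the cap, assuming the stock 100 MHz reference clock.
base_clock_mhz = 100
multiplier = 42

capped_mhz = multiplier * base_clock_mhz
print(f"{capped_mhz} MHz = {capped_mhz / 1000:.1f} GHz")  # 4200 MHz = 4.2 GHz
```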

I'm not sure what you mean about DLSS and FidelityFX being heavier on the CPU.

For example, let's say I run the game at 1080p with everything maxed out. Just like in my screenshot (Ultra RT preset at 1440p), I would get 48 FPS in that particular part of the game regardless of whether I use DLSS or FidelityFX, because the game is limited by the CPU. Now, let's say I run the game without DLSS or FidelityFX: even at native 1080p, I would still get 48 FPS in that part of the game, because I'm still CPU limited.
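
A simple way to picture it: the frame rate is roughly set by whichever side takes longer per frame, so lowering the render resolution only shrinks the GPU term. A rough sketch with made-up frame times, just to illustrate the CPU-limited case:

```python
# Rough mental model: FPS is limited by the slower of the CPU and GPU work per frame.
# The frame times below are made up, purely to illustrate the CPU-limited case.
def fps(cpu_frame_ms, gpu_frame_ms):
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

cpu_ms = 20.8           # ~48 fps worth of CPU work per frame (simulation, draw calls)
gpu_native_ms = 16.0    # GPU cost at native resolution
gpu_dlss_ms = 9.0       # GPU cost with DLSS lowering the render resolution

print(f"Native:    {fps(cpu_ms, gpu_native_ms):.0f} fps")  # ~48 fps, already CPU limited
print(f"With DLSS: {fps(cpu_ms, gpu_dlss_ms):.0f} fps")    # still ~48 fps, CPU is the ceiling
```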

As for my GPU usage, as you can imagine this happens because the game is CPU bound from not utilizing all cores/threads properly.

Hope this clarifies some of your observations.

Thank you Notserious80, I'll check out that mod, although I think this is something the dev team should address on their end. Also, as you mentioned, the fact that the game utilizes all threads (on an 8-core Intel CPU) would mean that no individual core maxes out and limits the others, as happens on my system.

In the end, this means that for my next build I'll have to go Intel + Nvidia, as in my experience that combo has the best compatibility across most games.
 
DLSS shifts the usage, but it's mostly useful if you're not already CPU capped. Generally, at higher resolutions you should be GPU bound, and that's when it's good to use DLSS, since it shifts the balance more towards the CPU. If you're already CPU bound it won't make a difference, though.

I don't see any of your cores at 100% usage though, so you're not fully capped. I saw similar numbers when I tested my PC at 1080p with low graphics settings. I actually think capping the max clock per core is doing more damage to your FPS than not utilizing every thread at 50% like in the Intel video I found. The total CPU usage seems to be around 50% anyway no matter what you do, and spreading the load won't make much difference if you're not maxing out the usage anyway; 50% total CPU usage is still 50% usage. Also, Hyper-Threading works differently than SMT. Clock speed generally reigns when it comes to gaming, since you won't be using the CPU fully anyway. There's also memory/PCIe bandwidth that can throttle GPU performance.

Having said that, I would also like to see more usage across more threads if it improves performance or graphics fidelity. It would probably require a total rewrite of the engine, though, and I just don't see that happening. The next-gen patch might help AMD with multithreading, since the new consoles both use AMD hardware capable of it.
 
Hey Notserious80,

Thanks for your suggestions and insights.

I would like to mention that although I appreciate the help, I'm not really looking for recommendations on how to optimize my game and system.

Regarding DLSS, I'm fully aware of what it is and how it works; however, I'm not really sure how it's relevant to what I've mentioned regarding CPU utilization when not GPU bound.

As for your comments regarding how the frequency of the CPU and RAM can affect performance, I agree with you (for obvious reasons), but as I mentioned, I've tested the game without the cap, and although I get higher framerates, the game is still not fully utilizing all cores and threads (which is the main reason I'm here).

Also, you're not seeing any cores maxed out in the screenshot because that's not something I intended to showcase there. What I'm showing is that the game is not fully utilizing all cores and threads as it does on Intel CPUs.

Hopefully the next-gen patch you mentioned will address this "issue", and the fact that the PS5 and Xbox Series X|S have AMD CPUs gives me quite a bit of hope.

Cheers.
 

DC9V

Trust me, you should be GPU bound, not CPU bound. If you were CPU bound you would see spikes in frame times rather than constantly low GPU usage. GPU usage should be at least 80%, so there must be something wrong with your setup: probably your power plan, or some overlay interfering...
 
Hey DC9V,

Thanks for the continued suggestions but my system is working correctly and I know what I'm doing, so I'm afraid that your suggestions are not applicable in this case. 🤓

Don't you think that the game should be using all cores and threads on AMD CPUs (with 8 cores/16 threads) as it does on Intel CPUs? 🤔

From my perspective, you seem to be missing the point that I'm trying to make. 🙄

To clarify a bit (in case you are not that "tech savvy"), what I'm describing is how the game behaves on AMD CPUs, not something particular to my system.

This is a widely known "issue" that might not be fixable, depending on what's causing it.
 
Last edited:

DC9V

Hey DC9V,

Thanks for the continued suggestions but my system is working correctly and I know what I'm doing, so I'm afraid that your suggestions are not applicable in this case. 🤓
I'm sorry but your system is not working correctly. 70% GPU usage is not normal. You should definitely expect better performance with your gear.
Don't you think that the game should be using all cores and threads on AMD CPUs (with 8 cores/16 threads) as it does on Intel CPUs? 🤔
I think Ryzen CPUs work differently and you shouldn't expect the same results as with Intel CPUs when you adjust the multiplier in the BIOS. It is better to leave these settings on AUTO and use PBO instead.
From my perspective, you seem to be missing the point that I'm trying to make. 🙄

To clarify a bit (in case you are not that "tech savvy"), what I'm describing is how the game behaves on AMD CPUs, not something particular to my system.
This is a widely known "issue" that might not be fixable, depending on what's causing it.
Improving how the game behaves on AMD CPUs wouldn't make a difference as long as your GPU isn't even reaching 80% utilization, which it definitely should be with a Ryzen 5800X at stock settings, or even a 3600X.

(btw my screenshot was with SMT disabled)
 